WorldWideScience

Sample records for exploitation model performance

  1. The Ahuachapan geothermal field, El Salvador: Exploitation model, performance predictions, economic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ripperda, M.; Bodvarsson, G.S.; Lippmann, M.J.; Witherspoon, P.A.; Goranson, C.

    1991-05-01

    The Earth Sciences Division of Lawrence Berkeley Laboratory (LBL) is conducting a reservoir evaluation study of the Ahuachapan geothermal field in El Salvador. This work is being performed in cooperation with the Comision Ejecutiva Hidroelectrica del Rio Lempa (CEL) and the Los Alamos National Laboratory (LANL) with funding from the US Agency for International Development (USAID). This report describes the work done during the second year of the study (FY89--90). The first year's report included (1) the development of geological and conceptual models of the field, (2) the evaluation of the reservoir's initial thermodynamic and chemical conditions and their changes during exploitation, (3) the evaluation of interference test data and the observed reservoir pressure decline and (4) the development of a natural state model for the field. In the present report the results of reservoir engineering studies to evaluate different production-injection scenarios for the Ahuachapan geothermal field are discussed. The purpose of the work was to evaluate possible reservoir management options to enhance as well as to maintain the productivity of the field during a 30-year period (1990--2020). The ultimate objective was to determine the feasibility of increasing the electrical power output at Ahuachapan from the current level of about 50 MW{sub e} to the total installed capacity of 95 MW{sub e}. 20 refs., 75 figs., 10 tabs.

  2. Exploiting Thread Parallelism for Ocean Modeling on Cray XC Supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Sarje, Abhinav [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Jacobsen, Douglas W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Samuel W. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ringler, Todd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-05-01

The incorporation of increasing core counts in modern processors used to build state-of-the-art supercomputers is driving application development towards exploitation of thread parallelism, in addition to distributed memory parallelism, with the goal of delivering efficient high-performance codes. In this work we describe our experiences exploiting threading in a real-world ocean modeling application code, MPAS-Ocean. We present detailed performance analysis and comparisons of various approaches and configurations for threading on the Cray XC series supercomputers.
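The loop-level threading described above can be sketched in miniature. The toy stencil below is not MPAS-Ocean code and all sizes are illustrative: grid rows are partitioned into blocks that threads update concurrently, relying on NumPy releasing the GIL inside its array kernels so the blocks can overlap.

```python
# Toy sketch of threaded domain updates (hypothetical sizes; a 5-point
# smoothing stencil stands in for the real model's tendency terms).
from concurrent.futures import ThreadPoolExecutor
import numpy as np

h = np.random.default_rng(0).random((512, 512))  # toy field, e.g. layer thickness
out = h.copy()                                    # boundary rows/cols stay unchanged

def update_block(r0, r1):
    """Apply the stencil to interior rows r0:r1 (columns 1..-2)."""
    out[r0:r1, 1:-1] = 0.25 * (h[r0 - 1:r1 - 1, 1:-1] + h[r0 + 1:r1 + 1, 1:-1]
                               + h[r0:r1, :-2] + h[r0:r1, 2:])

# Partition the 510 interior rows among 4 threads and update concurrently.
nthreads, rows = 4, 510
bounds = [(1 + rows * k // nthreads, 1 + rows * (k + 1) // nthreads)
          for k in range(nthreads)]
with ThreadPoolExecutor(max_workers=nthreads) as pool:
    list(pool.map(lambda b: update_block(*b), bounds))
```

Because each thread writes a disjoint row block, no locking is needed; real codes additionally overlap halo exchanges with these updates.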

  3. PROPOSITIONS REGARDING MANAGERIAL PERFORMANCE IN AGRICULTURAL EXPLOITATIONS

    Directory of Open Access Journals (Sweden)

    COSMINA – SIMONA TOADER

    2008-05-01

Full Text Available Management is the process by which managers realize the functions of an organization, establishing and fulfilling objectives to maximize the organization's results. Management plans, organizes, coordinates, trains and controls a group of interconnected activities in order to realize those objectives, drawing on managerial abilities and skills that should be developed and improved by studying and understanding management concepts, principles, techniques and methods. Peter Drucker said that “the managers are those who practice management. They do not practice either the behavioral science or the statistics. These are only instruments for the managers. As a specific discipline, the management has its own basic problems, specific approaches, separate interests. A person, who knows only techniques and skills, without understanding the basics of the management, is not a manager. In the best case, that person is a very good technician”. To be successful, a manager must possess certain qualities. In this study we present the qualities a manager should have and offer some propositions concerning managerial performance.

  4. Exploitation of parallelism in climate models

    Energy Technology Data Exchange (ETDEWEB)

    Baer, F.; Tribbia, J.J.; Williamson, D.L.

    1992-01-01

The US Department of Energy (DOE), through its CHAMMP initiative, hopes to develop the capability to make meaningful regional climate forecasts on time scales exceeding a decade, such capability to be based on numerical prediction type models. We propose research to contribute to each of the specific items enumerated in the CHAMMP announcement (Notice 91-3); i.e., to consider theoretical limits to prediction of climate and climate change on appropriate time scales, to develop new mathematical techniques to utilize massively parallel processors (MPP), to actually utilize MPPs as a research tool, and to develop improved representations of some processes essential to climate prediction. To explore these initiatives, we will exploit all available computing technology, and in particular MPP machines. We anticipate that significant improvements in modeling of climate on the decadal and longer time scales for regional space scales will result from our efforts. This report summarizes the activities of our group during part of the first year's effort to meet the objectives stated in our proposal. We comment on three research foci: time compression studies, subgrid scale model studies, and distributed climate ensemble studies, as well as additional significant technical matters.

  5. Exploitation of Parallelism in Climate Models

    Energy Technology Data Exchange (ETDEWEB)

    Baer, F.; Tribbia, J.J.; Williamson, D.L.

    1999-03-01

The US Department of Energy (DOE), through its CHAMMP initiative, hopes to develop the capability to make meaningful regional climate forecasts on time scales exceeding a decade, such capability to be based on numerical prediction type models. We propose research to contribute to each of the specific items enumerated in the CHAMMP announcement (Notice 91-3); i.e., to consider theoretical limits to prediction of climate and climate change on appropriate time scales, to develop new mathematical techniques to utilize massively parallel processors (MPP), to actually utilize MPPs as a research tool, and to develop improved representations of some processes essential to climate prediction. In particular, our goals are to: (1) Reconfigure the prediction equations such that the time iteration process can be compressed by use of MPP architecture, and to develop appropriate algorithms. (2) Develop local subgrid scale models which can provide time and space dependent parameterization for a state-of-the-art climate model to minimize the scale resolution necessary for a climate model, and to utilize MPP capability to simultaneously integrate those subgrid models and their statistics. (3) Capitalize on the MPP architecture to study the inherent ensemble nature of the climate problem. By careful choice of initial states, many realizations of the climate system can be determined concurrently and more realistic assessments of the climate prediction can be made in a realistic time frame. To explore these initiatives, we will exploit all available computing technology, and in particular MPP machines. We anticipate that significant improvements in modeling of climate on the decadal and longer time scales for regional space scales will result from our efforts.
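Goal (3), determining many climate realizations concurrently from carefully chosen initial states, can be illustrated with a toy sketch. The Lorenz-63 system stands in for a climate model, Python threads stand in for MPP nodes, and all parameters below are invented for illustration, not taken from the report.

```python
# Toy ensemble sketch: integrate several perturbed realizations of a
# chaotic system concurrently (Lorenz-63, forward Euler; illustrative only).
from concurrent.futures import ThreadPoolExecutor

def lorenz_run(x0, steps=2000, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate Lorenz-63 from a perturbed x-coordinate; return final state."""
    x, y, z = x0, 1.0, 1.05
    for _ in range(steps):
        x, y, z = (x + dt * sigma * (y - x),
                   y + dt * (x * (rho - z) - y),
                   z + dt * (x * y - beta * z))
    return x, y, z

# Eight slightly perturbed initial states, run concurrently as an ensemble.
perturbations = [1.0 + 1e-3 * i for i in range(8)]
with ThreadPoolExecutor(max_workers=8) as pool:
    ensemble = list(pool.map(lorenz_run, perturbations))
mean_z = sum(s[2] for s in ensemble) / len(ensemble)
```

Ensemble statistics such as `mean_z` (and the spread across members) are the kind of concurrent-realization output the proposal envisions computing on MPP hardware.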

  6. Exploration and Exploitation Fit and Performance in International Strategic Alliances

    DEFF Research Database (Denmark)

    Nielsen, Bo Bernhard; Gudergan, Siegfried

    2012-01-01

Exploration and exploitation constitute two separate, potentially conflicting strategic choices for firms engaged in international strategic alliances. Our empirical study challenges the ambidexterity argument and demonstrates that exploration and exploitation are separate (though not necessarily antithetical) strategies with different antecedents and performance consequences. Our results show that while competency similarity is conducive to upstream innovative performance, prior experience with the partner is potentially damaging for this type of performance, and trust and cultural distance do not play significant roles. When the motive is efficiency and downstream market performance, prior experience with the partner instead is beneficial, as are high levels of trust and low levels of cultural distance. These findings have key implications for the literature on strategic fit and alliance performance.

  7. [Ecotourism exploitation model in Bita Lake Natural Reserve of Yunnan].

    Science.gov (United States)

    Yang, G; Wang, Y; Zhong, L

    2000-12-01

Bita Lake provincial natural reserve is located in the Shangri-La region of north-western Yunnan, and was designated a demonstration area for ecotourism exploitation in 1998. After a year of development and half a year of operation as a branch of the '99 Kunming International Horticulture Exposition receiving tourists, the ecotourism demonstration area was shown to fulfil four integrated functions of ecotourism, i.e., tourism, protection, poverty alleviation and environmental education. Five exploitation and management models, including a function-zoned exploitation model, a featured tourism communication model, a signs system designing model, a local Tibetan family reception model and an environmental monitoring model, were also successful, and were demonstrated and spread to the whole province. Bita Lake provincial natural reserve could thus serve as a good example for the ecotourism exploitation of natural reserves across the whole country.

  8. A fuzzy model for exploiting customer requirements

    Directory of Open Access Journals (Sweden)

    Zahra Javadirad

    2016-09-01

Full Text Available Nowadays, Quality function deployment (QFD) is one of the total quality management tools, in which customers’ views and requirements are captured and various techniques are used to improve production requirements and operations. The QFD department, after identification and analysis of the competitors, takes customers’ feedback to meet the customers’ demands for the products compared with the competitors. In this study, a comprehensive model for assessing the importance of customer requirements in the products or services of an organization is proposed. The proposed study uses linguistic variables, as a more comprehensive approach, to increase the precision of the expressed evaluations. The importance of these requirements specifies the strengths and weaknesses of the organization in meeting the requirements relative to competitors. The results of these experiments show that the proposed method performs better than the other methods.
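As a rough illustration of scoring with linguistic variables, the sketch below maps hypothetical expert ratings onto triangular fuzzy numbers, averages them, and defuzzifies by centroid. The scale and the ratings are invented for illustration and are not taken from the paper.

```python
# Toy fuzzy scoring of one customer requirement (all values hypothetical).
SCALE = {                     # linguistic term -> triangular fuzzy number (l, m, u)
    "low":       (0.0, 0.0, 0.25),
    "medium":    (0.25, 0.5, 0.75),
    "high":      (0.5, 0.75, 1.0),
    "very high": (0.75, 1.0, 1.0),
}

def fuzzy_importance(ratings):
    """Average the experts' triangular fuzzy numbers, then defuzzify by centroid."""
    tris = [SCALE[r] for r in ratings]
    avg = tuple(sum(t[i] for t in tris) / len(tris) for i in range(3))
    return sum(avg) / 3.0     # centroid of a triangular fuzzy number

# Three hypothetical experts rate the importance of one requirement.
score = fuzzy_importance(["high", "very high", "medium"])
```

Repeating this per requirement yields crisp importance weights that can be ranked against competitors, which is the comparison the model performs.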

  9. Optimization Models for Petroleum Field Exploitation

    Energy Technology Data Exchange (ETDEWEB)

    Jonsbraaten, Tore Wiig

    1998-12-31

This thesis presents and discusses various models for optimal development of a petroleum field. The objective of these optimization models is to maximize, under many uncertain parameters, the project's expected net present value. First, an overview of petroleum field optimization is given from the point of view of operations research. Reservoir equations for a simple reservoir system are derived, discretized and included in optimization models. Linear programming models for optimizing production decisions are discussed and extended to mixed integer programming models in which decisions concerning platform, wells and production strategy are optimized. Then, optimal development decisions under uncertain oil prices are discussed. The uncertain oil price is estimated by a finite set of price scenarios with associated probabilities. The problem is one of stochastic mixed integer programming, and the solution approach is to use a scenario and policy aggregation technique developed by Rockafellar and Wets, although this technique was developed for continuous variables. Stochastic optimization problems with focus on problems with decision-dependent information discoveries are also discussed. A class of "manageable" problems is identified and an implicit enumeration algorithm for finding the optimal decision policy is proposed. Problems involving uncertain reservoir properties but with a known initial probability distribution over possible reservoir realizations are discussed. Finally, a section on Nash equilibrium and bargaining in an oil reservoir management game discusses the pool problem arising when two lease owners have access to the same underlying oil reservoir. Because the oil tends to migrate, both lease owners have an incentive to drain oil from the competitor's part of the reservoir. The discussion is based on a numerical example. 107 refs., 31 figs., 14 tabs.
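The price-scenario idea can be illustrated with a toy sketch: choose which wells to drill so as to maximize expected NPV over a finite set of price scenarios with probabilities. Brute-force enumeration stands in for the thesis's mixed integer programming and scenario-aggregation machinery, and every number below is hypothetical.

```python
# Toy scenario-based drilling decision (all figures invented for illustration).
from itertools import combinations

price_scenarios = [(15.0, 0.3), (20.0, 0.5), (30.0, 0.2)]  # ($/bbl, probability)
wells = [  # (drilling cost in M$, lifetime production in Mbbl)
    (8.0, 1.0),
    (10.0, 1.1),
    (14.0, 1.6),
]

def expected_npv(choice):
    """Expected NPV of drilling a subset of wells, across all price scenarios."""
    cost = sum(wells[i][0] for i in choice)
    output = sum(wells[i][1] for i in choice)
    return sum(prob * (price * output - cost) for price, prob in price_scenarios)

# Enumerate all 2^3 drill/don't-drill decisions and keep the best.
candidates = [c for r in range(len(wells) + 1)
              for c in combinations(range(len(wells)), r)]
best = max(candidates, key=expected_npv)
```

In this toy instance the same subset is optimal in every scenario; the interesting stochastic-programming cases are those where scenarios disagree and the policy must hedge across them.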

  10. EXPLOITATION AND OPTIMIZATION OF RESERVOIR PERFORMANCE IN HUNTON FORMATION, OKLAHOMA

    Energy Technology Data Exchange (ETDEWEB)

    Mohan Kelkar

    2003-10-01

increases, the remaining oil saturation decreases. This is evident from log and core analysis. (5) Using a compositional simulator, we are able to reproduce the important reservoir characteristics by assuming a two-layer model. One layer is a high permeability region containing water and the other is a low permeability region containing mostly oil. The results are further verified by using a dual porosity model. Assuming that most of the volatile oil is contained in the matrix and the water is contained in the fractures, we are able to reproduce important reservoir performance characteristics. (6) Evaluation of secondary mechanisms indicates that CO{sub 2} flooding is potentially a viable option if CO{sub 2} is available at a reasonable price. We have conducted detailed simulation studies to verify the effectiveness of the CO{sub 2} huff-n-puff process. We are in the process of conducting additional lab tests to verify the efficacy of the same displacement. (7) Another possibility for improving the oil recovery is to inject surfactants to change the near-wellbore wettability of the rock from oil wet to water wet. By changing the wettability, we may be able to retard the water flow and hence improve the oil recovery as a percentage of total fluid produced. If surfactant is reasonably priced, a further possibility is to use the huff-n-puff process with surfactants. Laboratory experiments are promising, and additional investigation continues. (8) Preliminary economic evaluation indicates that vertical wells outperform horizontal wells. Future work in the project will include: (1) Building a multi-well numerical model to reproduce overall reservoir performance rather than individual well performance. Special emphasis will be placed on hydrodynamic connectivity between wells. (2) Collecting data from adjacent Hunton reservoirs to validate our understanding of what makes a productive reservoir. (3) Developing statistical methods to rank various reservoirs in the Hunton formation.
This will allow

  11. Exploring, exploiting and evolving diversity of aquatic ecosystem models

    DEFF Research Database (Denmark)

    Janssen, Annette B. G.; Arhonditsis, George B.; Beusen, Arthur

    2015-01-01

Here, we present a community perspective on how to explore, exploit and evolve the diversity in aquatic ecosystem models. These models play an important role in understanding the functioning of aquatic ecosystems, filling in observation gaps and developing effective strategies for water quality management. In this spirit, numerous models have been developed since the 1970s. We set off to explore model diversity by making an inventory among 42 aquatic ecosystem modellers, by categorizing the resulting set of models and by analysing them for diversity. We then focus on how to exploit model diversity by comparing and combining different aspects of existing models. Finally, we discuss how model diversity came about in the past and could evolve in the future. Throughout our study, we use analogies from biodiversity research to analyse and interpret model diversity. We recommend to make models publicly...

  12. Exploring, exploiting and evolving diversity of aquatic ecosystem models

    NARCIS (Netherlands)

    Janssen, A.B.G.; Arhonditsis, G.B.; Beusen, Arthur; Bolding, Karsten; Bruce, Louise; Bruggeman, Jorn; Couture, Raoul Marie; Downing, Andrea S.; Alex Elliott, J.; Frassl, M.A.; Gal, Gideon; Gerla, Daan J.; Hipsey, M.R.; Hu, Fenjuan; Ives, S.C.; Janse, J.H.; Jeppesen, Erik; Jöhnk, K.D.; Kneis, David; Kong, Xiangzhen; Kuiper, J.J.; Lehmann, M.K.; Lemmen, Carsten; Özkundakci, Deniz; Petzoldt, Thomas; Rinke, Karsten; Robson, B.J.; Sachse, René; Schep, S.A.; Schmid, Martin; Scholten, Huub; Teurlincx, Sven; Trolle, Dennis; Troost, T.A.; Dam, Van A.A.; Gerven, Van L.P.A.; Weijerman, Mariska; Wells, S.A.; Mooij, W.M.

    2015-01-01

    Here, we present a community perspective on how to explore, exploit and evolve the diversity in aquatic ecosystem models. These models play an important role in understanding the functioning of aquatic ecosystems, filling in observation gaps and developing effective strategies for water quality

  13. Exploiting communication concurrency on high performance computing systems

    Energy Technology Data Exchange (ETDEWEB)

    Chaimov, Nicholas [Univ. of Oregon, Eugene, OR (United States); Ibrahim, Khaled Z. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Iancu, Costin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-01-01

Although logically available, applications may not exploit enough instantaneous communication concurrency to maximize hardware utilization on HPC systems. This is exacerbated in hybrid programming models such as SPMD+OpenMP. We present the design of a "multi-threaded" runtime able to transparently increase the instantaneous network concurrency and to provide near saturation bandwidth, independent of the application configuration and dynamic behavior. The runtime forwards communication requests from application level tasks to multiple communication servers. Our techniques alleviate the need for spatial and temporal application level message concurrency optimizations. Experimental results show improved message throughput and bandwidth by as much as 150% for 4 KB messages on InfiniBand and by as much as 120% for 4 KB messages on Cray Aries. For more complex operations such as all-to-all collectives, we observe as much as 30% speedup. This translates into 23% speedup on 12,288 cores for a NAS FT implemented using FFTW. We also observe as much as 76% speedup on 1,500 cores for an already optimized UPC+OpenMP geometric multigrid application using hybrid parallelism.
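The runtime's structure, application tasks forwarding communication requests to multiple communication servers, can be sketched as a producer-consumer setup. In the toy below a shared list simulates the network endpoint and all names are illustrative; a real runtime would drive sockets or RDMA endpoints from each server thread.

```python
# Toy sketch: tasks enqueue messages; several "communication server"
# threads drain the queue concurrently (the network is simulated here).
import queue
import threading

requests = queue.Queue()
delivered, lock = [], threading.Lock()

def comm_server():
    """Forward queued messages until a shutdown sentinel (None) arrives."""
    while True:
        msg = requests.get()
        if msg is None:
            break
        with lock:
            delivered.append(msg)   # stand-in for an actual network send

N_SERVERS, N_MSGS = 4, 100
servers = [threading.Thread(target=comm_server) for _ in range(N_SERVERS)]
for s in servers:
    s.start()
for i in range(N_MSGS):            # application-level tasks post requests
    requests.put(("payload", i))
for _ in servers:                  # one sentinel per server to shut down
    requests.put(None)
for s in servers:
    s.join()
```

With several servers draining one queue, multiple sends are in flight at once regardless of how many application threads produced them, which is the concurrency effect the paper measures.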

  14. Exploiting topic modeling to boost metagenomic reads binning.

    Science.gov (United States)

    Zhang, Ruichang; Cheng, Zhanzhan; Guan, Jihong; Zhou, Shuigeng

    2015-01-01

With the rapid development of high-throughput technologies, researchers can sequence the whole metagenome of a microbial community sampled directly from the environment. The assignment of these metagenomic reads into different species or taxonomical classes is a vital step for metagenomic analysis, which is referred to as binning of metagenomic data. In this paper, we propose a new method TM-MCluster for binning metagenomic reads. First, we represent each metagenomic read as a set of "k-mers" with their frequencies occurring in the read. Then, we apply a probabilistic topic model, the Latent Dirichlet Allocation (LDA) model, to the reads, which generates a number of hidden "topics" such that each read can be represented by a distribution vector over the generated topics. Finally, as in the MCluster method, we apply SKWIC, a variant of the classical K-means algorithm with an automatic feature weighting mechanism, to cluster the reads represented by topic distributions. Experiments show that the new method TM-MCluster outperforms major existing methods, including AbundanceBin, MetaCluster 3.0/5.0 and MCluster. This result indicates that the exploitation of topic modeling can effectively improve the binning performance of metagenomic reads.
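The pipeline of k-mer counts, LDA topics, and clustering can be sketched with scikit-learn on toy reads. Plain K-means stands in here for the SKWIC variant the method actually uses, and the reads, k-mer size and cluster count are invented for illustration.

```python
# Toy sketch of the TM-MCluster pipeline (hypothetical reads; K-means
# replaces SKWIC, the feature-weighting variant used by the paper).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.cluster import KMeans

reads = [
    "ACGTACGTACGT", "ACGTACGTTGCA",   # toy reads, "species A"
    "GGGCCCGGGCCC", "GGGCCCTTTGGG",   # toy reads, "species B"
]

# Step 1: represent each read by its k-mer frequencies (k = 4).
vec = CountVectorizer(analyzer="char", ngram_range=(4, 4), lowercase=False)
X = vec.fit_transform(reads)

# Step 2: LDA turns k-mer counts into a per-read topic distribution.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
topics = lda.fit_transform(X)          # shape: (n_reads, n_topics)

# Step 3: cluster the reads in topic space.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(topics)
```

Each read ends up in one bin; on real data the topic representation is what lets compositionally similar reads cluster together.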

  15. Enhanced surrogate models for statistical design exploiting space mapping technology

    DEFF Research Database (Denmark)

    Koziel, Slawek; Bandler, John W.; Mohamed, Achmed S.

    2005-01-01

We present advances in microwave and RF device modeling exploiting Space Mapping (SM) technology. We propose new SM modeling formulations utilizing input mappings, output mappings, frequency scaling and quadratic approximations. Our aim is to enhance circuit models for statistical analysis and yield-driven design. We illustrate our results using a capacitively-loaded two-section impedance transformer, a single-resonator waveguide filter and a six-section H-plane waveguide filter.

  16. Space Mapping Optimization of Microwave Circuits Exploiting Surrogate Models

    DEFF Research Database (Denmark)

    Bakr, M. H.; Bandler, J. W.; Madsen, Kaj

    2000-01-01

A powerful new space-mapping (SM) optimization algorithm is presented in this paper. It draws upon recent developments in both surrogate model-based optimization and modeling of microwave devices. SM optimization is formulated as a general optimization problem of a surrogate model. This model is a convex combination of a mapped coarse model and a linearized fine model. It exploits, in a novel way, a linear frequency-sensitive mapping. During the optimization iterates, the coarse and fine models are simulated at different sets of frequencies. This approach is shown to be especially powerful...
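A one-dimensional toy version of such a surrogate can be sketched as follows: a linear input mapping is extracted so that the coarse model tracks the fine model at a few base points, and the surrogate is a convex combination of the mapped coarse model and a linearization of the fine model. The two models, base points and blend weight below are invented for illustration, and a small grid search stands in for a real parameter-extraction solver.

```python
# Toy space-mapping surrogate in 1-D (all models/parameters hypothetical).
import numpy as np

def fine(x):      # "expensive" model (stand-in for a full EM simulation)
    return (x - 1.9) ** 2 + 0.5

def coarse(x):    # "cheap" model (stand-in for a circuit-level model)
    return x ** 2 + 0.5

# Extract the input mapping p(x) = a*x + b by matching coarse(p(x)) to
# fine(x) at base points, via a tiny grid search over (a, b).
xs = np.array([0.0, 1.0, 2.0, 3.0])
a_fit, b_fit = min(
    ((a, b) for a in np.linspace(0.5, 1.5, 21) for b in np.linspace(-3.0, 1.0, 41)),
    key=lambda ab: float(np.sum((coarse(ab[0] * xs + ab[1]) - fine(xs)) ** 2)),
)

# Linearize the fine model around x0, then blend: the surrogate is a
# convex combination of the mapped coarse model and the linearization.
x0, lam = 1.0, 0.5
slope = (fine(x0 + 1e-6) - fine(x0 - 1e-6)) / 2e-6

def surrogate(x):
    return lam * coarse(a_fit * x + b_fit) + (1 - lam) * (fine(x0) + slope * (x - x0))

# Optimize the surrogate (cheap) instead of the fine model (expensive).
grid = np.linspace(0.0, 4.0, 401)
x_star = grid[np.argmin(surrogate(grid))]
```

In a real SM loop, `x_star` would be evaluated with the fine model, the mapping re-extracted, and the process iterated until the fine and surrogate responses agree.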

  17. Mathematical modeling of the behavior of geothermal systems under exploitation

    Energy Technology Data Exchange (ETDEWEB)

    Bodvarsson, G.S.

    1982-01-01

    Analytical and numerical methods have been used in this investigation to model the behavior of geothermal systems under exploitation. The work is divided into three parts: (1) development of a numerical code, (2) theoretical studies of geothermal systems, and (3) field applications. A new single-phase three-dimensional simulator, capable of solving heat and mass flow problems in a saturated, heterogeneous porous or fractured medium has been developed. The simulator uses the integrated finite difference method for formulating the governing equations and an efficient sparse solver for the solution of the linearized equations. In the theoretical studies, various reservoir engineering problems have been examined. These include (a) well-test analysis, (b) exploitation strategies, (c) injection into fractured rocks, and (d) fault-charged geothermal reservoirs.
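The integrated finite difference idea can be sketched in one dimension: each grid block carries a mass balance, blocks are coupled through interface transmissibility terms, and the linearized system is solved implicitly each time step. The sketch below is single-phase with invented rock and fluid properties, far simpler than the three-dimensional heat-and-mass simulator the report describes.

```python
# 1-D implicit sketch of a finite-volume/IFD pressure solve
# (single-phase, constant properties; every number is illustrative).
import numpy as np

n, dt = 50, 3600.0                     # grid blocks, time step (s)
dx = 10.0                              # block length (m)
phi, ct = 0.1, 1e-9                    # porosity, total compressibility (1/Pa)
k, mu = 1e-13, 1e-3                    # permeability (m^2), viscosity (Pa*s)
alpha = k / (phi * ct * mu * dx ** 2)  # diffusion coefficient (1/s)

# Implicit step: (I + dt*alpha*L) p_new = p_old, with L the connection
# Laplacian; block 0 is held at well pressure (Dirichlet), the far end
# is a no-flow boundary.
A = np.eye(n)
for i in range(n):
    for j in (i - 1, i + 1):           # flow connections to neighbor blocks
        if 0 <= j < n:
            A[i, i] += dt * alpha
            A[i, j] -= dt * alpha
A[0, :] = 0.0
A[0, 0] = 1.0                          # Dirichlet row for the well block

p = np.full(n, 20e6)                   # initial reservoir pressure (Pa)
p_well = 10e6                          # producing well pressure (Pa)
for _ in range(100):
    rhs = p.copy()
    rhs[0] = p_well
    p = np.linalg.solve(A, rhs)
```

The pressure profile drains monotonically toward the well, the qualitative drawdown behavior such models are used to study; the real simulator assembles the same kind of sparse system for arbitrary 3-D block connectivity and solves it with a sparse solver.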

  18. EXPLOITATION AND OPTIMIZATION OF RESERVOIR PERFORMANCE IN HUNTON FORMATION, OKLAHOMA

    Energy Technology Data Exchange (ETDEWEB)

    Mohan Kelkar

    2002-03-31

The West Carney Field in Lincoln County, Oklahoma is one of the few newly discovered oil fields in Oklahoma. Although profitable, the field exhibits several unusual characteristics. These include decreasing water-oil ratios, decreasing gas-oil ratios, decreasing bottomhole pressures during shut-ins in some wells, and transient behavior for water production in many wells. This report explains the unusual characteristics of West Carney Field based on detailed geological and engineering analyses. We propose a geological history that explains the presence of mobile water and oil in the reservoir. The combination of matrix and fractures in the reservoir explains the reservoir's flow behavior. We confirm our hypothesis by matching observed performance with a simulated model, and develop procedures for correlating core data to log data so that the analysis can be extended to other, similar fields where core coverage may be limited.

  19. EXPLOITATION AND OPTIMIZATION OF RESERVOIR PERFORMANCE IN HUNTON FORMATION, OKLAHOMA

    Energy Technology Data Exchange (ETDEWEB)

    Mohan Kelkar

    2005-02-01

The Hunton formation in Oklahoma has displayed some unique production characteristics. These include high initial water-oil and gas-oil ratios, decline in those ratios over time and a temporary increase in gas-oil ratio during pressure build-up. The formation also displays highly complex geology, but surprising hydrodynamic continuity. This report addresses three key issues related specifically to the West Carney Hunton field and, in general, to any other Hunton formation exhibiting similar behavior: (1) What is the primary mechanism by which oil and gas are produced from the field? (2) How can the knowledge gained from studying the existing fields be extended to other fields that have the potential to produce? (3) What can be done to improve the performance of this reservoir? We have developed a comprehensive model to explain the behavior of the reservoir. By using available production, geological, core and log data, we are able to develop a reservoir model which explains the production behavior in the reservoir. Using easily available information, such as log data, we have established the parameters needed for a field to be economically successful. We provide guidelines in terms of what to look for in a new field and how to develop it. Finally, through laboratory experiments, we show that surfactants can be used to improve hydrocarbon recovery from the field. In addition, injection of CO{sub 2} or natural gas will also help recover additional oil from the field.

  20. Ambidextrous Leadership and Employees' Self-reported Innovative Performance: The Role of Exploration and Exploitation Behaviors

    Science.gov (United States)

    Zacher, Hannes; Robinson, Alecia J.; Rosing, Kathrin

    2016-01-01

    The ambidexterity theory of leadership for innovation proposes that leaders' opening and closing behaviors positively predict employees' exploration and exploitation behaviors, respectively. The interaction of exploration and exploitation behaviors, in turn, is assumed to influence employee innovative performance, such that innovative performance…

  1. Exploitation of parallelism in climate models. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Baer, Ferdinand; Tribbia, Joseph J.; Williamson, David L.

    2001-02-05

    This final report includes details on the research accomplished by the grant entitled 'Exploitation of Parallelism in Climate Models' to the University of Maryland. The purpose of the grant was to shed light on (a) how to reconfigure the atmospheric prediction equations such that the time iteration process could be compressed by use of MPP architecture; (b) how to develop local subgrid scale models which can provide time and space dependent parameterization for a state-of-the-art climate model to minimize the scale resolution necessary for a climate model, and to utilize MPP capability to simultaneously integrate those subgrid models and their statistics; and (c) how to capitalize on the MPP architecture to study the inherent ensemble nature of the climate problem. In the process of addressing these issues, we created parallel algorithms with spectral accuracy; we developed a process for concurrent climate simulations; we established suitable model reconstructions to speed up computation; we identified and tested optimum realization statistics; we undertook a number of parameterization studies to better understand model physics; and we studied the impact of subgrid scale motions and their parameterization in atmospheric models.

  2. Exploiting graph kernels for high performance biomedical relation extraction.

    Science.gov (United States)

    Panyam, Nagesh C; Verspoor, Karin; Cohn, Trevor; Ramamohanarao, Kotagiri

    2018-01-30

Relation extraction from biomedical publications is an important task in the area of semantic mining of text. Kernel methods for supervised relation extraction are often preferred over manual feature engineering methods, when classifying highly ordered structures such as trees and graphs obtained from syntactic parsing of a sentence. Tree kernels such as the Subset Tree Kernel and Partial Tree Kernel have been shown to be effective for classifying constituency parse trees and basic dependency parse graphs of a sentence. Graph kernels such as the All Path Graph kernel (APG) and Approximate Subgraph Matching (ASM) kernel have been shown to be suitable for classifying general graphs with cycles, such as the enhanced dependency parse graph of a sentence. In this work, we present a high performance Chemical-Induced Disease (CID) relation extraction system. We present a comparative study of kernel methods for the CID task and also extend our study to the Protein-Protein Interaction (PPI) extraction task, an important biomedical relation extraction task. We discuss novel modifications to the ASM kernel to boost its performance and a method to apply graph kernels for extracting relations expressed in multiple sentences. Our system for CID relation extraction attains an F-score of 60%, without using external knowledge sources or task-specific heuristics or rules. In comparison, the state-of-the-art Chemical-Disease Relation Extraction system achieves an F-score of 56% using an ensemble of multiple machine learning methods, which is then boosted to 61% with a rule-based system employing task-specific post-processing rules. For the CID task, graph kernels outperform tree kernels substantially, and the best performance is obtained with the APG kernel, which attains an F-score of 60%, followed by the ASM kernel at 57%. The performance difference between the ASM and APG kernels for CID sentence level relation extraction is not significant. 
In our evaluation of ASM for the PPI task, ASM

  3. PANTHER. Pattern ANalytics To support High-performance Exploitation and Reasoning.

    Energy Technology Data Exchange (ETDEWEB)

    Czuchlewski, Kristina Rodriguez [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hart, William E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

Sandia has approached the analysis of big datasets with an integrated methodology that uses computer science, image processing, and human factors to exploit critical patterns and relationships in large datasets despite the variety and rapidity of information. The work is part of a three-year LDRD Grand Challenge called PANTHER (Pattern ANalytics To support High-performance Exploitation and Reasoning). To maximize data analysis capability, Sandia pursued scientific advances across three key technical domains: (1) geospatial-temporal feature extraction via image segmentation and classification; (2) geospatial-temporal analysis capabilities tailored to identify and process new signatures more efficiently; and (3) domain-relevant models of human perception and cognition informing the design of analytic systems. Our integrated results include advances in geographical information systems (GIS) in which we discover activity patterns in noisy, spatial-temporal datasets using geospatial-temporal semantic graphs. We employed computational geometry and machine learning to allow us to extract and predict spatial-temporal patterns and outliers from large aircraft and maritime trajectory datasets. We automatically extracted static and ephemeral features from real, noisy synthetic aperture radar imagery for ingestion into a geospatial-temporal semantic graph. We worked with analysts and investigated analytic workflows to (1) determine how experiential knowledge evolves and is deployed in high-demand, high-throughput visual search workflows, and (2) better understand visual search performance and attention. Through PANTHER, Sandia's fundamental rethinking of key aspects of geospatial data analysis permits the extraction of much richer information from large amounts of data. The project results enable analysts to examine mountains of historical and current data that would otherwise go untouched, while also gaining meaningful, measurable, and defensible insights into

  4. Acquiring and Exploiting Rich Causal Models for Robust Decision Making

    Science.gov (United States)

    2012-01-01

    snake robot in a maze-like world. The model learns a set of motion primitives that look like the serpentine motions of real snakes – because those are ...Figure 8: On the left: a simple maze world with shared structure between states. States are fully observable, but are augmented with information about...must navigate a maze (left); the right shows performance (vertical axis) as a function of time (horizontal axis) for different algorithms. BOSS

  5. Exploiting Textured 3D Models for Developing Serious Games

    Directory of Open Access Journals (Sweden)

    G. Kontogianni

    2015-08-01

    Full Text Available Digital technologies have significantly affected many fields of computer graphics, such as games and especially Serious Games. These games are usually used for educational purposes in fields such as health care, military applications, education, government, etc. Digital Cultural Heritage in particular is a scientific area in which Serious Games are applied, and lately many applications have appeared in the related literature. Realistic textured 3D models, produced using different photogrammetric methods, can be a useful tool for creating Serious Game applications, making the final result more realistic and closer to reality. The basic goal of this paper is to show how 3D textured models produced by photogrammetric methods can be useful for developing a more realistic Serious Game environment. The application of this project aims at the creation of an educational game for the Ancient Agora of Athens. The 3D models used vary not only in their production methods (i.e. Time of Flight laser scanner, Structure from Motion, virtual historical reconstruction, etc.) but also in their era, as some of them are illustrated according to their existing situation and others according to how these monuments looked in the past. The Unity 3D® game development environment was used for creating this application, in which all these models were inserted in the same file format. For the application, two diachronic virtual tours of the Athenian Agora were produced: the first illustrates the Agora as it is today and the second as it was in the 2nd century A.D. The future perspective for the evolution of this game includes the addition of questions that the user will be able to answer, and an evaluation is scheduled to be performed at the end of the project.

  6. Television production, Funding Models and Exploitation of Content

    Directory of Open Access Journals (Sweden)

    Gillian Doyle

    2016-07-01

    Full Text Available The rise of digital platforms has transformative implications for strategies of financing media production and for exploitation of the economic value in creative content. In the television industry, changes in technologies for distribution and the emergence of SVOD services such as Netflix are gradually shifting audiences and financial power away from broadcasters while at the same time creating unprecedented opportunities for programme-makers. Drawing on findings from recent RCUK-funded research, this article examines how these shifts are affecting production financing and the economics of supplying television content. In particular, it focuses on what changes in the dynamics of rights markets and in strategic approaches towards the financing of television production might mean for markets, industries and for policies intended to support the economic sustainability of independent television content production businesses.

  7. Economic modelling under conditions of exploitation of cohesive construction minerals

    Directory of Open Access Journals (Sweden)

    Milan Mikoláš

    2011-01-01

    Full Text Available Managers of mining companies use advanced modelling methods and computer simulations to support decision-making on the optimization of manufacturing processes. The article proposes and analyses a model of a mining company's production cycle consisting of a three-dimensional quarry model, a technology model and an economic-mathematical model. Based on the latter model, an economic simulation model of a quarry has been created in the MS Excel program, currently available on all personal computers, which measures outputs in the form of changes in total and unit costs, according to the generic classification of costs, in response to changes in inputs in the form of parameters of technology equipment and other operating parameters. Managers use the economic simulation model of the quarry as decision support to increase profitability or improve the competitiveness of their product in the construction minerals sector.

  8. Exploiting Modelling and Simulation in Support of Cyber Defence

    NARCIS (Netherlands)

    Klaver, M.H.A.; Boltjes, B.; Croom-Jonson, S.; Jonat, F.; Çankaya, Y.

    2014-01-01

    The rapidly evolving environment of Cyber threats against the NATO Alliance has necessitated a renewed focus on the development of Cyber Defence policy and capabilities. The NATO Modelling and Simulation Group is looking for ways to leverage Modelling and Simulation experience in research, analysis

  9. Exploiting Instability: A Model for Managing Organizational Change.

    Science.gov (United States)

    Frank, Debra; Rocks, William

    In response to decreased levels of funding and declining enrollments, increased competition, and major technological advances, Allegany Community College, in Maryland, has developed a model for managing organizational change. The model incorporates the following four components for effective transition and change: conceptualization; communication;…

  10. Exploiting linkage disequilibrium in statistical modelling in quantitative genomics

    DEFF Research Database (Denmark)

    Wang, Lei

    Alleles at two loci are said to be in linkage disequilibrium (LD) when they are correlated or statistically dependent. Genomic prediction and gene mapping rely on the existence of LD between genetic markers and causal variants of complex traits. In the first part of the thesis, a novel method...... the recently proposed antedependence models, which treat neighbouring marker effects as correlated; another approach involves use of haplotype block information derived using the program Beagle. The overall conclusion is that taking LD information into account in genomic prediction models potentially improves...

  11. Using CASE to Exploit Process Modeling in Technology Transfer

    Science.gov (United States)

    Renz-Olar, Cheryl

    2003-01-01

    A successful business will be one that has processes in place to run that business. Creating processes, reengineering processes, and continually improving processes can be accomplished through extensive modeling. Casewise(R) Corporate Modeler(TM) CASE is a computer-aided software engineering tool that will enable the Technology Transfer Department (TT) at NASA Marshall Space Flight Center (MSFC) to capture these abilities. After successful implementation of CASE, it could then go on to be applied in other departments at MSFC and other centers at NASA. The success of a business process is dependent upon the players working as a team and continuously improving the process. A good process fosters customer satisfaction as well as internal satisfaction in the organizational infrastructure. CASE provides a method for business process success through functions consisting of systems and processes business models; specialized diagrams; matrix management; simulation; report generation and publishing; and linking, importing, and exporting documents and files. The software has an underlying repository or database to support these functions. The Casewise manual informs us that dynamics modeling is a technique used in business design and analysis. Feedback is used as a tool for the end users and generates different ways of dealing with the process. Feedback on this project resulted from the collection of issues through a systems-analyst approach of interviews with process coordinators and Technical Points of Contact (TPOCs).

  12. The German gas hydrate initiative SUGAR: innovative exploitation techniques, numerical modelling, and laboratory experiments

    Energy Technology Data Exchange (ETDEWEB)

    Haeckel, M. [IFM-GEOMAR, Kiel (Germany)

    2010-07-01

    The German gas hydrate initiative SUGAR and its innovative exploitation techniques, numerical modelling, and laboratory experiments were discussed in this presentation. The main objectives of the SUGAR project are to model the spatial and temporal distribution of sub-seafloor gas hydrates; to constrain the major control parameters of hydrate formation; and to develop a new tool for three-dimensional prediction of gas hydrate deposits. The overall purpose is to predict exploitable gas hydrate deposits. Several illustrations were offered, including basin/geological modelling; properties of gas hydrate layers; and nested model/local grid refinement. The future of basin modeling was also discussed with particular reference to testing and calibration of software packages. Options for carbon dioxide storage and methane hydrate exploitation as well as critical issues that needed to be addressed were also outlined. Several reservoir modelling projects were presented. Scenarios and technical concepts were discussed. The presentation concluded with a summary of international co-operation efforts in this area. tabs., figs.

  13. Exploiting Language Models to Classify Events from Twitter

    Directory of Open Access Journals (Sweden)

    Duc-Thuan Vo

    2015-01-01

    Full Text Available Classifying events is challenging in Twitter because tweet texts contain a large amount of temporal data with a lot of noise and various kinds of topics. In this paper, we propose a method to classify events from Twitter. We first find the distinguishing terms between tweets in events and measure their similarities with language models such as ConceptNet and a latent Dirichlet allocation method for selectional preferences (LDA-SP), which have been widely studied based on large text corpora within computational linguistic relations. The relationships of term words in tweets are discovered by checking them under each model. We then propose a method to compute the similarity between tweets based on tweets' features, including common term words and relationships among their distinguishing term words. This makes it explicit and convenient to apply k-nearest-neighbour techniques for classification. Experiments on the Edinburgh Twitter Corpus show that our method achieves competitive results for classifying events.
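
    The kNN step described above can be sketched minimally: represent tweets as token sets, score similarity by term overlap (Jaccard), and label a new tweet by majority vote among its k most similar labelled neighbours. This is a simplified stand-in for the paper's similarity measure, which also uses relations mined with ConceptNet and LDA-SP; all tweets and labels below are invented.

```python
# Toy k-nearest-neighbour event classifier over shared terms.
# (Sketch only: the paper's similarity also uses language-model relations.)

def jaccard(a, b):
    """Term-overlap similarity between two whitespace-tokenized tweets."""
    sa, sb = set(a.split()), set(b.split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def knn_classify(tweet, labelled, k=3):
    """Label `tweet` by majority vote among its k most similar neighbours."""
    ranked = sorted(labelled, key=lambda t: jaccard(tweet, t[0]), reverse=True)
    votes = [label for _, label in ranked[:k]]
    return max(set(votes), key=votes.count)

corpus = [  # hypothetical labelled tweets
    ("earthquake hits city overnight", "disaster"),
    ("strong earthquake shakes buildings", "disaster"),
    ("team wins championship final", "sports"),
    ("fans celebrate championship win", "sports"),
]
label = knn_classify("earthquake shakes city", corpus)
print(label)  # disaster
```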

  14. Exploitation of Semantic Building Model in Indoor Navigation Systems

    Science.gov (United States)

    Anjomshoaa, A.; Shayeganfar, F.; Tjoa, A. Min

    2009-04-01

    There are many types of indoor and outdoor navigation tools and methodologies available. A majority of these solutions are based on Global Positioning Systems (GPS) and instant video and image processing. These approaches are ideal for open-world environments where very little information about the target location is available, but for large-scale building environments such as hospitals, governmental offices, etc., the end-user needs more detailed information about the surrounding context, which is especially important in the case of people with special needs. This paper presents a smart indoor navigation solution that is based on Semantic Web technologies and Building Information Models (BIM). The proposed solution is also aligned with Google Android's concepts to ease the realization of results. Keywords: IAI IFCXML, Building Information Model, Indoor Navigation, Semantic Web, Google Android, People with Special Needs. The built environment is a central factor in our daily life, and a big portion of human life is spent inside buildings. Traditionally, buildings are documented using maps and plans produced with IT tools such as computer-aided design (CAD) applications. Documenting maps electronically is already pervasive, but CAD drawings do not satisfy the requirements for effective building models that can be shared with other building-related applications such as indoor navigation systems. Navigation in the built environment is not a new issue; however, with advances in emerging technologies like GPS, mobile and networked environments, and the Semantic Web, new solutions have been suggested to enrich traditional building maps and convert them into smart information resources that can be reused in other applications and improve the interpretability with building inhabitants and building visitors. Other important issues that should be addressed in building navigation scenarios are location tagging and end-user communication

  15. Investigating performance variability of processing, exploitation, and dissemination using a socio-technical systems analysis approach

    Science.gov (United States)

    Danczyk, Jennifer; Wollocko, Arthur; Farry, Michael; Voshell, Martin

    2016-05-01

    Data collection processes supporting Intelligence, Surveillance, and Reconnaissance (ISR) missions have recently undergone a technological transition accomplished by investment in sensor platforms. Various agencies have made these investments to increase the resolution, duration, and quality of data collection, and to provide more relevant and recent data to warfighters. However, while sensor improvements have increased the volume of high-resolution data, they often fail to improve situational awareness and actionable intelligence for the warfighter, because the enterprise lacks efficient Processing, Exploitation, and Dissemination (PED) and filtering methods for mission-relevant information needs. The volume of collected ISR data often overwhelms manual and automated processes in modern analysis enterprises, resulting in underexploited data and insufficient or missing answers to information requests. The outcome is a significant breakdown in the analytical workflow. To cope with this data overload, many intelligence organizations have sought to re-organize their general staffing requirements and workflows to enhance team communication and coordination, with hopes of exploiting as much high-value data as possible and understanding the value of actionable intelligence well before its relevance has passed. Through this effort we have taken a scholarly approach to this problem by studying the evolution of Processing, Exploitation, and Dissemination, with a specific focus on the Army's most recent evolutions, using the Functional Resonance Analysis Method. This method investigates socio-technical processes by analyzing their intended functions and aspects to determine performance variabilities. Gaps are identified, and recommendations about force structure and future R&D priorities to increase the throughput of the intelligence enterprise are discussed.

  16. Models for solid oxide fuel cell systems: exploitation of models hierarchy for industrial design of control and diagnosis strategies

    CERN Document Server

    Marra, Dario; Polverino, Pierpaolo; Sorrentino, Marco

    2016-01-01

    This book presents methodologies for optimal design of control and diagnosis strategies for Solid Oxide Fuel Cell systems. A key feature of the methodologies presented is the exploitation of modelling tools that balance accuracy and computational burden.

  17. Shapes and excitations of heavy nuclei: Exploiting the simplicities of algebraic models

    Energy Technology Data Exchange (ETDEWEB)

    Casten, R.F. (Brookhaven National Lab., Upton, NY (United States) Koeln Univ. (Germany). Inst. fuer Kernphysik)

    1991-01-01

    Despite years of study there are still major unanswered questions concerning the shapes of medium and heavy nuclei and the nature of their intrinsic excitations. Some of these questions may profitably be addressed by exploiting the simplicities inherent in algebraic models. Examples, using the IBA, focusing on axial asymmetry, the nature of {beta} and {gamma} vibrations, and octupole correlations will be briefly discussed.


  19. Reconstruction of walleye exploitation based on angler diary records and a model of predicted catches.

    Science.gov (United States)

    Willms, Allan R; Green, David M

    2007-11-01

    The walleye population in Canadarago Lake, New York, was 81-95% exploited in the 1988 fishing season, the year in which a previous restriction on the length and number of legally harvestable fish was liberalized. Using diary records from a subset of fishermen, growth estimates, and an estimate of the walleye population in the following year, a method is developed to reconstruct the fish population back to the spring of 1988 and thus determine the exploitation rate. The method is based on a model of diary catches that partitions time and fish length into a set of cells and relates predicted catches and population sizes in these cells. The method's sensitivity to the partitioning scheme, the growth estimates, and the diary data is analyzed. The method could be employed in other fish exploitation analyses and demonstrates the use of inexpensive angler-collected data in fisheries management.
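
    The back-reconstruction idea can be sketched in toy form: given the following spring's population estimate, the season's harvest, and an assumed instantaneous natural mortality M, the start-of-season abundance is recovered by reversing survival. The paper's actual method partitions time and fish length into cells and fits predicted diary catches; the numbers and M below are invented.

```python
import math

def back_calculate(n_next, catch, m=0.2):
    """Start-of-season abundance: next spring's survivors divided by the
    natural-survival fraction, plus the fish removed by anglers.
    (Toy version; the paper partitions time and fish length into cells.)"""
    return n_next / math.exp(-m) + catch

n_1989 = 500         # hypothetical spring-1989 population estimate
harvest_1988 = 4000  # hypothetical 1988 harvest
n_1988 = back_calculate(n_1989, harvest_1988)
exploitation = harvest_1988 / n_1988
print(round(exploitation, 2))  # 0.87
```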

  20. Exploration and exploitation within SMEs: connecting the CEO's cognitive style to product innovation performance.

    NARCIS (Netherlands)

    de Visser, Matthias; Faems, D.L.M.; van den Top, Peter

    2011-01-01

    Previous research on exploration and exploitation focuses on the firm and business unit level. Therefore, conceptual and empirically validated understanding about exploration and exploitation at the individual level of analysis is scarce. This paper addresses this gap in the literature by

  1. Documenting, modelling and exploiting P300 amplitude changes due to variable target delays in Donchin's speller

    Science.gov (United States)

    Citi, Luca; Poli, Riccardo; Cinel, Caterina

    2010-10-01

    The P300 is an endogenous event-related potential (ERP) that is naturally elicited by rare and significant external stimuli. P300s are used increasingly frequently in brain-computer interfaces (BCIs) because the users of ERP-based BCIs need no special training. However, P300 waves are hard to detect and, therefore, multiple target stimulus presentations are needed before an interface can make a reliable decision. While significant improvements have been made in the detection of P300s, no particular attention has been paid to the variability in shape and timing of P300 waves in BCIs. In this paper we start filling this gap by documenting, modelling and exploiting a modulation in the amplitude of P300s related to the number of non-targets preceding a target in a Donchin speller. The basic idea in our approach is to use an appropriately weighted average of the responses produced by a classifier during multiple stimulus presentations, instead of the traditional plain average. This makes it possible to weigh more heavily events that are likely to be more informative, thereby increasing the accuracy of classification. The optimal weights are determined through a mathematical model that precisely estimates the accuracy of our speller as well as the expected performance improvement w.r.t. the traditional approach. Tests with two independent datasets show that our approach provides a marked statistically significant improvement in accuracy over the top-performing algorithm presented in the literature to date. The method and the theoretical models we propose are general and can easily be used in other P300-based BCIs with minimal changes.
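
    The weighting idea can be sketched as follows: each repetition's classifier score is weighted by how informative that presentation is expected to be, here a simple increasing function of the number of preceding non-targets. The paper derives optimal weights from a mathematical model of speller accuracy; the linear weights below are a purely hypothetical stand-in.

```python
import numpy as np

def weighted_decision(scores, preceding_nontargets):
    """Combine classifier scores from repeated target presentations.
    Each score is weighted by an assumed reliability that grows with the
    number of non-targets preceding the target, since those events elicit
    larger (more informative) P300s. Illustrative weights only."""
    w = 1.0 + 0.5 * np.asarray(preceding_nontargets, dtype=float)
    return float(np.dot(w, scores) / w.sum())

scores = [0.2, 0.9, 0.7]  # classifier outputs for three repetitions
gaps = [0, 4, 2]          # non-targets preceding each target flash
print(weighted_decision(scores, gaps))  # 0.7166..., vs. plain mean 0.6
```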

  2. DEVELOPMENT OF RESERVOIR CHARACTERIZATION TECHNIQUES AND PRODUCTION MODELS FOR EXPLOITING NATURALLY FRACTURED RESERVOIRS

    Energy Technology Data Exchange (ETDEWEB)

    Michael L. Wiggins; Raymon L. Brown; Faruk Civan; Richard G. Hughes

    2002-12-31

    For many years, geoscientists and engineers have undertaken research to characterize naturally fractured reservoirs. Geoscientists have focused on understanding the process of fracturing and the subsequent measurement and description of fracture characteristics. Engineers have concentrated on the fluid flow behavior in the fracture-porous media system and the development of models to predict the hydrocarbon production from these complex systems. This research attempts to integrate these two complementary views to develop a quantitative reservoir characterization methodology and flow performance model for naturally fractured reservoirs. The research has focused on estimating naturally fractured reservoir properties from seismic data, predicting fracture characteristics from well logs, and developing a naturally fractured reservoir simulator. It is important to develop techniques that can be applied to estimate the important parameters in predicting the performance of naturally fractured reservoirs. This project proposes a method to relate seismic properties to the elastic compliance and permeability of the reservoir based upon a sugar cube model. In addition, methods are presented to use conventional well logs to estimate localized fracture information for reservoir characterization purposes. The ability to estimate fracture information from conventional well logs is very important in older wells where data are often limited. Finally, a desktop naturally fractured reservoir simulator has been developed for the purpose of predicting the performance of these complex reservoirs. The simulator incorporates vertical and horizontal wellbore models, methods to handle matrix to fracture fluid transfer, and fracture permeability tensors. This research project has developed methods to characterize and study the performance of naturally fractured reservoirs that integrate geoscience and engineering data. This is an important step in developing exploitation strategies for

  3. Inference of gene regulatory networks with sparse structural equation models exploiting genetic perturbations.

    Directory of Open Access Journals (Sweden)

    Xiaodong Cai

    Full Text Available Integrating genetic perturbations with gene expression data not only improves accuracy of regulatory network topology inference, but also enables learning of causal regulatory relations between genes. Although a number of methods have been developed to integrate both types of data, efficient and powerful algorithms remain a desideratum. In this paper, sparse structural equation models (SEMs) are employed to integrate both gene expression data and cis-expression quantitative trait loci (cis-eQTL), for modeling gene regulatory networks in accordance with biological evidence about genes regulating or being regulated by a small number of genes. A systematic inference method named sparsity-aware maximum likelihood (SML) is developed for SEM estimation. Using simulated directed acyclic or cyclic networks, the SML performance is compared with that of two state-of-the-art algorithms: the adaptive Lasso (AL)-based scheme, and the QTL-directed dependency graph (QDG) method. Computer simulations demonstrate that the novel SML algorithm offers significantly better performance than the AL-based and QDG algorithms across all sample sizes from 100 to 1,000, in terms of detection power and false discovery rate, in all the cases tested, which include acyclic or cyclic networks of 10, 30 and 300 genes. The SML method is further applied to infer a network of 39 human genes that are related to the immune function and are chosen to have a reliable eQTL per gene. The resulting network consists of 9 genes and 13 edges. Most of the edges represent interactions reasonably expected from experimental evidence, while the remaining may just indicate the emergence of new interactions. The sparse SEM and efficient SML algorithm provide an effective means of exploiting both gene expression and perturbation data to infer gene regulatory networks. An open-source computer program implementing the SML algorithm is freely available upon request.
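
    The idea of combining expression data with cis-eQTL genotypes in a sparse structural equation can be sketched on a toy two-gene network. The thresholded least-squares step below is a crude stand-in for the paper's sparsity-aware maximum likelihood; all data, coefficients, and the threshold are simulated/invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated two-gene network with cis-eQTL genotypes q1, q2 (all invented):
# gene 1 is driven by its own eQTL; gene 2 is regulated by gene 1 (0.8)
# and by its own eQTL (0.5).
q1 = rng.integers(0, 3, n).astype(float)
q2 = rng.integers(0, 3, n).astype(float)
g1 = 1.0 * q1 + rng.normal(0, 0.3, n)
g2 = 0.8 * g1 + 0.5 * q2 + rng.normal(0, 0.3, n)

def thresholded_ols(y, X, tau=0.2):
    """Crude stand-in for the sparsity-aware ML step: least squares
    followed by hard-thresholding small coefficients to zero."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    beta[np.abs(beta) < tau] = 0.0
    return beta

# Structural equation for gene 2: candidate regressors are gene 1's
# expression and gene 2's own cis-eQTL genotype.
beta = thresholded_ols(g2, np.column_stack([g1, q2]))
print(beta.round(2))  # close to [0.8, 0.5]
```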

  4. Balanced Exploration and Exploitation Model search for efficient epipolar geometry estimation.

    Science.gov (United States)

    Goshen, Liran; Shimshoni, Ilan

    2008-07-01

    The estimation of the epipolar geometry is especially difficult when the putative correspondences include a low percentage of inlier correspondences and/or a large subset of the inliers is consistent with a degenerate configuration of the epipolar geometry that is totally incorrect. This work presents the Balanced Exploration and Exploitation Model Search (BEEM) algorithm that works very well especially for these difficult scenes. The algorithm handles these two problems in a unified manner. It includes the following main features: (1) Balanced use of three search techniques: global random exploration, local exploration near the current best solution and local exploitation to improve the quality of the model. (2) Exploits available prior information to accelerate the search process. (3) Uses the best found model to guide the search process, escape from degenerate models and to define an efficient stopping criterion. (4) Presents a simple and efficient method to estimate the epipolar geometry from two SIFT correspondences. (5) Uses the locality-sensitive hashing (LSH) approximate nearest neighbor algorithm for fast putative correspondences generation. The resulting algorithm when tested on real images with or without degenerate configurations gives quality estimations and achieves significant speedups compared to the state of the art algorithms.
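
    The exploration/exploitation balance at the heart of the algorithm can be illustrated with a toy robust-fitting loop, in which fitting a 2-D line to data with gross outliers stands in for estimating the fundamental matrix: minimal random samples provide global exploration, while small perturbations of the best model found so far provide local exploitation. All data, the 50/50 balance, and the perturbation scale are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

# 60 inliers on y = 2x + 1 plus 40 gross outliers (synthetic data).
n_in, n_out = 60, 40
x = rng.uniform(-5, 5, n_in + n_out)
y = 2.0 * x + 1.0 + rng.normal(0, 0.1, n_in + n_out)
y[n_in:] = rng.uniform(-15, 15, n_out)

def support(a, b, tol=0.3):
    """Number of points within `tol` of the line y = a*x + b."""
    return int(np.sum(np.abs(y - (a * x + b)) < tol))

best, best_support = None, -1
for _ in range(300):
    if best is None or rng.random() < 0.5:
        # Exploration: fit a line to a random minimal sample of 2 points.
        i, j = rng.choice(len(x), 2, replace=False)
        a = (y[j] - y[i]) / (x[j] - x[i])
        b = y[i] - a * x[i]
    else:
        # Exploitation: perturb the best model found so far.
        a = best[0] + rng.normal(0, 0.05)
        b = best[1] + rng.normal(0, 0.05)
    s = support(a, b)
    if s > best_support:
        best, best_support = (a, b), s

print(best_support)  # most of the 60 inliers are recovered
```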

  5. A low memory cost model based reconstruction algorithm exploiting translational symmetry for photoacoustic microscopy.

    Science.gov (United States)

    Aguirre, Juan; Giannoula, Alexia; Minagawa, Taisuke; Funk, Lutz; Turon, Pau; Durduran, Turgut

    2013-01-01

    A model based reconstruction algorithm that exploits translational symmetries for photoacoustic microscopy to drastically reduce the memory cost is presented. The memory size needed to store the model matrix is independent of the number of acquisitions at different positions. This helps us to overcome one of the main limitations of previous algorithms. Furthermore, using the algebraic reconstruction technique and building the model matrix "on the fly", we have obtained fast reconstructions of simulated and experimental data on both two- and three-dimensional grids using a traditional dark field photoacoustic microscope and a standard personal computer.
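
    The memory saving can be sketched on a 1-D periodic grid: under translational symmetry every row of the model matrix is the same reference kernel shifted to the acquisition position, so rows can be generated "on the fly" during the matrix-vector products of an iterative reconstruction instead of being stored. The Gaussian kernel and grid below are invented for illustration.

```python
import numpy as np

def matvec_on_the_fly(kernel, x, positions):
    """Compute A @ x without storing A: row p of the model matrix is the
    reference kernel rolled to scan position p (translational symmetry)."""
    return np.array([np.roll(kernel, p) @ x for p in positions])

n = 64
kernel = np.exp(-0.5 * (np.arange(n) - n // 2) ** 2 / 4.0)  # one stored row
x = np.zeros(n)
x[20] = 1.0                       # a point absorber
positions = list(range(0, n, 4))  # scan positions

y = matvec_on_the_fly(kernel, x, positions)

# Same result as the explicit matrix, at a fraction of the memory.
full_A = np.array([np.roll(kernel, p) for p in positions])
assert np.allclose(y, full_A @ x)
```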

  6. Rational selective exploitation and distress: employee reactions to performance-based and mobility-based reward allocations.

    Science.gov (United States)

    Rusbult, C E; Campbell, M A; Price, M E

    1990-09-01

    Prior research has demonstrated that allocators frequently distribute greater rewards to persons with high professional and geographic mobility than to persons with constrained mobility, especially among the very competent. This phenomenon has been termed rational selective exploitation. Do the recipients of such allocations actually experience this distribution rule as unjust and distressing, or is it a misnomer to refer to this phenomenon as exploitation? Two studies were conducted to explore this question. Study 1 was a laboratory experiment in which we manipulated relative performance level, relative mobility level, and allocation standard: performance based versus mobility based. Study 2 was a cross-sectional survey of actual employees in which subjects reported the degree to which performance and mobility were the basis for pay decisions at their places of employment, as well as the degree to which they perceived each standard to be fair. Both studies demonstrated that people regard mobility-based allocations as less fair and more distressing than performance-based allocations. Furthermore, the degree of distress resulting from mobility-based allocations is greater among persons who are disadvantaged by that standard: among people with constrained mobility, especially those who perform at high levels. These findings provide good support for the assertion that so-called rational selective exploitation is indeed distressing to employees. Reactions to this form of distress are also explored, and the implications of these findings for the allocation process are discussed.

  7. Improving the Effectiveness of Integral Property Calculation in a CSG Solid Modeling System by Exploiting Predictability

    Science.gov (United States)

    Clark, A. L.

    1985-01-01

    Integral property calculation is an important application for solid modeling systems. Algorithms for computing integral properties for various solid representation schemes are fairly well known. It is important for designers and users of solid modeling systems to understand the behavior of such algorithms. Specifically, the trade-off between execution time and accuracy is critical to effective use of integral property calculation. The average behavior of two algorithms for Constructive Solid Geometry (CSG) representations is investigated. Experimental results from the PADL-2 solid modeling system show that coarse decompositions can be used to predict execution time and error estimates for finer decompositions. Exploiting this predictability allows effective use of the algorithms in a solid modeling system.

  8. Photovoltaic array performance model.

    Energy Technology Data Exchange (ETDEWEB)

    Kratochvil, Jay A.; Boyson, William Earl; King, David L.

    2004-08-01

    This document summarizes the equations and applications associated with the photovoltaic array performance model developed at Sandia National Laboratories over the last twelve years. Electrical, thermal, and optical characteristics for photovoltaic modules are included in the model, and the model is designed to use hourly solar resource and meteorological data. The versatility and accuracy of the model has been validated for flat-plate modules (all technologies) and for concentrator modules, as well as for large arrays of modules. Applications include system design and sizing, 'translation' of field performance measurements to standard reporting conditions, system performance optimization, and real-time comparison of measured versus expected system performance.
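
    A minimal illustration of the "translation" application: scale a field measurement linearly with plane-of-array irradiance and correct it with a power temperature coefficient to estimate output at Standard Reporting Conditions (1000 W/m^2, 25 °C). This is a generic simplification, not the Sandia model's actual equation set (which also includes optical and spectral corrections); gamma and the readings below are hypothetical.

```python
def translate_to_src(p_meas, g_poa, t_cell, gamma=-0.004):
    """Translate measured DC power to Standard Reporting Conditions using
    linear irradiance scaling and a power temperature coefficient gamma
    (fraction per deg C). Simplified sketch, not the full Sandia model."""
    return p_meas * (1000.0 / g_poa) / (1.0 + gamma * (t_cell - 25.0))

# Hypothetical field reading: 180 W at 800 W/m^2 and 45 deg C cell temp.
print(round(translate_to_src(180.0, 800.0, 45.0), 1))  # 244.6
```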

  9. Modelling the response of size and diversity spectra of fish assemblages to changes in exploitation

    DEFF Research Database (Denmark)

    Gislason, Henrik; Rice, J.

    1998-01-01

    In this paper we investigate whether single and multispecies fisheries models can be used to predict the response of size and diversity spectra of fish assemblages to changes in exploitation. Both types of models estimate that the slope of the size spectrum will steepen and the intercept will increase when fishing intensity increases, while the response of the slope and intercept of the diversity spectrum depends on the model used. The changes in the slope and intercept of the size spectrum are found to be proportional to the change in fishing intensity. The proportionality is insensitive to changes in natural mortality, but sensitive to changes in growth and to the relationship between stock and recruitment. The results agree well with results obtained from previous analysis of survey data from the North Sea and suggest that the slope of the size spectrum is a useful measure of fishing...
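
    The size-spectrum response described above can be sketched as a log-linear fit: log abundance per log body-weight class falls on a straight line, and heavier exploitation steepens the slope (and here lifts the intercept). The two spectra below are synthetic, invented only to mimic that pattern.

```python
import numpy as np

def size_spectrum(log_w, log_n):
    """Fit slope and intercept of the size spectrum: a straight line
    through log abundance versus log body weight."""
    slope, intercept = np.polyfit(log_w, log_n, 1)
    return slope, intercept

log_w = np.linspace(0, 6, 13)   # log2 body-weight classes (synthetic)
unfished = 10.0 - 1.0 * log_w   # log abundance, slope -1
fished = 10.5 - 1.3 * log_w     # heavier exploitation: steeper slope

s0, i0 = size_spectrum(log_w, unfished)
s1, i1 = size_spectrum(log_w, fished)
print(round(s0, 2), round(s1, 2))  # -1.0 -1.3
```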

  10. Exploiting remote sensing land surface temperature in distributed hydrological modelling: the example of the Continuum model

    Directory of Open Access Journals (Sweden)

    F. Silvestro

    2013-01-01

    Full Text Available Full process description and distributed hydrological models are very useful tools in hydrology as they can be applied in different contexts and for a wide range of aims, such as flood and drought forecasting, water management, and prediction of impacts on the hydrologic cycle due to natural and human-induced changes. Since they must mimic a variety of physical processes, they can be very complex and have a high degree of parameterization. This complexity can be increased by the need to augment the number of observable state variables in order to improve model validation or to allow data assimilation.

    In this work a model, aiming at balancing the need to reproduce the physical processes with the practical goal of avoiding over-parameterization, is presented. The model is designed to be implemented in different contexts with a special focus on data-scarce environments, e.g. with no streamflow data.

    All the main hydrological phenomena are modelled in a distributed way. Mass and energy balance are solved explicitly. Land surface temperature (LST), which is particularly suited to being extensively observed and assimilated, is an explicit state variable.

    A performance evaluation, based on both traditional and satellite-derived data, is presented with specific reference to an application in an Italian catchment. The model was first calibrated and validated following a standard approach based on streamflow data. The capability of the model to reproduce both the streamflow measurements and the land surface temperature from satellites has been investigated.

    The model was then calibrated using satellite data and geomorphologic characteristics of the basin in order to test its application on a basin where standard hydrologic observations (e.g. streamflow data) are not available. The results have been compared with those obtained by the standard calibration strategy based on streamflow data.

  11. The Peace and Power Conceptual Model: An Assessment Guide for School Nurses Regarding Commercial Sexual Exploitation of Children.

    Science.gov (United States)

    Fraley, Hannah E; Aronowitz, Teri

    2017-10-01

    Human trafficking is a global problem; more than half of all victims are children. In the United States (US), at-risk youth continue to attend school. School nurses are on the frontlines, presenting a window of opportunity to identify and prevent exploitation. Available papers targeting school nurses report that school nurses may lack awareness of commercial sexual exploitation and may have attitudes and misperceptions about behaviors of school children at risk. This is a theoretical paper applying the Peace and Power Conceptual Model to understand the role of school nurses in commercial sexual exploitation of children.

  12. Inverse modeling and forecasting for the exploitation of the Pauzhetsky geothermal field, Kamchatka, Russia

    Energy Technology Data Exchange (ETDEWEB)

    Finsterle, Stefan; Kiryukhin, A.V.; Asaulova, N.P.; Finsterle, S.

    2008-04-01

    A three-dimensional numerical model of the Pauzhetsky geothermal field has been developed based on a conceptual hydrogeological model of the system. It extends over a 13.6-km2 area and includes three layers: (1) a base layer with inflow; (2) a geothermal reservoir; and (3) an upper layer with discharge and recharge/infiltration areas. Using the computer program iTOUGH2 (Finsterle, 2004), the model is calibrated to a total of 13,675 calibration points, combining natural-state and 1960-2006 exploitation data. The principal model parameters identified and estimated by inverse modeling include the fracture permeability and fracture porosity of the geothermal reservoir, the initial natural upflow rate, the base-layer porosity, and the permeabilities of the infiltration zones. Heat and mass balances derived from the calibrated model helped identify the sources of the geothermal reserves in the field. With the addition of five makeup wells, simulation forecasts for the 2007-2032 period predict a sustainable average steam production of 29 kg/s, which is sufficient to maintain the generation of 6.8 MWe at the Pauzhetsky power plant.
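    The inverse-modeling step can be illustrated, in miniature, as choosing the parameter value that minimizes the weighted squared misfit between simulated and observed data; the toy forward model below stands in for a full TOUGH2 simulation, and all numbers are invented:

```python
# Toy illustration of iTOUGH2-style inverse modeling: pick the parameter
# that minimizes the weighted squared misfit between simulated and
# observed data. The "forward model" is a made-up drawdown relation,
# not a TOUGH2 simulation; all numbers are invented.

def forward(perm, rates):
    """Toy forward model: drawdown inversely proportional to permeability."""
    return [q / perm for q in rates]

def misfit(perm, rates, observed, sigma=1.0):
    """Sum of squared residuals, weighted by the observation error sigma."""
    return sum(((s - o) / sigma) ** 2
               for s, o in zip(forward(perm, rates), observed))

rates = [10.0, 20.0, 30.0]        # production rates
observed = [5.0, 10.0, 15.0]      # synthetic drawdowns, consistent with perm = 2

# 1-D grid search over candidate permeabilities (iTOUGH2 instead performs
# gradient-based minimization over many parameters at once)
candidates = [p / 10.0 for p in range(10, 41)]   # 1.0 .. 4.0
best = min(candidates, key=lambda p: misfit(p, rates, observed))
```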

  13. Exploiting CMS data popularity to model the evolution of data management for Run-2 and beyond

    CERN Document Server

    Bonacorsi, D; Giordano, D; Girone, M; Neri, M; Magini, N; Kuznetsov, V; Wildish, T

    2015-01-01

    During the LHC Run-1 data taking, all experiments collected large data volumes from proton-proton and heavy-ion collisions. The collisions data, together with massive volumes of simulated data, were replicated in multiple copies, transferred among various Tier levels, transformed/slimmed in format/content. These data were then accessed (both locally and remotely) by large groups of distributed analysis communities exploiting the WorldWide LHC Computing Grid infrastructure and services. While efficient data placement strategies - together with optimal data redistribution and deletions on demand - have become the core of static versus dynamic data management projects, little effort has so far been invested in understanding the detailed data-access patterns which surfaced in Run-1. These patterns, if understood, can be used as input to simulation of computing models at the LHC, to optimise existing systems by tuning their behaviour, and to explore next-generation CPU/storage/network co-scheduling solutions. This...

  14. Coupled modeling of land hydrology–regional climate including human carbon emission and water exploitation

    Directory of Open Access Journals (Sweden)

    Zheng-Hui Xie

    2017-06-01

    Full Text Available Carbon emissions and water use are two major kinds of human activities. To reveal whether these two activities can modify the hydrological cycle and climate system in China, we conducted two sets of numerical experiments using the regional climate model RegCM4. In the first experiment, used to study the climatic responses to human carbon emissions, the model was configured over the whole of China because the impacts of carbon emissions can be detected across the entire country. Results from the first experiment revealed that near-surface air temperature may significantly increase from 2007 to 2059 at a rate exceeding 0.1 °C per decade in most areas across the country; southwestern and southeastern China also showed increasing trends in summer precipitation, with rates exceeding 10 mm per decade over the same period. In summer, only northern China showed an increasing trend of evapotranspiration, with increase rates ranging from 1 to 5 mm per decade; in winter, increase rates ranging from 1 to 5 mm per decade were observed in most regions. These effects are believed to be caused by global warming from human carbon emissions. In the second experiment, used to study the effects of human water use, the model was configured over a limited region, the Haihe River Basin in northern China, because, compared with human carbon emissions, the effects of human water use are much more local and regional, and the Haihe River Basin is the most typical region in China suffering from both intensive human groundwater exploitation and surface water diversion. We incorporated a scheme of human water regulation into RegCM4 and conducted the second experiment. Model outputs showed that the groundwater table severely declined by ∼10 m over 1971-2000 through human groundwater over-exploitation in the basin; in fact, current conditions are so extreme that even reducing the pumping rate by half cannot eliminate the groundwater depletion cones observed in the area.

  15. Exploitation of parallelism in climate models. [Annual] report, 1 September 1991--29 February 1992

    Energy Technology Data Exchange (ETDEWEB)

    Baer, F.; Tribbia, J.J.; Williamson, D.L.

    1992-05-01

    The US Department of Energy (DOE), through its CHAMMP initiative, hopes to develop the capability to make meaningful regional climate forecasts on time scales exceeding a decade, such capability to be based on numerical prediction type models. We propose research to contribute to each of the specific items enumerated in the CHAMMP announcement (Notice 9103); i.e., to consider theoretical limits to prediction of climate and climate change on appropriate time scales, to develop new mathematical techniques to utilize massively parallel processors (MPPs), to actually utilize MPPs as a research tool, and to develop improved representations of some processes essential to climate prediction. To explore these initiatives, we will exploit all available computing technology, and in particular MPP machines. We anticipate that significant improvements in modeling of climate on decadal and longer time scales for regional space scales will result from our efforts. This report summarizes the activities of our group during part of the first year's effort to meet the objectives stated in our proposal. We will comment on three research foci: time compression studies, subgrid-scale model studies, and distributed climate ensemble studies, as well as additional significant technical matters.

  16. Analytically exploiting noise correlations inside the feedback loop to improve locked-oscillator performance

    CSIR Research Space (South Africa)

    Sastrawan, J

    2016-08-01

    Full Text Available, providing a means to capture the stochastic evolution of the local oscillator frequency during the measurement cycle. We present analytic calculations and numerical simulations of oscillator performance under competing feedback schemes and demonstrate...

  17. Comparing Sustainable Performance of Industrial System Alternatives by Integrating Environment, Costs, Clients and Exploitation Context

    OpenAIRE

    Leroy, Yann; Cluzel, François; Lamé, Guillaume

    2014-01-01

    International audience; Life Cycle Assessment (LCA) is a methodology to assess environmental performances of products throughout their life cycles. Traditionally, LCA-based decision-making focuses on environmental impacts, excluding customer expectations and economic considerations. Moreover, it usually uses generic data while environmental performances of industrial systems often depend on local contexts. The aim of this paper is to provide a comprehensive framework to identify the solution ...

  18. DM-BLD: differential methylation detection using a hierarchical Bayesian model exploiting local dependency.

    Science.gov (United States)

    Wang, Xiao; Gu, Jinghua; Hilakivi-Clarke, Leena; Clarke, Robert; Xuan, Jianhua

    2017-01-15

    The advent of high-throughput DNA methylation profiling techniques has enabled accurate identification of differentially methylated genes for cancer research. The large number of measured loci facilitates whole-genome methylation studies, yet poses great challenges for differential methylation detection due to the high variability in tumor samples. We have developed a novel probabilistic approach, Differential Methylation detection using a hierarchical Bayesian model exploiting Local Dependency (DM-BLD), to detect differentially methylated genes based on a Bayesian framework. The DM-BLD approach features a joint model to capture both the local dependency of measured loci and the dependency of methylation change in samples. Specifically, the local dependency is modeled by a Leroux conditional autoregressive structure; the dependency of methylation changes is modeled by a discrete Markov random field. A hierarchical Bayesian model is developed to fully take into account the local dependency for differential analysis, in which differential states are embedded as hidden variables. Simulation studies demonstrate that DM-BLD outperforms existing methods for differential methylation detection, particularly when the methylation change is moderate and the variability of methylation in samples is high. DM-BLD has been applied to breast cancer data to identify important methylated genes (such as polycomb target genes and genes involved in transcription factor activity) associated with breast cancer recurrence. A Matlab package of DM-BLD is available at http://www.cbil.ece.vt.edu/software.htm. Contact: Xuan@vt.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  19. NIF capsule performance modeling

    Directory of Open Access Journals (Sweden)

    Weber S.

    2013-11-01

    Full Text Available Post-shot modeling of NIF capsule implosions was performed in order to validate our physical and numerical models. Cryogenic layered target implosions and experiments with surrogate targets produce an abundance of capsule performance data including implosion velocity, remaining ablator mass, times of peak x-ray and neutron emission, core image size, core symmetry, neutron yield, and x-ray spectra. We have attempted to match the integrated data set with capsule-only simulations by adjusting the drive and other physics parameters within expected uncertainties. The simulations include interface roughness, time-dependent symmetry, and a model of mix. We were able to match many of the measured performance parameters for a selection of shots.

  20. Exploiting multiple sources of information in learning an artificial language: human data and modeling.

    Science.gov (United States)

    Perruchet, Pierre; Tillmann, Barbara

    2010-03-01

    This study investigates the joint influences of three factors on the discovery of new word-like units in a continuous artificial speech stream: the statistical structure of the ongoing input, the initial word-likeness of parts of the speech flow, and the contextual information provided by the earlier emergence of other word-like units. Results of an experiment conducted with adult participants show that these sources of information have strong and interactive influences on word discovery. The authors then examine the ability of different models of word segmentation to account for these results. PARSER (Perruchet & Vinter, 1998) is compared to the view that word segmentation relies on the exploitation of transitional probabilities between successive syllables, and with the models based on the Minimum Description Length principle, such as INCDROP. The authors submit arguments suggesting that PARSER has the advantage of accounting for the whole pattern of data without ad-hoc modifications, while relying exclusively on general-purpose learning principles. This study strengthens the growing notion that nonspecific cognitive processes, mainly based on associative learning and memory principles, are able to account for a larger part of early language acquisition than previously assumed. Copyright © 2009 Cognitive Science Society, Inc.
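    The transitional-probability account that PARSER is compared against can be sketched as follows: estimate the probability of each syllable given the previous one over the continuous stream and posit word boundaries at low-probability transitions. The syllable stream and the boundary threshold below are made up:

```python
# Sketch of the transitional-probability (TP) account of word
# segmentation: estimate TP(A -> B) = P(B | A) between successive
# syllables and place word boundaries at low-TP transitions.
# The syllable stream and the threshold are invented for illustration.
from collections import Counter

def transition_probs(syllables):
    """Forward transitional probabilities between adjacent syllables."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {pair: c / first_counts[pair[0]] for pair, c in pair_counts.items()}

def segment(syllables, tps, threshold=0.8):
    """Insert a word boundary wherever the forward TP drops below threshold."""
    words, current = [], [syllables[0]]
    for a, b in zip(syllables, syllables[1:]):
        if tps[(a, b)] < threshold:
            words.append("".join(current))
            current = []
        current.append(b)
    words.append("".join(current))
    return words

# Continuous stream built from three "words": tupiro, golabu, bidaku.
# Within-word TPs are 1.0; between-word TPs are at most 2/3.
stream = ("tu pi ro go la bu bi da ku tu pi ro bi da ku "
          "go la bu tu pi ro go la bu bi da ku").split()
words = segment(stream, transition_probs(stream))
```

    PARSER, by contrast, builds and strengthens chunk representations through associative learning rather than computing these statistics explicitly.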

  1. Enhanced identification and exploitation of time scales for model reduction in stochastic chemical kinetics.

    Science.gov (United States)

    Gómez-Uribe, Carlos A; Verghese, George C; Tzafriri, Abraham R

    2008-12-28

    Widely different time scales are common in systems of chemical reactions and can be exploited to obtain reduced models applicable to the time scales of interest. These reduced models enable more efficient computation and simplify analysis. A classic example is the irreversible enzymatic reaction, for which separation of time scales in a deterministic mass action kinetics model results in approximate rate laws for the slow dynamics, such as that of Michaelis-Menten. Recently, several methods have been developed for separation of slow and fast time scales in chemical master equation (CME) descriptions of stochastic chemical kinetics, yielding separate reduced CMEs for the slow variables and the fast variables. The paper begins by systematizing the preliminary step of identifying slow and fast variables in a chemical system from a specification of the slow and fast reactions in the system. The authors then present an enhanced time-scale-separation method that can extend the validity and improve the accuracy of existing methods by better accounting for slow reactions when equilibrating the fast subsystem. The resulting method is particularly accurate in systems such as enzymatic and protein interaction networks, where the rates of the slow reactions that modify the slow variables are not a function of the slow variables. The authors apply their methodology to the case of an irreversible enzymatic reaction and show that the resulting improvements in accuracy and validity are analogous to those obtained in the deterministic case by using the total quasi-steady-state approximation rather than the classical Michaelis-Menten. The other main contribution of this paper is to show how mass fluctuation kinetics models, which give approximate evolution equations for the means, variances, and covariances of the concentrations in a chemical system, can feed into time-scale-separation methods at a variety of stages.
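    The deterministic baseline the record refers to can be illustrated by comparing the full mass-action model of E + S &lt;-&gt; C -&gt; E + P with the reduced Michaelis-Menten rate law obtained by equilibrating the fast binding step; all rate constants and concentrations below are made up:

```python
# Deterministic illustration of time-scale separation: full mass-action
# kinetics of E + S <-> C -> E + P versus the reduced Michaelis-Menten
# (quasi-steady-state) rate law. All rate constants and concentrations
# are invented; integration is plain forward Euler.

def full_model(e0=1.0, s0=100.0, k1=10.0, km1=1.0, k2=1.0,
               dt=1e-5, t_end=2.0):
    """Integrate the full (stiff) mass-action ODEs; return product P(t_end)."""
    e, s, c, p, t = e0, s0, 0.0, 0.0, 0.0
    while t < t_end:
        v_bind = k1 * e * s - km1 * c   # fast binding/unbinding
        v_cat = k2 * c                  # slow catalysis
        e += dt * (-v_bind + v_cat)
        s += dt * (-v_bind)
        c += dt * (v_bind - v_cat)
        p += dt * v_cat
        t += dt
    return p

def mm_model(e0=1.0, s0=100.0, k1=10.0, km1=1.0, k2=1.0,
             dt=1e-3, t_end=2.0):
    """Integrate only the slow variable using the reduced MM rate law."""
    km = (km1 + k2) / k1                # Michaelis constant
    vmax = k2 * e0
    s, p, t = s0, 0.0, 0.0
    while t < t_end:
        v = vmax * s / (km + s)         # quasi-steady-state rate
        s -= dt * v
        p += dt * v
        t += dt
    return p

p_full = full_model()
p_mm = mm_model()    # the reduced model tracks the full one closely
```

    The stochastic (CME) reductions discussed in the record play the analogous role when fluctuations matter, and the total quasi-steady-state variant improves on this classical reduction when enzyme is not in excess.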

  2. Exploiting extracellular polymeric substances (EPS) controlling strategies for performance enhancement of biological wastewater treatments: An overview.

    Science.gov (United States)

    Shi, Yahui; Huang, Jinhui; Zeng, Guangming; Gu, Yanling; Chen, Yaoning; Hu, Yi; Tang, Bi; Zhou, Jianxin; Yang, Ying; Shi, Lixiu

    2017-08-01

    Extracellular polymeric substances (EPS) are present both outside of the cells and in the interior of microbial aggregates, and account for a main component in microbial aggregates. EPS can influence the properties and functions of microbial aggregates in biological wastewater treatment systems; specifically, EPS are involved in biofilm formation and stability, sludge behaviors as well as sequencing batch reactor (SBR) granulation, whereas they are also responsible for membrane fouling in membrane bioreactors (MBRs). EPS exhibit dual roles in biological wastewater treatments, and hence the control of available EPS can be expected to lead to changes in microbial aggregate properties, thereby improving system performance. In this review, up-to-date knowledge with regard to EPS basics including their formation mechanisms, important properties, key component functions as well as sub-fraction differentiation is given. EPS roles in biological wastewater treatments are also briefly summarized. Special emphasis is laid on EPS controlling strategies which would have great potential in promoting microbial aggregate performance and in alleviating membrane fouling, including limitation strategies (inhibition of quorum sensing (QS) systems, regulation of environmental conditions, enzymatic degradation of key components, energy uncoupling etc.) and elevation strategies (enhancement of QS systems, addition of exogenous agents etc.). Those strategies have been confirmed to be feasible and promising to enhance system performance, and they would be a research niche that deserves further study. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. High-performance computing on the Intel Xeon Phi how to fully exploit MIC architectures

    CERN Document Server

    Wang, Endong; Shen, Bo; Zhang, Guangyong; Lu, Xiaowei; Wu, Qing; Wang, Yajuan

    2014-01-01

    The aim of this book is to explain to high-performance computing (HPC) developers how to utilize the Intel® Xeon Phi™ series products efficiently. To that end, it introduces some computing grammar, programming technology and optimization methods for using many-integrated-core (MIC) platforms and also offers tips and tricks for actual use, based on the authors' first-hand optimization experience. The material is organized in three sections. The first section, "Basics of MIC", introduces the fundamentals of MIC architecture and programming, including the specific Intel MIC programming environment

  4. Exploitation, Exploration or Continuous Innovation? Strategy: Focus, Fit and Performance in different business environments

    DEFF Research Database (Denmark)

    Gröessler, Andreas; Laugen, Bjørge Timenes; Lassen, Astrid Heidemann

    The purpose of this paper is to investigate the extent to which continuous innovation is pursued as a strategy for manufacturing firms in different types of competitive environments, and whether continuous innovation firms perform better than focused firms in certain environments. Statistical analyses are used on data collected from an international sample of manufacturing firms through the International Manufacturing Strategy Survey. The main findings are that, while focused as well as continuous innovation firms exist in all three types of business environments identified in this paper...

  5. Exploitation of Ultrahigh-Performance Fibre-Reinforced Concrete for the Strengthening of Concrete Structural Members

    Directory of Open Access Journals (Sweden)

    Mohammed A. Al-Osta

    2018-01-01

    Full Text Available The repair and strengthening of reinforced concrete members are very important due to several factors, including unexpected increases in load levels and/or the damaging impact of aggressive environmental conditions on structural concrete members. Many researchers have turned to new materials for the repair and strengthening of damaged structures or the construction of new concrete structural members. Ultrahigh-performance fibre-reinforced concrete (UHPFRC), characterized by superior structural and durability performance in aggressive environmental conditions, is one of the materials that have been considered for the repair and strengthening of concrete structural members. The repair or strengthening of concrete structures using UHPFRC requires a thorough knowledge of the behaviour of both the strengthening material and the strengthened concrete structure at service load conditions, in addition to an understanding of the design guidelines governing the use of such materials for effective repair and strengthening. In this study, recent issues and findings regarding the use of UHPFRC as a repair or strengthening material for concrete structural members are reviewed, analysed, and discussed. In addition, recommendations are made concerning areas where future attention and research on the use of UHPFRC as a strengthening material need to be focused if the material is to be applied in practice.

  6. Exploiting CMS data popularity to model the evolution of data management for Run-2 and beyond

    Science.gov (United States)

    Bonacorsi, D.; Boccali, T.; Giordano, D.; Girone, M.; Neri, M.; Magini, N.; Kuznetsov, V.; Wildish, T.

    2015-12-01

    During the LHC Run-1 data taking, all experiments collected large data volumes from proton-proton and heavy-ion collisions. The collisions data, together with massive volumes of simulated data, were replicated in multiple copies, transferred among various Tier levels, transformed/slimmed in format/content. These data were then accessed (both locally and remotely) by large groups of distributed analysis communities exploiting the WorldWide LHC Computing Grid infrastructure and services. While efficient data placement strategies - together with optimal data redistribution and deletions on demand - have become the core of static versus dynamic data management projects, little effort has so far been invested in understanding the detailed data-access patterns which surfaced in Run-1. These patterns, if understood, can be used as input to simulation of computing models at the LHC, to optimise existing systems by tuning their behaviour, and to explore next-generation CPU/storage/network co-scheduling solutions. This is of great importance, given that the scale of the computing problem will increase far faster than the resources available to the experiments, for Run-2 and beyond. Studying data-access patterns involves the validation of the quality of the monitoring data collected on the "popularity" of each dataset, the analysis of the frequency and pattern of accesses to different datasets by analysis end-users, the exploration of different views of the popularity data (by physics activity, by region, by data type), the study of the evolution of Run-1 data exploitation over time, the evaluation of the impact of different data placement and distribution choices on the available network and storage resources and their impact on the computing operations. This work presents some insights from studies on the popularity data from the CMS experiment. We present the properties of a range of physics analysis activities as seen by the data popularity, and make recommendations for

  7. Redefining Exploitation

    DEFF Research Database (Denmark)

    Agarwala, Rina

    2016-01-01

    Self-employed workers are organizing as workers. They are fighting labor exploitation by redefining the concept to include additional exploitation axes (from the state and middle class) and forms (including sexual). In doing so, they are redefining potential solutions, including identities and material benefits, to fit their unique needs. By expanding the category of "workers" beyond those defined by a narrow focus on a standard employer-employee relationship, these movements are also fighting exclusion from earlier labor protections by increasing the number of entitled beneficiaries. These struggles provide an important...

  8. Autoregressive higher-order hidden Markov models: exploiting local chromosomal dependencies in the analysis of tumor expression profiles.

    Science.gov (United States)

    Seifert, Michael; Abou-El-Ardat, Khalil; Friedrich, Betty; Klink, Barbara; Deutsch, Andreas

    2014-01-01

    Changes in gene expression programs play a central role in cancer. Chromosomal aberrations such as deletions, duplications and translocations of DNA segments can lead to highly significant positive correlations of gene expression levels of neighboring genes. This should be utilized to improve the analysis of tumor expression profiles. Here, we develop a novel model class of autoregressive higher-order Hidden Markov Models (HMMs) that carefully exploit local data-dependent chromosomal dependencies to improve the identification of differentially expressed genes in tumor. Autoregressive higher-order HMMs overcome generally existing limitations of standard first-order HMMs in the modeling of dependencies between genes in close chromosomal proximity by the simultaneous usage of higher-order state-transitions and autoregressive emissions as novel model features. We apply autoregressive higher-order HMMs to the analysis of breast cancer and glioma gene expression data and perform in-depth model evaluation studies. We find that autoregressive higher-order HMMs clearly improve the identification of overexpressed genes with underlying gene copy number duplications in breast cancer in comparison to mixture models, standard first- and higher-order HMMs, and other related methods. The performance benefit is attributed to the simultaneous usage of higher-order state-transitions in combination with autoregressive emissions. This benefit could not be reached by using each of these two features independently. We also find that autoregressive higher-order HMMs are better able to identify differentially expressed genes in tumors independent of the underlying gene copy number status in comparison to the majority of related methods. This is further supported by the identification of well-known and of previously unreported hotspots of differential expression in glioblastomas demonstrating the efficacy of autoregressive higher-order HMMs for the analysis of individual tumor expression

  9. A Model Performance

    Science.gov (United States)

    Thornton, Bradley D.; Smalley, Robert A.

    2008-01-01

    Building information modeling (BIM) uses three-dimensional modeling concepts, information technology and interoperable software to design, construct and operate a facility. However, BIM can be more than a tool for virtual modeling--it can provide schools with a 3-D walkthrough of a project while it still is on the electronic drawing board. BIM can…

  10. Exploiting magnetic resonance angiography imaging improves model estimation of BOLD signal.

    Directory of Open Access Journals (Sweden)

    Zhenghui Hu

    Full Text Available The change of BOLD signal relies heavily upon the resting blood volume fraction ([Formula: see text]) associated with regional vasculature. However, existing hemodynamic data assimilation studies pretermit such concern. They simply assign the value in a physiologically plausible range to get over ill-conditioning of the assimilation problem and fail to explore the actual [Formula: see text]. Such practice might lead to unreliable model estimation. In this work, we present the first exploration of the influence of [Formula: see text] on fMRI data assimilation, where the actual [Formula: see text] within a given cortical area was calibrated by an MR angiography experiment and then augmented into the assimilation scheme. We have investigated the impact of [Formula: see text] on single-region data assimilation and multi-region data assimilation (dynamic causal modeling, DCM) in a classical flashing checkerboard experiment. Results show that the employment of an assumed [Formula: see text] in fMRI data assimilation is only suitable for fMRI signal reconstruction and activation detection grounded on this signal, and not suitable for estimation of unobserved states and effective connectivity study. We thereby argue that introducing a physically realistic [Formula: see text] in the assimilation process may provide more reliable estimation of physiological information, which contributes to a better understanding of the underlying hemodynamic processes. Such an effort is valuable and should be well appreciated.

  11. Modeling a hierarchical structure of factors influencing exploitation policy for water distribution systems using ISM approach

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, Małgorzata; Wyczółkowski, Ryszard; Gładysiak, Violetta

    2017-12-01

    Water distribution systems are one of the basic elements of the contemporary technical infrastructure of urban and rural areas. They are complex engineering systems composed of transmission networks and auxiliary equipment (e.g. controllers, checkouts etc.), scattered territorially over a large area. From the operational point of view, the basic features of a water distribution system are functional variability, resulting from the need to adjust the system to temporary fluctuations in demand for water, and territorial dispersion. The main research questions are: What external factors should be taken into account when developing an effective water distribution policy? Do the size and nature of the water distribution system significantly affect the exploitation policy implemented? These questions have shaped the objectives of the research and the method of its implementation.

  12. Principles of Sonar Performance Modeling

    NARCIS (Netherlands)

    Ainslie, M.A.

    2010-01-01

    Sonar performance modelling (SPM) is concerned with the prediction of quantitative measures of sonar performance, such as probability of detection. It is a multidisciplinary subject, requiring knowledge and expertise in the disparate fields of underwater acoustics, acoustical oceanography, sonar

  13. Development of Reservoir Characterization Techniques and Production Models for Exploiting Naturally Fractured Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Wiggins, Michael L.; Brown, Raymon L.; Civan, Frauk; Hughes, Richard G.

    2001-08-15

    Research continues on characterizing and modeling the behavior of naturally fractured reservoir systems. Work has progressed on developing techniques for estimating fracture properties from seismic and well log data, developing naturally fractured wellbore models, and developing a model to characterize the transfer of fluid from the matrix to the fracture system for use in the naturally fractured reservoir simulator.

  14. Exploitation of homogeneous isotropic turbulence models for optimization of turbulence remote sensing

    NARCIS (Netherlands)

    Oude Nijhuis, A.C.P.; Krasnov, O.K.; Unal, C.M.H.; Russchenberg, H.W.J.; Yarovoy, A.

    2015-01-01

    Homogeneous isotropic turbulence (HIT) models are compared, with respect to optimization of turbulence remote sensing. HIT models have different applications such as load calculation for wind turbines (Mann, 1998) or droplet track modelling (Pinsky and Khain, 2006). Details of vortices seem of less

  15. Performance and design optimization of a heaving point absorber for the exploitation of wave energy in the Italian Seas

    Science.gov (United States)

    Archetti, Renata; Moreno Miquel, Adrià; Bozzi, Silvia; Antonini, Alessandro; Passoni, Giuseppe

    2017-04-01

    The presentation aims to assess the potential for wave energy production in the Italian seas through the deployment of heaving point absorbers specifically optimized for mild climates. We model a single-body WEC consisting of a cylindrical heaving buoy attached to a linear electric generator placed on the seabed. The model includes both hydrodynamic and electromechanical forces. Two versions of the device are modeled: a two-body device, consisting of a floating buoy attached to a linear generator at the seabed, and a three-body device, which also includes a submerged sphere located halfway between the float and the generator and which increases performance by reaching resonance more easily. For each version of the device, the model takes into account either heave only or heave and surge combined. The devices have been tuned to the Mediterranean Sea wave climate, with particular attention to the dimensioning of the floaters and to the geometrical design of the PTO, which has been adapted to the particular working conditions introduced by the surge mode. The annual energy production is estimated, showing encouraging results and enlarging the perspective on wave energy production in the Italian and Mediterranean Seas. The last part of the work presents the feasibility of supplying a small Italian island with electricity produced by an array of the described devices.
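The tuning to mild climates hinges on matching the buoy's natural heave period to the local wave periods: for a cylindrical buoy the hydrostatic stiffness of the waterplane is k = ρ g A_w, giving T = 2π√((m + m_a)/k). A minimal sketch with hypothetical dimensions (not those of the device in the abstract):

```python
import math

# Natural heave period of a cylindrical point-absorber buoy.
# Stiffness of the waterplane: k = rho * g * A_w; resonance period:
# T = 2*pi*sqrt((m + m_a) / k). All dimensions below are hypothetical.

RHO, G = 1025.0, 9.81            # sea water density [kg/m^3], gravity [m/s^2]

def heave_period(radius, mass, added_mass):
    area = math.pi * radius ** 2          # waterplane area [m^2]
    k = RHO * G * area                    # hydrostatic stiffness [N/m]
    return 2.0 * math.pi * math.sqrt((mass + added_mass) / k)

# A 2.5 m radius buoy of 60 t with 30 t heave added mass:
T = heave_period(2.5, 60e3, 30e3)
print(f"natural heave period: {T:.2f} s")
```

The resulting period of roughly 4 s sits in the range of typical Mediterranean wave periods, which is why small, relatively heavy buoys suit mild climates.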

  16. On the advantages of exploiting the hierarchical structure of astrodynamical models

    Science.gov (United States)

    Dei Tos, Diogene Alessandro; Topputo, Francesco

    2017-07-01

    In this paper an algorithm is developed that combines the capabilities and advantages of several astrodynamical models of increasing complexity. Splitting these models in a strict hierarchical order yields a clearer grasp of what is available. Once the effort of developing a comprehensive model is made up front, the equations for the spacecraft motion in simpler models can be readily obtained as particular cases. The proposed algorithm embeds the circular and elliptic restricted three-body problems, the four-body bicircular and concentric models, an averaged n-body model, and, at the top of the hierarchical ladder, the full ephemeris SPICE-based restricted n-body problem. The equations of motion are reduced to the assignment of 13 time-varying coefficients, which multiply the states and the gravitational potential to reproduce the proper vector field. This approach yields an efficient and quick way to check solutions for different dynamics and parameters. We show that in bottom-up applications, a gradual increase of model complexity benefits accuracy, the chances of success, and the convergence rate of a continuation algorithm. Case studies are simple periodic orbits and low-energy transfers.
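The simplest rung of the hierarchy, the circular restricted three-body problem (CR3BP), can be sketched in a few lines: in nondimensional rotating-frame coordinates the primaries sit at (−μ, 0) and (1−μ, 0), and the well-known equilibrium at the triangular point L4 gives a built-in sanity check.

```python
import math

# Rotating-frame equations of motion of the circular restricted three-body
# problem (CR3BP), the bottom of the model hierarchy described above.
# Nondimensional units: primaries at (-mu, 0) and (1 - mu, 0), unit mean motion.

def cr3bp_accel(x, y, vx, vy, mu):
    r1 = math.hypot(x + mu, y)            # distance to the larger primary
    r2 = math.hypot(x - 1.0 + mu, y)      # distance to the smaller primary
    ax = 2.0 * vy + x - (1.0 - mu) * (x + mu) / r1**3 - mu * (x - 1.0 + mu) / r2**3
    ay = -2.0 * vx + y - (1.0 - mu) * y / r1**3 - mu * y / r2**3
    return ax, ay

# Sanity check: the triangular point L4 = (1/2 - mu, sqrt(3)/2) is an
# equilibrium, so a particle at rest there feels zero acceleration.
mu = 0.0121505856                          # Earth-Moon mass parameter
ax, ay = cr3bp_accel(0.5 - mu, math.sqrt(3.0) / 2.0, 0.0, 0.0, mu)
print(ax, ay)                              # both ~0 (to rounding)
```

In the paper's formulation, richer models are obtained by letting the constant coefficients of this vector field become the 13 time-varying coefficients mentioned above.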

  17. Exploiting the Expressiveness of Cyclo-Static Dataflow to Model Multimedia Implementations

    Directory of Open Access Journals (Sweden)

    Henk Corporaal

    2007-01-01

    Full Text Available The design of increasingly complex and concurrent multimedia systems requires a description at a higher abstraction level. Using an appropriate model of computation helps to reason about the system and enables design time analysis methods. The nature of multimedia processing matches in many cases well with cyclo-static dataflow (CSDF, making it a suitable model. However, channels in an implementation often use for cost reasons a kind of shared buffer that cannot be directly described in CSDF. This paper shows how such implementation specific aspects can be expressed in CSDF without the need for extensions. Consequently, the CSDF graph remains completely analyzable and allows reasoning about its temporal behavior. The obtained relation between model and implementation enables a buffer capacity analysis on the model while assuring the throughput of the final implementation. The capabilities of the approach are demonstrated by analyzing the temporal behavior of an MPEG-4 video encoder with a CSDF graph.

  18. Next-Generation Image and Sound Processing Strategies: Exploiting the Biological Model

    Science.gov (United States)

    2007-05-01

    improved models of the nonlinear center-surround antagonism in the early stages of visual processing. We will then incorporate these improved models...detect nonspecific conspicuous targets in cluttered auditory scenes before fully processing and recognizing the targets. We developed a novel biologically...during speech perception, a particular phoneme or syllable can be perceived to be more salient than the others due to the coarticulation between

  19. Statistical modeling of program performance

    Directory of Open Access Journals (Sweden)

    A. P. Karpenko

    2014-01-01

    Full Text Available The task of evaluating program performance often arises in the design of computer systems or during iterative compilation. A traditional way to solve this problem is emulation of program execution on the target system. A modern alternative approach evaluates program performance by statistical modeling on the computer under investigation. This statistical method of modeling program performance, called Velocitas, is introduced in this work, together with its implementation in the Adaptor framework. An investigation of the method's effectiveness showed that it predicts program performance with high adequacy.
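Statistical performance modeling of this kind fits a model mapping cheap-to-measure program features to runtime, then predicts unseen programs without emulation. A minimal least-squares sketch on synthetic data (this illustrates the idea only, not the actual Velocitas model):

```python
# Fit runtime ~ a * instruction_count + b from profiled runs, then predict
# the runtime of an unseen program. Data are synthetic and illustrative.

def fit_line(xs, ys):
    """Ordinary least squares for a single-feature linear model."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx                 # slope, intercept

# Profiled training programs: (instructions in millions, runtime in ms)
instr = [10.0, 25.0, 40.0, 80.0]
time_ms = [12.0, 27.0, 42.0, 82.0]        # ~1 ms per M instructions + 2 ms

a, b = fit_line(instr, time_ms)
predicted = a * 60.0 + b                  # predict a 60M-instruction program
print(round(a, 3), round(b, 3), round(predicted, 1))
```

Real performance models use many more features (cache misses, branch rates, etc.) and richer regressors, but the fit/predict structure is the same.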

  20. Exploiting the functional and taxonomic structure of genomic data by probabilistic topic modeling.

    Science.gov (United States)

    Chen, Xin; Hu, Xiaohua; Lim, Tze Y; Shen, Xiajiong; Park, E K; Rosen, Gail L

    2012-01-01

    In this paper, we present a method that enables both the homology-based approach and the composition-based approach to further study the functional core (i.e., the microbial core and the gene core, correspondingly). In the proposed method, the identification of major functionality groups is achieved by generative topic modeling, which is able to extract useful information from unlabeled data. We first show that a generative topic model can be used to model the taxon abundance information obtained by the homology-based approach and to study the microbial core. The model considers each sample as a "document," which has a mixture of functional groups, while each functional group (also known as a "latent topic") is a weighted mixture of species. Therefore, estimating the generative topic model for taxon abundance data uncovers the distribution over latent functions (latent topics) in each sample. Second, we show that a generative topic model can also be used to study the genome-level composition of "N-mer" features (DNA subreads obtained by composition-based approaches). The model considers each genome as a mixture of latent genetic patterns (latent topics), while each pattern is a weighted mixture of the "N-mer" features; thus the existence of core genomes can be indicated by a set of common N-mer features. After studying the mutual information between latent topics and gene regions, we provide an explanation of the functional roles of the uncovered latent genetic patterns. The experimental results demonstrate the effectiveness of the proposed method.
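The generative structure described above is a two-layer mixture: each sample has a distribution over latent functional groups (θ), and each group is a weighted mixture of species (φ), so the expected taxon abundance of a sample is the product θ × φ. A tiny sketch with hypothetical numbers (not the paper's inference procedure, just the generative layer it estimates):

```python
# Each sample is a mixture over latent functional groups (theta) and each
# group is a weighted mixture of species (phi); the expected abundance of
# each species in a sample is the matrix product theta x phi.
# The numbers below are hypothetical.

def expected_abundance(theta, phi):
    """theta: samples x topics, phi: topics x species -> samples x species."""
    return [[sum(t * p for t, p in zip(row, col))
             for col in zip(*phi)] for row in theta]

theta = [[0.7, 0.3],              # sample 1: mostly functional group A
         [0.2, 0.8]]              # sample 2: mostly functional group B
phi = [[0.5, 0.4, 0.1],           # group A: weights over 3 species
       [0.1, 0.2, 0.7]]           # group B

abund = expected_abundance(theta, phi)
print([[round(v, 2) for v in row] for row in abund])
```

Topic-model estimation (e.g. LDA) runs this generative process in reverse: given observed abundances, it recovers θ and φ.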

  1. Exploiting proteomic data for genome annotation and gene model validation in Aspergillus niger

    Directory of Open Access Journals (Sweden)

    Grigoriev Igor V

    2009-02-01

    Full Text Available Abstract Background Proteomic data is a potentially rich, but arguably unexploited, data source for genome annotation. Peptide identifications from tandem mass spectrometry provide prima facie evidence for gene predictions and can discriminate over a set of candidate gene models. Here we apply this to the recently sequenced Aspergillus niger fungal genome from the Joint Genome Institute (JGI) and another predicted protein set from another A. niger sequence. Tandem mass spectra (MS/MS) were acquired from 1D gel electrophoresis bands and searched against all available gene models using Average Peptide Scoring (APS) and reverse database searching to produce confident identifications at an acceptable false discovery rate (FDR). Results 405 identified peptide sequences were mapped to 214 different A. niger genomic loci, to which 4093 predicted gene models clustered, 2872 of which contained the mapped peptides. Interestingly, 13 (6%) of these loci either had no preferred predicted gene model or the genome annotators' chosen "best" model for that genomic locus was not found to be the most parsimonious match to the identified peptides. The peptides identified also boosted confidence in predicted gene structures spanning 54 introns from different gene models. Conclusion This work highlights the potential of integrating experimental proteomics data into genomic annotation pipelines, much as expressed sequence tag (EST) data has been. A comparison with the published genome from another strain of A. niger, sequenced by DSM, showed that a number of the gene models or proteins with proteomics evidence did not occur in both genomes, further highlighting the utility of the method.

  2. Exploiting the flexibility of a family of models for taxation and redistribution

    Science.gov (United States)

    Bertotti, M. L.; Modanese, G.

    2012-08-01

    We discuss a family of models expressed by nonlinear differential equation systems describing closed market societies in the presence of taxation and redistribution. We focus in particular on three example models obtained with different parameter choices. We analyse the influence of these choices on the long-time shape of the income distribution. Several simulations suggest that behavioral heterogeneity among the individuals plays a definite role in the formation of fat tails of the asymptotic stationary distributions. This is in agreement with results found with different approaches and techniques. We also show that an excellent fit for the computational outputs of our models is provided by the κ-generalized distribution introduced by Kaniadakis in [Physica A 296, 405 (2001)].
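The κ-generalized distribution mentioned above is built on the Kaniadakis κ-exponential, exp_κ(x) = (√(1 + κ²x²) + κx)^(1/κ), which reduces to the ordinary exponential as κ → 0; the income survival function is commonly written 1 − F(x) = exp_κ(−βx^α). A brief sketch with illustrative parameter values:

```python
import math

# Kaniadakis kappa-exponential and the survival function of the
# kappa-generalized income distribution, 1 - F(x) = exp_k(-beta * x**alpha).
# Parameter values below are illustrative only.

def kappa_exp(x, k):
    if k == 0.0:
        return math.exp(x)
    return (math.sqrt(1.0 + k * k * x * x) + k * x) ** (1.0 / k)

def survival(x, alpha, beta, k):
    """P(income > x) under the kappa-generalized model."""
    return kappa_exp(-beta * x ** alpha, k)

# Small k recovers the exponential (Weibull-type) tail:
print(abs(kappa_exp(-1.0, 1e-6) - math.exp(-1.0)) < 1e-9)
# For k > 0 the tail is fatter (power-law-like) than the k = 0 tail:
print(survival(5.0, 1.5, 0.5, 0.75) > survival(5.0, 1.5, 0.5, 0.0))
```

The fat power-law tail for κ > 0 is exactly the feature that lets this distribution fit the simulated income distributions with behavioral heterogeneity.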

  3. Modelling Crop Pattern Changes and Water Resources Exploitation: A Case Study

    Directory of Open Access Journals (Sweden)

    Donato Zingaro

    2017-09-01

    Full Text Available Agriculture and farming worldwide are responsible for numerous environmental threats, including land degradation and water resource depletion. Understanding the dynamic interaction between bio-physical and socio-economic drivers is the key towards more sustainable land and water management. With regard to a highly developed agricultural area in Southern Italy, multi-regression models were developed to provide an ex-post interpretation of the observed inter-annual variability of cropped land. The main drivers related to Common Agricultural Policy support, product market prices, crop yield, and irrigation water availability were investigated. The adopted models revealed the different weights of the drivers. The findings highlight the role that direct payments played in supporting the extension of irrigated crops, such as processing tomato. Likewise, the models point to the decoupled payment scheme as the most important driver of change in the crop pattern in recent years.

  4. Mammalian models of chemically induced primary malignancies exploitable for imaging-based preclinical theragnostic research.

    Science.gov (United States)

    Liu, Yewei; Yin, Ting; Feng, Yuanbo; Cona, Marlein Miranda; Huang, Gang; Liu, Jianjun; Song, Shaoli; Jiang, Yansheng; Xia, Qian; Swinnen, Johannes V; Bormans, Guy; Himmelreich, Uwe; Oyen, Raymond; Ni, Yicheng

    2015-10-01

    Compared with transplanted tumor models or genetically engineered cancer models, chemically induced primary malignancies in experimental animals can mimic clinical cancer progression from the early stage on. Cancer caused by chemical carcinogens generally develops through three phases, namely initiation, promotion and progression. Based on different mechanisms, chemical carcinogens can be divided into genotoxic and non-genotoxic ones, or complete and incomplete ones, usually with an organ-specific property. Chemical carcinogens can be classified by their origins, such as environmental pollutants, cooked-meat-derived carcinogens, N-nitroso compounds, food additives, antineoplastic agents, naturally occurring substances and synthetic carcinogens. Carcinogen-induced models of primary cancers can be used to evaluate the diagnostic/therapeutic effects of candidate drugs, investigate biological influencing factors, explore preventive measures against carcinogenicity, and better understand the molecular mechanisms involved in tumor initiation, promotion and progression. Among commonly adopted cancer models, chemically induced primary malignancies in mammals have several advantages, including easy procedures, fruitful tumor generation and high analogy to clinical human primary cancers. However, in addition to the time-consuming process, the major drawback of chemical carcinogenesis for translational research is the difficulty of noninvasive tumor burden assessment in small animals. As with human cancers, tumors in animals occur unpredictably in terms of timing, location and the number of lesions. Thanks to the availability of magnetic resonance imaging (MRI), with various advantages such as radiation-free scanning, superb soft tissue contrast, multi-parametric information, and the utility of diverse contrast agents, a workable solution to this bottleneck problem is now to apply MRI for noninvasive detection, diagnosis and therapeutic monitoring on those otherwise

  5. CSDFa: a model for exploiting the trade-off between data and pipeline parallelism

    NARCIS (Netherlands)

    Koek, Peter; Geuns, S.J.; Hausmans, J.P.H.M.; Corporaal, Henk; Bekooij, Marco Jan Gerrit

    2016-01-01

    Real-time stream processing applications, such as SDR applications, are often executed concurrently on multiprocessor systems. A unified data flow model and analysis method have been proposed that can be used to simultaneously determine the amount of pipeline and coarse-grained data parallelism

  6. Bacteria, Yeast, Worms, and Flies: Exploiting Simple Model Organisms to Investigate Human Mitochondrial Diseases

    Science.gov (United States)

    Rea, Shane L.; Graham, Brett H.; Nakamaru-Ogiso, Eiko; Kar, Adwitiya; Falk, Marni J.

    2010-01-01

    The extensive conservation of mitochondrial structure, composition, and function across evolution offers a unique opportunity to expand our understanding of human mitochondrial biology and disease. By investigating the biology of much simpler model organisms, it is often possible to answer questions that are unreachable at the clinical level.…

  7. Subsidence Modeling of the Over-exploited Granular Aquifer System in Aguascalientes, Mexico

    Science.gov (United States)

    Solano Rojas, D. E.; Wdowinski, S.; Minderhoud, P. P. S.; Pacheco, J.; Cabral, E.

    2016-12-01

    The valley of Aguascalientes in central Mexico experiences subsidence rates of up to 100 [mm/yr] due to overexploitation of its aquifer system, as revealed by satellite-based geodetic observations. The spatial pattern of the subsidence over the valley is inhomogeneous and affected by shallow faulting, and the understanding of the subsoil mechanics is still limited. A better understanding of the subsidence process in Aguascalientes is needed to provide insights into future subsidence in the valley. We present here a displacement-constrained finite-element subsidence model using Deltares iMOD (interactive MODeling), based on the USGS MODFLOW software. The construction of our model relies on three main inputs: (1) groundwater level time series obtained from extraction wells' hydrographs, (2) subsurface lithostratigraphy interpreted from well drilling logs, and (3) hydrogeological parameters obtained from field pumping tests. The groundwater level measurements were converted to pore pressure in our model's layers and used in Terzaghi's equation for calculating effective stress. We then used the effective stress along with the displacement obtained from geodetic observations to constrain and optimize five geomechanical parameters: compression ratio, reloading ratio, secondary compression index, overconsolidation ratio, and consolidation coefficient. Finally, we use the NEN-Bjerrum linear stress model formulation for settlements to determine elastic and visco-plastic strain, accounting for the aging effect of the aquifer system units. Preliminary results show a higher compaction response in clay-saturated intervals (i.e. aquitards) of the aquifer system, as reflected in the spatial pattern of the surface deformation. The forecasted subsidence for our proposed scenarios shows much more pronounced deformation under higher groundwater extraction regimes.
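Terzaghi's principle invoked above is simply σ' = σ − u: a drop in groundwater head lowers the pore pressure u, raises the effective stress σ' carried by the soil skeleton, and drives compaction of the clay intervals. A minimal sketch with hypothetical values (not the model's actual inputs):

```python
# Terzaghi's effective stress: sigma' = sigma - u, with hydrostatic pore
# pressure u = rho_w * g * head. All numbers below are hypothetical.

G = 9.81            # gravity [m/s^2]
RHO_W = 1000.0      # water density [kg/m^3]

def pore_pressure(head_m):
    """Hydrostatic pore pressure [Pa] for a given pressure head [m]."""
    return RHO_W * G * head_m

def effective_stress(total_stress_pa, head_m):
    return total_stress_pa - pore_pressure(head_m)

total = 1.2e6                                 # total vertical stress [Pa]
before = effective_stress(total, 40.0)        # head 40 m above the point
after = effective_stress(total, 25.0)         # head after 15 m of drawdown
print(round(before), round(after), round(after - before))
```

The ~0.15 MPa increase in effective stress from 15 m of drawdown is what the model's compression and consolidation parameters translate into surface subsidence.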

  8. Rapid immunoassay exploiting nanoparticles and micromagnets: proof-of-concept using ovalbumin model.

    Science.gov (United States)

    Delshadi, Sarah; Blaire, Guillaume; Kauffmann, Paul; Fratzl, Mario; Devillers, Thibaut; Delabouglise, Didier; Weidenhaupt, Marianne; Dempsey, Nora M; Cugat, Orphée; Bruckert, Franz; Marche, Patrice N

    2017-03-01

    We present a fast magnetic immunoassay combining magnetic nanoparticles and micromagnets. High magnetic field gradients from micromagnets are used to develop a new approach to the standard ELISA. Materials & methods/results: A proof-of-concept based on colorimetric quantification of anti-ovalbumin antibody in buffer is performed and compared with an ELISA. After optimization, the magnetic immunoassay exhibits a limit of detection (40 ng/ml) and a dynamic range (40-2500 ng/ml) similar to those of ELISAs developed using the same biochemical tools. Micromagnets can be fully integrated into multiwell plates at low cost to allow the efficient capture of immunocomplexes carried by magnetic nanoparticles. The method is generic and makes it possible to perform a magnetic ELISA in 30 min.

  9. Development and exploitation of a controlled vocabulary in support of climate modelling

    Science.gov (United States)

    Moine, M.-P.; Valcke, S.; Lawrence, B. N.; Pascoe, C.; Ford, R. W.; Alias, A.; Balaji, V.; Bentley, P.; Devine, G.; Callaghan, S. A.; Guilyardi, E.

    2014-03-01

    There are three key components for developing a metadata system: a container structure laying out the key semantic issues of interest and their relationships; an extensible controlled vocabulary providing possible content; and tools to create and manipulate that content. While metadata systems must allow users to enter their own information, the use of a controlled vocabulary both imposes consistency of definition and ensures comparability of the objects described. Here we describe the controlled vocabulary (CV) and metadata creation tool built by the METAFOR project for use in describing the climate models, simulations and experiments of the fifth Coupled Model Intercomparison Project (CMIP5). The CV and resulting tool chain introduced here are designed for extensibility and reuse and should find applicability in many more projects.

  10. Firm Sustainability Performance Index Modeling

    Directory of Open Access Journals (Sweden)

    Che Wan Jasimah Bt Wan Mohamed Radzi

    2015-12-01

    Full Text Available The main objective of this paper is to present a model for a firm sustainability performance index by applying both classical and Bayesian structural equation modeling (parametric and semi-parametric modeling). Both techniques are applied to research data collected through a survey of the food manufacturing industry in China, Taiwan, and Malaysia. To estimate the firm sustainability performance index we consider three main indicators: knowledge management, organizational learning, and business strategy. Based on both the Bayesian and the classical methodology, we confirmed that knowledge management and business strategy have a significant impact on the firm sustainability performance index.

  11. Efficient Execution of Networked MPSoC Models by Exploiting Multiple Platform Levels

    Directory of Open Access Journals (Sweden)

    Christoph Roth

    2012-01-01

    Full Text Available Novel embedded applications are characterized by increasing requirements on processing performance as well as the demand for communication between several or many devices. Networked Multiprocessor System-on-Chips (MPSoCs) are a possible solution to cope with this increasing complexity. Such systems require a detailed exploration of both architectures and system design. An approach that allows investigating interdependencies between the system and network domains is the cooperative execution of system design tools with a network simulator. In previous work, synchronization mechanisms were developed for parallel system simulation and system/network co-simulation using the high level architecture (HLA). In this contribution, a methodology is presented that extends the previous work with further building blocks towards a construction kit for system/network co-simulation. The methodology facilitates flexible assembly of components and adaptation to the specific needs of use cases in terms of performance and accuracy. The underlying concepts and the extensions made are discussed in detail. The benefits are substantiated by means of various benchmarks.

  12. Exploiting anti-obesity mechanism of Clerodendrum phlomidis against two different models of rodents

    OpenAIRE

    Chidrawar, Vijay R.; Krishnakant N Patel; Shiromwar, Shruti S.; Kshirsagar, Ajay D.

    2011-01-01

    Roots of Clerodendrum phlomidis are used by the local people of the Dibrugarh district of Assam state, India, as a dietary supplement for treating weight issues and are also mentioned in the traditional system of Indian medicine as a remedy for obesity. We examined the anti-obesity effect of Clerodendrum phlomidis (family Verbenaceae) L. roots against cafeteria diet (CD) and progesterone-induced obesity. In the CD-induced model, obesity was induced by feeding CD for 48 days and increase in body weight a...

  13. Development of Reservoir Characterization Techniques and Production Models for Exploiting Naturally Fractured Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Wiggins, Michael L.; Brown, Raymon L.; Civan, Faruk; Hughes, Richard G.

    2003-02-11

    This research was directed toward developing a systematic reservoir characterization methodology which can be used by the petroleum industry to implement infill drilling programs and/or enhanced oil recovery projects in naturally fractured reservoir systems in an environmentally safe and cost effective manner. It was anticipated that the results of this research program will provide geoscientists and engineers with a systematic procedure for properly characterizing a fractured reservoir system and a reservoir/horizontal wellbore simulator model which can be used to select well locations and an effective EOR process to optimize the recovery of the oil and gas reserves from such complex reservoir systems.

  14. Exploitation of Digital Surface Models Generated from WORLDVIEW-2 Data for SAR Simulation Techniques

    Science.gov (United States)

    Ilehag, R.; Auer, S.; d'Angelo, P.

    2017-05-01

    GeoRaySAR, an automated SAR simulator developed at DLR, identifies buildings in high resolution SAR data by utilizing geometric knowledge extracted from digital surface models (DSMs). Hitherto, the simulator has utilized DSMs generated from airborne LiDAR data with pre-filtered vegetation. Discarding the need for pre-optimized model input, DSMs generated from high resolution optical data (acquired with WorldView-2) are used here for the extraction of building-related SAR image parts. An automatic preprocessing of the DSMs has been developed for separating buildings from elevated vegetation (trees, bushes) and reducing the noise level. Based on that, automated simulations are triggered considering the properties of real SAR images. Locations in three cities, Munich, London and Istanbul, were chosen as study areas to determine the advantages and limitations of WorldView-2 DSMs as input for GeoRaySAR. Beyond that, the impact of DSM quality on building extraction is evaluated, including the evaluation of a building DSM, a DSM containing only buildings. The results indicate that building extents can be detected with DSMs from optical satellite data with varying success, dependent on the quality of the DSM as well as on the SAR imaging perspective.

  15. On Better Exploring and Exploiting Task Relationships in Multitask Learning: Joint Model and Feature Learning.

    Science.gov (United States)

    Li, Ya; Tian, Xinmei; Liu, Tongliang; Tao, Dacheng

    2017-04-17

    Multitask learning (MTL) aims to learn multiple tasks simultaneously through the interdependence between different tasks. How to measure the relatedness between tasks is a recurring issue. There are mainly two ways to measure relatedness between tasks: common parameter sharing and common feature sharing across different tasks. However, these two types of relatedness are mainly learned independently, leading to a loss of information. In this paper, we propose a new strategy to measure the relatedness that jointly learns shared parameters and shared feature representations. The objective of our proposed method is to transform the features of different tasks into a common feature space in which the tasks are closely related and the shared parameters can be better optimized. We give a detailed introduction to our proposed MTL method. Additionally, an alternating algorithm is introduced to optimize the nonconvex objective. A theoretical bound is given to demonstrate that the relatedness between tasks can be better measured by our proposed MTL algorithm. We conduct various experiments to verify the superiority of the proposed joint model and feature MTL method.

  16. From Desktop to Teraflop: Exploiting the U.S. Lead in High Performance Computing. NSF Blue Ribbon Panel on High Performance Computing.

    Science.gov (United States)

    National Science Foundation, Washington, DC.

    This report addresses an opportunity to accelerate progress in virtually every branch of science and engineering concurrently, while also boosting the American economy as business firms learn to exploit these new capabilities. The successful rapid advancement in both science and technology creates its own challenges, four of which are…

  17. Composite performance and dependability modelling

    NARCIS (Netherlands)

    Trivedi, Kishor S.; Muppala, Jogesh K.; Woolet, Steven P.; Haverkort, Boudewijn R.H.M.

    1992-01-01

    Composite performance and dependability analysis is gaining importance in the design of complex, fault-tolerant systems. Markov reward models are most commonly used for this purpose. In this paper, an introduction to Markov reward models including solution techniques and application examples is

  18. Acetylcholine-Based Entropy in Response Selection: A Model of How Striatal Interneurons Modulate Exploration, Exploitation, and Response Variability in Decision Making

    Directory of Open Access Journals (Sweden)

    Andrea Stocco

    2012-02-01

    Full Text Available The basal ganglia play a fundamental role in decision making. Their contribution is typically modeled within a reinforcement learning framework, with the basal ganglia learning to select the options associated with the highest value and their dopamine inputs conveying performance feedback. This basic framework, however, does not account for the role of cholinergic interneurons in the striatum, and does not easily explain certain dynamic aspects of decision making and skill acquisition, like the generation of exploratory actions. This paper describes BABE (Basal ganglia Acetylcholine-Based Entropy), a model of the acetylcholine system in the striatum that provides a unified explanation for these phenomena. According to this model, cholinergic interneurons in the striatum control the level of variability in behavior by modulating the number of possible responses that are considered by the basal ganglia, as well as the level of competition between them. This mechanism provides a natural way to account for the role of the basal ganglia in generating behavioral variability during the acquisition of certain cognitive skills, as well as for modulating exploration and exploitation in decision making. Compared to a typical reinforcement learning model, BABE showed a greater modulation of response variability in the face of changes in the reward contingencies, allowing for faster learning (and re-learning) of option values. Finally, the paper discusses possible applications of the model to other domains.
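The exploration/exploitation trade-off BABE addresses can be illustrated with a softmax action-selection rule: a temperature parameter (here standing in for cholinergic modulation of response variability) controls the entropy of the response distribution. This is an illustration of the underlying idea, not the actual BABE implementation.

```python
import math

# Softmax action selection: low temperature concentrates choice on the
# highest-valued option (exploitation, low entropy); high temperature
# spreads choice across options (exploration, high entropy).

def softmax(values, temperature):
    exps = [math.exp(v / temperature) for v in values]
    s = sum(exps)
    return [e / s for e in exps]

def entropy(p):
    return -sum(x * math.log(x) for x in p if x > 0.0)

option_values = [1.0, 0.5, 0.2]
greedy = softmax(option_values, 0.1)      # exploit: nearly deterministic
diffuse = softmax(option_values, 5.0)     # explore: close to uniform

print(round(entropy(greedy), 3), round(entropy(diffuse), 3))
```

In BABE's terms, raising striatal acetylcholine plays a role analogous to raising this temperature: more candidate responses compete, and behavior becomes more variable.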

  19. AAV exploits subcellular stress associated with inflammation, endoplasmic reticulum expansion, and misfolded proteins in models of cystic fibrosis.

    Directory of Open Access Journals (Sweden)

    Jarrod S Johnson

    2011-05-01

    Full Text Available Barriers to infection act at multiple levels to prevent viruses, bacteria, and parasites from commandeering host cells for their own purposes. An intriguing hypothesis is that if a cell experiences stress, such as that elicited by inflammation, endoplasmic reticulum (ER) expansion, or misfolded proteins, then subcellular barriers will be less effective at preventing viral infection. Here we have used models of cystic fibrosis (CF) to test whether subcellular stress increases susceptibility to adeno-associated virus (AAV) infection. In human airway epithelium cultured at an air/liquid interface, physiological conditions of subcellular stress and ER expansion were mimicked using supernatant from mucopurulent material derived from CF lungs. Using this inflammatory stimulus to recapitulate stress found in diseased airways, we demonstrated that AAV infection was significantly enhanced. Since over 90% of CF cases are associated with a misfolded variant of the Cystic Fibrosis Transmembrane Conductance Regulator (ΔF508-CFTR), we then explored whether the presence of misfolded proteins could independently increase susceptibility to AAV infection. In these models, AAV was an order of magnitude more efficient at transducing cells expressing ΔF508-CFTR than cells expressing wild-type CFTR. Rescue of misfolded ΔF508-CFTR under low temperature conditions restored viral transduction efficiency to that demonstrated in controls, suggesting that effects related to protein misfolding were responsible for the increased susceptibility to infection. By testing other CFTR mutants, G551D, D572N, and 1410X, we have shown that this phenomenon is common to other misfolded proteins and not related to loss of CFTR activity. The presence of misfolded proteins did not affect cell surface attachment of virus or influence expression levels from promoter transgene cassettes in plasmid transfection studies, indicating that exploitation occurs at the level of virion trafficking or processing. Thus

  20. Performance Modeling of Enterprise Grids

    Science.gov (United States)

    Hoffman, Doug L.; Apon, Amy; Dowdy, Larry; Lu, Baochuan; Hamm, Nathan; Ngo, Linh; Bui, Hung

    Modeling has long been recognized as an invaluable tool for predicting the performance behavior of computer systems. Modeling software, both commercial and open source, is widely used as a guide for the development of new systems and the upgrading of existing ones. Tools such as queuing network models, stochastic Petri nets, and event-driven simulation are in common use for stand-alone computer systems and networks. Unfortunately, no set of comprehensive tools exists for modeling complex distributed computing environments such as the ones found in emerging grid deployments. With the rapid advance of grid computing, the need for improved modeling tools specific to the grid environment has become evident. This chapter addresses concepts, methodologies, and tools that are useful when designing, implementing, and tuning performance in grid and cluster environments.
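For the simplest of the tools listed above, the single-server M/M/1 queue, the performance measures have closed forms. A minimal sketch (the arrival and service rates below are invented for illustration):

```python
def mm1_metrics(arrival_rate, service_rate):
    """Closed-form performance measures for a single-server M/M/1 queue."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = arrival_rate / service_rate        # server utilization
    n = rho / (1.0 - rho)                    # mean number of jobs in the system
    t = 1.0 / (service_rate - arrival_rate)  # mean response time
    return rho, n, t

# Example: 8 jobs/s offered to a server that completes 10 jobs/s
rho, n, t = mm1_metrics(8.0, 10.0)
```

At 80% utilization the mean response time is already five times the bare service time; exposing this kind of non-linearity is exactly what such models are for.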

  1. Surrogate Model Application to the Identification of Optimal Groundwater Exploitation Scheme Based on Regression Kriging Method-A Case Study of Western Jilin Province.

    Science.gov (United States)

    An, Yongkai; Lu, Wenxi; Cheng, Weiguo

    2015-07-30

    This paper introduces a surrogate model to identify an optimal exploitation scheme, with western Jilin Province selected as the study area. A numerical simulation model of groundwater flow was established first, and four exploitation wells were set in Tongyu County and Qian Gorlos County to supply water to Daan County. Second, the Latin Hypercube Sampling (LHS) method was used to collect data in the feasible region of the input variables. A surrogate of the numerical groundwater flow model was then developed using the regression kriging method. An optimization model was established to search for an optimal groundwater exploitation scheme, using the minimum average drawdown of the groundwater table and the minimum cost of groundwater exploitation as multi-objective functions. Finally, the surrogate model was invoked by the optimization model in the process of solving the optimization problem. Results show that the relative error and root mean square error of the groundwater table drawdown between the simulation model and the surrogate model for 10 validation samples are both lower than 5%, indicating high approximation accuracy. For the same optimization problem, the surrogate-based simulation optimization model needed only 5.5 hours, whereas the conventional simulation optimization model needed 25 days. These results indicate that the surrogate model developed in this study can considerably reduce the computational burden of the simulation optimization process while maintaining high computational accuracy, and can thus provide an effective method for identifying an optimal groundwater exploitation scheme quickly and accurately.
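As an illustration of the surrogate idea, the following sketch fits a simple kriging interpolator to sampled simulator outputs. It uses an exponential covariance and a synthetic test function in place of the groundwater model; the function names, kernel choice, and data are assumptions, not the paper's exact regression kriging formulation:

```python
import numpy as np

def kriging_fit(X, y, length_scale=0.5, nugget=1e-10):
    """Fit a simple kriging surrogate with an exponential covariance.

    X: (n, d) design points (e.g. an LHS sample of pumping rates),
    y: (n,) expensive simulator outputs (e.g. average drawdown).
    """
    d = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    K = np.exp(-d / length_scale) + nugget * np.eye(len(X))
    mu = y.mean()
    w = np.linalg.solve(K, y - mu)       # kriging weights
    return X, w, mu, length_scale

def kriging_predict(model, Xnew):
    """Cheap surrogate prediction at new points Xnew of shape (m, d)."""
    X, w, mu, ls = model
    d = np.sqrt(((Xnew[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    return mu + np.exp(-d / ls) @ w

rng = np.random.default_rng(0)
X = rng.random((30, 2))                  # stand-in for LHS design points
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2   # stand-in for simulator output
model = kriging_fit(X, y)
```

Because kriging interpolates, the surrogate reproduces the training samples (up to the nugget) and can then be called thousands of times inside an optimizer at negligible cost, which is where the 25-days-to-5.5-hours saving comes from.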

  2. An ecosystem model of an exploited southern Mediterranean shelf region (Gulf of Gabes, Tunisia) and a comparison with other Mediterranean ecosystem model properties

    Science.gov (United States)

    Hattab, Tarek; Ben Rais Lasram, Frida; Albouy, Camille; Romdhane, Mohamed Salah; Jarboui, Othman; Halouani, Ghassen; Cury, Philippe; Le Loc'h, François

    2013-12-01

    In this paper, we describe an exploited continental shelf ecosystem (Gulf of Gabes) in the southern Mediterranean Sea using an Ecopath mass-balance model. This allowed us to determine the structure and functioning of this ecosystem and assess the impacts of fishing upon it. The model represents the average state of the ecosystem between 2000 and 2005. It includes 41 functional groups, which encompass the entire trophic spectrum from phytoplankton to higher trophic levels (e.g., fishes, birds, and mammals), and also considers the fishing activities in the area (five fleets). Model results highlight an important bentho-pelagic coupling in the system due to the links between plankton and benthic invertebrates through detritus. A comparison of this model with those developed for other continental shelf regions in the Mediterranean (i.e., the southern Catalan, the northern-central Adriatic, and the northern Aegean Seas) emphasizes similar patterns in their trophic functioning. Low and medium trophic levels (i.e., zooplankton, benthic molluscs, and polychaetes) and sharks were identified as playing key ecosystem roles and were classified as keystone groups. An analysis of ecosystem attributes indicated that the Gulf of Gabes is the least mature (i.e., in the earliest stages of ecosystem development) of the four ecosystems that were compared and it is suggested that this is due, at least in part, to the impacts of fishing. Bottom trawling was identified as having the widest-ranging impacts across the different functional groups and the largest impacts on some commercially-targeted demersal fish species. Several exploitation indices highlighted that the Gulf of Gabes ecosystem is highly exploited, a finding which is supported by stock assessment outcomes. This suggests that it is unlikely that the gulf can be fished at sustainable levels, a situation which is similar to other marine ecosystems in the Mediterranean Sea.

  3. Performability Modelling Tools, Evaluation Techniques and Applications

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.H.M.

    1990-01-01

    This thesis deals with three aspects of quantitative evaluation of fault-tolerant and distributed computer and communication systems: performability evaluation techniques, performability modelling tools, and performability modelling applications. Performability modelling is a relatively new

  4. A Geographic Model to Assess and Limit Cumulative Ecological Degradation from Marcellus Shale Exploitation in New York, USA

    Directory of Open Access Journals (Sweden)

    John B. Davis

    2012-06-01

    Full Text Available When natural resources are exploited, environmental costs and economic benefits are often asymmetric. An example is apparent in the environmental impacts from fossil fuel extraction by hydraulic fracturing. So far, most scrutiny has been focused on water quality in affected aquifers, with less attention paid to broader ecological impacts beyond individual drilling operations. Marcellus Shale methane exploitation in New York State, USA, has been delayed because of a regulatory moratorium, pending evaluation that has been directed primarily at localized impacts. We developed a GIS-based model, built on a hexagonal grid underlay nested within the U.S. Environmental Protection Agency's EMAP system, to examine potential cumulative ecological impacts. In a two-step process, we characterized > 19,000 hexagons, each sized to approximate the footprint of one drilling site (2.57 km²), using ecological attributes; we then developed a method for apportioning resource access that includes assessments of cumulative ecological costs. Over one-quarter of the hexagons were excluded as off-limits on the basis of six criteria: slope suitability, regulated wetland cover, protected-land cover, length of high-quality streams, mapped road density, and open water cover. Three additional criteria were applied to assess the estimated conservation vulnerability of the remaining sites: density of grassland birds (North American Breeding Bird Survey), percent core forest (Coastal Change Analysis Program), and total density of all state-mapped streams; these were determined and used in combination to rank the 14,000 potentially accessible sites. In a second step, an iterative process was used to distribute potential site access among all towns (sub-county governments) within the Marcellus Shale Formation. At each iteration, one site was selected per town, either randomly or in rank order of increasing vulnerability. Results were computed as percent cumulative impact versus the

  5. Characterisation and exploitation of Atlas electromagnetic calorimeter performances: muons study and timing resolution use; Caracterisation et exploitation des performances du calorimetre electromagnetique d'Atlas: etude des muons et mise a profit de la resolution en temps

    Energy Technology Data Exchange (ETDEWEB)

    Camard, A

    2004-10-01

    The ATLAS detector at the LHC includes an electromagnetic calorimeter. The purpose of this work is to study the calorimeter response to the muons contaminating the beam used to test the different modules of ATLAS. We have shown how analysis of test-beam data can be used to verify the required performance, and that the study of the detector response to muons provides a diagnostic tool complementary to electrons. We took part in the design of a test bench for assessing the performance of the receiver circuit for timing and trigger signals. Within a fast simulation of ATLAS, we developed a tool that reconstructs the position of the main event vertex simply and quickly from the arrival times of particles measured with the ATLAS calorimeters. This tool is likely to be useful during the start-up phase of the ATLAS experiment, because it can be deployed quickly and is less sensitive to background noise than traditional methods based on charged-particle track recognition inside the detector.

  6. Performance Evaluation of a SOA-based Rack-To-Rack Switch for Optical Interconnects Exploiting NRZ-DPSK

    DEFF Research Database (Denmark)

    Karinou, Fotini; Borkowski, Robert; Prince, Kamau

    2012-01-01

    We experimentally study the transmission performance of 10-Gb/s NRZ-DPSK through concatenated AWG MUX/DMUXs and SOAs employed in an optimized 64×64 optical supercomputer interconnect architecture. NRZ-DPSK offers 9-dB higher dynamic range compared to conventional IM/DD.

  7. Performance comparison of 850-nm and 1550-nm VCSELs exploiting OOK, OFDM, and 4-PAM over SMF/MMF links for low-cost optical interconnects

    DEFF Research Database (Denmark)

    Karinou, Fotini; Deng, Lei; Rodes Lopez, Roberto

    2013-01-01

    We experimentally compare the performance of two commercially available vertical-cavity surface-emitting laser diodes (VCSELs), a multi-mode 850-nm and a single-mode 1550-nm, exploiting on–off keying/direct detection (OOK/DD), and orthogonal frequency division multiplexed (OFDM) quadrature phase...... modulation (4-PAM), for the 1550-nm transmitter over SMF and MMF links and we compare it to the data-rate equivalent NRZ-OOK. The extensive performance comparison under various transmission scenarios shows the superiority of 1550-nm single-mode VCSEL compared to its multi-mode 850-nm counterpart. Moreover......, OFDM/DD and 4-PAM in conjunction with low-cost, inexpensive VCSELs as transmitters prove to be an enabling technology for next-generation WDM, point-to-point, short-reach, SMF/MMF optical interconnects and potential candidates to substitute NRZ-OOK. Nevertheless, the sensitivity requirements are higher...

  8. Development of a 1 D hydrodynamic habitat model for the Hippopotamus amphibious as basis for sustainable exploitation of hydroelectric power

    Science.gov (United States)

    Manful, D. Y.; Kaule, G.; Wieprecht, S.; Rees, J.; Hu, W.

    2009-12-01

    operating rules of the reservoir in the post-construction phase of the dam. A great deal of work has been done on the effects of stream-flow changes on fish, especially salmonids. Very little work, however, has been done to assess the impact of hydropower schemes on aquatic mammals, especially in Africa. HIPStrA is the first attempt at developing a computer-based habitat model for a large aquatic megaherbivore. The need for energy for development, the availability of large rivers, and a rich biodiversity base in Africa make a case for careful and ecologically smart exploitation. The overarching aim of the study is the sustainable development of hydroelectric power through the use of methodologies and tools that rigorously assess changes in instream conditions that affect aquatic mammals.

  9. Elevated temperature and acclimation time affect metabolic performance in the heavily exploited Nile perch of Lake Victoria.

    Science.gov (United States)

    Nyboer, Elizabeth A; Chapman, Lauren J

    2017-10-15

    Increasing water temperatures owing to anthropogenic climate change are predicted to negatively impact the aerobic metabolic performance of aquatic ectotherms. Specifically, it has been hypothesized that thermal increases result in reductions in aerobic scope (AS), which lead to decreases in energy available for essential fitness and performance functions. Consequences of warming are anticipated to be especially severe for warm-adapted tropical species as they are thought to have narrow thermal windows and limited plasticity for coping with elevated temperatures. In this study we test how predicted warming may affect the aerobic performance of Nile perch (Lates niloticus), a commercially harvested fish species in the Lake Victoria basin of East Africa. We measured critical thermal maxima (CTmax) and key metabolic variables such as AS and excess post-exercise oxygen consumption (EPOC) across a range of temperatures, and compared responses between acute (3-day) exposures and 3-week acclimations. CTmax increased with acclimation temperature; however, 3-week-acclimated fish had higher overall CTmax than acutely exposed individuals. Nile perch also showed the capacity to increase or maintain high AS even at temperatures well beyond their current range; however, acclimated Nile perch had lower AS compared with acutely exposed fish. These changes were accompanied by lower EPOC, suggesting that drops in AS may reflect improved energy utilization after acclimation, a finding that is supported by improvements in growth at high temperatures over the acclimation period. Overall, the results challenge predictions that tropical species have limited thermal plasticity, and that high temperatures will be detrimental because of limitations in AS. © 2017. Published by The Company of Biologists Ltd.

  10. Generation of Digital Surface Models from satellite photogrammetry: the DSM-OPT service of the ESA Geohazards Exploitation Platform (GEP)

    Science.gov (United States)

    Stumpf, André; Michéa, David; Malet, Jean-Philippe

    2017-04-01

    The continuously increasing fleet of agile stereo-capable very-high-resolution (VHR) optical satellites has facilitated the acquisition of multi-view images of the earth surface. Theoretical revisit times have been reduced to less than one day, and the highest commercially available spatial resolution now amounts to 30 cm/pixel. Digital Surface Models (DSM) and point clouds computed from such satellite stereo-acquisitions can provide valuable input for studies in geomorphology, tectonics, glaciology, hydrology and urban remote sensing. The photogrammetric processing, however, still requires significant expertise, computational resources and costly commercial software. To enable a large Earth Science community (researchers and end-users) to process VHR multi-view images easily and rapidly, this work targets the implementation of a fully automatic satellite-photogrammetry pipeline (i.e. DSM-OPT) on the ESA Geohazards Exploitation Platform (GEP). The implemented pipeline is based on the open-source photogrammetry library MicMac [1] and is designed for distributed processing on a cloud-based infrastructure. The service can be employed in pre-defined processing modes (i.e. urban, plain, hilly, and mountainous environments) or in an advanced processing mode (in which expert users have the possibility to adapt the processing parameters to their specific applications). Four representative use cases are presented to illustrate the accuracy of the resulting surface models and ortho-images as well as the overall processing time. These use cases consisted of the construction of surface models from series of Pléiades images for four applications: urban analysis (Strasbourg, France), landslide detection in mountainous environments (South French Alps), co-seismic deformation in mountain environments (Central Italy earthquake sequence of 2016) and fault recognition for paleo-tectonic analysis (North-East India). Comparisons of the satellite-derived topography to airborne

  11. User modeling for exploratory search on the Social Web. Exploiting social bookmarking systems for user model extraction, evaluation and integration

    OpenAIRE

    Gontek, Mirko

    2011-01-01

    Exploratory search is an information seeking strategy that extends beyond the query-and-response paradigm of traditional Information Retrieval models. Users browse through information to discover novel content and to learn more about the newly discovered things. Social bookmarking systems integrate well with exploratory search, because they allow one to search, browse, and filter social bookmarks. Our contribution is an exploratory tag search engine that merges social bookmarking with ex...

  12. Performance modeling of a feature-aided tracker

    Science.gov (United States)

    Goley, G. Steven; Nolan, Adam R.

    2012-06-01

    In order to provide actionable intelligence in a layered sensing paradigm, exploitation algorithms should produce a confidence estimate in addition to the inference variable. This article presents a methodology and results of one such algorithm for feature-aided tracking of vehicles in wide area motion imagery. To perform experiments, a synthetic environment was developed, which provided explicit knowledge of ground truth, tracker prediction accuracy, and control of operating conditions. This synthetic environment leveraged physics-based modeling simulations to re-create traffic flow, vehicle reflectance, obscuration, and shadowing. With the ability to control operating conditions as well as the availability of ground truth, several experiments were conducted to test both the tracker and expected performance. The results show that the performance model produces a meaningful estimate of the tracker performance over the subset of operating conditions.

  13. Data harmonization and model performance

    Science.gov (United States)

    The Joint Committee on Urban Storm Drainage of the International Association for Hydraulic Research (IAHR) and International Association on Water Pollution Research and Control (IAWPRC) was formed in 1982. The current committee members are (no more than two from a country): B. C. Yen, Chairman (USA); P. Harremoes, Vice Chairman (Denmark); R. K. Price, Secretary (UK); P. J. Colyer (UK), M. Desbordes (France), W. C. Huber (USA), K. Krauth (FRG), A. Sjoberg (Sweden), and T. Sueishi (Japan). The IAHR/IAWPRC Joint Committee is forming a Task Group on Data Harmonization and Model Performance. One objective is to promote international urban drainage data harmonization for easy data and information exchange. Another objective is to publicize available models and data internationally. Comments and suggestions concerning the formation and charge of the Task Group are welcome and should be sent to: B. C. Yen, Dept. of Civil Engineering, Univ. of Illinois, 208 N. Romine St., Urbana, IL 61801.

  14. Empirical estimation of recreational exploitation of burbot, Lota lota, in the Wind River drainage of Wyoming using a multistate capture–recapture model

    Science.gov (United States)

    Lewandoski, S. A.; Guy, Christopher S.; Zale, Alexander V.; Gerrity, Paul C.; Deromedi, J. W.; Johnson, K.M.; Skates, D. L.

    2017-01-01

    Burbot, Lota lota (Linnaeus), is a regionally popular sportfish in the Wind River drainage of Wyoming, USA, at the southern boundary of the range of the species. Recent declines in burbot abundances were hypothesised to be caused by overexploitation, entrainment in irrigation canals and habitat loss. This study addressed the overexploitation hypothesis using tagging data to generate reliable exploitation, abundance and density estimates from a multistate capture–recapture model that accounted for incomplete angler reporting and tag loss. Exploitation rate μ was variable among the study lakes and inversely correlated with density. Exploitation thresholds μ40 associated with population densities remaining above 40% of carrying capacity were generated to characterise risk of overharvest using exploitation and density estimates from tagging data and a logistic surplus-production model parameterised with data from other burbot populations. Bull Lake (μ = 0.06, 95% CI: 0.03–0.11; μ40 = 0.18) and Torrey Lake (μ = 0.02, 95% CI: 0.00–0.11; μ40 = 0.18) had a low risk of overfishing, Upper Dinwoody Lake had intermediate risk (μ = 0.08, 95% CI: 0.02–0.32; μ40 = 0.18) and Lower Dinwoody Lake had high risk (μ = 0.32, 95% CI: 0.10–0.67; μ40 = 0.08). These exploitation and density estimates can be used to guide sustainable management of the Wind River drainage recreational burbot fishery and inform management of other burbot fisheries elsewhere.
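The record does not give the exact surplus-production parameterization, but the μ40 threshold can be illustrated with the standard logistic model dB/dt = rB(1 − B/K) − μB, whose equilibrium density is K(1 − μ/r); holding density at 40% of carrying capacity then gives μ40 = 0.6r. A minimal sketch (the growth rate r = 0.3 is an assumption chosen to reproduce the reported μ40 = 0.18, not a value from the paper):

```python
def equilibrium_density(mu, r, K=1.0):
    """Equilibrium stock density under logistic surplus production with
    proportional harvest: dB/dt = r*B*(1 - B/K) - mu*B."""
    return max(K * (1.0 - mu / r), 0.0)

def mu40(r):
    """Exploitation rate that holds equilibrium density at 40% of carrying
    capacity: solve K*(1 - mu/r) = 0.4*K for mu."""
    return 0.6 * r
```

With r = 0.3, densities stay above 0.4K only while μ < 0.18, matching the μ40 = 0.18 threshold reported for three of the four lakes; a lake with different productivity (different r) gets a different threshold, as with Lower Dinwoody's μ40 = 0.08.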

  15. The Ethics of Exploitation

    Directory of Open Access Journals (Sweden)

    Paul McLaughlin

    2008-11-01

    Full Text Available Philosophical inquiry into exploitation has two major deficiencies to date: it assumes that exploitation is wrong by definition; and it pays too much attention to the Marxian account of exploitation. Two senses of exploitation should be distinguished: the ‘moral’ or pejorative sense and the ‘non-moral’ or ‘non-prejudicial’ sense. By demonstrating the conceptual inadequacy of exploitation as defined in the first sense, and by defining exploitation adequately in the latter sense, we seek to demonstrate the moral complexity of exploitation. We contend, moreover, that moral evaluation of exploitation is only possible once we abandon a strictly Marxian framework and attempt, in the long run, to develop an integral ethic along Godwinian lines.

  16. On a model of mixtures with internal variables: Extended Liu procedure for the exploitation of the entropy principle

    Directory of Open Access Journals (Sweden)

    Francesco Oliveri

    2016-01-01

    Full Text Available The exploitation of the second law of thermodynamics for a mixture of two fluids with a scalar internal variable and a first-order nonlocal state space is achieved by using the extended Liu approach. This method requires inserting into the entropy inequality as constraints either the field equations or their gradient extensions. Consequently, the thermodynamic restrictions imposed by the entropy principle are derived without introducing extra terms in either the energy balance equation or the entropy inequality.

  17. Model Performance Evaluation and Scenario Analysis (MPESA)

    Science.gov (United States)

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for the Hydrological Simulation Program—FORTRAN (HSPF) and the Stormwater Management Model (SWMM).
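The record does not list MPESA's statistics, but two metrics commonly used to assess how well HSPF or SWMM predictions track observed time series are the root mean square error and the Nash–Sutcliffe efficiency. A minimal sketch:

```python
import math

def rmse(obs, sim):
    """Root mean square error between observed and simulated series."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1.0 is a perfect fit; 0.0 means the model
    predicts no better than the mean of the observations."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot
```

An efficiency above roughly 0.5 is often taken as satisfactory in hydrologic simulation, though the cutoff is application-dependent.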

  18. Generating Orthorectified Multi-Perspective 2.5D Maps to Facilitate Web GIS-Based Visualization and Exploitation of Massive 3D City Models

    Directory of Open Access Journals (Sweden)

    Jianming Liang

    2016-11-01

    Full Text Available A 2.5D map is a convenient and efficient approach to exploiting a massive three-dimensional (3D) city model in web GIS. With the rapid development of oblique airborne photogrammetry and photo-based 3D reconstruction, 3D city models are becoming more and more accessible. A 3D Geographic Information System (GIS) can support the interactive visualization of massive 3D city models on various platforms and devices. However, the value and accessibility of existing 3D city models can be augmented by integrating them into web-based two-dimensional (2D) GIS applications. In this paper, we present a step-by-step workflow for generating orthorectified oblique images (2.5D maps) from massive 3D city models. The proposed framework can produce 2.5D maps from an arbitrary perspective, defined by the elevation angle and azimuth angle of a virtual orthographic camera. We demonstrate how 2.5D maps can benefit web-based visualization and exploitation of massive 3D city models. We conclude that a 2.5D map is a compact data representation optimized for web data streaming of 3D city models and that geometric analysis of buildings can be effectively conducted on 2.5D maps.
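The virtual orthographic camera mentioned above can be illustrated by converting its two defining angles into a unit view vector. A minimal sketch, assuming azimuth is measured clockwise from north and elevation above the horizon (a common GIS convention; the paper's exact convention is not stated in the record):

```python
import math

def view_direction(elevation_deg, azimuth_deg):
    """Unit view vector of a virtual orthographic camera, in an
    east-north-up frame (x = east, y = north, z = up)."""
    el = math.radians(elevation_deg)
    az = math.radians(azimuth_deg)
    return (math.cos(el) * math.sin(az),
            math.cos(el) * math.cos(az),
            math.sin(el))
```

An elevation of 90° reduces to a conventional top-down orthophoto, while lower elevations yield the oblique perspectives that make building facades visible in a 2.5D map.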

  19. Improving the Performance Scalability of the Community Atmosphere Model

    Energy Technology Data Exchange (ETDEWEB)

    Mirin, Arthur [Lawrence Livermore National Laboratory (LLNL); Worley, Patrick H [ORNL

    2012-01-01

    The Community Atmosphere Model (CAM), which serves as the atmosphere component of the Community Climate System Model (CCSM), is the most computationally expensive CCSM component in typical configurations. On current and next-generation leadership class computing systems, the performance of CAM is tied to its parallel scalability. Improving performance scalability in CAM has been a challenge, due largely to algorithmic restrictions necessitated by the polar singularities in its latitude-longitude computational grid. Nevertheless, through a combination of exploiting additional parallelism, implementing improved communication protocols, and eliminating scalability bottlenecks, we have been able to more than double the maximum throughput rate of CAM on production platforms. We describe these improvements and present results on the Cray XT5 and IBM BG/P. The approaches taken are not specific to CAM and may inform similar scalability enhancement activities for other codes.

  20. Exploitability Assessment with TEASER

    Science.gov (United States)

    2017-05-01

    exploits. We saw this as an advantage of our dataset because we had to confirm that a bug was either exploitable or not exploitable. ... corruption, which demonstrates that there is very little activity within c-ares to take advantage of after the heap corruption. This idea is in line with the ... remote code execution POCs.

  1. Anthropology of sexual exploitation

    Directory of Open Access Journals (Sweden)

    Lalić Velibor

    2009-01-01

    Full Text Available In this paper, the authors observe sexual exploitation from an anthropological perspective. They analyze the rational, ethical, emotional and mythological dimensions of human sexuality. After setting the phenomenon in its social and historical context, sexual exploitation is closely observed in the contemporary age. Drawing on the thought of relevant thinkers, they conclude that the elimination of sexual exploitation is not a purely legal issue, but a political and economic one as well. Legal norms alone are not sufficient to overcome sexual exploitation; political and economic relationships based on genuinely equal opportunities must be established in contemporary societies.

  2. Evaluating survival model performance: a graphical approach.

    Science.gov (United States)

    Mandel, M; Galai, N; Simchen, E

    2005-06-30

    In the last decade, many statistics have been suggested to evaluate the performance of survival models. These statistics evaluate the overall performance of a model ignoring possible variability in performance over time. Using an extension of measures used in binary regression, we propose a graphical method to depict the performance of a survival model over time. The method provides estimates of performance at specific time points and can be used as an informal test for detecting time varying effects of covariates in the Cox model framework. The method is illustrated on real and simulated data using Cox proportional hazard model and rank statistics. Copyright 2005 John Wiley & Sons, Ltd.

  3. EXPLOITATION OF GRANITE BOULDER

    Directory of Open Access Journals (Sweden)

    Ivan Cotman

    1994-12-01

    Full Text Available The formation processes, petrography, features, properties and exploitation of granite boulders are described. Directional drilling and black-powder blasting is a successful method for the exploitation of granite boulders ("boulder technology"). (The paper is published in Croatian.)

  4. Exploiting elasticity: Modeling the influence of neural control on mechanics and energetics of ankle muscle-tendons during human hopping.

    Science.gov (United States)

    Robertson, Benjamin D; Sawicki, Gregory S

    2014-07-21

    We present a simplified Hill-type model of the human triceps surae-Achilles tendon complex working on a gravitational-inertial load during cyclic contractions (i.e. vertical hopping). Our goal was to determine the role that neural control plays in governing muscle, or contractile element (CE), and tendon, or series elastic element (SEE), mechanics and energetics within a compliant muscle-tendon unit (MTU). We constructed a 2D parameter space consisting of many combinations of stimulation frequency and magnitude (i.e. neural control strategies). We compared the performance of each control strategy by evaluating peak force and average positive mechanical power output for the system (MTU) and its respective components (CE, SEE), force-length (F-L) and -velocity (F-V) operating point of the CE during active force production, average metabolic rate for the CE, and both MTU and CE apparent efficiency. Our results suggest that frequency of stimulation plays a primary role in governing whole-MTU mechanics. These include the phasing of both activation and peak force relative to minimum MTU length, average positive power, and apparent efficiency. Stimulation amplitude was primarily responsible for governing average metabolic rate and within MTU mechanics, including peak force generation and elastic energy storage and return in the SEE. Frequency and amplitude of stimulation both played integral roles in determining CE F-L operating point, with both higher frequency and amplitude generally corresponding to lower CE strains, reduced injury risk, and elimination of the need for passive force generation in the CE parallel elastic element (PEE). Copyright © 2014 Elsevier Ltd. All rights reserved.
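The Hill-type contractile element described above is conventionally modeled as activation times force-length times force-velocity scaling of the maximum isometric force. A minimal sketch with generic textbook curve shapes (the constants and curve forms are assumptions, not those of the cited model):

```python
import math

def hill_force(act, l_norm, v_norm, f_max=1.0):
    """Contractile-element force for a minimal Hill-type muscle model.

    act: activation in [0, 1]; l_norm: CE length normalized to optimal
    length; v_norm: CE velocity normalized to maximum shortening velocity
    (negative = shortening); f_max: maximum isometric force.
    """
    f_l = math.exp(-((l_norm - 1.0) / 0.45) ** 2)          # Gaussian F-L curve
    if v_norm < 0:                                          # shortening branch
        f_v = (1.0 + v_norm) / (1.0 - v_norm / 0.25)
    else:                                                   # lengthening branch
        f_v = 1.8 - 0.8 * (1.0 + v_norm) / (1.0 + 7.56 * v_norm)
    return act * f_l * max(f_v, 0.0) * f_max
```

In a compliant muscle-tendon unit the tendon lets the contractile element stay near the plateau of the F-L curve and at low shortening velocities, which is why stimulation timing matters so much for mechanics and energetics in the study above.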

  5. Cognitive performance modeling based on general systems performance theory.

    Science.gov (United States)

    Kondraske, George V

    2010-01-01

    General Systems Performance Theory (GSPT) was initially motivated by problems associated with quantifying different aspects of human performance. It has proved to be invaluable for measurement development and understanding quantitative relationships between human subsystem capacities and performance in complex tasks. It is now desired to bring focus to the application of GSPT to modeling of cognitive system performance. Previous studies involving two complex tasks (i.e., driving and performing laparoscopic surgery) and incorporating measures that are clearly related to cognitive performance (information processing speed and short-term memory capacity) were revisited. A GSPT-derived method of task analysis and performance prediction termed Nonlinear Causal Resource Analysis (NCRA) was employed to determine the demand on basic cognitive performance resources required to support different levels of complex task performance. This approach is presented as a means to determine a cognitive workload profile and the subsequent computation of a single number measure of cognitive workload (CW). Computation of CW may be a viable alternative to measuring it. Various possible "more basic" performance resources that contribute to cognitive system performance are discussed. It is concluded from this preliminary exploration that a GSPT-based approach can contribute to defining cognitive performance models that are useful for both individual subjects and specific groups (e.g., military pilots).

  6. Modeling of Ship Propulsion Performance

    DEFF Research Database (Denmark)

    Pedersen, Benjamin Pjedsted; Larsen, Jan

    2009-01-01

    Full-scale measurements of the propulsion power, ship speed, wind speed and direction, and sea and air temperature from four different loading conditions have been used to train a neural network for prediction of propulsion power. The network was able to predict the propulsion power with an accuracy between 0.8 and 2.8%, which is about the same accuracy as for the measurements. The methods developed are intended to support the performance monitoring system SeaTrend® developed by FORCE Technology (FORCE, 2008).
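
As a rough illustration of the regression setup described in this record (not the paper's network, data, or accuracy figures), the sketch below fits a single linear neuron by gradient descent to predict propulsion power from ship speed. All numbers are synthetic, and the cubic speed-power relation is only a crude physical prior.

```python
import random

# Synthetic stand-in data: power roughly proportional to speed cubed, plus noise.
random.seed(0)
speeds = [10 + i * 0.5 for i in range(20)]                     # knots (synthetic)
powers = [0.9 * v ** 3 + random.gauss(0, 50) for v in speeds]  # kW (synthetic)

# Use v^3 as the single input feature, normalised to [0, 1] for stable training.
xs = [v ** 3 for v in speeds]
x_max, y_max = max(xs), max(powers)
xs = [x / x_max for x in xs]
ys = [y / y_max for y in powers]

# One linear neuron (weight + bias) trained by plain batch gradient descent.
w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    grad_w = sum((w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum((w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad_w
    b -= lr * grad_b

def predict_power(v):
    """Predicted propulsion power in kW for speed v in knots."""
    return (w * (v ** 3 / x_max) + b) * y_max

# Mean relative prediction error over the training data.
rel_err = sum(abs(predict_power(v) - p) / p
              for v, p in zip(speeds, powers)) / len(speeds)
```

A real application would use a multi-layer network, held-out validation data, and the additional wind and temperature inputs the record mentions.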

  7. METAPHOR (version 1): Users guide. [performability modeling]

    Science.gov (United States)

    Furchtgott, D. G.

    1979-01-01

    General information concerning METAPHOR, an interactive software package to facilitate performability modeling and evaluation, is presented. Example systems are studied and their performabilities are calculated. Each available METAPHOR command and array generator is described. Complete METAPHOR sessions are included.

  8. Assembly line performance and modeling

    Science.gov (United States)

    Rane, Arun B.; Sunnapwar, Vivek K.

    2017-03-01

    The automobile sector forms the backbone of the manufacturing sector. The vehicle assembly line is an important section of an automobile plant, where repetitive tasks are performed one after another at different workstations. In this work, a methodology is proposed to reduce cycle time and time losses due to important factors such as equipment failure, shortage of inventory, absenteeism, set-up, material handling, rejection and fatigue, in order to improve output within given cost constraints. Various relationships between these factors, their corresponding costs and output are established through a scientific approach. The methodology is validated in three different vehicle assembly plants. The proposed methodology may help practitioners optimize an assembly line using lean techniques.

  9. Generalization performance of regularized neural network models

    DEFF Research Database (Denmark)

    Larsen, Jan; Hansen, Lars Kai

    1994-01-01

    Architecture optimization is a fundamental problem of neural network modeling. The optimal architecture is defined as the one which minimizes the generalization error. This paper addresses estimation of the generalization performance of regularized, complete neural network models. Regularization...

  10. Teotihuacan, tepeapulco, and obsidian exploitation.

    Science.gov (United States)

    Charlton, T H

    1978-06-16

    Current cultural ecological models of the development of civilization in central Mexico emphasize the role of subsistence production techniques and organization. The recent use of established and productive archeological surface survey techniques along natural corridors of communication between favorable niches for cultural development within the Central Mexican symbiotic region resulted in the location of sites that indicate an early development of a decentralized resource exploitation, manufacturing, and exchange network. The association of the development of this system with Teotihuacán indicates the importance such nonsubsistence production and exchange had in the evolution of this first central Mexican civilization. The later expansion of Teotihuacán into more distant areas of Mesoamerica was based on this resource exploitation model. Later civilizations centered at Tula and Tenochtitlán also used such a model in their expansion.

  11. Testing predictive performance of binary choice models

    NARCIS (Netherlands)

    A.C.D. Donkers (Bas); B. Melenberg (Bertrand)

    2002-01-01

    Binary choice models occur frequently in economic modeling. A measure of the predictive performance of binary choice models that is often reported is the hit rate of a model. This paper develops a test for the outperformance of a predictor for binary outcomes over a naive prediction
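
A minimal sketch of the kind of comparison this record describes, assuming a simple two-proportion z statistic rather than the paper's actual test; the toy data and the `outperformance_z` helper are hypothetical.

```python
import math

def hit_rate(y_true, y_pred):
    """Fraction of correctly predicted binary outcomes."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def outperformance_z(y_true, y_pred):
    """z statistic for H0: the model's hit rate equals that of the naive
    predictor that always forecasts the majority outcome.
    (A simplified two-proportion test, not the paper's statistic.)"""
    n = len(y_true)
    p_model = hit_rate(y_true, y_pred)
    majority = 1 if sum(y_true) * 2 >= n else 0
    p_naive = hit_rate(y_true, [majority] * n)
    pooled = (p_model + p_naive) / 2
    se = math.sqrt(2 * pooled * (1 - pooled) / n)
    return (p_model - p_naive) / se if se > 0 else 0.0

# Toy data: the model gets 9 of 10 outcomes right; 6 of 10 outcomes are 1s,
# so the naive majority predictor scores only 0.6.
y = [1, 1, 1, 1, 1, 1, 0, 0, 0, 0]
pred = [1, 1, 1, 1, 1, 1, 0, 0, 0, 1]
```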

  12. Student Modeling in Orthopedic Surgery Training: Exploiting Symbiosis between Temporal Bayesian Networks and Fine-Grained Didactic Analysis

    Science.gov (United States)

    Chieu, Vu Minh; Luengo, Vanda; Vadcard, Lucile; Tonetti, Jerome

    2010-01-01

    Cognitive approaches have been used for student modeling in intelligent tutoring systems (ITSs). Many of those systems have tackled fundamental subjects such as mathematics, physics, and computer programming. The change of the student's cognitive behavior over time, however, has not been considered and modeled systematically. Furthermore, the…

  13. Modeling and simulation of stamp deflections in nanoimprint lithography: Exploiting backside grooves to enhance residual layer thickness uniformity

    DEFF Research Database (Denmark)

    Taylor, Hayden; Smistrup, Kristian; Boning, Duane

    2011-01-01

    We describe a model for the compliance of a nanoimprint stamp etched with a grid of backside grooves. We integrate the model with a fast simulation technique that we have previously demonstrated, to show how etched grooves help reduce the systematic residual layer thickness (RLT) variations that ...

  14. Compressed sensing MRI exploiting complementary dual decomposition.

    Science.gov (United States)

    Park, Suhyung; Park, Jaeseok

    2014-04-01

    Compressed sensing (CS) MRI exploits the sparsity of an image in a transform domain to reconstruct the image from incoherently under-sampled k-space data. However, it has been shown that CS suffers particularly from loss of low-contrast image features with increasing reduction factors. To retain image details in such degraded experimental conditions, in this work we introduce a novel CS reconstruction method exploiting feature-based complementary dual decomposition with joint estimation of local scale mixture (LSM) model and images. Images are decomposed into dual block sparse components: total variation for piecewise smooth parts and wavelets for residuals. The LSM model parameters of residuals in the wavelet domain are estimated and then employed as a regional constraint in spatially adaptive reconstruction of high frequency subbands to restore image details missing in piecewise smooth parts. Alternating minimization of the dual image components subject to data consistency is performed to extract image details from residuals and add them back to their complementary counterparts while the LSM model parameters and images are jointly estimated in a sequential fashion. Simulations and experiments demonstrate the superior performance of the proposed method in preserving low-contrast image features even at high reduction factors.

  15. Summary of photovoltaic system performance models

    Energy Technology Data Exchange (ETDEWEB)

    Smith, J. H.; Reiter, L. J.

    1984-01-15

    The purpose of this study is to provide a detailed overview of photovoltaics (PV) performance modeling capabilities that have been developed during recent years for analyzing PV system and component design and policy issues. A set of 10 performance models have been selected which span a representative range of capabilities from generalized first-order calculations to highly specialized electrical network simulations. A set of performance modeling topics and characteristics is defined and used to examine some of the major issues associated with photovoltaic performance modeling. Next, each of the models is described in the context of these topics and characteristics to assess its purpose, approach, and level of detail. Then each of the issues is discussed in terms of the range of model capabilities available and summarized in tabular form for quick reference. Finally, the models are grouped into categories to illustrate their purposes and perspectives.

  16. 3D Massive MIMO Systems: Modeling and Performance Analysis

    KAUST Repository

    Nadeem, Qurrat-Ul-Ain

    2015-07-30

    Multiple-input-multiple-output (MIMO) systems of current LTE releases are capable of adaptation in the azimuth only. Recently, the trend is to enhance system performance by exploiting the channel’s degrees of freedom in the elevation, which necessitates the characterization of 3D channels. We present an information-theoretic channel model for MIMO systems that supports the elevation dimension. The model is based on the principle of maximum entropy, which enables us to determine the distribution of the channel matrix consistent with the prior information on the angles. Based on this model, we provide an analytical expression for the cumulative distribution function (CDF) of the mutual information (MI) for systems with a single receive antenna and a finite number of transmit antennas in the general signal-to-interference-plus-noise-ratio (SINR) regime. The result is extended to systems with a finite number of receive antennas in the low SINR regime. A Gaussian approximation to the asymptotic behavior of the MI distribution is derived for the regime of a large number of transmit antennas and paths. We corroborate our analysis with simulations that study the performance gains realizable through meticulous selection of the transmit antenna downtilt angles, confirming the potential of elevation beamforming to enhance system performance. The results are directly applicable to the analysis of 5G 3D massive MIMO systems.

  17. Classification model and analysis on students' performance ...

    African Journals Online (AJOL)

    The purpose of this paper is to propose a classification model for classifying students' performance in Sijil Pelajaran ... along with the examination data. This research shows that first-semester results can be used to identify students' performance. Keywords: educational data mining; classification model; feature selection ...

  18. Translation from UML to SPN Model: A Performance Modeling Framework

    OpenAIRE

    Khan, Razib Hayat; Heegaard, Poul E.

    2010-01-01

    This work focuses on delineating a performance modeling framework for a communication system that proposes a translation process from high-level UML notation to a Stochastic Petri Net (SPN) model and solves the model for relevant performance metrics. The framework utilizes UML collaborations, activity diagrams and deployment diagrams to generate a performance model for a communication system. The system dynamics will be captured by UML collaboration and a...

  19. Towards the development of run times leveraging virtualization for high performance computing; Contribution a l'elaboration de supports executifs exploitant la virtualisation pour le calcul hautes performances

    Energy Technology Data Exchange (ETDEWEB)

    Diakhate, F.

    2010-12-15

    In recent years, there has been a growing interest in using virtualization to improve the efficiency of data centers. This success is rooted in virtualization's excellent fault tolerance and isolation properties, in the overall flexibility it brings, and in its ability to exploit multi-core architectures efficiently. These characteristics also make virtualization an ideal candidate to tackle issues found in new compute cluster architectures. However, in spite of recent improvements in virtualization technology, overheads in the execution of parallel applications remain, which prevent its use in the field of high performance computing. In this thesis, we propose a virtual device dedicated to message passing between virtual machines, so as to improve the performance of parallel applications executed in a cluster of virtual machines. We also introduce a set of techniques facilitating the deployment of virtualized parallel applications. These functionalities have been implemented as part of a runtime system which makes it possible to benefit from virtualization's properties as transparently as possible to the user while minimizing performance overheads. (author)

  20. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model's temperature calculations agree with those of the commercial software ABAQUS (Version 6.4-4). This report outlines the verification methodology, the code input, and the calculation results.

  1. Maintaining the confidentiality of plot locations by exploiting the low sensitivity of forest structure models to different spectral extraction kernels

    Science.gov (United States)

    Sean P. Healey; Elizabeth Lapoint; Gretchen G. Moisen; Scott L. Powell

    2011-01-01

    The United States Forest Service Forest Inventory and Analysis (FIA) unit maintains a large national network of inventory plots. While the consistency and extent of this network make FIA data attractive for ecological modelling, the FIA is charged by statute not to publicly reveal inventory plot locations. However, use of FIA plot data by the remote sensing community...

  2. Tumor hypoxia - A confounding or exploitable factor in interstitial brachytherapy? Effects of tissue trauma in an experimental rat tumor model

    NARCIS (Netherlands)

    van den Berg, AP; van Geel, CAJF; van Hooije, CMC; van der Kleij, AJ; Visser, AG

    2000-01-01

    Purpose: To evaluate the potential effects of tumor hypoxia induced by afterloading catheter implantation on the effectiveness of brachytherapy in a rat tumor model. Methods and Materials: Afterloading catheters (4) were implanted in subcutaneously growing R1M rhabdomyosarcoma in female Wag/Rij

  3. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

    In this paper a new model validation procedure for a logistic regression model is presented. First, we give a brief review of different techniques of model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for assessing the performance of a given model using an example taken from a management study.
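
To make the notion of quantitative performance measures concrete, here is a small sketch computing a few measures commonly used when validating a logistic regression model; the measure list and the toy data are illustrative, not the ones defined in the paper.

```python
def classification_measures(y_true, p_hat, threshold=0.5):
    """A few common validation measures for a fitted binary classifier,
    given observed outcomes y_true and fitted probabilities p_hat."""
    y_pred = [1 if p >= threshold else 0 for p in p_hat]
    tp = sum(1 for t, q in zip(y_true, y_pred) if t == 1 and q == 1)
    tn = sum(1 for t, q in zip(y_true, y_pred) if t == 0 and q == 0)
    fp = sum(1 for t, q in zip(y_true, y_pred) if t == 0 and q == 1)
    fn = sum(1 for t, q in zip(y_true, y_pred) if t == 1 and q == 0)
    pos = [p for t, p in zip(y_true, p_hat) if t == 1]
    neg = [p for t, p in zip(y_true, p_hat) if t == 0]
    # AUC as the probability that a random positive outscores a random negative.
    pairs = [(a, b) for a in pos for b in neg]
    auc = sum((a > b) + 0.5 * (a == b) for a, b in pairs) / len(pairs)
    return {
        "accuracy": (tp + tn) / len(y_true),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "auc": auc,
    }

# Toy validation set: observed outcomes vs. fitted probabilities.
m = classification_measures([1, 1, 1, 0, 0, 0], [0.9, 0.8, 0.4, 0.3, 0.2, 0.6])
```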

  4. Groundwater – Geothermal preliminary model of the Acque Albule Basin (Rome: future perspectives of geothermal resources exploitation

    Directory of Open Access Journals (Sweden)

    Francesco La Vigna

    2013-12-01

    This work presents the preliminary results of a groundwater and geothermal model applied to the hydrothermal system of the Tivoli-Guidonia plain, in the eastern surroundings of Rome. This area, characterized by a thick outcropping travertine deposit, has been an important quarrying area since Roman times. Today the extraction is deepening, aided by large-scale dewatering. From a hydrogeological point of view, the travertine aquifer of the Tivoli-Guidonia plain is recharged by lateral discharge from the Lucretili and Cornicolani Mts., and by piping through important regional faults, located in the basal aquiclude, in the central area of the basin. Piping hydrothermal groundwater is the main contribution to flow in the basin. Preliminary simulations of the groundwater-geothermal model reproduce quite well the heat and mineralization plumes observed in the travertine aquifer.

  5. Exploiting amoeboid and non-vertebrate animal model systems to study the virulence of human pathogenic fungi.

    Directory of Open Access Journals (Sweden)

    Eleftherios Mylonakis

    2007-07-01

    Experiments with insects, protozoa, nematodes, and slime molds have recently come to the forefront in the study of host-fungal interactions. Many of the virulence factors required for pathogenicity in mammals are also important for fungal survival during interactions with non-vertebrate hosts, suggesting that fungal virulence may have evolved, and been maintained, as a countermeasure to environmental predation by amoebae and nematodes and other small non-vertebrates that feed on microorganisms. Host innate immune responses are also broadly conserved across many phyla. The study of the interaction between invertebrate model hosts and pathogenic fungi therefore provides insights into the mechanisms underlying pathogen virulence and host immunity, and complements the use of mammalian models by enabling whole-animal high throughput infection assays. This review aims to assist researchers in identifying appropriate invertebrate systems for the study of particular aspects of fungal pathogenesis.

  6. Photovoltaic performance models - A report card

    Science.gov (United States)

    Smith, J. H.; Reiter, L. R.

    1985-01-01

    Models for the analysis of photovoltaic (PV) systems' designs, implementation policies, and economic performance, have proliferated while keeping pace with rapid changes in basic PV technology and extensive empirical data compiled for such systems' performance. Attention is presently given to the results of a comparative assessment of ten well documented and widely used models, which range in complexity from first-order approximations of PV system performance to in-depth, circuit-level characterizations. The comparisons were made on the basis of the performance of their subsystem, as well as system, elements. The models fall into three categories in light of their degree of aggregation into subsystems: (1) simplified models for first-order calculation of system performance, with easily met input requirements but limited capability to address more than a small variety of design considerations; (2) models simulating PV systems in greater detail, encompassing types primarily intended for either concentrator-incorporating or flat plate collector PV systems; and (3) models not specifically designed for PV system performance modeling, but applicable to aspects of electrical system design. Models ignoring subsystem failure or degradation are noted to exclude operating and maintenance characteristics as well.

  7. Modelling the sequential geographical exploitation and potential collapse of marine fisheries through economic globalization, climate change and management alternatives

    Directory of Open Access Journals (Sweden)

    Gorka Merino

    2011-07-01

    Global marine fisheries production has reached a maximum and may even be declining. Underlying this trend is a well-understood sequence of development, overexploitation, depletion and in some instances collapse of individual fish stocks, a pattern that can sequentially link geographically distant populations. Ineffective governance, economic considerations and climate impacts are often responsible for this sequence, although the relative contribution of each factor is contentious. In this paper we use a global bioeconomic model to explore the synergistic effects of climate variability, economic pressures and management measures in causing or avoiding this sequence. The model shows how a combination of climate-induced variability in the underlying fish population production, particular patterns of demand for fish products and inadequate management is capable of driving the world’s fisheries into development, overexploitation, collapse and recovery phases consistent with observations. Furthermore, it demonstrates how a sequential pattern of overexploitation can emerge as an endogenous property of the interaction between regional environmental fluctuations and a globalized trade system. This situation is avoidable through adaptive management measures that ensure the sustainability of regional production systems in the face of increasing global environmental change and markets. It is concluded that global management measures are needed to ensure that global food supply from marine products is optimized while protecting long-term ecosystem services across the world’s oceans.
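
The development-overexploitation-collapse sequence can be illustrated with a far simpler model than the paper's global bioeconomic one; the sketch below iterates a Schaefer-type surplus-production equation with hypothetical parameter values.

```python
def simulate_stock(effort, years=100, r=0.4, K=1.0, q=0.5, b0=1.0):
    """Schaefer-type surplus-production model (a simplified stand-in for a
    bioeconomic fishery model): dB/dt = r*B*(1 - B/K) - q*E*B, iterated
    in yearly steps. Returns the biomass after `years` years of fishing
    at constant effort E."""
    b = b0
    for _ in range(years):
        b += r * b * (1 - b / K) - q * effort * b  # growth minus harvest
        b = max(b, 0.0)
    return b

# When the fishing mortality q*E stays below the intrinsic growth rate r,
# the stock settles at a positive equilibrium K*(1 - q*E/r); above it,
# the stock is driven toward collapse.
sustainable = simulate_stock(effort=0.3)  # q*E = 0.15 < r = 0.4
collapsed = simulate_stock(effort=1.0)    # q*E = 0.50 > r = 0.4
```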

  8. Development of a complex groundwater model to assess the relation among groundwater resource exploitation, seawater intrusion and land subsidence

    Science.gov (United States)

    Hsi Ting, Fang; Yih Chi, Tan; Chen, Jhong Bing

    2016-04-01

    Land subsidence, which is usually irreversible, has occurred in Taiwan's Pingtung Plain due to groundwater overexploitation. Many of the land subsidence areas in Taiwan are located in coastal areas. Subsidence can not only result in loss of land, but also increase vulnerability to flooding, because the lowered ground surface weakens the function of drainage systems and sea walls. Groundwater salinization and seawater intrusion can also occur more easily. This research focuses on capturing the trend of environmental change caused by the damage from inappropriate development of aquaculture over the last decades. The main task is developing artificial neural networks (ANNs) and a complex numerical model for conjunctive use of surface water and groundwater, composed of modules such as land use, land subsidence, contaminant transport, etc. An approach based on self-organizing maps (SOMs) is proposed to delineate groundwater recharge zones. Several topics will be studied, such as coupling surface water and groundwater modeling, assessing the benefit of improving groundwater resources by recharge, identifying improper use of groundwater resources, and investigating the effect of over-pumping on land subsidence at different depths. In addition, a complete plan for managing both flooding and water resources will be instituted by devising non-engineering adaptation strategies for homeland planning, e.g. controlling pumping behavior in areas vulnerable to land subsidence and increasing groundwater recharge.

  9. Dark matters: exploitation as cooperation.

    Science.gov (United States)

    Dasgupta, Partha

    2012-04-21

    The empirical literature on human cooperation contains studies of communitarian institutions that govern the provision of public goods and management of common property resources in poor countries. Scholars studying those institutions have frequently used the Prisoners' Dilemma game as their theoretical tool-kit. But neither the provision of local public goods nor the management of local common property resources involves the Prisoners' Dilemma. That has implications for our reading of communitarian institutions. By applying a fundamental result in the theory of repeated games to a model of local common property resources, it is shown that communitarian institutions can harbour exploitation of fellow members, something that would not be possible in societies where cooperation amounts to overcoming the Prisoners' Dilemma. The conclusion we should draw is that exploitation can masquerade as cooperation.

  10. Network exploitation using WAMI tracks

    Science.gov (United States)

    Rimey, Ray; Record, Jim; Keefe, Dan; Kennedy, Levi; Cramer, Chris

    2011-06-01

    Creating and exploiting network models from wide area motion imagery (WAMI) is an important task for intelligence analysis. Tracks of entities observed moving in the WAMI sensor data are extracted, then large numbers of tracks are studied over long time intervals to determine specific locations that are visited (e.g., buildings in an urban environment), what locations are related to other locations, and the function of each location. This paper describes several parts of the network detection/exploitation problem, and summarizes a solution technique for each: (a) Detecting nodes; (b) Detecting links between known nodes; (c) Node attributes to characterize a node; (d) Link attributes to characterize each link; (e) Link structure inferred from node attributes and vice versa; and (f) Decomposing a detected network into smaller networks. Experimental results are presented for each solution technique, and those are used to discuss issues for each problem part and its solution technique.

  11. Long Term Association of Tropospheric Trace gases over Pakistan by exploiting satellite observations and development of Econometric Regression based Model

    Science.gov (United States)

    Zeb, Naila; Fahim Khokhar, Muhammad; Khan, Saud Ahmed; Noreen, Asma; Murtaza, Rabbia

    2017-04-01

    ... Furthermore, to explore the causal relation, regression analysis is employed to estimate a model for CO and TOC. This model numerically estimated the long-term association of trace gases over the region.

  12. Translation from UML to Markov Model: A Performance Modeling Framework

    Science.gov (United States)

    Khan, Razib Hayat; Heegaard, Poul E.

    Performance engineering focuses on the quantitative investigation of the behavior of a system during the early phases of the system development life cycle. Bearing this in mind, we delineate a performance modeling framework for communication system applications that proposes a translation process from high-level UML notation to a Continuous Time Markov Chain (CTMC) model and solves the model for relevant performance metrics. The framework utilizes UML collaborations, activity diagrams and deployment diagrams to generate a performance model for a communication system. The system dynamics are captured by UML collaboration and activity diagrams as reusable specification building blocks, while the deployment diagram highlights the components of the system. The collaboration and activity diagrams show how reusable building blocks in the form of collaborations can compose the service components through input and output pins by highlighting the behavior of the components; a mapping between the collaborations and the system components identified by the deployment diagram is then delineated. Moreover, the UML models are annotated with performance-related quality of service (QoS) information, which is necessary for solving the performance model for relevant performance metrics through our proposed framework. The applicability of our proposed performance modeling framework to performance evaluation is demonstrated in the context of modeling a communication system.
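
Solving a CTMC for performance metrics, the final step of the framework described above, can be sketched in a few lines; the generator matrix and rates below are hypothetical, and the solver is a generic steady-state computation, not the framework's own translation process.

```python
def ctmc_steady_state(Q):
    """Steady-state distribution of a CTMC with generator matrix Q,
    solving pi @ Q = 0 together with sum(pi) = 1 by Gaussian
    elimination with partial pivoting (pure Python)."""
    n = len(Q)
    # Transpose Q, then replace one balance equation with the
    # normalisation condition sum(pi) = 1.
    A = [[Q[j][i] for j in range(n)] for i in range(n)]
    A[-1] = [1.0] * n
    b = [0.0] * (n - 1) + [1.0]
    for col in range(n):                       # forward elimination
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    pi = [0.0] * n
    for r in range(n - 1, -1, -1):             # back substitution
        pi[r] = (b[r] - sum(A[r][c] * pi[c] for c in range(r + 1, n))) / A[r][r]
    return pi

# Hypothetical two-state availability model: state 0 = up, state 1 = down,
# failure rate 0.1/h, repair rate 0.9/h. Steady-state availability is pi[0].
Q = [[-0.1, 0.1],
     [0.9, -0.9]]
pi = ctmc_steady_state(Q)
```

For this two-state chain the closed form is mu/(lambda+mu) = 0.9/(0.1+0.9) = 0.9, which the solver reproduces.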

  13. Performance of the Community Earth System Model

    Energy Technology Data Exchange (ETDEWEB)

    Worley, Patrick H [ORNL; Craig, Anthony [National Center for Atmospheric Research (NCAR); Dennis, John [National Center for Atmospheric Research (NCAR); Mirin, Arthur A. [Lawrence Livermore National Laboratory (LLNL); Taylor, Mark [Sandia National Laboratories (SNL); Vertenstein, Mariana [National Center for Atmospheric Research (NCAR)

    2011-01-01

    The Community Earth System Model (CESM), released in June 2010, incorporates new physical process and new numerical algorithm options, significantly enhancing simulation capabilities over its predecessor, the June 2004 release of the Community Climate System Model. CESM also includes enhanced performance tuning options and performance portability capabilities. This paper describes the performance engineering aspects of the CESM and reports performance and performance scaling on both the Cray XT5 and the IBM BG/P for four representative production simulations, varying both problem size and enabled physical processes. The paper also describes preliminary performance results for high resolution simulations using over 200,000 processor cores, indicating the promise of ongoing work in numerical algorithms and where further work is required.

  14. Exploration, Exploitation, and Organizational Coordination Mechanisms

    Directory of Open Access Journals (Sweden)

    Silvio Popadiuk

    2016-03-01

    This paper presents an empirical relationship among exploration, exploitation, and organizational coordination mechanisms, classified as centralization of decision-making, formalization, and connectedness. To analyze the findings of this survey, we used two techniques: Principal Component Analysis (PCA) and Partial Least Squares Path Modeling (PLS-PM). Our analysis was supported by 249 answers from managers of companies located in Brazil (convenience sampling). Contrary to expectations, centralization and exploitation were negatively associated. Our data support the research hypothesis that formalization is positively associated with exploitation. Although the relationship between formalization and exploration was significant, the result is contrary to the research hypothesis that we made. The relationships between connectedness and exploitation, and between connectedness and exploration, were both positive and significant. This means that the more connectedness increases, the higher the likelihood of exploitation and exploration.

  15. Performance of hedging strategies in interval models

    NARCIS (Netherlands)

    Roorda, Berend; Engwerda, Jacob; Schumacher, J.M.

    2005-01-01

    For a proper assessment of risks associated with the trading of derivatives, the performance of hedging strategies should be evaluated not only in the context of the idealized model that has served as the basis of strategy development, but also in the context of other models. In this paper we

  16. Modeling, Simulation, and Performance Analysis of Decoy State Enabled Quantum Key Distribution Systems

    OpenAIRE

    Mailloux, Logan O.; Grimaila, Michael R.; Hodson, Douglas D.; Ryan Engle; Colin McLaughlin; Gerald Baumgartner

    2017-01-01

    Quantum Key Distribution (QKD) systems exploit the laws of quantum mechanics to generate secure keying material for cryptographic purposes. To date, several commercially viable decoy state enabled QKD systems have been successfully demonstrated and show promise for high-security applications such as banking, government, and military environments. In this work, a detailed performance analysis of decoy state enabled QKD systems is conducted through model and simulation of several common decoy s...

  17. Exploiting diverse crowd-sourced data as part of a mixed-methods approach to validating modelled flood extents and dynamics

    Science.gov (United States)

    Rollason, Edward; Bracken, Louise; Hardy, Richard; Large, Andy

    2017-04-01

    The use of flood models for evaluating flood risk from rivers and the sea is now a standard practice across Europe since the introduction of the 2007 EU Floods Directive requiring the assessment and mapping of flood risk from all major rivers and the sea. The availability of high quality topographic data from LiDAR and other remotely sensed sources has led to the increasing adoption of 2 dimensional models for simulating the dynamics of flooding on the floodplain. However, the ability to effectively validate dynamic floodplain inundation has not kept pace with the increasing complexity and spatial resolution of flood models. Validation remains dependent upon in-channel validation using flood level gauges or post-event data collection of wrack-marks, sometimes supplemented by community-derived anecdotal data. This poster presents the findings of a 'mixed-methods approach' to flood model validation using the winter 2016 floods on the River Tyne, UK. Using flood inundation results from a simple LISFLOOD-FP model of the River Tyne at Corbridge, the research develops a novel mixed-methods approach to validating both the maximum flood depths and extents, and the dynamics of the flood through the event. A crowd-sourced dataset of anecdotal information on flood dynamics, supported by photographic and video evidence, as well as community-derived, high definition UAV footage captured 24 and 48 hours after the peak of the event, allows for the comprehensive reconstruction of the flood dynamics and a more complete validation of the effectiveness of the model in reconstructing not just the maximum flood extent but also the dynamics of the rising and falling stages of an event. 
The findings of the research indicate the potential for making use of a much greater variety of locally-sourced data, particularly exploiting new technologies which offer opportunities for the collection of high quality data in the immediate aftermath of flooding events when traditional agencies may still

  18. Analytical performance modeling for computer systems

    CERN Document Server

    Tay, Y C

    2013-01-01

    This book is an introduction to analytical performance modeling for computer systems, i.e., writing equations to describe their performance behavior. It is accessible to readers who have taken college-level courses in calculus and probability, networking and operating systems. This is not a training manual for becoming an expert performance analyst. Rather, the objective is to help the reader construct simple models for analyzing and understanding the systems that they are interested in.Describing a complicated system abstractly with mathematical equations requires a careful choice of assumpti
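
The kind of closed-form model the book introduces can be illustrated with the simplest case, an M/M/1 queue. The function name and the rates below are illustrative assumptions, not taken from the text.

```python
# Minimal sketch of analytical performance modeling: closed-form results
# for an M/M/1 queue (single server, Poisson arrivals, exponential service).
# Rates are hypothetical.

def mm1_metrics(arrival_rate: float, service_rate: float) -> dict:
    """Steady-state M/M/1 metrics; requires arrival_rate < service_rate."""
    rho = arrival_rate / service_rate                  # server utilization
    assert rho < 1, "queue is unstable"
    mean_in_system = rho / (1 - rho)                   # mean number of jobs in system
    mean_response = 1 / (service_rate - arrival_rate)  # mean response time
    return {"utilization": rho,
            "mean_in_system": mean_in_system,
            "mean_response_time": mean_response}

print(mm1_metrics(arrival_rate=8.0, service_rate=10.0))
```

At 80% utilization the mean response time is already five times the bare service time, the sort of insight such simple models are meant to deliver.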

  19. Biofilm carrier migration model describes reactor performance.

    Science.gov (United States)

    Boltz, Joshua P; Johnson, Bruce R; Takács, Imre; Daigger, Glen T; Morgenroth, Eberhard; Brockmann, Doris; Kovács, Róbert; Calhoun, Jason M; Choubert, Jean-Marc; Derlon, Nicolas

    2017-06-01

The accuracy of a biofilm reactor model depends on the extent to which physical system conditions (particularly bulk-liquid hydrodynamics and their influence on biofilm dynamics) deviate from the ideal conditions upon which the model is based. It follows that an improved capacity to model a biofilm reactor does not necessarily rely on an improved biofilm model, but does rely on an improved mathematical description of the biofilm reactor and its components. Existing biofilm reactor models typically include a one-dimensional biofilm model, a process (biokinetic and stoichiometric) model, and a continuous flow stirred tank reactor (CFSTR) mass balance that [when organizing CFSTRs in series] creates a pseudo two-dimensional (2-D) model of bulk-liquid hydrodynamics approaching plug flow. In such a biofilm reactor model, the user-defined biofilm area is specified for each CFSTR; thereby, Xcarrier particles do not exit the boundaries of the CFSTR to which they are assigned or exchange across boundaries with other CFSTRs in the series. The error introduced by this pseudo 2-D biofilm reactor modeling approach may adversely affect model results and limit model-user capacity to accurately calibrate a model. This paper presents a new sub-model that describes the migration of Xcarrier and associated biofilms, and evaluates the impact that Xcarrier migration and axial dispersion have on simulated system performance. Relevance of the new biofilm reactor model to engineering situations is discussed by applying it to known biofilm reactor types and operational conditions.

  20. Performance modeling, loss networks, and statistical multiplexing

    CERN Document Server

    Mazumdar, Ravi

    2009-01-01

    This monograph presents a concise mathematical approach for modeling and analyzing the performance of communication networks with the aim of understanding the phenomenon of statistical multiplexing. The novelty of the monograph is the fresh approach and insights provided by a sample-path methodology for queueing models that highlights the important ideas of Palm distributions associated with traffic models and their role in performance measures. Also presented are recent ideas of large buffer, and many sources asymptotics that play an important role in understanding statistical multiplexing. I
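
A concrete instance of the loss-network calculations the monograph covers is the Erlang B blocking probability. The recursion below is the standard numerically stable form; the circuit count and offered load are illustrative.

```python
# Hedged sketch: Erlang B blocking probability for an M/M/c/c loss system,
# the classical building block of loss-network performance models.

def erlang_b(servers: int, offered_load: float) -> float:
    """Probability an arriving call is blocked (no free server)."""
    b = 1.0  # blocking probability with zero servers
    for c in range(1, servers + 1):
        b = offered_load * b / (c + offered_load * b)
    return b

# 10 circuits offered 5 Erlangs of traffic:
print(round(erlang_b(10, 5.0), 4))  # ~0.0184, i.e. about 1.8% blocking
```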

  1. Global Climate Responses to Anthropogenic Groundwater Exploitation

    Science.gov (United States)

    Zeng, Y.; Xie, Z.

    2015-12-01

In this study, a groundwater exploitation scheme is incorporated into the earth system model, Community Earth System Model 1.2.0 (CESM1.2.0), which is called CESM1.2_GW, and the climatic responses to anthropogenic groundwater withdrawal are then investigated on a global scale. The scheme models anthropogenic groundwater exploitation and consumption, which are then divided into agricultural irrigation, industrial use and domestic use. A group of 41-year ensemble groundwater exploitation simulations with six different initial conditions, and a group of ensemble control simulations without exploitation are conducted using the developed model CESM1.2_GW with water supplies and demands estimated. The results reveal that the groundwater exploitation and water consumption cause drying effects on soil moisture in deep layers and wetting effects in upper layers, along with a rapidly declining groundwater table in Central US, Haihe River Basin in China and Northern India and Pakistan, where groundwater extraction is most severe in the world. The atmosphere also responds to anthropogenic groundwater exploitation. Cooling effects on the lower troposphere appear over large areas of the North China Plain and of Northern India and Pakistan. Increased precipitation occurs in Haihe River Basin due to increased evapotranspiration from irrigation. Decreased precipitation occurs in Northern India because water vapor here is taken away by monsoon anomalies induced by anthropogenic alteration of groundwater. The local reducing effects of anthropogenic groundwater exploitation on total terrestrial water storage evince that the water resource is unsustainable at the current high exploitation rate. Therefore, a balance between slow groundwater withdrawal and rapid human economic development must be achieved to maintain a sustainable water resource, especially in over-exploitation regions such as Central US, Northern China, India and Pakistan.

  2. Performance Evaluation Model for Application Layer Firewalls.

    Directory of Open Access Journals (Sweden)

    Shichang Xuan

    Full Text Available Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers. Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.

  3. Performance Evaluation Model for Application Layer Firewalls.

    Science.gov (United States)

    Xuan, Shichang; Yang, Wu; Dong, Hui; Zhang, Jiangchuan

    2016-01-01

    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.
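
As a hedged sketch (not the paper's actual three-layer model), a single layer with c service desks can be treated as an M/M/c queue, with the Erlang C formula giving the probability of queueing and hence the mean packet delay. The rates and desk count below are hypothetical.

```python
# Illustrative Erlangian queueing calculation for one firewall layer:
# mean packet delay under a given service-desk (server) allocation.

from math import factorial

def erlang_c(c: int, a: float) -> float:
    """P(arriving packet must wait) for an M/M/c queue, offered load a = lam/mu."""
    s = sum(a ** k / factorial(k) for k in range(c))
    top = (a ** c / factorial(c)) * c / (c - a)
    return top / (s + top)

def mean_delay(lam: float, mu: float, c: int) -> float:
    """Mean time a packet spends in the layer (waiting plus service)."""
    a = lam / mu
    assert a < c, "layer is overloaded"
    return erlang_c(c, a) / (c * mu - lam) + 1 / mu

# Three desks at lam = 20 pkt/s arrivals, mu = 10 pkt/s per desk:
print(mean_delay(20.0, 10.0, 3))
```

Sweeping `c` across the layers under a fixed total desk budget is one way to search for the optimal allocation the paper describes.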

  4. Towards Systematic Benchmarking of Climate Model Performance

    Science.gov (United States)

    Gleckler, P. J.

    2014-12-01

The process by which climate models are evaluated has evolved substantially over the past decade, with the Coupled Model Intercomparison Project (CMIP) serving as a centralizing activity for coordinating model experimentation and enabling research. Scientists with a broad spectrum of expertise have contributed to the CMIP model evaluation process, resulting in many hundreds of publications that have served as a key resource for the IPCC process. For several reasons, efforts are now underway to further systematize some aspects of the model evaluation process. First, some model evaluation can now be considered routine and should not require "re-inventing the wheel" or a journal publication simply to update results with newer models. Second, the benefit of CMIP research to model development has not been optimal because the publication of results generally takes several years and is usually not reproducible for benchmarking newer model versions. And third, there are now hundreds of model versions and many thousands of simulations, but there is no community-based mechanism for routinely monitoring model performance changes. An important change in the design of CMIP6 can help address these limitations. CMIP6 will include a small set of standardized experiments as an ongoing exercise (CMIP "DECK": ongoing Diagnostic, Evaluation and Characterization of Klima), so that modeling groups can submit them at any time and not be overly constrained by deadlines. In this presentation, efforts to establish routine benchmarking of existing and future CMIP simulations will be described. To date, some benchmarking tools have been made available to all CMIP modeling groups to enable them to readily compare with CMIP5 simulations during the model development process. A natural extension of this effort is to make results from all CMIP simulations widely available, including the results from newer models as soon as the simulations become available for research. 
Making the results from routine

  5. Viruses exploiting peroxisomes.

    Science.gov (United States)

    Lazarow, Paul B

    2011-08-01

    Viruses that are of great importance for global public health, including HIV, influenza and rotavirus, appear to exploit a remarkable organelle, the peroxisome, during intracellular replication in human cells. Peroxisomes are sites of lipid biosynthesis and catabolism, reactive oxygen metabolism, and other metabolic pathways. Viral proteins are targeted to peroxisomes (the spike protein of rotavirus) or interact with peroxisomal proteins (HIV's Nef and influenza's NS1) or use the peroxisomal membrane for RNA replication. The Nef interaction correlates strongly with the crucial Nef function of CD4 downregulation. Viral exploitation of peroxisomal lipid metabolism appears likely. Mostly, functional significance and mechanisms remain to be elucidated. Recently, peroxisomes were discovered to play a crucial role in the innate immune response by signaling the presence of intracellular virus, leading to the first rapid antiviral response. This review unearths, interprets and connects old data, in the hopes of stimulating new and promising research. Copyright © 2011. Published by Elsevier Ltd.

  6. Dissemination and Exploitation Strategy

    DEFF Research Database (Denmark)

    Badger, Merete; Monaco, Lucio; Fransson, Torsten

The research infrastructure project Virtual Campus Hub (VCH) runs from October 1, 2011 to September 30, 2013. Four technical universities in Europe, who are all active in the field of sustainable energy, form the project consortium: the Technical University of Denmark, The Royal Institute of Technology in Sweden, Politecnico di Torino in Italy, and Eindhoven University of Technology in the Netherlands. The project is partially funded by the European Commission under the 7th Framework Programme (project no. RI-283746). This report describes the final dissemination and exploitation strategy for project Virtual Campus Hub. A preliminary dissemination and exploitation plan was set up early in the project as described in the deliverable D6.1 Dissemination strategy paper - preliminary version. The plan has been revised on a monthly basis during the project's lifecycle in connection with the virtual...

  7. Performance modeling, stochastic networks, and statistical multiplexing

    CERN Document Server

    Mazumdar, Ravi R

    2013-01-01

    This monograph presents a concise mathematical approach for modeling and analyzing the performance of communication networks with the aim of introducing an appropriate mathematical framework for modeling and analysis as well as understanding the phenomenon of statistical multiplexing. The models, techniques, and results presented form the core of traffic engineering methods used to design, control and allocate resources in communication networks.The novelty of the monograph is the fresh approach and insights provided by a sample-path methodology for queueing models that highlights the importan

  8. A statistical model for predicting muscle performance

    Science.gov (United States)

    Byerly, Diane Leslie De Caix

    The objective of these studies was to develop a capability for predicting muscle performance and fatigue to be utilized for both space- and ground-based applications. To develop this predictive model, healthy test subjects performed a defined, repetitive dynamic exercise to failure using a Lordex spinal machine. Throughout the exercise, surface electromyography (SEMG) data were collected from the erector spinae using a Mega Electronics ME3000 muscle tester and surface electrodes placed on both sides of the back muscle. These data were analyzed using a 5th order Autoregressive (AR) model and statistical regression analysis. It was determined that an AR derived parameter, the mean average magnitude of AR poles, significantly correlated with the maximum number of repetitions (designated Rmax) that a test subject was able to perform. Using the mean average magnitude of AR poles, a test subject's performance to failure could be predicted as early as the sixth repetition of the exercise. This predictive model has the potential to provide a basis for improving post-space flight recovery, monitoring muscle atrophy in astronauts and assessing the effectiveness of countermeasures, monitoring astronaut performance and fatigue during Extravehicular Activity (EVA) operations, providing pre-flight assessment of the ability of an EVA crewmember to perform a given task, improving the design of training protocols and simulations for strenuous International Space Station assembly EVA, and enabling EVA work task sequences to be planned enhancing astronaut performance and safety. Potential ground-based, medical applications of the predictive model include monitoring muscle deterioration and performance resulting from illness, establishing safety guidelines in the industry for repetitive tasks, monitoring the stages of rehabilitation for muscle-related injuries sustained in sports and accidents, and enhancing athletic performance through improved training protocols while reducing
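
The study's fatigue indicator, the mean magnitude of the poles of an AR model fitted to SEMG data, can be sketched as follows. For brevity this uses a hypothetical AR(2) fit whose poles have a closed form; the study itself used a 5th-order model, whose poles would be found numerically in the same way. The coefficients are invented.

```python
# Sketch: mean pole magnitude of an AR(2) model of an SEMG signal,
# x[n] = a1*x[n-1] + a2*x[n-2] + e[n], poles = roots of z^2 - a1*z - a2 = 0.

import cmath

def ar2_pole_magnitudes(a1: float, a2: float) -> list:
    """Magnitudes of the two poles (complex-conjugate pair when damped)."""
    disc = cmath.sqrt(a1 * a1 + 4 * a2)
    return [abs((a1 + disc) / 2), abs((a1 - disc) / 2)]

def mean_pole_magnitude(a1: float, a2: float) -> float:
    mags = ar2_pole_magnitudes(a1, a2)
    return sum(mags) / len(mags)

# Hypothetical coefficients from one exercise repetition:
print(mean_pole_magnitude(1.2, -0.5))  # conjugate poles, |z| = sqrt(0.5)
```

Tracking this scalar repetition by repetition is the kind of quantity the study correlated with the maximum repetition count Rmax.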

  9. PV performance modeling workshop summary report.

    Energy Technology Data Exchange (ETDEWEB)

    Stein, Joshua S.; Tasca, Coryne Adelle (SRA International, Inc., Fairfax, VA); Cameron, Christopher P.

    2011-05-01

    During the development of a solar photovoltaic (PV) energy project, predicting expected energy production from a system is a key part of understanding system value. System energy production is a function of the system design and location, the mounting configuration, the power conversion system, and the module technology, as well as the solar resource. Even if all other variables are held constant, annual energy yield (kWh/kWp) will vary among module technologies because of differences in response to low-light levels and temperature. A number of PV system performance models have been developed and are in use, but little has been published on validation of these models or the accuracy and uncertainty of their output. With support from the U.S. Department of Energy's Solar Energy Technologies Program, Sandia National Laboratories organized a PV Performance Modeling Workshop in Albuquerque, New Mexico, September 22-23, 2010. The workshop was intended to address the current state of PV system models, develop a path forward for establishing best practices on PV system performance modeling, and set the stage for standardization of testing and validation procedures for models and input parameters. This report summarizes discussions and presentations from the workshop, as well as examines opportunities for collaborative efforts to develop objective comparisons between models and across sites and applications.

  10. Modeling of Near-Field Blast Performance

    Science.gov (United States)

    2013-11-01

Modeling of Near-Field Blast Performance, by Barrie E. Homan, Matthew M. Biss, and Kevin L. McNesby. U.S. Army Research Laboratory (ATTN: RDRL-WMP-G), Aberdeen Proving Ground, MD 21005-5066. Report ARL-TR-6711, November 2013. Two hydrocode packages

  11. Generating Performance Models for Irregular Applications

    Energy Technology Data Exchange (ETDEWEB)

    Friese, Ryan D.; Tallent, Nathan R.; Vishnu, Abhinav; Kerbyson, Darren J.; Hoisie, Adolfy

    2017-05-30

Many applications have irregular behavior (non-uniform input data, input-dependent solvers, irregular memory accesses, unbiased branches) that cannot be captured using today's automated performance modeling techniques. We describe new hierarchical critical path analyses for the Palm model generation tool. To create a model's structure, we capture tasks along representative MPI critical paths. We create a histogram of critical tasks with parameterized task arguments and instance counts. To model each task, we identify hot instruction-level sub-paths and model each sub-path based on data flow, instruction scheduling, and data locality. We describe application models that generate accurate predictions for strong scaling when varying CPU speed, cache speed, memory speed, and architecture. We present results for the Sweep3D neutron transport benchmark; Page Rank on multiple graphs; Support Vector Machine with pruning; and PFLOTRAN's reactive flow/transport solver with domain-induced load imbalance.
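
The critical-path idea underlying the model structure can be sketched as a longest-path computation over a DAG of timed tasks. The task graph and costs below are invented for illustration.

```python
# Hedged sketch: the critical (longest) path through a DAG of tasks,
# the quantity a critical-path analysis captures for model generation.

from functools import lru_cache

# task -> (cost in seconds, dependencies); hypothetical MPI-style tasks
tasks = {
    "init":    (2.0, []),
    "halo":    (1.0, ["init"]),
    "compute": (5.0, ["init"]),
    "reduce":  (1.5, ["halo", "compute"]),
}

@lru_cache(maxsize=None)
def finish_time(task: str) -> float:
    """Earliest finish time: task cost plus the critical path through its deps."""
    cost, deps = tasks[task]
    return cost + max((finish_time(d) for d in deps), default=0.0)

print(finish_time("reduce"))  # 2.0 + 5.0 + 1.5 = 8.5 (init -> compute -> reduce)
```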

  12. A Procurement Performance Model for Construction Frameworks

    Directory of Open Access Journals (Sweden)

    Terence Y M Lam

    2015-07-01

Full Text Available Collaborative construction frameworks have been developed in the United Kingdom (UK to create longer term relationships between clients and suppliers in order to improve project outcomes. Research undertaken into highways maintenance set within a major county council has confirmed that such collaborative procurement methods can improve time, cost and quality of construction projects. Building upon this and examining the same single case, this research aims to develop a performance model through identification of performance drivers in the whole project delivery process including pre and post contract phases. An a priori performance model based on operational and sociological constructs was proposed and then checked by a pilot study. Factor analysis and central tendency statistics from the questionnaires as well as content analysis from the interview transcripts were conducted. It was confirmed that long term relationships, financial and non-financial incentives and stronger communication are the sociological behaviour factors driving performance. The interviews also established that key performance indicators (KPIs can be used as an operational measure to improve performance. With the a posteriori performance model, client project managers can effectively collaboratively manage contractor performance through procurement measures including the use of longer-term relationships and KPIs for the contract so that the expected project outcomes can be achieved. The findings also make significant contribution to construction framework procurement theory by identifying the interrelated sociological and operational performance drivers. This study is set predominantly in the field of highways civil engineering. It is suggested that building based projects or other projects that share characteristics are grouped together and used for further research of the phenomena discovered.

  13. Models for Automated Tube Performance Calculations

    Energy Technology Data Exchange (ETDEWEB)

    C. Brunkhorst

    2002-12-12

    High power radio-frequency systems, as typically used in fusion research devices, utilize vacuum tubes. Evaluation of vacuum tube performance involves data taken from tube operating curves. The acquisition of data from such graphical sources is a tedious process. A simple modeling method is presented that will provide values of tube currents for a given set of element voltages. These models may be used as subroutines in iterative solutions of amplifier operating conditions for a specific loading impedance.
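
One common way to model triode plate current from operating-curve data is the 3/2-power (Child-Langmuir) law; a fitted model of this kind can serve as the subroutine the report describes. The perveance and amplification factor below are illustrative, not from any real tube.

```python
# Hedged sketch: triode plate current from element voltages via the
# 3/2-power law, usable inside an iterative operating-point solver.
# k (perveance) and mu (amplification factor) are hypothetical fit values.

def plate_current(v_grid: float, v_plate: float,
                  k: float = 2e-6, mu: float = 30.0) -> float:
    """Plate current (A) for a triode; returns 0 beyond cutoff."""
    drive = v_grid + v_plate / mu
    return k * drive ** 1.5 if drive > 0 else 0.0

# Example operating point: grid at -20 V, plate at 1000 V
print(plate_current(v_grid=-20.0, v_plate=1000.0))
```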

  14. Performance Evaluation and Modelling of Container Terminals

    Science.gov (United States)

    Venkatasubbaiah, K.; Rao, K. Narayana; Rao, M. Malleswara; Challa, Suresh

    2017-11-01

The present paper evaluates and analyzes the performance of 28 container terminals of south East Asia through data envelopment analysis (DEA), principal component analysis (PCA) and hybrid method of DEA-PCA. DEA technique is utilized to identify efficient decision making unit (DMU)s and to rank DMUs in a peer appraisal mode. PCA is a multivariate statistical method to evaluate the performance of container terminals. In hybrid method, DEA is integrated with PCA to arrive at the ranking of container terminals. Based on the composite ranking, performance modelling and optimization of container terminals are carried out through response surface methodology (RSM).
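
The PCA side of such a ranking can be sketched with two indicators and three terminals. Terminal names, throughput and utilization figures are invented; for two positively correlated standardized variables the first principal component has a closed form, so no linear-algebra library is needed.

```python
# Hedged sketch: ranking terminals by their first-principal-component score
# over two standardized performance indicators (hypothetical data).

import math
import statistics as st

# terminal -> (annual throughput in kTEU, berth utilization)
terminals = {"T1": (120.0, 0.70), "T2": (150.0, 0.85), "T3": (90.0, 0.60)}

def standardize(vals):
    m, s = st.mean(vals), st.pstdev(vals)
    return [(v - m) / s for v in vals]

x = standardize([t[0] for t in terminals.values()])
y = standardize([t[1] for t in terminals.values()])

# For a 2x2 correlation matrix with positive correlation, the leading
# eigenvector is (1, 1)/sqrt(2): the PC1 score is the scaled sum.
scores = {name: (xi + yi) / math.sqrt(2)
          for name, xi, yi in zip(terminals, x, y)}
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)
```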

  15. Hacking the art of exploitation

    CERN Document Server

    Erickson, Jon

    2003-01-01

    A comprehensive introduction to the techniques of exploitation and creative problem-solving methods commonly referred to as "hacking," Hacking: The Art of Exploitation is for both technical and non-technical people who are interested in computer security. It shows how hackers exploit programs and write exploits, instead of just how to run other people's exploits. Unlike many so-called hacking books, this book explains the technical aspects of hacking, including stack based overflows, heap based overflows, string exploits, return-into-libc, shellcode, and cryptographic attacks on 802.11b.

  16. The catastrophic decline of the Sumatran rhino (Dicerorhinus sumatrensis harrissoni in Sabah: Historic exploitation, reduced female reproductive performance and population viability

    Directory of Open Access Journals (Sweden)

    P. Kretzschmar

    2016-04-01

    Full Text Available The reasons for catastrophic declines of Sumatran rhinos are far from clear and data necessary to improve decisions for conservation management are often lacking. We reviewed literature and assembled a comprehensive data set on surveys of the Sumatran rhino subspecies (Dicerorhinus sumatrensis harrissoni in the Malaysian state of Sabah on Borneo to chart the historical development of the population in Sabah and its exploitation until the present day. We fitted resource selection functions to identify habitat features preferred by a remnant population of rhinos living in the Tabin Wildlife Reserve in Sabah, and ran a series of population viability analyses (PVAs to extract the key demographic parameters most likely to affect population dynamics. We show that as preferred habitat, the individuals in the reserve were most likely encountered in elevated areas away from roads, in close distance to mud-volcanoes, with a low presence of human trespassers and a wallow on site, and within a neighbourhood of dense forest and grassland patches preferably on Fluvisols and Acrisols. Our population viability analyses identified the percentage of breeding females and female lifetime reproductive period as the crucial parameters driving population dynamics, in combination with total protection even moderate improvements could elevate population viability substantially. The analysis also indicates that unrestrained hunting between 1930 and 1950 drastically reduced the historical rhino population in Sabah and that the remnant population could be rescued by combining the effort of total protection and stimulation of breeding activity. Based on our results, we recommend to translocate isolated reproductively healthy individuals to protected locations and to undertake measures to maximise conceptions, or running state-of-the-art reproductive management with assisted reproduction techniques. Our study demonstrates that a judicious combination of techniques can do

  17. The immunomodulatory effects of pegylated liposomal doxorubicin are amplified in BRCA1-deficient ovarian tumors and can be exploited to improve treatment response in a mouse model.

    Science.gov (United States)

    Mantia-Smaldone, Gina; Ronner, Lukas; Blair, Anne; Gamerman, Victoria; Morse, Christopher; Orsulic, Sandra; Rubin, Stephen; Gimotty, Phyllis; Adams, Sarah

    2014-06-01

    Women with BRCA-associated ovarian cancer demonstrate excellent responses to Pegylated Liposomal Doxorubicin (PLD). PLD has also been shown to enhance T cell recognition of tumor cells. Here we characterize immunophenotypic changes associated with BRCA1 dysfunction in ovarian cancer cells, and evaluate the T cell contribution to the therapeutic efficacy of PLD in a BRCA1- ovarian cancer model to determine whether enhanced anti-tumor immunity contributes to the improved response to PLD in BRCA1- ovarian cancers. The immunophenotype of BRCA1- and wild-type (WT) ovarian cancer cells and their response to PLD were compared in vitro using flow cytometry. T cell recruitment to BRCA1- tumors was evaluated with flow cytometry and immunohistochemistry. The contribution of T cell populations to the therapeutic effect of PLD in a BRCA1- model was evaluated using immunodepleting antibodies with PLD in vivo. The cytotoxic response to PLD was similar in BRCA1- and WT cells in vitro. BRCA1- inactivation resulted in higher expression of Fas and MHC-I at baseline and after PLD exposure. PLD prolonged the survival of BRCA1- tumor bearing mice and increased intratumoral T cell recruitment. CD4+ depletion combined with PLD significantly prolonged overall survival (p=0.0204) in BRCA1- tumor-bearing mice. Differences in the immunophenotype of BRCA1- and WT cells are amplified by PLD exposure. The enhanced immunomodulatory effects of PLD in BRCA1- tumors may be exploited therapeutically by eliminating suppressive CD4+ T cells. Our results support further study of combination therapy using PLD and immune agents, particularly in women with BRCA gene mutations. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. 3D Massive MIMO Systems: Channel Modeling and Performance Analysis

    KAUST Repository

    Nadeem, Qurrat-Ul-Ain

    2015-03-01

Multiple-input-multiple-output (MIMO) systems of current LTE releases are capable of adaptation in the azimuth only. More recently, the trend is to enhance the system performance by exploiting the channel's degrees of freedom in the elevation through the dynamic adaptation of the vertical antenna beam pattern. This necessitates the derivation and characterization of three-dimensional (3D) channels. Over the years, channel models have evolved to address the challenges of wireless communication technologies. In parallel to theoretical studies on channel modeling, many standardized channel models like COST-based models, 3GPP SCM, WINNER, and ITU have emerged that act as references for industries and telecommunication companies to assess system-level and link-level performances of advanced signal processing techniques over real-like channels. Given that existing standardized channels are only two-dimensional (2D) in nature, a large effort in channel modeling is needed to study the impact of the channel component in the elevation direction. The first part of this work sheds light on the current 3GPP activity around 3D channel modeling and beamforming, an aspect that to our knowledge has not been extensively covered by a research publication. The standardized MIMO channel model is presented, that incorporates both the propagation effects of the environment and the radio effects of the antennas. In order to facilitate future studies on the use of 3D beamforming, the main features of the proposed 3D channel model are discussed. A brief overview of the future 3GPP 3D channel model being outlined for the next generation of wireless networks is also provided. In the subsequent part of this work, we present an information-theoretic channel model for MIMO systems that supports the elevation dimension. 
The model is based on the principle of maximum entropy, which enables us to determine the distribution of the channel matrix consistent with the prior information on the angles of departure and
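
The link-level quantity such channel models ultimately feed is capacity, C = log2 det(I + (SNR/Nt) H H^H). The sketch below evaluates it for a fixed 2x2 channel; the entries of H are invented, and a maximum-entropy model with no prior side information would instead draw them i.i.d. complex Gaussian.

```python
# Hedged sketch: MIMO capacity for a fixed 2x2 channel matrix H
# with equal power allocation over Nt = 2 transmit antennas.

import math

H = [[1.0 + 0.5j, 0.2 - 0.1j],
     [0.3 + 0.2j, 0.9 - 0.4j]]  # hypothetical channel realization

def capacity_2x2(H, snr: float) -> float:
    """C = log2 det(I + (snr/2) * H * H^H) in bits/s/Hz."""
    g = [[(1.0 if i == j else 0.0) +
          (snr / 2) * sum(H[i][k] * H[j][k].conjugate() for k in range(2))
          for j in range(2)] for i in range(2)]
    det = g[0][0] * g[1][1] - g[0][1] * g[1][0]  # real, positive for this G
    return math.log2(abs(det))

print(capacity_2x2(H, snr=10.0))  # bits/s/Hz
```

Averaging this over many random draws of H gives the ergodic capacity used to compare 2D and 3D channel models.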

  19. System Advisor Model: Flat Plate Photovoltaic Performance Modeling Validation Report

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, Janine [National Renewable Energy Lab. (NREL), Golden, CO (United States); Whitmore, Jonathan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Kaffine, Leah [National Renewable Energy Lab. (NREL), Golden, CO (United States); Blair, Nate [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dobos, Aron P. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    The System Advisor Model (SAM) is a free software tool that performs detailed analysis of both system performance and system financing for a variety of renewable energy technologies. This report provides detailed validation of the SAM flat plate photovoltaic performance model by comparing SAM-modeled PV system generation data to actual measured production data for nine PV systems ranging from 75 kW to greater than 25 MW in size. The results show strong agreement between SAM predictions and field data, with annualized prediction error below 3% for all fixed tilt cases and below 8% for all one axis tracked cases. The analysis concludes that snow cover and system outages are the primary sources of disagreement, and other deviations resulting from seasonal biases in the irradiation models and one axis tracking issues are discussed in detail.
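
The annualized prediction error that the report uses as its headline metric can be sketched as below; the monthly energy figures are invented for illustration.

```python
# Hedged sketch: annualized prediction error between modeled and measured
# PV energy, the kind of metric compared against the ~3% threshold.

modeled_kwh  = [900, 1100, 1400, 1600, 1800, 1900, 1950, 1850, 1500, 1200, 950, 850]
measured_kwh = [880, 1150, 1380, 1650, 1750, 1950, 1900, 1800, 1550, 1180, 930, 870]

def annualized_error_pct(modeled, measured) -> float:
    """Signed percent difference between annual modeled and measured energy."""
    return 100.0 * (sum(modeled) - sum(measured)) / sum(measured)

print(f"{annualized_error_pct(modeled_kwh, measured_kwh):+.2f}%")
```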

  20. New Diagnostics to Assess Model Performance

    Science.gov (United States)

    Koh, Tieh-Yong

    2013-04-01

    The comparison of model performance between the tropics and the mid-latitudes is particularly problematic for observables like temperature and humidity: in the tropics, these observables have little variation and so may give an apparent impression that model predictions are often close to observations; on the contrary, they vary widely in mid-latitudes and so the discrepancy between model predictions and observations might be unnecessarily over-emphasized. We have developed a suite of mathematically rigorous diagnostics that measures normalized errors accounting for the observed and modeled variability of the observables themselves. Another issue in evaluating model performance is the relative importance of getting the variance of an observable right versus getting the modeled variation to be in phase with the observed. The correlation-similarity diagram was designed to analyse the pattern error of a model by breaking it down into contributions from amplitude and phase errors. A final and important question pertains to the generalization of scalar diagnostics to analyse vector observables like wind. In particular, measures of variance and correlation must be properly derived to avoid the mistake of ignoring the covariance between north-south and east-west winds (hence wrongly assuming that the north-south and east-west directions form a privileged vector basis for error analysis). There is also a need to quantify systematic preferences in the direction of vector wind errors, which we make possible by means of an error anisotropy diagram. Although the suite of diagnostics is mentioned with reference to model verification here, it is generally applicable to quantify differences between two datasets (e.g. from two observation platforms). Reference publication: Koh, T. Y. et al. (2012), J. Geophys. Res., 117, D13109, doi:10.1029/2011JD017103. also available at http://www.ntu.edu.sg/home/kohty

  1. Performance modeling of network data services

    Energy Technology Data Exchange (ETDEWEB)

    Haynes, R.A.; Pierson, L.G.

    1997-01-01

    Networks at major computational organizations are becoming increasingly complex. The introduction of large massively parallel computers and supercomputers with gigabyte memories are requiring greater and greater bandwidth for network data transfers to widely dispersed clients. For networks to provide adequate data transfer services to high performance computers and remote users connected to them, the networking components must be optimized from a combination of internal and external performance criteria. This paper describes research done at Sandia National Laboratories to model network data services and to visualize the flow of data from source to sink when using the data services.

  2. Probabilistic Radiological Performance Assessment Modeling and Uncertainty

    Science.gov (United States)

    Tauxe, J.

    2004-12-01

    A generic probabilistic radiological Performance Assessment (PA) model is presented. The model, built using the GoldSim systems simulation software platform, concerns contaminant transport and dose estimation in support of decision making with uncertainty. Both the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE) require assessments of potential future risk to human receptors of disposal of LLW. Commercially operated LLW disposal facilities are licensed by the NRC (or agreement states), and the DOE operates such facilities for disposal of DOE-generated LLW. The type of PA model presented is probabilistic in nature, and hence reflects the current state of knowledge about the site by using probability distributions to capture what is expected (central tendency or average) and the uncertainty (e.g., standard deviation) associated with input parameters, and propagating through the model to arrive at output distributions that reflect expected performance and the overall uncertainty in the system. Estimates of contaminant release rates, concentrations in environmental media, and resulting doses to human receptors well into the future are made by running the model in Monte Carlo fashion, with each realization representing a possible combination of input parameter values. Statistical summaries of the results can be compared to regulatory performance objectives, and decision makers are better informed of the inherently uncertain aspects of the model which supports their decision-making. While this information may make some regulators uncomfortable, they must realize that uncertainties which were hidden in a deterministic analysis are revealed in a probabilistic analysis, and the chance of making a correct decision is now known rather than hoped for. The model includes many typical features and processes that would be part of a PA, but is entirely fictitious. This does not represent any particular site and is meant to be a generic example. 
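    The Monte Carlo propagation described above can be illustrated with a toy transfer model. The parameter distributions and the dose equation below are hypothetical stand-ins, not the GoldSim model itself:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000  # number of Monte Carlo realizations

# Hypothetical uncertain inputs: lognormals capture both the central
# tendency and the spread of each parameter (illustrative values only)
kd = rng.lognormal(mean=np.log(10.0), sigma=0.5, size=n)     # sorption (L/kg)
infil = rng.lognormal(mean=np.log(0.01), sigma=0.3, size=n)  # infiltration (m/yr)

# Toy transfer model: dose rises with infiltration, falls with sorption
dose = 5000.0 * infil / (1.0 + kd)   # mrem/yr, purely illustrative scaling

# Statistical summaries compared against a regulatory performance objective
print("mean dose:", dose.mean())
print("95th percentile:", np.percentile(dose, 95))
print("P(dose > 25 mrem/yr):", (dose > 25.0).mean())
```

    Each realization is one possible combination of input values; the output distribution, rather than a single number, is what gets compared to the performance objective.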

  3. The Geohazards Exploitation Platform

    Science.gov (United States)

    Laur, Henri; Casu, Francesco; Bally, Philippe; Caumont, Hervé; Pinto, Salvatore

    2016-04-01

    The Geohazards Exploitation Platform, or Geohazards TEP (GEP), is an ESA-originated R&D activity of the EO ground segment to demonstrate the benefit of new technologies for large-scale processing of EO data. This encompasses on-demand processing for specific user needs, systematic processing to address common information needs of the geohazards community, and integration of newly developed processors for scientists and other expert users. The platform supports the geohazards community's objectives as defined in the context of the International Forum on Satellite EO and Geohazards organised by ESA and GEO in Santorini in 2012. The GEP is a follow-on to the Supersites Exploitation Platform (SSEP), an ESA initiative to support the Geohazards Supersites & Natural Laboratories initiative (GSNL). Today the GEP allows users to exploit more than 70 terabytes of the ERS and ENVISAT archives as well as the Copernicus Sentinel-1 data available online. The platform has already engaged 22 European early adopters in a validation activity initiated in March 2015. Since September, this validation has reached 29 single-user projects. Each project is concerned with either integrating an application, running on-demand processing or systematically generating a product collection using an application available in the platform. The users primarily include 15 geoscience centres and universities based in Europe: British Geological Survey (UK), University of Leeds (UK), University College London (UK), ETH University of Zurich (CH), INGV (IT), CNR-IREA and CNR-IRPI (IT), University of L'Aquila (IT), NOA (GR), Univ. Blaise Pascal & CNRS (FR), Ecole Normale Supérieure (FR), ISTERRE / University of Grenoble-Alpes (FR). In addition, there are users from Africa and North America with the University of Rabat (MA) and the University of Miami (US). Furthermore two space agencies and four private companies are involved: the German Space Research Centre DLR (DE), the European Space Agency (ESA), Altamira Information (ES

  4. Exploitative Learning by Exporting

    DEFF Research Database (Denmark)

    Golovko, Elena; Lopes Bento, Cindy; Sofka, Wolfgang

    Decisions on entering foreign markets are among the most challenging but also potentially rewarding strategy choices managers can make. In this study, we examine the effect of export entry on the firm investment decisions in two activities associated with learning about new technologies... and learning about new markets – R&D investments and marketing investments, in search of novel insights into the content and process underlying learning by exporting. We draw from organizational learning theory for predicting changes in both R&D and marketing investment patterns that accompany firm entry... It is predominantly the marketing-related investment decisions associated with starting to export that lead to increases in firm productivity. We conclude that learning-by-exporting might be more properly characterized as "learning about and exploiting new markets" rather than "learning about new technologies..."

  5. Image exploitation for MISAR

    Science.gov (United States)

    Heinze, N.; Edrich, M.; Saur, G.; Krüger, W.

    2007-04-01

    The miniature SAR system MiSAR has been developed by EADS Germany for lightweight UAVs like the LUNA system. MiSAR adds to these tactical UAV systems the all-weather reconnaissance capability that has been missing until now. Unlike other SAR sensors, which produce large strip maps at update rates of several seconds, MiSAR generates sequences of SAR images with approximately 1 Hz frame rate. Photo interpreters (PIs) of tactical drones, so far mainly experienced in visual interpretation, are not used to SAR images, and especially not to the characteristics of SAR image sequences. They should therefore be supported in carrying out their task with a new, demanding sensor system. We have accordingly analyzed and discussed with military PIs in which tasks MiSAR can be used and how the PIs can be supported by special algorithms. We developed image processing and exploitation algorithms for such SAR image sequences. A main component is the generation of image sequence mosaics to provide better oversight. This mosaicking has the advantage that non-straight/linear flight paths and varying squint angles can also be processed. Another component is a screening component for man-made objects that marks regions of interest in the image sequences. We use a classification-based approach, which can be easily adapted to new sensors and scenes. These algorithms are integrated into an image exploitation system that gives the image interpreters better oversight and orientation and helps them detect relevant objects, especially on long-endurance reconnaissance missions.

  6. Learning Metasploit exploitation and development

    CERN Document Server

    Balapure, Aditya

    2013-01-01

    A practical, hands-on tutorial with step-by-step instructions. The book follows a smooth and easy-to-follow tutorial approach, covering the essentials and then showing readers how to write more sophisticated exploits. This book targets exploit developers, vulnerability analysts and researchers, network administrators, and ethical hackers looking to gain advanced knowledge in exploit development and vulnerability identification. The primary goal is to take readers wishing to get into more advanced exploit discovery to the next level. Prior experience exploiting basic st

  7. Multilevel Modeling of the Performance Variance

    Directory of Open Access Journals (Sweden)

    Alexandre Teixeira Dias

    2012-12-01

    Full Text Available Focusing on the identification of the role played by industry in the relations between corporate strategic factors and performance, the hierarchical multilevel modeling method was adopted to measure and analyze the relations between the variables that comprise each level of analysis. The adequacy of the multilevel perspective for the study of the proposed relations was identified, and the relative importance analysis points to the lower relevance of industry as a moderator of the effects of corporate strategic factors on performance when the latter is measured by return on assets, and shows that industry does not moderate the relations between corporate strategic factors and Tobin's Q. The main conclusions of the research are that an organization's choices in terms of corporate strategy exert considerable influence and play a key role in determining performance level, but that industry should still be considered when analyzing performance variation, whether or not it moderates the relations between corporate strategic factors and performance.

  8. Modelling fuel cell performance using artificial intelligence

    Science.gov (United States)

    Ogaji, S. O. T.; Singh, R.; Pilidis, P.; Diacakis, M.

    Over the last few years, fuel cell technology has been promisingly increasing its share in the generation of stationary power. Numerous pilot projects are operating worldwide, continuously increasing the number of operating hours either as stand-alone devices or as part of gas turbine combined cycles. An essential tool for the adequate and dynamic analysis of such systems is a software model that enables the user to assess a large number of alternative options in the least possible time. Meanwhile, the sphere of application of artificial neural networks has widened to cover such fields as medicine, finance and, unsurprisingly, engineering (e.g. fault diagnostics in machines). An artificial neural network has been described as a diagrammatic representation of a mathematical equation that receives values (inputs) and gives out results (outputs). Artificial neural network systems have the capacity to recognise and associate patterns and, because of their inherent design features, can be applied to linear and non-linear problem domains. In this paper, the performance of a fuel cell is modelled using artificial neural networks. The inputs to the network are variables that are critical to the performance of the fuel cell, while the outputs are the effects of changes in any one or all of the fuel cell design variables on its performance. Critical parameters for the cell include the geometrical configuration as well as the operating conditions. For the neural network, various design parameters such as the network size, training algorithm and activation functions, and their effects on the quality of the performance modelling, are discussed. Results from the analysis as well as the limitations of the approach are presented and discussed.
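    As an illustration of the approach, the following trains a generic one-hidden-layer network on a made-up polarization curve (cell voltage versus current density); it is a sketch of the technique, not the authors' network, inputs or data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy polarization data: voltage = OCV - activation loss - ohmic loss
i = np.linspace(0.05, 1.0, 200).reshape(-1, 1)   # current density (A/cm^2)
v = 1.0 - 0.06 * np.log(i / 0.01) - 0.1 * i      # cell voltage (V), made up

# One-hidden-layer tanh network trained by full-batch gradient descent
W1 = rng.normal(0, 1, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 1, (16, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(5000):
    h = np.tanh(i @ W1 + b1)                 # hidden activations
    err = h @ W2 + b2 - v                    # prediction error
    gW2 = h.T @ err / len(i); gb2 = err.mean(0)
    gh = err @ W2.T * (1 - h**2)             # backprop through tanh
    gW1 = i.T @ gh / len(i); gb1 = gh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

pred = np.tanh(i @ W1 + b1) @ W2 + b2
print("fit RMSE (V):", float(np.sqrt(np.mean((pred - v) ** 2))))
```

    The same structure generalizes to several inputs (geometry, operating conditions) and outputs, which is the setting the paper describes.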

  9. Performance Measurement Model A TarBase model with ...

    Indian Academy of Sciences (India)

    rohit

    We assessed the performance of MiRTif using interaction duplexes from the miRecords database (an experimentally validated database). The total number of interaction duplexes collected from miRecords is 117, which we used to train the model. We found that 87 out of 117 positive interactions are correctly ...

  10. Modelling the predictive performance of credit scoring

    Directory of Open Access Journals (Sweden)

    Shi-Wei Shen

    2013-07-01

    Research purpose: The purpose of this empirical paper was to examine the predictive performance of credit scoring systems in Taiwan. Motivation for the study: Corporate lending remains a major business line for financial institutions. However, in light of the recent global financial crises, it has become extremely important for financial institutions to implement rigorous means of assessing clients seeking access to credit facilities. Research design, approach and method: Using a data sample of 10 349 observations drawn between 1992 and 2010, logistic regression models were utilised to examine the predictive performance of credit scoring systems. Main findings: A test of goodness of fit demonstrated that credit scoring models that incorporated the Taiwan Corporate Credit Risk Index (TCRI) and micro- and macroeconomic variables possessed greater predictive power. This suggests that macroeconomic variables do have explanatory power for default credit risk. Practical/managerial implications: The originality in the study was that three models were developed to predict corporate firms' defaults based on different microeconomic and macroeconomic factors such as the TCRI, asset growth rates, stock index and gross domestic product. Contribution/value-add: The study utilises different goodness-of-fit tests and receiver operating characteristics during the examination of the robustness of the predictive power of these factors.

  11. Modelling the predictive performance of credit scoring

    Directory of Open Access Journals (Sweden)

    Shi-Wei Shen

    2013-02-01

    Full Text Available Orientation: The article discussed the importance of rigour in credit risk assessment. Research purpose: The purpose of this empirical paper was to examine the predictive performance of credit scoring systems in Taiwan. Motivation for the study: Corporate lending remains a major business line for financial institutions. However, in light of the recent global financial crises, it has become extremely important for financial institutions to implement rigorous means of assessing clients seeking access to credit facilities. Research design, approach and method: Using a data sample of 10 349 observations drawn between 1992 and 2010, logistic regression models were utilised to examine the predictive performance of credit scoring systems. Main findings: A test of goodness of fit demonstrated that credit scoring models that incorporated the Taiwan Corporate Credit Risk Index (TCRI) and micro- and macroeconomic variables possessed greater predictive power. This suggests that macroeconomic variables do have explanatory power for default credit risk. Practical/managerial implications: The originality in the study was that three models were developed to predict corporate firms' defaults based on different microeconomic and macroeconomic factors such as the TCRI, asset growth rates, stock index and gross domestic product. Contribution/value-add: The study utilises different goodness-of-fit tests and receiver operating characteristics during the examination of the robustness of the predictive power of these factors.
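    The core of such a study—fitting a logistic default model and checking its discriminatory power—can be sketched as follows. The predictors, coefficients and sample here are synthetic stand-ins for variables like the TCRI, asset growth rates or GDP, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2000

# Synthetic firm-level predictors with a known default-generating process
X = rng.normal(size=(n, 3))
true_w = np.array([1.5, -1.0, 0.5])
p = 1 / (1 + np.exp(-(X @ true_w - 0.5)))
y = rng.random(n) < p                        # default indicator (bool)

# Logistic regression fitted by full-batch gradient descent
w = np.zeros(3); b = 0.0
for _ in range(2000):
    s = 1 / (1 + np.exp(-(X @ w + b)))       # predicted default probability
    w -= 0.5 * (X.T @ (s - y) / n)
    b -= 0.5 * (s - y).mean()

# ROC AUC = probability that a defaulter scores above a non-defaulter
scores = X @ w + b
pos, neg = scores[y], scores[~y]
auc = (pos[:, None] > neg[None, :]).mean()
print("AUC:", auc)
```

    Comparing AUC (or a goodness-of-fit statistic) across models with and without macroeconomic predictors is exactly the kind of robustness check the abstract describes.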

  12. Implementation of a model-independent search for new physics with the CMS detector exploiting the world-wide LHC Computing Grid

    CERN Document Server

    Hof, Carsten

    With this year's start of CERN's Large Hadron Collider (LHC) it will be possible for the first time to directly probe the physics at the TeV-scale at a collider experiment. At this scale the Standard Model of particle physics will reach its limits and new physical phenomena are expected to appear. This study performed with one of the LHC's experiments, namely the Compact Muon Solenoid (CMS), is trying to quantify the understanding of the Standard Model and is hunting for deviations from the expectation by investigating a large fraction of the CMS data. While the classical approach for searches of physics beyond the Standard Model assumes a specific theoretical model and tries to isolate events with a certain signature characteristic for the new theory, this thesis follows a model-independent approach. The method relies only on the knowledge of the Standard Model and is suitable to spot deviations from this model induced by particular theoretical models but also theories not yet thought of. Future data are to ...
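    The model-independent idea—compare many observed distributions to the Standard Model expectation and flag the most significant deviation—can be sketched as a region scan. The numbers below are mock counts, and a real analysis (e.g. the approach this thesis describes) also handles deficits, systematic uncertainties and the look-elsewhere effect:

```python
import math

# Standard Model expectation and mock observed event counts per bin of
# some kinematic distribution (purely illustrative numbers)
expected = [50.0, 40.0, 30.0, 20.0, 10.0, 5.0, 2.0]
observed = [48, 43, 29, 35, 11, 4, 1]   # the fourth bin carries an excess

def p_excess(n, mu):
    """P(N >= n | Poisson mean mu): chance that background alone
    fluctuates at least this high (log space avoids overflow)."""
    return 1.0 - sum(math.exp(k * math.log(mu) - mu - math.lgamma(k + 1))
                     for k in range(n))

# Scan all connected bin regions and keep the most significant excess
best_p, best_region = min(
    (p_excess(sum(observed[i:j]), sum(expected[i:j])), (i, j))
    for i in range(len(observed))
    for j in range(i + 1, len(observed) + 1)
)
print("most significant region:", best_region, "p-value:", best_p)
```

    Because no particular signal shape is assumed, the same scan is sensitive to deviations from models "not yet thought of", which is the point of the method.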

  13. High-performance flexible perovskite solar cells exploiting Zn2SnO4 prepared in solution below 100 °C.

    Science.gov (United States)

    Shin, Seong Sik; Yang, Woon Seok; Noh, Jun Hong; Suk, Jae Ho; Jeon, Nam Joong; Park, Jong Hoon; Kim, Ju Seong; Seong, Won Mo; Seok, Sang Il

    2015-06-22

    Fabricating inorganic-organic hybrid perovskite solar cells (PSCs) on plastic substrates broadens their scope for implementation in real systems by imparting portability and conformability and by allowing high-throughput production, which is necessary for lowering costs. Here we report a new route to prepare highly dispersed Zn2SnO4 (ZSO) nanoparticles at low temperature (below 100 °C). Introduction of the ZSO film significantly improves the transmittance of the flexible polyethylene naphthalate/indium-doped tin oxide (PEN/ITO)-coated substrate from ∼75 to ∼90% over the entire range of wavelengths. The best-performing flexible PSC, based on the ZSO and CH3NH3PbI3 layers, exhibits a steady-state power conversion efficiency (PCE) of 14.85% under AM 1.5G 100 mW·cm−2 illumination. This renders ZSO a promising candidate as an electron-conducting electrode for highly efficient flexible PSC applications.

  14. High Performance Modeling of Novel Diagnostics Configuration

    Science.gov (United States)

    Smith, Dalton; Gibson, John; Lodes, Rylie; Malcolm, Hayden; Nakamoto, Teagan; Parrack, Kristina; Trujillo, Christopher; Wilde, Zak; Los Alamos Laboratories Q-6 Students Team

    2017-06-01

    A novel diagnostic method to measure the Hayes Electric Effect was tested and verified against computerized models. Whereas standard PVDF diagnostics utilize piezoelectric materials to measure detonation pressure through strain-induced electrical signals, the PVDF was here used in a novel technique that also detects the detonation's induced electric field. The ALE-3D hydro codes predicted the performance by calculating detonation velocities, pressures, and arrival times. These theoretical results then validated the experimental use of the PVDF repurposed specifically to track the Hayes Electric Effect. Los Alamos National Laboratory Q-6.

  15. Exploiting Performance of Different Low-Cost Sensors for Small Amplitude Oscillatory Motion Monitoring: Preliminary Comparisons in View of Possible Integration

    Directory of Open Access Journals (Sweden)

    Elisa Benedetti

    2016-01-01

    Full Text Available We address the problem of low amplitude oscillatory motion detection through different low-cost sensors: a LIS3LV02DQ MEMS accelerometer, a Microsoft Kinect v2 range camera, and a uBlox 6 GPS receiver. Several tests were performed using a one-direction vibrating table with different oscillation frequencies (in the range 1.5–3 Hz and small challenging amplitudes (0.02 m and 0.03 m. A Mikrotron EoSens high-resolution camera was used to give reference data. A dedicated software tool was developed to retrieve Kinect v2 results. The capabilities of the VADASE algorithm were employed to process uBlox 6 GPS receiver observations. In the investigated time interval (in the order of tens of seconds the results obtained indicate that displacements were detected with the resolution of fractions of millimeters with MEMS accelerometer and Kinect v2 and few millimeters with uBlox 6. MEMS accelerometer displays the lowest noise but a significant bias, whereas Kinect v2 and uBlox 6 appear more stable. The results suggest the possibility of sensor integration both for indoor (MEMS accelerometer + Kinect v2 and for outdoor (MEMS accelerometer + uBlox 6 applications and seem promising for structural monitoring applications.
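    The detection task itself—recovering a small-amplitude oscillation frequency from a noisy, biased displacement record—can be sketched with a simple spectral estimate. The data here are synthetic; the paper's actual processing chains (VADASE, Kinect v2 tracking) are considerably more involved:

```python
import numpy as np

fs, dur = 100.0, 30.0                  # sample rate (Hz) and duration (s)
t = np.arange(0.0, dur, 1.0 / fs)
rng = np.random.default_rng(3)

# Simulated table motion: 2 Hz oscillation, 0.02 m amplitude,
# plus sensor noise and a constant bias (like the MEMS accelerometer's)
x = 0.02 * np.sin(2 * np.pi * 2.0 * t) + 0.005 * rng.normal(size=t.size) + 0.1

# Dominant frequency of the demeaned record via the real FFT
X = np.abs(np.fft.rfft(x - x.mean()))
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
print("detected frequency (Hz):", freqs[X.argmax()])
```

    Demeaning removes the bias term; the 30 s window gives a frequency resolution of 1/30 Hz, comfortably resolving the 1.5–3 Hz range used in the tests.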

  16. Information source exploitation/exploration and NPD decision-making

    DEFF Research Database (Denmark)

    Jespersen, Kristina Risom

    The purpose of this study is to examine how the exploration/exploitation continuum is applied by decision-makers in new product gate decision-making. Specifically, we analyze at gate decision-points how the evaluation of a new product project is affected by the information source exploitation/exploration search behavior of decision-makers. In addition, overexploitation and overexploration in new product development decision-making is investigated through the mediating effects of perceived information usefulness and the performance criteria applied by decision-makers at gates. To this end, a conceptual model of gate decision-making and information sources was developed across five generic stages (idea, concept, design, test, and commercialization). Our data was generated with a participatory agent-based simulation of NPD gate decision-points in the development process. The sample consists of 134 managers from...

  17. Tonic dopamine modulates exploitation of reward learning

    Directory of Open Access Journals (Sweden)

    Jeff A Beeler

    2010-11-01

    Full Text Available The impact of dopamine on adaptive behavior in a naturalistic environment is largely unexamined. Experimental work suggests that phasic dopamine is central to reinforcement learning, whereas tonic dopamine may modulate performance without altering learning per se; however, this idea has not been developed formally or integrated with computational models of dopamine function. We quantitatively evaluate the role of tonic dopamine in these functions by studying the behavior of hyperdopaminergic DAT knockdown mice in an instrumental task in a semi-naturalistic homecage environment. In this closed-economy paradigm, subjects earn all of their food by pressing either of two levers, but the relative cost of food on each lever shifts frequently. Compared to wild-type mice, hyperdopaminergic mice allocate more lever presses to high-cost levers, thus working harder to earn a given amount of food and maintain their body weight. However, both groups show a similarly quick reaction to shifts in lever cost, suggesting that the hyperdopaminergic mice are not slower at detecting changes, as would occur with a learning deficit. We fit the lever-choice data using reinforcement learning models to assess the distinction between acquisition and expression that these models formalize. In these analyses, hyperdopaminergic mice displayed normal learning from recent reward history but a diminished capacity to exploit this learning: a reduced coupling between choice and reward history. These data suggest that dopamine modulates the degree to which prior learning biases action selection and consequently alters the expression of learned, motivated behavior.
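    The acquisition/expression distinction that such models formalize can be illustrated with a standard softmax Q-learning agent: the learning rate governs acquisition, while the inverse temperature governs how strongly learned values are exploited in choice. This is a generic sketch, not the paper's fitted model:

```python
import numpy as np

def simulate(beta, alpha=0.2, trials=2000, seed=0):
    """Two-armed bandit agent: alpha is the learning rate (acquisition),
    beta the softmax inverse temperature (exploitation of learned values).
    Returns the fraction of choices of the better arm."""
    rng = np.random.default_rng(seed)
    p_reward = np.array([0.8, 0.2])          # arm 0 is the better option
    q = np.zeros(2)
    correct = 0
    for _ in range(trials):
        probs = np.exp(beta * q)
        probs /= probs.sum()                 # softmax over learned values
        a = rng.choice(2, p=probs)
        r = float(rng.random() < p_reward[a])
        q[a] += alpha * (r - q[a])           # identical learning rule
        correct += (a == 0)
    return correct / trials

# Same learning, weaker coupling between values and choice:
print("strong exploitation (beta=5):", simulate(5.0))
print("reduced exploitation (beta=1):", simulate(1.0))
```

    Both agents learn the reward history equally well; only the low-beta agent fails to express it in its choices, mirroring the reduced choice–reward coupling reported for the hyperdopaminergic mice.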

  18. On the issue of Roşia Montană gold exploitation An application and extension of the Arrow-Fisher uncertainty model on local issues

    Directory of Open Access Journals (Sweden)

    Săveanu Mircea

    2015-03-01

    Full Text Available The aim of this article is to analyse the prospects of gold mine development at Roşia Montană, with a focus on the uncertainty regarding this process. We are specifically interested in the issues associated with cyanide spill accidents and how these uncertain events can alter the cost/benefit analysis. Making use of an established methodology, we conclude that the projected gold exploitation can pose considerable risks to the environment, and that an efficient operation would imply either reducing these risks, their effect on the environment, or both. Finally, we also draw from historical knowledge regarding the ancient mine Alburnus Maior, in order to assess the viability of the modern exploitation. We conclude that the modern project could be improved by technological progress, which would seek to maximize the scale of operations, while minimizing both the risk of accidents and their impact on the environment.

  19. Evaluation of Renewable Energy Sources Exploitation at remote regions, using Computing Model and Multi-Criteria Analysis: A Case-Study in Samothrace, Greece

    OpenAIRE

    Mourmouris, J. C.; Potolias, C.; Fantidis, Jacob G

    2016-01-01

    The exhaustion of conventional energy reserves (diesel, coal, gas etc.), coupled with environmental impacts, seriously influences the sustainability of the whole planet, both from an economic and an environmental point of view. The use of renewable energy resources for electricity generation is being promoted by most countries. The present work focuses on the exploitation of Renewable Energy Sources (RES) based not only on economic criteria, derived from a HOMER software tool, but ...

  20. Virtual Exploitation Environment Demonstration for Atmospheric Missions

    Science.gov (United States)

    Natali, Stefano; Mantovani, Simone; Hirtl, Marcus; Santillan, Daniel; Triebnig, Gerhard; Fehr, Thorsten; Lopes, Cristiano

    2017-04-01

    The scientific and industrial communities are being confronted with a strong increase of Earth Observation (EO) satellite missions and related data. This is in particular the case for the Atmospheric Sciences communities, with the upcoming Copernicus Sentinel-5 Precursor, Sentinel-4, -5 and -3, and ESA's Earth Explorers scientific satellites ADM-Aeolus and EarthCARE. The challenge is not only to manage the large volume of data generated by each mission / sensor, but to process and analyze the data streams. Creating synergies among the different datasets will be key to exploiting the full potential of the available information. As a preparation activity supporting scientific data exploitation for Earth Explorer and Sentinel atmospheric missions, ESA funded the "Technology and Atmospheric Mission Platform" (TAMP) [1] [2] project; a scientific and technological forum (STF) has been set up involving relevant European entities from different scientific and operational fields to define the platform's requirements. Data access, visualization, processing and download services have been developed to satisfy users' needs; use cases defined with the STF, such as the study of the SO2 emissions from the Holuhraun eruption (2014) by means of two numerical models, two satellite platforms and ground measurements, global aerosol analyses from long time series of satellite data, and local aerosol analysis using satellite and LIDAR data, have been implemented to ensure acceptance of TAMP by the atmospheric sciences community. The platform pursues the "virtual workspace" concept: all resources (data, processing, visualization, collaboration tools) are provided as "remote services", accessible through a standard web browser, to avoid the download of big data volumes and to allow utilization of the provided infrastructure for computation, analysis and sharing of results. Data access and processing are achieved through standardized protocols (WCS, WPS). As evolution toward a pre

  1. Performance Management: A model and research agenda

    NARCIS (Netherlands)

    D.N. den Hartog (Deanne); J.P.P.E.F. Boselie (Paul); J. Paauwe (Jaap)

    2004-01-01

    textabstractPerformance Management deals with the challenge organizations face in defining, measuring and stimulating employee performance with the ultimate goal to improve organizational performance. Thus, Performance Management involves multiple levels of analysis and is clearly linked to the

  2. M-Commerce Exploitation

    DEFF Research Database (Denmark)

    Ulhøi, John Parm; Jørgensen, Frances

    2008-01-01

    ... into this emerging market may well depend on the development of new business models that emphasize the socio-technical intricacies of these networks. The objective of this paper is to examine the development of these networks as a central part of new M-commerce business models in SMEs and report on initial findings...

  3. Numerical modeling capabilities to predict repository performance

    Energy Technology Data Exchange (ETDEWEB)

    1979-09-01

    This report presents a summary of current numerical modeling capabilities that are applicable to the design and performance evaluation of underground repositories for the storage of nuclear waste. The report includes codes that are available in-house, within Golder Associates and Lawrence Livermore Laboratories, as well as those that are generally available within industry and universities. The first listing of programs covers in-house codes in the subject areas of hydrology, solute transport, thermal and mechanical stress analysis, and structural geology. The second listing of programs is divided by subject into the following categories: site selection, structural geology, mine structural design, mine ventilation, hydrology, and mine design/construction/operation. These programs are not specifically designed for use in the design and evaluation of an underground repository for nuclear waste, but several or most of them may be so used.

  4. Fishery of the Uçá Crab Ucides cordatus (Linnaeus, 1763) in a Mangrove Area in Cananéia, State of São Paulo, Brazil: Fishery Performance, Exploitation Patterns and Factors Affecting the Catches

    Directory of Open Access Journals (Sweden)

    Luis Felipe de Almeida Duarte

    2014-09-01

    Full Text Available The fishery of the mangrove crab (Ucides cordatus) is one of the oldest sources of food, income and extractive activity in the estuarine systems of Brazil. The state of São Paulo has the largest population of any Brazilian state, and the city of Cananéia, in the Brazilian southeast, has the highest recorded level of exploitation of the uçá crab. Since 1990, this species has been under intense exploitation pressure due to the unauthorized use of a type of trap called 'redinha'. This type of fishing gear is considered harmful and is prohibited by Brazilian law, although its use is very common throughout the country. This study aims to evaluate the exploitation patterns of U. cordatus based on landing data and monitoring of the crab fishermen, to verify the population structure of the crab stock and to identify the factors that influence the catches. A general view of the sustainability of the fishery for this resource is also provided for five defined mangrove sectors (areas A to E) at Cananéia. For this purpose, fishery data were recorded during 2009-2010 by the Instituto de Pesca (APTA/SAA-SP), and the capture procedures used by two fishermen were monitored to obtain biometry data (CW, carapace width) and gender data for the captured crabs. The redinha trap was very efficient (86.4%) and produced sustainable catches, because the trapped crabs were legal-sized males (CW > 60 mm), although some traps are lost or remain in the mangrove swamps and can cause pollution by introducing plastic debris. The fishery data were evaluated with a General Linear Model (GLM) based on six factors: the characteristics of the crab fishermen, the time of capture (by month and year), the lunar phase, the productive sector and the reproductive period. The individual crab fishermen's empirical knowledge, the year of capture and the productive sector were the strongest influences on the crab catch per unit effort (CPUE). Differing extraction patterns were found in

  5. High-performance phase-field modeling

    KAUST Repository

    Vignal, Philippe

    2015-04-27

    Many processes in engineering and the sciences involve the evolution of interfaces. Among the mathematical frameworks developed to model these types of problems, the phase-field method has emerged as a possible solution. Phase-fields nonetheless lead to complex nonlinear, high-order partial differential equations, whose solution poses mathematical and computational challenges. Guaranteeing some of the physical properties of the equations has led to the development of efficient algorithms and discretizations capable of recovering said properties by construction [2, 5]. This work builds on these ideas and proposes novel discretization strategies that guarantee numerical energy dissipation for both conserved and non-conserved phase-field models. The temporal discretization is based on a novel method which relies on Taylor series and ensures strong energy stability. It is second-order accurate, and can also be rendered linear to speed up the solution process [4]. The spatial discretization relies on Isogeometric Analysis, a finite element method that possesses the k-refinement technology and enables the generation of high-order, high-continuity basis functions. These basis functions are well suited to handle the high-order operators present in phase-field models. Two- and three-dimensional results for the Allen-Cahn, Cahn-Hilliard, Swift-Hohenberg and phase-field crystal equations will be presented, which corroborate the theoretical findings and illustrate the robustness of the method. Results related to more challenging examples, namely the Navier-Stokes Cahn-Hilliard and a diffusion-reaction Cahn-Hilliard system, will also be presented. The implementation was done in PetIGA and PetIGA-MF, high-performance Isogeometric Analysis frameworks [1, 3], designed to handle non-linear, time-dependent problems.
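    A minimal example of the class of scheme discussed—though far simpler than the Taylor-series time stepping and isogeometric discretization of the work itself—is a semi-implicit spectral step for the 1D Allen-Cahn equation, checked against the free-energy decay such methods aim to guarantee:

```python
import numpy as np

# 1D Allen-Cahn on a periodic domain, semi-implicit (IMEX) time stepping:
# linear diffusion treated implicitly, the double-well term explicitly.
N, L, eps, dt = 128, 2 * np.pi, 0.2, 0.01
x = np.linspace(0, L, N, endpoint=False)
k = np.fft.fftfreq(N, d=L / N) * 2 * np.pi     # spectral wavenumbers
u = 0.1 * np.cos(3 * x)                         # small initial perturbation

def energy(u):
    """Ginzburg-Landau free energy: gradient term + double well."""
    ux = np.real(np.fft.ifft(1j * k * np.fft.fft(u)))
    return np.sum(0.5 * eps**2 * ux**2 + 0.25 * (u**2 - 1) ** 2) * (L / N)

E0 = energy(u)
for _ in range(500):
    rhs = np.fft.fft(u - dt * (u**3 - u))
    u = np.real(np.fft.ifft(rhs / (1 + dt * eps**2 * k**2)))

print("initial energy:", E0, "-> final energy:", energy(u))
```

    The perturbation grows into ±1 domains separated by thin interfaces, and the discrete free energy decreases along the way, which is the dissipation property the paper's schemes enforce by construction for much richer models.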

  6. Performance Modelling of Steam Turbine Performance using Fuzzy ...

    African Journals Online (AJOL)

    A Fuzzy Inference System for predicting the performance of a steam turbine based on the Rankine cycle is developed, using a 144-rule base to analyze the generated data for different inlet and outlet conditions. Efficiency results were obtained for different types of membership functions and defuzzification methods.
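The inference idea can be illustrated with a toy Sugeno-type system; the two inputs, membership ranges and rule consequents below are invented for the sketch and are not the paper's 144-rule base.

```python
# Toy two-input fuzzy system mapping hypothetical inlet temperature (degC) and
# condenser pressure (kPa) to a crisp Rankine-cycle efficiency (%) via
# weighted-average defuzzification. All ranges and consequents are illustrative.

def tri(x, a, b, c):
    """Triangular membership function peaking at b over the support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a)/(b - a) if x <= b else (c - x)/(c - b)

def efficiency(t_in, p_cond):
    # Membership grades for each linguistic term.
    t_low, t_high = tri(t_in, 300, 350, 400), tri(t_in, 350, 400, 450)
    p_low, p_high = tri(p_cond, 0, 5, 10), tri(p_cond, 5, 10, 15)
    # Each rule: (firing strength via product AND, constant consequent).
    rules = [(t_low*p_low, 30), (t_low*p_high, 25),
             (t_high*p_low, 38), (t_high*p_high, 32)]
    w = sum(fire for fire, _ in rules)
    return sum(fire*eta for fire, eta in rules) / w if w else 0.0

print(round(efficiency(375, 7.5), 2))  # → 31.25
```

A full system like the paper's simply scales this up: more input terms, more rules (here 2x2 = 4; a richer partition of the inputs yields rule bases of the 144-rule size), and alternative membership shapes and defuzzification methods.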

  7. Cognitive, social, and neural determinants of diminished decision-making and financial exploitation risk in aging and dementia: A review and new model.

    Science.gov (United States)

    Spreng, R Nathan; Karlawish, Jason; Marson, Daniel C

    2016-01-01

    In this article we will briefly review how changes in brain and in cognitive and social functioning, across the spectrum from normal to pathological aging, can lead to decision-making impairments that increase abuse risk in many life domains (e.g., health care, social engagement, financial management). The review will specifically focus on emerging research identifying neural, cognitive, and social markers of declining financial decision-making capacity in older adults. We will highlight how these findings are opening avenues for early detection and new interventions to reduce exploitation risk.

  8. The landscape of GPGPU performance modeling tools

    NARCIS (Netherlands)

    Madougou, S.; Varbanescu, A.; de Laat, C.; van Nieuwpoort, R.

    GPUs are gaining fast adoption as high-performance computing architectures, mainly because of their impressive peak performance. Yet most applications only achieve small fractions of this performance. While both programmers and architects have clear opinions about the causes of this performance gap,

  9. Modelling and simulating fire tube boiler performance

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    A model for a flue gas boiler covering the flue gas and the water-/steam side has been formulated. The model has been formulated as a number of sub models that are merged into an overall model for the complete boiler. Sub models have been defined for the furnace, the convection zone (split in 2: a zone submerged in water and a zone covered by steam), a model for the material in the boiler (the steel) and 2 models for resp. the water/steam zone (the boiling) and the steam. The dynamic model has been developed as a Differential-Algebraic-Equation (DAE) system. Subsequently Mat...

  10. Real-time video exploitation system for small UAVs

    Science.gov (United States)

    Su, Ang; Zhang, Yueqiang; Dong, Jing; Xu, Yuhua; Zhu, Xianwei; Zhang, Xiaohu

    2013-08-01

    The high portability of small Unmanned Aircraft Vehicles (UAVs) makes them play an important role in surveillance and reconnaissance tasks, so the military and civilian demand for UAVs is constantly growing. Recently, we have developed a real-time video exploitation system for our small UAV, which is mainly used in forest patrol tasks. Our system consists of six key modules: image contrast enhancement, video stabilization, mosaicing, salient target indication, moving target indication, and display of the footprint and flight path on a map. Extensive testing of the system has been carried out, and the results show that it performs well.

  11. The Gaia scientific exploitation networks

    Science.gov (United States)

    Figueras, F.; Jordi, C.

    2015-05-01

    In July 2014 the Gaia satellite, placed at L2 since January 2014, finished its commissioning phase and started collecting highly accurate scientific data. New and more realistic estimates of the astrometric, photometric and spectroscopic accuracy expected after five years of mission operation (2014-2019) have recently been published in the Gaia Science Performance Web page. Here we present the coordination efforts and the activities being conducted through the two GREAT (Gaia Research for European Astronomy Training) European Networks: the GREAT-ESF, a programme supported by the European Science Foundation (2010-2015), and the GREAT-ITN network, from the European Union's Seventh Framework Programme (2011-2015). The main research theme of these networks is to unravel the origin and history of our home galaxy. Emphasis is placed on the research projects being conducted by Spanish researchers through these networks, coordinated by the Red Española de Explotación Científica de Gaia (REG network, with more than 140 participants). Members of the REG play an important role in the collection of complementary spectroscopic data from ground-based telescopes, in the development of new tools for an optimal scientific exploitation of Gaia data and in the preparation tasks to create the Gaia archive.

  12. Commercial sexual exploitation of children

    National Research Council Canada - National Science Library

    Mauricio Rojas Betancur; Raquel Mendez Villamizar; Diana Lucía Moreno

    2012-01-01

      We study the sexual exploitation of children contributing to the understanding of risk and situations favouring the entry and permanence of children and adolescents from the reconstruction of the...

  13. Transnational gestational surrogacy: does it have to be exploitative?

    Science.gov (United States)

    Kirby, Jeffrey

    2014-01-01

    This article explores the controversial practice of transnational gestational surrogacy and poses a provocative question: Does it have to be exploitative? Various existing models of exploitation are considered and a novel exploitation-evaluation heuristic is introduced to assist in the analysis of the potentially exploitative dimensions/elements of complex health-related practices. On the basis of application of the heuristic, I conclude that transnational gestational surrogacy, as currently practiced in low-income country settings (such as rural, western India), is exploitative of surrogate women. Arising out of consideration of the heuristic's exploitation conditions, a set of public education and enabled choice, enhanced protections, and empowerment reforms to transnational gestational surrogacy practice is proposed that, if incorporated into a national regulatory framework and actualized within a low income country, could possibly render such practice nonexploitative.

  14. PV Performance Modeling Methods and Practices: Results from the 4th PV Performance Modeling Collaborative Workshop.

    Energy Technology Data Exchange (ETDEWEB)

    Stein, Joshua [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-03-01

    In 2014, the IEA PVPS Task 13 added the PVPMC as a formal activity to its technical work plan for 2014-2017. The goal of this activity is to expand the reach of the PVPMC to a broader international audience and help to reduce PV performance modeling uncertainties worldwide. One of the main deliverables of this activity is to host one or more PVPMC workshops outside the US to foster more international participation within this collaborative group. This report reviews the results of the first in a series of these joint IEA PVPS Task 13/PVPMC workshops. The 4th PV Performance Modeling Collaborative Workshop was held in Cologne, Germany at the headquarters of TÜV Rheinland on October 22-23, 2015.

  15. Social exploitation of vitellogenin.

    Science.gov (United States)

    Amdam, Gro V; Norberg, Kari; Hagen, Arne; Omholt, Stig W

    2003-02-18

    Vitellogenin is a female-specific glucolipoprotein yolk precursor produced by all oviparous animals. Vitellogenin expression is under hormonal control, and the protein is generally synthesized directly before yolk deposition. In the honeybee (Apis mellifera), vitellogenin is not only synthesized by the reproductive queen, but also by the functionally sterile workers. In summer, the worker population consists of a hive bee group performing a multitude of tasks including nursing inside the nest, and a forager group specialized in collecting nectar, pollen, water, and propolis. Vitellogenin is synthesized in large quantities by hive bees. When hive bees develop into foragers, their juvenile hormone titers increase, and this causes cessation of their vitellogenin production. This inverse relationship between vitellogenin synthesis and juvenile hormone is opposite to the norm in insects, and the underlying proximate processes and life-history reasons are still not understood. Here we document an alternative use of vitellogenin by showing that it is a source for the proteinaceous royal jelly that is produced by the hive bees. Hive bees use the jelly to feed larvae, queen, workers, and drones. This finding suggests that the evolution of a brood-rearing worker class and a specialized forager class in an advanced eusocial insect society has been directed by an alternative utilization of yolk protein.

  16. Modelling and simulating fire tube boiler performance

    DEFF Research Database (Denmark)

    Sørensen, Kim; Karstensen, Claus; Condra, Thomas Joseph

    2003-01-01

    A model for a flue gas boiler covering the flue gas and the water-/steam side has been formulated. The model has been formulated as a number of sub models that are merged into an overall model for the complete boiler. Sub models have been defined for the furnace, the convection zone (split in 2: a zone submerged in water and a zone covered by steam), a model for the material in the boiler (the steel) and 2 models for resp. the water/steam zone (the boiling) and the steam. The dynamic model has been developed as a Differential-Algebraic-Equation (DAE) system. Subsequently MatLab/Simulink has...

  17. Performance Improvement/HPT Model: Guiding the Process

    Science.gov (United States)

    Dessinger, Joan Conway; Moseley, James L.; Van Tiem, Darlene M.

    2012-01-01

    This commentary is part of an ongoing dialogue that began in the October 2011 special issue of "Performance Improvement"--Exploring a Universal Performance Model for HPT: Notes From the Field. The performance improvement/HPT (human performance technology) model represents a unifying process that helps accomplish successful change, create…

  18. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    Science.gov (United States)

    The model performance evaluation consists of metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude-only, sequence-only, and combined magnitude and sequence errors.

  19. Building performance modelling for sustainable building design

    Directory of Open Access Journals (Sweden)

    Olufolahan Oduyemi

    2016-12-01

    The output revealed that BPM delivers information needed for enhanced design and building performance. Recommendations such as the establishment of proper mechanisms to monitor the performance of BPM related construction are suggested to allow for its continuous implementation. This research consolidates collective movements towards wider implementation of BPM and forms a base for developing a sound BIM strategy and guidance.

  20. Exploitation et obligation de travailler

    Directory of Open Access Journals (Sweden)

    Pierre-Étienne Vandamme

    2014-06-01

    Full Text Available This article defends a definition of exploitation restricted to labour relations, attempting on the one hand to make explicit a common-sense understanding of the concept (inequitable remuneration for the work performed), and on the other hand to avoid the difficulties that have beset the traditional Marxist definition of exploitation as extraction of surplus value (in its various variants). It thus explores the link between exploitation and the material obligation to work in order to meet one's basic needs. After warning against activation policies aimed at the unemployed, it concludes that exploitation is a phenomenon that can be combated through relatively simple mechanisms, even in capitalist societies. It notes, however, that this is not sufficient to achieve social justice, and resituates exploitation among other fundamental issues for an egalitarian political philosophy.

  1. Flare Systems Exploitation and Impact on Permafrost

    Science.gov (United States)

    Filimonov, M. Yu; Vaganova, N. A.

    2017-09-01

    Mathematical models and numerical algorithms for the exploitation of horizontal and vertical flare systems in northern oil and gas fields located in the permafrost zone are developed. Computations of long-term forecasts of permafrost degradation around such constructions have been carried out for various types of operation, including emergency situations that cause a short-term increase in the heat flow near the ground surface and thereby an additional increase in soil temperature.

  2. Modelling Flexible Pavement Response and Performance

    DEFF Research Database (Denmark)

    Ullidtz, Per

    This textbook is primarily concerned with models for predicting the future condition of flexible pavements, as a function of traffic loading, climate, materials, etc., using analytical-empirical methods.

  3. Modeling, simulation and performance evaluation of parabolic ...

    African Journals Online (AJOL)

    A model of a parabolic trough power plant is presented, taking into consideration the different losses associated with collection of the solar irradiance as well as thermal losses. MATLAB software is employed to model the power plant at reference state points. The code is then used to find the different reference values which are ...

  4. Data Mining of Hydrological Model Performances

    Science.gov (United States)

    Vitolo, Claudia; Buytaert, Wouter

    2013-04-01

    Multi-objective criteria have long been used to infer hydrological simulations and fit the natural world. On the other hand, modelling frameworks are also becoming more and more popular, as identification of the processes occurring in a catchment is still a very uncertain matter. In theory, multi-objective criteria and multi-model frameworks should be used in combination so that the 'representation' of the catchment is fitted to the observations, not only the simulated results. In practice those approaches are highly computationally demanding. The modeller is often obliged to find a compromise, reducing either the number of objective functions or the number of model structures taken into consideration. This compromise is becoming obsolete using parallel computing. In the present study we investigate the extent to which model selection algorithms and regionalisation techniques can be improved by such facilities and highlight the challenges that still need to be addressed. The model simulations are obtained using an ensemble of conceptual lumped models (FUSE by Clark et al. 2008), but the techniques and suggestions are of general use and applicable to any modelling framework. In particular we developed a novel model selection algorithm tuned to drastically reduce the subjectivity in the analysis. The procedure was automated and coupled with redundancy reduction techniques such as PCA and Cluster Analysis. Results show that the actual model 'representation' has the shape of a set of complementing model structures. It is also possible to capture intra-annum dynamics of the response, as the algorithm recognises subtle variations in the selected model structures in different seasons. Similar variations can be found analysing different catchments. This suggests the same methodology would be suitable for analysing spatial patterns in the distribution of suitable model structures and maybe long-term dynamics in relation to expected climate modifications. 
Although the mentioned methodology

  5. The exploration-exploitation dilemma: a multidisciplinary framework.

    Directory of Open Access Journals (Sweden)

    Oded Berger-Tal

    Full Text Available The trade-off between the need to obtain new knowledge and the need to use that knowledge to improve performance is one of the most basic trade-offs in nature, and optimal performance usually requires some balance between exploratory and exploitative behaviors. Researchers in many disciplines have been searching for the optimal solution to this dilemma. Here we present a novel model in which the exploration strategy itself is dynamic and varies with time in order to optimize a definite goal, such as the acquisition of energy, money, or prestige. Our model produced four very distinct phases: Knowledge establishment, Knowledge accumulation, Knowledge maintenance, and Knowledge exploitation, giving rise to a multidisciplinary framework that applies equally to humans, animals, and organizations. The framework can be used to explain a multitude of phenomena in various disciplines, such as the movement of animals in novel landscapes, the most efficient resource allocation for a start-up company, or the effects of old age on knowledge acquisition in humans.
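A common way to make the exploration strategy itself vary with time, in the spirit of the phases described above, is a decaying exploration rate. The sketch below uses an epsilon-greedy agent on a multi-armed bandit with invented reward probabilities; it illustrates the general trade-off, not the authors' model.

```python
import random

# 3-armed bandit: the agent starts with heavy exploration (knowledge
# establishment/accumulation) and shifts toward exploitation as its
# exploration probability decays with time.
random.seed(1)
true_means = [0.2, 0.5, 0.8]          # hypothetical reward probabilities
counts, values = [0]*3, [0.0]*3       # per-arm pull counts and value estimates

total = 0.0
for t in range(1, 2001):
    eps = 1.0 / t**0.5                # exploration probability shrinks with time
    if random.random() < eps:
        arm = random.randrange(3)     # explore: try a random arm
    else:
        arm = values.index(max(values))   # exploit: use current knowledge
    reward = 1.0 if random.random() < true_means[arm] else 0.0
    counts[arm] += 1
    values[arm] += (reward - values[arm]) / counts[arm]   # incremental mean
    total += reward

print(counts)   # with decaying exploration, the best arm typically dominates
```

Replacing the fixed decay schedule with one optimized against a definite goal (energy, money, prestige) is essentially what turns this toy into the dynamic-strategy framework the abstract describes.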

  6. Modeling and optimization of LCD optical performance

    CERN Document Server

    Yakovlev, Dmitry A; Kwok, Hoi-Sing

    2015-01-01

    The aim of this book is to present the theoretical foundations of modeling the optical characteristics of liquid crystal displays, critically reviewing modern modeling methods and examining areas of applicability. The modern matrix formalisms of optics of anisotropic stratified media, most convenient for solving problems of numerical modeling and optimization of LCD, will be considered in detail. The benefits of combined use of the matrix methods will be shown, which generally provides the best compromise between physical adequacy and accuracy with computational efficiency and optimization fac

  7. Integrated thermodynamic model for ignition target performance

    Directory of Open Access Journals (Sweden)

    Springer P.T.

    2013-11-01

    Full Text Available We have derived a 3-dimensional synthetic model for NIF implosion conditions by predicting and optimizing fits to a broad set of x-ray and nuclear diagnostics obtained on each shot. By matching x-ray images, burn width, neutron time-of-flight ion temperature, yield, and fuel ρr, we obtain nearly unique constraints on conditions in the hotspot and fuel in a model that is entirely consistent with the observables. This model allows us to determine hotspot density, pressure, areal density (ρr), total energy, and other ignition-relevant parameters not available from any single diagnostic. This article describes the model and its application to National Ignition Facility (NIF) tritium-hydrogen-deuterium (THD) and DT implosion data, and provides an explanation for the large yield and ρr degradation compared to numerical code predictions.

  8. Hydrologic Evaluation of Landfill Performance (HELP) Model

    Science.gov (United States)

    The program models rainfall, runoff, infiltration, and other water pathways to estimate how much water builds up above each landfill liner. It can incorporate data on vegetation, soil types, geosynthetic materials, initial moisture conditions, slopes, etc.

  9. Exploiting first-class arrays in Fortran for accelerator programming

    Energy Technology Data Exchange (ETDEWEB)

    Rasmussen, Craig E [Los Alamos National Laboratory; Weseloh, Wayne N [Los Alamos National Laboratory; Robey, Robert W [Los Alamos National Laboratory; Matthew, Sottile J [GALORIS, INC.; Quinlan, Daniel [LLNL; Overbye, Jeffrey [INDIANA UNIV.

    2010-12-15

    Emerging architectures for high performance computing are often well suited to a data parallel programming model. This paper presents a simple programming methodology based on existing languages and compiler tools that allows programmers to take advantage of these systems. We work with the array features of Fortran 90 to show how this infrequently exploited, standardized language feature is easily transformed to lower-level accelerator code. Our transformations are based on a mapping from Fortran 90 to C++ code with OpenCL extensions. The sheer complexity of programming for clusters of many- or multi-core processors with tens of millions of threads of execution makes the simplicity of the data parallel model attractive. Furthermore, the increasing complexity of today's applications (especially when convolved with the increasing complexity of the hardware) and the need for portability across hardware architectures make a higher-level and simpler programming model like data parallel attractive. The goal of this work has been to exploit source-to-source transformations that allow programmers to develop and maintain programs at a high level of abstraction, without coding to a specific hardware architecture. Furthermore, these transformations allow multiple hardware architectures to be targeted without changing the high-level source. They also remove the necessity for application programmers to understand details of the accelerator architecture or to know OpenCL.
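The correspondence between whole-array expressions and generated per-element kernels can be illustrated by analogy, here in NumPy rather than Fortran 90/OpenCL (this is not the paper's translator):

```python
import numpy as np

# A whole-array expression is the high-level data-parallel form; the
# element-wise loop below is the kind of low-level kernel body a
# source-to-source translator would emit, one iteration per device thread.
a = np.arange(8, dtype=np.float64)
b = np.full(8, 2.0)

c_array = a * b + 1.0                 # high-level: one data-parallel statement

c_kernel = np.empty_like(a)
for gid in range(a.size):             # 'gid' plays the role of get_global_id(0)
    c_kernel[gid] = a[gid] * b[gid] + 1.0

print(np.array_equal(c_array, c_kernel))  # → True
```

The point of the source-to-source approach is that the programmer only ever writes and maintains the first form; the loop form (and its OpenCL equivalent) is generated.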

  10. Developing an Energy Performance Modeling Startup Kit

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2012-10-01

    In 2011, the NAHB Research Center began assessing the needs and motivations of residential remodelers regarding energy performance remodeling. This report outlines: the current remodeling industry and the role of energy efficiency; gaps and barriers to adding energy efficiency into remodeling; and support needs of professional remodelers to increase sales and projects involving improving home energy efficiency.

  11. Modeling vibrato and portamento in music performance

    NARCIS (Netherlands)

    Desain, P.W.M.; Honing, H.J.

    1999-01-01

    Research in the psychology of music dealing with expression is often concerned with the discrete aspects of music performance, and mainly concentrates on the study of piano music (partly because of the ease with which piano music can be reduced to discrete note events). However, on other

  12. Electric properties of organic and mineral electronic components, design and modelling of a photovoltaic chain for a better exploitation of the solar energy; Proprietes electriques des composants electroniques mineraux et organiques, conception et modelisation d'une chaine photovoltaique pour une meilleure exploitation de l'energie solaire

    Energy Technology Data Exchange (ETDEWEB)

    Aziz, A

    2006-11-15

    The research carried out in this thesis relates to mineral and organic electronic components and to photovoltaic systems. Concerning the mineral semiconductors, we modelled the conduction properties of highly integrated metal/oxide/semiconductor (MOS) structures in the absence and in the presence of charges. We proposed a methodology for characterizing the ageing of MOS structures under Fowler-Nordheim (FN) current injection. We then studied Schottky diodes in polymers, of the metal/polymer/metal type, and concluded that the charge-transfer mechanism through the metal/polymer interface is attributed to the thermo-ionic effect and could be affected by the lowering of the potential barrier at the metal/polymer interface. In the area of photovoltaic energy, we designed and modelled a photovoltaic system of medium power (100 W). We showed that adapting the generator to the load allows a better exploitation of solar energy. This is carried out by means of converters driven by an MPPT-type control equipped with a circuit for detecting malfunction and restarting the system. (author)

  13. Optimizing Robotic Team Performance with Probabilistic Model Checking

    Science.gov (United States)

    2014-10-22

    Optimizing Robotic Team Performance with Probabilistic Model Checking. Software Engineering Institute, Carnegie Mellon University. Presented at SIMPAR 2014, Bergamo, Italy, October 22, 2014.

  14. Neuro-fuzzy model for evaluating the performance of processes ...

    Indian Academy of Sciences (India)

    In this work an Adaptive Neuro-Fuzzy Inference System (ANFIS) was used to model the periodic performance of some multi-input single-output (MISO) processes, namely: brewery operations (case study 1) and soap production (case study 2) processes. Two ANFIS models were developed to model the performance of the ...

  15. HANDOVER MANAGEABILITY AND PERFORMANCE MODELING IN

    African Journals Online (AJOL)

    SOFTLINKS DIGITAL

    progress feedback necessary to allow control of the project. However, interpretation of this data is very difficult and sometimes cumbersome. This paper addresses the need for a software implementation progress model to help interpret the accumulated data. Certain criteria are set for the design of a proposed ...

  16. Evaluating Performances of Traffic Noise Models | Oyedepo ...

    African Journals Online (AJOL)

    Traffic noise levels in dB(A) were measured at six locations using a 407780A Integrating Sound Level Meter, while spot speeds and traffic volumes were collected with a cine-camera. The predicted sound exposure level (SEL) was evaluated using the Burgess, British and FWHA models. The average noise levels obtained are 77.64 ...

  17. Performance Improvements of the CYCOFOS Flow Model

    Science.gov (United States)

    Radhakrishnan, Hari; Moulitsas, Irene; Syrakos, Alexandros; Zodiatis, George; Nikolaides, Andreas; Hayes, Daniel; Georgiou, Georgios C.

    2013-04-01

    The CYCOFOS-Cyprus Coastal Ocean Forecasting and Observing System has been operational since early 2002, providing daily sea current, temperature, salinity and sea level forecasting data for the next 4 and 10 days to end-users in the Levantine Basin, necessary for operational applications in marine safety, particularly concerning oil spill and floating object predictions. The CYCOFOS flow model, similar to most of the coastal and sub-regional operational hydrodynamic forecasting systems of the MONGOOS-Mediterranean Oceanographic Network for Global Ocean Observing System, is based on the POM-Princeton Ocean Model. CYCOFOS is nested with the MyOcean Mediterranean regional forecasting data and with SKIRON and ECMWF for surface forcing. The increasing demand for higher and higher resolution data to meet coastal and offshore downstream applications motivated the parallelization of the CYCOFOS POM model. This development was carried out in the frame of the IPcycofos project, funded by the Cyprus Research Promotion Foundation. Parallel processing provides a viable solution to satisfy these demands without sacrificing accuracy or omitting any physical phenomena. Prior to the IPcycofos project, there have been several attempts to parallelise the POM, for example MP-POM. The existing parallel code models rely on the use of specific outdated hardware architectures and associated software. The objective of the IPcycofos project is to produce an operational parallel version of the CYCOFOS POM code that can replicate the results of the serial version of the POM code used in CYCOFOS. The parallelization of the CYCOFOS POM model uses the Message Passing Interface (MPI), implemented on commodity computing clusters running open-source software and not depending on any specialized vendor hardware. The parallel CYCOFOS POM code is constructed in a modular fashion, allowing a fast re-locatable downscaled implementation. 
The MPI takes advantage of the Cartesian nature of the POM mesh, and use

  18. Modelling swimming hydrodynamics to enhance performance

    OpenAIRE

    D.A. Marinho; Rouboa, A.; Barbosa, Tiago M.; Silva, A.J.

    2010-01-01

    Swimming assessment is one of the most complex but outstanding and fascinating topics in biomechanics. Computational fluid dynamics (CFD) methodology is one of the different methods that have been applied in swimming research to observe and understand water movements around the human body and its application to improve swimming performance. CFD has been applied attempting to understand deeply the biomechanical basis of swimming. Several studies have been conducted willing to analy...

  19. Extending the Global Sensitivity Analysis of the SimSphere model in the Context of its Future Exploitation by the Scientific Community

    Directory of Open Access Journals (Sweden)

    George P. Petropoulos

    2015-05-01

    Full Text Available In today's changing climate, the development of robust, accurate and globally applicable models is imperative for a wider understanding of Earth's terrestrial biosphere. Moreover, an understanding of the representation, sensitivity and coherence of such models is vital for the operationalisation of any physically based model. A Global Sensitivity Analysis (GSA) was conducted on the SimSphere land biosphere model, in which a meta-modelling method adopting Bayesian theory was implemented. Initially, the effects of assuming uniform probability distribution functions (PDFs) for the model inputs, when examining the sensitivity of key quantities simulated by SimSphere at different output times, were examined. Topographic model input parameters (e.g., slope, aspect, and elevation) were derived within a Geographic Information System (GIS) before implementation within the model. The effect of the time of simulation on the sensitivity of previously examined outputs was also analysed. Results showed that simulated outputs were significantly influenced by changes in topographic input parameters, fractional vegetation cover, vegetation height and surface moisture availability, in agreement with previous studies. The time of model output simulation had a significant influence on the absolute values of the output variance decomposition, but it did not seem to change the relative importance of each input parameter. Sensitivity Analysis (SA) results of the newly modelled outputs allowed identification of the most responsive model inputs and interactions. Our study presents an important step forward in SimSphere verification, given the increasing interest in its use both as an independent modelling and educational tool. Furthermore, this study is very timely, given the ongoing efforts towards the development of operational products based on the synergy of SimSphere with Earth Observation (EO) data. In this context, results also provide additional support for the

  20. An Empirical Study of a Solo Performance Assessment Model

    Science.gov (United States)

    Russell, Brian E.

    2015-01-01

    The purpose of this study was to test a hypothesized model of solo music performance assessment. Specifically, this study investigates the influence of technique and musical expression on perceptions of overall performance quality. The Aural Musical Performance Quality (AMPQ) measure was created to measure overall performance quality, technique,…

  1. Simulated population responses of common carp to commercial exploitation

    Energy Technology Data Exchange (ETDEWEB)

    Weber, Michael J.; Hennen, Matthew J.; Brown, Michael L.

    2011-12-01

    Common carp Cyprinus carpio is a widespread invasive species that can become highly abundant and impose deleterious ecosystem effects. Thus, aquatic resource managers are interested in controlling common carp populations. Control of invasive common carp populations is difficult, due in part to the inherent uncertainty of how populations respond to exploitation. To understand how common carp populations respond to exploitation, we evaluated common carp population dynamics (recruitment, growth, and mortality) in three natural lakes in eastern South Dakota. Common carp exhibited similar population dynamics across these three systems, characterized by consistent recruitment (ages 3 to 15 years present), fast growth (K = 0.37 to 0.59), and low mortality (A = 1 to 7%). We then modeled the effects of commercial exploitation on size structure, abundance, and egg production to determine its utility as a management tool to control populations. All three populations responded similarly to exploitation simulations with a 575-mm length restriction, representing commercial gear selectivity. Simulated common carp size structure declined modestly (9 to 37%) in all simulations. Abundance of common carp declined dramatically (28 to 56%) at low levels of exploitation (0 to 20%), but exploitation >40% had little additive effect, and populations were only reduced by 49 to 79% despite high exploitation (>90%). Maximum lifetime egg production was reduced by 77 to 89% at a moderate level of exploitation (40%), indicating the potential for recruitment overfishing. Exploitation further reduced common carp size structure, abundance, and egg production when simulations were not size selective. Our results provide insights into how common carp populations may respond to exploitation. Although commercial exploitation may be able to partially control populations, an integrated removal approach that removes all sizes of common carp has a greater chance of controlling population abundance.
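The size-selective simulations described above can be sketched with a simple age-structured cohort model. All parameter values below (growth, mortality, recruitment, starting numbers) are illustrative placeholders, not the study's fitted estimates.

```python
import math

def equilibrium_abundance(exploit, k=0.37, l_inf=650.0, nat_mort=0.05,
                          min_len=575.0, max_age=15, recruits=1000.0):
    """Equilibrium abundance of an age-structured population under
    size-selective removal: only fish longer than min_len are vulnerable
    to the commercial gear. Parameter values are illustrative only."""
    total, n = 0.0, recruits                        # constant recruitment at age 3
    for age in range(3, max_age + 1):
        total += n
        length = l_inf * (1 - math.exp(-k * age))   # von Bertalanffy growth
        f = exploit if length >= min_len else 0.0   # gear selectivity
        n *= (1 - nat_mort) * (1 - f)               # survival to next age
    return total
```

Because young, small fish escape the gear, most of the abundance decline happens at low exploitation rates, and raising exploitation above roughly 40% adds comparatively little — the qualitative pattern the simulations report.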

  2. Accurate Modeling and Analysis of Isolation Performance in Multiport Amplifiers

    Directory of Open Access Journals (Sweden)

    Marinella Aloisio

    2012-01-01

    A Multiport Amplifier (MPA) is an implementation of the satellite power amplification section that allows sharing the payload RF power among several beams/ports and guarantees a highly efficient exploitation of the available DC satellite power. This feature is of paramount importance in multiple beam satellite systems, where the use of MPAs allows reconfiguring the RF output power among the different service beams in order to handle unexpected traffic imbalances and traffic variations over time. This paper presents Monte Carlo simulations carried out by means of an ESA in-house simulator developed in the Matlab environment. The objective of the simulations is to analyse how the MPA performance, in particular in terms of isolation at the MPA output ports, is affected by the amplitude and phase tracking errors of the high power amplifiers within the MPA.

  3. Modeling the Mechanical Performance of Die Casting Dies

    Energy Technology Data Exchange (ETDEWEB)

    R. Allen Miller

    2004-02-27

    The following report covers work performed at Ohio State on modeling the mechanical performance of dies. The focus of the project was the development and, particularly, verification of finite element techniques used to model and predict displacements and stresses in die casting dies. The work entails a major case study performed with an industrial partner on a production die and laboratory experiments performed at Ohio State.

  4. Modeling the marketing strategy-performance relationship : towards an hierarchical marketing performance framework

    NARCIS (Netherlands)

    Huizingh, Eelko K.R.E.; Zengerink, Evelien

    2001-01-01

    Accurate measurement of marketing performance is an important topic for both marketing academics and marketing managers. Many researchers have recognized that marketing performance measurement should go beyond financial measurement. In this paper we propose a conceptual framework that models

  5. Developing an Energy Performance Modeling Startup Kit

    Energy Technology Data Exchange (ETDEWEB)

    Wood, A.

    2012-10-01

    In 2011, the NAHB Research Center began the first part of the multi-year effort by assessing the needs and motivations of residential remodelers regarding energy performance remodeling. The scope is multifaceted - all perspectives will be sought related to remodeling firms ranging in size from small-scale, sole proprietor to national. This will allow the Research Center to gain a deeper understanding of the remodeling and energy retrofit business and the needs of contractors when offering energy upgrade services. To determine the gaps and the motivation for energy performance remodeling, the NAHB Research Center conducted (1) an initial series of focus groups with remodelers at the 2011 International Builders' Show, (2) a second series of focus groups with remodelers at the NAHB Research Center in conjunction with the NAHB Spring Board meeting in DC, and (3) quantitative market research with remodelers based on the findings from the focus groups. The goal was threefold, to: Understand the current remodeling industry and the role of energy efficiency; Identify the gaps and barriers to adding energy efficiency into remodeling; and Quantify and prioritize the support needs of professional remodelers to increase sales and projects involving improving home energy efficiency. This report outlines all three of these tasks with remodelers.

  6. Modeling, Simulation and Performance Evaluation of Parabolic Trough

    African Journals Online (AJOL)

    Mekuannint

    Modeling, Simulation and Performance Evaluation of Parabolic Trough Solar Collector Power Generation System. Mekuannint Mesfin and Abebayehu Assefa, Department of Mechanical Engineering, Addis Ababa University. Abstract: Model of a parabolic trough power plant, taking

  7. Influence of horizontal resolution and ensemble size on model performance

    CSIR Research Space (South Africa)

    Dalton, A

    2014-10-01

    model forecast fields onto observed gridded mid-summer rainfall over South Africa. Spearman rank correlations are initially used to compare the performance of models with varying horizontal resolution as well as ensemble size. Further verification...

  8. Leakage Flow Influence on SHF pump model performances

    National Research Council Canada - National Science Library

    Dupont, Patrick; Bayeul-Lainé, Annie-Claude; Dazin, Antoine; Bois, Gérard; Roussette, Olivier; Si, Qiaorui

    2015-01-01

    This paper deals with the influence of leakage flow existing in SHF pump model on the analysis of internal flow behaviour inside the vane diffuser of the pump model performance using both experiments and calculations...

  10. High-Performance data flows using analytical models and measurements

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Nageswara S [ORNL; Towlsey, D. [University of Massachusetts; Vardoyan, G. [University of Massachusetts; Kettimuthu, R. [Argonne National Laboratory (ANL); Foster, I. [Argonne National Laboratory (ANL); Settlemyer, Bradley [Los Alamos National Laboratory (LANL)

    2016-01-01

    The combination of analytical models and measurements provides practical configurations and parameters to achieve high data transport rates: (a) buffer sizes and number of parallel streams for improved memory and file transfer rates, (b) Hamilton and Scalable TCP congestion control modules for memory transfers in place of the default CUBIC, and (c) direct IO mode for Lustre file systems for wide-area transfers. Conventional parameter selection using full sweeps is impractical in many cases since it takes months. By exploiting the unimodality of throughput profiles, we developed the d-w method, which significantly reduces the number of measurements needed for parameter identification. This heuristic method was effective in practice, reducing the measurements by about 90% for Lustre and XFS file transfers.
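The report does not spell out the d-w method here; a generic ternary search over a unimodal throughput profile illustrates how unimodality cuts the number of measurements relative to a full sweep (the function name and bounds are assumptions for illustration, not the method itself).

```python
def argmax_unimodal(f, lo, hi):
    """Locate the maximizer of a unimodal integer-parameter profile
    (e.g. throughput vs. number of parallel streams) with O(log n)
    probes instead of a full parameter sweep."""
    probes = 0
    while hi - lo > 2:
        m1 = lo + (hi - lo) // 3
        m2 = hi - (hi - lo) // 3
        probes += 2
        if f(m1) < f(m2):
            lo = m1 + 1                 # maximum lies right of m1
        else:
            hi = m2                     # maximum lies left of or at m2
    best = max(range(lo, hi + 1), key=f)
    return best, probes
```

For a 64-point parameter range this needs roughly a dozen probes rather than 64, which is the kind of saving that turns a months-long sweep into a tractable one.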

  11. Modeling and Performance Analysis of Manufacturing Systems in ...

    African Journals Online (AJOL)

    This study deals with modeling and performance analysis of footwear manufacturing using arena simulation modeling software. It was investigated that modeling and simulation is a potential tool for modeling and analysis of manufacturing assembly lines like footwear manufacturing because it allows the researcher to ...

  12. C3 or Garbage Can - Alternative Models of Organizational Performance

    OpenAIRE

    J. A. Buzacott

    1981-01-01

    Two alternative approaches for modeling the performance of organizations are discussed -- C^3 (command-control-communication) systems and the garbage can approach. Existing formal models using each approach are reviewed and some extensions and alternative models are proposed. The implications of the models are discussed, with particular emphasis on the impact of information technology developments on organizations.

  13. Metrics for evaluating performance and uncertainty of Bayesian network models

    Science.gov (United States)

    Bruce G. Marcot

    2012-01-01

    This paper presents a selected set of existing and new metrics for gauging Bayesian network model performance and uncertainty. Selected existing and new metrics are discussed for conducting model sensitivity analysis (variance reduction, entropy reduction, case file simulation); evaluating scenarios (influence analysis); depicting model complexity (numbers of model...

  14. Advanced Performance Modeling with Combined Passive and Active Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Dovrolis, Constantine [Georgia Inst. of Technology, Atlanta, GA (United States); Sim, Alex [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-04-15

    To improve the efficiency of resource utilization and scheduling of scientific data transfers on high-speed networks, the "Advanced Performance Modeling with combined passive and active monitoring" (APM) project investigates and models a general-purpose, reusable and expandable network performance estimation framework. The predictive estimation model and the framework will be helpful in optimizing the performance and utilization of networks as well as sharing resources with predictable performance for scientific collaborations, especially in data intensive applications. Our prediction model utilizes historical network performance information from various network activity logs as well as live streaming measurements from network peering devices. Historical network performance information is used without putting extra load on the resources by active measurement collection. Performance measurements collected by active probing is used judiciously for improving the accuracy of predictions.

  15. Analytic Ballistic Performance Model of Whipple Shields

    Science.gov (United States)

    Miller, J. E.; Bjorkman, M. D.; Christiansen, E. L.; Ryan, S. J.

    2015-01-01

    The dual-wall Whipple shield is the shield of choice for lightweight, long-duration flight. The shield uses an initial sacrificial wall to initiate fragmentation and melt of an impacting threat that expands over a void before hitting a subsequent shield wall protecting a critical component. The key parameters for this type of shield are the rear wall and its mass, which stops the debris; the minimum shock wave strength generated by the threat particle's impact on the sacrificial wall; and the amount of room available for expansion. Ensuring the shock wave strength is sufficiently high to achieve large-scale fragmentation/melt of the threat particle enables the expansion of the threat and reduces the momentum flux of the debris on the rear wall. Three key factors in the shock wave strength achieved are the thickness of the sacrificial wall relative to the characteristic dimension of the impacting particle, the density and material cohesion contrast of the sacrificial wall relative to the threat particle, and the impact speed. The masses of the rear wall and the sacrificial wall are desirable to minimize for launch costs, making it important to understand the effects of density contrast and impact speed. An analytic model is developed here to describe the influence of these three key factors. In addition, this paper develops a description of a fourth key parameter related to fragmentation and its role in establishing the onset of projectile expansion.

  16. Modeling, Simulation, and Performance Analysis of Decoy State Enabled Quantum Key Distribution Systems

    Directory of Open Access Journals (Sweden)

    Logan O. Mailloux

    2017-02-01

    Quantum Key Distribution (QKD) systems exploit the laws of quantum mechanics to generate secure keying material for cryptographic purposes. To date, several commercially viable decoy state enabled QKD systems have been successfully demonstrated and show promise for high-security applications such as banking, government, and military environments. In this work, a detailed performance analysis of decoy state enabled QKD systems is conducted through modeling and simulation of several common decoy state configurations. The results of this study uniquely demonstrate that the decoy state protocol can ensure Photon Number Splitting (PNS) attacks are detected with high confidence, while maximizing the system's quantum throughput at no additional cost. Additionally, implementation security guidance is provided for QKD system developers and users.
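The standard decoy-state relation for the expected gain of a weak-coherent-pulse source under a linear-loss channel can be sketched as follows; the anomaly test and its threshold are illustrative assumptions, not the specific configurations analysed in the paper.

```python
import math

def expected_gain(mu, eta, y0=1e-5):
    """Expected detection rate (gain) for mean photon number mu, channel
    transmittance eta, and dark-count yield y0, assuming a linear-loss
    channel: Q = 1 - (1 - y0) * exp(-eta * mu)."""
    return 1.0 - (1.0 - y0) * math.exp(-eta * mu)

def pns_suspected(observed_gain, mu, eta, tol=0.10):
    """Flag a photon-number-splitting-style anomaly when an observed gain
    deviates from the linear-loss expectation by more than tol (relative).
    The threshold value is an illustrative choice."""
    q = expected_gain(mu, eta)
    return abs(observed_gain - q) / q > tol
```

Comparing the observed gains of signal pulses (e.g. mu near 0.5) and decoy pulses (e.g. mu near 0.1) against these expectations is what lets the protocol detect a PNS attack, which necessarily distorts the gain/photon-number relationship.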

  17. Performance Modeling of Communication Networks with Markov Chains

    CERN Document Server

    Mo, Jeonghoon

    2010-01-01

    This book is an introduction to Markov chain modeling with applications to communication networks. It begins with a general introduction to performance modeling in Chapter 1, where we introduce different performance models. We then introduce basic ideas of Markov chain modeling: the Markov property, discrete time Markov chains (DTMC), and continuous time Markov chains (CTMC). We also discuss how to find the steady state distributions from these Markov chains and how they can be used to compute the system performance metric. The solution methodologies include a balance equation technique, limiting probab
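The steady-state computation the book describes can be sketched for a DTMC with plain power iteration (the two-state chain below is an invented toy example, not from the book):

```python
def steady_state(P, iters=500):
    """Steady-state distribution of a discrete-time Markov chain by power
    iteration: repeatedly apply pi <- pi * P until it converges."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# toy two-state on/off channel; long-run utilisation is pi[1]
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = steady_state(P)
```

For this chain the balance equation 0.1*pi[0] = 0.5*pi[1] gives pi = (5/6, 1/6), which is the kind of performance metric (e.g. long-run link utilisation) the book derives from such models.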

  18. Exploiting Redundancy in an OFDM SDR Receiver

    Directory of Open Access Journals (Sweden)

    Tomas Palenik

    2009-01-01

    A common OFDM system contains redundancy necessary to mitigate interblock interference and allows computationally effective single-tap frequency domain equalization in the receiver. Assuming the system implements an outer error correcting code and channel state information is available in the receiver, we show that it is possible to understand the cyclic prefix insertion as a weak inner ECC encoding and exploit the introduced redundancy to slightly improve the error performance of such a system. In this paper, an easy-to-implement modification to an existing SDR OFDM receiver is presented. This modification enables the utilization of prefix redundancy, while preserving full compatibility with existing OFDM-based communication standards.
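The cyclic prefix structure being exploited can be sketched as follows; treating the prefix/tail repetition as a weak inner code is the paper's idea, while the correlation metric below is a simplified illustration with no channel model.

```python
def add_cyclic_prefix(symbol, cp_len):
    """Transmitter side: copy the last cp_len samples to the front of the
    OFDM symbol (this is the redundancy the receiver can later exploit)."""
    return symbol[-cp_len:] + symbol

def cp_correlation(rx_block, cp_len):
    """Receiver side: the prefix repeats the block tail, so their
    correlation is 'free' redundancy (simplified metric, ideal channel)."""
    prefix = rx_block[:cp_len]
    tail = rx_block[-cp_len:]
    return sum(p * t.conjugate() for p, t in zip(prefix, tail))
```

A standard receiver discards the prefix after equalization; the modification described above instead feeds this repeated information into the decoding process.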

  19. Cost and Performance Assumptions for Modeling Electricity Generation Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Tidball, R.; Bluestein, J.; Rodriguez, N.; Knoke, S.

    2010-11-01

    The goal of this project was to compare and contrast utility scale power plant characteristics used in data sets that support energy market models. Characteristics include both technology cost and technology performance projections to the year 2050. Cost parameters include installed capital costs and operation and maintenance (O&M) costs. Performance parameters include plant size, heat rate, capacity factor or availability factor, and plant lifetime. Conventional, renewable, and emerging electricity generating technologies were considered. Six data sets, each associated with a different model, were selected. Two of the data sets represent modeled results, not direct model inputs. These two data sets include cost and performance improvements that result from increased deployment as well as resulting capacity factors estimated from particular model runs; other data sets represent model input data. For the technologies contained in each data set, the levelized cost of energy (LCOE) was also evaluated, according to published cost, performance, and fuel assumptions.
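A textbook levelized-cost calculation of the kind used to compare the data sets might look like this; the six data sets each use their own cost conventions, so the formulation below is a generic sketch, not the project's exact method.

```python
def lcoe(capex, fixed_om_per_yr, fuel_per_mwh, capacity_mw,
         capacity_factor, lifetime_yr, discount_rate):
    """Levelized cost of energy ($/MWh): annualized capital plus annual
    fixed O&M and fuel cost, divided by annual generation."""
    # capital recovery factor annualizes the up-front investment
    crf = (discount_rate * (1 + discount_rate) ** lifetime_yr /
           ((1 + discount_rate) ** lifetime_yr - 1))
    annual_mwh = capacity_mw * 8760 * capacity_factor
    annual_cost = capex * crf + fixed_om_per_yr + fuel_per_mwh * annual_mwh
    return annual_cost / annual_mwh
```

The sensitivity is easy to see in this form: a higher capacity factor spreads the annualized capital over more MWh, which is why the modeled-results data sets (with deployment-driven capacity factors) can yield different LCOEs from the same cost inputs.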

  20. Global climate model performance over Alaska and Greenland

    DEFF Research Database (Denmark)

    Walsh, John E.; Chapman, William L.; Romanovsky, Vladimir

    2008-01-01

    The performance of a set of 15 global climate models used in the Coupled Model Intercomparison Project is evaluated for Alaska and Greenland, and compared with the performance over broader pan-Arctic and Northern Hemisphere extratropical domains. Root-mean-square errors relative to the 1958-2000 climatology of the 40-yr ECMWF Re-Analysis (ERA-40) are summed over the seasonal cycles of three variables: surface air temperature, precipitation, and sea level pressure. The specific models that perform best over the larger domains tend to be the ones that perform best over Alaska and Greenland… precipitation and decreases of Arctic sea level pressure, when greenhouse gas concentrations are increased. Because several models have substantially smaller systematic errors than the other models, the differences in greenhouse projections imply that the choice of a subset of models may offer a viable approach…

  1. A Probabilistic Approach to Symbolic Performance Modeling of Parallel Systems

    NARCIS (Netherlands)

    Gautama, H.

    2004-01-01

    Performance modeling plays a significant role in predicting the effects of a particular design choice or in diagnosing the cause for some observed performance behavior. Especially for complex systems such as parallel computer, typically, an intended performance cannot be achieved without recourse to

  2. Common Ground Station For Imagery Processing And Exploitation

    Science.gov (United States)

    Johnston, Morris V.

    1989-02-01

    a minimum number of deployable shelters. Primary imagery inputs are from Tactical and National sources. The system provides all processing necessary to support both softcopy and hardcopy imagery exploitation. The system design is expandable, contractible, modular and segmentable. Modularity consists of packaging segments in functional units so they may be added or deleted when and where required to meet specific user requirements. Segmentability is achieved by allocating system requirements to a single functional area known as a segment. The design uses off-the-shelf hardware and software where practical and cost-effective. The system is capable of operating either in a stand-alone mode while deployed or while tethered to and supporting fixed intelligence facilities. Each segment provides all functions and capabilities needed to satisfy the performance requirements for that segment. Functional and physical interface documents define the exchange of information and data between segments and with external systems. To enhance future Pre-Planned Product Improvements (P3I), the architecture promotes technology transparency within JSIPS through bus-type structures. Evolutionary development allows for emerging sensors and platforms to be accommodated by substituting or adding a minimum number of plug-in hardware or software interface modules to an operational production model. Technology transparency allows for incorporation of new technologies as they become available.

  3. Testing a Model of Work Performance in an Academic Environment

    OpenAIRE

    B. Charles Tatum

    2012-01-01

    In modern society, people both work and study. The intersection between organizational and educational research suggests that a common model should apply to both academic and job performance. The purpose of this study was to apply a model of work and job performance (based on general expectancy theory) to a classroom setting, and test the predicted relationships using a causal/path model methodology. The findings revea...

  4. Performance and reliability model checking and model construction

    NARCIS (Netherlands)

    Hermanns, H.

    2000-01-01

    Continuous-time Markov chains (CTMCs) are widely used to describe stochastic phenomena in many diverse areas. They are used to estimate performance and reliability characteristics of various nature, for instance to quantify throughputs of manufacturing systems, to locate bottlenecks in communication

  5. A Spectral Evaluation of Models Performances in Mediterranean Oak Woodlands

    Science.gov (United States)

    Vargas, R.; Baldocchi, D. D.; Abramowitz, G.; Carrara, A.; Correia, A.; Kobayashi, H.; Papale, D.; Pearson, D.; Pereira, J.; Piao, S.; Rambal, S.; Sonnentag, O.

    2009-12-01

    Ecosystem processes are influenced by climatic trends at multiple temporal scales, including diel patterns and other mid-term climatic modes such as interannual and seasonal variability. Because interactions between biophysical components of ecosystem processes are complex, it is important to test how models perform in the frequency domain (e.g. hours, days, weeks, months, years) and the time domain (i.e. day of the year), in addition to traditional tests of annual or monthly sums. Here we present a spectral evaluation, using wavelet time series analysis, of model performance in seven Mediterranean oak woodlands that encompass three deciduous and four evergreen sites. We tested the performance of five models (CABLE, ORCHIDEE, BEPS, Biome-BGC, and JULES) on measured variables of gross primary production (GPP) and evapotranspiration (ET). In general, model performance fails at intermediate periods (e.g. weeks to months), likely because these models do not represent the water pulse dynamics that influence GPP and ET in these Mediterranean systems. To improve the performance of a model it is critical to first identify where and when the model fails. Only by identifying where a model fails can we improve its performance, use it as a prognostic tool, and generate further hypotheses that can be tested by new experiments and measurements.
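The study's wavelet analysis is beyond a short sketch, but the underlying idea of scoring a model per frequency band can be illustrated with a naive DFT; the band edges and signals below are invented for illustration and are not the study's method.

```python
import cmath

def band_power(x, lo_frac, hi_frac):
    """Signal power inside a normalized-frequency band [lo_frac, hi_frac),
    computed with a naive DFT. Comparing modelled vs. observed band powers
    shows *where in frequency* a model fails (simplified stand-in for the
    wavelet decomposition used in the study)."""
    n = len(x)
    total = 0.0
    for k in range(1, n // 2):
        if lo_frac <= k / n < hi_frac:
            c = sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))
            total += abs(c) ** 2 / n
    return total
```

A model that reproduces annual sums can still put its variance in the wrong band; evaluating band by band (days, weeks, months) exposes exactly the intermediate-period failures described above.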

  6. Intertemporal Choice of Marine Ecosystem Exploitation

    DEFF Research Database (Denmark)

    Ravn-Jonsen, Lars

    Management, however, requires models that can link the ecosystem level to the operation level, so this paper examines an ecosystem production model and shows that it is suitable for applying ground rent theory. This model is the simplest possible that incorporates the principles of size as the main… at the ecosystem level in the present management. Therefore, economic predictions for an ecosystem managed as a common pool resource must be that exploitation is probably conducted at smaller sizes than optimal. In addition, given its population stock approach, the present management probably overlooks… the ability of an ecosystem to sustain the total volume of harvest. Given the two aspects of intertemporal choice revealed by the model, the conclusion must be that Fishing Down Marine Food Webs is probably driven by the current management's inability to conduct adequate intertemporal balancing; therefore…

  7. Comparisons of Faulting-Based Pavement Performance Prediction Models

    Directory of Open Access Journals (Sweden)

    Weina Wang

    2017-01-01

    Faulting prediction is the core of concrete pavement maintenance and design. Highway agencies are often faced with low prediction accuracy, which causes costly maintenance. Although many researchers have developed performance prediction models, the accuracy of prediction has remained a challenge. This paper reviews performance prediction models and JPCP faulting models that have been used in past research. Then three models, including a multivariate nonlinear regression (MNLR) model, an artificial neural network (ANN) model, and a Markov chain (MC) model, are tested and compared using a set of actual pavement survey data taken on interstate highways with varying design features, traffic, and climate data. It is found that the MNLR model needs further recalibration, while the ANN model needs more data for training the network. The MC model seems a good tool for pavement performance prediction when data is limited, but it is based on visual inspections and not explicitly related to quantitative physical parameters. This paper then suggests that the further direction for developing performance prediction models is combining the advantages of different models to obtain better accuracy.
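A minimal Markov-chain deterioration sketch shows how the MC approach works with limited data; the three condition states and transition probabilities below are invented for illustration, not calibrated to the survey data.

```python
def evolve(P, dist, years):
    """Propagate a pavement-condition distribution through a Markov
    transition matrix, one transition per year (rows of P sum to 1)."""
    n = len(P)
    for _ in range(years):
        dist = [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]
    return dist

# illustrative 3-state deterioration chain: good -> fair -> poor,
# with "poor" as an absorbing state until maintenance intervenes
P = [[0.8, 0.2, 0.0],
     [0.0, 0.7, 0.3],
     [0.0, 0.0, 1.0]]
after5 = evolve(P, [1.0, 0.0, 0.0], 5)
```

The transition probabilities can be estimated from visual condition surveys alone, which is exactly why the MC approach is attractive when quantitative physical measurements are scarce.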

  8. Modelling of Box Type Solar Cooker Performance in a Tropical ...

    African Journals Online (AJOL)

    Thermal performance model of box type solar cooker with loaded water is presented. The model was developed using the method of Funk to estimate cooking power in terms of climatic and design parameters for box type solar cooker in a tropical environment. Coefficients for each term used in the model were determined ...

  9. Atomic scale simulations for improved CRUD and fuel performance modeling

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Anders David Ragnar [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Cooper, Michael William Donald [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-06

    A more mechanistic description of fuel performance codes can be achieved by deriving models and parameters from atomistic scale simulations rather than fitting models empirically to experimental data. The same argument applies to modeling deposition of corrosion products on fuel rods (CRUD). Here are some results from publications in 2016 carried out using the CASL allocation at LANL.

  10. Model-driven performance evaluation for service engineering

    OpenAIRE

    Pahl, Claus; Boskovic, Marko; Hasselbring, Wilhelm

    2007-01-01

    Service engineering and service-oriented architecture as an integration and platform technology is a recent approach to software systems integration. Software quality aspects such as performance are of central importance for the integration of heterogeneous, distributed service-based systems. Empirical performance evaluation is a process of measuring and calculating performance metrics of the implemented software. We present an approach for the empirical, model-based performance evaluat...

  11. Performance model for grid-connected photovoltaic inverters.

    Energy Technology Data Exchange (ETDEWEB)

    Boyson, William Earl; Galbraith, Gary M.; King, David L.; Gonzalez, Sigifredo

    2007-09-01

    This document provides an empirically based performance model for grid-connected photovoltaic inverters, used for system performance (energy) modeling and for continuous monitoring of inverter performance during system operation. The versatility and accuracy of the model were validated for a variety of both residential and commercial size inverters. Default parameters for the model can be obtained from manufacturers' specification sheets, and the accuracy of the model can be further refined using either well-instrumented field measurements in operational systems or detailed measurements from a recognized testing laboratory. An initial database of inverter performance parameters was developed based on measurements conducted at Sandia National Laboratories and at laboratories supporting the solar programs of the California Energy Commission.
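The Sandia model itself has a specific parameter set; a much-simplified empirical efficiency curve of the same flavour can be sketched as follows. The loss coefficients are illustrative placeholders, not values from the Sandia database.

```python
def inverter_ac_power(p_dc, p_ac_rated, p_self=20.0, k_lin=0.01, k_quad=1e-5):
    """AC output (W) modeled as DC input minus self-consumption, linear,
    and quadratic loss terms, clipped to the AC rating. Coefficients are
    illustrative placeholders, not Sandia database parameters."""
    p_ac = p_dc - (p_self + k_lin * p_dc + k_quad * p_dc ** 2)
    return max(0.0, min(p_ac, p_ac_rated))
```

Fitting the three loss coefficients to either spec-sheet efficiency points or field measurements is the general workflow the document describes: start from defaults, then refine against measured data.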

  12. Performance Modeling for Heterogeneous Wireless Networks with Multiservice Overflow Traffic

    DEFF Research Database (Denmark)

    Huang, Qian; Ko, King-Tim; Iversen, Villy Bæk

    2009-01-01

    Performance modeling is important for the purpose of developing efficient dimensioning tools for large complicated networks. But it is difficult to achieve in heterogeneous wireless networks, where different networks have different statistical characteristics in service and traffic models. Multiservice loss analysis based on a multi-dimensional Markov chain becomes intractable in these networks due to the intensive computations required. This paper focuses on performance modeling for heterogeneous wireless networks based on a hierarchical overlay infrastructure. A method based on decomposition of the correlated traffic is used to achieve an approximate performance modeling for multiservice in hierarchical heterogeneous wireless networks with overflow traffic. The accuracy of the approximate performance obtained by our proposed modeling is verified by simulations.
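The building block for such hierarchical overflow models is the Erlang-B loss formula; the sketch below (a standard recursion, not the paper's decomposition method) computes the traffic that spills from one network layer to the overlay above it.

```python
def erlang_b(servers, offered_erlangs):
    """Erlang-B blocking probability via the numerically stable recursion
    B(0) = 1;  B(n) = a*B(n-1) / (n + a*B(n-1))."""
    b = 1.0
    for n in range(1, servers + 1):
        b = offered_erlangs * b / (n + offered_erlangs * b)
    return b

def overflow_erlangs(servers, offered_erlangs):
    """Mean traffic blocked at this layer, i.e. the load offered to the
    next layer of the hierarchical overlay."""
    return offered_erlangs * erlang_b(servers, offered_erlangs)
```

Overflow traffic is burstier (more "peaked") than the Poisson traffic this formula assumes, which is precisely why the paper needs a decomposition method rather than applying Erlang-B naively layer by layer.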

  13. REG and GREAT, two networks to optimize Gaia scientific exploitation

    Science.gov (United States)

    Figueras, F.; Jordi, C.; Spanish Participants in Reg; Great

    2013-05-01

    The launch of the Gaia satellite by the European Space Agency is a year ahead (last quarter of 2013), and the Spanish and European communities have already put in place two networks devoted to the preparation of the scientific exploitation of the data acquired by the satellite: GREAT (Gaia Research for European Astronomy Training), funded by the European Science Foundation and by the Marie Curie Actions of its People 7th Programme, and REG (Spanish Network for the scientific exploitation of Gaia), funded by MINECO. These networks, which are open to the international community, have adopted the challenges of the Gaia mission: to revolutionize our understanding of the Milky Way and its components, trace the distribution of dark matter in the local universe, validate and improve models of stellar structure and evolution, characterize solar system objects, ... and many more. Both networks promote close interaction among researchers of different institutes by supporting short and long exchange visits, workshops, schools and large conferences. Currently, 128 Spanish researchers actively participate in the several working groups of GREAT and REG, and two students are pursuing their PhDs in the framework of the GREAT-ITN Spanish node. This paper provides detailed information about the structure of these networks, the Spanish participation, and present and future tasks that are foreseen.

  14. Impact of reactive settler models on simulated WWTP performance

    DEFF Research Database (Denmark)

    Gernaey, Krist; Jeppsson, Ulf; Batstone, Damien J.

    2006-01-01

    for an ASM1 case study. Simulations with a whole plant model including the non-reactive Takacs settler model are used as a reference, and are compared to simulation results considering two reactive settler models. The first is a return sludge model block removing oxygen and a user-defined fraction of nitrate, combined with a non-reactive Takacs settler. The second is a fully reactive ASM1 Takacs settler model. Simulations with the ASM1 reactive settler model predicted a 15.3% and 7.4% improvement of the simulated N removal performance for constant (steady-state) and dynamic influent conditions, respectively. The oxygen/nitrate return sludge model block predicts a 10% improvement of N removal performance under dynamic conditions, and might be the better modelling option for ASM1 plants: it is computationally more efficient and it will not overrate the importance of decay processes in the settler.

  15. Towards an Accurate Performance Modeling of Parallel Sparse Factorization

    Energy Technology Data Exchange (ETDEWEB)

    Grigori, Laura; Li, Xiaoye S.

    2006-05-26

    We present a performance model to analyze a parallel sparse LU factorization algorithm on modern cache-based, high-end parallel architectures. Our model characterizes the algorithmic behavior by taking into account the underlying processor speed, memory system performance, as well as the interconnect speed. The model is validated using the SuperLU_DIST linear system solver, sparse matrices from real applications, and an IBM POWER3 parallel machine. Our modeling methodology can be easily adapted to study the performance of other types of sparse factorizations, such as Cholesky or QR.

  16. Channel modeling and performance evaluation of FSO communication systems in fog

    KAUST Repository

    Esmail, Maged Abdullah

    2016-07-01

    Free space optical (FSO) communication has attracted increasing interest during the last decade. It has unregulated spectrum with a huge capacity compared to its radio frequency (RF) counterpart. Although FSO has many applications that cover indoor and outdoor environments, its widespread deployment is hampered by weather effects. Fog is classified as an extreme weather impairment that may cause link drops. Foggy channel modeling and characterization are necessary to analyze system performance. In this paper, we first address the statistical behavior of the foggy channel based on a set of experimental data from the literature and develop a probability distribution function (PDF) model for fog attenuation. We then exploit our PDF model to derive closed form expressions and evaluate the system performance theoretically and numerically, in terms of average signal-to-noise ratio (SNR) and outage probability. The results show that for 10^-3 outage probability and 22 dBm transmitted power, the FSO system can work over 80 m, 160 m, 310 m, and 460 m link lengths under dense, thick, moderate, and light fog, respectively. Increasing the transmitted power has high impact when the fog density is low. However, under very dense fog, it has almost no effect. © 2016 IEEE.
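The paper fits its own attenuation PDF to measured data; as an illustrative stand-in, the classic Kruse visibility model gives the flavour of how fog density limits link length. The function names and the simple margin-based link budget below are assumptions for illustration, not the paper's closed-form expressions.

```python
def fog_attenuation_db_per_km(visibility_km, wavelength_nm=1550.0):
    """Kruse model: specific fog attenuation (dB/km) from visibility, with
    the particle size-distribution exponent q chosen piecewise by visibility."""
    if visibility_km > 50.0:
        q = 1.6
    elif visibility_km > 6.0:
        q = 1.3
    else:
        q = 0.585 * visibility_km ** (1.0 / 3.0)
    return (3.91 / visibility_km) * (wavelength_nm / 550.0) ** (-q)

def max_link_m(p_tx_dbm, p_rx_min_dbm, visibility_km):
    """Longest link whose total fog loss fits inside the link margin.
    No geometric, pointing, or receiver losses are modelled, so this is a
    deliberately optimistic sketch compared to the paper's analysis."""
    margin_db = p_tx_dbm - p_rx_min_dbm
    return 1000.0 * margin_db / fog_attenuation_db_per_km(visibility_km)
```

The qualitative behaviour matches the paper's conclusion: in dense fog (very low visibility) the attenuation in dB/km is so large that extra transmit power buys almost no additional reach.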

  17. Emerging Carbon Nanotube Electronic Circuits, Modeling, and Performance

    OpenAIRE

    Yao Xu; Ashok Srivastava; Sharma, Ashwani K.

    2010-01-01

    Current transport and dynamic models of carbon nanotube field-effect transistors are presented. A model of single-walled carbon nanotube as interconnect is also presented and extended in modeling of single-walled carbon nanotube bundles. These models are applied in studying the performances of circuits such as the complementary carbon nanotube inverter pair and carbon nanotube as interconnect. Cadence/Spectre simulations show that carbon nanotube field-effect transistor circuits can operate a...

  18. Exploiting the Nephrotoxic Effects of Venom from the Sea Anemone, Phyllodiscus semoni, to Create a Hemolytic Uremic Syndrome Model in the Rat

    Directory of Open Access Journals (Sweden)

    B. Paul Morgan

    2012-07-01

    Full Text Available In the natural world, there are many creatures with venoms that have interesting and varied activities. Although the sea anemone, a member of the phylum Coelenterata, has venom that it uses to capture and immobilise small fishes and shrimp and for protection from predators, most sea anemones are harmless to man. However, a few species are highly toxic; some have venoms containing neurotoxins, recently suggested as potential immune-modulators for therapeutic application in immune diseases. Phyllodiscus semoni is a highly toxic sea anemone; the venom has multiple effects, including lethality, hemolysis and renal injuries. We previously reported that venom extracted from Phyllodiscus semoni induced acute glomerular endothelial injuries in rats resembling hemolytic uremic syndrome (HUS, accompanied with complement dysregulation in glomeruli and suggested that the model might be useful for analyses of pathology and development of therapeutic approaches in HUS. In this mini-review, we describe in detail the venom-induced acute renal injuries in rat and summarize how the venom of Phyllodiscus semoni could have potential as a tool for analyses of complement activation and therapeutic interventions in HUS.

  19. Exploiting the nephrotoxic effects of venom from the sea anemone, Phyllodiscus semoni, to create a hemolytic uremic syndrome model in the rat.

    Science.gov (United States)

    Mizuno, Masashi; Ito, Yasuhiko; Morgan, B Paul

    2012-07-01

    In the natural world, there are many creatures with venoms that have interesting and varied activities. Although the sea anemone, a member of the phylum Coelenterata, has venom that it uses to capture and immobilise small fishes and shrimp and for protection from predators, most sea anemones are harmless to man. However, a few species are highly toxic; some have venoms containing neurotoxins, recently suggested as potential immune-modulators for therapeutic application in immune diseases. Phyllodiscus semoni is a highly toxic sea anemone; the venom has multiple effects, including lethality, hemolysis and renal injuries. We previously reported that venom extracted from Phyllodiscus semoni induced acute glomerular endothelial injuries in rats resembling hemolytic uremic syndrome (HUS), accompanied with complement dysregulation in glomeruli and suggested that the model might be useful for analyses of pathology and development of therapeutic approaches in HUS. In this mini-review, we describe in detail the venom-induced acute renal injuries in rat and summarize how the venom of Phyllodiscus semoni could have potential as a tool for analyses of complement activation and therapeutic interventions in HUS.

  20. Planetary Suit Hip Bearing Model for Predicting Design vs. Performance

    Science.gov (United States)

    Cowley, Matthew S.; Margerum, Sarah; Harvil, Lauren; Rajulu, Sudhakar

    2011-01-01

    Designing a planetary suit is very complex and often requires difficult trade-offs between performance, cost, mass, and system complexity. In order to verify that new suit designs meet requirements, full prototypes must eventually be built and tested with human subjects. Using computer models early in the design phase of new hardware development can be advantageous, allowing virtual prototyping to take place. Having easily modifiable models of the suit hard sections may reduce the time it takes to make changes to the hardware designs and then to understand their impact on suit and human performance. A virtual design environment gives designers the ability to think outside the box and exhaust design possibilities before building and testing physical prototypes with human subjects. Reductions in prototyping and testing may eventually reduce development costs. This study is an attempt to develop computer models of the hard components of the suit with known physical characteristics, supplemented with human subject performance data. Objectives: The primary objective was to develop an articulating solid model of the Mark III hip bearings to be used for evaluating suit design performance of the hip joint. Methods: Solid models of a planetary prototype (Mark III) suit's hip bearings and brief section were reverse-engineered from the prototype. The performance of the models was then compared by evaluating the mobility performance differences between the nominal hardware configuration and hardware modifications. This was accomplished by gathering data from specific suited tasks. Subjects performed maximum flexion and abduction tasks while in a nominal suit bearing configuration and in three off-nominal configurations. Performance data for the hip were recorded using state-of-the-art motion capture technology. Results: The results demonstrate that using solid models of planetary suit hard segments as a performance design tool is feasible. 
From a general trend perspective

  1. Simultaneously Exploiting Two Formulations: an Exact Benders Decomposition Approach

    DEFF Research Database (Denmark)

    Lusby, Richard Martin; Gamst, Mette; Spoorendonk, Simon

    When modelling a given problem using linear programming techniques, several possibilities often exist, and each results in a different mathematical formulation of the problem. Usually, advantages and disadvantages can be identified in any single formulation. In this paper we consider mixed integer linear programs and propose an approach based on Benders decomposition to exploit the advantages of two different formulations when solving a problem. We propose to apply Benders decomposition to a combined formulation, comprised of two separate formulations, augmented with linking constraints to ensure... Furthermore, the paper proves the correctness of the procedure and considers how to include interesting extensions such as cutting planes and advanced branching strategies. Finally, we test and compare the performance of the proposed approach on publicly available instances of the Bin Packing problem. Compared...

  2. Comparison of performance of simulation models for floor heating

    DEFF Research Database (Denmark)

    Weitzmann, Peter; Svendsen, Svend

    2005-01-01

    This paper describes the comparison of performance of simulation models for floor heating with different levels of detail in the modelling process. The models are compared in an otherwise identical simulation model containing room model, walls, windows, ceiling and ventilation system. By exchanging only the floor construction, the differences can be directly compared. In this comparison, a two-dimensional model of a slab-on-grade floor including foundation is used as reference. The other models include a one-dimensional model and a thermal network model including the linear thermal transmittance of the foundation. ... The result can also be found in the energy consumption of the building, since up to half the energy consumption is lost through the ground. Looking at the different implementations it is also found that including a 1 m ground volume below the floor construction under a one-dimensional model...
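
A one-dimensional floor model of the kind compared above reduces to a series thermal network; the sketch below uses an assumed layer build-up and conductivities, not the paper's reference construction:

```python
# Minimal sketch of a one-dimensional thermal-network floor model of the
# kind compared in the paper. Layer build-up and conductivities are
# assumed for illustration, not taken from the reference construction.

def series_u_value(layers):
    """U-value (W/m2K) of layers given as (thickness_m, conductivity_W_mK),
    with standard surface resistances for downward heat flow added."""
    r_si, r_se = 0.17, 0.04  # ISO 6946 surface resistances, downward flow
    r_total = r_si + r_se + sum(t / k for t, k in layers)
    return 1.0 / r_total

floor = [(0.08, 1.7),    # concrete screed
         (0.20, 0.036),  # insulation
         (0.15, 2.0)]    # slab
u = series_u_value(floor)
q = u * 20.0  # heat flux at 20 K indoor-to-ground temperature difference
print(f"U = {u:.3f} W/m2K, q = {q:.1f} W/m2")
```

The two-dimensional reference model in the paper adds what such a series network misses: the extra heat path through the foundation edge, captured above only via a linear thermal transmittance term.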

  3. Models used to assess the performance of photovoltaic systems.

    Energy Technology Data Exchange (ETDEWEB)

    Stein, Joshua S.; Klise, Geoffrey T.

    2009-12-01

    This report documents the various photovoltaic (PV) performance models and software developed and utilized by researchers at Sandia National Laboratories (SNL) in support of the Photovoltaics and Grid Integration Department. In addition to PV performance models, hybrid system and battery storage models are discussed. A hybrid system using other distributed sources and energy storage can help reduce the variability inherent in PV generation, and due to the complexity of combining multiple generation sources and system loads, these models are invaluable for system design and optimization. Energy storage plays an important role in reducing PV intermittency and battery storage models are used to understand the best configurations and technologies to store PV generated electricity. Other researchers' models used by SNL are discussed including some widely known models that incorporate algorithms developed at SNL. There are other models included in the discussion that are not used by or were not adopted from SNL research but may provide some benefit to researchers working on PV array performance, hybrid system models and energy storage. The paper is organized into three sections to describe the different software models as applied to photovoltaic performance, hybrid systems, and battery storage. For each model, there is a description which includes where to find the model, whether it is currently maintained and any references that may be available. Modeling improvements underway at SNL include quantifying the uncertainty of individual system components, the overall uncertainty in modeled vs. measured results and modeling large PV systems. SNL is also conducting research into the overall reliability of PV systems.
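
At its simplest, a PV performance model of the kind surveyed scales rated power with irradiance and derates it with cell temperature. The PVWatts-style sketch below uses typical assumed coefficients and is not SNL's full Sandia Array Performance Model:

```python
# A PVWatts-style back-of-envelope model (not SNL's full Sandia Array
# Performance Model): DC power scales with plane-of-array irradiance and
# derates with cell temperature. Coefficients are typical assumed values.

def pv_dc_power(g_poa, t_cell, p_stc=5000.0, gamma=-0.004):
    """DC power (W): g_poa in W/m2, t_cell in deg C, gamma per deg C."""
    return p_stc * (g_poa / 1000.0) * (1.0 + gamma * (t_cell - 25.0))

print(pv_dc_power(800.0, 45.0))  # warm, partly bright conditions
```

The full models in the report replace these two scalars with measured module coefficients, spectral and angle-of-incidence corrections, and inverter efficiency curves.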

  4. Rationalising predictors of child sexual exploitation and sex-trading.

    Science.gov (United States)

    Klatt, Thimna; Cavner, Della; Egan, Vincent

    2014-02-01

    Although there is evidence for specific risk factors leading to child sexual exploitation and prostitution, these influences overlap and have rarely been examined concurrently. The present study examined case files for 175 young persons who attended a voluntary organization in Leicester, United Kingdom, which supports people who are sexually exploited or at risk of sexual exploitation. Based on the case files, the presence or absence of known risk factors for becoming a sex worker was coded. Data were analyzed using t-test, logistic regression, and smallest space analysis. Users of the voluntary organization's services who had been sexually exploited exhibited a significantly greater number of risk factors than service users who had not been victims of sexual exploitation. The logistic regression produced a significant model fit. However, of the 14 potential predictors--many of which were associated with each other--only four variables significantly predicted actual sexual exploitation: running away, poverty, drug and/or alcohol use, and having friends or family members in prostitution. Surprisingly, running away was found to significantly decrease the odds of becoming involved in sexual exploitation. Smallest space analysis of the data revealed 5 clusters of risk factors. Two of the clusters, which reflected a desperation and need construct and immature or out-of-control lifestyles, were significantly associated with sexual exploitation. Our research suggests that some risk factors (e.g. physical and emotional abuse, early delinquency, and homelessness) for becoming involved in sexual exploitation are common but are part of the problematic milieu of the individuals affected and not directly associated with sex trading itself. Our results also indicate that it is important to engage with the families and associates of young persons at risk of becoming (or remaining) a sex worker if one wants to reduce the number of persons who engage in this activity.

  5. Implementation of new pavement performance prediction models in PMIS : report

    Science.gov (United States)

    2012-08-01

    Pavement performance prediction models and maintenance and rehabilitation (M&R) optimization processes enable managers and engineers to plan and prioritize pavement M&R activities in a cost-effective manner. This report describes TxDOT's effort...

  6. Practical Techniques for Modeling Gas Turbine Engine Performance

    Science.gov (United States)

    Chapman, Jeffryes W.; Lavelle, Thomas M.; Litt, Jonathan S.

    2016-01-01

    The cost and risk associated with the design and operation of gas turbine engine systems has led to an increasing dependence on mathematical models. In this paper, the fundamentals of engine simulation will be reviewed, an example performance analysis will be performed, and relationships useful for engine control system development will be highlighted. The focus will be on thermodynamic modeling utilizing techniques common in industry, such as: the Brayton cycle, component performance maps, map scaling, and design point criteria generation. In general, these topics will be viewed from the standpoint of an example turbojet engine model; however, demonstrated concepts may be adapted to other gas turbine systems, such as gas generators, marine engines, or high bypass aircraft engines. The purpose of this paper is to provide an example of gas turbine model generation and system performance analysis for educational uses, such as curriculum creation or student reference.
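
The Brayton-cycle starting point mentioned above gives a one-line efficiency relation. A cold-air-standard sketch (constant gamma = 1.4, no component losses, which real engine models add via component maps):

```python
# Ideal Brayton-cycle thermal efficiency under cold-air-standard
# assumptions: eta = 1 - r^(-(gamma-1)/gamma), r = compressor pressure
# ratio. Real gas turbine models layer component efficiencies and maps
# on top of this relation.

def brayton_thermal_efficiency(pressure_ratio, gamma=1.4):
    return 1.0 - pressure_ratio ** (-(gamma - 1.0) / gamma)

for pr in (10, 20, 30):
    print(pr, f"{brayton_thermal_efficiency(pr):.3f}")
```

The monotone rise of efficiency with pressure ratio is one of the design-point criteria such models trade off against turbine inlet temperature limits.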

  7. Integrated Main Propulsion System Performance Reconstruction Process/Models

    Science.gov (United States)

    Lopez, Eduardo; Elliott, Katie; Snell, Steven; Evans, Michael

    2013-01-01

    The Integrated Main Propulsion System (MPS) Performance Reconstruction process provides the MPS post-flight data files needed for postflight reporting to the project integration management and key customers to verify flight performance. This process/model was used as the baseline for the currently ongoing Space Launch System (SLS) work. The process utilizes several methodologies, including multiple software programs, to model integrated propulsion system performance through space shuttle ascent. It is used to evaluate integrated propulsion systems, including propellant tanks, feed systems, rocket engine, and pressurization systems performance throughout ascent based on flight pressure and temperature data. The latest revision incorporates new methods based on main engine power balance model updates to model higher mixture ratio operation at lower engine power levels.

  8. Performance evaluation:= (process algebra + model checking) x Markov chains

    NARCIS (Netherlands)

    Hermanns, H.; Larsen, K.G.; Nielsen, Mogens; Katoen, Joost P.

    2001-01-01

    Markov chains are widely used in practice to determine system performance and reliability characteristics. The vast majority of applications considers continuous-time Markov chains (CTMCs). This tutorial paper shows how successful model specification and analysis techniques from concurrency theory
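
The CTMC analysis underlying such performance evaluation reduces to solving pi Q = 0 with pi summing to one. A minimal sketch, using an arbitrary 3-state generator matrix invented for illustration:

```python
# Minimal example of the CTMC analysis the tutorial builds on: solve
# pi Q = 0 subject to sum(pi) = 1 for a small generator matrix Q.
# The 3-state Q below is an arbitrary illustrative example (rows sum to 0).
import numpy as np

Q = np.array([[-3.0, 2.0, 1.0],
              [1.0, -4.0, 3.0],
              [2.0, 2.0, -4.0]])

# Replace one (redundant) balance equation with the normalization constraint.
A = np.vstack([Q.T[:-1], np.ones(3)])
b = np.array([0.0, 0.0, 1.0])
pi = np.linalg.solve(A, b)
print("steady-state distribution:", pi)
```

From the steady-state distribution, performance measures such as utilization or expected reward follow as weighted sums over states.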

  9. Automated UAV-based video exploitation using service oriented architecture framework

    Science.gov (United States)

    Se, Stephen; Nadeau, Christian; Wood, Scott

    2011-05-01

    Airborne surveillance and reconnaissance are essential for successful military missions. Such capabilities are critical for troop protection, situational awareness, mission planning, damage assessment, and others. Unmanned Aerial Vehicles (UAVs) gather huge amounts of video data but it is extremely labour-intensive for operators to analyze hours and hours of received data. At MDA, we have developed a suite of tools that can process the UAV video data automatically, including mosaicking, change detection and 3D reconstruction, which have been integrated within a standard GIS framework. In addition, the mosaicking and 3D reconstruction tools have also been integrated in a Service Oriented Architecture (SOA) framework. The Visualization and Exploitation Workstation (VIEW) integrates 2D and 3D visualization, processing, and analysis capabilities developed for UAV video exploitation. Visualization capabilities are supported through a thick-client Graphical User Interface (GUI), which allows visualization of 2D imagery, video, and 3D models. The GUI interacts with the VIEW server, which provides video mosaicking and 3D reconstruction exploitation services through the SOA framework. The SOA framework allows multiple users to perform video exploitation by running a GUI client on the operator's computer and invoking the video exploitation functionalities residing on the server. This allows the exploitation services to be upgraded easily and allows the intensive video processing to run on powerful workstations. MDA provides UAV services to the Canadian and Australian forces in Afghanistan with the Heron, a Medium Altitude Long Endurance (MALE) UAV system. On-going flight operations service provides important intelligence, surveillance, and reconnaissance information to commanders and front-line soldiers.

  10. Dynamic vehicle model for handling performance using experimental data

    Directory of Open Access Journals (Sweden)

    SangDo Na

    2015-11-01

    Full Text Available An analytical vehicle model is essential for the development of vehicle design and performance. Various vehicle models have different complexities, assumptions and limitations depending on the type of vehicle analysis. An accurate full vehicle model is essential in representing the behaviour of the vehicle in order to estimate vehicle dynamic system performance such as ride comfort and handling. An experimental vehicle model is developed in this article, which employs experimental kinematic and compliance data measured between the wheel and chassis. From these data, a vehicle model, which includes dynamic effects due to vehicle geometry changes, has been developed. The experimental vehicle model was validated using an instrumented experimental vehicle and data such as a step change steering input. This article shows a process to develop and validate an experimental vehicle model to enhance the accuracy of handling performance prediction, based on a precise suspension model identified from experimental vehicle data. The experimental force data obtained from a suspension parameter measuring device are employed for precise modelling of the steering and handling response. The steering system is modelled by a lumped model, with stiffness coefficients defined and identified by comparing steering stiffness with the measured data. The outputs, specifically the yaw rate and lateral acceleration of the vehicle, are verified by experimental results.
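
The steady-state handling behaviour such a model predicts can be illustrated with the classical linear two-degree-of-freedom bicycle model; all vehicle parameters below are assumed for illustration, not those of the instrumented vehicle:

```python
# Linear 2-DOF "bicycle" handling model sketch (assumed parameters):
# steady-state yaw-rate gain r/delta = V / (L + K_us * V^2), with
# understeer gradient K_us = (m/L) * (lr/Cf - lf/Cr).

def yaw_rate_gain(v, m=1500.0, lf=1.2, lr=1.6, cf=80000.0, cr=80000.0):
    """Steady-state yaw rate per radian of road-wheel steer at speed v (m/s).

    m: mass (kg); lf, lr: CG-to-axle distances (m);
    cf, cr: front/rear axle cornering stiffness (N/rad).
    """
    L = lf + lr
    k_us = (m / L) * (lr / cf - lf / cr)  # understeer gradient
    return v / (L + k_us * v * v)

for v in (10.0, 20.0, 30.0):
    print(v, f"{yaw_rate_gain(v):.3f}")
```

The experimental model in the article refines exactly the quantities this sketch assumes constant: the effective cornering stiffnesses and steering compliance, identified from measured kinematic and compliance data.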

  11. Performance of parallel computers for spectral atmospheric models

    Energy Technology Data Exchange (ETDEWEB)

    Foster, I.T.; Toonen, B. [Argonne National Lab., IL (United States); Worley, P.H. [Oak Ridge National Lab., TN (United States)

    1995-06-01

    Massively parallel processing (MPP) computer systems use high-speed interconnection networks to link hundreds or thousands of RISC microprocessors. With each microprocessor having a peak performance of 100 Mflops/sec or more, there is at least the possibility of achieving very high performance. However, the question of exactly how to achieve this performance remains unanswered. MPP systems and vector multiprocessors require very different coding styles. Different MPP systems have widely varying architectures and performance characteristics. For most problems, a range of different parallel algorithms is possible, again with varying performance characteristics. In this paper, we provide a detailed, fair evaluation of MPP performance for a weather and climate modeling application. Using a specially designed spectral transform code, we study performance on three different MPP systems: Intel Paragon, IBM SP2, and Cray T3D. We take great care to control for performance differences due to varying algorithmic characteristics. The results yield insights into MPP performance characteristics, parallel spectral transform algorithms, and coding style for MPP systems. We conclude that it is possible to construct parallel models that achieve multi-Gflop/sec performance on a range of MPPs if the models are constructed to allow run-time selection of appropriate algorithms.
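
The trade-off the study quantifies, in which communication cost eventually erodes parallel speedup, can be sketched with a simple fixed-overhead scaling model; all coefficients below are invented:

```python
# Simple scaling model in the spirit of the study: time on p processors =
# serial part + parallel part / p + communication that grows with p.
# All coefficients are assumed for illustration, not measured values.

def model_time(p, t_serial=2.0, t_parallel=960.0, t_comm_per_p=0.01):
    return t_serial + t_parallel / p + t_comm_per_p * p

def speedup(p):
    return model_time(1) / model_time(p)

for p in (1, 64, 256, 1024):
    print(p, f"{speedup(p):.1f}x")
```

The roll-off at large p is the effect that motivates run-time selection among parallel spectral transform algorithms with different communication patterns.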

  12. Performance of Air Pollution Models on Massively Parallel Computers

    DEFF Research Database (Denmark)

    Brown, John; Hansen, Per Christian; Wasniewski, Jerzy

    1996-01-01

    To compare the performance and use of three massively parallel SIMD computers, we implemented a large air pollution model on the computers. Using a realistic large-scale model, we gain detailed insight about the performance of the three computers when used to solve large-scale scientific problems...... that involve several types of numerical computations. The computers considered in our study are the Connection Machines CM-200 and CM-5, and the MasPar MP-2216...

  13. Architecture and Programming Models for High Performance Intensive Computation

    Science.gov (United States)

    2016-06-29

    AFRL-AFOSR-VA-TR-2016-0230, "Architecture and Programming Models for High Performance Intensive Computation", XiaoMing Li, University of Delaware; final report under grant FA9550-13-1-0213. The project focused on developing an efficient system architecture and software tools for building and running Dynamic Data Driven Application Systems (DDDAS). The foremost

  14. Model of service-oriented catering supply chain performance evaluation

    OpenAIRE

    Juanqiong Gou; Guguan Shen; Rui Chai

    2013-01-01

    Purpose: The aim of this paper is to construct a performance evaluation model for a service-oriented catering supply chain. Design/methodology/approach: Based on research into the current situation of the catering industry, this paper summarizes the characteristics of the catering supply chain and then presents a service-oriented catering supply chain model based on a logistics and information platform. Finally, the fuzzy AHP method is used to evaluate the performance of the service-oriented catering ...
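
The weighting step behind crisp AHP, which the paper's fuzzy AHP extends, derives criterion weights from the principal eigenvector of a pairwise comparison matrix. The 3x3 matrix below is an invented example:

```python
# Crisp-AHP sketch (the paper uses fuzzy AHP, which extends this idea):
# derive criterion weights as the principal eigenvector of a pairwise
# comparison matrix. The 3x3 matrix below is an invented example.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w = w / w.sum()  # normalized priority weights
print("weights:", w)

# Consistency index; CR = CI / RI with RI = 0.58 for n = 3 (Saaty)
ci = (vals.real[k] - 3) / 2
print("CR:", ci / 0.58)
```

A consistency ratio below 0.1 is the usual acceptance threshold; fuzzy AHP replaces the crisp entries of A with fuzzy numbers to express judgment uncertainty.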

  15. A Mathematical Model to Improve the Performance of Logistics Network

    Directory of Open Access Journals (Sweden)

    Muhammad Izman Herdiansyah

    2012-01-01

    Full Text Available The role of logistics nowadays is expanding from just providing transportation and warehousing to offering total integrated logistics. To remain competitive in the global market environment, business enterprises need to improve their logistics operations performance. The improvement will be achieved when we can provide a comprehensive analysis and optimize its network performance. In this paper, a mixed integer linear model for optimizing logistics network performance is developed. It provides a single-product multi-period multi-facility model, as well as the multi-product concept. The problem is modelled in the form of a network flow problem with the main objective of minimizing total logistics cost. The problem can be solved using a commercial linear programming package like CPLEX or LINDO. Even in small cases, the solver in Excel may also be used to solve such a model. Keywords: logistics network, integrated model, mathematical programming, network optimization
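
A tiny instance of such a network-flow formulation, one product shipped from two plants to two customers with invented data, can be solved directly with an LP solver:

```python
# Tiny instance of the kind of network-flow LP the paper formulates:
# ship one product from 2 plants to 2 customers at minimum cost.
# Data are invented; the paper's model adds periods and facilities.
from scipy.optimize import linprog

# decision vars: x[plant, customer], flattened as [x00, x01, x10, x11]
cost = [4.0, 6.0, 5.0, 3.0]
# supply limits (<=) at the two plants
A_ub = [[1, 1, 0, 0],
        [0, 0, 1, 1]]
b_ub = [60, 50]
# demand (==) at the two customers
A_eq = [[1, 0, 1, 0],
        [0, 1, 0, 1]]
b_eq = [45, 55]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq)
print("total cost:", res.fun, "flows:", res.x)
```

Adding binary facility-opening variables to such a flow LP is what turns it into the mixed integer model the paper describes.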

  16. Exploiting for medical and biological applications

    Science.gov (United States)

    Giano, Michael C.

    Biotherapeutics are an emerging class of drug composed of molecules ranging in size from peptides to large proteins. Due to their poor stability and mucosal membrane permeability, biotherapeutics are administered by a parenteral method (i.e., syringe, intravenous or intramuscular). Therapeutics delivered systemically often experience short half-lives, while local administration may involve invasive surgical procedures and suffer from poor retention at the site of application. To compensate, the patient receives frequent doses of highly concentrated therapeutic. Unfortunately, the off-target side effects and discomfort associated with multiple injections result in poor patient compliance. Therefore, new delivery methods which can improve therapeutic retention, reduce the frequency of administration and aid in decreasing off-target side effects are a necessity. Hydrogels are a class of biomaterials that are gaining interest for tissue engineering and drug delivery applications. Hydrogel materials are defined as porous, 3-dimensional networks that are primarily composed of water. Generally, they are mechanically rigid, cytocompatible and easily chemically functionalized. Collectively, these properties make hydrogels fantastic candidates to perform as drug delivery depots. Current hydrogel delivery systems physically entrap the target therapeutic, which is then subsequently released over time at the site of administration. The swelling and degradation of the material affect the diffusion of the therapy from the hydrogel, and therefore should be controlled. Although these strategies provide some regulation over therapeutic release, full control of the delivery is not achieved. Newer approaches are focused on designing hydrogels that exploit known interactions, covalently attach the therapy or respond to an external stimulus in an effort to gain improved control over the therapy's release. 
Unfortunately, the biotherapeutic is typically required to be chemically

  17. Port performance evaluation tool based on microsimulation model

    Directory of Open Access Journals (Sweden)

    Tsavalista Burhani Jzolanda

    2017-01-01

    Full Text Available As port performance is becoming correlated with national competitiveness, the issue of port performance evaluation has been raised significantly. Port performance can simply be indicated by port service levels to the ship (e.g., throughput, waiting time for berthing, etc.), as well as the utilization level of equipment and facilities within a certain period. The performance evaluation can then be used as a tool to develop related policies for improving the port's performance to be more effective and efficient. However, the evaluation is frequently conducted based on a deterministic approach, which hardly captures the natural variations of port parameters. Therefore, this paper presents a stochastic microsimulation model for investigating the impacts of port parameter variations on port performance. The variations are derived from actual data in order to provide more realistic results. The model is further developed using MATLAB and Simulink based on queuing theory.
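
The core of such a stochastic microsimulation is a queue of ships competing for a berth. A minimal single-berth sketch with invented arrival and service rates (the paper's model is built in MATLAB/Simulink and calibrated to actual data):

```python
# Minimal stochastic microsimulation in the spirit of the paper: ships
# arrive at random and wait for a single berth; we estimate the mean
# waiting time. Rates are invented illustrative values.
import random

def simulate_port(n_ships=20000, arrival_rate=0.8, service_rate=1.0, seed=1):
    rng = random.Random(seed)
    t_arrival = 0.0   # arrival time of the current ship
    berth_free = 0.0  # time at which the berth next becomes free
    total_wait = 0.0
    for _ in range(n_ships):
        t_arrival += rng.expovariate(arrival_rate)
        start = max(t_arrival, berth_free)
        total_wait += start - t_arrival
        berth_free = start + rng.expovariate(service_rate)
    return total_wait / n_ships

w = simulate_port()
print(f"mean waiting time: {w:.2f} (M/M/1 theory: {0.8 / (1.0 - 0.8):.2f})")
```

With exponential arrivals and service this reproduces the M/M/1 result; the value of the microsimulation is that it accepts empirical distributions where the closed-form theory does not apply.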

  18. Protocol to Exploit Waiting Resources for UASNs

    Directory of Open Access Journals (Sweden)

    Li-Ling Hung

    2016-03-01

    Full Text Available The transmission speed of acoustic waves in water is much slower than that of radio waves in terrestrial wireless sensor networks; thus, the propagation delay in underwater acoustic sensor networks (UASNs) is much greater. Longer propagation delays lead to complicated communication and collision problems. To solve collision problems, some studies have proposed waiting mechanisms; however, long waits result in low bandwidth utilization. To improve throughput, this study proposes a slotted medium access control protocol to enhance bandwidth utilization in UASNs. The proposed mechanism increases communication by exploiting temporal and spatial resources that are typically idle, in order to protect communication against interference. By reducing wait time, network performance and energy consumption can be improved. A performance evaluation demonstrates that when the data packets are large or sensor deployment is dense, the energy consumption of the proposed protocol is less than that of existing protocols, and its throughput is higher.
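
The scale of the waiting problem is easy to quantify: sound travels roughly 1.5 km/s in water versus 3x10^8 m/s for radio, so worst-case propagation dominates any collision-free slot. A back-of-envelope sketch with illustrative numbers (not the protocol's actual slot design):

```python
# Back-of-envelope numbers behind the waiting problem described above:
# acoustic propagation is ~5 orders of magnitude slower than radio, so
# guard times dominate a slotted schedule. Values are illustrative.

SOUND_IN_WATER = 1500.0  # m/s (nominal)
LIGHT_IN_AIR = 3.0e8     # m/s

def slot_length(range_m, packet_bits, bitrate_bps, speed=SOUND_IN_WATER):
    """One collision-free slot: transmit time + worst-case propagation."""
    return packet_bits / bitrate_bps + range_m / speed

acoustic = slot_length(1000, 2048, 10e3)
radio = slot_length(1000, 2048, 10e3, speed=LIGHT_IN_AIR)
print(f"acoustic slot: {acoustic:.3f} s, radio slot: {radio:.6f} s")
```

At a 1 km range the propagation guard alone exceeds the transmit time of the packet, which is exactly the idle resource the proposed protocol tries to reclaim.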

  19. Exploitation of subsea gas hydrate reservoirs

    Science.gov (United States)

    Janicki, Georg; Schlüter, Stefan; Hennig, Torsten; Deerberg, Görge

    2016-04-01

    Natural gas hydrates are considered to be a potential energy resource in the future. They occur in permafrost areas as well as in subsea sediments and are stable at high pressure and low temperature conditions. According to estimations, the amount of carbon bound in natural gas hydrates worldwide is two times larger than in all known conventional fossil fuels. Besides technical challenges that have to be overcome, climate and safety issues have to be considered before a commercial exploitation of such unconventional reservoirs. The potential of producing natural gas from subsea gas hydrate deposits by various means (e.g. depressurization and/or injection of carbon dioxide) is numerically studied in the frame of the German research project »SUGAR«. The basic mechanisms of gas hydrate formation/dissociation and heat and mass transport in porous media are considered and implemented into a numerical model. The physics of the process leads to strong non-linear couplings between hydraulic fluid flow, hydrate dissociation and formation, hydraulic properties of the sediment, partial pressures and seawater solution of components, and the thermal budget of the system described by the heat equation. This paper is intended to provide an overview of the recent development regarding the production of natural gas from subsea gas hydrate reservoirs. It aims at giving a broad insight into natural gas hydrates and covering relevant aspects of the exploitation process. It is focused on the thermodynamic principles and technological approaches for the exploitation. The effects occurring during natural gas production within hydrate-filled sediment layers are identified and discussed by means of numerical simulation results. The behaviour of relevant process parameters such as pressure, temperature and phase saturations is described and compared for different strategies. The simulations are complemented by calculations for different safety relevant problems.

  20. SEXUAL EXPLOITATION AND ABUSE BY UN PEACEKEEPERS ...

    African Journals Online (AJOL)

    Allaiac

    The sexual exploitation of children by peacekeepers is particularly insidious. Educational interventions and training initiatives to bring about behaviour change to address sexual exploitation and abuse .... its own peacekeeping personnel are engaging in acts of sexual exploitation and abuse, including such crimes as rape.

  1. The exploitation argument against commercial surrogacy.

    Science.gov (United States)

    Wilkinson, Stephen

    2003-04-01

    This paper discusses the exploitation argument against commercial surrogacy: the claim that commercial surrogacy is morally objectionable because it is exploitative. The following questions are addressed. First, what exactly does the exploitation argument amount to? Second, is commercial surrogacy in fact exploitative? Third, if it were exploitative, would this provide a sufficient reason to prohibit (or otherwise legislatively discourage) it? The focus throughout is on the exploitation of paid surrogates, although it is noted that other parties (e.g. 'commissioning parents') may also be the victims of exploitation. It is argued that there are good reasons for believing that commercial surrogacy is often exploitative. However, even if we accept this, the exploitation argument for prohibiting (or otherwise legislatively discouraging) commercial surrogacy remains quite weak. One reason for this is that prohibition may well 'backfire' and lead to potential surrogates having to do other things that are more exploitative and/or more harmful than paid surrogacy. It is concluded therefore that those who oppose exploitation should (rather than attempting to stop particular practices like commercial surrogacy) concentrate on: (a) improving the conditions under which paid surrogates 'work'; and (b) changing the background conditions (in particular, the unequal distribution of power and wealth) which generate exploitative relationships.

  2. Designing Electronic Performance Support Systems: Models and Instructional Strategies Employed

    Science.gov (United States)

    Nekvinda, Christopher D.

    2011-01-01

    The purpose of this qualitative study was to determine whether instructional designers and performance technologists utilize instructional design models when designing and developing electronic performance support systems (EPSS). The study also explored if these same designers were utilizing instructional strategies within their EPSS to support…

  3. Hydrologic and water quality models: Performance measures and evaluation criteria

    Science.gov (United States)

    Performance measures and corresponding criteria constitute an important aspect of calibration and validation of any hydrological and water quality (H/WQ) model. As new and improved methods and information are developed, it is essential that performance measures and criteria be updated. Therefore, th...

  4. A Model for Effective Performance in the Indonesian Navy.

    Science.gov (United States)

    1987-06-01

    NAVY LEADERSHIP AND MANAGEMENT COMPETENCY MODEL ... MCBER COMPETENT MANAGERS MODEL ... leadership and managerial skills which emphasize effective performance of the officers in managing the human resources under their command and ... supervision. By effective performance we mean officers who not only know about management theories, but who possess the characteristics, knowledge, skill, and

  5. How motivation affects academic performance: a structural equation modelling analysis.

    Science.gov (United States)

    Kusurkar, R A; Ten Cate, Th J; Vos, C M P; Westers, P; Croiset, G

    2013-03-01

    Few studies in medical education have studied the effect of quality of motivation on performance. Self-Determination Theory, based on quality of motivation, differentiates between Autonomous Motivation (AM), which originates within an individual, and Controlled Motivation (CM), which originates from external sources. To determine whether Relative Autonomous Motivation (RAM, a measure of the balance between AM and CM) affects academic performance through good study strategy and higher study effort, and to compare this model between subgroups: males and females; students selected via two different systems, namely qualitative and weighted lottery selection. Data on motivation, study strategy and effort were collected from 383 medical students of VU University Medical Center Amsterdam, and their academic performance results were obtained from the student administration. The Structural Equation Modelling analysis technique was used to test a hypothesized model in which high RAM would positively affect Good Study Strategy (GSS) and study effort, which in turn would positively affect academic performance in the form of grade point averages. This model fit the data well: Chi square = 1.095, df = 3, p = 0.778, RMSEA = 0.000. The model also fitted well for all tested subgroups of students. Differences were found in the strength of relationships between the variables for the different subgroups, as expected. In conclusion, RAM positively correlated with academic performance through a deep strategy towards study and higher study effort. This model seems valid in medical education in subgroups such as males, females, and students selected by qualitative and weighted lottery selection.

  6. Performance Implications of Business Model Change: A Case Study

    Directory of Open Access Journals (Sweden)

    Jana Poláková

    2015-01-01

    Full Text Available The paper deals with changes in performance level introduced by the change of business model. The selected case is a small family business undergoing substantial changes in reflection of structural changes in its markets. The authors used the concept of the business model to describe the value creation processes within the selected family business and, by contrasting the differences between value creation processes before and after the change, they prove the role of the business model as the performance differentiator. This is illustrated with the use of a business model canvas constructed on the basis of interviews, observations and document analysis. The two business model canvases allow for explanation of cause-and-effect relationships within the business leading to the change in performance. The change in performance is assessed by financial analysis of the business conducted over the period 2006–2012, which demonstrates changes in performance (ROA, ROE and ROS at their lowest levels before the change of business model was introduced and growing after its introduction), as well as similar developments in the activity indicators of the family business. The described case study contributes to the concept of business modeling with arguments supporting its value as a strategic tool facilitating decisions related to value creation within the business.

  7. Gold-standard performance for 2D hydrodynamic modeling

    Science.gov (United States)

    Pasternack, G. B.; MacVicar, B. J.

    2013-12-01

    Two-dimensional, depth-averaged hydrodynamic (2D) models are emerging as an increasingly useful tool for environmental water resources engineering. One of the remaining technical hurdles to the wider adoption and acceptance of 2D modeling is the lack of standards for 2D model performance evaluation when the riverbed undulates, causing lateral flow divergence and convergence. The goal of this study was to establish a gold-standard that quantifies the upper limit of model performance for 2D models of undulating riverbeds when topography is perfectly known and surface roughness is well constrained. A review was conducted of published model performance metrics and the value ranges exhibited by models thus far for each one. Typically predicted velocity differs from observed by 20 to 30 % and the coefficient of determination between the two ranges from 0.5 to 0.8, though there tends to be a bias toward overpredicting low velocity and underpredicting high velocity. To establish a gold standard as to the best performance possible for a 2D model of an undulating bed, two straight, rectangular-walled flume experiments were done with no bed slope and only different bed undulations and water surface slopes. One flume tested model performance in the presence of a porous, homogenous gravel bed with a long flat section, then a linear slope down to a flat pool bottom, and then the same linear slope back up to the flat bed. The other flume had a PVC plastic solid bed with a long flat section followed by a sequence of five identical riffle-pool pairs in close proximity, so it tested model performance given frequent undulations. Detailed water surface elevation and velocity measurements were made for both flumes. Comparing predicted versus observed velocity magnitude for 3 discharges with the gravel-bed flume and 1 discharge for the PVC-bed flume, the coefficient of determination ranged from 0.952 to 0.987 and the slope for the regression line was 0.957 to 1.02. Unsigned velocity
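
The evaluation metrics reported in this record, the coefficient of determination and the slope of the predicted-vs-observed regression line, can be computed directly from paired velocity samples. A minimal Python sketch, using illustrative values rather than the flume data:

```python
# Compare predicted vs. observed velocity magnitudes using the two metrics
# reported above: coefficient of determination (r^2) and the slope of the
# least-squares regression line. Values below are illustrative, not flume data.

def regression_metrics(observed, predicted):
    """Return (r^2, slope) for the regression of predicted on observed."""
    n = len(observed)
    mean_o = sum(observed) / n
    mean_p = sum(predicted) / n
    cov = sum((o - mean_o) * (p - mean_p) for o, p in zip(observed, predicted))
    var_o = sum((o - mean_o) ** 2 for o in observed)
    var_p = sum((p - mean_p) ** 2 for p in predicted)
    slope = cov / var_o               # regression slope, predicted on observed
    r2 = cov * cov / (var_o * var_p)  # squared Pearson correlation
    return r2, slope

observed = [0.10, 0.25, 0.40, 0.55, 0.70]   # m/s, made up for illustration
predicted = [0.12, 0.24, 0.41, 0.52, 0.69]
r2, slope = regression_metrics(observed, predicted)
print(round(r2, 3), round(slope, 3))
```

Values in the gold-standard ranges quoted above (r² near 0.95-0.99, slope near 1) indicate a near-unbiased model.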

  8. Tidal Power Exploitation in Korea

    Science.gov (United States)

    Choi, Byung Ho; Kim, Kyeong Ok; Choi, Jae Cheon

    The highest tides in South Korea are found along the northwest coast between latitudes 36-38 degrees, and the number of possible sites for tidal-range power barrages to create tidal basins is great due to irregular coastlines with numerous bays. At present the Lake Sihwa tidal power plant is completed. The plant consists of 10 bulb-type turbines with 8 sluice gates. The installed capacity of the turbines and generators is 254 MW, and the annual energy output expected is about 552.7 GWh under a flood-flow generation scheme. Three other TPP projects are in progress at Garolim Bay (20 turbines with 25.4 MW capacity), Kangwha (28 turbines with 25.4 MW capacity) and Incheon (44 or 48 turbines with 30 MW capacity), and project features will be outlined here. The introduction of tidal barrages into four major TPP projects along Kyeonggi Bay will produce a wide range of potential impacts. Preliminary attempts were made to quantify these impacts using a 2-D hydrodynamic model, demonstrating the changes in tidal amplitude and phase under mean tidal conditions, and the associated changes in residual circulation (indicator for SPM and pollutant dispersion), bottom stress (indicator for bedload movement), and tidal fronts (positional indicator for bio-productivity), in both shelf-scale and local contexts. The tidal regime modeling system for ocean tides in the seas bordering the Korean Peninsula is designed to cover an area that is broad in scope and size, yet provide a high degree of resolution in strong tidal current regions, including off the southwestern tip of the Peninsula (Uldolmok, Jangjuk, Wando-Hoenggan), Daebang Sudo (Channel) and Kyeonggi Bay. With this simulation system, real-time tidal simulation of extended spring-neap cycles was performed to estimate the spatial distribution of tidal current power potential in terms of power density and energy density, and then extrapolated annual energy density.
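
The power-density and energy-density quantities mentioned at the end of this record follow from the standard kinetic power flux of a current, P = ½ρv³ (W/m²). A hedged sketch with illustrative current speeds, not the Kyeonggi Bay model output:

```python
# Kinetic power density of a tidal current: P = 0.5 * rho * v^3  [W/m^2],
# with rho the seawater density and v the depth-averaged current speed.
# Annual energy density integrates P over time. Speeds below are illustrative.

RHO_SEAWATER = 1025.0  # kg/m^3

def power_density(speed_m_s):
    """Instantaneous kinetic power flux of a current, in W/m^2."""
    return 0.5 * RHO_SEAWATER * speed_m_s ** 3

# Hourly depth-averaged speeds over one fictitious tidal snapshot:
speeds = [1.0, 1.5, 2.0, 2.5, 2.0, 1.5, 1.0]
mean_power = sum(power_density(v) for v in speeds) / len(speeds)  # W/m^2
annual_energy_kwh = mean_power * 8760 / 1000  # kWh/m^2 per year, if sustained
print(round(mean_power), round(annual_energy_kwh))
```

Because power scales with v³, resolving the strong-current regions named above is what drives the resolution requirements of the tidal regime model.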

  9. Longitudinal modeling in sports: young swimmers' performance and biomechanics profile.

    Science.gov (United States)

    Morais, Jorge E; Marques, Mário C; Marinho, Daniel A; Silva, António J; Barbosa, Tiago M

    2014-10-01

    New theories about dynamical systems highlight the multi-factorial interplay between determinant factors in achieving higher sports performance, including in swimming. Longitudinal research provides useful information on sportsmen's changes and how training helps them to excel. These questions may be addressed in one single procedure such as latent growth modeling. The aim of the study was to model a latent growth curve of young swimmers' performance and biomechanics over a season. Fourteen boys (12.33 ± 0.65 years old) and 16 girls (11.15 ± 0.55 years old) were evaluated. Performance, stroke frequency, speed fluctuation, arm's propelling efficiency, active drag, active drag coefficient and power to overcome drag were collected at four different moments of the season. Latent growth curve modeling was computed to understand the longitudinal variation of performance (endogenous variable) over the season according to the biomechanics (exogenous variables). Latent growth curve modeling showed high inter- and intra-subject variability in performance growth. Gender had a significant effect at baseline and during performance growth. At each evaluation moment, different variables had a meaningful effect on performance (M1: Da, β = -0.62; M2: Da, β = -0.53; M3: η(p), β = 0.59; M4: SF, β = -0.57; all P < .001). The models' goodness-of-fit was 1.40 ⩽ χ(2)/df ⩽ 3.74 (good-reasonable). Latent modeling is a comprehensive way to gain insight into young swimmers' performance over time. Different variables were primarily responsible for the performance improvement. A gender gap and intra- and inter-subject variability were verified. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Assessment of neural networks performance in modeling rainfall ...

    African Journals Online (AJOL)

    This paper presents the evaluation of performance of Neural Network (NN) model in predicting the behavioral pattern of rainfall depths of some locations in the North Central zones of Nigeria. The input to the model is the consecutive rainfall depths data obtained from the Nigerian Meteorological (NiMET) Agency. The neural ...

  11. Mathematical Models of Elementary Mathematics Learning and Performance. Final Report.

    Science.gov (United States)

    Suppes, Patrick

    This project was concerned with the development of mathematical models of elementary mathematics learning and performance. Probabilistic finite automata and register machines with a finite number of registers were developed as models and extensively tested with data arising from the elementary-mathematics strand curriculum developed by the…

  12. Comparison of Predictive Models for Photovoltaic Module Performance: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Marion, B.

    2008-05-01

    This paper examines three models used to estimate the performance of photovoltaic (PV) modules when the irradiances and PV cell temperatures are known. The results presented here were obtained by comparing modeled and measured maximum power (Pm) for PV modules that rely on different technologies.

  13. Faculty Performance Evaluation: The CIPP-SAPS Model.

    Science.gov (United States)

    Mitcham, Maralynne

    1981-01-01

    The issues of faculty performance evaluation for allied health professionals are addressed. Daniel Stufflebeam's CIPP (context-input-process-product) model is introduced and its development into a CIPP-SAPS (self-administrative-peer-student) model is pursued. (Author/CT)

  14. Performance measures and criteria for hydrologic and water quality models

    Science.gov (United States)

    Performance measures and criteria are essential for model calibration and validation. This presentation will include a summary of one of the papers that will be included in the 2014 Hydrologic and Water Quality Model Calibration & Validation Guidelines Special Collection of the ASABE Transactions. T...

  15. Issues of diffuse pollution model complexity arising from performance benchmarking

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Flow and nitrate dynamics were simulated in two catchments, the River Aire in northern England and the River Ythan in north-east Scotland. In the case of the Aire, a diffuse pollution model was coupled with a river quality model (CASCADE-QUESTOR); in the study of the Ythan, an integrated model (SWAT) was used. In each study, model performance was evaluated for differing levels of spatial representation in input data sets (rainfall, soils and land use). In respect of nitrate concentrations, the performance of the models was compared with that of a regression model based on proportions of land cover. The overall objective was to assess the merits of spatially distributed input data sets. In both catchments, specific measures of quantitative performance showed that models using the most detailed available input data contributed, at best, only a marginal improvement over simpler implementations. Hence, the level of complexity used in input data sets has to be determined not only on multiple criteria of quantitative performance but also on qualitative assessments, reflecting the specific context of the model application and the current and likely future needs of end-users.

  16. Modelling the Performance of Product Integrated Photovoltaic (PIPV) Cells Indoors

    NARCIS (Netherlands)

    Apostolou, G.; Verwaal, M.; Reinders, Angelina H.M.E.

    2014-01-01

    In this paper we present a model that has been developed for estimating the performance of PV products' cells in an indoor environment. The model computes the efficiency and power production of PV technologies as a function of distance from natural and artificial light sources. It intends

  17. Activity-Based Costing Model for Assessing Economic Performance.

    Science.gov (United States)

    DeHayes, Daniel W.; Lovrinic, Joseph G.

    1994-01-01

    An economic model for evaluating the cost performance of academic and administrative programs in higher education is described. Examples from its application at Indiana University-Purdue University Indianapolis are used to illustrate how the model has been used to control costs and reengineer processes. (Author/MSE)

  18. PVFORM: A new approach to photovoltaic system performance modeling

    Science.gov (United States)

    Menicucci, D. F.

    Over the past two years a number of studies have resulted in important changes in photovoltaic (PV) system simulation. For example, a Sandia National Laboratories (SNLA) study showed that the major cause of error in PV performance models is incorrect estimates of insolation in the plane of the array of the system under study. Other studies have resulted in improvements in the modelling of various array subsystems. SNLA has integrated improved modelling techniques into a simplified program named PVFORM that can be run on mainframe or personal computers. The new model simulates the hourly performance of a PV flat-plate system for a one-year period. Verification tests at SNLA have shown that predicted values from the PVFORM model are typically within 1% to 5% of measured values. This paper describes the PVFORM model and gives examples of its application to the design, testing, and evaluation of PV systems.

  19. Exploiting quality and texture features to estimate age and gender from fingerprints

    Science.gov (United States)

    Marasco, Emanuela; Lugini, Luca; Cukic, Bojan

    2014-05-01

    Age and gender of an individual, when available, can contribute to identification decisions provided by primary biometrics and help improve matching performance. In this paper, we propose a system which automatically infers age and gender from the fingerprint image. Current approaches for predicting age and gender generally exploit manually extracted features such as ridge count and white-line count. Existing automated approaches have significant limitations in accuracy, especially when dealing with data pertaining to elderly females. The model proposed in this paper exploits image quality features synthesized from 40 different frequency bands, and image texture properties captured using the Local Binary Pattern (LBP) and the Local Phase Quantization (LPQ) operators. We evaluate the performance of the proposed approach using fingerprint images collected from 500 users with an optical sensor. The approach achieves prediction accuracy of 89.1% for age and 88.7% for gender.
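
The LBP operator named in this record encodes local texture by thresholding each pixel's neighbours against the centre value. A minimal 3×3 sketch (basic LBP only; the record's LPQ operator and 40-band quality features are not reproduced here):

```python
# Basic Local Binary Pattern (LBP): threshold the 8 neighbours of a pixel
# against the centre value and pack the results into an 8-bit code.

def lbp_code(patch):
    """patch: 3x3 list of lists of intensities; returns the centre's LBP code."""
    c = patch[1][1]
    # Clockwise neighbour order starting at the top-left corner.
    neighbours = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
                  patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    code = 0
    for bit, n in enumerate(neighbours):
        if n >= c:          # neighbour at least as bright as centre -> bit set
            code |= 1 << bit
    return code

patch = [[10, 20, 30],
         [40, 25, 50],
         [ 5, 15, 60]]
print(lbp_code(patch))
```

In practice the per-pixel codes are histogrammed over image blocks, and the histogram serves as the texture feature vector fed to the classifier.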

  20. Precise Disturbance Modeling for Improvement of Positioning Performance

    Science.gov (United States)

    Yamamoto, Masafumi; Iwasaki, Makoto; Ito, Kazuaki; Matsui, Nobuyuki

    This paper presents a modeling methodology for unknown disturbances in mechatronic systems, based on disturbance estimation using an iterative learning process. In the disturbance modeling, nonlinear friction is specifically handled as the disturbance in mechanisms that mainly deteriorates trajectory control performance. The friction can be mathematically modeled using the learned estimation data, as a function of the displacement, velocity, acceleration, and jerk of the actuator. A distinguishing feature of this model is that friction compensation can be achieved with a generalization capability for different conditions. The proposed positioning control approach with disturbance modeling and compensation has been verified by experiments using a table drive system on a machine stand.

  1. Review of Methods for Buildings Energy Performance Modelling

    Science.gov (United States)

    Krstić, Hrvoje; Teni, Mihaela

    2017-10-01

    Research presented in this paper gives a brief review of methods used for modelling the energy performance of buildings. The paper also gives a comprehensive review of the advantages and disadvantages of available methods, as well as the input parameters used for modelling building energy performance. The European Directive EPBD obliges the implementation of an energy certification procedure, which gives insight into building energy performance via existing energy certificate databases. Some of the methods for building energy performance modelling mentioned in this paper are developed by employing data sets of buildings which have already undergone an energy certification procedure. Such a database is used in this paper: the majority of buildings in the database have already undergone some form of partial retrofitting, such as replacement of windows or installation of thermal insulation, but still have poor energy performance. The case study presented in this paper utilizes an energy certificate database obtained from residential units in Croatia (over 400 buildings) in order to determine the dependence between building energy performance and variables from the database by using statistical dependency tests. Building energy performance in the database is presented as a building energy efficiency rate (from A+ to G), based on the specific annual energy needed for heating under referential climatic data [kWh/(m2a)]. Independent variables in the database are the surfaces and volume of the conditioned part of the building, building shape factor, energy used for heating, CO2 emission, building age and year of reconstruction. The research results presented in this paper give an insight into the possibilities of methods used for modelling building energy performance. Further, they give an analysis of dependencies between building energy performance as a dependent variable and the independent variables from the database. The presented results could be used for the development of new building energy performance

  2. Performance Models for the Spike Banded Linear System Solver

    Directory of Open Access Journals (Sweden)

    Murat Manguoglu

    2011-01-01

    Full Text Available With the availability of large-scale parallel platforms comprised of tens of thousands of processors and beyond, there is significant impetus for the development of scalable parallel sparse linear system solvers and preconditioners. An integral part of this design process is the development of performance models capable of predicting performance and providing accurate cost models for the solvers and preconditioners. There has been some work in the past on characterizing the performance of the iterative solvers themselves. In this paper, we investigate the problem of characterizing the performance and scalability of banded preconditioners. Recent work has demonstrated the superior convergence properties and robustness of banded preconditioners compared to the state-of-the-art ILU family of preconditioners as well as algebraic multigrid preconditioners. Furthermore, when used in conjunction with efficient banded solvers, banded preconditioners are capable of significantly faster time-to-solution. Our banded solver, the Truncated Spike algorithm, is specifically designed for parallel performance and tolerance to deep memory hierarchies. Its regular structure is also highly amenable to accurate performance characterization. Using these characteristics, we derive the following results in this paper: (i) we develop parallel formulations of the Truncated Spike solver; (ii) we develop a highly accurate pseudo-analytical parallel performance model for our solver; (iii) we show excellent prediction capabilities of our model, based on which we argue for the high scalability of our solver. Our pseudo-analytical performance model is based on analytical performance characterization of each phase of our solver. These analytical models are then parameterized using actual runtime information on target platforms. An important consequence of our performance models is that they reveal underlying performance bottlenecks in both serial and parallel formulations. All of our results are

  3. The Exploitation of Evolving Resources

    CERN Document Server

    McGlade, Jacqueline; Law, Richard

    1993-01-01

    The impact of man on the biosphere is profound. Quite apart from our capacity to destroy natural ecosystems and to drive species to extinction, we mould the evolution of the survivors by the selection pressures we apply to them. This has implications for the continued health of our natural biological resources and for the way in which we seek to optimise yield from those resources. Of these biological resources, fish stocks are particularly important to mankind as a source of protein. On a global basis, fish stocks provide the major source of protein for human consumption from natural ecosystems, amounting to some seventy million tonnes in 1970. Although fisheries management has been extensively developed over the last century, it has not hitherto considered the evolutionary consequences of fishing activity. While this omission may not have been serious in the past, the ever increasing intensity of exploitation and the deteriorating health of fish stocks has generated an urgent need for a better understanding...

  4. Performance Improvement: Applying a Human Performance Model to Organizational Processes in a Military Training Environment

    Science.gov (United States)

    Aaberg, Wayne; Thompson, Carla J.; West, Haywood V.; Swiergosz, Matthew J.

    2009-01-01

    This article provides a description and the results of a study that utilized the human performance (HP) model and methods to explore and analyze a training organization. The systemic and systematic practices of the HP model are applicable to military training organizations as well as civilian organizations. Implications of the study for future…

  5. MODELING, SIMULATION AND PERFORMANCE STUDY OF GRID-CONNECTED PHOTOVOLTAIC ENERGY SYSTEM

    OpenAIRE

    Nagendra K; Karthik J; Keerthi Rao C; Kumar Raja Pemmadi

    2017-01-01

    This paper presents modeling and simulation of a grid-connected photovoltaic energy system and a performance study using MATLAB/Simulink. The photovoltaic energy system is considered in three main parts: the PV model, the power conditioning system and the grid interface. The photovoltaic model is interconnected with the grid through full-scale power electronic devices. The simulation is conducted on the PV energy system at normal temperature and constant load using MATLAB.

  6. Ultrasonic Flaw Imaging via Multipath Exploitation

    Directory of Open Access Journals (Sweden)

    Yimin D. Zhang

    2012-01-01

    Full Text Available We consider ultrasonic imaging for the visualization of flaws in a material. Ultrasonic imaging is a powerful nondestructive testing (NDT) tool which assesses material conditions via the detection, localization, and classification of flaws inside a structure. We utilize reflections of ultrasonic signals which occur when encountering different media and interior boundaries. These reflections can be cast as direct paths to the target corresponding to virtual sensors appearing on the top and bottom sides of the target. Some of these virtual sensors constitute a virtual aperture, whereas for others, the aperture changes with the transmitter position. Exploitation of multipath-extended virtual array apertures provides enhanced imaging capability beyond the limitation of traditional multisensor approaches. The waveforms observed at the physical as well as the virtual sensors yield additional measurements corresponding to different aspect angles, thus allowing proper multiview imaging of flaws. We derive the wideband point spread functions for dominant multipaths and show that fusion of physical and virtual sensor data improves flaw perimeter detection and localization performance. The effectiveness of the proposed multipath exploitation approach is demonstrated using real data.

  7. Aphid Heritable Symbiont Exploits Defensive Mutualism.

    Science.gov (United States)

    Doremus, Matthew R; Oliver, Kerry M

    2017-04-15

    Insects and other animals commonly form symbioses with heritable bacteria, which can exert large influences on host biology and ecology. The pea aphid, Acyrthosiphon pisum, is a model for studying the effects of infection with heritable facultative symbionts (HFS), and each of its seven common HFS species has been reported to provide resistance to biotic or abiotic stresses. However, one common HFS, called X-type, rarely occurs as a single infection in field populations and instead typically superinfects individual aphids with Hamiltonella defensa, another HFS that protects aphids against attack by parasitic wasps. Using experimental aphid lines comprising all possible infection combinations in a uniform aphid genotype, we investigated whether the most common strain of X-type provides any of the established benefits associated with aphid HFS as a single infection or as a superinfection with H. defensa. We found that X-type does not confer protection against any tested threats, including parasitoid wasps, fungal pathogens, or thermal stress. Instead, component fitness assays identified large costs associated with X-type infection, costs which were ameliorated in superinfected aphids. Together these findings suggest that X-type exploits the aphid/H. defensa mutualism and is maintained primarily as a superinfection by "hitchhiking" via the mutualistic benefits provided by another HFS. Exploitative symbionts potentially restrict the functions and distributions of mutualistic symbioses, with effects that extend to other community members. IMPORTANCE Maternally transmitted bacterial symbionts are widespread and can have major impacts on the biology of arthropods, including insects of medical and agricultural importance. Given that host fitness and symbiont fitness are tightly linked, inherited symbionts can spread within host populations by providing beneficial services. Many insects, however, are frequently infected with multiple heritable symbiont species, providing potential

  8. Hybrid Corporate Performance Prediction Model Considering Technical Capability

    Directory of Open Access Journals (Sweden)

    Joonhyuck Lee

    2016-07-01

    Full Text Available Many studies have tried to predict corporate performance and stock prices to enhance investment profitability using qualitative approaches such as the Delphi method. However, developments in data processing technology and machine-learning algorithms have resulted in efforts to develop quantitative prediction models in various managerial subject areas. We propose a quantitative corporate performance prediction model that applies the support vector regression (SVR) algorithm, which avoids overfitting the training data and can be applied to regression problems. The proposed model optimizes the SVR training parameters based on the training data, using a genetic algorithm, to achieve sustainable predictability in changeable markets and managerial environments. Technology-intensive companies represent an increasing share of the total economy. The performance and stock prices of these companies are affected by their financial standing and their technological capabilities. Therefore, we apply both financial indicators and technical indicators to establish the proposed prediction model. Here, we use time series data, including financial, patent, and corporate performance information, of 44 electronics and IT companies. Then, we predict the performance of these companies as an empirical verification of the prediction performance of the proposed model.

  9. Real-time individualization of the unified model of performance.

    Science.gov (United States)

    Liu, Jianbo; Ramakrishnan, Sridhar; Laxminarayan, Srinivas; Balkin, Thomas J; Reifman, Jaques

    2017-12-01

    Existing mathematical models for predicting neurobehavioural performance are not suited for mobile computing platforms because they cannot adapt model parameters automatically in real time to reflect individual differences in the effects of sleep loss. We used an extended Kalman filter to develop a computationally efficient algorithm that continually adapts the parameters of the recently developed Unified Model of Performance (UMP) to an individual. The algorithm accomplishes this in real time as new performance data for the individual become available. We assessed the algorithm's performance by simulating real-time model individualization for 18 subjects subjected to 64 h of total sleep deprivation (TSD) and 7 days of chronic sleep restriction (CSR) with 3 h of time in bed per night, using psychomotor vigilance task (PVT) data collected every 2 h during wakefulness. This UMP individualization process produced parameter estimates that progressively approached the solution produced by a post-hoc fitting of model parameters using all data. The minimum number of PVT measurements needed to individualize the model parameters depended upon the type of sleep-loss challenge, with ~30 required for TSD and ~70 for CSR. However, model individualization depended upon the overall duration of data collection, yielding increasingly accurate model parameters with greater number of days. Interestingly, reducing the PVT sampling frequency by a factor of two did not notably hamper model individualization. The proposed algorithm facilitates real-time learning of an individual's trait-like responses to sleep loss and enables the development of individualized performance prediction models for use in a mobile computing platform. © 2017 European Sleep Research Society.
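
The real-time individualization idea in this record, recursively pulling population-average model parameters toward an individual's incoming measurements, can be sketched with a scalar Kalman update. The paper itself applies an extended Kalman filter to the full, nonlinear UMP; the numbers below are hypothetical:

```python
# Scalar sketch of real-time parameter individualization: a Kalman-style
# update pulls a population-average parameter toward an individual's
# incoming performance measurements. Illustrates the recursion only.

def kalman_update(theta, p, measurement, r):
    """theta: parameter estimate; p: its variance; r: measurement noise variance.
    The 'model prediction' here is simply theta itself (identity observation)."""
    k = p / (p + r)                      # Kalman gain
    theta = theta + k * (measurement - theta)
    p = (1.0 - k) * p                    # uncertainty shrinks with each update
    return theta, p

theta, p = 250.0, 100.0                  # population-average estimate, variance
measurements = [310, 305, 298, 312, 307] # hypothetical PVT data for one subject
for z in measurements:
    theta, p = kalman_update(theta, p, z, r=225.0)
print(round(theta, 1), round(p, 1))
```

Each measurement moves the estimate toward the individual's own level while its variance shrinks, mirroring how the parameter estimates above "progressively approached" the post-hoc fit as more PVT data arrived.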

  10. A psychoecological model of academic performance among Hispanic adolescents.

    Science.gov (United States)

    Chun, Heejung; Dickson, Ginger

    2011-12-01

    Although the number of students who complete high school continues to rise, dramatic differences in school success remain across racial/ethnic groups. The current study addressed Hispanic adolescents' academic performance by investigating the relationships of parental involvement, culturally responsive teaching, sense of school belonging, and academic self-efficacy and academic performance. Participants were 478 (51.5% female) Hispanic 7th graders in the US-Mexico borderlands. Based on Bronfenbrenner's ecological systems theory, a structural model was tested. Results showed that the proposed model was supported by demonstrating significant indirect effects of parental involvement, culturally responsive teaching, and sense of school belonging on academic performance. Furthermore, academic self-efficacy was found to mediate the relationships between parental involvement, culturally responsive teaching, and sense of school belonging and academic performance. The current study provides a useful psychoecological model to inform educators and psychologists who seek to meet the needs of Hispanic students.

  11. Observer analysis and its impact on task performance modeling

    Science.gov (United States)

    Jacobs, Eddie L.; Brown, Jeremy B.

    2014-05-01

    Fire fighters use relatively low cost thermal imaging cameras to locate hot spots and fire hazards in buildings. This research describes the analyses performed to study the impact of thermal image quality on fire fighter fire hazard detection task performance. Using human perception data collected by the National Institute of Standards and Technology (NIST) for fire fighters detecting hazards in a thermal image, an observer analysis was performed to quantify the sensitivity and bias of each observer. Using this analysis, the subjects were divided into three groups representing three different levels of performance. The top-performing group was used for the remainder of the modeling. Models were developed which related image quality factors such as contrast, brightness, spatial resolution, and noise to task performance probabilities. The models were fitted to the human perception data using logistic regression, as well as probit regression. Probit regression was found to yield superior fits and showed that models with not only 2nd order parameter interactions, but also 3rd order parameter interactions performed the best.
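The abstract compares logistic and probit regression fits of detection probability against image-quality factors. A toy sketch of that comparison with a single invented image-quality factor (the data-generating model and sample size are assumptions, not the NIST data):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def neg_loglik(beta, X, y, link):
    """Negative Bernoulli log-likelihood under a given link function."""
    p = np.clip(link(X @ beta), 1e-9, 1 - 1e-9)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

rng = np.random.default_rng(1)
n = 800
contrast = rng.uniform(0, 1, n)                       # one image-quality factor
X = np.column_stack([np.ones(n), contrast])
y = rng.binomial(1, norm.cdf(-1.0 + 3.0 * contrast))  # probit-style ground truth

results = {}
for name, link in [("logit", lambda e: 1 / (1 + np.exp(-e))),
                   ("probit", norm.cdf)]:
    fit = minimize(neg_loglik, x0=np.zeros(2), args=(X, y, link))
    results[name] = (2 * fit.fun + 2 * 2, fit.x)      # (AIC, coefficients)
for name, (aic, coef) in results.items():
    print(name, round(aic, 1), np.round(coef, 2))
```

Comparing the two fits on an information criterion such as AIC mirrors the paper's model-selection step, although the two links often fit binary data almost equally well.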

  12. A Model of Imam's Leadership and Mosque Performance in Malaysia

    Directory of Open Access Journals (Sweden)

    Mahazan, A. M.

    2013-12-01

Full Text Available This article aims to discuss three major arguments. First, mosque performance could be measured using at least two dimensions: the number of regular jamaah (congregants) of the mosque and the number of religious classes organized by the mosque. Second, the association of imams’ leadership and mosque performance centres on the issue of how imams’ leadership traits influence their effective leadership behavior capacity in predicting their performance as mosque leaders. Third, the association of imams’ leadership traits, leadership behavior, and mosque performance could be moderated by the level of autonomy the imams have. This study applies a content analysis approach to analyze research and theories in the organizational leadership field as well as on mosque performance. Based on the analysis conducted, this study proposes a model of the association between imams’ leadership and mosque performance to be applied in research concerning imams’ leadership and mosque performance in Malaysia or elsewhere.

  13. Causal Analysis for Performance Modeling of Computer Programs

    Directory of Open Access Journals (Sweden)

    Jan Lemeire

    2007-01-01

Full Text Available Causal modeling and the accompanying learning algorithms provide useful extensions for in-depth statistical investigation and automation of performance modeling. We enlarged the scope of existing causal structure learning algorithms by using the form-free information-theoretic concept of mutual information and by introducing the complexity criterion for selecting direct relations among equivalent relations. The underlying probability distribution of experimental data is estimated by kernel density estimation. We then report on the benefits of a dependency analysis and the decompositional capacities of causal models. Useful qualitative models, providing insight into the role of every performance factor, were inferred from experimental data. This paper reports on the results for an LU decomposition algorithm and on the study of the parameter sensitivity of the Kakadu implementation of the JPEG-2000 standard. Next, the analysis was used to search for generic performance characteristics of the applications.

  14. Performance Model and Sensitivity Analysis for a Solar Thermoelectric Generator

    Science.gov (United States)

    Rehman, Naveed Ur; Siddiqui, Mubashir Ali

    2017-03-01

    In this paper, a regression model for evaluating the performance of solar concentrated thermoelectric generators (SCTEGs) is established and the significance of contributing parameters is discussed in detail. The model is based on several natural, design and operational parameters of the system, including the thermoelectric generator (TEG) module and its intrinsic material properties, the connected electrical load, concentrator attributes, heat transfer coefficients, solar flux, and ambient temperature. The model is developed by fitting a response curve, using the least-squares method, to the results. The sample points for the model were obtained by simulating a thermodynamic model, also developed in this paper, over a range of values of input variables. These samples were generated employing the Latin hypercube sampling (LHS) technique using a realistic distribution of parameters. The coefficient of determination was found to be 99.2%. The proposed model is validated by comparing the predicted results with those in the published literature. In addition, based on the elasticity for parameters in the model, sensitivity analysis was performed and the effects of parameters on the performance of SCTEGs are discussed in detail. This research will contribute to the design and performance evaluation of any SCTEG system for a variety of applications.
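The workflow described above (sample a simulation model via Latin hypercube sampling, then fit a least-squares response surface) can be sketched as follows; the two-input "model" and its coefficients are invented stand-ins for the paper's SCTEG thermodynamic model:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """One stratified draw per equal-probability bin in each dimension,
    with the bins shuffled independently per dimension."""
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    for d in range(n_dims):
        rng.shuffle(u[:, d])
    return u

# Stand-in "thermodynamic model" with two scaled inputs (invented coefficients)
def simulate(flux, t_amb):
    return 5.0 + 3.0 * flux - 1.5 * t_amb + 0.5 * flux * t_amb

rng = np.random.default_rng(2)
samples = latin_hypercube(200, 2, rng)
flux, t_amb = samples[:, 0], samples[:, 1]
y = simulate(flux, t_amb)

# Least-squares response surface including the interaction term
A = np.column_stack([np.ones_like(flux), flux, t_amb, flux * t_amb])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
r2 = 1.0 - np.sum((A @ coef - y) ** 2) / np.sum((y - y.mean()) ** 2)
print(np.round(coef, 2), round(float(r2), 3))  # recovers the coefficients exactly
```

Because the toy simulator is noiseless and contains exactly the fitted basis terms, the coefficient of determination here is trivially near 1; the paper's 99.2% reflects a genuinely nonlinear underlying model.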

  15. Comparative assessment of PV plant performance models considering climate effects

    DEFF Research Database (Denmark)

    Tina, Giuseppe; Ventura, Cristina; Sera, Dezso

    2017-01-01

The paper investigates the effect of climate conditions on the accuracy of PV system performance models (physical and interpolation methods), which are used within a monitoring system as a reference for the power produced by a PV system in order to detect inefficient or faulty operating conditions. [...] To compare the performance of the studied PV plants with others, the efficiency of the systems has been estimated by both the conventional Performance Ratio and the Corrected Performance Ratio.

  16. Human performance modeling for system of systems analytics: combat performance-shaping factors.

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, Craig R.; Miller, Dwight Peter

    2006-01-01

The US military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives. To support this goal, Sandia National Laboratories (SNL) has undertaken a program of HPM as an integral augmentation to its system-of-systems (SoS) analytics capabilities. The previous effort, reported in SAND2005-6569, evaluated the effects of soldier cognitive fatigue on SoS performance. The current effort began with a very broad survey of any performance-shaping factors (PSFs) that might also affect soldiers' performance in combat situations. The work included consideration of three different approaches to cognition modeling and how appropriate they would be for application to SoS analytics. The bulk of this report categorizes 47 PSFs into three groups (internal, external, and task-related) and provides brief descriptions of how each affects combat performance, according to the literature. The PSFs were then assembled into a matrix with 22 representative military tasks and assigned one of four levels of estimated negative impact on task performance, based on the literature. Blank versions of the matrix were then sent to two ex-military subject-matter experts to be filled out based on their personal experiences. Data analysis was performed to identify the most influential PSFs by consensus. Results indicate that combat-related injury, cognitive fatigue, inadequate training, physical fatigue, thirst, stress, poor perceptual processing, and presence of chemical agents are among the PSFs with the most negative impact on combat performance.

  17. Near Real-Time Probabilistic Damage Diagnosis Using Surrogate Modeling and High Performance Computing

    Science.gov (United States)

    Warner, James E.; Zubair, Mohammad; Ranjan, Desh

    2017-01-01

    This work investigates novel approaches to probabilistic damage diagnosis that utilize surrogate modeling and high performance computing (HPC) to achieve substantial computational speedup. Motivated by Digital Twin, a structural health management (SHM) paradigm that integrates vehicle-specific characteristics with continual in-situ damage diagnosis and prognosis, the methods studied herein yield near real-time damage assessments that could enable monitoring of a vehicle's health while it is operating (i.e. online SHM). High-fidelity modeling and uncertainty quantification (UQ), both critical to Digital Twin, are incorporated using finite element method simulations and Bayesian inference, respectively. The crux of the proposed Bayesian diagnosis methods, however, is the reformulation of the numerical sampling algorithms (e.g. Markov chain Monte Carlo) used to generate the resulting probabilistic damage estimates. To this end, three distinct methods are demonstrated for rapid sampling that utilize surrogate modeling and exploit various degrees of parallelism for leveraging HPC. The accuracy and computational efficiency of the methods are compared on the problem of strain-based crack identification in thin plates. While each approach has inherent problem-specific strengths and weaknesses, all approaches are shown to provide accurate probabilistic damage diagnoses and several orders of magnitude computational speedup relative to a baseline Bayesian diagnosis implementation.
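One ingredient above, Markov chain Monte Carlo sampling against a cheap surrogate instead of a finite element model, can be sketched in a few lines; the polynomial surrogate and crack-length setup below are hypothetical:

```python
import numpy as np

def metropolis(logpost, x0, n_steps, step, rng):
    """Random-walk Metropolis sampler over a scalar parameter."""
    x, lp = x0, logpost(x0)
    chain = np.empty(n_steps)
    for i in range(n_steps):
        prop = x + rng.normal(0.0, step)
        lp_prop = logpost(prop)
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        chain[i] = x
    return chain

# Cheap polynomial "surrogate" standing in for an expensive FEM strain model
def surrogate_strain(crack_len):
    return 1.0 + 0.8 * crack_len + 0.3 * crack_len ** 2

truth = 0.5
obs = surrogate_strain(truth)              # noise-free synthetic observation
sigma = 0.05                               # assumed measurement noise
def logpost(a):                            # flat prior on [0, 1]
    if not 0.0 <= a <= 1.0:
        return -np.inf
    return -0.5 * ((surrogate_strain(a) - obs) / sigma) ** 2

rng = np.random.default_rng(8)
chain = metropolis(logpost, 0.2, 20000, 0.05, rng)
print(round(float(np.median(chain[5000:])), 2))  # posterior centered near 0.5
```

Each posterior evaluation costs one surrogate call instead of one finite element solve, which is the source of the speedup the paper reports; the paper additionally parallelizes the sampling itself.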

  18. Modeling and performance analysis of QoS data

    Science.gov (United States)

    Strzeciwilk, Dariusz; Zuberek, Włodzimierz M.

    2016-09-01

The article presents the results of modeling and analysis of data transmission performance on systems that support quality of service. Models are designed and tested taking into account a multiservice network architecture, i.e. one supporting the transmission of data belonging to different traffic classes. The mechanisms of traffic-shaping systems based on Priority Queuing were studied, both with an integrated data source and with various generated data sources. The basic problems of QoS-supporting architectures and of queuing systems are discussed. Models based on Petri nets, supported by temporal logics, were designed and built, and simulation tools were used to verify the traffic-shaping mechanisms with the applied queuing algorithms. It is shown that temporal Petri net models can be effectively used in modeling and analyzing the performance of computer networks.

  19. A model of human performance on the traveling salesperson problem.

    Science.gov (United States)

    MacGregor, J N; Ormerod, T C; Chronicle, E P

    2000-10-01

    A computational model is proposed of how humans solve the traveling salesperson problem (TSP). Tests of the model are reported, using human performance measures from a variety of 10-, 20-, 40-, and 60-node problems, a single 48-node problem, and a single 100-node problem. The model provided a range of solutions that approximated the range of human solutions and conformed closely to quantitative and qualitative characteristics of human performance. The minimum path lengths of subjects and model deviated by average absolute values of 0.0%, 0.9%, 2.4%, 1.4%, 3.5%, and 0.02% for the 10-, 20-, 40-, 48-, 60-, and 100-node problems, respectively. Because the model produces a range of solutions, rather than a single solution, it may find better solutions than some conventional heuristic algorithms for solving TSPs, and comparative results are reported that support this suggestion.
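For contrast with the perception-based model described above, the classical nearest-neighbour heuristic (one of the conventional algorithms such models are compared against) takes only a few lines:

```python
import numpy as np

def nearest_neighbour_tour(points, start=0):
    """Greedy heuristic: repeatedly visit the closest unvisited node."""
    unvisited = set(range(len(points))) - {start}
    tour = [start]
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: float(np.hypot(*(points[i] - last))))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def tour_length(points, tour):
    closed = tour + [tour[0]]              # return to the start node
    return sum(float(np.hypot(*(points[closed[i + 1]] - points[closed[i]])))
               for i in range(len(tour)))

rng = np.random.default_rng(3)
pts = rng.random((20, 2))                  # a random 20-node problem
tour = nearest_neighbour_tour(pts)
print(len(set(tour)), round(tour_length(pts, tour), 2))
```

Unlike the human model, which yields a range of solutions, this heuristic is deterministic for a given start node; trying all start nodes and keeping the shortest tour is a common cheap improvement.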

  20. Performance Assessment of Hydrological Models Considering Acceptable Forecast Error Threshold

    Directory of Open Access Journals (Sweden)

    Qianjin Dong

    2015-11-01

Full Text Available It is essential to consider an acceptable forecast error threshold when assessing a hydrological model, both because such thresholds have received little attention in the hydrology community and because errors do not necessarily cause risk. Two forecast errors, the rainfall forecast error and the peak flood forecast error, have been studied based on reliability theory. The first order second moment (FOSM) and bound methods are used to estimate the reliability. Through the case study of the Dahuofang (DHF) Reservoir, it is shown that the correlation between these two errors has a great influence on the reliability index of the hydrological model. In particular, the reliability index of the DHF hydrological model decreases as the correlation increases. Based on reliability theory, the proposed performance evaluation framework, which incorporates the acceptable forecast error threshold and the correlation among the multiple errors, can be used to evaluate the performance of a hydrological model and to quantify the uncertainties of its output.
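The first-order second-moment (FOSM) reliability index mentioned above has a compact closed form for independent inputs; the limit state, gradients, and moments below are invented for illustration:

```python
import numpy as np

def fosm_beta(g_at_mean, grad, std):
    """FOSM reliability index for independent inputs:
    beta = mu_g / sigma_g, with sigma_g from the linearized limit state."""
    grad, std = np.asarray(grad), np.asarray(std)
    sigma_g = float(np.sqrt(np.sum((grad * std) ** 2)))
    return g_at_mean / sigma_g

# Hypothetical limit state g = 10 - (2*e_rain + e_peak), evaluated at the
# means e_rain = 1.0 and e_peak = 3.0, so g(mean) = 10 - 5 = 5
beta = fosm_beta(g_at_mean=5.0, grad=[-2.0, -1.0], std=[0.5, 1.0])
print(round(beta, 2))  # 5 / sqrt(1^2 + 1^2) ≈ 3.54
```

With correlated errors, sigma_g gains a cross term 2*rho*(dg/de1 * s1)*(dg/de2 * s2); since both partial derivatives here share a sign, positive correlation inflates sigma_g and lowers beta, consistent with the abstract's finding that the reliability index decreases as the correlation increases.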

  1. Evaluation of Enterprise Performance Based on FLI-GA Model

    Directory of Open Access Journals (Sweden)

    Shouyi Zhang

    2014-05-01

Full Text Available Purpose: There are many methods for evaluating the performance of an enterprise, but they still have distinct shortcomings. In order to achieve a better evaluation result, we put forward a new model named FLI-GA (Fuzzy Logic Inference & Genetic Algorithm). Approach: This model, mainly based on the fuzzy logic inference method, uses a fuzzy rule-based system (FRBS); a genetic algorithm is applied in the model to avoid the drawbacks of FRBS. Findings: The FLI-GA model can be used to evaluate the performance of a given enterprise, and its evaluation results are more accurate than those of the fuzzy logic inference method alone. Originality: This model combines the genetic algorithm with fuzzy reasoning methods so as to make the appraisal results more reasonable and more satisfying.

  2. Comparative Performance of Volatility Models for Oil Price

    Directory of Open Access Journals (Sweden)

    Afees A. Salisu

    2012-07-01

Full Text Available In this paper, we compare the performance of volatility models for oil price using daily returns of WTI. The innovations of this paper are twofold: (i) we analyse the oil price across three sub-samples, namely the periods before, during and after the global financial crisis; (ii) we also analyse the comparative performance of both symmetric and asymmetric volatility models for the oil price. We find that the oil price was most volatile during the global financial crisis compared to the other sub-samples. Based on the appropriate model selection criteria, the asymmetric GARCH models appear superior to the symmetric ones in dealing with oil price volatility. This finding indicates evidence of leverage effects in the oil market, and ignoring these effects in oil price modelling will lead to serious biases and misleading results.
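A symmetric GARCH(1,1) variance recursion, with an optional GJR-style leverage term as a stand-in for the asymmetric specifications the paper compares (parameter values are illustrative, not estimates from WTI data):

```python
import numpy as np

def garch_filter(r, omega, alpha, beta, gamma=0.0):
    """Conditional variance of GARCH(1,1); gamma > 0 adds a GJR-style
    leverage term so negative returns raise volatility more (asymmetry)."""
    s2 = np.empty_like(r)
    s2[0] = r.var()
    for t in range(1, len(r)):
        a_t = alpha + gamma * (r[t - 1] < 0)
        s2[t] = omega + a_t * r[t - 1] ** 2 + beta * s2[t - 1]
    return s2

# Simulate symmetric GARCH(1,1) returns, then filter with the true parameters
rng = np.random.default_rng(4)
n, omega, alpha, beta = 2000, 0.05, 0.10, 0.85
r = np.empty(n)
s2 = omega / (1 - alpha - beta)            # start at the unconditional variance
for t in range(n):
    r[t] = rng.normal(0, np.sqrt(s2))
    s2 = omega + alpha * r[t] ** 2 + beta * s2
filtered = garch_filter(r, omega, alpha, beta)
print(round(float(filtered.mean()), 2))    # close to the unconditional variance
```

Setting gamma > 0 when filtering real returns is a minimal way to express the leverage effect the abstract finds in the oil market; likelihood-based comparison of the gamma = 0 and gamma > 0 fits mirrors the paper's model selection.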

  3. Human performance modeling for system of systems analytics.

    Energy Technology Data Exchange (ETDEWEB)

    Dixon, Kevin R.; Lawton, Craig R.; Basilico, Justin Derrick; Longsine, Dennis E. (INTERA, Inc., Austin, TX); Forsythe, James Chris; Gauthier, John Henry; Le, Hai D.

    2008-10-01

A Laboratory-Directed Research and Development project was initiated in 2005 to investigate Human Performance Modeling in a System of Systems analytic environment. SAND2006-6569 and SAND2006-7911 document interim results from this effort; this report documents the final results. The problem is difficult because of the number of humans involved in a System of Systems environment and the generally poorly defined nature of the tasks that each human must perform. A two-pronged strategy was followed: one prong was to develop human models using a probability-based method similar to that first developed for relatively well-understood probability-based performance modeling; the other prong was to investigate more state-of-the-art human cognition models. The probability-based modeling resulted in a comprehensive addition of human-modeling capability to the existing SoSAT computer program. The cognitive modeling resulted in an increased understanding of what is necessary to incorporate cognition-based models into a System of Systems analytic environment.

  4. Comparison of Simple Versus Performance-Based Fall Prediction Models

    Directory of Open Access Journals (Sweden)

    Shekhar K. Gadkaree BS

    2015-05-01

Full Text Available Objective: To compare the predictive ability of standard falls prediction models based on physical performance assessments with more parsimonious prediction models based on self-reported data. Design: We developed a series of fall prediction models progressing in complexity and compared area under the receiver operating characteristic curve (AUC) across models. Setting: National Health and Aging Trends Study (NHATS), which surveyed a nationally representative sample of Medicare enrollees (age ≥65) at baseline (Round 1: 2011-2012) and 1-year follow-up (Round 2: 2012-2013). Participants: In all, 6,056 community-dwelling individuals participated in Rounds 1 and 2 of NHATS. Measurements: Primary outcomes were 1-year incidence of “any fall” and “recurrent falls.” Prediction models were compared and validated in development and validation sets, respectively. Results: A prediction model that included demographic information, self-reported problems with balance and coordination, and previous fall history was the most parsimonious model that optimized AUC for both any fall (AUC = 0.69, 95% confidence interval [CI] = [0.67, 0.71]) and recurrent falls (AUC = 0.77, 95% CI = [0.74, 0.79]) in the development set. Physical performance testing provided a marginal additional predictive value. Conclusion: A simple clinical prediction model that does not include physical performance testing could facilitate routine, widespread falls risk screening in the ambulatory care setting.
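The comparison above hinges on AUC differences between self-report-only and performance-augmented risk scores. A toy reconstruction with simulated data (coefficients and prevalences are invented, chosen so that physical testing adds little signal, echoing the abstract's conclusion):

```python
import numpy as np
from scipy.stats import rankdata

def auc(y, score):
    """Mann-Whitney AUC: probability a random positive outranks a random negative."""
    ranks = rankdata(score)                    # midranks handle tied scores
    n1 = int(y.sum())
    n0 = len(y) - n1
    return (ranks[y == 1].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

rng = np.random.default_rng(5)
n = 4000
history = rng.binomial(1, 0.3, n)              # self-reported fall history
balance = rng.binomial(1, 0.4, n)              # self-reported balance problems
gait = rng.normal(0, 1, n)                     # physical performance test (weak here)
logit = -1.5 + 1.2 * history + 0.9 * balance + 0.1 * gait
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

simple = 1.2 * history + 0.9 * balance         # self-report-only risk score
full = simple + 0.1 * gait                     # adds performance testing
print(round(auc(y, simple), 2), round(auc(y, full), 2))  # nearly identical AUCs
```

When the performance test carries little independent signal, the two AUCs barely differ, which is the situation that makes the simpler self-report model attractive for routine screening.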

  5. Six scenarios of exploiting an ontology based, mobilized learning environment

    NARCIS (Netherlands)

    Kismihók, G.; Szabó, I.; Vas, R.

    2012-01-01

    In this article, six different exploitation possibilities of an educational ontology based, mobilized learning management system are presented. The focal point of this system is the educational ontology model. The first version of this educational ontology model serves as a foundation for curriculum

  6. An improved model to predict performance under mental fatigue.

    Science.gov (United States)

    Peng, Henry T; Bouak, Fethi; Wang, Wenbi; Chow, Renee; Vartanian, Oshin

    2018-01-08

Fatigue has become an increasing problem in our modern society. Using MATLAB as a generic modelling tool, we developed a fatigue model based on an existing one and compared it with a commercial fatigue software package for the prediction of cognitive performance under total and partial sleep deprivation. The flexibility of our fatigue model allowed the addition of new algorithms and mechanisms for non-sleep factors and countermeasures, and thus improved model predictions and usability for both civilian and military applications. This was demonstrated by model simulations of various scenarios and comparison with experimental studies. Our future work will focus on model validation and integration with other modelling tools. Practitioner Summary: Mental fatigue affects health, safety and quality of life in our modern society. In this paper, we report a cognitive fatigue model based on existing models, with newly incorporated components taking both the operator's state of alertness and the task demand into account. The model provides the additional capability of predicting cognitive performance in scenarios involving pharmaceutical countermeasures, different task demands and shift work.

  7. THE USE OF NEURAL NETWORK TECHNOLOGY TO MODEL SWIMMING PERFORMANCE

    Directory of Open Access Journals (Sweden)

    António José Silva

    2007-03-01

Full Text Available The aims of the present study were: to identify the factors which are able to explain performance in the 200 meters individual medley and 400 meters front crawl events in young swimmers; to model the performance in those events using non-linear mathematical methods through artificial neural networks (multi-layer perceptrons); and to assess the precision of the neural network models in predicting performance. A sample of 138 young swimmers (65 males and 73 females) of national level was submitted to a test battery comprising four different domains: kinanthropometric evaluation, dry land functional evaluation (strength and flexibility), swimming functional evaluation (hydrodynamic, hydrostatic and bioenergetic characteristics), and swimming technique evaluation. To establish a profile of the young swimmer, non-linear combinations between preponderant variables for each gender and swim performance in the 200 meters medley and 400 meters front crawl events were developed. For this purpose a feed-forward neural network was used (a multilayer perceptron with three neurons in a single hidden layer). The prognosis precision of the model (error lower than 0.8% between true and estimated performances) is supported by recent evidence. Therefore, we consider that the neural network tool can be a good approach to the resolution of complex problems such as performance modeling and talent identification in swimming and, possibly, in a wide variety of sports.
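The study's network, a multilayer perceptron with three neurons in a single hidden layer, can be sketched with scikit-learn; the predictors and the data-generating process below are hypothetical stand-ins for the test battery:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n = 138                                        # same sample size as the study
# Hypothetical predictors standing in for the four evaluation domains
X = rng.normal(size=(n, 3))
t200 = 150.0 - 4 * X[:, 0] + 3 * X[:, 1] - 2 * X[:, 2] + rng.normal(0, 1, n)

# Multilayer perceptron with three neurons in a single hidden layer
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(3,), solver="lbfgs",
                 max_iter=5000, random_state=0),
)
model.fit(X, t200)
mape = float(np.mean(np.abs(model.predict(X) - t200) / t200)) * 100
print(round(mape, 2))                          # small in-sample percentage error
```

The sub-1% error the paper reports is an in-sample figure of this kind; on data this small, out-of-sample validation (e.g. cross-validation) would be needed before trusting the model for talent identification.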

  8. An ambient agent model for analyzing managers' performance during stress

    Science.gov (United States)

    ChePa, Noraziah; Aziz, Azizi Ab; Gratim, Haned

    2016-08-01

Stress at work has been reported everywhere. Work-related performance during stress is a pattern of reactions that occurs when managers are presented with work demands that are not matched to their knowledge, skills, or abilities, and which challenge their ability to cope. Although there are many prior findings explaining the development of manager performance during stress, less attention has been given to explaining the same concept through computational models. As such, the descriptive nature of psychological theories about managers' performance during stress can be transformed into a causal-mechanistic form that explains the relationship between a series of observed phenomena. This paper proposes an ambient agent model for analyzing managers' performance during stress. A set of properties and variables is identified from the past literature to construct the model. Differential equations have been used in formalizing the model, and the set of equations reflecting the relations involved in the proposed model is presented. The proposed model can be encapsulated within an intelligent agent or robot that can be used to support managers during stress.
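A differential-equation formalization of this kind can be integrated numerically with forward Euler; the two-variable stress/performance dynamics and all rate constants below are invented for illustration, not the paper's model:

```python
import numpy as np

def simulate(demand, dt=0.1, hours=48.0):
    """Forward-Euler integration of invented stress (S) / performance (P)
    dynamics: demand builds stress, stress drags performance down."""
    S, P = 0.1, 1.0
    out = []
    for t in np.arange(0.0, hours, dt):
        dS = 0.3 * demand(t) * (1.0 - S) - 0.1 * S   # build-up vs. decay
        dP = 0.2 * (1.0 - P) - 0.5 * S * P           # recovery vs. stress drag
        S += dt * dS
        P += dt * dP
        out.append((S, P))
    return np.array(out)

# Heavy work demand for the first 24 h, then rest
traj = simulate(lambda t: 1.0 if t < 24.0 else 0.0)
dip = float(traj[:, 1].min())
recovered = traj[-1, 1] > traj[240, 1]    # row 240 is t = 24 h
print(round(dip, 2), bool(recovered))     # performance dips, then recovers
```

The qualitative pattern, performance degrading while demand outpaces coping and recovering once demand subsides, is the kind of causal-mechanistic behaviour such agent models aim to reproduce.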

9. Human performance modeling for system of systems analytics: soldier fatigue.

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, Craig R.; Campbell, James E.; Miller, Dwight Peter

    2005-10-01

The military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives, as can be seen in the Department of Defense's (DoD) Defense Modeling and Simulation Office's (DMSO) Master Plan (DoD 5000.59-P 1995). Toward this goal, the military is currently spending millions of dollars on programs devoted to HPM in various military contexts. Examples include the Human Performance Modeling Integration (HPMI) program within the Air Force Research Laboratory, which focuses on integrating HPMs with constructive models of systems (e.g. cockpit simulations), and the Navy's Human Performance Center (HPC) established in September 2003. Nearly all of these initiatives focus on the interface between humans and a single system. This is insufficient in the era of highly complex, network-centric SoS. This report presents research and development in the area of HPM in a system-of-systems (SoS). Specifically, this report addresses modeling soldier fatigue and the potential impacts soldier fatigue can have on SoS performance.

  10. Product Data Model for Performance-driven Design

    Science.gov (United States)

    Hu, Guang-Zhong; Xu, Xin-Jian; Xiao, Shou-Ne; Yang, Guang-Wu; Pu, Fan

    2017-09-01

When designing large-sized complex machinery products, the design focus is always on the overall performance; however, no performance-driven design theory or method exists. In view of this deficiency in existing design theory, and according to the performance features of complex mechanical products, performance indices are introduced into the traditional design theory of "Requirement-Function-Structure" to construct a new five-domain design theory of "Client Requirement-Function-Performance-Structure-Design Parameter". To support design practice based on this new theory, a product data model is established using the performance indices and the mapping relationships between them and the other four domains. When the product data model is applied to high-speed train design, combined with existing research results and relevant standards, the corresponding data model and its structure involving the five domains of high-speed trains are established, which can provide technical support for studying the relationships between typical performance indices and design parameters and for rapidly achieving a high-speed train scheme design. The five domains provide a reference for the design specification and evaluation criteria of high-speed trains and a new idea for the train's parameter design.

  11. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  12. The influence of conceptual model structure on model performance: a comparative study for 237 French catchments

    Directory of Open Access Journals (Sweden)

    W. R. van Esse

    2013-10-01

    Full Text Available Models with a fixed structure are widely used in hydrological studies and operational applications. For various reasons, these models do not always perform well. As an alternative, flexible modelling approaches allow the identification and refinement of the model structure as part of the modelling process. In this study, twelve different conceptual model structures from the SUPERFLEX framework are compared with the fixed model structure GR4H, using a large set of 237 French catchments and discharge-based performance metrics. The results show that, in general, the flexible approach performs better than the fixed approach. However, the flexible approach has a higher chance of inconsistent results when calibrated on two different periods. When analysing the subset of 116 catchments where the two approaches produce consistent performance over multiple time periods, their average performance relative to each other is almost equivalent. From the point of view of developing a well-performing fixed model structure, the findings favour models with parallel reservoirs and a power function to describe the reservoir outflow. In general, conceptual hydrological models perform better on larger and/or wetter catchments than on smaller and/or drier catchments. The model structures performed poorly when there were large climatic differences between the calibration and validation periods, in catchments with flashy flows, and in catchments with unexplained variations in low flow measurements.

  13. The better model to predict and improve pediatric health care quality: performance or importance-performance?

    Science.gov (United States)

    Olsen, Rebecca M; Bryant, Carol A; McDermott, Robert J; Ortinau, David

    2013-01-01

The perpetual search for ways to improve pediatric health care quality has resulted in a multitude of assessments and strategies; however, there is little research evidence as to their conditions for maximum effectiveness. A major reason for the lack of evaluation research and successful quality improvement initiatives is the methodological challenge of measuring quality from the parent perspective. A comparison of performance-only and importance-performance models was done to determine the better predictor of pediatric health care quality and the more successful method for improving the quality of care provided to children. Fourteen pediatric health care centers serving approximately 250,000 patients in 70,000 households in three West Central Florida counties were studied. A cross-sectional design was used to determine the importance and performance of 50 pediatric health care attributes and four global assessments of pediatric health care quality. Exploratory factor analysis revealed five dimensions of care (physician care, access, customer service, timeliness of services, and health care facility). Hierarchical multiple regression compared the performance-only and the importance-performance models. In-depth interviews, participant observations, and a direct cognitive structural analysis identified the 50 health care attributes included in a survey mailed to parents (n = 1,030). The tailored design method guided survey development and data collection. The importance-performance multiplicative additive model was a better predictor of pediatric health care quality. Attribute importance moderates performance and quality, making the importance-performance model superior for measuring and providing a deeper understanding of pediatric health care quality and a better method for improving the quality of care provided to children. Regardless of attribute performance, if the level of attribute importance is not taken into consideration, health care organizations may spend valuable

  14. Comparison of Predictive Models for PV Module Performance (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Marion, B.

    2008-05-01

    This paper examines three models used to estimate the maximum power (P{sub m}) of PV modules when the irradiance and PV cell temperature are known: (1) the power temperature coefficient model, (2) the PVFORM model, and (3) the bilinear interpolation model. A variation of the power temperature coefficient model is also presented that improved model accuracy. For modeling values of P{sub m}, an 'effective' plane-of-array (POA) irradiance (E{sub e}) and the PV cell temperature (T) are used as model inputs. Using E{sub e} essentially removes the effects of variations in solar spectrum and reflectance losses, and permits the influence of irradiance and temperature on model performance for P{sub m} to be more easily studied. Eq. 1 is used to determine E{sub e} from T and the PV module's measured short-circuit current (I{sub sc}). Zero subscripts denote performance at Standard Reporting Conditions (SRC).
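A minimal sketch of the power temperature coefficient model described in this record. The record does not reproduce the paper's Eq. 1, so the inversion of I{sub sc} to effective irradiance below is a common textbook form, not necessarily the paper's; all parameter values (rated power, coefficients) are illustrative:

```python
def effective_irradiance(i_sc, t_cell, i_sc0=9.0, alpha_isc=0.0005,
                         e0=1000.0, t0=25.0):
    # I_sc scales linearly with irradiance and weakly with cell temperature;
    # inverting that relation gives the effective POA irradiance E_e (W/m^2).
    # Zero-subscript values denote Standard Reporting Conditions (SRC).
    return e0 * i_sc / (i_sc0 * (1.0 + alpha_isc * (t_cell - t0)))


def power_temp_coeff_model(e_e, t_cell, p_m0=250.0, gamma=-0.004,
                           e0=1000.0, t0=25.0):
    # P_m scales with effective irradiance and degrades linearly with cell
    # temperature through the power temperature coefficient gamma (1/degC).
    return p_m0 * (e_e / e0) * (1.0 + gamma * (t_cell - t0))
```

At SRC (E_e = 1000 W/m^2, T = 25 degC) the model returns the rated power; warmer cells or lower irradiance reduce P_m linearly.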

  15. Aircraft Anomaly Detection Using Performance Models Trained on Fleet Data

    Science.gov (United States)

    Gorinevsky, Dimitry; Matthews, Bryan L.; Martin, Rodney

    2012-01-01

This paper describes an application of data mining technology called Distributed Fleet Monitoring (DFM) to Flight Operational Quality Assurance (FOQA) data collected from a fleet of commercial aircraft. DFM transforms the data into aircraft performance models, flight-to-flight trends, and individual flight anomalies by fitting a multi-level regression model to the data. The model represents aircraft flight performance and takes into account fixed effects such as flight-to-flight and vehicle-to-vehicle variability. The regression parameters include aerodynamic coefficients and other aircraft performance parameters that are usually identified by aircraft manufacturers in flight tests. Using DFM, the multi-terabyte FOQA data set with half a million flights was processed in a few hours. The anomalies found include wrong values of computed variables (e.g., aircraft weight), sensor failures and biases, and failures, biases, and trends in flight actuators. These anomalies were missed by the existing airline monitoring of FOQA data exceedances.

  16. A Bibliometric Analysis and Review on Performance Modeling Literature

    Directory of Open Access Journals (Sweden)

    Barbara Livieri

    2015-04-01

Full Text Available In management practice, performance indicators are considered a prerequisite to making informed decisions in line with the organization’s goals. On the other hand, indicators summarize compound phenomena in a few digits, which can lead to inadequate decisions, biased by information loss and conflicting values. Model-driven approaches in enterprise engineering can be very effective in avoiding these pitfalls, or in keeping them under control. For that reason, “performance modeling” is well placed to play a primary role in the “model driven enterprise” scenario, together with process, information and other enterprise-related aspects. In this perspective, we propose a systematic review of the literature on performance modeling in order to retrieve, classify, and summarize existing research, identify the core authors, and define areas and opportunities for future research.

  17. Testing a Model of Work Performance in an Academic Environment

    Directory of Open Access Journals (Sweden)

    B. Charles Tatum

    2012-04-01

    Full Text Available In modern society, people both work and study. The intersection between organizational and educational research suggests that a common model should apply to both academic and job performance. The purpose of this study was to apply a model of work and job performance (based on general expectancy theory to a classroom setting, and test the predicted relationships using a causal/path model methodology. The findings revealed that motivation and ability predicted student expectations and self-efficacy, and that expectations and efficacy predicted class performance. Limitations, implications, and future research directions are discussed. This study showed how the research in industrial and organizational psychology is relevant to education. It was concluded that greater effort should be made to integrate knowledge across a wider set of domains.

  18. CORPORATE FORESIGHT AND PERFORMANCE: A CHAIN-OF-EFFECTS MODEL

    DEFF Research Database (Denmark)

    Jissink, Tymen; Huizingh, Eelko K.R.E.; Rohrbeck, René

    2015-01-01

In this paper we develop and validate a measurement scale for corporate foresight and examine its impact on performance in a chain-of-effects model. We conceptualize corporate foresight as an organizational ability consisting of five distinct dimensions: information scope, method usage, people, formal organization, and culture. We investigate the relation of corporate foresight with three innovation performance dimensions – new product success, new product innovativeness, and financial performance. We use partial-least-squares structural equations modelling to assess our measurement models and test our research hypotheses. Using a cross-industry sample of 153 innovative firms, we find that corporate foresight can be validly and reliably measured by our measurement instrument. The results of the structural model support the hypothesized positive effects of corporate foresight on all...

  19. Exploiting the Errors: A Simple Approach for Improved Volatility Forecasting

    DEFF Research Database (Denmark)

    Bollerslev, Tim; Patton, Andrew J.; Quaedvlieg, Rogier

We propose a new family of easy-to-implement realized volatility based forecasting models. The models exploit the asymptotic theory for high-frequency realized volatility estimation to improve the accuracy of the forecasts. By allowing the parameters of the models to vary explicitly with the (estimated) degree of measurement error, the models exhibit stronger persistence, and in turn generate more responsive forecasts, when the measurement error is relatively low. Implementing the new class of models for the S&P500 equity index and the individual constituents of the Dow Jones Industrial Average...

  20. Passive IR sensor performance analysis using Mathcad modeling

    Science.gov (United States)

    Wan, William

    2009-05-01

This paper presents an end-to-end physics-based performance model for a passive infrared (IR) sensor built in a Mathcad® worksheet. The model calculates both the temporal and spatial noise of a staring focal plane array (FPA) IR sensor and the signal-to-noise ratio (SNR) of the sensor against different targets at different ranges, with atmospheric effects (both turbulence and extinction) considered. Probability of detection (Pd) against these targets, based on the SNR results, is also discussed. The model allows the user to easily define basic sensor parameters such as spectral band, detector FPA format & size, field of view (FOV), optics F/#, etc. In addition, target and environmental parameters are also considered in the analyses. This performance model will allow the user to determine whether a particular IR sensor design would meet the requirements of its operational specifications, and would help the user refine the various parameters of the IR sensor at the early design stage.

  1. Model-based approach for elevator performance estimation

    Science.gov (United States)

    Esteban, E.; Salgado, O.; Iturrospe, A.; Isasa, I.

    2016-02-01

In this paper, a dynamic model for an elevator installation is presented in the state space domain. The model comprises both the mechanical and the electrical subsystems, including the electrical machine and a closed-loop field oriented control. The proposed model is employed for monitoring the condition of the elevator installation. The adopted model-based approach for monitoring employs the Kalman filter as an observer. A Kalman observer estimates the elevator car acceleration, which determines the elevator ride quality, based solely on the machine control signature and the encoder signal. Five elevator key performance indicators are then calculated from the estimated car acceleration. The proposed procedure is experimentally evaluated by comparing the key performance indicators calculated from the estimated car acceleration with those obtained from actual acceleration measurements in a test bench. Finally, the proposed procedure is compared with a sliding mode observer.
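The Kalman-observer idea can be illustrated with a minimal scalar filter. This is a sketch under simplifying assumptions (a random-walk state model with hypothetical noise values), not the paper's full electro-mechanical state-space observer:

```python
def kalman_filter_1d(measurements, q=1e-4, r=0.25, x0=0.0, p0=1.0):
    # Scalar Kalman observer with a random-walk state model:
    #   predict:  x stays, variance grows by process noise q
    #   update:   blend in measurement z with gain k = p / (p + r)
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p += q                   # predict step (process noise q)
        k = p / (p + r)          # Kalman gain
        x += k * (z - x)         # correct with the innovation
        p *= (1.0 - k)           # posterior variance
        estimates.append(x)
    return estimates
```

Fed a stream of noisy acceleration readings, the estimate converges toward the underlying signal while the gain (and hence responsiveness) shrinks as confidence grows.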

  2. Modelling of green roof hydrological performance for urban drainage applications

    DEFF Research Database (Denmark)

    Locatelli, Luca; Mark, Ole; Mikkelsen, Peter Steen

    2014-01-01

Green roofs are being widely implemented for stormwater management, and their impact on the urban hydrological cycle can be evaluated by incorporating them into urban drainage models. This paper presents a model of green roof long-term and single-event hydrological performance. The model includes surface and subsurface storage components representing the overall retention capacity of the green roof, which is continuously re-established by evapotranspiration. The runoff from the model is described through a non-linear reservoir approach. The model was calibrated and validated using measurement data... Furthermore, the model was used to evaluate the variation of the average annual runoff from green roofs as a function of the total available storage and vegetation type. The results show that even a few millimeters of storage can reduce the mean annual runoff by up to 20% when compared to a traditional roof.

  3. Automatic image exploitation system for small UAVs

    Science.gov (United States)

    Heinze, N.; Esswein, M.; Krüger, W.; Saur, G.

    2008-04-01

For surveillance and reconnaissance tasks, small UAVs are of growing importance. These UAVs have an endurance of several hours, but a small payload of only a few kilograms. As a consequence, lightweight sensors and cameras have to be used, without a mechanically stabilized high-precision sensor platform, which would exceed the payload and cost limitations. An example of such a system is the German UAV Luna with optical and IR sensors on board. For such platforms we developed image exploitation algorithms. The algorithms comprise mosaicking, stabilization, image enhancement, video-based moving target indication, and stereo-image generation. Other products are large geo-coded image mosaics, stereo mosaics, and 3-D model generation. For test and assessment of these algorithms the experimental system ABUL has been developed, in which the algorithms are integrated. The ABUL system is used for tests and assessment by military PIs.

  4. Model of service-oriented catering supply chain performance evaluation

    Directory of Open Access Journals (Sweden)

    Juanqiong Gou

    2013-03-01

Full Text Available Purpose: The aim of this paper is to construct a performance evaluation model for a service-oriented catering supply chain. Design/methodology/approach: Based on research into the current situation of the catering industry, this paper summarizes the characteristics of the catering supply chain and then presents a service-oriented catering supply chain model built on a logistics and information platform. Finally, the fuzzy AHP method is used to evaluate the performance of the service-oriented catering supply chain. Findings: From the analysis of the characteristics of the catering supply chain, we construct the performance evaluation model in order to guarantee food safety, logistics efficiency, price stability, and so on. Practical implications: An efficient and effective service supply chain model can be used not only for an enterprise's own improvement, but also for selecting different customers and choosing different models of development. Originality/value: This paper gives a new definition of the service-oriented catering supply chain and offers a model to evaluate its performance.

  5. Kinetic models in industrial biotechnology - Improving cell factory performance.

    Science.gov (United States)

    Almquist, Joachim; Cvijovic, Marija; Hatzimanikatis, Vassily; Nielsen, Jens; Jirstrand, Mats

    2014-07-01

An increasing number of industrial bioprocesses capitalize on living cells by using them as cell factories that convert sugars into chemicals. These processes range from the production of bulk chemicals in yeasts and bacteria to the synthesis of therapeutic proteins in mammalian cell lines. One of the tools in the continuous search for improved performance of such production systems is the development and application of mathematical models. To be of value for industrial biotechnology, mathematical models should be able to assist in the rational design of cell factory properties or of the production processes in which they are utilized. Kinetic models are particularly suitable towards this end because they are capable of representing the complex biochemistry of cells in a more complete way than most other types of models. They can, at least in principle, be used to understand, predict, and evaluate in detail the effects of adding, removing, or modifying molecular components of a cell factory, and to support the design of the bioreactor or fermentation process. However, several challenges still remain before kinetic modeling will reach the degree of maturity required for routine application in industry. Here we review the current status of kinetic cell factory modeling. Emphasis is on modeling methodology concepts, including model network structure, kinetic rate expressions, parameter estimation, optimization methods, identifiability analysis, model reduction, and model validation, but several applications of kinetic models for the improvement of cell factories are also discussed. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
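A minimal example of the kind of kinetic model the review covers: Monod growth kinetics for a batch fermentation, integrated with forward Euler. All rate constants and yields are illustrative, not taken from any specific cell factory:

```python
def simulate_batch(mu_max=0.4, ks=0.5, yxs=0.5,
                   x0=0.1, s0=10.0, dt=0.01, t_end=20.0):
    # Monod kinetics for biomass X growing on substrate S:
    #   dX/dt = mu(S) * X,   dS/dt = -mu(S) * X / Yxs,
    #   mu(S) = mu_max * S / (Ks + S)
    # Integrated with forward Euler for brevity.
    x, s, t = x0, s0, 0.0
    while t < t_end:
        mu = mu_max * s / (ks + s) if s > 0.0 else 0.0
        dx = mu * x * dt
        x += dx                       # biomass growth
        s = max(0.0, s - dx / yxs)    # substrate consumption
        t += dt
    return x, s
```

Once the substrate is depleted, the final biomass approaches x0 + Yxs * s0, which is the mass balance a correct kinetic model must respect.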

  6. Aqua/Aura Updated Inclination Adjust Maneuver Performance Prediction Model

    Science.gov (United States)

    Boone, Spencer

    2017-01-01

    This presentation will discuss the updated Inclination Adjust Maneuver (IAM) performance prediction model that was developed for Aqua and Aura following the 2017 IAM series. This updated model uses statistical regression methods to identify potential long-term trends in maneuver parameters, yielding improved predictions when re-planning past maneuvers. The presentation has been reviewed and approved by Eric Moyer, ESMO Deputy Project Manager.

  7. Does better rainfall interpolation improve hydrological model performance?

    Science.gov (United States)

    Bàrdossy, Andràs; Kilsby, Chris; Lewis, Elisabeth

    2017-04-01

High spatial variability of precipitation is one of the main sources of uncertainty in rainfall/runoff modelling. Spatially distributed models require detailed space-time information on precipitation as input. In the past decades, considerable effort has been spent on improving precipitation interpolation from point observations. Different geostatistical methods such as Ordinary Kriging, External Drift Kriging, or copula-based interpolation can be used to find the best estimators for unsampled locations. The purpose of this work is to investigate to what extent more sophisticated precipitation estimation methods can improve model performance. For this purpose the Wye catchment in Wales was selected. The physically-based, spatially-distributed hydrological model SHETRAN is used to describe the hydrological processes in the catchment. 31 raingauges with 1-hourly temporal resolution are available for a time period of 6 years. In order to avoid the effect of model uncertainty, model parameters were not altered in this study. Instead, 100 random subsets consisting of 14 stations each were selected. For each of the configurations, precipitation was interpolated for each time step using nearest neighbor (NN), inverse distance (ID) and Ordinary Kriging (OK). The variogram was obtained using the temporal correlation of the time series measured at different locations. The interpolated data were used as input for the spatially distributed model. Performance was evaluated for daily mean discharges using the Nash-Sutcliffe coefficient, temporal correlations, flow volumes and flow duration curves. The results show that the simple NN and the sophisticated OK performed practically equally well, while ID performed worse. NN was often better for high flows, because NN does not reduce the variance, while OK and ID yield smooth precipitation fields.
The study points out the importance of precipitation variability and suggests the use of conditional spatial simulation as
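The two simplest interpolators compared in the study can be sketched in a few lines (station coordinates and values here are hypothetical; Kriging additionally requires a variogram model and is omitted):

```python
import math

def nearest_neighbor(stations, values, x, y):
    # NN interpolation: take the value at the closest rain gauge.
    d = [math.hypot(sx - x, sy - y) for sx, sy in stations]
    return values[d.index(min(d))]

def inverse_distance(stations, values, x, y, power=2.0):
    # IDW: weight each gauge by 1/d^power; yields smoother fields than NN.
    num, den = 0.0, 0.0
    for (sx, sy), v in zip(stations, values):
        d = math.hypot(sx - x, sy - y)
        if d == 0.0:
            return v            # exactly on a gauge
        w = d ** -power
        num += w * v
        den += w
    return num / den
```

Note the variance-reduction effect the abstract mentions: NN reproduces gauge values exactly across each cell, while IDW averages them into a smoother field.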

  8. Does segmentation always improve model performance in credit scoring?

    OpenAIRE

    Bijak, Katarzyna; Thomas, Lyn C.

    2012-01-01

    Credit scoring allows for the credit risk assessment of bank customers. A single scoring model (scorecard) can be developed for the entire customer population, e.g. using logistic regression. However, it is often expected that segmentation, i.e. dividing the population into several groups and building separate scorecards for them, will improve the model performance. The most common statistical methods for segmentation are the two-step approaches, where logistic regression follows Classificati...

  9. A fuzzy approach for statistical modeling of operators’ performance

    Directory of Open Access Journals (Sweden)

    Mohammad Hadi Madadi

    2017-10-01

Full Text Available The aim of this paper is to integrate a fuzzy approach into statistical process control in order to provide a comprehensive description of an operator’s performance. To this end, all influential factors in the quality of a product are simultaneously controlled to assess performance on each working day. A fuzzy x̄ chart is then used for statistical modeling of the process over a month. This paper shows that the fuzzy control chart can provide a good indication for evaluating an operator's work performance.

  10. Human performance models for computer-aided engineering

    Science.gov (United States)

    Elkind, Jerome I. (Editor); Card, Stuart K. (Editor); Hochberg, Julian (Editor); Huey, Beverly Messick (Editor)

    1989-01-01

    This report discusses a topic important to the field of computational human factors: models of human performance and their use in computer-based engineering facilities for the design of complex systems. It focuses on a particular human factors design problem -- the design of cockpit systems for advanced helicopters -- and on a particular aspect of human performance -- vision and related cognitive functions. By focusing in this way, the authors were able to address the selected topics in some depth and develop findings and recommendations that they believe have application to many other aspects of human performance and to other design domains.

  11. The integration of intrapreneurship into a performance management model

    Directory of Open Access Journals (Sweden)

    Thabo WL Foba

    2007-02-01

    Full Text Available This study aimed to investigate the feasibility of using the dynamics of intrapreneurship to develop a new generation performance management model based on the structural dynamics of the Balanced Score Card approach. The literature survey covered entrepreneurship, from which the construct, intrapreneurship, was synthesized. Reconstructive logic and Hermeneutic methodology were used in studying the performance management systems and the Balanced Score Card approach. The dynamics were then integrated into a new approach for the management of performance of intrapreneurial employees in the corporate environment. An unstructured opinion survey followed: a sample of intrapreneurship students evaluated and validated the model’s conceptual feasibility and probable practical value.

  12. Evaluating performances of simplified physically based models for landslide susceptibility

    Science.gov (United States)

    Formetta, G.; Capparelli, G.; Versace, P.

    2015-12-01

Rainfall-induced shallow landslides cause loss of life and significant damage involving private and public properties, transportation systems, etc. Prediction of locations susceptible to shallow landslides is a complex task that involves many disciplines: hydrology, geotechnical science, geomorphology, and statistics. Usually two main approaches are used to accomplish this task: statistical or physically based models. Reliable model application involves automatic parameter calibration, objective quantification of the quality of susceptibility maps, and model sensitivity analysis. This paper presents a methodology to systematically and objectively calibrate, verify and compare different models and different model performance indicators, in order to identify and select the models whose behavior is most reliable for a certain case study. The procedure was implemented in a package of models for landslide susceptibility analysis integrated into the NewAge-JGrass hydrological model. The package includes three simplified physically based models for landslide susceptibility analysis (M1, M2, and M3) and a component for model verification. It computes eight goodness-of-fit indices by comparing pixel-by-pixel model results and measurement data. Moreover, the package's integration in NewAge-JGrass allows the use of other components, such as geographic information system tools to manage input-output processes, and automatic calibration algorithms to estimate model parameters. The system was applied to a case study in Calabria (Italy) along the Salerno-Reggio Calabria highway, between Cosenza and the Altilia municipality. The analysis showed that, among all the optimized indices and all three models, optimization of the index distance to perfect classification in the receiver operating characteristic plane (D2PC) coupled with model M3 is the best modeling solution for our test case.
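The D2PC index named in the record has a standard definition as the Euclidean distance from a classifier's point in ROC space to the perfect classifier at (FPR, TPR) = (0, 1); a sketch under that assumption:

```python
import math

def d2pc(tp, fp, tn, fn):
    # Distance to perfect classification in the ROC plane: how far the
    # classifier's (FPR, TPR) point lies from the perfect corner (0, 1).
    tpr = tp / (tp + fn)   # true positive rate (sensitivity)
    fpr = fp / (fp + tn)   # false positive rate (1 - specificity)
    return math.sqrt((1.0 - tpr) ** 2 + fpr ** 2)
```

A perfect pixel-by-pixel susceptibility map gives D2PC = 0, so calibration minimizes this index.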

  13. Modelling and performance analysis of four and eight element TCAS

    Science.gov (United States)

    Sampath, K. S.; Rojas, R. G.; Burnside, W. D.

    1990-01-01

    This semi-annual report describes the work performed during the period September 1989 through March 1990. The first section presents a description of the effect of the engines of the Boeing 737-200 on the performance of a bottom mounted eight-element traffic alert and collision avoidance system (TCAS). The second section deals exclusively with a four element TCAS antenna. The model obtained to simulate the four element TCAS and new algorithms developed for studying its performance are described. The effect of location on its performance when mounted on top of a Boeing 737-200 operating at 1060 MHz is discussed. It was found that the four element TCAS generally does not perform as well as the eight element TCAS III.

  14. Performance evaluation of groundwater model hydrostratigraphy from airborne electromagnetic data and lithological borehole logs

    DEFF Research Database (Denmark)

    Marker, Pernille Aabye; Foged, N.; He, X.

    2015-01-01

    hydrological performance by comparison of performance statistics from comparable hydrological models, the cluster model performed competitively. Calibrations of 11 hydrostratigraphic cluster models with 1-11 hydraulic conductivity zones showed improved hydrological performance with an increasing number...

  15. Model for determining and optimizing delivery performance in industrial systems

    Directory of Open Access Journals (Sweden)

    Fechete Flavia

    2017-01-01

Full Text Available Performance means achieving organizational objectives regardless of their nature and variety, and even overcoming them. Improving performance is one of the major goals of any company. Achieving global performance means not only obtaining economic performance; other functions must also be taken into account, such as quality, delivery, costs, and even employee satisfaction. This paper aims to improve the delivery performance of an industrial system, given its very low results. The delivery performance took into account all categories of performance indicators, such as on-time delivery, backlog efficiency, and transport efficiency. The research was focused on optimizing the delivery performance of the industrial system using linear programming. Modeling the delivery function using linear programming yielded the precise quantities to be produced and delivered each month by the industrial system in order to minimize transport costs, satisfy customer orders, and control stock. The optimization led to a substantial improvement in all four performance indicators that concern deliveries.
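The linear-programming formulation behind such a delivery optimization can be sketched with a toy two-site problem (all costs, capacities, and demand figures below are hypothetical, and vertex enumeration stands in for a real LP solver):

```python
def solve_delivery_lp(costs=(3.0, 5.0), demand=90.0, caps=(60.0, 70.0)):
    # Toy delivery problem: choose monthly quantities x1, x2 from two
    # production sites to minimize transport cost c1*x1 + c2*x2, subject to
    #   x1 + x2 >= demand,  0 <= x1 <= caps[0],  0 <= x2 <= caps[1].
    # For a 2-variable LP the optimum sits at a vertex of the feasible
    # region, so we enumerate the candidate vertices directly.
    candidates = [
        (caps[0], demand - caps[0]),   # site 1 at capacity, demand tight
        (demand - caps[1], caps[1]),   # site 2 at capacity, demand tight
        (caps[0], caps[1]),            # both sites at capacity
        (demand, 0.0),                 # demand met by site 1 alone
        (0.0, demand),                 # demand met by site 2 alone
    ]
    feasible = [(x1, x2) for x1, x2 in candidates
                if 0.0 <= x1 <= caps[0] and 0.0 <= x2 <= caps[1]
                and x1 + x2 >= demand - 1e-9]
    return min(feasible, key=lambda p: costs[0] * p[0] + costs[1] * p[1])
```

With the defaults the cheaper site is loaded to capacity (x1 = 60) and the remainder (x2 = 30) covers demand; a production-scale model would add stock-balance constraints and a general solver.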

  16. Performing Geographic Information System Analyses on Building Information Management Models

    OpenAIRE

    Bengtsson, Jonas; Grönkvist, Mikael

    2017-01-01

    As the usage of both BIM (Building Information Modelling) and 3D-GIS (Three-Dimensional Geographic Information Systems) has increased within the field of urban development and construction, so has the interest in connecting these two tools.  One possibility of integration is the potential of visualising BIM models together with other spatial data in 3D. Another is to be able to perform spatial 3D analyses on the models. Both of these can be achieved through use of GIS software. This study exp...

  17. Metallic Rotor Sizing and Performance Model for Flywheel Systems

    Science.gov (United States)

    Moore, Camille J.; Kraft, Thomas G.

    2012-01-01

    The NASA Glenn Research Center (GRC) is developing flywheel system requirements and designs for terrestrial and spacecraft applications. Several generations of flywheels have been designed and tested at GRC using in-house expertise in motors, magnetic bearings, controls, materials and power electronics. The maturation of a flywheel system from the concept phase to the preliminary design phase is accompanied by maturation of the Integrated Systems Performance model, where estimating relationships are replaced by physics based analytical techniques. The modeling can incorporate results from engineering model testing and emerging detail from the design process.

  18. Application and Performance Analysis of a New Bundle Adjustment Model

    Science.gov (United States)

    Sun, Y.; Liu, X.; Chen, R.; Wan, J.; Wang, Q.; Wang, H.; Li, Y.; Yan, L.

    2017-09-01

As the basis of photogrammetry, Bundle Adjustment (BA) can recover camera poses accurately, reconstruct 3D models of the environment, and serve as a criterion for digital production. The classical nonlinear optimization of the BA model in Euclidean coordinates is strongly dependent on the initial values, and may converge slowly or fail to reach a global minimum. This paper first introduces a new BA model based on parallax angle feature parametrization, and then analyses the applications and performance of the model in the photogrammetry field. To assess the impact and performance of the model (especially in aerial photogrammetry), experiments using two aerial datasets under different initial values were conducted. The experimental results are better than those of some well-known BA software packages, and the simulation results illustrate the greater stability of the new model compared with normal BA in Euclidean coordinates. In all, the new BA model shows promise for faster and more efficient aerial photogrammetry, with good convergence and fast convergence speed.

  19. Introducing Model Predictive Control for Improving Power Plant Portfolio Performance

    DEFF Research Database (Denmark)

    Edlund, Kristian Skjoldborg; Bendtsen, Jan Dimon; Børresen, Simon

    2008-01-01

    This paper introduces a model predictive control (MPC) approach for construction of a controller for balancing the power generation against consumption in a power system. The objective of the controller is to coordinate a portfolio consisting of multiple power plant units in the effort to perform...
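The receding-horizon idea at the core of MPC can be illustrated with a toy scalar balancing problem. This is only a sketch of the principle (a brute-force search over a scalar integrator with hypothetical costs), not the paper's portfolio controller, which coordinates multiple plant models via proper optimization:

```python
from itertools import product

def mpc_step(x, target, horizon=5, choices=(-1.0, 0.0, 1.0)):
    # Receding-horizon control for a scalar integrator x_{k+1} = x_k + u_k:
    # enumerate all input sequences over the horizon, score them by tracking
    # error plus a small input penalty, and apply only the first move.
    best_cost, best_u0 = float("inf"), 0.0
    for seq in product(choices, repeat=horizon):
        xk, cost = x, 0.0
        for u in seq:
            xk += u
            cost += (xk - target) ** 2 + 0.01 * u * u
        if cost < best_cost:
            best_cost, best_u0 = cost, seq[0]
    return best_u0
```

Called in a loop with fresh measurements, the controller steers generation toward consumption while re-optimizing at every step, which is what lets MPC handle changing demand.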

  20. Towards a Social Networks Model for Online Learning & Performance

    Science.gov (United States)

    Chung, Kon Shing Kenneth; Paredes, Walter Christian

    2015-01-01

    In this study, we develop a theoretical model to investigate the association between social network properties, "content richness" (CR) in academic learning discourse, and performance. CR is the extent to which one contributes content that is meaningful, insightful and constructive to aid learning and by social network properties we…

  1. Models for the financial-performance effects of Marketing

    NARCIS (Netherlands)

    Hanssens, D.M.; Dekimpe, Marnik; Wierenga, B.; van der Lans, R.

    We consider marketing-mix models that explicitly include financial performance criteria. These financial metrics are not only comparable across the marketing mix, they also relate well to investors’ evaluation of the firm. To that extent, we treat marketing as an investment in customer value

  2. A model checker for performance and dependability properties

    NARCIS (Netherlands)

    Hermanns, H.; Karelse, F.; Katoen, Joost P.; Meyer-Kayser, J.; Siegle, M.

    2001-01-01

    Markov chains are widely used in the context of performance and reliability evaluation of systems of various nature. Model checking of such chains with respect to a given (branching) temporal logic formula has been proposed for both the discrete [8] and the continuous time setting [1], [3]. In this

  3. Performance in model transformations: experiments with ATL and QVT

    NARCIS (Netherlands)

    van Amstel, Marcel; Bosems, S.; Ivanov, Ivan; Ferreira Pires, Luis; Cabot, Jordi; Visser, Eelco

    Model transformations are increasingly being incorporated in software development processes. However, as systems being developed with transformations grow in size and complexity, the performance of the transformations tends to degrade. In this paper we investigate the factors that have an impact on

  4. Algorithms and Methods for High-Performance Model Predictive Control

    DEFF Research Database (Denmark)

    Frison, Gianluca

    The goal of this thesis is to investigate algorithms and methods to reduce the solution time of solvers for Model Predictive Control (MPC). The thesis is accompanied with an open-source toolbox for High-Performance implementation of solvers for MPC (HPMPC), that contains the source code of all...

  5. Information Processing Models and Computer Aids for Human Performance.

    Science.gov (United States)

    Swets, John A.; And Others

    Progress is reported on four research tasks. An experiment tested the effectiveness of a computer-based phonology instructional system for second-language learning. In research on models of human-computer interactions, experiments were performed demonstrating that the provision of certain incentives to the users of a time-sharing system can have…

  6. Probabilistic performance analysis using the SLEUTH fuel modelling code

    Energy Technology Data Exchange (ETDEWEB)

    Palmer, I.D.

    1986-01-01

    The paper describes the development and sample use of a computer code which automates both the Monte Carlo and response surface approaches to probabilistic fuel performance modelling utilising the SLEUTH-82 deterministic program. A number of the statistical procedures employed, which have been prepared as independent computer codes, are also described. These are of general applicability in many areas of probabilistic assessment.

  7. Performances of estimators of linear auto-correlated error model ...

    African Journals Online (AJOL)

    The performances of five estimators of linear models with autocorrelated disturbance terms are compared when the independent variable is exponential. The results reveal that for both small and large samples, the Ordinary Least Squares (OLS) estimator compares favourably with the Generalized Least Squares (GLS) estimators in ...
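
    As a toy illustration of the comparison described (not the study's actual design), the sketch below simulates a linear model with an exponential regressor and AR(1) disturbances, then estimates the slope by OLS and by a GLS-style quasi-differencing transform with the autocorrelation coefficient assumed known:

    ```python
    import random

    def simulate(n, beta=2.0, rho=0.8, seed=1):
        # y_t = beta * x_t + e_t, with AR(1) errors e_t = rho*e_{t-1} + u_t
        # and an exponentially growing independent variable.
        rng = random.Random(seed)
        x = [1.05 ** t for t in range(n)]
        e, y = 0.0, []
        for t in range(n):
            e = rho * e + rng.gauss(0.0, 1.0)
            y.append(beta * x[t] + e)
        return x, y

    def ols_slope(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        return num / sum((a - mx) ** 2 for a in x)

    def gls_slope(x, y, rho):
        # Quasi-differencing (Cochrane-Orcutt style) with known rho turns the
        # AR(1) disturbance into white noise, then applies OLS.
        xs = [x[t] - rho * x[t - 1] for t in range(1, len(x))]
        ys = [y[t] - rho * y[t - 1] for t in range(1, len(y))]
        return ols_slope(xs, ys)

    x, y = simulate(200)
    b_ols, b_gls = ols_slope(x, y), gls_slope(x, y, 0.8)
    ```

    With a rapidly growing regressor both estimators recover the true slope closely, which is consistent with the abstract's finding that OLS compares favourably with GLS in this setting.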

  8. Modeling performance measurement applications and implementation issues in DEA

    CERN Document Server

    Cook, Wade D

    2005-01-01

    Addresses advanced/new DEA methodology and techniques developed for modeling unique and new performance evaluation issues. Presents new DEA methodology and techniques via discussions on how to solve managerial problems. Provides an easy-to-use DEA software package, DEAFrontier (www.deafrontier.com), which is an excellent tool for both DEA researchers and practitioners.

  9. Stutter-Step Models of Performance in School

    Science.gov (United States)

    Morgan, Stephen L.; Leenman, Theodore S.; Todd, Jennifer J.; Kentucky; Weeden, Kim A.

    2013-01-01

    To evaluate a stutter-step model of academic performance in high school, this article adopts a unique measure of the beliefs of 12,591 high school sophomores from the Education Longitudinal Study, 2002-2006. Verbatim responses to questions on occupational plans are coded to capture specific job titles, the listing of multiple jobs, and the listing…

  10. Performance and Cognitive Assessment in 3-D Modeling

    Science.gov (United States)

    Fahrer, Nolan E.; Ernst, Jeremy V.; Branoff, Theodore J.; Clark, Aaron C.

    2011-01-01

    The purpose of this study was to investigate identifiable differences between performance and cognitive assessment scores in a 3-D modeling unit of an engineering drafting course curriculum. The study aimed to provide further investigation of the need of skill-based assessments in engineering/technical graphics courses to potentially increase…

  11. Acoustic performance of industrial mufflers with CAE modeling and simulation

    Directory of Open Access Journals (Sweden)

    Jeon Soohong

    2014-12-01

    This paper investigates the noise transmission performance of industrial mufflers widely used in ships based on CAE modeling and simulation. Since industrial mufflers have very complicated internal structures, the conventional Transfer Matrix Method (TMM) is of limited use. CAE modeling and simulation is therefore required, incorporating commercial software packages: CATIA for geometry modeling, MSC/PATRAN for FE meshing and LMS/SYSNOISE for analysis. The main difficulties in this study arise from the complicated arrangement of reactive elements, perforated walls and absorption materials. The reactive elements and absorbent materials are modeled by applying boundary conditions given by impedance. The perforated walls are modeled by applying the transfer impedance on the duplicated node mesh. The CAE approach presented in this paper is verified by comparison with the theoretical solution of a concentric-tube resonator and is applied to industrial mufflers.

  12. Statistical Performance Analysis and Modeling Techniques for Nanometer VLSI Designs

    CERN Document Server

    Shen, Ruijing; Yu, Hao

    2012-01-01

    Since process variation and chip performance uncertainties have become more pronounced as technologies scale down into the nanometer regime, accurate and efficient modeling or characterization of variations from the device to the architecture level have  become imperative for the successful design of VLSI chips. This book provides readers with tools for variation-aware design methodologies and computer-aided design (CAD) of VLSI systems, in the presence of process variations at the nanometer scale. It presents the latest developments for modeling and analysis, with a focus on statistical interconnect modeling, statistical parasitic extractions, statistical full-chip leakage and dynamic power analysis considering spatial correlations, statistical analysis and modeling for large global interconnects and analog/mixed-signal circuits.  Provides readers with timely, systematic and comprehensive treatments of statistical modeling and analysis of VLSI systems with a focus on interconnects, on-chip power grids and ...

  13. Acoustic performance of industrial mufflers with CAE modeling and simulation

    Science.gov (United States)

    Jeon, Soohong; Kim, Daehwan; Hong, Chinsuk; Jeong, Weuibong

    2014-12-01

    This paper investigates the noise transmission performance of industrial mufflers widely used in ships based on CAE modeling and simulation. Since industrial mufflers have very complicated internal structures, the conventional Transfer Matrix Method (TMM) is of limited use. CAE modeling and simulation is therefore required, incorporating commercial software packages: CATIA for geometry modeling, MSC/PATRAN for FE meshing and LMS/SYSNOISE for analysis. The main difficulties in this study arise from the complicated arrangement of reactive elements, perforated walls and absorption materials. The reactive elements and absorbent materials are modeled by applying boundary conditions given by impedance. The perforated walls are modeled by applying the transfer impedance on the duplicated node mesh. The CAE approach presented in this paper is verified by comparison with the theoretical solution of a concentric-tube resonator and is applied to industrial mufflers.

  14. Acoustic performance of industrial mufflers with CAE modeling and simulation

    Directory of Open Access Journals (Sweden)

    Soohong Jeon

    2014-12-01

    This paper investigates the noise transmission performance of industrial mufflers widely used in ships based on CAE modeling and simulation. Since industrial mufflers have very complicated internal structures, the conventional Transfer Matrix Method (TMM) is of limited use. CAE modeling and simulation is therefore required, incorporating commercial software packages: CATIA for geometry modeling, MSC/PATRAN for FE meshing and LMS/SYSNOISE for analysis. The main difficulties in this study arise from the complicated arrangement of reactive elements, perforated walls and absorption materials. The reactive elements and absorbent materials are modeled by applying boundary conditions given by impedance. The perforated walls are modeled by applying the transfer impedance on the duplicated node mesh. The CAE approach presented in this paper is verified by comparison with the theoretical solution of a concentric-tube resonator and is applied to industrial mufflers.

  15. Multiscale Modeling and Uncertainty Quantification for Nuclear Fuel Performance

    Energy Technology Data Exchange (ETDEWEB)

    Estep, Donald [Colorado State Univ., Fort Collins, CO (United States); El-Azab, Anter [Florida State Univ., Tallahassee, FL (United States); Pernice, Michael [Idaho National Lab. (INL), Idaho Falls, ID (United States); Peterson, John W. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Polyakov, Peter [Univ. of Wyoming, Laramie, WY (United States); Tavener, Simon [Colorado State Univ., Fort Collins, CO (United States); Xiu, Dongbin [Purdue Univ., West Lafayette, IN (United States); Univ. of Utah, Salt Lake City, UT (United States)

    2017-03-23

    In this project, we will address the challenges associated with constructing high fidelity multiscale models of nuclear fuel performance. We (*) propose a novel approach for coupling mesoscale and macroscale models, (*) devise efficient numerical methods for simulating the coupled system, and (*) devise and analyze effective numerical approaches for error and uncertainty quantification for the coupled multiscale system. As an integral part of the project, we will carry out analysis of the effects of upscaling and downscaling, investigate efficient methods for stochastic sensitivity analysis of the individual macroscale and mesoscale models, and carry out a posteriori error analysis for computed results. We will pursue development and implementation of solutions in software used at Idaho National Laboratories on models of interest to the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program.

  16. Performance Models and Risk Management in Communications Systems

    CERN Document Server

    Harrison, Peter; Rüstem, Berç

    2011-01-01

    This volume covers recent developments in the design, operation, and management of telecommunication and computer network systems in performance engineering and addresses issues of uncertainty, robustness, and risk. Uncertainty regarding loading and system parameters leads to challenging optimization and robustness issues. Stochastic modeling combined with optimization theory ensures the optimum end-to-end performance of telecommunication or computer network systems. In view of the diverse design options possible, supporting models have many adjustable parameters and choosing the best set for a particular performance objective is delicate and time-consuming. An optimization based approach determines the optimal possible allocation for these parameters. Researchers and graduate students working at the interface of telecommunications and operations research will benefit from this book. Due to the practical approach, this book will also serve as a reference tool for scientists and engineers in telecommunication ...

  17. Performance of GeantV EM Physics Models

    Science.gov (United States)

    Amadio, G.; Ananya, A.; Apostolakis, J.; Aurora, A.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, P.; Carminati, F.; Cosmo, G.; Duhem, L.; Elvira, D.; Folger, G.; Gheata, A.; Gheata, M.; Goulas, I.; Iope, R.; Jun, S. Y.; Lima, G.; Mohanty, A.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.; Zhang, Y.

    2017-10-01

    The recent progress in parallel hardware architectures, with deeper vector pipelines and many-core technologies, brings opportunities for HEP experiments to take advantage of SIMD and SIMT computing models. Launched in 2013, the GeantV project studies performance gains from propagating multiple particles in parallel, improving instruction throughput and data locality in HEP event simulation on modern parallel hardware architectures. Due to the complexity of the geometry description and physics algorithms of a typical HEP application, performance analysis is indispensable for identifying factors limiting parallel execution. In this report, we present design considerations and preliminary computing performance of GeantV physics models on coprocessors (Intel Xeon Phi and NVIDIA GPUs) as well as on mainstream CPUs.

  18. Performance of GeantV EM Physics Models

    Energy Technology Data Exchange (ETDEWEB)

    Amadio, G.; et al.

    2016-10-14

    The recent progress in parallel hardware architectures, with deeper vector pipelines and many-core technologies, brings opportunities for HEP experiments to take advantage of SIMD and SIMT computing models. Launched in 2013, the GeantV project studies performance gains from propagating multiple particles in parallel, improving instruction throughput and data locality in HEP event simulation on modern parallel hardware architectures. Due to the complexity of the geometry description and physics algorithms of a typical HEP application, performance analysis is indispensable for identifying factors limiting parallel execution. In this report, we present design considerations and preliminary computing performance of GeantV physics models on coprocessors (Intel Xeon Phi and NVIDIA GPUs) as well as on mainstream CPUs.

  19. Performance of GeantV EM Physics Models

    CERN Document Server

    Amadio, G; Apostolakis, J; Aurora, A; Bandieramonte, M; Bhattacharyya, A; Bianchini, C; Brun, R; Canal P; Carminati, F; Cosmo, G; Duhem, L; Elvira, D; Folger, G; Gheata, A; Gheata, M; Goulas, I; Iope, R; Jun, S Y; Lima, G; Mohanty, A; Nikitina, T; Novak, M; Pokorski, W; Ribon, A; Seghal, R; Shadura, O; Vallecorsa, S; Wenzel, S; Zhang, Y

    2017-01-01

    The recent progress in parallel hardware architectures, with deeper vector pipelines and many-core technologies, brings opportunities for HEP experiments to take advantage of SIMD and SIMT computing models. Launched in 2013, the GeantV project studies performance gains from propagating multiple particles in parallel, improving instruction throughput and data locality in HEP event simulation on modern parallel hardware architectures. Due to the complexity of the geometry description and physics algorithms of a typical HEP application, performance analysis is indispensable for identifying factors limiting parallel execution. In this report, we present design considerations and preliminary computing performance of GeantV physics models on coprocessors (Intel Xeon Phi and NVIDIA GPUs) as well as on mainstream CPUs.

  20. A PERFORMANCE MANAGEMENT MODEL FOR PHYSICAL ASSET MANAGEMENT

    Directory of Open Access Journals (Sweden)

    J.L. Jooste

    2012-01-01

    ENGLISH ABSTRACT: There has been an emphasis shift from maintenance management towards asset management, where the focus is on reliable and operational equipment and on effective assets at optimum life-cycle costs. A challenge in the manufacturing industry is to develop an asset performance management model that is integrated with business processes and strategies. The authors developed the APM2 model to satisfy that requirement. The model has a generic reference structure and is supported by operational protocols to assist in operations management. It facilitates performance measurement, business integration and continuous improvement, whilst exposing industry to the latest developments in asset performance management.

    AFRIKAANSE OPSOMMING (translated): There has been a shift in emphasis from maintenance management to asset management, with a focus on reliable and operational equipment, as well as effective assets at optimum life-cycle cost. A challenge in the manufacturing industry is the development of a performance model for assets that is integrated with business processes and strategies. The authors developed the APM2 model to meet this need. The model has a generic reference structure, supported by operational instructions that promote operations management. It facilitates performance management, business integration and continuous improvement, while also exposing the industry to the latest developments in the performance management of assets.

  1. Aerodynamic drag modeling of alpine skiers performing giant slalom turns.

    Science.gov (United States)

    Meyer, Frédéric; Le Pelley, David; Borrani, Fabio

    2012-06-01

    Aerodynamic drag plays an important role in performance for athletes practicing sports that involve high-velocity motions. In giant slalom, the skier is continuously changing his/her body posture, and this affects the energy dissipated in aerodynamic drag. It is therefore important to quantify this energy to understand the dynamic behavior of the skier. The aims of this study were to model the aerodynamic drag of alpine skiers in giant slalom simulated conditions and to apply these models in a field experiment to estimate energy dissipated through aerodynamic drag. The aerodynamic characteristics of 15 recreational male and female skiers were measured in a wind tunnel while holding nine different skiing-specific postures. The drag and the frontal area were recorded simultaneously for each posture. Four generalized and two individualized models of the drag coefficient were built, using different sets of parameters. These models were subsequently applied in a field study designed to compare the aerodynamic energy losses between a dynamic and a compact skiing technique. The generalized models estimated aerodynamic drag with an accuracy of between 11.00% and 14.28%, and the individualized models estimated aerodynamic drag with an accuracy between 4.52% and 5.30%. The individualized model used for the field study showed that using a dynamic technique led to 10% more aerodynamic drag energy loss than using a compact technique. The individualized models were capable of discriminating different techniques performed by advanced skiers and seemed more accurate than the generalized models. The models presented here offer a simple yet accurate method to estimate the aerodynamic drag acting upon alpine skiers while rapidly moving through the range of positions typical to turning technique.
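
    The energy loss being estimated follows directly from the drag equation, summing F_d * v over the run with F_d = ½ρ(CdA)v². A minimal sketch of that bookkeeping (the air density and CdA traces are illustrative assumptions, not the study's fitted models):

    ```python
    RHO = 1.0  # assumed air density at altitude, kg/m^3

    def drag_force(cda, v):
        # F_d = 1/2 * rho * (Cd * A) * v^2
        return 0.5 * RHO * cda * v * v

    def drag_energy(cda_series, v_series, dt):
        # Energy dissipated to drag over a run: sum of F_d * v * dt.
        return sum(drag_force(c, v) * v * dt
                   for c, v in zip(cda_series, v_series))

    # Illustrative 10 s run at 20 m/s: a compact posture with CdA = 0.30 m^2
    # versus a dynamic technique averaging CdA = 0.33 m^2.
    compact = drag_energy([0.30] * 100, [20.0] * 100, 0.1)
    dynamic = drag_energy([0.33] * 100, [20.0] * 100, 0.1)
    ratio = dynamic / compact
    ```

    With these assumed values the dynamic technique dissipates 10% more drag energy, mirroring the difference reported in the abstract.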

  2. Ergonomic evaluation model of operational room based on team performance

    Directory of Open Access Journals (Sweden)

    YANG Zhiyi

    2017-05-01

    A theoretical calculation model based on the ergonomic evaluation of team performance was proposed in order to carry out the ergonomic evaluation of layout design schemes for the action stations in a multitasking operational room. The model calculates and compares the theoretical value of team performance across multiple layout schemes by considering such substantial influencing factors as frequency of communication, distance, angle, importance, human cognitive characteristics and so on. An experiment was finally conducted to verify the proposed model under the criteria of completion time and accuracy rating. As illustrated by the experimental results, the proposed approach is conducive to the prediction and ergonomic evaluation of layout design schemes for the action station during early design stages, and provides a new theoretical method for the ergonomic evaluation, selection and optimization of layout design schemes.
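
    A team-performance score of this kind can be sketched as a weighted sum over the communication links between action stations; the cost function and all weights below are purely illustrative assumptions, not the paper's formulation:

    ```python
    def layout_cost(links):
        # links: (frequency, distance_m, angle_deg, importance) for each pair
        # of stations that must communicate. Higher cost predicts lower team
        # performance: frequent, important links should be close and in view.
        return sum(freq * importance * (dist + angle / 90.0)
                   for freq, dist, angle, importance in links)

    # Two hypothetical layout schemes for the same task.
    scheme_a = layout_cost([(5, 1.0, 0, 1.0), (2, 3.0, 90, 0.5)])
    scheme_b = layout_cost([(5, 2.5, 90, 1.0), (2, 1.0, 0, 0.5)])
    best = min(("A", scheme_a), ("B", scheme_b), key=lambda s: s[1])
    ```

    Scheme A places the frequent, important link close and in view, so it scores better under this toy cost; the real model additionally weights human cognitive characteristics.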

  3. Performance and Prediction: Bayesian Modelling of Fallible Choice in Chess

    Science.gov (United States)

    Haworth, Guy; Regan, Ken; di Fatta, Giuseppe

    Evaluating agents in decision-making applications requires assessing their skill and predicting their behaviour. Both are well developed in Poker-like situations, but less so in more complex game and model domains. This paper addresses both tasks by using Bayesian inference in a benchmark space of reference agents. The concepts are explained and demonstrated using the game of chess but the model applies generically to any domain with quantifiable options and fallible choice. Demonstration applications address questions frequently asked by the chess community regarding the stability of the rating scale, the comparison of players of different eras and/or leagues, and controversial incidents possibly involving fraud. The last include alleged under-performance, fabrication of tournament results, and clandestine use of computer advice during competition. Beyond the model world of games, the aim is to improve fallible human performance in complex, high-value tasks.
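
    The flavour of Bayesian inference over a benchmark space of reference agents can be sketched as follows; the softmax choice model and all parameter values are illustrative assumptions, not the paper's exact likelihood:

    ```python
    import math

    def choice_probs(scores, c):
        # A fallible agent with sensitivity c picks among options scored by
        # an engine; higher c concentrates probability on better options.
        ws = [math.exp(c * s) for s in scores]
        z = sum(ws)
        return [w / z for w in ws]

    def posterior(sensitivities, prior, observed):
        # observed: list of (option_scores, chosen_index) pairs. Bayes update
        # of the distribution over reference agents after each observed choice.
        post = list(prior)
        for scores, idx in observed:
            post = [p * choice_probs(scores, c)[idx]
                    for p, c in zip(post, sensitivities)]
        z = sum(post)
        return [p / z for p in post]

    agents = [0.5, 2.0, 8.0]             # weak, medium, strong references
    moves = [([1.0, 0.2, 0.0], 0)] * 10  # player keeps finding the best move
    post = posterior(agents, [1 / 3] * 3, moves)
    ```

    A player who consistently finds the engine's best move pushes the posterior mass onto the strongest reference agent, which is the basis for the skill-rating and fraud-detection applications mentioned above.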

  4. Frequency modulated continuous wave lidar performance model for target detection

    Science.gov (United States)

    Du Bosq, Todd W.; Preece, Bradley L.

    2017-05-01

    The desire to provide the warfighter with both ranging and reflected intensity information is increasing to meet expanding operational needs. LIDAR imaging systems can provide the user with intensity, range, and even velocity information of a scene. The ability to predict the performance of LIDAR systems is critical for the development of future designs without the need to conduct time-consuming and costly field studies. Performance modeling of a frequency modulated continuous wave (FMCW) LIDAR system is challenging due to the addition of the chirped laser source and waveform mixing. The FMCW LIDAR model is implemented in the NV-IPM framework using the custom component generation tool. This paper presents an overview of the FMCW LIDAR, the customized LIDAR components, and a series of trade studies using the LIDAR model.

  5. PHARAO laser source flight model: Design and performances

    Energy Technology Data Exchange (ETDEWEB)

    Lévèque, T., E-mail: thomas.leveque@cnes.fr; Faure, B.; Esnault, F. X.; Delaroche, C.; Massonnet, D.; Grosjean, O.; Buffe, F.; Torresi, P. [Centre National d’Etudes Spatiales, 18 avenue Edouard Belin, 31400 Toulouse (France); Bomer, T.; Pichon, A.; Béraud, P.; Lelay, J. P.; Thomin, S. [Sodern, 20 Avenue Descartes, 94451 Limeil-Brévannes (France); Laurent, Ph. [LNE-SYRTE, CNRS, UPMC, Observatoire de Paris, 61 avenue de l’Observatoire, 75014 Paris (France)

    2015-03-15

    In this paper, we describe the design and the main performances of the PHARAO laser source flight model. PHARAO is a laser cooled cesium clock specially designed for operation in space and the laser source is one of the main sub-systems. The flight model presented in this work is the first remote-controlled laser system designed for spaceborne cold atom manipulation. The main challenges arise from mechanical compatibility with space constraints, which impose a high level of compactness, a low electric power consumption, a wide range of operating temperature, and a vacuum environment. We describe the main functions of the laser source and give an overview of the main technologies developed for this instrument. We present some results of the qualification process. The characteristics of the laser source flight model, and their impact on the clock performances, have been verified in operational conditions.

  6. Measuring broadband in Europe: : development of a market model and performance index using structural equations modelling

    NARCIS (Netherlands)

    Lemstra, W.; Voogt, B.; Gorp, van N.

    2015-01-01

    This contribution reports on the development of a performance index and underlying market model with application to broadband developments in the European Union. The Structure–Conduct–Performance paradigm provides the theoretical grounding. Structural equations modelling was applied to determine the

  7. Modeling time-lagged reciprocal psychological empowerment-performance relationships.

    Science.gov (United States)

    Maynard, M Travis; Luciano, Margaret M; D'Innocenzo, Lauren; Mathieu, John E; Dean, Matthew D

    2014-11-01

    Employee psychological empowerment is widely accepted as a means for organizations to compete in increasingly dynamic environments. Previous empirical research and meta-analyses have demonstrated that employee psychological empowerment is positively related to several attitudinal and behavioral outcomes including job performance. While this research positions psychological empowerment as an antecedent influencing such outcomes, a close examination of the literature reveals that this relationship is primarily based on cross-sectional research. Notably, evidence supporting the presumed benefits of empowerment has failed to account for potential reciprocal relationships and endogeneity effects. Accordingly, using a multiwave, time-lagged design, we model reciprocal relationships between psychological empowerment and job performance using a sample of 441 nurses from 5 hospitals. Incorporating temporal effects in a staggered research design and using structural equation modeling techniques, our findings provide support for the conventional positive correlation between empowerment and subsequent performance. Moreover, accounting for the temporal stability of variables over time, we found support for empowerment levels as positive influences on subsequent changes in performance. Finally, we also found support for the reciprocal relationship, as performance levels were shown to relate positively to changes in empowerment over time. Theoretical and practical implications of the reciprocal psychological empowerment-performance relationships are discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  8. TRM4: Range performance model for electro-optical imaging systems

    Science.gov (United States)

    Keßler, Stefan; Gal, Raanan; Wittenstein, Wolfgang

    2017-05-01

    TRM4 is a commonly used model for assessing device and range performance of electro-optical imagers. The latest version, TRM4.v2, was released by Fraunhofer IOSB of Germany in June 2016. While its predecessor, TRM3, was developed for thermal imagers, assuming blackbody targets and backgrounds, TRM4 extends the TRM approach to assess three imager categories: imagers that exploit emitted radiation (TRM4 category Thermal), reflected radiation (TRM4 category Visible/NIR/SWIR), and both emitted and reflected radiation (TRM4 category General). Performance assessment in TRM3 and TRM4 is based on the perception of standard four-bar test patterns, whether distorted by under-sampling or not. Spatial and sampling characteristics are taken into account by the Average Modulation at Optimum Phase (AMOP), which replaces the system MTF used in previous models. The Minimum Temperature Difference Perceived (MTDP) figure of merit was introduced in TRM3 for assessing the range performance of thermal imagers. In TRM4, this concept is generalized to the MDSP (Minimum Difference Signal Perceived), which can be applied to all imager categories. In this paper, we outline and discuss the TRM approach and pinpoint differences between TRM4 and TRM3. In addition, an overview of the TRM4 software and its functionality is given. Features newly introduced in TRM4, such as atmospheric turbulence, irradiation sources, and libraries, are addressed. We conclude with an outlook on future work and on the new module for intensified CCD cameras that is currently under development.

  9. Efficient Depth Map Compression Exploiting Segmented Color Data

    DEFF Research Database (Denmark)

    Milani, Simone; Zanuttigh, Pietro; Zamarin, Marco

    2011-01-01

    performances is still an open research issue. This paper presents a novel compression scheme that exploits a segmentation of the color data to predict the shape of the different surfaces in the depth map. Then each segment is approximated with a parameterized plane. In case the approximation is sufficiently...

  10. A performance model for the communication in fast multipole methods on high-performance computing platforms

    KAUST Repository

    Ibeid, Huda

    2016-03-04

    Exascale systems are predicted to have approximately 1 billion cores, assuming gigahertz cores. Limitations on affordable network topologies for distributed memory systems of such massive scale bring new challenges to the currently dominant parallel programing model. Currently, there are many efforts to evaluate the hardware and software bottlenecks of exascale designs. It is therefore of interest to model application performance and to understand what changes need to be made to ensure extrapolated scalability. The fast multipole method (FMM) was originally developed for accelerating N-body problems in astrophysics and molecular dynamics but has recently been extended to a wider range of problems. Its high arithmetic intensity combined with its linear complexity and asynchronous communication patterns make it a promising algorithm for exascale systems. In this paper, we discuss the challenges for FMM on current parallel computers and future exascale architectures, with a focus on internode communication. We focus on the communication part only; the efficiency of the computational kernels are beyond the scope of the present study. We develop a performance model that considers the communication patterns of the FMM and observe a good match between our model and the actual communication time on four high-performance computing (HPC) systems, when latency, bandwidth, network topology, and multicore penalties are all taken into account. To our knowledge, this is the first formal characterization of internode communication in FMM that validates the model against actual measurements of communication time. The ultimate communication model is predictive in an absolute sense; however, on complex systems, this objective is often out of reach or of a difficulty out of proportion to its benefit when there exists a simpler model that is inexpensive and sufficient to guide coding decisions leading to improved scaling. The current model provides such guidance.
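
    A communication model of the kind described can be sketched with a simple latency-bandwidth (postal) cost per message; the message counts and machine constants below are illustrative assumptions, not the paper's validated model:

    ```python
    import math

    ALPHA = 1e-6        # assumed per-message latency, seconds
    BETA = 1.0 / 10e9   # assumed inverse bandwidth, seconds per byte

    def comm_time(message_sizes):
        # Postal model: each message costs latency + size / bandwidth.
        return sum(ALPHA + BETA * size for size in message_sizes)

    def fmm_comm_estimate(p, cell_bytes, near_neighbors=26):
        # Toy per-rank FMM exchange: multipole data sent to a near-field
        # list of ranks, plus one message per level of the log2(p)-deep
        # global tree.
        levels = int(math.log2(p))
        return comm_time([cell_bytes] * (near_neighbors + levels))

    t_small = fmm_comm_estimate(64, 1 << 20)    # 64 ranks, 1 MiB messages
    t_large = fmm_comm_estimate(1024, 1 << 20)  # 1024 ranks
    ```

    Even this toy version shows the logarithmic growth of per-rank communication with rank count that makes FMM attractive at scale; the paper's model additionally accounts for network topology and multicore penalties.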

  11. A New Model to Simulate Energy Performance of VRF Systems

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Tianzhen; Pang, Xiufeng; Schetrit, Oren; Wang, Liping; Kasahara, Shinichi; Yura, Yoshinori; Hinokuma, Ryohei

    2014-03-30

    This paper presents a new model to simulate energy performance of variable refrigerant flow (VRF) systems in heat pump operation mode (either cooling or heating is provided, but not simultaneously). The main improvement of the new model is the introduction of the evaporating and condensing temperatures in the indoor and outdoor unit capacity modifier functions. The independent variables in the capacity modifier functions of the existing VRF model in EnergyPlus are mainly room wet-bulb temperature and outdoor dry-bulb temperature in cooling mode, and room dry-bulb temperature and outdoor wet-bulb temperature in heating mode. The new approach allows compliance with the different specifications of each indoor unit, so that modeling accuracy is improved. The new VRF model was implemented in a custom version of EnergyPlus 7.2. This paper first describes the algorithm for the new VRF model, which is then used to simulate the energy performance of a VRF system in a Prototype House in California that complies with the requirements of Title 24, the California Building Energy Efficiency Standards. The VRF system performance is then compared with three other types of HVAC systems: the Title 24-2005 Baseline system, the traditional High Efficiency system, and the EnergyStar Heat Pump system, in three typical California climates: Sunnyvale, Pasadena and Fresno. Calculated energy savings from the VRF systems are significant. The HVAC site energy savings range from 51 to 85 percent, while the TDV (Time Dependent Valuation) energy savings range from 31 to 66 percent compared to the Title 24 Baseline Systems across the three climates. The largest energy savings are in the Fresno climate, followed by Sunnyvale and Pasadena. The paper discusses various characteristics of the VRF systems contributing to the energy savings. It should be noted that these savings are calculated using the Title 24 prototype House D under standard operating conditions. Actual performance of the VRF systems for real
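
    The capacity-modifier idea can be sketched as a biquadratic performance curve evaluated at the model's independent variables; the coefficients and operating point below are illustrative assumptions, not EnergyPlus defaults:

    ```python
    def capacity_modifier(t_wb_indoor, t_cond, coeffs):
        # Biquadratic curve of the form commonly used for DX/VRF capacity
        # corrections: f = a + b*Twb + c*Twb^2 + d*Tc + e*Tc^2 + f*Twb*Tc
        a, b, c, d, e, f = coeffs
        return (a + b * t_wb_indoor + c * t_wb_indoor ** 2
                  + d * t_cond + e * t_cond ** 2
                  + f * t_wb_indoor * t_cond)

    RATED_CAPACITY_KW = 10.0                      # assumed rated capacity
    COEFFS = (0.9, 0.01, 0.0, -0.003, 0.0, 0.0)   # illustrative curve fit

    # Available cooling capacity at 19.4 C indoor wet-bulb and a 35 C
    # condensing temperature.
    available_kw = RATED_CAPACITY_KW * capacity_modifier(19.4, 35.0, COEFFS)
    ```

    Replacing the outdoor air temperature with the evaporating/condensing temperature as the curve's second variable is the change the abstract credits with improving per-indoor-unit accuracy.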

  12. Life Cycle Model for IT Performance Measurement: A Reference Model for Small and Medium Enterprises (SME)

    Science.gov (United States)

    Albayrak, Can Adam; Gadatsch, Andreas; Olufs, Dirk

    IT performance measurement is often associated by chief executive officers with IT cost cutting, even though IT protects business processes and indiscriminate cost cutting only endangers the company's efficiency. This view stigmatizes those who carry out IT performance measurement in companies as bean-counters. The present paper describes an integrated reference model for IT performance measurement based on a life cycle model and a performance-oriented framework. The presented model was created from a practical point of view. It is designed to be lean compared with other known concepts and is well suited to small and medium enterprises (SME).

  13. A conceptual model to improve performance in virtual teams

    Directory of Open Access Journals (Sweden)

    Shopee Dube

    2016-09-01

    Background: The vast improvement in communication technologies and sophisticated project management tools, methods and techniques has allowed geographically and culturally diverse groups to operate and function in a virtual environment. To succeed in this virtual environment, where time and space are becoming increasingly irrelevant, organisations must define new ways of implementing initiatives. This virtual environment phenomenon has brought about the formation of virtual project teams that allow organisations to harness the skills and know-how of the best resources, irrespective of their location. Objectives: The aim of this article was to investigate performance criteria and develop a conceptual model which can be applied to enhance the success of virtual project teams. There are no clear guidelines on the performance criteria for managing virtual project teams. Method: A qualitative research methodology was used in this article. The purpose of the content analysis was to explore the literature to understand the concept of performance in virtual project teams and to summarise the findings of the literature reviewed. Results: The research identified a set of performance criteria for virtual project teams: leadership, trust, communication, team cooperation, reliability, motivation, comfort and social interaction. These were used to conceptualise the model. Conclusion: The conceptual model can be used in a holistic way to determine the overall performance of the virtual project team, but each factor can be analysed individually to determine its impact on the overall performance. Knowledge of the performance criteria for virtual project teams could aid project managers in enhancing the success of these teams and taking a different approach to better manage and coordinate them.

  15. Outdoor FSO Communications Under Fog: Attenuation Modeling and Performance Evaluation

    KAUST Repository

    Esmail, Maged Abdullah

    2016-07-18

    Fog is considered to be a primary challenge for free space optics (FSO) systems. It may cause attenuation of up to hundreds of decibels per kilometer. Hence, accurate modeling of fog attenuation will help telecommunication operators to engineer and appropriately manage their networks. In this paper, we examine fog measurement data from several locations in Europe and the United States and derive a unified channel attenuation model. Compared with existing attenuation models, our proposed model reduces the average root-mean-square error (RMSE) by at least 9 dB. Moreover, we have investigated the statistical behavior of the channel and developed a probabilistic model under stochastic fog conditions. Furthermore, we studied the performance of the FSO system in terms of various metrics, including signal-to-noise ratio (SNR), bit-error rate (BER), and channel capacity. Our results show that in communication environments with frequent fog, FSO is typically a short-range data transmission technology. Therefore, FSO will have its preferred market segment in future wireless fifth-generation/sixth-generation (5G/6G) networks with cell sizes smaller than 1 km in diameter. Moreover, the results of our modeling and analysis can be applied in determining the switching/thresholding conditions in highly reliable hybrid FSO/radio-frequency (RF) networks.
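
    The abstract does not reproduce the proposed attenuation model itself. As a sketch of the kind of comparison described, the snippet below pairs the widely used empirical Kim model, which maps visibility to specific fog attenuation, with the RMSE metric used to judge model fit; the sample visibilities and "measured" values are illustrative assumptions, not data from the paper.

```python
import math

def kim_attenuation_db_per_km(visibility_km, wavelength_nm=1550):
    """Specific fog attenuation (dB/km) from visibility, per the empirical Kim model."""
    v = visibility_km
    if v > 50:
        q = 1.6
    elif v > 6:
        q = 1.3
    elif v > 1:
        q = 0.16 * v + 0.34
    elif v > 0.5:
        q = v - 0.5
    else:
        q = 0.0
    return (3.91 / v) * (wavelength_nm / 550.0) ** (-q)

def rmse(predicted, measured):
    """Root-mean-square error between model predictions and measurements."""
    return math.sqrt(sum((p - m) ** 2 for p, m in zip(predicted, measured)) / len(predicted))

# Hypothetical visibilities (km), dense fog -> light haze, with illustrative measurements
visibilities = [0.2, 0.5, 1.0, 2.0, 5.0]
measured_db_per_km = [21.0, 8.5, 2.1, 1.1, 0.3]
predicted = [kim_attenuation_db_per_km(v) for v in visibilities]
print(f"RMSE = {rmse(predicted, measured_db_per_km):.2f} dB/km")
```

    A lower RMSE against field measurements is exactly the criterion by which the paper ranks competing attenuation models.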

  16. Synthesised model of market orientation-business performance relationship

    Directory of Open Access Journals (Sweden)

    G. Nwokah

    2006-12-01

    Full Text Available Purpose: The purpose of this paper is to assess the impact of market orientation on the performance of the organisation. While much empirical work has centred on market orientation, the generalisability of its impact on the performance of Food and Beverages organisations in the Nigerian context has been under-researched. Design/Methodology/Approach: The study adopted a triangulation methodology (quantitative and qualitative approach). Data were collected from key informants using a research instrument. Returned instruments were analyzed using nonparametric correlation through the Statistical Package for Social Sciences (SPSS version 10). Findings: The study validated the earlier instruments but did not find any strong association between market orientation and business performance in the Nigerian context using the food and beverages organisations for the study. The reasons underlying the weak relationship between market orientation and the business performance of the Food and Beverages organisations are government policies, new product development, diversification, innovation and devaluation of the Nigerian currency. One important finding of this study is that market orientation leads to business performance through some moderating variables. Implications: The study recommends that the Nigerian Government should ensure a stable economy and make economic policies that enhance existing business development in the country. Also, organisations should have performance measurement systems to detect the impact of investment on market orientation with the aim of knowing how the organisation works. Originality/Value: This study significantly refines the body of knowledge concerning the impact of market orientation on the performance of the organisation, and thereby offers a model of market orientation and business performance in the Nigerian context for marketing scholars and practitioners. This model will, no doubt, contribute to the body of knowledge.

  17. Impact of spatial variability and sampling design on model performance

    Science.gov (United States)

    Schrape, Charlotte; Schneider, Anne-Kathrin; Schröder, Boris; van Schaik, Loes

    2017-04-01

    Many environmental physical and chemical parameters, as well as species distributions, display spatial variability at different scales. When measurements are very costly in labour time or money, a choice has to be made between a high sampling resolution at small scales with low spatial cover of the study area, or a lower sampling resolution at small scales, resulting in local data uncertainties, with better spatial cover of the whole area. This dilemma is often faced in the design of field sampling campaigns for large-scale studies. When the gathered field data are subsequently used for modelling purposes, the choice of sampling design and the resulting data quality influence the model performance criteria. We studied this influence with a virtual model study based on a large dataset of field information on the spatial variation of earthworms at different scales. To this end, we built a virtual map of anecic earthworm distributions over the Weiherbach catchment (Baden-Württemberg, Germany). First, the field-scale abundance of earthworms was estimated using a catchment-scale model based on 65 field measurements. Subsequently, the high small-scale variability was added using semi-variograms, based on five fields with a total of 430 measurements in a spatially nested sampling design over these fields, to estimate the nugget, range and standard deviation of measurements within the fields. With the produced maps, we performed virtual samplings of one up to 50 random points per field. We then used these data to rebuild the catchment-scale models of anecic earthworm abundance with the same model parameters as in the work by Palm et al. (2013). The model results show clearly that a large part of the non-explained deviance of the models is due to the very high small-scale variability in earthworm abundance: the models based on single virtual sampling points on average obtain an explained deviance of 0.20 and a correlation coefficient of 0.64. With

  18. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  19. Performance evaluation of volumetric water content and relative permittivity models.

    Science.gov (United States)

    Mukhlisin, Muhammad; Saputra, Almushfi

    2013-01-01

    In recent years many models have been proposed for measuring soil water content (θ) based on the permittivity (ε) value. Permittivity is one of the properties used to determine θ in measurements using the electromagnetic method. This method is widely used because of the substantial differences in ε between air, soil, and water, which allow the θ value to be measured accurately. The performance of six proposed models with one parameter (i.e., permittivity) and five proposed models with two or more parameters (i.e., permittivity, porosity, and dry bulk density of soil) is discussed and evaluated. Secondary data obtained from previous studies are used to calibrate and evaluate the models. The results show that, among the models with one parameter, those proposed by Roth et al. (1992) and Topp et al. (1980) produce the largest errors against the measured data (in terms of R²), while among the models with two parameters, the model proposed by Malicki et al. (1996) agrees very well with the data compared with the other models.
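
    The best known of the one-parameter models evaluated here is the Topp et al. (1980) polynomial relating volumetric water content θ to relative permittivity ε. A minimal sketch using the published Topp coefficients (the example permittivity values are illustrative):

```python
def topp_theta(eps):
    """Volumetric water content (m^3/m^3) from relative permittivity,
    Topp et al. (1980) empirical polynomial."""
    return -5.3e-2 + 2.92e-2 * eps - 5.5e-4 * eps**2 + 4.3e-6 * eps**3

# Dry soil (eps ~ 3) through wet soil (eps ~ 25): theta rises with permittivity
for eps in (3.0, 10.0, 25.0):
    print(f"eps = {eps:5.1f}  ->  theta = {topp_theta(eps):.3f}")
```

    The polynomial is monotonically increasing over the physically relevant range, which is why a single permittivity reading suffices to estimate water content.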

  20. Instruction-level performance modeling and characterization of multimedia applications

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Y. [Los Alamos National Lab., NM (United States). Scientific Computing Group; Cameron, K.W. [Louisiana State Univ., Baton Rouge, LA (United States). Dept. of Computer Science

    1999-06-01

    One of the challenges for characterizing and modeling realistic multimedia applications is the lack of access to source codes. On-chip performance counters effectively resolve this problem by monitoring run-time behaviors at the instruction level. This paper presents a novel technique of characterizing and modeling workloads at the instruction level for realistic multimedia applications using hardware performance counters. A variety of instruction counts are collected from some multimedia applications, such as RealPlayer, GSM Vocoder, MPEG encoder/decoder, and speech synthesizer. These instruction counts can be used to form a set of abstract characteristic parameters directly related to a processor's architectural features. Based on microprocessor architectural constraints and these calculated abstract parameters, the architectural performance bottleneck for a specific application can be estimated. Meanwhile, the bottleneck estimation can provide suggestions about viable architectural/functional improvement for certain workloads. The biggest advantage of this new characterization technique is a better understanding of processor utilization efficiency and architectural bottlenecks for each application. This technique also provides predictive insight into future architectural enhancements and their effect on current codes. In this paper the authors also attempt to model architectural effects on processor utilization without memory influence. They derive formulas for calculating CPI{sub 0}, the CPI without memory effect, and they quantify the utilization of architectural parameters. These equations are architecturally diagnostic and predictive in nature. Results provide promise in code characterization, and empirical/analytical modeling.
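
    The paper's CPI{sub 0} formulas are not given in the abstract. As a sketch of the general idea, a memory-free CPI can be estimated as a count-weighted average of per-class ideal CPIs derived from hardware-counter instruction mixes; the instruction classes and per-class costs below are hypothetical, not the authors' parameters.

```python
def cpi0(instruction_counts, class_cpi):
    """Estimate CPI without memory effects as a count-weighted average of
    per-class ideal CPIs (hypothetical classes and costs)."""
    total = sum(instruction_counts.values())
    return sum(instruction_counts[c] * class_cpi[c] for c in instruction_counts) / total

# Hypothetical counter readings for one workload, and assumed per-class ideal CPIs
counts = {"int_alu": 6_000_000, "fp": 1_500_000, "branch": 1_200_000, "load_store": 3_300_000}
ideal = {"int_alu": 0.5, "fp": 1.0, "branch": 1.5, "load_store": 1.0}
print(f"CPI0 = {cpi0(counts, ideal):.3f}")
```

    Comparing such an estimate against the measured CPI isolates how much of the gap is attributable to the memory hierarchy rather than the core pipeline.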

  1. Exploiting wild relatives of S. lycopersicum for quality traits

    NARCIS (Netherlands)

    Víquez Zamora, A.M.

    2015-01-01

    Tomatoes are consumed worldwide and became a model for crop plant research. A part of the research aims at expanding genetic diversity in tomato; this can be done by incorporating

  2. Risk assessment by dynamic representation of vulnerability, exploitation, and impact

    Science.gov (United States)

    Cam, Hasan

    2015-05-01

    Assessing and quantifying cyber risk accurately in real-time is essential to providing security and mission assurance in any system and network. This paper presents a modeling and dynamic analysis approach to assessing cyber risk of a network in real-time by representing dynamically its vulnerabilities, exploitations, and impact using integrated Bayesian network and Markov models. Given the set of vulnerabilities detected by a vulnerability scanner in a network, this paper addresses how its risk can be assessed by estimating in real-time the exploit likelihood and impact of vulnerability exploitation on the network, based on real-time observations and measurements over the network. The dynamic representation of the network in terms of its vulnerabilities, sensor measurements, and observations is constructed dynamically using the integrated Bayesian network and Markov models. The transition rates of outgoing and incoming links of states in hidden Markov models are used in determining exploit likelihood and impact of attacks, whereas emission rates help quantify the attack states of vulnerabilities. Simulation results show the quantification and evolving risk scores over time for individual and aggregated vulnerabilities of a network.
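
    The integrated Bayesian/Markov construction is not specified in detail in the abstract. As a minimal sketch of the Markov component, the snippet below evolves a discrete-time chain over attack states whose transition probabilities would, in the paper's approach, be estimated from real-time sensor observations; the states and transition matrix here are hypothetical.

```python
def step(dist, P):
    """One discrete-time Markov step: new distribution = dist @ P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Hypothetical states: 0 = vulnerable, 1 = exploited, 2 = compromised (absorbing)
P = [
    [0.90, 0.10, 0.00],   # vulnerable: may be exploited
    [0.20, 0.60, 0.20],   # exploited: may be remediated or lead to compromise
    [0.00, 0.00, 1.00],   # compromised: absorbing
]

dist = [1.0, 0.0, 0.0]    # start fully in the vulnerable state
for t in range(10):
    dist = step(dist, P)
print([round(p, 3) for p in dist])
```

    The growing probability mass in the absorbing state over time is the kind of evolving risk score the simulation results describe.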

  3. Final Report, “Exploiting Global View for Resilience”

    Energy Technology Data Exchange (ETDEWEB)

    Chien, Andrew [Univ. of Chicago, IL (United States)

    2017-03-29

    Final technical report for the "Exploiting Global View for Resilience" project. The GVR project aims to create a new approach to portable, resilient applications. The GVR approach builds on a global view data model, adding versioning (multi-version), user control of timing and rate (multi-stream), and flexible cross-layer error signalling and recovery. With a versioned array as a portable abstraction, GVR enables application programmers to exploit deep scientific and application code insights to manage resilience (and its overhead) in a flexible, portable fashion.

  4. Modeling the Performance of the Fast Multipole Method on HPC platforms

    KAUST Repository

    Ibeid, Huda

    2012-04-06

    The current trend in high performance computing is pushing towards exascale computing. To achieve this exascale performance, future systems will have between 100 million and 1 billion cores assuming gigahertz cores. Currently, there are many efforts studying the hardware and software bottlenecks for building an exascale system. It is important to understand and meet these bottlenecks in order to attain 10 PFLOPS performance. On the applications side, there is an urgent need to model application performance and to understand what changes need to be made to ensure continued scalability at this scale. Fast multipole methods (FMM) were originally developed for accelerating N-body problems for particle-based methods. Nowadays, FMM is more than an N-body solver; recent trends in HPC have been to use FMMs in unconventional application areas. FMM is likely to be a main player in exascale due to its hierarchical nature and the techniques used to access the data via a tree structure, which allow many operations to happen simultaneously at each level of the hierarchy. In this thesis, we discuss the challenges for FMM on current parallel computers and future exascale architectures. Furthermore, we develop a novel performance model for FMM. The ultimate aim of this thesis is to ensure the scalability of FMM on the future exascale machines.

  5. Packaging of Sin Goods - Commitment or Exploitation?

    DEFF Research Database (Denmark)

    Nafziger, Julia

    to such self-control problems, and possibly exploit them, by offering different package sizes. In a competitive market, either one or three (small, medium and large) packages are offered. In contrast to common intuition, the large, and not the small package is a commitment device. The latter serves to exploit...

  6. The exploitation of Gestalt principles by magicians.

    Science.gov (United States)

    Barnhart, Anthony S

    2010-01-01

    Magicians exploit a host of psychological principles in deceiving their audiences. Psychologists have recently attempted to pinpoint the most common psychological tendencies exploited by magicians. This paper highlights two co-occurring principles that appear to be the basis for many popular magic tricks: accidental alignment and good continuation.

  7. Compact models and performance investigations for subthreshold interconnects

    CERN Document Server

    Dhiman, Rohit

    2014-01-01

    The book provides a detailed analysis of issues related to sub-threshold interconnect performance from the perspective of analytical approach and design techniques. Particular emphasis is laid on the performance analysis of coupling noise and variability issues in sub-threshold domain to develop efficient compact models. The proposed analytical approach gives physical insight of the parameters affecting the transient behavior of coupled interconnects. Remedial design techniques are also suggested to mitigate the effect of coupling noise. The effects of wire width, spacing between the wires, wi

  8. Does model performance improve with complexity? A case study with three hydrological models

    Science.gov (United States)

    Orth, Rene; Staudinger, Maria; Seneviratne, Sonia I.; Seibert, Jan; Zappa, Massimiliano

    2015-04-01

    In recent decades considerable progress has been made in climate model development. Following the massive increase in computational power, models became more sophisticated. At the same time, simple conceptual models have also advanced. In this study we validate and compare three hydrological models of different complexity to investigate whether their performance varies accordingly. For this purpose we use runoff and also soil moisture measurements, which allow a truly independent validation, from several sites across Switzerland. The models are calibrated in similar ways with the same runoff data. Our results show that the more complex models HBV and PREVAH outperform the simple water balance model (SWBM) in case of runoff but not for soil moisture. Furthermore the most sophisticated PREVAH model shows an added value compared to the HBV model only in case of soil moisture. Focusing on extreme events we find generally improved performance of the SWBM during drought conditions and degraded agreement with observations during wet extremes. For the more complex models we find the opposite behavior, probably because they were primarily developed for prediction of runoff extremes. As expected given their complexity, HBV and PREVAH have more problems with over-fitting. All models show a tendency towards better performance in lower altitudes as opposed to (pre-) alpine sites. The results vary considerably across the investigated sites. In contrast, the different metrics we consider to estimate the agreement between models and observations lead to similar conclusions, indicating that the performance of the considered models is similar at different time scales as well as for anomalies and long-term means. We conclude that added complexity does not necessarily lead to improved performance of hydrological models, and that performance can vary greatly depending on the considered hydrological variable (e.g. runoff vs. soil moisture) or hydrological conditions (floods vs. droughts).

  9. Evaluation of the performance of DIAS ionospheric forecasting models

    Directory of Open Access Journals (Sweden)

    Tsagouri Ioanna

    2011-08-01

    Full Text Available Nowcasting and forecasting ionospheric products and services for the European region have been regularly provided since August 2006 through the European Digital upper Atmosphere Server (DIAS, http://dias.space.noa.gr). Currently, DIAS ionospheric forecasts are based on the online implementation of two models: (i) the solar wind driven autoregression model for ionospheric short-term forecast (SWIF), which combines historical and real-time ionospheric observations with solar-wind parameters obtained in real time at the L1 point from the NASA ACE spacecraft, and (ii) the geomagnetically correlated autoregression model (GCAM), which is a time series forecasting method driven by a synthetic geomagnetic index. In this paper we investigate the operational ability and the accuracy of both DIAS models, carrying out a metrics-based evaluation of their performance under all possible conditions. The analysis was based on the systematic comparison between the models’ predictions and actual observations obtained over almost one solar cycle (1998–2007) at four European ionospheric locations (Athens, Chilton, Juliusruh and Rome), and on the comparison of the models’ performance against two simple prediction strategies, the median- and the persistence-based predictions, during storm conditions. The results verify operational validity for both models and quantify their prediction accuracy under all possible conditions in support of operational applications, but also of comparative studies in assessing or expanding the current ionospheric forecasting capabilities.
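
    The persistence-based benchmark the authors compare against is simple to state: the forecast for the next interval is the last observed value. The sketch below scores such a baseline with RMSE on a synthetic series of hourly ionospheric values (illustrative numbers, not DIAS data):

```python
import math

def rmse(pred, obs):
    """Root-mean-square error between forecasts and observations."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(pred))

# Synthetic hourly observations of an ionospheric characteristic (e.g. foF2 in MHz)
obs = [6.1, 6.3, 6.0, 5.7, 5.9, 6.4, 6.8, 6.6]
# Persistence baseline: the forecast at hour t is the observation at hour t-1
persistence = obs[:-1]
target = obs[1:]
print(f"persistence RMSE = {rmse(persistence, target):.3f}")
```

    A forecasting model such as SWIF or GCAM adds value only if its RMSE beats this trivially cheap baseline, which is the point of the comparison in the paper.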

  10. Fuel performance modeling for proposed Th-based Canadian SCWR

    Energy Technology Data Exchange (ETDEWEB)

    Bell, J.S.; Chan, P.K. [Royal Military College of Canada, Chemistry and Chemical Engineering Department, Kingston, ON (Canada)

    2014-07-01

    The fuel assembly for the Canadian Super Critical Water Reactor (SCWR) is in the conceptual design phase. The proposed fuel pellets are made of ceramic Th-Pu mixed oxide ((Th,Pu)O{sub 2}). Neutronics and thermal hydraulics calculations are being undertaken by the nuclear industry to optimize the fuel assembly within a pressure tube. The SCWR working groups have established two conceptual fuel element designs, which define the outer diameter, fuel composition, cladding material, exit burnup, etc. A detailed fuel element performance assessment under in-reactor conditions could be used to determine cladding material thickness and suitability and to optimize the fuel pellet geometry. This work reports the development of a fuel performance model to predict the behaviour of the Canadian SCWR fuel using the finite element method (COMSOL). An initial approach is to develop a thorium-uranium mixed-oxide ((Th,U)O{sub 2}) model. Preliminary results from this model agree with fuel irradiation data. Uranium dioxide (UO{sub 2}) fuel, under the same conditions, is also being modeled and compared. A plan to model (Th,Pu)O{sub 2} SCWR fuel will also be briefly presented here. (author)

  11. Key performance indicators in hospital based on balanced scorecard model

    Directory of Open Access Journals (Sweden)

    Hamed Rahimi

    2017-01-01

    Full Text Available Introduction: Performance measurement is receiving increasing attention all over the world. Nowadays in many organizations, irrespective of their type or size, performance evaluation is the main concern and a key issue for top administrators. The purpose of this study is to organize suitable key performance indicators (KPIs) for hospitals’ performance evaluation based on the balanced scorecard (BSC). Method: This is a mixed method study. In order to identify the hospital performance indicators (HPIs), related literature was first reviewed and then an experts’ panel and the Delphi method were used. In this study, two rounds were needed to reach the desired level of consensus. The experts rated the importance of the indicators on a five-point Likert scale. In the consensus calculation, the consensus percentage was calculated by classifying the values 1-3 as not important (0) and the values 4-5 as important (1). The simple additive weighting technique was used to rank the indicators and select the hospitals’ KPIs. The data were analyzed with Excel 2010 software. Results: About 218 indicators were obtained from a review of the selected literature. Through the internal expert panel, 77 indicators were selected. Finally, 22 were selected as KPIs for hospitals: ten indicators in the internal process perspective and 5, 4, and 3 indicators in the finance, learning and growth, and customer perspectives, respectively. Conclusion: This model can be a useful tool for evaluating and comparing the performance of hospitals. However, the model is flexible and can be adjusted according to differences in the target hospitals. This study can be beneficial for hospital administrators and can help them to change their perspective about performance evaluation.
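
    The simple additive weighting step used to rank indicators can be sketched as: max-normalize each criterion, multiply by the criterion weight, and sum. The candidate indicators, ratings, and weights below are hypothetical, not the study's data.

```python
def saw_rank(scores, weights):
    """Simple additive weighting: max-normalize each criterion column,
    weight, sum, and rank alternatives by descending total score."""
    criteria = list(weights)
    col_max = {c: max(row[c] for row in scores.values()) for c in criteria}
    totals = {
        name: sum(weights[c] * row[c] / col_max[c] for c in criteria)
        for name, row in scores.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical expert ratings of candidate hospital KPIs on two criteria
scores = {
    "bed occupancy rate":     {"importance": 4.6, "measurability": 5.0},
    "average length of stay": {"importance": 4.8, "measurability": 4.5},
    "patient satisfaction":   {"importance": 4.2, "measurability": 3.5},
}
weights = {"importance": 0.6, "measurability": 0.4}
for name, total in saw_rank(scores, weights):
    print(f"{name}: {total:.3f}")
```

    In the study, a cut applied to such a ranked list is what reduces 77 candidate indicators to the final 22 KPIs.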

  12. Fundamentals of Modeling, Data Assimilation, and High-performance Computing

    Science.gov (United States)

    Rood, Richard B.

    2005-01-01

    This lecture will introduce the concepts of modeling, data assimilation and high-performance computing as they relate to the study of atmospheric composition. The lecture will work from basic definitions and will strive to provide a framework for thinking about the development and application of models and data assimilation systems. It will not provide technical or algorithmic information, leaving that to textbooks, technical reports, and ultimately scientific journals. References to a number of textbooks and papers will be provided as a gateway to the literature.

  13. Lifetime Cost and Performance model for photovoltaic power systems

    Science.gov (United States)

    Borden, C. S.

    1978-01-01

    This paper describes the approach and procedures of the Lifetime Cost and Performance (LCP) model for photovoltaic power systems. The LCP model is designed to evaluate the impact of alternative initial design and recurrent policy decisions on both cost and power output over the lifetime of a photovoltaic power plant. LCP is, therefore, useful to system designers and operators for addressing questions relating to optimal system configuration, installation activities, level of effort and timing of operations/maintenance actions, allowable degradation and replacement options.

  14. A Fuzzy Knowledge Representation Model for Student Performance Assessment

    DEFF Research Database (Denmark)

    Badie, Farshad

    Knowledge representation models based on Fuzzy Description Logics (DLs) can provide a foundation for reasoning in intelligent learning environments. While basic DLs are suitable for expressing crisp concepts and binary relationships, Fuzzy DLs are capable of processing degrees of truth/completeness for vague or imprecise information. This paper tackles the issue of representing fuzzy classes using OWL2 in a dataset describing Performance Assessment Results of Students (PARS).

  15. Human Engineering Modeling and Performance Lab Study Project

    Science.gov (United States)

    Oliva-Buisson, Yvette J.

    2014-01-01

    The HEMAP (Human Engineering Modeling and Performance) Lab is a joint effort between the Industrial and Human Engineering group and the KAVE (Kennedy Advanced Visualizations Environment) group. The lab consists of a sixteen-camera system that is used to capture human motions and operational tasks through the use of a Velcro suit equipped with sensors, and then to simulate these tasks in an ergonomic software package known as Jack. The Jack software is able to identify potential risk hazards.

  16. Thermal performance modeling of cross-flow heat exchangers

    CERN Document Server

    Cabezas-Gómez, Luben; Saíz-Jabardo, José Maria

    2014-01-01

    This monograph introduces a numerical computational methodology for thermal performance modeling of cross-flow heat exchangers, with applications in chemical, refrigeration and automobile industries. This methodology allows obtaining effectiveness-number of transfer units (e-NTU) data and has been used for simulating several standard and complex flow arrangements configurations of cross-flow heat exchangers. Simulated results have been validated through comparisons with results from available exact and approximate analytical solutions. Very accurate results have been obtained over wide ranges
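
    For the common configuration with both fluids unmixed, a standard closed-form e-NTU approximation (found in heat-transfer textbooks, not taken from the monograph) is ε = 1 − exp{(NTU^0.22/Cr)·[exp(−Cr·NTU^0.78) − 1]}, where Cr is the ratio of the smaller to the larger heat capacity rate:

```python
import math

def effectiveness_crossflow_unmixed(ntu, cr):
    """Approximate effectiveness of a cross-flow heat exchanger with
    both fluids unmixed (standard e-NTU correlation)."""
    return 1.0 - math.exp((1.0 / cr) * ntu**0.22 * (math.exp(-cr * ntu**0.78) - 1.0))

# Effectiveness grows with NTU for a fixed heat-capacity-rate ratio
for ntu in (0.5, 1.0, 2.0, 4.0):
    print(f"NTU = {ntu}: eps = {effectiveness_crossflow_unmixed(ntu, cr=0.75):.3f}")
```

    As Cr → 0 the expression reduces to ε = 1 − exp(−NTU), the single-stream limit, which is a useful sanity check on any numerical e-NTU implementation.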

  17. Duct thermal performance models for large commercial buildings

    Energy Technology Data Exchange (ETDEWEB)

    Wray, Craig P.

    2003-10-01

    Despite the potential for significant energy savings by reducing duct leakage or other thermal losses from duct systems in large commercial buildings, California Title 24 has no provisions to credit energy-efficient duct systems in these buildings. A substantial reason is the lack of readily available simulation tools to demonstrate the energy-saving benefits associated with efficient duct systems in large commercial buildings. The overall goal of the Efficient Distribution Systems (EDS) project within the PIER High Performance Commercial Building Systems Program is to bridge the gaps in current duct thermal performance modeling capabilities, and to expand our understanding of duct thermal performance in California large commercial buildings. As steps toward this goal, our strategy in the EDS project involves two parts: (1) developing a whole-building energy simulation approach for analyzing duct thermal performance in large commercial buildings, and (2) using the tool to identify the energy impacts of duct leakage in California large commercial buildings, in support of future recommendations to address duct performance in the Title 24 Energy Efficiency Standards for Nonresidential Buildings. The specific technical objectives for the EDS project were to: (1) Identify a near-term whole-building energy simulation approach that can be used in the impacts analysis task of this project (see Objective 3), with little or no modification. A secondary objective is to recommend how to proceed with long-term development of an improved compliance tool for Title 24 that addresses duct thermal performance. (2) Develop an Alternative Calculation Method (ACM) change proposal to include a new metric for thermal distribution system efficiency in the reporting requirements for the 2005 Title 24 Standards. The metric will facilitate future comparisons of different system types using a common "yardstick". (3) Using the selected near-term simulation approach

  18. The application of DEA model in enterprise environmental performance auditing

    Science.gov (United States)

    Li, F.; Zhu, L. Y.; Zhang, J. D.; Liu, C. Y.; Qu, Z. G.; Xiao, M. S.

    2017-01-01

    As a part of society, enterprises have an inescapable responsibility for environmental protection and governance. This article discusses the feasibility and necessity of enterprise environmental performance auditing and uses the DEA model to calculate the environmental performance of Haier as an example. Most of the reference data are selected and sorted from Haier’s environmental reports published in 2008, 2009, 2011 and 2015, and some of the data come from published articles and fieldwork. All the calculation results are calculated with the DEAP software and have high credibility. The analysis results of this article can give corporate managers an idea about using environmental performance auditing to adjust their corporate environmental investment quotas and change their companies’ environmental strategies.

  19. Comparative assessment of PV plant performance models considering climate effects

    DEFF Research Database (Denmark)

    Tina, Giuseppe; Ventura, Cristina; Sera, Dezso

    2017-01-01

    The methodological approach is based on comparative tests of the analyzed models applied to two PV plants, installed respectively in the north of Denmark (Aalborg) and in the south of Italy (Agrigento). The different ambient, operating and installation conditions allow to understand how these factors impact the precision and effectiveness of such approaches; among these factors it is worth mentioning the different percentage of the diffuse component of the yearly solar radiation on the global one. The experimental results show the effectiveness of the proposed approach. In order to have the possibility to analyze and compare the performance of the studied PV plants with others, the efficiency of the systems has been estimated by both the conventional Performance Ratio and a Corrected Performance Ratio.

  20. Advanced transport systems analysis, modeling, and evaluation of performances

    CERN Document Server

    Janić, Milan

    2014-01-01

    This book provides a systematic analysis, modeling and evaluation of the performance of advanced transport systems. It offers an innovative approach by presenting a multidimensional examination of the performance of advanced transport systems and transport modes, useful for both theoretical and practical purposes. Advanced transport systems for the twenty-first century are characterized by the superiority of one or several of their infrastructural, technical/technological, operational, economic, environmental, social, and policy performances as compared to their conventional counterparts. The advanced transport systems considered include: Bus Rapid Transit (BRT) and Personal Rapid Transit (PRT) systems in urban area(s), electric and fuel cell passenger cars, high speed tilting trains, High Speed Rail (HSR), Trans Rapid Maglev (TRM), Evacuated Tube Transport system (ETT), advanced commercial subsonic and Supersonic Transport Aircraft (STA), conventionally- and Liquid Hydrogen (LH2)-fuelled commercial air trans...

  1. Logit Model based Performance Analysis of an Optimization Algorithm

    Science.gov (United States)

    Hernández, J. A.; Ospina, J. D.; Villada, D.

    2011-09-01

    In this paper, the performance of the Multi Dynamics Algorithm for Global Optimization (MAGO) is studied through simulation using five standard test functions. To guarantee that the algorithm converges to a global optimum, a set of experiments searching for the best combination between the only two MAGO parameters -number of iterations and number of potential solutions, are considered. These parameters are sequentially varied, while increasing the dimension of several test functions, and performance curves were obtained. The MAGO was originally designed to perform well with small populations; therefore, the self-adaptation task with small populations is more challenging while the problem dimension is higher. The results showed that the convergence probability to an optimal solution increases according to growing patterns of the number of iterations and the number of potential solutions. However, the success rates slow down when the dimension of the problem escalates. Logit Model is used to determine the mutual effects between the parameters of the algorithm.
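
The Logit Model step described above can be sketched as a logistic regression of convergence success on the algorithm's two parameters. Everything below is a toy reconstruction: the data are synthetic, the variable names are invented, and a plain stochastic gradient-ascent fit stands in for whatever estimation routine the authors used.

```python
import math
import random

# A minimal logit fit by gradient ascent on the log-likelihood. Features stand
# in for (normalized) iteration count and population size; labels mark whether
# a run converged to the optimum.
def fit_logit(X, y, lr=0.1, steps=2000):
    w = [0.0] * (len(X[0]) + 1)            # intercept + one weight per feature
    for _ in range(steps):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z)) # predicted success probability
            g = yi - p                     # gradient of the log-likelihood
            w[0] += lr * g
            for j, xj in enumerate(xi):
                w[j + 1] += lr * g * xj
    return w

random.seed(0)
X = [[random.uniform(0, 1), random.uniform(0, 1)] for _ in range(200)]
# Synthetic truth: success more likely with more iterations and larger populations.
y = [1 if 3 * a + 2 * b - 2.5 + random.gauss(0, 0.3) > 0 else 0 for a, b in X]
w = fit_logit(X, y)
```

Both fitted slopes come out positive, matching the paper's finding that convergence probability grows with both parameters.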

  2. A performance measurement using balanced scorecard and structural equation modeling

    Directory of Open Access Journals (Sweden)

    Rosha Makvandi

    2014-02-01

    Full Text Available During the past few years, the balanced scorecard (BSC) has been widely used as a promising method for performance measurement. BSC studies organizations in terms of four perspectives: customer, internal processes, learning and growth, and financial figures. This paper presents a hybrid of BSC and structural equation modeling (SEM) to measure the performance of an Iranian university in the province of Alborz, Iran. The study uses this conceptual method, designs a questionnaire, and distributes it among university students and professors. Using the SEM technique, the survey analyzes the data, and the results indicate that the university did poorly in terms of all four perspectives. The survey extracts the necessary improvement targets by presenting attributes required for performance improvement.

  3. Optimization and Performance Modeling of Stencil Computations on Modern Microprocessors

    Energy Technology Data Exchange (ETDEWEB)

    Datta, Kaushik; Kamil, Shoaib; Williams, Samuel; Oliker, Leonid; Shalf, John; Yelick, Katherine

    2007-06-01

    Stencil-based kernels constitute the core of many important scientific applications on block-structured grids. Unfortunately, these codes achieve a low fraction of peak performance, due primarily to the disparity between processor and main memory speeds. In this paper, we explore the impact of trends in memory subsystems on a variety of stencil optimization techniques and develop performance models to analytically guide our optimizations. Our work targets cache reuse methodologies across single and multiple stencil sweeps, examining cache-aware algorithms as well as cache-oblivious techniques on the Intel Itanium2, AMD Opteron, and IBM Power5. Additionally, we consider stencil computations on the heterogeneous multicore design of the Cell processor, a machine with an explicitly managed memory hierarchy. Overall our work represents one of the most extensive analyses of stencil optimizations and performance modeling to date. Results demonstrate that recent trends in memory system organization have reduced the efficacy of traditional cache-blocking optimizations. We also show that a cache-aware implementation is significantly faster than a cache-oblivious approach, while the explicitly managed memory on Cell enables the highest overall efficiency: Cell attains 88% of algorithmic peak while the best competing cache-based processor achieves only 54% of algorithmic peak performance.
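
The kernels under study have a simple core. Below is a minimal 5-point Jacobi sweep, written in Python only to show the data-access pattern that the cited cache-blocking and cache-oblivious optimizations restructure in compiled code.

```python
# One Jacobi relaxation sweep of a 5-point stencil over a 2D grid: each
# interior point is replaced by the average of its four neighbors.
def jacobi_sweep(grid):
    n = len(grid)
    new = [row[:] for row in grid]         # out-of-place update (Jacobi)
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            new[i][j] = 0.25 * (grid[i - 1][j] + grid[i + 1][j] +
                                grid[i][j - 1] + grid[i][j + 1])
    return new

# 4x4 grid: boundary fixed at 1.0, interior initialized to 0.0.
g = [[1.0] * 4] + [[1.0, 0.0, 0.0, 1.0] for _ in range(2)] + [[1.0] * 4]
g1 = jacobi_sweep(g)
```

Cache-aware variants tile the `i`/`j` loops into blocks sized to the target cache; that restructuring changes the traversal order, not the arithmetic.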

  4. Development of performance models for thick composites in compression

    Energy Technology Data Exchange (ETDEWEB)

    Blake, H.W.; Grimsby, H.J.; Starbuck, J.M.; Welch, D.E.

    1991-11-01

    This report details initial activities and results from an investigation into the failure of thick-section composite cylinders loaded in compression. The efforts are aimed at the development of models for predicting cylinder performance based on composite material strengths derived from ring and cylinder tests of unidirectional materials. Initial results indicate that existing failure theories are applicable provided that material strength allowables are based on representative tests, and that appropriate solutions for cylinder stresses are used. Both the failure criteria and stress solution must allow for the three-dimensional stress state and for the discrete layer construction. Predictions for an initial test cylinder, which achieved a record pressure in hydrotest, are consistent with the observed performance. Performance model results obtained for a range of laminate constructions indicate this design to be optimum. Improvements in test fixturing also contributed to the record performance for this first cylinder. This work is sponsored by the Director as a three-year project funded from the Oak Ridge National Laboratory seed-money program.

  5. Forecasting Performance of Asymmetric GARCH Stock Market Volatility Models

    Directory of Open Access Journals (Sweden)

    Hojin Lee

    2009-12-01

    Full Text Available We investigate the asymmetry between positive and negative returns in their effect on conditional variance of the stock market index and incorporate the characteristics to form an out-of-sample volatility forecast. Contrary to prior evidence, however, the results in this paper suggest that no asymmetric GARCH model is superior to the basic GARCH(1,1) model. It is our prior knowledge that, for equity returns, it is unlikely that positive and negative shocks have the same impact on the volatility. In order to reflect this intuition, we implement three diagnostic tests for volatility models: the Sign Bias Test, the Negative Size Bias Test, and the Positive Size Bias Test, and the tests against the alternatives of QGARCH and GJR-GARCH. The asymmetry test results indicate that the sign and the size of the unexpected return shock do not influence current volatility differently, which contradicts our presumption that there are asymmetric effects in the stock market volatility. This result is in line with various diagnostic tests which are designed to determine whether the GARCH(1,1) volatility estimates adequately represent the data. The diagnostic tests in section 2 indicate that the GARCH(1,1) model for weekly KOSPI returns is robust to the misspecification test. We also investigate two representative asymmetric GARCH models, the QGARCH and GJR-GARCH models, for out-of-sample forecasting performance. The out-of-sample forecasting ability test reveals that no single model clearly outperforms. The GJR-GARCH and QGARCH models give mixed results in forecasting ability on all four criteria across all forecast horizons considered. Also, the predictive accuracy test of Diebold and Mariano, based on both absolute and squared prediction errors, suggests that the forecasts from the linear and asymmetric GARCH models need not be significantly different from each other.
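
For reference, the benchmark GARCH(1,1) variance recursion that the asymmetric models fail to beat can be written in a few lines; the parameter values below are illustrative, not the paper's KOSPI estimates.

```python
# GARCH(1,1): sigma2_t = omega + alpha * r_{t-1}**2 + beta * sigma2_{t-1},
# initialized at the unconditional variance omega / (1 - alpha - beta).
def garch11_variance(returns, omega, alpha, beta):
    sigma2 = [omega / (1.0 - alpha - beta)]
    for r in returns[:-1]:
        sigma2.append(omega + alpha * r**2 + beta * sigma2[-1])
    return sigma2

s2 = garch11_variance([0.0, 0.02, -0.01], omega=1e-5, alpha=0.05, beta=0.90)
```

Asymmetric variants such as GJR-GARCH simply add a term that switches on only for negative shocks, which is exactly the effect the paper's tests fail to detect in the KOSPI data.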

  6. A Fluid Model for Performance Analysis in Cellular Networks

    Directory of Open Access Journals (Sweden)

    Coupechoux Marceau

    2010-01-01

    Full Text Available We propose a new framework to study the performance of cellular networks using a fluid model, and we derive from this model analytical formulas for interference, outage probability, and spatial outage probability. The key idea of the fluid model is to consider the discrete base station (BS) entities as a continuum of transmitters that are spatially distributed in the network. This model allows us to obtain simple analytical expressions that reveal the main characteristics of the network. In this paper, we focus on the downlink other-cell interference factor (OCIF), which is defined for a given user as the ratio of its outer cell received power to its inner cell received power. A closed-form formula of the OCIF is provided in this paper. From this formula, we are able to obtain the global outage probability as well as the spatial outage probability, which depends on the location of a mobile station (MS) initiating a new call. Our analytical results are compared to Monte Carlo simulations performed in a traditional hexagonal network. Furthermore, we demonstrate an application of the outage probability related to cell breathing and densification of cellular networks.

  7. A nonlinear model for top fuel dragster dynamic performance assessment

    Science.gov (United States)

    Spanos, P. D.; Castillo, D. H.; Kougioumtzoglou, I. A.; Tapia, R. A.

    2012-02-01

    The top fuel dragster is the fastest and quickest vehicle in drag racing. This vehicle is capable of travelling a quarter mile in less than 4.5 s, reaching a final speed in excess of 330 miles per hour. The average power delivered by its engine exceeds 7000 Hp. To analyse and eventually increase the performance of a top fuel dragster, a dynamic model of the vehicle is developed. Longitudinal, vertical, and pitching chassis motions are considered, as well as drive-train dynamics. The aerodynamics of the vehicle, the engine characteristics, and the force due to the combustion gases are incorporated into the model. Further, a simplified model of the traction characteristics of the rear tyres is developed where the traction is calculated as a function of the slip ratio and the velocity. The resulting nonlinear, coupled differential equations of motion are solved using a fourth-order Runge-Kutta numerical integration scheme. Several simulation runs are made to investigate the effects of the aerodynamics and of the engine's initial torque in the performance of the vehicle. The results of the computational simulations are scrutinised by comparisons with data from actual dragster races. Ultimately, the proposed dynamic model of the dragster can be used to improve the aerodynamics, the engine and clutch set-ups of the vehicle, and possibly facilitate the redesign of the dragster.
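
The integration scheme mentioned above is standard and easy to sketch. Below, a fourth-order Runge-Kutta step drives a toy one-degree-of-freedom longitudinal model (constant thrust minus quadratic aerodynamic drag); the mass, thrust, and drag numbers are placeholders, and the paper's actual model additionally couples chassis pitch, drive-train, and tyre-slip dynamics.

```python
# Classic RK4 step for dv/dt = f(t, v).
def rk4_step(f, t, v, dt):
    k1 = f(t, v)
    k2 = f(t + dt / 2, v + dt / 2 * k1)
    k3 = f(t + dt / 2, v + dt / 2 * k2)
    k4 = f(t + dt, v + dt * k3)
    return v + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Toy longitudinal model: m * dv/dt = F - c * v**2 (thrust minus drag).
m, F, c = 1000.0, 30000.0, 1.2        # mass [kg], thrust [N], drag coefficient
accel = lambda t, v: (F - c * v**2) / m

v, dt = 0.0, 0.01
for step in range(400):               # integrate 4 s of acceleration
    v = rk4_step(accel, step * dt, v, dt)
```

The analytic solution of this toy model is v(t) = sqrt(F/c) * tanh(t / tau) with tau = m / sqrt(F*c), so the numerical result can be checked directly.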

  8. Thermal performance curves of Paramecium caudatum: a model selection approach.

    Science.gov (United States)

    Krenek, Sascha; Berendonk, Thomas U; Petzoldt, Thomas

    2011-05-01

    The ongoing climate change has motivated numerous studies investigating the temperature response of various organisms, especially that of ectotherms. To correctly describe the thermal performance of these organisms, functions are needed which sufficiently fit to the complete optimum curve. Surprisingly, model-comparisons for the temperature-dependence of population growth rates of an important ectothermic group, the protozoa, are still missing. In this study, temperature reaction norms of natural isolates of the freshwater protist Paramecium caudatum were investigated, considering nearly the entire temperature range. These reaction norms were used to estimate thermal performance curves by applying a set of commonly used model functions. An information theory approach was used to compare models and to identify the best ones for describing these data. Our results indicate that the models which can describe negative growth at the high- and low-temperature branch of an optimum curve are preferable. This is a prerequisite for accurately calculating the critical upper and lower thermal limits. While we detected a temperature optimum of around 29 °C for all investigated clonal strains, the critical thermal limits were considerably different between individual clones. Here, the tropical clone showed the narrowest thermal tolerance, with a shift of its critical thermal limits to higher temperatures. Copyright © 2010 Elsevier GmbH. All rights reserved.

  9. A Five-Year CMAQ Model Performance for Wildfires and ...

    Science.gov (United States)

    Biomass burning has been identified as an important contributor to the degradation of air quality because of its impact on ozone and particulate matter. Two components of the biomass burning inventory, wildfires and prescribed fires are routinely estimated in the national emissions inventory. However, there is a large amount of uncertainty in the development of these emission inventory sectors. We have completed a 5 year set of CMAQ model simulations (2008-2012) in which we have simulated regional air quality with and without the wildfire and prescribed fire inventory. We will examine CMAQ model performance over regions with significant PM2.5 and Ozone contribution from prescribed fires and wildfires. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  10. Cooperative cognitive radio networking system model, enabling techniques, and performance

    CERN Document Server

    Cao, Bin; Mark, Jon W

    2016-01-01

    This SpringerBrief examines the active cooperation between users of Cooperative Cognitive Radio Networking (CCRN), exploring the system model, enabling techniques, and performance. The brief provides a systematic study on active cooperation between primary users and secondary users, i.e., (CCRN), followed by the discussions on research issues and challenges in designing spectrum-energy efficient CCRN. As an effort to shed light on the design of spectrum-energy efficient CCRN, they model the CCRN based on orthogonal modulation and orthogonally dual-polarized antenna (ODPA). The resource allocation issues are detailed with respect to both models, in terms of problem formulation, solution approach, and numerical results. Finally, the optimal communication strategies for both primary and secondary users to achieve spectrum-energy efficient CCRN are analyzed.

  11. Evaluating the performance and utility of regional climate models

    DEFF Research Database (Denmark)

    Christensen, Jens H.; Carter, Timothy R.; Rummukainen, Markku

    2007-01-01

    This special issue of Climatic Change contains a series of research articles documenting co-ordinated work carried out within a 3-year European Union project 'Prediction of Regional scenarios and Uncertainties for Defining European Climate change risks and Effects' (PRUDENCE). The main objective of the PRUDENCE project was to provide high resolution climate change scenarios for Europe at the end of the twenty-first century by means of dynamical downscaling (regional climate modelling) of global climate simulations. The first part of the issue comprises seven overarching PRUDENCE papers on: (1) the design of the model simulations and analyses of climate model performance, (2 and 3) evaluation and intercomparison of simulated climate changes, (4 and 5) specialised analyses of impacts on water resources and on other sectors including agriculture, ecosystems, energy, and transport, and (6) investigation of extreme…

  12. Models for the energy performance of low-energy houses

    DEFF Research Database (Denmark)

    Andersen, Philip Hvidthøft Delff

    …building. The building is well-insulated and features large modern energy-efficient windows and floor heating. These features lead to increased non-linear responses to solar radiation and longer time constants. The building is equipped with advanced control and measuring equipment. Experiments are designed and performed… The aim of this thesis is data-driven modeling of heat dynamics of buildings. Traditionally, thermal modeling of buildings is done using simulation tools which take information about the construction, weather data, occupancy etc. as inputs and generate deterministic energy profiles of the buildings… such as mechanical ventilation, floor heating, and control of the lighting effect, the heat dynamics must be taken into account. Hence, this thesis provides methods for data-driven modeling of heat dynamics of modern buildings. While most of the work in this thesis is related to characterization of heat dynamics…
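
A common grey-box starting point for such data-driven heat-dynamics models is a lumped RC (resistance-capacitance) formulation, C*dT/dt = (Ta - T)/R + phi. The sketch below uses invented parameter values and simple Euler stepping, not the thesis's estimation machinery.

```python
# Lumped first-order building model: thermal capacity C [J/K], envelope
# resistance R [K/W], ambient temperature Ta [deg C], heat input phi [W].
def simulate_rc(T0, Ta, R, C, phi, dt, steps):
    T = T0
    for _ in range(steps):
        T += dt * ((Ta - T) / (R * C) + phi / C)   # C*dT/dt = (Ta - T)/R + phi
    return T

# With no heating, the indoor temperature relaxes toward ambient with time
# constant R*C (here 1e4 s); after ~6 time constants it is nearly at Ta.
T_end = simulate_rc(T0=20.0, Ta=0.0, R=0.01, C=1e6, phi=0.0, dt=60.0, steps=1000)
```

Identifying R and C (and higher-order variants) from measured temperature and heat-input series is precisely the characterization task the thesis addresses.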

  13. Exploitation of cloud computing in management of construction projects in Slovakia

    OpenAIRE

    Mandičák, Tomáš; Mesároš, Peter; Kozlovská, Mária

    2016-01-01

    Cloud computing is a highly topical issue. It represents a new model for information technology (IT) services based on the exploitation of the Web (the "cloud") and other application platforms, as well as software as a service. In general, the exploitation of cloud computing in construction project management has several advantages, as demonstrated by several research reports. Currently, research quantifying the exploitation of cloud computing in the Slovak con...

  14. ASYMMETRIC PRICE TRANSMISSION MODELING: THE IMPORTANCE OF MODEL COMPLEXITY AND THE PERFORMANCE OF THE SELECTION CRITERIA

    Directory of Open Access Journals (Sweden)

    Henry de-Graft Acquah

    2013-01-01

    Full Text Available Information criteria provide an attractive basis for selecting the best model from a set of competing asymmetric price transmission models or theories. However, little is understood about the sensitivity of the model selection methods to model complexity. This study therefore fits competing asymmetric price transmission models that differ in complexity to simulated data and evaluates the ability of the model selection methods to recover the true model. The results of Monte Carlo experimentation suggest that in general BIC, CAIC and DIC were superior to AIC when the true data generating process was the standard error correction model, whereas AIC was more successful when the true model was the complex error correction model. It is also shown that the model selection methods performed better in large samples for a complex asymmetric data generating process than with a standard asymmetric data generating process. Except for complex models, AIC's performance did not make substantial gains in recovery rates as sample size increased. The research findings demonstrate the influence of model complexity in asymmetric price transmission model comparison and selection.
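
The competing criteria are simple functions of a fitted model's maximized log-likelihood logL, parameter count k, and sample size n (DIC is omitted here since it requires posterior simulation). The two hypothetical fits below illustrate how AIC and BIC can disagree, in the spirit of the paper's Monte Carlo comparison; the log-likelihood values are invented.

```python
import math

# Standard definitions: smaller is better for all three criteria.
def aic(logL, k):      return 2 * k - 2 * logL
def bic(logL, k, n):   return k * math.log(n) - 2 * logL
def caic(logL, k, n):  return k * (math.log(n) + 1) - 2 * logL  # consistent AIC

# Hypothetical fits: a standard ECM (k=4) vs. a more complex ECM (k=8).
n = 200
candidates = {"standard": (-310.0, 4), "complex": (-305.0, 8)}
best_by_aic = min(candidates, key=lambda m: aic(*candidates[m]))
best_by_bic = min(candidates, key=lambda m: bic(*candidates[m], n))
```

With these numbers AIC prefers the complex model while BIC's heavier penalty (k*log n vs. 2k) selects the standard one, mirroring the paper's finding that the criteria diverge as complexity grows.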

  15. An opportunity cost model of subjective effort and task performance.

    Science.gov (United States)

    Kurzban, Robert; Duckworth, Angela; Kable, Joseph W; Myers, Justus

    2013-12-01

    Why does performing certain tasks cause the aversive experience of mental effort and concomitant deterioration in task performance? One explanation posits a physical resource that is depleted over time. We propose an alternative explanation that centers on mental representations of the costs and benefits associated with task performance. Specifically, certain computational mechanisms, especially those associated with executive function, can be deployed for only a limited number of simultaneous tasks at any given moment. Consequently, the deployment of these computational mechanisms carries an opportunity cost--that is, the next-best use to which these systems might be put. We argue that the phenomenology of effort can be understood as the felt output of these cost/benefit computations. In turn, the subjective experience of effort motivates reduced deployment of these computational mechanisms in the service of the present task. These opportunity cost representations, then, together with other cost/benefit calculations, determine effort expended and, everything else equal, result in performance reductions. In making our case for this position, we review alternative explanations for both the phenomenology of effort associated with these tasks and for performance reductions over time. Likewise, we review the broad range of relevant empirical results from across sub-disciplines, especially psychology and neuroscience. We hope that our proposal will help to build links among the diverse fields that have been addressing similar questions from different perspectives, and we emphasize ways in which alternative models might be empirically distinguished.

  16. A New Performance Improvement Model: Adding Benchmarking to the Analysis of Performance Indicator Data.

    Science.gov (United States)

    Al-Kuwaiti, Ahmed; Homa, Karen; Maruthamuthu, Thennarasu

    2016-01-01

    A performance improvement model was developed that focuses on the analysis and interpretation of performance indicator (PI) data using statistical process control and benchmarking. PIs are suitable for comparison with benchmarks only if the data fall within the statistically accepted limit-that is, show only random variation. Specifically, if there is no significant special-cause variation over a period of time, then the data are ready to be benchmarked. The proposed Define, Measure, Control, Internal Threshold, and Benchmark model is adapted from the Define, Measure, Analyze, Improve, Control (DMAIC) model. The model consists of the following five steps: Step 1. Define the process; Step 2. Monitor and measure the variation over the period of time; Step 3. Check the variation of the process; if stable (no significant variation), go to Step 4; otherwise, control variation with the help of an action plan; Step 4. Develop an internal threshold and compare the process with it; Step 5.1. Compare the process with an internal benchmark; and Step 5.2. Compare the process with an external benchmark. The steps are illustrated through the use of health care-associated infection (HAI) data collected for 2013 and 2014 from the Infection Control Unit, King Fahd Hospital, University of Dammam, Saudi Arabia. Monitoring variation is an important strategy in understanding and learning about a process. In the example, HAI was monitored for variation in 2013, and the need to have a more predictable process prompted the need to control variation by an action plan. The action plan was successful, as noted by the shift in the 2014 data, compared to the historical average, and, in addition, the variation was reduced. The model is subject to limitations: For example, it cannot be used without benchmarks, which need to be calculated the same way with similar patient populations, and it focuses only on the "Analyze" part of the DMAIC model.
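
Steps 2 and 3 of the model amount to a statistical process control check before benchmarking; below is a minimal sketch with invented monthly indicator rates and conventional 3-sigma control limits (the paper's HAI data are not reproduced here).

```python
import statistics

# A process is "stable" (ready for Step 4/5 benchmarking) if every observation
# falls inside the 3-sigma control limits, i.e. shows only random variation.
def is_stable(data):
    mean = statistics.mean(data)
    sd = statistics.stdev(data)            # sample standard deviation
    lcl, ucl = mean - 3 * sd, mean + 3 * sd
    return all(lcl <= x <= ucl for x in data), (lcl, ucl)

rates = [2.1, 1.8, 2.4, 2.0, 1.9, 2.2, 2.3, 1.7, 2.0, 2.1]  # invented monthly rates
stable, (lcl, ucl) = is_stable(rates)
```

Only when `stable` is true does the model proceed to compare the indicator against internal and external benchmarks; otherwise an action plan is needed first.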

  17. Test of the classic model for predicting endurance running performance.

    Science.gov (United States)

    McLaughlin, James E; Howley, Edward T; Bassett, David R; Thompson, Dixie L; Fitzhugh, Eugene C

    2010-05-01

    To compare the classic physiological variables linked to endurance performance (VO2max, %VO2max at lactate threshold (LT), and running economy (RE)) with peak treadmill velocity (PTV) as predictors of performance in a 16-km time trial. Seventeen healthy, well-trained distance runners (10 males and 7 females) underwent laboratory testing to determine maximal oxygen uptake (VO2max), RE, percentage of maximal oxygen uptake at the LT (%VO2max at LT), running velocity at LT, and PTV. Velocity at VO2max (vVO2max) was calculated from RE and VO2max. Three stepwise regression models were used to determine the best predictors (classic vs treadmill performance protocols) for the 16-km running time trial. Simple Pearson correlations of the variables with 16-km performance showed vVO2max to have the highest correlation (r = -0.972) and %VO2max at the LT the lowest (r = 0.136). The correlation coefficients for LT, VO2max, and PTV were very similar in magnitude (r = -0.903 to r = -0.892). When VO2max, %VO2max at LT, RE, and PTV were entered into SPSS stepwise analysis, VO2max explained 81.3% of the total variance, and RE accounted for an additional 10.7%. vVO2max was shown to be the best predictor of the 16-km performance, accounting for 94.4% of the total variance. The measured velocity at VO2max (PTV) was highly correlated with the estimated velocity at vVO2max (r = 0.8867). Among well-trained subjects heterogeneous in VO2max and running performance, vVO2max is the best predictor of running performance because it integrates both maximal aerobic power and the economy of running. The PTV is linked to the same physiological variables that determine vVO2max.
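
The key quantity, vVO2max, is the velocity at which oxygen demand equals VO2max, calculated from RE and VO2max as the abstract states. A minimal sketch with illustrative values (not the study's subjects), assuming RE is expressed as oxygen cost per kilometre:

```python
# vVO2max integrates maximal aerobic power (VO2max, ml/kg/min) with running
# economy (RE, ml O2 per kg per km): velocity at which demand equals VO2max.
def v_vo2max(vo2max_ml_kg_min, re_ml_kg_km):
    """Velocity at VO2max in km/h."""
    return vo2max_ml_kg_min / re_ml_kg_km * 60.0   # (km/min) -> km/h

v = v_vo2max(vo2max_ml_kg_min=70.0, re_ml_kg_km=200.0)   # -> 21.0 km/h
```

This is why vVO2max predicts performance better than either component alone: two runners with equal VO2max but different economy get different velocities.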

  18. Performance Benchmarking Tsunami Models for NTHMP's Inundation Mapping Activities

    Science.gov (United States)

    Horrillo, Juan; Grilli, Stéphan T.; Nicolsky, Dmitry; Roeber, Volker; Zhang, Joseph

    2015-03-01

    The coastal states and territories of the United States (US) are vulnerable to devastating tsunamis from near-field or far-field coseismic and underwater/subaerial landslide sources. Following the catastrophic 2004 Indian Ocean tsunami, the National Tsunami Hazard Mitigation Program (NTHMP) accelerated the development of public safety products for the mitigation of these hazards. In response to this initiative, US coastal states and territories speeded up the process of developing/enhancing/adopting tsunami models that can be used for developing inundation maps and evacuation plans. One of NTHMP's requirements is that all operational and inundation-based numerical (O&I) models used for such purposes be properly validated against established standards to ensure the reliability of tsunami inundation maps as well as to achieve a basic level of consistency between parallel efforts. The validation of several O&I models was considered during a workshop held in 2011 at Texas A&M University (Galveston). This validation was performed based on the existing standard (OAR-PMEL-135), which provides a list of benchmark problems (BPs) covering various tsunami processes that models must meet to be deemed acceptable. Here, we summarize key approaches followed, results, and conclusions of the workshop. Eight distinct tsunami models were validated and cross-compared by using a subset of the BPs listed in the OAR-PMEL-135 standard. Of the several BPs available, only two based on laboratory experiments are detailed here for the sake of brevity, since they are considered sufficiently comprehensive. Average relative errors associated with expected parameter values such as maximum surface amplitude/runup are estimated. The level of agreement with the reference data, reasons for discrepancies between model results, and some of the limitations are discussed. In general, dispersive models were found to perform better than nondispersive models, but differences were relatively small, in part…

  19. Reasoning About Programs by Exploiting the Environment

    Science.gov (United States)

    1994-02-02

    Reasoning About Programs by Exploiting the Environment. Limor Fix and Fred B. Schneider. TR 94-1409, February 1994, Department of Computer Science. Limor Fix is also supported, in part, by a Fulbright post-doctoral award.

  20. Photovoltaic Pixels for Neural Stimulation: Circuit Models and Performance.

    Science.gov (United States)

    Boinagrov, David; Lei, Xin; Goetz, Georges; Kamins, Theodore I; Mathieson, Keith; Galambos, Ludwig; Harris, James S; Palanker, Daniel

    2016-02-01

    Photovoltaic conversion of pulsed light into pulsed electric current enables optically-activated neural stimulation with miniature wireless implants. In photovoltaic retinal prostheses, patterns of near-infrared light projected from video goggles onto subretinal arrays of photovoltaic pixels are converted into patterns of current to stimulate the inner retinal neurons. We describe a model of these devices and evaluate the performance of photovoltaic circuits, including the electrode-electrolyte interface. Characteristics of the electrodes measured in saline with various voltages, pulse durations, and polarities were modeled as voltage-dependent capacitances and Faradaic resistances. The resulting mathematical model of the circuit yielded dynamics of the electric current generated by the photovoltaic pixels illuminated by pulsed light. Voltages measured in saline with a pipette electrode above the pixel closely matched results of the model. Using the circuit model, our pixel design was optimized for maximum charge injection under various lighting conditions and for different stimulation thresholds. To speed discharge of the electrodes between the pulses of light, a shunt resistor was introduced and optimized for high frequency stimulation.
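
The role of the shunt resistor described above can be illustrated with the simplest possible discharge model: an electrode capacitance C decaying through a shunt resistance R with time constant tau = R*C. The component values below are invented, and the paper's actual circuit model uses voltage-dependent capacitances plus Faradaic resistances.

```python
import math

# Exponential discharge of the electrode voltage between light pulses:
# V(t) = V0 * exp(-t / (R * C)).
def residual_voltage(v0, r_ohm, c_farad, t_s):
    return v0 * math.exp(-t_s / (r_ohm * c_farad))

# Hypothetical values: 10 kOhm shunt, 10 nF electrode -> tau = 0.1 ms, so the
# pixel is almost fully discharged 1 ms after a pulse, enabling high-frequency
# stimulation without charge accumulation.
v_left = residual_voltage(0.5, 10e3, 10e-9, 1e-3)
```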

  1. Modeling impact of environmental factors on photovoltaic array performance

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Jie; Sun, Yize; Xu, Yang [College of Mechanical Engineering, Donghua University NO.2999, North Renmin Road, Shanghai (China)

    2013-07-01

    This paper presents a methodology to model and quantify the impact of three environmental factors, the ambient temperature, the incident irradiance and the wind speed, on the performance of a photovoltaic array operating under outdoor conditions. First, a simple correlation relating operating temperature to the three environmental variables is validated for the range of wind speeds studied, 2-8 m/s, and for irradiance values between 200 and 1000 W/m². The root mean square error (RMSE) between modeled operating temperature and measured values is 1.19% and the mean bias error (MBE) is -0.09%. The environmental factors studied influence the I-V curves, P-V curves, and maximum-power outputs of the photovoltaic array. A cell-to-module-to-array mathematical model for photovoltaic panels is established in this paper, and a method defined as segmented iteration is adopted to solve the I-V curve expression and generate model I-V curves. The model I-V curves and P-V curves coincide well with measured data points. The RMSE between numerically calculated maximum-power outputs and experimentally measured ones is 0.2307%, while the MBE is 0.0183%. In addition, a multivariable non-linear regression equation is proposed to eliminate the difference between numerically calculated and measured maximum-power outputs over the range of high ambient temperature and irradiance at noon and in the early afternoon. In conclusion, the proposed method is reasonably simple and accurate.
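
The error metrics quoted in the abstract are straightforward to compute; the sketch below assumes the common definitions in which RMSE and MBE are normalized by the mean measured value to give percentages, with invented operating-temperature samples.

```python
import math

# RMSE and MBE between modeled and measured series, expressed as percentages
# of the mean measured value (an assumed normalization).
def rmse_mbe_percent(modeled, measured):
    n = len(measured)
    mean_meas = sum(measured) / n
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(modeled, measured)) / n)
    mbe = sum(m - o for m, o in zip(modeled, measured)) / n
    return 100 * rmse / mean_meas, 100 * mbe / mean_meas

# Hypothetical operating-temperature samples [deg C]: model vs. measurement.
rmse_pct, mbe_pct = rmse_mbe_percent([45.2, 50.1, 39.8], [45.0, 50.5, 40.0])
```

RMSE captures overall scatter, while the signed MBE reveals systematic over- or under-prediction, which is why the paper reports both.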

  2. Urban Modelling Performance of Next Generation SAR Missions

    Science.gov (United States)

    Sefercik, U. G.; Yastikli, N.; Atalay, C.

    2017-09-01

    In synthetic aperture radar (SAR) technology, urban mapping and modelling have become possible with the revolutionary TerraSAR-X (TSX) and Cosmo-SkyMed (CSK) missions since 2007. These satellites offer 1 m spatial resolution in high-resolution spotlight imaging mode and are capable of acquiring high-quality digital surface models (DSMs) of urban areas using interferometric SAR (InSAR) technology. With the advantage of acquisition independent of seasonal weather conditions, TSX and CSK DSMs are in high demand among scientific users. The performance of SAR DSMs is influenced by distortions such as layover, foreshortening, shadow, and double-bounce, which depend on the imaging geometry. In this study, the potential of DSMs derived from 1 m high-resolution spotlight (HS) InSAR pairs of CSK and TSX is validated by model-to-model absolute and relative accuracy estimation in an urban area. For the verification, an airborne laser scanning (ALS) DSM of the study area was used as the reference model. Results demonstrated that TSX and CSK urban DSMs are compatible in open, built-up, and forest landforms, with an absolute accuracy of 8-10 m. The relative accuracies, based on the coherence of neighbouring pixels, are superior to the absolute accuracies for both CSK and TSX.

  3. DoD’s Ability to Exploit the Environment in M&S

    Science.gov (United States)

    2008-03-01

    2008 Defense Modeling and Simulation Conference briefing on DoD's ability to exploit the environment in M&S. The recoverable fragments cover sensors, behavior models, and decision makers, and the rationale for replicating the environment: capturing its effect on system performance and human behavior.

  4. Modeling the performance of coated LPG tanks engulfed in fires.

    Science.gov (United States)

    Landucci, Gabriele; Molag, Menso; Cozzani, Valerio

    2009-12-15

    The improvement of passive fire protection of storage vessels is a key factor in enhancing safety along the LPG distribution chain. A thermal and mechanical model based on finite element simulations was developed to assess the behaviour of full-size tanks used for LPG storage and transportation in fire engulfment scenarios. The model was validated against experimental results. A specific analysis of the performance of four different reference coating materials was then carried out, also defining specific key performance indicators (KPIs) to assess design safety margins in near-miss simulations. The results confirmed the strong influence of coating application on the expected vessel time to failure due to fire engulfment. Clearly different performance among the alternative coating materials was evident. General correlations were developed between the vessel time to failure and the effective coating thickness in full engulfment scenarios, providing a preliminary assessment of the coating thickness required to prevent tank rupture for a given time lapse. The KPIs defined allowed assessment of the available safety margins in the reference scenarios analyzed and of the robustness of the thermal protection design.

  5. Uncertainty assessment in building energy performance with a simplified model

    Directory of Open Access Journals (Sweden)

    Titikpina Fally

    2015-01-01

    To assess a building's energy performance, the consumption predicted or estimated during the design stage is compared with the measured consumption once the building is operational. Many buildings show significant differences between calculated and measured consumption. To assess the performance accurately and ensure the thermal efficiency of the building, it is necessary to evaluate the uncertainties involved, not only in measurement but also those induced by the propagation of dynamic and static input data through the model being used. The evaluation of measurement uncertainty is based on knowledge about both the measurement process and the input quantities that influence the result of measurement. Measurement uncertainty can be evaluated within the framework of conventional statistics, as presented in the Guide to the Expression of Uncertainty in Measurement (GUM), or by Bayesian Statistical Theory (BST); another option is numerical methods such as Monte Carlo Simulation (MCS). In this paper, we propose to evaluate the uncertainty associated with the use of a simplified model for estimating the energy consumption of a given building. A detailed review and discussion of the three approaches (GUM, MCS, and BST) is given. An office building was monitored, with multiple temperature sensors mounted at candidate locations to obtain the required data. The monitored zone comprises six offices with an overall floor area of 102 m².
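    The MCS route can be sketched for a deliberately simplified consumption model. The steady-state transmission-loss formula and all input distributions below are illustrative assumptions, not the monitored building's data; they only show how input uncertainty propagates into an output uncertainty.

```python
import random
import statistics

def energy_model(u_value, area, t_in, t_out, hours):
    """Simplified steady-state transmission loss, in kWh."""
    return u_value * area * (t_in - t_out) * hours / 1000.0

random.seed(42)
samples = []
for _ in range(10_000):
    u = random.gauss(0.5, 0.05)     # envelope U-value, W/(m2.K) -- assumed
    t_in = random.gauss(21.0, 0.5)  # indoor temperature, degC -- assumed
    t_out = random.gauss(5.0, 1.0)  # outdoor temperature, degC -- assumed
    samples.append(energy_model(u, 102.0, t_in, t_out, 720.0))

mean_q = statistics.mean(samples)  # estimated monthly consumption
std_q = statistics.stdev(samples)  # its standard uncertainty
```

    The sample standard deviation of the output plays the role of the combined standard uncertainty that GUM would obtain analytically from sensitivity coefficients.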

  6. Modelling of LOCA Tests with the BISON Fuel Performance Code

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, Richard L [Idaho National Laboratory; Pastore, Giovanni [Idaho National Laboratory; Novascone, Stephen Rhead [Idaho National Laboratory; Spencer, Benjamin Whiting [Idaho National Laboratory; Hales, Jason Dean [Idaho National Laboratory

    2016-05-01

    BISON is a modern finite-element based, multidimensional nuclear fuel performance code that is under development at Idaho National Laboratory (USA). Recent advances of BISON include the extension of the code to the analysis of LWR fuel rod behaviour during loss-of-coolant accidents (LOCAs). In this work, BISON models for the phenomena relevant to LWR cladding behaviour during LOCAs are described, followed by presentation of code results for the simulation of LOCA tests. Analysed experiments include separate effects tests of cladding ballooning and burst, as well as the Halden IFA-650.2 fuel rod test. Two-dimensional modelling of the experiments is performed, and calculations are compared to available experimental data. Comparisons include cladding burst pressure and temperature in separate effects tests, as well as the evolution of fuel rod inner pressure during ballooning and time to cladding burst. Furthermore, BISON three-dimensional simulations of separate effects tests are performed, which demonstrate the capability to reproduce the effect of azimuthal temperature variations in the cladding. The work has been carried out in the frame of the collaboration between Idaho National Laboratory and Halden Reactor Project, and the IAEA Coordinated Research Project FUMAC.

  7. ICT evaluation models and performance of medium and small enterprises

    Directory of Open Access Journals (Sweden)

    Bayaga Anass

    2014-01-01

    Building on prior research related to (1) the impact of information communication technology (ICT) and (2) operational risk management (ORM) in the context of medium and small enterprises (MSEs), the focus of this study was to investigate the relationship between ICT operational risk management (ORM) and the performance of MSEs. To this end, the research evaluated models for understanding the value of ICT ORM in MSEs. Multiple regression, repeated-measures analysis of variance (RM-ANOVA), and repeated-measures multivariate analysis of variance (RM-MANOVA) were performed. The findings revealed that only one variable made a significant percentage contribution to the level of ICT operation in MSEs: the payback method (β = 0.410, p < .000). It may thus be inferred that the payback method is the prominent variable explaining the variation in evaluation models affecting ICT adoption within MSEs. In answering the two questions, (1) the degree of variability explained and (2) the predictors, the results revealed that the variable contributed approximately 88.4% of the variation in evaluation models affecting ICT adoption within MSEs. The analysis of variance also revealed that the regression coefficients were real and did not occur by chance.

  8. Acoustic/seismic signal propagation and sensor performance modeling

    Science.gov (United States)

    Wilson, D. Keith; Marlin, David H.; Mackay, Sean

    2007-04-01

    Performance, optimal employment, and interpretation of data from acoustic and seismic sensors depend strongly and in complex ways on the environment in which they operate. Software tools for guiding non-expert users of acoustic and seismic sensors are therefore much needed. However, such tools require that many individual components be constructed and correctly connected together. These components include the source signature and directionality, representation of the atmospheric and terrain environment, calculation of the signal propagation, characterization of the sensor response, and mimicking of the data processing at the sensor. Selection of an appropriate signal propagation model is particularly important, as there are significant trade-offs between output fidelity and computation speed. Attenuation of signal energy, random fading, and (for array systems) variations in wavefront angle-of-arrival should all be considered. Characterization of the complex operational environment is often the weak link in sensor modeling: important issues for acoustic and seismic modeling activities include the temporal/spatial resolution of the atmospheric data, knowledge of the surface and subsurface terrain properties, and representation of ambient background noise and vibrations. Design of software tools that address these challenges is illustrated with two examples: a detailed target-to-sensor calculation application called the Sensor Performance Evaluator for Battlefield Environments (SPEBE) and a GIS-embedded approach called Battlefield Terrain Reasoning and Awareness (BTRA).
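    The signal-propagation component can be illustrated at first order with spherical spreading plus linear atmospheric absorption; the absorption coefficient and levels below are hypothetical, and a real tool would add refraction, ground effects, and random fading.

```python
import math

def received_level(source_level_db, distance_m, absorption_db_per_m):
    """Received level (dB) after spherical spreading from a 1 m reference
    distance and linear atmospheric absorption."""
    spreading_loss = 20.0 * math.log10(distance_m)
    return source_level_db - spreading_loss - absorption_db_per_m * distance_m

# Each doubling of distance costs ~6 dB of spreading plus extra absorption:
rl_near = received_level(100.0, 10.0, 0.01)
rl_far = received_level(100.0, 100.0, 0.01)
```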

  9. PREMOR: a point reactor exposure model computer code for survey analysis of power plant performance

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.

    1979-10-01

    The PREMOR computer code was written to exploit a simple, two-group point nuclear reactor power plant model for survey analysis. Up to thirteen actinides, fourteen fission products, and one lumped absorber nuclide density are followed over a reactor history. Successive feed batches are accounted for, with provision for one to twenty resident batches. The effect of exposing each of the batches to the same neutron flux is determined.
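    The kind of bookkeeping such a point-exposure model performs can be sketched for a single absorber nuclide under constant flux (PREMOR itself follows coupled actinide and fission-product chains across batches); the cross section, flux, and density below are illustrative values only.

```python
import math

def depleted_density(n0, sigma_a_barns, flux, t_seconds):
    """Nuclide density after exposure to a constant neutron flux,
    solving dN/dt = -sigma_a * phi * N (no production terms)."""
    sigma_cm2 = sigma_a_barns * 1e-24  # barns -> cm^2
    return n0 * math.exp(-sigma_cm2 * flux * t_seconds)

# A batch resident for about one year in a 1e14 n/cm2/s flux:
n_after = depleted_density(1e21, 100.0, 1e14, 3.15e7)
```

    Batches resident for different lengths of time accumulate different exposures, which is why the code tracks each feed batch separately.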

  10. Mission Exploitation Platform PROBA-V

    Science.gov (United States)

    Goor, Erwin

    2016-04-01

    VITO and partners developed an end-to-end solution to drastically improve the exploitation of the PROBA-V EO-data archive (http://proba-v.vgt.vito.be/), the past mission SPOT-VEGETATION, and derived vegetation parameters by researchers, service providers and end-users. The analysis of time series of data (+1 PB) is addressed, as well as the large-scale on-demand processing of near-real-time data. From November 2015, an operational Mission Exploitation Platform (MEP) PROBA-V, as an ESA pathfinder project, will be gradually deployed at the VITO data center with direct access to the complete data archive. Several applications will be released to the users, e.g.: a time series viewer, showing the evolution of PROBA-V bands and derived vegetation parameters for any area of interest; full-resolution viewing services for the complete data archive; on-demand processing chains, e.g. for the calculation of N-daily composites; and a Virtual Machine with access to the data archive and tools to work with this data, e.g. various toolboxes and support for R and Python. After an initial release in January 2016, a research platform will gradually be deployed, allowing users to design, debug and test applications on the platform. From the MEP PROBA-V, access to Sentinel-2 and Landsat data will be addressed as well, e.g. to support the Cal/Val activities of the users. Users can make use of powerful web-based tools and can self-manage virtual machines to perform their work on the infrastructure at VITO, with access to the complete data archive. To realise this, private cloud technology (OpenStack) is used and a distributed processing environment is built based on Hadoop. The Hadoop ecosystem offers many technologies (Spark, Yarn, Accumulo, etc.) which we integrate with several open-source components. The impact of this MEP on the user community will be high and will completely change the way of working with the data, opening the large time series to a larger user community.

  11. Performance modeling of a wearable brain PET (BET) camera

    Science.gov (United States)

    Schmidtlein, C. R.; Turner, J. N.; Thompson, M. O.; Mandal, K. C.; Häggström, I.; Zhang, J.; Humm, J. L.; Feiglin, D. H.; Krol, A.

    2016-03-01

    Purpose: To explore, by means of analytical and Monte Carlo modeling, the performance of a novel lightweight and low-cost wearable helmet-shaped brain PET (BET) camera based on thin-film digital Geiger avalanche photodiodes (dGAPDs) with LSO and LaBr3 scintillators, for imaging in vivo human brain processes in freely moving and acting subjects responding to various stimuli in any environment. Methods: We performed analytical and Monte Carlo modeling of PET performance for a spherical-cap BET device and a cylindrical brain PET (CYL) device, both with 25 cm diameter and the same total mass of LSO scintillator. The total mass of LSO in both the BET and CYL systems is about 32 kg for a 25 mm thick scintillator and 13 kg for a 10 mm thick scintillator (assuming an LSO density of 7.3 g/ml). We also investigated a similar system using a LaBr3 scintillator, corresponding to 22 kg and 9 kg for the 25 mm and 10 mm thick systems (assuming a LaBr3 density of 5.08 g/ml). In addition, we considered a clinical whole-body (WB) LSO PET/CT scanner with 82 cm ring diameter and 15.8 cm axial length as a reference system. BET consisted of distributed Autonomous Detector Arrays (ADAs) integrated into Intelligent Autonomous Detector Blocks (IADBs). Each ADA comprised an array of small LYSO scintillator volumes (voxels). Results: The BET geometry achieved >50% better noise equivalent count (NEC) performance relative to the CYL geometry, and >1100% better performance than the WB geometry, for 25 mm thick LSO and LaBr3. For 10 mm thick LaBr3 equivalent-mass systems, LSO (7 mm thick) performed with ~40% higher NEC than LaBr3. Analytic and Monte Carlo simulations also showed that 1×1×3 mm scintillator crystals can achieve ~1.2 mm FWHM spatial resolution. Conclusions: This study shows that a spherical-cap brain PET system can provide improved NEC while preserving spatial resolution when compared to an equivalent dedicated cylindrical brain PET camera, and greatly improved PET performance relative to a conventional whole-body scanner.
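    The NEC comparisons above rest on the standard noise-equivalent-count-rate formula, which is compact enough to state directly (the count rates in the example are made up):

```python
def noise_equivalent_counts(trues, scatters, randoms, k=2.0):
    """NEC = T^2 / (T + S + k*R); k = 2 corresponds to delayed-window
    randoms subtraction, k = 1 to a noiseless randoms estimate."""
    return trues ** 2 / (trues + scatters + k * randoms)

nec = noise_equivalent_counts(100.0, 20.0, 40.0)  # illustrative rates
```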

  12. A Tool for Performance Modeling of Parallel Programs

    Directory of Open Access Journals (Sweden)

    J.A. González

    2003-01-01

    Current performance prediction analytical models try to characterize the performance behavior of actual machines through a small set of parameters. In practice, substantial deviations are observed. These differences are due to factors such as memory hierarchies or network latency. A natural approach is to associate a different proportionality constant with each basic block and, analogously, to associate different latencies and bandwidths with each "communication block". Unfortunately, this approach implies that the evaluation of parameters must be done for each algorithm. This is a heavy task, involving experiment design, timing, statistics, pattern recognition and multi-parameter fitting algorithms. Software support is required. We present a compiler that takes as source a C program annotated with complexity formulas and produces as output an instrumented code. The trace files obtained from executing the resulting code are analyzed with an interactive interpreter, giving us, among other information, the values of those parameters.
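    The multi-parameter fitting step can be sketched as ordinary least squares over per-block cost terms. The two-term model t(n) = c1*n + c2*n^2 below is a hypothetical stand-in for the tool's general machinery.

```python
def fit_two_term(ns, times):
    """Least-squares fit of t(n) = c1*n + c2*n**2 via the 2x2 normal
    equations -- the kind of per-block constant estimation the tool
    automates from instrumented runs."""
    s11 = sum(n ** 2 for n in ns)
    s12 = sum(n ** 3 for n in ns)
    s22 = sum(n ** 4 for n in ns)
    b1 = sum(n * t for n, t in zip(ns, times))
    b2 = sum(n ** 2 * t for n, t in zip(ns, times))
    det = s11 * s22 - s12 * s12
    return (b1 * s22 - b2 * s12) / det, (s11 * b2 - s12 * b1) / det

# Noiseless timings are recovered exactly:
ns = [1.0, 2.0, 3.0, 4.0, 5.0]
c1, c2 = fit_two_term(ns, [2.0 * n + 0.5 * n ** 2 for n in ns])
```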

  13. Frame-rate performance modeling of software MPEG decoder

    Science.gov (United States)

    Ramamoorthy, Victor

    1997-01-01

    A software MPEG decoder, though attractive in terms of performance and cost, opens up new technical challenges. The most critical questions are: when does a software decoder drop a frame, and how can its timing performance be predicted well ahead of implementation? These questions are not easy to answer without introducing a stochastic model of the decoding time. With a double-buffering scheme, fluctuations in decoding time can be smoothed out to a large extent; however, the dropping of frames cannot be totally eliminated. New ideas of slip and asymptotic synchronous locking are shown to answer critical design questions for a software decoder. Beneath the troubled world of frame droppings lies the beauty and harmony of our stochastic formulation.
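    A toy drop-counting simulation makes the framing concrete. The deadline rule below (drop and resynchronize when the decoder falls more than `buffers` periods behind the display clock) is a simplification for illustration, not the paper's slip model.

```python
def dropped_frames(decode_times, frame_period, buffers=2):
    """Count frames missing their display deadline under double buffering:
    frame i must finish decoding by (i + buffers) * frame_period."""
    finish = 0.0
    drops = 0
    for i, dt in enumerate(decode_times):
        finish += dt
        deadline = (i + buffers) * frame_period
        if finish > deadline:
            drops += 1
            finish = deadline  # resynchronize after the drop
    return drops

# Steady fast decoding never drops; one long decode costs one frame:
no_drops = dropped_frames([0.02] * 10, 1.0 / 30.0)
one_drop = dropped_frames([0.2, 0.02], 1.0 / 30.0)
```

    Drawing `decode_times` from a fitted distribution instead of fixed values turns this into a Monte Carlo estimate of the frame-drop rate.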

  14. LCP- LIFETIME COST AND PERFORMANCE MODEL FOR DISTRIBUTED PHOTOVOLTAIC SYSTEMS

    Science.gov (United States)

    Borden, C. S.

    1994-01-01

    The Lifetime Cost and Performance (LCP) Model was developed to assist in the assessment of photovoltaic (PV) system design options. LCP is a simulation of the performance, cost, and revenue streams associated with distributed PV power systems. LCP provides the user with substantial flexibility in specifying the technical and economic environment of the PV application. User-specified input parameters are available to describe PV system characteristics, site climatic conditions, utility purchase and sellback rate structures, discount and escalation rates, construction timing, and lifetime of the system. Such details as PV array orientation and tilt angle, PV module and balance-of-system performance attributes, and the mode of utility interconnection are user-specified. LCP assumes that the distributed PV system is utility grid interactive without dedicated electrical storage. In combination with a suitable economic model, LCP can provide an estimate of the expected net present worth of a PV system to the owner, as compared to electricity purchased from a utility grid. Similarly, LCP might be used to perform sensitivity analyses to identify those PV system parameters having significant impact on net worth. The user describes the PV system configuration to LCP via the basic electrical components. The module is the smallest entity in the PV system that is modeled. A PV module is defined in the simulation by its short-circuit current, which varies over the system lifetime due to degradation and failure. Modules are wired in series to form a branch circuit. Bypass diodes are allowed between modules in the branch circuits. Branch circuits are then connected in parallel to form a bus. A collection of buses is connected in parallel to form an increment to capacity of the system. By choosing the appropriate series-parallel wiring design, the user can specify the current, voltage, and reliability characteristics of the system. LCP simulation of system performance is site-specific.
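    The series-parallel wiring logic can be made concrete with a toy current calculation, representing each module by its short-circuit current and bypass diodes as ideal; this is a simplification of LCP's degradation-and-failure model.

```python
def array_current(branches, bypass=False):
    """Current of parallel branch circuits, each a series string of
    modules given by short-circuit current (amps). Without bypass diodes
    a string is limited by its weakest module; with ideal bypass diodes
    a failed (zero-current) module is simply skipped."""
    total = 0.0
    for modules in branches:
        if bypass:
            working = [i for i in modules if i > 0.0]
            branch_i = min(working) if working else 0.0
        else:
            branch_i = min(modules)
        total += branch_i
    return total

# Two parallel branches; the second contains a failed module:
branches = [[3.0, 2.5], [3.0, 0.0]]
i_plain = array_current(branches)
i_bypass = array_current(branches, bypass=True)
```

    With bypass diodes the failed module no longer blocks its whole string, which is why their placement matters for the reliability characteristics LCP lets the user specify.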

  15. Exploitative and Deceptive Resource Acquisition Strategies

    Directory of Open Access Journals (Sweden)

    Joshua J. Reynolds

    2015-07-01

    Life history strategy (LHS) and life history contingencies (LHCs) should theoretically influence the use of exploitative and deceptive resource acquisition strategies. However, little research has been done in this area. The purpose of the present work was to create measures of exploitative strategies and test the predictions of life history theory. Pilot studies developed and validated a behavioral measure of cheating called the Dot Game. The role of individual LHS and LHCs (manipulated via validated story primes) in cheating was investigated in Study 1. Studies 2a through 2c were conducted to develop and validate a self-report measure called the Exploitative and Deceptive Resource Acquisition Strategy Scale (EDRASS). Finally, Study 3 investigated life history and EDRASS. Results indicated that while LHS influences exploitative strategies, life history contingencies had little effect. Implications of these findings are discussed.

  16. Life History Theory and Exploitative Strategies

    Directory of Open Access Journals (Sweden)

    Joshua J. Reynolds

    2016-07-01

    Exploitative strategies involve depriving others of resources while enhancing one’s own. Life history theory suggests that there are individual differences (life history strategy) and environmental characteristics (life history contingencies [LHCs]) that influence the use of exploitative strategies. However, past work manipulating LHCs has found mixed evidence for the influence of this information on exploitative behavior. We present three studies that help clarify the effects of this type of information. Results indicated that younger individuals are most sensitive to LHC information. We also found, contrary to predictions, that communicating slow LHC information (i.e., high population density, intraspecific competition, and resource scarcity) increased rather than decreased the temptation to engage in exploitative behavior. Limitations and future directions are discussed.

  17. Exploiting Genome Structure in Association Analysis

    Science.gov (United States)

    Kim, Seyoung

    2014-01-01

    A genome-wide association study involves examining a large number of single-nucleotide polymorphisms (SNPs) to identify SNPs that are significantly associated with the given phenotype, while trying to reduce the false positive rate. Although haplotype-based association methods have been proposed to accommodate correlation information across nearby SNPs that are in linkage disequilibrium, none of these methods directly incorporated the structural information such as recombination events along chromosome. In this paper, we propose a new approach called stochastic block lasso for association mapping that exploits prior knowledge on linkage disequilibrium structure in the genome such as recombination rates and distances between adjacent SNPs in order to increase the power of detecting true associations while reducing false positives. Following a typical linear regression framework with the genotypes as inputs and the phenotype as output, our proposed method employs a sparsity-enforcing Laplacian prior for the regression coefficients, augmented by a first-order Markov process along the sequence of SNPs that incorporates the prior information on the linkage disequilibrium structure. The Markov-chain prior models the structural dependencies between a pair of adjacent SNPs, and allows us to look for association SNPs in a coupled manner, combining strength from multiple nearby SNPs. Our results on HapMap-simulated datasets and mouse datasets show that there is a significant advantage in incorporating the prior knowledge on linkage disequilibrium structure for marker identification under whole-genome association. PMID:21548809

  18. System Level Modelling and Performance Estimation of Embedded Systems

    DEFF Research Database (Denmark)

    Tranberg-Hansen, Anders Sejer

    ...does the design time and effort. This challenge is widely recognized throughout academia and the industry and, in order to address this, novel frameworks and methods, which will automate design steps as well as raise the level of abstraction used to design systems, are being called upon. To support ... is performed by having the framework produce detailed quantitative information about the system model under investigation. The project is part of the national Danish research project, Danish Network of Embedded Systems (DaNES), which is funded by the Danish National Advanced Technology Foundation. The project ...

  19. Proctors exploit three-dimensional ghost tools during clinical-like training scenarios: a preliminary study.

    Science.gov (United States)

    Jarc, Anthony M; Stanley, Andrew A; Clifford, Thomas; Gill, Inderbir S; Hung, Andrew J

    2017-06-01

    In this study, we examine three-dimensional (3D) proctoring tools (i.e., semitransparent ghost tools overlaid on the surgeon's field of view) on realistic surgical tasks, and develop novel, quantitative measures of whether proctors exploit the additional capabilities offered by ghost tools. Seven proctor-trainee pairs completed realistic surgical tasks, such as tissue dissection and suturing, in a live porcine model using 3D ghost tools on the da Vinci Xi Surgical System. The usability and effectiveness of 3D ghost tools were evaluated using objective measures of proctor performance based on proctor hand movements and button presses, as well as post-study questionnaires. Proctors exploited the capabilities of ghost tools, such as 3D hand movement; the average ratios of proctor hand movement range in the x-, y-, and z-directions were 57.6, 31.9, and 50.7, respectively. Proctors and trainees consistently evaluated the ghost tools as effective across multiple categories of mentoring, and trainees found ghost tools more helpful than proctors across all categories. In conclusion, proctors exploit the augmented capabilities of 3D ghost tools during clinical-like training scenarios. Additionally, both proctors and trainees evaluated ghost tools as effective mentoring tools, confirming previous studies on simple, inanimate tasks. Based on this preliminary work, advanced mentoring technologies such as 3D ghost tools stand to improve current telementoring and training technologies in robot-assisted minimally invasive surgery.

  20. Analytical and numerical performance models of a Heisenberg Vortex Tube

    Science.gov (United States)

    Bunge, C. D.; Cavender, K. A.; Matveev, K. I.; Leachman, J. W.

    2017-12-01

    Analytical and numerical investigations of a Heisenberg Vortex Tube (HVT) are performed to estimate the cooling potential with cryogenic hydrogen. The Ranque-Hilsch Vortex Tube (RHVT) is a device that tangentially injects a compressed fluid stream into a cylindrical geometry to promote enthalpy streaming and temperature separation between inner and outer flows. The HVT is the result of lining the inside of a RHVT with a hydrogen catalyst. This is the first concept to utilize the endothermic heat of para-orthohydrogen conversion to aid primary cooling. A review of 1st order vortex tube models available in the literature is presented and adapted to accommodate cryogenic hydrogen properties. These first order model predictions are compared with 2-D axisymmetric Computational Fluid Dynamics (CFD) simulations.

  1. BISON and MARMOT Development for Modeling Fast Reactor Fuel Performance

    Energy Technology Data Exchange (ETDEWEB)

    Gamble, Kyle Allan Lawrence [Idaho National Lab. (INL), Idaho Falls, ID (United States); Williamson, Richard L. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Schwen, Daniel [Idaho National Lab. (INL), Idaho Falls, ID (United States); Zhang, Yongfeng [Idaho National Lab. (INL), Idaho Falls, ID (United States); Novascone, Stephen Rhead [Idaho National Lab. (INL), Idaho Falls, ID (United States); Medvedev, Pavel G. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    BISON and MARMOT are two codes under development at the Idaho National Laboratory for engineering scale and lower length scale fuel performance modeling. It is desired to add capabilities for fast reactor applications to these codes. The fast reactor fuel types under consideration are metal (U-Pu-Zr) and oxide (MOX). The cladding types of interest include 316SS, D9, and HT9. The purpose of this report is to outline the proposed plans for code development and provide an overview of the models added to the BISON and MARMOT codes for fast reactor fuel behavior. A brief overview of preliminary discussions on the formation of a bilateral agreement between the Idaho National Laboratory and the National Nuclear Laboratory in the United Kingdom is presented.

  2. MRAC Revisited: Guaranteed Performance with Reference Model Modification

    Science.gov (United States)

    Stepanyan, Vahram; Krishnakumar, Kalmaje

    2010-01-01

    This paper presents a modification of the conventional model reference adaptive control (MRAC) architecture in order to achieve guaranteed transient performance in both the output and input signals of an uncertain system. The proposed modification is based on feeding the tracking error back to the reference model. It is shown that the approach guarantees tracking of a given command and of the ideal control signal (the one that would be designed if the system were known), not only asymptotically but also in transient, by a proper selection of the error feedback gain. The method prevents the generation of high-frequency oscillations that are unavoidable in conventional MRAC systems for large adaptation rates. The provided design guideline makes it possible to track a reference command of any magnitude from any initial position without re-tuning. The benefits of the method are demonstrated in simulations.

  3. Modified Neutral Models as Benchmarks to Evaluate the Dynamics of Land System (DLS Model Performance

    Directory of Open Access Journals (Sweden)

    Yingchang Xiu

    2017-07-01

    Assessing model performance is a continuous challenge for modelers of land use change. Comparing land use models with two neutral models that consider the initial land use patterns, the random constraint match model (RCM) and the growing cluster model (GrC), using a variety of evaluation metrics provides one way to evaluate the accuracy of land use models. However, using only two neutral models is not robust enough for reference maps. A modified neutral model that combines a density-based point pattern analysis and a null neutral model algorithm is introduced. In this case, the modified neutral model generates twenty different spatial pattern results using a random algorithm and a mid-point displacement algorithm, respectively. The random-algorithm-based modified neutral model (Random_MNM) results decrease regularly with the fragmentation degree from 0 to 1, while the mid-point-displacement-based modified neutral model (MPD_MNM) results decrease in a fluctuating manner with the fragmentation degree. Using the modified neutral model results as benchmarks, a newly proposed land use model, the Dynamics of Land System (DLS) model, for Jilin Province of China from 2003 to 2013 is assessed using the Kappa statistic and the Kappa_in-out statistic for simulation accuracy. The results show that the DLS model output presents higher Kappa and Kappa_in-out values than all twenty neutral model results. The map comparison results indicate that the DLS model simulates land use change more accurately than the Random_MNM and MPD_MNM. However, the amount and spatial allocation of land transitions for the DLS model are lower than the actual land use change. Improving the accuracy of the land use transition allocations in the DLS model requires further investigation.
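    The map-agreement measure used above can be sketched as Cohen's Kappa over flattened categorical maps; the location-specific Kappa_in-out variant used in the paper is not reproduced here.

```python
def kappa(map_a, map_b):
    """Cohen's Kappa between two categorical maps given as flat lists:
    observed cell-by-cell agreement corrected for chance agreement."""
    n = len(map_a)
    categories = set(map_a) | set(map_b)
    p_obs = sum(a == b for a, b in zip(map_a, map_b)) / n
    p_exp = sum((map_a.count(c) / n) * (map_b.count(c) / n)
                for c in categories)
    return (p_obs - p_exp) / (1.0 - p_exp)

k_same = kappa([1, 1, 2, 2], [1, 1, 2, 2])   # perfect agreement
k_opposite = kappa([1, 2], [2, 1])           # systematic disagreement
```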

  4. Liposomal cancer therapy: exploiting tumor characteristics

    DEFF Research Database (Denmark)

    Kaasgaard, Thomas; Andresen, Thomas Lars

    2010-01-01

    What the reader will gain: The review focuses on strategies that exploit characteristic features of solid tumors, such as abnormal vasculature, overexpression of receptors and enzymes, as well as acidic and thiolytic characteristics of the tumor microenvironment. Take home message: It is concluded that the design of new liposomal drug delivery systems that better exploit tumor characteristic features is likely to result in more efficacious cancer treatments.

  5. Exploiting the opportunities of Internet and multi-channel pricing : An exploratory research

    NARCIS (Netherlands)

    Sotgiu, Francesca; Ancarani, Fabio

    2004-01-01

    Smart firms are not worried about the impact of the Internet on pricing, but realise that they have the unique opportunity to exploit new options and improve their marketing performance. Multi-channel pricing is one of the most interesting opportunities firms can exploit in the digital economy.

  6. Strategic Alignment of Innovation to Business: Balancing the Exploration and Exploitation Function

    NARCIS (Netherlands)

    Fortuin, F.T.J.M.; Omta, S.W.F.

    2007-01-01

    This book addresses the crucial question for innovative prospector companies of how to bridge the gap between exploration and exploitation. Whereas exploration deals with the search for new ideas and opportunities, exploitation is about incrementally moving the performance bar a little bit higher.

  7. Optimal exploitation strategies for an animal population in a Markovian environment: A theory and an example

    Science.gov (United States)

    Anderson, D.R.

    1975-01-01

    Optimal exploitation strategies were studied for an animal population in a Markovian (stochastic, serially correlated) environment. This is a general case and encompasses a number of important special cases as simplifications. Extensive empirical data on the Mallard (Anas platyrhynchos) were used as an example of the general theory. The number of small ponds on the central breeding grounds was used as an index to the state of the environment. A general mathematical model was formulated to provide a synthesis of the existing literature, estimates of parameters developed from an analysis of data, and hypotheses regarding the specific effect of exploitation on total survival. The literature and analysis of data were inconclusive concerning the effect of exploitation on survival. Therefore, two hypotheses were explored: (1) exploitation mortality represents a largely additive form of mortality, and (2) exploitation mortality is compensatory with other forms of mortality, at least to some threshold level. Models incorporating these two hypotheses were formulated as stochastic dynamic programming models and optimal exploitation strategies were derived numerically on a digital computer. Optimal exploitation strategies were found to exist under rather general conditions. Direct feedback control was an integral component in the optimal decision-making process. Optimal exploitation was found to be substantially different depending upon the hypothesis regarding the effect of exploitation on the population. If we assume that exploitation is largely an additive force of mortality in Mallards, then optimal exploitation decisions are a convex function of the size of the breeding population and a linear or slightly concave function of the environmental conditions. Under the hypothesis of compensatory mortality forces, optimal exploitation decisions are approximately linearly related to the size of the Mallard breeding population. Dynamic programming is suggested as a very general
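
    The stochastic dynamic programming setup described above can be sketched with backward induction over a discretized state space. All numbers below (grids, transition probabilities, growth rates) are hypothetical, not the study's estimates; the point is the structure: a harvest decision is chosen for every (population, environment) state, with the environment following a two-state Markov chain (e.g. wet/dry pond conditions).

```python
import numpy as np

POP = np.linspace(0.1, 1.0, 10)      # discretized breeding-population index
HARV = np.linspace(0.0, 0.5, 6)      # candidate harvest fractions
P_ENV = np.array([[0.7, 0.3],        # wet -> (wet, dry) transition probs
                  [0.4, 0.6]])       # dry -> (wet, dry)
GROWTH = np.array([1.4, 1.1])        # reproduction multiplier per env state

def step(pop, h, env):
    # additive-mortality hypothesis: harvest removes a fraction h outright
    return min(pop * (1.0 - h) * GROWTH[env], POP[-1])

def solve(horizon=20):
    V = np.zeros((len(POP), 2))          # value of each (population, env) state
    policy = np.zeros_like(V)
    for _ in range(horizon):             # backward induction over the horizon
        V_new = np.empty_like(V)
        for i, p in enumerate(POP):
            for e in range(2):
                best_val, best_h = -1.0, 0.0
                for h in HARV:
                    j = np.abs(POP - step(p, h, e)).argmin()  # nearest grid node
                    val = h * p + P_ENV[e] @ V[j]             # harvest + expected future value
                    if val > best_val:
                        best_val, best_h = val, h
                V_new[i, e], policy[i, e] = best_val, best_h
        V = V_new
    return policy

print(solve().shape)   # one optimal harvest fraction per (population, env) state
```

    The feedback-control character of the solution is visible in the result: the policy is a lookup table indexed by the observed state, not a fixed harvest schedule.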

  8. Predicting optimum vortex tube performance using a simplified CFD model

    Energy Technology Data Exchange (ETDEWEB)

    Karimi-Esfahani, M; Fartaj, A.; Rankin, G.W. [Univ. of Windsor, Dept. of Mechanical, Automotive and Materials Engineering, Windsor, Ontario (Canada)]. E-mail: mki_60@hotmail.com

    2004-07-01

    The Ranque-Hilsch tube is a particular type of vortex tube device. The flow enters the device tangentially near one end and exits from the open ends of the tube. The inlet air is of a uniform temperature throughout while the outputs are of different temperatures. One outlet is hotter and the other is colder than the inlet air. This device has no moving parts and does not require any additional power for its operation other than that supplied to the device to compress the inlet air. It has, however, not been widely used, mainly because of its low efficiency. In this paper, a simplified 2-dimensional computational fluid dynamics model for the flow in the vortex tube is developed using FLUENT. This model makes use of the assumption of axial symmetry throughout the entire flow domain. Compared to a three-dimensional computational solution, the simplified model requires significantly less computational time. This is important because the model is to be used for an optimization study. A user-defined function is generated to implement a modified version of the k-epsilon model to account for turbulence. This model is validated by comparing a particular solution with available experimental data. The variations of cold temperature drop and efficiency of the device with orifice diameter, inlet pressure, and cold mass flow ratio qualitatively agree with experimental results. The variation of these performance indices with tube length did not agree with the experiments for small tube lengths; it did, however, agree qualitatively for large values. (author)

  9. Performance Analysis of Different NeQuick Ionospheric Model Parameters

    Directory of Open Access Journals (Sweden)

    WANG Ningbo

    2017-04-01

    Full Text Available Galileo adopts the NeQuick model for single-frequency ionospheric delay corrections. For the standard operation of Galileo, the NeQuick model is driven by the effective ionization level parameter Az instead of the solar activity level index, and the three broadcast ionospheric coefficients are determined by a second-degree polynomial fitted to the Az values estimated from globally distributed Galileo Sensor Stations (GSS). In this study, the processing strategies for the estimation of NeQuick ionospheric coefficients are discussed and the characteristics of the NeQuick coefficients are also analyzed. The accuracy of the Global Positioning System (GPS) broadcast Klobuchar, original NeQuick2, and fitted NeQuickC as well as Galileo broadcast NeQuickG models is evaluated over the continental and oceanic regions, respectively, in comparison with the ionospheric total electron content (TEC) provided by global ionospheric maps (GIM), GPS test stations, and the JASON-2 altimeter. The results show that NeQuickG can mitigate ionospheric delay by 54.2%~65.8% on a global scale, and NeQuickC can correct for 71.1%~74.2% of the ionospheric delay. NeQuick2 performs at the same level as NeQuickG, both a bit better than the GPS broadcast Klobuchar model.
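
    The second-degree polynomial fit behind the three broadcast coefficients can be sketched as an ordinary least-squares problem. Note the assumptions: the station modip values and Az estimates below are synthetic stand-ins, not real GSS data, and the fit variable (modified dip latitude) is the usual choice for this model but is hypothetical here.

```python
import numpy as np

# Fit Az(mu) = a0 + a1*mu + a2*mu**2, where mu is each station's modified
# dip latitude (modip) and Az its estimated effective ionization level.
mu = np.array([-60.0, -30.0, 0.0, 30.0, 60.0])   # synthetic station modip (deg)
az = 80.0 + 0.5 * mu + 0.02 * mu**2              # synthetic estimated Az values

a2, a1, a0 = np.polyfit(mu, az, 2)               # least-squares quadratic fit
print(round(a0, 2), round(a1, 2), round(a2, 3))  # -> 80.0 0.5 0.02
```

    The three recovered coefficients (a0, a1, a2) play the role of the broadcast ionospheric coefficients; a receiver would evaluate the polynomial at its own modip to obtain the Az value driving NeQuick.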

  10. Linkage rules for plant-pollinator networks: trait complementarity or exploitation barriers?

    Directory of Open Access Journals (Sweden)

    Luis Santamaría

    2007-02-01

    Full Text Available Recent attempts to examine the biological processes responsible for the general characteristics of mutualistic networks focus on two types of explanations: nonmatching biological attributes of species that prevent the occurrence of certain interactions ("forbidden links"), arising from trait complementarity in mutualistic networks (as compared to barriers to exploitation in antagonistic ones), and random interactions among individuals that are proportional to their abundances in the observed community (the "neutrality hypothesis"). We explored the consequences that simple linkage rules based on the first two hypotheses (complementarity of traits versus barriers to exploitation) had on the topology of plant-pollination networks. Independent of the linkage rules used, the inclusion of a small set of traits (two to four) sufficed to account for the complex topological patterns observed in real-world networks. Optimal performance was achieved by a "mixed model" that combined rules linking plants and pollinators whose trait ranges overlap ("complementarity models") and rules linking pollinators to flowers whose traits are below a pollinator-specific barrier value ("barrier models"). Deterrence of floral parasites (barrier model) is therefore at least as important as increasing pollination efficiency (complementarity model) in the evolutionary shaping of plant-pollinator networks.

  11. GEO Supersites Data Exploitation Platform

    Science.gov (United States)

    Lengert, W.; Popp, H.-J.; Gleyzes, J.-P.

    2012-04-01

    In the framework of the GEO Geohazard Supersite initiative, an international partnership of organizations and scientists involved in the monitoring and assessment of geohazards has been established. The mission is to advance the scientific understanding of geohazards by improving geohazard monitoring through the combination of in-situ and space-based data, and by facilitating access to data relevant for geohazard research. The stakeholders are: (1) governmental organizations or research institutions responsible for the ground-based monitoring of earthquake and volcanic areas, (2) space agencies and satellite operators providing satellite data, and (3) the global geohazard scientific community. Tens of thousands of ESA SAR products have been accessible since the beginning of 2008 through ESA's "Virtual Archive", a cloud computing asset that gives the global community very high download performance for these high-volume data sets at mass-market cost. In the GEO collaborative context, the management of ESA's "Virtual Archive" and the ordering of these large data sets are performed by UNAVCO, which also coordinates the data demand of several hundred co-PIs. ESA envisages giving scientists and developers access to a highly elastic operational e-infrastructure, providing interdisciplinary data on a large scale as well as tools ensuring innovation and a permanent evolution of the products. Consequently, this science environment will help in defining and testing new applications and technologies, fostering innovation and new science findings. In Europe, the collaboration between EPOS, the "European Plate Observatory System" led by INGV, and ESA, with support from DLR, ASI, and CNES, brings together the main institutional stakeholders for the GEO Supersites, also contributing to a unifying e-infrastructure.
The overarching objective of the Geohazard Supersites is: "To implement a sustainable Global Earthquake Observation System and a Global Volcano Observation System as part of the

  12. Postural Hand Synergies during Environmental Constraint Exploitation

    Directory of Open Access Journals (Sweden)

    Cosimo Della Santina

    2017-08-01

    Full Text Available Humans are able to intuitively exploit the shape of an object and environmental constraints to achieve stable grasps and perform dexterous manipulations. In doing so, a vast range of kinematic strategies can be observed. However, in this work we formulate the hypothesis that this ability can be described in terms of a synergistic behavior in the generation of hand postures, i.e., using a reduced set of commonly used kinematic patterns. This is in analogy with previous studies showing the presence of such behavior in different tasks, such as grasping. We investigated this hypothesis in experiments performed by six subjects, who were asked to grasp objects from a flat surface. We quantitatively characterized hand posture behavior from a kinematic perspective, i.e., the hand joint angles, both in pre-shaping and during the interaction with the environment. To determine the role of tactile feedback, we repeated the same experiments with subjects wearing a rigid shell on the fingertips to reduce cutaneous afferent inputs. Results show the persistence of at least two postural synergies in all the considered experimental conditions and phases. Tactile impairment does not significantly alter the first two synergies, and contact with the environment generates a change only in higher-order principal components. A good match also arises between the first synergy found in our analysis and the first synergy of grasping as quantified by previous work. The present study is motivated by the interest of learning from the human example, extracting lessons that can be applied in robot design and control. Thus, we conclude with a discussion on the implications of our findings for robotics.
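
    The synergy extraction described above amounts to principal component analysis of recorded joint angles. The sketch below uses synthetic data (a toy 5-joint hand driven by one dominant coordination pattern plus noise) rather than the study's recordings, to show how a dominant synergy shows up as the first principal component.

```python
import numpy as np

rng = np.random.default_rng(0)
synergy = np.array([1.0, 0.8, 0.6, 0.4, 0.2])      # dominant kinematic pattern
weights = rng.normal(size=(100, 1))                # activation per grasp trial
postures = weights * synergy + 0.05 * rng.normal(size=(100, 5))

X = postures - postures.mean(axis=0)               # center the joint angles
_, s, Vt = np.linalg.svd(X, full_matrices=False)   # PCA via SVD
explained = s ** 2 / np.sum(s ** 2)                # variance fraction per PC

# Vt[0] estimates the synergy direction; explained[0] its share of variance.
print(explained[0] > 0.9)   # the first synergy dominates in this toy data
```

    In the study's terms, "persistence of at least two postural synergies" corresponds to the first two rows of Vt explaining most of the variance across all experimental conditions.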

  13. Inconsistent Strategies to Spin up Models in CMIP5: Implications for Ocean Biogeochemical Model Performance Assessment

    Science.gov (United States)

    Seferian, Roland; Gehlen, Marion; Bopp, Laurent; Resplandy, Laure; Orr, James C.; Marti, Olivier; Dunne, John P.; Christian, James R.; Doney, Scott C.; Ilyina, Tatiana

    2015-01-01

    During the fifth phase of the Coupled Model Intercomparison Project (CMIP5) substantial efforts were made to systematically assess the skill of Earth system models. One goal was to check how realistically representative marine biogeochemical tracer distributions could be reproduced by models. In routine assessments model historical hindcasts were compared with available modern biogeochemical observations. However, these assessments considered neither how close modeled biogeochemical reservoirs were to equilibrium nor the sensitivity of model performance to initial conditions or to the spin-up protocols. Here, we explore how the large diversity in spin-up protocols used for marine biogeochemistry in CMIP5 Earth system models (ESMs) contributes to model-to-model differences in the simulated fields. We take advantage of a 500-year spin-up simulation of IPSL-CM5A-LR to quantify the influence of the spin-up protocol on model ability to reproduce relevant data fields. Amplification of biases in selected biogeochemical fields (O2, NO3, Alk-DIC) is assessed as a function of spin-up duration. We demonstrate that a relationship between spin-up duration and assessment metrics emerges from our model results and holds when confronted with a larger ensemble of CMIP5 models. This shows that drift has implications for performance assessment in addition to possibly aliasing estimates of climate change impact. Our study suggests that differences in spin-up protocols could explain a substantial part of model disparities, constituting a source of model-to-model uncertainty. This requires more attention in future model intercomparison exercises in order to provide quantitatively more correct ESM results on marine biogeochemistry and carbon cycle feedbacks.

  14. Inconsistent strategies to spin up models in CMIP5: implications for ocean biogeochemical model performance assessment

    Science.gov (United States)

    Seferian, R.; Gehlen, M.; Bopp, L.; Resplandy, L.; Orr, J. C.; Marti, O.

    2016-12-01

    During the fifth phase of the Coupled Model Intercomparison Project (CMIP5) substantial efforts were made to systematically assess the skill of Earth system models against available modern observations. However, most of these skill-assessment approaches can be considered "blind" given that they were applied without considering models' specific characteristics and treat models a priori as independent of observations. Indeed, since these models are typically initialized from observations, the spin-up procedure (e.g. the length of time for which the model has been run since initialization, and therefore the degree to which it has approached its own equilibrium) has the potential to exert a significant control over the skill-assessment metrics calculated for each model. Here, we explore how the large diversity in spin-up protocols used for marine biogeochemistry in CMIP5 Earth system models (ESMs) contributes to model-to-model differences in the simulated fields. We focus on the amplification of biases in selected biogeochemical fields (O2, NO3, Alk-DIC) as a function of spin-up duration in a dedicated 500-year-long spin-up simulation performed with IPSL-CM5A-LR as well as an ensemble of 24 CMIP5 ESMs. We demonstrate that a relationship between spin-up duration and skill-assessment metrics emerges from the results of a single model and holds when confronted with a larger ensemble of CMIP5 models. This shows that drift in biogeochemical fields has implications for performance assessment, in addition to possibly influencing estimates of climate change impact. Our study suggests that differences in spin-up protocols could explain a substantial part of model disparities, constituting a source of model-to-model uncertainty. This requires more attention in future model intercomparison exercises in order to provide quantitatively more correct ESM results on marine biogeochemistry and carbon cycle feedbacks.

  15. Effects of simulation language and modeling methodology on simulation modeling performance

    Energy Technology Data Exchange (ETDEWEB)

    Wang, T.J.

    1987-01-01

    Research in simulation modeling has made little advance over the past two decades. Many simulation languages and modeling methodologies were designed but not evaluated. Model developers were given no criteria for selecting from among these modeling tools. A framework of research in simulation modeling was developed to identify factors that might most affect simulation modeling performance. First, two simulation languages (MAGIE and GPSS) that differ greatly in complexity were compared. Both languages are similar in their design philosophy. However, MAGIE is a small simulation language with ten model building blocks while GPSS is a large simulation language with fifty-six model building blocks. Secondly, two modeling methodologies, namely the top-down and the bottom-up approaches, were compared. This research shows that it is feasible to apply the user-based empirical research methodology to study simulation modeling. It is also concluded that modeling with a large simulation language does not necessarily yield better results than modeling with a small simulation language. Furthermore, it was found that using the top-down modeling approach does not necessarily yield better results than using the bottom-up modeling approach.

  16. Development of the integrated environmental control model: Performance model for the NOXSO process. Quarterly progress report

    Energy Technology Data Exchange (ETDEWEB)

    Kalagnanam, J.R.; Rubin, E.S.

    1995-04-01

    In its current configuration, the IECM provides a capability to model various conventional and advanced processes for controlling air pollutant emissions from coal-fired power plants before, during, or after combustion. The principal purpose of the model is to calculate the performance, emissions, and cost of power plant configurations employing alternative environmental control methods. The model consists of various control technology modules, which may be integrated into a complete utility plant in any desired combination. In contrast to conventional deterministic models, the IECM offers the unique capability to assign probabilistic values to all model input parameters, and to obtain probabilistic outputs in the form of cumulative distribution functions indicating the likelihood of different costs and performance results. The most recent version of the IECM, implemented on a Macintosh II computer, was delivered to DOE/PETC at the end of the last contract in May 1991. The current contract will continue the model development effort to provide DOE/PETC with improved model capabilities, including new software developments to facilitate model use and new technical capabilities for analysis of environmental control technologies. Integrated environmental control systems involving pre-combustion, combustion, and post-combustion control methods will be considered. Phase I involves developing the existing modules of the IECM. Phase II deals with creating new technology modules, linking the IECM with PETC databases, and training PETC personnel on the use of the updated models. The present report summarizes recent progress on the Phase I effort during the period January 1 - March 31, 1995. A preliminary summary is given of the new performance model developed for the NOXSO process. The performance model is developed from first principles and parametrized based on experimental data from pilot plants.

  17. THE PENA BLANCA NATURAL ANALOGUE PERFORMANCE ASSESSMENT MODEL

    Energy Technology Data Exchange (ETDEWEB)

    G. Saulnier and W. Statham

    2006-04-16

    The Nopal I uranium mine in the Sierra Pena Blanca, Chihuahua, Mexico serves as a natural analogue to the Yucca Mountain repository. The Pena Blanca Natural Analogue Performance Assessment Model simulates the mobilization and transport of radionuclides that are released from the mine and transported to the saturated zone. The Pena Blanca Natural Analogue Performance Assessment Model uses probabilistic simulations of hydrogeologic processes that are analogous to the processes that occur at the Yucca Mountain site. The Nopal I uranium deposit lies in fractured, welded, and altered rhyolitic ash-flow tuffs that overlie carbonate rocks, a setting analogous to the geologic formations at the Yucca Mountain site. The Nopal I mine site has the following characteristics analogous to the Yucca Mountain repository site: (1) Analogous source--UO{sub 2} uranium ore deposit = spent nuclear fuel in the repository; (2) Analogous geology--fractured, welded, and altered rhyolitic ash-flow tuffs; (3) Analogous climate--semiarid to arid; (4) Analogous setting--volcanic tuffs overlie carbonate rocks; (5) Analogous geochemistry--oxidizing conditions; and (6) Analogous hydrogeology--the ore deposit lies in the unsaturated zone above the water table.

  18. Modelling of green roofs' hydrologic performance using EPA's SWMM.

    Science.gov (United States)

    Burszta-Adamiak, E; Mrowiec, M

    2013-01-01

    Green roofs significantly affect the increase in water retention and thus the management of rain water in urban areas. In Poland, as in many other European countries, excess rainwater resulting from snowmelt and heavy rainfall contributes to the development of local flooding in urban areas. Opportunities to reduce surface runoff and reduce flood risks are among the reasons why green roofs are increasingly likely to be used in this country as well. However, there are relatively few data on their in situ performance. In this study the storm water performance was simulated for experimental green roof plots using the Storm Water Management Model (SWMM) with the Low Impact Development (LID) Controls module (version 5.0.022). The model includes many parameters for each layer of a green roof, but the simulation results were unsatisfactory with respect to the hydrologic response of the green roofs. For the majority of the tested rain events, the Nash coefficient had negative values, indicating a weak fit between observed and simulated flow rates. The complexity of the LID module therefore does not translate into improved accuracy. Further research at a technical scale is needed to determine the role of the green roof slope, vegetation cover and drying process during the inter-event periods.
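
    The "Nash coefficient" referenced above is the Nash-Sutcliffe efficiency (NSE): values near 1 indicate a good fit, while values at or below 0, as reported for most events here, mean the model predicts runoff no better than simply using the mean of the observations. A minimal sketch, with invented runoff series:

```python
import numpy as np

def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SS_residual / SS_total."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = [0.0, 1.0, 4.0, 2.0, 0.5]    # observed runoff (arbitrary units)
good = [0.1, 0.9, 3.8, 2.2, 0.6]   # close simulation -> NSE near 1
bad = [2.0, 2.0, 2.0, 0.0, 3.0]    # poor simulation -> negative NSE
print(round(nse(obs, good), 2), round(nse(obs, bad), 2))
```

    Because the denominator is the variance of the observations, NSE is unbounded below: a sufficiently poor simulation can score arbitrarily negative, which is what the study's green roof runs exhibit.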

  19. Graphical User Interface for Simulink Integrated Performance Analysis Model

    Science.gov (United States)

    Durham, R. Caitlyn

    2009-01-01

    The J-2X Engine (built by Pratt & Whitney Rocketdyne), in the Upper Stage of the Ares I Crew Launch Vehicle, will only start within a certain range of temperature and pressure for Liquid Hydrogen and Liquid Oxygen propellants. The purpose of the Simulink Integrated Performance Analysis Model is to verify that in all reasonable conditions the temperature and pressure of the propellants are within the required J-2X engine start boxes. In order to run the simulation, test variables must be entered at all reasonable values of parameters such as heat leak and mass flow rate. To make this testing process as efficient as possible, in order to save the maximum amount of time and money, and to show that the J-2X engine will start when it is required to do so, a graphical user interface (GUI) was created to allow the input of values to be used as parameters in the Simulink Model, without opening or altering the contents of the model. The GUI must allow test data to come from Microsoft Excel files, allow those values to be edited before testing, place those values into the Simulink Model, and get the output from the Simulink Model. The GUI was built using MATLAB, and will run the Simulink simulation when the Simulate option is activated. After running the simulation, the GUI constructs a new Microsoft Excel file, as well as a MATLAB matrix file, using the output values for each test of the simulation so that they may be graphed and compared to other values.

  20. Direct-Steam Linear Fresnel Performance Model for NREL's System Advisor Model

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, M. J.; Zhu, G.

    2012-09-01

    This paper presents the technical formulation and demonstrated model performance results of a new direct-steam-generation (DSG) model in NREL's System Advisor Model (SAM). The model predicts the annual electricity production of a wide range of system configurations within the DSG Linear Fresnel technology by modeling the hourly performance of the plant in detail. The quasi-steady-state formulation allows users to investigate energy and mass flows, operating temperatures, and pressure drops for geometries and solar field configurations of interest. The model includes tools for heat loss calculation using either empirical polynomial heat loss curves as a function of steam temperature, ambient temperature, and wind velocity, or a detailed evacuated tube receiver heat loss model. Thermal losses are evaluated using a computationally efficient nodal approach, in which the solar field and headers are discretized into multiple nodes and heat losses, thermal inertia, and steam conditions (including pressure, temperature, and enthalpy) are individually evaluated at each node during each time step of the simulation. This paper discusses the mathematical formulation for the solar field model and describes how the solar field is integrated with the other subsystem models, including the power cycle and optional auxiliary fossil system. Model results are also presented to demonstrate plant behavior in the various operating modes.
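
    The nodal heat-loss evaluation described above can be sketched as follows. The geometry, temperature profile, and polynomial loss coefficients below are hypothetical placeholders, not SAM's actual curves; the structure (per-node local temperature, empirical loss polynomial, summation over nodes) is the point.

```python
def heat_loss_w_per_m(dT):
    # hypothetical empirical loss curve: a0 + a1*dT + a2*dT**2  [W/m],
    # with dT the local fluid-minus-ambient temperature difference
    return 15.0 + 0.7 * dT + 0.002 * dT ** 2

def field_losses(t_in, t_out, t_amb, length_m, n_nodes=20):
    """Sum nodal heat losses along the receiver line [W]."""
    dx = length_m / n_nodes
    total = 0.0
    for i in range(n_nodes):
        # assume a linear temperature rise along the line for this sketch
        t_node = t_in + (t_out - t_in) * (i + 0.5) / n_nodes
        total += heat_loss_w_per_m(t_node - t_amb) * dx
    return total

# total field heat loss in W for this toy configuration
print(round(field_losses(t_in=300.0, t_out=500.0, t_amb=25.0, length_m=1000.0)))
```

    Because the loss curve is nonlinear in temperature, summing per-node losses gives a different (and more accurate) answer than evaluating the curve once at the mean field temperature, which is the motivation for the nodal discretization.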