WorldWideScience

Sample records for computing resource estimate

  1. Geological control in computer-based resource estimation

    International Nuclear Information System (INIS)

    Cram, A.A.

    1992-01-01

    The economic assessment of mineral deposits potentially involves a number of phases, from initial assessment through detailed orebody evaluation and subsequently to estimation of bench or stope grades in an operational mine. The nature and quantity of information obviously varies significantly from one extreme to the other. This paper reports on geological interpretation, which plays a major part during the early phases of assessment; it logically follows that computerized orebody assessment techniques must be able to utilize this information effectively. Even in an operational mine the importance of geological control varies according to the type of orebody. The method of modelling coal seams is distinctly different from the techniques used for a massive sulphide deposit. For coal seams, significant reliance is placed on the correlation of seam intercepts from holes which are widely spaced in relation to the seam (i.e. orebody) thickness. The use of geological interpretation in this case is justified by experience gained in the past, compared to the cost of relying only on samples from a very dense drill hole pattern.

  2. Robust Wave Resource Estimation

    DEFF Research Database (Denmark)

    Lavelle, John; Kofoed, Jens Peter

    2013-01-01

    …density estimates of the PDF as a function both of Hm0 and Tp, and of Hm0 and T0,2, together with the mean wave power per unit crest length, Pw, as a function of Hm0 and T0,2. The wave elevation parameters, from which the wave parameters are calculated, are filtered to correct or remove spurious data. An overview is given of the methods used to do this, and a method for identifying outliers of the wave elevation data, based on the joint distribution of wave elevations and accelerations, is presented. The limitations of using a JONSWAP spectrum to model the measured wave spectra as a function of Hm0 and T0,2 or Hm0 and Tp for the Hanstholm site data are demonstrated. As an alternative, the non-parametric loess method, which does not rely on any assumptions about the shape of the wave elevation spectra, is used to accurately estimate Pw as a function of Hm0 and T0,2.
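
    For orientation, the mean wave power per unit crest length Pw referred to above is commonly computed with the deep-water approximation below. This is a standard textbook relation, not necessarily the exact formulation used in the paper, and the factor relating the energy period Te to T0,2 depends on the assumed spectral shape.

        P_w \;=\; \frac{\rho g^{2}}{64\pi}\, H_{m0}^{2}\, T_e,
        \qquad T_e \approx \alpha\, T_{0,2} \quad (\text{spectrum-dependent } \alpha)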

  3. Estimation of potential uranium resources

    International Nuclear Information System (INIS)

    Curry, D.L.

    1977-09-01

    Potential estimates, like reserves, are limited by the information on hand at the time and are not intended to indicate the ultimate resources. Potential estimates are based on geologic judgement, so their reliability is dependent on the quality and extent of geologic knowledge. Reliability differs for each of the three potential resource classes. It is greatest for probable potential resources because of the greater knowledge base resulting from the advanced stage of exploration and development in established producing districts where most of the resources in this class are located. Reliability is least for speculative potential resources because no significant deposits are known, and favorability is inferred from limited geologic data. Estimates of potential resources are revised as new geologic concepts are postulated, as new types of uranium ore bodies are discovered, and as improved geophysical and geochemical techniques are developed and applied. Advances in technology that permit the exploitation of deep or low-grade deposits, or the processing of ores of previously uneconomic metallurgical types, also will affect the estimates.

  4. Aggregated Computational Toxicology Resource (ACTOR)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Aggregated Computational Toxicology Resource (ACTOR) is a database on environmental chemicals that is searchable by chemical name and other identifiers, and by...

  5. Aggregated Computational Toxicology Online Resource

    Data.gov (United States)

    U.S. Environmental Protection Agency — Aggregated Computational Toxicology Online Resource (ACToR) is EPA's online aggregator of all the public sources of chemical toxicity data. ACToR aggregates data...

  6. Framework Resources Multiply Computing Power

    Science.gov (United States)

    2010-01-01

    As an early proponent of grid computing, Ames Research Center awarded Small Business Innovation Research (SBIR) funding to 3DGeo Development Inc., of Santa Clara, California, (now FusionGeo Inc., of The Woodlands, Texas) to demonstrate a virtual computer environment that linked geographically dispersed computer systems over the Internet to help solve large computational problems. By adding to an existing product, FusionGeo enabled access to resources for calculation- or data-intensive applications whenever and wherever they were needed. Commercially available as Accelerated Imaging and Modeling, the product is used by oil companies and seismic service companies, which require large processing and data storage capacities.

  7. Computer Resources | College of Engineering & Applied Science

    Science.gov (United States)


  8. SOCR: Statistics Online Computational Resource

    Directory of Open Access Journals (Sweden)

    Ivo D. Dinov

    2006-10-01

    Full Text Available The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning.

  9. Computer-Aided Parts Estimation

    OpenAIRE

    Cunningham, Adam; Smart, Robert

    1993-01-01

    In 1991, Ford Motor Company began deployment of CAPE (computer-aided parts estimating system), a highly advanced knowledge-based system designed to generate, evaluate, and cost automotive part manufacturing plans. CAPE is engineered on an innovative, extensible, declarative process-planning and estimating knowledge representation language, which underpins the CAPE kernel architecture. Many manufacturing processes have been modeled to date, but eventually every significant process in motor veh...

  10. Resource Management in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Andrei IONESCU

    2015-01-01

    Full Text Available Mobile cloud computing is a major research topic in Information Technology & Communications. It integrates cloud computing, mobile computing and wireless networks. While mainly built on cloud computing, it has to operate using more heterogeneous resources, with implications on how these resources are managed and used. Managing the resources of a mobile cloud is not a trivial task, as it involves vastly different architectures and lies outside the scope of human users. Using these resources from applications at both the platform and software tiers comes with its own challenges. This paper presents different approaches in use for managing cloud resources at infrastructure and platform levels.

  11. Statistics Online Computational Resource for Education

    Science.gov (United States)

    Dinov, Ivo D.; Christou, Nicolas

    2009-01-01

    The Statistics Online Computational Resource (http://www.SOCR.ucla.edu) provides one of the largest collections of free Internet-based resources for probability and statistics education. SOCR develops, validates and disseminates two core types of materials--instructional resources and computational libraries. (Contains 2 figures.)

  12. Methodology for uranium resource estimates and reliability

    International Nuclear Information System (INIS)

    Blanchfield, D.M.

    1980-01-01

    The NURE uranium assessment method has evolved from a small group of geologists estimating resources on a few lease blocks, to a national survey involving an interdisciplinary system consisting of the following: (1) geology and geologic analogs; (2) engineering and cost modeling; (3) mathematics and probability theory, psychology and elicitation of subjective judgments; and (4) computerized calculations, computer graphics, and data base management. The evolution has been spurred primarily by two objectives: (1) quantification of uncertainty, and (2) elimination of simplifying assumptions. This has resulted in a tremendous data-gathering effort and the involvement of hundreds of technical experts, many in uranium geology, but many from other fields as well. The rationality of the methods is still largely based on the concept of an analog and the observation that the results are reasonable. The reliability, or repeatability, of the assessments is reasonably guaranteed by the series of peer and superior technical reviews which has been formalized under the current methodology. The optimism or pessimism of individual geologists who make the initial assessments is tempered by the review process, resulting in a series of assessments which are a consistent, unbiased reflection of the facts. Despite the many improvements over past methods, several objectives for future development remain, primarily to reduce subjectivity in utilizing factual information in the estimation of endowment, and to improve the recognition of cost uncertainties in the assessment of economic potential. The 1980 NURE assessment methodology will undoubtedly be improved, but the reader is reminded that resource estimates are and always will be a forecast for the future.

  13. Resource allocation in grid computing

    NARCIS (Netherlands)

    Koole, Ger; Righter, Rhonda

    2007-01-01

    Grid computing, in which a network of computers is integrated to create a very fast virtual computer, is becoming ever more prevalent. Examples include the TeraGrid and Planet-lab.org, as well as applications on the existing Internet that take advantage of unused computing and storage capacity of

  14. Enabling opportunistic resources for CMS Computing Operations

    Energy Technology Data Exchange (ETDEWEB)

    Hufnagel, Dick [Fermilab

    2015-11-19

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize “opportunistic” resources (resources not owned by, or a priori configured for, CMS) to meet peak demands. In addition to our dedicated resources we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and Parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Here we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.

  15. Efficient Resource Management in Cloud Computing

    OpenAIRE

    Rushikesh Shingade; Amit Patil; Shivam Suryawanshi; M. Venkatesan

    2015-01-01

    Cloud computing is one of the most widely used technologies for providing cloud services to users, who are charged for the services they receive. With a large number of resources involved, the performance of Cloud resource management policies is difficult to evaluate and optimize efficiently. Different simulation toolkits are available for simulating and modelling the Cloud computing environment, such as GridSim, CloudAnalyst, CloudSim, GreenCloud, CloudAuction, etc. In the proposed Efficient Resource Manage...

  16. Resource estimations in contingency planning for FMD

    DEFF Research Database (Denmark)

    Boklund, Anette; Sten, Mortensen; Holm Johansen, Maren

    Based on results from a stochastic simulation model, it was possible to create a simple model in Excel to estimate the requirements for personnel and materiel during an FMD outbreak in Denmark. The model can easily be adjusted when new information on resources appears from management of other cr...

  17. Exploitation of heterogeneous resources for ATLAS Computing

    CERN Document Server

    Chudoba, Jiri; The ATLAS collaboration

    2018-01-01

    LHC experiments require significant computational resources for Monte Carlo simulations and real data processing, and the ATLAS experiment is not an exception. In 2017, ATLAS steadily used almost 3 million HS06 units, which corresponds to about 300 000 standard CPU cores. The total disk and tape capacity managed by the Rucio data management system exceeded 350 PB. Resources are provided mostly by Grid computing centers distributed in geographically separated locations and connected by the Grid middleware. The ATLAS collaboration developed several systems to manage computational jobs, data files and network transfers. ATLAS solutions for job and data management (PanDA and Rucio) were generalized and are now used also by other collaborations. More components are needed to include new resources such as private and public clouds, volunteers' desktop computers and primarily supercomputers in major HPC centers. Workflows and data flows significantly differ for these less traditional resources and extensive software re...

  18. SOCR: Statistics Online Computational Resource

    OpenAIRE

    Dinov, Ivo D.

    2006-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis...

  19. Wind power error estimation in resource assessments.

    Directory of Open Access Journals (Sweden)

    Osvaldo Rodríguez

    Full Text Available Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, probability density function, and wind turbine power curves. This method uses the actual wind speed data without prior statistical treatment, together with 28 wind turbine power curves fitted by Lagrange's method, to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, thereby yielding an error of 5%. The proposed error propagation complements the traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the levelized cost of power production or the investment time return. The implementation of this method increases the reliability of techno-economic resource assessment studies.

  20. Wind power error estimation in resource assessments.

    Science.gov (United States)

    Rodríguez, Osvaldo; Del Río, Jesús A; Jaramillo, Oscar A; Martínez, Manuel

    2015-01-01

    Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, probability density function, and wind turbine power curves. This method uses the actual wind speed data without prior statistical treatment, together with 28 wind turbine power curves fitted by Lagrange's method, to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, thereby yielding an error of 5%. The proposed error propagation complements the traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the levelized cost of power production or the investment time return. The implementation of this method increases the reliability of techno-economic resource assessment studies.
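
    As a rough illustration of the error propagation described above, the sketch below pushes an assumed 10% wind speed error through a tabulated power curve. The curve values and wind record are invented for the example, and np.interp stands in for the Lagrange-fitted curves used in the paper.

        import numpy as np

        # Hypothetical tabulated power curve (wind speed in m/s -> power in kW);
        # a real study would use the manufacturer's curve for each turbine.
        v_tab = np.array([3, 5, 7, 9, 11, 13, 15, 25], dtype=float)
        p_tab = np.array([0, 150, 550, 1200, 1800, 2000, 2000, 2000], dtype=float)

        def power(v):
            """Interpolate the power curve (np.interp stands in for a Lagrange fit)."""
            return np.interp(v, v_tab, p_tab, left=0.0, right=0.0)

        v = np.array([4.2, 6.8, 9.5, 11.3, 13.7])   # measured wind speeds, m/s
        eps = 0.10                                   # assumed 10% speed error

        p_nominal = power(v)
        p_low, p_high = power(v * (1 - eps)), power(v * (1 + eps))

        # Relative error of the estimated yield after propagation through the curve
        rel_err = (p_high.sum() - p_low.sum()) / (2 * p_nominal.sum())
        print(f"propagated power error: {rel_err:.1%}")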

  1. Energy analysis applied to uranium resource estimation

    International Nuclear Information System (INIS)

    Mortimer, N.D.

    1980-01-01

    It is pointed out that fuel prices and ore costs are interdependent, and that in estimating ore costs (involving the cost of fuels used to mine and process the uranium) it is necessary to take into account the total use of energy by the entire fuel system, through the technique of energy analysis. The subject is discussed, and illustrated with diagrams, under the following heads: estimate of how total workable resources would depend on production costs; sensitivity of nuclear electricity prices to ore costs; variation of net energy requirement with ore grade for a typical PWR reactor design; variation of average fundamental cost of nuclear electricity with ore grade; variation of cumulative uranium resources with current maximum ore costs. (U.K.)

  2. Offshore wind resource estimation for wind energy

    DEFF Research Database (Denmark)

    Hasager, Charlotte Bay; Badger, Merete; Mouche, A.

    2010-01-01

    Satellite remote sensing from active and passive microwave instruments is used to estimate the offshore wind resource in the Northern European Seas in the EU-Norsewind project. The satellite data include 8 years of Envisat ASAR, 10 years of QuikSCAT, and 23 years of SSM/I. The satellite observations are compared to selected offshore meteorological masts in the Baltic Sea and North Sea. The overall aim of the Norsewind project is a state-of-the-art wind atlas at 100 m height. The satellite winds are all valid at 10 m above sea level, and extrapolation to higher heights is a challenge. Mesoscale modeling of the winds at hub height will be compared to data from wind lidars observing at 100 m above sea level. Plans are also to compare mesoscale model results and satellite-based estimates of the offshore wind resource.
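
    The height-extrapolation challenge mentioned above can be illustrated with the simplest neutral logarithmic wind profile. The roughness length and 10 m winds below are assumed values; Norsewind itself relies on mesoscale modelling and lidar comparisons rather than this textbook relation.

        import numpy as np

        def extrapolate_log_law(u10, z0=0.0002, z_ref=10.0, z_hub=100.0):
            """Neutral logarithmic profile; z0 is an assumed sea-surface roughness (m)."""
            return u10 * np.log(z_hub / z0) / np.log(z_ref / z0)

        u10 = np.array([6.5, 8.0, 9.2])       # satellite winds at 10 m, m/s
        print(extrapolate_log_law(u10))       # rough estimates at 100 m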

  3. Resource-estimation models and predicted discovery

    International Nuclear Information System (INIS)

    Hill, G.W.

    1982-01-01

    Resources have been estimated by predictive extrapolation from past discovery experience, by analogy with better explored regions, or by inference from evidence of depletion of targets for exploration. Changes in technology and new insights into geological mechanisms have occurred sufficiently often in the long run to form part of the pattern of mature discovery experience. The criterion, that a meaningful resource estimate needs an objective measure of its precision or degree of uncertainty, excludes 'estimates' based solely on expert opinion. This is illustrated by development of error measures for several persuasive models of discovery and production of oil and gas in USA, both annually and in terms of increasing exploration effort. Appropriate generalizations of the models resolve many points of controversy. This is illustrated using two USA data sets describing discovery of oil and of U3O8; the latter set highlights an inadequacy of available official data. Review of the oil-discovery data set provides a warrant for adjusting the time-series prediction to a higher resource figure for USA petroleum. (author)

  4. Turning Video Resource Management into Cloud Computing

    Directory of Open Access Journals (Sweden)

    Weili Kou

    2016-07-01

    Full Text Available Big data makes cloud computing more and more popular in various fields. Video resources are very useful and important to education, security monitoring, and so on. However, issues of their huge volumes, complex data types, inefficient processing performance, weak security, and long times for loading pose challenges in video resource management. The Hadoop Distributed File System (HDFS) is an open-source framework, which can provide cloud-based platforms and presents an opportunity for solving these problems. This paper presents a video resource management architecture based on HDFS to provide a uniform framework and a five-layer model for standardizing the current various algorithms and applications. The architecture, basic model, and key algorithms are designed for turning video resources into a cloud computing environment. The design was tested by establishing a simulation system prototype.

  5. Framework of Resource Management for Intercloud Computing

    Directory of Open Access Journals (Sweden)

    Mohammad Aazam

    2014-01-01

    Full Text Available There has been a very rapid increase in digital media content, due to which media cloud is gaining importance. The cloud computing paradigm provides management of resources and helps create an extended portfolio of services. Through cloud computing, not only are services managed more efficiently, but service discovery is also made possible. To handle the rapid increase in content, media cloud plays a very vital role. But it is not possible for standalone clouds to handle everything with the increasing user demands. For scalability and better service provisioning, at times, clouds have to communicate with other clouds and share their resources. This scenario is called Intercloud computing or cloud federation. The study of Intercloud computing is still in its infancy. Resource management is one of the key concerns to be addressed in Intercloud computing. Existing studies discuss this issue only in a trivial and simplistic way. In this study, we present a resource management model, keeping in view different types of services, different customer types, customer characteristics, pricing, and refunding. The presented framework was implemented using Java and NetBeans 8.0 and evaluated using the CloudSim 3.0.3 toolkit. Presented results and their discussion validate our model and its efficiency.

  6. LHCb Computing Resources: 2011 re-assessment, 2012 request and 2013 forecast

    CERN Document Server

    Graciani, R

    2011-01-01

    This note covers the following aspects: re-assessment of computing resource usage estimates for the 2011 data taking period, request of computing resource needs for the 2012 data taking period, and a first forecast of the 2013 needs, when no data taking is foreseen. Estimates are based on 2010 experience and the latest updates to the LHC schedule, as well as on a new implementation of the computing model simulation tool. Differences in the model and deviations in the estimates from previously presented results are stressed.

  7. LHCb Computing Resources: 2012 re-assessment, 2013 request and 2014 forecast

    CERN Document Server

    Graciani Diaz, Ricardo

    2012-01-01

    This note covers the following aspects: re-assessment of computing resource usage estimates for the 2012 data-taking period, request of computing resource needs for 2013, and a first forecast of the 2014 needs, when restart of data-taking is foreseen. Estimates are based on 2011 experience, as well as on the results of a simulation of the computing model described in the document. Differences in the model and deviations in the estimates from previously presented results are stressed.

  8. Optimised resource construction for verifiable quantum computation

    International Nuclear Information System (INIS)

    Kashefi, Elham; Wallden, Petros

    2017-01-01

    Recent developments have brought the possibility of achieving scalable quantum networks and quantum devices closer. From the computational point of view these emerging technologies become relevant when they are no longer classically simulatable. Hence a pressing challenge is the construction of practical methods to verify the correctness of the outcome produced by universal or non-universal quantum devices. A promising approach that has been extensively explored is the scheme of verification via encryption through blind quantum computation. We present here a new construction that simplifies the required resources for any such verifiable protocol. We obtain an overhead that is linear in the size of the input (computation), while the security parameter remains independent of the size of the computation and can be made exponentially small (with a small extra cost). Furthermore our construction is generic and could be applied to any universal or non-universal scheme with a given underlying graph. (paper)

  9. VECTR: Virtual Environment Computational Training Resource

    Science.gov (United States)

    Little, William L.

    2018-01-01

    The Westridge Middle School Curriculum and Community Night is an annual event designed to introduce students and parents to potential employers in the Central Florida area. NASA participated in the event in 2017, and has been asked to come back for the 2018 event on January 25. We will be demonstrating our Microsoft Hololens Virtual Rovers project, and the Virtual Environment Computational Training Resource (VECTR) virtual reality tool.

  10. LHCb Computing Resource usage in 2017

    CERN Document Server

    Bozzi, Concezio

    2018-01-01

    This document reports the usage of computing resources by the LHCb collaboration during the period January 1st – December 31st 2017. The data in the following sections have been compiled from the EGI Accounting portal: https://accounting.egi.eu. For LHCb specific information, the data is taken from the DIRAC Accounting at the LHCb DIRAC Web portal: http://lhcb-portal-dirac.cern.ch.

  11. Function Package for Computing Quantum Resource Measures

    Science.gov (United States)

    Huang, Zhiming

    2018-05-01

    In this paper, we present a function package to calculate quantum resource measures and the dynamics of open systems. Our package includes common operators and operator lists, frequently-used functions for computing quantum entanglement, quantum correlation, quantum coherence, quantum Fisher information, and dynamics in noisy environments. We briefly explain the functions of the package and illustrate how to use the package with several typical examples. We expect that this package will be a useful tool for future research and education.

  12. Exploiting volatile opportunistic computing resources with Lobster

    Science.gov (United States)

    Woodard, Anna; Wolf, Matthias; Mueller, Charles; Tovar, Ben; Donnelly, Patrick; Hurtado Anampa, Kenyi; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas

    2015-12-01

    Analysis of high energy physics experiments using the Compact Muon Solenoid (CMS) at the Large Hadron Collider (LHC) can be limited by availability of computing resources. As a joint effort involving computer scientists and CMS physicists at Notre Dame, we have developed an opportunistic workflow management tool, Lobster, to harvest available cycles from university campus computing pools. Lobster consists of a management server, file server, and worker processes which can be submitted to any available computing resource without requiring root access. Lobster makes use of the Work Queue system to perform task management, while the CMS specific software environment is provided via CVMFS and Parrot. Data is handled via Chirp and Hadoop for local data storage and XrootD for access to the CMS wide-area data federation. An extensive set of monitoring and diagnostic tools have been developed to facilitate system optimisation. We have tested Lobster using the 20 000-core cluster at Notre Dame, achieving approximately 8-10k tasks running simultaneously, sustaining approximately 9 Gbit/s of input data and 340 Mbit/s of output data.

  13. Parallel visualization on leadership computing resources

    Energy Technology Data Exchange (ETDEWEB)

    Peterka, T; Ross, R B [Mathematics and Computer Science Division, Argonne National Laboratory, Argonne, IL 60439 (United States); Shen, H-W [Department of Computer Science and Engineering, Ohio State University, Columbus, OH 43210 (United States); Ma, K-L [Department of Computer Science, University of California at Davis, Davis, CA 95616 (United States); Kendall, W [Department of Electrical Engineering and Computer Science, University of Tennessee at Knoxville, Knoxville, TN 37996 (United States); Yu, H, E-mail: tpeterka@mcs.anl.go [Sandia National Laboratories, California, Livermore, CA 94551 (United States)

    2009-07-01

    Changes are needed in the way that visualization is performed, if we expect the analysis of scientific data to be effective at the petascale and beyond. By using similar techniques as those used to parallelize simulations, such as parallel I/O, load balancing, and effective use of interprocess communication, the supercomputers that compute these datasets can also serve as analysis and visualization engines for them. Our team is assessing the feasibility of performing parallel scientific visualization on some of the most powerful computational resources of the U.S. Department of Energy's National Laboratories in order to pave the way for analyzing the next generation of computational results. This paper highlights some of the conclusions of that research.

  14. Parallel visualization on leadership computing resources

    International Nuclear Information System (INIS)

    Peterka, T; Ross, R B; Shen, H-W; Ma, K-L; Kendall, W; Yu, H

    2009-01-01

    Changes are needed in the way that visualization is performed, if we expect the analysis of scientific data to be effective at the petascale and beyond. By using similar techniques as those used to parallelize simulations, such as parallel I/O, load balancing, and effective use of interprocess communication, the supercomputers that compute these datasets can also serve as analysis and visualization engines for them. Our team is assessing the feasibility of performing parallel scientific visualization on some of the most powerful computational resources of the U.S. Department of Energy's National Laboratories in order to pave the way for analyzing the next generation of computational results. This paper highlights some of the conclusions of that research.

  15. Automating usability of ATLAS distributed computing resources

    International Nuclear Information System (INIS)

    Tupputi, S A; Girolamo, A Di; Kouba, T; Schovancová, J

    2014-01-01

    The automation of ATLAS Distributed Computing (ADC) operations is essential to reduce manpower costs and allow performance-enhancing actions, which improve the reliability of the system. In this perspective a crucial case is the automatic handling of outages of ATLAS computing sites storage resources, which are continuously exploited at the edge of their capabilities. It is challenging to adopt unambiguous decision criteria for storage resources of non-homogeneous types, sizes and roles. The recently developed Storage Area Automatic Blacklisting (SAAB) tool has provided a suitable solution, by employing an inference algorithm which processes history of storage monitoring tests outcome. SAAB accomplishes both the tasks of providing global monitoring as well as automatic operations on single sites. The implementation of the SAAB tool has been the first step in a comprehensive review of the storage areas monitoring and central management at all levels. Such review has involved the reordering and optimization of SAM tests deployment and the inclusion of SAAB results in the ATLAS Site Status Board with both dedicated metrics and views. The resulting structure allows monitoring the storage resources status with fine time-granularity and automatic actions to be taken in foreseen cases, like automatic outage handling and notifications to sites. Hence, the human actions are restricted to reporting and following up problems, where and when needed. In this work we show SAAB working principles and features. We present also the decrease of human interactions achieved within the ATLAS Computing Operation team. The automation results in a prompt reaction to failures, which leads to the optimization of resource exploitation.
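
    The abstract does not spell out SAAB's inference algorithm, so the sketch below only illustrates the general idea of excluding a storage resource based on a history of monitoring test outcomes; the window size and thresholds are assumptions made for the example.

        from collections import deque

        def should_blacklist(history, window=20, max_failure_rate=0.5, min_tests=10):
            """Exclusion rule over a history of test outcomes (True = test passed)."""
            recent = list(deque(history, maxlen=window))   # keep the latest results
            if len(recent) < min_tests:
                return False                               # not enough evidence yet
            failure_rate = recent.count(False) / len(recent)
            return failure_rate > max_failure_rate

        history = [True] * 8 + [False] * 12
        print(should_blacklist(history))                   # True: resource excluded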

  16. A multipurpose computing center with distributed resources

    Science.gov (United States)

    Chudoba, J.; Adam, M.; Adamová, D.; Kouba, T.; Mikula, A.; Říkal, V.; Švec, J.; Uhlířová, J.; Vokáč, P.; Svatoš, M.

    2017-10-01

    The Computing Center of the Institute of Physics (CC IoP) of the Czech Academy of Sciences serves a broad spectrum of users with various computing needs. It runs WLCG Tier-2 center for the ALICE and the ATLAS experiments; the same group of services is used by astroparticle physics projects the Pierre Auger Observatory (PAO) and the Cherenkov Telescope Array (CTA). OSG stack is installed for the NOvA experiment. Other groups of users directly use the local batch system. Storage capacity is distributed to several locations. DPM servers used by the ATLAS and the PAO are all in the same server room, but several xrootd servers for the ALICE experiment are operated in the Nuclear Physics Institute in Řež, about 10 km away. The storage capacity for the ATLAS and the PAO is extended by resources of the CESNET - the Czech National Grid Initiative representative. Those resources are in Plzen and Jihlava, more than 100 km away from the CC IoP. Both distant sites use a hierarchical storage solution based on disks and tapes. They installed one common dCache instance, which is published in the CC IoP BDII. ATLAS users can use these resources using the standard ATLAS tools in the same way as the local storage without noticing this geographical distribution. Computing clusters LUNA and EXMAG dedicated to users mostly from the Solid State Physics departments offer resources for parallel computing. They are part of the Czech NGI infrastructure MetaCentrum with distributed batch system based on torque with a custom scheduler. Clusters are installed remotely by the MetaCentrum team and a local contact helps only when needed. Users from IoP have exclusive access only to a part of these two clusters and take advantage of higher priorities on the rest (1500 cores in total), which can also be used by any user of the MetaCentrum. IoP researchers can also use distant resources located in several towns of the Czech Republic with a capacity of more than 12000 cores in total.

  17. NMRbox: A Resource for Biomolecular NMR Computation.

    Science.gov (United States)

    Maciejewski, Mark W; Schuyler, Adam D; Gryk, Michael R; Moraru, Ion I; Romero, Pedro R; Ulrich, Eldon L; Eghbalnia, Hamid R; Livny, Miron; Delaglio, Frank; Hoch, Jeffrey C

    2017-04-25

    Advances in computation have been enabling many recent advances in biomolecular applications of NMR. Due to the wide diversity of applications of NMR, the number and variety of software packages for processing and analyzing NMR data is quite large, with labs relying on dozens, if not hundreds of software packages. Discovery, acquisition, installation, and maintenance of all these packages is a burdensome task. Because the majority of software packages originate in academic labs, persistence of the software is compromised when developers graduate, funding ceases, or investigators turn to other projects. To simplify access to and use of biomolecular NMR software, foster persistence, and enhance reproducibility of computational workflows, we have developed NMRbox, a shared resource for NMR software and computation. NMRbox employs virtualization to provide a comprehensive software environment preconfigured with hundreds of software packages, available as a downloadable virtual machine or as a Platform-as-a-Service supported by a dedicated compute cloud. Ongoing development includes a metadata harvester to regularize, annotate, and preserve workflows and facilitate and enhance data depositions to BioMagResBank, and tools for Bayesian inference to enhance the robustness and extensibility of computational analyses. In addition to facilitating use and preservation of the rich and dynamic software environment for biomolecular NMR, NMRbox fosters the development and deployment of a new class of metasoftware packages. NMRbox is freely available to not-for-profit users. Copyright © 2017 Biophysical Society. All rights reserved.

  18. ACToR - Aggregated Computational Toxicology Resource

    International Nuclear Information System (INIS)

    Judson, Richard; Richard, Ann; Dix, David; Houck, Keith; Elloumi, Fathi; Martin, Matthew; Cathey, Tommy; Transue, Thomas R.; Spencer, Richard; Wolf, Maritja

    2008-01-01

    ACToR (Aggregated Computational Toxicology Resource) is a database and set of software applications that bring into one central location many types and sources of data on environmental chemicals. Currently, the ACToR chemical database contains information on chemical structure, in vitro bioassays and in vivo toxicology assays derived from more than 150 sources including the U.S. Environmental Protection Agency (EPA), Centers for Disease Control (CDC), U.S. Food and Drug Administration (FDA), National Institutes of Health (NIH), state agencies, corresponding government agencies in Canada, Europe and Japan, universities, the World Health Organization (WHO) and non-governmental organizations (NGOs). At the EPA National Center for Computational Toxicology, ACToR helps manage large data sets being used in a high-throughput environmental chemical screening and prioritization program called ToxCast™.

  19. Comparing computing formulas for estimating concentration ratios

    International Nuclear Information System (INIS)

    Gilbert, R.O.; Simpson, J.C.

    1984-03-01

    This paper provides guidance on the choice of computing formulas (estimators) for estimating concentration ratios and other ratio-type measures of radionuclides and other environmental contaminant transfers between ecosystem components. Mathematical expressions for the expected value of three commonly used estimators (arithmetic mean of ratios, geometric mean of ratios, and the ratio of means) are obtained when the multivariate lognormal distribution is assumed. These expressions are used to explain why these estimators will not in general give the same estimate of the average concentration ratio. They illustrate that the magnitude of the discrepancies depends on the magnitude of measurement biases, and on the variances and correlations associated with spatial heterogeneity and measurement errors. This paper also reports on a computer simulation study that compares the accuracy of eight computing formulas for estimating a ratio relationship that is constant over time and/or space. Statistical models appropriate for both controlled spiking experiments and observational field studies for either normal or lognormal distributions are considered. 24 references, 15 figures, 7 tables
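
    The three computing formulas compared in the paper are easy to reproduce numerically. The sketch below uses synthetic correlated lognormal data (parameters chosen arbitrarily) to show that the arithmetic mean of ratios, the geometric mean of ratios, and the ratio of means generally give different answers.

        import numpy as np

        rng = np.random.default_rng(0)
        # Paired contaminant concentrations in two ecosystem components,
        # drawn from correlated lognormal distributions (illustrative values only).
        x = rng.lognormal(mean=0.0, sigma=0.5, size=1000)
        y = x * rng.lognormal(mean=0.2, sigma=0.3, size=1000)

        ratios = y / x
        arith_mean_of_ratios = ratios.mean()
        geom_mean_of_ratios = np.exp(np.log(ratios).mean())
        ratio_of_means = y.mean() / x.mean()

        # The three estimators disagree for lognormal data, as the paper explains.
        print(arith_mean_of_ratios, geom_mean_of_ratios, ratio_of_means)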

  20. Emotor control: computations underlying bodily resource allocation, emotions, and confidence.

    Science.gov (United States)

    Kepecs, Adam; Mensh, Brett D

    2015-12-01

    Emotional processes are central to behavior, yet their deeply subjective nature has been a challenge for neuroscientific study as well as for psychiatric diagnosis. Here we explore the relationships between subjective feelings and their underlying brain circuits from a computational perspective. We apply recent insights from systems neuroscience (approaching subjective behavior as the result of mental computations instantiated in the brain) to the study of emotions. We develop the hypothesis that emotions are the product of neural computations whose motor role is to reallocate bodily resources mostly gated by smooth muscles. This "emotor" control system is analogous to the more familiar motor control computations that coordinate skeletal muscle movements. To illustrate this framework, we review recent research on "confidence." Although familiar as a feeling, confidence is also an objective statistical quantity: an estimate of the probability that a hypothesis is correct. This model-based approach helped reveal the neural basis of decision confidence in mammals and provides a bridge to the subjective feeling of confidence in humans. These results have important implications for psychiatry, since disorders of confidence computations appear to contribute to a number of psychopathologies. More broadly, this computational approach to emotions resonates with the emerging view that psychiatric nosology may be best parameterized in terms of disorders of the cognitive computations underlying complex behavior.

  1. Contract on using computer resources of another

    Directory of Open Access Journals (Sweden)

    Cvetković Mihajlo

    2016-01-01

    Full Text Available Contractual relations involving the use of another's property are quite common. Yet, the use of computer resources of others over the Internet and the legal transactions arising thereof certainly diverge from the traditional framework embodied in the special part of contract law dealing with this issue. Modern performance concepts (such as infrastructure, software or platform as high-tech services) are highly unlikely to be described by the terminology derived from Roman law. The overwhelming novelty of high-tech services obscures the disadvantageous position of contracting parties. In most cases, service providers are global multinational companies which tend to secure their own unjustified privileges and gain by providing lengthy and intricate contracts, often comprising a number of legal documents. General terms and conditions in these service provision contracts are further complicated by the 'service level agreement', rules of conduct and (non)confidentiality guarantees. Without giving the issue a second thought, users easily accept the pre-fabricated offer without reservations, unaware that such a pseudo-gratuitous contract actually conceals a highly lucrative and mutually binding agreement. The author examines the extent to which the legal provisions governing sale of goods and services, lease, loan and commodatum may apply to 'cloud computing' contracts, and analyses the scope and advantages of contractual consumer protection, as a relatively new area in contract law. The termination of a service contract between the provider and the user features specific post-contractual obligations which are inherent to an online environment.

  2. Internal Clock Drift Estimation in Computer Clusters

    Directory of Open Access Journals (Sweden)

    Hicham Marouani

    2008-01-01

    Full Text Available Most computers have several high-resolution timing sources, from the programmable interrupt timer to the cycle counter. Yet, even at a precision of one cycle in ten million, clocks may drift significantly in a single second at a clock frequency of several GHz. When tracing the low-level system events in computer clusters, such as packet sending or reception, each computer system records its own events using an internal clock. In order to properly understand the global system behavior and performance, as reported by the events recorded on each computer, it is important to estimate precisely the clock differences and drift between the different computers in the system. This article studies the clock precision and stability of several computer systems, with different architectures. It also studies the typical network delay characteristics, since time synchronization algorithms rely on the exchange of network packets and are dependent on the symmetry of the delays. A very precise clock, based on the atomic time provided by the GPS satellite network, was used as a reference to measure clock drifts and network delays. The results obtained are of immediate use to all applications which depend on computer clocks or network time synchronization accuracy.
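
    A minimal version of the offset-and-drift estimation discussed above can be written as a least-squares fit, assuming paired timestamps of common events from a GPS-disciplined reference clock and a node's internal clock (the numbers below are illustrative, not measured values).

        import numpy as np

        # Timestamps (in seconds) of the same events as seen by the reference clock
        # and by one node's internal clock.
        t_ref = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
        t_node = np.array([0.0012, 10.0015, 20.0019, 30.0022, 40.0026])

        # Model t_node = (1 + drift) * t_ref + offset and fit it by least squares.
        slope, offset = np.polyfit(t_ref, t_node, 1)
        drift_ppm = (slope - 1.0) * 1e6
        print(f"offset = {offset * 1e3:.3f} ms, drift = {drift_ppm:.2f} ppm")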

  3. Hydrokinetic energy resource estimates of River ERO at Lafiagi ...

    African Journals Online (AJOL)

    Hydrokinetic energy resource estimates of River ERO at Lafiagi, Kwara State, ... cost-effective renewable energy solution without requiring the construction of a ... Keywords: Hydrokinetic Power, Energy Resource, River Ero, Water Resources ...

  4. Air quality estimation by computational intelligence methodologies

    Directory of Open Access Journals (Sweden)

    Ćirić Ivan T.

    2012-01-01

    Full Text Available The subject of this study is to compare different computational intelligence methodologies based on artificial neural networks used for forecasting an air quality parameter, the emission of CO2, in the city of Niš. Firstly, the inputs of the CO2 emission estimator are analyzed and their measurement is explained. It is known that traffic is the single largest emitter of CO2 in Europe. Therefore, a proper treatment of this component of pollution is very important for precise estimation of emission levels. With this in mind, measurements of traffic frequency and CO2 concentration were carried out at critical intersections in the city, as well as monitoring of vehicle directions at the crossroads. Finally, based on experimental data, different soft computing estimators were developed, such as a feed-forward neural network, a recurrent neural network, and a hybrid neuro-fuzzy estimator of CO2 emission levels. Test data for some characteristic cases, presented at the end of the paper, show good agreement of the developed estimators' outputs with experimental data. The presented results are a true indicator of the usability of the implemented methods. [Projects of the Ministry of Science of the Republic of Serbia: no. III42008-2/2011, Evaluation of Energy Performances and Indoor Environment Quality of Educational Buildings in Serbia with Impact to Health; and no. TR35016/2011, Research of MHD Flows around the Bodies, in the Tip Clearances and Channels and Application in the MHD Pumps Development]

  5. NASA Center for Computational Sciences: History and Resources

    Science.gov (United States)

    2000-01-01

    The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.

  6. LHCb Computing Resources: 2019 requests and reassessment of 2018 requests

    CERN Document Server

    Bozzi, Concezio

    2017-01-01

    This document presents the computing resources needed by LHCb in 2019 and a reassessment of the 2018 requests, as resulting from the current experience of Run2 data taking and minor changes in the LHCb computing model parameters.

  7. Some issues of creation of belarusian language computer resources

    OpenAIRE

    Rubashko, N.; Nevmerjitskaia, G.

    2003-01-01

    The main reason for creating computer resources for a natural language is the need to bring the ways of language normalization into accord with the form of its existence: the computer form of language usage should correspond to the computer form in which language standards are fixed. This paper discusses various aspects of the creation of Belarusian language computer resources. It also briefly gives an overview of the objectives of the project involved.

  8. Integration of Cloud resources in the LHCb Distributed Computing

    CERN Document Server

    Ubeda Garcia, Mario; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-01-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it has seamlessly integrated Grid resources and computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack), and it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keepin...

  9. Estimating near-shore wind resources

    DEFF Research Database (Denmark)

    Floors, Rogier Ralph; Hahmann, Andrea N.; Peña, Alfredo

    An evaluation and sensitivity study using the WRF mesoscale model to estimate the wind in a coastal area is performed using a unique data set consisting of scanning, profiling and floating lidars. The ability of the WRF model to represent the wind speed was evaluated by running the model for a four... grid spacings were performed for each of the two schemes. An evaluation of the wind profile using vertical profilers revealed small differences in modelled mean wind speed between the different set-ups, with the YSU scheme predicting slightly higher mean wind speeds. Larger differences between the different simulations were observed when comparing the root-mean-square error (RMSE) between modelled and measured wind, with the ERA-Interim-based simulations having the lowest errors. The simulations with finer horizontal grid spacing had a larger MSE. Horizontal transects of mean wind speed across...

  10. Meta-analysis of non-renewable energy resource estimates

    International Nuclear Information System (INIS)

    Dale, Michael

    2012-01-01

    This paper offers a review of estimates of ultimately recoverable resources (URR) of non-renewable energy sources: coal, conventional and unconventional oil, conventional and unconventional gas, and uranium for nuclear fission. There is a large range in the estimates of many of the energy sources, even those that have been utilized for a long time and, as such, should be well understood. If it is assumed that the estimates for each resource are normally distributed, then the total value of ultimately recoverable fossil and fissile energy resources is 70,592 EJ. If, on the other hand, the best fitting distribution from each of the resource estimate populations is used, the total value is 50,702 EJ, a factor of around 30% smaller. - Highlights: ► Brief introduction to categorization of resources. ► Collated over 380 estimates of ultimately recoverable global resources for all non-renewable energy sources. ► Extensive statistical analysis and distribution fitting conducted. ► Cross-energy source comparison of resource magnitudes.
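
    Under the independent-normal assumption mentioned above, per-resource estimates combine by summing the means and adding the variances. The figures in the sketch below are placeholders chosen for illustration, not the paper's fitted values.

        import math

        # Illustrative (not the paper's) mean and standard deviation of URR estimates, in EJ.
        resources = {
            "coal":               (25000, 8000),
            "conventional oil":   (12000, 3000),
            "unconventional oil": (9000, 4000),
            "gas":                (15000, 5000),
            "uranium":            (9000, 4000),
        }

        total_mean = sum(mu for mu, _ in resources.values())
        total_sd = math.sqrt(sum(sd ** 2 for _, sd in resources.values()))
        print(f"total URR ~ {total_mean} +/- {total_sd:.0f} EJ (independent-normal assumption)")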

  11. Resource management in utility and cloud computing

    CERN Document Server

    Zhao, Han

    2013-01-01

    This SpringerBrief reviews the existing market-oriented strategies for economically managing resource allocation in distributed systems. It describes three new schemes that address cost-efficiency, user incentives, and allocation fairness with regard to different scheduling contexts. The first scheme, taking the Amazon EC2 market as a case of study, investigates the optimal resource rental planning models based on linear integer programming and stochastic optimization techniques. This model is useful to explore the interaction between the cloud infrastructure provider and the cloud resource c

  12. Estimation of economic parameters of U.S. hydropower resources

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Douglas G. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Idaho National Engineering and Environmental Lab. (INEEL); Hunt, Richard T. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Idaho National Engineering and Environmental Lab. (INEEL); Reeves, Kelly S. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Idaho National Engineering and Environmental Lab. (INEEL); Carroll, Greg R. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Idaho National Engineering and Environmental Lab. (INEEL)

    2003-06-01

    Tools for estimating the cost of developing and operating and maintaining hydropower resources in the form of regression curves were developed based on historical plant data. Development costs that were addressed included: licensing, construction, and five types of environmental mitigation. It was found that the data for each type of cost correlated well with plant capacity. A tool for estimating the annual and monthly electric generation of hydropower resources was also developed. Additional tools were developed to estimate the cost of upgrading a turbine or a generator. The development and operation and maintenance cost estimating tools, and the generation estimating tool were applied to 2,155 U.S. hydropower sites representing a total potential capacity of 43,036 MW. The sites included totally undeveloped sites, dams without a hydroelectric plant, and hydroelectric plants that could be expanded to achieve greater capacity. Site characteristics and estimated costs and generation for each site were assembled in a database in Excel format that is also included within the EERE Library under the title, “Estimation of Economic Parameters of U.S. Hydropower Resources - INL Hydropower Resource Economics Database.”
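
    Regression curves of cost against capacity of the kind described above are typically fitted as power laws in log-log space. The plant data in the sketch below are made up purely to show the fitting step, not taken from the INL database.

        import numpy as np

        # Illustrative plant data: capacity (MW) versus development cost (million $).
        capacity = np.array([5, 12, 30, 75, 150, 400], dtype=float)
        cost = np.array([14, 28, 60, 120, 210, 480], dtype=float)

        # Fit cost = a * capacity**b by linear regression in log-log space.
        b, log_a = np.polyfit(np.log(capacity), np.log(cost), 1)
        a = np.exp(log_a)
        print(f"cost ~ {a:.1f} * MW^{b:.2f}")   # regression-curve style estimator
        print(a * 50.0 ** b)                    # predicted cost for a 50 MW site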

  13. Improving ATLAS computing resource utilization with HammerCloud

    CERN Document Server

    Schovancova, Jaroslava; The ATLAS collaboration

    2018-01-01

    HammerCloud is a framework to commission, test, and benchmark ATLAS computing resources and components of various distributed systems with realistic full-chain experiment workflows. HammerCloud contributes to ATLAS Distributed Computing (ADC) Operations and automation efforts, providing the automated resource exclusion and recovery tools, that help re-focus operational manpower to areas which have yet to be automated, and improve utilization of available computing resources. We present recent evolution of the auto-exclusion/recovery tools: faster inclusion of new resources in testing machinery, machine learning algorithms for anomaly detection, categorized resources as master vs. slave for the purpose of blacklisting, and a tool for auto-exclusion/recovery of resources triggered by Event Service job failures that is being extended to other workflows besides the Event Service. We describe how HammerCloud helped commissioning various concepts and components of distributed systems: simplified configuration of qu...

  14. A Matchmaking Strategy Of Mixed Resource On Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Wisam Elshareef

    2015-08-01

    Full Text Available Today cloud computing has become a key technology for online allotment of computing resources and online storage of user data at a lower cost, where computing resources are available all the time over the Internet with a pay-per-use concept. Recently there has been a growing need for resource management strategies in a cloud computing environment that encompass both end-user satisfaction and a high job submission throughput with appropriate scheduling. One of the major and essential issues in resource management is allocating incoming tasks to suitable virtual machines (matchmaking). The main objective of this paper is to propose a matchmaking strategy between the incoming requests and the various resources in the cloud environment, to satisfy the requirements of users and to balance the workload on resources. Load balancing is an important aspect of resource management in a cloud computing environment. So this paper proposes a dynamic weight active monitor (DWAM) load balance algorithm, which allocates incoming requests on the fly to all available virtual machines in an efficient manner, in order to achieve better performance parameters such as response time, processing time and resource utilization. The feasibility of the proposed algorithm is analyzed using the CloudSim simulator, which proves the superiority of the proposed DWAM algorithm over its counterparts in the literature. Simulation results demonstrate that the proposed algorithm dramatically improves response time, data processing time and resource utilization compared to the Active Monitor and VM-assign algorithms.
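
    As a toy illustration of weight-based matchmaking between incoming requests and virtual machines (not the DWAM algorithm itself; the weights, metrics and scoring rule are assumptions made for the example), a greedy assignment could look like this:

        # Pick the VM with the lowest weighted load for each incoming request.
        vms = [
            {"id": "vm1", "cpu": 0.70, "mem": 0.50, "queue": 3},
            {"id": "vm2", "cpu": 0.30, "mem": 0.60, "queue": 1},
            {"id": "vm3", "cpu": 0.55, "mem": 0.20, "queue": 2},
        ]

        def score(vm, w_cpu=0.5, w_mem=0.3, w_queue=0.2):
            """Lower score = more headroom; queue length normalised by an assumed cap of 10."""
            return w_cpu * vm["cpu"] + w_mem * vm["mem"] + w_queue * vm["queue"] / 10.0

        def assign(request_id, vms):
            best = min(vms, key=score)
            best["queue"] += 1              # book-keeping for the next request
            return request_id, best["id"]

        print(assign("task-42", vms))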

  15. Wind Resource Estimation using QuikSCAT Ocean Surface Winds

    DEFF Research Database (Denmark)

    Xu, Qing; Zhang, Guosheng; Cheng, Yongcun

    2011-01-01

    In this study, the offshore wind resources in the East China Sea and South China Sea were estimated from over ten years of QuikSCAT scatterometer wind products. Since the errors of these products are larger close to the coast due to the land contamination of the radar backscatter signal...... and the complexity of air-sea interaction processes, an empirical relationship that adjusts QuikSCAT winds in coastal waters was first proposed based on vessel measurements. Then the shape and scale parameters of the Weibull function are determined for wind resource estimation. The wind roses are also plotted. Results...
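
    The record names the Weibull shape and scale parameters as the basis of the resource estimate but does not show the calculation. The sketch below is a generic illustration (synthetic wind speeds standing in for the adjusted QuikSCAT retrievals) of fitting the two-parameter Weibull distribution and deriving the mean wind power density from it.

      import numpy as np
      from scipy.stats import weibull_min
      from scipy.special import gamma

      # Synthetic 10 m wind speeds (m/s) in place of adjusted satellite retrievals.
      speeds = weibull_min.rvs(2.0, scale=8.0, size=5000, random_state=0)

      # Fit the two-parameter Weibull distribution (location fixed at zero).
      k, _, c = weibull_min.fit(speeds, floc=0)

      # Mean wind power density (W/m^2): 0.5 * rho * E[v^3], with
      # E[v^3] = c^3 * Gamma(1 + 3/k) for a Weibull distribution.
      rho_air = 1.225
      power_density = 0.5 * rho_air * c**3 * gamma(1 + 3.0 / k)
      print(f"shape k = {k:.2f}, scale c = {c:.2f} m/s, "
            f"mean power density = {power_density:.0f} W/m^2")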

  16. Speculative resources of uranium. A review of International Uranium Resources Evaluation Project (IUREP) estimates 1982-1983

    International Nuclear Information System (INIS)

    1983-01-01

    On a country by country basis the International Uranium Resources Evaluation Project (IUREP) estimates for 1982-1983 are reviewed. The information provided includes exploration work, airborne surveys, radiometric surveys, gamma-ray spectrometric surveys, estimates of speculative resources, uranium occurrences, uranium deposits, uranium mineralization, agreements for uranium exploration, feasibility studies, geological classification of resources, proposed revised resource ranges, and production estimates of uranium

  17. Decentralized Resource Management in Distributed Computer Systems.

    Science.gov (United States)

    1982-02-01

    directly exchanging user state information. Eventcounts and sequencers correspond to semaphores in the sense that synchronization primitives are used to...and techniques are required to achieve synchronization in distributed computers without reliance on any centralized entity such as a semaphore ...known solutions to the access synchronization problem was Dijkstra’s semaphore [12]. The importance of the semaphore is that it correctly addresses the

  18. Physical-resource requirements and the power of quantum computation

    International Nuclear Information System (INIS)

    Caves, Carlton M; Deutsch, Ivan H; Blume-Kohout, Robin

    2004-01-01

    The primary resource for quantum computation is the Hilbert-space dimension. Whereas Hilbert space itself is an abstract construction, the number of dimensions available to a system is a physical quantity that requires physical resources. Avoiding a demand for an exponential amount of these resources places a fundamental constraint on the systems that are suitable for scalable quantum computation. To be scalable, the number of degrees of freedom in the computer must grow nearly linearly with the number of qubits in an equivalent qubit-based quantum computer. These considerations rule out quantum computers based on a single particle, a single atom, or a single molecule consisting of a fixed number of atoms or on classical waves manipulated using the transformations of linear optics

  19. Estimation of Total Tree Height from Renewable Resources Evaluation Data

    Science.gov (United States)

    Charles E. Thomas

    1981-01-01

    Many ecological, biological, and genetic studies use the measurement of total tree height. Until recently, the Southern Forest Experiment Station's inventory procedures through Renewable Resources Evaluation (RRE) have not included total height measurements. This note provides equations to estimate total height based on other RRE measurements.

  20. Development of Computer-Based Resources for Textile Education.

    Science.gov (United States)

    Hopkins, Teresa; Thomas, Andrew; Bailey, Mike

    1998-01-01

    Describes the production of computer-based resources for students of textiles and engineering in the United Kingdom. Highlights include funding by the Teaching and Learning Technology Programme (TLTP), courseware author/subject expert interaction, usage test and evaluation, authoring software, graphics, computer-aided design simulation, self-test…

  1. Argonne Laboratory Computing Resource Center - FY2004 Report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.

    2005-04-14

    In the spring of 2002, Argonne National Laboratory founded the Laboratory Computing Resource Center, and in April 2003 LCRC began full operations with Argonne's first teraflops computing cluster. The LCRC's driving mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting application use and development. This report describes the scientific activities, computing facilities, and usage in the first eighteen months of LCRC operation. In this short time LCRC has had broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. Steering for LCRC comes from the Computational Science Advisory Committee, composed of computing experts from many Laboratory divisions. The CSAC Allocations Committee makes decisions on individual project allocations for Jazz.

  2. Wind resource estimation and siting of wind turbines

    DEFF Research Database (Denmark)

    Lundtang Petersen, Erik; Mortensen, N.G.; Landberg, L.

    1994-01-01

    Detailed knowledge of the characteristics of the natural wind is necessary for the design, planning and operational aspects of wind energy systems. Here, we shall only be concerned with those meteorological aspects of wind energy planning that are termed wind resource estimation. The estimation...... of the wind resource ranges from the overall estimation of the mean energy content of the wind over a large area - called regional assessment - to the prediction of the average yearly energy production of a specific wind turbine at a specific location - called siting. A regional assessment will most often...... lead to a so-called wind atlas. A precise prediction of the wind speed at a given site is essential because, for aerodynamic reasons, the power output of a wind turbine is proportional to the third power of the wind speed; hence even small errors in the prediction of wind speed may result in large deviations...
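
    The cube-law sensitivity mentioned at the end of the abstract can be made explicit with a first-order expansion (a standard argument, not quoted from the record):

      % Power output scales with the cube of wind speed, so relative errors triple.
      P \propto v^{3}
      \quad\Longrightarrow\quad
      \frac{\delta P}{P} \approx 3\,\frac{\delta v}{v}

    so, for example, a 10% error in the predicted mean wind speed translates into roughly a 30% error in the predicted power output.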

  3. ResourceGate: A New Solution for Cloud Computing Resource Allocation

    OpenAIRE

    Abdullah A. Sheikh

    2012-01-01

    Cloud computing has become a focus of both educational and business communities. Their concerns include the need to improve the Quality of Service (QoS) provided, as well as reliability, performance and cost reduction. Cloud computing provides many benefits in terms of low cost and accessibility of data. Ensuring these benefits is considered to be a major factor in the cloud computing environment. This paper surveys recent research related to cloud computing resource al...

  4. Performance Evaluation of Resource Management in Cloud Computing Environments.

    Science.gov (United States)

    Batista, Bruno Guazzelli; Estrella, Julio Cezar; Ferreira, Carlos Henrique Gomes; Filho, Dionisio Machado Leite; Nakamura, Luis Hideo Vasconcelos; Reiff-Marganiec, Stephan; Santana, Marcos José; Santana, Regina Helena Carlucci

    2015-01-01

    Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.

  5. Performance Evaluation of Resource Management in Cloud Computing Environments.

    Directory of Open Access Journals (Sweden)

    Bruno Guazzelli Batista

    Full Text Available Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.

  6. Distributed Problem Solving: Adaptive Networks with a Computer Intermediary Resource. Intelligent Executive Computer Communication

    Science.gov (United States)

    1991-06-01

    Proceedings of The National Conference on Artificial Intelligence, pages 181-184, The American Association for Artificial Intelligence, Pittsburgh...Intermediary Resource: Intelligent Executive Computer Communication John Lyman and Carla J. Conaway University of California at Los Angeles for Contracting...Include Security Classification) Interim Report: Distributed Problem Solving: Adaptive Networks With a Computer Intermediary Resource: Intelligent

  7. Integration of cloud resources in the LHCb distributed computing

    International Nuclear Information System (INIS)

    García, Mario Úbeda; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel; Muñoz, Víctor Méndez

    2014-01-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it was seamlessly integrating Grid resources and Computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack) – it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, the pros and cons of a cloud based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.

  8. Integration of Cloud resources in the LHCb Distributed Computing

    Science.gov (United States)

    Úbeda García, Mario; Méndez Muñoz, Víctor; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-06-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it was seamlessly integrating Grid resources and Computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack) - it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, the pros and cons of a cloud based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.

  9. Processing Optimization of Typed Resources with Synchronized Storage and Computation Adaptation in Fog Computing

    Directory of Open Access Journals (Sweden)

    Zhengyang Song

    2018-01-01

    Full Text Available Wide application of the Internet of Things (IoT) system has been increasingly demanding more hardware facilities for processing various resources including data, information, and knowledge. With the rapid growth of generated resource quantity, it is difficult to adapt to this situation by using traditional cloud computing models. Fog computing enables storage and computing services to perform at the edge of the network to extend cloud computing. However, there are some problems such as restricted computation, limited storage, and expensive network bandwidth in Fog computing applications. It is a challenge to balance the distribution of network resources. We propose a processing optimization mechanism of typed resources with synchronized storage and computation adaptation in Fog computing. In this mechanism, we process typed resources in a wireless-network-based three-tier architecture consisting of Data Graph, Information Graph, and Knowledge Graph. The proposed mechanism aims to minimize processing cost over network, computation, and storage while maximizing the performance of processing in a business value driven manner. Simulation results show that the proposed approach improves the ratio of performance over user investment. Meanwhile, conversions between resource types deliver support for dynamically allocating network resources.

  10. Uncertainty Estimate in Resources Assessment: A Geostatistical Contribution

    International Nuclear Information System (INIS)

    Souza, Luis Eduardo de; Costa, Joao Felipe C. L.; Koppe, Jair C.

    2004-01-01

    For many decades the mining industry regarded resources/reserves estimation and classification as a mere calculation requiring basic mathematical and geological knowledge. Most methods were based on geometrical procedures and spatial data distribution. Therefore, uncertainty associated with tonnages and grades either was ignored or mishandled, although various mining codes require a measure of confidence in the values reported. Traditional methods fail in reporting the level of confidence in the quantities and grades. Conversely, kriging is known to provide the best estimate and its associated variance. Among kriging methods, Ordinary Kriging (OK) probably is the most widely used one for mineral resource/reserve estimation, mainly because of its robustness and its facility in uncertainty assessment by using the kriging variance. It also is known that the OK variance is unable to recognize local data variability, an important issue when heterogeneous mineral deposits with higher and poorer grade zones are being evaluated. Alternatively, stochastic simulations are used to build local or global uncertainty about a geological attribute respecting its statistical moments. This study investigates methods capable of incorporating uncertainty into the estimates of resources and reserves via OK and sequential Gaussian and sequential indicator simulation. The results showed that for the type of mineralization studied all methods classified the tonnages similarly. The methods are illustrated using an exploration drill hole data set from a large Brazilian coal deposit
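
    As an illustration of the ordinary kriging estimate and its kriging variance mentioned in the abstract, the sketch below solves a tiny one-dimensional OK system with an assumed exponential covariance model; the data, covariance parameters and helper names are illustrative, not from the study.

      import numpy as np

      def exp_cov(h, sill=1.0, rng=50.0):
          # Assumed exponential covariance model (illustrative choice).
          return sill * np.exp(-np.abs(h) / rng)

      def ordinary_kriging(x_obs, z_obs, x0):
          n = len(x_obs)
          # Left-hand side: sample-to-sample covariances plus the unbiasedness constraint.
          a = np.ones((n + 1, n + 1))
          a[:n, :n] = exp_cov(x_obs[:, None] - x_obs[None, :])
          a[n, n] = 0.0
          # Right-hand side: sample-to-target covariances.
          b = np.ones(n + 1)
          b[:n] = exp_cov(x_obs - x0)
          sol = np.linalg.solve(a, b)
          w, mu = sol[:n], sol[n]
          estimate = float(w @ z_obs)
          # Ordinary kriging variance: C(0) - sum_i w_i C(x_i, x0) - mu
          variance = float(exp_cov(0.0) - w @ b[:n] - mu)
          return estimate, variance

      x = np.array([10.0, 35.0, 60.0, 90.0])   # sample locations (m)
      z = np.array([5.2, 6.1, 4.8, 5.5])       # grades at those samples
      est, var = ordinary_kriging(x, z, 50.0)
      print(f"OK estimate = {est:.2f}, kriging variance = {var:.3f}")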

  11. Estimating uranium resources and production. A guide to future supply

    International Nuclear Information System (INIS)

    Taylor, D.M.; Haeussermann, W.

    1983-01-01

    Nuclear power can only continue to grow if sufficient fuel, uranium, is available. Concern has been expressed that, in the not too distant future, the supply of uranium may be inadequate to meet reactor development. This will not be the case. Uranium production capability, actual and planned, is the main indicator of short- and medium-term supply. However, for the longer term, uranium resource estimates and projections of the possible rate of production from the resource base are important. Once an estimate has been made of the resources contained in a deposit, several factors influence the decision to produce the uranium and also the rates at which the uranium can be produced. The effect of these factors, which include uranium market trends and ever increasing lead times from discovery to production, must be taken into account when making projections of future production capability and before comparing these with forecasts of future uranium requirements. The uranium resource base has developed over the last two decades mainly in response to dramatically changing projections of natural uranium requirements. A study of this development and the changes in production, together with the most recent data, shows that in the short- and medium-term, production from already discovered resources should be sufficient to cover any likely reactor requirements. Studies such as those undertaken during the International Uranium Resources Evaluation Project, and others which project future discovery rates and production, are supported by past experience in resource development in showing that uranium supply could continue to meet demand until well into the next century. The uranium supply potential has lessened the need for the early large-scale global introduction of the breeder reactor

  12. Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources

    International Nuclear Information System (INIS)

    Evans, D; Fisk, I; Holzman, B; Pordes, R; Tiradani, A; Melo, A; Sheldon, P; Metson, S

    2011-01-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly 'on-demand', as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a University, and conclude that it is most cost effective to purchase dedicated resources for the 'base-line' needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage are required.

  13. Estimation of wind and solar resources in Mali

    Energy Technology Data Exchange (ETDEWEB)

    Badger, J.; Kamissoko, F.; Olander Rasmussen, M.; Larsen, Soeren; Guidon, N.; Boye Hansen, L.; Dewilde, L.; Alhousseini, M.; Noergaard, P.; Nygaard, I.

    2012-11-15

    The wind resource has been estimated for all of Mali at 7.5 km resolution using the KAMM/WAsP numerical wind atlas methodology. Three domains were used to cover the entire country, and three sets of wind classes were used to capture the change in large-scale forcing over the country. The final output includes generalized climate statistics for any location in Mali, giving the wind direction and wind speed distribution. The modelled generalized climate statistics can be used directly in the WAsP software. The preliminary results show a wind resource which is relatively low, but which under certain conditions may be economically feasible, i.e. at favourably exposed sites giving enhanced winds, and where practical utilization is possible, given consideration to grid connection or to replacement or augmentation of diesel-based electricity systems. The solar energy resource for Mali was assessed for the period between July 2008 and June 2011 using a remote sensing based estimate of the down-welling surface shortwave flux. The remote sensing estimates were adjusted on a month-by-month basis to account for seasonal differences between the remote sensing estimates and in situ data. Calibration was found to improve the coefficient of determination as well as to decrease the mean error, both for the calibration and validation data. Compared to the results presented in the ''Renewable energy resources in Mali - preliminary mapping'' report, which showed a tendency for underestimation compared to data from the NASA POWER/SSE database, the presented results show very good agreement with the in situ data (after calibration), with no significant bias. Unfortunately, the NASA database only contains data up until 2005, so a similar comparison could not be done for the time period analyzed in this study, although the agreement with the historic NASA data is still useful as reference. (LN)

  14. Estimating the Ground Water Resources of Atoll Islands

    Directory of Open Access Journals (Sweden)

    Arne E. Olsen

    2010-01-01

    Full Text Available Ground water resources of atolls, already minimal due to the small surface area and low elevation of the islands, are also subject to recurring, and sometimes devastating, droughts. As ground water becomes the sole fresh water source when rain catchment supplies are exhausted, it is critical to assess current groundwater resources and predict their depletion during drought conditions. Several published models, both analytical and empirical, are available to estimate the steady-state freshwater lens thickness of small oceanic islands. None fully incorporates the unique shallow geologic characteristics of atoll islands, and none incorporates time-dependent processes. In this paper, we provide a review of these models, and then present a simple algebraic model, derived from the results of a comprehensive numerical modeling study of steady-state atoll island aquifer dynamics, to predict the ground water response to changes in recharge on atoll islands. The model provides an estimate of the thickness of the freshwater lens as a function of annual rainfall rate, island width, Thurber Discontinuity depth, upper aquifer hydraulic conductivity, presence or absence of a confining reef flat plate, and, in the case of drought, time. Results compare favorably with published atoll island lens thickness observations. The algebraic model is incorporated into a spreadsheet interface for use by island water resources managers.
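
    The algebraic model itself is not reproduced in the record. As a rough illustration of the kind of relationship a freshwater-lens estimate rests on, the sketch below uses the classic Ghyben-Herzberg approximation, a textbook result and not the paper's calibrated model, which additionally accounts for island width, Thurber Discontinuity depth, hydraulic conductivity and reef-plate confinement.

      # Ghyben-Herzberg approximation: the depth of the freshwater lens below sea
      # level is roughly 40 times the freshwater head above sea level, set by the
      # density contrast between fresh water and seawater. Illustrative only.
      RHO_FRESH = 1000.0   # kg/m^3
      RHO_SALT = 1025.0    # kg/m^3

      def lens_depth(head_above_msl_m: float) -> float:
          ratio = RHO_FRESH / (RHO_SALT - RHO_FRESH)   # ~40
          return ratio * head_above_msl_m

      for head in (0.1, 0.3, 0.5):
          print(f"head {head:.1f} m -> lens depth ~{lens_depth(head):.0f} m below sea level")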

  15. State of the art on wind resource estimation

    Energy Technology Data Exchange (ETDEWEB)

    Maribo Pedersen, B.

    1998-12-31

    With the increasing number of wind resource estimation studies carried out for regions, countries and even larger areas all over the world, the IEA finds that the time has come to stop and take stock of the various methods used in these studies. The IEA would therefore like to propose an Experts Meeting on wind resource estimation. The Experts Meeting should describe the models and databases used in the various studies. It should shed light on the strengths and shortcomings of the models and answer questions like: where and under what circumstances should a specific model be used? What is the expected accuracy of the estimate of the model? And what is its applicability? When addressing databases the main goal will be to identify their content and scope. Further, the quality, availability and reliability of the databases must also be recognised. In the various studies of wind resources the models and databases have been combined in different ways. A final goal of the Experts Meeting is to see whether it is possible to develop systems of methods which would depend on the available input. These systems of methods should be able to address the simple case (level 0) of a region with barely any data, up to the complex case of a region with all available measurements: surface observations, radio soundings, satellite observations and so on. The outcome of the meeting should be an inventory of available models as well as databases and a map of already studied regions. (au)

  16. Dose estimation for paediatric cranial computed tomography

    International Nuclear Information System (INIS)

    Curci Daros, K.A.; Bitelli Medeiros, R.; Curci Daros, K.A.; Oliveira Echeimberg, J. de

    2006-01-01

    In the last ten years, the number of paediatric computed tomography (CT) scans has increased worldwide, contributing to a higher population radiation dose. Technique diversification in paediatrics and different CT equipment technologies have led to various exposure levels, complicating precise evaluation of the doses and operational conditions necessary for good quality images. The objective of this study was to establish a quantitative relationship between absorbed dose and cranial region in children up to 6 years old undergoing CT exams. Methods: X-ray was measured on the cranial surface of 64 patients undergoing CT using thermoluminescent (T.L.) dosimeters. Forty T.L.D.100 thermoluminescent dosimeters (T.L.D.) were evenly distributed on each patient's skin surface along the sagittal axis. Measurements were performed in facial regions exposed to scatter radiation and in the supratentorial and posterior fossa regions, submitted to primary radiation. T.L.D. were calibrated for 120 kV X-ray over an acrylic phantom. T.L. measurements were made with a Harshaw 4000 system. Patient mean T.L. readings were determined for each position, p_i, of the T.L.D. and normalized to the maximum supratentorial reading. From integrating the linear T.L. density function (ρ) resulting from the radiation distribution in each of the three exposed regions, the dose fraction was determined in the region of interest, along with the total dose under the technical conditions used in that specific exam protocol. For each T.L.D. position along the patient cranium, there were n T.L. measurements with 2% uncertainty due to the T.L. reader, and 5% due to the thermal treatment of dosimeters. Also, mean T.L. readings and their uncertainties were calculated for each patient at each position, p. Results: The mean linear T.L. density for the region exposed to secondary radiation, defined by position 0.3≤p≤6 cm, was ρ(p) = 7.9(4)x10^-2 + 7(5)x10^-5 p^4.5(4) cm^-1; exposed to primary X-ray for the posterior fossa region defined by position

  17. Dose estimation for paediatric cranial computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Curci Daros, K.A.; Bitelli Medeiros, R. [Sao Paulo Univ. Federal (Brazil); Curci Daros, K.A.; Oliveira Echeimberg, J. de [Centro Univ. Sao Camilo, Sao Paulo (Brazil)

    2006-07-01

    In the last ten years, the number of paediatric computed tomography (CT) scans has increased worldwide, contributing to a higher population radiation dose. Technique diversification in paediatrics and different CT equipment technologies have led to various exposure levels, complicating precise evaluation of the doses and operational conditions necessary for good quality images. The objective of this study was to establish a quantitative relationship between absorbed dose and cranial region in children up to 6 years old undergoing CT exams. Methods: X-ray was measured on the cranial surface of 64 patients undergoing CT using thermoluminescent (T.L.) dosimeters. Forty T.L.D.100 thermoluminescent dosimeters (T.L.D.) were evenly distributed on each patient's skin surface along the sagittal axis. Measurements were performed in facial regions exposed to scatter radiation and in the supratentorial and posterior fossa regions, submitted to primary radiation. T.L.D. were calibrated for 120 kV X-ray over an acrylic phantom. T.L. measurements were made with a Harshaw 4000 system. Patient mean T.L. readings were determined for each position, p_i, of the T.L.D. and normalized to the maximum supratentorial reading. From integrating the linear T.L. density function (ρ) resulting from the radiation distribution in each of the three exposed regions, the dose fraction was determined in the region of interest, along with the total dose under the technical conditions used in that specific exam protocol. For each T.L.D. position along the patient cranium, there were n T.L. measurements with 2% uncertainty due to the T.L. reader, and 5% due to the thermal treatment of dosimeters. Also, mean T.L. readings and their uncertainties were calculated for each patient at each position, p. Results: The mean linear T.L. density for the region exposed to secondary radiation, defined by position 0.3≤p≤6 cm, was ρ(p) = 7.9(4)x10^-2 + 7(5)x10^-5 p^4.5(4) cm^-1; exposed to primary X-ray for the posterior fossa
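
    The abstract describes obtaining the dose fraction by integrating the linear T.L. density over each exposed region. The sketch below integrates the central values of the reported density for the scatter-radiation region over the quoted 0.3-6 cm interval; the quoted uncertainties are ignored, and the result is in the same relative (normalized) units as the T.L. readings.

      from scipy.integrate import quad

      # Central values of the reported linear T.L. density for the
      # scatter-radiation region; p is the position along the sagittal axis in cm.
      def rho_scatter(p: float) -> float:
          return 7.9e-2 + 7e-5 * p**4.5

      # Integrate over the region 0.3 <= p <= 6 cm quoted for secondary radiation.
      integral, _ = quad(rho_scatter, 0.3, 6.0)
      print(f"integrated T.L. density over the scatter region: {integral:.3f} (relative units)")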

  18. Shared-resource computing for small research labs.

    Science.gov (United States)

    Ackerman, M J

    1982-04-01

    A real time laboratory computer network is described. This network is composed of four real-time laboratory minicomputers located in each of four division laboratories and a larger minicomputer in a centrally located computer room. Off the shelf hardware and software were used with no customization. The network is configured for resource sharing using DECnet communications software and the RSX-11-M multi-user real-time operating system. The cost effectiveness of the shared resource network and multiple real-time processing using priority scheduling is discussed. Examples of utilization within a medical research department are given.

  19. Assessment of Computer Software Usage for Estimating and Tender ...

    African Journals Online (AJOL)

    It has been discovered that there are limitations to the use of computer software packages in construction operations, especially estimating and tender analysis. The objectives of this research are to evaluate the level of computer software usage for estimating and tender analysis while also assessing the challenges faced by ...

  20. Using OSG Computing Resources with (iLC)Dirac

    CERN Document Server

    AUTHOR|(SzGeCERN)683529; Petric, Marko

    2017-01-01

    CPU cycles for small experiments and projects can be scarce, thus making use of all available resources, whether dedicated or opportunistic, is mandatory. While enabling uniform access to the LCG computing elements (ARC, CREAM), the DIRAC grid interware was not able to use OSG computing elements (GlobusCE, HTCondor-CE) without dedicated support at the grid site through so-called 'SiteDirectors', which directly submit to the local batch system. This in turn requires additional dedicated effort for small experiments at the grid site. Adding interfaces to the OSG CEs through the respective grid middleware therefore allows accessing them within the DIRAC software without additional site-specific infrastructure. This enables greater use of opportunistic resources for experiments and projects without dedicated clusters or an established computing infrastructure with the DIRAC software. To allow sending jobs to HTCondor-CE and legacy Globus computing elements inside DIRAC, the required wrapper classes were develo...

  1. Estimation of intermediate grade uranium resources. Final report

    International Nuclear Information System (INIS)

    Lambie, F.W.; Kendall, G.R.; Klahn, L.J.; Davis, J.C.; Harbaugh, J.W.

    1980-12-01

    The purpose of this project is to analyze the technique currently used by DOE to estimate intermediate grade uranium (0.01 to 0.05% U3O8) and, if possible, suggest alternatives to improve the accuracy and precision of the estimate. There are three principal conclusions resulting from this study. They relate to the quantity, distribution and sampling of intermediate grade uranium. While the results of this study must be validated further, they indicate that DOE may be underestimating intermediate level reserves by 20 to 30%. Plots of grade of U3O8 versus tonnage of ore and tonnage U3O8 indicate grade-tonnage relationships that are essentially log-linear, at least down to 0.01% U3O8. Though this is not an unexpected finding, it may provide a technique for reducing the uncertainty of intermediate grade endowment. The results of this study indicate that a much lower drill hole density is necessary for DOE to estimate uranium resources than for a mining company to calculate ore resources. Though errors in local estimates will occur, they will tend to cancel over the entire deposit
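
    The log-linear grade-tonnage relationship noted in the abstract can be illustrated with a simple fit; the grade-tonnage pairs below are invented for illustration only, and the fitted line is then used to read off tonnage above an intermediate cutoff grade.

      import numpy as np

      # Invented grade-tonnage pairs (cutoff grade in % U3O8, tonnage of ore in t).
      grade = np.array([0.01, 0.02, 0.03, 0.04, 0.05])
      tonnage = np.array([9.0e6, 4.1e6, 2.2e6, 1.3e6, 0.8e6])

      # Fit log10(tonnage) as a linear function of log10(grade), i.e. a log-linear curve.
      slope, intercept = np.polyfit(np.log10(grade), np.log10(tonnage), 1)

      def tonnage_above(g):
          return 10 ** (intercept + slope * np.log10(g))

      print(f"slope = {slope:.2f}, tonnage above 0.015% U3O8 ~ {tonnage_above(0.015):.2e} t")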

  2. Integration of Openstack cloud resources in BES III computing cluster

    Science.gov (United States)

    Li, Haibo; Cheng, Yaodong; Huang, Qiulan; Cheng, Zhenjing; Shi, Jingyan

    2017-10-01

    Cloud computing provides a new technical means for the data processing of high energy physics experiments. However, in a traditional job management system the resources of each queue are fixed and their usage is static. In order to make it simple and transparent for physicists to use, we developed a virtual cluster system (vpmanager) to integrate IHEPCloud and different batch systems such as Torque and HTCondor. Vpmanager provides dynamic virtual machine scheduling according to the job queue. The BES III use case results show that resource efficiency is greatly improved.
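
    The record states only that vpmanager schedules virtual machines dynamically according to the job queue. The sketch below is a generic, assumed illustration of such queue-driven scaling; the function names, the jobs-per-VM ratio and the limits are not taken from the paper.

      # Target number of worker VMs follows the number of waiting jobs.
      def target_vm_count(waiting_jobs: int, jobs_per_vm: int, max_vms: int) -> int:
          needed = -(-waiting_jobs // jobs_per_vm)   # ceiling division
          return min(needed, max_vms)

      def reconcile(running_vms: int, waiting_jobs: int,
                    jobs_per_vm: int = 8, max_vms: int = 100) -> int:
          """Return how many VMs to start (positive) or stop (negative)."""
          return target_vm_count(waiting_jobs, jobs_per_vm, max_vms) - running_vms

      print(reconcile(running_vms=5, waiting_jobs=120))   # start 10 more VMs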

  3. Computer-aided resource planning and scheduling for radiological services

    Science.gov (United States)

    Garcia, Hong-Mei C.; Yun, David Y.; Ge, Yiqun; Khan, Javed I.

    1996-05-01

    There exists tremendous opportunity in hospital-wide resource optimization based on system integration. This paper defines the resource planning and scheduling requirements integral to PACS, RIS and HIS integration. A multi-site case study is conducted to define the requirements. A well-tested planning and scheduling methodology, called the Constrained Resource Planning model, has been applied to the chosen problem of radiological service optimization. This investigation focuses on resource optimization issues for minimizing the turnaround time to increase clinical efficiency and customer satisfaction, particularly in cases where the scheduling of multiple exams is required for a patient. How best to combine information system efficiency and human intelligence in improving radiological services is described. Finally, an architecture for interfacing a computer-aided resource planning and scheduling tool with the existing PACS, HIS and RIS implementation is presented.

  4. A Semi-Preemptive Computational Service System with Limited Resources and Dynamic Resource Ranking

    Directory of Open Access Journals (Sweden)

    Fang-Yie Leu

    2012-03-01

    Full Text Available In this paper, we integrate a grid system and a wireless network to present a convenient computational service system, called the Semi-Preemptive Computational Service system (SePCS for short), which provides users with a wireless access environment and through which a user can share his/her resources with others. In the SePCS, each node is dynamically given a score based on its CPU level, available memory size, current length of waiting queue, CPU utilization and bandwidth. With the scores, resource nodes are classified into three levels. User requests, based on their time constraints, are also classified into three types. Resources of higher levels are allocated to more tightly constrained requests so as to increase the total performance of the system. To achieve this, a resource broker with the Semi-Preemptive Algorithm (SPA) is also proposed. When the resource broker cannot find suitable resources for a request of a higher type, it preempts a resource that is currently executing a lower-type request so that the higher-type request can be executed immediately. The SePCS can be applied to a Vehicular Ad Hoc Network (VANET), whose users can then exploit convenient mobile network services and wireless distributed computing. As a result, the performance of the system is higher than that of the tested schemes.
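
    The abstract lists the quantities that enter a node's score (CPU level, available memory, waiting-queue length, CPU utilization, bandwidth) and says nodes are classified into three levels, but gives no formula. The sketch below is an assumed, illustrative scoring and classification; the weights and thresholds are not taken from the paper.

      from dataclasses import dataclass

      @dataclass
      class Node:
          cpu_level: float   # normalized 0..1
          free_mem: float    # normalized 0..1
          queue_len: int     # jobs currently waiting on the node
          cpu_util: float    # 0..1, lower is better
          bandwidth: float   # normalized 0..1

      def score(n: Node) -> float:
          # Assumed weighted combination of the factors named in the abstract.
          return (0.3 * n.cpu_level + 0.2 * n.free_mem + 0.2 * n.bandwidth
                  + 0.2 * (1.0 - n.cpu_util) + 0.1 / (1 + n.queue_len))

      def level(n: Node) -> int:
          s = score(n)
          return 1 if s >= 0.7 else 2 if s >= 0.4 else 3   # 1 = highest level

      node = Node(cpu_level=0.9, free_mem=0.6, queue_len=2, cpu_util=0.3, bandwidth=0.8)
      print(f"score = {score(node):.2f}, level = {level(node)}")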

  5. Active resources concept of computation for enterprise software

    Directory of Open Access Journals (Sweden)

    Koryl Maciej

    2017-06-01

    Full Text Available Traditional computational models for enterprise software are still to a great extent centralized. However, the rapid growth of modern computation techniques and frameworks means that contemporary software is becoming more and more distributed. Towards the development of a new, complete and coherent solution for distributed enterprise software construction, a synthesis of three well-grounded concepts is proposed: the Domain-Driven Design technique of software engineering, the REST architectural style and the actor model of computation. As a result a new resource-based framework arises, which after its first cases of use seems to be useful and worthy of further research.

  6. GridFactory - Distributed computing on ephemeral resources

    DEFF Research Database (Denmark)

    Orellana, Frederik; Niinimaki, Marko

    2011-01-01

    A novel batch system for high throughput computing is presented. The system is specifically designed to leverage virtualization and web technology to facilitate deployment on cloud and other ephemeral resources. In particular, it implements a security model suited for forming collaborations...

  7. Can the Teachers' Creativity Overcome Limited Computer Resources?

    Science.gov (United States)

    Nikolov, Rumen; Sendova, Evgenia

    1988-01-01

    Describes experiences of the Research Group on Education (RGE) at the Bulgarian Academy of Sciences and the Ministry of Education in using limited computer resources when teaching informatics. Topics discussed include group projects; the use of Logo; ability grouping; and out-of-class activities, including publishing a pupils' magazine. (13…

  8. Recent development of computational resources for new antibiotics discovery

    DEFF Research Database (Denmark)

    Kim, Hyun Uk; Blin, Kai; Lee, Sang Yup

    2017-01-01

    Understanding a complex working mechanism of biosynthetic gene clusters (BGCs) encoding secondary metabolites is a key to discovery of new antibiotics. Computational resources continue to be developed in order to better process increasing volumes of genome and chemistry data, and thereby better...

  9. Canada's forest biomass resources: deriving estimates from Canada's forest inventory

    International Nuclear Information System (INIS)

    Penner, M.; Power, K.; Muhairwe, C.; Tellier, R.; Wang, Y.

    1997-01-01

    A biomass inventory for Canada was undertaken to address the data needs of carbon budget modelers, specifically to provide estimates of above-ground tree components and of non-merchantable trees in Canadian forests. The objective was to produce a national method for converting volume estimates to biomass that was standardized, repeatable across the country, efficient and well documented. Different conversion methods were used for low productivity forests (productivity class 1) and higher productivity forests (productivity class 2). The conversion factors were computed by constructing hypothetical stands for each site, age, species and province combination, and estimating the merchantable volume and all the above-ground biomass components from suitable published equations. This report documents the procedures for deriving the national biomass inventory, and provides illustrative examples of the results. 46 refs., 9 tabs., 5 figs

  10. Computing Resource And Work Allocations Using Social Profiles

    Directory of Open Access Journals (Sweden)

    Peter Lavin

    2013-01-01

    Full Text Available If several distributed and disparate computer resources exist, many of which have been created for different and diverse reasons, and several large scale computing challenges also exist with similar diversity in their backgrounds, then one problem which arises in trying to assemble enough of these resources to address such challenges is the need to align and accommodate the different motivations and objectives which may lie behind the existence of both the resources and the challenges. Software agents are offered as a mainstream technology for modelling the types of collaborations and relationships needed to do this. As an initial step towards forming such relationships, agents need a mechanism to consider social and economic backgrounds. This paper explores addressing social and economic differences using a combination of textual descriptions known as social profiles and search engine technology, both of which are integrated into an agent technology.

  11. Quantum Computing: Selected Internet Resources for Librarians, Researchers, and the Casually Curious

    OpenAIRE

    Cirasella, Jill

    2009-01-01

    This article is an annotated selection of the most important and informative Internet resources for learning about quantum computing, finding quantum computing literature, and tracking quantum computing news.

  12. Towards minimal resources of measurement-based quantum computation

    International Nuclear Information System (INIS)

    Perdrix, Simon

    2007-01-01

    We improve the upper bound on the minimal resources required for measurement-only quantum computation (M A Nielsen 2003 Phys. Rev. A 308 96-100; D W Leung 2004 Int. J. Quantum Inform. 2 33; S Perdrix 2005 Int. J. Quantum Inform. 3 219-23). Minimizing the resources required for this model is a key issue for experimental realization of a quantum computer based on projective measurements. This new upper bound also allows one to reply in the negative to the open question presented by Perdrix (2004 Proc. Quantum Communication Measurement and Computing) about the existence of a trade-off between observable and ancillary qubits in measurement-only QC

  13. Bioinspired Computational Approach to Missing Value Estimation

    Directory of Open Access Journals (Sweden)

    Israel Edem Agbehadji

    2018-01-01

    Full Text Available Missing data occurs when values of variables in a dataset are not stored. Estimating these missing values is a significant step during the data cleansing phase of a big data management approach. Missing data may be due to nonresponse or omitted entries. If these missing data are not handled properly, they may lead to inaccurate results during data analysis. Although traditional methods such as the maximum likelihood method can extrapolate missing values, this paper proposes a bioinspired method based on the behavior of birds, specifically the Kestrel bird. This paper describes the behavior and characteristics of the Kestrel bird, a bioinspired approach, in modeling an algorithm to estimate missing values. The proposed algorithm (KSA) was compared with the WSAMP, Firefly, and BAT algorithms. The results were evaluated using the mean absolute error (MAE). Statistical tests (the Wilcoxon signed-rank test and the Friedman test) were conducted to test the performance of the algorithms. The results of the Wilcoxon test indicate that time does not have a significant effect on performance and that the difference in estimation quality between the paired algorithms was significant; the Friedman test ranked KSA as the best evolutionary algorithm.
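
    The evaluation described (mean absolute error plus a Wilcoxon signed-rank test on paired results) can be reproduced generically; the sketch below uses invented numbers for the true values and for two competing imputation methods, purely to show the mechanics of the comparison.

      import numpy as np
      from scipy.stats import wilcoxon

      true_vals = np.array([3.1, 4.0, 5.2, 6.3, 2.9, 4.4, 5.0, 3.8])
      est_a = np.array([3.0, 4.2, 5.0, 6.0, 3.1, 4.5, 4.8, 3.9])   # imputations, method A
      est_b = np.array([3.5, 3.6, 5.6, 6.9, 2.4, 4.9, 5.5, 3.2])   # imputations, method B

      mae_a = np.mean(np.abs(true_vals - est_a))
      mae_b = np.mean(np.abs(true_vals - est_b))

      # Paired comparison of the per-value absolute errors of the two methods.
      stat, p_value = wilcoxon(np.abs(true_vals - est_a), np.abs(true_vals - est_b))
      print(f"MAE A = {mae_a:.3f}, MAE B = {mae_b:.3f}, Wilcoxon p = {p_value:.3f}")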

  14. Computing Bounds on Resource Levels for Flexible Plans

    Science.gov (United States)

    Muscettola, Nicola; Rijsman, David

    2009-01-01

    A new algorithm efficiently computes the tightest exact bound on the levels of resources induced by a flexible activity plan. Tightness of bounds is extremely important for computations involved in planning because tight bounds can save potentially exponential amounts of search (through early backtracking and detection of solutions), relative to looser bounds. The bound computed by the new algorithm, denoted the resource-level envelope, constitutes the measure of maximum and minimum consumption of resources at any time for all fixed-time schedules in the flexible plan. At each time, the envelope guarantees that there are two fixed-time instantiations, one that produces the minimum level and one that produces the maximum level. Therefore, the resource-level envelope is the tightest possible resource-level bound for a flexible plan because any tighter bound would exclude the contribution of at least one fixed-time schedule. If the resource-level envelope can be computed efficiently, one could substitute looser bounds that are currently used in the inner cores of constraint-posting scheduling algorithms, with the potential for great improvements in performance. What is needed to reduce the cost of computation is an algorithm whose complexity is no greater than a low-degree polynomial in N (where N is the number of activities). The new algorithm satisfies this need. In this algorithm, the computation of resource-level envelopes is based on a novel combination of (1) the theory of shortest paths in the temporal-constraint network for the flexible plan and (2) the theory of maximum flows for a flow network derived from the temporal and resource constraints. The asymptotic complexity of the algorithm is O(N·maxflow(N)), where O(x) denotes an amount of computing time or a number of arithmetic operations proportional to a number of the order of x, and maxflow(N) is the measure of complexity (and thus of cost) of a maximum-flow

  15. Using Puppet to contextualize computing resources for ATLAS analysis on Google Compute Engine

    International Nuclear Information System (INIS)

    Öhman, Henrik; Panitkin, Sergey; Hendrix, Valerie

    2014-01-01

    With the advent of commercial as well as institutional and national clouds, new opportunities for on-demand computing resources for the HEP community become available. The new cloud technologies also come with new challenges, one of which is the contextualization of computing resources with regard to the requirements of the user and his experiment. In particular, on Google's new cloud platform Google Compute Engine (GCE), upload of users' virtual machine images is not possible. This precludes the application of ready-to-use technologies like CernVM and forces users to build and contextualize their own VM images from scratch. We investigate the use of Puppet to facilitate the contextualization of cloud resources on GCE, with particular regard to ease of configuration and dynamic resource scaling.

  16. SKEMA - A computer code to estimate atmospheric dispersion

    International Nuclear Information System (INIS)

    Sacramento, A.M. do.

    1985-01-01

    This computer code is a modified version of the DWNWND code, developed at Oak Ridge National Laboratory. The SKEMA code estimates the concentration in air of a material released to the atmosphere by a point source. (C.M.) [pt
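
    The record does not give SKEMA's dispersion model. As a generic illustration of estimating the air concentration downwind of a continuous point source, the sketch below evaluates a standard ground-level Gaussian plume formula with reflection; this is a textbook expression, not necessarily what SKEMA or DWNWND implements, and all the numbers are arbitrary.

      import numpy as np

      def gaussian_plume_ground(q, u, sigma_y, sigma_z, y, h):
          """Ground-level concentration (g/m^3) downwind of a continuous point source.

          q: emission rate (g/s); u: wind speed (m/s); sigma_y, sigma_z: dispersion
          parameters (m) at the downwind distance of interest; y: crosswind offset (m);
          h: effective release height (m). Gaussian plume with ground reflection.
          """
          return (q / (np.pi * u * sigma_y * sigma_z)
                  * np.exp(-y**2 / (2 * sigma_y**2))
                  * np.exp(-h**2 / (2 * sigma_z**2)))

      print(gaussian_plume_ground(q=10.0, u=3.0, sigma_y=80.0, sigma_z=40.0, y=0.0, h=50.0))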

  17. Dynamic integration of remote cloud resources into local computing clusters

    Energy Technology Data Exchange (ETDEWEB)

    Fleig, Georg; Erli, Guenther; Giffels, Manuel; Hauth, Thomas; Quast, Guenter; Schnepf, Matthias [Institut fuer Experimentelle Kernphysik, Karlsruher Institut fuer Technologie (Germany)

    2016-07-01

    In modern high-energy physics (HEP) experiments enormous amounts of data are analyzed and simulated. Traditionally dedicated HEP computing centers are built or extended to meet this steadily increasing demand for computing resources. Nowadays it is more reasonable and more flexible to utilize computing power at remote data centers providing regular cloud services to users as they can be operated in a more efficient manner. This approach uses virtualization and allows the HEP community to run virtual machines containing a dedicated operating system and transparent access to the required software stack on almost any cloud site. The dynamic management of virtual machines depending on the demand for computing power is essential for cost efficient operation and sharing of resources with other communities. For this purpose the EKP developed the on-demand cloud manager ROCED for dynamic instantiation and integration of virtualized worker nodes into the institute's computing cluster. This contribution will report on the concept of our cloud manager and the implementation utilizing a remote OpenStack cloud site and a shared HPC center (bwForCluster located in Freiburg).

  18. Common accounting system for monitoring the ATLAS distributed computing resources

    International Nuclear Information System (INIS)

    Karavakis, E; Andreeva, J; Campana, S; Saiz, P; Gayazov, S; Jezequel, S; Sargsyan, L; Schovancova, J; Ueda, I

    2014-01-01

    This paper covers in detail a variety of accounting tools used to monitor the utilisation of the available computational and storage resources within the ATLAS Distributed Computing during the first three years of Large Hadron Collider data taking. The Experiment Dashboard provides a set of common accounting tools that combine monitoring information originating from many different information sources; either generic or ATLAS specific. This set of tools provides quality and scalable solutions that are flexible enough to support the constantly evolving requirements of the ATLAS user community.

  19. Adaptive Management of Computing and Network Resources for Spacecraft Systems

    Science.gov (United States)

    Pfarr, Barbara; Welch, Lonnie R.; Detter, Ryan; Tjaden, Brett; Huh, Eui-Nam; Szczur, Martha R. (Technical Monitor)

    2000-01-01

    It is likely that NASA's future spacecraft systems will consist of distributed processes which will handle dynamically varying workloads in response to perceived scientific events, the spacecraft environment, spacecraft anomalies and user commands. Since all situations and possible uses of sensors cannot be anticipated during pre-deployment phases, an approach for dynamically adapting the allocation of distributed computational and communication resources is needed. To address this, we are evolving the DeSiDeRaTa adaptive resource management approach to enable reconfigurable ground and space information systems. The DeSiDeRaTa approach embodies a set of middleware mechanisms for adapting resource allocations, and a framework for reasoning about the real-time performance of distributed application systems. The framework and middleware will be extended to accommodate (1) the dynamic aspects of intra-constellation network topologies, and (2) the complete real-time path from the instrument to the user. We are developing a ground-based testbed that will enable NASA to perform early evaluation of adaptive resource management techniques without the expense of first deploying them in space. The benefits of the proposed effort are numerous, including the ability to use sensors in new ways not anticipated at design time; the production of information technology that ties the sensor web together; the accommodation of greater numbers of missions with fewer resources; and the opportunity to leverage the DeSiDeRaTa project's expertise, infrastructure and models for adaptive resource management for distributed real-time systems.

  20. Parameter Estimation for a Computable General Equilibrium Model

    DEFF Research Database (Denmark)

    Arndt, Channing; Robinson, Sherman; Tarp, Finn

    2002-01-01

    We introduce a maximum entropy approach to parameter estimation for computable general equilibrium (CGE) models. The approach applies information theory to estimating a system of non-linear simultaneous equations. It has a number of advantages. First, it imposes all general equilibrium constraints...

  1. Parameter Estimation for a Computable General Equilibrium Model

    DEFF Research Database (Denmark)

    Arndt, Channing; Robinson, Sherman; Tarp, Finn

    We introduce a maximum entropy approach to parameter estimation for computable general equilibrium (CGE) models. The approach applies information theory to estimating a system of nonlinear simultaneous equations. It has a number of advantages. First, it imposes all general equilibrium constraints...

  2. Multicriteria Resource Brokering in Cloud Computing for Streaming Service

    Directory of Open Access Journals (Sweden)

    Chih-Lun Chou

    2015-01-01

    Full Text Available By leveraging cloud computing such as Infrastructure as a Service (IaaS), the outsourcing of computing resources used to support operations, including servers, storage, and networking components, is quite beneficial for various providers of Internet applications. With this increasing trend, resource allocation that both assures QoS via a Service Level Agreement (SLA) and avoids overprovisioning in order to reduce cost becomes a crucial priority and challenge in the design and operation of complex service-based platforms such as streaming services. On the other hand, providers of IaaS are also concerned with their profit performance and energy consumption while offering these virtualized resources. In this paper, considering both service-oriented and infrastructure-oriented criteria, we regard this resource allocation problem as a Multicriteria Decision Making problem and propose an effective trade-off approach based on a goal programming model. To validate its effectiveness, a cloud architecture for a streaming application is addressed and extensive analysis is performed for the related criteria. The results of numerical simulations show that the proposed approach strikes a balance between these conflicting criteria commendably and achieves high cost efficiency.
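
    The goal programming model itself is not given in the record. The sketch below is an assumed toy formulation, only to show the mechanics: two allocation variables, a QoS goal and a cost goal with penalized deviation variables, plus hard capacity and demand constraints, solved as a linear program.

      import numpy as np
      from scipy.optimize import linprog

      # Decision vector: [x1, x2, d1_minus, d1_plus, d2_minus, d2_plus]
      # x1: capacity units for premium streams, x2: for best-effort streams.
      # Goal 1 (QoS): x1 should reach 30 units; d1_minus is the shortfall.
      # Goal 2 (cost): 2*x1 + x2 should stay within 70; d2_plus is the overshoot.
      c = np.array([0, 0, 5, 0, 0, 1])       # penalize QoS shortfall (w=5), cost overshoot (w=1)

      a_eq = np.array([
          [1, 0, 1, -1, 0, 0],               # x1 + d1- - d1+ = 30
          [2, 1, 0, 0, 1, -1],               # 2*x1 + x2 + d2- - d2+ = 70
      ])
      b_eq = np.array([30, 70])

      a_ub = np.array([
          [1, 1, 0, 0, 0, 0],                # x1 + x2 <= 60 (available capacity)
          [0, -1, 0, 0, 0, 0],               # x2 >= 20 (hard demand)
      ])
      b_ub = np.array([60, -20])

      res = linprog(c, A_ub=a_ub, b_ub=b_ub, A_eq=a_eq, b_eq=b_eq, bounds=[(0, None)] * 6)
      x1, x2 = res.x[:2]
      print(f"x1 = {x1:.1f}, x2 = {x2:.1f}, weighted deviation = {res.fun:.1f}")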

  3. The complexity of computing the MCD-estimator

    DEFF Research Database (Denmark)

    Bernholt, T.; Fischer, Paul

    2004-01-01

    In modern statistics the robust estimation of parameters is a central problem, i.e., an estimation that is not or only slightly affected by outliers in the data. The minimum covariance determinant (MCD) estimator (J. Amer. Statist. Assoc. 79 (1984) 871) is probably one of the most important robust estimators of location and scatter. The complexity of computing the MCD, however, was unknown and generally thought to be exponential even if the dimensionality of the data is fixed. Here we present a polynomial time algorithm for MCD for fixed dimension of the data. In contrast, we show that computing the MCD-estimator is NP-hard if the dimension varies. (C) 2004 Elsevier B.V. All rights reserved.
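    For readers who want to try the estimator itself, a minimal usage sketch with the FAST-MCD implementation in scikit-learn is shown below on synthetic data with gross outliers; it illustrates the robustness of the MCD location estimate and is unrelated to the complexity results of the paper.

    ```python
    # Practical illustration of the MCD estimator discussed above, using the
    # FAST-MCD implementation in scikit-learn on synthetic 2-D data with
    # outliers.  This is only a usage sketch, not the polynomial-time
    # algorithm analysed in the cited paper.
    import numpy as np
    from sklearn.covariance import MinCovDet, EmpiricalCovariance

    rng = np.random.default_rng(0)
    inliers = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=200)
    outliers = rng.uniform(low=8, high=12, size=(20, 2))      # gross outliers
    X = np.vstack([inliers, outliers])

    mcd = MinCovDet(random_state=0).fit(X)        # robust location/scatter
    emp = EmpiricalCovariance().fit(X)            # classical, outlier-sensitive

    print("MCD location:      ", mcd.location_)
    print("Classical location:", emp.location_)   # pulled towards the outliers
    ```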

  4. Cost-Benefit Analysis of Computer Resources for Machine Learning

    Science.gov (United States)

    Champion, Richard A.

    2007-01-01

    Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.

  5. Mobile devices and computing cloud resources allocation for interactive applications

    Directory of Open Access Journals (Sweden)

    Krawczyk Henryk

    2017-06-01

    Using mobile devices such as smartphones or iPads for various interactive applications is currently very common. For complex applications, e.g. chess games, the capabilities of these devices are insufficient to run the application in real time. One solution is to use cloud computing. However, this raises an optimization problem of allocating mobile device and cloud resources. An iterative heuristic algorithm for application distribution is proposed. The algorithm minimizes the energy cost of application execution under a constrained execution time.
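    The offload-or-run-locally decision can be sketched with a toy heuristic: for each task, pick the option that meets the deadline at the lowest energy cost. The speeds, energy coefficients and tasks below are hypothetical, and the rule is far simpler than the paper's iterative algorithm.

    ```python
    # Toy offloading heuristic in the spirit of the record above: run each task
    # locally or offload it to the cloud, choosing the cheaper-energy option
    # that still meets the task deadline.  All timing and energy numbers are
    # made up for illustration; this is not the paper's algorithm.
    from dataclasses import dataclass

    @dataclass
    class Task:
        name: str
        cycles: float        # required CPU cycles (Gcycles)
        data_mb: float       # data to transfer if offloaded
        deadline_s: float

    LOCAL_SPEED = 1.0        # Gcycles/s on the mobile device (assumed)
    CLOUD_SPEED = 10.0       # Gcycles/s in the cloud (assumed)
    UPLINK_MBPS = 5.0        # network throughput (assumed)
    E_LOCAL_PER_GCYCLE = 0.9 # Joules per Gcycle executed locally (assumed)
    E_TX_PER_MB = 0.15       # Joules per MB transmitted (assumed)

    def plan(task: Task) -> str:
        t_local = task.cycles / LOCAL_SPEED
        e_local = task.cycles * E_LOCAL_PER_GCYCLE
        t_cloud = task.data_mb * 8 / UPLINK_MBPS + task.cycles / CLOUD_SPEED
        e_cloud = task.data_mb * E_TX_PER_MB
        options = [(e, t, where) for e, t, where in
                   [(e_local, t_local, "local"), (e_cloud, t_cloud, "cloud")]
                   if t <= task.deadline_s]
        return min(options)[2] if options else "infeasible"

    for task in [Task("move-generation", 4.0, 0.2, 3.0),
                 Task("board-render", 0.5, 12.0, 1.0)]:
        print(task.name, "->", plan(task))
    ```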

  6. Complexity estimates based on integral transforms induced by computational units

    Czech Academy of Sciences Publication Activity Database

    Kůrková, Věra

    2012-01-01

    Roč. 33, September (2012), s. 160-167 ISSN 0893-6080 R&D Projects: GA ČR GAP202/11/1368 Institutional research plan: CEZ:AV0Z10300504 Institutional support: RVO:67985807 Keywords : neural networks * estimates of model complexity * approximation from a dictionary * integral transforms * norms induced by computational units Subject RIV: IN - Informatics, Computer Science Impact factor: 1.927, year: 2012

  7. Negative quasi-probability as a resource for quantum computation

    International Nuclear Information System (INIS)

    Veitch, Victor; Ferrie, Christopher; Emerson, Joseph; Gross, David

    2012-01-01

    A central problem in quantum information is to determine the minimal physical resources that are required for quantum computational speed-up and, in particular, for fault-tolerant quantum computation. We establish a remarkable connection between the potential for quantum speed-up and the onset of negative values in a distinguished quasi-probability representation, a discrete analogue of the Wigner function for quantum systems of odd dimension. This connection allows us to resolve an open question on the existence of bound states for magic state distillation: we prove that there exist mixed states outside the convex hull of stabilizer states that cannot be distilled to non-stabilizer target states using stabilizer operations. We also provide an efficient simulation protocol for Clifford circuits that extends to a large class of mixed states, including bound universal states. (paper)

  8. Enabling Grid Computing resources within the KM3NeT computing model

    Directory of Open Access Journals (Sweden)

    Filippidis Christos

    2016-01-01

    KM3NeT is a future European deep-sea research infrastructure hosting a new generation of neutrino detectors that – located at the bottom of the Mediterranean Sea – will open a new window on the universe and answer fundamental questions in both particle physics and astrophysics. International collaborative scientific experiments, like KM3NeT, are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. These experiments, in their majority, adopt computing models consisting of different tiers with several computing centres, providing a specific set of services for the different steps of data processing such as detector calibration, simulation and data filtering, reconstruction and analysis. The computing requirements are extremely demanding and usually span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support these demanding computing requirements we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method for KM3NeT users to utilize the EGI computing resources in a simulation-driven use-case.

  9. A Computationally Efficient Method for Polyphonic Pitch Estimation

    Directory of Open Access Journals (Sweden)

    Ruohua Zhou

    2009-01-01

    This paper presents a computationally efficient method for polyphonic pitch estimation. The method employs the Fast Resonator Time-Frequency Image (RTFI) as the basic time-frequency analysis tool. The approach is composed of two main stages. First, a preliminary pitch estimate is obtained by means of a simple peak-picking procedure in the pitch energy spectrum. This spectrum is calculated from the original RTFI energy spectrum according to harmonic grouping principles. Then incorrect estimates are removed according to spectral irregularity and knowledge of the harmonic structures of the music notes played on commonly used music instruments. The new approach is compared with a variety of other frame-based polyphonic pitch estimation methods, and results demonstrate the high performance and computational efficiency of the approach.
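    The preliminary peak-picking stage can be illustrated on a plain FFT magnitude spectrum of a synthetic two-note mixture; the sketch below uses scipy's find_peaks in place of the RTFI-based pitch energy spectrum, so it shows only the idea, not the published method.

    ```python
    # Sketch of a peak-picking stage applied to a plain FFT magnitude spectrum
    # of a synthetic two-note mixture.  The paper uses the Fast RTFI and
    # harmonic grouping; this only illustrates the peak-picking idea.
    import numpy as np
    from scipy.signal import find_peaks

    fs = 16000
    t = np.arange(0, 0.5, 1 / fs)
    f0s = (220.0, 330.0)                      # two simultaneous notes (A3, E4)
    signal = sum(np.sin(2 * np.pi * k * f0 * t)
                 for f0 in f0s for k in (1, 2, 3))       # 3 harmonics each

    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)

    peaks, _ = find_peaks(spectrum, height=0.1 * spectrum.max(), distance=20)
    print("spectral peaks (Hz):", np.round(freqs[peaks], 1))
    # A full system would now group the peaks into harmonic series to obtain
    # candidate pitches and prune them with spectral-irregularity rules.
    ```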

  10. Next Generation Computer Resources: Reference Model for Project Support Environments (Version 2.0)

    National Research Council Canada - National Science Library

    Brown, Alan

    1993-01-01

    The objective of the Next Generation Computer Resources (NGCR) program is to restructure the Navy's approach to acquisition of standard computing resources to take better advantage of commercial advances and investments...

  11. Variance computations for functionals of absolute risk estimates.

    Science.gov (United States)

    Pfeiffer, R M; Petracci, E

    2011-07-01

    We present a simple influence function based approach to compute the variances of estimates of absolute risk and functions of absolute risk. We apply this approach to criteria that assess the impact of changes in the risk factor distribution on absolute risk for an individual and at the population level. As an illustration we use an absolute risk prediction model for breast cancer that includes modifiable risk factors in addition to standard breast cancer risk factors. Influence function based variance estimates for absolute risk and the criteria are compared to bootstrap variance estimates.

  12. Estimation of subcriticality with the computed values. 2

    Energy Technology Data Exchange (ETDEWEB)

    Sakurai, Kiyoshi; Arakawa, Takuya; Naito, Yoshitaka [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1996-02-01

    For measurements of reactivities and neutron count rate spatial distributions, seven subcritical cores, including non-square array cores, were constructed using the critical assembly TCA. MCNP-4A was used for the experimental analysis. The calculated neutron count rate spatial distributions agreed with the measured ones within each error range. This means that, for the calculation-error indirect estimation method, the calculated neutron multiplication factors are equal to those obtained from the experimental reactivities. These experiments and calculations show that the method of estimating subcriticality from computed values, based on the calculation-error indirect estimation method, is also applicable to the six non-square array cores. (author).

  13. Computational Error Estimate for the Power Series Solution of Odes ...

    African Journals Online (AJOL)

    This paper compares the error estimation of the power series solution with the recursive Tau method for solving ordinary differential equations. From the computational viewpoint, the power series using zeros of the Chebyshev polynomial is effective, accurate and easy to use. Keywords: Lanczos Tau method, Chebyshev polynomial, ...

  14. Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds

    Science.gov (United States)

    Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni

    2012-09-01

    Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains for providing wireless communications services on demand. Each new user session request requires the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources in SDR cloud data centers and the numerous session requests at certain hours of the day require efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools to evaluate different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupation and that a tradeoff exists between cluster size and algorithm complexity.

  15. Quantum Computing: Selected Internet Resources for Librarians, Researchers, and the Casually Curious

    Science.gov (United States)

    Cirasella, Jill

    2009-01-01

    This article presents an annotated selection of the most important and informative Internet resources for learning about quantum computing, finding quantum computing literature, and tracking quantum computing news. All of the quantum computing resources described in this article are freely available, English-language web sites that fall into one…

  16. Soil Erosion Estimation Using Grid-based Computation

    Directory of Open Access Journals (Sweden)

    Josef Vlasák

    2005-06-01

    Soil erosion estimation is an important part of a land consolidation process. The universal soil loss equation (USLE) was presented by Wischmeier and Smith. USLE computation uses several factors, namely R – rainfall factor, K – soil erodibility, L – slope length factor, S – slope gradient factor, C – cropping management factor, and P – erosion control management factor. The L and S factors are usually combined into one LS factor – the topographic factor. The individual factors are determined from several sources, such as the DTM (Digital Terrain Model), the BPEJ soil type map, aerial and satellite images, etc. A conventional approach to the USLE computation, which is widely used in the Czech Republic, is based on the selection of characteristic profiles for which all above-mentioned factors must be determined. The result (G – annual soil loss) of such a computation is then applied to a whole area (slope) of interest. Another approach to the USLE computation uses grids as the main data structure. A prerequisite for a grid-based USLE computation is that each of the above-mentioned factors exists as a separate grid layer. The crucial step in this computation is the selection of an appropriate grid resolution (grid cell size). A large cell size can cause an undesirable degradation in precision; too small a cell size can noticeably slow down the whole computation. Provided that the cell size is derived from the source's precision, the appropriate cell size for the Czech Republic varies from 30 m to 50 m. In some cases, especially when new surveying has been done, grid computations can be performed with higher accuracy, i.e. with a smaller grid cell size. For such cases, we have proposed a new method using a two-step computation. The first step uses a bigger cell size and is designed to identify higher erosion spots. The second step then uses a smaller cell size but performs the computation only for the area identified in the previous step. This decomposition allows a
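    A minimal grid-based USLE sketch is shown below: each factor is held in a raster layer and the annual soil loss is the cell-wise product A = R · K · LS · C · P. The factor grids are random placeholders rather than real DTM- or soil-map-derived layers.

    ```python
    # Minimal sketch of the grid-based USLE computation described above: each
    # factor exists as a raster layer and the annual soil loss is the
    # cell-wise product A = R * K * LS * C * P.  The grids below are random
    # placeholders standing in for real raster inputs.
    import numpy as np

    rng = np.random.default_rng(1)
    shape = (200, 200)                      # e.g. 200 x 200 cells of 30 m

    R  = np.full(shape, 40.0)               # rainfall erosivity (assumed uniform)
    K  = rng.uniform(0.2, 0.5, shape)       # soil erodibility
    LS = rng.uniform(0.1, 4.0, shape)       # topographic factor from the DTM
    C  = rng.uniform(0.01, 0.3, shape)      # cropping management
    P  = np.ones(shape)                     # erosion control practice

    A = R * K * LS * C * P                  # annual soil loss per cell (t/ha/yr)

    print("mean soil loss:", A.mean().round(2), "t/ha/yr")
    print("cells above 4 t/ha/yr:", int((A > 4.0).sum()))   # 'higher erosion spots'
    ```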

  17. On robust parameter estimation in brain-computer interfacing

    Science.gov (United States)

    Samek, Wojciech; Nakajima, Shinichi; Kawanabe, Motoaki; Müller, Klaus-Robert

    2017-12-01

    Objective. The reliable estimation of parameters such as mean or covariance matrix from noisy and high-dimensional observations is a prerequisite for successful application of signal processing and machine learning algorithms in brain-computer interfacing (BCI). This challenging task becomes significantly more difficult if the data set contains outliers, e.g. due to subject movements, eye blinks or loose electrodes, as they may heavily bias the estimation and the subsequent statistical analysis. Although various robust estimators have been developed to tackle the outlier problem, they ignore important structural information in the data and thus may not be optimal. Typical structural elements in BCI data are the trials consisting of a few hundred EEG samples and indicating the start and end of a task. Approach. This work discusses the parameter estimation problem in BCI and introduces a novel hierarchical view on robustness which naturally comprises different types of outlierness occurring in structured data. Furthermore, the class of minimum divergence estimators is reviewed and a robust mean and covariance estimator for structured data is derived and evaluated with simulations and on a benchmark data set. Main results. The results show that state-of-the-art BCI algorithms benefit from robustly estimated parameters. Significance. Since parameter estimation is an integral part of various machine learning algorithms, the presented techniques are applicable to many problems beyond BCI.

  18. Computer modelling of the UK wind energy resource. Phase 2. Application of the methodology

    Energy Technology Data Exchange (ETDEWEB)

    Burch, S F; Makari, M; Newton, K; Ravenscroft, F; Whittaker, J

    1993-12-31

    This report presents the results of the second phase of a programme to estimate the UK wind energy resource. The overall objective of the programme is to provide quantitative resource estimates using a mesoscale (resolution about 1km) numerical model for the prediction of wind flow over complex terrain, in conjunction with digitised terrain data and wind data from surface meteorological stations. A network of suitable meteorological stations has been established and long term wind data obtained. Digitised terrain data for the whole UK were obtained, and wind flow modelling using the NOABL computer program has been performed. Maps of extractable wind power have been derived for various assumptions about wind turbine characteristics. Validation of the methodology indicates that the results are internally consistent, and in good agreement with available comparison data. Existing isovent maps, based on standard meteorological data which take no account of terrain effects, indicate that 10m annual mean wind speeds vary between about 4.5 and 7 m/s over the UK with only a few coastal areas over 6 m/s. The present study indicates that 28% of the UK land area had speeds over 6 m/s, with many hill sites having 10m speeds over 10 m/s. It is concluded that these 'first order' resource estimates represent a substantial improvement over the presently available 'zero order' estimates. The results will be useful for broad resource studies and initial site screening. Detailed resource evaluation for local sites will require more detailed local modelling or ideally long term field measurements. (12 figures, 14 tables, 21 references). (Author)

  19. The investigation and implementation of real-time face pose and direction estimation on mobile computing devices

    Science.gov (United States)

    Fu, Deqian; Gao, Lisheng; Jhang, Seong Tae

    2012-04-01

    Mobile computing devices have many limitations, such as a relatively small user interface and slow computing speed. Augmented reality often requires face pose estimation, which can be used as an HCI and entertainment tool. Real-time implementation of head pose estimation on relatively resource-limited mobile platforms must cope with several constraints while retaining sufficient estimation accuracy. The proposed face pose estimation method meets this objective. Experimental results on a test Android mobile device show that the method runs in real time with satisfactory accuracy.

  20. Sex estimation from sternal measurements using multidetector computed tomography.

    Science.gov (United States)

    Ekizoglu, Oguzhan; Hocaoglu, Elif; Inci, Ercan; Bilgili, Mustafa Gokhan; Solmaz, Dilek; Erdil, Irem; Can, Ismail Ozgur

    2014-12-01

    We aimed to show the utility and reliability of sternal morphometric analysis for sex estimation. Sex estimation is a very important step in forensic identification, and skeletal surveys are the main methods in sex estimation studies. Morphometric analysis of the sternum may provide highly accurate data for sex discrimination. In this study, morphometric analysis of the sternum was evaluated on 1 mm chest computed tomography scans for sex estimation. Four hundred forty-three subjects (202 female, 241 male, mean age: 44 ± 8.1 [range: 30-60 years]) were included in the study. Manubrium length (ML), mesosternum length (MSL), Sternebra 1 width (S1W), and Sternebra 3 width (S3W) were measured, and the sternal index (SI) was calculated. Differences between the sexes were evaluated by Student's t-test. Predictive factors for sex were determined by discrimination analysis and receiver operating characteristic (ROC) analysis. Male sternal measurements were significantly higher than those of females. In the discrimination analysis, MSL had a high accuracy rate, 80.2% in females and 80.9% in males, and also the best sensitivity (75.9%) and specificity (87.6%) values. Accuracy rates were above 80% in three stepwise discrimination analyses for both sexes. Stepwise 1 (ML, MSL, S1W, S3W) had the highest accuracy rate in the stepwise discrimination analysis, with 86.1% in females and 83.8% in males. Our study showed that morphometric computed tomography analysis of the sternum can provide important information for sex estimation.
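    The discriminant-analysis workflow reported above can be sketched as follows on synthetic sternal measurements; the means, standard deviations and resulting accuracy are invented for illustration and are not the study's data.

    ```python
    # Sketch of a discriminant analysis like the one reported above, fitted to
    # synthetic sternal measurements rather than the study's data, so the
    # accuracy printed here is purely illustrative.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(42)
    n = 200
    # Hypothetical means/SDs (mm) for [ML, MSL, S1W, S3W]; males larger on average.
    males   = rng.normal([52, 95, 30, 28], [5, 8, 3, 3], size=(n, 4))
    females = rng.normal([47, 82, 27, 25], [5, 8, 3, 3], size=(n, 4))

    X = np.vstack([males, females])
    y = np.array([1] * n + [0] * n)          # 1 = male, 0 = female

    lda = LinearDiscriminantAnalysis()
    acc = cross_val_score(lda, X, y, cv=5).mean()
    print(f"cross-validated sex classification accuracy: {acc:.2f}")
    ```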

  1. Computationally Efficient and Noise Robust DOA and Pitch Estimation

    DEFF Research Database (Denmark)

    Karimian-Azari, Sam; Jensen, Jesper Rindom; Christensen, Mads Græsbøll

    2016-01-01

    Many natural signals, such as voiced speech and some musical instruments, are approximately periodic over short intervals. These signals are often described in mathematics by the sum of sinusoids (harmonics) with frequencies that are proportional to the fundamental frequency, or pitch. In sensor ... a joint DOA and pitch estimator. In white Gaussian noise, we derive even more computationally efficient solutions which are designed using the narrowband power spectrum of the harmonics. Numerical results reveal the performance of the estimators in colored noise compared with the Cramér-Rao lower...

  2. Big Data in Cloud Computing: A Resource Management Perspective

    Directory of Open Access Journals (Sweden)

    Saeed Ullah

    2018-01-01

    Modern-day advances are increasingly digitizing our lives, which has led to a rapid growth of data. Such multidimensional datasets are precious due to the potential of unearthing new knowledge and developing decision-making insights from them. Analyzing this huge amount of data from multiple sources can help organizations plan for the future and anticipate changing market trends and customer requirements. While the Hadoop framework is a popular platform for processing large datasets, a number of other computing infrastructures are available for use in various application domains. The primary focus of the study is how to classify major big data resource management systems in the context of a cloud computing environment. We identify some key features which characterize big data frameworks as well as their associated challenges and issues. We use various evaluation metrics from different aspects to identify usage scenarios of these platforms. The study came up with some interesting findings which are in contradiction with the available literature on the Internet.

  3. Modeling of Groundwater Resources Heavy Metals Concentration Using Soft Computing Methods: Application of Different Types of Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Meysam Alizamir

    2017-09-01

    Nowadays, groundwater resources play a vital role as a source of drinking water in arid and semiarid regions, and forecasting of pollutant content in these resources is very important. Therefore, this study compared two soft computing methods for modeling Cd, Pb and Zn concentrations in groundwater resources of the Asadabad Plain, western Iran. The relative accuracy of two soft computing models, namely the multi-layer perceptron (MLP) and the radial basis function (RBF) network, for forecasting heavy metal concentrations was investigated. In addition, Levenberg-Marquardt, gradient descent and conjugate gradient training algorithms were utilized for the MLP models. The ANN models for this study were developed in MATLAB R2014. The simulation results revealed that the MLP model performed better than the other models and was able to model heavy metal concentrations in groundwater resources favorably, so it can be utilized effectively in environmental applications and water quality estimation. Of the three training algorithms, Levenberg-Marquardt was better than the others. Based on data collected from the plain, MLP and RBF models were developed for each heavy metal, and the MLP can be utilized effectively for predicting heavy metal concentrations in groundwater resources of the Asadabad Plain.
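    A rough analogue of the MLP workflow is sketched below with scikit-learn on synthetic data; note that scikit-learn does not offer Levenberg-Marquardt training (the sketch uses L-BFGS), and the predictor set and relationships are hypothetical.

    ```python
    # Analogous illustration of the MLP modelling approach described above,
    # using scikit-learn on synthetic data.  The study used MATLAB networks
    # trained with Levenberg-Marquardt; this sketch only mirrors the general
    # workflow, not the original models or data.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(7)
    n = 300
    # Hypothetical predictors: EC, pH, well depth, distance to pollution source.
    X = rng.uniform([200, 6.0, 10, 50], [1500, 8.5, 120, 3000], size=(n, 4))
    cd = 0.002 * X[:, 0] - 0.3 * X[:, 1] - 0.0005 * X[:, 3] + rng.normal(0, 0.2, n)

    X_tr, X_te, y_tr, y_te = train_test_split(X, cd, random_state=0)
    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                                       max_iter=2000, random_state=0))
    model.fit(X_tr, y_tr)
    print("R^2 on held-out samples:", round(r2_score(y_te, model.predict(X_te)), 3))
    ```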

  4. Computer-assisted estimating for the Los Alamos Scientific Laboratory

    International Nuclear Information System (INIS)

    Spooner, J.E.

    1976-02-01

    An analysis is made of the cost estimating system currently in use at the Los Alamos Scientific Laboratory (LASL) and the benefits of computer assistance are evaluated. A computer-assisted estimating system (CAE) is proposed for LASL. CAE can decrease turnaround and provide more flexible response to management requests for cost information and analyses. It can enhance value optimization at the design stage, improve cost control and change-order justification, and widen the use of cost information in the design process. CAE costs are not well defined at this time although they appear to break even with present operations. It is recommended that a CAE system description be submitted for contractor consideration and bid while LASL system development continues concurrently

  5. Computer model for estimating electric utility environmental noise

    International Nuclear Information System (INIS)

    Teplitzky, A.M.; Hahn, K.J.

    1991-01-01

    This paper reports on an algorithm-based computer code for estimating environmental noise emissions from the operation and construction of electric power plants. The computer code (Model) predicts octave band sound power levels for power plant operation and construction activities on the basis of the equipment operating characteristics, and calculates off-site sound levels for each noise source and for an entire plant. Estimated noise levels are presented either as A-weighted sound level contours around the power plant or as octave band levels at user-defined receptor locations. Calculated sound levels can be compared with user-designated noise criteria, and the program can assist the user in analyzing alternative noise control strategies.

  6. Estimating solar resources in Mexico using cloud cover data

    Energy Technology Data Exchange (ETDEWEB)

    Renne, David; George, Ray; Brady, Liz; Marion, Bill [National Renewable Energy Laboratory, Colorado (United States); Estrada Cajigal, Vicente [Cuernavaca, Morelos (Mexico)

    2000-07-01

    This paper presents the results of applying the National Renewable Energy Laboratory's (NREL) Climatological Solar Radiation (CSR) model to Mexico to develop solar resource data. A major input to the CSR model is a worldwide surface and satellite-derived cloud cover database, called the Real Time Nephanalysis (RTNEPH). The RTNEPH is developed by the U.S. Air Force and distributed by the U.S. National Climatic Data Center. The RTNEPH combines routine ground-based cloud cover observations made every three hours at national weather centers throughout the world with satellite-derived cloud cover information developed from polar orbiting weather satellites. The data are geospatially digitized so that multilayered cloud cover information is available on a grid of approximately 40-km to a side. The development of this database is an ongoing project that now covers more than twenty years of observations. For the North America analysis (including Mexico) we used an 8-year summarized histogram of the RTNEPH that provides monthly average cloud cover information for the period 1985-1992. The CSR model also accounts for attenuation of the solar beam due to aerosols, atmospheric trace gases, and water vapor. The CSR model outputs monthly average direct normal, global horizontal and diffuse solar information for each of the 40-km grid cells. From this information it is also possible to produce solar resource estimates for various solar collector types and orientations, such as flat plate collectors oriented at latitude tilt, or concentrating solar power collectors. Model results are displayed using Geographic Information System software. CSR model results for Mexico are presented here, along with a discussion of earlier solar resource assessment studies for Mexico, where both modeling approaches and measurement analyses have been used.

  7. Estimation and change tendency of rape straw resource in Leshan

    Science.gov (United States)

    Guan, Qinlan; Gong, Mingfu

    2018-04-01

    Rape straw in the Leshan area consists of the rape stalks, including stems, leaves and pods, left after removing the rapeseed. Leshan is one of the main rape planting areas in Sichuan Province, and its rape planting area is large, so a large amount of rape straw is produced every year. Based on an analysis of the trends in rapeseed planting area and rapeseed yield from 2008 to 2014, the trend in rape straw resources in Leshan over the same period was analyzed, providing a decision-making reference for the resource utilization of rape straw. The results showed that the amount of rape straw resources in Leshan was very large, exceeding 100,000 tons per year and increasing year by year. By 2014, the amount of rape straw resources in Leshan was close to 200,000 tons.

  8. HEDPIN: a computer program to estimate pinwise power density

    International Nuclear Information System (INIS)

    Cappiello, M.W.

    1976-05-01

    A description is given of the digital computer program, HEDPIN. This program, modeled after a previously developed program, POWPIN, provides a means of estimating the pinwise power density distribution in fast reactor triangular pitched pin bundles. The capability also exists for computing any reaction rate of interest at the respective pin positions within an assembly. HEDPIN was developed in support of FTR fuel and test management as well as fast reactor core design and core characterization planning and analysis. The results of a test devised to check out HEDPIN's computational method are given, and the realm of application is discussed. Nearly all programming is in FORTRAN IV. Variable dimensioning is employed to make efficient use of core memory and maintain short running time for small problems. Input instructions, sample problem, and a program listing are also given

  9. Chest X ray effective doses estimation in computed radiography

    International Nuclear Information System (INIS)

    Abdalla, Esra Abdalrhman Dfaalla

    2013-06-01

    Conventional chest radiography is technically difficult because of the wide range of tissue attenuations in the chest and the limitations of screen-film systems. Computed radiography (CR) offers a different approach utilizing a photostimulable phosphor, which overcomes some image quality limitations of chest imaging. The objective of this study was to estimate the effective dose in computed radiography at three hospitals in Khartoum. The study was conducted in the radiography departments of three centres: Advanced Diagnostic Center, Nilain Diagnostic Center and Modern Diagnostic Center. Entrance surface dose (ESD) measurements were conducted for quality control of the X-ray machines and a survey of the operators' experimental techniques. The ESDs were measured with an UNFORS dosimeter, and mathematical equations were used to estimate patient doses during chest X-rays. A total of 120 patients were examined in the three centres, of whom 62 were male and 58 were female. The overall mean and range of patient dose was 0.073±0.037 (0.014-0.16) mGy per procedure, while the effective dose was 3.4±1.7 (0.6-7.0) mSv per procedure. This study compared radiation doses to patients in chest radiographic examinations using computed radiography at three centres in Khartoum, Sudan. The measured effective doses showed that the dose in computed chest radiography was lower compared to previous studies. (Author)
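    The arithmetic behind such dose estimates typically goes from tube output and exposure factors to the entrance surface dose (ESD), and then to an effective dose via a published conversion coefficient. The sketch below illustrates that chain with placeholder numbers; it does not reproduce the equations or coefficients used in the study.

    ```python
    # Illustration of the kind of arithmetic used to estimate patient dose in
    # chest radiography: tube output and exposure factors give the entrance
    # surface dose (ESD), which a conversion coefficient turns into an
    # effective dose.  All numbers below are placeholder values, not the
    # coefficients or measurements from the study above.

    def entrance_surface_dose(output_mgy_per_mas_at_1m, mas, fsd_cm, bsf=1.35):
        """ESD (mGy) = output * mAs * inverse-square correction * backscatter."""
        return output_mgy_per_mas_at_1m * mas * (100.0 / fsd_cm) ** 2 * bsf

    def effective_dose(esd_mgy, conversion_msv_per_mgy):
        """Effective dose (mSv) from ESD via an organ-weighted conversion factor."""
        return esd_mgy * conversion_msv_per_mgy

    esd = entrance_surface_dose(output_mgy_per_mas_at_1m=0.05, mas=2.5, fsd_cm=180)
    e = effective_dose(esd, conversion_msv_per_mgy=0.2)   # placeholder coefficient
    print(f"ESD = {esd:.3f} mGy, effective dose = {e:.3f} mSv")
    ```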

  10. An Analysis of the Published Mineral Resource Estimates of the Haji-Gak Iron Deposit, Afghanistan

    International Nuclear Information System (INIS)

    Sutphin, David M.; Renaud, Karine M.; Drew, Lawrence J.

    2011-01-01

    The Haji-Gak iron deposit of eastern Bamyan Province, eastern Afghanistan, was studied extensively, and resource calculations were made in the 1960s by Afghan and Russian geologists. Recalculation of the resource estimates verifies the original estimates for categories A (in-place resources known in detail), B (in-place resources known in moderate detail), and C1 (in-place resources estimated on sparse data), totaling 110.8 Mt, or about 6% of the resources, as being supportable by the methods used in the 1960s. C2 resources (based on a loose exploration grid with little data) rest on one ore grade from one drill hole, and P2 (prognosis) resources are based on field observations, field measurements, and an ore grade derived from averaging grades from three better-sampled ore bodies. C2 and P2 resources amount to 1,659.1 Mt, or about 94% of the total resources in the deposit. The vast P2 resources have not been drilled or sampled to confirm their extent or quality. The purpose of this article is to independently evaluate the resources of the Haji-Gak iron deposit by using the available geologic and mineral resource information, including geologic maps and cross sections, sampling data, and the analog-estimating techniques of the 1960s, to determine the size and tenor of the deposit.

  11. A Review of Computer Science Resources for Learning and Teaching with K-12 Computing Curricula: An Australian Case Study

    Science.gov (United States)

    Falkner, Katrina; Vivian, Rebecca

    2015-01-01

    To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age…

  12. Development of computer program for estimating decommissioning cost - 59037

    International Nuclear Information System (INIS)

    Kim, Hak-Soo; Park, Jong-Kil

    2012-01-01

    Programs for estimating decommissioning cost have been developed for many different purposes and applications. The estimation of decommissioning cost requires a large amount of data, such as unit cost factors, plant areas and their inventories, waste treatment, etc. This makes it difficult to use manual calculation or typical spreadsheet software such as Microsoft Excel. Cost estimation for the eventual decommissioning of nuclear power plants is a prerequisite for safe, timely and cost-effective decommissioning. To estimate the decommissioning cost more accurately and systematically, KHNP (Korea Hydro and Nuclear Power Co. Ltd) developed a decommissioning cost estimating computer program called DeCAT-Pro (Decommissioning Cost Assessment Tool - Professional), hereinafter called DeCAT. This program allows users to easily assess the decommissioning cost with various decommissioning options. It also provides detailed reporting of decommissioning funding requirements as well as detailed project schedules, cash flow, staffing plans and levels, and waste volumes by waste classification and type. KHNP is planning to implement functions for estimating the plant inventory using 3-D technology and for automatically classifying the conditions of radwaste disposal and transportation. (authors)

  13. Peat resource estimation in South Carolina. Final report, Year 2

    Energy Technology Data Exchange (ETDEWEB)

    Holmes, M.; Andrejko, M.; Corvinus, D.; Tisdale, M.

    1982-01-01

    South Carolina has few indigenous energy resources. Most widely known and utilized are hydropower, wood, and solar. Peat is a material composed of partially decomposed organic matter that, after burial for long periods of time, may eventually become coal. Peat is utilized as an energy resource for the production of electricity and for home heating in Europe and the Soviet Union. There are peat deposits in South Carolina, but peat has never been used as an energy resource within the state. This report presents the results of the first two years of a planned four-year study of the quantity and energy potential of peat in South Carolina. In this year's survey two activities were undertaken. The first was to visit highly probable peat deposits to confirm the presence of fuel-grade peat. The second was to survey and characterize in more detail the areas judged to be of highest potential as major resources. The factors carrying the greatest weight in our determination of priority areas were: (1) a description of peat deposits in the scientific literature or from discussions with state and federal soil scientists; (2) mention of organic soils on soil maps or in the literature; and (3) information from farmers and other local citizens.

  14. Estimating the Wind Resource in Uttarakhand: Comparison of Dynamic Downscaling with Doppler Lidar Wind Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Lundquist, J. K. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Pukayastha, A. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Martin, C. [Univ. of Colorado, Boulder, CO (United States); Newsom, R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-03-01

    Previous estimates of the wind resources in Uttarakhand, India, suggest minimal wind resources in this region. To explore whether the complex terrain in fact provides localized regions of wind resource, the authors of this study employed a dynamic downscaling method with the Weather Research and Forecasting model, providing detailed estimates of winds at approximately 1 km resolution in the finest nested simulation.

  15. Client/server models for transparent, distributed computational resources

    International Nuclear Information System (INIS)

    Hammer, K.E.; Gilman, T.L.

    1991-01-01

    Client/server models are proposed to address issues of shared resources in a distributed, heterogeneous UNIX environment. The recent development of an automated Remote Procedure Call (RPC) interface generator has simplified the development of client/server models; previously, implementation of the models was only possible at the UNIX socket level. An overview of RPCs and the interface generator will be presented, including a discussion of the generation and installation of remote services, the RPC paradigm, and the three levels of RPC programming. Two applications, the Nuclear Plant Analyzer (NPA) and a fluids simulation using molecular modelling, will be presented to demonstrate how client/server models using RPCs and External Data Representations (XDR) have been used in production/computation situations. The NPA incorporates a client/server interface for transferring and translating TRAC or RELAP results from the UNICOS Cray to a UNIX workstation. The fluids simulation program utilizes the client/server model to access the Cray via a single function, allowing it to become a shared co-processor to the workstation application. 5 refs., 6 figs

  17. Statistically and Computationally Efficient Estimating Equations for Large Spatial Datasets

    KAUST Repository

    Sun, Ying; Stein, Michael L.

    2014-01-01

    For Gaussian process models, likelihood based methods are often difficult to use with large irregularly spaced spatial datasets, because exact calculations of the likelihood for n observations require O(n3) operations and O(n2) memory. Various approximation methods have been developed to address the computational difficulties. In this paper, we propose new unbiased estimating equations based on score equation approximations that are both computationally and statistically efficient. We replace the inverse covariance matrix that appears in the score equations by a sparse matrix to approximate the quadratic forms, then set the resulting quadratic forms equal to their expected values to obtain unbiased estimating equations. The sparse matrix is constructed by a sparse inverse Cholesky approach to approximate the inverse covariance matrix. The statistical efficiency of the resulting unbiased estimating equations are evaluated both in theory and by numerical studies. Our methods are applied to nearly 90,000 satellite-based measurements of water vapor levels over a region in the Southeast Pacific Ocean.

  18. Application of Selective Algorithm for Effective Resource Provisioning in Cloud Computing Environment

    OpenAIRE

    Katyal, Mayanka; Mishra, Atul

    2014-01-01

    The continued modern-day demand for resource-hungry services and applications in the IT sector has led to the development of cloud computing. A cloud computing environment involves high-cost infrastructure on one hand and needs large-scale computational resources on the other. These resources need to be provisioned (allocated and scheduled) to end users in the most efficient manner so that the tremendous capabilities of the cloud are utilized effectively and efficiently. In this paper we discuss a selecti...

  19. A study of computer graphics technology in application of communication resource management

    Science.gov (United States)

    Li, Jing; Zhou, Liang; Yang, Fei

    2017-08-01

    With the development of computer technology, computer graphics technology has come into wide use; in particular, the success of object-oriented and multimedia technology has promoted the development of graphics technology in computer software systems. Computer graphics theory and application technology have therefore become an important topic in the computer field, and graphics technology is applied in more and more fields. In recent years, with the development of the economy and especially the rapid development of information technology, the traditional way of managing communication resources can no longer meet real needs. Current communication resource management still relies on the original management tools and methods for equipment management and maintenance, which has brought many problems: it is very difficult for non-professionals to understand the equipment and the overall situation, resource utilization is relatively low, and managers cannot quickly and accurately understand the resource conditions. Aimed at these problems, this paper proposes to introduce computer graphics technology into communication resource management. The introduction of computer graphics not only makes communication resource management more vivid, but also reduces the cost of resource management and improves work efficiency.

  20. Research on elastic resource management for multi-queue under cloud computing environment

    Science.gov (United States)

    CHENG, Zhenjing; LI, Haibo; HUANG, Qiulan; Cheng, Yaodong; CHEN, Gang

    2017-10-01

    As a new approach to managing computing resources, virtualization technology is more and more widely applied in the high-energy physics field. A virtual computing cluster based on OpenStack was built at IHEP, using HTCondor as the job queue management system. In a traditional static cluster, a fixed number of virtual machines are pre-allocated to the job queues of different experiments. However, this method cannot adapt well to the volatility of computing resource requirements. To solve this problem, an elastic computing resource management system for a cloud computing environment has been designed. This system performs unified management of virtual computing nodes on the basis of the HTCondor job queues, using dual resource thresholds as well as a quota service. A two-stage pool is designed to improve the efficiency of resource pool expansion. This paper presents several use cases of the elastic resource management system in IHEPCloud. Practical runs show that virtual computing resources dynamically expand or shrink as computing requirements change. Additionally, the CPU utilization ratio of the computing resources was significantly increased compared with traditional resource management. The system also performs well when there are multiple Condor schedulers and multiple job queues.
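    The dual-threshold policy can be illustrated with a toy decision function: expand a queue's virtual machine pool when the number of pending jobs per active node exceeds an upper threshold (within the queue's quota) and shrink it when the load falls below a lower threshold. The thresholds, step size and queue figures below are assumptions, not the IHEPCloud configuration.

    ```python
    # Toy version of the dual-threshold idea described above: for each job
    # queue, expand the pool of virtual machines when pending jobs per active
    # node exceed an upper threshold (subject to the queue's quota) and shrink
    # it when load falls below a lower threshold.  This is only a sketch of
    # the policy, not the IHEPCloud implementation.

    def scaling_decision(idle_jobs, running_nodes, quota,
                         upper=2.0, lower=0.5, step=5):
        """Return the number of VMs to add (positive) or remove (negative)."""
        active = max(running_nodes, 1)
        load = idle_jobs / active
        if load > upper and running_nodes < quota:
            return min(step, quota - running_nodes)      # expand
        if load < lower and running_nodes > 0:
            return -min(step, running_nodes)             # shrink
        return 0                                         # keep current size

    queues = {"queue-a": (120, 40, 100), "queue-b": (3, 30, 80)}  # idle, nodes, quota
    for name, (idle, nodes, quota) in queues.items():
        print(name, "->", scaling_decision(idle, nodes, quota), "VMs")
    ```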

  1. SYSTEMATIC LITERATURE REVIEW ON RESOURCE ALLOCATION AND RESOURCE SCHEDULING IN CLOUD COMPUTING

    OpenAIRE

    B. Muni Lavanya; C. Shoba Bindu

    2016-01-01

    The objective of this work is to highlight the key features of, and offer future directions for, the research community of resource allocation, resource scheduling and resource management from 2009 to 2016. It exemplifies how research on resource allocation, resource scheduling and resource management has progressively increased in the past decade by inspecting articles and papers from scientific and standard publications. The survey materialized in a three-fold process. Firstly, investigate on t...

  2. NEWBOX: A computer program for parameter estimation in diffusion problems

    International Nuclear Information System (INIS)

    Nestor, C.W. Jr.; Godbee, H.W.; Joy, D.S.

    1989-01-01

    In the analysis of experiments to determine amounts of material transferred from one medium to another (e.g., the escape of chemically hazardous and radioactive materials from solids), there are at least 3 important considerations. These are (1) is the transport amenable to treatment by established mass transport theory; (2) do methods exist to find estimates of the parameters which will give a best fit, in some sense, to the experimental data; and (3) what computational procedures are available for evaluating the theoretical expressions. The authors have made the assumption that established mass transport theory is an adequate model for the situations under study. Since the solutions of the diffusion equation are usually nonlinear in some parameters (diffusion coefficient, reaction rate constants, etc.), use of a method of parameter adjustment involving first partial derivatives can be complicated and prone to errors in the computation of the derivatives. In addition, the parameters must satisfy certain constraints; for example, the diffusion coefficient must remain positive. For these reasons, a variant of the constrained simplex method of M. J. Box has been used to estimate parameters. It is similar, but not identical, to the downhill simplex method of Nelder and Mead. In general, they calculate the fraction of material transferred as a function of time from expressions obtained by the inversion of the Laplace transform of the fraction transferred, rather than by taking derivatives of a calculated concentration profile. With the above approaches to the 3 considerations listed at the outset, they developed a computer program NEWBOX, usable on a personal computer, to calculate the fractional release of material from 4 different geometrical shapes (semi-infinite medium, finite slab, finite circular cylinder, and sphere), accounting for several different boundary conditions.
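    The kind of fit NEWBOX performs can be sketched by estimating a diffusion coefficient from fractional-release data with a derivative-free simplex search. The sketch below uses scipy's Nelder-Mead with a log-parameterisation to keep D positive, standing in for the Box constrained-simplex variant used in the program; the slab geometry, short-time release formula and "measurements" are illustrative assumptions.

    ```python
    # Sketch of the kind of parameter fit NEWBOX performs: estimate a diffusion
    # coefficient from fractional-release data by minimising the sum of squared
    # errors with a derivative-free simplex search.  Nelder-Mead with a
    # log-parameterisation (keeping D positive) stands in for the Box
    # constrained-simplex method; the "measurements" are synthetic.
    import numpy as np
    from scipy.optimize import minimize

    L = 0.01                                   # slab half-thickness, m (assumed)
    t = np.array([1e3, 5e3, 1e4, 5e4, 1e5])    # observation times, s

    def frac_released(D, t):
        # Short-time approximation for release from a plane sheet.
        return 2.0 * np.sqrt(D * t / (np.pi * L**2))

    D_true = 2.0e-11
    noise = 1 + 0.05 * np.random.default_rng(3).normal(size=t.size)
    data = frac_released(D_true, t) * noise    # synthetic "measurements"

    def sse(log_D):
        return np.sum((frac_released(np.exp(log_D[0]), t) - data) ** 2)

    fit = minimize(sse, x0=[np.log(1e-10)], method="Nelder-Mead")
    print("estimated D:", np.exp(fit.x[0]), "m^2/s (true 2e-11)")
    ```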

  3. Cloud Computing and Information Technology Resource Cost Management for SMEs

    DEFF Research Database (Denmark)

    Kuada, Eric; Adanu, Kwame; Olesen, Henning

    2013-01-01

    This paper analyzes the decision-making problem confronting SMEs considering the adoption of cloud computing as an alternative to in-house computing services provision. The economics of choosing between in-house computing and a cloud alternative is analyzed by comparing the total economic costs ... in determining the relative value of cloud computing.

  4. A review of Computer Science resources for learning and teaching with K-12 computing curricula: an Australian case study

    Science.gov (United States)

    Falkner, Katrina; Vivian, Rebecca

    2015-10-01

    To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age children, with the intention to engage children and increase interest, rather than to formally teach concepts and skills. What is the educational quality of existing Computer Science resources and to what extent are they suitable for classroom learning and teaching? In this paper, an assessment framework is presented to evaluate the quality of online Computer Science resources. Further, a semi-systematic review of available online Computer Science resources was conducted to evaluate resources available for classroom learning and teaching and to identify gaps in resource availability, using the Australian curriculum as a case study analysis. The findings reveal a predominance of quality resources, however, a number of critical gaps were identified. This paper provides recommendations and guidance for the development of new and supplementary resources and future research.

  5. Do Clouds Compute? A Framework for Estimating the Value of Cloud Computing

    Science.gov (United States)

    Klems, Markus; Nimis, Jens; Tai, Stefan

    On-demand provisioning of scalable and reliable compute services, along with a cost model that charges consumers based on actual service usage, has been an objective in distributed computing research and industry for a while. Cloud Computing promises to deliver on this objective: consumers are able to rent infrastructure in the Cloud as needed, deploy applications and store data, and access them via Web protocols on a pay-per-use basis. The acceptance of Cloud Computing, however, depends on the ability of Cloud Computing providers and consumers to implement a model for business value co-creation. Therefore, a systematic approach to measure costs and benefits of Cloud Computing is needed. In this paper, we discuss the need for valuation of Cloud Computing, identify key components, and structure these components in a framework. The framework assists decision makers in estimating Cloud Computing costs and in comparing these costs to conventional IT solutions. We demonstrate by means of representative use cases how our framework can be applied to real world scenarios.
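    The kind of comparison such a framework structures can be illustrated with a toy calculation: the yearly cost of an in-house deployment sized for peak demand versus pay-per-use cloud capacity that follows average demand. Every figure below is a hypothetical illustration, not data from the paper.

    ```python
    # Toy version of the kind of cost comparison such a valuation framework
    # would structure: an in-house deployment sized for peak load versus
    # pay-per-use cloud capacity that follows demand.  Every figure below is a
    # hypothetical illustration.

    hours_per_year = 24 * 365
    peak_servers, avg_servers = 20, 6          # demand profile (assumed)

    # In-house: buy for peak, amortise hardware over 3 years, pay ops + power.
    capex_per_server = 6000 / 3                # $/year amortised
    opex_per_server = 1200                     # $/year power, space, admin
    inhouse = peak_servers * (capex_per_server + opex_per_server)

    # Cloud: pay only for average utilisation at an hourly on-demand price.
    price_per_server_hour = 0.12               # $ (assumed)
    cloud = avg_servers * hours_per_year * price_per_server_hour

    print(f"in-house: ${inhouse:,.0f}/yr, cloud: ${cloud:,.0f}/yr")
    print("cloud is cheaper" if cloud < inhouse else "in-house is cheaper")
    ```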

  7. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok, E-mail: jheo@kaeri.re.kr; Kim, Kyung Doo, E-mail: kdkim@kaeri.re.kr

    2015-10-15

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.

  8. Sustainable economic growth and exhaustible resources: A model and estimation for the US

    Directory of Open Access Journals (Sweden)

    Almuth Scholl

    2002-01-01

    This paper studies current models of sustainable economic growth with resource constraints and explores to what extent resource constraints can be overcome by substitution and technological change. We also study the problem of intergenerational equity and the different criteria that have been suggested in the literature. The central part of this paper is the presentation of stylized facts on exhaustible resources and an estimation of a basic model with resource constraints for US time series data. The estimated years left until depletion and the empirical trends of the ratios of capital stock and consumption to resources seem to indicate that there might be a threat to sustainable growth in the future. In our estimation, we obtain parameter values that help to interpret the extent to which growth with exhaustible resources is sustainable.

  9. Study on Cloud Computing Resource Scheduling Strategy Based on the Ant Colony Optimization Algorithm

    OpenAIRE

    Lingna He; Qingshui Li; Linan Zhu

    2012-01-01

    In order to replace traditional Internet software usage patterns and enterprise management modes, this paper proposes a new business computation mode: cloud computing. Resource scheduling strategy is the key technology in cloud computing. Based on a study of the cloud computing system structure and its mode of operation, the key research addresses the work scheduling and resource allocation problems in cloud computing using the ant colony algorithm, with detailed analysis and design of the...
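
    The sketch below is a generic ant colony optimization loop for assigning tasks to machines so as to reduce makespan. It is not the algorithm of the paper; the pheromone, heuristic and evaporation parameters, and the toy task and machine data, are assumptions for illustration only.

        import random

        def aco_schedule(task_len, speed, n_ants=20, n_iter=50,
                         alpha=1.0, beta=2.0, rho=0.1, seed=1):
            """Assign tasks to machines with a basic ant colony search."""
            random.seed(seed)
            n_tasks, n_nodes = len(task_len), len(speed)
            # pheromone[t][m]: desirability of putting task t on machine m
            tau = [[1.0] * n_nodes for _ in range(n_tasks)]
            # heuristic: faster machine / shorter task is more attractive
            eta = [[speed[m] / task_len[t] for m in range(n_nodes)]
                   for t in range(n_tasks)]
            best, best_makespan = None, float("inf")
            for _ in range(n_iter):
                for _ant in range(n_ants):
                    assign, load = [], [0.0] * n_nodes
                    for t in range(n_tasks):
                        w = [(tau[t][m] ** alpha) * (eta[t][m] ** beta)
                             for m in range(n_nodes)]
                        m = random.choices(range(n_nodes), weights=w)[0]
                        assign.append(m)
                        load[m] += task_len[t] / speed[m]
                    makespan = max(load)
                    if makespan < best_makespan:
                        best, best_makespan = assign, makespan
                # evaporate, then reinforce the best assignment found so far
                for t in range(n_tasks):
                    for m in range(n_nodes):
                        tau[t][m] *= (1.0 - rho)
                    tau[t][best[t]] += 1.0 / best_makespan
            return best, best_makespan

        # toy instance: 8 tasks on 3 machines of different speeds
        print(aco_schedule([4, 7, 3, 9, 2, 6, 5, 8], [1.0, 2.0, 1.5]))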

  10. Computer modelling of the UK wind energy resource: UK wind speed data package and user manual

    Energy Technology Data Exchange (ETDEWEB)

    Burch, S F; Ravenscroft, F

    1993-12-31

    A software package has been developed for IBM-PC or true compatibles. It is designed to provide easy access to the results of a programme of work to estimate the UK wind energy resource. Mean wind speed maps and quantitative resource estimates were obtained using the NOABL mesoscale (1 km resolution) numerical model for the prediction of wind flow over complex terrain. NOABL was used in conjunction with digitised terrain data and wind data from surface meteorological stations for a ten year period (1975-1984) to provide digital UK maps of mean wind speed at 10m, 25m and 45m above ground level. Also included in the derivation of these maps was the use of the Engineering Science Data Unit (ESDU) method to model the effect on wind speed of the abrupt change in surface roughness that occurs at the coast. With the wind speed software package, the user is able to obtain a display of the modelled wind speed at 10m, 25m and 45m above ground level for any location in the UK. The required co-ordinates are simply supplied by the user, and the package displays the selected wind speed. This user manual summarises the methodology used in the generation of these UK maps and shows computer generated plots of the 25m wind speeds in 200 x 200 km regions covering the whole UK. The uncertainties inherent in the derivation of these maps are also described, and notes given on their practical usage. The present study indicated that 23% of the UK land area had speeds over 6 m/s, with many hill sites having 10m speeds over 10 m/s. It is concluded that these 'first order' resource estimates represent a substantial improvement over the presently available 'zero order' estimates. (18 figures, 3 tables, 6 references). (author)
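
    The following sketch illustrates the kind of lookup such a package performs when the user supplies coordinates: bilinear interpolation of a 1 km gridded wind-speed map. The tile values, grid origin and cell size are invented for illustration and have no relation to the NOABL results.

        import numpy as np

        def wind_speed_at(easting, northing, grid, origin=(0.0, 0.0), cell=1000.0):
            """Bilinearly interpolate a 1 km gridded wind-speed map (m/s)
            at an easting/northing assumed to lie inside the tile."""
            x = (easting - origin[0]) / cell
            y = (northing - origin[1]) / cell
            i0, j0 = int(np.floor(y)), int(np.floor(x))
            i1 = min(i0 + 1, grid.shape[0] - 1)
            j1 = min(j0 + 1, grid.shape[1] - 1)
            fy, fx = y - i0, x - j0
            top = grid[i0, j0] * (1 - fx) + grid[i0, j1] * fx
            bot = grid[i1, j0] * (1 - fx) + grid[i1, j1] * fx
            return top * (1 - fy) + bot * fy

        # toy 4 x 4 km tile of 25 m a.g.l. mean speeds (values are made up)
        tile = np.array([[5.8, 6.1, 6.4, 6.0],
                         [6.2, 6.5, 6.9, 6.3],
                         [6.0, 6.4, 7.1, 6.6],
                         [5.9, 6.2, 6.8, 6.4]])
        print(round(wind_speed_at(1500.0, 2500.0, tile), 2))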

  11. River suspended sediment estimation by climatic variables implication: Comparative study among soft computing techniques

    Science.gov (United States)

    Kisi, Ozgur; Shiri, Jalal

    2012-06-01

    Estimating the sediment volume carried by a river is an important issue in water resources engineering. This paper compares the accuracy of three different soft computing methods, Artificial Neural Networks (ANNs), the Adaptive Neuro-Fuzzy Inference System (ANFIS), and Gene Expression Programming (GEP), in estimating daily suspended sediment concentration in rivers from hydro-meteorological data. Daily rainfall, streamflow and suspended sediment concentration data from the Eel River near Dos Rios, California, USA are used as a case study. The comparison results indicate that the GEP model performs better than the other models in daily suspended sediment concentration estimation for the particular data sets used in this study. The Levenberg-Marquardt, conjugate gradient and gradient descent training algorithms were used for the ANN models. Of the three algorithms, the conjugate gradient algorithm was found to perform best.
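
    A structural illustration of the ANN approach is sketched below using scikit-learn's MLPRegressor on synthetic rainfall, streamflow and sediment data. The data, network size and solver are assumptions; scikit-learn does not provide the Levenberg-Marquardt or conjugate-gradient training used in the paper, so this only shows the overall fit-and-evaluate workflow, not the authors' models.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import mean_squared_error

        rng = np.random.default_rng(42)
        n = 500
        rainfall = rng.gamma(2.0, 5.0, n)                      # mm/day (synthetic)
        streamflow = 20 + 3 * rainfall + rng.normal(0, 5, n)   # m3/s (synthetic)
        # synthetic "observed" suspended sediment concentration (mg/L)
        sediment = 0.5 * streamflow ** 1.2 + 2 * rainfall + rng.normal(0, 10, n)

        X = np.column_stack([rainfall, streamflow])
        X_tr, X_te, y_tr, y_te = train_test_split(X, sediment, random_state=0)

        ann = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(10,),
                                         max_iter=5000, random_state=0))
        ann.fit(X_tr, y_tr)
        rmse = mean_squared_error(y_te, ann.predict(X_te)) ** 0.5
        print(f"test RMSE: {rmse:.1f} mg/L")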

  12. Comparison of methods used to estimate conventional undiscovered petroleum resources: World examples

    Science.gov (United States)

    Ahlbrandt, T.S.; Klett, T.R.

    2005-01-01

    Various methods for assessing undiscovered oil, natural gas, and natural gas liquid resources were compared in support of the USGS World Petroleum Assessment 2000. Discovery process, linear fractal, parabolic fractal, engineering estimates, PETRIMES, Delphi, and the USGS 2000 methods were compared. Three comparisons of these methods were made in: (1) the Neuquen Basin province, Argentina (different assessors, same input data); (2) provinces in North Africa, Oman, and Yemen (same assessors, different methods); and (3) the Arabian Peninsula, Arabian (Persian) Gulf, and North Sea (different assessors, different methods). A fourth comparison (same assessors, same assessment methods but different geologic models), between results from structural and stratigraphic assessment units in the North Sea used only the USGS 2000 method, and hence compared the type of assessment unit rather than the method. In comparing methods, differences arise from inherent differences in assumptions regarding: (1) the underlying distribution of the parent field population (all fields, discovered and undiscovered), (2) the population of fields being estimated; that is, the entire parent distribution or the undiscovered resource distribution, (3) inclusion or exclusion of large outlier fields; (4) inclusion or exclusion of field (reserve) growth, (5) deterministic or probabilistic models, (6) data requirements, and (7) scale and time frame of the assessment. Discovery process, Delphi subjective consensus, and the USGS 2000 method yield comparable results because similar procedures are employed. In mature areas such as the Neuquen Basin province in Argentina, the linear and parabolic fractal and engineering methods were conservative compared to the other five methods and relative to new reserve additions there since 1995. The PETRIMES method gave the most optimistic estimates in the Neuquen Basin. In less mature areas, the linear fractal method yielded larger estimates relative to other methods

  13. South African uranium resource and production capability estimates

    International Nuclear Information System (INIS)

    Camisani-Calzolari, F.A.G.M.; Toens, P.D.

    1980-09-01

    South Africa, along with Canada and the United States, submitted forecasts of uranium capacities and capabilites to the year 2025 for the 1979 'Red Book' edition. This report deals with the methodologies used in arriving at the South African forecasts. As the future production trends of the South African uranium producers cannot be confidently defined, chiefly because uranium is extracted as a by-product of the gold mining industry and is thus highly sensitive to market fluctuations for both uranium and gold, the Evaluation Group of the Atomic Energy Board has carried out numerous forecast exercises using current and historical norms and assuming various degrees of 'adverse', 'normal' and 'most favourable' conditions. The two exercises, which were submitted for the 'Red Book', are shown in the Appendices. This paper has been prepared for presentation to the Working Group on Methodologies for Forecasting Uranium Availability of the NEA/IAEA Steering Group on Uranium Resources [af

  14. High-order computer-assisted estimates of topological entropy

    Science.gov (United States)

    Grote, Johannes

    The concept of Taylor Models is introduced, which offers highly accurate C0-estimates for the enclosures of functional dependencies, combining high-order Taylor polynomial approximation of functions and rigorous estimates of the truncation error, performed using verified interval arithmetic. The focus of this work is on the application of Taylor Models in algorithms for strongly nonlinear dynamical systems. A method to obtain sharp rigorous enclosures of Poincaré maps for certain types of flows and surfaces is developed and numerical examples are presented. Differential algebraic techniques allow the efficient and accurate computation of polynomial approximations for invariant curves of certain planar maps around hyperbolic fixed points. Subsequently we introduce a procedure to extend these polynomial curves to verified Taylor Model enclosures of local invariant manifolds with C0-errors of size 10^-10 to 10^-14, and proceed to generate the global invariant manifold tangle up to comparable accuracy through iteration in Taylor Model arithmetic. Knowledge of the global manifold structure up to finite iterations of the local manifold pieces enables us to find all homoclinic and heteroclinic intersections in the generated manifold tangle. Combined with the mapping properties of the homoclinic points and their ordering we are able to construct a subshift of finite type as a topological factor of the original planar system to obtain rigorous lower bounds for its topological entropy. This construction is fully automatic and yields homoclinic tangles with several hundred homoclinic points. As an example, rigorous lower bounds for the topological entropy of the Hénon map are computed, which to the best knowledge of the authors yield the largest such estimates published so far.
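
    The final step described above, passing from a subshift of finite type to an entropy bound, reduces to taking the logarithm of the spectral radius of the subshift's transition matrix. The sketch below shows only that last step on a small, made-up transition matrix (not one produced by the Hénon computation), and uses ordinary floating-point arithmetic rather than verified interval arithmetic.

        import numpy as np

        def entropy_lower_bound(transition):
            """The log of the spectral radius of a 0/1 transition matrix
            equals the topological entropy of the subshift of finite type."""
            eigenvalues = np.linalg.eigvals(np.asarray(transition, dtype=float))
            return float(np.log(max(abs(eigenvalues))))

        # toy 3-symbol subshift (matrix entries are illustrative only)
        A = [[1, 1, 0],
             [1, 0, 1],
             [0, 1, 1]]
        print(f"h_top >= {entropy_lower_bound(A):.4f}")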

  15. Discovery of resources using MADM approaches for parallel and distributed computing

    Directory of Open Access Journals (Sweden)

    Mandeep Kaur

    2017-06-01

    Full Text Available Grid, a form of parallel and distributed computing, allows the sharing of data and computational resources among its users from various geographical locations. The grid resources are diverse in terms of their underlying attributes. The majority of the state-of-the-art resource discovery techniques rely on static resource attributes during resource selection. However, resources matched on static attributes alone may not be the most appropriate resources for the execution of user applications because they may have heavy job loads, less storage space or less working memory (RAM). Hence, there is a need to consider the current state of the resources in order to find the most suitable resources. In this paper, we have proposed a two-phased multi-attribute decision making (MADM) approach for discovery of grid resources by using a P2P formalism. The proposed approach considers multiple resource attributes in the decision making for resource selection and provides the best suitable resource(s) to grid users. The first phase describes a mechanism to discover all matching resources and applies the SAW (simple additive weighting) method to shortlist the top ranked resources, which are communicated to the requesting super-peer. The second phase of our proposed methodology applies an integrated MADM approach (AHP-enriched PROMETHEE-II) to the list of selected resources received from different super-peers. The pairwise comparison of the resources with respect to their attributes is made and the rank of each resource is determined. The top ranked resource is then communicated to the grid user by the grid scheduler. Our proposed methodology enables the grid scheduler to allocate the most suitable resource to the user application and also reduces the search complexity by filtering out the less suitable resources during resource discovery.
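
    The first-phase shortlisting can be illustrated with a minimal SAW ranking sketch. The attribute names, weights and node data below are invented for illustration; the paper's actual attribute set and the AHP/PROMETHEE-II second phase are not reproduced here.

        def saw_rank(resources, weights, benefit):
            """Rank resources by Simple Additive Weighting.
            resources: {name: [attribute values]}; benefit[i] is True if a
            larger value of attribute i is better (e.g. free RAM), False if
            smaller is better (e.g. current job load)."""
            cols = list(zip(*resources.values()))
            scores = {}
            for name, vals in resources.items():
                score = 0.0
                for i, v in enumerate(vals):
                    lo, hi = min(cols[i]), max(cols[i])
                    if hi == lo:
                        norm = 1.0
                    elif benefit[i]:
                        norm = (v - lo) / (hi - lo)
                    else:
                        norm = (hi - v) / (hi - lo)
                    score += weights[i] * norm
                scores[name] = score
            return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

        # attributes: [free RAM (GB), free storage (GB), current job load]
        grid_nodes = {"node-A": [16, 500, 12],
                      "node-B": [8, 900, 4],
                      "node-C": [32, 250, 20]}
        print(saw_rank(grid_nodes, weights=[0.5, 0.2, 0.3],
                       benefit=[True, True, False]))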

  16. Monte Carlo estimation of the absorbed dose in computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jin Woo; Youn, Han Bean; Kim, Ho Kyung [Pusan National University, Busan (Korea, Republic of)

    2016-05-15

    The purpose of this study is to devise an algorithm that calculates absorbed dose distributions of patients based on Monte Carlo (MC) methods and includes the dose contributions from both primary and secondary (scattered) x-ray photons. Assessment of patient dose in computed tomography (CT) at the population level has become a subject of public attention and concern, and ultimately CT quality assurance and dose optimization have the goal of reducing radiation-induced cancer risks in the examined population. However, the conventional CT dose index (CTDI) concept is not a surrogate for risk; it has rather been designed to measure an average central dose. In addition, the CTDI and the dose-length product have shown shortcomings for helical CT with wider beam collimation. Simple algorithms to estimate a patient-specific CT dose based on MCNP output data have been introduced. For numerical chest and head phantoms, the spatial dose distributions were calculated and the results were reasonable. The estimated dose distribution map can be readily converted into the effective dose. Important topics for further study include the validation of the models against experimental measurements and the acceleration of the algorithms.

  17. Building an application for computing the resource requests such as disk, CPU, and tape and studying the time evolution of computing model

    CERN Document Server

    Noormandipour, Mohammad Reza

    2017-01-01

    The goal of this project was building an application to calculate the computing resources needed by the LHCb experiment for data processing and analysis, and to predict their evolution in future years. The source code was developed in the Python programming language and the application built and developed in CERN GitLab. This application will facilitate the calculation of resources required by LHCb in both qualitative and quantitative aspects. The granularity of computations is improved to a weekly basis, in contrast with the yearly basis used so far. The LHCb computing model will benefit from the new possibilities and options added, as the new predictions and calculations are aimed at giving more realistic and accurate estimates.
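
    The weekly-granularity bookkeeping described above amounts to simple arithmetic over running conditions and processing parameters. The sketch below illustrates the idea; the trigger rate, event size, live time, CPU cost per event and replication policy are placeholder values, not LHCb parameters, and the real application covers many more activities than raw data taking.

        def weekly_requirements(trigger_rate_hz, event_kb, live_seconds,
                                cpu_hs06_s_per_event, replicas=2):
            """Estimate one week of storage (TB) and CPU work (HS06.hours)
            from a simple event-count model."""
            events = trigger_rate_hz * live_seconds
            raw_tb = events * event_kb / 1e9          # kB -> TB
            disk_tb = raw_tb * replicas               # replicated on disk
            tape_tb = raw_tb                          # one archival copy
            cpu_hs06_hours = events * cpu_hs06_s_per_event / 3600.0
            return {"events": events, "disk_TB": disk_tb,
                    "tape_TB": tape_tb, "cpu_HS06_hours": cpu_hs06_hours}

        # placeholder running conditions for one week of data taking
        print(weekly_requirements(trigger_rate_hz=5000, event_kb=60,
                                  live_seconds=400_000,
                                  cpu_hs06_s_per_event=30))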

  18. An interactive computer approach to performing resource analysis for a multi-resource/multi-project problem. [Spacelab inventory procurement planning

    Science.gov (United States)

    Schlagheck, R. A.

    1977-01-01

    New planning techniques and supporting computer tools are needed for the optimization of resources and costs for space transportation and payload systems. Heavy emphasis on cost effective utilization of resources has caused NASA program planners to look at the impact of various independent variables that affect procurement buying. A description is presented of a category of resource planning which deals with Spacelab inventory procurement analysis. Spacelab is a joint payload project between NASA and the European Space Agency and will be flown aboard the Space Shuttle starting in 1980. In order to respond rapidly to the various procurement planning exercises, a system was built that could perform resource analysis in a quick and efficient manner. This system is known as the Interactive Resource Utilization Program (IRUP). Attention is given to aspects of problem definition, an IRUP system description, questions of data base entry, the approach used for project scheduling, and problems of resource allocation.

  19. Resource-Aware Load Balancing Scheme using Multi-objective Optimization in Cloud Computing

    OpenAIRE

    Kavita Rana; Vikas Zandu

    2016-01-01

    Cloud computing is a service-based, on-demand, pay-per-use model consisting of interconnected and virtualized resources delivered over the Internet. In cloud computing, there are usually a number of jobs that need to be executed with the available resources to achieve optimal performance, the least possible total completion time, the shortest response time, efficient utilization of resources, etc. Hence, job scheduling is the most important concern, and it aims to ensure that the user's requirements are ...

  20. Error Estimation and Uncertainty Propagation in Computational Fluid Mechanics

    Science.gov (United States)

    Zhu, J. Z.; He, Guowei; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    Numerical simulation has now become an integral part of engineering design process. Critical design decisions are routinely made based on the simulation results and conclusions. Verification and validation of the reliability of the numerical simulation is therefore vitally important in the engineering design processes. We propose to develop theories and methodologies that can automatically provide quantitative information about the reliability of the numerical simulation by estimating numerical approximation error, computational model induced errors and the uncertainties contained in the mathematical models so that the reliability of the numerical simulation can be verified and validated. We also propose to develop and implement methodologies and techniques that can control the error and uncertainty during the numerical simulation so that the reliability of the numerical simulation can be improved.

  1. Brain-computer interface for alertness estimation and improving

    Science.gov (United States)

    Hramov, Alexander; Maksimenko, Vladimir; Hramova, Marina

    2018-02-01

    Using wavelet analysis of electrical brain activity (EEG) signals, we study the neural activity associated with the perception of visual stimuli. We demonstrate that the brain can process visual stimuli in two scenarios: (i) perception is characterized by destruction of the alpha waves and an increase in high-frequency (beta) activity, or (ii) the beta rhythm is not well pronounced, while the alpha-wave energy remains unchanged. Dedicated experiments show that motivation initiates the first scenario, which is explained by increased alertness. Based on the obtained results we build a brain-computer interface and demonstrate how the degree of alertness can be estimated and controlled in a real experiment.
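
    One common way to turn this alpha/beta picture into a simple alertness index is the ratio of beta to alpha band power. The sketch below estimates such an index with Welch's method on a synthetic EEG segment; it is only illustrative and does not reproduce the paper's wavelet analysis, thresholds or hardware interface.

        import numpy as np
        from scipy.signal import welch

        def band_power(freqs, psd, lo, hi):
            mask = (freqs >= lo) & (freqs <= hi)
            df = freqs[1] - freqs[0]
            return psd[mask].sum() * df

        fs = 250.0                                    # sampling rate, Hz
        t = np.arange(0, 10, 1 / fs)
        rng = np.random.default_rng(0)
        # synthetic EEG: alpha (10 Hz) plus weaker beta (20 Hz) plus noise
        eeg = 30e-6 * np.sin(2 * np.pi * 10 * t) \
            + 10e-6 * np.sin(2 * np.pi * 20 * t) \
            + 5e-6 * rng.standard_normal(t.size)

        freqs, psd = welch(eeg, fs=fs, nperseg=512)
        alpha = band_power(freqs, psd, 8, 13)
        beta = band_power(freqs, psd, 14, 30)
        print(f"beta/alpha index: {beta / alpha:.2f}")  # higher -> more alert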

  2. Estimating the Economic Impacts of Recreation Response to Resource Management Alternatives

    Science.gov (United States)

    Donald B.K. English; J. Michael Bowker; John C. Bergstrom; H. Ken Cordell

    1995-01-01

    Managing forest resources involves tradeoffs and making decisions among resource management alternatives. Some alternatives will lead to changes in the level of recreation visitation and the amount of associated visitor spending. Thus, the alternatives can affect local economies. This paper reports a method that can be used to estimate the economic impacts of such...

  3. Impact of changing computer technology on hydrologic and water resource modeling

    OpenAIRE

    Loucks, D.P.; Fedra, K.

    1987-01-01

    The increasing availability of substantial computer power at relatively low costs and the increasing ease of using computer graphics, of communicating with other computers and data bases, and of programming using high-level problem-oriented computer languages, is providing new opportunities and challenges for those developing and using hydrologic and water resources models. This paper reviews some of the progress made towards the development and application of computer support systems designe...

  4. Trained manpower resources in Brazil. Estimates and preparations

    International Nuclear Information System (INIS)

    Alves, R.N.; Araujo, R. de; Pinto, C.S.M.; Dale, C.M.M.; Souza, J.A.M. de; Spitalnik, J.

    1977-01-01

    The Brazilian nuclear programme will require by 1990 the installation of at least 10000MW(e) of nuclear power capacity, the implementation of the entire fuel cycle complex, and the creation of a reactor heavy-components manufacturing industry and of a nuclear power plant engineering capability. It has been estimated that such a programme will have to employ, up to 1985, some 7000-8000 people at the engineering and technician levels. The paper summarizes the consequent planning for preparation and qualification of manpower, which, as it involved such large numbers, required not only thorough analyses of sectoral requirements but also careful consideration of depletion rates and losses during the training process. Taking this into account, the Universities and Technical Schools will need to graduate, on average, 450 additional engineers and 550 additional technicians per year during the next ten years. For this purpose, the maximum use of the existing educational system in Brazil will avoid excessive reliance on external sources and will strengthen the local infrastructure. Crash specialization courses have been developed, in conjunction with the Universities, to comply specifically with the requirements of the nuclear programme. Only when no industrial experience can be provided in the country is on-the-job training in foreign firms considered. Training of nuclear power plant operators is also to be a local activity. An Operators Training Centre, by using a plant simulator, is being implemented with a scheduled operational date in the early 1980s. To implement the nuclear manpower programme, the Comissao Nacional de Energia Nuclear has been given the task of promoting and co-ordinating the nuclear academic education, while Empresas Nucleares Brasileiras SA is responsible for specialization and training of personnel in nuclear technology. (author)

  5. Science and Technology Resources on the Internet: Computer Security.

    Science.gov (United States)

    Kinkus, Jane F.

    2002-01-01

    Discusses issues related to computer security, including confidentiality, integrity, and authentication or availability; and presents a selected list of Web sites that cover the basic issues of computer security under subject headings that include ethics, privacy, kids, antivirus, policies, cryptography, operating system security, and biometrics.…

  6. Data Security Risk Estimation for Information-Telecommunication Systems on the basis of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Anatoly Valeryevich Tsaregorodtsev

    2014-02-01

    Full Text Available Cloud computing will be one of the most common IT technologies for deploying applications, due to its key features: on-demand network access to a shared pool of configurable computing resources, flexibility and a good quality/price ratio. Migrating to a cloud architecture enables organizations to reduce the overall cost of implementing and maintaining the infrastructure and to reduce the development time for new business applications. There are many factors that influence the information security environment of a cloud, as its multitenant architecture brings new and more complex problems and vulnerabilities. An approach to risk estimation, for use in making decisions about the migration of an organization's critical data into the cloud infrastructure, is proposed in the paper.

  7. Computer Simulation and Digital Resources for Plastic Surgery Psychomotor Education.

    Science.gov (United States)

    Diaz-Siso, J Rodrigo; Plana, Natalie M; Stranix, John T; Cutting, Court B; McCarthy, Joseph G; Flores, Roberto L

    2016-10-01

    Contemporary plastic surgery residents are increasingly challenged to learn a greater number of complex surgical techniques within a limited period. Surgical simulation and digital education resources have the potential to address some limitations of the traditional training model, and have been shown to accelerate knowledge and skills acquisition. Although animal, cadaver, and bench models are widely used for skills and procedure-specific training, digital simulation has not been fully embraced within plastic surgery. Digital educational resources may play a future role in a multistage strategy for skills and procedures training. The authors present two virtual surgical simulators addressing procedural cognition for cleft repair and craniofacial surgery. Furthermore, the authors describe how partnerships among surgical educators, industry, and philanthropy can be a successful strategy for the development and maintenance of digital simulators and educational resources relevant to plastic surgery training. It is our responsibility as surgical educators not only to create these resources, but to demonstrate their utility for enhanced trainee knowledge and technical skills development. Currently available digital resources should be evaluated in partnership with plastic surgery educational societies to guide trainees and practitioners toward effective digital content.

  8. ``Carbon Credits'' for Resource-Bounded Computations Using Amortised Analysis

    Science.gov (United States)

    Jost, Steffen; Loidl, Hans-Wolfgang; Hammond, Kevin; Scaife, Norman; Hofmann, Martin

    Bounding resource usage is important for a number of areas, notably real-time embedded systems and safety-critical systems. In this paper, we present a fully automatic static type-based analysis for inferring upper bounds on resource usage for programs involving general algebraic datatypes and full recursion. Our method can easily be used to bound any countable resource, without needing to revisit proofs. We apply the analysis to the important metrics of worst-case execution time, stack- and heap-space usage. Our results from several realistic embedded control applications demonstrate good matches between our inferred bounds and measured worst-case costs for heap and stack usage. For time usage we infer good bounds for one application. Where we obtain less tight bounds, this is due to the use of software floating-point libraries.

  9. Quantum computing with incoherent resources and quantum jumps.

    Science.gov (United States)

    Santos, M F; Cunha, M Terra; Chaves, R; Carvalho, A R R

    2012-04-27

    Spontaneous emission and the inelastic scattering of photons are two natural processes usually associated with decoherence and the reduction in the capacity to process quantum information. Here we show that, when suitably detected, these photons are sufficient to build all the fundamental blocks needed to perform quantum computation in the emitting qubits while protecting them from deleterious dissipative effects. We exemplify this by showing how to efficiently prepare graph states for the implementation of measurement-based quantum computation.

  10. Crowd-Funding: A New Resource Cooperation Mode for Mobile Cloud Computing.

    Directory of Open Access Journals (Sweden)

    Nan Zhang

    Full Text Available Mobile cloud computing, which integrates the cloud computing techniques into the mobile environment, is regarded as one of the enabler technologies for 5G mobile wireless networks. There are many sporadic spare resources distributed within various devices in the networks, which can be used to support mobile cloud applications. However, these devices, with only a few spare resources, cannot support some resource-intensive mobile applications alone. If some of them cooperate with each other and share their resources, then they can support many applications. In this paper, we propose a resource cooperative provision mode referred to as "Crowd-funding", which is designed to aggregate the distributed devices together as the resource provider of mobile applications. Moreover, to facilitate high-efficiency resource management via dynamic resource allocation, different resource providers should be selected to form a stable resource coalition for different requirements. Thus, considering different requirements, we propose two different resource aggregation models for coalition formation. Finally, we may allocate the revenues based on their attributions according to the concept of the "Shapley value" to enable a more impartial revenue share among the cooperators. It is shown that a dynamic and flexible resource-management method can be developed based on the proposed Crowd-funding model, relying on the spare resources in the network.

  11. Crowd-Funding: A New Resource Cooperation Mode for Mobile Cloud Computing.

    Science.gov (United States)

    Zhang, Nan; Yang, Xiaolong; Zhang, Min; Sun, Yan

    2016-01-01

    Mobile cloud computing, which integrates the cloud computing techniques into the mobile environment, is regarded as one of the enabler technologies for 5G mobile wireless networks. There are many sporadic spare resources distributed within various devices in the networks, which can be used to support mobile cloud applications. However, these devices, with only a few spare resources, cannot support some resource-intensive mobile applications alone. If some of them cooperate with each other and share their resources, then they can support many applications. In this paper, we propose a resource cooperative provision mode referred to as "Crowd-funding", which is designed to aggregate the distributed devices together as the resource provider of mobile applications. Moreover, to facilitate high-efficiency resource management via dynamic resource allocation, different resource providers should be selected to form a stable resource coalition for different requirements. Thus, considering different requirements, we propose two different resource aggregation models for coalition formation. Finally, we may allocate the revenues based on their attributions according to the concept of the "Shapley value" to enable a more impartial revenue share among the cooperators. It is shown that a dynamic and flexible resource-management method can be developed based on the proposed Crowd-funding model, relying on the spare resources in the network.
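
    The Shapley-value revenue allocation mentioned in the two records above can be illustrated with an exact computation over a small coalition of devices. The value function below (revenue earned by a coalition from the resources it pools) and the device data are invented for illustration; they are not the models proposed in the paper.

        from itertools import combinations
        from math import factorial

        def shapley_values(players, value):
            """Exact Shapley values for a small cooperative game.
            value(coalition) -> revenue that coalition can earn on its own."""
            n = len(players)
            shares = {p: 0.0 for p in players}
            for p in players:
                others = [q for q in players if q != p]
                for k in range(len(others) + 1):
                    for coalition in combinations(others, k):
                        weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                        marginal = value(coalition + (p,)) - value(coalition)
                        shares[p] += weight * marginal
            return shares

        # toy revenue function: a coalition earns 10 units per CPU core pooled,
        # plus a 20-unit bonus once it can pool at least 8 cores
        cores = {"phone": 2, "tablet": 4, "laptop": 8}
        def revenue(coalition):
            total = sum(cores[d] for d in coalition)
            return 10 * total + (20 if total >= 8 else 0)

        print(shapley_values(list(cores), revenue))

    By construction the shares sum to the grand coalition's revenue, which is what makes the Shapley value attractive as an impartial split among cooperating resource providers.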

  12. Computer Model to Estimate Reliability Engineering for Air Conditioning Systems

    International Nuclear Information System (INIS)

    Afrah Al-Bossly, A.; El-Berry, A.; El-Berry, A.

    2012-01-01

    Reliability engineering is used to predict performance and to optimize the design and maintenance of air conditioning systems. Air conditioning systems are exposed to a number of failures. Failures such as failure to turn on, loss of cooling capacity, reduced output temperatures, loss of cool air supply and complete loss of air flow can be due to a variety of problems with one or more components of an air conditioner or air conditioning system. Forecasting system failure rates is therefore very important for maintenance. This paper focuses on the reliability of air conditioning systems, using two statistical distributions commonly applied in reliability settings: the standard (two-parameter) Weibull and Gamma distributions. After the distribution parameters had been estimated, reliability estimates and predictions were used for the evaluations. To evaluate operating condition in a building, the reliability of the air conditioning system that supplies conditioned air to the company's several departments was assessed. This air conditioning system is divided into two parts, namely the main chilled water system and the ten air handling systems that serve the ten departments. In a chilled-water system the air conditioner cools water down to 40-45 degree F (4-7 degree C). The chilled water is distributed throughout the building in a piping system and connected to air conditioning cooling units wherever needed. Data analysis was performed with the support of computer-aided reliability software; the Weibull and Gamma distributions indicated system reliabilities of 86.012% and 77.7%, respectively. A comparison between these two important families of distribution functions, namely the Weibull and Gamma families, was studied, and the Weibull method was found to be preferable for decision making.
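
    The Weibull part of such an analysis can be sketched in a few lines: fit a two-parameter Weibull to failure times and read the reliability at a mission time from the survival function. The failure-time data, shape/scale values and mission time below are synthetic placeholders, not the study's data.

        import numpy as np
        from scipy import stats

        # synthetic times-to-failure (months) for an air-handling unit
        failures = stats.weibull_min.rvs(c=1.8, scale=36.0, size=60,
                                         random_state=7)

        # fit a 2-parameter Weibull (location fixed at zero)
        shape, loc, scale = stats.weibull_min.fit(failures, floc=0)

        t = 12.0                                       # mission time, months
        reliability = stats.weibull_min.sf(t, shape, loc=loc, scale=scale)
        print(f"shape={shape:.2f}, scale={scale:.1f} months, "
              f"R({t:.0f} months)={reliability:.3f}")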

  13. A Simulation-Based Soft Error Estimation Methodology for Computer Systems

    OpenAIRE

    Sugihara, Makoto; Ishihara, Tohru; Hashimoto, Koji; Muroyama, Masanori

    2006-01-01

    This paper proposes a simulation-based soft error estimation methodology for computer systems. Accumulating soft error rates (SERs) of all memories in a computer system results in pessimistic soft error estimation. This is because memory cells are used spatially and temporally and not all soft errors in them make the computer system faulty. Our soft-error estimation methodology considers the locations and the timings of soft errors occurring at every level of memory hierarchy and estimates th...

  14. Surgical resource utilization in urban terrorist bombing: a computer simulation.

    Science.gov (United States)

    Hirshberg, A; Stein, M; Walden, R

    1999-09-01

    The objective of this study was to analyze the utilization of surgical staff and facilities during an urban terrorist bombing incident. A discrete-event computer model of the emergency room and related hospital facilities was constructed and implemented, based on cumulated data from 12 urban terrorist bombing incidents in Israel. The simulation predicts that the admitting capacity of the hospital depends primarily on the number of available surgeons and defines an optimal staff profile for surgeons, residents, and trauma nurses. The major bottlenecks in the flow of critical casualties are the shock rooms and the computed tomographic scanner but not the operating rooms. The simulation also defines the number of reinforcement staff needed to treat noncritical casualties and shows that radiology is the major obstacle to the flow of these patients. Computer simulation is an important new tool for the optimization of surgical service elements for a multiple-casualty situation.

  15. Computation of groundwater resources and recharge in Chithar River Basin, South India.

    Science.gov (United States)

    Subramani, T; Babu, Savithri; Elango, L

    2013-01-01

    Groundwater recharge and available groundwater resources in the Chithar River basin, Tamil Nadu, India, spread over an area of 1,722 km², have been estimated by considering various hydrological, geological, and hydrogeological parameters, such as rainfall infiltration, drainage, geomorphic units, land use, rock types, depth of weathered and fractured zones, nature of soil, water level fluctuation, saturated thickness of aquifer, and groundwater abstraction. The digital ground elevation models indicate that the regional slope of the basin is towards the east. The Proterozoic (Post-Archaean) basement of the study area consists of quartzite, calc-granulite, crystalline limestone, charnockite, and biotite gneiss with or without garnet. Three major soil types were identified, namely black cotton, deep red, and red sandy soils. The rainfall intensity gradually decreases from west to east. Groundwater occurs under water table conditions in the weathered zone and fluctuates between 0 and 25 m. The water table reaches its maximum in January, after the northeast monsoon, and its minimum in October. Groundwater abstraction for domestic/stock and irrigational needs in the Chithar River basin has been estimated as 148.84 MCM (million m³). Groundwater recharge due to monsoon rainfall infiltration has been estimated as 170.05 MCM based on the water level rise during the monsoon period. It is also estimated as 173.9 MCM using a rainfall infiltration factor. An amount of 53.8 MCM of water is contributed to groundwater from surface water bodies. Recharge of groundwater due to return flow from irrigation has been computed as 147.6 MCM. The static groundwater reserve in the Chithar River basin is estimated as 466.66 MCM and the dynamic reserve is about 187.7 MCM. In the present scenario, the aquifer is in a safe condition for extraction of groundwater for domestic and irrigation purposes. If the existing water bodies are maintained properly, the extraction rate can be increased in future by about 10% to 15%.
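
    The balance implied by these figures is simple bookkeeping, sketched below using the numbers quoted in the abstract. The stage-of-development check at the end uses a commonly applied criterion (a stage well below roughly 70% is conventionally treated as "safe"); it is an illustration, not necessarily the authors' exact formula.

        # recharge components and draft, in MCM (from the abstract above)
        recharge = {
            "monsoon rainfall infiltration": 170.05,
            "surface water bodies": 53.8,
            "irrigation return flow": 147.6,
        }
        abstraction = 148.84          # domestic, stock and irrigation draft

        total_recharge = sum(recharge.values())
        stage_of_development = 100.0 * abstraction / total_recharge

        print(f"total annual recharge : {total_recharge:.2f} MCM")
        print(f"annual abstraction    : {abstraction:.2f} MCM")
        print(f"stage of development  : {stage_of_development:.1f} %")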

  16. iTools: a framework for classification, categorization and integration of computational biology resources.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    2008-05-01

    Full Text Available The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long

  17. Optimal Computing Resource Management Based on Utility Maximization in Mobile Crowdsourcing

    Directory of Open Access Journals (Sweden)

    Haoyu Meng

    2017-01-01

    Full Text Available Mobile crowdsourcing, as an emerging service paradigm, enables the computing resource requestor (CRR to outsource computation tasks to each computing resource provider (CRP. Considering the importance of pricing as an essential incentive to coordinate the real-time interaction among the CRR and CRPs, in this paper, we propose an optimal real-time pricing strategy for computing resource management in mobile crowdsourcing. Firstly, we analytically model the CRR and CRPs behaviors in form of carefully selected utility and cost functions, based on concepts from microeconomics. Secondly, we propose a distributed algorithm through the exchange of control messages, which contain the information of computing resource demand/supply and real-time prices. We show that there exist real-time prices that can align individual optimality with systematic optimality. Finally, we also take account of the interaction among CRPs and formulate the computing resource management as a game with Nash equilibrium achievable via best response. Simulation results demonstrate that the proposed distributed algorithm can potentially benefit both the CRR and CRPs. The coordinator in mobile crowdsourcing can thus use the optimal real-time pricing strategy to manage computing resources towards the benefit of the overall system.

  18. Assessing attitudes toward computers and the use of Internet resources among undergraduate microbiology students

    Science.gov (United States)

    Anderson, Delia Marie Castro

    Computer literacy and use have become commonplace in our colleges and universities. In an environment that demands the use of technology, educators should be knowledgeable of the components that make up the overall computer attitude of students and be willing to investigate the processes and techniques of effective teaching and learning that can take place with computer technology. The purpose of this study is two fold. First, it investigates the relationship between computer attitudes and gender, ethnicity, and computer experience. Second, it addresses the question of whether, and to what extent, students' attitudes toward computers change over a 16 week period in an undergraduate microbiology course that supplements the traditional lecture with computer-driven assignments. Multiple regression analyses, using data from the Computer Attitudes Scale (Loyd & Loyd, 1985), showed that, in the experimental group, no significant relationships were found between computer anxiety and gender or ethnicity or between computer confidence and gender or ethnicity. However, students who used computers the longest (p = .001) and who were self-taught (p = .046) had the lowest computer anxiety levels. Likewise students who used computers the longest (p = .001) and who were self-taught (p = .041) had the highest confidence levels. No significant relationships between computer liking, usefulness, or the use of Internet resources and gender, ethnicity, or computer experience were found. Dependent T-tests were performed to determine whether computer attitude scores (pretest and posttest) increased over a 16-week period for students who had been exposed to computer-driven assignments and other Internet resources. Results showed that students in the experimental group were less anxious about working with computers and considered computers to be more useful. In the control group, no significant changes in computer anxiety, confidence, liking, or usefulness were noted. Overall, students in

  19. Energy-efficient cloud computing : autonomic resource provisioning for datacenters

    OpenAIRE

    Tesfatsion, Selome Kostentinos

    2018-01-01

    Energy efficiency has become an increasingly important concern in data centers because of issues associated with energy consumption, such as capital costs, operating expenses, and environmental impact. While energy loss due to suboptimal use of facilities and non-IT equipment has largely been reduced through the use of best-practice technologies, addressing energy wastage in IT equipment still requires the design and implementation of energy-aware resource management systems. This thesis focu...

  20. TOWARDS NEW COMPUTATIONAL ARCHITECTURES FOR MASS-COLLABORATIVE OPEN EDUCATIONAL RESOURCES

    OpenAIRE

    Ismar Frango Silveira; Xavier Ochoa; Antonio Silva Sprock; Pollyana Notargiacomo Mustaro; Yosly C. Hernandez Bieluskas

    2011-01-01

    Open Educational Resources offer several benefits, mostly in education and training. Being potentially reusable, their use can reduce the time and cost of developing educational programs, so that these savings could be transferred directly to students through the production of a large range of open, freely available content, which varies from hypermedia to digital textbooks. This paper discusses this issue and presents a project and a research network that, in spite of being directed to Latin America'...

  1. PNNL supercomputer to become largest computing resource on the Grid

    CERN Multimedia

    2002-01-01

    Hewlett Packard announced that the US DOE Pacific Northwest National Laboratory will connect a 9.3-teraflop HP supercomputer to the DOE Science Grid. This will be the largest supercomputer attached to a computer grid anywhere in the world (1 page).

  2. Computer System Resource Requirements of Novice Programming Students.

    Science.gov (United States)

    Nutt, Gary J.

    The characteristics of jobs that constitute the mix for lower division FORTRAN classes in a university were investigated. Samples of these programs were also benchmarked on a larger central site computer and two minicomputer systems. It was concluded that a carefully chosen minicomputer system could offer service at least the equivalent of the…

  3. A Novel Resource Management Method of Providing Operating System as a Service for Mobile Transparent Computing

    Directory of Open Access Journals (Sweden)

    Yonghua Xiong

    2014-01-01

    Full Text Available This paper presents a framework for mobile transparent computing. It extends the PC transparent computing to mobile terminals. Since resources contain different kinds of operating systems and user data that are stored in a remote server, how to manage the network resources is essential. In this paper, we apply the technologies of quick emulator (QEMU virtualization and mobile agent for mobile transparent computing (MTC to devise a method of managing shared resources and services management (SRSM. It has three layers: a user layer, a manage layer, and a resource layer. A mobile virtual terminal in the user layer and virtual resource management in the manage layer cooperate to maintain the SRSM function accurately according to the user’s requirements. An example of SRSM is used to validate this method. Experiment results show that the strategy is effective and stable.

  4. A novel resource management method of providing operating system as a service for mobile transparent computing.

    Science.gov (United States)

    Xiong, Yonghua; Huang, Suzhen; Wu, Min; Zhang, Yaoxue; She, Jinhua

    2014-01-01

    This paper presents a framework for mobile transparent computing. It extends the PC transparent computing to mobile terminals. Since resources contain different kinds of operating systems and user data that are stored in a remote server, how to manage the network resources is essential. In this paper, we apply the technologies of quick emulator (QEMU) virtualization and mobile agent for mobile transparent computing (MTC) to devise a method of managing shared resources and services management (SRSM). It has three layers: a user layer, a manage layer, and a resource layer. A mobile virtual terminal in the user layer and virtual resource management in the manage layer cooperate to maintain the SRSM function accurately according to the user's requirements. An example of SRSM is used to validate this method. Experiment results show that the strategy is effective and stable.

  5. Logical and physical resource management in the common node of a distributed function laboratory computer network

    International Nuclear Information System (INIS)

    Stubblefield, F.W.

    1976-01-01

    A scheme for managing resources required for transaction processing in the common node of a distributed function computer system has been given. The scheme has been found to be satisfactory for all common node services provided so far

  6. Regional research exploitation of the LHC a case-study of the required computing resources

    CERN Document Server

    Almehed, S; Eerola, Paule Anna Mari; Mjörnmark, U; Smirnova, O G; Zacharatou-Jarlskog, C; Åkesson, T

    2002-01-01

    A simulation study to evaluate the required computing resources for a research exploitation of the Large Hadron Collider (LHC) has been performed. The evaluation was done as a case study, assuming the existence of a Nordic regional centre and using the requirements for performing a specific physics analysis as a yardstick. Other input parameters were: an assumption for the distribution of researchers at the institutions involved, an analysis model, and two different functional structures of the computing resources.

  7. Economic models for management of resources in peer-to-peer and grid computing

    Science.gov (United States)

    Buyya, Rajkumar; Stockinger, Heinz; Giddy, Jonathan; Abramson, David

    2001-07-01

    The accelerated development in Peer-to-Peer (P2P) and Grid computing has positioned them as promising next generation computing platforms. They enable the creation of Virtual Enterprises (VE) for sharing resources distributed across the world. However, resource management, application development and usage models in these environments is a complex undertaking. This is due to the geographic distribution of resources that are owned by different organizations or peers. The resource owners of each of these resources have different usage or access policies and cost models, and varying loads and availability. In order to address complex resource management issues, we have proposed a computational economy framework for resource allocation and for regulating supply and demand in Grid computing environments. The framework provides mechanisms for optimizing resource provider and consumer objective functions through trading and brokering services. In a real world market, there exist various economic models for setting the price for goods based on supply-and-demand and their value to the user. They include commodity market, posted price, tenders and auctions. In this paper, we discuss the use of these models for interaction between Grid components in deciding resource value and the necessary infrastructure to realize them. In addition to normal services offered by Grid computing systems, we need an infrastructure to support interaction protocols, allocation mechanisms, currency, secure banking, and enforcement services. Furthermore, we demonstrate the usage of some of these economic models in resource brokering through Nimrod/G deadline and cost-based scheduling for two different optimization strategies on the World Wide Grid (WWG) testbed that contains peer-to-peer resources located on five continents: Asia, Australia, Europe, North America, and South America.

  8. MCPLOTS: a particle physics resource based on volunteer computing

    CERN Document Server

    Karneyeu, A; Prestel, S; Skands, P Z

    2014-01-01

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME platform.

  9. MCPLOTS. A particle physics resource based on volunteer computing

    Energy Technology Data Exchange (ETDEWEB)

    Karneyeu, A. [Joint Inst. for Nuclear Research, Moscow (Russian Federation); Mijovic, L. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Irfu/SPP, CEA-Saclay, Gif-sur-Yvette (France); Prestel, S. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Lund Univ. (Sweden). Dept. of Astronomy and Theoretical Physics; Skands, P.Z. [European Organization for Nuclear Research (CERN), Geneva (Switzerland)

    2013-07-15

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME 2.0 platform.

  10. MCPLOTS: a particle physics resource based on volunteer computing

    International Nuclear Information System (INIS)

    Karneyeu, A.; Mijovic, L.; Prestel, S.; Skands, P.Z.

    2014-01-01

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME 2.0 platform. (orig.)

  11. MCPLOTS. A particle physics resource based on volunteer computing

    International Nuclear Information System (INIS)

    Karneyeu, A.; Mijovic, L.; Prestel, S.

    2013-07-01

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME 2.0 platform.

  12. General-purpose computer networks and resource sharing in ERDA. Volume 3. Remote resource-sharing experience and findings

    Energy Technology Data Exchange (ETDEWEB)

    1977-07-15

    The investigation focused on heterogeneous networks in which a variety of dissimilar computers and operating systems were interconnected nationwide. Homogeneous networks, such as MFE net and SACNET, were not considered since they could not be used for general purpose resource sharing. Issues of privacy and security are of concern in any network activity. However, consideration of privacy and security of sensitive data arise to a much lesser degree in unclassified scientific research than in areas involving personal or proprietary information. Therefore, the existing mechanisms at individual sites for protecting sensitive data were relied on, and no new protection mechanisms to prevent infringement of privacy and security were attempted. Further development of ERDA networking will need to incorporate additional mechanisms to prevent infringement of privacy. The investigation itself furnishes an excellent example of computational resource sharing through a heterogeneous network. More than twenty persons, representing seven ERDA computing sites, made extensive use of both ERDA and non-ERDA computers in coordinating, compiling, and formatting the data which constitute the bulk of this report. Volume 3 analyzes the benefits and barriers encountered in actual resource sharing experience, and provides case histories of typical applications.

  13. Campus Grids: Bringing Additional Computational Resources to HEP Researchers

    International Nuclear Information System (INIS)

    Weitzel, Derek; Fraser, Dan; Bockelman, Brian; Swanson, David

    2012-01-01

    It is common at research institutions to maintain multiple clusters that represent different owners or generations of hardware, or that fulfill different needs and policies. Many of these clusters are consistently under utilized while researchers on campus could greatly benefit from these unused capabilities. By leveraging principles from the Open Science Grid it is now possible to utilize these resources by forming a lightweight campus grid. The campus grids framework enables jobs that are submitted to one cluster to overflow, when necessary, to other clusters within the campus using whatever authentication mechanisms are available on campus. This framework is currently being used on several campuses to run HEP and other science jobs. Further, the framework has in some cases been expanded beyond the campus boundary by bridging campus grids into a regional grid, and can even be used to integrate resources from a national cyberinfrastructure such as the Open Science Grid. This paper will highlight 18 months of operational experiences creating campus grids in the US, and the different campus configurations that have successfully utilized the campus grid infrastructure.

  14. Overview of the Practical and Theoretical Approaches to the Estimation of Mineral Resources. A Financial Perspective

    Directory of Open Access Journals (Sweden)

    Leontina Pavaloaia

    2012-10-01

    Full Text Available Mineral resources represent an important natural resource whose exploitation, unless it is rational, can lead to their exhaustion and the collapse of sustainable development. Given the importance of mineral resources and the uncertainty concerning the estimation of extant reserves, they have been analyzed by several national and international institutions. In this article we shall present a few aspects concerning the ways to approach the reserves of mineral resources at national and international level, by considering both economic aspects and those aspects concerned with the definition, classification and aggregation of the reserves of mineral resources by various specialized institutions. At present there are attempts to homogenize practices concerning these aspects for the purpose of presenting correct and comparable information.

  15. Radiotherapy infrastructure and human resources in Switzerland : Present status and projected computations for 2020.

    Science.gov (United States)

    Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar; Zwahlen, Daniel; Bodis, Stephan

    2016-09-01

    The purpose of this study was to evaluate the present status of radiotherapy infrastructure and human resources in Switzerland and compute projections for 2020. The European Society of Therapeutic Radiation Oncology "Quantification of Radiation Therapy Infrastructure and Staffing" guidelines (ESTRO-QUARTS) and those of the International Atomic Energy Agency (IAEA) were applied to estimate the requirements for teleradiotherapy (TRT) units, radiation oncologists (RO), medical physicists (MP) and radiotherapy technologists (RTT). The databases used for computation of the present gap and additional requirements are (a) Global Cancer Incidence, Mortality and Prevalence (GLOBOCAN) for cancer incidence (b) the Directory of Radiotherapy Centres (DIRAC) of the IAEA for existing TRT units (c) human resources from the recent ESTRO "Health Economics in Radiation Oncology" (HERO) survey and (d) radiotherapy utilization (RTU) rates for each tumour site, published by the Ingham Institute for Applied Medical Research (IIAMR). In 2015, 30,999 of 45,903 cancer patients would have required radiotherapy. By 2020, this will have increased to 34,041 of 50,427 cancer patients. Switzerland presently has an adequate number of TRTs, but a deficit of 57 ROs, 14 MPs and 36 RTTs. By 2020, an additional 7 TRTs, 72 ROs, 22 MPs and 66 RTTs will be required. In addition, a realistic dynamic model for calculation of staff requirements due to anticipated changes in future radiotherapy practices has been proposed. This model could be tailor-made and individualized for any radiotherapy centre. A 9.8 % increase in radiotherapy requirements is expected for cancer patients over the next 5 years. The present study should assist the stakeholders and health planners in designing an appropriate strategy for meeting future radiotherapy needs for Switzerland.
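
    The QUARTS-style arithmetic behind such projections can be illustrated with a back-of-the-envelope sketch. The benchmark ratios below (patients per teletherapy unit, per radiation oncologist, per medical physicist, and technologists per unit) are placeholder assumptions, not the exact ESTRO-QUARTS or IAEA figures; only the incidence number and the implied radiotherapy utilization rate are taken from the abstract above.

        def radiotherapy_needs(cancer_incidence, rtu_rate=0.675,
                               pts_per_trt=450, pts_per_ro=250,
                               pts_per_mp=450, rtt_per_trt=6):
            """Back-of-the-envelope QUARTS-style staffing estimate.
            The benchmark ratios are placeholder values for illustration."""
            rt_patients = cancer_incidence * rtu_rate
            trt = rt_patients / pts_per_trt
            return {
                "radiotherapy patients": round(rt_patients),
                "teletherapy units": round(trt, 1),
                "radiation oncologists": round(rt_patients / pts_per_ro, 1),
                "medical physicists": round(rt_patients / pts_per_mp, 1),
                "radiotherapy technologists": round(trt * rtt_per_trt, 1),
            }

        # 2020 incidence from the abstract; rtu_rate=0.675 roughly reproduces
        # the quoted 34,041 of 50,427 patients requiring radiotherapy
        print(radiotherapy_needs(50_427))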

  16. Radiotherapy infrastructure and human resources in Switzerland. Present status and projected computations for 2020

    International Nuclear Information System (INIS)

    Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar; Zwahlen, Daniel; Bodis, Stephan

    2016-01-01

    The purpose of this study was to evaluate the present status of radiotherapy infrastructure and human resources in Switzerland and compute projections for 2020. The European Society of Therapeutic Radiation Oncology "Quantification of Radiation Therapy Infrastructure and Staffing" guidelines (ESTRO-QUARTS) and those of the International Atomic Energy Agency (IAEA) were applied to estimate the requirements for teleradiotherapy (TRT) units, radiation oncologists (RO), medical physicists (MP) and radiotherapy technologists (RTT). The databases used for computation of the present gap and additional requirements are (a) Global Cancer Incidence, Mortality and Prevalence (GLOBOCAN) for cancer incidence (b) the Directory of Radiotherapy Centres (DIRAC) of the IAEA for existing TRT units (c) human resources from the recent ESTRO "Health Economics in Radiation Oncology" (HERO) survey and (d) radiotherapy utilization (RTU) rates for each tumour site, published by the Ingham Institute for Applied Medical Research (IIAMR). In 2015, 30,999 of 45,903 cancer patients would have required radiotherapy. By 2020, this will have increased to 34,041 of 50,427 cancer patients. Switzerland presently has an adequate number of TRTs, but a deficit of 57 ROs, 14 MPs and 36 RTTs. By 2020, an additional 7 TRTs, 72 ROs, 22 MPs and 66 RTTs will be required. In addition, a realistic dynamic model for calculation of staff requirements due to anticipated changes in future radiotherapy practices has been proposed. This model could be tailor-made and individualized for any radiotherapy centre. A 9.8 % increase in radiotherapy requirements is expected for cancer patients over the next 5 years. The present study should assist the stakeholders and health planners in designing an appropriate strategy for meeting future radiotherapy needs for Switzerland.

  17. Estimation of uranium resources by life-cycle or discovery-rate models: a critique

    International Nuclear Information System (INIS)

    Harris, D.P.

    1976-10-01

    This report was motivated primarily by M. A. Lieberman's "United States Uranium Resources: An Analysis of Historical Data" (Science, April 30). His conclusion that only 87,000 tons of U3O8 resources recoverable at a forward cost of $8/lb remain to be discovered is criticized. It is shown that there is no theoretical basis for selecting the exponential or any other function for the discovery rate. Some of the economic (productivity, inflation) and data issues involved in basing the analysis of undiscovered, recoverable U3O8 resources on the discovery rates of $8 reserves are discussed. The problem of the ratio of undiscovered $30 resources to undiscovered $8 resources is considered. It is concluded that all methods for the estimation of unknown resources must employ a model of some form of the endowment-exploration-production complex, but every model is a simplification of the real world, and every estimate is intrinsically uncertain. The life-cycle model is useless for the appraisal of undiscovered, recoverable U3O8, and the discovery rate model underestimates these resources

  18. Using High Performance Computing to Support Water Resource Planning

    Energy Technology Data Exchange (ETDEWEB)

    Groves, David G. [RAND Corporation, Santa Monica, CA (United States); Lembert, Robert J. [RAND Corporation, Santa Monica, CA (United States); May, Deborah W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Leek, James R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Syme, James [RAND Corporation, Santa Monica, CA (United States)

    2015-10-22

    In recent years, decision support modeling has embraced deliberation with analysis, an iterative process in which decisionmakers come together with experts to evaluate a complex problem and alternative solutions in a scientifically rigorous and transparent manner. Simulation modeling supports decisionmaking throughout this process; visualizations enable decisionmakers to assess how proposed strategies stand up over time in uncertain conditions. But running these simulation models on standard computers can be slow. This, in turn, can slow the entire decisionmaking process, interrupting valuable interaction between decisionmakers and analytics.

  19. BelleII@home: Integrate volunteer computing resources into DIRAC in a secure way

    Science.gov (United States)

    Wu, Wenjing; Hara, Takanori; Miyake, Hideki; Ueda, Ikuo; Kan, Wenxiao; Urquijo, Phillip

    2017-10-01

    The exploitation of volunteer computing resources has become a popular practice in the HEP computing community because of the huge amount of potential computing power it provides. In recent HEP experiments, grid middleware has been used to organize the services and the resources; however, it relies heavily on X.509 authentication, which is contradictory to the untrusted nature of volunteer computing resources. One big challenge in utilizing volunteer computing resources is therefore how to integrate them into the grid middleware in a secure way. The DIRAC interware, which is commonly used as the major component of the grid computing infrastructure for several HEP experiments, poses an even bigger challenge to this paradox, as its pilot is more closely coupled with operations requiring X.509 authentication than the pilot implementations of its peer grid interware. The Belle II experiment is a B-factory experiment at KEK, and it uses DIRAC for its distributed computing. In the BelleII@home project, in order to integrate volunteer computing resources into the Belle II distributed computing platform in a secure way, we adopted a new approach which detaches the payload from the Belle II DIRAC pilot (a customized pilot that pulls and processes jobs from the Belle II distributed computing platform), so that the payload can run on volunteer computers without requiring any X.509 authentication. In this approach we developed a gateway service running on a trusted server which handles all the operations requiring X.509 authentication. So far, we have developed and deployed the prototype of BelleII@home and tested its full workflow, which proves the feasibility of this approach. This approach can also be applied to HPC systems whose worker nodes do not have outbound connectivity to interact with the DIRAC system.

  20. Decision making in water resource planning: Models and computer graphics

    Energy Technology Data Exchange (ETDEWEB)

    Fedra, K; Carlsen, A J [ed.

    1987-01-01

    This paper describes some basic concepts of simulation-based decision support systems for water resources management and the role of symbolic, graphics-based user interfaces. Designed to allow direct and easy access to advanced methods of analysis and decision support for a broad and heterogeneous group of users, these systems combine database management, system simulation, operations research techniques such as optimization, interactive data analysis, elements of advanced decision technology, and artificial intelligence, with a friendly and conversational, symbolic display oriented user interface. Important features of the interface are the use of several parallel or alternative styles of interaction and display, including colour graphics and natural language. Combining quantitative numerical methods with qualitative and heuristic approaches, and giving the user direct and interactive control over the system's functions, human knowledge, experience and judgement are integrated with formal approaches into a tightly coupled man-machine system through an intelligent and easily accessible user interface. 4 drawings, 42 references.

  1. Monitoring of computing resource utilization of the ATLAS experiment

    International Nuclear Information System (INIS)

    Rousseau, David; Vukotic, Ilija; Schaffer, RD; Dimitrov, Gancho; Aidel, Osman; Albrand, Solveig

    2012-01-01

    Due to the good performance of the LHC accelerator, the ATLAS experiment has seen higher than anticipated levels for both the event rate and the average number of interactions per bunch crossing. In order to respond to these changing requirements, the current and future usage of CPU, memory and disk resources has to be monitored, understood and acted upon. This requires data collection at a fairly fine level of granularity: the performance of each object written and each algorithm run, as well as a dozen per-job variables, are gathered for the different processing steps of Monte Carlo generation and simulation and the reconstruction of both data and Monte Carlo. We present a system to collect and visualize the data from both the online Tier-0 system and distributed grid production jobs. Around 40 GB of performance data are expected from up to 200k jobs per day, thus making performance optimization of the underlying Oracle database of utmost importance.

  2. Mobile Cloud Computing: Resource Discovery, Session Connectivity and Other Open Issues

    NARCIS (Netherlands)

    Schüring, Markus; Karagiannis, Georgios

    2011-01-01

    Cloud computing can be considered as a model that provides network access to a shared pool of resources, such as storage and computing power, which can be rapidly provisioned and released with minimal management effort. This paper describes a research activity in the area of mobile cloud computing.

  3. COMPUTER SUPPORT SYSTEMS FOR ESTIMATING CHEMICAL TOXICITY: PRESENT CAPABILITIES AND FUTURE TRENDS

    Science.gov (United States)

    A wide variety of computer-based artificial intelligence (AI) and decision support systems exist currently to aid in the assessment of toxicity for environmental chemicals. T...

  4. An Improved Global Wind Resource Estimate for Integrated Assessment Models: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Eurek, Kelly [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sullivan, Patrick [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gleason, Michael [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hettinger, Dylan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heimiller, Donna [National Renewable Energy Lab. (NREL), Golden, CO (United States); Lopez, Anthony [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-02-01

    This paper summarizes initial steps to improving the robustness and accuracy of global renewable resource and techno-economic assessments for use in integrated assessment models. We outline a method to construct country-level wind resource supply curves, delineated by resource quality and other parameters. Using mesoscale reanalysis data, we generate estimates for wind quality, both terrestrial and offshore, across the globe. Because not all land or water area is suitable for development, appropriate database layers provide exclusions to reduce the total resource to its technical potential. We expand upon estimates from related studies by: using a globally consistent data source of uniquely detailed wind speed characterizations; assuming a non-constant coefficient of performance for adjusting power curves for altitude; categorizing the distance from resource sites to the electric power grid; and characterizing offshore exclusions on the basis of sea ice concentrations. The product, then, is technical potential by country, classified by resource quality as determined by net capacity factor. Additional classification dimensions are available, including distance to transmission networks for terrestrial wind, and distance to shore and water depth for offshore wind. We estimate a total global wind generation potential of 560 PWh for terrestrial wind, with 90% of the resource classified as low-to-mid quality, and 315 PWh for offshore wind, with 67% classified as mid-to-high quality. These estimates are based on 3.5 MW composite wind turbines with 90 m hub heights, 0.95 availability, 90% array efficiency, and 5 MW/km2 deployment density in non-excluded areas. We compare the underlying technical assumptions and results with other global assessments.
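
    The headline potential numbers follow from capacity-density and capacity-factor arithmetic over non-excluded area. The sketch below illustrates that calculation under the turbine assumptions quoted above; the developable area and net capacity factor in the example are hypothetical, and the real assessment derives capacity factors from reanalysis data rather than assuming one.

    # Back-of-the-envelope technical potential from developable area, deployment
    # density and capacity factor, using the assumptions quoted above
    # (5 MW/km2, 0.95 availability, 90% array efficiency). The example area and
    # net capacity factor are hypothetical.
    HOURS_PER_YEAR = 8760

    def technical_potential_twh(area_km2, net_capacity_factor,
                                density_mw_per_km2=5.0,
                                availability=0.95, array_efficiency=0.90):
        """Annual generation potential (TWh) for a non-excluded area."""
        capacity_mw = area_km2 * density_mw_per_km2
        energy_mwh = (capacity_mw * net_capacity_factor * availability
                      * array_efficiency * HOURS_PER_YEAR)
        return energy_mwh / 1e6  # MWh -> TWh

    # Example: 100,000 km2 of developable land in a single resource class.
    print(f"{technical_potential_twh(100_000, 0.35):.0f} TWh per year")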

  5. Accounting for animal movement in estimation of resource selection functions: sampling and data analysis.

    Science.gov (United States)

    Forester, James D; Im, Hae Kyung; Rathouz, Paul J

    2009-12-01

    Patterns of resource selection by animal populations emerge as a result of the behavior of many individuals. Statistical models that describe these population-level patterns of habitat use can miss important interactions between individual animals and characteristics of their local environment; however, identifying these interactions is difficult. One approach to this problem is to incorporate models of individual movement into resource selection models. To do this, we propose a model for step selection functions (SSF) that is composed of a resource-independent movement kernel and a resource selection function (RSF). We show that standard case-control logistic regression may be used to fit the SSF; however, the sampling scheme used to generate control points (i.e., the definition of availability) must be accommodated. We used three sampling schemes to analyze simulated movement data and found that ignoring sampling and the resource-independent movement kernel yielded biased estimates of selection. The level of bias depended on the method used to generate control locations, the strength of selection, and the spatial scale of the resource map. Using empirical or parametric methods to sample control locations produced biased estimates under stronger selection; however, we show that the addition of a distance function to the analysis substantially reduced that bias. Assuming a uniform availability within a fixed buffer yielded strongly biased selection estimates that could be corrected by including the distance function but remained inefficient relative to the empirical and parametric sampling methods. As a case study, we used location data collected from elk in Yellowstone National Park, USA, to show that selection and bias may be temporally variable. Because under constant selection the amount of bias depends on the scale at which a resource is distributed in the landscape, we suggest that distance always be included as a covariate in SSF analyses. This approach to
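
    A toy version of the step-selection idea can be assembled from simulated used and available locations and ordinary logistic regression that includes step distance as a covariate, echoing the paper's recommendation to always include distance. This is a simplified, unmatched stand-in for the case-control (conditional) fit actually used; all covariate values and movement parameters below are invented.

    # Toy step-selection fit: observed steps ("used") are pooled with randomly
    # generated "available" steps and selection is estimated by logistic
    # regression with step distance as an extra covariate.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n_steps, n_controls = 500, 10

    # Hypothetical habitat covariate at step end points, and step lengths (km).
    habitat_used = rng.normal(0.5, 1.0, n_steps)           # selected habitat is "better"
    habitat_avail = rng.normal(0.0, 1.0, n_steps * n_controls)
    dist_used = rng.exponential(0.1, n_steps)              # short observed steps
    dist_avail = rng.uniform(0.0, 1.0, n_steps * n_controls)

    X = np.column_stack([np.concatenate([habitat_used, habitat_avail]),
                         np.concatenate([dist_used, dist_avail])])
    y = np.concatenate([np.ones(n_steps), np.zeros(n_steps * n_controls)])

    fit = LogisticRegression(max_iter=1000).fit(X, y)
    print("habitat selection coefficient:", round(fit.coef_[0][0], 2))  # expected > 0
    print("distance coefficient:         ", round(fit.coef_[0][1], 2))  # expected < 0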

  6. An Estimate of Shallow, Low-Temperature Geothermal Resources of the United States

    Energy Technology Data Exchange (ETDEWEB)

    Mullane, Michelle; Gleason, Michael; Reber, Tim; McCabe, Kevin; Mooney, Meghan; Young, Katherine R.

    2017-05-01

    Low-temperature geothermal resources in the United States potentially hold an enormous quantity of thermal energy, useful for direct use in residential, commercial and industrial applications such as space and water heating, greenhouse warming, pool heating, aquaculture, and low-temperature manufacturing processes. Several studies published over the past 40 years have provided assessments of the resource potential for multiple types of low-temperature geothermal systems (e.g. hydrothermal convection, hydrothermal conduction, and enhanced geothermal systems) with varying temperature ranges and depths. This paper provides a summary and additional analysis of these assessments of shallow (≤ 3 km), low-temperature (30-150 degrees C) geothermal resources in the United States, suitable for use in direct-use applications. This analysis considers six types of geothermal systems, spanning both hydrothermal and enhanced geothermal systems (EGS). We outline the primary data sources and quantitative parameters used to describe resources in each of these categories, and present summary statistics of the total resources available. In sum, we find that low-temperature hydrothermal resources and EGS resources contain approximately 8 million and 800 million TWh of heat-in-place, respectively. In future work, these resource potential estimates will be used for modeling of the technical and market potential for direct-use geothermal applications for the U.S. Department of Energy's Geothermal Vision Study.
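
    Heat-in-place figures of this kind come from volumetric accounting of the form Q = rho_c * V * (T - T_ref). A minimal sketch of that calculation is given below; the reservoir area, thickness, temperatures and volumetric heat capacity are hypothetical stand-ins, not parameters from the assessment.

    # Volumetric heat-in-place, Q = rho_c * V * (T - T_ref). All reservoir
    # parameters below are hypothetical.
    def heat_in_place_twh(area_km2, thickness_m, reservoir_temp_c,
                          reference_temp_c=15.0,
                          volumetric_heat_capacity=2.7e6):  # J/(m3 K), rock + water
        volume_m3 = area_km2 * 1e6 * thickness_m
        q_joules = volumetric_heat_capacity * volume_m3 * (reservoir_temp_c - reference_temp_c)
        return q_joules / 3.6e15  # J -> TWh (thermal)

    # Example: a 100 km2 reservoir, 500 m thick, at 90 degrees C.
    print(f"{heat_in_place_twh(100, 500, 90):.0f} TWh (thermal) in place")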

  7. An Estimate of Shallow, Low-Temperature Geothermal Resources of the United States: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Mullane, Michelle; Gleason, Michael; McCabe, Kevin; Mooney, Meghan; Reber, Timothy; Young, Katherine R.

    2016-10-01

    Low-temperature geothermal resources in the United States potentially hold an enormous quantity of thermal energy, useful for direct use in residential, commercial and industrial applications such as space and water heating, greenhouse warming, pool heating, aquaculture, and low-temperature manufacturing processes. Several studies published over the past 40 years have provided assessments of the resource potential for multiple types of low-temperature geothermal systems (e.g. hydrothermal convection, hydrothermal conduction, and enhanced geothermal systems) with varying temperature ranges and depths. This paper provides a summary and additional analysis of these assessments of shallow (≤ 3 km), low-temperature (30-150 degrees C) geothermal resources in the United States, suitable for use in direct-use applications. This analysis considers six types of geothermal systems, spanning both hydrothermal and enhanced geothermal systems (EGS). We outline the primary data sources and quantitative parameters used to describe resources in each of these categories, and present summary statistics of the total resources available. In sum, we find that low-temperature hydrothermal resources and EGS resources contain approximately 8 million and 800 million TWh of heat-in-place, respectively. In future work, these resource potential estimates will be used for modeling of the technical and market potential for direct-use geothermal applications for the U.S. Department of Energy's Geothermal Vision Study.

  8. Geological 3-D modelling and resources estimation of the Budenovskoye uranium deposit (Kazakhstan)

    International Nuclear Information System (INIS)

    Boytsov, A.; Heyns, M.; Seredkin, M.

    2014-01-01

    The Budenovskoye deposit is the biggest sandstone-hosted, roll front type uranium deposit in Kazakhstan and in the world. Uranium mineralization occurs in the unconsolidated lacustrine-alluvial sediments of the Late Cretaceous Mynkuduk and Inkuduk horizons. The Budenovskoye deposit was split into four areas for development, with the present Karatau ISL Mine operating area No. 2 and the Akbastau ISL Mine operating areas Nos. 1, 3 and 4. The mines are owned by Kazatomprom and Uranium One in equal shares. CSA Global was retained by Uranium One to update the Mineral Resource estimates for the Karatau and Akbastau Mines in accordance with NI 43-101. The modelling reports show a significant increase in total uranium resource tonnage at both mines when compared to the March 2012 NI 43-101 resource estimate: at Karatau, measured and indicated resources increased by 586 %, while at Akbastau they increased by 286 %. The modelling also added 55,766 tonnes U to the Karatau Inferred Mineral Resource category. The new estimates result from the application of 3-D modelling techniques to the extensive database of drilling information and new exploration activities.

  9. Getting the Most from Distributed Resources With an Analytics Platform for ATLAS Computing Services

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00225336; The ATLAS collaboration; Gardner, Robert; Bryant, Lincoln

    2016-01-01

    To meet a sharply increasing demand for computing resources for LHC Run 2, ATLAS distributed computing systems reach far and wide to gather CPU resources and storage capacity to execute an evolving ecosystem of production and analysis workflow tools. Indeed more than a hundred computing sites from the Worldwide LHC Computing Grid, plus many “opportunistic” facilities at HPC centers, universities, national laboratories, and public clouds, combine to meet these requirements. These resources have characteristics (such as local queuing availability, proximity to data sources and target destinations, network latency and bandwidth capacity, etc.) affecting the overall processing efficiency and throughput. To quantitatively understand and in some instances predict behavior, we have developed a platform to aggregate, index (for user queries), and analyze the more important information streams affecting performance. These data streams come from the ATLAS production system (PanDA), the distributed data management s...

  10. Schistosomiasis and water resources development: systematic review, meta-analysis, and estimates of people at risk.

    Science.gov (United States)

    Steinmann, Peter; Keiser, Jennifer; Bos, Robert; Tanner, Marcel; Utzinger, Jürg

    2006-07-01

    An estimated 779 million people are at risk of schistosomiasis, of whom 106 million (13.6%) live in irrigation schemes or in close proximity to large dam reservoirs. We identified 58 studies that examined the relation between water resources development projects and schistosomiasis, primarily in African settings. We present a systematic literature review and meta-analysis with the following objectives: (1) to update at-risk populations of schistosomiasis and number of people infected in endemic countries, and (2) to quantify the risk of water resources development and management on schistosomiasis. Using 35 datasets from 24 African studies, our meta-analysis showed pooled random risk ratios of 2.4 and 2.6 for urinary and intestinal schistosomiasis, respectively, among people living adjacent to dam reservoirs. The risk ratio estimate for studies evaluating the effect of irrigation on urinary schistosomiasis was in the range 0.02-7.3 (summary estimate 1.1) and that on intestinal schistosomiasis in the range 0.49-23.0 (summary estimate 4.7). Geographic stratification showed important spatial differences, idiosyncratic to the type of water resources development. We conclude that the development and management of water resources is an important risk factor for schistosomiasis, and hence strategies to mitigate negative effects should become integral parts in the planning, implementation, and operation of future water projects.
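
    Pooled risk ratios of the kind quoted above are typically obtained with a random-effects calculation over study-level log risk ratios, for example in the DerSimonian-Laird style sketched below. The study risk ratios and confidence intervals in the example are invented for illustration and are not the datasets used in the meta-analysis.

    # DerSimonian-Laird random-effects pooling of study-level risk ratios.
    import numpy as np

    def pooled_random_effects_rr(rr, ci_low, ci_high):
        y = np.log(rr)                                  # log risk ratios
        se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
        w = 1.0 / se**2                                 # fixed-effect weights
        y_fixed = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - y_fixed) ** 2)              # Cochran's Q
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - (len(y) - 1)) / c)         # between-study variance
        w_star = 1.0 / (se**2 + tau2)                   # random-effects weights
        return float(np.exp(np.sum(w_star * y) / np.sum(w_star)))

    rr = np.array([1.8, 2.9, 2.2, 3.4])                 # invented study risk ratios
    print(round(pooled_random_effects_rr(rr, rr * 0.7, rr * 1.4), 2))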

  11. Estimating Resource Costs of Levy Campaigns in Five Ohio School Districts

    Science.gov (United States)

    Ingle, W. Kyle; Petroff, Ruth Ann; Johnson, Paul A.

    2011-01-01

    Using Levin and McEwan's (2001) "ingredients method," this study identified the major activities and associated costs of school levy campaigns in five districts. The ingredients were divided into one of five cost categories--human resources, facilities, fees, marketing, and supplies. As to overall costs of the campaigns, estimates ranged…

  12. Analysis of the Possibility of Required Resources Estimation for Nuclear Power Plant Decommissioning Applying BIM

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Insu [Korea Institute of construction Technology, Goyang (Korea, Republic of); Kim, Woojung [KHNP-Central Research Institute, Daejeon (Korea, Republic of)

    2014-05-15

    Estimates of decommissioning cost, decommissioning strategy and decommissioning quantities are mandatory inputs when a decommissioning plan for a nuclear power plant is prepared. Past approaches to estimating the resources required for decommissioning have carried great uncertainty, since they analyze required resources at the construction stage or rely on analyses and consultations of the decommissioning resources of overseas nuclear power plants. This study aims at analyzing whether the resources required for decommissioning nuclear power plants can be estimated by applying BIM. To achieve this goal, the study analyzed the status quo of BIM, including its definition, characteristics and areas of application, and examined the types and features of the tools that implement BIM when drawing out the study results. In order to review how BIM could be used for decommissioning nuclear power plants, the definition, characteristics and applied areas of BIM were discussed. BIM designs the objects of a structure (walls, slabs, pillars, stairs, windows, doors, etc.) in 3-D and attaches attribute information (function, structure and usage) to each object, thereby providing visualized information about the structure for the participants in a construction project. The major characteristics of BIM attribute information are as follows:
    - Geometry: the information of objects is represented by measurable geometric information.
    - Extensible object attributes: objects include pre-defined attributes and allow the extension of other attributes; a model that includes these attributes forms relationships with various other attributes in order to perform analysis and simulation.
    - Integration: all information, including the attributes, is integrated to ensure continuity, accuracy and accessibility, and all information used during the life cycle of the structure is supported.
    This means that when information on required resources is added as attributes other than geometric
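
    As a rough illustration of the idea, if each BIM object carries material and volume attributes, a decommissioning quantity estimate reduces to aggregating those attributes per material or waste stream and applying unit rates. The objects, densities and labour rates below are hypothetical.

    # Hypothetical BIM-style quantity take-off for decommissioning: aggregate
    # object attributes (material, volume) into masses and labour estimates.
    from dataclasses import dataclass

    @dataclass
    class BimObject:
        name: str
        material: str      # attribute information attached to the 3-D object
        volume_m3: float

    inventory = [
        BimObject("containment wall", "reinforced concrete", 120.0),
        BimObject("turbine hall slab", "reinforced concrete", 300.0),
        BimObject("primary piping", "stainless steel", 4.5),
    ]
    density_t_per_m3 = {"reinforced concrete": 2.4, "stainless steel": 7.9}
    hours_per_tonne = {"reinforced concrete": 1.5, "stainless steel": 6.0}

    totals = {}
    for obj in inventory:
        mass = obj.volume_m3 * density_t_per_m3[obj.material]
        t, h = totals.get(obj.material, (0.0, 0.0))
        totals[obj.material] = (t + mass, h + mass * hours_per_tonne[obj.material])

    for material, (mass, hours) in totals.items():
        print(f"{material}: {mass:.0f} t, {hours:.0f} labour-hours")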

  13. Argonne's Laboratory Computing Resource Center 2009 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B. (CLS-CI)

    2011-05-13

    Now in its seventh year of operation, the Laboratory Computing Resource Center (LCRC) continues to be an integral component of science and engineering research at Argonne, supporting a diverse portfolio of projects for the U.S. Department of Energy and other sponsors. The LCRC's ongoing mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting high-performance computing application use and development. This report describes scientific activities carried out with LCRC resources in 2009 and the broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. The LCRC Allocations Committee makes decisions on individual project allocations for Jazz. Committee members are appointed by the Associate Laboratory Directors and span a range of computational disciplines. The 350-node LCRC cluster, Jazz, began production service in April 2003 and has been a research workhorse ever since. Hosting a wealth of software tools and applications and achieving high availability year after year, Jazz has allowed researchers to meet project milestones and enable breakthroughs. Over the years, many projects have achieved results that would have been unobtainable without such a computing resource. In fiscal year 2009, there were 49 active projects representing a wide cross-section of Laboratory research and almost all research divisions.

  14. Implications of applying solar industry best practice resource estimation on project financing

    International Nuclear Information System (INIS)

    Pacudan, Romeo

    2016-01-01

    Solar resource estimation risk is one of the main solar PV project risks that influence a lender's decision to provide financing and the cost of capital. More recently, a number of measures have emerged to mitigate this risk. The study focuses on the solar industry's best practice energy resource estimation and assesses its financing implications for a 27 MWp solar PV project study in Brunei Darussalam. Best practice resource estimation uses multiple data sources combined through the measure-correlate-predict (MCP) technique, whereas the standard practice relies solely on a modelled data source. The best practice case generates resource data with lower uncertainty and yields a superior high-confidence energy production estimate compared with the standard practice case. Using project financial parameters in Brunei Darussalam for project financing and adopting international debt-service coverage ratio (DSCR) benchmark rates, the best practice case yields DSCRs that surpass the target rates, while those of the standard practice case stay below the reference rates. The best practice case could also accommodate a higher debt share and has a lower levelized cost of electricity (LCOE), while the standard practice case would require a lower debt share and would have a higher LCOE.
    - Highlights:
    •Best practice solar energy resource estimation uses multiple datasets.
    •Multiple datasets are combined through the measure-correlate-predict technique.
    •Correlated data have lower uncertainty and yield a superior high-confidence energy production estimate.
    •The best practice case yields debt-service coverage ratios (DSCRs) that surpass the benchmark rates.
    •The best practice case accommodates a high debt share and has a low levelized cost of electricity.
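
    The financing logic can be illustrated with a few lines of arithmetic: lower resource uncertainty raises the exceedance-probability (e.g. P90) yield used by lenders, which raises the debt-service coverage ratio. The sketch below assumes normally distributed yield uncertainty; the production, tariff, operating cost and debt-service figures are hypothetical and are not taken from the Brunei Darussalam study.

    # Effect of resource uncertainty on the P90 yield and the DSCR. All project
    # figures (P50 yield, tariff, O&M, debt service) are hypothetical.
    Z90 = 1.282  # one-sided z-score for a 90 % exceedance level

    def p90_energy(p50_mwh, uncertainty_fraction):
        """90 %-exceedance yield assuming normally distributed uncertainty."""
        return p50_mwh * (1.0 - Z90 * uncertainty_fraction)

    def dscr(energy_mwh, tariff_per_mwh, annual_opex, annual_debt_service):
        return (energy_mwh * tariff_per_mwh - annual_opex) / annual_debt_service

    p50 = 40_000  # MWh/yr, hypothetical
    for label, unc in [("standard practice", 0.12), ("best practice (MCP)", 0.07)]:
        print(label, round(dscr(p90_energy(p50, unc), 120.0, 500_000, 2_800_000), 2))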

  15. NATO Advanced Study Institute on Statistical Treatments for Estimation of Mineral and Energy Resources

    CERN Document Server

    Fabbri, A; Sinding-Larsen, R

    1988-01-01

    This volume contains the edited papers prepared by lecturers and participants of the NATO Advanced Study Institute on "Statistical Treatments for Estimation of Mineral and Energy Resources" held at Il Ciocco (Lucca), Italy, June 22 - July 4, 1986. During the past twenty years, tremendous efforts have been made to acquire quantitative geoscience information from ore deposits, geochemical, geophysical and remotely-sensed measurements. In October 1981, a two-day symposium on "Quantitative Resource Evaluation" and a three-day workshop on "Interactive Systems for Multivariate Analysis and Image Processing for Resource Evaluation" were held in Ottawa, jointly sponsored by the Geological Survey of Canada, the International Association for Mathematical Geology, and the International Geological Correlation Programme. Thirty scientists from different countries in Europe and North America were invited to form a forum for the discussion of quantitative methods for mineral and energy resource assessment. Since then, not ...

  16. Estimating resource costs of compliance with EU WFD ecological status requirements at the river basin scale

    DEFF Research Database (Denmark)

    Riegels, Niels; Jensen, Roar; Benasson, Lisa

    2011-01-01

    Resource costs of meeting EU WFD ecological status requirements at the river basin scale are estimated by comparing net benefits of water use given ecological status constraints to baseline water use values. Resource costs are interpreted as opportunity costs of water use arising from water scarcity. An optimization approach is used to identify economically efficient ways to meet WFD requirements. The approach is implemented using a river basin simulation model coupled to an economic post-processor; the simulation model and post-processor are run from a central controller that iterates until an allocation is found that maximizes net benefits given WFD requirements. Water use values are estimated for urban/domestic, agricultural, industrial, livestock, and tourism water users. Ecological status is estimated using metrics that relate average monthly river flow volumes to the natural hydrologic regime...
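
    A stripped-down stand-in for the optimization step is a linear program that allocates scarce water among users to maximize net benefits subject to a minimum environmental flow representing the ecological-status constraint. The benefit values, demands and flows below are hypothetical; the actual study couples a full river basin simulation model to an economic post-processor rather than solving a single LP.

    # Toy allocation LP: maximize net benefits of water use subject to an
    # environmental-flow constraint. All numbers are hypothetical.
    from scipy.optimize import linprog

    users = ["urban", "agriculture", "industry"]
    benefit_per_mcm = [0.9, 0.25, 0.6]   # net benefit (MEUR) per million m3
    demand_mcm = [60, 180, 90]           # maximum useful allocation per user
    inflow_mcm, environmental_flow_mcm = 250, 80

    # linprog minimizes, so negate the benefits; allocations share the inflow
    # left over after the environmental flow is reserved.
    res = linprog(c=[-b for b in benefit_per_mcm],
                  A_ub=[[1, 1, 1]], b_ub=[inflow_mcm - environmental_flow_mcm],
                  bounds=[(0, d) for d in demand_mcm], method="highs")

    for name, x in zip(users, res.x):
        print(f"{name}: {x:.0f} Mm3")
    print(f"net benefit: {-res.fun:.1f} MEUR")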

  17. Sensor and computing resource management for a small satellite

    Science.gov (United States)

    Bhatia, Abhilasha; Goehner, Kyle; Sand, John; Straub, Jeremy; Mohammad, Atif; Korvald, Christoffer; Nervold, Anders Kose

    A small satellite in a low-Earth orbit (e.g., approximately a 300 to 400 km altitude) has an orbital velocity in the range of 8.5 km/s and completes an orbit approximately every 90 minutes. For a satellite with minimal attitude control, this presents a significant challenge in obtaining multiple images of a target region. Presuming an inclination in the range of 50 to 65 degrees, a limited number of opportunities to image a given target or communicate with a given ground station are available over the course of a 24-hour period. For imaging needs (where solar illumination is required), the number of opportunities is further reduced. Given these short windows of opportunity for imaging, data transfer, and sending commands, scheduling must be optimized. In addition to the high-level scheduling performed for spacecraft operations, payload-level scheduling is also required. The mission requires that images be post-processed to maximize spatial resolution and minimize data transfer (through removing overlapping regions). The payload unit includes GPS and inertial measurement unit (IMU) hardware to aid in image alignment for this post-processing. The payload scheduler must therefore split its energy and computing-cycle budgets between determining an imaging sequence (required to capture the highly-overlapping data required for super-resolution and adjacent areas required for mosaicking), processing the imagery (to perform the super-resolution and mosaicking) and preparing the data for transmission (compressing it, etc.). This paper presents an approach for satellite control, scheduling and operations that allows the cameras, GPS and IMU to be used in conjunction to acquire higher-resolution imagery of a target region.

  18. Software Defined Resource Orchestration System for Multitask Application in Heterogeneous Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Qi Qi

    2016-01-01

    Full Text Available Mobile cloud computing (MCC), which combines mobile computing with the cloud concept, uses the wireless access network as the transmission medium and mobile devices as clients. When a complicated multitask application is offloaded to the MCC environment, each task executes individually according to its own computation, storage, and bandwidth requirements. Due to user mobility, the provided resources exhibit different performance metrics that may affect the choice of destination. Nevertheless, these heterogeneous MCC resources lack integrated management and can hardly cooperate with each other. Thus, how to choose the appropriate offload destination and orchestrate the resources for multiple tasks is a challenging problem. This paper realizes programmable resource provisioning for heterogeneous, energy-constrained computing environments, where a software-defined controller is responsible for resource orchestration, offloading, and migration. The resource orchestration is formulated as a multiobjective optimization problem over the metrics of energy consumption, cost, and availability. Finally, a particle swarm algorithm is used to obtain approximate optimal solutions. Simulation results show that the solutions for all of the studied cases come close to the Pareto optimum and surpass the comparative algorithm in approximation, coverage, and execution time.
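
    A compact particle-swarm sketch of the orchestration idea is given below: each particle encodes an assignment of tasks to hosts, and fitness is a weighted sum of energy, cost and unavailability. The task and host matrices, weights and swarm parameters are invented for illustration and do not reproduce the paper's formulation.

    # Weighted-sum particle swarm optimization over task-to-host assignments.
    import numpy as np

    rng = np.random.default_rng(1)
    n_tasks, n_hosts, n_particles, iters = 6, 4, 30, 100

    energy = rng.uniform(1.0, 5.0, (n_tasks, n_hosts))     # per-task energy per host
    cost = rng.uniform(0.1, 1.0, (n_tasks, n_hosts))       # monetary cost
    unavail = rng.uniform(0.0, 0.2, (n_tasks, n_hosts))    # 1 - availability
    weights = np.array([0.5, 0.3, 0.2])                    # objective weights

    def fitness(assignment):                               # lower is better
        idx = (np.arange(n_tasks), assignment)
        return weights @ np.array([energy[idx].sum(), cost[idx].sum(), unavail[idx].sum()])

    # Particles hold continuous positions that are rounded to host indices.
    pos = rng.uniform(0, n_hosts - 1, (n_particles, n_tasks))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_fit = np.array([fitness(p.round().astype(int)) for p in pos])
    gbest = pbest[pbest_fit.argmin()].copy()

    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0, n_hosts - 1)
        fit = np.array([fitness(p.round().astype(int)) for p in pos])
        better = fit < pbest_fit
        pbest[better], pbest_fit[better] = pos[better], fit[better]
        gbest = pbest[pbest_fit.argmin()].copy()

    best = gbest.round().astype(int)
    print("best task-to-host assignment:", best, "fitness:", round(float(fitness(best)), 3))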

  19. On-demand provisioning of HEP compute resources on cloud sites and shared HPC centers

    Science.gov (United States)

    Erli, G.; Fischer, F.; Fleig, G.; Giffels, M.; Hauth, T.; Quast, G.; Schnepf, M.; Heese, J.; Leppert, K.; Arnaez de Pedro, J.; Sträter, R.

    2017-10-01

    This contribution reports on solutions, experiences and recent developments with the dynamic, on-demand provisioning of remote computing resources for analysis and simulation workflows. Local resources of a physics institute are extended by private and commercial cloud sites, ranging from desktop clusters and institute clusters to HPC centers. Rather than relying on dedicated HEP computing centers, it is nowadays more reasonable and flexible to utilize remote computing capacity via virtualization techniques or container concepts. We report on recent experience from incorporating a remote HPC center (NEMO Cluster, Freiburg University) and resources dynamically requested from the commercial provider 1&1 Internet SE into our institute's computing infrastructure. The Freiburg HPC resources are requested via the standard batch system, allowing HPC and HEP applications to be executed simultaneously, such that regular batch jobs run side by side with virtual machines managed via OpenStack [1]. For the inclusion of the 1&1 commercial resources, a Python API and SDK as well as the possibility to upload images were available. Large scale tests prove the capability to serve the scientific use case in the European 1&1 datacenters. The described environment at the Institute of Experimental Nuclear Physics (IEKP) at KIT serves the needs of researchers participating in the CMS and Belle II experiments. In total, resources exceeding half a million CPU hours have been provided by remote sites.

  20. ESTIMATION OF EXTERNAL FACTORS INFLUENCE ON THE ORGANIZATIONAL AND RESOURCE SUPPORT OF ENGINEERING

    Directory of Open Access Journals (Sweden)

    Yu. V. Gusak

    2013-09-01

    Full Text Available Purpose. The engineering industry is characterized by deep specialization and a high degree of co-operation, which implies close interaction with other industries and the wider economy and makes it highly sensitive to external factors. Effective regulation of the engineering industry's organizational and resource support ensures coherence among the subsystems of the market economy, a competitive environment, a complete investment process and the success of the industry. There is therefore a need for a detailed estimation and analysis of the influence of external factors on the formation and implementation indexes of the engineering industry's organizational and resource support. Methodology. Correlation analysis was used to establish how closely the set of external factors is connected with the formation and implementation indexes of the engineering industry's organizational and resource support, and malleability coefficients were applied to calculate how much these indexes change under the influence of the external factors. Findings. The external factors influencing the engineering industry's organizational and resource support were separated and grouped by source of origin: industrial, economic, political, informational and social. A classification of the external factors was made according to the direction of their influence on the formation and implementation indexes of the engineering industry's organizational and resource support. The closeness of the connection and the amount of change of the formation and implementation indexes (the machinery index and the machinery sales volume index) under the influence of the external factors were determined with the malleability coefficients. Originality. The estimation of the external factors
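
    As a small illustration of the estimation approach named above, the sketch below computes a Pearson correlation between an external factor and an industry index, together with a simple responsiveness coefficient taken as the ratio of their percentage changes (standing in for the paper's malleability coefficient). The yearly series are invented.

    # Correlation and responsiveness of a machinery index to an external factor.
    import numpy as np

    external_factor = np.array([100, 92, 97, 108, 112, 109], dtype=float)  # e.g. input price index
    machinery_index = np.array([100, 88, 95, 104, 110, 106], dtype=float)  # industry output index

    corr = np.corrcoef(external_factor, machinery_index)[0, 1]

    def pct_change(series):
        return np.diff(series) / series[:-1] * 100

    responsiveness = np.mean(pct_change(machinery_index) / pct_change(external_factor))
    print(f"correlation: {corr:.2f}, mean responsiveness coefficient: {responsiveness:.2f}")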

  1. Argonne's Laboratory Computing Resource Center : 2005 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Coghlan, S. C; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Pieper, G. P.

    2007-06-30

    Argonne National Laboratory founded the Laboratory Computing Resource Center in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. The first goal of the LCRC was to deploy a mid-range supercomputing facility to support the unmet computational needs of the Laboratory. To this end, in September 2002, the Laboratory purchased a 350-node computing cluster from Linux NetworX. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the fifty fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2005, there were 62 active projects on Jazz involving over 320 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to improve the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure

  2. Uranium Resources Modeling And Estimation In Lembah Hitam Sector, Kalan, West Kalimantan

    International Nuclear Information System (INIS)

    Adi Gunawan Muhammad; Bambang Soetopo

    2016-01-01

    The Lembah Hitam Sector is part of the Schwaner Mountains and of the upper part of the Kalan Basin stratigraphy. The uranium (U) mineralization layer is associated with metasiltstone and schistose metapelites oriented N 265° E/60° S. Evaluation drilling was carried out at a distance of 50 m from existing points (FKL 14 and FKL 13) to determine the model and the amount of U resources in the measured category. To achieve these objectives, several activities were carried out, including a review of previous studies, collection of geological and U mineralization data, quantitative grade estimation from gross-count gamma ray logs, creation of the database and model, and estimation of the U resources. Based on the modelling of ten drill holes, supported by drill core observation, the grades of U mineralization in the Lembah Hitam Sector range from 0.0076 to 0.95 % eU_3O_8, with mineralization thicknesses of 0.1 - 4.5 m. Uranium mineralization is present as fracture fillings (veins) or groups of veins and as matrix filling in tectonic breccia, associated with pyrite, pyrrhotite, magnetite, molybdenite, tourmaline and quartz in metasiltstone and schistose metapelites. Calculation of the U resources for 26 ore bodies using a 25 m search radius resulted in 655.65 tonnes of ore. Applying a 0.01 % cut-off grade resulted in 546.72 tonnes of ore with an average grade of 0.101 % eU_3O_8. The uranium resource is categorized as a low-grade measured resource. (author)
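
    The grade/tonnage roll-up described above can be sketched as filtering mineralized intercepts by a cut-off grade and accumulating tonnage and a thickness-weighted average grade. The intercepts, area of influence and rock density below are hypothetical, not the Lembah Hitam drill data.

    # Cut-off grade roll-up over hypothetical drill intercepts.
    intercepts = [  # (grade in % eU3O8, thickness in m) per intersection
        (0.0076, 1.2), (0.15, 0.8), (0.95, 0.3), (0.05, 2.5), (0.008, 4.0),
    ]
    influence_area_m2 = 3.14159 * 25.0 ** 2   # 25 m search radius per hole
    density_t_per_m3 = 2.6
    cutoff = 0.01                             # % eU3O8

    kept = [(g, t) for g, t in intercepts if g >= cutoff]
    tonnes_ore = sum(t * influence_area_m2 * density_t_per_m3 for _, t in kept)
    avg_grade = sum(g * t for g, t in kept) / sum(t for _, t in kept)
    print(f"{tonnes_ore:.0f} t of ore at {avg_grade:.3f} % eU3O8 "
          f"({tonnes_ore * avg_grade / 100:.1f} t U contained)")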

  3. Mass Estimate for a Lunar Resource Launcher Based on Existing Terrestrial Electromagnetic Launchers

    Directory of Open Access Journals (Sweden)

    Gordon Roesler

    2013-06-01

    Full Text Available Economic exploitation of lunar resources may be more efficient with a non-rocket approach to launch from the lunar surface. The launch system cost will depend on its design, and on the number of launches from Earth to deliver the system to the Moon. Both of these will depend on the launcher system mass. Properties of an electromagnetic resource launcher are derived from two mature terrestrial electromagnetic launchers. A mass model is derived and used to estimate launch costs for a developmental launch vehicle. A rough manufacturing cost for the system is suggested.

  4. Computational resources for ribosome profiling: from database to Web server and software.

    Science.gov (United States)

    Wang, Hongwei; Wang, Yan; Xie, Zhi

    2017-08-14

    Ribosome profiling is emerging as a powerful technique that enables genome-wide investigation of in vivo translation at sub-codon resolution. The increasing application of ribosome profiling in recent years has achieved remarkable progress toward understanding the composition, regulation and mechanism of translation. This benefits from not only the awesome power of ribosome profiling but also an extensive range of computational resources available for ribosome profiling. At present, however, a comprehensive review on these resources is still lacking. Here, we survey the recent computational advances guided by ribosome profiling, with a focus on databases, Web servers and software tools for storing, visualizing and analyzing ribosome profiling data. This review is intended to provide experimental and computational biologists with a reference to make appropriate choices among existing resources for the question at hand.

  5. ATLAS Tier-2 at the Compute Resource Center GoeGrid in Göttingen

    Science.gov (United States)

    Meyer, Jörg; Quadt, Arnulf; Weber, Pavel; ATLAS Collaboration

    2011-12-01

    GoeGrid is a grid resource center located in Göttingen, Germany. The resources are commonly used, funded, and maintained by communities doing research in the fields of grid development, computer science, biomedicine, high energy physics, theoretical physics, astrophysics, and the humanities. For the high energy physics community, GoeGrid serves as a Tier-2 center for the ATLAS experiment as part of the world-wide LHC computing grid (WLCG). The status and performance of the Tier-2 center is presented with a focus on the interdisciplinary setup and administration of the cluster. Given the various requirements of the different communities on the hardware and software setup, the challenge of the common operation of the cluster is detailed. The benefits are an efficient use of computing and personnel resources.

  6. Argonne's Laboratory computing resource center : 2006 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Drugan, C. D.; Pieper, G. P.

    2007-05-31

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2006, there were 76 active projects on Jazz involving over 380 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff

  7. Open Educational Resources: The Role of OCW, Blogs and Videos in Computer Networks Classroom

    Directory of Open Access Journals (Sweden)

    Pablo Gil

    2012-09-01

    Full Text Available This paper analyzes the learning experiences and opinions obtained from a group of undergraduate students in their interaction with several on-line multimedia resources included in a free on-line course about Computer Networks. The new educational resources employed are based on the Web 2.0 approach, such as blogs, videos and virtual labs, which have been added to a website for distance self-learning.

  8. AN ENHANCED METHOD FOR EXTENDING COMPUTATION AND RESOURCES BY MINIMIZING SERVICE DELAY IN EDGE CLOUD COMPUTING

    OpenAIRE

    B. Bavishna, Mrs. M. Agalya & Dr. G. Kavitha

    2018-01-01

    A lot of research has been done in the field of cloud computing. For effective performance, a variety of algorithms has been proposed. The role of virtualization is significant, and its performance depends on VM migration and allocation. A great deal of energy is consumed in the cloud; therefore, numerous algorithms are required for saving energy and enhancing efficiency. In the proposed work, a green algorithm has been considered with ...

  9. Fusion of neural computing and PLS techniques for load estimation

    Energy Technology Data Exchange (ETDEWEB)

    Lu, M.; Xue, H.; Cheng, X. [Northwestern Polytechnical Univ., Xi' an (China); Zhang, W. [Xi' an Inst. of Post and Telecommunication, Xi' an (China)

    2007-07-01

    A method to predict the electric load of a power system in real time was presented. The method is based on neurocomputing and partial least squares (PLS). Short-term load forecasts for power systems are generally determined by conventional statistical methods and Computational Intelligence (CI) techniques such as neural computing. However, statistical modeling methods often require the input of questionable distributional assumptions, and neural computing is weak, particularly in determining topology. In order to overcome the problems associated with conventional techniques, the authors developed a CI hybrid model based on neural computation and PLS techniques. The theoretical foundation for the designed CI hybrid model was presented along with its application in a power system. The hybrid model is suitable for nonlinear modeling and latent structure extracting. It can automatically determine the optimal topology to maximize the generalization. The CI hybrid model provides faster convergence and better prediction results compared to the abductive networks model because it incorporates a load conversion technique as well as new transfer functions. In order to demonstrate the effectiveness of the hybrid model, load forecasting was performed on a data set obtained from the Puget Sound Power and Light Company. Compared with the abductive networks model, the CI hybrid model reduced the forecast error by 32.37 per cent on workdays, and by an average of 27.18 per cent on weekends. It was concluded that the CI hybrid model has a more powerful predictive ability. 7 refs., 1 tab., 3 figs.
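
    The general idea of fusing latent-structure extraction with a neural mapping can be illustrated with off-the-shelf components: PLS compresses correlated inputs into a few latent components, and a small neural network maps them to the load. This is a generic sketch on synthetic data, not the authors' CI hybrid model.

    # PLS latent components feeding a small neural regressor for load estimation.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(2)
    n = 2000
    hour = rng.integers(0, 24, n)
    temp = 15 + 10 * np.sin(hour / 24 * 2 * np.pi) + rng.normal(0, 2, n)
    lag_load = 500 + 50 * np.sin((hour - 1) / 24 * 2 * np.pi) + rng.normal(0, 10, n)
    load = (500 + 60 * np.sin(hour / 24 * 2 * np.pi) - 3 * (temp - 18)
            + 0.2 * lag_load + rng.normal(0, 8, n))        # synthetic "true" load

    X = np.column_stack([hour, temp, lag_load])
    X_tr, X_te, y_tr, y_te = train_test_split(X, load, random_state=0)

    pls = PLSRegression(n_components=2).fit(X_tr, y_tr)     # latent structure extraction
    net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                       random_state=0).fit(pls.transform(X_tr), y_tr)

    pred = net.predict(pls.transform(X_te))
    print("MAPE: %.2f%%" % (100 * np.mean(np.abs(pred - y_te) / y_te)))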

  10. A Parameter Estimation Method for Dynamic Computational Cognitive Models

    NARCIS (Netherlands)

    Thilakarathne, D.J.

    2015-01-01

    A dynamic computational cognitive model can be used to explore a selected complex cognitive phenomenon by providing some features or patterns over time. More specifically, it can be used to simulate, analyse and explain the behaviour of such a cognitive phenomenon. It generates output data in the

  11. Load/resource matching for period-of-record computer simulation

    International Nuclear Information System (INIS)

    Lindsey, E.D. Jr.; Robbins, G.E. III

    1991-01-01

    The Southwestern Power Administration (Southwestern), an agency of the Department of Energy, is responsible for marketing the power and energy produced at Federal hydroelectric power projects developed by the U.S. Army Corps of Engineers in the southwestern United States. This paper reports that in order to maximize benefits from limited resources, to evaluate proposed changes in the operation of existing projects, and to determine the feasibility and marketability of proposed new projects, Southwestern utilizes a period-of-record computer simulation model created in the 1960's. Southwestern is constructing a new computer simulation model to take advantage of changes in computers, policy, and procedures. Within all hydroelectric power reservoir systems, the ability of the resources to match the load demand is critical and presents complex problems. Therefore, the method used to compare available energy resources to energy load demands is a very important aspect of the new model. Southwestern has developed an innovative method which compares a resource duration curve with a load duration curve, adjusting the resource duration curve to make the most efficient use of the available resources
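
    The duration-curve comparison can be sketched by sorting hourly load and hourly available resource into descending curves and measuring how much of the load duration curve the resource covers. The hourly series below are synthetic placeholders for period-of-record model output.

    # Load and resource duration curves from synthetic hourly data.
    import numpy as np

    rng = np.random.default_rng(3)
    hours = 8760
    load = 800 + 300 * np.sin(np.arange(hours) / 24 * 2 * np.pi) + rng.normal(0, 50, hours)
    resource = np.clip(rng.normal(750, 200, hours), 0, None)   # available MW each hour

    load_dc = np.sort(load)[::-1]        # load duration curve (MW, descending)
    res_dc = np.sort(resource)[::-1]     # resource duration curve (MW, descending)

    covered = np.minimum(load_dc, res_dc)
    print(f"energy coverage: {100 * covered.sum() / load_dc.sum():.1f} %")
    print(f"exceedance hours where the resource curve falls short: {int((res_dc < load_dc).sum())}")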

  12. An integrated system for land resources supervision based on the IoT and cloud computing

    Science.gov (United States)

    Fang, Shifeng; Zhu, Yunqiang; Xu, Lida; Zhang, Jinqu; Zhou, Peiji; Luo, Kan; Yang, Jie

    2017-01-01

    Integrated information systems are important safeguards for the utilisation and development of land resources. Information technologies, including the Internet of Things (IoT) and cloud computing, are inevitable requirements for the quality and efficiency of land resources supervision tasks. In this study, an economical and highly efficient supervision system for land resources has been established based on IoT and cloud computing technologies; a novel online and offline integrated system with synchronised internal and field data that includes the entire process of 'discovering breaches, analysing problems, verifying fieldwork and investigating cases' was constructed. The system integrates key technologies, such as the automatic extraction of high-precision information based on remote sensing, semantic ontology-based technology to excavate and discriminate public sentiment on the Internet that is related to illegal incidents, high-performance parallel computing based on MapReduce, uniform storing and compressing (bitwise) technology, global positioning system data communication and data synchronisation mode, intelligent recognition and four-level ('device, transfer, system and data') safety control technology. The integrated system based on a 'One Map' platform has been officially implemented by the Department of Land and Resources of Guizhou Province, China, and was found to significantly increase the efficiency and level of land resources supervision. The system promoted the overall development of informatisation in fields related to land resource management.

  13. Estimated sand and gravel resources of the South Merrimack, Hillsborough County, New Hampshire, 7.5-minute quadrangle

    Science.gov (United States)

    Sutphin, D.M.; Drew, L.J.; Fowler, B.K.

    2006-01-01

    A computer methodology is presented that allows natural aggregate producers, local government, and nongovernmental planners to define specific locations that may have sand and gravel deposits meeting user-specified minimum size, thickness, and geographic and geologic criteria, in areas where the surficial geology has been mapped. As an example, the surficial geologic map of the South Merrimack quadrangle was digitized and several digital geographic information system databases were downloaded from the internet and used to estimate the sand and gravel resources in the quadrangle. More than 41 percent of the South Merrimack quadrangle has been mapped as having sand and (or) gravel deposited by glacial meltwaters. These glaciofluvial areas are estimated to contain a total of 10 million m3 of material mapped as gravel, 60 million m3 of material mapped as mixed sand and gravel, and another 50 million m3 of material mapped as sand with minor silt. The mean thickness of these areas is about 1.95 meters. Twenty tracts were selected, each having individual areas of more than about 14 acres (5.67 hectares) of stratified glacial-meltwater sand and gravel deposits, at least 10 feet (3.0 m) of material above the water table, and not sterilized by the proximity of buildings, roads, streams and other bodies of water, or railroads. The 20 tracts are estimated to contain between about 4 and 10 million short tons (st) of gravel and 20 and 30 million st of sand. The five most gravel-rich tracts contain about 71 to 82 percent of the gravel resources in all 20 tracts and about 54-56 percent of the sand. Using this methodology, and the above criteria, a group of four tracts, divided by narrow areas sterilized by a small stream and secondary roads, may have the highest potential in the quadrangle for sand and gravel resources.
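
    At tract level the arithmetic reduces to mapped area times mean unit thickness, converted to tonnage with an assumed in-place density. A minimal sketch follows; the tract area and density are hypothetical, and the conversion to short tons is only indicative.

    # Rough tract-level aggregate tonnage from area, thickness and density.
    ACRE_M2 = 4046.86
    SHORT_TON_KG = 907.185

    def tract_tonnage(area_acres, mean_thickness_m, density_t_per_m3=1.9):
        """Very rough short tons of aggregate in one tract (density in tonnes/m3)."""
        volume_m3 = area_acres * ACRE_M2 * mean_thickness_m
        return volume_m3 * density_t_per_m3 * 1000.0 / SHORT_TON_KG

    # Example: a 14-acre tract with 1.95 m of sand and gravel above the water table.
    print(f"{tract_tonnage(14, 1.95):,.0f} short tons")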

  14. An Estimate of Recoverable Heavy Oil Resources of the Orinoco Oil Belt, Venezuela

    Science.gov (United States)

    Schenk, Christopher J.; Cook, Troy A.; Charpentier, Ronald R.; Pollastro, Richard M.; Klett, Timothy R.; Tennyson, Marilyn E.; Kirschbaum, Mark A.; Brownfield, Michael E.; Pitman, Janet K.

    2009-01-01

    The Orinoco Oil Belt Assessment Unit of the La Luna-Quercual Total Petroleum System encompasses approximately 50,000 km2 of the East Venezuela Basin Province that is underlain by more than 1 trillion barrels of heavy oil-in-place. As part of a program directed at estimating the technically recoverable oil and gas resources of priority petroleum basins worldwide, the U.S. Geological Survey estimated the recoverable oil resources of the Orinoco Oil Belt Assessment Unit. This estimate relied mainly on published geologic and engineering data for reservoirs (net oil-saturated sandstone thickness and extent), petrophysical properties (porosity, water saturation, and formation volume factors), recovery factors determined by pilot projects, and estimates of volumes of oil-in-place. The U.S. Geological Survey estimated a mean volume of 513 billion barrels of technically recoverable heavy oil in the Orinoco Oil Belt Assessment Unit of the East Venezuela Basin Province; the range is 380 to 652 billion barrels. The Orinoco Oil Belt Assessment Unit thus contains one of the largest recoverable oil accumulations in the world.
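
    The volumetric logic sketched in the abstract (net pay, porosity, water saturation, formation volume factor, recovery factor) can be illustrated as follows; every input below is a hypothetical placeholder, not a USGS assessment value:

      # Volumetric estimate of technically recoverable oil (hypothetical inputs).
      def recoverable_stb(area_acres, net_pay_ft, porosity, water_sat, bo, recovery_factor):
          # 7,758 barrels per acre-foot converts reservoir pore volume to stock-tank barrels via Bo.
          ooip = 7758.0 * area_acres * net_pay_ft * porosity * (1.0 - water_sat) / bo
          return ooip * recovery_factor

      # Example: one hypothetical assessment cell.
      print(f"{recoverable_stb(1.0e6, 60.0, 0.32, 0.15, 1.05, 0.10):.3e} STB")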

  15. Rational use of cognitive resources: levels of analysis between the computational and the algorithmic.

    Science.gov (United States)

    Griffiths, Thomas L; Lieder, Falk; Goodman, Noah D

    2015-04-01

    Marr's levels of analysis (computational, algorithmic, and implementation) have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the notion of rationality, often used in defining computational-level models, deeper toward the algorithmic level. We offer a simple recipe for reverse-engineering the mind's cognitive strategies by deriving optimal algorithms for a series of increasingly realistic abstract computational architectures, which we call "resource-rational analysis." Copyright © 2015 Cognitive Science Society, Inc.

  16. Methods for the estimation and economic evaluation of undiscovered uranium endowment and resources

    International Nuclear Information System (INIS)

    1992-01-01

    The present Instruction Manual was prepared as part of a programme of the International Atomic Energy Agency to supply the international uranium community with standard guides for a number of topics related to uranium resource assessment and supply. The quantitative estimation of undiscovered resources and endowments aims to supply data on potential mineral resources; these data are needed to compare long-term projections with one another and to assess the mineral supplies that would have to be obtained from elsewhere. These objectives have relatively recently been supplemented by the concern of land managers and national policy planners to assess the potential of certain lands before the constitution of national parks and other areas reserved from mineral exploration and development. 88 refs, 28 figs, 33 tabs

  17. Estimating pressurized water reactor decommissioning costs: A user's manual for the PWR Cost Estimating Computer Program (CECP) software

    International Nuclear Information System (INIS)

    Bierschbach, M.C.; Mencinsky, G.J.

    1993-10-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit decommissioning plans and cost estimates to the US Nuclear Regulatory Commission (NRC) for review. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning PWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.
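
    The CECP itself is distributed software, but the roll-up it performs is essentially a sum over cost categories; a toy sketch (category names follow the abstract, all dollar figures and the contingency fraction are placeholders, not CECP defaults):

      # Toy decommissioning cost roll-up over the CECP cost categories (placeholder values, $M).
      costs = {
          "component/piping/equipment removal": 45.0,
          "packaging": 12.0,
          "decontamination": 20.0,
          "transportation": 8.0,
          "burial": 60.0,
          "manpower": 110.0,
      }
      contingency = 0.25  # assumed contingency fraction
      subtotal = sum(costs.values())
      print(f"subtotal: ${subtotal:.1f}M, with contingency: ${subtotal * (1 + contingency):.1f}M")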

  18. Estimating the energy independence of a municipal wastewater treatment plant incorporating green energy resources

    International Nuclear Information System (INIS)

    Chae, Kyu-Jung; Kang, Jihoon

    2013-01-01

    Highlights: • We estimated green energy production in a municipal wastewater treatment plant. • Engineered approaches in mining multiple green energy resources were presented. • The estimated green energy production accounted for 6.5% of energy independence in the plant. • We presented practical information regarding green energy projects in water infrastructures. - Abstract: Increasing energy prices and concerns about global climate change highlight the need to improve energy independence in municipal wastewater treatment plants (WWTPs). This paper presents methodologies for estimating the energy independence of a municipal WWTP with a design capacity of 30,000 m3/d incorporating various green energy resources into the existing facilities, including different types of 100 kW photovoltaics, 10 kW small hydropower, and an effluent heat recovery system with a 25 refrigeration ton heat pump. It also provides guidance for the selection of appropriate renewable technologies or their combinations for specific WWTP applications to reach energy self-sufficiency goals. The results showed that annual energy production equal to 107 tons of oil equivalent could be expected when the proposed green energy resources are implemented in the WWTP. The energy independence, which was defined as the percent ratio of green energy production to energy consumption, was estimated to be a maximum of 6.5% and to vary with on-site energy consumption in the WWTP. Implementing green energy resources tailored to specific site conditions is necessary to improve the energy independence in WWTPs. Most of the applied technologies were economically viable primarily because of the financial support under the mandatory renewable portfolio standard in Korea.
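
    The headline metric is a simple ratio of green energy production to plant consumption; a sketch of the arithmetic (the plant's annual consumption below is an assumed placeholder chosen only to land near the 6.5% figure):

      # Energy independence (%) = green energy production / plant energy consumption * 100.
      TOE_TO_MWH = 11.63                 # 1 tonne of oil equivalent is about 11.63 MWh

      green_production_toe = 107.0       # annual green energy production, from the abstract
      plant_consumption_mwh = 19_000.0   # assumed annual plant consumption (placeholder)

      independence_pct = green_production_toe * TOE_TO_MWH / plant_consumption_mwh * 100.0
      print(f"energy independence: {independence_pct:.1f}%")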

  19. Cost Implications of Uncertainty in CO2 Storage Resource Estimates: A Review

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Steven T., E-mail: sanderson@usgs.gov [National Center, U.S. Geological Survey (United States)

    2017-04-15

    Carbon capture from stationary sources and geologic storage of carbon dioxide (CO2) is an important option to include in strategies to mitigate greenhouse gas emissions. However, the potential costs of commercial-scale CO2 storage are not well constrained, stemming from the inherent uncertainty in storage resource estimates coupled with a lack of detailed estimates of the infrastructure needed to access those resources. Storage resource estimates are highly dependent on storage efficiency values or storage coefficients, which are calculated based on ranges of uncertain geological and physical reservoir parameters. If dynamic factors (such as variability in storage efficiencies, pressure interference, and acceptable injection rates over time), reservoir pressure limitations, boundaries on migration of CO2, consideration of closed or semi-closed saline reservoir systems, and other possible constraints on the technically accessible CO2 storage resource (TASR) are accounted for, it is likely that only a fraction of the TASR could be available without incurring significant additional costs. Although storage resource estimates typically assume that any issues with pressure buildup due to CO2 injection will be mitigated by reservoir pressure management, estimates of the costs of CO2 storage generally do not include the costs of active pressure management. Production of saline waters (brines) could be essential to increasing the dynamic storage capacity of most reservoirs, but including the costs of this critical method of reservoir pressure management could increase current estimates of the costs of CO2 storage by two times, or more. Even without considering the implications for reservoir pressure management, geologic uncertainty can significantly impact CO2 storage capacities and costs, and contribute to uncertainty in carbon capture and storage (CCS) systems. Given the current state of available information and the
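
    Static storage resource estimates of the kind reviewed here usually start from a volumetric relation of the form G = A * h * phi * rho_CO2 * E, with the storage efficiency E carrying much of the uncertainty discussed above; a sketch with placeholder reservoir values:

      # Volumetric CO2 storage resource, G = A * h * phi * rho_CO2 * E (placeholder inputs).
      def storage_resource_mt(area_m2, thickness_m, porosity, rho_co2_kg_m3, efficiency):
          mass_kg = area_m2 * thickness_m * porosity * rho_co2_kg_m3 * efficiency
          return mass_kg / 1.0e9  # kg -> million tonnes

      low = storage_resource_mt(5.0e9, 100.0, 0.20, 700.0, 0.01)   # pessimistic efficiency
      high = storage_resource_mt(5.0e9, 100.0, 0.20, 700.0, 0.04)  # optimistic efficiency
      print(f"storage resource: {low:,.0f} - {high:,.0f} Mt CO2")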

  20. Estimation of crop water requirements using remote sensing for operational water resources management

    Science.gov (United States)

    Vasiliades, Lampros; Spiliotopoulos, Marios; Tzabiras, John; Loukas, Athanasios; Mylopoulos, Nikitas

    2015-06-01

    An integrated modeling system, developed in the framework of "Hydromentor" research project, is applied to evaluate crop water requirements for operational water resources management at Lake Karla watershed, Greece. The framework includes coupled components for operation of hydrotechnical projects (reservoir operation and irrigation works) and estimation of agricultural water demands at several spatial scales using remote sensing. The study area was sub-divided into irrigation zones based on land use maps derived from Landsat 5 TM images for the year 2007. Satellite-based energy balance for mapping evapotranspiration with internalized calibration (METRIC) was used to derive actual evapotranspiration (ET) and crop coefficient (ETrF) values from Landsat TM imagery. Agricultural water needs were estimated using the FAO method for each zone and each control node of the system for a number of water resources management strategies. Two operational strategies of hydro-technical project development (present situation without operation of the reservoir and future situation with the operation of the reservoir) are coupled with three water demand strategies. In total, eight (8) water management strategies are evaluated and compared. The results show that, under the existing operational water resources management strategies, the crop water requirements are quite large. However, the operation of the proposed hydro-technical projects in Lake Karla watershed coupled with water demand management measures, like improvement of existing water distribution systems, change of irrigation methods, and changes of crop cultivation could alleviate the problem and lead to sustainable and ecological use of water resources in the study area.
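
    At zone level, the crop water requirement in such frameworks reduces to ETc = Kc x ET0 (with METRIC's ETrF playing the role of Kc), and the net irrigation demand is ETc minus effective rainfall; a minimal sketch with hypothetical zone values:

      # Net irrigation requirement per zone: ETc = Kc * ET0 minus effective rainfall (hypothetical values).
      zones = [
          # (zone, area_ha, kc_from_metric, et0_mm_per_month, effective_rain_mm)
          ("cotton_zone", 1200.0, 1.10, 180.0, 25.0),
          ("maize_zone", 800.0, 0.95, 180.0, 25.0),
      ]
      for name, area_ha, kc, et0, rain in zones:
          etc_mm = kc * et0
          net_mm = max(etc_mm - rain, 0.0)
          volume_m3 = net_mm / 1000.0 * area_ha * 10_000.0  # depth in mm over hectares -> m3
          print(f"{name}: ETc = {etc_mm:.0f} mm, net demand = {volume_m3:,.0f} m3/month")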

  1. Estimate of Hot Dry Rock Geothermal Resource in Daqing Oilfield, Northeast China

    Directory of Open Access Journals (Sweden)

    Guangzheng Jiang

    2016-10-01

    Development and utilization of deep geothermal resources, especially a hot dry rock (HDR) geothermal resource, is beneficial for both economic and environmental considerations in oilfields. This study used data from multiple sources to assess the geothermal energy resource in the Daqing Oilfield. Temperature logs in boreholes (both shallow water wells and deep boreholes) and drill stem test temperatures were used to create isothermal maps at depth. Based on the temperature field and the thermophysical parameters of the strata, the heat content was calculated on a grid of 1 km × 1 km × 0.1 km cells. The result shows that in the southeastern part of the Daqing Oilfield, the temperature can reach 150 °C at a depth of 3 km. The heat content within 3–5 km is 24.28 × 10^21 J, of which 68.2% exceeds 150 °C. If the recovery factor was set to 2% and the lower limit of temperature was set to 150 °C, the most conservative estimate of the recoverable HDR geothermal resource was 0.33 × 10^21 J. The uncertainties of the estimate arise mainly from the temperature extrapolation and the selection of physical parameters.
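
    The cell-by-cell calculation follows the standard volumetric heat-in-place method, Q = rho * c * V * (T - T_ref), with a recovery factor applied to the hot cells; a sketch with illustrative rock properties (the study's actual stratigraphic parameters are not reproduced here):

      # Volumetric heat-in-place per cell, then a 2% recovery factor as in the abstract.
      RHO_ROCK = 2650.0     # rock density, kg/m3 (assumed)
      C_ROCK = 1000.0       # specific heat, J/(kg K) (assumed)
      T_REF = 25.0          # reference (rejection) temperature, deg C (assumed)
      CELL_VOLUME_M3 = 1.0e3 * 1.0e3 * 1.0e2  # 1 km x 1 km x 0.1 km

      def cell_heat_j(temperature_c):
          return RHO_ROCK * C_ROCK * CELL_VOLUME_M3 * (temperature_c - T_REF)

      cell_temps_c = [155.0, 162.0, 148.0, 171.0]  # placeholder cell temperatures
      recoverable_j = 0.02 * sum(cell_heat_j(t) for t in cell_temps_c if t >= 150.0)
      print(f"recoverable heat: {recoverable_j:.2e} J")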

  2. Latent-failure risk estimates for computer control

    Science.gov (United States)

    Dunn, William R.; Folsom, Rolfe A.; Green, Owen R.

    1991-01-01

    It is shown that critical computer controls employing unmonitored safety circuits are unsafe. Analysis supporting this result leads to two additional, important conclusions: (1) annual maintenance checks of safety circuit function do not, as widely believed, eliminate latent failure risk; (2) safety risk remains even if multiple, series-connected protection circuits are employed. Finally, it is shown analytically that latent failure risk is eliminated when continuous monitoring is employed.
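
    The reliability arithmetic behind this conclusion can be illustrated with the standard approximation for periodically tested protection: a circuit with failure rate lambda tested every T hours is latently failed, on average, for about lambda*T/2 of the time, whereas continuous monitoring shrinks the exposure to roughly the repair time. A generic sketch (not the paper's analysis; rates and times are assumed):

      # Average latent (undetected) failure fraction of a safety circuit.
      FAILURE_RATE = 1.0e-5    # failures per hour (assumed)
      ANNUAL_TEST_H = 8760.0   # proof-test interval of one year
      REPAIR_TIME_H = 8.0      # detection-plus-repair time with continuous monitoring (assumed)

      q_annual_test = FAILURE_RATE * ANNUAL_TEST_H / 2.0   # ~ lambda * T / 2
      q_continuous = FAILURE_RATE * REPAIR_TIME_H          # ~ lambda * MTTR
      print(f"annual proof test: {q_annual_test:.1e}, continuous monitoring: {q_continuous:.1e}")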

  3. Improved computation method in residual life estimation of structural components

    Directory of Open Access Journals (Sweden)

    Maksimović Stevan M.

    2013-01-01

    This work considers numerical computation methods and procedures for predicting fatigue crack growth in cracked, notched structural components. The computation method is based on fatigue life prediction using the strain energy density approach. Based on strain energy density (SED) theory, a fatigue crack growth model is developed to predict the lifetime of fatigue crack growth for single or mixed mode cracks. The model is based on an equation expressed in terms of low cycle fatigue parameters. Attention is focused on crack growth analysis of structural components under variable amplitude loads. Crack growth is largely influenced by the effect of the plastic zone at the front of the crack. To obtain an efficient computation model, the plasticity-induced crack closure phenomenon is considered during fatigue crack growth. The use of the strain energy density method is efficient for fatigue crack growth prediction under cyclic loading in damaged structural components. The strain energy density method is convenient for engineering applications since it does not require separate determination of fatigue crack propagation parameters; low cycle fatigue parameters are used instead. Accurate determination of fatigue crack closure has been a complex task for years. The influence of this phenomenon can be considered by means of experimental and numerical methods, and both approaches are considered here. Finite element analysis (FEA) has been shown to be a powerful and useful tool to analyze crack growth and crack closure effects. Computation results are compared with available experimental results. [Project of the Ministry of Science of the Republic of Serbia, no. OI 174001]

  4. Dynamic provisioning of local and remote compute resources with OpenStack

    Science.gov (United States)

    Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.

    2015-12-01

    Modern high-energy physics experiments rely on the extensive usage of computing resources, both for the reconstruction of measured events and for Monte-Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT is participating in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rise in complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the usage of virtualization technologies. The OpenStack project has become a widely adopted solution to virtualize hardware and offer additional services like storage and virtual machine management. This contribution will report on the incorporation of the institute's desktop machines into a private OpenStack Cloud. The additional compute resources provisioned via the virtual machines have been used for Monte-Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows will be presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point-of-entry for the user. Evaluations of the performance and stability of this setup and operational experiences will be discussed.
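
    A provisioning layer of this kind ultimately issues OpenStack API calls to boot worker virtual machines; a minimal sketch using the openstacksdk client (the cloud, image, flavor and network names are placeholders, and the setup described in the paper is considerably more elaborate):

      # Boot a single batch-worker VM on a private OpenStack cloud (sketch with placeholder names).
      import openstack

      conn = openstack.connect(cloud="private-cloud")   # entry in clouds.yaml (assumed name)

      image = conn.image.find_image("worker-node-image")
      flavor = conn.compute.find_flavor("m1.large")
      network = conn.network.find_network("batch-net")

      server = conn.compute.create_server(
          name="worker-001",
          image_id=image.id,
          flavor_id=flavor.id,
          networks=[{"uuid": network.id}],
      )
      server = conn.compute.wait_for_server(server)
      print(server.status)  # ACTIVE once the virtual worker node has booted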

  5. Estimating resting motor thresholds in transcranial magnetic stimulation research and practice: a computer simulation evaluation of best methods.

    Science.gov (United States)

    Borckardt, Jeffrey J; Nahas, Ziad; Koola, Jejo; George, Mark S

    2006-09-01

    Resting motor threshold is the basic unit of dosing in transcranial magnetic stimulation (TMS) research and practice. There is little consensus on how best to estimate resting motor threshold with TMS, and only a few tools and resources are readily available to TMS researchers. The current study investigates the accuracy and efficiency of 5 different approaches to motor threshold assessment for TMS research and practice applications. Computer simulation models are used to test the efficiency and accuracy of 5 different adaptive parameter estimation by sequential testing (PEST) procedures. For each approach, data are presented with respect to the mean number of TMS trials necessary to reach the motor threshold estimate as well as the mean accuracy of the estimates. A simple nonparametric PEST procedure appears to provide the most accurate motor threshold estimates, but takes slightly longer (on average, 3.48 trials) to complete than a popular parametric alternative (maximum likelihood PEST). Recommendations are made for the best starting values for each of the approaches to maximize both efficiency and accuracy. In light of the computer simulation data provided in this article, the authors review and suggest which techniques might best fit different TMS research and clinical situations. Lastly, a free user-friendly software package is described and made available on the world wide web that allows users to run all of the motor threshold estimation procedures discussed in this article for clinical and research applications.
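
    The flavour of such adaptive estimation can be conveyed with a simple staircase-style simulation; this is a generic sketch, not the authors' PEST implementations, and the sigmoidal subject model, starting intensity and step rule are all assumptions:

      # Simulated adaptive motor-threshold search against a synthetic subject (illustration only).
      import math, random

      TRUE_THRESHOLD = 62.0  # hypothetical threshold, % of maximum stimulator output

      def mep_observed(intensity):
          # Probability of a motor evoked potential rises sigmoidally around the true threshold.
          p = 1.0 / (1.0 + math.exp(-(intensity - TRUE_THRESHOLD) / 2.0))
          return random.random() < p

      def estimate_threshold(start=45.0, step=8.0, min_step=1.0):
          intensity, trials = start, 0
          while step >= min_step:
              trials += 1
              if mep_observed(intensity):
                  intensity -= step   # response observed: lower the intensity
              else:
                  intensity += step   # no response: raise the intensity
              step /= 2.0             # halve the step after every trial (simple bisection-like rule)
          return intensity, trials

      print(estimate_threshold())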

  6. Estimating the financial resources needed for local public health departments in Minnesota: a multimethod approach.

    Science.gov (United States)

    Riley, William; Briggs, Jill; McCullough, Mac

    2011-01-01

    This study presents a model for determining the total funding needed by individual local health departments. The aim is to determine the financial resources needed to provide services for statewide local public health departments in Minnesota, based on a gaps analysis estimating the funding needs. We used a multimethod analysis consisting of 3 approaches to estimate gaps in local public health funding: (1) interviews of selected local public health leaders, (2) a Delphi panel, and (3) a Nominal Group Technique. On the basis of these 3 approaches, a consensus estimate of funding gaps was generated for statewide projections. The study includes an analysis of cost, performance, and outcomes from 2005 to 2007 for all 87 local governmental health departments in Minnesota. For each of the methods, we selected a panel to represent a profile of Minnesota health departments. The 2 main outcome measures were local-level gaps in financial resources and the total resources needed to provide public health services at the local level. The total public health expenditure in Minnesota for local governmental public health departments was $302 million in 2007 ($58.92 per person). The consensus estimate of the financial gaps in local public health departments indicates that an additional $32.5 million (a 10.7% increase, or $6.32 per person) is needed to adequately serve public health needs in the local communities. It is possible to make informed estimates of funding gaps for public health activities on the basis of a combination of quantitative methods. There is wide variation in public health expenditure at the local level, and methods are needed to establish minimum baseline expenditure levels to adequately serve a population. The gaps analysis can be used by stakeholders to inform policy makers of the need for improved funding of the public health system.

  7. PredMP: A Web Resource for Computationally Predicted Membrane Proteins via Deep Learning

    KAUST Repository

    Wang, Sheng; Fei, Shiyang; Zongan, Wang; Li, Yu; Zhao, Feng; Gao, Xin

    2018-01-01

    structures in Protein Data Bank (PDB). To elucidate the MP structures computationally, we developed a novel web resource, denoted as PredMP (http://52.87.130.56:3001/#/proteinindex), that delivers one-dimensional (1D) annotation of the membrane topology

  8. Resource-constrained project scheduling: computing lower bounds by solving minimum cut problems

    NARCIS (Netherlands)

    Möhring, R.H.; Nesetril, J.; Schulz, A.S.; Stork, F.; Uetz, Marc Jochen

    1999-01-01

    We present a novel approach to compute Lagrangian lower bounds on the objective function value of a wide class of resource-constrained project scheduling problems. The basis is a polynomial-time algorithm to solve the following scheduling problem: Given a set of activities with start-time dependent

  9. Selecting, Evaluating and Creating Policies for Computer-Based Resources in the Behavioral Sciences and Education.

    Science.gov (United States)

    Richardson, Linda B., Comp.; And Others

    This collection includes four handouts: (1) "Selection Critria Considerations for Computer-Based Resources" (Linda B. Richardson); (2) "Software Collection Policies in Academic Libraries" (a 24-item bibliography, Jane W. Johnson); (3) "Circulation and Security of Software" (a 19-item bibliography, Sara Elizabeth Williams); and (4) "Bibliography of…

  10. The Usage of informal computer based communication in the context of organization’s technological resources

    OpenAIRE

    Raišienė, Agota Giedrė; Jonušauskas, Steponas

    2011-01-01

    The purpose of the article is to analyze, theoretically and practically, the features of informal computer-based communication in the context of an organization's technological resources. Methodology: meta-analysis, survey and descriptive analysis. According to scientists, the functions of informal communication cover sharing of work-related information, coordination of team activities, spread of organizational culture, and a feeling of interdependence and affinity. Also, informal communication widens the ...

  11. Developing Online Learning Resources: Big Data, Social Networks, and Cloud Computing to Support Pervasive Knowledge

    Science.gov (United States)

    Anshari, Muhammad; Alas, Yabit; Guan, Lim Sei

    2016-01-01

    Utilizing online learning resources (OLR) from multiple channels in learning activities promises extended benefits, moving from traditional learning-centred approaches to collaborative learning-centred approaches that emphasise pervasive learning anywhere and anytime. While compiling big data, cloud computing, and the semantic web into OLR offers a broader spectrum of…

  12. Computer Processing 10-20-30. Teacher's Manual. Senior High School Teacher Resource Manual.

    Science.gov (United States)

    Fisher, Mel; Lautt, Ray

    Designed to help teachers meet the program objectives for the computer processing curriculum for senior high schools in the province of Alberta, Canada, this resource manual includes the following sections: (1) program objectives; (2) a flowchart of curriculum modules; (3) suggestions for short- and long-range planning; (4) sample lesson plans;…

  13. Photonic entanglement as a resource in quantum computation and quantum communication

    OpenAIRE

    Prevedel, Robert; Aspelmeyer, Markus; Brukner, Caslav; Jennewein, Thomas; Zeilinger, Anton

    2008-01-01

    Entanglement is an essential resource in current experimental implementations for quantum information processing. We review a class of experiments exploiting photonic entanglement, ranging from one-way quantum computing over quantum communication complexity to long-distance quantum communication. We then propose a set of feasible experiments that will underline the advantages of photonic entanglement for quantum information processing.

  14. SEISRISK II; a computer program for seismic hazard estimation

    Science.gov (United States)

    Bender, Bernice; Perkins, D.M.

    1982-01-01

    The computer program SEISRISK II calculates probabilistic ground motion values for use in seismic hazard mapping. SEISRISK II employs a model that allows earthquakes to occur as points within source zones and as finite-length ruptures along faults. It assumes that earthquake occurrences have a Poisson distribution, that occurrence rates remain constant during the time period considered, that ground motion resulting from an earthquake is a known function of magnitude and distance, that seismically homogeneous source zones are defined, that fault locations are known, that fault rupture lengths depend on magnitude, and that earthquake rates as a function of magnitude are specified for each source. SEISRISK II calculates for each site on a grid of sites the level of ground motion that has a specified probability of being exceeded during a given time period. The program was designed to process a large (essentially unlimited) number of sites and sources efficiently and has been used to produce regional and national maps of seismic hazard. It is a substantial revision of an earlier program, SEISRISK I, which was never documented. SEISRISK II runs considerably faster and gives more accurate results than the earlier program and in addition includes rupture length and acceleration variability, which were not contained in the original version. We describe the model and how it is implemented in the computer program and provide a flowchart and listing of the code.
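
    Under the Poisson occurrence assumption at the heart of SEISRISK-type calculations, the probability that a ground-motion level is exceeded at a site within t years is P = 1 - exp(-lambda*t), where lambda is the annual exceedance rate summed over sources; a sketch with made-up rates:

      # Site exceedance probability under Poisson occurrences (source rates below are made up).
      import math

      annual_exceedance_rates = [2.0e-4, 7.5e-4, 1.1e-3]  # contribution of each source zone or fault
      lam = sum(annual_exceedance_rates)
      t_years = 50.0
      p_exceed = 1.0 - math.exp(-lam * t_years)
      print(f"{p_exceed:.1%} chance of exceedance in {t_years:.0f} years")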

  15. Model calibration and parameter estimation for environmental and water resource systems

    CERN Document Server

    Sun, Ne-Zheng

    2015-01-01

    This three-part book provides a comprehensive and systematic introduction to the development of useful models for complex systems. Part 1 covers the classical inverse problem for parameter estimation in both deterministic and statistical frameworks, Part 2 is dedicated to system identification, hyperparameter estimation, and model dimension reduction, and Part 3 considers how to collect data and construct reliable models for prediction and decision-making. For the first time, topics such as multiscale inversion, stochastic field parameterization, level set method, machine learning, global sensitivity analysis, data assimilation, model uncertainty quantification, robust design, and goal-oriented modeling, are systematically described and summarized in a single book from the perspective of model inversion, and elucidated with numerical examples from environmental and water resources modeling. Readers of this book will not only learn basic concepts and methods for simple parameter estimation, but also get famili...

  16. Estimation of potential biomass resource and biogas production from aquatic plants in Argentina

    Science.gov (United States)

    Fitzsimons, R. E.; Laurino, C. N.; Vallejos, R. H.

    1982-08-01

    The use of aquatic plants in artificial lakes as a biomass source for biogas and fertilizer production through anaerobic fermentation is evaluated, and the magnitude of this resource and the potential production of biogas and fertilizer are estimated. The specific case considered is the artificial lake that will be created by the construction of the Parana Medio Hydroelectric Project on the middle Parana River in Argentina. The growth of the main aquatic plant, water hyacinth, on the middle Parana River has been measured, and its conversion to methane by anaerobic fermentation is determined. It is estimated that gross methane production may be between 1.0 and 4.1 x 10^9 cu cm/year. The fermentation residue can be used as a soil conditioner, and it is estimated that production of the residue may represent between 54,900 and 221,400 tons of nitrogen/year, a value which is 2-8 times the present nitrogen fertilizer demand in Argentina.

  17. Modified stretched exponential model of computer system resources management limitations-The case of cache memory

    Science.gov (United States)

    Strzałka, Dominik; Dymora, Paweł; Mazurek, Mirosław

    2018-02-01

    In this paper we present some preliminary results in the field of computer systems management in relation to Tsallis thermostatistics and the ubiquitous problem of hardware-limited resources. In the case of systems with non-deterministic behaviour, management of their resources is a key point that guarantees their acceptable performance and proper operation. This is a very wide problem that poses many challenges in areas such as finance, transport, water and food, and health. We focus on computer systems, with attention paid to cache memory, and propose an analytical model that is able to connect the non-extensive entropy formalism, long-range dependencies, management of system resources, and queuing theory. The analytical results obtained are related to a practical experiment, showing interesting and valuable results.
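
    The stretched-exponential building block referred to here has the form f(t) = exp(-(t/tau)**beta); a sketch of fitting it to synthetic decay data, standing in for the paper's modified, Tsallis-motivated form:

      # Fit a stretched exponential to synthetic decay data (illustration only).
      import numpy as np
      from scipy.optimize import curve_fit

      def stretched_exp(t, tau, beta):
          return np.exp(-(t / tau) ** beta)

      t = np.linspace(0.1, 50.0, 200)
      data = stretched_exp(t, 12.0, 0.6) + np.random.normal(0.0, 0.01, t.size)  # synthetic sample

      (tau_hat, beta_hat), _ = curve_fit(stretched_exp, t, data, p0=(5.0, 1.0))
      print(f"tau ~= {tau_hat:.2f}, beta ~= {beta_hat:.2f}")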

  18. Universal resources for approximate and stochastic measurement-based quantum computation

    International Nuclear Information System (INIS)

    Mora, Caterina E.; Piani, Marco; Miyake, Akimasa; Van den Nest, Maarten; Duer, Wolfgang; Briegel, Hans J.

    2010-01-01

    We investigate which quantum states can serve as universal resources for approximate and stochastic measurement-based quantum computation in the sense that any quantum state can be generated from a given resource by means of single-qubit (local) operations assisted by classical communication. More precisely, we consider the approximate and stochastic generation of states, resulting, for example, from a restriction to finite measurement settings or from possible imperfections in the resources or local operations. We show that entanglement-based criteria for universality obtained in M. Van den Nest et al. [New J. Phys. 9, 204 (2007)] for the exact, deterministic case can be lifted to the much more general approximate, stochastic case. This allows us to move from the idealized situation (exact, deterministic universality) considered in previous works to the practically relevant context of nonperfect state preparation. We find that any entanglement measure fulfilling some basic requirements needs to reach its maximum value on some element of an approximate, stochastic universal family of resource states, as the resource size grows. This allows us to rule out various families of states as being approximate, stochastic universal. We prove that approximate, stochastic universality is in general a weaker requirement than deterministic, exact universality and provide resources that are efficient approximate universal, but not exact deterministic universal. We also study the robustness of universal resources for measurement-based quantum computation under realistic assumptions about the (imperfect) generation and manipulation of entangled states, giving an explicit expression for the impact that errors made in the preparation of the resource have on the possibility to use it for universal approximate and stochastic state preparation. Finally, we discuss the relation between our entanglement-based criteria and recent results regarding the uselessness of states with a high

  19. Estimating health workforce needs for antiretroviral therapy in resource-limited settings

    Directory of Open Access Journals (Sweden)

    Fullem Andrew

    2006-01-01

    Background: Efforts to increase access to life-saving treatment, including antiretroviral therapy (ART), for people living with HIV/AIDS in resource-limited settings have been the growing focus of international efforts. One of the greatest challenges to scaling up will be the limited supply of adequately trained human resources for health, including doctors, nurses, pharmacists and other skilled providers. As national treatment programmes are planned, better estimates of human resource needs and improved approaches to assessing the impact of different staffing models are critically needed. However, there have been few systematic assessments of staffing patterns in existing programmes or of the estimates being used in planning larger programmes. Methods: We reviewed the published literature and selected plans and scaling-up proposals, interviewed experts and collected data on staffing patterns at existing treatment sites through a structured survey and site visits. Results: We found a wide range of staffing patterns and patient-provider ratios in existing and planned treatment programmes. Many factors influenced health workforce needs, including task assignments, delivery models, other staff responsibilities and programme size. Overall, the number of health care workers required to provide ART to 1000 patients included 1–2 physicians, 2–7 nurses, … Discussion: These data are consistent with other estimates of human resource requirements for antiretroviral therapy, but highlight the considerable variability of current staffing models and the importance of a broad range of factors in determining personnel needs. Few outcome or cost data are currently available to assess the effectiveness and efficiency of different staffing models, and it will be important to develop improved methods for gathering this information as treatment programmes are scaled up.

  20. Effect of water resource development and management on lymphatic filariasis, and estimates of populations at risk.

    Science.gov (United States)

    Erlanger, Tobias E; Keiser, Jennifer; Caldas De Castro, Marcia; Bos, Robert; Singer, Burton H; Tanner, Marcel; Utzinger, Jürg

    2005-09-01

    Lymphatic filariasis (LF) is a debilitating disease overwhelmingly caused by Wuchereria bancrofti, which is transmitted by various mosquito species. Here, we present a systematic literature review with the following objectives: (i) to establish global and regional estimates of populations at risk of LF with particular consideration of water resource development projects, and (ii) to assess the effects of water resource development and management on the frequency and transmission dynamics of the disease. We estimate that globally, 2 billion people are at risk of LF. Among them, there are 394.5 million urban dwellers without access to improved sanitation and 213 million rural dwellers living in close proximity to irrigation. Environmental changes due to water resource development and management consistently led to a shift in vector species composition and generally to a strong proliferation of vector populations. For example, in World Health Organization (WHO) subregions 1 and 2, mosquito densities of the Anopheles gambiae complex and Anopheles funestus were up to 25-fold higher in irrigated areas when compared with irrigation-free sites. Although the infection prevalence of LF often increased after the implementation of a water project, there was no clear association with clinical symptoms. Concluding, there is a need to assess and quantify changes of LF transmission parameters and clinical manifestations over the entire course of water resource developments. Where resources allow, integrated vector management should complement mass drug administration, and broad-based monitoring and surveillance of the disease should become an integral part of large-scale waste management and sanitation programs, whose basic rationale lies in a systemic approach to city, district, and regional level health services and disease prevention.

  1. A computer program for the estimation of time of death

    DEFF Research Database (Denmark)

    Lynnerup, N

    1993-01-01

    In the 1960s Marshall and Hoare presented a "Standard Cooling Curve" based on their mathematical analyses of the postmortem cooling of bodies. Although fairly accurate under standard conditions, the "curve" or formula is based on the assumption that the ambient temperature is constant and that ... A computer program addressing the postmortem cooling of bodies is presented. It is proposed that by having a computer program that solves the equation, giving the length of the cooling period in response to a certain rectal temperature, and which allows easy comparison of multiple solutions, the uncertainties related to ambient temperature ...
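
    The kind of equation solving described can be sketched with a generic double-exponential cooling model of the Marshall-Hoare form and a numerical root finder; the constants A and B below are placeholders, not the calibrated values used in the program or in forensic practice:

      # Solve a double-exponential (Marshall-Hoare type) cooling model for the postmortem interval.
      import math
      from scipy.optimize import brentq

      T_AMBIENT = 18.0    # deg C, assumed constant ambience
      T_INITIAL = 37.2    # deg C, assumed body temperature at death
      A, B = 1.25, -0.07  # placeholder model constants (B in 1/hour)

      def model_temp(t_hours):
          q = A * math.exp(B * t_hours) + (1.0 - A) * math.exp(A * B / (A - 1.0) * t_hours)
          return T_AMBIENT + (T_INITIAL - T_AMBIENT) * q

      def hours_since_death(rectal_temp_c):
          return brentq(lambda t: model_temp(t) - rectal_temp_c, 0.1, 72.0)

      print(f"{hours_since_death(27.0):.1f} hours")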

  2. Perspectives on Sharing Models and Related Resources in Computational Biomechanics Research.

    Science.gov (United States)

    Erdemir, Ahmet; Hunter, Peter J; Holzapfel, Gerhard A; Loew, Leslie M; Middleton, John; Jacobs, Christopher R; Nithiarasu, Perumal; Löhner, Rainald; Wei, Guowei; Winkelstein, Beth A; Barocas, Victor H; Guilak, Farshid; Ku, Joy P; Hicks, Jennifer L; Delp, Scott L; Sacks, Michael; Weiss, Jeffrey A; Ateshian, Gerard A; Maas, Steve A; McCulloch, Andrew D; Peng, Grace C Y

    2018-02-01

    The role of computational modeling for biomechanics research and related clinical care will be increasingly prominent. The biomechanics community has been developing computational models routinely for exploration of the mechanics and mechanobiology of diverse biological structures. As a result, a large array of models, data, and discipline-specific simulation software has emerged to support endeavors in computational biomechanics. Sharing computational models and related data and simulation software has first become a utilitarian interest, and now, it is a necessity. Exchange of models, in support of knowledge exchange provided by scholarly publishing, has important implications. Specifically, model sharing can facilitate assessment of reproducibility in computational biomechanics and can provide an opportunity for repurposing and reuse, and a venue for medical training. The community's desire to investigate biological and biomechanical phenomena crossing multiple systems, scales, and physical domains, also motivates sharing of modeling resources as blending of models developed by domain experts will be a required step for comprehensive simulation studies as well as the enhancement of their rigor and reproducibility. The goal of this paper is to understand current perspectives in the biomechanics community for the sharing of computational models and related resources. Opinions on opportunities, challenges, and pathways to model sharing, particularly as part of the scholarly publishing workflow, were sought. A group of journal editors and a handful of investigators active in computational biomechanics were approached to collect short opinion pieces as a part of a larger effort of the IEEE EMBS Computational Biology and the Physiome Technical Committee to address model reproducibility through publications. A synthesis of these opinion pieces indicates that the community recognizes the necessity and usefulness of model sharing. There is a strong will to facilitate

  3. Integrating GRID tools to build a computing resource broker: activities of DataGrid WP1

    International Nuclear Information System (INIS)

    Anglano, C.; Barale, S.; Gaido, L.; Guarise, A.; Lusso, S.; Werbrouck, A.

    2001-01-01

    Resources on a computational Grid are geographically distributed, heterogeneous in nature, owned by different individuals or organizations with their own scheduling policies, have different access cost models with dynamically varying loads and availability conditions. This makes traditional approaches to workload management, load balancing and scheduling inappropriate. The first work package (WP1) of the EU-funded DataGrid project is addressing the issue of optimizing the distribution of jobs onto Grid resources based on a knowledge of the status and characteristics of these resources that is necessarily out-of-date (collected in a finite amount of time at a very loosely coupled site). The authors describe the DataGrid approach in integrating existing software components (from Condor, Globus, etc.) to build a Grid Resource Broker, and the early efforts to define a workable scheduling strategy

  4. Tracking the Flow of Resources in Electronic Waste - The Case of End-of-Life Computer Hard Disk Drives.

    Science.gov (United States)

    Habib, Komal; Parajuly, Keshav; Wenzel, Henrik

    2015-10-20

    Recovery of resources, in particular metals, from waste flows is widely seen as a prioritized option to reduce their potential supply constraints in the future. The current waste electrical and electronic equipment (WEEE) treatment system is more focused on bulk metals, where the recycling rate of specialty metals, such as rare earths, is negligible compared to their increasing use in modern products, such as electronics. This study investigates the challenges in recovering these resources in the existing WEEE treatment system. This is illustrated by following the material flows of resources in a conventional WEEE treatment plant in Denmark. Computer hard disk drives (HDDs) containing neodymium-iron-boron (NdFeB) magnets were selected as the case product for this experiment. The resulting output fractions were tracked until their final treatment in order to estimate the recovery potential of rare earth elements (REEs) and other resources contained in HDDs. The results show that out of the 244 kg of HDDs treated, 212 kg, comprising mainly aluminum and steel, can ultimately be recovered through the metallurgical process. The results further demonstrate the complete loss of REEs in the existing shredding-based WEEE treatment processes. Dismantling and separate processing of NdFeB magnets from their end-use products may be preferable to shredding. However, it remains a technological and logistic challenge for the existing system.

  5. A confirmatory investigation of a job demands-resources model using a categorical estimator.

    Science.gov (United States)

    de Beer, Leon; Rothmann, Sebastiaan; Pienaar, Jaco

    2012-10-01

    A confirmatory investigation of a job demands-resources model was conducted with alternative methods in a sample of 15,633 working adults aggregated from various economic sectors. The proposed model is in line with job demands-resources theory and assumes two psychological processes at work, collectively termed "the dual process." The first, the energetic process, holds that job demands lead to ill-health outcomes through burnout. The second, the motivational process, holds that job resources lead to organizational commitment through work engagement. Structural equation modelling analyses were implemented with a categorical estimator. Mediation analyses of each of the processes included bootstrapped indirect effects and kappa-squared values to apply qualitative labels to effect sizes. The relationship between job resources and organizational commitment was mediated by engagement with a large effect. The relationship between job demands and ill-health was mediated by burnout with a medium effect. The implications of the results for theory and practice were discussed.
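
    The bootstrapped indirect effect mentioned here is the product of the two mediation paths, a*b, recomputed over resampled data; a small sketch on synthetic data (ordinary least squares on continuous variables, which does not reproduce the study's categorical estimator):

      # Bootstrap the indirect effect a*b of X -> M -> Y on synthetic data (illustration only).
      import numpy as np

      rng = np.random.default_rng(0)
      n = 500
      x = rng.normal(size=n)                      # e.g. job resources
      m = 0.5 * x + rng.normal(size=n)            # e.g. work engagement
      y = 0.4 * m + 0.1 * x + rng.normal(size=n)  # e.g. organisational commitment

      def ab_indirect(idx):
          a = np.polyfit(x[idx], m[idx], 1)[0]                           # a path: M ~ X
          design = np.column_stack([np.ones(idx.size), m[idx], x[idx]])
          b = np.linalg.lstsq(design, y[idx], rcond=None)[0][1]          # b path: Y ~ M, controlling X
          return a * b

      boot = np.array([ab_indirect(rng.integers(0, n, n)) for _ in range(2000)])
      lo, hi = np.percentile(boot, [2.5, 97.5])
      print(f"indirect effect a*b: {boot.mean():.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")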

  6. Estimate of the Geothermal Energy Resource in the Major Sedimentary Basins in the United States (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Esposito, A.; Porro, C.; Augustine, C.; Roberts, B.

    2012-09-01

    Because most sedimentary basins have been explored for oil and gas, well logs, temperatures at depth, and reservoir properties such as depth to basement and formation thickness are well known. The availability of these data reduces exploration risk and allows development of geologic exploration models for each basin. This study estimates the magnitude of recoverable geothermal energy from 15 major known U.S. sedimentary basins and ranks these basins relative to their potential. The total available thermal resource for each basin was estimated using the volumetric heat-in-place method originally proposed by Muffler (1979). A qualitative recovery factor was determined for each basin based on data on flow volume, hydrothermal recharge, and vertical and horizontal permeability. Total sedimentary thickness maps, stratigraphic columns, cross sections, and temperature gradient information were gathered for each basin from published articles, USGS reports, and state geological survey reports. When published data were insufficient, thermal gradients and reservoir properties were derived from oil and gas well logs obtained from oil and gas commission databases. Basin stratigraphy, structural history, and groundwater circulation patterns were studied in order to develop a model that estimates resource size, temperature distribution, and a probable quantitative recovery factor.

  7. Regrowth patterns of supratentorial gliomas: estimation from computed tomographic scans

    Energy Technology Data Exchange (ETDEWEB)

    Tsuboi, K.; Yoshii, Y.; Nakagawa, K.; Maki, Y.

    1986-12-01

    To clarify the regrowth patterns of benign and malignant gliomas, we chose 27 intervals (between two operations or between an operation and autopsy) from 21 patients with pathologically verified recurrent supratentorial gliomas. Serial computed tomographic (CT) scans of these cases were analyzed to determine the doubling time (Td) calculated from the change in volume of enhanced and low density areas, the enhancement effect graded from 0 to 4 according to the Hounsfield number, and the presence of dissemination and contralateral extension. We studied 5 benign gliomas (including 1 case of radiation necrosis), 8 malignant astrocytomas, and 8 glioblastomas. The Td's of enhanced areas on CT scans of benign gliomas, malignant astrocytomas, and glioblastomas were 937 +/- 66.5 days, 65.1 +/- 29.4 days, and 48.1 +/- 20.9 days, respectively. The Td's of low density areas were 895 +/- 130.6 days, 70.8 +/- 22.2 days, and 50.5 +/- 14.7 days. There was a significant correlation between the Td's of the enhanced and low density areas (0.97). The enhancement effect increased at recurrence in 55% of the cases, with an average increase of 1.1 grades. The increase in enhancement effect at recurrence showed a tendency to become smaller as the tumor's degree of anaplasia increased. Radiotherapy was effective in significantly retarding the growth rate of malignant gliomas, whose Td's were doubled. Although the Td's of both enhanced and low density areas of benign gliomas were significantly longer than those of malignant gliomas, there was no significant difference in the Td's of enhanced areas between malignant astrocytomas and glioblastomas.
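
    The doubling times quoted are obtained from two volume measurements separated by a known interval, Td = dt * ln(2) / ln(V2/V1); a quick sketch with hypothetical volumes:

      # Tumour doubling time from two CT volume measurements.
      import math

      def doubling_time_days(v1_cm3, v2_cm3, interval_days):
          return interval_days * math.log(2.0) / math.log(v2_cm3 / v1_cm3)

      # Hypothetical enhanced-area volumes measured 90 days apart.
      print(f"Td = {doubling_time_days(8.0, 22.0, 90.0):.0f} days")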

  9. Measuring the impact of computer resource quality on the software development process and product

    Science.gov (United States)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process were speculated to have a measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data were extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product, as exemplified by the subject NASA data, was examined. Based upon the results, a number of computer resource-related implications are provided.

  10. Ion range estimation by using dual energy computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Huenemohr, Nora; Greilich, Steffen [German Cancer Research Center (DKFZ), Heidelberg (Germany). Medical Physics in Radiation Oncology; Krauss, Bernhard [Siemens AG, Forchheim (Germany). Imaging and Therapy; Dinkel, Julien [German Cancer Research Center (DKFZ), Heidelberg (Germany). Radiology; Massachusetts General Hospital, Boston, MA (United States). Radiology; Gillmann, Clarissa [German Cancer Research Center (DKFZ), Heidelberg (Germany). Medical Physics in Radiation Oncology; University Hospital Heidelberg (Germany). Radiation Oncology; Ackermann, Benjamin [Heidelberg Ion-Beam Therapy Center (HIT), Heidelberg (Germany); Jaekel, Oliver [German Cancer Research Center (DKFZ), Heidelberg (Germany). Medical Physics in Radiation Oncology; Heidelberg Ion-Beam Therapy Center (HIT), Heidelberg (Germany); University Hospital Heidelberg (Germany). Radiation Oncology

    2013-07-01

    Inaccurate conversion of CT data to water-equivalent path length (WEPL) is one of the most important uncertainty sources in ion treatment planning. Dual energy CT (DECT) imaging might help to reduce CT number ambiguities with the additional information. In our study we scanned a series of materials (tissue substitutes, aluminum, PMMA, and other polymers) in the dual source scanner (Siemens Somatom Definition Flash). Based on the 80 kVp/140Sn kVp dual energy images, the electron densities Q_e and effective atomic numbers Z_eff were calculated. We introduced a new lookup table that translates the Q_e to the WEPL. The WEPL residuals from the calibration were significantly reduced for the investigated tissue surrogates compared to the empirical Hounsfield look-up table (single energy CT imaging) from (-1.0 ± 1.8)% to (0.1 ± 0.7)% and for non-tissue equivalent PMMA from -7.8% to -1.0%. To assess the benefit of the new DECT calibration, we conducted a treatment planning study for three different idealized cases based on tissue surrogates and PMMA. The DECT calibration yielded a significantly higher target coverage in tissue surrogates and phantom material (i.e. PMMA cylinder, mean target coverage improved from 62% to 98%). To verify the DECT calibration for real tissue, ion ranges through a frozen pig head were measured and compared to predictions calculated by the standard single energy CT calibration and the novel DECT calibration. By using this method, an improvement of ion range estimation from -2.1% water-equivalent thickness deviation (single energy CT) to 0.3% (DECT) was achieved. If one excludes raypaths located on the edge of the sample accompanied with high uncertainties, no significant difference could be observed. (orig.)

  11. The ATLAS Computing Agora: a resource web site for citizen science projects

    CERN Document Server

    Bourdarios, Claire; The ATLAS collaboration

    2016-01-01

    The ATLAS collaboration has recently setup a number of citizen science projects which have a strong IT component and could not have been envisaged without the growth of general public computing resources and network connectivity: event simulation through volunteer computing, algorithms improvement via Machine Learning challenges, event display analysis on citizen science platforms, use of open data, etc. Most of the interactions with volunteers are handled through message boards, but specific outreach material was also developed, giving an enhanced visibility to the ATLAS software and computing techniques, challenges and community. In this talk the Atlas Computing Agora (ACA) web platform will be presented as well as some of the specific material developed for some of the projects.

  12. PredMP: A Web Resource for Computationally Predicted Membrane Proteins via Deep Learning

    KAUST Repository

    Wang, Sheng

    2018-02-06

    Experimental determination of membrane protein (MP) structures is challenging as they are often too large for nuclear magnetic resonance (NMR) experiments and difficult to crystallize. Currently there are only about 510 non-redundant MPs with solved structures in Protein Data Bank (PDB). To elucidate the MP structures computationally, we developed a novel web resource, denoted as PredMP (http://52.87.130.56:3001/#/proteinindex), that delivers one-dimensional (1D) annotation of the membrane topology and secondary structure, two-dimensional (2D) prediction of the contact/distance map, together with three-dimensional (3D) modeling of the MP structure in the lipid bilayer, for each MP target from a given model organism. The precision of the computationally constructed MP structures is leveraged by state-of-the-art deep learning methods as well as cutting-edge modeling strategies. In particular, (i) we annotate 1D property via DeepCNF (Deep Convolutional Neural Fields) that not only models complex sequence-structure relationship but also interdependency between adjacent property labels; (ii) we predict 2D contact/distance map through Deep Transfer Learning which learns the patterns as well as the complex relationship between contacts/distances and protein features from non-membrane proteins; and (iii) we model 3D structure by feeding its predicted contacts and secondary structure to the Crystallography & NMR System (CNS) suite combined with a membrane burial potential that is residue-specific and depth-dependent. PredMP currently contains more than 2,200 multi-pass transmembrane proteins (length<700 residues) from Human. These transmembrane proteins are classified according to IUPHAR/BPS Guide, which provides a hierarchical organization of receptors, channels, transporters, enzymes and other drug targets according to their molecular relationships and physiological functions. Among these MPs, we estimated that our approach could predict correct folds for 1

  13. Fast covariance estimation for innovations computed from a spatial Gibbs point process

    DEFF Research Database (Denmark)

    Coeurjolly, Jean-Francois; Rubak, Ege

    In this paper, we derive an exact formula for the covariance of two innovations computed from a spatial Gibbs point process and suggest a fast method for estimating this covariance. We show how this methodology can be used to estimate the asymptotic covariance matrix of the maximum pseudo...

  14. Advances in ATLAS@Home towards a major ATLAS computing resource

    CERN Document Server

    Cameron, David; The ATLAS collaboration

    2018-01-01

    The volunteer computing project ATLAS@Home has been providing a stable computing resource for the ATLAS experiment since 2013. It has recently undergone some significant developments and as a result has become one of the largest resources contributing to ATLAS computing, by expanding its scope beyond traditional volunteers and into exploitation of idle computing power in ATLAS data centres. Removing the need for virtualization on Linux and instead using container technology has significantly lowered the entry barrier for data centre participation, and in this paper we describe the implementation and results of this change. We also present other recent changes and improvements in the project. In early 2017 the ATLAS@Home project was merged into a combined LHC@Home platform, providing a unified gateway to all CERN-related volunteer computing projects. The ATLAS Event Service shifts data processing from file-level to event-level and we describe how ATLAS@Home was incorporated into this new paradigm. The finishing...

  15. Critical phenomena in communication/computation networks with various topologies and suboptimal to optimal resource allocation

    Science.gov (United States)

    Cogoni, Marco; Busonera, Giovanni; Anedda, Paolo; Zanetti, Gianluigi

    2015-01-01

    We generalize previous studies on critical phenomena in communication networks [1,2] by adding computational capabilities to the nodes. In our model, a set of tasks with random origin, destination and computational structure is distributed on a computational network, modeled as a graph. By varying the temperature of a Metropolis Monte Carlo procedure, we explore the global latency for an optimal to suboptimal resource assignment at a given time instant. By computing the two-point correlation function for the local overload, we study the behavior of the correlation distance (both for links and nodes) while approaching the congested phase: a transition from peaked to spread g(r) is seen above a critical (Monte Carlo) temperature Tc. The average latency trend of the system is predicted by averaging over several network traffic realizations while maintaining spatially detailed information for each node: a sharp decrease of performance is found over Tc independently of the workload. The globally optimized computational resource allocation and network routing define a baseline for a future comparison of the transition behavior with respect to existing routing strategies [3,4] for different network topologies.
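
    A minimal sketch of the kind of Metropolis Monte Carlo exploration described above, assuming a toy cost function (a quadratic penalty on node overload) as a stand-in for the global latency; the network size, task loads and temperature values are placeholder assumptions, not the paper's model.

```python
import math
import random

random.seed(0)
N_NODES, N_TASKS = 20, 100
capacity = [5.0] * N_NODES                                  # computational capacity per node
load = [random.uniform(0.5, 1.5) for _ in range(N_TASKS)]   # computational load per task

def cost(assign):
    """Toy stand-in for global latency: quadratic penalty on node overload."""
    used = [0.0] * N_NODES
    for task, node in enumerate(assign):
        used[node] += load[task]
    return sum(max(0.0, u - c) ** 2 for u, c in zip(used, capacity))

def metropolis(T, steps=20000):
    """Metropolis sampling of task-to-node assignments at temperature T."""
    assign = [random.randrange(N_NODES) for _ in range(N_TASKS)]
    c = cost(assign)
    for _ in range(steps):
        task, new_node = random.randrange(N_TASKS), random.randrange(N_NODES)
        old_node, assign[task] = assign[task], new_node
        c_new = cost(assign)
        if c_new <= c or random.random() < math.exp(-(c_new - c) / T):
            c = c_new                      # accept the move
        else:
            assign[task] = old_node        # reject and revert
    return c

for T in (0.01, 0.1, 1.0, 10.0):           # low T -> near-optimal, high T -> suboptimal
    print(f"T={T}: final cost {metropolis(T):.2f}")
```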

  16. The NILE system architecture: fault-tolerant, wide-area access to computing and data resources

    International Nuclear Information System (INIS)

    Ricciardi, Aleta; Ogg, Michael; Rothfus, Eric

    1996-01-01

    NILE is a multi-disciplinary project building a distributed computing environment for HEP. It provides wide-area, fault-tolerant, integrated access to processing and data resources for collaborators of the CLEO experiment, though the goals and principles are applicable to many domains. NILE has three main objectives: a realistic distributed system architecture design, the design of a robust data model, and a Fast-Track implementation providing a prototype design environment which will also be used by CLEO physicists. This paper focuses on the software and wide-area system architecture design and the computing issues involved in making NILE services highly-available. (author)

  17. Weight Estimation Tool for Children Aged 6 to 59 Months in Limited-Resource Settings.

    Science.gov (United States)

    Ralston, Mark E; Myatt, Mark A

    2016-01-01

    A simple, reliable anthropometric tool for rapid estimation of weight in children would be useful in limited-resource settings where current weight estimation tools are not uniformly reliable, nearly all global under-five mortality occurs, severe acute malnutrition is a significant contributor in approximately one-third of under-five mortality, and a weight scale may not be immediately available in emergencies to first-response providers. The objective was to determine the accuracy and precision of mid-upper arm circumference (MUAC) and height as weight estimation tools in children under five years of age in low-to-middle income countries. This was a retrospective observational study. Data were collected in 560 nutritional surveys during 1992-2006 using a modified Expanded Program of Immunization two-stage cluster sample design, in locations with high prevalence of acute and chronic malnutrition. A total of 453,990 children met the inclusion criteria (age 6-59 months; weight ≤ 25 kg; MUAC 80-200 mm) after applying the exclusion criteria (bilateral pitting edema; biologically implausible weight-for-height z-score (WHZ), weight-for-age z-score (WAZ), and height-for-age z-score (HAZ) values). Weight was estimated using the Broselow Tape, the Hong Kong formula, and database-derived models based on MUAC alone, height alone, and height and MUAC combined. Mean percentage difference between true and estimated weight, proportion of estimates accurate to within ± 25% and ± 10% of true weight, weighted Kappa statistic, and Bland-Altman bias were reported as measures of tool accuracy. Standard deviation of mean percentage difference and Bland-Altman 95% limits of agreement were reported as measures of tool precision. Database height was a more accurate and precise predictor of weight compared to Broselow Tape 2007 [B], Broselow Tape 2011 [A], and MUAC. Mean percentage difference between true and estimated weight was +0.49% (SD = 10.33%); proportion of estimates accurate to within ± 25% of true weight was 97.36% (95% CI 97.40%, 97.46%); and
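
    The accuracy and precision measures reported above can be reproduced for any candidate tool from paired true and estimated weights; the sketch below computes the mean percentage difference, the proportions within ±10% and ±25%, and the Bland-Altman bias with 95% limits of agreement, using synthetic data in place of the survey measurements.

```python
import numpy as np

rng = np.random.default_rng(0)
true_wt = rng.uniform(5.0, 25.0, size=1000)                   # synthetic "true" weights (kg)
est_wt = true_wt * (1.0 + rng.normal(0.0, 0.10, size=1000))   # synthetic tool estimates

# Accuracy: mean percentage difference and proportions within +/-10% and +/-25%.
pct_diff = 100.0 * (est_wt - true_wt) / true_wt
print("mean % difference:", pct_diff.mean(), " SD:", pct_diff.std(ddof=1))
print("within +/-10%:", np.mean(np.abs(pct_diff) <= 10.0))
print("within +/-25%:", np.mean(np.abs(pct_diff) <= 25.0))

# Precision: Bland-Altman bias and 95% limits of agreement (kg).
diff = est_wt - true_wt
bias, sd = diff.mean(), diff.std(ddof=1)
print("bias:", bias, " limits of agreement:", (bias - 1.96 * sd, bias + 1.96 * sd))
```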

  18. Computer modelling of the UK wind energy resource: final overview report

    Energy Technology Data Exchange (ETDEWEB)

    Burch, S F; Ravenscroft, F

    1993-12-31

    This report describes the results of a programme of work to estimate the UK wind energy resource. Mean wind speed maps and quantitative resource estimates were obtained using the NOABL mesoscale (1 km resolution) numerical model for the prediction of wind flow over complex terrain. NOABL was used in conjunction with digitised terrain data and wind data from surface meteorological stations for a ten year period (1975-1984) to provide digital UK maps of mean wind speed at 10m, 25m and 45m above ground level. Also included in the derivation of these maps was the use of the Engineering Science Data Unit (ESDU) method to model the effect on wind speed of the abrupt change in surface roughness that occurs at the coast. Existing isovent maps, based on standard meteorological data which take no account of terrain effects, indicate that 10m annual mean wind speeds vary between about 4.5 and 7 m/s over the UK with only a few coastal areas over 6 m/s. The present study indicated that 23% of the UK land area had speeds over 6 m/s, with many hill sites having 10m speeds over 10 m/s. It is concluded that these `first order` resource estimates represent a substantial improvement over the presently available `zero order` estimates. (20 figures, 7 tables, 10 references). (author)

  19. EXTRAN: A computer code for estimating concentrations of toxic substances at control room air intakes

    International Nuclear Information System (INIS)

    Ramsdell, J.V.

    1991-03-01

    This report presents the NRC staff with a tool for assessing the potential effects of accidental releases of radioactive materials and toxic substances on habitability of nuclear facility control rooms. The tool is a computer code that estimates concentrations at nuclear facility control room air intakes given information about the release and the environmental conditions. The name of the computer code is EXTRAN. EXTRAN combines procedures for estimating the amount of airborne material, a Gaussian puff dispersion model, and the most recent algorithms for estimating diffusion coefficients in building wakes. It is a modular computer code, written in FORTRAN-77, that runs on personal computers. It uses a math coprocessor, if present, but does not require one. Code output may be directed to a printer or disk files. 25 refs., 8 figs., 4 tabs
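
    The dispersion step of such a code can be illustrated with the textbook Gaussian puff equation; the sketch below is a generic puff with ground reflection and fixed dispersion coefficients, not EXTRAN's actual formulation or its building-wake diffusion coefficients, and all parameter values are placeholders.

```python
import numpy as np

def puff_concentration(x, y, z, t, Q, u, H, sigma_x, sigma_y, sigma_z):
    """Generic Gaussian puff with ground reflection.

    Q: released mass, u: wind speed along x, H: release height,
    sigma_*: dispersion coefficients (in a real model these grow with travel time).
    """
    norm = Q / ((2.0 * np.pi) ** 1.5 * sigma_x * sigma_y * sigma_z)
    along = np.exp(-((x - u * t) ** 2) / (2.0 * sigma_x ** 2))
    cross = np.exp(-(y ** 2) / (2.0 * sigma_y ** 2))
    vert = (np.exp(-((z - H) ** 2) / (2.0 * sigma_z ** 2))
            + np.exp(-((z + H) ** 2) / (2.0 * sigma_z ** 2)))   # image source for the ground
    return norm * along * cross * vert

# Concentration at an air intake 100 m downwind, 2 m above ground, 20 s after release.
print(puff_concentration(x=100.0, y=0.0, z=2.0, t=20.0, Q=1.0, u=5.0, H=10.0,
                         sigma_x=8.0, sigma_y=8.0, sigma_z=4.0))
```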

  20. MLSOIL and DFSOIL - computer codes to estimate effective ground surface concentrations for dose computations

    International Nuclear Information System (INIS)

    Sjoreen, A.L.; Kocher, D.C.; Killough, G.G.; Miller, C.W.

    1984-11-01

    This report is a user's manual for MLSOIL (Multiple Layer SOIL model) and DFSOIL (Dose Factors for MLSOIL) and a documentation of the computational methods used in those two computer codes. MLSOIL calculates an effective ground surface concentration to be used in computations of external doses. This effective ground surface concentration is equal to (the computed dose in air from the concentration in the soil layers)/(the dose factor for computing dose in air from a plane). MLSOIL implements a five compartment linear-transfer model to calculate the concentrations of radionuclides in the soil following deposition on the ground surface from the atmosphere. The model considers leaching through the soil as well as radioactive decay and buildup. The element-specific transfer coefficients used in this model are a function of the K_d and environmental parameters. DFSOIL calculates the dose in air per unit concentration at 1 m above the ground from each of the five soil layers used in MLSOIL and the dose per unit concentration from an infinite plane source. MLSOIL and DFSOIL have been written to be part of the Computerized Radiological Risk Investigation System (CRRIS) which is designed for assessments of the health effects of airborne releases of radionuclides. 31 references, 3 figures, 4 tables
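
    A five-compartment linear-transfer model of this kind reduces to a constant-coefficient system of first-order ODEs; the sketch below integrates such a system with a matrix exponential, using made-up leaching and decay rates rather than MLSOIL's element-specific transfer coefficients derived from K_d and environmental parameters.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical first-order leaching rates out of each of five soil layers (1/day)
# and a single radioactive decay constant (here roughly a 30-day half-life).
leach = np.array([0.010, 0.008, 0.006, 0.004, 0.002])
decay = np.log(2.0) / 30.0

# Build the transfer matrix: loss from each layer, gain from the layer above.
A = np.zeros((5, 5))
for i in range(5):
    A[i, i] = -(leach[i] + decay)
    if i > 0:
        A[i, i - 1] = leach[i - 1]

c0 = np.array([1.0, 0.0, 0.0, 0.0, 0.0])      # unit deposition into the top layer
for t in (0, 30, 90, 365):                    # days after deposition
    print(t, np.round(expm(A * t) @ c0, 4))   # concentration remaining in each layer
```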

  1. Estimating the Value of Improved Distributed Photovoltaic Adoption Forecasts for Utility Resource Planning

    Energy Technology Data Exchange (ETDEWEB)

    Gagnon, Pieter [National Renewable Energy Lab. (NREL), Golden, CO (United States); Barbose, Galen L. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Stoll, Brady [National Renewable Energy Lab. (NREL), Golden, CO (United States); Ehlen, Ali [National Renewable Energy Lab. (NREL), Golden, CO (United States); Zuboy, Jarret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mai, Trieu [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mills, Andrew D. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2018-05-15

    Misforecasting the adoption of customer-owned distributed photovoltaics (DPV) can have operational and financial implications for utilities; forecasting capabilities can be improved, but generally at a cost. This paper informs this decision-space by using a suite of models to explore the capacity expansion and operation of the Western Interconnection over a 15-year period across a wide range of DPV growth rates and misforecast severities. The system costs under a misforecast are compared against the costs under a perfect forecast, to quantify the costs of misforecasting. Using a simplified probabilistic method applied to these modeling results, an analyst can make a first-order estimate of the financial benefit of improving a utility’s forecasting capabilities, and thus be better informed about whether to make such an investment. For example, under our base assumptions, a utility with 10 TWh per year of retail electric sales who initially estimates that DPV growth could range from 2% to 7.5% of total generation over the next 15 years could expect total present-value savings of approximately $4 million if they could reduce the severity of misforecasting to within ±25%. Utility resource planners can compare those savings against the costs needed to achieve that level of precision, to guide their decision on whether to make an investment in tools or resources.

  2. Discharge Fee Policy Analysis: A Computable General Equilibrium (CGE) Model of Water Resources and Water Environments

    OpenAIRE

    Guohua Fang; Ting Wang; Xinyi Si; Xin Wen; Yu Liu

    2016-01-01

    To alleviate increasingly serious water pollution and shortages in developing countries, various kinds of policies have been implemented by local governments. It is vital to quantify and evaluate the performance and potential economic impacts of these policies. This study develops a Computable General Equilibrium (CGE) model to simulate the regional economic and environmental effects of discharge fees. Firstly, water resources and water environment factors are separated from the input and out...

  3. Reducing usage of the computational resources by event driven approach to model predictive control

    Science.gov (United States)

    Misik, Stefan; Bradac, Zdenek; Cela, Arben

    2017-08-01

    This paper deals with real-time optimal control of dynamic systems while also considering the constraints to which these systems may be subject. The main objective of this work is to propose a simple modification of the existing Model Predictive Control approach to better suit the needs of computationally resource-constrained real-time systems. An example using a model of a mechanical system is presented and the performance of the proposed method is evaluated in a simulated environment.
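
    One common event-driven variant of Model Predictive Control re-solves the optimization only when the measured state drifts from the previously predicted trajectory by more than a threshold, instead of at every sampling instant; the sketch below illustrates this idea on a toy scalar plant with an off-the-shelf optimizer, and is not the specific modification proposed in the paper.

```python
import numpy as np
from scipy.optimize import minimize

A, B = 0.9, 0.1                   # toy scalar plant: x[k+1] = A*x[k] + B*u[k]
HORIZON, THRESHOLD = 10, 0.02     # prediction horizon and event-trigger threshold

def rollout(x0, u_seq):
    """Predicted states after applying each control in u_seq from x0."""
    xs, x = [], x0
    for u in u_seq:
        x = A * x + B * u
        xs.append(x)
    return np.array(xs)

def solve_mpc(x0):
    """Minimise sum(x^2) + 0.1*sum(u^2) over the horizon with |u| <= 1."""
    cost = lambda u: np.sum(rollout(x0, u) ** 2) + 0.1 * np.sum(np.asarray(u) ** 2)
    return minimize(cost, np.zeros(HORIZON), bounds=[(-1.0, 1.0)] * HORIZON).x

rng = np.random.default_rng(0)
x, i, solves = 1.0, 0, 1
plan = solve_mpc(x)
plan_states = rollout(x, plan)
for k in range(40):
    x = A * x + B * plan[i] + rng.normal(0.0, 0.01)   # plant with a small disturbance
    # Event-driven re-optimisation: only when the plan no longer matches reality.
    if i + 1 >= HORIZON or abs(x - plan_states[i]) > THRESHOLD:
        plan = solve_mpc(x)
        plan_states = rollout(x, plan)
        i, solves = 0, solves + 1
    else:
        i += 1
print("MPC optimisations solved over 40 steps:", solves)
```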

  4. Piping data bank and erection system of Angra 2: structure, computational resources and systems

    International Nuclear Information System (INIS)

    Abud, P.R.; Court, E.G.; Rosette, A.C.

    1992-01-01

    The piping data bank of Angra 2, called the Erection Management System, was developed to manage the piping erection of the Angra 2 nuclear power plant. Beyond the erection follow-up of piping and supports, it manages the piping design, material procurement, the flow of fabrication documents, weld testing and material stocks at the warehouse. The work carried out to define the structure of the data bank, the computational resources and the systems is described here. (author)

  5. Blockchain-Empowered Fair Computational Resource Sharing System in the D2D Network

    Directory of Open Access Journals (Sweden)

    Zhen Hong

    2017-11-01

    Device-to-device (D2D) communication is becoming an increasingly important technology in future networks with the climbing demand for local services. For instance, resource sharing in the D2D network features ubiquitous availability, flexibility, low latency and low cost. However, these features also bring along challenges when building a satisfactory resource sharing system in the D2D network. Specifically, user mobility is one of the top concerns for designing a cooperative D2D computational resource sharing system, since mutual communication may not be stably available due to user mobility. A previous endeavour has demonstrated how connectivity can be incorporated into cooperative task scheduling among users in the D2D network to effectively lower the average task execution time. There are doubts, however, about whether this type of task scheduling scheme, though effective, ensures fairness among users. In other words, it can be unfair for users who contribute many computational resources while receiving little when in need. In this paper, we propose a novel blockchain-based credit system that can be incorporated into the connectivity-aware task scheduling scheme to enforce fairness among users in the D2D network. Users’ computational task cooperation is recorded on the public blockchain ledger in the system as transactions, and each user’s credit balance is easily accessible from the ledger. A supernode at the base station is responsible for scheduling cooperative computational tasks based on user mobility and user credit balance. We investigated the performance of the credit system, and simulation results showed that with a minor sacrifice of average task execution time, the level of fairness is greatly enhanced.

  6. Estimation of resource savings due to fly ash utilization in road construction

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Subodh; Patil, C.B. [Centre for Energy Studies, Indian Institute of Technology, New Delhi 110016 (India)

    2006-08-15

    A methodology for estimation of natural resource savings due to fly ash utilization in road construction in India is presented. Analytical expressions for the savings of various resources namely soil, stone aggregate, stone chips, sand and cement in the embankment, granular sub-base (GSB), water bound macadam (WBM) and pavement quality concrete (PQC) layers of fly ash based road formation with flexible and rigid pavements of a given geometry have been developed. The quantity of fly ash utilized in these layers of different pavements has also been quantified. In the present study, the maximum amount of resource savings is found in GSB followed by WBM and other layers of pavement. The soil quantity saved increases asymptotically with the rise in the embankment height. The results of financial analysis based on Indian fly ash based road construction cost data indicate that the savings in construction cost decrease with the lead and the investment on this alternative is found to be financially attractive only for a lead less than 60 and 90km for flexible and rigid pavements, respectively. (author)

  7. A geo-informatics approach for estimating water resources management components and their interrelationships

    KAUST Repository

    Liaqat, Umar Waqas

    2016-09-21

    A remote sensing based geo-informatics approach was developed to estimate water resources management (WRM) components across a large irrigation scheme in the Indus Basin of Pakistan. The approach provides a generalized framework for estimating a range of key water management variables and provides a management tool for the sustainable operation of similar schemes globally. A focus on the use of satellite data allowed for the quantification of relationships across a range of spatial and temporal scales. Variables including actual and crop evapotranspiration, net and gross irrigation, net and gross groundwater use, groundwater recharge, net groundwater recharge, were estimated and then their interrelationships explored across the Hakra Canal command area. Spatially distributed remotely sensed estimates of actual evapotranspiration (ETa) rates were determined using the Surface Energy Balance System (SEBS) model and evaluated against ground-based evaporation calculated from the advection-aridity method. Analysis of ETa simulations across two cropping season, referred to as Kharif and Rabi, yielded Pearson correlation (R) values of 0.69 and 0.84, Nash-Sutcliffe criterion (NSE) of 0.28 and 0.63, percentage bias of −3.85% and 10.6% and root mean squared error (RMSE) of 10.6 mm and 12.21 mm for each season, respectively. For the period of study between 2008 and 2014, it was estimated that an average of 0.63 mm day−1 water was supplied through canal irrigation against a crop water demand of 3.81 mm day−1. Approximately 1.86 mm day−1 groundwater abstraction was estimated in the region, which contributed to fulfil the gap between crop water demand and canal water supply. Importantly, the combined canal, groundwater and rainfall sources of water only met 70% of the crop water requirements. As such, the difference between recharge and discharge showed that groundwater depletion was around −115 mm year−1 during the six year study period. Analysis indicated that
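
    The evaluation statistics quoted above (Pearson correlation, Nash-Sutcliffe efficiency, percentage bias and RMSE) can be computed directly from paired ground-based and remotely sensed values; the sketch below uses a handful of made-up daily evapotranspiration values purely to show the formulas, and sign conventions for percentage bias vary between studies.

```python
import numpy as np

def validation_metrics(obs, sim):
    """Pearson R, Nash-Sutcliffe efficiency, percentage bias and RMSE."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]
    nse = 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
    pbias = 100.0 * np.sum(sim - obs) / np.sum(obs)
    rmse = float(np.sqrt(np.mean((sim - obs) ** 2)))
    return r, nse, pbias, rmse

# Toy daily ET values (mm/day): ground-based estimates vs. SEBS-style retrievals.
obs = [3.1, 3.4, 2.8, 4.0, 3.6, 2.9, 3.3]
sim = [3.0, 3.6, 2.6, 4.2, 3.5, 3.1, 3.4]
print(validation_metrics(obs, sim))
```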

  8. Collocational Relations in Japanese Language Textbooks and Computer-Assisted Language Learning Resources

    Directory of Open Access Journals (Sweden)

    Irena SRDANOVIĆ

    2011-05-01

    In this paper, we explore the presence of collocational relations in computer-assisted language learning systems and other language resources for the Japanese language, on the one hand, and in Japanese language learning textbooks and wordlists, on the other. After introducing how important it is to learn collocational relations in a foreign language, we examine their coverage in the various learners’ resources for the Japanese language. We particularly concentrate on a few collocations at the beginner’s level, where we demonstrate their treatment across various resources. Special attention is paid to what are referred to as unpredictable collocations, which carry a greater foreign-language learning burden than predictable ones.

  9. Resource allocation on computational grids using a utility model and the knapsack problem

    CERN Document Server

    Van der Ster, Daniel C; Parra-Hernandez, Rafael; Sobie, Randall J

    2009-01-01

    This work introduces a utility model (UM) for resource allocation on computational grids and formulates the allocation problem as a variant of the 0–1 multichoice multidimensional knapsack problem. The notion of task-option utility is introduced, and it is used to effect allocation policies. We present a variety of allocation policies, which are expressed as functions of metrics that are both intrinsic and external to the task and resources. An external user-defined credit-value metric is shown to allow users to intervene in the allocation of urgent or low priority tasks. The strategies are evaluated in simulation against random workloads as well as those drawn from real systems. We measure the sensitivity of the UM-derived schedules to variations in the allocation policies and their corresponding utility functions. The UM allocation strategy is shown to optimally allocate resources congruent with the chosen policies.
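
    A greedy heuristic gives a compact illustration of the 0-1 multichoice multidimensional knapsack formulation used above: each task offers several options with multidimensional resource demands and a utility, at most one option per task may be selected, and aggregate demands must respect capacity limits. The sketch below ranks options by utility per unit of normalised demand; the tasks, demands and utilities are invented for illustration and this is not the paper's allocation policy.

```python
# Aggregate capacities of the grid resource (two "dimensions" of the knapsack).
capacity = {"cpu": 10.0, "mem": 16.0}

# (task, option) -> (resource demands, utility); values are illustrative only.
options = {
    ("t1", "fast"): ({"cpu": 4.0, "mem": 8.0}, 10.0),
    ("t1", "slow"): ({"cpu": 2.0, "mem": 4.0}, 6.0),
    ("t2", "fast"): ({"cpu": 6.0, "mem": 6.0}, 9.0),
    ("t2", "slow"): ({"cpu": 3.0, "mem": 3.0}, 5.0),
    ("t3", "only"): ({"cpu": 3.0, "mem": 5.0}, 7.0),
}

def density(demand, utility):
    """Utility per unit of capacity-normalised resource demand."""
    return utility / sum(demand[r] / capacity[r] for r in capacity)

chosen, used = {}, {r: 0.0 for r in capacity}
for (task, opt), (demand, util) in sorted(
        options.items(), key=lambda kv: density(*kv[1]), reverse=True):
    if task in chosen:                      # multichoice: at most one option per task
        continue
    if all(used[r] + demand[r] <= capacity[r] for r in capacity):
        chosen[task] = opt
        for r in capacity:
            used[r] += demand[r]

print("allocation:", chosen, " resource usage:", used)
```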

  10. Multidetector row computed tomography may accurately estimate plaque vulnerability. Does MDCT accurately estimate plaque vulnerability? (Pro)

    International Nuclear Information System (INIS)

    Komatsu, Sei; Imai, Atsuko; Kodama, Kazuhisa

    2011-01-01

    Over the past decade, multidetector row computed tomography (MDCT) has become the most reliable and established of the noninvasive examination techniques for detecting coronary heart disease. Now MDCT is chasing intravascular ultrasound (IVUS) in terms of spatial resolution. Among the components of vulnerable plaque, MDCT may detect lipid-rich plaque, the lipid pool, and calcified spots using computed tomography number. Plaque components are detected by MDCT with high accuracy compared with IVUS and angioscopy when assessing vulnerable plaque. The TWINS study and TOGETHAR trial demonstrated that angioscopic loss of yellow color occurred independently of volumetric plaque change by statin therapy. These 2 studies showed that plaque stabilization and regression reflect independent processes mediated by different mechanisms and time course. Noncalcified plaque and/or low-density plaque was found to be the strongest predictor of cardiac events, regardless of lesion severity, and act as a potential marker of plaque vulnerability. MDCT may be an effective tool for early triage of patients with chest pain who have a normal electrocardiogram (ECG) and cardiac enzymes in the emergency department. MDCT has the potential ability to analyze coronary plaque quantitatively and qualitatively if some problems are resolved. MDCT may become an essential tool for detecting and preventing coronary artery disease in the future. (author)

  11. The Trope Tank: A Laboratory with Material Resources for Creative Computing

    Directory of Open Access Journals (Sweden)

    Nick Montfort

    2014-12-01

    http://dx.doi.org/10.5007/1807-9288.2014v10n2p53 Principles for organizing and making use of a laboratory with material computing resources are articulated. This laboratory, the Trope Tank, is a facility for teaching, research, and creative collaboration and offers hardware (in working condition and set up for use) from the 1970s, 1980s, and 1990s, including videogame systems, home computers, and an arcade cabinet. To aid in investigating the material history of texts, the lab has a small 19th century letterpress, a typewriter, a print terminal, and dot-matrix printers. Other resources include controllers, peripherals, manuals, books, and software on physical media. These resources are used for teaching, loaned for local exhibitions and presentations, and accessed by researchers and artists. The space is primarily a laboratory (rather than a library, studio, or museum), so materials are organized by platform and intended use. Textual information about the historical contexts of the available systems is provided, and resources are set up to allow easy operation, and even casual use, by researchers, teachers, students, and artists.

  12. Testing a computer-based ostomy care training resource for staff nurses.

    Science.gov (United States)

    Bales, Isabel

    2010-05-01

    Fragmented teaching and ostomy care provided by nonspecialized clinicians unfamiliar with state-of-the-art care and products have been identified as problems in teaching ostomy care to the new ostomate. After conducting a literature review of theories and concepts related to the impact of nurse behaviors and confidence on ostomy care, the author developed a computer-based learning resource and assessed its effect on staff nurse confidence. Of 189 staff nurses with a minimum of 1 year acute-care experience employed in the acute care, emergency, and rehabilitation departments of an acute care facility in the Midwestern US, 103 agreed to participate and returned completed pre- and post-tests, each comprising the same eight statements about providing ostomy care. F and P values were computed for differences between pre- and post test scores. Based on a scale where 1 = totally disagree and 5 = totally agree with the statement, baseline confidence and perceived mean knowledge scores averaged 3.8 and after viewing the resource program post-test mean scores averaged 4.51, a statistically significant improvement (P = 0.000). The largest difference between pre- and post test scores involved feeling confident in having the resources to learn ostomy skills independently. The availability of an electronic ostomy care resource was rated highly in both pre- and post testing. Studies to assess the effects of increased confidence and knowledge on the quality and provision of care are warranted.

  13. Radiotherapy infrastructure and human resources in Switzerland. Present status and projected computations for 2020

    Energy Technology Data Exchange (ETDEWEB)

    Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar [KSA-KSB, Kantonsspital Aarau, RadioOnkologieZentrum, Aarau (Switzerland); Zwahlen, Daniel [Kantonsspital Graubuenden, Department of Radiotherapy, Chur (Switzerland); Bodis, Stephan [KSA-KSB, Kantonsspital Aarau, RadioOnkologieZentrum, Aarau (Switzerland); University Hospital Zurich, Department of Radiation Oncology, Zurich (Switzerland)

    2016-09-15

    The purpose of this study was to evaluate the present status of radiotherapy infrastructure and human resources in Switzerland and compute projections for 2020. The European Society of Therapeutic Radiation Oncology "Quantification of Radiation Therapy Infrastructure and Staffing" guidelines (ESTRO-QUARTS) and those of the International Atomic Energy Agency (IAEA) were applied to estimate the requirements for teleradiotherapy (TRT) units, radiation oncologists (RO), medical physicists (MP) and radiotherapy technologists (RTT). The databases used for computation of the present gap and additional requirements are (a) Global Cancer Incidence, Mortality and Prevalence (GLOBOCAN) for cancer incidence, (b) the Directory of Radiotherapy Centres (DIRAC) of the IAEA for existing TRT units, (c) human resources from the recent ESTRO "Health Economics in Radiation Oncology" (HERO) survey and (d) radiotherapy utilization (RTU) rates for each tumour site, published by the Ingham Institute for Applied Medical Research (IIAMR). In 2015, 30,999 of 45,903 cancer patients would have required radiotherapy. By 2020, this will have increased to 34,041 of 50,427 cancer patients. Switzerland presently has an adequate number of TRTs, but a deficit of 57 ROs, 14 MPs and 36 RTTs. By 2020, an additional 7 TRTs, 72 ROs, 22 MPs and 66 RTTs will be required. In addition, a realistic dynamic model for calculation of staff requirements due to anticipated changes in future radiotherapy practices has been proposed. This model could be tailor-made and individualized for any radiotherapy centre. A 9.8% increase in radiotherapy requirements is expected for cancer patients over the next 5 years. The present study should assist the stakeholders and health planners in designing an appropriate strategy for meeting future radiotherapy needs for Switzerland. (orig.)

  14. Linking resource selection and mortality modeling for population estimation of mountain lions in Montana

    Science.gov (United States)

    Robinson, Hugh S.; Ruth, Toni K.; Gude, Justin A.; Choate, David; DeSimone, Rich; Hebblewhite, Mark; Matchett, Marc R.; Mitchell, Michael S.; Murphy, Kerry; Williams, Jim

    2015-01-01

    To be most effective, the scale of wildlife management practices should match the range of a particular species’ movements. For this reason, combined with our inability to rigorously or regularly census mountain lion populations, several authors have suggested that mountain lions be managed in a source-sink or metapopulation framework. We used a combination of resource selection functions, mortality estimation, and dispersal modeling to estimate cougar population levels in Montana statewide and potential population level effects of planned harvest levels. Between 1980 and 2012, 236 independent mountain lions were collared and monitored for research in Montana. From these data we used 18,695 GPS locations collected during winter from 85 animals to develop a resource selection function (RSF), and 11,726 VHF and GPS locations from 142 animals along with the locations of 6343 mountain lions harvested from 1988–2011 to validate the RSF model. Our RSF model validated well in all portions of the State, although it appeared to perform better in Montana Fish, Wildlife and Parks (MFWP) Regions 1, 2, 4 and 6, than in Regions 3, 5, and 7. Our mean RSF based population estimate for the total population (kittens, juveniles, and adults) of mountain lions in Montana in 2005 was 3926, with almost 25% of the entire population in MFWP Region 1. Estimates based on a high and low reference population estimates produce a possible range of 2784 to 5156 mountain lions statewide. Based on a range of possible survival rates we estimated the mountain lion population in Montana to be stable to slightly increasing between 2005 and 2010 with lambda ranging from 0.999 (SD = 0.05) to 1.02 (SD = 0.03). We believe these population growth rates to be a conservative estimate of true population growth. Our model suggests that proposed changes to female harvest quotas for 2013–2015 will result in an annual statewide population decline of 3% and shows that, due to reduced dispersal, changes to

  15. Comparison of Personal Resources in Patients Who Differently Estimate the Impact of Multiple Sclerosis.

    Science.gov (United States)

    Wilski, Maciej; Tomczak, Maciej

    2017-04-01

    Discrepancies between physicians' assessment and patients' subjective representations of the disease severity may influence physician-patient communication and management of a chronic illness, such as multiple sclerosis (MS). For these reasons, it is important to recognize factors that distinguish patients who differently estimate the impact of MS. The purpose of this study was to verify if the patients who overestimate or underestimate the impact of MS differ in their perception of personal resources from individuals presenting with a realistic appraisal of their physical condition. A total of 172 women and 92 men diagnosed with MS completed Multiple Sclerosis Impact Scale, University of Washington Self Efficacy Scale, Rosenberg Self-Esteem Scale, Body Esteem Scale, Brief Illness Perception Questionnaire, Treatment Beliefs Scale, Actually Received Support Scale, and Socioeconomic resources scale. Physician's assessment of health status was determined with Expanded Disability Status Scale. Linear regression analysis was conducted to identify the subsets of patients with various patterns of subjective health and Expanded Disability Status Scale (EDSS) scores. Patients overestimating the impact of their disease presented with significantly lower levels of self-esteem, self-efficacy in MS, and body esteem; furthermore, they perceived their condition more threatening than did realists and underestimators. They also assessed anti-MS treatment worse, had less socioeconomic resources, and received less support than underestimators. Additionally, underestimators presented with significantly better perception of their disease, self, and body than did realists. Self-assessment of MS-related symptoms is associated with specific perception of personal resources in coping with the disease. These findings may facilitate communication with patients and point to new directions for future research on adaptation to MS.

  16. Optimizing qubit resources for quantum chemistry simulations in second quantization on a quantum computer

    International Nuclear Information System (INIS)

    Moll, Nikolaj; Fuhrer, Andreas; Staar, Peter; Tavernelli, Ivano

    2016-01-01

    Quantum chemistry simulations on a quantum computer suffer from the overhead needed for encoding the Fermionic problem in a system of qubits. By exploiting the block diagonality of a Fermionic Hamiltonian, we show that the number of required qubits can be reduced while the number of terms in the Hamiltonian will increase. All operations for this reduction can be performed in operator space. The scheme is conceived as a pre-computational step that would be performed prior to the actual quantum simulation. We apply this scheme to reduce the number of qubits necessary to simulate both the Hamiltonian of the two-site Fermi–Hubbard model and the hydrogen molecule. Both quantum systems can then be simulated with a two-qubit quantum computer. Despite the increase in the number of Hamiltonian terms, the scheme still remains a useful tool to reduce the dimensionality of specific quantum systems for quantum simulators with a limited number of resources. (paper)

  17. Direction for the Estimation of Required Resources for Nuclear Power Plant Decommissioning based on BIM via Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Insu [Korea Institute of Construction Technology, Goyang (Korea, Republic of); Kim, Woojung [KHNP-Central Research Institute, Daejeon (Korea, Republic of)

    2014-05-15

    Past approaches to estimating the resources required for decommissioning have involved great uncertainty, since they analyze required resources at the construction stage by analyzing and consulting the decommissioning resource requirements of overseas nuclear power plants. As demands for efficient management and use of complicated construction information have increased in recent years, demand for the introduction of Building Information Modeling (hereinafter referred to as BIM) technology has also increased. In the area of quotation, considerable effects are expected on the accuracy and reliability of predicting construction costs, owing to the ability to automatically estimate quantities using the attribute information of the BIM model. BIM-based estimation and quotation of required resources is more accurate than existing 2D-based quotation and has many advantages, such as reviews of constructability and interference. It is desirable to estimate the resources required for nuclear power plant decommissioning using BIM, as well as tools that are compatible with the usual international and industrial standards. Looking into cases in Korea and abroad where required resources were estimated using BIM, they dealt with estimation of required resources, estimation of construction cost and process management at large. In each area, methodologies, classification systems, BIM, and realization tests have been used variably. Nonetheless, several problems have been reported; among them, it is noticeable that although a BIM standard classification system exists, no case was found that used the standard classification system. This means that no interlink among the OBS (Object Breakdown Structure), WBS (Work Breakdown Structure) and CBS (Cost Breakdown Structure) was possible. Thus, for nuclear power plant decommissioning, the decommissioning method and process, etc. shall be defined clearly at the stage of decommissioning strategy establishment, so that classification systems must be set up

  18. Direction for the Estimation of Required Resources for Nuclear Power Plant Decommissioning based on BIM via Case Study

    International Nuclear Information System (INIS)

    Jung, Insu; Kim, Woojung

    2014-01-01

    Past approaches to estimating the resources required for decommissioning have involved great uncertainty, since they analyze required resources at the construction stage by analyzing and consulting the decommissioning resource requirements of overseas nuclear power plants. As demands for efficient management and use of complicated construction information have increased in recent years, demand for the introduction of Building Information Modeling (hereinafter referred to as BIM) technology has also increased. In the area of quotation, considerable effects are expected on the accuracy and reliability of predicting construction costs, owing to the ability to automatically estimate quantities using the attribute information of the BIM model. BIM-based estimation and quotation of required resources is more accurate than existing 2D-based quotation and has many advantages, such as reviews of constructability and interference. It is desirable to estimate the resources required for nuclear power plant decommissioning using BIM, as well as tools that are compatible with the usual international and industrial standards. Looking into cases in Korea and abroad where required resources were estimated using BIM, they dealt with estimation of required resources, estimation of construction cost and process management at large. In each area, methodologies, classification systems, BIM, and realization tests have been used variably. Nonetheless, several problems have been reported; among them, it is noticeable that although a BIM standard classification system exists, no case was found that used the standard classification system. This means that no interlink among the OBS (Object Breakdown Structure), WBS (Work Breakdown Structure) and CBS (Cost Breakdown Structure) was possible. Thus, for nuclear power plant decommissioning, the decommissioning method and process, etc. shall be defined clearly at the stage of decommissioning strategy establishment, so that classification systems must be set up

  19. Resource utilization and costs during the initial years of lung cancer screening with computed tomography in Canada.

    Science.gov (United States)

    Cressman, Sonya; Lam, Stephen; Tammemagi, Martin C; Evans, William K; Leighl, Natasha B; Regier, Dean A; Bolbocean, Corneliu; Shepherd, Frances A; Tsao, Ming-Sound; Manos, Daria; Liu, Geoffrey; Atkar-Khattra, Sukhinder; Cromwell, Ian; Johnston, Michael R; Mayo, John R; McWilliams, Annette; Couture, Christian; English, John C; Goffin, John; Hwang, David M; Puksa, Serge; Roberts, Heidi; Tremblay, Alain; MacEachern, Paul; Burrowes, Paul; Bhatia, Rick; Finley, Richard J; Goss, Glenwood D; Nicholas, Garth; Seely, Jean M; Sekhon, Harmanjatinder S; Yee, John; Amjadi, Kayvan; Cutz, Jean-Claude; Ionescu, Diana N; Yasufuku, Kazuhiro; Martel, Simon; Soghrati, Kamyar; Sin, Don D; Tan, Wan C; Urbanski, Stefan; Xu, Zhaolin; Peacock, Stuart J

    2014-10-01

    It is estimated that millions of North Americans would qualify for lung cancer screening and that billions of dollars of national health expenditures would be required to support population-based computed tomography lung cancer screening programs. The decision to implement such programs should be informed by data on resource utilization and costs. Resource utilization data were collected prospectively from 2059 participants in the Pan-Canadian Early Detection of Lung Cancer Study using low-dose computed tomography (LDCT). Participants who had 2% or greater lung cancer risk over 3 years using a risk prediction tool were recruited from seven major cities across Canada. A cost analysis was conducted from the Canadian public payer's perspective for resources that were used for the screening and treatment of lung cancer in the initial years of the study. The average per-person cost for screening individuals with LDCT was $453 (95% confidence interval [CI], $400-$505) for the initial 18-months of screening following a baseline scan. The screening costs were highly dependent on the detected lung nodule size, presence of cancer, screening intervention, and the screening center. The mean per-person cost of treating lung cancer with curative surgery was $33,344 (95% CI, $31,553-$34,935) over 2 years. This was lower than the cost of treating advanced-stage lung cancer with chemotherapy, radiotherapy, or supportive care alone, ($47,792; 95% CI, $43,254-$52,200; p = 0.061). In the Pan-Canadian study, the average cost to screen individuals with a high risk for developing lung cancer using LDCT and the average initial cost of curative intent treatment were lower than the average per-person cost of treating advanced stage lung cancer which infrequently results in a cure.

  20. Subroutine library for error estimation of matrix computation (Ver. 1.0)

    International Nuclear Information System (INIS)

    Ichihara, Kiyoshi; Shizawa, Yoshihisa; Kishida, Norio

    1999-03-01

    'Subroutine Library for Error Estimation of Matrix Computation' is a subroutine library which aids users in obtaining the error ranges of linear systems' solutions or Hermitian matrices' eigenvalues. The library contains routines for both sequential computers and parallel computers. The subroutines for linear system error estimation calculate norms of residual vectors, matrices' condition numbers, error bounds of solutions and so on. The subroutines for error estimation of Hermitian matrix eigenvalues derive the error ranges of the eigenvalues according to the Korn-Kato formula. The test matrix generators supply matrices that appear in mathematical research, randomly generated matrices, and matrices that appear in application programs. This user's manual contains a brief mathematical background of error analysis in linear algebra and usage of the subroutines. (author)
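
    The linear-system part of such a library typically rests on the classical residual-based bound ||x - x_hat|| / ||x|| <= cond(A) * ||r|| / ||b||, where r = b - A x_hat; the sketch below evaluates that bound for a random test system and compares it with the actual error (the specific routines and formulas of the library described above are not reproduced here).

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((200, 200))
x_true = rng.standard_normal(200)
b = A @ x_true

x_hat = np.linalg.solve(A, b)            # computed solution
r = b - A @ x_hat                        # residual vector
cond = np.linalg.cond(A)                 # 2-norm condition number

# Classical bound: relative error <= cond(A) * ||r|| / ||b||.
bound = cond * np.linalg.norm(r) / np.linalg.norm(b)
actual = np.linalg.norm(x_true - x_hat) / np.linalg.norm(x_true)
print(f"residual {np.linalg.norm(r):.2e}  cond {cond:.2e}  "
      f"bound {bound:.2e}  actual error {actual:.2e}")
```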

  1. Concrete resource analysis of the quantum linear-system algorithm used to compute the electromagnetic scattering cross section of a 2D target

    Science.gov (United States)

    Scherer, Artur; Valiron, Benoît; Mau, Siun-Chuon; Alexander, Scott; van den Berg, Eric; Chapuran, Thomas E.

    2017-03-01

    We provide a detailed estimate for the logical resource requirements of the quantum linear-system algorithm (Harrow et al. in Phys Rev Lett 103:150502, 2009) including the recently described elaborations and application to computing the electromagnetic scattering cross section of a metallic target (Clader et al. in Phys Rev Lett 110:250504, 2013). Our resource estimates are based on the standard quantum-circuit model of quantum computation; they comprise circuit width (related to parallelism), circuit depth (total number of steps), the number of qubits and ancilla qubits employed, and the overall number of elementary quantum gate operations as well as more specific gate counts for each elementary fault-tolerant gate from the standard set {X, Y, Z, H, S, T, CNOT}. In order to perform these estimates, we used an approach that combines manual analysis with automated estimates generated via the Quipper quantum programming language and compiler. Our estimates pertain to the explicit example problem size N = 332,020,680 beyond which, according to a crude big-O complexity comparison, the quantum linear-system algorithm is expected to run faster than the best known classical linear-system solving algorithm. For this problem size, a desired calculation accuracy ε = 0.01 requires an approximate circuit width 340 and circuit depth of order 10^25 if oracle costs are excluded, and a circuit width and circuit depth of order 10^8 and 10^29, respectively, if the resource requirements of oracles are included, indicating that the commonly ignored oracle resources are considerable. In addition to providing detailed logical resource estimates, it is also the purpose of this paper to demonstrate explicitly (using a fine-grained approach rather than relying on coarse big-O asymptotic approximations) how these impressively large numbers arise with an actual circuit implementation of a quantum algorithm. While our estimates may prove to be conservative as more efficient

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  3. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    Science.gov (United States)

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools for communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual

  4. Estimating long-term uranium resource availability and discovery requirements. A Canadian case study

    International Nuclear Information System (INIS)

    Martin, H.L.; Azis, A.; Williams, R.M.

    1979-01-01

    Well-founded estimates of the rate at which a country's resources might be made available are a prime requisite for energy planners and policy makers at the national level. To meet this need, a method is discussed that can aid in the analysis of future supply patterns of uranium and other metals. Known sources are first appraised, on a mine-by-mine basis, in relation to projected domestic needs and expectable export levels. The gap between (a) production from current and anticipated mines, and (b) production levels needed to meet both domestic needs and export opportunities, would have to be met by new sources. Using as measuring sticks the resources and production capabilities of typical uranium deposits, a measure can be obtained of the required timing and magnitude of discovery needs. The new discoveries, when developed into mines, would need to be sufficient to meet not only any shortfalls in production capability, but also any special reserve requirements as stipulated, for example, under Canada's uranium export guidelines. Since the method can be followed simply and quickly, it can serve as a valuable tool for long-term supply assessments of any mineral commodity from a nation's mines. (author)

  5. A Safety Resource Allocation Mechanism against Connection Fault for Vehicular Cloud Computing

    Directory of Open Access Journals (Sweden)

    Tianpeng Ye

    2016-01-01

    The Intelligent Transportation System (ITS) is becoming an important component of the smart city, enabling safer roads, better traffic control, and on-demand services by utilizing and processing the information collected from sensors of vehicles and roadside infrastructure. In ITS, Vehicular Cloud Computing (VCC) is a novel technology balancing the requirements of complex services and the limited capability of on-board computers. However, the behaviors of the vehicles in VCC are dynamic, random, and complex. Thus, one of the key safety issues is the frequent disconnection between the vehicle and the Vehicular Cloud (VC) when this vehicle is computing for a service. More importantly, connection faults seriously disturb the normal services of VCC and affect the safety of transportation. In this paper, a safety resource allocation mechanism against connection faults in VCC is proposed, using a modified workflow with prediction capability. We first propose a probability model for vehicle movement which satisfies the high dynamics and real-time requirements of VCC. We then propose a Prediction-based Reliability Maximization Algorithm (PRMA) to realize the safety resource allocation for VCC. The evaluation shows that our mechanism can improve the reliability and guarantee the real-time performance of the VCC.

  6. Computable Error Estimates for Finite Element Approximations of Elliptic Partial Differential Equations with Rough Stochastic Data

    KAUST Repository

    Hall, Eric Joseph

    2016-12-08

    We derive computable error estimates for finite element approximations of linear elliptic partial differential equations with rough stochastic coefficients. In this setting, the exact solutions contain high frequency content that standard a posteriori error estimates fail to capture. We propose goal-oriented estimates, based on local error indicators, for the pathwise Galerkin and expected quadrature errors committed in standard, continuous, piecewise linear finite element approximations. Derived using easily validated assumptions, these novel estimates can be computed at a relatively low cost and have applications to subsurface flow problems in geophysics where the conductivities are assumed to have lognormal distributions with low regularity. Our theory is supported by numerical experiments on test problems in one and two dimensions.

  7. Estimation of subcriticality with the computed values analysis using MCNP of experiment on coupled cores

    International Nuclear Information System (INIS)

    Sakurai, Kiyoshi; Yamamoto, Toshihiro; Arakawa, Takuya; Naito, Yoshitaka

    1998-01-01

    Experiments on coupled cores performed at TCA were analysed using the continuous energy Monte Carlo calculation code MCNP 4A. Errors of neutron multiplication factors are evaluated using the Indirect Bias Estimation Method proposed by the authors. A calculation simulating the pulsed neutron method was performed for the 17 x 17 + 5G + 17 x 17 core system, and calculations simulating the exponential experiment method were performed for the 16 x 9 + 3G + 16 x 9 and 16 x 9 + 5G + 16 x 9 core systems. Errors of neutron multiplication factors are estimated to be (-1.5) to (-0.6)% as evaluated by the Indirect Bias Estimation Method. The errors evaluated by the conventional pulsed neutron method and the exponential experiment method are estimated to be 7%, but the error is below 1% when subcriticality is estimated with the computed values by applying the Indirect Bias Estimation Method. The feasibility of subcriticality management is improved by applying the method to a full-scale fuel storage facility. (author)

  8. Computationally fast estimation of muscle tension for realtime bio-feedback.

    Science.gov (United States)

    Murai, Akihiko; Kurosaki, Kosuke; Yamane, Katsu; Nakamura, Yoshihiko

    2009-01-01

    In this paper, we propose a method for realtime estimation of whole-body muscle tensions. The main problem of muscle tension estimation is that there are an infinite number of solutions realizing a particular joint torque, due to the actuation redundancy. Numerical optimization techniques, e.g. quadratic programming, are often employed to obtain a unique solution, but they are usually computationally expensive. For example, our implementation of quadratic programming takes about 0.17 sec per frame on a musculoskeletal model with 274 elements, which is far from realtime computation. Here, we propose to reduce the computational cost by using EMG data and by reducing the number of unknowns in the optimization. First, we compute the tensions of muscles with surface EMG data based on biological muscle data, which is a very efficient process. We also assume that their synergists have the same activity levels and compute their tensions with the same model. Tensions of the remaining muscles are then computed using quadratic programming, but the number of unknowns is significantly reduced by assuming that the muscles in the same heteronymous group have the same activity level. The proposed method realizes realtime estimation and visualization of the whole-body muscle tensions, which can be applied to sports training and rehabilitation.
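
    The reduction idea can be sketched as follows: tensions of EMG-measured muscles (and their synergists) are fixed first, the remaining muscles are grouped so that each group shares one unknown, and a small constrained optimization distributes the residual joint torque. The sketch below substitutes a bounded least-squares solve for the paper's quadratic program, and the moment arms, torques, groupings and bounds are all invented placeholders.

```python
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(0)
tau = np.array([40.0, -15.0, 5.0])                      # target torques at 3 joints (N*m)
M = rng.uniform(-0.05, 0.05, size=(3, 12))              # moment arms of 12 muscles (m)

# Muscles 0-3: tensions fixed beforehand from surface EMG and a muscle model.
f_emg = np.array([200.0, 150.0, 80.0, 120.0])
tau_residual = tau - M[:, :4] @ f_emg

# Remaining 8 muscles merged into 3 heteronymous groups sharing one unknown each;
# G maps the 3 group tensions to the 8 individual muscle tensions.
G = np.zeros((8, 3))
G[:3, 0] = G[3:6, 1] = G[6:, 2] = 1.0
A = M[:, 4:] @ G

# Bounded least squares as a stand-in for the quadratic program (0 <= tension <= 1000 N).
res = lsq_linear(A, tau_residual, bounds=(0.0, 1000.0))
print("group tensions:", res.x)
print("torque tracking error:", A @ res.x - tau_residual)
```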

  9. Computational estimation of soybean oil adulteration in Nepalese mustard seed oil based on fatty acid composition

    OpenAIRE

    Shrestha, Kshitij; De Meulenaer, Bruno

    2011-01-01

    The experiment was carried out for the computational estimation of soybean oil adulteration in the mustard seed oil using chemometric technique based on fatty acid composition. Principal component analysis and K-mean clustering of fatty acid composition data showed 4 major mustard/rapeseed clusters, two of high erucic and two of low erucic mustard type. Soybean and other possible adulterants made a distinct cluster from them. The methodology for estimation of soybean oil adulteration was deve...
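
    A chemometric workflow of the kind described above can be outlined with standard tools: project the fatty-acid composition matrix onto its principal components and cluster the scores; the sketch below uses two invented composition profiles (a mustard-like and a soybean-like pattern) only to show the mechanics, not the paper's data or its adulteration-estimation model.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

# Toy fatty-acid composition matrix: rows are oil samples, columns are mass
# fractions of e.g. C16:0, C18:1, C18:2, C18:3 and C22:1 (erucic acid).
rng = np.random.default_rng(0)
mustard = rng.normal([0.03, 0.15, 0.15, 0.10, 0.45], 0.01, size=(20, 5))
soybean = rng.normal([0.11, 0.23, 0.53, 0.07, 0.00], 0.01, size=(20, 5))
X = np.vstack([mustard, soybean])

scores = PCA(n_components=2).fit_transform(X)    # principal component scores
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
print(labels)                                    # the two pure-oil groups separate cleanly
```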

  10. FORTRAN subroutine for computing the optimal estimate of f(x)

    International Nuclear Information System (INIS)

    Gaffney, P.W.

    1980-10-01

    A FORTRAN subroutine called RANGE is presented that is designed to compute the optimal estimate of a function f given values of the function at n distinct points x_1 < x_2 < ... < x_n and given a bound on one of the derivatives of f. We denote this estimate by Ω. It is optimal in the sense that the error |f - Ω| has the smallest possible error bound.

  11. An Architecture of IoT Service Delegation and Resource Allocation Based on Collaboration between Fog and Cloud Computing

    Directory of Open Access Journals (Sweden)

    Aymen Abdullah Alsaffar

    2016-01-01

    Full Text Available Despite the wide utilization of cloud computing (e.g., services, applications, and resources), some services, applications, and smart devices are not able to fully benefit from this attractive cloud computing paradigm due to the following issues: (1) smart devices might be lacking in capacity (e.g., processing, memory, storage, battery, and resource allocation), (2) they might be lacking in network resources, and (3) the high network latency to a centralized server in the cloud might not be efficient for delay-sensitive applications, services, and resource allocation requests. Fog computing is a promising paradigm that can extend cloud resources to the edge of the network, solving the abovementioned issues. As a result, in this work, we propose an architecture of IoT service delegation and resource allocation based on collaboration between fog and cloud computing. We provide a new algorithm consisting of the decision rules of a linearized decision tree based on three conditions (service size, completion time, and VM capacity) for managing and delegating user requests in order to balance the workload. Moreover, we propose an algorithm to allocate resources to meet service level agreement (SLA) and quality of service (QoS) requirements, as well as to optimize big data distribution in fog and cloud computing. Our simulation results show that the proposed approach can efficiently balance the workload, improve resource allocation, optimize big data distribution, and achieve better performance than other existing methods.
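
    The abstract does not give the exact rule structure, so the following is only an illustrative sketch of flattened (linearized) decision-tree rules over the three stated conditions; the thresholds, the ServiceRequest/Node fields and the rule ordering are all assumptions, not the authors' algorithm.

        from dataclasses import dataclass

        @dataclass
        class ServiceRequest:
            size_mb: float          # size of the requested service/data
            deadline_s: float       # required completion time

        @dataclass
        class Node:
            name: str
            free_vm_capacity: int   # available VM slots
            est_latency_s: float    # estimated completion latency on this node

        # Hypothetical thresholds; the paper's actual rule values are not given here.
        SMALL_SERVICE_MB = 50.0
        TIGHT_DEADLINE_S = 1.0

        def delegate(req: ServiceRequest, fog: Node, cloud: Node) -> Node:
            """Flattened decision rules for fog-vs-cloud delegation."""
            if fog.free_vm_capacity == 0:
                return cloud        # rule 1: no fog VM capacity -> cloud
            if req.deadline_s <= TIGHT_DEADLINE_S and fog.est_latency_s <= req.deadline_s:
                return fog          # rule 2: delay-sensitive request -> fog if it can meet the deadline
            if req.size_mb <= SMALL_SERVICE_MB:
                return fog          # rule 3: small services stay at the edge
            return cloud            # rule 4: large, delay-tolerant work goes to the cloud

        if __name__ == "__main__":
            fog = Node("fog-1", free_vm_capacity=4, est_latency_s=0.4)
            cloud = Node("cloud-1", free_vm_capacity=1000, est_latency_s=2.5)
            req = ServiceRequest(size_mb=20.0, deadline_s=0.8)
            print("delegate to:", delegate(req, fog, cloud).name)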

  12. Estimating the Impact of Drought on Groundwater Resources of the Marshall Islands

    Directory of Open Access Journals (Sweden)

    Brandon L. Barkey

    2017-01-01

    Full Text Available Groundwater resources of small coral islands are threatened due to short-term and long-term changes in climate. A significant short-term threat is El Niño events, which typically induce a severe months-long drought for many atoll nations in the western and central Pacific regions that exhausts rainwater supply and necessitates the use of groundwater. This study quantifies fresh groundwater resources under both average rainfall and drought conditions for the Republic of the Marshall Islands (RMI), a nation composed solely of atolls and one that is severely impacted by El Niño droughts. The atoll island algebraic model is used to estimate the thickness of the freshwater lens for 680 inhabited and uninhabited islands of the RMI, with a focus on the severe 1998 drought. The model accounts for precipitation, island width, hydraulic conductivity of the upper Holocene-age sand aquifer, the depth to the contact between the Holocene aquifer and the lower Pleistocene-age limestone aquifer, and the presence of a reef flat plate underlying the ocean side of the island. Model results are tested for islands that have fresh groundwater data. Results highlight the fragility of groundwater resources for the nation. Average lens thickness during typical seasonal rainfall is approximately 4 m, with only 30% of the islands maintaining a lens thicker than 4.5 m and 55% of the islands with a lens less than 2.5 m thick. Thicker lenses typically occur for larger islands, islands located on the leeward side of an atoll due to lower hydraulic conductivity, and islands located in the southern region of the RMI due to higher rainfall rates. During drought, groundwater on small islands (<300 m in width) is completely depleted. Over half (54%) of the islands are classified as “Highly Vulnerable” to drought. Results provide valuable information for RMI water resources planners, particularly during the current 2016 El Niño drought, and similar methods can be used to quantify

  13. Hemoglobin estimation by the HemoCue® portable hemoglobin photometer in a resource poor setting.

    Science.gov (United States)

    Nkrumah, Bernard; Nguah, Samuel Blay; Sarpong, Nimako; Dekker, Denise; Idriss, Ali; May, Juergen; Adu-Sarkodie, Yaw

    2011-04-21

    In resource poor settings where automated hematology analyzers are not available, the Cyanmethemoglobin method is often used. This method, though cheaper, takes more time. In blood donations, the semi-quantitative gravimetric copper sulfate method, which is very easy and inexpensive, may be used but does not provide an acceptable degree of accuracy. The HemoCue® hemoglobin photometer has been used for these purposes. This study was conducted to generate data to support or refute its use as a point-of-care device for hemoglobin estimation in mobile blood donations and critical care areas in health facilities. EDTA blood was collected from study participants drawn from five groups: pre-school children, school children, pregnant women, non-pregnant women and men. Blood collected was immediately processed to estimate the hemoglobin concentration using three different methods (HemoCue®, Sysmex KX21N and Cyanmethemoglobin). Agreement between the test methods was assessed by the method of Bland and Altman. The intraclass correlation coefficient (ICC) was used to determine the within-subject variability of measured hemoglobin. Of 398 subjects, 42% were males, with the overall mean age being 19.4 years. The overall mean hemoglobin as estimated by each method was 10.4 g/dl for HemoCue, 10.3 g/dl for Sysmex KX21N and 10.3 g/dl for Cyanmethemoglobin. Pairwise analysis revealed that the hemoglobin determined by the HemoCue method was higher than that measured by the KX21N and Cyanmethemoglobin. Comparing the hemoglobin determined by the HemoCue to Cyanmethemoglobin, the concordance correlation coefficient was 0.995 (95% CI: 0.994-0.996, p < 0.001). The Bland and Altman limits of agreement were -0.389 to 0.644 g/dl, with the mean difference being 0.127 (95% CI: 0.102-0.153) and a non-significant difference in variability between the two measurements (p = 0.843). After adjusting to assess the effect of other possible confounders such as sex, age and category of person, there was no
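
    For reference, a minimal sketch of the Bland-Altman agreement statistics used above (bias and 95% limits of agreement); the paired measurement arrays below are made-up numbers, not the study data.

        import numpy as np

        # Hypothetical paired hemoglobin measurements [g/dl]; not the study data.
        hemocue = np.array([10.5, 11.2, 9.8, 12.1, 10.9, 8.7])
        reference = np.array([10.3, 11.0, 9.9, 11.8, 10.7, 8.6])   # e.g. cyanmethemoglobin

        diff = hemocue - reference
        mean_diff = diff.mean()                     # bias
        sd_diff = diff.std(ddof=1)
        loa_low, loa_high = mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff

        print(f"bias = {mean_diff:.3f} g/dl")
        print(f"95% limits of agreement = ({loa_low:.3f}, {loa_high:.3f}) g/dl")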

  14. Resource Estimations in Contingency Planning for Foot-and-Mouth Disease

    Directory of Open Access Journals (Sweden)

    Anette Boklund

    2017-05-01

    Full Text Available Preparedness planning for a veterinary crisis is important for fast and effective eradication of disease. For countries with a large export of animals and animal products, each extra day of an epidemic will cost millions of Euros due to the closure of export markets. This is important for the Danish husbandry industry, especially the swine industry, which had an export of €4.4 billion in 2012. The purposes of this project were to (1) develop an iterative tool for estimating the resources needed during an outbreak of foot-and-mouth disease (FMD) in Denmark, and (2) identify areas that can delay the control of the disease. The tool developed should be easy to update when knowledge is gained from other veterinary crises or during an outbreak of FMD. The stochastic simulation model DTU-DADS was used to simulate the spread of FMD in Denmark. For each task occurring during an epidemic of FMD, the time and personnel needed per herd were estimated by a working group with expertise in contingency and crisis management. By combining this information, an iterative model was created to calculate the personnel needed on a daily basis during the epidemic. The personnel requirement was predicted to peak within the first week, with approximately 123 (65–175) veterinarians, 33 (23–64) technicians, and 36 (26–49) administrative staff needed on day 2, while the personnel needed in the Danish Emergency Management Agency (responsible for the hygiene barrier and initial cleaning and disinfection of the farm) was predicted to be 174 (58–464), mostly recruits. The time needed for surveillance visits was predicted to be the most influential factor in the calculations. Based on results from a stochastic simulation model, it was possible to create an iterative model to estimate the requirements for personnel during an FMD outbreak in Denmark. The model can easily be adjusted when new information on resources appears from the management of other crises or
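
    A toy sketch of the iterative resource calculation: simulated daily herd tasks are multiplied by per-herd time and staff-category estimates to give a daily personnel requirement. All task parameters and herd counts below are invented placeholders, not the working group's figures or the DTU-DADS output.

        # Hypothetical per-herd workload per task: (hours per herd, staff category).
        TASKS = {
            "culling":      (16.0, "veterinarian"),
            "surveillance": (3.0,  "veterinarian"),
            "sampling":     (2.0,  "technician"),
            "tracing":      (1.5,  "administrative"),
        }
        HOURS_PER_PERSON_DAY = 7.5

        def daily_staff_needed(herds_per_task: dict) -> dict:
            """Convert one simulated day's herd tasks into personnel requirements."""
            need = {}
            for task, n_herds in herds_per_task.items():
                hours, category = TASKS[task]
                need[category] = need.get(category, 0.0) + n_herds * hours / HOURS_PER_PERSON_DAY
            return {cat: round(v, 1) for cat, v in need.items()}

        # Example: output of one simulated epidemic day (invented numbers).
        print(daily_staff_needed({"culling": 3, "surveillance": 40, "sampling": 12, "tracing": 20}))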

  15. Hemoglobin estimation by the HemoCue® portable hemoglobin photometer in a resource poor setting

    Directory of Open Access Journals (Sweden)

    Idriss Ali

    2011-04-01

    Full Text Available Abstract Background In resource poor settings where automated hematology analyzers are not available, the Cyanmethemoglobin method is often used. This method, though cheaper, takes more time. In blood donations, the semi-quantitative gravimetric copper sulfate method, which is very easy and inexpensive, may be used but does not provide an acceptable degree of accuracy. The HemoCue® hemoglobin photometer has been used for these purposes. This study was conducted to generate data to support or refute its use as a point-of-care device for hemoglobin estimation in mobile blood donations and critical care areas in health facilities. Method EDTA blood was collected from study participants drawn from five groups: pre-school children, school children, pregnant women, non-pregnant women and men. Blood collected was immediately processed to estimate the hemoglobin concentration using three different methods (HemoCue®, Sysmex KX21N and Cyanmethemoglobin). Agreement between the test methods was assessed by the method of Bland and Altman. The intraclass correlation coefficient (ICC) was used to determine the within-subject variability of measured hemoglobin. Results Of 398 subjects, 42% were males, with the overall mean age being 19.4 years. The overall mean hemoglobin as estimated by each method was 10.4 g/dl for HemoCue, 10.3 g/dl for Sysmex KX21N and 10.3 g/dl for Cyanmethemoglobin. Pairwise analysis revealed that the hemoglobin determined by the HemoCue method was higher than that measured by the KX21N and Cyanmethemoglobin. Comparing the hemoglobin determined by the HemoCue to Cyanmethemoglobin, the concordance correlation coefficient was 0.995 (95% CI: 0.994-0.996, p < 0.001). Conclusion Hemoglobin determined by the HemoCue method is comparable to that determined by the other methods. The HemoCue photometer is therefore recommended for use as an on-the-spot device for determining hemoglobin in resource poor settings.

  16. Elastic Extension of a CMS Computing Centre Resources on External Clouds

    Science.gov (United States)

    Codispoti, G.; Di Maria, R.; Aiftimiei, C.; Bonacorsi, D.; Calligola, P.; Ciaschini, V.; Costantini, A.; Dal Pra, S.; DeGirolamo, D.; Grandi, C.; Michelotto, D.; Panella, M.; Peco, G.; Sapunenko, V.; Sgaravatto, M.; Taneja, S.; Zizzi, G.

    2016-10-01

    After the successful LHC data taking in Run-I and in view of the future runs, the LHC experiments are facing new challenges in the design and operation of the computing facilities. The computing infrastructure for Run-II is dimensioned to cope at most with the average amount of data recorded. The usage peaks, as already observed in Run-I, may however originate large backlogs, thus delaying the completion of the data reconstruction and ultimately the data availability for physics analysis. In order to cope with the production peaks, CMS - along the lines followed by other LHC experiments - is exploring the opportunity to access Cloud resources provided by external partners or commercial providers. Specific use cases have already been explored and successfully exploited during Long Shutdown 1 (LS1) and the first part of Run 2. In this work we present the proof of concept of the elastic extension of a CMS site, specifically the Bologna Tier-3, on an external OpenStack infrastructure. We focus on the “Cloud Bursting” of a CMS Grid site using a newly designed LSF configuration that allows the dynamic registration of new worker nodes to LSF. In this approach, the dynamically added worker nodes instantiated on the OpenStack infrastructure are transparently accessed by the LHC Grid tools and at the same time serve as an extension of the farm for local usage. The amount of allocated resources can thus be elastically adjusted to cope with the needs of the CMS experiment and local users. Moreover, a direct access/integration of OpenStack resources to the CMS workload management system is explored. In this paper we present this approach, report on the performance of the on-demand allocated resources, and discuss the lessons learned and the next steps.

  17. Solar resources estimation combining digital terrain models and satellite images techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bosch, J.L.; Batlles, F.J. [Universidad de Almeria, Departamento de Fisica Aplicada, Ctra. Sacramento s/n, 04120-Almeria (Spain); Zarzalejo, L.F. [CIEMAT, Departamento de Energia, Madrid (Spain); Lopez, G. [EPS-Universidad de Huelva, Departamento de Ingenieria Electrica y Termica, Huelva (Spain)

    2010-12-15

    One of the most important steps in making use of any renewable energy is to perform an accurate estimation of the resource that has to be exploited. In the design process of both active and passive solar energy systems, radiation data are required for the site, with proper spatial resolution. Generally, a network of radiometric stations is used in this evaluation, but when the stations are too dispersed or not available for the study area, satellite images can be utilized as indirect solar radiation measurements. Although satellite images cover wide areas with a good acquisition frequency, they usually have a poor spatial resolution limited by the size of the image pixel, and irradiation must be interpolated to evaluate solar irradiation at a sub-pixel scale. When pixels are located in flat and homogeneous areas, correlation of solar irradiation is relatively high, and classic interpolation can provide a good estimation. However, in complex topography zones, data interpolation is not adequate and the use of Digital Terrain Model (DTM) information can be helpful. In this work, daily solar irradiation is estimated for a wide mountainous area using a combination of Meteosat satellite images and a DTM, with the advantage of avoiding the need for ground measurements. This methodology utilizes a modified Heliosat-2 model and applies to all sky conditions; it also introduces a horizon calculation for the DTM points and accounts for the effect of snow cover. Model performance has been evaluated against data measured at 12 radiometric stations, with results in terms of the Root Mean Square Error (RMSE) of 10% and a Mean Bias Error (MBE) of +2%, both expressed as a percentage of the mean measured value. (author)
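
    A small sketch of the validation statistics quoted above (RMSE and MBE expressed as a percentage of the mean measured value); the daily irradiation arrays are placeholders, not the station data.

        import numpy as np

        # Hypothetical daily irradiation values [kWh/m^2]: measured vs. satellite+DTM estimate.
        measured = np.array([5.1, 4.3, 6.0, 2.9, 5.5])
        estimated = np.array([5.4, 4.1, 6.3, 3.2, 5.2])

        err = estimated - measured
        rmse_pct = 100.0 * np.sqrt(np.mean(err ** 2)) / measured.mean()
        mbe_pct = 100.0 * err.mean() / measured.mean()
        print(f"RMSE = {rmse_pct:.1f}% of mean, MBE = {mbe_pct:+.1f}% of mean")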

  18. DrugSig: A resource for computational drug repositioning utilizing gene expression signatures.

    Directory of Open Access Journals (Sweden)

    Hongyu Wu

    Full Text Available Computational drug repositioning has proven to be an effective approach for developing new drug uses. However, currently existing strategies rely strongly on drug response gene signatures that are scattered across separate, individual experimental datasets, resulting in low-efficiency outputs. A comprehensive database of drug response gene signatures would therefore be very helpful to these methods. We collected drug response microarray data and annotated the related drug and target information from public databases and the scientific literature. By selecting the top 500 up-regulated and down-regulated genes as drug signatures, we manually established the DrugSig database. Currently DrugSig contains more than 1300 drugs, 7000 microarrays and 800 targets. Moreover, we developed signature-based and target-based functions to aid drug repositioning. The constructed database can serve as a resource to accelerate computational drug repositioning. Database URL: http://biotechlab.fudan.edu.cn/database/drugsig/.

  19. Exploiting short-term memory in soft body dynamics as a computational resource.

    Science.gov (United States)

    Nakajima, K; Li, T; Hauser, H; Pfeifer, R

    2014-11-06

    Soft materials are not only highly deformable, but they also possess rich and diverse body dynamics. Soft body dynamics exhibit a variety of properties, including nonlinearity, elasticity and potentially infinitely many degrees of freedom. Here, we demonstrate that such soft body dynamics can be employed to conduct certain types of computation. Using body dynamics generated from a soft silicone arm, we show that they can be exploited to emulate functions that require memory and to embed robust closed-loop control into the arm. Our results suggest that soft body dynamics have a short-term memory and can serve as a computational resource. This finding paves the way towards exploiting passive body dynamics for control of a large class of underactuated systems. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  20. RSSI-Based Distance Estimation Framework Using a Kalman Filter for Sustainable Indoor Computing Environments

    Directory of Open Access Journals (Sweden)

    Yunsick Sung

    2016-11-01

    Full Text Available Given that location information is the key to providing a variety of services in sustainable indoor computing environments, it is necessary to obtain accurate locations. A location can be estimated from three distances to three fixed points. Therefore, if the distance between two points can be measured or estimated accurately, locations in indoor environments can be estimated. To increase the accuracy of the measured distance, noise filtering, signal revision, and distance estimation processes are generally performed. This paper proposes a novel framework for estimating the distance between a beacon and an access point (AP) in a sustainable indoor computing environment. Diverse types of received signal strength indications (RSSIs) are used for WiFi, Bluetooth, and radio signals, and the proposed distance estimation framework is unique in that it is independent of the specific wireless signal involved, being based on the Bluetooth signal of the beacon. Generally, RSSI measurement, noise filtering, and revision are required for distance estimation using RSSIs. The employed RSSIs are first measured from an AP, with multiple APs sometimes used to increase the accuracy of the distance estimation. Owing to the inevitable presence of noise in the measured RSSIs, the application of noise filtering is essential, and further revision is used to address the inaccuracy and instability that characterize RSSIs measured in an indoor environment. The revised RSSIs are then used to estimate the distance. The proposed distance estimation framework uses one AP to measure the RSSIs, a Kalman filter to eliminate noise, and a log-distance path loss model to revise the measured RSSIs. In the experimental implementation of the framework, both an RSSI filter and a Kalman filter were used for noise elimination to comparatively evaluate the performance of the latter for this specific application. The Kalman filter was found to reduce the accumulated errors by 8
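
    A compact sketch of the pipeline described above: a scalar Kalman filter smooths the RSSI stream, and a log-distance path-loss model converts the filtered RSSI into a distance. The noise variances, the RSSI at 1 m and the path-loss exponent are hypothetical values, not those calibrated in the paper.

        import math
        import random

        # Log-distance path-loss model: rssi(d) = rssi_1m - 10 * n * log10(d).
        RSSI_1M = -59.0       # hypothetical RSSI at 1 m [dBm]
        PATH_LOSS_EXP = 2.0   # hypothetical path-loss exponent

        def rssi_to_distance(rssi: float) -> float:
            return 10 ** ((RSSI_1M - rssi) / (10 * PATH_LOSS_EXP))

        class ScalarKalman:
            """One-dimensional Kalman filter for smoothing a noisy RSSI stream."""
            def __init__(self, process_var=0.05, meas_var=4.0):
                self.x, self.p = None, 1.0
                self.q, self.r = process_var, meas_var

            def update(self, z: float) -> float:
                if self.x is None:                # initialise on the first measurement
                    self.x = z
                    return self.x
                self.p += self.q                  # predict (static RSSI model)
                k = self.p / (self.p + self.r)    # Kalman gain
                self.x += k * (z - self.x)        # correct with the new measurement
                self.p *= (1.0 - k)
                return self.x

        random.seed(0)
        true_rssi = RSSI_1M - 10 * PATH_LOSS_EXP * math.log10(10.0)   # beacon placed 10 m away
        kf = ScalarKalman()
        for _ in range(50):
            filtered = kf.update(random.gauss(true_rssi, 2.0))        # noisy RSSI sample
        print(f"estimated distance ~ {rssi_to_distance(filtered):.1f} m")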

  1. The significance of regulation and land use patterns on natural gas resource estimates in the Marcellus shale

    International Nuclear Information System (INIS)

    Blohm, Andrew; Peichel, Jeremy; Smith, Caroline; Kougentakis, Alexandra

    2012-01-01

    Recent advancements in natural gas extraction (e.g. hydraulic fracturing) have significantly increased natural gas reserves in the United States. Estimates of the technically recoverable resource (TRR) of natural gas in the Marcellus range between 141 trillion cubic feet (TCF) and 489 TCF. However, TRR estimation does not incorporate existing policies, regulations, or land use. We find that approximately 48% of the Marcellus in New York and Pennsylvania is inaccessible given land use patterns and current policy. In New York, approximately 83% of the Marcellus is inaccessible, while in Pennsylvania about 32% of the Marcellus is off limits to drilling. The New York portion of the Marcellus is estimated to have a TRR of between 19.9 TCF and 68.9 TCF. We estimate that 79% of the resource is inaccessible, which results in an accessible resource estimate of between 4.2 TCF and 14.4 TCF. In Pennsylvania, the shale gas TRR is estimated at 86.6–300 TCF. However, we estimate that 31% of the resource is inaccessible, which results in an accessible resource estimate of between 60.0 TCF and 208 TCF. - Highlights: ► Existing natural gas reserve estimation techniques ignore land use, regulations, and policies. ► The impact of land use and regulation on the recoverable natural gas resource is significant. ► 48% of the Marcellus Shale in New York and Pennsylvania is inaccessible to drilling. ► In New York, approximately 83% of the Marcellus is inaccessible. ► In Pennsylvania about 32% of the Marcellus is off limits to drilling.
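
    The accessible-resource arithmetic in the abstract is simply the TRR range scaled by the accessible fraction; a one-line check of the New York figures, using only the numbers quoted above:

        # New York Marcellus: TRR range and inaccessible share quoted in the abstract.
        trr_low, trr_high = 19.9, 68.9      # TCF
        inaccessible = 0.79                 # 79% of the resource is inaccessible
        accessible = [(1 - inaccessible) * t for t in (trr_low, trr_high)]
        print([round(a, 1) for a in accessible])   # roughly [4.2, 14.5] TCF, matching the abstract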

  2. Estimation of intermediate-grade uranium resources II. Proposed method for estimating intermediate-grade uranium resources in roll-front deposits. Final report

    International Nuclear Information System (INIS)

    Lambie, F.W.; Yee, S.N.

    1981-09-01

    The purpose of this and a previous project was to examine the feasibility of estimating intermediate-grade uranium (0.01 to 0.05% U3O8) on the basis of existing, sparsely drilled holes. All data are from the Powder River Basin in Wyoming. DOE makes preliminary estimates of endowment by calculating an Average Area of Influence (AAI) based on densely drilled areas, multiplying that by the thickness of the mineralization and then dividing by a tonnage factor. The resulting tonnage of ore is then multiplied by the average grade of the interval to obtain the estimate of U3O8 tonnage. Total endowment is the sum of these values over all mineralized intervals in all wells in the area. In regions where wells are densely drilled and approximately regularly spaced, this technique approaches the classical polygonal estimation technique used to estimate ore reserves and should be fairly reliable. The method is conservative because: (1) in sparsely drilled regions a large fraction of the area is not considered to contribute to endowment; (2) there is a bias created by the different distributions of point grades and mining block grades. A conservative approach may be justified for purposes of ore reserve estimation, where large investments may hinge on local forecasts. But for estimates of endowment over areas as large as 1° by 2° quadrangles, or the nation as a whole, errors in local predictions are not critical as long as they tend to cancel, and a less conservative estimation approach may be justified. One candidate, developed for this study and described here, is called the contoured thickness technique. A comparison of estimates based on the contoured thickness approach with DOE calculations for five areas of Wyoming roll-fronts in the Powder River Basin is presented. The sensitivity of the technique to well density is examined and the question of predicting intermediate-grade endowment from data on higher grades is discussed.
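
    The DOE preliminary endowment calculation described above can be written as a short worked example; the interval values and the tonnage factor (cubic feet of ore per short ton) below are invented placeholders, not values from the report.

        # DOE-style preliminary endowment for one mineralized interval (invented numbers).
        aai_sqft = 40_000.0        # Average Area of Influence for the well [ft^2]
        thickness_ft = 8.0         # thickness of the mineralized interval [ft]
        tonnage_factor = 15.0      # ft^3 of ore per short ton (placeholder)
        avg_grade = 0.0003         # average grade of the interval (0.03% U3O8)

        ore_tons = aai_sqft * thickness_ft / tonnage_factor
        u3o8_tons = ore_tons * avg_grade
        print(f"ore: {ore_tons:,.0f} tons, U3O8 endowment: {u3o8_tons:.1f} tons")
        # Total endowment is the sum of such values over all intervals in all wells.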

  3. The Usage of Informal Computer Based Communication in the Context of Organization’s Technological Resources

    Directory of Open Access Journals (Sweden)

    Steponas Jonušauskas

    2011-12-01

    Full Text Available The purpose of the article is to theoretically and practically analyze the features of informal computer based communication in the context of an organization’s technological resources.Methodology—meta-analysis, survey and descriptive analysis.Findings. According to scientists, the functions of informal communication cover sharing of work-related information, coordination of team activities, spread of organizational culture and a feeling of interdependence and affinity. Also, informal communication widens individuals’ recognition of reality, creates a general context of environment between talkers, and strengthens interpersonal attraction. For these reasons, informal communication is desirable and even necessary in organizations because it helps to ensure efficient functioning of the enterprise. However, communicating through electronic channels suppresses informal connections or addresses them to the outside of the organization. So, electronic communication is not beneficial for developing ties in an informal organizational network. The empirical research showed that a significant part of the courts administration staff is prone to use the technological resources of their office for informal communication. Representatives of courts administration choose friends for computer based communication much more often than colleagues (72% and 63%, respectively). 93% of the research respondents use an additional e-mail box serviced by commercial providers for non-work communication. The high intensity of informal electronic communication with friends and acquaintances shows that workers of the courts administration are used to meeting their psycho-emotional needs outside the workplace. The survey confirmed the conclusion of the theoretical analysis: computer based communication is not beneficial for developing informal contacts between workers. In order for informal communication to carry out its functions and for the technological resources of the organization to be used effectively, staff

  4. The Usage of Informal Computer Based Communication in the Context of Organization’s Technological Resources

    Directory of Open Access Journals (Sweden)

    Agota Giedrė Raišienė

    2013-08-01

    Full Text Available The purpose of the article is to theoretically and practically analyze the features of informal computer based communication in the context of an organization’s technological resources. Methodology—meta-analysis, survey and descriptive analysis. Findings. According to scientists, the functions of informal communication cover sharing of work-related information, coordination of team activities, spread of organizational culture and a feeling of interdependence and affinity. Also, informal communication widens individuals’ recognition of reality, creates a general context of environment between talkers, and strengthens interpersonal attraction. For these reasons, informal communication is desirable and even necessary in organizations because it helps to ensure efficient functioning of the enterprise. However, communicating through electronic channels suppresses informal connections or addresses them to the outside of the organization. So, electronic communication is not beneficial for developing ties in an informal organizational network. The empirical research showed that a significant part of the courts administration staff is prone to use the technological resources of their office for informal communication. Representatives of courts administration choose friends for computer based communication much more often than colleagues (72% and 63%, respectively). 93% of the research respondents use an additional e-mail box serviced by commercial providers for non-work communication. The high intensity of informal electronic communication with friends and acquaintances shows that workers of the courts administration are used to meeting their psycho-emotional needs outside the workplace. The survey confirmed the conclusion of the theoretical analysis: computer based communication is not beneficial for developing informal contacts between workers. In order for informal communication to carry out its functions and for the technological resources of the organization to be used effectively, staff

  5. Estimating social carrying capacity through computer simulation modeling: an application to Arches National Park, Utah

    Science.gov (United States)

    Benjamin Wang; Robert E. Manning; Steven R. Lawson; William A. Valliere

    2001-01-01

    Recent research and management experience has led to several frameworks for defining and managing carrying capacity of national parks and related areas. These frameworks rely on monitoring indicator variables to ensure that standards of quality are maintained. The objective of this study was to develop a computer simulation model to estimate the relationships between...

  6. Epicardial adipose tissue volume estimation by postmortem computed tomography of eviscerated hearts

    DEFF Research Database (Denmark)

    Hindsø, Louise; Jakobsen, Lykke S; Jacobsen, Christina

    2017-01-01

    Epicardial adipose tissue (EAT) may play a role in the development of coronary artery disease. The purpose of this study was to evaluate a method based on postmortem computed tomography (PMCT) for the estimation of EAT volume. We PMCT-scanned the eviscerated hearts of 144 deceased individuals, wh...

  7. The Support Reduction Algorithm for Computing Non-Parametric Function Estimates in Mixture Models

    OpenAIRE

    GROENEBOOM, PIET; JONGBLOED, GEURT; WELLNER, JON A.

    2008-01-01

    In this paper, we study an algorithm (which we call the support reduction algorithm) that can be used to compute non-parametric M-estimators in mixture models. The algorithm is compared with natural competitors in the context of convex regression and the ‘Aspect problem’ in quantum physics.

  8. Direct estimation of human trabecular bone stiffness using cone beam computed tomography.

    Science.gov (United States)

    Klintström, Eva; Klintström, Benjamin; Pahr, Dieter; Brismar, Torkel B; Smedby, Örjan; Moreno, Rodrigo

    2018-04-10

    The aim of this study was to evaluate the possibility of estimating the biomechanical properties of trabecular bone through finite element simulations by using dental cone beam computed tomography data. Fourteen human radius specimens were scanned in 3 cone beam computed tomography devices: 3-D Accuitomo 80 (J. Morita MFG., Kyoto, Japan), NewTom 5 G (QR Verona, Verona, Italy), and Verity (Planmed, Helsinki, Finland). The imaging data were segmented by using 2 different methods. Stiffness (Young modulus), shear moduli, and the size and shape of the stiffness tensor were studied. Corresponding evaluations by using micro-CT were regarded as the reference standard. The 3-D Accuitomo 80 (J. Morita MFG., Kyoto, Japan) showed good performance in estimating stiffness and shear moduli but was sensitive to the choice of segmentation method. NewTom 5 G (QR Verona, Verona, Italy) and Verity (Planmed, Helsinki, Finland) yielded good correlations, but they were not as strong as Accuitomo 80 (J. Morita MFG., Kyoto, Japan). The cone beam computed tomography devices overestimated both stiffness and shear compared with the micro-CT estimations. Finite element-based calculations of biomechanics from cone beam computed tomography data are feasible, with strong correlations for the Accuitomo 80 scanner (J. Morita MFG., Kyoto, Japan) combined with an appropriate segmentation method. Such measurements might be useful for predicting implant survival by in vivo estimations of bone properties. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. Comparison of computer models for estimating hydrology and water quality in an agricultural watershed

    Science.gov (United States)

    Various computer models, ranging from simple to complex, have been developed to simulate hydrology and water quality from field to watershed scales. However, many users are uncertain about which model to choose when estimating water quantity and quality conditions in a watershed. This study compared...

  10. On the possibility of non-invasive multilayer temperature estimation using soft-computing methods.

    Science.gov (United States)

    Teixeira, C A; Pereira, W C A; Ruano, A E; Ruano, M Graça

    2010-01-01

    This work reports original results on the possibility of non-invasive temperature estimation (NITE) in a multilayered phantom by applying soft-computing methods. The existence of reliable non-invasive temperature estimator models would improve the safety and efficacy of thermal therapies. These points would lead to a broader acceptance of this kind of therapy. Several approaches based on medical imaging technologies have been proposed, with magnetic resonance imaging (MRI) identified as the only one achieving acceptable temperature resolution for hyperthermia purposes. However, the intrinsic characteristics of MRI (e.g., high instrumentation cost) led us to use backscattered ultrasound (BSU). Among the different BSU features, temporal echo-shifts have received major attention. These shifts are due to changes in the speed of sound and expansion of the medium. The originality of this work involves two aspects: the estimator model itself is original (based on soft-computing methods) and the application to temperature estimation in a three-layer phantom has also not been reported in the literature. In this work a three-layer (non-homogeneous) phantom was developed. The two external layers were composed of (in % of weight): 86.5% degassed water, 11% glycerin and 2.5% agar-agar. The intermediate layer was obtained by adding graphite powder in the amount of 2% of the water weight to the above composition. The phantom was developed to have attenuation and speed of sound similar to in vivo muscle, according to the literature. BSU signals were collected and cumulative temporal echo-shifts computed. These shifts and past temperature values were then considered as possible estimator inputs. A soft-computing methodology was applied to look for appropriate multilayered temperature estimators. The methodology involves radial-basis-function neural networks (RBFNN) with structure optimized by a multi-objective genetic algorithm (MOGA). In this work 40 operating conditions were

  11. Development of a computationally efficient algorithm for attitude estimation of a remote sensing satellite

    Science.gov (United States)

    Labibian, Amir; Bahrami, Amir Hossein; Haghshenas, Javad

    2017-09-01

    This paper presents a computationally efficient algorithm for attitude estimation of a remote sensing satellite. In this study, a gyro, a magnetometer, a sun sensor and a star tracker are used in an Extended Kalman Filter (EKF) structure for the purpose of Attitude Determination (AD). However, utilizing all of the measurement data simultaneously in the EKF structure increases the computational burden. Specifically, assuming n observation vectors, the inverse of a 3n×3n matrix is required for gain calculation. In order to solve this problem, an efficient version of the EKF, namely Murrell's version, is employed. This method utilizes the measurements separately at each sampling time for gain computation. Therefore, the inverse of a 3n×3n matrix is replaced by the inverse of a 3×3 matrix for each measurement vector. Moreover, gyro drift over time can reduce the pointing accuracy. Therefore, a calibration algorithm is utilized for estimation of the main gyro parameters.
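
    A sketch of the sequential (Murrell-style) measurement update that replaces one large gain computation with a 3×3 inverse per observation vector. The state dimension, covariances and measurement models below are generic placeholders for a linear update step, not the paper's attitude filter.

        import numpy as np

        def sequential_update(x, P, measurements):
            """Process each 3-D observation separately: only 3x3 inverses are needed,
            instead of a single (3n x 3n) inverse for n stacked observation vectors."""
            n = x.size
            for H, R, z in measurements:              # H: 3xn, R: 3x3, z: 3-vector
                S = H @ P @ H.T + R                   # 3x3 innovation covariance
                K = P @ H.T @ np.linalg.inv(S)        # n x 3 gain
                x = x + K @ (z - H @ x)
                P = (np.eye(n) - K @ H) @ P
            return x, P

        # Toy example with a 6-state filter and two 3-D sensors (placeholder models).
        x = np.zeros(6)
        P = np.eye(6)
        H1 = np.hstack([np.eye(3), np.zeros((3, 3))])
        H2 = np.hstack([np.zeros((3, 3)), np.eye(3)])
        R1 = R2 = 0.01 * np.eye(3)
        z1, z2 = np.array([0.1, -0.2, 0.05]), np.array([0.0, 0.3, -0.1])
        x, P = sequential_update(x, P, [(H1, R1, z1), (H2, R2, z2)])
        print(np.round(x, 3))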

  12. Integrated State Estimation and Contingency Analysis Software Implementation using High Performance Computing Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt R.; Rice, Mark J.; Huang, Zhenyu

    2015-12-31

    Power system simulation tools are traditionally developed in sequential mode and codes are optimized for single core computing only. However, the increasing complexity in the power grid models requires more intensive computation. The traditional simulation tools will soon not be able to meet the grid operation requirements. Therefore, power system simulation tools need to evolve accordingly to provide faster and better results for grid operations. This paper presents an integrated state estimation and contingency analysis software implementation using high performance computing techniques. The software is able to solve large size state estimation problems within one second and achieve a near-linear speedup of 9,800 with 10,000 cores for contingency analysis application. The performance evaluation is presented to show its effectiveness.

  13. Estimation of Distribution Algorithm for Resource Allocation in Green Cooperative Cognitive Radio Sensor Networks

    Directory of Open Access Journals (Sweden)

    Alagan Anpalagan

    2013-04-01

    Full Text Available Due to the rapid increase in the usage and demand of wireless sensor networks (WSNs), the limited frequency spectrum available for WSN applications will be extremely crowded in the near future. More sensor devices also mean more recharging/replacement of batteries, which will have a significant impact on the global carbon footprint. In this paper, we propose a relay-assisted cognitive radio sensor network (CRSN) that allocates communication resources in an environmentally friendly manner. We use shared-band amplify-and-forward relaying for cooperative communication in the proposed CRSN. We present a multi-objective optimization architecture for resource allocation in a green cooperative cognitive radio sensor network (GC-CRSN). The proposed multi-objective framework jointly performs relay assignment and power allocation in GC-CRSN, while optimizing two conflicting objectives. The first objective is to maximize the total throughput, and the second objective is to minimize the total transmission power of the CRSN. The proposed relay assignment and power allocation problem is a non-convex mixed-integer non-linear optimization problem (NC-MINLP), which is generally NP-hard. We introduce a hybrid heuristic algorithm for this problem. The hybrid heuristic includes an estimation-of-distribution algorithm (EDA) for performing power allocation and iterative greedy schemes for constraint satisfaction and relay assignment. We analyze the throughput and power consumption tradeoff in GC-CRSN. A detailed analysis of the performance of the proposed algorithm is presented with the simulation results.
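
    A minimal sketch of one estimation-of-distribution step for continuous power allocation (Gaussian model, elite selection, refit). The toy objective below only illustrates the throughput-versus-power trade-off and is not the paper's NC-MINLP formulation; channel gains and weights are invented.

        import numpy as np

        rng = np.random.default_rng(0)
        n_links, p_max = 4, 2.0
        gains = rng.uniform(0.5, 2.0, n_links)          # placeholder channel gains

        def objective(p):
            # Toy trade-off: reward throughput, penalise total transmit power.
            return np.sum(np.log2(1.0 + gains * p)) - 0.5 * np.sum(p)

        mu, sigma = np.full(n_links, p_max / 2), np.full(n_links, 0.5)
        for _ in range(30):                              # EDA main loop
            pop = np.clip(rng.normal(mu, sigma, size=(100, n_links)), 0.0, p_max)
            elite = pop[np.argsort([objective(p) for p in pop])[-20:]]   # keep the best 20%
            mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-3     # refit the Gaussian model
        print("allocated powers:", np.round(mu, 2))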

  14. Computationally Efficient 2D DOA Estimation with Uniform Rectangular Array in Low-Grazing Angle

    Directory of Open Access Journals (Sweden)

    Junpeng Shi

    2017-02-01

    Full Text Available In this paper, we propose a computationally efficient spatial differencing matrix set (SDMS) method for two-dimensional direction of arrival (2D DOA) estimation with uniform rectangular arrays (URAs) in a low-grazing angle (LGA) condition. By rearranging the auto-correlation and cross-correlation matrices in turn among different subarrays, the SDMS method can estimate the two parameters independently with one-dimensional (1D) subspace-based estimation techniques, where we only perform differencing on the auto-correlation matrices while the cross-correlation matrices are kept intact. Then, the pair-matching of the two parameters is achieved by extracting the diagonal elements of the URA. Thus, the proposed method can decrease the computational complexity, suppress the effect of additive noise and also incur little information loss. Simulation results show that, in LGA conditions, compared to other methods, the proposed method can achieve performance improvement in white or colored noise conditions.

  15. Estimating resource acquisition and at-sea body condition of a marine predator

    Science.gov (United States)

    Schick, Robert S; New, Leslie F; Thomas, Len; Costa, Daniel P; Hindell, Mark A; McMahon, Clive R; Robinson, Patrick W; Simmons, Samantha E; Thums, Michele; Harwood, John; Clark, James S

    2013-01-01

    Body condition plays a fundamental role in many ecological and evolutionary processes at a variety of scales and across a broad range of animal taxa. An understanding of how body condition changes at fine spatial and temporal scales as a result of interaction with the environment provides necessary information about how animals acquire resources. However, comparatively little is known about intra- and interindividual variation of condition in marine systems. Where condition has been studied, changes typically are recorded at relatively coarse time-scales. By quantifying how fine-scale interaction with the environment influences condition, we can broaden our understanding of how animals acquire resources and allocate them to body stores. Here we used a hierarchical Bayesian state-space model to estimate the body condition as measured by the size of an animal's lipid store in two closely related species of marine predator that occupy different hemispheres: northern elephant seals (Mirounga angustirostris) and southern elephant seals (Mirounga leonina). The observation model linked drift dives to lipid stores. The process model quantified daily changes in lipid stores as a function of the physiological condition of the seal (lipid:lean tissue ratio, departure lipid and departure mass), its foraging location, two measures of behaviour and environmental covariates. We found that physiological condition significantly impacted lipid gain at two time-scales – daily and at departure from the colony – that foraging location was significantly associated with lipid gain in both species of elephant seals and that long-term behavioural phase was associated with positive lipid gain in northern and southern elephant seals. In northern elephant seals, the occurrence of short-term behavioural states assumed to represent foraging were correlated with lipid gain. Lipid gain was a function of covariates in both species. Southern elephant seals performed fewer drift dives than

  16. Usefulness of an enhanced Kitaev phase-estimation algorithm in quantum metrology and computation

    Science.gov (United States)

    Kaftal, Tomasz; Demkowicz-Dobrzański, Rafał

    2014-12-01

    We analyze the performance of a generalized Kitaev's phase-estimation algorithm where N phase gates, acting on M qubits prepared in a product state, may be distributed in an arbitrary way. Unlike the standard algorithm, where the mean square error scales as 1/N, the optimal generalizations offer the Heisenberg 1/N² error scaling and we show that they are in fact very close to the fundamental Bayesian estimation bound. We also demonstrate that the optimality of the algorithm breaks down when losses are taken into account, in which case the performance is inferior to the optimal entanglement-based estimation strategies. Finally, we show that when an alternative resource quantification is adopted, which describes the phase estimation in Shor's algorithm more accurately, the standard Kitaev's procedure is indeed optimal and there is no need to consider its generalized version.

  17. A Resource Service Model in the Industrial IoT System Based on Transparent Computing.

    Science.gov (United States)

    Li, Weimin; Wang, Bin; Sheng, Jinfang; Dong, Ke; Li, Zitong; Hu, Yixiang

    2018-03-26

    The Internet of Things (IoT) has received a lot of attention, especially in industrial scenarios. One of the typical applications is the intelligent mine, which actually constructs the Six-Hedge underground systems with IoT platforms. Based on a case study of the Six Systems in the underground metal mine, this paper summarizes the main challenges of industrial IoT from the aspects of heterogeneity in devices and resources, security, reliability, deployment and maintenance costs. Then, a novel resource service model for the industrial IoT applications based on Transparent Computing (TC) is presented, which supports centralized management of all resources including operating system (OS), programs and data on the server-side for the IoT devices, thus offering an effective, reliable, secure and cross-OS IoT service and reducing the costs of IoT system deployment and maintenance. The model has five layers: sensing layer, aggregation layer, network layer, service and storage layer and interface and management layer. We also present a detailed analysis on the system architecture and key technologies of the model. Finally, the efficiency of the model is shown by an experiment prototype system.

  18. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods

    KAUST Repository

    Germain, Pierre-Luc

    2016-06-20

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, with current benchmarks suffering important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e. across-samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created a R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods.

  19. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods

    KAUST Repository

    Germain, Pierre-Luc; Vitriolo, Alessandro; Adamo, Antonio; Laise, Pasquale; Das, Vivek; Testa, Giuseppe

    2016-01-01

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, with current benchmarks suffering important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e. across-samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created a R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods.

  20. Monitoring of Computing Resource Use of Active Software Releases in ATLAS

    CERN Document Server

    Limosani, Antonio; The ATLAS collaboration

    2016-01-01

    The LHC is the world's most powerful particle accelerator, colliding protons at a centre-of-mass energy of 13 TeV. As the energy and frequency of collisions have grown in the search for new physics, so too has the demand for computing resources needed for event reconstruction. We will report on the evolution of resource usage in terms of CPU and RAM in key ATLAS offline reconstruction workflows at the Tier0 at CERN and on the WLCG. Monitoring of workflows is achieved using the ATLAS PerfMon package, which is the standard ATLAS performance monitoring system running inside Athena jobs. Systematic daily monitoring has recently been expanded to include all workflows beginning at Monte Carlo generation through to end user physics analysis, beyond that of event reconstruction. Moreover, the move to a multiprocessor mode in production jobs has facilitated the use of tools, such as "MemoryMonitor", to measure the memory shared across processors in jobs. Resource consumption is broken down into software domains and displayed...

  1. Monitoring of computing resource use of active software releases at ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00219183; The ATLAS collaboration

    2017-01-01

    The LHC is the world’s most powerful particle accelerator, colliding protons at a centre-of-mass energy of 13 TeV. As the energy and frequency of collisions have grown in the search for new physics, so too has the demand for computing resources needed for event reconstruction. We will report on the evolution of resource usage in terms of CPU and RAM in key ATLAS offline reconstruction workflows at the Tier0 at CERN and on the WLCG. Monitoring of workflows is achieved using the ATLAS PerfMon package, which is the standard ATLAS performance monitoring system running inside Athena jobs. Systematic daily monitoring has recently been expanded to include all workflows beginning at Monte Carlo generation through to end-user physics analysis, beyond that of event reconstruction. Moreover, the move to a multiprocessor mode in production jobs has facilitated the use of tools, such as “MemoryMonitor”, to measure the memory shared across processors in jobs. Resource consumption is broken down into software domains and dis...

  2. Research on uranium resource models. Part IV. Logic: a computer graphics program to construct integrated logic circuits for genetic-geologic models. Progress report

    International Nuclear Information System (INIS)

    Scott, W.A.; Turner, R.M.; McCammon, R.B.

    1981-01-01

    Integrated logic circuits were described as a means of formally representing genetic-geologic models for estimating undiscovered uranium resources. The logic circuits are logical combinations of selected geologic characteristics judged to be associated with particular types of uranium deposits. Each combination takes on a value that corresponds to the combined presence, absence, or 'don't know' states of the selected characteristics within a specified geographic cell. Within each cell, the output of the logic circuit is taken as a measure of the favorability of occurrence of an undiscovered deposit of the type being considered. In this way, geological, geochemical, and geophysical data are incorporated explicitly into potential uranium resource estimates. The present report describes how integrated logic circuits are constructed by use of a computer graphics program. A user's guide is also included.

  3. Breccia-pipe uranium mining in northern Arizona; estimate of resources and assessment of historical effects

    Science.gov (United States)

    Bills, Donald J.; Brown, Kristin M.; Alpine, Andrea E.; Otton, James K.; Van Gosen, Bradley S.; Hinck, Jo Ellen; Tillman, Fred D.

    2011-01-01

    About 1 million acres of Federal land in the Grand Canyon region of Arizona were temporarily withdrawn from new mining claims in July 2009 by the Secretary of the Interior because of concern that increased uranium mining could have negative impacts on the land, water, people, and wildlife. During a 2-year interval, a Federal team led by the Bureau of Land Management is evaluating the effects of withdrawing these lands for extended periods. As part of this team, the U.S. Geological Survey (USGS) conducted a series of short-term studies to examine the historical effects of breccia-pipe uranium mining in the region. The USGS studies provide estimates of uranium resources affected by the possible land withdrawal, examine the effects of previous breccia-pipe mining, summarize water-chemistry data for streams and springs, and investigate potential biological pathways of exposure to uranium and associated contaminants. This fact sheet summarizes results through December 2009 and outlines further research needs.

  4. Sensitivity of Calibrated Parameters and Water Resource Estimates on Different Objective Functions and Optimization Algorithms

    Directory of Open Access Journals (Sweden)

    Delaram Houshmand Kouchi

    2017-05-01

    Full Text Available The successful application of hydrological models relies on careful calibration and uncertainty analysis. However, there are many different calibration/uncertainty analysis algorithms, and each could be run with different objective functions. In this paper, we highlight the fact that each combination of optimization algorithm and objective function may lead to a different set of optimum parameters while having the same performance; this makes the interpretation of dominant hydrological processes in a watershed highly uncertain. We used three different optimization algorithms (SUFI-2, GLUE, and PSO) and eight different objective functions (R2, bR2, NSE, MNS, RSR, SSQR, KGE, and PBIAS) in a SWAT model to calibrate the monthly discharges in two watersheds in Iran. The results show that all three algorithms, using the same objective function, produced acceptable calibration results, however with significantly different parameter ranges. Similarly, an algorithm using different objective functions also produced acceptable calibration results, but with different parameter ranges. The different calibrated parameter ranges consequently resulted in significantly different water resource estimates. Hence, the parameters and the outputs that they produce in a calibrated model are “conditioned” on the choices of the optimization algorithm and objective function. This adds another level of non-negligible uncertainty to watershed models, calling for more attention and investigation in this area.
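
    Two of the objective functions listed above, NSE and KGE, are standard and can be written compactly; the discharge series below are placeholders, not the Iranian watershed data.

        import numpy as np

        def nse(obs, sim):
            """Nash-Sutcliffe efficiency."""
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def kge(obs, sim):
            """Kling-Gupta efficiency (2009 formulation)."""
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            r = np.corrcoef(obs, sim)[0, 1]
            alpha = sim.std() / obs.std()
            beta = sim.mean() / obs.mean()
            return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

        # Placeholder monthly discharges [m^3/s].
        obs = [12.0, 30.0, 55.0, 40.0, 22.0, 15.0]
        sim = [10.0, 33.0, 50.0, 44.0, 20.0, 14.0]
        print(f"NSE = {nse(obs, sim):.3f}, KGE = {kge(obs, sim):.3f}")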

  5. SPADER - Science Planning Analysis and Data Estimation Resource for the NASA Parker Solar Probe Mission

    Science.gov (United States)

    Rodgers, D. J.; Fox, N. J.; Kusterer, M. B.; Turner, F. S.; Woleslagle, A. B.

    2017-12-01

    Scheduled to launch in July 2018, the Parker Solar Probe (PSP) will orbit the Sun for seven years, making a total of twenty-four extended encounters inside a solar radial distance of 0.25 AU. During most orbits, there are extended periods of time where PSP-Sun-Earth geometry dramatically reduces PSP-Earth communications via the Deep Space Network (DSN); there is the possibility that multiple orbits will have little to no high-rate downlink available. Science and housekeeping data taken during an encounter may reside on the spacecraft solid state recorder (SSR) for multiple orbits, potentially running the risk of overflowing the SSR in the absence of mitigation. The Science Planning Analysis and Data Estimation Resource (SPADER) has been developed to provide the science and operations teams the ability to plan operations accounting for multiple orbits in order to mitigate the effects caused by the lack of high-rate downlink. Capabilities and visualizations of SPADER are presented; further complications associated with file downlink priority and high-speed data transfers between instrument SSRs and the spacecraft SSR are discussed, as well as the long-term consequences of variations in DSN downlink parameters on the science data downlink.

  6. Geological Carbon Sequestration Storage Resource Estimates for the Ordovician St. Peter Sandstone, Illinois and Michigan Basins, USA

    Energy Technology Data Exchange (ETDEWEB)

    Barnes, David; Ellett, Kevin; Leetaru, Hannes

    2014-09-30

    The Cambro-Ordovician strata of the Midwestern United States are a primary target for potential geological storage of CO2 in deep saline formations. The objective of this project is to develop a comprehensive evaluation of the Cambro-Ordovician strata in the Illinois and Michigan Basins above the basal Mount Simon Sandstone, since the Mount Simon is the subject of other investigations including a demonstration-scale injection at the Illinois Basin Decatur Project. The primary reservoir targets investigated in this study are the middle Ordovician St Peter Sandstone and the late Cambrian to early Ordovician Knox Group carbonates. The topic of this report is a regional-scale evaluation of the geologic storage resource potential of the St Peter Sandstone in both the Illinois and Michigan Basins. Multiple deterministic-based approaches were used in conjunction with the probabilistic-based storage efficiency factors published in the DOE methodology to estimate the carbon storage resource of the formation. Extensive data sets of core analyses and wireline logs were compiled to develop the necessary inputs for volumetric calculations. Results demonstrate how the range in uncertainty of storage resource estimates varies as a function of data availability and quality, and the underlying assumptions used in the different approaches. In the simplest approach, storage resource estimates were calculated by mapping the gross thickness of the formation and applying a single estimate of the effective mean porosity of the formation. Results from this approach led to storage resource estimates ranging from 3.3 to 35.1 Gt in the Michigan Basin, and 1.0 to 11.0 Gt in the Illinois Basin, at the P10 and P90 probability levels, respectively. The second approach involved consideration of the diagenetic history of the formation throughout the two basins and used depth-dependent functions of porosity to derive a more realistic spatially variable model of porosity rather than applying a
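
    The volumetric storage-resource calculation referenced above (the DOE methodology) has the general form mass = area × thickness × porosity × CO2 density × storage efficiency factor. The values below are illustrative placeholders only, not the St Peter Sandstone inputs.

        # DOE-style volumetric CO2 storage resource estimate (illustrative numbers only).
        area_m2 = 2.0e10          # net area of the formation [m^2]
        gross_thickness_m = 30.0  # gross formation thickness [m]
        porosity = 0.12           # effective mean porosity [-]
        rho_co2 = 700.0           # CO2 density at reservoir conditions [kg/m^3]
        efficiency = 0.02         # probabilistic storage efficiency factor (e.g. a P50 value)

        mass_kg = area_m2 * gross_thickness_m * porosity * rho_co2 * efficiency
        print(f"storage resource ~ {mass_kg / 1e12:.2f} Gt CO2")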

  7. A Bayesian approach for parameter estimation and prediction using a computationally intensive model

    International Nuclear Information System (INIS)

    Higdon, Dave; McDonnell, Jordan D; Schunck, Nicolas; Sarich, Jason; Wild, Stefan M

    2015-01-01

    Bayesian methods have been successful in quantifying uncertainty in physics-based problems in parameter estimation and prediction. In these cases, physical measurements y are modeled as the best fit of a physics-based model η(θ), where θ denotes the uncertain, best input setting. Hence the statistical model is of the form y=η(θ)+ϵ, where ϵ accounts for measurement, and possibly other, error sources. When nonlinearity is present in η(⋅), the resulting posterior distribution for the unknown parameters in the Bayesian formulation is typically complex and nonstandard, requiring computationally demanding approaches such as Markov chain Monte Carlo (MCMC) to produce multivariate draws from the posterior. Although generally applicable, MCMC requires thousands (or even millions) of evaluations of the physics model η(⋅). This requirement is problematic if the model takes hours or days to evaluate. To overcome this computational bottleneck, we present an approach adapted from Bayesian model calibration. This approach combines output from an ensemble of computational model runs with physical measurements, within a statistical formulation, to carry out inference. A key component of this approach is a statistical response surface, or emulator, estimated from the ensemble of model runs. We demonstrate this approach with a case study in estimating parameters for a density functional theory model, using experimental mass/binding energy measurements from a collection of atomic nuclei. We also demonstrate how this approach produces uncertainties in predictions for recent mass measurements obtained at Argonne National Laboratory. (paper)
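
    To make the emulator-plus-MCMC workflow concrete, the following is a minimal sketch under toy assumptions: a cheap one-parameter stand-in plays the role of the expensive model η(θ), a scikit-learn Gaussian process serves as the emulator, and a random-walk Metropolis sampler draws from the resulting posterior. None of this reproduces the paper's density functional theory application.

        # Minimal sketch of emulator-based Bayesian calibration (hypothetical toy model,
        # not the paper's DFT application). An ensemble of "expensive" model runs trains
        # a Gaussian-process emulator, which then replaces eta(theta) inside a
        # Metropolis sampler for the posterior of theta.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        rng = np.random.default_rng(0)

        def expensive_model(theta):
            # Stand-in for eta(theta); imagine each call takes hours.
            return np.sin(3.0 * theta) + 0.5 * theta

        # 1) Ensemble of model runs at design points.
        design = np.linspace(-2.0, 2.0, 15).reshape(-1, 1)
        runs = expensive_model(design).ravel()
        emulator = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6)
        emulator.fit(design, runs)

        # 2) Synthetic measurement y = eta(theta_true) + noise.
        theta_true, sigma = 0.7, 0.05
        y = expensive_model(np.array([[theta_true]])).item() + rng.normal(0.0, sigma)

        def log_post(theta):
            if not -2.0 <= theta <= 2.0:          # uniform prior on [-2, 2]
                return -np.inf
            eta = emulator.predict(np.array([[theta]])).item()
            return -0.5 * ((y - eta) / sigma) ** 2

        # 3) Random-walk Metropolis using only emulator evaluations.
        samples, theta = [], 0.0
        lp = log_post(theta)
        for _ in range(5000):
            prop = theta + rng.normal(0.0, 0.2)
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:
                theta, lp = prop, lp_prop
            samples.append(theta)

        print("posterior mean:", np.mean(samples[1000:]))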

  8. Computer-modeling codes to improve exploration nuclear-logging methods. National Uranium Resource Evaluation

    International Nuclear Information System (INIS)

    Wilson, R.D.; Price, R.K.; Kosanke, K.L.

    1983-03-01

    As part of the Department of Energy's National Uranium Resource Evaluation (NURE) project's Technology Development effort, a number of computer codes and accompanying data bases were assembled for use in modeling responses of nuclear borehole logging sondes. The logging methods include fission neutron, active and passive gamma-ray, and gamma-gamma. These CDC-compatible computer codes and data bases are available on magnetic tape from the DOE Technical Library at its Grand Junction Area Office. Some of the computer codes are standard radiation-transport programs that have been available to the radiation shielding community for several years. Other codes were specifically written to model the response of borehole radiation detectors or are specialized borehole modeling versions of existing Monte Carlo transport programs. Results from several radiation modeling studies are available as two large data bases (neutron and gamma-ray). These data bases are accompanied by appropriate processing programs that permit the user to model a wide range of borehole and formation-parameter combinations for fission-neutron, neutron-activation, and gamma-gamma logs. The first part of this report consists of a brief abstract for each code or data base. The abstract gives the code name and title, short description, auxiliary requirements, typical running time (CDC 6600), and a list of references. The next section gives format specifications and/or directory for the tapes. The final section of the report presents listings for programs used to convert data bases between machine floating-point and EBCDIC

  9. Integral Images: Efficient Algorithms for Their Computation and Storage in Resource-Constrained Embedded Vision Systems.

    Science.gov (United States)

    Ehsan, Shoaib; Clark, Adrian F; Naveed ur Rehman; McDonald-Maier, Klaus D

    2015-07-10

    The integral image, an intermediate image representation, has found extensive use in multi-scale local feature detection algorithms, such as Speeded-Up Robust Features (SURF), allowing fast computation of rectangular features at constant speed, independent of filter size. For resource-constrained real-time embedded vision systems, computation and storage of integral image presents several design challenges due to strict timing and hardware limitations. Although calculation of the integral image only consists of simple addition operations, the total number of operations is large owing to the generally large size of image data. Recursive equations allow substantial decrease in the number of operations but require calculation in a serial fashion. This paper presents two new hardware algorithms that are based on the decomposition of these recursive equations, allowing calculation of up to four integral image values in a row-parallel way without significantly increasing the number of operations. An efficient design strategy is also proposed for a parallel integral image computation unit to reduce the size of the required internal memory (nearly 35% for common HD video). Addressing the storage problem of integral image in embedded vision systems, the paper presents two algorithms which allow substantial decrease (at least 44.44%) in the memory requirements. Finally, the paper provides a case study that highlights the utility of the proposed architectures in embedded vision systems.
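
    For readers unfamiliar with the recursion behind integral images, the sketch below shows the plain serial form, ii(x, y) = i(x, y) + ii(x-1, y) + ii(x, y-1) - ii(x-1, y-1), and the four-lookup box sum it enables; the paper's row-parallel hardware decomposition is not reproduced here.

        # Sketch of the integral image recursion used by SURF-style detectors (plain
        # serial form, not the paper's row-parallel hardware decomposition).
        import numpy as np

        def integral_image(img):
            h, w = img.shape
            ii = np.zeros((h, w), dtype=np.int64)
            for y in range(h):
                for x in range(w):
                    ii[y, x] = (int(img[y, x])
                                + (ii[y - 1, x] if y > 0 else 0)
                                + (ii[y, x - 1] if x > 0 else 0)
                                - (ii[y - 1, x - 1] if (y > 0 and x > 0) else 0))
            return ii

        def box_sum(ii, top, left, bottom, right):
            # Sum over img[top:bottom+1, left:right+1] with four lookups.
            total = ii[bottom, right]
            if top > 0:
                total -= ii[top - 1, right]
            if left > 0:
                total -= ii[bottom, left - 1]
            if top > 0 and left > 0:
                total += ii[top - 1, left - 1]
            return total

        img = np.arange(16, dtype=np.uint8).reshape(4, 4)
        ii = integral_image(img)
        assert box_sum(ii, 1, 1, 2, 2) == img[1:3, 1:3].sum()

    In vectorised NumPy the same table is simply np.cumsum(np.cumsum(img, axis=0), axis=1); the explicit loop above is kept only to make the recursion visible.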

  10. Integral Images: Efficient Algorithms for Their Computation and Storage in Resource-Constrained Embedded Vision Systems

    Directory of Open Access Journals (Sweden)

    Shoaib Ehsan

    2015-07-01

    The integral image, an intermediate image representation, has found extensive use in multi-scale local feature detection algorithms, such as Speeded-Up Robust Features (SURF), allowing fast computation of rectangular features at constant speed, independent of filter size. For resource-constrained real-time embedded vision systems, computation and storage of integral image presents several design challenges due to strict timing and hardware limitations. Although calculation of the integral image only consists of simple addition operations, the total number of operations is large owing to the generally large size of image data. Recursive equations allow substantial decrease in the number of operations but require calculation in a serial fashion. This paper presents two new hardware algorithms that are based on the decomposition of these recursive equations, allowing calculation of up to four integral image values in a row-parallel way without significantly increasing the number of operations. An efficient design strategy is also proposed for a parallel integral image computation unit to reduce the size of the required internal memory (nearly 35% for common HD video). Addressing the storage problem of integral image in embedded vision systems, the paper presents two algorithms which allow substantial decrease (at least 44.44%) in the memory requirements. Finally, the paper provides a case study that highlights the utility of the proposed architectures in embedded vision systems.

  11. Computing ordinary least-squares parameter estimates for the National Descriptive Model of Mercury in Fish

    Science.gov (United States)

    Donato, David I.

    2013-01-01

    A specialized technique is used to compute weighted ordinary least-squares (OLS) estimates of the parameters of the National Descriptive Model of Mercury in Fish (NDMMF) in less time using less computer memory than general methods. The characteristics of the NDMMF allow the two products X'X and X'y in the normal equations to be filled out in a second or two of computer time during a single pass through the N data observations. As a result, the matrix X does not have to be stored in computer memory and the computationally expensive matrix multiplications generally required to produce X'X and X'y do not have to be carried out. The normal equations may then be solved to determine the best-fit parameters in the OLS sense. The computational solution based on this specialized technique requires O(8p^2 + 16p) bytes of computer memory for p parameters on a machine with 8-byte double-precision numbers. This publication includes a reference implementation of this technique and a Gaussian-elimination solver in preliminary custom software.
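
    A minimal sketch of the single-pass normal-equations idea follows, using unweighted toy data rather than the NDMMF itself: X'X and X'y are accumulated observation by observation, so the design matrix X is never stored and only about p^2 + p double-precision values are kept in memory.

        # Minimal sketch of single-pass accumulation of the normal equations; the
        # model, weighting and data are toy stand-ins, not the NDMMF.
        import numpy as np

        def ols_single_pass(rows, p):
            xtx = np.zeros((p, p))
            xty = np.zeros(p)
            for x_row, y in rows:              # one pass over the N observations
                x_row = np.asarray(x_row, dtype=float)
                xtx += np.outer(x_row, x_row)
                xty += x_row * y
            return np.linalg.solve(xtx, xty)   # Gaussian-elimination step

        rng = np.random.default_rng(1)
        true_beta = np.array([2.0, -1.0, 0.5])
        data = []
        for _ in range(1000):
            x = rng.normal(size=3)
            data.append((x, x @ true_beta + rng.normal(scale=0.1)))
        print(ols_single_pass(data, p=3))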

  12. Unconventional energy resources in a crowded subsurface: Reducing uncertainty and developing a separation zone concept for resource estimation and deep 3D subsurface planning using legacy mining data.

    Science.gov (United States)

    Monaghan, Alison A

    2017-12-01

    Over significant areas of the UK and western Europe, anthropogenic alteration of the subsurface by mining of coal has occurred beneath highly populated areas which are now considering a multiplicity of 'low carbon' unconventional energy resources including shale gas and oil, coal bed methane, geothermal energy and energy storage. To enable decision making on the 3D planning, licensing and extraction of these resources requires reduced uncertainty around complex geology and hydrogeological and geomechanical processes. An exemplar from the Carboniferous of central Scotland, UK, illustrates how, in areas lacking hydrocarbon well production data and 3D seismic surveys, legacy coal mine plans and associated boreholes provide valuable data that can be used to reduce the uncertainty around geometry and faulting of subsurface energy resources. However, legacy coal mines also limit unconventional resource volumes since mines and associated shafts alter the stress and hydrogeochemical state of the subsurface, commonly forming pathways to the surface. To reduce the risk of subsurface connections between energy resources, an example of an adapted methodology is described for shale gas/oil resource estimation to include a vertical separation or 'stand-off' zone between the deepest mine workings, to ensure the hydraulic fracturing required for shale resource production would not intersect legacy coal mines. Whilst the size of such separation zones requires further work, developing the concept of 3D spatial separation and planning is key to utilising the crowded subsurface energy system, whilst mitigating against resource sterilisation and environmental impacts, and could play a role in positively informing public and policy debate. Copyright © 2017 British Geological Survey, a component institute of NERC. Published by Elsevier B.V. All rights reserved.

  13. Deterministic absorbed dose estimation in computed tomography using a discrete ordinates method

    International Nuclear Information System (INIS)

    Norris, Edward T.; Liu, Xin; Hsieh, Jiang

    2015-01-01

    Purpose: Organ dose estimation for a patient undergoing computed tomography (CT) scanning is very important. Although Monte Carlo methods are considered gold-standard in patient dose estimation, the computation time required is formidable for routine clinical calculations. Here, the authors investigate a deterministic method for estimating an absorbed dose more efficiently. Methods: Compared with current Monte Carlo methods, a more efficient approach to estimating the absorbed dose is to solve the linear Boltzmann equation numerically. In this study, an axial CT scan was modeled with a software package, Denovo, which solved the linear Boltzmann equation using the discrete ordinates method. The CT scanning configuration included 16 x-ray source positions, beam collimators, flat filters, and bowtie filters. The phantom was the standard 32 cm CT dose index (CTDI) phantom. Four different Denovo simulations were performed with different simulation parameters, including the number of quadrature sets and the order of Legendre polynomial expansions. A Monte Carlo simulation was also performed for benchmarking the Denovo simulations. A quantitative comparison was made of the simulation results obtained by the Denovo and the Monte Carlo methods. Results: The difference in the simulation results of the discrete ordinates method and those of the Monte Carlo methods was found to be small, with a root-mean-square difference of around 2.4%. It was found that the discrete ordinates method, with a higher order of Legendre polynomial expansions, underestimated the absorbed dose near the center of the phantom (i.e., low dose region). Simulations of the quadrature set 8 and the first order of the Legendre polynomial expansions proved to be the most efficient computation method in the authors’ study. The single-thread computation time of the deterministic simulation of the quadrature set 8 and the first order of the Legendre polynomial expansions was 21 min on a personal computer

  14. How accurate are adolescents in portion-size estimation using the computer tool young adolescents' nutrition assessment on computer (YANA-C)?

    OpenAIRE

    Vereecken, Carine; Dohogne, Sophie; Covents, Marc; Maes, Lea

    2010-01-01

    Computer-administered questionnaires have received increased attention for large-scale population research on nutrition. In Belgium-Flanders, Young Adolescents' Nutrition Assessment on Computer (YANA-C) has been developed. In this tool, standardised photographs are available to assist in portion-size estimation. The purpose of the present study is to assess how accurate adolescents are in estimating portion sizes of food using YANA-C. A convenience sample, aged 11-17 years, estimated the amou...

  15. Automation of reliability evaluation procedures through CARE - The computer-aided reliability estimation program.

    Science.gov (United States)

    Mathur, F. P.

    1972-01-01

    Description of an on-line interactive computer program called CARE (Computer-Aided Reliability Estimation) which can model self-repair and fault-tolerant organizations and perform certain other functions. Essentially CARE consists of a repository of mathematical equations defining the various basic redundancy schemes. These equations, under program control, are then interrelated to generate the desired mathematical model to fit the architecture of the system under evaluation. The mathematical model is then supplied with ground instances of its variables and is then evaluated to generate values for the reliability-theoretic functions applied to the model.
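
    As a hedged illustration of the kind of closed-form redundancy reliability expressions such a tool composes, the sketch below evaluates standard textbook forms for series, parallel and triple-modular-redundancy (TMR) configurations; it is not CARE's actual equation repository.

        # Standard textbook reliability forms for simple redundancy schemes, shown
        # only to illustrate the kind of equations a CARE-like tool interrelates.
        import math

        def series(*unit_reliabilities):
            # All units must work.
            return math.prod(unit_reliabilities)

        def parallel(*unit_reliabilities):
            # At least one unit must work.
            return 1.0 - math.prod(1.0 - r for r in unit_reliabilities)

        def tmr(r_module, r_voter=1.0):
            # Triple modular redundancy: any 2 of 3 modules working, times the voter.
            return r_voter * (3 * r_module**2 - 2 * r_module**3)

        r = 0.95
        print("simplex :", r)
        print("dual    :", parallel(r, r))
        print("TMR     :", tmr(r))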

  16. Computational error estimates for Monte Carlo finite element approximation with log normal diffusion coefficients

    KAUST Repository

    Sandberg, Mattias

    2015-01-07

    The Monte Carlo (and Multi-level Monte Carlo) finite element method can be used to approximate observables of solutions to diffusion equations with log normal distributed diffusion coefficients, e.g. modelling ground water flow. Typical models use log normal diffusion coefficients with Hölder regularity of order up to 1/2 a.s. This low regularity implies that the high frequency finite element approximation error (i.e. the error from frequencies larger than the mesh frequency) is not negligible and can be larger than the computable low frequency error. This talk will address how the total error can be estimated by the computable error.

  17. Computable error estimates for Monte Carlo finite element approximation of elliptic PDE with lognormal diffusion coefficients

    KAUST Repository

    Hall, Eric

    2016-01-09

    The Monte Carlo (and Multi-level Monte Carlo) finite element method can be used to approximate observables of solutions to diffusion equations with lognormal distributed diffusion coefficients, e.g. modeling ground water flow. Typical models use lognormal diffusion coefficients with Hölder regularity of order up to 1/2 a.s. This low regularity implies that the high frequency finite element approximation error (i.e. the error from frequencies larger than the mesh frequency) is not negligible and can be larger than the computable low frequency error. We address how the total error can be estimated by the computable error.
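
    The computable statistical part of such an error estimate can be illustrated with a toy observable in place of the finite element solution functional; the sketch below reports a sample mean with a CLT-based confidence half-width and ignores the discretisation error discussed above.

        # Sketch of the computable statistical error in a plain Monte Carlo estimate:
        # a scalar toy observable of a lognormal-like coefficient stands in for the
        # finite element solution functional (FE discretisation error not shown).
        import numpy as np

        rng = np.random.default_rng(0)

        def observable(sample):
            a = np.exp(sample)            # lognormal diffusion-like coefficient
            return 1.0 / (1.0 + a)        # toy quantity of interest

        M = 10_000
        values = observable(rng.normal(size=M))
        mean = values.mean()
        half_width = 1.96 * values.std(ddof=1) / np.sqrt(M)   # 95% statistical error
        print(f"E[Q] approx {mean:.4f} +/- {half_width:.4f}")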

  18. Interactive Whiteboards and Computer Games at Highschool Level: Digital Resources for Enhancing Reflection in Teaching and Learning

    DEFF Research Database (Denmark)

    Sorensen, Elsebeth Korsgaard; Poulsen, Mathias; Houmann, Rita

    The general potential of computer games for teaching and learning is becoming widely recognized. In particular, within the application contexts of primary and lower secondary education, the relevance and value of computer games seem more accepted, and the possibility and willingness to incorporate computer games as a possible resource at the level of other educational resources seem more frequent. For some reason, however, to apply computer games in processes of teaching and learning at the high school level seems an almost non-existent event. This paper reports on a study of incorporating the learning game “Global Conflicts: Latin America” as a resource into the teaching and learning of a course involving the two subjects “English language learning” and “Social studies” at the final year in a Danish high school. The study adapts an explorative research design approach and investigates...

  19. A computationally inexpensive model for estimating dimensional measurement uncertainty due to x-ray computed tomography instrument misalignments

    Science.gov (United States)

    Ametova, Evelina; Ferrucci, Massimiliano; Chilingaryan, Suren; Dewulf, Wim

    2018-06-01

    The recent emergence of advanced manufacturing techniques such as additive manufacturing and an increased demand on the integrity of components have motivated research on the application of x-ray computed tomography (CT) for dimensional quality control. While CT has shown significant empirical potential for this purpose, there is a need for metrological research to accelerate the acceptance of CT as a measuring instrument. The accuracy in CT-based measurements is vulnerable to the instrument geometrical configuration during data acquisition, namely the relative position and orientation of x-ray source, rotation stage, and detector. Consistency between the actual instrument geometry and the corresponding parameters used in the reconstruction algorithm is critical. Currently available procedures provide users with only estimates of geometrical parameters. Quantification and propagation of uncertainty in the measured geometrical parameters must be considered to provide a complete uncertainty analysis and to establish confidence intervals for CT dimensional measurements. In this paper, we propose a computationally inexpensive model to approximate the influence of errors in CT geometrical parameters on dimensional measurement results. We use surface points extracted from a computer-aided design (CAD) model to model discrepancies in the radiographic image coordinates assigned to the projected edges between an aligned system and a system with misalignments. The efficacy of the proposed method was confirmed on simulated and experimental data in the presence of various geometrical uncertainty contributors.
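
    A much-simplified sketch of the underlying idea is given below: surface points are projected through a nominal cone-beam geometry and through a slightly perturbed one, and the shift of the projected detector coordinates is examined. The geometry, parameter names and magnitudes are illustrative assumptions, not the instrument model of the paper.

        # Illustrative projection of surface points under nominal vs. perturbed
        # cone-beam geometry; all parameters and the tilt handling are toy choices.
        import numpy as np

        def project(points, src_to_obj, src_to_det, det_tilt_rad=0.0):
            # Source at the origin, rays along +z; detector plane at z = src_to_det,
            # optionally tilted about the x-axis (crudely modelled).
            pts = points + np.array([0.0, 0.0, src_to_obj])     # place the object
            u = src_to_det * pts[:, 0] / pts[:, 2]
            v = src_to_det * pts[:, 1] / pts[:, 2]
            if det_tilt_rad:
                v = v * np.cos(det_tilt_rad)
            return np.column_stack([u, v])

        rng = np.random.default_rng(2)
        surface_pts = rng.uniform(-5.0, 5.0, size=(500, 3))     # stand-in for CAD points (mm)

        nominal = project(surface_pts, src_to_obj=200.0, src_to_det=800.0)
        perturbed = project(surface_pts, src_to_obj=200.5, src_to_det=800.0,
                            det_tilt_rad=np.deg2rad(0.1))
        shift = np.linalg.norm(perturbed - nominal, axis=1)
        print(f"mean projected shift: {shift.mean():.3f} mm, max: {shift.max():.3f} mm")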

  20. Development of a Computer Code for the Estimation of Fuel Rod Failure

    Energy Technology Data Exchange (ETDEWEB)

    Rhee, I.H.; Ahn, H.J. [Korea Electric Power Research Institute, Daejeon (Korea, Republic of)

    1997-12-31

    Much research has already been performed to obtain information on the degree of failed fuel rods from the primary coolant activities of operating PWRs in the last few decades. The computer codes that are currently in use for domestic nuclear power plants, such as the CADE code and ABB-CE codes developed by Westinghouse and ABB-CE, respectively, still give significant overall errors in estimating the failed fuel rods. In addition, with the CADE code, it is difficult to predict the degree of fuel rod failures during the transient period of nuclear reactor operation, whereas the ABB-CE codes are relatively more difficult to use for end-users. In particular, the rapid progress made recently in computer hardware and software systems demands that such computer programs be more versatile and user-friendly. While the MS Windows system, which is centered on the graphical user interface and multitasking, is now in widespread use, the computer codes currently employed at the nuclear power plants, such as the CADE and ABB-CE codes, can only be run on the DOS system. Moreover, it is desirable to have a computer code for fuel rod failure estimation that can directly use the radioactivity data obtained from the on-line monitoring system of the primary coolant activity. The main purpose of this study is, therefore, to develop a Windows computer code that can predict the location, the number of failed fuel rods, and the degree of failures using the radioactivity data obtained from the primary coolant activity for PWRs. Another objective is to combine this computer code with the on-line monitoring system of the primary coolant radioactivity at the Kori 3 and 4 operating nuclear power plants and enable their combined use for on-line evaluation of the number and degree of fuel rod failures. (author). 49 refs., 85 figs., 30 tabs.

  1. Computer simulation comparison of tripolar, bipolar, and spline Laplacian electrocardiogram estimators.

    Science.gov (United States)

    Chen, T; Besio, W; Dai, W

    2009-01-01

    A comparison of the performance of the tripolar and bipolar concentric as well as spline Laplacian electrocardiograms (LECGs) and body surface Laplacian mappings (BSLMs) for localizing and imaging the cardiac electrical activation has been investigated based on computer simulation. In the simulation, a simplified eccentric heart-torso sphere-cylinder homogeneous volume conductor model was developed. Multiple dipoles with different orientations were used to simulate the underlying cardiac electrical activities. Results show that the tripolar concentric ring electrodes produce the most accurate LECG and BSLM estimation among the three estimators, with the best performance in spatial resolution.

  2. Microdiamond grade as a regionalised variable - some basic requirements for successful local microdiamond resource estimation of kimberlites

    Science.gov (United States)

    Stiefenhofer, Johann; Thurston, Malcolm L.; Bush, David E.

    2018-04-01

    Microdiamonds offer several advantages as a resource estimation tool, such as access to deeper parts of a deposit which may be beyond the reach of large diameter drilling (LDD) techniques, the recovery of the total diamond content in the kimberlite, and a cost benefit due to the cheaper treatment cost compared to large diameter samples. In this paper we take the first step towards local estimation by showing that microdiamond samples can be treated as a regionalised variable suitable for use in geostatistical applications and we show examples of such output. Examples of microdiamond variograms are presented, the variance-support relationship for microdiamonds is demonstrated and consistency of the diamond size frequency distribution (SFD) is shown with the aid of real datasets. The focus therefore is on why local microdiamond estimation should be possible, not how to generate such estimates. Data from our case studies and examples demonstrate a positive correlation between micro- and macrodiamond sample grades as well as block estimates. This relationship can be demonstrated repeatedly across multiple mining operations. The smaller sample support size for microdiamond samples is a key difference between micro- and macrodiamond estimates and this aspect must be taken into account during the estimation process. We discuss three methods which can be used to validate or reconcile the estimates against macrodiamond data, either as estimates or in the form of production grades: (i) reconciliation using production data, (ii) comparison of LDD-based grade estimates against microdiamond-based estimates, and (iii) simulation techniques.
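
    As a hedged illustration of treating grade as a regionalised variable, the sketch below computes an experimental semivariogram for synthetic one-dimensional sample data; real microdiamond workflows are three-dimensional and involve compositing and declustering not shown here.

        # Experimental semivariogram for a synthetic 1-D "grade" variable, shown only
        # to illustrate the regionalised-variable idea behind variogram modelling.
        import numpy as np

        def semivariogram(coords, values, lag, tol, n_lags):
            d = np.abs(coords[:, None] - coords[None, :])
            diff2 = (values[:, None] - values[None, :]) ** 2
            lags, gammas = [], []
            for k in range(1, n_lags + 1):
                h = k * lag
                mask = (d > h - tol) & (d <= h + tol)
                if mask.any():
                    lags.append(h)
                    gammas.append(0.5 * diff2[mask].mean())
            return np.array(lags), np.array(gammas)

        rng = np.random.default_rng(3)
        x = np.sort(rng.uniform(0, 100, 200))           # sample locations (m)
        grade = 5 + np.cumsum(rng.normal(0, 0.3, 200))  # spatially correlated "grade"
        lags, gam = semivariogram(x, grade, lag=5.0, tol=2.5, n_lags=10)
        for h, g in zip(lags, gam):
            print(f"h = {h:5.1f} m   gamma = {g:.3f}")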

  3. Reduced-Complexity Direction of Arrival Estimation Using Real-Valued Computation with Arbitrary Array Configurations

    Directory of Open Access Journals (Sweden)

    Feng-Gang Yan

    2018-01-01

    A low-complexity algorithm is presented to dramatically reduce the complexity of the multiple signal classification (MUSIC) algorithm for direction of arrival (DOA) estimation, in which both tasks of eigenvalue decomposition (EVD) and spectral search are implemented with efficient real-valued computations, leading to about 75% complexity reduction as compared to the standard MUSIC. Furthermore, the proposed technique has no dependence on array configurations and is hence suitable for arbitrary array geometries, which shows a significant implementation advantage over most state-of-the-art unitary estimators including unitary MUSIC (U-MUSIC). Numerical simulations over a wide range of scenarios are conducted to show the performance of the new technique, which demonstrates that with a significantly reduced computational complexity, the new approach is able to provide a close accuracy to the standard MUSIC.
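
    For reference, a sketch of the standard complex-valued MUSIC estimator for a uniform linear array is given below; the paper's real-valued, reduced-complexity variant for arbitrary geometries is not reproduced.

        # Standard (complex-valued) MUSIC for a uniform linear array, as a baseline;
        # the number of sources K = 2 is assumed known.
        import numpy as np

        rng = np.random.default_rng(4)
        M, d, wavelength = 8, 0.5, 1.0                 # sensors, spacing in wavelengths
        true_doas = np.deg2rad([-20.0, 35.0])
        snapshots = 200

        def steering(theta):
            k = 2 * np.pi * d / wavelength
            return np.exp(1j * k * np.arange(M)[:, None] * np.sin(theta))

        A = steering(true_doas)                                            # M x K
        S = rng.normal(size=(2, snapshots)) + 1j * rng.normal(size=(2, snapshots))
        N = 0.1 * (rng.normal(size=(M, snapshots)) + 1j * rng.normal(size=(M, snapshots)))
        X = A @ S + N

        R = X @ X.conj().T / snapshots
        eigval, eigvec = np.linalg.eigh(R)
        En = eigvec[:, :-2]                                                # noise subspace

        grid = np.deg2rad(np.linspace(-90, 90, 721))
        a = steering(grid)                                                 # M x G
        spectrum = 1.0 / np.sum(np.abs(En.conj().T @ a) ** 2, axis=0)

        # Pick the two largest local maxima of the pseudo-spectrum.
        is_peak = (spectrum[1:-1] > spectrum[:-2]) & (spectrum[1:-1] > spectrum[2:])
        peak_idx = np.where(is_peak)[0] + 1
        top2 = peak_idx[np.argsort(spectrum[peak_idx])[-2:]]
        print("estimated DOAs (deg):", np.sort(np.rad2deg(grid[top2])))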

  4. Using multiple metaphors and multimodalities as a semiotic resource when teaching year 2 students computational strategies

    Science.gov (United States)

    Mildenhall, Paula; Sherriff, Barbara

    2017-06-01

    Recent research indicates that using multimodal learning experiences can be effective in teaching mathematics. Using a social semiotic lens within a participationist framework, this paper reports on a professional learning collaboration with a primary school teacher designed to explore the use of metaphors and modalities in mathematics instruction. This video case study was conducted in a year 2 classroom over two terms, with the focus on building children's understanding of computational strategies. The findings revealed that the teacher was able to successfully plan both multimodal and multiple metaphor learning experiences that acted as semiotic resources to support the children's understanding of abstract mathematics. The study also led to implications for teaching when using multiple metaphors and multimodalities.

  5. Adaptive resource allocation scheme using sliding window subchannel gain computation: context of OFDMA wireless mobiles systems

    International Nuclear Information System (INIS)

    Khelifa, F.; Samet, A.; Ben Hassen, W.; Afif, M.

    2011-01-01

    Multiuser diversity combined with Orthogonal Frequency Division Multiple Access (OFDMA) is a promising technique for achieving high downlink capacities in the new generation of cellular and wireless network systems. The total capacity of an OFDMA-based system is maximized when each subchannel is assigned to the mobile station with the best channel-to-noise ratio for that subchannel, with power uniformly distributed among all subchannels. A contiguous method for subchannel construction is adopted in the IEEE 802.16m standard in order to reduce OFDMA system complexity. In this context, new subchannel gain computation methods can contribute, jointly with optimal subchannel assignment, to maximizing total system capacity. In this paper, two new methods are proposed in order to achieve a better trade-off between fairness and efficient use of resources. Numerical results show that the proposed algorithms provide low complexity, higher total system capacity and fairness among users compared to other recent methods.
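
    The baseline allocation described above, assigning each subchannel to the user with the best channel-to-noise ratio under uniform power, can be sketched as follows; the proposed sliding-window gain computation and fairness-aware variants are not reproduced.

        # Toy max-CNR subchannel assignment with uniform power; channel values and
        # sizes are arbitrary illustrative assumptions.
        import numpy as np

        rng = np.random.default_rng(5)
        n_users, n_subchannels = 4, 32
        cnr = rng.exponential(scale=10.0, size=(n_users, n_subchannels))  # channel-to-noise ratios

        power_per_sc = 1.0 / n_subchannels               # uniform power split
        assignment = cnr.argmax(axis=0)                  # best user per subchannel
        rate = np.zeros(n_users)
        for sc, user in enumerate(assignment):
            rate[user] += np.log2(1.0 + power_per_sc * cnr[user, sc])

        print("per-user rate (bit/s/Hz):", np.round(rate, 2))
        print("total capacity          :", round(rate.sum(), 2))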

  6. Application of a Resource Theory for Magic States to Fault-Tolerant Quantum Computing.

    Science.gov (United States)

    Howard, Mark; Campbell, Earl

    2017-03-03

    Motivated by their necessity for most fault-tolerant quantum computation schemes, we formulate a resource theory for magic states. First, we show that robustness of magic is a well-behaved magic monotone that operationally quantifies the classical simulation overhead for a Gottesman-Knill-type scheme using ancillary magic states. Our framework subsequently finds immediate application in the task of synthesizing non-Clifford gates using magic states. When magic states are interspersed with Clifford gates, Pauli measurements, and stabilizer ancillas-the most general synthesis scenario-then the class of synthesizable unitaries is hard to characterize. Our techniques can place nontrivial lower bounds on the number of magic states required for implementing a given target unitary. Guided by these results, we have found new and optimal examples of such synthesis.

  7. Software documentation and user's manual for fish-impingement sampling design and estimation method computer programs

    International Nuclear Information System (INIS)

    Murarka, I.P.; Bodeau, D.J.

    1977-11-01

    This report contains a description of three computer programs that implement the theory of sampling designs and the methods for estimating fish-impingement at the cooling-water intakes of nuclear power plants as described in companion report ANL/ES-60. Complete FORTRAN listings of these programs, named SAMPLE, ESTIMA, and SIZECO, are given and augmented with examples of how they are used

  8. EGI-EUDAT integration activity - Pair data and high-throughput computing resources together

    Science.gov (United States)

    Scardaci, Diego; Viljoen, Matthew; Vitlacil, Dejan; Fiameni, Giuseppe; Chen, Yin; Sipos, Gergely; Ferrari, Tiziana

    2016-04-01

    EGI (www.egi.eu) is a publicly funded e-infrastructure put together to give scientists access to more than 530,000 logical CPUs, 200 PB of disk capacity and 300 PB of tape storage to drive research and innovation in Europe. The infrastructure provides both high throughput computing and cloud compute/storage capabilities. Resources are provided by about 350 resource centres which are distributed across 56 countries in Europe, the Asia-Pacific region, Canada and Latin America. EUDAT (www.eudat.eu) is a collaborative Pan-European infrastructure providing research data services, training and consultancy for researchers, research communities, research infrastructures and data centres. EUDAT's vision is to enable European researchers and practitioners from any research discipline to preserve, find, access, and process data in a trusted environment, as part of a Collaborative Data Infrastructure (CDI) conceived as a network of collaborating, cooperating centres, combining the richness of numerous community-specific data repositories with the permanence and persistence of some of Europe's largest scientific data centres. EGI and EUDAT, in the context of their flagship projects, EGI-Engage and EUDAT2020, started in March 2015 a collaboration to harmonise the two infrastructures, including technical interoperability, authentication, authorisation and identity management, policy and operations. The main objective of this work is to provide end-users with a seamless access to an integrated infrastructure offering both EGI and EUDAT services and, then, pairing data and high-throughput computing resources together. To define the roadmap of this collaboration, EGI and EUDAT selected a set of relevant user communities, already collaborating with both infrastructures, which could bring requirements and help to assign the right priorities to each of them. In this way, from the beginning, this activity has been really driven by the end users. The identified user communities are

  9. Assessing different parameters estimation methods of Weibull distribution to compute wind power density

    International Nuclear Information System (INIS)

    Mohammadi, Kasra; Alavi, Omid; Mostafaeipour, Ali; Goudarzi, Navid; Jalilvand, Mahdi

    2016-01-01

    Highlights: • Effectiveness of six numerical methods is evaluated to determine wind power density. • The more appropriate method for computing the daily wind power density is identified. • Four windy stations located in the southern part of Alberta, Canada, are investigated. • The more appropriate parameter estimation method was not identical among all examined stations. - Abstract: In this study, the effectiveness of six numerical methods is evaluated to determine the shape (k) and scale (c) parameters of the Weibull distribution function for the purpose of calculating the wind power density. The selected methods are the graphical method (GP), empirical method of Justus (EMJ), empirical method of Lysen (EML), energy pattern factor method (EPF), maximum likelihood method (ML) and modified maximum likelihood method (MML). The purpose of this study is to identify the more appropriate method for computing the wind power density at four stations distributed in the Alberta province of Canada, namely Edmonton City Center Awos, Grande Prairie A, Lethbridge A and Waterton Park Gate. To provide a complete analysis, the evaluations are performed on both daily and monthly scales. The results indicate that the precision of computed wind power density values changes when different parameter estimation methods are used to determine the k and c parameters. The four methods EMJ, EML, EPF and ML present very favorable efficiency, while the GP method shows weak ability for all stations. However, it is found that the most effective method is not the same among stations owing to differences in the wind characteristics.
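
    As an illustration of one of the listed estimators, the sketch below applies the empirical method of Justus, k = (σ/v̄)^(-1.086) and c = v̄/Γ(1 + 1/k), to synthetic wind speeds and then evaluates the Weibull-based wind power density 0.5·ρ·c³·Γ(1 + 3/k); the data and air density are illustrative assumptions.

        # Empirical method of Justus (EMJ) on synthetic wind speeds, followed by the
        # Weibull-based wind power density.
        import numpy as np
        from math import gamma

        rng = np.random.default_rng(6)
        v = rng.weibull(2.0, size=5000) * 7.0        # synthetic wind speeds (m/s)

        mean, std = v.mean(), v.std(ddof=1)
        k = (std / mean) ** -1.086                   # EMJ shape parameter
        c = mean / gamma(1.0 + 1.0 / k)              # EMJ scale parameter (m/s)

        rho = 1.225                                  # assumed air density (kg/m^3)
        wpd = 0.5 * rho * c**3 * gamma(1.0 + 3.0 / k)
        print(f"k = {k:.2f}, c = {c:.2f} m/s, wind power density = {wpd:.1f} W/m^2")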

  10. Efficient Nash Equilibrium Resource Allocation Based on Game Theory Mechanism in Cloud Computing by Using Auction.

    Science.gov (United States)

    Nezarat, Amin; Dastghaibifard, G H

    2015-01-01

    One of the most complex issues in the cloud computing environment is the problem of resource allocation: on one hand, the cloud provider expects the most profitability and, on the other hand, users expect to have the best resources at their disposal given their budget and time constraints. In most previous work, heuristic and evolutionary approaches have been used to solve this problem. Nevertheless, since the nature of this environment is based on economic methods, using such methods can decrease response time and reduce the complexity of the problem. In this paper, an auction-based method is proposed which determines the auction winner by applying a game theory mechanism and holding a repetitive game with incomplete information in a non-cooperative environment. In this method, users calculate a suitable price bid with their objective function over several rounds and repetitions and send it to the auctioneer, and the auctioneer chooses the winning player based on the suggested utility function. In the proposed method, the end point of the game is the Nash equilibrium point where players are no longer inclined to alter their bid for that resource and the final bid also satisfies the auctioneer's utility function. To prove the convexity of the response space, the Lagrange method is used, the proposed model is simulated in CloudSim, and the results are compared with previous work. It is concluded that this method converges to a response in a shorter time, produces the fewest service level agreement violations, and provides the most utility to the provider.

  11. Monitoring of computing resource use of active software releases at ATLAS

    Science.gov (United States)

    Limosani, Antonio; ATLAS Collaboration

    2017-10-01

    The LHC is the world’s most powerful particle accelerator, colliding protons at a centre-of-mass energy of 13 TeV. As the energy and frequency of collisions has grown in the search for new physics, so too has demand for computing resources needed for event reconstruction. We will report on the evolution of resource usage in terms of CPU and RAM in key ATLAS offline reconstruction workflows at the Tier-0 at CERN and on the WLCG. Monitoring of workflows is achieved using the ATLAS PerfMon package, which is the standard ATLAS performance monitoring system running inside Athena jobs. Systematic daily monitoring has recently been expanded to include all workflows beginning at Monte Carlo generation through to end-user physics analysis, beyond that of event reconstruction. Moreover, the move to a multiprocessor mode in production jobs has facilitated the use of tools, such as “MemoryMonitor”, to measure the memory shared across processors in jobs. Resource consumption is broken down into software domains and displayed in plots generated using Python visualization libraries and collected into pre-formatted auto-generated Web pages, which allow the ATLAS developer community to track the performance of their algorithms. This information is however preferentially filtered to domain leaders and developers through the use of JIRA and via reports given at ATLAS software meetings. Finally, we take a glimpse of the future by reporting on the expected CPU and RAM usage in benchmark workflows associated with the High Luminosity LHC and anticipate the ways performance monitoring will evolve to understand and benchmark future workflows.

  12. A network-based multi-target computational estimation scheme for anticoagulant activities of compounds.

    Directory of Open Access Journals (Sweden)

    Qian Li

    BACKGROUND: Traditional virtual screening method pays more attention on predicted binding affinity between drug molecule and target related to a certain disease instead of phenotypic data of drug molecule against disease system, as is often less effective on discovery of the drug which is used to treat many types of complex diseases. Virtual screening against a complex disease by general network estimation has become feasible with the development of network biology and system biology. More effective methods of computational estimation for the whole efficacy of a compound in a complex disease system are needed, given the distinct weightiness of the different target in a biological process and the standpoint that partial inhibition of several targets can be more efficient than the complete inhibition of a single target. METHODOLOGY: We developed a novel approach by integrating the affinity predictions from multi-target docking studies with biological network efficiency analysis to estimate the anticoagulant activities of compounds. From results of network efficiency calculation for human clotting cascade, factor Xa and thrombin were identified as the two most fragile enzymes, while the catalytic reaction mediated by complex IXa:VIIIa and the formation of the complex VIIIa:IXa were recognized as the two most fragile biological matter in the human clotting cascade system. Furthermore, the method which combined network efficiency with molecular docking scores was applied to estimate the anticoagulant activities of a serial of argatroban intermediates and eight natural products respectively. The better correlation (r = 0.671) between the experimental data and the decrease of the network deficiency suggests that the approach could be a promising computational systems biology tool to aid identification of anticoagulant activities of compounds in drug discovery. CONCLUSIONS: This article proposes a network-based multi-target computational estimation

  13. A network-based multi-target computational estimation scheme for anticoagulant activities of compounds.

    Science.gov (United States)

    Li, Qian; Li, Xudong; Li, Canghai; Chen, Lirong; Song, Jun; Tang, Yalin; Xu, Xiaojie

    2011-03-22

    Traditional virtual screening method pays more attention on predicted binding affinity between drug molecule and target related to a certain disease instead of phenotypic data of drug molecule against disease system, as is often less effective on discovery of the drug which is used to treat many types of complex diseases. Virtual screening against a complex disease by general network estimation has become feasible with the development of network biology and system biology. More effective methods of computational estimation for the whole efficacy of a compound in a complex disease system are needed, given the distinct weightiness of the different target in a biological process and the standpoint that partial inhibition of several targets can be more efficient than the complete inhibition of a single target. We developed a novel approach by integrating the affinity predictions from multi-target docking studies with biological network efficiency analysis to estimate the anticoagulant activities of compounds. From results of network efficiency calculation for human clotting cascade, factor Xa and thrombin were identified as the two most fragile enzymes, while the catalytic reaction mediated by complex IXa:VIIIa and the formation of the complex VIIIa:IXa were recognized as the two most fragile biological matter in the human clotting cascade system. Furthermore, the method which combined network efficiency with molecular docking scores was applied to estimate the anticoagulant activities of a serial of argatroban intermediates and eight natural products respectively. The better correlation (r = 0.671) between the experimental data and the decrease of the network deficiency suggests that the approach could be a promising computational systems biology tool to aid identification of anticoagulant activities of compounds in drug discovery. This article proposes a network-based multi-target computational estimation method for anticoagulant activities of compounds by
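
    A hedged sketch of the network-efficiency side of the approach is given below: nodes of a small toy reaction graph are ranked by the drop in global efficiency caused by their removal, using networkx; the toy graph is illustrative only and does not encode the actual clotting cascade.

        # Rank nodes of a toy reaction network by the drop in global efficiency their
        # removal causes (a stand-in for the fragility analysis described above).
        import networkx as nx

        G = nx.Graph()
        G.add_edges_from([
            ("TF:VIIa", "IXa"), ("TF:VIIa", "Xa"), ("IXa", "IXa:VIIIa"),
            ("IXa:VIIIa", "Xa"), ("Xa", "thrombin"), ("thrombin", "fibrin"),
            ("thrombin", "IXa:VIIIa"),
        ])

        base = nx.global_efficiency(G)
        drops = {}
        for node in list(G.nodes):
            H = G.copy()
            H.remove_node(node)
            drops[node] = base - nx.global_efficiency(H)

        for node, drop in sorted(drops.items(), key=lambda kv: -kv[1]):
            print(f"{node:10s} efficiency drop {drop:.3f}")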

  14. Estimate of Hot Dry Rock Geothermal Resource in Daqing Oilfield, Northeast China

    OpenAIRE

    Guangzheng Jiang; Yi Wang; Yizuo Shi; Chao Zhang; Xiaoyin Tang; Shengbiao Hu

    2016-01-01

    Development and utilization of deep geothermal resources, especially a hot dry rock (HDR) geothermal resource, is beneficial for both economic and environmental consideration in oilfields. This study used data from multiple sources to assess the geothermal energy resource in the Daqing Oilfield. The temperature logs in boreholes (both shallow water wells and deep boreholes) and the drilling stem test temperature were used to create isothermal maps in depths. Upon the temperature field and the...

  15. A high resolution global wind atlas - improving estimation of world wind resources

    DEFF Research Database (Denmark)

    Badger, Jake; Ejsing Jørgensen, Hans

    2011-01-01

    to population centres, electrical transmission grids, terrain types, and protected land areas are important parts of the resource assessment downstream of the generation of wind climate statistics. Related to these issues of integration are the temporal characteristics and spatial correlation of the wind...... resources. These aspects will also be addressed by the Global Wind Atlas. The Global Wind Atlas, through a transparent methodology, will provide a unified, high resolution, and public domain dataset of wind energy resources for the whole world. The wind atlas data will be the most appropriate wind resource...

  16. A resource facility for kinetic analysis: modeling using the SAAM computer programs.

    Science.gov (United States)

    Foster, D M; Boston, R C; Jacquez, J A; Zech, L

    1989-01-01

    Kinetic analysis and integrated system modeling have contributed significantly to understanding the physiology and pathophysiology of metabolic systems in humans and animals. Many experimental biologists are aware of the usefulness of these techniques and recognize that kinetic modeling requires special expertise. The Resource Facility for Kinetic Analysis (RFKA) provides this expertise through: (1) development and application of modeling technology for biomedical problems, and (2) development of computer-based kinetic modeling methodologies concentrating on the computer program Simulation, Analysis, and Modeling (SAAM) and its conversational version, CONversational SAAM (CONSAM). The RFKA offers consultation to the biomedical community in the use of modeling to analyze kinetic data and trains individuals in using this technology for biomedical research. Early versions of SAAM were widely applied in solving dosimetry problems; many users, however, are not familiar with recent improvements to the software. The purpose of this paper is to acquaint biomedical researchers in the dosimetry field with RFKA, which, together with the joint National Cancer Institute-National Heart, Lung and Blood Institute project, is overseeing SAAM development and applications. In addition, RFKA provides many service activities to the SAAM user community that are relevant to solving dosimetry problems.

  17. A comprehensive overview of computational resources to aid in precision genome editing with engineered nucleases.

    Science.gov (United States)

    Periwal, Vinita

    2017-07-01

    Genome editing with engineered nucleases (zinc finger nucleases, TAL effector nucleases and clustered regularly interspaced short palindromic repeats/CRISPR-associated) has recently been shown to have great promise in a variety of therapeutic and biotechnological applications. However, their exploitation in genetic analysis and clinical settings largely depends on their specificity for the intended genomic target. Large and complex genomes often contain highly homologous/repetitive sequences, which limits the specificity of genome editing tools and could result in off-target activity. Over the past few years, various computational approaches have been developed to assist the design process and predict/reduce the off-target activity of these nucleases. These tools could be efficiently used to guide the design of constructs for engineered nucleases and evaluate results after genome editing. This review provides a comprehensive overview of various databases, tools, web servers and resources for genome editing and compares their features and functionalities. Additionally, it also describes tools that have been developed to analyse post-genome editing results. The article also discusses important design parameters that could be considered while designing these nucleases. This review is intended to be a quick reference guide for experimentalists as well as computational biologists working in the field of genome editing with engineered nucleases. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  18. Computationally Efficient 2D DOA Estimation for L-Shaped Array with Unknown Mutual Coupling

    Directory of Open Access Journals (Sweden)

    Yang-Yang Dong

    2018-01-01

    Although an L-shaped array can provide good angle estimation performance and is easy to implement, its two-dimensional (2D) direction-of-arrival (DOA) performance degrades greatly in the presence of mutual coupling. To deal with the mutual coupling effect, a novel 2D DOA estimation method for L-shaped array with low computational complexity is developed in this paper. First, we generalize the conventional mutual coupling model for L-shaped array and compensate the mutual coupling blindly via sacrificing a few sensors as auxiliary elements. Then we apply the propagator method twice to mitigate the effect of strong source signal correlation. Finally, the estimations of azimuth and elevation angles are achieved simultaneously without pair matching via the complex eigenvalue technique. Compared with the existing methods, the proposed method is computationally efficient without spectrum search or polynomial rooting and also has fine angle estimation performance for highly correlated source signals. Theoretical analysis and simulation results have demonstrated the effectiveness of the proposed method.

  19. Estimate of fuel burnup spatial a multipurpose reactor in computer simulation

    International Nuclear Information System (INIS)

    Santos, Nadia Rodrigues dos; Lima, Zelmo Rodrigues de; Moreira, Maria de Lourdes

    2015-01-01

    In previous research, which aimed, through computer simulation, to estimate the spatial fuel burnup for the research reactor benchmark, materials test reactor - International Atomic Energy Agency (MTR/IAEA), it was found that the use of a code in the FORTRAN language, based on neutron diffusion theory, together with WIMSD-5B, which performs the cell calculation, proved to be valid for estimating the spatial burnup of other nuclear research reactors. That said, this paper presents the results of a computer simulation to estimate the spatial fuel burnup of a typical multipurpose reactor of the plate-type, dispersion-fuel kind. The results were considered satisfactory, being in line with those presented in the literature. For future work, simulations with other core configurations are suggested, as are comparisons of the WIMSD-5B results with programs often employed in burnup calculations and tests of different methods of interpolating the values obtained with FORTRAN. Another proposal is to estimate the fuel burnup taking into account thermal-hydraulic parameters and the appearance of xenon. (author)

  20. A Computable Plug-In Estimator of Minimum Volume Sets for Novelty Detection

    KAUST Repository

    Park, Chiwoo

    2010-10-01

    A minimum volume set of a probability density is a region of minimum size among the regions covering a given probability mass of the density. Effective methods for finding the minimum volume sets are very useful for detecting failures or anomalies in commercial and security applications-a problem known as novelty detection. One theoretical approach of estimating the minimum volume set is to use a density level set where a kernel density estimator is plugged into the optimization problem that yields the appropriate level. Such a plug-in estimator is not of practical use because solving the corresponding minimization problem is usually intractable. A modified plug-in estimator was proposed by Hyndman in 1996 to overcome the computation difficulty of the theoretical approach but is not well studied in the literature. In this paper, we provide theoretical support to this estimator by showing its asymptotic consistency. We also show that this estimator is very competitive to other existing novelty detection methods through an extensive empirical study. ©2010 INFORMS.
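
    A minimal sketch of the plug-in idea follows: a kernel density estimate is computed, the level whose upper level set covers mass 1 - alpha is approximated by the empirical alpha-quantile of the estimated density at the training points, and new points falling below that level are flagged as novelties. Library choices and parameters are illustrative assumptions.

        # KDE-based plug-in novelty detector: flag points whose estimated density
        # falls below the level covering roughly 1 - alpha of the probability mass.
        import numpy as np
        from sklearn.neighbors import KernelDensity

        rng = np.random.default_rng(7)
        train = rng.normal(0.0, 1.0, size=(500, 2))          # nominal data
        kde = KernelDensity(bandwidth=0.4).fit(train)

        alpha = 0.05
        log_dens_train = kde.score_samples(train)
        level = np.quantile(log_dens_train, alpha)           # approximate MV-set level

        test = np.array([[0.2, -0.1], [4.0, 4.0]])           # one nominal, one anomalous
        is_novelty = kde.score_samples(test) < level
        print(is_novelty)                                    # likely: [False  True]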

  1. A Bootstrap Approach to Computing Uncertainty in Inferred Oil and Gas Reserve Estimates

    International Nuclear Information System (INIS)

    Attanasi, Emil D.; Coburn, Timothy C.

    2004-01-01

    This study develops confidence intervals for estimates of inferred oil and gas reserves based on bootstrap procedures. Inferred reserves are expected additions to proved reserves in previously discovered conventional oil and gas fields. Estimates of inferred reserves accounted for 65% of the total oil and 34% of the total gas assessed in the U.S. Geological Survey's 1995 National Assessment of oil and gas in US onshore and State offshore areas. When the same computational methods used in the 1995 Assessment are applied to more recent data, the 80-year (from 1997 through 2076) inferred reserve estimates for pre-1997 discoveries located in the lower 48 onshore and state offshore areas amounted to a total of 39.7 billion barrels of oil (BBO) and 293 trillion cubic feet (TCF) of gas. The 90% confidence interval about the oil estimate derived from the bootstrap approach is 22.4 BBO to 69.5 BBO. The comparable 90% confidence interval for the inferred gas reserve estimate is 217 TCF to 413 TCF. The 90% confidence interval describes the uncertainty that should be attached to the estimates. It also provides a basis for developing scenarios to explore the implications for energy policy analysis
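
    A generic percentile-bootstrap sketch of this kind of interval is shown below on toy per-field data; the USGS inferred-reserve estimator itself is not reproduced.

        # Percentile bootstrap for a 90% confidence interval on a total; the data and
        # the summed statistic are toy stand-ins for the inferred-reserve estimator.
        import numpy as np

        rng = np.random.default_rng(8)
        field_growth = rng.lognormal(mean=0.0, sigma=1.0, size=300)   # toy per-field additions

        def total_inferred(sample):
            return sample.sum()

        boot = np.array([
            total_inferred(rng.choice(field_growth, size=field_growth.size, replace=True))
            for _ in range(2000)
        ])
        lo, hi = np.percentile(boot, [5, 95])          # 90% confidence interval
        print(f"point estimate {total_inferred(field_growth):.1f}, 90% CI [{lo:.1f}, {hi:.1f}]")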

  2. A Computable Plug-In Estimator of Minimum Volume Sets for Novelty Detection

    KAUST Repository

    Park, Chiwoo; Huang, Jianhua Z.; Ding, Yu

    2010-01-01

    A minimum volume set of a probability density is a region of minimum size among the regions covering a given probability mass of the density. Effective methods for finding the minimum volume sets are very useful for detecting failures or anomalies in commercial and security applications-a problem known as novelty detection. One theoretical approach of estimating the minimum volume set is to use a density level set where a kernel density estimator is plugged into the optimization problem that yields the appropriate level. Such a plug-in estimator is not of practical use because solving the corresponding minimization problem is usually intractable. A modified plug-in estimator was proposed by Hyndman in 1996 to overcome the computation difficulty of the theoretical approach but is not well studied in the literature. In this paper, we provide theoretical support to this estimator by showing its asymptotic consistency. We also show that this estimator is very competitive to other existing novelty detection methods through an extensive empirical study. ©2010 INFORMS.

  3. The Event Detection and the Apparent Velocity Estimation Based on Computer Vision

    Science.gov (United States)

    Shimojo, M.

    2012-08-01

    The high spatial and temporal resolution data obtained by the telescopes aboard Hinode revealed new, interesting dynamics in the solar atmosphere. In order to detect such events and estimate the velocity of the dynamics automatically, we examined optical flow estimation methods based on OpenCV, the computer vision library. We applied the methods to the prominence eruption observed by NoRH and the polar X-ray jet observed by XRT. As a result, it is clear that the methods work well for solar images if the images are optimized for the methods. This indicates that the optical flow estimation methods in the OpenCV library are very useful for analyzing solar phenomena.
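
    A small sketch of dense optical flow estimation with OpenCV's Farneback method on two synthetic frames is given below; it illustrates the kind of OpenCV estimator the study examined rather than its exact configuration.

        # Dense optical flow with cv2.calcOpticalFlowFarneback on two synthetic frames
        # (a bright blob shifted by 5 px); parameters are generic illustrative values.
        import cv2
        import numpy as np

        frame1 = np.zeros((128, 128), dtype=np.uint8)
        frame2 = np.zeros((128, 128), dtype=np.uint8)
        cv2.circle(frame1, (40, 64), 10, 255, -1)      # bright feature
        cv2.circle(frame2, (45, 64), 10, 255, -1)      # same feature shifted in x

        flow = cv2.calcOpticalFlowFarneback(frame1, frame2, None,
                                            pyr_scale=0.5, levels=3, winsize=15,
                                            iterations=3, poly_n=5, poly_sigma=1.2,
                                            flags=0)
        mask = frame1 > 0
        print("mean flow at feature (px):", flow[mask].mean(axis=0))   # roughly (5, 0)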

  4. Fast computation of statistical uncertainty for spatiotemporal distributions estimated directly from dynamic cone beam SPECT projections

    International Nuclear Information System (INIS)

    Reutter, Bryan W.; Gullberg, Grant T.; Huesman, Ronald H.

    2001-01-01

    The estimation of time-activity curves and kinetic model parameters directly from projection data is potentially useful for clinical dynamic single photon emission computed tomography (SPECT) studies, particularly in those clinics that have only single-detector systems and thus are not able to perform rapid tomographic acquisitions. Because the radiopharmaceutical distribution changes while the SPECT gantry rotates, projections at different angles come from different tracer distributions. A dynamic image sequence reconstructed from the inconsistent projections acquired by a slowly rotating gantry can contain artifacts that lead to biases in kinetic parameters estimated from time-activity curves generated by overlaying regions of interest on the images. If cone beam collimators are used and the focal point of the collimators always remains in a particular transaxial plane, additional artifacts can arise in other planes reconstructed using insufficient projection samples [1]. If the projection samples truncate the patient's body, this can result in additional image artifacts. To overcome these sources of bias in conventional image based dynamic data analysis, we and others have been investigating the estimation of time-activity curves and kinetic model parameters directly from dynamic SPECT projection data by modeling the spatial and temporal distribution of the radiopharmaceutical throughout the projected field of view [2-8]. In our previous work we developed a computationally efficient method for fully four-dimensional (4-D) direct estimation of spatiotemporal distributions from dynamic SPECT projection data [5], which extended Formiconi's least squares algorithm for reconstructing temporally static distributions [9]. In addition, we studied the biases that result from modeling various orders of temporal continuity and using various time samplings [5]. In the present work, we address computational issues associated with evaluating the statistical uncertainty of

  5. Coupled Crop/Hydrology Model to Estimate Expanded Irrigation Impact on Water Resources

    Science.gov (United States)

    Handyside, C. T.; Cruise, J.

    2017-12-01

    A coupled agricultural and hydrologic systems model is used to examine the environmental impact of irrigation in the Southeast. A gridded crop model for the Southeast is used to determine regional irrigation demand. This irrigation demand is used in a regional hydrologic model to determine the hydrologic impact of irrigation. For the Southeast to maintain/expand irrigated agricultural production and provide adaptation to climate change and climate variability, it will require integrated agricultural and hydrologic system models that can calculate irrigation demand and the impact of this demand on river hydrology. These integrated models can be used as (1) historical tools to examine vulnerability of expanded irrigation to past climate extremes, (2) future tools to examine the sustainability of expanded irrigation under future climate scenarios and (3) real-time tools to allow dynamic water resource management. Such tools are necessary to assure stakeholders and the public that irrigation can be carried out in a sustainable manner. The system tools to be discussed include a gridded version of the crop modeling system (DSSAT). The gridded model is referred to as GriDSSAT. The irrigation demand from GriDSSAT is coupled to a regional hydrologic model (WaSSI) developed by the Eastern Forest Environmental Threat Assessment Center of the USDA Forest Service. The crop model provides the dynamic irrigation demand, which is a function of the weather. The hydrologic model includes all other competing uses of water. Examples of using the crop model coupled with the hydrologic model include historical analyses which show the change in hydrology as additional acres of irrigated land are added to watersheds. The first order change in hydrology is computed in terms of changes in the Water Availability Stress Index (WaSSI), which is the ratio of water demand (irrigation, public water supply, industrial use, etc.) to water availability from the hydrologic model. Also

  6. Functional requirements of computer systems for the U.S. Geological Survey, Water Resources Division, 1988-97

    Science.gov (United States)

    Hathaway, R.M.; McNellis, J.M.

    1989-01-01

    Investigating the occurrence, quantity, quality, distribution, and movement of the Nation's water resources is the principal mission of the U.S. Geological Survey's Water Resources Division. Reports of these investigations are published and available to the public. To accomplish this mission, the Division requires substantial computer technology to process, store, and analyze data from more than 57,000 hydrologic sites. The Division's computer resources are organized through the Distributed Information System Program Office that manages the nationwide network of computers. The contract that provides the major computer components for the Water Resources Division's Distributed Information System expires in 1991. Five work groups were organized to collect the information needed to procure a new generation of computer systems for the U.S. Geological Survey, Water Resources Division. Each group was assigned a major Division activity and asked to describe its functional requirements of computer systems for the next decade. The work groups and major activities are: (1) hydrologic information; (2) hydrologic applications; (3) geographic information systems; (4) reports and electronic publishing; and (5) administrative. The work groups identified 42 functions and described their functional requirements for 1988, 1992, and 1997. A few new functions, such as Decision Support Systems and Executive Information Systems, were identified, but most are the same as those performed today. Although the number of functions will remain about the same, steady growth in the size, complexity, and frequency of many functions is predicted for the next decade. No compensating increase in the Division's staff is anticipated during this period. To handle the increased workload and perform these functions, new approaches will be developed that use advanced computer technology. The advanced technology is required in a unified, tightly coupled system that will support all functions simultaneously

  7. Estimation of computed tomography dose index in cone beam computed tomography: MOSFET measurements and Monte Carlo simulations.

    Science.gov (United States)

    Kim, Sangroh; Yoshizumi, Terry; Toncheva, Greta; Yoo, Sua; Yin, Fang-Fang; Frush, Donald

    2010-05-01

    To address the lack of an accurate dose estimation method in cone beam computed tomography (CBCT), we performed point dose metal oxide semiconductor field-effect transistor (MOSFET) measurements and Monte Carlo (MC) simulations. A Varian On-Board Imager (OBI) was employed to measure point doses in the polymethyl methacrylate (PMMA) CT phantoms with MOSFETs for standard and low dose modes. An MC model of the OBI x-ray tube was developed using the BEAMnrc/EGSnrc MC system and validated against the half value layer, x-ray spectrum, and lateral and depth dose profiles. We compared the weighted computed tomography dose index (CTDIw) between MOSFET measurements and MC simulations. The CTDIw was found to be 8.39 cGy for the head scan and 4.58 cGy for the body scan from the MOSFET measurements in standard dose mode, and 1.89 cGy for the head and 1.11 cGy for the body in low dose mode. The CTDIw from MC compared well with the MOSFET measurements, with differences within 5%. In conclusion, an MC model for Varian CBCT has been established and this approach may be easily extended from the CBCT geometry to multi-detector CT geometry.
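
    The weighted CT dose index reported above is conventionally formed from phantom point doses as one third of the centre dose plus two thirds of the mean peripheral dose. The sketch below applies that standard weighting to hypothetical MOSFET readings; the numbers are illustrative, not the paper's measurements.

```python
# Hedged sketch: standard weighted CT dose index from phantom point doses,
# CTDIw = (1/3)*centre + (2/3)*mean(periphery).  Doses below (cGy) are hypothetical.
def ctdi_w(center_dose, peripheral_doses):
    periphery = sum(peripheral_doses) / len(peripheral_doses)
    return center_dose / 3.0 + 2.0 * periphery / 3.0

head_standard = ctdi_w(7.9, [8.5, 8.6, 8.4, 8.7])
print(f"CTDIw (head phantom, standard mode) ≈ {head_standard:.2f} cGy")
```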

  8. Increasing the Accuracy of Estimation of Latent Parameters in Human Resource Management Systems

    Directory of Open Access Journals (Sweden)

    O.N. Gustun

    2011-09-01

    Full Text Available An approach to building an adaptive testing system, in which initial estimates of test items are obtained through calibration testing, is considered. The maximum likelihood method is used to obtain these estimates. Optimization of the objective function is carried out using the Hooke-Jeeves method. The influence of various factors on the accuracy of the estimates obtained is investigated.
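
    The Hooke-Jeeves method named above is a derivative-free pattern search. The sketch below is a minimal, generic implementation applied to a toy objective; it is not the calibration likelihood of the adaptive testing system, and the step, shrink and tolerance values are assumptions.

```python
# Hedged sketch of Hooke-Jeeves pattern search on a toy objective.
import numpy as np

def _explore(f, x, step):
    """Exploratory move: probe +/- step along each coordinate, keep improvements."""
    best, fbest = x.copy(), f(x)
    for i in range(best.size):
        for delta in (step, -step):
            trial = best.copy()
            trial[i] += delta
            ftrial = f(trial)
            if ftrial < fbest:
                best, fbest = trial, ftrial
                break
    return best, fbest

def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-8, max_iter=50_000):
    base = np.asarray(x0, dtype=float)
    fbase = f(base)
    for _ in range(max_iter):
        point, fpoint = _explore(f, base, step)
        if fpoint < fbase:
            # Pattern move: jump past the improved point and explore again.
            pattern, fpattern = _explore(f, point + (point - base), step)
            base, fbase = (pattern, fpattern) if fpattern < fpoint else (point, fpoint)
        elif step > tol:
            step *= shrink           # no improvement: refine the mesh
        else:
            break
    return base, fbase

rosen = lambda v: (1.0 - v[0]) ** 2 + 100.0 * (v[1] - v[0] ** 2) ** 2
x_opt, f_opt = hooke_jeeves(rosen, [-1.2, 1.0])
print("minimum near", np.round(x_opt, 3), "objective", round(f_opt, 6))
```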

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  10. Unbiased estimation of the eyeball volume using the Cavalieri principle on computed tomography images.

    Science.gov (United States)

    Acer, Niyazi; Sahin, Bunyamin; Ucar, Tolga; Usanmaz, Mustafa

    2009-01-01

    The size of the eyeball has been the subject of a few studies. None of them used stereological methods to estimate the volume. In the current study, we estimated the volume of the eyeball in normal men and women using stereological methods. Eyeball volume (EV) was estimated using the Cavalieri principle as a combination of point-counting and planimetry techniques. We used computed tomography scans taken from 36 participants (15 men and 21 women) to estimate the EV. The mean (SD) EV values obtained by the planimetry method were 7.49 (0.79) and 7.06 (0.85) cm³ in men and women, respectively. By using the point-counting method, the mean (SD) values were 7.48 (0.85) and 7.21 (0.84) cm³ in men and women, respectively. There was no statistically significant difference between the findings from the 2 methods (P > 0.05). A weak correlation was found between the axial length of eyeball and the EV estimated by point counting and planimetry (P eyeball.
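
    The Cavalieri estimator used above multiplies the section spacing, the area represented by one grid point, and the total number of points counted over all sections. The sketch below applies that formula to hypothetical slice counts; the spacing, grid constant and counts are placeholders, not the study's data.

```python
# Hedged sketch of the Cavalieri point-counting estimator:
# V = d * (a/p) * sum(P_i), with d the section spacing, a/p the area per grid
# point, and P_i the points hitting the structure on section i (hypothetical).
def cavalieri_volume(section_counts, spacing_cm, area_per_point_cm2):
    return spacing_cm * area_per_point_cm2 * sum(section_counts)

counts = [4, 9, 14, 16, 15, 11, 6, 2]          # points over the eyeball per CT slice
volume = cavalieri_volume(counts, spacing_cm=0.3, area_per_point_cm2=0.33)
print(f"estimated eyeball volume ≈ {volume:.2f} cm^3")
```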

  11. Cerebral methodology based computing to estimate real phenomena from large-scale nuclear simulation

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2011-01-01

    Our final goal is to estimate real phenomena from large-scale nuclear simulations by using computing processes. Large-scale simulations mean that they include scale variety and physical complexity such that corresponding experiments and/or theories do not exist. In the nuclear field, it is indispensable to estimate real phenomena from simulations in order to improve the safety and security of nuclear power plants. Here, the analysis of uncertainty included in simulations is needed to reveal the sensitivity of uncertainty due to randomness, to reduce the uncertainty due to lack of knowledge, and to arrive at a degree of certainty through verification and validation (V and V) and uncertainty quantification (UQ) processes. To realize this, we propose 'Cerebral Methodology based Computing (CMC)' as computing processes with deductive and inductive approaches, by reference to human reasoning processes. Our idea is to execute deductive and inductive simulations contrasted with deductive and inductive approaches. We have established its prototype system and applied it to a thermal displacement analysis of a nuclear power plant. The result shows that our idea is effective in reducing the uncertainty and obtaining a degree of certainty. (author)

  12. GLODEP2: a computer model for estimating gamma dose due to worldwide fallout of radioactive debris

    International Nuclear Information System (INIS)

    Edwards, L.L.; Harvey, T.F.; Peterson, K.R.

    1984-03-01

    The GLODEP2 computer code provides estimates of the surface deposition of worldwide radioactivity and the gamma-ray dose to man from intermediate and long-term fallout. The code is based on empirical models derived primarily from injection-deposition experience gained from the US and USSR nuclear tests in 1958. Under the assumption that a nuclear power facility is destroyed and that its debris behaves in the same manner as the radioactive cloud produced by the nuclear weapon that attacked the facility, predictions are made for the gamma dose from this source of radioactivity. As a comparison study, the gamma dose due to the atmospheric nuclear tests of the period 1951 to 1962 has been computed. The computed and measured values from Grove, UK and Chiba, Japan agree to within a few percent. The global deposition of radioactivity and resultant gamma dose from a hypothetical strategic nuclear exchange between the US and the USSR is reported. Of the assumed 5300 Mton in the exchange, 2031 Mton of radioactive debris is injected into the atmosphere. The highest estimated average whole body total integrated dose over 50 years (assuming no reduction by sheltering or weathering) is 23 rem in the 30 to 50 degree latitude band. If the attack included a 100 GW(e) nuclear power industry as targets in the US, this dose is increased to 84.6 rem. Hotspots due to rainfall could increase these values by factors of 10 to 50

  13. Tridimensional modelling and resource estimation of the mining waste piles of São Domingos mine, Iberian Pyrite Belt, Portugal

    Science.gov (United States)

    Vieira, Alexandre; Matos, João; Lopes, Luis; Martins, Ruben

    2016-04-01

    Located in the Iberian Pyrite Belt (IPB) northern sector, near the Portuguese/Spanish border, the outcropping São Domingos deposit has been mined since Roman times. Between 1854 and 1966 the Mason & Barry Company developed open pit excavation to a depth of 120 m and underground mining to a depth of 420 m. The São Domingos subvertical deposit is associated with felsic volcanics and black shales of the IPB Volcano-Sedimentary Complex and is represented by massive sulphide and stockwork ore (py, cpy, sph, ga, tt, aspy) and related supergene enrichment ore (hematite gossan and covellite/chalcocite). Different mine waste classes were mapped around the old open pit: gossan (W1), felsic volcanic and shales (W2), shales (W3) and mining waste landfill (W4). Using the LNEG (Portuguese Geological Survey) CONASA database (company historical mining waste characterization based on 162 shafts and 160 reverse circulation boreholes), a methodology for tridimensional modelling of the mining waste piles was followed, and a new mining waste resource is presented. Considering some constraints to waste removal, such as the proximity of the wastes to the Mina de São Domingos village and the industrial and archaeological heritage (e.g., mining infrastructures, Roman galleries), different resource scenarios were considered: unconditioned resources (total estimates) and conditioned resources (only the volumes without removal constraints considered). Using block modelling (SURPAC software), an inferred mineral resource of 2.38 Mt @ 0.77 g/t Au and 8.26 g/t Ag is estimated in unconditioned volumes of waste. Considering all evaluated wastes, including village areas, an inferred resource of 4.0 Mt @ 0.64 g/t Au and 7.30 g/t Ag is presented, corresponding to a total metal content of 82,878 oz t Au and 955,753 oz t Ag. Keywords: São Domingos mine, mining waste resources, mining waste pile modelling, Iberian Pyrite Belt, Portugal
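
    The contained-metal figures quoted above follow from tonnage times grade, converted to troy ounces. The short check below reproduces that arithmetic for the 4.0 Mt inferred resource; small differences from the published ounce totals reflect the rounding of the grades and tonnage quoted in the abstract.

```python
# Hedged check: contained metal (g) = tonnage (t) * grade (g/t); troy oz = g / 31.1035.
TROY_OUNCE_G = 31.1035

def contained_troy_oz(tonnage_mt, grade_g_per_t):
    return tonnage_mt * 1e6 * grade_g_per_t / TROY_OUNCE_G

au_oz = contained_troy_oz(4.0, 0.64)
ag_oz = contained_troy_oz(4.0, 7.30)
print(f"Au ≈ {au_oz:,.0f} oz t, Ag ≈ {ag_oz:,.0f} oz t")
# Close to the published 82,878 oz t Au and 955,753 oz t Ag; the residual
# difference comes from the rounded grades/tonnage given in the abstract.
```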

  14. Computational cost estimates for parallel shared memory isogeometric multi-frontal solvers

    KAUST Repository

    Woźniak, Maciej

    2014-06-01

    In this paper we present computational cost estimates for parallel shared memory isogeometric multi-frontal solvers. The estimates show that the ideal isogeometric shared memory parallel direct solver scales as O(p^2 log(N/p)) for one dimensional problems, O(N p^2) for two dimensional problems, and O(N^(4/3) p^2) for three dimensional problems, where N is the number of degrees of freedom and p is the polynomial order of approximation. The computational costs of the shared memory parallel isogeometric direct solver are compared with those of the sequential isogeometric direct solver, the latter being equal to O(N p^2) for the one dimensional case, O(N^1.5 p^3) for the two dimensional case, and O(N^2 p^3) for the three dimensional case. The shared memory version significantly reduces the computational cost in terms of both N and p. Theoretical estimates are compared with numerical experiments performed with linear, quadratic, cubic, quartic, and quintic B-splines, in one and two spatial dimensions. © 2014 Elsevier Ltd. All rights reserved.
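
    The estimates above can be compared directly by evaluating the quoted asymptotic costs. The sketch below tabulates the ratio of the sequential to the ideal parallel cost for the two-dimensional case; it uses the leading-order terms only, ignores constants, and the chosen N and p values are arbitrary.

```python
# Hedged sketch: comparing the quoted 2-D asymptotic costs, sequential
# O(N^1.5 p^3) versus ideal shared-memory parallel O(N p^2), constants ignored.
def seq_cost_2d(N, p):
    return N**1.5 * p**3

def par_cost_2d(N, p):
    return N * p**2

N = 10**6                                  # arbitrary problem size
for p in (1, 2, 3, 4, 5):
    ratio = seq_cost_2d(N, p) / par_cost_2d(N, p)
    print(f"p={p}: sequential/parallel cost ratio ≈ {ratio:.0f}")
```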

  15. Reciprocal Estimation of Pedestrian Location and Motion State toward a Smartphone Geo-Context Computing Solution

    Directory of Open Access Journals (Sweden)

    Jingbin Liu

    2015-06-01

    Full Text Available The rapid advance in mobile communications has made information and services ubiquitously accessible. Location and context information have become essential for the effectiveness of services in the era of mobility. This paper proposes the concept of geo-context that is defined as an integral synthesis of geographical location, human motion state and mobility context. A geo-context computing solution consists of a positioning engine, a motion state recognition engine, and a context inference component. In the geo-context concept, the human motion states and mobility context are associated with the geographical location where they occur. A hybrid geo-context computing solution is implemented that runs on a smartphone, and it utilizes measurements of multiple sensors and signals of opportunity that are available within a smartphone. Pedestrian location and motion states are estimated jointly under the framework of hidden Markov models, and they are used in a reciprocal manner to improve their estimation performance of one another. It is demonstrated that pedestrian location estimation has better accuracy when its motion state is known, and in turn, the performance of motion state recognition can be improved with increasing reliability when the location is given. The geo-context inference is implemented simply with the expert system principle, and more sophisticated approaches will be developed.

  16. Computational cost estimates for parallel shared memory isogeometric multi-frontal solvers

    KAUST Repository

    Woźniak, Maciej; Kuźnik, Krzysztof M.; Paszyński, Maciej R.; Calo, Victor M.; Pardo, D.

    2014-01-01

    In this paper we present computational cost estimates for parallel shared memory isogeometric multi-frontal solvers. The estimates show that the ideal isogeometric shared memory parallel direct solver scales as O(p^2 log(N/p)) for one dimensional problems, O(N p^2) for two dimensional problems, and O(N^(4/3) p^2) for three dimensional problems, where N is the number of degrees of freedom and p is the polynomial order of approximation. The computational costs of the shared memory parallel isogeometric direct solver are compared with those of the sequential isogeometric direct solver, the latter being equal to O(N p^2) for the one dimensional case, O(N^1.5 p^3) for the two dimensional case, and O(N^2 p^3) for the three dimensional case. The shared memory version significantly reduces the computational cost in terms of both N and p. Theoretical estimates are compared with numerical experiments performed with linear, quadratic, cubic, quartic, and quintic B-splines, in one and two spatial dimensions. © 2014 Elsevier Ltd. All rights reserved.

  17. Preoperative computed tomography volumetry and graft weight estimation in adult living donor liver transplantation

    Energy Technology Data Exchange (ETDEWEB)

    Pinheiro, Rafael S.; Cruz Junior, Ruy J.; Andraus, Wellington; Ducatti, Liliana; Martino, Rodrigo B.; Nacif, Lucas S.; Rocha-Santos, Vinicius; Arantes, Rubens M.; D' Albuquerque, Luiz A.C., E-mail: rsnpinheiro@gmail.com [Universidade de Sao Paulo (USP), SP (Brazil). Dept. de Gastroenterologia. Div. de Transplante de Orgaos do Aparelho Digestivo; Lai, Quirino [Universidade de L' Aquila, San Salvatore Hospital (Italy); Ibuki, Felicia S.; Rocha, Manoel S. [Universidade de Sao Paulo (USP), SP (Brazil). Departamento de Radiologia

    2017-09-01

    Background: Computed tomography volumetry (CTV) is a useful tool for predicting graft weights (GW) for living donor liver transplantation (LDLT). Few studies have examined the correlation between CTV and GW in normal liver parenchyma. Aim: To analyze the correlation between CTV and GW in an adult LDLT population and provide a systematic review of the existing mathematical models to calculate partial liver graft weight. Methods: Between January 2009 and January 2013, 28 consecutive donors undergoing right hepatectomy for LDLT were retrospectively reviewed. All grafts were perfused with HTK solution. Graft volume was estimated by CTV and these values were compared to the actual graft weight, which was measured after liver harvesting and perfusion. Results: Median actual GW was 782.5 g, averaged 791.43±136 g and ranged from 520-1185 g. Median estimated graft volume was 927.5 ml, averaged 944.86±200.74 ml and ranged from 600-1477 ml. Linear regression of estimated graft volume and actual GW was significantly linear (GW = 0.82 × estimated graft volume, r² = 0.98, slope = 0.47, standard deviation of 0.024 and p<0.0001). Spearman linear correlation was 0.65 with a 95% CI of 0.45 – 0.99 (p<0.0001). Conclusion: The one-to-one rule did not apply in patients with normal liver parenchyma. A better estimation of graft weight could be reached by multiplying estimated graft volume by 0.82. (author)

  18. Preoperative computed tomography volumetry and graft weight estimation in adult living donor liver transplantation

    International Nuclear Information System (INIS)

    Pinheiro, Rafael S.; Cruz Junior, Ruy J.; Andraus, Wellington; Ducatti, Liliana; Martino, Rodrigo B.; Nacif, Lucas S.; Rocha-Santos, Vinicius; Arantes, Rubens M.; D'Albuquerque, Luiz A.C.; Ibuki, Felicia S.; Rocha, Manoel S.

    2017-01-01

    Background: Computed tomography volumetry (CTV) is a useful tool for predicting graft weights (GW) for living donor liver transplantation (LDLT). Few studies have examined the correlation between CTV and GW in normal liver parenchyma. Aim: To analyze the correlation between CTV and GW in an adult LDLT population and provide a systematic review of the existing mathematical models to calculate partial liver graft weight. Methods: Between January 2009 and January 2013, 28 consecutive donors undergoing right hepatectomy for LDLT were retrospectively reviewed. All grafts were perfused with HTK solution. Graft volume was estimated by CTV and these values were compared to the actual graft weight, which was measured after liver harvesting and perfusion. Results: Median actual GW was 782.5 g, averaged 791.43±136 g and ranged from 520-1185 g. Median estimated graft volume was 927.5 ml, averaged 944.86±200.74 ml and ranged from 600-1477 ml. Linear regression of estimated graft volume and actual GW was significantly linear (GW = 0.82 × estimated graft volume, r² = 0.98, slope = 0.47, standard deviation of 0.024 and p<0.0001). Spearman linear correlation was 0.65 with a 95% CI of 0.45 – 0.99 (p<0.0001). Conclusion: The one-to-one rule did not apply in patients with normal liver parenchyma. A better estimation of graft weight could be reached by multiplying estimated graft volume by 0.82. (author)

  19. PREOPERATIVE COMPUTED TOMOGRAPHY VOLUMETRY AND GRAFT WEIGHT ESTIMATION IN ADULT LIVING DONOR LIVER TRANSPLANTATION

    Science.gov (United States)

    PINHEIRO, Rafael S.; CRUZ-JR, Ruy J.; ANDRAUS, Wellington; DUCATTI, Liliana; MARTINO, Rodrigo B.; NACIF, Lucas S.; ROCHA-SANTOS, Vinicius; ARANTES, Rubens M; LAI, Quirino; IBUKI, Felicia S.; ROCHA, Manoel S.; D´ALBUQUERQUE, Luiz A. C.

    2017-01-01

    ABSTRACT Background: Computed tomography volumetry (CTV) is a useful tool for predicting graft weights (GW) for living donor liver transplantation (LDLT). Few studies have examined the correlation between CTV and GW in normal liver parenchyma. Aim: To analyze the correlation between CTV and GW in an adult LDLT population and provide a systematic review of the existing mathematical models to calculate partial liver graft weight. Methods: Between January 2009 and January 2013, 28 consecutive donors undergoing right hepatectomy for LDLT were retrospectively reviewed. All grafts were perfused with HTK solution. Graft volume was estimated by CTV and these values were compared to the actual graft weight, which was measured after liver harvesting and perfusion. Results: Median actual GW was 782.5 g, averaged 791.43±136 g and ranged from 520-1185 g. Median estimated graft volume was 927.5 ml, averaged 944.86±200.74 ml and ranged from 600-1477 ml. Linear regression of estimated graft volume and actual GW was significantly linear (GW = 0.82 × estimated graft volume, r² = 0.98, slope = 0.47, standard deviation of 0.024 and p<0.0001). Spearman linear correlation was 0.65 with a 95% CI of 0.45 - 0.99 (p<0.0001). Conclusion: The one-to-one rule did not apply in patients with normal liver parenchyma. A better estimation of graft weight could be reached by multiplying estimated graft volume by 0.82. PMID:28489167
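
    A minimal sketch of the regression reported in the three records above, predicting graft weight as 0.82 times the CT-estimated graft volume instead of the one-to-one rule; the example volumes are simply the range and median quoted in the abstract.

```python
# Hedged sketch: graft weight (g) ≈ 0.82 x CT-estimated graft volume (ml),
# the correction to the one-to-one rule reported above.
def graft_weight_from_ctv(volume_ml, slope=0.82):
    return slope * volume_ml

for vol in (600.0, 927.5, 1477.0):       # range and median quoted in the study
    print(f"CTV {vol:.0f} ml -> predicted graft weight ≈ {graft_weight_from_ctv(vol):.0f} g")
```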

  1. Estimate of the energy resources of the Aracatuba region, Sao Paulo, Brazil; Estimativa dos recursos energeticos da regiao de Aracatuba

    Energy Technology Data Exchange (ETDEWEB)

    Udaeta, Miguel Edgar Morales; Galvao, Luiz Claudio Ribeiro; Grimoni, Jose Aquiles Baesso; Souza, Carlos Antonio Farias de [Universidade de Sao Paulo (USP), SP (Brazil). Dept. de Engenharia de Energia e Automacao Eletricas. Grupo de Energia], e-mail: udaeta@pea.usp.br

    2004-07-01

    A complete assessment of energy-producing resources, as referred to here, considers the technical, economic, political and environmental aspects of these resources. The project developed presents a methodology for implementing this complete assessment, which consists of identifying and inventorying the technologies and resources available at the time in a specific geographical site and classifying them, in an indicative and computable manner, with respect to the mentioned aspects, taking sustainable development into consideration. This methodology was applied to the Administrative Region of Aracatuba, yielding as a result a ranking of alternatives. Solar collectors obtained the best evaluation. From this result it is possible to indicate, in a preliminary approach, the best investment option, in this case the solar collector. (author)

  2. Estimation of countries’ interdependence in plant genetic resources provisioning national food supplies and production systems

    OpenAIRE

    Khoury, C.K.; Achicanoy, H.A.; Bjorkman, A.D.; Navarro-Racines, C.; Guarino, L.; Flores-Palacios, X.; Engels, J.M.M.; Wiersema, J.H.; Dempewolf, H.; Ramirez-Villegas, J.; Castaneda-Alvarez, N.P.; Fowler, C.; Jarvis, A.; Rieseberg, L.H.; Struik, P.C.

    2015-01-01

    The Contracting Parties of the International Treaty recognize that plant genetic resources for food and agriculture are a common concern of all countries, in that all countries depend largely on plant genetic resources for food and agriculture that originated elsewhere. Nearly 20 years ago, initial research on interdependence among countries with respect to crop diversity provided information that helped countries establish the Treaty, and in particular its Multilateral System of Access and Benefit-s...

  3. Cross stratum resources protection in fog-computing-based radio over fiber networks for 5G services

    Science.gov (United States)

    Guo, Shaoyong; Shao, Sujie; Wang, Yao; Yang, Hui

    2017-09-01

    In order to meet the requirements of the internet of things (IoT) and 5G, the cloud radio access network is a paradigm which converges all base stations' computational resources into a cloud baseband unit (BBU) pool, while the distributed radio frequency signals are collected by remote radio heads (RRH). A precondition for centralized processing in the BBU pool is an interconnection fronthaul network with high capacity and low delay. However, the interaction between RRH and BBU and the resource scheduling among BBUs in the cloud have become more complex and frequent. A cloud radio over fiber network has already been proposed in our previous work. In order to overcome the complexity and latency, in this paper, we first present a novel cross stratum resources protection (CSRP) architecture in fog-computing-based radio over fiber networks (F-RoFN) for 5G services. Additionally, a cross stratum protection (CSP) scheme considering the network survivability is introduced in the proposed architecture. The CSRP with CSP scheme can effectively pull the remote processing resource locally to implement cooperative radio resource management, enhance the responsiveness and resilience to the dynamic end-to-end 5G service demands, and globally optimize optical network, wireless and fog resources. The feasibility and efficiency of the proposed architecture with the CSP scheme are verified on our software defined networking testbed in terms of service latency, transmission success rate, resource occupation rate and blocking probability.

  4. A mathematical method for verifying the validity of measured information about the flows of energy resources based on the state estimation theory

    Science.gov (United States)

    Pazderin, A. V.; Sof'in, V. V.; Samoylenko, V. O.

    2015-11-01

    Efforts aimed at improving energy efficiency in all branches of the fuel and energy complex should commence with setting up a high-tech automated system for monitoring and accounting for energy resources. Malfunctions and failures in the measurement and information parts of this system may distort commercial measurements of energy resources and lead to financial risks for power supplying organizations. In addition, measurement errors may be connected with intentional distortion of measurements for reducing payment for using energy resources on the consumer's side, which leads to commercial losses of energy resources. The article presents a universal mathematical method for verifying the validity of measurement information in networks for transporting energy resources, such as electricity and heat, petroleum, gas, etc., based on state estimation theory. The energy resource transportation network is represented by a graph whose nodes correspond to producers and consumers and whose branches stand for transportation mains (power lines, pipelines, and heat network elements). The main idea of state estimation is connected with obtaining the calculated analogs of energy resources for all available measurements. Unlike "raw" measurements, which contain inaccuracies, the calculated flows of energy resources, called estimates, will fully satisfy the suitability condition for all state equations describing the energy resource transportation network. The state equations written in terms of the calculated estimates will already be free from residuals. The difference between a measurement and its calculated analog (estimate) is called, in estimation theory, an estimation remainder. Large values of the estimation remainders are an indicator of large errors in particular energy resource measurements. By using the presented method it is possible to improve the validity of energy resource measurements, to estimate the transportation network observability, to eliminate
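
    A minimal sketch of the state-estimation idea described above, assuming a toy network of one supply branch feeding two offtakes. The metered flows are adjusted to the nearest values that satisfy the node balance exactly, and the differences between measurements and estimates play the role of the "estimation remainders"; the network and numbers are hypothetical.

```python
# Hedged sketch: equality-constrained least squares on a tiny transport network.
# Conservation at the single node requires x1 - x2 - x3 = 0.
import numpy as np

A = np.array([[1.0, -1.0, -1.0]])       # node-balance equation A @ x = 0
z = np.array([100.0, 60.0, 35.0])       # hypothetical metered flows (units arbitrary)

# Minimise ||x - z||^2 subject to A x = 0 (projection onto the balance constraint).
correction = A.T @ np.linalg.solve(A @ A.T, A @ z)
x_hat = z - correction                  # estimates satisfying the balance exactly
remainders = z - x_hat                  # the "estimation remainders" of the abstract
print("estimates :", np.round(x_hat, 2))
print("remainders:", np.round(remainders, 2))
```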

  5. Application of High Performance Computing to Earthquake Hazard and Disaster Estimation in Urban Area

    Directory of Open Access Journals (Sweden)

    Muneo Hori

    2018-02-01

    Full Text Available Integrated earthquake simulation (IES) is a seamless simulation that analyzes all processes of earthquake hazard and disaster. There are two difficulties in carrying out IES, namely, the requirement of large-scale computation and the requirement of numerous analysis models for structures in an urban area, and they are solved by taking advantage of high performance computing (HPC) and by developing a system of automated model construction. HPC is a key element in developing IES, as it needs to analyze wave propagation and amplification processes in an underground structure; a high-fidelity model of the underground structure has more than 100 billion degrees of freedom. Examples of IES for Tokyo Metropolis are presented; the numerical computation is made by using the K computer, the supercomputer of Japan. The estimation of earthquake hazard and disaster for a given earthquake scenario is made by the ground motion simulation and the urban area seismic response simulation, respectively, for the target area of 10,000 m × 10,000 m.

  6. Estimation of kinetic and thermodynamic ligand-binding parameters using computational strategies.

    Science.gov (United States)

    Deganutti, Giuseppe; Moro, Stefano

    2017-04-01

    Kinetic and thermodynamic ligand-protein binding parameters are gaining growing importance as key information to consider in drug discovery. The determination of molecular structures, particularly using x-ray and NMR techniques, is crucial for understanding how a ligand recognizes its target in the final binding complex. However, for a better understanding of the recognition processes, experimental studies of ligand-protein interactions are needed. Even though several techniques can be used to investigate both thermodynamic and kinetic profiles for a ligand-protein complex, these procedures are very often laborious, time consuming and expensive. In the last 10 years, computational approaches have shown enormous potential in providing insights into each of the above effects and in parsing their contributions to the changes in both kinetic and thermodynamic binding parameters. The main purpose of this review is to summarize the state of the art of computational strategies for estimating the kinetic and thermodynamic parameters of ligand-protein binding.

  7. Estimation of effectiveness of automatic exposure control in computed tomograph scanner

    International Nuclear Information System (INIS)

    Akhilesh, Philomina; Sharma, S.D.; Datta, D.; Kulkarni, Arti

    2018-01-01

    With the advent of multiple detector array technology, the use of Computed Tomography (CT) scanning has increased tremendously. Computed Tomography examinations deliver relatively high radiation doses to patients in comparison with conventional radiography. It is therefore required to reduce the dose delivered in CT scans without compromising the image quality. Several parameters like applied potential, tube current, scan length, pitch etc. influence the dose delivered in CT scans. For optimization of patient dose and image quality, all modern CT scanners are enabled with Automatic Exposure Control (AEC) systems. The aim of this work is to compare the dose delivered during CT scans performed with and without AEC in order to estimate the effectiveness of AEC techniques used in CT scanners of various manufacturers

  8. Complex data modeling and computationally intensive methods for estimation and prediction

    CERN Document Server

    Secchi, Piercesare; Advances in Complex Data Modeling and Computational Methods in Statistics

    2015-01-01

    The book is addressed to statisticians working at the forefront of the statistical analysis of complex and high dimensional data and offers a wide variety of statistical models, computer intensive methods and applications: network inference from the analysis of high dimensional data; new developments for bootstrapping complex data; regression analysis for measuring the downsize reputational risk; statistical methods for research on the human genome dynamics; inference in non-euclidean settings and for shape data; Bayesian methods for reliability and the analysis of complex data; methodological issues in using administrative data for clinical and epidemiological research; regression models with differential regularization; geostatistical methods for mobility analysis through mobile phone data exploration. This volume is the result of a careful selection among the contributions presented at the conference "S.Co.2013: Complex data modeling and computationally intensive methods for estimation and prediction" held...

  9. A computer-assisted procedure for estimating patient exposure and fetal dose in radiographic examinations

    International Nuclear Information System (INIS)

    Glaze, S.; Schneiders, N.; Bushong, S.C.

    1982-01-01

    A computer program for calculating patient entrance exposure and fetal dose for 11 common radiographic examinations was developed. The output intensity measured at 70 kVp and a 30-inch (76-cm) source-to-skin distance was entered into the program. The change in output intensity with changing kVp was examined for 17 single-phase and 12 three-phase x-ray units. The relationships obtained from a least squares regression analysis of the data, along with the technique factors for each examination, were used to calculate patient exposure. Fetal dose was estimated using published fetal dose in mrad (10⁻⁵ Gy) per 1,000 mR (258 μC/kg) entrance exposure values. The computations are fully automated and individualized to each radiographic unit. The information provides a ready reference in large institutions and is particularly useful at smaller facilities that do not have available physicists who can make the calculations immediately

  10. A computer-assisted procedure for estimating patient exposure and fetal dose in radiographic examinations

    International Nuclear Information System (INIS)

    Glaze, S.; Schneiders, N.; Bushong, S.C.

    1982-01-01

    A computer program for calculating patient entrance exposure and fetal dose for 11 common radiographic examinations was developed. The output intensity measured at 70 kVp and a 30-inch (76-cm) source-to-skin distance was entered into the program. The change in output intensity with changing kVp was examined for 17 single-phase and 12 three-phase x-ray units. The relationships obtained from a least squares regression analysis of the data, along with the technique factors for each examination, were used to calculate patient exposure. Fetal dose was estimated using published fetal dose in mrad (10⁻⁵ Gy) per 1,000 mR (258 μC/kg) entrance exposure values. The computations are fully automated and individualized to each radiographic unit. The information provides a ready reference in large institutions and is particularly useful at smaller facilities that do not have available physicians who can make the calculations immediately
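
    The fetal-dose step described in the two records above multiplies the computed entrance exposure by a published conversion factor expressed in mrad per 1,000 mR. The sketch below shows that arithmetic with placeholder values; neither the exposure nor the conversion factor is taken from the paper.

```python
# Hedged sketch: fetal dose from entrance exposure using a published conversion
# factor in mrad per 1,000 mR.  All numbers below are hypothetical placeholders.
def fetal_dose_mrad(entrance_exposure_mR, mrad_per_1000mR):
    return entrance_exposure_mR * mrad_per_1000mR / 1000.0

exposure = 250.0      # mR at the skin entrance for one view (hypothetical)
conversion = 300.0    # mrad fetal dose per 1,000 mR entrance exposure (hypothetical)
print(f"estimated fetal dose ≈ {fetal_dose_mrad(exposure, conversion):.0f} mrad")
```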

  11. BAESNUM, a conversational computer program for the Bayesian estimation of a parameter by a numerical method

    International Nuclear Information System (INIS)

    Colombo, A.G.; Jaarsma, R.J.

    1982-01-01

    This report describes a conversational computer program which, via Bayes' theorem, numerically combines the prior distribution of a parameter with a likelihood function. Any type of prior and likelihood function can be considered. The present version of the program includes six types of prior and employs the binomial likelihood. As input the program requires the law and parameters of the prior distribution and the sample data. As output it gives the posterior distribution as a histogram. The use of the program for estimating the constant failure rate of an item is briefly described
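
    A minimal sketch of the numerical Bayesian update such a program performs, assuming a flat prior on a discretized failure-probability grid and a binomial likelihood; the grid, prior choice and sample data are illustrative, not BAESNUM's actual input options.

```python
# Hedged sketch: numerical Bayes update of a failure probability on a grid,
# prior x binomial likelihood, renormalised to a posterior histogram.
import numpy as np
from scipy.stats import binom

p = np.linspace(0.001, 0.5, 500)              # grid for the failure probability
prior = np.ones_like(p)                       # flat prior (one of several possible choices)
prior /= prior.sum()

failures, demands = 2, 100                    # hypothetical sample data
likelihood = binom.pmf(failures, demands, p)

posterior = prior * likelihood
posterior /= posterior.sum()                  # posterior histogram over the grid

print(f"posterior mean failure probability ≈ {np.sum(p * posterior):.4f}")
```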

  12. EFFAIR: a computer program for estimating the dispersion of atmospheric emissions from a nuclear site

    International Nuclear Information System (INIS)

    Dormuth, K.W.; Lyon, R.B.

    1978-11-01

    Analysis of the transport of material through the turbulent atmospheric boundary layer is an important part of environmental impact assessments for nuclear plants. Although this is a complex phenomenon, practical estimates of ground level concentrations downwind of release are usually obtained using a simple Gaussian formula whose coefficients are obtained from empirical correlations. Based on this formula, the computer program EFFAIR has been written to provide a flexible tool for atmospheric dispersion calculations. It is considered appropriate for calculating dilution factors at distances of 10² to 10⁴ metres from an effluent source if reflection from the inversion lid is negligible in that range. (author)
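
    A minimal sketch of the Gaussian formula that codes of this kind are built on: the ground-level, plume-centreline concentration with reflection at the ground. The dispersion coefficients would normally come from empirical stability-class correlations; the fixed values below, and the other inputs, are assumptions for illustration and are not EFFAIR's coefficients.

```python
# Hedged sketch: ground-level, centreline Gaussian plume concentration with
# total reflection at the ground:  C = Q / (pi * u * sy * sz) * exp(-H^2 / (2 sz^2)).
import math

def ground_level_concentration(Q, u, sigma_y, sigma_z, H):
    return Q / (math.pi * u * sigma_y * sigma_z) * math.exp(-H**2 / (2.0 * sigma_z**2))

C = ground_level_concentration(Q=1.0,                    # unit release rate
                               u=3.0,                    # wind speed, m/s
                               sigma_y=80.0, sigma_z=40.0,  # dispersion coefficients, m (assumed)
                               H=30.0)                   # effective release height, m
print(f"dilution factor ≈ {C:.2e} s/m^3 per unit release rate")
```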

  13. Survey of engineering computational methods and experimental programs for estimating supersonic missile aerodynamic characteristics

    Science.gov (United States)

    Sawyer, W. C.; Allen, J. M.; Hernandez, G.; Dillenius, M. F. E.; Hemsch, M. J.

    1982-01-01

    This paper presents a survey of engineering computational methods and experimental programs used for estimating the aerodynamic characteristics of missile configurations. Emphasis is placed on those methods which are suitable for preliminary design of conventional and advanced concepts. An analysis of the technical approaches of the various methods is made in order to assess their suitability to estimate longitudinal and/or lateral-directional characteristics for different classes of missile configurations. Some comparisons between the predicted characteristics and experimental data are presented. These comparisons are made for a large variation in flow conditions and model attitude parameters. The paper also presents known experimental research programs developed for the specific purpose of validating analytical methods and extending the capability of data-base programs.

  14. Supplemental computational phantoms to estimate out-of-field absorbed dose in photon radiotherapy

    Science.gov (United States)

    Gallagher, Kyle J.; Tannous, Jaad; Nabha, Racile; Feghali, Joelle Ann; Ayoub, Zeina; Jalbout, Wassim; Youssef, Bassem; Taddei, Phillip J.

    2018-01-01

    The purpose of this study was to develop a straightforward method of supplementing patient anatomy and estimating out-of-field absorbed dose for a cohort of pediatric radiotherapy patients with limited recorded anatomy. A cohort of nine children, aged 2-14 years, who received 3D conformal radiotherapy for low-grade localized brain tumors (LBTs), were randomly selected for this study. The extent of these patients’ computed tomography simulation image sets was cranial only. To approximate their missing anatomy, we supplemented the LBT patients’ image sets with computed tomography images of patients in a previous study with larger extents of matched sex, height, and mass and for whom contours of organs at risk for radiogenic cancer had already been delineated. Rigid fusion was performed between the LBT patients’ data and that of the supplemental computational phantoms using commercial software and in-house codes. In-field dose was calculated with a clinically commissioned treatment planning system, and out-of-field dose was estimated with a previously developed analytical model that was re-fit with parameters based on new measurements for intracranial radiotherapy. Mean doses greater than 1 Gy were found in the red bone marrow, remainder, thyroid, and skin of the patients in this study. Mean organ doses between 150 mGy and 1 Gy were observed in the breast tissue of the girls and lungs of all patients. Distant organs, i.e. prostate, bladder, uterus, and colon, received mean organ doses less than 150 mGy. The mean organ doses of the younger, smaller LBT patients (0-4 years old) were a factor of 2.4 greater than those of the older, larger patients (8-12 years old). Our findings demonstrated the feasibility of a straightforward method of applying supplemental computational phantoms and dose-calculation models to estimate absorbed dose for a set of children of various ages who received radiotherapy and for whom anatomies were largely missing in their original

  15. A user's manual of Tools for Error Estimation of Complex Number Matrix Computation (Ver.1.0)

    International Nuclear Information System (INIS)

    Ichihara, Kiyoshi.

    1997-03-01

    'Tools for Error Estimation of Complex Number Matrix Computation' is a subroutine library which aids users in obtaining the error ranges of the solutions of complex number linear systems or of the eigenvalues of Hermitian matrices. This library contains routines for both sequential computers and parallel computers. The subroutines for linear system error estimation calculate norms of residual vectors, matrices' condition numbers, error bounds of solutions and so on. The error estimation subroutines for Hermitian matrix eigenvalues derive the error ranges of the eigenvalues according to the Korn-Kato formula. This user's manual contains a brief mathematical background of error analysis on linear algebra and usage of the subroutines. (author)
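
    A minimal sketch of the kind of a posteriori check such a library performs for a complex linear system: the residual norm combined with the condition number gives the standard relative error bound ||x - x_hat||/||x_hat|| <= cond(A)·||r||/(||A||·||x_hat||). The random test matrix is an assumption, and the bound shown is textbook linear algebra rather than the library's specific routine.

```python
# Hedged sketch: residual norm and condition-number error bound for A x = b
# with a random complex test matrix.
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(5, 5)) + 1j * rng.normal(size=(5, 5))
b = rng.normal(size=5) + 1j * rng.normal(size=5)

x_hat = np.linalg.solve(A, b)
r = b - A @ x_hat                                   # residual vector
bound = (np.linalg.cond(A) * np.linalg.norm(r)
         / (np.linalg.norm(A, 2) * np.linalg.norm(x_hat)))
print(f"residual norm = {np.linalg.norm(r):.2e}, relative error bound = {bound:.2e}")
```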

  16. Discharge Fee Policy Analysis: A Computable General Equilibrium (CGE) Model of Water Resources and Water Environments

    Directory of Open Access Journals (Sweden)

    Guohua Fang

    2016-09-01

    Full Text Available To alleviate increasingly serious water pollution and shortages in developing countries, various kinds of policies have been implemented by local governments. It is vital to quantify and evaluate the performance and potential economic impacts of these policies. This study develops a Computable General Equilibrium (CGE) model to simulate the regional economic and environmental effects of discharge fees. Firstly, water resources and water environment factors are separated from the input and output sources of the National Economic Production Department. Secondly, an extended Social Accounting Matrix (SAM) of Jiangsu province is developed to simulate various scenarios. By changing values of the discharge fees (increased by 50%, 100% and 150%), three scenarios are simulated to examine their influence on the overall economy and each industry. The simulation results show that an increased fee will have a negative impact on Gross Domestic Product (GDP). However, waste water may be effectively controlled. Also, this study demonstrates that along with the economic costs, the increase of the discharge fee will lead to the upgrading of industrial structures from a situation of heavy pollution to one of light pollution, which is beneficial to the sustainable development of the economy and the protection of the environment.

  17. Disposal of waste computer hard disk drive: data destruction and resources recycling.

    Science.gov (United States)

    Yan, Guoqing; Xue, Mianqiang; Xu, Zhenming

    2013-06-01

    An increasing quantity of discarded computers is accompanied by a sharp increase in the number of hard disk drives to be eliminated. A waste hard disk drive is a special form of waste electrical and electronic equipment because it holds large amounts of information that is closely connected with its user. Therefore, the treatment of waste hard disk drives is an urgent issue in terms of data security, environmental protection and sustainable development. In the present study the degaussing method was adopted to destroy the residual data on the waste hard disk drives, and the housing of the disks was used as an example to explore the coating removal process, which is the most important pretreatment for aluminium alloy recycling. The key operating points determined for degaussing were: (1) keep the platter parallel to the magnetic field direction; and (2) increasing the magnetic field intensity B and the action time t leads to a significant improvement in the degaussing effect. The coating removal experiment indicated that heating the waste hard disk drive housing at a temperature of 400 °C for 24 min was the optimum condition. A novel integrated technique for the treatment of waste hard disk drives is proposed herein. This technique offers the possibility of destroying residual data, recycling the recovered resources and disposing of the disks in an environmentally friendly manner.

  18. AN ESTIMATION OF HISTORICAL-CULTURAL RESOURCES OF THE TURKIVSKY DISTRICT FOR THE NEEDS OF ETHNIC TOURISM.

    OpenAIRE

    Безручко, Л.С.

    2016-01-01

    In the article the features of estimation of historical-cultural resources are considered for the needs of ethnic tourism. The list of objects that can be used as resources in ethnic tourism is distinguished. In particular, the objects of Jewish heritage (synagogue, Jewish burial places), material objects that remained from the German colonists (two churches), are studied, and also the material and non-material culture of the Boyko ethnos (churches, buildings, traditions, museums) is studied. The compres...

  19. Increasing efficiency of job execution with resource co-allocation in distributed computer systems

    OpenAIRE

    Cankar, Matija

    2014-01-01

    The field of distributed computer systems, while not new in computer science, is still the subject of a lot of interest in both industry and academia. More powerful computers, faster and more ubiquitous networks, and complex distributed applications are accelerating the growth of distributed computing. Large numbers of computers interconnected in a single network provide additional computing power to users whenever required. Such systems are, however, expensive and complex to manage, which ca...

  20. Areal rainfall estimation using moving cars - computer experiments including hydrological modeling

    Science.gov (United States)

    Rabiei, Ehsan; Haberlandt, Uwe; Sester, Monika; Fitzner, Daniel; Wallner, Markus

    2016-09-01

    The need for high temporal and spatial resolution precipitation data for hydrological analyses has been discussed in several studies. Although rain gauges provide valuable information, a very dense rain gauge network is costly. As a result, several new ideas have emerged to help estimate areal rainfall with higher temporal and spatial resolution. Rabiei et al. (2013) observed that moving cars, called RainCars (RCs), can potentially be a new source of data for measuring rain rate. The optical sensors used in that study are designed for operating the windscreen wipers and showed promising results for rainfall measurement purposes. Their measurement accuracy has been quantified in laboratory experiments. Explicitly considering those errors, the main objective of this study is to investigate the benefit of using RCs for estimating areal rainfall. For that, computer experiments are carried out, where radar rainfall is considered as the reference and the other sources of data, i.e., RCs and rain gauges, are extracted from radar data. Comparing the quality of areal rainfall estimation by RCs with rain gauges and reference data helps to investigate the benefit of the RCs. The value of this additional source of data is not only assessed for areal rainfall estimation performance but also for use in hydrological modeling. Considering measurement errors derived from laboratory experiments, the result shows that the RCs provide useful additional information for areal rainfall estimation as well as for hydrological modeling. Moreover, when larger uncertainties were tested for the RCs, they were observed to be useful up to a certain level for areal rainfall estimation and discharge simulation.

  1. Estimation Methods of the Point Spread Function Axial Position: A Comparative Computational Study

    Directory of Open Access Journals (Sweden)

    Javier Eduardo Diaz Zamboni

    2017-01-01

    Full Text Available The precise knowledge of the point spread function is central for any imaging system characterization. In fluorescence microscopy, point spread function (PSF determination has become a common and obligatory task for each new experimental device, mainly due to its strong dependence on acquisition conditions. During the last decade, algorithms have been developed for the precise calculation of the PSF, which fit model parameters that describe image formation on the microscope to experimental data. In order to contribute to this subject, a comparative study of three parameter estimation methods is reported, namely: I-divergence minimization (MIDIV, maximum likelihood (ML and non-linear least square (LSQR. They were applied to the estimation of the point source position on the optical axis, using a physical model. Methods’ performance was evaluated under different conditions and noise levels using synthetic images and considering success percentage, iteration number, computation time, accuracy and precision. The main results showed that the axial position estimation requires a high SNR to achieve an acceptable success level and higher still to be close to the estimation error lower bound. ML achieved a higher success percentage at lower SNR compared to MIDIV and LSQR with an intrinsic noise source. Only the ML and MIDIV methods achieved the error lower bound, but only with data belonging to the optical axis and high SNR. Extrinsic noise sources worsened the success percentage, but no difference was found between noise sources for the same method for all methods studied.

  2. Application analysis of Monte Carlo to estimate the capacity of geothermal resources in Lawu Mount

    Energy Technology Data Exchange (ETDEWEB)

    Supriyadi, E-mail: supriyadi-uno@yahoo.co.nz [Physics, Faculty of Mathematics and Natural Sciences, University of Jember, Jl. Kalimantan Kampus Bumi Tegal Boto, Jember 68181 (Indonesia); Srigutomo, Wahyu [Complex system and earth physics, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung, Jl. Ganesha 10, Bandung 40132 (Indonesia); Munandar, Arif [Kelompok Program Penelitian Panas Bumi, PSDG, Badan Geologi, Kementrian ESDM, Jl. Soekarno Hatta No. 444 Bandung 40254 (Indonesia)

    2014-03-24

    Monte Carlo analysis has been applied to the calculation of geothermal resource capacity based on the volumetric method issued by Standar Nasional Indonesia (SNI). A deterministic formula is converted into a stochastic formula to take into account the nature of uncertainties in the input parameters. The method yields a probability distribution of the potential power stored beneath the Lawu Mount geothermal area. For 10,000 iterations, the capacity of geothermal resources is in the range of 139.30-218.24 MWe, with the most likely value being 177.77 MWe. The risk of resource capacity above 196.19 MWe is less than 10%. The power density of the prospect area covering 17 km² is 9.41 MWe/km² with probability 80%.
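
    A minimal sketch of a Monte Carlo volumetric ("heat in place") estimate in the spirit of the SNI method cited above: reservoir parameters are sampled from assumed ranges and a simplified stored-heat-to-power conversion is applied. All ranges, factors and the simplified formula are illustrative assumptions, not the study's inputs.

```python
# Hedged sketch: Monte Carlo volumetric geothermal capacity estimate with
# illustrative parameter ranges and a simplified heat-to-power conversion.
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

area_km2 = rng.uniform(10.0, 20.0, n)              # prospect area
thick_m  = rng.uniform(1000.0, 2000.0, n)          # reservoir thickness
dT       = rng.uniform(120.0, 180.0, n)            # reservoir minus cut-off temperature, K
rho_c      = 2.7e6                                 # volumetric heat capacity, J/(m^3 K)
recovery   = 0.10                                  # recovery factor (assumed)
efficiency = 0.10                                  # conversion efficiency (assumed)
life_s     = 30 * 365.25 * 24 * 3600               # 30-year plant life, s

heat_J = area_km2 * 1e6 * thick_m * rho_c * dT
power_MWe = heat_J * recovery * efficiency / life_s / 1e6

p10, p50, p90 = np.percentile(power_MWe, [10, 50, 90])
print(f"P10 = {p10:.0f} MWe, median ≈ {p50:.0f} MWe, P90 = {p90:.0f} MWe")
```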

  3. The Development of an Individualized Instructional Program in Beginning College Mathematics Utilizing Computer Based Resource Units. Final Report.

    Science.gov (United States)

    Rockhill, Theron D.

    Reported is an attempt to develop and evaluate an individualized instructional program in pre-calculus college mathematics. Four computer based resource units were developed in the areas of set theory, relations and functions, algebra, trigonometry, and analytic geometry. Objectives were determined by experienced calculus teachers, and…

  4. Research on the digital education resources of sharing pattern in independent colleges based on cloud computing environment

    Science.gov (United States)

    Xiong, Ting; He, Zhiwen

    2017-06-01

    Cloud computing was first proposed by Google in the United States; it is based on Internet data centers and provides a standard, open approach to shared network services. With the rapid development of higher education in China, the educational resources provided by colleges and universities fall far short of actual teaching needs. Therefore, cloud computing, which uses Internet technology to provide sharing services, has, like timely rain, become an important means of sharing digital education applications in current higher education. Based on the cloud computing environment, the paper analyzes the existing problems in the sharing of digital educational resources among the independent colleges of Jiangxi Province. Drawing on the mass storage, efficient operation and low investment that characterize cloud computing, the author explores the design of a sharing model for the digital educational resources of higher education in independent colleges. Finally, the design of the sharing model is put into practical application.

  5. A neural computational model for animal's time-to-collision estimation.

    Science.gov (United States)

    Wang, Ling; Yao, Dezhong

    2013-04-17

    The time-to-collision (TTC) is the time elapsed before a looming object hits the subject. An accurate estimation of TTC plays a critical role in the survival of animals in nature and acts as an important factor in artificial intelligence systems that depend on judging and avoiding potential dangers. The theoretic formula for TTC is 1/τ≈θ'/sin θ, where θ and θ' are the visual angle and its variation, respectively, and the widely used approximation computational model is θ'/θ. However, both of these measures are too complex to be implemented by a biological neuronal model. We propose a new simple computational model: 1/τ≈Mθ-P/(θ+Q)+N, where M, P, Q, and N are constants that depend on a predefined visual angle. This model, weighted summation of visual angle model (WSVAM), can achieve perfect implementation through a widely accepted biological neuronal model. WSVAM has additional merits, including a natural minimum consumption and simplicity. Thus, it yields a precise and neuronal-implemented estimation for TTC, which provides a simple and convenient implementation for artificial vision, and represents a potential visual brain mechanism.
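
    A short comparison of the two expressions quoted above, the theoretical 1/tau = theta'/sin(theta) and the common approximation theta'/theta; the expansion rate used is an arbitrary assumption. The two agree for small visual angles and diverge as the angle grows.

```python
# Hedged sketch: TTC from the theoretical formula versus the common
# approximation, for a few visual angles (radians) and an assumed expansion rate.
import math

def ttc_exact(theta, theta_dot):
    return math.sin(theta) / theta_dot

def ttc_approx(theta, theta_dot):
    return theta / theta_dot

theta_dot = 0.05                                  # rad/s, hypothetical expansion rate
for theta_deg in (2.0, 10.0, 30.0):
    theta = math.radians(theta_deg)
    print(f"theta = {theta_deg:4.1f} deg:  exact TTC = {ttc_exact(theta, theta_dot):6.2f} s,"
          f"  approx = {ttc_approx(theta, theta_dot):6.2f} s")
```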

  6. Estimation of forest resources from a country wide laser scanning survey and national forest inventory data

    DEFF Research Database (Denmark)

    Nord-Larsen, Thomas; Schumacher, Johannes

    2012-01-01

    Airborne laser scanning may provide a means for assessing local forest biomass resources. In this study, national forest inventory (NFI) data was used as reference data for modeling forest basal area, volume, aboveground biomass, and total biomass from laser scanning data obtained in a countrywid...

  7. Recent revisions of phosphate rock reserves and resources: reassuring or misleading? An in-depth literature review of global estimates of phosphate rock reserves and resources

    Science.gov (United States)

    Edixhoven, J. D.; Gupta, J.; Savenije, H. H. G.

    2013-09-01

    Phosphate rock (PR) is a finite mineral indispensable for fertilizer production and a major pollutant. High grade PR is obtained from deposits which took millions of years to form and are gradually being depleted. Over the past three years, global PR reserves as reported by the US Geological Survey (USGS) have seen a massive increase, from 16 000 Mt PR in 2010 to 65 000 Mt PR in 2011. The bulk of this four-fold increase is based on a 2010 report by the International Fertilizer Development Center (IFDC), which increased Moroccan reserves from 5700 Mt PR as reported by USGS, to 51 000 Mt PR, reported as upgraded ("beneficiated") concentrate. IFDC used a starkly simplified classification compared to the classification used by USGS and proposed that agreement should be reached on PR resource terminology which should be as simple as possible. The report has profoundly influenced the PR scarcity debate, shifting the emphasis from depletion to the pollution angle of the phosphate problem. Various analysts adopted the findings of IFDC and USGS, and argued that following depletion of reserves, uneconomic deposits (resources and occurrences) will remain available, which will extend the lifetime of available deposits to thousands of years. Given the near total dependence of food production on PR, data on PR deposits must be transparent, comparable, reliable and credible. Based on an in-depth literature review, we analyze (i) how IFDC's simplified terminology compares to international best practice in resource classification and whether it is likely to yield data that meets the abovementioned requirements; (ii) whether the difference between ore reserves and reserves as concentrate is sufficiently noted in the literature, and (iii) whether the IFDC report and its estimate of PR reserves and resources is reliable. We conclude that, while there is a global development toward common criteria in resource reporting, IFDC's definitions contravene this development and - due to their

  8. Resources for preparing independent government estimates for remedial contracting work assignments. Directive

    International Nuclear Information System (INIS)

    1992-01-01

    The memorandum provides information regarding the availability of tools, data bases, and assistance for developing independent government estimates of the cost of work to be performed by contractors for remedial work assignments

  9. Optimization-based scatter estimation using primary modulation for computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yi; Ma, Jingchen; Zhao, Jun, E-mail: junzhao@sjtu.edu.cn [School of Biomedical Engineering, Shanghai Jiao Tong University, Shanghai 200240 (China); Song, Ying [Department of Radiation Oncology, West China Hospital, Sichuan University, Chengdu 610041 (China)

    2016-08-15

    Purpose: Scatter reduces the image quality in computed tomography (CT), but scatter correction remains a challenge. A previously proposed primary modulation method simultaneously obtains the primary and scatter in a single scan. However, separating the scatter and primary in primary modulation is challenging because it is an underdetermined problem. In this study, an optimization-based scatter estimation (OSE) algorithm is proposed to estimate and correct scatter. Methods: In the concept of primary modulation, the primary is modulated, but the scatter remains smooth by inserting a modulator between the x-ray source and the object. In the proposed algorithm, an objective function is designed for separating the scatter and primary. Prior knowledge is incorporated in the optimization-based framework to improve the accuracy of the estimation: (1) the primary is always positive; (2) the primary is locally smooth and the scatter is smooth; (3) the location of penumbra can be determined; and (4) the scatter-contaminated data provide knowledge about which part is smooth. Results: The simulation study shows that the edge-preserving weighting in OSE improves the estimation accuracy near the object boundary. Simulation study also demonstrates that OSE outperforms the two existing primary modulation algorithms for most regions of interest in terms of the CT number accuracy and noise. The proposed method was tested on a clinical cone beam CT, demonstrating that OSE corrects the scatter even when the modulator is not accurately registered. Conclusions: The proposed OSE algorithm improves the robustness and accuracy in scatter estimation and correction. This method is promising for scatter correction of various kinds of x-ray imaging modalities, such as x-ray radiography, cone beam CT, and the fourth-generation CT.
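
    The authors' OSE algorithm is not reproduced here, but the basic separation idea — a modulated primary plus a smooth scatter recovered by penalized least squares with positivity constraints — can be sketched on a 1D toy projection. All signals, the modulator pattern and the smoothness weights below are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

# Toy 1D projection, hypothetical values: the primary is modulated by a known
# high-frequency pattern while the scatter is smooth; both are recovered by
# penalized least squares with positivity bounds (a stand-in for OSE, not OSE itself).
n = 64
x = np.linspace(0.0, 1.0, n)
primary_true = 1.0 + 0.5 * np.exp(-((x - 0.5) / 0.15) ** 2)
scatter_true = 0.3 + 0.1 * np.sin(np.pi * x)
mod = np.where(np.arange(n) % 2 == 0, 1.0, 0.7)    # idealized primary modulator
data = mod * primary_true + scatter_true            # measured, scatter-contaminated signal

lam_p, lam_s = 1e-2, 1e1    # hypothetical smoothness weights for primary and scatter

def objective(z):
    p, s = z[:n], z[n:]
    resid = mod * p + s - data
    rough_p = np.diff(p)       # primary assumed locally smooth
    rough_s = np.diff(s, 2)    # scatter assumed very smooth
    return resid @ resid + lam_p * rough_p @ rough_p + lam_s * rough_s @ rough_s

z0 = np.concatenate([data / mod, np.zeros(n)])      # crude initial guess
bounds = [(0.0, None)] * (2 * n)                    # positivity of primary and scatter
res = minimize(objective, z0, method="L-BFGS-B", bounds=bounds)
p_est, s_est = res.x[:n], res.x[n:]
print("scatter RMSE:", float(np.sqrt(np.mean((s_est - scatter_true) ** 2))))
```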

  10. Computable error estimates of a finite difference scheme for option pricing in exponential Lévy models

    KAUST Repository

    Kiessling, Jonas; Tempone, Raul

    2014-01-01

    jump activity, then the jumps smaller than some (Formula presented.) are approximated by diffusion. The resulting diffusion approximation error is also estimated, with leading order term in computable form, as well as the dependence of the time

  11. Computational Fluid Dynamic Pressure Drop Estimation of Flow between Parallel Plates

    Energy Technology Data Exchange (ETDEWEB)

    Son, Hyung Min; Yang, Soo Hyung; Park, Jong Hark [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    Many pool type reactors have forced downward flows inside the core during normal operation, and there is a chance of flow inversion when transients occur. During this phase, the flow undergoes transition between the turbulent and laminar regions, where drastic changes in momentum and heat transfer take place, and a decrease in the safety margin is usually observed. Additionally, for high Prandtl number fluids such as water, the effect of the velocity profile inside the channel on the temperature distribution is more pronounced than for low Prandtl number fluids. This makes checking the accuracy of the pressure drop estimation no less important, assuming the code verification is complete. With the advent of powerful computer hardware, engineering applications of computational fluid dynamics (CFD) methods have become quite common. Especially for fully turbulent, single-phase convective heat transfer, the predictability of the commercial codes has matured enough that many well-known companies adopt them to accelerate product development cycles and to realize increased profitability. In contrast, the transition models for CFD codes are still under development, and most of the models show limited generality and prediction accuracy. Unlike the system codes, CFD codes estimate the pressure drop from the velocity profile, which is obtained by solving the momentum conservation equations, and the resulting friction factor can be a representative parameter for flow in a constant cross-section channel. In addition, the flow inside a rectangular channel with a high span-to-gap ratio can be approximated by flow between parallel plates. The computational fluid dynamics simulation of the flow between parallel plates showed reasonable prediction capability for the laminar and the turbulent regimes.
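
    For orientation, the parallel-plate approximation mentioned above allows a quick hand calculation of the pressure drop from textbook correlations (laminar Darcy friction factor f = 96/Re, Blasius for smooth turbulent flow); the geometry and flow values below are assumed, not taken from the study.

```python
# Hand-calculation sketch of the pressure drop for fully developed flow between
# parallel plates (span >> gap), using textbook friction-factor correlations.
rho, mu = 998.0, 1.0e-3      # water at ~20 C: density (kg/m^3), viscosity (Pa s)
gap, length = 2.5e-3, 0.6    # plate gap and channel length (m), assumed values
velocity = 1.5               # mean velocity (m/s), assumed

d_h = 2.0 * gap                          # hydraulic diameter of infinite parallel plates
re = rho * velocity * d_h / mu           # Reynolds number

if re < 2300.0:
    f = 96.0 / re                        # laminar Darcy friction factor, parallel plates
else:
    f = 0.316 * re ** -0.25              # Blasius correlation (smooth turbulent flow)

dp = f * (length / d_h) * 0.5 * rho * velocity ** 2   # Darcy-Weisbach pressure drop (Pa)
print(f"Re = {re:.0f}, f = {f:.4f}, dp = {dp:.0f} Pa")
```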

  12. Bioenergy in Australia: An improved approach for estimating spatial availability of biomass resources in the agricultural production zones

    International Nuclear Information System (INIS)

    Herr, Alexander; Dunlop, Michael

    2011-01-01

    Bioenergy production from crops and agricultural residues has a greenhouse gas mitigation potential. However, there is considerable debate about the size of this potential. This is partly due to difficulties in estimating the feedstock resource base accurately and with good spatial resolution. Here we provide two techniques for spatially estimating crop-based bioenergy feedstocks in Australia using regional agricultural statistics and national land use maps. The approach accommodates temporal variability by estimating ranges of feedstock availability and the shifting nature of zones of the highest spatial concentration of feedstocks. The techniques are applicable to biomass production from forestry, agricultural residues or oilseeds, all of which have been proposed as biofuel feedstocks. -- Highlights: → Dasymetric mapping approach for producing maps of spatial and temporal variation in feedstock production. → Combines land use and crop statistics to produce regionally precise feedstock maps. → Feedstock concentration and feedstock density maps enable identification of feedstock concentrations spatially and comparison of yearly variation in production.

  13. A review of the methods available for estimating soil moisture and its implications for water resource management

    Science.gov (United States)

    Dobriyal, Pariva; Qureshi, Ashi; Badola, Ruchi; Hussain, Syed Ainul

    2012-08-01

    The maintenance of elevated soil moisture is an important ecosystem service of the natural ecosystems. Understanding the patterns of soil moisture distribution is useful to a wide range of agencies concerned with the weather and climate, soil conservation, agricultural production and landscape management. However, the great heterogeneity in the spatial and temporal distribution of soil moisture and the lack of standard methods to estimate this property limit its quantification and use in research. This literature based review aims to (i) compile the available knowledge on the methods used to estimate soil moisture at the landscape level, (ii) compare and evaluate the available methods on the basis of common parameters such as resource efficiency, accuracy of results and spatial coverage and (iii) identify the method that will be most useful for forested landscapes in developing countries. On the basis of the strengths and weaknesses of each of the methods reviewed we conclude that the direct method (gravimetric method) is accurate and inexpensive but is destructive, slow and time consuming and does not allow replications thereby having limited spatial coverage. The suitability of indirect methods depends on the cost, accuracy, response time, effort involved in installation, management and durability of the equipment. Our review concludes that measurements of soil moisture using the Time Domain Reflectometry (TDR) and Ground Penetrating Radar (GPR) methods are instantaneously obtained and accurate. GPR may be used over larger areas (up to 500 × 500 m a day) but is not cost-effective and difficult to use in forested landscapes in comparison to TDR. This review will be helpful to researchers, foresters, natural resource managers and agricultural scientists in selecting the appropriate method for estimation of soil moisture keeping in view the time and resources available to them and to generate information for efficient allocation of water resources and

  14. DEEBAR - A BASIC interactive computer programme for estimating mean resonance spacings

    International Nuclear Information System (INIS)

    Booth, M.; Pope, A.L.; Smith, R.W.; Story, J.S.

    1988-02-01

    DEEBAR is a BASIC interactive programme, which uses the theories of Dyson and of Dyson and Mehta, to compute estimates of the mean resonance spacings and associated uncertainty statistics from an input file of neutron resonance energies. In applying these theories the broad scale energy dependence of D-bar, as predicted by the ordinary theory of level densities, is taken into account. The mean spacing D-bar ± δD-bar, referred to zero energy of the incident neutrons, is computed from the energies of the first k resonances, for k = 2,3...K in turn and as if no resonances are missing. The user is asked to survey this set of D-bar and δD-bar values and to form a judgement - up to what value of k is the set of resonances complete and what value, in consequence, does the user adopt as the preferred value of D-bar? When the preferred values for k and D-bar have been input, the programme calculates revised values for the level density parameters, consistent with this value for D-bar and with other input information. Two short tables are printed, illustrating the energy variation and spin dependence of D-bar. Dyson's formula based on his Coulomb gas analogy is used for estimating the most likely energies of the topmost bound levels. Finally the quasi-crystalline character of a single level series is exploited by means of a table in which the resonance energies are set alongside an energy ladder whose rungs are regularly spaced with spacing D-bar(E); this comparative table expedites the search for gaps where resonances may have been missed experimentally. Used in conjunction with the program LJPROB, which calculates neutron strengths and compares them against the expected Porter Thomas distribution, estimates of the statistical parameters for use in the unresolved resonance region may be derived. (author)
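
    A much-simplified sketch of DEEBAR's first step — the running mean spacing from the first k resonance energies, assuming no levels are missed — is shown below with hypothetical energies; the Dyson and Dyson-Mehta statistics and the level-density energy dependence used by the actual program are not reproduced.

```python
import numpy as np

# Simplified sketch of the running estimate D-bar(k) from the first k resonance
# energies, assuming none are missed.  (DEEBAR additionally applies Dyson and
# Dyson-Mehta statistics and a level-density energy dependence, omitted here.)
energies = np.array([1.2, 6.8, 13.1, 18.9, 25.4, 30.7, 37.2, 44.0, 49.5, 61.8])  # eV, hypothetical

for k in range(2, len(energies) + 1):
    spacings = np.diff(energies[:k])
    d_bar = spacings.mean()                                    # mean spacing from first k levels
    d_err = spacings.std(ddof=1) / np.sqrt(len(spacings)) if k > 2 else float("nan")
    print(f"k={k:2d}  D-bar = {d_bar:5.2f} eV  +/- {d_err:4.2f} eV")

# A sudden rise of D-bar with k (e.g. near the last level above) hints that
# resonances start to be missed experimentally beyond that energy.
```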

  15. Reliability of Semiautomated Computational Methods for Estimating Tibiofemoral Contact Stress in the Multicenter Osteoarthritis Study

    Directory of Open Access Journals (Sweden)

    Donald D. Anderson

    2012-01-01

    Full Text Available Recent findings suggest that contact stress is a potent predictor of subsequent symptomatic osteoarthritis development in the knee. However, much larger numbers of knees (likely on the order of hundreds, if not thousands) need to be reliably analyzed to achieve the statistical power necessary to clarify this relationship. This study assessed the reliability of new semiautomated computational methods for estimating contact stress in knees from large population-based cohorts. Ten knees of subjects from the Multicenter Osteoarthritis Study were included. Bone surfaces were manually segmented from sequential 1.0 Tesla magnetic resonance imaging slices by three individuals on two nonconsecutive days. Four individuals then registered the resulting bone surfaces to corresponding bone edges on weight-bearing radiographs, using a semi-automated algorithm. Discrete element analysis methods were used to estimate contact stress distributions for each knee. Segmentation and registration reliabilities (day-to-day and interrater) for peak and mean medial and lateral tibiofemoral contact stress were assessed with Shrout-Fleiss intraclass correlation coefficients (ICCs). The segmentation and registration steps of the modeling approach were found to have excellent day-to-day (ICC 0.93–0.99) and good inter-rater reliability (ICC 0.84–0.97). This approach for estimating compartment-specific tibiofemoral contact stress appears to be sufficiently reliable for use in large population-based cohorts.

  16. Temperature-based estimation of global solar radiation using soft computing methodologies

    Science.gov (United States)

    Mohammadi, Kasra; Shamshirband, Shahaboddin; Danesh, Amir Seyed; Abdullah, Mohd Shahidan; Zamani, Mazdak

    2016-07-01

    Precise knowledge of solar radiation is indeed essential in different technological and scientific applications of solar energy. Temperature-based estimation of global solar radiation would be appealing owing to the broad availability of measured air temperatures. In this study, the potentials of soft computing techniques are evaluated to estimate daily horizontal global solar radiation (DHGSR) from measured maximum, minimum, and average air temperatures (Tmax, Tmin, and Tavg) in an Iranian city. For this purpose, a comparative evaluation between three methodologies of adaptive neuro-fuzzy inference system (ANFIS), radial basis function support vector regression (SVR-rbf), and polynomial basis function support vector regression (SVR-poly) is performed. Five combinations of Tmax, Tmin, and Tavg served as inputs to develop the ANFIS, SVR-rbf, and SVR-poly models. The attained results show that all ANFIS, SVR-rbf, and SVR-poly models provide favorable accuracy. For all techniques, the higher accuracies are achieved by models (5) using Tmax − Tmin and Tmax as inputs. According to the statistical results, SVR-rbf outperforms SVR-poly and ANFIS. For SVR-rbf (5), the mean absolute bias error, root mean square error, and correlation coefficient are 1.1931 MJ/m2, 2.0716 MJ/m2, and 0.9380, respectively. The survey results confirm that SVR-rbf can be used efficiently to estimate DHGSR from air temperatures.
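
    A generic scikit-learn sketch of the RBF-versus-polynomial SVR comparison is given below; it uses synthetic temperature and radiation data (a Hargreaves-like toy relation), not the Iranian station records or the tuned hyperparameters of the study.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

# Synthetic stand-in for measured data: daily Tmax, Tmin (deg C) and global radiation (MJ/m2).
rng = np.random.default_rng(0)
n = 500
tmax = rng.uniform(5, 40, n)
tmin = tmax - rng.uniform(5, 15, n)
radiation = 0.16 * np.sqrt(tmax - tmin) * 30.0 + rng.normal(0, 1.5, n)  # Hargreaves-like toy relation

X = np.column_stack([tmax - tmin, tmax])      # inputs analogous to "model (5)" in the abstract
X_tr, X_te, y_tr, y_te = train_test_split(X, radiation, test_size=0.3, random_state=1)

for name, kwargs in [("SVR-rbf", dict(kernel="rbf", C=10.0)),
                     ("SVR-poly", dict(kernel="poly", degree=2, C=10.0))]:
    model = make_pipeline(StandardScaler(), SVR(**kwargs))   # scale features before the SVR
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    print(f"{name}: R2 = {r2_score(y_te, pred):.3f}, "
          f"RMSE = {mean_squared_error(y_te, pred) ** 0.5:.2f} MJ/m2")
```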

  17. Estimating boiling water reactor decommissioning costs. A user's manual for the BWR Cost Estimating Computer Program (CECP) software: Draft report for comment

    International Nuclear Information System (INIS)

    Bierschbach, M.C.

    1994-12-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit to the U.S. Regulatory Commission (NRC) for review, decommissioning plans and cost estimates. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning BWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning

  18. Multidimensional model of estimated resource usage for multimedia NoC QoS

    NARCIS (Netherlands)

    Pastrnak, M.; With, de P.H.N.; Lagendijk, R.; Weber, H. Jos; Berg, van den, F. A.

    2006-01-01

    Multiprocessor systems are rapidly entering various high-performance computing segments, like multimedia processing. Instead of an increase in processor clock frequency, the new trend is to enable multiple cores to perform the processing, e.g. dual or quadruple CPUs in one subsystem. In this

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  1. MONITOR: A computer model for estimating the costs of an integral monitored retrievable storage facility

    International Nuclear Information System (INIS)

    Reimus, P.W.; Sevigny, N.L.; Schutz, M.E.; Heller, R.A.

    1986-12-01

    The MONITOR model is a FORTRAN 77 based computer code that provides parametric life-cycle cost estimates for a monitored retrievable storage (MRS) facility. MONITOR is very flexible in that it can estimate the costs of an MRS facility operating under almost any conceivable nuclear waste logistics scenario. The model can also accommodate input data of varying degrees of complexity and detail (ranging from very simple to more complex), which makes it ideal for use in the MRS program, where new designs and new cost data are frequently offered for consideration. MONITOR can be run as an independent program, or it can be interfaced with the Waste System Transportation and Economic Simulation (WASTES) model, a program that simulates the movement of waste through a complete nuclear waste disposal system. The WASTES model drives the MONITOR model by providing it with the annual quantities of waste that are received, stored, and shipped at the MRS facility. Three runs of MONITOR are documented in this report. Two of the runs are for Version 1 of the MONITOR code: a simulation which uses the costs developed by the Ralph M. Parsons Company in the 2A (backup) version of the MRS cost estimate. In one of these runs MONITOR was run as an independent model, and in the other run MONITOR was run using an input file generated by the WASTES model. The two runs correspond to identical cases, and the fact that they gave identical results verified that the code performed the same calculations in both modes of operation. The third run was made for Version 2 of the MONITOR code: a simulation which uses the costs developed by the Ralph M. Parsons Company in the 2B (integral) version of the MRS cost estimate. This run was made with MONITOR being run as an independent model. The results of several cases have been verified by hand calculations

  2. Estimation of staff doses in complex radiological examinations using a Monte Carlo computer code

    International Nuclear Information System (INIS)

    Vanhavere, F.

    2007-01-01

    The protection of medical personnel in interventional radiology is an important issue of radiological protection. The irradiation of the worker is largely non-uniform, and a large part of his body is shielded by a lead apron. The estimation of effective dose (E) under these conditions is difficult, and several approaches are used to estimate effective dose involving such a protective apron. This study presents a summary from an extensive series of simulations to determine the scatter-dose distribution around the patient and staff effective dose from personal dosimeter readings. The influence of different parameters (like beam energy and size, patient size, irradiated region, worker position and orientation) on the staff doses has been determined. Published algorithms that combine readings of an unshielded and a shielded dosimeter to estimate effective dose have been applied, and a new algorithm that gives more accurate dose estimates for a wide range of situations was proposed. A computational approach was used to determine the dose distribution in the worker's body. The radiation transport and energy deposition were simulated using the MCNP4B code. The human bodies of the patient and radiologist were generated with the Body Builder anthropomorphic model-generating tool. The radiologist is protected with a lead apron (0.5 mm lead equivalent in the front and 0.25 mm lead equivalent in the back and sides) and a thyroid collar (0.35 mm lead equivalent). The lower arms of the worker were folded to simulate the arm position during clinical examinations. This realistic situation of the folded arms affects the effective dose to the worker. Depending on the worker position and orientation (and of course the beam energy), the difference can go up to 25 percent. A total of 12 Hp(10) dosimeters were positioned above and under the lead apron at the neck, chest and waist levels. Extra dosimeters for the skin dose were positioned at the forehead, the forearms and the front surface of
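
    Published double-dosimetry algorithms generally take the form of a weighted sum of the under-apron and over-apron Hp(10) readings; the sketch below shows that general form with purely hypothetical weights, not the coefficients fitted or proposed in this work.

```python
# Hedged sketch of a generic double-dosimetry algorithm: effective dose is
# approximated as a weighted sum of the under-apron and over-apron Hp(10)
# readings.  The weights below are hypothetical placeholders, not the
# coefficients derived in the study.
ALPHA, BETA = 0.8, 0.05      # hypothetical weights for under- and over-apron readings

def effective_dose_estimate(hp10_under_msv: float, hp10_over_msv: float) -> float:
    """Return an estimate of effective dose (mSv) from two dosimeter readings."""
    return ALPHA * hp10_under_msv + BETA * hp10_over_msv

print(effective_dose_estimate(hp10_under_msv=0.12, hp10_over_msv=1.9))
```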

  3. Estimating the tuberculosis burden in resource-limited countries: a capture-recapture study in Yemen.

    Science.gov (United States)

    Bassili, A; Al-Hammadi, A; Al-Absi, A; Glaziou, P; Seita, A; Abubakar, I; Bierrenbach, A L; van Hest, N A

    2013-04-01

    The lack of applicable population-based methods to measure tuberculosis (TB) incidence rates directly at country level emphasises the global need to generate robust TB surveillance data to ascertain trends in disease burden and to assess the performance of TB control programmes in the context of the United Nations Millenium Development Goals and World Health Organization targets for TB control. To estimate the incidence of TB cases (all forms) and sputum smear-positive disease, and the level of under-reporting of TB in Yemen in 2010. Record-linkage and three-source capture-recapture analysis of data collected through active prospective longitudinal surveillance within the public and private non-National Tuberculosis Programme sector in twelve Yemeni governorates, selected by stratified cluster random sampling. For all TB cases, the estimated ratio of notified to incident cases and completeness of case ascertainment after record linkage, i.e., the ratio of detected to incident cases, was respectively 71% (95%CI 64-80) and 75% (95%CI 68-85). For sputum smear-positive TB cases, these ratios were respectively 67% (95%CI 58-75) and 76% (95%CI 66-84). We estimate that there were 13 082 (95%CI 11 610-14 513) TB cases in Yemen in 2010. Under-reporting of TB in Yemen is estimated at 29% (95%CI 20-36).
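
    A standard way to implement three-source capture-recapture is a Poisson log-linear model fitted to the seven observed source-overlap cells, with the unobserved cell predicted from the intercept; the sketch below uses hypothetical counts, not the Yemeni surveillance data.

```python
import numpy as np
import statsmodels.api as sm

# Generic three-source capture-recapture sketch (hypothetical counts): fit a
# Poisson log-linear model with main effects and two-way interactions to the
# 7 observed cells and predict the unobserved (0,0,0) cell from the intercept.
cells = [(1,0,0), (0,1,0), (0,0,1), (1,1,0), (1,0,1), (0,1,1), (1,1,1)]
counts = np.array([620, 310, 150, 90, 45, 25, 12], dtype=float)   # hypothetical

X = np.array([[1, a, b, c, a*b, a*c, b*c] for a, b, c in cells], dtype=float)
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()

n_000 = np.exp(fit.params[0])            # expected count in the unobserved cell
n_total = counts.sum() + n_000
print(f"observed = {counts.sum():.0f}, estimated unobserved = {n_000:.0f}, total = {n_total:.0f}")
print(f"estimated completeness of ascertainment = {counts.sum() / n_total:.1%}")
```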

  4. Effort estimation for enterprise resource planning implementation projects using social choice - a comparative study

    Science.gov (United States)

    Koch, Stefan; Mitlöhner, Johann

    2010-08-01

    ERP implementation projects have received enormous attention in the last years, due to their importance for organisations, as well as the costs and risks involved. The estimation of effort and costs associated with new projects therefore is an important topic. Unfortunately, there is still a lack of models that can cope with the special characteristics of these projects. As the main focus lies in adapting and customising a complex system, and even changing the organisation, traditional models like COCOMO can not easily be applied. In this article, we will apply effort estimation based on social choice in this context. Social choice deals with aggregating the preferences of a number of voters into a collective preference, and we will apply this idea by substituting the voters by project attributes. Therefore, instead of supplying numeric values for various project attributes, a new project only needs to be placed into rankings per attribute, necessitating only ordinal values, and the resulting aggregate ranking can be used to derive an estimation. We will describe the estimation process using a data set of 39 projects, and compare the results to other approaches proposed in the literature.
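
    One simple instance of the ranking-aggregation idea is a Borda count over attribute-wise project rankings; the sketch below uses hypothetical projects and efforts, and the final neighbour-averaging step is only one possible way to turn the aggregate rank into an estimate, not necessarily the rule used in the article.

```python
import numpy as np

# Illustrative sketch (hypothetical data): treat each project attribute as a
# "voter" that ranks the projects, aggregate the rankings with a Borda count,
# and estimate a new project's effort from its neighbours in the aggregate order.
projects = ["P1", "P2", "P3", "P4", "P5"]
effort = {"P1": 900, "P2": 1500, "P3": 2400, "P4": 3200, "P5": None}  # P5 is the new project

# Each attribute ranks projects from "smallest" to "largest" (ordinal data only).
rankings = {
    "users":   ["P1", "P2", "P5", "P3", "P4"],
    "sites":   ["P1", "P5", "P2", "P3", "P4"],
    "modules": ["P2", "P1", "P5", "P4", "P3"],
}

# Borda count: a project collects its position (0 = smallest) from each attribute ranking.
scores = {p: 0 for p in projects}
for order in rankings.values():
    for pos, p in enumerate(order):
        scores[p] += pos

aggregate = sorted(projects, key=lambda p: scores[p])
print("aggregate ranking:", aggregate)

# Estimate: average the known efforts of the immediate neighbours of P5 in the aggregate order.
i = aggregate.index("P5")
neighbours = [p for p in aggregate[max(0, i - 1):i + 2] if effort[p] is not None]
print("estimated effort for P5:", np.mean([effort[p] for p in neighbours]))
```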

  5. Estimating indigenous resources for fuel-wood and poles and plantation requirements in the tribal trust lands of Zimbabwe Rhodesia

    Energy Technology Data Exchange (ETDEWEB)

    Furness, C K

    1981-01-01

    The difficulties encountered in planning for the conservation of indigenous timber resources and in estimating the timber consumption in tribal trust land are outlined in this paper. An estimate of these resources and of the consumption of timber, together with an estimate of exotic plantations required to make up any shortfall of timber, is given. Some 66,000 ha of eucalypts are currently required in the tribal trust lands, where planting has thus far provided only 3800 ha. The types of plantations established and the species used are mentioned. The rural population has, generally speaking, shown only limited enthusiasm for growing exotics, one of the reasons being the traditional use of indigenous timber which is still available in most areas without cost, and the preference for indigenous timber compared to eucalypts. The need for more reliable data for future planning is emphasized. Substitutes for fuel-wood are discussed and the need to reserve areas of indigenous timber in tribal trust land for the protection of soil and water and for fuel-wood are proposed. (Refs. 1).

  6. The actual status of uranium ore resources at Eko Remaja Sector: the need of verification of resources computation and geometrical form of mineralization zone by mining test

    International Nuclear Information System (INIS)

    Johan Baratha; Muljono, D.S.; Agus Sumaryanto; Handoko Supalal

    1996-01-01

    The uranium ore resources calculation was done after completing all geological work steps. The estimation of ore resources started with evaluation drilling and continued with borehole logging. The logging results were presented as anomaly graphs and then processed to determine the thickness and grade of the ore. The mineralization points were correlated with one another to form mineralization zones with a direction of N 270 degrees to N 285 degrees and a 70 degree dip to the north. From grouping the mineralization distribution, 19 mineralization planes were constructed which contain 553 tonnes of U3O8 in the measured category. It is suggested that, before expanding the measured ore deposit area, a mining test should first be done at certain mineralization planes to prove the method applied to calculate the reserve. Results from the mining test could be very useful for re-evaluating all the work steps done. (author); 4 refs; 2 tabs; 8 figs

  7. Estimation of Resource Productivity and Efficiency: An Extended Evaluation of Sustainability Related to Material Flow

    Directory of Open Access Journals (Sweden)

    Pin-Chih Wang

    2014-09-01

    Full Text Available This study is intended to conduct an extended evaluation of sustainability based on the material flow analysis of resource productivity. We first present updated information on the material flow analysis (MFA) database in Taiwan. Essential indicators are selected to quantify resource productivity associated with the economy-wide MFA of Taiwan. The study also applies the IPAT (impact-population-affluence-technology) master equation to measure trends of material use efficiency in Taiwan and to compare them with those of other Asia-Pacific countries. An extended evaluation of efficiency, in comparison with selected economies by applying data envelopment analysis (DEA), is conducted accordingly. The Malmquist Productivity Index (MPI) is thereby adopted to quantify the patterns and the associated changes of efficiency. Observations and summaries can be described as follows. Based on the MFA of the Taiwanese economy, the average growth rates of domestic material input (DMI; 2.83%) and domestic material consumption (DMC; 2.13%) in the past two decades were both less than that of gross domestic product (GDP; 4.95%). The decoupling of environmental pressures from economic growth can be observed. In terms of the decomposition analysis of the IPAT equation and in comparison with 38 other economies, the material use efficiency of Taiwan did not perform as well as its economic growth. The DEA comparisons of resource productivity show that Denmark, Germany, Luxembourg, Malta, Netherlands, United Kingdom and Japan performed the best in 2008. Since the MPI consists of technological change (frontier-shift or innovation) and efficiency change (catch-up), the change in efficiency (catch-up) of Taiwan has not been accomplished as expected in spite of the increase in its technological efficiency.
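
    The IPAT-style decomposition itself is easy to reproduce: material consumption is factored as DMC = population × affluence (GDP per capita) × material intensity (DMC per GDP), so growth in DMC splits into three driver terms. The figures below are hypothetical placeholders, not the Taiwanese MFA data.

```python
# IPAT-style decomposition sketch (hypothetical numbers): DMC = P x A x T, where
# A is affluence (GDP per capita) and T is material intensity (DMC per unit GDP).
years = {
    2000: {"pop": 22.2e6, "gdp": 10.3e12, "dmc": 280e6},   # persons, currency units, tonnes (hypothetical)
    2010: {"pop": 23.1e6, "gdp": 14.1e12, "dmc": 330e6},
}

def factors(y):
    d = years[y]
    return d["pop"], d["gdp"] / d["pop"], d["dmc"] / d["gdp"]   # P, A, T

p0, a0, t0 = factors(2000)
p1, a1, t1 = factors(2010)
for name, x0, x1 in [("population", p0, p1), ("affluence", a0, a1),
                     ("material intensity", t0, t1), ("DMC", p0 * a0 * t0, p1 * a1 * t1)]:
    growth = (x1 / x0) ** (1 / 10) - 1          # average annual growth rate over the decade
    print(f"{name:18s}: {growth:+.2%} per year")
```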

  8. Estimating the Prospectivity of Geothermal Resources Using the Concept of Hydrogeologic Windows

    Science.gov (United States)

    Bielicki, Jeffrey; Blackwell, David; Harp, Dylan; Karra, Satish; Kelley, Richard; Kelley, Shari; Middleton, Richard; Person, Mark; Sutula, Glenn; Witcher, James

    2016-04-01

    In this Geothermal Play Fairways Analysis project we sought to develop new ways to analyze geologic, geochemical, and geophysical data to reduce the risk and increase the prospects of successful geothermal exploration and development. We collected, organized, and analyzed data from southwest New Mexico in the context of an integrated framework that combines the data for various signatures of a geothermal resource into a cohesive analysis of the presence of heat, fluid, and permeability. We incorporated data on structural characteristics (earthquakes, geophysical logs, fault location and age, basement depth), topographic and water table elevations, conservative ion concentrations, and thermal information (heat flow, bottom hole temperature, discharge temperature, and basement heat generation). These data were combined to create maps that indicate structural analysis, slope, geothermometry, and heat. We also mapped discharge areas (to constrain elevations where groundwater may be discharged through modern thermal springs or paleo-thermal springs) and subcrops: possible erosionally- or structurally-controlled breaches in regional-scale aquitards that form the basis of our hydrogeologic windows concept. These two maps were particularly useful in identifying known geothermal systems and narrowing the search for unknown geothermal prospects. We further refined the "prospectivity" of the areas within the subcrops and discharge areas by developing and applying a new method for spatial association analysis to data on known and inferred faults, earthquakes, geochemical thermometers, and heat flow. This new methodology determines the relationships of the location and magnitudes of observations of these data with known geothermal sites. The results of each of the six spatial association analyses were weighted between 0 and 1 and summed to produce a prospectivity score between 0 and 6, with 6 indicating highest geothermal potential. The mean value of prospectivity for all
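
    The weighted-sum construction of the prospectivity score can be sketched directly: each spatial-association layer is rescaled to [0, 1] and the six layers are summed to a score between 0 and 6. The grids and equal weights below are hypothetical, not project data.

```python
import numpy as np

# Minimal sketch of the weighted-evidence idea: six spatial-association layers,
# each rescaled to [0, 1], are summed into a prospectivity score between 0 and 6.
rng = np.random.default_rng(42)
layers = {
    "faults": rng.random((4, 4)), "earthquakes": rng.random((4, 4)),
    "geothermometry": rng.random((4, 4)), "heat_flow": rng.random((4, 4)),
    "springs": rng.random((4, 4)), "subcrops": rng.random((4, 4)),
}

def rescale(a):
    return (a - a.min()) / (a.max() - a.min())   # map each evidence layer to [0, 1]

prospectivity = sum(rescale(a) for a in layers.values())   # score in [0, 6]
print("max cell score:", round(float(prospectivity.max()), 2))
print("most prospective cell:", np.unravel_index(prospectivity.argmax(), prospectivity.shape))
```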

  9. Find-rate methodology and resource base estimates of the Hydrocarbon Supply Model (1990 update). Topical report

    International Nuclear Information System (INIS)

    Woods, T.

    1991-02-01

    The Hydrocarbon Supply Model is used to develop long-term trends in Lower-48 gas production and costs. The model utilizes historical find-rate patterns to predict the discovery rate and size distribution of future oil and gas field discoveries. The report documents the methodologies used to quantify historical oil and gas field find-rates and to project those discovery patterns for future drilling. It also explains the theoretical foundations for the find-rate approach. The new field and reserve growth resource base is documented and compared to other published estimates. The report has six sections. Section 1 provides background information and an overview of the model. Sections 2, 3, and 4 describe the theoretical foundations of the model, the databases, and specific techniques used. Section 5 presents the new field resource base by region and depth. Section 6 documents the reserve growth model components

  10. Estimation of MSAD values in computed tomography scans using radiochromic films

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Bruno Beraldo; Teogenes Augusto da, E-mail: bbo@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil); Mourao, Arnaldo Prata [Centro Federal de Educacao Tecnologica de Minas Gerais (CEFET-MG), Belo Horizonte, MG (Brazil)

    2013-03-15

    Objective: To evaluate the feasibility of using radiochromic films as an alternative dosimeter to estimate the multiple scan average dose on the basis of kerma profiles. Materials and Methods: The radiochromic films were distributed in cylinders positioned in the center and in four peripheral bores of a standard abdominal phantom utilized for computed tomography dosimetry. Results: The multiple scan average dose values corresponded to 13.6 ± 0.7, 13.5 ± 0.7 and 18.7 ± 1.0 mGy for pitches of 0.75, 1.00 and 1.50, respectively. Conclusion: In spite of the results showing lower values than the reference level for radiodiagnosis (25 mGy) established by the Brazilian regulations for abdominal studies, it is suggested that there is room to optimize procedures and review the reference level for radiodiagnosis in Brazil. (author)

  11. Deoxyglucose method for the estimation of local myocardial glucose metabolism with positron computed tomography

    International Nuclear Information System (INIS)

    Ratib, O.; Phelps, M.E.; Huang, S.C.; Henze, E.; Selin, C.E.; Schelbert, H.R.

    1981-01-01

    The deoxyglucose method originally developed for measurements of the local cerebral metabolic rate for glucose has been investigated in terms of its application to studies of the heart with positron computed tomography (PCT) and FDG. Studies were performed in dogs to measure the tissue kinetics of FDG with PCT and by direct arterial-venous sampling. The operational equation developed in our laboratory as an extension of the Sokoloff model was used to analyze the data. The FDG method accurately predicted the true MMRGlc even when the glucose metabolic rate was normal but myocardial blood flow (MBF) was elevated 5 times the control value or when metabolism was reduced to 10% of normal and MBF increased 5 times normal. Improvements in PCT resolution are required to improve the accuracy of the estimates of the rate constants and the MMRGlc

  12. Selection of meteorological parameters affecting rainfall estimation using neuro-fuzzy computing methodology

    Science.gov (United States)

    Hashim, Roslan; Roy, Chandrabhushan; Motamedi, Shervin; Shamshirband, Shahaboddin; Petković, Dalibor; Gocic, Milan; Lee, Siew Cheng

    2016-05-01

    Rainfall is a complex atmospheric process that varies over time and space. Researchers have used various empirical and numerical methods to enhance estimation of rainfall intensity. We developed a novel prediction model in this study, with the emphasis on accuracy to identify the most significant meteorological parameters having effect on rainfall. For this, we used five input parameters: wet day frequency (dwet), vapor pressure (e̅a), and maximum and minimum air temperatures (Tmax and Tmin) as well as cloud cover (cc). The data were obtained from the Indian Meteorological Department for the Patna city, Bihar, India. Further, a type of soft-computing method, known as the adaptive-neuro-fuzzy inference system (ANFIS), was applied to the available data. In this respect, the observation data from 1901 to 2000 were employed for testing, validating, and estimating monthly rainfall via the simulated model. In addition, the ANFIS process for variable selection was implemented to detect the predominant variables affecting the rainfall prediction. Finally, the performance of the model was compared to other soft-computing approaches, including the artificial neural network (ANN), support vector machine (SVM), extreme learning machine (ELM), and genetic programming (GP). The results revealed that ANN, ELM, ANFIS, SVM, and GP had R2 of 0.9531, 0.9572, 0.9764, 0.9525, and 0.9526, respectively. Therefore, we conclude that the ANFIS is the best method among all to predict monthly rainfall. Moreover, dwet was found to be the most influential parameter for rainfall prediction, and the best predictor of accuracy. This study also identified sets of two and three meteorological parameters that show the best predictions.
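
    The variable-selection step can be approximated generically by scoring every subset of candidate predictors with cross-validation and keeping the best; the sketch below does this with a scikit-learn SVR on synthetic data rather than ANFIS on the Patna records.

```python
import numpy as np
from itertools import combinations
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Generic sketch of predictor selection: score every subset of candidate inputs
# with cross-validation and keep the best.  The data are synthetic stand-ins.
rng = np.random.default_rng(3)
n = 400
data = {
    "dwet": rng.uniform(0, 25, n), "ea": rng.uniform(5, 35, n),
    "tmax": rng.uniform(20, 42, n), "tmin": rng.uniform(8, 30, n), "cc": rng.uniform(0, 8, n),
}
rain = 12.0 * data["dwet"] + 3.0 * data["ea"] + rng.normal(0, 20, n)   # toy monthly rainfall (mm)

names = list(data)
best = (None, -np.inf)
for k in range(1, len(names) + 1):
    for subset in combinations(names, k):
        X = np.column_stack([data[v] for v in subset])
        model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0))
        score = cross_val_score(model, X, rain, cv=5, scoring="r2").mean()
        if score > best[1]:
            best = (subset, score)

print("best predictor subset:", best[0], "cross-validated R2:", round(best[1], 3))
```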

  13. Resource Evaluation and Energy Production Estimate for a Tidal Energy Conversion Installation using Acoustic Flow Measurements

    Science.gov (United States)

    Gagnon, Ian; Baldwin, Ken; Wosnik, Martin

    2015-11-01

    The "Living Bridge" project plans to install a tidal turbine at Memorial Bridge in the Piscataqua River at Portsmouth, NH. A spatio-temporal tidal energy resource assessment was performed using long term bottom-deployed Acoustic Doppler Current Profilers (ADCP). Two locations were evaluated: at the planned deployment location and mid-channel. The goal was to determine the amount of available kinetic energy that can be converted into usable electrical energy on the bridge. Changes in available kinetic energy with ebb/flood and spring/neap tidal cycles and electrical energy demand were analyzed. A system model is used to calculate the net energy savings using various tidal generator and battery bank configurations. Differences in the tidal characteristics between the two measurement locations are highlighted. Different resource evaluation methodologies were also analyzed, e.g., using a representative ADCP "bin" vs. a more refined, turbine-geometry-specific methodology, and using a static bin height vs. bin heights that move w.r.t. the free surface throughout a tidal cycle (representative of a bottom-fixed or floating turbine deployment, respectively). ADCP operating frequencies and bin sizes affect the standard deviation of measurements, and measurement uncertainties are evaluated. Supported by NSF-IIP grant 1430260.
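
    The core of such a resource assessment is the kinetic power density ½ρv³ evaluated over the measured velocity record; the sketch below applies it to synthetic hourly speeds with an assumed rotor area and water-to-wire efficiency, not the Memorial Bridge ADCP data.

```python
import numpy as np

# Back-of-the-envelope resource sketch: kinetic power density from velocity
# samples, and annual energy for an assumed turbine.  All values are hypothetical.
rho = 1025.0                                                            # seawater density (kg/m^3)
speeds = np.abs(np.random.default_rng(7).normal(0.0, 0.9, 24 * 365))    # hourly speed samples (m/s)

power_density = 0.5 * rho * speeds ** 3            # instantaneous kinetic power (W/m^2)
print(f"mean power density: {power_density.mean():.0f} W/m^2")

rotor_area = np.pi * (1.5 / 2) ** 2                # hypothetical 1.5 m diameter turbine
efficiency = 0.35                                  # assumed water-to-wire efficiency
energy_kwh = (efficiency * rotor_area * power_density).sum() / 1000.0   # hourly samples -> Wh -> kWh
print(f"estimated annual electrical energy: {energy_kwh:.0f} kWh")
```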

  14. Estimation of solar energy resources for low salinity water desalination in several regions of Russia

    Science.gov (United States)

    Tarasenko, A. B.; Kiseleva, S. V.; Shakun, V. P.; Gabderakhmanova, T. S.

    2018-01-01

    This paper focuses on estimation of the required photovoltaic (PV) array areas and capital expenses to feed a reverse osmosis desalination unit (1 m³/day fresh water production rate). The investigation has been made for different climatic conditions of Russia using regional data on ground water salinity from different sources and an empirical dependence of specific energy consumption on salinity and temperature. The best results were obtained for Krasnodar, Volgograd, the Crimea Republic and some other southern regions. The combination of salinity, temperature and solar radiation level there makes reverse osmosis coupled with photovoltaics very attractive for solving infrastructure problems in rural areas. Estimation results are represented as maps showing PV array areas and capital expenses for selected regions.
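
    The sizing logic can be illustrated with a back-of-the-envelope calculation: the daily energy demand of the unit (specific energy consumption times product volume) divided by the usable solar yield per square metre gives the PV area. All numbers below are assumptions, not the regional values used in the paper.

```python
# Rough sizing sketch for a PV-fed reverse-osmosis unit producing 1 m^3/day.
# All numbers are illustrative assumptions, not the paper's regional data.
sec_kwh_per_m3 = 1.8              # specific energy consumption for low-salinity water (assumed)
product_m3_per_day = 1.0
daily_demand_kwh = sec_kwh_per_m3 * product_m3_per_day

insolation_kwh_per_m2_day = 4.5   # assumed average daily solar irradiation
module_efficiency = 0.18
performance_ratio = 0.75          # wiring, inverter, temperature and soiling losses

pv_area_m2 = daily_demand_kwh / (insolation_kwh_per_m2_day * module_efficiency * performance_ratio)
cost_usd = pv_area_m2 * module_efficiency * 1000.0 * 1.0   # ~1 USD per watt-peak, assumed
print(f"required PV array area: {pv_area_m2:.1f} m^2, indicative cost: {cost_usd:.0f} USD")
```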

  15. Earth's copper resources estimated from tectonic diffusion of porphyry copper deposits

    Science.gov (United States)

    Kesler, Stephen E.; Wilkinson, Bruce H.

    2008-03-01

    Improved estimates of global mineral endowments are relevant to issues ranging from strategic planning to global geochemical cycling. We have used a time-space model for the tectonic migration of porphyry copper deposits vertically through the crust to calculate Earth's endowment of copper in mineral deposits. The model relies only on knowledge of numbers and ages of porphyry copper deposits, Earth's most widespread and important source of copper, in order to estimate numbers of eroded and preserved deposits in the crust. Model results indicate that 125,895 porphyry copper deposits were formed during Phanerozoic time, that only 47,789 of these remain at various crustal depths, and that these contain 1.7 × 10¹¹ tonnes (t) of copper. Assuming that other types of copper deposits behave similarly in the crust and have abundances proportional to their current global production yields an estimate of 3 × 10¹¹ t for total global copper resources at all levels in Earth's crust. Thus, 0.25% of the copper in the crust has been concentrated into deposits through Phanerozoic time, and about two-thirds of this has been recycled by uplift and erosion. The amount of copper in deposits above 3.3 km, a likely limit of future mining, could supply current world mine production for 5500 yr, thus quantifying the highly unusual and nonrenewable nature of mineral deposits.

  16. Computer programs for capital cost estimation, lifetime economic performance simulation, and computation of cost indexes for laser fusion and other advanced technology facilities

    International Nuclear Information System (INIS)

    Pendergrass, J.H.

    1978-01-01

    Three FORTRAN programs, CAPITAL, VENTURE, and INDEXER, have been developed to automate computations used in assessing the economic viability of proposed or conceptual laser fusion and other advanced-technology facilities, as well as conventional projects. The types of calculations performed by these programs are, respectively, capital cost estimation, lifetime economic performance simulation, and computation of cost indexes. The codes permit these three topics to be addressed with considerable sophistication commensurate with user requirements and available data

  17. Global Estimates of Errors in Quantum Computation by the Feynman-Vernon Formalism

    Science.gov (United States)

    Aurell, Erik

    2018-04-01

    The operation of a quantum computer is considered as a general quantum operation on a mixed state on many qubits followed by a measurement. The general quantum operation is further represented as a Feynman-Vernon double path integral over the histories of the qubits and of an environment, and afterward tracing out the environment. The qubit histories are taken to be paths on the two-sphere S^2 as in Klauder's coherent-state path integral of spin, and the environment is assumed to consist of harmonic oscillators initially in thermal equilibrium, and linearly coupled to qubit operators \hat{S}_z. The environment can then be integrated out to give a Feynman-Vernon influence action coupling the forward and backward histories of the qubits. This representation allows one to derive in a simple way estimates that the total error of operation of a quantum computer without error correction scales linearly with the number of qubits and the time of operation. It also allows one to discuss Kitaev's toric code interacting with an environment in the same manner.

  19. Thermal Conductivities in Solids from First Principles: Accurate Computations and Rapid Estimates

    Science.gov (United States)

    Carbogno, Christian; Scheffler, Matthias

    In spite of significant research efforts, a first-principles determination of the thermal conductivity κ at high temperatures has remained elusive. Boltzmann transport techniques that account for anharmonicity perturbatively become inaccurate under such conditions. Ab initio molecular dynamics (MD) techniques using the Green-Kubo (GK) formalism capture the full anharmonicity, but can become prohibitively costly to converge in time and size. We developed a formalism that accelerates such GK simulations by several orders of magnitude and thus enables their application within the limited time and length scales accessible in ab initio MD. For this purpose, we determine the effective harmonic potential occurring during the MD, the associated temperature-dependent phonon properties and lifetimes. Interpolation in reciprocal and frequency space then allows extrapolation to the macroscopic scale. For both force-field and ab initio MD, we validate this approach by computing κ for Si and ZrO2, two materials known for their particularly harmonic and anharmonic character. Finally, we demonstrate how these techniques facilitate reasonable estimates of κ from existing MD calculations at virtually no additional computational cost.
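
    For reference, the Green-Kubo relation used in such calculations is κ = V/(k_B T²) ∫₀^∞ ⟨J(0)J(t)⟩ dt for one Cartesian component of the heat flux J; the sketch below evaluates the running integral on a synthetic flux series standing in for MD output, with assumed temperature, volume and time step.

```python
import numpy as np

# Green-Kubo sketch: thermal conductivity from the time integral of the
# heat-flux autocorrelation function, kappa = V / (kB T^2) * int <J(0) J(t)> dt.
# The heat-flux series below is synthetic noise standing in for MD output.
kB = 1.380649e-23          # Boltzmann constant (J/K)
T, V = 300.0, 1.0e-26      # temperature (K) and cell volume (m^3), assumed
dt = 1.0e-15               # MD time step (s)

rng = np.random.default_rng(0)
J = rng.normal(0.0, 1.0e9, 100_000)    # one Cartesian heat-flux component (W/m^2), toy data

n = len(J)
max_lag = 1000
acf = np.array([np.dot(J[: n - lag], J[lag:]) / (n - lag) for lag in range(max_lag)])
kappa_running = np.cumsum(acf) * dt * V / (kB * T ** 2)   # running Green-Kubo integral

print(f"kappa estimate after {max_lag} lags: {kappa_running[-1]:.3e} W/(m K)")
```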

  20. Development and comparison of computational models for estimation of absorbed organ radiation dose in rainbow trout (Oncorhynchus mykiss) from uptake of iodine-131

    International Nuclear Information System (INIS)

    Martinez, N.E.; Johnson, T.E.; Capello, K.; Pinder, J.E.

    2014-01-01

    This study develops and compares different, increasingly detailed anatomical phantoms for rainbow trout (Oncorhynchus mykiss) for the purpose of estimating organ absorbed radiation dose and dose rates from ¹³¹I uptake in multiple organs. The models considered are: a simplistic geometry considering a single organ, a more specific geometry employing additional organs with anatomically relevant size and location, and voxel reconstruction of internal anatomy obtained from CT imaging (referred to as CSUTROUT). Dose Conversion Factors (DCFs) for whole body as well as selected organs of O. mykiss were computed using Monte Carlo modeling, and combined with estimated activity concentrations, to approximate dose rates and ultimately determine cumulative radiation dose (μGy) to selected organs after several half-lives of ¹³¹I. The different computational models provided similar results, especially for source organs (less than 30% difference between estimated doses), and whole body DCFs for each model (∼3 × 10⁻³ μGy d⁻¹ per Bq kg⁻¹) were comparable to DCFs listed in ICRP 108 for ¹³¹I. The main benefit provided by the computational models developed here is the ability to accurately determine organ dose. A conservative mass-ratio approach may provide reasonable results for sufficiently large organs, but is only applicable to individual source organs. Although CSUTROUT is the more anatomically realistic phantom, it required much more resource dedication to develop and is less flexible than the stylized phantom for similar results. There may be instances where a detailed phantom such as CSUTROUT is appropriate, but generally the stylized phantom appears to be the best choice for an ideal balance between accuracy and resource requirements. - Highlights: • Computational models (phantoms) are developed for rainbow trout internal dosimetry. • Phantoms are combined with empirical models for ¹³¹I uptake to estimate dose. • Voxel and stylized phantoms predict
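
    The way a DCF is applied can be shown in a few lines: the dose rate is the DCF times the activity concentration, and the cumulative dose follows the physical decay of ¹³¹I (biological elimination ignored here). The DCF magnitude is the order quoted in the abstract; the concentration is hypothetical.

```python
import numpy as np

# Simple sketch of applying a dose conversion factor: dose rate = DCF x activity
# concentration, with cumulative dose from physical decay only (no biokinetics).
dcf = 3.0e-3            # uGy/d per Bq/kg, order of magnitude quoted in the abstract
c0 = 500.0              # initial 131I concentration in tissue (Bq/kg), hypothetical
half_life_d = 8.02      # physical half-life of 131I (days)
lam = np.log(2) / half_life_d

t = np.linspace(0.0, 5 * half_life_d, 500)               # several half-lives
dose_rate = dcf * c0 * np.exp(-lam * t)                   # uGy/d
cumulative = dcf * c0 * (1.0 - np.exp(-lam * t)) / lam    # analytic integral, uGy

print(f"initial dose rate: {dose_rate[0]:.2f} uGy/d")
print(f"cumulative dose after 5 half-lives: {cumulative[-1]:.1f} uGy")
```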

  1. Selective tuberculosis incidence estimation by digital computer information technologies in the MS Excel system

    Directory of Open Access Journals (Sweden)

    G. I. Ilnitsky

    2014-01-01

    Full Text Available The incidence of tuberculosis was estimated in different age groups using digital computer information technologies for tracking in the MS Excel system. For this, the author used the annual reporting forms stipulated by the Ministry of Health of Ukraine, the results of his own observations, and the data accumulated in an MS Excel information bank. The initial positions were formed in terms of the epidemiological indicators of Ukraine and the Lvov Region during a 10-year period (2000-2009) that was, in relation to the different initial characteristics, divided into Step 1 (2000-2004), in which the tuberculosis epidemic situation progressively deteriorated, and Step 2 (2005-2009), in which morbidity relatively stabilized. The results were processed using the MS Excel statistical and mathematical functions, both parametric and nonparametric, to establish correlations when estimating the changes in the epidemic parameters. The findings of studies among the general population lead to the conclusion that the mean tuberculosis morbidity in Ukraine was much greater than that in the Lvov Region, irrespective of the age of the population. At the same time, the morbidity rate in the foci of tuberculosis infection rose among children, adolescents, and adults alike, which provides a rationale for better implementation of therapeutic and preventive measures.

  2. Fan-out Estimation in Spin-based Quantum Computer Scale-up.

    Science.gov (United States)

    Nguyen, Thien; Hill, Charles D; Hollenberg, Lloyd C L; James, Matthew R

    2017-10-17

    Solid-state spin-based qubits offer good prospects for scaling based on their long coherence times and nexus to large-scale electronic scale-up technologies. However, high-threshold quantum error correction requires a two-dimensional qubit array operating in parallel, posing significant challenges in fabrication and control. While architectures incorporating distributed quantum control meet this challenge head-on, most designs rely on individual control and readout of all qubits with high gate densities. We analysed the fan-out routing overhead of a dedicated control line architecture, basing the analysis on a generalised solid-state spin qubit platform parameterised to encompass Coulomb confined (e.g. donor based spin qubits) or electrostatically confined (e.g. quantum dot based spin qubits) implementations. The spatial scalability under this model is estimated using standard electronic routing methods and present-day fabrication constraints. Based on reasonable assumptions for qubit control and readout we estimate 10²-10⁵ physical qubits, depending on the quantum interconnect implementation, can be integrated and fanned-out independently. Assuming relatively long control-free interconnects the scalability can be extended. Ultimately, the universal quantum computation may necessitate a much higher number of integrated qubits, indicating that higher dimensional electronics fabrication and/or multiplexed distributed control and readout schemes may be the preferred strategy for large-scale implementation.

  3. Improving reliability of state estimation programming and computing suite based on analyzing a fault tree

    Directory of Open Access Journals (Sweden)

    Kolosok Irina

    2017-01-01

    Full Text Available Reliable information on the current state parameters, obtained by processing measurements from the SCADA and WAMS data acquisition systems with state estimation (SE) methods, is a prerequisite for successfully managing an electric power system (EPS). SCADA and WAMS systems themselves, like any technical systems, are subject to failures and faults that lead to distortion and loss of information. The SE procedure can detect erroneous measurements and therefore acts as a barrier preventing distorted information from penetrating into control problems. At the same time, the programming and computing suite (PCS) implementing the SE functions may itself produce wrong results due to imperfections and errors in the software algorithms. In this study, we propose to use a fault tree to analyze the consequences of failures and faults in SCADA and WAMS and in the SE procedure itself. Based on the analysis of the obtained measurement information and of the SE results, we determine the fault tolerance level of the state estimation PCS, which characterizes its reliability.

  4. A new formula for estimation of standard liver volume using computed tomography-measured body thickness.

    Science.gov (United States)

    Ma, Ka Wing; Chok, Kenneth S H; Chan, Albert C Y; Tam, Henry S C; Dai, Wing Chiu; Cheung, Tan To; Fung, James Y Y; Lo, Chung Mau

    2017-09-01

    The objective of this article is to derive a more accurate and easy-to-use formula for finding estimated standard liver volume (ESLV) using novel computed tomography (CT) measurement parameters. New formulas for ESLV have been emerging that aim to improve the accuracy of estimation. However, many of these formulas contain body surface area measurements and logarithms in the equations that lead to a more complicated calculation. In addition, substantial errors in ESLV using these old formulas have been shown. An improved version of the formula for ESLV is needed. This is a retrospective cohort of consecutive living donor liver transplantations from 2005 to 2016. Donors were randomly assigned to either the formula derivation or validation groups. Total liver volume (TLV) measured by CT was used as the reference for a linear regression analysis against various patient factors. The derived formula was compared with the existing formulas. There were 722 patients (197 from the derivation group, 164 from the validation group, and 361 from the recipient group) involved in the study. The donor's body weight (odds ratio [OR], 10.42; 95% confidence interval [CI], 7.25-13.60; P Liver Transplantation 23 1113-1122 2017 AASLD. © 2017 by the American Association for the Study of Liver Diseases.

  5. Interaction forces model on a bubble growing for nuclear best estimate computer codes

    International Nuclear Information System (INIS)

    Espinosa-Paredes, Gilberto; Nunez-Carrera, Alejandro; Martinez-Mendez, Elizabeth J.

    2005-01-01

    This paper presents a mathematical model that takes into account the bubble radius variation that takes place in a boiling water nuclear reactor during transients with changes in the pressure vessel, changes in the inlet core mass flow rate, density-wave phenomena or flow regime instability. The model with expansion effects was developed considering the interaction force between a dilute dispersion of gas bubbles and a continuous liquid phase. The closure relationships were formulated as an associated problem for the spatial deviation around the averaged variables as a function of known variables. In order to solve the closure problem, a geometric model given by an eccentric unit cell was applied as an approximation of the heterogeneous structure of the two-phase flow. The closure relationship includes additional terms that represent combined effects of translation and pulsation, due to displacement and size variation of the bubbles, respectively. This result can be implemented straightforwardly in best-estimate thermal-hydraulics models. As an example, the implementation of the closure relationships into the TRAC best-estimate computer code is presented

  6. Planning Committee for a National Resource for Computation in Chemistry. Final report, October 1, 1974--June 30, 1977

    International Nuclear Information System (INIS)

    1978-11-01

    The Planning Committee for a National Resource for Computation in Chemistry (NRCC) was charged with the responsibility of formulating recommendations regarding organizational structure for an NRCC including the composition, size, and responsibilities of its policy board, the relationship of such a board to the operating structure of the NRCC, to federal funding agencies, and to user groups; desirable priorities, growth rates, and levels of operations for the first several years; and facilities, access and site requirements for such a Resource. By means of site visits, questionnaires, and a workshop, the Committee sought advice from a wide range of potential users and organizations interested in chemical computation. Chemical kinetics, crystallography, macromolecular science, nonnumerical methods, physical organic chemistry, quantum chemistry, and statistical mechanics are covered

  7. Planning Committee for a National Resource for Computation in Chemistry. Final report, October 1, 1974--June 30, 1977

    Energy Technology Data Exchange (ETDEWEB)

    Bigeleisen, Jacob; Berne, Bruce J.; Coton, F. Albert; Scheraga, Harold A.; Simmons, Howard E.; Snyder, Lawrence C.; Wiberg, Kenneth B.; Wipke, W. Todd

    1978-11-01

    The Planning Committee for a National Resource for Computation in Chemistry (NRCC) was charged with the responsibility of formulating recommendations regarding organizational structure for an NRCC including the composition, size, and responsibilities of its policy board, the relationship of such a board to the operating structure of the NRCC, to federal funding agencies, and to user groups; desirable priorities, growth rates, and levels of operations for the first several years; and facilities, access and site requirements for such a Resource. By means of site visits, questionnaires, and a workshop, the Committee sought advice from a wide range of potential users and organizations interested in chemical computation. Chemical kinetics, crystallography, macromolecular science, nonnumerical methods, physical organic chemistry, quantum chemistry, and statistical mechanics are covered.

  8. An operational weather radar-based Quantitative Precipitation Estimation and its application in catchment water resources modeling

    DEFF Research Database (Denmark)

    He, Xin; Vejen, Flemming; Stisen, Simon

    2011-01-01

    of precipitation compared with rain-gauge-based methods, thus providing the basis for better water resources assessments. The radar QPE algorithm called ARNE is a distance-dependent areal estimation method that merges radar data with ground surface observations. The method was applied to the Skjern River catchment...... in western Denmark where alternative precipitation estimates were also used as input to an integrated hydrologic model. The hydrologic responses from the model were analyzed by comparing radar- and ground-based precipitation input scenarios. Results showed that radar QPE products are able to generate...... reliable simulations of stream flow and water balance. The potential of using radar-based precipitation was found to be especially high at a smaller scale, where the impact of spatial resolution was evident from the stream discharge results. Also, groundwater recharge was shown to be sensitive...

  9. A Transmission-Cost-Based Model to Estimate the Amount of Market-Integrable Wind Resources

    DEFF Research Database (Denmark)

    Morales González, Juan Miguel; Pinson, Pierre; Madsen, Henrik

    2012-01-01

    are made to share the expenses in transmission derived from their integration, they may see the doors of electricity markets closed for not being competitive enough. This paper presents a model to decide the amount of wind resources that are economically exploitable at a given location from a transmission......In the pursuit of the large-scale integration of wind power production, it is imperative to evaluate plausible frictions among the stochastic nature of wind generation, electricity markets, and the investments in transmission required to accommodate larger amounts of wind. If wind producers......-cost perspective. This model accounts for the uncertain character of wind by using a modeling framework based on stochastic optimization, simulates market barriers by means of a bi-level structure, and considers the financial risk of investments in transmission through the conditional value-at-risk. The major...

  10. Bragg peak prediction from quantitative proton computed tomography using different path estimates

    International Nuclear Information System (INIS)

    Wang Dongxu; Mackie, T Rockwell; Tome, Wolfgang A

    2011-01-01

    This paper characterizes the performance of the straight-line path (SLP) and cubic spline path (CSP) as path estimates used in reconstruction of proton computed tomography (pCT). The GEANT4 Monte Carlo simulation toolkit is employed to simulate the imaging phantom and proton projections. SLP, CSP and the most-probable path (MPP) are constructed based on the entrance and exit information of each proton. The physical deviations of SLP, CSP and MPP from the real path are calculated. Using a conditional proton path probability map, the relative probability of SLP, CSP and MPP are calculated and compared. The depth dose and Bragg peak are predicted on the pCT images reconstructed using SLP, CSP, and MPP and compared with the simulation result. The root-mean-square physical deviations and the cumulative distribution of the physical deviations show that the performance of CSP is comparable to MPP while SLP is slightly inferior. About 90% of the SLP pixels and 99% of the CSP pixels lie in the 99% relative probability envelope of the MPP. Even at an imaging dose of ∼0.1 mGy the proton Bragg peak for a given incoming energy can be predicted on the pCT image reconstructed using SLP, CSP, or MPP with 1 mm accuracy. This study shows that SLP and CSP, like MPP, are adequate path estimates for pCT reconstruction, and therefore can be chosen as the path estimation method for pCT reconstruction, which can aid the treatment planning and range prediction of proton radiation therapy.
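
    A cubic spline path of the kind evaluated here can be formed as a cubic Hermite curve that honors the measured entrance and exit positions and directions of each proton. The sketch below is a generic construction with made-up coordinates and a chord-length tangent scaling, not the reconstruction code used in the study.

    ```python
    import numpy as np

    def cubic_spline_path(p_in, d_in, p_out, d_out, n_points=100):
        """Cubic Hermite curve from entry/exit positions and unit directions.

        p_in, p_out : entry and exit positions (mm)
        d_in, d_out : entry and exit direction unit vectors
        Tangents are scaled by the chord length (a common convention).
        """
        p_in, d_in = np.asarray(p_in, float), np.asarray(d_in, float)
        p_out, d_out = np.asarray(p_out, float), np.asarray(d_out, float)
        chord = np.linalg.norm(p_out - p_in)
        t = np.linspace(0.0, 1.0, n_points)[:, None]
        h00 = 2 * t**3 - 3 * t**2 + 1
        h10 = t**3 - 2 * t**2 + t
        h01 = -2 * t**3 + 3 * t**2
        h11 = t**3 - t**2
        return h00 * p_in + h10 * chord * d_in + h01 * p_out + h11 * chord * d_out

    # Example: a proton enters at x=0 heading in +x and exits displaced in y (mm)
    path = cubic_spline_path([0, 0], [1, 0], [200, 3], [0.995, 0.10])
    print(path[:3])
    ```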

  11. Bragg peak prediction from quantitative proton computed tomography using different path estimates

    Energy Technology Data Exchange (ETDEWEB)

    Wang Dongxu; Mackie, T Rockwell; Tome, Wolfgang A, E-mail: tome@humonc.wisc.edu [Department of Medical Physics, University of Wisconsin School of Medicine and Public Health, Madison, WI 53705 (United States)

    2011-02-07

    This paper characterizes the performance of the straight-line path (SLP) and cubic spline path (CSP) as path estimates used in reconstruction of proton computed tomography (pCT). The GEANT4 Monte Carlo simulation toolkit is employed to simulate the imaging phantom and proton projections. SLP, CSP and the most-probable path (MPP) are constructed based on the entrance and exit information of each proton. The physical deviations of SLP, CSP and MPP from the real path are calculated. Using a conditional proton path probability map, the relative probability of SLP, CSP and MPP are calculated and compared. The depth dose and Bragg peak are predicted on the pCT images reconstructed using SLP, CSP, and MPP and compared with the simulation result. The root-mean-square physical deviations and the cumulative distribution of the physical deviations show that the performance of CSP is comparable to MPP while SLP is slightly inferior. About 90% of the SLP pixels and 99% of the CSP pixels lie in the 99% relative probability envelope of the MPP. Even at an imaging dose of ~0.1 mGy the proton Bragg peak for a given incoming energy can be predicted on the pCT image reconstructed using SLP, CSP, or MPP with 1 mm accuracy. This study shows that SLP and CSP, like MPP, are adequate path estimates for pCT reconstruction, and therefore can be chosen as the path estimation method for pCT reconstruction, which can aid the treatment planning and range prediction of proton radiation therapy.

  12. Bragg peak prediction from quantitative proton computed tomography using different path estimates

    Science.gov (United States)

    Wang, Dongxu; Mackie, T Rockwell

    2015-01-01

    This paper characterizes the performance of the straight-line path (SLP) and cubic spline path (CSP) as path estimates used in reconstruction of proton computed tomography (pCT). The GEANT4 Monte Carlo simulation toolkit is employed to simulate the imaging phantom and proton projections. SLP, CSP and the most-probable path (MPP) are constructed based on the entrance and exit information of each proton. The physical deviations of SLP, CSP and MPP from the real path are calculated. Using a conditional proton path probability map, the relative probability of SLP, CSP and MPP are calculated and compared. The depth dose and Bragg peak are predicted on the pCT images reconstructed using SLP, CSP, and MPP and compared with the simulation result. The root-mean-square physical deviations and the cumulative distribution of the physical deviations show that the performance of CSP is comparable to MPP while SLP is slightly inferior. About 90% of the SLP pixels and 99% of the CSP pixels lie in the 99% relative probability envelope of the MPP. Even at an imaging dose of ~0.1 mGy the proton Bragg peak for a given incoming energy can be predicted on the pCT image reconstructed using SLP, CSP, or MPP with 1 mm accuracy. This study shows that SLP and CSP, like MPP, are adequate path estimates for pCT reconstruction, and therefore can be chosen as the path estimation method for pCT reconstruction, which can aid the treatment planning and range prediction of proton radiation therapy. PMID:21212472

  13. Computational method for estimating boundary of abdominal subcutaneous fat for absolute electrical impedance tomography.

    Science.gov (United States)

    Yamaguchi, Tohru F; Okamoto, Yoshiwo

    2018-01-01

    Abdominal fat accumulation is considered an essential indicator of human health. Electrical impedance tomography has considerable potential for abdominal fat imaging because of the low specific conductivity of human body fat. In this paper, we propose a robust reconstruction method for high-fidelity conductivity imaging by abstraction of the abdominal cross section using a relatively small number of parameters. Toward this end, we assume homogeneous conductivity in the abdominal subcutaneous fat area and characterize its geometrical shape by parameters defined as the ratio of the distance from the center to the boundary of the subcutaneous fat to the distance from the center to the outer boundary in 64 equiangular directions. To estimate the shape parameters, the sensitivity of the noninvasively measured voltages with respect to the shape parameters is formulated for numerical optimization. Numerical simulations are conducted to demonstrate the validity of the proposed method. A 3-dimensional finite element method is used to construct a computer model of the human abdomen. The inverse problems of shape parameters and conductivities are solved concurrently by iterative forward and inverse calculations. As a result, conductivity images are reconstructed with a small systematic error of less than 1% for the estimation of the subcutaneous fat area. A novel method is devised for estimating the boundary of the abdominal subcutaneous fat. The fidelity of the overall reconstructed image to the reference image is significantly improved. The results demonstrate the possibility of realization of an abdominal fat scanner as a low-cost, radiation-free medical device. Copyright © 2017 John Wiley & Sons, Ltd.
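
    The shape parameterization described, 64 equiangular ratios of the center-to-fat-boundary distance to the center-to-outer-boundary distance, can be turned back into a boundary curve as in the sketch below; the elliptical torso outline and the uniform 0.8 ratio are placeholder assumptions.

    ```python
    import numpy as np

    def fat_boundary_from_ratios(outer_radius_fn, ratios):
        """Reconstruct the subcutaneous-fat boundary from shape parameters.

        ratios : 64 values, each the ratio of (center-to-fat-boundary distance)
                 to (center-to-outer-boundary distance) in one direction.
        outer_radius_fn : callable giving the outer-boundary radius at angle theta
                          (a placeholder for the measured torso outline).
        Returns (x, y) coordinates of the estimated fat boundary.
        """
        angles = np.linspace(0.0, 2.0 * np.pi, len(ratios), endpoint=False)
        outer = np.array([outer_radius_fn(a) for a in angles])
        inner = outer * np.asarray(ratios)
        return inner * np.cos(angles), inner * np.sin(angles)

    # Illustration: elliptical torso outline (semi-axes 150 mm and 100 mm), uniform 80% ratio
    torso = lambda a: 150.0 * 100.0 / np.hypot(100.0 * np.cos(a), 150.0 * np.sin(a))
    x, y = fat_boundary_from_ratios(torso, np.full(64, 0.8))
    print(x[:4], y[:4])
    ```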

  14. Estimation of effective doses in pediatric X-ray computed tomography examination.

    Science.gov (United States)

    Obara, Hideki; Takahashi, Midori; Kudou, Kazuya; Mariya, Yasushi; Takai, Yoshihiro; Kashiwakura, Ikuo

    2017-11-01

    X-ray computed tomography (CT) images are used for diagnostic and therapeutic purposes in various medical disciplines. In Japan, the number of facilities that own diagnostic CT equipment, the number of CT examinations and the number of CT scanners increased by ~1.4-fold between 2005 and 2011. CT operators (medical radiological technologists, medical physicists and physicians) must understand the effective doses for examinations at their own institutions and carefully approach each examination. In addition, the patients undergoing the examination (as well as their families) must understand the effective dose of each examination in the context of the cumulative dose. In the present study, the numbers of pediatric patients (aged 0-5 years) and total patients who underwent CT at Hirosaki University Hospital (Hirosaki, Japan) between January 2011 and December 2013 were surveyed, and the effective doses administered to children aged 0, 1 and 5 years were evaluated. Age- and region-specific conversion factors and dose-length products obtained from the CT scanner were used to estimate the effective doses. The numbers of CT examinations performed in 2011, 2012 and 2013 were 16,662, 17,491 and 17,649, respectively, of which 613 (1.2% of the overall total) involved children aged 0-5 years. The estimated effective doses per examination to children aged 0, 1 and 5 years were 6.3±4.8, 4.9±3.8 and 2.7±3.0 mSv, respectively. This large variation was attributed to several factors associated with scan methods and scan ranges in the actual clinical setting. In conclusion, the requirement for individual patient prospective exposure management systems and estimations of low-dose radiation exposure should be considered in light of the harmful effects of exposure.
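
    The dose estimate described, applying age- and region-specific conversion factors to the scanner-reported dose-length product, reduces to E = k × DLP. The sketch below uses example head-CT conversion factors that stand in for whichever published table the study actually used.

    ```python
    # Effective dose from a CT dose-length product: E [mSv] = k [mSv/(mGy*cm)] * DLP [mGy*cm].
    # The conversion factors below are example head-CT values used for illustration only;
    # the study applied its own age- and region-specific factors.
    K_HEAD = {0: 0.011, 1: 0.0067, 5: 0.0040}   # assumed example values, mSv/(mGy*cm)

    def effective_dose(dlp_mgy_cm, age_years, k_table=K_HEAD):
        """Return the estimated effective dose in mSv for a given DLP and patient age."""
        return k_table[age_years] * dlp_mgy_cm

    print(f"{effective_dose(300.0, 1):.2f} mSv")  # e.g. a 300 mGy*cm head scan of a 1-year-old
    ```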

  15. BATEMANATER: a computer program to estimate and bootstrap mating system variables based on Bateman's principles.

    Science.gov (United States)

    Jones, Adam G

    2015-11-01

    Bateman's principles continue to play a major role in the characterization of genetic mating systems in natural populations. The modern manifestations of Bateman's ideas include the opportunity for sexual selection (i.e. I(s) - the variance in relative mating success), the opportunity for selection (i.e. I - the variance in relative reproductive success) and the Bateman gradient (i.e. β(ss) - the slope of the least-squares regression of reproductive success on mating success). These variables serve as the foundation for one convenient approach for the quantification of mating systems. However, their estimation presents at least two challenges, which I address here with a new Windows-based computer software package called BATEMANATER. The first challenge is that confidence intervals for these variables are not easy to calculate. BATEMANATER solves this problem using a bootstrapping approach. The second, more serious, problem is that direct estimates of mating system variables from open populations will typically be biased if some potential progeny or adults are missing from the analysed sample. BATEMANATER addresses this problem using a maximum-likelihood approach to estimate mating system variables from incompletely sampled breeding populations. The current version of BATEMANATER addresses the problem for systems in which progeny can be collected in groups of half- or full-siblings, as would occur when eggs are laid in discrete masses or offspring occur in pregnant females. BATEMANATER has a user-friendly graphical interface and thus represents a new, convenient tool for the characterization and comparison of genetic mating systems. © 2015 John Wiley & Sons Ltd.
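
    The three Bateman variables named above can be computed directly from per-individual counts of mates and offspring, with bootstrap confidence intervals obtained by resampling individuals; the sketch below mirrors that idea on invented data and is not the BATEMANATER implementation (in particular, it ignores the incomplete-sampling problem that the program's maximum-likelihood approach addresses).

    ```python
    import numpy as np

    def bateman_metrics(mates, offspring):
        """Opportunity for sexual selection I_s, opportunity for selection I,
        and Bateman gradient beta_ss from per-individual counts."""
        mates, offspring = np.asarray(mates, float), np.asarray(offspring, float)
        rel_ms = mates / mates.mean()            # relative mating success
        rel_rs = offspring / offspring.mean()    # relative reproductive success
        I_s = rel_ms.var()
        I = rel_rs.var()
        beta_ss = np.cov(rel_ms, rel_rs, bias=True)[0, 1] / I_s   # least-squares slope
        return I_s, I, beta_ss

    def bootstrap_ci(mates, offspring, n_boot=2000, alpha=0.05, seed=1):
        """Percentile bootstrap confidence intervals for (I_s, I, beta_ss)."""
        mates, offspring = np.asarray(mates, float), np.asarray(offspring, float)
        rng = np.random.default_rng(seed)
        stats = []
        while len(stats) < n_boot:
            s = rng.integers(0, len(mates), size=len(mates))
            if mates[s].var() == 0.0:            # degenerate resample: slope undefined
                continue
            stats.append(bateman_metrics(mates[s], offspring[s]))
        lo, hi = np.percentile(stats, [100 * alpha / 2, 100 * (1 - alpha / 2)], axis=0)
        return lo, hi

    mates = [0, 1, 1, 2, 3, 4, 1, 2]             # hypothetical mating success per male
    offspring = [0, 10, 8, 22, 35, 41, 12, 18]   # hypothetical reproductive success
    print(bateman_metrics(mates, offspring))
    print(bootstrap_ci(mates, offspring))
    ```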

  16. Model for Estimating Power and Downtime Effects on Teletherapy Units in Low-Resource Settings

    Directory of Open Access Journals (Sweden)

    Rachel McCarroll

    2017-10-01

    Full Text Available Purpose: More than 6,500 megavoltage teletherapy units are needed worldwide, many in low-resource settings. Cobalt-60 units or linear accelerators (linacs) can fill this need. We have evaluated machine performance on the basis of patient throughput to provide insight into machine viability under various conditions in such a way that conclusions can be generalized to a vast array of clinical scenarios. Materials and Methods: Data from patient treatment plans, peer-reviewed studies, and international organizations were combined to assess the relative patient throughput of linacs and cobalt-60 units that deliver radiotherapy with standard techniques under various power and maintenance support conditions. Data concerning the frequency and duration of power outages and downtime characteristics of the machines were used to model teletherapy operation in low-resource settings. Results: Modeled average daily throughput was decreased for linacs because of lack of power infrastructure and for cobalt-60 units because of limited and decaying source strength. For conformal radiotherapy delivered with multileaf collimators, average daily patient throughput over 8 years of operation was equal for cobalt-60 units and linacs when an average of 1.83 hours of power outage occurred per 10-hour working day. Relative to conformal treatments delivered with multileaf collimators on the respective machines, the use of advanced techniques on linacs decreased throughput by between 20% and 32%, and, for cobalt machines, the need to manually place blocks reduced throughput by up to 37%. Conclusion: Our patient throughput data indicate that cobalt-60 units are generally best suited for implementation when machine operation might be 70% or less of total operable time because of power outages or mechanical repair. However, each implementation scenario is unique and requires consideration of all variables affecting implementation.
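
    A heavily simplified, assumption-laden version of such a throughput comparison can be written as a daily patient count limited by operable hours: power outages idle a linac, while source decay lengthens cobalt-60 treatment times. The treatment times and decay handling below are illustrative guesses, not the parameters of the published model.

    ```python
    COBALT_HALF_LIFE_Y = 5.27

    def daily_throughput_linac(workday_h=10.0, outage_h=1.83, minutes_per_patient=15.0):
        """Patients per day for a linac that cannot treat during power outages."""
        return (workday_h - outage_h) * 60.0 / minutes_per_patient

    def daily_throughput_cobalt(years_since_install, workday_h=10.0,
                                minutes_per_patient_fresh=15.0):
        """Patients per day for a cobalt-60 unit whose treatment time grows as the source decays."""
        decay = 0.5 ** (years_since_install / COBALT_HALF_LIFE_Y)
        return workday_h * 60.0 * decay / minutes_per_patient_fresh

    for year in (0, 2, 4, 6, 8):
        print(f"year {year}: linac {daily_throughput_linac():.0f}/day, "
              f"cobalt-60 {daily_throughput_cobalt(year):.0f}/day")
    ```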

  17. National Uranium Resource Evaluation Program. Hydrogeochemical and Stream Sediment Reconnaissance Basic Data Reports Computer Program Requests Manual

    International Nuclear Information System (INIS)

    1980-01-01

    This manual is intended to aid those who are unfamiliar with ordering computer output for verification and preparation of Uranium Resource Evaluation (URE) Project reconnaissance basic data reports. The manual is also intended to help standardize the procedures for preparing the reports. Each section describes a program or group of related programs. The sections are divided into three parts: Purpose, Request Forms, and Requested Information

  18. Resource Estimations in Contingency Planning for Foot-And-Mouth Disease

    DEFF Research Database (Denmark)

    Boklund, Anette; Sten, Mortensen; Holm Johansen, Maren

    Preparedness planning for a veterinary crisis is important for fast and effective eradication of disease. For countries with a large export of animals and animal products, each extra day in an epidemic will cost millions of euros due to the closure of export markets. This is important.... It was estimated that the need for personnel would peak on day 7 with a requirement of approximately 170 veterinarians, 70 technicians and 45 administrative staff. However, the need for personnel in the Danish Emergency Management Agency (responsible for the hygiene barrier and initial cleaning and disinfection...

  19. Multidimensional Environmental Data Resource Brokering on Computational Grids and Scientific Clouds

    Science.gov (United States)

    Montella, Raffaele; Giunta, Giulio; Laccetti, Giuliano

    Grid computing has evolved widely over the past years, and its capabilities have found their way even into business products and are no longer relegated to scientific applications. Today, grid computing technology is not restricted to a set of specific grid open source or industrial products; rather, it comprises a set of capabilities that can reside virtually within any kind of software to create shared and highly collaborative production environments. These environments are focused on computational (workload) capabilities and the integration of information (data) into those computational capabilities. An active grid computing application field is the full virtualization of scientific instruments in order to increase their availability and decrease operational and maintenance costs. Computational and information grids make it possible to manage real-world objects in a service-oriented way using widespread industrial standards.

  20. A REVIEW ON SECURITY ISSUES AND CHALLENGES IN CLOUD COMPUTING MODEL OF RESOURCE MANAGEMENT

    OpenAIRE

    T. Vaikunth Pai; Dr. P. S. Aithal

    2017-01-01

    Cloud computing services refer to a set of IT-enabled services delivered to a customer over the Internet on a leased basis, with the capability to scale service requirements up or down according to need. Usually, cloud computing services are delivered by third-party vendors who own the infrastructure. Cloud computing has several advantages, including scalability, elasticity, flexibility, efficiency and the outsourcing of non-core activities of an organization. Cloud computing offers an innovative busines...

  1. COPATH - a spreadsheet model for the estimation of carbon flows associated with the use of forest resources

    International Nuclear Information System (INIS)

    Makundi, W.; Sathaye, J.; Ketoff, A.

    1995-01-01

    The forest sector plays a key role in the global climate change process. A significant amount of net greenhouse gas emissions emanates from land use changes, and the sector offers a unique opportunity to sequester carbon in vegetation, detritus, soils and forest products. However, estimates of the carbon flows associated with the use of forest resources have been quite imprecise. This paper describes a methodological framework, COPATH, a spreadsheet model for estimating carbon emissions and sequestration from deforestation and harvesting of forests. The model has two parts: the first estimates carbon stocks, emissions and uptake in the base year, while the second forecasts future emissions and uptake under various scenarios. The forecast module is structured after the main modes of forest conversion, i.e. agriculture, pasture, forest harvesting and other land uses. The model can be used by countries which may not possess an abundance of pertinent data, and allows for the use of forest inventory data to estimate carbon stocks. The choice of the most likely scenario provides the country with a carbon flux profile necessary to formulate GHG mitigation strategies. (Author)
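
    In the spirit of a COPATH-style base-year calculation, carbon stocks reduce to areas multiplied by per-hectare carbon densities, summed over land classes and pools; all areas and densities in the sketch below are placeholders, not values from the paper.

    ```python
    # Base-year carbon stock in the spirit of a COPATH-style spreadsheet:
    # stock (t C) = area (ha) * carbon density (t C/ha), summed over land classes and pools.
    # All numbers below are illustrative placeholders.
    forest_classes = {
        #  name            area_ha   vegetation  detritus  soil   (t C/ha)
        "closed_forest": (1_200_000,       120,       15,   90),
        "open_woodland": (  800_000,        45,        6,   60),
        "plantation":    (  150_000,        70,        5,   50),
    }

    def base_year_stock(classes):
        total = 0.0
        for name, (area, veg, det, soil) in classes.items():
            total += area * (veg + det + soil)
        return total

    print(f"Base-year carbon stock: {base_year_stock(forest_classes) / 1e6:.1f} Mt C")
    ```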

  2. Value of Clean Water Resources: Estimating the Water Quality Improvement in Metro Manila, Philippines

    Directory of Open Access Journals (Sweden)

    Shokhrukh-Mirzo Jalilov

    2017-12-01

    Full Text Available While having many positive impacts, the tremendous economic performance and rapid industrial expansion of the last decades in the Philippines have had negative effects that have resulted in unfavorable hydrological and ecological changes in most urban river systems and have created environmental problems. Usually, these effects would not be part of a systematic assessment of urban water benefits. To address the issue, this study investigates the relationship between poor water quality and residents' willingness to pay (WTP) for improved water quality in Metro Manila. By employing a contingent valuation method (CVM), this paper estimates the benefits of the provision of clean water quality (swimmable and fishable) in waterbodies of Metro Manila for its residents. Face-to-face interviews were completed with 240 randomly selected residents. Residents expressed a mean WTP of PHP102.44 (USD2.03) for swimmable water quality (good quality) and a mean WTP of PHP102.39 (USD2.03) for fishable water quality (moderate quality). The aggregation of these mean willingness-to-pay values amounted to annual economic benefits of PHP9.443 billion to PHP9.447 billion (approx. USD190 million) per year for all taxpayers in Metro Manila. As expected, these estimates could inform local decision-makers about the benefits of future policy interventions aimed at improving the quality of waterbodies in Metro Manila.

  3. Use of GRACE Terrestrial Water Storage Retrievals to Evaluate Model Estimates by the Australian Water Resources Assessment System

    Science.gov (United States)

    van Dijk, A. I. J. M.; Renzullo, L. J.; Rodell, M.

    2011-01-01

    Terrestrial water storage (TWS) retrievals from the Gravity Recovery and Climate Experiment (GRACE) satellite mission were compared to TWS modeled by the Australian Water Resources Assessment (AWRA) system. The aim was to test whether differences could be attributed and used to identify model deficiencies. Data for 2003-2010 were decomposed into the seasonal cycle, linear trends and the remaining de-trended anomalies before comparison. AWRA tended to have smaller seasonal amplitude than GRACE. GRACE showed a strong (greater than 15 millimeters per year) drying trend in northwest Australia that was associated with a preceding period of unusually wet conditions, whereas weaker drying trends in the southern Murray Basin and southwest Western Australia were associated with relatively dry conditions. AWRA estimated trends were less negative for these regions, while a more positive trend was estimated for areas affected by cyclone Charlotte in 2009. For 2003-2009, a decrease of 7-8 millimeters per year (50-60 cubic kilometers per year) was estimated from GRACE, enough to explain 6-7% of the contemporary rate of global sea level rise. This trend was not reproduced by the model. Agreement between model and data suggested that the GRACE retrieval error estimates are biased high. A scaling coefficient applied to GRACE TWS to reduce the effect of signal leakage appeared to degrade quantitative agreement for some regions. Model aspects identified for improvement included a need for better estimation of rainfall in northwest Australia, and more sophisticated treatment of diffuse groundwater discharge processes and surface-groundwater connectivity for some regions.
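
    The decomposition into a seasonal cycle, a linear trend and de-trended anomalies can be prototyped with a single least-squares fit of a trend plus annual harmonic, as sketched below on a synthetic monthly series; this is only an illustration of the idea, not the processing chain used by the authors.

    ```python
    import numpy as np

    def decompose_tws(t_years, tws_mm):
        """Split a monthly TWS series into linear trend, annual cycle, and residual anomalies
        using one least-squares fit of [1, t, cos, sin] basis functions."""
        t = np.asarray(t_years, float)
        y = np.asarray(tws_mm, float)
        X = np.column_stack([np.ones_like(t), t,
                             np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        trend = coef[0] + coef[1] * t
        seasonal = X[:, 2:] @ coef[2:]
        residual = y - trend - seasonal
        return coef[1], trend, seasonal, residual   # coef[1] is the trend in mm/yr

    # Synthetic example: -8 mm/yr trend plus a 40 mm annual cycle and noise
    t = np.arange(2003, 2010, 1 / 12)
    y = (-8 * (t - 2003) + 40 * np.sin(2 * np.pi * t)
         + np.random.default_rng(2).normal(0, 10, t.size))
    slope, trend, seasonal, residual = decompose_tws(t, y)
    print(f"estimated trend: {slope:.1f} mm/yr")
    ```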

  4. Tidal Energy Conversion Installation at an Estuarine Bridge Site: Resource Evaluation and Energy Production Estimate

    Science.gov (United States)

    Wosnik, M.; Gagnon, I.; Baldwin, K.; Bell, E.

    2015-12-01

    The "Living Bridge" project aims to create a self-diagnosing, self-reporting "smart bridge" powered by a local renewable energy source, tidal energy - transforming Memorial Bridge, a vertical lift bridge over the tidal Piscataqua River connecting Portsmouth, NH and Kittery, ME, into a living laboratory for researchers, engineers, scientists, and the community. The Living Bridge project includes the installation of a tidal turbine at the Memorial Bridge. The energy converted by the turbine will power structural health monitoring, environmental and underwater instrumentation. Utilizing locally available tidal energy can make bridge operation more sustainable, can "harden" transportation infrastructure against prolonged grid outages and can demonstrate a prototype of an "estuarine bridge of the future". A spatio-temporal tidal energy resource assessment was performed using long term bottom-deployed Acoustic Doppler Current Profilers (ADCP) at two locations: near the planned deployment location in 2013-14 for 123 days and mid-channel in 2007 for 35 days. Data were evaluated to determine the amount of available kinetic energy that can be converted into usable electrical energy on the bridge. Changes in available kinetic energy with ebb/flood and spring/neap tidal cycles and electrical energy demand were analyzed. The target deployment site exhibited significantly more energetic ebb tides than flood tides, which can be explained by the local bathymetry of the tidal estuary. A system model is used to calculate the net energy savings using various tidal generator and battery bank configurations. Different resource evaluation methodologies were also analyzed, e.g., using a representative ADCP "bin" vs. a more refined, turbine-geometry-specific methodology, and using static bin height vs. bin height that move w.r.t. the free surface throughout a tidal cycle (representative of a bottom-fixed or floating turbine deployment, respectively). ADCP operating frequencies and bin

  5. Radon estimation in water resources of Mandi - Dharamshala region of Himachal Pradesh, India for health risk assessments

    Science.gov (United States)

    Kumar, Gulshan; Kumari, Punam; Kumar, Mukesh; Kumar, Arvind; Prasher, Sangeeta; Dhar, Sunil

    2017-07-01

    The present study deals with the estimation of radon in 40 water samples collected from different natural resources and of the radium content in the soils of the Mandi-Dharamshala region. The radon concentration is determined using a RAD-7 detector, and the radium content of the soil in the vicinity of the water resources is also measured using an LR-115 type-II detector, which is further correlated with the radon concentration in the water samples. The potential health risks associated with 222Rn have also been estimated. The results show radon concentrations in the range of 1.51 to 22.7 Bq/l with an average value of 5.93 Bq/l for all types of water samples taken from the study area. The radon concentration in the water samples is lower than 100 Bq/l, the exposure limit of radon in water recommended by the World Health Organization. The calculated average effective dose of radon received by the people of the study area is 0.022 mSv/y, with a maximum of 0.083 mSv/y and a minimum of 0.0056 mSv/y. The total effective dose at all sites of the studied area is found to be within the safe limit (0.1 mSv/year) recommended by the World Health Organization. The average value of the radium content in the soil of the study area is 6.326 Bq/kg.
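
    The annual effective dose from ingested radon is commonly estimated as concentration × annual water intake × dose conversion factor. The intake and conversion factor in the sketch below are typical reference values stated here as assumptions; they are not necessarily the exact parameters used in the study.

    ```python
    def radon_ingestion_dose(c_rn_bq_per_l, intake_l_per_y=730.0, dcf_sv_per_bq=3.5e-9):
        """Annual effective dose (mSv/y) from radon ingested with drinking water.

        c_rn_bq_per_l : radon concentration in water (Bq/l)
        intake_l_per_y, dcf_sv_per_bq : assumed annual intake and dose conversion factor
        (typical reference values; the study's exact parameters may differ).
        """
        return c_rn_bq_per_l * intake_l_per_y * dcf_sv_per_bq * 1e3  # Sv -> mSv

    for c in (1.51, 5.93, 22.7):   # minimum, mean, and maximum reported concentrations
        print(f"{c:5.2f} Bq/l -> {radon_ingestion_dose(c):.4f} mSv/y")
    ```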

  6. Everglades Depth Estimation Network (EDEN)—A decade of serving hydrologic information to scientists and resource managers

    Science.gov (United States)

    Patino, Eduardo; Conrads, Paul; Swain, Eric; Beerens, James M.

    2017-10-30

    IntroductionThe Everglades Depth Estimation Network (EDEN) provides scientists and resource managers with regional maps of daily water levels and depths in the freshwater part of the Greater Everglades landscape. The EDEN domain includes all or parts of five Water Conservation Areas, Big Cypress National Preserve, Pennsuco Wetlands, and Everglades National Park. Daily water-level maps are interpolated from water-level data at monitoring gages, and depth is estimated by using a digital elevation model of the land surface. Online datasets provide time series of daily water levels at gages and rainfall and evapotranspiration data (https://sofia.usgs.gov/eden/). These datasets are used by scientists and resource managers to guide large-scale field operations, describe hydrologic changes, and support biological and ecological assessments that measure ecosystem response to the implementation of the Comprehensive Everglades Restoration Plan. EDEN water-level data have been used in a variety of biological and ecological studies including (1) the health of American alligators as a function of water depth, (2) the variability of post-fire landscape dynamics in relation to water depth, (3) the habitat quality for wading birds with dynamic habitat selection, and (4) an evaluation of the habitat of the Cape Sable seaside sparrow.

  7. Estimation of the resource buffers in the assembly process of a shearer machine in the CPPM method

    Directory of Open Access Journals (Sweden)

    Gwiazda Aleksander

    2017-01-01

    Full Text Available The dynamic development of scheduling systems allows currently executed projects to be improved significantly. Critical Chain Project Management (CCPM) is one of the project management methods based on network planning. It utilizes the concept of a critical chain derived from the Theory of Constraints and allows losses of project time and resources to be avoided. The result is quicker project implementation (20-30%) and a reduced risk level associated with task realization: projects are cheaper, and the risk of cost overruns is significantly reduced. The factors that distinguish the CCPM method from traditional network planning methods are the balancing of resources and the introduction of buffers. Moreover, a key element of CCPM is that task times are reduced from traditional (safe) estimates to realistic ones, and activities start as late as possible in accordance with the ALAP (As Late As Possible) principle. This work presents the management of the assembly process of a shearer machine, taking into account the use of safety buffers and the optimization of the whole project. The estimation of the buffer capacity needed to improve project realization is presented.
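
    Buffer capacity in CCPM is often sized with the root-square-error (SSQ) rule, in which the buffer covers the aggregated difference between safe and aggressive task estimates along the chain. The sketch below applies that rule to an invented task list; the paper's exact estimation procedure may differ.

    ```python
    import math

    def ssq_buffer(tasks):
        """Root-square-error (SSQ) buffer size for a chain of tasks.

        tasks : list of (safe_estimate, aggressive_estimate) durations in days.
        The aggressive estimates are scheduled; the buffer absorbs their aggregated risk.
        """
        return math.sqrt(sum((safe - aggressive) ** 2 for safe, aggressive in tasks))

    # Hypothetical assembly chain of a shearer machine (days): (safe, aggressive)
    critical_chain = [(10, 6), (8, 5), (14, 9), (6, 4), (12, 8)]
    chain_length = sum(aggressive for _, aggressive in critical_chain)
    buffer = ssq_buffer(critical_chain)
    print(f"critical chain: {chain_length} days, project buffer: {buffer:.1f} days")
    ```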

  8. Motion estimation for cardiac functional analysis using two x-ray computed tomography scans.

    Science.gov (United States)

    Fung, George S K; Ciuffo, Luisa; Ashikaga, Hiroshi; Taguchi, Katsuyuki

    2017-09-01

    This work concerns computed tomography (CT)-based cardiac functional analysis (CFA) with a reduced radiation dose. As CT-CFA requires images over the entire heartbeat, the scans are often performed at 10-20% of the tube current settings that are typically used for coronary CT angiography. The large image noise then degrades the accuracy of motion estimation. Moreover, even if the scan is performed during sinus rhythm, the cardiac motion observed in CT images may not be cyclic in patients with atrial fibrillation. In this study, we propose to use data from two CT scans, one for CT angiography at a quiescent phase at a standard dose and the other for CFA over the entire heartbeat at a lower dose. We have made the following four modifications to an image-based cardiac motion estimation method we previously developed for full-dose retrospectively gated coronary CT angiography: (a) a full-dose prospectively gated coronary CT angiography image acquired at the least-motion phase was used as the reference image; (b) a three-dimensional median filter was applied to lower-dose retrospectively gated cardiac images acquired at 20 phases over one heartbeat in order to reduce image noise; (c) the strength of the temporal regularization term was made adaptive; and (d) a one-dimensional temporal filter was applied to the estimated motion vector field in order to decrease jaggy motion patterns. We refer to the conventional method as iME1 and the proposed method as iME2 in this article. Five observers assessed the accuracy of the estimated motion vector field of iME2 and iME1 using a 4-point scale. The observers repeated the assessment with data presented in a new random order 1 week after the first assessment session. The study confirmed that the proposed iME2 was robust against mismatches in noise levels, contrast enhancement levels, and shapes of the chambers. There was a statistically significant difference between iME2 and iME1 (accuracy score, 2.08 ± 0.81 versus 2.77
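
    Two of the listed modifications, the 3-D median filter on the low-dose phase images (b) and the 1-D temporal filter on the estimated motion vector field (d), can be prototyped with standard array filters as sketched below; the kernel sizes are arbitrary and the arrays are synthetic.

    ```python
    import numpy as np
    from scipy.ndimage import median_filter, uniform_filter1d

    rng = np.random.default_rng(0)

    # (b) 3-D median filter to reduce noise in each low-dose cardiac phase image
    phase_image = rng.normal(0.0, 50.0, size=(64, 64, 64))     # synthetic noisy volume (HU)
    denoised = median_filter(phase_image, size=3)

    # (d) 1-D temporal filter along the phase axis of the motion vector field;
    # mvf has shape (n_phases, nz, ny, nx, 3): one displacement vector per voxel and phase
    mvf = rng.normal(0.0, 1.0, size=(20, 16, 16, 16, 3))
    mvf_smoothed = uniform_filter1d(mvf, size=3, axis=0, mode="wrap")  # cyclic in phase

    print(denoised.std() < phase_image.std(), mvf_smoothed.std() < mvf.std())
    ```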

  9. Scientific and practical tools for dealing with water resource estimations for the future

    Directory of Open Access Journals (Sweden)

    D. A. Hughes

    2015-06-01

    Full Text Available Future flow regimes will be different from today's, and imperfect knowledge of present and future climate variations, rainfall-runoff processes and anthropogenic impacts makes them highly uncertain. Future water resources decisions will rely on practical and appropriate simulation tools that are sensitive to changes, can assimilate different types of change information and are flexible enough to accommodate improvements in the understanding of change. They need to include representations of uncertainty and generate information appropriate for uncertain decision-making. This paper presents some examples of the tools that have been developed to address these issues in the southern Africa region. The examples include uncertainty in present-day simulations due to lack of understanding and data, the use of climate change projection data from multiple climate models, and future catchment responses due to both climate and development effects. The conclusions are that the tools and models are largely available and that what we need is more reliable forcing and model evaluation information as well as methods for making decisions with such inevitably uncertain information.

  10. Resource communication: Variability in estimated runoff in a forested area based on different cartographic data sources

    Directory of Open Access Journals (Sweden)

    Laura Fragoso

    2017-10-01

    Full Text Available Aim of study: The goal of this study is to analyse variations in curve number (CN) values produced by different cartographic data sources in a forested watershed, and to determine which of them best fit measured runoff volumes. Area of study: A forested watershed located in western Spain. Material and methods: Four digital cartographic data sources were used to determine the runoff CN in the watershed. Main results: None of the cartographic sources provided all the information necessary to properly determine the CN values. Our proposed methodology, focused on the tree canopy cover, improves the results achieved. Research highlights: The estimation of the CN value in forested areas should be obtained as a function of tree canopy cover, and new calibrated tables should be implemented at a local scale.
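
    For reference, the curve number method converts a storm rainfall depth P into direct runoff Q through the potential retention S = 25400/CN - 254 (in mm). The sketch below shows how strongly the runoff estimate responds to the CN value, which is the quantity that varied between cartographic sources; the CN values and storm depth are hypothetical.

    ```python
    def scs_runoff_mm(p_mm, cn, ia_ratio=0.2):
        """Direct runoff Q (mm) from storm rainfall P (mm) by the SCS curve number method."""
        s = 25400.0 / cn - 254.0          # potential maximum retention (mm)
        ia = ia_ratio * s                 # initial abstraction
        if p_mm <= ia:
            return 0.0
        return (p_mm - ia) ** 2 / (p_mm - ia + s)

    # Hypothetical CN values, as might be derived from different cartographic sources
    for cn in (55, 62, 70, 78):
        print(f"CN={cn}: Q = {scs_runoff_mm(60.0, cn):.1f} mm for a 60 mm storm")
    ```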

  11. Using Free Computational Resources to Illustrate the Drug Design Process in an Undergraduate Medicinal Chemistry Course

    Science.gov (United States)

    Rodrigues, Ricardo P.; Andrade, Saulo F.; Mantoani, Susimaire P.; Eifler-Lima, Vera L.; Silva, Vinicius B.; Kawano, Daniel F.

    2015-01-01

    Advances in, and dissemination of, computer technologies in the field of drug research now enable the use of molecular modeling tools to teach important concepts of drug design to chemistry and pharmacy students. A series of computer laboratories is described to introduce undergraduate students to commonly adopted "in silico" drug design…

  12. University Students and Ethics of Computer Technology Usage: Human Resource Development

    Science.gov (United States)

    Iyadat, Waleed; Iyadat, Yousef; Ashour, Rateb; Khasawneh, Samer

    2012-01-01

    The primary purpose of this study was to determine the level of students' awareness about computer technology ethics at the Hashemite University in Jordan. A total of 180 university students participated in the study by completing the questionnaire designed by the researchers, named the Computer Technology Ethics Questionnaire (CTEQ). Results…

  13. Evaluation of the individual tube current setting in electrocardiogram-gated cardiac computed tomography estimated from plain chest computed tomography using computed tomography automatic exposure control

    International Nuclear Information System (INIS)

    Kishimoto, Junichi; Sakou, Toshio; Ohta, Yasutoshi

    2013-01-01

    The aim of this study was to estimate the tube current for cardiac computed tomography (CT) from a plain chest CT using CT automatic exposure control (CT-AEC), to obtain consistent image noise, and to optimize the scan tube current by individualizing it. Sixty-five patients (Group A) underwent cardiac CT at a fixed tube current. The mAs value for plain chest CT using CT-AEC (AEC value) and the cardiac CT image noise were measured. The tube current needed to obtain the intended level of image noise in cardiac CT was determined from their correlation. Another 65 patients (Group B) underwent cardiac CT with tube currents individually determined from the AEC value. Image noise was compared between Groups A and B. The image noise of cardiac CT in Group B was 24.4±3.1 Hounsfield units (HU) and was more uniform than in Group A (21.2±6.1 HU). The deviation from the desired image noise of 25 HU was lower in Group B (2.4%) than in Group A (15.2%). Individualized tube current selection based on the AEC value thus provided consistent image noise and a scan tube current optimized for cardiac CT. (author)
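
    Because CT image noise scales roughly as 1/sqrt(mAs), a chest-scan AEC value can be mapped to the cardiac-scan tube current needed for a target noise level once the correlation has been calibrated. The calibration constant in the sketch below is invented and merely stands in for the regression derived from Group A.

    ```python
    import math

    def required_cardiac_mas(aec_value_mas, target_noise_hu=25.0, k=400.0):
        """Estimate the cardiac-CT mAs needed for a target noise level.

        Assumes noise ~ k_patient / sqrt(mAs), with the patient-specific constant
        k_patient inferred from the chest CT-AEC value (k is a made-up calibration
        constant standing in for the regression derived from the fixed-current group).
        """
        k_patient = k * math.sqrt(aec_value_mas / 100.0)   # heavier patients -> higher AEC value
        return (k_patient / target_noise_hu) ** 2

    for aec in (80, 120, 180):   # hypothetical chest AEC values (mAs)
        print(f"AEC {aec} mAs -> cardiac scan at {required_cardiac_mas(aec):.0f} mAs")
    ```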

  14. Project Final Report: Ubiquitous Computing and Monitoring System (UCoMS) for Discovery and Management of Energy Resources

    Energy Technology Data Exchange (ETDEWEB)

    Tzeng, Nian-Feng; White, Christopher D.; Moreman, Douglas

    2012-07-14

    The UCoMS research cluster has spearheaded three research areas since August 2004: wireless and sensor networks, Grid computing, and petroleum applications. The primary goals of UCoMS research are three-fold: (1) creating new knowledge to push forward the technology forefronts of pertinent research on the computing and monitoring aspects of energy resource management, (2) developing and disseminating software codes and toolkits for the research community and the public, and (3) establishing system prototypes and testbeds for evaluating innovative techniques and methods. Substantial progress and diverse accomplishments have been made by the research investigators, cooperating across their respective areas of expertise, on such topics as sensors and sensor networks, wireless communication and systems, and computational Grids, particularly as relevant to petroleum applications.

  15. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  16. Validation of computer code TRAFIC used for estimation of charcoal heatup in containment ventilation systems

    International Nuclear Information System (INIS)

    Yadav, D.H.; Datta, D.; Malhotra, P.K.; Ghadge, S.G.; Bajaj, S.S.

    2005-01-01

    Full text of publication follows: Standard Indian PHWRs are provided with a Primary Containment Filtration and Pump-Back (PCFPB) system incorporating charcoal filters in the ventilation circuit to remove radioactive iodine that may be released from the reactor core into the containment during a LOCA with ECCS failure, which is a design basis accident for containment of radioactive release. This system is provided with two identical air circulation loops, each having two full-capacity fans (one operating and one standby) for a bank of four combined charcoal and high-efficiency particulate air (HEPA) filters, in addition to other filters. While the filtration circuit is designed to operate under forced-flow conditions, it is of interest to understand the performance of the charcoal filters in the event of failure of the fans after operating for some time, i.e., when the radio-iodine inventory is at its peak value. In particular, it must be checked whether the buoyancy-driven natural circulation occurring in the filtration circuit is sufficient to keep the temperature in the charcoal under safe limits. A computer code, TRAFIC (Transient Analysis of Filters in Containment), was developed using a conservative one-dimensional model to analyze the system. Suitable parametric studies were carried out to understand the problem and to assess the safety of the existing system. The TRAFIC code has two important components: the first estimates the heat generation in the charcoal filter based on the source term, while the other performs the thermal-hydraulic computations. In an attempt to validate the code, experimental studies have been carried out. For this purpose, an experimental setup comprising a scaled-down model of the filtration circuit, with heating coils embedded in the charcoal to simulate the heating effect due to radio-iodine, has been constructed. The present work of validation consists of utilizing the results obtained from experiments conducted for different heat loads, elevations and adsorbent

  17. Computational estimation of rainbow trout estrogen receptor binding affinities for environmental estrogens

    International Nuclear Information System (INIS)

    Shyu, Conrad; Cavileer, Timothy D.; Nagler, James J.; Ytreberg, F. Marty

    2011-01-01

    Environmental estrogens have been the subject of intense research due to their documented detrimental effects on the health of fish and wildlife and their potential to negatively impact humans. A complete understanding of how these compounds affect health is complicated because environmental estrogens are a structurally heterogeneous group of compounds. In this work, computational molecular dynamics simulations were utilized to predict the binding affinity of different compounds using rainbow trout (Oncorhynchus mykiss) estrogen receptors (ERs) as a model. Specifically, this study presents a comparison of the binding affinity of the natural ligand estradiol-17β to the four rainbow trout ER isoforms with that of three known environmental estrogens 17α-ethinylestradiol, bisphenol A, and raloxifene. Two additional compounds, atrazine and testosterone, that are known to be very weak or non-binders to ERs were tested. The binding affinity of these compounds to the human ERα subtype is also included for comparison. The results of this study suggest that, when compared to estradiol-17β, bisphenol A binds less strongly to all four receptors, 17α-ethinylestradiol binds more strongly, and raloxifene has a high affinity for the α subtype only. The results also show that atrazine and testosterone are weak or non-binders to the ERs. All of the results are in excellent qualitative agreement with the known in vivo estrogenicity of these compounds in the rainbow trout and other fishes. Computational estimation of binding affinities could be a valuable tool for predicting the impact of environmental estrogens in fish and other animals.

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  19. Ideal observer estimation and generalized ROC analysis for computer-aided diagnosis

    International Nuclear Information System (INIS)

    Edwards, Darrin C.

    2004-01-01

    The research presented in this dissertation represents an innovative application of computer-aided diagnosis and signal detection theory to the specific task of early detection of breast cancer in the context of screening mammography. A number of automated schemes have been developed in our laboratory to detect masses and clustered microcalcifications in digitized mammograms, on the one hand, and to classify known lesions as malignant or benign, on the other. The development of fully automated classification schemes is difficult, because the output of a detection scheme will contain false-positive detections in addition to detected malignant and benign lesions, resulting in a three-class classification task. Researchers have so far been unable to extend successful tools for analyzing two-class classification tasks, such as receiver operating characteristic (ROC) analysis, to three-class classification tasks. The goals of our research were to use Bayesian artificial neural networks to estimate ideal observer decision variables to both detect and classify clustered microcalcifications and mass lesions in mammograms, and to derive substantial theoretical results indicating potential avenues of approach toward the three-class classification task. Specifically, we have shown that an ideal observer in an N-class classification task achieves an optimal ROC hypersurface, just as the two-class ideal observer achieves an optimal ROC curve; and that an obvious generalization of a well-known two-class performance metric, the area under the ROC curve, is not useful as a performance metric in classification tasks with more than two classes. This work is significant for three reasons. First, it involves the explicit estimation of feature-based (as opposed to image-based) ideal observer decision variables in the tasks of detecting and classifying mammographic lesions. Second, it directly addresses the three-class classification task of distinguishing malignant lesions, benign
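
    For the two-class case from which this work generalizes, the area under the ROC curve of a decision variable equals the Mann-Whitney statistic normalized by the number of class pairs; a minimal estimator on synthetic decision-variable samples is sketched below.

    ```python
    import numpy as np

    def auc_mann_whitney(scores_negative, scores_positive):
        """Two-class AUC: fraction of (negative, positive) pairs ranked correctly,
        counting ties as one half (the Mann-Whitney estimator)."""
        neg = np.asarray(scores_negative, float)[:, None]
        pos = np.asarray(scores_positive, float)[None, :]
        return float(np.mean((pos > neg) + 0.5 * (pos == neg)))

    rng = np.random.default_rng(3)
    benign = rng.normal(0.0, 1.0, 200)      # synthetic decision variables for one class
    malignant = rng.normal(1.2, 1.0, 120)   # and for the other
    print(f"AUC = {auc_mann_whitney(benign, malignant):.3f}")
    ```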

  20. Estimation of Postmortem Interval Using the Radiological Techniques, Computed Tomography: A Pilot Study

    Directory of Open Access Journals (Sweden)

    Jiulin Wang

    2017-01-01

    Full Text Available Estimation of the postmortem interval (PMI) has been an important and difficult subject in forensic science. It is a primary task of forensic work, and it can help guide field investigations. With the development of computed tomography (CT) technology, CT imaging techniques are now being more frequently applied to the field of forensic medicine. This study used CT imaging techniques to observe area changes in different tissues and organs of rabbits after death and the changing pattern of the average CT values in the organs. The study analyzed the relationship between the CT values of different organs and the PMI with the imaging software Max Viewer and obtained multiparameter nonlinear regression equations for the different organs, providing an objective and accurate method and reference information for the estimation of PMI in forensic medicine. In forensic science, the PMI refers to the time interval between the discovery or inspection of a corpse and the time of death. CT, magnetic resonance imaging, and other imaging techniques have become important means of clinical examination over the years. Although some scholars in our country have used modern radiological techniques in various fields of forensic science, such as estimation of injury time, personal identification of bodies, analysis of the cause of death, determination of the causes of injury, and identification of foreign substances in bodies, there are only a few studies on the estimation of the time of death. We observed the subtle postmortem changes in adult rabbits, the shape and size of tissues and organs, and the relationships between adjacent organs in three-dimensional space, in an effort to develop a new method for the estimation of PMI. The bodies of the dead rabbits were stored at a room temperature of 20°C under sealed conditions and protected from exposure to flesh flies. The dead rabbits were randomly divided into a comparison group and an experimental group. The whole
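
    The multiparameter nonlinear regression between average CT value and PMI can be prototyped with scipy.optimize.curve_fit; the exponential-plus-offset model and the data points below are purely illustrative and are not the equations reported in the study.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def ct_value_model(pmi_h, a, b, c):
        """Assumed model: mean CT value decays exponentially toward an offset with PMI (hours)."""
        return a * np.exp(-b * pmi_h) + c

    # Hypothetical measurements: postmortem interval (h) and mean CT value (HU) of one organ
    pmi = np.array([0, 12, 24, 48, 72, 96, 120], dtype=float)
    hu = np.array([55, 48, 43, 35, 30, 27, 26], dtype=float)

    params, cov = curve_fit(ct_value_model, pmi, hu, p0=(30.0, 0.02, 25.0))
    a, b, c = params

    # Inverting the fitted curve gives a PMI estimate from a new CT measurement
    hu_new = 32.0
    pmi_est = -np.log((hu_new - c) / a) / b
    print(f"fit: a={a:.1f}, b={b:.3f}, c={c:.1f}; estimated PMI for {hu_new} HU: {pmi_est:.0f} h")
    ```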