WorldWideScience

Sample records for computing resource estimate

  1. LHCb Computing Resources: 2012 re-assessment, 2013 request and 2014 forecast

    CERN Document Server

    Graciani Diaz, Ricardo

    2012-01-01

    This note covers the following aspects: re-assessment of computing resource usage estimates for the 2012 data-taking period, request of computing resource needs for 2013, and a first forecast of the 2014 needs, when restart of data-taking is foreseen. Estimates are based on 2011 experience, as well as on the results of a simulation of the computing model described in the document. Differences in the model and deviations in the estimates from previously presented results are stressed.

  2. LHCb Computing Resources: 2011 re-assessment, 2012 request and 2013 forecast

    CERN Document Server

    Graciani, R

    2011-01-01

    This note covers the following aspects: re-assessment of computing resource usage estimates for the 2011 data-taking period, request of computing resource needs for the 2012 data-taking period, and a first forecast of the 2013 needs, when no data taking is foreseen. Estimates are based on 2010 experience and the latest updates to the LHC schedule, as well as on a new implementation of the computing model simulation tool. Differences in the model and deviations in the estimates from previously presented results are stressed.

  3. Estimation of potential uranium resources

    International Nuclear Information System (INIS)

    Curry, D.L.

    1977-09-01

    Potential estimates, like reserves, are limited by the information on hand at the time and are not intended to indicate the ultimate resources. Potential estimates are based on geologic judgement, so their reliability is dependent on the quality and extent of geologic knowledge. Reliability differs for each of the three potential resource classes. It is greatest for probable potential resources because of the greater knowledge base resulting from the advanced stage of exploration and development in established producing districts where most of the resources in this class are located. Reliability is least for speculative potential resources because no significant deposits are known, and favorability is inferred from limited geologic data. Estimates of potential resources are revised as new geologic concepts are postulated, as new types of uranium ore bodies are discovered, and as improved geophysical and geochemical techniques are developed and applied. Advances in technology that permit the exploitation of deep or low-grade deposits, or the processing of ores of previously uneconomic metallurgical types, also will affect the estimates

  4. Resource Management in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Andrei IONESCU

    2015-01-01

    Mobile cloud computing is a major research topic in Information Technology & Communications. It integrates cloud computing, mobile computing and wireless networks. While mainly built on cloud computing, it has to operate using more heterogeneous resources, with implications for how these resources are managed and used. Managing the resources of a mobile cloud is not a trivial task, as it involves vastly different architectures and is beyond what human users can handle manually. Using these resources from applications at both the platform and software tiers comes with its own challenges. This paper presents different approaches in use for managing cloud resources at the infrastructure and platform levels.

  5. Statistics Online Computational Resource for Education

    Science.gov (United States)

    Dinov, Ivo D.; Christou, Nicolas

    2009-01-01

    The Statistics Online Computational Resource (http://www.SOCR.ucla.edu) provides one of the largest collections of free Internet-based resources for probability and statistics education. SOCR develops, validates and disseminates two core types of materials--instructional resources and computational libraries. (Contains 2 figures.)

  6. Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources

    International Nuclear Information System (INIS)

    Evans, D; Fisk, I; Holzman, B; Pordes, R; Tiradani, A; Melo, A; Sheldon, P; Metson, S

    2011-01-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long-term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly 'on-demand', as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a university, and we conclude that it is most cost effective to purchase dedicated resources for the 'base-line' needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during periods when spikes in usage occur.

  7. Enabling opportunistic resources for CMS Computing Operations

    Energy Technology Data Exchange (ETDEWEB)

    Hufnagel, Dick [Fermilab

    2015-11-19

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize “opportunistic” resources, i.e. resources not owned by, or a priori configured for, CMS, to meet peak demands. In addition to our dedicated resources we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Here we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.

  8. Efficient Resource Management in Cloud Computing

    OpenAIRE

    Rushikesh Shingade; Amit Patil; Shivam Suryawanshi; M. Venkatesan

    2015-01-01

    Cloud computing is one of the most widely used technologies for providing cloud services to users, who are charged for the services they receive. With a very large number of resources, evaluating the performance of Cloud resource management policies and optimizing them efficiently is difficult. Different simulation toolkits are available for simulating and modelling the Cloud computing environment, such as GridSim, CloudAnalyst, CloudSim, GreenCloud, CloudAuction, etc. In proposed Efficient Resource Manage...

  9. Meta-analysis of non-renewable energy resource estimates

    International Nuclear Information System (INIS)

    Dale, Michael

    2012-01-01

    This paper offers a review of estimates of ultimately recoverable resources (URR) of non-renewable energy sources: coal, conventional and unconventional oil, conventional and unconventional gas, and uranium for nuclear fission. There is a large range in the estimates for many of the energy sources, even those that have been utilized for a long time and, as such, should be well understood. If it is assumed that the estimates for each resource are normally distributed, then the total value of ultimately recoverable fossil and fissile energy resources is 70,592 EJ. If, on the other hand, the best fitting distribution from each of the resource estimate populations is used, the total value is 50,702 EJ, around 30% smaller. - Highlights: ► Brief introduction to categorization of resources. ► Collated over 380 estimates of ultimately recoverable global resources for all non-renewable energy sources. ► Extensive statistical analysis and distribution fitting conducted. ► Cross-energy source comparison of resource magnitudes.
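
    A minimal sketch of the aggregation step described above, comparing a total built from normal-distribution fits with one built from an alternative best-fitting distribution. The per-source estimate populations and the lognormal choice are hypothetical illustrations, not the paper's data or fitted families.

    ```python
    import numpy as np
    from scipy import stats

    # hypothetical URR estimate populations per source, in EJ (illustrative only)
    estimates = {
        "coal":       [15000, 21000, 26000, 30000],
        "conv_oil":   [6000, 8000, 10000, 11000],
        "unconv_oil": [3000, 9000, 15000, 20000],
        "gas":        [5000, 7000, 9000, 12000],
        "uranium":    [2000, 4000, 6000, 7500],
    }

    # Normal assumption: the mean of a fitted normal is the sample mean.
    normal_total = sum(np.mean(v) for v in estimates.values())

    # Alternative: fit a lognormal to each population and sum the fitted means.
    best_fit_total = 0.0
    for vals in estimates.values():
        shape, loc, scale = stats.lognorm.fit(vals, floc=0)
        best_fit_total += stats.lognorm(shape, loc=loc, scale=scale).mean()

    print(f"normal-assumption total: {normal_total:,.0f} EJ")
    print(f"best-fit total:          {best_fit_total:,.0f} EJ")
    ```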

  10. Processing Optimization of Typed Resources with Synchronized Storage and Computation Adaptation in Fog Computing

    Directory of Open Access Journals (Sweden)

    Zhengyang Song

    2018-01-01

    Wide application of Internet of Things (IoT) systems has been demanding ever more hardware facilities for processing various resources including data, information, and knowledge. With the rapid growth of generated resource quantity, it is difficult to adapt to this situation using traditional cloud computing models. Fog computing enables storage and computing services to be performed at the edge of the network to extend cloud computing. However, there are problems such as restricted computation, limited storage, and expensive network bandwidth in Fog computing applications. It is a challenge to balance the distribution of network resources. We propose a processing optimization mechanism for typed resources with synchronized storage and computation adaptation in Fog computing. In this mechanism, we process typed resources in a wireless-network-based three-tier architecture consisting of Data Graph, Information Graph, and Knowledge Graph. The proposed mechanism aims to minimize processing cost over network, computation, and storage while maximizing the performance of processing in a business-value-driven manner. Simulation results show that the proposed approach improves the ratio of performance over user investment. Meanwhile, conversions between resource types provide support for dynamically allocating network resources.
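
    As a rough illustration of the cost-driven placement decision sketched in the abstract, the snippet below chooses between a fog (edge) node and a cloud node by minimizing a simple additive cost over computation, storage and network transfer. The cost model, site names and prices are my assumptions, not the paper's formulation.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Site:
        name: str
        cpu_cost: float      # cost per compute unit
        storage_cost: float  # cost per stored GB
        net_cost: float      # cost per GB moved to/from the site

    def placement_cost(site, compute_units, stored_gb, moved_gb):
        return (site.cpu_cost * compute_units
                + site.storage_cost * stored_gb
                + site.net_cost * moved_gb)

    fog = Site("fog-edge", cpu_cost=2.0, storage_cost=0.05, net_cost=0.01)
    cloud = Site("cloud", cpu_cost=1.0, storage_cost=0.02, net_cost=0.10)

    # a data-heavy task: cheap to compute, expensive to move across the network
    task = dict(compute_units=5, stored_gb=20, moved_gb=500)
    best = min((fog, cloud), key=lambda s: placement_cost(s, **task))
    print(best.name, placement_cost(best, **task))
    ```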

  11. Methodology for uranium resource estimates and reliability

    International Nuclear Information System (INIS)

    Blanchfield, D.M.

    1980-01-01

    The NURE uranium assessment method has evolved from a small group of geologists estimating resources on a few lease blocks, to a national survey involving an interdisciplinary system consisting of the following: (1) geology and geologic analogs; (2) engineering and cost modeling; (3) mathematics and probability theory, psychology and elicitation of subjective judgments; and (4) computerized calculations, computer graphics, and data base management. The evolution has been spurred primarily by two objectives: (1) quantification of uncertainty, and (2) elimination of simplifying assumptions. This has resulted in a tremendous data-gathering effort and the involvement of hundreds of technical experts, many in uranium geology, but many from other fields as well. The rationality of the methods is still largely based on the concept of an analog and the observation that the results are reasonable. The reliability, or repeatability, of the assessments is reasonably guaranteed by the series of peer and superior technical reviews which has been formalized under the current methodology. The optimism or pessimism of individual geologists who make the initial assessments is tempered by the review process, resulting in a series of assessments which are a consistent, unbiased reflection of the facts. Despite the many improvements over past methods, several objectives for future development remain, primarily to reduce subjectivity in utilizing factual information in the estimation of endowment, and to improve the recognition of cost uncertainties in the assessment of economic potential. The 1980 NURE assessment methodology will undoubtedly be improved, but the reader is reminded that resource estimates are and always will be a forecast for the future.

  12. Computer Resources | College of Engineering & Applied Science

    Science.gov (United States)

  13. Exploitation of heterogeneous resources for ATLAS Computing

    CERN Document Server

    Chudoba, Jiri; The ATLAS collaboration

    2018-01-01

    LHC experiments require significant computational resources for Monte Carlo simulations and real data processing, and the ATLAS experiment is not an exception. In 2017, ATLAS steadily exploited almost 3M HS06 units, which corresponds to about 300 000 standard CPU cores. The total disk and tape capacity managed by the Rucio data management system exceeded 350 PB. Resources are provided mostly by Grid computing centers distributed in geographically separated locations and connected by the Grid middleware. The ATLAS collaboration developed several systems to manage computational jobs, data files and network transfers. ATLAS solutions for job and data management (PanDA and Rucio) were generalized and are now used also by other collaborations. More components are needed to include new resources such as private and public clouds, volunteers' desktop computers and primarily supercomputers in major HPC centers. Workflows and data flows significantly differ for these less traditional resources and extensive software re...

  14. The investigation and implementation of real-time face pose and direction estimation on mobile computing devices

    Science.gov (United States)

    Fu, Deqian; Gao, Lisheng; Jhang, Seong Tae

    2012-04-01

    Mobile computing devices have many limitations, such as a relatively small user interface and slow computing speed. Augmented reality usually requires face pose estimation, which can be used as an HCI and entertainment tool. As far as the real-time implementation of head pose estimation on relatively resource-limited mobile platforms is concerned, various constraints must be met while retaining sufficient face pose estimation accuracy. The proposed face pose estimation method meets this objective. Experimental results running on a test Android mobile device delivered satisfactory performance in real time and with good accuracy.

  15. Building an application for computing the resource requests such as disk, CPU, and tape and studying the time evolution of computing model

    CERN Document Server

    Noormandipour, Mohammad Reza

    2017-01-01

    The goal of this project was to build an application to calculate the computing resources needed by the LHCb experiment for data processing and analysis, and to predict their evolution in future years. The source code was developed in the Python programming language and the application was built and developed in CERN GitLab. This application will facilitate the calculation of resources required by LHCb in both qualitative and quantitative aspects. The granularity of computations is improved to a weekly basis, in contrast with the yearly basis used so far. The LHCb computing model will benefit from the new possibilities and options added, as the new predictions and calculations are aimed at giving more realistic and accurate estimates.
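
    A back-of-the-envelope sketch of the kind of weekly-granularity estimate such an application computes. The parameter names and values below are hypothetical and the formulas are deliberately simplified (no reprocessing, simulation or derived-data terms).

    ```python
    SECONDS_PER_WEEK = 7 * 24 * 3600

    def weekly_resources(trigger_rate_hz, live_fraction, event_size_kb,
                         cpu_s_per_event, disk_replicas):
        """Crude weekly estimate of events taken, disk volume and CPU work."""
        events = trigger_rate_hz * live_fraction * SECONDS_PER_WEEK
        disk_tb = events * event_size_kb * disk_replicas / 1e9   # kB -> TB
        cpu_core_hours = events * cpu_s_per_event / 3600.0
        return events, disk_tb, cpu_core_hours

    # hypothetical running conditions for one week of data taking
    ev, disk, cpu = weekly_resources(trigger_rate_hz=5000, live_fraction=0.5,
                                     event_size_kb=60, cpu_s_per_event=2.0,
                                     disk_replicas=2)
    print(f"{ev:.2e} events, {disk:.0f} TB disk, {cpu:.2e} core-hours")
    ```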

  16. Integration of Cloud resources in the LHCb Distributed Computing

    CERN Document Server

    Ubeda Garcia, Mario; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-01-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it was seamlessly integrating Grid resources and Computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack); it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keepin...

  17. Speculative resources of uranium. A review of International Uranium Resources Evaluation Project (IUREP) estimates 1982-1983

    International Nuclear Information System (INIS)

    1983-01-01

    On a country by country basis the International Uranium Resources Evaluation Project (IUREP) estimates 1982-1983 are reviewed. Information provided includes exploration work, airborne survey, radiometric survey, gamma-ray spectrometric survey, estimate of speculative resources, uranium occurrences, uranium deposits, uranium mineralization, agreements for uranium exploration, feasibility studies, geological classification of resources, proposed revised resource range, and production estimate of uranium.

  18. Framework of Resource Management for Intercloud Computing

    Directory of Open Access Journals (Sweden)

    Mohammad Aazam

    2014-01-01

    There has been a very rapid increase in digital media content, due to which the media cloud is gaining importance. The cloud computing paradigm provides management of resources and helps create an extended portfolio of services. Through cloud computing, not only are services managed more efficiently, but service discovery is also made possible. To handle the rapid increase in content, the media cloud plays a very vital role. But it is not possible for standalone clouds to handle everything with increasing user demands. For scalability and better service provisioning, at times, clouds have to communicate with other clouds and share their resources. This scenario is called Intercloud computing or cloud federation. The study of Intercloud computing is still in its early stages. Resource management is one of the key concerns to be addressed in Intercloud computing. Existing studies discuss this issue only in a trivial and simplistic way. In this study, we present a resource management model, keeping in view different types of services, different customer types, customer characteristics, pricing, and refunding. The presented framework was implemented using Java and NetBeans 8.0 and evaluated using the CloudSim 3.0.3 toolkit. The presented results and their discussion validate our model and its efficiency.

  19. Improving ATLAS computing resource utilization with HammerCloud

    CERN Document Server

    Schovancova, Jaroslava; The ATLAS collaboration

    2018-01-01

    HammerCloud is a framework to commission, test, and benchmark ATLAS computing resources and components of various distributed systems with realistic full-chain experiment workflows. HammerCloud contributes to ATLAS Distributed Computing (ADC) Operations and automation efforts, providing automated resource exclusion and recovery tools that help re-focus operational manpower on areas which have yet to be automated, and improving utilization of available computing resources. We present the recent evolution of the auto-exclusion/recovery tools: faster inclusion of new resources in the testing machinery, machine learning algorithms for anomaly detection, resources categorized as master vs. slave for the purpose of blacklisting, and a tool for auto-exclusion/recovery of resources triggered by Event Service job failures that is being extended to other workflows besides the Event Service. We describe how HammerCloud helped commission various concepts and components of distributed systems: simplified configuration of qu...

  20. Modeling of Groundwater Resources Heavy Metals Concentration Using Soft Computing Methods: Application of Different Types of Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Meysam Alizamir

    2017-09-01

    Nowadays, groundwater resources play a vital role as a source of drinking water in arid and semiarid regions, and forecasting of pollutant content in these resources is very important. Therefore, this study aimed to compare two soft computing methods for modeling Cd, Pb and Zn concentrations in the groundwater resources of Asadabad Plain, Western Iran. The relative accuracy of several soft computing models, namely the multi-layer perceptron (MLP) and radial basis function (RBF), for forecasting heavy metal concentrations has been investigated. In addition, the Levenberg-Marquardt, gradient descent and conjugate gradient training algorithms were utilized for the MLP models. The ANN models for this study were developed using the MATLAB R2014 software program. The MLP performs better than the other models for heavy metal concentration estimation. The simulation results revealed that the MLP model was able to model heavy metal concentrations in groundwater resources favorably. It can generally be utilized effectively in environmental applications and in water quality estimation. In addition, of the three algorithms, Levenberg-Marquardt performed best. This study proposed soft computing modeling techniques for the prediction and estimation of heavy metal concentrations in the groundwater resources of Asadabad Plain. Based on data collected from the plain, MLP and RBF models were developed for each heavy metal. MLP can be utilized effectively in applications of predicting heavy metal concentrations in the groundwater resources of Asadabad Plain.

  1. ResourceGate: A New Solution for Cloud Computing Resource Allocation

    OpenAIRE

    Abdullah A. Sheikh

    2012-01-01

    Cloud computing has become a focus of educational and business communities. Their concerns include the need to improve the Quality of Service (QoS) provided, as well as aspects such as reliability, performance and cost reduction. Cloud computing provides many benefits in terms of low cost and accessibility of data. Ensuring these benefits is considered to be the major factor in the cloud computing environment. This paper surveys recent research related to cloud computing resource al...

  2. Resource-estimation models and predicted discovery

    International Nuclear Information System (INIS)

    Hill, G.W.

    1982-01-01

    Resources have been estimated by predictive extrapolation from past discovery experience, by analogy with better explored regions, or by inference from evidence of depletion of targets for exploration. Changes in technology and new insights into geological mechanisms have occurred sufficiently often in the long run to form part of the pattern of mature discovery experience. The criterion, that a meaningful resource estimate needs an objective measure of its precision or degree of uncertainty, excludes 'estimates' based solely on expert opinion. This is illustrated by development of error measures for several persuasive models of discovery and production of oil and gas in USA, both annually and in terms of increasing exploration effort. Appropriate generalizations of the models resolve many points of controversy. This is illustrated using two USA data sets describing discovery of oil and of U3O8; the latter set highlights an inadequacy of available official data. Review of the oil-discovery data set provides a warrant for adjusting the time-series prediction to a higher resource figure for USA petroleum. (author)

  3. Aggregated Computational Toxicology Resource (ACTOR)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Aggregated Computational Toxicology Resource (ACTOR) is a database on environmental chemicals that is searchable by chemical name and other identifiers, and by...

  4. Aggregated Computational Toxicology Online Resource

    Data.gov (United States)

    U.S. Environmental Protection Agency — Aggregated Computational Toxicology Online Resource (AcTOR) is EPA's online aggregator of all the public sources of chemical toxicity data. ACToR aggregates data...

  5. Using Puppet to contextualize computing resources for ATLAS analysis on Google Compute Engine

    International Nuclear Information System (INIS)

    Öhman, Henrik; Panitkin, Sergey; Hendrix, Valerie

    2014-01-01

    With the advent of commercial as well as institutional and national clouds, new opportunities for on-demand computing resources for the HEP community become available. The new cloud technologies also come with new challenges, and one such challenge is the contextualization of computing resources with regard to the requirements of the user and his experiment. In particular, on Google's new cloud platform, Google Compute Engine (GCE), upload of users' virtual machine images is not possible. This precludes the application of ready-to-use technologies like CernVM and forces users to build and contextualize their own VM images from scratch. We investigate the use of Puppet to facilitate the contextualization of cloud resources on GCE, with particular regard to ease of configuration and dynamic resource scaling.

  6. Integration of Cloud resources in the LHCb Distributed Computing

    Science.gov (United States)

    Úbeda García, Mario; Méndez Muñoz, Víctor; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-06-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it was seamlessly integrating Grid resources and Computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack); it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, the pros and cons of a cloud-based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we also describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.

  7. Integration of cloud resources in the LHCb distributed computing

    International Nuclear Information System (INIS)

    García, Mario Úbeda; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel; Muñoz, Víctor Méndez

    2014-01-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it was seamlessly integrating Grid resources and Computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack); it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, the pros and cons of a cloud-based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we also describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.

  8. Framework Resources Multiply Computing Power

    Science.gov (United States)

    2010-01-01

    As an early proponent of grid computing, Ames Research Center awarded Small Business Innovation Research (SBIR) funding to 3DGeo Development Inc., of Santa Clara, California, (now FusionGeo Inc., of The Woodlands, Texas) to demonstrate a virtual computer environment that linked geographically dispersed computer systems over the Internet to help solve large computational problems. By adding to an existing product, FusionGeo enabled access to resources for calculation- or data-intensive applications whenever and wherever they were needed. Commercially available as Accelerated Imaging and Modeling, the product is used by oil companies and seismic service companies, which require large processing and data storage capacities.

  9. Turning Video Resource Management into Cloud Computing

    Directory of Open Access Journals (Sweden)

    Weili Kou

    2016-07-01

    Big data makes cloud computing more and more popular in various fields. Video resources are very useful and important for education, security monitoring, and so on. However, issues of their huge volumes, complex data types, inefficient processing performance, weak security, and long loading times pose challenges in video resource management. The Hadoop Distributed File System (HDFS) is an open-source framework which can provide cloud-based platforms and presents an opportunity for solving these problems. This paper presents a video resource management architecture based on HDFS to provide a uniform framework and a five-layer model for standardizing the current various algorithms and applications. The architecture, basic model, and key algorithms are designed for turning video resources into a cloud computing environment. The design was tested by establishing a simulation system prototype.

  10. Computing Bounds on Resource Levels for Flexible Plans

    Science.gov (United States)

    Muscvettola, Nicola; Rijsman, David

    2009-01-01

    A new algorithm efficiently computes the tightest exact bound on the levels of resources induced by a flexible activity plan. Tightness of bounds is extremely important for computations involved in planning because tight bounds can save potentially exponential amounts of search (through early backtracking and detection of solutions), relative to looser bounds. The bound computed by the new algorithm, denoted the resource-level envelope, constitutes the measure of maximum and minimum consumption of resources at any time for all fixed-time schedules in the flexible plan. At each time, the envelope guarantees that there are two fixed-time instantiations: one that produces the minimum level and one that produces the maximum level. Therefore, the resource-level envelope is the tightest possible resource-level bound for a flexible plan because any tighter bound would exclude the contribution of at least one fixed-time schedule. If the resource-level envelope can be computed efficiently, it could be substituted for the looser bounds that are currently used in the inner cores of constraint-posting scheduling algorithms, with the potential for great improvements in performance. What is needed to reduce the cost of computation is an algorithm, the measure of complexity of which is no greater than a low-degree polynomial in N (where N is the number of activities). The new algorithm satisfies this need. In this algorithm, the computation of resource-level envelopes is based on a novel combination of (1) the theory of shortest paths in the temporal-constraint network for the flexible plan and (2) the theory of maximum flows for a flow network derived from the temporal and resource constraints. The measure of asymptotic complexity of the algorithm is O(N·O(maxflow(N))), where O(x) denotes an amount of computing time or a number of arithmetic operations proportional to a number of the order of x and O(maxflow(N)) is the measure of complexity (and thus of cost) of a maximum-flow
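
    The envelope algorithm combines shortest paths on the plan's temporal-constraint network with maximum flows on a derived flow network; the toy sketch below exercises only the max-flow building block on a made-up network (node names and capacities are hypothetical), using networkx.

    ```python
    import networkx as nx

    # hypothetical flow network derived from temporal and resource constraints
    G = nx.DiGraph()
    G.add_edge("s", "produce_a", capacity=3)          # activity a adds 3 units
    G.add_edge("s", "produce_b", capacity=2)          # activity b adds 2 units
    G.add_edge("produce_a", "consume_c", capacity=1)  # orderings that can offset each other
    G.add_edge("produce_b", "consume_c", capacity=2)
    G.add_edge("consume_c", "t", capacity=4)          # activity c consumes up to 4 units

    flow_value, _ = nx.maximum_flow(G, "s", "t")
    print("production/consumption that can be offset:", flow_value)
    ```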

  11. Some issues of creation of belarusian language computer resources

    OpenAIRE

    Rubashko, N.; Nevmerjitskaia, G.

    2003-01-01

    The main reason for creation of computer resources of natural language is the necessity to bring into accord the ways of language normalization with the form of its existence - the computer form of language usage should correspond to the computer form of language standards fixation. This paper discusses various aspects of the creation of Belarusian language computer resources. It also briefly gives an overview of the objectives of the project involved.

  12. Estimation of economic parameters of U.S. hydropower resources

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Douglas G. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Idaho National Engineering and Environmental Lab. (INEEL); Hunt, Richard T. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Idaho National Engineering and Environmental Lab. (INEEL); Reeves, Kelly S. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Idaho National Engineering and Environmental Lab. (INEEL); Carroll, Greg R. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Idaho National Engineering and Environmental Lab. (INEEL)

    2003-06-01

    Tools for estimating the cost of developing, operating, and maintaining hydropower resources, in the form of regression curves, were developed based on historical plant data. Development costs that were addressed included licensing, construction, and five types of environmental mitigation. It was found that the data for each type of cost correlated well with plant capacity. A tool for estimating the annual and monthly electric generation of hydropower resources was also developed. Additional tools were developed to estimate the cost of upgrading a turbine or a generator. The development and operation and maintenance cost estimating tools, and the generation estimating tool, were applied to 2,155 U.S. hydropower sites representing a total potential capacity of 43,036 MW. The sites included totally undeveloped sites, dams without a hydroelectric plant, and hydroelectric plants that could be expanded to achieve greater capacity. Site characteristics and estimated costs and generation for each site were assembled in a database in Excel format that is also included within the EERE Library under the title, “Estimation of Economic Parameters of U.S. Hydropower Resources - INL Hydropower Resource Economics Database.”
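
    A minimal sketch of fitting a capacity-based cost curve of the kind described; the plant data and the power-law functional form are hypothetical, not the INEEL regressions.

    ```python
    import numpy as np

    capacity_mw = np.array([5, 12, 30, 75, 150, 400])      # hypothetical plants
    dev_cost_musd = np.array([9, 18, 38, 80, 140, 320])    # hypothetical development costs

    # fit cost = a * capacity^b by linear regression in log-log space
    b, log_a = np.polyfit(np.log(capacity_mw), np.log(dev_cost_musd), 1)
    a = np.exp(log_a)

    print(f"cost ~ {a:.2f} * MW^{b:.2f} (million USD)")
    print(f"predicted development cost for a 60 MW site: {a * 60**b:.0f} MUSD")
    ```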

  13. Physical-resource requirements and the power of quantum computation

    International Nuclear Information System (INIS)

    Caves, Carlton M; Deutsch, Ivan H; Blume-Kohout, Robin

    2004-01-01

    The primary resource for quantum computation is the Hilbert-space dimension. Whereas Hilbert space itself is an abstract construction, the number of dimensions available to a system is a physical quantity that requires physical resources. Avoiding a demand for an exponential amount of these resources places a fundamental constraint on the systems that are suitable for scalable quantum computation. To be scalable, the number of degrees of freedom in the computer must grow nearly linearly with the number of qubits in an equivalent qubit-based quantum computer. These considerations rule out quantum computers based on a single particle, a single atom, or a single molecule consisting of a fixed number of atoms or on classical waves manipulated using the transformations of linear optics
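
    A short worked statement of the dimension-counting argument (notation mine): n qubits need Hilbert-space dimension 2^n, so a single-system encoding must supply exponentially many distinguishable levels, whereas a register of n two-level subsystems reaches the same dimension with resources growing only linearly in n.

    ```latex
    % dimension required for n qubits
    \dim \mathcal{H}_n = 2^{\,n}
    % single particle/atom/molecule: one system must carry all the levels
    d_{\text{single}} = 2^{\,n} \qquad \text{(exponential physical resource)}
    % register of n two-level subsystems: same dimension, n degrees of freedom
    \dim\Big(\bigotimes_{i=1}^{n}\mathbb{C}^{2}\Big) = 2^{\,n}, \qquad \text{resources} \sim n
    ```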

  14. Research on elastic resource management for multi-queue under cloud computing environment

    Science.gov (United States)

    CHENG, Zhenjing; LI, Haibo; HUANG, Qiulan; Cheng, Yaodong; CHEN, Gang

    2017-10-01

    As a new approach to managing computing resources, virtualization technology is more and more widely applied in the high-energy physics field. A virtual computing cluster based on Openstack was built at IHEP, using HTCondor as the job queue management system. In a traditional static cluster, a fixed number of virtual machines are pre-allocated to the job queues of different experiments. However, this method cannot be well adapted to the volatility of computing resource requirements. To solve this problem, an elastic computing resource management system for a cloud computing environment has been designed. This system performs unified management of virtual computing nodes on the basis of the job queues in HTCondor, based on dual resource thresholds as well as a quota service. A two-stage pool is designed to improve the efficiency of resource pool expansion. This paper presents several use cases of the elastic resource management system in IHEPCloud. Practical running shows that virtual computing resources dynamically expand or shrink as computing requirements change. Additionally, the CPU utilization ratio of computing resources significantly increased compared with traditional resource management. The system also performs well when there are multiple condor schedulers and multiple job queues.
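
    A minimal sketch of a dual-threshold expand/shrink decision per job queue, in the spirit of the system described; the thresholds, step size and quota handling are my assumptions, and the actual OpenStack/HTCondor calls are omitted.

    ```python
    def rescale(idle_jobs, running_vms, quota,
                expand_threshold=10, shrink_threshold=0, step=5):
        """Return the number of VMs to add (positive) or remove (negative)."""
        if idle_jobs > expand_threshold and running_vms < quota:
            return min(step, quota - running_vms)    # expand the pool, respect the quota
        if idle_jobs <= shrink_threshold and running_vms > 0:
            return -min(step, running_vms)           # shrink the pool when the queue drains
        return 0

    print(rescale(idle_jobs=42, running_vms=20, quota=60))   # -> 5
    print(rescale(idle_jobs=0, running_vms=20, quota=60))    # -> -5
    ```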

  15. A Semi-Preemptive Computational Service System with Limited Resources and Dynamic Resource Ranking

    Directory of Open Access Journals (Sweden)

    Fang-Yie Leu

    2012-03-01

    In this paper, we integrate a grid system and a wireless network to present a convenient computational service system, called the Semi-Preemptive Computational Service system (SePCS for short), which provides users with a wireless access environment and through which a user can share his/her resources with others. In the SePCS, each node is dynamically given a score based on its CPU level, available memory size, current length of its waiting queue, CPU utilization and bandwidth. With the scores, resource nodes are classified into three levels. User requests, based on their time constraints, are also classified into three types. Resources of higher levels are allocated to more tightly constrained requests so as to increase the total performance of the system. To achieve this, a resource broker with the Semi-Preemptive Algorithm (SPA) is also proposed. When the resource broker cannot find suitable resources for a request of a higher type, it preempts a resource that is currently executing a lower-type request so that the higher-type request can be executed immediately. The SePCS can be applied to a Vehicular Ad Hoc Network (VANET), whose users can then exploit convenient mobile network services and wireless distributed computing. As a result, the performance of the system is higher than that of the tested schemes.
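
    A small sketch of the node-scoring and three-level classification idea; the weights and level cut-offs below are hypothetical, not the SePCS formulas.

    ```python
    def node_score(cpu_level, free_mem_gb, queue_len, cpu_util, bandwidth_mbps):
        # higher is better; queue length and utilization count against a node
        return (2.0 * cpu_level + 0.5 * free_mem_gb + 0.1 * bandwidth_mbps
                - 1.0 * queue_len - 3.0 * cpu_util)

    def level(score, hi=20.0, lo=10.0):
        return "high" if score >= hi else "medium" if score >= lo else "low"

    s = node_score(cpu_level=8, free_mem_gb=16, queue_len=3, cpu_util=0.4,
                   bandwidth_mbps=100)
    print(round(s, 1), level(s))   # requests with tight deadlines go to "high" nodes
    ```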

  16. Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds

    Science.gov (United States)

    Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni

    2012-09-01

    Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains for providing wireless communications services on demand. Each new user session request particularly requires the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources of SDR cloud data centers and the numerous session requests at certain hours of a day require efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of computing resource management tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools to evaluate different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupations and that a tradeoff exists between cluster size and algorithm complexity.

  17. SOCR: Statistics Online Computational Resource

    Directory of Open Access Journals (Sweden)

    Ivo D. Dinov

    2006-10-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning.

  18. Wind power error estimation in resource assessments.

    Directory of Open Access Journals (Sweden)

    Osvaldo Rodríguez

    Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, probability density function, and wind turbine power curves. This method uses the actual wind speed data without prior statistical treatment based on 28 wind turbine power curves, which were fitted by Lagrange's method, to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, thereby yielding an error of 5%. The proposed error propagation complements the traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the levelized cost of power production or the investment time return. The implementation of this method increases the reliability of techno-economic resource assessment studies.
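
    A minimal sketch of propagating a wind-speed measurement error through a turbine power curve. The generic cubic-region curve and the measured speeds are assumptions (the paper uses 28 manufacturer curves fitted with Lagrange's method), and per-point errors in the steep cubic region naturally exceed the 5% aggregate figure quoted above.

    ```python
    import numpy as np

    def power_kw(v, rated_kw=2000.0, v_in=3.0, v_rated=12.0, v_out=25.0):
        """Generic turbine power curve: cubic ramp-up, flat at rated power."""
        v = np.asarray(v, dtype=float)
        ramp = rated_kw * ((v - v_in) / (v_rated - v_in)) ** 3
        p = np.where((v >= v_in) & (v < v_rated), ramp, 0.0)
        return np.where((v >= v_rated) & (v < v_out), rated_kw, p)

    v_meas = np.array([5.2, 7.8, 9.4, 13.0])    # hypothetical measured speeds (m/s)
    rel_err_v = 0.10                            # 10 % wind-speed measurement error

    p_nom = power_kw(v_meas)
    p_lo = power_kw(v_meas * (1 - rel_err_v))
    p_hi = power_kw(v_meas * (1 + rel_err_v))
    rel_err_p = (p_hi - p_lo) / (2 * p_nom)     # symmetric relative power error per point
    print(np.round(rel_err_p, 2))
    ```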

  19. Wind power error estimation in resource assessments.

    Science.gov (United States)

    Rodríguez, Osvaldo; Del Río, Jesús A; Jaramillo, Oscar A; Martínez, Manuel

    2015-01-01

    Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, probability density function, and wind turbine power curves. This method uses the actual wind speed data without prior statistical treatment based on 28 wind turbine power curves, which were fitted by Lagrange's method, to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, thereby yielding an error of 5%. The proposed error propagation complements the traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the levelized cost of power production or the investment time return. The implementation of this method increases the reliability of techno-economic resource assessment studies.

  20. A study of computer graphics technology in application of communication resource management

    Science.gov (United States)

    Li, Jing; Zhou, Liang; Yang, Fei

    2017-08-01

    With the development of computer technology, computer graphics technology has come into wide use. In particular, the success of object-oriented technology and multimedia technology has promoted the development of graphics technology in computer software systems. Therefore, computer graphics theory and application technology have become an important topic in the computer field, while computer graphics technology is applied ever more extensively in various fields. In recent years, with the development of the social economy, and especially the rapid development of information technology, the traditional way of managing communication resources cannot effectively meet the needs of resource management. Current communication resource management still uses the original management tools and methods for resource and equipment management and maintenance, which has brought many problems. It is very difficult for non-professionals to understand the equipment and the situation in communication resource management. Resource utilization is relatively low, and managers cannot quickly and accurately understand the resource conditions. To address the above problems, this paper proposes introducing computer graphics technology into communication resource management. The introduction of computer graphics not only makes communication resource management more vivid, but also reduces the cost of resource management and improves work efficiency.

  1. A Matchmaking Strategy Of Mixed Resource On Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Wisam Elshareef

    2015-08-01

    Today cloud computing has become a key technology for the online allotment of computing resources and online storage of user data at lower cost, where computing resources are available all the time over the Internet on a pay-per-use basis. Recently there has been a growing need for resource management strategies in a cloud computing environment that encompass both end-user satisfaction and high job submission throughput with appropriate scheduling. One of the major and essential issues in resource management is allocating incoming tasks to suitable virtual machines (matchmaking). The main objective of this paper is to propose a matchmaking strategy between the incoming requests and the various resources in the cloud environment to satisfy the requirements of users and to balance the workload on resources. Load balancing is an important aspect of resource management in a cloud computing environment. This paper therefore proposes a dynamic weight active monitor (DWAM) load balancing algorithm, which allocates the incoming requests on the fly to all available virtual machines in an efficient manner in order to achieve better performance parameters such as response time, processing time and resource utilization. The feasibility of the proposed algorithm is analyzed using the CloudSim simulator, which demonstrates the superiority of the proposed DWAM algorithm over its counterparts in the literature. Simulation results show that the proposed algorithm dramatically improves response time and data processing time and makes better use of resources compared with the Active Monitor and VM-assign algorithms.
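
    A small sketch of a dynamic-weight VM selection rule in the spirit of the algorithm described; the metric names, weights and update step are my assumptions, not DWAM's actual definition.

    ```python
    def vm_weight(vm):
        # prefer VMs with spare CPU and memory and a short pending-request queue
        return 0.5 * vm["free_cpu"] + 0.3 * vm["free_mem"] - 0.2 * vm["pending"]

    def assign(vms):
        best = max(vms, key=vm_weight)   # pick the currently best-weighted VM
        best["pending"] += 1             # the monitor updates weights on the fly
        return best["id"]

    vms = [{"id": "vm1", "free_cpu": 0.7, "free_mem": 0.5, "pending": 2},
           {"id": "vm2", "free_cpu": 0.4, "free_mem": 0.9, "pending": 0}]
    print(assign(vms))   # -> "vm2" (weight 0.47 beats 0.10)
    ```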

  2. Optimal Computing Resource Management Based on Utility Maximization in Mobile Crowdsourcing

    Directory of Open Access Journals (Sweden)

    Haoyu Meng

    2017-01-01

    Mobile crowdsourcing, as an emerging service paradigm, enables the computing resource requestor (CRR) to outsource computation tasks to each computing resource provider (CRP). Considering the importance of pricing as an essential incentive to coordinate the real-time interaction among the CRR and CRPs, in this paper we propose an optimal real-time pricing strategy for computing resource management in mobile crowdsourcing. Firstly, we analytically model the CRR and CRP behaviors in the form of carefully selected utility and cost functions, based on concepts from microeconomics. Secondly, we propose a distributed algorithm based on the exchange of control messages, which contain the information of computing resource demand/supply and real-time prices. We show that there exist real-time prices that can align individual optimality with systematic optimality. Finally, we also take account of the interaction among CRPs and formulate the computing resource management as a game with a Nash equilibrium achievable via best response. Simulation results demonstrate that the proposed distributed algorithm can potentially benefit both the CRR and CRPs. The coordinator in mobile crowdsourcing can thus use the optimal real-time pricing strategy to manage computing resources for the benefit of the overall system.
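
    A minimal sketch of the price/demand message exchange: the CRR reports its utility-maximizing demand at the current price, each CRP reports its profit-maximizing supply, and the coordinator nudges the price toward balance. The log-utility and quadratic-cost forms and all numbers are assumptions for illustration, not the paper's functions.

    ```python
    def crr_demand(price, weight=100.0):
        # demand maximizing weight*log(d) - price*d  ->  d = weight / price
        return weight / price

    def crp_supply(price, cost_coeff=0.5):
        # supply maximizing price*s - cost_coeff*s**2  ->  s = price / (2*cost_coeff)
        return price / (2.0 * cost_coeff)

    n_providers = 3
    price = 1.0
    for _ in range(500):                    # tatonnement-style real-time price update
        excess = crr_demand(price) - n_providers * crp_supply(price)
        price += 0.01 * excess              # raise the price when demand exceeds supply
    print(f"clearing price ~ {price:.2f}, demand ~ {crr_demand(price):.1f}")
    ```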

  3. Argonne Laboratory Computing Resource Center - FY2004 Report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.

    2005-04-14

    In the spring of 2002, Argonne National Laboratory founded the Laboratory Computing Resource Center, and in April 2003 LCRC began full operations with Argonne's first teraflops computing cluster. The LCRC's driving mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting application use and development. This report describes the scientific activities, computing facilities, and usage in the first eighteen months of LCRC operation. In this short time LCRC has had broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. Steering for LCRC comes from the Computational Science Advisory Committee, composed of computing experts from many Laboratory divisions. The CSAC Allocations Committee makes decisions on individual project allocations for Jazz.

  4. Load/resource matching for period-of-record computer simulation

    International Nuclear Information System (INIS)

    Lindsey, E.D. Jr.; Robbins, G.E. III

    1991-01-01

    The Southwestern Power Administration (Southwestern), an agency of the Department of Energy, is responsible for marketing the power and energy produced at Federal hydroelectric power projects developed by the U.S. Army Corps of Engineers in the southwestern United States. This paper reports that, in order to maximize benefits from limited resources, to evaluate proposed changes in the operation of existing projects, and to determine the feasibility and marketability of proposed new projects, Southwestern utilizes a period-of-record computer simulation model created in the 1960s. Southwestern is constructing a new computer simulation model to take advantage of changes in computers, policy, and procedures. Within all hydroelectric power reservoir systems, the ability of the resources to match the load demand is critical and presents complex problems. Therefore, the method used to compare available energy resources to energy load demands is a very important aspect of the new model. Southwestern has developed an innovative method which compares a resource duration curve with a load duration curve, adjusting the resource duration curve to make the most efficient use of the available resources.
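
    A short sketch of the duration-curve comparison: sort each hourly series in descending order and compare the two curves hour-rank by hour-rank. The synthetic load and resource series below are hypothetical.

    ```python
    import numpy as np

    load_mw = np.random.default_rng(0).uniform(200, 800, size=8760)       # hypothetical hourly load
    resource_mw = np.random.default_rng(1).uniform(150, 900, size=8760)   # hypothetical hourly resource

    # a duration curve is the series sorted in descending order
    load_dc = np.sort(load_mw)[::-1]
    resource_dc = np.sort(resource_mw)[::-1]

    shortfall = np.clip(load_dc - resource_dc, 0, None)   # ranks where resources fall short
    print("hours with shortfall:", int((shortfall > 0).sum()),
          "energy not served (MWh):", round(float(shortfall.sum()), 1))
    ```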

  5. Computer-Aided Parts Estimation

    OpenAIRE

    Cunningham, Adam; Smart, Robert

    1993-01-01

    In 1991, Ford Motor Company began deployment of CAPE (computer-aided parts estimating system), a highly advanced knowledge-based system designed to generate, evaluate, and cost automotive part manufacturing plans. CAPE is engineered on an innovative, extensible, declarative process-planning and estimating knowledge representation language, which underpins the CAPE kernel architecture. Many manufacturing processes have been modeled to date, but eventually every significant process in motor veh...

  6. Distributed Problem Solving: Adaptive Networks with a Computer Intermediary Resource. Intelligent Executive Computer Communication

    Science.gov (United States)

    1991-06-01

    Interim Report: Distributed Problem Solving: Adaptive Networks With a Computer Intermediary Resource: Intelligent Executive Computer Communication. John Lyman and Carla J. Conaway, University of California at Los Angeles. Associated publication: Proceedings of The National Conference on Artificial Intelligence, pages 181-184, The American Association for Artificial Intelligence, Pittsburgh.

  7. Shared-resource computing for small research labs.

    Science.gov (United States)

    Ackerman, M J

    1982-04-01

    A real-time laboratory computer network is described. This network is composed of four real-time laboratory minicomputers located in each of four division laboratories and a larger minicomputer in a centrally located computer room. Off-the-shelf hardware and software were used with no customization. The network is configured for resource sharing using DECnet communications software and the RSX-11-M multi-user real-time operating system. The cost effectiveness of the shared-resource network and multiple real-time processing using priority scheduling is discussed. Examples of utilization within a medical research department are given.

  8. Offshore wind resource estimation for wind energy

    DEFF Research Database (Denmark)

    Hasager, Charlotte Bay; Badger, Merete; Mouche, A.

    2010-01-01

    Satellite remote sensing from active and passive microwave instruments is used to estimate the offshore wind resource in the Northern European Seas in the EU-Norsewind project. The satellite data include 8 years of Envisat ASAR, 10 years of QuikSCAT, and 23 years of SSM/I. The satellite observations are compared to selected offshore meteorological masts in the Baltic Sea and North Sea. The overall aim of the Norsewind project is a state-of-the-art wind atlas at 100 m height. The satellite winds are all valid at 10 m above sea level. Extrapolation to higher heights is a challenge. Mesoscale modeling of the winds at hub height will be compared to data from wind lidars observing at 100 m above sea level. Plans are also to compare mesoscale model results and satellite-based estimates of the offshore wind resource.

  9. Emotor control: computations underlying bodily resource allocation, emotions, and confidence.

    Science.gov (United States)

    Kepecs, Adam; Mensh, Brett D

    2015-12-01

    Emotional processes are central to behavior, yet their deeply subjective nature has been a challenge for neuroscientific study as well as for psychiatric diagnosis. Here we explore the relationships between subjective feelings and their underlying brain circuits from a computational perspective. We apply recent insights from systems neuroscience-approaching subjective behavior as the result of mental computations instantiated in the brain-to the study of emotions. We develop the hypothesis that emotions are the product of neural computations whose motor role is to reallocate bodily resources mostly gated by smooth muscles. This "emotor" control system is analogous to the more familiar motor control computations that coordinate skeletal muscle movements. To illustrate this framework, we review recent research on "confidence." Although familiar as a feeling, confidence is also an objective statistical quantity: an estimate of the probability that a hypothesis is correct. This model-based approach helped reveal the neural basis of decision confidence in mammals and provides a bridge to the subjective feeling of confidence in humans. These results have important implications for psychiatry, since disorders of confidence computations appear to contribute to a number of psychopathologies. More broadly, this computational approach to emotions resonates with the emerging view that psychiatric nosology may be best parameterized in terms of disorders of the cognitive computations underlying complex behavior.

  10. Towards minimal resources of measurement-based quantum computation

    International Nuclear Information System (INIS)

    Perdrix, Simon

    2007-01-01

    We improve the upper bound on the minimal resources required for measurement-only quantum computation (M A Nielsen 2003 Phys. Rev. A 308 96-100; D W Leung 2004 Int. J. Quantum Inform. 2 33; S Perdrix 2005 Int. J. Quantum Inform. 3 219-23). Minimizing the resources required for this model is a key issue for experimental realization of a quantum computer based on projective measurements. This new upper bound also allows one to reply in the negative to the open question presented by Perdrix (2004 Proc. Quantum Communication Measurement and Computing) about the existence of a trade-off between observable and ancillary qubits in measurement-only QC

  11. NASA Center for Computational Sciences: History and Resources

    Science.gov (United States)

    2000-01-01

    The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.

  12. Computer modelling of the UK wind energy resource. Phase 2. Application of the methodology

    Energy Technology Data Exchange (ETDEWEB)

    Burch, S F; Makari, M; Newton, K; Ravenscroft, F; Whittaker, J

    1993-12-31

    This report presents the results of the second phase of a programme to estimate the UK wind energy resource. The overall objective of the programme is to provide quantitative resource estimates using a mesoscale (resolution about 1km) numerical model for the prediction of wind flow over complex terrain, in conjunction with digitised terrain data and wind data from surface meteorological stations. A network of suitable meteorological stations has been established and long term wind data obtained. Digitised terrain data for the whole UK were obtained, and wind flow modelling using the NOABL computer program has been performed. Maps of extractable wind power have been derived for various assumptions about wind turbine characteristics. Validation of the methodology indicates that the results are internally consistent, and in good agreement with available comparison data. Existing isovent maps, based on standard meteorological data which take no account of terrain effects, indicate that 10m annual mean wind speeds vary between about 4.5 and 7 m/s over the UK with only a few coastal areas over 6 m/s. The present study indicates that 28% of the UK land area had speeds over 6 m/s, with many hill sites having 10m speeds over 10 m/s. It is concluded that these `first order` resource estimates represent a substantial improvement over the presently available `zero order` estimates. The results will be useful for broad resource studies and initial site screening. Detailed resource evaluation for local sites will require more detailed local modelling or ideally long term field measurements. (12 figures, 14 tables, 21 references). (Author)
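
    To illustrate why the reported difference between 4.5 and 6 m/s annual mean speeds matters for extractable power, this small sketch computes the mean wind power density assuming a Rayleigh speed distribution; the assumption and the values are illustrative only and are not part of the NOABL methodology itself.

```python
import math

def rayleigh_power_density(v_mean, rho=1.225):
    """Mean wind power density (W/m^2) for a Rayleigh-distributed wind speed
    with annual mean v_mean (m/s); E[v^3] = (6/pi) * v_mean^3 for Rayleigh."""
    return 0.5 * rho * (6.0 / math.pi) * v_mean ** 3

for v in (4.5, 6.0, 10.0):
    print(f"{v:4.1f} m/s mean -> {rayleigh_power_density(v):7.1f} W/m^2")
```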

  13. A review of Computer Science resources for learning and teaching with K-12 computing curricula: an Australian case study

    Science.gov (United States)

    Falkner, Katrina; Vivian, Rebecca

    2015-10-01

    To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age children, with the intention to engage children and increase interest, rather than to formally teach concepts and skills. What is the educational quality of existing Computer Science resources and to what extent are they suitable for classroom learning and teaching? In this paper, an assessment framework is presented to evaluate the quality of online Computer Science resources. Further, a semi-systematic review of available online Computer Science resources was conducted to evaluate resources available for classroom learning and teaching and to identify gaps in resource availability, using the Australian curriculum as a case study analysis. The findings reveal a predominance of quality resources, however, a number of critical gaps were identified. This paper provides recommendations and guidance for the development of new and supplementary resources and future research.

  14. A multipurpose computing center with distributed resources

    Science.gov (United States)

    Chudoba, J.; Adam, M.; Adamová, D.; Kouba, T.; Mikula, A.; Říkal, V.; Švec, J.; Uhlířová, J.; Vokáč, P.; Svatoš, M.

    2017-10-01

    The Computing Center of the Institute of Physics (CC IoP) of the Czech Academy of Sciences serves a broad spectrum of users with various computing needs. It runs WLCG Tier-2 center for the ALICE and the ATLAS experiments; the same group of services is used by astroparticle physics projects the Pierre Auger Observatory (PAO) and the Cherenkov Telescope Array (CTA). OSG stack is installed for the NOvA experiment. Other groups of users use directly local batch system. Storage capacity is distributed to several locations. DPM servers used by the ATLAS and the PAO are all in the same server room, but several xrootd servers for the ALICE experiment are operated in the Nuclear Physics Institute in Řež, about 10 km away. The storage capacity for the ATLAS and the PAO is extended by resources of the CESNET - the Czech National Grid Initiative representative. Those resources are in Plzen and Jihlava, more than 100 km away from the CC IoP. Both distant sites use a hierarchical storage solution based on disks and tapes. They installed one common dCache instance, which is published in the CC IoP BDII. ATLAS users can use these resources using the standard ATLAS tools in the same way as the local storage without noticing this geographical distribution. Computing clusters LUNA and EXMAG dedicated to users mostly from the Solid State Physics departments offer resources for parallel computing. They are part of the Czech NGI infrastructure MetaCentrum with distributed batch system based on torque with a custom scheduler. Clusters are installed remotely by the MetaCentrum team and a local contact helps only when needed. Users from IoP have exclusive access only to a part of these two clusters and take advantage of higher priorities on the rest (1500 cores in total), which can also be used by any user of the MetaCentrum. IoP researchers can also use distant resources located in several towns of the Czech Republic with a capacity of more than 12000 cores in total.

  15. Hydrokinetic energy resource estimates of River ERO at Lafiagi ...

    African Journals Online (AJOL)

    Hydrokinetic energy resource estimates of River ERO at Lafiagi, Kwara State, ... cost-effective renewable energy solution without requiring the construction of a ... Keywords: Hydrokinetic Power, Energy Resource, River Ero, Water Resources ...

  16. Concrete resource analysis of the quantum linear-system algorithm used to compute the electromagnetic scattering cross section of a 2D target

    Science.gov (United States)

    Scherer, Artur; Valiron, Benoît; Mau, Siun-Chuon; Alexander, Scott; van den Berg, Eric; Chapuran, Thomas E.

    2017-03-01

    We provide a detailed estimate for the logical resource requirements of the quantum linear-system algorithm (Harrow et al. in Phys Rev Lett 103:150502, 2009) including the recently described elaborations and application to computing the electromagnetic scattering cross section of a metallic target (Clader et al. in Phys Rev Lett 110:250504, 2013). Our resource estimates are based on the standard quantum-circuit model of quantum computation; they comprise circuit width (related to parallelism), circuit depth (total number of steps), the number of qubits and ancilla qubits employed, and the overall number of elementary quantum gate operations as well as more specific gate counts for each elementary fault-tolerant gate from the standard set {X, Y, Z, H, S, T, CNOT}. In order to perform these estimates, we used an approach that combines manual analysis with automated estimates generated via the Quipper quantum programming language and compiler. Our estimates pertain to the explicit example problem size N = 332,020,680 beyond which, according to a crude big-O complexity comparison, the quantum linear-system algorithm is expected to run faster than the best known classical linear-system solving algorithm. For this problem size, a desired calculation accuracy ε = 0.01 requires an approximate circuit width of 340 and a circuit depth of order 10^25 if oracle costs are excluded, and a circuit width and circuit depth of order 10^8 and 10^29, respectively, if the resource requirements of oracles are included, indicating that the commonly ignored oracle resources are considerable. In addition to providing detailed logical resource estimates, it is also the purpose of this paper to demonstrate explicitly (using a fine-grained approach rather than relying on coarse big-O asymptotic approximations) how these impressively large numbers arise with an actual circuit implementation of a quantum algorithm. While our estimates may prove to be conservative as more efficient

  17. Exploiting volatile opportunistic computing resources with Lobster

    Science.gov (United States)

    Woodard, Anna; Wolf, Matthias; Mueller, Charles; Tovar, Ben; Donnelly, Patrick; Hurtado Anampa, Kenyi; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas

    2015-12-01

    Analysis of high energy physics experiments using the Compact Muon Solenoid (CMS) at the Large Hadron Collider (LHC) can be limited by availability of computing resources. As a joint effort involving computer scientists and CMS physicists at Notre Dame, we have developed an opportunistic workflow management tool, Lobster, to harvest available cycles from university campus computing pools. Lobster consists of a management server, file server, and worker processes which can be submitted to any available computing resource without requiring root access. Lobster makes use of the Work Queue system to perform task management, while the CMS specific software environment is provided via CVMFS and Parrot. Data is handled via Chirp and Hadoop for local data storage and XrootD for access to the CMS wide-area data federation. An extensive set of monitoring and diagnostic tools have been developed to facilitate system optimisation. We have tested Lobster using the 20 000-core cluster at Notre Dame, achieving approximately 8-10k tasks running simultaneously, sustaining approximately 9 Gbit/s of input data and 340 Mbit/s of output data.

  18. Comparing computing formulas for estimating concentration ratios

    International Nuclear Information System (INIS)

    Gilbert, R.O.; Simpson, J.C.

    1984-03-01

    This paper provides guidance on the choice of computing formulas (estimators) for estimating concentration ratios and other ratio-type measures of radionuclides and other environmental contaminant transfers between ecosystem components. Mathematical expressions for the expected value of three commonly used estimators (arithmetic mean of ratios, geometric mean of ratios, and the ratio of means) are obtained when the multivariate lognormal distribution is assumed. These expressions are used to explain why these estimators will not in general give the same estimate of the average concentration ratio. They illustrate that the magnitude of the discrepancies depends on the magnitude of measurement biases, and on the variances and correlations associated with spatial heterogeneity and measurement errors. This paper also reports on a computer simulation study that compares the accuracy of eight computing formulas for estimating a ratio relationship that is constant over time and/or space. Statistical models appropriate for both controlled spiking experiments and observational field studies for either normal or lognormal distributions are considered. 24 references, 15 figures, 7 tables
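
    A minimal numerical sketch of the three commonly used estimators compared in this record (arithmetic mean of ratios, geometric mean of ratios, and ratio of means), using made-up paired concentrations, shows that the three formulas generally give different values for the same data.

```python
import numpy as np

# Illustrative paired concentrations (e.g., radionuclide in plant vs. soil)
soil  = np.array([1.2, 0.8, 2.5, 1.9, 3.1])
plant = np.array([0.30, 0.25, 0.50, 0.55, 0.70])

ratios = plant / soil
arithmetic_mean_of_ratios = ratios.mean()
geometric_mean_of_ratios = np.exp(np.log(ratios).mean())
ratio_of_means = plant.mean() / soil.mean()

print(arithmetic_mean_of_ratios, geometric_mean_of_ratios, ratio_of_means)
```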

  19. iTools: a framework for classification, categorization and integration of computational biology resources.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    2008-05-01

    Full Text Available The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long

  20. Argonne's Laboratory Computing Resource Center 2009 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B. (CLS-CI)

    2011-05-13

    Now in its seventh year of operation, the Laboratory Computing Resource Center (LCRC) continues to be an integral component of science and engineering research at Argonne, supporting a diverse portfolio of projects for the U.S. Department of Energy and other sponsors. The LCRC's ongoing mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting high-performance computing application use and development. This report describes scientific activities carried out with LCRC resources in 2009 and the broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. The LCRC Allocations Committee makes decisions on individual project allocations for Jazz. Committee members are appointed by the Associate Laboratory Directors and span a range of computational disciplines. The 350-node LCRC cluster, Jazz, began production service in April 2003 and has been a research work horse ever since. Hosting a wealth of software tools and applications and achieving high availability year after year, researchers can count on Jazz to achieve project milestones and enable breakthroughs. Over the years, many projects have achieved results that would have been unobtainable without such a computing resource. In fiscal year 2009, there were 49 active projects representing a wide cross-section of Laboratory research and almost all research divisions.

  1. Energy analysis applied to uranium resource estimation

    International Nuclear Information System (INIS)

    Mortimer, N.D.

    1980-01-01

    It is pointed out that fuel prices and ore costs are interdependent, and that in estimating ore costs (involving the cost of fuels used to mine and process the uranium) it is necessary to take into account the total use of energy by the entire fuel system, through the technique of energy analysis. The subject is discussed, and illustrated with diagrams, under the following heads: estimate of how total workable resources would depend on production costs; sensitivity of nuclear electricity prices to ore costs; variation of net energy requirement with ore grade for a typical PWR reactor design; variation of average fundamental cost of nuclear electricity with ore grade; variation of cumulative uranium resources with current maximum ore costs. (U.K.)

  2. The complexity of computing the MCD-estimator

    DEFF Research Database (Denmark)

    Bernholt, T.; Fischer, Paul

    2004-01-01

    In modern statistics the robust estimation of parameters is a central problem, i.e., an estimation that is not or only slightly affected by outliers in the data. The minimum covariance determinant (MCD) estimator (J. Amer. Statist. Assoc. 79 (1984) 871) is probably one of the most important robust estimators of location and scatter. The complexity of computing the MCD, however, was unknown and generally thought to be exponential even if the dimensionality of the data is fixed. Here we present a polynomial time algorithm for MCD for fixed dimension of the data. In contrast we show that computing the MCD-estimator is NP-hard if the dimension varies. (C) 2004 Elsevier B.V. All rights reserved.
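
    The MCD definition itself can be illustrated with an exhaustive search over subsets, as in the sketch below (feasible only for tiny samples and not the polynomial-time algorithm of the paper): the estimator picks the h-point subset whose covariance matrix has the smallest determinant.

```python
from itertools import combinations
import numpy as np

def mcd_brute_force(X, h):
    """Exhaustively find the h-point subset of X whose covariance matrix has
    the smallest determinant (the MCD); feasible only for very small samples."""
    best_det, best_subset = np.inf, None
    for idx in combinations(range(len(X)), h):
        S = np.cov(X[list(idx)].T)
        d = np.linalg.det(np.atleast_2d(S))
        if d < best_det:
            best_det, best_subset = d, idx
    return best_subset, best_det

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (12, 2)), rng.normal(8, 1, (3, 2))])  # 3 outliers
subset, det = mcd_brute_force(X, h=10)
print("MCD subset:", subset, "determinant:", det)
```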

  3. Using OSG Computing Resources with (iLC)Dirac

    CERN Document Server

    AUTHOR|(SzGeCERN)683529; Petric, Marko

    2017-01-01

    CPU cycles for small experiments and projects can be scarce, thus making use of all available resources, whether dedicated or opportunistic, is mandatory. While enabling uniform access to the LCG computing elements (ARC, CREAM), the DIRAC grid interware was not able to use OSG computing elements (GlobusCE, HTCondor-CE) without dedicated support at the grid site through so-called 'SiteDirectors', which directly submit to the local batch system. This in turn requires additional dedicated effort for small experiments on the grid site. Adding interfaces to the OSG CEs through the respective grid middleware therefore allows accessing them within the DIRAC software without additional site-specific infrastructure. This enables greater use of opportunistic resources for experiments and projects without dedicated clusters or an established computing infrastructure with the DIRAC software. To allow sending jobs to HTCondor-CE and legacy Globus computing elements inside DIRAC the required wrapper classes were develo...

  4. Quantum Computing: Selected Internet Resources for Librarians, Researchers, and the Casually Curious

    Science.gov (United States)

    Cirasella, Jill

    2009-01-01

    This article presents an annotated selection of the most important and informative Internet resources for learning about quantum computing, finding quantum computing literature, and tracking quantum computing news. All of the quantum computing resources described in this article are freely available, English-language web sites that fall into one…

  5. Enabling Grid Computing resources within the KM3NeT computing model

    Directory of Open Access Journals (Sweden)

    Filippidis Christos

    2016-01-01

    Full Text Available KM3NeT is a future European deep-sea research infrastructure hosting a new generation of neutrino detectors that – located at the bottom of the Mediterranean Sea – will open a new window on the universe and answer fundamental questions both in particle physics and astrophysics. International collaborative scientific experiments, like KM3NeT, are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. These experiments, in their majority, adopt computing models consisting of different Tiers with several computing centres and providing a specific set of services for the different steps of data processing such as detector calibration, simulation and data filtering, reconstruction and analysis. The computing requirements are extremely demanding and, usually, span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support the aforementioned demanding computing requirements we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method for the KM3NeT users to utilize the EGI computing resources in a simulation-driven use-case.

  6. Argonne's Laboratory computing resource center : 2006 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Drugan, C. D.; Pieper, G. P.

    2007-05-31

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2006, there were 76 active projects on Jazz involving over 380 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff

  7. Automating usability of ATLAS distributed computing resources

    International Nuclear Information System (INIS)

    Tupputi, S A; Girolamo, A Di; Kouba, T; Schovancová, J

    2014-01-01

    The automation of ATLAS Distributed Computing (ADC) operations is essential to reduce manpower costs and allow performance-enhancing actions, which improve the reliability of the system. In this perspective a crucial case is the automatic handling of outages of ATLAS computing sites storage resources, which are continuously exploited at the edge of their capabilities. It is challenging to adopt unambiguous decision criteria for storage resources of non-homogeneous types, sizes and roles. The recently developed Storage Area Automatic Blacklisting (SAAB) tool has provided a suitable solution, by employing an inference algorithm which processes history of storage monitoring tests outcome. SAAB accomplishes both the tasks of providing global monitoring as well as automatic operations on single sites. The implementation of the SAAB tool has been the first step in a comprehensive review of the storage areas monitoring and central management at all levels. Such review has involved the reordering and optimization of SAM tests deployment and the inclusion of SAAB results in the ATLAS Site Status Board with both dedicated metrics and views. The resulting structure allows monitoring the storage resources status with fine time-granularity and automatic actions to be taken in foreseen cases, like automatic outage handling and notifications to sites. Hence, the human actions are restricted to reporting and following up problems, where and when needed. In this work we show SAAB working principles and features. We present also the decrease of human interactions achieved within the ATLAS Computing Operation team. The automation results in a prompt reaction to failures, which leads to the optimization of resource exploitation.
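
    A toy decision rule in the spirit of inferring a storage area's status from its history of monitoring test outcomes might look like the following sketch; the thresholds and function names are hypothetical, and the actual SAAB algorithm is not reproduced here.

```python
def storage_status(test_history, fail_threshold=3, recover_threshold=2):
    """Toy blacklisting rule over a chronological list of test outcomes
    (True = passed).  Blacklist after `fail_threshold` consecutive failures,
    whitelist again after `recover_threshold` consecutive passes."""
    status = "online"
    fails = passes = 0
    for ok in test_history:
        if ok:
            passes += 1
            fails = 0
            if status == "blacklisted" and passes >= recover_threshold:
                status = "online"
        else:
            fails += 1
            passes = 0
            if fails >= fail_threshold:
                status = "blacklisted"
    return status

print(storage_status([True, False, False, False, True]))        # blacklisted
print(storage_status([False, False, False, True, True]))        # online again
```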

  8. Performance Evaluation of Resource Management in Cloud Computing Environments.

    Science.gov (United States)

    Batista, Bruno Guazzelli; Estrella, Julio Cezar; Ferreira, Carlos Henrique Gomes; Filho, Dionisio Machado Leite; Nakamura, Luis Hideo Vasconcelos; Reiff-Marganiec, Stephan; Santana, Marcos José; Santana, Regina Helena Carlucci

    2015-01-01

    Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.

  9. Performance Evaluation of Resource Management in Cloud Computing Environments.

    Directory of Open Access Journals (Sweden)

    Bruno Guazzelli Batista

    Full Text Available Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.

  10. Development of Computer-Based Resources for Textile Education.

    Science.gov (United States)

    Hopkins, Teresa; Thomas, Andrew; Bailey, Mike

    1998-01-01

    Describes the production of computer-based resources for students of textiles and engineering in the United Kingdom. Highlights include funding by the Teaching and Learning Technology Programme (TLTP), courseware author/subject expert interaction, usage test and evaluation, authoring software, graphics, computer-aided design simulation, self-test…

  11. Parallel visualization on leadership computing resources

    Energy Technology Data Exchange (ETDEWEB)

    Peterka, T; Ross, R B [Mathematics and Computer Science Division, Argonne National Laboratory, Argonne, IL 60439 (United States); Shen, H-W [Department of Computer Science and Engineering, Ohio State University, Columbus, OH 43210 (United States); Ma, K-L [Department of Computer Science, University of California at Davis, Davis, CA 95616 (United States); Kendall, W [Department of Electrical Engineering and Computer Science, University of Tennessee at Knoxville, Knoxville, TN 37996 (United States); Yu, H, E-mail: tpeterka@mcs.anl.go [Sandia National Laboratories, California, Livermore, CA 94551 (United States)

    2009-07-01

    Changes are needed in the way that visualization is performed, if we expect the analysis of scientific data to be effective at the petascale and beyond. By using similar techniques as those used to parallelize simulations, such as parallel I/O, load balancing, and effective use of interprocess communication, the supercomputers that compute these datasets can also serve as analysis and visualization engines for them. Our team is assessing the feasibility of performing parallel scientific visualization on some of the most powerful computational resources of the U.S. Department of Energy's National Laboratories in order to pave the way for analyzing the next generation of computational results. This paper highlights some of the conclusions of that research.

  12. Parallel visualization on leadership computing resources

    International Nuclear Information System (INIS)

    Peterka, T; Ross, R B; Shen, H-W; Ma, K-L; Kendall, W; Yu, H

    2009-01-01

    Changes are needed in the way that visualization is performed, if we expect the analysis of scientific data to be effective at the petascale and beyond. By using similar techniques as those used to parallelize simulations, such as parallel I/O, load balancing, and effective use of interprocess communication, the supercomputers that compute these datasets can also serve as analysis and visualization engines for them. Our team is assessing the feasibility of performing parallel scientific visualization on some of the most powerful computational resources of the U.S. Department of Energy's National Laboratories in order to pave the way for analyzing the next generation of computational results. This paper highlights some of the conclusions of that research.

  13. Uncertainty Estimate in Resources Assessment: A Geostatistical Contribution

    International Nuclear Information System (INIS)

    Souza, Luis Eduardo de; Costa, Joao Felipe C. L.; Koppe, Jair C.

    2004-01-01

    For many decades the mining industry regarded resources/reserves estimation and classification as a mere calculation requiring basic mathematical and geological knowledge. Most methods were based on geometrical procedures and spatial data distribution. Therefore, uncertainty associated with tonnages and grades either was ignored or mishandled, although various mining codes require a measure of confidence in the values reported. Traditional methods fail in reporting the level of confidence in the quantities and grades. Conversely, kriging is known to provide the best estimate and its associated variance. Among kriging methods, Ordinary Kriging (OK) is probably the most widely used one for mineral resource/reserve estimation, mainly because of its robustness and its facility in uncertainty assessment by using the kriging variance. It is also known that the OK variance is unable to recognize local data variability, an important issue when heterogeneous mineral deposits with higher and poorer grade zones are being evaluated. Alternatively, stochastic simulations are used to build local or global uncertainty about a geological attribute respecting its statistical moments. This study investigates methods capable of incorporating uncertainty into the estimates of resources and reserves via OK and sequential Gaussian and sequential indicator simulation. The results showed that for the type of mineralization studied all methods classified the tonnages similarly. The methods are illustrated using an exploration drill hole data set from a large Brazilian coal deposit.
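
    For readers unfamiliar with kriging, the following minimal ordinary kriging sketch (with an assumed exponential covariance model and illustrative coordinates and grades, not the data of this study) returns both the estimate and the kriging variance referred to in this record.

```python
import numpy as np

def exp_cov(h, sill=1.0, rng_param=100.0):
    """Exponential covariance model C(h) = sill * exp(-3h/range)."""
    return sill * np.exp(-3.0 * h / rng_param)

def ordinary_kriging(coords, values, target, sill=1.0, rng_param=100.0):
    """Ordinary kriging estimate and variance at `target` from sampled points."""
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))          # kriging system with Lagrange row
    A[:n, :n] = exp_cov(d, sill, rng_param)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = exp_cov(np.linalg.norm(coords - target, axis=1), sill, rng_param)
    sol = np.linalg.solve(A, b)
    weights, mu = sol[:n], sol[n]
    estimate = weights @ values
    variance = sill - weights @ b[:n] - mu   # ordinary kriging variance
    return estimate, variance

coords = np.array([[0.0, 0.0], [50.0, 10.0], [30.0, 60.0], [80.0, 80.0]])
grades = np.array([5.2, 4.8, 6.1, 5.5])      # illustrative grade values
print(ordinary_kriging(coords, grades, np.array([40.0, 40.0])))
```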

  14. Argonne's Laboratory Computing Resource Center : 2005 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Coghlan, S. C; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Pieper, G. P.

    2007-06-30

    Argonne National Laboratory founded the Laboratory Computing Resource Center in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. The first goal of the LCRC was to deploy a mid-range supercomputing facility to support the unmet computational needs of the Laboratory. To this end, in September 2002, the Laboratory purchased a 350-node computing cluster from Linux NetworX. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the fifty fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2005, there were 62 active projects on Jazz involving over 320 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to improve the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure

  15. Application of Selective Algorithm for Effective Resource Provisioning in Cloud Computing Environment

    OpenAIRE

    Katyal, Mayanka; Mishra, Atul

    2014-01-01

    Modern day continued demand for resource hungry services and applications in the IT sector has led to the development of Cloud computing. The Cloud computing environment involves high-cost infrastructure on one hand and needs high-scale computational resources on the other hand. These resources need to be provisioned (allocation and scheduling) to the end users in the most efficient manner so that the tremendous capabilities of the cloud are utilized effectively and efficiently. In this paper we discuss a selecti...

  16. Estimation of wind and solar resources in Mali

    Energy Technology Data Exchange (ETDEWEB)

    Badger, J.; Kamissoko, F.; Olander Rasmussen, M.; Larsen, Soeren; Guidon, N.; Boye Hansen, L.; Dewilde, L.; Alhousseini, M.; Noergaard, P.; Nygaard, I.

    2012-11-15

    The wind resource has been estimated for all of Mali at 7.5 km resolution using the KAMM/WAsP numerical wind atlas methodology. Three domains were used to cover the entire country, and three sets of wind classes were used to capture the change in large-scale forcing over the country. The final output includes generalized climate statistics for any location in Mali, giving wind direction and wind speed distribution. The modelled generalized climate statistics can be used directly in the WAsP software. The preliminary results show a wind resource which is relatively low, but which under certain conditions may be economically feasible, i.e. at favourably exposed sites giving enhanced winds, and where practical utilization is possible, given consideration to grid connection or to replacement or augmentation of diesel-based electricity systems. The solar energy resource for Mali was assessed for the period between July 2008 and June 2011 using a remote sensing based estimate of the down-welling surface shortwave flux. The remote sensing estimates were adjusted on a month-by-month basis to account for seasonal differences between the remote sensing estimates and in situ data. Calibration was found to improve the coefficient of determination as well as to decrease the mean error for both the calibration and validation data. Compared to the results presented in the "Renewable energy resources in Mali - preliminary mapping" report, which showed a tendency for underestimation compared to data from the NASA POWER/SSE database, the presented results show a very good agreement with the in situ data (after calibration) with no significant bias. Unfortunately, the NASA database only contains data up until 2005, so a similar comparison could not be done for the time period analyzed in this study, although the agreement with the historic NASA data is still useful as reference. (LN)

  17. Wind resource estimation and siting of wind turbines

    DEFF Research Database (Denmark)

    Lundtang Petersen, Erik; Mortensen, N.G.; Landberg, L.

    1994-01-01

    Detailed knowledge of the characteristics of the natural wind is necessary for the design, planning and operational aspect of wind energy systems. Here, we shall only be concerned with those meteorological aspects of wind energy planning that are termed wind resource estimation. The estimation of the wind resource ranges from the overall estimation of the mean energy content of the wind over a large area - called regional assessment - to the prediction of the average yearly energy production of a specific wind turbine at a specific location - called siting. A regional assessment will most often lead to a so-called wind atlas. A precise prediction of the wind speed at a given site is essential because for aerodynamic reasons the power output of a wind turbine is proportional to the third power of the wind speed, hence even small errors in prediction of wind speed may result in large deviations
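
    Because power output is proportional to the third power of the wind speed, a small error in the predicted mean speed produces roughly three times the relative error in predicted power; a quick numerical check with illustrative values:

```python
v_true, v_pred = 7.0, 7.35          # a 5% overestimate of the wind speed
power_ratio = (v_pred / v_true) ** 3
print(f"speed error: +5%  ->  power error: {100 * (power_ratio - 1):.1f}%")  # ~+15.8%
```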

  18. Computing Resource And Work Allocations Using Social Profiles

    Directory of Open Access Journals (Sweden)

    Peter Lavin

    2013-01-01

    Full Text Available If several distributed and disparate computer resources exist, many of which have been created for different and diverse reasons, and several large scale computing challenges also exist with similar diversity in their backgrounds, then one problem which arises in trying to assemble enough of these resources to address such challenges is the need to align and accommodate the different motivations and objectives which may lie behind the existence of both the resources and the challenges. Software agents are offered as a mainstream technology for modelling the types of collaborations and relationships needed to do this. As an initial step towards forming such relationships, agents need a mechanism to consider social and economic backgrounds. This paper explores addressing social and economic differences using a combination of textual descriptions known as social profiles and search engine technology, both of which are integrated into an agent technology.

  19. GridFactory - Distributed computing on ephemeral resources

    DEFF Research Database (Denmark)

    Orellana, Frederik; Niinimaki, Marko

    2011-01-01

    A novel batch system for high throughput computing is presented. The system is specifically designed to leverage virtualization and web technology to facilitate deployment on cloud and other ephemeral resources. In particular, it implements a security model suited for forming collaborations...

  20. A Simulation-Based Soft Error Estimation Methodology for Computer Systems

    OpenAIRE

    Sugihara, Makoto; Ishihara, Tohru; Hashimoto, Koji; Muroyama, Masanori

    2006-01-01

    This paper proposes a simulation-based soft error estimation methodology for computer systems. Accumulating soft error rates (SERs) of all memories in a computer system results in pessimistic soft error estimation. This is because memory cells are used spatially and temporally and not all soft errors in them make the computer system faulty. Our soft-error estimation methodology considers the locations and the timings of soft errors occurring at every level of memory hierarchy and estimates th...

  1. Geological control in computer-based resource estimation

    International Nuclear Information System (INIS)

    Cram, A.A.

    1992-01-01

    The economic assessment of mineral deposits potentially involves a number of phases from initial assessment through to detailed orebody evaluation and subsequently to estimation of bench or stope grades in an operational mine. Obviously the nature and quantity of information varies significantly from one extreme to the other. This paper reports that geological interpretation plays a major part during the early phases of assessment, and it logically follows that computerized orebody assessment techniques must be able to effectively utilize this information. Even in an operational mine the importance of geological control varies according to the type of orebody. The method of modelling coal seams is distinctly different from the techniques used for a massive sulphide deposit. For coal seams significant reliance is placed on the correlation of seam intercepts from holes which are widely spaced in relation to the seam (i.e. orebody) thickness. The use of geological interpretation in this case is justified by experience gained in the past, compared to the cost of reliance only on samples from a very dense drill hole pattern.

  2. Wind Resource Estimation using QuikSCAT Ocean Surface Winds

    DEFF Research Database (Denmark)

    Xu, Qing; Zhang, Guosheng; Cheng, Yongcun

    2011-01-01

    In this study, the offshore wind resources in the East China Sea and South China Sea were estimated from over ten years of QuikSCAT scatterometer wind products. Since the errors of these products are larger close to the coast due to the land contamination of the radar backscatter signal and the complexity of air-sea interaction processes, an empirical relationship that adjusts QuikSCAT winds in coastal waters was first proposed based on vessel measurements. Then the shape and scale parameters of the Weibull function are determined for wind resource estimation. The wind roses are also plotted. Results...
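
    As an illustrative sketch of the Weibull step mentioned here, the shape and scale parameters can be obtained with the common method-of-moments approximation (not necessarily the estimator used by the authors); the synthetic winds below are made up for the example.

```python
import numpy as np
from math import gamma

def weibull_params(speeds):
    """Method-of-moments approximation for Weibull shape (k) and scale (c):
    k ~ (std/mean)^-1.086 (valid for roughly 1 <= k <= 10),
    c = mean / Gamma(1 + 1/k)."""
    mean, std = np.mean(speeds), np.std(speeds, ddof=1)
    k = (std / mean) ** -1.086
    c = mean / gamma(1.0 + 1.0 / k)
    return k, c

rng = np.random.default_rng(2)
sample = 8.0 * rng.weibull(2.0, size=10000)   # synthetic winds, k=2, c=8 m/s
print(weibull_params(sample))                  # approximately (2.0, 8.0)
```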

  3. Estimation of intermediate grade uranium resources. Final report

    International Nuclear Information System (INIS)

    Lambie, F.W.; Kendall, G.R.; Klahn, L.J.; Davis, J.C.; Harbaugh, J.W.

    1980-12-01

    The purpose of this project is to analyze the technique currently used by DOE to estimate intermediate grade uranium (0.01 to 0.05% U3O8) and, if possible, suggest alternatives to improve the accuracy and precision of the estimate. There are three principal conclusions resulting from this study. They relate to the quantity, distribution and sampling of intermediate grade uranium. While the results of this study must be validated further, they indicate that DOE may be underestimating intermediate level reserves by 20 to 30%. Plots of grade of U3O8 versus tonnage of ore and tonnage U3O8 indicate grade-tonnage relationships that are essentially log-linear, at least down to 0.01% U3O8. Though this is not an unexpected finding, it may provide a technique for reducing the uncertainty of intermediate grade endowment. The results of this study indicate that a much lower drill hole density is necessary for DOE to estimate uranium resources than for a mining company to calculate ore resources. Though errors in local estimates will occur, they will tend to cancel over the entire deposit.
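
    A minimal sketch of the log-linear grade-tonnage extrapolation idea, with hypothetical tonnage figures (not the data of this study), fits log(tonnage) against log(cutoff grade) and extrapolates down to a 0.01% U3O8 cutoff.

```python
import numpy as np

# Hypothetical cumulative tonnage (Mt) above each cutoff grade (% U3O8)
cutoff_grade = np.array([0.20, 0.15, 0.10, 0.07, 0.05])
tonnage      = np.array([ 3.0,  5.5, 11.0, 19.0, 32.0])

# Fit log(tonnage) = a + b * log(grade), i.e. a log-linear grade-tonnage curve
b, a = np.polyfit(np.log(cutoff_grade), np.log(tonnage), 1)

# Extrapolate to the intermediate-grade cutoff of 0.01% U3O8
t_001 = np.exp(a + b * np.log(0.01))
print(f"extrapolated tonnage above 0.01%: {t_001:.0f} Mt")
```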

  4. Estimating the Ground Water Resources of Atoll Islands

    Directory of Open Access Journals (Sweden)

    Arne E. Olsen

    2010-01-01

    Full Text Available Ground water resources of atolls, already minimal due to the small surface area and low elevation of the islands, are also subject to recurring, and sometimes devastating, droughts. As ground water resources become the sole fresh water source when rain catchment supplies are exhausted, it is critical to assess current groundwater resources and predict their depletion during drought conditions. Several published models, both analytical and empirical, are available to estimate the steady-state freshwater lens thickness of small oceanic islands. None fully incorporates unique shallow geologic characteristics of atoll islands, and none incorporates time-dependent processes. In this paper, we provide a review of these models, and then present a simple algebraic model, derived from results of a comprehensive numerical modeling study of steady-state atoll island aquifer dynamics, to predict the ground water response to changes in recharge on atoll islands. The model provides an estimated thickness of the freshwater lens as a function of annual rainfall rate, island width, Thurber Discontinuity depth, upper aquifer hydraulic conductivity, presence or absence of a confining reef flat plate, and in the case of drought, time. Results compare favorably with published atoll island lens thickness observations. The algebraic model is incorporated into a spreadsheet interface for use by island water resources managers.
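
    For orientation, the classical Ghyben-Herzberg relation underlying many analytical freshwater-lens models (used here only as an illustration, not as the algebraic model of this paper) estimates the lens depth below sea level from the water-table elevation and the fresh/salt water densities.

```python
def ghyben_herzberg_depth(head_above_msl_m,
                          rho_fresh=1000.0, rho_salt=1025.0):
    """Classical Ghyben-Herzberg estimate of the freshwater lens depth below
    sea level (m) for a given water-table head above mean sea level (m)."""
    return head_above_msl_m * rho_fresh / (rho_salt - rho_fresh)

head = 0.3  # m, illustrative atoll water-table elevation
print(f"lens depth below sea level: {ghyben_herzberg_depth(head):.0f} m")  # ~12 m
```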

  5. Resource estimations in contingency planning for FMD

    DEFF Research Database (Denmark)

    Boklund, Anette; Sten, Mortensen; Holm Johansen, Maren

    Based on results from a stochastic simulation model, it was possible to create a simple model in Excel to estimate the requirements for personnel and materiel during an FMD outbreak in Denmark. The model can easily be adjusted when new information on resources appears from management of other cr...

  6. Data Security Risk Estimation for Information-Telecommunication Systems on the basis of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Anatoly Valeryevich Tsaregorodtsev

    2014-02-01

    Full Text Available Cloud computing will be one of the most common IT technologies to deploy applications, due to its key features: on-demand network access to a shared pool of configurable computing resources, flexibility and good quality/price ratio. Migrating to cloud architecture enables organizations to reduce the overall cost of implementing and maintaining the infrastructure and reduce development time for new business applications. There are many factors that influence the information security environment of the cloud, as its multitenant architecture brings new and more complex problems and vulnerabilities. An approach to risk estimation for use in making decisions about migrating critical data to the cloud infrastructure of an organization is proposed in the paper.

  7. Economic models for management of resources in peer-to-peer and grid computing

    Science.gov (United States)

    Buyya, Rajkumar; Stockinger, Heinz; Giddy, Jonathan; Abramson, David

    2001-07-01

    The accelerated development in Peer-to-Peer (P2P) and Grid computing has positioned them as promising next generation computing platforms. They enable the creation of Virtual Enterprises (VE) for sharing resources distributed across the world. However, resource management, application development and usage models in these environments is a complex undertaking. This is due to the geographic distribution of resources that are owned by different organizations or peers. The resource owners of each of these resources have different usage or access policies and cost models, and varying loads and availability. In order to address complex resource management issues, we have proposed a computational economy framework for resource allocation and for regulating supply and demand in Grid computing environments. The framework provides mechanisms for optimizing resource provider and consumer objective functions through trading and brokering services. In a real world market, there exist various economic models for setting the price for goods based on supply-and-demand and their value to the user. They include commodity market, posted price, tenders and auctions. In this paper, we discuss the use of these models for interaction between Grid components in deciding resource value and the necessary infrastructure to realize them. In addition to normal services offered by Grid computing systems, we need an infrastructure to support interaction protocols, allocation mechanisms, currency, secure banking, and enforcement services. Furthermore, we demonstrate the usage of some of these economic models in resource brokering through Nimrod/G deadline and cost-based scheduling for two different optimization strategies on the World Wide Grid (WWG) testbed that contains peer-to-peer resources located on five continents: Asia, Australia, Europe, North America, and South America.
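
    A toy broker in the spirit of deadline- and cost-based scheduling (greedy by price, with hypothetical resource names and prices; this is not the actual Nimrod/G algorithm) could be sketched as follows.

```python
def cheapest_within_deadline(resources, jobs, job_length_hours, deadline_hours):
    """Pick the cheapest resource mix (greedy, by price) that can run `jobs`
    tasks of `job_length_hours` each before the deadline.  `resources` is a
    list of dicts with 'name', 'cpus' and 'price' ($ per CPU-hour)."""
    needed_cpu_hours = jobs * job_length_hours
    plan, cost = [], 0.0
    for r in sorted(resources, key=lambda r: r["price"]):
        capacity = r["cpus"] * deadline_hours          # CPU-hours before deadline
        use = min(capacity, needed_cpu_hours)
        if use > 0:
            plan.append((r["name"], use))
            cost += use * r["price"]
            needed_cpu_hours -= use
        if needed_cpu_hours <= 0:
            return plan, cost
    raise RuntimeError("deadline cannot be met with the available resources")

resources = [
    {"name": "cluster-A", "cpus": 64,  "price": 0.02},
    {"name": "cluster-B", "cpus": 128, "price": 0.05},
]
print(cheapest_within_deadline(resources, jobs=500, job_length_hours=2,
                               deadline_hours=12))
```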

  8. Sustainable economic growth and exhaustible resources: A model and estimation for the US

    Directory of Open Access Journals (Sweden)

    Almuth Scholl

    2002-01-01

    Full Text Available This paper studies current models on sustainable economic growth with resource constraints and explores to what extent resource constraints can be overcome by substitution and technological change. We also study the problem of intergenerational equity and the different criteria that have been suggested in the literature. The central part of this paper is the presentation of stylized facts on exhaustible resources and an estimation of a basic model with resource constraints for US time series data. The estimated years left until depletion and the empirical trends of the ratios of capital stock and consumption to resources seem to indicate that there might be a threat to sustainable growth in the future. In our estimation, we obtain parameter values, which help to interpret the extent to which growth with exhaustible resources is sustainable.

  9. Next Generation Computer Resources: Reference Model for Project Support Environments (Version 2.0)

    National Research Council Canada - National Science Library

    Brown, Alan

    1993-01-01

    The objective of the Next Generation Computer Resources (NGCR) program is to restructure the Navy's approach to acquisition of standard computing resources to take better advantage of commercial advances and investments...

  10. An Improved Global Wind Resource Estimate for Integrated Assessment Models: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Eurek, Kelly [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sullivan, Patrick [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gleason, Michael [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hettinger, Dylan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heimiller, Donna [National Renewable Energy Lab. (NREL), Golden, CO (United States); Lopez, Anthony [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-02-01

    This paper summarizes initial steps toward improving the robustness and accuracy of global renewable resource and techno-economic assessments for use in integrated assessment models. We outline a method to construct country-level wind resource supply curves, delineated by resource quality and other parameters. Using mesoscale reanalysis data, we generate estimates for wind quality, both terrestrial and offshore, across the globe. Because not all land or water area is suitable for development, appropriate database layers provide exclusions to reduce the total resource to its technical potential. We expand upon estimates from related studies by: using a globally consistent data source of uniquely detailed wind speed characterizations; assuming a non-constant coefficient of performance for adjusting power curves for altitude; categorizing the distance from resource sites to the electric power grid; and characterizing offshore exclusions on the basis of sea ice concentrations. The product, then, is technical potential by country, classified by resource quality as determined by net capacity factor. Additional classification dimensions are available, including distance to transmission networks for terrestrial wind and distance to shore and water depth for offshore. We estimate a total global wind generation potential of 560 PWh for terrestrial wind with 90% of resource classified as low-to-mid quality, and 315 PWh for offshore wind with 67% classified as mid-to-high quality. These estimates are based on 3.5 MW composite wind turbines with 90 m hub heights, 0.95 availability, 90% array efficiency, and 5 MW/km2 deployment density in non-excluded areas. We compare the underlying technical assumptions and results with other global assessments.
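
    The arithmetic behind such supply-curve entries can be sketched as follows, using the stated 5 MW/km2 deployment density and an illustrative capacity factor; the actual study applies exclusions and resource classes that are not modeled here.

```python
def annual_generation_twh(area_km2, density_mw_per_km2=5.0, capacity_factor=0.35):
    """Annual generation (TWh) for a developable area at the assumed density;
    the capacity factor used here is illustrative, not a result of the paper."""
    capacity_mw = area_km2 * density_mw_per_km2
    return capacity_mw * capacity_factor * 8760 / 1e6   # MWh -> TWh

print(f"{annual_generation_twh(10000):.1f} TWh per 10,000 km^2")   # ~153 TWh
```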

  11. Active resources concept of computation for enterprise software

    Directory of Open Access Journals (Sweden)

    Koryl Maciej

    2017-06-01

    Full Text Available Traditional computational models for enterprise software are still to a great extent centralized. However, the rapid growth of modern computation techniques and frameworks means that contemporary software is becoming more and more distributed. Towards the development of a new complete and coherent solution for distributed enterprise software construction, a synthesis of three well-grounded concepts is proposed: the Domain-Driven Design technique of software engineering, the REST architectural style and the actor model of computation. As a result a new resources-based framework arises, which after the first cases of use seems to be useful and worthy of further research.

  12. LHCb Computing Resource usage in 2017

    CERN Document Server

    Bozzi, Concezio

    2018-01-01

    This document reports the usage of computing resources by the LHCb collaboration during the period January 1st – December 31st 2017. The data in the following sections have been compiled from the EGI Accounting portal: https://accounting.egi.eu. For LHCb-specific information, the data are taken from the DIRAC Accounting at the LHCb DIRAC Web portal: http://lhcb-portal-dirac.cern.ch.

  13. Crowd-Funding: A New Resource Cooperation Mode for Mobile Cloud Computing.

    Science.gov (United States)

    Zhang, Nan; Yang, Xiaolong; Zhang, Min; Sun, Yan

    2016-01-01

    Mobile cloud computing, which integrates the cloud computing techniques into the mobile environment, is regarded as one of the enabler technologies for 5G mobile wireless networks. There are many sporadic spare resources distributed within various devices in the networks, which can be used to support mobile cloud applications. However, these devices, with only a few spare resources, cannot support some resource-intensive mobile applications alone. If some of them cooperate with each other and share their resources, then they can support many applications. In this paper, we propose a resource cooperative provision mode referred to as "Crowd-funding", which is designed to aggregate the distributed devices together as the resource provider of mobile applications. Moreover, to facilitate high-efficiency resource management via dynamic resource allocation, different resource providers should be selected to form a stable resource coalition for different requirements. Thus, considering different requirements, we propose two different resource aggregation models for coalition formation. Finally, revenues may be allocated according to each provider's contribution, based on the concept of the "Shapley value", to enable a more impartial revenue share among the cooperators. It is shown that a dynamic and flexible resource-management method can be developed based on the proposed Crowd-funding model, relying on the spare resources in the network.

  14. Crowd-Funding: A New Resource Cooperation Mode for Mobile Cloud Computing.

    Directory of Open Access Journals (Sweden)

    Nan Zhang

    Full Text Available Mobile cloud computing, which integrates the cloud computing techniques into the mobile environment, is regarded as one of the enabler technologies for 5G mobile wireless networks. There are many sporadic spare resources distributed within various devices in the networks, which can be used to support mobile cloud applications. However, these devices, with only a few spare resources, cannot support some resource-intensive mobile applications alone. If some of them cooperate with each other and share their resources, then they can support many applications. In this paper, we propose a resource cooperative provision mode referred to as "Crowd-funding", which is designed to aggregate the distributed devices together as the resource provider of mobile applications. Moreover, to facilitate high-efficiency resource management via dynamic resource allocation, different resource providers should be selected to form a stable resource coalition for different requirements. Thus, considering different requirements, we propose two different resource aggregation models for coalition formation. Finally, revenues may be allocated according to each provider's contribution, based on the concept of the "Shapley value", to enable a more impartial revenue share among the cooperators. It is shown that a dynamic and flexible resource-management method can be developed based on the proposed Crowd-funding model, relying on the spare resources in the network.
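    The abstracts above do not specify the characteristic function used for the coalition, so the sketch below illustrates the Shapley-value revenue split with a deliberately simple, hypothetical one (revenue proportional to the demand a coalition can cover):

```python
# Minimal sketch of Shapley-value revenue sharing among cooperating resource
# providers. The characteristic function v(S) below (revenue limited by how much
# of the demand a coalition can serve) is a hypothetical choice, not the paper's model.
from itertools import combinations
from math import factorial

capacities = {"A": 4.0, "B": 2.0, "C": 1.0}   # spare resource units per provider
DEMAND, PRICE = 5.0, 10.0                      # application demand and unit price

def v(coalition):
    """Revenue a coalition can earn: it can serve at most DEMAND units."""
    return PRICE * min(sum(capacities[p] for p in coalition), DEMAND)

def shapley(players):
    n = len(players)
    phi = {p: 0.0 for p in players}
    for p in players:
        others = [q for q in players if q != p]
        for r in range(len(others) + 1):
            for subset in combinations(others, r):
                weight = factorial(r) * factorial(n - r - 1) / factorial(n)
                phi[p] += weight * (v(subset + (p,)) - v(subset))
    return phi

print(shapley(tuple(capacities)))  # each provider's share of the coalition revenue
```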

  15. Multicriteria Resource Brokering in Cloud Computing for Streaming Service

    Directory of Open Access Journals (Sweden)

    Chih-Lun Chou

    2015-01-01

    Full Text Available By leveraging cloud computing such as Infrastructure as a Service (IaaS), the outsourcing of computing resources used to support operations, including servers, storage, and networking components, is quite beneficial for various providers of Internet applications. With this increasing trend, resource allocation that both assures QoS via a Service Level Agreement (SLA) and avoids overprovisioning in order to reduce cost becomes a crucial priority and challenge in the design and operation of complex service-based platforms such as streaming services. On the other hand, providers of IaaS are also concerned about their profit performance and energy consumption while offering these virtualized resources. In this paper, considering both service-oriented and infrastructure-oriented criteria, we regard this resource allocation problem as a multicriteria decision-making problem and propose an effective trade-off approach based on a goal programming model. To validate its effectiveness, a cloud architecture for a streaming application is addressed and extensive analysis is performed for the related criteria. The results of numerical simulations show that the proposed approach strikes a balance between these conflicting criteria commendably and achieves high cost efficiency.
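    As an illustration of the goal-programming idea (not the paper's actual model), the sketch below trades off a capacity goal against a budget goal for a hypothetical streaming deployment by minimizing weighted deviations with scipy's linear programming solver; all coefficients are invented:

```python
# Hedged sketch of a goal-programming trade-off between a QoS (capacity) goal and
# a cost goal when renting VMs for a streaming service. All numbers are invented.
from scipy.optimize import linprog

capacity_per_vm = 200.0     # concurrent streams per VM
cost_per_vm = 50.0          # cost units per VM
capacity_goal = 2300.0      # streams to serve (SLA goal)
budget_goal = 500.0         # spending target

# Variable order: [x, d1m, d1p, d2m, d2p], all >= 0, x = number of VMs (relaxed).
# capacity_per_vm*x + d1m - d1p = capacity_goal   (d1m = unmet demand)
# cost_per_vm*x     + d2m - d2p = budget_goal     (d2p = overspend)
A_eq = [[capacity_per_vm, 1, -1, 0, 0],
        [cost_per_vm,     0, 0, 1, -1]]
b_eq = [capacity_goal, budget_goal]

w_unmet, w_overspend = 10.0, 1.0           # priority weights on the deviations
c = [0, w_unmet, 0, 0, w_overspend]        # minimize weighted undesirable deviations

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 5)
x, d1m, d1p, d2m, d2p = res.x
print(f"VMs ~ {x:.1f}, unmet demand {d1m:.0f}, overspend {d2p:.0f}")
```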

  16. Dynamic provisioning of local and remote compute resources with OpenStack

    Science.gov (United States)

    Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.

    2015-12-01

    Modern high-energy physics experiments rely on the extensive usage of computing resources, both for the reconstruction of measured events and for Monte-Carlo simulation. The Institut fur Experimentelle Kernphysik (EKP) at KIT is participating in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rise in complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the usage of virtualization technologies. The OpenStack project has become a widely adopted solution to virtualize hardware and offer additional services like storage and virtual machine management. This contribution will report on the incorporation of the institute's desktop machines into a private OpenStack Cloud. The additional compute resources provisioned via the virtual machines have been used for Monte-Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows will be presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point-of-entry for the user. Evaluations of the performance and stability of this setup and operational experiences will be discussed.
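    The cloud manager's interfaces are not described in the abstract, so the following is a generic, hypothetical sketch of the underlying idea of demand-driven provisioning: the number of virtual worker nodes tracks the batch-queue load. The helper names are stand-ins, not OpenStack or institute-specific calls:

```python
# Generic, hypothetical sketch of demand-driven provisioning of virtual worker
# nodes: the number of running VMs follows the batch-queue load. The names
# target_vm_count/reconcile are stand-in stubs, not real cloud-manager APIs.
import math

SLOTS_PER_VM = 8          # job slots offered by one virtual worker node
MAX_VMS = 50              # limit imposed by the remote cloud quota

def target_vm_count(idle_jobs: int) -> int:
    """VMs needed to drain the currently idle jobs, capped by the quota."""
    return min(MAX_VMS, math.ceil(idle_jobs / SLOTS_PER_VM))

def reconcile(running_vms: int, idle_jobs: int):
    """Return (vms_to_boot, vms_to_terminate) for one scheduling cycle."""
    target = target_vm_count(idle_jobs)
    if target > running_vms:
        return target - running_vms, 0
    return 0, running_vms - target

# One scheduling cycle with made-up monitoring values:
boot, kill = reconcile(running_vms=6, idle_jobs=100)
print(f"boot {boot} VMs, terminate {kill} idle VMs")   # -> boot 7, terminate 0
```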

  17. Advances in ATLAS@Home towards a major ATLAS computing resource

    CERN Document Server

    Cameron, David; The ATLAS collaboration

    2018-01-01

    The volunteer computing project ATLAS@Home has been providing a stable computing resource for the ATLAS experiment since 2013. It has recently undergone some significant developments and as a result has become one of the largest resources contributing to ATLAS computing, by expanding its scope beyond traditional volunteers and into exploitation of idle computing power in ATLAS data centres. Removing the need for virtualization on Linux and instead using container technology has significantly lowered the entry barrier for data centre participation, and in this paper we describe the implementation and results of this change. We also present other recent changes and improvements in the project. In early 2017 the ATLAS@Home project was merged into a combined LHC@Home platform, providing a unified gateway to all CERN-related volunteer computing projects. The ATLAS Event Service shifts data processing from file-level to event-level and we describe how ATLAS@Home was incorporated into this new paradigm. The finishing...

  18. LHCb Computing Resources: 2019 requests and reassessment of 2018 requests

    CERN Document Server

    Bozzi, Concezio

    2017-01-01

    This document presents the computing resources needed by LHCb in 2019 and a reassessment of the 2018 requests, as resulting from the current experience of Run2 data taking and minor changes in the LHCb computing model parameters.

  19. Assessment of Computer Software Usage for Estimating and Tender ...

    African Journals Online (AJOL)

    It has been discovered that there are limitations to the use of computer software packages in construction operations, especially estimating and tender analysis. The objectives of this research are to evaluate the level of computer software usage for estimating and tender analysis while also assessing the challenges faced by ...

  20. A Review of Computer Science Resources for Learning and Teaching with K-12 Computing Curricula: An Australian Case Study

    Science.gov (United States)

    Falkner, Katrina; Vivian, Rebecca

    2015-01-01

    To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age…

  1. Resource-Aware Load Balancing Scheme using Multi-objective Optimization in Cloud Computing

    OpenAIRE

    Kavita Rana; Vikas Zandu

    2016-01-01

    Cloud computing is a service-based, on-demand, pay-per-use model consisting of interconnected and virtualized resources delivered over the Internet. In cloud computing, there are usually a number of jobs that need to be executed with the available resources to achieve optimal performance, the least possible total completion time, the shortest response time, efficient utilization of resources, etc. Hence, job scheduling is the most important concern, which aims to ensure that the user's requirements are ...
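    The abstract is cut off before the optimization model is described; as one common ingredient of multi-objective resource selection, the sketch below filters candidate hosts down to their Pareto-optimal (non-dominated) set over three criteria that are all minimized. Host names and metrics are invented:

```python
# Hypothetical sketch of one multi-objective step in load balancing: keep only the
# Pareto-optimal (non-dominated) hosts over criteria that are all to be minimized.
# Host names and metric values are invented.

hosts = {                 # (cpu load, response time ms, energy draw W)
    "vm-1": (0.80, 120, 210),
    "vm-2": (0.35, 180, 150),
    "vm-3": (0.30, 100, 260),
    "vm-4": (0.90, 200, 300),   # dominated by vm-1 on every criterion
}

def dominates(a, b):
    """a dominates b if it is no worse on all criteria and better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

pareto = [name for name, m in hosts.items()
          if not any(dominates(other, m) for o, other in hosts.items() if o != name)]
print("Non-dominated candidates:", pareto)   # vm-4 is filtered out
```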

  2. VECTR: Virtual Environment Computational Training Resource

    Science.gov (United States)

    Little, William L.

    2018-01-01

    The Westridge Middle School Curriculum and Community Night is an annual event designed to introduce students and parents to potential employers in the Central Florida area. NASA participated in the event in 2017, and has been asked to come back for the 2018 event on January 25. We will be demonstrating our Microsoft Hololens Virtual Rovers project, and the Virtual Environment Computational Training Resource (VECTR) virtual reality tool.

  3. River suspended sediment estimation by climatic variables implication: Comparative study among soft computing techniques

    Science.gov (United States)

    Kisi, Ozgur; Shiri, Jalal

    2012-06-01

    Estimating sediment volume carried by a river is an important issue in water resources engineering. This paper compares the accuracy of three different soft computing methods, Artificial Neural Networks (ANNs), Adaptive Neuro-Fuzzy Inference System (ANFIS), and Gene Expression Programming (GEP), in estimating daily suspended sediment concentration on rivers by using hydro-meteorological data. The daily rainfall, streamflow and suspended sediment concentration data from the Eel River near Dos Rios, California, USA, are used as a case study. The comparison results indicate that the GEP model performs better than the other models in daily suspended sediment concentration estimation for the particular data sets used in this study. Levenberg-Marquardt, conjugate gradient and gradient descent training algorithms were used for the ANN models. Of the three algorithms, the conjugate gradient algorithm was found to perform best.
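    As a rough illustration of the ANN variant of such an estimator (the paper's best-performing GEP model is not reproduced here), the sketch below fits a small neural network to synthetic rainfall/streamflow/sediment data:

```python
# Illustrative sketch of an ANN-based suspended sediment estimator. The data below
# are synthetic, not the Eel River records, and the network size is arbitrary.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 500
rainfall = rng.gamma(2.0, 5.0, n)                         # mm/day (synthetic)
streamflow = 10 + 3 * rainfall + rng.normal(0, 5, n)      # m^3/s  (synthetic)
sediment = 0.4 * streamflow**1.3 + rng.normal(0, 10, n)   # mg/L   (synthetic rating)

X = np.column_stack([rainfall, streamflow])
scaler = StandardScaler().fit(X)
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(scaler.transform(X), sediment)

today = scaler.transform([[12.0, 48.0]])                  # today's rainfall and flow
print(f"Estimated suspended sediment: {model.predict(today)[0]:.1f} mg/L")
```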

  4. BelleII@home: Integrate volunteer computing resources into DIRAC in a secure way

    Science.gov (United States)

    Wu, Wenjing; Hara, Takanori; Miyake, Hideki; Ueda, Ikuo; Kan, Wenxiao; Urquijo, Phillip

    2017-10-01

    The exploitation of volunteer computing resources has become a popular practice in the HEP computing community because of the huge amount of potential computing power it provides. In recent HEP experiments, grid middleware has been used to organize the services and the resources; however, it relies heavily on X.509 authentication, which is contradictory to the untrusted nature of volunteer computing resources. One big challenge in utilizing volunteer computing resources is therefore how to integrate them into the grid middleware in a secure way. The DIRAC interware, which is commonly used as the major component of the grid computing infrastructure for several HEP experiments, poses an even bigger challenge to this paradox, as its pilot is more closely coupled with operations requiring X.509 authentication compared to the pilot implementations in its peer grid interware. The Belle II experiment is a B-factory experiment at KEK, and it uses DIRAC for its distributed computing. In the BelleII@home project, in order to integrate volunteer computing resources into the Belle II distributed computing platform in a secure way, we adopted a new approach that detaches the payload execution from the Belle II DIRAC pilot, a customized pilot that pulls and processes jobs from the Belle II distributed computing platform, so that the payload can run on volunteer computers without requiring any X.509 authentication. In this approach we developed a gateway service running on a trusted server, which handles all the operations requiring X.509 authentication. So far, we have developed and deployed the prototype of BelleII@home and tested its full workflow, which proves the feasibility of this approach. This approach can also be applied to HPC systems whose worker nodes do not have outbound connectivity to interact with the DIRAC system in general.

  5. Quantum Computing: Selected Internet Resources for Librarians, Researchers, and the Casually Curious

    OpenAIRE

    Cirasella, Jill

    2009-01-01

    This article is an annotated selection of the most important and informative Internet resources for learning about quantum computing, finding quantum computing literature, and tracking quantum computing news.

  6. Software Defined Resource Orchestration System for Multitask Application in Heterogeneous Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Qi Qi

    2016-01-01

    Full Text Available Mobile cloud computing (MCC), which combines mobile computing and the cloud concept, takes the wireless access network as the transmission medium and uses mobile devices as the client. When offloading a complicated multitask application to the MCC environment, each task executes individually in terms of its own computation, storage, and bandwidth requirements. Due to the user's mobility, the provided resources have different performance metrics that may affect the destination choice. Nevertheless, these heterogeneous MCC resources lack integrated management and can hardly cooperate with each other. Thus, how to choose the appropriate offload destination and orchestrate the resources for multiple tasks is a challenging problem. This paper realizes programmable resource provisioning for heterogeneous energy-constrained computing environments, where a software defined controller is responsible for resource orchestration, offload, and migration. The resource orchestration is formulated as a multiobjective optimization problem that contains the metrics of energy consumption, cost, and availability. Finally, a particle swarm algorithm is used to obtain approximate optimal solutions. Simulation results show that the solutions for all of our studied cases almost hit the Pareto optimum and surpass the comparative algorithm in approximation, coverage, and execution time.
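    A bare-bones particle swarm sketch of the kind of weighted multi-objective placement cost described above; the objective function and all constants are invented for illustration and do not reproduce the paper's formulation:

```python
# Basic particle swarm optimization sketch for a weighted multi-objective placement
# cost (energy + price - availability). The objective and constants are invented.
import numpy as np

rng = np.random.default_rng(1)
DIM, PARTICLES, ITERS = 3, 30, 200          # e.g. fractions of a task sent to 3 sites

def cost(x):
    energy = np.sum(x**2)                   # toy energy model
    price = np.sum(np.array([1.0, 2.0, 0.5]) * x)
    availability = np.sum(np.array([0.9, 0.99, 0.8]) * x)
    return energy + price - 2.0 * availability

pos = rng.uniform(0, 1, (PARTICLES, DIM))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([cost(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

w, c1, c2 = 0.7, 1.5, 1.5                   # inertia and acceleration coefficients
for _ in range(ITERS):
    r1, r2 = rng.uniform(size=pos.shape), rng.uniform(size=pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 1)
    vals = np.array([cost(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("Best placement fractions:", np.round(gbest, 3), "cost:", round(cost(gbest), 3))
```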

  7. General-purpose computer networks and resource sharing in ERDA. Volume 3. Remote resource-sharing experience and findings

    Energy Technology Data Exchange (ETDEWEB)

    1977-07-15

    The investigation focused on heterogeneous networks in which a variety of dissimilar computers and operating systems were interconnected nationwide. Homogeneous networks, such as MFE net and SACNET, were not considered since they could not be used for general purpose resource sharing. Issues of privacy and security are of concern in any network activity. However, consideration of privacy and security of sensitive data arise to a much lesser degree in unclassified scientific research than in areas involving personal or proprietary information. Therefore, the existing mechanisms at individual sites for protecting sensitive data were relied on, and no new protection mechanisms to prevent infringement of privacy and security were attempted. Further development of ERDA networking will need to incorporate additional mechanisms to prevent infringement of privacy. The investigation itself furnishes an excellent example of computational resource sharing through a heterogeneous network. More than twenty persons, representing seven ERDA computing sites, made extensive use of both ERDA and non-ERDA computers in coordinating, compiling, and formatting the data which constitute the bulk of this report. Volume 3 analyzes the benefits and barriers encountered in actual resource sharing experience, and provides case histories of typical applications.

  8. A Computationally Efficient Method for Polyphonic Pitch Estimation

    Directory of Open Access Journals (Sweden)

    Ruohua Zhou

    2009-01-01

    Full Text Available This paper presents a computationally efficient method for polyphonic pitch estimation. The method employs the Fast Resonator Time-Frequency Image (RTFI) as the basic time-frequency analysis tool. The approach is composed of two main stages. First, a preliminary pitch estimation is obtained by means of a simple peak-picking procedure in the pitch energy spectrum. This spectrum is calculated from the original RTFI energy spectrum according to harmonic grouping principles. Then the incorrect estimations are removed according to spectral irregularity and knowledge of the harmonic structures of the music notes played on commonly used music instruments. The new approach is compared with a variety of other frame-based polyphonic pitch estimation methods, and results demonstrate the high performance and computational efficiency of the approach.
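    Only the preliminary peak-picking stage lends itself to a compact sketch (the RTFI front end is not reproduced); here a synthetic pitch-energy spectrum over MIDI notes is searched for prominent peaks, which become candidate pitches:

```python
# Sketch of the preliminary peak-picking stage only. A synthetic "pitch energy
# spectrum" over MIDI notes is searched for prominent peaks (candidate pitches).
import numpy as np
from scipy.signal import find_peaks

midi = np.arange(36, 96)                       # candidate notes C2..B6
energy = np.random.default_rng(2).uniform(0, 0.2, midi.size)
for note in (48, 52, 55):                      # pretend a C-major triad was played
    energy[midi == note] += 1.0

peaks, _ = find_peaks(energy, height=0.5)      # simple threshold-based peak picking
candidates = midi[peaks]
freqs = 440.0 * 2 ** ((candidates - 69) / 12)  # MIDI number -> frequency in Hz
print("Candidate pitches (MIDI):", candidates.tolist())
print("Frequencies (Hz):", np.round(freqs, 1).tolist())
```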

  9. State of the art on wind resource estimation

    Energy Technology Data Exchange (ETDEWEB)

    Maribo Pedersen, B.

    1998-12-31

    With the increasing number of wind resource estimation studies carried out for regions, countries and even larger areas all over the world, the IEA finds that the time has come to stop and take stock of the various methods used in these studies. The IEA would therefore like to propose an Experts Meeting on wind resource estimation. The Experts Meeting should describe the models and databases used in the various studies. It should shed light on the strengths and shortcomings of the models and answer questions such as: where and under what circumstances should a specific model be used? What is the expected accuracy of the model's estimate? And what is its applicability? When addressing databases, the main goal will be to identify their content and scope. Further, the quality, availability and reliability of the databases must also be recognised. In the various studies of wind resources the models and databases have been combined in different ways. A final goal of the Experts Meeting is to see whether it is possible to develop systems of methods which would depend on the available input. These systems of methods should be able to address everything from the simple case (level 0) of a region with hardly any data to the complex case of a region with all available measurements: surface observations, radio soundings, satellite observations and so on. The outcome of the meeting should be an inventory of available models as well as databases and a map of already studied regions. (au)

  10. Dynamic integration of remote cloud resources into local computing clusters

    Energy Technology Data Exchange (ETDEWEB)

    Fleig, Georg; Erli, Guenther; Giffels, Manuel; Hauth, Thomas; Quast, Guenter; Schnepf, Matthias [Institut fuer Experimentelle Kernphysik, Karlsruher Institut fuer Technologie (Germany)

    2016-07-01

    In modern high-energy physics (HEP) experiments enormous amounts of data are analyzed and simulated. Traditionally dedicated HEP computing centers are built or extended to meet this steadily increasing demand for computing resources. Nowadays it is more reasonable and more flexible to utilize computing power at remote data centers providing regular cloud services to users as they can be operated in a more efficient manner. This approach uses virtualization and allows the HEP community to run virtual machines containing a dedicated operating system and transparent access to the required software stack on almost any cloud site. The dynamic management of virtual machines depending on the demand for computing power is essential for cost efficient operation and sharing of resources with other communities. For this purpose the EKP developed the on-demand cloud manager ROCED for dynamic instantiation and integration of virtualized worker nodes into the institute's computing cluster. This contribution will report on the concept of our cloud manager and the implementation utilizing a remote OpenStack cloud site and a shared HPC center (bwForCluster located in Freiburg).

  11. Discovery of resources using MADM approaches for parallel and distributed computing

    Directory of Open Access Journals (Sweden)

    Mandeep Kaur

    2017-06-01

    Full Text Available Grid, a form of parallel and distributed computing, allows the sharing of data and computational resources among its users from various geographical locations. The grid resources are diverse in terms of their underlying attributes. The majority of the state-of-the-art resource discovery techniques rely on the static resource attributes during resource selection. However, the matching resources based on the static resource attributes may not be the most appropriate resources for the execution of user applications because they may have heavy job loads, less storage space or less working memory (RAM). Hence, there is a need to consider the current state of the resources in order to find the most suitable resources. In this paper, we have proposed a two-phased multi-attribute decision making (MADM) approach for discovery of grid resources by using P2P formalism. The proposed approach considers multiple resource attributes for decision making of resource selection and provides the most suitable resource(s) to grid users. The first phase describes a mechanism to discover all matching resources and applies the SAW method to shortlist the top-ranked resources, which are communicated to the requesting super-peer. The second phase of our proposed methodology applies an integrated MADM approach (AHP-enriched PROMETHEE-II) on the list of selected resources received from different super-peers. The pairwise comparison of the resources with respect to their attributes is made and the rank of each resource is determined. The top-ranked resource is then communicated to the grid user by the grid scheduler. Our proposed methodology enables the grid scheduler to allocate the most suitable resource to the user application and also reduces the search complexity by filtering out the less suitable resources during resource discovery.
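    A small sketch of the first-phase SAW (simple additive weighting) shortlisting on invented resource attributes; the AHP-enriched PROMETHEE-II second phase is not shown:

```python
# Sketch of SAW ranking over invented grid-resource attributes. Weights, attribute
# choices and the benefit/cost split are all hypothetical example values.
import numpy as np

resources = ["R1", "R2", "R3", "R4"]
# columns: CPU (GHz, benefit), free RAM (GB, benefit), current job load (cost)
attrs = np.array([[2.4, 16,  8],
                  [3.0,  8,  2],
                  [2.0, 32, 12],
                  [3.5,  4,  1]], dtype=float)
benefit = [True, True, False]
weights = np.array([0.4, 0.3, 0.3])

norm = np.empty_like(attrs)
for j, is_benefit in enumerate(benefit):
    col = attrs[:, j]
    norm[:, j] = col / col.max() if is_benefit else col.min() / col
scores = norm @ weights

ranking = sorted(zip(resources, scores), key=lambda rs: rs[1], reverse=True)
print("SAW ranking:", [(r, round(s, 3)) for r, s in ranking])
```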

  12. Optimised resource construction for verifiable quantum computation

    International Nuclear Information System (INIS)

    Kashefi, Elham; Wallden, Petros

    2017-01-01

    Recent developments have brought the possibility of achieving scalable quantum networks and quantum devices closer. From the computational point of view these emerging technologies become relevant when they are no longer classically simulatable. Hence a pressing challenge is the construction of practical methods to verify the correctness of the outcome produced by universal or non-universal quantum devices. A promising approach that has been extensively explored is the scheme of verification via encryption through blind quantum computation. We present here a new construction that simplifies the required resources for any such verifiable protocol. We obtain an overhead that is linear in the size of the input (computation), while the security parameter remains independent of the size of the computation and can be made exponentially small (with a small extra cost). Furthermore our construction is generic and could be applied to any universal or non-universal scheme with a given underlying graph. (paper)

  13. Impact of changing computer technology on hydrologic and water resource modeling

    OpenAIRE

    Loucks, D.P.; Fedra, K.

    1987-01-01

    The increasing availability of substantial computer power at relatively low costs and the increasing ease of using computer graphics, of communicating with other computers and data bases, and of programming using high-level problem-oriented computer languages, is providing new opportunities and challenges for those developing and using hydrologic and water resources models. This paper reviews some of the progress made towards the development and application of computer support systems designe...

  14. ACToR - Aggregated Computational Toxicology Resource

    International Nuclear Information System (INIS)

    Judson, Richard; Richard, Ann; Dix, David; Houck, Keith; Elloumi, Fathi; Martin, Matthew; Cathey, Tommy; Transue, Thomas R.; Spencer, Richard; Wolf, Maritja

    2008-01-01

    ACToR (Aggregated Computational Toxicology Resource) is a database and set of software applications that bring into one central location many types and sources of data on environmental chemicals. Currently, the ACToR chemical database contains information on chemical structure, in vitro bioassays and in vivo toxicology assays derived from more than 150 sources including the U.S. Environmental Protection Agency (EPA), Centers for Disease Control (CDC), U.S. Food and Drug Administration (FDA), National Institutes of Health (NIH), state agencies, corresponding government agencies in Canada, Europe and Japan, universities, the World Health Organization (WHO) and non-governmental organizations (NGOs). At the EPA National Center for Computational Toxicology, ACToR helps manage large data sets being used in a high-throughput environmental chemical screening and prioritization program called ToxCast™.

  15. Implications of applying solar industry best practice resource estimation on project financing

    International Nuclear Information System (INIS)

    Pacudan, Romeo

    2016-01-01

    Solar resource estimation risk is one of the main solar PV project risks that influence a lender's decision in providing financing and in determining the cost of capital. More recently, a number of measures have emerged to mitigate this risk. The study focuses on the solar industry's best practice energy resource estimation and assesses its financing implications for the 27 MWp solar PV project study in Brunei Darussalam. The best practice in resource estimation uses multiple data sources through the measure-correlate-predict (MCP) technique, as compared with the standard practice that relies solely on a modelled data source. The best practice case generates resource data with lower uncertainty and yields a superior high-confidence energy production estimate compared with the standard practice case. Using project financial parameters in Brunei Darussalam for project financing and adopting the international debt-service coverage ratio (DSCR) benchmark rates, the best practice case yields DSCRs that surpass the target rates, while those of the standard practice case stay below the reference rates. The best practice case could also accommodate a higher debt share and has a lower levelized cost of electricity (LCOE), while the standard practice case would require a lower debt share but have a higher LCOE. - Highlights: •Best practice solar energy resource estimation uses multiple datasets. •Multiple datasets are combined through the measure-correlate-predict technique. •Correlated data have lower uncertainty and yield a superior high-confidence energy production estimate. •Best practice case yields debt-service coverage ratios (DSCRs) that surpass the benchmark rates. •Best practice case accommodates a high debt share and has a low levelized cost of electricity.
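    A minimal linear measure-correlate-predict (MCP) sketch on synthetic data, showing the correlate and predict steps in their simplest form; real MCP studies use more careful regression and uncertainty analysis:

```python
# Simple linear MCP sketch on synthetic data: a short on-site measurement campaign
# is regressed against a concurrent long-term reference dataset, and the relation
# is then applied to the full reference record. All values are invented.
import numpy as np

rng = np.random.default_rng(3)
# Long-term reference irradiance record (e.g. satellite/model), arbitrary units
reference = rng.normal(5.0, 1.2, 3650)                 # 10 years of daily values
# One year of on-site measurements that overlap the last 365 reference days
overlap_ref = reference[-365:]
onsite = 0.9 * overlap_ref + 0.3 + rng.normal(0, 0.2, 365)

slope, intercept = np.polyfit(overlap_ref, onsite, 1)  # measure-correlate step
predicted_site = slope * reference + intercept         # predict step (full record)

print(f"fit: site = {slope:.2f} * reference + {intercept:.2f}")
print(f"long-term site mean estimate: {predicted_site.mean():.2f} (same units as data)")
```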

  16. Distributed Dynamic State Estimation with Extended Kalman Filter

    Energy Technology Data Exchange (ETDEWEB)

    Du, Pengwei; Huang, Zhenyu; Sun, Yannan; Diao, Ruisheng; Kalsi, Karanjit; Anderson, Kevin K.; Li, Yulan; Lee, Barry

    2011-08-04

    Increasing complexity associated with large-scale renewable resources and novel smart-grid technologies necessitates real-time monitoring and control. Our previous work applied the extended Kalman filter (EKF) with the use of phasor measurement unit (PMU) data for dynamic state estimation. However, high computational complexity creates significant challenges for real-time applications. In this paper, the problem of distributed dynamic state estimation is investigated. One domain decomposition method is proposed to utilize decentralized computing resources. The performance of distributed dynamic state estimation is tested on a 16-machine, 68-bus test system.
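    A generic extended Kalman filter sketch for one machine's swing dynamics with PMU-like angle measurements; the parameters and data are synthetic, and the paper's multi-machine, domain-decomposed formulation is not reproduced:

```python
# Generic EKF sketch for dynamic state estimation of one machine's swing dynamics
# (state: rotor angle delta, speed deviation omega) from noisy angle measurements.
import numpy as np

dt, M, D, Pm, Pmax = 0.02, 10.0, 1.0, 0.8, 1.5   # toy machine parameters

def f(x):                                   # discretized swing equation
    delta, omega = x
    return np.array([delta + dt * omega,
                     omega + dt * (Pm - Pmax * np.sin(delta) - D * omega) / M])

def F_jac(x):                               # Jacobian of f
    delta, _ = x
    return np.array([[1.0, dt],
                     [-dt * Pmax * np.cos(delta) / M, 1.0 - dt * D / M]])

H = np.array([[1.0, 0.0]])                  # the PMU-like sensor reports the angle
Q, R = np.diag([1e-6, 1e-5]), np.array([[1e-4]])

rng = np.random.default_rng(4)
x_true = np.array([0.4, 0.0])
x_est, P = np.array([0.0, 0.0]), np.eye(2)

for _ in range(300):
    x_true = f(x_true)                      # simulate the "real" system
    z = H @ x_true + rng.normal(0, R[0, 0] ** 0.5, 1)
    # EKF predict
    F = F_jac(x_est)
    x_est, P = f(x_est), F @ P @ F.T + Q
    # EKF update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_est = x_est + K @ (z - H @ x_est)
    P = (np.eye(2) - K @ H) @ P

print("estimated [delta, omega]:", np.round(x_est, 3), " true:", np.round(x_true, 3))
```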

  17. Parameter Estimation for a Computable General Equilibrium Model

    DEFF Research Database (Denmark)

    Arndt, Channing; Robinson, Sherman; Tarp, Finn

    We introduce a maximum entropy approach to parameter estimation for computable general equilibrium (CGE) models. The approach applies information theory to estimating a system of nonlinear simultaneous equations. It has a number of advantages. First, it imposes all general equilibrium constraints...

  18. An Analysis of the Published Mineral Resource Estimates of the Haji-Gak Iron Deposit, Afghanistan

    International Nuclear Information System (INIS)

    Sutphin, David M.; Renaud, Karine M.; Drew, Lawrence J.

    2011-01-01

    The Haji-Gak iron deposit of eastern Bamyan Province, eastern Afghanistan, was studied extensively and resource calculations were made in the 1960s by Afghan and Russian geologists. Recalculation of the resource estimates verifies the original estimates for categories A (in-place resources known in detail), B (in-place resources known in moderate detail), and C1 (in-place resources estimated on sparse data), totaling 110.8 Mt, or about 6% of the resources as being supportable for the methods used in the 1960s. C2 (based on a loose exploration grid with little data) resources are based on one ore grade from one drill hole, and P2 (prognosis) resources are based on field observations, field measurements, and an ore grade derived from averaging grades from three better sampled ore bodies. C2 and P2 resources are 1,659.1 Mt or about 94% of the total resources in the deposit. The vast P2 resources have not been drilled or sampled to confirm their extent or quality. The purpose of this article is to independently evaluate the resources of the Haji-Gak iron deposit by using the available geologic and mineral resource information including geologic maps and cross sections, sampling data, and the analog-estimating techniques of the 1960s to determine the size and tenor of the deposit.

  19. Parameter Estimation for a Computable General Equilibrium Model

    DEFF Research Database (Denmark)

    Arndt, Channing; Robinson, Sherman; Tarp, Finn

    2002-01-01

    We introduce a maximum entropy approach to parameter estimation for computable general equilibrium (CGE) models. The approach applies information theory to estimating a system of non-linear simultaneous equations. It has a number of advantages. First, it imposes all general equilibrium constraints...

  20. Study on Cloud Computing Resource Scheduling Strategy Based on the Ant Colony Optimization Algorithm

    OpenAIRE

    Lingna He; Qingshui Li; Linan Zhu

    2012-01-01

    In order to replace traditional Internet software usage patterns and enterprise management modes, this paper proposes a new business computation mode: cloud computing. Resource scheduling strategy is the key technology in cloud computing. Based on a study of the cloud computing system structure and mode of operation, the key research addresses the job scheduling and resource allocation problems in cloud computing using an ant colony algorithm, with a detailed analysis and design of the...
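    A compact ant-colony sketch for assigning independent tasks to virtual machines so that the makespan stays low; task lengths, VM speeds and ACO constants are invented, and this is only an illustration of the idea, not the paper's exact algorithm:

```python
# Compact ant-colony sketch for task-to-VM assignment with a makespan objective.
# All task lengths, VM speeds and ACO constants are invented example values.
import numpy as np

rng = np.random.default_rng(5)
task_len = rng.uniform(10, 60, 12)          # instructions (arbitrary units)
vm_speed = np.array([1.0, 2.0, 4.0])        # instructions per second
T, V = len(task_len), len(vm_speed)

pheromone = np.ones((T, V))
heuristic = 1.0 / (task_len[:, None] / vm_speed[None, :])   # prefer fast finishes
alpha, beta, rho, ants, iters = 1.0, 2.0, 0.1, 20, 100

best_assign, best_makespan = None, np.inf
for _ in range(iters):
    for _ in range(ants):
        assign = np.empty(T, dtype=int)
        for t in range(T):
            p = (pheromone[t] ** alpha) * (heuristic[t] ** beta)
            assign[t] = rng.choice(V, p=p / p.sum())
        load = np.array([task_len[assign == v].sum() / vm_speed[v] for v in range(V)])
        makespan = load.max()
        if makespan < best_makespan:
            best_assign, best_makespan = assign.copy(), makespan
    pheromone *= (1 - rho)                                        # evaporation
    pheromone[np.arange(T), best_assign] += 1.0 / best_makespan   # reinforce best

print("best makespan:", round(best_makespan, 2), "assignment:", best_assign.tolist())
```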

  1. Perspectives on Sharing Models and Related Resources in Computational Biomechanics Research.

    Science.gov (United States)

    Erdemir, Ahmet; Hunter, Peter J; Holzapfel, Gerhard A; Loew, Leslie M; Middleton, John; Jacobs, Christopher R; Nithiarasu, Perumal; Löhner, Rainlad; Wei, Guowei; Winkelstein, Beth A; Barocas, Victor H; Guilak, Farshid; Ku, Joy P; Hicks, Jennifer L; Delp, Scott L; Sacks, Michael; Weiss, Jeffrey A; Ateshian, Gerard A; Maas, Steve A; McCulloch, Andrew D; Peng, Grace C Y

    2018-02-01

    The role of computational modeling for biomechanics research and related clinical care will be increasingly prominent. The biomechanics community has been developing computational models routinely for exploration of the mechanics and mechanobiology of diverse biological structures. As a result, a large array of models, data, and discipline-specific simulation software has emerged to support endeavors in computational biomechanics. Sharing computational models and related data and simulation software has first become a utilitarian interest, and now, it is a necessity. Exchange of models, in support of knowledge exchange provided by scholarly publishing, has important implications. Specifically, model sharing can facilitate assessment of reproducibility in computational biomechanics and can provide an opportunity for repurposing and reuse, and a venue for medical training. The community's desire to investigate biological and biomechanical phenomena crossing multiple systems, scales, and physical domains, also motivates sharing of modeling resources as blending of models developed by domain experts will be a required step for comprehensive simulation studies as well as the enhancement of their rigor and reproducibility. The goal of this paper is to understand current perspectives in the biomechanics community for the sharing of computational models and related resources. Opinions on opportunities, challenges, and pathways to model sharing, particularly as part of the scholarly publishing workflow, were sought. A group of journal editors and a handful of investigators active in computational biomechanics were approached to collect short opinion pieces as a part of a larger effort of the IEEE EMBS Computational Biology and the Physiome Technical Committee to address model reproducibility through publications. A synthesis of these opinion pieces indicates that the community recognizes the necessity and usefulness of model sharing. There is a strong will to facilitate

  2. Computer-aided resource planning and scheduling for radiological services

    Science.gov (United States)

    Garcia, Hong-Mei C.; Yun, David Y.; Ge, Yiqun; Khan, Javed I.

    1996-05-01

    There exists tremendous opportunity in hospital-wide resource optimization based on system integration. This paper defines the resource planning and scheduling requirements integral to PACS, RIS and HIS integration. A multi-site case study is conducted to define the requirements. A well-tested planning and scheduling methodology, called the Constrained Resource Planning model, has been applied to the chosen problem of radiological service optimization. This investigation focuses on resource optimization issues for minimizing the turnaround time to increase clinical efficiency and customer satisfaction, particularly in cases where the scheduling of multiple exams is required for a patient. How best to combine information system efficiency and human intelligence in improving radiological services is described. Finally, an architecture for interfacing a computer-aided resource planning and scheduling tool with the existing PACS, HIS and RIS implementation is presented.
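    One simple turnaround-oriented heuristic that such a planning tool could embody is to book a patient's exams back-to-back on the earliest available modality slots; the sketch below uses invented durations and queue states, not the paper's Constrained Resource Planning model:

```python
# Minimal sketch of a turnaround-minimizing heuristic: schedule a patient's multiple
# exams back-to-back on the earliest available modality slots. Exam durations and
# current modality queues are invented example values.

modality_free_at = {"CT": 30, "MRI": 90, "US": 10}     # minutes from now
exams = [("US", 20), ("CT", 15), ("MRI", 45)]          # (modality, duration)

t_patient = 0                 # time at which the patient becomes available
schedule = []
for modality, duration in exams:                        # exams taken in order
    start = max(t_patient, modality_free_at[modality])  # earliest feasible start
    end = start + duration
    schedule.append((modality, start, end))
    modality_free_at[modality] = end                    # modality busy until then
    t_patient = end                                     # patient busy until then

for modality, start, end in schedule:
    print(f"{modality}: {start:>3}-{end:>3} min")
print("turnaround:", t_patient, "minutes")
```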

  3. Assessing attitudes toward computers and the use of Internet resources among undergraduate microbiology students

    Science.gov (United States)

    Anderson, Delia Marie Castro

    Computer literacy and use have become commonplace in our colleges and universities. In an environment that demands the use of technology, educators should be knowledgeable of the components that make up the overall computer attitude of students and be willing to investigate the processes and techniques of effective teaching and learning that can take place with computer technology. The purpose of this study is twofold. First, it investigates the relationship between computer attitudes and gender, ethnicity, and computer experience. Second, it addresses the question of whether, and to what extent, students' attitudes toward computers change over a 16-week period in an undergraduate microbiology course that supplements the traditional lecture with computer-driven assignments. Multiple regression analyses, using data from the Computer Attitudes Scale (Loyd & Loyd, 1985), showed that, in the experimental group, no significant relationships were found between computer anxiety and gender or ethnicity or between computer confidence and gender or ethnicity. However, students who used computers the longest (p = .001) and who were self-taught (p = .046) had the lowest computer anxiety levels. Likewise, students who used computers the longest (p = .001) and who were self-taught (p = .041) had the highest confidence levels. No significant relationships between computer liking, usefulness, or the use of Internet resources and gender, ethnicity, or computer experience were found. Dependent T-tests were performed to determine whether computer attitude scores (pretest and posttest) increased over a 16-week period for students who had been exposed to computer-driven assignments and other Internet resources. Results showed that students in the experimental group were less anxious about working with computers and considered computers to be more useful. In the control group, no significant changes in computer anxiety, confidence, liking, or usefulness were noted. Overall, students in

  4. Can the Teachers' Creativity Overcome Limited Computer Resources?

    Science.gov (United States)

    Nikolov, Rumen; Sendova, Evgenia

    1988-01-01

    Describes experiences of the Research Group on Education (RGE) at the Bulgarian Academy of Sciences and the Ministry of Education in using limited computer resources when teaching informatics. Topics discussed include group projects; the use of Logo; ability grouping; and out-of-class activities, including publishing a pupils' magazine. (13…

  5. Geological 3-D modelling and resources estimation of the Budenovskoye uranium deposit (Kazakhstan)

    International Nuclear Information System (INIS)

    Boytsov, A.; Heyns, M.; Seredkin, M.

    2014-01-01

    The Budenovskoye deposit is the biggest sandstone-hosted, roll front type uranium deposit in Kazakhstan and in the world. Uranium mineralization occurs in the unconsolidated lacustrine-alluvial sediments of the Late Cretaceous Mynkuduk and Inkuduk horizons. The Budenovskoye deposit was split into four areas for development, with the present Karatau ISL Mine operating the No. 2 area and the Akbastau ISL Mine the Nos. 1, 3 and 4 areas. The mines are owned by Kazatomprom and Uranium One in equal shares. CSA Global was retained by Uranium One to update, in accordance with NI 43-101, the Mineral Resource estimates for the Karatau and Akbastau Mines. The modelling reports show a significant increase in total uranium resource tonnage at both mines when compared to the March 2012 NI 43-101 resource estimate: at Karatau, measured and indicated resources increased by 586%, while at Akbastau they increased by 286%. The modelling also added 55,766 tonnes of U to the Karatau Inferred Mineral Resource category. The new estimates result from the application of 3-D modelling techniques to the extensive database of drilling information and new exploration activities.

  6. Development and comparison of computational models for estimation of absorbed organ radiation dose in rainbow trout (Oncorhynchus mykiss) from uptake of iodine-131

    International Nuclear Information System (INIS)

    Martinez, N.E.; Johnson, T.E.; Capello, K.; Pinder, J.E.

    2014-01-01

    This study develops and compares different, increasingly detailed anatomical phantoms for rainbow trout (Oncorhynchus mykiss) for the purpose of estimating organ absorbed radiation dose and dose rates from ¹³¹I uptake in multiple organs. The models considered are: a simplistic geometry considering a single organ, a more specific geometry employing additional organs with anatomically relevant size and location, and voxel reconstruction of internal anatomy obtained from CT imaging (referred to as CSUTROUT). Dose Conversion Factors (DCFs) for whole body as well as selected organs of O. mykiss were computed using Monte Carlo modeling, and combined with estimated activity concentrations, to approximate dose rates and ultimately determine cumulative radiation dose (μGy) to selected organs after several half-lives of ¹³¹I. The different computational models provided similar results, especially for source organs (less than 30% difference between estimated doses), and whole body DCFs for each model (∼3 × 10⁻³ μGy d⁻¹ per Bq kg⁻¹) were comparable to DCFs listed in ICRP 108 for ¹³¹I. The main benefit provided by the computational models developed here is the ability to accurately determine organ dose. A conservative mass-ratio approach may provide reasonable results for sufficiently large organs, but is only applicable to individual source organs. Although CSUTROUT is the more anatomically realistic phantom, it required much more resource dedication to develop and is less flexible than the stylized phantom for similar results. There may be instances where a detailed phantom such as CSUTROUT is appropriate, but generally the stylized phantom appears to be the best choice for an ideal balance between accuracy and resource requirements. - Highlights: • Computational models (phantoms) are developed for rainbow trout internal dosimetry. • Phantoms are combined with empirical models for ¹³¹I uptake to estimate dose. • Voxel and stylized phantoms predict

  7. Estimating uranium resources and production. A guide to future supply

    International Nuclear Information System (INIS)

    Taylor, D.M.; Haeussermann, W.

    1983-01-01

    Nuclear power can only continue to grow if sufficient fuel, uranium, is available. Concern has been expressed that, in the not too distant future, the supply of uranium may be inadequate to meet reactor development. This will not be the case. Uranium production capability, actual and planned, is the main indicator of short- and medium-term supply. However, for the longer term, uranium resource estimates and projections of the possible rate of production from the resource base are important. Once an estimate has been made of the resources contained in a deposit, several factors influence the decision to produce the uranium and also the rates at which the uranium can be produced. The effect of these factors, which include uranium market trends and ever increasing lead times from discovery to production, must be taken into account when making projections of future production capability and before comparing these with forecasts of future uranium requirements. The uranium resource base has developed over the last two decades mainly in response to dramatically changing projections of natural uranium requirements. A study of this development and the changes in production, together with the most recent data, shows that in the short- and medium-term, production from already discovered resources should be sufficient to cover any likely reactor requirements. Studies such as those undertaken during the International Uranium Resources Evaluation Project, and others which project future discovery rates and production, are supported by past experience in resource development in showing that uranium supply could continue to meet demand until well into the next century. The uranium supply potential has lessened the need for the early large-scale global introduction of the breeder reactor

  8. LHCb computing model

    CERN Document Server

    Frank, M; Pacheco, Andreu

    1998-01-01

    This document is a first attempt to describe the LHCb computing model. The CPU power needed to process data for the event filter and reconstruction is estimated to be 2.2 × 10⁶ MIPS. This will be installed at the experiment and will be reused during non data-taking periods for reprocessing. The maximal I/O of these activities is estimated to be around 40 MB/s. We have studied three basic models concerning the placement of the CPU resources for the other computing activities, Monte Carlo simulation (1.4 × 10⁶ MIPS) and physics analysis (0.5 × 10⁶ MIPS): CPU resources may either be located at the physicist's homelab, national computer centres (Regional Centres) or at CERN. The CPU resources foreseen for analysis are sufficient to allow 100 concurrent analyses. It is assumed that physicists will work in physics groups that produce analysis data at an average rate of 4.2 MB/s or 11 TB per month. However, producing these group analysis data requires reading capabilities of 660 MB/s. It is further assu...

  9. Integration of Openstack cloud resources in BES III computing cluster

    Science.gov (United States)

    Li, Haibo; Cheng, Yaodong; Huang, Qiulan; Cheng, Zhenjing; Shi, Jingyan

    2017-10-01

    Cloud computing provides a new technical means for data processing in high energy physics experiments. However, in a traditional job management system the resources of each queue are fixed and resource usage is static. In order to make it simple and transparent for physicists to use, we developed a virtual cluster system (vpmanager) to integrate IHEPCloud and different batch systems such as Torque and HTCondor. Vpmanager provides dynamic virtual machine scheduling according to the job queue. The BES III use case results show that resource efficiency is greatly improved.

  10. Estimating the Economic Impacts of Recreation Response to Resource Management Alternatives

    Science.gov (United States)

    Donald B.K. English; J. Michael Bowker; John C. Bergstrom; H. Ken Cordell

    1995-01-01

    Managing forest resources involves tradeoffs and making decisions among resource management alternatives. Some alternatives will lead to changes in the level of recreation visitation and the amount of associated visitor spending. Thus, the alternatives can affect local economies. This paper reports a method that can be used to estimate the economic impacts of such...

  11. Recent development of computational resources for new antibiotics discovery

    DEFF Research Database (Denmark)

    Kim, Hyun Uk; Blin, Kai; Lee, Sang Yup

    2017-01-01

    Understanding a complex working mechanism of biosynthetic gene clusters (BGCs) encoding secondary metabolites is a key to discovery of new antibiotics. Computational resources continue to be developed in order to better process increasing volumes of genome and chemistry data, and thereby better...

  12. Mobile Cloud Computing: Resource Discovery, Session Connectivity and Other Open Issues

    NARCIS (Netherlands)

    Schüring, Markus; Karagiannis, Georgios

    2011-01-01

    Abstract—Cloud computing can be considered as a model that provides network access to a shared pool of resources, such as storage and computing power, which can be rapidly provisioned and released with minimal management effort. This paper describes a research activity in the area of mobile cloud

  13. Air quality estimation by computational intelligence methodologies

    Directory of Open Access Journals (Sweden)

    Ćirić Ivan T.

    2012-01-01

    Full Text Available The aim of this study is to compare different computational intelligence methodologies based on artificial neural networks used for forecasting an air quality parameter - the emission of CO2 - in the city of Niš. Firstly, inputs of the CO2 emission estimator are analyzed and their measurement is explained. It is known that traffic is the single largest emitter of CO2 in Europe. Therefore, a proper treatment of this component of pollution is very important for precise estimation of emission levels. With this in mind, measurements of traffic frequency and CO2 concentration were carried out at critical intersections in the city, as well as monitoring of vehicle direction at the crossroads. Finally, based on experimental data, different soft computing estimators were developed, such as a feed-forward neural network, a recurrent neural network, and a hybrid neuro-fuzzy estimator of CO2 emission levels. Test data for some characteristic cases, presented at the end of the paper, show good agreement between the developed estimator outputs and the experimental data. The presented results are a true indicator of the usability of the implemented method. [Project of the Ministry of Science of the Republic of Serbia, no. III42008-2/2011: Evaluation of Energy Performances, and no. TR35016/2011: Indoor Environment Quality of Educational Buildings in Serbia with Impact to Health and Research of MHD Flows around the Bodies, in the Tip Clearances and Channels and Application in the MHD Pumps Development

  14. Comparison of methods used to estimate conventional undiscovered petroleum resources: World examples

    Science.gov (United States)

    Ahlbrandt, T.S.; Klett, T.R.

    2005-01-01

    Various methods for assessing undiscovered oil, natural gas, and natural gas liquid resources were compared in support of the USGS World Petroleum Assessment 2000. Discovery process, linear fractal, parabolic fractal, engineering estimates, PETRIMES, Delphi, and the USGS 2000 methods were compared. Three comparisons of these methods were made in: (1) the Neuquen Basin province, Argentina (different assessors, same input data); (2) provinces in North Africa, Oman, and Yemen (same assessors, different methods); and (3) the Arabian Peninsula, Arabian (Persian) Gulf, and North Sea (different assessors, different methods). A fourth comparison (same assessors, same assessment methods but different geologic models), between results from structural and stratigraphic assessment units in the North Sea used only the USGS 2000 method, and hence compared the type of assessment unit rather than the method. In comparing methods, differences arise from inherent differences in assumptions regarding: (1) the underlying distribution of the parent field population (all fields, discovered and undiscovered), (2) the population of fields being estimated; that is, the entire parent distribution or the undiscovered resource distribution, (3) inclusion or exclusion of large outlier fields; (4) inclusion or exclusion of field (reserve) growth, (5) deterministic or probabilistic models, (6) data requirements, and (7) scale and time frame of the assessment. Discovery process, Delphi subjective consensus, and the USGS 2000 method yield comparable results because similar procedures are employed. In mature areas such as the Neuquen Basin province in Argentina, the linear and parabolic fractal and engineering methods were conservative compared to the other five methods and relative to new reserve additions there since 1995. The PETRIMES method gave the most optimistic estimates in the Neuquen Basin. In less mature areas, the linear fractal method yielded larger estimates relative to other methods
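    Of the methods listed, the parabolic fractal is the easiest to sketch compactly: log field size is fit against log rank with a quadratic and extrapolated to not-yet-discovered ranks. The field sizes below are synthetic and the cutoff of 50 additional ranks is arbitrary:

```python
# Sketch of the parabolic-fractal idea: fit log10(field size) against log10(rank)
# with a quadratic and extrapolate to undiscovered ranks. Field sizes are synthetic.
import numpy as np

discovered = np.array([900, 450, 300, 210, 150, 120, 95, 80, 70, 60,
                       52, 47, 41, 37, 33, 30, 27, 25, 23, 21], float)  # MMBOE
rank = np.arange(1, discovered.size + 1)

coeffs = np.polyfit(np.log10(rank), np.log10(discovered), 2)   # parabolic fit
model = np.poly1d(coeffs)

future_ranks = np.arange(discovered.size + 1, discovered.size + 51)
undiscovered = 10 ** model(np.log10(future_ranks))

print("fitted coefficients:", np.round(coeffs, 3))
print(f"estimated undiscovered volume (next 50 fields): {undiscovered.sum():.0f} MMBOE")
```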

  15. Computer modelling of the UK wind energy resource: UK wind speed data package and user manual

    Energy Technology Data Exchange (ETDEWEB)

    Burch, S F; Ravenscroft, F

    1993-12-31

    A software package has been developed for IBM-PC or true compatibles. It is designed to provide easy access to the results of a programme of work to estimate the UK wind energy resource. Mean wind speed maps and quantitative resource estimates were obtained using the NOABL mesoscale (1 km resolution) numerical model for the prediction of wind flow over complex terrain. NOABL was used in conjunction with digitised terrain data and wind data from surface meteorological stations for a ten year period (1975-1984) to provide digital UK maps of mean wind speed at 10m, 25m and 45m above ground level. Also included in the derivation of these maps was the use of the Engineering Science Data Unit (ESDU) method to model the effect on wind speed of the abrupt change in surface roughness that occurs at the coast. With the wind speed software package, the user is able to obtain a display of the modelled wind speed at 10m, 25m and 45m above ground level for any location in the UK. The required co-ordinates are simply supplied by the user, and the package displays the selected wind speed. This user manual summarises the methodology used in the generation of these UK maps and shows computer generated plots of the 25m wind speeds in 200 x 200 km regions covering the whole UK. The uncertainties inherent in the derivation of these maps are also described, and notes given on their practical usage. The present study indicated that 23% of the UK land area had speeds over 6 m/s, with many hill sites having 10m speeds over 10 m/s. It is concluded that these `first order` resource estimates represent a substantial improvement over the presently available `zero order` estimates. (18 figures, 3 tables, 6 references). (author)

  16. Computational resources for ribosome profiling: from database to Web server and software.

    Science.gov (United States)

    Wang, Hongwei; Wang, Yan; Xie, Zhi

    2017-08-14

    Ribosome profiling is emerging as a powerful technique that enables genome-wide investigation of in vivo translation at sub-codon resolution. The increasing application of ribosome profiling in recent years has achieved remarkable progress toward understanding the composition, regulation and mechanism of translation. This benefits from not only the awesome power of ribosome profiling but also an extensive range of computational resources available for ribosome profiling. At present, however, a comprehensive review on these resources is still lacking. Here, we survey the recent computational advances guided by ribosome profiling, with a focus on databases, Web servers and software tools for storing, visualizing and analyzing ribosome profiling data. This review is intended to provide experimental and computational biologists with a reference to make appropriate choices among existing resources for the question at hand.

  17. ATLAS Tier-2 at the Compute Resource Center GoeGrid in Göttingen

    Science.gov (United States)

    Meyer, Jörg; Quadt, Arnulf; Weber, Pavel; ATLAS Collaboration

    2011-12-01

    GoeGrid is a grid resource center located in Göttingen, Germany. The resources are commonly used, funded, and maintained by communities doing research in the fields of grid development, computer science, biomedicine, high energy physics, theoretical physics, astrophysics, and the humanities. For the high energy physics community, GoeGrid serves as a Tier-2 center for the ATLAS experiment as part of the world-wide LHC computing grid (WLCG). The status and performance of the Tier-2 center are presented with a focus on the interdisciplinary setup and administration of the cluster. Given the various requirements of the different communities on the hardware and software setup, the challenge of the common operation of the cluster is detailed. The benefits are an efficient use of computer and personpower resources.

  18. Copper Mountain, Wyoming, intermediate-grade uranium resource assessment project. Final report. National Uranium Resource Evaluation

    International Nuclear Information System (INIS)

    Madson, M.E.; Ludlam, J.R.; Fukui, L.M.

    1982-11-01

    Intermediate-grade uranium resources were delineated and estimated for Eocene and Precambrian host rock environments in the 39.64 mi² Copper Mountain, Wyoming, assessment area. Geologic reconnaissance and geochemical, geophysical, petrologic, borehole, and structural data were interpreted and used to develop a genetic model for uranium mineralization in these environments. Development of a structural scoring system and application of computer graphics in a high-confidence control area established the basis for estimations of uranium resources in the total assessment area. 8 figures, 5 tables

  19. Computationally fast estimation of muscle tension for realtime bio-feedback.

    Science.gov (United States)

    Murai, Akihiko; Kurosaki, Kosuke; Yamane, Katsu; Nakamura, Yoshihiko

    2009-01-01

    In this paper, we propose a method for realtime estimation of whole-body muscle tensions. The main problem of muscle tension estimation is that there are infinite number of solutions to realize a particular joint torque due to the actuation redundancy. Numerical optimization techniques, e.g. quadratic programming, are often employed to obtain a unique solution, but they are usually computationally expensive. For example, our implementation of quadratic programming takes about 0.17 sec per frame on the musculoskeletal model with 274 elements, which is far from realtime computation. Here, we propose to reduce the computational cost by using EMG data and by reducing the number of unknowns in the optimization. First, we compute the tensions of muscles with surface EMG data based on a biological muscle data, which is a very efficient process. We also assume that their synergists have the same activity levels and compute their tensions with the same model. Tensions of the remaining muscles are then computed using quadratic programming, but the number of unknowns is significantly reduced by assuming that the muscles in the same heteronymous group have the same activity level. The proposed method realizes realtime estimation and visualization of the whole-body muscle tensions that can be applied to sports training and rehabilitation.
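
    A minimal Python sketch of the underlying redundancy problem the paper addresses: distributing a known joint torque over redundant muscles by quadratic programming. The moment arms, target torque and tension limits below are assumed values for illustration, not the authors' 274-element model.

        # Hypothetical sketch: resolve muscle-tension redundancy for one joint
        # by minimising squared activations subject to the torque constraint.
        import numpy as np
        from scipy.optimize import minimize

        moment_arms = np.array([[0.04, 0.03, 0.05]])   # 1 joint, 3 muscles (m), assumed
        tau = np.array([30.0])                          # desired joint torque (N m), assumed
        f_max = np.array([800.0, 600.0, 1000.0])        # maximum tensions (N), assumed

        objective = lambda f: np.sum((f / f_max) ** 2)  # minimise squared activations
        constraint = {"type": "eq", "fun": lambda f: moment_arms @ f - tau}
        result = minimize(objective, x0=np.zeros(3), bounds=[(0, fm) for fm in f_max],
                          constraints=[constraint], method="SLSQP")
        print(result.x)  # one feasible tension distribution (N)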

  20. On-demand provisioning of HEP compute resources on cloud sites and shared HPC centers

    Science.gov (United States)

    Erli, G.; Fischer, F.; Fleig, G.; Giffels, M.; Hauth, T.; Quast, G.; Schnepf, M.; Heese, J.; Leppert, K.; Arnaez de Pedro, J.; Sträter, R.

    2017-10-01

    This contribution reports on solutions, experiences and recent developments with the dynamic, on-demand provisioning of remote computing resources for analysis and simulation workflows. Local resources of a physics institute are extended by private and commercial cloud sites, ranging from the inclusion of desktop clusters over institute clusters to HPC centers. Rather than relying on dedicated HEP computing centers, it is nowadays more reasonable and flexible to utilize remote computing capacity via virtualization techniques or container concepts. We report on recent experience from incorporating a remote HPC center (NEMO Cluster, Freiburg University) and resources dynamically requested from the commercial provider 1&1 Internet SE into our institute’s computing infrastructure. The Freiburg HPC resources are requested via the standard batch system, allowing HPC and HEP applications to be executed simultaneously, such that regular batch jobs run side by side with virtual machines managed via OpenStack [1]. For the inclusion of the 1&1 commercial resources, a Python API and SDK as well as the possibility to upload images were available. Large-scale tests prove the capability to serve the scientific use case in the European 1&1 datacenters. The described environment at the Institute of Experimental Nuclear Physics (IEKP) at KIT serves the needs of researchers participating in the CMS and Belle II experiments. In total, resources exceeding half a million CPU hours have been provided by remote sites.

  1. Estimation of uranium resources by life-cycle or discovery-rate models: a critique

    International Nuclear Information System (INIS)

    Harris, D.P.

    1976-10-01

    This report was motivated primarily by M. A. Lieberman's "United States Uranium Resources: An Analysis of Historical Data" (Science, April 30). His conclusion that only 87,000 tons of U₃O₈ resources recoverable at a forward cost of $8/lb remain to be discovered is criticized. It is shown that there is no theoretical basis for selecting the exponential or any other function for the discovery rate. Some of the economic (productivity, inflation) and data issues involved in the analysis of undiscovered, recoverable U₃O₈ resources based on discovery rates of $8 reserves are discussed. The problem of the ratio of undiscovered $30 resources to undiscovered $8 resources is considered. It is concluded that: all methods for the estimation of unknown resources must employ a model of some form of the endowment-exploration-production complex, but every model is a simplification of the real world, and every estimate is intrinsically uncertain. The life-cycle model is useless for the appraisal of undiscovered, recoverable U₃O₈, and the discovery rate model underestimates these resources.
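
    For reference, a short Python sketch of the exponential discovery-rate model that the report criticises: if discoveries per unit of exploration decline as A·exp(-kE), the implied ultimate resource is A/k. The data below are synthetic, purely to illustrate the fit; they are not Lieberman's figures.

        # Illustrative only: fit an exponential discovery-rate curve and read
        # off the ultimate resource it implies (synthetic data).
        import numpy as np

        footage = np.linspace(0, 300, 13)              # cumulative exploration effort (synthetic)
        rate = 4.0 * np.exp(-0.012 * footage)          # discoveries per unit effort (synthetic)

        slope, intercept = np.polyfit(footage, np.log(rate), 1)
        a, k = np.exp(intercept), -slope
        print(f"fitted A = {a:.2f}, k = {k:.4f}, implied ultimate resource = {a / k:.0f}")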

  2. Do Clouds Compute? A Framework for Estimating the Value of Cloud Computing

    Science.gov (United States)

    Klems, Markus; Nimis, Jens; Tai, Stefan

    On-demand provisioning of scalable and reliable compute services, along with a cost model that charges consumers based on actual service usage, has been an objective in distributed computing research and industry for a while. Cloud Computing promises to deliver on this objective: consumers are able to rent infrastructure in the Cloud as needed, deploy applications and store data, and access them via Web protocols on a pay-per-use basis. The acceptance of Cloud Computing, however, depends on the ability for Cloud Computing providers and consumers to implement a model for business value co-creation. Therefore, a systematic approach to measure costs and benefits of Cloud Computing is needed. In this paper, we discuss the need for valuation of Cloud Computing, identify key components, and structure these components in a framework. The framework assists decision makers in estimating Cloud Computing costs and to compare these costs to conventional IT solutions. We demonstrate by means of representative use cases how our framework can be applied to real world scenarios.

  3. Function Package for Computing Quantum Resource Measures

    Science.gov (United States)

    Huang, Zhiming

    2018-05-01

    In this paper, we present a function package to calculate quantum resource measures and dynamics of open systems. Our package includes common operators and operator lists, frequently-used functions for computing quantum entanglement, quantum correlation, quantum coherence, quantum Fisher information and dynamics in noisy environments. We briefly explain the functions of the package and illustrate how to use the package with several typical examples. We expect that this package is a useful tool for future research and education.
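
    As an illustration of one such measure (not taken from the package itself), a short Python sketch computing the l1-norm of coherence of a single-qubit density matrix:

        # Hypothetical sketch: l1-norm of coherence, i.e. the sum of the absolute
        # values of the off-diagonal elements of a density matrix.
        import numpy as np

        def l1_coherence(rho):
            return np.sum(np.abs(rho)) - np.sum(np.abs(np.diag(rho)))

        rho = np.array([[0.6, 0.2 + 0.1j],
                        [0.2 - 0.1j, 0.4]])
        print(l1_coherence(rho))  # ~0.447 for this example state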

  4. Estimating the Wind Resource in Uttarakhand: Comparison of Dynamic Downscaling with Doppler Lidar Wind Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Lundquist, J. K. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Pukayastha, A. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Martin, C. [Univ. of Colorado, Boulder, CO (United States); Newsom, R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-03-01

    Previous estimates of the wind resources in Uttarakhand, India, suggest minimal wind resources in this region. To explore whether or not the complex terrain in fact provides localized regions of wind resource, the authors of this study employed a dynamic downscaling method with the Weather Research and Forecasting model, providing detailed estimates of winds at approximately 1 km resolution in the finest nested simulation.

  5. Functional requirements of computer systems for the U.S. Geological Survey, Water Resources Division, 1988-97

    Science.gov (United States)

    Hathaway, R.M.; McNellis, J.M.

    1989-01-01

    Investigating the occurrence, quantity, quality, distribution, and movement of the Nation's water resources is the principal mission of the U.S. Geological Survey's Water Resources Division. Reports of these investigations are published and available to the public. To accomplish this mission, the Division requires substantial computer technology to process, store, and analyze data from more than 57,000 hydrologic sites. The Division's computer resources are organized through the Distributed Information System Program Office that manages the nationwide network of computers. The contract that provides the major computer components for the Water Resources Division's Distributed Information System expires in 1991. Five work groups were organized to collect the information needed to procure a new generation of computer systems for the U.S. Geological Survey, Water Resources Division. Each group was assigned a major Division activity and asked to describe its functional requirements of computer systems for the next decade. The work groups and major activities are: (1) hydrologic information; (2) hydrologic applications; (3) geographic information systems; (4) reports and electronic publishing; and (5) administrative. The work groups identified 42 functions and described their functional requirements for 1988, 1992, and 1997. A few new functions, such as Decision Support Systems and Executive Information Systems, were identified, but most are the same as performed today. Although the number of functions will remain about the same, steady growth in the size, complexity, and frequency of many functions is predicted for the next decade. No compensating increase in the Division's staff is anticipated during this period. To handle the increased workload and perform these functions, new approaches will be developed that use advanced computer technology. The advanced technology is required in a unified, tightly coupled system that will support all functions simultaneously.

  6. Universal resources for approximate and stochastic measurement-based quantum computation

    International Nuclear Information System (INIS)

    Mora, Caterina E.; Piani, Marco; Miyake, Akimasa; Van den Nest, Maarten; Duer, Wolfgang; Briegel, Hans J.

    2010-01-01

    We investigate which quantum states can serve as universal resources for approximate and stochastic measurement-based quantum computation in the sense that any quantum state can be generated from a given resource by means of single-qubit (local) operations assisted by classical communication. More precisely, we consider the approximate and stochastic generation of states, resulting, for example, from a restriction to finite measurement settings or from possible imperfections in the resources or local operations. We show that entanglement-based criteria for universality obtained in M. Van den Nest et al. [New J. Phys. 9, 204 (2007)] for the exact, deterministic case can be lifted to the much more general approximate, stochastic case. This allows us to move from the idealized situation (exact, deterministic universality) considered in previous works to the practically relevant context of nonperfect state preparation. We find that any entanglement measure fulfilling some basic requirements needs to reach its maximum value on some element of an approximate, stochastic universal family of resource states, as the resource size grows. This allows us to rule out various families of states as being approximate, stochastic universal. We prove that approximate, stochastic universality is in general a weaker requirement than deterministic, exact universality and provide resources that are efficient approximate universal, but not exact deterministic universal. We also study the robustness of universal resources for measurement-based quantum computation under realistic assumptions about the (imperfect) generation and manipulation of entangled states, giving an explicit expression for the impact that errors made in the preparation of the resource have on the possibility to use it for universal approximate and stochastic state preparation. Finally, we discuss the relation between our entanglement-based criteria and recent results regarding the uselessness of states with a high

  7. Getting the Most from Distributed Resources With an Analytics Platform for ATLAS Computing Services

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00225336; The ATLAS collaboration; Gardner, Robert; Bryant, Lincoln

    2016-01-01

    To meet a sharply increasing demand for computing resources for LHC Run 2, ATLAS distributed computing systems reach far and wide to gather CPU resources and storage capacity to execute an evolving ecosystem of production and analysis workflow tools. Indeed more than a hundred computing sites from the Worldwide LHC Computing Grid, plus many “opportunistic” facilities at HPC centers, universities, national laboratories, and public clouds, combine to meet these requirements. These resources have characteristics (such as local queuing availability, proximity to data sources and target destinations, network latency and bandwidth capacity, etc.) affecting the overall processing efficiency and throughput. To quantitatively understand and in some instances predict behavior, we have developed a platform to aggregate, index (for user queries), and analyze the more important information streams affecting performance. These data streams come from the ATLAS production system (PanDA), the distributed data management s...

  8. Optimal usage of computing grid network in the fields of nuclear fusion computing task

    International Nuclear Information System (INIS)

    Tenev, D.

    2006-01-01

    Nowadays nuclear power is becoming a main source of energy. To make its use more efficient, scientists have created complicated simulation models that require powerful computers. Grid computing is the answer to the need for powerful and accessible computing resources. The article examines and estimates the optimal configuration of the grid environment for complicated nuclear fusion computing tasks. (author)

  9. Analysis of the Possibility of Required Resources Estimation for Nuclear Power Plant Decommissioning Applying BIM

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Insu [Korea Institute of construction Technology, Goyang (Korea, Republic of); Kim, Woojung [KHNP-Central Research Institute, Daejeon (Korea, Republic of)

    2014-05-15

    Estimation of decommissioning cost, decommissioning strategy, and decommissioning quantity at the time when entering into any decommissioning plans are some elements whose inputs are mandatory for nuclear power plant decommissioning. Ways to estimate decommissioning of required resources in the past have imposed great uncertainty since they analyze required resources at the construction stage, analyzing and consulting decommissioning required resources of overseas nuclear power plants. This study aims at analyzing whether required resources for decommissioning nuclear power plants can be estimated, applying BIM. To achieve this goal, this study analyzed the status quo of BIM such as definition, characteristics, and areas applied, and made use of them when drawing out study results by examining types and features of the tools realizing BIM. In order to review how BIM could be used for decommissioning nuclear power plants, the definition, characteristics and applied areas of BIM were discussed. BIM designs objects of the structures (walls, slabs, pillars, stairs, windows and doors, etc.) by 3D technology and endows attribute (function, structure and usage) information for each object, thereby providing visualized information of structures for participants in construction projects. Major characteristics of BIM attribute information are as follows: - Geometry: The information of objects is represented by measurable geometric information - Extensible object attributes: Objects include pre-defined attributes, and allow extension of other attributes. Any model that includes these attributes forms relationships with other various attributes in order to perform analysis and simulation. - All information including the attributes are integrated to ensure continuity, accuracy and accessibility, and all information used during the life cycle of structures are supported. This means that when information of required resources is added as another attributes other than geometric

  10. Computer model for estimating electric utility environmental noise

    International Nuclear Information System (INIS)

    Teplitzky, A.M.; Hahn, K.J.

    1991-01-01

    This paper reports on an algorithm-based computer code for estimating environmental noise emissions from the operation and construction of electric power plants. The computer code (Model) is used to predict octave band sound power levels for power plant operation and construction activities on the basis of the equipment operating characteristics, and calculates off-site sound levels for each noise source and for an entire plant. Estimated noise levels are presented either as A-weighted sound level contours around the power plant or as octave band levels at user-defined receptor locations. Calculated sound levels can be compared with user-designated noise criteria, and the program can assist the user in analyzing alternative noise control strategies.
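
    A hedged Python sketch of the type of calculation such a model performs: propagate assumed octave-band sound power levels to a receptor with simple hemispherical spreading and combine them into an A-weighted level. The source levels and distance are invented for illustration and the propagation term is deliberately simplistic; this is not the code described above.

        # Illustrative only: octave-band propagation and A-weighted summation.
        import numpy as np

        band_centres = [63, 125, 250, 500, 1000, 2000, 4000, 8000]        # Hz
        a_weighting = [-26.2, -16.1, -8.6, -3.2, 0.0, 1.2, 1.0, -1.1]      # dB
        lw = np.array([105, 104, 102, 100, 98, 95, 92, 88], dtype=float)   # sound power, dB (assumed)

        distance = 300.0                                   # metres to receptor (assumed)
        lp = lw - 20 * np.log10(distance) - 11.0           # hemispherical spreading
        la = 10 * np.log10(np.sum(10 ** ((lp + np.array(a_weighting)) / 10)))
        print(round(la, 1), "dBA at", distance, "m")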

  11. Cost Implications of Uncertainty in CO₂ Storage Resource Estimates: A Review

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Steven T., E-mail: sanderson@usgs.gov [National Center, U.S. Geological Survey (United States)

    2017-04-15

    Carbon capture from stationary sources and geologic storage of carbon dioxide (CO₂) is an important option to include in strategies to mitigate greenhouse gas emissions. However, the potential costs of commercial-scale CO₂ storage are not well constrained, stemming from the inherent uncertainty in storage resource estimates coupled with a lack of detailed estimates of the infrastructure needed to access those resources. Storage resource estimates are highly dependent on storage efficiency values or storage coefficients, which are calculated based on ranges of uncertain geological and physical reservoir parameters. If dynamic factors (such as variability in storage efficiencies, pressure interference, and acceptable injection rates over time), reservoir pressure limitations, boundaries on migration of CO₂, consideration of closed or semi-closed saline reservoir systems, and other possible constraints on the technically accessible CO₂ storage resource (TASR) are accounted for, it is likely that only a fraction of the TASR could be available without incurring significant additional costs. Although storage resource estimates typically assume that any issues with pressure buildup due to CO₂ injection will be mitigated by reservoir pressure management, estimates of the costs of CO₂ storage generally do not include the costs of active pressure management. Production of saline waters (brines) could be essential to increasing the dynamic storage capacity of most reservoirs, but including the costs of this critical method of reservoir pressure management could increase current estimates of the costs of CO₂ storage by two times, or more. Even without considering the implications for reservoir pressure management, geologic uncertainty can significantly impact CO₂ storage capacities and costs, and contribute to uncertainty in carbon capture and storage (CCS) systems. Given the current state of available information and the
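
    A minimal Python sketch of the kind of static, volumetric storage-resource estimate the review discusses, showing how strongly the result depends on the storage efficiency coefficient; all reservoir values below are assumptions, not data from the paper.

        # Illustrative only: mass = area x thickness x porosity x CO2 density x efficiency.
        area = 5.0e9          # reservoir area, m^2 (assumed)
        thickness = 50.0      # net thickness, m (assumed)
        porosity = 0.15       # (assumed)
        rho_co2 = 700.0       # CO2 density at reservoir conditions, kg/m^3 (assumed)

        for efficiency in (0.01, 0.02, 0.04):              # low/mid/high coefficients
            mass_gt = area * thickness * porosity * rho_co2 * efficiency / 1e12
            print(f"E = {efficiency:.2f}: {mass_gt:.2f} Gt CO2")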

  12. The Trope Tank: A Laboratory with Material Resources for Creative Computing

    Directory of Open Access Journals (Sweden)

    Nick Montfort

    2014-12-01

    Full Text Available http://dx.doi.org/10.5007/1807-9288.2014v10n2p53 Principles for organizing and making use of a laboratory with material computing resources are articulated. This laboratory, the Trope Tank, is a facility for teaching, research, and creative collaboration and offers hardware (in working condition and set up for use) from the 1970s, 1980s, and 1990s, including videogame systems, home computers, and an arcade cabinet. To aid in investigating the material history of texts, the lab has a small 19th century letterpress, a typewriter, a print terminal, and dot-matrix printers. Other resources include controllers, peripherals, manuals, books, and software on physical media. These resources are used for teaching, loaned for local exhibitions and presentations, and accessed by researchers and artists. The space is primarily a laboratory (rather than a library, studio, or museum), so materials are organized by platform and intended use. Textual information about the historical contexts of the available systems is provided, and resources are set up to allow easy operation, and even casual use, by researchers, teachers, students, and artists.

  13. Variance computations for functional of absolute risk estimates.

    Science.gov (United States)

    Pfeiffer, R M; Petracci, E

    2011-07-01

    We present a simple influence function based approach to compute the variances of estimates of absolute risk and functions of absolute risk. We apply this approach to criteria that assess the impact of changes in the risk factor distribution on absolute risk for an individual and at the population level. As an illustration we use an absolute risk prediction model for breast cancer that includes modifiable risk factors in addition to standard breast cancer risk factors. Influence function based variance estimates for absolute risk and the criteria are compared to bootstrap variance estimates.
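
    A small Python sketch of the bootstrap comparison mentioned above, using a toy absolute-risk estimator on simulated outcomes; this is illustrative only and does not reproduce the paper's influence-function formulas.

        # Illustrative only: bootstrap variance of a toy absolute-risk estimate.
        import numpy as np

        rng = np.random.default_rng(0)
        events = rng.binomial(1, 0.08, size=2000)          # simulated 5-year outcomes

        def absolute_risk(sample):
            return sample.mean()                            # toy estimator of absolute risk

        boot = [absolute_risk(rng.choice(events, size=events.size, replace=True))
                for _ in range(1000)]
        print("risk =", absolute_risk(events), "bootstrap variance =", np.var(boot))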

  14. NMRbox: A Resource for Biomolecular NMR Computation.

    Science.gov (United States)

    Maciejewski, Mark W; Schuyler, Adam D; Gryk, Michael R; Moraru, Ion I; Romero, Pedro R; Ulrich, Eldon L; Eghbalnia, Hamid R; Livny, Miron; Delaglio, Frank; Hoch, Jeffrey C

    2017-04-25

    Advances in computation have been enabling many recent advances in biomolecular applications of NMR. Due to the wide diversity of applications of NMR, the number and variety of software packages for processing and analyzing NMR data is quite large, with labs relying on dozens, if not hundreds of software packages. Discovery, acquisition, installation, and maintenance of all these packages is a burdensome task. Because the majority of software packages originate in academic labs, persistence of the software is compromised when developers graduate, funding ceases, or investigators turn to other projects. To simplify access to and use of biomolecular NMR software, foster persistence, and enhance reproducibility of computational workflows, we have developed NMRbox, a shared resource for NMR software and computation. NMRbox employs virtualization to provide a comprehensive software environment preconfigured with hundreds of software packages, available as a downloadable virtual machine or as a Platform-as-a-Service supported by a dedicated compute cloud. Ongoing development includes a metadata harvester to regularize, annotate, and preserve workflows and facilitate and enhance data depositions to BioMagResBank, and tools for Bayesian inference to enhance the robustness and extensibility of computational analyses. In addition to facilitating use and preservation of the rich and dynamic software environment for biomolecular NMR, NMRbox fosters the development and deployment of a new class of metasoftware packages. NMRbox is freely available to not-for-profit users. Copyright © 2017 Biophysical Society. All rights reserved.

  15. Improved diagnostic model for estimating wind energy

    Energy Technology Data Exchange (ETDEWEB)

    Endlich, R.M.; Lee, J.D.

    1983-03-01

    Because wind data are available only at scattered locations, a quantitative method is needed to estimate the wind resource at specific sites where wind energy generation may be economically feasible. This report describes a computer model that makes such estimates. The model uses standard weather reports and terrain heights in deriving wind estimates; the method of computation has been changed from what has been used previously. The performance of the current model is compared with that of the earlier version at three sites; estimates of wind energy at four new sites are also presented.

  16. Computer modelling of the UK wind energy resource: final overview report

    Energy Technology Data Exchange (ETDEWEB)

    Burch, S F; Ravenscroft, F

    1993-12-31

    This report describes the results of a programme of work to estimate the UK wind energy resource. Mean wind speed maps and quantitative resource estimates were obtained using the NOABL mesoscale (1 km resolution) numerical model for the prediction of wind flow over complex terrain. NOABL was used in conjunction with digitised terrain data and wind data from surface meteorological stations for a ten year period (1975-1984) to provide digital UK maps of mean wind speed at 10m, 25m and 45m above ground level. Also included in the derivation of these maps was the use of the Engineering Science Data Unit (ESDU) method to model the effect on wind speed of the abrupt change in surface roughness that occurs at the coast. Existing isovent maps, based on standard meteorological data which take no account of terrain effects, indicate that 10m annual mean wind speeds vary between about 4.5 and 7 m/s over the UK with only a few coastal areas over 6 m/s. The present study indicated that 23% of the UK land area had speeds over 6 m/s, with many hill sites having 10m speeds over 10 m/s. It is concluded that these `first order` resource estimates represent a substantial improvement over the presently available `zero order` estimates. (20 figures, 7 tables, 10 references). (author)

  17. Internal Clock Drift Estimation in Computer Clusters

    Directory of Open Access Journals (Sweden)

    Hicham Marouani

    2008-01-01

    Full Text Available Most computers have several high-resolution timing sources, from the programmable interrupt timer to the cycle counter. Yet, even at a precision of one cycle in ten millions, clocks may drift significantly in a single second at a clock frequency of several GHz. When tracing the low-level system events in computer clusters, such as packet sending or reception, each computer system records its own events using an internal clock. In order to properly understand the global system behavior and performance, as reported by the events recorded on each computer, it is important to estimate precisely the clock differences and drift between the different computers in the system. This article studies the clock precision and stability of several computer systems, with different architectures. It also studies the typical network delay characteristics, since time synchronization algorithms rely on the exchange of network packets and are dependent on the symmetry of the delays. A very precise clock, based on the atomic time provided by the GPS satellite network, was used as a reference to measure clock drifts and network delays. The results obtained are of immediate use to all applications which depend on computer clocks or network time synchronization accuracy.
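
    A brief Python sketch of the basic drift estimate such studies rely on: fit a straight line to (reference time, local clock reading) pairs, so the intercept is the offset and the slope minus one is the relative drift. The numbers below are simulated, not measurements from the article.

        # Illustrative only: offset and drift from a linear fit against a reference clock.
        import numpy as np

        ref = np.linspace(0.0, 10.0, 101)                      # GPS-disciplined reference, s
        local = 0.002 + ref * (1 + 3e-6) + np.random.normal(0, 5e-7, ref.size)

        slope, offset = np.polyfit(ref, local, 1)
        print(f"offset = {offset * 1e3:.3f} ms, drift = {(slope - 1) * 1e6:.2f} ppm")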

  18. Estimating resting motor thresholds in transcranial magnetic stimulation research and practice: a computer simulation evaluation of best methods.

    Science.gov (United States)

    Borckardt, Jeffrey J; Nahas, Ziad; Koola, Jejo; George, Mark S

    2006-09-01

    Resting motor threshold is the basic unit of dosing in transcranial magnetic stimulation (TMS) research and practice. There is little consensus on how best to estimate resting motor threshold with TMS, and only a few tools and resources are readily available to TMS researchers. The current study investigates the accuracy and efficiency of 5 different approaches to motor threshold assessment for TMS research and practice applications. Computer simulation models are used to test the efficiency and accuracy of 5 different adaptive parameter estimation by sequential testing (PEST) procedures. For each approach, data are presented with respect to the mean number of TMS trials necessary to reach the motor threshold estimate as well as the mean accuracy of the estimates. A simple nonparametric PEST procedure appears to provide the most accurate motor threshold estimates, but takes slightly longer (on average, 3.48 trials) to complete than a popular parametric alternative (maximum likelihood PEST). Recommendations are made for the best starting values for each of the approaches to maximize both efficiency and accuracy. In light of the computer simulation data provided in this article, the authors review and suggest which techniques might best fit different TMS research and clinical situations. Lastly, a free user-friendly software package is described and made available on the world wide web that allows users to run all of the motor threshold estimation procedures discussed in this article for clinical and research applications.
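
    A simple Python sketch in the spirit of the simulations described above: an adaptive staircase (not one of the five PEST variants evaluated, and not the authors' software) that converges on the intensity producing a response on about half of the trials; the response model and threshold are invented.

        # Illustrative only: adaptive staircase with halving step size at reversals.
        import random
        random.seed(1)

        TRUE_THRESHOLD = 47.0           # % of maximum stimulator output (assumed)

        def evoked(intensity):
            """Probabilistic response: more likely above the true threshold."""
            return random.random() < 1 / (1 + 10 ** ((TRUE_THRESHOLD - intensity) / 2))

        intensity, step = 60.0, 8.0
        last_response, reversals = None, []
        while len(reversals) < 8:
            response = evoked(intensity)
            if last_response is not None and response != last_response:
                reversals.append(intensity)        # direction change: record a reversal
                step = max(step / 2, 1.0)
            intensity += -step if response else step
            last_response = response
        print("estimated threshold ~", round(sum(reversals) / len(reversals), 1))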

  19. A novel resource management method of providing operating system as a service for mobile transparent computing.

    Science.gov (United States)

    Xiong, Yonghua; Huang, Suzhen; Wu, Min; Zhang, Yaoxue; She, Jinhua

    2014-01-01

    This paper presents a framework for mobile transparent computing. It extends the PC transparent computing to mobile terminals. Since resources contain different kinds of operating systems and user data that are stored in a remote server, how to manage the network resources is essential. In this paper, we apply the technologies of quick emulator (QEMU) virtualization and mobile agent for mobile transparent computing (MTC) to devise a method of managing shared resources and services management (SRSM). It has three layers: a user layer, a manage layer, and a resource layer. A mobile virtual terminal in the user layer and virtual resource management in the manage layer cooperate to maintain the SRSM function accurately according to the user's requirements. An example of SRSM is used to validate this method. Experiment results show that the strategy is effective and stable.

  20. A Novel Resource Management Method of Providing Operating System as a Service for Mobile Transparent Computing

    Directory of Open Access Journals (Sweden)

    Yonghua Xiong

    2014-01-01

    Full Text Available This paper presents a framework for mobile transparent computing. It extends the PC transparent computing to mobile terminals. Since resources contain different kinds of operating systems and user data that are stored in a remote server, how to manage the network resources is essential. In this paper, we apply the technologies of quick emulator (QEMU) virtualization and mobile agent for mobile transparent computing (MTC) to devise a method of managing shared resources and services management (SRSM). It has three layers: a user layer, a manage layer, and a resource layer. A mobile virtual terminal in the user layer and virtual resource management in the manage layer cooperate to maintain the SRSM function accurately according to the user’s requirements. An example of SRSM is used to validate this method. Experiment results show that the strategy is effective and stable.

  1. Research on uranium resource models. Part IV. Logic: a computer graphics program to construct integrated logic circuits for genetic-geologic models. Progress report

    International Nuclear Information System (INIS)

    Scott, W.A.; Turner, R.M.; McCammon, R.B.

    1981-01-01

    Integrated logic circuits were described as a means of formally representing genetic-geologic models for estimating undiscovered uranium resources. The logic circuits are logical combinations of selected geologic characteristics judged to be associated with particular types of uranium deposits. Each combination takes on a value which corresponds to the combined presence, absence, or don't know states of the selected characteristic within a specified geographic cell. Within each cell, the output of the logic circuit is taken as a measure of the favorability of occurrence of an undiscovered deposit of the type being considered. In this way, geological, geochemical, and geophysical data are incorporated explicitly into potential uranium resource estimates. The present report describes how integrated logic circuits are constructed by use of a computer graphics program. A user's guide is also included
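
    A tiny Python sketch of the idea (not the LOGIC graphics program itself): combine presence (1), absence (0) and "don't know" (None) states of selected characteristics for one cell with three-valued AND/OR gates; the characteristics and the circuit below are invented.

        # Illustrative only: three-valued logic over geologic characteristics of a cell.
        def tri_and(*vals):
            if 0 in vals:
                return 0
            return None if None in vals else 1

        def tri_or(*vals):
            if 1 in vals:
                return 1
            return None if None in vals else 0

        cell = {"sandstone_host": 1, "organic_matter": None, "alteration": 1, "fault": 0}
        favourability = tri_and(cell["sandstone_host"],
                                tri_or(cell["organic_matter"], cell["alteration"]),
                                tri_or(cell["fault"], cell["alteration"]))
        print(favourability)   # 1: favourable despite one unknown characteristic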

  2. Subroutine library for error estimation of matrix computation (Ver. 1.0)

    International Nuclear Information System (INIS)

    Ichihara, Kiyoshi; Shizawa, Yoshihisa; Kishida, Norio

    1999-03-01

    'Subroutine Library for Error Estimation of Matrix Computation' is a subroutine library which aids the users in obtaining the error ranges of the linear system's solutions or the Hermitian matrices' eigenvalues. This library contains routines for both sequential computers and parallel computers. The subroutines for linear system error estimation calculate norms of residual vectors, matrices' condition numbers, error bounds of solutions and so on. The subroutines for error estimation of Hermitian matrix eigenvalues derive the error ranges of the eigenvalues according to the Korn-Kato formula. The test matrix generators supply matrices that appear in mathematical research, randomly generated matrices, and matrices that appear in application programs. This user's manual contains a brief mathematical background of error analysis on linear algebra and usage of the subroutines. (author)
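
    As a rough Python illustration of the kind of bound such routines report for a linear system (not the library's own algorithm): the relative error of a computed solution is at most cond(A) * ||r|| / (||A|| * ||x||), where r is the residual of the computed solution.

        # Illustrative only: residual norm and a condition-number error bound
        # for a random linear system (2-norm used throughout).
        import numpy as np

        rng = np.random.default_rng(0)
        A = rng.standard_normal((5, 5))
        b = rng.standard_normal(5)
        x = np.linalg.solve(A, b)

        residual = b - A @ x
        bound = (np.linalg.cond(A) * np.linalg.norm(residual)
                 / (np.linalg.norm(A, 2) * np.linalg.norm(x)))
        print("residual norm:", np.linalg.norm(residual),
              "relative error bound:", bound)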

  3. Estimating health workforce needs for antiretroviral therapy in resource-limited settings

    Directory of Open Access Journals (Sweden)

    Fullem Andrew

    2006-01-01

    Full Text Available Abstract Background Efforts to increase access to life-saving treatment, including antiretroviral therapy (ART), for people living with HIV/AIDS in resource-limited settings have been the growing focus of international efforts. One of the greatest challenges to scaling up will be the limited supply of adequately trained human resources for health, including doctors, nurses, pharmacists and other skilled providers. As national treatment programmes are planned, better estimates of human resource needs and improved approaches to assessing the impact of different staffing models are critically needed. However there have been few systematic assessments of staffing patterns in existing programmes or of the estimates being used in planning larger programmes. Methods We reviewed the published literature and selected plans and scaling-up proposals, interviewed experts and collected data on staffing patterns at existing treatment sites through a structured survey and site visits. Results We found a wide range of staffing patterns and patient-provider ratios in existing and planned treatment programmes. Many factors influenced health workforce needs, including task assignments, delivery models, other staff responsibilities and programme size. Overall, the number of health care workers required to provide ART to 1000 patients included 1–2 physicians, 2–7 nurses, Discussion These data are consistent with other estimates of human resource requirements for antiretroviral therapy, but highlight the considerable variability of current staffing models and the importance of a broad range of factors in determining personnel needs. Few outcome or cost data are currently available to assess the effectiveness and efficiency of different staffing models, and it will be important to develop improved methods for gathering this information as treatment programmes are scaled up.

  4. Estimation of Total Tree Height from Renewable Resources Evaluation Data

    Science.gov (United States)

    Charles E. Thomas

    1981-01-01

    Many ecological, biological, and genetic studies use the measurement of total tree height. Until recently, the Southern Forest Experiment Station's inventory procedures through Renewable Resources Evaluation (RRE) have not included total height measurements. This note provides equations to estimate total height based on other RRE measurements.

  5. SKEMA - A computer code to estimate atmospheric dispersion

    International Nuclear Information System (INIS)

    Sacramento, A.M. do.

    1985-01-01

    This computer code is a modified version of the DWNWND code developed at Oak Ridge National Laboratory. The SKEMA code estimates the concentration in air of a material released to the atmosphere from a point source. (C.M.) [pt
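
    A minimal Python sketch of a Gaussian-plume estimate of the sort such point-source dispersion codes produce; the release rate, stack height, wind speed and dispersion coefficients below are assumptions, and this is not the SKEMA or DWNWND code.

        # Illustrative only: ground-level, plume-centreline concentration with
        # ground reflection for a continuous point source.
        import math

        Q = 1.0e-3        # release rate, g/s (assumed)
        u = 3.0           # wind speed, m/s (assumed)
        H = 30.0          # effective release height, m (assumed)
        sigma_y, sigma_z = 80.0, 40.0      # dispersion coefficients at the receptor, m (assumed)

        chi = (Q / (math.pi * u * sigma_y * sigma_z)) * math.exp(-H**2 / (2 * sigma_z**2))
        print(f"concentration ~ {chi:.2e} g/m^3")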

  6. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper

  7. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok, E-mail: jheo@kaeri.re.kr; Kim, Kyung Doo, E-mail: kdkim@kaeri.re.kr

    2015-10-15

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.
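
    A small Python sketch of the general pattern such a framework implements: sample the uncertain parameters, run the simulation on several workers in parallel, and summarise the propagated uncertainty. The "simulation" below is a stand-in function, not an interface to PAPIRUS.

        # Illustrative only: parallel uncertainty propagation over sampled inputs.
        import multiprocessing as mp
        import random

        def run_simulation(params):
            """Stand-in for a call to the engineering simulation code."""
            k, h = params
            return k * 2.0 + h ** 2

        if __name__ == "__main__":
            random.seed(0)
            samples = [(random.gauss(1.0, 0.1), random.gauss(0.5, 0.05)) for _ in range(1000)]
            with mp.Pool(processes=4) as pool:
                outputs = pool.map(run_simulation, samples)
            mean = sum(outputs) / len(outputs)
            var = sum((o - mean) ** 2 for o in outputs) / (len(outputs) - 1)
            print(f"output mean = {mean:.3f}, variance = {var:.4f}")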

  8. Measuring the impact of computer resource quality on the software development process and product

    Science.gov (United States)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process were speculated to have a measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data was extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product as exemplified by the subject NASA data was examined. Based upon the results, a number of computer resource-related implications are provided.

  9. Common accounting system for monitoring the ATLAS distributed computing resources

    International Nuclear Information System (INIS)

    Karavakis, E; Andreeva, J; Campana, S; Saiz, P; Gayazov, S; Jezequel, S; Sargsyan, L; Schovancova, J; Ueda, I

    2014-01-01

    This paper covers in detail a variety of accounting tools used to monitor the utilisation of the available computational and storage resources within the ATLAS Distributed Computing during the first three years of Large Hadron Collider data taking. The Experiment Dashboard provides a set of common accounting tools that combine monitoring information originating from many different information sources; either generic or ATLAS specific. This set of tools provides quality and scalable solutions that are flexible enough to support the constantly evolving requirements of the ATLAS user community.

  10. RHIC off-line computing

    International Nuclear Information System (INIS)

    Featherly, J.; Gibbard, B.; Gould, J.

    1993-01-01

    A report prepared in September 1992, RHIC/DET Note 8 (also known as ROCOCO), estimated the various computing resources which will be required by the RHIC experimental program. A study has now been undertaken to review technical issues associated with supplying these resources. This study, organized by the HEP/NP Computing Group but including other appropriate participants, addresses questions of technologies, manpower, cost and schedule. The following document is an interim summary of this study, both in terms of discussions which have occurred and initial conclusions reached.

  11. Computation of groundwater resources and recharge in Chithar River Basin, South India.

    Science.gov (United States)

    Subramani, T; Babu, Savithri; Elango, L

    2013-01-01

    Groundwater recharge and available groundwater resources in Chithar River basin, Tamil Nadu, India spread over an area of 1,722 km² have been estimated by considering various hydrological, geological, and hydrogeological parameters, such as rainfall infiltration, drainage, geomorphic units, land use, rock types, depth of weathered and fractured zones, nature of soil, water level fluctuation, saturated thickness of aquifer, and groundwater abstraction. The digital ground elevation models indicate that the regional slope of the basin is towards east. The Proterozoic (Post-Archaean) basement of the study area consists of quartzite, calc-granulite, crystalline limestone, charnockite, and biotite gneiss with or without garnet. Three major soil types were identified, namely black cotton, deep red, and red sandy soils. The rainfall intensity gradually decreases from west to east. Groundwater occurs under water table conditions in the weathered zone and fluctuates between 0 and 25 m. The water table reaches its maximum during January, after the northeast monsoon, and its minimum during October. Groundwater abstraction for domestic/stock and irrigational needs in Chithar River basin has been estimated as 148.84 MCM (million m³). Groundwater recharge due to monsoon rainfall infiltration has been estimated as 170.05 MCM based on the water level rise during monsoon period. It is also estimated as 173.9 MCM using rainfall infiltration factor. An amount of 53.8 MCM of water is contributed to groundwater from surface water bodies. Recharge of groundwater due to return flow from irrigation has been computed as 147.6 MCM. The static groundwater reserve in Chithar River basin is estimated as 466.66 MCM and the dynamic reserve is about 187.7 MCM. In the present scenario, the aquifer is under safe condition for extraction of groundwater for domestic and irrigation purposes. If the existing water bodies are maintained properly, the extraction rate can be increased in the future by about 10% to 15%.
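
    A worked Python sketch of the two recharge estimates described above, the water-table fluctuation method and the rainfall infiltration factor method; the specific yield, water-table rise, rainfall and infiltration factor are assumed values chosen only to illustrate the arithmetic, not the basin's published parameters.

        # Illustrative only: two simple recharge estimates for the basin area.
        area_km2 = 1722.0                 # basin area, from the abstract
        water_table_rise_m = 1.2          # average monsoon rise (assumed)
        specific_yield = 0.08             # weathered-rock aquifer (assumed)
        monsoon_rainfall_m = 0.45         # (assumed)
        infiltration_factor = 0.22        # fraction reaching the water table (assumed)

        area_m2 = area_km2 * 1e6
        recharge_fluctuation = area_m2 * water_table_rise_m * specific_yield / 1e6   # MCM
        recharge_rainfall = area_m2 * monsoon_rainfall_m * infiltration_factor / 1e6 # MCM
        print(f"water-level method: {recharge_fluctuation:.0f} MCM, "
              f"rainfall-factor method: {recharge_rainfall:.0f} MCM")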

  12. Regional research exploitation of the LHC a case-study of the required computing resources

    CERN Document Server

    Almehed, S; Eerola, Paule Anna Mari; Mjörnmark, U; Smirnova, O G; Zacharatou-Jarlskog, C; Åkesson, T

    2002-01-01

    A simulation study to evaluate the required computing resources for a research exploitation of the Large Hadron Collider (LHC) has been performed. The evaluation was done as a case study, assuming the existence of a Nordic regional centre and using the requirements for performing a specific physics analysis as a yard-stick. Other input parameters were: assumptions for the distribution of researchers at the institutions involved, an analysis model, and two different functional structures of the computing resources.

  13. ESTIMATION OF EXTERNAL FACTORS INFLUENCE ON THE ORGANIZATIONAL AND RESOURCE SUPPORT OF ENGINEERING

    Directory of Open Access Journals (Sweden)

    Yu. V. Gusak

    2013-09-01

    Full Text Available Purpose. The engineering industry is characterized by deep specialization and a high degree of co-operation, which implies extensive interaction with other industries and the economy and makes it highly sensitive to external factors. Effective regulation of the engineering industry’s organizational-resource support will ensure coherence of all the subsystems of the market economy, the competitive environment, a full course of the investment process and the success of the industry. Therefore, there is a need for a detailed estimation and analysis of the influence of external factors on the formation and implementation indexes of the engineering industry’s organizational-resource support. Methodology. Correlation analysis was used to establish the closeness of the connection between the set of external factors and the formation and implementation indexes of the engineering industry’s organizational-resource support, and the malleability coefficient was applied to calculate how much those indexes change under the influence of the external factors. Findings. The external factors influencing the engineering industry’s organizational-resource support were separated and grouped by source of origin: industrial, economic, political, informational, and social. A classification of the external factors was made according to the direction of their influence on the formation and implementation indexes of the engineering industry’s organizational-resource support. The closeness of the connection and the amount of change in the formation and implementation indexes (the machinery index and the sales volume machinery index) under the influence of the external factors were determined with the malleability coefficient. Originality. The estimation of the external factors

  14. Estimate of Hot Dry Rock Geothermal Resource in Daqing Oilfield, Northeast China

    Directory of Open Access Journals (Sweden)

    Guangzheng Jiang

    2016-10-01

    Full Text Available Development and utilization of deep geothermal resources, especially a hot dry rock (HDR) geothermal resource, is beneficial for both economic and environmental consideration in oilfields. This study used data from multiple sources to assess the geothermal energy resource in the Daqing Oilfield. The temperature logs in boreholes (both shallow water wells and deep boreholes) and the drilling stem test temperature were used to create isothermal maps in depths. Upon the temperature field and thermophysical parameters of strata, the heat content was calculated by 1 km × 1 km × 0.1 km cells. The result shows that in the southeastern part of Daqing Oilfield, the temperature can reach 150 °C at a depth of 3 km. The heat content within 3–5 km is 24.28 × 10²¹ J, wherein 68.2% exceeded 150 °C. If the recovery factor was given by 2% and the lower limit of temperature was set to be 150 °C, the most conservative estimate for recoverable HDR geothermal resource was 0.33 × 10²¹ J. The uncertainties of the estimation are mainly contributed to by the temperature extrapolation and the physical parameter selections.
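
    A short Python sketch of the volumetric ("heat in place") calculation behind such estimates, for a single 1 km × 1 km × 0.1 km assessment cell; the rock density, specific heat and temperatures below are illustrative assumptions, while the 2% recovery factor follows the abstract.

        # Illustrative only: heat in place and recoverable heat for one cell.
        cell_volume = 1_000.0 * 1_000.0 * 100.0     # m^3
        rock_density = 2700.0                        # kg/m^3 (assumed)
        specific_heat = 1000.0                       # J/(kg K) (assumed)
        cell_temperature = 160.0                     # degrees C (assumed)
        reference_temperature = 15.0                 # rejection temperature (assumed)
        recovery_factor = 0.02                       # from the abstract

        heat_in_place = cell_volume * rock_density * specific_heat * (
            cell_temperature - reference_temperature)
        recoverable = heat_in_place * recovery_factor
        print(f"heat in place = {heat_in_place:.2e} J, recoverable = {recoverable:.2e} J")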

  15. An Architecture of IoT Service Delegation and Resource Allocation Based on Collaboration between Fog and Cloud Computing

    Directory of Open Access Journals (Sweden)

    Aymen Abdullah Alsaffar

    2016-01-01

    Full Text Available Despite the wide utilization of cloud computing (e.g., services, applications, and resources), some of the services, applications, and smart devices are not able to fully benefit from this attractive cloud computing paradigm due to the following issues: (1) smart devices might be lacking in their capacity (e.g., processing, memory, storage, battery, and resource allocation), (2) they might be lacking in their network resources, and (3) the high network latency to the centralized server in the cloud might not be efficient for delay-sensitive applications, services, and resource allocation requests. Fog computing is a promising paradigm that can extend cloud resources to the edge of the network, solving the abovementioned issues. As a result, in this work, we propose an architecture of IoT service delegation and resource allocation based on collaboration between fog and cloud computing. We provide a new algorithm consisting of the decision rules of a linearized decision tree based on three conditions (service size, completion time, and VM capacity) for managing and delegating user requests in order to balance workload. Moreover, we propose an algorithm to allocate resources to meet service level agreement (SLA) and quality of service (QoS) requirements, as well as optimizing big data distribution in fog and cloud computing. Our simulation result shows that our proposed approach can efficiently balance workload, improve resource allocation efficiency, optimize big data distribution, and show better performance than other existing methods.
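
    A toy Python sketch of linearised delegation rules of the kind described; the thresholds and the exact rule structure are invented for illustration and are not the paper's algorithm.

        # Illustrative only: route a request to fog or cloud from its size,
        # deadline and the free VM capacity at the edge.
        def delegate(service_size_mb, completion_deadline_s, fog_vm_free_mb):
            if service_size_mb <= fog_vm_free_mb and completion_deadline_s < 1.0:
                return "fog"          # small, delay-sensitive: keep at the edge
            if service_size_mb > fog_vm_free_mb:
                return "cloud"        # does not fit on edge VMs
            return "cloud" if completion_deadline_s >= 1.0 else "fog"

        print(delegate(service_size_mb=40, completion_deadline_s=0.2, fog_vm_free_mb=128))
        print(delegate(service_size_mb=512, completion_deadline_s=5.0, fog_vm_free_mb=128))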

  16. Estimation of subcriticality with the computed values. 2

    Energy Technology Data Exchange (ETDEWEB)

    Sakurai, Kiyoshi; Arakawa, Takuya; Naito, Yoshitaka [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1996-02-01

    For measurements of reactivities and neutron count rate space distributions, seven subcritical cores, including non-square array cores, were constructed using the critical assembly TCA. MCNP-4A was used for the experimental analysis. The calculational results of the neutron count rate space distributions agreed with the measured ones within each error range. This means that, for the calculation-error indirect estimation method, the calculated neutron multiplication factors are equal to those corresponding to the experimental reactivities. These experiments and calculations show that the method of estimating subcriticality with computed values, based on the calculation-error indirect estimation method, is also applicable to the six non-square array cores. (author).

  17. The significance of regulation and land use patterns on natural gas resource estimates in the Marcellus shale

    International Nuclear Information System (INIS)

    Blohm, Andrew; Peichel, Jeremy; Smith, Caroline; Kougentakis, Alexandra

    2012-01-01

    Recent advancements in natural gas extraction (e.g. hydraulic fracturing) have significantly increased natural gas reserves in the United States. Estimates of the technically recoverable natural gas (TRR) in the Marcellus range between 141 trillion cubic feet (TCF) and 489 TCF. However, TRR estimation does not incorporate existing policies, regulations, or land use. We find that approximately 48% of the Marcellus in New York and Pennsylvania is inaccessible given land use patterns and current policy. In New York, approximately 83% of the Marcellus is inaccessible; while in Pennsylvania about 32% of the Marcellus is off limits to drilling. The New York portion of the Marcellus is estimated to have a TRR of between 19.9 TCF and 68.9 TCF. We estimate that 79% of the resource is inaccessible, which results in an accessible resource estimate of between 4.2 TCF and 14.4 TCF. In Pennsylvania, the shale gas TRR is estimated at 86.6–300 TCF. However, we estimate that 31% of the resource is inaccessible, which results in an accessible resource estimate of between 60.0 TCF and 208 TCF. - Highlights: ► Existing natural gas reserve estimation techniques ignore land use, regulations, and policies. ► The impact of land use and regulation on the recoverable natural gas resource is significant. ► 48% of the Marcellus Shale in New York and Pennsylvania is inaccessible to drilling. ► In New York, approximately 83% of the Marcellus is inaccessible. ► In Pennsylvania about 32% of the Marcellus is off limits to drilling.
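
    The accessibility adjustment described above is simple arithmetic: accessible resource = TRR × (1 − inaccessible fraction). A short Python sketch using the fractions and TRR ranges quoted in the abstract:

        # Arithmetic sketch reproducing the accessibility adjustment (values from the abstract).
        cases = {
            "New York":     {"trr_tcf": (19.9, 68.9), "inaccessible": 0.79},
            "Pennsylvania": {"trr_tcf": (86.6, 300.0), "inaccessible": 0.31},
        }
        for state, c in cases.items():
            low, high = (v * (1 - c["inaccessible"]) for v in c["trr_tcf"])
            print(f"{state}: {low:.1f} - {high:.1f} TCF accessible")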

  18. An Estimate of Recoverable Heavy Oil Resources of the Orinoco Oil Belt, Venezuela

    Science.gov (United States)

    Schenk, Christopher J.; Cook, Troy A.; Charpentier, Ronald R.; Pollastro, Richard M.; Klett, Timothy R.; Tennyson, Marilyn E.; Kirschbaum, Mark A.; Brownfield, Michael E.; Pitman, Janet K.

    2009-01-01

    The Orinoco Oil Belt Assessment Unit of the La Luna-Quercual Total Petroleum System encompasses approximately 50,000 km² of the East Venezuela Basin Province that is underlain by more than 1 trillion barrels of heavy oil-in-place. As part of a program directed at estimating the technically recoverable oil and gas resources of priority petroleum basins worldwide, the U.S. Geological Survey estimated the recoverable oil resources of the Orinoco Oil Belt Assessment Unit. This estimate relied mainly on published geologic and engineering data for reservoirs (net oil-saturated sandstone thickness and extent), petrophysical properties (porosity, water saturation, and formation volume factors), recovery factors determined by pilot projects, and estimates of volumes of oil-in-place. The U.S. Geological Survey estimated a mean volume of 513 billion barrels of technically recoverable heavy oil in the Orinoco Oil Belt Assessment Unit of the East Venezuela Basin Province; the range is 380 to 652 billion barrels. The Orinoco Oil Belt Assessment Unit thus contains one of the largest recoverable oil accumulations in the world.

  19. Computer-assisted estimating for the Los Alamos Scientific Laboratory

    International Nuclear Information System (INIS)

    Spooner, J.E.

    1976-02-01

    An analysis is made of the cost estimating system currently in use at the Los Alamos Scientific Laboratory (LASL) and the benefits of computer assistance are evaluated. A computer-assisted estimating system (CAE) is proposed for LASL. CAE can decrease turnaround and provide more flexible response to management requests for cost information and analyses. It can enhance value optimization at the design stage, improve cost control and change-order justification, and widen the use of cost information in the design process. CAE costs are not well defined at this time although they appear to break even with present operations. It is recommended that a CAE system description be submitted for contractor consideration and bid while LASL system development continues concurrently

  20. Direction for the Estimation of Required Resources for Nuclear Power Plant Decommissioning based on BIM via Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Insu [Korea Institute of Construction Technology, Goyang (Korea, Republic of)]; Kim, Woojung [KHNP-Central Research Institute, Daejeon (Korea, Republic of)]

    2014-05-15

    Ways of estimating the resources required for decommissioning have in the past carried great uncertainty, because they analyze required resources at the construction stage and rely on analysis and consultation of decommissioning resource data from overseas nuclear power plants. As demand for efficient management and use of complex construction information has grown, so has demand for the introduction of Building Information Modeling (hereinafter referred to as BIM) technology. In the area of quotation, considerable gains in the accuracy and reliability of construction cost prediction are expected, because quantities can be estimated automatically from the attribute information of a BIM model. BIM-based estimation and quotation of required resources is more accurate than existing 2D-based quotation and has many additional advantages, such as reviews of constructability and interference. It is therefore desirable to estimate the resources required for nuclear power plant decommissioning using BIM, together with tools that are compatible with common international and industrial standards. A review of domestic and overseas cases in which required resources were estimated using BIM showed that they dealt broadly with estimation of required resources, estimation of construction cost, and process management. In each area, methodologies, classification systems, BIM tools, and verification tests have been used in various ways. Nonetheless, several problems have been reported; notably, although a BIM standard classification system exists, no case was found that actually used it. This means that no interlinking among OBS (Object Breakdown Structure), WBS (Work Breakdown Structure) and CBS (Cost Breakdown Structure) was possible. Thus, for nuclear power plant decommissioning, the decommissioning method, process, and related items must be defined clearly at the decommissioning strategy establishment stage, so that the classification systems can be set up.

  1. Direction for the Estimation of Required Resources for Nuclear Power Plant Decommissioning based on BIM via Case Study

    International Nuclear Information System (INIS)

    Jung, Insu; Kim, Woojung

    2014-01-01

    Ways of estimating the resources required for decommissioning have in the past carried great uncertainty, because they analyze required resources at the construction stage and rely on analysis and consultation of decommissioning resource data from overseas nuclear power plants. As demand for efficient management and use of complex construction information has grown, so has demand for the introduction of Building Information Modeling (hereinafter referred to as BIM) technology. In the area of quotation, considerable gains in the accuracy and reliability of construction cost prediction are expected, because quantities can be estimated automatically from the attribute information of a BIM model. BIM-based estimation and quotation of required resources is more accurate than existing 2D-based quotation and has many additional advantages, such as reviews of constructability and interference. It is therefore desirable to estimate the resources required for nuclear power plant decommissioning using BIM, together with tools that are compatible with common international and industrial standards. A review of domestic and overseas cases in which required resources were estimated using BIM showed that they dealt broadly with estimation of required resources, estimation of construction cost, and process management. In each area, methodologies, classification systems, BIM tools, and verification tests have been used in various ways. Nonetheless, several problems have been reported; notably, although a BIM standard classification system exists, no case was found that actually used it. This means that no interlinking among OBS (Object Breakdown Structure), WBS (Work Breakdown Structure) and CBS (Cost Breakdown Structure) was possible. Thus, for nuclear power plant decommissioning, the decommissioning method, process, and related items must be defined clearly at the decommissioning strategy establishment stage, so that the classification systems can be set up.

  2. Estimating pressurized water reactor decommissioning costs: A user's manual for the PWR Cost Estimating Computer Program (CECP) software

    International Nuclear Information System (INIS)

    Bierschbach, M.C.; Mencinsky, G.J.

    1993-10-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit decommissioning plans and cost estimates to the US Nuclear Regulatory Commission (NRC) for review. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning PWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.

  3. How accurate are adolescents in portion-size estimation using the computer tool young adolescents' nutrition assessment on computer (YANA-C)?

    OpenAIRE

    Vereecken, Carine; Dohogne, Sophie; Covents, Marc; Maes, Lea

    2010-01-01

    Computer-administered questionnaires have received increased attention for large-scale population research on nutrition. In Belgium-Flanders, Young Adolescents' Nutrition Assessment on Computer (YANA-C) has been developed. In this tool, standardised photographs are available to assist in portion-size estimation. The purpose of the present study is to assess how accurate adolescents are in estimating portion sizes of food using YANA-C. A convenience sample, aged 11-17 years, estimated the amou...

  4. Research on the digital education resources of sharing pattern in independent colleges based on cloud computing environment

    Science.gov (United States)

    Xiong, Ting; He, Zhiwen

    2017-06-01

    Cloud computing was first proposed by Google in the United States as an Internet-centred approach that provides a standard and open network sharing service. With the rapid development of higher education in China, the educational resources provided by colleges and universities fall far short of actual teaching needs. Cloud computing, which uses Internet technology to provide shared services, has therefore become an important means of sharing digital educational resources in current higher education. Working from a cloud computing environment, the paper analyzes the existing problems in the sharing of digital educational resources among the independent colleges of Jiangxi Province. Drawing on the characteristics of cloud computing relevant to sharing, namely mass storage, efficient operation and low cost, the authors explore and design a sharing model for the digital educational resources of higher education in independent colleges. Finally, the designed sharing model is put into practical application.

  5. Geological Carbon Sequestration Storage Resource Estimates for the Ordovician St. Peter Sandstone, Illinois and Michigan Basins, USA

    Energy Technology Data Exchange (ETDEWEB)

    Barnes, David; Ellett, Kevin; Leetaru, Hannes

    2014-09-30

    The Cambro-Ordovician strata of the Midwest of the United States are a primary target for potential geological storage of CO2 in deep saline formations. The objective of this project is to develop a comprehensive evaluation of the Cambro-Ordovician strata in the Illinois and Michigan Basins above the basal Mount Simon Sandstone, since the Mount Simon is the subject of other investigations, including a demonstration-scale injection at the Illinois Basin Decatur Project. The primary reservoir targets investigated in this study are the middle Ordovician St Peter Sandstone and the late Cambrian to early Ordovician Knox Group carbonates. The topic of this report is a regional-scale evaluation of the geologic storage resource potential of the St Peter Sandstone in both the Illinois and Michigan Basins. Multiple deterministic-based approaches were used in conjunction with the probabilistic-based storage efficiency factors published in the DOE methodology to estimate the carbon storage resource of the formation. Extensive data sets of core analyses and wireline logs were compiled to develop the necessary inputs for volumetric calculations. Results demonstrate how the range in uncertainty of storage resource estimates varies as a function of data availability and quality, and the underlying assumptions used in the different approaches. In the simplest approach, storage resource estimates were calculated from mapping the gross thickness of the formation and applying a single estimate of the effective mean porosity of the formation. Results from this approach led to storage resource estimates ranging from 3.3 to 35.1 Gt in the Michigan Basin, and 1.0 to 11.0 Gt in the Illinois Basin at the P10 and P90 probability level, respectively. The second approach involved consideration of the diagenetic history of the formation throughout the two basins and used depth-dependent functions of porosity to derive a more realistic spatially variable model of porosity rather than applying a
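
    The DOE-style volumetric calculation referred to above multiplies area, gross thickness, porosity and CO2 density by a storage efficiency factor. A minimal sketch follows; the area, thickness, porosity, density and efficiency values are invented placeholders, not the mapped inputs or the published efficiency factors used in the report.

      def co2_storage_resource_gt(area_m2, gross_thickness_m, porosity, co2_density_kg_m3, efficiency):
          """Volumetric storage resource G = A * h * phi * rho_CO2 * E, returned in gigatonnes."""
          mass_kg = area_m2 * gross_thickness_m * porosity * co2_density_kg_m3 * efficiency
          return mass_kg / 1e12   # kg -> Gt

      # Placeholder basin-scale inputs; low/mid/high efficiency factors bracket the estimate
      for efficiency in (0.005, 0.02, 0.055):
          gt = co2_storage_resource_gt(area_m2=200_000e6, gross_thickness_m=100.0,
                                       porosity=0.08, co2_density_kg_m3=700.0, efficiency=efficiency)
          print(efficiency, round(gt, 1), "Gt")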

  6. An integrated system for land resources supervision based on the IoT and cloud computing

    Science.gov (United States)

    Fang, Shifeng; Zhu, Yunqiang; Xu, Lida; Zhang, Jinqu; Zhou, Peiji; Luo, Kan; Yang, Jie

    2017-01-01

    Integrated information systems are important safeguards for the utilisation and development of land resources. Information technologies, including the Internet of Things (IoT) and cloud computing, are inevitable requirements for the quality and efficiency of land resources supervision tasks. In this study, an economical and highly efficient supervision system for land resources has been established based on IoT and cloud computing technologies; a novel online and offline integrated system with synchronised internal and field data that includes the entire process of 'discovering breaches, analysing problems, verifying fieldwork and investigating cases' was constructed. The system integrates key technologies, such as the automatic extraction of high-precision information based on remote sensing, semantic ontology-based technology to excavate and discriminate public sentiment on the Internet that is related to illegal incidents, high-performance parallel computing based on MapReduce, uniform storing and compressing (bitwise) technology, global positioning system data communication and data synchronisation mode, intelligent recognition and four-level ('device, transfer, system and data') safety control technology. The integrated system based on a 'One Map' platform has been officially implemented by the Department of Land and Resources of Guizhou Province, China, and was found to significantly increase the efficiency and level of land resources supervision. The system promoted the overall development of informatisation in fields related to land resource management.

  7. Uranium Resources Modeling And Estimation In Lembah Hitam Sector, Kalan, West Kalimantan

    International Nuclear Information System (INIS)

    Adi Gunawan Muhammad; Bambang Soetopo

    2016-01-01

    Lembah Hitam Sector is part of the Schwaner Mountains and of the upper part of the Kalan Basin stratigraphy. The uranium (U) mineralization layer is associated with metasiltstone and schistose metapelite oriented N 265° E/60° S. Evaluation drilling was carried out at a distance of 50 m from existing points (FKL 14 and FKL 13) to determine the model and the amount of U resources in the measured category. To achieve these objectives, several activities were carried out: a review of previous studies, collection of geological and U mineralization data, quantitative grade estimation using gross-count gamma ray logs, creation of the database and model, and estimation of the U resources. Based on modeling of ten drill holes, complemented by drill core observation, the average grade of U mineralization in the Lembah Hitam Sector was obtained; it ranges from 0.0076 to 0.95 % eU_3O_8, with a mineralization thickness of 0.1 to 4.5 m. Uranium mineralization is present as fracture fillings (veins) or groups of veins and as matrix filling in tectonic breccia, associated with pyrite, pyrrhotite, magnetite, molybdenite, tourmaline and quartz in metasiltstone and schistose metapelite. Calculation of the U resources for 26 ore bodies using a 25 m search radius resulted in 655.65 tons of ore. Applying a 0.01 % cut-off grade resulted in 546.72 tons of ore with an average grade of 0.101 % eU_3O_8. The uranium resource is categorized as a low-grade measured resource. (author)
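
    A toy sketch of the cut-off step described above: ore intersections below the cut-off grade are excluded, and tonnage and the grade-weighted average are recomputed over what remains. The block list is invented and is not the Lembah Hitam data.

      def resource_above_cutoff(blocks, cutoff_grade):
          """blocks: list of (tonnes, grade in % eU3O8); returns (tonnes, weighted average grade %)."""
          kept = [(t, g) for t, g in blocks if g >= cutoff_grade]
          tonnes = sum(t for t, _ in kept)
          avg_grade = sum(t * g for t, g in kept) / tonnes if tonnes else 0.0
          return tonnes, avg_grade

      # Invented ore blocks: (tonnes, grade in % eU3O8)
      blocks = [(120, 0.005), (200, 0.03), (180, 0.09), (150, 0.15), (5, 0.95)]
      print(resource_above_cutoff(blocks, cutoff_grade=0.01))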

  8. An interactive computer approach to performing resource analysis for a multi-resource/multi-project problem. [Spacelab inventory procurement planning

    Science.gov (United States)

    Schlagheck, R. A.

    1977-01-01

    New planning techniques and supporting computer tools are needed for the optimization of resources and costs for space transportation and payload systems. Heavy emphasis on cost effective utilization of resources has caused NASA program planners to look at the impact of various independent variables that affect procurement buying. A description is presented of a category of resource planning which deals with Spacelab inventory procurement analysis. Spacelab is a joint payload project between NASA and the European Space Agency and will be flown aboard the Space Shuttle starting in 1980. In order to respond rapidly to the various procurement planning exercises, a system was built that could perform resource analysis in a quick and efficient manner. This system is known as the Interactive Resource Utilization Program (IRUP). Attention is given to aspects of problem definition, an IRUP system description, questions of data base entry, the approach used for project scheduling, and problems of resource allocation.

  9. Testing a computer-based ostomy care training resource for staff nurses.

    Science.gov (United States)

    Bales, Isabel

    2010-05-01

    Fragmented teaching and ostomy care provided by nonspecialized clinicians unfamiliar with state-of-the-art care and products have been identified as problems in teaching ostomy care to the new ostomate. After conducting a literature review of theories and concepts related to the impact of nurse behaviors and confidence on ostomy care, the author developed a computer-based learning resource and assessed its effect on staff nurse confidence. Of 189 staff nurses with a minimum of 1 year acute-care experience employed in the acute care, emergency, and rehabilitation departments of an acute care facility in the Midwestern US, 103 agreed to participate and returned completed pre- and post-tests, each comprising the same eight statements about providing ostomy care. F and P values were computed for differences between pre- and post test scores. Based on a scale where 1 = totally disagree and 5 = totally agree with the statement, baseline confidence and perceived mean knowledge scores averaged 3.8 and after viewing the resource program post-test mean scores averaged 4.51, a statistically significant improvement (P = 0.000). The largest difference between pre- and post test scores involved feeling confident in having the resources to learn ostomy skills independently. The availability of an electronic ostomy care resource was rated highly in both pre- and post testing. Studies to assess the effects of increased confidence and knowledge on the quality and provision of care are warranted.

  10. Estimated sand and gravel resources of the South Merrimack, Hillsborough County, New Hampshire, 7.5-minute quadrangle

    Science.gov (United States)

    Sutphin, D.M.; Drew, L.J.; Fowler, B.K.

    2006-01-01

    A computer methodology is presented that allows natural aggregate producers, local governmental, and nongovernmental planners to define specific locations that may have sand and gravel deposits meeting user-specified minimum size, thickness, and geographic and geologic criteria, in areas where the surficial geology has been mapped. As an example, the surficial geologic map of the South Merrimack quadrangle was digitized and several digital geographic information system databases were downloaded from the internet and used to estimate the sand and gravel resources in the quadrangle. More than 41 percent of the South Merrimack quadrangle has been mapped as having sand and (or) gravel deposited by glacial meltwaters. These glaciofluvial areas are estimated to contain a total of 10 million m3 of material mapped as gravel, 60 million m3 of material mapped as mixed sand and gravel, and another 50 million m3 of material mapped as sand with minor silt. The mean thickness of these areas is about 1.95 meters. Twenty tracts were selected, each having individual areas of more than about 14 acres (5.67 hectares) of stratified glacial-meltwater sand and gravel deposits, at least 10 feet (3.0 m) of material above the water table, and not sterilized by the proximity of buildings, roads, streams and other bodies of water, or railroads. The 20 tracts are estimated to contain between about 4 and 10 million short tons (st) of gravel and 20 and 30 million st of sand. The five most gravel-rich tracts contain about 71 to 82 percent of the gravel resources in all 20 tracts and about 54-56 percent of the sand. Using this methodology, and the above criteria, a group of four tracts, divided by narrow areas sterilized by a small stream and secondary roads, may have the highest potential in the quadrangle for sand and gravel resources. © Springer Science+Business Media, LLC 2006.
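
    The tract estimates above follow from mapped area, mean deposit thickness and an assumed unit weight. A minimal sketch is given below; the tract size and bulk density are placeholder values, not the quadrangle data.

      def tract_resource(area_m2, mean_thickness_m, bulk_density_t_per_m3):
          """Volume (m3) and tonnage (short tons) of a mapped sand/gravel tract."""
          SHORT_TONS_PER_TONNE = 1.1023
          volume_m3 = area_m2 * mean_thickness_m
          tonnes = volume_m3 * bulk_density_t_per_m3
          return volume_m3, tonnes * SHORT_TONS_PER_TONNE

      # One hypothetical 20-acre tract, 1.95 m mean thickness, ~1.9 t/m3 assumed in-place density
      ACRE_M2 = 4046.86
      print(tract_resource(20 * ACRE_M2, 1.95, 1.9))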

  11. Sex estimation from sternal measurements using multidetector computed tomography.

    Science.gov (United States)

    Ekizoglu, Oguzhan; Hocaoglu, Elif; Inci, Ercan; Bilgili, Mustafa Gokhan; Solmaz, Dilek; Erdil, Irem; Can, Ismail Ozgur

    2014-12-01

    We aimed to show the utility and reliability of sternal morphometric analysis for sex estimation. Sex estimation is a very important step in forensic identification, and skeletal surveys are the main methods used in sex estimation studies. Morphometric analysis of the sternum may provide highly accurate data for sex discrimination. In this study, morphometric analysis of the sternum was evaluated on 1-mm chest computed tomography scans for sex estimation. Four hundred forty-three subjects (202 female, 241 male; mean age: 44 ± 8.1 years [range: 30-60 years]) were included in the study. Manubrium length (ML), mesosternum length (MSL), Sternebra 1 width (S1W), and Sternebra 3 width (S3W) were measured, and the sternal index (SI) was calculated. Differences between the sexes were evaluated by Student's t-test. Predictive factors of sex were determined by discrimination analysis and receiver operating characteristic (ROC) analysis. Male sternal measurements were significantly higher than those of females. In discrimination analysis, MSL had a high accuracy rate, with 80.2% in females and 80.9% in males. MSL also had the best sensitivity (75.9%) and specificity (87.6%) values. Accuracy rates were above 80% in all three stepwise discrimination analyses for both sexes. Stepwise 1 (ML, MSL, S1W, S3W) had the highest accuracy rate among the stepwise discrimination analyses, with 86.1% in females and 83.8% in males. Our study showed that morphometric computed tomography analysis of the sternum might provide important information for sex estimation.
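
    The discrimination analysis described above can be reproduced in outline with a standard linear discriminant classifier. The snippet below uses scikit-learn on synthetic stand-ins for the sternal measurements (ML, MSL, S1W, S3W); the simulated means and spreads are invented and are not the study data.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      # Synthetic measurements (mm): columns ML, MSL, S1W, S3W; males drawn slightly larger on average
      males = rng.normal(loc=[52, 95, 30, 27], scale=[5, 9, 3, 3], size=(241, 4))
      females = rng.normal(loc=[48, 84, 27, 24], scale=[5, 9, 3, 3], size=(202, 4))
      X = np.vstack([males, females])
      y = np.array([1] * len(males) + [0] * len(females))   # 1 = male, 0 = female

      lda = LinearDiscriminantAnalysis()
      print("cross-validated accuracy:", cross_val_score(lda, X, y, cv=5).mean())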

  12. Complexity estimates based on integral transforms induced by computational units

    Czech Academy of Sciences Publication Activity Database

    Kůrková, Věra

    2012-01-01

    Vol. 33, September (2012), pp. 160-167. ISSN 0893-6080. R&D Projects: GA ČR GAP202/11/1368. Institutional research plan: CEZ:AV0Z10300504. Institutional support: RVO:67985807. Keywords: neural networks * estimates of model complexity * approximation from a dictionary * integral transforms * norms induced by computational units. Subject RIV: IN - Informatics, Computer Science. Impact factor: 1.927, year: 2012

  13. Blockchain-Empowered Fair Computational Resource Sharing System in the D2D Network

    Directory of Open Access Journals (Sweden)

    Zhen Hong

    2017-11-01

    Full Text Available Device-to-device (D2D) communication is becoming an increasingly important technology in future networks with the climbing demand for local services. For instance, resource sharing in the D2D network features ubiquitous availability, flexibility, low latency and low cost. However, these features also bring along challenges when building a satisfactory resource sharing system in the D2D network. Specifically, user mobility is one of the top concerns for designing a cooperative D2D computational resource sharing system since mutual communication may not be stably available due to user mobility. A previous endeavour has demonstrated and proven how connectivity can be incorporated into cooperative task scheduling among users in the D2D network to effectively lower average task execution time. There are doubts about whether this type of task scheduling scheme, though effective, ensures fairness among users. In other words, it can be unfair for users who contribute many computational resources while receiving little when in need. In this paper, we propose a novel blockchain-based credit system that can be incorporated into the connectivity-aware task scheduling scheme to enforce fairness among users in the D2D network. Users’ computational task cooperation will be recorded on the public blockchain ledger in the system as transactions, and each user’s credit balance is easily accessible from the ledger. A supernode at the base station is responsible for scheduling cooperative computational tasks based on user mobility and user credit balance. We investigated the performance of the credit system, and simulation results showed that with a minor sacrifice of average task execution time, the level of fairness can be substantially improved.

  14. Schistosomiasis and water resources development: systematic review, meta-analysis, and estimates of people at risk.

    Science.gov (United States)

    Steinmann, Peter; Keiser, Jennifer; Bos, Robert; Tanner, Marcel; Utzinger, Jürg

    2006-07-01

    An estimated 779 million people are at risk of schistosomiasis, of whom 106 million (13.6%) live in irrigation schemes or in close proximity to large dam reservoirs. We identified 58 studies that examined the relation between water resources development projects and schistosomiasis, primarily in African settings. We present a systematic literature review and meta-analysis with the following objectives: (1) to update at-risk populations of schistosomiasis and number of people infected in endemic countries, and (2) to quantify the risk of water resources development and management on schistosomiasis. Using 35 datasets from 24 African studies, our meta-analysis showed pooled random risk ratios of 2.4 and 2.6 for urinary and intestinal schistosomiasis, respectively, among people living adjacent to dam reservoirs. The risk ratio estimate for studies evaluating the effect of irrigation on urinary schistosomiasis was in the range 0.02-7.3 (summary estimate 1.1) and that on intestinal schistosomiasis in the range 0.49-23.0 (summary estimate 4.7). Geographic stratification showed important spatial differences, idiosyncratic to the type of water resources development. We conclude that the development and management of water resources is an important risk factor for schistosomiasis, and hence strategies to mitigate negative effects should become integral parts in the planning, implementation, and operation of future water projects.
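
    The pooled random risk ratios quoted above come from random-effects meta-analysis. The sketch below shows a DerSimonian-Laird pooling of risk ratios from their 95% confidence intervals; the three example studies are invented and are not the datasets used in the review.

      import numpy as np

      def pooled_random_effects_rr(rr, lower_ci, upper_ci):
          """Pool risk ratios with DerSimonian-Laird random effects, using 95% CIs for the variances."""
          log_rr = np.log(rr)
          se = (np.log(upper_ci) - np.log(lower_ci)) / (2 * 1.96)   # SE of log RR from the CI width
          w_fixed = 1.0 / se**2
          mu_fixed = np.sum(w_fixed * log_rr) / np.sum(w_fixed)
          q = np.sum(w_fixed * (log_rr - mu_fixed)**2)              # Cochran's Q
          c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
          tau2 = max(0.0, (q - (len(rr) - 1)) / c)                  # between-study variance
          w = 1.0 / (se**2 + tau2)
          return float(np.exp(np.sum(w * log_rr) / np.sum(w)))

      # Invented example studies: risk ratios with 95% confidence limits
      print(pooled_random_effects_rr(rr=np.array([2.1, 3.0, 1.8]),
                                     lower_ci=np.array([1.4, 1.9, 1.1]),
                                     upper_ci=np.array([3.2, 4.7, 2.9])))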

  15. Estimating the energy independence of a municipal wastewater treatment plant incorporating green energy resources

    International Nuclear Information System (INIS)

    Chae, Kyu-Jung; Kang, Jihoon

    2013-01-01

    Highlights: • We estimated green energy production in a municipal wastewater treatment plant. • Engineered approaches in mining multiple green energy resources were presented. • The estimated green energy production accounted for 6.5% of energy independence in the plant. • We presented practical information regarding green energy projects in water infrastructures. - Abstract: Increasing energy prices and concerns about global climate change highlight the need to improve energy independence in municipal wastewater treatment plants (WWTPs). This paper presents methodologies for estimating the energy independence of a municipal WWTP with a design capacity of 30,000 m3/d incorporating various green energy resources into the existing facilities, including different types of 100 kW photovoltaics, 10 kW small hydropower, and an effluent heat recovery system with a 25 refrigeration ton heat pump. It also provides guidance for the selection of appropriate renewable technologies or their combinations for specific WWTP applications to reach energy self-sufficiency goals. The results showed that annual energy production equal to 107 tons of oil equivalent could be expected when the proposed green energy resources are implemented in the WWTP. The energy independence, which was defined as the percent ratio of green energy production to energy consumption, was estimated to be a maximum of 6.5% and to vary with on-site energy consumption in the WWTP. Implementing green energy resources tailored to specific site conditions is necessary to improve the energy independence in WWTPs. Most of the applied technologies were economically viable primarily because of the financial support under the mandatory renewable portfolio standard in Korea.
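
    The headline figure above is a simple ratio of on-site green generation to plant energy consumption. A one-function sketch follows; the generation and consumption numbers are invented placeholders chosen only to land near the quoted 6.5%.

      def energy_independence_pct(generation_mwh_by_source, consumption_mwh):
          """Percent of plant energy consumption covered by on-site green generation."""
          return 100.0 * sum(generation_mwh_by_source.values()) / consumption_mwh

      # Placeholder annual figures for a 30,000 m3/d plant (illustrative only)
      generation = {"photovoltaics": 130.0, "small_hydro": 60.0, "effluent_heat_recovery": 1050.0}
      print(energy_independence_pct(generation, consumption_mwh=19_000.0))   # ~6.5%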

  16. Accounting for animal movement in estimation of resource selection functions: sampling and data analysis.

    Science.gov (United States)

    Forester, James D; Im, Hae Kyung; Rathouz, Paul J

    2009-12-01

    Patterns of resource selection by animal populations emerge as a result of the behavior of many individuals. Statistical models that describe these population-level patterns of habitat use can miss important interactions between individual animals and characteristics of their local environment; however, identifying these interactions is difficult. One approach to this problem is to incorporate models of individual movement into resource selection models. To do this, we propose a model for step selection functions (SSF) that is composed of a resource-independent movement kernel and a resource selection function (RSF). We show that standard case-control logistic regression may be used to fit the SSF; however, the sampling scheme used to generate control points (i.e., the definition of availability) must be accommodated. We used three sampling schemes to analyze simulated movement data and found that ignoring sampling and the resource-independent movement kernel yielded biased estimates of selection. The level of bias depended on the method used to generate control locations, the strength of selection, and the spatial scale of the resource map. Using empirical or parametric methods to sample control locations produced biased estimates under stronger selection; however, we show that the addition of a distance function to the analysis substantially reduced that bias. Assuming a uniform availability within a fixed buffer yielded strongly biased selection estimates that could be corrected by including the distance function but remained inefficient relative to the empirical and parametric sampling methods. As a case study, we used location data collected from elk in Yellowstone National Park, USA, to show that selection and bias may be temporally variable. Because under constant selection the amount of bias depends on the scale at which a resource is distributed in the landscape, we suggest that distance always be included as a covariate in SSF analyses. This approach to
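
    As a rough illustration of the case-control setup described above (simulated data, and plain unconditional logistic regression rather than a stratum-matched conditional fit), the sketch below scores used-versus-available steps with a habitat covariate and the step distance included as an additional covariate:

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)
      n = 2000
      # Simulated covariates for candidate steps
      habitat = rng.normal(size=n)                      # resource covariate at the candidate step
      distance = rng.exponential(scale=300.0, size=n)   # step length in metres
      # Assumed true model: selection for habitat, penalty on long steps (movement-kernel surrogate)
      logit = 0.8 * habitat - 0.004 * distance
      used = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

      X = np.column_stack([habitat, distance])
      model = LogisticRegression().fit(X, used)
      print("habitat and distance coefficients:", model.coef_[0])   # should recover the assumed signs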

  17. Chest X ray effective doses estimation in computed radiography

    International Nuclear Information System (INIS)

    Abdalla, Esra Abdalrhman Dfaalla

    2013-06-01

    Conventional chest radiography is technically difficult because of the wide range of tissue attenuations in the chest and the limitations of screen-film systems. Computed radiography (CR) offers a different approach utilizing a photostimulable phosphor. Photostimulable phosphors overcome some image quality limitations of chest imaging. The objective of this study was to estimate the effective dose in computed radiography at three hospitals in Khartoum. The study was conducted in the radiography departments of three centres: Advanced Diagnostic Center, Nilain Diagnostic Center and Modern Diagnostic Center. Entrance surface dose (ESD) measurements were conducted for quality control of the x-ray machines and a survey of the operators' experimental techniques. The ESDs were measured with an UNFORS dosimeter, and mathematical equations were used to estimate patient doses during chest X rays. A total of 120 patients were examined in the three centres, among them 62 males and 58 females. The overall mean and range of patient doses was 0.073±0.037 (0.014-0.16) mGy per procedure, while the effective dose was 3.4±1.7 (0.6-7.0) mSv per procedure. This study assessed radiation doses to patients in radiographic examinations of the chest using computed radiography. The radiation dose was measured in three centres in Khartoum, Sudan. The results of the measured effective dose showed that the dose in chest radiography was lower in computed radiography compared to previous studies. (Author)

  18. Interactive Whiteboards and Computer Games at Highschool Level: Digital Resources for Enhancing Reflection in Teaching and Learning

    DEFF Research Database (Denmark)

    Sorensen, Elsebeth Korsgaard; Poulsen, Mathias; Houmann, Rita

    The general potential of computer games for teaching and learning is becoming widely recognized. In particular, within the application contexts of primary and lower secondary education, the relevance and value of computer games seem more accepted, and the possibility and willingness to incorporate...... computer games as a possible resource at the level of other educational resources seem more frequent. For some reason, however, applying computer games in processes of teaching and learning at the high school level seems an almost non-existent event. This paper reports on a study of incorporating...... the learning game “Global Conflicts: Latin America” as a resource into the teaching and learning of a course involving the two subjects “English language learning” and “Social studies” in the final year of a Danish high school. The study adapts an explorative research design approach and investigates...

  19. Elucidating reaction mechanisms on quantum computers

    Science.gov (United States)

    Reiher, Markus; Wiebe, Nathan; Svore, Krysta M.; Wecker, Dave; Troyer, Matthias

    2017-01-01

    With rapid recent advances in quantum technology, we are close to the threshold of quantum devices whose computational powers can exceed those of classical supercomputers. Here, we show that a quantum computer can be used to elucidate reaction mechanisms in complex chemical systems, using the open problem of biological nitrogen fixation in nitrogenase as an example. We discuss how quantum computers can augment classical computer simulations used to probe these reaction mechanisms, to significantly increase their accuracy and enable hitherto intractable simulations. Our resource estimates show that, even when taking into account the substantial overhead of quantum error correction, and the need to compile into discrete gate sets, the necessary computations can be performed in reasonable time on small quantum computers. Our results demonstrate that quantum computers will be able to tackle important problems in chemistry without requiring exorbitant resources. PMID:28674011

  20. Elucidating reaction mechanisms on quantum computers

    Science.gov (United States)

    Reiher, Markus; Wiebe, Nathan; Svore, Krysta M.; Wecker, Dave; Troyer, Matthias

    2017-07-01

    With rapid recent advances in quantum technology, we are close to the threshold of quantum devices whose computational powers can exceed those of classical supercomputers. Here, we show that a quantum computer can be used to elucidate reaction mechanisms in complex chemical systems, using the open problem of biological nitrogen fixation in nitrogenase as an example. We discuss how quantum computers can augment classical computer simulations used to probe these reaction mechanisms, to significantly increase their accuracy and enable hitherto intractable simulations. Our resource estimates show that, even when taking into account the substantial overhead of quantum error correction, and the need to compile into discrete gate sets, the necessary computations can be performed in reasonable time on small quantum computers. Our results demonstrate that quantum computers will be able to tackle important problems in chemistry without requiring exorbitant resources.

  1. Elucidating reaction mechanisms on quantum computers.

    Science.gov (United States)

    Reiher, Markus; Wiebe, Nathan; Svore, Krysta M; Wecker, Dave; Troyer, Matthias

    2017-07-18

    With rapid recent advances in quantum technology, we are close to the threshold of quantum devices whose computational powers can exceed those of classical supercomputers. Here, we show that a quantum computer can be used to elucidate reaction mechanisms in complex chemical systems, using the open problem of biological nitrogen fixation in nitrogenase as an example. We discuss how quantum computers can augment classical computer simulations used to probe these reaction mechanisms, to significantly increase their accuracy and enable hitherto intractable simulations. Our resource estimates show that, even when taking into account the substantial overhead of quantum error correction, and the need to compile into discrete gate sets, the necessary computations can be performed in reasonable time on small quantum computers. Our results demonstrate that quantum computers will be able to tackle important problems in chemistry without requiring exorbitant resources.

  2. Estimating Resource Costs of Levy Campaigns in Five Ohio School Districts

    Science.gov (United States)

    Ingle, W. Kyle; Petroff, Ruth Ann; Johnson, Paul A.

    2011-01-01

    Using Levin and McEwan's (2001) "ingredients method," this study identified the major activities and associated costs of school levy campaigns in five districts. The ingredients were divided into one of five cost categories--human resources, facilities, fees, marketing, and supplies. As to overall costs of the campaigns, estimates ranged…

  3. Statistically and Computationally Efficient Estimating Equations for Large Spatial Datasets

    KAUST Repository

    Sun, Ying; Stein, Michael L.

    2014-01-01

    For Gaussian process models, likelihood based methods are often difficult to use with large irregularly spaced spatial datasets, because exact calculations of the likelihood for n observations require O(n^3) operations and O(n^2) memory. Various approximation methods have been developed to address the computational difficulties. In this paper, we propose new unbiased estimating equations based on score equation approximations that are both computationally and statistically efficient. We replace the inverse covariance matrix that appears in the score equations by a sparse matrix to approximate the quadratic forms, then set the resulting quadratic forms equal to their expected values to obtain unbiased estimating equations. The sparse matrix is constructed by a sparse inverse Cholesky approach to approximate the inverse covariance matrix. The statistical efficiency of the resulting unbiased estimating equations are evaluated both in theory and by numerical studies. Our methods are applied to nearly 90,000 satellite-based measurements of water vapor levels over a region in the Southeast Pacific Ocean.

  4. Statistically and Computationally Efficient Estimating Equations for Large Spatial Datasets

    KAUST Repository

    Sun, Ying

    2014-11-07

    For Gaussian process models, likelihood based methods are often difficult to use with large irregularly spaced spatial datasets, because exact calculations of the likelihood for n observations require O(n^3) operations and O(n^2) memory. Various approximation methods have been developed to address the computational difficulties. In this paper, we propose new unbiased estimating equations based on score equation approximations that are both computationally and statistically efficient. We replace the inverse covariance matrix that appears in the score equations by a sparse matrix to approximate the quadratic forms, then set the resulting quadratic forms equal to their expected values to obtain unbiased estimating equations. The sparse matrix is constructed by a sparse inverse Cholesky approach to approximate the inverse covariance matrix. The statistical efficiency of the resulting unbiased estimating equations are evaluated both in theory and by numerical studies. Our methods are applied to nearly 90,000 satellite-based measurements of water vapor levels over a region in the Southeast Pacific Ocean.

  5. The ATLAS Computing Agora: a resource web site for citizen science projects

    CERN Document Server

    Bourdarios, Claire; The ATLAS collaboration

    2016-01-01

    The ATLAS collaboration has recently set up a number of citizen science projects which have a strong IT component and could not have been envisaged without the growth of general public computing resources and network connectivity: event simulation through volunteer computing, algorithm improvement via Machine Learning challenges, event display analysis on citizen science platforms, use of open data, etc. Most of the interactions with volunteers are handled through message boards, but specific outreach material was also developed, giving an enhanced visibility to the ATLAS software and computing techniques, challenges and community. In this talk the ATLAS Computing Agora (ACA) web platform will be presented, as well as some of the specific material developed for some of the projects.

  6. Estimate of spatial fuel burnup of a multipurpose reactor in computer simulation

    International Nuclear Information System (INIS)

    Santos, Nadia Rodrigues dos; Lima, Zelmo Rodrigues de; Moreira, Maria de Lourdes

    2015-01-01

    In previous research, which aimed, through computer simulation, to estimate the spatial fuel burnup for the benchmark material test research reactor of the International Atomic Energy Agency (MTR/IAEA), it was found that the use of a code written in FORTRAN, based on the neutron diffusion theory, together with WIMSD-5B, which performs the cell calculation, proved to be valid for estimating the spatial burnup of other nuclear research reactors. That said, this paper presents the results of a computer simulation to estimate the spatial fuel burnup of a typical multipurpose reactor of the plate and dispersion fuel type. The results were considered satisfactory, being in line with those presented in the literature. For future work, simulations with other core configurations are suggested, as well as comparisons of the WIMSD-5B results with programs often employed in burnup calculations, and tests of different methods for interpolating the values obtained with FORTRAN. Another proposal is to estimate the fuel burnup taking into account the thermohydraulic parameters and the appearance of xenon. (author)

  7. Negative quasi-probability as a resource for quantum computation

    International Nuclear Information System (INIS)

    Veitch, Victor; Ferrie, Christopher; Emerson, Joseph; Gross, David

    2012-01-01

    A central problem in quantum information is to determine the minimal physical resources that are required for quantum computational speed-up and, in particular, for fault-tolerant quantum computation. We establish a remarkable connection between the potential for quantum speed-up and the onset of negative values in a distinguished quasi-probability representation, a discrete analogue of the Wigner function for quantum systems of odd dimension. This connection allows us to resolve an open question on the existence of bound states for magic state distillation: we prove that there exist mixed states outside the convex hull of stabilizer states that cannot be distilled to non-stabilizer target states using stabilizer operations. We also provide an efficient simulation protocol for Clifford circuits that extends to a large class of mixed states, including bound universal states. (paper)

  8. On robust parameter estimation in brain-computer interfacing

    Science.gov (United States)

    Samek, Wojciech; Nakajima, Shinichi; Kawanabe, Motoaki; Müller, Klaus-Robert

    2017-12-01

    Objective. The reliable estimation of parameters such as mean or covariance matrix from noisy and high-dimensional observations is a prerequisite for successful application of signal processing and machine learning algorithms in brain-computer interfacing (BCI). This challenging task becomes significantly more difficult if the data set contains outliers, e.g. due to subject movements, eye blinks or loose electrodes, as they may heavily bias the estimation and the subsequent statistical analysis. Although various robust estimators have been developed to tackle the outlier problem, they ignore important structural information in the data and thus may not be optimal. Typical structural elements in BCI data are the trials consisting of a few hundred EEG samples and indicating the start and end of a task. Approach. This work discusses the parameter estimation problem in BCI and introduces a novel hierarchical view on robustness which naturally comprises different types of outlierness occurring in structured data. Furthermore, the class of minimum divergence estimators is reviewed and a robust mean and covariance estimator for structured data is derived and evaluated with simulations and on a benchmark data set. Main results. The results show that state-of-the-art BCI algorithms benefit from robustly estimated parameters. Significance. Since parameter estimation is an integral part of various machine learning algorithms, the presented techniques are applicable to many problems beyond BCI.
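
    The snippet below is a small illustration of the outlier problem described above, not the authors' minimum-divergence estimator: it contrasts the plain sample mean and covariance with a robust Minimum Covariance Determinant fit on contaminated synthetic data.

      import numpy as np
      from sklearn.covariance import MinCovDet

      rng = np.random.default_rng(0)
      clean = rng.multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, 0.6], [0.6, 1.0]], size=480)
      outliers = rng.normal(loc=8.0, scale=3.0, size=(20, 2))   # e.g. artefact-contaminated trials
      X = np.vstack([clean, outliers])

      print("sample mean:", X.mean(axis=0))                     # pulled towards the outliers
      robust = MinCovDet(random_state=0).fit(X)
      print("robust mean:", robust.location_)                   # close to the true [0, 0]
      print("robust covariance:\n", robust.covariance_)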

  9. Dense image correspondences for computer vision

    CERN Document Server

    Liu, Ce

    2016-01-01

    This book describes the fundamental building-block of many new computer vision systems: dense and robust correspondence estimation. Dense correspondence estimation techniques are now successfully being used to solve a wide range of computer vision problems, very different from the traditional applications such techniques were originally developed to solve. This book introduces the techniques used for establishing correspondences between challenging image pairs, the novel features used to make these techniques robust, and the many problems dense correspondences are now being used to solve. The book provides information to anyone attempting to utilize dense correspondences in order to solve new or existing computer vision problems. The editors describe how to solve many computer vision problems by using dense correspondence estimation. Finally, it surveys resources, code, and data necessary for expediting the development of effective correspondence-based computer vision systems.   ·         Provides i...

  10. A mathematical method for verifying the validity of measured information about the flows of energy resources based on the state estimation theory

    Science.gov (United States)

    Pazderin, A. V.; Sof'in, V. V.; Samoylenko, V. O.

    2015-11-01

    Efforts aimed at improving energy efficiency in all branches of the fuel and energy complex shall be commenced with setting up a high-tech automated system for monitoring and accounting energy resources. Malfunctions and failures in the measurement and information parts of this system may distort commercial measurements of energy resources and lead to financial risks for power supplying organizations. In addition, measurement errors may be connected with intentional distortion of measurements for reducing payment for using energy resources on the consumer's side, which leads to commercial loss of energy resource. The article presents a universal mathematical method for verifying the validity of measurement information in networks for transporting energy resources, such as electricity and heat, petroleum, gas, etc., based on the state estimation theory. The energy resource transportation network is represented by a graph the nodes of which correspond to producers and consumers, and its branches stand for transportation mains (power lines, pipelines, and heat network elements). The main idea of state estimation is connected with obtaining the calculated analogs of energy resources for all available measurements. Unlike "raw" measurements, which contain inaccuracies, the calculated flows of energy resources, called estimates, will fully satisfy the suitability condition for all state equations describing the energy resource transportation network. The state equations written in terms of calculated estimates will be already free from residuals. The difference between a measurement and its calculated analog (estimate) is called in the estimation theory an estimation remainder. The obtained large values of estimation remainders are an indicator of high errors of particular energy resource measurements. By using the presented method it is possible to improve the validity of energy resource measurements, to estimate the transportation network observability, to eliminate
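
    A minimal sketch of the underlying idea: measured flows are adjusted by weighted least squares so that flow balance holds at every internal node, and the estimation remainders (measurement minus estimate) flag suspect meters. The three-branch network and meter readings below are invented for illustration.

      import numpy as np

      def state_estimate(z, weights, A):
          """Weighted least-squares estimate of flows z subject to node balance A @ x = 0."""
          W = np.diag(weights)
          n, m = len(z), A.shape[0]
          # KKT system for: minimize (x - z)' W (x - z)  subject to  A x = 0
          kkt = np.block([[2 * W, A.T], [A, np.zeros((m, m))]])
          rhs = np.concatenate([2 * W @ z, np.zeros(m)])
          x = np.linalg.solve(kkt, rhs)[:n]
          return x, z - x                       # estimates and estimation remainders

      # One internal node: branch 1 flows in, branches 2 and 3 flow out (invented meter readings)
      A = np.array([[1.0, -1.0, -1.0]])
      z = np.array([100.0, 40.0, 55.0])         # readings do not balance (5 units unaccounted for)
      x_hat, remainders = state_estimate(z, weights=np.ones(3), A=A)
      print("estimates:", x_hat)
      print("remainders:", remainders)          # unusually large remainders flag suspect measurements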

  11. COMPUTER SUPPORT SYSTEMS FOR ESTIMATING CHEMICAL TOXICITY: PRESENT CAPABILITIES AND FUTURE TRENDS

    Science.gov (United States)

    Computer Support Systems for Estimating Chemical Toxicity: Present Capabilities and Future Trends A wide variety of computer-based artificial intelligence (AI) and decision support systems exist currently to aid in the assessment of toxicity for environmental chemicals. T...

  12. Modified stretched exponential model of computer system resources management limitations-The case of cache memory

    Science.gov (United States)

    Strzałka, Dominik; Dymora, Paweł; Mazurek, Mirosław

    2018-02-01

    In this paper we present some preliminary results in the field of computer systems management in relation to Tsallis thermostatistics and the ubiquitous problem of hardware-limited resources. In the case of systems with non-deterministic behaviour, management of their resources is a key point that guarantees their acceptable performance and proper working. This is a very broad problem that poses many challenges in areas such as finance, transport, water and food, and health. We focus on computer systems, with attention paid to cache memory, and propose to use an analytical model that is able to connect the non-extensive entropy formalism, long-range dependencies, management of system resources and queuing theory. The analytical results obtained are related to a practical experiment showing interesting and valuable results.
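
    As a hedged illustration of the modelling idea (the exact modified stretched-exponential form used by the authors is not reproduced here), the sketch below fits a plain stretched-exponential decay to synthetic data with SciPy:

      import numpy as np
      from scipy.optimize import curve_fit

      def stretched_exp(t, tau, beta):
          """Kohlrausch / stretched-exponential relaxation: exp(-(t/tau)**beta)."""
          return np.exp(-(t / tau) ** beta)

      t = np.linspace(0.1, 50.0, 200)
      observed = stretched_exp(t, tau=8.0, beta=0.6) + np.random.default_rng(2).normal(0, 0.02, t.size)

      params, _ = curve_fit(stretched_exp, t, observed, p0=(5.0, 1.0), bounds=([0.1, 0.1], [100.0, 2.0]))
      print("fitted tau, beta:", params)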

  13. Soil Erosion Estimation Using Grid-based Computation

    Directory of Open Access Journals (Sweden)

    Josef Vlasák

    2005-06-01

    Full Text Available Soil erosion estimation is an important part of a land consolidation process. The universal soil loss equation (USLE) was presented by Wischmeier and Smith. USLE computation uses several factors, namely R – rainfall factor, K – soil erodibility, L – slope length factor, S – slope gradient factor, C – cropping management factor, and P – erosion control management factor. The L and S factors are usually combined into one LS factor – the topographic factor. The single factors are determined from several sources, such as a DTM (Digital Terrain Model), the BPEJ soil type map, aerial and satellite images, etc. A conventional approach to the USLE computation, which is widely used in the Czech Republic, is based on the selection of characteristic profiles for which all above-mentioned factors must be determined. The result (G – annual soil loss) of such a computation is then applied to the whole area (slope) of interest. Another approach to the USLE computation uses grids as the main data structure. A prerequisite for a grid-based USLE computation is that each of the above-mentioned factors exists as a separate grid layer. The crucial step in this computation is the selection of an appropriate grid resolution (grid cell size). A large cell size can cause an undesirable degradation of precision. Too small a cell size can noticeably slow down the whole computation. Provided that the cell size is derived from the source's precision, the appropriate cell size for the Czech Republic varies from 30 m to 50 m. In some cases, especially when new surveying has been done, grid computations can be performed with higher accuracy, i.e. with a smaller grid cell size. In such cases, we have proposed a new method using a two-step computation. The first-step computation uses a bigger cell size and is designed to identify spots of higher erosion. The second step then uses a smaller cell size, but it makes the computation only for the area identified in the previous step. This decomposition allows a
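
    In grid form, the USLE is a cell-by-cell product of co-registered factor layers. A minimal NumPy sketch with tiny invented rasters is shown below; the two-step resolution refinement proposed by the authors is not reproduced.

      import numpy as np

      def usle_grid(R, K, LS, C, P):
          """Cell-wise annual soil loss G = R * K * LS * C * P for co-registered factor grids."""
          return R * K * LS * C * P

      shape = (4, 4)                            # tiny invented raster, e.g. 30 m cells
      rng = np.random.default_rng(3)
      R  = np.full(shape, 45.0)                 # rainfall erosivity (uniform here)
      K  = rng.uniform(0.2, 0.5, shape)         # soil erodibility
      LS = rng.uniform(0.3, 3.0, shape)         # topographic factor
      C  = rng.uniform(0.05, 0.4, shape)        # cropping management
      P  = np.ones(shape)                       # no erosion-control practice
      G = usle_grid(R, K, LS, C, P)
      print("mean annual soil loss per cell:", G.mean())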

  14. Adaptive Management of Computing and Network Resources for Spacecraft Systems

    Science.gov (United States)

    Pfarr, Barbara; Welch, Lonnie R.; Detter, Ryan; Tjaden, Brett; Huh, Eui-Nam; Szczur, Martha R. (Technical Monitor)

    2000-01-01

    It is likely that NASA's future spacecraft systems will consist of distributed processes which will handle dynamically varying workloads in response to perceived scientific events, the spacecraft environment, spacecraft anomalies and user commands. Since all situations and possible uses of sensors cannot be anticipated during pre-deployment phases, an approach for dynamically adapting the allocation of distributed computational and communication resources is needed. To address this, we are evolving the DeSiDeRaTa adaptive resource management approach to enable reconfigurable ground and space information systems. The DeSiDeRaTa approach embodies a set of middleware mechanisms for adapting resource allocations, and a framework for reasoning about the real-time performance of distributed application systems. The framework and middleware will be extended to accommodate (1) the dynamic aspects of intra-constellation network topologies, and (2) the complete real-time path from the instrument to the user. We are developing a ground-based testbed that will enable NASA to perform early evaluation of adaptive resource management techniques without the expense of first deploying them in space. The benefits of the proposed effort are numerous, including the ability to use sensors in new ways not anticipated at design time; the production of information technology that ties the sensor web together; the accommodation of greater numbers of missions with fewer resources; and the opportunity to leverage the DeSiDeRaTa project's expertise, infrastructure and models for adaptive resource management for distributed real-time systems.

  15. Collocational Relations in Japanese Language Textbooks and Computer-Assisted Language Learning Resources

    Directory of Open Access Journals (Sweden)

    Irena SRDANOVIĆ

    2011-05-01

    Full Text Available In this paper, we explore presence of collocational relations in the computer-assisted language learning systems and other language resources for the Japanese language, on one side, and, in the Japanese language learning textbooks and wordlists, on the other side. After introducing how important it is to learn collocational relations in a foreign language, we examine their coverage in the various learners’ resources for the Japanese language. We particularly concentrate on a few collocations at the beginner’s level, where we demonstrate their treatment across various resources. A special attention is paid to what is referred to as unpredictable collocations, which have a bigger foreign language learning-burden than the predictable ones.

  16. FORTRAN subroutine for computing the optimal estimate of f(x)

    International Nuclear Information System (INIS)

    Gaffney, P.W.

    1980-10-01

    A FORTRAN subroutine called RANGE is presented that is designed to compute the optimal estimate of a function f given values of the function at n distinct points x_1 < x_2 < ... < x_n and given a bound on one of the derivatives of f. We denote this estimate by Ω. It is optimal in the sense that the error |f - Ω| has the smallest possible error bound

  17. Canada's forest biomass resources: deriving estimates from Canada's forest inventory

    International Nuclear Information System (INIS)

    Penner, M.; Power, K.; Muhairwe, C.; Tellier, R.; Wang, Y.

    1997-01-01

    A biomass inventory for Canada was undertaken to address the data needs of carbon budget modelers, specifically to provide estimates of above-ground tree components and of non-merchantable trees in Canadian forests. The objective was to produce a national method for converting volume estimates to biomass that was standardized, repeatable across the country, efficient and well documented. Different conversion methods were used for low productivity forests (productivity class 1) and higher productivity forests (productivity class 2). The conversion factors were computed by constructing hypothetical stands for each site, age, species and province combination, and estimating the merchantable volume and all the above-ground biomass components from suitable published equations. This report documents the procedures for deriving the national biomass inventory, and provides illustrative examples of the results. 46 refs., 9 tabs., 5 figs
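
    The conversion step described above can be sketched as merchantable volume multiplied by a factor looked up by species group and productivity class. The factors below are invented placeholders for illustration, not the values derived in the report.

      # Hypothetical conversion factors (t of above-ground biomass per m3 of merchantable volume)
      CONVERSION_FACTORS = {
          ("softwood", 1): 0.95,   # productivity class 1 (low) - placeholder value
          ("softwood", 2): 0.70,
          ("hardwood", 1): 1.10,
          ("hardwood", 2): 0.85,
      }

      def biomass_tonnes(merchantable_volume_m3, species_group, productivity_class):
          """Above-ground biomass estimated from inventory volume via a lookup factor."""
          return merchantable_volume_m3 * CONVERSION_FACTORS[(species_group, productivity_class)]

      print(biomass_tonnes(1.2e6, "softwood", 2))   # e.g. one inventory stratum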

  18. Deterministic absorbed dose estimation in computed tomography using a discrete ordinates method

    International Nuclear Information System (INIS)

    Norris, Edward T.; Liu, Xin; Hsieh, Jiang

    2015-01-01

    Purpose: Organ dose estimation for a patient undergoing computed tomography (CT) scanning is very important. Although Monte Carlo methods are considered gold-standard in patient dose estimation, the computation time required is formidable for routine clinical calculations. Here, the authors investigate a deterministic method for estimating an absorbed dose more efficiently. Methods: Compared with current Monte Carlo methods, a more efficient approach to estimating the absorbed dose is to solve the linear Boltzmann equation numerically. In this study, an axial CT scan was modeled with a software package, Denovo, which solved the linear Boltzmann equation using the discrete ordinates method. The CT scanning configuration included 16 x-ray source positions, beam collimators, flat filters, and bowtie filters. The phantom was the standard 32 cm CT dose index (CTDI) phantom. Four different Denovo simulations were performed with different simulation parameters, including the number of quadrature sets and the order of Legendre polynomial expansions. A Monte Carlo simulation was also performed for benchmarking the Denovo simulations. A quantitative comparison was made of the simulation results obtained by the Denovo and the Monte Carlo methods. Results: The difference in the simulation results of the discrete ordinates method and those of the Monte Carlo methods was found to be small, with a root-mean-square difference of around 2.4%. It was found that the discrete ordinates method, with a higher order of Legendre polynomial expansions, underestimated the absorbed dose near the center of the phantom (i.e., low dose region). Simulations of the quadrature set 8 and the first order of the Legendre polynomial expansions proved to be the most efficient computation method in the authors’ study. The single-thread computation time of the deterministic simulation of the quadrature set 8 and the first order of the Legendre polynomial expansions was 21 min on a personal computer

  19. Radiotherapy infrastructure and human resources in Switzerland. Present status and projected computations for 2020

    International Nuclear Information System (INIS)

    Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar; Zwahlen, Daniel; Bodis, Stephan

    2016-01-01

    The purpose of this study was to evaluate the present status of radiotherapy infrastructure and human resources in Switzerland and compute projections for 2020. The European Society of Therapeutic Radiation Oncology "Quantification of Radiation Therapy Infrastructure and Staffing" guidelines (ESTRO-QUARTS) and those of the International Atomic Energy Agency (IAEA) were applied to estimate the requirements for teleradiotherapy (TRT) units, radiation oncologists (RO), medical physicists (MP) and radiotherapy technologists (RTT). The databases used for computation of the present gap and additional requirements are (a) Global Cancer Incidence, Mortality and Prevalence (GLOBOCAN) for cancer incidence (b) the Directory of Radiotherapy Centres (DIRAC) of the IAEA for existing TRT units (c) human resources from the recent ESTRO "Health Economics in Radiation Oncology" (HERO) survey and (d) radiotherapy utilization (RTU) rates for each tumour site, published by the Ingham Institute for Applied Medical Research (IIAMR). In 2015, 30,999 of 45,903 cancer patients would have required radiotherapy. By 2020, this will have increased to 34,041 of 50,427 cancer patients. Switzerland presently has an adequate number of TRTs, but a deficit of 57 ROs, 14 MPs and 36 RTTs. By 2020, an additional 7 TRTs, 72 ROs, 22 MPs and 66 RTTs will be required. In addition, a realistic dynamic model for calculation of staff requirements due to anticipated changes in future radiotherapy practices has been proposed. This model could be tailor-made and individualized for any radiotherapy centre. A 9.8 % increase in radiotherapy requirements is expected for cancer patients over the next 5 years. The present study should assist the stakeholders and health planners in designing an appropriate strategy for meeting future radiotherapy needs for Switzerland. (orig.)
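
    The arithmetic behind such projections can be sketched as follows; the workload norms used here are illustrative placeholders, not the actual ESTRO-QUARTS/IAEA figures applied in the study.

    ```python
    import math

    # Illustrative workload norms (placeholders, not the QUARTS/IAEA values):
    PATIENTS_PER_TRT_UNIT = 450   # treatment courses per teletherapy unit per year
    PATIENTS_PER_RO = 250         # patients per radiation oncologist per year
    PATIENTS_PER_MP = 500         # patients per medical physicist per year
    PATIENTS_PER_RTT = 150        # patients per radiotherapy technologist per year

    def staffing_requirements(cancer_incidence, rtu_rate):
        """Estimate equipment and staff needs from incidence and the RT utilisation rate."""
        rt_patients = cancer_incidence * rtu_rate
        return {
            "RT patients": round(rt_patients),
            "TRT units": math.ceil(rt_patients / PATIENTS_PER_TRT_UNIT),
            "ROs": math.ceil(rt_patients / PATIENTS_PER_RO),
            "MPs": math.ceil(rt_patients / PATIENTS_PER_MP),
            "RTTs": math.ceil(rt_patients / PATIENTS_PER_RTT),
        }

    # 2020 projection from the record: 34,041 of 50,427 patients require radiotherapy.
    print(staffing_requirements(50427, 34041 / 50427))
    ```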

  20. Direct estimation of human trabecular bone stiffness using cone beam computed tomography.

    Science.gov (United States)

    Klintström, Eva; Klintström, Benjamin; Pahr, Dieter; Brismar, Torkel B; Smedby, Örjan; Moreno, Rodrigo

    2018-04-10

    The aim of this study was to evaluate the possibility of estimating the biomechanical properties of trabecular bone through finite element simulations by using dental cone beam computed tomography data. Fourteen human radius specimens were scanned in 3 cone beam computed tomography devices: 3-D Accuitomo 80 (J. Morita MFG., Kyoto, Japan), NewTom 5 G (QR Verona, Verona, Italy), and Verity (Planmed, Helsinki, Finland). The imaging data were segmented by using 2 different methods. Stiffness (Young modulus), shear moduli, and the size and shape of the stiffness tensor were studied. Corresponding evaluations by using micro-CT were regarded as the reference standard. The 3-D Accuitomo 80 (J. Morita MFG., Kyoto, Japan) showed good performance in estimating stiffness and shear moduli but was sensitive to the choice of segmentation method. NewTom 5 G (QR Verona, Verona, Italy) and Verity (Planmed, Helsinki, Finland) yielded good correlations, but they were not as strong as Accuitomo 80 (J. Morita MFG., Kyoto, Japan). The cone beam computed tomography devices overestimated both stiffness and shear compared with the micro-CT estimations. Finite element-based calculations of biomechanics from cone beam computed tomography data are feasible, with strong correlations for the Accuitomo 80 scanner (J. Morita MFG., Kyoto, Japan) combined with an appropriate segmentation method. Such measurements might be useful for predicting implant survival by in vivo estimations of bone properties. Copyright © 2018 Elsevier Inc. All rights reserved.

  1. Computationally Efficient and Noise Robust DOA and Pitch Estimation

    DEFF Research Database (Denmark)

    Karimian-Azari, Sam; Jensen, Jesper Rindom; Christensen, Mads Græsbøll

    2016-01-01

    Many natural signals, such as voiced speech and some musical instruments, are approximately periodic over short intervals. These signals are often described in mathematics by the sum of sinusoids (harmonics) with frequencies that are proportional to the fundamental frequency, or pitch. In sensor ... a joint DOA and pitch estimator. In white Gaussian noise, we derive even more computationally efficient solutions which are designed using the narrowband power spectrum of the harmonics. Numerical results reveal the performance of the estimators in colored noise compared with the Cramér-Rao lower...
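
    The record rests on the harmonic model (a sum of sinusoids at integer multiples of the pitch). As a rough, generic illustration only, and not the estimators developed in the thesis, a pitch can be picked by harmonic summation over the periodogram; the synthetic signal, candidate grid, and harmonic count below are arbitrary choices.

    ```python
    import numpy as np

    def harmonic_summation_pitch(x, fs, f0_grid, n_harmonics=5):
        """Estimate the fundamental frequency by summing periodogram power at harmonics."""
        n = len(x)
        spectrum = np.abs(np.fft.rfft(x * np.hanning(n))) ** 2
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        scores = [np.interp(f0 * np.arange(1, n_harmonics + 1), freqs, spectrum).sum()
                  for f0 in f0_grid]
        return f0_grid[int(np.argmax(scores))]

    fs = 8000.0
    t = np.arange(0, 0.2, 1.0 / fs)
    # Synthetic quasi-periodic signal: 200 Hz fundamental plus two harmonics and noise.
    x = sum(np.sin(2 * np.pi * 200.0 * k * t) / k for k in (1, 2, 3))
    x = x + 0.1 * np.random.default_rng(0).standard_normal(len(t))
    print("estimated pitch:", harmonic_summation_pitch(x, fs, np.arange(80.0, 400.0, 0.5)), "Hz")
    ```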

  2. Cost-Benefit Analysis of Computer Resources for Machine Learning

    Science.gov (United States)

    Champion, Richard A.

    2007-01-01

    Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.
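
    A toy illustration of the cost/benefit idea above, with a simple polynomial surface standing in for the report's PNN (the data, strata, and model are assumptions): calibration cost and goodness-of-fit are tracked as the stratified sample grows, showing diminishing returns.

    ```python
    import time
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic spatial data: a smooth pattern over [0, 1]^2 plus noise.
    xy = rng.uniform(0.0, 1.0, size=(20000, 2))
    z = np.sin(3 * xy[:, 0]) * np.cos(2 * xy[:, 1]) + 0.05 * rng.standard_normal(len(xy))

    def stratified_sample(xy, n_per_cell, n_cells=4):
        """Pick up to n_per_cell points from each cell of an n_cells x n_cells grid."""
        cells = np.floor(xy * n_cells).clip(max=n_cells - 1).astype(int)
        chosen = []
        for i in range(n_cells):
            for j in range(n_cells):
                idx = np.flatnonzero((cells[:, 0] == i) & (cells[:, 1] == j))
                chosen.append(rng.choice(idx, size=min(n_per_cell, len(idx)), replace=False))
        return np.concatenate(chosen)

    def fit_and_score(train_idx):
        """Fit a quadratic surface (stand-in for the PNN); return (cost, goodness-of-fit)."""
        t0 = time.perf_counter()
        x1, x2 = xy[train_idx, 0], xy[train_idx, 1]
        design = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
        coef, *_ = np.linalg.lstsq(design, z[train_idx], rcond=None)
        cost = time.perf_counter() - t0
        full = np.column_stack([np.ones(len(xy)), xy[:, 0], xy[:, 1],
                                xy[:, 0] * xy[:, 1], xy[:, 0]**2, xy[:, 1]**2])
        rmse = np.sqrt(np.mean((full @ coef - z) ** 2))
        return cost, rmse

    for n in (5, 50, 500):
        cost, rmse = fit_and_score(stratified_sample(xy, n))
        print(f"{n:>4} points/cell: fit time {cost * 1e3:6.2f} ms, RMSE {rmse:.4f}")
    ```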

  3. Tracking the Flow of Resources in Electronic Waste - The Case of End-of-Life Computer Hard Disk Drives.

    Science.gov (United States)

    Habib, Komal; Parajuly, Keshav; Wenzel, Henrik

    2015-10-20

    Recovery of resources, in particular metals, from waste flows is widely seen as a prioritized option to reduce their potential supply constraints in the future. The current waste electrical and electronic equipment (WEEE) treatment system is more focused on bulk metals, where the recycling rate of specialty metals, such as rare earths, is negligible compared to their increasing use in modern products, such as electronics. This study investigates the challenges in recovering these resources in the existing WEEE treatment system. It is illustrated by following the material flows of resources in a conventional WEEE treatment plant in Denmark. Computer hard disk drives (HDDs) containing neodymium-iron-boron (NdFeB) magnets were selected as the case product for this experiment. The resulting output fractions were tracked until their final treatment in order to estimate the recovery potential of rare earth elements (REEs) and other resources contained in HDDs. The results show that, out of the 244 kg of HDDs treated, 212 kg, comprising mainly aluminum and steel, can be finally recovered from the metallurgic process. The results further demonstrate the complete loss of REEs in the existing shredding-based WEEE treatment processes. Dismantling and separate processing of NdFeB magnets from their end-use products can be a preferred option over shredding. However, it remains a technological and logistic challenge for the existing system.

  4. Photonic entanglement as a resource in quantum computation and quantum communication

    OpenAIRE

    Prevedel, Robert; Aspelmeyer, Markus; Brukner, Caslav; Jennewein, Thomas; Zeilinger, Anton

    2008-01-01

    Entanglement is an essential resource in current experimental implementations for quantum information processing. We review a class of experiments exploiting photonic entanglement, ranging from one-way quantum computing over quantum communication complexity to long-distance quantum communication. We then propose a set of feasible experiments that will underline the advantages of photonic entanglement for quantum information processing.

  5. Critical phenomena in communication/computation networks with various topologies and suboptimal to optimal resource allocation

    Science.gov (United States)

    Cogoni, Marco; Busonera, Giovanni; Anedda, Paolo; Zanetti, Gianluigi

    2015-01-01

    We generalize previous studies on critical phenomena in communication networks [1,2] by adding computational capabilities to the nodes. In our model, a set of tasks with random origin, destination and computational structure is distributed on a computational network, modeled as a graph. By varying the temperature of a Metropolis Monte Carlo, we explore the global latency for an optimal to suboptimal resource assignment at a given time instant. By computing the two-point correlation function for the local overload, we study the behavior of the correlation distance (both for links and nodes) while approaching the congested phase: a transition from peaked to spread g(r) is seen above a critical (Monte Carlo) temperature Tc. The average latency trend of the system is predicted by averaging over several network traffic realizations while maintaining spatially detailed information for each node: a sharp decrease of performance is found above Tc independently of the workload. The globally optimized computational resource allocation and network routing define a baseline for a future comparison of the transition behavior with respect to existing routing strategies [3,4] for different network topologies.
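
    A toy sketch of the kind of Metropolis Monte Carlo exploration described above; the latency cost, node capacities, and task weights below are invented stand-ins for the authors' network model, and low temperature drives the assignment toward the optimal allocation.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    n_nodes, n_tasks = 20, 200
    capacity = rng.uniform(1.0, 3.0, n_nodes)   # computational capability of each node
    work = rng.uniform(0.5, 1.5, n_tasks)        # computational weight of each task

    def latency(assignment):
        """Toy global latency: each node's latency grows with its load/capacity ratio."""
        load = np.bincount(assignment, weights=work, minlength=n_nodes)
        return np.sum((load / capacity) ** 2)

    def metropolis(temperature, n_steps=20000):
        """Explore task-to-node assignments; low T approaches the optimal allocation."""
        assignment = rng.integers(0, n_nodes, n_tasks)
        cost = latency(assignment)
        for _ in range(n_steps):
            task, new_node = rng.integers(n_tasks), rng.integers(n_nodes)
            trial = assignment.copy()
            trial[task] = new_node
            new_cost = latency(trial)
            if new_cost < cost or rng.random() < np.exp(-(new_cost - cost) / temperature):
                assignment, cost = trial, new_cost
        return cost

    for T in (0.01, 0.5, 5.0):
        print(f"T = {T:4}: global latency {metropolis(T):.2f}")
    ```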

  6. An Estimate of Shallow, Low-Temperature Geothermal Resources of the United States

    Energy Technology Data Exchange (ETDEWEB)

    Mullane, Michelle; Gleason, Michael; Reber, Tim; McCabe, Kevin; Mooney, Meghan; Young, Katherine R.

    2017-05-01

    Low-temperature geothermal resources in the United States potentially hold an enormous quantity of thermal energy, useful for direct use in residential, commercial and industrial applications such as space and water heating, greenhouse warming, pool heating, aquaculture, and low-temperature manufacturing processes. Several studies published over the past 40 years have provided assessments of the resource potential for multiple types of low-temperature geothermal systems (e.g. hydrothermal convection, hydrothermal conduction, and enhanced geothermal systems) with varying temperature ranges and depths. This paper provides a summary and additional analysis of these assessments of shallow (≤ 3 km), low-temperature (30-150 degrees C) geothermal resources in the United States, suitable for use in direct-use applications. This analysis considers six types of geothermal systems, spanning both hydrothermal and enhanced geothermal systems (EGS). We outline the primary data sources and quantitative parameters used to describe resources in each of these categories, and present summary statistics of the total resources available. In sum, we find that low-temperature hydrothermal resources and EGS resources contain approximately 8 million and 800 million TWh of heat-in-place, respectively. In future work, these resource potential estimates will be used for modeling of the technical and market potential for direct-use geothermal applications for the U.S. Department of Energy's Geothermal Vision Study.
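
    Heat-in-place figures such as those quoted above come from volumetric accounting of stored thermal energy; a generic sketch of that calculation follows, with illustrative inputs rather than the study's resource data.

    ```python
    # Volumetric "heat-in-place" estimate for a low-temperature reservoir.
    # All numbers are illustrative placeholders, not values from the assessment above.
    RHO_C_ROCK = 2.6e6        # volumetric heat capacity of saturated rock, J/(m^3 K)

    def heat_in_place_twh(area_km2, thickness_m, t_reservoir_c, t_reference_c=15.0):
        """Stored thermal energy above a reference temperature, in TWh."""
        volume_m3 = area_km2 * 1e6 * thickness_m
        energy_j = RHO_C_ROCK * volume_m3 * (t_reservoir_c - t_reference_c)
        return energy_j / 3.6e15          # 1 TWh = 3.6e15 J

    # A 100 km^2, 500 m thick reservoir at 90 degrees C:
    print(f"{heat_in_place_twh(100.0, 500.0, 90.0):.1f} TWh of heat in place")
    ```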

  7. Resource utilization and costs during the initial years of lung cancer screening with computed tomography in Canada.

    Science.gov (United States)

    Cressman, Sonya; Lam, Stephen; Tammemagi, Martin C; Evans, William K; Leighl, Natasha B; Regier, Dean A; Bolbocean, Corneliu; Shepherd, Frances A; Tsao, Ming-Sound; Manos, Daria; Liu, Geoffrey; Atkar-Khattra, Sukhinder; Cromwell, Ian; Johnston, Michael R; Mayo, John R; McWilliams, Annette; Couture, Christian; English, John C; Goffin, John; Hwang, David M; Puksa, Serge; Roberts, Heidi; Tremblay, Alain; MacEachern, Paul; Burrowes, Paul; Bhatia, Rick; Finley, Richard J; Goss, Glenwood D; Nicholas, Garth; Seely, Jean M; Sekhon, Harmanjatinder S; Yee, John; Amjadi, Kayvan; Cutz, Jean-Claude; Ionescu, Diana N; Yasufuku, Kazuhiro; Martel, Simon; Soghrati, Kamyar; Sin, Don D; Tan, Wan C; Urbanski, Stefan; Xu, Zhaolin; Peacock, Stuart J

    2014-10-01

    It is estimated that millions of North Americans would qualify for lung cancer screening and that billions of dollars of national health expenditures would be required to support population-based computed tomography lung cancer screening programs. The decision to implement such programs should be informed by data on resource utilization and costs. Resource utilization data were collected prospectively from 2059 participants in the Pan-Canadian Early Detection of Lung Cancer Study using low-dose computed tomography (LDCT). Participants who had 2% or greater lung cancer risk over 3 years using a risk prediction tool were recruited from seven major cities across Canada. A cost analysis was conducted from the Canadian public payer's perspective for resources that were used for the screening and treatment of lung cancer in the initial years of the study. The average per-person cost for screening individuals with LDCT was $453 (95% confidence interval [CI], $400-$505) for the initial 18-months of screening following a baseline scan. The screening costs were highly dependent on the detected lung nodule size, presence of cancer, screening intervention, and the screening center. The mean per-person cost of treating lung cancer with curative surgery was $33,344 (95% CI, $31,553-$34,935) over 2 years. This was lower than the cost of treating advanced-stage lung cancer with chemotherapy, radiotherapy, or supportive care alone, ($47,792; 95% CI, $43,254-$52,200; p = 0.061). In the Pan-Canadian study, the average cost to screen individuals with a high risk for developing lung cancer using LDCT and the average initial cost of curative intent treatment were lower than the average per-person cost of treating advanced stage lung cancer which infrequently results in a cure.

  8. Resource-constrained project scheduling: computing lower bounds by solving minimum cut problems

    NARCIS (Netherlands)

    Möhring, R.H.; Nesetril, J.; Schulz, A.S.; Stork, F.; Uetz, Marc Jochen

    1999-01-01

    We present a novel approach to compute Lagrangian lower bounds on the objective function value of a wide class of resource-constrained project scheduling problems. The basis is a polynomial-time algorithm to solve the following scheduling problem: Given a set of activities with start-time dependent

  9. U.S. Hydropower Resource Assessment - California

    Energy Technology Data Exchange (ETDEWEB)

    A. M. Conner; B. N. Rinehart; J. E. Francfort

    1998-10-01

    The U.S. Department of Energy is developing an estimate of the underdeveloped hydropower potential in the United States. For this purpose, the Idaho National Engineering and Environmental Laboratory developed a computer model called Hydropower Evaluation Software (HES). HES measures the undeveloped hydropower resources available in the United States, using uniform criteria for measurement. The software was developed and tested using hydropower information and data provided by the Southwestern Power Administration. It is a menu-driven program that allows the personal computer user to assign environmental attributes to potential hydropower sites, calculate development suitability factors for each site based on the environmental attributes present, and generate reports based on these suitability factors. This report describes the resource assessment results for the State of California.
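
    The suitability-factor logic used by HES can be sketched roughly as follows; the attribute names and weights here are invented for illustration and are not the values HES assigns.

    ```python
    # Illustrative suitability calculation in the spirit of HES. Each environmental
    # attribute present at a site multiplies the suitability downward; the weights
    # below are invented, not the factors used by HES.
    ATTRIBUTE_WEIGHTS = {
        "wild_scenic_river": 0.1,
        "threatened_species": 0.5,
        "cultural_value": 0.75,
        "fish_presence": 0.75,
        "recreation_area": 0.9,
    }

    def suitability_factor(attributes):
        """Combine attribute penalties into a single development suitability factor (0-1)."""
        factor = 1.0
        for attribute in attributes:
            factor *= ATTRIBUTE_WEIGHTS.get(attribute, 1.0)
        return factor

    site = {"name": "Example Creek", "potential_mw": 12.4,
            "attributes": ["fish_presence", "recreation_area"]}
    factor = suitability_factor(site["attributes"])
    print(f"{site['name']}: suitability {factor:.2f}, adjusted potential "
          f"{site['potential_mw'] * factor:.1f} MW")
    ```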

  10. US Hydropower Resource Assessment for Massachusetts

    Energy Technology Data Exchange (ETDEWEB)

    Francfort, J.E.; Rinehart, B.N.

    1995-07-01

    The Department of Energy is developing an estimate of the undeveloped hydropower potential in the United States. The Hydropower Evaluation Software (HES) is a computer model that was developed by the Idaho National Engineering Laboratory for this purpose. The software measures the undeveloped hydropower resources available in the United States, using uniform criteria for measurement. The software was developed and tested using hydropower information and data provided by the Southwestern Power Administration. It is a menu-driven software program that allows the personal computer user to assign environmental attributes to potential hydropower sites, calculate development suitability factors for each site based on the environmental attributes present, and generate reports based on these suitability factors. This report details the resource assessment results for the Commonwealth of Massachusetts.

  11. US hydropower resource assessment for Texas

    Energy Technology Data Exchange (ETDEWEB)

    Francfort, J.E.

    1993-12-01

    The Department of Energy is developing an estimate of the hydropower development potential in this country. The Hydropower Evaluation Software (HES) is a computer model that was developed by the Idaho National Engineering Laboratory for this purpose. The HES measures the potential hydropower resources available in the United States, using uniform criteria for measurement. The software was developed and tested using hydropower information and data provided by the Southwestern Power Administration. It is a dBASE menu-driven software application that allows the personal computer user to assign environmental attributes to potential hydropower sites, calculate development suitability factors for each site based on the environmental attributes present, and generate reports based on these suitability factors. This report details the resource assessment results for the state of Texas.

  12. US hydropower resource assessment for Montana

    Energy Technology Data Exchange (ETDEWEB)

    Francfort, J.E.

    1993-12-01

    The Department of Energy is developing an estimate of the hydropower development potential in this country. The Hydropower Evaluation Software (HES) is a computer model that was developed by the Idaho National Engineering Laboratory for this purpose. The HES measures the potential hydropower resources available in the United States, using uniform criteria for measurement. The software was developed and tested using hydropower information and data provided by the Southwestern Power Administration. It is a dBASE menu-driven software application that allows the personal computer user to assign environmental attributes to potential hydropower sites, calculate development suitability factors for each site based on the environmental attributes present, and generate reports based on these suitability factors. This report details the resource assessment results for the state of Montana.

  13. US hydropower resource assessment for Kansas

    Energy Technology Data Exchange (ETDEWEB)

    Francfort, J.E.

    1993-12-01

    The Department of Energy is developing an estimate of the hydropower development potential in this country. The Hydropower Evaluation Software (HES) is a computer model that was developed by the Idaho National Engineering Laboratory for this purpose. The HES measures the potential hydropower resources available in the United States, using uniform criteria for measurement. The software was developed and tested using hydropower information and data provided by the Southwestern Power Administration. It is a dBASE menu-driven software application that allows the personal computer user to assign environmental attributes to potential hydropower sites, calculate development suitability factors for each site based on the environmental attributes present, and generate reports based on these suitability factors. This report details the resource assessment results for the state of Kansas.

  14. One-way quantum computing in superconducting circuits

    Science.gov (United States)

    Albarrán-Arriagada, F.; Alvarado Barrios, G.; Sanz, M.; Romero, G.; Lamata, L.; Retamal, J. C.; Solano, E.

    2018-03-01

    We propose a method for the implementation of one-way quantum computing in superconducting circuits. Measurement-based quantum computing is a universal quantum computation paradigm in which an initial cluster state provides the quantum resource, while the iteration of sequential measurements and local rotations encodes the quantum algorithm. Up to now, technical constraints have limited a scalable approach to this quantum computing alternative. The initial cluster state can be generated with available controlled-phase gates, while the quantum algorithm makes use of high-fidelity readout and coherent feedforward. With current technology, we estimate that quantum algorithms with above 20 qubits may be implemented in the path toward quantum supremacy. Moreover, we propose an alternative initial state with properties of maximal persistence and maximal connectedness, reducing the required resources of one-way quantum computing protocols.

  15. Model calibration and parameter estimation for environmental and water resource systems

    CERN Document Server

    Sun, Ne-Zheng

    2015-01-01

    This three-part book provides a comprehensive and systematic introduction to the development of useful models for complex systems. Part 1 covers the classical inverse problem for parameter estimation in both deterministic and statistical frameworks, Part 2 is dedicated to system identification, hyperparameter estimation, and model dimension reduction, and Part 3 considers how to collect data and construct reliable models for prediction and decision-making. For the first time, topics such as multiscale inversion, stochastic field parameterization, level set method, machine learning, global sensitivity analysis, data assimilation, model uncertainty quantification, robust design, and goal-oriented modeling, are systematically described and summarized in a single book from the perspective of model inversion, and elucidated with numerical examples from environmental and water resources modeling. Readers of this book will not only learn basic concepts and methods for simple parameter estimation, but also get famili...

  16. A Safety Resource Allocation Mechanism against Connection Fault for Vehicular Cloud Computing

    Directory of Open Access Journals (Sweden)

    Tianpeng Ye

    2016-01-01

    The Intelligent Transportation System (ITS) becomes an important component of the smart city toward safer roads, better traffic control, and on-demand service by utilizing and processing the information collected from sensors of vehicles and road side infrastructure. In ITS, Vehicular Cloud Computing (VCC) is a novel technology balancing the requirement of complex services and the limited capability of on-board computers. However, the behaviors of the vehicles in VCC are dynamic, random, and complex. Thus, one of the key safety issues is the frequent disconnection between the vehicle and the Vehicular Cloud (VC) when this vehicle is computing for a service. More importantly, the connection fault will seriously disturb the normal services of VCC and impact the safe operation of the transportation system. In this paper, a safety resource allocation mechanism is proposed against connection fault in VCC by using a modified workflow with prediction capability. We first propose the probability model for the vehicle movement which satisfies the high dynamics and real-time requirements of VCC. We then propose a Prediction-based Reliability Maximization Algorithm (PRMA) to realize the safety resource allocation for VCC. The evaluation shows that our mechanism can improve the reliability and guarantee the real-time performance of the VCC.

  17. Elastic Extension of a CMS Computing Centre Resources on External Clouds

    Science.gov (United States)

    Codispoti, G.; Di Maria, R.; Aiftimiei, C.; Bonacorsi, D.; Calligola, P.; Ciaschini, V.; Costantini, A.; Dal Pra, S.; DeGirolamo, D.; Grandi, C.; Michelotto, D.; Panella, M.; Peco, G.; Sapunenko, V.; Sgaravatto, M.; Taneja, S.; Zizzi, G.

    2016-10-01

    After the successful LHC data taking in Run-I and in view of the future runs, the LHC experiments are facing new challenges in the design and operation of the computing facilities. The computing infrastructure for Run-II is dimensioned to cope at most with the average amount of data recorded. The usage peaks, as already observed in Run-I, may however originate large backlogs, thus delaying the completion of the data reconstruction and ultimately the data availability for physics analysis. In order to cope with the production peaks, CMS - along the lines followed by other LHC experiments - is exploring the opportunity to access Cloud resources provided by external partners or commercial providers. Specific use cases have already been explored and successfully exploited during Long Shutdown 1 (LS1) and the first part of Run 2. In this work we present the proof of concept of the elastic extension of a CMS site, specifically the Bologna Tier-3, on an external OpenStack infrastructure. We focus on the “Cloud Bursting” of a CMS Grid site using a newly designed LSF configuration that allows the dynamic registration of new worker nodes to LSF. In this approach, the dynamically added worker nodes instantiated on the OpenStack infrastructure are transparently accessed by the LHC Grid tools and at the same time they serve as an extension of the farm for local usage. The amount of resources allocated can thus be elastically adjusted to cope with the needs of the CMS experiment and local users. Moreover, a direct access/integration of OpenStack resources to the CMS workload management system is explored. In this paper we present this approach, we report on the performance of the on-demand allocated resources, and we discuss the lessons learned and the next steps.

  18. EXTRAN: A computer code for estimating concentrations of toxic substances at control room air intakes

    International Nuclear Information System (INIS)

    Ramsdell, J.V.

    1991-03-01

    This report presents the NRC staff with a tool for assessing the potential effects of accidental releases of radioactive materials and toxic substances on habitability of nuclear facility control rooms. The tool is a computer code that estimates concentrations at nuclear facility control room air intakes given information about the release and the environmental conditions. The name of the computer code is EXTRAN. EXTRAN combines procedures for estimating the amount of airborne material, a Gaussian puff dispersion model, and the most recent algorithms for estimating diffusion coefficients in building wakes. It is a modular computer code, written in FORTRAN-77, that runs on personal computers. It uses a math coprocessor, if present, but does not require one. Code output may be directed to a printer or disk files. 25 refs., 8 figs., 4 tabs
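
    For orientation, the core of a Gaussian puff dispersion calculation of the kind EXTRAN performs is sketched below; this is the textbook puff equation with ground reflection, not the EXTRAN code, and the diffusion coefficients are illustrative rather than the building-wake values the report describes.

    ```python
    import numpy as np

    def puff_concentration(q, u, t, x, y, z, sigma_x, sigma_y, sigma_z, h=0.0):
        """Instantaneous Gaussian puff concentration with ground reflection.

        q: released mass, u: wind speed (the puff centre travels to x = u*t),
        h: effective release height; the sigmas are the puff spreads at time t.
        """
        norm = q / ((2.0 * np.pi) ** 1.5 * sigma_x * sigma_y * sigma_z)
        along = np.exp(-((x - u * t) ** 2) / (2.0 * sigma_x ** 2))
        cross = np.exp(-(y ** 2) / (2.0 * sigma_y ** 2))
        vert = (np.exp(-((z - h) ** 2) / (2.0 * sigma_z ** 2))
                + np.exp(-((z + h) ** 2) / (2.0 * sigma_z ** 2)))
        return norm * along * cross * vert

    # 1 kg instantaneous release, 2 m/s wind; concentration at an intake 100 m downwind,
    # 50 s after release, with illustrative (not EXTRAN's) diffusion coefficients.
    print(puff_concentration(q=1.0, u=2.0, t=50.0, x=100.0, y=0.0, z=1.5,
                             sigma_x=8.0, sigma_y=8.0, sigma_z=4.0, h=10.0), "kg/m^3")
    ```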

  19. Precision Parameter Estimation and Machine Learning

    Science.gov (United States)

    Wandelt, Benjamin D.

    2008-12-01

    I discuss the strategy of "Acceleration by Parallel Precomputation and Learning" (APPLe) that can vastly accelerate parameter estimation in high-dimensional parameter spaces and costly likelihood functions, using trivially parallel computing to speed up sequential exploration of parameter space. This strategy combines the power of distributed computing with machine learning and Markov-Chain Monte Carlo techniques efficiently to explore a likelihood function, posterior distribution or χ²-surface. This strategy is particularly successful in cases where computing the likelihood is costly and the number of parameters is moderate or large. We apply this technique to two central problems in cosmology: the solution of the cosmological parameter estimation problem with sufficient accuracy for the Planck data using PICo; and the detailed calculation of cosmological helium and hydrogen recombination with RICO. Since the APPLe approach is designed to be able to use massively parallel resources to speed up problems that are inherently serial, we can bring the power of distributed computing to bear on parameter estimation problems. We have demonstrated this with the CosmologyatHome project.

  20. Resource Constrained Planning of Multiple Projects with Separable Activities

    Science.gov (United States)

    Fujii, Susumu; Morita, Hiroshi; Kanawa, Takuya

    In this study we consider a resource constrained planning problem of multiple projects with separable activities. This problem provides a plan to process the activities considering resource availability with time windows. We propose a solution algorithm based on the branch and bound method to obtain the optimal solution minimizing the completion time of all projects. We develop three methods for improving computational efficiency: obtaining an initial solution with the minimum slack time rule, estimating a lower bound considering both time and resource constraints, and introducing an equivalence relation for the bounding operation. The effectiveness of the proposed methods is demonstrated by numerical examples. Especially as the number of planning projects increases, the average computational time and the number of searched nodes are reduced.
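
    A generic sketch of a lower bound of the kind mentioned above, combining a precedence (critical-path) bound with a resource (total-work) bound; the activity data are invented and the bound is not the authors' exact estimate.

    ```python
    import math

    # Activities: name -> (duration, resource demand, predecessors); data invented.
    activities = {
        "A": (3, 2, []),
        "B": (4, 1, ["A"]),
        "C": (2, 3, ["A"]),
        "D": (5, 2, ["B", "C"]),
    }
    RESOURCE_CAPACITY = 4

    def critical_path_length(acts):
        """Time-based bound: longest duration chain through the precedence graph."""
        finish = {}
        def earliest_finish(name):
            if name not in finish:
                duration, _, preds = acts[name]
                finish[name] = duration + max((earliest_finish(p) for p in preds), default=0)
            return finish[name]
        return max(earliest_finish(a) for a in acts)

    def resource_bound(acts, capacity):
        """Resource-based bound: total work divided by the per-period capacity."""
        return math.ceil(sum(d * r for d, r, _ in acts.values()) / capacity)

    lower_bound = max(critical_path_length(activities),
                      resource_bound(activities, RESOURCE_CAPACITY))
    print("lower bound on project completion time:", lower_bound)
    ```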

  1. Fast covariance estimation for innovations computed from a spatial Gibbs point process

    DEFF Research Database (Denmark)

    Coeurjolly, Jean-Francois; Rubak, Ege

    In this paper, we derive an exact formula for the covariance of two innovations computed from a spatial Gibbs point process and suggest a fast method for estimating this covariance. We show how this methodology can be used to estimate the asymptotic covariance matrix of the maximum pseudo...

  2. Development of computer program for estimating decommissioning cost - 59037

    International Nuclear Information System (INIS)

    Kim, Hak-Soo; Park, Jong-Kil

    2012-01-01

    The programs for estimating the decommissioning cost have been developed for many different purposes and applications. The estimation of decommissioning cost requires a large amount of data, such as unit cost factors, plant area and its inventory, waste treatment, etc. These make it difficult to use manual calculation or typical spreadsheet software such as Microsoft Excel. The cost estimation for eventual decommissioning of nuclear power plants is a prerequisite for safe, timely and cost-effective decommissioning. To estimate the decommissioning cost more accurately and systematically, KHNP, Korea Hydro and Nuclear Power Co. Ltd, developed a decommissioning cost estimating computer program called 'DeCAT-Pro', which stands for Decommissioning Cost Assessment Tool - Professional (hereinafter called 'DeCAT'). This program allows users to easily assess the decommissioning cost with various decommissioning options. Also, this program provides detailed reporting for decommissioning funding requirements as well as detailed project schedules, cash-flow, staffing plans and levels, and waste volumes by waste classifications and types. KHNP is planning to implement functions for estimating the plant inventory using 3-D technology and for classifying the conditions of radwaste disposal and transportation automatically. (authors)

  3. PredMP: A Web Resource for Computationally Predicted Membrane Proteins via Deep Learning

    KAUST Repository

    Wang, Sheng; Fei, Shiyang; Zongan, Wang; Li, Yu; Zhao, Feng; Gao, Xin

    2018-01-01

    structures in Protein Data Bank (PDB). To elucidate the MP structures computationally, we developed a novel web resource, denoted as PredMP (http://52.87.130.56:3001/#/proteinindex), that delivers one-dimensional (1D) annotation of the membrane topology

  4. How accurate are adolescents in portion-size estimation using the computer tool Young Adolescents' Nutrition Assessment on Computer (YANA-C)?

    Science.gov (United States)

    Vereecken, Carine; Dohogne, Sophie; Covents, Marc; Maes, Lea

    2010-06-01

    Computer-administered questionnaires have received increased attention for large-scale population research on nutrition. In Belgium-Flanders, Young Adolescents' Nutrition Assessment on Computer (YANA-C) has been developed. In this tool, standardised photographs are available to assist in portion-size estimation. The purpose of the present study is to assess how accurate adolescents are in estimating portion sizes of food using YANA-C. A convenience sample, aged 11-17 years, estimated the amounts of ten commonly consumed foods (breakfast cereals, French fries, pasta, rice, apple sauce, carrots and peas, crisps, creamy velouté, red cabbage, and peas). Two procedures were followed: (1) short-term recall: adolescents (n 73) self-served their usual portions of the ten foods and estimated the amounts later the same day; (2) real-time perception: adolescents (n 128) estimated two sets (different portions) of pre-weighed portions displayed near the computer. Self-served portions were, on average, 8 % underestimated; significant underestimates were found for breakfast cereals, French fries, peas, and carrots and peas. Spearman's correlations between the self-served and estimated weights varied between 0.51 and 0.84, with an average of 0.72. The kappa statistics were moderate (>0.4) for all but one item. Pre-weighed portions were, on average, 15 % underestimated, with significant underestimates for fourteen of the twenty portions. Photographs of food items can serve as a good aid in ranking subjects; however, to assess the actual intake at a group level, underestimation must be considered.

  5. HEDPIN: a computer program to estimate pinwise power density

    International Nuclear Information System (INIS)

    Cappiello, M.W.

    1976-05-01

    A description is given of the digital computer program, HEDPIN. This program, modeled after a previously developed program, POWPIN, provides a means of estimating the pinwise power density distribution in fast reactor triangular pitched pin bundles. The capability also exists for computing any reaction rate of interest at the respective pin positions within an assembly. HEDPIN was developed in support of FTR fuel and test management as well as fast reactor core design and core characterization planning and analysis. The results of a test devised to check out HEDPIN's computational method are given, and the realm of application is discussed. Nearly all programming is in FORTRAN IV. Variable dimensioning is employed to make efficient use of core memory and maintain short running time for small problems. Input instructions, sample problem, and a program listing are also given

  6. Open Educational Resources: The Role of OCW, Blogs and Videos in Computer Networks Classroom

    Directory of Open Access Journals (Sweden)

    Pablo Gil

    2012-09-01

    This paper analyzes the learning experiences and opinions obtained from a group of undergraduate students in their interaction with several on-line multimedia resources included in a free on-line course about Computer Networks. The new educational resources employed are based on the Web 2.0 approach, such as blogs, videos and virtual labs, which have been added to a website for distance self-learning.

  7. Monitoring of Computing Resource Use of Active Software Releases in ATLAS

    CERN Document Server

    Limosani, Antonio; The ATLAS collaboration

    2016-01-01

    The LHC is the world's most powerful particle accelerator, colliding protons at a centre-of-mass energy of 13 TeV. As the energy and frequency of collisions have grown in the search for new physics, so too has demand for the computing resources needed for event reconstruction. We will report on the evolution of resource usage in terms of CPU and RAM in key ATLAS offline reconstruction workflows at the Tier0 at CERN and on the WLCG. Monitoring of workflows is achieved using the ATLAS PerfMon package, which is the standard ATLAS performance monitoring system running inside Athena jobs. Systematic daily monitoring has recently been expanded to include all workflows beginning at Monte Carlo generation through to end user physics analysis, beyond that of event reconstruction. Moreover, the move to a multiprocessor mode in production jobs has facilitated the use of tools, such as "MemoryMonitor", to measure the memory shared across processors in jobs. Resource consumption is broken down into software domains and displayed...

  8. High-order computer-assisted estimates of topological entropy

    Science.gov (United States)

    Grote, Johannes

    The concept of Taylor Models is introduced, which offers highly accurate C⁰-estimates for the enclosures of functional dependencies, combining high-order Taylor polynomial approximation of functions and rigorous estimates of the truncation error, performed using verified interval arithmetic. The focus of this work is on the application of Taylor Models in algorithms for strongly nonlinear dynamical systems. A method to obtain sharp rigorous enclosures of Poincaré maps for certain types of flows and surfaces is developed and numerical examples are presented. Differential algebraic techniques allow the efficient and accurate computation of polynomial approximations for invariant curves of certain planar maps around hyperbolic fixed points. Subsequently we introduce a procedure to extend these polynomial curves to verified Taylor Model enclosures of local invariant manifolds with C⁰-errors of size 10⁻¹⁰ to 10⁻¹⁴, and proceed to generate the global invariant manifold tangle up to comparable accuracy through iteration in Taylor Model arithmetic. Knowledge of the global manifold structure up to finite iterations of the local manifold pieces enables us to find all homoclinic and heteroclinic intersections in the generated manifold tangle. Combined with the mapping properties of the homoclinic points and their ordering we are able to construct a subshift of finite type as a topological factor of the original planar system to obtain rigorous lower bounds for its topological entropy. This construction is fully automatic and yields homoclinic tangles with several hundred homoclinic points. As an example rigorous lower bounds for the topological entropy of the Hénon map are computed, which to the best knowledge of the authors yield the largest such estimates published so far.

  9. Computational Error Estimate for the Power Series Solution of Odes ...

    African Journals Online (AJOL)

    This paper compares the error estimation of the power series solution with the recursive Tau method for solving ordinary differential equations. From the computational viewpoint, the power series using zeros of the Chebyshev polynomial is effective, accurate and easy to use. Keywords: Lanczos Tau method, Chebyshev polynomial, ...

  10. Radiotherapy infrastructure and human resources in Switzerland : Present status and projected computations for 2020.

    Science.gov (United States)

    Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar; Zwahlen, Daniel; Bodis, Stephan

    2016-09-01

    The purpose of this study was to evaluate the present status of radiotherapy infrastructure and human resources in Switzerland and compute projections for 2020. The European Society of Therapeutic Radiation Oncology "Quantification of Radiation Therapy Infrastructure and Staffing" guidelines (ESTRO-QUARTS) and those of the International Atomic Energy Agency (IAEA) were applied to estimate the requirements for teleradiotherapy (TRT) units, radiation oncologists (RO), medical physicists (MP) and radiotherapy technologists (RTT). The databases used for computation of the present gap and additional requirements are (a) Global Cancer Incidence, Mortality and Prevalence (GLOBOCAN) for cancer incidence (b) the Directory of Radiotherapy Centres (DIRAC) of the IAEA for existing TRT units (c) human resources from the recent ESTRO "Health Economics in Radiation Oncology" (HERO) survey and (d) radiotherapy utilization (RTU) rates for each tumour site, published by the Ingham Institute for Applied Medical Research (IIAMR). In 2015, 30,999 of 45,903 cancer patients would have required radiotherapy. By 2020, this will have increased to 34,041 of 50,427 cancer patients. Switzerland presently has an adequate number of TRTs, but a deficit of 57 ROs, 14 MPs and 36 RTTs. By 2020, an additional 7 TRTs, 72 ROs, 22 MPs and 66 RTTs will be required. In addition, a realistic dynamic model for calculation of staff requirements due to anticipated changes in future radiotherapy practices has been proposed. This model could be tailor-made and individualized for any radiotherapy centre. A 9.8 % increase in radiotherapy requirements is expected for cancer patients over the next 5 years. The present study should assist the stakeholders and health planners in designing an appropriate strategy for meeting future radiotherapy needs for Switzerland.

  11. On the possibility of non-invasive multilayer temperature estimation using soft-computing methods.

    Science.gov (United States)

    Teixeira, C A; Pereira, W C A; Ruano, A E; Ruano, M Graça

    2010-01-01

    This work reports original results on the possibility of non-invasive temperature estimation (NITE) in a multilayered phantom by applying soft-computing methods. The existence of reliable non-invasive temperature estimator models would improve the safety and efficacy of thermal therapies. These points would lead to a broader acceptance of this kind of therapy. Several approaches based on medical imaging technologies were proposed, with magnetic resonance imaging (MRI) identified as the only one achieving acceptable temperature resolution for hyperthermia purposes. However, MRI intrinsic characteristics (e.g., high instrumentation cost) lead us to use backscattered ultrasound (BSU). Among the different BSU features, temporal echo-shifts have received major attention. These shifts are due to changes of speed-of-sound and expansion of the medium. The originality of this work involves two aspects: the estimator model itself is original (based on soft-computing methods) and the application to temperature estimation in a three-layer phantom is also not reported in the literature. In this work a three-layer (non-homogeneous) phantom was developed. The two external layers were composed of (in % of weight): 86.5% degassed water, 11% glycerin and 2.5% agar-agar. The intermediate layer was obtained by adding graphite powder in the amount of 2% of the water weight to the above composition. The phantom was developed to have attenuation and speed-of-sound similar to in vivo muscle, according to the literature. BSU signals were collected and cumulative temporal echo-shifts computed. These shifts and the past temperature values were then considered as possible estimator inputs. A soft-computing methodology was applied to look for appropriate multilayered temperature estimators. The methodology involves radial-basis function neural networks (RBFNN) with structure optimized by the multi-objective genetic algorithm (MOGA). In this work 40 operating conditions were
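
    A bare-bones radial-basis-function regressor of the kind referred to above can be sketched as follows, with fixed random centres and a least-squares output layer; the synthetic echo-shift features are invented, and the MOGA structure optimization is not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def rbf_design(x, centres, width):
        """Gaussian radial-basis design matrix with a bias column."""
        dist2 = ((x[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
        phi = np.exp(-dist2 / (2.0 * width ** 2))
        return np.hstack([phi, np.ones((len(x), 1))])

    # Toy data: "temperature" as a nonlinear function of two echo-shift features.
    x = rng.uniform(-1.0, 1.0, size=(300, 2))
    y = 37.0 + 3.0 * np.sin(2.0 * x[:, 0]) + 2.0 * x[:, 1] ** 2 + 0.1 * rng.standard_normal(300)

    centres = x[rng.choice(len(x), size=15, replace=False)]    # fixed random centres
    phi = rbf_design(x, centres, width=0.5)
    weights, *_ = np.linalg.lstsq(phi, y, rcond=None)          # output layer by least squares

    pred = rbf_design(x, centres, width=0.5) @ weights
    print("training RMSE:", float(np.sqrt(np.mean((pred - y) ** 2))))
    ```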

  12. Methods for the estimation and economic evaluation of undiscovered uranium endowment and resources

    International Nuclear Information System (INIS)

    1992-01-01

    The present Instruction Manual was prepared as part of a programme of the International Atomic Energy Agency to supply the international uranium community with standard guides for a number of topics related to uranium resource assessment and supply. The quantitative estimation of undiscovered resources and endowments aims at supplying data on potential mineral resources; these data are needed to compare long term projections with one another and to assess the mineral supplies to be obtained from elsewhere. These objectives have relatively recently been supplemented by the concern of land managers and national policy planners to assess the potential of certain lands before the constitution of national parks and other areas reserved from mineral exploration and development. 88 refs, 28 figs, 33 tabs

  13. Integrated State Estimation and Contingency Analysis Software Implementation using High Performance Computing Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt R.; Rice, Mark J.; Huang, Zhenyu

    2015-12-31

    Power system simulation tools are traditionally developed in sequential mode and codes are optimized for single core computing only. However, the increasing complexity in the power grid models requires more intensive computation. The traditional simulation tools will soon not be able to meet the grid operation requirements. Therefore, power system simulation tools need to evolve accordingly to provide faster and better results for grid operations. This paper presents an integrated state estimation and contingency analysis software implementation using high performance computing techniques. The software is able to solve large size state estimation problems within one second and achieve a near-linear speedup of 9,800 with 10,000 cores for contingency analysis application. The performance evaluation is presented to show its effectiveness.
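
    For context, the state-estimation step that such tools accelerate is, at its core, a weighted least-squares solve; a tiny dense DC-model sketch follows (illustrative only, not the parallel implementation described in the record).

    ```python
    import numpy as np

    # Weighted least-squares state estimation for a linear (DC) measurement model
    # z = H x + e. The 3-bus example below is illustrative only.
    H = np.array([[ 1.0, -1.0,  0.0],    # flow measurement on line 1-2
                  [ 0.0,  1.0, -1.0],    # flow measurement on line 2-3
                  [ 1.0,  0.0, -1.0],    # flow measurement on line 1-3
                  [ 1.0,  0.0,  0.0]])   # angle reference at bus 1
    z = np.array([0.30, 0.18, 0.49, 0.0])                         # measurements
    W = np.diag(1.0 / np.array([0.01, 0.01, 0.02, 1e-4]) ** 2)    # inverse-variance weights

    # Normal equations: (H^T W H) x = H^T W z
    gain = H.T @ W @ H
    x_hat = np.linalg.solve(gain, H.T @ W @ z)
    residuals = z - H @ x_hat
    print("estimated bus angles:", np.round(x_hat, 4))
    print("weighted residual norm:", float(residuals @ W @ residuals))
    ```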

  14. U.S. hydropower resource assessment for Nevada

    Energy Technology Data Exchange (ETDEWEB)

    Conner, A.M.; Francfort, J.E.

    1997-10-01

    The US Department of Energy is developing an estimate of the undeveloped hydropower potential in the US. The Hydropower Evaluation Software (HES) is a computer model that was developed by the Idaho National Engineering Laboratory for this purpose. HES measures the undeveloped hydropower resources available in the US, using uniform criteria for measurement. The software was developed and tested using hydropower information and data provided by the Southwestern Power Administration. It is a menu-driven program that allows the personal computer user to assign environmental attributes to potential hydropower sites, calculate development suitability factors for each site based on the environmental attributes present, and generate reports based on these suitability factors. This report describes the resource assessment results for the State of Nevada.

  15. U.S. hydropower resource assessment for Idaho

    Energy Technology Data Exchange (ETDEWEB)

    Conner, A.M.; Francfort, J.E.

    1998-08-01

    The US Department of Energy is developing an estimate of the undeveloped hydropower potential in the US. The Hydropower Evaluation Software (HES) is a computer model that was developed by the Idaho National Engineering and Environmental Laboratory for this purpose. HES measures the undeveloped hydropower resources available in the US, using uniform criteria for measurement. The software was developed and tested using hydropower information and data provided by the Southwestern Power Administration. It is a menu-driven program that allows the personal computer user to assign environmental attributes to potential hydropower sites, calculate development suitability factors for each site based on the environmental attributes present, and generate reports based on these suitability factors. This report describes the resource assessment results for the State of Idaho.

  16. US hydropower resource assessment for New Jersey

    Energy Technology Data Exchange (ETDEWEB)

    Connor, A.M.; Francfort, J.E.

    1996-03-01

    The Department of Energy is developing an estimate of the undeveloped hydropower potential in this country. The Hydropower Evaluation Software is a computer model that was developed by the Idaho National Engineering Laboratory for this purpose. The software measures the undeveloped hydropower resources available in the United States, using uniform criteria for measurement. The software was developed and tested using hydropower information and data provided by the Southwestern Power Administration. It is a menu-driven software program that allows the personal computer user to assign environmental attributes to potential hydropower sites, calculate development suitability factors for each site based on the environmental attributes present, and generate reports based on these suitability factors. This report details the resource assessment results for the State of New Jersey.

  17. Computational cost estimates for parallel shared memory isogeometric multi-frontal solvers

    KAUST Repository

    Woźniak, Maciej; Kuźnik, Krzysztof M.; Paszyński, Maciej R.; Calo, Victor M.; Pardo, D.

    2014-01-01

    In this paper we present computational cost estimates for parallel shared memory isogeometric multi-frontal solvers. The estimates show that the ideal isogeometric shared memory parallel direct solver scales as O(p^2 log(N/p)) for one dimensional problems, O(N p^2) for two dimensional problems, and O(N^{4/3} p^2) for three dimensional problems, where N is the number of degrees of freedom and p is the polynomial order of approximation. The computational costs of the shared memory parallel isogeometric direct solver are compared with those corresponding to the sequential isogeometric direct solver, the latter being equal to O(N p^2) for the one dimensional case, O(N^{1.5} p^3) for the two dimensional case, and O(N^2 p^3) for the three dimensional case. The shared memory version significantly reduces the cost scaling in terms of both N and p. Theoretical estimates are compared with numerical experiments performed with linear, quadratic, cubic, quartic, and quintic B-splines, in one and two spatial dimensions. © 2014 Elsevier Ltd. All rights reserved.

  18. Computational cost estimates for parallel shared memory isogeometric multi-frontal solvers

    KAUST Repository

    Woźniak, Maciej

    2014-06-01

    In this paper we present computational cost estimates for parallel shared memory isogeometric multi-frontal solvers. The estimates show that the ideal isogeometric shared memory parallel direct solver scales as O(p^2 log(N/p)) for one dimensional problems, O(N p^2) for two dimensional problems, and O(N^{4/3} p^2) for three dimensional problems, where N is the number of degrees of freedom and p is the polynomial order of approximation. The computational costs of the shared memory parallel isogeometric direct solver are compared with those corresponding to the sequential isogeometric direct solver, the latter being equal to O(N p^2) for the one dimensional case, O(N^{1.5} p^3) for the two dimensional case, and O(N^2 p^3) for the three dimensional case. The shared memory version significantly reduces the cost scaling in terms of both N and p. Theoretical estimates are compared with numerical experiments performed with linear, quadratic, cubic, quartic, and quintic B-splines, in one and two spatial dimensions. © 2014 Elsevier Ltd. All rights reserved.
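
    The asymptotic estimates quoted in these two records can be tabulated directly; the sketch below simply evaluates them for a few problem sizes (constants are dropped, so only relative growth is meaningful).

    ```python
    import math

    def costs(N, p):
        """Quoted asymptotic costs for the parallel and sequential solvers."""
        parallel = {"1D": p**2 * math.log(N / p),
                    "2D": N * p**2,
                    "3D": N**(4.0 / 3.0) * p**2}
        sequential = {"1D": N * p**2,
                      "2D": N**1.5 * p**3,
                      "3D": N**2 * p**3}
        return parallel, sequential

    for N in (10**4, 10**6):
        parallel, sequential = costs(N, p=3)
        for dim in ("1D", "2D", "3D"):
            ratio = sequential[dim] / parallel[dim]
            print(f"N={N:>8}, p=3, {dim}: sequential/parallel cost ratio {ratio:12.1f}")
    ```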

  19. Australian uranium resources

    International Nuclear Information System (INIS)

    Battey, G.C.; Miezitis, Y.; McKay, A.D.

    1987-01-01

    Australia's uranium resources amount to 29% of the WOCA countries' (world outside centrally-planned-economies areas) low-cost Reasonably Assured Resources and 28% of the WOCA countries' low-cost Estimated Additional Resources. As at 1 January 1986, the Bureau of Mineral Resources estimated Australia's uranium resources as: (1) cost range to US$80/kg U - Reasonably Assured Resources, 465 000 t U; Estimated Additional Resources, 256 000 t U; (2) cost range US$80-130/kg U - Reasonably Assured Resources, 56 000 t U; Estimated Additional Resources, 127 000 t U. Most resources are contained in Proterozoic unconformity-related deposits in the Alligator Rivers uranium field in the Northern Territory (Jabiluka, Ranger, Koongarra, Nabarlek deposits) and the Proterozoic stratabound deposit at Olympic Dam on the Stuart Shelf in South Australia.

  20. Estimating resource costs of compliance with EU WFD ecological status requirements at the river basin scale

    DEFF Research Database (Denmark)

    Riegels, Niels; Jensen, Roar; Benasson, Lisa

    2011-01-01

    Resource costs of meeting EU WFD ecological status requirements at the river basin scale are estimated by comparing net benefits of water use given ecological status constraints to baseline water use values. Resource costs are interpreted as opportunity costs of water use arising from water scarcity. An optimization approach is used to identify economically efficient ways to meet WFD requirements. The approach is implemented using a river basin simulation model coupled to an economic post-processor; the simulation model and post-processor are run from a central controller that iterates until an allocation is found that maximizes net benefits given WFD requirements. Water use values are estimated for urban/domestic, agricultural, industrial, livestock, and tourism water users. Ecological status is estimated using metrics that relate average monthly river flow volumes to the natural hydrologic regime...

  1. An Estimate of Shallow, Low-Temperature Geothermal Resources of the United States: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Mullane, Michelle; Gleason, Michael; McCabe, Kevin; Mooney, Meghan; Reber, Timothy; Young, Katherine R.

    2016-10-01

    Low-temperature geothermal resources in the United States potentially hold an enormous quantity of thermal energy, useful for direct use in residential, commercial and industrial applications such as space and water heating, greenhouse warming, pool heating, aquaculture, and low-temperature manufacturing processes. Several studies published over the past 40 years have provided assessments of the resource potential for multiple types of low-temperature geothermal systems (e.g. hydrothermal convection, hydrothermal conduction, and enhanced geothermal systems) with varying temperature ranges and depths. This paper provides a summary and additional analysis of these assessments of shallow (≤ 3 km), low-temperature (30-150 degrees C) geothermal resources in the United States, suitable for use in direct-use applications. This analysis considers six types of geothermal systems, spanning both hydrothermal and enhanced geothermal systems (EGS). We outline the primary data sources and quantitative parameters used to describe resources in each of these categories, and present summary statistics of the total resources available. In sum, we find that low-temperature hydrothermal resources and EGS resources contain approximately 8 million and 800 million TWh of heat-in-place, respectively. In future work, these resource potential estimates will be used for modeling of the technical and market potential for direct-use geothermal applications for the U.S. Department of Energy's Geothermal Vision Study.

  2. A distributed approach for parameters estimation in System Biology models

    International Nuclear Information System (INIS)

    Mosca, E.; Merelli, I.; Alfieri, R.; Milanesi, L.

    2009-01-01

    Due to the lack of experimental measurements, biological variability and experimental errors, the value of many parameters of systems biology mathematical models is yet unknown or uncertain. A possible computational solution is parameter estimation, that is, the identification of the parameter values that give the best model fit with respect to experimental data. We have developed an environment to distribute each run of the parameter estimation algorithm on a different computational resource. The key feature of the implementation is a relational database that allows the user to swap the candidate solutions among the working nodes during the computations. The comparison of the distributed implementation with the parallel one showed that the presented approach enables a faster and better parameter estimation of systems biology models.

  3. Resource allocation in grid computing

    NARCIS (Netherlands)

    Koole, Ger; Righter, Rhonda

    2007-01-01

    Grid computing, in which a network of computers is integrated to create a very fast virtual computer, is becoming ever more prevalent. Examples include the TeraGrid and Planet-lab.org, as well as applications on the existing Internet that take advantage of unused computing and storage capacity of

  4. A confirmatory investigation of a job demands-resources model using a categorical estimator.

    Science.gov (United States)

    de Beer, Leon; Rothmann, Sebastiaan; Pienaar, Jaco

    2012-10-01

    A confirmatory investigation of a job demands-resources model was conducted with alternative methods, in a sample of 15,633 working adults aggregated from various economic sectors. The proposed model is in line with job demands-resources theory and assumes two psychological processes at work which are collectively coined "the dual process." The first, the energetic process, posits that job demands lead to ill-health outcomes through burnout. The second, the motivational process, indicates that job resources lead to organizational commitment through work engagement. Structural equation modelling analyses were implemented with a categorical estimator. Mediation analyses of each of the processes included bootstrapped indirect effects and kappa-squared values to apply qualitative labels to effect sizes. The relationship between job resources and organizational commitment was mediated by engagement with a large effect. The relationship between job demands and ill-health was mediated by burnout with a medium effect. The implications of the results for theory and practice were discussed.

  5. Rational use of cognitive resources: levels of analysis between the computational and the algorithmic.

    Science.gov (United States)

    Griffiths, Thomas L; Lieder, Falk; Goodman, Noah D

    2015-04-01

    Marr's levels of analysis (computational, algorithmic, and implementation) have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the notion of rationality, often used in defining computational-level models, deeper toward the algorithmic level. We offer a simple recipe for reverse-engineering the mind's cognitive strategies by deriving optimal algorithms for a series of increasingly more realistic abstract computational architectures, which we call "resource-rational analysis." Copyright © 2015 Cognitive Science Society, Inc.

  6. Estimation of crop water requirements using remote sensing for operational water resources management

    Science.gov (United States)

    Vasiliades, Lampros; Spiliotopoulos, Marios; Tzabiras, John; Loukas, Athanasios; Mylopoulos, Nikitas

    2015-06-01

    An integrated modeling system, developed in the framework of "Hydromentor" research project, is applied to evaluate crop water requirements for operational water resources management at Lake Karla watershed, Greece. The framework includes coupled components for operation of hydrotechnical projects (reservoir operation and irrigation works) and estimation of agricultural water demands at several spatial scales using remote sensing. The study area was sub-divided into irrigation zones based on land use maps derived from Landsat 5 TM images for the year 2007. Satellite-based energy balance for mapping evapotranspiration with internalized calibration (METRIC) was used to derive actual evapotranspiration (ET) and crop coefficient (ETrF) values from Landsat TM imagery. Agricultural water needs were estimated using the FAO method for each zone and each control node of the system for a number of water resources management strategies. Two operational strategies of hydro-technical project development (present situation without operation of the reservoir and future situation with the operation of the reservoir) are coupled with three water demand strategies. In total, eight (8) water management strategies are evaluated and compared. The results show that, under the existing operational water resources management strategies, the crop water requirements are quite large. However, the operation of the proposed hydro-technical projects in Lake Karla watershed coupled with water demand management measures, like improvement of existing water distribution systems, change of irrigation methods, and changes of crop cultivation could alleviate the problem and lead to sustainable and ecological use of water resources in the study area.

  7. Estimating boiling water reactor decommissioning costs. A user's manual for the BWR Cost Estimating Computer Program (CECP) software: Draft report for comment

    International Nuclear Information System (INIS)

    Bierschbach, M.C.

    1994-12-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit decommissioning plans and cost estimates to the U.S. Nuclear Regulatory Commission (NRC) for review. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning BWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning

  8. A user's manual of Tools for Error Estimation of Complex Number Matrix Computation (Ver.1.0)

    International Nuclear Information System (INIS)

    Ichihara, Kiyoshi.

    1997-03-01

    'Tools for Error Estimation of Complex Number Matrix Computation' is a subroutine library which aids users in obtaining the error ranges of the solutions of complex number linear systems or of the eigenvalues of Hermitian matrices. This library contains routines for both sequential computers and parallel computers. The subroutines for linear system error estimation calculate norms of residual vectors, matrices' condition numbers, error bounds of solutions, and so on. The error estimation subroutines for Hermitian matrix eigenvalues derive the error ranges of the eigenvalues according to the Korn-Kato formula. This user's manual contains a brief mathematical background of error analysis on linear algebra and usage of the subroutines. (author)
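
    The quantities named above (residual norms, condition numbers, error bounds of solutions) can be illustrated with a small NumPy sketch for a complex linear system; this shows only the underlying error analysis, not the library's interface:

        # Hedged sketch (NumPy): residual-based error bound for a complex linear system.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 6
        A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
        x_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)
        b = A @ x_true

        x_hat = np.linalg.solve(A, b)            # computed solution
        r = b - A @ x_hat                        # residual vector
        cond = np.linalg.cond(A)                 # condition number (2-norm)

        # Classical bound: ||x - x_hat|| / ||x|| <= cond(A) * ||r|| / ||b||.
        bound = cond * np.linalg.norm(r) / np.linalg.norm(b)
        actual = np.linalg.norm(x_true - x_hat) / np.linalg.norm(x_true)
        print(f"residual norm    = {np.linalg.norm(r):.3e}")
        print(f"condition number = {cond:.3e}")
        print(f"error bound = {bound:.3e}, actual relative error = {actual:.3e}")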

  9. Mass Estimate for a Lunar Resource Launcher Based on Existing Terrestrial Electromagnetic Launchers

    Directory of Open Access Journals (Sweden)

    Gordon Roesler

    2013-06-01

    Economic exploitation of lunar resources may be more efficient with a non-rocket approach to launch from the lunar surface. The launch system cost will depend on its design, and on the number of launches from Earth to deliver the system to the Moon. Both of these will depend on the launcher system mass. Properties of an electromagnetic resource launcher are derived from two mature terrestrial electromagnetic launchers. A mass model is derived and used to estimate launch costs for a developmental launch vehicle. A rough manufacturing cost for the system is suggested.

  10. Contract on using computer resources of another

    Directory of Open Access Journals (Sweden)

    Cvetković Mihajlo

    2016-01-01

    Contractual relations involving the use of another's property are quite common. Yet, the use of computer resources of others over the Internet and legal transactions arising thereof certainly diverge from the traditional framework embodied in the special part of contract law dealing with this issue. Modern performance concepts (such as infrastructure, software or platform as high-tech services) are highly unlikely to be described by the terminology derived from Roman law. The overwhelming novelty of high-tech services obscures the disadvantageous position of contracting parties. In most cases, service providers are global multinational companies which tend to secure their own unjustified privileges and gain by providing lengthy and intricate contracts, often comprising a number of legal documents. General terms and conditions in these service provision contracts are further complicated by the 'service level agreement', rules of conduct and (non)confidentiality guarantees. Without giving the issue a second thought, users easily accept the pre-fabricated offer without reservations, unaware that such a pseudo-gratuitous contract actually conceals a highly lucrative and mutually binding agreement. The author examines the extent to which the legal provisions governing sale of goods and services, lease, loan and commodatum may apply to 'cloud computing' contracts, and analyses the scope and advantages of contractual consumer protection, as a relatively new area in contract law. The termination of a service contract between the provider and the user features specific post-contractual obligations which are inherent to an online environment.

  11. Monitoring of computing resource use of active software releases at ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00219183; The ATLAS collaboration

    2017-01-01

    The LHC is the world’s most powerful particle accelerator, colliding protons at a centre-of-mass energy of 13 TeV. As the energy and frequency of collisions have grown in the search for new physics, so too has demand for the computing resources needed for event reconstruction. We will report on the evolution of resource usage in terms of CPU and RAM in key ATLAS offline reconstruction workflows at the Tier-0 at CERN and on the WLCG. Monitoring of workflows is achieved using the ATLAS PerfMon package, which is the standard ATLAS performance monitoring system running inside Athena jobs. Systematic daily monitoring has recently been expanded to include all workflows beginning at Monte Carlo generation through to end-user physics analysis, beyond that of event reconstruction. Moreover, the move to a multiprocessor mode in production jobs has facilitated the use of tools, such as “MemoryMonitor”, to measure the memory shared across processors in jobs. Resource consumption is broken down into software domains and dis...

  12. Big Data in Cloud Computing: A Resource Management Perspective

    Directory of Open Access Journals (Sweden)

    Saeed Ullah

    2018-01-01

    Modern-day advancement is increasingly digitizing our lives, which has led to a rapid growth of data. Such multidimensional datasets are precious due to the potential of unearthing new knowledge and developing decision-making insights from them. Analyzing this huge amount of data from multiple sources can help organizations to plan for the future and anticipate changing market trends and customer requirements. While the Hadoop framework is a popular platform for processing larger datasets, there are a number of other computing infrastructures available to use in various application domains. The primary focus of the study is how to classify major big data resource management systems in the context of a cloud computing environment. We identify some key features which characterize big data frameworks as well as their associated challenges and issues. We use various evaluation metrics from different aspects to identify usage scenarios of these platforms. The study came up with some interesting findings which are in contradiction with the available literature on the Internet.

  13. Cross stratum resources protection in fog-computing-based radio over fiber networks for 5G services

    Science.gov (United States)

    Guo, Shaoyong; Shao, Sujie; Wang, Yao; Yang, Hui

    2017-09-01

    In order to meet the requirements of the internet of things (IoT) and 5G, the cloud radio access network is a paradigm which converges all base stations' computational resources into a cloud baseband unit (BBU) pool, while the distributed radio frequency signals are collected by remote radio heads (RRH). A precondition for centralized processing in the BBU pool is an interconnection fronthaul network with high capacity and low delay. However, the interaction between RRH and BBU and the resource scheduling among BBUs in the cloud have become more complex and frequent. A cloud radio over fiber network has been proposed in our previous work already. In order to overcome the complexity and latency, in this paper, we first present a novel cross stratum resources protection (CSRP) architecture in fog-computing-based radio over fiber networks (F-RoFN) for 5G services. Additionally, a cross stratum protection (CSP) scheme considering the network survivability is introduced in the proposed architecture. The CSRP with the CSP scheme can effectively pull the remote processing resource locally to implement cooperative radio resource management, enhance the responsiveness and resilience to dynamic end-to-end 5G service demands, and globally optimize optical network, wireless and fog resources. The feasibility and efficiency of the proposed architecture with the CSP scheme are verified on our software defined networking testbed in terms of service latency, transmission success rate, resource occupation rate and blocking probability.

  14. Mobile devices and computing cloud resources allocation for interactive applications

    Directory of Open Access Journals (Sweden)

    Krawczyk Henryk

    2017-06-01

    Using mobile devices such as smartphones or iPads for various interactive applications is currently very common. In the case of complex applications, e.g. chess games, the capabilities of these devices are insufficient to run the application in real time. One of the solutions is to use cloud computing. However, allocating mobile device and cloud computing resources poses an optimization problem. An iterative heuristic algorithm for application distribution is proposed. The algorithm minimizes the energy cost of application execution under a constraint on execution time.
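
    A toy version of the allocation decision, assuming per-task local/cloud time and energy costs and a global deadline; it enumerates choices exhaustively rather than using the paper's iterative heuristic, and all numbers are invented for illustration:

        # Hedged sketch: choose, per task, local execution or cloud offloading so that
        # total energy is minimized while total execution time stays under a deadline.
        from itertools import product

        # (local_time, local_energy, cloud_time, cloud_energy) per task, illustrative values
        tasks = [(4.0, 8.0, 2.5, 3.0), (1.0, 2.0, 2.0, 1.0), (3.0, 6.0, 1.5, 2.5)]
        DEADLINE = 7.0

        best = None
        for choice in product((0, 1), repeat=len(tasks)):   # 0 = local, 1 = cloud
            time = sum(t[2] if c else t[0] for t, c in zip(tasks, choice))
            energy = sum(t[3] if c else t[1] for t, c in zip(tasks, choice))
            if time <= DEADLINE and (best is None or energy < best[0]):
                best = (energy, time, choice)

        print("energy=%.1f time=%.1f choice=%s" % best)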

  15. Estimating the financial resources needed for local public health departments in Minnesota: a multimethod approach.

    Science.gov (United States)

    Riley, William; Briggs, Jill; McCullough, Mac

    2011-01-01

    This study presents a model for determining total funding needed for individual local health departments. The aim is to determine the financial resources needed to provide services for statewide local public health departments in Minnesota based on a gaps analysis done to estimate the funding needs. We used a multimethod analysis consisting of 3 approaches to estimate gaps in local public health funding consisting of (1) interviews of selected local public health leaders, (2) a Delphi panel, and (3) a Nominal Group Technique. On the basis of these 3 approaches, a consensus estimate of funding gaps was generated for statewide projections. The study includes an analysis of cost, performance, and outcomes from 2005 to 2007 for all 87 local governmental health departments in Minnesota. For each of the methods, we selected a panel to represent a profile of Minnesota health departments. The 2 main outcome measures were local-level gaps in financial resources and total resources needed to provide public health services at the local level. The total public health expenditure in Minnesota for local governmental public health departments was $302 million in 2007 ($58.92 per person). The consensus estimate of the financial gaps in local public health departments indicates that an additional $32.5 million (a 10.7% increase or $6.32 per person) is needed to adequately serve public health needs in the local communities. It is possible to make informed estimates of funding gaps for public health activities on the basis of a combination of quantitative methods. There is a wide variation in public health expenditure at the local levels, and methods are needed to establish minimum baseline expenditure levels to adequately treat a population. The gaps analysis can be used by stakeholders to inform policy makers of the need for improved funding of the public health system.

  16. Resource Estimations in Contingency Planning for Foot-and-Mouth Disease

    Directory of Open Access Journals (Sweden)

    Anette Boklund

    2017-05-01

    Preparedness planning for a veterinary crisis is important for fast and effective eradication of disease. For countries with a large export of animals and animal products, each extra day in an epidemic will cost millions of Euros due to the closure of export markets. This is important for the Danish husbandry industry, especially the swine industry, which had an export of €4.4 billion in 2012. The purposes of this project were to (1) develop an iterative tool with the aim of estimating the resources needed during an outbreak of foot-and-mouth disease (FMD) in Denmark, and (2) identify areas which can delay the control of the disease. The tool developed should easily be updated when knowledge is gained from other veterinary crises or during an outbreak of FMD. The stochastic simulation model DTU-DADS was used to simulate spread of FMD in Denmark. For each task occurring during an epidemic of FMD, the time and personnel needed per herd was estimated by a working group with expertise in contingency and crisis management. By combining this information, an iterative model was created to calculate the needed personnel on a daily basis during the epidemic. The personnel needed was predicted to peak within the first week, with a requirement of approximately 123 (65–175) veterinarians, 33 (23–64) technicians, and 36 (26–49) administrative staff on day 2, while the personnel needed in the Danish Emergency Management Agency (responsible for the hygiene barrier and initial cleaning and disinfection of the farm) was predicted to be 174 (58–464), mostly recruits. The time needed for surveillance visits was predicted to be the most influential factor in the calculations. Based on results from a stochastic simulation model, it was possible to create an iterative model to estimate the requirements for personnel during an FMD outbreak in Denmark. The model can easily be adjusted when new information on resources appears from management of other crises or

  17. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    Science.gov (United States)

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results of the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual

  18. Improved Offshore Wind Resource Assessment in Global Climate Stabilization Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Arent, D.; Sullivan, P.; Heimiller, D.; Lopez, A.; Eurek, K.; Badger, J.; Jorgensen, H. E.; Kelly, M.; Clarke, L.; Luckow, P.

    2012-10-01

    This paper introduces a technique for digesting geospatial wind-speed data into areally defined (country-level, in this case) wind resource supply curves. We combined gridded wind-vector data for ocean areas with bathymetry maps, country exclusive economic zones, wind turbine power curves, and other datasets and relevant parameters to build supply curves that estimate a country's offshore wind resource defined by resource quality, depth, and distance from shore. We include a single set of supply curves (for a particular assumption set) and study some implications of including it in a global energy model. We also discuss the importance of downscaling gridded wind vector data for capturing the full resource potential, especially over land areas with complex terrain. This paper includes motivation and background for a statistical downscaling methodology to account for terrain effects with a low computational burden. Finally, we use this forum to sketch a framework for building synthetic electric networks to estimate transmission accessibility of renewable resource sites in remote areas.

  19. Computer Processing 10-20-30. Teacher's Manual. Senior High School Teacher Resource Manual.

    Science.gov (United States)

    Fisher, Mel; Lautt, Ray

    Designed to help teachers meet the program objectives for the computer processing curriculum for senior high schools in the province of Alberta, Canada, this resource manual includes the following sections: (1) program objectives; (2) a flowchart of curriculum modules; (3) suggestions for short- and long-range planning; (4) sample lesson plans;…

  20. Overview of the Practical and Theoretical Approaches to the Estimation of Mineral Resources. A Financial Perspective

    Directory of Open Access Journals (Sweden)

    Leontina Pavaloaia

    2012-10-01

    Mineral resources represent an important natural resource whose exploitation, unless it is rational, can lead to their exhaustion and the collapse of sustainable development. Given the importance of mineral resources and the uncertainty concerning the estimation of extant reserves, they have been analyzed by several national and international institutions. In this article we shall present a few aspects concerning the ways to approach the reserves of mineral resources at national and international level, by considering both economic aspects and those aspects concerned with the definition, classification and aggregation of the reserves of mineral resources by various specialized institutions. At present there are attempts to homogenize practices concerning these aspects for the purpose of presenting correct and comparable information.

  1. NATO Advanced Study Institute on Statistical Treatments for Estimation of Mineral and Energy Resources

    CERN Document Server

    Fabbri, A; Sinding-Larsen, R

    1988-01-01

    This volume contains the edited papers prepared by lecturers and participants of the NATO Advanced Study Institute on "Statistical Treatments for Estimation of Mineral and Energy Resources" held at Il Ciocco (Lucca), Italy, June 22 - July 4, 1986. During the past twenty years, tremendous efforts have been made to acquire quantitative geoscience information from ore deposits, geochemical, geophysical and remotely-sensed measurements. In October 1981, a two-day symposium on "Quantitative Resource Evaluation" and a three-day workshop on "Interactive Systems for Multivariate Analysis and Image Processing for Resource Evaluation" were held in Ottawa, jointly sponsored by the Geological Survey of Canada, the International Association for Mathematical Geology, and the International Geological Correlation Programme. Thirty scientists from different countries in Europe and North America were invited to form a forum for the discussion of quantitative methods for mineral and energy resource assessment. Since then, not ...

  2. A Stream Tilling Approach to Surface Area Estimation for Large Scale Spatial Data in a Shared Memory System

    Directory of Open Access Journals (Sweden)

    Liu Jiping

    2017-12-01

    Surface area estimation is a widely used tool for resource evaluation in the physical world. When processing large scale spatial data, the input/output (I/O) can easily become the bottleneck in parallelizing the algorithm due to the limited physical memory resources and the very slow disk transfer rate. In this paper, we proposed a stream tilling approach to surface area estimation that first decomposes a spatial data set into tiles with topological expansions. With these tiles, the one-to-one mapping relationship between the input and the computing process was broken. Then, we realized a streaming framework towards the scheduling of the I/O processes and computing units. Herein, each computing unit encapsulates the same copy of the estimation algorithm, and multiple asynchronous computing units can work individually in parallel. Finally, the performed experiments demonstrated that our stream tilling estimation can efficiently alleviate the heavy pressure from I/O-bound work, and the measured speedups after optimization greatly outperformed those of the directly parallel versions in shared memory systems with multi-core processors.

  3. SYSTEMATIC LITERATURE REVIEW ON RESOURCE ALLOCATION AND RESOURCE SCHEDULING IN CLOUD COMPUTING

    OpenAIRE

    B. Muni Lavanya; C. Shoba Bindu

    2016-01-01

    The objective of the work is to highlight the key features and afford the finest future directions in the research community of Resource Allocation, Resource Scheduling and Resource management from 2009 to 2016, exemplifying how research on Resource Allocation, Resource Scheduling and Resource management has progressively increased in the past decade by inspecting articles and papers from scientific and standard publications. The survey materialized in a three-fold process. Firstly, investigate on t...

  4. A Bayesian approach for parameter estimation and prediction using a computationally intensive model

    International Nuclear Information System (INIS)

    Higdon, Dave; McDonnell, Jordan D; Schunck, Nicolas; Sarich, Jason; Wild, Stefan M

    2015-01-01

    Bayesian methods have been successful in quantifying uncertainty in physics-based problems in parameter estimation and prediction. In these cases, physical measurements y are modeled as the best fit of a physics-based model η(θ), where θ denotes the uncertain, best input setting. Hence the statistical model is of the form y=η(θ)+ϵ, where ϵ accounts for measurement, and possibly other, error sources. When nonlinearity is present in η(⋅), the resulting posterior distribution for the unknown parameters in the Bayesian formulation is typically complex and nonstandard, requiring computationally demanding approaches such as Markov chain Monte Carlo (MCMC) to produce multivariate draws from the posterior. Although generally applicable, MCMC requires thousands (or even millions) of evaluations of the physics model η(⋅). This requirement is problematic if the model takes hours or days to evaluate. To overcome this computational bottleneck, we present an approach adapted from Bayesian model calibration. This approach combines output from an ensemble of computational model runs with physical measurements, within a statistical formulation, to carry out inference. A key component of this approach is a statistical response surface, or emulator, estimated from the ensemble of model runs. We demonstrate this approach with a case study in estimating parameters for a density functional theory model, using experimental mass/binding energy measurements from a collection of atomic nuclei. We also demonstrate how this approach produces uncertainties in predictions for recent mass measurements obtained at Argonne National Laboratory. (paper)
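
    A one-dimensional toy of this emulator-based calibration, assuming a cheap polynomial response surface fitted to an ensemble of "expensive" model runs and a random-walk Metropolis sampler that queries only the emulator; the model, prior and noise level are invented for illustration and are not the paper's:

        # Hedged sketch (NumPy): emulator-based Bayesian calibration in one dimension.
        import numpy as np

        rng = np.random.default_rng(1)

        def expensive_model(theta):            # stand-in for eta(theta)
            return np.sin(theta) + 0.1 * theta

        # 1) Ensemble of model runs -> polynomial emulator (response surface).
        design = np.linspace(-2.0, 2.0, 15)
        runs = np.array([expensive_model(t) for t in design])
        emulator = np.poly1d(np.polyfit(design, runs, deg=3))

        # 2) Synthetic measurement y = eta(theta_true) + eps.
        theta_true, sigma = 0.7, 0.05
        y = expensive_model(theta_true) + rng.normal(0.0, sigma)

        def log_post(theta):
            if abs(theta) > 2.0:               # flat prior on [-2, 2]
                return -np.inf
            return -0.5 * ((y - emulator(theta)) / sigma) ** 2

        # 3) Random-walk Metropolis using only the emulator, never the full model.
        theta, samples = 0.0, []
        for _ in range(20000):
            prop = theta + rng.normal(0.0, 0.2)
            if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
                theta = prop
            samples.append(theta)

        print("posterior mean ~", np.mean(samples[5000:]), "true value", theta_true)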

  5. Estimation of resource savings due to fly ash utilization in road construction

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Subodh; Patil, C.B. [Centre for Energy Studies, Indian Institute of Technology, New Delhi 110016 (India)

    2006-08-15

    A methodology for estimation of natural resource savings due to fly ash utilization in road construction in India is presented. Analytical expressions for the savings of various resources, namely soil, stone aggregate, stone chips, sand and cement, in the embankment, granular sub-base (GSB), water bound macadam (WBM) and pavement quality concrete (PQC) layers of fly ash based road formation with flexible and rigid pavements of a given geometry have been developed. The quantity of fly ash utilized in these layers of different pavements has also been quantified. In the present study, the maximum amount of resource savings is found in GSB followed by WBM and other layers of pavement. The soil quantity saved increases asymptotically with the rise in the embankment height. The results of financial analysis based on Indian fly ash based road construction cost data indicate that the savings in construction cost decrease with the lead, and the investment on this alternative is found to be financially attractive only for a lead of less than 60 and 90 km for flexible and rigid pavements, respectively. (author)

  6. An automated method for estimating reliability of grid systems using Bayesian networks

    International Nuclear Information System (INIS)

    Doguc, Ozge; Emmanuel Ramirez-Marquez, Jose

    2012-01-01

    Grid computing has become relevant due to its applications to large-scale resource sharing, wide-area information transfer, and multi-institutional collaboration. In general, in grid computing a service requests the use of a set of resources, available in a grid, to complete certain tasks. Although analysis tools and techniques for these types of systems have been studied, grid reliability analysis is generally computation-intensive due to the complexity of the system. Moreover, conventional reliability models have some common assumptions that cannot be applied to grid systems. Therefore, new analytical methods are needed for effective and accurate assessment of grid reliability. This study presents a new method for estimating grid service reliability, which does not require prior knowledge about the grid system structure, unlike previous studies. Moreover, the proposed method does not rely on any assumptions about the link and node failure rates. The approach is based on a data-mining algorithm, K2, which discovers the grid system structure from raw historical system data and allows minimum resource spanning trees (MRST) to be found within the grid; Bayesian networks (BN) are then used to model the MRST and estimate grid service reliability.

  7. Development of a Computer Code for the Estimation of Fuel Rod Failure

    Energy Technology Data Exchange (ETDEWEB)

    Rhee, I.H.; Ahn, H.J. [Korea Electric Power Research Institute, Daejeon (Korea, Republic of)

    1997-12-31

    Much research has already been performed to obtain information on the degree of failed fuel rods from the primary coolant activities of operating PWRs in the last few decades. The computer codes that are currently in use for domestic nuclear power plants, such as the CADE code and ABB-CE codes developed by Westinghouse and ABB-CE, respectively, still give significant overall errors in estimating the failed fuel rods. In addition, with the CADE code, it is difficult to predict the degree of fuel rod failures during the transient period of nuclear reactor operation, whereas the ABB-CE codes are relatively more difficult to use for end-users. In particular, the rapid progress made recently in the area of computer hardware and software systems calls for such computer programs to be more versatile and user-friendly. While the MS Windows system, which is centered on the graphic user interface and multitasking, is now in widespread use, the computer codes currently employed at the nuclear power plants, such as the CADE and ABB-CE codes, can only be run on the DOS system. Moreover, it is desirable to have a computer code for fuel rod failure estimation that can directly use the radioactivity data obtained from the on-line monitoring system of the primary coolant activity. The main purpose of this study is, therefore, to develop a Windows computer code that can predict the location, the number of failed fuel rods, and the degree of failures using the radioactivity data obtained from the primary coolant activity for PWRs. Another objective is to combine this computer code with the on-line monitoring system of the primary coolant radioactivity at the Kori 3 and 4 operating nuclear power plants and enable their combined use for on-line evaluation of the number and degree of fuel rod failures. (author). 49 refs., 85 figs., 30 tabs.

  8. NEWBOX: A computer program for parameter estimation in diffusion problems

    International Nuclear Information System (INIS)

    Nestor, C.W. Jr.; Godbee, H.W.; Joy, D.S.

    1989-01-01

    In the analysis of experiments to determine amounts of material transferred from one medium to another (e.g., the escape of chemically hazardous and radioactive materials from solids), there are at least 3 important considerations. These are (1) is the transport amenable to treatment by established mass transport theory; (2) do methods exist to find estimates of the parameters which will give a best fit, in some sense, to the experimental data; and (3) what computational procedures are available for evaluating the theoretical expressions. The authors have made the assumption that established mass transport theory is an adequate model for the situations under study. Since the solutions of the diffusion equation are usually nonlinear in some parameters (diffusion coefficient, reaction rate constants, etc.), use of a method of parameter adjustment involving first partial derivatives can be complicated and prone to errors in the computation of the derivatives. In addition, the parameters must satisfy certain constraints; for example, the diffusion coefficient must remain positive. For these reasons, a variant of the constrained simplex method of M. J. Box has been used to estimate parameters. It is similar, but not identical, to the downhill simplex method of Nelder and Mead. In general, they calculate the fraction of material transferred as a function of time from expressions obtained by the inversion of the Laplace transform of the fraction transferred, rather than by taking derivatives of a calculated concentration profile. With the above approaches to the 3 considerations listed at the outset, they developed a computer program NEWBOX, usable on a personal computer, to calculate the fractional release of material from 4 different geometrical shapes (semi-infinite medium, finite slab, finite circular cylinder, and sphere), accounting for several different boundary conditions
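
    A hedged sketch of the kind of fit such a program performs: a diffusion coefficient is adjusted to match fractional-release data while staying positive. SciPy's Nelder-Mead simplex with a log-parameterization is used here instead of Box's constrained simplex, and the early-time release law F(t) = 2(S/V)sqrt(Dt/pi) for a semi-infinite medium is an assumed illustrative model, not NEWBOX's actual expressions:

        # Hedged sketch (SciPy): fit a diffusion coefficient D to fractional-release data.
        # Positivity of D is kept by optimizing log(D) with Nelder-Mead.
        import numpy as np
        from scipy.optimize import minimize

        S_over_V = 2.0                                  # surface-to-volume ratio, 1/cm (illustrative)
        t = np.array([1.0, 4.0, 9.0, 16.0, 25.0])       # days
        f_obs = np.array([0.012, 0.025, 0.036, 0.049, 0.061])   # observed fraction released

        def release(D, t):
            # Assumed early-time release law for a semi-infinite medium.
            return 2.0 * S_over_V * np.sqrt(D * t / np.pi)

        def objective(logD):
            return np.sum((release(np.exp(logD[0]), t) - f_obs) ** 2)

        res = minimize(objective, x0=[np.log(1e-5)], method="Nelder-Mead")
        D_fit = np.exp(res.x[0])
        print(f"fitted D = {D_fit:.3e} cm^2/day")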

  9. A posteriori error estimator and AMR for discrete ordinates nodal transport methods

    International Nuclear Information System (INIS)

    Duo, Jose I.; Azmy, Yousry Y.; Zikatanov, Ludmil T.

    2009-01-01

    In the development of high fidelity transport solvers, optimization of the use of available computational resources and access to a tool for assessing quality of the solution are key to the success of large-scale nuclear systems' simulation. In this regard, error control provides the analyst with a confidence level in the numerical solution and enables optimization of resources through Adaptive Mesh Refinement (AMR). In this paper, we derive an a posteriori error estimator based on the nodal solution of the Arbitrarily High Order Transport Method of the Nodal type (AHOT-N). Furthermore, by making assumptions on the regularity of the solution, we represent the error estimator as a function of computable volume and element-edges residuals. The global L2 error norm is proved to be bound by the estimator. To lighten the computational load, we present a numerical approximation to the aforementioned residuals and split the global norm error estimator into local error indicators. These indicators are used to drive an AMR strategy for the spatial discretization. However, the indicators based on forward solution residuals alone do not bound the cell-wise error. The estimator and AMR strategy are tested in two problems featuring strong heterogeneity and a highly transport streaming regime with strong flux gradients. The results show that the error estimator indeed bounds the global error norms and that the error indicator follows the cell-error's spatial distribution pattern closely. The AMR strategy proves beneficial to optimize resources, primarily by reducing the number of unknowns solved for to achieve prescribed solution accuracy in the global L2 error norm. Likewise, AMR achieves higher accuracy compared to uniform refinement when resolving sharp flux gradients, for the same number of unknowns

  10. A neural computational model for animal's time-to-collision estimation.

    Science.gov (United States)

    Wang, Ling; Yao, Dezhong

    2013-04-17

    The time-to-collision (TTC) is the time elapsed before a looming object hits the subject. An accurate estimation of TTC plays a critical role in the survival of animals in nature and acts as an important factor in artificial intelligence systems that depend on judging and avoiding potential dangers. The theoretic formula for TTC is 1/τ≈θ'/sin θ, where θ and θ' are the visual angle and its variation, respectively, and the widely used approximation computational model is θ'/θ. However, both of these measures are too complex to be implemented by a biological neuronal model. We propose a new simple computational model: 1/τ≈Mθ-P/(θ+Q)+N, where M, P, Q, and N are constants that depend on a predefined visual angle. This model, the weighted summation of visual angle model (WSVAM), can achieve perfect implementation through a widely accepted biological neuronal model. WSVAM has additional merits, including a natural minimum consumption and simplicity. Thus, it yields a precise and neuronally implemented estimation of TTC, which provides a simple and convenient implementation for artificial vision, and represents a potential visual brain mechanism.
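
    A quick numerical check, using the two expressions quoted above, of how the common approximation θ'/θ departs from the theoretic 1/τ = θ'/sin θ as the visual angle grows; the WSVAM constants M, P, Q, N are paper-specific and are not reproduced here:

        # Hedged sketch (NumPy): compare the theoretic TTC expression theta'/sin(theta)
        # with the common approximation theta'/theta for growing visual angles.
        import numpy as np

        theta = np.deg2rad(np.array([2.0, 10.0, 30.0, 60.0]))   # visual angle
        theta_dot = 0.5                                          # rad/s, illustrative rate

        inv_tau_exact = theta_dot / np.sin(theta)
        inv_tau_approx = theta_dot / theta

        for th, ex, ap in zip(np.rad2deg(theta), inv_tau_exact, inv_tau_approx):
            print(f"theta = {th:5.1f} deg   exact 1/tau = {ex:.3f}   approx = {ap:.3f}")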

  11. Computationally Efficient 2D DOA Estimation for L-Shaped Array with Unknown Mutual Coupling

    Directory of Open Access Journals (Sweden)

    Yang-Yang Dong

    2018-01-01

    Although an L-shaped array can provide good angle estimation performance and is easy to implement, its two-dimensional (2D) direction-of-arrival (DOA) estimation performance degrades greatly in the presence of mutual coupling. To deal with the mutual coupling effect, a novel 2D DOA estimation method for L-shaped arrays with low computational complexity is developed in this paper. First, we generalize the conventional mutual coupling model for L-shaped arrays and compensate the mutual coupling blindly via sacrificing a few sensors as auxiliary elements. Then we apply the propagator method twice to mitigate the effect of strong source signal correlation. Finally, the estimations of azimuth and elevation angles are achieved simultaneously without pair matching via the complex eigenvalue technique. Compared with the existing methods, the proposed method is computationally efficient without spectrum search or polynomial rooting and also has fine angle estimation performance for highly correlated source signals. Theoretical analysis and simulation results have demonstrated the effectiveness of the proposed method.

  12. Integrating GRID tools to build a computing resource broker: activities of DataGrid WP1

    International Nuclear Information System (INIS)

    Anglano, C.; Barale, S.; Gaido, L.; Guarise, A.; Lusso, S.; Werbrouck, A.

    2001-01-01

    Resources on a computational Grid are geographically distributed, heterogeneous in nature, owned by different individuals or organizations with their own scheduling policies, have different access cost models with dynamically varying loads and availability conditions. This makes traditional approaches to workload management, load balancing and scheduling inappropriate. The first work package (WP1) of the EU-funded DataGrid project is addressing the issue of optimizing the distribution of jobs onto Grid resources based on a knowledge of the status and characteristics of these resources that is necessarily out-of-date (collected in a finite amount of time at a very loosely coupled site). The authors describe the DataGrid approach in integrating existing software components (from Condor, Globus, etc.) to build a Grid Resource Broker, and the early efforts to define a workable scheduling strategy

  13. Computable Error Estimates for Finite Element Approximations of Elliptic Partial Differential Equations with Rough Stochastic Data

    KAUST Repository

    Hall, Eric Joseph

    2016-12-08

    We derive computable error estimates for finite element approximations of linear elliptic partial differential equations with rough stochastic coefficients. In this setting, the exact solutions contain high frequency content that standard a posteriori error estimates fail to capture. We propose goal-oriented estimates, based on local error indicators, for the pathwise Galerkin and expected quadrature errors committed in standard, continuous, piecewise linear finite element approximations. Derived using easily validated assumptions, these novel estimates can be computed at a relatively low cost and have applications to subsurface flow problems in geophysics where the conductivities are assumed to have lognormal distributions with low regularity. Our theory is supported by numerical experiments on test problems in one and two dimensions.

  14. Resource allocation on computational grids using a utility model and the knapsack problem

    CERN Document Server

    Van der ster, Daniel C; Parra-Hernandez, Rafael; Sobie, Randall J

    2009-01-01

    This work introduces a utility model (UM) for resource allocation on computational grids and formulates the allocation problem as a variant of the 0–1 multichoice multidimensional knapsack problem. The notion of task-option utility is introduced, and it is used to effect allocation policies. We present a variety of allocation policies, which are expressed as functions of metrics that are both intrinsic and external to the task and resources. An external user-defined credit-value metric is shown to allow users to intervene in the allocation of urgent or low priority tasks. The strategies are evaluated in simulation against random workloads as well as those drawn from real systems. We measure the sensitivity of the UM-derived schedules to variations in the allocation policies and their corresponding utility functions. The UM allocation strategy is shown to optimally allocate resources congruent with the chosen policies.
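
    A reduced illustration of utility-driven allocation: a single-dimension 0-1 knapsack dynamic program that selects the task set with maximum total utility under one capacity budget. The paper's formulation is multichoice and multidimensional; task names, weights and utilities below are invented:

        # Hedged sketch: 0-1 knapsack DP picking the task set with maximum total utility
        # under a single CPU-slot budget (a simplification of the paper's formulation).
        def allocate(tasks, capacity):
            # tasks: list of (name, cpu_slots, utility)
            best = [[0.0] * (capacity + 1) for _ in range(len(tasks) + 1)]
            for i, (_, w, u) in enumerate(tasks, start=1):
                for c in range(capacity + 1):
                    best[i][c] = best[i - 1][c]
                    if w <= c:
                        best[i][c] = max(best[i][c], best[i - 1][c - w] + u)
            # Recover the chosen tasks by walking back through the table.
            chosen, c = [], capacity
            for i in range(len(tasks), 0, -1):
                if best[i][c] != best[i - 1][c]:
                    name, w, _ = tasks[i - 1]
                    chosen.append(name)
                    c -= w
            return best[len(tasks)][capacity], chosen

        tasks = [("reco", 4, 9.0), ("mc_gen", 3, 5.0), ("analysis", 2, 4.5), ("calib", 3, 6.0)]
        print(allocate(tasks, capacity=7))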

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  16. A network-based multi-target computational estimation scheme for anticoagulant activities of compounds.

    Directory of Open Access Journals (Sweden)

    Qian Li

    BACKGROUND: The traditional virtual screening method pays more attention to the predicted binding affinity between a drug molecule and a target related to a certain disease than to phenotypic data of the drug molecule against the disease system, and is often less effective for discovery of drugs used to treat many types of complex diseases. Virtual screening against a complex disease by general network estimation has become feasible with the development of network biology and systems biology. More effective methods of computational estimation of the whole efficacy of a compound in a complex disease system are needed, given the distinct weightiness of the different targets in a biological process and the standpoint that partial inhibition of several targets can be more efficient than the complete inhibition of a single target. METHODOLOGY: We developed a novel approach by integrating the affinity predictions from multi-target docking studies with biological network efficiency analysis to estimate the anticoagulant activities of compounds. From the results of the network efficiency calculation for the human clotting cascade, factor Xa and thrombin were identified as the two most fragile enzymes, while the catalytic reaction mediated by complex IXa:VIIIa and the formation of the complex VIIIa:IXa were recognized as the two most fragile biological matter in the human clotting cascade system. Furthermore, the method which combined network efficiency with molecular docking scores was applied to estimate the anticoagulant activities of a series of argatroban intermediates and eight natural products, respectively. The better correlation (r = 0.671) between the experimental data and the decrease of the network deficiency suggests that the approach could be a promising computational systems biology tool to aid identification of anticoagulant activities of compounds in drug discovery. CONCLUSIONS: This article proposes a network-based multi-target computational estimation

  17. A network-based multi-target computational estimation scheme for anticoagulant activities of compounds.

    Science.gov (United States)

    Li, Qian; Li, Xudong; Li, Canghai; Chen, Lirong; Song, Jun; Tang, Yalin; Xu, Xiaojie

    2011-03-22

    The traditional virtual screening method pays more attention to the predicted binding affinity between a drug molecule and a target related to a certain disease than to phenotypic data of the drug molecule against the disease system, and is often less effective for discovery of drugs used to treat many types of complex diseases. Virtual screening against a complex disease by general network estimation has become feasible with the development of network biology and systems biology. More effective methods of computational estimation for the whole efficacy of a compound in a complex disease system are needed, given the distinct weightiness of the different targets in a biological process and the standpoint that partial inhibition of several targets can be more efficient than the complete inhibition of a single target. We developed a novel approach by integrating the affinity predictions from multi-target docking studies with biological network efficiency analysis to estimate the anticoagulant activities of compounds. From results of network efficiency calculation for the human clotting cascade, factor Xa and thrombin were identified as the two most fragile enzymes, while the catalytic reaction mediated by complex IXa:VIIIa and the formation of the complex VIIIa:IXa were recognized as the two most fragile biological matter in the human clotting cascade system. Furthermore, the method which combined network efficiency with molecular docking scores was applied to estimate the anticoagulant activities of a series of argatroban intermediates and eight natural products, respectively. The better correlation (r = 0.671) between the experimental data and the decrease of the network deficiency suggests that the approach could be a promising computational systems biology tool to aid identification of anticoagulant activities of compounds in drug discovery. This article proposes a network-based multi-target computational estimation method for anticoagulant activities of compounds by

  18. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    Science.gov (United States)

    Crovelli, R.A.

    1988-01-01

    The geologic appraisal model that is selected for a petroleum resource assessment depends upon purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. © 1988 International Association for Mathematical Geology.

  19. Computer simulation comparison of tripolar, bipolar, and spline Laplacian electrocardiogram estimators.

    Science.gov (United States)

    Chen, T; Besio, W; Dai, W

    2009-01-01

    A comparison of the performance of the tripolar and bipolar concentric as well as spline Laplacian electrocardiograms (LECGs) and body surface Laplacian mappings (BSLMs) for localizing and imaging the cardiac electrical activation has been investigated based on computer simulation. In the simulation, a simplified eccentric heart-torso sphere-cylinder homogeneous volume conductor model was developed. Multiple dipoles with different orientations were used to simulate the underlying cardiac electrical activities. Results show that the tripolar concentric ring electrodes produce the most accurate LECG and BSLM estimation among the three estimators with the best performance in spatial resolution.

  20. Real-Time Head Pose Estimation on Mobile Platforms

    Directory of Open Access Journals (Sweden)

    Jianfeng Ren

    2010-06-01

    Many computer vision applications such as augmented reality require head pose estimation. As far as the real-time implementation of head pose estimation on relatively resource-limited mobile platforms is concerned, it is required to satisfy real-time constraints while maintaining reasonable head pose estimation accuracy. The head pose estimation approach introduced in this paper is an attempt to meet this objective. The approach consists of the following components: Viola-Jones face detection, color-based face tracking using an online calibration procedure, and head pose estimation using Hu moment features and Fisher linear discriminant. Experimental results running on an actual mobile device are reported, exhibiting both the real-time and accuracy aspects of the developed approach.

  1. Exploiting short-term memory in soft body dynamics as a computational resource.

    Science.gov (United States)

    Nakajima, K; Li, T; Hauser, H; Pfeifer, R

    2014-11-06

    Soft materials are not only highly deformable, but they also possess rich and diverse body dynamics. Soft body dynamics exhibit a variety of properties, including nonlinearity, elasticity and potentially infinitely many degrees of freedom. Here, we demonstrate that such soft body dynamics can be employed to conduct certain types of computation. Using body dynamics generated from a soft silicone arm, we show that they can be exploited to emulate functions that require memory and to embed robust closed-loop control into the arm. Our results suggest that soft body dynamics have a short-term memory and can serve as a computational resource. This finding paves the way towards exploiting passive body dynamics for control of a large class of underactuated systems. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
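
    A generic reservoir-computing sketch of the idea that body dynamics with short-term memory can serve as a computational resource: a randomly coupled nonlinear dynamical system stands in for the recorded soft-arm states, and a ridge-regression readout learns a delayed-recall target. This is standard echo-state-style reservoir computing under assumed parameters, not the authors' silicone-arm setup:

        # Hedged sketch (NumPy): random nonlinear "body" states + linear readout
        # trained by ridge regression to reproduce the input delayed by 3 steps.
        import numpy as np

        rng = np.random.default_rng(0)
        T, N = 2000, 100
        u = rng.uniform(-1, 1, T)                        # scalar input stream

        W = rng.normal(0, 1, (N, N))
        W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep spectral radius below 1
        w_in = rng.normal(0, 1, N)

        x = np.zeros(N)
        states = np.zeros((T, N))
        for t in range(T):
            x = np.tanh(W @ x + w_in * u[t])             # "body" state update
            states[t] = x

        target = np.roll(u, 3)                           # recall the input 3 steps ago
        X, y = states[200:], target[200:]                # drop the initial transient
        w_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)   # ridge readout
        pred = X @ w_out
        print("readout MSE:", np.mean((pred - y) ** 2))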

  2. LHCb: Self managing experiment resources

    CERN Multimedia

    Stagni, F

    2013-01-01

    Within this paper we present an autonomic computing resources management system used by LHCb for assessing the status of their Grid resources. Virtual Organizations' Grids include heterogeneous resources. For example, LHC experiments very often use resources not provided by WLCG, and Cloud Computing resources will soon provide a non-negligible fraction of their computing power. The lack of standards and procedures across experiments and sites generated the appearance of multiple information systems, monitoring tools, ticket portals, etc., which nowadays coexist and represent a very precious source of information for running HEP experiments' computing systems as well as sites. These two facts lead to many particular solutions for a general problem: managing the experiment resources. In this paper we present how LHCb, via the DIRAC interware, addressed such issues. With a renewed Central Information Schema hosting all resources metadata and a Status System (Resource Status System) delivering real time informatio...

  3. Resource optimised reconfigurable modular parallel pipelined stochastic approximation-based self-tuning regulator architecture with reduced latency

    Directory of Open Access Journals (Sweden)

    Varghese Mathew Vaidyan

    2015-09-01

    Present self-tuning regulator architectures based on recursive least-squares estimation are computationally expensive and require a large amount of resources and time to generate the first control signal, owing to computational bottlenecks imposed by the calculations involved in the estimation stage, the different stages of matrix multiplications and the number of intermediate variables at each iteration; this precludes their use in applications that require fast response times and in those which run on embedded computing platforms with low-power or low-cost requirements and constraints on resource usage. A salient feature of this study is that a new modular parallel pipelined stochastic approximation-based self-tuning regulator architecture is proposed which reduces the time required to generate the first control signal, reduces resource usage and reduces the number of intermediate variables. Fast matrix multiplication, pipelining and high-speed arithmetic function implementations were used for improving the performance. Results of implementation demonstrate that the proposed architecture improves control signal generation time by 38% and reduces resource usage by 41% in terms of multipliers and 44.4% in terms of adders compared with the best existing related work, opening up new possibilities for the application of online embedded self-tuning regulators.

  4. The NILE system architecture: fault-tolerant, wide-area access to computing and data resources

    International Nuclear Information System (INIS)

    Ricciardi, Aleta; Ogg, Michael; Rothfus, Eric

    1996-01-01

    NILE is a multi-disciplinary project building a distributed computing environment for HEP. It provides wide-area, fault-tolerant, integrated access to processing and data resources for collaborators of the CLEO experiment, though the goals and principles are applicable to many domains. NILE has three main objectives: a realistic distributed system architecture design, the design of a robust data model, and a Fast-Track implementation providing a prototype design environment which will also be used by CLEO physicists. This paper focuses on the software and wide-area system architecture design and the computing issues involved in making NILE services highly-available. (author)

  5. Selecting, Evaluating and Creating Policies for Computer-Based Resources in the Behavioral Sciences and Education.

    Science.gov (United States)

    Richardson, Linda B., Comp.; And Others

    This collection includes four handouts: (1) "Selection Criteria Considerations for Computer-Based Resources" (Linda B. Richardson); (2) "Software Collection Policies in Academic Libraries" (a 24-item bibliography, Jane W. Johnson); (3) "Circulation and Security of Software" (a 19-item bibliography, Sara Elizabeth Williams); and (4) "Bibliography of…

  6. Computationally Efficient 2D DOA Estimation with Uniform Rectangular Array in Low-Grazing Angle

    Directory of Open Access Journals (Sweden)

    Junpeng Shi

    2017-02-01

    Full Text Available In this paper, we propose a computationally efficient spatial differencing matrix set (SDMS) method for two-dimensional direction of arrival (2D DOA) estimation with uniform rectangular arrays (URAs) in a low-grazing angle (LGA) condition. By rearranging the auto-correlation and cross-correlation matrices in turn among different subarrays, the SDMS method can estimate the two parameters independently with one-dimensional (1D) subspace-based estimation techniques, where differencing is performed only on the auto-correlation matrices while the cross-correlation matrices are kept intact. Then, the pair-matching of the two parameters is achieved by extracting the diagonal elements of the URA. Thus, the proposed method can decrease the computational complexity, suppress the effect of additive noise and also have little information loss. Simulation results show that, in LGA, compared to other methods, the proposed method achieves a performance improvement under white or colored noise conditions.

  7. Logical and physical resource management in the common node of a distributed function laboratory computer network

    International Nuclear Information System (INIS)

    Stubblefield, F.W.

    1976-01-01

    A scheme for managing resources required for transaction processing in the common node of a distributed function computer system has been given. The scheme has been found to be satisfactory for all common node services provided so far

  8. EGI-EUDAT integration activity - Pair data and high-throughput computing resources together

    Science.gov (United States)

    Scardaci, Diego; Viljoen, Matthew; Vitlacil, Dejan; Fiameni, Giuseppe; Chen, Yin; Sipos, Gergely; Ferrari, Tiziana

    2016-04-01

    EGI (www.egi.eu) is a publicly funded e-infrastructure put together to give scientists access to more than 530,000 logical CPUs, 200 PB of disk capacity and 300 PB of tape storage to drive research and innovation in Europe. The infrastructure provides both high throughput computing and cloud compute/storage capabilities. Resources are provided by about 350 resource centres which are distributed across 56 countries in Europe, the Asia-Pacific region, Canada and Latin America. EUDAT (www.eudat.eu) is a collaborative Pan-European infrastructure providing research data services, training and consultancy for researchers, research communities, research infrastructures and data centres. EUDAT's vision is to enable European researchers and practitioners from any research discipline to preserve, find, access, and process data in a trusted environment, as part of a Collaborative Data Infrastructure (CDI) conceived as a network of collaborating, cooperating centres, combining the richness of numerous community-specific data repositories with the permanence and persistence of some of Europe's largest scientific data centres. EGI and EUDAT, in the context of their flagship projects, EGI-Engage and EUDAT2020, started in March 2015 a collaboration to harmonise the two infrastructures, including technical interoperability, authentication, authorisation and identity management, policy and operations. The main objective of this work is to provide end-users with seamless access to an integrated infrastructure offering both EGI and EUDAT services, thereby pairing data and high-throughput computing resources together. To define the roadmap of this collaboration, EGI and EUDAT selected a set of relevant user communities, already collaborating with both infrastructures, which could bring requirements and help to assign the right priorities to each of them. In this way, this activity has been driven by the end users from the beginning. The identified user communities are

  9. The Event Detection and the Apparent Velocity Estimation Based on Computer Vision

    Science.gov (United States)

    Shimojo, M.

    2012-08-01

    The high spatial and temporal resolution data obtained by the telescopes aboard Hinode revealed new and interesting dynamics in the solar atmosphere. In order to detect such events and estimate the velocity of the dynamics automatically, we examined optical flow estimation methods based on OpenCV, the computer vision library. We applied the methods to a prominence eruption observed by NoRH and a polar X-ray jet observed by XRT. As a result, it is clear that the methods work well for solar images if the images are optimized for the methods. This indicates that the optical flow estimation methods in the OpenCV library are very useful for analyzing solar phenomena.
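
    For readers who want to reproduce this kind of analysis, the sketch below estimates a dense optical flow field between two consecutive frames with OpenCV's Farneback method and converts the pixel displacements to an apparent velocity. It is an illustrative sketch only: the file names, plate scale and cadence are assumed placeholder values, not values used in the study.

```python
# Hypothetical sketch: dense optical flow between two image frames with OpenCV's
# Farneback estimator, then conversion of pixel displacements to apparent speed.
import cv2
import numpy as np

prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

# Farneback dense optical flow: returns an (H, W, 2) array of per-pixel displacements.
flow = cv2.calcOpticalFlowFarneback(
    prev, curr, None,
    pyr_scale=0.5, levels=3, winsize=15,
    iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

# Convert pixel displacement to apparent speed using an assumed plate scale
# (arcsec per pixel) and cadence (seconds between frames).
ARCSEC_PER_PIXEL = 2.5
CADENCE_S = 60.0
speed = np.hypot(flow[..., 0], flow[..., 1]) * ARCSEC_PER_PIXEL / CADENCE_S
print("median apparent speed [arcsec/s]:", np.median(speed))
```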

  10. Developing Online Learning Resources: Big Data, Social Networks, and Cloud Computing to Support Pervasive Knowledge

    Science.gov (United States)

    Anshari, Muhammad; Alas, Yabit; Guan, Lim Sei

    2016-01-01

    Utilizing online learning resources (OLR) from multiple channels in learning activities promises extended benefits, shifting from a traditional learning-centred approach to a collaborative learning-centred approach that emphasises pervasive learning anywhere and anytime. While compiling big data, cloud computing, and the semantic web into OLR offers a broader spectrum of…

  11. Development of a computationally efficient algorithm for attitude estimation of a remote sensing satellite

    Science.gov (United States)

    Labibian, Amir; Bahrami, Amir Hossein; Haghshenas, Javad

    2017-09-01

    This paper presents a computationally efficient algorithm for attitude estimation of a remote sensing satellite. In this study, a gyro, magnetometer, sun sensor and star tracker are used in an Extended Kalman Filter (EKF) structure for the purpose of Attitude Determination (AD). However, utilizing all of the measurement data simultaneously in the EKF structure increases the computational burden. Specifically, assuming n observation vectors, the inverse of a 3n×3n matrix is required for gain calculation. In order to solve this problem, an efficient version of the EKF, namely Murrell's version, is employed. This method processes the measurements separately at each sampling time for gain computation. Therefore, the inverse of a 3n×3n matrix is replaced by the inverse of a 3×3 matrix for each measurement vector. Moreover, gyro drifts over time can reduce the pointing accuracy. Therefore, a calibration algorithm is utilized for estimation of the main gyro parameters.
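
    The computational saving described above comes from processing each vector observation on its own. The following minimal sketch (a generic implementation assumed for illustration, not the authors' code) shows the sequential measurement update in a linear-Kalman-filter form, where only 3×3 matrices are ever inverted.

```python
# Sketch of a Murrell-style sequential measurement update: each 3-D observation
# is folded into the state one at a time, so only 3x3 inverses are needed
# instead of one 3n x 3n inverse.
import numpy as np

def sequential_update(x, P, measurements):
    """x: state estimate, P: covariance, measurements: list of (H, y, R) with
    H (3 x n) sensitivity matrix, y (3,) observation, R (3 x 3) noise covariance."""
    for H, y, R in measurements:
        S = H @ P @ H.T + R                 # 3x3 innovation covariance
        K = P @ H.T @ np.linalg.inv(S)      # gain: only a 3x3 inverse is required
        x = x + K @ (y - H @ x)             # state update with this measurement
        P = (np.eye(len(x)) - K @ H) @ P    # covariance update
    return x, P
```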

  12. Usefulness of an enhanced Kitaev phase-estimation algorithm in quantum metrology and computation

    Science.gov (United States)

    Kaftal, Tomasz; Demkowicz-Dobrzański, Rafał

    2014-12-01

    We analyze the performance of a generalized Kitaev's phase-estimation algorithm where N phase gates, acting on M qubits prepared in a product state, may be distributed in an arbitrary way. Unlike the standard algorithm, where the mean square error scales as 1/N, the optimal generalizations offer the Heisenberg 1/N^2 error scaling and we show that they are in fact very close to the fundamental Bayesian estimation bound. We also demonstrate that the optimality of the algorithm breaks down when losses are taken into account, in which case the performance is inferior to the optimal entanglement-based estimation strategies. Finally, we show that when an alternative resource quantification is adopted, which describes the phase estimation in Shor's algorithm more accurately, the standard Kitaev's procedure is indeed optimal and there is no need to consider its generalized version.
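
    The scaling comparison referred to above is conventionally written as follows (standard notation, assumed here rather than quoted from the paper): the standard procedure attains shot-noise-like scaling of the mean square error, while optimal distributions of the N phase-gate uses approach the Heisenberg limit.

```latex
% Error scaling of phase estimation with N phase-gate uses (standard notation):
\underbrace{\Delta^{2}\tilde{\varphi} \sim \frac{1}{N}}_{\text{standard Kitaev procedure}}
\qquad\text{versus}\qquad
\underbrace{\Delta^{2}\tilde{\varphi} \sim \frac{1}{N^{2}}}_{\text{optimal gate distribution (Heisenberg scaling)}}
```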

  13. Supplemental computational phantoms to estimate out-of-field absorbed dose in photon radiotherapy

    Science.gov (United States)

    Gallagher, Kyle J.; Tannous, Jaad; Nabha, Racile; Feghali, Joelle Ann; Ayoub, Zeina; Jalbout, Wassim; Youssef, Bassem; Taddei, Phillip J.

    2018-01-01

    The purpose of this study was to develop a straightforward method of supplementing patient anatomy and estimating out-of-field absorbed dose for a cohort of pediatric radiotherapy patients with limited recorded anatomy. A cohort of nine children, aged 2-14 years, who received 3D conformal radiotherapy for low-grade localized brain tumors (LBTs), was randomly selected for this study. The extent of these patients’ computed tomography simulation image sets was cranial only. To approximate their missing anatomy, we supplemented the LBT patients’ image sets with computed tomography images of patients in a previous study with larger extents of matched sex, height, and mass and for whom contours of organs at risk for radiogenic cancer had already been delineated. Rigid fusion was performed between the LBT patients’ data and that of the supplemental computational phantoms using commercial software and in-house codes. In-field dose was calculated with a clinically commissioned treatment planning system, and out-of-field dose was estimated with a previously developed analytical model that was re-fit with parameters based on new measurements for intracranial radiotherapy. Mean doses greater than 1 Gy were found in the red bone marrow, remainder, thyroid, and skin of the patients in this study. Mean organ doses between 150 mGy and 1 Gy were observed in the breast tissue of the girls and lungs of all patients. Distant organs, i.e. prostate, bladder, uterus, and colon, received mean organ doses less than 150 mGy. The mean organ doses of the younger, smaller LBT patients (0-4 years old) were a factor of 2.4 greater than those of the older, larger patients (8-12 years old). Our findings demonstrated the feasibility of a straightforward method of applying supplemental computational phantoms and dose-calculation models to estimate absorbed dose for a set of children of various ages who received radiotherapy and for whom anatomies were largely missing in their original

  14. Estimation of potential biomass resource and biogas production from aquatic plants in Argentina

    Science.gov (United States)

    Fitzsimons, R. E.; Laurino, C. N.; Vallejos, R. H.

    1982-08-01

    The use of aquatic plants in artificial lakes as a biomass source for biogas and fertilizer production through anaerobic fermentation is evaluated, and the magnitude of this resource and the potential production of biogas and fertilizer are estimated. The specific case considered is the artificial lake that will be created by the construction of the Parana Medio Hydroelectric Project on the middle Parana River in Argentina. The growth of the main aquatic plant, water hyacinth, on the middle Parana River has been measured, and its conversion to methane by anaerobic fermentation is determined. It is estimated that gross methane production may be between 1.0 and 4.1 × 10^9 cu cm/year. The fermentation residue can be used as a soil conditioner, and it is estimated that production of the residue may represent between 54,900 and 221,400 tons of nitrogen/year, a value which is 2-8 times the present nitrogen fertilizer demand in Argentina.

  15. Brain-computer interface for alertness estimation and improving

    Science.gov (United States)

    Hramov, Alexander; Maksimenko, Vladimir; Hramova, Marina

    2018-02-01

    Using wavelet analysis of the signals of electrical brain activity (EEG), we study the processes of neural activity associated with the perception of visual stimuli. We demonstrate that the brain can process visual stimuli in two scenarios: (i) perception is characterized by destruction of the alpha-waves and an increase in high-frequency (beta) activity, and (ii) the beta-rhythm is not well pronounced, while the alpha-wave energy remains unchanged. Dedicated experiments show that the motivation factor initiates the first scenario, which is explained by increased alertness. Based on the obtained results, we build a brain-computer interface and demonstrate how the degree of alertness can be estimated and controlled in a real experiment.

  16. Epicardial adipose tissue volume estimation by postmortem computed tomography of eviscerated hearts

    DEFF Research Database (Denmark)

    Hindsø, Louise; Jakobsen, Lykke S; Jacobsen, Christina

    2017-01-01

    Epicardial adipose tissue (EAT) may play a role in the development of coronary artery disease. The purpose of this study was to evaluate a method based on postmortem computed tomography (PMCT) for the estimation of EAT volume. We PMCT-scanned the eviscerated hearts of 144 deceased individuals, wh...

  17. Controlling user access to electronic resources without password

    Science.gov (United States)

    Smith, Fred Hewitt

    2015-06-16

    Described herein are devices and techniques for remotely controlling user access to a restricted computer resource. The process includes pre-determining an association of the restricted computer resource and computer-resource-proximal environmental information. Indicia of user-proximal environmental information are received from a user requesting access to the restricted computer resource. Received indicia of user-proximal environmental information are compared to associated computer-resource-proximal environmental information. User access to the restricted computer resource is selectively granted responsive to a favorable comparison in which the user-proximal environmental information is sufficiently similar to the computer-resource-proximal environmental information. In at least some embodiments, the process further includes obtaining a user-supplied biometric measure and comparing it with a predetermined association of at least one biometric measure of an authorized user. Access to the restricted computer resource is granted in response to a favorable comparison.

  18. A Computable Plug-In Estimator of Minimum Volume Sets for Novelty Detection

    KAUST Repository

    Park, Chiwoo; Huang, Jianhua Z.; Ding, Yu

    2010-01-01

    A minimum volume set of a probability density is a region of minimum size among the regions covering a given probability mass of the density. Effective methods for finding the minimum volume sets are very useful for detecting failures or anomalies in commercial and security applications-a problem known as novelty detection. One theoretical approach of estimating the minimum volume set is to use a density level set where a kernel density estimator is plugged into the optimization problem that yields the appropriate level. Such a plug-in estimator is not of practical use because solving the corresponding minimization problem is usually intractable. A modified plug-in estimator was proposed by Hyndman in 1996 to overcome the computation difficulty of the theoretical approach but is not well studied in the literature. In this paper, we provide theoretical support to this estimator by showing its asymptotic consistency. We also show that this estimator is very competitive to other existing novelty detection methods through an extensive empirical study. ©2010 INFORMS.
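
    A minimal sketch of the plug-in idea (a generic density-level-set construction, not Hyndman's specific modification studied in the paper) is shown below: a kernel density estimator is fitted, the density level covering the target probability mass is approximated by a quantile of the estimated density at the training points, and test points falling below that level are flagged as novelties.

```python
# Generic plug-in minimum-volume-set sketch: KDE level set for novelty detection.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
train = rng.normal(size=(2, 500))            # 2-D training sample (columns = points)
kde = gaussian_kde(train)

# Plug-in level: the alpha-quantile of the estimated density at the training points
# approximates the level whose super-level set holds probability mass 1 - alpha.
alpha = 0.05
level = np.quantile(kde(train), alpha)

test = rng.normal(size=(2, 10)) + np.array([[4.0], [0.0]])   # shifted test points
is_novel = kde(test) < level                 # True -> outside the minimum volume set
print(is_novel)
```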

  19. A Computable Plug-In Estimator of Minimum Volume Sets for Novelty Detection

    KAUST Repository

    Park, Chiwoo

    2010-10-01

    A minimum volume set of a probability density is a region of minimum size among the regions covering a given probability mass of the density. Effective methods for finding the minimum volume sets are very useful for detecting failures or anomalies in commercial and security applications-a problem known as novelty detection. One theoretical approach of estimating the minimum volume set is to use a density level set where a kernel density estimator is plugged into the optimization problem that yields the appropriate level. Such a plug-in estimator is not of practical use because solving the corresponding minimization problem is usually intractable. A modified plug-in estimator was proposed by Hyndman in 1996 to overcome the computation difficulty of the theoretical approach but is not well studied in the literature. In this paper, we provide theoretical support to this estimator by showing its asymptotic consistency. We also show that this estimator is very competitive to other existing novelty detection methods through an extensive empirical study. ©2010 INFORMS.

  20. Self managing experiment resources

    International Nuclear Information System (INIS)

    Stagni, F; Ubeda, M; Charpentier, P; Tsaregorodtsev, A; Romanovskiy, V; Roiser, S; Graciani, R

    2014-01-01

    Within this paper we present an autonomic Computing resources management system, used by LHCb for assessing the status of their Grid resources. Virtual Organizations Grids include heterogeneous resources. For example, LHC experiments very often use resources not provided by WLCG, and Cloud Computing resources will soon provide a non-negligible fraction of their computing power. The lack of standards and procedures across experiments and sites generated the appearance of multiple information systems, monitoring tools, ticket portals, etc., which nowadays coexist and represent a very precious source of information for running the Computing systems of HEP experiments as well as sites. These two facts lead to many particular solutions for a general problem: managing the experiment resources. In this paper we present how LHCb, via the DIRAC interware, addressed such issues. With a renewed Central Information Schema hosting all resources metadata and a Status System (Resource Status System) delivering real time information, the system controls the resources topology, independently of the resource types. The Resource Status System applies data mining techniques against all possible information sources available and assesses the status changes, which are then propagated to the topology description. Obviously, giving full control to such an automated system is not risk-free. Therefore, in order to minimise the probability of misbehavior, a battery of tests has been developed in order to certify the correctness of its assessments. We will demonstrate the performance and efficiency of such a system in terms of cost reduction and reliability.

  1. Weight Estimation Tool for Children Aged 6 to 59 Months in Limited-Resource Settings.

    Science.gov (United States)

    Ralston, Mark E; Myatt, Mark A

    2016-01-01

    A simple, reliable anthropometric tool for rapid estimation of weight in children would be useful in limited-resource settings where current weight estimation tools are not uniformly reliable, nearly all global under-five mortality occurs, severe acute malnutrition is a significant contributor in approximately one-third of under-five mortality, and a weight scale may not be immediately available in emergencies to first-response providers. To determine the accuracy and precision of mid-upper arm circumference (MUAC) and height as weight estimation tools in children under five years of age in low-to-middle income countries. This was a retrospective observational study. Data were collected in 560 nutritional surveys during 1992-2006 using a modified Expanded Program of Immunization two-stage cluster sample design. Locations with high prevalence of acute and chronic malnutrition. A total of 453,990 children met inclusion criteria (age 6-59 months; weight ≤ 25 kg; MUAC 80-200 mm) and exclusion criteria (bilateral pitting edema; biologically implausible weight-for-height z-score (WHZ), weight-for-age z-score (WAZ), and height-for-age z-score (HAZ) values). Weight was estimated using Broselow Tape, Hong Kong formula, and database MUAC alone, height alone, and height and MUAC combined. Mean percentage difference between true and estimated weight, proportion of estimates accurate to within ± 25% and ± 10% of true weight, weighted Kappa statistic, and Bland-Altman bias were reported as measures of tool accuracy. Standard deviation of mean percentage difference and Bland-Altman 95% limits of agreement were reported as measures of tool precision. Database height was a more accurate and precise predictor of weight compared to Broselow Tape 2007 [B], Broselow Tape 2011 [A], and MUAC. Mean percentage difference between true and estimated weight was +0.49% (SD = 10.33%); proportion of estimates accurate to within ± 25% of true weight was 97.36% (95% CI 97.40%, 97.46%); and

  2. Estimate of the Geothermal Energy Resource in the Major Sedimentary Basins in the United States (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Esposito, A.; Porro, C.; Augustine, C.; Roberts, B.

    2012-09-01

    Because most sedimentary basins have been explored for oil and gas, well logs, temperatures at depth, and reservoir properties such as depth to basement and formation thickness are well known. The availability of this data reduces exploration risk and allows development of geologic exploration models for each basin. This study estimates the magnitude of recoverable geothermal energy from 15 major known U.S. sedimentary basins and ranks these basins relative to their potential. The total available thermal resource for each basin was estimated using the volumetric heat-in-place method originally proposed by Muffler (1979). A qualitative recovery factor was determined for each basin based on data on flow volume, hydrothermal recharge, and vertical and horizontal permeability. Total sedimentary thickness maps, stratigraphic columns, cross sections, and temperature gradient information were gathered for each basin from published articles, USGS reports, and state geological survey reports. When published data were insufficient, thermal gradients and reservoir properties were derived from oil and gas well logs obtained from oil and gas commission databases. Basin stratigraphy, structural history, and groundwater circulation patterns were studied in order to develop a model that estimates resource size, temperature distribution, and a probable quantitative recovery factor.
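
    The volumetric ("heat-in-place") estimate referred to above is commonly written as follows; the symbols are the usual ones and are an assumption here, not necessarily the exact notation of the study.

```latex
% Commonly cited form of the volumetric heat-in-place estimate:
Q = \rho c \, V \,\left(T_r - T_{\mathrm{ref}}\right), \qquad Q_{\mathrm{rec}} = R \, Q
% Q        : thermal energy in place
% \rho c   : volumetric heat capacity of the reservoir rock/fluid mixture
% V        : reservoir volume
% T_r      : mean reservoir temperature, T_ref : reference (rejection) temperature
% R        : recovery factor giving the recoverable fraction Q_rec
```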

  3. Effect of water resource development and management on lymphatic filariasis, and estimates of populations at risk.

    Science.gov (United States)

    Erlanger, Tobias E; Keiser, Jennifer; Caldas De Castro, Marcia; Bos, Robert; Singer, Burton H; Tanner, Marcel; Utzinger, Jürg

    2005-09-01

    Lymphatic filariasis (LF) is a debilitating disease overwhelmingly caused by Wuchereria bancrofti, which is transmitted by various mosquito species. Here, we present a systematic literature review with the following objectives: (i) to establish global and regional estimates of populations at risk of LF with particular consideration of water resource development projects, and (ii) to assess the effects of water resource development and management on the frequency and transmission dynamics of the disease. We estimate that globally, 2 billion people are at risk of LF. Among them, there are 394.5 million urban dwellers without access to improved sanitation and 213 million rural dwellers living in close proximity to irrigation. Environmental changes due to water resource development and management consistently led to a shift in vector species composition and generally to a strong proliferation of vector populations. For example, in World Health Organization (WHO) subregions 1 and 2, mosquito densities of the Anopheles gambiae complex and Anopheles funestus were up to 25-fold higher in irrigated areas when compared with irrigation-free sites. Although the infection prevalence of LF often increased after the implementation of a water project, there was no clear association with clinical symptoms. Concluding, there is a need to assess and quantify changes of LF transmission parameters and clinical manifestations over the entire course of water resource developments. Where resources allow, integrated vector management should complement mass drug administration, and broad-based monitoring and surveillance of the disease should become an integral part of large-scale waste management and sanitation programs, whose basic rationale lies in a systemic approach to city, district, and regional level health services and disease prevention.

  4. Requirements for fault-tolerant factoring on an atom-optics quantum computer.

    Science.gov (United States)

    Devitt, Simon J; Stephens, Ashley M; Munro, William J; Nemoto, Kae

    2013-01-01

    Quantum information processing and its associated technologies have reached a pivotal stage in their development, with many experiments having established the basic building blocks. Moving forward, the challenge is to scale up to larger machines capable of performing computational tasks not possible today. This raises questions that need to be urgently addressed, such as what resources these machines will consume and how large will they be. Here we estimate the resources required to execute Shor's factoring algorithm on an atom-optics quantum computer architecture. We determine the runtime and size of the computer as a function of the problem size and physical error rate. Our results suggest that once the physical error rate is low enough to allow quantum error correction, optimization to reduce resources and increase performance will come mostly from integrating algorithms and circuits within the error correction environment, rather than from improving the physical hardware.

  5. Cerebral methodology based computing to estimate real phenomena from large-scale nuclear simulation

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2011-01-01

    Our final goal is to estimate real phenomena from large-scale nuclear simulations by using computing processes. By 'large-scale' we mean simulations that involve such a variety of scales and such physical complexity that corresponding experiments and/or theories do not exist. In the nuclear field, it is indispensable to estimate real phenomena from simulations in order to improve the safety and security of nuclear power plants. Here, the analysis of uncertainty included in simulations is needed to reveal the sensitivity of uncertainty due to randomness, to reduce the uncertainty due to lack of knowledge, and to establish a degree of certainty through verification and validation (V and V) and uncertainty quantification (UQ) processes. To realize this, we propose 'Cerebral Methodology based Computing (CMC)' as a set of computing processes with deductive and inductive approaches, modelled on human reasoning processes. Our idea is to execute deductive and inductive simulations corresponding to these deductive and inductive approaches. We have established a prototype system and applied it to a thermal displacement analysis of a nuclear power plant. The result shows that our idea is effective in reducing the uncertainty and obtaining a degree of certainty. (author)

  6. Computed statistics at streamgages, and methods for estimating low-flow frequency statistics and development of regional regression equations for estimating low-flow frequency statistics at ungaged locations in Missouri

    Science.gov (United States)

    Southard, Rodney E.

    2013-01-01

    The weather and precipitation patterns in Missouri vary considerably from year to year. In 2008, the statewide average rainfall was 57.34 inches and in 2012, the statewide average rainfall was 30.64 inches. This variability in precipitation and resulting streamflow in Missouri underlies the necessity for water managers and users to have reliable streamflow statistics and a means to compute select statistics at ungaged locations for a better understanding of water availability. Knowledge of surface-water availability is dependent on the streamflow data that have been collected and analyzed by the U.S. Geological Survey for more than 100 years at approximately 350 streamgages throughout Missouri. The U.S. Geological Survey, in cooperation with the Missouri Department of Natural Resources, computed streamflow statistics at streamgages through the 2010 water year, defined periods of drought and defined methods to estimate streamflow statistics at ungaged locations, and developed regional regression equations to compute selected streamflow statistics at ungaged locations. Streamflow statistics and flow durations were computed for 532 streamgages in Missouri and in neighboring States. For streamgages with more than 10 years of record, Kendall’s tau was computed to evaluate trends in streamflow data. If trends were detected, the variable length method was used to define the period of no trend. Water years were removed from the dataset from the beginning of the record for a streamgage until no trend was detected. Low-flow frequency statistics were then computed for the entire period of record and for the period of no trend if 10 or more years of record were available for each analysis. Three methods are presented for computing selected streamflow statistics at ungaged locations. The first method uses power curve equations developed for 28 selected streams in Missouri and neighboring States that have multiple streamgages on the same streams. Statistical

  7. DrugSig: A resource for computational drug repositioning utilizing gene expression signatures.

    Directory of Open Access Journals (Sweden)

    Hongyu Wu

    Full Text Available Computational drug repositioning has been proven to be an effective approach to developing new drug uses. However, existing strategies rely heavily on drug response gene signatures that are scattered across separate or individual experimental datasets, resulting in inefficient outputs. A comprehensive database of drug response gene signatures would therefore be very helpful to these methods. We collected drug response microarray data and annotated related drug and target information from public databases and the scientific literature. By selecting the top 500 up-regulated and down-regulated genes as drug signatures, we manually established the DrugSig database. Currently DrugSig contains more than 1300 drugs, 7000 microarrays and 800 targets. Moreover, we developed signature-based and target-based functions to aid drug repositioning. The constructed database can serve as a resource to speed up computational drug repositioning. Database URL: http://biotechlab.fudan.edu.cn/database/drugsig/.

  8. Natural resource damage assessment models for Great Lakes, coastal, and marine environments

    International Nuclear Information System (INIS)

    French, D.P.; Reed, M.

    1993-01-01

    A computer model of the physical fates, biological effects, and economic damages resulting from releases of oil and other hazardous materials has been developed by Applied Science Associates to be used in Type A natural resource damage assessments under the Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA). Natural resource damage assessment models for great lakes environments and for coastal and marine environments will become available. A coupled geographical information system allows gridded representation of complex coastal boundaries, variable bathymetry, shoreline types, and multiple biological habitats. The physical and biological models are three dimensional. Direct mortality from toxic concentrations and oiling, impacts of habitat loss, and food web losses are included in the model. Estimation of natural resource damages is based both on the lost value of injured resources and on the costs of restoring or replacing those resources. The models are implemented on a personal computer, with a VGA graphical user interface. Following public review, the models will become a formal part of the US regulatory framework. The models are programmed in a modular and generic fashion, to facilitate transportability and application to new areas. The model has several major components. Physical fates and biological effects submodels estimate impacts or injury resulting from a spill. The hydrodynamic submodel calculates currents that transport contaminant(s) or organisms. The compensable value submodel values injuries to help assess damages. The restoration submodel determines what restoration actions will most cost-effectively reduce injuries as measured by compensable values. Injury and restoration costs are assessed for each of a series of habitats (environments) affected by the spill. Environmental, chemical, and biological databases supply required information to the model for computing fates and effects (injury)

  9. Client/server models for transparent, distributed computational resources

    International Nuclear Information System (INIS)

    Hammer, K.E.; Gilman, T.L.

    1991-01-01

    Client/server models are proposed to address issues of shared resources in a distributed, heterogeneous UNIX environment. Recent development of an automated Remote Procedure Call (RPC) interface generator has simplified the development of client/server models. Previously, implementation of the models was only possible at the UNIX socket level. An overview of RPCs and the interface generator will be presented and will include a discussion of generation and installation of remote services, the RPC paradigm, and the three levels of RPC programming. Two applications, the Nuclear Plant Analyzer (NPA) and a fluids simulation using molecular modelling, will be presented to demonstrate how client/server models using RPCs and External Data Representations (XDR) have been used in production/computation situations. The NPA incorporates a client/server interface for transferring/translating TRAC or RELAP results from the UNICOS Cray to a UNIX workstation. The fluids simulation program utilizes the client/server model to access the Cray via a single function allowing it to become a shared co-processor to the workstation application. 5 refs., 6 figs

  10. Estimating the Value of Improved Distributed Photovoltaic Adoption Forecasts for Utility Resource Planning

    Energy Technology Data Exchange (ETDEWEB)

    Gagnon, Pieter [National Renewable Energy Lab. (NREL), Golden, CO (United States); Barbose, Galen L. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Stoll, Brady [National Renewable Energy Lab. (NREL), Golden, CO (United States); Ehlen, Ali [National Renewable Energy Lab. (NREL), Golden, CO (United States); Zuboy, Jarret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mai, Trieu [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mills, Andrew D. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2018-05-15

    Misforecasting the adoption of customer-owned distributed photovoltaics (DPV) can have operational and financial implications for utilities; forecasting capabilities can be improved, but generally at a cost. This paper informs this decision-space by using a suite of models to explore the capacity expansion and operation of the Western Interconnection over a 15-year period across a wide range of DPV growth rates and misforecast severities. The system costs under a misforecast are compared against the costs under a perfect forecast, to quantify the costs of misforecasting. Using a simplified probabilistic method applied to these modeling results, an analyst can make a first-order estimate of the financial benefit of improving a utility’s forecasting capabilities, and thus be better informed about whether to make such an investment. For example, under our base assumptions, a utility with 10 TWh per year of retail electric sales who initially estimates that DPV growth could range from 2% to 7.5% of total generation over the next 15 years could expect total present-value savings of approximately $4 million if they could reduce the severity of misforecasting to within ±25%. Utility resource planners can compare those savings against the costs needed to achieve that level of precision, to guide their decision on whether to make an investment in tools or resources.
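
    The first-order benefit estimate described above amounts to weighting the modeled extra cost of each misforecast outcome by its probability, with and without the improved forecast. The toy sketch below illustrates only the arithmetic; all numbers are placeholders, not results from the study.

```python
# Toy "value of better forecasts" calculation: probability-weighted expected extra
# cost under the baseline vs. improved forecasting capability. Numbers are invented.
severities = [-0.50, -0.25, 0.0, 0.25, 0.50]        # fractional DPV misforecast
extra_cost_musd = [9.0, 3.5, 0.0, 3.0, 8.0]         # modeled cost vs. perfect foresight ($M)
p_baseline = [0.15, 0.25, 0.20, 0.25, 0.15]         # current forecast outcome probabilities
p_improved = [0.00, 0.40, 0.20, 0.40, 0.00]         # misforecast capped at +/-25%

def expected(p):
    """Probability-weighted expected extra system cost."""
    return sum(pi * c for pi, c in zip(p, extra_cost_musd))

savings = expected(p_baseline) - expected(p_improved)
print(f"expected present-value savings: ${savings:.1f}M")
```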

  11. Assessing different parameters estimation methods of Weibull distribution to compute wind power density

    International Nuclear Information System (INIS)

    Mohammadi, Kasra; Alavi, Omid; Mostafaeipour, Ali; Goudarzi, Navid; Jalilvand, Mahdi

    2016-01-01

    Highlights: • The effectiveness of six numerical methods is evaluated to determine wind power density. • The more appropriate method for computing the daily wind power density is identified. • Four windy stations located in the southern part of Alberta, Canada are investigated. • The more appropriate parameter estimation method was not identical among all examined stations. - Abstract: In this study, the effectiveness of six numerical methods is evaluated to determine the shape (k) and scale (c) parameters of the Weibull distribution function for the purpose of calculating the wind power density. The selected methods are the graphical method (GP), empirical method of Justus (EMJ), empirical method of Lysen (EML), energy pattern factor method (EPF), maximum likelihood method (ML) and modified maximum likelihood method (MML). The purpose of this study is to identify the more appropriate method for computing the wind power density at four stations distributed in the Alberta province of Canada, namely Edmonton City Center Awos, Grande Prairie A, Lethbridge A and Waterton Park Gate. To provide a complete analysis, the evaluations are performed on both daily and monthly scales. The results indicate that the precision of computed wind power density values changes when different parameter estimation methods are used to determine the k and c parameters. The four methods EMJ, EML, EPF and ML present very favorable efficiency, while the GP method shows weak ability for all stations. However, it is found that the more effective method is not the same across stations owing to differences in the wind characteristics.
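
    As an illustration of one of the six estimators compared above, the sketch below implements the textbook form of the empirical method of Justus (EMJ) and the Weibull-based mean wind power density; the synthetic wind-speed sample and air density are assumptions for demonstration only.

```python
# Empirical method of Justus (EMJ) for Weibull parameters, plus the implied
# mean wind power density. Textbook formulas, not code from the paper.
import numpy as np
from scipy.special import gamma

def weibull_emj(v):
    """Estimate Weibull shape k and scale c (m/s) from a wind-speed sample v."""
    v = np.asarray(v, dtype=float)
    k = (v.std(ddof=1) / v.mean()) ** -1.086      # EMJ shape estimate
    c = v.mean() / gamma(1.0 + 1.0 / k)           # scale from the sample mean
    return k, c

def wind_power_density(k, c, rho=1.225):
    """Mean wind power density (W/m^2) implied by a Weibull(k, c) speed model."""
    return 0.5 * rho * c**3 * gamma(1.0 + 3.0 / k)

v = np.random.default_rng(1).weibull(2.0, 365) * 7.0   # synthetic daily mean speeds
k, c = weibull_emj(v)
print(f"k = {k:.2f}, c = {c:.2f} m/s, P = {wind_power_density(k, c):.1f} W/m^2")
```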

  12. Planning Committee for a National Resource for Computation in Chemistry. Final report, October 1, 1974--June 30, 1977

    International Nuclear Information System (INIS)

    1978-11-01

    The Planning Committee for a National Resource for Computation in Chemistry (NRCC) was charged with the responsibility of formulating recommendations regarding organizational structure for an NRCC including the composition, size, and responsibilities of its policy board, the relationship of such a board to the operating structure of the NRCC, to federal funding agencies, and to user groups; desirable priorities, growth rates, and levels of operations for the first several years; and facilities, access and site requirements for such a Resource. By means of site visits, questionnaires, and a workshop, the Committee sought advice from a wide range of potential users and organizations interested in chemical computation. Chemical kinetics, crystallography, macromolecular science, nonnumerical methods, physical organic chemistry, quantum chemistry, and statistical mechanics are covered

  13. Planning Committee for a National Resource for Computation in Chemistry. Final report, October 1, 1974--June 30, 1977

    Energy Technology Data Exchange (ETDEWEB)

    Bigeleisen, Jacob; Berne, Bruce J.; Coton, F. Albert; Scheraga, Harold A.; Simmons, Howard E.; Snyder, Lawrence C.; Wiberg, Kenneth B.; Wipke, W. Todd

    1978-11-01

    The Planning Committee for a National Resource for Computation in Chemistry (NRCC) was charged with the responsibility of formulating recommendations regarding organizational structure for an NRCC including the composition, size, and responsibilities of its policy board, the relationship of such a board to the operating structure of the NRCC, to federal funding agencies, and to user groups; desirable priorities, growth rates, and levels of operations for the first several years; and facilities, access and site requirements for such a Resource. By means of site visits, questionnaires, and a workshop, the Committee sought advice from a wide range of potential users and organizations interested in chemical computation. Chemical kinetics, crystallography, macromolecular science, nonnumerical methods, physical organic chemistry, quantum chemistry, and statistical mechanics are covered.

  14. A review of the methods available for estimating soil moisture and its implications for water resource management

    Science.gov (United States)

    Dobriyal, Pariva; Qureshi, Ashi; Badola, Ruchi; Hussain, Syed Ainul

    2012-08-01

    The maintenance of elevated soil moisture is an important ecosystem service of natural ecosystems. Understanding the patterns of soil moisture distribution is useful to a wide range of agencies concerned with weather and climate, soil conservation, agricultural production and landscape management. However, the great heterogeneity in the spatial and temporal distribution of soil moisture and the lack of standard methods to estimate this property limit its quantification and use in research. This literature-based review aims to (i) compile the available knowledge on the methods used to estimate soil moisture at the landscape level, (ii) compare and evaluate the available methods on the basis of common parameters such as resource efficiency, accuracy of results and spatial coverage, and (iii) identify the method that will be most useful for forested landscapes in developing countries. On the basis of the strengths and weaknesses of each of the methods reviewed, we conclude that the direct (gravimetric) method is accurate and inexpensive but is destructive, slow and time consuming, and does not allow replication, thereby having limited spatial coverage. The suitability of indirect methods depends on the cost, accuracy, response time, effort involved in installation, management and durability of the equipment. Our review concludes that measurements of soil moisture using the Time Domain Reflectometry (TDR) and Ground Penetrating Radar (GPR) methods are instantaneously obtained and accurate. GPR may be used over larger areas (up to 500 × 500 m a day) but is not cost-effective and is difficult to use in forested landscapes in comparison with TDR. This review will be helpful to researchers, foresters, natural resource managers and agricultural scientists in selecting the appropriate method for estimation of soil moisture keeping in view the time and resources available to them and to generate information for efficient allocation of water resources and

  15. Estimation of effectiveness of automatic exposure control in computed tomograph scanner

    International Nuclear Information System (INIS)

    Akhilesh, Philomina; Sharma, S.D.; Datta, D.; Kulkarni, Arti

    2018-01-01

    With the advent of multiple detector array technology, the use of Computed Tomography (CT) scanning has increased tremendously. Computed Tomography examinations deliver relatively high radiation doses to patients in comparison with conventional radiography. It is therefore necessary to reduce the dose delivered in CT scans without compromising image quality. Several parameters, such as applied potential, tube current, scan length and pitch, influence the dose delivered in CT scans. For optimization of patient dose and image quality, all modern CT scanners are equipped with Automatic Exposure Control (AEC) systems. The aim of this work is to compare the dose delivered during CT scans performed with and without AEC in order to estimate the effectiveness of the AEC techniques used in CT scanners of various manufacturers

  16. RSSI-Based Distance Estimation Framework Using a Kalman Filter for Sustainable Indoor Computing Environments

    Directory of Open Access Journals (Sweden)

    Yunsick Sung

    2016-11-01

    Full Text Available Given that location information is the key to providing a variety of services in sustainable indoor computing environments, it is required to obtain accurate locations. Locations can be estimated by three distances from three fixed points. Therefore, if the distance between two points can be measured or estimated accurately, the location in indoor environments can be estimated. To increase the accuracy of the measured distance, noise filtering, signal revision, and distance estimation processes are generally performed. This paper proposes a novel framework for estimating the distance between a beacon and an access point (AP) in a sustainable indoor computing environment. Diverse types of received signal strength indications (RSSIs) are used for WiFi, Bluetooth, and radio signals, and the proposed distance estimation framework is unique in that it is independent of the specific wireless signal involved, being based on the Bluetooth signal of the beacon. Generally, RSSI measurement, noise filtering, and revision are required for distance estimation using RSSIs. The employed RSSIs are first measured from an AP, with multiple APs sometimes used to increase the accuracy of the distance estimation. Owing to the inevitable presence of noise in the measured RSSIs, the application of noise filtering is essential, and further revision is used to address the inaccuracy and instability that characterize RSSIs measured in an indoor environment. The revised RSSIs are then used to estimate the distance. The proposed distance estimation framework uses one AP to measure the RSSIs, a Kalman filter to eliminate noise, and a log-distance path loss model to revise the measured RSSIs. In the experimental implementation of the framework, both an RSSI filter and a Kalman filter were respectively used for noise elimination to comparatively evaluate the performance of the latter for the specific application. The Kalman filter was found to reduce the accumulated errors by 8
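
    A minimal sketch of the two stages named above (with assumed, illustrative parameter values rather than those of the paper) is given below: a scalar Kalman filter smooths the raw RSSI stream, and a log-distance path-loss model maps the filtered RSSI to a distance estimate.

```python
# Scalar Kalman filter over an RSSI stream, followed by a log-distance
# path-loss conversion to distance. Parameter values are assumptions.
import numpy as np

def kalman_smooth(rssi, q=0.05, r=4.0):
    """1-D Kalman filter (process noise q, measurement noise r) over RSSI samples."""
    x, p = rssi[0], 1.0
    out = []
    for z in rssi:
        p = p + q                      # predict (static-state model)
        k = p / (p + r)                # Kalman gain
        x = x + k * (z - x)            # update with the new measurement
        p = (1.0 - k) * p
        out.append(x)
    return np.array(out)

def rssi_to_distance(rssi, rssi_at_1m=-59.0, path_loss_exp=2.0):
    """Log-distance path-loss model: d = 10 ** ((RSSI_1m - RSSI) / (10 n))."""
    return 10.0 ** ((rssi_at_1m - rssi) / (10.0 * path_loss_exp))

raw = np.array([-63.0, -71.0, -66.0, -69.0, -65.0, -70.0])
print("estimated distance [m]:", rssi_to_distance(kalman_smooth(raw)[-1]))
```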

  17. Analyses of computer programs for the probabilistic estimation of design earthquake and seismological characteristics of the Korean Peninsula

    International Nuclear Information System (INIS)

    Lee, Gi Hwa

    1997-11-01

    The purpose of the present study is to develop predictive equations, adequate for the Korean Peninsula, from simulated motions, and to analyze and utilize the computer programs for the probabilistic estimation of design earthquakes. In part I of the report, computer programs for the probabilistic estimation of design earthquake are analyzed and applied to the seismic hazard characterizations in the Korean Peninsula. In part II of the report, available instrumental earthquake records are analyzed to estimate earthquake source characteristics and medium properties, which are incorporated into the simulation process. Earthquake records are then simulated using the estimated parameters. Finally, predictive equations constructed from the simulation are given in terms of magnitude and hypocentral distances

  18. Reduced-Complexity Direction of Arrival Estimation Using Real-Valued Computation with Arbitrary Array Configurations

    Directory of Open Access Journals (Sweden)

    Feng-Gang Yan

    2018-01-01

    Full Text Available A low-complexity algorithm is presented to dramatically reduce the complexity of the multiple signal classification (MUSIC) algorithm for direction of arrival (DOA) estimation, in which both tasks of eigenvalue decomposition (EVD) and spectral search are implemented with efficient real-valued computations, leading to about 75% complexity reduction as compared to the standard MUSIC. Furthermore, the proposed technique has no dependence on array configurations and is hence suitable for arbitrary array geometries, which shows a significant implementation advantage over most state-of-the-art unitary estimators including unitary MUSIC (U-MUSIC). Numerical simulations over a wide range of scenarios are conducted to show the performance of the new technique, which demonstrates that with a significantly reduced computational complexity, the new approach is able to provide accuracy close to that of the standard MUSIC.
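
    For reference, the baseline being reduced is the standard complex-valued MUSIC pseudospectrum, sketched below for a half-wavelength uniform linear array; this is the textbook algorithm, not the proposed real-valued variant, and the array geometry is an assumption for illustration.

```python
# Standard (complex-valued) MUSIC pseudospectrum for a half-wavelength ULA.
import numpy as np

def music_spectrum(X, n_sources, n_grid=361):
    """X: (n_sensors, n_snapshots) complex array data; returns angles (deg) and spectrum."""
    m = X.shape[0]
    R = X @ X.conj().T / X.shape[1]                    # sample covariance matrix
    w, V = np.linalg.eigh(R)                           # EVD, eigenvalues ascending
    En = V[:, : m - n_sources]                         # noise-subspace eigenvectors
    angles = np.linspace(-90.0, 90.0, n_grid)
    spectrum = np.empty(n_grid)
    for i, th in enumerate(np.deg2rad(angles)):
        a = np.exp(1j * np.pi * np.arange(m) * np.sin(th))   # ULA steering vector
        spectrum[i] = 1.0 / np.real(a.conj() @ En @ En.conj().T @ a)
    return angles, spectrum
```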

  19. Fast computation of statistical uncertainty for spatiotemporal distributions estimated directly from dynamic cone beam SPECT projections

    International Nuclear Information System (INIS)

    Reutter, Bryan W.; Gullberg, Grant T.; Huesman, Ronald H.

    2001-01-01

    The estimation of time-activity curves and kinetic model parameters directly from projection data is potentially useful for clinical dynamic single photon emission computed tomography (SPECT) studies, particularly in those clinics that have only single-detector systems and thus are not able to perform rapid tomographic acquisitions. Because the radiopharmaceutical distribution changes while the SPECT gantry rotates, projections at different angles come from different tracer distributions. A dynamic image sequence reconstructed from the inconsistent projections acquired by a slowly rotating gantry can contain artifacts that lead to biases in kinetic parameters estimated from time-activity curves generated by overlaying regions of interest on the images. If cone beam collimators are used and the focal point of the collimators always remains in a particular transaxial plane, additional artifacts can arise in other planes reconstructed using insufficient projection samples [1]. If the projection samples truncate the patient's body, this can result in additional image artifacts. To overcome these sources of bias in conventional image based dynamic data analysis, we and others have been investigating the estimation of time-activity curves and kinetic model parameters directly from dynamic SPECT projection data by modeling the spatial and temporal distribution of the radiopharmaceutical throughout the projected field of view [2-8]. In our previous work we developed a computationally efficient method for fully four-dimensional (4-D) direct estimation of spatiotemporal distributions from dynamic SPECT projection data [5], which extended Formiconi's least squares algorithm for reconstructing temporally static distributions [9]. In addition, we studied the biases that result from modeling various orders of temporal continuity and using various time samplings [5]. In the present work, we address computational issues associated with evaluating the statistical uncertainty of

  20. Automation of reliability evaluation procedures through CARE - The computer-aided reliability estimation program.

    Science.gov (United States)

    Mathur, F. P.

    1972-01-01

    Description of an on-line interactive computer program called CARE (Computer-Aided Reliability Estimation) which can model self-repair and fault-tolerant organizations and perform certain other functions. Essentially CARE consists of a repository of mathematical equations defining the various basic redundancy schemes. These equations, under program control, are then interrelated to generate the desired mathematical model to fit the architecture of the system under evaluation. The mathematical model is then supplied with ground instances of its variables and is then evaluated to generate values for the reliability-theoretic functions applied to the model.

  1. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments

    Directory of Open Access Journals (Sweden)

    Demeter Lisa

    2010-05-01

    Full Text Available Abstract Background The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various viral fitness experiments. Results Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Conclusions Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced to make the computational tool more flexible in accommodating various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/.
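
    The idea of using all time points rather than two can be illustrated with the generic sketch below (an assumed simplified form, not the exact vFitness model): the relative fitness is taken as the slope of the log ratio of the two variants' counts regressed on time.

```python
# Generic regression-based relative-fitness sketch for a growth competition assay.
import numpy as np

def relative_fitness(t, mutant, wildtype):
    """Slope of ln(mutant/wildtype) vs. time, i.e. the net growth-rate difference."""
    y = np.log(np.asarray(mutant, float) / np.asarray(wildtype, float))
    slope, intercept = np.polyfit(np.asarray(t, float), y, 1)
    return slope

days = [0, 2, 4, 6, 8]
print(relative_fitness(days, [100, 180, 400, 830, 1700], [100, 120, 150, 170, 200]))
```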

  2. Robust Wave Resource Estimation

    DEFF Research Database (Denmark)

    Lavelle, John; Kofoed, Jens Peter

    2013-01-01

    density estimates of the PDF as a function both of Hm0 and Tp, and Hm0 and T0,2, together with the mean wave power per unit crest length, Pw, as a function of Hm0 and T0,2. The wave elevation parameters, from which the wave parameters are calculated, are filtered to correct or remove spurious data. An overview is given of the methods used to do this, and a method for identifying outliers of the wave elevation data, based on the joint distribution of wave elevations and accelerations, is presented. The limitations of using a JONSWAP spectrum to model the measured wave spectra as a function of Hm0 and T0,2 or Hm0 and Tp for the Hanstholm site data are demonstrated. As an alternative, the non-parametric loess method, which does not rely on any assumptions about the shape of the wave elevation spectra, is used to accurately estimate Pw as a function of Hm0 and T0,2.

  3. Methods for obtaining distributions of uranium occurrence from estimates of geologic features

    International Nuclear Information System (INIS)

    Ford, C.E.; McLaren, R.A.

    1980-04-01

    The problem addressed in this paper is the determination of a quantitative estimate of a resource from estimates of fundamental variables which describe the resource. Due to uncertainty about the estimates, these basic variables are stochastic. The evaluation of random equations involving these variables is the core of the analysis process. The basic variables are originally described in terms of a low and a high percentile (the 5th and 95th, for example) and a central value (the mode, mean or median). The variable thus described is then generally assumed to be represented by a three-parameter lognormal distribution. Expressions involving these variables are evaluated by computing the first four central moments of the random functions (which are usually products and sums of variables). Stochastic independence is discussed. From the final set of moments a Pearson distribution is obtained; the high values of skewness and kurtosis resulting from uranium data require obtaining Pearson curves beyond those described in published tables. A cubic spline solution to the Pearson differential equation accomplishes this task. A sample problem is used to illustrate the application of the process; sensitivity to the estimated values of the basic variables is discussed. Appendices contain details of the methods and descriptions of computer programs
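
    The first step described above can be illustrated with the simplified sketch below, which assumes a two-parameter lognormal specified by its median and 95th percentile and returns its first four moments; the report's three-parameter fit and the subsequent Pearson-curve step are not reproduced here.

```python
# Lognormal specified by percentiles -> first four moments (simplified sketch).
import numpy as np
from scipy.stats import norm

def lognormal_moments(median, p95):
    """Mean, variance, skewness and excess kurtosis of a lognormal variable
    defined by its median and 95th percentile."""
    mu = np.log(median)
    sigma = (np.log(p95) - mu) / norm.ppf(0.95)
    mean = np.exp(mu + sigma**2 / 2.0)
    var = (np.exp(sigma**2) - 1.0) * np.exp(2.0 * mu + sigma**2)
    w = np.exp(sigma**2)
    skew = (w + 2.0) * np.sqrt(w - 1.0)
    kurt = w**4 + 2.0 * w**3 + 3.0 * w**2 - 6.0        # excess kurtosis
    return mean, var, skew, kurt

print(lognormal_moments(median=50.0, p95=200.0))
```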

  4. Methods for obtaining distributions of uranium occurrence from estimates of geologic features

    International Nuclear Information System (INIS)

    Ford, C.E.; McLaren, R.A.

    1980-04-01

    The problem addressed in this report is the determination of a quantitative estimate of a resource from estimates of fundamental variables which describe the resource. Due to uncertainty about the estimates, these basic variables are stochastic. The evaluation of random equations involving these variables is the core of the analysis process. The basic variables are originally described in terms of a low and a high percentile (the 5th and 95th, for example) and a central value (the mode, mean or median). The variable thus described is then generally assumed to be represented by a three-parameter lognormal distribution. Expressions involving these variables are evaluated by computing the first four central moments of the random functions (which are usually products and sums of variables). Stochastic independence is discussed. From the final set of moments a Pearson distribution is obtained; the high values of skewness and kurtosis resulting from uranium data require obtaining Pearson curves beyond those described in published tables. A cubic spline solution to the Pearson differential equation accomplishes this task. A sample problem is used to illustrate the application of the process; sensitivity to the estimated values of the basic variables is discussed. Appendices contain details of the methods and descriptions of computer programs

  5. Monte Carlo estimation of the absorbed dose in computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jin Woo; Youn, Han Bean; Kim, Ho Kyung [Pusan National University, Busan (Korea, Republic of)

    2016-05-15

    The purpose of this study is to devise an algorithm for calculating patient absorbed dose distributions based on Monte Carlo (MC) methods, including the dose contributions from both primary and secondary (scattered) x-ray photons. Assessment of patient dose in computed tomography (CT) at the population level has become a subject of public attention and concern, and CT quality assurance and dose optimization ultimately aim to reduce radiation-induced cancer risks in the examined population. However, the conventional CT dose index (CTDI) is not a surrogate for risk; it was designed to measure an average central dose. In addition, the CTDI and the dose-length product have shown limitations for helical CT with wide beam collimation. Simple algorithms to estimate a patient-specific CT dose based on MCNP output data have been introduced. Spatial dose distributions were calculated for numerical chest and head phantoms, and the results were reasonable. The estimated dose distribution map can be readily converted into an effective dose. Further work includes validation of the models against experimental measurements and acceleration of the algorithms.
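
    The record notes that a dose distribution map can be converted into an effective dose; the fragment below illustrates that conversion as a tissue-weighted sum using a partial, example set of ICRP-style weighting factors. It is not the authors' code, and a real calculation needs the full factor list and radiation weighting factors.

    ```python
    """Illustrative conversion of organ equivalent doses to an effective dose.
    The weighting factors are an incomplete example subset (they do not sum to 1)."""

    TISSUE_WEIGHTS = {           # example subset of ICRP-style weighting factors
        "red_bone_marrow": 0.12,
        "colon": 0.12,
        "lung": 0.12,
        "stomach": 0.12,
        "breast": 0.12,
        "gonads": 0.08,
        "bladder": 0.04,
        "liver": 0.04,
        "thyroid": 0.04,
    }


    def effective_dose_sv(organ_equivalent_dose_sv):
        """Weighted sum of organ equivalent doses (Sv); unlisted organs are ignored."""
        return sum(TISSUE_WEIGHTS.get(organ, 0.0) * dose
                   for organ, dose in organ_equivalent_dose_sv.items())


    print(effective_dose_sv({"lung": 2.0e-3, "stomach": 1.5e-3, "liver": 1.0e-3}))
    ```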

  6. Stochastic resource allocation in emergency departments with a multi-objective simulation optimization algorithm.

    Science.gov (United States)

    Feng, Yen-Yi; Wu, I-Chin; Chen, Tzu-Li

    2017-03-01

    The number of emergency cases or emergency room visits rapidly increases annually, thus leading to an imbalance in supply and demand and to the long-term overcrowding of hospital emergency departments (EDs). However, current solutions to increase medical resources and improve the handling of patient needs are either impractical or infeasible in the Taiwanese environment. Therefore, EDs must optimize resource allocation given limited medical resources to minimize the average length of stay of patients and medical resource waste costs. This study constructs a multi-objective mathematical model for medical resource allocation in EDs in accordance with emergency flow or procedure. The proposed mathematical model is complex and difficult to solve because its performance value is stochastic; furthermore, the model considers both objectives simultaneously. Thus, this study develops a multi-objective simulation optimization algorithm by integrating a non-dominated sorting genetic algorithm II (NSGA II) with multi-objective computing budget allocation (MOCBA) to address the challenges of multi-objective medical resource allocation. NSGA II is used to investigate plausible solutions for medical resource allocation, and MOCBA identifies effective sets of feasible Pareto (non-dominated) medical resource allocation solutions in addition to effectively allocating simulation or computation budgets. The discrete event simulation model of ED flow is inspired by a Taiwan hospital case and is constructed to estimate the expected performance values of each medical allocation solution as obtained through NSGA II. Finally, computational experiments are performed to verify the effectiveness and performance of the integrated NSGA II and MOCBA method, as well as to derive non-dominated medical resource allocation solutions from the algorithms.
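
    The non-dominated (Pareto) filtering at the heart of NSGA-II-style searches can be sketched in a few lines; the objective pairs below (average length of stay, resource waste cost), both minimized, are invented, and the snippet is a conceptual illustration rather than the paper's algorithm.

    ```python
    """Minimal Pareto (non-dominated) filtering over invented two-objective solutions."""


    def dominates(a, b):
        """True if a is at least as good as b in every objective and better in one."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))


    def pareto_front(solutions):
        """Return the non-dominated subset of a list of objective vectors."""
        return [s for s in solutions
                if not any(dominates(other, s) for other in solutions if other is not s)]


    # (average length of stay in hours, resource waste cost) per candidate allocation
    candidates = [(4.2, 130.0), (3.9, 150.0), (4.5, 120.0), (4.2, 145.0), (5.0, 160.0)]
    print(pareto_front(candidates))
    ```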

  7. Generalized location-based resource allocation for OFDMA cognitive radio systems

    KAUST Repository

    Ben Ghorbel, Mahdi

    2010-09-01

    Cognitive radio is one of the hot topics for emerging and future wireless communication. Cognitive users can share channels with primary users under the condition of non-interference. To compute this interference, the cognitive system usually uses the channel state information of the primary user, which is often impractical to obtain. However, using location information, the interference can be estimated by pathloss computation. In this paper, we introduce a low-complexity resource allocation algorithm for orthogonal frequency division multiple access (OFDMA) based cognitive radio systems, which uses relative location information between primary and secondary users to estimate the interference. The algorithm considers interference with multiple primary users having different thresholds. The simulation results show the efficiency of the proposed algorithm by comparing it with an optimal exhaustive search method. © 2010 IEEE.
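
    A hypothetical sketch of the location-based interference check described above: a log-distance path-loss model is applied to known primary/secondary positions and the result is compared against per-primary-user thresholds. The exponent, reference loss, positions and thresholds are assumed values, not taken from the paper.

    ```python
    """Location-based interference estimate via a log-distance path-loss model."""
    import math


    def path_loss_db(distance_m, ref_loss_db=40.0, exponent=3.5, ref_distance_m=1.0):
        """Log-distance path loss in dB (illustrative parameter values)."""
        return ref_loss_db + 10.0 * exponent * math.log10(distance_m / ref_distance_m)


    def interference_dbm(tx_power_dbm, secondary_xy, primary_xy):
        """Estimated received interference power at a primary user (dBm)."""
        d = math.hypot(secondary_xy[0] - primary_xy[0], secondary_xy[1] - primary_xy[1])
        return tx_power_dbm - path_loss_db(max(d, 1.0))


    # A candidate power assignment is admissible only if every primary user's
    # individual interference threshold is respected.
    thresholds_dbm = {"PU1": -110.0, "PU2": -105.0}
    primaries = {"PU1": (250.0, 0.0), "PU2": (0.0, 600.0)}
    tx_dbm = 20.0  # candidate secondary transmit power
    admissible = all(interference_dbm(tx_dbm, (0.0, 0.0), pos) <= thresholds_dbm[pu]
                     for pu, pos in primaries.items())
    print("assignment admissible:", admissible)
    ```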

  8. NETL CO2 Storage prospeCtive Resource Estimation Excel aNalysis (CO2-SCREEN) User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Sanguinito, Sean M. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Goodman, Angela [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Levine, Jonathan [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States)

    2017-04-03

    This user’s manual guides the use of the National Energy Technology Laboratory’s (NETL) CO2 Storage prospeCtive Resource Estimation Excel aNalysis (CO2-SCREEN) tool, which was developed to aid users screening saline formations for prospective CO2 storage resources. CO2-SCREEN applies U.S. Department of Energy (DOE) methods and equations for estimating prospective CO2 storage resources for saline formations. CO2-SCREEN was developed to be substantive and user-friendly. It also provides a consistent method for calculating prospective CO2 storage resources that allows for consistent comparison of results between different research efforts, such as the Regional Carbon Sequestration Partnerships (RCSP). CO2-SCREEN consists of an Excel spreadsheet containing geologic inputs and outputs, linked to a GoldSim Player model that calculates prospective CO2 storage resources via Monte Carlo simulation.
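
    The DOE-style volumetric estimate that CO2-SCREEN evaluates by Monte Carlo can be approximated in a few lines; the parameter ranges below are invented placeholders, and the snippet stands in for, rather than reproduces, the Excel/GoldSim workflow.

    ```python
    """Volumetric prospective storage estimate G_CO2 = A * h * phi * rho_CO2 * E,
    sampled by Monte Carlo over invented parameter ranges."""
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    area_m2  = rng.uniform(4.5e8, 5.5e8, n)   # formation area
    thick_m  = rng.uniform(15.0, 35.0, n)     # net thickness
    porosity = rng.uniform(0.10, 0.22, n)     # effective porosity
    rho_co2  = rng.uniform(600.0, 750.0, n)   # kg/m^3 at reservoir conditions
    eff      = rng.uniform(0.01, 0.06, n)     # storage efficiency factor

    mass_mt = area_m2 * thick_m * porosity * rho_co2 * eff / 1e9   # kg -> million tonnes

    p10, p50, p90 = np.percentile(mass_mt, [10, 50, 90])
    print(f"Prospective storage resource: P10={p10:.0f}  P50={p50:.0f}  P90={p90:.0f} Mt CO2")
    ```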

  9. INTDOS: a computer code for estimating internal radiation dose using recommendations of the International Commission on Radiological Protection

    International Nuclear Information System (INIS)

    Ryan, M.T.

    1981-09-01

    INTDOS is a user-oriented computer code designed to calculate estimates of internal radiation dose commitment resulting from the acute inhalation intake of various radionuclides. It is designed so that users unfamiliar with the details of such calculations can obtain results by answering a few questions regarding the exposure case. The user must identify the radionuclide name, solubility class, particle size, time since exposure, and the measured lung burden. INTDOS calculates the fraction of the lung burden remaining at time, t, postexposure, considering the solubility class and particle size information. From the fraction remaining in the lung at time, t, the quantity inhaled is estimated. Radioactive decay is accounted for in the estimate. Finally, effective committed dose equivalents to various organs and tissues of the body are calculated using inhalation committed dose factors presented by the International Commission on Radiological Protection (ICRP). This computer code was written for execution on a Digital Equipment Corporation PDP-10 computer and is written in Fortran IV. A flow chart and example calculations are discussed in detail to aid the user who is unfamiliar with computer operations
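
    A toy version of the back-calculation idea, not INTDOS's actual lung model or data: infer the intake from a measured lung burden via an assumed single-exponential retention function plus radioactive decay, then apply an inhalation dose coefficient. All numbers are placeholders.

    ```python
    """Toy intake back-calculation from a measured lung burden (placeholder values)."""
    import math


    def intake_from_lung_burden(burden_bq, days_post_exposure,
                                retention_half_life_d, radio_half_life_d):
        """Activity inhaled (Bq) consistent with the burden measured at time t."""
        biological = math.exp(-math.log(2) * days_post_exposure / retention_half_life_d)
        physical = math.exp(-math.log(2) * days_post_exposure / radio_half_life_d)
        return burden_bq / (biological * physical)


    intake = intake_from_lung_burden(burden_bq=500.0, days_post_exposure=30.0,
                                     retention_half_life_d=500.0, radio_half_life_d=8760.0)
    dose_coefficient_sv_per_bq = 1.0e-6   # placeholder inhalation dose coefficient
    print(f"intake ~ {intake:.0f} Bq, "
          f"committed dose ~ {intake * dose_coefficient_sv_per_bq:.2e} Sv")
    ```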

  10. PredMP: A Web Resource for Computationally Predicted Membrane Proteins via Deep Learning

    KAUST Repository

    Wang, Sheng

    2018-02-06

    Experimental determination of membrane protein (MP) structures is challenging as they are often too large for nuclear magnetic resonance (NMR) experiments and difficult to crystallize. Currently there are only about 510 non-redundant MPs with solved structures in Protein Data Bank (PDB). To elucidate the MP structures computationally, we developed a novel web resource, denoted as PredMP (http://52.87.130.56:3001/#/proteinindex), that delivers one-dimensional (1D) annotation of the membrane topology and secondary structure, two-dimensional (2D) prediction of the contact/distance map, together with three-dimensional (3D) modeling of the MP structure in the lipid bilayer, for each MP target from a given model organism. The precision of the computationally constructed MP structures is leveraged by state-of-the-art deep learning methods as well as cutting-edge modeling strategies. In particular, (i) we annotate 1D property via DeepCNF (Deep Convolutional Neural Fields) that not only models complex sequence-structure relationship but also interdependency between adjacent property labels; (ii) we predict 2D contact/distance map through Deep Transfer Learning which learns the patterns as well as the complex relationship between contacts/distances and protein features from non-membrane proteins; and (iii) we model 3D structure by feeding its predicted contacts and secondary structure to the Crystallography & NMR System (CNS) suite combined with a membrane burial potential that is residue-specific and depth-dependent. PredMP currently contains more than 2,200 multi-pass transmembrane proteins (length<700 residues) from Human. These transmembrane proteins are classified according to IUPHAR/BPS Guide, which provides a hierarchical organization of receptors, channels, transporters, enzymes and other drug targets according to their molecular relationships and physiological functions. Among these MPs, we estimated that our approach could predict correct folds for 1

  11. A geo-informatics approach for estimating water resources management components and their interrelationships

    KAUST Repository

    Liaqat, Umar Waqas

    2016-09-21

    A remote sensing based geo-informatics approach was developed to estimate water resources management (WRM) components across a large irrigation scheme in the Indus Basin of Pakistan. The approach provides a generalized framework for estimating a range of key water management variables and provides a management tool for the sustainable operation of similar schemes globally. A focus on the use of satellite data allowed for the quantification of relationships across a range of spatial and temporal scales. Variables including actual and crop evapotranspiration, net and gross irrigation, net and gross groundwater use, groundwater recharge, net groundwater recharge, were estimated and then their interrelationships explored across the Hakra Canal command area. Spatially distributed remotely sensed estimates of actual evapotranspiration (ETa) rates were determined using the Surface Energy Balance System (SEBS) model and evaluated against ground-based evaporation calculated from the advection-aridity method. Analysis of ETa simulations across two cropping season, referred to as Kharif and Rabi, yielded Pearson correlation (R) values of 0.69 and 0.84, Nash-Sutcliffe criterion (NSE) of 0.28 and 0.63, percentage bias of −3.85% and 10.6% and root mean squared error (RMSE) of 10.6 mm and 12.21 mm for each season, respectively. For the period of study between 2008 and 2014, it was estimated that an average of 0.63 mm day−1 water was supplied through canal irrigation against a crop water demand of 3.81 mm day−1. Approximately 1.86 mm day−1 groundwater abstraction was estimated in the region, which contributed to fulfil the gap between crop water demand and canal water supply. Importantly, the combined canal, groundwater and rainfall sources of water only met 70% of the crop water requirements. As such, the difference between recharge and discharge showed that groundwater depletion was around −115 mm year−1 during the six year study period. Analysis indicated that
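
    The skill scores quoted in this record (Pearson R, Nash-Sutcliffe efficiency, percent bias, RMSE) follow from standard definitions; the generic sketch below reproduces those formulas for any paired simulated/observed series and is not the study's own code.

    ```python
    """Standard goodness-of-fit scores for paired simulated and observed series."""
    import numpy as np


    def skill_scores(simulated, observed):
        sim, obs = np.asarray(simulated, float), np.asarray(observed, float)
        r = np.corrcoef(sim, obs)[0, 1]
        nse = 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)
        pbias = 100.0 * np.sum(sim - obs) / np.sum(obs)   # sign convention varies
        rmse = float(np.sqrt(np.mean((sim - obs) ** 2)))
        return {"R": r, "NSE": nse, "PBIAS_%": pbias, "RMSE": rmse}


    print(skill_scores([3.1, 4.0, 2.2, 5.3], [3.4, 3.8, 2.5, 5.0]))
    ```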

  12. Optimizing qubit resources for quantum chemistry simulations in second quantization on a quantum computer

    International Nuclear Information System (INIS)

    Moll, Nikolaj; Fuhrer, Andreas; Staar, Peter; Tavernelli, Ivano

    2016-01-01

    Quantum chemistry simulations on a quantum computer suffer from the overhead needed for encoding the Fermionic problem in a system of qubits. By exploiting the block diagonality of a Fermionic Hamiltonian, we show that the number of required qubits can be reduced while the number of terms in the Hamiltonian will increase. All operations for this reduction can be performed in operator space. The scheme is conceived as a pre-computational step that would be performed prior to the actual quantum simulation. We apply this scheme to reduce the number of qubits necessary to simulate both the Hamiltonian of the two-site Fermi–Hubbard model and the hydrogen molecule. Both quantum systems can then be simulated with a two-qubit quantum computer. Despite the increase in the number of Hamiltonian terms, the scheme still remains a useful tool to reduce the dimensionality of specific quantum systems for quantum simulators with a limited number of resources. (paper)

  13. The Support Reduction Algorithm for Computing Non-Parametric Function Estimates in Mixture Models

    OpenAIRE

    GROENEBOOM, PIET; JONGBLOED, GEURT; WELLNER, JON A.

    2008-01-01

    In this paper, we study an algorithm (which we call the support reduction algorithm) that can be used to compute non-parametric M-estimators in mixture models. The algorithm is compared with natural competitors in the context of convex regression and the ‘Aspect problem’ in quantum physics.

  14. Resource management in utility and cloud computing

    CERN Document Server

    Zhao, Han

    2013-01-01

    This SpringerBrief reviews the existing market-oriented strategies for economically managing resource allocation in distributed systems. It describes three new schemes that address cost-efficiency, user incentives, and allocation fairness with regard to different scheduling contexts. The first scheme, taking the Amazon EC2 market as a case study, investigates the optimal resource rental planning models based on linear integer programming and stochastic optimization techniques. This model is useful to explore the interaction between the cloud infrastructure provider and the cloud resource c

  15. The Usage of informal computer based communication in the context of organization’s technological resources

    OpenAIRE

    Raišienė, Agota Giedrė; Jonušauskas, Steponas

    2011-01-01

    The purpose of the article is to theoretically and practically analyze the features of informal computer-based communication in the context of an organization's technological resources. Methodology - meta-analysis, survey and descriptive analysis. According to scientists, the functions of informal communication cover sharing of work-related information, coordination of team activities, spread of organizational culture and a feeling of interdependence and affinity. Also, informal communication widens the ...

  16. Resource-adaptive cognitive processes

    CERN Document Server

    Crocker, Matthew W

    2010-01-01

    This book investigates the adaptation of cognitive processes to limited resources. The central topics of this book are heuristics considered as results of the adaptation to resource limitations, through natural evolution in the case of humans, or through artificial construction in the case of computational systems; the construction and analysis of resource control in cognitive processes; and an analysis of resource-adaptivity within the paradigm of concurrent computation. The editors integrated the results of a collaborative 5-year research project that involved over 50 scientists. After a mot

  17. Computable error estimates of a finite difference scheme for option pricing in exponential Lévy models

    KAUST Repository

    Kiessling, Jonas

    2014-05-06

    Option prices in exponential Lévy models solve certain partial integro-differential equations. This work focuses on developing novel, computable error approximations for a finite difference scheme that is suitable for solving such PIDEs. The scheme was introduced in (Cont and Voltchkova, SIAM J. Numer. Anal. 43(4):1596-1626, 2005). The main results of this work are new estimates of the dominating error terms, namely the time and space discretisation errors. In addition, the leading order terms of the error estimates are determined in a form that is more amenable to computations. The payoff is only assumed to satisfy an exponential growth condition; it is not assumed to be Lipschitz continuous as in previous works. If the underlying Lévy process has infinite jump activity, then the jumps smaller than some threshold are approximated by diffusion. The resulting diffusion approximation error is also estimated, with leading order term in computable form, as well as the dependence of the time and space discretisation errors on this approximation. Consequently, it is possible to determine how to jointly choose the space and time grid sizes and the cut-off parameter. © 2014 Springer Science+Business Media Dordrecht.

  18. Monitoring of computing resource use of active software releases at ATLAS

    Science.gov (United States)

    Limosani, Antonio; ATLAS Collaboration

    2017-10-01

    The LHC is the world’s most powerful particle accelerator, colliding protons at a centre-of-mass energy of 13 TeV. As the energy and frequency of collisions has grown in the search for new physics, so too has demand for computing resources needed for event reconstruction. We will report on the evolution of resource usage in terms of CPU and RAM in key ATLAS offline reconstruction workflows at the Tier-0 at CERN and on the WLCG. Monitoring of workflows is achieved using the ATLAS PerfMon package, which is the standard ATLAS performance monitoring system running inside Athena jobs. Systematic daily monitoring has recently been expanded to include all workflows beginning at Monte Carlo generation through to end-user physics analysis, beyond that of event reconstruction. Moreover, the move to a multiprocessor mode in production jobs has facilitated the use of tools, such as “MemoryMonitor”, to measure the memory shared across processors in jobs. Resource consumption is broken down into software domains and displayed in plots generated using Python visualization libraries and collected into pre-formatted auto-generated Web pages, which allow the ATLAS developer community to track the performance of their algorithms. This information is however preferentially filtered to domain leaders and developers through the use of JIRA and via reports given at ATLAS software meetings. Finally, we take a glimpse of the future by reporting on the expected CPU and RAM usage in benchmark workflows associated with the High Luminosity LHC and anticipate the ways performance monitoring will evolve to understand and benchmark future workflows.

  19. Linking resource selection and mortality modeling for population estimation of mountain lions in Montana

    Science.gov (United States)

    Robinson, Hugh S.; Ruth, Toni K.; Gude, Justin A.; Choate, David; DeSimone, Rich; Hebblewhite, Mark; Matchett, Marc R.; Mitchell, Michael S.; Murphy, Kerry; Williams, Jim

    2015-01-01

    To be most effective, the scale of wildlife management practices should match the range of a particular species’ movements. For this reason, combined with our inability to rigorously or regularly census mountain lion populations, several authors have suggested that mountain lions be managed in a source-sink or metapopulation framework. We used a combination of resource selection functions, mortality estimation, and dispersal modeling to estimate cougar population levels in Montana statewide and potential population-level effects of planned harvest levels. Between 1980 and 2012, 236 independent mountain lions were collared and monitored for research in Montana. From these data we used 18,695 GPS locations collected during winter from 85 animals to develop a resource selection function (RSF), and 11,726 VHF and GPS locations from 142 animals along with the locations of 6343 mountain lions harvested from 1988–2011 to validate the RSF model. Our RSF model validated well in all portions of the State, although it appeared to perform better in Montana Fish, Wildlife and Parks (MFWP) Regions 1, 2, 4 and 6, than in Regions 3, 5, and 7. Our mean RSF-based population estimate for the total population (kittens, juveniles, and adults) of mountain lions in Montana in 2005 was 3926, with almost 25% of the entire population in MFWP Region 1. Estimates based on high and low reference population estimates produce a possible range of 2784 to 5156 mountain lions statewide. Based on a range of possible survival rates we estimated the mountain lion population in Montana to be stable to slightly increasing between 2005 and 2010 with lambda ranging from 0.999 (SD = 0.05) to 1.02 (SD = 0.03). We believe these population growth rates to be a conservative estimate of true population growth. Our model suggests that proposed changes to female harvest quotas for 2013–2015 will result in an annual statewide population decline of 3% and shows that, due to reduced dispersal, changes to

  20. Using Mosix for Wide-Area Computational Resources

    Science.gov (United States)

    Maddox, Brian G.

    2004-01-01

    One of the problems with using traditional Beowulf-type distributed processing clusters is that they require an investment in dedicated computer resources. These resources are usually needed in addition to pre-existing ones such as desktop computers and file servers. Mosix is a series of modifications to the Linux kernel that creates a virtual computer, featuring automatic load balancing by migrating processes from heavily loaded nodes to less used ones. An extension of the Beowulf concept is to run a Mosix-enabled Linux kernel on a large number of computer resources in an organization. This configuration would provide a very large amount of computational resources based on pre-existing equipment. The advantage of this method is that it provides much more processing power than a traditional Beowulf cluster without the added costs of dedicating resources.

  1. A Lightweight Distributed Framework for Computational Offloading in Mobile Cloud Computing

    Science.gov (United States)

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in processing potential, storage capacity and battery lifetime of the Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks are proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs features of centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC. PMID:25127245

  2. A lightweight distributed framework for computational offloading in mobile cloud computing.

    Directory of Open Access Journals (Sweden)

    Muhammad Shiraz

    Full Text Available The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in processing potential, storage capacity and battery lifetime of the Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks are proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs features of centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC.

  3. The Usage of Informal Computer Based Communication in the Context of Organization’s Technological Resources

    Directory of Open Access Journals (Sweden)

    Steponas Jonušauskas

    2011-12-01

    Full Text Available The purpose of the article is to theoretically and practically analyze the features of informal computer-based communication in the context of an organization’s technological resources. Methodology - meta-analysis, survey and descriptive analysis. Findings. According to scientists, the functions of informal communication cover sharing of work-related information, coordination of team activities, spread of organizational culture and a feeling of interdependence and affinity. Also, informal communication widens individuals’ recognition of reality, creates a general context of environment between talkers, and strengthens interpersonal attraction. For these reasons, informal communication is desirable and even necessary in organizations because it helps to ensure efficient functioning of the enterprise. However, communicating through electronic channels suppresses informal connections or addresses them to the outside of the organization. So, electronic communication is not beneficial for developing ties in an informal organizational network. The empirical research showed that a significant part of the courts administration staff is prone to use the technological resources of their office for informal communication. Representatives of courts administration choose friends for computer-based communication much more often than colleagues (72% and 63%, respectively). 93% of the research respondents use an additional e-mail box serviced by commercial providers for non-work communication. The high intensity of informal electronic communication with friends and familiars shows that workers of the court administration are used to meeting their psycho-emotional needs outside the workplace. The survey confirmed the conclusion of the theoretical analysis: computer-based communication is not beneficial for developing informal contacts between workers. In order for informal communication to carry out its functions and for the technological resources of the organization to be used effectively, staff

  4. The Usage of Informal Computer Based Communication in the Context of Organization’s Technological Resources

    Directory of Open Access Journals (Sweden)

    Agota Giedrė Raišienė

    2013-08-01

    Full Text Available The purpose of the article is to theoretically and practically analyze the features of informal computer-based communication in the context of an organization’s technological resources. Methodology - meta-analysis, survey and descriptive analysis. Findings. According to scientists, the functions of informal communication cover sharing of work-related information, coordination of team activities, spread of organizational culture and a feeling of interdependence and affinity. Also, informal communication widens individuals’ recognition of reality, creates a general context of environment between talkers, and strengthens interpersonal attraction. For these reasons, informal communication is desirable and even necessary in organizations because it helps to ensure efficient functioning of the enterprise. However, communicating through electronic channels suppresses informal connections or addresses them to the outside of the organization. So, electronic communication is not beneficial for developing ties in an informal organizational network. The empirical research showed that a significant part of the courts administration staff is prone to use the technological resources of their office for informal communication. Representatives of courts administration choose friends for computer-based communication much more often than colleagues (72% and 63%, respectively). 93% of the research respondents use an additional e-mail box serviced by commercial providers for non-work communication. The high intensity of informal electronic communication with friends and familiars shows that workers of the court administration are used to meeting their psycho-emotional needs outside the workplace. The survey confirmed the conclusion of the theoretical analysis: computer-based communication is not beneficial for developing informal contacts between workers. In order for informal communication to carry out its functions and for the technological resources of the organization to be used effectively, staff

  5. Complex data modeling and computationally intensive methods for estimation and prediction

    CERN Document Server

    Secchi, Piercesare; Advances in Complex Data Modeling and Computational Methods in Statistics

    2015-01-01

    The book is addressed to statisticians working at the forefront of the statistical analysis of complex and high dimensional data and offers a wide variety of statistical models, computer intensive methods and applications: network inference from the analysis of high dimensional data; new developments for bootstrapping complex data; regression analysis for measuring the downsize reputational risk; statistical methods for research on the human genome dynamics; inference in non-euclidean settings and for shape data; Bayesian methods for reliability and the analysis of complex data; methodological issues in using administrative data for clinical and epidemiological research; regression models with differential regularization; geostatistical methods for mobility analysis through mobile phone data exploration. This volume is the result of a careful selection among the contributions presented at the conference "S.Co.2013: Complex data modeling and computationally intensive methods for estimation and prediction" held...

  6. Computational Fluid Dynamic Pressure Drop Estimation of Flow between Parallel Plates

    Energy Technology Data Exchange (ETDEWEB)

    Son, Hyung Min; Yang, Soo Hyung; Park, Jong Hark [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    Many pool-type reactors have forced downward flow inside the core during normal operation; there is a chance of flow inversion when transients occur. During this phase, the flow undergoes transition between turbulent and laminar regions where drastic changes take place in terms of momentum and heat transfer, and a decrease in safety margin is usually observed. Additionally, for high Prandtl number fluids such as water, the effect of the velocity profile inside the channel on the temperature distribution is more pronounced than for low Prandtl number fluids. This makes the checking of its pressure drop estimation accuracy less important, assuming the code verification is complete. With the advent of powerful computer hardware, engineering applications of computational fluid dynamics (CFD) methods have become quite common these days. Especially for fully-turbulent, single-phase convective heat transfer, the predictability of the commercial codes has matured enough that many well-known companies adopt them to accelerate the product development cycle and realize increased profitability. In contrast, the transition models for CFD codes are still under development, and most of the models show limited generality and prediction accuracy. Unlike the system codes, CFD codes estimate the pressure drop from the velocity profile, which is obtained by solving the momentum conservation equations, and the resulting friction factor can be a representative parameter for a constant cross-section channel flow. In addition, the flow inside a rectangular channel with a high span-to-gap ratio can be approximated by flow between parallel plates. The computational fluid dynamics simulation of the flow between parallel plates showed reasonable prediction capability for both the laminar and turbulent regimes.
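
    A useful hand benchmark for such simulations is the exact laminar result for parallel plates (Darcy friction factor f = 96/Re, with the hydraulic diameter taken as twice the gap); the sketch below applies it, with a Blasius-type pipe correlation as a rough turbulent stand-in and assumed water properties.

    ```python
    """Hand-calculation pressure drop benchmark for flow between wide parallel plates."""


    def pressure_drop_parallel_plates(velocity, gap, length, rho=998.0, mu=1.0e-3):
        """Return (pressure drop in Pa, Reynolds number, friction factor)."""
        d_h = 2.0 * gap                       # hydraulic diameter of a narrow slot
        re = rho * velocity * d_h / mu
        if re < 2300.0:                       # crude laminar/turbulent switch
            f = 96.0 / re                     # exact laminar result for parallel plates
        else:
            f = 0.316 * re ** -0.25           # Blasius, circular-pipe approximation
        dp = f * (length / d_h) * 0.5 * rho * velocity ** 2
        return dp, re, f


    print(pressure_drop_parallel_plates(velocity=0.05, gap=2.5e-3, length=0.6))
    ```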

  7. Optimization-based scatter estimation using primary modulation for computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yi; Ma, Jingchen; Zhao, Jun, E-mail: junzhao@sjtu.edu.cn [School of Biomedical Engineering, Shanghai Jiao Tong University, Shanghai 200240 (China); Song, Ying [Department of Radiation Oncology, West China Hospital, Sichuan University, Chengdu 610041 (China)

    2016-08-15

    Purpose: Scatter reduces the image quality in computed tomography (CT), but scatter correction remains a challenge. A previously proposed primary modulation method simultaneously obtains the primary and scatter in a single scan. However, separating the scatter and primary in primary modulation is challenging because it is an underdetermined problem. In this study, an optimization-based scatter estimation (OSE) algorithm is proposed to estimate and correct scatter. Methods: In the concept of primary modulation, the primary is modulated, but the scatter remains smooth by inserting a modulator between the x-ray source and the object. In the proposed algorithm, an objective function is designed for separating the scatter and primary. Prior knowledge is incorporated in the optimization-based framework to improve the accuracy of the estimation: (1) the primary is always positive; (2) the primary is locally smooth and the scatter is smooth; (3) the location of penumbra can be determined; and (4) the scatter-contaminated data provide knowledge about which part is smooth. Results: The simulation study shows that the edge-preserving weighting in OSE improves the estimation accuracy near the object boundary. Simulation study also demonstrates that OSE outperforms the two existing primary modulation algorithms for most regions of interest in terms of the CT number accuracy and noise. The proposed method was tested on a clinical cone beam CT, demonstrating that OSE corrects the scatter even when the modulator is not accurately registered. Conclusions: The proposed OSE algorithm improves the robustness and accuracy in scatter estimation and correction. This method is promising for scatter correction of various kinds of x-ray imaging modalities, such as x-ray radiography, cone beam CT, and the fourth-generation CT.

  8. Reducing usage of the computational resources by event driven approach to model predictive control

    Science.gov (United States)

    Misik, Stefan; Bradac, Zdenek; Cela, Arben

    2017-08-01

    This paper deals with real-time and optimal control of dynamic systems while also considering the constraints to which these systems might be subject. The main objective of this work is to propose a simple modification of the existing Model Predictive Control approach to better suit the needs of computationally resource-constrained real-time systems. An example using a model of a mechanical system is presented and the performance of the proposed method is evaluated in a simulated environment.

  9. Areal rainfall estimation using moving cars - computer experiments including hydrological modeling

    Science.gov (United States)

    Rabiei, Ehsan; Haberlandt, Uwe; Sester, Monika; Fitzner, Daniel; Wallner, Markus

    2016-09-01

    The need for high temporal and spatial resolution precipitation data for hydrological analyses has been discussed in several studies. Although rain gauges provide valuable information, a very dense rain gauge network is costly. As a result, several new ideas have emerged to help estimate areal rainfall with higher temporal and spatial resolution. Rabiei et al. (2013) observed that moving cars, called RainCars (RCs), can potentially be a new source of data for measuring rain rate. The optical sensors used in that study are designed for operating the windscreen wipers and showed promising results for rainfall measurement purposes. Their measurement accuracy has been quantified in laboratory experiments. Considering those errors explicitly, the main objective of this study is to investigate the benefit of using RCs for estimating areal rainfall. For that, computer experiments are carried out, where radar rainfall is considered as the reference and the other sources of data, i.e., RCs and rain gauges, are extracted from the radar data. Comparing the quality of areal rainfall estimation by RCs with rain gauges and reference data helps to investigate the benefit of the RCs. The value of this additional source of data is assessed not only for areal rainfall estimation performance but also for use in hydrological modeling. Considering measurement errors derived from laboratory experiments, the results show that the RCs provide useful additional information for areal rainfall estimation as well as for hydrological modeling. Moreover, when larger uncertainties were tested for the RCs, they were observed to be useful up to a certain level for areal rainfall estimation and discharge simulation.

  10. Energy efficiency models and optimization algorithm to enhance on-demand resource delivery in a cloud computing environment / Thusoyaone Joseph Moemi

    OpenAIRE

    Moemi, Thusoyaone Joseph

    2013-01-01

    Online hosted services are what is referred to as Cloud Computing. Access to these services is via the internet. It shifts the traditional IT resource ownership model to renting. Thus, the high cost of infrastructure cannot limit the less privileged from experiencing the benefits that this new paradigm brings. Therefore, cloud computing provides flexible services to cloud users in the form of software, platform and infrastructure as services. The goal behind cloud computing is to provi...

  11. Automated mode shape estimation in agent-based wireless sensor networks

    Science.gov (United States)

    Zimmerman, Andrew T.; Lynch, Jerome P.

    2010-04-01

    Recent advances in wireless sensing technology have made it possible to deploy dense networks of sensing transducers within large structural systems. Because these networks leverage the embedded computing power and agent-based abilities integral to many wireless sensing devices, it is possible to analyze sensor data autonomously and in-network. In this study, market-based techniques are used to autonomously estimate mode shapes within a network of agent-based wireless sensors. Specifically, recent work in both decentralized Frequency Domain Decomposition and market-based resource allocation is leveraged to create a mode shape estimation algorithm derived from free-market principles. This algorithm allows an agent-based wireless sensor network to autonomously shift emphasis between improving mode shape accuracy and limiting the consumption of certain scarce network resources: processing time, storage capacity, and power consumption. The developed algorithm is validated by successfully estimating mode shapes using a network of wireless sensor prototypes deployed on the mezzanine balcony of Hill Auditorium, located on the University of Michigan campus.

  12. DEEBAR - A BASIC interactive computer programme for estimating mean resonance spacings

    International Nuclear Information System (INIS)

    Booth, M.; Pope, A.L.; Smith, R.W.; Story, J.S.

    1988-02-01

    DEEBAR is a BASIC interactive programme, which uses the theories of Dyson and of Dyson and Mehta, to compute estimates of the mean resonance spacings and associated uncertainty statistics from an input file of neutron resonance energies. In applying these theories the broad scale energy dependence of D-bar, as predicted by the ordinary theory of level densities, is taken into account. The mean spacing D-bar ± δD-bar, referred to zero energy of the incident neutrons, is computed from the energies of the first k resonances, for k = 2,3...K in turn and as if no resonances are missing. The user is asked to survey this set of D-bar and δD-bar values and to form a judgement - up to what value of k is the set of resonances complete and what value, in consequence, does the user adopt as the preferred value of D-bar? When the preferred values for k and D-bar have been input, the programme calculates revised values for the level density parameters, consistent with this value for D-bar and with other input information. Two short tables are printed, illustrating the energy variation and spin dependence of D-bar. Dyson's formula based on his Coulomb gas analogy is used for estimating the most likely energies of the topmost bound levels. Finally the quasi-crystalline character of a single level series is exploited by means of a table in which the resonance energies are set alongside an energy ladder whose rungs are regularly spaced with spacing D-bar(E); this comparative table expedites the search for gaps where resonances may have been missed experimentally. Used in conjunction with the program LJPROB, which calculates neutron strengths and compares them against the expected Porter Thomas distribution, estimates of the statistical parameters for use in the unresolved resonance region may be derived. (author)
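
    A much-simplified version of the "survey D-bar versus k" step is sketched below with an invented resonance ladder; DEEBAR itself uses the Dyson and Dyson-Mehta statistics, so the naive running estimator and its rough uncertainty here are only illustrative.

    ```python
    """Naive running estimate of the mean level spacing D-bar from ordered resonance
    energies (eV); illustrative only, not the Dyson/Dyson-Mehta estimator."""


    def running_mean_spacing(energies_ev):
        """Yield (k, D_bar, dD_bar) using the first k resonances, k = 2..K."""
        e = sorted(energies_ev)
        for k in range(2, len(e) + 1):
            d_bar = (e[k - 1] - e[0]) / (k - 1)
            d_err = d_bar / (k - 1) ** 0.5     # rough counting-type uncertainty
            yield k, d_bar, d_err


    resonances = [6.7, 21.0, 36.7, 66.0, 80.7, 102.5, 116.9, 139.0, 165.3, 189.7]
    for k, d, dd in running_mean_spacing(resonances):
        print(f"k={k:2d}  D-bar={d:6.2f} eV  +/- {dd:5.2f}")
    ```

    In the spirit of the record's interactive survey, one would inspect where the running values stabilise before adopting a preferred D-bar.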

  13. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  14. Comparison of computer models for estimating hydrology and water quality in an agricultural watershed

    Science.gov (United States)

    Various computer models, ranging from simple to complex, have been developed to simulate hydrology and water quality from field to watershed scales. However, many users are uncertain about which model to choose when estimating water quantity and quality conditions in a watershed. This study compared...

  15. A Bootstrap Approach to Computing Uncertainty in Inferred Oil and Gas Reserve Estimates

    International Nuclear Information System (INIS)

    Attanasi, Emil D.; Coburn, Timothy C.

    2004-01-01

    This study develops confidence intervals for estimates of inferred oil and gas reserves based on bootstrap procedures. Inferred reserves are expected additions to proved reserves in previously discovered conventional oil and gas fields. Estimates of inferred reserves accounted for 65% of the total oil and 34% of the total gas assessed in the U.S. Geological Survey's 1995 National Assessment of oil and gas in US onshore and State offshore areas. When the same computational methods used in the 1995 Assessment are applied to more recent data, the 80-year (from 1997 through 2076) inferred reserve estimates for pre-1997 discoveries located in the lower 48 onshore and state offshore areas amounted to a total of 39.7 billion barrels of oil (BBO) and 293 trillion cubic feet (TCF) of gas. The 90% confidence interval about the oil estimate derived from the bootstrap approach is 22.4 BBO to 69.5 BBO. The comparable 90% confidence interval for the inferred gas reserve estimate is 217 TCF to 413 TCF. The 90% confidence interval describes the uncertainty that should be attached to the estimates. It also provides a basis for developing scenarios to explore the implications for energy policy analysis
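
    The bootstrap interval idea can be illustrated generically: resample the field-level additions with replacement, total each resample, and take the 5th and 95th percentiles of the totals as a 90% interval. The synthetic sample below is not the assessment data.

    ```python
    """Generic bootstrap percentile interval for an aggregate estimate (synthetic data)."""
    import numpy as np

    rng = np.random.default_rng(0)
    field_additions = rng.lognormal(mean=1.0, sigma=1.2, size=400)   # synthetic sample

    n_boot = 5000
    totals = np.empty(n_boot)
    for i in range(n_boot):
        resample = rng.choice(field_additions, size=field_additions.size, replace=True)
        totals[i] = resample.sum()

    lo, hi = np.percentile(totals, [5, 95])
    print(f"point estimate {field_additions.sum():.0f}, "
          f"90% bootstrap interval [{lo:.0f}, {hi:.0f}]")
    ```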

  16. Tridimensional modelling and resource estimation of the mining waste piles of São Domingos mine, Iberian Pyrite Belt, Portugal

    Science.gov (United States)

    Vieira, Alexandre; Matos, João; Lopes, Luis; Martins, Ruben

    2016-04-01

    Located in the Iberian Pyrite Belt (IPB) northern sector, near the Portuguese/Spanish border, the outcropping São Domingos deposit was mined since Roman time. Between 1854 and 1966 the Mason & Barry Company developed open pit excavation until 120 m depth and underground mining until 420 m depth. The São Domingos subvertical deposit is associated with felsic volcanics and black shales of the IPB Volcano-Sedimentary Complex and is represented by massive sulphide and stockwork ore (py, cpy, sph, ga, tt, aspy) and related supergene enrichment ore (hematite gossan and covellite/chalcocite). Different mine waste classes were mapped around the old open pit: gossan (W1), felsic volcanic and shales (W2), shales (W3) and mining waste landfill (W4). Using the LNEG (Portuguese Geological Survey) CONASA database (company historical mining waste characterization based on 162 shafts and 160 reverse circulation boreholes), a methodology for tridimensional modelling mining waste pile was followed, and a new mining waste resource is presented. Considering some constraints to waste removal, such as the Mina de São Domingos village proximity of the wastes, the industrial and archaeological patrimony (e.g., mining infrastructures, roman galleries), different resource scenarios were considered: unconditioned resources (total estimates) and conditioned resources (only the volumes without removal constraints considered). Using block modelling (SURPAC software) a mineral inferred resource of 2.38 Mt @ 0.77 g/t Au and 8.26 g/t Ag is estimated in unconditioned volumes of waste. Considering all evaluated wastes, including village areas, an inferred resource of 4.0 Mt @ 0.64 g/t Au and 7.30 g/t Ag is presented, corresponding to a total metal content of 82,878 oz t Au and 955,753 oz t Ag. Keywords. São Domingos mine, mining waste resources, mining waste pile modelling, Iberian Pyrite Belt, Portugal
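
    A back-of-envelope version of the block-model roll-up reported above (tonnage from block volume and bulk density, tonnage-weighted grades, contained metal in troy ounces); the block sizes, densities and grades are invented, and the snippet is not the SURPAC workflow.

    ```python
    """Toy block-model roll-up: tonnage, weighted grades and contained metal."""

    GRAMS_PER_TROY_OZ = 31.1034768

    # (volume m3, bulk density t/m3, Au g/t, Ag g/t) per estimation block (invented)
    blocks = [
        (10 * 10 * 5, 2.9, 0.85, 9.0),
        (10 * 10 * 5, 2.7, 0.55, 6.5),
        (10 * 10 * 5, 3.1, 0.95, 8.1),
    ]

    tonnes = sum(v * d for v, d, *_ in blocks)
    au_g = sum(v * d * au for v, d, au, _ in blocks)
    ag_g = sum(v * d * ag for v, d, _, ag in blocks)

    print(f"{tonnes:,.0f} t @ {au_g / tonnes:.2f} g/t Au, {ag_g / tonnes:.2f} g/t Ag")
    print(f"contained metal: {au_g / GRAMS_PER_TROY_OZ:,.0f} oz t Au, "
          f"{ag_g / GRAMS_PER_TROY_OZ:,.0f} oz t Ag")
    ```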

  17. Decommissioning Cost Estimating - The "PRICE" Approach

    International Nuclear Information System (INIS)

    Manning, R.; Gilmour, J.

    2002-01-01

    Over the past 9 years UKAEA has developed a formalized approach to decommissioning cost estimating. The estimating methodology and computer-based application are known collectively as the PRICE system. At the heart of the system is a database (the knowledge base) which holds resource demand data on a comprehensive range of decommissioning activities. This data is used in conjunction with project specific information (the quantities of specific components) to produce decommissioning cost estimates. PRICE is a dynamic cost-estimating tool, which can satisfy both strategic planning and project management needs. With a relatively limited analysis a basic PRICE estimate can be produced and used for the purposes of strategic planning. This same estimate can be enhanced and improved, primarily by the improvement of detail, to support sanction expenditure proposals, and also as a tender assessment and project management tool. The paper will: describe the principles of the PRICE estimating system; report on the experiences of applying the system to a wide range of projects from contaminated car parks to nuclear reactors; provide information on the performance of the system in relation to historic estimates, tender bids, and outturn costs

  18. The application of LANDSAT remote sensing technology to natural resources management. Section 1: Introduction to VICAR - Image classification module. Section 2: Forest resource assessment of Humboldt County.

    Science.gov (United States)

    Fox, L., III (Principal Investigator); Mayer, K. E.

    1980-01-01

    A teaching module on image classification procedures using the VICAR computer software package was developed to optimize the training benefits for users of the VICAR programs. The field test of the module is discussed. An intensive forest land inventory strategy was developed for Humboldt County. The results indicate that LANDSAT data can be computer classified to yield site specific forest resource information with high accuracy (82%). The "Douglas-fir 80%" category was found to cover approximately 21% of the county and "Mixed Conifer 80%" covering about 13%. The "Redwood 80%" resource category, which represented dense old growth trees as well as large second growth, comprised 4.0% of the total vegetation mosaic. Furthermore, the "Brush" and "Brush-Regeneration" categories were found to be a significant part of the vegetative community, with area estimates of 9.4 and 10.0%.

  19. Uranium resources evaluation model as an exploration tool

    International Nuclear Information System (INIS)

    Ruzicka, V.

    1976-01-01

    Evaluation of uranium resources, as conducted by the Uranium Resources Evaluation Section of the Geological Survey of Canada, comprises operations analogous with those performed during the preparatory stages of uranium exploration. The uranium resources evaluation model, simulating the estimation process, can be divided into four steps. The first step includes definition of major areas and ''unit subdivisions'' for which geological data are gathered, coded, computerized and retrieved. Selection of these areas and ''unit subdivisions'' is based on a preliminary appraisal of their favourability for uranium mineralization. The second step includes analyses of the data, definition of factors controlling uranium mineralization, classification of uranium occurrences into genetic types, and final delineation of favourable areas; this step corresponds to the selection of targets for uranium exploration. The third step includes geological field work; it is equivalent to geological reconnaissance in exploration. The fourth step comprises computation of resources; the preliminary evaluation techniques in the exploration are, as a rule, analogous with the simplest methods employed in the resource evaluation. The uranium resources evaluation model can be conceptually applied for decision-making during exploration or for formulation of exploration strategy using the quantified data as weighting factors. (author)

  20. Error Estimation and Uncertainty Propagation in Computational Fluid Mechanics

    Science.gov (United States)

    Zhu, J. Z.; He, Guowei; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    Numerical simulation has now become an integral part of engineering design process. Critical design decisions are routinely made based on the simulation results and conclusions. Verification and validation of the reliability of the numerical simulation is therefore vitally important in the engineering design processes. We propose to develop theories and methodologies that can automatically provide quantitative information about the reliability of the numerical simulation by estimating numerical approximation error, computational model induced errors and the uncertainties contained in the mathematical models so that the reliability of the numerical simulation can be verified and validated. We also propose to develop and implement methodologies and techniques that can control the error and uncertainty during the numerical simulation so that the reliability of the numerical simulation can be improved.

  1. Piping data bank and erection system of Angra 2: structure, computational resources and systems

    International Nuclear Information System (INIS)

    Abud, P.R.; Court, E.G.; Rosette, A.C.

    1992-01-01

    The Piping Data Bank of Angra 2, called the Erection Management System, was developed to manage the piping erection of the Angra 2 Nuclear Power Plant. Beyond the erection follow-up of piping and supports, it manages the piping design, material procurement, the flow of fabrication documents, weld testing and material stocks at the warehouse. The work carried out to define the structure of the data bank, the computational resources and the systems is described here. (author)

  2. Application of computer graphics to generate coal resources of the Cache coal bed, Recluse geologic model area, Campbell County, Wyoming

    Science.gov (United States)

    Schneider, G.B.; Crowley, S.S.; Carey, M.A.

    1982-01-01

    Low-sulfur subbituminous coal resources have been calculated, using both manual and computer methods, for the Cache coal bed in the Recluse Model Area, which covers the White Tail Butte, Pitch Draw, Recluse, and Homestead Draw SW 7 1/2 minute quadrangles, Campbell County, Wyoming. Approximately 275 coal thickness measurements obtained from drill hole data are evenly distributed throughout the area. The Cache coal and associated beds are in the Paleocene Tongue River Member of the Fort Union Formation. The depth from the surface to the Cache bed ranges from 269 to 1,257 feet. The thickness of the coal is as much as 31 feet, but in places the Cache coal bed is absent. Comparisons between hand-drawn and computer-generated isopach maps show minimal differences. Total coal resources calculated by computer show the bed to contain 2,316 million short tons or about 6.7 percent more than the hand-calculated figure of 2,160 million short tons.
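
    The computer calculation amounts to summing thickness times cell area times a tonnage conversion over the interpolated isopach grid; the sketch below assumes a nominal subbituminous factor of about 1,770 short tons per acre-foot and an invented miniature thickness grid, not the Cache bed data.

    ```python
    """Grid-cell tonnage sum behind an isopach-based coal resource estimate."""
    import numpy as np

    TONS_PER_ACRE_FOOT = 1770.0   # nominal subbituminous conversion factor (assumed)
    CELL_AREA_ACRES = 40.0        # e.g. a quarter-quarter-section grid cell

    # Interpolated thickness grid in feet (0 where the bed is absent); invented values
    thickness_ft = np.array([
        [0.0,  8.5, 14.0, 17.5],
        [6.0, 12.0, 21.0, 25.5],
        [9.5, 18.0, 26.0, 31.0],
    ])

    tons = float(np.sum(thickness_ft) * CELL_AREA_ACRES * TONS_PER_ACRE_FOOT)
    print(f"resource over the grid: {tons / 1e6:.1f} million short tons")
    ```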

  3. Computer programs for capital cost estimation, lifetime economic performance simulation, and computation of cost indexes for laser fusion and other advanced technology facilities

    International Nuclear Information System (INIS)

    Pendergrass, J.H.

    1978-01-01

    Three FORTRAN programs, CAPITAL, VENTURE, and INDEXER, have been developed to automate computations used in assessing the economic viability of proposed or conceptual laser fusion and other advanced-technology facilities, as well as conventional projects. The types of calculations performed by these programs are, respectively, capital cost estimation, lifetime economic performance simulation, and computation of cost indexes. The codes permit these three topics to be addressed with considerable sophistication commensurate with user requirements and available data

  4. High-altitude wind resources in the Middle East

    KAUST Repository

    Yip, Chak Man Andrew; Gunturu, Udaya; Stenchikov, Georgiy L.

    2017-01-01

    In the Middle East, near-surface wind resources are intermittent. However, high-altitude wind resources are abundant, persistent, and readily available and may provide alternative energy resources in this fossil-fuel-dependent region. Using wind field data from the Modern-Era Retrospective Analysis for Research and Applications Version 2 (MERRA-2), this study identifies areas favorable to the deployment of airborne wind energy (AWE) systems in the Middle East and computes the optimal heights at which such systems would best operate. AWE potential is estimated using realistic AWE system specifications and assumptions about deployment scenarios and is compared with the near-surface wind generation potential with respect to diurnal and seasonal variability. The results show the potential utility of AWE in areas in the Middle East where the energy demand is high. In particular, Oman and Saudi Arabia have a high level of the potential power generation with low annual variability.

  5. High-altitude wind resources in the Middle East

    KAUST Repository

    Yip, Chak Man Andrew

    2017-08-23

    In the Middle East, near-surface wind resources are intermittent. However, high-altitude wind resources are abundant, persistent, and readily available and may provide alternative energy resources in this fossil-fuel-dependent region. Using wind field data from the Modern-Era Retrospective Analysis for Research and Applications Version 2 (MERRA-2), this study identifies areas favorable to the deployment of airborne wind energy (AWE) systems in the Middle East and computes the optimal heights at which such systems would best operate. AWE potential is estimated using realistic AWE system specifications and assumptions about deployment scenarios and is compared with the near-surface wind generation potential with respect to diurnal and seasonal variability. The results show the potential utility of AWE in areas in the Middle East where the energy demand is high. In particular, Oman and Saudi Arabia have a high level of the potential power generation with low annual variability.

  6. A Resource Service Model in the Industrial IoT System Based on Transparent Computing.

    Science.gov (United States)

    Li, Weimin; Wang, Bin; Sheng, Jinfang; Dong, Ke; Li, Zitong; Hu, Yixiang

    2018-03-26

    The Internet of Things (IoT) has received a lot of attention, especially in industrial scenarios. One of the typical applications is the intelligent mine, which actually constructs the Six-Hedge underground systems with IoT platforms. Based on a case study of the Six Systems in the underground metal mine, this paper summarizes the main challenges of industrial IoT from the aspects of heterogeneity in devices and resources, security, reliability, deployment and maintenance costs. Then, a novel resource service model for the industrial IoT applications based on Transparent Computing (TC) is presented, which supports centralized management of all resources including operating system (OS), programs and data on the server-side for the IoT devices, thus offering an effective, reliable, secure and cross-OS IoT service and reducing the costs of IoT system deployment and maintenance. The model has five layers: sensing layer, aggregation layer, network layer, service and storage layer and interface and management layer. We also present a detailed analysis on the system architecture and key technologies of the model. Finally, the efficiency of the model is shown by an experiment prototype system.

  7. Computing challenges in HEP for WLHC grid

    CERN Document Server

    Muralidharan, Servesh

    2017-01-01

    As CERN moves towards preparation for increasing the luminosity of the particle beam towards HL-LHC, predictions show that computing demand would outgrow our conservative scaling estimates by over ten times. Fortunately we are talking about a time scale of roughly ten years to develop new techniques and novel solutions to address this gap in compute resources. Experiments at CERN face a unique scenario wherein they need to scale both latency-sensitive workloads, such as data acquisition of the detectors, and throughput-based ones, such as simulations and reconstruction of high-level events and physics processes. In this talk we cover some of the ongoing research at Tier-0 at CERN which investigates several aspects of throughput-sensitive workloads that consume significant compute cycles.

  8. Development and test validation of a computational scheme for high-fidelity fluence estimations of the Swiss BWRs

    International Nuclear Information System (INIS)

    Vasiliev, A.; Wieselquist, W.; Ferroukhi, H.; Canepa, S.; Heldt, J.; Ledergerber, G.

    2011-01-01

    One of the current objectives within reactor analysis related projects at the Paul Scherrer Institut is the establishment of a comprehensive computational methodology for fast neutron fluence (FNF) estimations of reactor pressure vessels (RPV) and internals for both PWRs and BWRs. In the recent past, such an integral calculational methodology based on the CASMO-4/SIMULATE-3/MCNPX system of codes was developed for PWRs and validated against RPV scraping tests. Based on the very satisfactory validation results, the methodology was recently applied for predictive FNF evaluations of a Swiss PWR to support the national nuclear safety inspectorate in the framework of life-time estimations. Today, focus at PSI is given to developing a corresponding advanced methodology for high-fidelity FNF estimations of BWR reactors. In this paper, the preliminary steps undertaken in that direction are presented. To start, the concepts of the PWR computational scheme and its transfer/adaptation to BWR are outlined. Then, the modelling of a Swiss BWR characterized by very heterogeneous core designs is presented along with preliminary sensitivity studies carried out to assess the sufficient level of detail required for the complex core region. Finally, a first validation test case is presented on the basis of two dosimeter monitors irradiated during two recent cycles of the given BWR reactor. The achieved computational results show a satisfactory agreement with measured dosimeter data and illustrate thereby the feasibility of applying the PSI FNF computational scheme also for BWRs. Further sensitivity/optimization studies are nevertheless necessary in order to consolidate the scheme and to continuously increase the fidelity and reliability of the BWR FNF estimations. (author)

  9. Estimating indigenous resources for fuel-wood and poles and plantation requirements in the tribal trust lands of Zimbabwe Rhodesia

    Energy Technology Data Exchange (ETDEWEB)

    Furness, C K

    1981-01-01

    The difficulties encountered in planning for the conservation of indigenous timber resources and in estimating the timber consumption in tribal trust land are outlined in this paper. An estimate of these resources and of the consumption of timber, together with an estimate of exotic plantations required to make up any shortfall of timber, is given. Some 66,000 ha of eucalypts are currently required in the tribal trust lands, where planting has thus far provided only 3800 ha. The types of plantations established and the species used are mentioned. The rural population has, generally speaking, shown only limited enthusiasm for growing exotics, one of the reasons being the traditional use of indigenous timber which is still available in most areas without cost, and the preference for indigenous timber compared to eucalypts. The need for more reliable data for future planning is emphasized. Substitutes for fuel-wood are discussed and the need to reserve areas of indigenous timber in tribal trust land for the protection of soil and water and for fuel-wood are proposed. (Refs. 1).

  10. Flexible resources for quantum metrology

    Science.gov (United States)

    Friis, Nicolai; Orsucci, Davide; Skotiniotis, Michalis; Sekatski, Pavel; Dunjko, Vedran; Briegel, Hans J.; Dür, Wolfgang

    2017-06-01

    Quantum metrology offers a quadratic advantage over classical approaches to parameter estimation problems by utilising entanglement and nonclassicality. However, the hurdle of actually implementing the necessary quantum probe states and measurements, which vary drastically for different metrological scenarios, is usually not taken into account. We show that for a wide range of tasks in metrology, 2D cluster states (a particular family of states useful for measurement-based quantum computation) can serve as flexible resources that allow one to efficiently prepare any required state for sensing, and perform appropriate (entangled) measurements using only single qubit operations. Crucially, the overhead in the number of qubits is less than quadratic, thus preserving the quantum scaling advantage. This is ensured by using a compression to a logarithmically sized space that contains all relevant information for sensing. We specifically demonstrate how our method can be used to obtain optimal scaling for phase and frequency estimation in local estimation problems, as well as for the Bayesian equivalents with Gaussian priors of varying widths. Furthermore, we show that in the paradigmatic case of local phase estimation 1D cluster states are sufficient for optimal state preparation and measurement.

  11. Flexible resources for quantum metrology

    International Nuclear Information System (INIS)

    Friis, Nicolai; Orsucci, Davide; Skotiniotis, Michalis; Sekatski, Pavel; Dunjko, Vedran; Briegel, Hans J; Dür, Wolfgang

    2017-01-01

    Quantum metrology offers a quadratic advantage over classical approaches to parameter estimation problems by utilising entanglement and nonclassicality. However, the hurdle of actually implementing the necessary quantum probe states and measurements, which vary drastically for different metrological scenarios, is usually not taken into account. We show that for a wide range of tasks in metrology, 2D cluster states (a particular family of states useful for measurement-based quantum computation) can serve as flexible resources that allow one to efficiently prepare any required state for sensing, and perform appropriate (entangled) measurements using only single qubit operations. Crucially, the overhead in the number of qubits is less than quadratic, thus preserving the quantum scaling advantage. This is ensured by using a compression to a logarithmically sized space that contains all relevant information for sensing. We specifically demonstrate how our method can be used to obtain optimal scaling for phase and frequency estimation in local estimation problems, as well as for the Bayesian equivalents with Gaussian priors of varying widths. Furthermore, we show that in the paradigmatic case of local phase estimation 1D cluster states are sufficient for optimal state preparation and measurement. (paper)

  12. Computational Benchmark for Estimation of Reactivity Margin from Fission Products and Minor Actinides in PWR Burnup Credit

    International Nuclear Information System (INIS)

    Wagner, J.C.

    2001-01-01

    This report proposes and documents a computational benchmark problem for the estimation of the additional reactivity margin available in spent nuclear fuel (SNF) from fission products and minor actinides in a burnup-credit storage/transport environment, relative to SNF compositions containing only the major actinides. The benchmark problem/configuration is a generic burnup credit cask designed to hold 32 pressurized water reactor (PWR) assemblies. The purpose of this computational benchmark is to provide a reference configuration for the estimation of the additional reactivity margin, which is encouraged in the U.S. Nuclear Regulatory Commission (NRC) guidance for partial burnup credit (ISG8), and document reference estimations of the additional reactivity margin as a function of initial enrichment, burnup, and cooling time. Consequently, the geometry and material specifications are provided in sufficient detail to enable independent evaluations. Estimates of additional reactivity margin for this reference configuration may be compared to those of similar burnup-credit casks to provide an indication of the validity of design-specific estimates of fission-product margin. The reference solutions were generated with the SAS2H-depletion and CSAS25-criticality sequences of the SCALE 4.4a package. Although the SAS2H and CSAS25 sequences have been extensively validated elsewhere, the reference solutions are not directly or indirectly based on experimental results. Consequently, this computational benchmark cannot be used to satisfy the ANS 8.1 requirements for validation of calculational methods and is not intended to be used to establish biases for burnup credit analyses

  13. Unbiased estimation of the eyeball volume using the Cavalieri principle on computed tomography images.

    Science.gov (United States)

    Acer, Niyazi; Sahin, Bunyamin; Ucar, Tolga; Usanmaz, Mustafa

    2009-01-01

    The size of the eyeball has been the subject of a few studies. None of them used stereological methods to estimate the volume. In the current study, we estimated the volume of the eyeball in normal men and women using stereological methods. Eyeball volume (EV) was estimated using the Cavalieri principle as a combination of point-counting and planimetry techniques. We used computed tomography scans taken from 36 participants (15 men and 21 women) to estimate the EV. The mean (SD) EV values obtained by the planimetry method were 7.49 (0.79) and 7.06 (0.85) cm3 in men and women, respectively. By using the point-counting method, the mean (SD) values were 7.48 (0.85) and 7.21 (0.84) cm3 in men and women, respectively. There was no statistically significant difference between the findings from the 2 methods (P > 0.05). A weak correlation was found between the axial length of the eyeball and the EV estimated by point counting and planimetry (P eyeball.
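
    The Cavalieri point-counting estimator referred to in this record reduces to V = t * (a/p) * sum(P_i). A minimal sketch follows; the section spacing, grid spacing, and point counts are illustrative assumptions, not the study's measurements.

```python
# Cavalieri point-counting estimator: V = t * (a/p) * sum(P_i).
section_spacing_cm = 0.3                      # t: spacing between consecutive CT sections
grid_spacing_cm = 0.2                         # spacing of the counting grid on each image
area_per_point_cm2 = grid_spacing_cm ** 2     # a/p for a square point grid

# P_i: number of grid points hitting the eyeball on each section (illustrative counts).
points_per_section = [20, 60, 95, 110, 105, 90, 60, 25]

volume_cm3 = section_spacing_cm * area_per_point_cm2 * sum(points_per_section)
print(f"Estimated eyeball volume: {volume_cm3:.2f} cm3")
```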

  14. Project Final Report: Ubiquitous Computing and Monitoring System (UCoMS) for Discovery and Management of Energy Resources

    Energy Technology Data Exchange (ETDEWEB)

    Tzeng, Nian-Feng; White, Christopher D.; Moreman, Douglas

    2012-07-14

    The UCoMS research cluster has spearheaded three research areas since August 2004, including wireless and sensor networks, Grid computing, and petroleum applications. The primary goals of UCoMS research are three-fold: (1) creating new knowledge to push forward the technology forefronts on pertinent research on the computing and monitoring aspects of energy resource management, (2) developing and disseminating software codes and toolkits for the research community and the public, and (3) establishing system prototypes and testbeds for evaluating innovative techniques and methods. Substantial progress and diverse accomplishments have been made by research investigators, working cooperatively in their respective areas of expertise, on such topics as sensors and sensor networks, wireless communication and systems, and computational Grids, particularly as relevant to petroleum applications.

  15. Computing ordinary least-squares parameter estimates for the National Descriptive Model of Mercury in Fish

    Science.gov (United States)

    Donato, David I.

    2013-01-01

    A specialized technique is used to compute weighted ordinary least-squares (OLS) estimates of the parameters of the National Descriptive Model of Mercury in Fish (NDMMF) in less time using less computer memory than general methods. The characteristics of the NDMMF allow the two products X'X and X'y in the normal equations to be filled out in a second or two of computer time during a single pass through the N data observations. As a result, the matrix X does not have to be stored in computer memory and the computationally expensive matrix multiplications generally required to produce X'X and X'y do not have to be carried out. The normal equations may then be solved to determine the best-fit parameters in the OLS sense. The computational solution based on this specialized technique requires O(8p^2 + 16p) bytes of computer memory for p parameters on a machine with 8-byte double-precision numbers. This publication includes a reference implementation of this technique and a Gaussian-elimination solver in preliminary custom software.
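
    The single-pass idea described above can be sketched as follows: X'X and X'y are accumulated row by row so the design matrix X is never stored. This is only an illustrative sketch (weights, if any, would be folded into each row and response before accumulation), not the reference implementation mentioned in the record.

```python
import numpy as np

def accumulate_normal_equations(row_iter, p):
    """Fill X'X and X'y in a single pass over the observations, so the full
    design matrix X never has to be held in memory."""
    xtx = np.zeros((p, p))
    xty = np.zeros(p)
    for x_row, y in row_iter:
        x_row = np.asarray(x_row, dtype=float)
        xtx += np.outer(x_row, x_row)   # rank-one update of X'X
        xty += x_row * y                # update of X'y
    return xtx, xty

# Illustrative use on synthetic data (not the NDMMF data set).
rng = np.random.default_rng(0)
p = 3
rows = [(np.r_[1.0, rng.normal(size=p - 1)], rng.normal()) for _ in range(1000)]
xtx, xty = accumulate_normal_equations(rows, p)
beta = np.linalg.solve(xtx, xty)   # stands in for the Gaussian-elimination solver
print(beta)
```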

  16. A SAS-macro for estimation of the cumulative incidence using Poisson regression

    DEFF Research Database (Denmark)

    Waltoft, Berit Lindum

    2009-01-01

    the hazard rates, and the hazard rates are often estimated by the Cox regression. This procedure may not be suitable for large studies due to limited computer resources. Instead one uses Poisson regression, which approximates the Cox regression. Rosthøj et al. presented a SAS-macro for the estimation...... of the cumulative incidences based on the Cox regression. I present the functional form of the probabilities and variances when using piecewise constant hazard rates and a SAS-macro for the estimation using Poisson regression. The use of the macro is demonstrated through examples and compared to the macro presented...
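
    To make the piecewise-constant-hazard idea concrete, the sketch below computes cumulative incidence functions from cause-specific hazards that are constant on each interval; it is a generic illustration, not the SAS macro described in the record, and the hazard values are invented.

```python
import numpy as np

def cumulative_incidence(breaks, cause_hazards):
    """Cumulative incidence functions from piecewise constant cause-specific hazards.
    `breaks` are interval endpoints [t1, ..., tm] measured from 0; `cause_hazards`
    has shape (n_causes, m). Returns the CIF of each cause at every break point."""
    cause_hazards = np.asarray(cause_hazards, dtype=float)
    lengths = np.diff(np.r_[0.0, np.asarray(breaks, dtype=float)])
    total = cause_hazards.sum(axis=0)                                   # total hazard per interval
    surv_start = np.exp(-np.r_[0.0, np.cumsum(total * lengths)])[:-1]   # S(t) at interval starts
    # Integral of S(u) over each interval, with S piecewise exponential on the interval.
    with np.errstate(divide="ignore", invalid="ignore"):
        integral = np.where(total > 0,
                            surv_start * (1.0 - np.exp(-total * lengths)) / total,
                            surv_start * lengths)
    increments = cause_hazards * integral                               # shape (n_causes, m)
    return np.cumsum(increments, axis=1)

# Two competing causes, yearly intervals, illustrative hazard values.
print(cumulative_incidence([1, 2, 3], [[0.02, 0.03, 0.04], [0.01, 0.01, 0.02]]))
```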

  17. How to stop e-mail spam, spyware, malware, computer viruses, and hackers from ruining your computer or network the complete guide for your home and work

    CERN Document Server

    Brown, Bruce

    2010-01-01

    It seems like everywhere you go on the Internet, there is spam, spyware, and the risk of viruses infecting your computer and ruining your online experience. In businesses alone, according to Nucleus Research Inc., spam costs more than $712 per employee each year in productivity and computing resources, and the estimate of money lost by businesses due to computer viruses ranges between $100 million and $2 billion annually, depending on how the total is calculated. This complete, revolutionary book has compiled all of the vital information you need to make sure that you are able to combat the risk

  18. Computing the Deflection of the Vertical for Improving Aerial Surveys: A Comparison between EGM2008 and ITALGEO05 Estimates

    Directory of Open Access Journals (Sweden)

    Riccardo Barzaghi

    2016-07-01

    Full Text Available Recent studies on the influence of the anomalous gravity field in GNSS/INS applications have shown that neglecting the impact of the deflection of vertical in aerial surveys induces horizontal and vertical errors in the measurement of an object that is part of the observed scene; these errors can vary from a few tens of centimetres to over one meter. The works reported in the literature refer to vertical deflection values based on global geopotential model estimates. In this paper we compared this approach with the one based on local gravity data and collocation methods. In particular, denoted by ξ and η, the two mutually-perpendicular components of the deflection of the vertical vector (in the north and east directions, respectively), their values were computed by collocation in the framework of the Remove-Compute-Restore technique, applied to the gravity database used for estimating the ITALGEO05 geoid. Following this approach, these values have been computed at different altitudes that are relevant in aerial surveys. The (ξ, η) values were then also estimated using the high degree EGM2008 global geopotential model and compared with those obtained in the previous computation. The analysis of the differences between the two estimates has shown that the (ξ, η) global geopotential model estimate can be reliably used in aerial navigation applications that require the use of sensors connected to a GNSS/INS system only above a given height (e.g., 3000 m in this paper) that must be defined by simulations.

  19. Solar resources estimation combining digital terrain models and satellite images techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bosch, J.L.; Batlles, F.J. [Universidad de Almeria, Departamento de Fisica Aplicada, Ctra. Sacramento s/n, 04120-Almeria (Spain); Zarzalejo, L.F. [CIEMAT, Departamento de Energia, Madrid (Spain); Lopez, G. [EPS-Universidad de Huelva, Departamento de Ingenieria Electrica y Termica, Huelva (Spain)

    2010-12-15

    One of the most important steps to make use of any renewable energy is to perform an accurate estimation of the resource that has to be exploited. In the designing process of both active and passive solar energy systems, radiation data is required for the site, with proper spatial resolution. Generally, a radiometric stations network is used in this evaluation, but when they are too dispersed or not available for the study area, satellite images can be utilized as indirect solar radiation measurements. Although satellite images cover wide areas with a good acquisition frequency they usually have a poor spatial resolution limited by the size of the image pixel, and irradiation must be interpolated to evaluate solar irradiation at a sub-pixel scale. When pixels are located in flat and homogeneous areas, correlation of solar irradiation is relatively high, and classic interpolation can provide a good estimation. However, in complex topography zones, data interpolation is not adequate and the use of Digital Terrain Model (DTM) information can be helpful. In this work, daily solar irradiation is estimated for a wide mountainous area using a combination of Meteosat satellite images and a DTM, with the advantage of avoiding the necessity of ground measurements. This methodology utilizes a modified Heliosat-2 model, and applies for all sky conditions; it also introduces a horizon calculation of the DTM points and accounts for the effect of snow covers. Model performance has been evaluated against data measured in 12 radiometric stations, with results in terms of the Root Mean Square Error (RMSE) of 10%, and a Mean Bias Error (MBE) of +2%, both expressed as a percentage of the mean value measured. (author)
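
    The error measures quoted above (RMSE and MBE expressed as percentages of the mean measured value) can be computed as in the following sketch; the daily irradiation values are made-up placeholders, not the study's station data.

```python
import numpy as np

def rmse_mbe_percent(measured, estimated):
    """Root Mean Square Error and Mean Bias Error as a percentage of the mean measured value."""
    measured = np.asarray(measured, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    diff = estimated - measured
    rmse = np.sqrt(np.mean(diff ** 2))
    mbe = np.mean(diff)
    mean_obs = np.mean(measured)
    return 100.0 * rmse / mean_obs, 100.0 * mbe / mean_obs

# Illustrative daily irradiation values (kWh/m^2).
rmse_pct, mbe_pct = rmse_mbe_percent([5.1, 6.3, 4.8, 7.0], [5.5, 6.0, 5.2, 7.3])
print(f"RMSE = {rmse_pct:.1f} %, MBE = {mbe_pct:+.1f} %")
```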

  20. AN ESTIMATION OF HISTORICAL-CULTURAL RESOURCES OF THE TURKIVSKOGO DISTRICT FOR THE NECESSITIES OF ETHNIC TOURISM.

    OpenAIRE

    Безручко, Л.С.

    2016-01-01

    In the article, the features of the estimation of historical-cultural resources for the needs of ethnic tourism are considered. The list of objects that can be used as resources in ethnic tourism is distinguished. In particular, the objects of Jewish heritage (a synagogue, Jewish burial places) and material objects that remained from the German colonists (two churches) are studied, and the material and non-material culture of the Boyko ethnos (churches, buildings, traditions, museums) is also studied. The compres...

  1. Prediction of resource volumes at untested locations using simple local prediction models

    Science.gov (United States)

    Attanasi, E.D.; Coburn, T.C.; Freeman, P.A.

    2006-01-01

    This paper shows how local spatial nonparametric prediction models can be applied to estimate volumes of recoverable gas resources at individual undrilled sites, at multiple sites on a regional scale, and to compute confidence bounds for regional volumes based on the distribution of those estimates. An approach that combines cross-validation, the jackknife, and bootstrap procedures is used to accomplish this task. Simulation experiments show that cross-validation can be applied beneficially to select an appropriate prediction model. The cross-validation procedure worked well for a wide range of different states of nature and levels of information. Jackknife procedures are used to compute individual prediction estimation errors at undrilled locations. The jackknife replicates also are used with a bootstrap resampling procedure to compute confidence bounds for the total volume. The method was applied to data (partitioned into a training set and target set) from the Devonian Antrim Shale continuous-type gas play in the Michigan Basin in Otsego County, Michigan. The analysis showed that the model estimate of total recoverable volumes at prediction sites is within 4 percent of the total observed volume. The model predictions also provide frequency distributions of the cell volumes at the production unit scale. Such distributions are the basis for subsequent economic analyses. ?? Springer Science+Business Media, LLC 2007.
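
    A minimal sketch of the jackknife-plus-bootstrap idea follows: leave-one-out prediction errors are computed on the drilled (training) sites, and bootstrap resamples of those errors are added to the point predictions at the undrilled target sites to obtain bounds on the total volume. The local predictor, the data, and the bound level are illustrative assumptions, not the paper's model or the Antrim Shale data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-ins for drilled (training) and undrilled (target) locations.
train_xy = rng.uniform(0, 10, size=(60, 2))
train_vol = 50 + 5 * train_xy[:, 0] + rng.normal(0, 8, size=60)
target_xy = rng.uniform(0, 10, size=(40, 2))

def local_predict(xy, ref_xy, ref_vol, k=5):
    """A nearest-neighbour mean stands in for the 'simple local prediction model'."""
    d = np.linalg.norm(ref_xy - xy, axis=1)
    return ref_vol[np.argsort(d)[:k]].mean()

# Jackknife (leave-one-out) prediction errors on the training set.
jack_err = np.array([
    train_vol[i] - local_predict(train_xy[i], np.delete(train_xy, i, 0),
                                 np.delete(train_vol, i), k=5)
    for i in range(len(train_vol))
])

# Point predictions at undrilled sites, plus bootstrap bounds on the regional total.
point_pred = np.array([local_predict(xy, train_xy, train_vol) for xy in target_xy])
totals = [
    (point_pred + rng.choice(jack_err, size=len(point_pred), replace=True)).sum()
    for _ in range(2000)
]
lo, hi = np.percentile(totals, [5, 95])
print(f"Total predicted volume: {point_pred.sum():.0f}  (90% bounds: {lo:.0f} - {hi:.0f})")
```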

  2. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  3. The Development of an Individualized Instructional Program in Beginning College Mathematics Utilizing Computer Based Resource Units. Final Report.

    Science.gov (United States)

    Rockhill, Theron D.

    Reported is an attempt to develop and evaluate an individualized instructional program in pre-calculus college mathematics. Four computer-based resource units were developed in the areas of set theory, relations and functions, algebra, trigonometry, and analytic geometry. Objectives were determined by experienced calculus teachers, and…

  4. AIRDOS-II computer code for estimating radiation dose to man from airborne radionuclides in areas surrounding nuclear facilities

    International Nuclear Information System (INIS)

    Moore, R.E.

    1977-04-01

    The AIRDOS-II computer code estimates individual and population doses resulting from the simultaneous atmospheric release of as many as 36 radionuclides from a nuclear facility. This report describes the meteorological and environmental models used in the code, their computer implementation, and the applicability of the code to assessments of radiological impact. Atmospheric dispersion and surface deposition of released radionuclides are estimated as a function of direction and distance from a nuclear power plant or fuel-cycle facility, and doses to man through inhalation, air immersion, exposure to contaminated ground, food ingestion, and water immersion are estimated in the surrounding area. Annual doses are estimated for total body, GI tract, bone, thyroid, lungs, muscle, kidneys, liver, spleen, testes, and ovaries. Either the annual population doses (man-rems/year) or the highest annual individual doses in the assessment area (rems/year), whichever are applicable, are summarized in output tables in several ways--by nuclides, modes of exposure, and organs. The location of the highest individual doses for each reference organ estimated for the area is specified in the output data

  5. Estimation of kinetic and thermodynamic ligand-binding parameters using computational strategies.

    Science.gov (United States)

    Deganutti, Giuseppe; Moro, Stefano

    2017-04-01

    Kinetic and thermodynamic ligand-protein binding parameters are gaining growing importance as key information to consider in drug discovery. The determination of molecular structures, in particular using x-ray and NMR techniques, is crucial for understanding how a ligand recognizes its target in the final binding complex. However, for a better understanding of the recognition processes, experimental studies of ligand-protein interactions are needed. Even though several techniques can be used to investigate both thermodynamic and kinetic profiles for a ligand-protein complex, these procedures are very often laborious, time consuming and expensive. In the last 10 years, computational approaches have shown enormous potential in providing insights into each of the above effects and in parsing their contributions to the changes in both kinetic and thermodynamic binding parameters. The main purpose of this review is to summarize the state of the art of computational strategies for estimating the kinetic and thermodynamic parameters of ligand-protein binding.

  6. Estimation of the state-of-the-art and possibilities for development of the geothermal resource in the Republic of Macedonia

    International Nuclear Information System (INIS)

    Popovski, Kiril

    1995-01-01

    Based on the present know-how in Macedonia and the world, an attempt is made to analyse and estimate the influencing factors defining the situation and justifiability of development of the geothermal energy resource in Macedonia, as follows: 1) Nature and location of the energy resource; 2) 'Know-how' at disposal; 3) Application technologies at disposal; 4) Industrial production of equipment and materials at disposal; 5) Possible market for the energy resource; 6) Financial competitiveness; 7) Environment protection; 8) Regional aspects of possible development; 9) Barriers to development; 10) Necessary measures to enable development. (Original)

  7. A thermal analysis computer programme package for the estimation of KANUPP coolant channel flows and outlet header temperature distribution

    International Nuclear Information System (INIS)

    Siddiqui, M.S.

    1992-06-01

    COFTAN is a computer code for the actual estimation of flows and temperatures in the coolant channels of a pressure-tube heavy water reactor. The code is being used for a CANDU-type reactor with coolant flowing through 208 channels. The simulation model first performs a detailed calculation of the flux and power distribution based on a two-group diffusion theory treatment on a three-dimensional mesh; channel powers, resulting from the summation of the eleven bundle powers in each of the 208 channels, are then employed to make an actual estimation of coolant flows using the channel powers and channel outlet temperatures monitored by digital computers. By using the design flows in individual channels and applying a correction factor based on control-room-monitored flows in eight selected channels, the code can also provide a reserve computational tool for estimating individual channel outlet temperatures, thus providing an alternate arrangement for checking Rads performance. 42 figs. (Orig./A.B.)

  8. Bounding the Resource Availability of Partially Ordered Events with Constant Resource Impact

    Science.gov (United States)

    Frank, Jeremy

    2004-01-01

    We compare existing techniques to bound the resource availability of partially ordered events. We first show that, contrary to intuition, two existing techniques, one due to Laborie and one due to Muscettola, are not strictly comparable in terms of the size of the search trees generated under chronological search with a fixed heuristic. We describe a generalization of these techniques called the Flow Balance Constraint to tightly bound the amount of available resource for a set of partially ordered events with piecewise constant resource impact. We prove that the new technique generates smaller proof trees under chronological search with a fixed heuristic, at little increase in computational expense. We then show how to construct tighter resource bounds but at increased computational cost.
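
    The following sketch computes naive pessimistic/optimistic bounds on the resource level after each event of a partially ordered set with constant impacts, by assuming events unordered with respect to it may occur either before or after it. It is only a simple envelope in the spirit of the techniques being compared, not the Flow Balance Constraint itself, and the events, impacts, and ordering are invented.

```python
# Illustrative production (+) / consumption (-) impacts and a partial order.
impacts = {"a": +3, "b": -2, "c": -1, "d": +2}
before = {("a", "b"), ("a", "c")}     # "a" must precede "b" and "c"
initial_level = 1

def bounds(event):
    """Pessimistic/optimistic resource level just after `event`."""
    preds = {e for e in impacts if (e, event) in before}
    unordered = {e for e in impacts
                 if e != event and e not in preds and (event, e) not in before}
    base = initial_level + impacts[event] + sum(impacts[e] for e in preds)
    lo = base + sum(min(impacts[e], 0) for e in unordered)   # all unordered consumers occur first
    hi = base + sum(max(impacts[e], 0) for e in unordered)   # all unordered producers occur first
    return lo, hi

for ev in impacts:
    print(ev, bounds(ev))
```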

  9. Estimation of subcriticality with the computed values analysis using MCNP of experiment on coupled cores

    International Nuclear Information System (INIS)

    Sakurai, Kiyoshi; Yamamoto, Toshihiro; Arakawa, Takuya; Naito, Yoshitaka

    1998-01-01

    Experiments on coupled cores performed at TCA were analysed using the continuous-energy Monte Carlo calculation code MCNP 4A. Errors of the neutron multiplication factors are evaluated using the Indirect Bias Estimation Method proposed by the authors. A calculation simulating the pulsed neutron method was performed for the 17 x 17 + 5G + 17 x 17 core system, and one simulating the exponential experiment method was performed for the 16 x 9 + 3G + 16 x 9 and 16 x 9 + 5G + 16 x 9 core systems. Errors of the neutron multiplication factors are estimated to be (-1.5) - (-0.6)% when evaluated by the Indirect Bias Estimation Method. The errors evaluated by the conventional pulsed neutron method and the exponential experiment method are estimated to be 7%, but the error is below 1% for the estimation of subcriticality with the computed values by applying the Indirect Bias Estimation Method. The feasibility of subcriticality management is higher when the method is applied to a full-scale fuel storage facility. (author)

  10. Performance analysis of cloud computing services for many-tasks scientific computing

    NARCIS (Netherlands)

    Iosup, A.; Ostermann, S.; Yigitbasi, M.N.; Prodan, R.; Fahringer, T.; Epema, D.H.J.

    2011-01-01

    Cloud computing is an emerging commercial infrastructure paradigm that promises to eliminate the need for maintaining expensive computing facilities by companies and institutes alike. Through the use of virtualization and resource time sharing, clouds serve with a single set of physical resources a

  11. Computational modeling as a tool for water resources management: an alternative approach to problems of multiple uses

    Directory of Open Access Journals (Sweden)

    Haydda Manolla Chaves da Hora

    2012-04-01

    Full Text Available Today in Brazil there are many cases of incompatibility regarding use of water and its availability. Due to the increase in required variety and volume, the concept of multiple uses was created, as stated by Pinheiro et al. (2007). The use of the same resource to satisfy different needs with several restrictions (qualitative and quantitative) creates conflicts. Aiming to minimize these conflicts, this work was applied to the particular cases of Hydrographic Regions VI and VIII of Rio de Janeiro State, using computational modeling techniques (based on the MOHID software – Water Modeling System) as a tool for water resources management.

  12. A resource facility for kinetic analysis: modeling using the SAAM computer programs.

    Science.gov (United States)

    Foster, D M; Boston, R C; Jacquez, J A; Zech, L

    1989-01-01

    Kinetic analysis and integrated system modeling have contributed significantly to understanding the physiology and pathophysiology of metabolic systems in humans and animals. Many experimental biologists are aware of the usefulness of these techniques and recognize that kinetic modeling requires special expertise. The Resource Facility for Kinetic Analysis (RFKA) provides this expertise through: (1) development and application of modeling technology for biomedical problems, and (2) development of computer-based kinetic modeling methodologies concentrating on the computer program Simulation, Analysis, and Modeling (SAAM) and its conversational version, CONversational SAAM (CONSAM). The RFKA offers consultation to the biomedical community in the use of modeling to analyze kinetic data and trains individuals in using this technology for biomedical research. Early versions of SAAM were widely applied in solving dosimetry problems; many users, however, are not familiar with recent improvements to the software. The purpose of this paper is to acquaint biomedical researchers in the dosimetry field with RFKA, which, together with the joint National Cancer Institute-National Heart, Lung and Blood Institute project, is overseeing SAAM development and applications. In addition, RFKA provides many service activities to the SAAM user community that are relevant to solving dosimetry problems.

  13. Uranium resources, demand and production

    International Nuclear Information System (INIS)

    Stipanicic, P.N.

    1985-05-01

    Estimates of the demand and production of the principal uranium resource categories are presented. The estimates are based on data analysis made by the joint 'NEA/IAEA Working Party on Uranium Resources', and the corresponding results are published by the OECD (Organisation for Economic Co-operation and Development) in 'Uranium Resources, Production and Demand', known as the 'Red Book'. (M.C.K.) [pt

  14. Cloud Computing: The Future of Computing

    OpenAIRE

    Aggarwal, Kanika

    2013-01-01

    Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. Cloud computing is attractive to business owners as it eliminates the requirement for users to plan ahead for provisioning, and allows enterprises to start small and increase resources only when there is a rise in service demand. The basic principle of cloud computing is to assign computation to a great number of distributed computers, rather than the local computer ...

  15. Resource Aware Intelligent Network Services (RAINS) Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Lehman, Tom; Yang, Xi

    2018-01-16

    The Resource Aware Intelligent Network Services (RAINS) project conducted research and developed technologies in the area of cyberinfrastructure resource modeling and computation. The goal of this work was to provide a foundation to enable intelligent, software defined services which spanned the network AND the resources which connect to the network. A Multi-Resource Service Plane (MRSP) was defined, which allows resource owners/managers to locate and place themselves from a topology and service availability perspective within the dynamic networked cyberinfrastructure ecosystem. The MRSP enables the presentation of integrated topology views and computation results which can include resources across the spectrum of compute, storage, and networks. The MRSP developed by the RAINS project includes the following key components: i) Multi-Resource Service (MRS) Ontology/Multi-Resource Markup Language (MRML), ii) Resource Computation Engine (RCE), iii) Modular Driver Framework (to allow integration of a variety of external resources). The MRS/MRML is a general and extensible modeling framework that allows for resource owners to model, or describe, a wide variety of resource types. All resources are described using three categories of elements: Resources, Services, and Relationships between the elements. This modeling framework defines a common method for the transformation of cyberinfrastructure resources into data in the form of MRML models. In order to realize this infrastructure datification, the RAINS project developed a model-based computation system, i.e. “RAINS Computation Engine (RCE)”. The RCE has the ability to ingest, process, integrate, and compute based on automatically generated MRML models. The RCE interacts with the resources through system drivers which are specific to the type of external network or resource controller. The RAINS project developed a modular and pluggable driver system which facilitates a variety of resource controllers to automatically generate

  16. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods

    KAUST Repository

    Germain, Pierre-Luc

    2016-06-20

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, with current benchmarks suffering important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e. across-samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created a R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods.

  17. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods

    KAUST Repository

    Germain, Pierre-Luc; Vitriolo, Alessandro; Adamo, Antonio; Laise, Pasquale; Das, Vivek; Testa, Giuseppe

    2016-01-01

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, with current benchmarks suffering important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e. across-samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created a R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods.

  18. Computing requirements for S.S.C. accelerator design and studies

    International Nuclear Information System (INIS)

    Dragt, A.; Talman, R.; Siemann, R.; Dell, G.F.; Leemann, B.; Leemann, C.; Nauenberg, U.; Peggs, S.; Douglas, D.

    1984-01-01

    We estimate the computational hardware resources that will be required for accelerator physics studies during the design of the Superconducting SuperCollider. It is found that both Class IV and Class VI facilities (1) will be necessary. We describe a user environment for these facilities that is desirable within the context of accelerator studies. An acquisition scenario for these facilities is presented

  19. Discharge Fee Policy Analysis: A Computable General Equilibrium (CGE) Model of Water Resources and Water Environments

    OpenAIRE

    Guohua Fang; Ting Wang; Xinyi Si; Xin Wen; Yu Liu

    2016-01-01

    To alleviate increasingly serious water pollution and shortages in developing countries, various kinds of policies have been implemented by local governments. It is vital to quantify and evaluate the performance and potential economic impacts of these policies. This study develops a Computable General Equilibrium (CGE) model to simulate the regional economic and environmental effects of discharge fees. Firstly, water resources and water environment factors are separated from the input and out...

  20. Estimating long-term uranium resource availability and discovery requirements. A Canadian case study

    International Nuclear Information System (INIS)

    Martin, H.L.; Azis, A.; Williams, R.M.

    1979-01-01

    Well-founded estimates of the rate at which a country's resources might be made available are a prime requisite for energy planners and policy makers at the national level. To meet this need, a method is discussed that can aid in the analysis of future supply patterns of uranium and other metals. Known sources are first appraised, on a mine-by-mine basis, in relation to projected domestic needs and expectable export levels. The gap between (a) production from current and anticipated mines, and (b) production levels needed to meet both domestic needs and export opportunities, would have to be met by new sources. Using as measuring sticks the resources and production capabilities of typical uranium deposits, a measure can be obtained of the required timing and magnitude of discovery needs. The new discoveries, when developed into mines, would need to be sufficient to meet not only any shortfalls in production capability, but also any special reserve requirements as stipulated, for example, under Canada's uranium export guidelines. Since the method can be followed simply and quickly, it can serve as a valuable tool for long-term supply assessments of any mineral commodity from a nation's mines. (author)
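
    The supply-gap arithmetic described above can be illustrated as follows: the shortfall between required production and the expected output of known mines is divided by the capability of a typical deposit to indicate roughly how many new discoveries would be needed. All figures below are invented placeholders, not the Canadian data discussed in the record.

```python
import math

required = {2025: 12.0, 2030: 15.0, 2035: 18.0}   # kt U/year needed (domestic needs + exports)
existing = {2025: 11.0, 2030: 10.0, 2035: 7.0}    # kt U/year from current and anticipated mines
typical_mine_output = 1.5                         # kt U/year from a typical new deposit

for year in sorted(required):
    gap = max(0.0, required[year] - existing[year])
    new_mines = math.ceil(gap / typical_mine_output)
    print(f"{year}: shortfall {gap:.1f} kt/a -> about {new_mines} new mine(s) of typical size")
```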

  1. A computationally inexpensive model for estimating dimensional measurement uncertainty due to x-ray computed tomography instrument misalignments

    Science.gov (United States)

    Ametova, Evelina; Ferrucci, Massimiliano; Chilingaryan, Suren; Dewulf, Wim

    2018-06-01

    The recent emergence of advanced manufacturing techniques such as additive manufacturing and an increased demand on the integrity of components have motivated research on the application of x-ray computed tomography (CT) for dimensional quality control. While CT has shown significant empirical potential for this purpose, there is a need for metrological research to accelerate the acceptance of CT as a measuring instrument. The accuracy in CT-based measurements is vulnerable to the instrument geometrical configuration during data acquisition, namely the relative position and orientation of x-ray source, rotation stage, and detector. Consistency between the actual instrument geometry and the corresponding parameters used in the reconstruction algorithm is critical. Currently available procedures provide users with only estimates of geometrical parameters. Quantification and propagation of uncertainty in the measured geometrical parameters must be considered to provide a complete uncertainty analysis and to establish confidence intervals for CT dimensional measurements. In this paper, we propose a computationally inexpensive model to approximate the influence of errors in CT geometrical parameters on dimensional measurement results. We use surface points extracted from a computer-aided design (CAD) model to model discrepancies in the radiographic image coordinates assigned to the projected edges between an aligned system and a system with misalignments. The efficacy of the proposed method was confirmed on simulated and experimental data in the presence of various geometrical uncertainty contributors.
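
    A minimal sketch of the underlying idea, projecting a surface point onto the detector for a nominal and a slightly perturbed detector orientation and comparing the resulting image coordinates, is given below; the geometry, the surface point, and the 0.1 degree in-plane rotation are illustrative assumptions, not the model or data of the paper.

```python
import numpy as np

def project(point, source, det_center, det_u, det_v):
    """Project a 3D point onto a flat detector along the ray from the x-ray source,
    returning (u, v) coordinates in the detector plane."""
    normal = np.cross(det_u, det_v)
    ray = point - source
    t = np.dot(det_center - source, normal) / np.dot(ray, normal)
    hit = source + t * ray
    return np.dot(hit - det_center, det_u), np.dot(hit - det_center, det_v)

# Illustrative cone-beam geometry (arbitrary units).
source = np.array([0.0, -200.0, 0.0])
det_center = np.array([0.0, 300.0, 0.0])
u_axis, v_axis = np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])

# A small in-plane detector rotation stands in for a geometrical misalignment.
theta = np.deg2rad(0.1)
u_tilt = np.array([np.cos(theta), 0.0, -np.sin(theta)])
v_tilt = np.array([np.sin(theta), 0.0, np.cos(theta)])

surface_point = np.array([10.0, 0.0, 5.0])   # e.g. one point extracted from a CAD surface
nominal = project(surface_point, source, det_center, u_axis, v_axis)
perturbed = project(surface_point, source, det_center, u_tilt, v_tilt)
print("shift in detector coordinates:", np.subtract(perturbed, nominal))
```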

  2. Computable error estimates of a finite difference scheme for option pricing in exponential Lévy models

    KAUST Repository

    Kiessling, Jonas; Tempone, Raul

    2014-01-01

    jump activity, then the jumps smaller than some (Formula presented.) are approximated by diffusion. The resulting diffusion approximation error is also estimated, with leading order term in computable form, as well as the dependence of the time

  3. Computational error estimates for Monte Carlo finite element approximation with log normal diffusion coefficients

    KAUST Repository

    Sandberg, Mattias

    2015-01-07

    The Monte Carlo (and Multi-level Monte Carlo) finite element method can be used to approximate observables of solutions to diffusion equations with log normal distributed diffusion coefficients, e.g. modelling ground water flow. Typical models use log normal diffusion coefficients with Hölder regularity of order up to 1/2 a.s. This low regularity implies that the high frequency finite element approximation error (i.e. the error from frequencies larger than the mesh frequency) is not negligible and can be larger than the computable low frequency error. This talk will address how the total error can be estimated by the computable error.

  4. Aging adult skull remains through radiological density estimates: A comparison of different computed tomography systems and the use of computer simulations to judge the accuracy of results.

    Science.gov (United States)

    Obert, Martin; Kubelt, Carolin; Schaaf, Thomas; Dassinger, Benjamin; Grams, Astrid; Gizewski, Elke R; Krombach, Gabriele A; Verhoff, Marcel A

    2013-05-10

    The objective of this article was to explore age-at-death estimates in forensic medicine, which were methodically based on age-dependent, radiologically defined bone-density (HC) decay and which were investigated with a standard clinical computed tomography (CT) system. Such density decay was formerly discovered with a high-resolution flat-panel CT in the skulls of adult females. The development of a standard CT methodology for age estimations--with thousands of installations--would have the advantage of being applicable everywhere, whereas only few flat-panel prototype CT systems are in use worldwide. A Multi-Slice CT scanner (MSCT) was used to obtain 22,773 images from 173 European human skulls (89 male, 84 female), taken from a population of patients from the Department of Neuroradiology at the University Hospital Giessen and Marburg during 2010 and 2011. An automated image analysis was carried out to evaluate HC of all images. The age dependence of HC was studied by correlation analysis. The prediction accuracy of age-at-death estimates was calculated. Computer simulations were carried out to explore the influence of noise on the accuracy of age predictions. Human skull HC values strongly scatter as a function of age for both sexes. Adult male skull bone-density remains constant during lifetime. Adult female HC decays during lifetime, as indicated by a correlation coefficient (CC) of -0.53. Prediction errors for age-at-death estimates for both of the used scanners are in the range of ±18 years at a 75% confidence interval (CI). Computer simulations indicate that this is the best that can be expected for such noisy data. Our results indicate that HC-decay is indeed present in adult females and that it can be demonstrated both by standard and by high-resolution CT methods, applied to different subject groups of an identical population. The weak correlation between HC and age found by both CT methods only enables a method to estimate age-at-death with limited
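
    The kind of simulation referred to above can be sketched as follows: an age-density relation with a weak negative correlation (roughly like the reported CC of about -0.53) is generated, inverted by least squares, and the spread of the age residuals is inspected. All numbers are illustrative assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a weakly correlated density-vs-age relation for n synthetic subjects.
n = 200
age = rng.uniform(20, 90, n)
density = 1200 - 2.0 * age + rng.normal(0, 65, n)   # noise chosen to give a correlation near -0.5
print("simulated correlation:", np.corrcoef(age, density)[0, 1])

# Invert the relation: predict age from density with a least-squares line.
slope, intercept = np.polyfit(density, age, 1)
residuals = age - (slope * density + intercept)
half_width_75 = np.percentile(np.abs(residuals), 75)
print(f"75% of simulated cases are predicted within +/-{half_width_75:.1f} years")
```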

  5. Cloud Computing:Strategies for Cloud Computing Adoption

    OpenAIRE

    Shimba, Faith

    2010-01-01

    The advent of cloud computing in recent years has sparked an interest from different organisations, institutions and users to take advantage of web applications. This is a result of the new economic model for the Information Technology (IT) department that cloud computing promises. The model promises a shift from an organisation required to invest heavily for limited IT resources that are internally managed, to a model where the organisation can buy or rent resources that are managed by a clo...

  6. Estimate of the energy resources of the Aracatuba region, Sao Paulo, Brazil; Estimativa dos recursos energeticos da regiao de Aracatuba

    Energy Technology Data Exchange (ETDEWEB)

    Udaeta, Miguel Edgar Morales; Galvao, Luiz Claudio Ribeiro; Grimoni, Jose Aquiles Baesso; Souza, Carlos Antonio Farias de [Universidade de Sao Paulo (USP), SP (Brazil). Dept. de Engenharia de Energia e Automacao Eletricas. Grupo de Energia], e-mail: udaeta@pea.usp.br

    2004-07-01

    The complete assessment of energy-producing resources, as referred to, entails consideration of the technical, economic, political and environmental aspects of these resources. The developed project presents a methodology for the implementation of this complete assessment, which consists of the identification and inventory of the technologies and resources available at the time at a specific geographical site, and their classification in terms of indicative and computable measures of the mentioned aspects, taking sustainable development into consideration. This methodology was applied in the Administrative Region of Aracatuba, obtaining as a result a ranking of alternatives. The solar collectors received the best evaluation. From this result it is possible, in a preliminary approach, to indicate the best investment option, in this case the solar collector. (author)

  8. Reciprocal Estimation of Pedestrian Location and Motion State toward a Smartphone Geo-Context Computing Solution

    Directory of Open Access Journals (Sweden)

    Jingbin Liu

    2015-06-01

    Full Text Available The rapid advance in mobile communications has made information and services ubiquitously accessible. Location and context information have become essential for the effectiveness of services in the era of mobility. This paper proposes the concept of geo-context that is defined as an integral synthesis of geographical location, human motion state and mobility context. A geo-context computing solution consists of a positioning engine, a motion state recognition engine, and a context inference component. In the geo-context concept, the human motion states and mobility context are associated with the geographical location where they occur. A hybrid geo-context computing solution is implemented that runs on a smartphone, and it utilizes measurements of multiple sensors and signals of opportunity that are available within a smartphone. Pedestrian location and motion states are estimated jointly under the framework of hidden Markov models, and they are used in a reciprocal manner to improve their estimation performance of one another. It is demonstrated that pedestrian location estimation has better accuracy when its motion state is known, and in turn, the performance of motion state recognition can be improved with increasing reliability when the location is given. The geo-context inference is implemented simply with the expert system principle, and more sophisticated approaches will be developed.
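
    A minimal sketch of joint estimation over a combined (location, motion-state) space with a hidden Markov model is given below, in the spirit of the reciprocal scheme described in the record; the states, transition structure, and observation likelihoods are illustrative assumptions rather than the paper's actual models.

```python
import numpy as np

locations = ["room_A", "corridor", "room_B"]
motions = ["still", "walking"]
states = [(l, m) for l in locations for m in motions]

def trans_prob(s, t):
    """Transition probability on the joint state: walking makes a location change likely."""
    (l1, m1), (l2, m2) = s, t
    p_move = 0.6 if m1 == "walking" else 0.05
    p_loc = p_move / 2 if l1 != l2 else 1.0 - p_move
    p_motion = 0.8 if m1 == m2 else 0.2
    return p_loc * p_motion

def obs_lik(state, obs):
    """obs = (rough location guess from positioning, accelerometer activity in [0, 1])."""
    loc, motion = state
    p_loc = 0.7 if obs[0] == loc else 0.15
    p_mot = obs[1] if motion == "walking" else 1.0 - obs[1]
    return p_loc * p_mot

def viterbi(observations):
    """Most likely joint sequence of (location, motion state)."""
    logp = {s: np.log(1.0 / len(states)) + np.log(obs_lik(s, observations[0])) for s in states}
    back = []
    for obs in observations[1:]:
        new_logp, pointers = {}, {}
        for t in states:
            best_s = max(states, key=lambda s: logp[s] + np.log(trans_prob(s, t)))
            new_logp[t] = logp[best_s] + np.log(trans_prob(best_s, t)) + np.log(obs_lik(t, obs))
            pointers[t] = best_s
        back.append(pointers)
        logp = new_logp
    path = [max(logp, key=logp.get)]
    for pointers in reversed(back):
        path.append(pointers[path[-1]])
    return list(reversed(path))

obs_seq = [("room_A", 0.1), ("room_A", 0.9), ("corridor", 0.8), ("room_B", 0.2)]
print(viterbi(obs_seq))
```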

  9. Estimation of the resource buffers in the assembly process of a shearer machine in the CPPM method

    Directory of Open Access Journals (Sweden)

    Gwiazda Aleksander

    2017-01-01

    Full Text Available The dynamic development of scheduling systems allows currently realized tasks to be significantly improved. Critical Chain Project Management (CCPM) is one of the methods of project management based on network planning. This method utilizes the concept of a critical chain derived from the Theory of Constraints, and it allows losses of project time and resources to be avoided. It results in quicker project implementation (20-30%) and in reducing the risk level associated with task realization. The projects are cheaper, and the risk of cost overruns is significantly reduced. Factors that distinguish the CCPM method from traditional network planning methods are the balance of resources and the introduction of buffers. Moreover, key elements in the CCPM method are task times that are reduced from traditional estimates to realistic ones. Activities associated with a task start as late as possible, in accordance with the ALAP principle (As Late As Possible). This work presents the process of managing the assembly of a shearer machine, taking into account the utilization of safety buffers and the optimization of the whole project. The estimation of buffer capacity to improve project realization is presented.
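
    One commonly used sizing rule for CCPM buffers, the root-square-error method, is sketched below as an illustration of how a buffer capacity can be estimated from the gap between safe and aggressive task estimates; the task durations are invented and the rule is a generic one, not necessarily the procedure used in this work.

```python
import math

# Each critical-chain task: (safe estimate, aggressive estimate), in days (illustrative values).
chain_tasks = [
    (10, 6),
    (8, 5),
    (12, 7),
    (6, 4),
]

# Root-square-error rule: buffer = sqrt of the sum of squared (safe - aggressive) differences.
project_buffer = math.sqrt(sum((safe - aggressive) ** 2 for safe, aggressive in chain_tasks))
chain_length = sum(aggressive for _, aggressive in chain_tasks)
print(f"critical chain length: {chain_length} days, project buffer: {project_buffer:.1f} days")
```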

  10. Information resource management concepts for records managers

    Energy Technology Data Exchange (ETDEWEB)

    Seesing, P.R.

    1992-10-01

    Information Resource Management (IRM) is the label given to the various approaches used to foster greater accountability for the use of computing resources. It is a corporate philosophy that treats information as it would its other resources. There is a reorientation from considering simply expenditures to considering the value of the data stored on the hardware. Accountability for computing resources is expanding beyond just the data processing (DP) or management information systems (MIS) manager to include senior organization management and user management. Management's goal for office automation is being refocused from saving money to improving productivity. A model developed by Richard Nolan (1982) illustrates the basic evolution of computer use in organizations. Computer Era: (1) Initiation (computer acquisition), (2) Contagion (intense system development), (3) Control (proliferation of management controls). Data Resource Era: (4) Integration (user service orientation), (5) Data Administration (corporate value of information), (6) Maturity (strategic approach to information technology). The first three stages mark the growth of traditional data processing and management information systems departments. The development of the IRM philosophy in an organization involves the restructuring of the DP organization and new management techniques. The three stages of the Data Resource Era represent the evolution of IRM. This paper examines each of them in greater detail.

  12. Market-Oriented Cloud Computing: Vision, Hype, and Reality for Delivering IT Services as Computing Utilities

    OpenAIRE

    Buyya, Rajkumar; Yeo, Chee Shin; Venugopal, Srikumar

    2008-01-01

    This keynote paper: presents a 21st century vision of computing; identifies various computing paradigms promising to deliver the vision of computing utilities; defines Cloud computing and provides the architecture for creating market-oriented Clouds by leveraging technologies such as VMs; provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; presents...

  13. Comparison of Personal Resources in Patients Who Differently Estimate the Impact of Multiple Sclerosis.

    Science.gov (United States)

    Wilski, Maciej; Tomczak, Maciej

    2017-04-01

    Discrepancies between physicians' assessment and patients' subjective representations of the disease severity may influence physician-patient communication and management of a chronic illness, such as multiple sclerosis (MS). For these reasons, it is important to recognize factors that distinguish patients who differently estimate the impact of MS. The purpose of this study was to verify if the patients who overestimate or underestimate the impact of MS differ in their perception of personal resources from individuals presenting with a realistic appraisal of their physical condition. A total of 172 women and 92 men diagnosed with MS completed Multiple Sclerosis Impact Scale, University of Washington Self Efficacy Scale, Rosenberg Self-Esteem Scale, Body Esteem Scale, Brief Illness Perception Questionnaire, Treatment Beliefs Scale, Actually Received Support Scale, and Socioeconomic resources scale. Physician's assessment of health status was determined with Expanded Disability Status Scale. Linear regression analysis was conducted to identify the subsets of patients with various patterns of subjective health and Expanded Disability Status Scale (EDSS) scores. Patients overestimating the impact of their disease presented with significantly lower levels of self-esteem, self-efficacy in MS, and body esteem; furthermore, they perceived their condition more threatening than did realists and underestimators. They also assessed anti-MS treatment worse, had less socioeconomic resources, and received less support than underestimators. Additionally, underestimators presented with significantly better perception of their disease, self, and body than did realists. Self-assessment of MS-related symptoms is associated with specific perception of personal resources in coping with the disease. These findings may facilitate communication with patients and point to new directions for future research on adaptation to MS.

  14. Application of High Performance Computing to Earthquake Hazard and Disaster Estimation in Urban Area

    Directory of Open Access Journals (Sweden)

    Muneo Hori

    2018-02-01

    Full Text Available Integrated earthquake simulation (IES) is a seamless simulation analyzing all processes of earthquake hazard and disaster. There are two difficulties in carrying out IES, namely, the requirement of large-scale computation and the requirement of numerous analysis models for structures in an urban area; they are solved by taking advantage of high performance computing (HPC) and by developing a system of automated model construction. HPC is a key element in developing IES, as it needs to analyze wave propagation and amplification processes in an underground structure; a high-fidelity model of the underground structure exceeds 100 billion degrees of freedom. Examples of IES for Tokyo Metropolis are presented; the numerical computation is made using the K computer, the supercomputer of Japan. The estimation of earthquake hazard and disaster for a given earthquake scenario is made by the ground motion simulation and the urban area seismic response simulation, respectively, for a target area of 10,000 m × 10,000 m.

  15. Computational estimation of soybean oil adulteration in Nepalese mustard seed oil based on fatty acid composition

    OpenAIRE

    Shrestha, Kshitij; De Meulenaer, Bruno

    2011-01-01

    The experiment was carried out for the computational estimation of soybean oil adulteration in mustard seed oil using a chemometric technique based on fatty acid composition. Principal component analysis and K-means clustering of the fatty acid composition data showed 4 major mustard/rapeseed clusters, two of the high-erucic and two of the low-erucic mustard type. Soybean and other possible adulterants formed a cluster distinct from them. The methodology for estimation of soybean oil adulteration was deve...
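
    A minimal sketch of the chemometric workflow described above (principal component analysis plus K-means clustering of fatty acid profiles) is shown below; the composition table is hypothetical and the cluster count is illustrative.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        # Hypothetical fatty acid composition table (% of total fatty acids);
        # columns could be e.g. palmitic, oleic, linoleic, linolenic, erucic acid.
        X = np.array([
            [3.5, 12.0, 14.5, 9.0, 45.0],   # high-erucic mustard type
            [3.8, 13.5, 15.0, 8.5, 42.0],
            [4.5, 60.0, 21.0, 10.0, 1.0],   # low-erucic (canola-like) type
            [4.2, 62.0, 20.0, 9.5, 0.5],
            [11.0, 23.0, 54.0, 7.5, 0.0],   # soybean oil (potential adulterant)
            [10.5, 24.0, 53.0, 8.0, 0.0],
        ])

        X_std = StandardScaler().fit_transform(X)
        scores = PCA(n_components=2).fit_transform(X_std)                    # 2-D ordination
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_std)

        for score, label in zip(scores, labels):
            print(f"PC1={score[0]:+.2f}  PC2={score[1]:+.2f}  cluster={label}")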

  16. Evaluation of alternative model-data fusion approaches in water balance estimation across Australia

    Science.gov (United States)

    van Dijk, A. I. J. M.; Renzullo, L. J.

    2009-04-01

    Australia's national agencies are developing a continental modelling system to provide a range of water information services. It will include rolling water balance estimation to underpin national water accounts, water resources assessments that interpret current water resources availability and trends in a historical context, and water resources predictions coupled to climate and weather forecasting. The nation-wide coverage, currency, accuracy, and consistency required mean that remote sensing will need to play an important role along with in-situ observations. Different approaches to blending models and observations can be considered. Integration of on-ground and remote sensing data into land surface models in atmospheric applications often involves state updating through model-data assimilation techniques. By comparison, retrospective water balance estimation and hydrological scenario modelling to date have mostly relied on static parameter fitting against observations and have made little use of earth observation. The model-data fusion approach most appropriate for a continental water balance estimation system will need to consider the trade-off between computational overhead and the accuracy gains achieved when using more sophisticated synthesis techniques and additional observations. This trade-off was investigated using a landscape hydrological model and satellite-based estimates of soil moisture and vegetation properties for several gauged test catchments in southeast Australia.
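
    To make the contrast between static parameter fitting and state updating concrete, the following sketch shows a scalar Kalman-style update that blends a modelled soil moisture state with a satellite retrieval; the values and error variances are hypothetical and do not come from the study.

        def kalman_update(model_state, model_var, obs, obs_var):
            """Scalar state update: blend a model estimate with an observation,
            weighting each by the inverse of its error variance."""
            gain = model_var / (model_var + obs_var)
            updated_state = model_state + gain * (obs - model_state)
            updated_var = (1.0 - gain) * model_var
            return updated_state, updated_var

        # Hypothetical soil moisture (m3/m3): model forecast vs. satellite retrieval
        state, var = kalman_update(model_state=0.22, model_var=0.004,
                                   obs=0.30, obs_var=0.008)
        print(f"analysis = {state:.3f}, variance = {var:.4f}")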

  17. Uranium resource assessments

    International Nuclear Information System (INIS)

    1981-01-01

    The objective of this investigation is to examine what is generally known about uranium resources, what is subject to conjecture, how well the explorers themselves understand the occurrence of uranium, and who the various participants in the exploration process are. From this we hope to reach a better understanding of the quality of uranium resource estimates as well as the nature of the exploration process. The underlying questions will remain unanswered. But given an inability to estimate our uranium resources precisely, how much do we really need to know? To answer this latter question, the various Department of Energy needs for uranium resource estimates are examined. This allows consideration of whether or not, given the absence of more complete long-term supply data and the associated problems of uranium deliverability for the electric utility industry, we are now threatened with nuclear power plants eventually standing idle due to an unanticipated lack of fuel for their reactors. Obviously this is of some consequence to the government and the energy-consuming public. The report is organized into four parts. Section I evaluates the uranium resource data base and the various methodologies of resource assessment. Part II describes the manner in which a private company goes about exploring for uranium and the nature of its internal need for resource information. Part III examines the structure of the industry for the purpose of determining the character of the industry with respect to resource development. Part IV arrives at conclusions about the emerging pattern of industrial behavior with respect to uranium supply and the implications this has for coping with national energy issues.

  18. Estimating the Impact of Drought on Groundwater Resources of the Marshall Islands

    Directory of Open Access Journals (Sweden)

    Brandon L. Barkey

    2017-01-01

    Full Text Available Groundwater resources of small coral islands are threatened due to short-term and long-term changes in climate. A significant short-term threat is El Niño events, which typically induce a severe months-long drought for many atoll nations in the western and central Pacific regions that exhausts rainwater supply and necessitates the use of groundwater. This study quantifies fresh groundwater resources under both average rainfall and drought conditions for the Republic of Marshall Islands (RMI), a nation composed solely of atolls and which is severely impacted by El Niño droughts. The atoll island algebraic model is used to estimate the thickness of the freshwater lens for 680 inhabited and uninhabited islands of the RMI, with a focus on the severe 1998 drought. The model accounts for precipitation, island width, hydraulic conductivity of the upper Holocene-age sand aquifer, the depth to the contact between the Holocene aquifer and the lower Pleistocene-age limestone aquifer, and the presence of a reef flat plate underlying the ocean side of the island. Model results are tested for islands that have fresh groundwater data. Results highlight the fragility of groundwater resources for the nation. Average lens thickness during typical seasonal rainfall is approximately 4 m, with only 30% of the islands maintaining a lens thicker than 4.5 m and 55% of the islands with a lens less than 2.5 m thick. Thicker lenses typically occur for larger islands, islands located on the leeward side of an atoll due to lower hydraulic conductivity, and islands located in the southern region of the RMI due to higher rainfall rates. During drought, groundwater on small islands (<300 m in width) is completely depleted. Over half (54%) of the islands are classified as “Highly Vulnerable” to drought. Results provide valuable information for RMI water resources planners, particularly during the current 2016 El Niño drought, and similar methods can be used to quantify

  19. Microdiamond grade as a regionalised variable - some basic requirements for successful local microdiamond resource estimation of kimberlites

    Science.gov (United States)

    Stiefenhofer, Johann; Thurston, Malcolm L.; Bush, David E.

    2018-04-01

    Microdiamonds offer several advantages as a resource estimation tool, such as access to deeper parts of a deposit which may be beyond the reach of large diameter drilling (LDD) techniques, the recovery of the total diamond content in the kimberlite, and a cost benefit due to the cheaper treatment cost compared to large diameter samples. In this paper we take the first step towards local estimation by showing that micro-diamond samples can be treated as a regionalised variable suitable for use in geostatistical applications and we show examples of such output. Examples of microdiamond variograms are presented, the variance-support relationship for microdiamonds is demonstrated and consistency of the diamond size frequency distribution (SFD) is shown with the aid of real datasets. The focus therefore is on why local microdiamond estimation should be possible, not how to generate such estimates. Data from our case studies and examples demonstrate a positive correlation between micro- and macrodiamond sample grades as well as block estimates. This relationship can be demonstrated repeatedly across multiple mining operations. The smaller sample support size for microdiamond samples is a key difference between micro- and macrodiamond estimates and this aspect must be taken into account during the estimation process. We discuss three methods which can be used to validate or reconcile the estimates against macrodiamond data, either as estimates or in the form of production grades: (i) reconciliation using production data, (ii) by comparing LDD-based grade estimates against microdiamond-based estimates and (iii) using simulation techniques.
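
    As an illustration of treating microdiamond grade as a regionalised variable, the sketch below computes a classical experimental semivariogram from point samples; the coordinates, grades and binning parameters are synthetic and are not taken from the case studies.

        import numpy as np

        def empirical_variogram(coords, values, lag_width, n_lags):
            """Classical (Matheron) semivariogram estimate for isotropic 2-D data:
            gamma(h) = 0.5 * mean((z_i - z_j)^2) over pairs separated by roughly h."""
            coords = np.asarray(coords, dtype=float)
            values = np.asarray(values, dtype=float)
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            sq = 0.5 * (values[:, None] - values[None, :]) ** 2
            iu = np.triu_indices(len(values), k=1)          # count each pair once
            lags, gammas = [], []
            for k in range(n_lags):
                lo, hi = k * lag_width, (k + 1) * lag_width
                mask = (d[iu] >= lo) & (d[iu] < hi)
                if mask.any():
                    lags.append(0.5 * (lo + hi))
                    gammas.append(sq[iu][mask].mean())
            return np.array(lags), np.array(gammas)

        # Synthetic microdiamond sample grades (stones per unit mass) at random locations
        rng = np.random.default_rng(0)
        coords = rng.uniform(0, 100, size=(50, 2))
        grades = rng.lognormal(mean=1.0, sigma=0.5, size=50)
        lags, gammas = empirical_variogram(coords, grades, lag_width=10.0, n_lags=8)
        print(np.column_stack([lags, gammas]))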

  20. Analysis of Traffic Parameter Estimation and Its Impacts on Wireless Channel

    Institute of Scientific and Technical Information of China (English)

    徐玉滨; 沙学军; 强蔚

    2004-01-01

    Wideband and broadband access have received much attention with the development of radio transmission techniques. The wireless access control procedure plays an important role in this type of system, and the efficiency of the control algorithm has a great impact on the throughput of channel resources. Based on a wideband network control model and the characteristics of the radio channel, this paper proposes a channel traffic estimation method, applies it in a dynamic parameter control procedure, and gives a detailed analysis of the estimation error and its impact on channel throughput and delay performance. Computation and simulation of system performance show a positive solution for system design.

  1. The pilot way to Grid resources using glideinWMS

    CERN Document Server

    Sfiligoi, Igor; Holzman, Burt; Mhashilkar, Parag; Padhi, Sanjay; Wurthwrin, Frank

    Grid computing has become very popular in big and widespread scientific communities with high computing demands, like high energy physics. Computing resources are being distributed over many independent sites with only a thin layer of grid middleware shared between them. This deployment model has proven to be very convenient for computing resource providers, but has introduced several problems for the users of the system, the three major being the complexity of job scheduling, the non-uniformity of compute resources, and the lack of good job monitoring. Pilot jobs address all the above problems by creating a virtual private computing pool on top of grid resources. This paper presents both the general pilot concept, as well as a concrete implementation, called glideinWMS, deployed in the Open Science Grid.

  2. National Uranium Resource Evaluation Program. Hydrogeochemical and Stream Sediment Reconnaissance Basic Data Reports Computer Program Requests Manual

    International Nuclear Information System (INIS)

    1980-01-01

    This manual is intended to aid those who are unfamiliar with ordering computer output for verification and preparation of Uranium Resource Evaluation (URE) Project reconnaissance basic data reports. The manual is also intended to help standardize the procedures for preparing the reports. Each section describes a program or group of related programs. The sections are divided into three parts: Purpose, Request Forms, and Requested Information

  3. Use of simplified methods for predicting natural resource damages

    International Nuclear Information System (INIS)

    Loreti, C.P.; Boehm, P.D.; Gundlach, E.R.; Healy, E.A.; Rosenstein, A.B.; Tsomides, H.J.; Turton, D.J.; Webber, H.M.

    1995-01-01

    To reduce transaction costs and save time, the US Department of the Interior (DOI) and the National Oceanic and Atmospheric Administration (NOAA) have developed simplified methods for assessing natural resource damages from oil and chemical spills. DOI has proposed the use of two computer models, the Natural Resource Damage Assessment Model for Great Lakes Environments (NRDAM/GLE) and a revised Natural Resource Damage Assessment Model for Coastal and Marine Environments (NRDAM/CME) for predicting monetary damages for spills of oils and chemicals into the Great Lakes and coastal and marine environments. NOAA has used versions of these models to create Compensation Formulas, which it has proposed for calculating natural resource damages for oil spills of up to 50,000 gallons anywhere in the US. Based on a review of the documentation supporting the methods, the results of hundreds of sample runs of DOI's models, and the outputs of the thousands of model runs used to create NOAA's Compensation Formulas, this presentation discusses the ability of these simplified assessment procedures to make realistic damage estimates. The limitations of these procedures are described, and the need for validating the assumptions used in predicting natural resource injuries is discussed

  4. Mongolia wind resource assessment project

    International Nuclear Information System (INIS)

    Elliott, D.; Chadraa, B.; Natsagdorj, L.

    1998-01-01

    The development of detailed, regional wind-resource distributions and other pertinent wind resource characteristics (e.g., assessment maps and reliable estimates of seasonal, diurnal, and directional characteristics) is an important step in planning and accelerating the deployment of wind energy systems. This paper summarizes the approach and methods being used to conduct a wind energy resource assessment of Mongolia. The primary goals of this project are to develop a comprehensive wind energy resource atlas of Mongolia and to establish a wind measurement program in specific regions of Mongolia to identify prospective sites for wind energy projects and to help validate some of the wind resource estimates. The Mongolian wind resource atlas will include detailed, computerized wind power maps and other valuable wind resource characteristic information for the different regions of Mongolia.

  5. Framework for Computation Offloading in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Dejan Kovachev

    2012-12-01

    Full Text Available The inherently limited processing power and battery lifetime of mobile phones hinder the possible execution of computationally intensive applications like content-based video analysis or 3D modeling. Offloading of computationally intensive application parts from the mobile platform into a remote cloud infrastructure or nearby idle computers addresses this problem. This paper presents our Mobile Augmentation Cloud Services (MACS) middleware, which enables adaptive extension of Android application execution from a mobile client into the cloud. Applications are developed by using the standard Android development pattern. The middleware does the heavy lifting of adaptive application partitioning, resource monitoring and computation offloading. These elastic mobile applications can run as usual mobile applications, but they can also use remote computing resources transparently. Two prototype applications using the MACS middleware demonstrate the benefits of the approach. The evaluation shows that applications which involve costly computations can benefit from offloading, with around 95% energy savings and significant performance gains compared to local execution only.

  6. Using Amazon's Elastic Compute Cloud to scale CMS' compute hardware dynamically.

    CERN Document Server

    Melo, Andrew Malone

    2011-01-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud-computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long-term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly on-demand, as limits and caps on usage are imposed. Our trial workflows allow us t...

  7. Temperature-based estimation of global solar radiation using soft computing methodologies

    Science.gov (United States)

    Mohammadi, Kasra; Shamshirband, Shahaboddin; Danesh, Amir Seyed; Abdullah, Mohd Shahidan; Zamani, Mazdak

    2016-07-01

    Precise knowledge of solar radiation is indeed essential in different technological and scientific applications of solar energy. Temperature-based estimation of global solar radiation would be appealing owing to the broad availability of measured air temperatures. In this study, the potentials of soft computing techniques are evaluated to estimate daily horizontal global solar radiation (DHGSR) from measured maximum, minimum, and average air temperatures (Tmax, Tmin, and Tavg) in an Iranian city. For this purpose, a comparative evaluation between three methodologies of adaptive neuro-fuzzy inference system (ANFIS), radial basis function support vector regression (SVR-rbf), and polynomial basis function support vector regression (SVR-poly) is performed. Five combinations of Tmax, Tmin, and Tavg serve as inputs to develop the ANFIS, SVR-rbf, and SVR-poly models. The attained results show that all ANFIS, SVR-rbf, and SVR-poly models provide favorable accuracy. Across all techniques, the highest accuracies are achieved by models (5), which use Tmax−Tmin and Tmax as inputs. According to the statistical results, SVR-rbf outperforms SVR-poly and ANFIS. For SVR-rbf (5), the mean absolute bias error, root mean square error, and correlation coefficient are 1.1931 MJ/m2, 2.0716 MJ/m2, and 0.9380, respectively. The survey results confirm that SVR-rbf can be used efficiently to estimate DHGSR from air temperatures.
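
    A minimal sketch of an RBF-kernel support vector regression of the kind evaluated in the study is given below; the temperature and radiation values and the hyperparameters are illustrative, not those reported by the authors.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVR

        # Hypothetical daily records: [Tmax - Tmin, Tmax] in degC as predictors,
        # daily horizontal global solar radiation (MJ/m2) as the target.
        X = np.array([[12.0, 30.0], [10.5, 27.0], [14.0, 33.5], [8.0, 22.0],
                      [11.0, 28.5], [13.5, 32.0], [9.0, 24.0], [12.5, 31.0]])
        y = np.array([22.5, 19.0, 25.0, 14.5, 20.5, 24.0, 16.0, 23.0])

        # RBF-kernel support vector regression; hyperparameters chosen for illustration.
        model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
        model.fit(X, y)

        print(model.predict([[11.5, 29.0]]))   # estimated DHGSR for a new day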

  8. A Multiagent Evolutionary Algorithm for the Resource-Constrained Project Portfolio Selection and Scheduling Problem

    Directory of Open Access Journals (Sweden)

    Yongyi Shou

    2014-01-01

    Full Text Available A multiagent evolutionary algorithm is proposed to solve the resource-constrained project portfolio selection and scheduling problem. The proposed algorithm has a dual level structure. In the upper level a set of agents make decisions to select appropriate project portfolios. Each agent selects its project portfolio independently. The neighborhood competition operator and self-learning operator are designed to improve the agent’s energy, that is, the portfolio profit. In the lower level the selected projects are scheduled simultaneously and completion times are computed to estimate the expected portfolio profit. A priority rule-based heuristic is used by each agent to solve the multiproject scheduling problem. A set of instances were generated systematically from the widely used Patterson set. Computational experiments confirmed that the proposed evolutionary algorithm is effective for the resource-constrained project portfolio selection and scheduling problem.
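
    The lower-level heuristic mentioned above can be illustrated with a simple serial schedule-generation scheme driven by a priority rule; the task data, single-resource capacity and shortest-duration rule below are hypothetical, not the instances or rule used by the authors.

        # Minimal serial schedule-generation scheme with a priority rule, for a single
        # renewable resource. Task data and the priority rule are illustrative only.
        tasks = {             # id: (duration, resource demand, predecessors)
            "A": (3, 2, []),
            "B": (4, 3, ["A"]),
            "C": (2, 2, ["A"]),
            "D": (3, 1, ["B", "C"]),
        }
        CAPACITY = 4
        priority = sorted(tasks, key=lambda t: tasks[t][0])   # shortest duration first

        def feasible(start, dur, demand, usage):
            return all(usage.get(t, 0) + demand <= CAPACITY for t in range(start, start + dur))

        finish, usage, scheduled = {}, {}, []
        while len(scheduled) < len(tasks):
            for t in priority:
                dur, demand, preds = tasks[t]
                if t in finish or any(p not in finish for p in preds):
                    continue
                start = max([finish[p] for p in preds], default=0)
                while not feasible(start, dur, demand, usage):
                    start += 1                      # delay until the resource is free
                for u in range(start, start + dur):
                    usage[u] = usage.get(u, 0) + demand
                finish[t] = start + dur
                scheduled.append((t, start, finish[t]))
                break

        print(scheduled)                   # (task, start, finish) under the resource constraint
        print("makespan:", max(finish.values()))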

  9. PEDIC - A COMPUTER PROGRAM TO ESTIMATE THE EFFECT OF EVACUATION ON POPULATION EXPOSURE FOLLOWING ACUTE RADIONUCLIDE RELEASES TO THE ATMOSPHERE

    Energy Technology Data Exchange (ETDEWEB)

    Strenge, D. L.; Peloquin, R. A.

    1981-01-01

    The computer program PEDIC is described for estimation of the effect of evacuation on population exposure. The program uses joint frequency, annual average meteorological data and a simple population evacuation model to estimate exposure reduction due to movement of people away from radioactive plumes following an acute release of activity. Atmospheric dispersion is based on a sector averaged Gaussian model with consideration of plume rise and building wake effects. Appendices to the report provide details of the computer program design, a program listing, input card preparation instructions and sample problems.
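
    For orientation, a commonly used sector-averaged Gaussian dilution factor for 22.5-degree sectors is sketched below; this is not PEDIC's exact implementation, which additionally accounts for plume rise, building wake effects and joint-frequency meteorological data.

        import math

        def chi_over_q_sector(x, sigma_z, wind_speed, release_height):
            """Sector-averaged Gaussian dilution factor chi/Q (s/m^3) for 22.5-degree
            sectors: chi/Q = 2.032 * exp(-h^2 / (2 sigma_z^2)) / (sigma_z * u * x).
            x: downwind distance (m), sigma_z: vertical dispersion (m), u: m/s, h: m."""
            return 2.032 * math.exp(-release_height**2 / (2.0 * sigma_z**2)) / (
                sigma_z * wind_speed * x)

        # Illustrative numbers: 1000 m downwind, sigma_z ~ 32 m for neutral stability,
        # 3 m/s wind, 30 m effective release height.
        print(f"chi/Q = {chi_over_q_sector(1000.0, 32.0, 3.0, 30.0):.2e} s/m^3")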

  10. Security of fixed and wireless computer networks

    NARCIS (Netherlands)

    Verschuren, J.; Degen, A.J.G.; Veugen, P.J.M.

    2003-01-01

    A few decades ago, most computers were stand-alone machines: they were able to process information using their own resources. Later, computer systems were connected to each other enabling a computer system to exchange data with another computer and to use resources of another computer. With the

  11. Practical experimental certification of computational quantum gates using a twirling procedure.

    Science.gov (United States)

    Moussa, Osama; da Silva, Marcus P; Ryan, Colm A; Laflamme, Raymond

    2012-08-17

    Because of the technical difficulty of building large quantum computers, it is important to be able to estimate how faithful a given implementation is to an ideal quantum computer. The common approach of completely characterizing the computation process via quantum process tomography requires an exponential amount of resources, and thus is not practical even for relatively small devices. We solve this problem by demonstrating that twirling experiments previously used to characterize the average fidelity of quantum memories efficiently can be easily adapted to estimate the average fidelity of the experimental implementation of important quantum computation processes, such as unitaries in the Clifford group, in a practical and efficient manner with applicability in current quantum devices. Using this procedure, we demonstrate state-of-the-art coherent control of an ensemble of magnetic moments of nuclear spins in a single crystal solid by implementing the encoding operation for a 3-qubit code with only a 1% degradation in average fidelity discounting preparation and measurement errors. We also highlight one of the advances that was instrumental in achieving such high fidelity control.

  12. Computer-modeling codes to improve exploration nuclear-logging methods. National Uranium Resource Evaluation

    International Nuclear Information System (INIS)

    Wilson, R.D.; Price, R.K.; Kosanke, K.L.

    1983-03-01

    As part of the Department of Energy's National Uranium Resource Evaluation (NURE) project's Technology Development effort, a number of computer codes and accompanying data bases were assembled for use in modeling responses of nuclear borehole logging sondes. The logging methods include fission neutron, active and passive gamma-ray, and gamma-gamma. These CDC-compatible computer codes and data bases are available on magnetic tape from the DOE Technical Library at its Grand Junction Area Office. Some of the computer codes are standard radiation-transport programs that have been available to the radiation shielding community for several years. Other codes were specifically written to model the response of borehole radiation detectors or are specialized borehole modeling versions of existing Monte Carlo transport programs. Results from several radiation modeling studies are available as two large data bases (neutron and gamma-ray). These data bases are accompanied by appropriate processing programs that permit the user to model a wide range of borehole and formation-parameter combinations for fission-neutron, neutron-activation, and gamma-gamma logs. The first part of this report consists of a brief abstract for each code or data base. The abstract gives the code name and title, short description, auxiliary requirements, typical running time (CDC 6600), and a list of references. The next section gives format specifications and/or directory for the tapes. The final section of the report presents listings for programs used to convert data bases between machine floating-point and EBCDIC.

  13. Fan-out Estimation in Spin-based Quantum Computer Scale-up.

    Science.gov (United States)

    Nguyen, Thien; Hill, Charles D; Hollenberg, Lloyd C L; James, Matthew R

    2017-10-17

    Solid-state spin-based qubits offer good prospects for scaling based on their long coherence times and nexus to large-scale electronic scale-up technologies. However, high-threshold quantum error correction requires a two-dimensional qubit array operating in parallel, posing significant challenges in fabrication and control. While architectures incorporating distributed quantum control meet this challenge head-on, most designs rely on individual control and readout of all qubits with high gate densities. We analysed the fan-out routing overhead of a dedicated control line architecture, basing the analysis on a generalised solid-state spin qubit platform parameterised to encompass Coulomb confined (e.g. donor based spin qubits) or electrostatically confined (e.g. quantum dot based spin qubits) implementations. The spatial scalability under this model is estimated using standard electronic routing methods and present-day fabrication constraints. Based on reasonable assumptions for qubit control and readout we estimate 10^2-10^5 physical qubits, depending on the quantum interconnect implementation, can be integrated and fanned-out independently. Assuming relatively long control-free interconnects the scalability can be extended. Ultimately, universal quantum computation may necessitate a much higher number of integrated qubits, indicating that higher dimensional electronics fabrication and/or multiplexed distributed control and readout schemes may be the preferred strategy for large-scale implementation.

  14. Estimating social carrying capacity through computer simulation modeling: an application to Arches National Park, Utah

    Science.gov (United States)

    Benjamin Wang; Robert E. Manning; Steven R. Lawson; William A. Valliere

    2001-01-01

    Recent research and management experience has led to several frameworks for defining and managing carrying capacity of national parks and related areas. These frameworks rely on monitoring indicator variables to ensure that standards of quality are maintained. The objective of this study was to develop a computer simulation model to estimate the relationships between...

  15. CLOUD COMPUTING OVERVIEW AND CHALLENGES: A REVIEW PAPER

    OpenAIRE

    Satish Kumar*, Vishal Thakur, Payal Thakur, Ashok Kumar Kashyap

    2017-01-01

    The cloud computing era is the most resourceful, elastic and scalable period for internet technology, allowing computing resources to be used over the internet successfully. Cloud computing provides not only speed, accuracy, storage capacity and efficiency for computing, but has also helped propagate green computing and better resource utilization. In this research paper, a brief description of cloud computing, cloud services and cloud security challenges is given. Also the literature review o...

  16. Spaceborne computer executive routine functional design specification. Volume 1: Functional design of a flight computer executive program for the reusable shuttle

    Science.gov (United States)

    Curran, R. T.

    1971-01-01

    A flight computer functional executive design for the reusable shuttle is presented. The design is given in the form of functional flowcharts and prose description. Techniques utilized in the regulation of process flow to accomplish activation, resource allocation, suspension, termination, and error masking based on process primitives are considered. Preliminary estimates of main storage utilization by the Executive are furnished. Conclusions and recommendations for timely, effective software-hardware integration in the reusable shuttle avionics system are proposed.

  17. Estimation of computed tomography dose index in cone beam computed tomography: MOSFET measurements and Monte Carlo simulations.

    Science.gov (United States)

    Kim, Sangroh; Yoshizumi, Terry; Toncheva, Greta; Yoo, Sua; Yin, Fang-Fang; Frush, Donald

    2010-05-01

    To address the lack of an accurate dose estimation method in cone beam computed tomography (CBCT), we performed point-dose metal oxide semiconductor field-effect transistor (MOSFET) measurements and Monte Carlo (MC) simulations. A Varian On-Board Imager (OBI) was employed to measure point doses in the polymethyl methacrylate (PMMA) CT phantoms with MOSFETs for standard and low dose modes. An MC model of the OBI x-ray tube was developed using the BEAMnrc/EGSnrc MC system and validated by the half value layer, x-ray spectrum and lateral and depth dose profiles. We compared the weighted computed tomography dose index (CTDIw) between MOSFET measurements and MC simulations. The CTDIw was found to be 8.39 cGy for the head scan and 4.58 cGy for the body scan from the MOSFET measurements in standard dose mode, and 1.89 cGy for the head and 1.11 cGy for the body in low dose mode, respectively. The CTDIw from MC agreed with the MOSFET measurements to within 5%. In conclusion, an MC model for Varian CBCT has been established and this approach may be easily extended from the CBCT geometry to multi-detector CT geometry.
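
    The weighted CTDI reported above is conventionally combined from centre and periphery point doses as CTDIw = (1/3)*CTDIcentre + (2/3)*CTDIperiphery; the sketch below applies this standard formula to hypothetical phantom readings.

        def ctdi_w(ctdi_center, ctdi_periphery):
            """Weighted CTDI from point doses measured at the centre and at the
            periphery of the PMMA phantom: CTDIw = (1/3)*centre + (2/3)*periphery."""
            return ctdi_center / 3.0 + 2.0 * ctdi_periphery / 3.0

        # Hypothetical head-phantom point doses (cGy); the periphery value would
        # normally be the average of the four peripheral measurement positions.
        print(f"CTDIw = {ctdi_w(7.9, 8.6):.2f} cGy")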

  18. cloudPEST - A python module for cloud-computing deployment of PEST, a program for parameter estimation

    Science.gov (United States)

    Fienen, Michael N.; Kunicki, Thomas C.; Kester, Daniel E.

    2011-01-01

    This report documents cloudPEST, a Python module with functions to facilitate deployment of the model-independent parameter estimation code PEST on a cloud-computing environment. cloudPEST makes use of low-level, freely available command-line tools that interface with the Amazon Elastic Compute Cloud (EC2™) and that are unlikely to change dramatically. This report describes the preliminary setup for both Python and EC2 tools and subsequently describes the functions themselves. The code and guidelines have been tested primarily on the Windows® operating system but are extensible to Linux®.

  19. Unconventional energy resources in a crowded subsurface: Reducing uncertainty and developing a separation zone concept for resource estimation and deep 3D subsurface planning using legacy mining data.

    Science.gov (United States)

    Monaghan, Alison A

    2017-12-01

    Over significant areas of the UK and western Europe, anthropogenic alteration of the subsurface by mining of coal has occurred beneath highly populated areas which are now considering a multiplicity of 'low carbon' unconventional energy resources including shale gas and oil, coal bed methane, geothermal energy and energy storage. To enable decision making on the 3D planning, licensing and extraction of these resources requires reduced uncertainty around complex geology and hydrogeological and geomechanical processes. An exemplar from the Carboniferous of central Scotland, UK, illustrates how, in areas lacking hydrocarbon well production data and 3D seismic surveys, legacy coal mine plans and associated boreholes provide valuable data that can be used to reduce the uncertainty around geometry and faulting of subsurface energy resources. However, legacy coal mines also limit unconventional resource volumes since mines and associated shafts alter the stress and hydrogeochemical state of the subsurface, commonly forming pathways to the surface. To reduce the risk of subsurface connections between energy resources, an example of an adapted methodology is described for shale gas/oil resource estimation to include a vertical separation or 'stand-off' zone between the deepest mine workings, to ensure the hydraulic fracturing required for shale resource production would not intersect legacy coal mines. Whilst the size of such separation zones requires further work, developing the concept of 3D spatial separation and planning is key to utilising the crowded subsurface energy system, whilst mitigating against resource sterilisation and environmental impacts, and could play a role in positively informing public and policy debate. Copyright © 2017 British Geological Survey, a component institute of NERC. Published by Elsevier B.V. All rights reserved.

  20. Different methods for the estimation of available water resources in the future under the influence of climate changes

    Science.gov (United States)

    Majkic-Dursun, B.; Boreli-Zdravkovic, Dj.; Djuric, D.

    2012-04-01

    The paper analyzes different approaches for the calculation of available water resources under the influence of climate change (CC), for the cases of drinking water sources in the alluviums of the Sava River (Belgrade GW source) and the Nišava River (Mediana GW source). The different types of analyzed sources (bank-filtered and artificially recharged) required different approaches, adjusted to their specific characteristics. The Belgrade GW source (capacity of 4-5 m3/s) comprises 99 horizontal wells and over 40 tube wells positioned along 50 km of the alluvial plain of the most downstream Sava River banks. Deep parts of the water-bearing complex consist of river-lacustrine polycyclic sediments (from sandy gravels to silts), while the upper part consists of alluvial sediments. The main recharge stems from the Sava River by the bank filtration process, while, due to the layering of the aquifer, recharge from the hinterland in some river bank sections reaches up to 30%. The test area covers 240 km2 of the Sava river valley. Future water availability has to be calculated according to the "new" expected boundary conditions, the vertical water balance of the test area and "estimated" river water fluctuations. The artificially recharged GW source "Mediana" provides water supply to the City of Niš as one of 6 water supply sources. The concept of this groundwater source is based on surface water abstraction from the Nišava River (catchment area of 4,086 km2 in total, of which 1,096 km2 is in Bulgaria), which is transported to infiltration lakes after a pre-treatment process. Once in the infiltration lake, the water is infiltrated into the aquifer and abstracted by wells, or collected by a drainage system. This site was used for the analysis of the impacts of climate changes on the discharge of the Nišava River, since the river feeds the aquifer through infiltration lakes (approx. 95-98%) after surface water pretreatment. Estimation of available water resources was done for the period until 2100 under the A1B climate scenario. Climate

  1. Everglades Depth Estimation Network (EDEN)—A decade of serving hydrologic information to scientists and resource managers

    Science.gov (United States)

    Patino, Eduardo; Conrads, Paul; Swain, Eric; Beerens, James M.

    2017-10-30

    IntroductionThe Everglades Depth Estimation Network (EDEN) provides scientists and resource managers with regional maps of daily water levels and depths in the freshwater part of the Greater Everglades landscape. The EDEN domain includes all or parts of five Water Conservation Areas, Big Cypress National Preserve, Pennsuco Wetlands, and Everglades National Park. Daily water-level maps are interpolated from water-level data at monitoring gages, and depth is estimated by using a digital elevation model of the land surface. Online datasets provide time series of daily water levels at gages and rainfall and evapotranspiration data (https://sofia.usgs.gov/eden/). These datasets are used by scientists and resource managers to guide large-scale field operations, describe hydrologic changes, and support biological and ecological assessments that measure ecosystem response to the implementation of the Comprehensive Everglades Restoration Plan. EDEN water-level data have been used in a variety of biological and ecological studies including (1) the health of American alligators as a function of water depth, (2) the variability of post-fire landscape dynamics in relation to water depth, (3) the habitat quality for wading birds with dynamic habitat selection, and (4) an evaluation of the habitat of the Cape Sable seaside sparrow.

  2. Dynamic resource allocation engine for cloud-based real-time video transcoding in mobile cloud computing environments

    Science.gov (United States)

    Adedayo, Bada; Wang, Qi; Alcaraz Calero, Jose M.; Grecos, Christos

    2015-02-01

    The recent explosion in video-related Internet traffic has been driven by the widespread use of smart mobile devices, particularly smartphones with advanced cameras that are able to record high-quality videos. Although many of these devices offer the facility to record videos at different spatial and temporal resolutions, primarily with local storage considerations in mind, most users only ever use the highest quality settings. The vast majority of these devices are optimised for compressing the acquired video using a single built-in codec and have neither the computational resources nor battery reserves to transcode the video to alternative formats. This paper proposes a new low-complexity dynamic resource allocation engine for cloud-based video transcoding services that are both scalable and capable of being delivered in real-time. Firstly, through extensive experimentation, we establish resource requirement benchmarks for a wide range of transcoding tasks. The set of tasks investigated covers the most widely used input formats (encoder type, resolution, amount of motion and frame rate) associated with mobile devices and the most popular output formats derived from a comprehensive set of use cases, e.g. a mobile news reporter directly transmitting videos to the TV audience of various video format requirements, with minimal usage of resources both at the reporter's end and at the cloud infrastructure end for transcoding services.

  3. ProjectQ: An Open Source Software Framework for Quantum Computing

    OpenAIRE

    Steiger, Damian S.; Häner, Thomas; Troyer, Matthias

    2016-01-01

    We introduce ProjectQ, an open source software effort for quantum computing. The first release features a compiler framework capable of targeting various types of hardware, a high-performance simulator with emulation capabilities, and compiler plug-ins for circuit drawing and resource estimation. We introduce our Python-embedded domain-specific language, present the features, and provide example implementations for quantum algorithms. The framework allows testing of quantum algorithms through...
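
    A minimal example of the resource estimation feature mentioned above, using ProjectQ's ResourceCounter backend, is sketched below; the attribute names reflect the documented API and should be checked against the installed release.

        from projectq import MainEngine
        from projectq.backends import ResourceCounter
        from projectq.ops import All, CNOT, H, Measure

        # Count gates and qubit width of a small circuit instead of simulating it.
        counter = ResourceCounter()
        eng = MainEngine(backend=counter)

        qubits = eng.allocate_qureg(2)
        H | qubits[0]
        CNOT | (qubits[0], qubits[1])
        All(Measure) | qubits
        eng.flush()

        print(counter.gate_counts)   # how many H, CX and Measure gates were used
        print(counter.max_width)     # maximum number of simultaneously allocated qubits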

  4. Find-rate methodology and resource base estimates of the Hydrocarbon Supply Model (1990 update). Topical report

    International Nuclear Information System (INIS)

    Woods, T.

    1991-02-01

    The Hydrocarbon Supply Model is used to develop long-term trends in Lower-48 gas production and costs. The model utilizes historical find-rate patterns to predict the discovery rate and size distribution of future oil and gas field discoveries. The report documents the methodologies used to quantify historical oil and gas field find-rates and to project those discovery patterns for future drilling. It also explains the theoretical foundations for the find-rate approach. The new field and reserve growth resource base is documented and compared to other published estimates. The report has six sections. Section 1 provides background information and an overview of the model. Sections 2, 3, and 4 describe the theoretical foundations of the model, the databases, and specific techniques used. Section 5 presents the new field resource base by region and depth. Section 6 documents the reserve growth model components

  5. TethysCluster: A comprehensive approach for harnessing cloud resources for hydrologic modeling

    Science.gov (United States)

    Nelson, J.; Jones, N.; Ames, D. P.

    2015-12-01

    Advances in water resources modeling are improving the information that can be supplied to support decisions affecting the safety and sustainability of society. However, as water resources models become more sophisticated and data-intensive they require more computational power to run. Purchasing and maintaining the computing facilities needed to support certain modeling tasks has been cost-prohibitive for many organizations. With the advent of the cloud, the computing resources needed to address this challenge are now available and cost-effective, yet there still remains a significant technical barrier to leverage these resources. This barrier inhibits many decision makers and even trained engineers from taking advantage of the best science and tools available. Here we present the Python tools TethysCluster and CondorPy, that have been developed to lower the barrier to model computation in the cloud by providing (1) programmatic access to dynamically scalable computing resources, (2) a batch scheduling system to queue and dispatch the jobs to the computing resources, (3) data management for job inputs and outputs, and (4) the ability to dynamically create, submit, and monitor computing jobs. These Python tools leverage the open source, computing-resource management, and job management software, HTCondor, to offer a flexible and scalable distributed-computing environment. While TethysCluster and CondorPy can be used independently to provision computing resources and perform large modeling tasks, they have also been integrated into Tethys Platform, a development platform for water resources web apps, to enable computing support for modeling workflows and decision-support systems deployed as web apps.

  6. A resource management architecture for metacomputing systems.

    Energy Technology Data Exchange (ETDEWEB)

    Czajkowski, K.; Foster, I.; Karonis, N.; Kesselman, C.; Martin, S.; Smith, W.; Tuecke, S.

    1999-08-24

    Metacomputing systems are intended to support remote and/or concurrent use of geographically distributed computational resources. Resource management in such systems is complicated by five concerns that do not typically arise in other situations: site autonomy and heterogeneous substrates at the resources, and application requirements for policy extensibility, co-allocation, and online control. We describe a resource management architecture that addresses these concerns. This architecture distributes the resource management problem among distinct local manager, resource broker, and resource co-allocator components and defines an extensible resource specification language to exchange information about requirements. We describe how these techniques have been implemented in the context of the Globus metacomputing toolkit and used to implement a variety of different resource management strategies. We report on our experiences applying our techniques in a large testbed, GUSTO, incorporating 15 sites, 330 computers, and 3600 processors.

  7. Preoperative computed tomography volumetry and graft weight estimation in adult living donor liver transplantation

    Energy Technology Data Exchange (ETDEWEB)

    Pinheiro, Rafael S.; Cruz Junior, Ruy J.; Andraus, Wellington; Ducatti, Liliana; Martino, Rodrigo B.; Nacif, Lucas S.; Rocha-Santos, Vinicius; Arantes, Rubens M.; D' Albuquerque, Luiz A.C., E-mail: rsnpinheiro@gmail.com [Universidade de Sao Paulo (USP), SP (Brazil). Dept. de Gastroenterologia. Div. de Transplante de Orgaos do Aparelho Digestivo; Lai, Quirino [Universidade de L' Aquila, San Salvatore Hospital (Italy); Ibuki, Felicia S.; Rocha, Manoel S. [Universidade de Sao Paulo (USP), SP (Brazil). Departamento de Radiologia

    2017-09-01

    Background: Computed tomography volumetry (CTV) is a useful tool for predicting graft weights (GW) for living donor liver transplantation (LDLT). Few studies have examined the correlation between CTV and GW in normal liver parenchyma. Aim: To analyze the correlation between CTV and GW in an adult LDLT population and provide a systematic review of the existing mathematical models to calculate partial liver graft weight. Methods: Between January 2009 and January 2013, 28 consecutive donors undergoing right hepatectomy for LDLT were retrospectively reviewed. All grafts were perfused with HTK solution. Graft volume was estimated by CTV and these values were compared to the actual graft weight, which was measured after liver harvesting and perfusion. Results: Median actual GW was 782.5 g, averaged 791.43±136 g and ranged from 520-1185 g. Median estimated graft volume was 927.5 ml, averaged 944.86±200.74 ml and ranged from 600-1477 ml. Linear regression of estimated graft volume and actual GW was significantly linear (GW=0.82 estimated graft volume, r² = 0.98, slope=0.47, standard deviation of 0.024 and p<0.0001). Spearman linear correlation was 0.65 with 95% CI of 0.45 – 0.99 (p<0.0001). Conclusion: The one-to-one rule did not apply in patients with normal liver parenchyma. A better estimation of graft weight could be reached by multiplying estimated graft volume by 0.82. (author)

  8. Preoperative computed tomography volumetry and graft weight estimation in adult living donor liver transplantation

    International Nuclear Information System (INIS)

    Pinheiro, Rafael S.; Cruz Junior, Ruy J.; Andraus, Wellington; Ducatti, Liliana; Martino, Rodrigo B.; Nacif, Lucas S.; Rocha-Santos, Vinicius; Arantes, Rubens M.; D'Albuquerque, Luiz A.C.; Ibuki, Felicia S.; Rocha, Manoel S.

    2017-01-01

    Background: Computed tomography volumetry (CTV) is a useful tool for predicting graft weights (GW) for living donor liver transplantation (LDLT). Few studies have examined the correlation between CTV and GW in normal liver parenchyma. Aim: To analyze the correlation between CTV and GW in an adult LDLT population and provide a systematic review of the existing mathematical models to calculate partial liver graft weight. Methods: Between January 2009 and January 2013, 28 consecutive donors undergoing right hepatectomy for LDLT were retrospectively reviewed. All grafts were perfused with HTK solution. Graft volume was estimated by CTV and these values were compared to the actual graft weight, which was measured after liver harvesting and perfusion. Results: Median actual GW was 782.5 g, averaged 791.43±136 g and ranged from 520-1185 g. Median estimated graft volume was 927.5 ml, averaged 944.86±200.74 ml and ranged from 600-1477 ml. Linear regression of estimated graft volume and actual GW was significantly linear (GW=0.82 estimated graft volume, r² = 0.98, slope=0.47, standard deviation of 0.024 and p<0.0001). Spearman linear correlation was 0.65 with 95% CI of 0.45 – 0.99 (p<0.0001). Conclusion: The one-to-one rule did not apply in patients with normal liver parenchyma. A better estimation of graft weight could be reached by multiplying estimated graft volume by 0.82. (author)

  9. PREOPERATIVE COMPUTED TOMOGRAPHY VOLUMETRY AND GRAFT WEIGHT ESTIMATION IN ADULT LIVING DONOR LIVER TRANSPLANTATION

    Science.gov (United States)

    PINHEIRO, Rafael S.; CRUZ-JR, Ruy J.; ANDRAUS, Wellington; DUCATTI, Liliana; MARTINO, Rodrigo B.; NACIF, Lucas S.; ROCHA-SANTOS, Vinicius; ARANTES, Rubens M; LAI, Quirino; IBUKI, Felicia S.; ROCHA, Manoel S.; D´ALBUQUERQUE, Luiz A. C.

    2017-01-01

    ABSTRACT Background: Computed tomography volumetry (CTV) is a useful tool for predicting graft weights (GW) for living donor liver transplantation (LDLT). Few studies have examined the correlation between CTV and GW in normal liver parenchyma. Aim: To analyze the correlation between CTV and GW in an adult LDLT population and provide a systematic review of the existing mathematical models to calculate partial liver graft weight. Methods: Between January 2009 and January 2013, 28 consecutive donors undergoing right hepatectomy for LDLT were retrospectively reviewed. All grafts were perfused with HTK solution. Graft volume was estimated by CTV and these values were compared to the actual graft weight, which was measured after liver harvesting and perfusion. Results: Median actual GW was 782.5 g, averaged 791.43±136 g and ranged from 520-1185 g. Median estimated graft volume was 927.5 ml, averaged 944.86±200.74 ml and ranged from 600-1477 ml. Linear regression of estimated graft volume and actual GW was significantly linear (GW=0.82 estimated graft volume, r² = 0.98, slope=0.47, standard deviation of 0.024 and p<0.0001). Spearman linear correlation was 0.65 with 95% CI of 0.45 - 0.99 (p<0.0001). Conclusion: The one-to-one rule did not apply in patients with normal liver parenchyma. A better estimation of graft weight could be reached by multiplying estimated graft volume by 0.82. PMID:28489167
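
    The regression reported in these abstracts can be applied directly; the sketch below encodes GW = 0.82 × estimated graft volume and evaluates it at the study's median CT volume.

        def estimated_graft_weight(ct_volume_ml, factor=0.82):
            """Graft weight (g) from CT volumetry (ml) using the linear relation
            reported in the abstracts above (GW = 0.82 x estimated graft volume)."""
            return factor * ct_volume_ml

        # Median estimated graft volume from the study was 927.5 ml.
        print(f"predicted graft weight ~ {estimated_graft_weight(927.5):.0f} g")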

  10. COMPUTATIONAL TOXICOLOGY-WHERE IS THE DATA? ...

    Science.gov (United States)

    This talk will briefly describe the state of the data world for computational toxicology and one approach to improve the situation, called ACToR (Aggregated Computational Toxicology Resource).

  11. Parameter estimation and inverse problems

    CERN Document Server

    Aster, Richard C; Thurber, Clifford H

    2005-01-01

    Parameter Estimation and Inverse Problems primarily serves as a textbook for advanced undergraduate and introductory graduate courses. Class notes have been developed and reside on the World Wide Web for facilitating use and feedback by teaching colleagues. The authors' treatment promotes an understanding of fundamental and practical issues associated with parameter fitting and inverse problems, including basic theory of inverse problems, statistical issues, computational issues, and an understanding of how to analyze the success and limitations of solutions to these problems. The text is also a practical resource for general students and professional researchers, where techniques and concepts can be readily picked up on a chapter-by-chapter basis. Parameter Estimation and Inverse Problems is structured around a course at New Mexico Tech and is designed to be accessible to typical graduate students in the physical sciences who may not have an extensive mathematical background. It is accompanied by a Web site that...

  12. Hemoglobin estimation by the HemoCue® portable hemoglobin photometer in a resource poor setting

    Directory of Open Access Journals (Sweden)

    Idriss Ali

    2011-04-01

    Full Text Available Abstract Background In resource poor settings where automated hematology analyzers are not available, the Cyanmethemoglobin method is often used. This method, though cheaper, takes more time. In blood donations, the semi-quantitative gravimetric copper sulfate method, which is very easy and inexpensive, may be used but does not provide an acceptable degree of accuracy. The HemoCue® hemoglobin photometer has been used for these purposes. This study was conducted to generate data to support or refute its use as a point-of-care device for hemoglobin estimation in mobile blood donations and critical care areas in health facilities. Method EDTA blood was collected from study participants drawn from five groups: pre-school children, school children, pregnant women, non-pregnant women and men. Blood collected was immediately processed to estimate the hemoglobin concentration using three different methods (HemoCue®, Sysmex KX21N and Cyanmethemoglobin). Agreement between the test methods was assessed by the method of Bland and Altman. The Intraclass correlation coefficient (ICC) was used to determine the within-subject variability of measured hemoglobin. Results Of 398 subjects, 42% were males with the overall mean age being 19.4 years. The overall mean hemoglobin as estimated by each method was 10.4 g/dl for HemoCue, 10.3 g/dl for Sysmex KX21N and 10.3 g/dl for Cyanmethemoglobin. Pairwise analysis revealed that the hemoglobin determined by the HemoCue method was higher than that measured by the KX21N and Cyanmethemoglobin. Comparing the hemoglobin determined by the HemoCue to Cyanmethemoglobin, the concordance correlation coefficient was 0.995 (95% CI: 0.994-0.996). Conclusion Hemoglobin determined by the HemoCue method is comparable to that determined by the other methods. The HemoCue photometer is therefore recommended for use as an on-the-spot device for determining hemoglobin in resource-poor settings.

  13. COMPARATIVE STUDY OF CLOUD COMPUTING AND MOBILE CLOUD COMPUTING

    OpenAIRE

    Nidhi Rajak*, Diwakar Shukla

    2018-01-01

    The present era is one of Information and Communication Technology (ICT), and a number of research efforts are ongoing in Cloud Computing and Mobile Cloud Computing, addressing security issues, data management, load balancing and so on. Cloud computing provides services to the end user over the Internet, and the primary objectives of this computing are resource sharing and pooling among the end users. Mobile Cloud Computing is a combination of Cloud Computing and Mobile Computing. Here, data is stored in...

  14. Economic filters for evaluating porphyry copper deposit resource assessments using grade-tonnage deposit models, with examples from the U.S. Geological Survey global mineral resource assessment: Chapter H in Global mineral resource assessment

    Science.gov (United States)

    Robinson, Gilpin R.; Menzie, W. David

    2012-01-01

    An analysis of the amount and location of undiscovered mineral resources that are likely to be economically recoverable is important for assessing the long-term adequacy and availability of mineral supplies. This requires an economic evaluation of estimates of undiscovered resources generated by traditional resource assessments (Singer and Menzie, 2010). In this study, simplified engineering cost models were used to estimate the economic fraction of resources contained in undiscovered porphyry copper deposits, predicted in a global assessment of copper resources. The cost models of Camm (1991) were updated with a cost index to reflect increases in mining and milling costs since 1989. The updated cost models were used to perform an economic analysis of undiscovered resources estimated in porphyry copper deposits in six tracts located in North America. The assessment estimated undiscovered porphyry copper deposits within 1 kilometer of the land surface in three depth intervals.
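
    As a heavily simplified sketch of what such an economic filter does (placeholder arithmetic, not the Camm (1991) cost models or the updated cost index used in the study), a deposit can be flagged as potentially economic by escalating an old unit-cost estimate with a cost index and comparing the total cost with the value of the contained metal; every number and parameter name below is invented.

      def economically_recoverable(tonnage_t, grade_pct, metal_price_per_t,
                                   unit_cost_1989_per_t, cost_index):
          """Crude economic filter; placeholder logic, not the study's cost models."""
          unit_cost_now = unit_cost_1989_per_t * cost_index          # escalate 1989 costs
          contained_metal_value = tonnage_t * (grade_pct / 100.0) * metal_price_per_t
          total_cost = tonnage_t * unit_cost_now
          return contained_metal_value > total_cost

      # Hypothetical porphyry deposit: 200 Mt at 0.45% Cu
      print(economically_recoverable(200e6, 0.45, 8000.0, 6.0, 2.1))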

  15. Using Multiple Seasonal Holt-Winters Exponential Smoothing to Predict Cloud Resource Provisioning

    OpenAIRE

    Ashraf A. Shahin

    2016-01-01

    Elasticity is one of the key features of cloud computing that attracts many SaaS providers seeking to minimize the cost of their services. Cost is minimized by automatically provisioning and releasing computational resources depending on actual computational needs. However, the delay in starting up new virtual resources can cause Service Level Agreement violations. Consequently, predicting cloud resource provisioning has gained a lot of attention as a way to scale computational resources in advance. However, most of current approac...
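
    A minimal forecasting sketch in the spirit of the record, simplified to a single daily seasonality (the paper's method is multi-seasonal) and fed with a synthetic CPU-demand series rather than real workload traces; it uses the statsmodels Holt-Winters implementation.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.holtwinters import ExponentialSmoothing

      # Synthetic hourly CPU demand with a 24-hour cycle; not real workload data.
      hours = 24 * 14
      t = np.arange(hours)
      demand = (100 + 30 * np.sin(2 * np.pi * t / 24)
                + np.random.default_rng(1).normal(0, 5, hours))
      series = pd.Series(demand,
                         index=pd.date_range("2016-01-01", periods=hours, freq="h"))

      model = ExponentialSmoothing(series, trend="add", seasonal="add",
                                   seasonal_periods=24).fit()
      forecast = model.forecast(24)   # demand for the next day, to provision ahead of time
      print(forecast.round(1).head())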

  16. An operational weather radar-based Quantitative Precipitation Estimation and its application in catchment water resources modeling

    DEFF Research Database (Denmark)

    He, Xin; Vejen, Flemming; Stisen, Simon

    2011-01-01

    ... of precipitation compared with rain-gauge-based methods, thus providing the basis for better water resources assessments. The radar QPE algorithm called ARNE is a distance-dependent areal estimation method that merges radar data with ground surface observations. The method was applied to the Skjern River catchment ... in western Denmark where alternative precipitation estimates were also used as input to an integrated hydrologic model. The hydrologic responses from the model were analyzed by comparing radar- and ground-based precipitation input scenarios. Results showed that radar QPE products are able to generate ... reliable simulations of stream flow and water balance. The potential of using radar-based precipitation was found to be especially high at a smaller scale, where the impact of spatial resolution was evident from the stream discharge results. Also, groundwater recharge was shown to be sensitive...
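
    The record does not spell out the ARNE algorithm, so the sketch below shows only a generic gauge adjustment of a radar field: an inverse-distance-weighted multiplicative bias correction. The function name, arguments and weighting scheme are illustrative assumptions, not the operational method.

      import numpy as np

      def merge_radar_gauges(radar_grid, grid_xy, gauge_xy, gauge_obs,
                             radar_at_gauges, power=2.0):
          """Adjust a radar QPE field toward rain gauges using inverse-distance-weighted
          gauge/radar bias factors. Generic illustration only, not the ARNE algorithm."""
          ratios = gauge_obs / np.maximum(radar_at_gauges, 0.1)   # bias factor per gauge
          adjusted = np.empty_like(radar_grid)
          for i, (x, y) in enumerate(grid_xy):
              dist = np.hypot(gauge_xy[:, 0] - x, gauge_xy[:, 1] - y)
              weights = 1.0 / np.maximum(dist, 1e-6) ** power
              adjusted[i] = radar_grid[i] * np.average(ratios, weights=weights)
          return adjusted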

  17. Estimation of intermediate-grade uranium resources II. Proposed method for estimating intermediate-grade uranium resources in roll-front deposits. Final report

    International Nuclear Information System (INIS)

    Lambie, F.W.; Yee, S.N.

    1981-09-01

    The purpose of this and a previous project was to examine the feasibility of estimating intermediate grade uranium (0.01 to 0.05% U3O8) on the basis of existing, sparsely drilled holes. All data are from the Powder River Basin in Wyoming. DOE makes preliminary estimates of endowment by calculating an Average Area of Influence (AAI) based on densely drilled areas, multiplying that by the thickness of the mineralization and then dividing by a tonnage factor. The resulting tonnage of ore is then multiplied by the average grade of the interval to obtain the estimate of U3O8 tonnage. Total endowment is the sum of these values over all mineralized intervals in all wells in the area. In regions where wells are densely drilled and approximately regularly spaced, this technique approaches the classical polygonal estimation technique used to estimate ore reserves and should be fairly reliable. The method is conservative because: (1) in sparsely drilled regions a large fraction of the area is not considered to contribute to endowment; (2) there is a bias created by the different distributions of point grades and mining block grades. A conservative approach may be justified for purposes of ore reserve estimation, where large investments may hinge on local forecasts. But for estimates of endowment over areas as large as 1° by 2° quadrangles, or the nation as a whole, errors in local predictions are not critical as long as they tend to cancel, and a less conservative estimation approach may be justified. One candidate, developed for this study and described here, is called the contoured thickness technique. A comparison of estimates based on the contoured thickness approach with DOE calculations for five areas of Wyoming roll-fronts in the Powder River Basin is presented. The sensitivity of the technique to well density is examined, and the question of predicting intermediate grade endowment from data on higher grades is discussed.
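
    The area-of-influence arithmetic quoted above translates directly into code; the interval values below (AAI, thickness, tonnage factor, grade) are invented for illustration and are not Powder River Basin data.

      def interval_endowment(aai_sqft, thickness_ft, tonnage_factor, avg_grade_pct):
          """Tons U3O8 for one interval: (AAI * thickness / tonnage factor) * grade."""
          ore_tons = aai_sqft * thickness_ft / tonnage_factor   # tonnage factor in ft^3/ton
          return ore_tons * (avg_grade_pct / 100.0)

      # Hypothetical intervals: (AAI ft^2, thickness ft, tonnage factor ft^3/ton, grade % U3O8)
      intervals = [(40_000, 6.0, 15.0, 0.03), (25_000, 4.5, 15.0, 0.02)]
      total = sum(interval_endowment(*iv) for iv in intervals)
      print(f"total endowment: {total:.0f} tons U3O8")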

  18. SOCR: Statistics Online Computational Resource

    OpenAIRE

    Dinov, Ivo D.

    2006-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis...

  19. METRIC CHARACTERISTICS OF VARIOUS METHODS FOR NUMERICAL DENSITY ESTIMATION IN TRANSMISSION LIGHT MICROSCOPY – A COMPUTER SIMULATION

    Directory of Open Access Journals (Sweden)

    Miroslav Kališnik

    2011-05-01

    Full Text Available In the introduction, the evolution of methods for numerical density estimation of particles is briefly presented. Three pairs of methods have been analysed and compared: (1) classical methods for particle counting in thin and thick sections, (2) original and modified differential counting methods, and (3) physical and optical disector methods. Metric characteristics such as accuracy, efficiency, robustness, and feasibility of the methods have been estimated and compared. Logical, geometrical and mathematical analysis as well as computer simulations have been applied. In the computer simulations, a model of randomly distributed equal spheres with maximal contrast against the surroundings has been used. According to our computer simulations, all methods give accurate results provided that the sample is representative and sufficiently large. However, there are differences in their efficiency, robustness and feasibility. Efficiency and robustness increase with increasing slice thickness in all three pairs of methods. Robustness is superior in both differential and both disector methods compared to both classical methods. Feasibility can be judged according to the additional equipment as well as the histotechnical and counting procedures necessary for performing the individual counting methods. However, it is evident that not all practical problems can efficiently be solved with models.
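
    A compressed version of the kind of simulation described: equal spheres with uniformly random centres in a box, and a physical-disector estimate of numerical density compared with the true value. Box size, sphere radius, plane position and disector height are arbitrary, and edge effects are ignored.

      import numpy as np

      rng = np.random.default_rng(42)
      L, r, n = 100.0, 5.0, 2000                  # box edge, sphere radius, sphere count
      centers = rng.uniform(0, L, size=(n, 3))
      true_density = n / L**3

      # Physical disector: count spheres cut by the reference plane z = z0 but not by
      # the look-up plane z = z0 + h; the estimator is N_V = Q- / (A * h).
      z0, h = 40.0, 2.0
      hits_reference = np.abs(centers[:, 2] - z0) < r
      hits_lookup = np.abs(centers[:, 2] - (z0 + h)) < r
      q_minus = np.sum(hits_reference & ~hits_lookup)
      print("disector estimate:", q_minus / (L**2 * h), "  true density:", true_density)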

  20. Quantum key distribution with finite resources: Secret key rates via Renyi entropies

    Energy Technology Data Exchange (ETDEWEB)

    Abruzzo, Silvestre; Kampermann, Hermann; Mertz, Markus; Bruss, Dagmar [Institute for Theoretical Physics III, Heinrich-Heine-universitaet Duesseldorf, D-40225 Duesseldorf (Germany)

    2011-09-15

    A realistic quantum key distribution (QKD) protocol necessarily deals with finite resources, such as the number of signals exchanged by the two parties. We derive a bound on the secret key rate which is expressed as an optimization problem over Renyi entropies. Under the assumption of collective attacks by an eavesdropper, a computable estimate of our bound for the six-state protocol is provided. This bound leads to improved key rates in comparison to previous results.
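
    The key-rate bound itself is not reproduced here, but the Renyi entropy that enters such bounds is simple to compute; the probability distribution in the example is a placeholder.

      import numpy as np

      def renyi_entropy(p, alpha):
          """Renyi entropy H_alpha(p) = log2(sum_i p_i^alpha) / (1 - alpha), in bits."""
          p = np.asarray(p, dtype=float)
          p = p[p > 0]
          if np.isclose(alpha, 1.0):              # alpha -> 1 recovers the Shannon entropy
              return float(-(p * np.log2(p)).sum())
          return float(np.log2((p ** alpha).sum()) / (1.0 - alpha))

      print(renyi_entropy([0.5, 0.25, 0.25], alpha=2))   # placeholder distribution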

  1. Quantum key distribution with finite resources: Secret key rates via Renyi entropies

    International Nuclear Information System (INIS)

    Abruzzo, Silvestre; Kampermann, Hermann; Mertz, Markus; Bruss, Dagmar

    2011-01-01

    A realistic quantum key distribution (QKD) protocol necessarily deals with finite resources, such as the number of signals exchanged by the two parties. We derive a bound on the secret key rate which is expressed as an optimization problem over Renyi entropies. Under the assumption of collective attacks by an eavesdropper, a computable estimate of our bound for the six-state protocol is provided. This bound leads to improved key rates in comparison to previous results.

  2. Survey of engineering computational methods and experimental programs for estimating supersonic missile aerodynamic characteristics

    Science.gov (United States)

    Sawyer, W. C.; Allen, J. M.; Hernandez, G.; Dillenius, M. F. E.; Hemsch, M. J.

    1982-01-01

    This paper presents a survey of engineering computational methods and experimental programs used for estimating the aerodynamic characteristics of missile configurations. Emphasis is placed on those methods which are suitable for preliminary design of conventional and advanced concepts. An analysis of the technical approaches of the various methods is made in order to assess their suitability to estimate longitudinal and/or lateral-directional characteristics for different classes of missile configurations. Some comparisons between the predicted characteristics and experimental data are presented. These comparisons are made for a large variation in flow conditions and model attitude parameters. The paper also presents known experimental research programs developed for the specific purpose of validating analytical methods and extending the capability of data-base programs.

  3. Statistical model of global uranium resources and long-term availability

    International Nuclear Information System (INIS)

    Monnet, A.; Gabriel, S.; Percebois, J.

    2016-01-01

    Most recent studies on the long-term supply of uranium make simplistic assumptions on the available resources and their production costs. Some consider the whole uranium quantities in the Earth's crust and then estimate the production costs based on the ore grade only, disregarding the size of ore bodies and the mining techniques. Other studies consider the resources reported by countries for a given cost category, disregarding undiscovered or unreported quantities. In both cases, the resource estimations are sorted following a cost merit order. In this paper, we describe a methodology based on 'geological environments'. It provides a more detailed resource estimation and it is more flexible regarding cost modelling. The global uranium resource estimation introduced in this paper results from the sum of independent resource estimations from different geological environments. A geological environment is defined by its own geographical boundaries, resource dispersion (average grade and size of ore bodies and their variance), and cost function. With this definition, uranium resources are considered within ore bodies. The deposit breakdown of resources is modelled using a bivariate statistical approach where size and grade are the two random variables. This makes resource estimates possible for individual projects. Adding up all geological environments provides a distribution of all Earth's crust resources in which ore bodies are sorted by size and grade. This subset-based estimation is convenient to model specific cost structures. (authors)
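
    A toy Monte Carlo rendering of the 'geological environment' idea: ore-body tonnage and grade drawn from a bivariate lognormal distribution, contained uranium summed over simulated deposits and sorted by a crude unit-cost proxy. Every parameter is invented and the cost function is only a placeholder.

      import numpy as np

      rng = np.random.default_rng(7)

      def sample_environment(n_deposits, mean_log, cov_log):
          """Draw (tonnage Mt, grade %U) pairs from a bivariate lognormal; invented parameters."""
          logs = rng.multivariate_normal(mean_log, cov_log, size=n_deposits)
          return np.exp(logs[:, 0]), np.exp(logs[:, 1])

      tonnage_mt, grade_pct = sample_environment(500, mean_log=[2.0, -1.5],
                                                 cov_log=[[0.6, -0.1], [-0.1, 0.3]])
      uranium_t = tonnage_mt * 1e6 * grade_pct / 100.0    # contained uranium per deposit (t U)
      unit_cost = 50.0 / grade_pct                        # crude proxy: cost falls with grade
      order = np.argsort(unit_cost)                       # cheapest resources first
      cumulative = np.cumsum(uranium_t[order])
      print(f"simulated environment total: {cumulative[-1]:.0f} t U")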

  4. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    Energy Technology Data Exchange (ETDEWEB)

    Hules, J. [ed.

    1996-11-01

    National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout the US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  5. ESPRIT-like algorithm for computational-efficient angle estimation in bistatic multiple-input multiple-output radar

    Science.gov (United States)

    Gong, Jian; Lou, Shuntian; Guo, Yiduo

    2016-04-01

    An ESPRIT-like (estimation of signal parameters via rotational invariance techniques) algorithm is proposed to estimate the direction of arrival and direction of departure for bistatic multiple-input multiple-output (MIMO) radar. The properties of a noncircular signal and Euler's formula are first exploited to establish real-valued bistatic MIMO radar array data, composed of sine and cosine data. Then the receiving/transmitting selective matrices are constructed to obtain the receiving/transmitting rotational invariance factors. Since the rotational invariance factor is a cosine function, symmetrical mirror angle ambiguity may occur. Finally, a maximum likelihood function is used to avoid the estimation ambiguities. Compared with the existing ESPRIT, the proposed algorithm can save about 75% of the computational load owing to the real-valued ESPRIT algorithm. Simulation results confirm the effectiveness of the ESPRIT-like algorithm.
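
    The real-valued, noncircular-signal variant proposed in the record is not reconstructed here; the sketch below is only textbook complex-valued ESPRIT for a uniform linear array, showing the rotational-invariance step that the paper modifies. The array geometry and parameter names are generic assumptions.

      import numpy as np

      def esprit_doa(X, n_sources, d_over_lambda=0.5):
          """Textbook ESPRIT DOA estimation for a ULA (not the paper's real-valued variant).
          X: complex snapshot matrix of shape (n_elements, n_snapshots)."""
          R = X @ X.conj().T / X.shape[1]                  # sample covariance matrix
          _, eigvecs = np.linalg.eigh(R)                   # eigenvalues in ascending order
          Es = eigvecs[:, -n_sources:]                     # signal subspace
          Es1, Es2 = Es[:-1, :], Es[1:, :]                 # two shift-invariant subarrays
          Psi = np.linalg.pinv(Es1) @ Es2                  # rotational invariance relation
          phases = np.angle(np.linalg.eigvals(Psi))
          return np.degrees(np.arcsin(phases / (2 * np.pi * d_over_lambda)))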

  6. Software documentation and user's manual for fish-impingement sampling design and estimation method computer programs

    International Nuclear Information System (INIS)

    Murarka, I.P.; Bodeau, D.J.

    1977-11-01

    This report contains a description of three computer programs that implement the theory of sampling designs and the methods for estimating fish-impingement at the cooling-water intakes of nuclear power plants as described in companion report ANL/ES-60. Complete FORTRAN listings of these programs, named SAMPLE, ESTIMA, and SIZECO, are given and augmented with examples of how they are used

  7. Homeless Mentally Ill: Problems and Options in Estimating Numbers and Trends. Report to the Chairman, Committee on Labor and Human Resources, U.S. Senate.

    Science.gov (United States)

    General Accounting Office, Washington, DC. Program Evaluation and Methodology Div.

    In response to a request by the United States Senate Committee on Labor and Human Resources, the General Accounting Office (GAO) examined the methodological soundness of current population estimates of the number of homeless chronically mentally ill persons, and proposed several options for estimating the size of this population. The GAO reviewed…

  8. COPATH - a spreadsheet model for the estimation of carbon flows associated with the use of forest resources

    International Nuclear Information System (INIS)

    Makundi, W.; Sathaye, J.; Ketoff, A.

    1995-01-01

    The forest sector plays a key role in the global climate change process. A significant amount of net greenhouse gas emissions emanate from land use changes, and the sector offers a unique opportunity to sequester carbon in vegetation, detritus, soils and forest products. However, the estimates of carbon flows associated with the use of forest resources have been quite imprecise. This paper describes a methodological framework, COPATH, which is a spreadsheet model for estimating carbon emissions and sequestration from deforestation and harvesting of forests. The model has two parts: the first estimates carbon stocks, emissions and uptake in the base year, while the second part forecasts future emissions and uptake under various scenarios. The forecast module is structured after the main modes of forest conversion, i.e. agriculture, pasture, forest harvesting and other land uses. The model can be used by countries which may not possess an abundance of pertinent data, and allows for the use of forest inventory data to estimate carbon stocks. The choice of the most likely scenario provides the country with a carbon flux profile necessary to formulate GHG mitigation strategies. (Author)
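
    A compressed sketch of the base-year bookkeeping such a model performs: vegetation carbon stock as area times biomass density times carbon fraction, with a one-line deforestation-emission forecast. All coefficients are placeholders, and the real model also tracks detritus, soils and forest products.

      def base_year_carbon_stock(area_ha, biomass_t_per_ha, carbon_fraction=0.5):
          """Vegetation carbon stock (t C) = area * biomass density * carbon fraction."""
          return area_ha * biomass_t_per_ha * carbon_fraction

      def deforestation_emissions(area_cleared_ha, biomass_t_per_ha,
                                  carbon_fraction=0.5, released_share=0.8):
          """Placeholder forecast: a fixed share of cleared-forest carbon released as CO2-C."""
          return base_year_carbon_stock(area_cleared_ha, biomass_t_per_ha,
                                        carbon_fraction) * released_share

      stock = base_year_carbon_stock(1_000_000, 150)   # hypothetical forest area and biomass
      flux = deforestation_emissions(20_000, 150)      # hypothetical annual clearing
      print(f"base-year stock: {stock:.2e} t C, annual emissions: {flux:.2e} t C")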

  9. Estimation of the transboundary economic impacts of the Grand Ethiopia Renaissance Dam: A Computable General Equilibrium Analysis

    NARCIS (Netherlands)

    Kahsay, T.N.; Kuik, O.J.; Brouwer, R.; van der Zaag, P.

    2015-01-01

    Employing a multi-region multi-sector computable general equilibrium (CGE) modeling framework, this study estimates the direct and indirect economic impacts of the Grand Ethiopian Renaissance Dam (GERD) on the Eastern Nile economies. The study contributes to the existing literature by evaluating the

  10. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  11. Computable error estimates for Monte Carlo finite element approximation of elliptic PDE with lognormal diffusion coefficients

    KAUST Repository

    Hall, Eric

    2016-01-09

    The Monte Carlo (and Multi-level Monte Carlo) finite element method can be used to approximate observables of solutions to diffusion equations with lognormal distributed diffusion coefficients, e.g. modeling ground water flow. Typical models use lognormal diffusion coefficients with Hölder regularity of order up to 1/2 a.s. This low regularity implies that the high frequency finite element approximation error (i.e. the error from frequencies larger than the mesh frequency) is not negligible and can be larger than the computable low frequency error. We address how the total error can be estimated by the computable error.

  12. Quantum Accelerators for High-performance Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S. [ORNL; Britt, Keith A. [ORNL; Mohiyaddin, Fahd A. [ORNL

    2017-11-01

    We define some of the programming and system-level challenges facing the application of quantum processing to high-performance computing. Alongside barriers to physical integration, prominent differences in the execution of quantum and conventional programs challenges the intersection of these computational models. Following a brief overview of the state of the art, we discuss recent advances in programming and execution models for hybrid quantum-classical computing. We discuss a novel quantum-accelerator framework that uses specialized kernels to offload select workloads while integrating with existing computing infrastructure. We elaborate on the role of the host operating system to manage these unique accelerator resources, the prospects for deploying quantum modules, and the requirements placed on the language hierarchy connecting these different system components. We draw on recent advances in the modeling and simulation of quantum computing systems with the development of architectures for hybrid high-performance computing systems and the realization of software stacks for controlling quantum devices. Finally, we present simulation results that describe the expected system-level behavior of high-performance computing systems composed from compute nodes with quantum processing units. We describe performance for these hybrid systems in terms of time-to-solution, accuracy, and energy consumption, and we use simple application examples to estimate the performance advantage of quantum acceleration.

  13. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Computed tomography (CT) of the sinuses ...

  14. Global Estimates of Errors in Quantum Computation by the Feynman-Vernon Formalism

    Science.gov (United States)

    Aurell, Erik

    2018-04-01

    The operation of a quantum computer is considered as a general quantum operation on a mixed state on many qubits followed by a measurement. The general quantum operation is further represented as a Feynman-Vernon double path integral over the histories of the qubits and of an environment, and afterward tracing out the environment. The qubit histories are taken to be paths on the two-sphere S^2 as in Klauder's coherent-state path integral of spin, and the environment is assumed to consist of harmonic oscillators initially in thermal equilibrium, and linearly coupled to qubit operators \hat{S}_z. The environment can then be integrated out to give a Feynman-Vernon influence action coupling the forward and backward histories of the qubits. This representation allows one to derive in a simple way estimates that the total error of operation of a quantum computer without error correction scales linearly with the number of qubits and the time of operation. It also allows one to discuss Kitaev's toric code interacting with an environment in the same manner.

  15. Thermal Conductivities in Solids from First Principles: Accurate Computations and Rapid Estimates

    Science.gov (United States)

    Carbogno, Christian; Scheffler, Matthias

    In spite of significant research efforts, a first-principles determination of the thermal conductivity κ at high temperatures has remained elusive. Boltzmann transport techniques that account for anharmonicity perturbatively become inaccurate under such conditions. Ab initio molecular dynamics (MD) techniques using the Green-Kubo (GK) formalism capture the full anharmonicity, but can become prohibitively costly to converge in time and size. We developed a formalism that accelerates such GK simulations by several orders of magnitude and thus enables their application within the limited time and length scales accessible in ab initio MD. For this purpose, we determine the effective harmonic potential occurring during the MD, and the associated temperature-dependent phonon properties and lifetimes. Interpolation in reciprocal and frequency space then allows extrapolation to the macroscopic scale. For both force-field and ab initio MD, we validate this approach by computing κ for Si and ZrO2, two materials known for their particularly harmonic and anharmonic character. Finally, we demonstrate how these techniques facilitate reasonable estimates of κ from existing MD calculations at virtually no additional computational cost.
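
    For reference, the Green-Kubo relation underlying the approach, in one common convention with J the volume-averaged heat flux (prefactors differ between conventions, and this is not the authors' accelerated formulation), is

      \kappa = \frac{V}{3 k_B T^2} \int_0^{\infty} \langle \mathbf{J}(t) \cdot \mathbf{J}(0) \rangle \, dt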

  16. Global Estimates of Errors in Quantum Computation by the Feynman-Vernon Formalism

    Science.gov (United States)

    Aurell, Erik

    2018-06-01

    The operation of a quantum computer is considered as a general quantum operation on a mixed state on many qubits followed by a measurement. The general quantum operation is further represented as a Feynman-Vernon double path integral over the histories of the qubits and of an environment, and afterward tracing out the environment. The qubit histories are taken to be paths on the two-sphere S^2 as in Klauder's coherent-state path integral of spin, and the environment is assumed to consist of harmonic oscillators initially in thermal equilibrium, and linearly coupled to qubit operators \hat{S}_z. The environment can then be integrated out to give a Feynman-Vernon influence action coupling the forward and backward histories of the qubits. This representation allows one to derive in a simple way estimates that the total error of operation of a quantum computer without error correction scales linearly with the number of qubits and the time of operation. It also allows one to discuss Kitaev's toric code interacting with an environment in the same manner.

  17. Estimation and change tendency of rape straw resource in Leshan

    Science.gov (United States)

    Guan, Qinlan; Gong, Mingfu

    2018-04-01

    Rape straw in the Leshan area consists of the rape stalks, including stems, leaves and pods, left after removing the rapeseed. The Leshan area is one of the main rape-planting areas in Sichuan Province, and the planting area is large, so a large amount of rape straw is produced each year. Based on the trends in rapeseed planting area and rapeseed yield from 2008 to 2014, the change in rape straw resources in Leshan over the same period was analyzed, providing a decision-making reference for the resource utilization of rape straw. The results showed that the amount of rape straw resources in Leshan was very large, exceeding 100,000 tons per year and increasing year by year. By 2014, the amount of rape straw resources in Leshan was close to 200,000 tons.

  18. Inventory of Canadian marine renewable energy resources

    Energy Technology Data Exchange (ETDEWEB)

    Cornett, A. [National Research Council of Canada, Ottawa, ON (Canada). Canadian Hydraulics Centre; Tarbotton, M. [Triton Consultants Ltd., Vancouver, BC (Canada)

    2006-07-01

    The future development of marine renewable energy sources was discussed with reference to an inventory of both wave energy and tidal current resources in Canada. Canada is endowed with rich potential in wave energy resources which are spatially and temporally variable. The potential offshore resource is estimated at 37,000 MW in the Pacific and 145,000 MW in the Atlantic. The potential nearshore resource is estimated at 9,600 MW near the Queen Charlotte Islands, 9,400 MW near Vancouver Island, 1,000 MW near Sable Island, and 9,000 MW near southeast Newfoundland. It was noted that only a fraction of the potential wave energy resource is recoverable and further work is needed to delineate important local variations in energy potential close to shore. Canada also has rich potential in the tidal resource, which is highly predictable and reliable. The resource is spatially and temporally variable, with 190 sites in Canada with an estimated 42,200 MW; 89 sites in British Columbia with an estimated 4,000 MW; and 34 sites in Nunavut with an estimated 30,500 MW. It was also noted that only a fraction of the potential tidal resource is recoverable. It was suggested that the effects of energy extraction should be evaluated on a case-by-case basis for both wave and tidal energy. This presentation provided a site-by-site inventory as well as an analysis of buoy measurements and results from wind-wave hindcasts and tide models. Future efforts will focus on wave modelling to define nearshore resources; tidal modelling to fill gaps and refine initial estimates; assessing impacts of energy extraction at leading sites; and developing a web-enabled atlas of marine renewable energy resources. The factors not included in this analysis were environmental impacts, technological developments, climate related factors, site location versus power grid demand, hydrogen economy developments and economic factors. tabs., figs.

  19. Exploiting opportunistic resources for ATLAS with ARC CE and the Event Service

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00226583; The ATLAS collaboration; Filipčič, Andrej; Guan, Wen; Tsulaia, Vakhtang; Walker, Rodney; Wenaus, Torre

    2017-01-01

    With ever-greater computing needs and fixed budgets, big scientific experiments are turning to opportunistic resources as a means to add much-needed extra computing power. These resources can be very different in design from the resources that comprise the Grid computing of most experiments, therefore exploiting these resources requires a change in strategy for the experiment. The resources may be highly restrictive in what can be run or in connections to the outside world, or tolerate opportunistic usage only on condition that tasks may be terminated without warning. The ARC CE with its non-intrusive architecture is designed to integrate resources such as High Performance Computing (HPC) systems into a computing Grid. The ATLAS experiment developed the Event Service primarily to address the issue of jobs that can be terminated at any point when opportunistic resources are needed by someone else. This paper describes the integration of these two systems in order to exploit opportunistic resources for ATLAS in...

  20. Exploiting Opportunistic Resources for ATLAS with ARC CE and the Event Service

    CERN Document Server

    Cameron, David; The ATLAS collaboration

    2016-01-01

    With ever-greater computing needs and fixed budgets, big scientific experiments are turning to opportunistic resources as a means to add much-needed extra computing power. These resources can be very different in design from the resources that comprise the Grid computing of most experiments, therefore exploiting these resources requires a change in strategy for the experiment. The resources may be highly restrictive in what can be run or in connections to the outside world, or tolerate opportunistic usage only on condition that tasks may be terminated without warning. The ARC CE with its non-intrusive architecture is designed to integrate resources such as High Performance Computing (HPC) systems into a computing Grid. The ATLAS experiment developed the Event Service primarily to address the issue of jobs that can be terminated at any point when opportunistic resources are needed by someone else. This paper describes the integration of these two systems in order to exploit opportunistic resources for ATLAS in...