WorldWideScience

Sample records for computing resource estimate

  1. Robust Wave Resource Estimation

    DEFF Research Database (Denmark)

    Lavelle, John; Kofoed, Jens Peter

    2013-01-01

    An assessment of the wave energy resource at the location of the Danish Wave Energy test Centre (DanWEC) is presented in this paper. The Wave Energy Converter (WEC) test centre is located at Hanstholm in the north-west of Denmark. Information about the long term wave statistics of the resource ... is necessary for WEC developers, both to optimise the WEC for the site, and to estimate its average yearly power production using a power matrix. The wave height and wave period sea state parameters are commonly characterized with a bivariate histogram. This paper presents bivariate histograms and kernel ... An overview is given of the methods used to do this, and a method for identifying outliers of the wave elevation data, based on the joint distribution of wave elevations and accelerations, is presented. The limitations of using a JONSWAP spectrum to model the measured wave spectra as a function of Hm0 and T0...
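
    The sea-state characterization described above amounts to binning paired (Hm0, T0) observations into an occurrence table that can be combined with a WEC power matrix. A minimal sketch of that construction in Python, using synthetic data and assumed bin widths rather than the DanWEC measurements:

        import numpy as np

        # Hypothetical sea-state time series: significant wave height Hm0 [m]
        # and wave period T0 [s]; in practice these come from buoy measurements.
        rng = np.random.default_rng(0)
        hm0 = rng.gamma(shape=2.0, scale=0.8, size=5000)          # synthetic data (assumed)
        t0 = 4.0 + 1.5 * np.sqrt(hm0) + rng.normal(0, 0.5, 5000)  # synthetic data (assumed)

        # Bivariate histogram of sea states: occurrence count per (Hm0, T0) bin.
        h_edges = np.arange(0.0, hm0.max() + 0.5, 0.5)   # 0.5 m bins (assumed)
        t_edges = np.arange(0.0, t0.max() + 1.0, 1.0)    # 1.0 s bins (assumed)
        counts, _, _ = np.histogram2d(hm0, t0, bins=[h_edges, t_edges])

        # Relative frequency of each sea state; multiplying by a WEC power matrix
        # defined on the same bins gives an estimate of average power production.
        freq = counts / counts.sum()
        print(freq.shape, freq.sum())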

  2. Book review: Mineral resource estimation

    Science.gov (United States)

    Mihalasky, Mark J.

    2016-01-01

    Mineral Resource Estimation is about estimating mineral resources at the scale of an ore deposit and is not to be mistaken for mineral resource assessment, which is undertaken at a significantly broader scale, even if similar data and geospatial/geostatistical methods are used. The book describes geological, statistical, and geostatistical tools and methodologies used in resource estimation and modeling, and presents case studies for illustration. The target audience is the expert, including professional mining geologists and engineers, as well as graduate-level and advanced undergraduate students.

  3. Estimation of potential uranium resources

    International Nuclear Information System (INIS)

    Curry, D.L.

    1977-09-01

    Potential estimates, like reserves, are limited by the information on hand at the time and are not intended to indicate the ultimate resources. Potential estimates are based on geologic judgement, so their reliability is dependent on the quality and extent of geologic knowledge. Reliability differs for each of the three potential resource classes. It is greatest for probable potential resources because of the greater knowledge base resulting from the advanced stage of exploration and development in established producing districts where most of the resources in this class are located. Reliability is least for speculative potential resources because no significant deposits are known, and favorability is inferred from limited geologic data. Estimates of potential resources are revised as new geologic concepts are postulated, as new types of uranium ore bodies are discovered, and as improved geophysical and geochemical techniques are developed and applied. Advances in technology that permit the exploitation of deep or low-grade deposits, or the processing of ores of previously uneconomic metallurgical types, also will affect the estimates

  4. Aggregated Computational Toxicology Online Resource

    Data.gov (United States)

    U.S. Environmental Protection Agency — Aggregated Computational Toxicology Online Resource (ACToR) is EPA's online aggregator of all the public sources of chemical toxicity data. ACToR aggregates data...

  5. Extrapolating phosphorus production to estimate resource reserves.

    Science.gov (United States)

    Vaccari, David A; Strigul, Nikolay

    2011-08-01

    Various indicators of resource scarcity and methods for extrapolating resource availability are examined for phosphorus. These include resource lifetime, and trends in resource price, ore grade and discovery rates, and Hubbert curve extrapolation. Several of these indicate increasing scarcity of phosphate resources. Calculated resource lifetime is subject to a number of caveats such as unanticipated future changes in resource discovery, mining and beneficiation technology, population growth or per-capita demand. Thus it should be used only as a rough planning index or as a relative indicator of potential scarcity. This paper examines the uncertainty in one method for estimating available resources from historical production data. The confidence intervals for the parameters and predictions of the Hubbert curves are computed as they relate to the amount of information available. These show that Hubbert-type extrapolations are not robust for predicting the ultimately recoverable reserves or year of peak production of phosphate rock. Previous successes of the Hubbert curve are for cases in which there exist alternative resources, which is not the situation for phosphate. It is suggested that data other than historical production, such as population growth, identified resources and economic factors, should be included in making such forecasts. Copyright © 2011 Elsevier Ltd. All rights reserved.
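
    The Hubbert extrapolation mentioned above fits a logistic production curve to a historical series; the ultimately recoverable resource (URR) is the area under the fitted curve, and the covariance of the fit gives the parameter uncertainty the paper discusses. A minimal sketch with synthetic data in place of the phosphate production series:

        import numpy as np
        from scipy.optimize import curve_fit

        def hubbert(t, urr, k, t_peak):
            """Derivative of a logistic: annual production for total resource `urr`."""
            e = np.exp(-k * (t - t_peak))
            return urr * k * e / (1.0 + e) ** 2

        # Synthetic "historical" production data (not the actual phosphate series).
        years = np.arange(1950, 2011)
        true_params = (9000.0, 0.07, 2020.0)          # URR, steepness, peak year (assumed)
        rng = np.random.default_rng(3)
        production = hubbert(years, *true_params) * rng.normal(1.0, 0.05, years.size)

        popt, pcov = curve_fit(hubbert, years, production, p0=(5000.0, 0.05, 2010.0))
        perr = np.sqrt(np.diag(pcov))                  # 1-sigma parameter uncertainty
        print(f"URR = {popt[0]:.0f} +/- {perr[0]:.0f}, peak year = {popt[2]:.1f} +/- {perr[2]:.1f}")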

  6. Framework Resources Multiply Computing Power

    Science.gov (United States)

    2010-01-01

    As an early proponent of grid computing, Ames Research Center awarded Small Business Innovation Research (SBIR) funding to 3DGeo Development Inc., of Santa Clara, California, (now FusionGeo Inc., of The Woodlands, Texas) to demonstrate a virtual computer environment that linked geographically dispersed computer systems over the Internet to help solve large computational problems. By adding to an existing product, FusionGeo enabled access to resources for calculation- or data-intensive applications whenever and wherever they were needed. Commercially available as Accelerated Imaging and Modeling, the product is used by oil companies and seismic service companies, which require large processing and data storage capacities.

  7. Cluster management of computing resources

    Directory of Open Access Journals (Sweden)

    Ruchkin V.N.

    2016-01-01

    Full Text Available The study suggests management of computing resources through set-theoretic clustering. The research proposes production modelling of knowledge (rules for managing explicit and fuzzy structures) according to their technical specifications: system productivity, number of computing modules, capacity of microprocessor memory, etc. The paper describes an algorithm for “evaluation of the management criterion value” based on an additive utility function.

  8. CERN Computing Resources Lifecycle Management

    International Nuclear Information System (INIS)

    Tselishchev, Alexey; Tedesco, Paolo; Ormancey, Emmanuel; Isnard, Christian

    2011-01-01

    Computing environments in High Energy Physics are typically complex and heterogeneous, with a wide variety of hardware resources, operating systems and applications. The research activity in all its aspects is carried out by international collaborations constituted by a growing number of participants with a high manpower turnover. These factors can increase the administrative workload required to manage the computing infrastructure and to track resource usage and inheritance. It is therefore necessary to rationalize and formalize the computing resources management, while respecting the requirement of flexibility of scientific applications and services. This paper shows how during the last years the CERN computing infrastructure has been moving in this direction, establishing well-defined policies and lifecycles for resource management. Applications are being migrated towards proposed common identity, authentication and authorization models, reducing their complexity while increasing security and usability. Regular tasks like the creation of primary user accounts are being automated, and self-service facilities are being introduced for common operations, like creation of additional accounts, group subscriptions and password reset. This approach is leading to more efficient and manageable systems.

  9. SOCR: Statistics Online Computational Resource

    Directory of Open Access Journals (Sweden)

    Ivo D. Dinov

    2006-10-01

    Full Text Available The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning.

  10. Resource Management in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Andrei IONESCU

    2015-01-01

    Full Text Available Mobile cloud computing is a major research topic in Information Technology & Communications. It integrates cloud computing, mobile computing and wireless networks. While mainly built on cloud computing, it has to operate using more heterogeneous resources, with implications on how these resources are managed and used. Managing the resources of a mobile cloud is not a trivial task, involving vastly different architectures. The process is outside the scope of human users. Using these resources from applications at both the platform and software tiers comes with its own challenges. This paper presents different approaches in use for managing cloud resources at infrastructure and platform levels.

  11. Computational Modelling of Collaborative Resources Sharing in ...

    African Journals Online (AJOL)

    In grid computing, Grid users who submit jobs or tasks and resource providers who provide resources have different motivations when they join the Grid system. However, due to their autonomy, the Grid users' and resource providers' objectives often conflict. This paper proposes an autonomous hybrid resource management ...

  12. Methodology for uranium resource estimates and reliability

    International Nuclear Information System (INIS)

    Blanchfield, D.M.

    1980-01-01

    The NURE uranium assessment method has evolved from a small group of geologists estimating resources on a few lease blocks, to a national survey involving an interdisciplinary system consisting of the following: (1) geology and geologic analogs; (2) engineering and cost modeling; (3) mathematics and probability theory, psychology and elicitation of subjective judgments; and (4) computerized calculations, computer graphics, and data base management. The evolution has been spurred primarily by two objectives: (1) quantification of uncertainty, and (2) elimination of simplifying assumptions. This has resulted in a tremendous data-gathering effort and the involvement of hundreds of technical experts, many in uranium geology, but many from other fields as well. The rationality of the methods is still largely based on the concept of an analog and the observation that the results are reasonable. The reliability, or repeatability, of the assessments is reasonably guaranteed by the series of peer and superior technical reviews which has been formalized under the current methodology. The optimism or pessimism of individual geologists who make the initial assessments is tempered by the review process, resulting in a series of assessments which are a consistent, unbiased reflection of the facts. Despite the many improvements over past methods, several objectives for future development remain, primarily to reduce subjectivity in utilizing factual information in the estimation of endowment, and to improve the recognition of cost uncertainties in the assessment of economic potential. The 1980 NURE assessment methodology will undoubtedly be improved, but the reader is reminded that resource estimates are and always will be a forecast for the future.

  13. Fast computation of distance estimators

    Directory of Open Access Journals (Sweden)

    Lagergren Jens

    2007-03-01

    Full Text Available Abstract Background Some distance methods are among the most commonly used methods for reconstructing phylogenetic trees from sequence data. The input to a distance method is a distance matrix, containing estimated pairwise distances between all pairs of taxa. Distance methods themselves are often fast, e.g., the famous and popular Neighbor Joining (NJ) algorithm reconstructs a phylogeny of n taxa in time O(n³). Unfortunately, the fastest practical algorithms known for computing the distance matrix, from n sequences of length l, take time proportional to l·n². Since the sequence length typically is much larger than the number of taxa, the distance estimation is the bottleneck in phylogeny reconstruction. This bottleneck is especially apparent in reconstruction of large phylogenies or in applications where many trees have to be reconstructed, e.g., bootstrapping and genome wide applications. Results We give an advanced algorithm for computing the number of mutational events between DNA sequences which is significantly faster than both Phylip and Paup. Moreover, we give a new method for estimating pairwise distances between sequences which contain ambiguity symbols. This new method is shown to be more accurate as well as faster than earlier methods. Conclusion Our novel algorithm for computing distance estimators provides a valuable tool in phylogeny reconstruction. Since the running time of our distance estimation algorithm is comparable to that of most distance methods, the previous bottleneck is removed. All distance methods, such as NJ, require a distance matrix as input and, hence, our novel algorithm significantly improves the overall running time of all distance methods. In particular, we show for real world biological applications how the running time of phylogeny reconstruction using NJ is improved from a matter of hours to a matter of seconds.
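
    For context, the quantity being accelerated is the pairwise distance matrix. A plain O(l·n²) baseline (not the paper's fast algorithm) using the Jukes-Cantor correction might look like the sketch below; handling of ambiguity symbols is reduced to simply skipping them:

        import numpy as np

        def jc69_distance_matrix(seqs):
            """Pairwise Jukes-Cantor distances for equal-length DNA sequences.

            A straightforward O(l*n^2) baseline, not the accelerated algorithm
            described in the paper; ambiguity symbols are simply ignored here.
            """
            n = len(seqs)
            arr = np.array([list(s) for s in seqs])
            d = np.zeros((n, n))
            for i in range(n):
                for j in range(i + 1, n):
                    valid = np.isin(arr[i], list("ACGT")) & np.isin(arr[j], list("ACGT"))
                    p = np.mean(arr[i][valid] != arr[j][valid])   # observed mismatch fraction
                    # JC69 correction; undefined (infinite) if p >= 0.75
                    d[i, j] = d[j, i] = -0.75 * np.log(1.0 - 4.0 * p / 3.0)
            return d

        print(jc69_distance_matrix(["ACGTACGT", "ACGTACGA", "ACGAACGA"]))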

  14. Enabling opportunistic resources for CMS Computing Operations

    Energy Technology Data Exchange (ETDEWEB)

    Hufnagel, Dick [Fermilab

    2015-11-19

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize “opportunistic” resources, i.e. resources not owned by, or a priori configured for, CMS, to meet peak demands. In addition to our dedicated resources we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Here we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.

  15. Resource estimations in contingency planning for FMD

    DEFF Research Database (Denmark)

    Boklund, Anette; Sten, Mortensen; Holm Johansen, Maren

    Based on results from a stochastic simulation model, it was possible to create a simple model in Excel to estimate the requirements for personnel and materiel during an FMD outbreak in Denmark. The model can easily be adjusted when new information on resources appears from management of other cr...

  16. Resource management in mobile computing environments

    CERN Document Server

    Mavromoustakis, Constandinos X; Mastorakis, George

    2014-01-01

    This book reports the latest advances on the design and development of mobile computing systems, describing their applications in the context of modeling, analysis and efficient resource management. It explores the challenges on mobile computing and resource management paradigms, including research efforts and approaches recently carried out in response to them to address future open-ended issues. The book includes 26 rigorously refereed chapters written by leading international researchers, providing the readers with technical and scientific information about various aspects of mobile computing, from basic concepts to advanced findings, reporting the state-of-the-art on resource management in such environments. It is mainly intended as a reference guide for researchers and practitioners involved in the design, development and applications of mobile computing systems, seeking solutions to related issues. It also represents a useful textbook for advanced undergraduate and graduate courses, addressing special t...

  17. Wind power error estimation in resource assessments.

    Science.gov (United States)

    Rodríguez, Osvaldo; Del Río, Jesús A; Jaramillo, Oscar A; Martínez, Manuel

    2015-01-01

    Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, probability density function, and wind turbine power curves. This method uses the actual wind speed data without prior statistical treatment, based on 28 wind turbine power curves fitted by Lagrange's method, to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, thereby yielding an error of 5%. The proposed error propagation complements the traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the levelized cost of power production or the investment return time. The implementation of this method increases the reliability of techno-economic resource assessment studies.
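
    As an illustration of the propagation step, a wind-speed measurement error can be pushed through a turbine power curve numerically. The sketch below uses an invented power curve and a simple ±10% sensitivity calculation rather than the paper's 28 Lagrange-fitted curves and full probability-density treatment:

        import numpy as np

        # Illustrative turbine power curve (wind speed [m/s] -> power [kW]);
        # plain linear interpolation on made-up points, not a manufacturer curve.
        v_curve = np.array([3, 5, 7, 9, 11, 13, 15, 25], dtype=float)
        p_curve = np.array([0, 120, 480, 1100, 1800, 2000, 2000, 2000], dtype=float)

        def power(v):
            return np.interp(v, v_curve, p_curve, left=0.0, right=0.0)

        # Hypothetical measured wind speed series; propagate a +/-10% speed error
        # through the power curve as a simple sensitivity calculation.
        rng = np.random.default_rng(1)
        v_meas = rng.weibull(2.0, size=20000) * 8.0
        p_nominal = power(v_meas).mean()
        p_low = power(v_meas * 0.90).mean()    # speed underestimated by 10%
        p_high = power(v_meas * 1.10).mean()   # speed overestimated by 10%

        rel_err = (p_high - p_low) / (2.0 * p_nominal)
        print(f"mean power: {p_nominal:.0f} kW, propagated relative error: {rel_err:.1%}")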

  18. Energy analysis applied to uranium resource estimation

    International Nuclear Information System (INIS)

    Mortimer, N.D.

    1980-01-01

    It is pointed out that fuel prices and ore costs are interdependent, and that in estimating ore costs (involving the cost of fuels used to mine and process the uranium) it is necessary to take into account the total use of energy by the entire fuel system, through the technique of energy analysis. The subject is discussed, and illustrated with diagrams, under the following heads: estimate of how total workable resources would depend on production costs; sensitivity of nuclear electricity prices to ore costs; variation of net energy requirement with ore grade for a typical PWR reactor design; variation of average fundamental cost of nuclear electricity with ore grade; variation of cumulative uranium resources with current maximum ore costs. (U.K.)

  19. Turning Video Resource Management into Cloud Computing

    Directory of Open Access Journals (Sweden)

    Weili Kou

    2016-07-01

    Full Text Available Big data makes cloud computing more and more popular in various fields. Video resources are very useful and important to education, security monitoring, and so on. However, issues of their huge volumes, complex data types, inefficient processing performance, weak security, and long loading times pose challenges in video resource management. The Hadoop Distributed File System (HDFS) is an open-source framework, which can provide cloud-based platforms and presents an opportunity for solving these problems. This paper presents video resource management architecture based on HDFS to provide a uniform framework and a five-layer model for standardizing the current various algorithms and applications. The architecture, basic model, and key algorithms are designed for turning video resources into a cloud computing environment. The design was tested by establishing a simulation system prototype.

  20. Offshore wind resource estimation for wind energy

    DEFF Research Database (Denmark)

    Hasager, Charlotte Bay; Badger, Merete; Mouche, A.

    2010-01-01

    Satellite remote sensing from active and passive microwave instruments is used to estimate the offshore wind resource in the Northern European Seas in the EU-Norsewind project. The satellite data include 8 years of Envisat ASAR, 10 years of QuikSCAT, and 23 years of SSM/I. The satellite observations are compared to selected offshore meteorological masts in the Baltic Sea and North Sea. The overall aim of the Norsewind project is a state-of-the-art wind atlas at 100 m height. The satellite winds are all valid at 10 m above sea level. Extrapolation to higher heights is a challenge. Mesoscale modeling of the winds at hub height will be compared to data from wind lidars observing at 100 m above sea level. Plans are also to compare mesoscale model results and satellite-based estimates of the offshore wind resource.

  1. Framework of Resource Management for Intercloud Computing

    Directory of Open Access Journals (Sweden)

    Mohammad Aazam

    2014-01-01

    Full Text Available There has been a very rapid increase in digital media content, due to which the media cloud is gaining importance. The cloud computing paradigm provides management of resources and helps create an extended portfolio of services. Through cloud computing, not only are services managed more efficiently, but service discovery is also made possible. To handle the rapid increase in content, the media cloud plays a very vital role. But it is not possible for standalone clouds to handle everything with the increasing user demands. For scalability and better service provisioning, at times, clouds have to communicate with other clouds and share their resources. This scenario is called Intercloud computing or cloud federation. Research on Intercloud computing is still in its early stages. Resource management is one of the key concerns to be addressed in Intercloud computing. Existing studies discuss this issue only in a trivial and simplistic way. In this study, we present a resource management model, keeping in view different types of services, different customer types, customer characteristics, pricing, and refunding. The presented framework was implemented using Java and NetBeans 8.0 and evaluated using the CloudSim 3.0.3 toolkit. Presented results and their discussion validate our model and its efficiency.

  2. Are local wind power resources well estimated?

    Science.gov (United States)

    Lundtang Petersen, Erik; Troen, Ib; Jørgensen, Hans E.; Mann, Jakob

    2013-03-01

    Planning and financing of wind power installations crucially require accurate resource estimation, in addition to a number of other considerations relating to environment and economy. Furthermore, individual wind energy installations cannot in general be seen in isolation. It is well known that the spacing of turbines in wind farms is critical for maximum power production. It is also well established that the collective effect of wind turbines in large wind farms, or of several wind farms, can limit the wind power extraction downwind. This has been documented by many years of production statistics. For very large, regional-sized wind farms, a number of numerical studies have pointed to additional adverse changes to the regional wind climate, most recently the detailed studies of Adams and Keith [1]. They show that the geophysical limit to wind power production is likely to be lower than previously estimated. Although this problem is of far-future concern, it has to be considered seriously. In their paper they estimate that a wind farm larger than 100 km² is limited to about 1 W m⁻². However, a 20 km² offshore farm, Horns Rev 1, has in the last five years produced 3.98 W m⁻² [5]. In that light it is highly unlikely that the effects pointed out by [1] will pose any immediate threat to wind energy in the coming decades. Today a number of well-established mesoscale and microscale models exist for estimating wind resources and design parameters, and in many cases they work well. This is especially true if good local data are available for calibrating the models or for their validation. The wind energy industry is still troubled by many projects showing considerable negative discrepancies between calculated and actually experienced production numbers and operating conditions. Therefore it has been decided on a European Union level to launch a project, 'The New European Wind Atlas', aiming at reducing overall uncertainties in determining wind conditions. The
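
    The quoted production density can be sanity-checked with simple arithmetic; the 160 MW rated capacity of Horns Rev 1 used below is an outside assumption, not stated in the abstract:

        # Quick check of the quoted power densities (figures as given in the abstract).
        area_m2 = 20e6            # 20 km^2
        density_w_per_m2 = 3.98   # produced power density quoted above
        mean_power_mw = density_w_per_m2 * area_m2 / 1e6
        print(f"average production: {mean_power_mw:.1f} MW")          # ~79.6 MW
        print(f"implied capacity factor: {mean_power_mw / 160:.0%}")  # assuming 160 MW rated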

  3. Optimised resource construction for verifiable quantum computation

    International Nuclear Information System (INIS)

    Kashefi, Elham; Wallden, Petros

    2017-01-01

    Recent developments have brought the possibility of achieving scalable quantum networks and quantum devices closer. From the computational point of view these emerging technologies become relevant when they are no longer classically simulatable. Hence a pressing challenge is the construction of practical methods to verify the correctness of the outcome produced by universal or non-universal quantum devices. A promising approach that has been extensively explored is the scheme of verification via encryption through blind quantum computation. We present here a new construction that simplifies the required resources for any such verifiable protocol. We obtain an overhead that is linear in the size of the input (computation), while the security parameter remains independent of the size of the computation and can be made exponentially small (with a small extra cost). Furthermore our construction is generic and could be applied to any universal or non-universal scheme with a given underlying graph. (paper)

  4. Automating usability of ATLAS Distributed Computing resources

    CERN Document Server

    "Tupputi, S A; The ATLAS collaboration

    2013-01-01

    The automation of ATLAS Distributed Computing (ADC) operations is essential to reduce manpower costs and allow performance-enhancing actions, which improve the reliability of the system. In this perspective a crucial case is the automatic exclusion/recovery of ATLAS computing sites' storage resources, which are continuously exploited at the edge of their capabilities. It is challenging to adopt unambiguous decision criteria for storage resources which feature non-homogeneous types, sizes and roles. The recently developed Storage Area Automatic Blacklisting (SAAB) tool has provided a suitable solution, by employing an inference algorithm which processes the site-by-site outcome of SAM (Site Availability Test) SRM tests. SAAB accomplishes both the tasks of providing global monitoring as well as automatic operations on single sites.

  5. LHCb Computing Resource usage in 2017

    CERN Document Server

    Bozzi, Concezio

    2018-01-01

    This document reports the usage of computing resources by the LHCb collaboration during the period January 1st – December 31st 2017. The data in the following sections have been compiled from the EGI Accounting portal: https://accounting.egi.eu. For LHCb specific information, the data is taken from the DIRAC Accounting at the LHCb DIRAC Web portal: http://lhcb-portal-dirac.cern.ch.

  6. VECTR: Virtual Environment Computational Training Resource

    Science.gov (United States)

    Little, William L.

    2018-01-01

    The Westridge Middle School Curriculum and Community Night is an annual event designed to introduce students and parents to potential employers in the Central Florida area. NASA participated in the event in 2017, and has been asked to come back for the 2018 event on January 25. We will be demonstrating our Microsoft Hololens Virtual Rovers project, and the Virtual Environment Computational Training Resource (VECTR) virtual reality tool.

  7. LHCb Computing Resource usage in 2015 (II)

    CERN Document Server

    Bozzi, Concezio

    2016-01-01

    This document reports the usage of computing resources by the LHCb collaboration during the period January 1st – December 31st 2015. The data in the following sections has been compiled from the EGI Accounting portal: https://accounting.egi.eu. For LHCb specific information, the data is taken from the DIRAC Accounting at the LHCb DIRAC Web portal: http://lhcb-portal-dirac.cern.ch.

  8. Parallel visualization on leadership computing resources

    International Nuclear Information System (INIS)

    Peterka, T; Ross, R B; Shen, H-W; Ma, K-L; Kendall, W; Yu, H

    2009-01-01

    Changes are needed in the way that visualization is performed, if we expect the analysis of scientific data to be effective at the petascale and beyond. By using similar techniques as those used to parallelize simulations, such as parallel I/O, load balancing, and effective use of interprocess communication, the supercomputers that compute these datasets can also serve as analysis and visualization engines for them. Our team is assessing the feasibility of performing parallel scientific visualization on some of the most powerful computational resources of the U.S. Department of Energy's National Laboratories in order to pave the way for analyzing the next generation of computational results. This paper highlights some of the conclusions of that research.

  9. Automating usability of ATLAS Distributed Computing resources

    Science.gov (United States)

    Tupputi, S. A.; Di Girolamo, A.; Kouba, T.; Schovancová, J.; Atlas Collaboration

    2014-06-01

    The automation of ATLAS Distributed Computing (ADC) operations is essential to reduce manpower costs and allow performance-enhancing actions, which improve the reliability of the system. In this perspective a crucial case is the automatic handling of outages of ATLAS computing sites' storage resources, which are continuously exploited at the edge of their capabilities. It is challenging to adopt unambiguous decision criteria for storage resources of non-homogeneous types, sizes and roles. The recently developed Storage Area Automatic Blacklisting (SAAB) tool has provided a suitable solution, by employing an inference algorithm which processes the history of storage monitoring test outcomes. SAAB accomplishes both the tasks of providing global monitoring as well as automatic operations on single sites. The implementation of the SAAB tool has been the first step in a comprehensive review of the storage areas monitoring and central management at all levels. Such review has involved the reordering and optimization of SAM tests deployment and the inclusion of SAAB results in the ATLAS Site Status Board with both dedicated metrics and views. The resulting structure allows monitoring the storage resources status with fine time-granularity and automatic actions to be taken in foreseen cases, like automatic outage handling and notifications to sites. Hence, the human actions are restricted to reporting and following up problems, where and when needed. In this work we show SAAB working principles and features. We present also the decrease of human interactions achieved within the ATLAS Computing Operation team. The automation results in a prompt reaction to failures, which leads to the optimization of resource exploitation.
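
    The abstract does not spell out SAAB's inference algorithm; as a rough illustration of automatic blacklisting driven by monitoring-test history, a simple threshold rule could look like the following (window size and failure-rate threshold are assumptions, not SAAB's actual logic):

        from collections import deque

        def should_blacklist(test_history, window=24, max_failure_rate=0.5, min_samples=6):
            """Toy stand-in for an automatic storage-blacklisting rule.

            test_history: iterable of booleans (True = test passed), newest last.
            The real SAAB tool uses an inference algorithm over SAM test results;
            the thresholds here are illustrative assumptions only.
            """
            recent = deque(test_history, maxlen=window)
            if len(recent) < min_samples:
                return False                      # not enough evidence to act
            failure_rate = 1.0 - sum(recent) / len(recent)
            return failure_rate >= max_failure_rate

        print(should_blacklist([True, True, False, False, False, False, False, False]))  # True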

  10. Automating usability of ATLAS distributed computing resources

    International Nuclear Information System (INIS)

    Tupputi, S A; Girolamo, A Di; Kouba, T; Schovancová, J

    2014-01-01

    The automation of ATLAS Distributed Computing (ADC) operations is essential to reduce manpower costs and allow performance-enhancing actions, which improve the reliability of the system. In this perspective a crucial case is the automatic handling of outages of ATLAS computing sites' storage resources, which are continuously exploited at the edge of their capabilities. It is challenging to adopt unambiguous decision criteria for storage resources of non-homogeneous types, sizes and roles. The recently developed Storage Area Automatic Blacklisting (SAAB) tool has provided a suitable solution, by employing an inference algorithm which processes the history of storage monitoring test outcomes. SAAB accomplishes both the tasks of providing global monitoring as well as automatic operations on single sites. The implementation of the SAAB tool has been the first step in a comprehensive review of the storage areas monitoring and central management at all levels. Such review has involved the reordering and optimization of SAM tests deployment and the inclusion of SAAB results in the ATLAS Site Status Board with both dedicated metrics and views. The resulting structure allows monitoring the storage resources status with fine time-granularity and automatic actions to be taken in foreseen cases, like automatic outage handling and notifications to sites. Hence, the human actions are restricted to reporting and following up problems, where and when needed. In this work we show SAAB working principles and features. We present also the decrease of human interactions achieved within the ATLAS Computing Operation team. The automation results in a prompt reaction to failures, which leads to the optimization of resource exploitation.

  11. A multipurpose computing center with distributed resources

    Science.gov (United States)

    Chudoba, J.; Adam, M.; Adamová, D.; Kouba, T.; Mikula, A.; Říkal, V.; Švec, J.; Uhlířová, J.; Vokáč, P.; Svatoš, M.

    2017-10-01

    The Computing Center of the Institute of Physics (CC IoP) of the Czech Academy of Sciences serves a broad spectrum of users with various computing needs. It runs a WLCG Tier-2 center for the ALICE and the ATLAS experiments; the same group of services is used by the astroparticle physics projects the Pierre Auger Observatory (PAO) and the Cherenkov Telescope Array (CTA). The OSG stack is installed for the NOvA experiment. Other groups of users directly use the local batch system. Storage capacity is distributed to several locations. DPM servers used by the ATLAS and the PAO are all in the same server room, but several xrootd servers for the ALICE experiment are operated in the Nuclear Physics Institute in Řež, about 10 km away. The storage capacity for the ATLAS and the PAO is extended by resources of CESNET, the Czech National Grid Initiative representative. Those resources are in Plzen and Jihlava, more than 100 km away from the CC IoP. Both distant sites use a hierarchical storage solution based on disks and tapes. They installed one common dCache instance, which is published in the CC IoP BDII. ATLAS users can use these resources using the standard ATLAS tools in the same way as the local storage without noticing this geographical distribution. Computing clusters LUNA and EXMAG dedicated to users mostly from the Solid State Physics departments offer resources for parallel computing. They are part of the Czech NGI infrastructure MetaCentrum with a distributed batch system based on torque with a custom scheduler. Clusters are installed remotely by the MetaCentrum team and a local contact helps only when needed. Users from IoP have exclusive access only to a part of these two clusters and take advantage of higher priorities on the rest (1500 cores in total), which can also be used by any user of the MetaCentrum. IoP researchers can also use distant resources located in several towns of the Czech Republic with a capacity of more than 12000 cores in total.

  12. NMRbox: A Resource for Biomolecular NMR Computation.

    Science.gov (United States)

    Maciejewski, Mark W; Schuyler, Adam D; Gryk, Michael R; Moraru, Ion I; Romero, Pedro R; Ulrich, Eldon L; Eghbalnia, Hamid R; Livny, Miron; Delaglio, Frank; Hoch, Jeffrey C

    2017-04-25

    Advances in computation have been enabling many recent advances in biomolecular applications of NMR. Due to the wide diversity of applications of NMR, the number and variety of software packages for processing and analyzing NMR data is quite large, with labs relying on dozens, if not hundreds of software packages. Discovery, acquisition, installation, and maintenance of all these packages is a burdensome task. Because the majority of software packages originate in academic labs, persistence of the software is compromised when developers graduate, funding ceases, or investigators turn to other projects. To simplify access to and use of biomolecular NMR software, foster persistence, and enhance reproducibility of computational workflows, we have developed NMRbox, a shared resource for NMR software and computation. NMRbox employs virtualization to provide a comprehensive software environment preconfigured with hundreds of software packages, available as a downloadable virtual machine or as a Platform-as-a-Service supported by a dedicated compute cloud. Ongoing development includes a metadata harvester to regularize, annotate, and preserve workflows and facilitate and enhance data depositions to BioMagResBank, and tools for Bayesian inference to enhance the robustness and extensibility of computational analyses. In addition to facilitating use and preservation of the rich and dynamic software environment for biomolecular NMR, NMRbox fosters the development and deployment of a new class of metasoftware packages. NMRbox is freely available to not-for-profit users. Copyright © 2017 Biophysical Society. All rights reserved.

  13. Contract on using computer resources of another

    Directory of Open Access Journals (Sweden)

    Cvetković Mihajlo

    2016-01-01

    Full Text Available Contractual relations involving the use of another's property are quite common. Yet, the use of the computer resources of others over the Internet and the legal transactions arising thereof certainly diverge from the traditional framework embodied in the special part of contract law dealing with this issue. Modern performance concepts (such as infrastructure, software or platform as high-tech services) are highly unlikely to be described by the terminology derived from Roman law. The overwhelming novelty of high-tech services obscures the disadvantageous position of the contracting parties. In most cases, service providers are global multinational companies which tend to secure their own unjustified privileges and gain by providing lengthy and intricate contracts, often comprising a number of legal documents. General terms and conditions in these service provision contracts are further complicated by the 'service level agreement', rules of conduct and (non)confidentiality guarantees. Without giving the issue a second thought, users easily accept the pre-fabricated offer without reservations, unaware that such a pseudo-gratuitous contract actually conceals a highly lucrative and mutually binding agreement. The author examines the extent to which the legal provisions governing sale of goods and services, lease, loan and commodatum may apply to 'cloud computing' contracts, and analyses the scope and advantages of contractual consumer protection, as a relatively new area in contract law. The termination of a service contract between the provider and the user features specific post-contractual obligations which are inherent to an online environment.

  14. Internal Clock Drift Estimation in Computer Clusters

    Directory of Open Access Journals (Sweden)

    Hicham Marouani

    2008-01-01

    Full Text Available Most computers have several high-resolution timing sources, from the programmable interrupt timer to the cycle counter. Yet, even at a precision of one cycle in ten million, clocks may drift significantly in a single second at a clock frequency of several GHz. When tracing low-level system events in computer clusters, such as packet sending or reception, each computer system records its own events using an internal clock. In order to properly understand the global system behavior and performance, as reported by the events recorded on each computer, it is important to estimate precisely the clock differences and drift between the different computers in the system. This article studies the clock precision and stability of several computer systems with different architectures. It also studies the typical network delay characteristics, since time synchronization algorithms rely on the exchange of network packets and are dependent on the symmetry of the delays. A very precise clock, based on the atomic time provided by the GPS satellite network, was used as a reference to measure clock drifts and network delays. The results obtained are of immediate use to all applications which depend on computer clocks or network time synchronization accuracy.
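
    Estimating offset and drift against a reference clock is essentially a linear fit of paired timestamps. A minimal sketch with synthetic timestamps in place of the exchanged packets and GPS reference used in the article:

        import numpy as np

        # Estimate clock offset and drift of a node's clock relative to a reference
        # clock (e.g. GPS-disciplined), using a least-squares fit of paired timestamps.
        rng = np.random.default_rng(2)
        t_ref = np.linspace(0.0, 600.0, 61)                    # reference time [s]
        true_offset, true_drift = 0.012, 35e-6                 # 12 ms offset, 35 ppm drift (assumed)
        t_node = true_offset + (1.0 + true_drift) * t_ref + rng.normal(0, 20e-6, t_ref.size)

        drift_plus_1, offset = np.polyfit(t_ref, t_node, 1)    # linear model: t_node = a*t_ref + b
        print(f"estimated offset: {offset*1e3:.2f} ms, drift: {(drift_plus_1 - 1)*1e6:.1f} ppm")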

  15. Air quality estimation by computational intelligence methodologies

    Directory of Open Access Journals (Sweden)

    Ćirić Ivan T.

    2012-01-01

    Full Text Available The subject of this study is to compare different computational intelligence methodologies based on artificial neural networks used for forecasting an air quality parameter, the emission of CO2, in the city of Niš. Firstly, the inputs of the CO2 emission estimator are analyzed and their measurement is explained. It is known that traffic is the single largest emitter of CO2 in Europe. Therefore, a proper treatment of this component of pollution is very important for precise estimation of emission levels. With this in mind, measurements of traffic frequency and CO2 concentration were carried out at critical intersections in the city, as well as the monitoring of vehicle direction at the crossroads. Finally, based on experimental data, different soft computing estimators were developed, such as a feed-forward neural network, a recurrent neural network, and a hybrid neuro-fuzzy estimator of CO2 emission levels. Test data for some characteristic cases presented at the end of the paper show good agreement of the developed estimator outputs with experimental data. The presented results indicate the usability of the implemented method. [Projects of the Ministry of Science of the Republic of Serbia: no. III42008-2/2011, Evaluation of Energy Performances and Indoor Environment Quality of Educational Buildings in Serbia with Impact to Health; and no. TR35016/2011, Research of MHD Flows around the Bodies, in the Tip Clearances and Channels and Application in the MHD Pumps Development]

  16. NASA Center for Computational Sciences: History and Resources

    Science.gov (United States)

    2000-01-01

    The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.

  17. LHCb Computing Resources: 2019 requests and reassessment of 2018 requests

    CERN Document Server

    Bozzi, Concezio

    2017-01-01

    This document presents the computing resources needed by LHCb in 2019 and a reassessment of the 2018 requests, as resulting from the current experience of Run2 data taking and minor changes in the LHCb computing model parameters.

  18. Estimation of Remnant Gold Resources of Old Underground ...

    African Journals Online (AJOL)

    The lack of geological control and old underground development at AngloGold Ashanti (Bibiani mine) in Ghana was a major setback for the conventional methods of resource estimation to produce accurate estimates of the deposit. Consequently, inverse distance weighting of indicator variables was used. This estimation ...
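
    Inverse distance weighting assigns each block a weighted average of nearby sample values, with weights decaying as a power of distance. A generic sketch follows; the search parameters, block model and indicator coding used at Bibiani are not given in the abstract, so none are modelled:

        import numpy as np

        def idw_estimate(sample_xy, sample_vals, target_xy, power=2.0):
            """Inverse distance weighted estimate at target points (generic IDW)."""
            d = np.linalg.norm(sample_xy[None, :, :] - target_xy[:, None, :], axis=2)
            d = np.maximum(d, 1e-9)              # avoid division by zero at sample points
            w = 1.0 / d**power
            return (w * sample_vals).sum(axis=1) / w.sum(axis=1)

        # Hypothetical drill-hole composites: coordinates [m] and indicator values
        # (1 if grade above cut-off, 0 otherwise).
        samples = np.array([[0.0, 0.0], [50.0, 10.0], [20.0, 60.0], [80.0, 70.0]])
        indicators = np.array([1.0, 0.0, 1.0, 0.0])
        blocks = np.array([[25.0, 25.0], [60.0, 50.0]])
        print(idw_estimate(samples, indicators, blocks))   # probability-like block estimates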

  19. Integration of Cloud resources in the LHCb Distributed Computing

    CERN Document Server

    Ubeda Garcia, Mario; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-01-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it has seamlessly integrated Grid resources and computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack); it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keepin...

  20. Estimating near-shore wind resources

    DEFF Research Database (Denmark)

    Floors, Rogier Ralph; Hahmann, Andrea N.; Peña, Alfredo

    An evaluation and sensitivity study using the WRF mesoscale model to estimate the wind in a coastal area is performed using a unique data set consisting of scanning, profiling and floating lidars. The ability of the WRF model to represent the wind speed was evaluated by running the model for a four-month period in twelve different set-ups. The atmospheric boundary layer was parametrized using the first-order YSU scheme and the 1.5-order MYJ scheme. Simulations with two sources of land use data, two sources of reanalysis data, two sources of sea-surface temperatures and three different horizontal ... RMSE and correlation coefficient. Using a finer grid spacing of 1 and 0.5 km did not give better results, and sensitivity to the input of different SST and land cover data in the RUNE area was small. The difference in mean wind speed between all simulations over a region 80 km around the RUNE area was ...

  1. Meta-analysis of non-renewable energy resource estimates

    International Nuclear Information System (INIS)

    Dale, Michael

    2012-01-01

    This paper offers a review of estimates of ultimately recoverable resources (URR) of non-renewable energy sources: coal, conventional and unconventional oil, conventional and unconventional gas, and uranium for nuclear fission. There is a large range in the estimates of many of the energy sources, even those that have been utilized for a long time and, as such, should be well understood. If it is assumed that the estimates for each resource are normally distributed, then the total value of ultimately recoverable fossil and fissile energy resources is 70,592 EJ. If, on the other hand, the best fitting distribution from each of the resource estimate populations is used, the total value is 50,702 EJ, a factor of around 30% smaller. - Highlights: ► Brief introduction to categorization of resources. ► Collated over 380 estimates of ultimately recoverable global resources for all non-renewable energy sources. ► Extensive statistical analysis and distribution fitting conducted. ► Cross-energy source comparison of resource magnitudes.
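
    Under the stated normality assumption, the total URR is the sum of the per-resource means, and (assuming independence) its variance is the sum of the per-resource variances. A sketch with placeholder figures rather than the paper's collated estimates:

        import numpy as np

        # Illustration of combining per-resource URR estimates; the values below are
        # placeholders, not the figures collated in the paper.
        estimates_ej = {
            # resource: (mean URR [EJ], std dev [EJ]) -- illustrative only
            "coal":    (25000.0, 6000.0),
            "oil":     (15000.0, 4000.0),
            "gas":     (14000.0, 4000.0),
            "uranium": (10000.0, 5000.0),
        }
        total_mean = sum(m for m, s in estimates_ej.values())
        total_std = np.sqrt(sum(s**2 for m, s in estimates_ej.values()))
        print(f"total URR ~ {total_mean:.0f} +/- {total_std:.0f} EJ")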

  2. Data estimation and prediction for natural resources public data

    Science.gov (United States)

    Hans T. Schreuder; Robin M. Reich

    1998-01-01

    A key product of both Forest Inventory and Analysis (FIA) of the USDA Forest Service and the Natural Resources Inventory (NRI) of the Natural Resources Conservation Service is a scientific data base that should be defensible in court. Multiple imputation procedures (MIPs) have been proposed both for missing value estimation and prediction of non-remeasured cells in...

  3. Resource management in utility and cloud computing

    CERN Document Server

    Zhao, Han

    2013-01-01

    This SpringerBrief reviews the existing market-oriented strategies for economically managing resource allocation in distributed systems. It describes three new schemes that address cost-efficiency, user incentives, and allocation fairness with regard to different scheduling contexts. The first scheme, taking the Amazon EC2 market as a case of study, investigates the optimal resource rental planning models based on linear integer programming and stochastic optimization techniques. This model is useful to explore the interaction between the cloud infrastructure provider and the cloud resource c

  4. A Matchmaking Strategy Of Mixed Resource On Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Wisam Elshareef

    2015-08-01

    Full Text Available Abstract Today cloud computing has become a key technology for online allotment of computing resources and online storage of user data at a lower cost, where computing resources are available all the time over the Internet on a pay-per-use basis. Recently there is a growing need for resource management strategies in a cloud computing environment that encompass both end-users' satisfaction and a high job submission throughput with appropriate scheduling. One of the major and essential issues in resource management is allocating incoming tasks to suitable virtual machines (matchmaking). The main objective of this paper is to propose a matchmaking strategy between the incoming requests and various resources in the cloud environment to satisfy the requirements of users and to load balance the workload on resources. Load balancing is an important aspect of resource management in a cloud computing environment. So this paper proposes a dynamic weight active monitor (DWAM) load balance algorithm, which allocates on the fly the incoming requests to all available virtual machines in an efficient manner, in order to achieve better performance parameters such as response time, processing time and resource utilization. The feasibility of the proposed algorithm is analyzed using the Cloudsim simulator, which proves the superiority of the proposed DWAM algorithm over its counterparts in the literature. Simulation results demonstrate that the proposed algorithm dramatically improves response time, data processing time and resource utilization compared to the Active Monitor and VM-assign algorithms.
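
    The DWAM weight formula is not given in the abstract; as a rough sketch of weight-based matchmaking, incoming requests can be routed to the virtual machine with the largest free capacity (the weight used below, capacity minus current load, is an assumption):

        # Toy matchmaking/load-balancing loop; not the actual DWAM algorithm.
        vms = [
            {"id": "vm-1", "capacity": 8.0, "load": 0.0},
            {"id": "vm-2", "capacity": 4.0, "load": 0.0},
            {"id": "vm-3", "capacity": 8.0, "load": 0.0},
        ]

        def assign(task_size):
            """Send the task to the VM with the largest free capacity (dynamic weight)."""
            best = max(vms, key=lambda vm: vm["capacity"] - vm["load"])
            best["load"] += task_size
            return best["id"]

        incoming = [2.0, 1.0, 3.0, 2.0, 1.0, 2.0]
        placements = [assign(t) for t in incoming]
        print(placements)
        print([(vm["id"], vm["load"]) for vm in vms])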

  5. Wind resource estimation and siting of wind turbines

    DEFF Research Database (Denmark)

    Lundtang Petersen, Erik; Mortensen, N.G.; Landberg, L.

    1994-01-01

    Detailed knowledge of the characteristics of the natural wind is necessary for the design, planning and operational aspects of wind energy systems. Here, we shall only be concerned with those meteorological aspects of wind energy planning that are termed wind resource estimation. The estimation of the wind resource ranges from the overall estimation of the mean energy content of the wind over a large area (called regional assessment) to the prediction of the average yearly energy production of a specific wind turbine at a specific location (called siting). A regional assessment will most often ...

  6. Study on the Computational Estimation Performance and Computational Estimation Attitude of Elementary School Fifth Graders in Taiwan

    Science.gov (United States)

    Tsao, Yea-Ling; Pan, Ting-Rung

    2011-01-01

    The main purpose of this study is to investigate the level of computational estimation performance possessed by fifth graders and to explore their attitudes towards computational estimation. Two hundred and thirty-five Grade-5 students from four elementary schools in Taipei City were selected for the "Computational Estimation Test" and…

  7. Resource Provisioning in SLA-Based Cluster Computing

    Science.gov (United States)

    Xiong, Kaiqi; Suh, Sang

    Cluster computing is excellent for parallel computation. It has become increasingly popular. In cluster computing, a service level agreement (SLA) is a set of quality of services (QoS) and a fee agreed between a customer and an application service provider. It plays an important role in an e-business application. An application service provider uses a set of cluster computing resources to support e-business applications subject to an SLA. In this paper, the QoS includes percentile response time and cluster utilization. We present an approach for resource provisioning in such an environment that minimizes the total cost of cluster computing resources used by an application service provider for an e-business application that often requires parallel computation for high service performance, availability, and reliability while satisfying a QoS and a fee negotiated between a customer and the application service provider. Simulation experiments demonstrate the applicability of the approach.
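
    The paper's provisioning model is not reproduced here; a textbook way to frame the same question is to find the smallest number of cluster nodes whose queueing-delay percentile meets the SLA under an M/M/c assumption:

        import math

        def erlang_c(servers, offered_load):
            """Probability that an arriving job must wait in an M/M/c queue."""
            a = offered_load
            top = a**servers / math.factorial(servers) * servers / (servers - a)
            bottom = sum(a**k / math.factorial(k) for k in range(servers)) + top
            return top / bottom

        def min_servers(arrival_rate, service_rate, sla_seconds, percentile=0.95):
            """Smallest c whose p-th percentile queueing delay meets the SLA.

            A textbook M/M/c sketch, not the provisioning model of the paper;
            it ignores service-time variability beyond the exponential assumption.
            """
            a = arrival_rate / service_rate
            c = max(1, math.ceil(a) + 1)                 # start just above saturation
            while True:
                p_wait = erlang_c(c, a)
                # P(Wq > t) = p_wait * exp(-(c*mu - lambda) * t)
                tail = p_wait * math.exp(-(c * service_rate - arrival_rate) * sla_seconds)
                if tail <= 1.0 - percentile:
                    return c
                c += 1

        # Example: 40 jobs/s, each node handles 5 jobs/s, 95% of jobs should wait < 0.2 s.
        print(min_servers(arrival_rate=40.0, service_rate=5.0, sla_seconds=0.2))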

  8. Estimation of economic parameters of U.S. hydropower resources

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Douglas G. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Idaho National Engineering and Environmental Lab. (INEEL); Hunt, Richard T. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Idaho National Engineering and Environmental Lab. (INEEL); Reeves, Kelly S. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Idaho National Engineering and Environmental Lab. (INEEL); Carroll, Greg R. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Idaho National Engineering and Environmental Lab. (INEEL)

    2003-06-01

    Tools for estimating the cost of developing, operating and maintaining hydropower resources, in the form of regression curves, were developed based on historical plant data. Development costs that were addressed included licensing, construction, and five types of environmental mitigation. It was found that the data for each type of cost correlated well with plant capacity. A tool for estimating the annual and monthly electric generation of hydropower resources was also developed. Additional tools were developed to estimate the cost of upgrading a turbine or a generator. The development and operation and maintenance cost estimating tools, and the generation estimating tool, were applied to 2,155 U.S. hydropower sites representing a total potential capacity of 43,036 MW. The sites included totally undeveloped sites, dams without a hydroelectric plant, and hydroelectric plants that could be expanded to achieve greater capacity. Site characteristics and estimated costs and generation for each site were assembled in a database in Excel format that is also included within the EERE Library under the title, “Estimation of Economic Parameters of U.S. Hydropower Resources - INL Hydropower Resource Economics Database.”
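
    A regression curve of cost against capacity is commonly fitted as a power law, i.e. a straight line in log-log space. A sketch with invented plant data in place of the INL historical data set:

        import numpy as np

        # Fit a cost-vs-capacity regression curve of the form cost = a * capacity**b.
        # The plant data below are invented placeholders, not the historical data set.
        capacity_mw = np.array([5.0, 12.0, 30.0, 75.0, 150.0, 400.0])
        cost_musd = np.array([9.0, 18.0, 38.0, 80.0, 140.0, 320.0])

        b, log_a = np.polyfit(np.log(capacity_mw), np.log(cost_musd), 1)
        a = np.exp(log_a)
        print(f"cost ~= {a:.2f} * capacity^{b:.2f}  [M$ vs MW]")
        print(f"predicted cost for a 60 MW site: {a * 60.0**b:.1f} M$")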

  9. Security Aspects in Resource Management Systems in Distributed Computing Environments

    Directory of Open Access Journals (Sweden)

    Adamski Marcin

    2017-12-01

    Full Text Available In many distributed computing systems, aspects related to security are getting more and more relevant. Security is ubiquitous and cannot be treated as a separate problem or challenge. In our opinion it should be considered in the context of resource management in distributed computing environments like Grids and Clouds: e.g. scheduled computations can be much delayed because of cyber-attacks or inefficient infrastructure, or users' valuable and sensitive data can be stolen even in the process of correct computation. To prevent such cases there is a need to introduce new evaluation metrics for resource management that will represent the level of security of computing resources and, more broadly, of distributed computing infrastructures. In our approach, we have introduced a new metric called reputation, which simply determines the level of reliability of computing resources from the security perspective and could be taken into account during scheduling procedures. The new reputation metric is based on various relevant parameters regarding cyber-attacks (including energy attacks) and administrative activities such as security updates, bug fixes and security patches. Moreover, we have conducted various computational experiments within the Grid Scheduling Simulator environment (GSSIM), inspired by real application scenarios. Finally, our experimental studies of new resource management approaches taking into account critical security aspects are also discussed in this paper.
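
    The exact reputation formula is not given in the abstract; a toy version that combines attack counts and patching activity into a bounded score, which a scheduler could then threshold, might look like this (all weights are illustrative assumptions):

        # Toy reputation score for a computing resource; not the paper's actual metric.
        def reputation(attacks, energy_attacks, patches_applied, patches_pending):
            score = 1.0
            score -= 0.15 * attacks            # each recorded cyber-attack lowers trust
            score -= 0.10 * energy_attacks     # energy attacks weighted slightly less
            score += 0.02 * patches_applied    # timely updates raise trust
            score -= 0.05 * patches_pending    # outstanding fixes lower it
            return max(0.0, min(1.0, score))

        # A scheduler could then prefer resources above a reputation threshold.
        resources = {"site-A": reputation(0, 0, 10, 1), "site-B": reputation(2, 1, 3, 4)}
        eligible = [name for name, rep in resources.items() if rep >= 0.5]
        print(resources, eligible)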

  10. Data-centric computing on distributed resources

    NARCIS (Netherlands)

    Cushing, R.S.

    2015-01-01

    Distributed computing has always been a challenge due to the NP-completeness of finding optimal underlying management routines. The advent of big data increases the dimensionality of the problem whereby data partitionability, processing complexity and locality play a crucial role in the

  11. Human Resource Management, Computers, and Organization Theory.

    Science.gov (United States)

    Garson, G. David

    In an attempt to provide a framework for research and theory building in public management information systems (PMIS), state officials responsible for computing in personnel operations were surveyed. The data were applied to hypotheses arising from a recent model by Bozeman and Bretschneider, attempting to relate organization theory to management…

  12. Standardizing Access to Computer-Based Medical Resources

    Science.gov (United States)

    Cimino, Christopher; Barnett, G. Octo

    1990-01-01

    Methods of using computer-based medical resources efficiently have previously required either the user to manage the choice of resource and terms, or specialized programming to access each individual resource. Standardized descriptions of what resources can do and how they may be accessed would allow the creation of an interface for multiple resources. This interface would assist a user in formulating queries, accessing the resources and managing the results. This paper describes a working Interactive Query Workstation (IQW). The IQW allows users to query multiple resources: a medical knowledge base (DXplain*), a clinical database (COSTAR/MQL*), a bibliographic database (MEDLINE*), a cancer database (PDQ*), and a drug interaction database (PDR*). The IQW has evolved from requiring alteration of resource code to using off-the-shelf products (Kappa* & Microsoft® Windows) to control resources. Descriptions of each resource were developed to allow IQW to access these resources. There are three components to these descriptions: information on how data are sent to and received from a resource, information on the types of queries to which a resource can respond, and information on what types of information are needed to execute a query. These components form the basis of a standard description of resources.
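
    The three description components named above (how data are exchanged with a resource, which query types it answers, and what inputs each query needs) could be captured in a simple record such as the sketch below. The field names and example values are assumptions for illustration, not the actual IQW description format.

      # Illustrative sketch of a standardized resource description with the three
      # components named in the abstract; field names are assumptions, not the IQW schema.
      from dataclasses import dataclass
      from typing import Dict, List

      @dataclass
      class ResourceDescription:
          name: str
          transport: Dict[str, str]               # how data are sent to / received from the resource
          query_types: List[str]                  # kinds of queries the resource can answer
          required_inputs: Dict[str, List[str]]   # per query type, the information needed

      medline = ResourceDescription(
          name="MEDLINE",
          transport={"protocol": "z39.50", "encoding": "MARC"},   # hypothetical values
          query_types=["bibliographic-search"],
          required_inputs={"bibliographic-search": ["keywords", "date-range"]},
      )

      def can_answer(desc: ResourceDescription, query_type: str, available: List[str]) -> bool:
          """An interface could use the description to decide whether a resource fits a query."""
          needed = desc.required_inputs.get(query_type)
          return needed is not None and all(item in available for item in needed)

      print(can_answer(medline, "bibliographic-search", ["keywords", "date-range", "author"]))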

  13. Decentralized Resource Management in Distributed Computer Systems.

    Science.gov (United States)

    1982-02-01

    Fragmentary excerpt recovered from the report: Interprocess Communication; Decentralized Resource Management; MicroNet, System Goals and Objectives. MicroNet [47] was designed to support multiple ..., tolerate the loss of nodes, allow for a wide variety of interconnect topologies, and adapt to dynamic variations in loading. The designers of MicroNet ...

  14. Speculative resources of uranium. A review of International Uranium Resources Evaluation Project (IUREP) estimates 1982-1983

    International Nuclear Information System (INIS)

    1983-01-01

    On a country-by-country basis, the International Uranium Resources Evaluation Project (IUREP) estimates 1982-1983 are reviewed. Information provided includes exploration work, airborne surveys, radiometric surveys, gamma-ray spectrometric surveys, estimates of speculative resources, uranium occurrences, uranium deposits, uranium mineralization, agreements for uranium exploration, feasibility studies, geological classification of resources, proposed revised resource ranges, and production estimates of uranium

  15. Argonne Laboratory Computing Resource Center - FY2004 Report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.

    2005-04-14

    In the spring of 2002, Argonne National Laboratory founded the Laboratory Computing Resource Center, and in April 2003 LCRC began full operations with Argonne's first teraflops computing cluster. The LCRC's driving mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting application use and development. This report describes the scientific activities, computing facilities, and usage in the first eighteen months of LCRC operation. In this short time LCRC has had broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. Steering for LCRC comes from the Computational Science Advisory Committee, composed of computing experts from many Laboratory divisions. The CSAC Allocations Committee makes decisions on individual project allocations for Jazz.

  16. Performance Evaluation of Resource Management in Cloud Computing Environments.

    Science.gov (United States)

    Batista, Bruno Guazzelli; Estrella, Julio Cezar; Ferreira, Carlos Henrique Gomes; Filho, Dionisio Machado Leite; Nakamura, Luis Hideo Vasconcelos; Reiff-Marganiec, Stephan; Santana, Marcos José; Santana, Regina Helena Carlucci

    2015-01-01

    Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.

  17. Performance Evaluation of Resource Management in Cloud Computing Environments.

    Directory of Open Access Journals (Sweden)

    Bruno Guazzelli Batista

    Full Text Available Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.

  18. Estimation of Gold Resources from Exploration Drilling using ...

    African Journals Online (AJOL)

    Estimation of gold resources from exploration drilling has passed through various phases and methods at the AngloGold Ashanti Iduapriem Mine Limited, Tarkwa, in Ghana, from the use of Inverse Distance Weighting (IDW) to the use of Ordinary Kriging (OK) and currently the use of Uniform Conditioning (UC). This is all ...
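
    Of the methods named, Inverse Distance Weighting is simple enough to sketch directly: a block grade is estimated as a distance-weighted average of nearby drillhole composites. The coordinates, grades and power parameter below are invented for illustration and are unrelated to the Iduapriem data.

      # Minimal Inverse Distance Weighting (IDW) sketch for grade estimation.
      # Drillhole coordinates and gold grades (g/t) are invented for illustration.
      import numpy as np

      samples = np.array([       # x, y, grade
          [100.0, 200.0, 1.8],
          [140.0, 210.0, 2.4],
          [ 90.0, 260.0, 0.9],
          [160.0, 250.0, 3.1],
      ])

      def idw_estimate(target_xy, samples, power=2.0):
          """Grade at target_xy as a distance-weighted average of the sample grades."""
          d = np.linalg.norm(samples[:, :2] - np.asarray(target_xy), axis=1)
          if np.any(d == 0):                     # target coincides with a sample point
              return samples[d == 0, 2][0]
          w = 1.0 / d ** power
          return float(np.sum(w * samples[:, 2]) / np.sum(w))

      print(round(idw_estimate((120.0, 230.0), samples), 2))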

  19. Estimation of Total Tree Height from Renewable Resources Evaluation Data

    Science.gov (United States)

    Charles E. Thomas

    1981-01-01

    Many ecological, biological, and genetic studies use the measurement of total tree height. Until recently, the Southern Forest Experiment Station's inventory procedures through Renewable Resources Evaluation (RRE) have not included total height measurements. This note provides equations to estimate total height based on other RRE measurements.

  20. Wind Resource Estimation using QuikSCAT Ocean Surface Winds

    DEFF Research Database (Denmark)

    Xu, Qing; Zhang, Guosheng; Cheng, Yongcun

    2011-01-01

    In this study, the offshore wind resources in the East China Sea and South China Sea were estimated from over ten years of QuikSCAT scatterometer wind products. Since the errors of these products are larger close to the coast due to the land contamination of radar backscatter signal and the compl...

  1. Wind Resource Estimation using QuikSCAT Ocean Surface Winds

    DEFF Research Database (Denmark)

    Xu, Qing; Zhang, Guosheng; Cheng, Yongcun

    2011-01-01

    In this study, the offshore wind resources in the East China Sea and South China Sea were estimated from over ten years of QuikSCAT scatterometer wind products. Since the errors of these products are larger close to the coast due to the land contamination of radar backscatter signal...

  2. Integration of cloud resources in the LHCb distributed computing

    International Nuclear Information System (INIS)

    García, Mario Úbeda; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel; Muñoz, Víctor Méndez

    2014-01-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb uses its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing, which so far seamlessly integrated Grid resources and computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack), and it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, the pros and cons of a cloud-based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we also describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.

  3. Integration of Cloud resources in the LHCb Distributed Computing

    Science.gov (United States)

    Úbeda García, Mario; Méndez Muñoz, Víctor; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-06-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb uses its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing, which so far seamlessly integrated Grid resources and computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack), and it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, the pros and cons of a cloud-based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we also describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.

  4. Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources

    International Nuclear Information System (INIS)

    Evans, D; Fisk, I; Holzman, B; Pordes, R; Tiradani, A; Melo, A; Sheldon, P; Metson, S

    2011-01-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly 'on-demand' as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a University, and conclude that it is most cost-effective to purchase dedicated resources for the 'base-line' needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage are required.
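
    The 'purchase dedicated resources for the base-line, burst to the cloud' conclusion can be illustrated with a back-of-the-envelope comparison of annual cost versus utilisation. The prices and utilisation figures below are hypothetical and are not those from the CMS trial.

      # Back-of-the-envelope comparison of dedicated vs on-demand cloud CPU cost.
      # All prices and utilisation figures are hypothetical illustrations.
      HOURS_PER_YEAR = 8760

      dedicated_cost_per_core_year = 300.0   # purchase + power + admin, amortised (assumed)
      cloud_price_per_core_hour = 0.10       # on-demand rate (assumed)

      def cheaper_option(utilisation: float) -> str:
          """Return which option is cheaper for a core used this fraction of the year."""
          cloud_cost = cloud_price_per_core_hour * HOURS_PER_YEAR * utilisation
          return "dedicated" if dedicated_cost_per_core_year < cloud_cost else "cloud"

      for u in (0.1, 0.3, 0.5, 0.9):   # base-line load is highly utilised; bursts are not
          print(f"utilisation {u:.0%}: {cheaper_option(u)}")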

  5. Estimating uranium resources and production. A guide to future supply

    International Nuclear Information System (INIS)

    Taylor, D.M.; Haeussermann, W.

    1983-01-01

    Nuclear power can only continue to grow if sufficient fuel, uranium, is available. Concern has been expressed that, in the not too distant future, the supply of uranium may be inadequate to meet reactor development. This will not be the case. Uranium production capability, actual and planned, is the main indicator of short- and medium-term supply. However, for the longer term, uranium resource estimates and projections of the possible rate of production from the resource base are important. Once an estimate has been made of the resources contained in a deposit, several factors influence the decision to produce the uranium and also the rates at which the uranium can be produced. The effect of these factors, which include uranium market trends and ever increasing lead times from discovery to production, must be taken into account when making projections of future production capability and before comparing these with forecasts of future uranium requirements. The uranium resource base has developed over the last two decades mainly in response to dramatically changing projections of natural uranium requirements. A study of this development and the changes in production, together with the most recent data, shows that in the short- and medium-term, production from already discovered resources should be sufficient to cover any likely reactor requirements. Studies such as those undertaken during the International Uranium Resources Evaluation Project, and others which project future discovery rates and production, are supported by past experience in resource development in showing that uranium supply could continue to meet demand until well into the next century. The uranium supply potential has lessened the need for the early large-scale global introduction of the breeder reactor

  6. Dynamic computing resource allocation in online flood monitoring and prediction

    Science.gov (United States)

    Kuchar, S.; Podhoranyi, M.; Vavrik, R.; Portero, A.

    2016-08-01

    This paper presents tools and methodologies for dynamic allocation of high performance computing resources during operation of the Floreon+ online flood monitoring and prediction system. The resource allocation is done throughout the execution of supported simulations to meet the required service quality levels for system operation. It also ensures flexible reactions to changing weather and flood situations, as it is not economically feasible to operate online flood monitoring systems in the full performance mode during non-flood seasons. Different service quality levels are therefore described for different flooding scenarios, and the runtime manager controls them by allocating only minimal resources currently expected to meet the deadlines. Finally, an experiment covering all presented aspects of computing resource allocation in rainfall-runoff and Monte Carlo uncertainty simulation is performed for the area of the Moravian-Silesian region in the Czech Republic.
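
    A runtime manager of the kind described can be thought of as choosing the smallest allocation whose predicted runtime still meets the deadline attached to the current service quality level. The sketch below assumes an Amdahl-style runtime model and invented numbers; it is not the Floreon+ implementation.

      # Sketch of deadline-driven resource allocation: pick the smallest core count
      # whose predicted simulation runtime still meets the current deadline.
      # The runtime model (Amdahl-style scaling) and all numbers are assumptions.
      from typing import Optional

      def predicted_runtime(cores: int, serial_fraction: float = 0.05,
                            single_core_minutes: float = 240.0) -> float:
          """Amdahl's-law estimate of the runtime in minutes on the given number of cores."""
          return single_core_minutes * (serial_fraction + (1.0 - serial_fraction) / cores)

      def minimal_allocation(deadline_minutes: float, available_cores: int = 128) -> Optional[int]:
          """Smallest core count meeting the deadline, or None if it cannot be met."""
          for cores in range(1, available_cores + 1):
              if predicted_runtime(cores) <= deadline_minutes:
                  return cores
          return None

      # Non-flood season: relaxed deadline; flood warning: tight deadline.
      print(minimal_allocation(deadline_minutes=120))   # relaxed quality level
      print(minimal_allocation(deadline_minutes=30))    # strict quality level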

  7. Assessment of Computer Software Usage for Estimating and Tender ...

    African Journals Online (AJOL)

    It has been discovered that there are limitations to the use of computer software packages in construction operations, especially estimating and tender analysis. The objective of this research is to evaluate the level of computer software usage for estimating and tender analysis while also assessing the challenges faced by ...

  8. The Computational Estimation and Instructional Perspectives of Elementary School Teachers

    Science.gov (United States)

    Tsao, Yea-Ling; Pan, Ting-Rung

    2013-01-01

    The purpose of this study is to investigate teachers' understanding and knowledge of computational estimation and their teaching practice with regard to computational estimation. Six fifth-grade elementary teachers participated in this study: three with a mathematics/science major and three with a non-mathematics/science major.…

  9. Dose estimation for paediatric cranial computed tomography

    International Nuclear Information System (INIS)

    Curci Daros, K.A.; Bitelli Medeiros, R.; Curci Daros, K.A.; Oliveira Echeimberg, J. de

    2006-01-01

    In the last ten years, the number of paediatric computed tomography (CT) scans has increased worldwide, contributing to a higher population radiation dose. Technique diversification in paediatrics and different CT equipment technologies have led to various exposure levels, complicating the precise evaluation of doses and of the operational conditions necessary for good quality images. The objective of this study was to establish a quantitative relationship between absorbed dose and cranial region in children up to 6 years old undergoing CT exams. Methods: X-ray exposure was measured on the cranial surface of 64 patients undergoing CT using thermoluminescent (T.L.) dosimeters. Forty T.L.D.100 thermoluminescent dosimeters (T.L.D.) were evenly distributed on each patient's skin surface along the sagittal axis. Measurements were performed in facial regions exposed to scatter radiation and in the supratentorial and posterior fossa regions, submitted to primary radiation. T.L.D. were calibrated for 120 kV X-ray over the acrylic phantom. T.L. measurements were made with a Harshaw 4000 system. Patient mean T.L. readings were determined for each position, p_i, of the T.L.D. and normalized to the maximum supratentorial reading. By integrating the linear T.L. density function (ρ) resulting from the radiation distribution in each of the three exposed regions, the dose fraction was determined in the region of interest, along with the total dose under the technical conditions used in that specific exam protocol. For each T.L.D. position along the patient cranium, there were n T.L. measurements with 2% uncertainty due to the T.L. reader and 5% due to the thermal treatment of the dosimeters. Also, mean T.L. readings and their uncertainties were calculated for each patient at each position, p. Results: The mean linear T.L. density for the region exposed to secondary radiation, defined by position 0.3 ≤ p ≤ 6 cm, was ρ(p) = 7.9(4)×10^-2 + 7(5)×10^-5 p^4.5(4) cm^-1; exposed to primary X-ray for the posterior fossa region defined by position

  10. Dose estimation for paediatric cranial computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Curci Daros, K.A.; Bitelli Medeiros, R. [Sao Paulo Univ. Federal (Brazil); Curci Daros, K.A.; Oliveira Echeimberg, J. de [Centro Univ. Sao Camilo, Sao Paulo (Brazil)

    2006-07-01

    In the last ten years, the number of paediatric computed tomography (CT) scans has increased worldwide, contributing to a higher population radiation dose. Technique diversification in paediatrics and different CT equipment technologies have led to various exposure levels, complicating the precise evaluation of doses and of the operational conditions necessary for good quality images. The objective of this study was to establish a quantitative relationship between absorbed dose and cranial region in children up to 6 years old undergoing CT exams. Methods: X-ray exposure was measured on the cranial surface of 64 patients undergoing CT using thermoluminescent (T.L.) dosimeters. Forty T.L.D.100 thermoluminescent dosimeters (T.L.D.) were evenly distributed on each patient's skin surface along the sagittal axis. Measurements were performed in facial regions exposed to scatter radiation and in the supratentorial and posterior fossa regions, submitted to primary radiation. T.L.D. were calibrated for 120 kV X-ray over the acrylic phantom. T.L. measurements were made with a Harshaw 4000 system. Patient mean T.L. readings were determined for each position, p_i, of the T.L.D. and normalized to the maximum supratentorial reading. By integrating the linear T.L. density function (ρ) resulting from the radiation distribution in each of the three exposed regions, the dose fraction was determined in the region of interest, along with the total dose under the technical conditions used in that specific exam protocol. For each T.L.D. position along the patient cranium, there were n T.L. measurements with 2% uncertainty due to the T.L. reader and 5% due to the thermal treatment of the dosimeters. Also, mean T.L. readings and their uncertainties were calculated for each patient at each position, p. Results: The mean linear T.L. density for the region exposed to secondary radiation, defined by position 0.3 ≤ p ≤ 6 cm, was ρ(p) = 7.9(4)×10^-2 + 7(5)×10^-5 p^4.5(4) cm^-1; exposed to primary X-ray for the posterior fossa
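
    As a purely illustrative sketch of the integration step mentioned in the abstract, the reconstructed mean linear T.L. density for the scatter-radiation region, ρ(p) = 7.9×10^-2 + 7×10^-5 p^4.5 cm^-1 on 0.3 ≤ p ≤ 6 cm, can be integrated numerically over its position interval. Ignoring the quoted uncertainties and treating the integral simply as that region's contribution to the normalised T.L. signal are assumptions made here for illustration.

      # Numerical integration of the reconstructed linear T.L. density for the
      # scatter-radiation region: rho(p) = 7.9e-2 + 7e-5 * p**4.5 (1/cm), 0.3 <= p <= 6 cm.
      # Uncertainties in the fitted coefficients are ignored in this sketch.
      from scipy.integrate import quad

      def rho_scatter(p):
          return 7.9e-2 + 7e-5 * p ** 4.5

      integral, err = quad(rho_scatter, 0.3, 6.0)
      print(f"integral of rho over [0.3, 6] cm = {integral:.3f} (quad error ~ {err:.1e})")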

  11. Using OSG Computing Resources with (iLC)Dirac

    CERN Document Server

    AUTHOR|(SzGeCERN)683529; Petric, Marko

    2017-01-01

    CPU cycles for small experiments and projects can be scarce, thus making use of all available resources, whether dedicated or opportunistic, is mandatory. While enabling uniform access to the LCG computing elements (ARC, CREAM), the DIRAC grid interware was not able to use OSG computing elements (GlobusCE, HTCondor-CE) without dedicated support at the grid site through so-called 'SiteDirectors', which submit directly to the local batch system. This in turn requires additional dedicated effort for small experiments at the grid site. Adding interfaces to the OSG CEs through the respective grid middleware therefore allows accessing them from within the DIRAC software without additional site-specific infrastructure. This enables greater use of opportunistic resources for experiments and projects without dedicated clusters or an established computing infrastructure with the DIRAC software. To allow sending jobs to HTCondor-CE and legacy Globus computing elements inside DIRAC the required wrapper classes were develo...

  12. Integration of Openstack cloud resources in BES III computing cluster

    Science.gov (United States)

    Li, Haibo; Cheng, Yaodong; Huang, Qiulan; Cheng, Zhenjing; Shi, Jingyan

    2017-10-01

    Cloud computing provides a new technical means for the data processing of high energy physics experiments. However, the resources of each queue are fixed and resource usage is static in traditional job management systems. In order to make it simple and transparent for physicists to use, we developed a virtual cluster system (vpmanager) to integrate IHEPCloud and different batch systems such as Torque and HTCondor. Vpmanager provides dynamic virtual machine scheduling according to the job queue. The BES III use case results show that resource efficiency is greatly improved.

  13. Estimation of wind and solar resources in Mali

    Energy Technology Data Exchange (ETDEWEB)

    Badger, J.; Kamissoko, F.; Olander Rasmussen, M.; Larsen, Soeren; Guidon, N.; Boye Hansen, L.; Dewilde, L.; Alhousseini, M.; Noergaard, P.; Nygaard, I.

    2012-11-15

    The wind resource has been estimated for all of Mali at 7.5 km resolution using the KAMM/WAsP numerical wind atlas methodology. Three domains were used to cover the entire country, and three sets of wind classes were used to capture the change in large-scale forcing across the country. The final output includes generalized climate statistics for any location in Mali, giving the wind direction and wind speed distribution. The modelled generalized climate statistics can be used directly in the WAsP software. The preliminary results show a wind resource which is relatively low but which may, under certain conditions, be economically feasible, i.e. at favourably exposed sites with enhanced winds and where practical utilization is possible, taking into consideration grid connection or the replacement or augmentation of diesel-based electricity systems. The solar energy resource for Mali was assessed for the period between July 2008 and June 2011 using a remote-sensing-based estimate of the down-welling surface shortwave flux. The remote sensing estimates were adjusted on a month-by-month basis to account for seasonal differences between the remote sensing estimates and in situ data. Calibration was found to improve the coefficient of determination as well as to decrease the mean error for both the calibration and validation data. Compared to the results presented in the ''Renewable energy resources in Mali - preliminary mapping'' report, which showed a tendency for underestimation relative to data from the NASA POWER/SSE database, the presented results show a very good agreement with the in situ data (after calibration) with no significant bias. Unfortunately, the NASA database only contains data up until 2005, so a similar comparison could not be done for the time period analyzed in this study, although the agreement with the historic NASA data is still useful as a reference. (LN)
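
    Generalized wind climate statistics of the kind produced by the KAMM/WAsP methodology are commonly summarised per direction sector as Weibull scale and shape parameters. The sketch below turns such parameters into a mean wind power density; the values of A and k are assumptions for illustration, not results for Mali.

      # Sketch: mean wind power density from Weibull parameters (A: scale in m/s, k: shape).
      # The values of A and k are assumptions for illustration, not results for Mali.
      from math import gamma

      AIR_DENSITY = 1.225  # kg/m^3, standard sea-level value

      def mean_power_density(A: float, k: float) -> float:
          """Mean wind power density in W/m^2 for a Weibull wind speed distribution."""
          mean_cubed_speed = A ** 3 * gamma(1.0 + 3.0 / k)   # E[U^3] for a Weibull law
          return 0.5 * AIR_DENSITY * mean_cubed_speed

      print(round(mean_power_density(A=5.5, k=2.0), 1))   # modest resource, for illustration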

  14. Using OSG Computing Resources with (iLC)Dirac

    Science.gov (United States)

    Sailer, A.; Petric, M.; CLICdp collaboration

    2017-10-01

    CPU cycles for small experiments and projects can be scarce, thus making use of all available resources, whether dedicated or opportunistic, is mandatory. While enabling uniform access to the LCG computing elements (ARC, CREAM), the DIRAC grid interware was not able to use OSG computing elements (GlobusCE, HTCondor-CE) without dedicated support at the grid site through so-called ‘SiteDirectors’, which submit directly to the local batch system. This in turn requires additional dedicated effort for small experiments at the grid site. Adding interfaces to the OSG CEs through the respective grid middleware therefore allows accessing them from within the DIRAC software without additional site-specific infrastructure. This enables greater use of opportunistic resources for experiments and projects without dedicated clusters or an established computing infrastructure with the DIRAC software. To allow sending jobs to HTCondor-CE and legacy Globus computing elements inside DIRAC the required wrapper classes were developed. Not only is the usage of these types of computing elements now completely transparent for all DIRAC instances, which makes DIRAC a flexible solution for OSG-based virtual organisations, but it also allows LCG Grid Sites to move to the HTCondor-CE software without shutting DIRAC-based VOs out of their site. In these proceedings we detail how we interfaced the DIRAC system to the HTCondor-CE and Globus computing elements, explain the obstacles encountered and the solutions developed, and describe how the linear collider community uses resources in the OSG.

  15. Computer-aided resource planning and scheduling for radiological services

    Science.gov (United States)

    Garcia, Hong-Mei C.; Yun, David Y.; Ge, Yiqun; Khan, Javed I.

    1996-05-01

    There exists tremendous opportunity in hospital-wide resource optimization based on system integration. This paper defines the resource planning and scheduling requirements integral to PACS, RIS and HIS integration. An multi-site case study is conducted to define the requirements. A well-tested planning and scheduling methodology, called Constrained Resource Planning model, has been applied to the chosen problem of radiological service optimization. This investigation focuses on resource optimization issues for minimizing the turnaround time to increase clinical efficiency and customer satisfaction, particularly in cases where the scheduling of multiple exams are required for a patient. How best to combine the information system efficiency and human intelligence in improving radiological services is described. Finally, an architecture for interfacing a computer-aided resource planning and scheduling tool with the existing PACS, HIS and RIS implementation is presented.

  16. State of the art on wind resource estimation

    Energy Technology Data Exchange (ETDEWEB)

    Maribo Pedersen, B.

    1998-12-31

    With the increasing number of wind resource estimation studies carried out for regions, countries and even larger areas all over the world, the IEA finds that the time has come to stop and take stock of the various methods used in these studies. The IEA would therefore like to propose an Experts Meeting on wind resource estimation. The Experts Meeting should describe the models and databases used in the various studies. It should shed light on the strengths and shortcomings of the models and answer questions such as: where and under what circumstances should a specific model be used, what is the expected accuracy of its estimates, and what is its applicability? When addressing databases, the main goal will be to identify their content and scope. Further, the quality, availability and reliability of the databases must also be recognised. In the various studies of wind resources the models and databases have been combined in different ways. A final goal of the Experts Meeting is to see whether it is possible to develop systems of methods which would depend on the available input. These systems of methods should be able to address cases ranging from the simple case (level 0) of a region with hardly any data to the complex case of a region with all available measurements: surface observations, radio soundings, satellite observations and so on. The outcome of the meeting should be an inventory of available models as well as databases and a map of already studied regions. (au)

  17. Estimation of lung growth using computed tomography

    NARCIS (Netherlands)

    P.A. de Jong (Pim); Y. Nakano (Yasutaka); M.H. Lequin (Maarten); P.J.F.M. Merkus (Peter); H.A.W.M. Tiddens (Harm); J.C. Hogg (James); H.O. Coxson (Harvey)

    2003-01-01

    Anatomical studies suggest that normal lungs grow by rapid alveolar addition until about 2 yrs of age followed by a gradual increase in alveolar dimensions. The aim of this study was to examine the hypothesis that normal lung growth can be monitored by computed

  18. Estimating Computer-Based Training Development Times

    Science.gov (United States)

    1987-10-14

    Fragmentary text recovered from the report: Sonia Gunderson, Scientific Systems, Inc.; ARI Field Unit at Fort Knox, Kentucky; Donald F. Haggard, Chief, Training Research Laboratory; Jack H. Hiller ... formative evaluation ... programming routines, writing lessons, computer down-time, programming lessons, learning content, meetings, video production

  19. Active resources concept of computation for enterprise software

    Directory of Open Access Journals (Sweden)

    Koryl Maciej

    2017-06-01

    Full Text Available Traditional computational models for enterprise software are still to a great extent centralized. However, the rapid growth of modern computation techniques and frameworks means that contemporary software is becoming more and more distributed. Towards the development of a new, complete and coherent solution for distributed enterprise software construction, a synthesis of three well-grounded concepts is proposed: the Domain-Driven Design technique of software engineering, the REST architectural style and the actor model of computation. As a result, a new resources-based framework arises which, after its first cases of use, seems to be useful and worthy of further research.

  20. A Genetic Programming infrastructure profiting from public computation resources

    Energy Technology Data Exchange (ETDEWEB)

    Chavez de la O, F.; Rubio del Solar, M.; Guisado, J. L.; Lombrana Gonzalez, D.; Cardenas Montes, M.; Fernandez de la Vega, F.

    2007-07-01

    In this article an experience of the utilization of PRC (Public Resource Computation) in research projects that need large quantities of CPU time is presented. We have developed a distributed architecture based on the BOINC middleware and the LilGP Genetic Programming tool. In order to run LilGP applications under BOINC platforms, some core LilGP functions have been adapted to BOINC requirements. We have used a classic GP problem known as the artificial ant on the Santa Fe Trail. Some computers from a classroom were used as clients, proving that they can be used for scientific computation in conjunction with their primary uses. (Author)

  1. Estimation of intermediate grade uranium resources. Final report

    International Nuclear Information System (INIS)

    Lambie, F.W.; Kendall, G.R.; Klahn, L.J.; Davis, J.C.; Harbaugh, J.W.

    1980-12-01

    The purpose of this project is to analyze the technique currently used by DOE to estimate intermediate grade uranium (0.01 to 0.05% U3O8) and, if possible, suggest alternatives to improve the accuracy and precision of the estimate. There are three principal conclusions resulting from this study. They relate to the quantity, distribution and sampling of intermediate grade uranium. While the results of this study must be validated further, they indicate that DOE may be underestimating intermediate level reserves by 20 to 30%. Plots of grade of U3O8 versus tonnage of ore and tonnage of U3O8 indicate grade-tonnage relationships that are essentially log-linear, at least down to 0.01% U3O8. Though this is not an unexpected finding, it may provide a technique for reducing the uncertainty of intermediate grade endowment. The results of this study indicate that a much lower drill hole density is necessary for DOE to estimate uranium resources than for a mining company to calculate ore resources. Though errors in local estimates will occur, they will tend to cancel over the entire deposit

  2. Recent development of computational resources for new antibiotics discovery

    DEFF Research Database (Denmark)

    Kim, Hyun Uk; Blin, Kai; Lee, Sang Yup

    2017-01-01

    Understanding a complex working mechanism of biosynthetic gene clusters (BGCs) encoding secondary metabolites is a key to discovery of new antibiotics. Computational resources continue to be developed in order to better process increasing volumes of genome and chemistry data, and thereby better...

  3. Can the Teachers' Creativity Overcome Limited Computer Resources?

    Science.gov (United States)

    Nikolov, Rumen; Sendova, Evgenia

    1988-01-01

    Describes experiences of the Research Group on Education (RGE) at the Bulgarian Academy of Sciences and the Ministry of Education in using limited computer resources when teaching informatics. Topics discussed include group projects; the use of Logo; ability grouping; and out-of-class activities, including publishing a pupils' magazine. (13…

  4. GridFactory - Distributed computing on ephemeral resources

    DEFF Research Database (Denmark)

    Orellana, Frederik; Niinimaki, Marko

    2011-01-01

    A novel batch system for high throughput computing is presented. The system is specifically designed to leverage virtualization and web technology to facilitate deployment on cloud and other ephemeral resources. In particular, it implements a security model suited for forming collaborations...

  5. Quantum Computing: Selected Internet Resources for Librarians, Researchers, and the Casually Curious

    OpenAIRE

    Cirasella, Jill

    2009-01-01

    This article is an annotated selection of the most important and informative Internet resources for learning about quantum computing, finding quantum computing literature, and tracking quantum computing news.

  6. Computing Resource And Work Allocations Using Social Profiles

    Directory of Open Access Journals (Sweden)

    Peter Lavin

    2013-01-01

    Full Text Available If several distributed and disparate computer resources exist, many of which have been created for different and diverse reasons, and several large scale computing challenges also exist with similar diversity in their backgrounds, then one problem which arises in trying to assemble enough of these resources to address such challenges is the need to align and accommodate the different motivations and objectives which may lie behind the existence of both the resources and the challenges. Software agents are offered as a mainstream technology for modelling the types of collaborations and relationships needed to do this. As an initial step towards forming such relationships, agents need a mechanism to consider social and economic backgrounds. This paper explores addressing social and economic differences using a combination of textual descriptions known as social profiles and search engine technology, both of which are integrated into an agent technology.

  7. Towards minimal resources of measurement-based quantum computation

    International Nuclear Information System (INIS)

    Perdrix, Simon

    2007-01-01

    We improve the upper bound on the minimal resources required for measurement-only quantum computation (M A Nielsen 2003 Phys. Rev. A 308 96-100; D W Leung 2004 Int. J. Quantum Inform. 2 33; S Perdrix 2005 Int. J. Quantum Inform. 3 219-23). Minimizing the resources required for this model is a key issue for experimental realization of a quantum computer based on projective measurements. This new upper bound also allows one to reply in the negative to the open question presented by Perdrix (2004 Proc. Quantum Communication Measurement and Computing) about the existence of a trade-off between observable and ancillary qubits in measurement-only QC

  8. Integration of computation and testing for reliability estimation

    International Nuclear Information System (INIS)

    Zhang Ruoxue; Mahadevan, Sankaran

    2001-01-01

    This paper develops a methodology to integrate reliability testing and computational reliability analysis for product development. The presence of information uncertainty such as statistical uncertainty and modeling error is incorporated. The integration of testing and computation leads to a more cost-efficient estimation of failure probability and life distribution than the tests-only approach currently followed by the industry. A Bayesian procedure is proposed to quantify the modeling uncertainty using random parameters, including the uncertainty in mechanical and statistical model selection and the uncertainty in distribution parameters. An adaptive method is developed to determine the number of tests needed to achieve a desired confidence level in the reliability estimates, by combining prior computational prediction and test data. Two kinds of tests -- failure probability estimation and life estimation -- are considered. The prior distribution and confidence interval of failure probability in both cases are estimated using computational reliability methods, and are updated using the results of tests performed during the product development phase
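
    For the failure probability case, one standard way to realise the Bayesian updating described is a Beta prior, standing in for the computational prediction, updated with binomial test outcomes. The prior parameters and test counts below are illustrative assumptions, not values from the paper.

      # Sketch of Bayesian updating of a failure probability with test data.
      # A Beta(a, b) prior (standing in for the computational prediction) is updated
      # with n tests and x observed failures; all numbers are illustrative assumptions.
      from scipy.stats import beta

      a_prior, b_prior = 2.0, 98.0          # prior mean failure probability ~ 0.02
      n_tests, failures = 30, 1             # results of the reliability test campaign

      a_post = a_prior + failures
      b_post = b_prior + (n_tests - failures)

      posterior = beta(a_post, b_post)
      lo, hi = posterior.interval(0.95)
      print(f"posterior mean = {posterior.mean():.4f}, 95% interval = ({lo:.4f}, {hi:.4f})")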

  9. Computing Bounds on Resource Levels for Flexible Plans

    Science.gov (United States)

    Muscettola, Nicola; Rijsman, David

    2009-01-01

    A new algorithm efficiently computes the tightest exact bound on the levels of resources induced by a flexible activity plan (see figure). Tightness of bounds is extremely important for computations involved in planning because tight bounds can save potentially exponential amounts of search (through early backtracking and detection of solutions), relative to looser bounds. The bound computed by the new algorithm, denoted the resource-level envelope, constitutes the measure of maximum and minimum consumption of resources at any time for all fixed-time schedules in the flexible plan. At each time, the envelope guarantees that there are two fixed-time instantiations: one that produces the minimum level and one that produces the maximum level. Therefore, the resource-level envelope is the tightest possible resource-level bound for a flexible plan because any tighter bound would exclude the contribution of at least one fixed-time schedule. If the resource-level envelope can be computed efficiently, one could substitute looser bounds that are currently used in the inner cores of constraint-posting scheduling algorithms, with the potential for great improvements in performance. What is needed to reduce the cost of computation is an algorithm, the measure of complexity of which is no greater than a low-degree polynomial in N (where N is the number of activities). The new algorithm satisfies this need. In this algorithm, the computation of resource-level envelopes is based on a novel combination of (1) the theory of shortest paths in the temporal-constraint network for the flexible plan and (2) the theory of maximum flows for a flow network derived from the temporal and resource constraints. The measure of asymptotic complexity of the algorithm is O(N · O(maxflow(N))), where O(x) denotes an amount of computing time or a number of arithmetic operations proportional to a number of the order of x and O(maxflow(N)) is the measure of complexity (and thus of cost) of a maximum-flow

  10. Towards a Resource Reservation Approach for an Opportunistic Computing Environment

    International Nuclear Information System (INIS)

    Gomes, Eliza; Dantas, M A R

    2014-01-01

    Advanced reservation has been used in grid environments to provide quality of service (QoS) and to guarantee that resources are available at execution time. However, in grid subtypes such as opportunistic grid computing, it is a challenge to provide QoS and to guarantee resource availability. In this article, we propose a new advanced reservation approach which offers the user the possibility of selecting resources in advance for future utilization. Therefore, the main goal of this proposal is to offer a best-effort feature to users of an opportunistic configuration. In these types of environments, it is not possible to provide QoS because there are usually no guarantees of resource availability and, consequently, of the execution of users' applications. In addition, this research work provides a way to organize executions, which can improve scheduling and system operations. Experimental results, obtained through a case study, show the efficiency and relevance of our proposal

  11. Resource pools: an abstraction for configurable computing codesign

    Science.gov (United States)

    Peterson, James B.; Athanas, Peter M.

    1996-10-01

    The utility of configurable computing platforms has been demonstrated and documented for a wide variety of applications. Retargeting an application to custom computing machines (CCMs) has been shown to accelerate execution speeds with respect to execution on a sequential, general-purpose processor. Unfortunately, these platforms have proven to be rather difficult to program when compared to contemporary general-purpose platforms. Retargeting applications is non-trivial, due to the lack of design tools which work at a high level and consider all available computational units in the target architecture. To make configurable computing accessible to a wide user base, high-level entry tools -- preferably targeted toward familiar programming environments -- are needed. Also, in order to target a wide variety of custom computing machines, such tools cannot depend on a particular, fixed, architectural configuration. This paper introduces resource pools as an abstraction of general computing devices which provides a homogeneous description of FPGAs, ASICs, CPUs, or even an entire network of workstations. Also presented is an architecture-independent design tool which accepts a target architecture's description as a collection of resource pools, and partitions a program written in a high-level language onto that architecture, effectively synthesizing a hardware description for the FPGA portions of a CCM, and a software description for any attached CPUs.

  12. Estimation of subcriticality with the computed value

    Energy Technology Data Exchange (ETDEWEB)

    Naito, Yoshitaka; Arakawa, Takuya; Sakurai, Kiyoshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1995-07-01

    A new evaluation method for the neutron multiplication factor is proposed, based on the theory of the indirect estimation of calculation error. The theory and its application to the neutron source multiplication method are described in Chapter 2; a verification test based on experimental data from the JAERI critical assembly TCA is described in Chapter 3. This method is applied to the neutron multiplication factor calculated by the continuous-energy Monte Carlo code MCNP-4A, and the results are compared with those evaluated by the exponential experiment method. It is concluded that both results are consistent with each other. (author).
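
    The neutron source multiplication idea referred to above can be sketched with the textbook relation that, for a subcritical system with a fixed source, the detector count rate scales as 1/(1 - k_eff). The reference multiplication factor and count rates below are invented for illustration and are unrelated to the TCA data.

      # Sketch of the neutron source multiplication relation: for a subcritical system
      # with a fixed source, the detector count rate scales as C ~ 1/(1 - k_eff).
      # The reference k and count rates are invented for illustration, not TCA data.

      def k_from_count_rate(c_measured: float, c_reference: float, k_reference: float) -> float:
          """Estimate k_eff of the measured state from the ratio of count rates."""
          return 1.0 - (1.0 - k_reference) * (c_reference / c_measured)

      k_ref = 0.95          # multiplication factor of a well-characterised reference state
      c_ref = 1000.0        # counts per second in the reference state
      c_meas = 2500.0       # counts per second in the state being assessed

      print(round(k_from_count_rate(c_meas, c_ref, k_ref), 4))   # -> 0.98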

  13. Using Puppet to contextualize computing resources for ATLAS analysis on Google Compute Engine

    International Nuclear Information System (INIS)

    Öhman, Henrik; Panitkin, Sergey; Hendrix, Valerie

    2014-01-01

    With the advent of commercial as well as institutional and national clouds, new opportunities for on-demand computing resources for the HEP community become available. The new cloud technologies also come with new challenges, and one such is the contextualization of computing resources with regard to requirements of the user and his experiment. In particular on Google's new cloud platform Google Compute Engine (GCE) upload of user's virtual machine images is not possible. This precludes application of ready to use technologies like CernVM and forces users to build and contextualize their own VM images from scratch. We investigate the use of Puppet to facilitate contextualization of cloud resources on GCE, with particular regard to ease of configuration and dynamic resource scaling.

  14. Pemanfaatan Cloud Computing untuk Enterprise Resources Planning di Indonesia

    Directory of Open Access Journals (Sweden)

    Asniar Asniar

    2015-05-01

    Full Text Available Cloud computing is a topic that has gained momentum in recent years. ERP (Enterprise Resource Planning) is an information system structure used to integrate business processes in manufacturing and service companies, covering operations and the distribution of the products produced. Until now, ERP applications and cloud computing have run separately; they have not yet been combined to serve companies' business needs. ERP applications on cloud computing (ERP Cloud) will benefit small companies because they can lower common barriers, especially the large investment costs usually associated with using an ERP system. Small and medium enterprises (UKM) can have full access to an ERP system tailored to their business processes without needing to buy an entire ERP application or hire expensive IT consultants. Since Indonesia is still dominated by small and medium-scale companies, the use of cloud computing services for corporate ERP is greatly needed there, so that more companies can take advantage of ERP applications at a lower cost.

  15. SKEMA - A computer code to estimate atmospheric dispersion

    International Nuclear Information System (INIS)

    Sacramento, A.M. do.

    1985-01-01

    This computer code is a modified version of the DWNWND code, developed at Oak Ridge National Laboratory. The SKEMA code estimates the concentration in air of a material released into the atmosphere by a point source. (C.M.) [pt

  16. Dynamic integration of remote cloud resources into local computing clusters

    Energy Technology Data Exchange (ETDEWEB)

    Fleig, Georg; Erli, Guenther; Giffels, Manuel; Hauth, Thomas; Quast, Guenter; Schnepf, Matthias [Institut fuer Experimentelle Kernphysik, Karlsruher Institut fuer Technologie (Germany)

    2016-07-01

    In modern high-energy physics (HEP) experiments enormous amounts of data are analyzed and simulated. Traditionally dedicated HEP computing centers are built or extended to meet this steadily increasing demand for computing resources. Nowadays it is more reasonable and more flexible to utilize computing power at remote data centers providing regular cloud services to users as they can be operated in a more efficient manner. This approach uses virtualization and allows the HEP community to run virtual machines containing a dedicated operating system and transparent access to the required software stack on almost any cloud site. The dynamic management of virtual machines depending on the demand for computing power is essential for cost efficient operation and sharing of resources with other communities. For this purpose the EKP developed the on-demand cloud manager ROCED for dynamic instantiation and integration of virtualized worker nodes into the institute's computing cluster. This contribution will report on the concept of our cloud manager and the implementation utilizing a remote OpenStack cloud site and a shared HPC center (bwForCluster located in Freiburg).

  17. Distributed Data Mining using a Public Resource Computing Framework

    Science.gov (United States)

    Cesario, Eugenio; de Caria, Nicola; Mastroianni, Carlo; Talia, Domenico

    The public resource computing paradigm is often used as a successful and low cost mechanism for the management of several classes of scientific and commercial applications that require the execution of a large number of independent tasks. Public computing frameworks, also known as “Desktop Grids”, exploit the computational power and storage facilities of private computers, or “workers”. Despite the inherent decentralized nature of the applications for which they are devoted, these systems often adopt a centralized mechanism for the assignment of jobs and distribution of input data, as is the case for BOINC, the most popular framework in this realm. We present a decentralized framework that aims at increasing the flexibility and robustness of public computing applications, thanks to two basic features: (i) the adoption of a P2P protocol for dynamically matching the job specifications with the worker characteristics, without relying on centralized resources; (ii) the use of distributed cache servers for an efficient dissemination and reutilization of data files. This framework is exploitable for a wide set of applications. In this work, we describe how a Java prototype of the framework was used to tackle the problem of mining frequent itemsets from a transactional dataset, and show some preliminary yet interesting performance results that prove the efficiency improvements that can derive from the presented architecture.

  18. Parallelized reliability estimation of reconfigurable computer networks

    Science.gov (United States)

    Nicol, David M.; Das, Subhendu; Palumbo, Dan

    1990-01-01

    A parallelized system, ASSURE, for computing the reliability of embedded avionics flight control systems which are able to reconfigure themselves in the event of failure is described. ASSURE accepts a grammar that describes a reliability semi-Markov state-space. From this it creates a parallel program that simultaneously generates and analyzes the state-space, placing upper and lower bounds on the probability of system failure. ASSURE is implemented on a 32-node Intel iPSC/860, and has achieved high processor efficiencies on real problems. Through a combination of improved algorithms, exploitation of parallelism, and use of an advanced microprocessor architecture, ASSURE has reduced the execution time on substantial problems by a factor of one thousand over previous workstation implementations. Furthermore, ASSURE's parallel execution rate on the iPSC/860 is an order of magnitude faster than its serial execution rate on a Cray-2 supercomputer. While dynamic load balancing is necessary for ASSURE's good performance, it is needed only infrequently; the particular method of load balancing used does not substantially affect performance.

  19. Common accounting system for monitoring the ATLAS Distributed Computing resources

    CERN Document Server

    Karavakis, E; The ATLAS collaboration; Campana, S; Gayazov, S; Jezequel, S; Saiz, P; Sargsyan, L; Schovancova, J; Ueda, I

    2014-01-01

    This paper covers in detail a variety of accounting tools used to monitor the utilisation of the available computational and storage resources within the ATLAS Distributed Computing during the first three years of Large Hadron Collider data taking. The Experiment Dashboard provides a set of common accounting tools that combine monitoring information originating from many different information sources; either generic or ATLAS specific. This set of tools provides quality and scalable solutions that are flexible enough to support the constantly evolving requirements of the ATLAS user community.

  20. Common accounting system for monitoring the ATLAS distributed computing resources

    International Nuclear Information System (INIS)

    Karavakis, E; Andreeva, J; Campana, S; Saiz, P; Gayazov, S; Jezequel, S; Sargsyan, L; Schovancova, J; Ueda, I

    2014-01-01

    This paper covers in detail a variety of accounting tools used to monitor the utilisation of the available computational and storage resources within the ATLAS Distributed Computing during the first three years of Large Hadron Collider data taking. The Experiment Dashboard provides a set of common accounting tools that combine monitoring information originating from many different information sources; either generic or ATLAS specific. This set of tools provides quality and scalable solutions that are flexible enough to support the constantly evolving requirements of the ATLAS user community.

  1. Adaptive Management of Computing and Network Resources for Spacecraft Systems

    Science.gov (United States)

    Pfarr, Barbara; Welch, Lonnie R.; Detter, Ryan; Tjaden, Brett; Huh, Eui-Nam; Szczur, Martha R. (Technical Monitor)

    2000-01-01

    It is likely that NASA's future spacecraft systems will consist of distributed processes which will handle dynamically varying workloads in response to perceived scientific events, the spacecraft environment, spacecraft anomalies and user commands. Since all situations and possible uses of sensors cannot be anticipated during pre-deployment phases, an approach for dynamically adapting the allocation of distributed computational and communication resources is needed. To address this, we are evolving the DeSiDeRaTa adaptive resource management approach to enable reconfigurable ground and space information systems. The DeSiDeRaTa approach embodies a set of middleware mechanisms for adapting resource allocations, and a framework for reasoning about the real-time performance of distributed application systems. The framework and middleware will be extended to accommodate (1) the dynamic aspects of intra-constellation network topologies, and (2) the complete real-time path from the instrument to the user. We are developing a ground-based testbed that will enable NASA to perform early evaluation of adaptive resource management techniques without the expense of first deploying them in space. The benefits of the proposed effort are numerous, including the ability to use sensors in new ways not anticipated at design time; the production of information technology that ties the sensor web together; the accommodation of greater numbers of missions with fewer resources; and the opportunity to leverage the DeSiDeRaTa project's expertise, infrastructure and models for adaptive resource management for distributed real-time systems.

  2. Ridge: a computer program for calculating ridge regression estimates

    Science.gov (United States)

    Donald E. Hilt; Donald W. Seegrist

    1977-01-01

    Least-squares coefficients for multiple-regression models may be unstable when the independent variables are highly correlated. Ridge regression is a biased estimation procedure that produces stable estimates of the coefficients. Ridge regression is discussed, and a computer program for calculating the ridge coefficients is presented.
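
    The record does not include the program itself, but the underlying computation is compact: the ridge estimate replaces (X'X)^{-1}X'y with (X'X + kI)^{-1}X'y for a small positive k. A minimal sketch with synthetic, nearly collinear data (not the 1977 program) follows.

```python
import numpy as np

def ridge_coefficients(X, y, ks):
    """Return ridge estimates b(k) = (X'X + k*I)^-1 X'y for each k in ks.

    Predictors are standardized first, as is conventional when the
    independent variables are highly correlated."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)      # standardize predictors
    yc = y - y.mean()                               # center the response
    XtX, Xty = Xs.T @ Xs, Xs.T @ yc
    p = X.shape[1]
    return {k: np.linalg.solve(XtX + k * np.eye(p), Xty) for k in ks}

# Example with two nearly collinear predictors
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
X = np.column_stack([x1, x1 + 0.01 * rng.normal(size=100)])
y = X @ np.array([1.0, 1.0]) + rng.normal(size=100)
for k, b in ridge_coefficients(X, y, [0.0, 0.1, 1.0, 10.0]).items():
    print(f"k={k:5.2f}  coefficients={np.round(b, 3)}")
```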

  3. Parameter Estimation for a Computable General Equilibrium Model

    DEFF Research Database (Denmark)

    Arndt, Channing; Robinson, Sherman; Tarp, Finn

    We introduce a maximum entropy approach to parameter estimation for computable general equilibrium (CGE) models. The approach applies information theory to estimating a system of nonlinear simultaneous equations. It has a number of advantages. First, it imposes all general equilibrium constraints...

  4. Parameter Estimation for a Computable General Equilibrium Model

    DEFF Research Database (Denmark)

    Arndt, Channing; Robinson, Sherman; Tarp, Finn

    2002-01-01

    We introduce a maximum entropy approach to parameter estimation for computable general equilibrium (CGE) models. The approach applies information theory to estimating a system of non-linear simultaneous equations. It has a number of advantages. First, it imposes all general equilibrium constraints...

  5. Multicriteria Resource Brokering in Cloud Computing for Streaming Service

    Directory of Open Access Journals (Sweden)

    Chih-Lun Chou

    2015-01-01

    Full Text Available By leveraging cloud computing such as Infrastructure as a Service (IaaS), the outsourcing of computing resources used to support operations, including servers, storage, and networking components, is quite beneficial for various providers of Internet applications. With this increasing trend, resource allocation that both assures QoS via Service Level Agreements (SLAs) and avoids overprovisioning in order to reduce cost becomes a crucial priority and challenge in the design and operation of complex service-based platforms such as streaming services. On the other hand, providers of IaaS are also concerned with their profit performance and energy consumption while offering these virtualized resources. In this paper, considering both service-oriented and infrastructure-oriented criteria, we regard this resource allocation problem as a Multicriteria Decision Making problem and propose an effective trade-off approach based on a goal programming model. To validate its effectiveness, a cloud architecture for streaming applications is addressed and extensive analysis is performed for the related criteria. The results of numerical simulations show that the proposed approach strikes a balance between these conflicting criteria commendably and achieves high cost efficiency.
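
    As a rough illustration of the trade-off idea, the sketch below sets up a miniature weighted goal-programming model with SciPy: two invented VM types, one service-oriented goal (meet a streaming demand) and one infrastructure-oriented goal (stay within a budget), with penalties on the goal deviations. All coefficients are assumptions for illustration, and the model is far simpler than the one in the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Two VM types: (streams served per VM, cost per hour) -- invented numbers
cap = np.array([100.0, 250.0])
cost = np.array([0.20, 0.60])
demand, budget = 2000.0, 4.0          # streams to serve, $/hour budget goal
w_short, w_over = 10.0, 1.0           # penalty weights on the goal deviations

# Decision vector: [x1, x2, d_short, d_over]
c = np.array([0.0, 0.0, w_short, w_over])
A_ub = np.array([
    [-cap[0], -cap[1], -1.0, 0.0],    # cap.x + d_short >= demand
    [cost[0], cost[1], 0.0, -1.0],    # cost.x - d_over  <= budget
])
b_ub = np.array([-demand, budget])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4, method="highs")

x1, x2, d_short, d_over = res.x
print(f"rent {x1:.1f} small and {x2:.1f} large VMs; "
      f"unmet demand {d_short:.0f} streams, budget overrun ${d_over:.2f}/h")
```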

  6. Handling uncertainty in quantitative estimates in integrated resource planning

    Energy Technology Data Exchange (ETDEWEB)

    Tonn, B.E. [Oak Ridge National Lab., TN (United States); Wagner, C.G. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Mathematics

    1995-01-01

    This report addresses uncertainty in Integrated Resource Planning (IRP). IRP is a planning and decisionmaking process employed by utilities, usually at the behest of Public Utility Commissions (PUCs), to develop plans to ensure that utilities have resources necessary to meet consumer demand at reasonable cost. IRP has been used to assist utilities in developing plans that include not only traditional electricity supply options but also demand-side management (DSM) options. Uncertainty is a major issue for IRP. Future values for numerous important variables (e.g., future fuel prices, future electricity demand, stringency of future environmental regulations) cannot ever be known with certainty. Many economically significant decisions are so unique that statistically-based probabilities cannot even be calculated. The entire utility strategic planning process, including IRP, encompasses different types of decisions that are made with different time horizons and at different points in time. Because of fundamental pressures for change in the industry, including competition in generation, gone is the time when utilities could easily predict increases in demand, enjoy long lead times to bring on new capacity, and bank on steady profits. The purpose of this report is to address in detail one aspect of uncertainty in IRP: Dealing with Uncertainty in Quantitative Estimates, such as the future demand for electricity or the cost to produce a mega-watt (MW) of power. A theme which runs throughout the report is that every effort must be made to honestly represent what is known about a variable that can be used to estimate its value, what cannot be known, and what is not known due to operational constraints. Applying this philosophy to the representation of uncertainty in quantitative estimates, it is argued that imprecise probabilities are superior to classical probabilities for IRP.
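
    To make the contrast with classical probabilities concrete: with imprecise (interval) probabilities, a quantitative estimate such as expected cost has lower and upper bounds rather than a single value. The sketch below computes such bounds for invented demand scenarios; it illustrates the general idea and is not a method taken from the report.

```python
def expectation_bounds(values, lower, upper):
    """Bounds on E[V] when each scenario probability p_i is only known to
    satisfy lower[i] <= p_i <= upper[i] and sum(p_i) == 1.

    Extreme points are found greedily: start each p_i at its lower bound,
    then push the remaining mass toward the largest (for the upper bound)
    or smallest (for the lower bound) values first."""
    def extreme(maximize):
        order = sorted(range(len(values)), key=lambda i: values[i], reverse=maximize)
        p = list(lower)
        slack = 1.0 - sum(lower)
        for i in order:
            add = min(upper[i] - lower[i], slack)
            p[i] += add
            slack -= add
        return sum(p[i] * values[i] for i in range(len(values)))
    return extreme(False), extreme(True)

# Hypothetical cost ($/MWh) of a resource plan under three demand scenarios
costs = [42.0, 55.0, 80.0]            # low, medium, high demand
lo_p = [0.2, 0.3, 0.1]                # interval probabilities
hi_p = [0.5, 0.6, 0.4]
print("expected cost lies in", expectation_bounds(costs, lo_p, hi_p))
```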

  7. The complexity of computing the MCD-estimator

    DEFF Research Database (Denmark)

    Bernholt, T.; Fischer, Paul

    2004-01-01

    In modern statistics the robust estimation of parameters is a central problem, i.e., an estimation that is not or only slightly affected by outliers in the data. The minimum covariance determinant (MCD) estimator (J. Amer. Statist. Assoc. 79 (1984) 871) is probably one of the most important robust estimators of location and scatter. The complexity of computing the MCD, however, was unknown and generally thought to be exponential even if the dimensionality of the data is fixed. Here we present a polynomial time algorithm for MCD for fixed dimension of the data. In contrast, we show that computing the MCD-estimator is NP-hard if the dimension varies.
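
    The combinatorial flavour of the problem is easy to see in code: the MCD looks for the h-point subset whose sample covariance has the smallest determinant. The brute-force sketch below (synthetic data, exhaustive search) is exponential in n, which is precisely why the polynomial-time result for fixed dimension and the NP-hardness result for varying dimension are of interest.

```python
from itertools import combinations

import numpy as np

def mcd_brute_force(X, h):
    """Minimum covariance determinant by exhaustive search over all
    h-point subsets. Only feasible for very small n; real implementations
    use smarter algorithms."""
    best = None
    for idx in combinations(range(len(X)), h):
        sub = X[list(idx)]
        cov = np.cov(sub, rowvar=False)
        det = np.linalg.det(cov)
        if best is None or det < best[0]:
            best = (det, sub.mean(axis=0), cov)
    return best  # (determinant, robust location, robust scatter)

rng = np.random.default_rng(1)
clean = rng.normal(size=(14, 2))
outliers = rng.normal(loc=8.0, size=(3, 2))      # gross contamination
X = np.vstack([clean, outliers])
det, location, scatter = mcd_brute_force(X, h=10)
print("robust location estimate:", np.round(location, 2))
```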

  8. Cost-Benefit Analysis of Computer Resources for Machine Learning

    Science.gov (United States)

    Champion, Richard A.

    2007-01-01

    Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.

  9. Geology and mineral and energy resources, Roswell Resource Area, New Mexico; an interactive computer presentation

    Science.gov (United States)

    Tidball, Ronald R.; Bartsch-Winkler, S. B.

    1995-01-01

    This Compact Disc-Read Only Memory (CD-ROM) contains a program illustrating the geology and mineral and energy resources of the Roswell Resource Area, an administrative unit of the U.S. Bureau of Land Management in east-central New Mexico. The program enables the user to access information on the geology, geochemistry, geophysics, mining history, metallic and industrial mineral commodities, hydrocarbons, and assessments of the area. The program was created with the display software, SuperCard, version 1.5, by Aldus. The program will run only on a Macintosh personal computer. This CD-ROM was produced in accordance with Macintosh HFS standards. The program was developed on a Macintosh II-series computer with system 7.0.1. The program is a compiled, executable form that is nonproprietary and does not require the presence of the SuperCard software.

  10. Allocating Tactical High-Performance Computer (HPC) Resources to Offloaded Computation in Battlefield Scenarios

    Science.gov (United States)

    2013-12-01

    (Indexed fragments only: "…algorithm for efficient resource discovery…" and, from the conclusions, "In order to enable battlefield computation on mobile devices a complete theory of distributed…"; the remaining indexed text is reference-list material.)

  11. Enabling Grid Computing resources within the KM3NeT computing model

    Directory of Open Access Journals (Sweden)

    Filippidis Christos

    2016-01-01

    Full Text Available KM3NeT is a future European deep-sea research infrastructure hosting a new generation of neutrino detectors that – located at the bottom of the Mediterranean Sea – will open a new window on the universe and answer fundamental questions both in particle physics and astrophysics. International collaborative scientific experiments, like KM3NeT, are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. These experiments, in their majority, adopt computing models consisting of different Tiers with several computing centres, providing a specific set of services for the different steps of data processing such as detector calibration, simulation and data filtering, reconstruction and analysis. The computing requirements are extremely demanding and usually span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support these demanding computing requirements we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method for KM3NeT users to utilize the EGI computing resources in a simulation-driven use case.

  12. Complexity estimates based on integral transforms induced by computational units

    Czech Academy of Sciences Publication Activity Database

    Kůrková, Věra

    2012-01-01

    Roč. 33, September (2012), s. 160-167 ISSN 0893-6080 R&D Projects: GA ČR GAP202/11/1368 Institutional research plan: CEZ:AV0Z10300504 Institutional support: RVO:67985807 Keywords : neural networks * estimates of model complexity * approximation from a dictionary * integral transforms * norms induced by computational units Subject RIV: IN - Informatics, Computer Science Impact factor: 1.927, year: 2012

  13. Mobile devices and computing cloud resources allocation for interactive applications

    Directory of Open Access Journals (Sweden)

    Krawczyk Henryk

    2017-06-01

    Full Text Available Using mobile devices such as smartphones or iPads for various interactive applications is currently very common. In the case of complex applications, e.g. chess games, the capabilities of these devices are insufficient to run the application in real time. One solution is to use cloud computing. However, this raises an optimization problem of allocating mobile device and cloud resources. An iterative heuristic algorithm for application distribution is proposed; it minimizes the energy cost of application execution under an execution time constraint.

  14. Model of Dynamic Management of Telecommunication and Computer Resources

    Directory of Open Access Journals (Sweden)

    Alla E. Goryushkina

    2016-01-01

    Full Text Available This paper identifies the problem and develops a model for the dynamic management of telecommunication and computer resources. The present level of development of information and telecommunication technologies, together with improvements in communications and their integration into high-performance human-machine administration systems, leads to the creation of a single information and telecommunication space for mobile units. The work considers the research foundations for a set of tactical information security management tasks, and presents direct and inverse problems together with solution methods.

  15. Next Generation Computer Resources: Reference Model for Project Support Environments (Version 2.0)

    National Research Council Canada - National Science Library

    Brown, Alan

    1993-01-01

    The objective of the Next Generation Computer Resources (NGCR) program is to restructure the Navy's approach to acquisition of standard computing resources to take better advantage of commercial advances and investments...

  16. Negative quasi-probability as a resource for quantum computation

    International Nuclear Information System (INIS)

    Veitch, Victor; Ferrie, Christopher; Emerson, Joseph; Gross, David

    2012-01-01

    A central problem in quantum information is to determine the minimal physical resources that are required for quantum computational speed-up and, in particular, for fault-tolerant quantum computation. We establish a remarkable connection between the potential for quantum speed-up and the onset of negative values in a distinguished quasi-probability representation, a discrete analogue of the Wigner function for quantum systems of odd dimension. This connection allows us to resolve an open question on the existence of bound states for magic state distillation: we prove that there exist mixed states outside the convex hull of stabilizer states that cannot be distilled to non-stabilizer target states using stabilizer operations. We also provide an efficient simulation protocol for Clifford circuits that extends to a large class of mixed states, including bound universal states. (paper)

  17. A Computationally Efficient Method for Polyphonic Pitch Estimation

    Directory of Open Access Journals (Sweden)

    Ruohua Zhou

    2009-01-01

    Full Text Available This paper presents a computationally efficient method for polyphonic pitch estimation. The method employs the Fast Resonator Time-Frequency Image (RTFI as the basic time-frequency analysis tool. The approach is composed of two main stages. First, a preliminary pitch estimation is obtained by means of a simple peak-picking procedure in the pitch energy spectrum. Such spectrum is calculated from the original RTFI energy spectrum according to harmonic grouping principles. Then the incorrect estimations are removed according to spectral irregularity and knowledge of the harmonic structures of the music notes played on commonly used music instruments. The new approach is compared with a variety of other frame-based polyphonic pitch estimation methods, and results demonstrate the high performance and computational efficiency of the approach.
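
    The RTFI front end is not reproduced here, but the harmonic-grouping step can be illustrated with an ordinary FFT magnitude spectrum: each candidate fundamental is scored by summing the spectral energy at its first few harmonics, and peaks in that salience function give preliminary pitch estimates. The sketch below is a toy stand-in (a synthetic two-note mixture, with the candidate range chosen to avoid sub-octave errors that the paper removes with its spectral-irregularity checks), not the published algorithm.

```python
import numpy as np

def pitch_salience(spectrum, freqs, candidates, n_harmonics=5):
    """Harmonic-grouping salience: sum the spectral magnitude at the first
    few harmonics of each candidate fundamental (nearest-bin lookup)."""
    salience = []
    for f0 in candidates:
        s = 0.0
        for h in range(1, n_harmonics + 1):
            s += spectrum[int(np.argmin(np.abs(freqs - h * f0)))]
        salience.append(s)
    return np.array(salience)

# Synthetic two-note mixture (220 Hz + 330 Hz) as a stand-in for RTFI output
fs, n = 8000, 8192
t = np.arange(n) / fs
x = sum(np.sin(2 * np.pi * f0 * h * t) / h
        for f0 in (220.0, 330.0) for h in range(1, 5))
spectrum = np.abs(np.fft.rfft(x * np.hanning(n)))
freqs = np.fft.rfftfreq(n, 1.0 / fs)

# Candidate fundamentals; starting at 150 Hz keeps this toy example from
# picking the 110 Hz sub-octave (the paper removes such errors separately).
candidates = np.arange(150.0, 500.0, 5.0)
salience = pitch_salience(spectrum, freqs, candidates)
top_two = sorted(candidates[np.argsort(salience)[-2:]])
print("estimated fundamentals (Hz):", top_two)
```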

  18. Computer-Based Resource Accounting Model for Automobile Technology Impact Assessment

    Science.gov (United States)

    1976-10-01

    A computer-implemented resource accounting model has been developed for assessing resource impacts of future automobile technology options. The resources tracked are materials, energy, capital, and labor. The model has been used in support of the Int...

  19. Computational Error Estimate for the Power Series Solution of Odes ...

    African Journals Online (AJOL)

    This paper compares the error estimation of the power series solution with the recursive Tau method for solving ordinary differential equations. From the computational viewpoint, the power series using zeros of the Chebyshev polynomial is effective, accurate and easy to use. Keywords: Lanczos Tau method, Chebyshev polynomial, ...

  20. Procedures for parameter estimates of computational models for localized failure

    NARCIS (Netherlands)

    Iacono, C.

    2007-01-01

    In the last years, many computational models have been developed for tensile fracture in concrete. However, their reliability is related to the correct estimate of the model parameters, not all directly measurable during laboratory tests. Hence, the development of inverse procedures is needed, that

  1. Addressing Computational Estimation in the Kuwaiti Curriculum: Teachers' Views

    Science.gov (United States)

    Alajmi, Amal Hussain

    2009-01-01

    Computational estimation has not yet established a place in the Kuwaiti national curriculum. An attempt was made to include it during the early 1990s, but it was dropped by the Kuwaiti Ministry of Education because of the difficulties teachers had teaching it. In an effort to provide guidance for reintroducing the concept into the curriculum, this…

  2. Cloud storage and computing resources for the UNAVCO SAR Archive

    Science.gov (United States)

    Baker, S.; Crosby, C. J.; Meertens, C. M.

    2016-12-01

    UNAVCO is a non-profit university-governed consortium that operates the National Science Foundation (NSF) Geodesy Advancing Geosciences and EarthScope (GAGE) facility and provides operational support to the Western North America InSAR Consortium (WInSAR). The synthetic aperture radar (SAR) archive at UNAVCO currently provides access to over 70TB of unprocessed data for community geoscience research. Historically, users have downloaded data and performed InSAR processing on local machines. However, given the increasing volumes of SAR data available and the size of an individual scene, this model may be inefficient. As cloud computing has become more mainstream, UNAVCO has begun developing capabilities to provide data and processing resources in the same location. The test environment is using the Texas Advanced Computing Center (TACC), part of the NSF Extreme Science and Engineering Discovery Environment (XSEDE). The entire UNAVCO SAR archive is available at TACC along with virtual machines preconfigured with InSAR processing software. Users will be able to quickly access and process SAR data, providing a scalable computing environment for more efficient and larger scale analyses by the UNAVCO WInSAR community.

  3. Quantum Computing: Selected Internet Resources for Librarians, Researchers, and the Casually Curious

    Science.gov (United States)

    Cirasella, Jill

    2009-01-01

    This article presents an annotated selection of the most important and informative Internet resources for learning about quantum computing, finding quantum computing literature, and tracking quantum computing news. All of the quantum computing resources described in this article are freely available, English-language web sites that fall into one…

  4. Estimation of feline renal volume using computed tomography and ultrasound.

    Science.gov (United States)

    Tyson, Reid; Logsdon, Stacy A; Werre, Stephen R; Daniel, Gregory B

    2013-01-01

    Renal volume estimation is an important parameter for clinical evaluation of kidneys and research applications. A time efficient, repeatable, and accurate method for volume estimation is required. The purpose of this study was to describe the accuracy of ultrasound and computed tomography (CT) for estimating feline renal volume. Standardized ultrasound and CT scans were acquired for kidneys of 12 cadaver cats, in situ. Ultrasound and CT multiplanar reconstructions were used to record renal length measurements that were then used to calculate volume using the prolate ellipsoid formula for volume estimation. In addition, CT studies were reconstructed at 1 mm, 5 mm, and 1 cm, and transferred to a workstation where the renal volume was calculated using the voxel count method (hand drawn regions of interest). The reference standard kidney volume was then determined ex vivo using water displacement with the Archimedes' principle. Ultrasound measurement of renal length accounted for approximately 87% of the variability in renal volume for the study population. The prolate ellipsoid formula exhibited proportional bias and underestimated renal volume by a median of 18.9%. Computed tomography volume estimates using the voxel count method with hand-traced regions of interest provided the most accurate results, with increasing accuracy for smaller voxel sizes in grossly normal kidneys (-10.1 to 0.6%). Findings from this study supported the use of CT and the voxel count method for estimating feline renal volume in future clinical and research studies. © 2012 Veterinary Radiology & Ultrasound.
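
    The two estimators compared in the study reduce to simple arithmetic: the prolate ellipsoid formula V = π/6 · L · W · H from linear measurements, and the voxel count method, which multiplies the number of segmented voxels by the voxel volume. A minimal sketch with invented dimensions and a fake segmentation mask follows.

```python
import numpy as np

def prolate_ellipsoid_volume(length_cm, width_cm, height_cm):
    """Prolate ellipsoid formula: V = pi/6 * L * W * H."""
    return np.pi / 6.0 * length_cm * width_cm * height_cm

def voxel_count_volume(mask, voxel_mm):
    """Voxel-count method: number of segmented voxels times voxel volume."""
    voxel_cm3 = np.prod(np.asarray(voxel_mm)) / 1000.0   # mm^3 -> cm^3
    return mask.sum() * voxel_cm3

# Hypothetical kidney: 4.0 x 2.6 x 2.8 cm, and a crude box "segmentation"
print("ellipsoid estimate (cm^3):",
      round(prolate_ellipsoid_volume(4.0, 2.6, 2.8), 1))

mask = np.zeros((60, 40, 40), dtype=bool)
mask[10:50, 8:32, 8:36] = True
print("voxel-count estimate (cm^3):",
      round(voxel_count_volume(mask, (1.0, 1.0, 1.0)), 1))
```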

  5. Soil Erosion Estimation Using Grid-based Computation

    Directory of Open Access Journals (Sweden)

    Josef Vlasák

    2005-06-01

    Full Text Available Soil erosion estimation is an important part of a land consolidation process. The universal soil loss equation (USLE) was presented by Wischmeier and Smith. USLE computation uses several factors, namely R – rainfall factor, K – soil erodibility, L – slope length factor, S – slope gradient factor, C – cropping management factor, and P – erosion control management factor. The L and S factors are usually combined into one LS factor, the topographic factor. The individual factors are determined from several sources, such as the DTM (Digital Terrain Model), the BPEJ soil type map, aerial and satellite images, etc. A conventional approach to the USLE computation, which is widely used in the Czech Republic, is based on the selection of characteristic profiles for which all above-mentioned factors must be determined. The result of such a computation (G – annual soil loss) is then applied to a whole area (slope) of interest. Another approach to the USLE computation uses grids as the main data structure. A prerequisite for a grid-based USLE computation is that each of the above-mentioned factors exists as a separate grid layer. The crucial step in this computation is the selection of an appropriate grid resolution (grid cell size). A large cell size can cause undesirable precision degradation, while too small a cell size can noticeably slow down the whole computation. Provided that the cell size is derived from the source's precision, the appropriate cell size for the Czech Republic varies from 30 m to 50 m. In some cases, especially when new surveying has been done, grid computations can be performed with higher accuracy, i.e. with a smaller grid cell size. For such cases, we have proposed a new method using a two-step computation. The first step uses a bigger cell size and is designed to identify high-erosion spots. The second step then uses a smaller cell size but performs the computation only for the area identified in the previous step. This decomposition allows a…
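
    In a grid-based implementation the USLE is just an element-wise product of factor grids, G = R · K · LS · C · P, followed here by the two-step idea of flagging high-erosion cells for recomputation at a finer resolution. The sketch below uses invented factor values on a tiny grid (units depend on the factor system used).

```python
import numpy as np

# Toy per-cell factor grids (in practice derived from DTM, soil maps, imagery)
shape = (4, 4)
R = np.full(shape, 40.0)                  # rainfall erosivity
K = np.full(shape, 0.30)                  # soil erodibility
LS = np.array([[0.5, 0.8, 1.2, 1.5],
               [0.6, 1.0, 1.4, 1.8],
               [0.7, 1.1, 1.6, 2.0],
               [0.8, 1.2, 1.8, 2.2]])     # topographic factor
C = np.full(shape, 0.2)                   # cropping management
P = np.full(shape, 1.0)                   # erosion control practice

G = R * K * LS * C * P                    # annual soil loss per grid cell

# Two-step idea from the record: locate high-erosion cells at the coarse
# resolution first, then recompute only those cells at a finer resolution.
threshold = 3.0
hotspots = np.argwhere(G > threshold)
print("soil loss grid:\n", np.round(G, 2))
print("cells flagged for fine-resolution recomputation:", hotspots.tolist())
```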

  6. On robust parameter estimation in brain–computer interfacing

    Science.gov (United States)

    Samek, Wojciech; Nakajima, Shinichi; Kawanabe, Motoaki; Müller, Klaus-Robert

    2017-12-01

    Objective. The reliable estimation of parameters such as mean or covariance matrix from noisy and high-dimensional observations is a prerequisite for successful application of signal processing and machine learning algorithms in brain–computer interfacing (BCI). This challenging task becomes significantly more difficult if the data set contains outliers, e.g. due to subject movements, eye blinks or loose electrodes, as they may heavily bias the estimation and the subsequent statistical analysis. Although various robust estimators have been developed to tackle the outlier problem, they ignore important structural information in the data and thus may not be optimal. Typical structural elements in BCI data are the trials consisting of a few hundred EEG samples and indicating the start and end of a task. Approach. This work discusses the parameter estimation problem in BCI and introduces a novel hierarchical view on robustness which naturally comprises different types of outlierness occurring in structured data. Furthermore, the class of minimum divergence estimators is reviewed and a robust mean and covariance estimator for structured data is derived and evaluated with simulations and on a benchmark data set. Main results. The results show that state-of-the-art BCI algorithms benefit from robustly estimated parameters. Significance. Since parameter estimation is an integral part of various machine learning algorithms, the presented techniques are applicable to many problems beyond BCI.

  7. Shale Gas Boom or Bust? Estimating US and Global Economically Recoverable Resources

    Science.gov (United States)

    Brecha, R. J.; Hilaire, J.; Bauer, N.

    2014-12-01

    One of the most disruptive energy system technological developments of the past few decades is the rapid expansion of shale gas production in the United States. Because the changes have been so rapid there are great uncertainties as to the impacts of shale production for medium- and long-term energy and climate change mitigation policies. A necessary starting point for incorporating shale resources into modeling efforts is to understand the size of the resource, how much is technically recoverable (TRR), and finally, how much is economically recoverable (ERR) at a given cost. To assess production costs of shale gas, we combine top-down data with detailed bottom-up information. Studies solely based on top-down approaches do not adequately account for the heterogeneity of shale gas deposits and are unlikely to appropriately estimate extraction costs. We design an expedient bottom-up method based on publicly available US data to compute the levelized costs of shale gas extraction. Our results indicate the existence of economically attractive areas but also reveal a dramatic cost increase as lower-quality reservoirs are exploited. Extrapolating results for the US to the global level, our best estimate suggests that, at a cost of 6 US$/GJ, only 39% of the technically recoverable resources reported in top-down studies should be considered economically recoverable. This estimate increases to about 77% when considering optimistic TRR and estimated ultimate recovery parameters but could be lower than 12% for more pessimistic parameters. The current lack of information on the heterogeneity of shale gas deposits as well as on the development of future production technologies leads to significant uncertainties regarding recovery rates and production costs. Much of this uncertainty may be inherent, but for energy system planning purposes, with or without climate change mitigation policies, it is crucial to recognize the full ranges of recoverable quantities and costs.
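
    The levelized-cost logic behind such bottom-up estimates can be sketched in a few lines: discounted lifetime costs divided by discounted lifetime production. The figures below are invented, and the formula omits the decline curves, royalties and taxes that a real well model would include.

```python
def levelized_cost_of_gas(capex, opex_per_year, production_per_year, rate, years):
    """Levelized cost = discounted lifetime costs / discounted lifetime
    production. A deliberate simplification of real well economics."""
    discount = [(1 + rate) ** -t for t in range(1, years + 1)]
    costs = capex + opex_per_year * sum(discount)
    production = production_per_year * sum(discount)
    return costs / production

# Invented well: $7M drilling/completion, $0.3M/yr operations,
# 150,000 GJ/yr of gas over a 15-year life, 8% discount rate.
print(f"levelized cost: {levelized_cost_of_gas(7e6, 3e5, 1.5e5, 0.08, 15):.2f} $/GJ")
```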

  8. Sex Estimation From Sternal Measurements Using Multidetector Computed Tomography

    Science.gov (United States)

    Ekizoglu, Oguzhan; Hocaoglu, Elif; Inci, Ercan; Bilgili, Mustafa Gokhan; Solmaz, Dilek; Erdil, Irem; Can, Ismail Ozgur

    2014-01-01

    Abstract We aimed to show the utility and reliability of sternal morphometric analysis for sex estimation. Sex estimation is a very important step in forensic identification, and skeletal surveys are the main methods used in sex estimation studies. Morphometric analysis of the sternum may provide highly accurate data for sex discrimination. In this study, morphometric analysis of the sternum was evaluated on 1 mm chest computed tomography scans for sex estimation. Four hundred forty-three subjects (202 female, 241 male; mean age: 44 ± 8.1 years; range: 30–60 years) were included in the study. Manubrium length (ML), mesosternum length, Sternebra 1 (S1W) and Sternebra 3 (S3W) widths were measured, and the sternal index (SI) was calculated. Differences between the sexes were evaluated with Student's t-test. Predictive factors for sex were determined by discriminant analysis and receiver operating characteristic (ROC) analysis. Male sternal measurement values were significantly higher than female values. Computed tomography analysis of the sternum might provide important information for sex estimation. PMID:25501090
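
    Discriminant analysis on such measurements amounts to projecting them onto a single axis and thresholding. A minimal Fisher linear discriminant sketch, using synthetic manubrium and mesosternum lengths (the values are invented, not the study's data), follows.

```python
import numpy as np

def fisher_lda(X0, X1):
    """Two-class Fisher linear discriminant: w = S_w^-1 (mu1 - mu0),
    with a midpoint threshold on the projected scores."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)   # pooled scatter
    w = np.linalg.solve(Sw, mu1 - mu0)
    threshold = 0.5 * (X0 @ w).mean() + 0.5 * (X1 @ w).mean()
    return w, threshold

rng = np.random.default_rng(0)
# Synthetic manubrium / mesosternum lengths (mm), with males shifted upward
# as in the record's findings.
females = rng.normal(loc=[48.0, 88.0], scale=[4.0, 7.0], size=(200, 2))
males = rng.normal(loc=[53.0, 100.0], scale=[4.0, 7.0], size=(240, 2))

w, thr = fisher_lda(females, males)
test = np.array([[50.0, 92.0], [55.0, 104.0]])
pred = np.where(test @ w > thr, "male", "female")
print("discriminant weights:", np.round(w, 3), "predictions:", pred.tolist())
```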

  9. Modeling of Groundwater Resources Heavy Metals Concentration Using Soft Computing Methods: Application of Different Types of Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Meysam Alizamir

    2017-09-01

    Full Text Available Nowadays, groundwater resources play a vital role as a source of drinking water in arid and semiarid regions, and forecasting the pollutant content of these resources is very important. Therefore, this study aimed to compare two soft computing methods for modeling Cd, Pb and Zn concentrations in the groundwater resources of Asadabad Plain, Western Iran. The relative accuracy of two soft computing models, namely the multi-layer perceptron (MLP) and the radial basis function (RBF) network, for forecasting heavy metal concentrations was investigated. In addition, the Levenberg-Marquardt, gradient descent and conjugate gradient training algorithms were utilized for the MLP models. The ANN models for this study were developed in MATLAB R2014. The MLP performed better than the other models for heavy metal concentration estimation, and the simulation results revealed that the MLP model was able to model heavy metal concentrations in groundwater resources favorably, making it effective for environmental applications and water quality estimation. Among the three training algorithms, Levenberg-Marquardt outperformed the others. This study thus proposes soft computing modeling techniques for predicting heavy metal concentrations in the groundwater resources of Asadabad Plain: based on data collected from the plain, MLP and RBF models were developed for each heavy metal, and the MLP can be utilized effectively for this prediction task.

  10. Dynamic resource allocation scheme for distributed heterogeneous computer systems

    Science.gov (United States)

    Liu, Howard T. (Inventor); Silvester, John A. (Inventor)

    1991-01-01

    This invention relates to resource allocation in computer systems, and more particularly, to a method and associated apparatus for shortening response time and improving efficiency of a heterogeneous distributed networked computer system by reallocating the jobs queued up for busy nodes to idle, or less-busy, nodes. In accordance with the algorithm (SIDA for short), the load-sharing is initiated by the server device in a manner such that extra overhead is not imposed on the system during heavily-loaded conditions. The algorithm employed in the present invention uses a dual-mode, server-initiated approach. Jobs are transferred from heavily burdened nodes (i.e., over a high threshold limit) to low-burdened nodes at the initiation of the receiving node when: (1) a job finishes at a node which is burdened below a pre-established threshold level, or (2) a node is idle for a period of time as established by a wakeup timer at the node. The invention uses a combination of the local queue length and the local service rate ratio at each node as the workload indicator.
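
    A toy version of the receiver-initiated transfer rule described in this record can be written in a few lines: a node whose queue falls to or below a low threshold (after finishing a job or on a wakeup timer) pulls one job from the most heavily loaded node above the high threshold. The thresholds and queue lengths below are invented, and the sketch ignores the service-rate component of the workload indicator.

```python
import random

HIGH, LOW = 5, 1           # queue-length thresholds (illustrative values)

def receiver_initiated_rebalance(queues):
    """When a node finishes a job (or its wakeup timer fires) and its queue
    is at or below LOW, it pulls one job from the busiest node above HIGH."""
    moved = []
    for receiver, qlen in enumerate(queues):
        if qlen <= LOW:
            donor = max(range(len(queues)), key=lambda i: queues[i])
            if queues[donor] > HIGH and donor != receiver:
                queues[donor] -= 1
                queues[receiver] += 1
                moved.append((donor, receiver))
    return moved

random.seed(3)
queues = [random.randint(0, 9) for _ in range(6)]
print("before:", queues)
transfers = receiver_initiated_rebalance(queues)
print("after: ", queues, "transfers (donor -> receiver):", transfers)
```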

  11. Computer model for estimating electric utility environmental noise

    International Nuclear Information System (INIS)

    Teplitzky, A.M.; Hahn, K.J.

    1991-01-01

    This paper reports on an algorithm-based computer code for estimating environmental noise emissions from the operation and construction of electric power plants. The computer code (Model) is used to predict octave band sound power levels for power plant operation and construction activities on the basis of the equipment operating characteristics, and it calculates off-site sound levels for each noise source and for the entire plant. Estimated noise levels are presented either as A-weighted sound level contours around the power plant or as octave band levels at user-defined receptor locations. Calculated sound levels can be compared with user-designated noise criteria, and the program can assist the user in analyzing alternative noise control strategies.
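
    The basic acoustics arithmetic such a model rests on is the incoherent combination of octave-band levels, L = 10·log10(Σ 10^(Li/10)), plus standard A-weighting corrections per band. The band levels below are invented; the A-weighting corrections are the usual octave-band values.

```python
import math

# Standard A-weighting corrections (dB) for octave-band centre frequencies
A_WEIGHT = {63: -26.2, 125: -16.1, 250: -8.6, 500: -3.2,
            1000: 0.0, 2000: 1.2, 4000: 1.0, 8000: -1.1}

def combine_levels(levels_db):
    """Sum incoherent sound levels: L = 10*log10(sum(10^(Li/10)))."""
    return 10.0 * math.log10(sum(10 ** (l / 10.0) for l in levels_db))

def a_weighted_level(octave_levels_db):
    """Apply A-weighting per band, then combine into a single dB(A) value."""
    weighted = [l + A_WEIGHT[f] for f, l in octave_levels_db.items()]
    return combine_levels(weighted)

# Invented octave-band levels (dB) at a receptor for one plant noise source
source = {63: 72.0, 125: 70.0, 250: 66.0, 500: 63.0,
          1000: 60.0, 2000: 55.0, 4000: 48.0, 8000: 40.0}
print(f"overall level: {combine_levels(source.values()):.1f} dB")
print(f"A-weighted level: {a_weighted_level(source):.1f} dB(A)")
```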

  12. Computer-assisted estimating for the Los Alamos Scientific Laboratory

    International Nuclear Information System (INIS)

    Spooner, J.E.

    1976-02-01

    An analysis is made of the cost estimating system currently in use at the Los Alamos Scientific Laboratory (LASL) and the benefits of computer assistance are evaluated. A computer-assisted estimating system (CAE) is proposed for LASL. CAE can decrease turnaround and provide more flexible response to management requests for cost information and analyses. It can enhance value optimization at the design stage, improve cost control and change-order justification, and widen the use of cost information in the design process. CAE costs are not well defined at this time although they appear to break even with present operations. It is recommended that a CAE system description be submitted for contractor consideration and bid while LASL system development continues concurrently

  13. A resource-sharing model based on a repeated game in fog computing.

    Science.gov (United States)

    Sun, Yan; Zhang, Nan

    2017-03-01

    With the rapid development of cloud computing techniques, the number of users is undergoing exponential growth. It is difficult for traditional data centers to perform many tasks in real time because of the limited bandwidth of resources. The concept of fog computing is proposed to support traditional cloud computing and to provide cloud services. In fog computing, the resource pool is composed of sporadic distributed resources that are more flexible and movable than a traditional data center. In this paper, we propose a fog computing structure and present a crowd-funding algorithm to integrate spare resources in the network. Furthermore, to encourage more resource owners to share their resources with the resource pool and to supervise the resource supporters as they actively perform their tasks, we propose an incentive mechanism in our algorithm. Simulation results show that our proposed incentive mechanism can effectively reduce the SLA violation rate and accelerate the completion of tasks.

  14. Chest X ray effective doses estimation in computed radiography

    International Nuclear Information System (INIS)

    Abdalla, Esra Abdalrhman Dfaalla

    2013-06-01

    Conventional chest radiography is technically difficult because of the wide range of tissue attenuations in the chest and the limitations of screen-film systems. Computed radiography (CR) offers a different approach utilizing a photostimulable phosphor; photostimulable phosphors overcome some of the image quality limitations of chest imaging. The objective of this study was to estimate the effective dose in computed radiography at three hospitals in Khartoum. The study was conducted in the radiography departments of three centres: Advanced Diagnostic Center, Nilain Diagnostic Center and Modern Diagnostic Center. Entrance surface dose (ESD) measurements were conducted for quality control of the x-ray machines, together with a survey of the operators' experimental techniques. The ESDs were measured with an UNFORS dosimeter, and mathematical equations were used to estimate patient doses during chest X rays. A total of 120 patients were examined in the three centres, of whom 62 were male and 58 were female. The overall mean and range of the patient dose was 0.073±0.037 (0.014-0.16) mGy per procedure, while the effective dose was 3.4±1.7 (0.6-7.0) mSv per procedure. This study compared radiation doses to patients in chest radiographic examinations using computed radiography; the radiation dose was measured in three centres in Khartoum, Sudan. The measured effective doses showed that the dose in chest radiography was lower with computed radiography compared to previous studies. (Author)

  15. A Review of Computer Science Resources for Learning and Teaching with K-12 Computing Curricula: An Australian Case Study

    Science.gov (United States)

    Falkner, Katrina; Vivian, Rebecca

    2015-01-01

    To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age…

  16. Computationally Efficient and Noise Robust DOA and Pitch Estimation

    DEFF Research Database (Denmark)

    Karimian-Azari, Sam; Jensen, Jesper Rindom; Christensen, Mads Græsbøll

    2016-01-01

    …signals are often contaminated by different types of noise, which challenges the assumption of white Gaussian noise in most state-of-the-art methods. We establish filtering methods based on noise statistics to apply to nonparametric spectral and spatial parameter estimates of the harmonics. We design a joint DOA and pitch estimator. In white Gaussian noise, we derive even more computationally efficient solutions which are designed using the narrowband power spectrum of the harmonics. Numerical results reveal the performance of the estimators in colored noise compared with the Cramér-Rao lower bound. Experiments on real-life signals indicate the applicability of the methods in practical low local signal-to-noise ratios…

  17. Estimating solar resources in Mexico using cloud cover data

    Energy Technology Data Exchange (ETDEWEB)

    Renne, David; George, Ray; Brady, Liz; Marion, Bill [National Renewable Energy Laboratory, Colorado (United States); Estrada Cajigal, Vicente [Cuernavaca, Morelos (Mexico)

    2000-07-01

    This paper presents the results of applying the National Renewable Energy Laboratory's (NREL) Climatological Solar Radiation (CSR) model to Mexico to develop solar resource data. A major input to the CSR model is a worldwide surface and satellite-derived cloud cover database, called the Real Time Nephanalysis (RTNEPH). The RTNEPH is developed by the U.S. Air Force and distributed by the U.S. National Climatic Data Center. The RTNEPH combines routine ground-based cloud cover observations made every three hours at national weather centers throughout the world with satellite-derived cloud cover information developed from polar orbiting weather satellites. The data are geospatially digitized so that multilayered cloud cover information is available on a grid of cells approximately 40 km on a side. The development of this database is an ongoing project that now covers more than twenty years of observations. For the North America analysis (including Mexico) we used an 8-year summarized histogram of the RTNEPH that provides monthly average cloud cover information for the period 1985-1992. The CSR model also accounts for attenuation of the solar beam due to aerosols, atmospheric trace gases, and water vapor. The CSR model outputs monthly average direct normal, global horizontal and diffuse solar information for each of the 40-km grid cells. From this information it is also possible to produce solar resource estimates for various solar collector types and orientations, such as flat plate collectors oriented at latitude tilt, or concentrating solar power collectors. Model results are displayed using Geographic Information System software. CSR model results for Mexico are presented here, along with a discussion of earlier solar resource assessment studies for Mexico, where both modeling approaches and measurement analyses have been used. [Spanish] This paper presents the results of applying the Climatological Solar Radiation (CSR) model of NREL (National Renewable Energy…

  18. Statistically and Computationally Efficient Estimating Equations for Large Spatial Datasets

    KAUST Repository

    Sun, Ying

    2014-11-07

    For Gaussian process models, likelihood-based methods are often difficult to use with large irregularly spaced spatial datasets, because exact calculations of the likelihood for n observations require O(n^3) operations and O(n^2) memory. Various approximation methods have been developed to address the computational difficulties. In this paper, we propose new unbiased estimating equations based on score equation approximations that are both computationally and statistically efficient. We replace the inverse covariance matrix that appears in the score equations by a sparse matrix to approximate the quadratic forms, then set the resulting quadratic forms equal to their expected values to obtain unbiased estimating equations. The sparse matrix is constructed by a sparse inverse Cholesky approach to approximate the inverse covariance matrix. The statistical efficiency of the resulting unbiased estimating equations is evaluated both in theory and by numerical studies. Our methods are applied to nearly 90,000 satellite-based measurements of water vapor levels over a region in the Southeast Pacific Ocean.

  19. Resources

    Science.gov (United States)

    ... Colon cancer - resources Cystic fibrosis - resources Depression - resources Diabetes - resources Digestive disease - resources Drug abuse - resources Eating disorders - resources Elder care - resources Epilepsy - resources Family ...

  20. A study of computer graphics technology in application of communication resource management

    Science.gov (United States)

    Li, Jing; Zhou, Liang; Yang, Fei

    2017-08-01

    With the development of computer technology, computer graphics technology has been widely used. In particular, the success of object-oriented and multimedia technologies has promoted the development of graphics technology in computer software systems, making computer graphics theory and its applications an important topic in the computing field, with applications spreading across many domains. In recent years, with the development of the social economy and especially the rapid development of information technology, traditional approaches to communication resource management can no longer effectively meet the needs of resource management. Communication resource management still relies on the original tools and methods for managing and maintaining resource management equipment, which has caused many problems: it is very difficult for non-professionals to understand the equipment and the overall situation, resource utilization is relatively low, and managers cannot quickly and accurately grasp resource conditions. To address these problems, this paper proposes introducing computer graphics technology into communication resource management. The introduction of computer graphics not only makes communication resource management more vivid, but also reduces the cost of resource management and improves work efficiency.

  1. Research on elastic resource management for multi-queue under cloud computing environment

    Science.gov (United States)

    CHENG, Zhenjing; LI, Haibo; HUANG, Qiulan; Cheng, Yaodong; CHEN, Gang

    2017-10-01

    As a new approach to managing computing resources, virtualization technology is more and more widely applied in the high-energy physics field. A virtual computing cluster based on OpenStack was built at IHEP, using HTCondor as the job queue management system. In a traditional static cluster, a fixed number of virtual machines are pre-allocated to the job queues of different experiments; however, this method cannot adapt well to the volatility of computing resource requirements. To solve this problem, an elastic computing resource management system for the cloud computing environment has been designed. The system performs unified management of virtual computing nodes on the basis of the HTCondor job queues, using dual resource thresholds as well as a quota service. A two-stage pool is designed to improve the efficiency of resource pool expansion. This paper presents several use cases of the elastic resource management system in IHEPCloud. In practice, virtual computing resources dynamically expanded or shrank as computing requirements changed. Additionally, the CPU utilization ratio of the computing resources increased significantly compared with traditional resource management. The system also performs well when there are multiple HTCondor schedulers and multiple job queues.
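
    The dual-threshold idea can be sketched as a per-queue scaling decision: expand the pool when the pending-jobs-per-VM ratio exceeds an upper threshold (up to the quota), and shrink it when the ratio falls below a lower threshold. The thresholds, queue names and numbers below are invented; this is not the IHEP implementation.

```python
def scaling_decision(pending_jobs, active_vms, quota,
                     upper=2.0, lower=0.5, step=5):
    """Return how many virtual machines to add (+) or remove (-) for one
    job queue, based on dual thresholds on the pending-jobs-per-VM ratio."""
    ratio = pending_jobs / max(active_vms, 1)
    if ratio > upper and active_vms < quota:
        return min(step, quota - active_vms)          # expand the resource pool
    if ratio < lower and active_vms > 0:
        return -min(step, active_vms)                 # shrink the resource pool
    return 0

# One decision cycle over several hypothetical experiment queues
queues = {"exp_a": (120, 30, 80), "exp_b": (5, 40, 60), "exp_c": (0, 10, 50)}
for name, (pending, vms, quota) in queues.items():
    change = scaling_decision(pending, vms, quota)
    print(f"{name:6s} pending={pending:4d} vms={vms:3d} -> change {change:+d}")
```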

  2. Do Clouds Compute? A Framework for Estimating the Value of Cloud Computing

    Science.gov (United States)

    Klems, Markus; Nimis, Jens; Tai, Stefan

    On-demand provisioning of scalable and reliable compute services, along with a cost model that charges consumers based on actual service usage, has been an objective in distributed computing research and industry for a while. Cloud Computing promises to deliver on this objective: consumers are able to rent infrastructure in the Cloud as needed, deploy applications and store data, and access them via Web protocols on a pay-per-use basis. The acceptance of Cloud Computing, however, depends on the ability for Cloud Computing providers and consumers to implement a model for business value co-creation. Therefore, a systematic approach to measure costs and benefits of Cloud Computing is needed. In this paper, we discuss the need for valuation of Cloud Computing, identify key components, and structure these components in a framework. The framework assists decision makers in estimating Cloud Computing costs and to compare these costs to conventional IT solutions. We demonstrate by means of representative use cases how our framework can be applied to real world scenarios.
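
    The kind of comparison such a framework supports can be illustrated with a deliberately simplified calculation: an on-premises deployment sized for peak load versus pay-per-use cloud instances sized for average load. All figures are invented, and many cost components (staff, networking, egress, discounts) are omitted.

```python
def on_premises_cost(servers, server_price, years, annual_ops_per_server):
    """Straight-line view: upfront hardware plus yearly operations."""
    return servers * server_price + servers * annual_ops_per_server * years

def cloud_cost(avg_instances, hourly_rate, years):
    """Pay-per-use: average concurrent instances times hours used."""
    return avg_instances * hourly_rate * 24 * 365 * years

# Invented figures for a workload with a peak of 40 servers but an average of 12
horizon = 3  # years
onprem = on_premises_cost(servers=40, server_price=6000, years=horizon,
                          annual_ops_per_server=1500)
cloud = cloud_cost(avg_instances=12, hourly_rate=0.50, years=horizon)
print(f"on-premises (sized for peak): ${onprem:,.0f}")
print(f"cloud (pay per use):          ${cloud:,.0f}")
```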

  3. Peat resource estimation in South Carolina. Final report, Year 2

    Energy Technology Data Exchange (ETDEWEB)

    Holmes, M.; Andrejko, M.; Corvinus, D.; Tisdale, M.

    1982-01-01

    South Carolina has few indigenous energy resources. Most widely known and utilized are hydropower, wood, and solar. Peat is a material composed of partially decomposed organic matter that, after burial for long periods of time, may eventually become coal. Peat is utilized as an energy resource for the production of electricity and for home heating in Europe and the Soviet Union. There are peat deposits in South Carolina, but peat has never been used as an energy resource within the state. This report presents the results of the two years of a planned four-year study of the quantity and energy potential of peat in South Carolina. In this year's survey two activities were undertaken. The first was to visit highly probable peat deposits to confirm the presence of fuel-grade peat. The second was to survey and characterize in more detail the areas judged to be of highest potential as major resources. The factors carrying the greatest weight in our determination of priority areas were: (1) a description of peat deposits in the scientific literature or from discussions with state and federal soil scientists; (2) mention of organic soils on soil maps or in the literature; and (3) information from farmers and other local citizens.

  4. A review of Computer Science resources for learning and teaching with K-12 computing curricula: an Australian case study

    Science.gov (United States)

    Falkner, Katrina; Vivian, Rebecca

    2015-10-01

    To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age children, with the intention to engage children and increase interest, rather than to formally teach concepts and skills. What is the educational quality of existing Computer Science resources and to what extent are they suitable for classroom learning and teaching? In this paper, an assessment framework is presented to evaluate the quality of online Computer Science resources. Further, a semi-systematic review of available online Computer Science resources was conducted to evaluate resources available for classroom learning and teaching and to identify gaps in resource availability, using the Australian curriculum as a case study analysis. The findings reveal a predominance of quality resources, however, a number of critical gaps were identified. This paper provides recommendations and guidance for the development of new and supplementary resources and future research.

  5. Estimating the Wind Resource in Uttarakhand: Comparison of Dynamic Downscaling with Doppler Lidar Wind Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Lundquist, J. K. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Pukayastha, A. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Martin, C. [Univ. of Colorado, Boulder, CO (United States); Newsom, R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-03-01

    Previous estimates of the wind resources in Uttarakhand, India, suggest minimal wind resources in this region. To explore whether or not the complex terrain in fact provides localized regions of wind resource, the authors of this study employed a dynamic downscaling method with the Weather Research and Forecasting model, providing detailed estimates of winds at approximately 1 km resolution in the finest nested simulation.

  6. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform model calibration, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis for both linear and nonlinear problems. PAPIRUS was developed by implementing multiple packages of methodologies and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in PAPIRUS with multiple computing resources and proper communication between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description of PAPIRUS and its graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis implemented in the toolkit, with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper
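
    The parallel pattern at the heart of such a toolkit is running many model evaluations with perturbed parameters concurrently and collecting the output distribution. The generic sketch below (a toy stand-in model, not PAPIRUS itself) uses a process pool for the uncertainty propagation step.

```python
import math
import multiprocessing as mp
import random
import statistics

def simulation_model(params):
    """Stand-in for an engineering simulation code: temperature remaining
    after 10 s of first-order cooling (toy physics, invented)."""
    t0, k = params
    return t0 * math.exp(-k * 10.0)

def sample_parameters(n, seed=0):
    """Draw parameter samples from assumed input uncertainty distributions."""
    rng = random.Random(seed)
    return [(rng.gauss(500.0, 20.0), rng.gauss(0.05, 0.005)) for _ in range(n)]

if __name__ == "__main__":
    samples = sample_parameters(1000)
    with mp.Pool(processes=4) as pool:                 # parallel model runs
        outputs = pool.map(simulation_model, samples)  # uncertainty propagation
    print("output mean:", round(statistics.mean(outputs), 1))
    print("output std: ", round(statistics.stdev(outputs), 1))
```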

  7. Decision support for hospital bed management using adaptable individual length of stay estimations and shared resources.

    Science.gov (United States)

    Schmidt, Robert; Geisler, Sandra; Spreckelsen, Cord

    2013-01-07

    Elective patient admission and assignment planning is an important task of the strategic and operational management of a hospital and early on became a central topic of clinical operations research. The management of hospital beds is an important subtask. Various approaches have been proposed, involving the computation of efficient assignments with regard to the patients' condition, the necessity of the treatment, and the patients' preferences. However, these approaches are mostly based on static, unadaptable estimates of the length of stay and, thus, do not take into account the uncertainty of the patient's recovery. Furthermore, the effect of aggregated bed capacities have not been investigated in this context. Computer supported bed management, combining an adaptable length of stay estimation with the treatment of shared resources (aggregated bed capacities) has not yet been sufficiently investigated. The aim of our work is: 1) to define a cost function for patient admission taking into account adaptable length of stay estimations and aggregated resources, 2) to define a mathematical program formally modeling the assignment problem and an architecture for decision support, 3) to investigate four algorithmic methodologies addressing the assignment problem and one base-line approach, and 4) to evaluate these methodologies w.r.t. cost outcome, performance, and dismissal ratio. The expected free ward capacity is calculated based on individual length of stay estimates, introducing Bernoulli distributed random variables for the ward occupation states and approximating the probability densities. The assignment problem is represented as a binary integer program. Four strategies for solving the problem are applied and compared: an exact approach, using the mixed integer programming solver SCIP; and three heuristic strategies, namely the longest expected processing time, the shortest expected processing time, and random choice. A baseline approach serves to compare these
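
    The Bernoulli-occupancy idea can be made concrete: given each current patient's probability of still being present on a future day, the number of occupied beds follows a Poisson-binomial distribution, from which the expected free capacity follows directly. The probabilities below are invented; the paper approximates the densities rather than convolving exactly as done here.

```python
def occupied_bed_distribution(stay_probs):
    """Exact Poisson-binomial distribution of the number of occupied beds,
    given each current patient's probability of still being present."""
    dist = [1.0]                          # P(0 occupied) = 1 before any patient
    for p in stay_probs:
        new = [0.0] * (len(dist) + 1)
        for k, prob in enumerate(dist):
            new[k] += prob * (1 - p)      # patient discharged by then
            new[k + 1] += prob * p        # patient still occupying a bed
        dist = new
    return dist

def expected_free_beds(capacity, stay_probs):
    dist = occupied_bed_distribution(stay_probs)
    return sum(max(capacity - k, 0) * prob for k, prob in enumerate(dist))

# Probabilities that each of 8 current patients is still admitted in 3 days,
# e.g. derived from adaptable length-of-stay estimates (values invented)
stay_probs = [0.9, 0.8, 0.7, 0.6, 0.5, 0.3, 0.2, 0.1]
print("expected free beds on a 10-bed ward:",
      round(expected_free_beds(10, stay_probs), 2))
```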

  8. SYSTEMATIC LITERATURE REVIEW ON RESOURCE ALLOCATION AND RESOURCE SCHEDULING IN CLOUD COMPUTING

    OpenAIRE

    B. Muni Lavanya; C. Shoba Bindu

    2016-01-01

    The objective of this work is to highlight the key features of, and offer future directions for, the research community on Resource Allocation, Resource Scheduling and Resource Management from 2009 to 2016, exemplifying how research on these topics has progressively increased over the past decade by inspecting articles and papers from scientific and standard publications. The survey materialized as a three-fold process. Firstly, investigate on t...

  9. The Relative Effectiveness of Computer-Based and Traditional Resources for Education in Anatomy

    Science.gov (United States)

    Khot, Zaid; Quinlan, Kaitlyn; Norman, Geoffrey R.; Wainman, Bruce

    2013-01-01

    There is increasing use of computer-based resources to teach anatomy, although no study has compared computer-based learning to traditional. In this study, we examine the effectiveness of three formats of anatomy learning: (1) a virtual reality (VR) computer-based module, (2) a static computer-based module providing Key Views (KV), (3) a plastic…

  10. Parameter estimation and model selection in computational biology.

    Directory of Open Access Journals (Sweden)

    Gabriele Lillacci

    2010-03-01

    Full Text Available A central challenge in computational modeling of biological systems is the determination of the model parameters. Typically, only a fraction of the parameters (such as kinetic rate constants) are experimentally measured, while the rest are often fitted. The fitting process is usually based on experimental time course measurements of observables, which are used to assign parameter values that minimize some measure of the error between these measurements and the corresponding model prediction. The measurements, which can come from immunoblotting assays, fluorescent markers, etc., tend to be very noisy and taken at a limited number of time points. In this work we present a new approach to the problem of parameter selection of biological models. We show how one can use a dynamic recursive estimator, known as the extended Kalman filter, to arrive at estimates of the model parameters. The proposed method proceeds as follows. First, we use a variation of the Kalman filter that is particularly well suited to biological applications to obtain a first guess for the unknown parameters. Second, we employ an a posteriori identifiability test to check the reliability of the estimates. Finally, we solve an optimization problem to refine the first guess in case it should not be accurate enough. The final estimates are guaranteed to be statistically consistent with the measurements. Furthermore, we show how the same tools can be used to discriminate among alternate models of the same biological process. We demonstrate these ideas by applying our methods to two examples, namely a model of the heat shock response in E. coli, and a model of a synthetic gene regulation system. The methods presented are quite general and may be applied to a wide class of biological systems where noisy measurements are used for parameter estimation or model selection.
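
    The general idea of joint state-and-parameter estimation with an extended Kalman filter can be sketched on a toy model. The block below estimates the decay rate of a simple exponential-decay system from noisy observations by augmenting the state vector with the unknown parameter; the model, noise levels, and tuning values are illustrative assumptions and are unrelated to the heat-shock or gene-regulation models in the paper.

      import numpy as np

      # Joint state-parameter EKF for dx/dt = -k*x, estimating k from noisy observations of x.
      np.random.seed(0)
      dt, k_true, x_true = 0.1, 0.8, 5.0
      measurements = []
      for _ in range(100):
          x_true = x_true + dt * (-k_true * x_true)
          measurements.append(x_true + np.random.normal(0.0, 0.05))

      # Augmented state z = [x, k]; k is modelled as a constant with small process noise.
      z = np.array([4.0, 0.3])                  # initial guess for x and k
      P = np.diag([1.0, 1.0])
      Q = np.diag([1e-5, 1e-5])
      R = np.array([[0.05 ** 2]])
      H = np.array([[1.0, 0.0]])                # we observe x only

      for y in measurements:
          # Predict: Euler step of the augmented dynamics and its Jacobian.
          xz, kz = z
          z_pred = np.array([xz + dt * (-kz * xz), kz])
          F = np.array([[1.0 - dt * kz, -dt * xz],
                        [0.0, 1.0]])
          P = F @ P @ F.T + Q
          # Update with the scalar measurement.
          S = H @ P @ H.T + R
          K = P @ H.T @ np.linalg.inv(S)
          z = z_pred + K @ (np.array([y]) - H @ z_pred)
          P = (np.eye(2) - K @ H) @ P

      print("estimated decay rate:", z[1])      # moves toward the true value 0.8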

  11. Professional Computer Education Organizations--A Resource for Administrators.

    Science.gov (United States)

    Ricketts, Dick

    Professional computer education organizations serve a valuable function by generating, collecting, and disseminating information concerning the role of the computer in education. This report touches briefly on the reasons for the rapid and successful development of professional computer education organizations. A number of attributes of effective…

  12. Discovery of resources using MADM approaches for parallel and distributed computing

    Directory of Open Access Journals (Sweden)

    Mandeep Kaur

    2017-06-01

    Full Text Available Grid, a form of parallel and distributed computing, allows the sharing of data and computational resources among its users from various geographical locations. The grid resources are diverse in terms of their underlying attributes. The majority of the state-of-the-art resource discovery techniques rely on the static resource attributes during resource selection. However, the matching resources based on the static resource attributes may not be the most appropriate resources for the execution of user applications because they may have heavy job loads, less storage space or less working memory (RAM). Hence, there is a need to consider the current state of the resources in order to find the most suitable resources. In this paper, we have proposed a two-phased multi-attribute decision making (MADM) approach for discovery of grid resources by using P2P formalism. The proposed approach considers multiple resource attributes for decision making of resource selection and provides the most suitable resource(s) to grid users. The first phase describes a mechanism to discover all matching resources and applies the SAW method to shortlist the top-ranked resources, which are communicated to the requesting super-peer. The second phase of our proposed methodology applies the integrated MADM approach (AHP-enriched PROMETHEE-II) to the list of selected resources received from different super-peers. The pairwise comparison of the resources with respect to their attributes is made and the rank of each resource is determined. The top-ranked resource is then communicated to the grid user by the grid scheduler. Our proposed methodology enables the grid scheduler to allocate the most suitable resource to the user application and also reduces the search complexity by filtering out the less suitable resources during resource discovery.
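
    The SAW ranking step of phase one can be illustrated with a small sketch. The attribute names, weights, and resource values below are hypothetical, and the second phase (AHP-enriched PROMETHEE-II) is omitted entirely; this is only the generic simple additive weighting idea, not the authors' implementation.

      def saw_rank(resources, weights, benefit_attrs):
          """Simple Additive Weighting: min-max normalize each attribute, then score
          each resource by the weighted sum of its normalized values. Attributes in
          'benefit_attrs' are better when larger; the rest are treated as costs."""
          scores = {}
          for name, res in resources.items():
              score = 0.0
              for attr, weight in weights.items():
                  values = [r[attr] for r in resources.values()]
                  lo, hi = min(values), max(values)
                  if hi == lo:
                      norm = 1.0
                  elif attr in benefit_attrs:
                      norm = (res[attr] - lo) / (hi - lo)
                  else:                       # cost attribute: smaller is better
                      norm = (hi - res[attr]) / (hi - lo)
                  score += weight * norm
              scores[name] = score
          return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

      # Hypothetical grid resources described by dynamic attributes.
      resources = {
          "nodeA": {"free_ram_gb": 16, "free_disk_gb": 500, "job_load": 0.7},
          "nodeB": {"free_ram_gb": 32, "free_disk_gb": 200, "job_load": 0.2},
          "nodeC": {"free_ram_gb": 8,  "free_disk_gb": 800, "job_load": 0.5},
      }
      weights = {"free_ram_gb": 0.5, "free_disk_gb": 0.2, "job_load": 0.3}
      print(saw_rank(resources, weights, benefit_attrs={"free_ram_gb", "free_disk_gb"}))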

  13. Building an application for computing the resource requests such as disk, CPU, and tape and studying the time evolution of computing model

    CERN Document Server

    Noormandipour, Mohammad Reza

    2017-01-01

    The goal of this project was to build an application that calculates the computing resources needed by the LHCb experiment for data processing and analysis, and predicts their evolution in future years. The source code was developed in the Python programming language, and the application was built and developed in CERN GitLab. This application will facilitate the calculation of resources required by LHCb in both qualitative and quantitative aspects. The granularity of computations is improved to a weekly basis, in contrast with the yearly basis used so far. The LHCb computing model will benefit from the new possibilities and options added, as the new predictions and calculations are aimed at giving more realistic and accurate estimates.
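
    A weekly-granularity resource calculation could look roughly like the sketch below, where a weekly luminosity profile drives data volume, CPU, and tape needs. All constants, units, variable names, and numbers here are invented for illustration and are not taken from the LHCb computing model or from the application described above.

      # Hypothetical weekly accumulation of CPU and storage needs: each week's
      # recorded luminosity drives the data volume, which drives CPU and tape.
      weekly_lumi_fb = [0.0, 0.2, 0.5, 0.5, 0.4, 0.0, 0.6]   # fb^-1 per week (made up)
      RAW_TB_PER_FB = 120.0        # assumed raw-data volume per unit luminosity
      CPU_KHS06_PER_TB = 3.5       # assumed processing cost per TB
      TAPE_COPIES = 2              # assumed number of archival copies

      cumulative_tape_tb = 0.0
      for week, lumi in enumerate(weekly_lumi_fb, start=1):
          raw_tb = lumi * RAW_TB_PER_FB
          cpu_khs06_weeks = raw_tb * CPU_KHS06_PER_TB
          cumulative_tape_tb += raw_tb * TAPE_COPIES
          print(f"week {week:2d}: raw={raw_tb:6.1f} TB  "
                f"cpu={cpu_khs06_weeks:6.1f} kHS06-weeks  tape(cum)={cumulative_tape_tb:7.1f} TB")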

  14. Comparison of methods used to estimate conventional undiscovered petroleum resources: World examples

    Science.gov (United States)

    Ahlbrandt, T.S.; Klett, T.R.

    2005-01-01

    Various methods for assessing undiscovered oil, natural gas, and natural gas liquid resources were compared in support of the USGS World Petroleum Assessment 2000. Discovery process, linear fractal, parabolic fractal, engineering estimates, PETRIMES, Delphi, and the USGS 2000 methods were compared. Three comparisons of these methods were made in: (1) the Neuquen Basin province, Argentina (different assessors, same input data); (2) provinces in North Africa, Oman, and Yemen (same assessors, different methods); and (3) the Arabian Peninsula, Arabian (Persian) Gulf, and North Sea (different assessors, different methods). A fourth comparison (same assessors, same assessment methods but different geologic models), between results from structural and stratigraphic assessment units in the North Sea used only the USGS 2000 method, and hence compared the type of assessment unit rather than the method. In comparing methods, differences arise from inherent differences in assumptions regarding: (1) the underlying distribution of the parent field population (all fields, discovered and undiscovered), (2) the population of fields being estimated; that is, the entire parent distribution or the undiscovered resource distribution, (3) inclusion or exclusion of large outlier fields; (4) inclusion or exclusion of field (reserve) growth, (5) deterministic or probabilistic models, (6) data requirements, and (7) scale and time frame of the assessment. Discovery process, Delphi subjective consensus, and the USGS 2000 method yield comparable results because similar procedures are employed. In mature areas such as the Neuquen Basin province in Argentina, the linear and parabolic fractal and engineering methods were conservative compared to the other five methods and relative to new reserve additions there since 1995. The PETRIMES method gave the most optimistic estimates in the Neuquen Basin. In less mature areas, the linear fractal method yielded larger estimates relative to other methods

  15. Early estimation of resource expenditures and program size

    Science.gov (United States)

    Card, D.

    1983-01-01

    A substantial amount of software engineering research effort was focused on the development of software estimation models. A measure of lines of code was derived based on the origin of the delivered code, L_dev = N + E + 0.2(S + O) (where L_dev = developed lines of code, N = newly implemented lines of code, E = extensively modified lines of code, S = slightly modified lines of code, and O = old unchanged lines of code), that is substituted in the following equation: H_s = aL^b (where H_s = staff-hours of effort, L = lines of code, a = a constant, and b = a constant). The limitations of this model are discussed and some alternative estimation models that can be used earlier in the development process are proposed.
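
    The two formulas combine into a straightforward calculation. The sketch below evaluates them with invented line counts and invented calibration constants a and b; in practice those constants would be fitted to historical project data, so the output is purely illustrative.

      def developed_lines(new, extensively_modified, slightly_modified, old_unchanged):
          """L_dev = N + E + 0.2 * (S + O), the weighted 'developed lines of code'."""
          return new + extensively_modified + 0.2 * (slightly_modified + old_unchanged)

      def staff_hours(lines, a, b):
          """H_s = a * L ** b; 'a' and 'b' are calibration constants fitted to past projects."""
          return a * lines ** b

      l_dev = developed_lines(new=12000, extensively_modified=3000,
                              slightly_modified=5000, old_unchanged=20000)
      print(l_dev)                              # 12000 + 3000 + 0.2 * 25000 = 20000
      print(staff_hours(l_dev, a=1.4, b=0.93))  # hypothetical calibration constants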

  16. South African uranium resource and production capability estimates

    International Nuclear Information System (INIS)

    Camisani-Calzolari, F.A.G.M.; Toens, P.D.

    1980-09-01

    South Africa, along with Canada and the United States, submitted forecasts of uranium capacities and capabilities to the year 2025 for the 1979 'Red Book' edition. This report deals with the methodologies used in arriving at the South African forecasts. As the future production trends of the South African uranium producers cannot be confidently defined, chiefly because uranium is extracted as a by-product of the gold mining industry and is thus highly sensitive to market fluctuations for both uranium and gold, the Evaluation Group of the Atomic Energy Board has carried out numerous forecast exercises using current and historical norms and assuming various degrees of 'adverse', 'normal' and 'most favourable' conditions. The two exercises, which were submitted for the 'Red Book', are shown in the Appendices. This paper has been prepared for presentation to the Working Group on Methodologies for Forecasting Uranium Availability of the NEA/IAEA Steering Group on Uranium Resources.

  17. An interactive computer approach to performing resource analysis for a multi-resource/multi-project problem. [Spacelab inventory procurement planning

    Science.gov (United States)

    Schlagheck, R. A.

    1977-01-01

    New planning techniques and supporting computer tools are needed for the optimization of resources and costs for space transportation and payload systems. Heavy emphasis on cost effective utilization of resources has caused NASA program planners to look at the impact of various independent variables that affect procurement buying. A description is presented of a category of resource planning which deals with Spacelab inventory procurement analysis. Spacelab is a joint payload project between NASA and the European Space Agency and will be flown aboard the Space Shuttle starting in 1980. In order to respond rapidly to the various procurement planning exercises, a system was built that could perform resource analysis in a quick and efficient manner. This system is known as the Interactive Resource Utilization Program (IRUP). Attention is given to aspects of problem definition, an IRUP system description, questions of data base entry, the approach used for project scheduling, and problems of resource allocation.

  18. RESOURCE-EFFICIENT ALLOCATION HEURISTICS FOR MANAGEMENT OF DATA CENTERS FOR CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    Vitaliy Litvinov

    2014-07-01

    Full Text Available A survey of research in resource-efficient computing and architectural principles for resource-efficient management of Clouds are offered in this article. Resource-efficient resource allocation policies and scheduling algorithms considering QoS expectations and power usage characteristics of the devices are defined.

  19. Error Estimation and Uncertainty Propagation in Computational Fluid Mechanics

    Science.gov (United States)

    Zhu, J. Z.; He, Guowei; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    Numerical simulation has now become an integral part of the engineering design process. Critical design decisions are routinely made based on the simulation results and conclusions. Verification and validation of the reliability of the numerical simulation is therefore vitally important in the engineering design process. We propose to develop theories and methodologies that can automatically provide quantitative information about the reliability of the numerical simulation by estimating numerical approximation error, computational model induced errors and the uncertainties contained in the mathematical models so that the reliability of the numerical simulation can be verified and validated. We also propose to develop and implement methodologies and techniques that can control the error and uncertainty during the numerical simulation so that the reliability of the numerical simulation can be improved.

  20. State-Estimation Algorithm Based on Computer Vision

    Science.gov (United States)

    Bayard, David; Brugarolas, Paul

    2007-01-01

    An algorithm and software to implement the algorithm are being developed as means to estimate the state (that is, the position and velocity) of an autonomous vehicle, relative to a visible nearby target object, to provide guidance for maneuvering the vehicle. In the original intended application, the autonomous vehicle would be a spacecraft and the nearby object would be a small astronomical body (typically, a comet or asteroid) to be explored by the spacecraft. The algorithm could also be used on Earth in analogous applications -- for example, for guiding underwater robots near such objects of interest as sunken ships, mineral deposits, or submerged mines. It is assumed that the robot would be equipped with a vision system that would include one or more electronic cameras, image-digitizing circuitry, and an image-data-processing computer that would generate feature-recognition data products.

  1. What is the resource footprint of a computer science department? Place, People and Pedagogy

    OpenAIRE

    Mian, I.S; Twisleton, D.; Timm, D.

    2017-01-01

    Our goal is formulating policies and developing guidelines that create a more resilient and healthier Department of Computer Science at University College London: a living laboratory for teaching and learning about resource constrained computing, computation and communication. Here, we outline a roadmap and propose high-level principles to aid this effort. We focus on how, when and where resources – energy, (raw) materials including water, space and time – are consumed by the building (place)...

  2. Data Security Risk Estimation for Information-Telecommunication Systems on the basis of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Anatoly Valeryevich Tsaregorodtsev

    2014-02-01

    Full Text Available Cloud computing will be one of the most common IT technologies to deploy applications, due to its key features: on-demand network access to a shared pool of configurable computing resources, flexibility and good quality/price ratio. Migrating to cloud architecture enables organizations to reduce the overall cost of implementing and maintaining the infrastructure and reduce development time for new business applications. There are many factors that influence the information security environment of the cloud, as its multitenant architecture brings new and more complex problems and vulnerabilities. An approach to risk estimation, used in making decisions about the migration of critical data into the organization's cloud infrastructure, is proposed in the paper.

  3. Energy reserves. [Summary of reserve estimates and economic supply models for exhaustible resources

    Energy Technology Data Exchange (ETDEWEB)

    Tessmer, R.G. Jr.; Carhart, S.C.; Marcuse, W.

    1977-03-01

    There is an increasing concern about scarcity of the world's remaining natural energy resources and, in particular, the future supply of oil and natural gas. This paper summarizes recent estimates of energy reserves and economic supply models for exhaustible resources. The basic economic theory of resource exhaustion is reviewed, and recent estimates of both discovered and undiscovered energy resources are presented and compared. Domestic and world-wide reserve estimates are presented for crude oil and natural gas liquids, natural gas, coal, and uranium. Economic models projecting supply of these energy forms, given reserve estimates and other pertinent information, are discussed. Finally, a set of recent models which project world oil prices are summarized and their published results compared. The impact of energy conservation efforts on energy supply is also briefly discussed. 53 references.

  4. Computer Crime: Criminal Justice Resource Manual (Second Edition).

    Science.gov (United States)

    Parker, Donn B.

    This advanced training and reference manual is designed to aid investigators and prosecutors in dealing with white collar computer crime. The first five sections follow the typical order of events for prosecutors handling a criminal case: classifying the crime, computer abuse methods and detection, experts and suspects using information systems,…

  5. Estimating the Economic Impacts of Recreation Response to Resource Management Alternatives

    Science.gov (United States)

    Donald B.K. English; J. Michael Bowker; John C. Bergstrom; H. Ken Cordell

    1995-01-01

    Managing forest resources involves tradeoffs and making decisions among resource management alternatives. Some alternatives will lead to changes in the level of recreation visitation and the amount of associated visitor spending. Thus, the alternatives can affect local economies. This paper reports a method that can be used to estimate the economic impacts of such...

  6. Quantum computing with incoherent resources and quantum jumps.

    Science.gov (United States)

    Santos, M F; Cunha, M Terra; Chaves, R; Carvalho, A R R

    2012-04-27

    Spontaneous emission and the inelastic scattering of photons are two natural processes usually associated with decoherence and the reduction in the capacity to process quantum information. Here we show that, when suitably detected, these photons are sufficient to build all the fundamental blocks needed to perform quantum computation in the emitting qubits while protecting them from deleterious dissipative effects. We exemplify this by showing how to efficiently prepare graph states for the implementation of measurement-based quantum computation.

  7. Forest Resource Management System by Standing Tree Volume Estimation Using Aerial Stereo Photos

    Science.gov (United States)

    Kamiya, T.; Koizumi, H.; Wang, J.; Itaya, A.

    2012-07-01

    Forest resource management usually requires much human labour for field surveys to keep the data up to date, especially in mountainous areas. Furthermore, forest resources are drawing more and more attention not only as lumber resources but also as biomass resources in terms of alternative energy. This paper describes a novel system for forest resource management based on three-dimensional data acquired from stereo matching of aerial photographs. The proposed system consists of image analysis of aerial photographs for forest resource estimation, and a GIS system aiming at better management of the forest resources. We have built a prototype GIS system and applied it to the experimental forest in Mie prefecture, Japan.

  8. Crowd-Funding: A New Resource Cooperation Mode for Mobile Cloud Computing.

    Science.gov (United States)

    Zhang, Nan; Yang, Xiaolong; Zhang, Min; Sun, Yan

    2016-01-01

    Mobile cloud computing, which integrates the cloud computing techniques into the mobile environment, is regarded as one of the enabler technologies for 5G mobile wireless networks. There are many sporadic spare resources distributed within various devices in the networks, which can be used to support mobile cloud applications. However, these devices, with only a few spare resources, cannot support some resource-intensive mobile applications alone. If some of them cooperate with each other and share their resources, then they can support many applications. In this paper, we propose a resource cooperative provision mode referred to as "Crowd-funding", which is designed to aggregate the distributed devices together as the resource provider of mobile applications. Moreover, to facilitate high-efficiency resource management via dynamic resource allocation, different resource providers should be selected to form a stable resource coalition for different requirements. Thus, considering different requirements, we propose two different resource aggregation models for coalition formation. Finally, we may allocate the revenues based on their attributions according to the concept of the "Shapley value" to enable a more impartial revenue share among the cooperators. It is shown that a dynamic and flexible resource-management method can be developed based on the proposed Crowd-funding model, relying on the spare resources in the network.
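
    The Shapley-value revenue split mentioned above can be illustrated on a tiny coalition. The device names, resource numbers, and characteristic function below are invented assumptions; the sketch only shows the standard definition of the Shapley value (the average marginal contribution over all orderings), not the paper's allocation scheme.

      from itertools import permutations

      def shapley_values(players, value):
          """Exact Shapley values by averaging each player's marginal contribution
          over all join orderings (fine for small coalitions)."""
          totals = {p: 0.0 for p in players}
          orderings = list(permutations(players))
          for order in orderings:
              coalition = []
              for p in order:
                  before = value(frozenset(coalition))
                  coalition.append(p)
                  after = value(frozenset(coalition))
                  totals[p] += after - before
          return {p: totals[p] / len(orderings) for p in players}

      # Hypothetical characteristic function: revenue a coalition of devices can earn
      # from the pooled resources (e.g. CPU plus bandwidth) it contributes.
      contribution = {"phone": 2.0, "tablet": 3.0, "laptop": 5.0}

      def coalition_revenue(coalition):
          pooled = sum(contribution[p] for p in coalition)
          return pooled if pooled >= 4.0 else 0.0   # too few resources earn nothing

      print(shapley_values(list(contribution), coalition_revenue))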

  9. Crowd-Funding: A New Resource Cooperation Mode for Mobile Cloud Computing.

    Directory of Open Access Journals (Sweden)

    Nan Zhang

    Full Text Available Mobile cloud computing, which integrates the cloud computing techniques into the mobile environment, is regarded as one of the enabler technologies for 5G mobile wireless networks. There are many sporadic spare resources distributed within various devices in the networks, which can be used to support mobile cloud applications. However, these devices, with only a few spare resources, cannot support some resource-intensive mobile applications alone. If some of them cooperate with each other and share their resources, then they can support many applications. In this paper, we propose a resource cooperative provision mode referred to as "Crowd-funding", which is designed to aggregate the distributed devices together as the resource provider of mobile applications. Moreover, to facilitate high-efficiency resource management via dynamic resource allocation, different resource providers should be selected to form a stable resource coalition for different requirements. Thus, considering different requirements, we propose two different resource aggregation models for coalition formation. Finally, we may allocate the revenues based on their attributions according to the concept of the "Shapley value" to enable a more impartial revenue share among the cooperators. It is shown that a dynamic and flexible resource-management method can be developed based on the proposed Crowd-funding model, relying on the spare resources in the network.

  10. Computational approach in estimating the need of ditch network maintenance

    Science.gov (United States)

    Lauren, Ari; Hökkä, Hannu; Launiainen, Samuli; Palviainen, Marjo; Repo, Tapani; Finer, Leena; Piirainen, Sirpa

    2015-04-01

    Ditch network maintenance (DNM), implemented annually over a 70 000 ha area in Finland, is the most controversial of all forest management practices. Nationwide, it is estimated to increase the forest growth by 1…3 million m3 per year, but simultaneously to cause the export of 65 000 tons of suspended solids and 71 tons of phosphorus (P) to water courses. A systematic approach that allows simultaneous quantification of the positive and negative effects of DNM is required. Excess water in the rooting zone slows the gas exchange and decreases biological activity, interfering with the forest growth in boreal forested peatlands. DNM is needed when: 1) the excess water in the rooting zone restricts the forest growth before the DNM, and 2) after the DNM the growth restriction ceases or decreases, and 3) the benefits of DNM are greater than the adverse effects caused. Aeration in the rooting zone can be used as a drainage criterion. Aeration is affected by several factors such as meteorological conditions, tree stand properties, hydraulic properties of peat, ditch depth, and ditch spacing. We developed a 2-dimensional DNM simulator that allows the user to adjust these factors and to evaluate their effect on the soil aeration at different distances from the drainage ditch. The DNM simulator computes hydrological processes and soil aeration along a water flowpath between two ditches. Applying a daily time step, it calculates evapotranspiration, snow accumulation and melt, infiltration, soil water storage, ground water level, soil water content, air-filled porosity and runoff. The model performance in hydrology has been tested against independent high frequency field monitoring data. Soil aeration at different distances from the ditch is computed under a steady-state assumption using an empirical oxygen consumption model, simulated air-filled porosity, and diffusion coefficient at different depths in soil. Aeration is adequate and forest growth rate is not limited by poor aeration if the
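
    The daily time-step bookkeeping described above can be caricatured with a single-bucket water balance. This is only a toy illustration with invented parameters and crude proxies (for example, treating the unfilled fraction of storage as air-filled porosity); it is not the DNM simulator.

      # Toy daily water-balance step for a soil "bucket" between two ditches (illustrative only).
      def daily_step(storage_mm, precip_mm, pet_mm, max_storage_mm, drain_coeff):
          et = min(pet_mm, storage_mm)                 # actual evapotranspiration
          storage = storage_mm - et + precip_mm
          runoff = max(0.0, storage - max_storage_mm)  # saturation excess towards the ditch
          storage = min(storage, max_storage_mm)
          drainage = drain_coeff * storage             # lateral flow towards the ditch
          storage -= drainage
          air_filled = 1.0 - storage / max_storage_mm  # crude proxy for air-filled porosity
          return storage, runoff + drainage, air_filled

      storage = 120.0
      for day, (p, pet) in enumerate([(0, 2), (15, 1), (30, 1), (0, 3), (5, 2)], start=1):
          storage, outflow, afp = daily_step(storage, p, pet, max_storage_mm=150.0, drain_coeff=0.05)
          print(f"day {day}: storage={storage:6.1f} mm  outflow={outflow:5.1f} mm  air-filled={afp:4.2f}")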

  11. iTools: a framework for classification, categorization and integration of computational biology resources.

    Science.gov (United States)

    Dinov, Ivo D; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H V; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D Stott; Toga, Arthur W

    2008-05-28

    The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource management

  12. iTools: a framework for classification, categorization and integration of computational biology resources.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    2008-05-01

    Full Text Available The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long

  13. Computation of groundwater resources and recharge in Chithar River Basin, South India.

    Science.gov (United States)

    Subramani, T; Babu, Savithri; Elango, L

    2013-01-01

    Groundwater recharge and available groundwater resources in Chithar River basin, Tamil Nadu, India, spread over an area of 1,722 km², have been estimated by considering various hydrological, geological, and hydrogeological parameters, such as rainfall infiltration, drainage, geomorphic units, land use, rock types, depth of weathered and fractured zones, nature of soil, water level fluctuation, saturated thickness of aquifer, and groundwater abstraction. The digital ground elevation models indicate that the regional slope of the basin is towards east. The Proterozoic (Post-Archaean) basement of the study area consists of quartzite, calc-granulite, crystalline limestone, charnockite, and biotite gneiss with or without garnet. Three major soil types were identified namely, black cotton, deep red, and red sandy soils. The rainfall intensity gradually decreases from west to east. Groundwater occurs under water table conditions in the weathered zone and fluctuates between 0 and 25 m. The water table reaches its maximum during January, after the northeast monsoon, and its minimum during October. Groundwater abstraction for domestic/stock and irrigational needs in Chithar River basin has been estimated as 148.84 MCM (million m³). Groundwater recharge due to monsoon rainfall infiltration has been estimated as 170.05 MCM based on the water level rise during the monsoon period. It is also estimated as 173.9 MCM using the rainfall infiltration factor. An amount of 53.8 MCM of water is contributed to groundwater from surface water bodies. Recharge of groundwater due to return flow from irrigation has been computed as 147.6 MCM. The static groundwater reserve in Chithar River basin is estimated as 466.66 MCM and the dynamic reserve is about 187.7 MCM. In the present scenario, the aquifer is under safe condition for extraction of groundwater for domestic and irrigation purposes. If the existing water bodies are maintained properly, the extraction rate can be increased in the future by about 10% to 15%.
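
    Two of the recharge estimates referred to above (from the water-table fluctuation and the rainfall-infiltration-factor methods) reduce to simple arithmetic. The sketch below uses the basin area from the abstract but otherwise invented parameter values, so the outputs are illustrative only and should not be read as the study's results.

      def recharge_wlf(area_km2, water_level_rise_m, specific_yield):
          """Water-table fluctuation method: recharge = area * rise * specific yield (in MCM)."""
          area_m2 = area_km2 * 1e6
          return area_m2 * water_level_rise_m * specific_yield / 1e6

      def recharge_rif(area_km2, monsoon_rainfall_mm, infiltration_factor):
          """Rainfall-infiltration-factor method: a fixed fraction of monsoon rainfall (in MCM)."""
          area_m2 = area_km2 * 1e6
          return area_m2 * (monsoon_rainfall_mm / 1000.0) * infiltration_factor / 1e6

      # Hypothetical rise, specific yield, rainfall, and infiltration factor.
      print(recharge_wlf(area_km2=1722, water_level_rise_m=3.3, specific_yield=0.03))
      print(recharge_rif(area_km2=1722, monsoon_rainfall_mm=900, infiltration_factor=0.11))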

  14. Assessing attitudes toward computers and the use of Internet resources among undergraduate microbiology students

    Science.gov (United States)

    Anderson, Delia Marie Castro

    Computer literacy and use have become commonplace in our colleges and universities. In an environment that demands the use of technology, educators should be knowledgeable of the components that make up the overall computer attitude of students and be willing to investigate the processes and techniques of effective teaching and learning that can take place with computer technology. The purpose of this study is twofold. First, it investigates the relationship between computer attitudes and gender, ethnicity, and computer experience. Second, it addresses the question of whether, and to what extent, students' attitudes toward computers change over a 16-week period in an undergraduate microbiology course that supplements the traditional lecture with computer-driven assignments. Multiple regression analyses, using data from the Computer Attitudes Scale (Loyd & Loyd, 1985), showed that, in the experimental group, no significant relationships were found between computer anxiety and gender or ethnicity or between computer confidence and gender or ethnicity. However, students who used computers the longest (p = .001) and who were self-taught (p = .046) had the lowest computer anxiety levels. Likewise, students who used computers the longest (p = .001) and who were self-taught (p = .041) had the highest confidence levels. No significant relationships between computer liking, usefulness, or the use of Internet resources and gender, ethnicity, or computer experience were found. Dependent T-tests were performed to determine whether computer attitude scores (pretest and posttest) increased over a 16-week period for students who had been exposed to computer-driven assignments and other Internet resources. Results showed that students in the experimental group were less anxious about working with computers and considered computers to be more useful. In the control group, no significant changes in computer anxiety, confidence, liking, or usefulness were noted. Overall, students in

  15. Optimal Computing Resource Management Based on Utility Maximization in Mobile Crowdsourcing

    Directory of Open Access Journals (Sweden)

    Haoyu Meng

    2017-01-01

    Full Text Available Mobile crowdsourcing, as an emerging service paradigm, enables the computing resource requestor (CRR) to outsource computation tasks to each computing resource provider (CRP). Considering the importance of pricing as an essential incentive to coordinate the real-time interaction among the CRR and CRPs, in this paper, we propose an optimal real-time pricing strategy for computing resource management in mobile crowdsourcing. Firstly, we analytically model the CRR and CRPs' behaviors in the form of carefully selected utility and cost functions, based on concepts from microeconomics. Secondly, we propose a distributed algorithm through the exchange of control messages, which contain the information of computing resource demand/supply and real-time prices. We show that there exist real-time prices that can align individual optimality with systematic optimality. Finally, we also take account of the interaction among CRPs and formulate the computing resource management as a game with Nash equilibrium achievable via best response. Simulation results demonstrate that the proposed distributed algorithm can potentially benefit both the CRR and CRPs. The coordinator in mobile crowdsourcing can thus use the optimal real-time pricing strategy to manage computing resources towards the benefit of the overall system.
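
    A toy price-coordination loop conveys the flavour of such a distributed pricing scheme: a requestor adjusts a single price while each provider replies with its profit-maximizing supply. The quadratic cost model, step size, and all numbers below are assumptions for illustration, not the utility and cost functions of the paper.

      # The requestor announces a price, each provider best-responds with its supply,
      # and the price is nudged until supply meets demand (a simple tatonnement loop).
      DEMAND = 10.0                       # computing units the requestor needs
      costs = [0.5, 1.0, 2.0]             # provider cost coefficients: cost_i(s) = c_i * s^2

      def best_response_supply(price, c):
          # Provider maximizes price*s - c*s^2, giving s = price / (2c).
          return price / (2.0 * c)

      price, step = 1.0, 0.2
      for _ in range(200):
          supply = sum(best_response_supply(price, c) for c in costs)
          price += step * (DEMAND - supply)        # raise price if demand exceeds supply
      price = max(price, 0.0)
      supplies = [round(best_response_supply(price, c), 2) for c in costs]
      print(f"clearing price ~ {price:.3f}, supplies ~ {supplies}")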

  16. PNNL supercomputer to become largest computing resource on the Grid

    CERN Multimedia

    2002-01-01

    Hewlett Packard announced that the US DOE Pacific Northwest National Laboratory will connect a 9.3-teraflop HP supercomputer to the DOE Science Grid. This will be the largest supercomputer attached to a computer grid anywhere in the world (1 page).

  17. Computer System Resource Requirements of Novice Programming Students.

    Science.gov (United States)

    Nutt, Gary J.

    The characteristics of jobs that constitute the mix for lower division FORTRAN classes in a university were investigated. Samples of these programs were also benchmarked on a larger central site computer and two minicomputer systems. It was concluded that a carefully chosen minicomputer system could offer service at least the equivalent of the…

  18. Energy-efficient cloud computing : autonomic resource provisioning for datacenters

    OpenAIRE

    Tesfatsion, Selome Kostentinos

    2018-01-01

    Energy efficiency has become an increasingly important concern in data centers because of issues associated with energy consumption, such as capital costs, operating expenses, and environmental impact. While energy loss due to suboptimal use of facilities and non-IT equipment has largely been reduced through the use of best-practice technologies, addressing energy wastage in IT equipment still requires the design and implementation of energy-aware resource management systems. This thesis focu...

  19. TOWARDS NEW COMPUTATIONAL ARCHITECTURES FOR MASS-COLLABORATIVE OPENEDUCATIONAL RESOURCES

    OpenAIRE

    Ismar Frango Silveira; Xavier Ochoa; Antonio Silva Sprock; Pollyana Notargiacomo Mustaro; Yosly C. Hernandez Bieluskas

    2011-01-01

    Open Educational Resources offer several benefits, mostly in education and training. Being potentially reusable, their use can reduce the time and cost of developing educational programs, so that these savings could be transferred directly to students through the production of a large range of open, freely available content, ranging from hypermedia to digital textbooks. This paper discusses this issue and presents a project and a research network that, in spite of being directed to Latin America'...

  20. A Novel Resource Management Method of Providing Operating System as a Service for Mobile Transparent Computing

    Directory of Open Access Journals (Sweden)

    Yonghua Xiong

    2014-01-01

    Full Text Available This paper presents a framework for mobile transparent computing. It extends the PC transparent computing to mobile terminals. Since resources contain different kinds of operating systems and user data that are stored in a remote server, how to manage the network resources is essential. In this paper, we apply the technologies of quick emulator (QEMU virtualization and mobile agent for mobile transparent computing (MTC to devise a method of managing shared resources and services management (SRSM. It has three layers: a user layer, a manage layer, and a resource layer. A mobile virtual terminal in the user layer and virtual resource management in the manage layer cooperate to maintain the SRSM function accurately according to the user’s requirements. An example of SRSM is used to validate this method. Experiment results show that the strategy is effective and stable.

  1. A novel resource management method of providing operating system as a service for mobile transparent computing.

    Science.gov (United States)

    Xiong, Yonghua; Huang, Suzhen; Wu, Min; Zhang, Yaoxue; She, Jinhua

    2014-01-01

    This paper presents a framework for mobile transparent computing. It extends the PC transparent computing to mobile terminals. Since resources contain different kinds of operating systems and user data that are stored in a remote server, how to manage the network resources is essential. In this paper, we apply the technologies of quick emulator (QEMU) virtualization and mobile agent for mobile transparent computing (MTC) to devise a method of managing shared resources and services management (SRSM). It has three layers: a user layer, a manage layer, and a resource layer. A mobile virtual terminal in the user layer and virtual resource management in the manage layer cooperate to maintain the SRSM function accurately according to the user's requirements. An example of SRSM is used to validate this method. Experiment results show that the strategy is effective and stable.

  2. A Framework for Generic State Estimation in Computer Vision Applications

    NARCIS (Netherlands)

    Sminchisescu, Cristian; Telea, Alexandru

    2001-01-01

    Experimenting and building integrated, operational systems in computational vision poses both theoretical and practical challenges, involving methodologies from control theory, statistics, optimization, computer graphics, and interaction. Consequently, a control and communication structure is needed

  3. Contextuality as a Resource for Models of Quantum Computation with Qubits.

    Science.gov (United States)

    Bermejo-Vega, Juan; Delfosse, Nicolas; Browne, Dan E; Okay, Cihan; Raussendorf, Robert

    2017-09-22

    A central question in quantum computation is to identify the resources that are responsible for quantum speed-up. Quantum contextuality has been recently shown to be a resource for quantum computation with magic states for odd-prime dimensional qudits and two-dimensional systems with real wave functions. The phenomenon of state-independent contextuality poses a priori an obstruction to characterizing the case of regular qubits, the fundamental building block of quantum computation. Here, we establish contextuality of magic states as a necessary resource for a large class of quantum computation schemes on qubits. We illustrate our result with a concrete scheme related to measurement-based quantum computation.

  4. Economic models for management of resources in peer-to-peer and grid computing

    Science.gov (United States)

    Buyya, Rajkumar; Stockinger, Heinz; Giddy, Jonathan; Abramson, David

    2001-07-01

    The accelerated development in Peer-to-Peer (P2P) and Grid computing has positioned them as promising next generation computing platforms. They enable the creation of Virtual Enterprises (VE) for sharing resources distributed across the world. However, resource management, application development, and usage modeling in these environments are complex undertakings. This is due to the geographic distribution of resources that are owned by different organizations or peers. The resource owners of each of these resources have different usage or access policies and cost models, and varying loads and availability. In order to address complex resource management issues, we have proposed a computational economy framework for resource allocation and for regulating supply and demand in Grid computing environments. The framework provides mechanisms for optimizing resource provider and consumer objective functions through trading and brokering services. In a real world market, there exist various economic models for setting the price for goods based on supply-and-demand and their value to the user. They include commodity market, posted price, tenders and auctions. In this paper, we discuss the use of these models for interaction between Grid components in deciding resource value and the necessary infrastructure to realize them. In addition to normal services offered by Grid computing systems, we need an infrastructure to support interaction protocols, allocation mechanisms, currency, secure banking, and enforcement services. Furthermore, we demonstrate the usage of some of these economic models in resource brokering through Nimrod/G deadline and cost-based scheduling for two different optimization strategies on the World Wide Grid (WWG) testbed that contains peer-to-peer resources located on five continents: Asia, Australia, Europe, North America, and South America.

  5. Dynamic remapping of parallel computations with varying resource demands

    Science.gov (United States)

    Nicol, David M.; Saltz, Joel H.

    1988-01-01

    The issue of deciding when to invoke a global load remapping mechanism is studied. Such a decision policy must effectively weigh the costs of remapping against the performance benefits, and should be general enough to apply automatically to a wide range of computations. The authors propose a general mapping decision heuristic, then study its effectiveness and its anticipated behavior on two very different models of load evolution. Assuming only that the remapping cost is known, this policy dynamically minimizes system degradation (including the cost of remapping) for each computation step. This policy is quite simple, choosing to remap when the first local minimum in the degradation function is detected. Simulations show that the decision obtained provides significantly better performance than that achieved by never remapping. The authors also observe that the average intermapping frequency is quite close to the optimal fixed remapping frequency.
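
    The "remap at the first local minimum of the degradation function" policy can be sketched as follows, assuming the per-step imbalance costs and the remapping cost are known. The cost numbers and the exact form of the degradation function are illustrative assumptions rather than the authors' formulation.

      def first_local_minimum_policy(imbalance_costs, remap_cost):
          """Return the step at which to remap: the first local minimum of the
          per-step degradation D(n) = (sum of imbalance costs up to n + remap cost) / n."""
          running = 0.0
          prev = float("inf")
          for n, cost in enumerate(imbalance_costs, start=1):
              running += cost
              degradation = (running + remap_cost) / n
              if degradation > prev:        # degradation started rising: remap at n - 1
                  return n - 1
              prev = degradation
          return len(imbalance_costs)

      # Hypothetical per-step costs of keeping the current (slowly worsening) mapping.
      costs = [1.0, 1.2, 1.5, 2.0, 2.8, 4.0, 6.0]
      print(first_local_minimum_policy(costs, remap_cost=10.0))   # remaps after step 5 here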

  6. MCPLOTS: a particle physics resource based on volunteer computing

    International Nuclear Information System (INIS)

    Karneyeu, A.; Mijovic, L.; Prestel, S.; Skands, P.Z.

    2014-01-01

    The mcplots.cern.ch web site (mcplots) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the hepdata online database of experimental results and on the rivet Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the lhc@home 2.0 platform. (orig.)

  7. MCPLOTS. A particle physics resource based on volunteer computing

    International Nuclear Information System (INIS)

    Karneyeu, A.; Mijovic, L.; Prestel, S.

    2013-07-01

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME 2.0 platform.

  8. MCPLOTS: a particle physics resource based on volunteer computing

    Science.gov (United States)

    Karneyeu, A.; Mijovic, L.; Prestel, S.; Skands, P. Z.

    2014-02-01

    The mcplots.cern.ch web site ( mcplots) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the hepdata online database of experimental results and on the rivet Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the lhc@home 2.0 platform.

  9. MCPLOTS. A particle physics resource based on volunteer computing

    Energy Technology Data Exchange (ETDEWEB)

    Karneyeu, A. [Joint Inst. for Nuclear Research, Moscow (Russian Federation); Mijovic, L. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Irfu/SPP, CEA-Saclay, Gif-sur-Yvette (France); Prestel, S. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Lund Univ. (Sweden). Dept. of Astronomy and Theoretical Physics; Skands, P.Z. [European Organization for Nuclear Research (CERN), Geneva (Switzerland)

    2013-07-15

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME 2.0 platform.

  10. MCPLOTS: a particle physics resource based on volunteer computing

    CERN Document Server

    Karneyeu, A; Prestel, S; Skands, P Z

    2014-01-01

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME platform.

  11. General-purpose computer networks and resource sharing in ERDA. Volume 3. Remote resource-sharing experience and findings

    Energy Technology Data Exchange (ETDEWEB)

    1977-07-15

    The investigation focused on heterogeneous networks in which a variety of dissimilar computers and operating systems were interconnected nationwide. Homogeneous networks, such as MFE net and SACNET, were not considered since they could not be used for general purpose resource sharing. Issues of privacy and security are of concern in any network activity. However, consideration of privacy and security of sensitive data arise to a much lesser degree in unclassified scientific research than in areas involving personal or proprietary information. Therefore, the existing mechanisms at individual sites for protecting sensitive data were relied on, and no new protection mechanisms to prevent infringement of privacy and security were attempted. Further development of ERDA networking will need to incorporate additional mechanisms to prevent infringement of privacy. The investigation itself furnishes an excellent example of computational resource sharing through a heterogeneous network. More than twenty persons, representing seven ERDA computing sites, made extensive use of both ERDA and non-ERDA computers in coordinating, compiling, and formatting the data which constitute the bulk of this report. Volume 3 analyzes the benefits and barriers encountered in actual resource sharing experience, and provides case histories of typical applications.

  12. Consolidating WLCG topology and configuration in the Computing Resource Information Catalogue

    Science.gov (United States)

    Alandes, Maria; Andreeva, Julia; Anisenkov, Alexey; Bagliesi, Giuseppe; Belforte, Stephano; Campana, Simone; Dimou, Maria; Flix, Jose; Forti, Alessandra; di Girolamo, A.; Karavakis, Edward; Lammel, Stephan; Litmaath, Maarten; Sciaba, Andrea; Valassi, Andrea

    2017-10-01

    The Worldwide LHC Computing Grid infrastructure links about 200 participating computing centres affiliated with several partner projects. It is built by integrating heterogeneous computer and storage resources in diverse data centres all over the world and provides CPU and storage capacity to the LHC experiments to perform data processing and physics analysis. In order to be used by the experiments, these distributed resources should be well described, which implies easy service discovery and detailed description of service configuration. Currently this information is scattered over multiple generic information sources like GOCDB, OIM, BDII and experiment-specific information systems. Such a model does not allow topology and configuration information to be validated easily. Moreover, information in various sources is not always consistent. Finally, the evolution of computing technologies introduces new challenges. Experiments rely more and more on opportunistic resources, which by their nature are more dynamic and should also be well described in the WLCG information system. This contribution describes the new WLCG configuration service CRIC (Computing Resource Information Catalogue), which collects information from various information providers, performs validation and provides a consistent set of UIs and APIs to the LHC VOs for service discovery and usage configuration. The main requirements for CRIC are simplicity, agility and robustness. CRIC should be able to be quickly adapted to new types of computing resources and new information sources, and allow new data structures to be implemented easily, following the evolution of the computing models and operations of the experiments.

  13. BelleII@home: Integrate volunteer computing resources into DIRAC in a secure way

    Science.gov (United States)

    Wu, Wenjing; Hara, Takanori; Miyake, Hideki; Ueda, Ikuo; Kan, Wenxiao; Urquijo, Phillip

    2017-10-01

    The exploitation of volunteer computing resources has become a popular practice in the HEP computing community because of the huge amount of potential computing power it provides. In recent HEP experiments, grid middleware has been used to organize the services and the resources; however, it relies heavily on X.509 authentication, which is at odds with the untrusted nature of volunteer computing resources. One big challenge in utilizing volunteer computing resources is therefore how to integrate them into the grid middleware in a secure way. The DIRAC interware, which is commonly used as the major component of the grid computing infrastructure for several HEP experiments, poses an even bigger challenge, as its pilot is more closely coupled with operations requiring X.509 authentication than the pilot implementations in its peer grid interware. The Belle II experiment is a B-factory experiment at KEK, and it uses DIRAC for its distributed computing. In the BelleII@home project, in order to integrate the volunteer computing resources into the Belle II distributed computing platform in a secure way, we adopted a new approach that detaches the payload from the Belle II DIRAC pilot (a customized pilot that pulls and processes jobs from the Belle II distributed computing platform), so that the payload can run on volunteer computers without requiring any X.509 authentication. In this approach we developed a gateway service, running on a trusted server, which handles all the operations requiring X.509 authentication. So far, we have developed and deployed the prototype of BelleII@home and tested its full workflow, which proves the feasibility of this approach. This approach can also be applied to HPC systems whose worker nodes do not have the outbound connectivity generally needed to interact with the DIRAC system.

  14. Using High Performance Computing to Support Water Resource Planning

    Energy Technology Data Exchange (ETDEWEB)

    Groves, David G. [RAND Corporation, Santa Monica, CA (United States); Lembert, Robert J. [RAND Corporation, Santa Monica, CA (United States); May, Deborah W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Leek, James R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Syme, James [RAND Corporation, Santa Monica, CA (United States)

    2015-10-22

    In recent years, decision support modeling has embraced deliberation-with-analysis, an iterative process in which decisionmakers come together with experts to evaluate a complex problem and alternative solutions in a scientifically rigorous and transparent manner. Simulation modeling supports decisionmaking throughout this process; visualizations enable decisionmakers to assess how proposed strategies stand up over time in uncertain conditions. But running these simulation models over standard computers can be slow. This, in turn, can slow the entire decisionmaking process, interrupting valuable interaction between decisionmakers and analytics.

  15. Process for selecting NEAMS applications for access to Idaho National Laboratory high performance computing resources

    Energy Technology Data Exchange (ETDEWEB)

    Michael Pernice

    2010-09-01

    INL has agreed to provide participants in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program with access to its high performance computing (HPC) resources under sponsorship of the Enabling Computational Technologies (ECT) program element. This report documents the process used to select applications and the software stack in place at INL.

  16. Mobile Cloud Computing: Resource Discovery, Session Connectivity and Other Open Issues

    NARCIS (Netherlands)

    Schüring, Markus; Karagiannis, Georgios

    2011-01-01

    Abstract—Cloud computing can be considered as a model that provides network access to a shared pool of resources, such as storage and computing power, which can be rapidly provisioned and released with minimal management effort. This paper describes a research activity in the area of mobile cloud

  17. Monitoring of computing resource utilization of the ATLAS experiment

    CERN Document Server

    Rousseau, D; The ATLAS collaboration; Vukotic, I; Aidel, O; Schaffer, RD; Albrand, S

    2012-01-01

    Due to the good performance of the LHC accelerator, the ATLAS experiment has seen higher than anticipated levels for both the event rate and the average number of interactions per bunch crossing. In order to respond to these changing requirements, the current and future usage of CPU, memory and disk resources has to be monitored, understood and acted upon. This requires data collection at a fairly fine level of granularity: the performance of each object written and each algorithm run, as well as a dozen per-job variables, are gathered for the different processing steps of Monte Carlo generation and simulation and the reconstruction of both data and Monte Carlo. We present a system to collect and visualize the data from both the online Tier-0 system and distributed grid production jobs. Around 40 GB of performance data are expected from up to 200k jobs per day, thus making performance optimization of the underlying Oracle database of utmost importance.

  18. Overview of the Practical and Theoretical Approaches to the Estimation of Mineral Resources. A Financial Perspective

    Directory of Open Access Journals (Sweden)

    Leontina Pavaloaia

    2012-10-01

    Mineral resources represent an important natural resource whose exploitation, unless it is rational, can lead to their exhaustion and the collapse of sustainable development. Given the importance of mineral resources and the uncertainty concerning the estimation of extant reserves, they have been analyzed by several national and international institutions. In this article we present a few aspects concerning how the reserves of mineral resources are approached at the national and international level, considering both economic aspects and those concerned with the definition, classification and aggregation of the reserves of mineral resources by various specialized institutions. At present there are attempts to homogenize practices concerning these aspects for the purpose of presenting correct and comparable information.

  19. Estimation of uranium resources by life-cycle or discovery-rate models: a critique

    International Nuclear Information System (INIS)

    Harris, D.P.

    1976-10-01

    This report was motivated primarily by M. A. Lieberman's ''United States Uranium Resources: An Analysis of Historical Data'' (Science, April 30). His conclusion that only 87,000 tons of U3O8 resources recoverable at a forward cost of $8/lb remain to be discovered is criticized. It is shown that there is no theoretical basis for selecting the exponential or any other function for the discovery rate. Some of the economic (productivity, inflation) and data issues involved in basing the analysis of undiscovered, recoverable U3O8 resources on the discovery rates of $8 reserves are discussed. The problem of the ratio of undiscovered $30 resources to undiscovered $8 resources is considered. It is concluded that all methods for the estimation of unknown resources must employ a model of some form of the endowment-exploration-production complex, but every model is a simplification of the real world, and every estimate is intrinsically uncertain. The life-cycle model is useless for the appraisal of undiscovered, recoverable U3O8, and the discovery rate model underestimates these resources
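
    For orientation, the discovery-rate approach criticized above is usually written as an exponential decline of discoveries per unit of cumulative exploration effort; the generic textbook form is sketched below and is not necessarily the exact specification used in the analyses discussed.

```latex
% Generic exponential discovery-rate model, shown for orientation only; the
% exact specification used in the criticized analyses may differ.
\[
  \frac{dQ}{dx} = A\,e^{-\lambda x},
  \qquad
  Q_{\infty} = \int_{0}^{\infty} A\,e^{-\lambda x}\,dx = \frac{A}{\lambda},
\]
% so the undiscovered resource implied by a fit at cumulative effort x_0 is
% Q_infinity - Q(x_0); the critique is that nothing forces the decline to be
% exponential, so Q_infinity inherits the arbitrariness of that choice.
```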

  20. ATLAS Tier-2 at the Compute Resource Center GoeGrid in Göttingen

    CERN Document Server

    Meyer, J; The ATLAS collaboration; Weber, P

    2010-01-01

    GoeGrid is a grid resource center located in Goettingen, Germany. The resources are commonly used, funded, and maintained by communities doing research in the fields of grid development, computer science, biomedicine, high energy physics, theoretical physics, astrophysics, and the humanities. For the high energy physics community GoeGrid serves as a Tier-2 center for the ATLAS experiment as part of the Worldwide LHC Computing Grid (WLCG). The status and performance of the Tier-2 center will be presented with a focus on the interdisciplinary setup and administration of the cluster. Given the various requirements of the different communities on the hardware and software setup, the challenge of the common operation of the cluster will be detailed. The benefits are an efficient use of computer and manpower resources. Further interdisciplinary projects include commonly organized courses for students of all fields to support education on grid computing.

  1. Getting the Most from Distributed Resources With an Analytics Platform for ATLAS Computing Services

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00225336; The ATLAS collaboration; Gardner, Robert; Bryant, Lincoln

    2016-01-01

    To meet a sharply increasing demand for computing resources for LHC Run 2, ATLAS distributed computing systems reach far and wide to gather CPU resources and storage capacity to execute an evolving ecosystem of production and analysis workflow tools. Indeed more than a hundred computing sites from the Worldwide LHC Computing Grid, plus many “opportunistic” facilities at HPC centers, universities, national laboratories, and public clouds, combine to meet these requirements. These resources have characteristics (such as local queuing availability, proximity to data sources and target destinations, network latency and bandwidth capacity, etc.) affecting the overall processing efficiency and throughput. To quantitatively understand and in some instances predict behavior, we have developed a platform to aggregate, index (for user queries), and analyze the more important information streams affecting performance. These data streams come from the ATLAS production system (PanDA), the distributed data management s...

  2. An Improved Global Wind Resource Estimate for Integrated Assessment Models: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Eurek, Kelly [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sullivan, Patrick [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gleason, Michael [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hettinger, Dylan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heimiller, Donna [National Renewable Energy Lab. (NREL), Golden, CO (United States); Lopez, Anthony [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-02-01

    This paper summarizes initial steps toward improving the robustness and accuracy of global renewable resource and techno-economic assessments for use in integrated assessment models. We outline a method to construct country-level wind resource supply curves, delineated by resource quality and other parameters. Using mesoscale reanalysis data, we generate estimates for wind quality, both terrestrial and offshore, across the globe. Because not all land or water area is suitable for development, appropriate database layers provide exclusions to reduce the total resource to its technical potential. We expand upon estimates from related studies by: using a globally consistent data source of uniquely detailed wind speed characterizations; assuming a non-constant coefficient of performance for adjusting power curves for altitude; categorizing the distance from resource sites to the electric power grid; and characterizing offshore exclusions on the basis of sea ice concentrations. The product, then, is technical potential by country, classified by resource quality as determined by net capacity factor. Additional classification dimensions are available, including distance to transmission networks for terrestrial wind and distance to shore and water depth for offshore. We estimate a total global wind generation potential of 560 PWh for terrestrial wind with 90% of resource classified as low-to-mid quality, and 315 PWh for offshore wind with 67% classified as mid-to-high quality. These estimates are based on 3.5 MW composite wind turbines with 90 m hub heights, 0.95 availability, 90% array efficiency, and 5 MW/km2 deployment density in non-excluded areas. We compare the underlying technical assumptions and results with other global assessments.
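
    As a hedged illustration of the arithmetic behind such supply curves (not the NREL workflow itself), the sketch below converts developable area, deployment density, and net capacity factor into annual technical potential; all class names, areas, and exclusion fractions are hypothetical.

```python
# Illustrative sketch (not the NREL workflow): convert developable area,
# deployment density, and net capacity factor into annual technical potential.
HOURS_PER_YEAR = 8760

def technical_potential_twh(area_km2, excluded_fraction,
                            mw_per_km2=5.0, net_capacity_factor=0.35):
    """Annual generation potential in TWh for one resource class."""
    developable_km2 = area_km2 * (1.0 - excluded_fraction)
    installed_mw = developable_km2 * mw_per_km2
    return installed_mw * net_capacity_factor * HOURS_PER_YEAR / 1e6  # MWh -> TWh

# Hypothetical resource classes for one country, binned by net capacity factor.
classes = [
    {"name": "low",  "area_km2": 120_000, "excluded": 0.6, "ncf": 0.22},
    {"name": "mid",  "area_km2": 45_000,  "excluded": 0.5, "ncf": 0.32},
    {"name": "high", "area_km2": 8_000,   "excluded": 0.4, "ncf": 0.45},
]

for c in classes:
    twh = technical_potential_twh(c["area_km2"], c["excluded"],
                                  net_capacity_factor=c["ncf"])
    print(f"{c['name']:>4}: {twh:8.1f} TWh/yr")
```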

  3. Argonne's Laboratory Computing Resource Center 2009 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B. (CLS-CI)

    2011-05-13

    Now in its seventh year of operation, the Laboratory Computing Resource Center (LCRC) continues to be an integral component of science and engineering research at Argonne, supporting a diverse portfolio of projects for the U.S. Department of Energy and other sponsors. The LCRC's ongoing mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting high-performance computing application use and development. This report describes scientific activities carried out with LCRC resources in 2009 and the broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. The LCRC Allocations Committee makes decisions on individual project allocations for Jazz. Committee members are appointed by the Associate Laboratory Directors and span a range of computational disciplines. The 350-node LCRC cluster, Jazz, began production service in April 2003 and has been a research workhorse ever since. Because it hosts a wealth of software tools and applications and achieves high availability year after year, researchers can count on Jazz to achieve project milestones and enable breakthroughs. Over the years, many projects have achieved results that would have been unobtainable without such a computing resource. In fiscal year 2009, there were 49 active projects representing a wide cross-section of Laboratory research and almost all research divisions.

  4. Accounting for animal movement in estimation of resource selection functions: sampling and data analysis.

    Science.gov (United States)

    Forester, James D; Im, Hae Kyung; Rathouz, Paul J

    2009-12-01

    Patterns of resource selection by animal populations emerge as a result of the behavior of many individuals. Statistical models that describe these population-level patterns of habitat use can miss important interactions between individual animals and characteristics of their local environment; however, identifying these interactions is difficult. One approach to this problem is to incorporate models of individual movement into resource selection models. To do this, we propose a model for step selection functions (SSF) that is composed of a resource-independent movement kernel and a resource selection function (RSF). We show that standard case-control logistic regression may be used to fit the SSF; however, the sampling scheme used to generate control points (i.e., the definition of availability) must be accommodated. We used three sampling schemes to analyze simulated movement data and found that ignoring sampling and the resource-independent movement kernel yielded biased estimates of selection. The level of bias depended on the method used to generate control locations, the strength of selection, and the spatial scale of the resource map. Using empirical or parametric methods to sample control locations produced biased estimates under stronger selection; however, we show that the addition of a distance function to the analysis substantially reduced that bias. Assuming a uniform availability within a fixed buffer yielded strongly biased selection estimates that could be corrected by including the distance function but remained inefficient relative to the empirical and parametric sampling methods. As a case study, we used location data collected from elk in Yellowstone National Park, USA, to show that selection and bias may be temporally variable. Because under constant selection the amount of bias depends on the scale at which a resource is distributed in the landscape, we suggest that distance always be included as a covariate in SSF analyses. This approach to
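
    The following sketch illustrates the case-control idea described above under simplified, hypothetical assumptions: simulated observed steps (cases) and sampled available steps (controls) are fit with ordinary logistic regression, with step length included as a covariate to stand in for the movement kernel. It is not the authors' implementation, and the resource surface, sampling scheme, and use of scikit-learn are illustrative choices.

```python
# Illustrative case-control sketch for step selection (not the authors' code):
# observed steps are cases, sampled available steps are controls, and step
# length ("distance") enters as a covariate alongside the resource value.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_steps, n_controls = 500, 10

def resource(x):
    """Hypothetical one-dimensional resource surface."""
    return np.sin(x / 5.0)

cases, controls = [], []
for _ in range(n_steps):
    start = rng.uniform(0, 100)
    # Candidate endpoints drawn from an exponential movement kernel.
    cand = start + rng.exponential(3.0, 50) * rng.choice([-1, 1], 50)
    # The simulated animal prefers high resource values and short steps.
    w = np.exp(2.0 * resource(cand) - 0.3 * np.abs(cand - start))
    chosen = rng.choice(cand, p=w / w.sum())
    cases.append((resource(chosen), abs(chosen - start)))
    avail = start + rng.exponential(3.0, n_controls) * rng.choice([-1, 1], n_controls)
    controls.extend((resource(a), abs(a - start)) for a in avail)

X = np.array(cases + controls)
y = np.r_[np.ones(len(cases)), np.zeros(len(controls))]

# Including the distance covariate absorbs the movement kernel and reduces
# bias in the estimated resource-selection coefficient.
fit = LogisticRegression().fit(X, y)
print("resource and distance coefficients:", fit.coef_[0])
```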

  5. An Estimate of Shallow, Low-Temperature Geothermal Resources of the United States: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Mullane, Michelle; Gleason, Michael; McCabe, Kevin; Mooney, Meghan; Reber, Timothy; Young, Katherine R.

    2016-10-01

    Low-temperature geothermal resources in the United States potentially hold an enormous quantity of thermal energy, useful for direct use in residential, commercial and industrial applications such as space and water heating, greenhouse warming, pool heating, aquaculture, and low-temperature manufacturing processes. Several studies published over the past 40 years have provided assessments of the resource potential for multiple types of low-temperature geothermal systems (e.g. hydrothermal convection, hydrothermal conduction, and enhanced geothermal systems) with varying temperature ranges and depths. This paper provides a summary and additional analysis of these assessments of shallow (≤3 km), low-temperature (30-150 degrees C) geothermal resources in the United States, suitable for use in direct-use applications. This analysis considers six types of geothermal systems, spanning both hydrothermal and enhanced geothermal systems (EGS). We outline the primary data sources and quantitative parameters used to describe resources in each of these categories, and present summary statistics of the total resources available. In sum, we find that low-temperature hydrothermal resources and EGS resources contain approximately 8 million and 800 million TWh of heat-in-place, respectively. In future work, these resource potential estimates will be used for modeling of the technical and market potential for direct-use geothermal applications for the U.S. Department of Energy's Geothermal Vision Study.
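
    A minimal volumetric heat-in-place sketch is given below for orientation; it uses the generic Q = ρc · V · (T_reservoir − T_reference) relation, and the reservoir dimensions and rock properties are hypothetical rather than values from the assessment.

```python
# Illustrative volumetric heat-in-place sketch (hypothetical values, not the
# assessment's inputs): Q = rho_c * V * (T_reservoir - T_reference).
def heat_in_place_twh(area_km2, thickness_m, t_reservoir_c, t_reference_c=25.0,
                      volumetric_heat_capacity=2.7e6):  # J/(m^3 K), rock + fluid
    volume_m3 = area_km2 * 1e6 * thickness_m
    joules = volumetric_heat_capacity * volume_m3 * (t_reservoir_c - t_reference_c)
    return joules / 3.6e15  # 1 TWh = 3.6e15 J

# Example: a hypothetical 500 km^2, 300 m thick reservoir at 90 degrees C.
print(f"{heat_in_place_twh(500, 300, 90):,.0f} TWh of heat in place")
```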

  6. An Estimate of Shallow, Low-Temperature Geothermal Resources of the United States

    Energy Technology Data Exchange (ETDEWEB)

    Mullane, Michelle; Gleason, Michael; Reber, Tim; McCabe, Kevin; Mooney, Meghan; Young, Katherine R.

    2017-05-01

    Low-temperature geothermal resources in the United States potentially hold an enormous quantity of thermal energy, useful for direct use in residential, commercial and industrial applications such as space and water heating, greenhouse warming, pool heating, aquaculture, and low-temperature manufacturing processes. Several studies published over the past 40 years have provided assessments of the resource potential for multiple types of low-temperature geothermal systems (e.g. hydrothermal convection, hydrothermal conduction, and enhanced geothermal systems) with varying temperature ranges and depths. This paper provides a summary and additional analysis of these assessments of shallow (≤3 km), low-temperature (30-150 degrees C) geothermal resources in the United States, suitable for use in direct-use applications. This analysis considers six types of geothermal systems, spanning both hydrothermal and enhanced geothermal systems (EGS). We outline the primary data sources and quantitative parameters used to describe resources in each of these categories, and present summary statistics of the total resources available. In sum, we find that low-temperature hydrothermal resources and EGS resources contain approximately 8 million and 800 million TWh of heat-in-place, respectively. In future work, these resource potential estimates will be used for modeling of the technical and market potential for direct-use geothermal applications for the U.S. Department of Energy's Geothermal Vision Study.

  7. On-demand provisioning of HEP compute resources on cloud sites and shared HPC centers

    Science.gov (United States)

    Erli, G.; Fischer, F.; Fleig, G.; Giffels, M.; Hauth, T.; Quast, G.; Schnepf, M.; Heese, J.; Leppert, K.; Arnaez de Pedro, J.; Sträter, R.

    2017-10-01

    This contribution reports on solutions, experiences and recent developments with the dynamic, on-demand provisioning of remote computing resources for analysis and simulation workflows. Local resources of a physics institute are extended by private and commercial cloud sites, ranging from desktop clusters through institute clusters to HPC centers. Rather than relying on dedicated HEP computing centers, it is nowadays more reasonable and flexible to utilize remote computing capacity via virtualization techniques or container concepts. We report on recent experience from incorporating a remote HPC center (NEMO Cluster, Freiburg University) and resources dynamically requested from the commercial provider 1&1 Internet SE into our institute's computing infrastructure. The Freiburg HPC resources are requested via the standard batch system, allowing HPC and HEP applications to be executed simultaneously, such that regular batch jobs run side by side with virtual machines managed via OpenStack [1]. For the inclusion of the 1&1 commercial resources, a Python API and SDK as well as the possibility to upload images were available. Large scale tests prove the capability to serve the scientific use case in the European 1&1 datacenters. The described environment at the Institute of Experimental Nuclear Physics (IEKP) at KIT serves the needs of researchers participating in the CMS and Belle II experiments. In total, resources exceeding half a million CPU hours have been provided by remote sites.

  8. Software Defined Resource Orchestration System for Multitask Application in Heterogeneous Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Qi Qi

    2016-01-01

    The mobile cloud computing (MCC) paradigm that combines mobile computing and the cloud concept takes the wireless access network as the transmission medium and uses mobile devices as the client. When offloading a complicated multitask application to the MCC environment, each task executes individually in terms of its own computation, storage, and bandwidth requirements. Due to user mobility, the provided resources have different performance metrics that may affect the destination choice. Nevertheless, these heterogeneous MCC resources lack integrated management and can hardly cooperate with each other. Thus, how to choose the appropriate offload destination and orchestrate the resources for multiple tasks is a challenging problem. This paper realizes programmable resource provisioning for heterogeneous energy-constrained computing environments, where a software defined controller is responsible for resource orchestration, offload, and migration. The resource orchestration is formulated as a multiobjective optimization problem that contains the metrics of energy consumption, cost, and availability. Finally, a particle swarm algorithm is used to obtain approximate optimal solutions. Simulation results show that the solutions for all of our studied cases nearly reach the Pareto optimum and surpass the comparative algorithm in approximation, coverage, and execution time.
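
    A compact, hedged sketch of the optimization idea follows: task-to-resource assignments are scored by a weighted sum of energy, cost, and unavailability and searched with a basic particle swarm. The metrics, weights, and swarm parameters are hypothetical, and the formulation is simplified relative to the paper's multiobjective model.

```python
# Illustrative weighted-sum particle swarm sketch (hypothetical metrics and
# weights; simplified relative to the paper's multiobjective formulation).
import numpy as np

rng = np.random.default_rng(1)
n_tasks, n_resources = 8, 4
energy = rng.uniform(1.0, 5.0, (n_tasks, n_resources))   # per-task energy cost
price = rng.uniform(0.5, 2.0, (n_tasks, n_resources))    # per-task monetary cost
unavail = rng.uniform(0.0, 0.2, n_resources)              # 1 - availability

def objective(assign):
    """Weighted sum over one assignment of tasks to resources (lower is better)."""
    idx = np.arange(n_tasks)
    return (0.5 * energy[idx, assign].sum()
            + 0.3 * price[idx, assign].sum()
            + 0.2 * unavail[assign].sum())

def decode(p):
    """Round a continuous particle position to a valid resource index per task."""
    return np.clip(p, 0, n_resources - 1e-9).astype(int)

# Particles move in a continuous relaxation of the discrete assignment space.
n_particles, n_iter = 30, 200
pos = rng.uniform(0, n_resources, (n_particles, n_tasks))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([objective(decode(p)) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([objective(decode(p)) for p in pos])
    better = vals < pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    gbest = pbest[pbest_val.argmin()].copy()

print("best assignment:", decode(gbest), "objective:", round(pbest_val.min(), 3))
```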

  9. A chance constraint estimation approach to optimizing resource management under uncertainty

    Science.gov (United States)

    Michael Bevers

    2007-01-01

    Chance-constrained optimization is an important method for managing risk arising from random variations in natural resource systems, but the probabilistic formulations often pose mathematical programming problems that cannot be solved with exact methods. A heuristic estimation method for these problems is presented that combines a formulation for order statistic...
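
    The sketch below illustrates the general idea of checking a chance constraint with an order statistic of simulated outcomes; the yield model, target, and confidence level are hypothetical, and the code is not the heuristic presented in the paper.

```python
# Illustrative chance-constraint check via Monte Carlo and an order statistic
# (hypothetical model; not the heuristic presented in the paper).
import numpy as np

rng = np.random.default_rng(42)

def simulated_yield(harvest_level, n_draws=2000):
    """Random resource yield under a management decision (hypothetical model)."""
    growth = rng.normal(loc=100.0, scale=15.0, size=n_draws)
    return growth - harvest_level

def satisfies_chance_constraint(harvest_level, target=20.0, alpha=0.9):
    """P(yield >= target) >= alpha, judged from the empirical (1 - alpha) order statistic."""
    outcomes = np.sort(simulated_yield(harvest_level))
    k = int(np.floor((1.0 - alpha) * len(outcomes)))
    return outcomes[k] >= target

# Scan decisions and keep the largest harvest that still meets the constraint.
feasible = [h for h in range(0, 101, 5) if satisfies_chance_constraint(h)]
print("largest feasible harvest level:", max(feasible) if feasible else None)
```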

  10. Interactive Query Workstation: standardizing access to computer-based medical resources.

    Science.gov (United States)

    Cimino, C; Barnett, G O; Hassan, L; Blewett, D R; Piggins, J L

    1991-08-01

    Methods of using multiple computer-based medical resources efficiently have previously required either the user to manage the choice of resource and terms, or specialized programming. Standardized descriptions of what resources can do and how they may be accessed would allow the creation of an interface for multiple resources. This interface would assist a user in formulating queries, accessing the resources and managing the results. This paper describes a working prototype, the Interactive Query Workstation (IQW). The IQW allows users to query multiple resources: a medical knowledge base (DXplain), a clinical database (COSTAR/MQL), a bibliographic database (MEDLINE), a cancer database (PDQ), and a drug interaction database (PDR). Descriptions of each resource were developed to allow IQW to access these resources. The descriptions are composed of information on how data are sent and received from a resource, information on types of query to which a resource can respond, and information on what types of information are needed to execute a query. These components form the basis of a standard description of resources.

  11. Analysis of the Possibility of Required Resources Estimation for Nuclear Power Plant Decommissioning Applying BIM

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Insu [Korea Institute of construction Technology, Goyang (Korea, Republic of); Kim, Woojung [KHNP-Central Research Institute, Daejeon (Korea, Republic of)

    2014-05-15

    Estimation of decommissioning cost, decommissioning strategy, and decommissioning quantities at the time a decommissioning plan is prepared are elements whose inputs are mandatory for nuclear power plant decommissioning. Past approaches to estimating the resources required for decommissioning have carried great uncertainty, since they analyze required resources at the construction stage and rely on analysis and consultation of the decommissioning resource requirements of overseas nuclear power plants. This study aims at analyzing whether the resources required for decommissioning nuclear power plants can be estimated by applying BIM. To achieve this goal, this study analyzed the status quo of BIM, such as its definition, characteristics, and areas of application, and made use of them when drawing out the study results by examining the types and features of the tools realizing BIM. In order to review how BIM could be used for decommissioning nuclear power plants, the definition, characteristics and applied areas of BIM were discussed. BIM designs objects of the structures (walls, slabs, pillars, stairs, windows and doors, etc.) in 3D and assigns attribute (function, structure and usage) information to each object, thereby providing visualized information about structures for participants in construction projects. Major characteristics of BIM attribute information are as follows: - Geometry: The information of objects is represented by measurable geometric information. - Extensible object attributes: Objects include pre-defined attributes and allow extension with other attributes. Any model that includes these attributes forms relationships with other various attributes in order to perform analysis and simulation. - All information, including the attributes, is integrated to ensure continuity, accuracy and accessibility, and all information used during the life cycle of structures is supported. This means that when information on required resources is added as attributes other than geometric

  12. Analysis of the Possibility of Required Resources Estimation for Nuclear Power Plant Decommissioning Applying BIM

    International Nuclear Information System (INIS)

    Jung, Insu; Kim, Woojung

    2014-01-01

    Estimation of decommissioning cost, decommissioning strategy, and decommissioning quantities at the time a decommissioning plan is prepared are elements whose inputs are mandatory for nuclear power plant decommissioning. Past approaches to estimating the resources required for decommissioning have carried great uncertainty, since they analyze required resources at the construction stage and rely on analysis and consultation of the decommissioning resource requirements of overseas nuclear power plants. This study aims at analyzing whether the resources required for decommissioning nuclear power plants can be estimated by applying BIM. To achieve this goal, this study analyzed the status quo of BIM, such as its definition, characteristics, and areas of application, and made use of them when drawing out the study results by examining the types and features of the tools realizing BIM. In order to review how BIM could be used for decommissioning nuclear power plants, the definition, characteristics and applied areas of BIM were discussed. BIM designs objects of the structures (walls, slabs, pillars, stairs, windows and doors, etc.) in 3D and assigns attribute (function, structure and usage) information to each object, thereby providing visualized information about structures for participants in construction projects. Major characteristics of BIM attribute information are as follows: - Geometry: The information of objects is represented by measurable geometric information. - Extensible object attributes: Objects include pre-defined attributes and allow extension with other attributes. Any model that includes these attributes forms relationships with other various attributes in order to perform analysis and simulation. - All information, including the attributes, is integrated to ensure continuity, accuracy and accessibility, and all information used during the life cycle of structures is supported. This means that when information on required resources is added as attributes other than geometric

  13. Argonne's Laboratory Computing Resource Center : 2005 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Coghlan, S. C; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Pieper, G. P.

    2007-06-30

    Argonne National Laboratory founded the Laboratory Computing Resource Center in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. The first goal of the LCRC was to deploy a mid-range supercomputing facility to support the unmet computational needs of the Laboratory. To this end, in September 2002, the Laboratory purchased a 350-node computing cluster from Linux NetworX. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the fifty fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2005, there were 62 active projects on Jazz involving over 320 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to improve the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure

  14. The Impact of a Professional Learning Intervention Designed to Enhance Year Six Students' Computational Estimation Performance

    Science.gov (United States)

    Mildenhall, Paula; Hackling, Mark

    2012-01-01

    This paper reports on the analysis of a study of a professional learning intervention focussing on computational estimation. Using a multiple case study design it was possible to describe the impact of the intervention on students' beliefs and computational estimation performance. The study revealed some noteworthy impacts on computational…

  15. Implications of applying solar industry best practice resource estimation on project financing

    International Nuclear Information System (INIS)

    Pacudan, Romeo

    2016-01-01

    Solar resource estimation risk is one of the main solar PV project risks that influence a lender's decision to provide financing and the cost of capital. More recently, a number of measures have emerged to mitigate this risk. The study focuses on the solar industry's best practice energy resource estimation and assesses its financing implications for the 27 MWp solar PV project study in Brunei Darussalam. The best practice in resource estimation uses multiple data sources through the measure-correlate-predict (MCP) technique, as compared with the standard practice that relies solely on a modelled data source. The best practice case generates resource data with lower uncertainty and yields a superior high-confidence energy production estimate relative to the standard practice case. Using project financial parameters in Brunei Darussalam for project financing and adopting international debt-service coverage ratio (DSCR) benchmark rates, the best practice case yields DSCRs that surpass the target rates, while those of the standard practice case stay below the reference rates. The best practice case could also accommodate a higher debt share and have a lower levelized cost of electricity (LCOE), while the standard practice case would require a lower debt share and have a higher LCOE. - Highlights: •Best practice solar energy resource estimation uses multiple datasets. •Multiple datasets are combined through the measure-correlate-predict technique. •Correlated data have lower uncertainty and yield superior high-confidence energy production. •The best practice case yields debt-service coverage ratios (DSCRs) that surpass the benchmark rates. •The best practice case accommodates a higher debt share and has a lower levelized cost of electricity.
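
    As a hedged illustration of why the high-confidence (P90) yield matters to lenders, the sketch below computes a debt-service coverage ratio for two resource-estimation cases; the capacity, tariff, cost, and financing figures are invented and are not the study's numbers.

```python
# Illustrative DSCR comparison (all figures invented, not the study's):
# a lower-uncertainty resource estimate raises the high-confidence (P90)
# yield, which raises the debt-service coverage ratio seen by lenders.
def annuity_debt_service(principal, rate, years):
    """Constant annual payment on a fully amortizing loan."""
    return principal * rate / (1.0 - (1.0 + rate) ** -years)

def dscr(p90_yield_mwh, tariff_usd_per_mwh, opex_usd, debt_service_usd):
    cash_flow = p90_yield_mwh * tariff_usd_per_mwh - opex_usd
    return cash_flow / debt_service_usd

capex = 40e6                                                # hypothetical project cost
debt_service = annuity_debt_service(0.7 * capex, 0.06, 15)  # 70% debt share

for label, p90_mwh in [("standard practice", 33_000), ("best practice", 36_500)]:
    ratio = dscr(p90_mwh, tariff_usd_per_mwh=120.0, opex_usd=0.8e6,
                 debt_service_usd=debt_service)
    print(f"{label:>17}: DSCR = {ratio:.2f}")
```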

  16. NATO Advanced Study Institute on Statistical Treatments for Estimation of Mineral and Energy Resources

    CERN Document Server

    Fabbri, A; Sinding-Larsen, R

    1988-01-01

    This volume contains the edited papers prepared by lecturers and participants of the NATO Advanced Study Institute on "Statistical Treatments for Estimation of Mineral and Energy Resources" held at II Ciocco (Lucca), Italy, June 22 - July 4, 1986. During the past twenty years, tremendous efforts have been made to acquire quantitative geoscience information from ore deposits, geochemical, geophysical and remotely-sensed measurements. In October 1981, a two-day symposium on "Quantitative Resource Evaluation" and a three-day workshop on "Interactive Systems for Multivariate Analysis and Image Processing for Resource Evaluation" were held in Ottawa, jointly sponsored by the Geological Survey of Canada, the International Association for Mathematical Geology, and the International Geological Correlation Programme. Thirty scientists from different countries in Europe and North America were invited to form a forum for the discussion of quantitative methods for mineral and energy resource assessment. Since then, not ...

  17. Computational resources for ribosome profiling: from database to Web server and software.

    Science.gov (United States)

    Wang, Hongwei; Wang, Yan; Xie, Zhi

    2017-08-14

    Ribosome profiling is emerging as a powerful technique that enables genome-wide investigation of in vivo translation at sub-codon resolution. The increasing application of ribosome profiling in recent years has achieved remarkable progress toward understanding the composition, regulation and mechanism of translation. This benefits from not only the awesome power of ribosome profiling but also an extensive range of computational resources available for ribosome profiling. At present, however, a comprehensive review on these resources is still lacking. Here, we survey the recent computational advances guided by ribosome profiling, with a focus on databases, Web servers and software tools for storing, visualizing and analyzing ribosome profiling data. This review is intended to provide experimental and computational biologists with a reference to make appropriate choices among existing resources for the question at hand.

  18. ATLAS Tier-2 at the Compute Resource Center GoeGrid in Goettingen

    CERN Document Server

    Meyer, J; The ATLAS collaboration; Weber, P

    2011-01-01

    GoeGrid is a grid resource center located in Göttingen, Germany. The resources are commonly used, funded, and maintained by communities doing research in the fields of grid development, computer science, biomedicine, high energy physics, theoretical physics, astrophysics, and the humanities. For the high energy physics community GoeGrid serves as a Tier-2 center for the ATLAS experiment as part of the Worldwide LHC Computing Grid (WLCG). The status and performance of the Tier-2 center is presented with a focus on the interdisciplinary setup and administration of the cluster. Given the various requirements of the different communities on the hardware and software setup, the challenge of the common operation of the cluster is detailed. The benefits are an efficient use of computer and manpower resources.

  19. ATLAS Tier-2 at the Compute Resource Center GoeGrid in Göttingen

    International Nuclear Information System (INIS)

    Meyer, Jörg; Quadt, Arnulf; Weber, Pavel

    2011-01-01

    GoeGrid is a grid resource center located in Göttingen, Germany. The resources are commonly used, funded, and maintained by communities doing research in the fields of grid development, computer science, biomedicine, high energy physics, theoretical physics, astrophysics, and the humanities. For the high energy physics community, GoeGrid serves as a Tier-2 center for the ATLAS experiment as part of the Worldwide LHC Computing Grid (WLCG). The status and performance of the Tier-2 center is presented with a focus on the interdisciplinary setup and administration of the cluster. Given the various requirements of the different communities on the hardware and software setup, the challenge of the common operation of the cluster is detailed. The benefits are an efficient use of computer and personpower resources.

  20. Argonne's Laboratory computing resource center : 2006 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Drugan, C. D.; Pieper, G. P.

    2007-05-31

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2006, there were 76 active projects on Jazz involving over 380 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff

  1. ESTIMATION OF EXTERNAL FACTORS INFLUENCE ON THE ORGANIZATIONAL AND RESOURCE SUPPORT OF ENGINEERING

    Directory of Open Access Journals (Sweden)

    Yu. V. Gusak

    2013-09-01

    Purpose. The engineering industry is characterized by deep specialization and a high degree of co-operation, which implies extensive interaction with other industries and the economy and makes it highly sensitive to external factors. Effective regulation of the engineering industry's organizational and resource support will ensure coherence among all the subsystems of the market economy, a competitive environment, a full course of the investment process and the success of the industry. There is therefore a need for a detailed estimation and analysis of the influence of external factors on the formation and implementation indexes of the engineering industry's organizational and resource support. Methodology. Correlation analysis was used to establish the closeness of the connection between the set of external factors and the formation and implementation indexes of the engineering industry's organizational and resource support, and the malleability coefficient was applied to calculate how much these indexes change under the influence of the external factors. Findings. The external factors influencing the engineering industry's organizational and resource support were separated and grouped by source of origin: industrial, economic, political, informational, and social. A classification of the external factors was made according to the direction of their influence on the formation and implementation indexes of the engineering industry's organizational and resource support. The closeness of the connection and the amount of change in the formation and implementation indexes (the machinery index and the machinery sales volume index) under the influence of the external factors were determined with the malleability coefficient. Originality. The estimation of the external factors

  2. Open Educational Resources: The Role of OCW, Blogs and Videos in Computer Networks Classroom

    Directory of Open Access Journals (Sweden)

    Pablo Gil

    2012-09-01

    This paper analyzes the learning experiences and opinions obtained from a group of undergraduate students in their interaction with several on-line multimedia resources included in a free on-line course about Computer Networks. These new educational resources are based on the Web 2.0 approach, such as blogs, videos and virtual labs, which have been added to a website for distance self-learning.

  3. Computer program for updating timber resource statistics by county, with tables for Mississippi

    Science.gov (United States)

    Roy C. Beltz; Joe F. Christopher

    1970-01-01

    A computer program is available for updating Forest Survey estimates of timber growth, cut, and inventory volume by species group, for sawtimber and growing stock. Changes in rate of product removal are estimated from changes in State severance tax data. Updated tables are given for Mississippi.

  4. Load/resource matching for period-of-record computer simulation

    International Nuclear Information System (INIS)

    Lindsey, E.D. Jr.; Robbins, G.E. III

    1991-01-01

    The Southwestern Power Administration (Southwestern), an agency of the Department of Energy, is responsible for marketing the power and energy produced at Federal hydroelectric power projects developed by the U.S. Army Corps of Engineers in the southwestern United States. This paper reports that in order to maximize benefits from limited resources, to evaluate proposed changes in the operation of existing projects, and to determine the feasibility and marketability of proposed new projects, Southwestern utilizes a period-of-record computer simulation model created in the 1960s. Southwestern is constructing a new computer simulation model to take advantage of changes in computers, policy, and procedures. Within all hydroelectric power reservoir systems, the ability of the resources to match the load demand is critical and presents complex problems. Therefore, the method used to compare available energy resources to energy load demands is a very important aspect of the new model. Southwestern has developed an innovative method which compares a resource duration curve with a load duration curve, adjusting the resource duration curve to make the most efficient use of the available resources
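
    A minimal sketch of the duration-curve comparison is shown below with synthetic hourly data; the load and resource series, and the coverage metric, are illustrative and do not reproduce Southwestern's simulation model.

```python
# Illustrative duration-curve comparison with synthetic hourly data (not
# Southwestern's period-of-record model): both series are sorted in
# descending order and compared hour-for-hour along the duration axis.
import numpy as np

rng = np.random.default_rng(7)
hours = 8760
load = 400.0 + 150.0 * rng.random(hours)       # MW, hypothetical demand
resource = 350.0 + 250.0 * rng.random(hours)   # MW, hypothetical hydro capability

load_curve = np.sort(load)[::-1]               # load duration curve
resource_curve = np.sort(resource)[::-1]       # resource duration curve

coverage = np.mean(resource_curve >= load_curve)
unmet_mwh = np.clip(load_curve - resource_curve, 0.0, None).sum()
print(f"duration-curve coverage: {coverage:.1%}, unmet energy: {unmet_mwh:,.0f} MWh")
```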

  5. Uranium Resources Modeling And Estimation In Lembah Hitam Sector, Kalan, West Kalimantan

    International Nuclear Information System (INIS)

    Adi Gunawan Muhammad; Bambang Soetopo

    2016-01-01

    The Lembah Hitam Sector is part of the Schwaner Mountains and the upper part of the Kalan Basin stratigraphy. The uranium (U) mineralization layer is associated with metasiltstone and schistose metapelites oriented N 265° E/60° S. Evaluation drilling was carried out at a distance of 50 m from existing points (FKL 14 and FKL 13) to determine the model and the amount of U resources in the measured category. To achieve these objectives, several activities were carried out, including reviewing previous studies, collecting geological and U mineralization data, quantitatively estimating grades using gross-count gamma ray logs, creating the database and model, and estimating the U resources. Based on modeling of ten drill holes, complemented by drill core observation, the average grade of U mineralization in the Lembah Hitam Sector was obtained. The average grade ranges from 0.0076 to 0.95 % eU3O8, with a mineralization thickness ranging from 0.1 to 4.5 m. Uranium mineralization is present as fracture fillings (veins) or groups of veins and as matrix filling in tectonic breccia, associated with pyrite, pyrrhotite, magnetite, molybdenite, tourmaline and quartz in metasiltstone and schistose metapelites. Calculation of U resources for 26 ore bodies using a 25 m search radius resulted in 655.65 tons of ore. Applying a 0.01 % cut-off grade resulted in 546.72 tons of ore with an average grade of 0.101 % eU3O8. The uranium resource is categorized as a low-grade measured resource. (author)
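
    The sketch below illustrates the cut-off grade arithmetic in general terms: tonnage and a tonnage-weighted average grade are recomputed for blocks at or above a 0.01% eU3O8 cut-off. The block values are hypothetical and are not taken from the Lembah Hitam model.

```python
# Illustrative cut-off grade arithmetic (hypothetical block model, not the
# Lembah Hitam data): keep blocks at or above the cut-off, then report the
# tonnage and the tonnage-weighted average grade.
blocks = [  # (tonnes of ore, grade in % eU3O8) -- hypothetical values
    (120.0, 0.0076), (80.0, 0.0150), (95.0, 0.0420),
    (60.0, 0.1100), (140.0, 0.2500), (160.0, 0.9500),
]

def resources(blocks, cutoff=0.0):
    kept = [(t, g) for t, g in blocks if g >= cutoff]
    tonnes = sum(t for t, _ in kept)
    avg_grade = sum(t * g for t, g in kept) / tonnes if tonnes else 0.0
    return tonnes, avg_grade

all_t, all_g = resources(blocks)
cut_t, cut_g = resources(blocks, cutoff=0.01)
print(f"all blocks:    {all_t:6.1f} t at {all_g:.3f}% eU3O8")
print(f"above cut-off: {cut_t:6.1f} t at {cut_g:.3f}% eU3O8")
```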

  6. Estimating resource costs of compliance with EU WFD ecological status requirements at the river basin scale

    DEFF Research Database (Denmark)

    Riegels, Niels; Jensen, Roar; Benasson, Lisa

    2011-01-01

    Resource costs of meeting EU WFD ecological status requirements at the river basin scale are estimated by comparing net benefits of water use given ecological status constraints to baseline water use values. Resource costs are interpreted as opportunity costs of water use arising from water...... minimizes the number of decision variables in the optimization problem and provides guidance for pricing policies that meet WFD objectives. Results from a real-world application in northern Greece show the suitability of the approach for use in complex, water-stressed basins. The impact of uncertain input...

  7. Evaluation of non-linear power estimation models in a computing cluster

    NARCIS (Netherlands)

    Zhu, H.; Liao, X.; de Laat, C.; Grosso, P.

    The data center industry is responsible for 1.5–2% of the world energy consumption. Energy management technologies have been proposed for energy-efficient scheduling of computing workloads and for allocating resources in such computing infrastructures. One of the important factors for this energy

  8. Estimating pressurized water reactor decommissioning costs: A user's manual for the PWR Cost Estimating Computer Program (CECP) software

    International Nuclear Information System (INIS)

    Bierschbach, M.C.; Mencinsky, G.J.

    1993-10-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit to the US Nuclear Regulatory Commission (NRC) for review their decommissioning plans and cost estimates. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning PWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning

  9. A Computational Approach to Investigate Properties of Estimators

    Science.gov (United States)

    Caudle, Kyle A.; Ruth, David M.

    2013-01-01

    Teaching undergraduates the basic properties of an estimator can be difficult. Most definitions are easy enough to comprehend, but difficulties often lie in gaining a "good feel" for these properties and why one property might be more desirable than another. Simulations which involve visualization of these properties can…
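
    A short simulation in the spirit described above might look like the following, which compares the bias and mean squared error of the divide-by-n and divide-by-(n−1) variance estimators; the setup is a generic illustration, not the authors' teaching materials.

```python
# Generic illustration (not the authors' materials): simulate the bias and
# mean squared error of the divide-by-n and divide-by-(n-1) variance estimators.
import numpy as np

rng = np.random.default_rng(3)
true_var, n, reps = 4.0, 10, 50_000

samples = rng.normal(0.0, np.sqrt(true_var), size=(reps, n))
biased = samples.var(axis=1, ddof=0)     # divide by n
unbiased = samples.var(axis=1, ddof=1)   # divide by n - 1

for name, est in [("divide by n", biased), ("divide by n-1", unbiased)]:
    bias = est.mean() - true_var
    mse = ((est - true_var) ** 2).mean()
    print(f"{name:>13}: bias = {bias:+.3f}, MSE = {mse:.3f}")
```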

  10. Latent-failure risk estimates for computer control

    Science.gov (United States)

    Dunn, William R.; Folsom, Rolfe A.; Green, Owen R.

    1991-01-01

    It is shown that critical computer controls employing unmonitored safety circuits are unsafe. Analysis supporting this result leads to two additional, important conclusions: (1) annual maintenance checks of safety circuit function do not, as widely believed, eliminate latent failure risk; (2) safety risk remains even if multiple, series-connected protection circuits are employed. Finally, it is shown analytically that latent failure risk is eliminated when continuous monitoring is employed.

  11. Estimated sand and gravel resources of the South Merrimack, Hillsborough County, New Hampshire, 7.5-minute quadrangle

    Science.gov (United States)

    Sutphin, D.M.; Drew, L.J.; Fowler, B.K.

    2006-01-01

    A computer methodology is presented that allows natural aggregate producers, local governmental, and nongovernmental planners to define specific locations that may have sand and gravel deposits meeting user-specified minimum size, thickness, and geographic and geologic criteria, in areas where the surficial geology has been mapped. As an example, the surficial geologic map of the South Merrimack quadrangle was digitized and several digital geographic information system databases were downloaded from the internet and used to estimate the sand and gravel resources in the quadrangle. More than 41 percent of the South Merrimack quadrangle has been mapped as having sand and (or) gravel deposited by glacial meltwaters. These glaciofluvial areas are estimated to contain a total of 10 million m3 of material mapped as gravel, 60 million m3 of material mapped as mixed sand and gravel, and another 50 million m3 of material mapped as sand with minor silt. The mean thickness of these areas is about 1.95 meters. Twenty tracts were selected, each having an individual area of more than about 14 acres (5.67 hectares) of stratified glacial-meltwater sand and gravel deposits, at least 10 feet (3.0 m) of material above the water table, and not sterilized by the proximity of buildings, roads, streams and other bodies of water, or railroads. The 20 tracts are estimated to contain between about 4 and 10 million short tons (st) of gravel and 20 and 30 million st of sand. The five most gravel-rich tracts contain about 71 to 82 percent of the gravel resources in all 20 tracts and about 54-56 percent of the sand. Using this methodology and the above criteria, a group of four tracts, divided by narrow areas sterilized by a small stream and secondary roads, may have the highest potential in the quadrangle for sand and gravel resources.

  12. Estimating resting motor thresholds in transcranial magnetic stimulation research and practice: a computer simulation evaluation of best methods.

    Science.gov (United States)

    Borckardt, Jeffrey J; Nahas, Ziad; Koola, Jejo; George, Mark S

    2006-09-01

    Resting motor threshold is the basic unit of dosing in transcranial magnetic stimulation (TMS) research and practice. There is little consensus on how best to estimate resting motor threshold with TMS, and only a few tools and resources are readily available to TMS researchers. The current study investigates the accuracy and efficiency of 5 different approaches to motor threshold assessment for TMS research and practice applications. Computer simulation models are used to test the efficiency and accuracy of 5 different adaptive parameter estimation by sequential testing (PEST) procedures. For each approach, data are presented with respect to the mean number of TMS trials necessary to reach the motor threshold estimate as well as the mean accuracy of the estimates. A simple nonparametric PEST procedure appears to provide the most accurate motor threshold estimates, but takes slightly longer (on average, 3.48 trials) to complete than a popular parametric alternative (maximum likelihood PEST). Recommendations are made for the best starting values for each of the approaches to maximize both efficiency and accuracy. In light of the computer simulation data provided in this article, the authors review and suggest which techniques might best fit different TMS research and clinical situations. Lastly, a free user-friendly software package is described and made available on the world wide web that allows users to run all of the motor threshold estimation procedures discussed in this article for clinical and research applications.
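
    For orientation, the sketch below simulates a simple up-down staircase threshold search rather than any of the five PEST procedures evaluated in the study; the response model, step size, and stopping rule are illustrative assumptions.

```python
# Illustrative up-down staircase simulation (not the PEST procedures or the
# software described in the study): intensity steps down after a motor
# response and up after no response; the last few reversals are averaged.
import random

random.seed(0)
TRUE_THRESHOLD = 47.0  # hypothetical % of maximum stimulator output

def responds(intensity):
    """Probabilistic motor response around the (unknown) threshold."""
    return random.random() < 1.0 / (1.0 + 10.0 ** ((TRUE_THRESHOLD - intensity) / 2.0))

def staircase(start=60.0, step=2.0, max_trials=40):
    intensity, last, reversals = start, None, []
    for _ in range(max_trials):
        r = responds(intensity)
        if last is not None and r != last:
            reversals.append(intensity)
        intensity += -step if r else step
        last = r
    tail = reversals[-6:] or [intensity]
    return sum(tail) / len(tail), len(reversals)

estimate, n_reversals = staircase()
print(f"estimated threshold: {estimate:.1f} (true {TRUE_THRESHOLD}), reversals: {n_reversals}")
```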

  13. Estimating resource costs of compliance with EU WFD ecological status requirements at the river basin scale

    Science.gov (United States)

    Riegels, Niels; Jensen, Roar; Bensasson, Lisa; Banou, Stella; Møller, Flemming; Bauer-Gottwein, Peter

    2011-01-01

    Resource costs of meeting EU WFD ecological status requirements at the river basin scale are estimated by comparing net benefits of water use given ecological status constraints to baseline water use values. Resource costs are interpreted as opportunity costs of water use arising from water scarcity. An optimization approach is used to identify economically efficient ways to meet WFD requirements. The approach is implemented using a river basin simulation model coupled to an economic post-processor; the simulation model and post-processor are run from a central controller that iterates until an allocation is found that maximizes net benefits given WFD requirements. Water use values are estimated for urban/domestic, agricultural, industrial, livestock, and tourism water users. Ecological status is estimated using metrics that relate average monthly river flow volumes to the natural hydrologic regime. Ecological status is only estimated with respect to hydrologic regime; other indicators are ignored in this analysis. The decision variable in the optimization is the price of water, which is used to vary demands using consumer and producer water demand functions. The price-based optimization approach minimizes the number of decision variables in the optimization problem and provides guidance for pricing policies that meet WFD objectives. Results from a real-world application in northern Greece show the suitability of the approach for use in complex, water-stressed basins. The impact of uncertain input values on model outcomes is estimated using the Info-Gap decision analysis framework.
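
    A minimal sketch of the price-based allocation idea is given below: a scarcity price is raised until total demand, computed from constant-elasticity demand functions, leaves enough water for an environmental flow target. The user groups, elasticities, and flow numbers are hypothetical and much simpler than the basin model used in the study.

```python
# Illustrative price-based allocation sketch (hypothetical demand curves and
# flow targets; far simpler than the basin model used in the study).
def demand(price, p_ref, q_ref, elasticity=-0.4):
    """Constant-elasticity water demand for one user group (m3/month)."""
    return q_ref * (price / p_ref) ** elasticity

USERS = [  # (name, reference price EUR/m3, reference use m3/month) -- hypothetical
    ("urban", 0.9, 4.0e6), ("agriculture", 0.1, 9.0e6), ("industry", 0.5, 2.0e6),
]
AVAILABLE = 12.0e6   # natural monthly flow, m3
ECO_FLOW = 4.5e6     # flow that must remain in the river, m3

price = 0.05
while sum(demand(price, p, q) for _, p, q in USERS) > AVAILABLE - ECO_FLOW:
    price += 0.01    # raise the scarcity price until the ecological target holds

print(f"scarcity price meeting the flow requirement: {price:.2f} EUR/m3")
```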

  14. An Estimate of Recoverable Heavy Oil Resources of the Orinoco Oil Belt, Venezuela

    Science.gov (United States)

    Schenk, Christopher J.; Cook, Troy A.; Charpentier, Ronald R.; Pollastro, Richard M.; Klett, Timothy R.; Tennyson, Marilyn E.; Kirschbaum, Mark A.; Brownfield, Michael E.; Pitman, Janet K.

    2009-01-01

    The Orinoco Oil Belt Assessment Unit of the La Luna-Quercual Total Petroleum System encompasses approximately 50,000 km2 of the East Venezuela Basin Province that is underlain by more than 1 trillion barrels of heavy oil-in-place. As part of a program directed at estimating the technically recoverable oil and gas resources of priority petroleum basins worldwide, the U.S. Geological Survey estimated the recoverable oil resources of the Orinoco Oil Belt Assessment Unit. This estimate relied mainly on published geologic and engineering data for reservoirs (net oil-saturated sandstone thickness and extent), petrophysical properties (porosity, water saturation, and formation volume factors), recovery factors determined by pilot projects, and estimates of volumes of oil-in-place. The U.S. Geological Survey estimated a mean volume of 513 billion barrels of technically recoverable heavy oil in the Orinoco Oil Belt Assessment Unit of the East Venezuela Basin Province; the range is 380 to 652 billion barrels. The Orinoco Oil Belt Assessment Unit thus contains one of the largest recoverable oil accumulations in the world.
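
    The sketch below illustrates the volumetric logic (recoverable oil = oil-in-place × recovery factor) with a simple Monte Carlo range; the input distributions are invented for illustration and are not the USGS assessment inputs.

```python
# Illustrative volumetric sketch (invented distributions, not the USGS inputs):
# recoverable oil = oil-in-place x recovery factor, with a Monte Carlo range.
import numpy as np

rng = np.random.default_rng(11)
n = 100_000

oil_in_place_bbo = rng.triangular(900, 1200, 1500, n)   # billion barrels, hypothetical
recovery_factor = rng.triangular(0.15, 0.35, 0.50, n)    # hypothetical pilot analogs

recoverable = oil_in_place_bbo * recovery_factor
p5, mean, p95 = (np.percentile(recoverable, 5), recoverable.mean(),
                 np.percentile(recoverable, 95))
print(f"recoverable oil (billion barrels): P5 {p5:.0f}, mean {mean:.0f}, P95 {p95:.0f}")
```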

  15. Computational Issues in Linear Least-Squares Estimation and Control

    Science.gov (United States)

    1979-06-06


  16. Methods for the estimation and economic evaluation of undiscovered uranium endowment and resources

    International Nuclear Information System (INIS)

    1992-01-01

    The present Instruction Manual was prepared as part of a programme of the International Atomic Energy Agency to supply the international uranium community with standard guides for a number of topics related to uranium resource assessment and supply. The quantitative estimation of undiscovered resources and endowments aims at supplying data on potential mineral resources; these data are needed to compare long term projections with one another and to assess the mineral supplies to be obtained from elsewhere. These objectives have relatively recently been supplemented by the concern of land managers and national policy planners to assess the potential of certain lands before the constitution of national parks and other areas reserved from mineral exploration and development. 88 refs, 28 figs, 33 tabs

  17. Developing Online Learning Resources: Big Data, Social Networks, and Cloud Computing to Support Pervasive Knowledge

    Science.gov (United States)

    Anshari, Muhammad; Alas, Yabit; Guan, Lim Sei

    2016-01-01

    Utilizing online learning resources (OLR) from multiple channels in learning activities promises extended benefits, moving from a traditional learning-centred approach to a collaborative learning-centred approach that emphasises pervasive learning anywhere and anytime. While compiling big data, cloud computing, and the semantic web into OLR offers a broader spectrum of…

  18. Cost implications of uncertainty in CO2 storage resource estimates: A review

    Science.gov (United States)

    Anderson, Steven T.

    2017-01-01

    Carbon capture from stationary sources and geologic storage of carbon dioxide (CO2) is an important option to include in strategies to mitigate greenhouse gas emissions. However, the potential costs of commercial-scale CO2 storage are not well constrained, stemming from the inherent uncertainty in storage resource estimates coupled with a lack of detailed estimates of the infrastructure needed to access those resources. Storage resource estimates are highly dependent on storage efficiency values or storage coefficients, which are calculated based on ranges of uncertain geological and physical reservoir parameters. If dynamic factors (such as variability in storage efficiencies, pressure interference, and acceptable injection rates over time), reservoir pressure limitations, boundaries on migration of CO2, consideration of closed or semi-closed saline reservoir systems, and other possible constraints on the technically accessible CO2 storage resource (TASR) are accounted for, it is likely that only a fraction of the TASR could be available without incurring significant additional costs. Although storage resource estimates typically assume that any issues with pressure buildup due to CO2 injection will be mitigated by reservoir pressure management, estimates of the costs of CO2 storage generally do not include the costs of active pressure management. Production of saline waters (brines) could be essential to increasing the dynamic storage capacity of most reservoirs, but including the costs of this critical method of reservoir pressure management could increase current estimates of the costs of CO2 storage by two times, or more. Even without considering the implications for reservoir pressure management, geologic uncertainty can significantly impact CO2 storage capacities and costs, and contribute to uncertainty in carbon capture and storage (CCS) systems. Given the current state of available information and the scarcity of (data from) long-term commercial-scale CO2
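
    For orientation, a widely used volumetric form of the storage-resource calculation is reproduced below; the storage efficiency term is the main source of the uncertainty discussed above. Symbols follow common usage and are not tied to any single assessment's notation.

```latex
% Widely used volumetric storage-resource relation, shown for orientation;
% symbol names follow common usage rather than any single assessment.
\[
  G_{\mathrm{CO_2}} \;=\; A \, h_g \, \phi_{\mathrm{tot}} \, \rho_{\mathrm{CO_2}} \, E ,
\]
% where A is the reservoir area, h_g the gross thickness, phi_tot the total
% porosity, rho_CO2 the in-situ CO2 density, and E the storage efficiency
% coefficient -- the term whose uncertainty drives much of the spread in
% storage resource (and hence cost) estimates.
```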

  19. Wind Resource Estimation and Mapping at the National Renewable Energy Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Schwartz, M.

    1999-04-07

    The National Renewable Energy Laboratory (NREL) has developed an automated technique for wind resource mapping to aid in the acceleration of wind energy deployment. The new automated mapping system was developed with the following two primary goals: (1) to produce a more consistent and detailed analysis of the wind resource for a variety of physiographic settings, particularly in areas of complex terrain; and (2) to generate high quality map products on a timely basis. Using computer mapping techniques reduces the time it takes to produce a wind map that reflects a consistent analysis of the distribution of the wind resource throughout the region of interest. NREL's mapping system uses commercially available geographic information system software packages. Regional wind resource maps using this new system have been produced for areas of the United States, Mexico, Chile, Indonesia (1), and China. Countrywide wind resource assessments are under way for the Philippines, the Dominican Republic, and Mongolia. Regional assessments in Argentina and Russia are scheduled to begin soon.
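
    Resource maps like these ultimately summarize quantities such as mean wind power density; a minimal sketch of that standard calculation is shown below, with a made-up wind-speed sample and an assumed sea-level air density (not NREL data).

    ```python
    # Wind power density WPD = 0.5 * rho * mean(v^3), a standard resource metric.
    # The wind-speed sample and air density below are illustrative placeholders.
    speeds_m_s = [4.2, 6.8, 7.5, 5.1, 9.3, 3.8, 6.0, 8.2]   # hypothetical hourly means
    rho = 1.225                                             # kg/m^3, air at sea level

    wpd = 0.5 * rho * sum(v ** 3 for v in speeds_m_s) / len(speeds_m_s)
    print(f"Mean wind power density: {wpd:.0f} W/m^2")
    ```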

  20. Modified stretched exponential model of computer system resources management limitations-The case of cache memory

    Science.gov (United States)

    Strzałka, Dominik; Dymora, Paweł; Mazurek, Mirosław

    2018-02-01

    In this paper we present some preliminary results in the field of computer systems management in relation to Tsallis thermostatistics and the ubiquitous problem of hardware-limited resources. In the case of systems with non-deterministic behaviour, management of their resources is a key point that guarantees their acceptable performance and proper operation. This is a very broad problem that poses many challenges in areas such as finance, transport, water and food, and health. We focus on computer systems, with attention paid to cache memory, and propose an analytical model that is able to connect the non-extensive entropy formalism, long-range dependencies, management of system resources and queuing theory. The analytical results are related to a practical experiment, showing interesting and valuable results.
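
    The model itself is not reproduced in the abstract; a stretched (Kohlrausch) exponential has the familiar form f(t) = exp(-(t/τ)^β), and the sketch below fits that form to synthetic decay data as a rough illustration of the kind of long-range behaviour discussed (it is not the authors' cache model).

    ```python
    # Fit a stretched exponential f(t) = exp(-(t / tau)**beta) to synthetic
    # decay data; beta < 1 signals the "stretching" linked to long-range effects.
    import numpy as np
    from scipy.optimize import curve_fit

    def stretched_exp(t, tau, beta):
        return np.exp(-(t / tau) ** beta)

    t = np.linspace(0.1, 50.0, 100)
    rng = np.random.default_rng(0)
    observed = stretched_exp(t, 8.0, 0.6) + 0.01 * rng.normal(size=t.size)

    (tau_hat, beta_hat), _ = curve_fit(stretched_exp, t, observed,
                                       p0=(1.0, 1.0), bounds=(1e-3, [100.0, 2.0]))
    print(f"tau = {tau_hat:.2f}, beta = {beta_hat:.2f}")
    ```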

  1. Universal resources for approximate and stochastic measurement-based quantum computation

    International Nuclear Information System (INIS)

    Mora, Caterina E.; Piani, Marco; Miyake, Akimasa; Van den Nest, Maarten; Duer, Wolfgang; Briegel, Hans J.

    2010-01-01

    We investigate which quantum states can serve as universal resources for approximate and stochastic measurement-based quantum computation in the sense that any quantum state can be generated from a given resource by means of single-qubit (local) operations assisted by classical communication. More precisely, we consider the approximate and stochastic generation of states, resulting, for example, from a restriction to finite measurement settings or from possible imperfections in the resources or local operations. We show that entanglement-based criteria for universality obtained in M. Van den Nest et al. [New J. Phys. 9, 204 (2007)] for the exact, deterministic case can be lifted to the much more general approximate, stochastic case. This allows us to move from the idealized situation (exact, deterministic universality) considered in previous works to the practically relevant context of nonperfect state preparation. We find that any entanglement measure fulfilling some basic requirements needs to reach its maximum value on some element of an approximate, stochastic universal family of resource states, as the resource size grows. This allows us to rule out various families of states as being approximate, stochastic universal. We prove that approximate, stochastic universality is in general a weaker requirement than deterministic, exact universality and provide resources that are efficient approximate universal, but not exact deterministic universal. We also study the robustness of universal resources for measurement-based quantum computation under realistic assumptions about the (imperfect) generation and manipulation of entangled states, giving an explicit expression for the impact that errors made in the preparation of the resource have on the possibility to use it for universal approximate and stochastic state preparation. Finally, we discuss the relation between our entanglement-based criteria and recent results regarding the uselessness of states with a high

  2. Estimating healthcare resource use associated with the treatment of metastatic melanoma in eight countries.

    Science.gov (United States)

    McKendrick, Jan; Gijsen, Merel; Quinn, Casey; Barber, Beth; Zhao, Zhongyun

    2016-06-01

    Objectives Studies reporting healthcare resource use (HRU) for melanoma, one of the most costly cancers to treat, are limited. Using consistent, robust methodology, this study estimated HRU associated with the treatment of metastatic melanoma in eight countries. Methods Using published literature and clinician input, treatment phases were identified: active systemic treatment (pre-progression); disease progression; best supportive care (BSC)/palliative care; and terminal care. HRU elements were identified for each phase and estimates of the magnitude and frequency of use in clinical practice were obtained through country-specific Delphi panels, comprising healthcare professionals with experience in oncology (n = 8). Results Medical oncologists are the key care providers for patients with metastatic melanoma, although in Germany dermato-oncologists also lead care. During the active systemic treatment phase, each patient was estimated to require 0.83-2 consultations with a medical oncologist/month across countries; the median number of such assessments in 3 months was highest in Canada (range = 3.5-5) and lowest in France, the Netherlands and Spain (1). Resource use during the disease progression phase was intensive and similar across countries: all patients were estimated to consult with medical oncologists and 10-40% with a radiation oncologist; up to 40% were estimated to require a brain MRI scan. During the BSC/palliative care phase, all patients were estimated to consult with medical oncologists, and most to consult with a primary care physician (40-100%). Limitations Panelists were from centers of excellence, thus results may not reflect care within smaller hospitals; data obtained from experts may be less variable than data from broader clinical practice. Treatments for metastatic melanoma are continually emerging, thus some elements of our work could be superseded. Conclusions HRU estimates were substantial and varied across countries for some

  3. Estimating the financial resources needed for local public health departments in Minnesota: a multimethod approach.

    Science.gov (United States)

    Riley, William; Briggs, Jill; McCullough, Mac

    2011-01-01

    This study presents a model for determining the total funding needed for individual local health departments. The aim is to determine the financial resources needed to provide services for statewide local public health departments in Minnesota, based on a gaps analysis done to estimate the funding needs. We used a multimethod analysis to estimate gaps in local public health funding, consisting of 3 approaches: (1) interviews of selected local public health leaders, (2) a Delphi panel, and (3) a Nominal Group Technique. On the basis of these 3 approaches, a consensus estimate of funding gaps was generated for statewide projections. The study includes an analysis of cost, performance, and outcomes from 2005 to 2007 for all 87 local governmental health departments in Minnesota. For each of the methods, we selected a panel to represent a profile of Minnesota health departments. The 2 main outcome measures were local-level gaps in financial resources and total resources needed to provide public health services at the local level. The total public health expenditure in Minnesota for local governmental public health departments was $302 million in 2007 ($58.92 per person). The consensus estimate of the financial gaps in local public health departments indicates that an additional $32.5 million (a 10.7% increase or $6.32 per person) is needed to adequately serve public health needs in the local communities. It is possible to make informed estimates of funding gaps for public health activities on the basis of a combination of quantitative methods. There is a wide variation in public health expenditure at the local levels, and methods are needed to establish minimum baseline expenditure levels to adequately serve a population. The gaps analysis can be used by stakeholders to inform policy makers of the need for improved funding of the public health system.

  4. A computer program for the estimation of time of death

    DEFF Research Database (Denmark)

    Lynnerup, N

    1993-01-01

    In the 1960s Marshall and Hoare presented a "Standard Cooling Curve" based on their mathematical analyses of the postmortem cooling of bodies. Although fairly accurate under standard conditions, the "curve" or formula is based on the assumption that the ambient temperature is constant and that … cooling of bodies is presented. It is proposed that by having a computer program that solves the equation, giving the length of the cooling period in response to a certain rectal temperature, and which allows easy comparison of multiple solutions, the uncertainties related to ambient temperature…
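
    As a rough illustration of the kind of computation such a program performs, the sketch below solves a double-exponential post-mortem cooling model for the cooling period by root finding. The constants follow the commonly quoted Henssge parameterisation of the Marshall and Hoare approach (assumed here for ambient temperatures up to roughly 23 °C); they are illustrative and are not taken from the paper.

    ```python
    # Double-exponential post-mortem cooling model (Henssge-style constants,
    # assumed valid for ambient temperatures up to ~23 C; illustrative only).
    import math
    from scipy.optimize import brentq

    T0 = 37.2  # assumed rectal temperature at death, deg C

    def q_model(t_hours, mass_kg):
        """Normalised ratio Q(t) = (T_rectal - T_ambient) / (T0 - T_ambient)."""
        b = -1.2815 * mass_kg ** -0.625 + 0.0284
        return 1.25 * math.exp(b * t_hours) - 0.25 * math.exp(5.0 * b * t_hours)

    def cooling_time(t_rectal, t_ambient, mass_kg):
        """Solve Q(t) = measured ratio for the cooling period t (hours)."""
        q_measured = (t_rectal - t_ambient) / (T0 - t_ambient)
        return brentq(lambda t: q_model(t, mass_kg) - q_measured, 0.01, 100.0)

    print(f"Estimated cooling period: {cooling_time(30.0, 18.0, 75.0):.1f} h")
    ```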

  5. Model calibration and parameter estimation for environmental and water resource systems

    CERN Document Server

    Sun, Ne-Zheng

    2015-01-01

    This three-part book provides a comprehensive and systematic introduction to the development of useful models for complex systems. Part 1 covers the classical inverse problem for parameter estimation in both deterministic and statistical frameworks, Part 2 is dedicated to system identification, hyperparameter estimation, and model dimension reduction, and Part 3 considers how to collect data and construct reliable models for prediction and decision-making. For the first time, topics such as multiscale inversion, stochastic field parameterization, level set method, machine learning, global sensitivity analysis, data assimilation, model uncertainty quantification, robust design, and goal-oriented modeling, are systematically described and summarized in a single book from the perspective of model inversion, and elucidated with numerical examples from environmental and water resources modeling. Readers of this book will not only learn basic concepts and methods for simple parameter estimation, but also get famili...
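
    As a toy instance of the classical parameter-estimation (inverse) problem covered in Part 1 (not an example taken from the book), the sketch below recovers the two parameters of a simple exponential decay model from noisy synthetic observations by nonlinear least squares.

    ```python
    # Toy inverse problem: recover (a, k) in y = a * exp(-k * t) from noisy
    # synthetic observations via nonlinear least squares.
    import numpy as np
    from scipy.optimize import least_squares

    rng = np.random.default_rng(1)
    t = np.linspace(0.0, 10.0, 30)
    y_obs = 3.0 * np.exp(-0.4 * t) + 0.05 * rng.normal(size=t.size)

    def residuals(params):
        a, k = params
        return a * np.exp(-k * t) - y_obs

    fit = least_squares(residuals, x0=[1.0, 1.0])
    print("estimated a, k:", fit.x)   # expected to be close to (3.0, 0.4)
    ```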

  6. Tracking the Flow of Resources in Electronic Waste - The Case of End-of-Life Computer Hard Disk Drives.

    Science.gov (United States)

    Habib, Komal; Parajuly, Keshav; Wenzel, Henrik

    2015-10-20

    Recovery of resources, in particular metals, from waste flows is widely seen as a prioritized option to reduce their potential supply constraints in the future. The current waste electrical and electronic equipment (WEEE) treatment system is more focused on bulk metals, where the recycling rate of specialty metals, such as rare earths, is negligible compared to their increasing use in modern products, such as electronics. This study investigates the challenges in recovering these resources in the existing WEEE treatment system. It is illustrated by following the material flows of resources in a conventional WEEE treatment plant in Denmark. Computer hard disk drives (HDDs) containing neodymium-iron-boron (NdFeB) magnets were selected as the case product for this experiment. The resulting output fractions were tracked until their final treatment in order to estimate the recovery potential of rare earth elements (REEs) and other resources contained in HDDs. The results show that, of the 244 kg of HDDs treated, 212 kg, consisting mainly of aluminum and steel, can ultimately be recovered from the metallurgic process. The results further demonstrate the complete loss of REEs in the existing shredding-based WEEE treatment processes. Dismantling and separate processing of NdFeB magnets from their end-use products can be a preferable option to shredding. However, it remains a technological and logistic challenge for the existing system.

  7. Estimating health workforce needs for antiretroviral therapy in resource-limited settings

    Directory of Open Access Journals (Sweden)

    Fullem Andrew

    2006-01-01

    Full Text Available Abstract Background Efforts to increase access to life-saving treatment, including antiretroviral therapy (ART), for people living with HIV/AIDS in resource-limited settings have been the growing focus of international efforts. One of the greatest challenges to scaling up will be the limited supply of adequately trained human resources for health, including doctors, nurses, pharmacists and other skilled providers. As national treatment programmes are planned, better estimates of human resource needs and improved approaches to assessing the impact of different staffing models are critically needed. However, there have been few systematic assessments of staffing patterns in existing programmes or of the estimates being used in planning larger programmes. Methods We reviewed the published literature and selected plans and scaling-up proposals, interviewed experts and collected data on staffing patterns at existing treatment sites through a structured survey and site visits. Results We found a wide range of staffing patterns and patient-provider ratios in existing and planned treatment programmes. Many factors influenced health workforce needs, including task assignments, delivery models, other staff responsibilities and programme size. Overall, the number of health care workers required to provide ART to 1000 patients included 1–2 physicians, 2–7 nurses, … Discussion These data are consistent with other estimates of human resource requirements for antiretroviral therapy, but highlight the considerable variability of current staffing models and the importance of a broad range of factors in determining personnel needs. Few outcome or cost data are currently available to assess the effectiveness and efficiency of different staffing models, and it will be important to develop improved methods for gathering this information as treatment programmes are scaled up.

  8. Resource Needs for Adolescent Friendly Health Services: Estimates for 74 Low- and Middle-Income Countries

    Science.gov (United States)

    Deogan, Charlotte; Ferguson, Jane; Stenberg, Karin

    2012-01-01

    Background In order to achieve Millennium Development Goals 4, 5 and 6, it is essential to address adolescents’ health. Objective To estimate the additional resources required to scale up adolescent friendly health service interventions with the objective to reduce mortality and morbidity among individuals aged 10 to 19 years in 74 low- and middle-income countries. Methods A costing model was developed to estimate the financial resources needed to scale up delivery of a set of interventions including contraception, maternity care, management of sexually transmitted infections, HIV testing and counseling, safe abortion services, HIV harm reduction, HIV care and treatment and care of injuries due to intimate partner physical and sexual violence. Financial costs were estimated for each intervention, country and year using a bottom-up ingredients approach, defining costs at different levels of delivery (i.e., community, health centre, and hospital level). Programme activity costs to improve quality of care were also estimated, including activities undertaken at national, district and facility level in order to improve adolescents’ use of health services (i.e., to render health services adolescent friendly). Results Costs of achieving universal coverage are estimated at an additional US$ 15.41 billion for the period 2011–2015, increasing from US$ 1.86 billion in 2011 to US$ 4.31 billion in 2015. This corresponds to approximately US$ 1.02 per adolescent in 2011, increasing to US$ 4.70 in 2015. On average, for all 74 countries, an annual additional expenditure per capita ranging from US$ 0.38 in 2011 to US$ 0.82 in 2015 would be required to support the scale-up of key adolescent friendly health services. Conclusion The estimated costs show a substantial investment gap and are indicative of the additional investments required to scale up health service delivery to adolescents towards universal coverage by 2015. PMID:23300548

  9. Current status and prospects of computational resources for natural product dereplication: a review.

    Science.gov (United States)

    Mohamed, Ahmed; Nguyen, Canh Hao; Mamitsuka, Hiroshi

    2016-03-01

    Research in natural products has always enhanced drug discovery by providing new and unique chemical compounds. However, recently, drug discovery from natural products is slowed down by the increasing chance of re-isolating known compounds. Rapid identification of previously isolated compounds in an automated manner, called dereplication, steers researchers toward novel findings, thereby reducing the time and effort for identifying new drug leads. Dereplication identifies compounds by comparing processed experimental data with those of known compounds, and so, diverse computational resources such as databases and tools to process and compare compound data are necessary. Automating the dereplication process through the integration of computational resources has always been an aspired goal of natural product researchers. To increase the utilization of current computational resources for natural products, we first provide an overview of the dereplication process, and then list useful resources, categorizing into databases, methods and software tools and further explaining them from a dereplication perspective. Finally, we discuss the current challenges to automating dereplication and proposed solutions.

  10. Formal Computer Validation of the Quantum Phase Estimation Algorithm

    Science.gov (United States)

    Witzel, Wayne; Rudinger, Kenneth; Sarovar, Mohan; Carr, Robert

    While peer review and scientific consensus provide some assurance to the validity of ideas, people do make mistakes that can slip through the cracks. A plethora of formal methods tools exist and are in use in a variety of settings where high assurance is demanded. Existing tools, however, require a great deal of expertise and lack versatility, demanding a non-trivial translation between a high-level description of a problem and the formal system. Our software, called Prove-It, allows a nearly direct translation between human-recognizable formulations and the underlying formal system. While Prove-It is not designed for particularly efficient automation, a primary goal of other formal methods tools, it is extremely flexible in following a desired line of reasoning (proof structure). This approach is particularly valuable for validating proofs that are already known. We will demonstrate a validation of the Quantum Phase Estimation Algorithm using Prove-It. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000. This work was supported by the Laboratory Directed Research and Development program at Sandia National Laboratories.

  11. The ATLAS Computing Agora: a resource web site for citizen science projects

    CERN Document Server

    Bourdarios, Claire; The ATLAS collaboration

    2016-01-01

    The ATLAS collaboration has recently set up a number of citizen science projects which have a strong IT component and could not have been envisaged without the growth of general public computing resources and network connectivity: event simulation through volunteer computing, algorithm improvements via Machine Learning challenges, event display analysis on citizen science platforms, use of open data, etc. Most of the interactions with volunteers are handled through message boards, but specific outreach material was also developed, giving an enhanced visibility to the ATLAS software and computing techniques, challenges and community. In this talk the ATLAS Computing Agora (ACA) web platform will be presented as well as some of the specific material developed for some of the projects.

  12. An open-source computational and data resource to analyze digital maps of immunopeptidomes

    Energy Technology Data Exchange (ETDEWEB)

    Caron, Etienne; Espona, Lucia; Kowalewski, Daniel J.; Schuster, Heiko; Ternette, Nicola; Alpizar, Adan; Schittenhelm, Ralf B.; Ramarathinam, Sri Harsha; Lindestam-Arlehamn, Cecilia S.; Koh, Ching Chiek; Gillet, Ludovic; Rabsteyn, Armin; Navarro, Pedro; Kim, Sangtae; Lam, Henry; Sturm, Theo; Marcilla, Miguel; Sette, Alessandro; Campbell, David; Deutsch, Eric W.; Moritz, Robert L.; Purcell, Anthony; Rammensee, Hans-Georg; Stevanovic, Stevan; Aebersold, Ruedi

    2015-07-08

    We present a novel proteomics-based workflow and an open source data and computational resource for reproducibly identifying and quantifying HLA-associated peptides at high-throughput. The provided resources support the generation of HLA allele-specific peptide assay libraries consisting of consensus fragment ion spectra and the analysis of quantitative digital maps of HLA peptidomes generated by SWATH mass spectrometry (MS). This is the first community-based study towards the development of a robust platform for the reproducible and quantitative measurement of HLA peptidomes, an essential step towards the design of efficient immunotherapies.

  13. PredMP: A Web Resource for Computationally Predicted Membrane Proteins via Deep Learning

    KAUST Repository

    Wang, Sheng

    2018-02-06

    Experimental determination of membrane protein (MP) structures is challenging as they are often too large for nuclear magnetic resonance (NMR) experiments and difficult to crystallize. Currently there are only about 510 non-redundant MPs with solved structures in Protein Data Bank (PDB). To elucidate the MP structures computationally, we developed a novel web resource, denoted as PredMP (http://52.87.130.56:3001/#/proteinindex), that delivers one-dimensional (1D) annotation of the membrane topology and secondary structure, two-dimensional (2D) prediction of the contact/distance map, together with three-dimensional (3D) modeling of the MP structure in the lipid bilayer, for each MP target from a given model organism. The precision of the computationally constructed MP structures is leveraged by state-of-the-art deep learning methods as well as cutting-edge modeling strategies. In particular, (i) we annotate 1D property via DeepCNF (Deep Convolutional Neural Fields) that not only models complex sequence-structure relationship but also interdependency between adjacent property labels; (ii) we predict 2D contact/distance map through Deep Transfer Learning which learns the patterns as well as the complex relationship between contacts/distances and protein features from non-membrane proteins; and (iii) we model 3D structure by feeding its predicted contacts and secondary structure to the Crystallography & NMR System (CNS) suite combined with a membrane burial potential that is residue-specific and depth-dependent. PredMP currently contains more than 2,200 multi-pass transmembrane proteins (length<700 residues) from Human. These transmembrane proteins are classified according to IUPHAR/BPS Guide, which provides a hierarchical organization of receptors, channels, transporters, enzymes and other drug targets according to their molecular relationships and physiological functions. Among these MPs, we estimated that our approach could predict correct folds for 1

  14. A confirmatory investigation of a job demands-resources model using a categorical estimator.

    Science.gov (United States)

    de Beer, Leon; Rothmann, Sebastiaan; Pienaar, Jaco

    2012-10-01

    A confirmatory investigation of a job demands-resources model was conducted with alternative methods, in a sample of 15,633 working adults aggregated from various economic sectors. The proposed model is in line with job demands-resources theory and assumes two psychological processes at work which are collectively coined "the dual process." The first process, the energetic, presents that job demands lead to ill-health outcomes due to burnout. The second process, the motivational, indicates that job resources lead to organizational commitment due to work engagement. Structural equation modelling analyses were implemented with a categorical estimator. Mediation analyses of each of the processes included bootstrapped indirect effects and kappa-squared values to apply qualitative labels to effect sizes. The relationship between job resources and organizational commitment was mediated by engagement with a large effect. The relationship between job demands and ill-health was mediated by burnout with a medium effect. The implications of the results for theory and practice were discussed.

  15. Estimating the weight of Douglas-fir tree boles and logs with an iterative computer model.

    Science.gov (United States)

    Dale R. Waddell; Dale L Weyermann; Michael B. Lambert

    1987-01-01

    A computer model that estimates the green weights of standing trees was developed and validated for old-growth Douglas-fir. The model calculates the green weight for the entire bole, for the bole to any merchantable top, and for any log length within the bole. The model was validated by estimating the bias and accuracy of an independent subsample selected from the...

  16. Estimate of Geothermal Energy Resource in Major U.S. Sedimentary Basins (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Porro, C.; Augustine, C.

    2012-04-01

    This study estimates the magnitude of geothermal energy from fifteen major known US sedimentary basins and ranks these basins relative to their potential. Because most sedimentary basins have been explored for oil and gas, well logs, temperatures at depth, and reservoir properties are known. This reduces exploration risk and allows development of geologic exploration models for each basin as well as a relative assessment of geologic risk elements for each play. The total available thermal resource for each basin was estimated using the volumetric heat-in-place method originally proposed by Muffler (USGS). Total sedimentary thickness maps, stratigraphic columns, cross sections, and temperature gradient information were gathered for each basin from published articles, USGS reports, and state geological survey reports. When published data was insufficient, thermal gradients and reservoir properties were derived from oil and gas well logs obtained on oil and gas commission websites. Basin stratigraphy, structural history, and groundwater circulation patterns were studied in order to develop a model that estimates resource size and temperature distribution, and to qualitatively assess reservoir productivity.
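
    The volumetric heat-in-place method referenced here reduces to Q = ρc · V · (T_reservoir − T_ref); the sketch below implements that relation with illustrative values for a generic basin (these numbers are placeholders, not results from the study).

    ```python
    # Volumetric heat-in-place estimate, Q = rho_c * V * (T_res - T_ref);
    # all reservoir numbers below are illustrative placeholders.
    def heat_in_place_ej(area_km2, thickness_km, t_res_c, t_ref_c,
                         vol_heat_capacity_j_m3_k=2.5e6):
        volume_m3 = area_km2 * 1e6 * thickness_km * 1e3
        q_joules = vol_heat_capacity_j_m3_k * volume_m3 * (t_res_c - t_ref_c)
        return q_joules / 1e18                        # joules -> exajoules

    print(f"{heat_in_place_ej(5000, 1.0, 150.0, 25.0):.0f} EJ of thermal energy in place")
    ```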

  17. Directions of use of electronic resources at training to computer science of students of a teacher training college

    OpenAIRE

    Светлана Анатольева Баженова

    2009-01-01

    The article addresses the use of electronic resources in teaching computer science at a teacher training college. It sets out principles of pedagogical expediency for using electronic resources in computer science instruction and identifies the positive aspects of such use for different forms of student and teacher work.

  18. Recoverable Resource Estimate of Identified Onshore Geopressured Geothermal Energy in Texas and Louisiana (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Esposito, A.; Augustine, C.

    2012-04-01

    Geopressured geothermal reservoirs are characterized by high temperatures and high pressures with correspondingly large quantities of dissolved methane. Due to these characteristics, the reservoirs provide two sources of energy: chemical energy from the recovered methane, and thermal energy from the recovered fluid at temperatures high enough to operate a binary power plant for electricity production. Formations with the greatest potential for recoverable energy are located in the gulf coastal region of Texas and Louisiana where significantly overpressured and hot formations are abundant. This study estimates the total recoverable onshore geopressured geothermal resource for identified sites in Texas and Louisiana. In this study a geopressured geothermal resource is defined as a brine reservoir with fluid temperature greater than 212 degrees F and a pressure gradient greater than 0.7 psi/ft.

  19. Discharge Fee Policy Analysis: A Computable General Equilibrium (CGE) Model of Water Resources and Water Environments

    OpenAIRE

    Guohua Fang; Ting Wang; Xinyi Si; Xin Wen; Yu Liu

    2016-01-01

    To alleviate increasingly serious water pollution and shortages in developing countries, various kinds of policies have been implemented by local governments. It is vital to quantify and evaluate the performance and potential economic impacts of these policies. This study develops a Computable General Equilibrium (CGE) model to simulate the regional economic and environmental effects of discharge fees. Firstly, water resources and water environment factors are separated from the input and out...

  20. Piping data bank and erection system of Angra 2: structure, computational resources and systems

    International Nuclear Information System (INIS)

    Abud, P.R.; Court, E.G.; Rosette, A.C.

    1992-01-01

    The piping data bank of Angra 2, called the Erection Management System, was developed to manage the piping erection of the Angra 2 nuclear power plant. Beyond following up the erection of piping and supports, it manages the piping design, material procurement, the flow of fabrication documents, weld testing, and material stocks at the warehouse. The work carried out to define the structure of the data bank, the computational resources and the systems is described here. (author)

  1. Blockchain-Empowered Fair Computational Resource Sharing System in the D2D Network

    Directory of Open Access Journals (Sweden)

    Zhen Hong

    2017-11-01

    Full Text Available Device-to-device (D2D) communication is becoming an increasingly important technology in future networks with the climbing demand for local services. For instance, resource sharing in the D2D network features ubiquitous availability, flexibility, low latency and low cost. However, these features also bring along challenges when building a satisfactory resource sharing system in the D2D network. Specifically, user mobility is one of the top concerns for designing a cooperative D2D computational resource sharing system, since mutual communication may not be stably available due to user mobility. A previous endeavour has demonstrated and proven how connectivity can be incorporated into cooperative task scheduling among users in the D2D network to effectively lower average task execution time. There are doubts about whether this type of task scheduling scheme, though effective, ensures fairness among users. In other words, it can be unfair for users who contribute many computational resources while receiving little when in need. In this paper, we propose a novel blockchain-based credit system that can be incorporated into the connectivity-aware task scheduling scheme to enforce fairness among users in the D2D network. Users’ computational task cooperation will be recorded on the public blockchain ledger in the system as transactions, and each user’s credit balance is easily accessible from the ledger. A supernode at the base station is responsible for scheduling cooperative computational tasks based on user mobility and user credit balance. We investigated the performance of the credit system, and simulation results showed that with a minor sacrifice of average task execution time, the level of fairness can be substantially enhanced.

  2. Estimation of forest resources from a country wide laser scanning survey and national forest inventory data

    DEFF Research Database (Denmark)

    Nord-Larsen, Thomas; Schumacher, Johannes

    2012-01-01

    Airborne laser scanning may provide a means for assessing local forest biomass resources. In this study, national forest inventory (NFI) data was used as reference data for modeling forest basal area, volume, aboveground biomass, and total biomass from laser scanning data obtained in a countrywide scanning survey. Data covered a wide range of forest ecotypes, stand treatments, tree species, and tree species mixtures. The four forest characteristics were modeled using nonlinear regression and generalized method-of-moments estimation to avoid biased and inefficient estimates. The coefficient … for deciduous forest and negatively biased for coniferous forest. Species type specific (coniferous, deciduous, or mixed forest) models reduced root mean squared error by 3–12% and removed the bias. In application, model predictions will be improved by stratification into deciduous and coniferous forest using e...

  3. Collocational Relations in Japanese Language Textbooks and Computer-Assisted Language Learning Resources

    Directory of Open Access Journals (Sweden)

    Irena SRDANOVIĆ

    2011-05-01

    Full Text Available In this paper, we explore the presence of collocational relations in computer-assisted language learning systems and other language resources for Japanese on the one hand, and in Japanese language learning textbooks and wordlists on the other. After introducing how important it is to learn collocational relations in a foreign language, we examine their coverage in the various learners’ resources for the Japanese language. We particularly concentrate on a few collocations at the beginner’s level, where we demonstrate their treatment across various resources. Special attention is paid to what are referred to as unpredictable collocations, which carry a bigger foreign-language learning burden than predictable ones.

  4. Resource allocation on computational grids using a utility model and the knapsack problem

    CERN Document Server

    Van der ster, Daniel C; Parra-Hernandez, Rafael; Sobie, Randall J

    2009-01-01

    This work introduces a utility model (UM) for resource allocation on computational grids and formulates the allocation problem as a variant of the 0–1 multichoice multidimensional knapsack problem. The notion of task-option utility is introduced, and it is used to effect allocation policies. We present a variety of allocation policies, which are expressed as functions of metrics that are both intrinsic and external to the task and resources. An external user-defined credit-value metric is shown to allow users to intervene in the allocation of urgent or low priority tasks. The strategies are evaluated in simulation against random workloads as well as those drawn from real systems. We measure the sensitivity of the UM-derived schedules to variations in the allocation policies and their corresponding utility functions. The UM allocation strategy is shown to optimally allocate resources congruent with the chosen policies.
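
    The abstract formulates allocation as a 0-1 multichoice multidimensional knapsack problem but does not spell out an algorithm; as a hedged stand-in, the sketch below assigns at most one execution option per task by greedy utility density under per-resource capacity limits. The task and resource data are invented for illustration, and the greedy rule is a heuristic, not the authors' UM strategy.

    ```python
    # Greedy illustration of utility-driven allocation: each task has several
    # execution options (resource, cpu demand, utility); pick at most one option
    # per task without exceeding resource capacities. A heuristic stand-in for
    # the knapsack-based UM strategy, with invented data.
    tasks = {
        "t1": [("r1", 4, 10.0), ("r2", 6, 8.0)],
        "t2": [("r1", 3, 7.0), ("r2", 2, 6.5)],
        "t3": [("r2", 5, 9.0)],
    }
    capacity = {"r1": 6, "r2": 7}

    # Flatten options and sort by utility density (utility per CPU unit demanded).
    options = [(u / d, task, res, d, u) for task, opts in tasks.items() for res, d, u in opts]
    options.sort(reverse=True)

    assigned, used = {}, {r: 0 for r in capacity}
    for density, task, res, demand, utility in options:
        if task in assigned or used[res] + demand > capacity[res]:
            continue
        assigned[task] = (res, utility)
        used[res] += demand

    print(assigned)   # task -> (resource, utility of the chosen option)
    ```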

  5. The Trope Tank: A Laboratory with Material Resources for Creative Computing

    Directory of Open Access Journals (Sweden)

    Nick Montfort

    2014-12-01

    Full Text Available http://dx.doi.org/10.5007/1807-9288.2014v10n2p53 Principles for organizing and making use of a laboratory with material computing resources are articulated. This laboratory, the Trope Tank, is a facility for teaching, research, and creative collaboration and offers hardware (in working condition and set up for use) from the 1970s, 1980s, and 1990s, including videogame systems, home computers, and an arcade cabinet. To aid in investigating the material history of texts, the lab has a small 19th century letterpress, a typewriter, a print terminal, and dot-matrix printers. Other resources include controllers, peripherals, manuals, books, and software on physical media. These resources are used for teaching, loaned for local exhibitions and presentations, and accessed by researchers and artists. The space is primarily a laboratory (rather than a library, studio, or museum), so materials are organized by platform and intended use. Textual information about the historical contexts of the available systems is provided, and resources are set up to allow easy operation, and even casual use, by researchers, teachers, students, and artists.

  6. Radiotherapy infrastructure and human resources in Switzerland. Present status and projected computations for 2020

    Energy Technology Data Exchange (ETDEWEB)

    Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar [KSA-KSB, Kantonsspital Aarau, RadioOnkologieZentrum, Aarau (Switzerland); Zwahlen, Daniel [Kantonsspital Graubuenden, Department of Radiotherapy, Chur (Switzerland); Bodis, Stephan [KSA-KSB, Kantonsspital Aarau, RadioOnkologieZentrum, Aarau (Switzerland); University Hospital Zurich, Department of Radiation Oncology, Zurich (Switzerland)

    2016-09-15

    The purpose of this study was to evaluate the present status of radiotherapy infrastructure and human resources in Switzerland and compute projections for 2020. The European Society of Therapeutic Radiation Oncology "Quantification of Radiation Therapy Infrastructure and Staffing" guidelines (ESTRO-QUARTS) and those of the International Atomic Energy Agency (IAEA) were applied to estimate the requirements for teleradiotherapy (TRT) units, radiation oncologists (RO), medical physicists (MP) and radiotherapy technologists (RTT). The databases used for computation of the present gap and additional requirements are (a) Global Cancer Incidence, Mortality and Prevalence (GLOBOCAN) for cancer incidence, (b) the Directory of Radiotherapy Centres (DIRAC) of the IAEA for existing TRT units, (c) human resources from the recent ESTRO "Health Economics in Radiation Oncology" (HERO) survey and (d) radiotherapy utilization (RTU) rates for each tumour site, published by the Ingham Institute for Applied Medical Research (IIAMR). In 2015, 30,999 of 45,903 cancer patients would have required radiotherapy. By 2020, this will have increased to 34,041 of 50,427 cancer patients. Switzerland presently has an adequate number of TRTs, but a deficit of 57 ROs, 14 MPs and 36 RTTs. By 2020, an additional 7 TRTs, 72 ROs, 22 MPs and 66 RTTs will be required. In addition, a realistic dynamic model for calculation of staff requirements due to anticipated changes in future radiotherapy practices has been proposed. This model could be tailor-made and individualized for any radiotherapy centre. A 9.8% increase in radiotherapy requirements is expected for cancer patients over the next 5 years. The present study should assist the stakeholders and health planners in designing an appropriate strategy for meeting future radiotherapy needs for Switzerland. (orig.)
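
    A simplified version of the QUARTS-style arithmetic implied by these projections is sketched below. The throughput benchmarks (for example, roughly 450 treatment courses per teletherapy unit per year) are commonly quoted planning figures assumed here for illustration; they are not values reported in the study.

    ```python
    # Back-of-envelope equipment and staffing needs in the spirit of QUARTS.
    # Throughput benchmarks are assumed planning figures, not study values.
    patients_needing_rt = 34041          # projected RT courses in 2020 (from abstract)

    benchmarks = {                       # courses handled per unit/person per year (assumed)
        "teletherapy units": 450,
        "radiation oncologists": 225,
        "medical physicists": 475,
    }

    for role, throughput in benchmarks.items():
        needed = -(-patients_needing_rt // throughput)   # ceiling division
        print(f"{role:>22}: {needed}")
    ```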

  7. Weight Estimation Tool for Children Aged 6 to 59 Months in Limited-Resource Settings.

    Science.gov (United States)

    Ralston, Mark E; Myatt, Mark A

    2016-01-01

    A simple, reliable anthropometric tool for rapid estimation of weight in children would be useful in limited-resource settings where current weight estimation tools are not uniformly reliable, nearly all global under-five mortality occurs, severe acute malnutrition is a significant contributor in approximately one-third of under-five mortality, and a weight scale may not be immediately available in emergencies to first-response providers. To determine the accuracy and precision of mid-upper arm circumference (MUAC) and height as weight estimation tools in children under five years of age in low-to-middle income countries. This was a retrospective observational study. Data were collected in 560 nutritional surveys during 1992-2006 using a modified Expanded Program of Immunization two-stage cluster sample design. Locations with high prevalence of acute and chronic malnutrition. A total of 453,990 children met inclusion criteria (age 6-59 months; weight ≤ 25 kg; MUAC 80-200 mm) and exclusion criteria (bilateral pitting edema; biologically implausible weight-for-height z-score (WHZ), weight-for-age z-score (WAZ), and height-for-age z-score (HAZ) values). Weight was estimated using Broselow Tape, Hong Kong formula, and database MUAC alone, height alone, and height and MUAC combined. Mean percentage difference between true and estimated weight, proportion of estimates accurate to within ± 25% and ± 10% of true weight, weighted Kappa statistic, and Bland-Altman bias were reported as measures of tool accuracy. Standard deviation of mean percentage difference and Bland-Altman 95% limits of agreement were reported as measures of tool precision. Database height was a more accurate and precise predictor of weight compared to Broselow Tape 2007 [B], Broselow Tape 2011 [A], and MUAC. Mean percentage difference between true and estimated weight was +0.49% (SD = 10.33%); proportion of estimates accurate to within ± 25% of true weight was 97.36% (95% CI 97.40%, 97.46%); and

  8. Weight Estimation Tool for Children Aged 6 to 59 Months in Limited-Resource Settings.

    Directory of Open Access Journals (Sweden)

    Mark E Ralston

    Full Text Available A simple, reliable anthropometric tool for rapid estimation of weight in children would be useful in limited-resource settings where current weight estimation tools are not uniformly reliable, nearly all global under-five mortality occurs, severe acute malnutrition is a significant contributor in approximately one-third of under-five mortality, and a weight scale may not be immediately available in emergencies to first-response providers. To determine the accuracy and precision of mid-upper arm circumference (MUAC) and height as weight estimation tools in children under five years of age in low-to-middle income countries. This was a retrospective observational study. Data were collected in 560 nutritional surveys during 1992-2006 using a modified Expanded Program of Immunization two-stage cluster sample design. Locations with high prevalence of acute and chronic malnutrition. A total of 453,990 children met inclusion criteria (age 6-59 months; weight ≤ 25 kg; MUAC 80-200 mm) and exclusion criteria (bilateral pitting edema; biologically implausible weight-for-height z-score (WHZ), weight-for-age z-score (WAZ), and height-for-age z-score (HAZ) values). Weight was estimated using Broselow Tape, Hong Kong formula, and database MUAC alone, height alone, and height and MUAC combined. Mean percentage difference between true and estimated weight, proportion of estimates accurate to within ± 25% and ± 10% of true weight, weighted Kappa statistic, and Bland-Altman bias were reported as measures of tool accuracy. Standard deviation of mean percentage difference and Bland-Altman 95% limits of agreement were reported as measures of tool precision. Database height was a more accurate and precise predictor of weight compared to Broselow Tape 2007 [B], Broselow Tape 2011 [A], and MUAC. Mean percentage difference between true and estimated weight was +0.49% (SD = 10.33%); proportion of estimates accurate to within ± 25% of true weight was 97.36% (95% CI 97.40%, 97…

  9. Optimizing qubit resources for quantum chemistry simulations in second quantization on a quantum computer

    International Nuclear Information System (INIS)

    Moll, Nikolaj; Fuhrer, Andreas; Staar, Peter; Tavernelli, Ivano

    2016-01-01

    Quantum chemistry simulations on a quantum computer suffer from the overhead needed for encoding the Fermionic problem in a system of qubits. By exploiting the block diagonality of a Fermionic Hamiltonian, we show that the number of required qubits can be reduced while the number of terms in the Hamiltonian will increase. All operations for this reduction can be performed in operator space. The scheme is conceived as a pre-computational step that would be performed prior to the actual quantum simulation. We apply this scheme to reduce the number of qubits necessary to simulate both the Hamiltonian of the two-site Fermi–Hubbard model and the hydrogen molecule. Both quantum systems can then be simulated with a two-qubit quantum computer. Despite the increase in the number of Hamiltonian terms, the scheme still remains a useful tool to reduce the dimensionality of specific quantum systems for quantum simulators with a limited number of resources. (paper)

  10. A geo-informatics approach for estimating water resources management components and their interrelationships

    KAUST Repository

    Liaqat, Umar Waqas

    2016-09-21

    A remote sensing based geo-informatics approach was developed to estimate water resources management (WRM) components across a large irrigation scheme in the Indus Basin of Pakistan. The approach provides a generalized framework for estimating a range of key water management variables and provides a management tool for the sustainable operation of similar schemes globally. A focus on the use of satellite data allowed for the quantification of relationships across a range of spatial and temporal scales. Variables including actual and crop evapotranspiration, net and gross irrigation, net and gross groundwater use, groundwater recharge, net groundwater recharge, were estimated and then their interrelationships explored across the Hakra Canal command area. Spatially distributed remotely sensed estimates of actual evapotranspiration (ETa) rates were determined using the Surface Energy Balance System (SEBS) model and evaluated against ground-based evaporation calculated from the advection-aridity method. Analysis of ETa simulations across two cropping season, referred to as Kharif and Rabi, yielded Pearson correlation (R) values of 0.69 and 0.84, Nash-Sutcliffe criterion (NSE) of 0.28 and 0.63, percentage bias of −3.85% and 10.6% and root mean squared error (RMSE) of 10.6 mm and 12.21 mm for each season, respectively. For the period of study between 2008 and 2014, it was estimated that an average of 0.63 mm day−1 water was supplied through canal irrigation against a crop water demand of 3.81 mm day−1. Approximately 1.86 mm day−1 groundwater abstraction was estimated in the region, which contributed to fulfil the gap between crop water demand and canal water supply. Importantly, the combined canal, groundwater and rainfall sources of water only met 70% of the crop water requirements. As such, the difference between recharge and discharge showed that groundwater depletion was around −115 mm year−1 during the six year study period. Analysis indicated that
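
    The evaluation statistics quoted for the SEBS simulations (Pearson R, NSE, percent bias and RMSE) have standard definitions; the short sketch below computes them for a placeholder pair of observed and simulated ETa series.

    ```python
    # Standard goodness-of-fit metrics used for the ETa evaluation:
    # Pearson R, Nash-Sutcliffe efficiency (NSE), percent bias (PBIAS), RMSE.
    import numpy as np

    obs = np.array([3.1, 3.8, 4.2, 2.9, 3.5])   # placeholder observed ETa, mm/day
    sim = np.array([3.0, 4.0, 4.5, 2.7, 3.6])   # placeholder simulated ETa, mm/day

    r = np.corrcoef(obs, sim)[0, 1]
    nse = 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
    pbias = 100.0 * np.sum(sim - obs) / np.sum(obs)
    rmse = np.sqrt(np.mean((sim - obs) ** 2))

    print(f"R={r:.2f}  NSE={nse:.2f}  PBIAS={pbias:.1f}%  RMSE={rmse:.2f} mm/day")
    ```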

  11. Impact of Changing Computer Technology on Hydrologic and Water Resource Modeling (Paper 7R0049)

    Science.gov (United States)

    Loucks, Daniel P.; Fedra, Kurt

    1987-03-01

    The increasing availability of substantial computer power at relatively low costs and the increasing ease of using computer graphics, of communicating with other computers and data bases, and of programming using high-level problem-oriented computer languages, is providing new opportunities and challenges for those developing and using hydrologic and water resources models. This paper reviews some of the progress made towards the development and application of computer support systems designed to aid those involved in analyzing hydrologic data and in operating, managing, or planning water resource facilities. Such systems of hardware and software are being designed to allow direct and easy access to a broad and heterogeneous group of users. These systems often combine data-base management; simulation and optimization techniques; symbolic colored displays; heuristic, qualitative approaches; and possibly artificial intelligence methods in an interactive, user-controlled, easily accessible interface. Individuals involved in the use of such systems are not only those with technical training, but also those representing different interest groups and having non-technical backgrounds. The essential difference between what is happening now and the more traditional off-line, non-interactive approaches is that instead of generating solutions to specific problems, model developers are now beginning to deliver, in a much more useful and user-friendly form, computer-based turnkey systems for exploring, analyzing and synthesizing plans or policies. Such tools permit the user to evaluate alternative solutions based on his or her own objectives and subjective judgments in an interactive learning and decision-making process.

  12. Hop-Distance Estimation in Wireless Sensor Networks with Applications to Resources Allocation

    Directory of Open Access Journals (Sweden)

    Liang Zhao

    2007-05-01

    Full Text Available We address a fundamental problem in wireless sensor networks: how many hops does it take a packet to be relayed over a given distance? For a deterministic topology, this hop-distance estimation reduces to a simple geometry problem. However, a statistical study is needed for randomly deployed WSNs. We propose a maximum-likelihood decision based on the conditional pdf f(r∣Hi). Due to the computational complexity of f(r∣Hi), we also propose an attenuated Gaussian approximation for the conditional pdf. We show that the approximation visibly simplifies the decision process and the error analysis. Latency and energy consumption estimation are also included as application examples. Simulations show that our approximation model can predict the latency and energy consumption with less than half the RMSE of the linear models.
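
    The maximum-likelihood decision described here amounts to choosing the hop count Hi whose conditional density f(r|Hi) is largest at the measured distance r; under a Gaussian-style approximation this reduces to comparing weighted Gaussian likelihoods. The sketch below illustrates the decision rule with hypothetical per-hop parameters, not the paper's attenuated-Gaussian model.

    ```python
    # Maximum-likelihood hop-count decision: choose the hypothesis H_i that
    # maximises an approximate Gaussian likelihood f(r | H_i). The per-hop
    # weights, means and standard deviations are hypothetical.
    import math

    def gaussian_pdf(r, mu, sigma):
        return math.exp(-0.5 * ((r - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

    # hop count -> (attenuation weight, mean distance in m, std dev in m)
    hop_models = {1: (1.0, 40.0, 12.0), 2: (0.9, 85.0, 18.0), 3: (0.8, 130.0, 25.0)}

    def ml_hop_estimate(r):
        return max(hop_models,
                   key=lambda h: hop_models[h][0] * gaussian_pdf(r, *hop_models[h][1:]))

    print(ml_hop_estimate(95.0))   # most likely hop count for a 95 m source-sink distance
    ```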

  13. COMPUTING

    CERN Document Server

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  14. A computer simulation platform for the estimation of measurement uncertainties in dimensional X-ray computed tomography

    DEFF Research Database (Denmark)

    Hiller, Jochen; Reindl, Leonard M

    2012-01-01

    The knowledge of measurement uncertainty is of great importance in conformance testing in production. The tolerance limit for production must be reduced by the amounts of measurement uncertainty to ensure that the parts are in fact within the tolerance. Over the last 5 years, industrial X-ray computed tomography (CT) has become an important technology for dimensional quality control. In this paper a computer simulation platform is presented which is able to investigate error sources in dimensional CT measurements. The typical workflow in industrial CT metrology is described and methods for estimating measurement uncertainties are briefly discussed. As we will show, the developed virtual CT (VCT) simulator can be adapted to various scanner systems, providing realistic CT data. Using the Monte Carlo method (MCM), measurement uncertainties for a given measuring task can be estimated, taking...
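
    The Monte Carlo method mentioned for uncertainty estimation follows the general GUM Supplement 1 idea of propagating input distributions through a measurement model; a generic sketch (with a made-up measurement model, not the VCT simulator) is given below.

    ```python
    # Generic Monte Carlo uncertainty propagation (GUM Supplement 1 style):
    # sample uncertain inputs, push them through the measurement model, and
    # report the spread of the simulated measurand. The model here is made up.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    voxel_size_um = rng.normal(50.0, 0.05, n)    # scale calibration uncertainty
    edge_distance_vx = rng.normal(200.0, 0.4, n) # surface-detection uncertainty

    length_um = voxel_size_um * edge_distance_vx # simple measurement model
    print(f"length = {length_um.mean():.1f} um, "
          f"standard uncertainty = {length_um.std(ddof=1):.2f} um")
    ```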

  15. Linking resource selection and mortality modeling for population estimation of mountain lions in Montana

    Science.gov (United States)

    Robinson, Hugh S.; Ruth, Toni K.; Gude, Justin A.; Choate, David; DeSimone, Rich; Hebblewhite, Mark; Matchett, Marc R.; Mitchell, Michael S.; Murphy, Kerry; Williams, Jim

    2015-01-01

    To be most effective, the scale of wildlife management practices should match the range of a particular species’ movements. For this reason, combined with our inability to rigorously or regularly census mountain lion populations, several authors have suggested that mountain lions be managed in a source-sink or metapopulation framework. We used a combination of resource selection functions, mortality estimation, and dispersal modeling to estimate cougar population levels in Montana statewide and potential population level effects of planned harvest levels. Between 1980 and 2012, 236 independent mountain lions were collared and monitored for research in Montana. From these data we used 18,695 GPS locations collected during winter from 85 animals to develop a resource selection function (RSF), and 11,726 VHF and GPS locations from 142 animals along with the locations of 6343 mountain lions harvested from 1988–2011 to validate the RSF model. Our RSF model validated well in all portions of the State, although it appeared to perform better in Montana Fish, Wildlife and Parks (MFWP) Regions 1, 2, 4 and 6, than in Regions 3, 5, and 7. Our mean RSF based population estimate for the total population (kittens, juveniles, and adults) of mountain lions in Montana in 2005 was 3926, with almost 25% of the entire population in MFWP Region 1. Estimates based on a high and low reference population estimates produce a possible range of 2784 to 5156 mountain lions statewide. Based on a range of possible survival rates we estimated the mountain lion population in Montana to be stable to slightly increasing between 2005 and 2010 with lambda ranging from 0.999 (SD = 0.05) to 1.02 (SD = 0.03). We believe these population growth rates to be a conservative estimate of true population growth. Our model suggests that proposed changes to female harvest quotas for 2013–2015 will result in an annual statewide population decline of 3% and shows that, due to reduced dispersal, changes to

  16. Subroutine library for error estimation of matrix computation (Ver. 1.0)

    International Nuclear Information System (INIS)

    Ichihara, Kiyoshi; Shizawa, Yoshihisa; Kishida, Norio

    1999-03-01

    'Subroutine Library for Error Estimation of Matrix Computation' is a subroutine library which helps users obtain error ranges for the solutions of linear systems or the eigenvalues of Hermitian matrices. The library contains routines for both sequential and parallel computers. The subroutines for linear system error estimation calculate norms of residual vectors, condition numbers of matrices, error bounds of solutions, and so on. The subroutines for error estimation of Hermitian matrix eigenvalues derive the error ranges of the eigenvalues according to the Korn-Kato formula. The test matrix generators supply matrices that appear in mathematical research, randomly generated matrices, and matrices that appear in application programs. This user's manual contains a brief mathematical background on error analysis in linear algebra and describes the usage of the subroutines. (author)
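
    The quantities such a library reports for linear systems (residual norms, condition numbers and the resulting error bounds) can be illustrated with a few NumPy calls, as in the generic sketch below; this is not the library's interface.

    ```python
    # A-posteriori error bound for a computed solution x_hat of A x = b:
    #   ||x - x_hat|| / ||x_hat||  <=  cond(A) * ||r|| / (||A|| * ||x_hat||),
    # with residual r = b - A @ x_hat.
    import numpy as np

    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    x_hat = np.linalg.solve(A, b)

    r = b - A @ x_hat
    cond = np.linalg.cond(A)
    bound = cond * np.linalg.norm(r) / (np.linalg.norm(A, 2) * np.linalg.norm(x_hat))

    print(f"residual norm = {np.linalg.norm(r):.2e}, cond(A) = {cond:.2f}, "
          f"relative error bound = {bound:.2e}")
    ```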

  17. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses

    Science.gov (United States)

    Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas

    2009-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools for communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results of the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual

  18. Fast Rescheduling of Multiple Workflows to Constrained Heterogeneous Resources Using Multi-Criteria Memetic Computing

    Directory of Open Access Journals (Sweden)

    Wolfgang Süß

    2013-04-01

    Full Text Available This paper is motivated by, but not limited to, the task of scheduling jobs organized in workflows to a computational grid. Due to the dynamic nature of grid computing, more or less permanent replanning is required so that only very limited time is available to come up with a revised plan. To meet the requirements of both users and resource owners, a multi-objective optimization comprising execution time and costs is needed. This paper summarizes our work over the last six years in this field, and reports new results obtained by the combination of heuristics and evolutionary search in an adaptive Memetic Algorithm. We will show how different heuristics contribute to solving varying replanning scenarios and investigate the question of the maximum manageable work load for a grid of growing size starting with a load of 200 jobs and 20 resources up to 7000 jobs and 700 resources. Furthermore, the effect of four different local searchers incorporated into the evolutionary search is studied. We will also report briefly on approaches that failed within the short time frame given for planning.
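
    As a rough illustration of the memetic idea summarized above (evolutionary search combined with local refinement of each offspring), the following sketch evolves job-to-resource assignments for a toy makespan-plus-cost objective. The encoding, objective weights and local search are invented stand-ins, not the paper's scheduling model or heuristics.

        # Generic memetic-algorithm skeleton (evolutionary search + local refinement).
        # Encoding and objective are hypothetical toys, not the paper's model.
        import random

        N_JOBS, N_RES = 30, 5
        random.seed(1)
        speed = [random.uniform(0.5, 2.0) for _ in range(N_RES)]   # resource speeds
        price = [random.uniform(1.0, 3.0) for _ in range(N_RES)]   # resource costs
        work  = [random.uniform(1.0, 5.0) for _ in range(N_JOBS)]  # job sizes

        def fitness(assign):
            """Weighted sum of makespan and total cost (to be minimized)."""
            load = [0.0] * N_RES
            cost = 0.0
            for j, r in enumerate(assign):
                load[r] += work[j] / speed[r]
                cost += work[j] * price[r]
            return max(load) + 0.1 * cost

        def local_search(assign, tries=20):
            """Hill-climbing 'meme': move single jobs while the fitness improves."""
            best = fitness(assign)
            for _ in range(tries):
                j, r = random.randrange(N_JOBS), random.randrange(N_RES)
                old = assign[j]
                assign[j] = r
                f = fitness(assign)
                if f < best:
                    best = f
                else:
                    assign[j] = old
            return assign

        pop = [[random.randrange(N_RES) for _ in range(N_JOBS)] for _ in range(30)]
        for gen in range(50):
            pop.sort(key=fitness)
            parents = pop[:10]
            children = []
            while len(children) < 20:
                a, b = random.sample(parents, 2)
                cut = random.randrange(N_JOBS)
                child = a[:cut] + b[cut:]                 # one-point crossover
                if random.random() < 0.3:                 # mutation
                    child[random.randrange(N_JOBS)] = random.randrange(N_RES)
                children.append(local_search(child))      # memetic step
            pop = parents + children
        print("best fitness:", round(fitness(min(pop, key=fitness)), 3))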

  19. Exploring Graphics Processing Unit (GPU) Resource Sharing Efficiency for High Performance Computing

    Directory of Open Access Journals (Sweden)

    Teng Li

    2013-11-01

    Full Text Available The increasing incorporation of Graphics Processing Units (GPUs) as accelerators has been one of the forefront High Performance Computing (HPC) trends and provides unprecedented performance; however, the prevalent adoption of the Single-Program Multiple-Data (SPMD) programming model brings with it challenges of resource underutilization. In other words, under SPMD, every CPU needs GPU capability available to it. However, since CPUs generally outnumber GPUs, the asymmetric resource distribution gives rise to overall computing resource underutilization. In this paper, we propose to efficiently share the GPU under SPMD and formally define a series of GPU sharing scenarios. We provide performance-modeling analysis for each sharing scenario with accurate experimentation validation. With the modeling basis, we further conduct experimental studies to explore potential GPU sharing efficiency improvements from multiple perspectives. Both further theoretical and experimental GPU sharing performance analysis and results are presented. Our results not only demonstrate the significant performance gain for SPMD programs with the proposed efficient GPU sharing, but also the further improved sharing efficiency with the optimization techniques based on our accurate modeling.

  20. A Safety Resource Allocation Mechanism against Connection Fault for Vehicular Cloud Computing

    Directory of Open Access Journals (Sweden)

    Tianpeng Ye

    2016-01-01

    Full Text Available The Intelligent Transportation System (ITS) becomes an important component of the smart city toward safer roads, better traffic control, and on-demand service by utilizing and processing the information collected from sensors of vehicles and road side infrastructure. In ITS, Vehicular Cloud Computing (VCC) is a novel technology balancing the requirement of complex services and the limited capability of on-board computers. However, the behaviors of the vehicles in VCC are dynamic, random, and complex. Thus, one of the key safety issues is the frequent disconnections between the vehicle and the Vehicular Cloud (VC) when this vehicle is computing for a service. More importantly, a connection fault will seriously disturb the normal services of VCC and affect the safety of transportation operations. In this paper, a safety resource allocation mechanism is proposed against connection fault in VCC by using a modified workflow with prediction capability. We firstly propose a probability model for vehicle movement which satisfies the high dynamics and real-time requirements of VCC. We then propose a Prediction-based Reliability Maximization Algorithm (PRMA) to realize the safety resource allocation for VCC. The evaluation shows that our mechanism can improve the reliability and guarantee the real-time performance of the VCC.

  1. Computable Error Estimates for Finite Element Approximations of Elliptic Partial Differential Equations with Rough Stochastic Data

    KAUST Repository

    Hall, Eric Joseph

    2016-12-08

    We derive computable error estimates for finite element approximations of linear elliptic partial differential equations with rough stochastic coefficients. In this setting, the exact solutions contain high frequency content that standard a posteriori error estimates fail to capture. We propose goal-oriented estimates, based on local error indicators, for the pathwise Galerkin and expected quadrature errors committed in standard, continuous, piecewise linear finite element approximations. Derived using easily validated assumptions, these novel estimates can be computed at a relatively low cost and have applications to subsurface flow problems in geophysics where the conductivities are assumed to have lognormal distributions with low regularity. Our theory is supported by numerical experiments on test problems in one and two dimensions.

  2. Computational Package for Copolymerization Reactivity Ratio Estimation: Improved Access to the Error-in-Variables-Model

    Directory of Open Access Journals (Sweden)

    Alison J. Scott

    2018-01-01

    Full Text Available The error-in-variables-model (EVM) is the most statistically correct non-linear parameter estimation technique for reactivity ratio estimation. However, many polymer researchers are unaware of the advantages of EVM and therefore still choose to use rather erroneous or approximate methods. The procedure is straightforward but it is often avoided because it is seen as mathematically and computationally intensive. Therefore, the goal of this work is to make EVM more accessible to all researchers through a series of focused case studies. All analyses employ a MATLAB-based computational package for copolymerization reactivity ratio estimation. The basis of the package is previous work in our group over many years. This version is an improvement, as it ensures wider compatibility and enhanced flexibility with respect to copolymerization parameter estimation scenarios that can be considered.

  3. Computationally fast estimation of muscle tension for realtime bio-feedback.

    Science.gov (United States)

    Murai, Akihiko; Kurosaki, Kosuke; Yamane, Katsu; Nakamura, Yoshihiko

    2009-01-01

    In this paper, we propose a method for realtime estimation of whole-body muscle tensions. The main problem of muscle tension estimation is that there is an infinite number of solutions to realize a particular joint torque due to the actuation redundancy. Numerical optimization techniques, e.g. quadratic programming, are often employed to obtain a unique solution, but they are usually computationally expensive. For example, our implementation of quadratic programming takes about 0.17 sec per frame on the musculoskeletal model with 274 elements, which is far from realtime computation. Here, we propose to reduce the computational cost by using EMG data and by reducing the number of unknowns in the optimization. First, we compute the tensions of muscles with surface EMG data based on biological muscle data, which is a very efficient process. We also assume that their synergists have the same activity levels and compute their tensions with the same model. Tensions of the remaining muscles are then computed using quadratic programming, but the number of unknowns is significantly reduced by assuming that the muscles in the same heteronymous group have the same activity level. The proposed method realizes realtime estimation and visualization of the whole-body muscle tensions that can be applied to sports training and rehabilitation.
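
    The reduced optimization step described above, distributing a required joint torque over a small set of unknown muscle tensions, amounts to a constrained quadratic program. The sketch below solves such a program with SciPy for a single joint; the moment arms, maximal tensions and torque target are invented numbers, and the general-purpose SLSQP solver stands in for the authors' implementation.

        # Minimal sketch of a torque-constrained muscle-tension quadratic program.
        # Moment arms, maximal forces and the torque target are invented numbers.
        import numpy as np
        from scipy.optimize import minimize

        R = np.array([[0.03, 0.025, -0.02, 0.04, -0.035]])      # moment arms (m), 1 joint x 5 muscles
        f_max = np.array([800.0, 600.0, 700.0, 900.0, 500.0])   # maximal tensions (N)
        tau = np.array([12.0])                                   # required joint torque (N*m)

        def effort(f):
            # Common criterion: sum of squared activations (tension / max tension)
            return np.sum((f / f_max) ** 2)

        cons = {"type": "eq", "fun": lambda f: R @ f - tau}      # torque balance
        bounds = [(0.0, fm) for fm in f_max]                     # non-negative, bounded tensions
        x0 = 0.1 * f_max

        res = minimize(effort, x0, bounds=bounds, constraints=cons)
        print("muscle tensions (N):", np.round(res.x, 1))
        print("achieved torque    :", (R @ res.x)[0])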

  4. Direction for the Estimation of Required Resources for Nuclear Power Plant Decommissioning based on BIM via Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Insu [Korea Institute of Construction Technology, Goyang (Korea, Republic of)]; Kim, Woojung [KHNP-Central Research Institute, Daejeon (Korea, Republic of)]

    2014-05-15

    Past approaches to estimating the resources required for decommissioning have involved great uncertainty, since they analyze required resources at the construction stage by analyzing and consulting the decommissioning resource requirements of overseas nuclear power plants. As demand for efficient management and use of complex construction information has increased in recent years, so has demand for the introduction of Building Information Modeling (hereinafter referred to as BIM) technology. In the area of quotation, considerable improvements are expected in the accuracy and reliability of predicted construction costs, because quantities can be estimated automatically using the attribute information of a BIM model. BIM-based estimation and quotation of required resources are more accurate than existing 2D-based quotations and have many advantages, such as reviews of constructability and interference. It is desirable to estimate the resources required for nuclear power plant decommissioning using BIM, together with tools that are compatible with common international and industrial standards. A review of cases in Korea and abroad in which required resources were estimated using BIM shows that they dealt broadly with the estimation of required resources, the estimation of construction cost, and process management. In each area, methodologies, classification systems, BIM, and realization tests have been used in various ways. Nonetheless, several problems have been reported; among them, it is noticeable that although a BIM standard classification system exists, no case was found that used the standard classification system. This means that no interlinking among OBS (Object Breakdown Structure), WBS (Work Breakdown Structure) and CBS (Cost Breakdown Structure) was possible. Thus, for nuclear power plant decommissioning, the decommissioning method, process, etc. shall be defined clearly at the stage of decommissioning strategy establishment, so that classification systems must be set up

  5. Direction for the Estimation of Required Resources for Nuclear Power Plant Decommissioning based on BIM via Case Study

    International Nuclear Information System (INIS)

    Jung, Insu; Kim, Woojung

    2014-01-01

    Past approaches to estimating the resources required for decommissioning have involved great uncertainty, since they analyze required resources at the construction stage by analyzing and consulting the decommissioning resource requirements of overseas nuclear power plants. As demand for efficient management and use of complex construction information has increased in recent years, so has demand for the introduction of Building Information Modeling (hereinafter referred to as BIM) technology. In the area of quotation, considerable improvements are expected in the accuracy and reliability of predicted construction costs, because quantities can be estimated automatically using the attribute information of a BIM model. BIM-based estimation and quotation of required resources are more accurate than existing 2D-based quotations and have many advantages, such as reviews of constructability and interference. It is desirable to estimate the resources required for nuclear power plant decommissioning using BIM, together with tools that are compatible with common international and industrial standards. A review of cases in Korea and abroad in which required resources were estimated using BIM shows that they dealt broadly with the estimation of required resources, the estimation of construction cost, and process management. In each area, methodologies, classification systems, BIM, and realization tests have been used in various ways. Nonetheless, several problems have been reported; among them, it is noticeable that although a BIM standard classification system exists, no case was found that used the standard classification system. This means that no interlinking among OBS (Object Breakdown Structure), WBS (Work Breakdown Structure) and CBS (Cost Breakdown Structure) was possible. Thus, for nuclear power plant decommissioning, the decommissioning method, process, etc. shall be defined clearly at the stage of decommissioning strategy establishment, so that classification systems must be set up

  6. Resources and Approaches for Teaching Quantitative and Computational Skills in the Geosciences and Allied Fields

    Science.gov (United States)

    Orr, C. H.; Mcfadden, R. R.; Manduca, C. A.; Kempler, L. A.

    2016-12-01

    Teaching with data, simulations, and models in the geosciences can increase many facets of student success in the classroom and in the workforce. Teaching undergraduates about programming and improving students' quantitative and computational skills expands their perception of Geoscience beyond field-based studies. Processing data and developing quantitative models are critically important for Geoscience students. Students need to be able to perform calculations, analyze data, create numerical models and visualizations, and more deeply understand complex systems—all essential aspects of modern science. These skills require students to have comfort and skill with languages and tools such as MATLAB. To achieve comfort and skill, computational and quantitative thinking must build over a 4-year degree program across courses and disciplines. However, in courses focused on Geoscience content it can be challenging to get students comfortable with using computational methods to answer Geoscience questions. To help bridge this gap, we have partnered with MathWorks to develop two workshops focused on collecting and developing strategies and resources to help faculty teach students to incorporate data, simulations, and models into the curriculum at the course and program levels. We brought together faculty members from the sciences, including Geoscience and allied fields, who teach computation and quantitative thinking skills using MATLAB, to build a resource collection for teaching. These materials and the outcomes of the workshops are freely available on our website. The workshop outcomes include a collection of teaching activities, essays, and course descriptions that can help faculty incorporate computational skills at the course or program level. The teaching activities include in-class assignments, problem sets, labs, projects, and toolboxes. These activities range from programming assignments to creating and using models. The outcomes also include workshop

  7. An Architecture of IoT Service Delegation and Resource Allocation Based on Collaboration between Fog and Cloud Computing

    Directory of Open Access Journals (Sweden)

    Aymen Abdullah Alsaffar

    2016-01-01

    Full Text Available Despite the wide utilization of cloud computing (e.g., services, applications, and resources), some services, applications, and smart devices are not able to fully benefit from this attractive cloud computing paradigm due to the following issues: (1) smart devices might be lacking in capacity (e.g., processing, memory, storage, battery, and resource allocation), (2) they might be lacking in network resources, and (3) the high network latency to a centralized server in the cloud might not be efficient for delay-sensitive applications, services, and resource allocation requests. Fog computing is a promising paradigm that can extend cloud resources to the edge of the network, solving the abovementioned issues. As a result, in this work, we propose an architecture of IoT service delegation and resource allocation based on collaboration between fog and cloud computing. We provide a new algorithm, consisting of the decision rules of a linearized decision tree based on three conditions (service size, completion time, and VM capacity), for managing and delegating user requests in order to balance workload. Moreover, we propose an algorithm to allocate resources to meet service level agreement (SLA) and quality of service (QoS) requirements, as well as optimizing big data distribution in fog and cloud computing. Our simulation results show that our proposed approach can efficiently balance workload, improve resource allocation efficiency, optimize big data distribution, and show better performance than other existing methods.
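
    The linearized decision-tree idea for delegation can be pictured as a handful of threshold rules over the three stated conditions (service size, completion time, VM capacity). The sketch below is hypothetical: the thresholds, field names and rule order are illustrative assumptions, not the rules or SLA parameters from the paper.

        # Hypothetical threshold-based delegation rules in the spirit of the
        # "linearized decision tree" described above (all thresholds are invented).
        from dataclasses import dataclass

        @dataclass
        class Request:
            size_mb: float        # service/data size
            deadline_s: float     # required completion time
            fog_vm_free: int      # free VM slots at the fog node

        def delegate(req: Request) -> str:
            """Return 'fog' or 'cloud' for a user request."""
            if req.deadline_s < 1.0:            # delay-sensitive -> keep at the edge
                return "fog" if req.fog_vm_free > 0 else "cloud"
            if req.size_mb > 500:               # big-data jobs -> central cloud
                return "cloud"
            if req.fog_vm_free == 0:            # fog overloaded -> offload
                return "cloud"
            return "fog"

        print(delegate(Request(size_mb=40, deadline_s=0.2, fog_vm_free=3)))    # fog
        print(delegate(Request(size_mb=900, deadline_s=10.0, fog_vm_free=3)))  # cloud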

  8. Estimation of numerical uncertainty in computational fluid dynamics simulations of a passively controlled wave energy converter

    DEFF Research Database (Denmark)

    Wang, Weizhi; Wu, Minghao; Palm, Johannes

    2018-01-01

    The wave loads and the resulting motions of floating wave energy converters are traditionally computed using linear radiation–diffraction methods. Yet for certain cases such as survival conditions, phase control and wave energy converters operating in the resonance region, more complete mathematical models such as computational fluid dynamics are preferred and over the last 5 years, computational fluid dynamics has become more frequently used in the wave energy field. However, rigorous estimation of numerical errors, convergence rates and uncertainties associated with computational fluid... for almost linear incident waves. First, we show that the computational fluid dynamics simulations have acceptable agreement to experimental data. We then present a verification and validation study focusing on the solution verification covering spatial and temporal discretization, iterative and domain...

  9. Estimation of distribution algorithm for resource allocation in green cooperative cognitive radio sensor networks.

    Science.gov (United States)

    Naeem, Muhammad; Pareek, Udit; Lee, Daniel C; Anpalagan, Alagan

    2013-04-12

    Due to the rapid increase in the usage and demand of wireless sensor networks (WSN), the limited frequency spectrum available for WSN applications will be extremely crowded in the near future. More sensor devices also mean more recharging/replacement of batteries, which will cause significant impact on the global carbon footprint. In this paper, we propose a relay-assisted cognitive radio sensor network (CRSN) that allocates communication resources in an environmentally friendly manner. We use shared band amplify and forward relaying for cooperative communication in the proposed CRSN. We present a multi-objective optimization architecture for resource allocation in a green cooperative cognitive radio sensor network (GC-CRSN). The proposed multi-objective framework jointly performs relay assignment and power allocation in GC-CRSN, while optimizing two conflicting objectives. The first objective is to maximize the total throughput, and the second objective is to minimize the total transmission power of CRSN. The proposed relay assignment and power allocation problem is a non-convex mixed-integer non-linear optimization problem (NC-MINLP), which is generally non-deterministic polynomial-time (NP)-hard. We introduce a hybrid heuristic algorithm for this problem. The hybrid heuristic includes an estimation-of-distribution algorithm (EDA) for performing power allocation and iterative greedy schemes for constraint satisfaction and relay assignment. We analyze the throughput and power consumption tradeoff in GC-CRSN. A detailed analysis of the performance of the proposed algorithm is presented with the simulation results.
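
    A continuous estimation-of-distribution algorithm of the kind referred to here repeatedly samples candidate power allocations from a probability model, selects the best samples, and refits the model to them. The following sketch runs a Gaussian (UMDA-style) EDA on a toy throughput-minus-power objective; the CRSN constraints, relay assignment and greedy repair steps from the paper are not modeled.

        # Generic continuous estimation-of-distribution algorithm (Gaussian, UMDA-style),
        # maximizing a toy throughput-vs-power objective.  The paper's relay assignment
        # and constraint-repair schemes are not modeled here.
        import numpy as np

        rng = np.random.default_rng(42)
        n_links, p_max = 8, 2.0
        gain = rng.uniform(0.5, 2.0, n_links)        # toy channel gains

        def objective(p):
            """Sum-rate reward minus a penalty on total transmit power."""
            rate = np.sum(np.log2(1.0 + gain * p))
            return rate - 0.5 * np.sum(p)

        mu = np.full(n_links, p_max / 2)             # initial Gaussian model
        sigma = np.full(n_links, p_max / 2)
        for it in range(60):
            pop = rng.normal(mu, sigma, size=(100, n_links)).clip(0.0, p_max)
            scores = np.array([objective(p) for p in pop])
            elite = pop[np.argsort(scores)[-20:]]    # keep the best 20 samples
            mu = elite.mean(axis=0)                  # refit the distribution
            sigma = elite.std(axis=0) + 1e-3
        print("best allocation:", np.round(mu, 2))
        print("objective      :", round(objective(mu), 3))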

  10. Management of water resources and low flow estimation for the Himalayan basins of Nepal

    Science.gov (United States)

    Chalise, Suresh R.; Kansakar, Sunil R.; Rees, Gwyn; Croker, Karen; Zaidman, Maxine

    2003-11-01

    Reliable estimates of low flow are extremely important for the monsoonal mountainous areas of the Hindu Kush-Himalayan (HKH) region, as people in this region face growing problems with water during dry periods in terms of both quality and quantity. Furthermore, increasing evidence of a decrease in snow cover and the retreat of glaciers due to global warming has been reported from various parts of the HKH, which has serious implications for low flow in the region. However, reliable methods for low flow estimation are difficult to find, and the lack of hydrometeorological data has inhibited the development of such methods. This has posed serious problems for the sustainable management of water resources systems in the region in view of the difficulties in low flow estimation. This paper discusses a method for low flow estimation in the mountainous regions of Nepal, which shares with other neighbouring areas the complexity of the terrain and climate, an inadequate hydro-meteorological network and insufficient long-term reliable data on hydrometeorology. The method discussed is based on a method developed in the UK for estimating the hydrological regime at ungauged sites. Regionalization of the flow has been developed by applying multi-variate regression analysis of long-term hydro-meteorological data and catchment characteristics. A number of standardised flow duration type curves have been determined, and regression models of the flows and the topographical and geological characteristics of the catchments were established. The mean flow was estimated using the water balance principle, where the long-term mean annual runoff is the difference between long-term average annual precipitation and long-term actual evapotranspiration. Regionalization has been carried out by developing grids of hydrological response. The grids allow the flow regime to be estimated at any point on any stream in the mountainous region of the country. This is the first regional method of this kind to be developed
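
    The water-balance step described above reduces to simple arithmetic: mean annual runoff is long-term precipitation minus actual evapotranspiration, which, multiplied by catchment area, gives a long-term mean flow. A short sketch with illustrative catchment values (assumed numbers, not data from the study):

        # Water-balance estimate of long-term mean flow (illustrative numbers only):
        # runoff depth = precipitation - actual evapotranspiration, scaled by area.
        annual_precip_mm = 1800.0          # long-term mean annual precipitation
        annual_aet_mm = 600.0              # long-term actual evapotranspiration
        area_km2 = 350.0                   # catchment area

        runoff_mm = annual_precip_mm - annual_aet_mm
        volume_m3 = runoff_mm / 1000.0 * area_km2 * 1e6           # annual runoff volume
        mean_flow_m3s = volume_m3 / (365.25 * 24 * 3600)          # long-term mean flow
        print(f"mean annual runoff: {runoff_mm:.0f} mm")
        print(f"mean flow         : {mean_flow_m3s:.2f} m^3/s")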

  11. The model of localized business community economic development under limited financial resources: computer model and experiment

    Directory of Open Access Journals (Sweden)

    Berg Dmitry

    2016-01-01

    Full Text Available Globalization processes now affect and are affected by most organizations, different types of resources, and the natural environment. One of the main restrictions initiated by these processes is financial: money turnover in global markets leads to its concentration in certain financial centers, while local business communities suffer from a lack of money. This work discusses the advantages of introducing a complementary currency into a local economy. Computer simulation with the engineered program model and a real economic experiment showed that the complementary currency does not compete with the traditional currency; furthermore, it acts in compliance with it, providing conditions for sustainable business community development.

  12. 75 FR 63724 - Raisins Produced From Grapes Grown in California; Use of Estimated Trade Demand To Compute Volume...

    Science.gov (United States)

    2010-10-18

    ... California; Use of Estimated Trade Demand To Compute Volume Regulation Percentages AGENCY: Agricultural... an estimated trade demand for the 2010-11 crop NS raisins to compute volume regulation percentages... establishment of an estimated trade demand is necessary to ensure that volume regulation is established for the...

  13. RSSI-Based Distance Estimation Framework Using a Kalman Filter for Sustainable Indoor Computing Environments

    Directory of Open Access Journals (Sweden)

    Yunsick Sung

    2016-11-01

    Full Text Available Given that location information is the key to providing a variety of services in sustainable indoor computing environments, it is required to obtain accurate locations. Locations can be estimated from three distances to three fixed points. Therefore, if the distance between two points can be measured or estimated accurately, the location in indoor environments can be estimated. To increase the accuracy of the measured distance, noise filtering, signal revision, and distance estimation processes are generally performed. This paper proposes a novel framework for estimating the distance between a beacon and an access point (AP) in a sustainable indoor computing environment. Diverse types of received signal strength indications (RSSIs) are used for WiFi, Bluetooth, and radio signals, and the proposed distance estimation framework is unique in that it is independent of the specific wireless signal involved, being based on the Bluetooth signal of the beacon. Generally, RSSI measurement, noise filtering, and revision are required for distance estimation using RSSIs. The employed RSSIs are first measured from an AP, with multiple APs sometimes used to increase the accuracy of the distance estimation. Owing to the inevitable presence of noise in the measured RSSIs, the application of noise filtering is essential, and further revision is used to address the inaccuracy and instability that characterizes RSSIs measured in an indoor environment. The revised RSSIs are then used to estimate the distance. The proposed distance estimation framework uses one AP to measure the RSSIs, a Kalman filter to eliminate noise, and a log-distance path loss model to revise the measured RSSIs. In the experimental implementation of the framework, an RSSI filter and a Kalman filter were each used for noise elimination to comparatively evaluate the performance of the latter for the specific application. The Kalman filter was found to reduce the accumulated errors by 8
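
    The processing chain described here, smoothing the RSSI stream with a Kalman filter and then inverting a log-distance path-loss model to obtain a distance, can be sketched as follows. The reference RSSI, path-loss exponent and noise settings are assumed values for illustration, not the paper's calibration.

        # Sketch of the RSSI -> distance pipeline: a 1-D Kalman filter smooths the noisy
        # RSSI stream, then a log-distance path-loss model converts the filtered RSSI
        # to a distance.  Model parameters are assumed, not calibrated.
        import math
        import random

        TX_POWER = -59.0      # assumed RSSI (dBm) at the 1 m reference distance
        PATH_LOSS_N = 2.0     # assumed path-loss exponent for an indoor environment

        def rssi_to_distance(rssi):
            """Invert the log-distance model: rssi = TX_POWER - 10*n*log10(d)."""
            return 10 ** ((TX_POWER - rssi) / (10.0 * PATH_LOSS_N))

        class Kalman1D:
            """Scalar Kalman filter with a random-walk state model."""
            def __init__(self, q=0.05, r=4.0):
                self.q, self.r = q, r            # process / measurement noise variances
                self.x, self.p = None, 1.0
            def update(self, z):
                if self.x is None:
                    self.x = z
                    return self.x
                self.p += self.q                 # predict
                k = self.p / (self.p + self.r)   # Kalman gain
                self.x += k * (z - self.x)       # correct
                self.p *= (1.0 - k)
                return self.x

        random.seed(0)
        true_d = 3.0
        true_rssi = TX_POWER - 10 * PATH_LOSS_N * math.log10(true_d)
        kf = Kalman1D()
        for _ in range(50):
            measured = true_rssi + random.gauss(0.0, 4.0)   # noisy RSSI sample
            filtered = kf.update(measured)
        print(f"estimated distance: {rssi_to_distance(filtered):.2f} m (true {true_d} m)")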

  14. The Usage of Informal Computer Based Communication in the Context of Organization’s Technological Resources

    Directory of Open Access Journals (Sweden)

    Agota Giedrė Raišienė

    2013-08-01

    Full Text Available The purpose of the article is to theoretically and practically analyze the features of informal computer based communication in the context of an organization’s technological resources. Methodology—meta analysis, survey and descriptive analysis. Findings. According to scientists, the functions of informal communication cover sharing of work related information, coordination of team activities, spread of organizational culture and feeling of interdependence and affinity. Also, informal communication widens the individuals’ recognition of reality, creates a general context of environment between talkers, and strengthens interpersonal attraction. For these reasons, informal communication is desirable and even necessary in organizations because it helps to ensure efficient functioning of the enterprise. However, communicating through electronic channels suppresses informal connections or addresses them to the outside of the organization. So, electronic communication is not beneficial for developing ties in the informal organizational network. The empirical research showed that a significant part of courts administration staff is prone to use the technological resources of their office for informal communication. Representatives of courts administration choose friends for computer based communication much more often than colleagues (72% and 63%, respectively). 93% of the research respondents use an additional e-mail box serviced by commercial providers for non-work communication. The high intensity of informal electronic communication with friends and familiars shows that workers of court administration are used to meeting their psycho-emotional needs outside the work place. The survey confirmed the conclusion of the theoretical analysis: computer based communication is not beneficial for developing informal contacts between workers. In order for informal communication to carry out its functions and for the technological resources of the organization to be used effectively, staff

  15. The Usage of Informal Computer Based Communication in the Context of Organization’s Technological Resources

    Directory of Open Access Journals (Sweden)

    Steponas Jonušauskas

    2011-12-01

    Full Text Available The purpose of the article is to theoretically and practically analyze the features of informal computer based communication in the context of an organization’s technological resources. Methodology—meta analysis, survey and descriptive analysis. Findings. According to scientists, the functions of informal communication cover sharing of work related information, coordination of team activities, spread of organizational culture and feeling of interdependence and affinity. Also, informal communication widens the individuals’ recognition of reality, creates a general context of environment between talkers, and strengthens interpersonal attraction. For these reasons, informal communication is desirable and even necessary in organizations because it helps to ensure efficient functioning of the enterprise. However, communicating through electronic channels suppresses informal connections or addresses them to the outside of the organization. So, electronic communication is not beneficial for developing ties in the informal organizational network. The empirical research showed that a significant part of courts administration staff is prone to use the technological resources of their office for informal communication. Representatives of courts administration choose friends for computer based communication much more often than colleagues (72% and 63%, respectively). 93% of the research respondents use an additional e-mail box serviced by commercial providers for non-work communication. The high intensity of informal electronic communication with friends and familiars shows that workers of court administration are used to meeting their psycho-emotional needs outside the work place. The survey confirmed the conclusion of the theoretical analysis: computer based communication is not beneficial for developing informal contacts between workers. In order for informal communication to carry out its functions and for the technological resources of the organization to be used effectively, staff

  16. The Support Reduction Algorithm for Computing Non-Parametric Function Estimates in Mixture Models

    OpenAIRE

    GROENEBOOM, PIET; JONGBLOED, GEURT; WELLNER, JON A.

    2008-01-01

    In this paper, we study an algorithm (which we call the support reduction algorithm) that can be used to compute non-parametric M-estimators in mixture models. The algorithm is compared with natural competitors in the context of convex regression and the ‘Aspect problem’ in quantum physics.

  17. Using Computed Tomography Scans and Patient Demographic Data to Estimate Thoracic Epidural Space Depth

    Directory of Open Access Journals (Sweden)

    Alyssa Kosturakis

    2015-01-01

    Full Text Available Background and Objectives. Previous studies have used varying methods to estimate the depth of the epidural space prior to placement of an epidural catheter. We aim to use computed tomography scans, patient demographics, and vertebral level to estimate the depth of the loss of resistance for placement of thoracic epidural catheters. Methods. The records of consecutive patients who received a thoracic epidural catheter were reviewed. Patient demographics, epidural placement site, and technique were collected. Preoperative computed tomography scans were reviewed to measure the skin to epidural space distance. Linear regression was used for a multivariate analysis. Results. The records of 218 patients were reviewed. The mean loss of resistance measurement was significantly larger than the mean computed tomography epidural space depth measurement by 0.79 cm (p<0.001). Our final multivariate model, adjusted for demographic and epidural technique, showed a positive correlation between the loss of resistance and the computed tomography epidural space depth measurement (R2=0.5692, p<0.0001). Conclusions. The measured loss of resistance is positively correlated with the computed tomography epidural space depth measurement and patient demographics. For patients undergoing thoracic or abdominal surgery, estimating the loss of resistance can be a valuable tool.

  18. Error Estimates of the Ares I Computed Turbulent Ascent Longitudinal Aerodynamic Analysis

    Science.gov (United States)

    Abdol-Hamid, Khaled S.; Ghaffari, Farhad

    2012-01-01

    Numerical predictions of the longitudinal aerodynamic characteristics for the Ares I class of vehicles, along with the associated error estimate derived from an iterative convergence grid refinement, are presented. Computational results are based on an unstructured grid, Reynolds-averaged Navier-Stokes analysis. The validity of the approach to compute the associated error estimates, derived from a base grid to an extrapolated infinite-size grid, was first demonstrated on a sub-scaled wind tunnel model at representative ascent flow conditions for which the experimental data existed. Such analysis at the transonic flow conditions revealed a maximum deviation of about 23% between the computed longitudinal aerodynamic coefficients with the base grid and the measured data across the entire range of roll angles. This maximum deviation from the wind tunnel data was associated with the computed normal force coefficient at the transonic flow condition and was reduced to approximately 16% based on the infinite-size grid. However, all the computed aerodynamic coefficients with the base grid at the supersonic flow conditions showed a maximum deviation of only about 8%, with that level being improved to approximately 5% for the infinite-size grid. The results and the error estimates based on the established procedure are also presented for the flight flow conditions.
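
    Grid-refinement error estimates of this general type are commonly obtained by Richardson extrapolation: solutions on three systematically refined grids yield an observed order of convergence and an extrapolated "infinite-grid" value. The sketch below shows the arithmetic with invented coefficient values; these are not the Ares I data.

        # Richardson extrapolation from three systematically refined grids (refinement
        # ratio r).  The coefficient values are invented, not the Ares I results.
        import math

        r = 2.0                       # grid refinement ratio
        f_coarse, f_medium, f_fine = 1.180, 1.232, 1.251   # e.g. a force coefficient

        # Observed order of convergence p and extrapolated infinite-grid value
        p = math.log(abs(f_medium - f_coarse) / abs(f_fine - f_medium)) / math.log(r)
        f_exact = f_fine + (f_fine - f_medium) / (r**p - 1.0)
        error_est = abs(f_exact - f_fine) / abs(f_exact)    # relative error of fine grid

        print(f"observed order      : {p:.2f}")
        print(f"extrapolated value  : {f_exact:.4f}")
        print(f"fine-grid error est.: {100 * error_est:.2f}%")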

  19. Integrated State Estimation and Contingency Analysis Software Implementation using High Performance Computing Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt R.; Rice, Mark J.; Huang, Zhenyu

    2015-12-31

    Power system simulation tools are traditionally developed in sequential mode and codes are optimized for single core computing only. However, the increasing complexity in the power grid models requires more intensive computation. The traditional simulation tools will soon not be able to meet the grid operation requirements. Therefore, power system simulation tools need to evolve accordingly to provide faster and better results for grid operations. This paper presents an integrated state estimation and contingency analysis software implementation using high performance computing techniques. The software is able to solve large size state estimation problems within one second and achieve a near-linear speedup of 9,800 with 10,000 cores for contingency analysis application. The performance evaluation is presented to show its effectiveness.
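
    The core computation behind a power-system state estimator is typically a weighted least-squares fit of the state to redundant measurements. The sketch below shows that textbook formulation on a tiny linear (DC) three-bus example with made-up measurements; the paper's HPC implementation and contingency analysis are not reproduced.

        # Weighted-least-squares state estimation on a tiny DC 3-bus example with
        # made-up measurements; the HPC implementation in the record is not shown.
        import numpy as np

        # State: voltage angles at buses 2 and 3 (bus 1 is the reference, angle 0).
        x_true = np.array([-0.05, -0.10])

        # Linearized measurement model z = H x + e for line flows and one injection,
        # assuming every line has reactance 0.1 p.u. (susceptance 10).
        H = np.array([[-10.0,   0.0],    # flow 1-2
                      [  0.0, -10.0],    # flow 1-3
                      [ 10.0, -10.0],    # flow 2-3
                      [ 20.0, -10.0]])   # injection at bus 2
        sigma = np.array([0.01, 0.01, 0.02, 0.02])          # measurement std devs
        rng = np.random.default_rng(7)
        z = H @ x_true + rng.normal(0.0, sigma)              # noisy measurements

        # WLS solution: minimize (z - Hx)^T W (z - Hx) with W = diag(1/sigma^2)
        W = np.diag(1.0 / sigma**2)
        G = H.T @ W @ H                                      # gain matrix
        x_hat = np.linalg.solve(G, H.T @ W @ z)
        print("estimated angles (rad):", np.round(x_hat, 4))
        print("true angles      (rad):", x_true)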

  20. Estimating the Impact of Drought on Groundwater Resources of the Marshall Islands

    Directory of Open Access Journals (Sweden)

    Brandon L. Barkey

    2017-01-01

    Full Text Available Groundwater resources of small coral islands are threatened due to short-term and long-term changes in climate. A significant short-term threat is El Niño events, which typically induce a severe months-long drought for many atoll nations in the western and central Pacific regions that exhausts rainwater supply and necessitates the use of groundwater. This study quantifies fresh groundwater resources under both average rainfall and drought conditions for the Republic of Marshall Islands (RMI), a nation composed solely of atolls and which is severely impacted by El Niño droughts. The atoll island algebraic model is used to estimate the thickness of the freshwater lens for 680 inhabited and uninhabited islands of the RMI, with a focus on the severe 1998 drought. The model accounts for precipitation, island width, hydraulic conductivity of the upper Holocene-age sand aquifer, the depth to the contact between the Holocene aquifer and the lower Pleistocene-age limestone aquifer, and the presence of a reef flat plate underlying the ocean side of the island. Model results are tested for islands that have fresh groundwater data. Results highlight the fragility of groundwater resources for the nation. Average lens thickness during typical seasonal rainfall is approximately 4 m, with only 30% of the islands maintaining a lens thicker than 4.5 m and 55% of the islands with a lens less than 2.5 m thick. Thicker lenses typically occur for larger islands, islands located on the leeward side of an atoll due to lower hydraulic conductivity, and islands located in the southern region of the RMI due to higher rainfall rates. During drought, groundwater on small islands (<300 m in width) is completely depleted. Over half (54%) of the islands are classified as “Highly Vulnerable” to drought. Results provide valuable information for RMI water resources planners, particularly during the current 2016 El Niño drought, and similar methods can be used to quantify

  1. Hemoglobin estimation by the HemoCue® portable hemoglobin photometer in a resource poor setting.

    Science.gov (United States)

    Nkrumah, Bernard; Nguah, Samuel Blay; Sarpong, Nimako; Dekker, Denise; Idriss, Ali; May, Juergen; Adu-Sarkodie, Yaw

    2011-04-21

    In resource poor settings where automated hematology analyzers are not available, the Cyanmethemoglobin method is often used. This method, though cheaper, takes more time. In blood donations, the semi-quantitative gravimetric copper sulfate method, which is very easy and inexpensive, may be used but does not provide an acceptable degree of accuracy. The HemoCue® hemoglobin photometer has been used for these purposes. This study was conducted to generate data to support or refute its use as a point-of-care device for hemoglobin estimation in mobile blood donations and critical care areas in health facilities. EDTA blood was collected from study participants drawn from five groups: pre-school children, school children, pregnant women, non-pregnant women and men. Blood collected was immediately processed to estimate the hemoglobin concentration using three different methods (HemoCue®, Sysmex KX21N and Cyanmethemoglobin). Agreement between the test methods was assessed by the method of Bland and Altman. The Intraclass correlation coefficient (ICC) was used to determine the within subject variability of measured hemoglobin. Of 398 subjects, 42% were males with the overall mean age being 19.4 years. The overall mean hemoglobin as estimated by each method was 10.4 g/dl for HemoCue, 10.3 g/dl for Sysmex KX21N and 10.3 g/dl for Cyanmethemoglobin. Pairwise analysis revealed that the hemoglobin determined by the HemoCue method was higher than that measured by the KX21N and Cyanmethemoglobin. Comparing the hemoglobin determined by the HemoCue to Cyanmethemoglobin, the concordance correlation coefficient was 0.995 (95% CI: 0.994-0.996, p < 0.001). The Bland and Altman limit of agreement was -0.389 to 0.644 g/dl with the mean difference being 0.127 (95% CI: 0.102-0.153) and a non-significant difference in variability between the two measurements (p = 0.843). After adjusting to assess the effect of other possible confounders such as sex, age and category of person, there was no
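
    The Bland and Altman agreement analysis used here reduces to the mean difference between paired measurements and the limits of agreement at plus or minus 1.96 standard deviations of the differences. A short sketch with simulated pairs (not the study's data):

        # Bland-Altman style agreement summary for paired hemoglobin measurements.
        # The paired values are simulated for illustration, not the study's data.
        import numpy as np

        rng = np.random.default_rng(3)
        true_hb = rng.uniform(7.0, 15.0, 200)                    # "true" hemoglobin (g/dl)
        method_a = true_hb + rng.normal(0.13, 0.18, 200)         # e.g. HemoCue-like readings
        method_b = true_hb + rng.normal(0.00, 0.18, 200)         # reference method

        diff = method_a - method_b
        mean_diff = diff.mean()
        loa_low  = mean_diff - 1.96 * diff.std(ddof=1)           # lower limit of agreement
        loa_high = mean_diff + 1.96 * diff.std(ddof=1)           # upper limit of agreement
        print(f"mean difference    : {mean_diff:.3f} g/dl")
        print(f"limits of agreement: {loa_low:.3f} to {loa_high:.3f} g/dl")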

  2. Hemoglobin estimation by the HemoCue® portable hemoglobin photometer in a resource poor setting

    Directory of Open Access Journals (Sweden)

    Idriss Ali

    2011-04-01

    Full Text Available Abstract Background In resource poor settings where automated hematology analyzers are not available, the Cyanmethemoglobin method is often used. This method, though cheaper, takes more time. In blood donations, the semi-quantitative gravimetric copper sulfate method, which is very easy and inexpensive, may be used but does not provide an acceptable degree of accuracy. The HemoCue® hemoglobin photometer has been used for these purposes. This study was conducted to generate data to support or refute its use as a point-of-care device for hemoglobin estimation in mobile blood donations and critical care areas in health facilities. Method EDTA blood was collected from study participants drawn from five groups: pre-school children, school children, pregnant women, non-pregnant women and men. Blood collected was immediately processed to estimate the hemoglobin concentration using three different methods (HemoCue®, Sysmex KX21N and Cyanmethemoglobin). Agreement between the test methods was assessed by the method of Bland and Altman. The Intraclass correlation coefficient (ICC) was used to determine the within subject variability of measured hemoglobin. Results Of 398 subjects, 42% were males with the overall mean age being 19.4 years. The overall mean hemoglobin as estimated by each method was 10.4 g/dl for HemoCue, 10.3 g/dl for Sysmex KX21N and 10.3 g/dl for Cyanmethemoglobin. Pairwise analysis revealed that the hemoglobin determined by the HemoCue method was higher than that measured by the KX21N and Cyanmethemoglobin. Comparing the hemoglobin determined by the HemoCue to Cyanmethemoglobin, the concordance correlation coefficient was 0.995 (95% CI: 0.994-0.996, p < 0.001). Conclusion Hemoglobin determined by the HemoCue method is comparable to that determined by the other methods. The HemoCue photometer is therefore recommended for use as an on-the-spot device for determining hemoglobin in resource poor settings.

  3. Validation and enhancement of a computable medication indication resource (MEDI) using a large practice-based dataset.

    Science.gov (United States)

    Wei, Wei-Qi; Mosley, Jonathan D; Bastarache, Lisa; Denny, Joshua C

    2013-01-01

    Linking medications with their indications is important for clinical care and research. We have recently developed a freely-available, computable medication-indication resource, called MEDI, which links RxNorm medications to indications mapped to ICD9 codes. In this paper, we identified the medications and diagnoses for 1.3 million individuals at Vanderbilt University Medical Center to evaluate the medication coverage of MEDI and then to calculate the prevalence for each indication for each medication. Our results demonstrated MEDI covered 97.3% of medications recorded in medical records. The "high precision subset" of MEDI covered 93.8% of recorded medications. No significant prescription drugs were missed by MEDI. Manual physician review of random patient records for four example medications found that the MEDI covered the observed indications, and confirmed the estimated prevalence of these medications using practice information. Indication prevalence information for each medication, previously unavailable in other public resources, may improve the clinical usability of MEDI. We believe MEDI will be useful for both clinical informatics and to aid in recognition of phenotypes for electronic medical record-based research.

  4. Computation of nonlinear least squares estimator and maximum likelihood using principles in matrix calculus

    Science.gov (United States)

    Mahaboob, B.; Venkateswarlu, B.; Sankar, J. Ravi; Balasiddamuni, P.

    2017-11-01

    This paper uses matrix calculus techniques to obtain the Nonlinear Least Squares Estimator (NLSE), the Maximum Likelihood Estimator (MLE) and a linear pseudo model for the nonlinear regression model. David Pollard and Peter Radchenko [1] explained analytic techniques to compute the NLSE. However, the present research paper introduces an innovative method to compute the NLSE using principles in multivariate calculus. This study is concerned with very new optimization techniques used to compute the MLE and NLSE. Anh [2] derived the NLSE and MLE of a heteroscedastic regression model. Lemcoff [3] discussed a procedure to get a linear pseudo model for a nonlinear regression model. In this research article a new technique is developed to get the linear pseudo model for the nonlinear regression model using multivariate calculus. The linear pseudo model of Edmond Malinvaud [4] has been explained in a very different way in this paper. David Pollard et al. used empirical process techniques to study the asymptotics of the LSE (least-squares estimation) for the fitting of nonlinear regression functions in 2006. In Jae Myung [13] provided a good conceptual overview of maximum likelihood estimation in his work "Tutorial on maximum likelihood estimation".
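
    However it is derived, a nonlinear least-squares estimator is usually computed iteratively. The sketch below runs a plain Gauss-Newton loop on a toy exponential regression model; the paper's matrix-calculus derivations and its linear pseudo model are not reproduced here.

        # Plain Gauss-Newton iteration for a toy nonlinear regression y = a*exp(b*t),
        # illustrating how an NLSE is computed in practice.
        import numpy as np

        rng = np.random.default_rng(1)
        t = np.linspace(0.0, 2.0, 40)
        a_true, b_true = 2.5, -1.2
        y = a_true * np.exp(b_true * t) + rng.normal(0.0, 0.05, t.size)

        theta = np.array([1.0, -0.5])                     # initial guess (a, b)
        for _ in range(20):
            a, b = theta
            f = a * np.exp(b * t)                         # model prediction
            r = y - f                                     # residuals
            J = np.column_stack([np.exp(b * t),           # d f / d a
                                 a * t * np.exp(b * t)])  # d f / d b
            step = np.linalg.lstsq(J, r, rcond=None)[0]
            theta = theta + step
            if np.linalg.norm(step) < 1e-10:
                break
        print("estimated (a, b):", np.round(theta, 3))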

  5. Computationally Efficient 2D DOA Estimation with Uniform Rectangular Array in Low-Grazing Angle

    Directory of Open Access Journals (Sweden)

    Junpeng Shi

    2017-02-01

    Full Text Available In this paper, we propose a computationally efficient spatial differencing matrix set (SDMS) method for two-dimensional direction of arrival (2D DOA) estimation with uniform rectangular arrays (URAs) in a low-grazing angle (LGA) condition. By rearranging the auto-correlation and cross-correlation matrices in turn among different subarrays, the SDMS method can estimate the two parameters independently with one-dimensional (1D) subspace-based estimation techniques, where we only perform difference for auto-correlation matrices and the cross-correlation matrices are kept completely. Then, the pair-matching of two parameters is achieved by extracting the diagonal elements of URA. Thus, the proposed method can decrease the computational complexity, suppress the effect of additive noise and also have little information loss. Simulation results show that, in LGA, compared to other methods, the proposed methods can achieve performance improvement in the white or colored noise conditions.

  6. Estimation of intermediate-grade uranium resources II. Proposed method for estimating intermediate-grade uranium resources in roll-front deposits. Final report

    International Nuclear Information System (INIS)

    Lambie, F.W.; Yee, S.N.

    1981-09-01

    The purpose of this and a previous project was to examine the feasibility of estimating intermediate grade uranium (0.01 to 0.05% U3O8) on the basis of existing, sparsely drilled holes. All data are from the Powder River Basin in Wyoming. DOE makes preliminary estimates of endowment by calculating an Average Area of Influence (AAI) based on densely drilled areas, multiplying that by the thickness of the mineralization and then dividing by a tonnage factor. The resulting tonnage of ore is then multiplied by the average grade of the interval to obtain the estimate of U3O8 tonnage. Total endowment is the sum of these values over all mineralized intervals in all wells in the area. In regions where wells are densely drilled and approximately regularly spaced, this technique approaches the classical polygonal estimation technique used to estimate ore reserves and should be fairly reliable. The method is conservative because: (1) in sparsely drilled regions a large fraction of the area is not considered to contribute to endowment; (2) there is a bias created by the different distributions of point grades and mining block grades. A conservative approach may be justified for purposes of ore reserve estimation, where large investments may hinge on local forecasts. But for estimates of endowment over areas as large as 1° by 2° quadrangles, or the nation as a whole, errors in local predictions are not critical as long as they tend to cancel, and a less conservative estimation approach may be justified. One candidate, developed for this study and described here, is called the contoured thickness technique. A comparison of estimates based on the contoured thickness approach with DOE calculations for five areas of Wyoming roll-fronts in the Powder River Basin is presented. The sensitivity of the technique to well density is examined and the question of predicting intermediate grade endowment from data on higher grades is discussed

  7. Usefulness of an enhanced Kitaev phase-estimation algorithm in quantum metrology and computation

    Science.gov (United States)

    Kaftal, Tomasz; Demkowicz-Dobrzański, Rafał

    2014-12-01

    We analyze the performance of a generalized Kitaev's phase-estimation algorithm where N phase gates, acting on M qubits prepared in a product state, may be distributed in an arbitrary way. Unlike the standard algorithm, where the mean square error scales as 1/N, the optimal generalizations offer the Heisenberg 1/N^2 error scaling and we show that they are in fact very close to the fundamental Bayesian estimation bound. We also demonstrate that the optimality of the algorithm breaks down when losses are taken into account, in which case the performance is inferior to the optimal entanglement-based estimation strategies. Finally, we show that when an alternative resource quantification is adopted, which describes the phase estimation in Shor's algorithm more accurately, the standard Kitaev's procedure is indeed optimal and there is no need to consider its generalized version.
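
    Written out schematically, the two scalings contrasted above are, for N phase-gate applications and protocol-dependent constants c1 and c2 (a sketch of the asymptotic forms, not the paper's exact bounds):

        \Delta^2\tilde{\varphi}_{\text{standard}} \;\sim\; \frac{c_1}{N},
        \qquad
        \Delta^2\tilde{\varphi}_{\text{optimal}} \;\sim\; \frac{c_2}{N^2}
        \quad \text{(Heisenberg scaling)}.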

  8. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods

    KAUST Repository

    Germain, Pierre-Luc

    2016-06-20

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, with current benchmarks suffering important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e. across-samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created an R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods.

  9. A Resource Service Model in the Industrial IoT System Based on Transparent Computing.

    Science.gov (United States)

    Li, Weimin; Wang, Bin; Sheng, Jinfang; Dong, Ke; Li, Zitong; Hu, Yixiang

    2018-03-26

    The Internet of Things (IoT) has received a lot of attention, especially in industrial scenarios. One of the typical applications is the intelligent mine, which actually constructs the Six-Hedge underground systems with IoT platforms. Based on a case study of the Six Systems in the underground metal mine, this paper summarizes the main challenges of industrial IoT from the aspects of heterogeneity in devices and resources, security, reliability, deployment and maintenance costs. Then, a novel resource service model for the industrial IoT applications based on Transparent Computing (TC) is presented, which supports centralized management of all resources including operating system (OS), programs and data on the server-side for the IoT devices, thus offering an effective, reliable, secure and cross-OS IoT service and reducing the costs of IoT system deployment and maintenance. The model has five layers: sensing layer, aggregation layer, network layer, service and storage layer and interface and management layer. We also present a detailed analysis on the system architecture and key technologies of the model. Finally, the efficiency of the model is shown by an experiment prototype system.

  10. Monitoring of computing resource use of active software releases at ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00219183; The ATLAS collaboration

    2017-01-01

    The LHC is the world’s most powerful particle accelerator, colliding protons at centre of mass energy of 13 TeV. As the energy and frequency of collisions has grown in the search for new physics, so too has demand for computing resources needed for event reconstruction. We will report on the evolution of resource usage in terms of CPU and RAM in key ATLAS offline reconstruction workflows at the Tier0 at CERN and on the WLCG. Monitoring of workflows is achieved using the ATLAS PerfMon package, which is the standard ATLAS performance monitoring system running inside Athena jobs. Systematic daily monitoring has recently been expanded to include all workflows beginning at Monte Carlo generation through to end-user physics analysis, beyond that of event reconstruction. Moreover, the move to a multiprocessor mode in production jobs has facilitated the use of tools, such as “MemoryMonitor”, to measure the memory shared across processors in jobs. Resource consumption is broken down into software domains and dis...

  11. Monitoring of Computing Resource Use of Active Software Releases in ATLAS

    CERN Document Server

    Limosani, Antonio; The ATLAS collaboration

    2016-01-01

    The LHC is the world's most powerful particle accelerator, colliding protons at centre of mass energy of 13 TeV. As the energy and frequency of collisions has grown in the search for new physics, so too has demand for computing resources needed for event reconstruction. We will report on the evolution of resource usage in terms of CPU and RAM in key ATLAS offline reconstruction workflows at the Tier0 at CERN and on the WLCG. Monitoring of workflows is achieved using the ATLAS PerfMon package, which is the standard ATLAS performance monitoring system running inside Athena jobs. Systematic daily monitoring has recently been expanded to include all workflows beginning at Monte Carlo generation through to end user physics analysis, beyond that of event reconstruction. Moreover, the move to a multiprocessor mode in production jobs has facilitated the use of tools, such as "MemoryMonitor", to measure the memory shared across processors in jobs. Resource consumption is broken down into software domains and displayed...

  12. Estimation of Distribution Algorithm for Resource Allocation in Green Cooperative Cognitive Radio Sensor Networks

    Directory of Open Access Journals (Sweden)

    Alagan Anpalagan

    2013-04-01

    Full Text Available Due to the rapid increase in the usage and demand of wireless sensor networks (WSN), the limited frequency spectrum available for WSN applications will be extremely crowded in the near future. More sensor devices also mean more recharging/replacement of batteries, which will cause significant impact on the global carbon footprint. In this paper, we propose a relay-assisted cognitive radio sensor network (CRSN) that allocates communication resources in an environmentally friendly manner. We use shared band amplify and forward relaying for cooperative communication in the proposed CRSN. We present a multi-objective optimization architecture for resource allocation in a green cooperative cognitive radio sensor network (GC-CRSN). The proposed multi-objective framework jointly performs relay assignment and power allocation in GC-CRSN, while optimizing two conflicting objectives. The first objective is to maximize the total throughput, and the second objective is to minimize the total transmission power of CRSN. The proposed relay assignment and power allocation problem is a non-convex mixed-integer non-linear optimization problem (NC-MINLP), which is generally non-deterministic polynomial-time (NP)-hard. We introduce a hybrid heuristic algorithm for this problem. The hybrid heuristic includes an estimation-of-distribution algorithm (EDA) for performing power allocation and iterative greedy schemes for constraint satisfaction and relay assignment. We analyze the throughput and power consumption tradeoff in GC-CRSN. A detailed analysis of the performance of the proposed algorithm is presented with the simulation results.

  13. Integral Images: Efficient Algorithms for Their Computation and Storage in Resource-Constrained Embedded Vision Systems.

    Science.gov (United States)

    Ehsan, Shoaib; Clark, Adrian F; Naveed ur Rehman; McDonald-Maier, Klaus D

    2015-07-10

    The integral image, an intermediate image representation, has found extensive use in multi-scale local feature detection algorithms, such as Speeded-Up Robust Features (SURF), allowing fast computation of rectangular features at constant speed, independent of filter size. For resource-constrained real-time embedded vision systems, computation and storage of integral image presents several design challenges due to strict timing and hardware limitations. Although calculation of the integral image only consists of simple addition operations, the total number of operations is large owing to the generally large size of image data. Recursive equations allow substantial decrease in the number of operations but require calculation in a serial fashion. This paper presents two new hardware algorithms that are based on the decomposition of these recursive equations, allowing calculation of up to four integral image values in a row-parallel way without significantly increasing the number of operations. An efficient design strategy is also proposed for a parallel integral image computation unit to reduce the size of the required internal memory (nearly 35% for common HD video). Addressing the storage problem of integral image in embedded vision systems, the paper presents two algorithms which allow substantial decrease (at least 44.44%) in the memory requirements. Finally, the paper provides a case study that highlights the utility of the proposed architectures in embedded vision systems.
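
    The recursive equations that the hardware algorithms above decompose are the standard two-pass recurrences for the integral image (summed-area table). A plain software version is shown below for reference only; the row-parallel hardware decomposition itself is not reproduced.

        # Standard serial recurrences for the integral image (summed-area table) and a
        # constant-time box sum using four lookups.
        import numpy as np

        def integral_image(img):
            """ii[y, x] = sum of img[0..y, 0..x], via s(y,x) = s(y,x-1) + img(y,x)
            and ii(y,x) = ii(y-1,x) + s(y,x)."""
            h, w = img.shape
            ii = np.zeros((h, w), dtype=np.int64)
            for y in range(h):
                row_sum = 0
                for x in range(w):
                    row_sum += int(img[y, x])
                    ii[y, x] = (ii[y - 1, x] if y > 0 else 0) + row_sum
            return ii

        def box_sum(ii, y0, x0, y1, x1):
            """Sum over the inclusive rectangle using four lookups."""
            total = ii[y1, x1]
            if y0 > 0: total -= ii[y0 - 1, x1]
            if x0 > 0: total -= ii[y1, x0 - 1]
            if y0 > 0 and x0 > 0: total += ii[y0 - 1, x0 - 1]
            return total

        img = np.arange(16, dtype=np.uint8).reshape(4, 4)
        ii = integral_image(img)
        assert box_sum(ii, 1, 1, 3, 3) == img[1:4, 1:4].sum()
        print(box_sum(ii, 1, 1, 3, 3))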

  14. Integral Images: Efficient Algorithms for Their Computation and Storage in Resource-Constrained Embedded Vision Systems

    Directory of Open Access Journals (Sweden)

    Shoaib Ehsan

    2015-07-01

Full Text Available The integral image, an intermediate image representation, has found extensive use in multi-scale local feature detection algorithms, such as Speeded-Up Robust Features (SURF), allowing fast computation of rectangular features at constant speed, independent of filter size. For resource-constrained real-time embedded vision systems, computation and storage of integral image presents several design challenges due to strict timing and hardware limitations. Although calculation of the integral image only consists of simple addition operations, the total number of operations is large owing to the generally large size of image data. Recursive equations allow substantial decrease in the number of operations but require calculation in a serial fashion. This paper presents two new hardware algorithms that are based on the decomposition of these recursive equations, allowing calculation of up to four integral image values in a row-parallel way without significantly increasing the number of operations. An efficient design strategy is also proposed for a parallel integral image computation unit to reduce the size of the required internal memory (nearly 35% for common HD video). Addressing the storage problem of integral image in embedded vision systems, the paper presents two algorithms which allow substantial decrease (at least 44.44%) in the memory requirements. Finally, the paper provides a case study that highlights the utility of the proposed architectures in embedded vision systems.
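
For reference, the standard serial recursion behind the integral image, and the four-lookup box sum it enables, can be written in a few lines of Python; the row-parallel hardware decomposition proposed in the paper is not reproduced here.

```python
# Reference (serial) integral image using the standard recursion
# I(x, y) = i(x, y) + I(x-1, y) + I(x, y-1) - I(x-1, y-1),
# implemented here with two cumulative sums.
import numpy as np

def integral_image(img):
    return img.cumsum(axis=0).cumsum(axis=1)

def box_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1+1, c0:c1+1] from at most four integral-image lookups."""
    total = ii[r1, c1]
    if r0 > 0:
        total -= ii[r0 - 1, c1]
    if c0 > 0:
        total -= ii[r1, c0 - 1]
    if r0 > 0 and c0 > 0:
        total += ii[r0 - 1, c0 - 1]
    return total

img = np.arange(16.0).reshape(4, 4)
ii = integral_image(img)
assert box_sum(ii, 1, 1, 2, 2) == img[1:3, 1:3].sum()
```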

  15. A computer vision based method for 3D posture estimation of symmetrical lifting.

    Science.gov (United States)

    Mehrizi, Rahil; Peng, Xi; Xu, Xu; Zhang, Shaoting; Metaxas, Dimitris; Li, Kang

    2018-03-01

Work-related musculoskeletal disorders (WMSD) are commonly observed among the workers involved in material handling tasks such as lifting. To improve work place safety, it is necessary to assess musculoskeletal and biomechanical risk exposures associated with these tasks. Such an assessment has been mainly conducted using surface marker-based methods, which are time consuming and tedious. During the past decade, computer vision based pose estimation techniques have gained an increasing interest and may be a viable alternative for surface marker-based human movement analysis. The aim of this study is to develop and validate a computer vision based marker-less motion capture method to assess 3D joint kinematics of lifting tasks. Twelve subjects performing three types of symmetrical lifting tasks were filmed from two views using optical cameras. The joint kinematics were calculated by the proposed computer vision based motion capture method as well as a surface marker-based motion capture method. The joint kinematics estimated from the computer vision based method were practically comparable to the joint kinematics obtained by the surface marker-based method. The mean and standard deviation of the difference between the joint angles estimated by the computer vision based method and those obtained by the surface marker-based method were 2.31 ± 4.00°. One potential application of the proposed computer vision based marker-less method is to noninvasively assess 3D joint kinematics of industrial tasks such as lifting. Copyright © 2018 Elsevier Ltd. All rights reserved.

  16. How accurate are adolescents in portion-size estimation using the computer tool young adolescents' nutrition assessment on computer (YANA-C)?

    OpenAIRE

    Vereecken, Carine; Dohogne, Sophie; Covents, Marc; Maes, Lea

    2010-01-01

    Computer-administered questionnaires have received increased attention for large-scale population research on nutrition. In Belgium-Flanders, Young Adolescents' Nutrition Assessment on Computer (YANA-C) has been developed. In this tool, standardised photographs are available to assist in portion-size estimation. The purpose of the present study is to assess how accurate adolescents are in estimating portion sizes of food using YANA-C. A convenience sample, aged 11-17 years, estimated the amou...

  17. Deterministic absorbed dose estimation in computed tomography using a discrete ordinates method

    International Nuclear Information System (INIS)

    Norris, Edward T.; Liu, Xin; Hsieh, Jiang

    2015-01-01

Purpose: Organ dose estimation for a patient undergoing computed tomography (CT) scanning is very important. Although Monte Carlo methods are considered gold-standard in patient dose estimation, the computation time required is formidable for routine clinical calculations. Here, the authors investigate a deterministic method for estimating an absorbed dose more efficiently. Methods: Compared with current Monte Carlo methods, a more efficient approach to estimating the absorbed dose is to solve the linear Boltzmann equation numerically. In this study, an axial CT scan was modeled with a software package, Denovo, which solved the linear Boltzmann equation using the discrete ordinates method. The CT scanning configuration included 16 x-ray source positions, beam collimators, flat filters, and bowtie filters. The phantom was the standard 32 cm CT dose index (CTDI) phantom. Four different Denovo simulations were performed with different simulation parameters, including the number of quadrature sets and the order of Legendre polynomial expansions. A Monte Carlo simulation was also performed for benchmarking the Denovo simulations. A quantitative comparison was made of the simulation results obtained by the Denovo and the Monte Carlo methods. Results: The difference in the simulation results of the discrete ordinates method and those of the Monte Carlo methods was found to be small, with a root-mean-square difference of around 2.4%. It was found that the discrete ordinates method, with a higher order of Legendre polynomial expansions, underestimated the absorbed dose near the center of the phantom (i.e., low dose region). Simulations of the quadrature set 8 and the first order of the Legendre polynomial expansions proved to be the most efficient computation method in the authors’ study. The single-thread computation time of the deterministic simulation of the quadrature set 8 and the first order of the Legendre polynomial expansions was 21 min on a personal computer
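
The root-mean-square comparison quoted above (around 2.4%) is a simple metric; a short sketch with placeholder dose arrays shows the calculation.

```python
# Root-mean-square relative difference between two dose arrays, the kind of
# metric behind the ~2.4% figure quoted above; both arrays are placeholders.
import numpy as np

def rms_relative_difference(dose_a, dose_b):
    rel = (dose_a - dose_b) / dose_b
    return np.sqrt(np.mean(rel ** 2))

mc_dose = np.array([10.2, 9.8, 9.5, 10.1])      # hypothetical Monte Carlo doses
sn_dose = np.array([10.0, 10.0, 9.3, 10.3])     # hypothetical discrete-ordinates doses
print(f"RMS difference: {100 * rms_relative_difference(sn_dose, mc_dose):.1f}%")
```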

  18. Resources

    Science.gov (United States)

    English in Australia, 1973

    1973-01-01

Contains seven short "resources"--units, lessons, and activities on the power of observation, man and his earth, snakes, group discussion, colloquial and slang, the continuous story, and retelling a story. (DD)

  19. Automation of reliability evaluation procedures through CARE - The computer-aided reliability estimation program.

    Science.gov (United States)

    Mathur, F. P.

    1972-01-01

    Description of an on-line interactive computer program called CARE (Computer-Aided Reliability Estimation) which can model self-repair and fault-tolerant organizations and perform certain other functions. Essentially CARE consists of a repository of mathematical equations defining the various basic redundancy schemes. These equations, under program control, are then interrelated to generate the desired mathematical model to fit the architecture of the system under evaluation. The mathematical model is then supplied with ground instances of its variables and is then evaluated to generate values for the reliability-theoretic functions applied to the model.
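
As an illustration of the kind of basic redundancy equations such a repository holds, the sketch below evaluates simplex and triple-modular-redundancy reliability under a constant failure rate; the failure rate and mission time are assumed values, and this is not CARE's actual model library.

```python
# Two textbook redundancy equations of the kind a reliability-estimation tool
# keeps on file, evaluated for a given mission time; lam and t are illustrative.
import math

def simplex_reliability(lam, t):
    """Single (non-redundant) module with constant failure rate lam."""
    return math.exp(-lam * t)

def tmr_reliability(lam, t):
    """Triple modular redundancy with a perfect voter: R = 3R^2 - 2R^3."""
    r = simplex_reliability(lam, t)
    return 3 * r**2 - 2 * r**3

lam, t = 1e-4, 1000.0          # failures/hour and mission hours (assumed values)
print(f"simplex: {simplex_reliability(lam, t):.4f}  TMR: {tmr_reliability(lam, t):.4f}")
```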

  20. Computable error estimates for Monte Carlo finite element approximation of elliptic PDE with lognormal diffusion coefficients

    KAUST Repository

    Hall, Eric

    2016-01-09

The Monte Carlo (and Multi-level Monte Carlo) finite element method can be used to approximate observables of solutions to diffusion equations with lognormal distributed diffusion coefficients, e.g. modeling ground water flow. Typical models use lognormal diffusion coefficients with Hölder regularity of order up to 1/2 a.s. This low regularity implies that the high frequency finite element approximation error (i.e. the error from frequencies larger than the mesh frequency) is not negligible and can be larger than the computable low frequency error. We address how the total error can be estimated by the computable error.
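
Only the statistical half of the "computable error" is easy to show in a few lines: a plain Monte Carlo sample mean with its standard-error estimate, using a toy scalar observable of a lognormal coefficient in place of a finite element solve. The finite element and high-frequency error terms discussed in the abstract are not reproduced.

```python
# Sample-mean estimator of an observable with a computable statistical error
# bound. The "observable" is a toy function of a lognormal coefficient and
# stands in for a finite element solution functional.
import numpy as np

def mc_estimate(n_samples=10_000, seed=1):
    rng = np.random.default_rng(seed)
    a = np.exp(rng.normal(0.0, 1.0, n_samples))   # lognormal "diffusion coefficient"
    q = 1.0 / (1.0 + a)                           # toy observable of the solution
    mean = q.mean()
    stat_err = 1.96 * q.std(ddof=1) / np.sqrt(n_samples)   # ~95% statistical error
    return mean, stat_err

mean, err = mc_estimate()
print(f"observable ≈ {mean:.4f} ± {err:.4f} (statistical part only)")
```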

  1. Computational error estimates for Monte Carlo finite element approximation with log normal diffusion coefficients

    KAUST Repository

    Sandberg, Mattias

    2015-01-07

The Monte Carlo (and Multi-level Monte Carlo) finite element method can be used to approximate observables of solutions to diffusion equations with log normal distributed diffusion coefficients, e.g. modelling ground water flow. Typical models use log normal diffusion coefficients with Hölder regularity of order up to 1/2 a.s. This low regularity implies that the high frequency finite element approximation error (i.e. the error from frequencies larger than the mesh frequency) is not negligible and can be larger than the computable low frequency error. This talk will address how the total error can be estimated by the computable error.

  2. Interactive Whiteboards and Computer Games at Highschool Level: Digital Resources for Enhancing Reflection in Teaching and Learning

    DEFF Research Database (Denmark)

    Sorensen, Elsebeth Korsgaard; Poulsen, Mathias; Houmann, Rita

The general potential of computer games for teaching and learning is becoming widely recognized. In particular, within the application contexts of primary and lower secondary education, the relevance and value of computer games seem more accepted, and the possibility and willingness to incorporate computer games as a possible resource at the level of other educational resources seem more frequent. For some reason, however, applying computer games in processes of teaching and learning at the high school level seems an almost non-existent event. This paper reports on a study of incorporating the learning game “Global Conflicts: Latin America” as a resource into the teaching and learning of a course involving the two subjects “English language learning” and “Social studies” in the final year of a Danish high school. The study adapts an explorative research design approach and investigates...

  3. Development of a Computer Code for the Estimation of Fuel Rod Failure

    Energy Technology Data Exchange (ETDEWEB)

    Rhee, I.H.; Ahn, H.J. [Korea Electric Power Research Institute, Daejeon (Korea, Republic of)

    1997-12-31

Much research has already been performed to obtain information on the degree of failed fuel rods from the primary coolant activities of operating PWRs in the last few decades. The computer codes that are currently in use for domestic nuclear power plants, such as the CADE code and the ABB-CE codes developed by Westinghouse and ABB-CE, respectively, still give significant overall errors in estimating the failed fuel rods. In addition, with the CADE code, it is difficult to predict the degree of fuel rod failures during the transient period of nuclear reactor operation, whereas the ABB-CE codes are relatively more difficult to use for end-users. In particular, the rapid progress made recently in computer hardware and software systems demands that such computer programs be more versatile and user-friendly. While the MS Windows system, which is centered on the graphic user interface and multitasking, is now in widespread use, the computer codes currently employed at the nuclear power plants, such as the CADE and ABB-CE codes, can only be run on the DOS system. Moreover, it is desirable to have a computer code for fuel rod failure estimation that can directly use the radioactivity data obtained from the on-line monitoring system of the primary coolant activity. The main purpose of this study is, therefore, to develop a Windows computer code that can predict the location, the number of failed fuel rods, and the degree of failures using the radioactivity data obtained from the primary coolant activity for PWRs. Another objective is to combine this computer code with the on-line monitoring system of the primary coolant radioactivity at the Kori 3 and 4 operating nuclear power plants and enable their combined use for on-line evaluation of the number and degree of fuel rod failures. (author). 49 refs., 85 figs., 30 tabs.

  4. SPADER - Science Planning Analysis and Data Estimation Resource for the NASA Parker Solar Probe Mission

    Science.gov (United States)

    Rodgers, D. J.; Fox, N. J.; Kusterer, M. B.; Turner, F. S.; Woleslagle, A. B.

    2017-12-01

    Scheduled to launch in July 2018, the Parker Solar Probe (PSP) will orbit the Sun for seven years, making a total of twenty-four extended encounters inside a solar radial distance of 0.25 AU. During most orbits, there are extended periods of time where PSP-Sun-Earth geometry dramatically reduces PSP-Earth communications via the Deep Space Network (DSN); there is the possibility that multiple orbits will have little to no high-rate downlink available. Science and housekeeping data taken during an encounter may reside on the spacecraft solid state recorder (SSR) for multiple orbits, potentially running the risk of overflowing the SSR in the absence of mitigation. The Science Planning Analysis and Data Estimation Resource (SPADER) has been developed to provide the science and operations teams the ability to plan operations accounting for multiple orbits in order to mitigate the effects caused by the lack of high-rate downlink. Capabilities and visualizations of SPADER are presented; further complications associated with file downlink priority and high-speed data transfers between instrument SSRs and the spacecraft SSR are discussed, as well as the long-term consequences of variations in DSN downlink parameters on the science data downlink.
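
A toy version of the bookkeeping such a planning resource performs, accumulating encounter data on the solid state recorder and draining it when downlink is available, is sketched below; all volumes and rates are invented and do not reflect PSP's actual data budget.

```python
# Toy solid-state-recorder bookkeeping of the kind a multi-orbit planning tool
# must do: accumulate data generated during each encounter, then drain whatever
# downlink is available that orbit. All capacities and volumes are invented.
def ssr_fill_levels(orbits, ssr_capacity_gb=256.0):
    """orbits: list of (data_generated_gb, downlink_available_gb) per orbit."""
    fill, history = 0.0, []
    for generated, downlink in orbits:
        fill = min(ssr_capacity_gb, fill + generated)   # record encounter data
        fill = max(0.0, fill - downlink)                # play back what we can
        history.append(fill)
    return history

# Three orbits with little downlink followed by one long downlink campaign.
print(ssr_fill_levels([(80, 10), (80, 5), (80, 20), (10, 200)]))
```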

  5. Breccia-pipe uranium mining in northern Arizona; estimate of resources and assessment of historical effects

    Science.gov (United States)

    Bills, Donald J.; Brown, Kristin M.; Alpine, Andrea E.; Otton, James K.; Van Gosen, Bradley S.; Hinck, Jo Ellen; Tillman, Fred D.

    2011-01-01

    About 1 million acres of Federal land in the Grand Canyon region of Arizona were temporarily withdrawn from new mining claims in July 2009 by the Secretary of the Interior because of concern that increased uranium mining could have negative impacts on the land, water, people, and wildlife. During a 2-year interval, a Federal team led by the Bureau of Land Management is evaluating the effects of withdrawing these lands for extended periods. As part of this team, the U.S. Geological Survey (USGS) conducted a series of short-term studies to examine the historical effects of breccia-pipe uranium mining in the region. The USGS studies provide estimates of uranium resources affected by the possible land withdrawal, examine the effects of previous breccia-pipe mining, summarize water-chemistry data for streams and springs, and investigate potential biological pathways of exposure to uranium and associated contaminants. This fact sheet summarizes results through December 2009 and outlines further research needs.

  6. Sensitivity of Calibrated Parameters and Water Resource Estimates on Different Objective Functions and Optimization Algorithms

    Directory of Open Access Journals (Sweden)

    Delaram Houshmand Kouchi

    2017-05-01

Full Text Available The successful application of hydrological models relies on careful calibration and uncertainty analysis. However, there are many different calibration/uncertainty analysis algorithms, and each could be run with different objective functions. In this paper, we highlight the fact that each combination of optimization algorithm-objective functions may lead to a different set of optimum parameters, while having the same performance; this makes the interpretation of dominant hydrological processes in a watershed highly uncertain. We used three different optimization algorithms (SUFI-2, GLUE, and PSO), and eight different objective functions (R2, bR2, NSE, MNS, RSR, SSQR, KGE, and PBIAS) in a SWAT model to calibrate the monthly discharges in two watersheds in Iran. The results show that all three algorithms, using the same objective function, produced acceptable calibration results; however, with significantly different parameter ranges. Similarly, an algorithm using different objective functions also produced acceptable calibration results, but with different parameter ranges. The different calibrated parameter ranges consequently resulted in significantly different water resource estimates. Hence, the parameters and the outputs that they produce in a calibrated model are “conditioned” on the choices of the optimization algorithm and objective function. This adds another level of non-negligible uncertainty to watershed models, calling for more attention and investigation in this area.
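
Three of the objective functions named above have simple closed forms; the sketch below computes NSE, PBIAS and KGE for a short, made-up pair of observed and simulated discharge series (standard textbook definitions, not tied to the SWAT setup of the study).

```python
# NSE, PBIAS and KGE computed from observed and simulated monthly discharges;
# the six-month series below is made up purely for illustration.
import numpy as np

def nse(obs, sim):
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

def kge(obs, sim):
    r = np.corrcoef(obs, sim)[0, 1]
    alpha = sim.std() / obs.std()     # variability ratio
    beta = sim.mean() / obs.mean()    # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

obs = np.array([12.0, 30.0, 55.0, 41.0, 18.0, 9.0])
sim = np.array([14.0, 27.0, 60.0, 37.0, 20.0, 8.0])
print(f"NSE={nse(obs, sim):.3f}  PBIAS={pbias(obs, sim):.1f}%  KGE={kge(obs, sim):.3f}")
```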

  7. Geological Carbon Sequestration Storage Resource Estimates for the Ordovician St. Peter Sandstone, Illinois and Michigan Basins, USA

    Energy Technology Data Exchange (ETDEWEB)

    Barnes, David; Ellett, Kevin; Leetaru, Hannes

    2014-09-30

    The Cambro-Ordovician strata of the Midwest of the United States is a primary target for potential geological storage of CO2 in deep saline formations. The objective of this project is to develop a comprehensive evaluation of the Cambro-Ordovician strata in the Illinois and Michigan Basins above the basal Mount Simon Sandstone since the Mount Simon is the subject of other investigations including a demonstration-scale injection at the Illinois Basin Decatur Project. The primary reservoir targets investigated in this study are the middle Ordovician St Peter Sandstone and the late Cambrian to early Ordovician Knox Group carbonates. The topic of this report is a regional-scale evaluation of the geologic storage resource potential of the St Peter Sandstone in both the Illinois and Michigan Basins. Multiple deterministic-based approaches were used in conjunction with the probabilistic-based storage efficiency factors published in the DOE methodology to estimate the carbon storage resource of the formation. Extensive data sets of core analyses and wireline logs were compiled to develop the necessary inputs for volumetric calculations. Results demonstrate how the range in uncertainty of storage resource estimates varies as a function of data availability and quality, and the underlying assumptions used in the different approaches. In the simplest approach, storage resource estimates were calculated from mapping the gross thickness of the formation and applying a single estimate of the effective mean porosity of the formation. Results from this approach led to storage resource estimates ranging from 3.3 to 35.1 Gt in the Michigan Basin, and 1.0 to 11.0 Gt in the Illinois Basin at the P10 and P90 probability level, respectively. The second approach involved consideration of the diagenetic history of the formation throughout the two basins and used depth-dependent functions of porosity to derive a more realistic spatially variable model of porosity rather than applying a
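
The deterministic volumetric calculation referred to above follows the familiar form G = A · h · φ · ρ_CO2 · E; the sketch below evaluates it for a range of storage efficiency factors. Every input value is illustrative and is not taken from the St Peter Sandstone study.

```python
# DOE-style volumetric storage estimate G = A * h * phi * rho_CO2 * E, swept over
# a few efficiency factors; all inputs are illustrative, not study values.
def storage_resource_gt(area_m2, thickness_m, porosity, rho_co2=700.0, efficiency=0.02):
    """Mass of storable CO2 in gigatonnes (1 Gt = 1e12 kg)."""
    kg = area_m2 * thickness_m * porosity * rho_co2 * efficiency
    return kg / 1e12

area = 2.0e11         # 200,000 km^2 of basin area (assumed)
thickness = 30.0      # net sandstone thickness in metres (assumed)
porosity = 0.08       # effective mean porosity (assumed)
for e in (0.005, 0.02, 0.055):   # low / mid / high efficiency factors (assumed)
    print(f"E={e:.3f}: {storage_resource_gt(area, thickness, porosity, efficiency=e):.1f} Gt")
```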

  8. Unconventional energy resources in a crowded subsurface: Reducing uncertainty and developing a separation zone concept for resource estimation and deep 3D subsurface planning using legacy mining data.

    Science.gov (United States)

    Monaghan, Alison A

    2017-12-01

    Over significant areas of the UK and western Europe, anthropogenic alteration of the subsurface by mining of coal has occurred beneath highly populated areas which are now considering a multiplicity of 'low carbon' unconventional energy resources including shale gas and oil, coal bed methane, geothermal energy and energy storage. To enable decision making on the 3D planning, licensing and extraction of these resources requires reduced uncertainty around complex geology and hydrogeological and geomechanical processes. An exemplar from the Carboniferous of central Scotland, UK, illustrates how, in areas lacking hydrocarbon well production data and 3D seismic surveys, legacy coal mine plans and associated boreholes provide valuable data that can be used to reduce the uncertainty around geometry and faulting of subsurface energy resources. However, legacy coal mines also limit unconventional resource volumes since mines and associated shafts alter the stress and hydrogeochemical state of the subsurface, commonly forming pathways to the surface. To reduce the risk of subsurface connections between energy resources, an example of an adapted methodology is described for shale gas/oil resource estimation to include a vertical separation or 'stand-off' zone between the deepest mine workings, to ensure the hydraulic fracturing required for shale resource production would not intersect legacy coal mines. Whilst the size of such separation zones requires further work, developing the concept of 3D spatial separation and planning is key to utilising the crowded subsurface energy system, whilst mitigating against resource sterilisation and environmental impacts, and could play a role in positively informing public and policy debate. Copyright © 2017 British Geological Survey, a component institute of NERC. Published by Elsevier B.V. All rights reserved.

  9. Adaptive resource allocation scheme using sliding window subchannel gain computation: context of OFDMA wireless mobiles systems

    International Nuclear Information System (INIS)

    Khelifa, F.; Samet, A.; Ben Hassen, W.; Afif, M.

    2011-01-01

Multiuser diversity combined with Orthogonal Frequency Division Multiple Access (OFDMA) is a promising technique for achieving high downlink capacities in the new generation of cellular and wireless network systems. The total capacity of an OFDMA-based system is maximized when each subchannel is assigned to the mobile station with the best channel-to-noise ratio for that subchannel, with power uniformly distributed among all subchannels. A contiguous method for subchannel construction is adopted in the IEEE 802.16m standard in order to reduce OFDMA system complexity. In this context, new subchannel gain computation methods can contribute, jointly with optimal subchannel assignment, to maximizing total system capacity. In this paper, two new methods have been proposed in order to achieve a better trade-off between fairness and efficient use of resources. Numerical results show that the proposed algorithms provide low complexity, higher total system capacity and fairness among users compared to other recent methods.

  10. Application of a Resource Theory for Magic States to Fault-Tolerant Quantum Computing.

    Science.gov (United States)

    Howard, Mark; Campbell, Earl

    2017-03-03

Motivated by their necessity for most fault-tolerant quantum computation schemes, we formulate a resource theory for magic states. First, we show that robustness of magic is a well-behaved magic monotone that operationally quantifies the classical simulation overhead for a Gottesman-Knill-type scheme using ancillary magic states. Our framework subsequently finds immediate application in the task of synthesizing non-Clifford gates using magic states. When magic states are interspersed with Clifford gates, Pauli measurements, and stabilizer ancillas (the most general synthesis scenario), then the class of synthesizable unitaries is hard to characterize. Our techniques can place nontrivial lower bounds on the number of magic states required for implementing a given target unitary. Guided by these results, we have found new and optimal examples of such synthesis.

  11. Using multiple metaphors and multimodalities as a semiotic resource when teaching year 2 students computational strategies

    Science.gov (United States)

    Mildenhall, Paula; Sherriff, Barbara

    2017-06-01

    Recent research indicates that using multimodal learning experiences can be effective in teaching mathematics. Using a social semiotic lens within a participationist framework, this paper reports on a professional learning collaboration with a primary school teacher designed to explore the use of metaphors and modalities in mathematics instruction. This video case study was conducted in a year 2 classroom over two terms, with the focus on building children's understanding of computational strategies. The findings revealed that the teacher was able to successfully plan both multimodal and multiple metaphor learning experiences that acted as semiotic resources to support the children's understanding of abstract mathematics. The study also led to implications for teaching when using multiple metaphors and multimodalities.

  12. Measurement and Estimation of Renal Size by Computed Tomography in Korean Children

    OpenAIRE

    Park, Chan Won; Yu, Nali; Yun, Sin Weon; Chae, Soo Ahn; Lee, Na Mi; Yi, Dae Yong; Choi, Young Bae; Lim, In Seok

    2017-01-01

    Adequate organ growth is an important aspect of growth evaluation in children. Renal size is an important indicator of adequate renal growth; computed tomography (CT) can closely estimate actual kidney size. However, insufficient data are available on normal renal size as measured by CT. This study aimed to evaluate the relationships of anthropometric indices with renal length and volume measured by CT in Korean pediatric patients. Renal length and volume were measured using CT images in 272 ...

  13. Implementation of the EM Algorithm in the Estimation of Item Parameters: The BILOG Computer Program.

    Science.gov (United States)

    Mislevy, Robert J.; Bock, R. Darrell

    This paper reviews the basic elements of the EM approach to estimating item parameters and illustrates its use with one simulated and one real data set. In order to illustrate the use of the BILOG computer program, runs for 1-, 2-, and 3-parameter models are presented for the two sets of data. First is a set of responses from 1,000 persons to five…

  14. Approximate Bayesian computation (ABC) coupled with Bayesian model averaging method for estimating mean and standard deviation

    OpenAIRE

    Kwon, Deukwoo; Reis, Isildinha M.

    2016-01-01

    Background: We proposed approximate Bayesian computation with single distribution selection (ABC-SD) for estimating mean and standard deviation from other reported summary statistics. The ABC-SD generates pseudo data from a single parametric distribution thought to be the true distribution of underlying study data. This single distribution is either an educated guess, or it is selected via model selection using posterior probability criterion for testing two or more candidate distributions. F...
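
In the same spirit as ABC-SD, a minimal ABC rejection sketch: propose (μ, σ), simulate pseudo data from a normal distribution, and accept proposals whose simulated median and range are close to the reported summaries. The reported summaries, priors, tolerances and distance measure below are invented and simplistic.

```python
# Minimal ABC rejection sketch: keep (mu, sigma) proposals whose simulated
# median and range land near the reported summaries. All numbers are invented.
import numpy as np

def abc_mean_sd(reported_median, reported_range, n=50, draws=20_000, tol=0.5, seed=2):
    rng = np.random.default_rng(seed)
    target = np.array([reported_median, reported_range])
    band = tol * np.array([1.0, reported_range / 5.0])      # crude per-summary tolerance
    accepted = []
    for _ in range(draws):
        mu = rng.uniform(reported_median - 10, reported_median + 10)
        sigma = rng.uniform(0.1, reported_range)
        pseudo = rng.normal(mu, sigma, n)                   # pseudo data from a normal
        stats = np.array([np.median(pseudo), pseudo.max() - pseudo.min()])
        if np.all(np.abs(stats - target) < band):
            accepted.append((mu, sigma))
    accepted = np.array(accepted)
    return accepted.mean(axis=0), len(accepted)

(est_mu, est_sigma), n_acc = abc_mean_sd(reported_median=25.0, reported_range=12.0)
print(f"posterior mean of (mu, sigma) ≈ ({est_mu:.2f}, {est_sigma:.2f}) from {n_acc} draws")
```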

  15. EGI-EUDAT integration activity - Pair data and high-throughput computing resources together

    Science.gov (United States)

    Scardaci, Diego; Viljoen, Matthew; Vitlacil, Dejan; Fiameni, Giuseppe; Chen, Yin; sipos, Gergely; Ferrari, Tiziana

    2016-04-01

    EGI (www.egi.eu) is a publicly funded e-infrastructure put together to give scientists access to more than 530,000 logical CPUs, 200 PB of disk capacity and 300 PB of tape storage to drive research and innovation in Europe. The infrastructure provides both high throughput computing and cloud compute/storage capabilities. Resources are provided by about 350 resource centres which are distributed across 56 countries in Europe, the Asia-Pacific region, Canada and Latin America. EUDAT (www.eudat.eu) is a collaborative Pan-European infrastructure providing research data services, training and consultancy for researchers, research communities, research infrastructures and data centres. EUDAT's vision is to enable European researchers and practitioners from any research discipline to preserve, find, access, and process data in a trusted environment, as part of a Collaborative Data Infrastructure (CDI) conceived as a network of collaborating, cooperating centres, combining the richness of numerous community-specific data repositories with the permanence and persistence of some of Europe's largest scientific data centres. EGI and EUDAT, in the context of their flagship projects, EGI-Engage and EUDAT2020, started in March 2015 a collaboration to harmonise the two infrastructures, including technical interoperability, authentication, authorisation and identity management, policy and operations. The main objective of this work is to provide end-users with a seamless access to an integrated infrastructure offering both EGI and EUDAT services and, then, pairing data and high-throughput computing resources together. To define the roadmap of this collaboration, EGI and EUDAT selected a set of relevant user communities, already collaborating with both infrastructures, which could bring requirements and help to assign the right priorities to each of them. In this way, from the beginning, this activity has been really driven by the end users. The identified user communities are

  16. Systems, methods and computer readable media for estimating capacity loss in rechargeable electrochemical cells

    Science.gov (United States)

    Gering, Kevin L.

    2013-06-18

    A system includes an electrochemical cell, monitoring hardware, and a computing system. The monitoring hardware periodically samples charge characteristics of the electrochemical cell. The computing system periodically determines cell information from the charge characteristics of the electrochemical cell. The computing system also periodically adds a first degradation characteristic from the cell information to a first sigmoid expression, periodically adds a second degradation characteristic from the cell information to a second sigmoid expression and combines the first sigmoid expression and the second sigmoid expression to develop or augment a multiple sigmoid model (MSM) of the electrochemical cell. The MSM may be used to estimate a capacity loss of the electrochemical cell at a desired point in time and analyze other characteristics of the electrochemical cell. The first and second degradation characteristics may be loss of active host sites and loss of free lithium for Li-ion cells.
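
A generic two-sigmoid capacity-fade fit conveys the idea of combining two degradation terms; the sketch below uses a plain logistic parameterization and synthetic cycling data, and is not necessarily the exact multiple sigmoid model of the work above.

```python
# Generic two-sigmoid capacity-fade fit: each degradation mechanism contributes
# one logistic term. The cycling data are synthetic and the parameterization is
# a plain logistic form, not necessarily the exact MSM described above.
import numpy as np
from scipy.optimize import curve_fit

def two_sigmoid(n, a1, k1, n1, a2, k2, n2):
    """Fractional capacity loss after n cycles from two degradation mechanisms."""
    s = lambda a, k, n0: a / (1.0 + np.exp(-k * (n - n0)))
    return s(a1, k1, n1) + s(a2, k2, n2)

cycles = np.linspace(0, 2000, 60)
true = two_sigmoid(cycles, 0.08, 0.004, 500, 0.15, 0.003, 1500)
rng = np.random.default_rng(3)
measured = true + rng.normal(0, 0.004, cycles.size)        # synthetic measurements

p0 = [0.1, 0.005, 400, 0.1, 0.005, 1400]                    # rough initial guess
popt, _ = curve_fit(two_sigmoid, cycles, measured, p0=p0, maxfev=20_000)
print("fitted capacity loss at 2500 cycles:", round(float(two_sigmoid(2500, *popt)), 3))
```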

  17. The National Streamflow Statistics Program: A Computer Program for Estimating Streamflow Statistics for Ungaged Sites

    Science.gov (United States)

    Ries(compiler), Kernell G.; With sections by Atkins, J. B.; Hummel, P.R.; Gray, Matthew J.; Dusenbury, R.; Jennings, M.E.; Kirby, W.H.; Riggs, H.C.; Sauer, V.B.; Thomas, W.O.

    2007-01-01

The National Streamflow Statistics (NSS) Program is a computer program that should be useful to engineers, hydrologists, and others for planning, management, and design applications. NSS compiles all current U.S. Geological Survey (USGS) regional regression equations for estimating streamflow statistics at ungaged sites in an easy-to-use interface that operates on computers with Microsoft Windows operating systems. NSS expands on the functionality of the USGS National Flood Frequency Program, and replaces it. The regression equations included in NSS are used to transfer streamflow statistics from gaged to ungaged sites through the use of watershed and climatic characteristics as explanatory or predictor variables. Generally, the equations were developed on a statewide or metropolitan-area basis as part of cooperative study programs. Equations are available for estimating rural and urban flood-frequency statistics, such as the 100-year flood, for every state, for Puerto Rico, and for the island of Tutuila, American Samoa. Equations are available for estimating other statistics, such as the mean annual flow, monthly mean flows, flow-duration percentiles, and low-flow frequencies (such as the 7-day, 10-year low flow) for less than half of the states. All equations available for estimating streamflow statistics other than flood-frequency statistics assume rural (non-regulated, non-urbanized) conditions. The NSS output provides indicators of the accuracy of the estimated streamflow statistics. The indicators may include any combination of the standard error of estimate, the standard error of prediction, the equivalent years of record, or 90 percent prediction intervals, depending on what was provided by the authors of the equations. The program includes several other features that can be used only for flood-frequency estimation. These include the ability to generate flood-frequency plots, and plots of typical flood hydrographs for selected recurrence intervals.
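
Regional regression equations of the kind NSS evaluates are log-linear in watershed characteristics; the sketch below shows the general shape with hypothetical coefficients and a rough one-standard-error band (it is not any published USGS equation).

```python
# Shape of a regional regression equation for an ungaged site: a power-law model
# in watershed characteristics. The coefficients and the standard-error factor
# are hypothetical, not taken from any published USGS equation.
import math

def q100_estimate(drainage_area_mi2, mean_annual_precip_in,
                  a=120.0, b=0.78, c=0.55, se_percent=45.0):
    """100-year peak flow (cfs) with a rough one-standard-error band."""
    q = a * drainage_area_mi2 ** b * mean_annual_precip_in ** c
    factor = 1.0 + se_percent / 100.0
    return q, (q / factor, q * factor)

q, (lo, hi) = q100_estimate(drainage_area_mi2=58.0, mean_annual_precip_in=42.0)
print(f"Q100 ≈ {q:,.0f} cfs (≈ {lo:,.0f}–{hi:,.0f} cfs within one standard error)")
```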

  18. Distributed Formation State Estimation Algorithms Under Resource and Multi-Tasking Constraints, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Recent work has developed a number of architectures and algorithms for accurately estimating spacecraft and formation states. The estimation accuracy achievable...

  19. Efficient Nash Equilibrium Resource Allocation Based on Game Theory Mechanism in Cloud Computing by Using Auction.

    Science.gov (United States)

    Nezarat, Amin; Dastghaibifard, G H

    2015-01-01

One of the most complex issues in the cloud computing environment is the problem of resource allocation so that, on one hand, the cloud provider expects the most profitability and, on the other hand, users also expect to have the best resources at their disposal considering the budget and time constraints. In most previous work, heuristic and evolutionary approaches have been used to solve this problem. Nevertheless, since the nature of this environment is based on economic methods, using such methods can decrease response time and reduce the complexity of the problem. In this paper, an auction-based method is proposed which determines the auction winner by applying a game theory mechanism and holding a repetitive game with incomplete information in a non-cooperative environment. In this method, users calculate a suitable price bid with their objective function over several rounds and repetitions and send it to the auctioneer, and the auctioneer chooses the winning player based on the suggested utility function. In the proposed method, the end point of the game is the Nash equilibrium point where players are no longer inclined to alter their bid for that resource and the final bid also satisfies the auctioneer's utility function. To prove the convexity of the response space, the Lagrange method is used and the proposed model is simulated in CloudSim and the results are compared with previous work. In the end, it is concluded that this method converges to a response in a shorter time, produces the fewest service level agreement violations and provides the most utility to the provider.

  20. Monitoring of computing resource use of active software releases at ATLAS

    Science.gov (United States)

    Limosani, Antonio; ATLAS Collaboration

    2017-10-01

The LHC is the world’s most powerful particle accelerator, colliding protons at a centre-of-mass energy of 13 TeV. As the energy and frequency of collisions has grown in the search for new physics, so too has demand for computing resources needed for event reconstruction. We will report on the evolution of resource usage in terms of CPU and RAM in key ATLAS offline reconstruction workflows at the Tier-0 at CERN and on the WLCG. Monitoring of workflows is achieved using the ATLAS PerfMon package, which is the standard ATLAS performance monitoring system running inside Athena jobs. Systematic daily monitoring has recently been expanded to include all workflows beginning at Monte Carlo generation through to end-user physics analysis, beyond that of event reconstruction. Moreover, the move to a multiprocessor mode in production jobs has facilitated the use of tools, such as “MemoryMonitor”, to measure the memory shared across processors in jobs. Resource consumption is broken down into software domains and displayed in plots generated using Python visualization libraries and collected into pre-formatted auto-generated Web pages, which allow the ATLAS developer community to track the performance of their algorithms. This information is however preferentially filtered to domain leaders and developers through the use of JIRA and via reports given at ATLAS software meetings. Finally, we take a glimpse of the future by reporting on the expected CPU and RAM usage in benchmark workflows associated with the High Luminosity LHC and anticipate the ways performance monitoring will evolve to understand and benchmark future workflows.

  1. Estimating changing snow water resources over the Himalaya from remote sensing at the weekly scale

    Science.gov (United States)

    Ackroyd, C.; Skiles, M.

    2017-12-01

    Water resources in South Asia are critically dependent on High Mountain Asia, namely as the headwaters for the Indus, Ganges, and Brahmaputra River Basins. For water and economic security it is important to understand how the natural snow water reservoir is changing at a time scale that is relevant for water management, which can most feasibly be achieved across this vast and complex landscape through remote sensing. Here we present results from recent efforts to develop an optimal method that combines MODIS fractional snow covered area (MODSCAG) with retrievals of SWE from space borne microwave data (AMSR2) over the Hindu Kush Himalaya, which is further combined with MODIS dust radiative forcing (MODDRFS) to monitor rate of snow darkening, and provide a simple snowmelt metric that informs the contribution to melt by light absorbing particulates like dust and black carbon. For data consistency we are using 8 day composites of all products, and therefore the difference from time step to time step is a weekly, first order approximation, of the amount of SWE lost or gained from the region. MODIS retrievals are valuable for studying the hydrology of South Asia because there are mature sub kilometer scale products for the reflectance and fractional extent of the snow cover, the melt from which is mainly controlled by net solar radiation. The value of retrievals of SWE from space borne microwave data is less well established due to numerous sources of error (e.g. grain size and density, forest obscuration, penetration depth reduction, saturation) and the coarse 25 km spatial scale, which cannot capture the variation in SWE at the scale of individual mountain massifs. Despite these limitations it is currently the only available satellite based SWE product. This research effort is part of a larger NASA-SERVIR project that aims to join SWE estimates from MODIS and AMSR2, subsurface water storage variations from GRACE, and the RAPID river routing model to assess water

  2. A comprehensive overview of computational resources to aid in precision genome editing with engineered nucleases.

    Science.gov (United States)

    Periwal, Vinita

    2017-07-01

Genome editing with engineered nucleases (zinc finger nucleases, TAL effector nucleases, and clustered regularly interspaced short palindromic repeats/CRISPR-associated) has recently been shown to have great promise in a variety of therapeutic and biotechnological applications. However, their exploitation in genetic analysis and clinical settings largely depends on their specificity for the intended genomic target. Large and complex genomes often contain highly homologous/repetitive sequences, which limits the specificity of genome editing tools and could result in off-target activity. Over the past few years, various computational approaches have been developed to assist the design process and predict/reduce the off-target activity of these nucleases. These tools could be efficiently used to guide the design of constructs for engineered nucleases and evaluate results after genome editing. This review provides a comprehensive overview of various databases, tools, web servers and resources for genome editing and compares their features and functionalities. Additionally, it also describes tools that have been developed to analyse post-genome editing results. The article also discusses important design parameters that could be considered while designing these nucleases. This review is intended to be a quick reference guide for experimentalists as well as computational biologists working in the field of genome editing with engineered nucleases. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  3. Microdiamond grade as a regionalised variable - some basic requirements for successful local microdiamond resource estimation of kimberlites

    Science.gov (United States)

    Stiefenhofer, Johann; Thurston, Malcolm L.; Bush, David E.

    2018-04-01

Microdiamonds offer several advantages as a resource estimation tool, such as access to deeper parts of a deposit which may be beyond the reach of large diameter drilling (LDD) techniques, the recovery of the total diamond content in the kimberlite, and a cost benefit due to the cheaper treatment cost compared to large diameter samples. In this paper we take the first step towards local estimation by showing that microdiamond samples can be treated as a regionalised variable suitable for use in geostatistical applications, and we show examples of such output. Examples of microdiamond variograms are presented, the variance-support relationship for microdiamonds is demonstrated and consistency of the diamond size frequency distribution (SFD) is shown with the aid of real datasets. The focus therefore is on why local microdiamond estimation should be possible, not how to generate such estimates. Data from our case studies and examples demonstrate a positive correlation between micro- and macrodiamond sample grades as well as block estimates. This relationship can be demonstrated repeatedly across multiple mining operations. The smaller sample support size for microdiamond samples is a key difference between micro- and macrodiamond estimates and this aspect must be taken into account during the estimation process. We discuss three methods which can be used to validate or reconcile the estimates against macrodiamond data, either as estimates or in the form of production grades: (i) reconciliation using production data, (ii) comparison of LDD-based grade estimates against microdiamond-based estimates, and (iii) simulation techniques.
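
Treating grades as a regionalised variable starts with an empirical semivariogram; a minimal 1-D version on synthetic sample grades is sketched below (a real workflow would be 3-D, declustered and fitted with a variogram model).

```python
# Minimal empirical semivariogram for sample grades treated as a 1-D
# regionalised variable; the sample locations and "grades" are synthetic.
import numpy as np

def semivariogram(coords, values, lags, tol):
    gamma = []
    for h in lags:
        sq_diffs = []
        for i in range(len(coords)):
            for j in range(i + 1, len(coords)):
                if abs(abs(coords[i] - coords[j]) - h) <= tol:   # pairs near lag h
                    sq_diffs.append((values[i] - values[j]) ** 2)
        gamma.append(0.5 * np.mean(sq_diffs) if sq_diffs else np.nan)
    return np.array(gamma)

rng = np.random.default_rng(4)
x = np.sort(rng.uniform(0, 100, 40))                  # sample locations (m)
grade = 5 + np.cumsum(rng.normal(0, 0.5, 40))         # spatially correlated grades
lags = np.array([5, 10, 20, 40])
print(dict(zip(lags.tolist(), np.round(semivariogram(x, grade, lags, tol=2.5), 2))))
```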

  4. Resources and costs for microbial sequence analysis evaluated using virtual machines and cloud computing.

    Directory of Open Access Journals (Sweden)

    Samuel V Angiuoli

Full Text Available The widespread popularity of genomic applications is threatened by the "bioinformatics bottleneck" resulting from uncertainty about the cost and infrastructure needed to meet increasing demands for next-generation sequence analysis. Cloud computing services have been discussed as potential new bioinformatics support systems but have not been evaluated thoroughly. We present benchmark costs and runtimes for common microbial genomics applications, including 16S rRNA analysis, microbial whole-genome shotgun (WGS) sequence assembly and annotation, WGS metagenomics and large-scale BLAST. Sequence dataset types and sizes were selected to correspond to outputs typically generated by small- to midsize facilities equipped with 454 and Illumina platforms, except for WGS metagenomics where sampling of Illumina data was used. Automated analysis pipelines, as implemented in the CloVR virtual machine, were used in order to guarantee transparency, reproducibility and portability across different operating systems, including the commercial Amazon Elastic Compute Cloud (EC2), which was used to attach real dollar costs to each analysis type. We found considerable differences in computational requirements, runtimes and costs associated with different microbial genomics applications. While all 16S analyses completed on a single-CPU desktop in under three hours, microbial genome and metagenome analyses utilized multi-CPU support of up to 120 CPUs on Amazon EC2, where each analysis completed in under 24 hours for less than $60. Representative datasets were used to estimate maximum data throughput on different cluster sizes and to compare costs between EC2 and comparable local grid servers. Although bioinformatics requirements for microbial genomics depend on dataset characteristics and the analysis protocols applied, our results suggest that smaller sequencing facilities (up to three Roche/454 or one Illumina GAIIx sequencer) invested in 16S rRNA amplicon sequencing

  5. Resources and costs for microbial sequence analysis evaluated using virtual machines and cloud computing.

    Science.gov (United States)

    Angiuoli, Samuel V; White, James R; Matalka, Malcolm; White, Owen; Fricke, W Florian

    2011-01-01

The widespread popularity of genomic applications is threatened by the "bioinformatics bottleneck" resulting from uncertainty about the cost and infrastructure needed to meet increasing demands for next-generation sequence analysis. Cloud computing services have been discussed as potential new bioinformatics support systems but have not been evaluated thoroughly. We present benchmark costs and runtimes for common microbial genomics applications, including 16S rRNA analysis, microbial whole-genome shotgun (WGS) sequence assembly and annotation, WGS metagenomics and large-scale BLAST. Sequence dataset types and sizes were selected to correspond to outputs typically generated by small- to midsize facilities equipped with 454 and Illumina platforms, except for WGS metagenomics where sampling of Illumina data was used. Automated analysis pipelines, as implemented in the CloVR virtual machine, were used in order to guarantee transparency, reproducibility and portability across different operating systems, including the commercial Amazon Elastic Compute Cloud (EC2), which was used to attach real dollar costs to each analysis type. We found considerable differences in computational requirements, runtimes and costs associated with different microbial genomics applications. While all 16S analyses completed on a single-CPU desktop in under three hours, microbial genome and metagenome analyses utilized multi-CPU support of up to 120 CPUs on Amazon EC2, where each analysis completed in under 24 hours for less than $60. Representative datasets were used to estimate maximum data throughput on different cluster sizes and to compare costs between EC2 and comparable local grid servers. Although bioinformatics requirements for microbial genomics depend on dataset characteristics and the analysis protocols applied, our results suggest that smaller sequencing facilities (up to three Roche/454 or one Illumina GAIIx sequencer) invested in 16S rRNA amplicon sequencing, microbial single

  6. Computationally Efficient 2D DOA Estimation for L-Shaped Array with Unknown Mutual Coupling

    Directory of Open Access Journals (Sweden)

    Yang-Yang Dong

    2018-01-01

Full Text Available Although L-shaped array can provide good angle estimation performance and is easy to implement, its two-dimensional (2D) direction-of-arrival (DOA) performance degrades greatly in the presence of mutual coupling. To deal with the mutual coupling effect, a novel 2D DOA estimation method for L-shaped array with low computational complexity is developed in this paper. First, we generalize the conventional mutual coupling model for L-shaped array and compensate the mutual coupling blindly via sacrificing a few sensors as auxiliary elements. Then we apply the propagator method twice to mitigate the effect of strong source signal correlation. Finally, the estimations of azimuth and elevation angles are achieved simultaneously without pair matching via the complex eigenvalue technique. Compared with the existing methods, the proposed method is computationally efficient without spectrum search or polynomial rooting and also has fine angle estimation performance for highly correlated source signals. Theoretical analysis and simulation results have demonstrated the effectiveness of the proposed method.

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  8. Application of Soft Computing Methods for the Estimation of Roadheader Performance from Schmidt Hammer Rebound Values

    Directory of Open Access Journals (Sweden)

    Hadi Fattahi

    2017-01-01

Full Text Available Estimation of roadheader performance is one of the main topics in determining the economics of underground excavation projects. Poor performance estimation of roadheaders can lead to costly contractual claims. In this paper, the application of soft computing methods for data analysis, namely the adaptive neuro-fuzzy inference system-subtractive clustering method (ANFIS-SCM) and an artificial neural network (ANN) optimized by a hybrid particle swarm optimization and genetic algorithm (HPSOGA), to estimate roadheader performance is demonstrated. The data to show the applicability of these methods were collected from tunnels for Istanbul’s sewage system, Turkey. Two estimation models based on ANFIS-SCM and ANN-HPSOGA were developed. In these models, Schmidt hammer rebound values and rock quality designation (RQD) were utilized as the input parameters, and net cutting rates constituted the output parameter. Various statistical performance indices were used to compare the performance of those estimation models. The results indicated that the ANFIS-SCM model has strong potential to estimate roadheader performance with high degrees of accuracy and robustness.

  9. A Computable Plug-In Estimator of Minimum Volume Sets for Novelty Detection

    KAUST Repository

    Park, Chiwoo

    2010-10-01

A minimum volume set of a probability density is a region of minimum size among the regions covering a given probability mass of the density. Effective methods for finding the minimum volume sets are very useful for detecting failures or anomalies in commercial and security applications, a problem known as novelty detection. One theoretical approach of estimating the minimum volume set is to use a density level set where a kernel density estimator is plugged into the optimization problem that yields the appropriate level. Such a plug-in estimator is not of practical use because solving the corresponding minimization problem is usually intractable. A modified plug-in estimator was proposed by Hyndman in 1996 to overcome the computation difficulty of the theoretical approach but is not well studied in the literature. In this paper, we provide theoretical support to this estimator by showing its asymptotic consistency. We also show that this estimator is very competitive to other existing novelty detection methods through an extensive empirical study. ©2010 INFORMS.
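
The Hyndman-style plug-in idea can be sketched directly: fit a kernel density estimate, evaluate it at the sample points, and take the empirical α-quantile of those density values as the level, flagging test points below it as novelties. The data here are synthetic and the bandwidth is scipy's default, so this is an illustration of the approach rather than the paper's estimator.

```python
# Plug-in minimum-volume-set sketch: threshold a KDE at the empirical
# alpha-quantile of its values on the training sample; test points with lower
# density are flagged as novelties. Training and test data are synthetic.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(5)
train = rng.normal(0.0, 1.0, (2, 500))                # 2-D nominal data, shape (d, N)

kde = gaussian_kde(train)
level = np.quantile(kde(train), 0.05)                 # region covers ~95% of the mass

test = np.array([[0.2, 4.5], [-0.3, 0.1]]).T          # one outlier, one inlier (columns)
is_novel = kde(test) < level
print(dict(zip(["(0.2, 4.5)", "(-0.3, 0.1)"], is_novel.tolist())))
```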

  10. Computer-aided biomarker discovery for precision medicine: data resources, models and applications.

    Science.gov (United States)

    Lin, Yuxin; Qian, Fuliang; Shen, Li; Chen, Feifei; Chen, Jiajia; Shen, Bairong

    2017-11-29

    Biomarkers are a class of measurable and evaluable indicators with the potential to predict disease initiation and progression. In contrast to disease-associated factors, biomarkers hold the promise to capture the changeable signatures of biological states. With methodological advances, computer-aided biomarker discovery has now become a burgeoning paradigm in the field of biomedical science. In recent years, the 'big data' term has accumulated for the systematical investigation of complex biological phenomena and promoted the flourishing of computational methods for systems-level biomarker screening. Compared with routine wet-lab experiments, bioinformatics approaches are more efficient to decode disease pathogenesis under a holistic framework, which is propitious to identify biomarkers ranging from single molecules to molecular networks for disease diagnosis, prognosis and therapy. In this review, the concept and characteristics of typical biomarker types, e.g. single molecular biomarkers, module/network biomarkers, cross-level biomarkers, etc., are explicated on the guidance of systems biology. Then, publicly available data resources together with some well-constructed biomarker databases and knowledge bases are introduced. Biomarker identification models using mathematical, network and machine learning theories are sequentially discussed. Based on network substructural and functional evidences, a novel bioinformatics model is particularly highlighted for microRNA biomarker discovery. This article aims to give deep insights into the advantages and challenges of current computational approaches for biomarker detection, and to light up the future wisdom toward precision medicine and nation-wide healthcare. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  11. Particle approximations of the score and observed information matrix for parameter estimation in state space models with linear computational cost

    OpenAIRE

    Nemeth, Christopher; Fearnhead, Paul; Mihaylova, Lyudmila

    2013-01-01

    Poyiadjis et al. (2011) show how particle methods can be used to estimate both the score and the observed information matrix for state space models. These methods either suffer from a computational cost that is quadratic in the number of particles, or produce estimates whose variance increases quadratically with the amount of data. This paper introduces an alternative approach for estimating these terms at a computational cost that is linear in the number of particles. The method is derived u...

  12. A high resolution global wind atlas - improving estimation of world wind resources

    DEFF Research Database (Denmark)

    Badger, Jake; Ejsing Jørgensen, Hans

    2011-01-01

    data and the tools necessary are present, so the time is right to link the parts together to create a much needed dataset. Geospatial information systems (GIS) will be one of the significant applications of the Global Wind Atlas datasets. As location of wind resource, and its relationships...... resources. These aspects will also be addressed by the Global Wind Atlas. The Global Wind Atlas, through a transparent methodology, will provide a unified, high resolution, and public domain dataset of wind energy resources for the whole world. The wind atlas data will be the most appropriate wind resource...

  13. IMPROVING RESOURCE UTILIZATION USING QoS BASED LOAD BALANCING ALGORITHM FOR MULTIPLE WORKFLOWS IN IAAS CLOUD COMPUTING ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    L. Shakkeera

    2013-06-01

Full Text Available Cloud computing is the extension of parallel computing, distributed computing and grid computing. It provides secure, quick, convenient data storage and net computing services through the internet. The services are available to users in a pay-per-use, on-demand model. The main aim of using resources from the cloud is to reduce the cost and to increase the performance in terms of request response time. Thus, optimizing resource usage through an efficient load balancing strategy is crucial. The main aim of this paper is to develop and implement an Optimized Load Balancing algorithm in an IaaS virtual cloud environment that aims to utilize the virtual cloud resources efficiently. It minimizes the cost of the applications by effectively using cloud resources and identifies the virtual cloud resources that are suitable for all the applications. The web application is created with many modules. These modules are considered as tasks and these tasks are submitted to the load balancing server. The server, which contains our load balancing policies, redirects the tasks to the corresponding virtual machines created by the KVM virtual machine manager as per the load balancing algorithm. If the size of the database inside a machine exceeds its limit, the load balancing algorithm uses the other virtual machines for further incoming requests. The load balancing strategy is evaluated for various QoS performance metrics such as cost, average execution time, throughput, CPU usage, disk space, memory usage, network transmission and reception rate, resource utilization rate and scheduling success rate against the number of virtual machines, and it improves the scalability among resources using load balancing techniques.
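
A toy least-loaded dispatcher conveys the core of such a load-balancing server: each incoming task goes to the virtual machine with the smallest current load that still has capacity. The VM names, capacities and task demands below are invented and the policy is a generic one, not the paper's algorithm.

```python
# Toy least-loaded dispatcher: place each task on the least-loaded VM that can
# still accommodate it. VM names, capacities and task demands are invented.
def dispatch(tasks, vm_capacity):
    load = {vm: 0.0 for vm in vm_capacity}
    placement = {}
    for task, demand in tasks:
        # candidate VMs that can still accommodate this task
        candidates = [vm for vm in load if load[vm] + demand <= vm_capacity[vm]]
        if not candidates:
            placement[task] = None                   # would be queued or rejected
            continue
        vm = min(candidates, key=lambda v: load[v])  # least-loaded first
        load[vm] += demand
        placement[task] = vm
    return placement, load

tasks = [("t1", 2.0), ("t2", 1.0), ("t3", 3.0), ("t4", 2.5)]
print(dispatch(tasks, {"vm1": 4.0, "vm2": 4.0}))
```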

  14. SARANA: language, compiler and run-time system support for spatially aware and resource-aware mobile computing.

    Science.gov (United States)

    Hari, Pradip; Ko, Kevin; Koukoumidis, Emmanouil; Kremer, Ulrich; Martonosi, Margaret; Ottoni, Desiree; Peh, Li-Shiuan; Zhang, Pei

    2008-10-28

    Increasingly, spatial awareness plays a central role in many distributed and mobile computing applications. Spatially aware applications rely on information about the geographical position of compute devices and their supported services in order to support novel functionality. While many spatial application drivers already exist in mobile and distributed computing, very little systems research has explored how best to program these applications, to express their spatial and temporal constraints, and to allow efficient implementations on highly dynamic real-world platforms. This paper proposes the SARANA system architecture, which includes language and run-time system support for spatially aware and resource-aware applications. SARANA allows users to express spatial regions of interest, as well as trade-offs between quality of result (QoR), latency and cost. The goal is to produce applications that use resources efficiently and that can be run on diverse resource-constrained platforms ranging from laptops to personal digital assistants and to smart phones. SARANA's run-time system manages QoR and cost trade-offs dynamically by tracking resource availability and locations, brokering usage/pricing agreements and migrating programs to nodes accordingly. A resource cost model permeates the SARANA system layers, permitting users to express their resource needs and QoR expectations in units that make sense to them. Although we are still early in the system development, initial versions have been demonstrated on a nine-node system prototype.

  15. Cerebral methodology based computing to estimate real phenomena from large-scale nuclear simulation

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2011-01-01

    Our final goal is to estimate real phenomena from large-scale nuclear simulations by using computing processes. Large-scale simulations mean that they include scale variety and physical complexity, so that corresponding experiments and/or theories do not exist. In the nuclear field, it is indispensable to estimate real phenomena from simulations in order to improve the safety and security of nuclear power plants. Here, analysis of the uncertainty included in simulations is needed to reveal the sensitivity of uncertainty due to randomness, to reduce the uncertainty due to lack of knowledge, and to reach a degree of certainty through verification and validation (V and V) and uncertainty quantification (UQ) processes. To realize this, we propose 'Cerebral Methodology based Computing (CMC)' as a set of computing processes with deductive and inductive approaches, by reference to human reasoning processes. Our idea is to execute deductive and inductive simulations contrasted with deductive and inductive approaches. We have established a prototype system and applied it to a thermal displacement analysis of a nuclear power plant. The result shows that our idea is effective in reducing the uncertainty and in obtaining a degree of certainty. (author)

  16. Coupled Crop/Hydrology Model to Estimate Expanded Irrigation Impact on Water Resources

    Science.gov (United States)

    Handyside, C. T.; Cruise, J.

    2017-12-01

    A coupled agricultural and hydrologic systems model is used to examine the environmental impact of irrigation in the Southeast. A gridded crop model for the Southeast is used to determine regional irrigation demand. This irrigation demand is used in a regional hydrologic model to determine the hydrologic impact of irrigation. For the Southeast to maintain/expand irrigated agricultural production and provide adaptation to climate change and climate variability, it will require integrated agricultural and hydrologic system models that can calculate irrigation demand and the impact of this demand on the river hydrology. These integrated models can be used as (1) historical tools to examine vulnerability of expanded irrigation to past climate extremes, (2) future tools to examine the sustainability of expanded irrigation under future climate scenarios, and (3) real-time tools to allow dynamic water resource management. Such tools are necessary to assure stakeholders and the public that irrigation can be carried out in a sustainable manner. The system tools to be discussed include a gridded version of the crop modeling system (DSSAT). The gridded model is referred to as GriDSSAT. The irrigation demand from GriDSSAT is coupled to a regional hydrologic model (WaSSI) developed by the Eastern Forest Environmental Threat Assessment Center of the USDA Forest Service. The crop model provides the dynamic irrigation demand, which is a function of the weather. The hydrologic model includes all other competing uses of water. Examples of the use of the crop model coupled with the hydrologic model include historical analyses which show the change in hydrology as additional acres of irrigated land are added to watersheds. The first-order change in hydrology is computed in terms of changes in the Water Availability Stress Index (WaSSI), which is the ratio of water demand (irrigation, public water supply, industrial use, etc.) to water availability from the hydrologic model. Also

  17. A Novel Sampling Method for Satellite-Based Offshore Wind Resource Estimation

    DEFF Research Database (Denmark)

    Badger, Merete; Badger, Jake; Hasager, Charlotte Bay

    of wind resources. The method is applied within a wind and solar resource assessment study for the United Arab Emirates funded by MASDAR and coordinated by UNEP. Thirty years of NCEP/NCAR reanalysis data are used to define approximately 100 geostrophic wind classes. These wind classes show...

  18. Reciprocal Estimation of Pedestrian Location and Motion State toward a Smartphone Geo-Context Computing Solution

    Directory of Open Access Journals (Sweden)

    Jingbin Liu

    2015-06-01

    Full Text Available The rapid advance in mobile communications has made information and services ubiquitously accessible. Location and context information have become essential for the effectiveness of services in the era of mobility. This paper proposes the concept of geo-context that is defined as an integral synthesis of geographical location, human motion state and mobility context. A geo-context computing solution consists of a positioning engine, a motion state recognition engine, and a context inference component. In the geo-context concept, the human motion states and mobility context are associated with the geographical location where they occur. A hybrid geo-context computing solution is implemented that runs on a smartphone, and it utilizes measurements of multiple sensors and signals of opportunity that are available within a smartphone. Pedestrian location and motion states are estimated jointly under the framework of hidden Markov models, and they are used in a reciprocal manner to improve their estimation performance of one another. It is demonstrated that pedestrian location estimation has better accuracy when its motion state is known, and in turn, the performance of motion state recognition can be improved with increasing reliability when the location is given. The geo-context inference is implemented simply with the expert system principle, and more sophisticated approaches will be developed.
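
    As a toy illustration of the joint estimation idea sketched in this abstract, the hypothetical example below couples a coarse location cell with a motion state in a single hidden Markov model and runs the standard forward recursion; all state names, transition and emission probabilities are invented and do not come from the paper.

```python
import numpy as np

# joint hidden states: (location cell, motion state) -- hypothetical labels
states = [("cell_A", "walking"), ("cell_A", "still"),
          ("cell_B", "walking"), ("cell_B", "still")]
T = np.array([[0.6, 0.2, 0.2, 0.0],   # assumed transition probabilities
              [0.3, 0.6, 0.0, 0.1],
              [0.2, 0.0, 0.6, 0.2],
              [0.0, 0.1, 0.3, 0.6]])
# emission likelihoods for two observation symbols
# (e.g. "high accelerometer variance" vs "low variance"), also assumed
E = np.array([[0.8, 0.2],
              [0.1, 0.9],
              [0.8, 0.2],
              [0.1, 0.9]])
obs = [0, 0, 1, 1, 0]                 # an example observed symbol sequence

# forward algorithm with per-step normalisation
alpha = np.full(len(states), 0.25) * E[:, obs[0]]
alpha /= alpha.sum()
for o in obs[1:]:
    alpha = (alpha @ T) * E[:, o]
    alpha /= alpha.sum()

best = states[int(np.argmax(alpha))]
print(f"most probable joint (location, motion) state: {best}")
```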

  19. Analytical solution and computer program (FAST) to estimate fluid fluxes from subsurface temperature profiles

    Science.gov (United States)

    Kurylyk, Barret L.; Irvine, Dylan J.

    2016-02-01

    This study details the derivation and application of a new analytical solution to the one-dimensional, transient conduction-advection equation that is applied to trace vertical subsurface fluid fluxes. The solution employs a flexible initial condition that allows for nonlinear temperature-depth profiles, providing a key improvement over most previous solutions. The boundary condition is composed of any number of superimposed step changes in surface temperature, and thus it accommodates intermittent warming and cooling periods due to long-term changes in climate or land cover. The solution is verified using an established numerical model of coupled groundwater flow and heat transport. A new computer program FAST (Flexible Analytical Solution using Temperature) is also presented to facilitate the inversion of this analytical solution to estimate vertical groundwater flow. The program requires surface temperature history (which can be estimated from historic climate data), subsurface thermal properties, a present-day temperature-depth profile, and reasonable initial conditions. FAST is written in the Python computing language and can be run using a free graphical user interface. Herein, we demonstrate the utility of the analytical solution and FAST using measured subsurface temperature and climate data from the Sendai Plain, Japan. Results from these illustrative examples highlight the influence of the chosen initial and boundary conditions on estimated vertical flow rates.
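
    The abstract describes a transient conduction-advection solution; as a simpler, steady-state analogue of estimating vertical groundwater flux from a temperature-depth profile, the sketch below fits the classic Bredehoeft-Papadopulos profile to synthetic data. The thermal properties, depths, flux and noise level are assumed example values, not inputs to FAST.

```python
import numpy as np
from scipy.optimize import curve_fit

RHO_C_WATER = 4.18e6   # volumetric heat capacity of water, J m^-3 K^-1 (assumed)
LAMBDA = 2.0           # bulk thermal conductivity, W m^-1 K^-1 (assumed)
L = 40.0               # profile depth, m
T_TOP, T_BOT = 11.0, 13.0   # boundary temperatures, deg C

def steady_profile(z, beta):
    # Bredehoeft-Papadopulos steady-state temperature profile
    return T_TOP + (T_BOT - T_TOP) * np.expm1(beta * z / L) / np.expm1(beta)

# synthetic "measured" profile for a downward Darcy flux of 5e-8 m/s plus noise
z_obs = np.linspace(2.0, 38.0, 19)
beta_true = RHO_C_WATER * 5e-8 * L / LAMBDA
T_obs = steady_profile(z_obs, beta_true) + np.random.normal(0, 0.02, z_obs.size)

beta_fit, _ = curve_fit(steady_profile, z_obs, T_obs, p0=[0.5])
q_z = beta_fit[0] * LAMBDA / (RHO_C_WATER * L)   # Darcy flux, m/s (positive down)
print(f"estimated vertical flux: {q_z:.2e} m/s")
```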

  20. Computational cost estimates for parallel shared memory isogeometric multi-frontal solvers

    KAUST Repository

    Woźniak, Maciej

    2014-06-01

    In this paper we present computational cost estimates for parallel shared memory isogeometric multi-frontal solvers. The estimates show that the ideal isogeometric shared memory parallel direct solver scales as O(p^2 log(N/p)) for one-dimensional problems, O(N p^2) for two-dimensional problems, and O(N^(4/3) p^2) for three-dimensional problems, where N is the number of degrees of freedom and p is the polynomial order of approximation. The computational costs of the shared memory parallel isogeometric direct solver are compared with those of the sequential isogeometric direct solver, the latter being equal to O(N p^2) for the one-dimensional case, O(N^1.5 p^3) for the two-dimensional case, and O(N^2 p^3) for the three-dimensional case. The shared memory version significantly reduces the computational cost in terms of both N and p. Theoretical estimates are compared with numerical experiments performed with linear, quadratic, cubic, quartic, and quintic B-splines, in one and two spatial dimensions. © 2014 Elsevier Ltd. All rights reserved.
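
    A quick numerical reading of the cost estimates quoted above can be made by evaluating the asymptotic expressions for sample values of N and p; constant factors are ignored, so only the relative growth is meaningful.

```python
import math

def parallel_cost(N, p, dim):
    # ideal shared-memory parallel solver estimates quoted in the abstract
    return {1: p**2 * math.log(N / p),
            2: N * p**2,
            3: N**(4 / 3) * p**2}[dim]

def sequential_cost(N, p, dim):
    # sequential solver estimates quoted in the abstract
    return {1: N * p**2,
            2: N**1.5 * p**3,
            3: N**2 * p**3}[dim]

N, p = 10**6, 3   # example: one million degrees of freedom, cubic B-splines
for dim in (1, 2, 3):
    ratio = sequential_cost(N, p, dim) / parallel_cost(N, p, dim)
    print(f"{dim}D: sequential/parallel cost ratio ~ {ratio:.2e}")
```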

  1. Preoperative computed tomography volumetry and graft weight estimation in adult living donor liver transplantation

    Energy Technology Data Exchange (ETDEWEB)

    Pinheiro, Rafael S.; Cruz Junior, Ruy J.; Andraus, Wellington; Ducatti, Liliana; Martino, Rodrigo B.; Nacif, Lucas S.; Rocha-Santos, Vinicius; Arantes, Rubens M.; D'Albuquerque, Luiz A.C., E-mail: rsnpinheiro@gmail.com [Universidade de Sao Paulo (USP), SP (Brazil). Dept. de Gastroenterologia. Div. de Transplante de Orgaos do Aparelho Digestivo; Lai, Quirino [Universidade de L'Aquila, San Salvatore Hospital (Italy); Ibuki, Felicia S.; Rocha, Manoel S. [Universidade de Sao Paulo (USP), SP (Brazil). Departamento de Radiologia

    2017-09-01

    Background: Computed tomography volumetry (CTV) is a useful tool for predicting graft weights (GW) for living donor liver transplantation (LDLT). Few studies have examined the correlation between CTV and GW in normal liver parenchyma. Aim: To analyze the correlation between CTV and GW in an adult LDLT population and provide a systematic review of the existing mathematical models to calculate partial liver graft weight. Methods: Between January 2009 and January 2013, 28 consecutive donors undergoing right hepatectomy for LDLT were retrospectively reviewed. All grafts were perfused with HTK solution. Graft volume was estimated by CTV, and these values were compared to the actual graft weight, which was measured after liver harvesting and perfusion. Results: Median actual GW was 782.5 g, averaged 791.43±136 g and ranged from 520-1185 g. Median estimated graft volume was 927.5 ml, averaged 944.86±200.74 ml and ranged from 600-1477 ml. Linear regression of estimated graft volume and actual GW was significantly linear (GW = 0.82 × estimated graft volume, r^2 = 0.98, slope = 0.47, standard deviation of 0.024, p<0.0001). The Spearman correlation was 0.65 with a 95% CI of 0.45-0.99 (p<0.0001). Conclusion: The one-to-one rule did not apply in patients with normal liver parenchyma. A better estimation of graft weight could be reached by multiplying the estimated graft volume by 0.82. (author)
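
    The reported correction can be applied directly: multiply the CT-derived graft volume by 0.82 instead of assuming a one-to-one rule. A minimal sketch (the median CTV value below is taken from the abstract; the function name is ours):

```python
def estimated_graft_weight(ctv_ml: float, factor: float = 0.82) -> float:
    """Predicted graft weight in grams from CT volumetry in millilitres."""
    return factor * ctv_ml

# median estimated graft volume reported in the study -> roughly 760 g
print(f"{estimated_graft_weight(927.5):.0f} g")
```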

  2. Preoperative computed tomography volumetry and graft weight estimation in adult living donor liver transplantation

    International Nuclear Information System (INIS)

    Pinheiro, Rafael S.; Cruz Junior, Ruy J.; Andraus, Wellington; Ducatti, Liliana; Martino, Rodrigo B.; Nacif, Lucas S.; Rocha-Santos, Vinicius; Arantes, Rubens M.; D'Albuquerque, Luiz A.C.; Ibuki, Felicia S.; Rocha, Manoel S.

    2017-01-01

    Background: Computed tomography volumetry (CTV) is a useful tool for predicting graft weights (GW) for living donor liver transplantation (LDLT). Few studies have examined the correlation between CTV and GW in normal liver parenchyma. Aim: To analyze the correlation between CTV and GW in an adult LDLT population and provide a systematic review of the existing mathematical models to calculate partial liver graft weight. Methods: Between January 2009 and January 2013, 28 consecutive donors undergoing right hepatectomy for LDLT were retrospectively reviewed. All grafts were perfused with HTK solution. Graft volume was estimated by CTV, and these values were compared to the actual graft weight, which was measured after liver harvesting and perfusion. Results: Median actual GW was 782.5 g, averaged 791.43±136 g and ranged from 520-1185 g. Median estimated graft volume was 927.5 ml, averaged 944.86±200.74 ml and ranged from 600-1477 ml. Linear regression of estimated graft volume and actual GW was significantly linear (GW = 0.82 × estimated graft volume, r^2 = 0.98, slope = 0.47, standard deviation of 0.024, p<0.0001). The Spearman correlation was 0.65 with a 95% CI of 0.45-0.99 (p<0.0001). Conclusion: The one-to-one rule did not apply in patients with normal liver parenchyma. A better estimation of graft weight could be reached by multiplying the estimated graft volume by 0.82. (author)

  3. Increasing the Accuracy of Latent Parameter Estimation in Human Resource Management Systems

    Directory of Open Access Journals (Sweden)

    O.N. Gustun

    2011-09-01

    Full Text Available An approach to building an adaptive testing system, in which initial estimates of test items are obtained through calibration testing, is considered. The maximum likelihood method is used to obtain these estimates. Optimization of the objective function is carried out using the Hooke-Jeeves method. The influence of various factors on the accuracy of the estimates obtained is investigated.
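
    For readers unfamiliar with the optimizer named above, the sketch below shows a generic Hooke-Jeeves pattern search minimizing a toy negative log-likelihood for a single test item; the two-parameter logistic objective and the response data are stand-ins of our own, not the calibration model used in the paper.

```python
import math

def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-6, max_iter=1000):
    """Derivative-free pattern search: exploratory moves plus pattern moves."""
    def explore(base, s):
        x = list(base)
        for i in range(len(x)):
            for cand in (x[i] + s, x[i] - s):
                trial = x[:]
                trial[i] = cand
                if f(trial) < f(x):
                    x = trial
                    break
        return x

    x_best = list(x0)
    while step > tol and max_iter > 0:
        max_iter -= 1
        x_new = explore(x_best, step)
        if f(x_new) < f(x_best):
            # pattern move: jump along the improving direction, then re-explore
            x_pat = [2 * n - b for n, b in zip(x_new, x_best)]
            x_best = x_new
            x_pat = explore(x_pat, step)
            if f(x_pat) < f(x_best):
                x_best = x_pat
        else:
            step *= shrink
    return x_best

# toy item: discrimination a and difficulty b for binary responses y
# observed at ability levels theta (all values hypothetical)
theta = [-1.0, -0.5, 0.0, 0.5, 1.0, 1.5]
y = [0, 1, 0, 1, 1, 1]

def neg_log_lik(params):
    a, b = params
    nll = 0.0
    for t, resp in zip(theta, y):
        p = 1.0 / (1.0 + math.exp(-a * (t - b)))
        p = min(max(p, 1e-12), 1 - 1e-12)
        nll -= resp * math.log(p) + (1 - resp) * math.log(1 - p)
    return nll

print(hooke_jeeves(neg_log_lik, [1.0, 0.0]))   # estimated (a, b)
```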

  4. Cross stratum resources protection in fog-computing-based radio over fiber networks for 5G services

    Science.gov (United States)

    Guo, Shaoyong; Shao, Sujie; Wang, Yao; Yang, Hui

    2017-09-01

    In order to meet the requirements of the internet of things (IoT) and 5G, the cloud radio access network is a paradigm which converges all base stations' computational resources into a cloud baseband unit (BBU) pool, while the distributed radio frequency signals are collected by remote radio heads (RRH). A precondition for centralized processing in the BBU pool is an interconnection fronthaul network with high capacity and low delay. However, the interaction between RRH and BBU and the resource scheduling among BBUs in the cloud have become more complex and frequent. A cloud radio over fiber network has already been proposed in our previous work. In order to overcome the complexity and latency, in this paper we first present a novel cross stratum resources protection (CSRP) architecture in fog-computing-based radio over fiber networks (F-RoFN) for 5G services. Additionally, a cross stratum protection (CSP) scheme considering the network survivability is introduced in the proposed architecture. The CSRP architecture with the CSP scheme can effectively pull remote processing resources to the local site to implement cooperative radio resource management, enhance the responsiveness and resilience to dynamic end-to-end 5G service demands, and globally optimize optical network, wireless and fog resources. The feasibility and efficiency of the proposed architecture with the CSP scheme are verified on our software defined networking testbed in terms of service latency, transmission success rate, resource occupation rate and blocking probability.

  5. Application of High Performance Computing to Earthquake Hazard and Disaster Estimation in Urban Area

    Directory of Open Access Journals (Sweden)

    Muneo Hori

    2018-02-01

    Full Text Available Integrated earthquake simulation (IES) is a seamless simulation that analyzes all processes of earthquake hazard and disaster. There are two difficulties in carrying out IES, namely, the requirement of large-scale computation and the requirement of numerous analysis models for structures in an urban area; they are solved by taking advantage of high performance computing (HPC) and by developing a system of automated model construction. HPC is a key element in developing IES, as it is needed to analyze wave propagation and amplification processes in an underground structure; a high-fidelity model of the underground structure has more than 100 billion degrees of freedom. Examples of IES for the Tokyo Metropolis are presented; the numerical computation is made using the K computer, the supercomputer of Japan. The estimation of earthquake hazard and disaster for a given earthquake scenario is made by the ground motion simulation and the urban area seismic response simulation, respectively, for a target area of 10,000 m × 10,000 m.

  6. Complex data modeling and computationally intensive methods for estimation and prediction

    CERN Document Server

    Secchi, Piercesare; Advances in Complex Data Modeling and Computational Methods in Statistics

    2015-01-01

    The book is addressed to statisticians working at the forefront of the statistical analysis of complex and high dimensional data and offers a wide variety of statistical models, computer intensive methods and applications: network inference from the analysis of high dimensional data; new developments for bootstrapping complex data; regression analysis for measuring the downsize reputational risk; statistical methods for research on the human genome dynamics; inference in non-euclidean settings and for shape data; Bayesian methods for reliability and the analysis of complex data; methodological issues in using administrative data for clinical and epidemiological research; regression models with differential regularization; geostatistical methods for mobility analysis through mobile phone data exploration. This volume is the result of a careful selection among the contributions presented at the conference "S.Co.2013: Complex data modeling and computationally intensive methods for estimation and prediction" held...

  7. The evidence of personal computer waste quantity in the territory of Serbia -statistical estimation

    Directory of Open Access Journals (Sweden)

    Tadić Branko

    2006-01-01

    Full Text Available In recent years, state-of-the-art research has been dealing with putting electrical and electronic equipment into circulation, withdrawing it, and freeing the environment from electrical and electronic equipment waste (WEEE). In our country there has been no serious research so far concerning this problem, although current and future members of the European Union (EU) are obligated to implement the WEEE directive based on the individual responsibility of each "waste manufacturer". The Ministry of Science and Environmental Protection of Serbia has accepted the financing of a scientific research project called "The development of an electrical and electronic equipment recycling system". In this paper, a statistical estimation method for the quantity and diffusion of computer waste (which, according to the EU classification, belongs to the third WEEE category - devices for computer and communication technology) in the territory of Serbia is described. The implications of the problem for our country are also presented.

  8. Estimation of kinetic and thermodynamic ligand-binding parameters using computational strategies.

    Science.gov (United States)

    Deganutti, Giuseppe; Moro, Stefano

    2017-04-01

    Kinetic and thermodynamic ligand-protein binding parameters are gaining growing importance as key information to consider in drug discovery. The determination of the molecular structures, using particularly x-ray and NMR techniques, is crucial for understanding how a ligand recognizes its target in the final binding complex. However, for a better understanding of the recognition processes, experimental studies of ligand-protein interactions are needed. Even though several techniques can be used to investigate both thermodynamic and kinetic profiles for a ligand-protein complex, these procedures are very often laborious, time consuming and expensive. In the last 10 years, computational approaches have enormous potential in providing insights into each of the above effects and in parsing their contributions to the changes in both kinetic and thermodynamic binding parameters. The main purpose of this review is to summarize the state of the art of computational strategies for estimating the kinetic and thermodynamic parameters of a ligand-protein binding.
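
    The kinetic and thermodynamic parameters discussed here are linked by a few textbook relations (Kd = koff/kon, dG = RT ln(Kd/c0), residence time = 1/koff), which the short sketch below evaluates for arbitrary example rate constants; it is not drawn from the review itself.

```python
import math

R = 8.314      # gas constant, J mol^-1 K^-1
T = 298.15     # temperature, K (assumed)
C0 = 1.0       # standard concentration, mol/L

def binding_summary(kon_per_M_s, koff_per_s):
    kd = koff_per_s / kon_per_M_s               # dissociation constant, M
    dg = R * T * math.log(kd / C0) / 1000.0     # standard binding free energy, kJ/mol
    residence = 1.0 / koff_per_s                # kinetic residence time, s
    return kd, dg, residence

kd, dg, tau = binding_summary(kon_per_M_s=1e6, koff_per_s=1e-2)  # example rates
print(f"Kd = {kd:.1e} M, dG = {dg:.1f} kJ/mol, residence time = {tau:.0f} s")
```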

  9. A computer-assisted procedure for estimating patient exposure and fetal dose in radiographic examinations

    International Nuclear Information System (INIS)

    Glaze, S.; Schneiders, N.; Bushong, S.C.

    1982-01-01

    A computer program for calculating patient entrance exposure and fetal dose for 11 common radiographic examinations was developed. The output intensity measured at 70 kVp and a 30-inch (76-cm) source-to-skin distance was entered into the program. The change in output intensity with changing kVp was examined for 17 single-phase and 12 three-phase x-ray units. The relationships obtained from a least squares regression analysis of the data, along with the technique factors for each examination, were used to calculate patient exposure. Fetal dose was estimated using published fetal dose in mrad (10^-5 Gy) per 1,000 mR (258 μC/kg) entrance exposure values. The computations are fully automated and individualized to each radiographic unit. The information provides a ready reference in large institutions and is particularly useful at smaller facilities that do not have available physicists who can make the calculations immediately.
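
    A hedged sketch of the kind of calculation the abstract describes: scale a measured tube output with kVp using a fitted relationship, apply the mAs and an inverse-square distance correction to obtain entrance exposure, then convert to fetal dose with a published factor. Every constant and function name below is a placeholder for illustration, not a value or routine from the paper.

```python
def entrance_exposure_mR(output_70kvp_mR_per_mAs, kvp, mAs, ssd_cm,
                         kvp_exponent=2.0, ref_kvp=70.0, ref_ssd_cm=76.0):
    """Entrance skin exposure in mR for one radiographic exposure (illustrative)."""
    # assumed power-law scaling of tube output with kVp from a regression fit
    output = output_70kvp_mR_per_mAs * (kvp / ref_kvp) ** kvp_exponent
    # multiply by mAs and correct for source-to-skin distance (inverse square)
    return output * mAs * (ref_ssd_cm / ssd_cm) ** 2

def fetal_dose_mrad(entrance_mR, mrad_per_1000mR):
    """Fetal dose via a published conversion factor (mrad per 1000 mR entrance)."""
    return entrance_mR * mrad_per_1000mR / 1000.0

exposure = entrance_exposure_mR(output_70kvp_mR_per_mAs=10.0, kvp=80, mAs=40, ssd_cm=76)
print(f"entrance exposure ~ {exposure:.0f} mR, "
      f"fetal dose ~ {fetal_dose_mrad(exposure, 300):.0f} mrad")
```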

  10. A computer-assisted procedure for estimating patient exposure and fetal dose in radiographic examinations

    International Nuclear Information System (INIS)

    Glaze, S.; Schneiders, N.; Bushong, S.C.

    1982-01-01

    A computer program for calculating patient entrance exposure and fetal dose for 11 common radiographic examinations was developed. The output intensity measured at 70 kVp and a 30-inch (76-cm) source-to-skin distance was entered into the program. The change in output intensity with changing kVp was examined for 17 single-phase and 12 three-phase x-ray units. The relationships obtained from a least squares regression analysis of the data, along with the technique factors for each examination, were used to calculate patient exposure. Fetal dose was estimated using published fetal dose in mrad (10^-5 Gy) per 1,000 mR (258 μC/kg) entrance exposure values. The computations are fully automated and individualized to each radiographic unit. The information provides a ready reference in large institutions and is particularly useful at smaller facilities that do not have available physicists who can make the calculations immediately.

  11. EFFAIR: a computer program for estimating the dispersion of atmospheric emissions from a nuclear site

    International Nuclear Information System (INIS)

    Dormuth, K.W.; Lyon, R.B.

    1978-11-01

    Analysis of the transport of material through the turbulent atmospheric boundary layer is an important part of environmental impact assessments for nuclear plants. Although this is a complex phenomenon, practical estimates of ground level concentrations downwind of release are usually obtained using a simple Gaussian formula whose coefficients are obtained from empirical correlations. Based on this formula, the computer program EFFAIR has been written to provide a flexible tool for atmospheric dispersion calculations. It is considered appropriate for calculating dilution factors at distances of 10^2 to 10^4 metres from an effluent source if reflection from the inversion lid is negligible in that range. (author)
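
    The "simple Gaussian formula" referred to above is commonly written, for the ground-level centreline concentration from an elevated continuous release with reflection at the ground, as C = Q/(pi*u*sigma_y*sigma_z) * exp(-H^2/(2*sigma_z^2)). The sketch below evaluates it with assumed power-law dispersion coefficients; EFFAIR's actual empirical correlations are not reproduced here.

```python
import math

def ground_level_concentration(Q, u, H, x, a_y=0.08, b_y=0.90, a_z=0.06, b_z=0.92):
    """Ground-level centreline concentration at downwind distance x [m] (illustrative)."""
    sigma_y = a_y * x ** b_y   # horizontal dispersion coefficient, m (assumed fit)
    sigma_z = a_z * x ** b_z   # vertical dispersion coefficient, m (assumed fit)
    return (Q / (math.pi * u * sigma_y * sigma_z)) * math.exp(-H**2 / (2 * sigma_z**2))

# dilution factor chi/Q at 100 m .. 10 km for a 30 m stack and 3 m/s wind (example)
for x in (1e2, 1e3, 1e4):
    chi_over_Q = ground_level_concentration(1.0, 3.0, 30.0, x)
    print(f"x = {x:7.0f} m   chi/Q = {chi_over_Q:.3e} s/m^3")
```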

  12. BAESNUM, a conversational computer program for the Bayesian estimation of a parameter by a numerical method

    International Nuclear Information System (INIS)

    Colombo, A.G.; Jaarsma, R.J.

    1982-01-01

    This report describes a conversational computer program which, via Bayes' theorem, numerically combines the prior distribution of a parameter with a likelihood function. Any type of prior and likelihood function can be considered. The present version of the program includes six types of prior and employs the binomial likelihood. As input the program requires the law and parameters of the prior distribution and the sample data. As output it gives the posterior distribution as a histogram. The use of the program for estimating the constant failure rate of an item is briefly described
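
    The numerical combination of a prior with a binomial likelihood, as described above, can be illustrated on a discretised parameter grid with the posterior returned as a normalised histogram; the prior shape and the sample data below are invented and are only one of the prior types such a program might support.

```python
import numpy as np

theta = np.linspace(0.001, 0.25, 250)          # grid of candidate failure probabilities
prior = np.exp(-theta / 0.05)                  # one possible (assumed) prior shape
prior /= prior.sum()

k, n = 2, 40                                   # example: 2 failures in 40 demands
likelihood = theta**k * (1 - theta)**(n - k)   # binomial likelihood (constant dropped)

posterior = prior * likelihood                 # Bayes' theorem, then renormalise
posterior /= posterior.sum()

mean = (theta * posterior).sum()
print(f"posterior mean failure probability ~ {mean:.3f}")
```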

  13. Tridimensional modelling and resource estimation of the mining waste piles of São Domingos mine, Iberian Pyrite Belt, Portugal

    Science.gov (United States)

    Vieira, Alexandre; Matos, João; Lopes, Luis; Martins, Ruben

    2016-04-01

    Located in the northern sector of the Iberian Pyrite Belt (IPB), near the Portuguese/Spanish border, the outcropping São Domingos deposit has been mined since Roman times. Between 1854 and 1966 the Mason & Barry Company developed open pit excavation down to a depth of 120 m and underground mining down to a depth of 420 m. The São Domingos subvertical deposit is associated with felsic volcanics and black shales of the IPB Volcano-Sedimentary Complex and is represented by massive sulphide and stockwork ore (py, cpy, sph, ga, tt, aspy) and related supergene enrichment ore (hematite gossan and covellite/chalcocite). Different mine waste classes were mapped around the old open pit: gossan (W1), felsic volcanics and shales (W2), shales (W3) and mining waste landfill (W4). Using the LNEG (Portuguese Geological Survey) CONASA database (company historical mining waste characterization based on 162 shafts and 160 reverse circulation boreholes), a methodology for three-dimensional modelling of the mining waste piles was followed, and a new mining waste resource estimate is presented. Considering some constraints on waste removal, such as the proximity of the wastes to the Mina de São Domingos village and the industrial and archaeological heritage (e.g., mining infrastructures, Roman galleries), different resource scenarios were considered: unconditioned resources (total estimates) and conditioned resources (only the volumes without removal constraints considered). Using block modelling (SURPAC software), an inferred mineral resource of 2.38 Mt @ 0.77 g/t Au and 8.26 g/t Ag is estimated for the unconditioned volumes of waste. Considering all evaluated wastes, including village areas, an inferred resource of 4.0 Mt @ 0.64 g/t Au and 7.30 g/t Ag is presented, corresponding to a total metal content of 82,878 oz t Au and 955,753 oz t Ag. Keywords: São Domingos mine, mining waste resources, mining waste pile modelling, Iberian Pyrite Belt, Portugal

  14. Supplemental computational phantoms to estimate out-of-field absorbed dose in photon radiotherapy

    Science.gov (United States)

    Gallagher, Kyle J.; Tannous, Jaad; Nabha, Racile; Feghali, Joelle Ann; Ayoub, Zeina; Jalbout, Wassim; Youssef, Bassem; Taddei, Phillip J.

    2018-01-01

    The purpose of this study was to develop a straightforward method of supplementing patient anatomy and estimating out-of-field absorbed dose for a cohort of pediatric radiotherapy patients with limited recorded anatomy. A cohort of nine children, aged 2-14 years, who received 3D conformal radiotherapy for low-grade localized brain tumors (LBTs), was randomly selected for this study. The extent of these patients' computed tomography simulation image sets was cranial only. To approximate their missing anatomy, we supplemented the LBT patients' image sets with computed tomography images of patients from a previous study who had larger scan extents and matched sex, height, and mass, and for whom contours of organs at risk for radiogenic cancer had already been delineated. Rigid fusion was performed between the LBT patients' data and that of the supplemental computational phantoms using commercial software and in-house codes. In-field dose was calculated with a clinically commissioned treatment planning system, and out-of-field dose was estimated with a previously developed analytical model that was re-fit with parameters based on new measurements for intracranial radiotherapy. Mean doses greater than 1 Gy were found in the red bone marrow, remainder, thyroid, and skin of the patients in this study. Mean organ doses between 150 mGy and 1 Gy were observed in the breast tissue of the girls and the lungs of all patients. Distant organs, i.e. prostate, bladder, uterus, and colon, received mean organ doses less than 150 mGy. The mean organ doses of the younger, smaller LBT patients (0-4 years old) were a factor of 2.4 greater than those of the older, larger patients (8-12 years old). Our findings demonstrated the feasibility of a straightforward method of applying supplemental computational phantoms and dose-calculation models to estimate absorbed dose for a set of children of various ages who received radiotherapy and for whom anatomies were largely missing in their original

  15. A computational method to geometric measure of biological particles and application to DNA microarray spot size estimation.

    Science.gov (United States)

    Zhang, Mingjun; Mao, Kaixuan; Tao, Weimin; Tarn, Tzyh-Jong

    2006-04-01

    Geometric measures (volume, area and length) of biological particles are of fundamental interest for biological studies. Many times, the measures are at the micro-/nano-scale and are based on images of the biological particles. This paper proposes a computational method for the geometric measurement of biological particles. The method has been applied to DNA microarray spot size estimation. Compared with existing algorithms for microarray spot size estimation, the proposed method is computationally efficient and also provides a confidence probability on the measure. The contributions of this paper include a generic computational method for the geometric measurement of biological particles and its application to DNA microarray spot size estimation.

  16. Survey of engineering computational methods and experimental programs for estimating supersonic missile aerodynamic characteristics

    Science.gov (United States)

    Sawyer, W. C.; Allen, J. M.; Hernandez, G.; Dillenius, M. F. E.; Hemsch, M. J.

    1982-01-01

    This paper presents a survey of engineering computational methods and experimental programs used for estimating the aerodynamic characteristics of missile configurations. Emphasis is placed on those methods which are suitable for preliminary design of conventional and advanced concepts. An analysis of the technical approaches of the various methods is made in order to assess their suitability to estimate longitudinal and/or lateral-directional characteristics for different classes of missile configurations. Some comparisons between the predicted characteristics and experimental data are presented. These comparisons are made for a large variation in flow conditions and model attitude parameters. The paper also presents known experimental research programs developed for the specific purpose of validating analytical methods and extending the capability of data-base programs.

  17. Estimate of the energy resources of the Aracatuba region, Sao Paulo, Brazil; Estimativa dos recursos energeticos da regiao de Aracatuba

    Energy Technology Data Exchange (ETDEWEB)

    Udaeta, Miguel Edgar Morales; Galvao, Luiz Claudio Ribeiro; Grimoni, Jose Aquiles Baesso; Souza, Carlos Antonio Farias de [Universidade de Sao Paulo (USP), SP (Brazil). Dept. de Engenharia de Energia e Automacao Eletricas. Grupo de Energia], e-mail: udaeta@pea.usp.br

    2004-07-01

    The complete assessment of energy-producing resources, as referred to, entails consideration of the technical, economic, political and environmental aspects of these resources. The developed project presents a methodology for the implementation of this complete assessment, which consists of the identification and inventory of the technologies and resources available at the time at a specific geographical site and their classification, in an indicative and computable way, with respect to the mentioned aspects, taking sustainable development into consideration. This methodology was used in the Administrative Region of Aracatuba, obtaining as a result a ranking of alternatives. The solar collectors received the best evaluation. From this result it is possible to indicate, in a preliminary approach, the best investment option, in this case the solar collector. (author)

  18. Data Resources for the Computer-Guided Discovery of Bioactive Natural Products.

    Science.gov (United States)

    Chen, Ya; de Bruyn Kops, Christina; Kirchmair, Johannes

    2017-09-25

    Natural products from plants, animals, marine life, fungi, bacteria, and other organisms are an important resource for modern drug discovery. Their biological relevance and structural diversity make natural products good starting points for drug design. Natural product-based drug discovery can benefit greatly from computational approaches, which are a valuable precursor or supplementary method to in vitro testing. We present an overview of 25 virtual and 31 physical natural product libraries that are useful for applications in cheminformatics, in particular virtual screening. The overview includes detailed information about each library, the extent of its structural information, and the overlap between different sources of natural products. In terms of chemical structures, there is a large overlap between freely available and commercial virtual natural product libraries. Of particular interest for drug discovery is that at least ten percent of known natural products are readily purchasable and many more natural products and derivatives are available through on-demand sourcing, extraction and synthesis services. Many of the readily purchasable natural products are of small size and hence of relevance to fragment-based drug discovery. There are also an increasing number of macrocyclic natural products and derivatives becoming available for screening.

  19. Discharge Fee Policy Analysis: A Computable General Equilibrium (CGE Model of Water Resources and Water Environments

    Directory of Open Access Journals (Sweden)

    Guohua Fang

    2016-09-01

    Full Text Available To alleviate increasingly serious water pollution and shortages in developing countries, various kinds of policies have been implemented by local governments. It is vital to quantify and evaluate the performance and potential economic impacts of these policies. This study develops a Computable General Equilibrium (CGE) model to simulate the regional economic and environmental effects of discharge fees. Firstly, water resources and water environment factors are separated from the input and output sources of the National Economic Production Department. Secondly, an extended Social Accounting Matrix (SAM) of Jiangsu province is developed to simulate various scenarios. By changing the values of the discharge fees (increased by 50%, 100% and 150%), three scenarios are simulated to examine their influence on the overall economy and each industry. The simulation results show that an increased fee will have a negative impact on Gross Domestic Product (GDP). However, waste water may be effectively controlled. Also, this study demonstrates that along with the economic costs, the increase of the discharge fee will lead to the upgrading of industrial structures from a situation of heavy pollution to one of light pollution, which is beneficial to the sustainable development of the economy and the protection of the environment.

  20. Disposal of waste computer hard disk drive: data destruction and resources recycling.

    Science.gov (United States)

    Yan, Guoqing; Xue, Mianqiang; Xu, Zhenming

    2013-06-01

    An increasing quantity of discarded computers is accompanied by a sharp increase in the number of hard disk drives to be eliminated. A waste hard disk drive is a special form of waste electrical and electronic equipment because it holds large amounts of information that is closely connected with its user. Therefore, the treatment of waste hard disk drives is an urgent issue in terms of data security, environmental protection and sustainable development. In the present study the degaussing method was adopted to destroy the residual data on the waste hard disk drives, and the housing of the disks was used as an example to explore the coating removal process, which is the most important pretreatment for aluminium alloy recycling. The key operating points determined for degaussing were: (1) keep the platter plate parallel with the magnetic field direction; and (2) increasing the magnetic field intensity B and the action time t leads to a significant improvement in the degaussing effect. The coating removal experiment indicated that heating the waste hard disk drive housing at a temperature of 400 °C for 24 min was the optimum condition. A novel integrated technique for the treatment of waste hard disk drives is proposed herein. This technique offers the possibility of destroying residual data, recycling the recovered resources and disposing of the disks in an environmentally friendly manner.

  1. Areal rainfall estimation using moving cars - computer experiments including hydrological modeling

    Science.gov (United States)

    Rabiei, Ehsan; Haberlandt, Uwe; Sester, Monika; Fitzner, Daniel; Wallner, Markus

    2016-09-01

    The need for high temporal and spatial resolution precipitation data for hydrological analyses has been discussed in several studies. Although rain gauges provide valuable information, a very dense rain gauge network is costly. As a result, several new ideas have emerged to help estimate areal rainfall with higher temporal and spatial resolution. Rabiei et al. (2013) observed that moving cars, called RainCars (RCs), can potentially be a new source of data for measuring rain rate. The optical sensors used in that study are designed for operating the windscreen wipers and showed promising results for rainfall measurement purposes. Their measurement accuracy has been quantified in laboratory experiments. Considering those errors explicitly, the main objective of this study is to investigate the benefit of using RCs for estimating areal rainfall. For that, computer experiments are carried out in which radar rainfall is considered as the reference and the other sources of data, i.e., RCs and rain gauges, are extracted from the radar data. Comparing the quality of areal rainfall estimation by RCs with that of rain gauges and the reference data helps to investigate the benefit of the RCs. The value of this additional source of data is assessed not only for areal rainfall estimation performance but also for use in hydrological modeling. Considering measurement errors derived from laboratory experiments, the results show that the RCs provide useful additional information for areal rainfall estimation as well as for hydrological modeling. Moreover, when larger uncertainties were tested for RCs, they were found to be useful up to a certain level for areal rainfall estimation and discharge simulation.
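
    A toy version of such a computer experiment, with a synthetic reference field sampled by a few accurate gauges and by many noisy car sensors, is sketched below; the field shape, network sizes and error levels are arbitrary illustration choices and do not reproduce the study's setup.

```python
import numpy as np

rng = np.random.default_rng(1)
x, y = np.meshgrid(np.linspace(0, 1, 100), np.linspace(0, 1, 100))
field = 5.0 + 3.0 * np.sin(2 * np.pi * x) * np.cos(2 * np.pi * y)   # mm/h reference

def areal_estimate(n_sensors, rel_error):
    """Mean of point samples at random locations with multiplicative error."""
    ii = rng.integers(0, 100, n_sensors)
    jj = rng.integers(0, 100, n_sensors)
    samples = field[ii, jj] * (1 + rng.normal(0, rel_error, n_sensors))
    return samples.mean()

truth = field.mean()
print(f"reference areal rainfall : {truth:.2f} mm/h")
print(f"5 gauges  (2% error)     : {areal_estimate(5, 0.02):.2f} mm/h")
print(f"200 cars  (30% error)    : {areal_estimate(200, 0.30):.2f} mm/h")
```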

  2. Unbiased estimation of the calcaneus volume using the Cavalieri principle on computed tomography images.

    Science.gov (United States)

    Acer, N; Bayar, B; Basaloglu, H; Oner, E; Bayar, K; Sankur, S

    2008-11-20

    The size and shape of tarsal bones are especially relevant when considering some orthopedic diseases such as clubfoot. For this reason, the measurements of the tarsal bones have been the subject of many studies, none of which has used stereological methods to estimate the volume. In the present stereological study, we estimated the volume of the calcaneal bone of normal feet and dry bones. We used a combination of the Cavalieri principle and computed tomographic scans taken from eight males, as well as nine dry calcanei, to estimate the volumes of calcaneal bones. The mean volume of the dry calcaneal bones estimated using the point-counting method or the Archimedes principle was 49.11±10.7 or 48.22±11.92 cm³, respectively. A positive correlation was found between anthropometric measurements and the volume of calcaneal bones. The findings of the present study using the stereological methods could provide data for the evaluation of normal and pathological volumes of calcaneal bones.
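
    The Cavalieri point-counting estimator used in the study reduces to multiplying the total point count over systematically spaced sections by the section spacing and the area represented by each test point; the counts, spacing and grid area in the sketch below are invented.

```python
def cavalieri_volume(point_counts, section_spacing_cm, area_per_point_cm2):
    """Cavalieri volume estimate (cm^3) from systematic sections."""
    return section_spacing_cm * area_per_point_cm2 * sum(point_counts)

# hypothetical number of grid points hitting the calcaneus on each CT section
counts = [3, 7, 11, 13, 12, 10, 6, 3]
print(f"{cavalieri_volume(counts, 0.5, 1.5):.1f} cm^3")
```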

  3. Estimation Methods of the Point Spread Function Axial Position: A Comparative Computational Study

    Directory of Open Access Journals (Sweden)

    Javier Eduardo Diaz Zamboni

    2017-01-01

    Full Text Available The precise knowledge of the point spread function is central for any imaging system characterization. In fluorescence microscopy, point spread function (PSF) determination has become a common and obligatory task for each new experimental device, mainly due to its strong dependence on acquisition conditions. During the last decade, algorithms have been developed for the precise calculation of the PSF, which fit model parameters that describe image formation on the microscope to experimental data. In order to contribute to this subject, a comparative study of three parameter estimation methods is reported, namely: I-divergence minimization (MIDIV), maximum likelihood (ML) and non-linear least squares (LSQR). They were applied to the estimation of the point source position on the optical axis, using a physical model. The methods' performance was evaluated under different conditions and noise levels using synthetic images and considering success percentage, iteration number, computation time, accuracy and precision. The main results showed that the axial position estimation requires a high SNR to achieve an acceptable success level, and a higher SNR still to be close to the estimation error lower bound. ML achieved a higher success percentage at lower SNR compared to MIDIV and LSQR with an intrinsic noise source. Only the ML and MIDIV methods achieved the error lower bound, but only with data belonging to the optical axis and a high SNR. Extrinsic noise sources worsened the success percentage, but no difference was found between noise sources for the same method for all methods studied.
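
    As a simplified stand-in for the least-squares variant compared in the study, the axial source position can be recovered by fitting a one-dimensional Gaussian axial profile to noisy data; the Gaussian is only a toy PSF model of our own, not the physical image-formation model used by the authors.

```python
import numpy as np
from scipy.optimize import curve_fit

def axial_profile(z, z0, amplitude, width, offset):
    """Toy axial PSF: a Gaussian centred on the source position z0."""
    return offset + amplitude * np.exp(-((z - z0) ** 2) / (2 * width ** 2))

z = np.linspace(-3.0, 3.0, 61)                       # microns along the optical axis
true = axial_profile(z, 0.4, 100.0, 0.8, 10.0)       # assumed ground-truth parameters
noisy = np.random.poisson(true).astype(float)        # photon (shot) noise

popt, pcov = curve_fit(axial_profile, z, noisy, p0=(0.0, 80.0, 1.0, 5.0))
print(f"estimated axial position: {popt[0]:+.3f} um "
      f"(+/- {np.sqrt(pcov[0, 0]):.3f})")
```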

  4. A mathematical method for verifying the validity of measured information about the flows of energy resources based on the state estimation theory

    Science.gov (United States)

    Pazderin, A. V.; Sof'in, V. V.; Samoylenko, V. O.

    2015-11-01

    Efforts aimed at improving energy efficiency in all branches of the fuel and energy complex should commence with setting up a high-tech automated system for monitoring and accounting energy resources. Malfunctions and failures in the measurement and information parts of this system may distort commercial measurements of energy resources and lead to financial risks for power supplying organizations. In addition, measurement errors may be connected with intentional distortion of measurements for reducing payment for the use of energy resources on the consumer's side, which leads to commercial losses of energy resources. The article presents a universal mathematical method for verifying the validity of measurement information in networks for transporting energy resources, such as electricity and heat, petroleum, gas, etc., based on the state estimation theory. The energy resource transportation network is represented by a graph whose nodes correspond to producers and consumers and whose branches stand for transportation mains (power lines, pipelines, and heat network elements). The main idea of state estimation is connected with obtaining the calculated analogs of energy resources for all available measurements. Unlike "raw" measurements, which contain inaccuracies, the calculated flows of energy resources, called estimates, will fully satisfy the suitability condition for all state equations describing the energy resource transportation network. The state equations written in terms of calculated estimates will already be free from residuals. The difference between a measurement and its calculated analog (estimate) is called an estimation remainder in estimation theory. Large values of the estimation remainders are an indicator of high errors in particular energy resource measurements. By using the presented method it is possible to improve the validity of energy resource measurements, to estimate the transportation network observability, to eliminate
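
    The core idea, replacing raw meter readings with weighted-least-squares estimates that satisfy the conservation equations exactly and inspecting the estimation remainders, can be shown on a single network node; the meter readings and accuracies below are invented, and this is only a minimal sketch rather than the method as published.

```python
import numpy as np

# one inflow splits into two outflows; state x = (f1, f2), inflow = f1 + f2
H = np.array([[1.0, 1.0],    # meter 0 measures the inflow
              [1.0, 0.0],    # meter 1 measures outflow 1
              [0.0, 1.0]])   # meter 2 measures outflow 2
z = np.array([100.0, 62.0, 45.0])                     # readings: inconsistent by 7 units
W = np.diag([1 / 1.0**2, 1 / 1.0**2, 1 / 1.0**2])     # equal meter accuracies assumed

x_hat = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)     # WLS state estimate
estimates = H @ x_hat                                 # calculated analogs of the meters
remainders = z - estimates                            # estimation remainders

print("estimated flows f1, f2 :", np.round(x_hat, 2))
print("meter remainders       :", np.round(remainders, 2))   # large values flag bad meters
```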

  5. Research on the digital education resources of sharing pattern in independent colleges based on cloud computing environment

    Science.gov (United States)

    Xiong, Ting; He, Zhiwen

    2017-06-01

    Cloud computing was first proposed by Google in the United States; it is based on Internet data centers and provides a standard and open approach to shared network services. With the rapid development of higher education in China, the educational resources provided by colleges and universities fall greatly short of the actual needs for teaching resources. Therefore, cloud computing, which uses Internet technology to provide sharing methods, arrived like timely rain and has become an important means of sharing digital education applications in current higher education. Based on the cloud computing environment, the paper analyzes the existing problems in the sharing of digital educational resources in the independent colleges of Jiangxi Province. According to the sharing characteristics of cloud computing, namely mass storage, efficient operation and low input, the author explores and studies the design of a sharing model for the digital educational resources of higher education in independent colleges. Finally, the design of the sharing model is put into practical application.

  6. Improving Maryland’s Offshore Wind Energy Resource Estimate Using Doppler Wind Lidar Technology to Assess Microtmeteorology Controls

    Directory of Open Access Journals (Sweden)

    Pé Alexandra St.

    2016-01-01

    Compared to lidar measurements, power law extrapolation estimates and operational National Weather Service models underestimated hub-height wind speeds in the WEA. In addition, lidar observations suggest the frequent development of a low-level wind maximum (LLWM), with high turbine-layer wind shear and low turbulence intensity within a turbine's rotor layer (40 m-160 m). Results elucidate the advantages of using Doppler wind lidar technology to improve offshore wind resource estimates and its ability to monitor the impact of under-sampled offshore meteorological controls on a potential turbine's ability to produce power.

  7. A neural computational model for animal's time-to-collision estimation.

    Science.gov (United States)

    Wang, Ling; Yao, Dezhong

    2013-04-17

    The time-to-collision (TTC) is the time elapsed before a looming object hits the subject. An accurate estimation of TTC plays a critical role in the survival of animals in nature and acts as an important factor in artificial intelligence systems that depend on judging and avoiding potential dangers. The theoretic formula for TTC is 1/τ≈θ'/sin θ, where θ and θ' are the visual angle and its variation, respectively, and the widely used approximation computational model is θ'/θ. However, both of these measures are too complex to be implemented by a biological neuronal model. We propose a new simple computational model: 1/τ≈Mθ-P/(θ+Q)+N, where M, P, Q, and N are constants that depend on a predefined visual angle. This model, weighted summation of visual angle model (WSVAM), can achieve perfect implementation through a widely accepted biological neuronal model. WSVAM has additional merits, including a natural minimum consumption and simplicity. Thus, it yields a precise and neuronal-implemented estimation for TTC, which provides a simple and convenient implementation for artificial vision, and represents a potential visual brain mechanism.
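
    A small numerical check of the approximation model quoted above (1/τ roughly equal to θ'/θ) for an object approaching at constant speed is given below; the object size, distance and speed are arbitrary example values, and the WSVAM constants M, P, Q and N are not reproduced here.

```python
import math

R = 0.5          # half-size of the looming object, m (example)
v = 10.0         # closing speed, m/s (example)
d = 30.0         # current distance, m (example)
dt = 0.01        # time step for the finite-difference angular rate

theta_now = 2 * math.atan(R / d)                 # current visual angle
theta_next = 2 * math.atan(R / (d - v * dt))     # visual angle a moment later
theta_rate = (theta_next - theta_now) / dt       # approximate theta'

ttc_true = d / v                      # ground-truth time to collision, s
ttc_approx = theta_now / theta_rate   # approximation: tau ~ theta / theta'
print(f"true TTC = {ttc_true:.2f} s, approximation = {ttc_approx:.2f} s")
```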

  8. Myocardial strain estimation from CT: towards computer-aided diagnosis on infarction identification

    Science.gov (United States)

    Wong, Ken C. L.; Tee, Michael; Chen, Marcus; Bluemke, David A.; Summers, Ronald M.; Yao, Jianhua

    2015-03-01

    Regional myocardial strains have the potential for early quantification and detection of cardiac dysfunctions. Although image modalities such as tagged and strain-encoded MRI can provide motion information of the myocardium, they are uncommon in clinical routine. In contrast, cardiac CT images are usually available, but they only provide motion information at salient features such as the cardiac boundaries. To estimate myocardial strains from a CT image sequence, we adopted a cardiac biomechanical model with hyperelastic material properties to relate the motion on the cardiac boundaries to the myocardial deformation. The frame-to-frame displacements of the cardiac boundaries are obtained using B-spline deformable image registration based on mutual information, and these are enforced as boundary conditions in the biomechanical model. The system equation is solved by the finite element method to provide the dense displacement field of the myocardium, and the regional values of the three principal strains and the six strains in cylindrical coordinates are computed in terms of the American Heart Association nomenclature. To study the potential of the estimated regional strains for identifying myocardial infarction, experiments were performed on cardiac CT image sequences of ten canines with artificially induced myocardial infarctions. The leave-one-subject-out cross validations show that, by using the optimal strain magnitude thresholds computed from ROC curves, the radial strain and the first principal strain have the best performance.

  9. Estimating valence from the sound of a word: Computational, experimental, and cross-linguistic evidence.

    Science.gov (United States)

    Louwerse, Max; Qu, Zhan

    2017-06-01

    It is assumed linguistic symbols must be grounded in perceptual information to attain meaning, because the sound of a word in a language has an arbitrary relation with its referent. This paper demonstrates that a strong arbitrariness claim should be reconsidered. In a computational study, we showed that one phonological feature (nasals in the beginning of a word) predicted negative valence in three European languages (English, Dutch, and German) and positive valence in Chinese. In three experiments, we tested whether participants used this feature in estimating the valence of a word. In Experiment 1, Chinese and Dutch participants rated the valence of written valence-neutral words, with Chinese participants rating the nasal-first neutral-valence words more positive and the Dutch participants rating nasal-first neutral-valence words more negative. In Experiment 2, Chinese (and Dutch) participants rated the valence of Dutch (and Chinese) written valence-neutral words without being able to understand the meaning of these words. The patterns replicated the valence patterns from Experiment 1. When the written words from Experiment 2 were transformed into spoken words, results in Experiment 3 again showed that participants estimated the valence of words on the basis of the sound of the word. The computational study and psycholinguistic experiments indicated that language users can bootstrap meaning from the sound of a word.

  10. AN ESTIMATION OF HISTORICAL-CULTURAL RESOURCES OF THE TURKIVSKOGO DISTRICT FOR THE NECESSITIES OF ETHNIC TOURISM.

    OpenAIRE

    Безручко, Л.С.

    2016-01-01

    In the article, the features of the estimation of historical-cultural resources are considered for the necessities of ethnic tourism. The list of objects that can be used as resources in ethnic tourism is distinguished. In particular, the objects of Jewish heritage (synagogue, Jewish burial places) and material objects that remained from the German colonists (two churches) are studied, and the material and non-material culture of the Boyko ethnos (churches, building, traditions, museums) is also studied. The compres...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. [Figure 3: Number of events per month (data)] In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  12. Computational Fluid Dynamic Pressure Drop Estimation of Flow between Parallel Plates

    International Nuclear Information System (INIS)

    Son, Hyung Min; Yang, Soo Hyung; Park, Jong Hark

    2014-01-01

    Many pool type reactors have forced downward flows inside the core during normal operation; there is a chance of flow inversion when transients occur. During this phase, the flow undergoes a transition between the turbulent and laminar regions, where drastic changes take place in terms of momentum and heat transfer, and a decrease in the safety margin is usually observed. Additionally, for high Prandtl number fluids such as water, the effect of the velocity profile inside the channel on the temperature distribution is more pronounced than for low Prandtl number fluids. This makes the checking of its pressure drop estimation accuracy less important, assuming the code verification is complete. With the advent of powerful computer hardware, engineering applications of computational fluid dynamics (CFD) methods have become quite common these days. Especially for fully-turbulent and single phase convective heat transfer, the predictability of the commercial codes has matured enough that many well-known companies adopt them to accelerate a product development cycle and to realize an increased profitability. In contrast to the above, the transition models for CFD codes are still under development, and most of the models show limited generality and prediction accuracy. Unlike the system codes, the CFD codes estimate the pressure drop from the velocity profile, which is obtained by solving the momentum conservation equations, and the resulting friction factor can be a representative parameter for a constant cross-section channel flow. In addition, the flow inside a rectangular channel with a high span-to-gap ratio can be approximated by the flow between parallel plates. The computational fluid dynamics simulation of the flow between parallel plates showed reasonable prediction capability for the laminar and the turbulent regimes.
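
    A hand-check that often accompanies such CFD runs uses textbook parallel-plate correlations (laminar Darcy friction factor f = 96/Re on a hydraulic-diameter basis, and a Blasius-type fit in the turbulent range) to estimate the pressure drop; these generic correlations are reference values of our own choosing, not the ones used in the paper.

```python
def friction_factor(re):
    """Darcy friction factor for a wide parallel-plate channel (hydraulic-diameter basis)."""
    if re < 2300:                 # laminar regime
        return 96.0 / re
    return 0.316 * re ** -0.25    # smooth-channel turbulent approximation (Blasius form)

def pressure_drop(re, length_m, gap_m, rho, velocity):
    d_h = 2.0 * gap_m             # hydraulic diameter of wide parallel plates
    return friction_factor(re) * (length_m / d_h) * 0.5 * rho * velocity**2

# water-like example: 2.5 mm gap, 0.6 m long channel, 1.5 m/s (assumed values)
rho, mu, gap, u = 998.0, 1.0e-3, 2.5e-3, 1.5
re = rho * u * (2 * gap) / mu
print(f"Re = {re:.0f}, dp = {pressure_drop(re, 0.6, gap, rho, u):.0f} Pa")
```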

  13. Computational Fluid Dynamic Pressure Drop Estimation of Flow between Parallel Plates

    Energy Technology Data Exchange (ETDEWEB)

    Son, Hyung Min; Yang, Soo Hyung; Park, Jong Hark [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    Many pool type reactors have forced downward flows inside the core during normal operation; there is a chance of flow inversion when transients occur. During this phase, the flow undergoes a transition between the turbulent and laminar regions, where drastic changes take place in terms of momentum and heat transfer, and a decrease in the safety margin is usually observed. Additionally, for high Prandtl number fluids such as water, the effect of the velocity profile inside the channel on the temperature distribution is more pronounced than for low Prandtl number fluids. This makes the checking of its pressure drop estimation accuracy less important, assuming the code verification is complete. With the advent of powerful computer hardware, engineering applications of computational fluid dynamics (CFD) methods have become quite common these days. Especially for fully-turbulent and single phase convective heat transfer, the predictability of the commercial codes has matured enough that many well-known companies adopt them to accelerate a product development cycle and to realize an increased profitability. In contrast to the above, the transition models for CFD codes are still under development, and most of the models show limited generality and prediction accuracy. Unlike the system codes, the CFD codes estimate the pressure drop from the velocity profile, which is obtained by solving the momentum conservation equations, and the resulting friction factor can be a representative parameter for a constant cross-section channel flow. In addition, the flow inside a rectangular channel with a high span-to-gap ratio can be approximated by the flow between parallel plates. The computational fluid dynamics simulation of the flow between parallel plates showed reasonable prediction capability for the laminar and the turbulent regimes.

  14. Improving Spleen Volume Estimation Via Computer-assisted Segmentation on Clinically Acquired CT Scans.

    Science.gov (United States)

    Xu, Zhoubing; Gertz, Adam L; Burke, Ryan P; Bansal, Neil; Kang, Hakmook; Landman, Bennett A; Abramson, Richard G

    2016-10-01

    Multi-atlas fusion is a promising approach for computer-assisted segmentation of anatomic structures. The purpose of this study was to evaluate the accuracy and time efficiency of multi-atlas segmentation for estimating spleen volumes on clinically acquired computed tomography (CT) scans. Under an institutional review board approval, we obtained 294 de-identified (Health Insurance Portability and Accountability Act-compliant) abdominal CT scans on 78 subjects from a recent clinical trial. We compared five pipelines for obtaining splenic volumes: Pipeline 1 - manual segmentation of all scans, Pipeline 2 - automated segmentation of all scans, Pipeline 3 - automated segmentation of all scans with manual segmentation for outliers on a rudimentary visual quality check, and Pipelines 4 and 5 - volumes derived from a unidimensional measurement of craniocaudal spleen length and three-dimensional splenic index measurements, respectively. Using Pipeline 1 results as ground truth, the accuracies of Pipelines 2-5 (Dice similarity coefficient, Pearson correlation, R-squared, and percent and absolute deviation of volume from ground truth) were compared for point estimates of splenic volume and for change in splenic volume over time. Time cost was also compared for Pipelines 1-5. Pipeline 3 was dominant in terms of both accuracy and time cost. With a Pearson correlation coefficient of 0.99, average absolute volume deviation of 23.7 cm(3), and time cost of 1 minute per scan, Pipeline 3 yielded the best results. The second-best approach was Pipeline 5, with a Pearson correlation coefficient of 0.98, absolute deviation of 46.92 cm(3), and time cost of 1 minute 30 seconds per scan. Manual segmentation (Pipeline 1) required 11 minutes per scan. A computer-automated segmentation approach with manual correction of outliers generated accurate splenic volumes with reasonable time efficiency. Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  16. Application analysis of Monte Carlo to estimate the capacity of geothermal resources in Lawu Mount

    Energy Technology Data Exchange (ETDEWEB)

    Supriyadi, E-mail: supriyadi-uno@yahoo.co.nz [Physics, Faculty of Mathematics and Natural Sciences, University of Jember, Jl. Kalimantan Kampus Bumi Tegal Boto, Jember 68181 (Indonesia); Srigutomo, Wahyu [Complex system and earth physics, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung, Jl. Ganesha 10, Bandung 40132 (Indonesia); Munandar, Arif [Kelompok Program Penelitian Panas Bumi, PSDG, Badan Geologi, Kementrian ESDM, Jl. Soekarno Hatta No. 444 Bandung 40254 (Indonesia)

    2014-03-24

    Monte Carlo analysis has been applied to the calculation of geothermal resource capacity based on the volumetric method issued by Standar Nasional Indonesia (SNI). A deterministic formula is converted into a stochastic formula to take into account the nature of the uncertainties in the input parameters. The method yields a probability range for the potential power stored beneath the Lawu Mount geothermal area. For 10,000 iterations, the capacity of the geothermal resource is in the range of 139.30-218.24 MWe, with a most likely value of 177.77 MWe. The risk of the resource capacity exceeding 196.19 MWe is less than 10%. The power density of the prospect area covering 17 km² is 9.41 MWe/km² with a probability of 80%.
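
    A rough illustration of converting a deterministic volumetric ("stored heat") formula into a stochastic one is sketched below. Every distribution and constant is a hypothetical placeholder; the record does not give the SNI parameter values actually used for Lawu Mount.

      import random

      # Illustrative Monte Carlo version of a volumetric stored-heat estimate.
      RHO_C_ROCK = 2.7   # volumetric heat capacity of reservoir rock, MJ/(m3*K), assumed

      def sample_power_mwe():
          area_km2 = random.triangular(15.0, 19.0, 17.0)        # prospect area
          thickness_m = random.triangular(1500.0, 2500.0, 2000.0)
          delta_t = random.triangular(80.0, 160.0, 120.0)       # reservoir minus cut-off temperature, K
          recovery = random.triangular(0.10, 0.25, 0.175)       # recoverable fraction of stored heat
          efficiency = 0.10                                     # thermal-to-electric conversion
          lifetime_s = 30 * 365.25 * 24 * 3600                  # 30-year project life
          heat_mj = area_km2 * 1e6 * thickness_m * RHO_C_ROCK * delta_t
          return heat_mj * recovery * efficiency / lifetime_s   # MJ/s = MW (electric)

      random.seed(1)
      samples = sorted(sample_power_mwe() for _ in range(10_000))
      p10, p50, p90 = (samples[int(q * len(samples))] for q in (0.10, 0.50, 0.90))
      print(f"P10 = {p10:.0f} MWe, median = {p50:.0f} MWe, P90 = {p90:.0f} MWe")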

  17. DEEBAR - A BASIC interactive computer programme for estimating mean resonance spacings

    International Nuclear Information System (INIS)

    Booth, M.; Pope, A.L.; Smith, R.W.; Story, J.S.

    1988-02-01

    DEEBAR is a BASIC interactive programme, which uses the theories of Dyson and of Dyson and Mehta, to compute estimates of the mean resonance spacings and associated uncertainty statistics from an input file of neutron resonance energies. In applying these theories the broad scale energy dependence of D-bar, as predicted by the ordinary theory of level densities, is taken into account. The mean spacing D-bar ± δD-bar, referred to zero energy of the incident neutrons, is computed from the energies of the first k resonances, for k = 2,3...K in turn and as if no resonances are missing. The user is asked to survey this set of D-bar and δD-bar values and to form a judgement - up to what value of k is the set of resonances complete and what value, in consequence, does the user adopt as the preferred value of D-bar? When the preferred values for k and D-bar have been input, the programme calculates revised values for the level density parameters, consistent with this value for D-bar and with other input information. Two short tables are printed, illustrating the energy variation and spin dependence of D-bar. Dyson's formula based on his Coulomb gas analogy is used for estimating the most likely energies of the topmost bound levels. Finally the quasi-crystalline character of a single level series is exploited by means of a table in which the resonance energies are set alongside an energy ladder whose rungs are regularly spaced with spacing D-bar(E); this comparative table expedites the search for gaps where resonances may have been missed experimentally. Used in conjunction with the program LJPROB, which calculates neutron strengths and compares them against the expected Porter Thomas distribution, estimates of the statistical parameters for use in the unresolved resonance region may be derived. (author)
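
    A crude way to see how a running estimate of D-bar behaves as more resonances are included is sketched below with hypothetical resonance energies. It assumes no missed levels and deliberately ignores the Dyson and Dyson-Mehta statistics that DEEBAR actually applies, so it only illustrates the "first k resonances" bookkeeping, not the programme's method.

      # Naive running mean-spacing estimate from the first k resonance energies.
      def running_mean_spacing(energies):
          energies = sorted(energies)
          return [(energies[k - 1] - energies[0]) / (k - 1)
                  for k in range(2, len(energies) + 1)]

      resonances_ev = [5.2, 11.8, 19.5, 24.1, 31.0, 38.7, 44.9]   # hypothetical energies
      for k, dbar in enumerate(running_mean_spacing(resonances_ev), start=2):
          print(f"k = {k:2d}   D-bar ~ {dbar:5.2f} eV")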

  18. Temperature-based estimation of global solar radiation using soft computing methodologies

    Science.gov (United States)

    Mohammadi, Kasra; Shamshirband, Shahaboddin; Danesh, Amir Seyed; Abdullah, Mohd Shahidan; Zamani, Mazdak

    2016-07-01

    Precise knowledge of solar radiation is essential in different technological and scientific applications of solar energy. Temperature-based estimation of global solar radiation is appealing owing to the broad availability of measured air temperatures. In this study, the potential of soft computing techniques is evaluated to estimate daily horizontal global solar radiation (DHGSR) from measured maximum, minimum, and average air temperatures (Tmax, Tmin, and Tavg) in an Iranian city. For this purpose, a comparative evaluation between three methodologies of adaptive neuro-fuzzy inference system (ANFIS), radial basis function support vector regression (SVR-rbf), and polynomial basis function support vector regression (SVR-poly) is performed. Five combinations of Tmax, Tmin, and Tavg serve as inputs to develop the ANFIS, SVR-rbf, and SVR-poly models. The attained results show that all ANFIS, SVR-rbf, and SVR-poly models provide favorable accuracy. For all techniques, the highest accuracies are achieved by models (5), which use Tmax − Tmin and Tmax as inputs. According to the statistical results, SVR-rbf outperforms SVR-poly and ANFIS. For SVR-rbf (5), the mean absolute bias error, root mean square error, and correlation coefficient are 1.1931 MJ/m², 2.0716 MJ/m², and 0.9380, respectively. The results confirm that SVR-rbf can be used efficiently to estimate DHGSR from air temperatures.
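
    A minimal sketch of the best-performing configuration described above (RBF-kernel support vector regression with inputs Tmax − Tmin and Tmax) is shown below. The synthetic data only stand in for the measured Iranian data set, and the hyperparameters are illustrative, not the tuned values from the study.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVR

      rng = np.random.default_rng(0)
      tmax = rng.uniform(10.0, 42.0, size=1000)                     # deg C
      tmin = tmax - rng.uniform(5.0, 18.0, size=1000)
      # Toy Hargreaves-like relation standing in for measured radiation (MJ/m2/day)
      ghi = 0.16 * np.sqrt(tmax - tmin) * 30.0 + rng.normal(0.0, 1.0, 1000)

      X = np.column_stack([tmax - tmin, tmax])
      X_tr, X_te, y_tr, y_te = train_test_split(X, ghi, test_size=0.2, random_state=0)

      model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
      model.fit(X_tr, y_tr)
      print("held-out R^2:", round(model.score(X_te, y_te), 3))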

  19. Reliability of Semiautomated Computational Methods for Estimating Tibiofemoral Contact Stress in the Multicenter Osteoarthritis Study

    Directory of Open Access Journals (Sweden)

    Donald D. Anderson

    2012-01-01

    Recent findings suggest that contact stress is a potent predictor of subsequent symptomatic osteoarthritis development in the knee. However, much larger numbers of knees (likely on the order of hundreds, if not thousands) need to be reliably analyzed to achieve the statistical power necessary to clarify this relationship. This study assessed the reliability of new semiautomated computational methods for estimating contact stress in knees from large population-based cohorts. Ten knees of subjects from the Multicenter Osteoarthritis Study were included. Bone surfaces were manually segmented from sequential 1.0 Tesla magnetic resonance imaging slices by three individuals on two nonconsecutive days. Four individuals then registered the resulting bone surfaces to corresponding bone edges on weight-bearing radiographs, using a semi-automated algorithm. Discrete element analysis methods were used to estimate contact stress distributions for each knee. Segmentation and registration reliabilities (day-to-day and interrater) for peak and mean medial and lateral tibiofemoral contact stress were assessed with Shrout-Fleiss intraclass correlation coefficients (ICCs). The segmentation and registration steps of the modeling approach were found to have excellent day-to-day (ICC 0.93–0.99) and good inter-rater (0.84–0.97) reliability. This approach for estimating compartment-specific tibiofemoral contact stress appears to be sufficiently reliable for use in large population-based cohorts.

  20. Computing seismic damage estimates for buildings within a big city. Bucharest case study.

    Science.gov (United States)

    Toma-Danila, Dragos; Armas, Iuliana

    2016-04-01

    The seismic risk analysis of big cities is a very demanding yet necessary task; the modeling of such complex systems requires, first of all, insightful input data at good resolution, referring to local effects, buildings and socio-economic aspects. Seismic risk estimation methods with good confidence levels are also needed. Until recently, these requirements were not fulfilled for Bucharest, one of the capital cities in Europe most endangered by earthquakes. Based on 2011 and 2002 census data, standardized according to the framework of the Near-real time System for Estimating the Seismic Damage in Romania (SeisDaRo) through a unique approach, and on relevant hazard scenarios, we estimate for the first time the building damage within the city, divided into more than 120 areas. The methodology applied relies on 48 vulnerability curves for buildings, on the Improved Displacement Coefficient Analytical Method included in the SELENA software for computing damage probabilities, and on multiple seismic hazard scenarios, including the maximum possible. In order to compare results with real losses we use a scenario based on the 4 March 1977 Vrancea earthquake (7.4 moment magnitude) that led to 1424 deaths in Bucharest. By using overlay analysis with satellite imagery and a new methodology integrated in GIS we show how results can be enhanced, reflecting even more local characteristics. Best practices for seismic risk mapping are also presented. Results are promising and contribute to the mitigation efforts in Bucharest.

  1. Geothermal resource base of the world: a revision of the Electric Power Research Institute's estimate

    Energy Technology Data Exchange (ETDEWEB)

    Aldrich, M.J.; Laughlin, A.W.; Gambill, D.T.

    1981-04-01

    Review of the Electric Power Research Institute's (EPRI) method for calculating the geothermal resource base of a country shows that modifications are needed for several of the assumptions used in the calculation. These modifications include: (1) separating geothermal belts into volcanic types with a geothermal gradient of 50°C/km and complex types in which 80% of the area has a temperature gradient of 30°C/km and 20% has a gradient of 45°C/km, (2) using the actual mean annual temperature of a country rather than an assumed 15°C average ambient temperature, and (3) making separate calculations for the resource stored in water/brine and that stored in rock. Comparison of this method (Revised EPRI) for calculating a geothermal resource base with other resource base estimates made from a heat flow map of Europe indicates that the technique yields reasonable values. The calculated geothermal resource bases, stored in water and rock to a depth of 5 km, for each country in the world are given. Approximately five times as much energy is stored in rock as is stored in water.
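
    A back-of-the-envelope version of such a "stored heat above ambient" calculation for rock to 5 km depth with a linear gradient looks like the sketch below; the volumetric heat capacity and the example gradient are illustrative values rather than those adopted in the revised EPRI method.

      # Stored heat (above surface ambient) per km2 of a geothermal belt.
      RHO_C_ROCK = 2.5e6          # volumetric heat capacity of rock, J/(m3*K), assumed

      def stored_heat_joules_per_km2(gradient_c_per_km, depth_km=5.0, layers=500):
          dz_km = depth_km / layers
          total = 0.0
          for i in range(layers):
              z_km = (i + 0.5) * dz_km                  # mid-layer depth
              excess_t = gradient_c_per_km * z_km       # temperature above surface ambient
              layer_volume_m3 = 1e6 * dz_km * 1e3       # 1 km2 area * layer thickness in m
              total += RHO_C_ROCK * layer_volume_m3 * excess_t
          return total

      print(f"{stored_heat_joules_per_km2(50.0):.2e} J per km2 for a 50 degC/km volcanic belt")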

  2. Estimating boiling water reactor decommissioning costs. A user's manual for the BWR Cost Estimating Computer Program (CECP) software: Draft report for comment

    International Nuclear Information System (INIS)

    Bierschbach, M.C.

    1994-12-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit decommissioning plans and cost estimates to the U.S. Nuclear Regulatory Commission (NRC) for review. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning BWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.

  3. Recent revisions of phosphate rock reserves and resources: reassuring or misleading? An in-depth literature review of global estimates of phosphate rock reserves and resources

    Science.gov (United States)

    Edixhoven, J. D.; Gupta, J.; Savenije, H. H. G.

    2013-09-01

    Phosphate rock (PR) is a finite mineral indispensable for fertilizer production and a major pollutant. High grade PR is obtained from deposits which took millions of years to form and are gradually being depleted. Over the past three years, global PR reserves as reported by the US Geological Survey (USGS) have seen a massive increase, from 16 000 Mt PR in 2010 to 65 000 Mt PR in 2011. The bulk of this four-fold increase is based on a 2010 report by the International Fertilizer Development Center (IFDC), which increased Moroccan reserves from 5700 Mt PR as reported by USGS, to 51 000 Mt PR, reported as upgraded ("beneficiated") concentrate. IFDC used a starkly simplified classification compared to that used by USGS and proposed that agreement should be reached on PR resource terminology, which should be as simple as possible. The report has profoundly influenced the PR scarcity debate, shifting the emphasis from the depletion to the pollution angle of the phosphate problem. Various analysts adopted the findings of IFDC and USGS and argued that following depletion of reserves, uneconomic deposits (resources and occurrences) will remain available, which will extend the lifetime of available deposits to thousands of years. Given the near total dependence of food production on PR, data on PR deposits must be transparent, comparable, reliable and credible. Based on an in-depth literature review, we analyze (i) how IFDC's simplified terminology compares to international best practice in resource classification and whether it is likely to yield data that meets the abovementioned requirements; (ii) whether the difference between ore reserves and reserves as concentrate is sufficiently noted in the literature, and (iii) whether the IFDC report and its estimate of PR reserves and resources is reliable. We conclude that, while there is a global development toward common criteria in resource reporting, IFDC's definitions contravene this development and - due to their

  4. Estimation of future water resources of Xiangjiang River Basin with VIC model under multiple climate scenarios

    Directory of Open Access Journals (Sweden)

    Guo-qing Wang

    2017-04-01

    Variation trends of water resources in the Xiangjiang River Basin over the coming decades have been investigated using the variable infiltration capacity (VIC) model and 14 general circulation models' (GCMs') projections under the representative concentration pathway (RCP) 4.5 scenario. Results show that the Xiangjiang River Basin will probably experience temperature rises during the period from 2021 to 2050, with precipitation decrease in the 2020s and increase in the 2030s. The VIC model performs well for monthly discharge simulations, with better performance for hydrometric stations on the main stream of the Xiangjiang River than for tributary catchments. The simulated annual discharges are significantly correlated to the recorded annual discharges for all eight selected target stations. The Xiangjiang River Basin may experience water shortages induced by climate change. Annual water resources of the Xiangjiang River Basin over the period from 2021 to 2050 are projected to decrease by 2.76% on average, within the range from −7.81% to 7.40%. It is essential to consider the potential impact of climate change on water resources in future planning for sustainable utilization of water resources.

  5. Resources for preparing independent government estimates for remedial contracting work assignments. Directive

    International Nuclear Information System (INIS)

    1992-01-01

    The memorandum provides information regarding the availability of tools, data bases, and assistance for developing independent government estimates of the cost of work to be performed by contractors for remedial work assignments

  6. Distributed Formation State Estimation Algorithms Under Resource and Multi-Tasking Constraints, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Recent work on distributed multi-spacecraft systems has resulted in a number of architectures and algorithms for accurate estimation of spacecraft and formation...

  7. An operational weather radar-based Quantitative Precipitation Estimation and its application in catchment water resources modeling

    DEFF Research Database (Denmark)

    He, Xin; Vejen, Flemming; Stisen, Simon

    2011-01-01

    The Danish Meteorological Institute operates a radar network consisting of five C-band Doppler radars. Quantitative precipitation estimation (QPE) using radar data is performed on a daily basis. Radar QPE is considered to have the potential to significantly improve the spatial representation of precipitation compared with rain-gauge-based methods, thus providing the basis for better water resources assessments. The radar QPE algorithm called ARNE is a distance-dependent areal estimation method that merges radar data with ground surface observations. The method was applied to the Skjern River catchment in western Denmark where alternative precipitation estimates were also used as input to an integrated hydrologic model. The hydrologic responses from the model were analyzed by comparing radar- and ground-based precipitation input scenarios. Results showed that radar QPE products are able to generate

  8. Estimation of staff doses in complex radiological examinations using a Monte Carlo computer code

    International Nuclear Information System (INIS)

    Vanhavere, F.

    2007-01-01

    The protection of medical personnel in interventional radiology is an important issue in radiological protection. The irradiation of the worker is largely non-uniform, and a large part of his body is shielded by a lead apron. The estimation of effective dose (E) under these conditions is difficult, and several approaches are used to estimate effective dose involving such a protective apron. This study presents a summary of an extensive series of simulations to determine the scatter-dose distribution around the patient and the staff effective dose from personal dosimeter readings. The influence of different parameters (such as beam energy and size, patient size, irradiated region, worker position and orientation) on the staff doses has been determined. Published algorithms that combine readings of an unshielded and a shielded dosimeter to estimate effective dose have been applied, and a new algorithm that gives more accurate dose estimates for a wide range of situations was proposed. A computational approach was used to determine the dose distribution in the worker's body. The radiation transport and energy deposition were simulated using the MCNP4B code. The human bodies of the patient and radiologist were generated with the Body Builder anthropomorphic model-generating tool. The radiologist is protected with a lead apron (0.5 mm lead equivalent in the front and 0.25 mm lead equivalent in the back and sides) and a thyroid collar (0.35 mm lead equivalent). The lower arms of the worker were folded to simulate the arm position during clinical examinations. This realistic situation of the folded arms affects the effective dose to the worker. Depending on the worker position and orientation (and of course the beam energy), the difference can go up to 25 percent. A total of 12 Hp(10) dosimeters were positioned above and under the lead apron at the neck, chest and waist levels. Extra dosimeters for the skin dose were positioned at the forehead, the forearms and the front surface of
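
    The two-dosimeter algorithms referred to above generally take a weighted-sum form; the weights below are left symbolic because the coefficients proposed in the study are not given in the record:

      E \approx \alpha \, H_\mathrm{p}(10)_{\text{over apron}} + \beta \, H_\mathrm{p}(10)_{\text{under apron}}

    with the under-apron reading typically receiving the larger weight, since it represents the dose to the shielded trunk organs.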

  9. MONITOR: A computer model for estimating the costs of an integral monitored retrievable storage facility

    International Nuclear Information System (INIS)

    Reimus, P.W.; Sevigny, N.L.; Schutz, M.E.; Heller, R.A.

    1986-12-01

    The MONITOR model is a FORTRAN 77 based computer code that provides parametric life-cycle cost estimates for a monitored retrievable storage (MRS) facility. MONITOR is very flexible in that it can estimate the costs of an MRS facility operating under almost any conceivable nuclear waste logistics scenario. The model can also accommodate input data of varying degrees of complexity and detail (ranging from very simple to more complex), which makes it ideal for use in the MRS program, where new designs and new cost data are frequently offered for consideration. MONITOR can be run as an independent program, or it can be interfaced with the Waste System Transportation and Economic Simulation (WASTES) model, a program that simulates the movement of waste through a complete nuclear waste disposal system. The WASTES model drives the MONITOR model by providing it with the annual quantities of waste that are received, stored, and shipped at the MRS facility. Three runs of MONITOR are documented in this report. Two of the runs are for Version 1 of the MONITOR code: a simulation which uses the costs developed by the Ralph M. Parsons Company in the 2A (backup) version of the MRS cost estimate. In one of these runs MONITOR was run as an independent model, and in the other run MONITOR was run using an input file generated by the WASTES model. The two runs correspond to identical cases, and the fact that they gave identical results verified that the code performed the same calculations in both modes of operation. The third run was made for Version 2 of the MONITOR code: a simulation which uses the costs developed by the Ralph M. Parsons Company in the 2B (integral) version of the MRS cost estimate. This run was made with MONITOR being run as an independent model. The results of several cases have been verified by hand calculations.

  10. Deoxyglucose method for the estimation of local myocardial glucose metabolism with positron computed tomography

    International Nuclear Information System (INIS)

    Ratib, O.; Phelps, M.E.; Huang, S.C.; Henze, E.; Selin, C.E.; Schelbert, H.R.

    1981-01-01

    The deoxyglucose method originally developed for measurements of the local cerebral metabolic rate for glucose has been investigated in terms of its application to studies of the heart with positron computed tomography (PCT) and FDG. Studies were performed in dogs to measure the tissue kinetics of FDG with PCT and by direct arterial-venous sampling. The operational equation developed in our laboratory as an extension of the Sokoloff model was used to analyze the data. The FDG method accurately predicted the true MMRGlc even when the glucose metabolic rate was normal but myocardial blood flow (MBF) was elevated 5 times the control value or when metabolism was reduced to 10% of normal and MBF increased 5 times normal. Improvements in PCT resolution are required to improve the accuracy of the estimates of the rate constants and the MMRGlc
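
    For background, the deoxyglucose method scales an FDG-derived uptake constant by the plasma glucose concentration Cp and a lumped constant LC. A commonly quoted steady-state form of the relationship (the full operational equation used in the study additionally involves time integrals of the plasma and tissue activities) is:

      \mathrm{MMRGlc} = \frac{C_p}{\mathrm{LC}} \cdot \frac{K_1 k_3}{k_2 + k_3}

    where K1, k2 and k3 are the transport and phosphorylation rate constants estimated from the tissue kinetics.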

  11. Three-dimensional speckle-noise reduction by using computational integral imaging and statistical point estimator

    Science.gov (United States)

    Moon, Inkyu

    2011-06-01

    In this paper we give an overview of a method that can remove the speckle noise that arises in coherent imaging systems. An integral imaging (II) system under coherent illumination records the elemental image set, with speckle noise patterns, of a three-dimensional (3D) object. Computational geometrical ray propagation and statistical point estimation algorithms are applied to the elemental image set in order to reconstruct a speckle-reduced 3D integral image. As performance metrics, the SNR and speckle index are calculated. The results are used to compare the speckle-reduced 3D image reconstructed by the presented method with the coherent image containing speckle patterns. Experiments show that the presented method can three-dimensionally reduce the speckle noise in the 3D object reconstruction.

  12. Deoxyglucose method for the estimation of local myocardial glucose metabolism with positron computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Ratib, O.; Phelps, M.E.; Huang, S.C.; Henze, E.; Selin, C.E.; Schelbert, H.R.

    1981-01-01

    The deoxyglucose method originally developed for measurements of the local cerebral metabolic rate for glucose has been investigated in terms of its application to studies of the heart with positron computed tomography (PCT) and FDG. Studies were performed in dogs to measure the tissue kinetics of FDG with PCT and by direct arterial-venous sampling. The operational equation developed in our laboratory as an extension of the Sokoloff model was used to analyze the data. The FDG method accurately predicted the true MMRGlc even when the glucose metabolic rate was normal but myocardial blood flow (MBF) was elevated 5 times the control value or when metabolism was reduced to 10% of normal and MBF increased 5 times normal. Improvements in PCT resolution are required to improve the accuracy of the estimates of the rate constants and the MMRGlc.

  13. Estimation of MSAD values in computed tomography scans using radiochromic films

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Bruno Beraldo; Teogenes Augusto da, E-mail: bbo@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil); Mourao, Arnaldo Prata [Centro Federal de Educacao Tecnologica de Minas Gerais (CEFET-MG), Belo Horizonte, MG (Brazil)

    2013-03-15

    Objective: To evaluate the feasibility of using radiochromic films as an alternative dosimeter to estimate the multiple scan average dose on the basis of kerma profiles. Materials and Methods: The radiochromic films were distributed in cylinders positioned in the center and in four peripheral bores of a standard abdominal phantom utilized for computed tomography dosimetry. Results: Multiple scan average dose values corresponded to 13.6 ± 0.7, 13.5 ± 0.7 and 18.7 ± 1.0 mGy for pitches of 0.75, 1.00 and 1.50, respectively. Conclusion: In spite of the results showing values lower than the reference level for radiodiagnosis (25 mGy) established by the Brazilian regulations for abdominal studies, it is suggested that there is room to optimize procedures and review the reference level for radiodiagnosis in Brazil. (author)

  14. Computer programs for capital cost estimation, lifetime economic performance simulation, and computation of cost indexes for laser fusion and other advanced technology facilities

    International Nuclear Information System (INIS)

    Pendergrass, J.H.

    1978-01-01

    Three FORTRAN programs, CAPITAL, VENTURE, and INDEXER, have been developed to automate computations used in assessing the economic viability of proposed or conceptual laser fusion and other advanced-technology facilities, as well as conventional projects. The types of calculations performed by these programs are, respectively, capital cost estimation, lifetime economic performance simulation, and computation of cost indexes. The codes permit these three topics to be addressed with considerable sophistication commensurate with user requirements and available data

  15. Selection of meteorological parameters affecting rainfall estimation using neuro-fuzzy computing methodology

    Science.gov (United States)

    Hashim, Roslan; Roy, Chandrabhushan; Motamedi, Shervin; Shamshirband, Shahaboddin; Petković, Dalibor; Gocic, Milan; Lee, Siew Cheng

    2016-05-01

    Rainfall is a complex atmospheric process that varies over time and space. Researchers have used various empirical and numerical methods to enhance estimation of rainfall intensity. We developed a novel prediction model in this study, with an emphasis on accuracy, to identify the most significant meteorological parameters affecting rainfall. For this, we used five input parameters: wet day frequency (dwet), vapor pressure (e̅a), and maximum and minimum air temperatures (Tmax, Tmin, and Tavg) as well as cloud cover (cc). The data were obtained from the Indian Meteorological Department for the city of Patna, Bihar, India. Further, a type of soft-computing method, known as the adaptive neuro-fuzzy inference system (ANFIS), was applied to the available data. In this respect, the observation data from 1901 to 2000 were employed for testing, validating, and estimating monthly rainfall via the simulated model. In addition, the ANFIS process for variable selection was implemented to detect the predominant variables affecting rainfall prediction. Finally, the performance of the model was compared to other soft-computing approaches, including the artificial neural network (ANN), support vector machine (SVM), extreme learning machine (ELM), and genetic programming (GP). The results revealed that ANN, ELM, ANFIS, SVM, and GP had R² values of 0.9531, 0.9572, 0.9764, 0.9525, and 0.9526, respectively. Therefore, we conclude that ANFIS is the best of these methods for predicting monthly rainfall. Moreover, dwet was found to be the most influential parameter for rainfall prediction, and the best predictor of accuracy. This study also identified sets of two and three meteorological parameters that give the best predictions.

  16. Development and comparison of computational models for estimation of absorbed organ radiation dose in rainbow trout (Oncorhynchus mykiss) from uptake of iodine-131

    International Nuclear Information System (INIS)

    Martinez, N.E.; Johnson, T.E.; Capello, K.; Pinder, J.E.

    2014-01-01

    This study develops and compares different, increasingly detailed anatomical phantoms for rainbow trout (Oncorhynchus mykiss) for the purpose of estimating organ absorbed radiation dose and dose rates from ¹³¹I uptake in multiple organs. The models considered are: a simplistic geometry considering a single organ, a more specific geometry employing additional organs with anatomically relevant size and location, and voxel reconstruction of internal anatomy obtained from CT imaging (referred to as CSUTROUT). Dose Conversion Factors (DCFs) for whole body as well as selected organs of O. mykiss were computed using Monte Carlo modeling, and combined with estimated activity concentrations, to approximate dose rates and ultimately determine cumulative radiation dose (μGy) to selected organs after several half-lives of ¹³¹I. The different computational models provided similar results, especially for source organs (less than 30% difference between estimated doses), and whole body DCFs for each model (∼3 × 10⁻³ μGy d⁻¹ per Bq kg⁻¹) were comparable to DCFs listed in ICRP 108 for ¹³¹I. The main benefit provided by the computational models developed here is the ability to accurately determine organ dose. A conservative mass-ratio approach may provide reasonable results for sufficiently large organs, but is only applicable to individual source organs. Although CSUTROUT is the more anatomically realistic phantom, it required much more resource dedication to develop and is less flexible than the stylized phantom for similar results. There may be instances where a detailed phantom such as CSUTROUT is appropriate, but generally the stylized phantom appears to be the best choice for an ideal balance between accuracy and resource requirements. - Highlights: • Computational models (phantoms) are developed for rainbow trout internal dosimetry. • Phantoms are combined with empirical models for ¹³¹I uptake to estimate dose. • Voxel and stylized phantoms predict
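
    Taking the whole-body DCF quoted above at face value, converting an activity concentration into a dose rate is a one-line calculation; the activity concentration used below is a hypothetical example.

      # Dose-rate estimate from the whole-body dose conversion factor in the record
      # (~3e-3 microGy per day per Bq/kg of I-131 in tissue).
      DCF_WHOLE_BODY = 3e-3          # microGy/day per Bq/kg
      activity_bq_per_kg = 500.0     # hypothetical I-131 concentration in the fish

      dose_rate = DCF_WHOLE_BODY * activity_bq_per_kg
      print(f"whole-body dose rate ~ {dose_rate:.2f} microGy/day")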

  17. Estimating the tuberculosis burden in resource-limited countries: a capture-recapture study in Yemen.

    Science.gov (United States)

    Bassili, A; Al-Hammadi, A; Al-Absi, A; Glaziou, P; Seita, A; Abubakar, I; Bierrenbach, A L; van Hest, N A

    2013-04-01

    The lack of applicable population-based methods to measure tuberculosis (TB) incidence rates directly at country level emphasises the global need to generate robust TB surveillance data to ascertain trends in disease burden and to assess the performance of TB control programmes in the context of the United Nations Millennium Development Goals and World Health Organization targets for TB control. To estimate the incidence of TB cases (all forms) and sputum smear-positive disease, and the level of under-reporting of TB in Yemen in 2010. Record-linkage and three-source capture-recapture analysis of data collected through active prospective longitudinal surveillance within the public and private non-National Tuberculosis Programme sector in twelve Yemeni governorates, selected by stratified cluster random sampling. For all TB cases, the estimated ratio of notified to incident cases and completeness of case ascertainment after record linkage, i.e., the ratio of detected to incident cases, was respectively 71% (95%CI 64-80) and 75% (95%CI 68-85). For sputum smear-positive TB cases, these ratios were respectively 67% (95%CI 58-75) and 76% (95%CI 66-84). We estimate that there were 13 082 (95%CI 11 610-14 513) TB cases in Yemen in 2010. Under-reporting of TB in Yemen is estimated at 29% (95%CI 20-36).
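
    Capture-recapture estimation can be illustrated with the classical two-source Chapman estimator; the study itself used a three-source analysis, so the sketch below (with hypothetical counts) is only a simplified illustration of the principle.

      # Chapman's nearly unbiased two-source capture-recapture estimator.
      def chapman_estimate(n1, n2, both):
          """n1, n2: cases captured by each source; both: cases found in both sources."""
          return (n1 + 1) * (n2 + 1) / (both + 1) - 1

      # Hypothetical counts from two overlapping case-reporting sources
      print(round(chapman_estimate(n1=9000, n2=4500, both=3100)))   # estimated total cases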

  18. Effort estimation for enterprise resource planning implementation projects using social choice - a comparative study

    Science.gov (United States)

    Koch, Stefan; Mitlöhner, Johann

    2010-08-01

    ERP implementation projects have received enormous attention in recent years, due to their importance for organisations, as well as the costs and risks involved. The estimation of effort and costs associated with new projects is therefore an important topic. Unfortunately, there is still a lack of models that can cope with the special characteristics of these projects. As the main focus lies in adapting and customising a complex system, and even changing the organisation, traditional models like COCOMO cannot easily be applied. In this article, we apply effort estimation based on social choice in this context. Social choice deals with aggregating the preferences of a number of voters into a collective preference, and we apply this idea by substituting the voters with project attributes. Therefore, instead of supplying numeric values for various project attributes, a new project only needs to be placed into rankings per attribute, necessitating only ordinal values, and the resulting aggregate ranking can be used to derive an estimation. We describe the estimation process using a data set of 39 projects, and compare the results to other approaches proposed in the literature.
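
    The Borda count is one classical social-choice rule for this kind of aggregation; the sketch below is a generic illustration of the idea (attributes playing the role of voters), with hypothetical projects and rankings, not the authors' exact procedure or data.

      # Borda-style aggregation of per-attribute effort rankings.
      from collections import defaultdict

      rankings = {   # each attribute ranks the projects from most to least effort
          "modules_customised": ["P3", "P1", "P2"],
          "users":              ["P3", "P2", "P1"],
          "interfaces":         ["P1", "P3", "P2"],
      }

      scores = defaultdict(int)
      for ranking in rankings.values():
          for position, project in enumerate(ranking):
              scores[project] += len(ranking) - 1 - position   # Borda points

      aggregate = sorted(scores, key=scores.get, reverse=True)
      print("aggregate effort ranking (highest first):", aggregate)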

  19. Planning Committee for a National Resource for Computation in Chemistry. Final report, October 1, 1974--June 30, 1977

    International Nuclear Information System (INIS)

    1978-11-01

    The Planning Committee for a National Resource for Computation in Chemistry (NRCC) was charged with the responsibility of formulating recommendations regarding organizational structure for an NRCC including the composition, size, and responsibilities of its policy board, the relationship of such a board to the operating structure of the NRCC, to federal funding agencies, and to user groups; desirable priorities, growth rates, and levels of operations for the first several years; and facilities, access and site requirements for such a Resource. By means of site visits, questionnaires, and a workshop, the Committee sought advice from a wide range of potential users and organizations interested in chemical computation. Chemical kinetics, crystallography, macromolecular science, nonnumerical methods, physical organic chemistry, quantum chemistry, and statistical mechanics are covered

  20. Planning Committee for a National Resource for Computation in Chemistry. Final report, October 1, 1974--June 30, 1977

    Energy Technology Data Exchange (ETDEWEB)

    Bigeleisen, Jacob; Berne, Bruce J.; Coton, F. Albert; Scheraga, Harold A.; Simmons, Howard E.; Snyder, Lawrence C.; Wiberg, Kenneth B.; Wipke, W. Todd

    1978-11-01

    The Planning Committee for a National Resource for Computation in Chemistry (NRCC) was charged with the responsibility of formulating recommendations regarding organizational structure for an NRCC including the composition, size, and responsibilities of its policy board, the relationship of such a board to the operating structure of the NRCC, to federal funding agencies, and to user groups; desirable priorities, growth rates, and levels of operations for the first several years; and facilities, access and site requirements for such a Resource. By means of site visits, questionnaires, and a workshop, the Committee sought advice from a wide range of potential users and organizations interested in chemical computation. Chemical kinetics, crystallography, macromolecular science, nonnumerical methods, physical organic chemistry, quantum chemistry, and statistical mechanics are covered.

  1. Investigation of parameter estimator and adaptive controller for assist pump by computer simulation.

    Science.gov (United States)

    Shimooka, T; Mitamura, Y; Yuhta, T

    1991-04-01

    The multi-output adaptive controller of a left ventricular assist device (LVAD) was studied by computer simulation. The controller regulated two outputs--mean aortic pressure (mAoP) and mean atrial pressure (mLAP)--by regulating the vacuum pressure (input). Autoregressive models were used to describe the circulatory system. The parameters of the models were estimated by the recursive least squares method. Based on the autoregressive models, the vacuum pressure minimizing a performance index was sought. The index used was the weighted summation of the squared errors. Responses of the adaptive controller were simulated when the contractility of the left ventricle was decreased at various rates and the peripheral resistance was changed. Both the mAoP and mLAP were controlled to their predicted values in the steady state. The steady-state errors of the mAoP were less than a few mm Hg, and those of the mLAP were lower than 1 mm Hg. Consequently, the estimated parameters can be regarded as true parameters, and the adaptive controller has the potential to control more than two outputs. The multi-output adaptive controller studied is useful in controlling the LVAD according to changes in circulatory condition.
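
    The recursive least squares estimator mentioned above updates the model parameters one sample at a time. A generic sketch is shown below, with a forgetting factor, synthetic data, and a hypothetical three-parameter regressor; the paper's autoregressive model of the circulatory system is not reproduced.

      import numpy as np

      # Standard recursive least squares (RLS) update with a forgetting factor.
      def rls_update(theta, P, phi, y, lam=0.98):
          phi = phi.reshape(-1, 1)
          gain = P @ phi / (lam + (phi.T @ P @ phi).item())
          error = y - (phi.T @ theta).item()
          theta = theta + gain.ravel() * error
          P = (P - gain @ phi.T @ P) / lam
          return theta, P

      rng = np.random.default_rng(0)
      true_params = np.array([0.6, -0.2, 1.5])
      theta, P = np.zeros(3), np.eye(3) * 1e3
      for _ in range(300):
          phi = rng.normal(size=3)                      # regressor (past outputs/inputs)
          y = phi @ true_params + rng.normal(scale=0.01)
          theta, P = rls_update(theta, P, phi, y)
      print(np.round(theta, 3))                          # approaches the true parameters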

  2. Fan-out Estimation in Spin-based Quantum Computer Scale-up.

    Science.gov (United States)

    Nguyen, Thien; Hill, Charles D; Hollenberg, Lloyd C L; James, Matthew R

    2017-10-17

    Solid-state spin-based qubits offer good prospects for scaling based on their long coherence times and nexus to large-scale electronic scale-up technologies. However, high-threshold quantum error correction requires a two-dimensional qubit array operating in parallel, posing significant challenges in fabrication and control. While architectures incorporating distributed quantum control meet this challenge head-on, most designs rely on individual control and readout of all qubits with high gate densities. We analysed the fan-out routing overhead of a dedicated control line architecture, basing the analysis on a generalised solid-state spin qubit platform parameterised to encompass Coulomb confined (e.g. donor based spin qubits) or electrostatically confined (e.g. quantum dot based spin qubits) implementations. The spatial scalability under this model is estimated using standard electronic routing methods and present-day fabrication constraints. Based on reasonable assumptions for qubit control and readout, we estimate that 10²-10⁵ physical qubits, depending on the quantum interconnect implementation, can be integrated and fanned out independently. Assuming relatively long control-free interconnects, the scalability can be extended. Ultimately, universal quantum computation may necessitate a much higher number of integrated qubits, indicating that higher-dimensional electronics fabrication and/or multiplexed distributed control and readout schemes may be the preferred strategy for large-scale implementation.

  3. Stature estimation in a contemporary Japanese population based on clavicular measurements using multidetector computed tomography.

    Science.gov (United States)

    Torimitsu, Suguru; Makino, Yohsuke; Saitoh, Hisako; Sakuma, Ayaka; Ishii, Namiko; Yajima, Daisuke; Inokuchi, Go; Motomura, Ayumi; Chiba, Fumiko; Yamaguchi, Rutsuko; Hashimoto, Mari; Hoshioka, Yumi; Iwase, Hirotaro

    2017-06-01

    The aims of this study were to assess the correlation between stature and clavicular measurements in a contemporary Japanese population using three-dimensional (3D) computed tomographic (CT) images, and to establish regression equations for predicting stature. A total of 249 cadavers (131 males, 118 females) underwent postmortem CT scanning and subsequent forensic autopsy between October 2011 and May 2016 in our department. Four clavicular variables (linear distances between the superior margins of the left and right sternal facets and the anterior points of the left and right acromial ends, and between the superior margins of the left and right sternal facets and the left and right conoid tubercles) were measured using 3D CT reconstructed images that extracted only bone data. The correlations between stature and each of the clavicular measurements were assessed with Pearson product-moment correlation coefficients. These clavicular measurements correlated significantly with stature in both sexes. The lowest standard error of estimation in all, male, and female subjects was 3.62 cm (r² = 0.836), 3.55 cm (r² = 0.566), and 3.43 cm (r² = 0.663), respectively. In conclusion, clavicular measurements obtained from 3D CT images may be useful for stature estimation of Japanese individuals, particularly in cases where better predictors, such as long bones, are not available. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Selective tuberculosis incidence estimation by digital computer information technologies in the MS Excel system

    Directory of Open Access Journals (Sweden)

    G. I. Ilnitsky

    2014-01-01

    The incidence of tuberculosis was estimated in different age groups using digital computer information technologies for tracking. For this, the author used the annual reporting forms stipulated by the Ministry of Health of Ukraine, the results of his own observations, and the data accumulated in an MS Excel information bank. The baseline was formed from the epidemiological indicators of Ukraine and the Lvov Region during a 10-year period (2000-2009), which was, in relation to different initial characteristics, divided into Step 1 (2000-2004), in which the tuberculosis epidemic situation progressively deteriorated, and Step 2 (2005-2009), in which morbidity was relatively stabilized. The results were processed using the parametric and nonparametric statistical and mathematical functions of MS Excel to establish correlations when estimating the changes in epidemic parameters. The findings among the general population lead to the conclusion that the mean tuberculosis morbidity in Ukraine was much greater than that in the Lvov Region, irrespective of age. At the same time, the morbidity rate in the foci of tuberculosis infection rose among children, adolescents, and adults alike, which provides a rationale for better implementation of therapeutic and preventive measures.

  5. Improving reliability of state estimation programming and computing suite based on analyzing a fault tree

    Directory of Open Access Journals (Sweden)

    Kolosok Irina

    2017-01-01

    Reliable information on the current state parameters, obtained by processing measurements from the SCADA and WAMS data acquisition systems through state estimation (SE) methods, is a precondition for successfully managing an electric power system (EPS). SCADA and WAMS systems themselves, like any technical systems, are subject to failures and faults that lead to distortion and loss of information. The SE procedure makes it possible to find erroneous measurements and is therefore a barrier that keeps distorted information from penetrating into control problems. At the same time, the programming and computing suite (PCS) implementing the SE functions may itself produce a wrong decision due to imperfections and errors in the software algorithms. In this study, we propose to use a fault tree to analyze the consequences of failures and faults in SCADA and WAMS and in the SE procedure itself. Based on the analysis of the obtained measurement information and the SE results, we determine the fault tolerance level of the state estimation PCS, which characterizes its reliability.

  6. Estimation of computed tomography dose in various phantom shapes and compositions

    International Nuclear Information System (INIS)

    Lee, Chang Lae

    2017-01-01

    The purpose of this study was to investigate the CTDI (computed tomography dose index) at the center of phantoms of various shapes, sizes, and compositions by using GATE (Geant4 Application for Tomographic Emission) simulations. GATE simulations were performed for various phantom shapes (cylinder, elliptical, and hexagonal prism PMMA phantoms) and phantom compositions (water, PMMA, polyethylene, polyoxymethylene) with various diameters (1-50 cm) at various kVp and mAs levels. The CTDI100center values of the cylinder, elliptical, and hexagonal prism phantoms at 120 kVp, 200 mAs were 11.1, 13.4, and 12.2 mGy, respectively. The volume is the same, but the CTDI100center values differ depending on the type of phantom. For the water, PMMA, and polyoxymethylene phantoms, the CTDI100center values decreased as the material density increased. However, in the case of polyethylene, the CTDI100center value was higher than that of PMMA at diameters exceeding 15 cm (CTDI100center: 35.0 mGy), and at diameters greater than 30 cm (CTDI100center: 17.7 mGy) it exceeded that of water. We have used a limited set of phantoms to evaluate CT doses. In this study, CTDI100center values were estimated by GATE simulation according to the material and shape of the phantom. CT dose can be estimated more accurately by using various materials and phantom shapes closer to the human body.
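
    For reference, the index being simulated is the 100 mm integrated CT dose index; CTDI100center is this quantity evaluated with the scoring region placed in the central bore of the phantom:

      \mathrm{CTDI}_{100} = \frac{1}{n\,T} \int_{-50\,\mathrm{mm}}^{+50\,\mathrm{mm}} D(z)\,\mathrm{d}z

    where D(z) is the dose profile along the rotation axis for a single axial scan, n is the number of simultaneously acquired slices and T is the nominal slice thickness.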

  7. Optimal Altitude, Overlap, and Weather Conditions for Computer Vision UAV Estimates of Forest Structure

    Directory of Open Access Journals (Sweden)

    Jonathan P. Dandois

    2015-10-01

    Ecological remote sensing is being transformed by three-dimensional (3D), multispectral measurements of forest canopies by unmanned aerial vehicles (UAVs) and computer vision structure from motion (SFM) algorithms. Yet applications of this technology have out-paced understanding of the relationship between collection method and data quality. Here, UAV-SFM remote sensing was used to produce 3D multispectral point clouds of temperate deciduous forests at different levels of UAV altitude, image overlap, weather, and image processing. Error in canopy height estimates was explained by the alignment of the canopy height model to the digital terrain model (R2 = 0.81) due to differences in lighting and image overlap. Accounting for this, no significant differences were observed in height error at different levels of lighting, altitude, and side overlap. Overall, accurate estimates of canopy height compared to field measurements (R2 = 0.86, RMSE = 3.6 m) and LIDAR (R2 = 0.99, RMSE = 3.0 m) were obtained under optimal conditions of clear lighting and high image overlap (>80%). Variation in point cloud quality appeared related to the behavior of SFM ‘image features’. Future research should consider the role of image features as the fundamental unit of SFM remote sensing, akin to the pixel of optical imaging and the laser pulse of LIDAR.

  8. Estimation of Resource Productivity and Efficiency: An Extended Evaluation of Sustainability Related to Material Flow

    Directory of Open Access Journals (Sweden)

    Pin-Chih Wang

    2014-09-01

    This study is intended to conduct an extended evaluation of sustainability based on the material flow analysis of resource productivity. We first present updated information on the material flow analysis (MFA) database in Taiwan. Essential indicators are selected to quantify resource productivity associated with the economy-wide MFA of Taiwan. The study also applies the IPAT (impact-population-affluence-technology) master equation to measure trends of material use efficiency in Taiwan and to compare them with those of other Asia-Pacific countries. An extended evaluation of efficiency, in comparison with selected economies by applying data envelopment analysis (DEA), is conducted accordingly. The Malmquist Productivity Index (MPI) is thereby adopted to quantify the patterns and the associated changes of efficiency. Observations and summaries can be described as follows. Based on the MFA of the Taiwanese economy, the average growth rates of domestic material input (DMI; 2.83%) and domestic material consumption (DMC; 2.13%) in the past two decades were both less than that of gross domestic product (GDP; 4.95%). The decoupling of environmental pressures from economic growth can be observed. In terms of the decomposition analysis of the IPAT equation and in comparison with 38 other economies, the material use efficiency of Taiwan did not perform as well as its economic growth. The DEA comparisons of resource productivity show that Denmark, Germany, Luxembourg, Malta, the Netherlands, the United Kingdom and Japan performed the best in 2008. Since the MPI consists of technological change (frontier-shift, or innovation) and efficiency change (catch-up), the change in efficiency (catch-up) of Taiwan has not been accomplished as expected, in spite of the increase in its technological efficiency.
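
    The IPAT master equation mentioned above is an identity that factors an environmental pressure indicator into population, affluence and technology terms; applied to material flows it can be written as:

      I = P \times A \times T, \qquad \text{e.g.}\quad \mathrm{DMC} = \text{population} \times \frac{\mathrm{GDP}}{\text{population}} \times \frac{\mathrm{DMC}}{\mathrm{GDP}}

    so that the technology term T is the material intensity of the economy (DMC per unit of GDP), the quantity whose trend the decomposition tracks.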

  9. Find-rate methodology and resource base estimates of the Hydrocarbon Supply Model (1990 update). Topical report

    International Nuclear Information System (INIS)

    Woods, T.

    1991-02-01

    The Hydrocarbon Supply Model is used to develop long-term trends in Lower-48 gas production and costs. The model utilizes historical find-rate patterns to predict the discovery rate and size distribution of future oil and gas field discoveries. The report documents the methodologies used to quantify historical oil and gas field find-rates and to project those discovery patterns for future drilling. It also explains the theoretical foundations for the find-rate approach. The new field and reserve growth resource base is documented and compared to other published estimates. The report has six sections. Section 1 provides background information and an overview of the model. Sections 2, 3, and 4 describe the theoretical foundations of the model, the databases, and specific techniques used. Section 5 presents the new field resource base by region and depth. Section 6 documents the reserve growth model components

  10. National Uranium Resource Evaluation Program. Hydrogeochemical and Stream Sediment Reconnaissance Basic Data Reports Computer Program Requests Manual

    International Nuclear Information System (INIS)

    1980-01-01

    This manual is intended to aid those who are unfamiliar with ordering computer output for verification and preparation of Uranium Resource Evaluation (URE) Project reconnaissance basic data reports. The manual is also intended to help standardize the procedures for preparing the reports. Each section describes a program or group of related programs. The sections are divided into three parts: Purpose, Request Forms, and Requested Information

  11. Multidimensional Environmental Data Resource Brokering on Computational Grids and Scientific Clouds

    Science.gov (United States)

    Montella, Raffaele; Giunta, Giulio; Laccetti, Giuliano

    Grid computing has evolved widely over the past years, and its capabilities have found their way even into business products and are no longer relegated to scientific applications. Today, grid computing technology is not restricted to a set of specific grid open source or industrial products; rather, it comprises a set of capabilities virtually within any kind of software to create shared and highly collaborative production environments. These environments are focused on computational (workload) capabilities and the integration of information (data) into those computational capabilities. An active application field of grid computing is the full virtualization of scientific instruments in order to increase their availability and decrease operational and maintenance costs. Computational and information grids make it possible to manage real-world objects in a service-oriented way using widespread industrial standards.

  12. A REVIEW ON SECURITY ISSUES AND CHALLENGES IN CLOUD COMPUTING MODEL OF RESOURCE MANAGEMENT

    OpenAIRE

    T. Vaikunth Pai; Dr. P. S. Aithal

    2017-01-01

    Cloud computing services refer to a set of IT-enabled services delivered to a customer over the Internet on a leased basis, with the capability to scale service requirements up or down as needed. Usually, cloud computing services are delivered by third-party vendors who own the infrastructure. Its advantages include scalability, elasticity, flexibility, efficiency and the outsourcing of non-core activities of an organization. Cloud computing offers an innovative busines...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride. Edited by M-C. Sawley, with contributions from: P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini, M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  14. Resource Evaluation and Energy Production Estimate for a Tidal Energy Conversion Installation using Acoustic Flow Measurements

    Science.gov (United States)

    Gagnon, Ian; Baldwin, Ken; Wosnik, Martin

    2015-11-01

    The "Living Bridge" project plans to install a tidal turbine at Memorial Bridge in the Piscataqua River at Portsmouth, NH. A spatio-temporal tidal energy resource assessment was performed using long-term bottom-deployed Acoustic Doppler Current Profilers (ADCP). Two locations were evaluated: the planned deployment location and mid-channel. The goal was to determine the amount of available kinetic energy that can be converted into usable electrical energy on the bridge. Changes in available kinetic energy with ebb/flood and spring/neap tidal cycles and electrical energy demand were analyzed. A system model is used to calculate the net energy savings using various tidal generator and battery bank configurations. Differences in the tidal characteristics between the two measurement locations are highlighted. Different resource evaluation methodologies were also analyzed, e.g., using a representative ADCP "bin" vs. a more refined, turbine-geometry-specific methodology, and using a static bin height vs. bin heights that move with respect to the free surface throughout a tidal cycle (representative of a bottom-fixed or floating turbine deployment, respectively). ADCP operating frequencies and bin sizes affect the standard deviation of measurements, and measurement uncertainties are evaluated. Supported by NSF-IIP grant 1430260.

  15. Estimation of climate change impact on water resources by using Bilan water balance model

    International Nuclear Information System (INIS)

    Horacek, Stanislav; Kasparek, Ladislav; Novicky, Oldrich

    2008-01-01

    Modelling of the water balance under changed climate conditions has been carried out by the T. G. Masaryk Water Research Institute in Prague for basins in the Czech Republic since 1990. The studies presently use climate change scenarios derived from simulations by regional climate models. The climate change scenarios are reflected in meteorological time series for a given catchment and subsequently used for the simulation of water cycle components with the Bilan water balance model. Running the Bilan model with input meteorological series both unaffected and affected by the climate change scenarios provides the information needed to assess the climate change impacts on the output series of the model. The results of the studies generally show that annual runoff could decrease substantially. The increased winter temperature could cause an increase in winter flows and a decrease in snow storage, and consequently spring and summer outflows would decrease significantly, even to their current minimum values. The groundwater storage and base flow could also be highly reduced. The described method has been used in a number of research projects and operational applications. Its typical application is aimed at assessing possible impacts of climate change on surface water resources, whose availability can subsequently be analysed by using water management models of the individual basins. The Bilan model, particularly in combination with the Modflow model, can also suitably be used for simulation and assessment of groundwater resources.

  16. A Simple and Resource-efficient Setup for the Computer-aided Drug Design Laboratory.

    Science.gov (United States)

    Moretti, Loris; Sartori, Luca

    2016-10-01

    Undertaking modelling investigations for Computer-Aided Drug Design (CADD) requires a proper environment. In principle, this could be done on a single computer, but the reality of a drug discovery program requires robustness and high-throughput computing (HTC) to efficiently support the research. Therefore, a more capable alternative is needed, but its implementation has no widespread solution. Here, the realization of such a computing facility is discussed; all aspects are covered, from the general layout to technical details. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Estimation of solar energy resources for low salinity water desalination in several regions of Russia

    Science.gov (United States)

    Tarasenko, A. B.; Kiseleva, S. V.; Shakun, V. P.; Gabderakhmanova, T. S.

    2018-01-01

    This paper focuses on estimating the required photovoltaic (PV) array areas and capital expenses to feed a reverse osmosis desalination unit (1 m3/day fresh water production rate). The investigation has been made for different climatic conditions of Russia using regional data on ground water salinity from different sources and an empirical dependence of specific energy consumption on salinity and temperature. The most favourable results were obtained for Krasnodar, Volgograd, the Crimea Republic and some other southern regions. The combination of salinity, temperature and solar radiation level there makes reverse osmosis coupled with photovoltaics very attractive for solving infrastructure problems in rural areas. Estimation results are presented as maps showing PV array areas and capital expenses for the selected regions.
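
    The sizing arithmetic behind such estimates can be sketched in a few lines; the specific energy consumption, peak sun hours, module efficiency and loss factor below are illustrative assumptions, not values from the study.

    ```python
    # Rough sizing of a PV array feeding a 1 m3/day reverse-osmosis unit.
    # All numeric inputs below are illustrative assumptions, not values from the study.

    def pv_array_area(specific_energy_kwh_per_m3: float,
                      daily_product_m3: float,
                      peak_sun_hours: float,
                      module_efficiency: float,
                      system_losses: float = 0.2) -> float:
        """Return the PV area (m^2) needed to supply the daily RO energy demand."""
        daily_demand_kwh = specific_energy_kwh_per_m3 * daily_product_m3
        # Usable energy per m^2 of module per day: 1 kW/m^2 reference irradiance
        # times peak sun hours, de-rated by module efficiency and system losses.
        usable_kwh_per_m2 = 1.0 * peak_sun_hours * module_efficiency * (1.0 - system_losses)
        return daily_demand_kwh / usable_kwh_per_m2

    # Example: brackish water at ~2 kWh/m^3 and an assumed ~4.5 peak sun hours.
    area = pv_array_area(2.0, 1.0, 4.5, 0.18, 0.2)
    print(f"Required PV area: {area:.1f} m^2")
    ```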

  18. Projected Urban Growth (2000 - 2050) and Its Estimated Impact on the US Forest Resource

    Science.gov (United States)

    David J. Nowak; Jeffrey T. Walton; Jeffrey T. Walton

    2005-01-01

    Urban land in the United States is projected to increase from 3.1% in 2000 to 8.1% in 2050, an area of 392,400 km2, which is larger than the state of Montana. By 2050, four states (Rhode Island, New Jersey, Massachusetts, and Connecticut) are projected to be more than one-half urban land. The total projected amount of US forestland estimated to be subsumed by...

  19. A population-based approach to the estimation of diabetes prevalence and health resource utilisation.

    Science.gov (United States)

    Smith, James; Jackson, Gary; Orr-Walker, Brandon; Jackson, Rod; Sinclair, Siniva; Thornley, Simon; Riddell, Tania; Chan, Wing Cheuk

    2010-03-05

    This study estimated diabetes prevalence and utilisation of healthcare services in Counties Manukau using routinely collected administrative data and compared estimates with findings for three other district health boards (DHBs) in close geographic proximity. Records of subsidy claims for pharmaceuticals and laboratory investigations were linked to records in a national hospital admissions database to 'reconstruct' populations of four DHBs--Counties Manukau, Northland, Waitemata and Auckland. Individuals were included in reconstructed populations if they had health events recorded between January 2006 and December 2007. Diabetes cases were identified using an algorithm based on claims for monitoring tests and pharmaceuticals, as well as clinical codes for diabetes in hospital admissions. Reconstructed populations were only 6% lower than census population counts, indicating that the vast majority of the population use health services in a two-year period. The age- and sex-standardised prevalence of diabetes was 7.1% in Counties Manukau and 5.2% in the other three DHBs combined. Prevalence of diabetes was highest amongst Māori (10.6% for women and 12.2% for men) and Pacific peoples (15.0% for women and 13.5% for men). Māori diabetes cases had the highest hospital discharge rate of any ethnic group. Community pharmaceutical prescribing patterns and laboratory test frequency were similar between diabetes cases by ethnicity and deprivation. Estimates of diabetes prevalence using linkage of routinely collected administrative data were consistent with epidemiological surveys, suggesting that linkage of pharmaceutical and laboratory subsidy databases with hospital admissions data can be used as an alternative to traditional surveys for estimating the prevalence of some long-term conditions. This study demonstrated substantial differences in the prevalence of diabetes and in hospitalisation rates by ethnicity, but measures of community diabetes care were similar by ethnicity

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  1. Using Free Computational Resources to Illustrate the Drug Design Process in an Undergraduate Medicinal Chemistry Course

    Science.gov (United States)

    Rodrigues, Ricardo P.; Andrade, Saulo F.; Mantoani, Susimaire P.; Eifler-Lima, Vera L.; Silva, Vinicius B.; Kawano, Daniel F.

    2015-01-01

    Advances in, and dissemination of, computer technologies in the field of drug research now enable the use of molecular modeling tools to teach important concepts of drug design to chemistry and pharmacy students. A series of computer laboratories is described to introduce undergraduate students to commonly adopted "in silico" drug design…

  2. University Students and Ethics of Computer Technology Usage: Human Resource Development

    Science.gov (United States)

    Iyadat, Waleed; Iyadat, Yousef; Ashour, Rateb; Khasawneh, Samer

    2012-01-01

    The primary purpose of this study was to determine the level of students' awareness about computer technology ethics at the Hashemite University in Jordan. A total of 180 university students participated in the study by completing the questionnaire designed by the researchers, named the Computer Technology Ethics Questionnaire (CTEQ). Results…

  3. Simulation of Hierarchical Resource Management for Meta-computing Systems

    NARCIS (Netherlands)

    Santoso, J.; van Albada, G.D.; Sloot, P.M.A.; Nazief, B.A.A.

    2000-01-01

    Optimal scheduling in meta-computing environments is still an open research question. Various resource management (RM) architectures have been proposed in the literature. In the present paper we explore, through simulation, various multi-level scheduling strategies for compound computing environments

  4. Customization of a hydrological model for the estimation of water resources in an alpine karstified catchment with sparse data

    Science.gov (United States)

    Kauzlaric, Martina; Schädler, Bruno; Weingartner, Rolf

    2014-05-01

    The main objective of the MontanAqua transdisciplinary project is to develop strategies for moving towards more sustainable water resources management in the Crans-Montana-Sierre region (Valais, Switzerland) in view of global change. Therefore, a detailed assessment of the available water resources in the study area today and in the future is needed. The study region is situated in the inner alpine zone, with strong altitudinal precipitation gradients: from the precipitation-rich alpine ridge down to the dry Rhône plain. A typical plateau glacier on top of the ridge is partly drained through karstic underground formations and linked to various springs on either side of the water divide. The main anthropogenic influences on the system are reservoirs and diversions to irrigation channels. Thus, the study area does not cover a classical hydrological basin, as the water frequently flows across natural hydrographic boundaries. This is a big challenge from a hydrological point of view, as we cannot easily achieve a closed, measured water balance. Moreover, a lack of comprehensive historical data in the catchment limits the degree of process conceptualization possible and prohibits the usual parameter estimation procedures. The Penn State Integrated Hydrologic Model (PIHM) (Kumar, 2009) has been selected to estimate the available natural water resources for the whole study area. It is a semi-discrete, physically based model which includes channel routing, overland flow, subsurface saturated and unsaturated flow, rainfall interception, snowmelt and evapotranspiration. Its unstructured mesh offers a flexible domain decomposition strategy for efficient and accurate integration of the watershed's physiographic, climatic and hydrographic characteristics. The model was modified in order to be more suitable for a karstified mountainous catchment: it now allows external sources to be added at specific points, and uses the temperature-index approach for estimating
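
    The temperature-index approach mentioned at the (cut-off) end of the abstract reduces to a degree-day melt rule; the degree-day factor and melt threshold in the sketch below are illustrative assumptions.

    ```python
    # Minimal temperature-index (degree-day) snowmelt sketch.
    # The degree-day factor and melt threshold are illustrative assumptions.

    def degree_day_melt(temps_c, swe_mm, ddf_mm_per_degc_day=3.0, t_melt_c=0.0):
        """Step daily snowmelt from a snowpack given daily mean air temperatures.

        temps_c : iterable of daily mean air temperature (deg C)
        swe_mm  : initial snow water equivalent (mm)
        Returns a list of daily melt volumes (mm) and the remaining SWE.
        """
        melt_series = []
        for t in temps_c:
            potential_melt = ddf_mm_per_degc_day * max(t - t_melt_c, 0.0)
            melt = min(potential_melt, swe_mm)   # cannot melt more snow than is stored
            swe_mm -= melt
            melt_series.append(melt)
        return melt_series, swe_mm

    melt, remaining = degree_day_melt([-2.0, 1.5, 3.0, 4.2], swe_mm=50.0)
    print(melt, remaining)
    ```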

  5. Project Final Report: Ubiquitous Computing and Monitoring System (UCoMS) for Discovery and Management of Energy Resources

    Energy Technology Data Exchange (ETDEWEB)

    Tzeng, Nian-Feng; White, Christopher D.; Moreman, Douglas

    2012-07-14

    The UCoMS research cluster has spearheaded three research areas since August 2004: wireless and sensor networks, Grid computing, and petroleum applications. The primary goals of UCoMS research are threefold: (1) creating new knowledge to push forward the technology frontiers of research on the computing and monitoring aspects of energy resource management, (2) developing and disseminating software codes and toolkits for the research community and the public, and (3) establishing system prototypes and testbeds for evaluating innovative techniques and methods. Substantial progress and diverse accomplishments have been made by the research investigators, working cooperatively in their respective areas of expertise, on such topics as sensors and sensor networks, wireless communication and systems, and computational Grids, particularly as relevant to petroleum applications.

  6. Use of GRACE Terrestrial Water Storage Retrievals to Evaluate Model Estimates by the Australian Water Resources Assessment System

    Science.gov (United States)

    van Dijk, A. I. J. M.; Renzullo, L. J.; Rodell, M.

    2011-01-01

    Terrestrial water storage (TWS) retrievals from the Gravity Recovery and Climate Experiment (GRACE) satellite mission were compared to TWS modeled by the Australian Water Resources Assessment (AWRA) system. The aim was to test whether differences could be attributed and used to identify model deficiencies. Data for 2003-2010 were decomposed into the seasonal cycle, linear trends and the remaining de-trended anomalies before comparison. AWRA tended to have smaller seasonal amplitude than GRACE. GRACE showed a strong (greater than 15 millimeters per year) drying trend in northwest Australia that was associated with a preceding period of unusually wet conditions, whereas weaker drying trends in the southern Murray Basin and southwest Western Australia were associated with relatively dry conditions. AWRA-estimated trends were less negative for these regions, while a more positive trend was estimated for areas affected by cyclone Charlotte in 2009. For 2003-2009, a decrease of 7-8 millimeters per year (50-60 cubic kilometers per year) was estimated from GRACE, enough to explain 6-7% of the contemporary rate of global sea level rise. This trend was not reproduced by the model. Agreement between model and data suggested that the GRACE retrieval error estimates are biased high. A scaling coefficient applied to GRACE TWS to reduce the effect of signal leakage appeared to degrade quantitative agreement for some regions. Model aspects identified for improvement included a need for better estimation of rainfall in northwest Australia, and more sophisticated treatment of diffuse groundwater discharge processes and surface-groundwater connectivity for some regions.
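
    The decomposition described above (seasonal cycle, linear trend, de-trended anomalies) can be illustrated with an ordinary least-squares fit of a trend plus an annual harmonic; the monthly series below is synthetic, not GRACE or AWRA data.

    ```python
    import numpy as np

    # Decompose a monthly TWS-like series into linear trend + annual harmonic + residual
    # anomalies, in the spirit of the comparison described above. The series is synthetic.
    rng = np.random.default_rng(0)
    t = np.arange(96) / 12.0                     # 8 years of monthly time steps (years)
    tws = -7.5 * t + 40 * np.sin(2 * np.pi * t) + rng.normal(0, 10, t.size)

    # Design matrix: intercept, trend, annual sine/cosine
    X = np.column_stack([np.ones_like(t), t,
                         np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
    coef, *_ = np.linalg.lstsq(X, tws, rcond=None)

    trend_mm_per_yr = coef[1]
    seasonal_amplitude = np.hypot(coef[2], coef[3])
    anomalies = tws - X @ coef                   # de-trended, de-seasonalized residuals

    print(f"trend: {trend_mm_per_yr:.1f} mm/yr, seasonal amplitude: {seasonal_amplitude:.1f} mm")
    ```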

  7. Computational estimation of rainbow trout estrogen receptor binding affinities for environmental estrogens

    International Nuclear Information System (INIS)

    Shyu, Conrad; Cavileer, Timothy D.; Nagler, James J.; Ytreberg, F. Marty

    2011-01-01

    Environmental estrogens have been the subject of intense research due to their documented detrimental effects on the health of fish and wildlife and their potential to negatively impact humans. A complete understanding of how these compounds affect health is complicated because environmental estrogens are a structurally heterogeneous group of compounds. In this work, computational molecular dynamics simulations were utilized to predict the binding affinity of different compounds using rainbow trout (Oncorhynchus mykiss) estrogen receptors (ERs) as a model. Specifically, this study presents a comparison of the binding affinity of the natural ligand estradiol-17β to the four rainbow trout ER isoforms with that of three known environmental estrogens 17α-ethinylestradiol, bisphenol A, and raloxifene. Two additional compounds, atrazine and testosterone, that are known to be very weak or non-binders to ERs were tested. The binding affinity of these compounds to the human ERα subtype is also included for comparison. The results of this study suggest that, when compared to estradiol-17β, bisphenol A binds less strongly to all four receptors, 17α-ethinylestradiol binds more strongly, and raloxifene has a high affinity for the α subtype only. The results also show that atrazine and testosterone are weak or non-binders to the ERs. All of the results are in excellent qualitative agreement with the known in vivo estrogenicity of these compounds in the rainbow trout and other fishes. Computational estimation of binding affinities could be a valuable tool for predicting the impact of environmental estrogens in fish and other animals.

  8. Organ doses for reference adult male and female undergoing computed tomography estimated by Monte Carlo simulations

    International Nuclear Information System (INIS)

    Lee, Choonsik; Kim, Kwang Pyo; Long, Daniel; Fisher, Ryan; Tien, Chris; Simon, Steven L.; Bouville, Andre; Bolch, Wesley E.

    2011-01-01

    Purpose: To develop a computed tomography (CT) organ dose estimation method designed to readily provide organ doses in a reference adult male and female for different scan ranges, and to investigate the degree to which existing commercial programs can reasonably match organ doses defined in these more anatomically realistic adult hybrid phantoms. Methods: The x-ray fan beam in the SOMATOM Sensation 16 multidetector CT scanner was simulated within the Monte Carlo radiation transport code MCNPX2.6. The simulated CT scanner model was validated through comparison with experimentally measured lateral free-in-air dose profiles and computed tomography dose index (CTDI) values. The reference adult male and female hybrid phantoms were coupled with the established CT scanner model following arm removal to simulate clinical head and other body region scans. A set of organ dose matrices were calculated for a series of consecutive axial scans ranging from the top of the head to the bottom of the phantoms with a beam thickness of 10 mm and tube potentials of 80, 100, and 120 kVp. The organ doses for head, chest, and abdomen/pelvis examinations were calculated based on the organ dose matrices and compared to those obtained from two commercial programs, CT-EXPO and CTDOSIMETRY. Organ dose calculations were repeated for an adult stylized phantom by using the same simulation method used for the adult hybrid phantom. Results: Comparisons of both lateral free-in-air dose profiles and CTDI values between experimental measurements and the Monte Carlo simulations showed good agreement, to within 9%. Organ doses for head, chest, and abdomen/pelvis scans reported in the commercial programs exceeded those from the Monte Carlo calculations in both the hybrid and stylized phantoms in this study, sometimes by orders of magnitude. Conclusions: The organ dose estimation method and dose matrices established in this study readily provide organ doses for a reference adult male and female for different
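
    The step from organ dose matrices to an effective dose is the ICRP 103 tissue-weighted sum; a minimal sketch follows, with placeholder organ doses and only a subset of the weighting factors.

    ```python
    # Effective dose as the ICRP 103 tissue-weighted sum of organ equivalent doses.
    # Organ doses (mSv) are placeholders; only a subset of the ICRP 103 tissue
    # weighting factors is listed for brevity.

    TISSUE_WEIGHTS = {
        "lung": 0.12, "stomach": 0.12, "colon": 0.12, "red_bone_marrow": 0.12,
        "breast": 0.12, "gonads": 0.08, "bladder": 0.04, "liver": 0.04,
        "thyroid": 0.04, "oesophagus": 0.04,
    }

    def effective_dose(organ_doses_msv: dict) -> float:
        """Weighted sum over the organs for which both a dose and a weight exist."""
        return sum(TISSUE_WEIGHTS[organ] * dose
                   for organ, dose in organ_doses_msv.items() if organ in TISSUE_WEIGHTS)

    chest_scan = {"lung": 12.0, "breast": 10.5, "thyroid": 2.1, "liver": 5.0}
    print(f"Partial effective dose: {effective_dose(chest_scan):.2f} mSv")
    ```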

  9. Tidal Energy Conversion Installation at an Estuarine Bridge Site: Resource Evaluation and Energy Production Estimate

    Science.gov (United States)

    Wosnik, M.; Gagnon, I.; Baldwin, K.; Bell, E.

    2015-12-01

    The "Living Bridge" project aims to create a self-diagnosing, self-reporting "smart bridge" powered by a local renewable energy source, tidal energy - transforming Memorial Bridge, a vertical lift bridge over the tidal Piscataqua River connecting Portsmouth, NH and Kittery, ME, into a living laboratory for researchers, engineers, scientists, and the community. The Living Bridge project includes the installation of a tidal turbine at the Memorial Bridge. The energy converted by the turbine will power structural health monitoring, environmental and underwater instrumentation. Utilizing locally available tidal energy can make bridge operation more sustainable, can "harden" transportation infrastructure against prolonged grid outages and can demonstrate a prototype of an "estuarine bridge of the future". A spatio-temporal tidal energy resource assessment was performed using long term bottom-deployed Acoustic Doppler Current Profilers (ADCP) at two locations: near the planned deployment location in 2013-14 for 123 days and mid-channel in 2007 for 35 days. Data were evaluated to determine the amount of available kinetic energy that can be converted into usable electrical energy on the bridge. Changes in available kinetic energy with ebb/flood and spring/neap tidal cycles and electrical energy demand were analyzed. The target deployment site exhibited significantly more energetic ebb tides than flood tides, which can be explained by the local bathymetry of the tidal estuary. A system model is used to calculate the net energy savings using various tidal generator and battery bank configurations. Different resource evaluation methodologies were also analyzed, e.g., using a representative ADCP "bin" vs. a more refined, turbine-geometry-specific methodology, and using static bin height vs. bin height that move w.r.t. the free surface throughout a tidal cycle (representative of a bottom-fixed or floating turbine deployment, respectively). ADCP operating frequencies and bin

  10. Estimation of the resource buffers in the assembly process of a shearer machine in the CPPM method

    Directory of Open Access Journals (Sweden)

    Gwiazda Aleksander

    2017-01-01

    Full Text Available The dynamic development of scheduling systems allows currently executed tasks to be improved significantly. Critical Chain Project Management (CCPM) is one of the methods of project management based on network planning. This method utilizes the concept of a critical chain, derived from the Theory of Constraints, and allows losses of project time and resources to be avoided. It results in quicker project implementation (20-30%) and in reducing the risk level associated with task realization. Projects become cheaper, and the risk of cost overruns is significantly reduced. The factors that distinguish the CCPM method from traditional network planning methods are the balancing of resources and the introduction of buffers. Moreover, a key element of the CCPM method is that task times are reduced from traditional estimates to realistic ones. Activities associated with a task start as late as possible, in accordance with the ALAP (As Late As Possible) principle. This work presents the process of managing the assembly of a shearer machine, taking into account the use of safety buffers and the optimization of the whole project. The estimation of buffer capacity is presented in order to improve the realization of the project.
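
    The abstract does not state which buffer-sizing rule was used for the shearer assembly project; purely as an illustration, the sketch below applies the root-sum-square (two-point) rule often paired with CCPM to a set of hypothetical task estimates.

    ```python
    from math import sqrt

    # Root-sum-square buffer sizing, one common rule used with Critical Chain
    # Project Management. The task estimates below are hypothetical assembly steps;
    # the source abstract does not specify which sizing rule was actually used.

    # (safe estimate, aggressive estimate) in working days for tasks on the chain
    tasks = [(10, 6), (8, 5), (12, 7), (6, 4)]

    aggressive_duration = sum(aggressive for _, aggressive in tasks)
    buffer = sqrt(sum((safe - aggressive) ** 2 for safe, aggressive in tasks))

    print(f"chain duration (aggressive): {aggressive_duration} days")
    print(f"project buffer (root-sum-square): {buffer:.1f} days")
    ```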

  11. Everglades Depth Estimation Network (EDEN)—A decade of serving hydrologic information to scientists and resource managers

    Science.gov (United States)

    Patino, Eduardo; Conrads, Paul; Swain, Eric; Beerens, James M.

    2017-10-30

    Introduction: The Everglades Depth Estimation Network (EDEN) provides scientists and resource managers with regional maps of daily water levels and depths in the freshwater part of the Greater Everglades landscape. The EDEN domain includes all or parts of five Water Conservation Areas, Big Cypress National Preserve, Pennsuco Wetlands, and Everglades National Park. Daily water-level maps are interpolated from water-level data at monitoring gages, and depth is estimated by using a digital elevation model of the land surface. Online datasets provide time series of daily water levels at gages and rainfall and evapotranspiration data (https://sofia.usgs.gov/eden/). These datasets are used by scientists and resource managers to guide large-scale field operations, describe hydrologic changes, and support biological and ecological assessments that measure ecosystem response to the implementation of the Comprehensive Everglades Restoration Plan. EDEN water-level data have been used in a variety of biological and ecological studies including (1) the health of American alligators as a function of water depth, (2) the variability of post-fire landscape dynamics in relation to water depth, (3) the habitat quality for wading birds with dynamic habitat selection, and (4) an evaluation of the habitat of the Cape Sable seaside sparrow.

  12. Radon estimation in water resources of Mandi - Dharamshala region of Himachal Pradesh, India for health risk assessments

    Science.gov (United States)

    Kumar, Gulshan; Kumari, Punam; Kumar, Mukesh; Kumar, Arvind; Prasher, Sangeeta; Dhar, Sunil

    2017-07-01

    The present study deals with the estimation of radon in 40 water samples collected from different natural resources and of radium content in the soils of the Mandi-Dharamshala region. Radon concentration is determined using a RAD-7 detector, and the radium content of the soil in the vicinity of the water resources is also measured using LR-115 type-II detectors; the latter is then correlated with the radon concentration in the water samples. The potential health risks related to 222Rn have also been estimated. The results show that the radon concentrations lie within the range of 1.51 to 22.7 Bq/l, with an average value of 5.93 Bq/l for all types of water samples taken from the study area. The radon concentration in the water samples is found to be lower than 100 Bq/l, the exposure limit for radon in water recommended by the World Health Organization. The calculated average effective dose of radon received by the people of the study area is 0.022 mSv/y, with a maximum of 0.083 mSv/y and a minimum of 0.0056 mSv/y. The total effective dose at all sites of the studied area is found to be within the safe limit (0.1 mSv/year) recommended by the World Health Organization. The average value of radium content in the soil of the study area is 6.326 Bq/kg.
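
    Doses of this kind follow the generic relation dose = concentration × annual intake × dose conversion factor; in the sketch below the intake and the conversion factor are illustrative literature-style assumptions, so the outputs will not reproduce the study's exact figures.

    ```python
    # Annual ingestion dose from radon dissolved in drinking water:
    #   E (Sv/y) = C (Bq/L) x annual intake (L/y) x dose conversion factor (Sv/Bq).
    # The intake and dose conversion factor below are illustrative assumptions,
    # not the coefficients used in the study.

    def radon_ingestion_dose_msv(radon_bq_per_l: float,
                                 intake_l_per_year: float = 730.0,   # ~2 L/day, assumed
                                 dcf_sv_per_bq: float = 3.5e-9) -> float:
        return radon_bq_per_l * intake_l_per_year * dcf_sv_per_bq * 1e3  # Sv -> mSv

    for c in (1.51, 5.93, 22.7):   # min, mean, max concentrations reported above
        print(f"{c:5.2f} Bq/L -> {radon_ingestion_dose_msv(c):.4f} mSv/y")
    ```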

  13. Estimating demographic parameters from large-scale population genomic data using Approximate Bayesian Computation

    Directory of Open Access Journals (Sweden)

    Li Sen

    2012-03-01

    Full Text Available Abstract Background: The Approximate Bayesian Computation (ABC) approach has been used to infer demographic parameters for numerous species, including humans. However, most applications of ABC still use limited amounts of data, from a small number of loci, compared to the large amount of genome-wide population-genetic data which have become available in the last few years. Results: We evaluated the performance of the ABC approach for three 'population divergence' models - similar to the 'isolation with migration' model - when the data consists of several hundred thousand SNPs typed for multiple individuals, by simulating data from known demographic models. The ABC approach was used to infer demographic parameters of interest, and we compared the inferred values to the true parameter values that were used to generate the hypothetical "observed" data. For all three case models, the ABC approach inferred most demographic parameters quite well with narrow credible intervals, for example, population divergence times and past population sizes, but some parameters were more difficult to infer, such as population sizes at present and migration rates. We compared the ability of different summary statistics to infer demographic parameters, including haplotype- and LD-based statistics, and found that the accuracy of the parameter estimates can be improved by combining summary statistics that capture different parts of the information in the data. Furthermore, our results suggest that poor choices of prior distributions can in some circumstances be detected using ABC. Finally, increasing the amount of data beyond some hundred loci will substantially improve the accuracy of many parameter estimates using ABC. Conclusions: We conclude that the ABC approach can accommodate realistic genome-wide population genetic data, which may be difficult to analyze with full likelihood approaches, and that the ABC can provide accurate and precise inference of demographic parameters from
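
    A minimal rejection-ABC loop shows the mechanics the abstract evaluates (simulate under a prior draw, compare summary statistics, keep close matches); the toy "demographic model" below is just a normal distribution whose mean plays the role of the parameter of interest.

    ```python
    import numpy as np

    # Minimal rejection-ABC sketch: infer a single parameter by simulating data under
    # candidate parameters and keeping those whose summary statistics are close to the
    # observed ones. The "model" here is a toy stand-in for a demographic simulator.
    rng = np.random.default_rng(42)

    observed = rng.normal(2.0, 1.0, size=500)          # pretend field data
    obs_summary = np.array([observed.mean(), observed.std()])

    def simulate(theta, n=500):
        return rng.normal(theta, 1.0, size=n)

    accepted = []
    tolerance = 0.1
    for _ in range(20000):
        theta = rng.uniform(-5.0, 5.0)                  # prior draw
        sim = simulate(theta)
        sim_summary = np.array([sim.mean(), sim.std()])
        if np.linalg.norm(sim_summary - obs_summary) < tolerance:
            accepted.append(theta)

    posterior = np.array(accepted)
    print(f"accepted {posterior.size} draws, posterior mean ~ {posterior.mean():.2f}")
    ```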

  14. Estimation of radiation exposure of prospectively triggered 128-slice computed tomography coronary angiography

    Energy Technology Data Exchange (ETDEWEB)

    Ketelsen, D.; Fenchel, M.; Thomas, C.; Boehringer, N.; Tsiflikas, I.; Kaempf, M.; Syha, R.; Claussen, C.D.; Heuschmid, M. [Tuebingen Univ. (Germany). Abt. fuer Diagnostische und Interventionelle Radiologie; Buchgeister, M. [Tuebingen Univ. (Germany). Medizinische Physik

    2010-12-15

    Purpose: To estimate the effective dose of prospectively triggered computed tomography coronary angiography (CTCA) in step-and-shoot (SAS) mode, depending on the tube current and tube voltage modulation. Materials and Methods: For dose measurements, an Alderson-Rando phantom equipped with thermoluminescent dosimeters was used. The effective dose was calculated according to ICRP 103. Exposure was performed on a 128-slice single-source scanner providing a collimation of 128 x 0.6 mm and a rotation time of 0.38 seconds. CTCA in the SAS mode was acquired with variation of the tube current (160, 240, 320 mAs) and tube voltage (100, 120, 140 kV) at a simulated heart rate of 60 beats per minute and a scan range of 13.5 cm. Results: Depending on gender, tube current and tube voltage, the effective dose of a CTCA in SAS mode varies from 2.8 to 10.8 mSv. Due to breast tissue in the primary scan range, exposure in the case of females showed an increase of up to 60.0 ± 0.4 % compared to males. The dose reduction achieved by a reduction of tube current showed a significant positive, linear correlation to effective dose, with a possible decrease in the effective dose of up to 60.4 % (r = 0.998; p = 0.044). The estimated effective dose can be reduced disproportionately by using a lower tube voltage, with a dose reduction of up to 52.4 %. Conclusion: Further substantial dose reduction of low-dose CTCA in SAS mode can be achieved by adapting the tube current and tube voltage and should be implemented in the clinical routine, i.e. by adapting those protocol parameters to patient body weight. (orig.)

  15. astroABC : An Approximate Bayesian Computation Sequential Monte Carlo sampler for cosmological parameter estimation

    Energy Technology Data Exchange (ETDEWEB)

    Jennings, E.; Madigan, M.

    2017-04-01

    Given the complexity of modern cosmological parameter inference, where we are faced with non-Gaussian data and noise, correlated systematics and multi-probe correlated data sets, the Approximate Bayesian Computation (ABC) method is a promising alternative to traditional Markov Chain Monte Carlo approaches in the case where the Likelihood is intractable or unknown. The ABC method is called "Likelihood free" as it avoids explicit evaluation of the Likelihood by using a forward model simulation of the data which can include systematics. We introduce astroABC, an open source ABC Sequential Monte Carlo (SMC) sampler for parameter estimation. A key challenge in astrophysics is the efficient use of large multi-probe datasets to constrain high dimensional, possibly correlated parameter spaces. With this in mind astroABC allows for massive parallelization using MPI, a framework that handles spawning of jobs across multiple nodes. A key new feature of astroABC is the ability to create MPI groups with different communicators, one for the sampler and several others for the forward model simulation, which speeds up sampling time considerably. For smaller jobs the Python multiprocessing option is also available. Other key features include: a Sequential Monte Carlo sampler, a method for iteratively adapting tolerance levels, local covariance estimates using scikit-learn's KDTree, modules for specifying an optimal covariance matrix for a component-wise or multivariate normal perturbation kernel, output and restart files that are backed up every iteration, user-defined metric and simulation methods, a module for specifying heterogeneous parameter priors including non-standard prior PDFs, a module for specifying a constant, linear, log or exponential tolerance level, and well-documented examples and sample scripts. This code is hosted online at https://github.com/EliseJ/astroABC

  16. Optimal Resource Allocation in Ultra-low Power Fog-computing SWIPT-based Networks

    OpenAIRE

    Janatian, Nafiseh; Stupia, Ivan; Vandendorpe, Luc

    2017-01-01

    In this paper, we consider a fog computing system consisting of a multi-antenna access point (AP), an ultra-low power (ULP) single antenna device and a fog server. The ULP device is assumed to be capable of both energy harvesting (EH) and information decoding (ID) using a time-switching simultaneous wireless information and power transfer (SWIPT) scheme. The ULP device deploys the harvested energy for ID and either local computing or offloading the computations to the fog server depending on ...

  17. Tidal Current Energy Resources off the South and West Coasts of Korea: Preliminary Observation-Derived Estimates

    Directory of Open Access Journals (Sweden)

    Woo-Jin Jeong

    2013-01-01

    Full Text Available In this study we estimate the prospective tidal current energy resources off the south and west coasts of Korea and explore the influence of modeling tidal current energies based on 15-day versus month-long data records for regimes with pronounced perigean/apogean influences. The tidal current energy resources off southern and western Korea were calculated using 29-day in situ observation data from 264 stations. The resultant annual energy densities found at each station were categorized into six groups, with a greater percentage of sites falling into the lower-energy groups: 1.1% for >10 MWh·m⁻²; 2.7% for 5 to 10 MWh·m⁻²; 6.8% for 3 to 5 MWh·m⁻²; 9.1% for 2 to 3 MWh·m⁻² and 80.3% for <2 MWh·m⁻². Analysis shows that the greatest concentration of high annual energy densities occurs in the Jeonnam Province coastal region on the western tip of southwest Korea: 23 MWh·m⁻² at Uldolmok, 15 MWh·m⁻² at Maenggol Sudo, 9.2 MWh·m⁻² at Geocha Sudo and 8.8 MWh·m⁻² at Jaingjuk Sudo. The second highest annual energy density concentration, with 16 MWh·m⁻², was found in Gyudong Suro, in Gyeonggi Province’s Gyeonggi Bay. We then used data from the 264 stations to examine the effect of perigean and apogean influences on tidal current energy density evaluations. Compared to derivations using month-long records, mean annual energy densities derived using 15-day perigean spring-neap current records alone overestimate the annual mean energy by around 10% whereas those derived using only the apogean records underestimate energy by around 12%. In particular, accuracy of the S2 contribution to the energy density calculations is significantly affected by use of the 15-day data sets, compared to the M2 component, which is relatively consistent. Further, annual energy density estimates derived from 29-day records but excluding the N2 constituent underestimate the potential resource by about 5.4%. Results indicate that one month of data is

  18. Evaluation of Computational Techniques for Parameter Estimation and Uncertainty Analysis of Comprehensive Watershed Models

    Science.gov (United States)

    Yen, H.; Arabi, M.; Records, R.

    2012-12-01

    The structural complexity of comprehensive watershed models continues to increase in order to incorporate inputs at finer spatial and temporal resolutions and simulate a larger number of hydrologic and water quality responses. Hence, computational methods for parameter estimation and uncertainty analysis of complex models have gained increasing popularity. This study aims to evaluate the performance and applicability of a range of algorithms, from computationally frugal approaches to formal implementations of Bayesian statistics using Markov Chain Monte Carlo (MCMC) techniques. The evaluation procedure hinges on the appraisal of (i) the quality of the final parameter solution in terms of the minimum value of the objective function corresponding to weighted errors; (ii) the algorithmic efficiency in reaching the final solution; (iii) the marginal posterior distributions of model parameters; (iv) the overall identifiability of the model structure; and (v) the effectiveness in drawing samples that can be classified as behavior-giving solutions. The proposed procedure recognizes an important and often neglected issue in watershed modeling: solutions with minimum objective function values may not necessarily reflect the behavior of the system. The general behavior of a system is often characterized by the analysts according to the goals of the study using various error statistics, such as percent bias or the Nash-Sutcliffe efficiency coefficient. Two case studies are carried out to examine the efficiency and effectiveness of four Bayesian approaches including Metropolis-Hastings sampling (MHA), Gibbs sampling (GSA), uniform covering by probabilistic rejection (UCPR), and differential evolution adaptive Metropolis (DREAM); a greedy optimization algorithm dubbed dynamically dimensioned search (DDS); and shuffled complex evolution (SCE-UA), a widely implemented evolutionary heuristic optimization algorithm. The Soil and Water Assessment Tool (SWAT) is used to simulate hydrologic and
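
    Of the samplers compared above, Metropolis-Hastings is the simplest to write down; the sketch below targets a toy one-parameter posterior rather than a SWAT calibration, purely to show the accept/reject mechanics.

    ```python
    import numpy as np

    # Minimal random-walk Metropolis-Hastings sampler for a one-parameter problem,
    # standing in for the far larger watershed-model calibrations discussed above.
    rng = np.random.default_rng(7)

    data = rng.normal(3.0, 2.0, size=200)               # synthetic "observations"

    def log_posterior(theta):
        # Flat prior on theta; Gaussian likelihood with known sigma = 2.
        return -0.5 * np.sum((data - theta) ** 2) / 4.0

    theta, samples = 0.0, []
    current_lp = log_posterior(theta)
    for _ in range(10000):
        proposal = theta + rng.normal(0, 0.3)           # random-walk proposal
        proposal_lp = log_posterior(proposal)
        if np.log(rng.uniform()) < proposal_lp - current_lp:
            theta, current_lp = proposal, proposal_lp   # accept
        samples.append(theta)

    burned = np.array(samples[2000:])                   # discard burn-in
    print(f"posterior mean ~ {burned.mean():.2f}, sd ~ {burned.std():.2f}")
    ```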

  19. Scientific and practical tools for dealing with water resource estimations for the future

    Directory of Open Access Journals (Sweden)

    D. A. Hughes

    2015-06-01

    Full Text Available Future flow regimes will be different from today's, and imperfect knowledge of present and future climate variations, rainfall–runoff processes and anthropogenic impacts makes them highly uncertain. Future water resources decisions will rely on practical and appropriate simulation tools that are sensitive to changes, can assimilate different types of change information and are flexible enough to accommodate improvements in the understanding of change. They need to include representations of uncertainty and generate information appropriate for uncertain decision-making. This paper presents some examples of the tools that have been developed to address these issues in the southern Africa region. The examples include uncertainty in present-day simulations due to lack of understanding and data, the use of climate change projection data from multiple climate models, and future catchment responses due to both climate and development effects. The conclusions are that the tools and models are largely available, and that what we need is more reliable forcing and model evaluation information, as well as methods of making decisions with such inevitably uncertain information.

  20. Resource communication: Variability in estimated runoff in a forested area based on different cartographic data sources

    Directory of Open Access Journals (Sweden)

    Laura Fragoso

    2017-10-01

    Full Text Available Aim of study: The goal of this study is to analyse variations in curve number (CN) values produced by different cartographic data sources in a forested watershed, and to determine which of them best fits the measured runoff volumes. Area of study: A forested watershed located in western Spain. Material and methods: Four digital cartographic data sources were used to determine the runoff CN in the watershed. Main results: None of the cartographic sources provided all the information necessary to properly determine the CN values. Our proposed methodology, focused on the tree canopy cover, improves the results achieved. Research highlights: The CN value in forested areas should be estimated as a function of tree canopy cover, and new calibrated tables should be implemented at a local scale.
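
    The curve-number relationship underlying the comparison is compact enough to show; in the sketch below the storm depth, the CN values and the initial-abstraction ratio of 0.2 are illustrative choices, not values from the study.

    ```python
    # SCS curve-number runoff for a single storm (SI units, depths in mm).
    # The rainfall depth and CN values below are illustrative only.

    def scs_runoff_mm(rainfall_mm: float, cn: float, ia_ratio: float = 0.2) -> float:
        """Direct runoff depth Q from storm rainfall P and curve number CN."""
        s = 25400.0 / cn - 254.0          # potential maximum retention, mm
        ia = ia_ratio * s                 # initial abstraction
        if rainfall_mm <= ia:
            return 0.0
        return (rainfall_mm - ia) ** 2 / (rainfall_mm - ia + s)

    storm = 60.0   # mm
    for cn in (55, 70, 85):               # e.g. dense canopy cover vs. sparser cover
        print(f"CN={cn}: Q = {scs_runoff_mm(storm, cn):.1f} mm")
    ```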

  1. Resources monitoring and automatic management system for multi-VO distributed computing system

    Science.gov (United States)

    Chen, J.; Pelevanyuk, I.; Sun, Y.; Zhemchugov, A.; Yan, T.; Zhao, X. H.; Zhang, X. M.

    2017-10-01

    Multi-VO support based on DIRAC has been set up to provide workload and data management for several high energy experiments at IHEP. To monitor and manage in a uniform way the heterogeneous resources which belong to different Virtual Organizations, a resource monitoring and automatic management system based on the Resource Status System (RSS) of DIRAC is presented in this paper. The system is composed of three parts: information collection, status decision and automatic control, and information display. The information collection includes active and passive ways of gathering status information from different sources and storing it in databases. The status decision and automatic control part is used to evaluate the resource status and to take control actions on resources automatically through pre-defined policies and actions. The monitoring information is displayed on a web portal. Both real-time information and historical information can be obtained from the web portal. All the implementations are based on the DIRAC framework. The information and control, including sites, policies and the web portal for different VOs, can be well defined and distinguished within the DIRAC user and group management infrastructure.
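
    The three-part structure described (information collection, status decision via pre-defined policies, automatic control) can be caricatured in a few lines; the site names, metric, threshold and action below are invented placeholders and are not DIRAC RSS code.

    ```python
    # Toy version of the collect -> decide -> act loop described above.
    # Site names, thresholds and actions are invented placeholders, not DIRAC RSS code.

    def collect_status():
        # In the real system this aggregates active and passive probes into databases.
        return {"SITE-A": {"failed_jobs_ratio": 0.02},
                "SITE-B": {"failed_jobs_ratio": 0.35}}

    def decide(metrics, ban_threshold=0.25):
        # A single pre-defined policy: ban a site whose job failure ratio is too high.
        return {site: ("banned" if m["failed_jobs_ratio"] > ban_threshold else "active")
                for site, m in metrics.items()}

    def act(decisions):
        for site, status in decisions.items():
            print(f"{site}: set status -> {status}")   # real actions would call the WMS/DMS

    act(decide(collect_status()))
    ```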

  2. PEDIC: a computer program to estimate the effect of evacuation on population exposure following acute radionuclide releases to the atmosphere

    International Nuclear Information System (INIS)

    Strenge, D.L.; Peloquin, R.A.

    1981-01-01

    The computer program PEDIC is described for estimation of the effect of evacuation on population exposure. The program uses joint frequency, annual average meteorological data and a simple population evacuation model to estimate exposure reduction due to movement of people away from radioactive plumes following an acute release of activity. Atmospheric dispersion is based on a sector averaged Gaussian model with consideration of plume rise and building wake effects. Appendices to the report provide details of the computer program design, a program listing, input card preparation instructions and sample problems
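
    The abstract names the dispersion model (a sector-averaged Gaussian); a minimal form of that ground-level relation is sketched below. The vertical dispersion, wind speed, release rate and effective height are illustrative assumptions, and the plume rise and building wake effects mentioned in the abstract are not included.

    ```python
    import math

    # Sector-averaged Gaussian plume, ground-level concentration for a release at
    # effective height H. All numeric inputs below are illustrative assumptions.

    def sector_avg_concentration(q_bq_per_s, u_m_per_s, x_m, sigma_z_m, h_m,
                                 sector_width_rad=math.radians(22.5)):
        """chi (Bq/m^3) at downwind distance x for a 22.5-degree sector average."""
        return (math.sqrt(2.0 / math.pi) * q_bq_per_s
                * math.exp(-h_m ** 2 / (2.0 * sigma_z_m ** 2))
                / (sigma_z_m * u_m_per_s * sector_width_rad * x_m))

    # Example: 1e10 Bq/s release, 30 m effective height, 3 m/s wind, sigma_z assumed 25 m at 1 km.
    chi = sector_avg_concentration(1e10, 3.0, 1000.0, 25.0, 30.0)
    print(f"sector-averaged concentration at 1 km: {chi:.2e} Bq/m^3")
    ```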

  3. PEDIC - A COMPUTER PROGRAM TO ESTIMATE THE EFFECT OF EVACUATION ON POPULATION EXPOSURE FOLLOWING ACUTE RADIONUCLIDE RELEASES TO THE ATMOSPHERE

    Energy Technology Data Exchange (ETDEWEB)

    Strenge, D. L.; Peloquin, R. A.

    1981-01-01

    The computer program PEDIC is described for estimation of the effect of evacuation on population exposure. The program uses joint frequency, annual average meteorological data and a simple population evacuation model to estimate exposure reduction due to movement of people away from radioactive plumes following an acute release of activity. Atmospheric dispersion is based on a sector averaged Gaussian model with consideration of plume rise and building wake effects. Appendices to the report provide details of the computer program design, a program listing, input card preparation instructions and sample problems.

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  5. Analyses of computer programs for the probabilistic estimation of design earthquake and seismological characteristics of the Korean Peninsula

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Gi Hwa [Seoul National Univ., Seoul (Korea, Republic of)

    1997-11-15

    The purpose of the present study is to develop predictive equations, from simulated motions, which are adequate for the Korean Peninsula, and to analyze and utilize the computer programs for the probabilistic estimation of design earthquakes. In part I of the report, computer programs for the probabilistic estimation of design earthquakes are analyzed and applied to the seismic hazard characterization of the Korean Peninsula. In part II of the report, available instrumental earthquake records are analyzed to estimate earthquake source characteristics and medium properties, which are incorporated into the simulation process. Earthquake records are then simulated using the estimated parameters. Finally, predictive equations constructed from the simulation are given in terms of magnitude and hypocentral distance.

  6. Computing the Deflection of the Vertical for Improving Aerial Surveys: A Comparison between EGM2008 and ITALGEO05 Estimates

    Directory of Open Access Journals (Sweden)

    Riccardo Barzaghi

    2016-07-01

    Full Text Available Recent studies on the influence of the anomalous gravity field in GNSS/INS applications have shown that neglecting the impact of the deflection of the vertical in aerial surveys induces horizontal and vertical errors in the measurement of an object that is part of the observed scene; these errors can vary from a few tens of centimetres to over one meter. The works reported in the literature refer to vertical deflection values based on global geopotential model estimates. In this paper we compared this approach with the one based on local gravity data and collocation methods. In particular, denoting by ξ and η the two mutually perpendicular components of the deflection of the vertical vector (in the north and east directions, respectively), their values were computed by collocation in the framework of the Remove-Compute-Restore technique, applied to the gravity database used for estimating the ITALGEO05 geoid. Following this approach, these values have been computed at different altitudes that are relevant in aerial surveys. The (ξ, η) values were then also estimated using the high-degree EGM2008 global geopotential model and compared with those obtained in the previous computation. The analysis of the differences between the two estimates has shown that the (ξ, η) global geopotential model estimate can be reliably used in aerial navigation applications that require the use of sensors connected to a GNSS/INS system only above a given height (e.g., 3000 m in this paper) that must be defined by simulations.

  7. Estimation of radiation exposure from lung cancer screening program with low-dose computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Su Yeon; Jun, Jae Kwan [Graduate School of Cancer Science and Policy, National Cancer Center, Seoul (Korea, Republic of)

    2016-12-15

    The National Lung Screening Trial (NLST) demonstrated that screening with low-dose computed tomography (LDCT) reduced lung cancer mortality in a high-risk population. Recently, the United States Preventive Services Task Force (USPSTF) gave a B recommendation for annual LDCT screening for individuals at high risk. With these promising results, Korea developed a lung cancer screening guideline and is planning a pilot study for the implementation of national lung cancer screening. With the widespread adoption of lung cancer screening with LDCT, there are concerns about the harms of screening, including high false-positive rates and radiation exposure. Over the 3 rounds of screening in the NLST, 96.4% of positive results were false positives. Although the initial screening is performed at low dose, subsequent diagnostic examinations following positive results additively contribute to the patient's lifetime exposure. In implementing a large-scale screening program, there is a lack of established risk assessment of the effect of radiation exposure from a long-term screening program. Thus, the purpose of this study was to estimate the cumulative radiation exposure of an annual LDCT lung cancer screening program over a 20-year period.
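
    The cumulative-exposure bookkeeping that motivates the study is simple arithmetic; the per-examination doses and the false-positive work-up rate below are assumptions for illustration only.

    ```python
    # Back-of-the-envelope cumulative dose from an annual LDCT screening programme.
    # Per-exam doses and the false-positive work-up rate are illustrative assumptions.

    YEARS = 20
    LDCT_DOSE_MSV = 1.5           # assumed dose per screening LDCT
    DIAGNOSTIC_CT_DOSE_MSV = 7.0  # assumed dose per follow-up diagnostic CT
    FALSE_POSITIVE_RATE = 0.2     # assumed fraction of screens triggering a work-up CT

    cumulative = YEARS * (LDCT_DOSE_MSV + FALSE_POSITIVE_RATE * DIAGNOSTIC_CT_DOSE_MSV)
    print(f"Cumulative effective dose over {YEARS} years: {cumulative:.1f} mSv")
    ```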

  8. Measurement and Estimation of Renal Size by Computed Tomography in Korean Children.

    Science.gov (United States)

    Park, Chan Won; Yu, Nali; Yun, Sin Weon; Chae, Soo Ahn; Lee, Na Mi; Yi, Dae Yong; Choi, Young Bae; Lim, In Seok

    2017-03-01

    Adequate organ growth is an important aspect of growth evaluation in children. Renal size is an important indicator of adequate renal growth; computed tomography (CT) can closely estimate actual kidney size. However, insufficient data are available on normal renal size as measured by CT. This study aimed to evaluate the relationships of anthropometric indices with renal length and volume measured by CT in Korean pediatric patients. Renal length and volume were measured using CT images in 272 pediatric patients without renal disease. Data for anthropometric indices, including height, weight, and body surface area (BSA), were obtained from medical records. Using the equation for an ellipsoid, renal volume was calculated in cubic centimeters. Height showed the greatest correlation with renal length on stepwise multiple linear regression analysis; BSA showed the strongest significant correlation with renal volume. The mean renal size for each age group and height group was determined; it showed a tendency to increase with age and height. This is the first Korean study to report the relationship between body indices and renal size measured by CT. These results can serve as normative standards for assessing adequate renal growth.
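
    The ellipsoid equation referred to in the abstract is V = (π/6) × length × width × depth; the sketch below applies it to placeholder measurements.

    ```python
    import math

    # Renal volume from three orthogonal CT measurements using the ellipsoid formula
    # mentioned in the abstract: V = (pi / 6) * length * width * depth.
    # The measurements below are placeholder values in centimetres.

    def ellipsoid_volume_cm3(length_cm: float, width_cm: float, depth_cm: float) -> float:
        return math.pi / 6.0 * length_cm * width_cm * depth_cm

    print(f"{ellipsoid_volume_cm3(9.5, 4.8, 4.2):.1f} cm^3")
    ```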

  9. Estimating hydraulic parameters of the Açu-Brazil aquifer using the computer analysis of micrographs

    Science.gov (United States)

    de Lucena, Leandson R. F.; da Silva, Luis R. D.; Vieira, Marcela M.; Carvalho, Bruno M.; Xavier Júnior, Milton M.

    2016-04-01

    The conventional way of obtaining the hydraulic parameters of aquifers is through the interpretation of aquifer tests, which requires fairly complex logistics in terms of equipment and personnel. On the other hand, the processing and analysis of digital images of two-dimensional rock sample micrographs presents itself as a promising (simpler and cheaper) alternative procedure for obtaining estimates of hydraulic parameters. This methodology involves the sampling of rocks, followed by the making and imaging of thin rock sections, image segmentation, three-dimensional reconstruction and flow simulation. This methodology was applied to the outcropping portion of the Açu aquifer in the northeast of Brazil, and the computational analyses of the thin rock sections of the acquired samples produced effective porosities between 11.2% and 18.5%, and permeabilities between 52.4 mD and 1140.7 mD. Considering that the aquifer is unconfined, these effective porosity values can effectively be used as storage coefficients. The hydraulic conductivities produced by adopting different water dynamic viscosities at a temperature of 28 °C in the conversion of the permeabilities result in values in the range of [6.03 × 10⁻⁷, 1.43 × 10⁻⁵] m/s, compatible with the local hydrogeology.
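
    The conversion from the simulated permeabilities to the hydraulic conductivities quoted above is K = k ρ g / μ (with 1 mD ≈ 9.869 × 10⁻¹⁶ m²); the water density and viscosity at about 28 °C used below are handbook-style assumptions.

    ```python
    # Convert intrinsic permeability (millidarcy) to hydraulic conductivity (m/s):
    #   K = k * rho * g / mu,  with 1 mD = 9.869e-16 m^2.
    # Water properties at ~28 degC are handbook-style values (assumed here).

    MD_TO_M2 = 9.869e-16
    RHO = 996.0        # kg/m^3 at ~28 degC (assumed)
    MU = 8.3e-4        # Pa.s at ~28 degC (assumed)
    G = 9.81           # m/s^2

    def hydraulic_conductivity(perm_md: float) -> float:
        return perm_md * MD_TO_M2 * RHO * G / MU

    for k_md in (52.4, 1140.7):    # permeability range reported above
        print(f"{k_md:7.1f} mD -> K = {hydraulic_conductivity(k_md):.2e} m/s")
    ```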

  10. Estimation of pulmonary water distribution and pulmonary congestion by computed tomography

    International Nuclear Information System (INIS)

    Morooka, Nobuhiro; Watanabe, Shigeru; Masuda, Yoshiaki; Inagaki, Yoshiaki

    1982-01-01

    Computed tomography (CT) of the lung in normal subjects and patients with congestive heart failure was performed in the supine position with deep inspiration to obtain pulmonary CT values and images. The mean CT value in normal subjects was higher in the posterior than in the anterior lung field, presumably because blood vessels were more dilated in the former than in the latter due to the effects of gravity. The mean pulmonary CT value in patients with congestive heart failure was significantly increased, possibly due to an increase in blood flow per unit lung volume arising from either pulmonary congestion or pulmonary interstitial and alveolar edema. The mean pulmonary CT value increased in parallel with the severity of pulmonary congestion and interstitial or alveolar edema and was well correlated with the pulmonary arterial wedge pressure, indicating that such a correlation is a valuable tool in assessing therapeutic effects. The results of the present study indicate that pulmonary CT is useful for the noninvasive estimation of intrapulmonary water content and its distribution, thereby providing an effective diagnostic clue to various conditions in congestive heart failure. (author)

  11. Application of computer graphics to generate coal resources of the Cache coal bed, Recluse geologic model area, Campbell County, Wyoming

    Science.gov (United States)

    Schneider, G.B.; Crowley, S.S.; Carey, M.A.

    1982-01-01

    Low-sulfur subbituminous coal resources have been calculated, using both manual and computer methods, for the Cache coal bed in the Recluse Model Area, which covers the White Tail Butte, Pitch Draw, Recluse, and Homestead Draw SW 7 1/2 minute quadrangles, Campbell County, Wyoming. Approximately 275 coal thickness measurements obtained from drill hole data are evenly distributed throughout the area. The Cache coal and associated beds are in the Paleocene Tongue River Member of the Fort Union Formation. The depth from the surface to the Cache bed ranges from 269 to 1,257 feet. The thickness of the coal is as much as 31 feet, but in places the Cache coal bed is absent. Comparisons between hand-drawn and computer-generated isopach maps show minimal differences. Total coal resources calculated by computer show the bed to contain 2,316 million short tons or about 6.7 percent more than the hand-calculated figure of 2,160 million short tons.
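
    The arithmetic behind isopach-based tonnage figures like these is area × thickness × unit tonnage; the grid cells and the 1,770 short tons per acre-foot factor commonly used for subbituminous coal are assumptions for illustration.

    ```python
    # Coal tonnage as (area x thickness x unit weight), the arithmetic behind
    # isopach-based resource figures. The grid values and the 1,770 short tons per
    # acre-foot factor (typical for subbituminous coal) are assumptions for illustration.

    TONS_PER_ACRE_FOOT = 1770.0

    def tonnage_short_tons(cells):
        """cells: iterable of (area_acres, thickness_ft) pairs from an isopach grid."""
        return sum(area * thickness * TONS_PER_ACRE_FOOT for area, thickness in cells)

    grid = [(40.0, 12.5), (40.0, 18.0), (40.0, 31.0), (40.0, 0.0)]   # hypothetical cells
    print(f"{tonnage_short_tons(grid):,.0f} short tons")
    ```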

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operations have been at a lower level as the Run 1 samples are completed and smaller samples for upgrades and preparations ramp up. Much of the computing activity is focused on preparations for Run 2 and on improvements in data access and flexibility in using resources. Operations Office: Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week, with peaks close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months.   The tape utilisation was a focus for the operation teams, with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  13. Statistical characterization of roughness uncertainty and impact on wind resource estimation

    Directory of Open Access Journals (Sweden)

    M. Kelly

    2017-04-01

    Full Text Available In this work we relate uncertainty in the background roughness length (z0) to uncertainty in wind speeds, where the latter are predicted at a wind farm location based on wind statistics observed at a different site. The sensitivity of predicted winds to roughness is derived analytically for the industry-standard European Wind Atlas method, which is based on the geostrophic drag law. We statistically consider roughness and its corresponding uncertainty, in terms of both z0 derived from measured wind speeds and z0 chosen in practice by wind engineers. We show the combined effect of roughness uncertainty arising from differing wind-observation and turbine-prediction sites; this is done for the case of roughness bias as well as for the general case. For estimation of uncertainty in annual energy production (AEP), we also develop a generalized analytical turbine power curve, from which we derive a relation between mean wind speed and AEP. Following these developments, we provide guidance on the approximate roughness uncertainty magnitudes to be expected in industry practice, and we also find that sites with larger background roughness incur relatively larger uncertainties.
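
    The sensitivity chain described above (roughness to friction velocity via the geostrophic drag law, then to wind speed via the logarithmic profile) can be sketched numerically; the drag-law constants A and B, the geostrophic wind, the latitude and the hub height below are illustrative assumptions.

    ```python
    import math

    # Sensitivity of a predicted hub-height wind speed to the assumed roughness length,
    # via the geostrophic drag law and the neutral logarithmic profile. A, B, the
    # geostrophic wind and the latitude are illustrative assumptions.

    KAPPA, A, B = 0.4, 1.8, 4.5
    F = 2 * 7.292e-5 * math.sin(math.radians(55.0))    # Coriolis parameter, ~55 deg N

    def friction_velocity(g_wind, z0, iters=50):
        """Solve G = (u*/k) * sqrt((ln(u*/(f z0)) - A)^2 + B^2) for u* by fixed point."""
        ustar = 0.03 * g_wind                          # initial guess
        for _ in range(iters):
            ustar = KAPPA * g_wind / math.hypot(math.log(ustar / (F * z0)) - A, B)
        return ustar

    def wind_at_height(g_wind, z0, z=100.0):
        ustar = friction_velocity(g_wind, z0)
        return ustar / KAPPA * math.log(z / z0)        # neutral logarithmic profile

    G = 10.0                                           # assumed geostrophic wind, m/s
    for z0 in (0.01, 0.03, 0.10):                      # candidate background roughness, m
        print(f"z0 = {z0:5.2f} m -> U(100 m) = {wind_at_height(G, z0):.2f} m/s")
    ```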

  14. Resource demand estimates for sustainable forest management: Mngazana Mangrove Forest, South Africa

    Directory of Open Access Journals (Sweden)

    C. H. Traynor

    2008-08-01

    Full Text Available Since democratization in 1994, South African forest policies have promoted sustainable forest management. However, implementation has been problematic due to limited information concerning forest product utilization. This paper investigates and quantifies timber use from the Mngazana Mangrove Forest, Eastern Cape Province, South Africa. Three local communities utilize stems of the mangrove species Rhizophora mucronata Lam. and Bruguiera gymnorrhiza (L.) Lam. for building construction. There were two distinct building shapes, circular and rectangular. On average, 155 stems were used for circular buildings and 378 stems for rectangular buildings. Most buildings were constructed using mangroves as well as indigenous timber from coastal scarp forests. The proportion of mangrove stems in buildings varied from 0 to 95%. The annual demand for mangroves was estimated to be 18 400 stems. Due to the high annual demand, projected human population growth rates have a minor influence upon future demand values. For effective sustainable forest management, the standing stock at Mngazana should be restricted to the two mangrove species utilized for building construction, and a forest inventory performed so that demand for building can be compared to supply.

  15. A Tri-National program for estimating the link between snow resources and hydrological droughts

    Directory of Open Access Journals (Sweden)

    M. Zappa

    2015-06-01

    Full Text Available To evaluate how summer low flows and droughts are affected by the winter snowpack, a Tri-National effort will analyse data from three catchments: Alpbach (Prealps, central Switzerland), Gudjaretis-Tskali (Little Caucasus, central Georgia), and Kamenice (Jizera Mountains, northern Czech Republic). Two GIS-based rainfall-runoff models will simulate over 10 years of runoff in streams based on rain and snowfall measurements and further meteorological variables. The models use information on the geographical settings of the catchments together with knowledge of the hydrological processes of runoff generation from rainfall, looking particularly at the relationship between spring snowmelt and summer droughts. These processes include snow accumulation and melt, evapotranspiration, and groundwater recharge in spring that contributes to the summer runoff, and will be studied by means of the environmental isotopes 18O and 2H. Knowledge about the isotopic composition of the different water sources will allow identification of the flow paths and estimation of the residence time of snow meltwater in the subsurface and its contribution to the stream. The application of the models in different nested or neighbouring catchments will explore their potential for further development and allow better early prediction of low-flow periods in various mountainous zones across Europe. The paper presents the planned activities, including a first analysis of the already available datasets of environmental isotopes, discharge and snow water equivalent, and modelling experiments with these datasets.

  16. A guide to developing resource selection functions from telemetry data using generalized estimating equations and generalized linear mixed models

    Directory of Open Access Journals (Sweden)

    Nicola Koper

    2012-03-01

    Full Text Available Resource selection functions (RSF) are often developed using satellite (ARGOS) or Global Positioning System (GPS) telemetry datasets, which provide a large amount of highly correlated data. We discuss and compare the use of generalized linear mixed-effects models (GLMM) and generalized estimating equations (GEE) for using this type of data to develop RSFs. GLMMs directly model differences among caribou, while GEEs depend on an adjustment of the standard error to compensate for correlation of data points within individuals. Empirical standard errors, rather than model-based standard errors, must be used with either GLMMs or GEEs when developing RSFs. There are several important differences between these approaches; in particular, GLMMs are best for producing parameter estimates that predict how management might influence individuals, while GEEs are best for predicting how management might influence populations. As the interpretation, value, and statistical significance of both types of parameter estimates differ, it is important that users select the appropriate analytical method. We also outline the use of k-fold cross validation to assess fit of these models. Both GLMMs and GEEs hold promise for developing RSFs as long as they are used appropriately.
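
    The GEE approach described above, with empirical (sandwich) standard errors, can be sketched as follows in Python's statsmodels rather than whatever package the authors used. The column names (used, elev, dist_road, animal_id) and the simulated used/available data are purely hypothetical.

    ```python
    # Minimal GEE sketch for a population-averaged RSF with an exchangeable
    # working correlation; statsmodels reports robust (empirical/sandwich)
    # standard errors for GEE fits by default.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n_animals, n_pts = 20, 200
    df = pd.DataFrame({
        "animal_id": np.repeat(np.arange(n_animals), n_pts),
        "elev": rng.normal(0, 1, n_animals * n_pts),
        "dist_road": rng.normal(0, 1, n_animals * n_pts),
    })
    # Simulate use (1) versus availability (0) from a known selection pattern.
    logit = 0.8 * df["elev"] - 0.5 * df["dist_road"]
    df["used"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    gee = sm.GEE.from_formula("used ~ elev + dist_road",
                              groups="animal_id", data=df,
                              family=sm.families.Binomial(),
                              cov_struct=sm.cov_struct.Exchangeable())
    print(gee.fit().summary())
    ```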

  17. The Kimberley Hospital Rule (KHR) for urgent computed tomography of the brain in a resource-limited environment.

    Science.gov (United States)

    Bezuidenhout, A Fourie; Hurter, Delme; Maydell, Arthur T; van Niekerk, Francois; de Figueiredo, Sonia A B; Harvey, Justin; Vlok, Adriaan J; Pitcher, Richard D

    2013-07-29

    The indications for urgent computed tomography of the brain (CTB) in the acute setting are controversial. While guidelines have been proposed for CTB in well-resourced countries, these are not always appropriate for resource-limited environments. Furthermore, no unifying guideline exists for trauma-related and non-trauma-related acute intracranial pathology. Adoption by resource-limited countries of more conservative scanning protocols, with outcomes comparable to well-resourced countries, would be of significant benefit. A multidisciplinary team from Kimberley Hospital in the Northern Cape Province of South Africa adopted the principles defined in the National Institute for Health and Care Excellence (NICE) guideline for the early management of head injury and drafted the Kimberley Hospital Rule (KHR), a proposed unifying guideline for the imaging of acute intracranial pathology in a resource-limited environment. The aim of this study was to evaluate the sensitivity and specificity of the KHR. A prospective cohort study was conducted in the Northern Cape Province between 1 May 2010 and 30 April 2011. All patients older than 16 years presenting to emergency departments with acute intracranial symptoms were triaged according to the KHR into three groups, as follows: group 1 - immediate scan (within 1 hour); group 2 - urgent scan (within 8 hours); and group 3 - no scan required. Patients in groups 1 and 2 were studied. The primary outcome was CTB findings of clinically significant intracranial pathology requiring an acute change in management. Seven hundred and three patients were included. The KHR achieved 90.3% sensitivity and 45.5% specificity, while reducing the number of immediate CTBs by 36.0%. The KHR is an accurate, unifying clinical guideline that appears to optimise the utilisation of CTB in a resource-limited environment.

  18. Recommendations for protecting National Library of Medicine Computing and Networking Resources

    Energy Technology Data Exchange (ETDEWEB)

    Feingold, R.

    1994-11-01

    Protecting Information Technology (IT) involves a number of interrelated factors. These include mission, available resources, technologies, existing policies and procedures, internal culture, contemporary threats, and strategic enterprise direction. In the face of this formidable list, a structured approach provides cost-effective actions that allow the organization to manage its risks. We face fundamental challenges that will persist for at least the next several years. It is difficult, if not impossible, to precisely quantify risk. IT threats and vulnerabilities change rapidly and continually. Limited organizational resources, combined with mission constraints such as availability and connectivity requirements, ensure that most systems will not be absolutely secure (if such security were even possible). In short, there is no technical (or administrative) "silver bullet." Protection means employing a stratified series of recommendations, matching protection levels against information sensitivities. Adaptive and flexible risk management is the key to effective protection of IT resources. The cost of the protection must be kept less than the expected loss, and one must take into account that an adversary will not expend more to attack a resource than the value of its compromise to that adversary. Notwithstanding the difficulty, if not impossibility, of precisely quantifying risk, this approach allows us to avoid the trap of choosing a course of action simply because "it's safer" or ignoring an area because no one has explored its potential risk. The recommendations for protecting IT resources begin with a discussion of contemporary threats and vulnerabilities and then proceed from general to specific preventive measures. From a risk management perspective, it is imperative to understand that today the vast majority of threats are against UNIX hosts connected to the Internet.
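
    The rule that protection cost should stay below expected loss can be illustrated with the standard annualized-loss-expectancy arithmetic. The threat names, asset values, rates, and control costs below are entirely hypothetical and are not taken from the report.

    ```python
    # Back-of-the-envelope illustration (not from the report): compare the
    # annual cost of a control against the annualized loss expectancy (ALE).
    def annualized_loss(asset_value, exposure_factor, annual_rate):
        """ALE = (asset value x fraction lost per incident) x incidents per year."""
        return asset_value * exposure_factor * annual_rate

    threats = {
        # name: (asset value $, exposure factor, incidents/year, control cost $/yr)
        "unix_host_intrusion": (250_000, 0.40, 0.5, 20_000),
        "password_sniffing":   (100_000, 0.25, 1.0, 5_000),
    }

    for name, (value, ef, aro, control_cost) in threats.items():
        ale = annualized_loss(value, ef, aro)
        verdict = "worth funding" if control_cost < ale else "not justified"
        print(f"{name:22s} ALE = ${ale:>9,.0f}  control = ${control_cost:>7,.0f}  -> {verdict}")
    ```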

  19. The Small Wind Energy Estimation Tool (SWEET –a practical application for a complicated resource

    Directory of Open Access Journals (Sweden)

    Keith Sunderland

    2013-10-01

    Full Text Available Of the forms of renewable energy available, wind energy is at the forefront of the European (and Irish) green initiative, with wind farms supplying a significant proportion of electrical energy demand. Increasingly, this type of distributed generation (DG) represents a "paradigm shift" towards increased decentralisation of energy supply. However, because of the distances of most DG from urban areas where demand is greatest, there is a loss of efficiency. One possible solution, placing smaller wind energy systems in urban areas, faces significant challenges. However, if a renewable solution to increasing energy demand is to be achieved, energy conversion systems in cities, where populations are concentrated, must be considered. That said, assessing the feasibility of small/micro wind energy systems within the built environment is still a major challenge. These systems are aerodynamically rough, and heterogeneous surfaces create complex flows that disrupt the steady-state conditions ideal for the operation of small wind turbines. In particular, a considerable amount of uncertainty is attributable to the lack of understanding concerning how turbulence within urban environments affects turbine productivity. This paper addresses some of these issues by providing an improved understanding of the complexities associated with wind energy prediction. This research used detailed wind observations to model its turbulence characteristics. The data were obtained using a sonic anemometer that measures wind speed along three orthogonal axes to resolve the wind vector at a temporal resolution of 10 Hz. The modelling emphasises the need for practical solutions by optimising standard meteorological observations of mean speeds and associated standard deviations to facilitate an improved appreciation of turbulence. The results of the modelling research are incorporated into a practical tool developed in EXCEL, namely the Small Wind Energy Estimation Tool (SWEET).
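
    A small sketch of the kind of post-processing implied above: reducing 10 Hz sonic-anemometer records to 10-minute mean speeds, standard deviations, and turbulence intensity (TI = sigma_U / U_mean). The synthetic wind components and the column names are assumptions for illustration only, not the study's data or the SWEET tool itself.

    ```python
    # Reduce 10 Hz horizontal wind components to 10-minute statistics and TI.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    idx = pd.date_range("2013-06-01", periods=10 * 60 * 30, freq="100ms")  # 30 min at 10 Hz
    u = 5.0 + rng.normal(0, 1.2, len(idx))   # streamwise component (m/s)
    v = rng.normal(0, 0.8, len(idx))         # lateral component (m/s)

    speed = pd.Series(np.hypot(u, v), index=idx, name="speed")
    stats = speed.resample("10min").agg(["mean", "std"])
    stats["TI"] = stats["std"] / stats["mean"]    # turbulence intensity
    print(stats.round(3))
    ```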

  20. Mobile clusters of single board computers: an option for providing resources to student projects and researchers.

    Science.gov (United States)

    Baun, Christian

    2016-01-01

    Clusters usually consist of servers, workstations or personal computers as nodes. But especially for academic purposes, such as student projects or scientific projects, the cost of purchase and operation can be a challenge. Single board computers cannot compete with the performance or energy efficiency of higher-value systems, but they are an option for building inexpensive cluster systems. Because of their compact design and modest energy consumption, it is possible to build clusters of single board computers in a way that makes them mobile and easily transported by the users. This paper describes the construction of such a cluster, useful applications and the performance of the single nodes. Furthermore, the cluster's performance and energy efficiency are analyzed by executing the High Performance Linpack benchmark with different numbers of nodes and different proportions of the system's total main memory utilized.
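
    A rough sketch of how Linpack throughput and energy efficiency can be summarized for such a cluster. The node counts, problem sizes, runtimes and power draws below are made-up placeholders, not measurements from the paper; only the approximate HPL operation-count formula is standard.

    ```python
    # Summarize HPL throughput (GFLOPS) and energy efficiency (GFLOPS/W).
    def hpl_gflops(n, seconds):
        """Approximate HPL operation count: 2/3*n^3 + 2*n^2 floating-point ops."""
        return ((2.0 / 3.0) * n**3 + 2.0 * n**2) / seconds / 1e9

    runs = [
        # (nodes, HPL problem size N, runtime s, measured power W) -- hypothetical
        (1, 10_000,  900.0,  4.0),
        (4, 20_000, 1800.0, 16.5),
        (8, 28_000, 2600.0, 33.0),
    ]

    for nodes, n, t, watts in runs:
        gflops = hpl_gflops(n, t)
        print(f"{nodes} node(s): {gflops:6.2f} GFLOPS, {gflops / watts:5.3f} GFLOPS/W")
    ```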

  1. Dynamic resource allocation engine for cloud-based real-time video transcoding in mobile cloud computing environments

    Science.gov (United States)

    Adedayo, Bada; Wang, Qi; Alcaraz Calero, Jose M.; Grecos, Christos

    2015-02-01

    The recent explosion in video-related Internet traffic has been driven by the widespread use of smart mobile devices, particularly smartphones with advanced cameras that are able to record high-quality videos. Although many of these devices offer the facility to record videos at different spatial and temporal resolutions, primarily with local storage considerations in mind, most users only ever use the highest quality settings. The vast majority of these devices are optimised for compressing the acquired video using a single built-in codec and have neither the computational resources nor the battery reserves to transcode the video to alternative formats. This paper proposes a new low-complexity dynamic resource allocation engine for cloud-based video transcoding services that are both scalable and capable of being delivered in real time. Firstly, through extensive experimentation, we establish resource requirement benchmarks for a wide range of transcoding tasks. The set of tasks investigated covers the most widely used input formats (encoder type, resolution, amount of motion and frame rate) associated with mobile devices and the most popular output formats derived from a comprehensive set of use cases, e.g. a mobile news reporter directly transmitting videos to a TV audience with various video format requirements, with minimal usage of resources both at the reporter's end and at the cloud infrastructure end for transcoding services.

  2. Computational modeling as a tool for water resources management: an alternative approach to problems of multiple uses

    Directory of Open Access Journals (Sweden)

    Haydda Manolla Chaves da Hora

    2012-04-01

    Full Text Available Today in Brazil there are many cases of incompatibility between the use of water and its availability. Due to the increase in the required variety and volume, the concept of multiple uses was created, as stated by Pinheiro et al. (2007). The use of the same resource to satisfy different needs with several restrictions (qualitative and quantitative) creates conflicts. Aiming to minimize these conflicts, this work was applied to the particular cases of Hydrographic Regions VI and VIII of Rio de Janeiro State, using computational modeling techniques (based on the MOHID software, Water Modeling System) as a tool for water resources management.

  3. How accurate are adolescents in portion-size estimation using the computer tool Young Adolescents' Nutrition Assessment on Computer (YANA-C)?

    Science.gov (United States)

    Vereecken, Carine; Dohogne, Sophie; Covents, Marc; Maes, Lea

    2010-06-01

    Computer-administered questionnaires have received increased attention for large-scale population research on nutrition. In Belgium-Flanders, Young Adolescents' Nutrition Assessment on Computer (YANA-C) has been developed. In this tool, standardised photographs are available to assist in portion-size estimation. The purpose of the present study is to assess how accurate adolescents are in estimating portion sizes of food using YANA-C. A convenience sample, aged 11-17 years, estimated the amounts of ten commonly consumed foods (breakfast cereals, French fries, pasta, rice, apple sauce, carrots and peas, crisps, creamy velouté, red cabbage, and peas). Two procedures were followed: (1) short-term recall: adolescents (n = 73) self-served their usual portions of the ten foods and estimated the amounts later the same day; (2) real-time perception: adolescents (n = 128) estimated two sets (different portions) of pre-weighed portions displayed near the computer. Self-served portions were, on average, 8% underestimated; significant underestimates were found for breakfast cereals, French fries, peas, and carrots and peas. Spearman's correlations between the self-served and estimated weights varied between 0.51 and 0.84, with an average of 0.72. The kappa statistics were moderate (>0.4) for all but one item. Pre-weighed portions were, on average, 15% underestimated, with significant underestimates for fourteen of the twenty portions. Photographs of food items can serve as a good aid in ranking subjects; however, to assess actual intake at a group level, underestimation must be considered.
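
    The accuracy measures reported above (mean percentage underestimation and Spearman rank correlation between estimated and weighed portions) can be computed as in the small sketch below. The gram values are invented for illustration; they are not the study's data.

    ```python
    # Compare estimated portion sizes against weighed portions.
    import numpy as np
    from scipy.stats import spearmanr

    weighed   = np.array([150, 80, 200, 120, 60, 180, 90, 250])   # true portion (g)
    estimated = np.array([130, 85, 170, 115, 55, 160, 95, 230])   # self-reported estimate (g)

    pct_error = 100 * (estimated - weighed) / weighed
    rho, p = spearmanr(weighed, estimated)
    print(f"mean error: {pct_error.mean():+.1f}% (negative = underestimation)")
    print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
    ```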

  4. A computer model of the biosphere, to estimate stochastic and non-stochastic effects of radionuclides on humans

    International Nuclear Information System (INIS)

    Laurens, J.M.

    1985-01-01

    A computer code was written to model food chains in order to estimate the internal and external doses, for stochastic and non-stochastic effects, on humans (adults and infants). Results are given for 67 radionuclides, for unit concentration in water (1 Bq/L) and in the atmosphere (1 Bq/m³).

  5. Characteristics of Computational Thinking about the Estimation of the Students in Mathematics Classroom Applying Lesson Study and Open Approach

    Science.gov (United States)

    Promraksa, Siwarak; Sangaroon, Kiat; Inprasitha, Maitree

    2014-01-01

    The objectives of this research were to study and analyze the characteristics of computational thinking about estimation among students in a mathematics classroom applying lesson study and open approach. Members of the target group were 4th grade students in the 2011 academic year at Choomchon Banchonnabot School. The lesson plan used for data…

  6. COMPUTER-AIDED DESIGN ELEMENTS OF PRECISION FARMING SYSTEMS BASED ON THE PRINCIPLES BIOLOGIZATION, RESOURCE AND ENVIRONMENTAL SAFETY

    Directory of Open Access Journals (Sweden)

    V. Lobkov

    2012-01-01

    Full Text Available The development of practical methods for the computer-aided design of precision farming system elements, based on the principles of biologization, resource conservation and environmental safety, for producers of different specialization, ownership and financial security, is a relevant direction for the development of modern agricultural science. The proposed development may serve as a basic programming model, allowing for the expanded reproduction of soil fertility through the use of new ways to maximize the amount of phytomass on agricultural lands, increase soil biological activity and reduce the costs of manufactured nitrogen in the yield formation of crops.

  7. Computer and Video Games in Family Life: The Digital Divide as a Resource in Intergenerational Interactions

    Science.gov (United States)

    Aarsand, Pal Andre

    2007-01-01

    In this ethnographic study of family life, intergenerational video and computer game activities were videotaped and analysed. Both children and adults invoked the notion of a digital divide, i.e. a generation gap between those who master and do not master digital technology. It is argued that the digital divide was exploited by the children to…

  8. Integrating Computing Resources: A Shared Distributed Architecture for Academics and Administrators.

    Science.gov (United States)

    Beltrametti, Monica; English, Will

    1994-01-01

    Development and implementation of a shared distributed computing architecture at the University of Alberta (Canada) are described. Aspects discussed include design of the architecture, users' views of the electronic environment, technical and managerial challenges, and the campuswide human infrastructures needed to manage such an integrated…

  9. Attentional Resource Allocation and Cultural Modulation in a Computational Model of Ritualized Behavior

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Sørensen, Jesper

    2016-01-01

    How do cultural and religious rituals influence human perception and cognition, and what separates the highly patterned behaviors of communal ceremonies from perceptually similar precautionary and compulsive behaviors? These are some of the questions that recent theoretical models and empirical... studies have tried to answer by focusing on ritualized behavior instead of ritual. Ritualized behavior (i.e., a set of behavioral features embedded in rituals) increases attention to detail and induces cognitive resource depletion, which together support distinct modes of action categorization. While... patterns and the simulation data were subjected to linear and non-linear analysis. The results are used to exemplify how action perception of ritualized behavior a) might influence allocation of attentional resources; and b) can be modulated by cultural priors. Further explorations of the model show why...

  10. MalHaploFreq: A computer programme for estimating malaria haplotype frequencies from blood samples

    Directory of Open Access Journals (Sweden)

    Smith Thomas A

    2008-07-01

    Full Text Available Abstract Background Molecular markers, particularly those associated with drug resistance, are important surveillance tools that can inform policy choice. People infected with falciparum malaria often contain several genetically distinct clones of the parasite; genotyping the patients' blood reveals whether or not the marker is present (i.e. its prevalence), but does not reveal its frequency. For example, a person with four malaria clones may contain both mutant and wildtype forms of a marker, but it is not possible to distinguish the relative frequencies of the mutant and wildtype forms, i.e. 1:3, 2:2 or 3:1. Methods An appropriate method for obtaining frequencies from prevalence data is Maximum Likelihood analysis. A computer programme has been developed that allows the frequency of markers, and of haplotypes defined by up to three codons, to be estimated from blood phenotype data. Results The programme has been fully documented [see Additional File 1: user manual for MalHaploFreq] and provided with a user-friendly interface suitable for large-scale analyses. It returns accurate frequencies and 95% confidence intervals from simulated datasets and has been extensively tested on field datasets. Conclusion The programme is included [see Additional File 2: executable programme compiled for use on DOS or Windows] and/or may be freely downloaded from 1. It can then be used to extract molecular marker and haplotype frequencies from their prevalence in human blood samples. This should enhance the use of frequency data to inform antimalarial drug policy choice.
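
    To show the general idea of recovering a frequency from prevalence data in multi-clone infections, here is a highly simplified maximum-likelihood sketch; it is not the MalHaploFreq code. If a sample carries m clones and the marker has population frequency f, the marker is detected with probability 1 - (1 - f)^m. The observations (multiplicity of infection, detected yes/no) are hypothetical.

    ```python
    # Simplified ML estimate of a single marker frequency from prevalence data.
    import numpy as np
    from scipy.optimize import minimize_scalar

    # (multiplicity of infection, marker detected?) -- hypothetical field data
    obs = [(1, True), (3, True), (2, False), (4, True), (2, True),
           (1, False), (3, False), (5, True), (2, True), (1, True)]

    def neg_log_likelihood(f):
        ll = 0.0
        for moi, detected in obs:
            p_detect = 1.0 - (1.0 - f) ** moi
            ll += np.log(p_detect if detected else 1.0 - p_detect)
        return -ll

    fit = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 1 - 1e-6), method="bounded")
    print(f"ML estimate of marker frequency: {fit.x:.3f}")
    ```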

  11. Homeless Mentally Ill: Problems and Options in Estimating Numbers and Trends. Report to the Chairman, Committee on Labor and Human Resources, U.S. Senate.

    Science.gov (United States)

    General Accounting Office, Washington, DC. Program Evaluation and Methodology Div.

    In response to a request by the United States Senate Committee on Labor and Human Resources, the General Accounting Office (GAO) examined the methodological soundness of current population estimates of the number of homeless chronically mentally ill persons, and proposed several options for estimating the size of this population. The GAO reviewed…

  12. Estimating radiation effective doses from whole body computed tomography scans based on U.S. soldier patient height and weight

    International Nuclear Information System (INIS)

    Prins, Robert D; Thornton, Raymond H; Schmidtlein, C Ross; Quinn, Brian; Ching, Hung; Dauer, Lawrence T

    2011-01-01

    The purpose of this study is to explore how a patient's height and weight can be used to predict the effective dose to a reference phantom with similar height and weight from a chest abdomen pelvis computed tomography scan when machine-based parameters are unknown. Since machine-based scanning parameters can be misplaced or lost, a predictive model will enable the medical professional to quantify a patient's cumulative radiation dose. One hundred mathematical phantoms of varying heights and weights were defined within an x-ray Monte Carlo based software code in order to calculate organ absorbed doses and effective doses from a chest abdomen pelvis scan. Regression analysis was used to develop an effective dose predictive model. The regression model was experimentally verified using anthropomorphic phantoms and validated against a real patient population. Estimates of the effective doses as calculated by the predictive model were within 10% of the estimates of the effective doses using experimentally measured absorbed doses within the anthropomorphic phantoms. Comparisons of the patient population effective doses show that the predictive model is within 33% of current methods of estimating effective dose using machine-based parameters. A patient's height and weight can be used to estimate the effective dose from a chest abdomen pelvis computed tomography scan. The presented predictive model can be used interchangeably with current effective dose estimating techniques that rely on computed tomography machine-based techniques
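
    The regression step described above can be sketched as follows. The phantom "data" here are synthetic and the fitted coefficients are not those of the published model; the sketch only illustrates fitting and applying a height/weight predictor of effective dose.

    ```python
    # Fit effective dose (mSv) as a linear function of height and weight, then
    # predict for a new patient. All numbers below are synthetic placeholders.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(42)
    n = 100
    phantoms = pd.DataFrame({
        "height_cm": rng.uniform(150, 200, n),
        "weight_kg": rng.uniform(50, 120, n),
    })
    # Pretend Monte Carlo result: dose falls with body size, plus some noise.
    phantoms["dose_mSv"] = (30.0 - 0.05 * phantoms["height_cm"]
                            - 0.08 * phantoms["weight_kg"]
                            + rng.normal(0, 0.5, n))

    model = smf.ols("dose_mSv ~ height_cm + weight_kg", data=phantoms).fit()
    print(model.params)   # the fitted predictive model
    new = pd.DataFrame({"height_cm": [175.0], "weight_kg": [80.0]})
    print(f"predicted dose: {model.predict(new)[0]:.2f} mSv")
    ```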

  13. Estimating radiation effective doses from whole body computed tomography scans based on U.S. soldier patient height and weight

    Directory of Open Access Journals (Sweden)

    Quinn Brian

    2011-10-01

    Full Text Available Abstract Background The purpose of this study is to explore how a patient's height and weight can be used to predict the effective dose to a reference phantom with similar height and weight from a chest abdomen pelvis computed tomography scan when machine-based parameters are unknown. Since machine-based scanning parameters can be misplaced or lost, a predictive model will enable the medical professional to quantify a patient's cumulative radiation dose. Methods One hundred mathematical phantoms of varying heights and weights were defined within an x-ray Monte Carlo based software code in order to calculate organ absorbed doses and effective doses from a chest abdomen pelvis scan. Regression analysis was used to develop an effective dose predictive model. The regression model was experimentally verified using anthropomorphic phantoms and validated against a real patient population. Results Estimates of the effective doses as calculated by the predictive model were within 10% of the estimates of the effective doses using experimentally measured absorbed doses within the anthropomorphic phantoms. Comparisons of the patient population effective doses show that the predictive model is within 33% of current methods of estimating effective dose using machine-based parameters. Conclusions A patient's height and weight can be used to estimate the effective dose from a chest abdomen pelvis computed tomography scan. The presented predictive model can be used interchangeably with current effective dose estimating techniques that rely on computed tomography machine-based techniques.

  14. Desktop Computing Integration Project

    Science.gov (United States)

    Tureman, Robert L., Jr.

    1992-01-01

    The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources to help perform tasks, and personal computer output that was used in presentation of information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.

  15. IMPROVING EMISSIONS ESTIMATES WITH COMPUTATIONAL INTELLIGENCE, DATABASE EXPANSION, AND COMPREHENSIVE VALIDATION

    Science.gov (United States)

    The report discusses an EPA investigation of techniques to improve methods for estimating volatile organic compound (VOC) emissions from area sources. Using the automobile refinishing industry for a detailed area source case study, an emission estimation method is being developed...

  16. Computer-Mediated Communications for Distance Education and Training: Literature Review and International Resources

    Science.gov (United States)

    1991-01-01

    (Pentz and Neil, 1981); the changing socioeconomic status of women (Northcott, 1986); and the number of adults living in rural or non-urban areas... communication not as surrogates for face-to-face, but as valid forms in their own right. For example, the Institute for the Future evaluated 28 computer... located in the states of Rajasthan and Bihar. Open Universities have been proposed in Gujarat, Kerala, Karnataka, Uttar Pradesh, Maharashtra, Orissa

  17. A Computational Study of Approximation Algorithms for a Minmax Resource Allocation Problem

    Directory of Open Access Journals (Sweden)

    Bogusz Przybysławski

    2012-01-01

    Full Text Available A basic resource allocation problem with uncertain costs is discussed. The problem is to minimize the total cost of choosing exactly p items out of n available. The uncertain item costs are specified as a discrete scenario set, and the minmax criterion is used to choose a solution. This problem is known to be NP-hard, but several approximation algorithms exist. The aim of this paper is to investigate the quality of the solutions returned by these approximation algorithms. According to the results obtained, the randomized algorithms described are fast and output solutions of good quality, even if the problem size is large. (original abstract)
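
    An illustrative sketch of the problem setting, not of the paper's randomized algorithms: choose exactly p of n items so that the worst-case total cost over a discrete scenario set is minimized. Brute force gives the optimum for small instances; a simple mean-cost heuristic is shown only for comparison. The cost matrix is random toy data.

    ```python
    # Minmax selection: exact enumeration vs. a naive mean-cost heuristic.
    import numpy as np
    from itertools import combinations

    rng = np.random.default_rng(7)
    n, p, n_scenarios = 10, 4, 3
    costs = rng.integers(1, 20, size=(n_scenarios, n))   # costs[s, i] = cost of item i in scenario s

    def minmax_value(items):
        return costs[:, list(items)].sum(axis=1).max()   # worst-scenario total cost

    best = min(combinations(range(n), p), key=minmax_value)   # exact (exponential time)
    heur = tuple(np.argsort(costs.mean(axis=0))[:p])          # p cheapest items by average cost

    print("optimal  :", best, "minmax =", minmax_value(best))
    print("heuristic:", heur, "minmax =", minmax_value(heur))
    ```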

  18. INTERFACE ELEMENTS OF SCIENTIFIC WEB-RESOURCE PHYSIONET AND IMPORT DATA TO COMPUTER MATHEMATICS SYSTEM MAPLE 17

    Directory of Open Access Journals (Sweden)

    G. P. Chuiko

    2015-10-01

    Full Text Available Since 1999, PhysioNet (http://physionet.org/) has offered free access via the web to large collections of recorded physiologic signals and medical databases, as well as associated open-source software. The intention of this scientific resource is to stimulate current research and new investigations in the study of cardiovascular and other complex biomedical signals. The PhysioBank archives today include records obtained from healthy individuals and from patients with different diagnoses obtained under various conditions, including sudden cardiac death, congestive heart failure, neurological disorders, epilepsy and many others. The PhysioToolkit software packages are valuable for physiological signal processing and analysis, for the creation of new databases, for the interactive display and characterization of signals, and for the simulation of physiological and other signals. Nonetheless, a researcher needs skills in working with the Unix operating system and knowledge of special commands to use the PhysioToolkit software successfully. Therefore, it makes sense to convert the necessary signals into a user-friendly computer algebra system. This paper describes the interface elements of the scientific web resource PhysioNet, simple methods for converting binary medical data files to text format, and the import of the resulting digital signals into the computer mathematics system Maple 17.
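
    One possible route (an assumption, not the paper's method) around the Unix command-line tools is the open-source wfdb Python package, which can read a PhysioBank record directly; a plain CSV export can then be imported into Maple with its file-import facilities. Argument names such as pn_dir may differ between wfdb package versions.

    ```python
    # Read part of a PhysioBank record and export it as a text/CSV file.
    import numpy as np
    import wfdb

    # Read 10 seconds of record 100 from the MIT-BIH Arrhythmia Database (360 Hz).
    record = wfdb.rdrecord("100", pn_dir="mitdb", sampto=3600)
    signals = record.p_signal                      # samples x channels, physical units
    time_s = np.arange(signals.shape[0]) / record.fs

    np.savetxt("mitdb_100.csv",
               np.column_stack([time_s, signals]),
               delimiter=",",
               header="time_s," + ",".join(record.sig_name),
               comments="")
    print(f"wrote {signals.shape[0]} samples x {signals.shape[1]} channels to mitdb_100.csv")
    ```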

  19. Construction of estimated flow- and load-duration curves for Kentucky using the Water Availability Tool for Environmental Resources (WATER)

    Science.gov (United States)

    Unthank, Michael D.; Newson, Jeremy K.; Williamson, Tanja N.; Nelson, Hugh L.

    2012-01-01

    Flow- and load-duration curves were constructed from the model outputs of the U.S. Geological Survey's Water Availability Tool for Environmental Resources (WATER) application for streams in Kentucky. The WATER application was designed to access multiple geospatial datasets to generate more than 60 years of statistically based streamflow data for Kentucky. The WATER application enables a user to graphically select a site on a stream and generate an estimated hydrograph and flow-duration curve for the watershed upstream of that point. The flow-duration curves are constructed by calculating the exceedance probability of the modeled daily streamflows. User-defined water-quality criteria and (or) sampling results can be loaded into the WATER application to construct load-duration curves that are based on the modeled streamflow results. Estimates of flow and streamflow statistics were derived from TOPographically Based Hydrological MODEL (TOPMODEL) simulations in the WATER application. A modified TOPMODEL code, SDP-TOPMODEL (Sinkhole Drainage Process-TOPMODEL), was used to simulate daily mean discharges over the period of record for 5 karst and 5 non-karst watersheds in Kentucky in order to verify the calibrated model. A statistical evaluation of the model's verification simulations shows that the calibration criteria established by previous WATER application reports were met, thus ensuring the model's ability to provide acceptably accurate estimates of discharge at gaged and ungaged sites throughout Kentucky. Flow-duration curves are constructed in the WATER application by calculating the exceedance probability of the modeled daily flow values. The flow-duration intervals are expressed as a percentage, with zero corresponding to the highest stream discharge in the streamflow record. Load-duration curves are constructed by applying the loading equation (Load = Flow * Water-quality criterion) at each flow interval.
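
    A condensed sketch of the two constructions described above: a flow-duration curve from the exceedance probability of modeled daily flows, and a load-duration curve obtained by multiplying each flow by a water-quality criterion. The synthetic flows, the criterion value, and the approximate unit-conversion factor are placeholders, not WATER application outputs.

    ```python
    # Flow-duration and load-duration curves from a synthetic daily-flow series.
    import numpy as np

    rng = np.random.default_rng(3)
    daily_flow_cfs = rng.lognormal(mean=3.0, sigma=1.0, size=60 * 365)   # ~60 years of daily flows

    flows = np.sort(daily_flow_cfs)[::-1]                    # highest flow first
    exceedance_pct = 100 * np.arange(1, flows.size + 1) / (flows.size + 1)

    criterion_mg_per_L = 0.5     # hypothetical water-quality criterion
    CONVERSION = 5.39            # cfs * mg/L -> lb/day (approximate factor)
    allowable_load = flows * criterion_mg_per_L * CONVERSION

    for pct in (10, 50, 90):
        i = np.searchsorted(exceedance_pct, pct)
        print(f"{pct:2d}% exceedance: flow = {flows[i]:8.1f} cfs, "
              f"allowable load = {allowable_load[i]:9.1f} lb/day")
    ```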

  20. INTDOS: a computer code for estimating internal radiation dose using recommendations of the International Commission on Radiological Protection

    International Nuclear Information System (INIS)

    Ryan, M.T.

    1981-09-01

    INTDOS is a user-oriented computer code designed to calculate estimates of the internal radiation dose commitment resulting from the acute inhalation intake of various radionuclides. It is designed so that users unfamiliar with the details of such calculations can obtain results by answering a few questions about the exposure case. The user must identify the radionuclide name, solubility class, particle size, time since exposure, and the measured lung burden. INTDOS calculates the fraction of the lung burden remaining at time t post-exposure, taking into account the solubility class and particle-size information. From the fraction remaining in the lung at time t, the quantity inhaled is estimated, with radioactive decay accounted for in the estimate. Finally, effective committed dose equivalents to various organs and tissues of the body are calculated using the inhalation committed dose factors presented by the International Commission on Radiological Protection (ICRP). The code is written in Fortran IV for execution on a Digital Equipment Corporation PDP-10 computer. A flow chart and example calculations are discussed in detail to aid the user who is unfamiliar with computer operations.
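
    The back-calculation described above can be stripped down to two steps: divide the measured lung burden by the fraction of an acute intake still retained in the lung at time t (including radioactive decay), then multiply the inferred intake by a committed-dose coefficient. In the sketch below the retention fraction, half-life, and dose coefficient are placeholders, not ICRP values, and the real code works with organ-specific factors.

    ```python
    # Back-calculate intake from a lung-burden measurement, then a committed dose.
    import math

    def estimate_intake(lung_burden_bq, retention_fraction, half_life_d, days_post):
        """Intake (Bq) consistent with the measured lung burden at t = days_post."""
        decay = math.exp(-math.log(2) * days_post / half_life_d)
        return lung_burden_bq / (retention_fraction * decay)

    def committed_dose(intake_bq, dose_coeff_sv_per_bq):
        return intake_bq * dose_coeff_sv_per_bq

    measured_burden = 500.0          # Bq measured in the lung
    intake = estimate_intake(measured_burden,
                             retention_fraction=0.12,   # biokinetic retention at day 30 (assumed)
                             half_life_d=8_000.0,
                             days_post=30)
    dose_sv = committed_dose(intake, dose_coeff_sv_per_bq=1.0e-6)
    print(f"estimated intake: {intake:,.0f} Bq -> committed dose {dose_sv * 1e3:.2f} mSv")
    ```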