WorldWideScience

Sample records for exploiting computational locality

  1. Exploiting Locality in Quantum Computation for Quantum Chemistry.

    Science.gov (United States)

    McClean, Jarrod R; Babbush, Ryan; Love, Peter J; Aspuru-Guzik, Alán

    2014-12-18

    Accurate prediction of chemical and material properties from first-principles quantum chemistry is a challenging task on traditional computers. Recent developments in quantum computation offer a route toward highly accurate solutions with polynomial cost; however, this solution still carries a large overhead. In this Perspective, we aim to bring together known results about the locality of physical interactions from quantum chemistry with ideas from quantum computation. We show that the utilization of spatial locality combined with the Bravyi-Kitaev transformation offers an improvement in the scaling of known quantum algorithms for quantum chemistry, and we provide numerical examples to help illustrate this point. We combine these developments to improve the outlook for the future of quantum chemistry on quantum computers.

  2. SMARTS: Exploiting Temporal Locality and Parallelism through Vertical Execution

    International Nuclear Information System (INIS)

    Beckman, P.; Crotinger, J.; Karmesin, S.; Malony, A.; Oldehoeft, R.; Shende, S.; Smith, S.; Vajracharya, S.

    1999-01-01

    In the solution of large-scale numerical problems, parallel computing is becoming simultaneously more important and more difficult. The complex organization of today's multiprocessors with several memory hierarchies has forced the scientific programmer to make a choice between simple but unscalable code and scalable but extremely complex code that does not port to other architectures. This paper describes how the SMARTS runtime system and the POOMA C++ class library for high-performance scientific computing work together to exploit data parallelism in scientific applications while hiding the details of managing parallelism and data locality from the user. We present innovative algorithms, based on the macro-dataflow model, for detecting data parallelism and efficiently executing data-parallel statements on shared-memory multiprocessors. We also describe how these algorithms can be implemented on clusters of SMPs.
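
    The "vertical execution" idea is concrete enough to sketch: rather than sweeping each data-parallel statement across an entire array, a block of data is pushed through all dependent statements while it is still cache-hot. Below is a minimal illustration in Python/NumPy; the block size and function names are invented for the example, and SMARTS itself is a C++ runtime with a far richer task model.

```python
# Minimal sketch of "vertical" execution for temporal locality, assuming two
# dependent data-parallel statements over a large array (not the SMARTS API).
import numpy as np

N, BLOCK = 1_000_000, 4096
a = np.random.rand(N)

def horizontal(a):
    # Each statement sweeps the whole array: b leaves cache before c reads it.
    b = a * 2.0
    return b + 1.0

def vertical(a):
    # Each block flows through both statements while it is still cache-hot.
    c = np.empty_like(a)
    for i in range(0, len(a), BLOCK):
        b = a[i:i + BLOCK] * 2.0      # statement 1 on one block
        c[i:i + BLOCK] = b + 1.0      # statement 2 reuses the hot block
    return c

assert np.allclose(horizontal(a), vertical(a))
```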

  3. SMARTS: Exploiting Temporal Locality and Parallelism through Vertical Execution

    Energy Technology Data Exchange (ETDEWEB)

    Beckman, P.; Crotinger, J.; Karmesin, S.; Malony, A.; Oldehoeft, R.; Shende, S.; Smith, S.; Vajracharya, S.

    1999-01-04

    In the solution of large-scale numerical problems, parallel computing is becoming simultaneously more important and more difficult. The complex organization of today's multiprocessors with several memory hierarchies has forced the scientific programmer to make a choice between simple but unscalable code and scalable but extremely complex code that does not port to other architectures. This paper describes how the SMARTS runtime system and the POOMA C++ class library for high-performance scientific computing work together to exploit data parallelism in scientific applications while hiding the details of managing parallelism and data locality from the user. We present innovative algorithms, based on the macro-dataflow model, for detecting data parallelism and efficiently executing data-parallel statements on shared-memory multiprocessors. We also describe how these algorithms can be implemented on clusters of SMPs.

  4. Exploitation of cloud computing in management of construction projects in Slovakia

    Directory of Open Access Journals (Sweden)

    Mandičák Tomáš

    2016-12-01

    Full Text Available Cloud computing is a highly topical issue. It represents a new model of information technology (IT) services based on the Web (the "cloud") and other application platforms, as well as on software as a service. In general, the exploitation of cloud computing in construction project management has several advantages, as demonstrated by several research reports. To date, however, no research quantifying the exploitation of cloud computing in the Slovak construction industry has been carried out. The article discusses the exploitation of cloud computing in construction project management in Slovakia. The main objective of the research is to confirm whether factors such as the size of the construction enterprise, its ownership and the participant's role in the construction project have any impact on the level of exploitation of cloud computing in construction project management. This includes confirming differences in use between different participants in a construction project and between construction enterprises broken down by size and ownership.

  5. Local Perspectives on Environmental Insecurity and Its Influence on Illegal Biodiversity Exploitation.

    Directory of Open Access Journals (Sweden)

    Meredith L Gore

    Full Text Available Environmental insecurity is both a source and an outcome of biodiversity declines and social conflict. One challenge to scaling insecurity-reduction policies is that empirical evidence about local attitudes is overwhelmingly missing. We set three objectives: determine how local people rank risk associated with different sources of environmental insecurity; assess perceptions of environmental insecurity, biodiversity exploitation, myths of nature and risk-management preferences; and explore relationships between perceptions and biodiversity exploitation. We conducted interviews (N = 88) with residents of Madagascar's Torotorofotsy Protected Area in 2014. Risk perceptions had a moderate effect on perceptions of environmental insecurity. We found no effects of environmental insecurity on biodiversity exploitation. The results offer one of the first explorations of local perceptions of illegal biodiversity exploitation and environmental security. Local people's perception of the seriousness of risks associated with illegal biodiversity exploitation such as lemur hunting (low overall) may not reflect the perceptions of policy-makers (considered to be high). This discord is a key entry point for policy attention.

  6. Target Localization with a Single Antenna via Directional Multipath Exploitation

    Directory of Open Access Journals (Sweden)

    Ali H. Muqaibel

    2015-01-01

    Full Text Available Target localization in urban sensing can benefit from the angle dependency of the pulse shape at a radar receiver antenna. We propose a localization approach that utilizes the embedded directivity in ultra-wideband (UWB) antennas to estimate target positions. A single radar unit sensing indoor targets surrounded by interior walls is considered, where interior-wall multipaths are exploited to provide target cross-range. This exploitation assumes resolvability of the multipath components, which is made possible by virtue of using UWB radar signals. The proposed approach is most attractive when only a few multipaths are detectable due to propagation obstructions or owing to low signal-to-noise ratios. Both simulated and experimental data are used to demonstrate the effectiveness of the proposed approach.
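
    The geometric core of the approach can be illustrated with the classical mirror-image construction: a bounce off a known wall behaves like a direct path from the sensor's mirror image, so the direct and multipath range measurements intersect in a unique target position. The sketch below assumes a single sensor at the origin, one flat wall at x = W, and already-resolved path lengths; the paper's UWB waveform processing and antenna-directivity cues are omitted.

```python
# Toy mirror-image localization from one sensor plus one wall multipath
# (hypothetical setup for illustration, not the paper's processing chain).
import math

def locate(r_direct, r_bounce, wall_x):
    # The wall bounce acts like a direct path from the sensor's mirror image
    # at (2*wall_x, 0); intersect the two range circles.
    x = (r_direct**2 - r_bounce**2 + 4 * wall_x**2) / (4 * wall_x)
    y = math.sqrt(max(r_direct**2 - x**2, 0.0))
    return x, y  # the sign of y is ambiguous without a second multipath

# Target at (2.0, 1.5), wall at x = 4.0:
r0 = math.hypot(2.0, 1.5)
r1 = math.hypot(2.0 - 8.0, 1.5)
print(locate(r0, r1, 4.0))  # approximately (2.0, 1.5)
```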

  7. Exploitation of heterogeneous resources for ATLAS Computing

    CERN Document Server

    Chudoba, Jiri; The ATLAS collaboration

    2018-01-01

    LHC experiments require significant computational resources for Monte Carlo simulations and real data processing, and the ATLAS experiment is no exception. In 2017, ATLAS steadily exploited almost 3M HS06 units, which corresponds to about 300 000 standard CPU cores. The total disk and tape capacity managed by the Rucio data management system exceeded 350 PB. Resources are provided mostly by Grid computing centers distributed in geographically separated locations and connected by the Grid middleware. The ATLAS collaboration developed several systems to manage computational jobs, data files and network transfers. The ATLAS solutions for job and data management (PanDA and Rucio) were generalized and are now used also by other collaborations. More components are needed to include new resources such as private and public clouds, volunteers' desktop computers and, primarily, supercomputers in major HPC centers. Workflows and data flows significantly differ for these less traditional resources, and extensive software re...

  8. Exploiting Virtualization and Cloud Computing in ATLAS

    International Nuclear Information System (INIS)

    Harald Barreiro Megino, Fernando; Van der Ster, Daniel; Benjamin, Doug; De, Kaushik; Gable, Ian; Paterson, Michael; Taylor, Ryan; Hendrix, Val; Vitillo, Roberto A; Panitkin, Sergey; De Silva, Asoka; Walker, Rod

    2012-01-01

    The ATLAS Computing Model was designed around the concept of grid computing; since the start of data-taking, this model has proven very successful in the federated operation of more than one hundred Worldwide LHC Computing Grid (WLCG) sites for offline data distribution, storage, processing and analysis. However, new paradigms in computing, namely virtualization and cloud computing, present improved strategies for managing and provisioning IT resources that could allow ATLAS to more flexibly adapt and scale its storage and processing workloads on varied underlying resources. In particular, ATLAS is developing a “grid-of-clouds” infrastructure in order to utilize WLCG sites that make resources available via a cloud API. This work will present the current status of the Virtualization and Cloud Computing R&D project in ATLAS Distributed Computing. First, strategies for deploying PanDA queues on cloud sites will be discussed, including the introduction of a “cloud factory” for managing cloud VM instances. Next, performance results when running on virtualized/cloud resources at CERN LxCloud, StratusLab, and elsewhere will be presented. Finally, we will present the ATLAS strategies for exploiting cloud-based storage, including remote XROOTD access to input data, management of EC2-based files, and the deployment of cloud-resident LCG storage elements.

  9. Exploiting short-term memory in soft body dynamics as a computational resource.

    Science.gov (United States)

    Nakajima, K; Li, T; Hauser, H; Pfeifer, R

    2014-11-06

    Soft materials are not only highly deformable, but they also possess rich and diverse body dynamics. Soft body dynamics exhibit a variety of properties, including nonlinearity, elasticity and potentially infinitely many degrees of freedom. Here, we demonstrate that such soft body dynamics can be employed to conduct certain types of computation. Using body dynamics generated from a soft silicone arm, we show that they can be exploited to emulate functions that require memory and to embed robust closed-loop control into the arm. Our results suggest that soft body dynamics have a short-term memory and can serve as a computational resource. This finding paves the way towards exploiting passive body dynamics for control of a large class of underactuated systems. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
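
    The underlying reservoir-computing recipe is worth making explicit: any fading-memory dynamical system, driven by the input, can emulate functions that require memory once a simple linear readout is trained on its states. In the sketch below a small random recurrent network stands in for the silicone arm (an assumption purely for illustration), and the readout is trained by ridge regression to reproduce a delayed copy of the input, i.e. a pure short-term-memory task.

```python
# Reservoir-computing sketch: fixed "body" dynamics plus a trained linear
# readout (a random network substitutes for the soft silicone arm).
import numpy as np

rng = np.random.default_rng(0)
T, N, delay = 2000, 100, 5
u = rng.uniform(-1, 1, T)                        # input stream
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # ensure fading memory
w_in = rng.normal(0, 1, N)

x = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])             # "body" driven by the input
    states[t] = x

# Linear readout trained to output u(t - delay) via ridge regression:
X, y = states[delay:], u[:-delay]
w_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)
print("memory-task correlation:", np.corrcoef(X @ w_out, y)[0, 1])
```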

  10. Exploitability Assessment with TEASER

    Science.gov (United States)

    2017-05-01

    … for architecture-neutral taint analysis on top of LLVM and QEMU. POC (Proof of Concept): demonstration of an exploit on a program. RCE (Remote Code …) … a bug with a Proof of Concept (POC), or input to a program demonstrating the ability to use a bug to exploit the application, to demonstrate the … often leads to either computationally difficult constraint-solving problems or taint explosion. Given the computational difficulty of exploit …

  11. Exploiting Data Sparsity for Large-Scale Matrix Computations

    KAUST Repository

    Akbudak, Kadir

    2018-02-24

    Exploiting data sparsity in dense matrices is an algorithmic bridge between architectures that are increasingly memory-austere on a per-core basis and extreme-scale applications. The Hierarchical matrix Computations on Manycore Architectures (HiCMA) library tackles this challenging problem by achieving significant reductions in time to solution and memory footprint, while preserving a specified accuracy requirement of the application. HiCMA provides a high-performance implementation on distributed-memory systems of one of the most widely used matrix factorizations in large-scale scientific applications, i.e., the Cholesky factorization. It employs the tile low-rank data format to compress the dense data-sparse off-diagonal tiles of the matrix. It then decomposes the matrix computations into interdependent tasks and relies on the dynamic runtime system StarPU for asynchronous out-of-order scheduling, while allowing high user productivity. Performance comparisons and memory footprint on matrix dimensions up to eleven million show a performance gain and memory saving of more than an order of magnitude for both metrics on thousands of cores, against state-of-the-art open-source and vendor-optimized numerical libraries. This represents an important milestone in enabling large-scale matrix computations toward solving big data problems in geospatial statistics for climate/weather forecasting applications.
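
    The tile low-rank idea reduces, per off-diagonal tile, to replacing a dense block by truncated low-rank factors at a user-set accuracy. The sketch below uses a truncated SVD on a toy kernel tile; HiCMA's actual compression, tile layout and StarPU task scheduling are far more involved, so this shows only the numerical kernel of the idea.

```python
# Tile low-rank (TLR) compression sketch: truncate an off-diagonal tile's SVD
# to a user-defined accuracy (illustrative; not HiCMA's implementation).
import numpy as np

def compress_tile(tile, eps):
    """Return factors (U, V) with spectral error at most eps * ||tile||_2."""
    U, s, Vt = np.linalg.svd(tile, full_matrices=False)
    k = max(1, int(np.sum(s > eps * s[0])))   # keep ranks above the threshold
    return U[:, :k] * s[:k], Vt[:k]

rng = np.random.default_rng(1)
# Well-separated point sets give a smooth kernel, hence a data-sparse tile:
x, y = rng.uniform(0, 1, 256), rng.uniform(9, 10, 256)
tile = 1.0 / np.abs(x[:, None] - y[None, :])
U, V = compress_tile(tile, 1e-8)
err = np.linalg.norm(tile - U @ V) / np.linalg.norm(tile)
print("rank:", U.shape[1], "relative error:", err)
```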

  12. Exploiting Data Sparsity for Large-Scale Matrix Computations

    KAUST Repository

    Akbudak, Kadir; Ltaief, Hatem; Mikhalev, Aleksandr; Charara, Ali; Keyes, David E.

    2018-01-01

    Exploiting data sparsity in dense matrices is an algorithmic bridge between architectures that are increasingly memory-austere on a per-core basis and extreme-scale applications. The Hierarchical matrix Computations on Manycore Architectures (HiCMA) library tackles this challenging problem by achieving significant reductions in time to solution and memory footprint, while preserving a specified accuracy requirement of the application. HiCMA provides a high-performance implementation on distributed-memory systems of one of the most widely used matrix factorizations in large-scale scientific applications, i.e., the Cholesky factorization. It employs the tile low-rank data format to compress the dense data-sparse off-diagonal tiles of the matrix. It then decomposes the matrix computations into interdependent tasks and relies on the dynamic runtime system StarPU for asynchronous out-of-order scheduling, while allowing high user productivity. Performance comparisons and memory footprint on matrix dimensions up to eleven million show a performance gain and memory saving of more than an order of magnitude for both metrics on thousands of cores, against state-of-the-art open-source and vendor-optimized numerical libraries. This represents an important milestone in enabling large-scale matrix computations toward solving big data problems in geospatial statistics for climate/weather forecasting applications.

  13. The impact of exploiting spectro-temporal context in computational speech segregation

    DEFF Research Database (Denmark)

    Bentsen, Thomas; Kressner, Abigail Anne; Dau, Torsten

    2018-01-01

    Computational speech segregation aims to automatically segregate speech from interfering noise, often by employing ideal binary mask estimation. Several studies have tried to exploit contextual information in speech to improve mask estimation accuracy by using two frequently used strategies that (1) … for measured intelligibility. The findings may have implications for the design of speech segregation systems, and for the selection of a cost function that correlates with intelligibility.

  14. Exploiting volatile opportunistic computing resources with Lobster

    Science.gov (United States)

    Woodard, Anna; Wolf, Matthias; Mueller, Charles; Tovar, Ben; Donnelly, Patrick; Hurtado Anampa, Kenyi; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas

    2015-12-01

    Analysis of high energy physics experiments using the Compact Muon Solenoid (CMS) at the Large Hadron Collider (LHC) can be limited by the availability of computing resources. As a joint effort involving computer scientists and CMS physicists at Notre Dame, we have developed an opportunistic workflow management tool, Lobster, to harvest available cycles from university campus computing pools. Lobster consists of a management server, a file server, and worker processes which can be submitted to any available computing resource without requiring root access. Lobster makes use of the Work Queue system to perform task management, while the CMS-specific software environment is provided via CVMFS and Parrot. Data are handled via Chirp and Hadoop for local data storage and XrootD for access to the CMS wide-area data federation. An extensive set of monitoring and diagnostic tools has been developed to facilitate system optimisation. We have tested Lobster using the 20 000-core cluster at Notre Dame, achieving approximately 8-10k tasks running simultaneously, sustaining approximately 9 Gbit/s of input data and 340 Mbit/s of output data.

  15. Regional research exploitation of the LHC a case-study of the required computing resources

    CERN Document Server

    Almehed, S; Eerola, Paule Anna Mari; Mjörnmark, U; Smirnova, O G; Zacharatou-Jarlskog, C; Åkesson, T

    2002-01-01

    A simulation study to evaluate the required computing resources for a research exploitation of the Large Hadron Collider (LHC) has been performed. The evaluation was done as a case study, assuming the existence of a Nordic regional centre and using the requirements for performing a specific physics analysis as a yard-stick. Other input parameters were: assumptions about the distribution of researchers among the institutions involved, an analysis model, and two different functional structures of the computing resources.

  16. Hacking the art of exploitation

    CERN Document Server

    Erickson, Jon

    2003-01-01

    A comprehensive introduction to the techniques of exploitation and creative problem-solving methods commonly referred to as "hacking," Hacking: The Art of Exploitation is for both technical and non-technical people who are interested in computer security. It shows how hackers exploit programs and write exploits, instead of just showing how to run other people's exploits. Unlike many so-called hacking books, this book explains the technical aspects of hacking, including stack-based overflows, heap-based overflows, string exploits, return-into-libc, shellcode, and cryptographic attacks on 802.11b.

  17. The organization of mineral exploitation and the relationship to urban structures and local business development

    DEFF Research Database (Denmark)

    Hendriksen, Kåre; Hoffmann, Birgitte; Jørgensen, Ulrik

    2013-01-01

    The paper explores relations between mining and urban structures, as these are decisive for involving the local workforce and developing local businesses. A major challenge for Greenland is the ongoing decoupling between existing settlements and the main export industry based on marine living … also for the surrounding community. The paper explores whether a different and long-term organisation of the exploitation of mineral resources, with the establishment of flexible settlements, creates an attractive and sustainable alternative with a reasonable population and economic diversity.

  18. User-driven sampling strategies in image exploitation

    Science.gov (United States)

    Harvey, Neal; Porter, Reid

    2013-12-01

    Visual analytics and interactive machine learning both try to leverage the complementary strengths of humans and machines to solve complex data exploitation tasks. These fields overlap most significantly when training is involved: the visualization or machine learning tool improves over time by exploiting observations of the human-computer interaction. This paper focuses on one aspect of the human-computer interaction that we call user-driven sampling strategies. Unlike relevance feedback and active learning sampling strategies, where the computer selects which data to label at each iteration, we investigate situations where the user selects which data is to be labeled at each iteration. User-driven sampling strategies can emerge in many visual analytics applications but they have not been fully developed in machine learning. User-driven sampling strategies suggest new theoretical and practical research questions for both visualization science and machine learning. In this paper we identify and quantify the potential benefits of these strategies in a practical image analysis application. We find user-driven sampling strategies can sometimes provide significant performance gains by steering tools towards local minima that have lower error than tools trained with all of the data. In preliminary experiments we find these performance gains are particularly pronounced when the user is experienced with the tool and application domain.
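
    The distinction between the two sampling regimes is easy to state in code: in active learning the model queries the most informative point, while in user-driven sampling the human chooses. The toy loop below contrasts a classical uncertainty pick with a scripted "user" who explores regions far from the existing labels; this heuristic user is an assumption for illustration only, since the paper studies real analysts.

```python
# Toy contrast of active-learning vs. user-driven sample selection.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
labeled = list(np.concatenate([np.where(y == c)[0][:2] for c in (0, 1)]))
for _ in range(20):
    clf = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])
    p = clf.predict_proba(X)[:, 1]
    machine_pick = int(np.argmin(np.abs(p - 0.5)))       # most uncertain point
    dists = ((X[:, None, :] - X[labeled]) ** 2).sum(-1)
    user_pick = int(np.argmax(dists.min(axis=1)))        # farthest from labels
    labeled.append(user_pick)      # follow the (scripted) user, not the model
print("accuracy after user-driven labeling:", clf.score(X, y))
```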

  19. Heterotic computing: exploiting hybrid computational devices.

    Science.gov (United States)

    Kendon, Viv; Sebald, Angelika; Stepney, Susan

    2015-07-28

    Current computational theory deals almost exclusively with single models: classical, neural, analogue, quantum, etc. In practice, researchers use ad hoc combinations, realizing only recently that they can be fundamentally more powerful than the individual parts. A Theo Murphy meeting brought together theorists and practitioners of various types of computing, to engage in combining the individual strengths to produce powerful new heterotic devices. 'Heterotic computing' is defined as a combination of two or more computational systems such that they provide an advantage over either substrate used separately. This post-meeting collection of articles provides a wide-ranging survey of the state of the art in diverse computational paradigms, together with reflections on their future combination into powerful and practical applications. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  20. Friend or foe: exploiting sensor failures for transparent object localization and classification

    Science.gov (United States)

    Seib, Viktor; Barthen, Andreas; Marohn, Philipp; Paulus, Dietrich

    2017-02-01

    In this work we address the problem of detecting and recognizing transparent objects using depth images from an RGB-D camera. Using this type of sensor usually prohibits the localization of transparent objects, since the structured light pattern of these cameras is not reflected by transparent surfaces; instead, transparent surfaces often appear as undefined values in the resulting images. However, these erroneous sensor readings form characteristic patterns that we exploit in the presented approach. The sensor data is fed into a deep convolutional neural network that is trained to classify and localize drinking glasses. We evaluate our approach with four different types of transparent objects. To the best of our knowledge, no datasets offering depth images of transparent objects exist so far. With this work we aim at closing this gap by providing our data to the public.
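
    The key preprocessing trick is small enough to show directly: the sensor's failure pattern is kept as data, with invalid depth readings turned into an explicit mask channel that a classifier can learn from. The zero-means-invalid convention and array shapes below are typical of RGB-D sensors but are assumptions, not the paper's exact pipeline.

```python
# Turn undefined depth readings over transparent surfaces into a mask channel.
import numpy as np

depth = np.array([[0.80, 0.81, 0.00, 0.00],
                  [0.79, 0.00, 0.00, 0.00],
                  [0.80, 0.81, 0.82, 0.83]])   # 0.0 = no structured-light return

invalid = (depth == 0.0).astype(np.float32)    # characteristic failure pattern
model_input = np.stack([depth, invalid])       # (2, H, W) input for a CNN
print(invalid)
```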

  1. Quantum computation via local control theory: Direct sum vs. direct product Hilbert spaces

    International Nuclear Information System (INIS)

    Sklarz, Shlomo E.; Tannor, David J.

    2006-01-01

    The central objective in any quantum computation is the creation of a desired unitary transformation; the mapping that this unitary transformation produces between the input and output states is identified with the computation. In [S.E. Sklarz, D.J. Tannor, arXiv:quant-ph/0404081 (submitted to PRA) (2004)] it was shown that local control theory can be used to calculate fields that will produce such a desired unitary transformation. In contrast with previous strategies for quantum computing based on optimal control theory, the local control scheme maintains the system within the computational subspace at intermediate times, thereby avoiding unwanted decay processes. In [S.E. Sklarz et al.], the Hilbert space had a direct sum structure with respect to the computational register and the mediating states. In this paper, we extend the formalism to the important case of a direct product Hilbert space. The final equations for the control algorithm for the two cases are remarkably similar in structure, despite the fact that the derivations are completely different and that in one case the dynamics is in a Hilbert space and in the other case the dynamics is in a Liouville space. As shown in [S.E. Sklarz et al.], the direct sum implementation leads to a computational mechanism based on virtual transitions, and can be viewed as an extension of the principles of Stimulated Raman Adiabatic Passage from state manipulation to evolution operator manipulation. The direct product implementation developed here leads to the intriguing concept of virtual entanglement: computation that exploits second-order transitions that pass through entangled states but that leaves the subsystems nearly separable at all intermediate times. Finally, we speculate on a connection between the algorithm developed here and the concept of decoherence-free subspaces.

  2. CIME Summer Course on Exploiting Hidden Structure in Matrix Computations : Algorithms and Applications

    CERN Document Server

    Simoncini, Valeria

    2016-01-01

    Focusing on special matrices and matrices which are in some sense "near" to structured matrices, this volume covers a broad range of topics of current interest in numerical linear algebra. Exploitation of these less obvious structural properties can be of great importance in the design of efficient numerical methods, for example algorithms for matrices with low-rank block structure, matrices with decay, and structured tensor computations. Applications range from quantum chemistry to queuing theory. Structured matrices arise frequently in applications. Examples include banded and sparse matrices, Toeplitz-type matrices, and matrices with semi-separable or quasi-separable structure, as well as Hamiltonian and symplectic matrices. The associated literature is enormous, and many efficient algorithms have been developed for solving problems involving such matrices. The text arose from a C.I.M.E. course held in Cetraro (Italy) in June 2015 which aimed to present this fast growing field to young researchers, exploit...

  3. Exploiting Deep Neural Networks and Head Movements for Robust Binaural Localization of Multiple Sources in Reverberant Environments

    DEFF Research Database (Denmark)

    Ma, Ning; May, Tobias; Brown, Guy J.

    2017-01-01

    This paper presents a novel machine-hearing system that exploits deep neural networks (DNNs) and head movements for robust binaural localization of multiple sources in reverberant environments. DNNs are used to learn the relationship between the source azimuth and binaural cues, consisting of the complete cross-correlation function (CCF) and interaural level differences (ILDs). In contrast to many previous binaural hearing systems, the proposed approach is not restricted to localization of sound sources in the frontal hemifield. Due to the similarity of binaural cues in the frontal and rear …

  4. Local rollback for fault-tolerance in parallel computing systems

    Science.gov (United States)

    Blumrich, Matthias A [Yorktown Heights, NY; Chen, Dong [Yorktown Heights, NY; Gara, Alan [Yorktown Heights, NY; Giampapa, Mark E [Yorktown Heights, NY; Heidelberger, Philip [Yorktown Heights, NY; Ohmacht, Martin [Yorktown Heights, NY; Steinmacher-Burow, Burkhard [Boeblingen, DE; Sugavanam, Krishnan [Yorktown Heights, NY

    2012-01-24

    A control logic device performs a local rollback in a parallel supercomputing system. The supercomputing system includes at least one cache memory device. The control logic device determines a local rollback interval. The control logic device runs at least one instruction in the local rollback interval. The control logic device evaluates whether an unrecoverable condition occurs while running the at least one instruction during the local rollback interval. The control logic device checks whether an error occurs during the local rollback. The control logic device restarts the local rollback interval if the error occurs and the unrecoverable condition does not occur during the local rollback interval.
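
    Stripped of patent language, the claimed control flow is a retry loop around an interval of execution. The sketch below paraphrases it; the function names are mine and the error outcomes are simulated, so this shows only the decision structure, not the cache mechanics.

```python
# Toy paraphrase of the local-rollback control flow (names are invented).
import random
random.seed(0)

def run_interval(instructions):
    """Run one rollback interval; return simulated (error, unrecoverable)."""
    for op in instructions:
        op()
    return random.random() < 0.3, random.random() < 0.05

def execute_with_local_rollback(instructions, max_retries=10):
    for attempt in range(max_retries):
        # cache state is assumed restorable at the start of each interval
        error, unrecoverable = run_interval(instructions)
        if unrecoverable:
            raise RuntimeError("unrecoverable: escalate beyond local rollback")
        if not error:
            return attempt            # interval committed successfully
        # recoverable error: discard speculative state and restart interval
    raise RuntimeError("retry budget exhausted")

print("committed after", execute_with_local_rollback([lambda: None] * 4), "retries")
```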

  5. Exploiting heterogeneous publicly available data sources for drug safety surveillance: computational framework and case studies.

    Science.gov (United States)

    Koutkias, Vassilis G; Lillo-Le Louët, Agnès; Jaulent, Marie-Christine

    2017-02-01

    Driven by the need of pharmacovigilance centres and companies to routinely collect and review all available data about adverse drug reactions (ADRs) and adverse events of interest, we introduce and validate a computational framework exploiting dominant as well as emerging publicly available data sources for drug safety surveillance. Our approach relies on appropriate query formulation for data acquisition and subsequent filtering, transformation and joint visualization of the obtained data. We acquired data from the FDA Adverse Event Reporting System (FAERS), PubMed and Twitter. In order to assess the validity and the robustness of the approach, we elaborated on two important case studies, namely, clozapine-induced cardiomyopathy/myocarditis versus haloperidol-induced cardiomyopathy/myocarditis, and apixaban-induced cerebral hemorrhage. The analysis of the obtained data provided interesting insights (identification of potential patient and health-care professional experiences regarding ADRs in Twitter, information/arguments against an ADR existence across all sources), while illustrating the benefits (complementing data from multiple sources to strengthen/confirm evidence) and the underlying challenges (selecting search terms, data presentation) of exploiting heterogeneous information sources, thereby advocating the need for the proposed framework. This work contributes to establishing a continuous learning system for drug safety surveillance by exploiting heterogeneous publicly available data sources via appropriate support tools.
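
    The data-acquisition leg against FAERS can be sketched with the public openFDA endpoint. The query below mirrors the clozapine/myocarditis case study; the field names follow openFDA's documented schema to the best of my knowledge, and the paper's own query formulation, its PubMed and Twitter legs, and the filtering/visualization steps are omitted.

```python
# Sketch: count FAERS reports pairing clozapine with myocarditis via the
# public, rate-limited openFDA endpoint (not the paper's exact queries).
import requests

URL = "https://api.fda.gov/drug/event.json"
params = {
    "search": 'patient.drug.medicinalproduct:"clozapine" AND '
              'patient.reaction.reactionmeddrapt:"myocarditis"',
    "count": "receivedate",          # histogram of report receipt dates
}
resp = requests.get(URL, params=params, timeout=30)
resp.raise_for_status()
for bucket in resp.json()["results"][:5]:
    print(bucket["time"], bucket["count"])
```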

  6. Exploiting Data Sparsity In Covariance Matrix Computations on Heterogeneous Systems

    KAUST Repository

    Charara, Ali M.

    2018-05-24

    Covariance matrices are ubiquitous in computational sciences, typically describing the correlation of elements of large multivariate spatial data sets. For example, covariance matrices are employed in climate/weather modeling for the maximum likelihood estimation to improve prediction, as well as in computational ground-based astronomy to enhance the observed image quality by filtering out noise produced by the adaptive optics instruments and atmospheric turbulence. The structure of these covariance matrices is dense, symmetric, positive-definite, and often data-sparse, therefore, hierarchically of low-rank. This thesis investigates the performance limit of dense matrix computations (e.g., Cholesky factorization) on covariance matrix problems as the number of unknowns grows, and in the context of the aforementioned applications. We employ recursive formulations of some of the basic linear algebra subroutines (BLAS) to accelerate the covariance matrix computation further, while reducing data traffic across the memory subsystem layers. However, dealing with large data sets (i.e., covariance matrices of billions in size) can rapidly become prohibitive in memory footprint and algorithmic complexity. Most importantly, this thesis investigates the tile low-rank data format (TLR), a new compressed data structure and layout, which is valuable in exploiting data sparsity by approximating the operator. The TLR compressed data structure allows approximating the original problem up to user-defined numerical accuracy. This comes at the expense of dealing with tasks with much lower arithmetic intensities than traditional dense computations. In fact, this thesis consolidates the two trends of dense and data-sparse linear algebra for HPC. Not only does the thesis leverage recursive formulations for dense Cholesky-based matrix algorithms, but it also implements a novel TLR-Cholesky factorization using batched linear algebra operations to increase hardware occupancy and …

  7. Noise-exploitation and adaptation in neuromorphic sensors

    Science.gov (United States)

    Hindo, Thamira; Chakrabartty, Shantanu

    2012-04-01

    Even though current micro-/nano-fabrication technology has reached integration levels where ultra-sensitive sensors can be fabricated, the sensing performance (resolution per joule) of synthetic systems is still orders of magnitude inferior to that observed in neurobiology. For example, the filiform hairs in crickets operate at fundamental limits of noise; auditory sensors in a parasitoid fly can overcome fundamental limitations to precisely localize ultra-faint acoustic signatures. Even though many of these biological marvels have served as inspiration for different types of neuromorphic sensors, the main focus of these designs has been to faithfully replicate the biological functionalities, without considering the constructive role of "noise". In man-made sensors, device and sensor noise are typically considered a nuisance, whereas in neurobiology "noise" has been shown to be a computational aid that enables biology to sense and operate at fundamental limits of energy efficiency and performance. In this paper, we describe some of the important noise-exploitation and adaptation principles observed in neurobiology and how they can be systematically used for designing neuromorphic sensors. Our focus will be on two types of noise-exploitation principles, namely, (a) stochastic resonance and (b) noise-shaping, which are unified within our previously reported framework called ΣΔ learning. As a case study, we describe the application of ΣΔ learning to the design of a miniature acoustic source localizer whose performance matches that of its biological counterpart (Ormia ochracea).
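
    Stochastic resonance, the first of the two principles, has a standard minimal demonstration: a sub-threshold periodic signal crosses a hard threshold only with the help of noise, and a moderate noise level maximizes how much of the signal structure survives in the thresholded output. The demo below is that textbook toy, not the paper's ΣΔ learning framework.

```python
# Stochastic-resonance toy: moderate noise best reveals a sub-threshold tone.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 5000)
signal = 0.4 * np.sin(2 * np.pi * t)           # peak 0.4 < threshold 1.0

for sigma in (0.2, 0.7, 5.0):
    spikes = (signal + rng.normal(0, sigma, t.size)) > 1.0
    if 0 < spikes.sum() < spikes.size:         # guard degenerate spike trains
        score = np.corrcoef(spikes.astype(float), signal)[0, 1]
    else:
        score = 0.0
    print(f"noise sigma={sigma}: signal/output correlation={score:.3f}")
```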

  8. Dynamic partitioning as a way to exploit new computing paradigms: the cloud use case

    International Nuclear Information System (INIS)

    Ciaschini, Vincenzo; Dal Pra, Stefano; Dell'Agnello, Luca

    2015-01-01

    The WLCG community and many groups in the HEP community have based their computing strategy on the Grid paradigm, which has proven successful and still ensures its goals. However, Grid technology has not spread much over other communities; in the commercial world, the cloud paradigm is the emerging way to provide computing services. WLCG experiments aim to integrate their existing computing model with cloud deployments and take advantage of so-called opportunistic resources (including HPC facilities), which are usually not Grid compliant. One feature missing in the most common cloud frameworks is the concept of a job scheduler, which plays a key role in a traditional computing centre by enabling fair-share based access to the resources for the experiments in a scenario where demand greatly outstrips availability. At CNAF we are investigating the possibility of accessing the Tier-1 computing resources as an OpenStack based cloud service. The system, exploiting the dynamic partitioning mechanism already being used to enable multicore computing, allowed us to avoid a static splitting of the computing resources in the Tier-1 farm, while permitting a share-friendly approach. The hosts in a dynamically partitioned farm may be moved to or from the partition, according to suitable policies for the request and release of computing resources. Nodes being requested in the partition switch their role and become available to play a different one. In the cloud use case, hosts may switch from acting as a Worker Node in the batch system farm to being a cloud compute node, made available to tenants. In this paper we describe the dynamic partitioning concept, its implementation and its integration with our current batch system, LSF.

  9. Dynamic partitioning as a way to exploit new computing paradigms: the cloud use case.

    Science.gov (United States)

    Ciaschini, Vincenzo; Dal Pra, Stefano; dell'Agnello, Luca

    2015-12-01

    The WLCG community and many groups in the HEP community have based their computing strategy on the Grid paradigm, which has proven successful and still ensures its goals. However, Grid technology has not spread much over other communities; in the commercial world, the cloud paradigm is the emerging way to provide computing services. WLCG experiments aim to integrate their existing computing model with cloud deployments and take advantage of so-called opportunistic resources (including HPC facilities), which are usually not Grid compliant. One feature missing in the most common cloud frameworks is the concept of a job scheduler, which plays a key role in a traditional computing centre by enabling fair-share based access to the resources for the experiments in a scenario where demand greatly outstrips availability. At CNAF we are investigating the possibility of accessing the Tier-1 computing resources as an OpenStack based cloud service. The system, exploiting the dynamic partitioning mechanism already being used to enable multicore computing, allowed us to avoid a static splitting of the computing resources in the Tier-1 farm, while permitting a share-friendly approach. The hosts in a dynamically partitioned farm may be moved to or from the partition, according to suitable policies for the request and release of computing resources. Nodes being requested in the partition switch their role and become available to play a different one. In the cloud use case, hosts may switch from acting as a Worker Node in the batch system farm to being a cloud compute node, made available to tenants. In this paper we describe the dynamic partitioning concept, its implementation and its integration with our current batch system, LSF.

  10. Green Computing in Local Governments and Information Technology Companies

    Directory of Open Access Journals (Sweden)

    Badar Agung Nugroho

    2013-06-01

    Full Text Available Green computing is the study and practice of designing, manufacturing, using, and disposing of information and communication devices efficiently and effectively with minimum impact on the environment. If the green computing concept is implemented, it will help agencies and companies to reduce the energy and capital costs of their IT infrastructure. The goal of this research is to explore the current efforts of local governments and IT companies in West Java to implement the green computing concept in their working environments. The primary data were collected through focus group discussions, inviting representatives of the local governments and IT companies who are responsible for managing their IT infrastructure. The secondary data were collected through brief observations in order to see the real extent of green computing implementation at each institution. The results show that there are many different perspectives on and efforts toward green computing implementation between local governments and IT companies.

  11. Instantaneous Non-Local Computation of Low T-Depth Quantum Circuits

    DEFF Research Database (Denmark)

    Speelman, Florian

    2016-01-01

    Instantaneous non-local quantum computation requires multiple parties to jointly perform a quantum operation, using pre-shared entanglement and a single round of simultaneous communication. We study this task for its close connection to position-based quantum cryptography, but it also has natural applications in the context of foundations of quantum physics and in distributed computing. The best known general construction for instantaneous non-local quantum computation requires a pre-shared state which is exponentially large in the number of qubits involved in the operation, while efficient … protocols scaling with the T-depth of a quantum circuit are able to perform non-local computation of quantum circuits with a (poly-)logarithmic number of layers of T gates with quasi-polynomial entanglement. Our proofs combine ideas from blind and delegated quantum computation with the garden-hose model, a combinatorial model of communication …

  12. A non-local computational boundary condition for duct acoustics

    Science.gov (United States)

    Zorumski, William E.; Watson, Willie R.; Hodge, Steve L.

    1994-01-01

    A non-local boundary condition is formulated for acoustic waves in ducts without flow. The ducts are two dimensional with constant area, but with variable impedance wall lining. Extension of the formulation to three dimensional and variable area ducts is straightforward in principle, but requires significantly more computation. The boundary condition simulates a nonreflecting wave field in an infinite duct. It is implemented by a constant matrix operator which is applied at the boundary of the computational domain. An efficient computational solution scheme is developed which allows calculations for high frequencies and long duct lengths. This computational solution utilizes the boundary condition to limit the computational space while preserving the radiation boundary condition. The boundary condition is tested for several sources. It is demonstrated that the boundary condition can be applied close to the sound sources, rendering the computational domain small. Computational solutions with the new non-local boundary condition are shown to be consistent with the known solutions for nonreflecting wavefields in an infinite uniform duct.

  13. 5 CFR 531.245 - Computing locality rates and special rates for GM employees.

    Science.gov (United States)

    2010-01-01

    5 Administrative Personnel, § 531.245: Computing locality rates and special rates for GM employees. Locality rates and special rates are computed for GM employees in the same manner as locality rates and special rates …

  14. Exploiting Quantum Resonance to Solve Combinatorial Problems

    Science.gov (United States)

    Zak, Michail; Fijany, Amir

    2006-01-01

    Quantum resonance would be exploited in a proposed quantum-computing approach to the solution of combinatorial optimization problems. In quantum computing in general, one takes advantage of the fact that an algorithm cannot be decoupled from the physical effects available to implement it. Prior approaches to quantum computing have involved exploitation of only a subset of known quantum physical effects, notably including parallelism and entanglement, but not including resonance. In the proposed approach, one would utilize the combinatorial properties of tensor-product decomposability of unitary evolution of many-particle quantum systems for physically simulating solutions to NP-complete problems (a class of problems that are intractable with respect to classical methods of computation). In this approach, reinforcement and selection of a desired solution would be executed by means of quantum resonance. Classes of NP-complete problems that are important in practice and could be solved by the proposed approach include planning, scheduling, search, and optimal design.

  15. Cluster-based localization and tracking in ubiquitous computing systems

    CERN Document Server

    Martínez-de Dios, José Ramiro; Torres-González, Arturo; Ollero, Anibal

    2017-01-01

    Localization and tracking are key functionalities in ubiquitous computing systems and techniques. In recent years a very high variety of approaches, sensors and techniques for indoor and GPS-denied environments have been developed. This book briefly summarizes the current state of the art in localization and tracking in ubiquitous computing systems focusing on cluster-based schemes. Additionally, existing techniques for measurement integration, node inclusion/exclusion and cluster head selection are also described in this book.

  16. 5 CFR 531.607 - Computing hourly, daily, weekly, and biweekly locality rates.

    Science.gov (United States)

    2010-01-01

    5 Administrative Personnel, § 531.607: Computing hourly, daily, weekly, and biweekly locality rates. (a) Apply the following methods to convert an … firefighter whose pay is computed under 5 U.S.C. 5545b, a firefighter hourly locality rate is computed using a …
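
    As a rough illustration of the kind of conversion this section prescribes, the sketch below assumes the standard federal divisor of 2,087 annual hours and an 8-hour/40-hour tour of duty; consult the regulation text for the authoritative method, rounding rules, and the special firefighter provisions.

```python
# Illustrative locality-rate conversions (assumed divisor and tour of duty).
ANNUAL_HOURS = 2087          # standard federal annual-hours divisor

def locality_rates(annual_locality_rate: float) -> dict:
    hourly = round(annual_locality_rate / ANNUAL_HOURS, 2)  # nearest cent
    return {"hourly": hourly, "daily": hourly * 8,
            "weekly": hourly * 40, "biweekly": hourly * 80}

print(locality_rates(100_000))   # {'hourly': 47.92, 'daily': 383.36, ...}
```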

  17. Exploiting GPUs in Virtual Machine for BioCloud

    OpenAIRE

    Jo, Heeseung; Jeong, Jinkyu; Lee, Myoungho; Choi, Dong Hoon

    2013-01-01

    Recently, biological applications have started to be reimplemented as applications that exploit the many cores of GPUs for better computation performance. Therefore, by providing virtualized GPUs to VMs in a cloud computing environment, many biological applications will willingly move into the cloud environment to enhance their computation performance and utilize abundant cloud computing resources while reducing the expense of computations. In this paper, we propose a BioCloud system architecture that ena...

  18. Regional reanalysis without local data: Exploiting the downscaling paradigm

    Science.gov (United States)

    von Storch, Hans; Feser, Frauke; Geyer, Beate; Klehmet, Katharina; Li, Delei; Rockel, Burkhardt; Schubert-Frisius, Martina; Tim, Nele; Zorita, Eduardo

    2017-08-01

    This paper demonstrates two important aspects of regional dynamical downscaling of multidecadal atmospheric reanalyses. First, skillful regional descriptions of multidecadal climate variability may be constructed in this way for regions with little or no local data. Second, the concept of large-scale constraining allows global downscaling, so that global reanalyses may be completed by the addition of consistent detail in all regions of the world. Global reanalyses suffer from inhomogeneities. However, their large-scale components are mostly homogeneous; therefore, the concept of downscaling may be applied to homogeneously complement the large-scale state of the reanalyses with regional detail, wherever the condition of homogeneity of the description of large scales is fulfilled. Technically, this can be done by dynamical downscaling using a regional or global climate model whose large scales are constrained by spectral nudging. This approach has been developed and tested for the region of Europe, and a skillful representation of regional weather risks, in particular marine risks, was identified. We have run this system in regions with reduced or absent local data coverage, such as Central Siberia, the Bohai and Yellow Seas, Southwestern Africa, and the South Atlantic. Also, a global simulation was computed, which adds regional features to prescribed global dynamics. Our cases demonstrate that spatially detailed reconstructions of the climate state and its change in the recent three to six decades add useful supplementary information to existing observational data for midlatitude and subtropical regions of the world.
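
    Spectral nudging itself is a compact operation: only the low-wavenumber part of the model state is relaxed toward the driving reanalysis, leaving the regional detail free to develop. The one-dimensional sketch below shows the idea; operational setups nudge selected two-dimensional wavenumbers, typically of the winds in the upper model levels.

```python
# 1-D spectral nudging sketch: relax only the large scales toward the driver.
import numpy as np

def spectral_nudge(model, driving, n_keep=4, alpha=0.1):
    """Relax wavenumbers below n_keep of `model` toward `driving`."""
    fm, fd = np.fft.rfft(model), np.fft.rfft(driving)
    fm[:n_keep] += alpha * (fd[:n_keep] - fm[:n_keep])   # large scales only
    return np.fft.irfft(fm, n=model.size)

x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
driving = np.sin(x)                              # large-scale driving state
model = 0.8 * np.sin(x) + 0.3 * np.sin(12 * x)   # drifted + regional detail
nudged = spectral_nudge(model, driving)
# The k=12 regional detail is untouched; the k=1 part moves toward the driver.
```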

  19. Weighted Local Active Pixel Pattern (WLAPP for Face Recognition in Parallel Computation Environment

    Directory of Open Access Journals (Sweden)

    Gundavarapu Mallikarjuna Rao

    2013-10-01

    Full Text Available Abstract - The availability of multi-core technology has resulted in a totally new computational era. Researchers are keen to explore the potential available in state-of-the-art machines for breaking the barrier imposed by serial computation. Face recognition is one of the most challenging applications in any computational environment. The main difficulty of traditional face recognition algorithms is their lack of scalability. In this paper, Weighted Local Active Pixel Pattern (WLAPP), a new scalable face recognition algorithm suitable for parallel environments, is proposed. Local Active Pixel Pattern (LAPP) is found to be simple and computationally inexpensive compared to Local Binary Patterns (LBP). WLAPP is developed based on the concept of LAPP. The experimentation is performed on the FG-Net Aging Database with deliberately introduced 20% distortion, and the results are encouraging. Keywords: active pixels, face recognition, Local Binary Pattern (LBP), Local Active Pixel Pattern (LAPP), pattern computing, parallel workers, template, weight computation.

  20. FSH: fast spaced seed hashing exploiting adjacent hashes.

    Science.gov (United States)

    Girotto, Samuele; Comin, Matteo; Pizzi, Cinzia

    2018-01-01

    Patterns with wildcards in specified positions, namely spaced seeds, are increasingly used instead of k-mers in many bioinformatics applications that require indexing, querying and rapid similarity search, as they can provide better sensitivity. Many of these applications require computing the hashing of each position in the input sequences with respect to the given spaced seed, or to multiple spaced seeds. While the hashing of k-mers can be rapidly computed by exploiting the large overlap between consecutive k-mers, spaced seed hashing is usually computed from scratch for each position in the input sequence, thus resulting in slower processing. The method proposed in this paper, fast spaced-seed hashing (FSH), exploits the similarity of the hash values of spaced seeds computed at adjacent positions in the input sequence. In our experiments we compute the hash for each position of metagenomics reads from several datasets, with respect to different spaced seeds. We also propose a generalized version of the algorithm for the simultaneous computation of multiple spaced seed hashings. In the experiments, our algorithm can compute the hashing values of spaced seeds with a speedup, with respect to the traditional approach, between 1.6x and 5.3x, depending on the structure of the spaced seed. Spaced seed hashing is a routine task for several bioinformatics applications. FSH allows this task to be performed efficiently and raises the question of whether other hashings can be exploited to further improve the speedup. This has the potential of major impact in the field, making spaced seed applications not only accurate, but also faster and more efficient. The software FSH is freely available for academic use at: https://bitbucket.org/samu661/fsh/overview.
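
    The problem FSH addresses is visible in a few lines: a spaced seed reads only the '1' positions of each window, so the rolling-hash reuse that makes contiguous k-mer hashing cheap no longer applies directly, and the naive method recomputes every position from scratch. The sketch below shows that naive baseline; FSH's contribution is an incremental scheme that reuses the hash bits shared between adjacent windows.

```python
# Naive spaced-seed hashing: each window is hashed from scratch (the cost FSH
# removes by reusing hash bits shared between adjacent positions).
ENC = {"A": 0, "C": 1, "G": 2, "T": 3}

def spaced_seed_hashes(seq, seed="1101011"):
    care = [i for i, c in enumerate(seed) if c == "1"]  # match positions
    for pos in range(len(seq) - len(seed) + 1):
        h = 0
        for i in care:                 # recomputed from scratch at every pos
            h = (h << 2) | ENC[seq[pos + i]]
        yield h

print(list(spaced_seed_hashes("ACGTACGTAC")))
```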

  1. Spike-timing-based computation in sound localization.

    Directory of Open Access Journals (Sweden)

    Dan F M Goodman

    2010-11-01

    Full Text Available Spike timing is precise in the auditory system and it has been argued that it conveys information about auditory stimuli, in particular about the location of a sound source. However, beyond simple time differences, the way in which neurons might extract this information is unclear and the potential computational advantages are unknown. The computational difficulty of this task for an animal is to locate the source of an unexpected sound from two monaural signals that are highly dependent on the unknown source signal. In neuron models consisting of spectro-temporal filtering and spiking nonlinearity, we found that the binaural structure induced by spatialized sounds is mapped to synchrony patterns that depend on source location rather than on source signal. Location-specific synchrony patterns would then result in the activation of location-specific assemblies of postsynaptic neurons. We designed a spiking neuron model which exploited this principle to locate a variety of sound sources in a virtual acoustic environment using measured human head-related transfer functions. The model was able to accurately estimate the location of previously unknown sounds in both azimuth and elevation (including front/back discrimination) in a known acoustic environment. We found that multiple representations of different acoustic environments could coexist as sets of overlapping neural assemblies which could be associated with spatial locations by Hebbian learning. The model demonstrates the computational relevance of relative spike timing to extract spatial information about sources independently of the source signal.
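
    The classical cue that the model's synchrony patterns generalize is the interaural time difference (ITD), conventionally estimated by cross-correlating the two ear signals. The sketch below shows only that baseline computation on idealized signals; the paper's point is that spike-timing synchrony across an assembly performs this extraction in a signal-independent way.

```python
# Baseline ITD estimate by cross-correlation of idealized two-ear signals.
import numpy as np

fs = 44100
rng = np.random.default_rng(0)
src = rng.normal(size=int(0.05 * fs))         # unknown source signal
delay = 12                                    # samples (~0.27 ms ITD)
left, right = src, np.roll(src, delay)        # idealized ear signals

lags = np.arange(-40, 41)
cc = [np.dot(left, np.roll(right, -k)) for k in lags]
print("estimated ITD (samples):", lags[int(np.argmax(cc))])  # -> 12
```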

  2. Local computations in Dempster-Shafer theory of evidence

    Czech Academy of Sciences Publication Activity Database

    Jiroušek, Radim

    2012-01-01

    Vol. 53, No. 8 (2012), pp. 1155-1167. ISSN 0888-613X. Grants (other): GA ČR (CZ) GAP403/12/2175. Institutional support: RVO:67985556. Keywords: discrete belief functions; Dempster-Shafer theory; conditional independence; decomposable model. Subject RIV: IN - Informatics, Computer Science. Impact factor: 1.729 (2012). http://library.utia.cas.cz/separaty/2012/MTR/jirousek-local computations in dempster–shafer theory of evidence.pdf

  3. SEXUAL EXPLOITATION AND ABUSE BY UN PEACEKEEPERS ...

    African Journals Online (AJOL)

    Allaiac

    … from sexual exploitation and sexual abuse (ST/SGB/2003/13) (UN Secretary …) … In addition, in most situations, UN personnel have enjoyed immunity from local … Official UN statistics show a higher incidence of allegations reported against …

  4. Cortical basis of communication: local computation, coordination, attention.

    Science.gov (United States)

    Alexandre, Frederic

    2009-03-01

    Human communication emerges from cortical processing, known to be implemented on a regular repetitive neuronal substratum. The supposed genericity of cortical processing has elicited a series of modeling works in computational neuroscience that underline the information flows driven by the cortical circuitry. In the minimalist framework underlying the current theories for the embodiment of cognition, such a generic cortical processing is exploited for the coordination of poles of representation, as is reported in this paper for the case of visual attention. Interestingly, this case emphasizes how abstract internal referents are built to conform to memory requirements. This paper proposes that these referents are the basis for communication in humans, which is firstly a coordination and an attentional procedure with regard to their congeners.

  5. Intelligence, mapping, and geospatial exploitation system (IMAGES)

    Science.gov (United States)

    Moellman, Dennis E.; Cain, Joel M.

    1998-08-01

    … available and it will also provide statistical pedigree data. This pedigree data provides both uncertainties associated with the information and an audit trail cataloging the raw data sources and the processing/exploitation applied to derive the final product. Collaboration provides for a close union between the information producer(s)/exploiter(s) and the information user(s) as well as between local and remote producer(s)/exploiter(s). From a military operational perspective, IMAGES is a step toward further uniting NIMA with its customers and further blurring the dividing line between operational command and control (C2) and its supporting intelligence activities. IMAGES also provides a foundation for reachback to remote data sources, data stores, application software, and computational resources for achieving 'just-in-time' information delivery -- all of which is transparent to the analyst or operator employing the system.

  6. A high-resolution computational localization method for transcranial magnetic stimulation mapping.

    Science.gov (United States)

    Aonuma, Shinta; Gomez-Tames, Jose; Laakso, Ilkka; Hirata, Akimasa; Takakura, Tomokazu; Tamura, Manabu; Muragaki, Yoshihiro

    2018-05-15

    Transcranial magnetic stimulation (TMS) is used for the mapping of brain motor functions. The complexity of the brain deters determining the exact localization of the stimulation site using simplified methods (e.g., the region below the center of the TMS coil) or conventional computational approaches. This study aimed to present a high-precision localization method for a specific motor area by synthesizing computed non-uniform current distributions in the brain for multiple sessions of TMS. Peritumoral mapping by TMS was conducted on patients who had intra-axial brain neoplasms located within or close to the motor speech area. The electric field induced by TMS was computed using realistic head models constructed from magnetic resonance images of the patients. A post-processing method was implemented to determine a TMS hotspot by combining the computed electric fields for the coil orientations and positions that delivered high motor-evoked potentials during peritumoral mapping. The method was compared to the stimulation site localized via intraoperative direct brain stimulation and navigated TMS. Four main results were obtained: 1) the dependence of the computed hotspot area on the number of peritumoral measurements was evaluated; 2) the estimated localization of the hand motor area in eight non-affected hemispheres was in good agreement with the position of the so-called "hand-knob"; 3) the estimated hotspot areas were not sensitive to variations in tissue conductivity; and 4) the hand motor areas estimated by the proposed method and by direct electric stimulation (DES) were in good agreement in the ipsilateral hemisphere of four glioma patients. The TMS localization method was validated by the well-known position of the "hand-knob" in brains for the non-affected hemisphere, and by a hotspot localized via DES during awake craniotomy for the tumor-containing hemisphere. Copyright © 2018 Elsevier Inc. All rights reserved.
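
    The post-processing step lends itself to a compact sketch: per-session electric-field maps are combined with weights derived from the motor-evoked potentials they elicited, so regions strongly stimulated in all high-MEP sessions stand out as the hotspot. The weighting rule and toy field maps below are assumptions for illustration; the paper defines its own combination of the computed non-uniform field distributions.

```python
# MEP-weighted combination of per-session |E| maps (toy data, assumed rule).
import numpy as np

rng = np.random.default_rng(0)
fields = rng.random((6, 64, 64))        # |E| maps for 6 coil placements
mep = np.array([0.1, 0.9, 0.8, 0.2, 0.7, 0.05])   # normalized MEP amplitudes

weighted = (mep[:, None, None] * fields).sum(axis=0) / mep.sum()
hotspot = np.unravel_index(np.argmax(weighted), weighted.shape)
print("estimated hotspot voxel:", hotspot)
```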

  7. Local computer network of the JINR Neutron Physics Laboratory

    International Nuclear Information System (INIS)

    Alfimenkov, A.V.; Vagov, V.A.; Vajdkhadze, F.

    1988-01-01

    A new high-speed local computer network, in which an intelligent network adapter (NA) serves as the hardware base, has been developed in the JINR Neutron Physics Laboratory to increase operating efficiency and data transfer rate. The NA consists of a computer bus interface, a cable former, and a microcomputer segment designed both for program realization of the channel-level protocol and for organization of bidirectional information transfer through a direct-access channel between the monochannel and computer memory, with or without buffering in the NA's operating memory device.

  8. Exploiting parallel R in the cloud with SPRINT.

    Science.gov (United States)

    Piotrowski, M; McGilvary, G A; Sloan, T M; Mewissen, M; Lloyd, A D; Forster, T; Mitchell, L; Ghazal, P; Hill, J

    2013-01-01

    Advances in DNA microarray devices and next-generation massively parallel DNA sequencing platforms have led to an exponential growth in data availability, but the arising opportunities require adequate computing resources. High Performance Computing (HPC) in the cloud offers an affordable way of meeting this need. Bioconductor, a popular tool for high-throughput genomic data analysis, is distributed as add-on modules for the R statistical programming language, but R has no native capabilities for exploiting multi-processor architectures. SPRINT is an R package that enables easy access to HPC for genomics researchers. This paper investigates: setting up and running SPRINT-enabled genomic analyses on Amazon's Elastic Compute Cloud (EC2); the advantages of submitting applications to EC2 from different parts of the world; and whether resource underutilization can improve application performance. The SPRINT parallel implementations of correlation, permutation testing, partitioning around medoids and the multi-purpose papply have been benchmarked on data sets of various sizes on Amazon EC2. Jobs have been submitted from both the UK and Thailand to investigate monetary differences. It is possible to obtain good, scalable performance, but the level of improvement is dependent upon the nature of the algorithm. Resource underutilization can further improve the time to result. The end-user's location impacts costs due to factors such as local taxation. Although not designed to satisfy HPC requirements, Amazon EC2 and cloud computing in general provide an interesting alternative and new possibilities for smaller organisations with limited funds.
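
    SPRINT itself is an R package, but the core idea, farming independent pieces of a genomic computation out to multiple workers, can be sketched in a few lines of Python (used here only for illustration). The block below parallelizes a gene-versus-reference correlation over worker processes; the data shapes, chunking, and worker count are illustrative assumptions, not SPRINT's API.

    ```python
    import numpy as np
    from multiprocessing import Pool

    def row_correlations(args):
        """Correlate one block of gene rows against a reference profile."""
        block, reference = args
        ref = (reference - reference.mean()) / reference.std()
        centered = block - block.mean(axis=1, keepdims=True)
        centered /= block.std(axis=1, keepdims=True)
        return centered @ ref / ref.size      # Pearson r per gene row

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        expression = rng.random((10_000, 200))   # genes x samples (synthetic)
        reference = rng.random(200)
        blocks = np.array_split(expression, 8)   # one chunk per worker
        with Pool(processes=8) as pool:
            parts = pool.map(row_correlations,
                             [(b, reference) for b in blocks])
        correlations = np.concatenate(parts)
        print(correlations.shape)
    ```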

  9. Personal computer local networks report

    CERN Document Server

    1991-01-01

    Please note this is a Short Discount publication. Since the first microcomputer local networks of the late 1970s and early '80s, personal computer LANs have expanded in popularity, especially since the introduction of IBM's first PC in 1981. The late 1980s saw a maturing of the industry, with only a few vendors maintaining a large share of the market. This report is intended to give the reader a thorough understanding of the technology used to build these systems ... from cable to chips ... to ... protocols to servers. The report also fully defines PC LANs and the marketplace, with in-

  10. Fishery Development and Exploitation in South East Australia

    Directory of Open Access Journals (Sweden)

    Camilla Novaglio

    2018-04-01

    Full Text Available Understanding the full extent of past ecological changes in human-influenced marine systems is needed to inform present management policies, but is often hampered by the scarcity of information about exploitation practices and population status over the entire history of fishing. The history of commercial fishing in South East Australia is relatively recent and thus easier to document. Our aim is to reconstruct this history and to use this information to understand general patterns and consequences of fishing exploitation. Intense exploitation of marine resources arrived in South East Australia with European colonization in the early 1800s, and unregulated sealing, whaling and oyster dredging resulted in the first documented significant impacts on local marine populations. Exploitation extended to demersal resources in 1915 when the trawl fishery developed. Between the early 1800s and the 1980s, some of the exploited stocks collapsed, but fishing moved further offshore and into deeper waters as technology improved and new resources became available or were discovered. This phase of fisheries expansion masked the unsustainable nature of some fishing industries, such as trawling and whaling, and postponed the need for management regulations. From the 1990s onward, an increasing awareness of the depleted nature of some fisheries led to the establishment of management strategies aiming at a more sustainable exploitation of target stocks and, from the mid-2000s onwards, management strategies were revised and improved to better address the effect of fishing on multiple components of marine ecosystems. This led to the recovery of some depleted populations and to increased habitat protection. The relatively short history of fishing exploitation and the small scale of the fishing industry in South East Australia played a significant role in limiting the magnitude of fishing impacts on local populations and helped to achieve recoveries when fisheries

  11. [Ecotourism exploitation model in Bita Lake Natural Reserve of Yunnan].

    Science.gov (United States)

    Yang, G; Wang, Y; Zhong, L

    2000-12-01

    Bita Lake Provincial Natural Reserve is located in the Shangri-La region of northwestern Yunnan and was designated a demonstration area for ecotourism exploitation in 1998. After a year of development and half a year of operation receiving tourists as a branch venue of the '99 Kunming International Horticulture Exposition, the demonstration area was shown to fulfill four integrated functions of ecotourism: tourism, protection, poverty alleviation, and environmental education. Five exploitation and management models, including a function-zoned exploitation model, a featured-tourism communication model, a signs-system design model, a local Tibetan family reception model, and an environmental monitoring model, also proved successful and were demonstrated and spread across the whole province. Bita Lake Provincial Natural Reserve can thus serve as a good example for the ecotourism exploitation of natural reserves throughout the country.

  12. Exploiting GPUs in Virtual Machine for BioCloud

    Science.gov (United States)

    Jo, Heeseung; Jeong, Jinkyu; Lee, Myoungho; Choi, Dong Hoon

    2013-01-01

    Recently, biological applications have begun to be reimplemented to exploit the many cores of GPUs for better computational performance. By providing virtualized GPUs to VMs in a cloud computing environment, many biological applications can therefore move into the cloud to enhance their computational performance, drawing on effectively unlimited cloud computing resources while reducing the expense of computation. In this paper, we propose a BioCloud system architecture that enables VMs to use GPUs in a cloud environment. Because much of the previous research has focused on mechanisms for sharing GPUs among VMs, it cannot achieve sufficient performance for biological applications, for which computational throughput is more crucial than sharing. The proposed system exploits the pass-through mode of the PCI Express (PCI-E) channel. By allowing each VM to access the underlying GPUs directly, applications achieve almost the same performance as in a native environment. In addition, our scheme multiplexes GPUs by using the hot plug-in/out device features of the PCI-E channel. By adding or removing GPUs in each VM in an on-demand manner, VMs in the same physical host can time-share their GPUs. We implemented the proposed system using the Xen VMM and NVIDIA GPUs and showed that our prototype is highly effective for biological GPU applications in a cloud environment. PMID:23710465

  13. Exploiting GPUs in Virtual Machine for BioCloud

    Directory of Open Access Journals (Sweden)

    Heeseung Jo

    2013-01-01

    Full Text Available Recently, biological applications have begun to be reimplemented to exploit the many cores of GPUs for better computational performance. By providing virtualized GPUs to VMs in a cloud computing environment, many biological applications can therefore move into the cloud to enhance their computational performance, drawing on effectively unlimited cloud computing resources while reducing the expense of computation. In this paper, we propose a BioCloud system architecture that enables VMs to use GPUs in a cloud environment. Because much of the previous research has focused on mechanisms for sharing GPUs among VMs, it cannot achieve sufficient performance for biological applications, for which computational throughput is more crucial than sharing. The proposed system exploits the pass-through mode of the PCI Express (PCI-E) channel. By allowing each VM to access the underlying GPUs directly, applications achieve almost the same performance as in a native environment. In addition, our scheme multiplexes GPUs by using the hot plug-in/out device features of the PCI-E channel. By adding or removing GPUs in each VM in an on-demand manner, VMs in the same physical host can time-share their GPUs. We implemented the proposed system using the Xen VMM and NVIDIA GPUs and showed that our prototype is highly effective for biological GPU applications in a cloud environment.

  14. Exploiting GPUs in virtual machine for BioCloud.

    Science.gov (United States)

    Jo, Heeseung; Jeong, Jinkyu; Lee, Myoungho; Choi, Dong Hoon

    2013-01-01

    Recently, biological applications have begun to be reimplemented to exploit the many cores of GPUs for better computational performance. By providing virtualized GPUs to VMs in a cloud computing environment, many biological applications can therefore move into the cloud to enhance their computational performance, drawing on effectively unlimited cloud computing resources while reducing the expense of computation. In this paper, we propose a BioCloud system architecture that enables VMs to use GPUs in a cloud environment. Because much of the previous research has focused on mechanisms for sharing GPUs among VMs, it cannot achieve sufficient performance for biological applications, for which computational throughput is more crucial than sharing. The proposed system exploits the pass-through mode of the PCI Express (PCI-E) channel. By allowing each VM to access the underlying GPUs directly, applications achieve almost the same performance as in a native environment. In addition, our scheme multiplexes GPUs by using the hot plug-in/out device features of the PCI-E channel. By adding or removing GPUs in each VM in an on-demand manner, VMs in the same physical host can time-share their GPUs. We implemented the proposed system using the Xen VMM and NVIDIA GPUs and showed that our prototype is highly effective for biological GPU applications in a cloud environment.
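
    The on-demand GPU multiplexing described in the three records above can be sketched as a small orchestration loop. This is a rough illustration, not the authors' implementation: it assumes a Xen host whose `xl` toolstack provides the standard `pci-attach`/`pci-detach` subcommands, and the domain names and PCI address below are hypothetical.

    ```python
    import subprocess

    def attach_gpu(domain: str, bdf: str) -> None:
        # Hot plug-in: give the VM direct (pass-through) access to the GPU.
        subprocess.run(["xl", "pci-attach", domain, bdf], check=True)

    def detach_gpu(domain: str, bdf: str) -> None:
        # Hot plug-out: reclaim the GPU so another VM can use it.
        subprocess.run(["xl", "pci-detach", domain, bdf], check=True)

    # Time-share one GPU between two VMs, one job at a time.
    GPU_BDF = "0000:03:00.0"          # illustrative PCI bus/device/function
    for domain in ["bio-vm-1", "bio-vm-2"]:   # hypothetical domain names
        attach_gpu(domain, GPU_BDF)
        # ... run the GPU-accelerated biological workload inside the VM ...
        detach_gpu(domain, GPU_BDF)
    ```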

  15. Assessing Africa-wide Pangolin exploitation by scaling local data

    DEFF Research Database (Denmark)

    Ingram, Daniel J.; Coad, Lauren; Abernethy, Katharine A.

    2018-01-01

    on regional trends in exploitation of threatened species to inform conservation actions and policy. We estimate that 0.4-2.7 million pangolins are hunted annually in Central African forests. The number of pangolins hunted has increased by ∼150% and the proportion of pangolins of all vertebrates hunted...... increased from 0.04% to 1.83% over the past four decades. However, there were no trends in pangolins observed at markets, suggesting use of alternative supply chains. We found evidence that the price of giant (Smutsia gigantea) and arboreal (Phataginus sp.) pangolins in urban markets has increased...

  16. Exploiting Virtualization and Cloud Computing in ATLAS

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    This work will present the current status of the Virtualization and Cloud Computing R&D project in ATLAS Distributed Computing. First, strategies for deploying PanDA queues on cloud sites will be discussed, including the introduction of a "cloud factory" for managing cloud VM instances. Ne...

  17. Field Level Computer Exploitation Package

    Science.gov (United States)

    2007-03-01

    to take advantage of the data retrieved from the computer. Major Barge explained that if a tool could be designed that nearly anyone could use...the study of network forensics. This has become a necessity because of the constantly growing eCommerce industry and the stiff competition between...Security. One big advantage that Insert has is the fact that it is quite small compared to most bootable CDs. At only 60 megabytes it can be burned

  18. Cosimulation of electromagnetics-circuit systems exploiting DGTD and MNA

    KAUST Repository

    Li, Ping

    2014-06-01

    A hybrid electromagnetics (EM)-circuit simulator exploiting the discontinuous Galerkin time domain (DGTD) method and the modified nodal analysis (MNA) algorithm is developed for analyzing hybrid systems of distributed EM structures and nonlinear multiport lumped circuits. The computational domain is split into two subsystems. One is the EM subsystem, which is analyzed by DGTD, while the other is the circuit subsystem, which is solved by the MNA method. The coupling between the EM and circuit subsystems is enforced at the lumped port, where the related field and circuit unknowns are coupled via the use of numerical flux, port voltages, and current sources. Since the spatial operations of DGTD are localized, thanks to the use of numerical flux, the coupling matrices between the EM and circuit subsystems are small and are directly inverted. To handle nonlinear devices within the circuit subsystem, the standard Newton-Raphson method is applied to the nonlinear coupling matrix system. In addition, a local time-stepping scheme is applied to improve the efficiency of the hybrid solver. Numerical examples including single and multiport linear/nonlinear circuit networks are presented to validate the proposed solver.
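
    The Newton-Raphson treatment of a nonlinear lumped device can be illustrated on the simplest possible case: a diode terminating a port that the field solver presents as a Thevenin equivalent. All component values below are illustrative assumptions, and the real solver iterates a matrix system of port unknowns rather than a scalar.

    ```python
    import numpy as np

    # Shockley diode at a lumped port driven by a Thevenin equivalent
    # (open-circuit voltage V_oc from the EM side, source resistance R).
    I_S, V_T = 1e-14, 0.02585   # saturation current [A], thermal voltage [V]
    R, V_oc = 50.0, 0.8         # illustrative port values

    def f(v):                   # KCL residual at the port node
        return (V_oc - v) / R - I_S * (np.exp(v / V_T) - 1.0)

    def dfdv(v):                # Jacobian (a scalar in this toy case)
        return -1.0 / R - I_S / V_T * np.exp(v / V_T)

    v = 0.6                     # initial guess [V]
    for _ in range(50):
        step = f(v) / dfdv(v)   # Newton-Raphson update
        v -= step
        if abs(step) < 1e-12:
            break
    print(f"port voltage = {v:.6f} V, current = {(V_oc - v) / R:.6e} A")
    ```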

  19. A semi-local quasi-harmonic model to compute the thermodynamic and mechanical properties of silicon nanostructures

    International Nuclear Information System (INIS)

    Zhao, H; Aluru, N R

    2007-01-01

    This paper presents a semi-local quasi-harmonic model with local phonon density of states (LPDOS) to compute the thermodynamic and mechanical properties of silicon nanostructures at finite temperature. In contrast to an earlier approach (Tang and Aluru 2006 Phys. Rev. B 74 235441), where a quasi-harmonic model with LPDOS computed by a Green's function technique (QHMG) was developed considering many layers of atoms, the semi-local approach considers only two layers of atoms to compute the LPDOS. We show that the semi-local approach combines the accuracy of the QHMG approach and the computational efficiency of the local quasi-harmonic model. We present results for several silicon nanostructures to address the accuracy and efficiency of the semi-local approach
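
    Models of this family evaluate thermodynamic quantities from the phonon density of states. For orientation, the standard quasi-harmonic expression for the Helmholtz free energy in terms of a local phonon density of states reads (standard textbook form, assumed here rather than quoted from the paper):

    ```latex
    F_i(T) \;=\; U_{0,i} \;+\; k_B T \int_0^{\infty} g_i(\omega)\,
           \ln\!\left[\,2\sinh\!\left(\frac{\hbar\omega}{2 k_B T}\right)\right]
           \mathrm{d}\omega
    ```

    where U_{0,i} is the static lattice contribution and g_i(omega) the local phonon density of states of atom i; summing F_i over atoms gives the free energy from which thermodynamic and (via strain derivatives) mechanical properties follow. The methods above differ in how many neighboring layers of atoms enter the computation of g_i.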

  20. Computer Vision Using Local Binary Patterns

    CERN Document Server

    Pietikainen, Matti; Zhao, Guoying; Ahonen, Timo

    2011-01-01

    The recent emergence of Local Binary Patterns (LBP) has led to significant progress in applying texture methods to various computer vision problems and applications. The focus of this research has broadened from 2D textures to 3D textures and spatiotemporal (dynamic) textures. Also, where texture was once utilized for applications such as remote sensing, industrial inspection and biomedical image analysis, the introduction of LBP-based approaches has provided outstanding results in problems relating to face and activity analysis, with future scope for face and facial expression recognition, b
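
    The basic LBP operator is simple enough to state in full: each pixel is replaced by an 8-bit code recording which of its eight neighbors are at least as bright as the pixel itself, and the histogram of codes serves as the texture descriptor. A minimal NumPy sketch of that basic operator (fixed 3x3 neighborhood, without the uniform-pattern or rotation-invariant refinements covered in the book):

    ```python
    import numpy as np

    def lbp_image(gray):
        """Basic 8-neighbor Local Binary Pattern: each interior pixel
        becomes an 8-bit code, one bit per neighbor >= the center value."""
        g = np.asarray(gray, dtype=float)
        center = g[1:-1, 1:-1]
        # (dy, dx) offsets of the 8 neighbors, in a fixed clockwise order
        offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                   (1, 1), (1, 0), (1, -1), (0, -1)]
        code = np.zeros(center.shape, dtype=np.uint8)
        for bit, (dy, dx) in enumerate(offsets):
            neighbor = g[1 + dy: g.shape[0] - 1 + dy,
                         1 + dx: g.shape[1] - 1 + dx]
            code |= np.left_shift((neighbor >= center).astype(np.uint8), bit)
        return code

    img = np.random.default_rng(2).integers(0, 256, (64, 64))  # stand-in image
    codes = lbp_image(img)
    hist, _ = np.histogram(codes, bins=256, range=(0, 256))  # texture descriptor
    print(hist[:8])
    ```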

  1. Consensual exploitation : the moral wrong in exploitation and legal restrictions on consensual exploitative transactions

    OpenAIRE

    van der Neut, Wendy

    2014-01-01

    This thesis is about so-called consensual exploitative transactions: transactions to which all parties agree voluntarily, and which are beneficial for all parties, but which are still widely considered exploitative, and for that reason legally restricted in many countries. The thesis asks two main questions: 1. What is wrong with consensual exploitation? 2. What implications does the answer to this question have for the legal restriction of consensual transactions ...

  2. Exploiting the Dynamics of Soft Materials for Machine Learning.

    Science.gov (United States)

    Nakajima, Kohei; Hauser, Helmut; Li, Tao; Pfeifer, Rolf

    2018-06-01

    Soft materials are increasingly utilized for various purposes in many engineering applications. These materials have been shown to perform a number of functions that were previously difficult to implement using rigid materials. Here, we argue that the diverse dynamics generated by actuating soft materials can be effectively used for machine learning purposes. This is demonstrated using a soft silicone arm through a technique of multiplexing, which enables the rich transient dynamics of the soft materials to be fully exploited as a computational resource. The computational performance of the soft silicone arm is examined through two standard benchmark tasks. Results show that the soft arm compares well to or even outperforms conventional machine learning techniques under multiple conditions. We then demonstrate that this system can be used for the sensory time series prediction problem for the soft arm itself, which suggests its immediate applicability to a real-world machine learning problem. Our approach, on the one hand, represents a radical departure from traditional computational methods, whereas on the other hand, it fits nicely into a more general perspective of computation by way of exploiting the properties of physical materials in the real world.
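
    The computational scheme here, a fixed nonlinear dynamical system whose transient states are tapped by a trained linear readout, is the reservoir-computing paradigm. The sketch below substitutes a random echo-state network for the physical silicone arm (an assumption made purely for illustration; the paper's point is precisely that the soft body replaces this simulated reservoir) and trains only a ridge-regression readout:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Stand-in for the soft body: a fixed random dynamical system
    # ("reservoir") driven by the input; only the readout is trained.
    n_state, T = 100, 2000
    W = rng.normal(0, 1, (n_state, n_state))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale for stable dynamics
    w_in = rng.normal(0, 0.5, n_state)

    u = np.sin(np.arange(T) * 0.1) + 0.1 * rng.normal(size=T)  # input series
    x = np.zeros(n_state)
    states = np.empty((T, n_state))
    for t in range(T):
        x = np.tanh(W @ x + w_in * u[t])     # rich transient dynamics
        states[t] = x

    target = np.roll(u, -1)                  # task: one-step-ahead prediction
    # Ridge-regression readout (the only trained part of the system)
    lam = 1e-6
    A = states.T @ states + lam * np.eye(n_state)
    w_out = np.linalg.solve(A, states.T @ target)
    pred = states @ w_out
    print("train MSE:", np.mean((pred - target) ** 2))
    ```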

  3. Ultrasonic imaging of material flaws exploiting multipath information

    Science.gov (United States)

    Shen, Xizhong; Zhang, Yimin D.; Demirli, Ramazan; Amin, Moeness G.

    2011-05-01

    In this paper, we consider ultrasonic imaging for the visualization of flaws in a material. Ultrasonic imaging is a powerful nondestructive testing (NDT) tool which assesses material conditions via the detection, localization, and classification of flaws inside a structure. Multipath exploitations provide extended virtual array apertures and, in turn, enhance imaging capability beyond the limitation of traditional multisensor approaches. We utilize reflections of ultrasonic signals which occur when encountering different media and interior discontinuities. The waveforms observed at the physical as well as virtual sensors yield additional measurements corresponding to different aspect angles. Exploitation of multipath information addresses unique issues observed in ultrasonic imaging. (1) Utilization of physical and virtual sensors significantly extends the array aperture for image enhancement. (2) Multipath signals extend the angle of view of the narrow beamwidth of the ultrasound transducers, allowing improved visibility and array design flexibility. (3) Ultrasonic signals experience difficulty in penetrating a flaw, thus the aspect angle of the observation is limited unless access to other sides is available. The significant extension of the aperture makes it possible to yield flaw observation from multiple aspect angles. We show that data fusion of physical and virtual sensor data significantly improves the detection and localization performance. The effectiveness of the proposed multipath exploitation approach is demonstrated through experimental studies.

  4. Assessing the utility of phase-space-localized basis functions: Exploiting direct product structure and a new basis function selection procedure.

    Science.gov (United States)

    Brown, James; Carrington, Tucker

    2016-06-28

    In this paper we show that it is possible to use an iterative eigensolver in conjunction with Halverson and Poirier's symmetrized Gaussian (SG) basis [T. Halverson and B. Poirier, J. Chem. Phys. 137, 224101 (2012)] to compute accurate vibrational energy levels of molecules with as many as five atoms. This is done, without storing and manipulating large matrices, by solving a regular eigenvalue problem that makes it possible to exploit direct-product structure. These ideas are combined with a new procedure for selecting which basis functions to use. The SG basis we work with is orders of magnitude smaller than the basis made by using a classical energy criterion. We find significant convergence errors in previous calculations with SG bases. For sum-of-product Hamiltonians, SG bases large enough to compute accurate levels are orders of magnitude larger than even simple pruned bases composed of products of harmonic oscillator functions.

  5. Assessing the utility of phase-space-localized basis functions: Exploiting direct product structure and a new basis function selection procedure

    International Nuclear Information System (INIS)

    Brown, James; Carrington, Tucker

    2016-01-01

    In this paper we show that it is possible to use an iterative eigensolver in conjunction with Halverson and Poirier’s symmetrized Gaussian (SG) basis [T. Halverson and B. Poirier, J. Chem. Phys. 137, 224101 (2012)] to compute accurate vibrational energy levels of molecules with as many as five atoms. This is done, without storing and manipulating large matrices, by solving a regular eigenvalue problem that makes it possible to exploit direct-product structure. These ideas are combined with a new procedure for selecting which basis functions to use. The SG basis we work with is orders of magnitude smaller than the basis made by using a classical energy criterion. We find significant convergence errors in previous calculations with SG bases. For sum-of-product Hamiltonians, SG bases large enough to compute accurate levels are orders of magnitude larger than even simple pruned bases composed of products of harmonic oscillator functions.
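
    The phrase "exploiting direct-product structure" in the preceding two records has a concrete computational meaning: an iterative eigensolver only needs matrix-vector products, and for a direct-product operator these can be applied factor by factor on a reshaped vector, so the full matrix is never stored. A small sketch with a two-factor model Hamiltonian (random symmetric factors as stand-ins for the real vibrational problem):

    ```python
    import numpy as np
    from scipy.sparse.linalg import LinearOperator, eigsh

    rng = np.random.default_rng(4)

    def random_sym(n):
        a = rng.normal(size=(n, n))
        return (a + a.T) / 2

    n1, n2 = 60, 70
    H1, H2 = random_sym(n1), random_sym(n2)   # stand-in factor Hamiltonians

    # H = H1 (x) I + I (x) H2 is never formed; its action on a vector is
    # computed by reshaping the vector into an n1 x n2 grid.
    def matvec(v):
        V = v.reshape(n1, n2)
        return (H1 @ V + V @ H2.T).ravel()

    H = LinearOperator((n1 * n2, n1 * n2), matvec=matvec, dtype=float)
    evals = eigsh(H, k=5, which="SA", return_eigenvectors=False)
    print(np.sort(evals))                     # lowest five "levels"
    ```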

  6. Exploiting opportunistic resources for ATLAS with ARC CE and the Event Service

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00226583; The ATLAS collaboration; Filipčič, Andrej; Guan, Wen; Tsulaia, Vakhtang; Walker, Rodney; Wenaus, Torre

    2017-01-01

    With ever-greater computing needs and fixed budgets, big scientific experiments are turning to opportunistic resources as a means to add much-needed extra computing power. These resources can be very different in design from the resources that comprise the Grid computing of most experiments, therefore exploiting these resources requires a change in strategy for the experiment. The resources may be highly restrictive in what can be run or in connections to the outside world, or tolerate opportunistic usage only on condition that tasks may be terminated without warning. The ARC CE with its non-intrusive architecture is designed to integrate resources such as High Performance Computing (HPC) systems into a computing Grid. The ATLAS experiment developed the Event Service primarily to address the issue of jobs that can be terminated at any point when opportunistic resources are needed by someone else. This paper describes the integration of these two systems in order to exploit opportunistic resources for ATLAS in...

  7. Exploiting Opportunistic Resources for ATLAS with ARC CE and the Event Service

    CERN Document Server

    Cameron, David; The ATLAS collaboration

    2016-01-01

    With ever-greater computing needs and fixed budgets, big scientific experiments are turning to opportunistic resources as a means to add much-needed extra computing power. These resources can be very different in design from the resources that comprise the Grid computing of most experiments, therefore exploiting these resources requires a change in strategy for the experiment. The resources may be highly restrictive in what can be run or in connections to the outside world, or tolerate opportunistic usage only on condition that tasks may be terminated without warning. The ARC CE with its non-intrusive architecture is designed to integrate resources such as High Performance Computing (HPC) systems into a computing Grid. The ATLAS experiment developed the Event Service primarily to address the issue of jobs that can be terminated at any point when opportunistic resources are needed by someone else. This paper describes the integration of these two systems in order to exploit opportunistic resources for ATLAS in...

  8. Exploiting on-node heterogeneity for in-situ analytics of climate simulations via a functional partitioning framework

    Science.gov (United States)

    Sapra, Karan; Gupta, Saurabh; Atchley, Scott; Anantharaj, Valentine; Miller, Ross; Vazhkudai, Sudharshan

    2016-04-01

    Efficient resource utilization is critical for improved end-to-end computing and workflow of scientific applications. Heterogeneous node architectures, such as the GPU-enabled Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), present us with further challenges. In many HPC applications on Titan, the accelerators are the primary compute engines while the CPUs orchestrate the offloading of work onto the accelerators and the moving of output back to main memory. In applications that do not exploit GPUs, on the other hand, CPU usage is dominant while the GPUs idle. We utilized the Heterogeneous Functional Partitioning (HFP) runtime framework, which can optimize the usage of resources on a compute node to expedite an application's end-to-end workflow. This approach differs from existing techniques for in-situ analyses in that it provides a framework for on-the-fly, on-node analysis by dynamically exploiting under-utilized resources therein. We have implemented in the Community Earth System Model (CESM) a new concurrent diagnostic processing capability enabled by the HFP framework. Various single-variate statistics, such as means and distributions, are computed in-situ by launching HFP tasks on the GPU via the node-local HFP daemon. Since our current configuration of CESM does not use GPU resources heavily, we can move these tasks to the GPU using the HFP framework. Each rank running the atmospheric model in CESM pushes the variables of interest via HFP function calls to the HFP daemon. This node-local daemon is responsible for receiving the data from the main program and launching the designated analytics tasks on the GPU. We have implemented these analytics tasks in C and use OpenACC directives to enable GPU acceleration. This methodology is also advantageous when executing GPU-enabled configurations of CESM, where the CPUs would otherwise be idle during portions of the runtime. In our implementation results, we demonstrate that it is more efficient to use HFP
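
    The in-situ statistics themselves are cheap to accumulate incrementally, which is what makes computing them on otherwise under-utilized resources attractive. The sketch below shows single-variate running statistics of the kind described (Welford's algorithm plus a fixed-bin histogram); it is a schematic stand-in, not the HFP implementation, and the variable names and field sizes are hypothetical:

    ```python
    import numpy as np

    class RunningStats:
        """Streaming mean/variance (Welford) plus a fixed-bin histogram,
        updated as each simulation timestep's field arrives."""

        def __init__(self, lo, hi, nbins=64):
            self.n, self.mean, self.m2 = 0, 0.0, 0.0
            self.edges = np.linspace(lo, hi, nbins + 1)
            self.hist = np.zeros(nbins, dtype=np.int64)

        def update(self, field):
            for x in np.asarray(field, dtype=float).ravel():
                self.n += 1
                delta = x - self.mean
                self.mean += delta / self.n
                self.m2 += delta * (x - self.mean)
            self.hist += np.histogram(field, bins=self.edges)[0]

        @property
        def variance(self):
            return self.m2 / (self.n - 1) if self.n > 1 else 0.0

    stats = RunningStats(lo=230.0, hi=320.0)     # e.g. surface temperature [K]
    for step in range(5):                        # stand-in timesteps
        field = np.random.default_rng(step).normal(288.0, 10.0, (96, 144))
        stats.update(field)                      # what each rank would push
    print(stats.mean, stats.variance, stats.hist.sum())
    ```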

  9. Efforts to Overcome Child Commercial Sexual Exploitation Victims in City Tourism Area, Manado

    Directory of Open Access Journals (Sweden)

    Rahmat Hidayat

    2017-09-01

    Full Text Available The tourism sector contributes significantly to the economy of Manado City, North Sulawesi Province. On the other hand, however, it has contributed to an increase in the number of child victims of commercial sexual exploitation and has drawn children into commercial sex work. The Local Government of Manado City, North Sulawesi Province, has made efforts to assist victims of child commercial sexual exploitation, although these have not been effective. Accordingly, this study is designed to analyze the causes of the ineffectiveness of the Local Government's efforts to assist these victims. The study was conducted in the tourism area of Manado City, North Sulawesi Province. The informants involved in this study were divided into two types, experts and non-experts; they were identified using Opportunistic Sampling, and the sampling was carried out using Snowball Sampling. The results showed that the development of the tourism sector has had negative effects on children in the communities. The efforts made by the local government and relevant parties to assist child victims of commercial sexual exploitation have not been effective, owing to the limited allocation of budgets and of skilled, qualified human resources; the lack of a shared understanding among police, judges, and public prosecutors as law-enforcement officials in the supervision and protection of victims when resolving cases of child commercial sexual exploitation; the ineffective implementation of the action committee's duties and responsibilities; and the numerous other obstacles facing them.

  10. A kind of balance between exploitation and exploration on kriging for global optimization of expensive functions

    International Nuclear Information System (INIS)

    Dong, Huachao; Song, Baowei; Wang, Peng; Huang, Shuai

    2015-01-01

    In this paper, a novel kriging-based algorithm for the global optimization of computationally expensive black-box functions is presented. This algorithm uses a multi-start approach to find all of the local optima of the surrogate model and performs searches within the neighboring areas around these local optimal positions. Compared with traditional surrogate-based global optimization methods, this algorithm provides another kind of balance between exploitation and exploration on the kriging-based model. In addition, a new search strategy is proposed and coupled into this optimization process. The local search strategy employs an improved 'minimizing the predictor' method, which dynamically adjusts the search direction and radius until it finds the optimal value. Furthermore, the global search strategy exploits the advantage of the kriging-based model in predicting unexplored regions to guarantee the reliability of the algorithm. Finally, experiments on 13 test functions with six algorithms are set up, and the results show that the proposed algorithm is very promising.
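
    The exploitation half of this balance, multi-start local search on the surrogate predictor, is easy to sketch. The block below fits a Gaussian-process surrogate and repeatedly minimizes its mean prediction from several random starts, adding the best candidate to the sample set. It deliberately omits the paper's dynamic search-radius adjustment and its exploration strategy, and the test function, bounds, and parameters are illustrative assumptions:

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def expensive(x):                       # stand-in black-box function
        return np.sin(3 * x[0]) + 0.5 * x[0] ** 2

    rng = np.random.default_rng(5)
    X = rng.uniform(-3, 3, (12, 1))         # initial space-filling samples
    y = np.array([expensive(x) for x in X])

    for it in range(10):
        gp = GaussianProcessRegressor(kernel=RBF(1.0), normalize_y=True).fit(X, y)
        # Multi-start local search on the surrogate predictor
        starts = rng.uniform(-3, 3, (8, 1))
        candidates = [
            minimize(lambda x: gp.predict(x.reshape(1, -1))[0],
                     s, bounds=[(-3, 3)]).x
            for s in starts
        ]
        x_new = min(candidates, key=lambda x: gp.predict(x.reshape(1, -1))[0])
        X = np.vstack([X, x_new])           # evaluate the true function there
        y = np.append(y, expensive(x_new))

    best = X[np.argmin(y)]
    print("best x:", best, "f:", y.min())
    ```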

  11. Review of quantum computation

    International Nuclear Information System (INIS)

    Lloyd, S.

    1992-01-01

    Digital computers are machines that can be programmed to perform logical and arithmetical operations. Contemporary digital computers are ''universal,'' in the sense that a program that runs on one computer can, if properly compiled, run on any other computer that has access to enough memory space and time. Any one universal computer can simulate the operation of any other; and the set of tasks that any such machine can perform is common to all universal machines. Since Bennett's discovery that computation can be carried out in a non-dissipative fashion, a number of Hamiltonian quantum-mechanical systems have been proposed whose time-evolutions over discrete intervals are equivalent to those of specific universal computers. The first quantum-mechanical treatment of computers was given by Benioff, who exhibited a Hamiltonian system with a basis whose members corresponded to the logical states of a Turing machine. In order to make the Hamiltonian local, in the sense that its structure depended only on the part of the computation being performed at that time, Benioff found it necessary to make the Hamiltonian time-dependent. Feynman discovered a way to make the computational Hamiltonian both local and time-independent by incorporating the direction of computation in the initial condition. In Feynman's quantum computer, the program is a carefully prepared wave packet that propagates through different computational states. Deutsch presented a quantum computer that exploits the possibility of existing in a superposition of computational states to perform tasks that a classical computer cannot, such as generating purely random numbers, and carrying out superpositions of computations as a method of parallel processing. In this paper, we show that such computers, by virtue of their common function, possess a common form for their quantum dynamics

  12. Exploiting opportunistic resources for ATLAS with ARC CE and the Event Service

    Science.gov (United States)

    Cameron, D.; Filipčič, A.; Guan, W.; Tsulaia, V.; Walker, R.; Wenaus, T.; ATLAS Collaboration

    2017-10-01

    With ever-greater computing needs and fixed budgets, big scientific experiments are turning to opportunistic resources as a means to add much-needed extra computing power. These resources can be very different in design from those that comprise the Grid computing of most experiments, therefore exploiting them requires a change in strategy for the experiment. They may be highly restrictive in what can be run or in connections to the outside world, or tolerate opportunistic usage only on condition that tasks may be terminated without warning. The Advanced Resource Connector Computing Element (ARC CE) with its nonintrusive architecture is designed to integrate resources such as High Performance Computing (HPC) systems into a computing Grid. The ATLAS experiment developed the ATLAS Event Service (AES) primarily to address the issue of jobs that can be terminated at any point when opportunistic computing capacity is needed by someone else. This paper describes the integration of these two systems in order to exploit opportunistic resources for ATLAS in a restrictive environment. In addition to the technical details, results from deployment of this solution in the SuperMUC HPC centre in Munich are shown.

  13. Exploiting Analytics Techniques in CMS Computing Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Bonacorsi, D. [Bologna U.; Kuznetsov, V. [Cornell U.; Magini, N. [Fermilab; Repečka, A. [Vilnius U.; Vaandering, E. [Fermilab

    2017-11-22

    The CMS experiment has collected an enormous volume of metadata about its computing operations in its monitoring systems, describing its experience in operating all of the CMS workflows on all of the Worldwide LHC Computing Grid tiers. Data mining efforts over all this information have rarely been undertaken, but they are of crucial importance for a better understanding of how CMS achieved successful operations, and for reaching an adequate and adaptive model of CMS operations that would allow detailed optimizations and eventually a prediction of system behaviour. These data are now streamed into the CERN Hadoop data cluster for further analysis. Specific sets of information (e.g. data on how many replicas of datasets CMS wrote on disks at WLCG tiers, data on which datasets were primarily requested for analysis, etc.) were collected on Hadoop and processed with MapReduce applications profiting from the parallelization on the Hadoop cluster. We present the implementation of new monitoring applications on Hadoop and discuss the new possibilities in CMS computing monitoring introduced by the ability to quickly process big data sets from multiple sources, looking forward to a predictive modeling of the system.

  14. Exploiting NASA's Cumulus Earth Science Cloud Archive with Services and Computation

    Science.gov (United States)

    Pilone, D.; Quinn, P.; Jazayeri, A.; Schuler, I.; Plofchan, P.; Baynes, K.; Ramachandran, R.

    2017-12-01

    NASA's Earth Observing System Data and Information System (EOSDIS) houses nearly 30 PB of critical Earth Science data and, with upcoming missions, is expected to balloon to 200-300 PB over the next seven years. In addition to the massive increase in data collected, researchers and application developers want more and faster access - enabling complex visualizations, long time-series analysis, and cross dataset research without needing to copy and manage massive amounts of data locally. NASA has started prototyping with commercial cloud providers to make this data available in elastic cloud compute environments, allowing application developers direct access to the massive EOSDIS holdings. In this talk we'll explain the principles behind the archive architecture and share our experience of dealing with large amounts of data with serverless architectures including AWS Lambda, the Elastic Container Service (ECS) for long running jobs, and why we dropped thousands of lines of code for AWS Step Functions. We'll discuss best practices and patterns for accessing and using data available in a shared object store (S3) and leveraging events and message passing for sophisticated and highly scalable processing and analysis workflows. Finally we'll share capabilities NASA and cloud services are making available on the archives to enable massively scalable analysis and computation in a variety of formats and tools.

  15. Challenges of Replacing NAD 83, NAVD 88, and IGLD 85: Exploiting the Characteristics of 3-D Digital Spatial Data

    Science.gov (United States)

    Burkholder, E. F.

    2016-12-01

    One way to address challenges of replacing NAD 83, NGVD 88 and IGLD 85 is to exploit the characteristics of 3-D digital spatial data. This presentation describes the 3-D global spatial data model (GSDM) which accommodates rigorous scientific endeavors while simultaneously supporting a local flat-earth view of the world. The GSDM is based upon the assumption of a single origin for 3-D spatial data and uses rules of solid geometry for manipulating spatial data components. This approach exploits the characteristics of 3-D digital spatial data and preserves the quality of geodetic measurements while providing spatial data users the option of working with rectangular flat-earth components and computational procedures for local applications. This flexibility is provided by using a bidirectional rotation matrix that allows any 3-D vector to be used in a geodetic reference frame for high-end applications and/or the local frame for flat-earth users. The GSDM is viewed as compatible with the datum products being developed by NGS and provides for unambiguous exchange of 3-D spatial data between disciplines and users worldwide. Three geometrical models will be summarized - geodetic, map projection, and 3-D. Geodetic computations are performed on an ellipsoid and are without equal in providing rigorous coordinate values for latitude, longitude, and ellipsoid height. Members of the user community have, for generations, sought ways to "flatten the world" to accommodate a flat-earth view and to avoid the complexity of working on an ellipsoid. Map projections have been defined for a wide variety of applications and remain very useful for visualizing spatial data. But, the GSDM supports computations based on 3-D components that have not been distorted in a 2-D map projection. The GSDM does not invalidate either geodesy or cartographic computational processes but provides a geometrically correct view of any point cloud from any point selected by the user. As a bonus, the GSDM also
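
    The bidirectional rotation matrix mentioned above is, in the standard treatment, the orthogonal matrix relating Earth-centered Earth-fixed (ECEF) vector components to local east-north-up (ENU) components; because it is orthogonal, its transpose rotates back without loss. A sketch under that standard convention (the latitude, longitude, and example vector are arbitrary illustrative values, not from the abstract):

    ```python
    import numpy as np

    def ecef_to_enu_rotation(lat_deg, lon_deg):
        """Rotation matrix taking global ECEF vector components to local
        east-north-up (ENU) components at a given latitude/longitude; its
        transpose performs the inverse rotation (the bidirectional property)."""
        lat, lon = np.radians([lat_deg, lon_deg])
        sl, cl = np.sin(lat), np.cos(lat)
        so, co = np.sin(lon), np.cos(lon)
        return np.array([
            [-so,       co,      0.0],   # east
            [-sl * co, -sl * so, cl],    # north
            [ cl * co,  cl * so, sl],    # up
        ])

    R = ecef_to_enu_rotation(35.0, -106.0)
    dX = np.array([100.0, -50.0, 80.0])  # ECEF vector between two points [m]
    enu = R @ dX                         # local flat-earth components
    ecef_back = R.T @ enu                # lossless return to the global frame
    print(enu, np.allclose(ecef_back, dX))
    ```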

  16. The Ethics of Exploitation

    Directory of Open Access Journals (Sweden)

    Paul McLaughlin

    2008-11-01

    Full Text Available Philosophical inquiry into exploitation has two major deficiencies to date: it assumes that exploitation is wrong by definition; and it pays too much attention to the Marxian account of exploitation. Two senses of exploitation should be distinguished: the ‘moral’ or pejorative sense and the ‘non-moral’ or ‘non-prejudicial’ sense. By demonstrating the conceptual inadequacy of exploitation as defined in the first sense, and by defining exploitation adequately in the latter sense, we seek to demonstrate the moral complexity of exploitation. We contend, moreover, that moral evaluation of exploitation is only possible once we abandon a strictly Marxian framework and attempt, in the long run, to develop an integral ethic along Godwinian lines.

  17. Exploiting graphics processing units for computational biology and bioinformatics.

    Science.gov (United States)

    Payne, Joshua L; Sinnott-Armstrong, Nicholas A; Moore, Jason H

    2010-09-01

    Advances in the video gaming industry have led to the production of low-cost, high-performance graphics processing units (GPUs) that possess more memory bandwidth and computational capability than central processing units (CPUs), the standard workhorses of scientific computing. With the recent release of general-purpose GPUs and NVIDIA's GPU programming language, CUDA, graphics engines are being adopted widely in scientific computing applications, particularly in the fields of computational biology and bioinformatics. The goal of this article is to concisely present an introduction to GPU hardware and programming, aimed at the computational biologist or bioinformaticist. To this end, we discuss the primary differences between GPU and CPU architecture, introduce the basics of the CUDA programming language, and discuss important CUDA programming practices, such as the proper use of coalesced reads, data types, and memory hierarchies. We highlight each of these topics in the context of computing the all-pairs distance between instances in a dataset, a common procedure in numerous disciplines of scientific computing. We conclude with a runtime analysis of the GPU and CPU implementations of the all-pairs distance calculation. We show our final GPU implementation to outperform the CPU implementation by a factor of 1700.
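
    The all-pairs distance computation highlighted in the article maps naturally onto dense linear algebra, which is exactly what makes it GPU-friendly. A CPU-side NumPy sketch of the standard reformulation follows; on a GPU the same expression can run via CUDA or a drop-in GPU array library, and the dataset sizes here are illustrative:

    ```python
    import numpy as np

    def all_pairs_distances(X):
        """Euclidean distance between every pair of rows of X, via the
        expansion ||a-b||^2 = ||a||^2 + ||b||^2 - 2 a.b, which maps the
        bulk of the work onto one large matrix multiply."""
        sq = np.sum(X * X, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
        np.maximum(d2, 0.0, out=d2)   # clamp tiny negatives from rounding
        return np.sqrt(d2)

    X = np.random.default_rng(6).random((2000, 32))  # instances x features
    D = all_pairs_distances(X)
    print(D.shape, D[0, :3])
    ```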

  18. Cloud Based Earth Observation Data Exploitation Platforms

    Science.gov (United States)

    Romeo, A.; Pinto, S.; Loekken, S.; Marin, A.

    2017-12-01

    In the last few years data produced daily by several private and public Earth Observation (EO) satellites reached the order of tens of Terabytes, representing for scientists and commercial application developers both a big opportunity for their exploitation and a challenge for their management. New IT technologies, such as Big Data and cloud computing, enable the creation of web-accessible data exploitation platforms, which offer to scientists and application developers the means to access and use EO data in a quick and cost effective way. RHEA Group is particularly active in this sector, supporting the European Space Agency (ESA) in the Exploitation Platforms (EP) initiative, developing technology to build multi cloud platforms for the processing and analysis of Earth Observation data, and collaborating with larger European initiatives such as the European Plate Observing System (EPOS) and the European Open Science Cloud (EOSC). An EP is a virtual workspace, providing a user community with access to (i) large volume of data, (ii) algorithm development and integration environment, (iii) processing software and services (e.g. toolboxes, visualization routines), (iv) computing resources, (v) collaboration tools (e.g. forums, wiki, etc.). When an EP is dedicated to a specific Theme, it becomes a Thematic Exploitation Platform (TEP). Currently, ESA has seven TEPs in a pre-operational phase dedicated to geo-hazards monitoring and prevention, costal zones, forestry areas, hydrology, polar regions, urban areas and food security. On the technology development side, solutions like the multi cloud EO data processing platform provides the technology to integrate ICT resources and EO data from different vendors in a single platform. In particular it offers (i) Multi-cloud data discovery, (ii) Multi-cloud data management and access and (iii) Multi-cloud application deployment. This platform has been demonstrated with the EGI Federated Cloud, Innovation Platform Testbed Poland

  19. Transnational (Dis)connection in localizing personal computing in the Netherlands, 1975-1990

    NARCIS (Netherlands)

    Veraart, F.C.A.; Alberts, G.; Oldenziel, R.

    2014-01-01

    Examining the diffusion and domestication of computer technologies in Dutch households and schools during the 1980s and 1990s, this chapter shows that the process was not a simple story of adoption of American models. Instead, many Dutch actors adapted computer technologies to their own local needs,

  20. Automatic programming via iterated local search for dynamic job shop scheduling.

    Science.gov (United States)

    Nguyen, Su; Zhang, Mengjie; Johnston, Mark; Tan, Kay Chen

    2015-01-01

    Dispatching rules have been commonly used in practice for making sequencing and scheduling decisions. Due to specific characteristics of each manufacturing system, there is no universal dispatching rule that can dominate in all situations. Therefore, it is important to design specialized dispatching rules to enhance the scheduling performance for each manufacturing environment. Evolutionary computation approaches such as tree-based genetic programming (TGP) and gene expression programming (GEP) have been proposed to facilitate the design task through automatic design of dispatching rules. However, these methods are still limited by their high computational cost and low exploitation ability. To overcome this problem, we develop a new approach to automatic programming via iterated local search (APRILS) for dynamic job shop scheduling. The key idea of APRILS is to perform multiple local searches started with programs modified from the best obtained programs so far. The experiments show that APRILS outperforms TGP and GEP in most simulation scenarios in terms of effectiveness and efficiency. The analysis also shows that programs generated by APRILS are more compact than those obtained by genetic programming. An investigation of the behavior of APRILS suggests that the good performance of APRILS comes from the balance between exploration and exploitation in its search mechanism.
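
    Stripped of the genetic-programming specifics, iterated local search is a short loop: local-search to a local optimum, perturb it, local-search again, and keep the better solution. A generic skeleton with a toy one-dimensional cost is sketched below; the scheduling application would plug in program-space neighborhood and perturbation operators instead, and nothing here is APRILS's actual code:

    ```python
    import random

    def iterated_local_search(initial, local_search, perturb, cost, iters=50):
        """Generic ILS skeleton: repeatedly perturb the best-so-far solution
        and re-run local search from it, keeping improvements."""
        best = local_search(initial)
        for _ in range(iters):
            candidate = local_search(perturb(best))
            if cost(candidate) < cost(best):
                best = candidate
        return best

    # Toy instance: minimize a 1-D function over the integers
    cost = lambda x: (x - 17) ** 2

    def local_search(x):              # hill-climb among integer neighbors
        while True:
            nbr = min((x - 1, x + 1), key=cost)
            if cost(nbr) >= cost(x):
                return x
            x = nbr

    def perturb(x):                   # random kick to escape local optima
        return x + random.randint(-10, 10)

    print(iterated_local_search(0, local_search, perturb, cost))
    ```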

  1. Computational modeling of local hemodynamics phenomena: methods, tools and clinical applications

    International Nuclear Information System (INIS)

    Ponzini, R.; Rizzo, G.; Vergara, C.; Veneziani, A.; Morbiducci, U.; Montevecchi, F.M.; Redaelli, A.

    2009-01-01

    Local hemodynamics plays a key role in the onset of vessel wall pathophysiology, with peculiar blood flow structures (i.e. spatial velocity profiles, vortices, re-circulating zones, helical patterns and so on) characterizing the behavior of specific vascular districts. Thanks to evolving technologies in computer science, mathematical modeling and hardware performance, the study of local hemodynamics can today also make use of a virtual environment to perform hypothesis testing, product development, protocol design and methods validation that just a couple of decades ago would not have been thinkable. Computational fluid dynamics (CFD) appears to be more than a complementary partner to in vitro modeling and a possible substitute for animal models, furnishing a privileged environment for cheap, fast and reproducible data generation.

  2. April 1977 The Cape gurnard is a commercially exploited species of ...

    African Journals Online (AJOL)

    The Cape gurnard is a commercially exploited species of which the annual landings between ... fishing operations took place along the eastern Cape coast of South Africa ..... Handbook of computation for biological statistics of fish populations.

  3. Real-time image dehazing using local adaptive neighborhoods and dark-channel-prior

    Science.gov (United States)

    Valderrama, Jesus A.; Díaz-Ramírez, Víctor H.; Kober, Vitaly; Hernandez, Enrique

    2015-09-01

    A real-time algorithm for single-image dehazing is presented. The algorithm is based on the calculation of local neighborhoods of a hazy image inside a moving window, with the local neighborhoods constructed by computing rank-order statistics. Next, the dark-channel-prior approach is applied to the local neighborhoods to estimate the transmission function of the scene. With the suggested approach there is no need to apply a refining algorithm, such as soft matting, to the estimated transmission. To achieve high-rate signal processing, the proposed algorithm is implemented exploiting massive parallelism on a graphics processing unit (GPU). Computer simulations are carried out to test the performance of the proposed algorithm in terms of dehazing efficiency and speed of processing. These tests are performed using several synthetic and real images. The obtained results are analyzed and compared with those obtained with existing dehazing algorithms.
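
    For orientation, the dark-channel-prior computation that the algorithm builds on can be stated compactly: take the per-pixel RGB minimum, apply a local minimum filter, and read the transmission off the normalized result. The sketch below is the standard pipeline, not the paper's rank-order-statistics neighborhood variant or its GPU implementation; the airlight vector and window size are assumed values:

    ```python
    import numpy as np
    from scipy.ndimage import minimum_filter

    def dark_channel(img, window=15):
        """Per-pixel minimum over RGB, then a local minimum filter:
        the 'dark channel' of the prior."""
        return minimum_filter(img.min(axis=2), size=window)

    def estimate_transmission(img, atmosphere, omega=0.95, window=15):
        """t(x) = 1 - omega * dark_channel(I / A): haze-free regions have
        a near-zero dark channel, so bright dark-channel values signal haze."""
        normalized = img / atmosphere[None, None, :]
        return 1.0 - omega * dark_channel(normalized, window)

    img = np.random.default_rng(7).random((120, 160, 3))  # stand-in hazy image
    A = np.array([0.9, 0.92, 0.95])                       # assumed airlight
    t = np.clip(estimate_transmission(img, A), 0.1, 1.0)
    dehazed = (img - A) / t[..., None] + A                # recover radiance
    print(t.min(), t.max(), dehazed.shape)
    ```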

  4. Local knowledge and exploitation of the avian fauna by a rural community in the semi-arid zone of northeastern Brazil.

    Science.gov (United States)

    Teixeira, Pedro Hudson Rodrigues; Thel, Thiago do Nascimento; Ferreira, Jullio Marques Rocha; de Azevedo, Severino Mendes; Junior, Wallace Rodrigues Telino; Lyra-Neves, Rachel Maria

    2014-12-24

    The present study examined the exploitation of bird species by the residents of a rural community in the Brazilian semi-arid zone, and their preferences for species with different characteristics. The 24 informants were identified using the "snowball" approach, and were interviewed using semi-structured questionnaires and check-sheets for the collection of data on their relationship with the bird species that occur in the region. The characteristics that most attract the attention of the interviewees were the song and the coloration of the plumage of a bird, as well as its body size, which determines its potential as a game species, given that hunting is an important activity in the region. A total of 98 species representing 32 families (50.7% of the species known to occur in the region) were reported during interviews, being used for meat, pets, and medicinal purposes. Several species were used as zootherapeutics: the White-naped Jay was eaten whole as a cure for speech problems, the feathers of the Yellow-legged Tinamou were used for snakebite, the Smooth-billed Ani was eaten for "chronic cough", and the Small-billed Tinamou and Tataupa Tinamou were used for locomotion problems. The preference of the informants for characteristics such as birdsong and colorful plumage was a significant determinant of their preference for the species exploited. Birds with cynegetic potential and high use values were also among the most preferred species. Despite the highly significant preferences for certain species, some birds, such as those of the families Trochilidae, Thamnophilidae, and Tyrannidae, are hunted randomly, independently of their attributes. The evidence collected on the criteria applied by local specialists for the exploitation of the bird fauna permitted the identification of the species that suffer hunting pressure, providing guidelines for the development of conservation and management strategies that will guarantee the long-term survival of the populations of these bird species in

  5. Atomic orbital-based SOS-MP2 with tensor hypercontraction. II. Local tensor hypercontraction

    Science.gov (United States)

    Song, Chenchen; Martínez, Todd J.

    2017-01-01

    In the first paper of the series [Paper I, C. Song and T. J. Martinez, J. Chem. Phys. 144, 174111 (2016)], we showed how tensor-hypercontracted (THC) SOS-MP2 could be accelerated by exploiting sparsity in the atomic orbitals and using graphical processing units (GPUs). This reduced the formal scaling of the SOS-MP2 energy calculation to cubic with respect to system size. The computational bottleneck then becomes the THC metric matrix inversion, which scales cubically with a large prefactor. In this work, the local THC approximation is proposed to reduce the computational cost of inverting the THC metric matrix to linear scaling with respect to molecular size. By doing so, we have removed the primary bottleneck to THC-SOS-MP2 calculations on large molecules with O(1000) atoms. The errors introduced by the local THC approximation are less than 0.6 kcal/mol for molecules with up to 200 atoms and 3300 basis functions. Together with the graphical processing unit techniques and locality-exploiting approaches introduced in previous work, the scaled opposite spin MP2 (SOS-MP2) calculations exhibit O(N2.5) scaling in practice up to 10 000 basis functions. The new algorithms make it feasible to carry out SOS-MP2 calculations on small proteins like ubiquitin (1231 atoms/10 294 atomic basis functions) on a single node in less than a day.

  6. The exploitation of swamp plants for dewatering liquid sewage sludge

    Directory of Open Access Journals (Sweden)

    Jiří Šálek

    2006-01-01

    Full Text Available The operators of small rural wastewater treatment plants are interested in the economic exploitation of sewage sludge under local conditions. One option is to seek simple, natural ways of processing and exploiting stabilized sewage sludge in agriculture. A manure substrate was obtained by composting dewatered sewage sludge, including the residual plant biomass, after closing a 6–8 year period of filling the basin with liquid sewage sludge. The main attention was focused on the exploitation of swamp plants for dewatering liquid sewage sludge, on the determination of the influence of sewage sludge on the plants, on the intensity and course of evapotranspiration, and on the design and setup of drying beds. On the basis of the determined evapotranspiration capacity of swamp plants, recommendations for the design and operation of sludge bed facilities under the conditions of a small rural wastewater treatment plant were prepared.

  7. Augmented Lagrange Programming Neural Network for Localization Using Time-Difference-of-Arrival Measurements.

    Science.gov (United States)

    Han, Zifa; Leung, Chi Sing; So, Hing Cheung; Constantinides, Anthony George

    2017-08-15

    A commonly used measurement model for locating a mobile source is time-difference-of-arrival (TDOA). As each TDOA measurement defines a hyperbola, it is not straightforward to compute the mobile source position due to the nonlinear relationship in the measurements. This brief exploits the Lagrange programming neural network (LPNN), which provides a general framework to solve nonlinear constrained optimization problems, for the TDOA-based localization. The local stability of the proposed LPNN solution is also analyzed. Simulation results are included to evaluate the localization accuracy of the LPNN scheme by comparing with the state-of-the-art methods and the optimality benchmark of Cramér-Rao lower bound.
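
    The hyperbolic geometry arises directly from the measurement model. In the usual formulation (notation assumed here, not quoted from the brief), with source position x, sensor positions s_i, propagation speed c, and measurement noise n_i:

    ```latex
    r_i \;=\; c\,\Delta t_i
        \;=\; \lVert \mathbf{x} - \mathbf{s}_i \rVert
          \;-\; \lVert \mathbf{x} - \mathbf{s}_1 \rVert \;+\; n_i,
        \qquad i = 2, \dots, M
    ```

    Each noise-free equation constrains the source to one sheet of a hyperboloid with foci s_i and s_1, which is why the position cannot be read off linearly and a constrained nonlinear solver such as the LPNN is attractive.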

  8. Unconventional Quantum Computing Devices

    OpenAIRE

    Lloyd, Seth

    2000-01-01

    This paper investigates a variety of unconventional quantum computation devices, including fermionic quantum computers and computers that exploit nonlinear quantum mechanics. It is shown that unconventional quantum computing devices can in principle compute some quantities more rapidly than `conventional' quantum computers.

  9. Impact of tidal phenomenon “Met Ef” on the exploitation of benthos at inshore shoals in Kei islands, Indonesia

    Science.gov (United States)

    Renjaan, Eugenius Alfred; Makailipessy, Marvin Mario

    2017-10-01

    A study on benthos exploitation at five shoals located in the Rosenberg and Nerong Straits, Kei Islands, was conducted in 2016 during the period of the lowest ebb tide phenomenon, locally termed Met Ef. The purpose of the study was to determine the impact of Met Ef on the exploitation of benthos at the shoals by local communities. Data on tidal amplitudes were obtained from the Tide Charts mobile application and confirmed against tide-pole observations during October, November, and December from 2013 to 2016. Data on the benthos exploited during Met Ef periods at the shoals were obtained through direct observation of the benthos taken by local communities and by interviewing them using questionnaires. The results showed that the lowest ebb tide of Met Ef occurred in November, i.e., 2 to 5 days after the full moon and/or new moon, with an average tidal range of 2.66 m, on one occasion reaching 2.80 m. The benthos most exploited at the shoals are the giant clam (Tridacna sp.), spider conch (Lambis sp.), hammer oyster (Malleus sp.), and octopus (Octopus spp.). The intensity of benthos exploitation at the shoals increased during the Met Ef period, especially in October, because at that time the sea was very calm and clear owing to relatively low wind speeds and lower rainfall. This promoted easier access for the communities to exploit benthos at the shoals, and October is therefore considered by the local communities to be the peak of Met Ef, rather than November. During November and December the availability of benthos at the shoals was reduced because it had been exploited intensely in October.

  10. Fault-tolerant quantum computation for local non-Markovian noise

    International Nuclear Information System (INIS)

    Terhal, Barbara M.; Burkard, Guido

    2005-01-01

    We derive a threshold result for fault-tolerant quantum computation for local non-Markovian noise models. The role of error amplitude in our analysis is played by the product of the elementary gate time t_0 and the spectral width of the interaction Hamiltonian between system and bath. We discuss extensions of our model and the applicability of our analysis

  11. Novel Ethernet Based Optical Local Area Networks for Computer Interconnection

    NARCIS (Netherlands)

    Radovanovic, Igor; van Etten, Wim; Taniman, R.O.; Kleinkiskamp, Ronny

    2003-01-01

    In this paper we present new optical local area networks for fiber-to-the-desk applications. The presented networks are expected to provide a solution for bringing optical fiber all the way to computers. To bring the overall implementation costs down we have based our networks on short-wavelength optical

  12. Exploiting communication concurrency on high performance computing systems

    Energy Technology Data Exchange (ETDEWEB)

    Chaimov, Nicholas [Univ. of Oregon, Eugene, OR (United States); Ibrahim, Khaled Z. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Iancu, Costin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-01-01

    Although logically available, applications may not exploit enough instantaneous communication concurrency to maximize hardware utilization on HPC systems. This is exacerbated in hybrid programming models such as SPMD+OpenMP. We present the design of a "multi-threaded" runtime able to transparently increase the instantaneous network concurrency and to provide near saturation bandwidth, independent of the application configuration and dynamic behavior. The runtime forwards communication requests from application level tasks to multiple communication servers. Our techniques alleviate the need for spatial and temporal application level message concurrency optimizations. Experimental results show improved message throughput and bandwidth by as much as 150% for 4 KB messages on InfiniBand and by as much as 120% for 4 KB messages on Cray Aries. For more complex operations such as all-to-all collectives, we observe as much as 30% speedup. This translates into 23% speedup on 12,288 cores for a NAS FT implemented using FFTW. We also observe as much as 76% speedup on 1,500 cores for an already optimized UPC+OpenMP geometric multigrid application using hybrid parallelism.
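
    The core idea, forwarding communication requests from application-level tasks to multiple communication servers, can be sketched as follows. This is a toy Python analogue using threads and a shared queue; all names are hypothetical, and the real runtime drives network endpoints rather than the no-op shown here.

        import queue
        import threading

        send_queue = queue.Queue()

        def comm_server(server_id):
            # each server drains requests concurrently on its own "endpoint"
            while True:
                msg = send_queue.get()
                if msg is None:          # shutdown sentinel
                    break
                _ = len(msg)             # stand-in for an actual network send

        servers = [threading.Thread(target=comm_server, args=(i,)) for i in range(4)]
        for s in servers:
            s.start()

        for _ in range(64):              # an application-level task posts requests
            send_queue.put(b"x" * 4096)  # 4 KB messages, as in the experiments

        for _ in servers:                # one sentinel per server, then join
            send_queue.put(None)
        for s in servers:
            s.join()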

  13. Exploiting CMS data popularity to model the evolution of data management for Run-2 and beyond

    CERN Document Server

    Bonacorsi, D; Giordano, D; Girone, M; Neri, M; Magini, N; Kuznetsov, V; Wildish, T

    2015-01-01

    During the LHC Run-1 data taking, all experiments collected large data volumes from proton-proton and heavy-ion collisions. The collisions data, together with massive volumes of simulated data, were replicated in multiple copies, transferred among various Tier levels, transformed/slimmed in format/content. These data were then accessed (both locally and remotely) by large groups of distributed analysis communities exploiting the WorldWide LHC Computing Grid infrastructure and services. While efficient data placement strategies - together with optimal data redistribution and deletions on demand - have become the core of static versus dynamic data management projects, little effort has so far been invested in understanding the detailed data-access patterns which surfaced in Run-1. These patterns, if understood, can be used as input to simulation of computing models at the LHC, to optimise existing systems by tuning their behaviour, and to explore next-generation CPU/storage/network co-scheduling solutions. This...

  14. Digital optical computers at the optoelectronic computing systems center

    Science.gov (United States)

    Jordan, Harry F.

    1991-01-01

    The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.

  15. Application of local computer networks in nuclear-physical experiments and technology

    International Nuclear Information System (INIS)

    Foteev, V.A.

    1986-01-01

    The design principles, comparative performance and potential of local computer networks with respect to their application in physical experiments are considered. The principle of operation of local networks is illustrated with the Ethernet network, and results from analyses of their operating performance are given. Examples of operating local networks in nuclear-physics research and nuclear technology are presented: the networks of the Japan Atomic Energy Research Institute, the University of California and Los Alamos National Laboratory, network implementations based on the DECnet and Fastbus programs, in-house network configurations of the USSR Academy of Sciences and the JINR Neutron Physics Laboratory, etc. It is shown that local networks allow a significant increase in data-processing productivity

  16. Bio-inspired computational heuristics to study Lane-Emden systems arising in astrophysics model.

    Science.gov (United States)

    Ahmad, Iftikhar; Raja, Muhammad Asif Zahoor; Bilal, Muhammad; Ashraf, Farooq

    2016-01-01

    This study reports novel hybrid computational methods for the solution of the nonlinear singular Lane-Emden type differential equations arising in astrophysics models, exploiting the strength of unsupervised neural network models and stochastic optimization techniques. In the scheme, a neural network, part of the larger field called soft computing, is exploited to model the equation in an unsupervised manner. The proposed approximate solutions of the higher order ordinary differential equation are calculated with the weights of neural networks trained with a genetic algorithm, and with pattern search hybridized with sequential quadratic programming for rapid local convergence. The results of the proposed solvers for the nonlinear singular systems are in good agreement with the standard solutions. Accuracy and convergence of the design schemes are demonstrated by statistical performance measures based on a sufficiently large number of independent runs.
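
    A minimal sketch of the unsupervised formulation for the Lane-Emden equation y'' + (2/x)y' + y^m = 0 with y(0)=1, y'(0)=0 is given below: a tiny tanh network inside a trial form that enforces the initial conditions, trained here by plain random search standing in for the paper's GA/pattern-search/SQP hybrid. All sizes and constants are illustrative.

        import numpy as np

        m = 5                                  # polytropic index (illustrative)
        x = np.linspace(0.05, 1.0, 40)         # collocation points (avoid x = 0)

        def trial(w, xs):
            # y_hat(x) = 1 + x^2 * N(x) satisfies y(0) = 1 and y'(0) = 0
            a, b, v = w[:5], w[5:10], w[10:15]
            return 1.0 + xs**2 * (np.tanh(np.outer(xs, a) + b) @ v)

        def residual_mse(w, h=1e-4):
            y0, yp, ym = trial(w, x), trial(w, x + h), trial(w, x - h)
            d1 = (yp - ym) / (2 * h)           # finite-difference y'
            d2 = (yp - 2 * y0 + ym) / h**2     # finite-difference y''
            return np.mean((d2 + (2.0 / x) * d1 + y0**m) ** 2)

        rng = np.random.default_rng(0)
        best = rng.normal(size=15)
        best_f = residual_mse(best)
        for _ in range(5000):                  # random search stands in for GA/PS-SQP
            cand = best + 0.1 * rng.normal(size=15)
            f = residual_mse(cand)
            if f < best_f:
                best, best_f = cand, f
        print(best_f)                          # residual MSE shrinks toward 0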

  17. G-LoSA: An efficient computational tool for local structure-centric biological studies and drug design.

    Science.gov (United States)

    Lee, Hui Sun; Im, Wonpil

    2016-04-01

    Molecular recognition by protein mostly occurs in a local region on the protein surface. Thus, an efficient computational method for accurate characterization of protein local structural conservation is necessary to better understand biology and drug design. We present a novel local structure alignment tool, G-LoSA. G-LoSA aligns protein local structures in a sequence order independent way and provides a GA-score, a chemical feature-based and size-independent structure similarity score. Our benchmark validation shows the robust performance of G-LoSA to the local structures of diverse sizes and characteristics, demonstrating its universal applicability to local structure-centric comparative biology studies. In particular, G-LoSA is highly effective in detecting conserved local regions on the entire surface of a given protein. In addition, the applications of G-LoSA to identifying template ligands and predicting ligand and protein binding sites illustrate its strong potential for computer-aided drug design. We hope that G-LoSA can be a useful computational method for exploring interesting biological problems through large-scale comparison of protein local structures and facilitating drug discovery research and development. G-LoSA is freely available to academic users at http://im.compbio.ku.edu/GLoSA/. © 2016 The Protein Society.

  18. The role of computers in developing countries with reference to East Africa

    International Nuclear Information System (INIS)

    Shayo, L.K.

    1984-01-01

    The role of computers in economic and technological development is examined with particular reference to developing countries. It is stressed that these countries must exploit the potential of computers in their striving to catch up in the development race. The shortage of qualified EDP personnel is singled out as one of the most critical factors behind any unsatisfactory state of computer applications. A computerization policy based on the demand for information generated by the sophistication of the development process, and supported by a sufficient core of qualified local manpower, is recommended. The situation in East Africa is discussed and recommendations for training and for the production of telematics equipment are made. (author)

  19. Power Consumption Evaluation of Distributed Computing Network Considering Traffic Locality

    Science.gov (United States)

    Ogawa, Yukio; Hasegawa, Go; Murata, Masayuki

    When computing resources are consolidated in a few huge data centers, a massive amount of data is transferred to each data center over a wide area network (WAN). This results in increased power consumption in the WAN. A distributed computing network (DCN), such as a content delivery network, can reduce the traffic from/to the data center, thereby decreasing the power consumed in the WAN. In this paper, we focus on the energy-saving aspect of the DCN and evaluate its effectiveness, especially considering traffic locality, i.e., the amount of traffic related to the geographical vicinity. We first formulate the problem of optimizing the DCN power consumption and describe the DCN in detail. Then, numerical evaluations show that, when there is strong traffic locality and the router has ideal energy proportionality, the system's power consumption is reduced to about 50% of the power consumed in the case where a DCN is not used; moreover, this advantage becomes even larger (up to about 30%) when the data center is located farthest from the center of the network topology.

  20. A local computer network for the experimental data acquisition at BESSY

    International Nuclear Information System (INIS)

    Buchholz, W.

    1984-01-01

    For the users of the Berlin dedicated electron storage ring for synchrotron radiation (BESSY) a local computer network has been installed. The system is designed primarily for data acquisition and offers the users a generous hardware provision combined with maximum software flexibility

  1. Initialization and Restart in Stochastic Local Search: Computing a Most Probable Explanation in Bayesian Networks

    Science.gov (United States)

    Mengshoel, Ole J.; Wilkins, David C.; Roth, Dan

    2010-01-01

    For hard computational problems, stochastic local search has proven to be a competitive approach to finding optimal or approximately optimal problem solutions. Two key research questions for stochastic local search algorithms are: Which algorithms are effective for initialization? When should the search process be restarted? In the present work we investigate these research questions in the context of approximate computation of most probable explanations (MPEs) in Bayesian networks (BNs). We introduce a novel approach, based on the Viterbi algorithm, to explanation initialization in BNs. While the Viterbi algorithm works on sequences and trees, our approach works on BNs with arbitrary topologies. We also give a novel formalization of stochastic local search, with focus on initialization and restart, using probability theory and mixture models. Experimentally, we apply our methods to the problem of MPE computation, using a stochastic local search algorithm known as Stochastic Greedy Search. By carefully optimizing both initialization and restart, we reduce the MPE search time for application BNs by several orders of magnitude compared to using uniform at random initialization without restart. On several BNs from applications, the performance of Stochastic Greedy Search is competitive with clique tree clustering, a state-of-the-art exact algorithm used for MPE computation in BNs.
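
    The initialization-plus-restart skeleton common to such stochastic local search algorithms can be sketched as below, on a toy scoring function standing in for the MPE objective. The uniform-at-random initializer marks where the paper's Viterbi-based initializer would plug in; all parameters are illustrative.

        import random

        N, FLIPS, RESTARTS = 30, 200, 20
        random.seed(0)
        weights = [random.uniform(-1, 1) for _ in range(N)]

        def score(assign):                     # toy stand-in for log P(x, e)
            return sum(w if a else -w for w, a in zip(weights, assign))

        def init_assignment():                 # uniform-at-random initializer;
            return [random.random() < 0.5      # a Viterbi-style one would go here
                    for _ in range(N)]

        best_s = float("-inf")
        for _ in range(RESTARTS):              # restart loop
            a = init_assignment()
            s = score(a)
            for _ in range(FLIPS):             # noisy greedy local moves
                i = random.randrange(N)
                a[i] = not a[i]                # flip one variable
                s2 = score(a)
                if s2 >= s:
                    s = s2                     # keep non-worsening flips
                else:
                    a[i] = not a[i]            # revert worsening flips
            best_s = max(best_s, s)
        print(best_s)                          # approaches sum(|w|) on this toy problem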

  2. Computed tomography findings after radiofrequency ablation in locally advanced pancreatic cancer

    NARCIS (Netherlands)

    Rombouts, Steffi J. E.; Derksen, Tyche C.; Nio, Chung Y.; van Hillegersberg, Richard; van Santvoort, Hjalmar C.; Walma, Marieke S.; Molenaar, Izaak Q.; van Leeuwen, Maarten S.

    2018-01-01

    The purpose of the study was to provide a systematic evaluation of the computed tomography (CT) findings after radiofrequency ablation (RFA) in locally advanced pancreatic cancer (LAPC). Eighteen patients with intra-operative RFA-treated LAPC were included in a prospective case series. All CT-scans

  3. Exploitation of Labour and Exploitation of Commodities: a “New Interpretation”

    OpenAIRE

    Veneziani, Roberto; Yoshihara, Naoki

    2011-01-01

    In the standard Okishio-Morishima approach, the existence of profits is proved to be equivalent to the exploitation of labour. Yet, it can also be proved that the existence of profits is equivalent to the ‘exploitation’ of any good. Labour and commodity exploitation are just different numerical representations of the productiveness of the economy. This paper presents an alternative approach to exploitation theory which is related to the New Interpretation (Duménil 1980; Foley 1982). In this a...

  4. Computed tomography of localized dilatation of the intrahepatic bile ducts

    International Nuclear Information System (INIS)

    Araki, T.; Itai, Y.; Tasaka, A.

    1981-01-01

    Twenty-nine patients showed localized dilatation of the intrahepatic bile ducts on computed tomography, usually unaccompanied by jaundice. Congenital dilatation was diagnosed when associated with a choledochal cyst, while cholangiographic contrast material was helpful in differentiating such dilatation from a simple cyst by showing its communication with the biliary tract when no choledochal cyst was present. Obstructive dilatation was associated with intrahepatic calculi in 4 cases, hepatoma in 9, cholangioma in 5, metastatic tumor in 5, and polycystic disease in 2. Cholangioma and intrahepatic calculi had a greater tendency to accompany such localized dilatation; in 2 cases, the dilatation was the only clue to the underlying disorder

  5. Experience of BESIII data production with local cluster and distributed computing model

    International Nuclear Information System (INIS)

    Deng, Z Y; Li, W D; Liu, H M; Sun, Y Z; Zhang, X M; Lin, L; Nicholson, C; Zhemchugov, A

    2012-01-01

    The BES III detector is a new spectrometer which works on the upgraded high-luminosity collider, BEPCII. The BES III experiment studies physics in the tau-charm energy region from 2 GeV to 4.6 GeV. From 2009 to 2011, BEPCII has produced 106M ψ(2S) events, 225M J/ψ events, 2.8 fb⁻¹ of ψ(3770) data, and 500 pb⁻¹ of data at 4.01 GeV. All the data samples were processed successfully and many important physics results have been achieved based on these samples. Doing data production correctly and efficiently with limited CPU and storage resources is a big challenge. This paper will describe the implementation of the experiment-specific data production for BESIII in detail, including data calibration with an event-level parallel computing model, data reconstruction, inclusive Monte Carlo generation, random trigger background mixing and multi-stream data skimming. Now, with the data sample increasing rapidly, there is a growing demand to move from solely using a local cluster to a more distributed computing model. A distributed computing environment is being set up and expected to go into production use in 2012. The experience of BESIII data production, both with a local cluster and with a distributed computing model, is presented here.

  6. Imaging local brain function with emission computed tomography

    International Nuclear Information System (INIS)

    Kuhl, D.E.

    1984-01-01

    Positron emission tomography (PET) using 18 F-fluorodeoxyglucose (FDG) was used to map local cerebral glucose utilization in the study of local cerebral function. This information differs fundamentally from structural assessment by means of computed tomography (CT). In normal human volunteers, the FDG scan was used to determine the cerebral metabolic response to controlled sensory stimulation and the effects of aging. Cerebral metabolic patterns are distinctive among depressed and demented elderly patients. The FDG scan appears normal in depressed patients and is studded with multiple metabolic defects in patients with multiple infarct dementia; in patients with Alzheimer disease, metabolism is particularly reduced in the parietal cortex, but only slightly reduced in the caudate and thalamus. The interictal FDG scan effectively detects hypometabolic brain zones that are sites of onset for seizures in patients with partial epilepsy, even though these zones usually appear normal on CT scans. The future prospects of PET are discussed

  7. Automated UAV-based video exploitation using service oriented architecture framework

    Science.gov (United States)

    Se, Stephen; Nadeau, Christian; Wood, Scott

    2011-05-01

    Airborne surveillance and reconnaissance are essential for successful military missions. Such capabilities are critical for troop protection, situational awareness, mission planning, damage assessment, and others. Unmanned Aerial Vehicles (UAVs) gather huge amounts of video data but it is extremely labour-intensive for operators to analyze hours and hours of received data. At MDA, we have developed a suite of tools that can process the UAV video data automatically, including mosaicking, change detection and 3D reconstruction, which have been integrated within a standard GIS framework. In addition, the mosaicking and 3D reconstruction tools have also been integrated in a Service Oriented Architecture (SOA) framework. The Visualization and Exploitation Workstation (VIEW) integrates 2D and 3D visualization, processing, and analysis capabilities developed for UAV video exploitation. Visualization capabilities are supported through a thick-client Graphical User Interface (GUI), which allows visualization of 2D imagery, video, and 3D models. The GUI interacts with the VIEW server, which provides video mosaicking and 3D reconstruction exploitation services through the SOA framework. The SOA framework allows multiple users to perform video exploitation by running a GUI client on the operator's computer and invoking the video exploitation functionalities residing on the server. This allows the exploitation services to be upgraded easily and allows the intensive video processing to run on powerful workstations. MDA provides UAV services to the Canadian and Australian forces in Afghanistan with the Heron, a Medium Altitude Long Endurance (MALE) UAV system. On-going flight operations service provides important intelligence, surveillance, and reconnaissance information to commanders and front-line soldiers.

  8. Exploitation and disadvantage

    NARCIS (Netherlands)

    Ferguson, B.

    2016-01-01

    According to some accounts of exploitation, most notably Ruth Sample's (2003) degradation-based account and Robert Goodin's (1987) vulnerability-based account, exploitation occurs when an advantaged party fails to constrain their advantage in light of another's disadvantage, regardless of the cause

  9. Multiobjective memetic estimation of distribution algorithm based on an incremental tournament local searcher.

    Science.gov (United States)

    Yang, Kaifeng; Mu, Li; Yang, Dongdong; Zou, Feng; Wang, Lei; Jiang, Qiaoyong

    2014-01-01

    A novel hybrid multiobjective algorithm is presented in this paper, which combines a new multiobjective estimation of distribution algorithm, an efficient local searcher, and ε-dominance. In addition, two multiobjective problems with variable linkages strictly based on manifold distribution are proposed. The Pareto set of a continuous multiobjective optimization problem is, in the decision space, a piecewise low-dimensional continuous manifold. Algorithms exploiting this regularity build a probability distribution model from global statistical information of the population; however, the information carried by promising individuals is not well exploited, which hampers the search and optimization process. Hence, an incremental tournament local searcher is designed to exploit local information efficiently and accelerate convergence to the true Pareto-optimal front. Moreover, since ε-dominance is a strategy that lets a multiobjective algorithm obtain well-distributed solutions at low computational complexity, ε-dominance and the incremental tournament local searcher are combined here. The resulting memetic multiobjective estimation of distribution algorithm, MMEDA, is proposed accordingly. The algorithm is validated by experiments on twenty-two test problems, with and without variable linkages, of diverse complexities. Compared with three state-of-the-art multiobjective optimization algorithms, our algorithm achieves comparable results in terms of convergence and diversity metrics.
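
    The ε-dominance test at the heart of such archiving strategies can be sketched as follows (minimization assumed): objective vectors are snapped to an ε-grid and compared box-wise, so at most one representative per grid box survives, which bounds archive size and spreads solutions. The grid resolution below is illustrative.

        import math

        EPS = 0.05  # grid resolution per objective (illustrative)

        def box(f):
            # snap an objective vector onto the epsilon-grid
            return tuple(math.floor(v / EPS) for v in f)

        def eps_dominates(f, g):
            bf, bg = box(f), box(g)
            return all(a <= b for a, b in zip(bf, bg)) and bf != bg

        print(eps_dominates((0.10, 0.20), (0.17, 0.26)))  # True: strictly better box
        print(eps_dominates((0.10, 0.20), (0.12, 0.21)))  # False: same box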

  10. Multiobjective Memetic Estimation of Distribution Algorithm Based on an Incremental Tournament Local Searcher

    Directory of Open Access Journals (Sweden)

    Kaifeng Yang

    2014-01-01

    Full Text Available A novel hybrid multiobjective algorithm is presented in this paper, which combines a new multiobjective estimation of distribution algorithm, an efficient local searcher, and ε-dominance. In addition, two multiobjective problems with variable linkages strictly based on manifold distribution are proposed. The Pareto set of a continuous multiobjective optimization problem is, in the decision space, a piecewise low-dimensional continuous manifold. Algorithms exploiting this regularity build a probability distribution model from global statistical information of the population; however, the information carried by promising individuals is not well exploited, which hampers the search and optimization process. Hence, an incremental tournament local searcher is designed to exploit local information efficiently and accelerate convergence to the true Pareto-optimal front. Moreover, since ε-dominance is a strategy that lets a multiobjective algorithm obtain well-distributed solutions at low computational complexity, ε-dominance and the incremental tournament local searcher are combined here. The resulting memetic multiobjective estimation of distribution algorithm, MMEDA, is proposed accordingly. The algorithm is validated by experiments on twenty-two test problems, with and without variable linkages, of diverse complexities. Compared with three state-of-the-art multiobjective optimization algorithms, our algorithm achieves comparable results in terms of convergence and diversity metrics.

  11. Bronchobiliary Fistula Localized by Cholescintigraphy with Single-Photon Emission Computed Tomography

    International Nuclear Information System (INIS)

    Artunduaga, Maddy; Patel, Niraj R.; Wendt, Julie A.; Guy, Elizabeth S.; Nachiappan, Arun C.

    2015-01-01

    Biliptysis is an important clinical feature to recognize as it is associated with bronchobiliary fistula, a rare entity. Bronchobiliary fistulas have been diagnosed with planar cholescintigraphy. However, cholescintigraphy with single-photon emission computed tomography (SPECT) can better spatially localize a bronchobiliary fistula as compared to planar cholescintigraphy alone, and is useful for preoperative planning if surgical treatment is required. Here, we present the case of a 23-year-old male who developed a bronchobiliary fistula in the setting of posttraumatic and postsurgical infection, which was diagnosed and localized by cholescintigraphy with SPECT

  12. A Wireless Sensor Network with Soft Computing Localization Techniques for Track Cycling Applications.

    Science.gov (United States)

    Gharghan, Sadik Kamel; Nordin, Rosdiadee; Ismail, Mahamod

    2016-08-06

    In this paper, we propose two soft computing localization techniques for wireless sensor networks (WSNs). The two techniques, the Adaptive Neuro-Fuzzy Inference System (ANFIS) and the Artificial Neural Network (ANN), focus on a range-based localization method which relies on the measurement of the received signal strength indicator (RSSI) from the three ZigBee anchor nodes distributed throughout the track cycling field. The soft computing techniques aim to estimate the distance between bicycles moving on the cycle track for outdoor and indoor velodromes. In the first approach the ANFIS was considered, whereas in the second approach the ANN was hybridized individually with three optimization algorithms, namely Particle Swarm Optimization (PSO), the Gravitational Search Algorithm (GSA), and the Backtracking Search Algorithm (BSA). The results revealed that the hybrid GSA-ANN outperforms the other methods adopted in this paper in terms of localization and distance estimation accuracy. The hybrid GSA-ANN achieves a mean absolute distance estimation error of 0.02 m and 0.2 m for outdoor and indoor velodromes, respectively.
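
    The range-based step that the soft computing models learn can be sketched with the classical log-distance path-loss model followed by linearized trilateration from three anchors. All constants below (reference power, path-loss exponent, anchor positions, RSSI readings) are illustrative assumptions, not values from the paper.

        import numpy as np

        RSSI_1M = -45.0   # reference power at 1 m (dBm), assumed
        PATH_EXP = 2.2    # path-loss exponent, assumed

        def rssi_to_distance(rssi_dbm):
            # invert the log-distance path-loss model
            return 10 ** ((RSSI_1M - rssi_dbm) / (10 * PATH_EXP))

        # linearized trilateration from three anchors (2-D positions)
        anchors = np.array([[0.0, 0.0], [25.0, 0.0], [12.5, 20.0]])
        d = np.array([rssi_to_distance(r) for r in (-62.0, -58.0, -66.0)])

        A = 2 * (anchors[1:] - anchors[0])
        b = (d[0] ** 2 - d[1:] ** 2
             + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
        pos, *_ = np.linalg.lstsq(A, b, rcond=None)
        print(pos)  # estimated (x, y) of the moving node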

  13. Rational Exploitation and Utilizing of Groundwater in Jiangsu Coastal Area

    Science.gov (United States)

    Kang, B.; Lin, X.

    2017-12-01

    The Jiangsu coastal area is located on the southeast coast of China, and is a new industrial base and an important coastal and land resources development zone of China. In areas with intense human exploitation activities, regional groundwater evolution is markedly affected by human activities. In order to fundamentally solve the environmental geological problems caused by groundwater exploitation, we must determine the conditions forming the regional groundwater hydrodynamic field, and the impact of human activities on the evolution of the hydrodynamic field and on hydrogeochemical evolution. Based on these results, scientific management and reasonable exploitation of regional groundwater resources can be supported. Taking the coastal area of Jiangsu as the research area, we investigated and analyzed the regional hydrogeological conditions. A numerical simulation model of groundwater flow was established using hydrodynamic, chemical and isotopic methods, accounting for the flow conditions and the influence of the hydrodynamic field on the hydrochemical field. We predicted the evolution of regional groundwater dynamics under the influence of human activities and climate change, and evaluated the influence of this evolution on the environmental geological problems caused by groundwater exploitation under various conditions. We reached the following conclusions: three optimal groundwater exploitation schemes were established, with groundwater salinization taken as the primary control condition. A substitution model based on a BP network method was proposed to model groundwater exploitation and water level changes; a genetic algorithm was then used to obtain the optimized solution. The three optimal exploitation schemes were submitted to the local water resource management. The first scheme addresses the groundwater salinization problem. The second scheme focuses on dual water supply. The third scheme concerns emergency water

  14. Paradise lost: Sovereign State Interest, Global Resource Exploitation and the Politics of Human Rights

    OpenAIRE

    Augenstein, Daniel

    2016-01-01

    Taking its cue from the US Supreme Court judgment in Kiobel that restricted the extraterritorial reach of the Alien Tort Claims Act, this article explores how sovereignty structures the relationship between global resource exploitation and the localization of human rights in the international order of states. The argument situates international human rights law in an area of tension between national political self-determination and the global economic exploitation of natural resources. Global...

  15. Learning Metasploit exploitation and development

    CERN Document Server

    Balapure, Aditya

    2013-01-01

    A practical, hands-on tutorial with step-by-step instructions. The book follows a smooth and easy-to-follow tutorial approach, covering the essentials and then showing readers how to write more sophisticated exploits. This book targets exploit developers, vulnerability analysts and researchers, network administrators, and ethical hackers looking to gain advanced knowledge in exploit development and vulnerability identification. The primary goal is to take readers wishing to get into more advanced exploit discovery to the next level. Prior experience exploiting basic st

  16. Exploiting CMS data popularity to model the evolution of data management for Run-2 and beyond

    International Nuclear Information System (INIS)

    Bonacorsi, D; Neri, M; Boccali, T; Giordano, D; Girone, M; Magini, N; Kuznetsov, V; Wildish, T

    2015-01-01

    During the LHC Run-1 data taking, all experiments collected large data volumes from proton-proton and heavy-ion collisions. The collisions data, together with massive volumes of simulated data, were replicated in multiple copies, transferred among various Tier levels, transformed/slimmed in format/content. These data were then accessed (both locally and remotely) by large groups of distributed analysis communities exploiting the WorldWide LHC Computing Grid infrastructure and services. While efficient data placement strategies - together with optimal data redistribution and deletions on demand - have become the core of static versus dynamic data management projects, little effort has so far been invested in understanding the detailed data-access patterns which surfaced in Run-1. These patterns, if understood, can be used as input to simulation of computing models at the LHC, to optimise existing systems by tuning their behaviour, and to explore next-generation CPU/storage/network co-scheduling solutions. This is of great importance, given that the scale of the computing problem will increase far faster than the resources available to the experiments, for Run-2 and beyond. Studying data-access patterns involves the validation of the quality of the monitoring data collected on the "popularity" of each dataset, the analysis of the frequency and pattern of accesses to different datasets by analysis end-users, the exploration of different views of the popularity data (by physics activity, by region, by data type), the study of the evolution of Run-1 data exploitation over time, the evaluation of the impact of different data placement and distribution choices on the available network and storage resources and their impact on the computing operations. This work presents some insights from studies on the popularity data from the CMS experiment. We present the properties of a range of physics analysis activities as seen by the data popularity, and make recommendations for

  17. Exploiting CMS data popularity to model the evolution of data management for Run-2 and beyond

    Science.gov (United States)

    Bonacorsi, D.; Boccali, T.; Giordano, D.; Girone, M.; Neri, M.; Magini, N.; Kuznetsov, V.; Wildish, T.

    2015-12-01

    During the LHC Run-1 data taking, all experiments collected large data volumes from proton-proton and heavy-ion collisions. The collisions data, together with massive volumes of simulated data, were replicated in multiple copies, transferred among various Tier levels, transformed/slimmed in format/content. These data were then accessed (both locally and remotely) by large groups of distributed analysis communities exploiting the WorldWide LHC Computing Grid infrastructure and services. While efficient data placement strategies - together with optimal data redistribution and deletions on demand - have become the core of static versus dynamic data management projects, little effort has so far been invested in understanding the detailed data-access patterns which surfaced in Run-1. These patterns, if understood, can be used as input to simulation of computing models at the LHC, to optimise existing systems by tuning their behaviour, and to explore next-generation CPU/storage/network co-scheduling solutions. This is of great importance, given that the scale of the computing problem will increase far faster than the resources available to the experiments, for Run-2 and beyond. Studying data-access patterns involves the validation of the quality of the monitoring data collected on the "popularity" of each dataset, the analysis of the frequency and pattern of accesses to different datasets by analysis end-users, the exploration of different views of the popularity data (by physics activity, by region, by data type), the study of the evolution of Run-1 data exploitation over time, the evaluation of the impact of different data placement and distribution choices on the available network and storage resources and their impact on the computing operations. This work presents some insights from studies on the popularity data from the CMS experiment. We present the properties of a range of physics analysis activities as seen by the data popularity, and make recommendations for

  18. Analytic and Unambiguous Phase-Based Algorithm for 3-D Localization of a Single Source with Uniform Circular Array

    Directory of Open Access Journals (Sweden)

    Le Zuo

    2018-02-01

    Full Text Available This paper presents an analytic algorithm for estimating the three-dimensional (3-D) localization of a single source with uniform circular array (UCA) interferometers. Fourier transforms are exploited to expand the phase distribution of a single source and the localization problem is reformulated as an equivalent spectrum manipulation problem. The 3-D parameters are decoupled to different spectrums in the Fourier domain. Algebraic relations are established between the 3-D localization parameters and the Fourier spectrums. The Fourier sampling theorem ensures that the minimum element number for 3-D localization of a single source with a UCA is five. Accuracy analysis provides mathematical insights into the 3-D localization algorithm, showing that a larger number of elements gives higher estimation accuracy. In addition, the phase-based high-order difference invariance (HODI) property of a UCA is found and exploited to realize phase range compression. Following phase range compression, ambiguity resolution is addressed by the HODI of a UCA. A major advantage of the algorithm is that the ambiguity resolution and 3-D localization estimation are both analytic and are processed simultaneously, hence computationally efficient. Numerical simulations and experimental results are provided to verify the effectiveness of the proposed 3-D localization algorithm.
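
    The Fourier-domain decoupling can be illustrated for an ideal far-field source: the element phases of a UCA vary as β·cos(θ_n − azimuth), so the first DFT coefficient of the phase sequence yields the azimuth from its argument and the elevation from its magnitude. The sketch below uses illustrative geometry and omits the phase-wrapping/ambiguity-resolution step the paper addresses.

        import numpy as np

        N = 8                                   # number of elements (paper's minimum is 5)
        r_over_lam = 0.4                        # ring radius in wavelengths (illustrative)
        az, el = np.deg2rad(50.0), np.deg2rad(30.0)

        theta = 2 * np.pi * np.arange(N) / N    # element angular positions
        beta = 2 * np.pi * r_over_lam * np.cos(el)
        phase = beta * np.cos(theta - az)       # ideal unwrapped single-source phases

        X1 = np.sum(phase * np.exp(-1j * theta))        # first Fourier coefficient
        az_hat = -np.angle(X1)                          # azimuth from the argument
        el_hat = np.arccos(np.abs(X1) / (N / 2) / (2 * np.pi * r_over_lam))
        print(np.rad2deg(az_hat), np.rad2deg(el_hat))   # ~ (50.0, 30.0)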

  19. Local hybrid functionals with orbital-free mixing functions and balanced elimination of self-interaction error

    International Nuclear Information System (INIS)

    Silva, Piotr de; Corminboeuf, Clémence

    2015-01-01

    The recently introduced density overlap regions indicator (DORI) [P. de Silva and C. Corminboeuf, J. Chem. Theory Comput. 10(9), 3745–3756 (2014)] is a density-dependent scalar field revealing regions of high density overlap between shells, atoms, and molecules. In this work, we exploit its properties to construct local hybrid exchange-correlation functionals aiming at balanced reduction of the self-interaction error. We show that DORI can successfully replace the ratio of the von Weizsäcker and exact positive-definite kinetic energy densities, which is commonly used in mixing functions of local hybrids. Additionally, we introduce several semi-empirical parameters to control the local and global admixture of exact exchange. The most promising of our local hybrids clearly outperforms the underlying semi-local functionals as well as their global hybrids

  20. Anthropology of sexual exploitation

    Directory of Open Access Journals (Sweden)

    Lalić Velibor

    2009-01-01

    Full Text Available In this paper, the authors observe sexual exploitation from an anthropological perspective. They analyze the rational, ethical, emotional and mythological dimensions of human sexuality. Consequently, after setting the phenomenon in a social and historical context, sexual exploitation is closely observed in the contemporary age. Drawing on the thoughts of relevant thinkers, they conclude that the elimination of sexual exploitation is not a purely legal issue, but a political and economic one as well. Namely, legal norms alone are not sufficient to overcome sexual exploitation; political and economic relationships based on genuinely equal opportunities must be established in contemporary societies.

  1. Finite element electromagnetic field computation on the Sequent Symmetry 81 parallel computer

    International Nuclear Information System (INIS)

    Ratnajeevan, S.; Hoole, H.

    1990-01-01

    Finite element field analysis algorithms lend themselves to parallelization, and this fact is exploited in this paper to implement a finite element analysis program for electromagnetic field computation on the Sequent Symmetry 81 parallel computer with three processors. In terms of waiting time, the maximum gains are to be made in matrix solution, and therefore this paper concentrates on the gains in parallelizing the solution part of finite element analysis. An outline of how parallelization could be exploited in most finite element operations is given in this paper, although the actual implementation of parallelism on the Sequent Symmetry 81 parallel computer was in the sparsity computation, matrix assembly and matrix solution areas. In all cases, the algorithms were modified to suit the parallel programming application rather than allowing the compiler to parallelize existing algorithms

  2. Exploiting large-scale correlations to detect continuous gravitational waves.

    Science.gov (United States)

    Pletsch, Holger J; Allen, Bruce

    2009-10-30

    Fully coherent searches (over realistic ranges of parameter space and year-long observation times) for unknown sources of continuous gravitational waves are computationally prohibitive. Less expensive hierarchical searches divide the data into shorter segments which are analyzed coherently, then detection statistics from different segments are combined incoherently. The novel method presented here solves the long-standing problem of how best to do the incoherent combination. The optimal solution exploits large-scale parameter-space correlations in the coherent detection statistic. Application to simulated data shows dramatic sensitivity improvements compared with previously available (ad hoc) methods, increasing the spatial volume probed by more than 2 orders of magnitude at lower computational cost.

  3. The exploitation argument against commercial surrogacy.

    Science.gov (United States)

    Wilkinson, Stephen

    2003-04-01

    This paper discusses the exploitation argument against commercial surrogacy: the claim that commercial surrogacy is morally objectionable because it is exploitative. The following questions are addressed. First, what exactly does the exploitation argument amount to? Second, is commercial surrogacy in fact exploitative? Third, if it were exploitative, would this provide a sufficient reason to prohibit (or otherwise legislatively discourage) it? The focus throughout is on the exploitation of paid surrogates, although it is noted that other parties (e.g. 'commissioning parents') may also be the victims of exploitation. It is argued that there are good reasons for believing that commercial surrogacy is often exploitative. However, even if we accept this, the exploitation argument for prohibiting (or otherwise legislatively discouraging) commercial surrogacy remains quite weak. One reason for this is that prohibition may well 'backfire' and lead to potential surrogates having to do other things that are more exploitative and/or more harmful than paid surrogacy. It is concluded therefore that those who oppose exploitation should (rather than attempting to stop particular practices like commercial surrogacy) concentrate on: (a) improving the conditions under which paid surrogates 'work'; and (b) changing the background conditions (in particular, the unequal distribution of power and wealth) which generate exploitative relationships.

  4. Two questions about surrogacy and exploitation.

    Science.gov (United States)

    Wertheimer, Alan

    1992-01-01

    In this article I will consider two related questions about surrogacy and exploitation: (1) Is surrogacy exploitative? (2) If surrogacy is exploitative, what is the moral force of this exploitation? Briefly stated, I shall argue that whether surrogacy is exploitative depends on whether exploitation must be harmful to the exploited party or whether (as I think) there can be mutually advantageous exploitation. It also depends on some facts about surrogacy about which we have little reliable evidence and on our philosophical view on what counts as a harm to the surrogate. Our answer to the second question will turn in part on the account of exploitation we invoke in answering the first question and in part on the way in which we resolve some other questions about the justification of state interference. I shall suggest, however, that if surrogacy is a form of voluntary and mutually advantageous exploitation, then there is a strong presumption that surrogacy contracts should be permitted and even enforceable, although that presumption may be overridden on other grounds.

  5. The ESA Geohazard Exploitation Platform

    Science.gov (United States)

    Bally, Philippe; Laur, Henri; Mathieu, Pierre-Philippe; Pinto, Salvatore

    2015-04-01

    Earthquakes represent one of the world's most significant hazards in terms both of loss of life and damage. In the first decade of the 21st century, earthquakes accounted for 60 percent of fatalities from natural disasters, according to the United Nations International Strategy for Disaster Reduction (UNISDR). To support mitigation activities designed to assess and reduce risks and improve response in emergency situations, satellite EO can be used to provide a broad range of geo-information services. This includes, for instance, crustal block boundary mapping to better characterize active faults, strain rate mapping to assess how rapidly faults are deforming, soil vulnerability mapping to help estimate how the soil is behaving in reaction to seismic phenomena, and geo-information to assess the extent and intensity of the earthquake impact on man-made structures and formulate assumptions on the evolution of the seismic sequence, i.e. where local aftershocks or future main shocks (on nearby faults) are most likely to occur. In May 2012, the European Space Agency and the GEO Secretariat convened the International Forum on Satellite EO for Geohazards, now known as the Santorini Conference. The event was the continuation of a series of international workshops such as those organized by the Geohazards Theme of the Integrated Global Observing Strategy Partnership. In Santorini the seismic community set out a vision of the EO contribution to an operational global seismic risk program, which led to the Geohazard Supersites and Natural Laboratories (GSNL) initiative. The initial contribution of ESA to support the GSNL was the first Supersites Exploitation Platform (SSEP) system in the framework of Grid Processing On Demand (GPOD), now followed by the Geohazard Exploitation Platform (GEP). In this presentation, we will describe the contribution of the GEP to exploiting satellite EO for geohazard risk assessment. It is supporting the GEO Supersites and has been further

  6. Exploiting Sparsity in SDP Relaxation for Sensor Network Localization

    NARCIS (Netherlands)

    S. Kim (Sunyoung); M. Kojima; H. Waki (Hayato)

    2008-01-01

    A sensor network localization problem can be formulated as a quadratic optimization problem (QOP). For quadratic optimization problems, semidefinite programming (SDP) relaxation by Lasserre with relaxation order 1 for general polynomial optimization problems (POPs) is known to be

  7. Exploiting Sparsity in SDP Relaxation for Sensor Network Localization

    NARCIS (Netherlands)

    S. Kim (Sunyoung); M. Kojima; H. Waki (Hayato)

    2009-01-01

    A sensor network localization problem can be formulated as a quadratic optimization problem (QOP). For quadratic optimization problems, semidefinite programming (SDP) relaxation by Lasserre with relaxation order 1 for general polynomial optimization problems (POPs) is known to be

  8. Exploit Kit traffic analysis

    OpenAIRE

    Καπίρης, Σταμάτης; Kapiris, Stamatis

    2017-01-01

    Exploit kits have become one of the most widespread and destructive threats that Internet users face on a daily basis. Since the first actor categorized as an exploit kit, namely MPack, appeared in 2006, we have seen a new era of exploit kit variants compromising popular websites, infecting hosts and delivering destructive malware, in an exponential evolution to date. With the growing threat landscape, from large enterprises to domestic networks, have starte...

  9. Performing a local reduction operation on a parallel computer

    Science.gov (United States)

    Blocksome, Michael A.; Faraj, Daniel A.

    2012-12-11

    A parallel computer including compute nodes, each including two reduction processing cores, a network write processing core, and a network read processing core, each processing core assigned an input buffer. Copying, in interleaved chunks by the reduction processing cores, contents of the reduction processing cores' input buffers to an interleaved buffer in shared memory; copying, by one of the reduction processing cores, contents of the network write processing core's input buffer to shared memory; copying, by another of the reduction processing cores, contents of the network read processing core's input buffer to shared memory; and locally reducing in parallel by the reduction processing cores: the contents of the reduction processing core's input buffer; every other interleaved chunk of the interleaved buffer; the copied contents of the network write processing core's input buffer; and the copied contents of the network read processing core's input buffer.
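
    The interleaving scheme can be sketched as follows, with Python threads standing in for the reduction processing cores and NumPy slices for the shared-memory buffers; only the two reduction cores' buffers are shown, and all sizes are illustrative.

        import numpy as np
        from threading import Thread

        CHUNK, NCHUNKS = 1024, 8
        bufs = [np.random.rand(CHUNK * NCHUNKS) for _ in range(2)]   # two cores' input buffers
        interleaved = np.empty(2 * CHUNK * NCHUNKS)                  # "shared memory" buffer
        partial = [0.0, 0.0]

        def copy_interleaved(core):
            # core 0 fills the even chunks, core 1 the odd chunks
            for c in range(NCHUNKS):
                dst = (2 * c + core) * CHUNK
                interleaved[dst:dst + CHUNK] = bufs[core][c * CHUNK:(c + 1) * CHUNK]

        def reduce_every_other(core):
            # each core locally reduces every other interleaved chunk
            s = 0.0
            for c in range(core, 2 * NCHUNKS, 2):
                s += interleaved[c * CHUNK:(c + 1) * CHUNK].sum()
            partial[core] = s

        for phase in (copy_interleaved, reduce_every_other):         # copy, then reduce
            threads = [Thread(target=phase, args=(core,)) for core in range(2)]
            for t in threads:
                t.start()
            for t in threads:
                t.join()

        print(partial[0] + partial[1], bufs[0].sum() + bufs[1].sum())  # the totals agree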

  10. Exploiting Group Symmetry in Semidefinite Programming Relaxations of the Quadratic Assignment Problem

    NARCIS (Netherlands)

    de Klerk, E.; Sotirov, R.

    2007-01-01

    We consider semidefinite programming relaxations of the quadratic assignment problem, and show how to exploit group symmetry in the problem data. Thus we are able to compute the best known lower bounds for several instances of quadratic assignment problems from the problem library: [R.E. Burkard,

  11. Enjeux fonciers, exploitation des ressources naturelles et Forêts des Communautés Locales en périphérie de Kinshasa, RDC

    Directory of Open Access Journals (Sweden)

    Vermeulen, C.

    2011-01-01

    Full Text Available Land issues, exploitation of natural resources, and Forests of Rural Communities in the periphery of Kinshasa, DRC. Peri-urban forests are under strong anthropic pressure. Any intervention requires a prior identification of the stakeholders, their perception of the landscape, socio-economic trends in local communities, and the communities' relationships with land and natural resources. Kinshasa (capital of the Democratic Republic of Congo, DRC) is a city of 10 million inhabitants with rapid growth and increasing impacts on surrounding villages linked to forest natural resources. This paper describes the stakeholders within local communities and their relations with land and wood resources. Two areas surrounding Kinshasa (Bas-Congo and the Bateke Plateaux) are considered major fuel-wood and charcoal supply zones for the city. The two areas differ in terms of land pressure (very high in Bas-Congo, focused on riparian forests on the Bateke Plateaux) but show the same pattern of overuse of forest and woody natural resources. In both areas, local management of forest resources by the traditional authorities (heads of village or lineage) has failed. Local willingness to engage in reforestation and forest restoration activities is much greater in Bas-Congo than on the Bateke Plateaux. In both areas, shifting cultivation driven by slash-and-burn practices for agriculture and charcoal production is accelerating. This has a strong negative impact on the potential for regeneration of local forest species. The sustainability of community-based management of forest natural resources is discussed with regard to the ongoing negotiations on community-based forest management regulations.

  12. Comparison of Spatiotemporal Mapping Techniques for Enormous Etl and Exploitation Patterns

    Science.gov (United States)

    Deiotte, R.; La Valley, R.

    2017-10-01

    The need to extract, transform, and exploit enormous volumes of spatiotemporal data has exploded with the rise of social media, advanced military sensors, wearables, automotive tracking, etc. However, current methods of spatiotemporal encoding and exploitation simultaneously limit the use of that information and increase computing complexity. Current spatiotemporal encoding methods from Niemeyer and Usher rely on a Z-order space filling curve, a relative of Peano's 1890 space filling curve, for spatial hashing and interleaving temporal hashes to generate a spatiotemporal encoding. However, there exist other space-filling curves that provide different manifold coverings, which could promote better hashing techniques for spatial data and have the potential to map spatiotemporal data without interleaving. The concatenation of Niemeyer's and Usher's techniques provides a highly efficient space-time index. However, other methods have advantages and disadvantages regarding computational cost, efficiency, and utility. This paper explores several such methods using data sets ranging in size from 1K to 10M observations and provides a comparison of the methods.
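
    For reference, a sketch of the baseline Z-order (Morton) spatial hash discussed above: quantize latitude and longitude and interleave their bits; a temporal hash can then be interleaved or appended to form the spatiotemporal key. The precision choices here are illustrative.

        def part1by1(n):
            # spread the lower 16 bits of n so a zero sits between each bit
            n &= 0xFFFF
            n = (n | (n << 8)) & 0x00FF00FF
            n = (n | (n << 4)) & 0x0F0F0F0F
            n = (n | (n << 2)) & 0x33333333
            n = (n | (n << 1)) & 0x55555555
            return n

        def morton2d(lat, lon, bits=16):
            # quantize each coordinate to [0, 2^bits) and interleave the bits
            scale = (1 << bits) - 1
            y = int((lat + 90.0) / 180.0 * scale)
            x = int((lon + 180.0) / 360.0 * scale)
            return (part1by1(y) << 1) | part1by1(x)

        print(hex(morton2d(39.74, -104.99)))  # a Denver-area cell key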

  13. COMPARISON OF SPATIOTEMPORAL MAPPING TECHNIQUES FOR ENORMOUS ETL AND EXPLOITATION PATTERNS

    Directory of Open Access Journals (Sweden)

    R. Deiotte

    2017-10-01

    Full Text Available The need to extract, transform, and exploit enormous volumes of spatiotemporal data has exploded with the rise of social media, advanced military sensors, wearables, automotive tracking, etc. However, current methods of spatiotemporal encoding and exploitation simultaneously limit the use of that information and increase computing complexity. Current spatiotemporal encoding methods from Niemeyer and Usher rely on a Z-order space filling curve, a relative of Peano’s 1890 space filling curve, for spatial hashing and interleaving temporal hashes to generate a spatiotemporal encoding. However, there exist other space-filling curves that provide different manifold coverings, which could promote better hashing techniques for spatial data and have the potential to map spatiotemporal data without interleaving. The concatenation of Niemeyer’s and Usher’s techniques provides a highly efficient space-time index. However, other methods have advantages and disadvantages regarding computational cost, efficiency, and utility. This paper explores several such methods using data sets ranging in size from 1K to 10M observations and provides a comparison of the methods.

  14. Quantum Computing

    Indian Academy of Sciences (India)

    In the first part of this article, we had looked at how quantum physics can be harnessed to make the building blocks of a quantum computer. In this concluding part, we look at algorithms which can exploit the power of this computational device, and some practical difficulties in building such a device. Quantum Algorithms.

  15. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  16. Local coding based matching kernel method for image classification.

    Directory of Open Access Journals (Sweden)

    Yan Song

    Full Text Available This paper mainly focuses on how to effectively and efficiently measure visual similarity for local feature based representation. Among existing methods, metrics based on Bag of Visual Word (BoV) techniques are efficient and conceptually simple, at the expense of effectiveness. By contrast, kernel based metrics are more effective, but at the cost of greater computational complexity and increased storage requirements. We show that a unified visual matching framework can be developed to encompass both BoV and kernel based metrics, in which local kernel plays an important role between feature pairs or between features and their reconstruction. Generally, local kernels are defined using Euclidean distance or its derivatives, based either explicitly or implicitly on an assumption of Gaussian noise. However, local features such as SIFT and HoG often follow a heavy-tailed distribution which tends to undermine the motivation behind Euclidean metrics. Motivated by recent advances in feature coding techniques, a novel efficient local coding based matching kernel (LCMK) method is proposed. This exploits the manifold structures in Hilbert space derived from local kernels. The proposed method combines advantages of both BoV and kernel based metrics, and achieves a linear computational complexity. This enables efficient and scalable visual matching to be performed on large scale image sets. To evaluate the effectiveness of the proposed LCMK method, we conduct extensive experiments with widely used benchmark datasets, including 15-Scenes, Caltech101/256, PASCAL VOC 2007 and 2011 datasets. Experimental results confirm the effectiveness of the relatively efficient LCMK method.

  17. A new measure of interpersonal exploitativeness

    Directory of Open Access Journals (Sweden)

    Amy B. Brunell

    2013-05-01

    Full Text Available Measures of exploitativeness evidence problems with validity and reliability. The present set of studies assessed a new measure (the Interpersonal Exploitativeness Scale) that defines exploitativeness in terms of reciprocity. In Studies 1 and 2, 33 items were administered to participants. Exploratory and Confirmatory Factor Analysis demonstrated that a single factor consisting of six items adequately assesses interpersonal exploitativeness. Study 3 results revealed that the Interpersonal Exploitativeness Scale was positively associated with normal narcissism, pathological narcissism, psychological entitlement, and negative reciprocity and negatively correlated with positive reciprocity. In Study 4, participants competed in a commons dilemma. Those who scored higher on the Interpersonal Exploitativeness Scale were more likely to harvest a greater share of resources over time, even while controlling for other relevant variables, such as entitlement. Together, these studies show the Interpersonal Exploitativeness Scale to be a valid and reliable measure of interpersonal exploitativeness. The authors discuss the implications of these studies.

  18. Accelerating SPARQL queries by exploiting hash-based locality and adaptive partitioning

    KAUST Repository

    Al-Harbi, Razen

    2016-02-08

    State-of-the-art distributed RDF systems partition data across multiple computer nodes (workers). Some systems perform cheap hash partitioning, which may result in expensive query evaluation. Others try to minimize inter-node communication, which requires an expensive data preprocessing phase, leading to a high startup cost. Apriori knowledge of the query workload has also been used to create partitions, which, however, are static and do not adapt to workload changes. In this paper, we propose AdPart, a distributed RDF system, which addresses the shortcomings of previous work. First, AdPart applies lightweight partitioning on the initial data, which distributes triples by hashing on their subjects; this renders its startup overhead low. At the same time, the locality-aware query optimizer of AdPart takes full advantage of the partitioning to (1) support the fully parallel processing of join patterns on subjects and (2) minimize data communication for general queries by applying hash distribution of intermediate results instead of broadcasting, wherever possible. Second, AdPart monitors the data access patterns and dynamically redistributes and replicates the instances of the most frequent ones among workers. As a result, the communication cost for future queries is drastically reduced or even eliminated. To control replication, AdPart implements an eviction policy for the redistributed patterns. Our experiments with synthetic and real data verify that AdPart: (1) starts faster than all existing systems; (2) processes thousands of queries before other systems become online; and (3) gracefully adapts to the query load, being able to evaluate queries on billion-scale RDF data in subseconds.
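
    A hedged sketch of the initial lightweight step described above, assuming triples are plain (subject, predicate, object) tuples; the worker count and the use of Python's built-in hash are illustrative (a real deployment would use a stable hash function):

```python
from collections import defaultdict

def partition_by_subject(triples, n_workers):
    """Hash-partition RDF triples on their subject. All triples sharing a
    subject land on the same worker, so star joins on a subject can run
    fully in parallel with no inter-node communication."""
    workers = defaultdict(list)
    for s, p, o in triples:
        workers[hash(s) % n_workers].append((s, p, o))
    return workers

triples = [
    ("alice", "knows", "bob"),
    ("alice", "worksAt", "kaust"),
    ("bob", "knows", "carol"),
]
for worker, part in sorted(partition_by_subject(triples, 4).items()):
    print(worker, part)
```

    The low startup cost follows from the fact that this pass touches each triple exactly once; the adaptive redistribution and replication of hot patterns is a separate, runtime mechanism not shown here.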

  19. Asymmetrical floating point array processors, their application to exploration and exploitation

    Energy Technology Data Exchange (ETDEWEB)

    Geriepy, B L

    1983-01-01

    An asymmetrical floating point array processor is a special-purpose scientific computer which operates under asymmetrical control of a host computer. Although an array processor can receive fixed point input and produce fixed point output, its primary mode of operation is floating point. The first generation of array processors was oriented towards time series information. The next generation of array processors has proved much more versatile, and their applicability ranges from petroleum reservoir simulation to speech synthesis. Array processors are becoming commonplace in mining, the primary usage being construction of grids, by conventional methods or by kriging. The Australian mining community is among the world's leaders in regard to computer-assisted exploration and exploitation systems. Part of this leadership role must be providing guidance to computer vendors in regard to current and future requirements.

  20. Exploitative and hierarchical antagonism in a cooperative bacterium.

    Directory of Open Access Journals (Sweden)

    Francesca Fiegna

    2005-11-01

    Full Text Available Social organisms that cooperate with some members of their own species, such as close relatives, may fail to cooperate with other genotypes of the same species. Such noncooperation may take the form of outright antagonism or social exploitation. Myxococcus xanthus is a highly social prokaryote that cooperatively develops into spore-bearing, multicellular fruiting bodies in response to starvation. Here we have characterized the nature of social interactions among nine developmentally proficient strains of M. xanthus isolated from spatially distant locations. Strains were competed against one another in all possible pairwise combinations during starvation-induced development. In most pairings, at least one competitor exhibited strong antagonism toward its partner and a majority of mixes showed bidirectional antagonism that decreased total spore production, even to the point of driving whole populations to extinction. Differential response to mixing was the primary determinant of competitive superiority rather than the sporulation efficiencies of unmixed populations. In some competitive pairings, the dominant partner sporulated more efficiently in mixed populations than in clonal isolation. This finding represents a novel form of exploitation in bacteria carried out by socially competent genotypes and is the first documentation of social exploitation among natural bacterial isolates. Patterns of antagonistic superiority among these strains form a highly linear dominance hierarchy. At least some competition pairs construct chimeric, rather than segregated, fruiting bodies. The cooperative prokaryote M. xanthus has diverged into a large number of distinct social types that cooperate with clone-mates but exhibit intense antagonism toward distinct social types of the same species. Most lengthy migration events in nature may thus result in strong antagonism between migratory and resident populations, and this antagonism may have large effects on local

  1. Geometric saliency to characterize radar exploitation performance

    Science.gov (United States)

    Nolan, Adam; Keserich, Brad; Lingg, Andrew; Goley, Steve

    2014-06-01

    Based on the fundamental scattering mechanisms of facetized computer-aided design (CAD) models, we are able to define expected contributions (EC) to the radar signature. The net result of this analysis is the prediction of the salient aspects and contributing vehicle morphology based on the aspect. Although this approach does not provide the fidelity of an asymptotic electromagnetic (EM) simulation, it does provide very fast estimates of the unique scattering that can be consumed by a signature exploitation algorithm. The speed of this approach is particularly relevant when considering the high dimensionality of target configuration variability due to articulating parts which are computationally burdensome to predict. The key scattering phenomena considered in this work are the specular response from a single bounce interaction with surfaces and dihedral response formed between the ground plane and vehicle. Results of this analysis are demonstrated for a set of civilian target models.
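
    A hedged sketch of the kind of fast geometric screening described above, scoring the monostatic single-bounce specular contribution of CAD facets for one aspect; the tolerance angle and the area weighting are illustrative assumptions, not the authors' exact EC definition:

```python
import numpy as np

def specular_saliency(normals, areas, view_dir, tol_deg=5.0):
    # A facet flashes (single-bounce specular) when its unit normal points back
    # along the line of sight to within a small tolerance; weight by facet area.
    v = view_dir / np.linalg.norm(view_dir)
    cos_ang = normals @ v
    flashing = cos_ang > np.cos(np.radians(tol_deg))
    return float((areas * flashing).sum())

rng = np.random.default_rng(1)
normals = rng.normal(size=(1000, 3))
normals /= np.linalg.norm(normals, axis=1, keepdims=True)   # unit facet normals
areas = rng.uniform(0.01, 1.0, size=1000)                   # facet areas (m^2)
print(specular_saliency(normals, areas, np.array([0.0, 0.0, 1.0])))
```

    Sweeping `view_dir` over aspect angles yields a cheap saliency-versus-aspect curve, which is the sort of fast estimate a full asymptotic EM simulation would be too slow to provide for many articulation states.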

  2. G‐LoSA: An efficient computational tool for local structure‐centric biological studies and drug design

    Science.gov (United States)

    2016-01-01

    Abstract Molecular recognition by proteins mostly occurs in a local region on the protein surface. Thus, an efficient computational method for accurate characterization of protein local structural conservation is necessary for better understanding biology and for drug design. We present a novel local structure alignment tool, G‐LoSA. G‐LoSA aligns protein local structures in a sequence order independent way and provides a GA‐score, a chemical feature‐based and size‐independent structure similarity score. Our benchmark validation shows the robust performance of G‐LoSA on local structures of diverse sizes and characteristics, demonstrating its universal applicability to local structure‐centric comparative biology studies. In particular, G‐LoSA is highly effective in detecting conserved local regions on the entire surface of a given protein. In addition, the applications of G‐LoSA to identifying template ligands and predicting ligand and protein binding sites illustrate its strong potential for computer‐aided drug design. We hope that G‐LoSA can serve as a useful computational method for exploring interesting biological problems through large‐scale comparison of protein local structures and for facilitating drug discovery research and development. G‐LoSA is freely available to academic users at http://im.compbio.ku.edu/GLoSA/. PMID:26813336

  3. Preoperative localization of endocrine pancreatic tumours by intra-arterial dynamic computed tomography

    International Nuclear Information System (INIS)

    Ahlstroem, H.; Magnusson, A.; Grama, D.; Eriksson, B.; Oeberg, K.; Loerelius, L.E.; Akademiska Sjukhuset, Uppsala; Akademiska Sjukhuset, Uppsala

    1990-01-01

    Eleven patients with biochemically confirmed endocrine pancreatic tumours were examined preoperatively with intra-arterial (i.a.) dynamic computed tomography (CT) and angiography. Seven of the patients suffered from the multiple endocrine neoplasia type 1 (MEN-1) syndrome. All patients were operated upon, with surgical palpation and ultrasound as the peroperative localization methods. Of the 33 tumours found at histopathologic analysis of the resected specimens in the 11 patients, 7 tumours in 7 patients were correctly localized by both i.a. dynamic CT and angiography. Six patients with MEN-1 syndrome had multiple tumours; this group of patients together had 28 tumours, of which 5 (18%) were localized preoperatively by both CT and angiography. With the technique used here, i.a. dynamic CT does not seem to improve the localization of endocrine pancreatic tumours compared with angiography, especially in the rare group of MEN-1 patients. (orig.)

  4. Decision support system for exploiting local renewable energy sources: A case study of the Chigu area of southwestern Taiwan

    International Nuclear Information System (INIS)

    Yue, C.-D.; Yang, G.G.-L.

    2007-01-01

    The topic of climate and energy policy has drawn new attention since the Kyoto Protocol came into force. It is hoped that strengthened use of renewable energy sources can meet new international environmental requirements and provide self-sufficient domestic energy supplies. The decision support system established in this study integrates potential evaluations, cost analyses, legal incentives, and analysis of returns on investment with the aid of a geographic information system (GIS). This system can give policymakers insight into where the potential lies and how large it is, lawmakers insight into whether the current legal incentives are sufficient to encourage private investment, and investors insight into whether investments in exploiting local renewable energy sources are economically feasible. Under the current incentive framework in Taiwan, the amortization periods of investments in renewable energy are generally longer than the period over which the investment is to be recovered. This presents an unfavorable condition for attracting investment to and developing renewable energy. An increase in remuneration through legal revisions is needed before domestic investment in renewable energy will actively expand.
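
    As a toy illustration of the return-on-investment screening such a system performs, the following computes a simple undiscounted amortization (payback) period; all figures are invented for illustration and do not come from the study:

```python
def payback_years(capex, annual_kwh, tariff_per_kwh, annual_om=0.0):
    """Undiscounted amortization period of a renewable-energy installation."""
    net_annual_revenue = annual_kwh * tariff_per_kwh - annual_om
    if net_annual_revenue <= 0:
        return float("inf")               # the investment is never recovered
    return capex / net_annual_revenue

# Illustrative numbers only: a small wind installation under a fixed feed-in tariff.
print(payback_years(capex=150_000, annual_kwh=60_000,
                    tariff_per_kwh=0.08, annual_om=2_000))   # ~53.6 years
```

    A payback period far beyond the intended recovery horizon, as in this made-up example, is exactly the unfavorable condition for private investment that the abstract describes.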

  5. Exploiting peer group concept for adaptive and highly available services

    CERN Document Server

    Ali, Arshad; Fraz, Mohammad Moazam; Jan, Muhammad Asif; Zahid, Fahd Ali

    2003-01-01

    This paper presents a prototype of a redundant, highly available, and fault-tolerant peer-to-peer framework for data management. Peer-to-peer computing is gaining importance due to its flexible organization, lack of central authority, distribution of functionality to participating nodes, and ability to utilize unused computational resources. The emergence of GRID computing has provided much-needed infrastructure and an administrative domain for peer-to-peer computing. The components of this framework exploit the peer group concept to scope service and information search, arrange services and information in a coherent manner, provide selective redundancy, and ensure availability in the face of failure and high load conditions. A prototype system has been implemented using JXTA peer-to-peer technology, and XML is used for service description and interfaces, allowing peers to communicate with services implemented in various platforms including web services and JINI services. It utilizes code mobility to achieve role interchange amo...

  6. EXPLOITATION OF GRANITE BOULDER

    Directory of Open Access Journals (Sweden)

    Ivan Cotman

    1994-12-01

    Full Text Available The processes of forming, petrography, features, properties and exploitation of granite boulders are described. Directional drilling and black powder blasting is a successful method for the exploitation of granite boulders (boulder technology). (The paper is published in Croatian.)

  7. Computation of quantum electron transport with local current conservation using quantum trajectories

    International Nuclear Information System (INIS)

    Alarcón, A; Oriols, X

    2009-01-01

    A recent proposal for modeling time-dependent quantum electron transport with Coulomb and exchange correlations using quantum (Bohm) trajectories (Oriols 2007 Phys. Rev. Lett. 98 066803) is extended towards the computation of the total (particle plus displacement) current in mesoscopic devices. In particular, two different methods for the practical computation of the total current are compared. The first method computes the particle and the displacement currents from the rate of Bohm particles crossing a particular surface and the time-dependent variations of the electric field there. The second method uses the Ramo–Shockley theorem to compute the total current on that surface from the knowledge of the Bohm particle dynamics in a 3D volume and the time-dependent variations of the electric field on the boundaries of that volume. From a computational point of view, it is shown that both methods achieve local current conservation, but the second is preferred because it is free from 'spurious' peaks. A numerical example, a Bohm trajectory crossing a double-barrier tunneling structure, is presented, supporting the conclusions
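
    For reference, the Ramo-Shockley theorem invoked by the second method is usually stated as follows (sign conventions vary with how the weighting field is defined, so take this as the standard textbook form rather than the paper's exact expression):

```latex
i(t) = -\, q \,\mathbf{v}(t) \cdot \mathbf{E}_w\!\bigl(\mathbf{r}(t)\bigr)
```

    Here \(\mathbf{E}_w\) is the weighting field obtained by setting the measuring electrode to unit potential with all other conductors grounded, so the total current on a surface can be computed from the Bohm particle dynamics in the enclosed volume, as the abstract describes.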

  8. Quantum Analog Computing

    Science.gov (United States)

    Zak, M.

    1998-01-01

    Quantum analog computing is based upon the similarity between the mathematical formalism of quantum mechanics and the phenomena to be computed. It exploits the dynamical convergence of several competing phenomena to an attractor which can represent an extremum of a function, an image, a solution to a system of ODEs, or a stochastic process.

  9. Precise RFID localization in impaired environment through sparse signal recovery

    Science.gov (United States)

    Subedi, Saurav; Zhang, Yimin D.; Amin, Moeness G.

    2013-05-01

    Radio frequency identification (RFID) is a rapidly developing wireless communication technology for electronically identifying, locating, and tracking products, assets, and personnel. RFID has become one of the most important means to construct real-time locating systems (RTLS) that track and identify the location of objects in real time using simple, inexpensive tags and readers. The applicability and usefulness of RTLS techniques depend on their achievable accuracy. In particular, when multilateration-based localization techniques are exploited, the achievable accuracy primarily relies on the precision of the range estimates between a reader and the tags. Such range information can be obtained by using the received signal strength indicator (RSSI) and/or the phase difference of arrival (PDOA). In both cases, however, the accuracy is significantly compromised when the operation environment is impaired. In particular, multipath propagation significantly affects the measurement accuracy of both RSSI and phase information. In addition, because RFID systems are typically operated in short distances, RSSI and phase measurements are also coupled with the reader and tag antenna patterns, making accurate RFID localization very complicated and challenging. In this paper, we develop new methods to localize RFID tags or readers by exploiting sparse signal recovery techniques. The proposed method allows the channel environment and antenna patterns to be taken into account and be properly compensated at a low computational cost. As such, the proposed technique yields superior performance in challenging operation environments with the above-mentioned impairments.
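
    A hedged sketch of the multilateration step that the range estimates feed into, using the standard linearized least-squares position fix; the anchor layout and noise-free ranges are illustrative, and the paper's sparse-recovery and antenna-pattern compensation machinery is not shown:

```python
import numpy as np

def multilaterate(anchors, ranges):
    # Linearized least-squares fix: subtracting the first anchor's range
    # equation removes the quadratic unknown terms, leaving A x = b.
    anchors = np.asarray(anchors, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    a0, r0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (r0 ** 2 - ranges[1:] ** 2
         + (anchors[1:] ** 2).sum(axis=1) - (a0 ** 2).sum())
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

readers = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]  # reader positions (m)
tag = np.array([3.0, 4.0])
ranges = [float(np.linalg.norm(np.array(r) - tag)) for r in readers]
print(multilaterate(readers, ranges))   # ~[3. 4.]
```

    With multipath-corrupted RSSI or PDOA ranges the same solver degrades quickly, which is the motivation for the compensation techniques the paper develops.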

  10. The status of computing and means of local and external networking at JINR

    Energy Technology Data Exchange (ETDEWEB)

    Dorokhin, A T; Shirikov, V P

    1996-12-31

    The goal of this report is to present a view of the current state of computer support for the different physics research activities at JINR. The JINR network and its applications are considered, and trends in local networks and their connectivity with global networks are discussed. 3 refs.

  11. Inequality measures perform differently in global and local assessments: An exploratory computational experiment

    Science.gov (United States)

    Chiang, Yen-Sheng

    2015-11-01

    Inequality measures are widely used in both academia and the public media to help us understand how incomes and wealth are distributed. They can be used to assess the distribution of a whole society (global inequality) as well as the inequality of actors' referent networks (local inequality). How different is local inequality from global inequality? Formalizing the structure of reference groups as a network, the paper conducted a computational experiment to see how the structure of complex networks influences the difference between global and local inequality as assessed by a selection of inequality measures. It was found that local inequality tends to be higher than global inequality when the population size is large, the network is dense and heterophilously assorted, and the income distribution is less dispersed. The implications of the simulation findings are discussed.
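
    A toy version of such an experiment, assuming the Gini coefficient as the inequality measure and a random reference network; the graph model, edge density, and income distribution are all invented for illustration:

```python
import random

def gini(xs):
    """Gini coefficient of a list of non-negative incomes."""
    xs = sorted(xs)
    n, s = len(xs), sum(xs)
    if s == 0:
        return 0.0
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * cum) / (n * s) - (n + 1) / n

random.seed(0)
n = 200
income = [random.lognormvariate(0, 0.75) for _ in range(n)]

# Random (Erdos-Renyi-style) reference network.
neighbors = {i: set() for i in range(n)}
for i in range(n):
    for j in range(i + 1, n):
        if random.random() < 0.05:
            neighbors[i].add(j)
            neighbors[j].add(i)

global_g = gini(income)
# Local inequality: Gini over each actor's ego network (self plus neighbors).
local_gs = [gini([income[i]] + [income[j] for j in neighbors[i]])
            for i in range(n) if neighbors[i]]
print(f"global {global_g:.3f}  mean local {sum(local_gs) / len(local_gs):.3f}")
```

    Varying the edge probability, the assortment rule, and the income dispersion in such a script reproduces the kind of global-versus-local comparison the paper studies.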

  12. Position Localization with Impulse Ultra Wide Band

    National Research Council Canada - National Science Library

    Zhang, Guoping; Rao, S. V

    2005-01-01

    ...) bias and clock jittering error of TDOA measurement. In our prototype design, we exploit impulse UWB techniques to implement a very low cost localization system that can achieve centimeter-level accuracy for indoor applications...

  13. Computing platforms for software-defined radio

    CERN Document Server

    Nurmi, Jari; Isoaho, Jouni; Garzia, Fabio

    2017-01-01

    This book addresses Software-Defined Radio (SDR) baseband processing from the computer architecture point of view, providing a detailed exploration of different computing platforms by classifying different approaches, highlighting the common features related to SDR requirements, and showing the pros and cons of the proposed solutions. Coverage includes architectures exploiting parallelism by extending the single-processor environment (such as VLIW, SIMD, and TTA approaches), multi-core platforms distributing the computation to either a homogeneous array or a set of specialized heterogeneous processors, and architectures exploiting fine-grained, coarse-grained, or hybrid reconfigurability. Describes a computer engineering approach to SDR baseband processing hardware; Discusses implementation of numerous compute-intensive signal processing algorithms on single and multicore platforms; Enables deep understanding of optimization techniques related to power and energy consumption of multicore platforms using several basic a...

  14. Development of a locally mass flux conservative computer code for calculating 3-D viscous flow in turbomachines

    Science.gov (United States)

    Walitt, L.

    1982-01-01

    The VANS successive approximation numerical method was extended to the computation of three-dimensional, viscous, transonic flows in turbomachines. A cross-sectional computer code, which conserves mass flux at each point of the cross-sectional surface of computation, was developed. In the VANS numerical method, the cross-sectional computation follows a blade-to-blade calculation. Numerical calculations were made for an axial annular turbine cascade and a transonic, centrifugal impeller with splitter vanes. The subsonic turbine cascade computation was generated on a blade-to-blade surface to evaluate the accuracy of the blade-to-blade mode of marching. Calculated blade pressures at the hub, mid, and tip radii of the cascade agreed with corresponding measurements. The transonic impeller computation was conducted to test the newly developed, locally mass flux conservative cross-sectional computer code. Both blade-to-blade and cross-sectional modes of calculation were implemented for this problem. A triplet point shock structure was computed in the inducer region of the impeller. In addition, time-averaged shroud static pressures generally agreed with measured shroud pressures. It is concluded that the blade-to-blade computation produces a useful engineering flow field in regions of subsonic relative flow, and that cross-sectional computation, with a locally mass flux conservative continuity equation, is required to compute the shock waves in regions of supersonic relative flow.

  15. Exploiting VM/XA

    International Nuclear Information System (INIS)

    Boeheim, C.

    1990-03-01

    The Stanford Linear Accelerator Center has recently completed a conversion to IBM's VM/XA SP Release 2 operating system. The primary physics application had been constrained by the previous 16 megabyte memory limit. Work is underway to enable this application to exploit the new features of VM/XA. This paper presents a brief tutorial on how to convert an application to exploit VM/XA and discusses some of the SLAC experiences in doing so. 13 figs

  16. Exploiting Redundancy in an OFDM SDR Receiver

    Directory of Open Access Journals (Sweden)

    Tomas Palenik

    2009-01-01

    Full Text Available A common OFDM system contains redundancy necessary to mitigate interblock interference and to allow computationally effective single-tap frequency domain equalization in the receiver. Assuming the system implements an outer error correcting code and channel state information is available in the receiver, we show that it is possible to understand the cyclic prefix insertion as a weak inner ECC encoding and to exploit the introduced redundancy to slightly improve the error performance of such a system. In this paper, an easy-to-implement modification to an existing SDR OFDM receiver is presented. This modification enables the utilization of the prefix redundancy, while preserving full compatibility with existing OFDM-based communication standards.
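
    A numpy sketch of the redundancy in question, assuming QPSK subcarriers and a benign channel; the crude averaging of the duplicated samples below merely demonstrates that the cyclic prefix carries exploitable information, and is not the paper's receiver modification:

```python
import numpy as np

N, CP = 64, 16                                   # subcarriers, cyclic prefix length
rng = np.random.default_rng(2)

# One OFDM symbol: QPSK on N subcarriers, brought to the time domain by an IFFT.
bits = rng.integers(0, 2, size=(N, 2))
qpsk = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)
symbol = np.fft.ifft(qpsk)

# Cyclic prefix insertion: the last CP time samples are transmitted twice.
tx = np.concatenate([symbol[-CP:], symbol])

# After a benign channel the receiver holds two noisy copies of those samples;
# averaging them is a crude way of exploiting the redundancy.
rx = tx + 0.05 * (rng.normal(size=tx.shape) + 1j * rng.normal(size=tx.shape))
head, body = rx[:CP], rx[CP:].copy()
body[-CP:] = 0.5 * (body[-CP:] + head)           # combine the duplicated samples
print(np.abs(np.fft.fft(body) - qpsk).mean())    # residual demodulation error
```

    In a dispersive channel the prefix absorbs interblock interference instead, which is why any practical exploitation of this redundancy, like the paper's, must be channel-aware.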

  17. The economics of exploiting gas hydrates

    International Nuclear Information System (INIS)

    Döpke, Lena-Katharina; Requate, Till

    2014-01-01

    We investigate the optimal exploitation of methane hydrates, a recent discovery of methane resources under the sea floor, mainly located along the continental margins. Combustion of methane (releasing CO2) and leakage through blow-outs (releasing CH4) contribute to the accumulation of greenhouse gases. A second externality arises since removing solid gas hydrates from the sea bottom destabilizes continental margins and thus increases the risk of marine earthquakes. We show that in such a model three regimes can occur: i) resource exploitation will be stopped in finite time, and some of the resource will stay in situ; ii) the resource will be used up completely in finite time; and iii) the resource will be exhausted in infinite time. We also show how to internalize the externalities by policy instruments. - Highlights: • We set up a model of optimal gas hydrate exploitation. • We incorporate two types of damages: contribution to global warming and geo-hazards. • We characterize optimal exploitation paths and study decentralization with an exploitation tax. • Three regimes can occur: i) exploitation in finite time with some of the stock remaining in situ; ii) exploitation in finite time with the resource exhausted; iii) exploitation and exhaustion in infinite time.

  18. Taiwanese Consumers’ Perceptions of Local and Global Brands: An Investigation in Taiwan Computer Industry

    OpenAIRE

    Hsieh, Ya-Yun

    2010-01-01

    This study aims to investigate how consumers in a newly developed country, Taiwan, perceive local brands and global brands in the computer industry. To gain an in-depth understanding and evaluate the factors that influence consumers' assessment of local and global brands, the country-of-origin effect and the association of brand origin are investigated; the effect of consumer ethnocentrism is addressed; and the cultural aspects of collectivism and the concept of face are examined. The study adopts...

  19. Local and Long Distance Computer Networking for Science Classrooms. Technical Report No. 43.

    Science.gov (United States)

    Newman, Denis

    This report describes Earth Lab, a project which is demonstrating new ways of using computers for upper-elementary and middle-school science instruction, and finding ways to integrate local-area and telecommunications networks. The discussion covers software, classroom activities, formative research on communications networks, and integration of…

  20. Conservational Exploitation as a Sustainable Development Strategy for a Small Township: The Example of Waipu District in Taichung, Taiwan

    Directory of Open Access Journals (Sweden)

    Li-Wei Liu

    Full Text Available ABSTRACT: The contemporary form of urban-rural interfaces is moving toward a more hybridized identity, since nowadays ambiguous places prevail. In particular, contemporary rural settlements are increasingly exhibiting urban characteristics in terms of their built environments. Many traditional agricultural landscapes in the world have thus been altered due to rapid urbanization, such that in order to maintain environmental diversity, the conservation of agricultural landscapes and their functions has become an important issue. Waipu District is located in the northwest of Taichung, a peri-urban area of Taichung City. This means that Waipu is rural in appearance at present but may undergo rapid urbanization in the near future. In order to protect quality local characteristics from land exploitation and move towards sustainable development, this paper depicts a strategy of conservational exploitation which is adopted to direct the future development of Waipu so as to maintain its characteristics and environmental sustainability. This paper also briefly documents the process of an experimental practice through which graduate students with landscape architecture, recreation, and urban planning backgrounds have taken part in an experimental project with professional social services in order to develop master plans for Waipu. By means of field surveys, interviews with local people, and discussions with local officials, a landscape master plan, a tourism plan, and a spatial plan developed by the students have been provided to the Waipu District Office. These plans are intended to help Waipu to develop its unique characteristics, improve local landscapes, and promote local economic development. In order to conserve rural landscapes, cultural landscaping and cultural tourism have been identified as development concepts for future development. Conservational exploitation which focuses on maintaining cultural landscapes and promoting cultural tourism can

  1. Computing wave functions in multichannel collisions with non-local potentials using the R-matrix method

    Science.gov (United States)

    Bonitati, Joey; Slimmer, Ben; Li, Weichuan; Potel, Gregory; Nunes, Filomena

    2017-09-01

    The calculable form of the R-matrix method has previously been shown to be a useful tool for approximately solving the Schrödinger equation in nuclear scattering problems. We use this technique, combined with Gauss quadrature for the Lagrange-mesh method, to efficiently solve for the wave functions of projectile nuclei in low energy collisions (1-100 MeV) involving an arbitrary number of channels. We include the local Woods-Saxon potential, the non-local potential of Perey and Buck, a Coulomb potential, and a coupling potential to computationally solve for the wave function of two nuclei at short distances. Object oriented programming is used to increase modularity, and parallel programming techniques are introduced to reduce computation time. We conclude that the R-matrix method is an effective method to predict the wave functions of nuclei in scattering problems involving both multiple channels and non-local potentials. Michigan State University iCER ACRES REU.
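
    A hedged sketch of why quadrature meshes pair naturally with non-local potentials: the integral term turns into a dense matrix on the nodes. The Gaussian-form kernel below is a stand-in for the Perey-Buck one, and the mesh size and nonlocality range are invented for illustration:

```python
import numpy as np

def gauss_mesh(a, b, n):
    """Gauss-Legendre nodes and weights mapped from [-1, 1] to [a, b]."""
    x, w = np.polynomial.legendre.leggauss(n)
    return 0.5 * (b - a) * x + 0.5 * (a + b), 0.5 * (b - a) * w

# Discretize the non-local term  int K(r, r') psi(r') dr'  as a matrix-vector product.
n = 80
r, w = gauss_mesh(0.0, 15.0, n)              # radial mesh (fm)
beta = 0.85                                   # nonlocality range (fm), illustrative
K = np.exp(-((r[:, None] - r[None, :]) / beta) ** 2)   # Gaussian stand-in kernel
V_nonlocal = K * w[None, :]                   # row i holds the quadrature of K(r_i, .)

psi = np.exp(-r)                              # any trial wave function on the mesh
print((V_nonlocal @ psi)[:3])                 # non-local potential acting on psi
```

    Once the non-local potential is a matrix like `V_nonlocal`, adding channels simply blocks the matrix, which is how an arbitrary number of coupled channels can be handled uniformly.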

  2. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  3. Exploitation in International Paid Surrogacy Arrangements

    OpenAIRE

    Wilkinson, Stephen

    2015-01-01

    Abstract Many critics have suggested that international paid surrogacy is exploitative. Taking such concerns as its starting point, this article asks: (1) how defensible is the claim that international paid surrogacy is exploitative and what could be done to make it less exploitative? (2) In the light of the answer to (1), how strong is the case for prohibiting it? Exploitation could in principle be dealt with by improving surrogates' pay and conditions. However, doing so may exacerbate probl...

  4. Quantum computing based on semiconductor nanowires

    NARCIS (Netherlands)

    Frolov, S.M.; Plissard, S.R.; Nadj-Perge, S.; Kouwenhoven, L.P.; Bakkers, E.P.A.M.

    2013-01-01

    A quantum computer will have computational power beyond that of conventional computers, which can be exploited for solving important and complex problems, such as predicting the conformations of large biological molecules. Materials play a major role in this emerging technology, as they can enable

  5. Paradise lost : Sovereign State Interest, Global Resource Exploitation and the Politics of Human Rights

    NARCIS (Netherlands)

    Augenstein, Daniel

    Taking its cue from the US Supreme Court judgment in Kiobel that restricted the extraterritorial reach of the Alien Tort Claims Act, this article explores how sovereignty structures the relationship between global resource exploitation and the localization of human rights in the international order

  6. Profits and Exploitation: A Reappraisal

    OpenAIRE

    Yoshihara, Naoki; Veneziani, Roberto

    2011-01-01

    This paper provides a mathematical analysis of the Marxian theory of the exploitation of labour in general equilibrium models. The two main definitions of Marxian exploitation in the literature, proposed by Morishima (1974) and Roemer (1982), respectively, are analysed in the context of general convex economies. It is shown that, contrary to the received view, in general these definitions do not preserve the so-called Fundamental Marxian Theorem (FMT), which states that the exploitation of la...

  7. The microeconomics of sexual exploitation of girls and young women in the Peruvian Amazon.

    Science.gov (United States)

    Mujica, Jaris

    2013-01-01

    This paper examines the sexual exploitation of girls and young women as an increasing phenomenon within the extractive industries of wood, oil, minerals and gas in Peruvian Amazonia. The analysis focuses on the city of Pucallpa and the northern part of the Ucayali River and aims to identify the social and economic dynamics underpinning the commercial sexual exploitation of female children and teenagers around the main river port. The study describes the local operating mechanisms of bars and restaurants in the port, the demand for and perceptions of the sexual exploitation of children and teenagers, and the economic logic that it entails. Using a discourse analytic approach, it is argued that this is a business whose profitability is tied to the trade in alcoholic beverages and foods and which responds to a set of family connections and networks.

  8. STAR: a local network system for real-time management of imagery data

    Energy Technology Data Exchange (ETDEWEB)

    Chuan-lin Wu; Tse-yun Feng; Min-chang Lin

    1982-10-01

    The overall architecture of a local computer network, STAR, is described. The objective is to accomplish a cost-effective system which provides multiple users a real-time service for manipulating very large volumes of imagery information and data. STAR consists of a reconfigurable communication subnet (starnet), heterogeneous resource units, and distributed-control software entities. Architectural aspects of a fault-tolerant communication subnet, distributed database management, and a distributed scheduling strategy for configuring a desirable computation topology are explored. A model for comparing cost-effectiveness among starnet, crossbar, and multiple buses is included. It is concluded that starnet outperforms the other two when the number of units to be connected is larger than 64. This project serves as a research tool for using current and projected technology to innovate better schemes for parallel image processing. 30 references.

  9. Estimating the Counterparty Risk Exposure by Using the Brownian Motion Local Time

    Directory of Open Access Journals (Sweden)

    Bonollo Michele

    2017-06-01

    Full Text Available In recent years, the counterparty credit risk measure, namely the default risk in over-the-counter (OTC) derivatives contracts, has received great attention from banking regulators, specifically within the frameworks of Basel II and Basel III. More explicitly, to obtain the related risk figures, one is first obliged to compute intermediate output functionals related to the mark-to-market position at a given time not exceeding a positive and finite time horizon. The latter implies that an enormous amount of computational effort is needed, with related highly time-consuming procedures to be carried out, translating into significant costs. To overcome this issue, we propose a smart exploitation of the properties of the local time spent by the Brownian motion close to a given value.
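
    For reference, the local time of a Brownian motion \(B\) at a level \(a\), the object being exploited here, is the occupation density

```latex
L_t^a = \lim_{\varepsilon \to 0} \frac{1}{2\varepsilon} \int_0^t \mathbf{1}_{\{|B_s - a| < \varepsilon\}} \, ds
```

    i.e., a normalized measure of how much time the path spends near the level \(a\) up to time \(t\). Presumably this is what lets expectations of barrier-style functionals of the mark-to-market process be evaluated analytically rather than by brute-force simulation over many intermediate dates.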

  10. Event detection and localization for small mobile robots using reservoir computing.

    Science.gov (United States)

    Antonelo, E A; Schrauwen, B; Stroobandt, D

    2008-08-01

    Reservoir Computing (RC) techniques use a fixed (usually randomly created) recurrent neural network, or more generally any dynamic system operating at the edge of stability, where only a linear static readout output layer is trained by standard linear regression methods. In this work, RC is used for detecting complex events in autonomous robot navigation. This can be extended to robot localization tasks based solely on a few low-range, high-noise sensor readings. The robot thus builds an implicit map of the environment (after learning) that is used for efficient localization by simply processing the input stream of distance sensors. These techniques are demonstrated both in a simple simulation environment and in the physically realistic Webots simulation of the commercially available e-puck robot, using several complex and even dynamic environments.
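
    A minimal echo-state-network sketch of the RC recipe described above: a fixed random reservoir rescaled toward the edge of stability, with only a ridge-regression readout trained. The dimensions, inputs, and toy targets are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n_in, n_res, T = 4, 200, 1000

# Fixed random reservoir, rescaled to spectral radius < 1 ("edge of stability").
W = rng.normal(size=(n_res, n_res))
W *= 0.95 / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))

u = rng.uniform(size=(T, n_in))                 # e.g. a distance-sensor stream
y = np.stack([u[:, 0] * u[:, 1], u[:, 2]], 1)   # toy targets (event labels, pose, ...)

# Run the reservoir; the recurrent weights are never trained.
x = np.zeros(n_res)
states = np.empty((T, n_res))
for t in range(T):
    x = np.tanh(W @ x + W_in @ u[t])
    states[t] = x

# Ridge-regression readout: the only trained component in RC.
lam = 1e-6
W_out = np.linalg.solve(states.T @ states + lam * np.eye(n_res), states.T @ y)
print(np.abs(states @ W_out - y).mean())        # training error of the linear readout
```

    Replacing the toy targets with event labels or pose estimates gives the detection and localization setup of the paper, with the reservoir state acting as the implicit map.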

  11. Efficient Backprojection-Based Synthetic Aperture Radar Computation with Many-Core Processors

    Directory of Open Access Journals (Sweden)

    Jongsoo Park

    2013-01-01

    Full Text Available Tackling computationally challenging problems with high efficiency often requires the combination of algorithmic innovation, advanced architecture, and thorough exploitation of parallelism. We demonstrate this synergy through synthetic aperture radar (SAR via backprojection, an image reconstruction method that can require hundreds of TFLOPS. Computation cost is significantly reduced by our new algorithm of approximate strength reduction; data movement cost is economized by software locality optimizations facilitated by advanced architecture support; parallelism is fully harnessed in various patterns and granularities. We deliver over 35 billion backprojections per second throughput per compute node on an Intel® Xeon® processor E5-2670-based cluster, equipped with Intel® Xeon Phi™ coprocessors. This corresponds to processing a 3K×3K image within a second using a single node. Our study can be extended to other settings: backprojection is applicable elsewhere including medical imaging, approximate strength reduction is a general code transformation technique, and many-core processors are emerging as a solution to energy-efficient computing.
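
    A hedged sketch of the backprojection inner loop the paper accelerates, with a simplified data model and nearest-bin interpolation; the geometry and sizes are invented for illustration:

```python
import numpy as np

def backproject(pulses, platform_positions, pixels, bin_size):
    # Naive backprojection: for every pixel, sum the range-matched sample of
    # each range-compressed pulse (nearest-bin interpolation for brevity).
    image = np.zeros(len(pixels), dtype=complex)
    for pulse, pos in zip(pulses, platform_positions):
        r = np.linalg.norm(pixels - pos, axis=1)          # pixel-to-platform range
        idx = np.clip((r / bin_size).astype(int), 0, len(pulse) - 1)
        image += pulse[idx]
    return image

rng = np.random.default_rng(4)
pulses = rng.normal(size=(128, 512)) + 1j * rng.normal(size=(128, 512))
platform = np.c_[np.linspace(-50, 50, 128),               # along-track positions (m)
                 np.full(128, -500.0), np.full(128, 300.0)]
pixels = np.c_[rng.uniform(-10, 10, (1024, 2)), np.zeros(1024)]  # ground grid, z = 0
print(np.abs(backproject(pulses, platform, pixels, bin_size=1.5))[:3])
```

    The pixel loop is embarrassingly parallel and memory-bound, which is why the paper's combination of strength reduction, locality optimization, and many-core parallelism pays off so heavily.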

  12. Computationally efficient near-field source localization using third-order moments

    Science.gov (United States)

    Chen, Jian; Liu, Guohong; Sun, Xiaoying

    2014-12-01

    In this paper, a third-order moment-based estimation of signal parameters via rotational invariance techniques (ESPRIT) algorithm is proposed for passive localization of near-field sources. By properly choosing sensor outputs of the symmetric uniform linear array, two special third-order moment matrices are constructed, in which the steering matrix is a function of the electric angle γ, while the rotational factor is a function of the electric angles γ and ϕ. With a singular value decomposition (SVD) operation, all directions of arrival (DOAs) are estimated via polynomial rooting. After substituting the DOA information into the steering matrix, the rotational factor is determined via total least squares (TLS), and the related range estimates are computed. Compared with the high-order ESPRIT method, the proposed algorithm requires a lower computational burden, and it avoids the parameter-match procedure. Computer simulations are carried out to demonstrate the performance of the proposed algorithm.
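
    For context, under the usual Fresnel (near-field) approximation for a symmetric uniform linear array with inter-sensor spacing d, the phase at the m-th sensor is commonly modeled as below; conventions and signs differ between papers, so take this as the generic model rather than the authors' exact parameterization:

```latex
\tau_m \approx \gamma\, m + \phi\, m^2, \qquad
\gamma = -\frac{2\pi d \sin\theta}{\lambda}, \qquad
\phi = \frac{\pi d^2 \cos^2\theta}{\lambda\, r}
```

    Since γ depends only on the bearing θ while ϕ mixes θ and the range r, a two-step scheme of the kind described above can first root out the DOAs from a γ-only matrix and then recover the ranges from ϕ.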

  13. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  14. Redefining Exploitation

    DEFF Research Database (Denmark)

    Agarwala, Rina

    2016-01-01

    This article examines how self-employed workers are organizing in the garments and waste collection industries in India. Although the question of who is profiting from self-employed workers’ labor is complex, the cases outlined in this paper highlight telling instances of how some self-employed workers are organizing as workers. They are fighting labor exploitation by redefining the concept to include additional exploitation axes (from the state and middle class) and forms (including sexual). In doing so, they are redefining potential solutions, including identities and material benefits, to fit their unique needs. By expanding the category of “workers” beyond those defined by a narrow focus on a standard employer-employee relationship, these movements are also fighting exclusion from earlier labor protections by increasing the number of entitled beneficiaries. These struggles provide an important...

  15. Experimental demonstration of deterministic one-way quantum computing on a NMR quantum computer

    OpenAIRE

    Ju, Chenyong; Zhu, Jing; Peng, Xinhua; Chong, Bo; Zhou, Xianyi; Du, Jiangfeng

    2008-01-01

    One-way quantum computing is an important and novel approach to quantum computation. By exploiting the existing particle-particle interactions, we report the first experimental realization of the complete process of the deterministic one-way quantum Deutsch-Jozsa algorithm in NMR, including graph state preparation, single-qubit measurements, and feed-forward corrections. The findings in our experiment may shed light on future scalable one-way quantum computation.

  16. Exploitation of geographic information system at mapping and modelling of selected soil parameters

    International Nuclear Information System (INIS)

    Palka, B.; Makovnikova, J.; Siran, M.

    2005-01-01

    In this presentation the authors describe the use of computers and geographic information systems (GIS) for the effective use of the soil fund, the rational exploitation and organization of the agricultural soil fund on the territory of the Slovak Republic, and its monitoring and modelling. The use and creation of some geographically oriented information systems and databases about soils, as well as present trends, are discussed

  17. Exploration, Exploitation, and Organizational Coordination Mechanisms

    Directory of Open Access Journals (Sweden)

    Silvio Popadiuk

    2016-03-01

    Full Text Available This paper presents an empirical relationship among exploration, exploitation, and organizational coordination mechanisms, classified as the centralization of decision-making, formalization, and connectedness. In order to analyze the findings of this survey, we used two techniques: Principal Component Analysis (PCA) and Partial Least Squares Path Modeling (PLS-PM). Our analysis was supported by 249 answers from managers of companies located in Brazil (convenience sampling). Contrary to expectations, centralization and exploitation were negatively associated. Our data support the research hypothesis that formalization is positively associated with exploitation. Although the relationship between formalization and exploration was significant, the result is contrary to the research hypothesis we made. The relationships between connectedness and exploitation, and between connectedness and exploration, were both positive and significant. This means that the more connectedness increases, the higher the likelihood of exploitation and exploration.

  18. Role of Multislice Computed Tomography and Local Contrast in the Diagnosis and Characterization of Choanal Atresia

    Directory of Open Access Journals (Sweden)

    Khaled Al-Noury

    2011-01-01

    Full Text Available Objective. To illustrate the role of multislice computed tomography and local contrast instillation in the diagnosis and characterization of choanal atresia, and to review the common associated radiological findings. Methods. We analyzed 9 pediatric patients (5 males and 4 females) with suspected choanal atresia by multislice computed tomography. We recorded the type of atresia plate and other congenital malformations of the skull. Results. Multislice computed tomography with local contrast instilled delineated the posterior choanae. Three patients had unilateral mixed membranous and bony atresia. Three patients had unilateral pure bony atresia. Only 1 of 7 patients had bilateral bony atresia. The scans also showed other congenital anomalies in the head region. One patient had an ear abnormality. One patient had congenital nasal pyriform aperture stenosis. One of these patients had several congenital abnormalities, including cardiac and renal deformities and a hypoplastic lateral semicircular canal. Of the 6 patients diagnosed to have choanal atresia, 1 patient had esophageal atresia and a tracheoesophageal fistula. The remaining patients had no other CHARGE syndrome lesions. Conclusions. Local contrast medium with the application of the low-dose technique helps to delineate the cause of the nasal obstruction while avoiding a high radiation dose to the child.

  19. Computing the Local Field Potential (LFP) from Integrate-and-Fire Network Models

    DEFF Research Database (Denmark)

    Mazzoni, Alberto; Linden, Henrik; Cuntz, Hermann

    2015-01-01

    Leaky integrate-and-fire (LIF) network models are commonly used to study how the spiking dynamics of neural networks changes with stimuli, tasks or dynamic network states. However, neurophysiological studies in vivo often rather measure the mass activity of neuronal microcircuits with the local field potential (LFP) ... in cases where a single pyramidal population dominates the LFP generation, and thereby facilitate quantitative comparison between computational models and experimental LFP recordings in vivo.

  20. Energy and environment: the exploitation of the oil sands in Alberta, Canada

    Directory of Open Access Journals (Sweden)

    Stéphane Héritier

    2007-09-01

    Full Text Available Discovered in the 1930s, the oil (tar) sands deposits of western Canada are characterized by intense exploitation, which has accelerated since the 1990s, driven by exploding world demand and high oil prices. Thanks to this activity, Alberta has become one of the most dynamic provinces in Canada. Extraction, licensed to national and international oil companies, helps stimulate both the economy and the demography of the province, where incomes and general economic conditions have become particularly attractive. At the same time, Alberta and Canada find themselves in a delicate position with respect to their international commitments, since oil sands exploitation and production have major environmental effects, such as increased greenhouse gas emissions, while local and regional economies base their growth plans on the revenues generated by this exploitation.

  1. Implementation of Computer Assisted Test Selection System in Local Governments

    Directory of Open Access Journals (Sweden)

    Abdul Azis Basri

    2016-05-01

    Full Text Available The Computer Assisted Test (CAT) selection system was first applied in 2013 as an evaluative reform of the civil-servant selection system across all government areas. During its first nationwide implementation phase in 2014, the selection system ran into trouble in several areas, for instance with the registration procedure and the passing grade. The main objective of this essay is to describe the implementation of the new civil-servant selection system in local governments and to assess its effectiveness. The essay combines a literature study with a field survey; data were collected through interviews, observations, and documentation from various sources, and analyzed through reduction, data display, and verification to draw conclusions. The results show that, although a few parts of the system caused problems, such as the registration phase, almost all phases of the CAT selection system in local government areas, including preparation, implementation, and result processing, worked well. The system also fulfilled two of three effectiveness criteria for a selection system, namely accuracy and trustworthiness. Therefore, this selection system can be regarded as an effective way to select new civil servants. As a suggestion, local governments should prepare thoroughly for all phases of the test, establish good feedback as an evaluation mechanism, and work together with the central government to find, fix, and improve supporting infrastructure and the competency of local residents.

  2. Daily Megavoltage Computed Tomography in Lung Cancer Radiotherapy: Correlation Between Volumetric Changes and Local Outcome

    International Nuclear Information System (INIS)

    Bral, Samuel; De Ridder, Mark; Duchateau, Michael; Gevaert, Thierry; Engels, Benedikt; Schallier, Denis; Storme, Guy

    2011-01-01

    Purpose: To assess the predictive or comparative value of volumetric changes measured on daily megavoltage computed tomography during radiotherapy for lung cancer. Patients and Methods: We included 80 patients with locally advanced non-small-cell lung cancer treated with image-guided intensity-modulated radiotherapy. The radiotherapy was combined with concurrent chemotherapy, combined with induction chemotherapy, or given as primary treatment. Patients entered two parallel studies with moderately hypofractionated radiotherapy. Tumor volume contouring was done on the daily acquired images. A regression coefficient was derived from the volumetric changes on megavoltage computed tomography, and its predictive value was validated. Logarithmic or polynomial fits were applied to the intratreatment changes to compare the different treatment schedules radiobiologically. Results: Regardless of the treatment type, a high regression coefficient during radiotherapy predicted a significantly prolonged cause-specific local progression-free survival (p = 0.05). Significant differences were found in the response during radiotherapy. The significant difference in volumetric treatment response between radiotherapy with concurrent chemotherapy and radiotherapy plus induction chemotherapy translated into a superior long-term local progression-free survival for concurrent chemotherapy (p = 0.03). An enhancement ratio of 1.3 was measured for the platinum/taxane doublet used, in comparison with radiotherapy alone. Conclusion: Contouring on daily megavoltage computed tomography images during radiotherapy enabled us to predict the efficacy of a given treatment. The significant differences in volumetric response between treatment strategies make it a possible tool for future schedule comparison.
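
    A hedged illustration of deriving such a per-patient regression coefficient, assuming a log-linear (exponential shrinkage) fit of contoured tumor volume against fraction number; the paper's exact logarithmic or polynomial fits may differ, and the data below are synthetic:

```python
import numpy as np

def volume_regression_coefficient(fractions, volumes):
    """Slope of log-volume versus treatment fraction: a per-patient summary
    of how fast the contoured tumor volume shrinks during radiotherapy."""
    slope, _intercept = np.polyfit(fractions, np.log(volumes), 1)
    return slope

days = np.arange(1, 21)                        # 20 daily MVCT contours
vols = 120.0 * np.exp(-0.02 * days) \
       * np.random.default_rng(5).normal(1.0, 0.03, 20)   # cc, synthetic noise
print(volume_regression_coefficient(days, vols))          # ~ -0.02 per fraction
```

    A strongly negative coefficient corresponds to the fast-responding tumors that, in the study, went on to show prolonged local progression-free survival.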

  3. On the Exploitation of Sensitivity Derivatives for Improving Sampling Methods

    Science.gov (United States)

    Cao, Yanzhao; Hussaini, M. Yousuff; Zang, Thomas A.

    2003-01-01

    Many application codes, such as finite-element structural analyses and computational fluid dynamics codes, are capable of producing many sensitivity derivatives at a small fraction of the cost of the underlying analysis. This paper describes a simple variance reduction method that exploits such inexpensive sensitivity derivatives to increase the accuracy of sampling methods. Three examples, including a finite-element structural analysis of an aircraft wing, are provided that illustrate an order of magnitude improvement in accuracy for both Monte Carlo and stratified sampling schemes.
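
    A minimal sketch of this idea, using the cheap gradient at the nominal input as a control variate whose expectation is known exactly; the function, sampling distribution, and sample size are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(6)

def f(x):                      # stand-in for an expensive analysis code
    return np.sin(x[..., 0]) + 0.5 * x[..., 1] ** 2

mu = np.array([0.3, 1.0])      # nominal input
grad = np.array([np.cos(mu[0]), mu[1]])     # sensitivity derivatives at mu

n = 2000
X = mu + 0.1 * rng.normal(size=(n, 2))      # sampled input uncertainty

plain = f(X)
# Control variate: the first-order Taylor surrogate, whose mean is exactly
# f(mu) because E[X - mu] = 0 under this sampling.
cv = f(mu) + (X - mu) @ grad
controlled = plain - cv + f(mu)             # unbiased, with the linear part removed

print("plain MC   :", plain.mean(), "+/-", plain.std(ddof=1) / np.sqrt(n))
print("with grads :", controlled.mean(), "+/-", controlled.std(ddof=1) / np.sqrt(n))
```

    The variance reduction comes from subtracting the linear trend that the sensitivity derivatives capture for free, and the same trick applies unchanged within each stratum of a stratified scheme.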

  4. Modern computer hardware and the role of central computing facilities in particle physics

    International Nuclear Information System (INIS)

    Zacharov, V.

    1981-01-01

    Important recent changes in the hardware technology of computer system components are reviewed, and the impact of these changes is assessed on the present and future pattern of computing in particle physics. The place of central computing facilities is particularly examined, to answer the important question of what, if anything, their future role should be. Parallelism in computing system components is considered to be an important property that can be exploited with advantage. The paper includes a short discussion of the position of communications and network technology in modern computer systems. (orig.)

  5. Dynamic provisioning of local and remote compute resources with OpenStack

    Science.gov (United States)

    Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.

    2015-12-01

    Modern high-energy physics experiments rely on the extensive usage of computing resources, both for the reconstruction of measured events as well as for Monte-Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT is participating in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rise in complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the usage of virtualization technologies. This contribution reports on the incorporation of the institute's desktop machines into a private OpenStack Cloud; the OpenStack project has become a widely adopted solution to virtualize hardware and offer additional services like storage and virtual machine management. The additional compute resources provisioned via the virtual machines have been used for Monte-Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows is presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point-of-entry for the user. Evaluations of the performance and stability of this setup and operational experiences are discussed.

  6. Localized Ambient Solidity Separation Algorithm Based Computer User Segmentation

    Science.gov (United States)

    Sun, Xiao; Zhang, Tongda; Chai, Yueting; Liu, Yi

    2015-01-01

    Most popular clustering methods make strong assumptions about the dataset. For example, k-means implicitly assumes that all clusters come from spherical Gaussian distributions which have different means but the same covariance. However, when dealing with datasets that have diverse distribution shapes or high dimensionality, these assumptions might not be valid anymore. In order to overcome this weakness, we proposed a new clustering algorithm named localized ambient solidity separation (LASS) algorithm, using a new isolation criterion called centroid distance. Compared with other density based isolation criteria, our proposed centroid distance isolation criterion addresses the problem caused by high dimensionality and varying density. The experiment on a designed two-dimensional benchmark dataset shows that our proposed LASS algorithm not only inherits the advantage of the original dissimilarity increments clustering method to separate naturally isolated clusters but also can identify clusters which are adjacent, overlapping, and under background noise. Finally, we compared our LASS algorithm with the dissimilarity increments clustering method on a massive computer user dataset with over two million records that contains demographic and behavioral information. The results show that the LASS algorithm works extremely well on this computer user dataset and can gain more knowledge from it. PMID:26221133
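
    As a heavily hedged toy illustration of using distance-to-centroid as an isolation signal (this is not the LASS algorithm itself, whose details are in the paper; the growth rule and `jump` threshold here are invented):

```python
import numpy as np

def grow_cluster(points, seed_idx, jump=2.0):
    """Grow a cluster from a seed by always absorbing the nearest remaining
    point, stopping when that point's distance to the current centroid jumps
    by more than `jump` times the running mean distance (isolation)."""
    members = [seed_idx]
    rest = set(range(len(points))) - {seed_idx}
    dists = []
    while rest:
        centroid = points[members].mean(axis=0)
        j = min(rest, key=lambda i: np.linalg.norm(points[i] - centroid))
        d = np.linalg.norm(points[j] - centroid)
        if dists and d > jump * np.mean(dists):
            break                      # the next candidate is isolated: stop
        members.append(j)
        rest.discard(j)
        dists.append(d)
    return members

rng = np.random.default_rng(7)
pts = np.vstack([rng.normal(0, 0.3, (30, 2)), rng.normal(5, 0.3, (30, 2))])
print(len(grow_cluster(pts, 0)))       # ~30: growth stops at the distant cluster
```

    The appeal of a centroid-based signal, as the abstract argues, is that it does not presuppose a particular density profile, which density-based isolation criteria struggle with under varying density and high dimensionality.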

  7. Exponential rise of dynamical complexity in quantum computing through projections.

    Science.gov (United States)

    Burgarth, Daniel Klaus; Facchi, Paolo; Giovannetti, Vittorio; Nakazato, Hiromichi; Pascazio, Saverio; Yuasa, Kazuya

    2014-10-10

    The ability of quantum systems to host exponentially complex dynamics has the potential to revolutionize science and technology. Therefore, much effort has been devoted to developing protocols for computation, communication and metrology which exploit this scaling, despite formidable technical difficulties. Here we show that the mere frequent observation of a small part of a quantum system can turn its dynamics from a very simple one into an exponentially complex one, capable of universal quantum computation. After discussing examples, we go on to show that this effect is generally to be expected: almost any quantum dynamics becomes universal once 'observed' as outlined above. Conversely, we show that any complex quantum dynamics can be 'purified' into a simpler one in larger dimensions. We conclude by demonstrating that even local noise can lead to an exponentially complex dynamics.

  9. Helix Nebula and CERN: A Symbiotic approach to exploiting commercial clouds

    CERN Document Server

    Barreiro Megino, Fernando Harald; Kucharczyk, Katarzyna; Medrano Llamas, Ramón; van der Ster, Daniel

    2014-01-01

    The recent paradigm shift toward cloud computing in IT, and general interest in "Big Data" in particular, have demonstrated that the computing requirements of HEP are no longer globally unique. Indeed, the CERN IT department and LHC experiments have already made significant R&D investments in delivering and exploiting cloud computing resources. While a number of technical evaluations of interesting commercial offerings from global IT enterprises have been performed by various physics labs, further technical, security, sociological, and legal issues need to be addressed before their large-scale adoption by the research community can be envisaged. Helix Nebula - the Science Cloud is an initiative that explores these questions by joining the forces of three European research institutes (CERN, ESA and EMBL) with leading European commercial IT enterprises. The goals of Helix Nebula are to establish a cloud platform federating multiple commercial cloud providers, along with new business models, which can sustain ...

  10. Helix Nebula and CERN: A Symbiotic approach to exploiting commercial clouds

    CERN Multimedia

    Barreiro Megino, Fernando Harald; Kucharczyk, Katarzyna; Medrano Llamas, Ramón; van der Ster, Daniel

    2013-01-01

    The recent paradigm shift toward cloud computing in IT, and general interest in "Big Data" in particular, have demonstrated that the computing requirements of HEP are no longer globally unique. Indeed, the CERN IT department and LHC experiments have already made significant R&D investments in delivering and exploiting cloud computing resources. While a number of technical evaluations of interesting commercial offerings from global IT enterprises have been performed by various physics labs, further technical, security, sociological, and legal issues need to be addressed before their large-scale adoption by the research community can be envisaged. Helix Nebula - the Science Cloud is an initiative that explores these questions by joining the forces of three European research institutes (CERN, ESA and EMBL) with leading European commercial IT enterprises. The goals of Helix Nebula are to establish a cloud platform federating multiple commercial cloud providers, along with new business models, which can sustain...

  11. Organ sales: exploitative at any price?

    Science.gov (United States)

    Lawlor, Rob

    2014-05-01

    In many cases, claims that a transaction is exploitative will focus on the details of the transaction, such as the price paid or conditions. For example, in a claim that a worker is exploited, the grounds for the claim are usually that the pay is not sufficient or the working conditions too dangerous. In some cases, however, the claim that a transaction is exploitative is not seen to rely on these finer details. Many, for example, claim that organ sales would be exploitative, in a way that doesn't seem to depend on the details. This article considers, but ultimately rejects, a number of arguments which could be used to defend this sort of claim. © 2012 John Wiley & Sons Ltd.

  12. Extra Facial Landmark Localization via Global Shape Reconstruction

    Directory of Open Access Journals (Sweden)

    Shuqiu Tan

    2017-01-01

    Full Text Available Localizing facial landmarks is a popular topic in the field of face analysis. However, problems arising in practical applications, such as handling pose variations and partial occlusions while maintaining a moderate training model size and computational efficiency, still challenge current solutions. In this paper, we present a global shape reconstruction method for locating extra facial landmarks compared to the facial landmarks used in the training phase. In the proposed method, the reduced configuration of facial landmarks is first decomposed into corresponding sparse coefficients. Then explicit face shape correlations are exploited to regress between sparse coefficients of different facial landmark configurations. Finally, extra facial landmarks are reconstructed by combining the pretrained shape dictionary and the approximation of sparse coefficients. By applying the proposed method, both the training time and the model size of a class of methods which stack local evidence as an appearance descriptor can be scaled down with only a minor compromise in detection accuracy. Extensive experiments prove that the proposed method is feasible and is able to reconstruct extra facial landmarks even under very asymmetrical face poses.
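
    The decompose-then-reconstruct pipeline can be illustrated with a short sketch: sparse coefficients are estimated for a reduced landmark configuration against a shape dictionary, and the full configuration (extra landmarks included) is then rebuilt from a larger dictionary that shares the same atoms. The dictionaries below are random stand-ins and the cross-configuration regression step is elided, so this is a toy under stated assumptions, not the paper's trained model.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(1)
n_atoms, n_small, n_full = 40, 29, 68
# Hypothetical shape dictionaries sharing the same atoms: D_small maps
# sparse codes to the reduced landmark configuration, D_full to the full one.
D_small = rng.normal(size=(2 * n_small, n_atoms))
D_full = rng.normal(size=(2 * n_full, n_atoms))

true_code = np.zeros(n_atoms)
true_code[[3, 17]] = [2.0, -1.0]            # a synthetic sparse shape code
shape_small = D_small @ true_code           # observed reduced configuration

# Step 1: decompose the reduced configuration into sparse coefficients.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=2, fit_intercept=False)
codes = omp.fit(D_small, shape_small).coef_

# Step 2: reconstruct the full configuration, extra landmarks included.
shape_full = D_full @ codes
print(np.allclose(codes, true_code))        # True in this noiseless toy
```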

  13. Indoor localization using magnetic fields

    Science.gov (United States)

    Pathapati Subbu, Kalyan Sasidhar

    Indoor localization consists of locating oneself inside new buildings. GPS does not work indoors due to multipath reflection and signal blockage. WiFi-based systems assume ubiquitous availability, and infrastructure-based systems require expensive installations, making indoor localization an open problem. This dissertation addresses the problem of indoor localization by thoroughly exploiting the indoor ambient magnetic fields, comprising mainly disturbances, termed anomalies, in the Earth's magnetic field caused by pillars, doors and elevators in hallways, which are ferromagnetic in nature. By observing uniqueness in magnetic signatures collected from different campus buildings, the work presents the identification of landmarks and guideposts from these signatures and further develops magnetic maps of buildings - all of which can be used to locate and navigate people indoors. To understand the reason behind these anomalies, first a comparison between the measured and model-generated Earth's magnetic field is made, verifying the presence of a constant field without any disturbances. Then, by modeling the magnetic field behavior of different pillars such as steel-reinforced concrete, solid steel, and other structures like doors and elevators, the interaction of the Earth's field with the ferromagnetic fields is described, thereby explaining the causes of the uniqueness in the signatures that comprise these disturbances. Next, by employing the dynamic time warping algorithm to account for time differences in signatures obtained from users walking at different speeds, an indoor localization application capable of classifying locations using the magnetic signatures is developed solely on the smart phone. The application required users to walk short distances of 3-6 m anywhere in a hallway to be located with accuracies of 80-99%. The classification framework was further validated with over 90% accuracies using model generated magnetic signatures representing
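
    The dynamic time warping step lends itself to a compact sketch: the textbook O(nm) recurrence aligns two signatures sampled at different walking speeds before comparing them. This is the classic algorithm; the sine-plus-bump signal below is a toy stand-in for a real magnetic signature.

```python
import numpy as np

def dtw_distance(sig_a, sig_b):
    """Textbook dynamic time warping distance between two 1-D signatures."""
    n, m = len(sig_a), len(sig_b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(sig_a[i - 1] - sig_b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# A toy "anomaly" walked at two speeds: the slow walk is a stretched
# resampling of the fast one, yet DTW still aligns them cheaply.
t = np.linspace(0.0, 1.0, 80)
fast = np.sin(2 * np.pi * t) + 0.5 * np.exp(-((t - 0.5) ** 2) / 0.01)
slow = np.interp(np.linspace(0.0, 1.0, 130), t, fast)
print(dtw_distance(fast, slow))   # low alignment cost despite different lengths
```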

  14. Mobile clouds exploiting distributed resources in wireless, mobile and social networks

    CERN Document Server

    Fitzek, Frank H P

    2013-01-01

    Includes a preface written by Professor Leonard Kleinrock, Distinguished Professor of Computer Science, UCLA, USA This book discusses and explores the concept of mobile cloud, creating an inspiring research space for exploiting opportunistic resource sharing, and covering from theoretical research approaches to the development of commercially profitable ideas. A mobile cloud is a cooperative arrangement of dynamically connected communication nodes sharing opportunistic resources. In this book, authors provide a comprehensive and motivating overview of this rapidly emerging technology. The b

  15. Efficient Topological Localization Using Global and Local Feature Matching

    Directory of Open Access Journals (Sweden)

    Junqiu Wang

    2013-03-01

    Full Text Available We present an efficient vision-based global topological localization approach in which different image features are used in a coarse-to-fine matching framework. The Orientation Adjacency Coherence Histogram (OACH), a novel image feature, is proposed to improve the coarse localization. The coarse localization results are taken as inputs for the fine localization, which is carried out by matching Harris-Laplace interest points characterized by the SIFT descriptor. The computation of OACHs and interest points is efficient because these features are computed in an integrated process. The matching of local features is improved by using an approximate nearest neighbor search technique. We have implemented and tested the localization system in real environments. The experimental results demonstrate that our approach is efficient and reliable in both indoor and outdoor environments. This work has also been compared with previous works. The comparison results show that our approach has better performance, with a higher correct ratio and lower computational complexity.
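
    For the fine-matching stage, an approximate nearest-neighbour search over local descriptors can be sketched in a few lines: a kd-tree with a nonzero eps gives the approximate query, and a ratio test filters ambiguous matches. The descriptors here are random placeholders, and the 0.8 ratio is a conventional choice rather than the paper's setting.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)
db_descriptors = rng.normal(size=(5000, 128))     # stand-in for SIFT descriptors
query_descriptors = rng.normal(size=(300, 128))

tree = cKDTree(db_descriptors)
# A nonzero eps turns the exact k-nearest-neighbour query into an
# approximate one, trading a little accuracy for speed.
dist, idx = tree.query(query_descriptors, k=2, eps=0.5)

# Lowe-style ratio test: accept a match only when the best neighbour is
# clearly closer than the second best.
good = dist[:, 0] < 0.8 * dist[:, 1]
print(int(good.sum()), "of", len(query_descriptors), "matches kept")
```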

  16. Development of computational infrastructure to support hyper-resolution large-ensemble hydrology simulations from local-to-continental scales

    Data.gov (United States)

    National Aeronautics and Space Administration — Development of computational infrastructure to support hyper-resolution large-ensemble hydrology simulations from local-to-continental scales A move is currently...

  17. Empowering enterprises through next-generation enterprise computing

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Andrade Almeida, João

    Enterprise computing is concerned with exploiting interconnected computers to improve the efficiency and effectiveness of larger companies. Such companies form business organizations that manage various sorts of information, used by disparate groups of people, who are situated at different

  18. Herbivory eliminates fitness costs of mutualism exploiters.

    Science.gov (United States)

    Simonsen, Anna K; Stinchcombe, John R

    2014-04-01

    A common empirical observation in mutualistic interactions is the persistence of variation in partner quality and, in particular, the persistence of exploitative phenotypes. For mutualisms between hosts and symbionts, most mutualism theory assumes that exploiters always impose fitness costs on their host. We exposed legume hosts to mutualistic (nitrogen-fixing) and exploitative (non-nitrogen-fixing) symbiotic rhizobia in field conditions, and manipulated the presence or absence of insect herbivory to determine if the costly fitness effects of exploitative rhizobia are context-dependent. Exploitative rhizobia predictably reduced host fitness when herbivores were excluded. However, insects caused greater damage on hosts associating with mutualistic rhizobia, as a consequence of feeding preferences related to leaf nitrogen content, resulting in the elimination of fitness costs imposed on hosts by exploitative rhizobia. Our experiment shows that herbivory is potentially an important factor in influencing the evolutionary dynamic between legumes and rhizobia. Partner choice and host sanctioning are theoretically predicted to stabilize mutualisms by reducing the frequency of exploitative symbionts. We argue that herbivore pressure may actually weaken selection on choice and sanction mechanisms, thus providing one explanation of why host-based discrimination mechanisms may not be completely effective in eliminating nonbeneficial partners. © 2014 The Authors. New Phytologist © 2014 New Phytologist Trust.

  19. Dissemination and Exploitation Strategy

    DEFF Research Database (Denmark)

    Badger, Merete; Monaco, Lucio; Fransson, Torsten

    of Technology in Sweden, Politecnico di Torino in Italy, and Eindhoven University of Technology in the Netherlands. The project is partially funded by the European Commission under the 7th Framework Programme (project no. RI-283746). This report describes the final dissemination and exploitation strategy... for project Virtual Campus Hub. A preliminary dissemination and exploitation plan was set up early in the project, as described in the deliverable D6.1 Dissemination strategy paper - preliminary version. The plan has been revised on a monthly basis during the project's lifecycle in connection with the virtual

  20. A novel DTI-QA tool: Automated metric extraction exploiting the sphericity of an agar filled phantom.

    Science.gov (United States)

    Chavez, Sofia; Viviano, Joseph; Zamyadi, Mojdeh; Kingsley, Peter B; Kochunov, Peter; Strother, Stephen; Voineskos, Aristotle

    2018-02-01

    To develop a quality assurance (QA) tool (acquisition guidelines and automated processing) for diffusion tensor imaging (DTI) data using a common agar-based phantom used for fMRI QA. The goal is to produce a comprehensive set of automated, sensitive and robust QA metrics. A readily available agar phantom was scanned with and without parallel imaging reconstruction. Other scanning parameters were matched to the human scans. A central slab made up of either a thick slice or an average of a few slices, was extracted and all processing was performed on that image. The proposed QA relies on the creation of two ROIs for processing: (i) a preset central circular region of interest (ccROI) and (ii) a signal mask for all images in the dataset. The ccROI enables computation of average signal for SNR calculations as well as average FA values. The production of the signal masks enables automated measurements of eddy current and B0 inhomogeneity induced distortions by exploiting the sphericity of the phantom. Also, the signal masks allow automated background localization to assess levels of Nyquist ghosting. The proposed DTI-QA was shown to produce eleven metrics which are robust yet sensitive to image quality changes within site and differences across sites. It can be performed in a reasonable amount of scan time (~15min) and the code for automated processing has been made publicly available. A novel DTI-QA tool has been proposed. It has been applied successfully on data from several scanners/platforms. The novelty lies in the exploitation of the sphericity of the phantom for distortion measurements. Other novel contributions are: the computation of an SNR value per gradient direction for the diffusion weighted images (DWIs) and an SNR value per non-DWI, an automated background detection for the Nyquist ghosting measurement and an error metric reflecting the contribution of EPI instability to the eddy current induced shape changes observed for DWIs. Copyright © 2017 Elsevier
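
    The per-image SNR metric reduces to a short computation once the ccROI and background masks exist: mean signal inside the central circular ROI divided by the standard deviation of the background. The sketch below builds toy masks on a synthetic disc; real mask construction and any noise-statistics corrections for parallel-imaging reconstructions are site-specific and omitted.

```python
import numpy as np

def snr_from_masks(image, ccroi_mask, background_mask):
    """SNR for one image: mean signal in the central circular ROI divided
    by the standard deviation of the background region."""
    return image[ccroi_mask].mean() / image[background_mask].std()

# Toy phantom slice: a bright disc on a noisy background.
yy, xx = np.mgrid[:128, :128]
r2 = (yy - 64) ** 2 + (xx - 64) ** 2
img = np.where(r2 < 40 ** 2, 100.0, 0.0)
img += np.random.default_rng(3).normal(0.0, 2.0, img.shape)

ccroi = r2 < 20 ** 2          # preset central circular ROI
background = r2 > 55 ** 2     # everything well outside the phantom
print(round(snr_from_masks(img, ccroi, background), 1))
```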

  1. Computed tomography-guided cryoablation of local recurrence after primary resection of pancreatic adenocarcinoma

    Directory of Open Access Journals (Sweden)

    Claudio Pusceddu

    2015-06-01

    Full Text Available The optimal management of local recurrences after primary resection of pancreatic cancer still remains to be clarified. A 58-year-old woman developed an isolated recurrence of pancreatic cancer six years after distal pancreatectomy. Re-resection was attempted but the lesion was deemed unresectable at surgery. Chemotherapy was then administered without obtaining a reduction of the tumor size or an improvement of the patient's symptoms. Thus the patient underwent percutaneous cryoablation under computed tomography (CT) guidance, obtaining tumor necrosis and a significant improvement in the quality of life. A CT scan one month later showed a stable lesion with no contrast enhancement. While the use of percutaneous cryoablation has widened its applications in patients with unresectable pancreatic cancer, it has never been described for the treatment of local pancreatic cancer recurrence after primary resection. Percutaneous cryoablation deserves further studies in the multimodality treatment of local recurrence after primary pancreatic surgery.

  2. Exploitative and Deceptive Resource Acquisition Strategies

    Directory of Open Access Journals (Sweden)

    Joshua J. Reynolds

    2015-07-01

    Full Text Available Life history strategy (LHS) and life history contingencies (LHCs) should theoretically influence the use of exploitative and deceptive resource acquisition strategies. However, little research has been done in this area. The purpose of the present work was to create measures of exploitative strategies and test the predictions of life history theory. Pilot studies developed and validated a behavioral measure of cheating called the Dot Game. The role of individual LHS and LHCs (manipulated via validated story primes) on cheating was investigated in Study 1. Studies 2a through 2c were conducted to develop and validate a self-report measure called the Exploitative and Deceptive Resource Acquisition Strategy Scale (EDRASS). Finally, Study 3 investigated life history and EDRASS. Results indicated that while LHS influences exploitative strategies, life history contingencies had little effect. Implications of these findings are discussed.

  3. Trajectories and cycles of sexual exploitation and trafficking for sexual exploitation of women in the Peruvian Amazon

    OpenAIRE

    Mujica, Jaris

    2015-01-01

    Commercial sexual exploitation is a constant activity in the Peruvian Amazon. Around the river port of Pucallpa, in the Ucayali region, the practice appears systematically: teenagers attend taverns around the port, and those who work as cooks in logging camps are victims of constant exploitation, many of them also of trafficking. This article aims to reconstruct the life paths and the reproductive cycle of these forms of exploitation in a sample of 20 women, and focuses on: (i) evidence of s...

  4. Advances in unconventional computing

    CERN Document Server

    2017-01-01

    Unconventional computing is a niche for interdisciplinary science, a cross-breed of computer science, physics, mathematics, chemistry, electronic engineering, biology, material science and nanotechnology. The aims of this book are to uncover and exploit principles and mechanisms of information processing in, and functional properties of, physical, chemical and living systems, in order to develop efficient algorithms, design optimal architectures and manufacture working prototypes of future and emergent computing devices. This first volume presents theoretical foundations of the future and emergent computing paradigms and architectures. The topics covered are computability, (non-)universality and complexity of computation; physics of computation, analog and quantum computing; reversible and asynchronous devices; cellular automata and other mathematical machines; P-systems and cellular computing; infinity and spatial computation; chemical and reservoir computing. The book is an encyclopedia, the first ever complete autho...

  5. L'industrialisation de l'exploitation de l'or à Kalsaka, Burkina Faso : une chance pour une population rurale pauvre ?

    Directory of Open Access Journals (Sweden)

    Matthieu Thune

    2011-09-01

    Full Text Available Burkina Faso is faced with the transition from a mainly traditional gold mining activity to an industrial one. On sites that had hitherto been mined with traditional techniques, the setting-up of an industrial company interferes with the way the population interacts with the area, as well as the profit they wish to draw from it. The point of this study of the Kalsaka site, situated in the north of the country, is to analyze the impact the mine has had on rural activities. The emergence of the mining company has impoverished the local economy and has failed to provide the local population with any new economic opportunity. However, the locals turned out to be unevenly affected, and many people whose activity had initially been disrupted have adjusted to the change.

  6. Benefits Management of Cloud Computing Investments

    OpenAIRE

    Richard Greenwell; Xiaodong Liu; Kevin Chalmers

    2014-01-01

    This paper examines investments in cloud computing using the Benefits Management approach. The major contribution of the paper is to provide a unique insight into how organizations derive value from cloud computing investments. The motivation for writing this paper is to consider the business benefits generated from utilizing cloud computing in a range of organizations. Case studies are used to describe a number of organizations' approaches to benefits exploitation using cloud computing. It wa...

  7. 44 Rural Fuelwood Exploitation in Mbo Local Government Area – A ...

    African Journals Online (AJOL)

    Ethiopian Journal of Environmental Studies and Management Vol. 2 No. 3 2009. Department of ... Through the use of Geographic Information Systems (GIS) and empirical surveys, investigations ... Major areas of local consumption include domestic energy, fish smoking and ... the fresh and salt water mangrove swamp.

  8. OPTIMAL PLANNING OF THE EXPLOITATION OF THE AGRICULTURAL EQUIPMENT, THROUGH A SYSTEM COMPUTER IN SUGAR COMPANY SELECTED SANTIAGO DE CUBA PROVINCE

    Directory of Open Access Journals (Sweden)

    Raimundo J. Lora-Freyre

    2016-01-01

    Full Text Available This work offers a new method for planning the optimal use of agricultural machinery in sugar companies, using economic-mathematical multiobjective models and computation. The study comprises the following stages: (1) definition of the goals, promoting the search for good values of working time, fuel consumption, and operational cost; (2) construction and application of goal programming models for planning the agricultural machinery; (3) development of an automated system to apply the optimization models referred to above. This system allows its operation by the personnel responsible for the exploitation of the agricultural machinery. The investigation was applied in the sugar enterprises «Paquito Rosales» and «Julio Antonio Mella», both in Santiago de Cuba province. The results show the advantages: fuel savings of 16%, and reductions in cost and time of 12% and 4%, respectively.
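
    A weighted goal programming model of this kind can be sketched with SciPy's linear programming routine: under/over deviation variables are attached to each goal and only the undesirable deviations are penalized. Every coefficient, weight, and goal value below is invented for illustration; the actual models are specific to the enterprises studied.

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables: hours assigned to two hypothetical machine fleets,
# followed by under/over deviation variables for the fuel and cost goals.
area_per_hour = [1.2, 0.9]       # ha worked per hour by each fleet
fuel_per_hour = [14.0, 9.0]      # litres of fuel per hour
cost_per_hour = [35.0, 22.0]     # cost units per hour

# Variable order: x1, x2, fuel_under, fuel_over, cost_under, cost_over
c = [0, 0, 0, 1.0, 0, 0.5]       # penalize only overshooting each goal
A_eq = [fuel_per_hour + [1, -1, 0, 0],    # fuel used + under - over = 2500
        cost_per_hour + [0, 0, 1, -1]]    # cost spent + under - over = 7000
b_eq = [2500.0, 7000.0]
A_ub = [[-a for a in area_per_hour] + [0, 0, 0, 0]]   # must work >= 300 ha
b_ub = [-300.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, method="highs")
print("hours per fleet:", np.round(res.x[:2], 1))
print("fuel overshoot:", round(res.x[3], 1), "cost overshoot:", round(res.x[5], 1))
```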

  9. An Algorithm Computing the Local $b$ Function by an Approximate Division Algorithm in $\\hat{\\mathcal{D}}$

    OpenAIRE

    Nakayama, Hiromasa

    2006-01-01

    We give an algorithm to compute the local $b$ function. In this algorithm, we use the Mora division algorithm in the ring of differential operators and an approximate division algorithm in the ring of differential operators with power series coefficients.

  10. ROUNDTABLE - SESSION 2 EXPLOITATION, CONSERVATION AND LEGISLATION

    Directory of Open Access Journals (Sweden)

    EDSMAN L.

    2004-01-01

    Full Text Available The link between socioeconomics and conservation and the role of legislation in conservation work was discussed in the group with participants from nine European countries. Interest and knowledge among the general public, stakeholders and managers is the key to successful conservation of native crayfish species. Exploitation and conservation do not necessarily exclude each other. A controlled fishery, where it can be sustained, may be an essential tool for conservation by increasing the general awareness and involving more people in the task of protecting the native crayfish species. This strategy is mainly possible for the noble crayfish in the northern part of its distribution, where strong traditions connected to crayfish also exist. A balance between utilisation and overexploitation has to be found and local guidelines for sustainable exploitation produced. Media, the Internet and educational material aimed at schools and stakeholders are excellent ways of reaching a wide audience with information. Universal objectives, rules and regulations at the European level are desirable and the noble crayfish and the stone crayfish should be included in Annex II of the Habitat Directive. Based on this framework detailed regulations are best worked out at the national level, considering the specific crayfish situation in the country. Information about the legislation, the purpose of the legislation and the consequences when not obeying it should be distributed. Stricter regulation of the trade with live alien crayfish is vital because of the associated risk of introducing new diseases and species.

  11. Three Tier Indoor Localization System for Digital Forensics

    OpenAIRE

    Dennis L. Owuor; Okuthe P. Kogeda; Johnson I. Agbinya

    2017-01-01

    Mobile localization has attracted a great deal of attention recently due to the introduction of wireless networks. Although several localization algorithms and systems have been implemented and discussed in the literature, very few researchers have exploited the gap that exists between indoor localization, tracking, external storage of location information and outdoor localization for the purpose of digital forensics during and after a disaster. The contribution of this paper lies in the impl...

  12. Key issues for determining the exploitable water resources in a Mediterranean river basin.

    Science.gov (United States)

    Pedro-Monzonís, María; Ferrer, Javier; Solera, Abel; Estrela, Teodoro; Paredes-Arquiola, Javier

    2015-01-15

    One of the major difficulties in water planning is to determine the water availability in a water resource system in order to distribute water sustainably. In this paper, we analyze the key issues for determining the exploitable water resources as an indicator of water availability in a Mediterranean river basin. Historically, these territories are characterized by heavily regulated water resources and the extensive use of unconventional resources (desalination and wastewater reuse); hence, emulating the hydrological cycle is not enough. This analysis considers the Jucar River Basin as a case study. We have analyzed the different possible combinations between the streamflow time series, the length of the simulation period and the reliability criteria. As expected, the results show a wide dispersion, proving the great influence of the reliability criteria used for the quantification and localization of the exploitable water resources in the system. Therefore, it is considered risky to provide a single value to represent the water availability in the Jucar water resource system. In this sense, it is necessary that policymakers and stakeholders make a decision about the methodology used to determine the exploitable water resources in a river basin. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Exploitation in International Paid Surrogacy Arrangements.

    Science.gov (United States)

    Wilkinson, Stephen

    2016-05-01

    Many critics have suggested that international paid surrogacy is exploitative. Taking such concerns as its starting point, this article asks: (1) how defensible is the claim that international paid surrogacy is exploitative and what could be done to make it less exploitative? (2) In the light of the answer to (1), how strong is the case for prohibiting it? Exploitation could in principle be dealt with by improving surrogates' pay and conditions. However, doing so may exacerbate problems with consent. Foremost amongst these is the argument that surrogates from economically disadvantaged countries cannot validly consent because their background circumstances are coercive. Several versions of this argument are examined and I conclude that at least one has some merit. The article's overall conclusion is that while ethically there is something to be concerned about, paid surrogacy is in no worse a position than many other exploitative commercial transactions which take place against a backdrop of global inequality and constrained options, such as poorly-paid and dangerous construction work. Hence, there is little reason to single surrogacy out for special condemnation. On a policy level, the case for prohibiting international commercial surrogacy is weak, despite legitimate concerns about consent and background poverty.

  14. Exploitation in International Paid Surrogacy Arrangements

    Science.gov (United States)

    Wilkinson, Stephen

    2015-01-01

    Abstract Many critics have suggested that international paid surrogacy is exploitative. Taking such concerns as its starting point, this article asks: (1) how defensible is the claim that international paid surrogacy is exploitative and what could be done to make it less exploitative? (2) In the light of the answer to (1), how strong is the case for prohibiting it? Exploitation could in principle be dealt with by improving surrogates' pay and conditions. However, doing so may exacerbate problems with consent. Foremost amongst these is the argument that surrogates from economically disadvantaged countries cannot validly consent because their background circumstances are coercive. Several versions of this argument are examined and I conclude that at least one has some merit. The article's overall conclusion is that while ethically there is something to be concerned about, paid surrogacy is in no worse a position than many other exploitative commercial transactions which take place against a backdrop of global inequality and constrained options, such as poorly‐paid and dangerous construction work. Hence, there is little reason to single surrogacy out for special condemnation. On a policy level, the case for prohibiting international commercial surrogacy is weak, despite legitimate concerns about consent and background poverty. PMID:27471338

  15. Models for solid oxide fuel cell systems exploitation of models hierarchy for industrial design of control and diagnosis strategies

    CERN Document Server

    Marra, Dario; Polverino, Pierpaolo; Sorrentino, Marco

    2016-01-01

    This book presents methodologies for optimal design of control and diagnosis strategies for Solid Oxide Fuel Cell systems. A key feature of the methodologies presented is the exploitation of modelling tools that balance accuracy and computational burden.

  16. Exploitation and exploration dynamics in recessionary times

    OpenAIRE

    Walrave, B.

    2012-01-01

    Firm performance largely depends on the ability to adapt to, and exploit, changes in the business environment. That is, firms should maintain ecological fitness by reconfiguring their resource base to cope with emerging threats and explore new opportunities, while at the same time exploiting existing resources. As such, firms possessing the ability to simultaneously perform exploitative and explorative initiatives are more resilient. In this respect, the performance implications of balancing ...

  17. Demonstration of blind quantum computing.

    Science.gov (United States)

    Barz, Stefanie; Kashefi, Elham; Broadbent, Anne; Fitzsimons, Joseph F; Zeilinger, Anton; Walther, Philip

    2012-01-20

    Quantum computers, besides offering substantial computational speedups, are also expected to preserve the privacy of a computation. We present an experimental demonstration of blind quantum computing in which the input, computation, and output all remain unknown to the computer. We exploit the conceptual framework of measurement-based quantum computation that enables a client to delegate a computation to a quantum server. Various blind delegated computations, including one- and two-qubit gates and the Deutsch and Grover quantum algorithms, are demonstrated. The client only needs to be able to prepare and transmit individual photonic qubits. Our demonstration is crucial for unconditionally secure quantum cloud computing and might become a key ingredient for real-life applications, especially when considering the challenges of making powerful quantum computers widely available.

  18. Computer local construction of a general solution for the Chew-Low equations

    International Nuclear Information System (INIS)

    Gerdt, V.P.

    1980-01-01

    The general solution of the dynamic form of the Chew-Low equations in the vicinity of the rest point is considered. A method for calculating the coefficients of the series that make up such a solution is suggested. The results of calculations, coefficients of power series and expansions, carried out by means of the SCHOONSCHIP and SYMBAL systems are given. It is noted that the suggested procedure for constructing solutions of the Chew-Low equations, based on using an electronic computer as an instrument for analytical calculations, permits obtaining detailed information on the local structure of the general solution.

  19. General rigid motion correction for computed tomography imaging based on locally linear embedding

    Science.gov (United States)

    Chen, Mianyi; He, Peng; Feng, Peng; Liu, Baodong; Yang, Qingsong; Wei, Biao; Wang, Ge

    2018-02-01

    Patient motion can damage the quality of computed tomography images, which are typically acquired in cone-beam geometry. Rigid patient motion is characterized by six geometric parameters and is more challenging to correct than in fan-beam geometry. We extend our previous rigid patient motion correction method based on the principle of locally linear embedding (LLE) from fan-beam to cone-beam geometry and accelerate the computational procedure with the graphics processing unit (GPU)-based All Scale Tomographic Reconstruction Antwerp toolbox. The major merit of our method is that we need neither fiducial markers nor motion-tracking devices. The numerical and experimental studies show that the LLE-based patient motion correction is capable of calibrating the six parameters of the patient motion simultaneously, reducing patient motion artifacts significantly.
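
    The LLE idea can be demonstrated on synthetic data: projections acquired during a smooth rigid drift trace out a low-dimensional manifold in the high-dimensional projection space, and the embedding recovers a coordinate that varies with the motion. The signal model below is a stand-in, and the step that maps the embedding to the six rigid-motion parameters, as the paper does, is not shown.

```python
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

rng = np.random.default_rng(4)
drift = np.linspace(0.0, 0.6, 200)     # a smooth hidden 1-D motion parameter
grid = np.linspace(0.0, 1.0, 256)

# Each "projection" is a 256-sample signal whose phase shifts with the drift;
# the stack therefore lies on a smooth 1-D manifold in 256-D space.
projections = np.stack([np.sin(10 * grid + d) + rng.normal(0, 0.01, 256)
                        for d in drift])

lle = LocallyLinearEmbedding(n_neighbors=10, n_components=1)
embedding = lle.fit_transform(projections)[:, 0]

# The recovered coordinate should track the hidden drift closely.
print(abs(np.corrcoef(embedding, drift)[0, 1]))
```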

  20. Poverty-Exploitation-Alienation.

    Science.gov (United States)

    Bronfenbrenner, Martin

    1980-01-01

    Illustrates how knowledge derived from the discipline of economics can be used to help shed light on social problems such as poverty, exploitation, and alienation, and can help decision makers form policy to minimize these and similar problems. (DB)

  1. Exploiting Stabilizers and Parallelism in State Space Generation with the Symmetry Method

    DEFF Research Database (Denmark)

    Lorentsen, Louise; Kristensen, Lars Michael

    2001-01-01

    The symmetry method is a main reduction paradigm for alleviating the state explosion problem. For large symmetry groups deciding whether two states are symmetric becomes time expensive due to the apparent high time complexity of the orbit problem. The contribution of this paper is to alleviate th...... the negative impact of the orbit problem by the specification of canonical representatives for equivalence classes of states in Coloured Petri Nets, and by giving algorithms exploiting stabilizers and parallelism for computing the condensed state space....

  2. Discovering local patterns of co - evolution: computational aspects and biological examples

    Directory of Open Access Journals (Sweden)

    Tuller Tamir

    2010-01-01

    Full Text Available Abstract Background Co-evolution is the process in which two (or more) sets of orthologs exhibit a similar or correlative pattern of evolution. Co-evolution is a powerful way to learn about the functional interdependencies between sets of genes and cellular functions and to predict physical interactions. More generally, it can be used for answering fundamental questions about the evolution of biological systems. Orthologs that exhibit a strong signal of co-evolution in a certain part of the evolutionary tree may show a mild signal of co-evolution in other branches of the tree. The major reasons for this phenomenon are noise in the biological input, genes that gain or lose functions, and the fact that some measures of co-evolution relate to rare events such as positive selection. Previous publications in the field dealt with the problem of finding sets of genes that co-evolved along an entire underlying phylogenetic tree, without considering the fact that often co-evolution is local. Results In this work, we describe a new set of biological problems that are related to finding patterns of local co-evolution. We discuss their computational complexity and design algorithms for solving them. These algorithms outperform other bi-clustering methods as they are designed specifically for solving this set of problems. We use our approach to trace the co-evolution of fungal, eukaryotic, and mammalian genes at high resolution across the different parts of the corresponding phylogenetic trees. Specifically, we discover regions in the fungi tree that are enriched with positive evolution. We show that metabolic genes exhibit a remarkable level of co-evolution and different patterns of co-evolution in various biological datasets. In addition, we find that protein complexes that are related to gene expression exhibit non-homogeneous levels of co-evolution across different parts of the fungi evolutionary line. In the case of mammalian evolution

  3. Exploration and exploitation of Victorian science in Darwin's reading notebooks.

    Science.gov (United States)

    Murdock, Jaimie; Allen, Colin; DeDeo, Simon

    2017-02-01

    Search in an environment with an uncertain distribution of resources involves a trade-off between exploitation of past discoveries and further exploration. This extends to information foraging, where a knowledge-seeker shifts between reading in depth and studying new domains. To study this decision-making process, we examine the reading choices made by one of the most celebrated scientists of the modern era: Charles Darwin. From the full-text of books listed in his chronologically-organized reading journals, we generate topic models to quantify his local (text-to-text) and global (text-to-past) reading decisions using Kullback-Leibler Divergence, a cognitively-validated, information-theoretic measure of relative surprise. Rather than a pattern of surprise-minimization, corresponding to a pure exploitation strategy, Darwin's behavior shifts from early exploitation to later exploration, seeking unusually high levels of cognitive surprise relative to previous eras. These shifts, detected by an unsupervised Bayesian model, correlate with major intellectual epochs of his career as identified both by qualitative scholarship and Darwin's own self-commentary. Our methods allow us to compare his consumption of texts with their publication order. We find Darwin's consumption more exploratory than the culture's production, suggesting that underneath gradual societal changes are the explorations of individual synthesis and discovery. Our quantitative methods advance the study of cognitive search through a framework for testing interactions between individual and collective behavior and between short- and long-term consumption choices. This novel application of topic modeling to characterize individual reading complements widespread studies of collective scientific behavior. Copyright © 2016 Elsevier B.V. All rights reserved.
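
    The surprise measure itself is a one-liner once topic mixtures are in hand: SciPy's entropy with two arguments computes the Kullback-Leibler divergence. The mixtures below are invented for illustration.

```python
import numpy as np
from scipy.stats import entropy

# Toy topic mixtures: p for the text being read, q for the reader's
# recent history (numbers are illustrative only).
p = np.array([0.70, 0.20, 0.05, 0.05])
q = np.array([0.25, 0.25, 0.25, 0.25])

# entropy(p, q) returns KL(p || q) = sum_i p_i * log(p_i / q_i);
# larger values mean the new text is more surprising given the past.
print(entropy(p, q))
```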

  4. Multislice Computed Tomography Coronary Angiography at a Local Hospital: Pitfalls and Potential

    Energy Technology Data Exchange (ETDEWEB)

    Kolnes, K.; Velle, Ose H.; Hareide, S.; Hegbom, K.; Wiseth, R. [Volda Hospital (Norway). Depts. of Radiology and Internal Medicine

    2006-09-15

    Purpose: To evaluate whether the favorable results achieved with multislice computed tomography (MSCT) of coronary arteries at larger centers could be paralleled at a local hospital. Material and Methods: Fifty consecutive patients with suspected coronary artery disease scheduled for invasive investigation with quantitative coronary angiography (QCA) at a university hospital underwent MSCT with a 16-slice scanner at a local hospital. Diagnostic accuracy of MSCT for coronary artery disease was assessed using a 16-segment coronary artery model with QCA as the gold standard. Results: Segments below the minimum assessable diameter were excluded; sensitivity, specificity, and positive and negative predictive values for the detection of >50% stenosis in the 416 assessable segments were 92%, 82%, 53%, and 98%, respectively. Conclusion: Our beginners' experience demonstrated favorable results regarding sensitivity and negative predictive value. The positive predictive value, however, was unsatisfactory. Calcifications were identified as the most important factor for false-positive results with MSCT. With widespread use of MSCT coronary angiography, there is a risk of recruiting patients without significant coronary artery disease to unnecessary and potentially harmful invasive procedures.

  5. Sensor Network-Based Localization for Continuous Tracking Applications: Implementation and Performance Evaluation

    Directory of Open Access Journals (Sweden)

    Franco Davoli

    2008-10-01

    Full Text Available The increasing interest in systems able to provide users with immersive services (e.g., domotics, context-aware applications, and immersive distance learning tools) has encouraged the development of cheap and effective platforms aimed at tracking objects and people within a certain space. In this context, wireless sensor networks (WSNs) can play a very important role, since specialized sensors can be fruitfully exploited in order to generate/receive signals by means of which the WSN can derive the position of nodes joined to the objects to be tracked. The paper presents an original localization platform that exploits a single-hop WSN, based on a Microchip MCU and a Cypress RF device, to track its moving nodes. Specifically, the nodes of the network are divided into three sets: the first set consists of anchor nodes that, according to the commands from the sink (the central node of the WSN), generate ultrasonic pulses. These pulses are received by the second set of (moving) nodes, which estimate the pulse trip time and communicate it to the sink. Finally, the last set is constituted by general-purpose nodes that collect any kind of data from the surrounding field. The sink gathers all the data, computes the position of the moving nodes, and transfers information to external users on the Internet. The algorithms adopted to manage the network and to localize moving nodes are discussed. A working prototype based upon the hardware platform, software, and protocol described in this paper has been deployed and tested, and some results are shown. Simulation results of the localization system are presented to show system scalability.
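
    The sink's position computation is essentially multilateration from time-of-flight ranges, which can be sketched as a small nonlinear least-squares problem. The anchor geometry and the constant speed of sound are assumptions of the sketch, not details taken from the deployed platform.

```python
import numpy as np
from scipy.optimize import least_squares

SPEED_OF_SOUND = 343.0  # m/s, assumed constant here

def locate(anchors, pulse_times):
    """Estimate a moving node's 2-D position from ultrasonic time-of-flight
    measurements to known anchor positions (nonlinear least squares)."""
    ranges = SPEED_OF_SOUND * np.asarray(pulse_times)
    def residuals(p):
        return np.linalg.norm(anchors - p, axis=1) - ranges
    return least_squares(residuals, x0=anchors.mean(axis=0)).x

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 8.0], [10.0, 8.0]])
true_pos = np.array([3.0, 5.0])
times = np.linalg.norm(anchors - true_pos, axis=1) / SPEED_OF_SOUND
print(locate(anchors, times))   # approximately [3. 5.]
```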

  6. Quantum computing and spintronics

    International Nuclear Information System (INIS)

    Kantser, V.

    2007-01-01

    Attempts to build a computer which can operate according to the quantum laws have led to the concepts of quantum computing algorithms and hardware. In this review we highlight recent developments which point the way to quantum computing on the basis of solid state nanostructures, after some general considerations concerning quantum information science and introducing a set of basic requirements for any quantum computer proposal. One of the major directions of research on the way to quantum computing is to exploit the spin (in addition to the orbital) degree of freedom of the electron, giving birth to the field of spintronics. We address some semiconductor approaches based on spin-orbit coupling in semiconductor nanostructures. (authors)

  7. Imaging local cerebral blood flow by xenon-enhanced computed tomography - technical optimization procedures

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, J.S.; Shinohara, T.; Imai, A.; Kobari, M.; Sakai, F.; Hata, T.; Oravez, W.T.; Timpe, G.M.; Deville, T.; Solomon, E.

    1988-08-01

    Methods are described for non-invasive, computer-assisted serial scanning throughout the human brain during eight minutes of inhalation of 27%-30% xenon gas in order to measure local cerebral blood flow (LCBF). Optimized xenon-enhanced computed tomography (XeCT) was achieved by 5-second scanning at one-minute intervals utilizing a state-of-the-art CT scanner and rapid delivery of xenon gas via a face mask. Values for local brain-blood partition coefficients (Lλ) measured in vivo were utilized to calculate LCBF values. Previous methods assumed Lλ values to be normal, introducing the risk of systematic errors, because Lλ values differ throughout normal brain and may be altered by disease. Color-coded maps of Lλ and LCBF values were formatted directly onto CT images for exact correlation of function with anatomic and pathologic observations (spatial resolution: 26.5 cubic mm). Results were compared among eight normal volunteers, aged between 50 and 88 years. Mean cortical gray matter blood flow was 46.3 ± 7.7, for subcortical gray matter it was 50.3 ± 13.2, and for white matter it was 18.8 ± 3.2. Modern CT scanners provide stability, improved signal-to-noise ratio and minimal radiation scatter. Combining these advantages with rapid xenon saturation of the blood provides correlations of Lλ and LCBF with images of normal and abnormal brain in a safe, useful and non-invasive manner.

  8. NASA's computer science research program

    Science.gov (United States)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  9. Exploiting Microwave Imaging Methods for Real-Time Monitoring of Thermal Ablation

    Directory of Open Access Journals (Sweden)

    Rosa Scapaticci

    2017-01-01

    Full Text Available Microwave thermal ablation is a cancer treatment that exploits local heating caused by a microwave electromagnetic field to induce coagulative necrosis of tumor cells. Recently, such a technique has significantly progressed in the clinical practice. However, its effectiveness would dramatically improve if paired with a noninvasive system for the real-time monitoring of the evolving dimension and shape of the thermally ablated area. In this respect, microwave imaging is a potential candidate to monitor the overall treatment evolution in a noninvasive way, as it takes direct advantage of the dependence of the electromagnetic properties of biological tissues on temperature. This paper explores such a possibility by presenting a proof-of-concept validation based on accurate simulated imaging experiments, run with respect to a scenario that mimics an ex vivo experimental setup. In particular, two model-based inversion algorithms are exploited to tackle the imaging task. These methods provide independent results in real time, and their integration improves the quality of the overall tracking of the variations occurring in the target and surrounding regions.

  10. Radiation environmental impact assessment of copper exploitation

    International Nuclear Information System (INIS)

    Fan Guang; Wen Zhijian

    2010-01-01

    The radiation environmental impact of mineral exploitation on the surrounding environment has become a public concern. This paper presents the radiation environmental impact assessment of copper exploitation. Based on the project description and detailed investigations of surrounding environment, systematic radiation environmental impacts have been identified. The environmental impacts are assessed during both construction and operation phase. The environmental protection measures have also been proposed. The related conclusion and measures can play an active role in copper exploitation and environmental protection. (authors)

  11. Exploiting the Potential of Data Centers in the Smart Grid

    Science.gov (United States)

    Wang, Xiaoying; Zhang, Yu-An; Liu, Xiaojing; Cao, Tengfei

    As the number of cloud computing data centers grows rapidly in recent years, from the perspective of the smart grid they represent a large and noticeable electric load. In this paper, we focus on the important role and the potential of data centers as controllable loads in the smart grid. We review relevant research in the area of letting data centers participate in the ancillary services market and demand response programs of the grid, and further investigate the possibility of exploiting the impact of data center placement on the grid. Various opportunities and challenges are summarized, which could provide more chances for researchers to explore this field.

  12. General approach to the computation of local transport coefficients with finite Larmor effects in the collision contribution

    International Nuclear Information System (INIS)

    Ghendrih, P.

    1986-10-01

    We expand the distribution functions on a basis of Hermite functions and obtain a general scheme to compute the local transport coefficients. The magnetic field dependence due to finite Larmor radius effects during the collision process is taken into account
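
    The projection onto a Hermite basis is easy to illustrate numerically: with Gauss-Hermite quadrature, each coefficient is an inner product against H_n divided by the basis norm. The perturbation expanded below is invented for illustration; the collisional, magnetized machinery of the paper is far beyond this sketch.

```python
import numpy as np
from numpy.polynomial import hermite as H
from math import factorial, pi, sqrt

# Expand the polynomial prefactor phi(v) of a perturbed Maxwellian,
# f(v) = exp(-v**2) * phi(v), on the (physicists') Hermite basis H_n.
def phi(v):
    return 1.0 + 0.3 * v + 0.1 * v**2     # an invented perturbation

nodes, weights = H.hermgauss(40)           # Gauss-Hermite quadrature rule
N = 6
coeffs = np.empty(N)
for n in range(N):
    e_n = np.zeros(N)
    e_n[n] = 1.0
    Hn = H.hermval(nodes, e_n)             # H_n evaluated at the nodes
    norm = (2.0 ** n) * factorial(n) * sqrt(pi)   # orthogonality constant
    coeffs[n] = np.sum(weights * phi(nodes) * Hn) / norm

print(np.round(coeffs, 4))   # only n = 0, 1, 2 are nonzero: 1.05, 0.15, 0.025
```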

  13. Competing Discourses about Youth Sexual Exploitation in Canadian News Media.

    Science.gov (United States)

    Saewyc, Elizabeth M; Miller, Bonnie B; Rivers, Robert; Matthews, Jennifer; Hilario, Carla; Hirakata, Pam

    2013-10-01

    Media holds the power to create, maintain, or break down stigmatizing attitudes, which affect policies, funding, and services. To understand how Canadian news media depicts the commercial sexual exploitation of children and youth, we examined 835 Canadian newspaper articles from 1989-2008 using a mixed methods critical discourse analysis approach, comparing representations to existing research about sexually exploited youth. Despite research evidence that equal rates of boys and girls experience exploitation, Canadian news media depicted exploited youth predominantly as heterosexual girls, and described them alternately as victims or workers in a trade, often both in the same story. News media mentioned exploiters far less often than victims, and portrayed them almost exclusively as male, most often called 'customers' or 'consumers,' and occasionally 'predators'; in contrast, research has documented the majority of sexually exploited boys report female exploiters. Few news stories over the past two decades portrayed the diversity of victims, perpetrators, and venues of exploitation reported in research. The focus on victims but not exploiters helps perpetuate stereotypes of sexual exploitation as business or a 'victimless crime,' maintains the status quo, and blurs responsibility for protecting youth under the UN Convention on the Rights of the Child. Health care providers and researchers can be advocates for accuracy in media coverage about sexual exploitation; news reporters and editors should focus on exploiters more than victims, draw on existing research evidence to avoid perpetuating stereotypes, and use accurate terms, such as commercial sexual exploitation, rather than terms related to business or trade.

  14. Simulated population responses of common carp to commercial exploitation

    Energy Technology Data Exchange (ETDEWEB)

    Weber, Michael J.; Hennen, Matthew J.; Brown, Michael L.

    2011-12-01

    Common carp Cyprinus carpio is a widespread invasive species that can become highly abundant and impose deleterious ecosystem effects. Thus, aquatic resource managers are interested in controlling common carp populations. Control of invasive common carp populations is difficult, due in part to the inherent uncertainty of how populations respond to exploitation. To understand how common carp populations respond to exploitation, we evaluated common carp population dynamics (recruitment, growth, and mortality) in three natural lakes in eastern South Dakota. Common carp exhibited similar population dynamics across these three systems that were characterized by consistent recruitment (ages 3 to 15 years present), fast growth (K = 0.37 to 0.59), and low mortality (A = 1 to 7%). We then modeled the effects of commercial exploitation on size structure, abundance, and egg production to determine its utility as a management tool to control populations. All three populations responded similarly to exploitation simulations with a 575-mm length restriction, representing commercial gear selectivity. Simulated common carp size structure modestly declined (9 to 37%) in all simulations. Abundance of common carp declined dramatically (28 to 56%) at low levels of exploitation (0 to 20%) but exploitation >40% had little additive effect and populations were only reduced by 49 to 79% despite high exploitation (>90%). Maximum lifetime egg production was reduced from 77 to 89% at a moderate level of exploitation (40%), indicating the potential for recruitment overfishing. Exploitation further reduced common carp size structure, abundance, and egg production when simulations were not size selective. Our results provide insights to how common carp populations may respond to exploitation. Although commercial exploitation may be able to partially control populations, an integrated removal approach that removes all sizes of common carp has a greater chance of controlling population abundance
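
    The qualitative response to size-selective exploitation can be reproduced with a toy age-structured model: because the gear removes only large fish, abundance drops steeply at low exploitation rates but saturates at high ones. The parameter values below are illustrative placeholders in the spirit of the reported dynamics (low natural mortality, consistent recruitment), not the paper's estimates.

```python
import numpy as np

def equilibrium_abundance(exploitation, ages=15, survival=0.95, recruits=1000.0):
    """Toy age-structured model: constant annual recruitment, natural
    survival `survival`, and an additional exploitation rate applied only
    to fish old enough to be vulnerable to commercial gear (age 4+ here)."""
    n = np.zeros(ages)
    for _ in range(200):                  # iterate yearly to equilibrium
        n[1:] = n[:-1] * survival         # age the population
        n[4:] *= (1.0 - exploitation)     # gear-selective removal
        n[0] = recruits                   # new year class enters
    return n.sum()

base = equilibrium_abundance(0.0)
for u in (0.2, 0.4, 0.9):
    reduction = 100 * (1 - equilibrium_abundance(u) / base)
    print(f"exploitation {u:.0%}: abundance reduced {reduction:.0f}%")
```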

  15. CDF GlideinWMS usage in Grid computing of high energy physics

    International Nuclear Information System (INIS)

    Zvada, Marian; Sfiligoi, Igor; Benjamin, Doug

    2010-01-01

    Many members of large science collaborations already have specialized grids available to advance their research, driven by the need for more computing resources for data analysis. This has forced the Collider Detector at Fermilab (CDF) collaboration to move beyond the usage of dedicated resources and start exploiting Grid resources. Nowadays, the CDF experiment increasingly relies on glidein-based computing pools for data reconstruction, Monte Carlo production, and user data analysis, serving over 400 users through the central analysis farm middleware (CAF) on top of the Condor batch system and the CDF Grid infrastructure. Condor is designed as a distributed architecture, and its glidein mechanism of pilot jobs is ideal for abstracting the Grid computing by making a virtual private computing pool. We present the first production use of the generic pilot-based Workload Management System (glideinWMS), which is an implementation of the pilot mechanism based on the Condor distributed infrastructure. CDF Grid computing uses glideinWMS for its data reconstruction on the FNAL campus Grid, user analysis, and Monte Carlo production across the Open Science Grid (OSG). We review this computing model and the setup used, including the CDF-specific configuration within the glideinWMS system, which provides powerful scalability and makes Grid computing work like a local batch environment, with the ability to handle more than 10000 running jobs at a time.

  16. Managing the Innovators for Exploration and Exploitation

    Directory of Open Access Journals (Sweden)

    C. Annique UN

    2007-09-01

    Full Text Available I analyze how to manage employees to achieve a balance between exploration and exploitation in large established firms. Previous studies suggest that, although firms need to undertake both exploration and exploitation simultaneously, this is difficult either because of the scarcity of resources or because of the incompatibility of these two processes. Proposed solutions have been ambidexterity, punctuated equilibrium or specialization. I suggest another method: managing employees. Specifically, I argue that using the so-called “innovative” system of human resource management practices, consisting of team-based incentive system, team-based job design, and job rotation, enables the firm to undertake exploration and exploitation simultaneously because it provides the psychological safety for people to explore new knowledge to make novel products and develops employees to have the perspective-taking capability that enables the integration of knowledge cross-functionally for efficiency. Using the so-called “traditional” system of human resource management practices, consisting of individual-based incentive system, individual-based job design, and no job rotation, has limited impact on either exploration or exploitation because it does not create the psychological safety for people to explore new knowledge and does not develop the perspective-taking capability needed for exploitation. Moreover, mixing practices from both systems is better than only using the traditional system in achieving exploration or exploitation, but less effective than only using the innovative system as the mix of practices can create inconsistent expectations on employees.

  17. Transnational gestational surrogacy: does it have to be exploitative?

    Science.gov (United States)

    Kirby, Jeffrey

    2014-01-01

    This article explores the controversial practice of transnational gestational surrogacy and poses a provocative question: Does it have to be exploitative? Various existing models of exploitation are considered and a novel exploitation-evaluation heuristic is introduced to assist in the analysis of the potentially exploitative dimensions/elements of complex health-related practices. On the basis of application of the heuristic, I conclude that transnational gestational surrogacy, as currently practiced in low-income country settings (such as rural, western India), is exploitative of surrogate women. Arising out of consideration of the heuristic's exploitation conditions, a set of public education and enabled choice, enhanced protections, and empowerment reforms to transnational gestational surrogacy practice is proposed that, if incorporated into a national regulatory framework and actualized within a low-income country, could possibly render such practice nonexploitative.

  18. Exploitation as the Unequal Exchange of Labour : An Axiomatic Approach

    OpenAIRE

    Yoshihara, Naoki; Veneziani, Roberto

    2009-01-01

    In subsistence economies with general convex technology and rational optimising agents, a new, axiomatic approach is developed, which allows an explicit analysis of the core positive and normative intuitions behind the concept of exploitation. Three main new axioms, called Labour Exploitation in Subsistence Economies , Relational Exploitation , and Feasibility of Non-Exploitation , are presented and it is proved that they uniquely characterise a definition of exploitation conceptually related...

  19. CoBaltDB: Complete bacterial and archaeal orfeomes subcellular localization database and associated resources

    Directory of Open Access Journals (Sweden)

    Lucchetti-Miganeh Céline

    2010-03-01

    Full Text Available Abstract Background The functions of proteins are strongly related to their localization in cell compartments (for example, the cytoplasm or membranes), but the experimental determination of the sub-cellular localization of proteomes is laborious and expensive. A fast and low-cost alternative approach is in silico prediction, based on features of the protein primary sequences. However, biologists are confronted with a very large number of computational tools that use different methods that address various localization features with diverse specificities and sensitivities. As a result, exploiting these computer resources to predict protein localization accurately involves querying all tools and comparing every prediction output; this is a painstaking task. Therefore, we developed a comprehensive database, called CoBaltDB, that gathers all prediction outputs concerning complete prokaryotic proteomes. Description The current version of CoBaltDB integrates the results of 43 localization predictors for 784 complete bacterial and archaeal proteomes (2,548,292 proteins in total). CoBaltDB supplies a simple user-friendly interface for retrieving and exploring relevant information about predicted features (such as signal peptide cleavage sites and transmembrane segments). Data are organized into three work-sets ("specialized tools", "meta-tools" and "additional tools"). The database can be queried using the organism name, a locus tag or a list of locus tags, and may be browsed using numerous graphical and text displays. Conclusions With its new functionalities, CoBaltDB is a novel powerful platform that provides easy access to the results of multiple localization tools and support for predicting prokaryotic protein localizations with higher confidence than previously possible. CoBaltDB is available at http://www.umr6026.univ-rennes1.fr/english/home/research/basic/software/cobalten.
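
    The comparison task the database automates can be sketched as collecting per-tool predictions for one protein and reporting a majority-vote consensus. The tool names and outputs below are hypothetical stand-ins for the 43 integrated predictors, and CoBaltDB itself offers far richer views than a single vote.

        # Sketch of the comparison task CoBaltDB automates: collect per-tool
        # localization predictions for one protein and report a majority-vote
        # consensus. Tool names and outputs are hypothetical stand-ins for the
        # 43 predictors integrated in the database.
        from collections import Counter

        predictions = {          # tool -> predicted compartment (hypothetical)
            "tool_psort":   "cytoplasmic",
            "tool_signalp": "cytoplasmic",     # no signal peptide detected
            "tool_tmhmm":   "inner membrane",  # one transmembrane helix predicted
            "tool_subloc":  "cytoplasmic",
        }

        def consensus(preds):
            counts = Counter(preds.values())
            call, votes = counts.most_common(1)[0]
            return call, votes / len(preds)

        call, support = consensus(predictions)
        print(f"consensus: {call} ({support:.0%} of tools agree)")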

  20. SU-E-P-10: Establishment of Local Diagnostic Reference Levels of Routine Exam in Computed Tomography

    International Nuclear Information System (INIS)

    Yeh, M; Wang, Y; Weng, H

    2015-01-01

    Introduction: National diagnostic reference levels (NDRLs) can be used as reference doses for radiological examinations and provide a basis for optimizing patient dose. Local diagnostic reference levels (LDRLs), maintained by periodically reviewing and checking doses, are a more efficient way to improve examination practice; the important first step is therefore to establish a diagnostic reference level. Radiation dose limit values for computed tomography have already been established in Taiwan, and many studies report that CT scans contribute most of the radiation dose in medical imaging. This study therefore reviews the international status of DRLs and establishes local diagnostic reference levels for computed tomography in our hospital. Methods and Materials: Two clinical CT scanners (a Toshiba Aquilion and a Siemens Sensation) were used in this study. For CT examinations the basic recommended dosimetric quantity is the Computed Tomography Dose Index (CTDI). For each examination of each body part, at least 10 patients were collected. The routine examinations were carried out, all exposure parameters were collected, and the corresponding CTDIvol and DLP values were determined. Results: The majority of patients (75%) were between 60 and 70 kg in body weight. There are 25 examinations in this study. Table 1 shows the LDRL of each routine CT examination. Conclusions: This study reviews the international status of DRLs and establishes local computed tomography reference levels for our hospital, providing a radiation reference as a basis for optimizing patient dose.

  1. SU-E-P-10: Establishment of Local Diagnostic Reference Levels of Routine Exam in Computed Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Yeh, M; Wang, Y; Weng, H [Chiayi Chang Gung Memorial Hospital of The C.G.M.F, Puzi City, Chiayi County, Taiwan (China)

    2015-06-15

    Introduction: National diagnostic reference levels (NDRLs) can be used as reference doses for radiological examinations and provide a basis for optimizing patient dose. Local diagnostic reference levels (LDRLs), maintained by periodically reviewing and checking doses, are a more efficient way to improve examination practice; the important first step is therefore to establish a diagnostic reference level. Radiation dose limit values for computed tomography have already been established in Taiwan, and many studies report that CT scans contribute most of the radiation dose in medical imaging. This study therefore reviews the international status of DRLs and establishes local diagnostic reference levels for computed tomography in our hospital. Methods and Materials: Two clinical CT scanners (a Toshiba Aquilion and a Siemens Sensation) were used in this study. For CT examinations the basic recommended dosimetric quantity is the Computed Tomography Dose Index (CTDI). For each examination of each body part, at least 10 patients were collected. The routine examinations were carried out, all exposure parameters were collected, and the corresponding CTDIvol and DLP values were determined. Results: The majority of patients (75%) were between 60 and 70 kg in body weight. There are 25 examinations in this study. Table 1 shows the LDRL of each routine CT examination. Conclusions: This study reviews the international status of DRLs and establishes local computed tomography reference levels for our hospital, providing a radiation reference as a basis for optimizing patient dose.
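
    The derivation described here can be sketched as follows. Setting the LDRL at the 75th percentile of the surveyed dose distribution follows common DRL practice, but the abstract does not state which statistic the authors used, so the percentile, like the survey values below, is an illustrative assumption.

        # Minimal sketch of deriving local DRLs from per-examination dose surveys.
        # The 75th percentile follows common practice; the abstract does not state
        # which statistic was used, so it is an assumption. Values are made up.
        import numpy as np

        surveys = {
            # examination: list of (CTDIvol [mGy], DLP [mGy*cm]) from >=10 patients
            "head routine": [(58.1, 920.0), (61.3, 1010.0), (55.7, 880.0),
                             (60.2, 950.0), (63.0, 990.0), (57.4, 905.0),
                             (59.8, 940.0), (62.1, 970.0), (56.5, 890.0),
                             (60.9, 955.0)],
        }

        for exam, values in surveys.items():
            ctdi, dlp = np.array(values).T
            print(f"{exam}: LDRL CTDIvol = {np.percentile(ctdi, 75):.1f} mGy, "
                  f"DLP = {np.percentile(dlp, 75):.0f} mGy*cm")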

  2. Probabilistic Localization and Tracking of Malicious Insiders Using Hyperbolic Position Bounding in Vehicular Networks

    Directory of Open Access Journals (Sweden)

    Laurendeau Christine

    2009-01-01

    Full Text Available A malicious insider in a wireless network may carry out a number of devastating attacks without fear of retribution, since the messages it broadcasts are authenticated with valid credentials such as a digital signature. In attributing an attack message to its perpetrator by localizing the signal source, we can make no presumptions regarding the type of radio equipment used by a malicious transmitter, including the transmitting power utilized to carry out an exploit. Hyperbolic position bounding (HPB) provides a mechanism to probabilistically estimate the candidate location of an attack message's originator using received signal strength (RSS) reports, without assuming knowledge of the transmitting power. We specialize the applicability of HPB to the realm of vehicular networks and provide alternate HPB algorithms to improve localization precision and computational efficiency. We extend HPB for tracking the consecutive locations of a mobile attacker. We evaluate the localization and tracking performance of HPB in a vehicular scenario featuring a variable number of receivers and a known navigational layout. We find that HPB can position a transmitting device within stipulated guidelines for emergency services localization accuracy.
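
    The key property behind HPB can be sketched numerically: under a log-distance path-loss model, the difference of two receivers' RSS readings is independent of the unknown transmit power, so each receiver pair confines the source to a band around a hyperbola. The path-loss exponent, receiver layout, RSS readings, and fading tolerance below are invented for illustration; the paper's probabilistic confidence machinery is not reproduced.

        # Sketch of the idea behind hyperbolic position bounding: under a
        # log-distance path-loss model, RSS_i - RSS_j = 10*n*log10(d_j/d_i) does
        # not depend on the unknown transmit power, so each receiver pair bounds
        # the source to a band around a hyperbola. All parameters are assumed.
        import itertools
        import numpy as np

        n_pl = 3.0                                   # assumed path-loss exponent
        rx = np.array([[0.0, 0.0], [100.0, 0.0], [50.0, 80.0]])  # receivers (m)
        rss = np.array([-62.0, -71.0, -66.0])        # measured RSS (dBm), made up
        tol = 3.0                                    # dB tolerance (fading margin)

        xs, ys = np.meshgrid(np.linspace(-50, 150, 201), np.linspace(-50, 130, 181))
        candidates = np.ones(xs.shape, dtype=bool)

        for i, j in itertools.combinations(range(len(rx)), 2):
            d_i = np.hypot(xs - rx[i, 0], ys - rx[i, 1]) + 1e-9
            d_j = np.hypot(xs - rx[j, 0], ys - rx[j, 1]) + 1e-9
            predicted = 10.0 * n_pl * np.log10(d_j / d_i)  # expected RSS_i - RSS_j
            candidates &= np.abs((rss[i] - rss[j]) - predicted) <= tol

        print(f"{candidates.sum()} grid cells remain in the candidate area")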

  3. NATO Advanced Research Workshop on Exploiting Mental Imagery with Computers in Mathematics Education

    CERN Document Server

    Mason, John

    1995-01-01

    The advent of fast and sophisticated computer graphics has brought dynamic and interactive images under the control of professional mathematicians and mathematics teachers. This volume in the NATO Special Programme on Advanced Educational Technology takes a comprehensive and critical look at how the computer can support the use of visual images in mathematical problem solving. The contributions are written by researchers and teachers from a variety of disciplines including computer science, mathematics, mathematics education, psychology, and design. Some focus on the use of external visual images and others on the development of individual mental imagery. The book is the first collected volume in a research area that is developing rapidly, and the authors pose some challenging new questions.

  4. Shredder: GPU-Accelerated Incremental Storage and Computation

    OpenAIRE

    Bhatotia, Pramod; Rodrigues, Rodrigo; Verma, Akshat

    2012-01-01

    Redundancy elimination using data deduplication and incremental data processing has emerged as an important technique to minimize storage and computation requirements in data center computing. In this paper, we present the design, implementation and evaluation of Shredder, a high performance content-based chunking framework for supporting incremental storage and computation systems. Shredder exploits the massively parallel processing power of GPUs to overcome the CPU bottlenecks of content-ba...
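
    As a rough illustration of the chunking workload that Shredder moves onto GPUs, the following single-threaded sketch implements generic content-defined chunking with a polynomial rolling hash: a boundary is declared wherever the windowed hash matches a bit mask, so boundaries follow content rather than offsets and survive insertions. Window size, mask, and chunk-size limits are arbitrary choices, not Shredder's parameters.

        # Generic content-defined chunking with a polynomial rolling hash, the
        # operation Shredder offloads to GPUs. Boundaries depend on content, not
        # offsets. Window, mask, and chunk limits are illustrative choices.
        import os

        WINDOW, MASK = 48, (1 << 13) - 1          # ~8 KiB average chunk size
        MIN_CHUNK, MAX_CHUNK = 2048, 65536
        BASE, MOD = 257, (1 << 61) - 1
        POW_OUT = pow(BASE, WINDOW - 1, MOD)      # weight of byte leaving window

        def chunk_boundaries(data: bytes):
            """Yield end offsets (exclusive) of content-defined chunks."""
            h, start = 0, 0
            for i, b in enumerate(data):
                if i - start < WINDOW:
                    h = (h * BASE + b) % MOD                     # window filling
                else:
                    out = data[i - WINDOW]                       # byte leaving
                    h = ((h - out * POW_OUT) * BASE + b) % MOD
                size = i - start + 1
                if size >= MAX_CHUNK or (size >= MIN_CHUNK and (h & MASK) == 0):
                    yield i + 1
                    start, h = i + 1, 0
            if start < len(data):
                yield len(data)

        data = os.urandom(1 << 20)                 # 1 MiB of random input
        sizes, prev = [], 0
        for end in chunk_boundaries(data):
            sizes.append(end - prev)
            prev = end
        print(f"{len(sizes)} chunks, mean size {sum(sizes) // len(sizes)} bytes")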

  5. ATLAS and LHC computing on CRAY

    CERN Document Server

    Haug, Sigve; The ATLAS collaboration

    2016-01-01

    Access and exploitation of large scale computing resources, such as those offered by general purpose HPC centres, is one important measure for ATLAS and the other Large Hadron Collider experiments in order to meet the challenge posed by the full exploitation of the future data within the constraints of flat budgets. We report on the effort of moving the Swiss WLCG T2 computing, serving ATLAS, CMS and LHCb, from a dedicated cluster to the large CRAY systems at the Swiss National Supercomputing Centre CSCS. These systems not only offer very efficient hardware, cooling and highly competent operators, but also have large backfill potential due to their size and multidisciplinary usage, and potential gains due to economies of scale. Technical solutions, performance, expected return and future plans are discussed.

  6. ATLAS and LHC computing on CRAY

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00297774; The ATLAS collaboration; Haug, Sigve

    2017-01-01

    Access and exploitation of large scale computing resources, such as those offered by general purpose HPC centres, is one important measure for ATLAS and the other Large Hadron Collider experiments in order to meet the challenge posed by the full exploitation of the future data within the constraints of flat budgets. We report on the effort of moving the Swiss WLCG T2 computing, serving ATLAS, CMS and LHCb, from a dedicated cluster to the large Cray systems at the Swiss National Supercomputing Centre CSCS. These systems not only offer very efficient hardware, cooling and highly competent operators, but also have large backfill potential due to their size and multidisciplinary usage, and potential gains due to economies of scale. Technical solutions, performance, expected return and future plans are discussed.

  7. Subspace exclusion zones for damage localization

    DEFF Research Database (Denmark)

    Bernal, Dionisio; Ulriksen, Martin Dalgaard

    2018-01-01

    This is exploited in the context of structural damage localization to cast the Subspace Exclusion Zone (SEZ) scheme, which locates damage by reconstructing the captured field quantity shifts from analytical subspaces indexed by postulated boundaries, the so-called exclusion zones (EZs), in a model of the structure...

  8. Abnormalities by pulmonary regions studied with computer tomography following local or local-regional radiotherapy for breast cancer

    International Nuclear Information System (INIS)

    Lind, Pehr; Svane, Gunilla; Gagliardi, Giovanna; Svensson, Christer

    1999-01-01

    Purpose: To study pulmonary radiological abnormalities with computer tomography (CT) following different radiotherapy (RT) techniques for breast cancer with respect to regions and density, and their correlation to pulmonary complications and reduction in vital capacity (VC). Methods and Materials: CT scans of the lungs were performed prior to and 4 months following RT in 105 breast cancer patients treated with local or local-regional RT. The radiological abnormalities were analyzed with a CT-adapted modification of a classification system originally proposed by Arriagada, and scored according to increasing density (0-3) and affected lung regions (apical-lateral, central-parahilar, basal-lateral). The highest density grades in each region were added together to form scores ranging from 0-9. The patients were monitored for RT-induced pulmonary complications. VC was measured prior to and 5 months following RT. Results: Increasing CT scores were correlated with both local-regional RT and pulmonary complications (p < 0.001). The mean reduction of VC for patients scoring 4-9 (-202 ml) was larger than for patients scoring 0-3 (-2 ml) (p = 0.035). The effect of confounding factors on the radiological scoring was tested in the local-regional RT group. Scores of 4-9 were less frequently seen in the patients who had received adjuvant chemotherapy prior to RT. The importance of the respective lung regions on the outcome of pulmonary complications was tested. Only radiological abnormalities in the central-parahilar and apical-lateral regions were significantly correlated to pulmonary complications. Discussion: Radiological abnormalities detected on CT images and scored with a modification of Arriagada's classification system can be used as an objective endpoint for pulmonary side effects in breast cancer. The described model should, however, be expanded with information about the volume of lung affected in each region before definite conclusions can be drawn concerning each

  9. SEXUAL EXPLOITATION AND ABUSE BY UN PEACEKEEPERS ...

    African Journals Online (AJOL)

    Allaiac

    sexual exploitation of children by peacekeepers is particularly insidious. ... sexual exploitation and abuse should involve an understanding of the social .... The charges of sexual misconduct, and the consequent media exposure, have ..... awareness programmes such as video tapes, lectures and training manuals, designed.

  10. Imaging local cerebral blood flow by xenon-enhanced computed tomography - technical optimization procedures

    International Nuclear Information System (INIS)

    Meyer, J.S.; Shinohara, T.; Imai, A.; Kobari, M.; Solomon, E.

    1988-01-01

    Methods are described for non-invasive, computer-assisted serial scanning throughout the human brain during eight minutes of inhalation of 27%-30% xenon gas in order to measure local cerebral blood flow (LCBF). Optimized xenon-enhanced computed tomography (XeCT) was achieved by 5-second scanning at one-minute intervals utilizing a state-of-the-art CT scanner and rapid delivery of xenon gas via a face mask. Values for local brain-blood partition coefficients (Lλ) measured in vivo were utilized to calculate LCBF values. Previous methods assumed Lλ values to be normal, introducing the risk of systematic errors, because Lλ values differ throughout normal brain and may be altered by disease. Color-coded maps of Lλ and LCBF values were formatted directly onto CT images for exact correlation of function with anatomic and pathologic observations (spatial resolution: 26.5 cubic mm). Results were compared among eight normal volunteers, aged between 50 and 88 years. Mean cortical gray matter blood flow was 46.3 ± 7.7, for subcortical gray matter it was 50.3 ± 13.2 and for white matter it was 18.8 ± 3.2. Modern CT scanners provide stability, improved signal to noise ratio and minimal radiation scatter. Combining these advantages with rapid xenon saturation of the blood provides correlations of Lλ and LCBF with images of normal and abnormal brain in a safe, useful and non-invasive manner. (orig.)

  11. Graphical User Interface Programming in Introductory Computer Science.

    Science.gov (United States)

    Skolnick, Michael M.; Spooner, David L.

    Modern computing systems exploit graphical user interfaces for interaction with users; as a result, introductory computer science courses must begin to teach the principles underlying such interfaces. This paper presents an approach to graphical user interface (GUI) implementation that is simple enough for beginning students to understand, yet…

  12. Computerized Cognitive Rehabilitation: Comparing Different Human-Computer Interactions.

    Science.gov (United States)

    Quaglini, Silvana; Alloni, Anna; Cattani, Barbara; Panzarasa, Silvia; Pistarini, Caterina

    2017-01-01

    In this work we describe an experiment involving aphasic patients, where the same speech rehabilitation exercise was administered in three different modalities, two of which are computer-based. In particular, one modality exploits the "Makey Makey", an electronic board which allows interacting with the computer using physical objects.

  13. Exploiting structure similarity in refinement: automated NCS and target-structure restraints in BUSTER

    Energy Technology Data Exchange (ETDEWEB)

    Smart, Oliver S., E-mail: osmart@globalphasing.com; Womack, Thomas O.; Flensburg, Claus; Keller, Peter; Paciorek, Włodek; Sharff, Andrew; Vonrhein, Clemens; Bricogne, Gérard [Global Phasing Ltd, Sheraton House, Castle Park, Cambridge CB3 0AX (United Kingdom)

    2012-04-01

    Local structural similarity restraints (LSSR) provide a novel method for exploiting NCS or structural similarity to an external target structure. Two examples are given where BUSTER re-refinement of PDB entries with LSSR produces marked improvements, enabling further structural features to be modelled. Maximum-likelihood X-ray macromolecular structure refinement in BUSTER has been extended with restraints facilitating the exploitation of structural similarity. The similarity can be between two or more chains within the structure being refined, thus favouring NCS, or to a distinct ‘target’ structure that remains fixed during refinement. The local structural similarity restraints (LSSR) approach considers all distances less than 5.5 Å between pairs of atoms in the chain to be restrained. For each, the difference from the distance between the corresponding atoms in the related chain is found. LSSR applies a restraint penalty on each difference. A functional form that reaches a plateau for large differences is used to avoid the restraints distorting parts of the structure that are not similar. Because LSSR are local, there is no need to separate out domains. Some restraint pruning is still necessary, but this has been automated. LSSR have been available to academic users of BUSTER since 2009 with the easy-to-use -autoncs and -target target.pdb options. The use of LSSR is illustrated in the re-refinement of two PDB entries: in one, -target enables the correct ligand-binding structure to be found; in the other, -autoncs contributes to the location of an additional copy of the cyclic peptide ligand.
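
    The abstract specifies the restraint targets (differences of corresponding interatomic distances below 5.5 Å) and that the penalty plateaus for large differences, but not the exact functional form; the sketch below therefore uses a Geman-McClure-style robust function as a stand-in for BUSTER's actual term.

        # Sketch of an LSSR-style restraint: for every atom pair closer than
        # 5.5 A, penalize the difference from the corresponding distance in the
        # related chain with a function that plateaus for large differences.
        # The Geman-McClure form is an assumption; the abstract only says the
        # penalty reaches a plateau.
        import numpy as np
        from scipy.spatial.distance import pdist

        def lssr_penalty(xyz_a, xyz_b, cutoff=5.5, d0=1.0):
            """Sum of robust penalties over corresponding short-range distances.
            xyz_a, xyz_b: (N, 3) coordinates of two chains with matched atoms."""
            da, db = pdist(xyz_a), pdist(xyz_b)     # all pairwise distances
            mask = da < cutoff                       # restrain only local contacts
            diff = da[mask] - db[mask]
            return np.sum(diff**2 / (d0**2 + diff**2))   # plateaus at 1 per pair

        rng = np.random.default_rng(0)
        a = rng.normal(size=(50, 3)) * 4.0
        b = a + rng.normal(scale=0.1, size=a.shape)      # nearly identical chain
        print("penalty:", lssr_penalty(a, b))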

  14. Exploiting first-class arrays in Fortran for accelerator programming

    International Nuclear Information System (INIS)

    Rasmussen, Craig E.; Weseloh, Wayne N.; Robey, Robert W.; Sottile, Matthew J.; Quinlan, Daniel; Overbey, Jeffrey

    2010-01-01

    Emerging architectures for high performance computing often are well suited to a data parallel programming model. This paper presents a simple programming methodology based on existing languages and compiler tools that allows programmers to take advantage of these systems. We will work with the array features of Fortran 90 to show how this infrequently exploited, standardized language feature is easily transformed to lower-level accelerator code. Our transformations are based on a mapping from Fortran 90 to C++ code with OpenCL extensions. The sheer complexity of programming for clusters of many-core or multi-core processors with tens of millions of threads of execution makes the simplicity of the data parallel model attractive. Furthermore, the increasing complexity of today's applications (especially when convolved with the increasing complexity of the hardware) and the need for portability across hardware architectures make a higher-level and simpler programming model like data parallel attractive. The goal of this work has been to exploit source-to-source transformations that allow programmers to develop and maintain programs at a high level of abstraction, without coding to a specific hardware architecture. Furthermore, these transformations allow multiple hardware architectures to be targeted without changing the high-level source. It also removes the necessity for application programmers to understand details of the accelerator architecture or to know OpenCL.
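
    The transformation the paper describes, from whole-array statements to per-element accelerator kernels, can be illustrated in a language-neutral way. Since the paper's toolchain is Fortran 90 to C++/OpenCL, the NumPy rendering below is only an analogy: the array statement is the high-level form, and the per-index "kernel" loop stands in for the OpenCL work-items it would be lowered to.

        # Analogy to the paper's transformation: a whole-array statement (the
        # high-level, data-parallel form) and the elementwise kernel it lowers
        # to, where each index plays the role of one accelerator work-item.
        # The actual tool maps Fortran 90 to C++/OpenCL; this is illustrative.
        import numpy as np

        a = np.arange(8, dtype=np.float64)
        b = np.arange(8, dtype=np.float64) * 2.0

        # High-level array statement (cf. Fortran 90:  c = a + 2.0*b)
        c_array = a + 2.0 * b

        # Lowered "kernel" form: one body execution per element, as OpenCL
        # would launch one work-item per index.
        def kernel(gid, a, b, c):
            c[gid] = a[gid] + 2.0 * b[gid]

        c_kernel = np.empty_like(a)
        for gid in range(a.size):   # an accelerator runs these in parallel
            kernel(gid, a, b, c_kernel)

        assert np.allclose(c_array, c_kernel)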

  15. Exploitation and exploration dynamics in recessionary times

    NARCIS (Netherlands)

    Walrave, B.

    2012-01-01

    Firm performance largely depends on the ability to adapt to, and exploit, changes in the business environment. That is, firms should maintain ecological fitness by reconfiguring their resource base to cope with emerging threats and explore new opportunities, while at the same time exploiting

  16. Main principles of developing exploitation models of semiconductor devices

    Science.gov (United States)

    Gradoboev, A. V.; Simonova, A. V.

    2018-05-01

    The paper presents the primary tasks whose solutions allow developing exploitation models of semiconductor devices that take into account the complex and combined influence of ionizing irradiation and operating factors. The structure of the exploitation model of a semiconductor device is presented, which is based on radiation and reliability models. Furthermore, it is shown that the exploitation model should take into account the complex and combined influence of various types of ionizing irradiation and operating factors. An algorithm for developing the exploitation model of semiconductor devices is proposed. The possibility of creating radiation models of a Schottky barrier diode, a Schottky field-effect transistor and a Gunn diode is shown based on the available experimental data. The basic exploitation model of IR LEDs based upon double AlGaAs heterostructures is presented. The practical application of the exploitation models will make it possible to release electronic products with guaranteed operational properties.

  17. EXPLOT - decision support system for optimization of oil exploitation; EXPLOT - sistema de apoio a decisao para a otimizacao da explotacao de petroleo

    Energy Technology Data Exchange (ETDEWEB)

    Tupac Valdivia, Yvan Jesus; Almeida, Luciana Faletti; Pacheco, Marco Aurelio Cavalcanti; Vellasco, Marley Maria Bernardes Rebuzzi [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio), RJ (Brazil). Dept. de Engenharia Eletrica. Lab. de Inteligencia Computacional], e-mail: yvantv@ele.puc-rio.br, e-mail: faletti@ele.puc-rio.br, e-mail: marco@ele.puc-rio.br, e-mail: marley@ele.puc-rio.br

    2007-06-15

    The present work presents a decision-support system that integrates different techniques (genetic algorithms, cultural algorithms, co-evolution, neural networks, a neuro-fuzzy model, and distributed processing) for optimizing the exploitation of oil reservoirs. The EXPLOT system identifies exploitation alternatives and determines the quantity, position, type (producer or injector) and structure (horizontal or vertical) of wells that maximize the net present value (VPL) of the alternative. The EXPLOT system is composed of three main modules: the optimizer (genetic algorithms, cultural algorithms and co-evolution), the Production Curve Obtention module (a neuro-fuzzy NFHB approximator of the production curve) and the net present value calculation. To estimate the VPL of each development alternative, the system utilizes a reservoir simulator, specifically IMEX, although other simulators may be utilized. In addition to these technologies, the system also uses distributed processing, based on the CORBA architecture, for distributed execution of the reservoir simulator on a computer network, which significantly reduces the total optimization time. The EXPLOT system has already been tested on different examples of oil fields. Results obtained so far are considered consistent in the opinion of specialists, who regard the system as a new decision-support tool concept in the area. EXPLOT is distinguished not only by its efficient optimization model but also by its interface, through which specialists interact with the system, introducing project recommendations (e.g., five-spot wells), commanding localized searches for better solutions, sizing the simulation grid and monitoring the distribution of simulations over the available networks. The EXPLOT system is the result of joint research between CENPES and the Applied Computational Intelligence Lab, PUC-Rio, accomplished during the past three years. The continuation of this research project expands

  18. PANTHER. Pattern ANalytics To support High-performance Exploitation and Reasoning.

    Energy Technology Data Exchange (ETDEWEB)

    Czuchlewski, Kristina Rodriguez [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hart, William E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    Sandia has approached the analysis of big datasets with an integrated methodology that uses computer science, image processing, and human factors to exploit critical patterns and relationships in large datasets despite the variety and rapidity of information. The work is part of a three-year LDRD Grand Challenge called PANTHER (Pattern ANalytics To support High-performance Exploitation and Reasoning). To maximize data analysis capability, Sandia pursued scientific advances across three key technical domains: (1) geospatial-temporal feature extraction via image segmentation and classification; (2) geospatial-temporal analysis capabilities tailored to identify and process new signatures more efficiently; and (3) domain- relevant models of human perception and cognition informing the design of analytic systems. Our integrated results include advances in geographical information systems (GIS) in which we discover activity patterns in noisy, spatial-temporal datasets using geospatial-temporal semantic graphs. We employed computational geometry and machine learning to allow us to extract and predict spatial-temporal patterns and outliers from large aircraft and maritime trajectory datasets. We automatically extracted static and ephemeral features from real, noisy synthetic aperture radar imagery for ingestion into a geospatial-temporal semantic graph. We worked with analysts and investigated analytic workflows to (1) determine how experiential knowledge evolves and is deployed in high-demand, high-throughput visual search workflows, and (2) better understand visual search performance and attention. Through PANTHER, Sandia's fundamental rethinking of key aspects of geospatial data analysis permits the extraction of much richer information from large amounts of data. The project results enable analysts to examine mountains of historical and current data that would otherwise go untouched, while also gaining meaningful, measurable, and defensible insights into

  19. Final Project Report: Data Locality Enhancement of Dynamic Simulations for Exascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Shen, Xipeng [North Carolina State Univ., Raleigh, NC (United States)

    2016-04-27

    The goal of this project is to develop a set of techniques and software tools to enhance the matching between memory accesses in dynamic simulations and the prominent features of modern and future manycore systems, alleviating the memory performance issues for exascale computing. In the first three years, the PI and his group have achieved significant progress towards the goal, producing a set of novel techniques for improving the memory performance and data locality in manycore systems, yielding 18 conference and workshop papers and 4 journal papers, and graduating 6 Ph.D. students. This report summarizes the research results of this project through that period.

  20. A 250-Mbit/s ring local computer network using 1.3-microns single-mode optical fibers

    Science.gov (United States)

    Eng, S. T.; Tell, R.; Andersson, T.; Eng, B.

    1985-01-01

    A 250-Mbit/s three-station fiber-optic ring local computer network was built and successfully demonstrated. A conventional token protocol was employed for bus arbitration to maximize the bus efficiency under high loading conditions, and a non-return-to-zero (NRZ) data encoding format was selected for simplicity and maximum utilization of the ECL-circuit bandwidth.

  1. Time complexity analysis for distributed memory computers: implementation of parallel conjugate gradient method

    NARCIS (Netherlands)

    Hoekstra, A.G.; Sloot, P.M.A.; Haan, M.J.; Hertzberger, L.O.; van Leeuwen, J.

    1991-01-01

    New developments in Computer Science, both hardware and software, offer researchers, such as physicists, unprecedented possibilities to solve their computationally intensive problems. However, full exploitation of, e.g., new massively parallel computers, parallel languages or runtime environments

  2. Packaging of Sin Goods - Commitment or Exploitation?

    DEFF Research Database (Denmark)

    Nafziger, Julia

    to such self-control problems, and possibly exploit them, by offering different package sizes. In a competitive market, either one or three (small, medium and large) packages are offered. In contrast to common intuition, the large, and not the small package is a commitment device. The latter serves to exploit...

  3. Distributed Bayesian Computation and Self-Organized Learning in Sheets of Spiking Neurons with Local Lateral Inhibition.

    Directory of Open Access Journals (Sweden)

    Johannes Bill

    Full Text Available During the last decade, Bayesian probability theory has emerged as a framework in cognitive science and neuroscience for describing perception, reasoning and learning of mammals. However, our understanding of how probabilistic computations could be organized in the brain, and how the observed connectivity structure of cortical microcircuits supports these calculations, is rudimentary at best. In this study, we investigate statistical inference and self-organized learning in a spatially extended spiking network model, that accommodates both local competitive and large-scale associative aspects of neural information processing, under a unified Bayesian account. Specifically, we show how the spiking dynamics of a recurrent network with lateral excitation and local inhibition in response to distributed spiking input, can be understood as sampling from a variational posterior distribution of a well-defined implicit probabilistic model. This interpretation further permits a rigorous analytical treatment of experience-dependent plasticity on the network level. Using machine learning theory, we derive update rules for neuron and synapse parameters which equate with Hebbian synaptic and homeostatic intrinsic plasticity rules in a neural implementation. In computer simulations, we demonstrate that the interplay of these plasticity rules leads to the emergence of probabilistic local experts that form distributed assemblies of similarly tuned cells communicating through lateral excitatory connections. The resulting sparse distributed spike code of a well-adapted network carries compressed information on salient input features combined with prior experience on correlations among them. Our theory predicts that the emergence of such efficient representations benefits from network architectures in which the range of local inhibition matches the spatial extent of pyramidal cells that share common afferent input.

  4. Distributed Bayesian Computation and Self-Organized Learning in Sheets of Spiking Neurons with Local Lateral Inhibition

    Science.gov (United States)

    Bill, Johannes; Buesing, Lars; Habenschuss, Stefan; Nessler, Bernhard; Maass, Wolfgang; Legenstein, Robert

    2015-01-01

    During the last decade, Bayesian probability theory has emerged as a framework in cognitive science and neuroscience for describing perception, reasoning and learning of mammals. However, our understanding of how probabilistic computations could be organized in the brain, and how the observed connectivity structure of cortical microcircuits supports these calculations, is rudimentary at best. In this study, we investigate statistical inference and self-organized learning in a spatially extended spiking network model, that accommodates both local competitive and large-scale associative aspects of neural information processing, under a unified Bayesian account. Specifically, we show how the spiking dynamics of a recurrent network with lateral excitation and local inhibition in response to distributed spiking input, can be understood as sampling from a variational posterior distribution of a well-defined implicit probabilistic model. This interpretation further permits a rigorous analytical treatment of experience-dependent plasticity on the network level. Using machine learning theory, we derive update rules for neuron and synapse parameters which equate with Hebbian synaptic and homeostatic intrinsic plasticity rules in a neural implementation. In computer simulations, we demonstrate that the interplay of these plasticity rules leads to the emergence of probabilistic local experts that form distributed assemblies of similarly tuned cells communicating through lateral excitatory connections. The resulting sparse distributed spike code of a well-adapted network carries compressed information on salient input features combined with prior experience on correlations among them. Our theory predicts that the emergence of such efficient representations benefits from network architectures in which the range of local inhibition matches the spatial extent of pyramidal cells that share common afferent input. PMID:26284370
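
    A toy sketch of the ingredients the abstract names, lateral inhibition producing winner-take-all sampling, Hebbian synaptic updates, and a homeostatic intrinsic term, is given below. It is a generic mixture-learning caricature in NumPy; the paper's derivation from a variational posterior and its exact plasticity rules are not reproduced.

        # Caricature of the described ingredients: a local winner-take-all
        # circuit whose spikes sample a softmax posterior over hidden causes,
        # trained by a Hebbian weight update plus a homeostatic bias update.
        # This is a generic sketch, not the paper's exact plasticity rules.
        import numpy as np

        rng = np.random.default_rng(1)
        N_IN, N_HID, ETA = 20, 4, 0.05

        W = rng.normal(scale=0.1, size=(N_HID, N_IN))   # synaptic weights
        b = np.zeros(N_HID)                             # excitabilities

        def sample_winner(x):
            """Lateral inhibition: one 'expert' spikes, sampled from the
            softmax posterior over hidden causes given binary input x."""
            u = W @ x + b
            p = np.exp(u - u.max())
            p /= p.sum()
            return rng.choice(N_HID, p=p)

        # Inputs drawn from two binary prototypes (the features to discover)
        protos = rng.random((2, N_IN)) < 0.5
        for _ in range(3000):
            x = (protos[rng.integers(2)] ^ (rng.random(N_IN) < 0.05)).astype(float)
            k = sample_winner(x)
            W[k] += ETA * (x - 0.5)        # Hebbian: winner moves toward input
            b += ETA * (1.0 / N_HID - (np.arange(N_HID) == k))  # equalize firing

        print("excitability bias after learning:", np.round(b, 2))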

  5. Exploiting MIC architectures for the simulation of channeling of charged particles in crystals

    Science.gov (United States)

    Bagli, Enrico; Karpusenko, Vadim

    2016-08-01

    Coherent effects of ultra-relativistic particles in crystals are an area of science under development. DYNECHARM++ is a toolkit for the simulation of coherent interactions between high-energy charged particles and complex crystal structures. The particle trajectory in a crystal is computed through numerical integration of the equation of motion. The code was revised and improved in order to exploit parallelization across multiple cores and vectorization of single instructions on multiple data. An Intel Xeon Phi card was adopted for the performance measurements. The computation time was shown to scale linearly as a function of the number of physical and virtual cores. By enabling the compiler's auto-vectorization flag, a threefold speedup was obtained. The performance of the card was compared to that of a dual-Xeon system.
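
    The vectorization opportunity the paper exploits comes from the trajectory integration being identical for every particle. The sketch below integrates many transverse trajectories at once with a leapfrog scheme in a harmonic approximation of the interplanar potential; the potential form, units, and beam parameters are illustrative, not DYNECHARM++'s physics.

        # Vectorizable core of a channeling simulation: leapfrog integration of
        # many particle trajectories at once in a harmonic approximation of the
        # interplanar potential. All parameters are illustrative; the point is
        # that the update is identical per particle, so it maps onto SIMD lanes.
        import numpy as np

        N, STEPS, DT = 100_000, 2000, 1e-3   # particles, steps, step (arb. units)
        D = 1.0                              # interplanar spacing
        K_POT = 40.0                         # harmonic strength of the potential

        rng = np.random.default_rng(0)
        x = rng.uniform(-D / 2, D / 2, N)    # transverse position in the channel
        v = rng.normal(scale=0.05, size=N)   # transverse velocity

        for _ in range(STEPS):               # every line vectorizes over all N
            v += 0.5 * DT * (-K_POT * x)
            x += DT * v
            v += 0.5 * DT * (-K_POT * x)

        channeled = np.abs(x) < D / 2
        print(f"fraction still inside the channel: {channeled.mean():.3f}")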

  6. An evaluation of multi-probe locality sensitive hashing for computing similarities over web-scale query logs.

    Directory of Open Access Journals (Sweden)

    Graham Cormode

    Full Text Available Many modern applications of AI such as web search, mobile browsing, image processing, and natural language processing rely on finding similar items from a large database of complex objects. Due to the very large scale of data involved (e.g., users' queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance, suitable for deployment in very large scale settings. The experimental results demonstrate that our variants of LSH achieve robust performance with better recall compared with "vanilla" LSH, even when using the same amount of space.
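
    A minimal random-hyperplane LSH index with multi-probing conveys the technique being evaluated: a query inspects its own bucket plus the buckets reached by flipping its least-confident hash bits, trading extra probes for fewer tables. This generic sketch does not reproduce the paper's four variants or their Hadoop deployment.

        # Minimal random-hyperplane LSH with multi-probing: probe the query's
        # own bucket plus buckets reached by flipping least-confident bits.
        import numpy as np
        from collections import defaultdict

        rng = np.random.default_rng(0)
        DIM, BITS = 64, 12
        planes = rng.normal(size=(BITS, DIM))        # one hyperplane per bit

        def signature(v):
            return planes @ v                         # signed margin per bit

        def bucket(margins):
            return int("".join("1" if m > 0 else "0" for m in margins), 2)

        data = rng.normal(size=(10_000, DIM))
        table = defaultdict(list)
        for idx, v in enumerate(data):
            table[bucket(signature(v))].append(idx)

        def query(v, n_probes=8):
            m = signature(v)
            base = bucket(m)
            order = np.argsort(np.abs(m))            # least-confident bits first
            probes = [base] + [base ^ (1 << (BITS - 1 - int(i)))
                               for i in order[:n_probes - 1]]
            cand = {i for p in probes for i in table.get(p, [])}
            return max(cand, default=None,
                       key=lambda i: data[i] @ v / np.linalg.norm(data[i]))

        q = data[42] + rng.normal(scale=0.1, size=DIM)  # noisy copy of item 42
        print("nearest candidate:", query(q))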

  7. A memory-array architecture for computer vision

    Energy Technology Data Exchange (ETDEWEB)

    Balsara, P.T.

    1989-01-01

    With the fast advances in the area of computer vision and robotics there is a growing need for machines that can understand images at a very high speed. A conventional von Neumann computer is not suited for this purpose, because it takes a tremendous amount of time to solve most typical image processing problems. Exploiting the inherent parallelism present in various vision tasks can significantly reduce the processing time. Fortunately, parallelism is increasingly affordable as hardware gets cheaper. Thus it is now imperative to study computer vision in a parallel processing framework. One must first design a computational structure which is well suited for a wide range of vision tasks and then develop parallel algorithms which can run efficiently on this structure. Recent advances in VLSI technology have led to several proposals for parallel architectures for computer vision. In this thesis the author demonstrates that a memory array architecture with efficient local and global communication capabilities can be used for high speed execution of a wide range of computer vision tasks. This architecture, called the Access Constrained Memory Array Architecture (ACMAA), is efficient for VLSI implementation because of its modular structure, simple interconnect and limited global control. Several parallel vision algorithms have been designed for this architecture. The choice of vision problems demonstrates the versatility of ACMAA for a wide range of vision tasks. These algorithms were simulated on a high level ACMAA simulator running on the Intel iPSC/2 hypercube, a parallel architecture. The results of this simulation are compared with those of sequential algorithms running on a single hypercube node. Details of the ACMAA processor architecture are also presented.

  8. New computer simulation technology of WSPEEDI for local and regional environmental assessment during nuclear emergency

    International Nuclear Information System (INIS)

    Chino, Masamichi; Furuno, Akiko; Terada, Hiroaki; Kitabata, Hideyuki

    2002-01-01

    The increase of nuclear power plants in the Asian region necessitates the capability to predict long-range atmospheric dispersion of radionuclides and the radiological impacts of a nuclear accident. For this purpose, we have developed a computer-based emergency response system, WSPEEDI. This paper aims at expanding the capability of WSPEEDI so that it can be applied to simultaneous multi-scale predictions at local and regional scales in the Asian region.

  9. Computer simulation of local atomic displacements in alloys. Application to Guinier-Preston zones in Al-Cu

    International Nuclear Information System (INIS)

    Kyobu, J.; Murata, Y.; Morinaga, M.

    1994-01-01

    A new computer program has been developed for the simulation of local atomic displacements in alloys with face-centered-cubic and body-centered-cubic lattices. The combined use of this program with the Gehlen-Cohen program for the simulation of chemical short-range order completely describes atomic fluctuations in alloys. The method has been applied to the structural simulation of Guinier-Preston (GP) zones in an Al-Cu alloy, using the experimental data of Matsubara and Cohen. Characteristic displacements of atoms have been observed around the GP zones and new structural models including local displacements have been proposed for a single-layer zone and several multilayer zones. (orig.)

  10. Exploiting HPC Platforms for Metagenomics: Challenges and Opportunities (MICW - Metagenomics Informatics Challenges Workshop: 10K Genomes at a Time)

    Energy Technology Data Exchange (ETDEWEB)

    Canon, Shane

    2011-10-12

    DOE JGI's Zhong Wang, chair of the High-performance Computing session, gives a brief introduction before Berkeley Lab's Shane Canon talks about "Exploiting HPC Platforms for Metagenomics: Challenges and Opportunities" at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.

  11. A Theory of Exploitative Child Labor

    OpenAIRE

    Carol Ann Rogers; Kenneth A. Swinnerton

    2003-01-01

    Child labor laws should aim to protect children who work, instead of trying to remove children from work. In this paper, we identify an instance when the risk of exploitation lowers the expected benefit of child labor to the child, and therefore suppresses child labor force participation. Targeted legal intervention that lowers or removes the risk of exploitation raises child participation in the labor market, child welfare, and overall societal welfare. Targeting on child labor more broadly ma...

  12. ATLAS off-Grid sites (Tier 3) monitoring. From local fabric monitoring to global overview of the VO computing activities

    CERN Document Server

    PETROSYAN, A; The ATLAS collaboration; BELOV, S; ANDREEVA, J; KADOCHNIKOV, I

    2012-01-01

    The ATLAS Distributed Computing activities have so far concentrated on the "central" part of the experiment's computing system, namely the first 3 tiers (the CERN Tier0, 10 Tier1 centers and over 60 Tier2 sites). Many ATLAS institutes and national communities have deployed (or intend to deploy) Tier-3 facilities. Tier-3 centers consist of non-pledged resources, which are usually dedicated to data analysis tasks by geographically close or local scientific groups, and which usually comprise a range of architectures without Grid middleware. Therefore a substantial part of the ATLAS monitoring tools, which make use of Grid middleware, cannot be used for a large fraction of Tier3 sites. The presentation will describe the T3mon project, which aims to develop a software suite for monitoring the Tier3 sites, both from the perspective of the local site administrator and that of the ATLAS VO, thereby enabling a global view of the contribution from Tier3 sites to the ATLAS computing activities. Special attention in p...

  13. Rationalising predictors of child sexual exploitation and sex-trading.

    Science.gov (United States)

    Klatt, Thimna; Cavner, Della; Egan, Vincent

    2014-02-01

    Although there is evidence for specific risk factors leading to child sexual exploitation and prostitution, these influences overlap and have rarely been examined concurrently. The present study examined case files for 175 young persons who attended a voluntary organization in Leicester, United Kingdom, which supports people who are sexually exploited or at risk of sexual exploitation. Based on the case files, the presence or absence of known risk factors for becoming a sex worker was coded. Data were analyzed using t-test, logistic regression, and smallest space analysis. Users of the voluntary organization's services who had been sexually exploited exhibited a significantly greater number of risk factors than service users who had not been victims of sexual exploitation. The logistic regression produced a significant model fit. However, of the 14 potential predictors (many of which were associated with each other) only four variables significantly predicted actual sexual exploitation: running away, poverty, drug and/or alcohol use, and having friends or family members in prostitution. Surprisingly, running away was found to significantly decrease the odds of becoming involved in sexual exploitation. Smallest space analysis of the data revealed 5 clusters of risk factors. Two of the clusters, which reflected a desperation and need construct and immature or out-of-control lifestyles, were significantly associated with sexual exploitation. Our research suggests that some risk factors (e.g. physical and emotional abuse, early delinquency, and homelessness) for becoming involved in sexual exploitation are common but are part of the problematic milieu of the individuals affected and not directly associated with sex trading itself. Our results also indicate that it is important to engage with the families and associates of young persons at risk of becoming (or remaining) a sex worker if one wants to reduce the numbers of persons who engage in this activity.

  14. Computational Biology and High Performance Computing 2000

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  15. Shale gas exploitation: Status, problems and prospect

    Directory of Open Access Journals (Sweden)

    Shiqian Wang

    2018-02-01

    Full Text Available Over the past five years, great progress has been made in shale gas exploitation, which has become the main driving force of global gas output growth. Hydrocarbon extraction from shale helps drive the USA on the road to energy independence. Moreover, shale oil and gas production has sustained its growth through continuous improvements in drilling efficiency and well productivity, even as oil prices and rig counts have tumbled. Shale gas reserves and production have grown rapidly in China owing to exploitation of the Lower Paleozoic Wufeng and Longmaxi shales in the Sichuan Basin, which has become an important sector for future growth of gas reserves and output in China. However, substantial progress has been made neither in non-marine shale gas exploitation, as previously expected, nor in the broad complicated tectonic areas in South China in which considerable investment was made. Analysis of the basic situation and issues in domestic shale gas development shows that shale gas exploitation prospects are constrained by many problems in terms of resource endowment, horizontal-well fracturing technology, etc., especially in non-marine shale deposits and the complicated tectonic areas of South China, where hot shales are widely distributed but geological structures are severely deformed and over-mature. Discussion of the prospects shows that sustained and steady growth in shale gas reserves and production capacity in the coming years lies in the discovery and supersession of new shale plays in addition to the Wufeng and Longmaxi plays, and that a technological breakthrough in exploiting ultra-high-pressure, ultra-deep (buried over 3500 m) marine shale gas in the Sichuan Basin is the key and the hope. Keywords: Shale gas, Exploitation, Marine facies, Hot shale, Resource endowment, Sichuan Basin, South China, Complicated tectonic area, Gas play

  16. Hybrid Indoor-Based WLAN-WSN Localization Scheme for Improving Accuracy Based on Artificial Neural Network

    Directory of Open Access Journals (Sweden)

    Zahid Farid

    2016-01-01

    Full Text Available In indoor environments, WiFi received-signal-strength (RSS)-based localization is sensitive to various indoor fading effects and noise during transmission, which are the main causes of localization errors that affect its accuracy. In view of those fading effects, positioning systems based on a single technology are ineffective at performing accurate localization. For this reason, the trend is toward the use of hybrid positioning systems (combining two or more wireless technologies) in indoor/outdoor localization scenarios to obtain better position accuracy. This paper presents a hybrid technique for indoor localization that adopts fingerprinting approaches in both WiFi and Wireless Sensor Networks (WSNs). This model exploits machine learning, in particular Artificial Neural Network (ANN) techniques, for position calculation. The experimental results show that the proposed hybrid system improved the accuracy, reducing the average distance error to 1.05 m by using the ANN. Applying a Genetic Algorithm (GA)-based optimization technique did not yield any further improvement in accuracy. Compared to the GA-optimized network, the non-optimized ANN performed better in terms of accuracy, precision, stability, and computational time. These results show that the proposed hybrid technique is promising for achieving better accuracy in real-world positioning applications.
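
    The fingerprinting-plus-ANN pipeline can be sketched end to end on synthetic data: RSS vectors from several transmitters (standing in for the joint WiFi/WSN measurement) are mapped to coordinates by a neural-network regressor. The log-distance radio model, anchor layout, and network architecture below are assumptions, not the paper's survey data or model.

        # Sketch of hybrid RSS fingerprinting: a joint WiFi+WSN RSS vector is
        # mapped to (x, y) by a neural-network regressor. The radio map is
        # synthetic (log-distance model plus noise); the paper's survey data,
        # architecture and preprocessing are not reproduced.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        aps = rng.uniform(0, 30, size=(8, 2))   # e.g. 4 WiFi APs + 4 WSN anchors

        def rss(points):
            d = np.linalg.norm(points[:, None, :] - aps[None, :, :], axis=2)
            return -40.0 - 10 * 2.7 * np.log10(d + 0.1) + rng.normal(0, 2, d.shape)

        train_xy = rng.uniform(0, 30, size=(500, 2))   # fingerprint survey points
        model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                             random_state=0)
        model.fit(rss(train_xy), train_xy)

        test_xy = rng.uniform(0, 30, size=(100, 2))
        err = np.linalg.norm(model.predict(rss(test_xy)) - test_xy, axis=1)
        print(f"mean positioning error: {err.mean():.2f} m")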

  17. East-West paths to unconventional computing.

    Science.gov (United States)

    Adamatzky, Andrew; Akl, Selim; Burgin, Mark; Calude, Cristian S; Costa, José Félix; Dehshibi, Mohammad Mahdi; Gunji, Yukio-Pegio; Konkoli, Zoran; MacLennan, Bruce; Marchal, Bruno; Margenstern, Maurice; Martínez, Genaro J; Mayne, Richard; Morita, Kenichi; Schumann, Andrew; Sergeyev, Yaroslav D; Sirakoulis, Georgios Ch; Stepney, Susan; Svozil, Karl; Zenil, Hector

    2017-12-01

    Unconventional computing is about breaking boundaries in thinking, acting and computing. Typical topics of this non-typical field include, but are not limited to physics of computation, non-classical logics, new complexity measures, novel hardware, mechanical, chemical and quantum computing. Unconventional computing encourages a new style of thinking while practical applications are obtained from uncovering and exploiting principles and mechanisms of information processing in and functional properties of, physical, chemical and living systems; in particular, efficient algorithms are developed, (almost) optimal architectures are designed and working prototypes of future computing devices are manufactured. This article includes idiosyncratic accounts of 'unconventional computing' scientists reflecting on their personal experiences, what attracted them to the field, their inspirations and discoveries. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Strategic aspects of exploiting geothermal energy for industrial purposes

    International Nuclear Information System (INIS)

    Ludviksson, V.

    1992-01-01

    Geothermal energy is widely used in Iceland for space heating, swimming pools and snow-melting systems, as well as for greenhouses, soil heating and aquaculture. Its contribution to the standard of living in Iceland is very substantial. The industrial applications are, however, fewer today than anticipated twenty years ago. This paper considers some of the socio-economic reasons for that. Although geothermal energy is generally a cost-competitive source of energy, it is site-limited and does not by itself provide sufficient economic incentive to attract manufacturing or process industries. This generally requires another, locally available production factor offering further competitive advantage to justify greenfield investments. World economic slow-downs and structural problems in many process industries after the energy crisis of the seventies have reduced interest in investments in energy-intensive industries worldwide. While public-sector initiative motivated by technological possibilities was instrumental in developing geothermal resources in the past, the time has now come for private-sector initiative, led by market interest, to identify and exploit opportunities for using geothermal energy for industrial purposes. National and local governments must, however, provide the appropriate incentives to stimulate such developments.

  19. The possibilities of exploitation of Serbian thermomineral waters

    International Nuclear Information System (INIS)

    Jovanovic, L.

    2002-01-01

    The global ecological problem of a petroleum resource deficit has caused an intensive search for alternative energy sources. The deficit of conventional energy fluids in Yugoslavia requires serious efforts to create a program for exploiting alternative energy sources. Geothermal energy represents an important energy source for countries with poor energy resources. Geothermal energy can become the basis for economic development. At present these geothermal resources are not being exploited in Yugoslavia. The possibilities of effective exploitation of thermal and thermomineral water resources in Yugoslavia are presented in this paper.

  20. Cloud Computing: The Future of Computing

    OpenAIRE

    Aggarwal, Kanika

    2013-01-01

    Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. Cloud computing is attractive to business owners as it eliminates the requirement for users to plan ahead for provisioning, and allows enterprises to start small and increase resources only when there is a rise in service demand. The basic principle of cloud computing is to assign the computation to a great number of distributed computers rather than a local computer ...

  1. Transient Response Improvement of Microgrids Exploiting the Inertia of a Doubly-Fed Induction Generator (DFIG

    Directory of Open Access Journals (Sweden)

    Christina N. Papadimitriou

    2010-06-01

    Full Text Available Storage devices are introduced into microgrids in order to secure their power quality and power regularity and to offer ancillary services during transient periods. In the transition of a low-voltage microgrid from the connected mode of operation to the islanded mode, the power imbalance can be partly covered by the inertial energy of the existing power sources. This paper proposes fuzzy local controllers exploiting the inertia of a Wind Turbine (WT) with a Doubly Fed Induction Generator (DFIG), if such a machine exists in the microgrid, in order to reduce the necessary storage devices and the drawbacks that arise. The proposed controllers are based on fuzzy logic due to the nonlinear and stochastic behavior of the system. Two cases are studied and compared during the transient period, in which the microgrid architecture and the DFIG controller differ. In the first case, the microgrid under study includes a hybrid fuel cell system (FCS)-battery system and a WT with a DFIG. The DFIG local controller in this case is also based on fuzzy logic and follows the classical optimum power absorption scenario for the WT. The transition of the microgrid from the connected mode of operation to the islanded mode is evaluated and, especially, the battery contribution is estimated. In the second case, the battery is eliminated. The fuzzy controller of the DFIG during the transition provides primary frequency control and local bus voltage support by exploiting the WT inertia. The response of the system is estimated in both cases using the MATLAB/Simulink software package.
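
    A toy fuzzy rule base illustrates the kind of mapping such a controller embodies: the deeper the frequency dip at islanding, the more extra power is briefly commanded from the rotor's kinetic energy. The membership shapes, rule consequents, and limits below are invented; the paper's MATLAB/Simulink controllers are not reproduced.

        # Toy fuzzy rule base for inertia exploitation: the larger the frequency
        # dip during islanding, the more extra power is briefly drawn from the
        # DFIG rotor's kinetic energy. All shapes and values are invented.
        import numpy as np

        def tri(x, a, b, c):
            """Triangular membership function peaking at b."""
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        def extra_power_pu(df):
            """Fuzzy map: frequency deviation df (Hz) -> extra power (p.u.)."""
            mu = {
                "small":  tri(df, -0.05, 0.0, 0.2),
                "medium": tri(df,  0.1,  0.3, 0.5),
                "large":  tri(df,  0.4,  0.8, 1.2),
            }
            out = {"small": 0.0, "medium": 0.1, "large": 0.25}  # consequents
            num = sum(mu[k] * out[k] for k in mu)
            den = sum(mu.values()) + 1e-12
            return num / den                        # centroid-style defuzzify

        for df in (0.05, 0.3, 0.7):
            print(f"df = {df:.2f} Hz -> extra power = {extra_power_pu(df):.3f} p.u.")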

  2. Thermal Property Engineering: Exploiting the Properties of Ceramic Nanocomposites

    Science.gov (United States)

    2018-03-01

    ARL-TR-8308, US Army Research Laboratory, March 2018. Only the report documentation page survives in this record: the title repeated above, the report number, and a reporting period ending December 31, 2017.

  3. Local pulmonary structure classification for computer-aided nodule detection

    Science.gov (United States)

    Bahlmann, Claus; Li, Xianlin; Okada, Kazunori

    2006-03-01

    We propose a new method of classifying local structure types, such as nodules, vessels, and junctions, in thoracic CT scans. This classification is important in the context of computer-aided detection (CAD) of lung nodules. The proposed method can be used as a post-processing component of any lung CAD system. In such a scenario, the classification results provide an effective means of removing false positives caused by vessels and junctions, thus improving overall performance. As its main advantage, the proposed solution transforms the complex problem of classifying various 3D topological structures into a much simpler 2D data clustering problem, to which more generic and flexible solutions are available in the literature and which is better suited for visualization. Given a nodule candidate, our solution first robustly fits an anisotropic Gaussian to the data. The resulting Gaussian center and spread parameters are used to affine-normalize the data domain so as to warp the fitted anisotropic ellipsoid into a fixed-size isotropic sphere. We propose an automatic method to extract a 3D spherical manifold containing the appropriate bounding surface of the target structure. Scale selection is performed by a data-driven entropy minimization approach. The manifold is analyzed for high-intensity clusters corresponding to protruding structures. The techniques involved include EM clustering with automatic mode number estimation, directional statistics, and hierarchical clustering with a modified Bhattacharyya distance. The estimated number of high-intensity clusters explicitly determines the type of pulmonary structure: nodule (0), attached nodule (1), vessel (2), junction (>3). We show accurate classification results for selected examples in thoracic CT scans. This local procedure is more flexible and efficient than the current state of the art and will help to improve the accuracy of general lung CAD systems.
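
    The affine-normalization step lends itself to a short sketch. The code below, a minimal illustration under simplifying assumptions (the paper's robust fitting, manifold extraction, and clustering stages are not reproduced, and the function name is ours), fits an intensity-weighted Gaussian to 3D voxel coordinates and whitens the domain so the fitted ellipsoid becomes a unit sphere.

      import numpy as np

      def affine_normalize(points, intensities):
          """Fit an intensity-weighted Gaussian to 3D voxel coordinates and
          whiten the domain so the fitted ellipsoid becomes a unit sphere."""
          w = intensities / intensities.sum()
          mu = (w[:, None] * points).sum(axis=0)
          centered = points - mu
          cov = (w[:, None] * centered).T @ centered      # weighted covariance
          # Inverse matrix square root via eigendecomposition (cov is SPD).
          vals, vecs = np.linalg.eigh(cov)
          inv_sqrt = vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T
          return centered @ inv_sqrt.T                    # whitened coordinates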

  4. Surrogate Model Application to the Identification of Optimal Groundwater Exploitation Scheme Based on Regression Kriging Method—A Case Study of Western Jilin Province

    Directory of Open Access Journals (Sweden)

    Yongkai An

    2015-07-01

    Full Text Available This paper introduces a surrogate model to identify an optimal exploitation scheme; the western Jilin province was selected as the study area. A numerical simulation model of groundwater flow was established first, and four exploitation wells were set in Tongyu county and Qian Gorlos county, respectively, so as to supply water to Daan county. Second, the Latin Hypercube Sampling (LHS) method was used to collect data in the feasible region for the input variables. A surrogate model of the numerical simulation model of groundwater flow was developed using the regression kriging method. An optimization model was established to search for an optimal groundwater exploitation scheme, using the minimum average drawdown of the groundwater table and the minimum cost of groundwater exploitation as multi-objective functions. Finally, the surrogate model was invoked by the optimization model in the process of solving the optimization problem. Results show that the relative error and root mean square error of the groundwater table drawdown between the simulation model and the surrogate model for 10 validation samples are both lower than 5%, which is a high approximation accuracy. A contrast between the surrogate-based simulation optimization model and the conventional simulation optimization model for solving the same optimization problem shows that the former needs only 5.5 hours while the latter needs 25 days. The above results indicate that the surrogate model developed in this study can not only considerably reduce the computational burden of the simulation optimization process but also maintain high computational accuracy. This can thus provide an effective method for identifying an optimal groundwater exploitation scheme quickly and accurately.
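
    A minimal sketch of the surrogate workflow follows, under stated assumptions: scipy's Latin hypercube sampler and scikit-learn's Gaussian process regressor stand in for the paper's regression kriging, the hypothetical simulate_drawdown function stands in for the expensive groundwater flow simulator, and the multi-objective problem is collapsed to a single objective with an invented demand constraint.

      import numpy as np
      from scipy.stats import qmc
      from sklearn.gaussian_process import GaussianProcessRegressor

      def simulate_drawdown(rates):
          # Placeholder for the expensive groundwater flow simulation:
          # maps pumping rates of four wells to an average drawdown (m).
          return 0.002 * rates.sum() + 0.0001 * (rates ** 2).sum()

      # 1) Latin hypercube design over the feasible pumping rates (m^3/day).
      sampler = qmc.LatinHypercube(d=4, seed=0)
      X = qmc.scale(sampler.random(n=40), l_bounds=[0] * 4, u_bounds=[500] * 4)
      y = np.array([simulate_drawdown(x) for x in X])

      # 2) Train the cheap surrogate on the simulator's input/output pairs.
      surrogate = GaussianProcessRegressor(normalize_y=True).fit(X, y)

      # 3) The optimizer queries the surrogate instead of the simulator.
      candidates = qmc.scale(sampler.random(n=10000), [0] * 4, [500] * 4)
      feasible = candidates[candidates.sum(axis=1) >= 1200]   # meet water demand
      best = feasible[np.argmin(surrogate.predict(feasible))]

    The design choice mirrors the abstract: the simulator is called only to build the training set, after which every optimization query hits the surrogate, which is where the reported 25-day-to-5.5-hour reduction comes from.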

  5. Resonance localization in tokamaks excited with ICRF waves

    International Nuclear Information System (INIS)

    Kerbel, G.D.; McCoy, M.G.

    1985-01-01

    Advanced wave models used to evaluate ICRH in tokamaks typically use warm plasma theory and allow inhomogeneity in one dimension. The majority of these calculations neglect the fact that gyrocenters experience the inhomogeneity via their motion parallel to the magnetic field. The non-local effects of rotational transform and toroidicity can play a significant role in both the propagation and the absorption physics. In strongly driven systems, wave damping can distort the particle distribution function supporting the wave, and this produces changes in the absorption. The most common approach is to use Maxwellian absorption rates. We have developed a bounce-averaged Fokker-Planck quasilinear computational model which evolves the population of particles on more realistic orbits. Each wave-particle resonance has its own specific interaction amplitude within any given volume element; these data need only be generated once and appropriately stored for efficient retrieval. The wave-particle resonant interaction then serves as a mechanism by which the diffusion of particle populations can proceed among neighboring orbits. The local specific spectral energy absorption rate is directly calculable once the orbit geometry and populations are determined. The code is constructed in such a fashion as to accommodate wave propagation models which provide the wave spectral energy density on a poloidal cross-section. Information provided by the calculation includes the local absorption properties of the medium, which can then be exploited to evolve the wave field.

  6. Rethinking exploitation: a process-centered account.

    Science.gov (United States)

    Jansen, Lynn A; Wall, Steven

    2013-12-01

    Exploitation has become an important topic in recent discussions of biomedical and research ethics. This is due in no small measure to the influence of Alan Wertheimer's path-breaking work on the subject. This paper presents some objections to Wertheimer's account of the concept. The objections attempt to show that his account places too much emphasis on outcome-based considerations and too little on process-based considerations. Building on these objections, the paper develops an alternative process-centered account of the concept. This alternative account of exploitation takes as its point of departure the broadly Kantian notion that it is wrong to use another as an instrument for the advancement of one's own ends. It sharpens this slippery notion and adds a number of refinements to it. The paper concludes by arguing that process-centered accounts of exploitation better illuminate the ethical challenges posed by research on human subjects than outcome-centered accounts.

  7. Quality of Governance and Local Development: The Case of Top Nine Performing Local Government Units in the Philippines

    OpenAIRE

    MA. NIÑA I. ADRIANO

    2014-01-01

    There is a large body of literature that studies the link between good governance and development at the country level. However, only a few studies have explored the same question in the local government unit (LGU) setting. This study attempts to establish the relationship between the quality of governance and the state of local development of the Top 9 Performing LGUs in the Philippines (La Union, Albay, Cavite, Ilocos Norte, Makati City, Valenzuela City, Taguig City, Davao City and Angeles C...

  8. Synthetic analog computation in living cells.

    Science.gov (United States)

    Daniel, Ramiz; Rubens, Jacob R; Sarpeshkar, Rahul; Lu, Timothy K

    2013-05-30

    A central goal of synthetic biology is to achieve multi-signal integration and processing in living cells for diagnostic, therapeutic and biotechnology applications. Digital logic has been used to build small-scale circuits, but other frameworks may be needed for efficient computation in the resource-limited environments of cells. Here we demonstrate that synthetic analog gene circuits can be engineered to execute sophisticated computational functions in living cells using just three transcription factors. Such synthetic analog gene circuits exploit feedback to implement logarithmically linear sensing, addition, ratiometric and power-law computations. The circuits exhibit Weber's law behaviour as in natural biological systems, operate over a wide dynamic range of up to four orders of magnitude and can be designed to have tunable transfer functions. Our circuits can be composed to implement higher-order functions that are well described by both intricate biochemical models and simple mathematical functions. By exploiting analog building-block functions that are already naturally present in cells, this approach efficiently implements arithmetic operations and complex functions in the logarithmic domain. Such circuits may lead to new applications for synthetic biology and biotechnology that require complex computations with limited parts, need wide-dynamic-range biosensing or would benefit from the fine control of gene expression.

  9. Exploiting intrinsic fluctuations to identify model parameters.

    Science.gov (United States)

    Zimmer, Christoph; Sahle, Sven; Pahle, Jürgen

    2015-04-01

    Parameterisation of kinetic models plays a central role in computational systems biology. Besides the lack of experimental data of sufficiently high quality, some of the biggest challenges here are identification issues. Model parameters can be structurally non-identifiable because of functional relationships. Noise in measured data is usually considered a nuisance for parameter estimation. However, it turns out that intrinsic fluctuations in particle numbers can make parameters identifiable that were previously non-identifiable. The authors present a method to identify model parameters that are structurally non-identifiable in a deterministic framework. The method takes time-course recordings of biochemical systems in steady state or transient state as input. Often a functional relationship between parameters presents itself as a one-dimensional manifold in parameter space containing parameter sets of optimal goodness. Although the system's behaviour cannot be distinguished on this manifold in a deterministic framework, it might be distinguishable in a stochastic modelling framework. Their method exploits this by using an objective function that includes a measure of the fluctuations in particle numbers. They show on three example models, immigration-death, gene expression and Epo-EpoReceptor interaction, that this resolves the non-identifiability even in the case of measurement noise with known amplitude. The method is applied to partially observed recordings of biochemical systems with measurement noise. It is simple to implement and usually very fast to compute. The optimisation can be realised in a classical or Bayesian fashion.

  10. Perceptually-Inspired Computing

    Directory of Open Access Journals (Sweden)

    Ming Lin

    2015-08-01

    Full Text Available Human sensory systems allow individuals to see, hear, touch, and interact with the surrounding physical environment. Understanding human perception and its limits enables us to better exploit the psychophysics of human perceptual systems to design more efficient, adaptive algorithms and develop perceptually inspired computational models. In this talk, I will survey some recent efforts on perceptually inspired computing with applications to crowd simulation and multimodal interaction. In particular, I will present data-driven personality modeling based on the results of user studies, example-guided physics-based sound synthesis using auditory perception, as well as perceptually inspired simplification for multimodal interaction. These perceptually guided principles can be used to accelerate multimodal interaction and visual computing, thereby creating more natural human-computer interaction and providing more immersive experiences. I will also present their use in interactive applications for entertainment, such as video games, computer animation, and shared social experiences. I will conclude by discussing possible future research directions.

  11. Computing activities for the P-bar ANDA experiment at FAIR

    International Nuclear Information System (INIS)

    Messchendorp, Johan

    2010-01-01

    The P-bar ANDA experiment at the future facility FAIR will provide valuable data for our present understanding of the strong interaction. In preparation for the experiments, large-scale simulations for design and feasibility studies are performed exploiting a new software framework, P-bar ANDAROOT, which is based on FairROOT and the Virtual Monte Carlo interface and runs on a large-scale GRID computing environment exploiting the AliEn 2 middleware. In this paper, an overview is given of the P-bar ANDA experiment, with emphasis on the various developments which are pursued to provide a user- and developer-friendly computing environment for the P-bar ANDA collaboration.

  12. CernVM Co-Pilot: an Extensible Framework for Building Scalable Cloud Computing Infrastructures

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    CernVM Co-Pilot is a framework for instantiating an ad-hoc computing infrastructure on top of distributed computing resources. Such resources include commercial computing clouds (e.g. Amazon EC2), scientific computing clouds (e.g. CERN lxcloud), as well as the machines of users participating in volunteer computing projects (e.g. BOINC). The framework consists of components that communicate using the Extensible Messaging and Presence protocol (XMPP), allowing for new components to be developed in virtually any programming language and interfaced to existing Grid and batch computing infrastructures exploited by the High Energy Physics community. Co-Pilot has been used to execute jobs for both the ALICE and ATLAS experiments at CERN. CernVM Co-Pilot is also one of the enabling technologies behind the LHC@home 2.0 volunteer computing project, which is the first such project that exploits virtual machine technology. The use of virtual machines eliminates the necessity of modifying existing applications and adapt...

  13. Demonstration of measurement-only blind quantum computing

    Science.gov (United States)

    Greganti, Chiara; Roehsner, Marie-Christine; Barz, Stefanie; Morimae, Tomoyuki; Walther, Philip

    2016-01-01

    Blind quantum computing allows for secure cloud networks of quasi-classical clients and a fully fledged quantum server. Recently, a new protocol has been proposed, which requires a client to perform only measurements. We demonstrate a proof-of-principle implementation of this measurement-only blind quantum computing, exploiting a photonic setup to generate four-qubit cluster states for computation and verification. Feasible technological requirements for the client and the device-independent blindness make this scheme very applicable for future secure quantum networks.

  14. A bi-population based scheme for an explicit exploration/exploitation trade-off in dynamic environments

    Science.gov (United States)

    Ben-Romdhane, Hajer; Krichen, Saoussen; Alba, Enrique

    2017-05-01

    Optimisation in changing environments is a challenging research topic since many real-world problems are inherently dynamic. Inspired by the natural evolution process, evolutionary algorithms (EAs) are among the most successful and promising approaches that have addressed dynamic optimisation problems. However, managing the exploration/exploitation trade-off in EAs is still a prevalent issue, and this is due to the difficulties associated with the control and measurement of such behaviour. The proposal of this paper is to achieve a balance between exploration and exploitation in an explicit manner. The idea is to use two equally sized populations: the first one performs exploration while the second one is responsible for exploitation. These tasks are alternated from one generation to the next in a regular pattern, so as to obtain a balanced search engine. Besides, we reinforce the ability of our algorithm to adapt quickly after changes by means of a memory of past solutions. Such a combination aims to restrain premature convergence, to broaden the search area, and to speed up the optimisation. We show through computational experiments, based on a series of dynamic problems and many performance measures, that our approach improves the performance of EAs and outperforms competing algorithms.
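
    A minimal sketch of the explicit two-population scheme as described follows: one population explores with large random moves, the other exploits by resampling around its current best, the roles alternate each generation, and a memory of past solutions is re-injected after an environment change. The fitness function, change period, and all parameter values are invented stand-ins, not the paper's benchmark.

      import numpy as np

      rng = np.random.default_rng(0)
      DIM, POP = 10, 20

      def fitness(x, t):
          # Stand-in dynamic problem: a sphere whose optimum drifts with time t.
          return -np.sum((x - 0.1 * t) ** 2, axis=-1)

      explorers = rng.uniform(-5, 5, (POP, DIM))
      exploiters = rng.uniform(-5, 5, (POP, DIM))
      memory = []

      for gen in range(100):
          t = gen // 25                      # environment changes every 25 gens
          if gen % 25 == 0 and memory:       # re-seed from memory after a change
              explorers[0] = memory[-1]
          # Exploration: large random jumps keep the search broad.
          explorers += rng.normal(0, 1.0, explorers.shape)
          # Exploitation: resample tightly around the current best individual.
          best = exploiters[np.argmax(fitness(exploiters, t))]
          exploiters = best + rng.normal(0, 0.05, exploiters.shape)
          # Migrate the best explorer into the exploiting population.
          exploiters[0] = explorers[np.argmax(fitness(explorers, t))]
          memory.append(best.copy())
          # Alternate the roles of the two populations each generation.
          explorers, exploiters = exploiters, explorers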

  15. The Theory of Exploitation as the Unequal Exchange of Labour

    OpenAIRE

    Veneziani, Roberto; Yoshihara, Naoki

    2016-01-01

    This paper analyses the normative and positive foundations of the theory of exploitation as the unequal exchange of labour (UEL). The key intuitions behind all of the main approaches to UEL exploitation are explicitly analysed as a series of formal claims in a general economic environment. It is then argued that these intuitions can be captured by one fundamental axiom - called Labour Exploitation - which defines the basic domain of all UEL exploitation forms and identifies the formal and the...

  16. The theory of exploitation as the unequal exchange of labour

    OpenAIRE

    Veneziani, Roberto; Yoshihara, Naoki

    2017-01-01

    This paper analyses the normative and positive foundations of the theory of exploitation as the unequal exchange of labour (UEL). The key intuitions behind all of the main approaches to UEL exploitation are explicitly analysed as a series of formal claims in a general economic environment. It is then argued that these intuitions can be captured by one fundamental axiom - called Labour Exploitation - which defines the basic domain of all UEL exploitation forms and identifies the formal and the...

  17. Lend Global, Fund Local? Price and Funding Cost Margins in Multinational Banking

    NARCIS (Netherlands)

    Galema, R.; Koetter, M.; Liesegang, C.

    2016-01-01

    In a proposed model of a multinational bank, interest margins determine local lending by foreign affiliates and the internal funding by parent banks. We exploit detailed parent-affiliate-level data of all German banks to empirically test our theoretical predictions in pre-crisis times. Local lending

  18. THE HERSCHEL EXPLOITATION OF LOCAL GALAXY ANDROMEDA (HELGA). VI. THE DISTRIBUTION AND PROPERTIES OF MOLECULAR CLOUD ASSOCIATIONS IN M31

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, J. M. [Jeremiah Horrocks Institute, University of Central Lancashire, Preston PR1 2HE (United Kingdom); Gear, W. K.; Smith, M. W. L.; Ford, G.; Eales, S. A.; Gomez, H. L. [School of Physics and Astronomy, Cardiff University, Queens Buildings, The Parade, Cardiff, Wales CF24 3AA (United Kingdom); Fritz, J.; Baes, M.; De Looze, I.; Gentile, G.; Gordon, K.; Verstappen, J.; Viaene, S. [Sterrenkundig Observatorium, Universiteit Gent, Krijgslaan 281 S9, B-9000 Gent (Belgium); Bendo, G. J. [UK ALMA Regional Centre Node, Jodrell Bank Centre for Astrophysics, School of Physics and Astronomy, University of Manchester, Oxford Road, Manchester M13 9PL (United Kingdom); O'Halloran, B. [Astrophysics Group, Imperial College, Blackett Laboratory, Prince Consort Road, London SW7 2AZ (United Kingdom); Madden, S. C.; Lebouteiller, V. [Laboratoire AIM, CEA/DSM-CNRS-Université Paris Diderot, Irfu/Service, Paris, F-91190 Gif-sur-Yvette (France); Roman-Duval, J. [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Boselli, A. [Laboratoire d'Astrophysique de Marseille, UMR 7326 CNRS, 38 rue F. Joliot-Curie, F-13388 Marseille (France); Cooray, A. [Department of Physics and Astronomy, University of California, Irvine, CA 92697 (United States); and others

    2015-01-01

    In this paper we present a catalog of giant molecular clouds (GMCs) in the Andromeda (M31) galaxy extracted from the Herschel Exploitation of Local Galaxy Andromeda (HELGA) data set. GMCs are identified from the Herschel maps using a hierarchical source extraction algorithm. We present the results of this new catalog and characterize the spatial distribution and spectral energy properties of its clouds based on the radial dust/gas properties found by Smith et al. A total of 326 GMCs in the mass range 10^4-10^7 M_☉ are identified; their cumulative mass distribution is found to be proportional to M^(-2.34), in agreement with earlier studies. The GMCs appear to follow the same correlation of cloud mass to L_CO observed in the Milky Way. However, comparison between this catalog and interferometry studies also shows that the GMCs are substructured below the Herschel resolution limit, suggesting that we are observing associations of GMCs. Following Gordon et al., we study the spatial structure of M31 by splitting the observed structure into a set of spiral arms and offset rings. We fit radii of 10.3 and 15.5 kpc to the two most prominent rings. We then fit a logarithmic spiral with a pitch angle of 8.9° to the GMCs not associated with either ring. Last, we comment on the effects of deprojection on our results and investigate the effect different models for M31's inclination will have on the projection of an unperturbed spiral arm system.

  19. From Exploitation to Industry: Definitions, Risks, and Consequences of Domestic Sexual Exploitation and Sex Work Among Women and Girls.

    Science.gov (United States)

    Gerassi, Lara

    In the last 15 years, terms such as prostitution, sex trafficking, sexual exploitation, modern-day slavery, and sex work have elicited much confusion and debate as to their definitions. Consequently, several challenges have emerged for both law enforcement in the prosecution of criminals and practitioners in service provision. This article reviews the state of the literature with regard to domestic sexual exploitation among women and girls in the United States and seeks to (1) provide definitions and describe the complexity of all terms relating to domestic sexual exploitation of women and girls in the United States, (2) explore available national prevalence data according to the definitions provided, and (3) review the evidence of mental health, social, and structural risk factors at the micro-, mezzo-, and macrolevels.

  20. Fisheye-Based Method for GPS Localization Improvement in Unknown Semi-Obstructed Areas

    Directory of Open Access Journals (Sweden)

    Julien Moreau

    2017-01-01

    Full Text Available A precise GNSS (Global Navigation Satellite System) localization is vital for autonomous road vehicles, especially in cluttered or urban environments where satellites are occluded, preventing accurate positioning. We propose to fuse GPS (Global Positioning System) data with fisheye stereovision to face this problem independently of additional data, which may be outdated, unavailable, or in need of correlation with reality. Our stereoscope is sky-facing, with 360° × 180° fisheye cameras to observe surrounding obstacles. We propose 3D modelling and plane extraction through the following steps: stereoscope self-calibration for robustness to decalibration, stereo matching considering neighbouring epipolar curves to compute 3D points, and robust plane fitting based on the generated cartography and a Hough transform. We use these 3D data together with GPS raw data to estimate the pseudorange delay of NLOS (Non-Line-Of-Sight) reflected signals. We exploit the extracted planes to build a visibility mask for NLOS detection. A simplified 3D canyon model then allows us to compute the pseudorange delays of reflections. In the end, the GPS position is computed from the corrected pseudoranges. With experiments on real fixed scenes, we show that the generated 3D models reach metric accuracy and that horizontal GPS positioning accuracy improves by more than 50%. The proposed procedure is effective, and the proposed NLOS detection outperforms C/N0-based (carrier-to-receiver noise density) methods.
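
    The visibility-mask idea admits a short sketch: collapse the extracted obstacle planes into a maximum blocked elevation per azimuth bin around the receiver, then flag a satellite as NLOS when its elevation falls below the skyline in its azimuth direction. The bin size, data layout, and function names below are our assumptions, not the paper's implementation.

      import numpy as np

      def build_mask(obstacle_az_deg, obstacle_elev_deg, bin_deg=5):
          """Collapse obstacle points (from the fitted planes) into the
          maximum blocked elevation per azimuth bin around the receiver."""
          n_bins = 360 // bin_deg
          mask = np.zeros(n_bins)
          bins = (np.asarray(obstacle_az_deg) // bin_deg).astype(int) % n_bins
          np.maximum.at(mask, bins, obstacle_elev_deg)
          return mask

      def is_nlos(mask, sat_az_deg, sat_elev_deg, bin_deg=5):
          # A satellite below the skyline in its azimuth direction is NLOS.
          return sat_elev_deg < mask[int(sat_az_deg // bin_deg) % len(mask)]

      mask = build_mask([10, 12, 200], [35, 40, 20])
      print(is_nlos(mask, 11, 30))   # True: blocked by a 40-degree skyline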

  1. Local behavior and lymph node metastases of Wilms' tumor: accuracy of computed tomography; Comportamento local e metastases linfonodais do tumor de Wilms: acuracia da tomografia computadorizada

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Eduardo Just da Costa e, E-mail: eduardojust@oi.com.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil); Instituto Materno Infantil de Pernambuco (IMIP), Recife, PE (Brazil); Silva, Giselia Alves Pontes da [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Dept. Maternal Infantil

    2014-01-15

    Objective: to evaluate the accuracy of computed tomography for local and lymph node staging of Wilms' tumor. Materials and methods: each case of Wilms' tumor was evaluated by a radiologist for the presence of abdominal lymph nodes. Signs of capsular and adjacent organ invasion were analyzed. Surgical and histopathological results were taken as the gold standard. Results: sensitivity was 100% for the detection of both mesenteric and retroperitoneal lymph nodes, with specificities of 12% and 33%, positive predictive values of 8% and 11%, and negative predictive values of 100%, respectively. Signs of capsular invasion presented sensitivity of 87%, specificity of 77%, positive predictive value of 63% and negative predictive value of 93%. Signs of adjacent organ invasion presented sensitivity of 100%, specificity of 78%, positive predictive value of 37% and negative predictive value of 100%. Conclusion: computed tomography showed low specificity and low positive predictive value in the detection of lymph node dissemination. The absence of detectable lymph nodes makes their presence unlikely, and the same holds for the evaluation of the local behavior of the tumor. (author)
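
    The reported figures all derive from a 2x2 confusion matrix against the surgical/histopathological gold standard. As a worked illustration of how such metrics are computed (the counts below are invented, not the study's data):

      def diagnostic_metrics(tp, fp, fn, tn):
          """Standard accuracy measures from a 2x2 confusion matrix,
          with surgery/histopathology as the gold standard."""
          return {
              "sensitivity": tp / (tp + fn),   # detected among truly positive
              "specificity": tn / (tn + fp),   # cleared among truly negative
              "ppv": tp / (tp + fp),           # positive predictive value
              "npv": tn / (tn + fn),           # negative predictive value
          }

      # Illustrative counts only: 7 true positives, 4 false positives, etc.
      print(diagnostic_metrics(tp=7, fp=4, fn=1, tn=14))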

  2. Exploitation of Aquatic Resources in Ahanve, Badagry, south-western Nigeria

    Directory of Open Access Journals (Sweden)

    Orijemie, Emuobosa Akpo

    2014-11-01

    Full Text Available The Badagry Cultural Area (BCA) is one of the significant socio-cultural places in coastal south-western Nigeria. Palynological and archaeological studies at Ahanve, a settlement in the BCA, were undertaken recently to improve the understanding of past human exploitation of aquatic resources. The collected data revealed contrasts in the availability and utilisation of aquatic resources between a first occupation phase (9th-17th centuries AD) and a second occupation phase (17th century AD to present). The environment during the first phase was characterised by secondary forest and freshwater swamp. During this period, the inhabitants consumed cat-fish (Clariidae) and bivalves (Anodonta sp.), and engaged in salt production. The salt was produced from brine obtained from the Atlantic Ocean. Aquatic food resources were supplemented with terrestrial animal and plant foods. During the second occupation phase, aquatic resources (cat-fish and bivalves) declined and subsequently disappeared; salt production was discontinued, while terrestrial foods, particularly plant-based types, increased significantly. These events coincided with the arrival of European travellers. Oral sources suggest that the decline in the exploitation of aquatic resources was due in part to the fear of being taken captive while on fishing expeditions, restrictions by Europeans who controlled the water-ways, and the massive importation of salt, which replaced local production.

  3. An Opportunity on Exploiting of Geology and Mineral Resource Data for Regional Development

    International Nuclear Information System (INIS)

    Agus-Hendratno

    2004-01-01

    The Indonesian archipelago has a very complex geodiversity. This complexity offers many opportunities for exploiting earth resources for the prosperity of society. On the other hand, the complexity of the geology also imposes many constraints and limitations on the development of a particular region. Hence, various geological data, as well as data from mineral resource mapping (at macro scale as well as at detailed scale), need to be managed and exploited to the fullest. Such exploitation also requires various kinds of infrastructure concerning regulation, technology, human resources, the market drive for economic geomaterials, the social environment and culture that develop around geological data, and the availability and readiness of geology and mineral resources data. This study is expected to give some indication of how geology and mineral resources data can serve as a reference in regional development planning. This paper was prepared through qualitative description and comparative assessment of case studies across several regency areas where the writer has been involved in geological mapping and mineral resources data activities, and through discussions with officers of local governments on many occasions. The case study regions include: Kampar Regency (Riau), Tanjung Jabung Timur Regency (Jambi), Biak Numfor Regency (Papua), Gunung Kidul Regency (Yogyakarta), Pacitan Regency (East Java), and Klaten Regency (Central Java). (author)

  4. Oil exploitation and the environmental Kuznets curve

    International Nuclear Information System (INIS)

    Esmaeili, Abdoulkarim; Abdollahzadeh, Negar

    2009-01-01

    This study presents a panel estimation of an environmental Kuznets curve (EKC) for oil to determine the factors most affecting oil exploitation in 38 oil-producing countries during 1990-2000. Control variables such as oil reserves, oil price, population, political rights, and the Gini index were used to determine their contribution to the main EKC model. The empirical results fully support the existence of an EKC for oil exploitation. Furthermore, the results indicate that proved oil reserves play a significant and positive role in oil production, but oil price and population do not significantly affect crude oil production. Also, increased freedoms and a better income distribution will reduce the rate of oil exploitation. Thus, policies aiming at enhancing democratic society and better income distribution would be more compatible with sustainability. (author)

  5. Oil exploitation and the environmental Kuznets curve

    Energy Technology Data Exchange (ETDEWEB)

    Esmaeili, Abdoulkarim; Abdollahzadeh, Negar [Department of Agricultural Economics, College of Agriculture, Shiraz University, Shiraz, Fars (Iran)

    2009-01-15

    This study presents a panel estimation of an environmental Kuznets curve (EKC) for oil to determine the factors most affecting oil exploitation in 38 oil-producing countries during 1990-2000. Control variables such as oil reserves, oil price, population, political rights, and the Gini index were used to determine their contribution to the main EKC model. The empirical results fully support the existence of an EKC for oil exploitation. Furthermore, the results indicate that proved oil reserves play a significant and positive role in oil production, but oil price and population do not significantly affect crude oil production. Also, increased freedoms and a better income distribution will reduce the rate of oil exploitation. Thus, policies aiming at enhancing democratic society and better income distribution would be more compatible with sustainability. (author)

  6. A Novel Parallel Algorithm for Edit Distance Computation

    Directory of Open Access Journals (Sweden)

    Muhammad Murtaza Yousaf

    2018-01-01

    Full Text Available The edit distance between two sequences is the minimum number of weighted transformation operations required to transform one string into the other; the weighted transformation operations are insert, remove, and substitute. A dynamic programming solution for computing the edit distance exists, but it becomes computationally intensive when the lengths of the strings become very large. This work presents a novel parallel algorithm to solve the edit distance problem of string matching. The algorithm is based on resolving dependencies in the dynamic programming solution of the problem, and it is able to compute each row of the edit distance table in parallel. In this way, it becomes possible to compute the complete table in min(m,n) iterations for strings of size m and n, whereas the state-of-the-art parallel algorithm solves the problem in max(m,n) iterations. The proposed algorithm also increases the amount of parallelism in each of its iterations. Furthermore, the algorithm can exploit spatial locality in its implementation. Additionally, the algorithm works in a load-balanced way that further improves its performance. The algorithm is implemented for multicore systems with shared memory. An OpenMP implementation of the algorithm shows linear speedup and better execution time compared to the state-of-the-art parallel approach. The efficiency of the algorithm is also proven better in comparison to its competitor.
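
    The paper's row-parallel scheme is not reproduced here; as a point of reference, the following Python/NumPy sketch vectorizes the classical wavefront (anti-diagonal) formulation that the paper improves upon. Every cell on one anti-diagonal depends only on the two previous anti-diagonals, so a whole diagonal can be computed in a single vectorized step.

      import numpy as np

      def edit_distance_antidiagonal(a: str, b: str) -> int:
          """Unit-cost edit distance, computing one anti-diagonal of the
          dynamic programming table per (vectorizable) iteration."""
          m, n = len(a), len(b)
          A = np.frombuffer(a.encode(), dtype=np.uint8)   # byte-level comparison
          B = np.frombuffer(b.encode(), dtype=np.uint8)
          D = np.zeros((m + 1, n + 1), dtype=np.int32)
          D[0, :], D[:, 0] = np.arange(n + 1), np.arange(m + 1)
          for d in range(2, m + n + 1):          # d indexes anti-diagonals i+j
              i = np.arange(max(1, d - n), min(m, d - 1) + 1)
              if i.size == 0:
                  continue
              j = d - i
              cost = (A[i - 1] != B[j - 1]).astype(np.int32)
              # All cells on this diagonal are independent; on a multicore
              # machine each chunk of i could go to a different thread.
              D[i, j] = np.minimum(D[i - 1, j - 1] + cost,
                                   np.minimum(D[i - 1, j], D[i, j - 1]) + 1)
          return int(D[m, n])

      print(edit_distance_antidiagonal("kitten", "sitting"))   # 3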

  7. The Potential Socio-economic Impacts of Gas Hydrate Exploitation

    Science.gov (United States)

    Riley, David; Schaafsma, Marije; Marin-Moreno, Héctor; Minshull, Tim A.

    2017-04-01

    the labour supply may not fit with the labour demand. In regions with an existing strong fossil fuel energy sector, hydrate development would prolong the timeframe for which this sector could significantly contribute to the local and wider economy. In unexploited areas the industry can provide considerable income to an otherwise undeveloped region. Industrialisation tends to increase regional population, pressuring existing public services, such as healthcare and transport infrastructure. Immigrant fossil fuel sector workers are predominantly young, male and single. Their presence may be linked to elevated levels of certain social issues seen as undesirable problems by the community at large, such as drug usage or alcoholism. Hydrate development provides limited benefit to indigenous communities who are still following a traditional cultural lifestyle in the proposed development area, as many opportunities are not compatible with their way of life. Additionally, industry associated infrastructure can reduce the ability of the indigenous population to utilise the land directly, or as an access route elsewhere. The range of possible impacts show that any hydrate development must be carefully managed to maximise its potential, whether this takes the form of using the revenue from hydrate exploitation to try and counter the associated issues, or whether there needs to be specific limits placed on locations where extraction can occur.

  8. Demonstration of measurement-only blind quantum computing

    International Nuclear Information System (INIS)

    Greganti, Chiara; Roehsner, Marie-Christine; Barz, Stefanie; Walther, Philip; Morimae, Tomoyuki

    2016-01-01

    Blind quantum computing allows for secure cloud networks of quasi-classical clients and a fully fledged quantum server. Recently, a new protocol has been proposed, which requires a client to perform only measurements. We demonstrate a proof-of-principle implementation of this measurement-only blind quantum computing, exploiting a photonic setup to generate four-qubit cluster states for computation and verification. Feasible technological requirements for the client and the device-independent blindness make this scheme very applicable for future secure quantum networks. (paper)

  9. Security option file - Exploitation (DOS-Expl)

    International Nuclear Information System (INIS)

    2016-01-01

    This document aims at presenting the functions performed by Cigeo during its exploitation phase, the main technical and security options envisaged with respect to different types of internal or external risks, and a first assessment of its impact on mankind and on the environment during exploitation, in normal operation as well as in incidental or accidental situations. The first volume addresses security principles, approach and management in relation to the legal and regulatory framework. The second volume presents input data related to waste parcels and used for installation sizing and operation, the main site characteristics, the main technical options regarding structures and equipment, and the main options regarding exploitation (parcel management, organisational and human aspects, and effluent management). The third volume describes how parcels are processed from their arrival to their emplacement in a storage compartment, gives an inventory of internal and external risks, and presents a first assessment of the consequences of scenarios for mankind and the environment. The fourth volume presents the options and operations envisaged regarding Cigeo closure, and an inventory of associated risks.

  10. Computer-aided diagnosis of mammographic masses using geometric verification-based image retrieval

    Science.gov (United States)

    Li, Qingliang; Shi, Weili; Yang, Huamin; Zhang, Huimao; Li, Guoxin; Chen, Tao; Mori, Kensaku; Jiang, Zhengang

    2017-03-01

    Computer-aided diagnosis of masses in mammograms is an important aid in the detection of breast cancer. The use of retrieval systems in breast examination is increasing gradually. In this respect, methods exploiting the vocabulary tree framework and the inverted file for mammographic mass retrieval have been shown to offer high accuracy and excellent scalability. However, such methods consider the features in each image only as visual words and ignore the spatial configuration of the features, which greatly affects retrieval performance. To overcome this drawback, we introduce a geometric verification method for the retrieval of mammographic masses. First of all, we obtain corresponding feature matches based on the vocabulary tree framework and the inverted file. After that, we capture the local similarity of deformations in local regions by constructing circle regions around corresponding pairs. Meanwhile, we segment each circle to express the geometric relationship of local matches in the area and generate a strict spatial encoding. Finally, we judge whether the matched features are correct or not by verifying that all spatial encodings satisfy geometric consistency. Experiments show the promising results of our approach.
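
    The paper's circle-segmentation spatial encoding is not reproduced here; the sketch below illustrates the generic geometric verification step it instantiates: fit a global 2D similarity transform to the putative feature matches by least squares and keep only the pairs consistent with it. The tolerance value and function name are our assumptions.

      import numpy as np

      def similarity_inliers(src, dst, tol=5.0):
          """Fit a 2D similarity transform (scale+rotation+translation) to
          matched keypoints and count geometrically consistent pairs."""
          # Linear model: dst ~ s*R*src + t, linear in (a, b, tx, ty).
          x, y = src[:, 0], src[:, 1]
          A = np.zeros((2 * len(src), 4))
          A[0::2] = np.c_[x, -y, np.ones_like(x), np.zeros_like(x)]
          A[1::2] = np.c_[y,  x, np.zeros_like(x), np.ones_like(x)]
          params, *_ = np.linalg.lstsq(A, dst.reshape(-1), rcond=None)
          a, b, tx, ty = params
          pred = np.c_[a * x - b * y + tx, b * x + a * y + ty]
          return int((np.linalg.norm(pred - dst, axis=1) < tol).sum())

      src = np.array([[0, 0], [1, 0], [0, 1], [5, 5]], float)
      dst = 2 * src + np.array([3.0, 1.0])     # exact scale-2 + translation
      print(similarity_inliers(src, dst))      # all 4 pairs are consistent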

  11. Plancton: an opportunistic distributed computing project based on Docker containers

    Science.gov (United States)

    Concas, Matteo; Berzano, Dario; Bagnasco, Stefano; Lusso, Stefano; Masera, Massimo; Puccio, Maximiliano; Vallero, Sara

    2017-10-01

    The computing power of most modern commodity computers is far from being fully exploited by standard usage patterns. In this work we describe the development and setup of a virtual computing cluster based on Docker containers used as worker nodes. The facility is based on Plancton: a lightweight fire-and-forget background service. Plancton spawns and controls a local pool of Docker containers on a host with free resources by constantly monitoring its CPU utilisation. It is designed to release the allocated resources opportunistically, whenever another demanding task is run by the host user, according to configurable policies. This is attained by killing a number of running containers. One of the advantages of a thin virtualization layer such as Linux containers is that they can be started almost instantly upon request. We will show how the fast start-up and disposal of containers eventually enables us to implement an opportunistic cluster based on Plancton daemons without a central control node, where the spawned Docker containers behave as job pilots. Finally, we will show how Plancton was configured to run up to 10 000 concurrent opportunistic jobs on the ALICE High-Level Trigger facility, giving a considerable advantage in terms of management compared to virtual machines.
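
    A minimal sketch of the fire-and-forget pattern described, assuming psutil and the Docker CLI; this is not Plancton's actual code, and the image name, pool size, and CPU threshold are invented for illustration.

      import subprocess, time
      import psutil

      IMAGE, MAX_POOL, CPU_HIGH = "worker-pilot:latest", 4, 80.0   # made-up values
      pool = []

      while True:
          cpu = psutil.cpu_percent(interval=5)          # host CPU utilisation (%)
          pool = [c for c in pool if subprocess.run(    # drop finished containers
              ["docker", "inspect", c], capture_output=True).returncode == 0]
          if cpu < CPU_HIGH and len(pool) < MAX_POOL:
              # Free resources: opportunistically spawn one more pilot container.
              cid = subprocess.check_output(
                  ["docker", "run", "-d", "--rm", IMAGE]).decode().strip()
              pool.append(cid)
          elif cpu >= CPU_HIGH and pool:
              # Host is busy again: release resources by killing one container.
              subprocess.run(["docker", "rm", "-f", pool.pop()])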

  12. From Exploitation to Industry: Definitions, Risks, and Consequences of Domestic Sexual Exploitation and Sex Work Among Women and Girls

    Science.gov (United States)

    Gerassi, Lara

    2015-01-01

    In the last 15 years, terms such as prostitution, sex trafficking, sexual exploitation, modern-day slavery, and sex work have elicited much confusion and debate as to their definitions. Consequently, several challenges have emerged for both law enforcement in the prosecution of criminals and practitioners in service provision. This article reviews the state of the literature with regard to domestic sexual exploitation among women and girls in the United States and seeks to (1) provide definitions and describe the complexity of all terms relating to domestic sexual exploitation of women and girls in the United States, (2) explore available national prevalence data according to the definitions provided, and (3) review the evidence of mental health, social, and structural risk factors at the micro-, mezzo-, and macrolevels. PMID:26726289

  13. Optimization Algorithm for Kalman Filter Exploiting the Numerical Characteristics of SINS/GPS Integrated Navigation Systems.

    Science.gov (United States)

    Hu, Shaoxing; Xu, Shike; Wang, Duhu; Zhang, Aiwu

    2015-11-11

    Aiming to address the problem of the high computational cost of the traditional Kalman filter in SINS/GPS integration, a practical optimization algorithm with offline derivation and parallel processing, based on the numerical characteristics of the system, is presented in this paper. The algorithm exploits the sparseness and/or symmetry of matrices to simplify the computational procedure; plenty of invalid operations can thus be avoided by offline derivation using a block matrix technique. For enhanced efficiency, a new parallel computational mechanism is established by subdividing and restructuring the calculation processes after analyzing the extracted "useful" data. As a result, the algorithm saves about 90% of the CPU processing time and 66% of the memory usage needed by a classical Kalman filter. Meanwhile, the method, as a numerical approach, requires no precision-losing transformation or approximation of system modules, and accuracy suffers little in comparison with the filter before computational optimization. Furthermore, since no complicated matrix theories are needed, the algorithm can easily be transplanted into other modified filters as a secondary optimization method to achieve further efficiency.
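
    The paper's offline-derived block formulas are specific to its SINS/GPS model and are not reproduced here; the sketch below only illustrates the generic flavour of exploiting structure in a Kalman update: the innovation covariance S is symmetric positive definite, so a Cholesky solve replaces an explicit matrix inverse, and the covariance is kept symmetric rather than recomputed in full.

      import numpy as np
      from scipy.linalg import cho_factor, cho_solve

      def kf_update(x, P, z, H, R):
          """One Kalman measurement update, exploiting the symmetry of the
          innovation covariance S = H P H^T + R via a Cholesky solve."""
          S = H @ P @ H.T + R
          c = cho_factor(S)                        # valid because S is SPD
          K = cho_solve(c, H @ P).T                # K = P H^T S^{-1}, no inverse
          x = x + K @ (z - H @ x)
          P = P - K @ H @ P
          return x, 0.5 * (P + P.T)                # re-symmetrise the covariance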

  14. Early Sexual Exploitation as an Influence in Prostitution.

    Science.gov (United States)

    Silbert, Mimi H.; Pines, Ayala M.

    1983-01-01

    Surveyed 200 female street prostitutes to determine whether they were sexually exploited during childhood. Results showed 60 percent of the subjects were sexually exploited. The few girls who discussed their abuse with others were met with shame and most often inaction. Only 10 percent were abused by strangers. (JAC)

  15. Solving the Stokes problem on a massively parallel computer

    DEFF Research Database (Denmark)

    Axelsson, Owe; Barker, Vincent A.; Neytcheva, Maya

    2001-01-01

    boundary value problem for each velocity component, are solved by the conjugate gradient method with a preconditioning based on the algebraic multi‐level iteration (AMLI) technique. The velocity is found from the computed pressure. The method is optimal in the sense that the computational work...... is proportional to the number of unknowns. Further, it is designed to exploit a massively parallel computer with distributed memory architecture. Numerical experiments on a Cray T3E computer illustrate the parallel performance of the method....

  16. Exploiting Symmetry on Parallel Architectures.

    Science.gov (United States)

    Stiller, Lewis Benjamin

    1995-01-01

    This thesis describes techniques for the design of parallel programs that solve well-structured problems with inherent symmetry. Part I demonstrates the reduction of such problems to generalized matrix multiplication by a group-equivariant matrix. Fast techniques for this multiplication are described, including factorization, orbit decomposition, and Fourier transforms over finite groups. Our algorithms entail interaction between two symmetry groups: one arising at the software level from the problem's symmetry and the other arising at the hardware level from the processors' communication network. Part II illustrates the applicability of our symmetry-exploitation techniques by presenting a series of case studies of the design and implementation of parallel programs. First, a parallel program that solves chess endgames by factorization of an associated dihedral group-equivariant matrix is described. This code runs faster than previous serial programs and discovered a number of new results. Second, parallel algorithms for Fourier transforms for finite groups are developed, and preliminary parallel implementations for group transforms of dihedral and of symmetric groups are described. Applications in learning, vision, pattern recognition, and statistics are proposed. Third, parallel implementations solving several computational science problems are described, including the direct n-body problem, convolutions arising from molecular biology, and some communication primitives such as broadcast and reduce. Some of our implementations ran orders of magnitude faster than previous techniques and were used in the investigation of various physical phenomena.
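
    The thesis's group-equivariant matrix factorizations go well beyond a snippet, but the underlying orbit idea can be illustrated: board positions related by a symmetry of the square are equivalent, so storing one canonical representative per orbit cuts work and memory by up to |D4| = 8. A small sketch (our own illustration, not the thesis's code):

      import numpy as np

      def d4_orbit(board):
          """All 8 images of a square board under the dihedral group D4
          (four rotations, each optionally reflected)."""
          images, b = [], board
          for _ in range(4):
              images.extend([b, np.fliplr(b)])
              b = np.rot90(b)
          return images

      def canonical(board):
          # One fixed representative per orbit: the lexicographically
          # smallest byte string. Equivalent positions collapse together.
          return min(d4_orbit(board), key=lambda x: x.tobytes())

      a = np.arange(9).reshape(3, 3)
      assert canonical(a).tobytes() == canonical(np.rot90(a)).tobytes()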

  17. Dissemination and Exploitation: Project Goals beyond Science

    Science.gov (United States)

    Hamann, Kristin; Reitz, Anja

    2017-04-01

    Dissemination and exploitation are essential parts of publicly funded projects. In Horizon 2020 a plan for the exploitation and dissemination of results (PEDR) is a requirement. The plan should contain a clear vision of the objectives of the project in relation to actions for dissemination and potential exploitation of the project results. The actions follow the basic idea of spreading the knowledge and results gathered within the project and face the challenge of how to bring the results into potentially relevant policy circles and how they impact the market. The plan serves to assess the impact of the project and to address the various target groups interested in the project results. Simply put, dissemination concentrates on the transfer of knowledge, and exploitation on the commercialization of the project. Beyond the question of the measurability of a project's impact, strategies within science marketing can serve purposes beyond internal and external communication. Accordingly, project managers face the challenge of implementing a dissemination and exploitation strategy that ideally supports the identification of all partners with the project and matches the current discourse on the project's content within society, politics and the economy. A consolidated plan might unite all project partners under a central idea and support identification with the project beyond the individual research questions. Which applications, strategies and methods can be used to bring forward a PEDR that accompanies a project successfully and allows a comprehensive assessment of the project afterwards? Which hurdles might project managers experience in the dissemination process, and which tasks should be fulfilled by the project manager?

  18. Spatial scale of similarity as an indicator of metacommunity stability in exploited marine systems.

    Science.gov (United States)

    Shackell, Nancy L; Fisher, Jonathan A D; Frank, Kenneth T; Lawton, Peter

    2012-01-01

    The spatial scale of similarity among fish communities is characteristically large in temperate marine systems: connectivity is enhanced by high rates of dispersal during the larval/juvenile stages and the increased mobility of large-bodied fish. A larger spatial scale of similarity (low beta diversity) is advantageous in heavily exploited systems because locally depleted populations are more likely to be "rescued" by neighboring areas. We explored whether the spatial scale of similarity changed from 1970 to 2006 due to overfishing of dominant, large-bodied groundfish across a 300,000 km² region of the Northwest Atlantic. Annually, similarities among communities decayed slowly with increasing geographic distance in this open system, but through time the decorrelation distance declined by 33%, concomitant with widespread reductions in biomass, body size, and community evenness. The decline in connectivity stemmed from an erosion of community similarity among local subregions separated by distances as small as 100 km. Larger fish, of the same species, contribute proportionally more viable offspring, so observed body size reductions will have affected maternal output. The cumulative effect of nonlinear maternal influences on egg/larval quality may have compromised the spatial scale of effective larval dispersal, which may account for the delayed recovery of certain member species. Our study adds strong support for using the spatial scale of similarity as an indicator of metacommunity stability both to understand the spatial impacts of exploitation and to refine how spatial structure is used in management plans.
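
    A minimal sketch of how a decorrelation distance of this kind is typically estimated: compute pairwise community similarity, regress it against geographic distance with an exponential decay, and read off the e-folding scale. The similarity metric, the synthetic transect data, and all parameter values below are placeholders, not the study's data or method.

      import numpy as np
      from scipy.optimize import curve_fit

      def bray_curtis_similarity(u, v):
          # 1 - Bray-Curtis dissimilarity between two abundance vectors.
          return 1.0 - np.abs(u - v).sum() / (u + v).sum()

      def decay(d, s0, d0):
          return s0 * np.exp(-d / d0)      # d0 is the decorrelation distance

      rng = np.random.default_rng(1)
      sites = np.sort(rng.uniform(0, 600, 40))   # survey positions, 600 km transect
      optima = rng.uniform(0, 600, 15)           # each species' preferred position
      abund = 20 * np.exp(-((sites[:, None] - optima) / 150) ** 2) + 0.1

      dists, sims = [], []
      for i in range(len(sites)):
          for j in range(i + 1, len(sites)):
              dists.append(abs(sites[i] - sites[j]))
              sims.append(bray_curtis_similarity(abund[i], abund[j]))

      (s0, d0), _ = curve_fit(decay, np.array(dists), np.array(sims), p0=[1.0, 200.0])
      print(f"decorrelation distance ~ {d0:.0f} km")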

  19. A computational framework for the optimal design of morphing processes in locally activated smart material structures

    International Nuclear Information System (INIS)

    Wang, Shuang; Brigham, John C

    2012-01-01

    A proof-of-concept study is presented for a strategy to obtain maximally efficient and accurate morphing structures composed of active materials such as shape memory polymers (SMPs) through synchronization of adaptable and localized activation and actuation. The work focuses on structures or structural components entirely composed of thermo-responsive SMP, and particularly utilizes the ability of such materials to display controllable variable stiffness. The study presents and employs a computational inverse mechanics approach that combines a computational representation of the SMP thermo-mechanical behavior with a nonlinear optimization algorithm to determine the location, magnitude and sequencing of the activation and actuation needed to obtain a desired shape change subject to design objectives such as prevention of damage. Two numerical examples are presented in which the synchronization of the activation and actuation and the location of the activation excitation were optimized with respect to the combined thermal and mechanical energy for design concepts in morphing skeletal structural components. In all cases the concept of localized activation, along with the optimal design strategy, was able to produce far more energy-efficient morphing structures and to reach the desired shape change more accurately than traditional methods that require complete structural activation prior to actuation. (paper)

  20. Dealing with BIG Data - Exploiting the Potential of Multicore Parallelism and Auto-Tuning

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Physics experiments nowadays produce tremendous amounts of data that require sophisticated analyses in order to gain new insights. At such large scales, scientists face non-trivial software engineering problems in addition to the physics problems. Ubiquitous multicore processors and GPGPUs have turned almost any computer into a parallel machine and have pushed compute clusters and clouds to become multicore-based and more heterogeneous. These developments complicate the exploitation of various types of parallelism within different layers of hardware and software. As a consequence, manual performance tuning is non-intuitive and tedious due to the large search space spanned by numerous inter-related tuning parameters. This talk addresses these challenges at CERN and discusses how to leverage multicore parallelization techniques in this context. It presents recent advances in automatic performance tuning to algorithmically find sweet spots with good performance. The talk also presents results from empiri...

  1. Multi-Locality Based Local and Symbiotic Computing for Interactively fast On-Demand Weather Forecasting for Small Regions, Short Durations, and Very High-Resolutions

    OpenAIRE

    Fjukstad, Bård

    2014-01-01

    Papers 1, 3 and 4 are not available in Munin: 1: Bård Fjukstad, Tor-Magne Stien Hagen, Daniel Stødle, Phuong Hoai Ha, John Markus Bjørndalen, and Otto Anshus: ‘Interactive Weather Simulation and Visualization on a Display Wall with Many-Core Compute Nodes’, in K. Jónasson (ed.): PARA 2010, Part I, LNCS 7133, pp. 142–151, 2012, © Springer-Verlag Berlin Heidelberg 3: Bård Fjukstad, John Markus Bjørndalen and Otto Anshus: ‘Accurate Weather Forecasting Through Locality Based Collaborative Computi...

  2. The Geohazards Exploitation Platform: an advanced cloud-based environment for the Earth Science community

    Science.gov (United States)

    Manunta, Michele; Casu, Francesco; Zinno, Ivana; De Luca, Claudio; Pacini, Fabrizio; Caumont, Hervé; Brito, Fabrice; Blanco, Pablo; Iglesias, Ruben; López, Álex; Briole, Pierre; Musacchio, Massimo; Buongiorno, Fabrizia; Stumpf, Andre; Malet, Jean-Philippe; Brcic, Ramon; Rodriguez Gonzalez, Fernando; Elias, Panagiotis

    2017-04-01

    The idea of creating advanced platforms for the Earth Observation community, where users can find not only data but also state-of-the-art algorithms, processing tools, computing facilities, and instruments for dissemination and sharing, was launched several years ago. The initiatives developed in this context have been supported firstly by the Framework Programmes of the European Commission and the European Space Agency (ESA) and, progressively, by the Copernicus programme. In particular, ESA created and supported the Grid Processing on Demand (G-POD) environment, where users can access advanced processing tools implemented in a GRID environment, satellite data, and computing facilities. All these components are located in the same datacentre, making the time needed to move satellite data from the archive negligible. From the experience of G-POD was born ESA's idea of an ecosystem of Thematic Exploitation Platforms (TEPs) focused on the integration of Ground Segment capabilities and ICT technologies to maximize the exploitation of EO data from past and future missions. A TEP refers to a computing platform that deals with a set of user scenarios involving scientists, data providers and ICT developers, aggregated around an Earth Science thematic area. Among others, the Geohazards Exploitation Platform (GEP) aims at providing on-demand and systematic processing services to address the need of the geohazards community for common information layers and to integrate newly developed processors for scientists and other expert users. Within GEP, the community benefits from a cloud-based environment specifically designed for the advanced exploitation of EO data. A partner can bring its own tools and processing chains, but also has access in the same workspace to large satellite datasets and shared data processing tools. GEP is currently in the pre-operations phase under a consortium led by Terradue Srl and six pilot projects concerning

  3. DOE pushes for useful quantum computing

    Science.gov (United States)

    Cho, Adrian

    2018-01-01

    The U.S. Department of Energy (DOE) is joining the quest to develop quantum computers, devices that would exploit quantum mechanics to crack problems that overwhelm conventional computers. The initiative comes as Google and other companies race to build a quantum computer that can demonstrate "quantum supremacy" by beating classical computers on a test problem. But reaching that milestone will not mean practical uses are at hand, and the new $40 million DOE effort is intended to spur the development of useful quantum computing algorithms for its work in chemistry, materials science, nuclear physics, and particle physics. With the resources at its 17 national laboratories, DOE could play a key role in developing the machines, researchers say, although finding problems with which quantum computers can help isn't so easy.

  4. Soft computing techniques in engineering applications

    CERN Document Server

    Zhong, Baojiang

    2014-01-01

    The Soft Computing techniques, which are based on the information processing of biological systems, are now massively used in the areas of pattern recognition, prediction and planning, as well as acting on the environment. Ideally speaking, soft computing is not a body of homogeneous concepts and techniques; rather, it is an amalgamation of distinct methods that conform to its guiding principle. At present, the main aim of soft computing is to exploit the tolerance for imprecision and uncertainty to achieve tractability, robustness and low solution cost. The principal constituents of soft computing techniques are probabilistic reasoning, fuzzy logic, neuro-computing, genetic algorithms, belief networks, chaotic systems, as well as learning theory. This book covers contributions from various authors to demonstrate the use of soft computing techniques in various applications of engineering.

  5. Exploitation program of ''Adamow'' quarry up to 2015

    International Nuclear Information System (INIS)

    Hadlaw, A.

    1994-01-01

    The brown coal deposits exploited by the ''Adamow'' quarry, located in Central Poland, are briefly described, and a prognosis of their exploitation up to 2015 is given. The basic data on the prospective deposits in the quarry's area are also presented. All deposits are shown on the map. 3 ills, 2 tabs

  6. Locality-Aware CTA Clustering For Modern GPUs

    Energy Technology Data Exchange (ETDEWEB)

    Li, Ang; Song, Shuaiwen; Liu, Weifeng; Liu, Xu; Kumar, Akash; Corporaal, Henk

    2017-04-08

    In this paper, we propose a novel clustering technique for tapping into the performance potential of a largely ignored type of locality: inter-CTA locality. We first demonstrate the capability of existing GPU hardware to exploit such locality, both spatially and temporally, on the L1 or unified L1/Tex cache. To verify the potential of this locality, we quantify its existence in a broad spectrum of applications and discuss its sources of origin. Based on these insights, we propose the concept of CTA-Clustering and its associated software techniques. Finally, we evaluate these techniques on all modern generations of NVIDIA GPU architectures. The experimental results show that our proposed clustering techniques significantly improve on-chip cache performance.
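    A minimal Python model of the effect (illustrative only; the paper's techniques operate at the GPU scheduling level, and the tile, halo, and cache parameters below are invented for the sketch): consecutive CTAs of a stencil share halo elements, so "clustering" neighbouring CTAs onto the same SM turns inter-CTA overlap into cache hits, whereas round-robin scheduling does not.

    ```python
    # Toy model of inter-CTA locality (hypothetical parameters, not the paper's code).
    TILE, HALO, N_CTA, N_SM = 256, 2, 64, 4

    def footprint(cta):
        # Data range read by one CTA of a 1D stencil: its tile plus a halo.
        start = cta * TILE
        return set(range(start - HALO, start + TILE + HALO))

    def cache_hits(assignment):
        caches = [set() for _ in range(N_SM)]  # idealized per-SM cache: last CTA's lines
        hits = 0
        for cta, sm in assignment:
            fp = footprint(cta)
            hits += len(fp & caches[sm])       # reuse of data left by the previous CTA
            caches[sm] = fp
        return hits

    round_robin = [(c, c % N_SM) for c in range(N_CTA)]             # neighbours split apart
    clustered   = [(c, c // (N_CTA // N_SM)) for c in range(N_CTA)] # neighbours kept together
    print("round-robin hits:", cache_hits(round_robin))
    print("clustered hits:  ", cache_hits(clustered))
    ```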

  7. Collaborative real-time motion video analysis by human observer and image exploitation algorithms

    Science.gov (United States)

    Hild, Jutta; Krüger, Wolfgang; Brüstle, Stefan; Trantelle, Patrick; Unmüßig, Gabriel; Heinze, Norbert; Peinsipp-Byma, Elisabeth; Beyerer, Jürgen

    2015-05-01

    Motion video analysis is a challenging task, especially in real-time applications. In most safety and security critical applications, a human observer is an obligatory part of the overall analysis system. Over the last years, substantial progress has been made in the development of automated image exploitation algorithms. Hence, we investigate how the benefits of automated video analysis can be integrated suitably into the current video exploitation systems. In this paper, a system design is introduced which strives to combine both the qualities of the human observer's perception and the automated algorithms, thus aiming to improve the overall performance of a real-time video analysis system. The system design builds on prior work where we showed the benefits for the human observer by means of a user interface which utilizes the human visual focus of attention revealed by the eye gaze direction for interaction with the image exploitation system; eye tracker-based interaction allows much faster, more convenient, and equally precise moving target acquisition in video images than traditional computer mouse selection. The system design also builds on prior work we did on automated target detection, segmentation, and tracking algorithms. Beside the system design, a first pilot study is presented, where we investigated how the participants (all non-experts in video analysis) performed in initializing an object tracking subsystem by selecting a target for tracking. Preliminary results show that the gaze + key press technique is an effective, efficient, and easy to use interaction technique when performing selection operations on moving targets in videos in order to initialize an object tracking function.

  8. Cooperative localization in 5G networks: A survey

    Directory of Open Access Journals (Sweden)

    Ping Zhang

    2017-03-01

    Full Text Available In upcoming 5G networks, key prospects such as increased bandwidth, smaller cells, higher mobile terminal (MT) densities, multiple radio access technologies, and the capability of device-to-device communication are beneficial for localization. Meanwhile, technologies suggested for 5G, such as massive multiple-input multiple-output (MIMO), would also benefit from accurate locations of MTs. Therefore, an opportunity to develop and integrate mobile localization technology in 5G networks has presented itself at this early stage. This paper reviews recent literature relating to localization in 5G networks, and emphasizes the prospect of implementing cooperative localization, which exploits the location information from additional measurements between MTs. To evaluate the accuracy of cooperative localization, a performance evaluation approach is also suggested.
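    As a concrete illustration of the cooperative idea (a hypothetical scenario with invented geometry and noise levels, not taken from the survey), the sketch below jointly estimates two MT positions from noisy ranges to fixed base stations plus one extra MT-to-MT range measurement:

    ```python
    # Cooperative localization sketch: the MT-to-MT range is the "cooperative" extra measurement.
    import numpy as np
    from scipy.optimize import least_squares

    bs = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])  # base station positions
    mt_true = np.array([[40.0, 30.0], [60.0, 55.0]])          # unknown MT positions

    rng = np.random.default_rng(0)
    r_bs = np.linalg.norm(mt_true[:, None] - bs[None], axis=2) + rng.normal(0, 1.0, (2, 3))
    r_mt = np.linalg.norm(mt_true[0] - mt_true[1]) + rng.normal(0, 1.0)

    def residuals(x):
        p = x.reshape(2, 2)
        res = (np.linalg.norm(p[:, None] - bs[None], axis=2) - r_bs).ravel()
        coop = np.linalg.norm(p[0] - p[1]) - r_mt  # cooperative term couples the two MTs
        return np.append(res, coop)

    est = least_squares(residuals, x0=np.full(4, 50.0)).x.reshape(2, 2)
    print(est)  # estimated MT positions
    ```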

  9. zero day exploits and national readiness for cyber-warfare

    African Journals Online (AJOL)

    HOD

    A zero day vulnerability is an unknown exploit that divulges security flaws in software before such a flaw is publicly known. Keywords: exploits, zero day, vulnerability, cyberspace, cyber-warfare.

  10. Delay-active damage versus non-local enhancement for anisotropic damage dynamics computations with alternated loading

    International Nuclear Information System (INIS)

    Desmorat, R.; Chambart, M.; Gatuingt, F.; Guilbaud, D.

    2010-01-01

    The anisotropic damage thermodynamics framework makes it possible to model the behavior of concrete-like materials and in particular their dissymmetric tension/compression response. To deal with dynamics applications such as impact, it is furthermore necessary to take into account the strain rate effect observed experimentally. This is done in the present work by means of anisotropic visco-damage, introducing a material strain rate effect for positive hydrostatic stresses only. The proposed delay-damage law assumes no viscous effect in compression, as the consideration of inertia effects proves sufficient to model the apparent increase in material strength. High-rate dynamics applications involve wave propagation and reflection, which can generate alternated loading in the impacted structure. To handle this, the key concept of active damage is defined and introduced within both the damage criterion and the delay-damage evolution law. At the structural level, strain localization often leads to spurious mesh dependency. Three-dimensional finite element computations of dynamic tensile tests by spalling are presented, with visco-damage both with and without non-local enhancement. Delay-damage, as introduced, regularizes the solution in fast dynamics. The location of the initiated macro-crack is found to be influenced by non-local regularization. The strain rate range in which each enhancement, delay-damage or non-local, has a regularizing effect is studied. (authors)
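    For orientation, delay-damage evolution laws of this family are typically written in a rate-bounded exponential form (a generic sketch; the authors' exact law and notation may differ):

    \[
      \dot D \;=\; \frac{1}{\tau_c}\left[\,1 - e^{-b\,\langle g(\hat\varepsilon) - D\rangle_+}\,\right],
      \qquad \dot D \;\le\; \frac{1}{\tau_c},
    \]

    where \(\tau_c\) is a characteristic time that caps the damage rate (the material strain-rate effect), \(g(\hat\varepsilon)\) is the quasi-static damage criterion evaluated at the effective strain, and \(\langle\cdot\rangle_+\) denotes the positive part.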

  11. The cost of being valuable: predictors of extinction risk in marine invertebrates exploited as luxury seafood.

    Science.gov (United States)

    Purcell, Steven W; Polidoro, Beth A; Hamel, Jean-François; Gamboa, Ruth U; Mercier, Annie

    2014-04-22

    Extinction risk has been linked to biological and anthropogenic variables. Prediction of extinction risk in valuable fauna may not follow mainstream drivers when species are exploited for international markets. We use results from an International Union for Conservation of Nature Red List assessment of extinction risk in all 377 known species of sea cucumber within the order Aspidochirotida, many of which are exploited worldwide as luxury seafood for Asian markets. Extinction risk was primarily driven by high market value, compounded by accessibility and familiarity (well known) in the marketplace. Extinction risk in marine animals often relates closely to body size and small geographical range but our study shows a clear exception. Conservation must not lose sight of common species, especially those of high value. Greater human population density and poorer economies in the geographical ranges of endangered species illustrate that anthropogenic variables can also predict extinction risks in marine animals. Local-level regulatory measures must prevent opportunistic exploitation of high-value species. Trade agreements, for example CITES, may aid conservation but will depend on international technical support to low-income tropical countries. The high proportion of data deficient species also stresses a need for research on the ecology and population demographics of unglamorous invertebrates.

  12. Digital optical interconnects for photonic computing

    Science.gov (United States)

    Guilfoyle, Peter S.; Stone, Richard V.; Zeise, Frederick F.

    1994-05-01

    A 32-bit digital optical computer (DOC II) has been implemented in hardware utilizing 8,192 free-space optical interconnects. The architecture exploits parallel interconnect technology by implementing microcode at the primitive level. A burst mode of 0.8192 × 10^12 binary operations per second has been reliably demonstrated. The prototype has been successful in demonstrating general purpose computation. In addition to emulating the RISC instruction set within the UNIX operating environment, relational database text search operations have been implemented on DOC II.
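    The quoted burst rate is consistent with all 8,192 interconnects operating in parallel at roughly 100 MHz (an inference from the numbers in the record, not a specification stated there):

    \[
      8192 \times 10^{8}\,\mathrm{s}^{-1} \;=\; 8.192\times 10^{11} \;=\; 0.8192\times 10^{12}\ \text{binary operations per second}.
    \]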

  13. ATLAS computing on Swiss Cloud SWITCHengines

    Science.gov (United States)

    Haug, S.; Sciacca, F. G.; ATLAS Collaboration

    2017-10-01

    Consolidation towards more computing at flat budgets, beyond what pure chip technology can offer, is a requirement for the full scientific exploitation of the future data from the Large Hadron Collider at CERN in Geneva. One consolidation measure is to exploit cloud infrastructures whenever they are financially competitive. We report on the technical solutions and the performances used and achieved running simulation tasks for the ATLAS experiment on SWITCHengines. SWITCHengines is a new infrastructure as a service offered to Swiss academia by the National Research and Education Network SWITCH. While solutions and performances are general, financial considerations and policies, on which we also report, are country specific.

  14. ATLAS computing on Swiss Cloud SWITCHengines

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00215485; The ATLAS collaboration; Sciacca, Gianfranco

    2017-01-01

    Consolidation towards more computing at flat budgets, beyond what pure chip technology can offer, is a requirement for the full scientific exploitation of the future data from the Large Hadron Collider at CERN in Geneva. One consolidation measure is to exploit cloud infrastructures whenever they are financially competitive. We report on the technical solutions and the performances used and achieved running simulation tasks for the ATLAS experiment on SWITCHengines. SWITCHengines is a new infrastructure as a service offered to Swiss academia by the National Research and Education Network SWITCH. While solutions and performances are general, financial considerations and policies, on which we also report, are country specific.

  15. From Exploitation to Industry: Definitions, Risks, and Consequences of Domestic Sexual Exploitation and Sex Work Among Women and Girls

    OpenAIRE

    Gerassi, Lara

    2015-01-01

    In the last 15 years, terms such as prostitution, sex trafficking, sexual exploitation, modern-day slavery, and sex work have elicited much confusion and debate as to their definitions. Consequently several challenges have emerged for both law enforcement in the prosecution of criminals and practitioners in service provision. This article reviews the state of the literature with regard to domestic, sexual exploitation among women and girls in the United States and seeks to (1) provide definit...

  16. Computed tomographic localization of pelvic hydatid disease

    International Nuclear Information System (INIS)

    Kotoulas, G.; Gouliamos, A.; Kalovidouris, A.; Vlahos, L.; Papavasiliou, C.

    1990-01-01

    Nine patients with a history of hydatid disease were examined by CT. Localization of the hydatid cysts in the pelvis was established by anatomical criteria. Occasionally, the transverse plane can confuse the precise localization of a lesion. A central location, close to the boundaries of the bladder and rectum, can define a peritoneal location. A further posterolateral retrovesical location can be considered retroperitoneal. Using these criteria, 8 cysts were situated within the peritoneum and 1 within the retroperitoneum. (author). 16 refs.; 5 figs.; 1 tab

  17. Computed tomographic localization of pelvic hydatid disease

    Energy Technology Data Exchange (ETDEWEB)

    Kotoulas, G.; Gouliamos, A.; Kalovidouris, A.; Vlahos, L.; Papavasiliou, C. (Athens University (Greece). Areteion Hospital, Department of Radiology)

    Nine patients with a history of hydatid disease were examined by CT. Localization of the hydatid cysts in the pelvis was established by anatomical criteria. Occasionally, the transverse plane can confuse the precise localization of a lesion. A central location, close to the boundaries of the bladder and rectum, can define a peritoneal location. A further posterolateral retrovesical location can be considered retroperitoneal. Using these criteria, 8 cysts were situated within the peritoneum and 1 within the retroperitoneum. (author). 16 refs.; 5 figs.; 1 tab.

  18. Automated motion imagery exploitation for surveillance and reconnaissance

    Science.gov (United States)

    Se, Stephen; Laliberte, France; Kotamraju, Vinay; Dutkiewicz, Melanie

    2012-06-01

    Airborne surveillance and reconnaissance are essential for many military missions. Such capabilities are critical for troop protection, situational awareness, mission planning and others, such as post-operation analysis / damage assessment. Motion imagery gathered from both manned and unmanned platforms provides surveillance and reconnaissance information that can be used for pre- and post-operation analysis, but these sensors can gather large amounts of video data. It is extremely labour-intensive for operators to analyse hours of collected data without the aid of automated tools. At MDA Systems Ltd. (MDA), we have previously developed a suite of automated video exploitation tools that can process airborne video, including mosaicking, change detection and 3D reconstruction, within a GIS framework. The mosaicking tool produces a geo-referenced 2D map from the sequence of video frames. The change detection tool identifies differences between two repeat-pass videos taken of the same terrain. The 3D reconstruction tool creates calibrated geo-referenced photo-realistic 3D models. The key objectives of the on-going project are to improve the robustness, accuracy and speed of these tools, and make them more user-friendly to operational users. Robustness and accuracy are essential to provide actionable intelligence, surveillance and reconnaissance information. Speed is important to reduce operator time on data analysis. We are porting some processor-intensive algorithms to run on a Graphics Processing Unit (GPU) in order to improve throughput. Many aspects of video processing are highly parallel and well-suited for optimization on GPUs, which are now commonly available on computers. Moreover, we are extending the tools to handle video data from various airborne platforms and developing the interface to the Coalition Shared Database (CSD). The CSD server enables the dissemination and storage of data from different sensors among NATO countries. The CSD interface allows

  19. Exploiting the Complementarity between Dereplication and Computer-Assisted Structure Elucidation for the Chemical Profiling of Natural Cosmetic Ingredients: Tephrosia purpurea as a Case Study.

    Science.gov (United States)

    Hubert, Jane; Chollet, Sébastien; Purson, Sylvain; Reynaud, Romain; Harakat, Dominique; Martinez, Agathe; Nuzillard, Jean-Marc; Renault, Jean-Hugues

    2015-07-24

    The aqueous-ethanolic extract of Tephrosia purpurea seeds is currently exploited in the cosmetic industry as a natural ingredient of skin lotions. The aim of this study was to chemically characterize this ingredient by combining centrifugal partition extraction (CPE) as a fractionation tool with two complementary identification approaches involving dereplication and computer-assisted structure elucidation. Following two rapid fractionations of the crude extract (2 g), seven major compounds namely, caffeic acid, quercetin-3-O-rutinoside, ethyl galactoside, ciceritol, stachyose, saccharose, and citric acid, were unambiguously identified within the CPE-generated simplified mixtures by a recently developed (13)C NMR-based dereplication method. The structures of four additional compounds, patuletin-3-O-rutinoside, kaempferol-3-O-rutinoside, guaiacylglycerol 8-vanillic acid ether, and 2-methyl-2-glucopyranosyloxypropanoic acid, were automatically elucidated by using the Logic for Structure Determination program based on the interpretation of 2D NMR (HSQC, HMBC, and COSY) connectivity data. As more than 80% of the crude extract mass was characterized without need for tedious and labor-intensive multistep purification procedures, the identification tools involved in this work constitute a promising strategy for an efficient and time-saving chemical profiling of natural extracts.

  20. ATLAS Computing on the Swiss Cloud SWITCHengines

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00215485; The ATLAS collaboration; Sciacca, Gianfranco

    2016-01-01

    Consolidation towards more computing at flat budgets, beyond what pure chip technology can offer, is a requirement for the full scientific exploitation of the future data from the Large Hadron Collider. One consolidation measure is to exploit cloud infrastructures whenever they are financially competitive. We report on the technical solutions and the performance used and achieved running ATLAS production on SWITCHengines. SWITCHengines is the new cloud infrastructure offered to Swiss academia by the National Research and Education Network SWITCH. While solutions and performances are general, financial considerations and policies, which we also report on, are country specific.

  1. Teotihuacan, Tepeapulco, and obsidian exploitation.

    Science.gov (United States)

    Charlton, T H

    1978-06-16

    Current cultural ecological models of the development of civilization in central Mexico emphasize the role of subsistence production techniques and organization. The recent use of established and productive archeological surface survey techniques along natural corridors of communication between favorable niches for cultural development within the Central Mexican symbiotic region resulted in the location of sites that indicate an early development of a decentralized resource exploitation, manufacturing, and exchange network. The association of the development of this system with Teotihuacán indicates the importance such nonsubsistence production and exchange had in the evolution of this first central Mexican civilization. The later expansion of Teotihuacán into more distant areas of Mesoamerica was based on this resource exploitation model. Later civilizations centered at Tula and Tenochtitlán also used such a model in their expansion.

  2. On the energy benefit of compute-and-forward on the hexagonal lattice

    NARCIS (Netherlands)

    Ren, Zhijie; Goseling, Jasper; Weber, Jos; Gastpar, Michael; Skoric, B.; Ignatenko, T.

    2014-01-01

    We study the energy benefit of applying compute-and-forward on a wireless hexagonal lattice network with multiple unicast sessions under a specific session placement. Two compute-and-forward based transmission schemes are proposed, which allow the relays to exploit both the broadcast and

  3. Source localization using a non-cocentered orthogonal loop and dipole (NCOLD) array

    Institute of Scientific and Technical Information of China (English)

    Liu Zhaoting; Xu Tongyang

    2013-01-01

    A uniform array of scalar sensors with intersensor spacings over a large aperture size generally offers enhanced resolution and source localization accuracy, but it may also lead to cyclic ambiguity. By exploiting the polarization information of impinging waves, an electromagnetic vector-sensor array outperforms the unpolarized scalar-sensor array in resolving this cyclic ambiguity. However, the electromagnetic vector-sensor array usually consists of cocentered orthogonal loops and dipoles (COLD), which are easily subject to mutual coupling across these cocentered dipoles/loops. As a result, the source localization performance of the COLD array may substantially degrade rather than improve. This paper proposes a new source localization method with a non-cocentered orthogonal loop and dipole (NCOLD) array. The NCOLD array contains only one dipole or loop on each array grid, and the intersensor spacings are larger than a half-wavelength. Therefore, unlike the COLD array, these well separated dipoles/loops minimize mutual coupling effects and extend the spatial aperture as well. With the NCOLD array, the proposed method can efficiently exploit the polarization information to offer high localization precision.

  4. Finger Vein Recognition Based on Local Directional Code

    Science.gov (United States)

    Meng, Xianjing; Yang, Gongping; Yin, Yilong; Xiao, Rongyang

    2012-01-01

    Finger vein patterns are considered one of the most promising biometric authentication methods for their security and convenience. Most of the currently available finger vein recognition methods utilize features from a segmented blood vessel network. As an improperly segmented network may degrade the recognition accuracy, binary pattern based methods have been proposed, such as Local Binary Pattern (LBP), Local Derivative Pattern (LDP) and Local Line Binary Pattern (LLBP). However, the rich directional information hidden in the finger vein pattern has not been fully exploited by the existing local patterns. Inspired by the Weber Local Descriptor (WLD), this paper presents a new direction-based local descriptor called Local Directional Code (LDC) and applies it to finger vein recognition. In LDC, the local gradient orientation information is coded as an octonary decimal number. Experimental results show that the proposed method using LDC achieves better performance than methods using LLBP. PMID:23202194
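    A minimal sketch of the direction-coding idea (illustrative only; the published LDC uses a specific gradient operator and coding scheme that this simplification does not reproduce): quantize the local gradient orientation at each pixel into one of 8 codes, i.e. an octal digit, then histogram the codes as a feature.

    ```python
    # Direction-coding sketch in the spirit of LDC (not the authors' exact formulation).
    import numpy as np

    def directional_code(img):
        img = img.astype(float)
        gy, gx = np.gradient(img)            # simple gradient estimate
        theta = np.arctan2(gy, gx)           # orientation in (-pi, pi]
        # Map orientation to an octal code 0..7 (one octal digit per pixel).
        return ((theta + np.pi) / (2 * np.pi) * 8).astype(int) % 8

    img = np.random.rand(64, 64)             # stand-in for a finger vein image
    codes = directional_code(img)
    hist = np.bincount(codes.ravel(), minlength=8)  # 8-bin code histogram as feature
    print(hist)
    ```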

  5. Finger Vein Recognition Based on Local Directional Code

    Directory of Open Access Journals (Sweden)

    Rongyang Xiao

    2012-11-01

    Full Text Available Finger vein patterns are considered one of the most promising biometric authentication methods for their security and convenience. Most of the currently available finger vein recognition methods utilize features from a segmented blood vessel network. As an improperly segmented network may degrade the recognition accuracy, binary pattern based methods have been proposed, such as Local Binary Pattern (LBP), Local Derivative Pattern (LDP) and Local Line Binary Pattern (LLBP). However, the rich directional information hidden in the finger vein pattern has not been fully exploited by the existing local patterns. Inspired by the Weber Local Descriptor (WLD), this paper presents a new direction-based local descriptor called Local Directional Code (LDC) and applies it to finger vein recognition. In LDC, the local gradient orientation information is coded as an octonary decimal number. Experimental results show that the proposed method using LDC achieves better performance than methods using LLBP.

  6. A language for data-parallel and task parallel programming dedicated to multi-SIMD computers. Contributions to hydrodynamic simulation with lattice gases

    International Nuclear Information System (INIS)

    Pic, Marc Michel

    1995-01-01

    Parallel programming covers task-parallelism and data-parallelism, and many problems need both. Multi-SIMD computers allow a hierarchical approach to these parallelisms. The T++ language, based on C++, is dedicated to exploiting Multi-SIMD computers using a programming paradigm which extends array programming to task management. Our language introduces arrays of independent tasks executed separately (MIMD) on subsets of processors of identical behaviour (SIMD), in order to translate the hierarchical inclusion of data-parallelism in task-parallelism. To manipulate tasks and data in a symmetrical way, we propose meta-operations which have the same behaviour on task arrays and on data arrays. We explain how to implement this language on our parallel computer SYMPHONIE in order to profit from the locally shared memory, the hardware virtualization, and the multiplicity of communication networks. We simultaneously analyse a typical application of such an architecture. Finite element schemes for fluid mechanics need powerful parallel computers and require large floating-point capabilities. Lattice gases are an alternative to such simulations. Boolean lattice gases are simple, stable and modular, and need no floating-point computation, but include numerical noise. Boltzmann lattice gases offer high computational precision, but need floating points and are only locally stable. We propose a new scheme, called multi-bit, which keeps the advantages of each boolean model to which it is applied, with high numerical precision and reduced noise. Experiments on viscosity, physical behaviour, noise reduction and spurious invariants are shown, and implementation techniques for parallel Multi-SIMD computers are detailed. (author) [fr

  7. Aspects économiques de l'exploitation des resources halieutiques ...

    African Journals Online (AJOL)

    Aspects économiques de l'exploitation des resources halieutiques des petits barrages du Nord de la Côte d'Ivoire : Economic aspects of the exploitation of halieutic resources of petits barrages inland waters in the north of Côte d'Ivoire.

  8. Local binary pattern variants-based adaptive texture features analysis for posed and nonposed facial expression recognition

    Science.gov (United States)

    Sultana, Maryam; Bhatti, Naeem; Javed, Sajid; Jung, Soon Ki

    2017-09-01

    Facial expression recognition (FER) is an important task for various computer vision applications. The task becomes challenging when it requires the detection and encoding of macro- and micropatterns of facial expressions. We present a two-stage texture feature extraction framework based on local binary pattern (LBP) variants and evaluate its significance in recognizing posed and nonposed facial expressions. We focus on the parametric limitations of the LBP variants and investigate their effects for optimal FER. The size of the local neighborhood is an important parameter of the LBP technique for its extraction in images. To make the LBP adaptive, we exploit the granulometric information of the facial images to find the local neighborhood size for the extraction of center-symmetric LBP (CS-LBP) features. Our two-stage texture representations consist of an LBP variant and the adaptive CS-LBP features. Among the presented two-stage texture feature extractions, the binarized statistical image features and the adaptive CS-LBP features were found to yield high FER rates. Evaluation of the adaptive texture features shows performance competitive with the nonadaptive features and higher than that of other state-of-the-art approaches.
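    For reference, a compact sketch of the CS-LBP step (the neighbourhood radius, which the paper derives adaptively from granulometric analysis, is just a plain parameter here; wrap-around border handling is a simplification):

    ```python
    # Center-symmetric LBP (CS-LBP) sketch: compare the four center-symmetric
    # neighbour pairs of the 8-neighbourhood, yielding a 4-bit code per pixel.
    import numpy as np

    def cs_lbp(img, radius=1, thresh=0.01):
        img = img.astype(float)
        # Offsets of one neighbour from each center-symmetric pair.
        offsets = [(0, radius), (radius, radius), (radius, 0), (radius, -radius)]
        code = np.zeros(img.shape, dtype=int)
        for bit, (dy, dx) in enumerate(offsets):
            p = np.roll(img, (-dy, -dx), axis=(0, 1))  # neighbour at (+dy, +dx)
            q = np.roll(img, (dy, dx), axis=(0, 1))    # opposite neighbour at (-dy, -dx)
            code |= ((p - q) > thresh).astype(int) << bit
        return code  # values in 0..15

    img = np.random.rand(32, 32)                       # stand-in for a face patch
    hist = np.bincount(cs_lbp(img).ravel(), minlength=16)  # 16-bin CS-LBP histogram
    print(hist)
    ```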

  9. Collision broadened resonance localization in tokamaks excited with ICRF waves

    International Nuclear Information System (INIS)

    Kerbel, G.D.; McCoy, M.G.

    1985-08-01

    Advanced wave models used to evaluate ICRH in tokamaks typically use warm plasma theory and allow inhomogeneity in one dimension. The authors have developed a bounce-averaged Fokker-Planck quasilinear computational model which evolves the population of particles on more realistic orbits. Each wave-particle resonance has its own specific interaction amplitude within any given volume element. These data need only be generated once, and appropriately stored for efficient retrieval. The wave-particle resonant interaction then serves as a mechanism by which the diffusion of particle populations can proceed among neighboring orbits. Collisions affect the absorption of rf energy by two quite distinct processes: In addition to the usual relaxation towards the Maxwellian distribution creating velocity gradients which drive quasilinear diffusion, collisions also affect the wave-particle resonance through the mechanism of gyro-phase diffusion. The local specific spectral energy absorption rate is directly calculable once the orbit geometry and populations are determined. The code is constructed in such fashion as to accommodate wave propagation models which provide the wave spectral energy density on a poloidal cross-section. Information provided by the calculation includes the local absorption properties of the medium which can then be exploited to evolve the wave field

  10. BIO-EXPLOITATION STATUS OF BOMBAY DUCK (Harpadon nehereus HAMILTON, 1822) ON TRAWL FISHERY IN TARAKAN WATERS

    Directory of Open Access Journals (Sweden)

    Duto Nugroho

    2015-06-01

    Full Text Available North Kalimantan Province, notably the marine waters of Tarakan City, is one of the important fishing grounds in the boundary area of the Sulu-Sulawesi Marine Ecoregion. It produces approximately 100 mt/annum of Bombay duck (Harpadon nehereus), valued at US$ 750,000. The sustainability of this fishery is a crucial concern given its substantial economic contribution and the significant dependence of small-scale fishers on this species for their livelihoods; fishing intensities are considerable, and threats to its habitats are growing. To evaluate the vulnerability of the species to over-exploitation, the spawning potential ratio (SPR) approach was applied to describe the status of the existing fishery. This approach makes it possible to determine fishing mortality reference points that can enhance sustainability. The objective of this study is to understand the resilience of this fish biomass to harvesting. The SPR calculated from the estimated length at first capture (Lc) of 208 mm is 28%. Given the baseline that stocks are generally thought to risk recruitment decline when SPR < 20%, this finding indicates that the existing fishery can be described as nearly fully exploited. In recognition of the sector's ecological importance and socio-economic significance, the sustainable development of Bombay duck fisheries should be initiated by establishing a local fishery committee to provide a local fishery management plan.
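    The SPR logic can be summarized in a few lines of standard per-recruit arithmetic (all parameter values below are placeholders for illustration, not estimates for the Tarakan stock):

    ```python
    # Spawning potential ratio (SPR) sketch: spawning biomass per recruit with
    # fishing divided by spawning biomass per recruit without fishing.
    import numpy as np

    ages = np.arange(1, 11)
    M, F, a_sel, a_mat = 0.4, 0.6, 3, 3            # natural/fishing mortality, knife-edge ages
    weight = 0.01 * (1 - np.exp(-0.3 * ages))**3   # von Bertalanffy-style weight-at-age
    mature = (ages >= a_mat).astype(float)

    def ssb_per_recruit(F):
        Z = M + F * (ages >= a_sel)                # total mortality at age
        # Survivorship to each age: product of survival over preceding ages.
        surv = np.concatenate(([1.0], np.exp(-np.cumsum(Z[:-1]))))
        return np.sum(surv * weight * mature)

    spr = ssb_per_recruit(F) / ssb_per_recruit(0.0)
    print(f"SPR = {spr:.2f}")                      # SPR < 0.2 commonly flags recruitment risk
    ```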

  11. Information source exploitation/exploration and NPD decision-making

    DEFF Research Database (Denmark)

    Jespersen, Kristina Risom

    The purpose of this study is to examine how the exploration/exploitation continuum is applied by decision-makers in new product gate decision-making. Specifically, we analyze at gate decision-points how the evaluation of a new product project is affected by the information source exploitation/exploration behaviour. Data from different Scandinavian companies were analyzed using hierarchical regression models across decision criteria dimensions and NPD stages, as well as by analyzing the combination of selected information sources. Rather than forwarding one optimal search behavior for the entire NPD process, we find optimal information search behavior at either end of the exploitation/exploration continuum. Additionally, we find that overexploitation and overexploration are caused by managerial bias, which creates managerial misbehavior at gate decision-points of the NPD process.

  12. Recurrently connected and localized neuronal communities initiate coordinated spontaneous activity in neuronal networks

    Science.gov (United States)

    Amin, Hayder; Maccione, Alessandro; Nieus, Thierry

    2017-01-01

    Developing neuronal systems intrinsically generate coordinated spontaneous activity that propagates by involving a large number of synchronously firing neurons. In vivo, waves of spikes transiently characterize the activity of developing brain circuits and are fundamental for activity-dependent circuit formation. In vitro, coordinated spontaneous spiking activity, or network bursts (NBs), interleaved within periods of asynchronous spikes emerge during the development of 2D and 3D neuronal cultures. Several studies have investigated this type of activity and its dynamics, but how a neuronal system generates these coordinated events remains unclear. Here, we investigate at a cellular level the generation of network bursts in spontaneously active neuronal cultures by exploiting high-resolution multielectrode array recordings and computational network modelling. Our analysis reveals that NBs are generated in specialized regions of the network (functional neuronal communities) that feature neuronal links with high cross-correlation peak values, sub-millisecond lags and that share very similar structural connectivity motifs providing recurrent interactions. We show that the particular properties of these local structures enable locally amplifying spontaneous asynchronous spikes and that this mechanism can lead to the initiation of NBs. Through the analysis of simulated and experimental data, we also show that AMPA currents drive the coordinated activity, while NMDA and GABA currents are only involved in shaping the dynamics of NBs. Overall, our results suggest that the presence of functional neuronal communities with recurrent local connections allows a neuronal system to generate spontaneous coordinated spiking activity events. As suggested by the rules used for implementing our computational model, such functional communities might naturally emerge during network development by following simple constraints on distance-based connectivity. PMID:28749937

  13. Recurrently connected and localized neuronal communities initiate coordinated spontaneous activity in neuronal networks.

    Directory of Open Access Journals (Sweden)

    Davide Lonardoni

    2017-07-01

    Full Text Available Developing neuronal systems intrinsically generate coordinated spontaneous activity that propagates by involving a large number of synchronously firing neurons. In vivo, waves of spikes transiently characterize the activity of developing brain circuits and are fundamental for activity-dependent circuit formation. In vitro, coordinated spontaneous spiking activity, or network bursts (NBs), interleaved within periods of asynchronous spikes emerge during the development of 2D and 3D neuronal cultures. Several studies have investigated this type of activity and its dynamics, but how a neuronal system generates these coordinated events remains unclear. Here, we investigate at a cellular level the generation of network bursts in spontaneously active neuronal cultures by exploiting high-resolution multielectrode array recordings and computational network modelling. Our analysis reveals that NBs are generated in specialized regions of the network (functional neuronal communities) that feature neuronal links with high cross-correlation peak values, sub-millisecond lags and that share very similar structural connectivity motifs providing recurrent interactions. We show that the particular properties of these local structures enable locally amplifying spontaneous asynchronous spikes and that this mechanism can lead to the initiation of NBs. Through the analysis of simulated and experimental data, we also show that AMPA currents drive the coordinated activity, while NMDA and GABA currents are only involved in shaping the dynamics of NBs. Overall, our results suggest that the presence of functional neuronal communities with recurrent local connections allows a neuronal system to generate spontaneous coordinated spiking activity events. As suggested by the rules used for implementing our computational model, such functional communities might naturally emerge during network development by following simple constraints on distance-based connectivity.

  14. Adaptive local backlight dimming algorithm based on local histogram and image characteristics

    DEFF Research Database (Denmark)

    Nadernejad, Ehsan; Burini, Nino; Korhonen, Jari

    2013-01-01

    Liquid Crystal Displays (LCDs) with Light Emitting Diode (LED) backlight are a very popular display technology, used for instance in television sets, monitors and mobile phones. This paper presents a new backlight dimming algorithm that exploits the characteristics of the target image, such as the local histograms and the average pixel intensity of each backlight segment, to reduce the power consumption of the backlight and enhance image quality. The local histogram of the pixels within each backlight segment is calculated and, based on this average, an adaptive quantile value is extracted. The proposed algorithm provides a better trade-off between power consumption and image quality preservation than the other algorithms representing the state of the art among feature-based backlight algorithms.
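    A bare-bones sketch of quantile-based local dimming (the grid size and the fixed quantile here are arbitrary choices for illustration; the paper adapts the quantile to the local histogram and mean intensity):

    ```python
    # Local backlight dimming sketch: one LED level per segment, set to a
    # quantile of that segment's pixel intensities.
    import numpy as np

    def backlight_levels(img, grid=(4, 6), q=0.9):
        h, w = img.shape
        gh, gw = grid
        levels = np.empty(grid)
        for i in range(gh):
            for j in range(gw):
                seg = img[i*h//gh:(i+1)*h//gh, j*w//gw:(j+1)*w//gw]
                levels[i, j] = np.quantile(seg, q)  # dim to the q-quantile of the segment
        return levels  # per-segment LED duty cycle in [0, 1]

    img = np.clip(np.random.rand(128, 192), 0, 1)   # stand-in for a normalized frame
    print(backlight_levels(img).round(2))
    ```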

  15. Co-occurrence of Local Anisotropic Gradient Orientations (CoLlAGe): A new radiomics descriptor.

    Science.gov (United States)

    Prasanna, Prateek; Tiwari, Pallavi; Madabhushi, Anant

    2016-11-22

    In this paper, we introduce a new radiomic descriptor, Co-occurrence of Local Anisotropic Gradient Orientations (CoLlAGe), for capturing subtle differences between benign and pathologic phenotypes which may be visually indistinguishable on routine anatomic imaging. CoLlAGe seeks to capture and exploit local anisotropic differences in voxel-level gradient orientations to distinguish similar appearing phenotypes. CoLlAGe involves assigning every image voxel an entropy value associated with the co-occurrence matrix of gradient orientations computed around every voxel. The hypothesis behind CoLlAGe is that benign and pathologic phenotypes, even though they may appear similar on anatomic imaging, will differ in their local entropy patterns, in turn reflecting subtle local differences in tissue microarchitecture. We demonstrate CoLlAGe's utility in three clinically challenging classification problems: distinguishing (1) radiation necrosis, a benign yet confounding effect of radiation treatment, from recurrent tumors on T1-w MRI in 42 brain tumor patients, (2) different molecular sub-types of breast cancer on DCE-MRI in 65 studies and (3) non-small cell lung cancer (adenocarcinomas) from benign fungal infection (granulomas) on 120 non-contrast CT studies. For each of these classification problems, CoLlAGe in conjunction with a random forest classifier outperformed state-of-the-art radiomic descriptors (Haralick, Gabor, Histogram of Gradient Orientations).
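    A simplified 2-D sketch of the CoLlAGe computation (the published descriptor obtains dominant orientations via a local singular value decomposition and operates per voxel in 3-D; this illustration skips both and simply takes the entropy of co-occurring quantized gradient orientations):

    ```python
    # CoLlAGe-style entropy map sketch (simplified; not the authors' implementation).
    import numpy as np

    def collage_entropy(img, bins=8, win=5):
        gy, gx = np.gradient(img.astype(float))
        # Quantize gradient orientation into `bins` codes.
        q = ((np.arctan2(gy, gx) + np.pi) / (2 * np.pi) * bins).astype(int) % bins
        pad = win // 2
        out = np.zeros_like(img, dtype=float)
        for i in range(pad, img.shape[0] - pad):
            for j in range(pad, img.shape[1] - pad):
                w = q[i - pad:i + pad + 1, j - pad:j + pad + 1]
                # Co-occurrence of horizontally adjacent orientation codes.
                pairs = w[:, :-1] * bins + w[:, 1:]
                p = np.bincount(pairs.ravel(), minlength=bins * bins).astype(float)
                p = p[p > 0] / p.sum()
                out[i, j] = -(p * np.log2(p)).sum()  # entropy of the co-occurrences
        return out

    print(collage_entropy(np.random.rand(32, 32)).mean())
    ```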

  16. Exploiting opportunities at all cost? Entrepreneurial intent and externalities

    NARCIS (Netherlands)

    Urbig, D.; Weitzel, G.U.; Rosenkranz, S.; van Witteloostuijn, A.

    2011-01-01

    Do they exploit welfare-enhancing opportunities, as is assumed in several normative models? Do we need to prevent potential entrepreneurs from being destructive, or are there intrinsic limits to harming others? We experimentally investigate how people with different entrepreneurial intent exploit risky

  17. Helix Nebula and CERN: A Symbiotic approach to exploiting commercial clouds

    Science.gov (United States)

    Barreiro Megino, Fernando H.; Jones, Robert; Kucharczyk, Katarzyna; Medrano Llamas, Ramón; van der Ster, Daniel

    2014-06-01

    The recent paradigm shift toward cloud computing in IT, and general interest in "Big Data" in particular, have demonstrated that the computing requirements of HEP are no longer globally unique. Indeed, the CERN IT department and LHC experiments have already made significant R&D investments in delivering and exploiting cloud computing resources. While a number of technical evaluations of interesting commercial offerings from global IT enterprises have been performed by various physics labs, further technical, security, sociological, and legal issues need to be addressed before their large-scale adoption by the research community can be envisaged. Helix Nebula - the Science Cloud is an initiative that explores these questions by joining the forces of three European research institutes (CERN, ESA and EMBL) with leading European commercial IT enterprises. The goals of Helix Nebula are to establish a cloud platform federating multiple commercial cloud providers, along with new business models, which can sustain the cloud marketplace for years to come. This contribution summarizes the participation of CERN in Helix Nebula. We explain CERN's flagship use-case and the model used to integrate several cloud providers with an LHC experiment's workload management system. During the first proof of concept, this project contributed over 40,000 CPU-days of Monte Carlo production throughput to the ATLAS experiment with marginal manpower required. CERN's experience, together with that of ESA and EMBL, is providing great insight into the cloud computing industry and has highlighted several challenges that are being tackled in order to ease the export of scientific workloads to cloud environments.

  18. Knowledge-Based Topic Model for Unsupervised Object Discovery and Localization.

    Science.gov (United States)

    Niu, Zhenxing; Hua, Gang; Wang, Le; Gao, Xinbo

    Unsupervised object discovery and localization aims to discover dominant object classes and localize all object instances in a given image collection without any supervision. Previous work has attempted to tackle this problem with vanilla topic models, such as latent Dirichlet allocation (LDA). However, in those methods no prior knowledge about the given image collection is exploited to facilitate object discovery. On the other hand, the topic models used in those methods suffer from the topic coherence issue: some inferred topics do not have clear meaning, which limits the final performance of object discovery. In this paper, prior knowledge in terms of so-called must-links is exploited from Web images on the Internet. Furthermore, a novel knowledge-based topic model, called LDA with mixture of Dirichlet trees, is proposed to incorporate the must-links into topic modeling for object discovery. In particular, to better deal with the polysemy phenomenon of visual words, the must-link is re-defined so that one must-link constrains only one or some topic(s) instead of all topics, which leads to significantly improved topic coherence. Moreover, the must-links are built and grouped with respect to specific object classes; thus the must-links in our approach are semantic-specific, which allows discriminative prior knowledge from Web images to be exploited more efficiently. Extensive experiments validated the efficiency of our proposed approach on several data sets. It is shown that our method significantly improves topic coherence and outperforms the unsupervised methods for object discovery and localization. In addition, compared with discriminative methods, the naturally existing object classes in the given image collection can be subtly discovered, which makes our approach well suited for realistic applications of unsupervised object discovery.

  19. High-performance computing on the Intel Xeon Phi how to fully exploit MIC architectures

    CERN Document Server

    Wang, Endong; Shen, Bo; Zhang, Guangyong; Lu, Xiaowei; Wu, Qing; Wang, Yajuan

    2014-01-01

    The aim of this book is to explain to high-performance computing (HPC) developers how to utilize the Intel® Xeon Phi™ series products efficiently. To that end, it introduces some computing grammar, programming technology and optimization methods for using many-integrated-core (MIC) platforms and also offers tips and tricks for actual use, based on the authors' first-hand optimization experience.The material is organized in three sections. The first section, "Basics of MIC", introduces the fundamentals of MIC architecture and programming, including the specific Intel MIC programming environment

  20. Agglomeration Economies and the High-Tech Computer

    OpenAIRE

    Wallace, Nancy E.; Walls, Donald

    2004-01-01

    This paper considers the effects of agglomeration on the production decisions of firms in the high-tech computer cluster. We build upon an alternative definition of the high-tech computer cluster developed by Bardhan et al. (2003) and we exploit a new data source, the National Establishment Time-Series (NETS) Database, to analyze the spatial distribution of firms in this industry. An essential contribution of this research is the recognition that high-tech firms are heterogeneous collections ...

  1. Computer-aided instruction system

    International Nuclear Information System (INIS)

    Teneze, Jean Claude

    1968-01-01

    This research thesis addresses the use of teleprocessing and time sharing in the RAX IBM system and the possibility of introducing a dialog with the machine, in order to develop an application in which the computer plays the role of a teacher for different pupils at the same time. Two operating modes are thus exploited: a teacher mode and a pupil mode. The developed CAI (computer-aided instruction) system comprises a checker to check the course syntax in teacher mode, a translator to transcode the course written in teacher mode into a form which can be processed by the execution programme, and the execution programme itself, which presents the course in pupil mode

  2. Application of large computers for predicting the oil field production

    Energy Technology Data Exchange (ETDEWEB)

    Philipp, W; Gunkel, W; Marsal, D

    1971-10-01

    The flank injection drive plays a dominant role in the exploitation of the BEB oil fields. Therefore, two-phase flow computer models were built, adapted to a predominance of a single flow direction and combining high prediction accuracy with low job time. Any case study starts with the partitioning of the reservoir into blocks. Then the statistics of the time-independent reservoir properties are analyzed by means of an IBM 360/25 unit. Using these results and the past production of oil, water and gas, a Fortran program running on a CDC-3300 computer yields oil recoveries and the ratios of the relative permeabilities (k_w/k_o) as functions of the local oil saturation for all blocks penetrated by mobile water. In order to assign k_w/k_o functions to blocks not yet reached by the advancing water front, correlation analysis is used to relate reservoir properties to the k_w/k_o functions. All these results are used as input to a CDC-660 Fortran program, allowing short-, medium-, and long-term forecasts as well as the handling of special problems.
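    The relative-permeability ratio referred to above is commonly parameterized with Corey-type curves (a standard textbook form, given only for orientation; the BEB study derived its functions empirically from production data):

    \[
      k_w(S_o) \;=\; k_w^{0}\left(\frac{1 - S_o - S_{wc}}{1 - S_{wc} - S_{or}}\right)^{n_w},
      \qquad
      k_o(S_o) \;=\; k_o^{0}\left(\frac{S_o - S_{or}}{1 - S_{wc} - S_{or}}\right)^{n_o},
    \]

    so that the ratio \(k_w/k_o\) is a decreasing function of the local oil saturation \(S_o\) (\(S_{wc}\): connate water saturation; \(S_{or}\): residual oil saturation).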

  3. Exploiting the X-Window environment to expand the number, reach, and usefulness of Fermilab accelerator control consoles

    International Nuclear Information System (INIS)

    Cahill, K.; Smedinghoff, J.

    1992-01-01

    The Fermilab accelerator operator workstation of choice is now the Digital VAX station running VMS and X-Window software. This new platform provides an easy to learn programming environment while support routines are expanding in number and power. The X-Window environment is exploited to provide remote consoles to users across long haul networks and to support multiple consoles on a single workstation. The integration of imaging systems, local datalogging, commercial and Physics community's software, and development facilities on the operator workstation adds functionality to the system. The locally engineered knob/pointer/keyboard interface solves the multiple keyboard and mouse problems of a multi-screen console. This paper will address these issues of Fermilab's accelerator operator workstations. (author)

  4. Local approach of cleavage fracture applied to a vessel with subclad flaw. A benchmark on computational simulation

    International Nuclear Information System (INIS)

    Moinereau, D.; Brochard, J.; Guichard, D.; Bhandari, S.; Sherry, A.; France, C.

    1996-10-01

    A benchmark on the computational simulation of a cladded vessel with a 6.2 mm sub-clad flaw submitted to a thermal transient has been conducted. Two-dimensional elastic and elastic-plastic finite element computations of the vessel have been performed by the different partners with their respective finite element codes ASTER (EDF), CASTEM 2000 (CEA), SYSTUS (Framatome) and ABAQUS (AEA Technology). The main results have been compared: temperature field in the vessel, crack opening, opening stress at crack tips, stress intensity factor in cladding and base metal, Weibull stress σ_w and probability of failure in base metal, and void growth rate R/R_0 in cladding. This comparison shows excellent agreement on the main results, in particular on those obtained with the local approach. (K.A.)
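    The Weibull stress and failure probability compared in the benchmark follow the standard Beremin local-approach definitions:

    \[
      \sigma_w \;=\; \left(\int_{V_p} \sigma_1^{\,m}\,\frac{dV}{V_0}\right)^{1/m},
      \qquad
      P_f \;=\; 1 - \exp\!\left[-\left(\frac{\sigma_w}{\sigma_u}\right)^{m}\right],
    \]

    where \(\sigma_1\) is the maximum principal stress over the plastic zone \(V_p\), and \(m\), \(\sigma_u\) and \(V_0\) are the Weibull modulus, scale stress and reference volume.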

  5. Exploitation and Utilization of Oilfield Geothermal Resources in China

    Directory of Open Access Journals (Sweden)

    Shejiao Wang

    2016-09-01

    Full Text Available Geothermal energy is a clean, green renewable resource which can be utilized for power generation, heating and cooling, and could effectively replace oil, gas, and coal. In recent years, oil companies have put more effort into exploiting and utilizing geothermal energy with advanced technologies for heat-tracing oil gathering and transportation, central heating, etc., which has not only reduced resource waste, but also improved large-scale and industrial resource utilization levels, and has achieved remarkable economic and social benefits. Based on an analysis of the development status, resource potential, and exploitation and utilization modes of oilfield geothermal energy, the advantages and disadvantages of harnessing oilfield geothermal resources are discussed. Oilfield geothermal energy exploitation and utilization have advantages in resources, technical personnel, technology, and a large number of abandoned wells that could be reconstructed and utilized. Due to the high heat demand in oilfields, geothermal energy exploitation and utilization can effectively replace oil, gas, coal, and other fossil fuels, and has bright prospects. The key factors limiting oilfield geothermal energy exploitation and utilization are also pointed out in this paper, including immature technologies, lack of overall planning, lack of standards in resource assessment and economic assessment, and lack of incentive policies.

  6. Network exploitation using WAMI tracks

    Science.gov (United States)

    Rimey, Ray; Record, Jim; Keefe, Dan; Kennedy, Levi; Cramer, Chris

    2011-06-01

    Creating and exploiting network models from wide area motion imagery (WAMI) is an important task for intelligence analysis. Tracks of entities observed moving in the WAMI sensor data are extracted, then large numbers of tracks are studied over long time intervals to determine specific locations that are visited (e.g., buildings in an urban environment), what locations are related to other locations, and the function of each location. This paper describes several parts of the network detection/exploitation problem, and summarizes a solution technique for each: (a) Detecting nodes; (b) Detecting links between known nodes; (c) Node attributes to characterize a node; (d) Link attributes to characterize each link; (e) Link structure inferred from node attributes and vice versa; and (f) Decomposing a detected network into smaller networks. Experimental results are presented for each solution technique, and those are used to discuss issues for each problem part and its solution technique.
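    A toy sketch of the node/link-detection steps (a) and (b) above (the data model — tracks as point sequences whose endpoints mark visited locations — and the grid-snapping are assumptions made for illustration, not the paper's method):

    ```python
    # Build a simple location network from WAMI tracks: track endpoints become
    # nodes (visited locations); a track connecting two locations becomes a link.
    import numpy as np
    from collections import Counter

    def tracks_to_network(tracks, cell=25.0):
        # Snap a point to a grid cell so nearby endpoints merge into one node.
        snap = lambda p: (int(p[0] // cell), int(p[1] // cell))
        nodes, links = Counter(), Counter()
        for tr in tracks:
            a, b = snap(tr[0]), snap(tr[-1])   # start/stop of the track
            nodes.update([a, b])
            if a != b:
                links[tuple(sorted((a, b)))] += 1
        return nodes, links

    tracks = [np.array([[0, 0], [50, 40], [100, 80]]),
              np.array([[2, 3], [60, 50], [99, 82]])]
    nodes, links = tracks_to_network(tracks)
    print(nodes, links)  # visit counts per node, traversal counts per link
    ```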

  7. Solving computationally expensive engineering problems

    CERN Document Server

    Leifsson, Leifur; Yang, Xin-She

    2014-01-01

    Computational complexity is a serious bottleneck for the design process in virtually any engineering area. While migration from prototyping and experimental-based design validation to verification using computer simulation models is inevitable and has a number of advantages, high computational costs of accurate, high-fidelity simulations can be a major issue that slows down the development of computer-aided design methodologies, particularly those exploiting automated design improvement procedures, e.g., numerical optimization. The continuous increase of available computational resources does not always translate into shortening of the design cycle because of the growing demand for higher accuracy and necessity to simulate larger and more complex systems. Accurate simulation of a single design of a given system may be as long as several hours, days or even weeks, which often makes design automation using conventional methods impractical or even prohibitive. Additional problems include numerical noise often pr...

  8. Computational neuropharmacology: dynamical approaches in drug discovery.

    Science.gov (United States)

    Aradi, Ildiko; Erdi, Péter

    2006-05-01

    Computational approaches that adopt dynamical models are widely accepted in basic and clinical neuroscience research as indispensable tools with which to understand normal and pathological neuronal mechanisms. Although computer-aided techniques have been used in pharmaceutical research (e.g. in structure- and ligand-based drug design), the power of dynamical models has not yet been exploited in drug discovery. We suggest that dynamical system theory and computational neuroscience--integrated with well-established, conventional molecular and electrophysiological methods--offer a broad perspective in drug discovery and in the search for novel targets and strategies for the treatment of neurological and psychiatric diseases.

  9. Nanostructured Basaltfiberconcrete Exploitational Characteristics

    Science.gov (United States)

    Saraykina, K. A.; Shamanov, V. A.

    2017-11-01

    The article demonstrates that the mass use of basalt fiber concrete (BFC) is constrained by insufficient study of its durability and serviceability in a variety of environments. This research is aimed at the study of the basalt fiber corrosion processes in the cement stone of BFC and at the control of the structure formation of new products in order to protect the reinforcing fiber from alkaline destruction and thereby improve the exploitational characteristics of the composite. The results revealed that the modification of basalt fiber concrete by the dispersion of MWNTs contributes to the directional formation of new products in the cement matrix. The HAM additive in basalt fiber concrete provides for the binding of portlandite to low-basic calcium hydroaluminosilicates, thus reducing the aggressive effect of the cement environment on the properties of the reinforcing fibers. The complex modification of BFC with nanostructured additives provides for an increase in its durability and exploitational properties (strength, frost resistance and water resistance) due to the protection of the basalt fiber from alkali corrosion, on account of the compacting of the contact zone "basalt fiber - cement stone" and the design of the structure and morphology of the cement matrix new products over the fiber surface.

  10. Preoperative Computed Tomography-Guided Percutaneous Hookwire Localization of Metallic Marker Clips in the Breast with a Radial Approach: Initial Experience

    Energy Technology Data Exchange (ETDEWEB)

    Uematsu, T.; Kasami, M.; Uchida, Y.; Sanuki, J.; Kimura, K.; Tanaka, K.; Takahashi, K. [Dept. of Diagnostic Radiology, Dept. of Pathology, and Dept. of Breast Surgery, Shizuoka Cancer Center Hospital, Naga-izumi, Shizuoka (Japan)

    2007-07-15

    Background: Hookwire localization is the current standard technique for radiological marking of nonpalpable breast lesions. Stereotactic directional vacuum-assisted breast biopsy (SVAB) is of sufficient sensitivity and specificity to replace surgical biopsy. Wire localization for metallic marker clips placed after SVAB is needed. Purpose: To describe a method for performing computed tomography (CT)-guided hookwire localization using a radial approach for metallic marker clips placed percutaneously after SVAB. Material and Methods: Nineteen women scheduled for SVAB with marker-clip placement, CT-guided wire localization of the marker clips, and, eventually, surgical excision were prospectively entered into the study. CT-guided wire localization was performed with a radial approach, followed by surgical excision of the localized marker clip. Feasibility and reliability of the procedure and the incidence of complications were examined. Results: CT-guided wire localization and surgical excision were successfully performed in all 19 women without any complications. The mean total procedure time was 15 min. The median distance on CT images from marker clip to hookwire was 2 mm (range 0-3 mm). Conclusion: CT-guided preoperative hookwire localization with a radial approach for marker clips after SVAB is technically feasible.

  11. Preoperative computed tomography-guided percutaneous hookwire localization of metallic marker clips in the breast with a radial approach: initial experience.

    Science.gov (United States)

    Uematsu, T; Kasami, M; Uchida, Y; Sanuki, J; Kimura, K; Tanaka, K; Takahashi, K

    2007-06-01

    Hookwire localization is the current standard technique for radiological marking of nonpalpable breast lesions. Stereotactic directional vacuum-assisted breast biopsy (SVAB) is of sufficient sensitivity and specificity to replace surgical biopsy. Wire localization for metallic marker clips placed after SVAB is needed. To describe a method for performing computed tomography (CT)-guided hookwire localization using a radial approach for metallic marker clips placed percutaneously after SVAB. Nineteen women scheduled for SVAB with marker-clip placement, CT-guided wire localization of the marker clips, and, eventually, surgical excision were prospectively entered into the study. CT-guided wire localization was performed with a radial approach, followed by surgical excision of the localized marker clip. Feasibility and reliability of the procedure and the incidence of complications were examined. CT-guided wire localization and surgical excision were successfully performed in all 19 women without any complications. The mean total procedure time was 15 min. The median distance on CT images from marker clip to hookwire was 2 mm (range 0-3 mm). CT-guided preoperative hookwire localization with a radial approach for marker clips after SVAB is technically feasible.

  12. On species preservation and Non-Cooperative Exploiters

    DEFF Research Database (Denmark)

    Kronbak, Lone Grønbæk; Lindroos, Marko

    …cases where several non-cooperative exploiters are involved in mixed fisheries. This paper targets biodiversity preservation by setting up a two-species model with the aim of ensuring that both species survive harvesting by exploiters adopting non-cooperative behaviour. The model starts out as a multi-species model without biological dependency and is then modified to also include biological dependency. We contribute to the literature by analytically finding the limits on the number of players preserving both species, including the conditions to be satisfied. For visual purposes we simulate a two-species…

  13. WE-A-17A-09: Exploiting Electromagnetic Technologies for Real-Time Seed Drop Position Validation in Permanent Implant Brachytherapy

    Energy Technology Data Exchange (ETDEWEB)

    Racine, E [Departement de Radio-Oncologie et Centre de Recherche du CHU de Quebec, Quebec, QC (Canada); Hautvast, G [Biomedical Systems, Philips Group Innovation, Eindhoven, North Brabant (Netherlands); Binnekamp, D [Integrated Clinical Solutions and Marketing, Philips Healthcare, Best, DA (Netherlands); Beaulieu, L [Centre Hospitalier Universitaire de Quebec, Quebec, QC (Canada)]

    2014-06-15

    Purpose: To report on preliminary results validating the performance of a specially designed LDR brachytherapy needle prototype possessing both electromagnetic (EM) tracking and seed drop detection abilities. Methods: An EM hollow needle prototype has been designed and constructed in collaboration with research partner Philips Healthcare. The needle possesses conventional 3D tracking capabilities, along with a novel seed drop detection mechanism exploiting local changes of electromagnetic properties generated by the passage of seeds in the needle's embedded sensor coils. These two capabilities are exploited by proprietary engineering and signal processing techniques to generate seed drop position estimates in real-time treatment delivery. The electromagnetic tracking system (EMTS) used for the experiment is the NDI Aurora Planar Field Generator. The experiment consisted of dropping a total of 35 seeds in a prismatic agarose phantom, and comparing the 3D seed drop positions of the EMTS to those obtained by an image analysis of subsequent micro-CT scans. Drop position error computations and statistical analysis were performed after a 3D registration of the two seed distributions. Results: Of the 35 seeds dropped in the phantom, 32 were properly detected by the needle prototype. Absolute drop position errors among the detected seeds ranged from 0.5 to 4.8 mm with mean and standard deviation values of 1.6 and 0.9 mm, respectively. Error measurements also include undesirable and uncontrollable effects such as seed motion upon deposition. The true accuracy performance of the needle prototype is therefore underestimated. Conclusion: This preliminary study demonstrates the potential benefits of EM technologies in detecting the passage of seeds in a hollow needle as a means of generating drop position estimates in real-time treatment delivery. Such tools could therefore represent a potentially interesting addition to existing brachytherapy protocols for rapid dosimetry
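
    The error analysis described here depends on first rigidly registering the EM-detected seed distribution to the micro-CT one. The record does not state which registration method was used; a standard choice for matched 3D point sets is the Kabsch/SVD algorithm, sketched below on synthetic stand-in coordinates.

    ```python
    # Hedged sketch: rigid (rotation + translation) registration of two matched
    # 3D point sets via the Kabsch/SVD method, then drop-position error stats.
    # The record does not specify its registration method; this is one standard choice.
    import numpy as np

    def rigid_register(src, dst):
        """Return R, t minimizing ||R @ src_i + t - dst_i|| over matched points."""
        src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
        H = (src - src_c).T @ (dst - dst_c)          # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))       # avoid reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = dst_c - R @ src_c
        return R, t

    rng = np.random.default_rng(0)
    ct_seeds = rng.uniform(0.0, 40.0, size=(32, 3))      # stand-in micro-CT positions (mm)
    em_seeds = ct_seeds + rng.normal(0.0, 1.0, (32, 3))  # stand-in EM estimates

    R, t = rigid_register(em_seeds, ct_seeds)
    aligned = em_seeds @ R.T + t
    errors = np.linalg.norm(aligned - ct_seeds, axis=1)
    print(f"mean={errors.mean():.2f} mm, sd={errors.std():.2f} mm, max={errors.max():.2f} mm")
    ```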

  14. WE-A-17A-09: Exploiting Electromagnetic Technologies for Real-Time Seed Drop Position Validation in Permanent Implant Brachytherapy

    International Nuclear Information System (INIS)

    Racine, E; Hautvast, G; Binnekamp, D; Beaulieu, L

    2014-01-01

    Purpose: To report on preliminary results validating the performance of a specially designed LDR brachytherapy needle prototype possessing both electromagnetic (EM) tracking and seed drop detection abilities. Methods: An EM hollow needle prototype has been designed and constructed in collaboration with research partner Philips Healthcare. The needle possesses conventional 3D tracking capabilities, along with a novel seed drop detection mechanism exploiting local changes of electromagnetic properties generated by the passage of seeds in the needle's embedded sensor coils. These two capabilities are exploited by proprietary engineering and signal processing techniques to generate seed drop position estimates in real-time treatment delivery. The electromagnetic tracking system (EMTS) used for the experiment is the NDI Aurora Planar Field Generator. The experiment consisted of dropping a total of 35 seeds in a prismatic agarose phantom, and comparing the 3D seed drop positions of the EMTS to those obtained by an image analysis of subsequent micro-CT scans. Drop position error computations and statistical analysis were performed after a 3D registration of the two seed distributions. Results: Of the 35 seeds dropped in the phantom, 32 were properly detected by the needle prototype. Absolute drop position errors among the detected seeds ranged from 0.5 to 4.8 mm with mean and standard deviation values of 1.6 and 0.9 mm, respectively. Error measurements also include undesirable and uncontrollable effects such as seed motion upon deposition. The true accuracy performance of the needle prototype is therefore underestimated. Conclusion: This preliminary study demonstrates the potential benefits of EM technologies in detecting the passage of seeds in a hollow needle as a means of generating drop position estimates in real-time treatment delivery. Such tools could therefore represent a potentially interesting addition to existing brachytherapy protocols for rapid dosimetry

  15. Geographically distributed Batch System as a Service: the INDIGO-DataCloud approach exploiting HTCondor

    Science.gov (United States)

    Aiftimiei, D. C.; Antonacci, M.; Bagnasco, S.; Boccali, T.; Bucchi, R.; Caballer, M.; Costantini, A.; Donvito, G.; Gaido, L.; Italiano, A.; Michelotto, D.; Panella, M.; Salomoni, D.; Vallero, S.

    2017-10-01

    One of the challenges a scientific computing center has to face is to keep delivering well-consolidated computational frameworks (i.e. the batch computing farm) while conforming to modern computing paradigms. The aim is to ease system administration at all levels (from hardware to applications) and to provide a smooth end-user experience. Within the INDIGO-DataCloud project, we adopt two different approaches to implement a PaaS-level, on-demand Batch Farm Service based on HTCondor and Mesos. In the first approach, described in this paper, the various HTCondor daemons are packaged inside pre-configured Docker images and deployed as Long Running Services through Marathon, profiting from its health checks and failover capabilities. In the second approach, we are going to implement an ad-hoc HTCondor framework for Mesos. Container-to-container communication and isolation have been addressed by exploring a solution based on overlay networks (based on the Calico Project). Finally, we have studied the possibility of deploying an HTCondor cluster that spans different sites, exploiting the Condor Connection Broker component, which allows communication across a private network boundary or firewall, as in the case of multi-site deployments. In this paper, we are going to describe and motivate our implementation choices and to show the results of the first tests performed.
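
    As a rough sketch of the first approach (HTCondor daemons packaged in Docker images and run as Marathon Long Running Services), the snippet below POSTs a minimal app definition, including a health check, to a Marathon REST endpoint. The endpoint URL, image name, resource figures, and health-check command are illustrative placeholders, not taken from the paper.

    ```python
    # Hedged sketch: registering a containerized HTCondor daemon with Marathon as
    # a Long Running Service. Endpoint and image names are hypothetical placeholders.
    import json
    import urllib.request

    MARATHON_URL = "http://marathon.example.org:8080/v2/apps"  # placeholder

    app_definition = {
        "id": "/htcondor/startd",
        "instances": 4,
        "cpus": 2.0,
        "mem": 4096,
        "container": {
            "type": "DOCKER",
            "docker": {"image": "example/htcondor-startd:latest"},  # placeholder
        },
        # Marathon restarts the task if the health check fails, giving the
        # failover behaviour the record mentions.
        "healthChecks": [{
            "protocol": "COMMAND",
            "command": {"value": "condor_who -quick"},
            "intervalSeconds": 60,
            "maxConsecutiveFailures": 3,
        }],
    }

    req = urllib.request.Request(
        MARATHON_URL,
        data=json.dumps(app_definition).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.status, resp.reason)
    ```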

  16. Diamond NV centers for quantum computing and quantum networks

    NARCIS (Netherlands)

    Childress, L.; Hanson, R.

    2013-01-01

    The exotic features of quantum mechanics have the potential to revolutionize information technologies. Using superposition and entanglement, a quantum processor could efficiently tackle problems inaccessible to current-day computers. Nonlocal correlations may be exploited for intrinsically secure

  17. Power throttling of collections of computing elements

    Science.gov (United States)

    Bellofatto, Ralph E [Ridgefield, CT]; Coteus, Paul W [Yorktown Heights, NY]; Crumley, Paul G [Yorktown Heights, NY]; Gara, Alan G [Mount Kisco, NY]; Giampapa, Mark E [Irvington, NY]; Gooding, Thomas M [Rochester, MN]; Haring, Rudolf A [Cortlandt Manor, NY]; Megerian, Mark G [Rochester, MN]; Ohmacht, Martin [Yorktown Heights, NY]; Reed, Don D [Mantorville, MN]; Swetz, Richard A [Mahopac, NY]; Takken, Todd [Brewster, NY]

    2011-08-16

    An apparatus and method for controlling power usage in a computer system includes a plurality of computers communicating with a local control device, and a power source supplying power to the local control device and the computers. A plurality of sensors communicate with the computers for ascertaining their power usage, and a system control device communicates with the computers for controlling their power usage.

  18. Increasing the power of accelerated molecular dynamics methods and plans to exploit the coming exascale

    Science.gov (United States)

    Voter, Arthur

    Many important materials processes take place on time scales that far exceed the roughly one microsecond accessible to molecular dynamics simulation. Typically, this long-time evolution is characterized by a succession of thermally activated infrequent events involving defects in the material. In the accelerated molecular dynamics (AMD) methodology, known characteristics of infrequent-event systems are exploited to make reactive events take place more frequently, in a dynamically correct way. For certain processes, this approach has been remarkably successful, offering a view of complex dynamical evolution on time scales of microseconds, milliseconds, and sometimes beyond. We have recently made advances in all three of the basic AMD methods (hyperdynamics, parallel replica dynamics, and temperature accelerated dynamics (TAD)), exploiting both algorithmic advances and novel parallelization approaches. I will describe these advances, present some examples of our latest results, and discuss what should be possible when exascale computing arrives in roughly five years. Funded by the U.S. Department of Energy, Office of Basic Energy Sciences, Materials Sciences and Engineering Division, and by the Los Alamos Laboratory Directed Research and Development program.
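
    Parallel replica dynamics, one of the three AMD methods named here, gains its boost by running many statistically independent replicas and accepting the first transition seen on any of them; for exponentially distributed waiting times the physical time is the sum of the replica times. The toy sketch below illustrates only that bookkeeping on synthetic rates; it is not the production AMD code.

    ```python
    # Toy illustration of parallel-replica-dynamics bookkeeping: with M
    # independent replicas of an exponential (rate k) escape process, the first
    # escape is observed ~M times sooner, and the summed replica time is the
    # correct estimate of physical time. Synthetic numbers, not production AMD.
    import numpy as np

    rng = np.random.default_rng(42)
    k = 1e-6            # escape rate per MD step (a rare event)
    n_events = 200      # how many consecutive transitions to simulate

    for n_replicas in (1, 16, 64):
        wall_time = 0.0       # steps on the per-replica (wall) clock
        physical_time = 0.0   # accumulated physical time over all replicas
        for _ in range(n_events):
            # Waiting time until the first escape among n_replicas replicas:
            # the minimum of exponentials with rate k is exponential, rate n*k.
            t_first = rng.exponential(1.0 / (k * n_replicas))
            wall_time += t_first                    # each replica ran t_first steps
            physical_time += n_replicas * t_first   # total simulated time
        print(f"{n_replicas:3d} replicas: wall={wall_time:.3e} steps, "
              f"physical={physical_time:.3e} steps "
              f"(boost ~{physical_time / wall_time:.0f}x)")
    ```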

  19. Trolling may intensify exploitation in crappie fisheries

    Science.gov (United States)

    Meals, K. O.; Dunn, A. W.; Miranda, Leandro E.

    2012-01-01

    In some parts of the USA, anglers targeting crappies Pomoxis spp. are transitioning from mostly stationary angling with a single pole around submerged structures to using multiple poles while drifting with the wind or under power. This shift in fishing methods could result in a change in catch efficiency, possibly increasing exploitation rates to levels that would be of concern to managers. We studied the catch statistics of anglers fishing while trolling with multiple poles (trollers) and those fishing with single poles (polers) in Mississippi reservoirs. Specifically, we tested whether (1) various catch statistics differed between trollers and polers, (2) catch rates of trollers were related to the number of poles fished, and (3) trollers could raise exploitation rates to potentially unsustainable levels. Results showed that participation in the crappie fisheries was about equally split between polers and trollers. In spring, 90% of crappie anglers were polers; in summer, 85% of crappie anglers were trollers. The size of harvested crappies was similar for the two angler groups, but the catch per hour was almost three times higher for trollers than for polers. Catch rates by trollers were directly correlated to the number of poles fished, although the relationship flattened as the number of poles increased. The average harvest rate for one troller fishing with three poles was similar to the harvest rate obtained by one poler. Simulations predicted that at the existing mix of about 50% polers and 50% trollers and with no restrictions on the number of poles used by trollers, exploitation of crappies is about 1.3 times higher than that in a polers-only fishery; under a scenario in which 100% of crappie anglers were trollers, exploitation was forecasted to increase to about 1.7 times the polers-only rate. The efficiency of trolling for crappies should be of concern to fishery managers because crappie fisheries are mostly consumptive and may increase exploitation

  20. 25th Annual International Symposium on Field-Programmable Custom Computing Machines

    CERN Document Server

    The IEEE Symposium on Field-Programmable Custom Computing Machines is the original and premier forum for presenting and discussing new research related to computing that exploits the unique features and capabilities of FPGAs and other reconfigurable hardware. Over the past two decades, FCCM has been the place to present papers on architectures, tools, and programming models for field-programmable custom computing machines as well as applications that use such systems.

  1. An Introduction to Parallel Cluster Computing Using PVM for Computer Modeling and Simulation of Engineering Problems

    International Nuclear Information System (INIS)

    Spencer, VN

    2001-01-01

    An investigation has been conducted regarding the ability of clustered personal computers to improve the performance of executing software simulations for solving engineering problems. The power and utility of personal computers continues to grow exponentially through advances in computing capabilities such as newer microprocessors, advances in microchip technologies, electronic packaging, and cost-effective gigabyte-size hard drive capacity. Many engineering problems require significant computing power. Therefore, the computation has to be done by high-performance computer systems that cost millions of dollars and need gigabytes of memory to complete the task. Alternatively, it is feasible to provide adequate computing in the form of clustered personal computers. This method cuts the cost and size by linking (clustering) personal computers together across a network. Clusters also have the advantage that they can be used as stand-alone computers when they are not operating as a parallel computer. Parallel computing software to exploit clusters is available for computer operating systems like Unix, Windows NT, or Linux. This project concentrates on the use of Windows NT and the Parallel Virtual Machine (PVM) system to solve an engineering dynamics problem in Fortran.
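
    PVM itself exposes a C/Fortran message-passing API, so purely as an analogue of the master/worker decomposition the report applies, the sketch below distributes independent simulation cases over a pool of worker processes with Python's standard library. It shows the work-distribution pattern, not the PVM API, and the simulation stub is a placeholder.

    ```python
    # Analogue sketch (not PVM): master/worker decomposition of an embarrassingly
    # parallel parameter sweep, the work-distribution pattern a cluster exploits.
    from multiprocessing import Pool

    def simulate_case(params):
        """Stand-in for one expensive dynamics simulation (illustrative only)."""
        damping, stiffness = params
        # ... a real code would integrate the equations of motion here ...
        return damping, stiffness, stiffness / damping  # dummy result

    if __name__ == "__main__":
        cases = [(d, k) for d in (0.1, 0.2, 0.4) for k in (1.0, 2.0, 4.0)]
        with Pool(processes=4) as pool:          # 4 "cluster nodes"
            for damping, stiffness, result in pool.map(simulate_case, cases):
                print(f"damping={damping} stiffness={stiffness} -> {result:.2f}")
    ```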

  2. Models of Social Exploitation with Special Emphasis on Slovene Traffic Economics

    Directory of Open Access Journals (Sweden)

    Iztok Ostan

    2005-01-01

    In order to decipher the organisational behaviour operating in the transport sector of the economy it is necessary to discover the prevalent patterns of social exploitation at work. Preliminary results of a study of experienced irregular traffic students show that, according to them, there is no significant difference in Slovenia between exploitation in traffic and other sectors. Thus, general models of exploitation could be used to explain the behaviour in the traffic sector. Empirical research among Slovene students showed that according to their statements in the 90s the managerial and capitalistic types of exploitation prevailed in Slovenia over non-exploitative types of economic behaviour. It also showed that statements of students do not differ much from those of the general public regarding this question, nor from the statements of irregular students with extensive work experience. It was also found that there were no substantial differences between the statements of Italian and Slovene students regarding the type of exploitation operative in their countries. Students of traffic are basically of the same opinion regarding this topic as students in general, though slightly more critical, especially towards business managers and politicians.

  3. Fuzzy logic, neural networks, and soft computing

    Science.gov (United States)

    Zadeh, Lotfi A.

    1994-01-01

    The past few years have witnessed a rapid growth of interest in a cluster of modes of modeling and computation which may be described collectively as soft computing. The distinguishing characteristic of soft computing is that its primary aims are to achieve tractability, robustness, low cost, and high MIQ (machine intelligence quotient) through an exploitation of the tolerance for imprecision and uncertainty. Thus, in soft computing what is usually sought is an approximate solution to a precisely formulated problem or, more typically, an approximate solution to an imprecisely formulated problem. A simple case in point is the problem of parking a car. Generally, humans can park a car rather easily because the final position of the car is not specified exactly. If it were specified to within, say, a few millimeters and a fraction of a degree, it would take hours or days of maneuvering and precise measurements of distance and angular position to solve the problem. What this simple example points to is the fact that, in general, high precision carries a high cost. The challenge, then, is to exploit the tolerance for imprecision by devising methods of computation which lead to an acceptable solution at low cost. By its nature, soft computing is much closer to human reasoning than the traditional modes of computation. At this juncture, the major components of soft computing are fuzzy logic (FL), neural network theory (NN), and probabilistic reasoning techniques (PR), including genetic algorithms, chaos theory, and part of learning theory. Increasingly, these techniques are used in combination to achieve significant improvement in performance and adaptability. Among the important application areas for soft computing are control systems, expert systems, data compression techniques, image processing, and decision support systems. It may be argued that it is soft computing, rather than the traditional hard computing, that should be viewed as the foundation for artificial intelligence.
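
    Zadeh's parking example can be made concrete with a single fuzzy set: rather than an exact target position, a triangular membership function grades final positions by acceptability. The tolerances in the sketch below are invented for illustration.

    ```python
    # Minimal fuzzy-set illustration of the parking example: grade a final
    # parking position by a triangular membership function instead of an exact
    # target. Tolerances are made up for illustration.
    def triangular(x, left, peak, right):
        """Triangular fuzzy membership: 0 outside [left, right], 1 at peak."""
        if x <= left or x >= right:
            return 0.0
        if x <= peak:
            return (x - left) / (peak - left)
        return (right - x) / (right - peak)

    # "Well parked" = within roughly +/- 0.3 m of the ideal offset of 0.0 m.
    for offset_m in (0.0, 0.1, 0.25, 0.5):
        mu = triangular(offset_m, -0.3, 0.0, 0.3)
        print(f"offset {offset_m:+.2f} m -> membership {mu:.2f}")
    ```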

  4. Augmented reality enabling intelligence exploitation at the edge

    Science.gov (United States)

    Kase, Sue E.; Roy, Heather; Bowman, Elizabeth K.; Patton, Debra

    2015-05-01

    Today's Warfighters need to make quick decisions while interacting in densely populated environments comprised of friendly, hostile, and neutral host nation locals. However, there is a gap in the real-time processing of big data streams for edge intelligence. We introduce a big data processing pipeline called ARTEA that ingests, monitors, and performs a variety of analytics including noise reduction, pattern identification, and trend and event detection in the context of an area of operations (AOR). Results of the analytics are presented to the Soldier via an augmented reality (AR) device Google Glass (Glass). Non-intrusive AR devices such as Glass can visually communicate contextually relevant alerts to the Soldier based on the current mission objectives, time, location, and observed or sensed activities. This real-time processing and AR presentation approach to knowledge discovery flattens the intelligence hierarchy enabling the edge Soldier to act as a vital and active participant in the analysis process. We report preliminary observations testing ARTEA and Glass in a document exploitation and person of interest scenario simulating edge Soldier participation in the intelligence process in disconnected deployment conditions.

  5. Biomass exploitation for revitalizing rural areas: experiences and lessons drawn from three South European countries

    Energy Technology Data Exchange (ETDEWEB)

    Santana, J.C.; Herrero, J. [Wood and Forest Service Center of Castilla y Leon (CESEFOR) Pol. Ind. Las Casas, Soria (Spain); Crema, L.; Bozzoli, A. [Fondazione Bruno Kessler (FKB), Trento (Italy); Karampinis, E.; Grammelis, P.; Margaritis, N. [Centre for Research and Technology Hellas/Inst. for Solid Fuels Technology and Applications (CERTH/ISFTA), Athens (Greece)

    2012-11-01

    Castilla y Leon (Spain), Trento (Italy) and Western Macedonia (Greece) are regions with a very high potential for forest and agricultural biomass production, but their biomass supply chains are not yet firmly established. In Castilla y Leon, a municipality in a forest area takes advantage of its large autochthonous stock of wood to arrange a complete chain of business, beginning with wood cutting and extraction, processing of raw biomass in local logistic centers to produce quality, traceable wood chips and pellets, distribution of the solid biofuels to consumers in a given area, and own use to generate energy and heat. In Trento, we analyse the exploitation of locally certified wood and residue pellets for public micro-cogeneration in a town, reaching a virtually closed cycle of use and recycling of resources. In a municipality of Western Macedonia, biomass residues from animal waste are being used to produce biogas, generating electric power to be sold and heat to dry wood biomass in a local pellet factory, revitalizing an area heavily conditioned by the mining industry. These strategies maximize the number of jobs created and make optimum use of local resources, providing them with high added value.

  6. Meaning, function and methods of the recultivation in mining exploitation

    OpenAIRE

    Dambov, Risto; Ljatifi, Ejup

    2015-01-01

    The exploitation of mineral resources degrades and deforms the relief and, more generally, the surface of the Earth's crust. Depending on the type of open pit mine, this degradation can be expressed to a lesser or greater extent, sometimes over several square kilometers. The exploitation of mineral resources is inseparably linked with the environment. Very often it is said that mining is „enemy No. 1“ for the environment. With exploitation comes degradation of h...

  7. Earliest economic exploitation of chicken outside East Asia: Evidence from the Hellenistic Southern Levant

    Science.gov (United States)

    Perry-Gal, Lee; Erlich, Adi; Gilboa, Ayelet; Bar-Oz, Guy

    2015-01-01

    Chicken (Gallus gallus domesticus) is today one of the most widespread domesticated species and is a main source of protein in the human diet. However, for thousands of years exploitation of chickens was confined to symbolic and social domains such as cockfighting. The question of when and where chickens were first used for economic purposes remains unresolved. The results of our faunal analysis demonstrate that the Hellenistic (fourth–second centuries B.C.E.) site of Maresha, Israel, is the earliest site known today where economic exploitation of chickens was widely practiced. We base our claim on the exceptionally high frequency of chicken bones at that site, the majority of which belong to adult individuals, and on the observed 2:1 ratio of female to male bones. These results are supported further by an extensive survey of faunal remains from 234 sites in the Southern Levant, spanning more than three millennia, which shows a sharp increase in the frequency of chicken during the Hellenistic period. We further argue that the earliest secure evidence for economic exploitation of chickens in Europe dates to the first century B.C.E. and therefore is predated by the finds in the Southern Levant by at least a century. We suggest that the gradual acclimatization of chickens in the Southern Levant and its gradual integration into the local economy, the latter fully accomplished in the Hellenistic period, was a crucial step in the adoption of this species in European husbandry some 100 y later. PMID:26195775

  8. Computerized management of radiology department: Installation and use of local area network(LAN) by personal computers

    International Nuclear Information System (INIS)

    Lee, Young Joon; Han, Kook Sang; Geon, Do Ig; Sol, Chang Hyo; Kim, Byung Soo

    1993-01-01

    There is an increasing need for networks connecting personal computers (PCs) together. Thus the local area network (LAN) emerged, designed to allow multiple computers to access and share multiple files, programs, and expensive peripheral devices, and to communicate with each user. We built a PC-LAN in our department consisting of 1) hardware - 9 personal computers (IBM-compatible 80386 DX, 1 set; 80286 AT, 8 sets) plus the cables and network interface cards (Ethernet-compatible, 16-bit) that connected the PCs and peripheral devices, and 2) software - a network operating system and a database management system. We managed this network for 6 months. The benefits of the PC-LAN were 1) multiuser operation (sharing multiple files, programs, and peripheral devices), 2) real data processing, 3) excellent expandability, flexibility, compatibility, and easy connectivity, 4) a single cable for networking, 5) rapid data transmission, 6) simple and easy installation and management, 7) use of conventional PC software running under DOS (Disk Operating System) without modification, and 8) low networking cost. In conclusion, a PC-LAN provides an easier and more effective way to manage the multiuser database system needed at hospital departments, instead of the more expensive and complex network of a minicomputer or mainframe.

  9. Energy-Aware Computation Offloading of IoT Sensors in Cloudlet-Based Mobile Edge Computing.

    Science.gov (United States)

    Ma, Xiao; Lin, Chuang; Zhang, Han; Liu, Jianwei

    2018-06-15

    Mobile edge computing is proposed as a promising computing paradigm to relieve the excessive burden of data centers and mobile networks, which is induced by the rapid growth of Internet of Things (IoT). This work introduces the cloud-assisted multi-cloudlet framework to provision scalable services in cloudlet-based mobile edge computing. Due to the constrained computation resources of cloudlets and limited communication resources of wireless access points (APs), IoT sensors with identical computation offloading decisions interact with each other. To optimize the processing delay and energy consumption of computation tasks, theoretic analysis of the computation offloading decision problem of IoT sensors is presented in this paper. In more detail, the computation offloading decision problem of IoT sensors is formulated as a computation offloading game and the condition of Nash equilibrium is derived by introducing the tool of a potential game. By exploiting the finite improvement property of the game, the Computation Offloading Decision (COD) algorithm is designed to provide decentralized computation offloading strategies for IoT sensors. Simulation results demonstrate that the COD algorithm can significantly reduce the system cost compared with the random-selection algorithm and the cloud-first algorithm. Furthermore, the COD algorithm can scale well with increasing IoT sensors.
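
    The COD algorithm rests on the finite improvement property of potential games: if one sensor at a time switches to its best response, the process must terminate in a Nash equilibrium. The sketch below runs that generic update loop on a toy congestion-style cost model; the cost numbers are illustrative and not taken from the paper.

    ```python
    # Hedged sketch of best-response dynamics in a congestion-style offloading
    # game (the paper's COD algorithm relies on the same finite improvement
    # property; this toy cost model is illustrative, not the paper's).
    LOCAL_COST = [5.0, 3.0, 6.0, 4.0, 7.0]   # per-sensor cost of computing locally

    def offload_cost(n_offloading):
        """Cost of offloading when n_offloading sensors share the cloudlet/AP."""
        return 1.0 + 2.0 * n_offloading      # congestion grows with contention

    def best_response_dynamics(n_sensors):
        offload = [False] * n_sensors
        changed = True
        while changed:                        # finite improvement => terminates
            changed = False
            for i in range(n_sensors):
                others = sum(offload) - offload[i]
                cost_if_offload = offload_cost(others + 1)
                best = cost_if_offload < LOCAL_COST[i]
                if best != offload[i]:
                    offload[i] = best
                    changed = True
        return offload

    print(best_response_dynamics(len(LOCAL_COST)))  # a pure Nash equilibrium
    ```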

  10. Membrane computing: brief introduction, recent results and applications.

    Science.gov (United States)

    Păun, Gheorghe; Pérez-Jiménez, Mario J

    2006-07-01

    The internal organization and functioning of living cells, as well as their cooperation in tissues and higher order structures, can be a rich source of inspiration for computer science, not fully exploited at the present date. Membrane computing is an answer to this challenge, well developed at the theoretical (mathematical and computability theory) level, already having several applications (via usual computers), but without having yet a bio-lab implementation. After briefly discussing some general issues related to natural computing, this paper provides an informal introduction to membrane computing, focused on the main ideas, the main classes of results and of applications. Then, three recent achievements, of three different types, are briefly presented, with emphasis on the usefulness of membrane computing as a framework for devising models of interest for biological and medical research.
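
    The basic computational step of a P system can be given an informal flavor in a few lines: objects in a membrane are rewritten by multiset rules applied in a maximally parallel way. The sketch below uses a deterministic toy rule set; real P systems also involve nondeterministic rule choice, membrane hierarchies, and communication between regions.

    ```python
    # Toy P-system step: apply multiset rewriting rules in a maximally parallel
    # way inside one membrane. The rules (a -> bb, b -> c) are a toy example.
    from collections import Counter

    RULES = {"a": Counter({"b": 2}),   # a -> b b
             "b": Counter({"c": 1})}   # b -> c

    def step(membrane: Counter) -> Counter:
        """One maximally parallel step: every object with a rule is rewritten."""
        result = Counter()
        for obj, count in membrane.items():
            if obj in RULES:
                for product, mult in RULES[obj].items():
                    result[product] += mult * count
            else:
                result[obj] += count          # objects without rules persist
        return result

    membrane = Counter({"a": 3})
    for i in range(3):
        print(f"step {i}: {dict(membrane)}")
        membrane = step(membrane)
    print(f"step 3: {dict(membrane)}")
    ```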

  11. Structure and dynamics of amorphous polymers: computer simulations compared to experiment and theory

    International Nuclear Information System (INIS)

    Paul, Wolfgang; Smith, Grant D

    2004-01-01

    This contribution considers recent developments in the computer modelling of amorphous polymeric materials. Progress in our capabilities to build models for the computer simulation of polymers from the detailed atomistic scale up to coarse-grained mesoscopic models, together with the ever-improving performance of computers, have led to important insights from computer simulations into the structural and dynamic properties of amorphous polymers. Structurally, chain connectivity introduces a range of length scales from that of the chemical bond to the radius of gyration of the polymer chain covering 2-4 orders of magnitude. Dynamically, this range of length scales translates into an even larger range of time scales observable in relaxation processes in amorphous polymers ranging from about 10^-13 to 10^-3 s, or even to 10^3 s when glass dynamics is concerned. There is currently no single simulation technique that is able to describe all these length and time scales efficiently. On large length and time scales basic topology and entropy become the governing properties and this fact can be exploited using computer simulations of coarse-grained polymer models to study universal aspects of the structure and dynamics of amorphous polymers. On the largest length and time scales chain connectivity is the dominating factor leading to the strong increase in longest relaxation times described within the reptation theory of polymer melt dynamics. Recently, many of the universal aspects of this behaviour have been further elucidated by computer simulations of coarse-grained polymer models. On short length scales the detailed chemistry and energetics of the polymer are important, and one has to be able to capture them correctly using chemically realistic modelling of specific polymers, even when the aim is to extract generic physical behaviour exhibited by the specific chemistry. Detailed studies of chemically realistic models highlight the central importance of torsional dynamics
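
    The structural length-scale range mentioned here rests on the textbook random-walk scaling of an ideal chain, R_g proportional to N^(1/2). The sketch below checks that scaling numerically by sampling freely jointed chains; it is a generic illustration, not a model of any specific polymer.

    ```python
    # Quick numerical check of the ideal-chain scaling R_g ~ N^0.5 that underlies
    # the 2-4 orders of magnitude of length scales mentioned in the record.
    import numpy as np

    rng = np.random.default_rng(1)

    def radius_of_gyration(n_bonds, n_chains=200, bond_length=1.0):
        """Mean R_g over freely jointed random-walk chains of n_bonds steps."""
        steps = rng.normal(size=(n_chains, n_bonds, 3))
        steps *= bond_length / np.linalg.norm(steps, axis=2, keepdims=True)
        positions = np.cumsum(steps, axis=1)          # monomer coordinates
        centered = positions - positions.mean(axis=1, keepdims=True)
        rg = np.sqrt((centered**2).sum(axis=2).mean(axis=1))
        return rg.mean()

    for n in (100, 400, 1600):
        print(f"N={n:5d}: <R_g> = {radius_of_gyration(n):6.2f}  "
              f"(expect roughly x2 per 4x N)")
    ```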

  12. SenSyF Experience on Integration of EO Services in a Generic, Cloud-Based EO Exploitation Platform

    Science.gov (United States)

    Almeida, Nuno; Catarino, Nuno; Gutierrez, Antonio; Grosso, Nuno; Andrade, Joao; Caumont, Herve; Goncalves, Pedro; Villa, Guillermo; Mangin, Antoine; Serra, Romain; Johnsen, Harald; Grydeland, Tom; Emsley, Stephen; Jauch, Eduardo; Moreno, Jose; Ruiz, Antonio

    2016-08-01

    SenSyF is a cloud-based data processing framework for EO-based services. It has been a pioneer in addressing Big Data issues from the Earth Observation point of view, and is a precursor of several of the technologies and methodologies that will be deployed in ESA's Thematic Exploitation Platforms and other related systems. The SenSyF system focuses on developing fully automated data management, together with access to a processing and exploitation framework, including Earth Observation specific tools. SenSyF is both a development and validation platform for data-intensive applications using Earth Observation data. With SenSyF, scientific, institutional or commercial institutions developing EO-based applications and services can take advantage of distributed computational and storage resources, tailored for applications dependent on big Earth Observation data, without resorting to deep infrastructure and technological investments. This paper describes the integration process and the experience gathered from different EO service providers during the project.

  13. Assessing the Role of User Computer Self-Efficacy, Cybersecurity Countermeasures Awareness, and Cybersecurity Skills toward Computer Misuse Intention at Government Agencies

    Science.gov (United States)

    Choi, Min Suk

    2013-01-01

    Cybersecurity threats and vulnerabilities are causing substantial financial losses for governments and organizations all over the world. Cybersecurity criminals are stealing more than one billion dollars from banks every year by exploiting vulnerabilities caused by bank users' computer misuse. Cybersecurity breaches are threatening the common…

  14. Peat exploitation - Environment. Effects and measures

    International Nuclear Information System (INIS)

    Stenbeck, G.

    1996-01-01

    This report gives a detailed description of the influence of peat exploitation on the land-, water- and atmospheric environments. Proposals for mitigatory measures to minimize damage to the environment are also given

  15. SU-G-JeP1-01: A Combination of Real Time Electromagnetic Localization and Tracking with Cone Beam Computed Tomography in Stereotactic Radiosurgery for Brain Tumors

    International Nuclear Information System (INIS)

    Muralidhar, K Raja; Pangam, Suresh; Ponaganti, Srinivas; Krishna, Jayarama; Sujana, Kolla V; Komanduri, Priya K

    2016-01-01

    Purpose: 1. Online verification of patient position during treatment using the Calypso electromagnetic localization and tracking system. 2. Verification and comparison of positional accuracy between cone beam computed tomography and the Calypso system. 3. Presenting the advantage of continuous localization in stereotactic radiosurgery treatments. Methods: Ten brain tumor cases were taken for this study. Patients underwent computed tomography (CT) with a head mask; before scanning, the mask was cut away over the forehead so that surface beacons could be kept on the skin. A slice thickness of 0.65 mm was used for this study. The x, y, z coordinates of these beacons in the TPS were entered into the tracking station. A Varian TrueBeam accelerator equipped with an On-Board Imager was used to acquire cone beam computed tomography (CBCT) images to localize the patient. Simultaneously, the surface beacons were used to localize and track the patient throughout the treatment. The localization values were compared between the two systems, with CBCT considered the reference for localization. Tracking was done throughout the treatment with the Calypso tracking system using the electromagnetic array, which was in tracking position during imaging and treatment. Flattening-filter-free 6 MV photon beams with volumetric modulated arc therapy were used for the treatment. Patient movement was observed throughout treatments lasting from 2 min to 4 min. Results: The average variation observed between the Calypso system and CBCT localization was less than 0.5 mm; these variations were due to manual errors in placing the beacons on the patient. Intra-fraction motion of less than 0.05 cm was observed throughout the treatment with the help of continuous tracking. Conclusion: The Calypso target localization system in combination with CBCT is one of the finest tools for performing radiosurgery. This non-radiographic method of tracking is a real benefit, allowing patients to be treated confidently while real-time motion information of the patient is observed.

  16. SU-G-JeP1-01: A Combination of Real Time Electromagnetic Localization and Tracking with Cone Beam Computed Tomography in Stereotactic Radiosurgery for Brain Tumors

    Energy Technology Data Exchange (ETDEWEB)

    Muralidhar, K Raja; Pangam, Suresh; Ponaganti, Srinivas; Krishna, Jayarama; Sujana, Kolla V; Komanduri, Priya K [American Oncology Institute, Hyderabad, Telangana (India)

    2016-06-15

    Purpose: 1. Online verification of patient position during treatment using the Calypso electromagnetic localization and tracking system. 2. Verification and comparison of positional accuracy between cone beam computed tomography and the Calypso system. 3. Presenting the advantage of continuous localization in stereotactic radiosurgery treatments. Methods: Ten brain tumor cases were taken for this study. Patients underwent computed tomography (CT) with a head mask; before scanning, the mask was cut away over the forehead so that surface beacons could be kept on the skin. A slice thickness of 0.65 mm was used for this study. The x, y, z coordinates of these beacons in the TPS were entered into the tracking station. A Varian TrueBeam accelerator equipped with an On-Board Imager was used to acquire cone beam computed tomography (CBCT) images to localize the patient. Simultaneously, the surface beacons were used to localize and track the patient throughout the treatment. The localization values were compared between the two systems, with CBCT considered the reference for localization. Tracking was done throughout the treatment with the Calypso tracking system using the electromagnetic array, which was in tracking position during imaging and treatment. Flattening-filter-free 6 MV photon beams with volumetric modulated arc therapy were used for the treatment. Patient movement was observed throughout treatments lasting from 2 min to 4 min. Results: The average variation observed between the Calypso system and CBCT localization was less than 0.5 mm; these variations were due to manual errors in placing the beacons on the patient. Intra-fraction motion of less than 0.05 cm was observed throughout the treatment with the help of continuous tracking. Conclusion: The Calypso target localization system in combination with CBCT is one of the finest tools for performing radiosurgery. This non-radiographic method of tracking is a real benefit, allowing patients to be treated confidently while real-time motion information of the patient is observed.

  17. Neural Computation Scheme of Compound Control: Tacit Learning for Bipedal Locomotion

    Science.gov (United States)

    Shimoda, Shingo; Kimura, Hidenori

    The growing need for controlling complex behaviors of versatile robots working in unpredictable environments has revealed the fundamental limitation of the model-based control strategy, which requires precise models of robots and environments before their operations. This difficulty is fundamental and has the same root as the well-known frame problem in artificial intelligence. It has been a central, long-standing issue in advanced robotics, as well as machine intelligence, to find a prospective clue to attack this fundamental difficulty. The general consensus shared by many leading researchers in the related field is that the body plays an important role in acquiring intelligence that can conquer unknowns. In particular, purposeful behaviors emerge during body-environment interactions with the help of an appropriately organized neural computational scheme that can exploit what the environment can afford. Along this line, we propose a new scheme of neural computation based on compound control which represents a typical feature of biological controls. This scheme is based on classical neuron models with local rules that can create macroscopic purposeful behaviors. This scheme is applied to a bipedal robot and generates the rhythm of walking without any model of robot dynamics and environments.

  18. Approximating local observables on projected entangled pair states

    Science.gov (United States)

    Schwarz, M.; Buerschaper, O.; Eisert, J.

    2017-06-01

    Tensor network states are for good reasons believed to capture ground states of gapped local Hamiltonians arising in the condensed matter context, states which are in turn expected to satisfy an entanglement area law. However, the computational hardness of contracting projected entangled pair states in two- and higher-dimensional systems is often seen as a significant obstacle when devising higher-dimensional variants of the density-matrix renormalization group method. In this work, we show that for those projected entangled pair states that are expected to provide good approximations of such ground states of local Hamiltonians, one can compute local expectation values in quasipolynomial time. We therefore provide a complexity-theoretic justification of why state-of-the-art numerical tools work so well in practice. We finally turn to the computation of local expectation values on quantum computers, providing a meaningful application for a small-scale quantum computer.

  19. Helix Nebula and CERN: A Symbiotic approach to exploiting commercial clouds

    International Nuclear Information System (INIS)

    Megino, Fernando H Barreiro; Jones, Robert; Llamas, Ramón Medrano; Ster, Daniel van der; Kucharczyk, Katarzyna

    2014-01-01

    The recent paradigm shift toward cloud computing in IT, and general interest in 'Big Data' in particular, have demonstrated that the computing requirements of HEP are no longer globally unique. Indeed, the CERN IT department and LHC experiments have already made significant R and D investments in delivering and exploiting cloud computing resources. While a number of technical evaluations of interesting commercial offerings from global IT enterprises have been performed by various physics labs, further technical, security, sociological, and legal issues need to be addressed before their large-scale adoption by the research community can be envisaged. Helix Nebula – the Science Cloud is an initiative that explores these questions by joining the forces of three European research institutes (CERN, ESA and EMBL) with leading European commercial IT enterprises. The goals of Helix Nebula are to establish a cloud platform federating multiple commercial cloud providers, along with new business models, which can sustain the cloud marketplace for years to come. This contribution will summarize the participation of CERN in Helix Nebula. We will explain CERN's flagship use-case and the model used to integrate several cloud providers with an LHC experiment's workload management system. During the first proof of concept, this project contributed over 40,000 CPU-days of Monte Carlo production throughput to the ATLAS experiment with marginal manpower required. CERN's experience, together with that of ESA and EMBL, is providing great insight into the cloud computing industry and has highlighted several challenges that are being tackled in order to ease the export of scientific workloads to cloud environments.

  20. High performance computation of landscape genomic models including local indicators of spatial association.

    Science.gov (United States)

    Stucki, S; Orozco-terWengel, P; Forester, B R; Duruz, S; Colli, L; Masembe, C; Negrini, R; Landguth, E; Jones, M R; Bruford, M W; Taberlet, P; Joost, S

    2017-09-01

    With the increasing availability of both molecular and topo-climatic data, the main challenges facing landscape genomics - that is the combination of landscape ecology with population genomics - include processing large numbers of models and distinguishing between selection and demographic processes (e.g. population structure). Several methods address the latter, either by estimating a null model of population history or by simultaneously inferring environmental and demographic effects. Here we present samβada, an approach designed to study signatures of local adaptation, with special emphasis on high performance computing of large-scale genetic and environmental data sets. samβada identifies candidate loci using genotype-environment associations while also incorporating multivariate analyses to assess the effect of many environmental predictor variables. This enables the inclusion of explanatory variables representing population structure into the models to lower the occurrences of spurious genotype-environment associations. In addition, samβada calculates local indicators of spatial association for candidate loci to provide information on whether similar genotypes tend to cluster in space, which constitutes a useful indication of the possible kinship between individuals. To test the usefulness of this approach, we carried out a simulation study and analysed a data set from Ugandan cattle to detect signatures of local adaptation with samβada, bayenv, lfmm and an F_ST outlier method (FDIST approach in arlequin) and compared their results. samβada - an open source software for Windows, Linux and Mac OS X available at http://lasig.epfl.ch/sambada - outperforms other approaches and better suits whole-genome sequence data processing. © 2016 The Authors. Molecular Ecology Resources Published by John Wiley & Sons Ltd.
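
    The core computation in a genotype-environment association scan can be sketched generically: for each locus, fit a logistic model of genotype against an environmental variable and score it against a constant-only model with a likelihood-ratio test. The code below is a schematic of that idea on synthetic data, not samβada's implementation (which adds multivariate models, population-structure covariates and LISA statistics).

    ```python
    # Schematic genotype-environment association scan (not samβada itself):
    # for each locus, compare a logistic model genotype ~ env against a
    # constant-only model with a likelihood-ratio statistic.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import chi2

    rng = np.random.default_rng(7)
    n_ind, n_loci = 500, 20
    env = rng.normal(size=n_ind)                       # one environmental variable
    geno = rng.integers(0, 2, size=(n_loci, n_ind))    # neutral 0/1 genotypes
    # Plant one env-associated locus as a stand-in "selected" locus.
    geno[0] = (rng.normal(size=n_ind) + 1.5 * env > 0).astype(int)

    def neg_loglik(beta, x, y):
        eta = beta[0] + beta[1] * x
        return np.sum(np.logaddexp(0.0, eta) - y * eta)  # logistic NLL

    for locus in range(n_loci):
        y = geno[locus]
        full = minimize(neg_loglik, x0=[0.0, 0.0], args=(env, y))
        p1 = y.mean()                                  # constant-only model
        null_nll = -np.sum(y * np.log(p1) + (1 - y) * np.log(1 - p1))
        lr = 2.0 * (null_nll - full.fun)               # likelihood-ratio statistic
        p_value = chi2.sf(lr, df=1)
        if p_value < 0.05 / n_loci:                    # Bonferroni threshold
            print(f"locus {locus}: G = {lr:.1f}, p = {p_value:.2e}")
    ```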

  1. The direct exploitation in the mangrove ecosystem in Central Java and the land use in its surrounding; degradation and its restoration effort

    Directory of Open Access Journals (Sweden)

    AHMAD DWI SETYAWAN

    2006-07-01

    The aims of the research were to find out (i) the direct exploitation in the mangrove ecosystem, (ii) the land use in its surroundings, and (iii) the restoration activities in the mangrove ecosystem on the northern and southern coasts of Central Java Province. This was descriptive research, done qualitatively from July to December 2003 at 20 sites of mangrove habitat. The data were collected in field surveys, in-depth interviews with local people and/or local government, and examination of topographic maps of Java (1963-1965) and digital satellite images of Landsat 7 TM (July-September 2001). The results indicated that the direct exploitation of the mangrove ecosystem included fishery, forestry, food stuff, cattle feed, medicinal stuff, industrial material, and also tourism and education. The land use around the mangrove ecosystem included fishery/embankments, agriculture, and areas of development and building. Anthropogenic activities had degraded the mangrove ecosystem, calling for restoration. Mangrove restoration had succeeded in Pasar Banggi, but failed in Cakrayasan and Lukulo.

  2. An Adaptive Middleware for Improved Computational Performance

    DEFF Research Database (Denmark)

    Bonnichsen, Lars Frydendal

    …we are improving computational performance by exploiting modern hardware features, such as dynamic voltage-frequency scaling and transactional memory. Adapting software is an iterative process, requiring that we continually revisit it to meet new requirements or realities; a time-consuming process… The performance improvements in computer systems over the past 60 years have been fueled by an exponential increase in energy efficiency. In recent years, the phenomenon known as the end of Dennard’s scaling has slowed energy efficiency improvements — but improving computer energy efficiency is more important now than ever. Traditionally, most improvements in computer energy efficiency have come from improvements in lithography — the ability to produce smaller transistors — and computer architecture - the ability to apply those transistors efficiently. Since the end of scaling, we have seen…

  3. Computed Flow Through An Artificial Heart And Valve

    Science.gov (United States)

    Rogers, Stuart E.; Kwak, Dochan; Kiris, Cetin; Chang, I-Dee

    1994-01-01

    This NASA technical memorandum discusses computations of the flow of blood through an artificial heart and through a tilting-disk artificial heart valve. It represents further progress in the research described in "Numerical Simulation of Flow Through an Artificial Heart" (ARC-12478). One purpose of the research is to exploit advanced techniques of computational fluid dynamics and the capabilities of supercomputers to gain understanding of the complicated internal flows of viscous, essentially incompressible fluids like blood. Another is to use this understanding to design better artificial hearts and valves.

  4. Local house prices and mental health.

    Science.gov (United States)

    Joshi, Nayan Krishna

    2016-03-01

    This paper examines the impact of local (county-level) house prices on individual self-reported mental health using individual level data from the United States Behavioral Risk Factor Surveillance System between 2005 and 2011. Exploiting a fixed-effects model that relies on within-county variations, relative to the corresponding changes in other counties, I find that while individuals are likely to experience worse self-reported mental health when local house prices decline, this association is most pronounced for individuals who are least likely to be homeowners. This finding is not consistent with a prediction from a pure wealth mechanism but rather with the hypothesis that house prices act as an economic barometer. I also demonstrate that the association between self-reported mental health and local house prices is not driven by unemployment or foreclosure. The primary result - that lower local house prices have an adverse impact on the self-reported mental health of homeowners and renters - is consistent with studies using data from the United Kingdom.
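
    The fixed-effects identification described here boils down to demeaning outcome and regressor within counties before regressing, which removes time-invariant county traits. The sketch below contrasts the within estimator with pooled OLS on synthetic data; all variable names and coefficients are invented for illustration, and a real specification would add year effects and individual controls.

    ```python
    # Sketch of the within (fixed-effects) estimator behind studies like this:
    # demean the outcome and the regressor by county, then regress.
    # Synthetic data only; coefficients are invented for illustration.
    import numpy as np

    rng = np.random.default_rng(11)
    n_counties, n_obs = 50, 5000
    county = rng.integers(n_counties, size=n_obs)
    county_effect = rng.normal(size=n_counties)        # unobserved county traits
    house_price = county_effect[county] + rng.normal(size=n_obs)
    mental_health = 2.0 - 0.5 * house_price + 3.0 * county_effect[county] \
                    + rng.normal(size=n_obs)           # true within effect: -0.5

    def demean_by_group(x, groups, n_groups):
        means = (np.bincount(groups, weights=x, minlength=n_groups)
                 / np.bincount(groups, minlength=n_groups))
        return x - means[groups]

    y = demean_by_group(mental_health, county, n_counties)
    x = demean_by_group(house_price, county, n_counties)
    beta_within = (x @ y) / (x @ x)                    # fixed-effects estimate

    xc = house_price - house_price.mean()
    yc = mental_health - mental_health.mean()
    beta_pooled = (xc @ yc) / (xc @ xc)                # biased by county traits
    print(f"within: {beta_within:+.3f} (true -0.5), pooled: {beta_pooled:+.3f}")
    ```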

  5. D-Wave's Approach to Quantum Computing: 1000-qubits and Counting!

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    In this talk I will describe D-Wave's approach to quantum computing, including the system architecture of our 1000-qubit D-Wave 2X, its programming model, and performance benchmarks. Furthermore, I will describe how the native optimization and sampling capabilities of the quantum processor can be exploited to tackle problems in a variety of fields including medicine, machine learning, physics, and computational finance.
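
    The native optimization problem a quantum annealer minimizes can be written as a QUBO: minimize x^T Q x over binary x. The sketch below solves a small random QUBO instance by classical simulated annealing as a stand-in for the quantum hardware; the instance and cooling schedule are made up.

    ```python
    # Classical stand-in for quantum annealing: simulated annealing on a tiny
    # made-up QUBO, minimizing x^T Q x over binary x (the native problem form).
    import numpy as np

    rng = np.random.default_rng(3)
    n = 12
    Q = rng.normal(size=(n, n))
    Q = (Q + Q.T) / 2.0                                # symmetric QUBO matrix

    def energy(x):
        return x @ Q @ x

    x = rng.integers(0, 2, size=n).astype(float)
    best_x, best_e = x.copy(), energy(x)
    n_steps = 20000
    for step in range(n_steps):
        T = 2.0 * (1.0 - step / n_steps) + 1e-3        # linear cooling schedule
        i = rng.integers(n)
        x_new = x.copy()
        x_new[i] = 1.0 - x_new[i]                      # single bit flip
        dE = energy(x_new) - energy(x)
        if dE <= 0 or rng.random() < np.exp(-dE / T):  # Metropolis acceptance
            x = x_new
            if energy(x) < best_e:
                best_x, best_e = x.copy(), energy(x)
    print("best energy:", round(best_e, 3), "assignment:", best_x.astype(int))
    ```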

  6. Elder Financial Exploitation: Implications for Future Policy and Research in Elder Mistreatment

    Directory of Open Access Journals (Sweden)

    Price, Thomas

    2011-07-01

    Recent advances in the understanding of elder mistreatment have demonstrated that financial exploitation tends to be one of the most common forms of mistreatment affecting older populations. Agencies such as the World Bank and World Health Organization show significant concern regarding financial exploitation and its connection to physical and emotional injury to victims. The World Bank uses the term “financial violence” as a means of generally describing the harm caused to an individual as a result of financial exploitation or abuse. The proportion of financial exploitation in relation to other forms of elder mistreatment is defined in our research. We discuss the potential impact of elder financial exploitation on victims as well as explore the implications for future research and policy development focused on financial aspects of elder mistreatment, and call for further study of the concept of financial exploitation as a violent act. [West J Emerg Med. 2011;12(3):354-356.]

  7. Exploitation of commercial remote sensing images: reality ignored?

    Science.gov (United States)

    Allen, Paul C.

    1999-12-01

    The remote sensing market is on the verge of being awash in commercial high-resolution images. Market estimates are based on the growing numbers of planned commercial remote sensing electro-optical, radar, and hyperspectral satellites and aircraft. EarthWatch, Space Imaging, SPOT, and RDL, among others, are all working toward the launch and service of one- to five-meter panchromatic or radar-imaging satellites. Additionally, new advances in digital air surveillance and reconnaissance systems, both manned and unmanned, are also expected to expand the geospatial customer base. Regardless of platform, image type, or location, each system promises images with some combination of increased resolution, greater spectral coverage, reduced turn-around time (request-to-delivery), and/or reduced image cost. For the most part, however, market estimates for these new sources focus on the raw digital images (from collection to the ground station) while ignoring the requirements for a processing and exploitation infrastructure comprising exploitation tools, exploitation training, library systems, and image management systems. From this it would appear the commercial imaging community has failed to learn the hard lessons of national government experience, choosing instead to ignore reality and replicate the bias of collection over processing and exploitation. While this trend may not impact the small-quantity users that exist today, it will certainly adversely affect the mid- to large-sized users of the future.

  8. Adiabatic quantum computation and quantum annealing theory and practice

    CERN Document Server

    McGeoch, Catherine C

    2014-01-01

    Adiabatic quantum computation (AQC) is an alternative to the better-known gate model of quantum computation. The two models are polynomially equivalent, but otherwise quite dissimilar: one property that distinguishes AQC from the gate model is its analog nature. Quantum annealing (QA) describes a type of heuristic search algorithm that can be implemented to run in the "native instruction set" of an AQC platform. D-Wave Systems Inc. manufactures quantum annealing processor chips that exploit quantum properties to realize QA computations in hardware. The chips form the centerpiece of a nov…

  9. Global/local methods for probabilistic structural analysis

    Science.gov (United States)

    Millwater, H. R.; Wu, Y.-T.

    1993-04-01

    A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations with a local more refined model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of finer mesh, smaller time step, tighter tolerances, etc. and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate a significant computer savings with minimal loss in accuracy.

  10. ATLAS@Home: Harnessing Volunteer Computing for HEP

    International Nuclear Information System (INIS)

    Adam-Bourdarios, C; Cameron, D; Filipčič, A; Lancon, E; Wu, W

    2015-01-01

    A recent common theme among HEP computing is exploitation of opportunistic resources in order to provide the maximum statistics possible for Monte Carlo simulation. Volunteer computing has been used over the last few years in many other scientific fields and by CERN itself to run simulations of the LHC beams. The ATLAS@Home project was started to allow volunteers to run simulations of collisions in the ATLAS detector. So far many thousands of members of the public have signed up to contribute their spare CPU cycles for ATLAS, and there is potential for volunteer computing to provide a significant fraction of ATLAS computing resources. Here we describe the design of the project, the lessons learned so far and the future plans. (paper)

  11. ATLAS@Home: Harnessing Volunteer Computing for HEP

    CERN Document Server

    Bourdarios, Claire; Filipcic, Andrej; Lancon, Eric; Wu, Wenjing

    2015-01-01

    A recent common theme among HEP computing is exploitation of opportunistic resources in order to provide the maximum statistics possible for Monte-Carlo simulation. Volunteer computing has been used over the last few years in many other scientific fields and by CERN itself to run simulations of the LHC beams. The ATLAS@Home project was started to allow volunteers to run simulations of collisions in the ATLAS detector. So far many thousands of members of the public have signed up to contribute their spare CPU cycles for ATLAS, and there is potential for volunteer computing to provide a significant fraction of ATLAS computing resources. Here we describe the design of the project, the lessons learned so far and the future plans.

  12. "Transit data"-based MST computation

    Directory of Open Access Journals (Sweden)

    Thodoris Karatasos

    2017-10-01

    In this work, we present an innovative image recognition technique which is based on the exploitation of transit data in images or simple photographs of sites of interest. Our objective is to automatically transform real-world images to graphs and then compute Minimum Spanning Trees (MST) in them. We apply this framework and present an application which automatically computes efficient construction plans (for escalators or low-emission hot spots) for connecting all points of interest in cultural sites, i.e., archaeological sites, museums, galleries, etc., aiming to facilitate global physical access to cultural heritage and artistic work and make it accessible to all groups of population.
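
    The MST step described above is standard once the points of interest have been extracted; a minimal sketch (with random 2-D coordinates standing in for locations recovered from an image) is:

        import numpy as np
        from scipy.spatial.distance import cdist
        from scipy.sparse.csgraph import minimum_spanning_tree

        rng = np.random.default_rng(0)
        points = rng.uniform(0, 100, size=(8, 2))   # hypothetical points of interest
        dist = cdist(points, points)                # dense weighted adjacency matrix
        mst = minimum_spanning_tree(dist)           # sparse matrix of MST edges

        for i, j in zip(*mst.nonzero()):
            print(f"connect {i} -> {j}  (length {mst[i, j]:.1f} m)")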

  13. Accelerating Large Data Analysis By Exploiting Regularities

    Science.gov (United States)

    Moran, Patrick J.; Ellsworth, David

    2003-01-01

    We present techniques for discovering and exploiting regularity in large curvilinear data sets. The data can be based on a single mesh or a mesh composed of multiple submeshes (also known as zones). Multi-zone data are typical to Computational Fluid Dynamics (CFD) simulations. Regularities include axis-aligned rectilinear and cylindrical meshes as well as cases where one zone is equivalent to a rigid-body transformation of another. Our algorithms can also discover rigid-body motion of meshes in time-series data. Next, we describe a data model where we can utilize the results from the discovery process in order to accelerate large data visualizations. Where possible, we replace general curvilinear zones with rectilinear or cylindrical zones. In rigid-body motion cases we replace a time-series of meshes with a transformed mesh object where a reference mesh is dynamically transformed based on a given time value in order to satisfy geometry requests, on demand. The data model enables us to make these substitutions and dynamic transformations transparently with respect to the visualization algorithms. We present results with large data sets where we combine our mesh replacement and transformation techniques with out-of-core paging in order to achieve significant speed-ups in analysis.
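
    One of the regularity tests mentioned above, deciding whether one zone is a rigid-body transformation of another, can be sketched with the Kabsch algorithm; a small fitting residual means the second zone can be stored as "reference mesh plus a transform". The test data below are synthetic:

        import numpy as np

        def rigid_fit(P, Q):
            """Best-fit rotation R and translation t with Q ~ P @ R.T + t."""
            Pc, Qc = P.mean(axis=0), Q.mean(axis=0)
            H = (P - Pc).T @ (Q - Qc)               # cross-covariance
            U, _, Vt = np.linalg.svd(H)
            d = np.sign(np.linalg.det(Vt.T @ U.T))  # avoid reflections
            R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
            t = Qc - Pc @ R.T
            residual = np.linalg.norm(P @ R.T + t - Q)
            return R, t, residual

        rng = np.random.default_rng(1)
        P = rng.normal(size=(100, 3))               # reference mesh nodes
        theta = 0.3                                 # a known test rotation
        R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                           [np.sin(theta),  np.cos(theta), 0],
                           [0, 0, 1]])
        Q = P @ R_true.T + np.array([1.0, 2.0, 3.0])  # rigidly moved copy
        R, t, res = rigid_fit(P, Q)
        print("rigid-body match" if res < 1e-8 else "not rigid", res)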

  14. Sexual exploitation and labor during adolescence: A case study

    Directory of Open Access Journals (Sweden)

    Luciana Dutra-Thomé

    2011-09-01

    The present article focuses on the perception of sexual exploitation as a job, using a single case study design. The aim of the study was to investigate the case of a 14-year-old girl, involved in commercial sexual exploitation, who considered this situation to be her labor activity. A content analysis showed protective and risk factors as categories, especially related to her labor activities. The girl perceived the sexual exploitation activity as a job that provided autonomy, subsistence, and survival. The study revealed that the negative effects of working during adolescence may have consequences for health and development. Youth work may be defined as a risk factor, especially when the labor conditions are inadequate and unprotected.

  15. The LOCAL attack: Cryptanalysis of the authenticated encryption scheme ALE

    DEFF Research Database (Denmark)

    Khovratovich, Dmitry; Rechberger, Christian

    2014-01-01

    We show how to produce a forged (ciphertext, tag) pair for the scheme ALE with data and time complexity of 2^102 ALE encryptions of short messages and the same number of authentication attempts. We use a differential attack based on a local collision, which exploits the availability of extracted...

  16. Rare earth elements exploitation, geopolitical implications and raw materials trading

    Science.gov (United States)

    Chemin, Marie-Charlotte

    2015-04-01

    Rare earth elements (REE) correspond to seventeen elements of the periodic table. They are used in high technology, catalytic cracking, electric car magnets, metal alloys for batteries, and also in phone construction or ceramics for electronic cards. REEs are an important resource for high technology. This project targets 16-year-old students in the subject "personalized aid" and will last six weeks. The purpose of this project is to develop autonomy and research in groups for a transdisciplinary work. This project gathers knowledge in geology, geography and economics. During the first session, students analyze the geological setting of the REE. They begin the analysis by learning the composition of different rocks such as basalt and diorite, to make the link with crystallization. Then they compare it with adakite to understand the formation of these rocks. In the second session, they study REE exploitation. We can find them as oxides in many deposits. The principal concentrations of rare earth elements are associated with uncommon varieties of igneous rocks, such as carbonatites. The students can use QGIS to localize these high concentrations. In the third session, they study the environmental costs of REE exploitation. Indeed, the exploitation produces thorium and carcinogenic toxins: sulphates, ammonia and hydrochloric acid. Processing one ton of rare earths produces 2,000 tons of toxic waste. This session focuses first on Baotou's region, and then on an example they are free to choose. In the fourth session, they study the geopolitical issues of REE, with a focus on China. In fact, this country is the largest producer of REE, providing 95% of the overall production. REE in China are at the center of a geopolitical strategy. In fact, China implements a sort of protectionism: the export tax on REE is very high, so, as a foreign company, it is financially attractive to establish a manufacturing subsidiary in China in order to use REE. As a matter of fact

  17. Prospects of geothermal resource exploitation

    International Nuclear Information System (INIS)

    Bourrelier, P.H.; Cornet, F.; Fouillac, C.

    1994-01-01

    The use of geothermal energy to generate electricity has only developed during the past 50 years, by drilling wells into aquifers close to magmas and producing either dry steam or hot water. The world's production of electricity from geothermal energy is over 6000 MWe and is still growing. The direct use of geothermal energy for major urban communities has been developed recently by exploiting aquifers in sedimentary basins under large towns. Scaling up the extraction of heat implies the exploitation of larger and better located fields, requiring an appropriate method of extraction; the objective of present attempts in the USA, Japan and Europe is to create heat exchangers by circulating water between several deep wells. Two field categories are considered: the extension of classical geothermal fields beyond the aquifer areas, and areas favoured by a high geothermal gradient, fractures inducing natural permeability at large scale, and good commercial prospects (such as the Rhine Graben). The hot dry rock concept has gained considerable interest. 1 fig., 5 tabs., 11 refs

  18. Non-local ground-state functional for quantum spin chains with translational broken symmetry

    Energy Technology Data Exchange (ETDEWEB)

    Libero, Valter L.; Penteado, Poliana H.; Veiga, Rodrigo S. [Universidade de Sao Paulo (IFSC/USP), Sao Carlos, SP (Brazil). Inst. de Fisica

    2011-07-01

    Thanks to the development and use of new materials with special doping, the study of Heisenberg spin chains with broken translational symmetry, induced for instance by finite-size effects, bond defects or by impurity spins in the chain, has become relevant. Exact numerical results demand huge computational effort, due to the size of the Hilbert space involved and the lack of symmetry to exploit. Density Functional Theory (DFT) has been considered a simple alternative for obtaining ground-state properties of such systems. Usually, DFT starts with a uniform system to build the correlation energy and then implements a local approximation to construct local functionals. Based on our proof of the Hohenberg-Kohn theorem for Heisenberg models, and in order to describe more realistic models, we have recently developed a non-local exchange functional for the ground-state energy of quantum spin chains. An alternating-bond chain is used to obtain the correlation energy, and a local unit-cell approximation (LUCA) is defined in the context of DFT. The alternating chain is a good starting point for constructing functionals since it is intrinsically non-homogeneous; therefore, instead of the usual local approximation (like LDA for electronic systems), we need to introduce an approximation based upon a unit-cell concept, which renders the functional non-local in the bond exchange interaction. The agreement with exact numerical data (obtained only for small chains, although the functional can be applied to chains of arbitrary size) is significantly better than in our previous local formulation, even for chains with several ferromagnetic or antiferromagnetic bond defects. These results encourage us to extend the concept of LUCA to chains with alternating spin magnitudes. We have also constructed a non-local functional based on an alternating-spin chain, instead of a local alternating-bond one, using spin-wave theory. Because of its non-local nature, this functional is expected to

  19. Non-local ground-state functional for quantum spin chains with translational broken symmetry

    International Nuclear Information System (INIS)

    Libero, Valter L.; Penteado, Poliana H.; Veiga, Rodrigo S.

    2011-01-01

    Thanks to the development and use of new materials with special doping, the study of Heisenberg spin chains with broken translational symmetry, induced for instance by finite-size effects, bond defects or by impurity spins in the chain, has become relevant. Exact numerical results demand huge computational effort, due to the size of the Hilbert space involved and the lack of symmetry to exploit. Density Functional Theory (DFT) has been considered a simple alternative for obtaining ground-state properties of such systems. Usually, DFT starts with a uniform system to build the correlation energy and then implements a local approximation to construct local functionals. Based on our proof of the Hohenberg-Kohn theorem for Heisenberg models, and in order to describe more realistic models, we have recently developed a non-local exchange functional for the ground-state energy of quantum spin chains. An alternating-bond chain is used to obtain the correlation energy, and a local unit-cell approximation (LUCA) is defined in the context of DFT. The alternating chain is a good starting point for constructing functionals since it is intrinsically non-homogeneous; therefore, instead of the usual local approximation (like LDA for electronic systems), we need to introduce an approximation based upon a unit-cell concept, which renders the functional non-local in the bond exchange interaction. The agreement with exact numerical data (obtained only for small chains, although the functional can be applied to chains of arbitrary size) is significantly better than in our previous local formulation, even for chains with several ferromagnetic or antiferromagnetic bond defects. These results encourage us to extend the concept of LUCA to chains with alternating spin magnitudes. We have also constructed a non-local functional based on an alternating-spin chain, instead of a local alternating-bond one, using spin-wave theory. Because of its non-local nature, this functional is expected to

  20. HOME COMPUTER USE AND THE DEVELOPMENT OF HUMAN CAPITAL*

    Science.gov (United States)

    Malamud, Ofer; Pop-Eleches, Cristian

    2012-01-01

    This paper uses a regression discontinuity design to estimate the effect of home computers on child and adolescent outcomes by exploiting a voucher program in Romania. Our main results indicate that home computers have both positive and negative effects on the development of human capital. Children who won a voucher to purchase a computer had significantly lower school grades but show improved computer skills. There is also some evidence that winning a voucher increased cognitive skills, as measured by Raven’s Progressive Matrices. We do not find much evidence for an effect on non-cognitive outcomes. Parental rules regarding homework and computer use attenuate the effects of computer ownership, suggesting that parental monitoring and supervision may be important mediating factors. PMID:22719135
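
    The regression discontinuity logic can be sketched on synthetic data (not the Romanian voucher data): eligibility jumps at an income cutoff, and the effect is the gap between local linear fits on the two sides of the cutoff. The bandwidth and data-generating process below are illustrative assumptions:

        import numpy as np

        rng = np.random.default_rng(2)
        income = rng.uniform(-1, 1, 2000)          # running variable, centered at the cutoff 0
        treated = (income < 0).astype(float)       # vouchers go to low-income households
        outcome = 0.5 * income - 0.3 * treated + rng.normal(0, 0.2, 2000)

        h = 0.25                                   # bandwidth around the cutoff (assumed)
        def local_fit(side):
            m = side & (np.abs(income) < h)
            b = np.polyfit(income[m], outcome[m], 1)   # local linear fit
            return np.polyval(b, 0.0)                  # predicted outcome at the cutoff

        effect = local_fit(income < 0) - local_fit(income >= 0)
        print(f"estimated effect at the cutoff: {effect:.3f}")   # ~ -0.3 here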

  1. All-optical reservoir computer based on saturation of absorption.

    Science.gov (United States)

    Dejonckheere, Antoine; Duport, François; Smerieri, Anteo; Fang, Li; Oudar, Jean-Louis; Haelterman, Marc; Massar, Serge

    2014-05-05

    Reservoir computing is a new bio-inspired computation paradigm. It exploits a dynamical system driven by a time-dependent input to carry out computation. For efficient information processing, only a few parameters of the reservoir need to be tuned, which makes it a promising framework for hardware implementation. Recently, electronic, opto-electronic and all-optical experimental reservoir computers were reported. In those implementations, the nonlinear response of the reservoir is provided by active devices such as optoelectronic modulators or optical amplifiers. By contrast, we propose here the first reservoir computer based on a fully passive nonlinearity, namely the saturable absorption of a semiconductor mirror. Our experimental setup constitutes an important step towards the development of ultrafast low-consumption analog computers.
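
    The optical hardware cannot be reproduced in code, but the computation paradigm can: a minimal echo state network sketch in which only the linear readout is trained, here on one-step-ahead prediction of a sine wave. Reservoir size, scalings and the ridge parameter are illustrative assumptions:

        import numpy as np

        rng = np.random.default_rng(3)
        N, T = 200, 1000                                # reservoir size, time steps
        u = np.sin(0.2 * np.arange(T + 1))              # input signal
        W_in = rng.uniform(-0.5, 0.5, N)                # fixed input weights
        W = rng.normal(0, 1, (N, N))
        W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W))) # spectral radius < 1

        x = np.zeros(N); states = np.empty((T, N))
        for t in range(T):                              # drive the reservoir
            x = np.tanh(W @ x + W_in * u[t])
            states[t] = x

        # Train only the readout (ridge regression), discarding a warm-up period.
        warm, lam = 100, 1e-6
        X, y = states[warm:], u[warm + 1:T + 1]
        w_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
        print("train MSE:", np.mean((X @ w_out - y) ** 2))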

  2. Design principles of a conditional futile cycle exploited for regulation.

    Science.gov (United States)

    Tolla, Dean A; Kiley, Patricia J; Lomnitz, Jason G; Savageau, Michael A

    2015-07-01

    In this report, we characterize the design principles of futile cycling in providing rapid adaptation by regulatory proteins that act as environmental sensors. In contrast to the energetically wasteful futile cycles that are avoided in metabolic pathways, here we describe a conditional futile cycle exploited for a regulatory benefit. The FNR (fumarate and nitrate reduction) cycle in Escherichia coli operates under two regimes: as a strictly futile cycle in the presence of O2 and as a pathway under anoxic conditions. The computational results presented here use FNR as a model system and provide evidence that cycling of this transcription factor and its labile sensory cofactor between active and inactive states affords rapid signaling and adaptation. We modify a previously developed mechanistic model to examine a family of FNR models, each with a different cycling speed but mathematically constrained to be otherwise equivalent, and we identify a trade-off between energy expenditure and response time that can be tuned by evolution to optimize the cycling rate of the FNR system for a particular ecological context. Simulations mimicking experiments with proposed double mutant strains offer suggestions for experimentally testing our predictions and identifying potential fitness effects. Our approach provides a computational framework for analyzing other conditional futile cycles, which when placed in their larger biological context may be found to confer advantages to the organism.
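
    The speed/energy trade-off can be sketched with a generic two-state cycle (assumed rates, not the paper's fitted FNR model): scaling activation and deactivation rates by the same factor k leaves the steady state unchanged but speeds adaptation, at the cost of proportionally more flux through the cycle:

        import numpy as np
        from scipy.integrate import solve_ivp

        def cycle(t, y, k, s):
            I, A = y                                # inactive / active protein
            act, deact = k * s * I, k * A           # rates scale together with k
            return [deact - act, act - deact]

        s = 1.0                                     # environmental signal; steady A = s/(1+s)
        ts = np.linspace(0, 20, 201)
        for k in (0.1, 1.0, 10.0):                  # slow -> fast cycling
            sol = solve_ivp(cycle, (0, 20), [1.0, 0.0], args=(k, s), dense_output=True)
            A = sol.sol(ts)[1]
            t90 = ts[np.argmax(A >= 0.9 * 0.5)]     # time to 90% of steady state
            print(f"k={k:5.1f}  final A={A[-1]:.2f}  time to 90% of steady: {t90:.2f}")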

  3. Malware Sandbox Analysis for Secure Observation of Vulnerability Exploitation

    Science.gov (United States)

    Yoshioka, Katsunari; Inoue, Daisuke; Eto, Masashi; Hoshizawa, Yuji; Nogawa, Hiroki; Nakao, Koji

    Exploiting vulnerabilities of remote systems is one of the fundamental behaviors of malware that determines their potential hazards. Understanding what kind of propagation tactics each malware uses is essential in incident response, because such information directly links with countermeasures such as writing a signature for IDS. Although malware sandbox analysis has recently been studied intensively, little work has been done on securely observing vulnerability exploitation by malware. In this paper, we propose a novel sandbox analysis method for securely observing malware's vulnerability exploitation in a totally isolated environment. In our sandbox, we prepare two victim hosts. We first execute the sample malware on one of these hosts and then let it attack the other host, which is running multiple vulnerable services. As a simple realization of the proposed method, we have implemented a sandbox using Nepenthes, a low-interaction honeypot, as the second victim. Because Nepenthes can emulate a variety of vulnerable services, we can efficiently observe the propagation of sample malware. In the experiments, among 382 samples whose scan capabilities were confirmed, 381 samples successfully started exploiting vulnerabilities of the second victim. This indicates a certain level of feasibility of the proposed method.

  4. Adapting computational text analysis to social science (and vice versa)

    Directory of Open Access Journals (Sweden)

    Paul DiMaggio

    2015-11-01

    Social scientists and computer scientists are divided by small differences in perspective and not by any significant disciplinary divide. In the field of text analysis, several such differences are noted: social scientists often use unsupervised models to explore corpora, whereas many computer scientists employ supervised models to train on data; social scientists hold to more conventional causal notions than do most computer scientists, and often favor intense exploitation of existing algorithms, whereas computer scientists focus more on developing new models; and computer scientists tend to trust human judgment more than social scientists do. These differences have implications that potentially can improve the practice of social science.

  5. Final Report, “Exploiting Global View for Resilience”

    Energy Technology Data Exchange (ETDEWEB)

    Chien, Andrew [Univ. of Chicago, IL (United States)

    2017-03-29

    Final technical report for the "Exploiting Global View for Resilience" project. The GVR project aims to create a new approach to portable, resilient applications. The GVR approach builds on a global view data model, adding versioning (multi-version), user control of timing and rate (multi-stream), and flexible cross-layer error signalling and recovery. With a versioned array as a portable abstraction, GVR enables application programmers to exploit deep scientific and application code insights to manage resilience (and its overhead) in a flexible, portable fashion.

  6. Cloudonomics the business value of cloud computing

    CERN Document Server

    Weinman, Joe

    2012-01-01

    The ultimate guide to assessing and exploiting the customer value and revenue potential of the Cloud A new business model is sweeping the world—the Cloud. And, as with any new technology, there is a great deal of fear, uncertainty, and doubt surrounding cloud computing. Cloudonomics radically upends the conventional wisdom, clearly explains the underlying principles and illustrates through understandable examples how Cloud computing can create compelling value—whether you are a customer, a provider, a strategist, or an investor. Cloudonomics covers everything you need to consider f

  7. Quantum computation and Shor's factoring algorithm

    International Nuclear Information System (INIS)

    Ekert, A.; Jozsa, R.

    1996-01-01

    Current technology is beginning to allow us to manipulate rather than just observe individual quantum phenomena. This opens up the possibility of exploiting quantum effects to perform computations beyond the scope of any classical computer. Recently Peter Shor discovered an efficient algorithm for factoring whole numbers, which uses characteristically quantum effects. The algorithm illustrates the potential power of quantum computation, as there is no known efficient classical method for solving this problem. The authors give an exposition of Shor's algorithm together with an introduction to quantum computation and complexity theory. They discuss experiments that may contribute to its practical implementation. copyright 1996 The American Physical Society
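
    The quantum speed-up itself cannot be simulated efficiently, but the classical scaffolding of Shor's algorithm can be shown directly: factoring N via the order r of a random base a, with the order found here by brute force where a quantum computer would use phase estimation:

        from math import gcd
        from random import randrange

        def find_order(a, N):                  # the step Shor delegates to the quantum computer
            r, x = 1, a % N
            while x != 1:
                x, r = (x * a) % N, r + 1
            return r

        def shor_classical(N):
            while True:
                a = randrange(2, N)
                if gcd(a, N) > 1:              # lucky guess: a already shares a factor
                    return gcd(a, N)
                r = find_order(a, N)
                # even order with a^(r/2) != -1 mod N yields a factor
                if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
                    f = gcd(pow(a, r // 2, N) - 1, N)
                    if 1 < f < N:
                        return f

        print(shor_classical(15))              # prints 3 or 5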

  8. Exploiting the ALICE HLT for PROOF by scheduling of Virtual Machines

    International Nuclear Information System (INIS)

    Meoni, Marco; Boettger, Stefan; Zelnicek, Pierre; Kebschull, Udo; Lindenstruth, Volker

    2011-01-01

    The HLT (High-Level Trigger) group of the ALICE experiment at the LHC has prepared a virtual Parallel ROOT Facility (PROOF) enabled cluster (HAF - HLT Analysis Facility) for fast physics analysis, detector calibration and reconstruction of data samples. The HLT-Cluster currently consists of 2860 CPU cores and 175TB of storage. Its purpose is the online filtering of the relevant part of data produced by the particle detector. However, data taking is not running continuously and exploiting unused cluster resources for other applications is highly desirable and improves the usage-cost ratio of the HLT cluster. As such, unused computing resources are dedicated to a PROOF-enabled virtual cluster available to the entire collaboration. This setup is especially aimed at the prototyping phase of analyses that need a high number of development iterations and a short response time, e.g. tuning of analysis cuts, calibration and alignment. HAF machines are enabled and disabled upon user request to start or complete analysis tasks. This is achieved by a virtual machine scheduling framework which dynamically assigns and migrates virtual machines running PROOF workers to unused physical resources. Using this approach we extend the HLT usage scheme to running both online and offline computing, thereby optimizing the resource usage.

  9. Computational methods for planning and evaluating geothermal energy projects

    International Nuclear Information System (INIS)

    Goumas, M.G.; Lygerou, V.A.; Papayannakis, L.E.

    1999-01-01

    In planning, designing and evaluating a geothermal energy project, a number of technical, economic, social and environmental parameters should be considered. The use of computational methods provides a rigorous analysis improving the decision-making process. This article demonstrates the application of decision-making methods developed in operational research for the optimum exploitation of geothermal resources. Two characteristic problems are considered: (1) the economic evaluation of a geothermal energy project under uncertain conditions using a stochastic analysis approach and (2) the evaluation of alternative exploitation schemes for optimum development of a low enthalpy geothermal field using a multicriteria decision-making procedure. (Author)
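
    The stochastic-evaluation idea can be sketched with a small Monte Carlo net-present-value model; all figures below are illustrative assumptions, not taken from the paper:

        import numpy as np

        rng = np.random.default_rng(4)
        n, years, rate = 10000, 20, 0.08
        capex = 30e6                                        # up-front investment ($, assumed)
        price = rng.normal(50, 8, n)                        # $/MWh, uncertain
        output = rng.normal(60000, 10000, n).clip(min=0)    # MWh/yr, uncertain

        # present value of a constant annual revenue stream
        discount = sum(1 / (1 + rate) ** t for t in range(1, years + 1))
        npv = price * output * discount - capex
        print(f"mean NPV: {npv.mean() / 1e6:.1f} M$, P(NPV < 0) = {(npv < 0).mean():.2f}")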

  10. Computing the Local Field Potential (LFP) from Integrate-and-Fire Network Models

    Science.gov (United States)

    Cuntz, Hermann; Lansner, Anders; Panzeri, Stefano; Einevoll, Gaute T.

    2015-01-01

    Leaky integrate-and-fire (LIF) network models are commonly used to study how the spiking dynamics of neural networks changes with stimuli, tasks or dynamic network states. However, neurophysiological studies in vivo often rather measure the mass activity of neuronal microcircuits with the local field potential (LFP). Given that LFPs are generated by spatially separated currents across the neuronal membrane, they cannot be computed directly from quantities defined in models of point-like LIF neurons. Here, we explore the best approximation for predicting the LFP based on standard output from point-neuron LIF networks. To search for this best “LFP proxy”, we compared LFP predictions from candidate proxies based on LIF network output (e.g., firing rates, membrane potentials, synaptic currents) with “ground-truth” LFP obtained when the LIF network synaptic input currents were injected into an analogous three-dimensional (3D) network model of multi-compartmental neurons with realistic morphology, spatial distributions of somata and synapses. We found that a specific fixed linear combination of the LIF synaptic currents provided an accurate LFP proxy, accounting for most of the variance of the LFP time course observed in the 3D network for all recording locations. This proxy performed well over a broad set of conditions, including substantial variations of the neuronal morphologies. Our results provide a simple formula for estimating the time course of the LFP from LIF network simulations in cases where a single pyramidal population dominates the LFP generation, and thereby facilitate quantitative comparison between computational models and experimental LFP recordings in vivo. PMID:26657024
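
    A heavily hedged sketch of the proxy idea: the LFP estimate is a fixed, delayed linear combination of the population-summed synaptic currents from the LIF simulation. The weight and delay below are placeholders, not the fitted values from the paper, and the currents are synthetic stand-ins:

        import numpy as np

        dt = 0.1                                       # ms per sample
        t = np.arange(0, 1000, dt)
        rng = np.random.default_rng(5)
        # stand-ins for population-summed excitatory / inhibitory currents:
        i_ampa = np.abs(rng.normal(1.0, 0.3, t.size))
        i_gaba = np.abs(rng.normal(1.5, 0.4, t.size))

        def lfp_proxy(i_ampa, i_gaba, alpha=1.65, delay_ms=6.0):
            """Weighted sum of |AMPA| (delayed) and |GABA| currents; alpha and
            delay_ms are assumed placeholders. Wrap-around from np.roll is
            ignored for this illustration."""
            shift = int(delay_ms / dt)
            return np.roll(i_ampa, shift) + alpha * i_gaba

        lfp = lfp_proxy(i_ampa, i_gaba)
        print(lfp[:5])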

  11. Computing the Local Field Potential (LFP) from Integrate-and-Fire Network Models.

    Directory of Open Access Journals (Sweden)

    Alberto Mazzoni

    2015-12-01

    Leaky integrate-and-fire (LIF) network models are commonly used to study how the spiking dynamics of neural networks changes with stimuli, tasks or dynamic network states. However, neurophysiological studies in vivo often rather measure the mass activity of neuronal microcircuits with the local field potential (LFP). Given that LFPs are generated by spatially separated currents across the neuronal membrane, they cannot be computed directly from quantities defined in models of point-like LIF neurons. Here, we explore the best approximation for predicting the LFP based on standard output from point-neuron LIF networks. To search for this best "LFP proxy", we compared LFP predictions from candidate proxies based on LIF network output (e.g., firing rates, membrane potentials, synaptic currents) with "ground-truth" LFP obtained when the LIF network synaptic input currents were injected into an analogous three-dimensional (3D) network model of multi-compartmental neurons with realistic morphology, spatial distributions of somata and synapses. We found that a specific fixed linear combination of the LIF synaptic currents provided an accurate LFP proxy, accounting for most of the variance of the LFP time course observed in the 3D network for all recording locations. This proxy performed well over a broad set of conditions, including substantial variations of the neuronal morphologies. Our results provide a simple formula for estimating the time course of the LFP from LIF network simulations in cases where a single pyramidal population dominates the LFP generation, and thereby facilitate quantitative comparison between computational models and experimental LFP recordings in vivo.

  12. How Less Alienation Creates More Exploitation? Audience Labour on Social Network Sites.

    Directory of Open Access Journals (Sweden)

    Eran Fisher

    2012-05-01

    The notion of audience labour has been an important contribution to Marxist political economy of the media. It revised the traditional political economy analysis, which focused on media ownership, by suggesting that media was also a site of production, constituting particular relations of production. Such analysis highlighted the active role of audience in the creation of media value as both commodities and workers, thus pointing to audience exploitation. Recently, in light of paradigmatic transformations in the media environment – particularly the emergence of Web 2.0 and social network sites – there has been a renewed interest in such analysis, and a reexamination of audience exploitation. Focusing on Facebook as a case-study, this article examines audience labour on social network sites along two Marxist themes – exploitation and alienation. It argues for a historical shift in the link between exploitation and alienation of audience labour, concurrent with the shift from mass media to social media. In the mass media, the capacity for exploitation of audience labour was quite limited while the alienation that such work created was high. In contrast, social media allows for the expansion and intensification of exploitation. Simultaneously, audience labour on social media – because it involves communication and sociability – also ameliorates alienation by allowing self-expression, authenticity, and relations with others. Moreover, the article argues that the political economy of social network sites is founded on a dialectical link between exploitation and alienation: in order to be de-alienated, Facebook users must communicate and socialize, thus exacerbating their exploitation. And vice-versa, in order for Facebook to exploit the work of its users, it must contribute to their de-alienation.

  13. Exploiting Stabilizers and Parallelism in State Space Generation with the Symmetry Method

    DEFF Research Database (Denmark)

    Lorentsen, Louise; Kristensen, Lars Michael

    2001-01-01

    The symmetry method is a main reduction paradigm for alleviating the state explosion problem. For large symmetry groups, deciding whether two states are symmetric becomes time expensive due to the apparent high time complexity of the orbit problem. The contribution of this paper is to alleviate the negative impact of the orbit problem by the specification of canonical representatives for equivalence classes of states in Coloured Petri Nets, and by giving algorithms exploiting stabilizers and parallelism for computing the condensed state space.
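
    The canonical-representative idea can be sketched with generic permutation symmetries (not the Coloured Petri Net machinery of the paper): two states are symmetric exactly when they map to the same canonical form, so the condensed state space stores only canonical forms:

        from itertools import permutations

        # hypothetical symmetry group: all permutations of 3 identical processes
        group = list(permutations(range(3)))

        def canonical(state, group):
            """Lexicographically smallest image of `state` under the group."""
            return min(tuple(state[i] for i in perm) for perm in group)

        s1 = ("idle", "busy", "idle")
        s2 = ("busy", "idle", "idle")
        print(canonical(s1, group) == canonical(s2, group))   # True: symmetric states

        # condensed state space: a set of canonical forms only
        seen = {canonical(s, group) for s in [s1, s2, ("busy", "busy", "idle")]}
        print(len(seen))                                      # 2 equivalence classes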

  14. 6th International Workshop Soft Computing Applications

    CERN Document Server

    Jain, Lakhmi; Kovačević, Branko

    2016-01-01

    These volumes constitute the Proceedings of the 6th International Workshop on Soft Computing Applications, or SOFA 2014, held on 24-26 July 2014 in Timisoara, Romania. This edition was organized by the University of Belgrade, Serbia in conjunction with the Romanian Society of Control Engineering and Technical Informatics (SRAIT) - Arad Section, the General Association of Engineers in Romania - Arad Section, the Institute of Computer Science, Iasi Branch of the Romanian Academy, and the IEEE Romanian Section. The Soft Computing concept was introduced by Lotfi Zadeh in 1991 and serves to highlight the emergence of computing methodologies in which the accent is on exploiting the tolerance for imprecision and uncertainty to achieve tractability, robustness and low solution cost. Soft computing facilitates the use of fuzzy logic, neurocomputing, evolutionary computing and probabilistic computing in combination, leading to the concept of hybrid intelligent systems. The combination of ...

  15. Accelerating artificial intelligence with reconfigurable computing

    Science.gov (United States)

    Cieszewski, Radoslaw

    Reconfigurable computing is emerging as an important area of research in computer architectures and software systems. Many algorithms can be greatly accelerated by placing the computationally intense portions of an algorithm into reconfigurable hardware. Reconfigurable computing combines many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible and can be changed over the lifetime of the system. Similar to an ASIC, reconfigurable systems provide a method to map circuits into hardware. Reconfigurable systems therefore have the potential to achieve far greater performance than software, as a result of bypassing the fetch-decode-execute operations of traditional processors and possibly exploiting a greater level of parallelism. Artificial intelligence is one such field, with many different algorithms that can be accelerated. This paper presents example hardware implementations of Artificial Neural Networks, Genetic Algorithms and Expert Systems.

  16. An Efficient Local Correlation Matrix Decomposition Approach for the Localization Implementation of Ensemble-Based Assimilation Methods

    Science.gov (United States)

    Zhang, Hongqin; Tian, Xiangjun

    2018-04-01

    Ensemble-based data assimilation methods often use the so-called localization scheme to improve the representation of the ensemble background error covariance (Be). Extensive research has been undertaken to reduce the computational cost of these methods by using the localized ensemble samples to localize Be by means of a direct decomposition of the local correlation matrix C. However, the computational costs of the direct decomposition of the local correlation matrix C are still extremely high due to its high dimension. In this paper, we propose an efficient local correlation matrix decomposition approach based on the concept of alternating directions. This approach is intended to avoid direct decomposition of the correlation matrix. Instead, we first decompose the correlation matrix into 1-D correlation matrices in the three coordinate directions, then construct their empirical orthogonal function decomposition at low resolution. This procedure is followed by a 1-D spline interpolation process to transform the above decompositions to the high-resolution grid. Finally, an efficient correlation matrix decomposition is achieved by computing the Kronecker product of the one-dimensional decompositions, which closely approximates the full matrix. We conducted a series of comparison experiments to illustrate the validity and accuracy of the proposed local correlation matrix decomposition approach. The effectiveness of the proposed correlation matrix decomposition approach and its efficient localization implementation of the nonlinear least-squares four-dimensional variational assimilation are further demonstrated by several groups of numerical experiments based on the Advanced Research Weather Research and Forecasting model.
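
    The separable-correlation idea can be sketched directly: if the 3-D correlation is modeled as a Kronecker product of 1-D correlations in each coordinate direction, the full matrix never needs to be formed or decomposed; only the small 1-D factors are. Gaussian 1-D correlations are an illustrative assumption:

        import numpy as np

        def corr_1d(n, L):
            """1-D Gaussian correlation matrix on a grid of n points."""
            x = np.arange(n)[:, None]
            return np.exp(-((x - x.T) ** 2) / (2 * L ** 2))

        nx, ny, nz = 40, 30, 20
        Cx, Cy, Cz = corr_1d(nx, 5), corr_1d(ny, 4), corr_1d(nz, 3)

        # decompose the three small factors instead of one huge matrix
        eigs = [np.linalg.eigh(C)[0] for C in (Cx, Cy, Cz)]
        n = nx * ny * nz
        print(f"direct: {n}x{n} matrix; separable: factors of size {nx}, {ny}, {nz}")

        # the Kronecker structure lets C @ v be applied without forming C
        v = np.random.default_rng(6).normal(size=(nz, ny, nx))
        Cv = np.einsum("ij,kl,mn,jln->ikm", Cz, Cy, Cx, v)   # (Cz (x) Cy (x) Cx) v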

  17. Epileptic patterns of local cerebral metabolism and perfusion in man determined by emission computed tomography of 18FDG and 13NH3

    International Nuclear Information System (INIS)

    Kuhl, D.E.; Engel, J. Jr.; Phelps, M.E.; Selin, C.

    1979-01-01

    Seventeen patients with partial epilepsy had EEG monitoring concurrent with cerebral emission computed tomography (ECT) after 18F-fluorodeoxyglucose (18FDG) and 13N-ammonia were given intravenously as indicators of local cerebral glucose utilization (LCMR(glc)) and relative perfusion, respectively. In 12 of 15 patients who had unilateral or focal electrical abnormalities, interictal 18FDG scan patterns clearly showed localized regions of decreased (20% to 50%) LCMR(glc), which correlated anatomically with the eventual EEG localization. These hypometabolic zones appeared normal on x-ray computed tomography in all but three patients and were unchanged on scans repeated on different days. In 5 of 6 patients who underwent temporal lobectomy, the interictal 18FDG scan correctly detected the pathologically confirmed lesion as a hypometabolic zone, and removal of the lesion site resulted in marked clinical improvement. In contrast, the ictal 18FDG scan patterns clearly showed foci of increased (82% to 130%) LCMR(glc), which correlated temporally and anatomically with ictal EEG spike foci and were within the zones of interictal hypometabolism (3 studies in 2 patients). 13NH3 distributions paralleled 18FDG increases and decreases in abnormal zones, but 13NH3 differences were of lesser magnitude. When the relationship of 13NH3 uptake to local blood flow found in dog brain was applied as a correction to the patients' 13NH3 scan data, local alterations in perfusion and glucose utilization were usually matched, both in the interictal and ictal states

  18. Boys are not exempt: Sexual exploitation of adolescents in sub-Saharan Africa.

    Science.gov (United States)

    Adjei, Jones K; Saewyc, Elizabeth M

    2017-03-01

    Research on youth sexual exploitation in Africa has largely neglected the experiences of exploited boys. To date, much of the research in sub-Saharan Africa continues to consider boys mainly as exploiters but not as exploited. Using the only publicly available population-based surveys from the National Survey of Adolescents, conducted in four sub-Saharan African countries - Burkina Faso, Ghana, Malawi, and Uganda-we assessed factors associated with transactional sexual behaviour among never-married adolescent boys and girls. We also examined whether boys' reported sexual exploitation was linked to similar risky sexual behaviours as has been noted among girls in sub-Saharan Africa. Results from our analyses indicated that even though adolescent girls have a somewhat higher likelihood of reporting sexual abuse and exploitation, the odds of trading sex were significantly elevated for previously traumatized boys (that is those with a history of sexual and physical abuse) but not for their female counterparts. Just like adolescent girls, transactional sexual behaviour was associated with the risk of having concurrent multiple sexual partners for boys. These findings support the reality of boys' sexual exploitation within the African context, and further highlight the importance of including males in general and boys in particular in population-based studies on sexual health, risk, and protective factors in the sub-Saharan African region. Understanding the factors linked to sexual exploitation for both boys and girls will help in developing policies and programs that could improve the overall sexual and reproductive health outcomes among adolescents and youth in sub-Saharan Africa. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. A Descriptive Study on Sexually Exploited Children in Residential Treatment

    Science.gov (United States)

    Twill, Sarah E.; Green, Denise M.; Traylor, Amy

    2010-01-01

    Sexual exploitation and prostitution of children and adolescents is a multibillion dollar industry in the United States (Estes and Weiner in "Medical, legal & social science aspects of child sexual exploitation: A comprehensive review of pornography, prostitution, and internet crimes, vol I," G.W. Medical Publishing, Inc, St Louis,…

  20. Sustainable exploitation and management of aquatic resources

    DEFF Research Database (Denmark)

    Neuenfeldt, Stefan; Köster, Fritz

    2014-01-01

    DTU Aqua conducts research, provides advice, educates at university level and contributes to innovation in sustainable exploitation and management of aquatic resources. The vision of DTU Aqua is to enable ecologically and economically sustainable exploitation of aquatic resources applying an integrated ... management. Marine ecosystems aims at understanding the mechanisms that govern the interaction between individuals, species and populations in an ecosystem, enabling us to determine the stability and flexibility of the ecosystem. Marine living resources looks at the sustainable utilization of fish and shellfish stocks. Ecosystem effects expands from the ecosystem approach to fisheries management to an integrated approach where other human activities are taken into consideration. Fisheries management develops methods, models and tools for predicting and evaluating the effects of management measures and regulations...

  1. A morphing strategy to couple non-local to local continuum mechanics

    KAUST Repository

    Lubineau, Gilles

    2012-06-01

    A method for coupling non-local continuum models with long-range central forces to local continuum models is proposed. First, a single unified model that encompasses both local and non-local continuum representations is introduced. This model can be purely non-local, purely local or a hybrid depending on the constitutive parameters. Then, the coupling between the non-local and local descriptions is performed through a transition (morphing) affecting only the constitutive parameters. An important feature is the definition of the morphing functions, which relies on energy equivalence. This approach is useful in large-scale modeling of materials that exhibit strong non-local effects. The computational cost can be reduced while maintaining a reasonable level of accuracy. Efficiency, robustness and basic properties of the approach are discussed using one- and two-dimensional examples. © 2012 Elsevier Ltd.

  2. A morphing strategy to couple non-local to local continuum mechanics

    KAUST Repository

    Lubineau, Gilles; Azdoud, Yan; Han, Fei; Rey, Christian C.; Askari, Abe H.

    2012-01-01

    A method for coupling non-local continuum models with long-range central forces to local continuum models is proposed. First, a single unified model that encompasses both local and non-local continuum representations is introduced. This model can be purely non-local, purely local or a hybrid depending on the constitutive parameters. Then, the coupling between the non-local and local descriptions is performed through a transition (morphing) affecting only the constitutive parameters. An important feature is the definition of the morphing functions, which relies on energy equivalence. This approach is useful in large-scale modeling of materials that exhibit strong non-local effects. The computational cost can be reduced while maintaining a reasonable level of accuracy. Efficiency, robustness and basic properties of the approach are discussed using one- and two-dimensional examples. © 2012 Elsevier Ltd.
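
    A hedged one-dimensional sketch of the morphing idea (not the constitutive model of the paper): a blending function alpha(x) rises from 0 to 1 across a transition zone, and the internal force at each node is assembled as alpha times a local stiffness term plus (1 - alpha) times a non-local (bond-integral) term. The bond micromodulus calibration below is a common 1-D assumption:

        import numpy as np

        n, h, E = 101, 0.01, 1.0                      # nodes, spacing, local modulus
        x = np.arange(n) * h
        u = 0.05 * x + 0.01 * np.sin(8 * np.pi * x)   # a test displacement field
        alpha = np.clip((x - 0.4) / 0.2, 0.0, 1.0)    # 0 = non-local, 1 = local
        delta = 5 * h                                 # non-local horizon
        c = 2.0 * E / delta**2                        # assumed 1-D micromodulus calibration

        force = np.zeros(n)
        for i in range(1, n - 1):
            # local contribution: E * u'' by central differences
            f_loc = E * (u[i + 1] - 2 * u[i] + u[i - 1]) / h**2
            # non-local contribution: pairwise bond forces within the horizon
            js = [j for j in range(n) if 0 < abs(x[j] - x[i]) <= delta + 1e-12]
            f_nl = sum(c * (u[j] - u[i]) / abs(x[j] - x[i]) * h for j in js)
            # morphing: constitutive blend of the two descriptions
            force[i] = alpha[i] * f_loc + (1 - alpha[i]) * f_nl
        print(force[[10, 50, 90]])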

  3. MOMCC: Market-Oriented Architecture for Mobile Cloud Computing Based on Service Oriented Architecture

    OpenAIRE

    Abolfazli, Saeid; Sanaei, Zohreh; Gani, Abdullah; Shiraz, Muhammad

    2012-01-01

    The vision of augmenting the computing capabilities of mobile devices, especially smartphones, at the least cost is becoming reality by leveraging cloud computing. Cloud exploitation by mobile devices breeds a new research domain called Mobile Cloud Computing (MCC). However, issues like portability and interoperability should be addressed for mobile augmentation, which is a non-trivial task using component-based approaches. Service Oriented Architecture (SOA) is a promising design philosop...

  4. Imouraren mining exploitation : Complementary studies Synthetic report Volum B - Mines

    International Nuclear Information System (INIS)

    1980-01-01

    The object of the current study is to determine the main technical characteristics of the reference project for a mine that can supply the necessary ore quantity at a production of 3000 tonnes of uranium per year over 10 years. The project is one of the possible solutions for exploiting the mine. The current study makes it possible to establish: investment and operating cost estimates, the overall design of the mining exploitation program, estimation of the necessary workforce, evaluation of average ore grades and the variations of these grades, utility needs, the production ramp-up program, the main exploitation methods and the necessary equipment. The reference project study of the mine serves as the basis for the economic studies and their optimization [fr]

  5. New Concepts and Applications in Soft Computing

    CERN Document Server

    Fodor, János; Várkonyi-Kóczy, Annamária

    2013-01-01

    The book provides a sample of research on the innovative theory and applications of soft computing paradigms. The idea of Soft Computing was initiated in 1981 when Professor Zadeh published his first paper on soft data analysis, and it has constantly evolved ever since. Professor Zadeh defined Soft Computing as the fusion of the fields of fuzzy logic (FL), neural network theory (NN) and probabilistic reasoning (PR), with the latter subsuming belief networks, evolutionary computing including DNA computing, chaos theory and parts of learning theory, into one multidisciplinary system. As Zadeh said, the essence of soft computing is that, unlike traditional hard computing, soft computing is aimed at an accommodation with the pervasive imprecision of the real world. Thus, the guiding principle of soft computing is to exploit the tolerance for imprecision, uncertainty and partial truth to achieve tractability, robustness, low solution cost and better rapport with reality. ...

  6. Energy efficient hybrid computing systems using spin devices

    Science.gov (United States)

    Sharad, Mrigank

    Emerging spin devices like magnetic tunnel junctions (MTJs), spin valves and domain wall magnets (DWM) have opened new avenues for spin-based logic design. This work explored potential computing applications which can exploit such devices for higher energy efficiency and performance. The proposed applications involve hybrid design schemes, where charge-based devices supplement the spin devices to gain large benefits at the system level. As an example, lateral spin valves (LSV) involve switching of nanomagnets using spin-polarized current injection through a metallic channel such as Cu. Such spin-torque based devices possess several interesting properties that can be exploited for ultra-low power computation. The analog characteristics of spin currents facilitate non-Boolean computation like majority evaluation, which can be used to model a neuron. The magneto-metallic neurons can operate at an ultra-low terminal voltage of ~20mV, thereby resulting in small computation power. Moreover, since nanomagnets inherently act as memory elements, these devices can facilitate the integration of logic and memory in interesting ways. The spin-based neurons can be integrated with CMOS and other emerging devices, leading to different classes of neuromorphic/non-Von-Neumann architectures. The spin-based designs involve 'mixed-mode' processing and hence can provide very compact and ultra-low energy solutions for complex computation blocks, both digital as well as analog. Such low-power hybrid designs can be suitable for various data processing applications like cognitive computing, associative memory, and current-mode on-chip global interconnects. Simulation results for these applications, based on a device-circuit co-simulation framework, predict more than ~100x improvement in computation energy as compared to state-of-the-art CMOS design, for optimal spin-device parameters.

  7. Etude de l'exploitation et du marché des produits forestiers non ligneux à Kinshasa

    Directory of Open Access Journals (Sweden)

    Biloso, A.

    2006-01-01

    Study of Non-Woody Forest Products Exploitation and Market in Kinshasa. In spite of the considerable number of ethnobotanic studies carried out in many regions of the Democratic Republic of Congo, almost no information is available on the exploitation and commerce of wild products (non-woody forest products, "produits forestiers non ligneux" – PFNL) of the Congolese provinces in general and the urban province of Kinshasa in particular. Nevertheless, these products are largely used and marketed. Therefore, direct observations in situ and socio-economic and ethnoecologic investigations (including interviews) were organized in the urban province of Kinshasa to analyse various aspects of the consumption of these wild products. The purpose of these investigations was to collect information regarding the use and marketing of PFNL by the populations living in the zones surrounding the urban area. The analysis of the various types of exploitation and use of the PFNL has shown twelve categories of PFNL use: energy, food, construction of music instruments, saw mill applications, drink, drugs, dye, packing, construction of baskets, textile fabrication, construction and ornamentation. The majority of these PFNL originate not only from the secondary forests and forest galleries, but from shrubby savannas and marshes as well. The exploitation and uses of the PFNL vary considerably with the level of income, the purchasing power, and the attachment to food practices and local traditions. The semi-monthly incomes per owner of PFNL are estimated for the following plants, which were used as vegetables: Gnetum africanum Welw. (275.0 $); Pteridium aquilinium Hieron. (166.7 $); Dracaena camerooniana Baker. (75.5 $); Dioscorea praehensilis (Benth.) (71.0 $); Psophocarpus scandens (Endl.) Verdc. (58.7 $). The average income resulting from firewood sale was estimated at 80 $ per month and per person, for a group of 25 owners

  8. Role of local governments in promoting energy efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Lee, H.

    1980-11-01

    An examination is made of the incentives which influence the decisions by local governments to adopt energy-efficiency programs, either unilaterally or in partnership with the Federal government. It is found that there is significant potential for improved energy efficiency in urban residential, commercial, and industrial buildings and that exploiting these opportunities is in the interest of both Federal and local governments. Unless there is a unique combination of strong local leadership, a tradition of resource management, and external energy shocks, communities are unlikely to realize this potential. Conflicting demands, traditional perceptions, and lack of funding pose a major barrier to a strong unilateral commitment by local governments. A Federal-local partnership built upon and complementary to existing efforts in areas such as housing, social welfare, and economic development offers an excellent opportunity to realize the inherent potential of local energy-efficiency programs. At the local level, energy is not perceived as an isolated issue, but one which is part of a number of problems arising from the continuing increase in energy prices.

  9. Pupil Diameter Tracks the Exploration-Exploitation Trade-off during Analogical Reasoning and Explains Individual Differences in Fluid Intelligence.

    Science.gov (United States)

    Hayes, Taylor R; Petrov, Alexander A

    2016-02-01

    The ability to adaptively shift between exploration and exploitation control states is critical for optimizing behavioral performance. Converging evidence from primate electrophysiology and computational neural modeling has suggested that this ability may be mediated by the broad norepinephrine projections emanating from the locus coeruleus (LC) [Aston-Jones, G., & Cohen, J. D. An integrative theory of locus coeruleus-norepinephrine function: Adaptive gain and optimal performance. Annual Review of Neuroscience, 28, 403-450, 2005]. There is also evidence that pupil diameter covaries systematically with LC activity. Although imperfect and indirect, this link makes pupillometry a useful tool for studying the locus coeruleus norepinephrine system in humans and in high-level tasks. Here, we present a novel paradigm that examines how the pupillary response during exploration and exploitation covaries with individual differences in fluid intelligence during analogical reasoning on Raven's Advanced Progressive Matrices. Pupillometry was used as a noninvasive proxy for LC activity, and concurrent think-aloud verbal protocols were used to identify exploratory and exploitative solution periods. This novel combination of pupillometry and verbal protocols from 40 participants revealed a decrease in pupil diameter during exploitation and an increase during exploration. The temporal dynamics of the pupillary response was characterized by a steep increase during the transition to exploratory periods, sustained dilation for many seconds afterward, and followed by gradual return to baseline. Moreover, the individual differences in the relative magnitude of pupillary dilation accounted for 16% of the variance in Advanced Progressive Matrices scores. Assuming that pupil diameter is a valid index of LC activity, these results establish promising preliminary connections between the literature on locus coeruleus norepinephrine-mediated cognitive control and the literature on analogical

  10. Do sex-specific densities affect local survival of free-ranging great tits?

    NARCIS (Netherlands)

    Michler, Stephanie P. M.; Nicolaus, Marion; Ubels, Richard; van der Velde, Marco; Both, Christiaan; Tinbergen, Joost M.; Komdeur, Jan

    2011-01-01

    Competition within sexes is expected when resources are sex specific, whereas competition between sexes can occur when similar resources are exploited. Local population density and sex ratio will determine the amount of sex-specific interactions and thus the potential degree of sex-specific

  11. CMS computing upgrade and evolution

    CERN Document Server

    Hernandez Calama, Jose

    2013-01-01

    The distributed Grid computing infrastructure has been instrumental in the successful exploitation of the LHC data leading to the discovery of the Higgs boson. The computing system will need to face new challenges from 2015 on when LHC restarts with an anticipated higher detector output rate and event complexity, but with only a limited increase in the computing resources. A more efficient use of the available resources will be mandatory. CMS is improving the data storage, distribution and access as well as the processing efficiency. Remote access to the data through the WAN, dynamic data replication and deletion based on the data access patterns, and separation of disk and tape storage are some of the areas being actively developed. Multi-core processing and scheduling is being pursued in order to make a better use of the multi-core nodes available at the sites. In addition, CMS is exploring new computing techniques, such as Cloud Computing, to get access to opportunistic resources or as a means of using wit...

  12. Numerical computation of molecular integrals via optimized (vectorized) FORTRAN code

    International Nuclear Information System (INIS)

    Scott, T.C.; Grant, I.P.; Saunders, V.R.

    1997-01-01

    The calculation of molecular properties based on quantum mechanics is an area of fundamental research whose horizons have always been determined by the power of state-of-the-art computers. A computational bottleneck is the numerical calculation of the required molecular integrals to sufficient precision. Herein, we present a method for the rapid numerical evaluation of molecular integrals using optimized FORTRAN code generated by Maple. The method is based on the exploitation of common intermediates and the optimization can be adjusted to both serial and vectorized computations. (orig.)
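
    The Maple-generated FORTRAN of the paper is not reproduced here; as an illustrative analogue, SymPy's common-subexpression elimination performs the same kind of optimization on a toy integrand (not a real molecular integral), and its code generator can then emit Fortran from the result:

        import sympy as sp

        r1, r2, a = sp.symbols("r1 r2 a", positive=True)
        # toy expression with repeated intermediates
        expr = (sp.exp(-a * (r1 + r2)) * (r1 + r2) ** 2
                + sp.sqrt(r1 + r2) * sp.exp(-a * (r1 + r2)))

        replacements, reduced = sp.cse(expr)
        for lhs, rhs in replacements:
            print(lhs, "=", rhs)          # common intermediates computed once
        print("result =", reduced[0])

        # Fortran for the optimized expression, one way to mirror the paper's approach:
        print(sp.fcode(reduced[0], standard=95))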

  13. Local Ray-Based Traveltime Computation Using the Linearized Eikonal Equation

    KAUST Repository

    Almubarak, Mohammed S.

    2013-05-01

    The computation of traveltimes plays a critical role in the conventional implementations of Kirchhoff migration. Finite-difference-based methods are considered one of the most effective approaches for traveltime calculations and are therefore widely used. However, these eikonal solvers are mainly used to obtain early-arrival traveltime. Ray tracing can be used to pick later traveltime branches, besides the early arrivals, which may lead to an improvement in velocity estimation or in seismic imaging. In this thesis, I improved the accuracy of the solution of the linearized eikonal equation by constructing a linear system of equations (LSE) based on finite-difference approximation, which is of second-order accuracy. The ill-conditioned LSE is initially regularized and subsequently solved to calculate the traveltime update. Numerical tests proved that this method is as accurate as the second-order eikonal solver. Later arrivals are picked using ray tracing. These traveltimes are binned to the nearest node on a regular grid and empty nodes are estimated by interpolating the known values. The resulting traveltime field is used as an input to the linearized eikonal algorithm, which improves the accuracy of the interpolated nodes and yields a local ray-based traveltime. This is a preliminary study and further investigation is required to test the efficiency and the convergence of the solutions.
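
    The regularization step can be sketched in isolation: an ill-conditioned linear system A dt = b (a random symmetric ill-conditioned stand-in below, not an actual finite-difference eikonal system) is stabilized with Tikhonov damping before solving for the update. The damping parameter is an assumption:

        import numpy as np

        rng = np.random.default_rng(7)
        U, _ = np.linalg.qr(rng.normal(size=(50, 50)))
        A = U @ np.diag(np.logspace(0, -10, 50)) @ U.T    # condition number ~1e10
        b = A @ np.ones(50) + 1e-8 * rng.normal(size=50)  # slightly noisy right-hand side

        naive = np.linalg.solve(A, b)                     # noise is amplified enormously
        lam = 1e-6                                        # damping parameter (assumed)
        damped = np.linalg.solve(A.T @ A + lam * np.eye(50), A.T @ b)

        truth = np.ones(50)
        print("naive error:", np.linalg.norm(naive - truth))
        print("damped error:", np.linalg.norm(damped - truth))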

  14. EXPLOITATION OF GLASS SAND ON THE OPEN PIT »VRTLINSKA« (MOSLAVINA, CROATIA)

    Directory of Open Access Journals (Sweden)

    Slavko Vujec

    1994-12-01

    Full Text Available The exploitation of glass sand in Slavonia and Moslavina has a long tradition. The open pit »Vrtlinska« is, according to its dimensions and production capacity, the biggest one in this region. Exploitable reserves within this open pit amount to about 1 000 000 t of glass sand of very good quality, and the production capacity is 200 000 t yearly, according to the real needs at the time of designing and opening the pit, i.e. before the war. This article discusses the geological and geomechanical characteristics of the deposit, as well as the exploitation process, which closely matches the natural characteristics of the deposit. A more detailed description is given of the planned exploitation phase I above the groundwater level, which is carried out as a discontinuous system. For the exploitation of the deeper part below the groundwater level in phase II, the necessity of further examination of the hydrogeological characteristics of the deposit is presented, in order to acquire the necessary information on the groundwater regime and drainage conditions. This knowledge will influence the choice of the most appropriate solutions for exploiting the deeper part of the deposit.

  15. Pre- and post-radiotherapy computed tomography in laryngeal cancer: imaging-based prediction of local failure

    International Nuclear Information System (INIS)

    Pameijer, Frank A.; Hermans, Robert; Mancuso, Anthony A.; Mendenhall, William M.; Parsons, James T.; Stringer, Scott P.; Kubilis, Paul S.; Tinteren, Harm van

    1999-01-01

    Purpose: To determine if pre-radiotherapy (RT) and/or post-radiotherapy computed tomography (CT) can predict local failure in patients with laryngeal carcinoma treated with definitive RT. Methods and Materials: The pre- and post-RT CT examinations of 59 patients (T3 glottic carcinoma [n = 30] and T1-T4 supraglottic carcinoma [n = 29]) were reviewed. For each patient, the first post-RT CT study between 1 and 6 months after irradiation was used. All patients were treated with definitive hyperfractionated twice-daily continuous-course irradiation to a total dose of 6,720-7,920 cGy, and followed up clinically for at least 2 years after completion of RT. Local control was defined as absence of primary tumor recurrence and a functioning larynx. On the pre-treatment CT study, each tumor was assigned a high- or low-risk profile for local failure after RT. The post-RT CT examinations were evaluated for post-treatment changes using a three-point post-RT CT score: 1 = expected post-RT changes; 2 = focal mass with a maximal diameter of <1 cm; 3 = focal mass with a maximal diameter of >1 cm, or <50% estimated tumor volume reduction. Results: The local control rates at 2 years post-RT based on pre-treatment CT evaluation were 88% for low pre-treatment risk profile patients (95% CI: 66-96%) and 34% (95% CI: 19-50%) for high pre-treatment risk profile patients (risk ratio 6.583; 95% CI: 2.265-9.129; p = 0.0001). Based on post-treatment CT, the local control rates at 2 years post-RT were 94% for score 1, 67% for score 2, and 10% for score 3 (risk ratio 4.760; 95% CI: 2.278-9.950; p = 0.0001). Post-RT CT scores added significant information to the pre-treatment risk profiles on prognosis. Conclusions: Pre-treatment CT risk profiles, as well as post-RT CT evaluation, can identify patients irradiated for laryngeal carcinoma who are at high risk of developing local failure. When the post-RT CT score is available, it proves to be an even better prognosticator than the pre-treatment CT risk profile.

  16. [Impacts of hydroelectric cascade exploitation on river ecosystem and landscape: a review].

    Science.gov (United States)

    Yang, Kun; Deng, Xi; Li, Xue-Ling; Wen, Ping

    2011-05-01

    Hydroelectric cascade exploitation, one of the major ways of exploiting water resources and developing hydropower, not only satisfies the needs of various national economic sectors and promotes the socio-economic sustainable development of the river basin, but also brings unavoidable anthropogenic impacts on the entire basin ecosystem. Based on the process of hydroelectric cascade exploitation and the ecological characteristics of river basins, this paper reviewed the major impacts of hydroelectric cascade exploitation on dam-area ecosystems, river reservoir micro-climate, riparian ecosystems, river aquatic ecosystems, wetlands, and river landscapes. Some prospects for future research were offered, e.g., strengthening the research on chain reactions and cumulative effects of ecological factors affected by hydroelectric cascade exploitation, intensifying the study of positive and negative ecological effects under dam networks and their joint operation, and improving the research on the successional development and stability of basin ecosystems at different temporal and spatial scales.

  17. The exploitation of living resources in the Dutch Wadden Sea : a historical overview

    NARCIS (Netherlands)

    Wolff, W J

    An overview, based on written sources and personal observations, is presented of exploitation of living resources in and around the Dutch Wadden Sea during the past few centuries. It is concluded that before about 1900 exploitation was almost unrestricted. Exploitation of plants has been documented

  18. Approach of simultaneous localization and mapping based on local maps for robot

    Institute of Scientific and Technical Information of China (English)

    CHEN Bai-fan; CAI Zi-xing; HU De-wen

    2006-01-01

    An extended Kalman filter approach to simultaneous localization and mapping (SLAM) was proposed based on local maps. A local frame of reference was established periodically at the position of the robot, and then the observations of the robot and landmarks were fused into the global frame of reference. Because of the independence of the local maps, the approach does not accumulate the estimation and calculation errors that are produced when SLAM uses the Kalman filter directly. At the same time, it reduces the computational complexity. This method is proven correct and feasible in simulation experiments.
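
    A minimal sketch of the fusion step, under simplifying assumptions (the pose of the local frame is taken as known, covariances are propagated to first order, and pose-landmark cross-correlations are ignored):

    ```python
    # Sketch: transform a landmark estimate (mean and covariance) from a
    # local frame anchored at the robot's pose into the global frame.
    import numpy as np

    def local_to_global(pose, lm_local, cov_local):
        """pose = (x, y, theta) of the local frame origin in the global frame;
        lm_local, cov_local = landmark mean/covariance in the local frame."""
        x, y, th = pose
        c, s = np.cos(th), np.sin(th)
        R = np.array([[c, -s], [s, c]])
        lm_global = np.array([x, y]) + R @ lm_local
        cov_global = R @ cov_local @ R.T  # first-order (Jacobian) propagation
        return lm_global, cov_global

    lm, cov = local_to_global((2.0, 1.0, np.pi / 2),
                              np.array([1.0, 0.0]),
                              np.diag([0.01, 0.04]))
    print(lm, cov, sep='\n')
    ```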

  19. Guess Where? Actor-Supervision for Spatiotemporal Action Localization

    KAUST Repository

    Escorcia, Victor; Dao, Cuong D.; Jain, Mihir; Ghanem, Bernard; Snoek, Cees

    2018-04-05

    This paper addresses the problem of spatiotemporal localization of actions in videos. Compared to leading approaches, which all learn to localize based on carefully annotated boxes on training video frames, we adhere to a weakly-supervised solution that only requires a video class label. We introduce an actor-supervised architecture that exploits the inherent compositionality of actions in terms of actor transformations, to localize actions. We make two contributions. First, we propose actor proposals derived from a detector for human and non-human actors intended for images, which is linked over time by Siamese similarity matching to account for actor deformations. Second, we propose an actor-based attention mechanism that enables the localization of the actions from action class labels and actor proposals and is end-to-end trainable. Experiments on three human and non-human action datasets show actor supervision is state-of-the-art for weakly-supervised action localization and is even competitive to some fully-supervised alternatives.

  1. PROBA-V Mission Exploitation Platform

    Directory of Open Access Journals (Sweden)

    Erwin Goor

    2016-07-01

    Full Text Available As an extension of the PROBA-Vegetation (PROBA-V) user segment, the European Space Agency (ESA), de Vlaamse Instelling voor Technologisch Onderzoek (VITO), and partners TRASYS and Spacebel developed an operational Mission Exploitation Platform (MEP) to drastically improve the exploitation of the PROBA-V Earth Observation (EO) data archive, the archive from the historical SPOT-VEGETATION mission, and derived products by researchers, service providers, and thematic users. The analysis of the time series of data (petabyte range) is addressed, as well as the large scale on-demand processing of the complete archive, including near real-time data. The platform consists of a private cloud environment, a Hadoop-based processing environment and a data manager. Several applications are released to the users, e.g., a full resolution viewing service, a time series viewer, pre-defined on-demand processing chains, and virtual machines with powerful tools and access to the data. After an initial release in January 2016 a research platform was deployed gradually, allowing users to design, debug, and test applications on the platform. From the PROBA-V MEP, access to, e.g., Sentinel-2 and Sentinel-3 data will be addressed as well.

  2. Risk assessment by dynamic representation of vulnerability, exploitation, and impact

    Science.gov (United States)

    Cam, Hasan

    2015-05-01

    Assessing and quantifying cyber risk accurately in real-time is essential to providing security and mission assurance in any system and network. This paper presents a modeling and dynamic analysis approach to assessing cyber risk of a network in real-time by representing dynamically its vulnerabilities, exploitations, and impact using integrated Bayesian network and Markov models. Given the set of vulnerabilities detected by a vulnerability scanner in a network, this paper addresses how its risk can be assessed by estimating in real-time the exploit likelihood and impact of vulnerability exploitation on the network, based on real-time observations and measurements over the network. The dynamic representation of the network in terms of its vulnerabilities, sensor measurements, and observations is constructed dynamically using the integrated Bayesian network and Markov models. The transition rates of outgoing and incoming links of states in hidden Markov models are used in determining exploit likelihood and impact of attacks, whereas emission rates help quantify the attack states of vulnerabilities. Simulation results show the quantification and evolving risk scores over time for individual and aggregated vulnerabilities of a network.
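
    As a toy illustration of the Markov-model ingredient (all transition and emission matrices below are invented, not taken from the paper), a hidden Markov forward pass can track the probability that a vulnerability has reached an exploited state from a stream of observations:

    ```python
    # Toy hidden Markov forward pass over attack states of a vulnerability.
    # T encodes transitions between states; E encodes P(observation | state).
    import numpy as np

    states = ['dormant', 'probed', 'exploited']
    T = np.array([[0.90, 0.08, 0.02],
                  [0.10, 0.70, 0.20],
                  [0.00, 0.05, 0.95]])
    E = np.array([[0.80, 0.15, 0.05],   # observations: quiet / scan / alert
                  [0.30, 0.50, 0.20],
                  [0.05, 0.25, 0.70]])

    belief = np.array([1.0, 0.0, 0.0])  # start in 'dormant'
    for obs in [1, 1, 2, 2]:            # observed: scan, scan, alert, alert
        belief = E[:, obs] * (T.T @ belief)  # forward recursion step
        belief /= belief.sum()
        print(dict(zip(states, belief.round(3))))
    ```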

  3. Three-dimensional localization of impacted canines and root resorption assessment using cone beam computed tomography.

    Science.gov (United States)

    Almuhtaseb, Eyad; Mao, Jing; Mahony, Derek; Bader, Rawan; Zhang, Zhi-xing

    2014-06-01

    The purpose of this study was to develop a new way to localize impacted canines in three dimensions and to investigate root resorption of the adjacent teeth by using cone beam computed tomography (CBCT). Forty-six patients undergoing orthodontic treatment at Tongji Hospital and having impacted canines were examined. The images of the CBCT scans were obtained from a KaVo 3D eXam system. Angular and linear measurements of the cusp tip and root apex with respect to the three planes (mid-sagittal, occlusal and frontal) were taken using the cephalometric tool of InVivo Dental Anatomage Version 5.1.10. The angular and linear coordinates of the maxillary and mandibular canines were obtained. Using this technique the operators could envision the location of the impacted canine with respect to the three clinical planes. Root resorption of adjacent teeth was found in 28.26% of upper lateral incisors and 17.39% of upper central incisors, but no root resorption of lower teeth was found in our sample. Accurate and reliable localization of impacted canines can be obtained from this novel analysis system, which offers better surgical and orthodontic treatment for patients with impacted canines.

  4. Adaptive local learning in sampling based motion planning for protein folding.

    Science.gov (United States)

    Ekenna, Chinwe; Thomas, Shawna; Amato, Nancy M

    2016-08-01

    Simulating protein folding motions is an important problem in computational biology. Motion planning algorithms, such as Probabilistic Roadmap Methods, have been successful in modeling the folding landscape. Probabilistic Roadmap Methods and variants contain several phases (i.e., sampling, connection, and path extraction). Most of the time is spent in the connection phase, and selecting which variant to employ is a difficult task. Global machine learning has been applied to the connection phase but is inefficient in situations with varying topology, such as those typical of folding landscapes. We develop a local learning algorithm that exploits the past performance of methods within the neighborhood of the current connection attempts as a basis for learning. It is sensitive not only to different types of landscapes but also to differing regions in the landscape itself, removing the need to explicitly partition the landscape. We perform experiments on 23 proteins of varying secondary structure makeup with 52-114 residues. We compare the success rates of our method and other methods. We demonstrate a clear need for learning (i.e., only learning methods were able to validate against all available experimental data) and show that local learning is superior to global learning, in many cases producing significantly higher-quality results than the other methods. We present an algorithm that uses local learning to select appropriate connection methods in the context of roadmap construction for protein folding. Our method removes the burden of deciding which method to use, leverages the strengths of the individual input methods, and is extendable to include other future connection methods.
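
    A simplified sketch of the local-learning idea (not the paper's algorithm): log each connection attempt with its location and outcome, then pick the method with the best Laplace-smoothed success rate within the neighborhood of the current attempt, with occasional exploration:

    ```python
    # Sketch: neighborhood-based selection among connection methods.
    import math
    import random

    history = []  # (position, method, succeeded) tuples

    def choose_method(pos, methods, radius=1.0, eps=0.1):
        if random.random() < eps:            # keep exploring all methods
            return random.choice(methods)
        scores = {}
        for m in methods:
            near = [ok for (p, meth, ok) in history
                    if meth == m and math.dist(p, pos) < radius]
            scores[m] = (sum(near) + 1) / (len(near) + 2)  # Laplace-smoothed
        return max(scores, key=scores.get)

    methods = ['method-A', 'method-B', 'method-C']  # hypothetical variants
    for _ in range(100):
        pos = (random.random(), random.random())
        m = choose_method(pos, methods)
        ok = random.random() < (0.8 if m == 'method-A' else 0.4)  # fake outcome
        history.append((pos, m, ok))
    print(choose_method((0.5, 0.5), methods))
    ```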

  5. Panel discussion on exploitation of geothermal resources in thermal zones

    Energy Technology Data Exchange (ETDEWEB)

    Viramonte, J G; Mange, J; Stefani, G

    1978-03-01

    The topics discussed include the major known geothermal resources, varying ways of exploiting geothermal resources, technical and economic difficulties in the exploitation, the place of geothermal energy in the total energy policy of a given country, advanced exploration techniques, and indications of needed areas of study. The panelists represented most of the South American countries, Mexico, and Italy. (JSR)

  6. Exploiting Non-Markovianity for Quantum Control.

    Science.gov (United States)

    Reich, Daniel M; Katz, Nadav; Koch, Christiane P

    2015-07-22

    Quantum technology, exploiting entanglement and the wave nature of matter, relies on the ability to accurately control quantum systems. Quantum control is often compromised by the interaction of the system with its environment since this causes loss of amplitude and phase. However, when the dynamics of the open quantum system is non-Markovian, amplitude and phase flow not only from the system into the environment but also back. Interaction with the environment is then not necessarily detrimental. We show that the back-flow of amplitude and phase can be exploited to carry out quantum control tasks that could not be realized if the system was isolated. The control is facilitated by a few strongly coupled, sufficiently isolated environmental modes. Our paradigmatic example considers a weakly anharmonic ladder with resonant amplitude control only, restricting realizable operations to SO(N). The coupling to the environment, when harnessed with optimization techniques, allows for full SU(N) controllability.

  7. An Exploitability Analysis Technique for Binary Vulnerability Based on Automatic Exception Suppression

    Directory of Open Access Journals (Sweden)

    Zhiyuan Jiang

    2018-01-01

    Full Text Available To quickly verify and fix vulnerabilities, it is necessary to judge the exploitability of the massive number of crashes generated by automated vulnerability mining tools. Manual analysis of crashes is inefficient and time-consuming, while the existing automated tools can only handle execute exceptions and some write exceptions, but cannot handle common read exceptions. To address this problem, we propose a method of determining exploitability based on exception type suppression. This method enables the program to continue to execute until an exploitable exception is triggered. The method performs a symbolic replay of the crash sample, constructing and reusing data gadgets to bypass complex exceptions, thereby improving the efficiency and accuracy of vulnerability exploitability analysis. Tests on typical CGC/RHG binary software show that this method can automatically convert a crash that cannot be judged by existing analysis tools into a different crash type and judge its exploitability successfully.

  8. The role of the noradrenergic system in the exploration-exploitation trade-off: a pharmacological study

    Directory of Open Access Journals (Sweden)

    Marieke Jepma

    2010-08-01

    Full Text Available Animal research and computational modeling have indicated an important role for the neuromodulatory locus coeruleus-norepinephrine (LC-NE) system in the control of behavior. According to the adaptive gain theory, the LC-NE system is critical for optimizing behavioral performance by regulating the balance between exploitative and exploratory control states. However, crucial direct empirical tests of this theory in human subjects have been lacking. We used a pharmacological manipulation of the LC-NE system to test predictions of this theory in humans. In a double-blind parallel-groups design (N = 52), participants received 4 mg reboxetine (a selective norepinephrine reuptake inhibitor), 30 mg citalopram (a selective serotonin reuptake inhibitor), or placebo. The adaptive gain theory predicted that the increased tonic NE levels induced by reboxetine would promote task disengagement and exploratory behavior. We assessed the effects of reboxetine on performance in two cognitive tasks designed to examine task (dis)engagement and exploitative versus exploratory behavior: a diminishing-utility task and a gambling task with a non-stationary pay-off structure. In contrast to predictions of the adaptive gain theory, we did not find differences in task (dis)engagement or exploratory behavior between the three experimental groups, despite demonstrable effects of the two drugs on non-specific central and autonomic nervous system parameters. Our findings suggest that the LC-NE system may not be involved in the regulation of the exploration-exploitation trade-off in humans, at least not within the context of a single task. It remains to be examined whether the LC-NE system is involved in random exploration exceeding the current task context.

  9. Locality-Driven Parallel Static Analysis for Power Delivery Networks

    KAUST Repository

    Zeng, Zhiyu

    2011-06-01

    Large VLSI on-chip Power Delivery Networks (PDNs) are challenging to analyze due to the sheer network complexity. In this article, a novel parallel partitioning-based PDN analysis approach is presented. We use the boundary circuit responses of each partition to divide the full grid simulation problem into a set of independent subgrid simulation problems. Instead of solving exact boundary circuit responses, a more efficient scheme is proposed to provide near-exact approximation to the boundary circuit responses by exploiting the spatial locality of the flip-chip-type power grids. This scheme is also used in a block-based iterative error reduction process to achieve fast convergence. Detailed computational cost analysis and performance modeling is carried out to determine the optimal (or near-optimal) number of partitions for parallel implementation. Through the analysis of several large power grids, the proposed approach is shown to have excellent parallel efficiency, fast convergence, and favorable scalability. Our approach can solve a 16-million-node power grid in 18 seconds on an IBM p5-575 processing node with 16 Power5+ processors, which is 18.8X faster than a state-of-the-art direct solver. © 2011 ACM.
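
    The partitioning idea can be caricatured on a 1-D grid (this toy block-iterative sweep is only an analogue of the paper's scheme): each partition is solved independently using boundary values taken from the neighbouring partitions' previous iterate, and the sweeps repeat until the error is reduced:

    ```python
    # Toy block-Jacobi analogue of partitioned grid analysis on a 1-D grid.
    import numpy as np

    n, blocks = 32, 4
    b = np.full(n, 0.1)          # current injections (made-up values)
    v = np.zeros(n)              # node voltages; grid grounded at both ends
    size = n // blocks

    def solve_block(lo, hi, v_left, v_right):
        """Exact solve of one partition with Dirichlet boundary voltages."""
        m = hi - lo
        A = 2 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)  # 1-D Laplacian
        rhs = b[lo:hi].copy()
        rhs[0] += v_left
        rhs[-1] += v_right
        return np.linalg.solve(A, rhs)

    for sweep in range(500):     # block-iterative error reduction
        new_v = v.copy()
        for k in range(blocks):
            lo, hi = k * size, (k + 1) * size
            vl = v[lo - 1] if lo > 0 else 0.0   # boundary responses taken
            vr = v[hi] if hi < n else 0.0       # from neighbouring partitions
            new_v[lo:hi] = solve_block(lo, hi, vl, vr)
        err = np.max(np.abs(new_v - v))
        v = new_v
        if err < 1e-9:
            break
    print(f'converged after {sweep + 1} sweeps; v[:4] = {v[:4].round(4)}')
    ```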

  10. The ALARA project of the EDF nuclear park exploitation

    International Nuclear Information System (INIS)

    Potoczek, J.

    1998-01-01

    To bring the operation of the EDF nuclear plant fleet to the level of the world's best operators in terms of collective and individual dosimetry, the ALARA principle coordinates numerous actions: involving the contractor companies; defining a common language, methods and tools for the whole fleet so as to raise performance uniformly in this area; optimizing maintenance operations that are costly in radiation dose; making the different levels of management accountable for dosimetric stakes; reducing singular sources of exposure; and organizing the feedback of operating experience in this field, adapting the information system accordingly. The results are encouraging, and the objectives for 2000 are: less than 1.2 man·Sv per year and per reactor, and no worker whose annual radiation dose exceeds 20 mSv (except in exceptional cases). (N.C.)

  11. Use of Respiratory-Correlated Four-Dimensional Computed Tomography to Determine Acceptable Treatment Margins for Locally Advanced Pancreatic Adenocarcinoma

    International Nuclear Information System (INIS)

    Goldstein, Seth D.; Ford, Eric C.; Duhon, Mario; McNutt, Todd; Wong, John; Herman, Joseph M.

    2010-01-01

    Purpose: Respiratory-induced excursions of locally advanced pancreatic adenocarcinoma could affect dose delivery. This study quantified tumor motion and evaluated standard treatment margins. Methods and Materials: Respiratory-correlated four-dimensional computed tomography images were obtained on 30 patients with locally advanced pancreatic adenocarcinoma; 15 of whom underwent repeat scanning before cone-down treatment. Treatment planning software was used to contour the gross tumor volume (GTV), bilateral kidneys, and biliary stent. Excursions were calculated according to the centroid of the contoured volumes. Results: The mean ± standard deviation GTV excursion in the superoinferior (SI) direction was 0.55 ± 0.23 cm; an expansion of 1.0 cm adequately accounted for the GTV motion in 97% of locally advanced pancreatic adenocarcinoma patients. Motion GTVs were generated and resulted in a 25% average volume increase compared with the static GTV. Of the 30 patients, 17 had biliary stents. The mean SI stent excursion was 0.84 ± 0.32 cm, significantly greater than the GTV motion. The xiphoid process moved an average of 0.35 ± 0.12 cm, significantly less than the GTV. The mean SI motion of the left and right kidneys was 0.65 ± 0.27 cm and 0.77 ± 0.30 cm, respectively. At repeat scanning, no significant changes were seen in the mean GTV size (p = .8) or excursion (p = .3). Conclusion: These data suggest that an asymmetric expansion of 1.0, 0.7, and 0.6 cm along the respective SI, anteroposterior, and medial-lateral directions is recommended if a respiratory-correlated four-dimensional computed tomography scan is not available to evaluate the tumor motion during treatment planning. Surrogates of tumor motion, such as biliary stents or external markers, should be used with caution.

  12. Learning English with "The Sims": Exploiting Authentic Computer Simulation Games for L2 Learning

    Science.gov (United States)

    Ranalli, Jim

    2008-01-01

    With their realistic animation, complex scenarios and impressive interactivity, computer simulation games might be able to provide context-rich, cognitively engaging virtual environments for language learning. However, simulation games designed for L2 learners are in short supply. As an alternative, could games designed for the mass-market be…

  13. Computer graphics in heat-transfer simulations

    International Nuclear Information System (INIS)

    Hamlin, G.A. Jr.

    1980-01-01

    Computer graphics can be very useful in the setup of heat transfer simulations and in the display of the results of such simulations. The potential use of recently available low-cost graphics devices in the setup of such simulations has not been fully exploited. Several types of graphics devices and their potential usefulness are discussed, and some configurations of graphics equipment are presented in the low-, medium-, and high-price ranges

  14. Dynamic integration of remote cloud resources into local computing clusters

    Energy Technology Data Exchange (ETDEWEB)

    Fleig, Georg; Erli, Guenther; Giffels, Manuel; Hauth, Thomas; Quast, Guenter; Schnepf, Matthias [Institut fuer Experimentelle Kernphysik, Karlsruher Institut fuer Technologie (Germany)

    2016-07-01

    In modern high-energy physics (HEP) experiments enormous amounts of data are analyzed and simulated. Traditionally dedicated HEP computing centers are built or extended to meet this steadily increasing demand for computing resources. Nowadays it is more reasonable and more flexible to utilize computing power at remote data centers providing regular cloud services to users as they can be operated in a more efficient manner. This approach uses virtualization and allows the HEP community to run virtual machines containing a dedicated operating system and transparent access to the required software stack on almost any cloud site. The dynamic management of virtual machines depending on the demand for computing power is essential for cost efficient operation and sharing of resources with other communities. For this purpose the EKP developed the on-demand cloud manager ROCED for dynamic instantiation and integration of virtualized worker nodes into the institute's computing cluster. This contribution will report on the concept of our cloud manager and the implementation utilizing a remote OpenStack cloud site and a shared HPC center (bwForCluster located in Freiburg).
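
    The core reconcile loop of such an on-demand cloud manager can be sketched as follows (a schematic only; ROCED's actual interfaces are not reproduced, and the ToyCloud class below is a stand-in for the cloud and batch-system APIs):

    ```python
    # Sketch: match running virtual worker nodes to queued batch demand.
    JOBS_PER_NODE = 8

    def required_nodes(queued_jobs):
        """Ceiling of queued jobs over per-node job slots."""
        return -(-queued_jobs // JOBS_PER_NODE)

    class ToyCloud:
        """Stand-in for a cloud API: just counts virtual worker nodes."""
        def __init__(self):
            self.nodes = 0
        def boot(self, k):
            self.nodes += k
        def drain(self, k):
            self.nodes -= k

    def reconcile(queued_jobs, cloud):
        """Boot or drain nodes so capacity matches queued demand."""
        target = required_nodes(queued_jobs)
        if target > cloud.nodes:
            cloud.boot(target - cloud.nodes)
        elif target < cloud.nodes:
            cloud.drain(cloud.nodes - target)

    cloud = ToyCloud()
    for queued in [0, 30, 30, 5, 0]:     # simulated batch-queue samples
        reconcile(queued, cloud)
        print(queued, 'queued ->', cloud.nodes, 'nodes')
    ```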

  15. On Real-Time Systems Using Local Area Networks.

    Science.gov (United States)

    1987-07-01

    Technical report 87-35 (CS-TR-1892), July 1987: On Real-Time Systems Using Local Area Networks, by Shem-Tov Levi and Satish K. Tripathi, Department of Computer Science. The report discusses constraints and the clock systems that feed the time to real-time systems; a model for real-time systems based on LAN communication is presented.

  16. Structure-aware Local Sparse Coding for Visual Tracking

    KAUST Repository

    Qi, Yuankai

    2018-01-24

    Sparse coding has been applied to visual tracking and related vision problems with demonstrated success in recent years. Existing tracking methods based on local sparse coding sample patches from a target candidate and sparsely encode these using a dictionary consisting of patches sampled from target template images. The discriminative strength of existing methods based on local sparse coding is limited as spatial structure constraints among the template patches are not exploited. To address this problem, we propose a structure-aware local sparse coding algorithm which encodes a target candidate using templates with both global and local sparsity constraints. For robust tracking, we show that local regions of a candidate should be encoded only with the corresponding local regions of the target templates that are the most similar from the global view. Thus, a more precise and discriminative sparse representation is obtained to account for appearance changes. To alleviate the issues with tracking drift, we design an effective template update scheme. Extensive experiments on challenging image sequences demonstrate the effectiveness of the proposed algorithm against numerous state-of-the-art methods.
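
    The basic local sparse-coding building block can be sketched as follows (the paper's structure-aware global/local constraints are omitted, and the dictionary and patch below are synthetic):

    ```python
    # Sketch: encode a candidate patch over a dictionary of template patches
    # with a hard sparsity constraint via orthogonal matching pursuit.
    import numpy as np
    from sklearn.linear_model import orthogonal_mp

    rng = np.random.default_rng(0)
    D = rng.normal(size=(64, 20))          # 20 template patches (8x8, flattened)
    D /= np.linalg.norm(D, axis=0)         # unit-norm dictionary atoms

    patch = D[:, 3] * 0.9 + 0.05 * rng.normal(size=64)   # candidate patch
    code = orthogonal_mp(D, patch, n_nonzero_coefs=3)    # sparse code

    recon_error = np.linalg.norm(D @ code - patch)
    print(np.nonzero(code)[0], round(float(recon_error), 4))
    ```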

  17. Image-based metal artifact reduction in x-ray computed tomography utilizing local anatomical similarity

    Science.gov (United States)

    Dong, Xue; Yang, Xiaofeng; Rosenfield, Jonathan; Elder, Eric; Dhabaan, Anees

    2017-03-01

    X-ray computed tomography (CT) has been widely used in radiation therapy treatment planning in recent years. However, metal implants such as dental fillings and hip prostheses can cause severe bright and dark streaking artifacts in reconstructed CT images. These artifacts decrease image contrast and degrade HU accuracy, leading to inaccuracies in target delineation and dose calculation. In this work, a metal artifact reduction method is proposed based on the intrinsic anatomical similarity between neighboring CT slices. Neighboring CT slices from the same patient exhibit similar anatomical features. Exploiting this anatomical similarity, a gamma map is calculated as a weighted summation of relative HU error and distance error for each pixel in an artifact-corrupted CT image relative to a neighboring, artifact-free image. The minimum value in the gamma map for each pixel is used to identify an appropriate pixel from the artifact-free CT slice to replace the corresponding artifact-corrupted pixel. With the proposed method, the mean CT HU error was reduced from 360 HU and 460 HU to 24 HU and 34 HU on head and pelvis CT images, respectively. Dose calculation accuracy also improved, as the dose difference was reduced from greater than 20% to less than 4%. Using 3%/3mm criteria, the gamma analysis failure rate was reduced from 23.25% to 0.02%. An image-based metal artifact reduction method is proposed that replaces corrupted image pixels with pixels from neighboring CT slices free of metal artifacts. This method is shown to be capable of suppressing streaking artifacts, thereby improving HU and dose calculation accuracy.
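
    A simplified per-pixel version of the gamma-map replacement might look as follows (window size and weights are illustrative choices, not the paper's values):

    ```python
    # Sketch: for one corrupted pixel, search a window in a neighbouring
    # artifact-free slice, score candidates by weighted HU and distance
    # errors, and copy the candidate with the minimum gamma value.
    import numpy as np

    def correct_pixel(corrupt, clean, i, j, win=3, w_hu=1.0, w_dist=0.2):
        best, best_val = clean[i, j], np.inf
        for di in range(-win, win + 1):
            for dj in range(-win, win + 1):
                u, v = i + di, j + dj
                if not (0 <= u < clean.shape[0] and 0 <= v < clean.shape[1]):
                    continue
                hu_err = abs(clean[u, v] - corrupt[i, j]) / 1000.0  # relative HU
                dist_err = np.hypot(di, dj) / win                   # distance
                gamma = w_hu * hu_err + w_dist * dist_err
                if gamma < best_val:
                    best_val, best = gamma, clean[u, v]
        return best

    corrupt = np.full((8, 8), 100.0); corrupt[4, 4] = 2500.0  # streak pixel
    clean = np.full((8, 8), 110.0)                            # neighbouring slice
    corrupt[4, 4] = correct_pixel(corrupt, clean, 4, 4)
    print(corrupt[4, 4])
    ```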

  18. Quality of Governance and Local Development: The Case of Top Nine Performing Local Government Units in the Philippines

    Directory of Open Access Journals (Sweden)

    MA. NIÑA I. ADRIANO

    2014-08-01

    Full Text Available There is a large body of literature that studies the link between good governance and development at the country level. However, only a few studies have examined the same question in the local government unit (LGU) setting. This study attempts to establish the relationship between the quality of governance and the state of local development of the Top 9 Performing LGUs in the Philippines (La Union, Albay, Cavite, Ilocos Norte, Makati City, Valenzuela City, Taguig City, Davao City and Angeles City) as measured by the Local Governance Performance Management System (LGPMS), the nationwide governance performance evaluation and management tool used in the Philippines. I used the data generated by the LGPMS, particularly the state of local governance and the state of local development, to see if there is a relationship between the two variables using Spearman's correlation coefficient. Results revealed that there is no relationship between the quality of governance and the state of local development in the consistently top performing LGUs in the Philippines for the period 2009-2011. The findings of this study will be useful to government officials such as public administrators, LGU executives, policy makers, researchers, and students of public administration in addressing the issue of good governance and local development in their respective LGUs.

  19. HBC-Evo: predicting human breast cancer by exploiting amino acid sequence-based feature spaces and evolutionary ensemble system.

    Science.gov (United States)

    Majid, Abdul; Ali, Safdar

    2015-01-01

    We developed a genetic programming (GP)-based evolutionary ensemble system for the early diagnosis, prognosis and prediction of human breast cancer. The system effectively exploits diversity in the feature and decision spaces. First, individual learners are trained in different feature spaces using physicochemical properties of protein amino acids. Their predictions are then stacked to develop the best solution during the GP evolution process. Finally, results for the HBC-Evo system are obtained with an optimal threshold, which is computed using particle swarm optimization. Our novel approach has demonstrated promising results compared to state-of-the-art approaches.

  20. The concept of exploitation in international human trafficking law

    OpenAIRE

    von der Pütten, Tuija Kaarina

    2017-01-01

    Human trafficking is commonly known as a criminal practice that takes place in the framework of sex trade: women and children are trafficked within a state, or from one state to another, for the purpose of sexual exploitation. Similarly, the early 20th century international conventions aimed to tackle ‘white slave traffic’, trafficking of women and children for sexual exploitation. However, it is misleading to see trafficking only within this context. People are trafficked so that they can be...

  1. A specialized ODE integrator for the efficient computation of parameter sensitivities

    Directory of Open Access Journals (Sweden)

    Gonnet Pedro

    2012-05-01

    Full Text Available Abstract Background Dynamic mathematical models in the form of systems of ordinary differential equations (ODEs) play an important role in systems biology. For any sufficiently complex model, the speed and accuracy of solving the ODEs by numerical integration are critical. This applies especially to systems identification problems where the parameter sensitivities must be integrated alongside the system variables. Although several very good general purpose ODE solvers exist, few of them compute the parameter sensitivities automatically. Results We present a novel integration algorithm that is based on second derivatives and contains other unique features such as improved error estimates. These features allow the integrator to take larger time steps than other methods. In practical applications, i.e. systems biology models of different sizes and behaviors, the method competes well with established integrators in solving the system equations, and it outperforms them significantly when local parameter sensitivities are evaluated. For ease-of-use, the solver is embedded in a framework that automatically generates the integrator input from an SBML description of the system of interest. Conclusions For future applications, comparatively ‘cheap’ parameter sensitivities will enable advances in solving large, otherwise computationally expensive parameter estimation and optimization problems. More generally, we argue that substantially better computational performance can be achieved by exploiting characteristics specific to the problem domain; elements of our methods such as the error estimation could find broader use in other, more general numerical algorithms.
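
    A minimal sketch of the generic forward-sensitivity approach (not the paper's second-derivative integrator): for dx/dt = -k*x, the sensitivity s = dx/dk obeys ds/dt = -x - k*s, so state and sensitivity can be integrated together and checked against the analytic value:

    ```python
    # Sketch: augment the state with its parameter sensitivity and integrate
    # both together with a standard ODE solver.
    import numpy as np
    from scipy.integrate import solve_ivp

    def rhs(t, y, k):
        x, s = y
        return [-k * x, -x - k * s]

    k, x0 = 0.5, 2.0
    sol = solve_ivp(rhs, (0.0, 4.0), [x0, 0.0], args=(k,),
                    rtol=1e-10, atol=1e-12)

    t_end = sol.t[-1]
    analytic = -t_end * x0 * np.exp(-k * t_end)   # exact dx/dk at t_end
    print(sol.y[1, -1], analytic)
    ```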

  2. Assessing uncertainty and risk in exploited marine populations

    International Nuclear Information System (INIS)

    Fogarty, M.J.; Mayo, R.K.; O'Brien, L.; Serchuk, F.M.; Rosenberg, A.A.

    1996-01-01

    The assessment and management of exploited fish and invertebrate populations is subject to several types of uncertainty. This uncertainty translates into risk to the population in the development and implementation of fishery management advice. Here, we define risk as the probability that exploitation rates will exceed a threshold level where long term sustainability of the stock is threatened. We distinguish among several sources of error or uncertainty due to (a) stochasticity in demographic rates and processes, particularly in survival rates during the early life stages; (b) measurement error resulting from sampling variation in the determination of population parameters or in model estimation; and (c) the lack of complete information on population and ecosystem dynamics. The first represents a form of aleatory uncertainty while the latter two factors represent forms of epistemic uncertainty. To illustrate these points, we evaluate the recent status of the Georges Bank cod stock in a risk assessment framework. Short term stochastic projections are made accounting for uncertainty in population size and for random variability in the number of young surviving to enter the fishery. We show that recent declines in this cod stock can be attributed to exploitation rates that have substantially exceeded sustainable levels
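
    The risk definition above lends itself to a stylized Monte Carlo sketch (all numbers are invented): project the stock forward under uncertain initial abundance and random recruitment, and report the probability that the realized exploitation rate exceeds a threshold:

    ```python
    # Sketch: short-term stochastic projection of an exploited stock.
    import numpy as np

    rng = np.random.default_rng(1)
    n_sim, years = 10_000, 3
    catch, u_threshold = 20_000.0, 0.25      # planned catch (t), threshold rate

    N = rng.lognormal(mean=np.log(100_000), sigma=0.3, size=n_sim)  # survey error
    exceed = np.zeros(n_sim, dtype=bool)

    for _ in range(years):
        u = np.minimum(catch / N, 1.0)       # realized exploitation rate
        exceed |= u > u_threshold
        recruits = rng.lognormal(mean=np.log(10_000), sigma=0.6, size=n_sim)
        N = np.maximum((N - catch) * 0.9 + recruits, 1.0)  # survival + recruits

    print('risk =', exceed.mean())
    ```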

  3. Commercial sexual exploitation and sex trafficking of adolescents.

    Science.gov (United States)

    Chung, Richard J; English, Abigail

    2015-08-01

    This review describes the current state of commercial sexual exploitation and sex trafficking of adolescents in the United States and globally, the legal and health implications of this severe form of abuse, and the roles that pediatric and adolescent healthcare providers can play in addressing this issue. Although this form of exploitation and abuse is shrouded in secrecy, pediatric and adolescent healthcare providers are well positioned to respond when it arises. However, awareness and understanding of the issue are generally lacking among healthcare professionals, currently limiting their effectiveness in combating this problem. Although the empirical evidence base available to guide clinical care of victims of trafficking remains limited given the secretive nature of the abuse, important contributions to the multidisciplinary literature on this issue have been made in recent years, including the Institute of Medicine's landmark report in the United States. Commercial sexual exploitation and sex trafficking of adolescents represent a human rights tragedy that remains inadequately addressed. As preeminent advocates for the health and well-being of adolescents, pediatric and adolescent healthcare providers can play a crucial role in advancing efforts not only to intervene but also to prevent further victimization of vulnerable youth.

  4. Verification of Electromagnetic Physics Models for Parallel Computing Architectures in the GeantV Project

    Energy Technology Data Exchange (ETDEWEB)

    Amadio, G.; et al.

    2017-11-22

    An intensive R&D and programming effort is required to accomplish new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting in parallel particles in complex geometries exploiting instruction level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both the multi-core and the many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics model effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.
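
    One such automated check can be sketched as a two-sample Kolmogorov-Smirnov test between a reference model and its vectorized counterpart (the samples below are synthetic stand-ins for the physics distributions, not GeantV output):

    ```python
    # Sketch: flag statistically significant disagreement between samples
    # drawn from a reference model and a vectorized reimplementation.
    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(42)
    reference = rng.exponential(scale=1.00, size=50_000)   # reference stand-in
    vectorized = rng.exponential(scale=1.01, size=50_000)  # vectorized stand-in

    stat, p_value = ks_2samp(reference, vectorized)
    print(f'KS statistic = {stat:.4f}, p = {p_value:.3g}')
    if p_value < 0.01:
        print('models disagree: investigate the vectorized sampling')
    ```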

  5. Promising role of single photon emission computed tomography/computed tomography in Meckel's scan

    International Nuclear Information System (INIS)

    Jain, Anurag; Chauhan, MS; Pandit, AG; Kumar, Rajeev; Sharma, Amit

    2012-01-01

    Meckel's scan is a common procedure performed in nuclear medicine. Single-photon emission computed tomography/computed tomography (SPECT/CT) in a suspected case of heterotopic location of gastric mucosa can increase the accuracy of its anatomic localization. We present two suspected cases of Meckel's diverticulum in, which SPECT/CT co-registration has helped in better localization of the pathology

  6. Programming Unconventional Computers: Dynamics, Development, Self-Reference

    Directory of Open Access Journals (Sweden)

    Susan Stepney

    2012-10-01

    Full Text Available Classical computing has well-established formalisms for specifying, refining, composing, proving, and otherwise reasoning about computations. These formalisms have matured over the past 70 years or so. Unconventional Computing includes the use of novel kinds of substrates–from black holes and quantum effects, through to chemicals, biomolecules, even slime moulds–to perform computations that do not conform to the classical model. Although many of these unconventional substrates can be coerced into performing classical computation, this is not how they “naturally” compute. Our ability to exploit unconventional computing is partly hampered by a lack of corresponding programming formalisms: we need models for building, composing, and reasoning about programs that execute in these substrates. What might, say, a slime mould programming language look like? Here I outline some of the issues and properties of these unconventional substrates that need to be addressed to find “natural” approaches to programming them. Important concepts include embodied real values, processes and dynamical systems, generative systems and their meta-dynamics, and embodied self-reference.

  7. Exploitation of linkage learning in evolutionary algorithms

    CERN Document Server

    Chen, Ying-ping

    2010-01-01

    The exploitation of linkage learning is enhancing the performance of evolutionary algorithms. This monograph examines recent progress in linkage learning, with a series of focused technical chapters that cover developments and trends in the field.

  8. Perspective: Memcomputing: Leveraging memory and physics to compute efficiently

    Science.gov (United States)

    Di Ventra, Massimiliano; Traversa, Fabio L.

    2018-05-01

    It is well known that physical phenomena may be of great help in computing some difficult problems efficiently. A typical example is prime factorization that may be solved in polynomial time by exploiting quantum entanglement on a quantum computer. There are, however, other types of (non-quantum) physical properties that one may leverage to compute efficiently a wide range of hard problems. In this perspective, we discuss how to employ one such property, memory (time non-locality), in a novel physics-based approach to computation: Memcomputing. In particular, we focus on digital memcomputing machines (DMMs) that are scalable. DMMs can be realized with non-linear dynamical systems with memory. The latter property allows the realization of a new type of Boolean logic, one that is self-organizing. Self-organizing logic gates are "terminal-agnostic," namely, they do not distinguish between the input and output terminals. When appropriately assembled to represent a given combinatorial/optimization problem, the corresponding self-organizing circuit converges to the equilibrium points that express the solutions of the problem at hand. In doing so, DMMs take advantage of the long-range order that develops during the transient dynamics. This collective dynamical behavior, reminiscent of a phase transition, or even the "edge of chaos," is mediated by families of classical trajectories (instantons) that connect critical points of increasing stability in the system's phase space. The topological character of the solution search renders DMMs robust against noise and structural disorder. Since DMMs are non-quantum systems described by ordinary differential equations, not only can they be built in hardware with the available technology, they can also be simulated efficiently on modern classical computers. As an example, we will show the polynomial-time solution of the subset-sum problem for the worst cases, and point to other types of hard problems where simulations of DMMs

  9. Effect of Gender on Students' Academic Performance in Computer Studies in Secondary Schools in New Bussa, Borgu Local Government of Niger State

    Science.gov (United States)

    Adigun, Joseph; Onihunwa, John; Irunokhai, Eric; Sada, Yusuf; Adesina, Olubunmi

    2015-01-01

    This research studied the relationship between students' gender and academic performance in computer science in New Bussa, Borgu local government of Niger state. A questionnaire which consisted of 30 multiple-choice items drawn from Senior School Certificate Examination past questions as set by the West Africa Examination Council in 2014 multiple…

  10. Annex 3. Formation of a farm holding in Homs

    OpenAIRE

    2015-01-01

    ‘Abd al-Laṭīf, born in 1925, is a gardener in Homs. With his eldest son and the help of two of his grandsons, he works a garden of some 35 dunum (3.5 hectares) located in the zūr al-Ǧdīdeh, the central sector of the agricultural zone. This large holding, made up of several parcels with different statuses, is partly the product of a family history whose main aspects I would like to present briefly here. Figure 50 – Matrimonial alliances between the lineage of ‘Abbās and…

  11. Open pit coal exploitation viability. Margarita mine. Case of study

    International Nuclear Information System (INIS)

    Veloza, Julia; Molina, Jorge; Mejia, Humberto

    2006-01-01

    This paper provides an analysis of financial viability, planning and design for the new open pit coal exploitation of the La Margarita mine, with coal resources estimated at 440,139.7 t. Dimensioning, design and economic evaluation were carried out for three exploitation methods (multiple bench, open cast contour, and terraces). Net present values (NPV) were calculated as $c 817.5, $c 518.5 and $c 645.2 respectively for each method (given in millions of current Colombian pesos; $c 2,380 are equivalent to $us 1), with rates of return (ROR) of 78.33%, 34.0% and 38.62% respectively. These indicators served as the basis for choosing the multiple bench method, which had to be recalculated because it was necessary to work jointly with two pits to make the project feasible. In addition, a general environmental evaluation was done, which is vital for the exploitation. Significant impacts on flora, fauna, air and water were found, and measures of control, prevention and mitigation were stated. It is expected that this paper can be useful as technical-economic support for the development of the open pit exploitation at the La Margarita mine.

  12. Challenges in authorization of exploration and exploitation of radioactive minerals in Slovakia

    International Nuclear Information System (INIS)

    Janova, V.; Turner, M.

    2014-01-01

    Slovakia has a long tradition in the peaceful use of nuclear energy, dating back to the 1950s. Uranium exploitation started in parallel with the development of nuclear power. Whereas the development of nuclear power continued without interruption, uranium exploitation was suspended during the political and economic restructuring, until 2005. There is also a difference in the acceptance by local authorities of nuclear power compared with uranium exploration and exploitation. The permitting of exploration activities in the Slovak Republic is regulated by the geological law and falls within the competence of the Ministry of Environment. Radioactive minerals are considered exclusive minerals, and an applicant may survey them only within a designated exploration territory. The exploration area is determined for four years and can then be extended for additional periods. The opinions of the affected municipalities and self-governing regions (which reflect the compliance of the geological project with the objectives and priorities of the economic and social development programs and with the land-use planning documentation) have to be submitted with the application. The affected municipality and the self-governing region are also parties to the administrative procedure for the designation, change or cancellation of an exploration area for radioactive minerals, and they have a right of veto. In case the affected municipality or self-governing region disagrees with the proposal, the Ministry of Environment recommends a modification of the proposed exploration area. Extraction of minerals is regulated by the mining act, which falls under the competency of the Ministry of Economy. The right to mine exclusive deposits belongs to the organization that has obtained a mining license and to which a mining area has been determined. A preferential right to the determination of a mining area belongs to the company for which the exploration area was determined and by which the research was carried out

  13. Deoxyglucose method for the estimation of local myocardial glucose metabolism with positron computed tomography

    International Nuclear Information System (INIS)

    Ratib, O.; Phelps, M.E.; Huang, S.C.; Henze, E.; Selin, C.E.; Schelbert, H.R.

    1981-01-01

    The deoxyglucose method, originally developed for measurements of the local cerebral metabolic rate for glucose, has been investigated in terms of its application to studies of the heart with positron computed tomography (PCT) and FDG. Studies were performed in dogs to measure the tissue kinetics of FDG with PCT and by direct arterial-venous sampling. The operational equation developed in our laboratory as an extension of the Sokoloff model was used to analyze the data. The FDG method accurately predicted the true local myocardial metabolic rate for glucose (MMRGlc) even when the glucose metabolic rate was normal but myocardial blood flow (MBF) was elevated 5 times the control value, or when metabolism was reduced to 10% of normal and MBF increased 5 times normal. Improvements in PCT resolution are required to improve the accuracy of the estimates of the rate constants and the MMRGlc.
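
    For reference, the steady-state form underlying the deoxyglucose method can be written as MMRGlc = (Cp/LC) · K1·k3/(k2 + k3); the sketch below uses hypothetical rate constants and a hypothetical lumped constant LC, and omits the measured tissue-activity terms of the full operational equation:

    ```python
    # Sketch: steady-state metabolic rate from FDG rate constants.
    def mmr_glc(cp, K1, k2, k3, lumped_constant):
        """cp: plasma glucose; K1 (uptake), k2 (efflux), k3 (phosphorylation):
        FDG rate constants; lumped_constant maps FDG kinetics to glucose."""
        return (cp / lumped_constant) * (K1 * k3) / (k2 + k3)

    # Hypothetical myocardial values, for illustration only
    print(mmr_glc(cp=90.0, K1=0.6, k2=1.2, k3=0.1, lumped_constant=0.67))
    ```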

  14. Enhanced delegated computing using coherence

    Science.gov (United States)

    Barz, Stefanie; Dunjko, Vedran; Schlederer, Florian; Moore, Merritt; Kashefi, Elham; Walmsley, Ian A.

    2016-03-01

    A longstanding question is whether it is possible to delegate computational tasks securely—such that neither the computation nor the data is revealed to the server. Recently, both a classical and a quantum solution to this problem were found [C. Gentry, in Proceedings of the 41st Annual ACM Symposium on the Theory of Computing (Association for Computing Machinery, New York, 2009), pp. 167-178; A. Broadbent, J. Fitzsimons, and E. Kashefi, in Proceedings of the 50th Annual Symposium on Foundations of Computer Science (IEEE Computer Society, Los Alamitos, CA, 2009), pp. 517-526]. Here, we study the first step towards the interplay between classical and quantum approaches and show how coherence can be used as a tool for secure delegated classical computation. We show that a client with limited computational capacity—restricted to an XOR gate—can perform universal classical computation by manipulating information carriers that may occupy superpositions of two states. Using single photonic qubits or coherent light, we experimentally implement secure delegated classical computations between an independent client and a server, which are installed in two different laboratories and separated by 50 m. The server has access to the light sources and measurement devices, whereas the client may use only a restricted set of passive optical devices to manipulate the information-carrying light beams. Thus, our work highlights how minimal quantum and classical resources can be combined and exploited for classical computing.

  15. Multiscale Computation. Needs and Opportunities for BER Science

    Energy Technology Data Exchange (ETDEWEB)

    Scheibe, Timothy D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, Jeremy C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-01-01

    The Environmental Molecular Sciences Laboratory (EMSL), a scientific user facility managed by Pacific Northwest National Laboratory for the U.S. Department of Energy, Office of Biological and Environmental Research (BER), conducted a one-day workshop on August 26, 2014 on the topic of “Multiscale Computation: Needs and Opportunities for BER Science.” Twenty invited participants, from various computational disciplines within the BER program research areas, were charged with the following objectives: identify BER-relevant models and their potential cross-scale linkages that could be exploited to better connect molecular-scale research to BER research at larger scales; and identify critical science directions that will motivate EMSL decisions regarding future computational (hardware and software) architectures.

  16. Hackers Heroes of the Computer Revolution - 25th Anniversary Edition

    CERN Document Server

    Levy, Steven

    2010-01-01

    This 25th anniversary edition of Steven Levy's classic book traces the exploits of the computer revolution's original hackers -- those brilliant and eccentric nerds from the late 1950s through the early '80s who took risks, bent the rules, and pushed the world in a radical new direction. With updated material from noteworthy hackers such as Bill Gates, Mark Zuckerberg, Richard Stallman, and Steve Wozniak, Hackers is a fascinating story that begins in early computer research labs and leads to the first home computers. Levy profiles the imaginative brainiacs who found clever and unorthodox sol

  17. A Human-Centred Tangible approach to learning Computational Thinking

    Directory of Open Access Journals (Sweden)

    Tommaso Turchi

    2016-08-01

    Full Text Available Computational Thinking has recently become a focus of many teaching and research domains; it encapsulates those thinking skills integral to solving complex problems using a computer, thus being widely applicable in our society. It is influencing research across many disciplines and also coming into the limelight of education, mostly thanks to public initiatives such as the Hour of Code. In this paper we present our arguments for promoting Computational Thinking in education through the Human-centred paradigm of Tangible End-User Development, namely by exploiting objects whose interactions with the physical environment are mapped to digital actions performed on the system.

  18. Evolving ATLAS Computing For Today’s Networks

    CERN Document Server

    Campana, S; The ATLAS collaboration; Jezequel, S; Negri, G; Serfon, C; Ueda, I

    2012-01-01

    The ATLAS computing infrastructure was designed many years ago based on the assumption of rather limited network connectivity between computing centres. ATLAS sites have been organized in a hierarchical model, where only a static subset of all possible network links can be exploited and a static subset of well connected sites (CERN and the T1s) can cover important functional roles such as hosting master copies of the data. The pragmatic adoption of such a simplified approach, compared with a more relaxed scenario interconnecting all sites, was very beneficial during the commissioning of the ATLAS distributed computing system and essential in reducing the operational cost during the first two years of LHC data taking. In the meantime, networks evolved far beyond this initial scenario: while a few countries are still poorly connected with the rest of the WLCG infrastructure, most of the ATLAS computing centres are now efficiently interlinked. Our operational experience in running the computing infrastructure in ...

  19. DETERMINATION OF OPTIMAL CONTOURS OF OPEN PIT MINE DURING OIL SHALE EXPLOITATION, BY MINEX 5.2.3. PROGRAM

    Directory of Open Access Journals (Sweden)

    Miroslav Ignjatović

    2013-04-01

    Full Text Available By examining and determining the optimal solution for the technological processes of exploitation and processing of oil shale from the Aleksinac site, and with the adopted technical solution for oil shale exploitation, a technical solution was derived that optimizes the contour of the newly defined open pit mine. Worldwide, this problem is solved by using computer programs that have become the established standard for quick and efficient solutions to this problem. One such program, which can be used to determine the optimal contours of open pit mines, is Minex 5.2.3, produced by the Surpac Minex Group Pty Ltd Company in Australia and applied at the Mining and Metallurgy Institute Bor (license nos. SSI-24765 and SSI-24766). In this study, the authors performed 11 optimizations of the deposit geo-model in Minex 5.2.3, based on test results obtained in the soil mechanics laboratory of the Mining and Metallurgy Institute Bor on samples from the Aleksinac deposit.

  20. Cultural Work as a Site of Struggle: Freelancers and Exploitation

    Directory of Open Access Journals (Sweden)

    Nicole S. Cohen

    2012-05-01

    Full Text Available This paper argues that Marxist political economy is a useful framework for understanding contemporary conditions of cultural work. Drawing on Karl Marx’s foundational concepts, labour process theory, and a case study of freelance writers, I argue that the debate over autonomy and control in cultural work ignores exploitation in labour-capital relationships, which is a crucial process shaping cultural work. To demonstrate the benefits of this approach, I discuss two methods media firms use to extract surplus value from freelance writers: exploitation of unpaid labour time and exploitation of intellectual property through aggressive copyright regimes. I argue that a Marxist perspective can uncover the dynamics that are transforming cultural industries and workers’ experiences. From this perspective, cultural work is understood as a site of struggle.