WorldWideScience

Sample records for open distributed processing

  1. Unifying Distributed Processing and Open Hypertext through a Heterogeneous Communication Model

    OpenAIRE

    Goose, Stuart; Dale, Jonathan; Hill, Gary J.; DeRoure, David C.; Hall, Wendy

    1995-01-01

    A successful distributed open hypermedia system can be characterised by a scaleable architecture which is inherently distributed. While the architects of distributed hypermedia systems have addressed the issues of providing and retrieving distributed resources, they have often neglected to design systems with the inherent capability to exploit the distributed processing of this information. The research presented in this paper describes the construction and use of an open hypermedia system co...

  2. OpenCluster: A Flexible Distributed Computing Framework for Astronomical Data Processing

    Science.gov (United States)

    Wei, Shoulin; Wang, Feng; Deng, Hui; Liu, Cuiyin; Dai, Wei; Liang, Bo; Mei, Ying; Shi, Congming; Liu, Yingbo; Wu, Jingping

    2017-02-01

    The volume of data generated by modern astronomical telescopes is extremely large and rapidly growing. However, current high-performance data processing architectures/frameworks are not well suited for astronomers because of their limitations and programming difficulties. In this paper, we therefore present OpenCluster, an open-source distributed computing framework to support rapidly developing high-performance processing pipelines of astronomical big data. We first detail the OpenCluster design principles and implementations and present the APIs facilitated by the framework. We then demonstrate a case in which OpenCluster is used to resolve complex data processing problems for developing a pipeline for the Mingantu Ultrawide Spectral Radioheliograph. Finally, we present our OpenCluster performance evaluation. Overall, OpenCluster provides not only high fault tolerance and simple programming interfaces, but also a flexible means of scaling up the number of interacting entities. OpenCluster thereby provides an easily integrated distributed computing framework for quickly developing a high-performance data processing system of astronomical telescopes and for significantly reducing software development expenses.
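
    A minimal, hypothetical sketch of the master/worker pattern that such a distributed pipeline framework provides, written against Python's standard multiprocessing module; the function and names below are illustrative placeholders and are not OpenCluster's actual API.

    ```python
    # Illustrative sketch only: a minimal master/worker pipeline in the spirit of a
    # distributed processing framework. This is NOT the OpenCluster API; names and
    # structure are hypothetical stand-ins using Python's standard library.
    from multiprocessing import Pool

    def calibrate_frame(frame_id):
        """Placeholder worker task, e.g. calibrating one raw telescope frame."""
        # a real pipeline would read the frame, apply calibration, write results
        return frame_id, "calibrated"

    if __name__ == "__main__":
        frame_ids = range(100)            # pretend these are raw data frames
        with Pool(processes=8) as pool:   # local workers stand in for cluster nodes
            for frame_id, status in pool.imap_unordered(calibrate_frame, frame_ids):
                print(frame_id, status)
    ```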

  3. Innovation as a distributed, collaborative process of knowledge generation: open, networked innovation

    NARCIS (Netherlands)

    Sloep, Peter

    2009-01-01

    Sloep, P. B. (2009). Innovation as a distributed, collaborative process of knowledge generation: open, networked innovation. In V. Hornung-Prähauser & M. Luckmann (Eds.), Kreativität und Innovationskompetenz im digitalen Netz - Creativity and Innovation Competencies in the Web, Sammlung von ausgewäh

  6. Selected papers from Middleware'98: The IFIP International Conference on Distributed Systems Platforms and Open Distributed Processing

    Science.gov (United States)

    Davies, Nigel; Raymond, Kerry; Blair, Gordon

    1999-03-01

    In recent years the distributed systems community has witnessed a growth in the number of conferences, leading to difficulties in tracking the literature and a consequent loss of awareness of work done by others in this important research domain. In an attempt to synthesize many of the smaller workshops and conferences in the field, and to bring together research communities which were becoming fragmented, IFIP staged Middleware'98: The IFIP International Conference on Distributed Systems Platforms and Open Distributed Processing. The conference was widely publicized and attracted over 150 technical submissions including 135 full paper submissions. The final programme consisted of 28 papers, giving an acceptance ratio of a little over one in five. More crucially, the programme accurately reflected the state of the art in middleware research, addressing issues such as ORB architectures, engineering of large-scale systems and multimedia. The traditional role of middleware as a point of integration and service provision was clearly intact, but the programme stressed the importance of emerging `must-have' features such as support for extensibility, mobility and quality of service. The Middleware'98 conference was held in the Lake District, UK in September 1998. Over 160 delegates made the journey to one of the UK's most beautiful regions and contributed to a lively series of presentations and debates. A permanent record of the conference, including transcripts of the panel discussions which took place, is available at: http://www.comp.lancs.ac.uk/computing/middleware98/ Based on their original reviews and the reactions of delegates to the ensuing presentations we have selected six papers from the conference for publication in this special issue of Distributed Systems Engineering. The first paper, entitled `Jonathan: an open distributed processing environment in Java', by Dumant et al describes a minimal, modular ORB framework which can be used for supporting real

  7. Open-ocean convection process: A driver of the winter nutrient supply and the spring phytoplankton distribution in the Northwestern Mediterranean Sea

    Science.gov (United States)

    Severin, Tatiana; Kessouri, Faycal; Rembauville, Mathieu; Sánchez-Pérez, Elvia Denisse; Oriol, Louise; Caparros, Jocelyne; Pujo-Pay, Mireille; Ghiglione, Jean-François; D'Ortenzio, Fabrizio; Taillandier, Vincent; Mayot, Nicolas; Durrieu De Madron, Xavier; Ulses, Caroline; Estournel, Claude; Conan, Pascal

    2017-06-01

    This study was a part of the DeWEX project (Deep Water formation Experiment), designed to better understand the impact of dense water formation on the marine biogeochemical cycles. Here, nutrient and phytoplankton vertical and horizontal distributions were investigated during a deep open-ocean convection event and during the following spring bloom in the Northwestern Mediterranean Sea (NWM). In February 2013, the deep convection event established a surface nutrient gradient from the center of the deep convection patch to the surrounding mixed and stratified areas. In the center of the convection area, a slight but significant difference in nitrate, phosphate and silicate concentrations was observed, possibly due to the different volume of deep waters included in the mixing or to the sediment resuspension occurring where the mixing reached the bottom. One of these processes, or a combination of both, enriched the water column in silicate and phosphate, and significantly altered the stoichiometry in the center of the deep convection area. This alteration favored the local development of microphytoplankton in spring, while nanophytoplankton dominated neighboring locations where the convection reached the deep layer but not the bottom. This study shows that the convection process influences both the winter nutrient distribution and the spring phytoplankton distribution and community structure. Modifications of the convection's spatial scale and intensity (i.e., convective mixing depth) are likely to have strong consequences on phytoplankton community structure and distribution in the NWM, and thus on the marine food web. Plain Language Summary: The deep open-ocean convection in the Northwestern Mediterranean Sea is an important process for the formation and the circulation of the deep waters of the entire Mediterranean Sea, but also for the local spring phytoplankton bloom. In this study, we showed that variations of the convective mixing depth induced different supply in nitrate

  8. Interactive Parallel and Distributed Processing

    DEFF Research Database (Denmark)

    Lund, Henrik Hautop; Pagliarini, Luigi

    2010-01-01

    We present the concept of interactive parallel and distributed processing, and the challenges that programmers face in designing interactive parallel and distributed systems. Specifically, we introduce the challenges that are met and the decisions that need to be taken with respect to distributedness, master dependency, software behavioural models, adaptive interactivity, feedback, connectivity, topology, island modeling, and user interaction. We introduce the system of modular interactive tiles as a tool for easy, fast, and flexible exploration of these issues, and through examples show how to implement interactive parallel and distributed processing with different software behavioural models such as open loop, randomness based, rule based, user interaction based, AI and ALife based software.
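
    The three behavioural models named above (open loop, randomness based, rule based) can be illustrated with a small, hypothetical sketch; this is not the MITS codebase, and the tile state and colour outputs below are invented purely for illustration.

    ```python
    # Hypothetical sketch of the behavioural models named in the abstract
    # (open loop, randomness-based, rule-based); not the actual MITS software.
    import random

    def open_loop(tile_state, step):
        # Fixed pattern, ignores sensor input entirely
        return ["red", "green", "blue"][step % 3]

    def randomness_based(tile_state, step):
        # Colour chosen at random each step
        return random.choice(["red", "green", "blue"])

    def rule_based(tile_state, step):
        # React to a (simulated) sensor reading on the tile
        return "green" if tile_state.get("pressed") else "red"

    tiles = [{"pressed": bool(i % 2)} for i in range(4)]
    for step in range(3):
        for behaviour in (open_loop, randomness_based, rule_based):
            colours = [behaviour(t, step) for t in tiles]
            print(behaviour.__name__, colours)
    ```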

  9. Design of a Free and Open Source Data Processing, Archiving, and Distribution Subsystem for the Ground Receiving Station of the Philippine Scientific Earth Observation Micro-Satellite

    Science.gov (United States)

    Aranas, R. K. D.; Jiao, B. J. D.; Magallon, B. J. P.; Ramos, M. K. F.; Amado, J. A.; Tamondong, A. M.; Tupas, M. E. A.

    2016-06-01

    The Philippines's PHL-Microsat program aims to launch its first earth observation satellite, DIWATA, on the first quarter of 2016. DIWATA's payload consists of a high-precision telescope (HPT), spaceborne multispectral imager (SMI) with liquid crystal tunable filter (LCTF), and a wide field camera (WFC). Once launched, it will provide information about the Philippines, both for disaster and environmental applications. Depending on the need, different remote sensing products will be generated from the microsatellite sensors. This necessitates data processing capability on the ground control segment. Rather than rely on commercial turnkey solutions, the PHL-Microsat team, specifically Project 3:DPAD, opted to design its own ground receiving station data subsystems. This paper describes the design of the data subsystems of the ground receiving station (GRS) for DIWATA. The data subsystems include: data processing subsystem for automatic calibration and georeferencing of raw images as well as the generation of higher level processed data products; data archiving subsystem for storage and backups of both raw and processed data products; and data distribution subsystem for providing a web-based interface and product download facility for the user community. The design covers the conceptual design of the abovementioned subsystems, the free and open source software (FOSS) packages used to implement them, and the challenges encountered in adapting the existing FOSS packages to DIWATA GRS requirements.

  10. DESIGN OF A FREE AND OPEN SOURCE DATA PROCESSING, ARCHIVING, AND DISTRIBUTION SUBSYSTEM FOR THE GROUND RECEIVING STATION OF THE PHILIPPINE SCIENTIFIC EARTH OBSERVATION MICRO-SATELLITE

    Directory of Open Access Journals (Sweden)

    R. K. D. Aranas

    2016-06-01

    Full Text Available The Philippines’s PHL-Microsat program aims to launch its first earth observation satellite, DIWATA, on the first quarter of 2016. DIWATA’s payload consists of a high-precision telescope (HPT), spaceborne multispectral imager (SMI) with liquid crystal tunable filter (LCTF), and a wide field camera (WFC). Once launched, it will provide information about the Philippines, both for disaster and environmental applications. Depending on the need, different remote sensing products will be generated from the microsatellite sensors. This necessitates data processing capability on the ground control segment. Rather than rely on commercial turnkey solutions, the PHL-Microsat team, specifically Project 3:DPAD, opted to design its own ground receiving station data subsystems. This paper describes the design of the data subsystems of the ground receiving station (GRS) for DIWATA. The data subsystems include: data processing subsystem for automatic calibration and georeferencing of raw images as well as the generation of higher level processed data products; data archiving subsystem for storage and backups of both raw and processed data products; and data distribution subsystem for providing a web-based interface and product download facility for the user community. The design covers the conceptual design of the abovementioned subsystems, the free and open source software (FOSS) packages used to implement them, and the challenges encountered in adapting the existing FOSS packages to DIWATA GRS requirements.

  11. Opening up the Innovation Process

    DEFF Research Database (Denmark)

    Vujovic, Sladjana; Ulhøi, John Parm

    2008-01-01

    Seeking, developing, and protecting knowledge is a costly endeavour. Moreover, apart from being expensive, the process of turning new knowledge into useful and well-protected innovations often slows the speed of development and increases costs. In this chapter, alternative strategies for innovation, in which sharing and co-operation play a critical part, are discussed. In particular, we address the involvement of users in opening up the innovation process which, in turn, offers participating actors some useful strategies for product development. Four archetypal strategies are identified and classified...

  12. Opening up the Innovation Process

    DEFF Research Database (Denmark)

    Vujovic, Sladjana; Ulhøi, John Parm

    2008-01-01

    An organization's ability to create, retrieve, and use knowledge to innovate is a critical strategic asset. Until recently, most textbooks on business and product development argued that managers should keep their new ideas to themselves and protect knowledge from getting into competitors' hands... Alternative strategies for innovation, in which sharing and co-operation play a critical part, are discussed. In particular, we address the involvement of users in opening up the innovation process which, in turn, offers participating actors some useful strategies for product development. Four archetypal strategies are identified and classified...

  13. A Multiview Visualisation Architecture for Open Distributed Systems

    NARCIS (Netherlands)

    Hughes, E.; Oldengarm, P.; van Halteren, Aart

    1998-01-01

    Program visualisation is an attractive way for understanding collaboration structures of complex distributed systems. By using the concepts of the open distributed processing-reference model (ODP-RM) as entities for visualisation, a multiview visualisation architecture is presented, which provides a

  14. 75 FR 56920 - Express Mail Open and Distribute and Priority Mail Open and Distribute

    Science.gov (United States)

    2010-09-17

    ... guarantee for Express Mail Open and Distribute is receipt by end of day (11:59 p.m.) and ends upon receipt... "EXPRESS MAIL OPEN AND DIST" or "PRIORITY MAIL OPEN AND DIST," as applicable. c. For Line 3 (origin...

  15. Research and Development of an Open-Structure, Distributed-Processing GR-90H RTU

    Institute of Scientific and Technical Information of China (English)

    Li Hui; Chai Meili

    2001-01-01

    Addressing the practical needs of power system development in China, the GR-90H remote terminal unit (RTU) system was designed and developed. The system serves as a successful bridge between remote and local measurement and control, allowing safe and reliable operation in 220 kV/110 kV substations. It adopts a new open structure and a distributed processing concept, and can be applied in unmanned stations requiring high-reliability equipment and in various electrical-quantity data acquisition settings, as well as in supervisory and control systems for electric power, oil and gas transport, and urban public utilities.

  16. Advanced design concepts for open distributed systems

    NARCIS (Netherlands)

    Pires, L.F.; Ferreira Pires, Luis; van Sinderen, Marten J.; Vissers, C.A.

    1993-01-01

    Experience with the engineering of large scale open distributed systems has shown that their design should be specified at several well defined levels of abstractions, where each level aims at satisfying specific user, architectural and implementation purposes. Therefore designers should dispose of

  17. Open Source Live Distributions for Computer Forensics

    Science.gov (United States)

    Giustini, Giancarlo; Andreolini, Mauro; Colajanni, Michele

    Current distributions of open source forensic software provide digital investigators with a large set of heterogeneous tools. Their use is not always focused on the target and requires high technical expertise. We present a new GNU/Linux live distribution, named CAINE (Computer Aided INvestigative Environment) that contains a collection of tools wrapped up into a user friendly environment. The CAINE forensic framework introduces novel important features, aimed at filling the interoperability gap across different forensic tools. Moreover, it provides a homogeneous graphical interface that drives digital investigators during the acquisition and analysis of electronic evidence, and it offers a semi-automatic mechanism for the creation of the final report.

  18. Leadership in Open and Distributed Innovation

    DEFF Research Database (Denmark)

    Haefliger, Stefan; Poetz, Marion

    Creating innovation and capturing value from doing so have considerably changed over the last two decades. New technologies – specifically rapid advances in information and communication technologies – and their extensive use, higher labor mobility and new divisions of labor, increased customer demands and shorter product life cycles have triggered new forms of creation and innovative practices (von Hippel and von Krogh, 2003; Baden-Fuller and Haefliger, 2013). These new forms can be characterized by being more open, distributed, collaborative, and democratized than traditional models of innovation. More and more companies are experimenting with such new forms in order to leverage widely distributed knowledge for their innovation efforts (Hienerth et al., 2011; Bogers et al. 2010). These innovation practices imply the work of individuals who combine and exchange knowledge more rapidly, more...

  19. Process Planning and Scheduling Integration in an Open Manufacturing Environment

    Institute of Scientific and Technical Information of China (English)

    LOU Ping; LIU Quan; ZHOU Zu-de; QUAN Shu-hai; FANG Ban-hang

    2009-01-01

    New open manufacturing environments have been proposed aiming at realizing more flexible distributed manufacturing paradigms, which can deal not only with dynamic changes in the volume and variety of products, but also with changes of machining equipment, dispersal of processing locations, and unscheduled disruptions. This research develops an integrated process planning and scheduling system suited to this open, dynamic, distributed manufacturing environment. Multi-agent system (MAS) approaches are used for the integration of manufacturing process planning and scheduling in an open distributed manufacturing environment, in which process planning can be adjusted dynamically and manufacturing resources can increase/decrease according to the requirements. A multi-level dynamic negotiation approach to process planning and scheduling is presented for the integration of manufacturing process planning and scheduling.

  20. Opening up the innovation process: archetypal strategies

    DEFF Research Database (Denmark)

    Vujovic, Sladjana; Ulhøi, John Parm

    2005-01-01

    sharing and co-operation play a critical part. The paper addresses the involvement of users in opening up the innovation process, which in turn gives the participating actors an interesting alternative for product development. We identify and classify four archetypal strategies for opening up...

  1. Harvesting, Integrating and Distributing Large Open Geospatial Datasets Using Free and Open-Source Software

    Science.gov (United States)

    Oliveira, Ricardo; Moreno, Rafael

    2016-06-01

    Federal, State and Local government agencies in the USA are investing heavily in the dissemination of Open Data sets produced by each of them. The main driver behind this thrust is to increase agencies' transparency and accountability, as well as to improve citizens' awareness. However, not all Open Data sets are easy to access and integrate with other Open Data sets available even from the same agency. The City and County of Denver Open Data Portal distributes several types of geospatial datasets; one of them is the city parcels layer containing 224,256 records. Although this data layer contains many pieces of information, it is incomplete for some custom purposes. Open-source software was used to first collect data from diverse City of Denver Open Data sets, then upload them to a repository in the Cloud, where they were processed using a PostgreSQL installation on the Cloud and Python scripts. Our method was able to extract non-spatial information from a "not-ready-to-download" source that could then be combined with the initial data set to enhance its potential use.
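
    A rough sketch of the harvest-and-load pattern described above, assuming a CSV endpoint and a local PostgreSQL database; the URL, table name and credentials are placeholders rather than the actual Denver Open Data endpoints or the authors' scripts.

    ```python
    # Sketch of the general harvest-and-load pattern. The dataset URL, table name,
    # and connection string are hypothetical placeholders, not the actual City and
    # County of Denver endpoints used by the authors.
    import requests
    import psycopg2

    CSV_URL = "https://example.org/open-data/parcels_supplement.csv"  # placeholder

    rows = requests.get(CSV_URL, timeout=60).text.splitlines()
    header = rows[0].split(",")                      # column names (unused here)
    records = [r.split(",") for r in rows[1:]]       # naive CSV parsing for brevity

    conn = psycopg2.connect("dbname=opendata user=etl")  # placeholder credentials
    with conn, conn.cursor() as cur:
        cur.execute("CREATE TABLE IF NOT EXISTS parcel_extra (parcel_id text, value text)")
        for rec in records:
            cur.execute("INSERT INTO parcel_extra VALUES (%s, %s)", (rec[0], rec[1]))
    conn.close()
    ```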

  2. Freeing Worldview's development process: Open source everything!

    Science.gov (United States)

    Gunnoe, T.

    2016-12-01

    Freeing your code and your project are important steps for creating an inviting environment for collaboration, with the added side effect of keeping a good relationship with your users. NASA Worldview's codebase was released with the open source NOSA (NASA Open Source Agreement) license in 2014, but this is only the first step. We also have to free our ideas, empower our users by involving them in the development process, and open channels that lead to the creation of a community project. There are many highly successful examples of Free and Open Source Software (FOSS) projects of which we can take note: the Linux kernel, Debian, GNOME, etc. These projects owe much of their success to having a passionate mix of developers/users with a great community and a common goal in mind. This presentation will describe the scope of this openness and how Worldview plans to move forward with a more community-inclusive approach.

  3. Effective Factor Endowments, Trade Openness and Income Distribution in China

    OpenAIRE

    Xiaodong Lu; Guowei Cai

    2011-01-01

    This paper studies the empirical relationship among factor endowment, trade openness and individual income distribution. Using panel data, we show that factor endowment characteristics, to some extent, explain the income gap in China. First, land- and capital-intensive provinces have a more equal income distribution, while human-capital- and labor-intensive provinces have a less equal income distribution. Second, trade openness has a significant effect on China's income distribution; the interaction be...

  4. Multimedia Services in Open Distributed Telecommunications Environments

    NARCIS (Netherlands)

    Leydekkers, Peter

    1997-01-01

    In the majority of European countries a twofold change is taking place in the telecommunications marketplace. Firstly, the traditional monopolistic state owned telecommunications provider is being privatised and secondly, the market in which this newly privatised provider operates is being opened to

  5. Beyond Scientific Workflows: Networked Open Processes

    NARCIS (Netherlands)

    Cushing, R.; Bubak, M.; Belloum, A.; de Laat, C.

    2013-01-01

    The multitude of scientific services and processes being developed brings about challenges for future in silico distributed experiments. Choosing the correct service from an expanding body of processes means that the task of manually building workflows is becoming untenable. In this paper we pro

  6. Opening up the innovation process: archetypal strategies

    DEFF Research Database (Denmark)

    Vujovic, Sladjana; Ulhøi, John Parm

    2005-01-01

    The ability to create, retrieve and use knowledge and to be innovative is a strategic asset of immeasurable value. Until recently, most textbooks on business and product development taught that managers should keep their ideas to themselves and protect knowledge from getting into competitors' hands. Seeking, developing and protecting knowledge is a costly activity. Moreover, apart from being expensive, the process of turning acquired knowledge into useful and well-protected innovations often slows the speed of development. In this paper we examine alternative strategies to innovation, in which sharing and co-operation play a critical part. The paper addresses the involvement of users in opening up the innovation process, which in turn gives the participating actors an interesting alternative for product development. We identify and classify four archetypal strategies for opening up...

  7. Safe Distribution of Declarative Processes

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao; Slaats, Tijs

    2011-01-01

    We present a declarative process model generalizing labelled prime event structures to a systems model able to finitely represent ω-regular languages. An operational semantics given as a transition semantics between markings of the graph allows DCR Graphs to be conveniently used as both specification and execution model. The technique for distribution is based on a new general notion of projection of DCR Graphs relative to a subset of labels and events identifying the set of external events that must be communicated from the other processes in the network in order for the distribution to be safe. We prove that for any vector of projections that covers a DCR Graph, the network of synchronously communicating DCR Graphs given by the projections is bisimilar to the original global process graph. We exemplify the distribution technique on a process identified in a case study of a cross-organizational case management system carried...

  8. Managing the operation of open distributed laboratory information systems.

    Science.gov (United States)

    Wade, V; Grimson, W; Hederman, L; Yearworth, M; Groth, T

    1996-07-01

    This paper examines how the concepts and designs of workflow management systems and distributed systems management can be integrated and customized to manage open laboratory computing services. The paper outlines the objectives of managing laboratory computing services and identifies techniques and designs which facilitate this management. The paper also outlines the implementation of an open laboratory service management system.

  9. Open Access This is an Open Access article distributed under the ...

    African Journals Online (AJOL)

    This is an Open Access article distributed under the terms of the ... the capacity and contribution of Nigerian Nurses to health care research, it is ... Aim: To review the academic and research preparedness of Nigerian nurses in ...

  10. Opening up the Agile Innovation Process

    Science.gov (United States)

    Conboy, Kieran; Donnellan, Brian; Morgan, Lorraine; Wang, Xiaofeng

    The objective of this panel is to discuss how firms can operate both an open and agile innovation process. In an era of unprecedented changes, companies need to be open and agile in order to adapt rapidly and maximize their innovation processes. Proponents of agile methods claim that one of the main distinctions between agile methods and their traditional bureaucratic counterparts is their drive toward creativity and innovation. However, agile methods are rarely adopted in their textbook, "vanilla" format, and are usually adopted in part or are tailored or modified to suit the organization. While we are aware that this happens, there is still limited understanding of what is actually happening in practice. Using innovation adoption theory, this panel will discuss the issues and challenges surrounding the successful adoption of agile practices. In addition, this panel will report on the obstacles and benefits reported by over 20 industrial partners engaged in a pan-European research project into agile practices between 2006 and 2009.

  11. Full current statistics for a disordered open exclusion process

    Science.gov (United States)

    Ayyer, Arvind

    2016-04-01

    We consider the nonabelian sandpile model defined on directed trees by Ayyer et al (2015 Commun. Math. Phys. 335 1065) and restrict it to the special case of a one-dimensional lattice of n sites which has open boundaries and disordered hopping rates. We focus on the joint distribution of the integrated currents across each bond simultaneously, and calculate its cumulant generating function exactly. Surprisingly, the process conditioned on seeing specified currents across each bond turns out to be a renormalised version of the same process. We also remark on a duality property of the large deviation function. Lastly, all eigenvalues and both Perron eigenvectors of the tilted generator are determined.
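
    For reference, the scaled cumulant generating function and the large deviation rate function of an integrated current are conventionally defined as follows (generic definitions only; the paper's exact expressions for the disordered model are not reproduced here):

    ```latex
    % Generic definitions assumed here, not the paper's disordered-rate formulas.
    \[
      \lambda(s) \;=\; \lim_{t\to\infty} \frac{1}{t}\,
      \ln \mathbb{E}\!\left[e^{\, s\, Q_t}\right],
      \qquad
      I(j) \;=\; \sup_{s}\bigl\{\, s\, j - \lambda(s) \,\bigr\},
    \]
    where $Q_t$ is the time-integrated current across a bond up to time $t$,
    $\lambda$ is its scaled cumulant generating function, and $I$ is the
    corresponding large deviation rate function (the Legendre transform of $\lambda$).
    ```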

  12. Specifying Processes: Application to Electrical Power Distribution

    Directory of Open Access Journals (Sweden)

    Sabah Al-Fedaghi

    2011-01-01

    Full Text Available Problem statement: This study deals with the problem of how to specify processes. Many process specification methodologies have been determined to be incomplete; for example, ISO 9000:2005 defines process as transforming media inputs into outputs. Nevertheless, the author of the Quality Systems Handbook, declares that such a definition is incomplete because processes create results and not necessarily by transforming inputs. Still, it is not clear what description of process can embed transformation of input to output or include creation that leads to results. Approach: This problem is important because process specification is an essential component in building projects utilized in such tasks as scheduling, planning, production, management, work flow and reengineering. Results: We solve the problem by “opening” the black box in the input-transformation-output model. This action uncovers many possible sources and destinations related to input and output, such as the disappearance, storage and copying of input. It is possible to reject input and also to block output from leaving the process. The approach is based on a conceptual framework for process specification of all generic phases that make up any process and embraces input, transformation, creation and output. The study applies the method in the field of electrical power distribution systems. Conclusion: We conclude that the results demonstrate a viable specification method that can be adopted for different types of processes.

  13. Learning Features in an Open, Flexible, and Distributed Environment

    Science.gov (United States)

    Khan, Badrul

    2005-01-01

    The Internet, supported by various digital technologies, is well-suited for open, flexible and distributed e-learning. Designing and delivering instruction and training on the Internet requires thoughtful analysis and investigation, combined with an understanding of both the Internet's capabilities and resources and the ways in which instructional…

  14. INNOVATION PROCESS IN OPEN CAPITAL BRAZILIAN COMPANIES

    Directory of Open Access Journals (Sweden)

    Ricardo Floriani

    2013-12-01

    Full Text Available This study aims to identify the innovation process used by open capital Brazilian companies and establish a ranking of the potentially innovative ones. For this, a questionnaire was sent to 484 companies with shares traded on Bovespa, receiving responses from 22. The innovation process is based on the model of Barrett and Sexton (2006). A summary of the results is presented below. (i) Organizational capabilities – 95.5% answered that they have incentives for innovation activities and 68.2% reported having procedures for all services. The leadership has a facilitator role, encouraging initiative (86.4%) and promoting the maintenance of the group relationship (72.7%); they value risk taking, even through failures, and prioritize learning and experimenting with new ideas. (ii) Background of the innovation – reveals aspects of internal or external capacity. Of the respondents, 59.1% carried out continuous internal R&D activities. Training to innovate is present on a continuous or occasional basis in 81.8% of the companies. The respondents characterize the economic environment as dynamic, and the majority purchased software and equipment. In only 12 instances was there a reference to obtaining patents as an innovation protection measure. (iii) Focus of innovation – the majority of the companies mentioned process or product innovation. Rewards are offered when objectives are met, and attention is drawn when this does not occur. (iv) Highlighted performance – the innovations achieved expectations and created effects. The relevant benefits noticed were: improvement in the quality of goods and services, increase of market share, increase of goods and services, and increase of productive capacity.

  15. A Transparent Runtime Data Distribution Engine for OpenMP

    Directory of Open Access Journals (Sweden)

    Dimitrios S. Nikolopoulos

    2000-01-01

    Full Text Available This paper makes two important contributions. First, the paper investigates the performance implications of data placement in OpenMP programs running on modern NUMA multiprocessors. Data locality and minimization of the rate of remote memory accesses are critical for sustaining high performance on these systems. We show that due to the low remote-to-local memory access latency ratio of contemporary NUMA architectures, reasonably balanced page placement schemes, such as round-robin or random distribution, incur modest performance losses. Second, the paper presents a transparent, user-level page migration engine with an ability to gain back any performance loss that stems from suboptimal placement of pages in iterative OpenMP programs. The main body of the paper describes how our OpenMP runtime environment uses page migration for implementing implicit data distribution and redistribution schemes without programmer intervention. Our experimental results verify the effectiveness of the proposed framework and provide a proof of concept that it is not necessary to introduce data distribution directives in OpenMP and warrant the simplicity or the portability of the programming model.

  16. Motion Analysis during Sabot Opening Process

    Directory of Open Access Journals (Sweden)

    R.S. Acharya

    2007-03-01

    Full Text Available For FSAPDS projectile, the trajectory and stability are dependent on different forces in different phases of the motion. During the first phase gravity, aerodynamic drag along with propellant gas force affect the motion. The motion is influenced by shock wave and mechanical force in sabot opening phase and the effect of time lag during opening of sabots also forms part of this work.

  17. The open source, object- and process oriented simulation system OpenGeoSys - concepts, development, community

    Science.gov (United States)

    Bauer, S.; Li, D.; Beyer, C.; Wang, W.; Bilke, L.; Graupner, B.

    2011-12-01

    Many geoscientific problems, such as underground waste disposal, nuclear waste disposal, CO2 sequestration, geothermal energy, etc., require a numerical simulation system for the prediction of ongoing processes as well as for risk and safety assessment. The governing processes are thermal heat transfer (T), hydraulic flow in multi-phase systems (H), mechanical deformation (M) and geochemical reactions (C), which interact in a complex way (THMC). The development of suitable simulation systems requires a large amount of effort for code development, verification and applications. OpenGeoSys (OGS) is an open source scientific initiative for the simulation of these THMC processes in porous media. A flexible numerical framework based on the Finite Element Method is provided and applied to the governing process equations. Due to the object- and process-oriented character of the code, functionality enhancement and code coupling with external simulators can be performed reasonably effectively. This structure also allows for distributed development, with developers at different locations contributing to the common code. The code is platform independent, accessible via the internet for development and application, and checked regularly by an automated benchmarking system.

  18. Two dimensional velocity distribution in open channels using Renyi entropy

    Science.gov (United States)

    Kumbhakar, Manotosh; Ghoshal, Koeli

    2016-05-01

    In this study, the entropy concept is employed for describing the two-dimensional velocity distribution in an open channel. Using the principle of maximum entropy, the velocity distribution is derived by maximizing the Renyi entropy by assuming dimensionless velocity as a random variable. The derived velocity equation is capable of describing the variation of velocity along both the vertical and transverse directions with maximum velocity occurring on or below the water surface. The developed model of velocity distribution is tested with field and laboratory observations and is also compared with existing entropy-based velocity distributions. The present model has shown good agreement with the observed data and its prediction accuracy is comparable with the other existing models.
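
    For context, the Renyi entropy of a velocity density and the associated maximum-entropy principle can be written generically as follows (the constraints used in the paper may include additional moment conditions):

    ```latex
    % Generic form of the entropy maximization described above.
    \[
      H_{\alpha}(f) \;=\; \frac{1}{1-\alpha}\,
      \ln\!\int f(u)^{\alpha}\,\mathrm{d}u,
      \qquad \alpha > 0,\; \alpha \neq 1,
    \]
    maximized over probability densities $f$ of the dimensionless velocity $u$
    subject to $\int f(u)\,\mathrm{d}u = 1$ (plus any imposed mean constraints);
    the Shannon-entropy case of the principle of maximum entropy is recovered
    in the limit $\alpha \to 1$.
    ```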

  19. Article Processing Charges and OpenAPC

    CERN Document Server

    CERN. Geneva

    2017-01-01

    The publication landscape is about to change. While it was largely operated by subscription-based journals in the past, recent political decisions are forcing the publishing industry towards Open Access. Especially, the publication of the Finch report in 2012 put APC-based Gold Open Access models almost everywhere on the agenda. These models also require considerable adaptation of library workflows to handle payments, bills and centralized funds for publication fees. Sometimes handled in specialized systems (e.g. first setups in Jülich), discussions started pretty early on about handling APCs in local repositories which would also hold the Open Access content resulting from these fees; e.g., the University of Regensburg uses ePrints for this purpose. Backed up by the Open Data movement, libraries also saw an opportunity to exchange data about fees paid. Thus, OpenAPC.de was born in 2014 on GitHub to facilitate this exchange and aggregate large amounts of data for evaluation and comparison. Using the repository to hold payment d...

  20. Understanding flexible and distributed software development processes

    OpenAIRE

    Agerfalk, Par J.; Fitzgerald, Brian

    2006-01-01

    peer-reviewed The minitrack on Flexible and Distributed Software Development Processes addresses two important and partially intertwined current themes in software development: process flexibility and globally distributed software development

  1. Multivariate normal-Laplace distribution and processes

    Directory of Open Access Journals (Sweden)

    Kanichukattu Korakutty Jose

    2014-12-01

    Full Text Available The normal-Laplace distribution is considered and its properties are discussed. A multivariate normal-Laplace distribution is introduced and its properties are studied. First order autoregressive processes with these stationary marginal distributions are developed and studied. A generalized multivariate normal-Laplace distribution is introduced. Multivariate geometric normal-Laplace distribution and multivariate geometric generalized normal-Laplace distributions are introduced and their properties are studied. Estimation of parameters and some applications are also discussed.
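
    As a brief reminder of the constructions mentioned, a normal-Laplace variable and a first-order autoregressive process with that stationary marginal can be sketched as follows (standard forms assumed; the paper's parametrisation may differ):

    ```latex
    % Sketch of the standard constructions alluded to above.
    \[
      X \;=\; Z + W, \qquad Z \sim \mathcal{N}(\mu,\sigma^{2}), \quad
      W \sim \mathrm{Laplace}, \quad Z \perp W,
    \]
    so a normal--Laplace variable is the sum of independent normal and Laplace
    components. A stationary first-order autoregressive process with such a
    marginal has the form
    \[
      X_{n} \;=\; \rho\, X_{n-1} + \varepsilon_{n}, \qquad 0 \le \rho < 1,
    \]
    with i.i.d. innovations $\varepsilon_{n}$ chosen so that the marginal
    distribution of $X_{n}$ is normal--Laplace for every $n$.
    ```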

  2. Assessing the Open Source Development Processes Using OMM

    Directory of Open Access Journals (Sweden)

    Etiel Petrinja

    2012-01-01

    Full Text Available The assessment of development practices in Free Libre Open Source Software (FLOSS) projects can contribute to the improvement of the development process by identifying poor practices and providing a list of necessary practices. Available assessment methods (e.g., Capability Maturity Model Integration (CMMI)) do not sufficiently address FLOSS-specific aspects (e.g., geographically distributed development, importance of the contributions, reputation of the project, etc.). We present a FLOSS-focused, CMMI-like assessment/improvement model: the QualiPSo Open Source Maturity Model (OMM). OMM focuses on the development process. This makes it different from existing assessment models that are focused on the assessment of the product. We have assessed six FLOSS projects using OMM. Three projects were started and led by a software company, and three are developed by three different FLOSS communities. We identified poorly addressed development activities such as the number of commit/bug reports, the external contributions, and risk management. The results showed that FLOSS projects led by companies adopt standard project management approaches such as product planning, design definition, and testing, which are less often addressed by community-led FLOSS projects. The OMM is valuable both for the FLOSS community, by identifying critical development activities necessary to be improved, and for potential users who can better decide which product to adopt.

  3. An Open Distributed Architecture for Sensor Networks for Risk Management

    Directory of Open Access Journals (Sweden)

    Ralf Denzer

    2008-03-01

    Full Text Available Sensors provide some of the basic input data for risk management of natural and man-made hazards. Here the word ‘sensors’ covers everything from remote sensing satellites, providing invaluable images of large regions, through instruments installed on the Earth’s surface to instruments situated in deep boreholes and on the sea floor, providing highly-detailed point-based information from single sites. Data from such sensors is used in all stages of risk management, from hazard, vulnerability and risk assessment in the pre-event phase, information to provide on-site help during the crisis phase, through to data to aid in recovery following an event. Because data from sensors play such an important part in improving understanding of the causes of risk and consequently in its mitigation, considerable investment has been made in the construction and maintenance of highly sophisticated sensor networks. In spite of the ubiquitous need for information from sensor networks, the use of such data is hampered in many ways. Firstly, information about the presence and capabilities of sensor networks operating in a region is difficult to obtain due to a lack of easily available and usable meta-information. Secondly, once sensor networks have been identified, their data is often difficult to access due to a lack of interoperability between dissemination and acquisition systems. Thirdly, the transfer and processing of information from sensors is limited, again by incompatibilities between systems. Therefore, the current situation leads to a lack of efficiency and limited use of the available data that has an important role to play in risk mitigation. In view of this situation, the European Commission (EC) is funding a number of Integrated Projects within the Sixth Framework Programme concerned with improving the accessibility of data and services for risk management. Two of these projects: ‘Open Architecture and Spatial Data

  4. An Educational Tool for Interactive Parallel and Distributed Processing

    DEFF Research Database (Denmark)

    Pagliarini, Luigi; Lund, Henrik Hautop

    2011-01-01

    In this paper we try to describe how the Modular Interactive Tiles System (MITS) can be a valuable tool for introducing students to interactive parallel and distributed processing programming. This is done by providing an educational hands-on tool that allows a change of representation of the abstract problems related to designing interactive parallel and distributed systems. Indeed, MITS seems to bring a series of goals into the education, such as parallel programming, distributedness, communication protocols, master dependency, software behavioral models, adaptive interactivity, feedback, … and shows how to implement interactive parallel and distributed processing with different software behavioural models such as open loop, randomness based, rule based, user interaction based, AI and ALife based software.

  5. Exploiting VSIPL and OpenMP for Parallel Image Processing

    CERN Document Server

    Kepner, J V

    2001-01-01

    VSIPL and OpenMP are two open standards for portable high performance computing. VSIPL delivers optimized single processor performance while OpenMP provides a low overhead mechanism for executing thread based parallelism on shared memory systems. Image processing is one of the main areas where VSIPL and OpenMP can make a large impact. Currently, a large fraction of image processing applications are written in the Interactive Data Language (IDL) environment. The aim of this work is to demonstrate that the performance benefits of these new standards can be brought to the image processing community in a high level manner that is transparent to users. To this end, this talk presents a fast, FFT based algorithm for performing image convolutions. This algorithm has been implemented within the IDL environment using VSIPL (for optimized single processor performance) with added OpenMP directives (for parallelism). This work demonstrates that good parallel speedups are attainable using standards and can be integrated seaml...
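
    The FFT-based convolution the talk describes rests on the convolution theorem; a short NumPy sketch of that idea is given below. It does not use IDL, VSIPL or OpenMP, and the padding and shapes are illustrative choices rather than the authors' implementation.

    ```python
    # Minimal NumPy sketch of FFT-based image convolution (the algorithm class
    # described above); padding choices are illustrative only.
    import numpy as np

    def fft_convolve2d(image, kernel):
        """Circular 2-D convolution of `image` with `kernel` via the FFT."""
        # Zero-pad the kernel to the image size, centred at the origin
        padded = np.zeros_like(image, dtype=float)
        kh, kw = kernel.shape
        padded[:kh, :kw] = kernel
        padded = np.roll(padded, (-(kh // 2), -(kw // 2)), axis=(0, 1))
        # Convolution theorem: pointwise multiplication in the frequency domain
        return np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(padded)))

    image = np.random.rand(256, 256)
    kernel = np.ones((5, 5)) / 25.0      # simple box blur
    blurred = fft_convolve2d(image, kernel)
    ```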

  6. Decay Process of Quantum Open System at Finite Temperatures

    Institute of Scientific and Technical Information of China (English)

    肖骁; 高一波

    2012-01-01

    Starting from the formal solution to the Heisenberg equation, we revisit a universal model for a quantum open system with a harmonic oscillator linearly coupled to a boson bath. The analysis of the decay process for a Fock state and a coherent state demonstrates that this method is very useful in dealing with the problems in the decay process of the open system. For finite temperatures, the calculations of the reduced density matrix and the mean excitation number for the open system show that an initial coherent state will evolve into a temperature-dependent coherent state after tracing over the bath variables. Also in the short-time limit, a temperature-dependent effective Hamiltonian for the open system characterizes the decay process of the open system.

  7. Design Principles for Improving the Process of Publishing Open data

    NARCIS (Netherlands)

    Zuiderwijk, A.M.G.; Janssen, M.F.W.H.A.; Choenni, R. .; Meijer, R.F.

    2014-01-01

    Purpose: Governments create large amounts of data. However, the publication of open data is often cumbersome and there are no standard procedures and processes for opening data. This blocks the easy publication of government data. The purpose of this paper is to derive design principles for improv

  8. An open source platform for multi-scale spatially distributed simulations of microbial ecosystems

    Energy Technology Data Exchange (ETDEWEB)

    Segre, Daniel [Boston Univ., MA (United States)]

    2014-08-14

    The goal of this project was to develop a tool for facilitating simulation, validation and discovery of multiscale dynamical processes in microbial ecosystems. This led to the development of an open-source software platform for Computation Of Microbial Ecosystems in Time and Space (COMETS). COMETS performs spatially distributed time-dependent flux balance based simulations of microbial metabolism. Our plan involved building the software platform itself, calibrating and testing it through comparison with experimental data, and integrating simulations and experiments to address important open questions on the evolution and dynamics of cross-feeding interactions between microbial species.

  9. Palm distributions for log Gaussian Cox processes

    DEFF Research Database (Denmark)

    Coeurjolly, Jean-Francois; Møller, Jesper; Waagepetersen, Rasmus

    This paper reviews useful results related to Palm distributions of spatial point processes and provides a new result regarding the characterization of Palm distributions for the class of log Gaussian Cox processes. This result is used to study functional summary statistics for a log Gaussian Cox...
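
    The characterization result referred to above is commonly stated as follows (generic notation; see the paper for the precise conditions):

    ```latex
    % Statement in generic notation rather than copied from the paper.
    If the random intensity is $\Lambda(v) = \exp\{Z(v)\}$, with $Z$ a Gaussian
    random field with mean function $m(\cdot)$ and covariance function
    $c(\cdot,\cdot)$, then the reduced Palm distribution at a location $u$ is
    again a log Gaussian Cox process, with the same covariance function and
    with mean function shifted to
    \[
      m_{u}(v) \;=\; m(v) + c(v, u).
    \]
    ```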

  10. Analyzing Distributed Processing For Electric Utilities

    Science.gov (United States)

    Klein, Stanley A.; Kirkham, Harold; Beardmore, Julie A.

    1990-01-01

    Distributed Processing Trade-Off Model for Electric Utility Operation computer program based upon study performed at California Institute of Technology for NASA's Jet Propulsion Laboratory. Study presented technique addressing question of tradeoffs between expanding communications network or expanding capacity of distributed computers in energy-management systems (EMS) of electric utility. Gives EMS planners macroscopic tool for evaluation of architectures of distributed-processing systems and major technical and economic tradeoffs as well as interactions within systems.

  11. Open Data, Open Specifications and Free and Open Source Software: A powerful mix to create distributed Web-based water information systems

    Science.gov (United States)

    Arias, Carolina; Brovelli, Maria Antonia; Moreno, Rafael

    2015-04-01

    We are in an age when water resources are increasingly scarce and the impacts of human activities on them are ubiquitous. These problems don't respect administrative or political boundaries and they must be addressed integrating information from multiple sources at multiple spatial and temporal scales. Communication, coordination and data sharing are critical for addressing the water conservation and management issues of the 21st century. However, different countries, provinces, local authorities and agencies dealing with water resources have diverse organizational, socio-cultural, economic, environmental and information technology (IT) contexts that raise challenges to the creation of information systems capable of integrating and distributing information across their areas of responsibility in an efficient and timely manner. Tight and disparate financial resources, and dissimilar IT infrastructures (data, hardware, software and personnel expertise) further complicate the creation of these systems. There is a pressing need for distributed interoperable water information systems that are user friendly, easily accessible and capable of managing and sharing large volumes of spatial and non-spatial data. In a distributed system, data and processes are created and maintained in different locations each with competitive advantages to carry out specific activities. Open Data (data that can be freely distributed) is available in the water domain, and it should be further promoted across countries and organizations. Compliance with Open Specifications for data collection, storage and distribution is the first step toward the creation of systems that are capable of interacting and exchanging data in a seamless (interoperable) way. The features of Free and Open Source Software (FOSS) offer low access costs that facilitate scalability and long-term viability of information systems. The World Wide Web (the Web) will be the platform of choice to deploy and access these systems

  12. Autonomous open-source hardware apparatus for quantum key distribution

    Directory of Open Access Journals (Sweden)

    Ignacio H. López Grande

    2016-01-01

    Full Text Available We describe an autonomous, fully functional implementation of the BB84 quantum key distribution protocol using open source hardware microcontrollers for the synchronization, communication, key sifting and real-time key generation diagnostics. The quantum bits are prepared in the polarization of weak optical pulses generated with light emitting diodes, and detected using a sole single-photon counter and a temporally multiplexed scheme. The system generates a shared cryptographic key at a rate of 365 bps, with a raw quantum bit error rate of 2.7%. A detailed description of the peripheral electronics for control, driving and communication between stages is released as supplementary material. The device can be built using simple and reliable hardware and it is presented as an alternative for a practical realization of sophisticated, yet accessible quantum key distribution systems. Received: 11 November 2015; Accepted: 7 January 2016; Edited by: O. Martínez; DOI: http://dx.doi.org/10.4279/PIP.080002. Cite as: I H López Grande, C T Schmiegelow, M A Larotonda, Papers in Physics 8, 080002 (2016)
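
    A toy, software-only illustration of BB84 preparation and key sifting, the steps named in the abstract, is sketched below; it assumes an ideal channel and has no relation to the authors' microcontroller firmware or optics.

    ```python
    # Toy BB84 preparation and sifting; purely illustrative, ideal channel assumed.
    import random

    N = 32
    alice_bits  = [random.randint(0, 1) for _ in range(N)]
    alice_bases = [random.choice("ZX") for _ in range(N)]   # Z = rectilinear, X = diagonal
    bob_bases   = [random.choice("ZX") for _ in range(N)]

    # Ideal channel: Bob's result matches Alice's bit only when bases agree,
    # otherwise the outcome is random.
    bob_bits = [b if ab == bb else random.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

    # Sifting: keep only the positions where the bases coincide.
    sifted_alice = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    sifted_bob   = [b for b, ab, bb in zip(bob_bits,   alice_bases, bob_bases) if ab == bb]

    qber = sum(a != b for a, b in zip(sifted_alice, sifted_bob)) / max(len(sifted_alice), 1)
    print(len(sifted_alice), "sifted bits, QBER =", qber)
    ```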

  13. Managed traffic evacuation using distributed sensor processing

    Science.gov (United States)

    Ramuhalli, Pradeep; Biswas, Subir

    2005-05-01

    This paper presents an integrated sensor network and distributed event processing architecture for managed in-building traffic evacuation during natural and human-caused disasters, including earthquakes, fire and biological/chemical terrorist attacks. The proposed wireless sensor network protocols and distributed event processing mechanisms offer a new distributed paradigm for improving reliability in building evacuation and disaster management. The networking component of the system is constructed using distributed wireless sensors for measuring environmental parameters such as temperature, humidity, and detecting unusual events such as smoke, structural failures, vibration, biological/chemical or nuclear agents. Distributed event processing algorithms will be executed by these sensor nodes to detect the propagation pattern of the disaster and to measure the concentration and activity of human traffic in different parts of the building. Based on this information, dynamic evacuation decisions are taken for maximizing the evacuation speed and minimizing unwanted incidents such as human exposure to harmful agents and stampedes near exits. A set of audio-visual indicators and actuators are used for aiding the automated evacuation process. In this paper we develop integrated protocols, algorithms and their simulation models for the proposed sensor networking and the distributed event processing framework. Also, efficient harnessing of the individually low, but collectively massive, processing abilities of the sensor nodes is a powerful concept behind our proposed distributed event processing algorithms. Results obtained through simulation in this paper are used for a detailed characterization of the proposed evacuation management system and its associated algorithmic components.

  14. Assumptions of Customer Knowledge Enablement in the Open Innovation Process

    Directory of Open Access Journals (Sweden)

    Jokubauskienė Raminta

    2017-08-01

    Full Text Available In the scientific literature, open innovation is one of the most effective means to innovate and gain a competitive advantage. In practice, there is a variety of open innovation activities, but customers nevertheless stand as the cornerstone in this area, since customers’ knowledge is one of the most important sources of new knowledge and ideas. When evaluating the context in which open innovation and customer knowledge enablement interact, it is necessary to take into account the importance of customer knowledge management. It is increasingly highlighted that customers’ knowledge management facilitates the creation of innovations. However, other factors that influence open innovation, and at the same time customers’ knowledge management, should also be examined. This article presents a theoretical model which reveals the assumptions of the open innovation process and their impact on the firm’s performance.

  15. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    Science.gov (United States)

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  16. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    Science.gov (United States)

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  17. Current Large Deviations for Asymmetric Exclusion Processes with Open Boundaries

    Science.gov (United States)

    Bodineau, T.; Derrida, B.

    2006-04-01

    We study the large deviation functional of the current for the Weakly Asymmetric Simple Exclusion Process in contact with two reservoirs. We compare this functional in the large drift limit to the one of the Totally Asymmetric Simple Exclusion Process, in particular to the Jensen-Varadhan functional. Conjectures for generalizing the Jensen-Varadhan functional to open systems are also stated.

  18. Rinsing Processes in Open-width Washing Machines

    NARCIS (Netherlands)

    Kroezen, A.B.J.; Linden, van der H.J.L.J.; Groot Wassink, J.

    1986-01-01

    A simulator is described for rinsing processes carried out on open-width washing machines. In combination with a theoretical model, a simple method is given for testing rinsing processes. The method has been used to investigate the extraction of caustic soda from a cotton fabric, varying the temperature…

  19. OpenRS-Cloud:A remote sensing image processing platform based on cloud computing environment

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    This paper explores the use of cloud computing for remote sensing image processing. The main contribution of our work is to develop a remote sensing image processing platform based on cloud computing technology (OpenRS-Cloud). This paper focuses on enabling methodical investigations into the development pattern, computational model, data management and service model exploring this novel distributed computing model. The experimental INSAR processing flow is implemented to verify the efficiency and feasibility of the OpenRS-Cloud platform. The results show that cloud computing is well suited for computationally-intensive and data-intensive remote sensing services.

  20. Study on transient aerodynamic characteristics of parachute opening process

    Institute of Scientific and Technical Information of China (English)

    Li Yu; Xiao Ming

    2007-01-01

    In parachute research, canopy inflation process modeling is one of the most complicated tasks. As the canopy often experiences the largest deformations and loadings during a very short time, it is very difficult to treat by theoretical analysis or experimental measurement. In this paper, aerodynamic equations and structural dynamics equations were developed for describing the parachute opening process, and an iterative coupling solving strategy incorporating the above equations was proposed for a small-scale, flexible, flat-circular parachute. Then, analyses were carried out for the canopy geometry, the time-dependent pressure difference between the inside and outside of the canopy, the transient vortex around the canopy and the flow field in the radial plane as a sequence in the opening process. The mechanism of the canopy shape development was explained from the perspective of transient flow fields during the inflation process. Experiments on the parachute opening process were conducted in a wind tunnel, in which the instantaneous shape of the canopy was measured by a high-speed camera and the opening loading was measured by a dynamometer balance. The theoretical predictions were found to be in good agreement with the experimental results, validating the proposed approach. This numerical method can reduce the strong dependence of parachute research on wind tunnel tests, and is of significance to the understanding of the mechanics of the parachute inflation process.

  1. Implications of Fault Curvature for Slip Distributions, Opening, and Damage

    Science.gov (United States)

    Ritz, E.; Pollard, D. D.; Griffith, W. A.

    2010-12-01

    In his seminal 1905 paper on the dynamics of faulting, E.M. Anderson idealized faults as planar structures. Although the theory of fault mechanics has developed from this idealization, abundant evidence from geological and geophysical investigations shows that fault surfaces exhibit geometric irregularities on many scales. Understanding the mechanical behavior of non-planar fault surfaces is a fundamental problem for scientists working on the brittle deformation of Earth's crust and is of practical importance to disciplines such as rock mechanics, geotechnical engineering, and earthquake science. Geologic observations of exhumed meter-scale strike-slip faults in the Bear Creek drainage, Sierra Nevada, CA, provide insights into the relationship between non-planar fault geometry and frictional slip at depth. These faults have smoothly curving surface expressions which may be approximated as sinusoidal curves. We numerically investigate both the natural fault geometries and model sinusoidal faults. Earlier models for the stress and deformation near a sinusoidal fault assume boundary conditions and fault characteristics that are not observed in nature. The 2D displacement discontinuity boundary element method is combined with a complementarity algorithm to model quasi-static slip on non-planar faults, and the resulting deformation of the nearby rock. This numerical technique can provide an accurate solution for any boundary value problem regarding crack-like features in an otherwise homogeneous and isotropic elastic material. Both field and numerical investigations indicate that non-planar fault geometry perturbs the along-fault slip from the distribution predicted for planar faults. In addition, both field observations and numerical modeling show that sliding along curved faults at depth may lead to localized fault opening, affecting local permeability and fluid migration.

  2. Apache Flink: Distributed Stream Data Processing

    CERN Document Server

    Jacobs, Kevin; CERN. Geneva. IT Department

    2016-01-01

    The amount of data has grown significantly over the past few years, so the need for distributed data processing frameworks is growing. Currently there are two well-known data processing frameworks that offer both an API for data batches and an API for data streams, namely Apache Flink and Apache Spark. Both Apache Spark and Apache Flink improve upon the MapReduce implementation of the Apache Hadoop framework. MapReduce is the first programming model for large-scale distributed processing available in Apache Hadoop. This report compares the Stream API and the Batch API for both frameworks.
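
The report contrasts Flink's and Spark's APIs with Hadoop's MapReduce model; as a reminder of what that model looks like, here is a toy, single-process Python rendering of the map, shuffle and reduce phases (not Flink's or Spark's actual API, where the same phases are distributed across a cluster).

```python
# Toy illustration of the MapReduce programming model that Flink and Spark generalise.
from collections import defaultdict

def map_phase(lines):
    for line in lines:
        for word in line.split():
            yield word, 1                        # emit (key, value) pairs

def shuffle(pairs):
    groups = defaultdict(list)                   # group values by key
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {key: sum(values) for key, values in groups.items()}

lines = ["stream batch stream", "batch api"]
print(reduce_phase(shuffle(map_phase(lines))))   # {'stream': 2, 'batch': 2, 'api': 1}
```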

  3. Process of adoption communication openness in adoptive families: adopters’ perspective

    Directory of Open Access Journals (Sweden)

    Maria Acciaiuoli Barbosa-Ducharne

    2016-01-01

    Full Text Available Abstract Communication about adoption is a family interaction process which is more than the simple exchange of information. Adoption communication can be characterized in terms of the level of openness of family conversations regarding the child’s past and the degree of the family’s adoption social disclosure. The objective of this study is to explore the process of adoption communication openness in Portuguese adoptive families by identifying the impact of variables related to the adoption process, the adoptive parenting and the adoptee. One hundred twenty five parents of children aged 3 to 15, who were adopted on average 4 years ago, participated in this study. Data was collected during home visits using the Parents Adoption Process Interview. A cluster analysis identified three different groups of families according to the level of adoption communication openness within the family and outside. The findings also showed that the process of the adoption communication openness started when parents decided to adopt, developed in parent-child interaction and was susceptible to change under professional intervention. The relevance of training given to prospective adopters and of professional practice based on scientific evidence is highlighted.

  4. Open Source Web Based Geospatial Processing with OMAR

    Directory of Open Access Journals (Sweden)

    Mark Lucas

    2009-01-01

    Full Text Available The availability of geospatial data sets is exploding. New satellites, aerial platforms, video feeds, global positioning system tagged digital photos, and traditional GIS information are dramatically increasing across the globe. These raw materials need to be dynamically processed, combined and correlated to generate value added information products to answer a wide range of questions. This article provides an overview of OMAR web based geospatial processing. OMAR is part of the Open Source Software Image Map project under the Open Source Geospatial Foundation. The primary contributors of OSSIM make their livings by providing professional services to US Government agencies and programs. OMAR provides one example that open source software solutions are increasingly being deployed in US government agencies. We will also summarize the capabilities of OMAR and its plans for near term development.

  5. Agents-based distributed processes control systems

    Directory of Open Access Journals (Sweden)

    Adrian Gligor

    2011-12-01

    Full Text Available Large industrial distributed systems have seen remarkable development in recent years. We may note an increase in their structural and functional complexity, together with growing requirements. These are some of the reasons why numerous research efforts, energy and resources are devoted to solving problems related to these types of systems. The paper addresses the issue of industrial distributed systems, with special attention given to distributed industrial process control systems. A solution for a distributed process control system based on mobile intelligent agents is presented. The main objective of the proposed system is to provide an optimal solution in terms of costs, maintenance, reliability and flexibility. The paper focuses on the requirements, architecture, functionality and advantages brought by the proposed solution.

  6. An educational tool for interactive parallel and distributed processing

    DEFF Research Database (Denmark)

    Pagliarini, Luigi; Lund, Henrik Hautop

    2012-01-01

    In this article we try to describe how the modular interactive tiles system (MITS) can be a valuable tool for introducing students to interactive parallel and distributed processing programming. This is done by providing a hands-on educational tool that allows a change in the representation...... of abstract problems related to designing interactive parallel and distributed systems. Indeed, the MITS seems to bring a series of goals into education, such as parallel programming, distributedness, communication protocols, master dependency, software behavioral models, adaptive interactivity, feedback...... interactive parallel and distributed processing with different behavioral software models such as open loop, randomness-based, rule-based, user interaction-based, and AI- and ALife-based software.

  7. Cluster-Enabled OpenMP: An OpenMP Compiler for the SCASH Software Distributed Shared Memory System

    Directory of Open Access Journals (Sweden)

    Mitsuhisa Sato

    2001-01-01

    Full Text Available OpenMP is attracting wide-spread interest because of its easy-to-use parallel programming model for shared memory multiprocessors. We have implemented a "cluster-enabled" OpenMP compiler for a page-based software distributed shared memory system, SCASH, which works on a cluster of PCs. It allows OpenMP programs to run transparently in a distributed memory environment. The compiler transforms OpenMP programs into parallel programs using SCASH so that shared global variables are allocated at run time in the shared address space of SCASH. A set of directives is added to specify data mapping and loop scheduling method which schedules iterations onto threads associated with the data mapping. Our experimental results show that the data mapping may greatly impact on the performance of OpenMP programs in the software distributed shared memory system. The performance of some NAS parallel benchmark programs in OpenMP is improved by using our extended directives.

  8. Parallelizing AT with open multi-processing and MPI

    Institute of Scientific and Technical Information of China (English)

    罗承明; 田顺强; 王坤; 张满洲; 张庆磊; 姜伯承

    2015-01-01

    Simulating charged particle motion through the elements is necessary to understand modern particle accelerators. The particle numbers and the number of turns in a synchrotron are huge, so a simulation can be time-consuming. Open multi-processing (OpenMP) is a convenient method to speed up the computation on multi-core, shared-memory computers. Using the message passing interface (MPI), which is based on a non-uniform memory access architecture, a coarse-grained parallel algorithm is set up in the Accelerator Toolbox (AT) for dynamic tracking processes. The computing speedup of the tracking process is 3.77 times with a quad-core CPU computer, and the speed grows almost linearly with the number of CPU cores.
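
The coarse-grained strategy described here works because particles are tracked independently and can be split across workers; the reported 3.77x corresponds to a speedup T_serial/T_parallel on four cores. The Python multiprocessing sketch below illustrates that splitting pattern only; the "tracking" map is a made-up toy, not AT's beam physics, and the actual implementation uses OpenMP/MPI in a compiled code base.

```python
# Coarse-grained parallel tracking sketch (illustrative, not the AT/MPI code):
# particles are independent, so they can be farmed out to worker processes.
from multiprocessing import Pool

def track(particle, turns=1000):
    x, px = particle
    for _ in range(turns):                 # stand-in for passing through lattice elements
        x, px = x + 0.01 * px, px - 0.01 * x
    return x, px

if __name__ == "__main__":
    particles = [(1e-3 * i, 0.0) for i in range(1000)]
    with Pool(processes=4) as pool:        # quad-core, as in the reported test
        results = pool.map(track, particles)
    print(len(results), "particles tracked")
```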

  9. Asymmetric Simple Exclusion Process with Open Boundaries and Quadratic Harnesses

    Science.gov (United States)

    Bryc, Włodek; Wesołowski, Jacek

    2017-04-01

    We show that the joint probability generating function of the stationary measure of a finite state asymmetric exclusion process with open boundaries can be expressed in terms of joint moments of Markov processes called quadratic harnesses. We use our representation to prove the large deviations principle for the total number of particles in the system. We use the generator of the Markov process to show how explicit formulas for the average occupancy of a site arise for special choices of parameters. We also give similar representations for limits of stationary measures as the number of sites tends to infinity.

  10. Parallel Programming with Matrix Distributed Processing

    CERN Document Server

    Di Pierro, Massimo

    2005-01-01

    Matrix Distributed Processing (MDP) is a C++ library for fast development of efficient parallel algorithms. It constitutes the core of FermiQCD. MDP enables programmers to focus on algorithms, while parallelization is dealt with automatically and transparently. Here we present a brief overview of MDP and examples of applications in Computer Science (Cellular Automata), Engineering (PDE Solver) and Physics (Ising Model).

  11. Transient flow analysis of integrated valve opening process

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Xinming; Qin, Benke; Bo, Hanliang, E-mail: bohl@tsinghua.edu.cn; Xu, Xingxing

    2017-03-15

    Highlights: • The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology and the integrated valve (IV) is the key control component. • The transient flow experiment induced by the IV is conducted and the test results are analyzed to determine its working mechanism. • The theoretical model of the IV opening process is established and applied to obtain the variation of the transient flow characteristic parameters. - Abstract: The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology and the IV is the key control component. The working principle of the integrated valve (IV) is analyzed and the IV hydraulic experiment is conducted. There is a transient flow phenomenon in the valve opening process. The theoretical model of the IV opening process is established from the loop system control equations and boundary conditions. The valve opening boundary condition equation is established based on the IV three-dimensional flow field analysis results and the dynamic analysis of the valve core movement. The model calculation results are in good agreement with the experimental results. On this basis, the model is used to analyze the transient flow under high temperature conditions. The peak pressure head is consistent with the one under room temperature and the pressure fluctuation period is longer than the one under room temperature. Furthermore, the variation of the pressure transients with the fluid and loop structure parameters is analyzed. The peak pressure increases with the flow rate and decreases with an increase of the valve opening time. The pressure fluctuation period increases with the loop pipe length and the fluctuation amplitude remains largely unchanged under different equilibrium pressure conditions. These results lay the basis for the vibration reduction analysis of the CRHDS.

  12. An Overview of Models of Distributed Innovation. Open Innovation, User Innovation, and Social Innovation

    OpenAIRE

    GABISON Garry; PESOLE ANNAROSA

    2014-01-01

    This report discusses models of distributed innovation and how they differ in their nature, effects, and origins. Starting from Open Innovation, the paper analyses its methodological evolution, some of its applications, and the opportunities to apply it in a social context. Open Innovation has gained traction in the last ten years and because of this popularity, Open Innovation has been endowed with numerous meanings. This paper dives into the large literature associated with Open Innovati...

  13. Numerical Investigation of Developing Velocity Distributions in Open Channel Flows

    Directory of Open Access Journals (Sweden)

    Usman Ghani

    2014-04-01

    Full Text Available The velocity profile in an open channel flow keeps developing for a considerable length after the flow enters the channel. Laboratory experiments on open channel flows are therefore carried out in the fully developed flow region, which exists some length downstream of the inlet. In this research work an attempt has been made to investigate the impact of the roughness and slope of the channel bed on the length required for the establishment of fully developed flow in an open channel. A range of roughness values along with various slopes was considered for this purpose. It was observed that an increase in roughness results in a reduction of the development length, and that the development length drops drastically when the roughness reaches the range normally encountered in open channel flows with emergent vegetation or natural river flows. However, the change of slope did not have any noticeable effect on the development length. This work suggests that the CFD (Computational Fluid Dynamics) technique can be used to obtain a reliable development length before performing experimental work.

  14. The message processing and distribution system development

    Science.gov (United States)

    Whitten, K. L.

    1981-06-01

    A historical approach is used in presenting the life cycle development of the Navy's message processing and distribution system beginning with the planning phase and ending with the integrated logistic support phase. Several maintenance problems which occurred after the system was accepted for fleet use were examined to determine if they resulted from errors in the acquisition process. The critical decision points of the acquisition process are examined and constructive recommendations are made for avoiding the problems which hindered the successful development of this system.

  15. Analysis of multi-stage open shop processing systems

    CERN Document Server

    Eggermont, Christian; Woeginger, Gerhard J

    2011-01-01

    We study algorithmic problems in multi-stage open shop processing systems that are centered around reachability and deadlock detection questions. We characterize safe and unsafe system states. We show that it is easy to recognize system states that can be reached from the initial state (where the system is empty), but that in general it is hard to decide whether one given system state is reachable from another given system state. We show that the problem of identifying reachable deadlock states is hard in general open shop systems, but is easy in the special case where no job needs processing on more than two machines (by linear programming and matching theory), and in the special case where all machines have capacity one (by graph-theoretic arguments).

  16. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    NARCIS (Netherlands)

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    1999-01-01

    A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is rev

  17. Raster Data Partitioning for Supporting Distributed GIS Processing

    Science.gov (United States)

    Nguyen Thai, B.; Olasz, A.

    2015-08-01

    A proof-of-concept implementation has been made for raster data partitioning, distribution and processing. The first results on performance have been compared against the commercial software ERDAS IMAGINE 2011 and 2014. Partitioning methods depend heavily on the application area, therefore we may consider data partitioning as a preprocessing step before applying processing services to the data. As a proof of concept we have implemented a simple tile-based partitioning method that splits an image into smaller grids (NxM tiles) and compared the processing time against existing methods using an NDVI calculation. The concept is demonstrated using our own open source processing framework.
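
To make the tile-based idea concrete, the minimal sketch below splits two bands into an N x M grid and computes NDVI = (NIR - Red)/(NIR + Red) per tile before stitching the result back together. In the framework each tile would be dispatched to a different worker; this sketch runs serially and uses random arrays as stand-ins for real bands.

```python
import numpy as np

def ndvi(red, nir):
    return (nir - red) / (nir + red + 1e-9)      # small epsilon avoids division by zero

def tiled_ndvi(red, nir, n, m):
    """Split both bands into an n x m grid, process each tile, and stitch back."""
    rows = []
    for red_row, nir_row in zip(np.array_split(red, n, axis=0), np.array_split(nir, n, axis=0)):
        tiles = [ndvi(r, s) for r, s in zip(np.array_split(red_row, m, axis=1),
                                            np.array_split(nir_row, m, axis=1))]
        rows.append(np.hstack(tiles))
    return np.vstack(rows)

red = np.random.rand(600, 800).astype(np.float32)   # stand-ins for real image bands
nir = np.random.rand(600, 800).astype(np.float32)
print(tiled_ndvi(red, nir, n=3, m=4).shape)          # (600, 800)
```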

  18. A Distributed Process Infrastructure for a Distributed Data Structure

    CERN Document Server

    Rodriguez, Marko A

    2008-01-01

    The Resource Description Framework (RDF) is continuing to grow outside the bounds of its initial function as a metadata framework and into the domain of general-purpose data modeling. This expansion has been facilitated by the continued increase in the capacity and speed of RDF database repositories known as triple-stores. High-end RDF triple-stores can hold and process on the order of 10 billion triples. In an effort to provide a seamless integration of the data contained in RDF repositories, the Linked Data community is providing specifications for linking RDF data sets into a universal distributed graph that can be traversed by both man and machine. While the seamless integration of RDF data sets is important, at the scale of the data sets that currently exist and will ultimately grow to become, the "download and index" philosophy of the World Wide Web will not so easily map over to the Semantic Web. This essay discusses the importance of adding a distributed RDF process infrastructure to the current distr...
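
For readers unfamiliar with RDF tooling, the sketch below loads and traverses a single RDF data set with the rdflib Python library (the file name is a placeholder). The essay's point is precisely that doing this by downloading and indexing every linked data set does not scale, which is why a distributed process infrastructure over the shared graph is argued for.

```python
# Minimal local RDF traversal with rdflib; a triple-store or distributed process
# infrastructure would operate on the same triples at far larger scale.
from rdflib import Graph

g = Graph()
g.parse("dataset.ttl", format="turtle")          # placeholder path to one RDF data set
for subject, predicate, obj in g:                # iterate over its triples
    print(subject, predicate, obj)
print(len(g), "triples loaded")
```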

  19. Open Core Data: Semantic driven data access and distribution for terrestrial and marine scientific drilling data

    Science.gov (United States)

    Fils, D.; Noren, A. J.; Lehnert, K. A.

    2015-12-01

    Open Core Data (OCD) is a science-driven, innovative, efficient, and scalable infrastructure for data generated by scientific drilling and coring projects across all Earth sciences. It is designed to make scientific drilling data semantically discoverable, persistent, citable, and approachable to maximize their utility to present and future geoscience researchers. Scientific drilling and coring is crucial for the advancement of the Earth sciences, unlocking new frontiers in the geologic record. Open Core Data will utilize and link existing data systems, services, and expertise of the JOIDES Resolution Science Operator (JRSO), the Continental Scientific Drilling Coordination Office (CSDCO), the Interdisciplinary Earth Data Alliance (IEDA) data facility, and the Consortium for Ocean Leadership (OL). Open Core Data will leverage efforts currently taking place under the EarthCube GeoLink Building Block and other previous efforts in Linked Open Data around ocean drilling data coordinated by OL. The OCD architecture for data distribution blends Linked Data Platform approaches with web services and schema.org use. OCD will further enable integration and tool development by assigning and using vocabularies, provenance, and unique IDs (DOIs, IGSN, URIs) in scientific drilling resources. A significant focus of this effort is to enable large-scale automated access to the data by domain-specific communities such as MagIC and Neotoma, providing them with a process to integrate the facility data into their data models, workflows and tools. This aspect will encompass methods to maintain awareness of authority information, enabling users to trace data back to the originating facility. Initial work on OCD is taking place under a supplemental award to IEDA. This talk gives an overview of that work to date and planned future directions for the distribution of scientific drilling data by this effort.

  20. Generalized parton distributions and exclusive processes

    Energy Technology Data Exchange (ETDEWEB)

    Guzey, Vadim [Hampton U.

    2013-10-01

    In the last fifteen years, GPDs have emerged as a powerful tool to reveal such aspects of the QCD structure of the nucleon as: - 3D parton correlations and distributions; - the spin content of the nucleon. Further advances in the field of GPDs and hard exclusive processes rely on: - developments in theory and new methods in phenomenology, such as new flexible parameterizations, neural networks, and global QCD fits; - new high-precision data covering unexplored kinematics: JLab at 6 and 12 GeV, Hermes with recoil detector, Compass, EIC. This slide-show presents: nucleon structure in QCD, particularly hard processes, factorization and parton distributions; and a brief overview of GPD phenomenology, including basic properties of GPDs, GPDs and the QCD structure of the nucleon, and constraining GPDs from experiments.

  1. Study on the Medical Image Distributed Dynamic Processing Method

    Institute of Scientific and Technical Information of China (English)

    张全海; 施鹏飞

    2003-01-01

    To meet the challenge of rapidly implementing advanced, time-consuming medical image processing algorithms, it is necessary to develop a medical image processing technology that can process a 2D or 3D medical image dynamically on the web. In earlier systems, only static image processing could be provided because of the limitations of web technology. The development of Java and CORBA (common object request broker architecture) overcomes the shortcomings of static web applications and makes dynamic processing of medical images on the web available. To develop an open solution for distributed computing, we integrate Java and the web with CORBA and present a web-based medical image dynamic processing method, which adopts Java technology as the language for programming the application and the components of the web, and utilizes the CORBA architecture to cope with the heterogeneous nature of a complex distributed system. The method also provides a platform-independent, transparent processing architecture to implement advanced image routines and enables users to access large datasets and resources according to the requirements of medical applications. The experiment in this paper shows that the medical image dynamic processing method implemented on the web by using Java and CORBA is feasible.

  2. Fouling distribution in forward osmosis membrane process.

    Science.gov (United States)

    Lee, Junseok; Kim, Bongchul; Hong, Seungkwan

    2014-06-01

    Fouling behavior along the length of a membrane module was systematically investigated by performing simple modeling and lab-scale experiments of a forward osmosis (FO) membrane process. The flux distribution model developed in this study showed good agreement with experimental results, validating the robustness of the model. This model demonstrated, as expected, that the permeate flux decreased along the membrane channel due to the decreasing osmotic pressure differential across the FO membrane. A series of fouling experiments were conducted with the draw and feed solutions at various recoveries simulated by the model. The simulated fouling experiments revealed that higher organic (alginate) fouling, and thus more flux decline, was observed at the last section of the membrane channel, as foulants in the feed solution became more concentrated. Furthermore, the water flux in the FO process declined more severely as the recovery increased, due to more foulants being transported to the membrane surface with elevated solute concentrations at higher recovery, which created favorable solution environments for organic adsorption. The fouling reversibility also decreased at the last section of the membrane channel, suggesting that the fouling distribution on the FO membrane along the module should be carefully examined to improve overall cleaning efficiency. Lastly, it was found that the fouling distribution observed in co-current flow operation became less pronounced in counter-current flow operation of the FO membrane process.
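
A heavily simplified sketch of the kind of along-module flux model referred to above is given below; it assumes van't Hoff osmotic pressures, ignores internal and external concentration polarization, and uses placeholder parameter values, so it reproduces only the qualitative trend (flux decays along the channel as the draw dilutes and the feed concentrates), not the paper's results.

```python
# Simplified co-current FO module sketch (illustrative only, not the paper's model):
# water flux J_w = A * (pi_draw - pi_feed), with van't Hoff osmotic pressure
# pi = i * c * R * T. All parameter values are placeholders.
R, T, i_vh = 8.314, 298.0, 2.0           # gas constant J/(mol K), temperature K, van't Hoff factor
A = 1.0e-12                              # water permeability coefficient, m/(s Pa)

def osmotic_pressure(c):                 # c in mol/m^3 -> Pa
    return i_vh * c * R * T

def flux_profile(c_feed, c_draw, q_feed, q_draw, area, n_seg=20):
    """March along the module, updating both streams with a solute mass balance."""
    fluxes, dA = [], area / n_seg
    for _ in range(n_seg):
        jw = A * (osmotic_pressure(c_draw) - osmotic_pressure(c_feed))
        dv = jw * dA                                                    # permeate in this segment, m^3/s
        c_feed, q_feed = c_feed * q_feed / (q_feed - dv), q_feed - dv   # feed concentrates
        c_draw, q_draw = c_draw * q_draw / (q_draw + dv), q_draw + dv   # draw dilutes
        fluxes.append(jw)
    return fluxes

profile = flux_profile(c_feed=10.0, c_draw=600.0, q_feed=1e-5, q_draw=1e-5, area=0.5)
print(f"inlet flux {profile[0]:.2e} m/s, outlet flux {profile[-1]:.2e} m/s")
```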

  3. ERP and E-Business Application Deployment in Open Source Distributed Cloud Systems

    Directory of Open Access Journals (Sweden)

    George SUCIU

    2012-10-01

    Full Text Available In this paper we present the way in which we combine SlapOS, the first open source operating system for distributed cloud computing, and Enterprise Resource Planning (ERP) to provide a simple, unified API for e-business applications based on IaaS, PaaS and SaaS models. SlapOS is based on a grid computing daemon – called slapgrid – which is capable of installing any software on a PC and instantiating any number of processes of potentially infinite duration of any installed software, using a master-slave model. The SlapOS Master follows an ERP model to handle process allocation optimization and billing at the same time.

  4. Quasi-stationary distributions and population processes

    CERN Document Server

    Méléard, Sylvie

    2011-01-01

    This survey concerns the study of quasi-stationary distributions with a specific focus on models derived from ecology and population dynamics. We are concerned with the long-time behavior of different stochastic population size processes when 0 is an absorbing point almost surely attained by the process. The hitting time of this point, namely the extinction time, can be large compared to the physical time and the population size can fluctuate for a long time before extinction actually occurs. This phenomenon can be understood through the study of quasi-limiting distributions. In this paper, general results on quasi-stationarity are given and examples are developed in detail. We show in particular how this notion is related to the spectral properties of the semi-group of the process killed at 0. Then we study different stochastic population models including nonlinear terms modeling the regulation of the population. These models take values in countable sets (as birth and death processes) or in continuou...
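
For reference, the standard definition used in this literature can be stated compactly (generic notation assumed here: state space E, absorbed state 0, absorption time T_0):

```latex
% A probability measure \nu on E \setminus \{0\} is quasi-stationary for the process
% (X_t) absorbed at 0 at time T_0 if, for every t \ge 0 and every measurable set A,
\mathbb{P}_{\nu}\!\left( X_t \in A \,\middle|\, T_0 > t \right) = \nu(A).
```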

  5. Distributed radiofrequency signal processing using multicore fibers

    Science.gov (United States)

    Garcia, S.; Gasulla, I.

    2016-11-01

    Next generation fiber-wireless communication paradigms will require new technologies to address the current limitations to massive capacity, connectivity and flexibility. Multicore optical fibers, which were conceived for high-capacity digital communications, can bring numerous advantages to fiber-wireless radio access architectures. Besides radio over fiber parallel distribution and multiple antenna connectivity, multicore fibers can implement, at the same time, a variety of broadband processing functionalities for microwave and millimeter-wave signals. This approach leads to the novel concept of "fiber-distributed signal processing". In particular, we capitalize on the spatial parallelism inherent to multicore fibers to implement a broadband tunable true time delay line, which is the basis of multiple processing applications such as signal filtering, arbitrary waveform generation and squint-free radio beamsteering. We present the design of trench-assisted heterogeneous multicore fibers composed of cores featuring individual spectral group delays and chromatic dispersion profiles. Besides fulfilling the requirements for true time delay line operation, the MCFs are optimized in terms of higher-order dispersion, crosstalk and bend sensitivity. Microwave photonics signal processing will benefit from the performance stability, 2D operation versatility and compactness brought by the reported fiberintegrated solution.

  6. Spatial Information Processing: Standards-Based Open Source Visualization Technology

    Science.gov (United States)

    Hogan, P.

    2009-12-01

    . Spatial information intelligence is a global issue that will increasingly affect our ability to survive as a species. Collectively we must better appreciate the complex relationships that make life on Earth possible. Providing spatial information in its native context can accelerate our ability to process that information. To maximize this ability to process information, three basic elements are required: data delivery (server technology), data access (client technology), and data processing (information intelligence). NASA World Wind provides open source client and server technologies based on open standards. The possibilities for data processing and data sharing are enhanced by this inclusive infrastructure for geographic information. It is interesting that this open source and open standards approach, unfettered by proprietary constraints, simultaneously provides for entirely proprietary use of this same technology. 1. WHY WORLD WIND? NASA World Wind began as a single program with specific functionality, to deliver NASA content. But as the possibilities for virtual globe technology became more apparent, we found that while enabling a new class of information technology, we were also getting in the way. Researchers, developers and even users expressed their desire for World Wind functionality in ways that would service their specific needs. They want it in their web pages. They want to add their own features. They want to manage their own data. They told us that only with this kind of flexibility, could their objectives and the potential for this technology be truly realized. World Wind client technology is a set of development tools, a software development kit (SDK) that allows a software engineer to create applications requiring geographic visualization technology. 2. MODULAR COMPONENTRY Accelerated evolution of a technology requires that the essential elements of that technology be modular components such that each can advance independent of the other

  7. Gossip Algorithms for Distributed Signal Processing

    CERN Document Server

    Dimakis, Alexandros G; Moura, Jose M F; Rabbat, Michael G; Scaglione, Anna

    2010-01-01

    Gossip algorithms are attractive for in-network processing in sensor networks because they do not require any specialized routing, there is no bottleneck or single point of failure, and they are robust to unreliable wireless network conditions. Recently, there has been a surge of activity in the computer science, control, signal processing, and information theory communities, developing faster and more robust gossip algorithms and deriving theoretical performance guarantees. This article presents an overview of recent work in the area. We describe convergence rate results, which are related to the number of transmitted messages and thus the amount of energy consumed in the network for gossiping. We discuss issues related to gossiping over wireless links, including the effects of quantization and noise, and we illustrate the use of gossip algorithms for canonical signal processing tasks including distributed estimation, source localization, and compression.
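
The canonical example is distributed averaging; the minimal randomized pairwise-gossip sketch below shows the mechanism (it assumes any two nodes can be paired, whereas a real wireless deployment restricts exchanges to radio neighbours and must cope with quantization and noise, as the article discusses).

```python
# Randomized pairwise gossip: at each step two nodes average their values;
# every node converges to the network-wide mean without a central coordinator.
import random

def gossip_average(values, steps=5000, seed=0):
    rng = random.Random(seed)
    x = list(values)
    for _ in range(steps):
        i, j = rng.sample(range(len(x)), 2)      # simplification: any pair may exchange
        x[i] = x[j] = 0.5 * (x[i] + x[j])
    return x

readings = [20.0, 22.5, 19.0, 25.0, 23.5]
print(sum(readings) / len(readings))             # true mean: 22.0
print(gossip_average(readings)[:3])              # each node approaches 22.0
```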

  8. Distributed data processing for public health surveillance

    Directory of Open Access Journals (Sweden)

    Yih Katherine

    2006-09-01

    Full Text Available Abstract Background Many systems for routine public health surveillance rely on centralized collection of potentially identifiable, individual personal health information (PHI) records. Although individual, identifiable patient records are essential for conditions for which there is mandated reporting, such as tuberculosis or sexually transmitted diseases, they are not routinely required for effective syndromic surveillance. Public concern about the routine collection of large quantities of PHI to support non-traditional public health functions may make alternative surveillance methods that do not rely on centralized identifiable PHI databases increasingly desirable. Methods The National Bioterrorism Syndromic Surveillance Demonstration Program (NDP) is an example of one alternative model. All PHI in this system is initially processed within the secured infrastructure of the health care provider that collects and holds the data, using uniform software distributed and supported by the NDP. Only highly aggregated count data is transferred to the datacenter for statistical processing and display. Results Detailed, patient-level information is readily available to the health care provider to elucidate signals observed in the aggregated data, or for ad hoc queries. We briefly describe the benefits and disadvantages associated with this distributed processing model for routine automated syndromic surveillance. Conclusion For well-defined surveillance requirements, the model can be successfully deployed with very low risk of inadvertent disclosure of PHI – a feature that may make participation in surveillance systems more feasible for organizations and more appealing to the individuals whose PHI they hold. It is possible to design and implement distributed systems to support non-routine public health needs if required.
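
The essential point — identifiable records stay with the provider and only aggregated counts travel — can be illustrated with a small sketch; the record fields and grouping keys below are invented for the example and are not the NDP's actual schema.

```python
# Sketch of the distributed-processing idea: identifiable records never leave the
# provider; only counts per (date, syndrome, ZIP prefix) are sent to the datacenter.
from collections import Counter

def local_aggregate(visits):
    """Runs inside the provider's own infrastructure on identifiable records."""
    return Counter((v["date"], v["syndrome"], v["zip"][:3]) for v in visits)

visits = [
    {"date": "2006-09-01", "syndrome": "respiratory",      "zip": "02115", "name": "..."},
    {"date": "2006-09-01", "syndrome": "respiratory",      "zip": "02114", "name": "..."},
    {"date": "2006-09-01", "syndrome": "gastrointestinal", "zip": "02139", "name": "..."},
]
print(dict(local_aggregate(visits)))   # only these aggregated counts are transmitted
```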

  9. An Effective Framework for Distributed Geospatial Query Processing in Grids

    Directory of Open Access Journals (Sweden)

    CHEN, B.

    2010-08-01

    Full Text Available The emergence of the Internet has greatly revolutionized the way that geospatial information is collected, managed, processed and integrated. There are several important research issues to be addressed for distributed geospatial applications. First, the performance of geospatial applications needs to be considered in the Internet environment. In this regard, the Grid as an effective distributed computing paradigm is a good choice. The Grid uses a series of middleware to interconnect and merge various distributed resources into a super-computer with high-performance computation capability. Secondly, it is necessary to ensure the secure use of independent geospatial applications in the Internet environment. The Grid provides the utility of secure access to distributed geospatial resources. Additionally, it makes good sense to overcome the heterogeneity between individual geospatial information systems on the Internet. The Open Geospatial Consortium (OGC) proposes a number of generalized geospatial standards, e.g. OGC Web Services (OWS), to achieve interoperable access to geospatial applications. The OWS solution is feasible and widely adopted by both the academic community and the industry community. Therefore, we propose an integrated framework by incorporating OWS standards into Grids. Upon this framework distributed geospatial queries can be performed in an interoperable, high-performance and secure Grid environment.
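
Interoperability in such a framework comes from every service accepting the same standardized requests; as a purely illustrative example, an OGC WMS 1.1.1 GetMap request might be assembled as follows (the endpoint and layer name are placeholders, not part of the described framework).

```python
# Building a standard OGC WMS GetMap request URL; any OWS-compliant server
# understands the same parameter set, which is what enables interoperability.
from urllib.parse import urlencode

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "example:landcover",        # placeholder layer name
    "STYLES": "",
    "SRS": "EPSG:4326",
    "BBOX": "102.0,29.5,104.5,31.5",      # minx, miny, maxx, maxy
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}
print("http://example.org/geoserver/wms?" + urlencode(params))
```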

  10. Hydrazoic acid distribution coefficients in Purex processing

    Energy Technology Data Exchange (ETDEWEB)

    Kelmers, A.D.; Browning, D.N.

    1977-01-01

    Mixtures of hydroxylamine nitrate and hydrazine are being considered for the reductive stripping of plutonium during Purex processing. The hydrazine functions as a holding reductant for plutonium(III) by destroying nitrous acid via the fast reaction N2H4 + HNO2 --> HN3 + 2H2O, which leads to the stoichiometric formation of hydrazoic acid. We have measured the distribution coefficients for hydrazoic acid between nitric acid solutions and tributylphosphate-dodecane solutions. Values in the range of 1 to 10 were obtained under typical Purex process conditions. This indicates that most of the hydrazoic acid will be present in the organic phase leaving the plutonium stripping contactors. The distribution coefficients can be expressed as log(E_O/A) = n log[TBP]_free + log K', where K' is 7.0, 10.0, 5.1 and 4.7, respectively, at 25, 35, 45 and 55 °C; and the corresponding values of n are 1.11, 1.27, 0.97 and 1.20.
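
A worked example of the reported correlation: with the tabulated n and K' values, the distribution coefficient at an assumed free-TBP concentration (0.5 M here, chosen only for illustration, not a value from the paper) evaluates as follows.

```python
# Evaluate log10(E_O/A) = n * log10([TBP]_free) + log10(K') at each temperature.
import math

constants = {25: (1.11, 7.0), 35: (1.27, 10.0), 45: (0.97, 5.1), 55: (1.20, 4.7)}  # T in C: (n, K')

def distribution_coefficient(tbp_free, temp_c):
    n, k = constants[temp_c]
    return 10 ** (n * math.log10(tbp_free) + math.log10(k))

for t in (25, 35, 45, 55):
    print(t, "C:", round(distribution_coefficient(0.5, t), 2))   # e.g. about 3.2 at 25 C
```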

  11. Open Access, Library Subscriptions, and Article Processing Charges

    KAUST Repository

    Vijayakumar, J.K.

    2016-05-01

    Hybrid journals contain articles behind a paywall, available by subscription, as well as papers made open access when the author pays an article processing charge (APC). In such cases, an institution can end up paying twice and publishers tend to double-dip. Discussions and pilot models are emerging on pricing options, such as "offset pricing" [where APCs are adjusted or discounted against subscription costs as vouchers or reductions in next year's subscriptions, APCs beyond the subscription costs are modestly capped, etc.], which reduce institutions' costs. This presentation will explain the different models available and how we can attain a transparent costing structure, where the scholarly community can feel the fairness in publishers' pricing mechanisms. Though most of the offset systems are developed through national-level or consortium-level negotiations, the experience of individual institutions like KAUST that subscribe to large e-journal collections is important in making the right decisions on saving institutional costs and supporting openness in scholarly communications.

  12. Open Access Article Processing Charges: DOAJ Survey May 2014

    Directory of Open Access Journals (Sweden)

    Heather Morrison

    2015-02-01

    Full Text Available As of May 2014, the Directory of Open Access Journals (DOAJ listed close to ten thousand fully open access, peer reviewed, scholarly journals. Most of these journals do not charge article processing charges (APCs. This article reports the results of a survey of the 2567 journals, or 26% of journals listed in DOAJ, that do have APCs based on a sample of 1432 of these journals. Results indicate a volatile sector that would make future APCs difficult to predict for budgeting purposes. DOAJ and publisher title lists often did not closely match. A number of journals were found on examination not to have APCs. A wide range of publication costs was found for every publisher type. The average (mean APC of $964 contrasts with a mode of $0. At least 61% of publishers using APCs are commercial in nature, while many publishers are of unknown types. The vast majority of journals charging APCs (80% were found to offer one or more variations on pricing, such as discounts for authors from mid to low income countries, differential pricing based on article type, institutional or society membership, and/or optional charges for extras such as English language editing services or fast track of articles. The complexity and volatility of this publishing landscape is discussed.

  13. QUALITY AND PROCESSES OF BANGLADESH OPEN UNIVERSITY COURSE MATERIALS DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    K. M. Rezanur RAHMAN

    2006-04-01

    Full Text Available A new member of the mega-universities, Bangladesh Open University (BOU) introduced a course team approach for developing effective course materials for distance students. BOU teaching media include printed course books, study guides, radio and television broadcasts, audiocassettes and occasional face-to-face tutorials. Each course team comprises specialist course writer(s), editor, trained style editor, graphic designer, illustrator, audio-visual producer and anonymous referees. An editorial board or preview committee is responsible for the final approval to publish or broadcast materials for learners. This approach has proved to be effective, but appears to be complicated and time-consuming. This report focuses on the quality and processes of BOU course materials development, taking into account the strengths and weaknesses of the current approach.

  14. Processing Government Data: ZIP Codes, Python, and OpenRefine

    Directory of Open Access Journals (Sweden)

    Frank Donnelly

    2014-07-01

    Full Text Available While there is a vast amount of useful US government data on the web, some of it is in a raw state that is not readily accessible to the average user. Data librarians can improve accessibility and usability for their patrons by processing data to create subsets of local interest and by appending geographic identifiers to help users select and aggregate data. This case study illustrates how census geography crosswalks, Python, and OpenRefine were used to create spreadsheets of non-profit organizations in New York City from the IRS Tax-Exempt Organization Masterfile. This paper illustrates the utility of Python for data librarians and should be particularly insightful for those who work with address-based data.
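
The crosswalk step the article describes amounts to a join on ZIP code; a minimal sketch with Python's csv module is shown below (the file and column names are illustrative, not the actual IRS or Census file layouts).

```python
# Append a geographic identifier to each record by joining on ZIP code.
import csv

def load_crosswalk(path):
    """Map ZIP code -> local geography (placeholder columns 'zip' and 'borough')."""
    with open(path, newline="") as f:
        return {row["zip"]: row["borough"] for row in csv.DictReader(f)}

def tag_records(in_path, out_path, crosswalk):
    """Copy the input CSV, adding a 'borough' column looked up from the crosswalk."""
    with open(in_path, newline="") as fin, open(out_path, "w", newline="") as fout:
        reader = csv.DictReader(fin)
        writer = csv.DictWriter(fout, fieldnames=reader.fieldnames + ["borough"])
        writer.writeheader()
        for row in reader:
            row["borough"] = crosswalk.get(row["zip"], "unknown")
            writer.writerow(row)

# Example call with hypothetical file names:
# tag_records("exempt_orgs.csv", "exempt_orgs_tagged.csv", load_crosswalk("zip_to_borough.csv"))
```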

  15. Behind Linus's Law: Investigating Peer Review Processes in Open Source

    Science.gov (United States)

    Wang, Jing

    2013-01-01

    Open source software has revolutionized the way people develop software, organize collaborative work, and innovate. The numerous open source software systems that have been created and adopted over the past decade are influential and vital in all aspects of work and daily life. The understanding of open source software development can enhance its…

  16. Opening up the Collaborative Problem-Solving Process to Solvers

    Science.gov (United States)

    Robison, Tyler

    2013-01-01

    In software systems, having features of openness means that some of the internal components of the system are made available for examination by users. Researchers have looked at different effects of open systems a great deal in the area of educational technology, but also in areas outside of education. Properly used, openness has the potential to…

  17. Distributive Distillation Enabled by Microchannel Process Technology

    Energy Technology Data Exchange (ETDEWEB)

    Arora, Ravi

    2013-01-22

    The application of microchannel technology for distributive distillation was studied to achieve the Grand Challenge goals of 25% energy savings and 10% return on investment. In Task 1, a detailed study was conducted and two distillation systems were identified that would meet the Grand Challenge goals if the microchannel distillation technology was used. Material and heat balance calculations were performed to develop process flow sheet designs for the two distillation systems in Task 2. The process designs were focused on two methods of integrating the microchannel technology 1) Integrating microchannel distillation to an existing conventional column, 2) Microchannel distillation for new plants. A design concept for a modular microchannel distillation unit was developed in Task 3. In Task 4, Ultrasonic Additive Machining (UAM) was evaluated as a manufacturing method for microchannel distillation units. However, it was found that a significant development work would be required to develop process parameters to use UAM for commercial distillation manufacturing. Two alternate manufacturing methods were explored. Both manufacturing approaches were experimentally tested to confirm their validity. The conceptual design of the microchannel distillation unit (Task 3) was combined with the manufacturing methods developed in Task 4 and flowsheet designs in Task 2 to estimate the cost of the microchannel distillation unit and this was compared to a conventional distillation column. The best results were for a methanol-water separation unit for the use in a biodiesel facility. For this application microchannel distillation was found to be more cost effective than conventional system and capable of meeting the DOE Grand Challenge performance requirements.

  18. ACToR Chemical Structure processing using Open Source ...

    Science.gov (United States)

    ACToR (Aggregated Computational Toxicology Resource) is a centralized database repository developed by the National Center for Computational Toxicology (NCCT) at the U.S. Environmental Protection Agency (EPA). Free and open source tools were used to compile toxicity data from over 1,950 public sources. ACToR contains chemical structure information and toxicological data for over 558,000 unique chemicals. The database primarily includes data from NCCT research programs, in vivo toxicity data from ToxRef, human exposure data from ExpoCast, high-throughput screening data from ToxCast and high quality chemical structure information from the EPA DSSTox program. The DSSTox database is a chemical structure inventory for the NCCT programs and currently has about 16,000 unique structures. Included are also data from PubChem, ChemSpider, USDA, FDA, NIH and several other public data sources. ACToR has been a resource to various international and national research groups. Most of our recent efforts on ACToR are focused on improving the structural identifiers and Physico-Chemical properties of the chemicals in the database. Organizing this huge collection of data and improving the chemical structure quality of the database has posed some major challenges. Workflows have been developed to process structures, calculate chemical properties and identify relationships between CAS numbers. The Structure processing workflow integrates web services (PubChem and NIH NCI Cactus) to d

  19. Sensitive periods differentiate processing of open- and closed-class words: an ERP study of bilinguals.

    Science.gov (United States)

    Weber-Fox, C; Neville, H J

    2001-12-01

    The goal of this study was to test the hypothesis that neural processes for language are heterogeneous in their adaptations to maturation and experience. This study examined whether the neural processes for open- and closed-class words are differentially affected by delays in second-language immersion. In English, open-class words primarily convey referential meaning, whereas closed-class words are primarily related to grammatical information in sentence processing. Previous studies indicate that event-related brain potentials (ERPs) elicited by these word classes display nonidentical distributions and latencies, show different developmental time courses, and are differentially affected by early language experience in Deaf individuals. In this study, ERPs were recorded from 10 monolingual English speakers and 53 Chinese-English bilingual speakers who were grouped according to their age of immersion in English: 1-3, 4-6, 7-10, 11-13, and >15 years of age. Closed-class words elicited an N280 that was largest over left anterior electrode sites for all groups. However, the peak latency was later (>35 ms) in bilingual speakers immersed in English after 7 years of age. In contrast, the latencies and distributions of the N350 elicited by open-class words were similar in all groups. In addition, the N400, elicited by semantic anomalies (open-class words that violated semantic expectation), displayed increased peak latencies for only the later-learning bilingual speakers (>11 years). These results are consistent with the hypothesis that language subprocesses are differentially sensitive to the timing of second-language experience.

  20. VELOCITY DISTRIBUTION IN TRAPEZOID-SECTION OPEN CHANNEL FLOW WITH A NEW REYNOLDS-STRESS EXPRESSION

    Institute of Scientific and Technical Information of China (English)

    Ma Zheng

    2003-01-01

    By considering that the coherent structure is the main cause of the Reynolds stress, a new Reynolds stress expression was given. On this basis the velocity distribution in the trapezoid-section open channel flow was worked out with the pseudo-spectral method. The results were compared with experimental data and the influence of the ratio of length to width of the cross-section and the lateral inclination on the velocity distribution was analyzed. This model can be used the large flux in rivers and open channes.

  1. Parallel and distributed processing: applications to power systems

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Felix; Murphy, Liam [California Univ., Berkeley, CA (United States). Dept. of Electrical Engineering and Computer Sciences

    1994-12-31

    Applications of parallel and distributed processing to power systems problems are still in the early stages. Rapid progress in computing and communications promises a revolutionary increase in the capacity of distributed processing systems. In this paper, the state-of-the art in distributed processing technology and applications is reviewed and future trends are discussed. (author) 14 refs.,1 tab.

  2. Image Processing Based on OpenCV

    Institute of Scientific and Technical Information of China (English)

    苏慧娟; 于正林; 张桂林

    2014-01-01

    OpenCV has become one of the most popular computer vision libraries in recent years, and writing image processing code on top of it is considerably more efficient. This article aims to give a quick, comprehensive introduction to OpenCV: by presenting its data structures, the HighGUI library, and its image processing functions, it helps readers quickly form an overall picture of the library. The article describes in detail how to install and test version 2.4.4 under VS2010, so that readers can build their own code on this foundation. Finally, it introduces a concrete application of OpenCV through an adaptive threshold segmentation example.
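
The article's closing example, adaptive threshold segmentation, corresponds to OpenCV's adaptiveThreshold call; a small sketch is shown here with the Python bindings rather than the article's VS2010/C++ setup, with a placeholder input path and typical (not the article's) block size and offset.

```python
# Adaptive thresholding: each pixel is compared against a locally computed threshold.
import cv2

gray = cv2.imread("input.png", cv2.IMREAD_GRAYSCALE)        # placeholder path
binary = cv2.adaptiveThreshold(gray, 255,
                               cv2.ADAPTIVE_THRESH_GAUSSIAN_C,  # Gaussian-weighted local mean
                               cv2.THRESH_BINARY,
                               11, 2)                            # neighbourhood size, offset
cv2.imwrite("binary.png", binary)
```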

  3. Distributed architecture and distributed processing mode in urban sewage treatment

    Science.gov (United States)

    Zhou, Ruipeng; Yang, Yuanming

    2017-05-01

    Decentralized rural sewage treatment facilities are spread over a broad area, which makes their operation and management difficult. Based on an analysis of rural sewage treatment models and in response to these challenges, we describe the principle, structure and function of a distributed remote monitoring system with networking and network communication technology at its core, and use case studies to explore the system's features in the daily operation and management of decentralized rural sewage treatment facilities. Practice shows that the remote monitoring system provides technical support for the long-term operation and effective supervision of the facilities, and reduces operating, maintenance and supervision costs.

  4. Specifying Processes: Application to Electrical Power Distribution

    OpenAIRE

    Sabah Al-Fedaghi; Lina Al-Saleh

    2011-01-01

    Problem statement: This study deals with the problem of how to specify processes. Many process specification methodologies have been determined to be incomplete; for example, ISO 9000:2005 defines a process as transforming inputs into outputs. Nevertheless, the author of the Quality Systems Handbook declares that such a definition is incomplete because processes create results, and not necessarily by transforming inputs. Still, it is not clear what description of process can embed transfo...

  5. Achievements and open issues in the determination of polarized parton distribution functions

    CERN Document Server

    Nocera, Emanuele R

    2015-01-01

    I review the current status of the determination of helicity-dependent, or polarized, parton distribution functions from a comprehensive analysis of experimental data in perturbative quantum chromodynamics. I illustrate the latest achievements driven by new measurements in polarized proton-proton collisions at the Relativistic Heavy Ion Collider, namely the first evidence of a sizable polarized light sea quark asymmetry and of a positive polarized gluon distribution in the proton. I discuss which are the open issues in the determination of polarized distributions, and how these may be addressed in the future by ongoing, planned and proposed experimental programs.

  6. First exit distribution and path continuity of Hunt processes

    Institute of Scientific and Technical Information of China (English)

    ZHANG Hui-zeng; KANG Xu-sheng; ZHAO Min-zhi

    2009-01-01

    This paper gives a characterization of a Hunt process path by the first exit left limit distribution. It is also shown that if the first exit left limit distribution leaving any ball from the center is a uniform distribution on the sphere, then the Lévy process is a scaled Brownian motion.

  7. 基于OpenCV的图像处理%Image processing based on OpenCV

    Institute of Scientific and Technical Information of China (English)

    秦小文; 温志芳; 乔维维

    2011-01-01

    OpenCV is an open-source, free computer vision library released in recent years; the functions it provides make digital image and video processing very convenient to implement. Programs written in C++ with the object-oriented VC++ 6.0 tool chain also run considerably faster. This paper first describes the characteristics and structure of OpenCV, and then uses smoothing and image morphology as examples to introduce typical applications of OpenCV in digital image processing. The OpenCV algorithm library greatly simplifies digital image processing in VC++ and is bound to become a powerful tool in the field of image and video processing.
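
    The smoothing and morphology operations discussed above translate directly to the OpenCV Python bindings; the sketch below is an illustration under assumed kernel sizes and file names, not code from the article (which uses VC++ 6.0).

        import cv2

        img = cv2.imread("input.png", cv2.IMREAD_GRAYSCALE)

        # Smoothing: a 5x5 Gaussian blur suppresses pixel noise before further processing.
        smoothed = cv2.GaussianBlur(img, (5, 5), 0)

        # Morphology: an "opening" (erosion followed by dilation) with an elliptical
        # structuring element removes small bright specks while preserving larger shapes.
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
        opened = cv2.morphologyEx(smoothed, cv2.MORPH_OPEN, kernel)

        cv2.imwrite("opened.png", opened)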

  8. A radial distribution function-based open boundary force model for multi-centered molecules

    KAUST Repository

    Neumann, Philipp

    2014-06-01

    We derive an expression for radial distribution function (RDF)-based open boundary forcing for molecules with multiple interaction sites. Due to the high dimensionality of the molecule configuration space and the missing rotational invariance, a computationally cheap 1D approximation of the arising integral expressions, as in the single-centered case, is no longer possible. We propose a simple, yet accurate model invoking standard molecule- and site-based RDFs to approximate the respective integral equation. The new open boundary force model is validated for ethane in different scenarios and shows very good agreement with data from periodic simulations. © World Scientific Publishing Company.

  9. Very Large Scale Distributed Information Processing Systems

    Science.gov (United States)

    1991-09-27

    References cited in the report include: "Reliable Distributed Database Management", Proc. of the IEEE, May 1987, pp. 601-620; [GOTT88] Gottlob, Georg and Roberto Zicari, "Closed World Databases..."; Gottlob, Georg and Gio Wiederhold, "Interfacing Relational Databases and Prolog Efficiently," in Proceedings 2nd Expert Database Systems Conference, pp. 141.

  10. Quality and Processes of Bangladesh Open University Course Materials Development

    Science.gov (United States)

    Islam, Tofazzal; Rahman, Morshedur; Rahman, K. M. Rezanur

    2006-01-01

    A new member of the mega-Universities, Bangladesh Open University (BOU) introduced a course team approach for developing effective course materials for distance students. BOU teaching media includes printed course books, study guides, radio and television broadcasts, audiocassettes and occasional face-to-face tutorials. Each course team…

  11. Fluctuations in the Weakly Asymmetric Exclusion Process with Open Boundary Conditions

    Science.gov (United States)

    Derrida, B.; Enaud, C.; Landim, C.; Olla, S.

    2005-03-01

    We investigate the fluctuations around the average density profile in the weakly asymmetric exclusion process with open boundaries in the steady state. We show that these fluctuations are given, in the macroscopic limit, by a centered Gaussian field and we compute explicitly its covariance function. We use two approaches. The first method is dynamical and based on fluctuations around the hydrodynamic limit. We prove that the density fluctuations evolve macroscopically according to an autonomous stochastic equation, and we search for the stationary distribution of this evolution. The second approach, which is based on a representation of the steady state as a sum over paths, allows one to write the density fluctuations in the steady state as a sum over two independent processes, one of which is the derivative of a Brownian motion, the other one being related to a random path in a potential.

  12. Signal processing for distributed readout using TESs

    Science.gov (United States)

    Smith, Stephen J.; Whitford, Chris H.; Fraser, George W.

    2006-04-01

    We describe optimal filtering algorithms for determining energy and position resolution in position-sensitive Transition Edge Sensor (TES) Distributed Read-Out Imaging Devices (DROIDs). Improved algorithms, developed using a small-signal finite-element model, are based on least-squares minimisation of the total noise power in the correlated dual TES DROID. Through numerical simulations we show that significant improvements in energy and position resolution are theoretically possible over existing methods.
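
    For orientation, the single-channel form of such a noise-weighted least-squares (optimal) filter is the textbook expression below; the correlated dual-TES filter developed in the paper generalizes it to a two-channel noise covariance.

        % Frequency-domain signal model: D(f) = A S(f) + N(f), with pulse template S,
        % measured record D, and noise power spectral density P_N; the least-squares amplitude is
        \hat{A} \;=\; \frac{\sum_{f} \tilde{S}^{*}(f)\, \tilde{D}(f) / P_{N}(f)}
                           {\sum_{f} |\tilde{S}(f)|^{2} / P_{N}(f)}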

  13. Signal processing for distributed readout using TESs

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Stephen J. [Department of Physics and Astronomy, Space Research Centre, University of Leicester, Michael Atiyah Building, University Road, Leicester, LE1 7RH (United Kingdom)]. E-mail: sts@star.le.ac.uk; Whitford, Chris H. [Department of Physics and Astronomy, Space Research Centre, University of Leicester, Michael Atiyah Building, University Road, Leicester, LE1 7RH (United Kingdom); Fraser, George W. [Department of Physics and Astronomy, Space Research Centre, University of Leicester, Michael Atiyah Building, University Road, Leicester, LE1 7RH (United Kingdom)

    2006-04-15

    We describe optimal filtering algorithms for determining energy and position resolution in position-sensitive Transition Edge Sensor (TES) Distributed Read-Out Imaging Devices (DROIDs). Improved algorithms, developed using a small-signal finite-element model, are based on least-squares minimisation of the total noise power in the correlated dual TES DROID. Through numerical simulations we show that significant improvements in energy and position resolution are theoretically possible over existing methods.

  14. Leveraging human oversight and intervention in large-scale parallel processing of open-source data

    Science.gov (United States)

    Casini, Enrico; Suri, Niranjan; Bradshaw, Jeffrey M.

    2015-05-01

    The popularity of cloud computing along with the increased availability of cheap storage have led to the necessity of elaboration and transformation of large volumes of open-source data, all in parallel. One way to handle such extensive volumes of information properly is to take advantage of distributed computing frameworks like Map-Reduce. Unfortunately, an entirely automated approach that excludes human intervention is often unpredictable and error prone. Highly accurate data processing and decision-making can be achieved by supporting an automatic process through human collaboration, in a variety of environments such as warfare, cyber security and threat monitoring. Although this mutual participation seems easily exploitable, human-machine collaboration in the field of data analysis presents several challenges. First, due to the asynchronous nature of human intervention, it is necessary to verify that once a correction is made, all the necessary reprocessing is done in chain. Second, it is often necessary to minimize the amount of reprocessing in order to optimize the use of limited resources. To address these strict requirements, this paper introduces improvements to an innovative approach for human-machine collaboration in the processing of large amounts of open-source data in parallel.
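
    The two requirements above (re-run everything downstream of a correction, and nothing else) amount to dependency tracking; the toy sketch below illustrates the idea in plain Python and is not the authors' system. The step names are hypothetical.

        from collections import defaultdict

        class Pipeline:
            """Tracks which processing steps must be re-run after a human correction."""

            def __init__(self):
                self.downstream = defaultdict(set)   # step -> steps that consume its output
                self.dirty = set()

            def add_dependency(self, upstream, consumer):
                self.downstream[upstream].add(consumer)

            def mark_corrected(self, step):
                # A correction invalidates the step and, transitively, everything downstream.
                stack = [step]
                while stack:
                    s = stack.pop()
                    if s not in self.dirty:
                        self.dirty.add(s)
                        stack.extend(self.downstream[s])

            def steps_to_rerun(self):
                return sorted(self.dirty)

        pipe = Pipeline()
        pipe.add_dependency("extract", "geocode")
        pipe.add_dependency("geocode", "aggregate")
        pipe.add_dependency("extract", "classify")
        pipe.mark_corrected("geocode")        # analyst fixes one geocoding result
        print(pipe.steps_to_rerun())          # ['aggregate', 'geocode'] -- 'classify' is untouched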

  15. wradlib - an Open Source Library for Weather Radar Data Processing

    Science.gov (United States)

    Pfaff, Thomas; Heistermann, Maik; Jacobi, Stephan

    2014-05-01

    for interactive data exploration and analysis. Based on the powerful scientific python stack (numpy, scipy, matplotlib) and in parts augmented by functions compiled in C or Fortran, most routines are fast enough to also allow data intensive re-analyses or even real-time applications. From the organizational point of view, wradlib is intended to be community driven. To this end, the source code is made available using a distributed version control system (DVCS) with a publicly hosted repository. Code may be contributed using the fork/pull-request mechanism available to most modern DVCS. Mailing lists were set up to allow dedicated exchange among users and developers in order to fix problems and discuss new developments. Extensive documentation is a key feature of the library, and is available online at http://wradlib.bitbucket.org. It includes an individual function reference as well as examples, tutorials and recipes, showing how those routines can be combined to create complete processing workflows. This should allow new users to achieve results quickly, even without much prior experience with weather radar data.
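
    As a flavour of the routines such a library chains together, the sketch below shows one standard step of a weather-radar workflow, converting reflectivity in dBZ to rain rate with a Marshall-Palmer type Z-R relation; it uses plain numpy and does not reproduce wradlib's own function names.

        import numpy as np

        def dbz_to_rainrate(dbz, a=200.0, b=1.6):
            """Invert Z = a * R**b after converting dBZ to the linear reflectivity factor Z."""
            z = 10.0 ** (dbz / 10.0)        # dBZ -> Z [mm^6 m^-3]
            return (z / a) ** (1.0 / b)     # R [mm/h]

        scan = np.array([10.0, 25.0, 40.0, 55.0])   # sample dBZ values
        print(dbz_to_rainrate(scan))                # roughly [0.15, 1.33, 11.5, 100.0] mm/h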

  16. Sentence Comprehension: A Parallel Distributed Processing Approach

    Science.gov (United States)

    1989-07-14

    References cited in the report include: Manuscript, University of California, San Diego; Bates, E., & Wulfeck, B. (in press), Crosslinguistic studies of aphasia, in B. MacWhinney & E. Bates (Eds.), The crosslinguistic study of sentence processing, New York: Cambridge University Press; Chomsky, N. (1988), Lecture presented at the University of...; Miikkulainen, R., & Dyer, M. G.

  17. Proton momentum distribution in water: an open path integral molecular dynamics study

    Science.gov (United States)

    Morrone, Joseph A.; Srinivasan, Varadharajan; Sebastiani, Daniel; Car, Roberto

    2007-06-01

    Recent neutron Compton scattering experiments have detected the proton momentum distribution in water. The theoretical calculation of this property can be carried out via "open" path integral expressions. In this work, we present an extension of the staging path integral molecular dynamics method, which is then employed to calculate the proton momentum distributions of water in the solid, liquid, and supercritical phases. We utilize a flexible, single point charge empirical force field to model the system's interactions. The calculated momentum distributions show both agreement and discrepancies with experiment. The differences may be explained by the deviation of the force field from the true interactions. These distributions provide an abundance of information about the environment and interactions surrounding the proton.
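
    The "open" path integral expression referred to here relates the proton momentum distribution to the end-to-end distribution of the cut imaginary-time path; in its standard form (quoted for orientation, not copied from the paper):

        % \tilde{n}(x) is the distribution of the end-to-end vector of the open path of the tagged proton
        n(\mathbf{p}) \;=\; \frac{1}{(2\pi\hbar)^{3}} \int d\mathbf{x}\; \tilde{n}(\mathbf{x})\;
                            e^{-i\,\mathbf{p}\cdot\mathbf{x}/\hbar}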

  18. Empirical tests of Zipf's law mechanism in open source Linux distribution.

    Science.gov (United States)

    Maillart, T; Sornette, D; Spaeth, S; von Krogh, G

    2008-11-21

    Zipf's power law is a ubiquitous empirical regularity found in many systems, thought to result from proportional growth. Here, we establish empirically the usually assumed ingredients of stochastic growth models that have been previously conjectured to be at the origin of Zipf's law. We use exceptionally detailed data on the evolution of open source software projects in Linux distributions, which offer a remarkable example of a growing complex self-organizing adaptive system, exhibiting Zipf's law over four full decades.
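
    For reference, Zipf's law in this setting is the statement that package sizes have a power-law tail with exponent close to one, the signature expected from proportional (Gibrat-type) growth; in the usual textbook form (not the paper's notation):

        P(S > s) \;\propto\; s^{-\mu}, \qquad \mu \simeq 1
        \quad\Longleftrightarrow\quad s_{r} \;\propto\; r^{-1} \;\;\text{(rank-size form)}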

  19. Numerical simulation of distributed parameter processes

    CERN Document Server

    Colosi, Tiberiu; Unguresan, Mihaela-Ligia; Muresan, Vlad

    2013-01-01

    The present monograph defines, interprets and uses the matrix of partial derivatives of the state vector with applications to the study of some common categories of engineering. The book covers broad categories of processes that are formed by systems of partial differential equations (PDEs), including systems of ordinary differential equations (ODEs). The work includes numerous applications specific to Systems Theory based on Mpdx, such as parallel, serial as well as feed-back connections for the processes defined by PDEs. For similar, more complex processes based on Mpdx with PDEs and ODEs as components, we have developed control schemes with PID effects for the propagation phenomena, in continuous media (spaces) or discontinuous ones (chemistry, power system, thermo-energetic) or in electro-mechanics (railway – traction) and so on. The monograph has a purely engineering focus and is intended for a target audience working in extremely diverse fields of application (propagation phenomena, diffusion, hydrodyn...

  20. Personalized implant for high tibial opening wedge: combination of solid freeform fabrication with combustion synthesis process.

    Science.gov (United States)

    Zhim, Fouad; Ayers, Reed A; Moore, John J; Moufarrège, Richard; Yahia, L'Hocine

    2012-09-01

    In this work a new generation of personalized bioceramic implants was developed. This technique combines the processes of solid freeform fabrication (SFF) and combustion synthesis (CS) to create personalized bioceramic implants with tricalcium phosphate (TCP) and hydroxyapatite (HA). These porous bioceramics will be used to fill the tibial bone gap created by the opening wedge high tibial osteotomy (OWHTO). A freeform fabrication with three-dimensional printing (3DP) technique was used to fabricate a metallic mold with the same shape required to fill the gap in the opening wedge osteotomy. The mold was subsequently used in a CS process to fabricate the personalized ceramic implants with TCP and HA compositions. The mold geometry was designed on commercial 3D CAD software. The final personalized bioceramic implant was produced using a CS process. This technique was chosen because it exploits the exothermic reaction between P₂O₅ and CaO. Also, the chemical composition and the distribution of pores in the implant could be controlled. To determine the chemical composition, the microstructure, and the mechanical properties of the implant, cylindrical shapes were also fabricated using different fabrication parameters. Chemical composition was determined by X-ray diffraction. Pore size and pore interconnectivity were measured and analyzed using an electron microscope system. Mechanical properties were determined by a mechanical testing system. The porous TCP and HA obtained have an open porous structure with an average 400 µm channel size. The mechanical behavior shows great stiffness and higher load to failure for both ceramics. Finally, this personalized ceramic implant facilitated the regeneration of new bone in the gap created by OWHTO and provides additional strength to allow accelerated rehabilitation.

  1. A Distributed OpenCL Framework using Redundant Computation and Data Replication

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Junghyun [Seoul National University, Korea]; Gangwon, Jo [Seoul National University, Korea]; Jaehoon, Jung [Seoul National University, Korea]; Lee, Jaejin [Seoul National University, Korea]

    2016-01-01

    Applications written solely in OpenCL or CUDA cannot execute on a cluster as a whole. Most previous approaches that extend these programming models to clusters are based on a common idea: designating a centralized host node and coordinating the other nodes with the host for computation. However, the centralized host node is a serious performance bottleneck when the number of nodes is large. In this paper, we propose a scalable and distributed OpenCL framework called SnuCL-D for large-scale clusters. SnuCL-D's remote device virtualization provides an OpenCL application with an illusion that all compute devices in a cluster are confined in a single node. To reduce the amount of control-message and data communication between nodes, SnuCL-D replicates the OpenCL host program execution and data in each node. We also propose a new OpenCL host API function and a queueing optimization technique that significantly reduce the overhead incurred by the previous centralized approaches. To show the effectiveness of SnuCL-D, we evaluate SnuCL-D with a microbenchmark and eleven benchmark applications on a large-scale CPU cluster and a medium-scale GPU cluster.
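
    For context, the host-side flow that such a framework virtualizes across a cluster is the ordinary single-node OpenCL sequence (platform, device, context, queue, buffers, kernel launch). The sketch below shows that baseline flow using PyOpenCL as an assumed binding; it is not SnuCL-D's API.

        import numpy as np
        import pyopencl as cl

        kernel_src = """
        __kernel void scale(__global const float *src, __global float *dst, const float factor) {
            int gid = get_global_id(0);
            dst[gid] = src[gid] * factor;
        }
        """

        host_in = np.arange(1024, dtype=np.float32)
        host_out = np.empty_like(host_in)

        platform = cl.get_platforms()[0]            # first platform on this node
        device = platform.get_devices()[0]          # first compute device (CPU or GPU)
        ctx = cl.Context(devices=[device])
        queue = cl.CommandQueue(ctx)

        mf = cl.mem_flags
        buf_in = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=host_in)
        buf_out = cl.Buffer(ctx, mf.WRITE_ONLY, host_out.nbytes)

        prg = cl.Program(ctx, kernel_src).build()
        prg.scale(queue, host_in.shape, None, buf_in, buf_out, np.float32(2.0))
        cl.enqueue_copy(queue, host_out, buf_out)
        queue.finish()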

  2. Power system operations: State estimation distributed processing

    Science.gov (United States)

    Ebrahimian, Mohammad Reza

    We present an application of a robust and fast parallel algorithm to power system state estimation with a minimal amount of modification to existing state estimators presently in place, using the Auxiliary Problem Principle. We demonstrate its effectiveness on IEEE test systems, the Electric Reliability Council of Texas (ERCOT), and the Southwest Power Pool (SPP) systems. Since the state estimation formulation may lead to an ill-conditioned system, we provide analytical explanations of the effects of mixtures of measurements on the condition of the state estimation information matrix. We demonstrate the closeness of the analytical equations to the condition of several test case systems including the IEEE RTS-96 and IEEE 118 bus systems. The research on the condition of the state estimation problem covers the centralized as well as distributed state estimation.
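
    As background, the centralized problem being distributed here is the standard weighted least-squares (WLS) state estimation; in textbook notation (not the paper's), with measurement vector z, measurement function h and error covariance R:

        \hat{x} \;=\; \arg\min_{x}\; J(x), \qquad
        J(x) \;=\; \big[z - h(x)\big]^{\top} R^{-1} \big[z - h(x)\big]
        % Gauss-Newton iterations use the Jacobian H = dh/dx and the gain (information) matrix G:
        G(x^{k})\,\Delta x^{k} \;=\; H^{\top}(x^{k})\, R^{-1} \big[z - h(x^{k})\big], \qquad
        G \;=\; H^{\top} R^{-1} H

    The conditioning of this gain (information) matrix G is the quantity whose dependence on the mix of measurements the abstract analyzes.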

  3. Stepwise Distributed Open Innovation Contests for Software Development: Acceleration of Genome-Wide Association Analysis.

    Science.gov (United States)

    Hill, Andrew; Loh, Po-Ru; Bharadwaj, Ragu B; Pons, Pascal; Shang, Jingbo; Guinan, Eva; Lakhani, Karim; Kilty, Iain; Jelinsky, Scott A

    2017-05-01

    The association of differing genotypes with disease-related phenotypic traits offers great potential both to help identify new therapeutic targets and to support stratification of patients who would gain the greatest benefit from specific drug classes. Development of low-cost genotyping and sequencing has made collecting large-scale genotyping data routine in population and therapeutic intervention studies. In addition, a range of new technologies is being used to capture numerous new and complex phenotypic descriptors. As a result, genotype and phenotype datasets have grown exponentially. Genome-wide association studies associate genotypes and phenotypes using methods such as logistic regression. As existing tools for association analysis limit the efficiency by which value can be extracted from increasing volumes of data, there is a pressing need for new software tools that can accelerate association analyses on large genotype-phenotype datasets. Using open innovation (OI) and contest-based crowdsourcing, the logistic regression analysis in a leading, community-standard genetics software package (PLINK 1.07) was substantially accelerated. A combination of computational, numeric, and algorithmic approaches was identified that accelerated the logistic regression in PLINK 1.07 by 18- to 45-fold. Combining contest-derived logistic regression code with coarse-grained parallelization, multithreading, and associated changes to data initialization code further developed through distributed innovation, we achieved an end-to-end speedup of 591-fold for a data set size of 6678 subjects by 645 863 variants, compared to PLINK 1.07's logistic regression. This represents a reduction in run time from 4.8 hours to 29 seconds. Accelerated logistic regression code developed in this project has been incorporated into the PLINK2 project. Using iterative competition-based OI, we have developed a new, faster implementation of logistic regression for genome-wide association
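
    The per-variant computation being accelerated is the classical logistic-regression association test; the sketch below is a self-contained numpy illustration of that core fit (Newton-Raphson / IRLS on simulated data), not PLINK's optimized code.

        import numpy as np

        def logistic_fit(genotype, phenotype, n_iter=25):
            """Fit logit P(case) = b0 + b1*genotype for one variant; genotype in {0,1,2}, phenotype in {0,1}."""
            X = np.column_stack([np.ones_like(genotype, dtype=float), genotype.astype(float)])
            y = phenotype.astype(float)
            beta = np.zeros(X.shape[1])
            for _ in range(n_iter):
                p = 1.0 / (1.0 + np.exp(-X @ beta))       # predicted case probabilities
                W = p * (1.0 - p)                         # IRLS weights
                grad = X.T @ (y - p)
                hess = X.T @ (X * W[:, None])
                beta += np.linalg.solve(hess, grad)       # Newton step
            return beta                                   # beta[1] is the per-allele log-odds

        rng = np.random.default_rng(0)
        g = rng.integers(0, 3, size=5000)                     # simulated allele counts for one variant
        logit = -1.0 + 0.3 * g
        y = rng.random(5000) < 1.0 / (1.0 + np.exp(-logit))   # simulated case/control labels
        print(logistic_fit(g, y.astype(int)))                 # roughly [-1.0, 0.3]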

  4. Distribution of radionuclides in bayer process

    Energy Technology Data Exchange (ETDEWEB)

    Cuccia, Valeria; Oliveira, Arno H. de [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Dept. de Engenharia Nuclear; Rocha, Zildete [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil). Servico de Quimica e Radioquimica], E-mail: rochaz@cdtn.br

    2007-07-01

    Naturally occurring radionuclides are present in many natural resources. Human activities may enhance concentrations of radionuclides and/or enhance the potential of exposure to naturally occurring radioactive material (NORM). Industrial residues containing radionuclides have been receiving considerable global attention because of the large amounts of NORM-containing wastes and the potential long-term risks of long-lived radionuclides. Within this global concern, this work focuses on the characterization of radioactivity in bauxite and in samples of intermediate phases of the Bayer process for alumina production, including the end product - alumina - and its main residue - red mud. The analytical techniques used were gamma spectrometry (HPGe detector) and Neutron Activation Analysis. It was found that the bauxite is the major contributor to radioactivity in the Bayer process. It has activities of 37 ± 12 Bq/kg for ²³⁸U and 154 ± 16 Bq/kg for ²³²Th. The intermediate phases and the end product do not carry significant activity, a desirable characteristic from the health physics point of view. Sand and red mud carry most of the radionuclides, and the concentrations are higher in the red mud than in the sand. Thus, these solid residues present enhanced activity concentrations when compared to bauxite. (author)

  5. Current Fluctuations in the One-Dimensional Symmetric Exclusion Process with Open Boundaries

    Science.gov (United States)

    Derrida, B.; Douçot, B.; Roche, P.-E.

    2004-05-01

    We calculate the first four cumulants of the integrated current of the one dimensional symmetric simple exclusion process of $N$ sites with open boundary conditions. For large system size $N$, the generating function of the integrated current depends on the densities $\rho_a$ and $\rho_b$ of the two reservoirs and on the fugacity $z$, the parameter conjugated to the integrated current, through a single parameter. Based on our expressions for these first four cumulants, we make a conjecture which leads to a prediction for all the higher cumulants. In the case $\rho_a=1$ and $\rho_b=0$, our conjecture gives the same universal distribution as the one obtained by Lee, Levitov and Yakovets for one dimensional quantum conductors in the metallic regime.

  6. Co-occurrence of Photochemical and Microbiological Transformation Processes in Open-Water Unit Process Wetlands.

    Science.gov (United States)

    Prasse, Carsten; Wenk, Jannis; Jasper, Justin T; Ternes, Thomas A; Sedlak, David L

    2015-12-15

    The fate of anthropogenic trace organic contaminants in surface waters can be complex due to the occurrence of multiple parallel and consecutive transformation processes. In this study, the removal of five antiviral drugs (abacavir, acyclovir, emtricitabine, lamivudine and zidovudine) via both bio- and phototransformation processes, was investigated in laboratory microcosm experiments simulating an open-water unit process wetland receiving municipal wastewater effluent. Phototransformation was the main removal mechanism for abacavir, zidovudine, and emtricitabine, with half-lives (t1/2,photo) in wetland water of 1.6, 7.6, and 25 h, respectively. In contrast, removal of acyclovir and lamivudine was mainly attributable to slower microbial processes (t1/2,bio = 74 and 120 h, respectively). Identification of transformation products revealed that bio- and phototransformation reactions took place at different moieties. For abacavir and zidovudine, rapid transformation was attributable to high reactivity of the cyclopropylamine and azido moieties, respectively. Despite substantial differences in kinetics of different antiviral drugs, biotransformation reactions mainly involved oxidation of hydroxyl groups to the corresponding carboxylic acids. Phototransformation rates of parent antiviral drugs and their biotransformation products were similar, indicating that prior exposure to microorganisms (e.g., in a wastewater treatment plant or a vegetated wetland) would not affect the rate of transformation of the part of the molecule susceptible to phototransformation. However, phototransformation strongly affected the rates of biotransformation of the hydroxyl groups, which in some cases resulted in greater persistence of phototransformation products.

  7. Control-based Scheduling in a Distributed Stream Processing System

    OpenAIRE

    2006-01-01

    Stream processing systems receive continuous streams of messages with raw information and produce streams of messages with processed information. The utility of a stream-processing system depends, in part, on the accuracy and timeliness of the output. Streams in complex event processing systems are processed on distributed systems; several steps are taken on different processors to process each incoming message, and messages may be enqueued between steps. This paper de...

  8. Documenting open source migration processes for re-use

    CSIR Research Space (South Africa)

    Gerber, A

    2010-10-01

    Full Text Available It became apparent that there are very limited resources available, locally or internationally, that document process-related information about organizational OS migrations. This lack of information provides the motivation for this research...

  9. Velocity distribution measurements in a fishway like open channel by Laser Doppler Anemometry (LDA)

    Science.gov (United States)

    Sayeed-Bin-Asad, S. M.; Lundström, T. S.; Andersson, A. G.; Hellström, J. G. I.

    2016-03-01

    Experiments have been performed in an open channel flume with a vertical half-cylinder barrier in order to investigate how the upstream velocity profiles are affected by the barrier. An experimental technique using Laser Doppler Velocimetry (LDV) was adopted to measure these velocity distributions in the channel for four different discharge rates. Velocity profiles were measured very close to the wall and at 25, 50 and 100 mm upstream of the cylinder wall. For comparison with the well-known logarithmic velocity profile, velocity profiles were also measured in smooth open channel flow for the same four discharge rates. The results indicate that the logarithmic velocity profile is recovered at 100 mm upstream of the cylinder wall.
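
    The "well-known logarithmic velocity profile" used as the reference is the law of the wall for smooth boundaries; in its usual textbook form (quoted for orientation, not fitted to the authors' data):

        \frac{u}{u_{*}} \;=\; \frac{1}{\kappa}\,\ln\!\left(\frac{y\,u_{*}}{\nu}\right) + B,
        \qquad \kappa \approx 0.41, \;\; B \approx 5.0
        % u_* friction velocity, y distance from the bed, \nu kinematic viscosity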

  10. Velocity distribution measurements in a fishway like open channel by Laser Doppler Anemometry (LDA

    Directory of Open Access Journals (Sweden)

    Sayeed-Bin-Asad S.M.

    2016-01-01

    Full Text Available Experiments have been performed in an open channel flume with a vertical half-cylinder barrier in order to investigate how the upstream velocity profiles are affected by the barrier. An experimental technique using Laser Doppler Velocimetry (LDV) was adopted to measure these velocity distributions in the channel for four different discharge rates. Velocity profiles were measured very close to the wall and at 25, 50 and 100 mm upstream of the cylinder wall. For comparison with the well-known logarithmic velocity profile, velocity profiles were also measured in smooth open channel flow for the same four discharge rates. The results indicate that the logarithmic velocity profile is recovered at 100 mm upstream of the cylinder wall.

  11. Preparing for open access : distribution rate order application to the Ontario Energy Board 1999-2000

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-12-07

    The Ontario Hydro Services Company (OHSC) Inc. filed an application with the Ontario Energy Board (OEB) requesting the grant of an order approving the revenue requirements for the Company's distribution business, including those for distribution in remote communities, for the years 1999 and 2000, up until the point of open access. The revenue requirement for 1999 is $701 million; for the year 2000 it is $640 million. OHSC is a successor company to Ontario Hydro and it will become operational in its new incarnation on April 1, 1999. This marks the beginning of regulation of OHSC's distribution business by the OEB, following the restructuring of the electricity industry in Ontario. Restructuring ended the monopoly position of Ontario Hydro and introduced competition to the generation and retailing sectors, and regulation to the transmission and distribution sectors of the industry. The document sets out the circumstances leading up to the restructuring of the industry, the unbundling of Ontario Hydro into separate generation, transmission and distribution companies, outlines the new regulatory framework and provides the justification for the revenue requirements.

  12. OpenGeoSys: an open-source initiative for numerical simulation of thermo-hydro-mechanical/chemical (THM/C) processes in porous media

    Science.gov (United States)

    Kolditz, O.

    2013-12-01

    In this paper we describe the OpenGeoSys (OGS) project, which is a scientific open-source initiative for numerical simulation of thermo-hydro-mechanical/chemical processes in porous media. The basic concept is to provide a flexible numerical framework (using primarily the Finite Element Method (FEM)) for solving multifield problems in porous and fractured media for applications in geoscience and hydrology. To this purpose OGS is based on an object-oriented FEM concept including a broad spectrum of interfaces for pre- and postprocessing. The OGS idea has been in development since the mid-eighties; meanwhile we are working on its 6th version. We provide a short historical note about the continuous process of concept and software development having evolved through Fortran, C, and C++ implementations. The idea behind OGS is to provide an open platform to the community, outfitted with professional software-engineering tools such as platform-independent compiling and automated benchmarking. A comprehensive benchmarking book has been prepared for publication. Benchmarking has been proven to be a valuable tool for cooperation between different developer teams, for example, for code comparison and validation purposes (DECOVALEX, CO2BENCH and SSBENCH projects). On the one hand, object-orientation (OO) provides a suitable framework for distributed code development; however, the parallelization of OO codes still lacks efficiency. High-performance-computing efficiency of OO codes is subject to future research (accompanying poster).

  13. Experimental open-air quantum key distribution with a single-photon source

    Energy Technology Data Exchange (ETDEWEB)

    Alleaume, R [Laboratoire de Photonique Quantique et Moleculaire, UMR 8537 du CNRS, ENS Cachan, 61 avenue du President Wilson, 94235 Cachan Cedex (France); Treussart, F [Laboratoire de Photonique Quantique et Moleculaire, UMR 8537 du CNRS, ENS Cachan, 61 avenue du President Wilson, 94235 Cachan Cedex (France); Messin, G [Laboratoire Charles Fabry de l' Institut d' Optique, UMR 8501 du CNRS, F-91403 Orsay (France); Dumeige, Y [Laboratoire de Photonique Quantique et Moleculaire, UMR 8537 du CNRS, ENS Cachan, 61 avenue du President Wilson, 94235 Cachan Cedex (France); Roch, J-F [Laboratoire de Photonique Quantique et Moleculaire, UMR 8537 du CNRS, ENS Cachan, 61 avenue du President Wilson, 94235 Cachan Cedex (France); Beveratos, A [Laboratoire Charles Fabry de l' Institut d' Optique, UMR 8501 du CNRS, F-91403 Orsay (France); Brouri-Tualle, R [Laboratoire Charles Fabry de l' Institut d' Optique, UMR 8501 du CNRS, F-91403 Orsay (France); Poizat, J-P [Laboratoire Charles Fabry de l' Institut d' Optique, UMR 8501 du CNRS, F-91403 Orsay (France); Grangier, P [Laboratoire Charles Fabry de l' Institut d' Optique, UMR 8501 du CNRS, F-91403 Orsay (France)

    2004-07-01

    We describe the implementation of a quantum key distribution (QKD) system using a single-photon source, operating at night in open air. The single-photon source at the heart of the functional and reliable set-up relies on the pulsed excitation of a single nitrogen-vacancy colour centre in a diamond nanocrystal. We tested the effect of attenuation on the polarized encoded photons for inferring the longer distance performance of our system. For strong attenuation, the use of pure single-photon states gives measurable advantage over systems relying on weak attenuated laser pulses. The results are in good agreement with theoretical models developed to assess QKD security.

  14. Experimental open air quantum key distribution with a single photon source

    CERN Document Server

    Alleaume, Romain; Beveratos, Alexios; Brouri-Tualle, Rosa; Dumeige, Yannick; Messin, Gaetan; Poizat, Jean-Philippe; Grangier, Philippe; Roch, Jean-Francois; Treussart, Francois

    2004-01-01

    We present a full implementation of a quantum key distribution (QKD) system with a single photon source, operating at night in open air. The single photon source at the heart of the functional and reliable setup relies on the pulsed excitation of a single nitrogen-vacancy color center in a diamond nanocrystal. We tested the effect of attenuation on the polarized encoded photons for inferring the longer distance performance of our system. For strong attenuation, the use of pure single photon states gives a measurable advantage over systems relying on weak attenuated laser pulses. The results are in good agreement with theoretical models developed to assess QKD security.

  15. MzJava: An open source library for mass spectrometry data processing.

    Science.gov (United States)

    Horlacher, Oliver; Nikitin, Frederic; Alocci, Davide; Mariethoz, Julien; Müller, Markus; Lisacek, Frederique

    2015-11-03

    Mass spectrometry (MS) is a widely used and evolving technique for the high-throughput identification of molecules in biological samples. The need for sharing and reuse of code among bioinformaticians working with MS data prompted the design and implementation of MzJava, an open-source Java Application Programming Interface (API) for MS related data processing. MzJava provides data structures and algorithms for representing and processing mass spectra and their associated biological molecules, such as metabolites, glycans and peptides. MzJava includes functionality to perform mass calculation, peak processing (e.g. centroiding, filtering, transforming), spectrum alignment and clustering, protein digestion, fragmentation of peptides and glycans as well as scoring functions for spectrum-spectrum and peptide/glycan-spectrum matches. For data import and export MzJava implements readers and writers for commonly used data formats. For many classes, support for the Hadoop MapReduce (hadoop.apache.org) and Apache Spark (spark.apache.org) frameworks for cluster computing was implemented. The library has been developed applying best practices of software engineering. To ensure that MzJava contains code that is correct and easy to use, the library's API was carefully designed and thoroughly tested. MzJava is an open-source project distributed under the AGPL v3.0 licence. MzJava requires Java 1.7 or higher. Binaries, source code and documentation can be downloaded from http://mzjava.expasy.org and https://bitbucket.org/sib-pig/mzjava. This article is part of a Special Issue entitled: Computational Proteomics.

  16. Open Data Distribution Service (DDS) for Use in a Real Time Simulation Laboratory Environment

    Science.gov (United States)

    2012-02-29

    definitions are constrained to define only data that can be transported by the DDS service. The model will be used to generate CORBA Interface Definition... Static discovery mechanisms to support small footprint and embedded system applications. CORBA Component Model: integrate OpenDDS with the OMG CORBA Component Model (DDS4CCM) abstraction. Delay Tolerant Networking: implement an RFC 5050 DTN capability, including bundle processing and a

  17. A Distributed DB Architecture for Processing cPIR Queries

    Directory of Open Access Journals (Sweden)

    Sultan.M

    2013-06-01

    Full Text Available Information retrieval is the process of obtaining materials, usually documents, from huge volumes of unstructured data. Several protocols are available to retrieve bits of information from distributed databases. A cloud framework provides a platform for private information retrieval. In this article, we combine the artifacts of distributed systems with a cloud framework for extracting information from unstructured databases. The process involves distributing the database to a number of cooperative peers, which reduces query response time by leveraging the computational resources of the peers. A single query is subdivided into multiple queries and processed in parallel across the distributed sites. Our simulation results using CloudSim show that this distributed database architecture reduces the cost of computational private information retrieval, with reduced response time and processor load at the peer sites.

  18. OpenFIRE - A Web GIS Service for Distributing the Finnish Reflection Experiment Datasets

    Science.gov (United States)

    Väkevä, Sakari; Aalto, Aleksi; Heinonen, Aku; Heikkinen, Pekka; Korja, Annakaisa

    2017-04-01

    The Finnish Reflection Experiment (FIRE) is a land-based deep seismic reflection survey conducted between 2001 and 2003 by a research consortium of the Universities of Helsinki and Oulu, the Geological Survey of Finland, and a Russian state-owned enterprise SpetsGeofysika. The dataset consists of 2100 kilometers of high-resolution profiles across the Archaean and Proterozoic nuclei of the Fennoscandian Shield. Although FIRE data have been available on request since 2009, the data have remained underused outside the original research consortium. The original FIRE data have been quality-controlled. The shot gathers have been cross-checked and comprehensive errata has been created. The brute stacks provided by the Russian seismic contractor have been reprocessed into seismic sections and replotted. A complete documentation of the intermediate processing steps is provided together with guidelines for setting up a computing environment and plotting the data. An open access web service "OpenFIRE" for the visualization and the downloading of FIRE data has been created. The service includes a mobile-responsive map application capable of enriching seismic sections with data from other sources such as open data from the National Land Survey and the Geological Survey of Finland. The AVAA team of the Finnish Open Science and Research Initiative has provided a tailored Liferay portal with necessary web components such as an API (Application Programming Interface) for download requests. INSPIRE (Infrastructure for Spatial Information in Europe) -compliant discovery metadata have been produced and geospatial data will be exposed as Open Geospatial Consortium standard services. The technical guidelines of the European Plate Observing System have been followed and the service could be considered as a reference application for sharing reflection seismic data. The OpenFIRE web service is available at www.seismo.helsinki.fi/openfire

  19. Driven transport on open filaments with interfilament switching processes

    Science.gov (United States)

    Ghosh, Subhadip; Pagonabarraga, Ignacio; Muhuri, Sudipto

    2017-02-01

    We study a two-filament driven lattice gas model with oppositely directed species of particles moving on two parallel filaments, with filament-switching processes and particle inflow and outflow at the filament ends. The filament-switching process is correlated with the occupation number of the adjacent site, such that particles switch filaments with finite probability only when oppositely directed particles meet on the same filament. This model mimics some of the coarse-grained features observed in the context of microtubule (MT) based intracellular transport, wherein cellular cargo loaded and off-loaded at filament ends is transported on multiple parallel MT filaments and can switch between the parallel microtubule filaments. We focus on a regime where the filaments are weakly coupled, such that the filament-switching rate of particles scales inversely with the length of the filament. We find that the interplay of (off-)loading processes at the boundaries and the filament-switching process of particles leads to some distinctive features of the system. These features include the occurrence of a variety of phases in the system with inhomogeneous density profiles, including localized density shocks, density differences across the filaments, and bidirectional current flows in the system. We analyze the system by developing a mean field (MF) theory and comparing the results obtained from the MF theory with Monte Carlo (MC) simulations of the dynamics of the system. We find that the steady-state density and current profiles of particles and the phase diagram obtained within the MF picture match quite well with the MC simulation results. These findings may be useful for studying multifilament intracellular transport.

  20. Intelligent Computational Systems. Opening Remarks: CFD Application Process Workshop

    Science.gov (United States)

    VanDalsem, William R.

    1994-01-01

    This discussion will include a short review of the challenges that must be overcome if computational physics technology is to have a larger impact on the design cycles of U.S. aerospace companies. Some of the potential solutions to these challenges may come from the information sciences fields. A few examples of potential computational physics/information sciences synergy will be presented, as motivation and inspiration for the Improving The CFD Applications Process Workshop.

  1. SignalPlant: an open signal processing software platform.

    Science.gov (United States)

    Plesinger, F; Jurco, J; Halamek, J; Jurak, P

    2016-07-01

    The growing technical standard of acquisition systems allows the acquisition of large records, often reaching gigabytes or more in size, as is the case with whole-day electroencephalograph (EEG) recordings, for example. Although current 64-bit software for signal processing is able to process (e.g. filter, analyze, etc) such data, visual inspection and labeling will probably suffer from rather long latency during the rendering of large portions of recorded signals. For this reason, we have developed SignalPlant, a stand-alone application for signal inspection, labeling and processing. The main motivation was to supply investigators with a tool allowing fast and interactive work with large multichannel records produced by EEG, electrocardiograph and similar devices. The rendering latency was compared with EEGLAB and proves significantly faster when displaying an image from a large number of samples (e.g. 163 times faster for 75 × 10⁶ samples). The presented SignalPlant software is available free and does not depend on any other computation software. Furthermore, it can be extended with plugins by third parties, ensuring its adaptability to future research tasks and new data formats.

  2. Programming Social Applications Building Viral Experiences with OpenSocial, OAuth, OpenID, and Distributed Web Frameworks

    CERN Document Server

    LeBlanc, Jonathan

    2011-01-01

    Social networking has made one thing clear: websites and applications need to provide users with experiences tailored to their preferences. This in-depth guide shows you how to build rich social frameworks, using open source technologies and specifications. You'll learn how to create third-party applications for existing sites, build engaging social graphs, and develop products to host your own socialized experience. Programming Social Apps focuses on the OpenSocial platform, along with Apache Shindig, OAuth, OpenID, and other tools, demonstrating how they work together to help you solve pra

  3. Discriminating between Weibull distributions and log-normal distributions emerging in branching processes

    Science.gov (United States)

    Goh, Segun; Kwon, H. W.; Choi, M. Y.

    2014-06-01

    We consider the Yule-type multiplicative growth and division process, and describe the ubiquitous emergence of Weibull and log-normal distributions in a single framework. With the help of the integral transform and series expansion, we show that both distributions serve as asymptotic solutions of the time evolution equation for the branching process. In particular, the maximum likelihood method is employed to discriminate between the emergence of the Weibull distribution and that of the log-normal distribution. Further, the detailed conditions for the distinguished emergence of the Weibull distribution are probed. It is observed that the emergence depends on the manner of the division process for the two different types of distribution. Numerical simulations are also carried out, confirming the results obtained analytically.
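
    The two candidate families being discriminated are the standard Weibull and log-normal densities; in the usual textbook parametrization (not necessarily the paper's notation), the maximum-likelihood comparison is between

        f_{\mathrm{Weibull}}(x) \;=\; \frac{k}{\lambda}\left(\frac{x}{\lambda}\right)^{k-1}
                                       e^{-(x/\lambda)^{k}},
        \qquad
        f_{\mathrm{LN}}(x) \;=\; \frac{1}{x\,\sigma\sqrt{2\pi}}\;
                                 e^{-(\ln x-\mu)^{2}/(2\sigma^{2})}

    whichever family attains the larger maximized likelihood being selected.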

  4. Tempered stable distributions stochastic models for multiscale processes

    CERN Document Server

    Grabchak, Michael

    2015-01-01

    This brief is concerned with tempered stable distributions and their associated Levy processes. It is a good text for researchers interested in learning about tempered stable distributions. A tempered stable distribution is one which takes a stable distribution and modifies its tails to make them lighter. The motivation for this class comes from the fact that infinite variance stable distributions appear to provide a good fit to data in a variety of situations, but the extremely heavy tails of these models are not realistic for most real world applications. The idea of using distributions that modify the tails of stable models to make them lighter seems to have originated in the influential paper of Mantegna and Stanley (1994). Since then, these distributions have been extended and generalized in a variety of ways. They have been applied to a wide variety of areas including mathematical finance, biostatistics, computer science, and physics.
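
    One common way of writing the tempering, quoted here as an illustration (the classical tempered stable form; not necessarily the specific parametrization used in the book), multiplies the stable power-law Lévy density by exponential factors on each tail:

        \nu(dx) \;=\; \left( \frac{c_{-}\, e^{-\lambda_{-} |x|}}{|x|^{1+\alpha}}\, \mathbf{1}_{x<0}
                      \;+\; \frac{c_{+}\, e^{-\lambda_{+} x}}{x^{1+\alpha}}\, \mathbf{1}_{x>0} \right) dx,
        \qquad 0 < \alpha < 2, \;\; \lambda_{\pm} > 0
        % \lambda_{\pm} = 0 recovers the stable case; \lambda_{\pm} > 0 lightens the tails so all moments are finite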

  5. Learning from the History of Distributed Query Processing

    DEFF Research Database (Denmark)

    Betz, Heiko; Gropengießer, Francis; Hose, Katja

    2012-01-01

    The vision of the Semantic Web has triggered the development of various new applications and opened up new directions in research. Recently, much effort has been put into the development of techniques for query processing over Linked Data. Being based upon techniques originally developed for dist...

  6. Learning from the History of Distributed Query Processing

    DEFF Research Database (Denmark)

    Betz, Heiko; Gropengießer, Francis; Hose, Katja;

    2012-01-01

    The vision of the Semantic Web has triggered the development of various new applications and opened up new directions in research. Recently, much effort has been put into the development of techniques for query processing over Linked Data. Being based upon techniques originally developed for dist...

  7. The collaboration between and contribution of a digital open innovation platform to a local design process

    DEFF Research Database (Denmark)

    del Castillo, Jacqueline; Bhatti, Yasser; Hossain, Mokter

    2017-01-01

    We examine the potential of an open innovation digital platform to expose a local innovation process to a greater number of ideas and a more inclusive set of stakeholders. To do so, we studied an online innovation challenge on the OpenIDEO platform to reimagine the end-of-life experience, sponsored by Sutte...

  8. Distributed System of Processing of Data of Physical Experiments

    Science.gov (United States)

    Nazarov, A. A.; Moiseev, A. N.

    2014-11-01

    The growing complexity of physical experiments and increasing volumes of experimental data necessitate the use of supercomputers and distributed computing systems for data processing. The design and development of such systems, their mathematical modeling, and the investigation of their characteristics and functional capabilities is an urgent scientific and practical problem. In the present work, the operating characteristics of such a distributed system for processing physical experiment data are investigated using the apparatus of queueing network theory.

  9. Developing an Open Source, Reusable Platform for Distributed Collaborative Information Management in the Early Detection Research Network

    Science.gov (United States)

    Hart, Andrew F.; Verma, Rishi; Mattmann, Chris A.; Crichton, Daniel J.; Kelly, Sean; Kincaid, Heather; Hughes, Steven; Ramirez, Paul; Goodale, Cameron; Anton, Kristen

    2012-01-01

    For the past decade, the NASA Jet Propulsion Laboratory, in collaboration with Dartmouth University has served as the center for informatics for the Early Detection Research Network (EDRN). The EDRN is a multi-institution research effort funded by the U.S. National Cancer Institute (NCI) and tasked with identifying and validating biomarkers for the early detection of cancer. As the distributed network has grown, increasingly formal processes have been developed for the acquisition, curation, storage, and dissemination of heterogeneous research information assets, and an informatics infrastructure has emerged. In this paper we discuss the evolution of EDRN informatics, its success as a mechanism for distributed information integration, and the potential sustainability and reuse benefits of emerging efforts to make the platform components themselves open source. We describe our experience transitioning a large closed-source software system to a community driven, open source project at the Apache Software Foundation, and point to lessons learned that will guide our present efforts to promote the reuse of the EDRN informatics infrastructure by a broader community.

  10. Cataloguing in the open - the disintegration and distribution of the record

    Directory of Open Access Journals (Sweden)

    Martin Malmsten

    2013-01-01

    Full Text Available As part of a strategic investment in openness the Swedish National Library has released the National Bibliography and the accompanying authority file as open data with a Creative Commons Zero license, effectively putting them in the public domain. The data has been available as linked open data since 2008 but is now also released in its original, complete form making it fit for re-use by other library systems. An important principle of linked data is to link out to other datasets. However, as data becomes more interconnected and distributed, the need for ways to track and respond to changes in other datasets, even ones outside our area of control, becomes greater. The issue of who to trust of course becomes vitally important. This paper details the motivation behind the release as well as the technology used to support it. Also, a consequence of exposing and using linked data is that the idea of the record as a self-contained and delimited entity starts to fall apart.

  11. An Open Platform for Processing IFC Model Versions

    Institute of Scientific and Technical Information of China (English)

    Mohamed Nour; Karl Beucke

    2008-01-01

    The IFC initiative from the International Alliance for Interoperability has been developing since the mid-nineties through several versions. This paper addresses the problem of binding the growing number of IFC versions and their EXPRESS definitions to programming environments (Java and .NET). The solution developed in this paper automates the process of generating early binding classes whenever a new version of the IFC model is released. Furthermore, a runtime instantiation of the generated early binding classes takes place by importing IFC-STEP ISO 10303-P21 models. The user can navigate the IFC STEP model with reference to the defining EXPRESS schema, and modify, delete, and create new instances. These functionalities are considered to be a basis for any IFC-based implementation. This enables researchers to experiment with the IFC model independently of any software application.

  12. Open Markov processes: A compositional perspective on non-equilibrium steady states in biology

    CERN Document Server

    Pollard, Blake S

    2016-01-01

    In recent work, Baez, Fong and the author introduced a framework for describing Markov processes equipped with a detailed balanced equilibrium as open systems of a certain type. These `open Markov processes' serve as the building blocks for more complicated processes. In this paper, we describe the potential application of this framework in the modeling of biological systems as open systems maintained away from equilibrium. We show that non-equilibrium steady states emerge in open systems of this type, even when the rates of the underlying process are such that a detailed balanced equilibrium is permitted. It is shown that these non-equilibrium steady states minimize a quadratic form which we call `dissipation.' In some circumstances, the dissipation is approximately equal to the rate of change of relative entropy plus a correction term. On the other hand, Prigogine's principle of minimum entropy production generally fails for non-equilibrium steady states. We use a simple model of membrane transport to illus...

  13. Characteristics of the Audit Processes for Distributed Informatics Systems

    Directory of Open Access Journals (Sweden)

    Marius POPA

    2009-01-01

    Full Text Available The paper addresses the following issues: the main characteristics of distributed informatics systems, examples of such systems and the main categories of differences among them; the concepts, principles, techniques and fields for auditing distributed informatics systems; and the concept and classes of the term "standard", its characteristics, and examples of standards, guidelines, procedures and controls for auditing distributed informatics systems. Distributed informatics systems are characterized by the following aspects: development process, resources, implemented functionalities, architectures, system classes and particularities. The audit framework has two sides: the audit process and the auditors. The audit process must be conducted in accordance with the standard specifications in the IT&C field. The auditors must observe ethical principles and must have a high level of professional skill and competence in the IT&C field.

  14. Mistaking geography for biology: inferring processes from species distributions.

    Science.gov (United States)

    Warren, Dan L; Cardillo, Marcel; Rosauer, Dan F; Bolnick, Daniel I

    2014-10-01

    Over the past few decades, there has been a rapid proliferation of statistical methods that infer evolutionary and ecological processes from data on species distributions. These methods have led to considerable new insights, but they often fail to account for the effects of historical biogeography on present-day species distributions. Because the geography of speciation can lead to patterns of spatial and temporal autocorrelation in the distributions of species within a clade, this can result in misleading inferences about the importance of deterministic processes in generating spatial patterns of biodiversity. In this opinion article, we discuss ways in which patterns of species distributions driven by historical biogeography are often interpreted as evidence of particular evolutionary or ecological processes. We focus on three areas that are especially prone to such misinterpretations: community phylogenetics, environmental niche modelling, and analyses of beta diversity (compositional turnover of biodiversity).

  15. The mass-ratio and eccentricity distributions of red giants in open clusters, barium and S stars

    CERN Document Server

    Van der Swaelmen, Mathieu; Jorissen, Alain; Van Eck, Sophie

    2016-01-01

    In order to identify diagnostics distinguishing between pre- and post-mass-transfer systems, the mass-ratio distribution and period - eccentricity (P - e) diagram of barium and S stars are compared to those of the sample of binary red giants in open clusters from Mermilliod et al. (2007). From the analysis of the mass-ratio distribution for the cluster binary giants, we find an excess of systems with companion masses between 0.58 and 0.87 Msun, typical for white dwarfs. They represent 22% of the sample, which are thus candidate post-mass-transfer systems. Among these candidates which occupy the same locus as the barium and S stars in the (P-e) diagram, only 33% (= 4/12) show a chemical signature of mass transfer in the form of s-process overabundances (from rather moderate -- about 0.3 dex -- to more extreme -- about 1 dex). These s-process-enriched cluster stars show a clear tendency to be in the clusters with the lowest metallicity in the sample, confirming the classical prediction that the s-process nucleo...

  16. Comparison of parallelizations of the algorithms for image processing in OpenCL

    OpenAIRE

    Kenda, Bojana

    2015-01-01

    The thesis explores the effects of running an image processing program with the OpenCL framework at different settings. Three methods, common in image processing, have been selected from the programming library GEGL and their execution times compared when using images of different sizes as well as on different hardware. In addition, one of the methods has been modified to use local memory on an OpenCL device, allowing us to identify the potential advantages and disadvantages of such an implem...

  17. A comprehensive open package format for preservation and distribution of geospatial data and metadata

    Science.gov (United States)

    Pons, X.; Masó, J.

    2016-12-01

    The complexities of geospatial resources and formats make preservation and distribution of GIS data difficult even among experts. The proliferation of, for instance, KML, Internet map services, etc., reflects the need for sharing geodata, but a comprehensive solution for dealing with data and metadata of a certain complexity is not currently provided. Original geospatial data is usually divided into several parts to record its different aspects (spatial and thematic features, etc.), plus additional files containing metadata, symbolization specifications, tables, etc.; these parts are encoded in different formats, both standard and proprietary. To simplify data access, software providers encourage the use of an additional element that we call generically "map project", and this contains links to other parts (local or remote). Consequently, in order to distribute the data and metadata referenced by the map in a complete way, or to apply the Open Archival Information System (OAIS) standard to preserve it for the future, we need to face the multipart problem. This paper proposes a package allowing the distribution of real (comprehensive although diverse and complex) GIS data over the Internet and for data preservation. This proposal, complemented with the right tools, hides but keeps the multipart structure, so providing a simpler but professional user experience. Several packaging strategies are reviewed in the paper, and a solution based on the ISO 29500-2 standard is chosen. The solution also considers the adoption of the recent Open Geospatial Consortium Web Services common standard (OGC OWS) context document as a map part, and as a way of also combining data files with geospatial services. Finally, and by using adequate strategies, different GIS implementations can use several parts of the package and ignore the rest: a philosophy that has proven useful (e.g. in TIFF).

  18. Wigner Ville Distribution in Signal Processing, using Scilab Environment

    Directory of Open Access Journals (Sweden)

    Petru Chioncel

    2011-01-01

    Full Text Available The Wigner Ville distribution offers a visual display of quantitative information about the way a signal’s energy is distributed in both time and frequency. Through that, this distribution embodies the fundamental concepts of Fourier and time-domain analysis. The energy of the signal is distributed so that specific frequencies are localized in time by the group delay time, and at specific instants in time the frequency is given by the instantaneous frequency. The net positive volume of the Wigner distribution is numerically equal to the signal’s total energy. The paper shows the application of the Wigner Ville distribution, in the field of signal processing, using the Scilab environment.
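
    For readers who want to reproduce the idea outside Scilab, the sketch below computes one common discretization of the Wigner-Ville distribution in Python/NumPy (an FFT over the instantaneous autocorrelation at each time sample); the chirp test signal and normalization details are illustrative assumptions.

```python
import numpy as np

def wigner_ville(x):
    """Discrete Wigner-Ville distribution of a complex (analytic) signal."""
    x = np.asarray(x, dtype=complex)
    n_samples = len(x)
    wvd = np.zeros((n_samples, n_samples))
    for n in range(n_samples):
        # Largest lag keeping both n + tau and n - tau inside the record.
        tau_max = min(n, n_samples - 1 - n)
        taus = np.arange(-tau_max, tau_max + 1)
        acf = np.zeros(n_samples, dtype=complex)
        acf[taus % n_samples] = x[n + taus] * np.conj(x[n - taus])
        wvd[n] = np.fft.fft(acf).real  # FFT over the lag variable gives the frequency axis
    return wvd

# Linear chirp as a test signal: energy should concentrate along a rising line.
t = np.arange(256)
chirp = np.exp(1j * 2 * np.pi * (0.05 * t + 0.0005 * t ** 2))
tfr = wigner_ville(chirp)
print(tfr.shape, tfr.sum())  # the summed "volume" relates to the signal energy
```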

  19. Watershed Modeling Applications with the Open-Access Modular Distributed Watershed Educational Toolbox (MOD-WET) and Introductory Hydrology Textbook

    Science.gov (United States)

    Huning, L. S.; Margulis, S. A.

    2014-12-01

    Traditionally, introductory hydrology courses focus on hydrologic processes as independent or semi-independent concepts that are ultimately integrated into a watershed model near the end of the term. When an "off-the-shelf" watershed model is introduced in the curriculum, this approach can result in a potential disconnect between process-based hydrology and the inherent interconnectivity of processes within the water cycle. In order to curb this and reduce the learning curve associated with applying hydrologic concepts to complex real-world problems, we developed the open-access Modular Distributed Watershed Educational Toolbox (MOD-WET). The user-friendly, MATLAB-based toolbox contains the same physical equations for hydrological processes (i.e. precipitation, snow, radiation, evaporation, unsaturated flow, infiltration, groundwater, and runoff) that are presented in the companion e-textbook (http://aqua.seas.ucla.edu/margulis_intro_to_hydro_textbook.html) and taught in the classroom. The modular toolbox functions can be used by students to study individual hydrologic processes. These functions are integrated together to form a simple spatially-distributed watershed model, which reinforces a holistic understanding of how hydrologic processes are interconnected and modeled. Therefore when watershed modeling is introduced, students are already familiar with the fundamental building blocks that have been unified in the MOD-WET model. Extensive effort has been placed on the development of a highly modular and well-documented code that can be run on a personal computer within the commonly-used MATLAB environment. MOD-WET was designed to: 1) increase the qualitative and quantitative understanding of hydrological processes at the basin-scale and demonstrate how they vary with watershed properties, 2) emphasize applications of hydrologic concepts rather than computer programming, 3) elucidate the underlying physical processes that can often be obscured with a complicated
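
    MOD-WET itself is a MATLAB toolbox; purely as an illustration of the modular design philosophy described above (separate process functions composed into one water-balance driver), here is a minimal single-bucket sketch in Python with made-up parameter values.

```python
import numpy as np

def evaporation(storage, pet):
    """Evaporate at the potential rate, limited by available storage."""
    return min(pet, storage)

def runoff(storage, capacity):
    """Saturation-excess runoff: spill whatever exceeds the bucket capacity."""
    return max(0.0, storage - capacity)

def step(storage, precip, pet, capacity):
    """One daily time step of a single-bucket water balance."""
    storage += precip
    q = runoff(storage, capacity)
    storage -= q
    et = evaporation(storage, pet)
    storage -= et
    return storage, q, et

# Drive the composed model with synthetic forcing (illustrative values).
rng = np.random.default_rng(0)
precip = rng.exponential(2.0, 365)   # mm/day
pet = np.full(365, 3.0)              # mm/day
storage, flows = 50.0, []
for p, e in zip(precip, pet):
    storage, q, et = step(storage, p, e, capacity=100.0)
    flows.append(q)
print(f"mean runoff = {np.mean(flows):.2f} mm/day")
```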

  20. Distributed learning process: principles of design and implementation

    Directory of Open Access Journals (Sweden)

    G. N. Boychenko

    2016-01-01

    Full Text Available At the present stage, broad information and communication technologies (ICT) usage in educational practices is one of the leading trends of global education system development. This trend has led to the transformation of instructional interaction models. Scientists have developed the theory of distributed cognition (Salomon, G., Hutchins, E.) and of distributed education and training (Fiore, S. M., Salas, E., Oblinger, D. G., Barone, C. A., Hawkins, B. L.). The educational process is based on two sub-processes, learning and teaching, separated in time and space and aimed at the organization of flexible interactions between learners, teachers and educational content located in different non-centralized places. The purpose of this design research is to find a solution for the problem of formalizing distributed learning process design and realization that is significant in instructional design. The solution to this problem should take into account the specifics of distributed interactions between team members, which become the collective subject of distributed cognition in the distributed learning process. This makes it necessary to design roles and functions of the individual team members performing distributed educational activities. Personal educational objectives should be determined by decomposition of team objectives into functional roles of its members, considering the personal and learning needs and interests of students. Theoretical and empirical methods used in the study: theoretical analysis of philosophical, psychological, and pedagogical literature on the issue; analysis of international standards in the e-learning domain; exploration of practical usage of distributed learning in academic and corporate sectors; generalization, abstraction, cognitive modelling, and ontology engineering methods. The result of the research is a methodology for design and implementation of a distributed learning process based on the competency approach. Methodology proposed by

  1. Parallel and distributed processing in power system simulation and control

    Energy Technology Data Exchange (ETDEWEB)

    Falcao, Djalma M. [Universidade Federal, Rio de Janeiro, RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia

    1994-12-31

    Recent advances in computer technology will certainly have a great impact on the methodologies used in power system expansion and operational planning as well as in real-time control. Parallel and distributed processing are among the new technologies that present great potential for application in these areas. Parallel computers use multiple functional or processing units to speed up computation, while distributed processing computer systems are collections of computers joined together by high-speed communication networks, offering many objectives and advantages. The paper presents some ideas for the use of parallel and distributed processing in power system simulation and control. It also comments on some of the current research work in these topics and presents a summary of the work presently being developed at COPPE. (author) 53 refs., 2 figs.

  2. OASIS: a data and software distribution service for Open Science Grid

    Science.gov (United States)

    Bockelman, B.; Caballero Bejar, J.; De Stefano, J.; Hover, J.; Quick, R.; Teige, S.

    2014-06-01

    The Open Science Grid encourages the concept of software portability: a user's scientific application should be able to run at as many sites as possible. It is necessary to provide a mechanism for OSG Virtual Organizations to install software at sites. Since its initial release, the OSG Compute Element has provided an application software installation directory to Virtual Organizations, where they can create their own sub-directory, install software into that sub-directory, and have the directory shared on the worker nodes at that site. The current model has shortcomings with regard to permissions, policies, versioning, and the lack of a unified, collective procedure or toolset for deploying software across all sites. Therefore, a new mechanism for distributing data and software is desirable. The architecture for the OSG Application Software Installation Service (OASIS) is a server-client model: the software and data are installed only once, in a single place, and are automatically distributed to all client sites simultaneously. Central file distribution offers other advantages, including server-side authentication and authorization, activity records, quota management, data validation and inspection, and well-defined versioning and deletion policies. The architecture, as well as a complete analysis of the current implementation, will be described in this paper.

  3. Design of a distributed CORBA based image processing server.

    Science.gov (United States)

    Giess, C; Evers, H; Heid, V; Meinzer, H P

    2000-01-01

    This paper presents the design and implementation of a distributed image processing server based on CORBA. Existing image processing tools were encapsulated in a common way with this server. Data exchange and conversion are done automatically inside the server, hiding these tasks from the user. The different image processing tools are visible as one large collection of algorithms and, due to the use of CORBA, are accessible via intranet/internet.
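
    CORBA itself is not shown here; as a simplified stand-in for the encapsulation idea (existing tools wrapped behind one network-visible interface), the sketch below exposes a single image operation through Python's built-in XML-RPC server. The operation, encoding and port are illustrative assumptions, not the paper's interface.

```python
# A minimal stand-in (not CORBA): an XML-RPC server that wraps one image
# processing routine behind a network interface.
import base64
from xmlrpc.server import SimpleXMLRPCServer

def invert_image(payload: str) -> str:
    """Accept base64-encoded raw image bytes, invert them, return base64 again."""
    data = bytearray(base64.b64decode(payload))
    inverted = bytes(255 - b for b in data)
    return base64.b64encode(inverted).decode("ascii")

server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True)
server.register_function(invert_image, "invert_image")
print("image processing server listening on :8000")
server.serve_forever()
```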

  4. Mining workflow processes from distributed workflow enactment event logs

    Directory of Open Access Journals (Sweden)

    Kwanghoon Pio Kim

    2012-12-01

    Full Text Available Workflow management systems help to execute, monitor and manage work process flow and execution. These systems, as they are executing, keep a record of who does what and when (e.g. a log of events). The activity of using computer software to examine these records, and deriving various structural data results, is called workflow mining. The workflow mining activity, in general, needs to encompass behavioral (process/control-flow), social, informational (data-flow), and organizational perspectives, as well as other perspectives, because workflow systems are "people systems" that must be designed, deployed, and understood within their social and organizational contexts. This paper particularly focuses on mining the behavioral aspect of workflows from XML-based workflow enactment event logs, which are vertically (semantic-driven distribution) or horizontally (syntactic-driven distribution) distributed over the networked workflow enactment components. That is, this paper proposes distributed workflow mining approaches that are able to rediscover ICN-based structured workflow process models through incrementally amalgamating a series of vertically or horizontally fragmented temporal workcases. Each of the approaches consists of a temporal fragment discovery algorithm, which is able to discover a set of temporal fragment models from the fragmented workflow enactment event logs, and a workflow process mining algorithm which rediscovers a structured workflow process model from the discovered temporal fragment models. Here, the temporal fragment model represents the concrete model of the XML-based distributed workflow fragment events log.
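
    To make the fragment-amalgamation idea concrete, the toy sketch below (an invented three-activity log, not the paper's ICN algorithms) merges horizontally fragmented event logs into per-workcase traces and then extracts direct-precedence counts, the raw material from which a structured process model would be rediscovered.

```python
from collections import defaultdict

# Hypothetical fragments: each enactment component logs (case_id, activity, timestamp).
fragment_a = [(1, "receive", 1), (2, "receive", 5)]
fragment_b = [(1, "review", 3), (2, "review", 7)]
fragment_c = [(1, "approve", 4), (2, "reject", 9)]

def amalgamate(*fragments):
    """Merge fragmented event logs into per-case traces ordered by time."""
    cases = defaultdict(list)
    for frag in fragments:
        for case_id, activity, ts in frag:
            cases[case_id].append((ts, activity))
    return {cid: [a for _, a in sorted(events)] for cid, events in cases.items()}

def direct_precedence(traces):
    """Count direct-follows relations, the raw material of a control-flow model."""
    follows = defaultdict(int)
    for trace in traces.values():
        for a, b in zip(trace, trace[1:]):
            follows[(a, b)] += 1
    return dict(follows)

traces = amalgamate(fragment_a, fragment_b, fragment_c)
print(direct_precedence(traces))
```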

  5. An Analysis of OpenACC Programming Model: Image Processing Algorithms as a Case Study

    Directory of Open Access Journals (Sweden)

    M. J. Mišić

    2014-06-01

    Full Text Available Graphics processing units and similar accelerators have been intensively used in general purpose computations for several years. In the last decade, GPU architecture and organization changed dramatically to support an ever-increasing demand for computing power. Along with changes in hardware, novel programming models have been proposed, such as NVIDIA’s Compute Unified Device Architecture (CUDA and Open Computing Language (OpenCL by Khronos group. Although numerous commercial and scientific applications have been developed using these two models, they still impose a significant challenge for less experienced users. There are users from various scientific and engineering communities who would like to speed up their applications without the need to deeply understand a low-level programming model and underlying hardware. In 2011, the OpenACC programming model was launched. Much like OpenMP for multicore processors, OpenACC is a high-level, directive-based programming model for manycore processors like GPUs. This paper presents an analysis of the OpenACC programming model and its applicability in typical domains like image processing. Three simple image processing algorithms have been implemented for execution on the GPU with OpenACC. The results were compared with their sequential counterparts and are briefly discussed.

  6. The Earth System Grid Federation : an Open Infrastructure for Access to Distributed Geospatial Data

    Science.gov (United States)

    Cinquini, Luca; Crichton, Daniel; Mattmann, Chris; Harney, John; Shipman, Galen; Wang, Feiyi; Ananthakrishnan, Rachana; Miller, Neill; Denvil, Sebastian; Morgan, Mark; Pobre, Zed; Bell, Gavin M.; Drach, Bob; Williams, Dean; Kershaw, Philip; Pascoe, Stephen; Gonzalez, Estanislao; Fiore, Sandro; Schweitzer, Roland

    2012-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).
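
    As an informal illustration of the shared search API mentioned above, the snippet below queries one ESGF index node over HTTP; the node URL, facet names and response layout are assumptions based on the publicly documented search interface and may differ between deployments.

```python
import requests

# Query the federated search API of one ESGF index node (endpoint and
# parameters are assumptions; adjust to the node you actually use).
url = "https://esgf-node.llnl.gov/esg-search/search"
params = {
    "project": "CMIP5",
    "experiment": "historical",
    "variable": "tas",
    "type": "Dataset",
    "format": "application/solr+json",
    "limit": 3,
}
resp = requests.get(url, params=params, timeout=30)
resp.raise_for_status()
for doc in resp.json()["response"]["docs"]:
    print(doc.get("id"))
```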

  7. The Earth System Grid Federation: An Open Infrastructure for Access to Distributed Geo-Spatial Data

    Energy Technology Data Exchange (ETDEWEB)

    Cinquini, Luca [Jet Propulsion Laboratory, Pasadena, CA; Crichton, Daniel [Jet Propulsion Laboratory, Pasadena, CA; Miller, Neill [Argonne National Laboratory (ANL); Mattmann, Chris [Jet Propulsion Laboratory, Pasadena, CA; Harney, John F [ORNL; Shipman, Galen M [ORNL; Wang, Feiyi [ORNL; Bell, Gavin [Lawrence Livermore National Laboratory (LLNL); Drach, Bob [Lawrence Livermore National Laboratory (LLNL); Ananthakrishnan, Rachana [Argonne National Laboratory (ANL); Pascoe, Stephen [STFC Rutherford Appleton Laboratory, NCAS/BADC; Kershaw, Philip [STFC Rutherford Appleton Laboratory, NCAS/BADC; Gonzalez, Estanislao [German Climate Computing Center; Fiore, Sandro [Euro-Mediterranean Center on Climate Change; Schweitzer, Roland [Pacific Marine Environmental Laboratory, National Oceanic and Atmospheric Administration; Danvil, Sebastian [Institut Pierre Simon Laplace (IPSL), Des Sciences de L' Environnement; Morgan, Mark [Institut Pierre Simon Laplace (IPSL), Des Sciences de L' Environnement

    2012-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).

  8. The Earth System Grid Federation: An Open Infrastructure for Access to Distributed Geospatial Data

    Energy Technology Data Exchange (ETDEWEB)

    Ananthakrishnan, Rachana [Argonne National Laboratory (ANL); Bell, Gavin [Lawrence Livermore National Laboratory (LLNL); Cinquini, Luca [Jet Propulsion Laboratory, Pasadena, CA; Crichton, Daniel [Jet Propulsion Laboratory, Pasadena, CA; Danvil, Sebastian [Institut Pierre Simon Laplace (IPSL), Des Sciences de L' Environnement; Drach, Bob [Lawrence Livermore National Laboratory (LLNL); Fiore, Sandro [Euro-Mediterranean Center on Climate Change; Gonzalez, Estanislao [German Climate Computing Center; Harney, John F [ORNL; Mattmann, Chris [Jet Propulsion Laboratory, Pasadena, CA; Kershaw, Philip [STFC Rutherford Appleton Laboratory, NCAS/BADC; Miller, Neill [Argonne National Laboratory (ANL); Morgan, Mark [Institut Pierre Simon Laplace (IPSL), Des Sciences de L' Environnement; Pascoe, Stephen [STFC Rutherford Appleton Laboratory, NCAS/BADC; Schweitzer, Roland [Pacific Marine Environmental Laboratory, National Oceanic and Atmospheric Administration; Shipman, Galen M [ORNL; Wang, Feiyi [ORNL

    2013-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).

  9. Implementation of Grid Tier 2 and Tier 3 facilities on a Distributed OpenStack Cloud

    Science.gov (United States)

    Limosani, Antonio; Boland, Lucien; Coddington, Paul; Crosby, Sean; Huang, Joanna; Sevior, Martin; Wilson, Ross; Zhang, Shunde

    2014-06-01

    The Australian Government is making a AUD 100 million investment in Compute and Storage for the academic community. The Compute facilities are provided in the form of 30,000 CPU cores located at 8 nodes around Australia in a distributed virtualized Infrastructure as a Service facility based on OpenStack. The storage will eventually consist of over 100 petabytes located at 6 nodes. All will be linked via a 100 Gb/s network. This proceeding describes the development of a fully connected WLCG Tier-2 grid site as well as a general purpose Tier-3 computing cluster based on this architecture. The facility employs an extension to Torque to enable dynamic allocations of virtual machine instances. A base Scientific Linux virtual machine (VM) image is deployed in the OpenStack cloud and automatically configured as required using Puppet. Custom scripts are used to launch multiple VMs, integrate them into the dynamic Torque cluster and to mount remote file systems. We report on our experience in developing this nation-wide ATLAS and Belle II Tier 2 and Tier 3 computing infrastructure using the national Research Cloud and storage facilities.

  10. The Earth System Grid Federation : an Open Infrastructure for Access to Distributed Geospatial Data

    Science.gov (United States)

    Cinquini, Luca; Crichton, Daniel; Mattmann, Chris; Harney, John; Shipman, Galen; Wang, Feiyi; Ananthakrishnan, Rachana; Miller, Neill; Denvil, Sebastian; Morgan, Mark

    2012-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).

  11. The process group approach to reliable distributed computing

    Science.gov (United States)

    Birman, Kenneth P.

    1992-01-01

    The difficulty of developing reliable distributed software is an impediment to applying distributed computing technology in many settings. Experience with the ISIS system suggests that a structured approach based on virtually synchronous process groups yields systems that are substantially easier to develop, exploit sophisticated forms of cooperative computation, and achieve high reliability. Six years of research on ISIS are reviewed, describing the model, its implementation challenges, and the types of applications to which ISIS has been applied.

  12. Electric power processing, distribution, management and energy storage

    Science.gov (United States)

    Giudici, R. J.

    1980-01-01

    Power distribution subsystems are required for three elements of the SPS program: (1) the orbiting satellite, (2) the ground rectenna, and (3) the Electric Orbiting Transfer Vehicle (EOTV). Power distribution subsystems receive electrical power from the energy conversion subsystem and provide the power busses, rotary power transfer devices, switchgear, power processing, energy storage, and power management required to deliver power. Control, high-voltage plasma interactions, electric thruster interactions, and spacecraft charging of the SPS and the EOTV are also included as part of the power distribution subsystem design.

  13. The Land Processes Distributed Active Archive Center (LP DAAC)

    Science.gov (United States)

    Golon, Danielle K.

    2016-10-03

    The Land Processes Distributed Active Archive Center (LP DAAC) operates as a partnership with the U.S. Geological Survey and is 1 of 12 DAACs within the National Aeronautics and Space Administration (NASA) Earth Observing System Data and Information System (EOSDIS). The LP DAAC ingests, archives, processes, and distributes NASA Earth science remote sensing data. These data are provided to the public at no charge. Data distributed by the LP DAAC provide information about Earth’s surface from daily to yearly intervals and at 15 to 5,600 meter spatial resolution. Data provided by the LP DAAC can be used to study changes in agriculture, vegetation, ecosystems, elevation, and much more. The LP DAAC provides several ways to access, process, and interact with these data. In addition, the LP DAAC is actively archiving new datasets to provide users with a variety of data to study the Earth.

  14. Distributed Signal Processing for Wireless EEG Sensor Networks.

    Science.gov (United States)

    Bertrand, Alexander

    2015-11-01

    Inspired by ongoing evolutions in the field of wireless body area networks (WBANs), this tutorial paper presents a conceptual and exploratory study of wireless electroencephalography (EEG) sensor networks (WESNs), with an emphasis on distributed signal processing aspects. A WESN is conceived as a modular neuromonitoring platform for high-density EEG recordings, in which each node is equipped with an electrode array, a signal processing unit, and facilities for wireless communication. We first address the advantages of such a modular approach, and we explain how distributed signal processing algorithms make WESNs more power-efficient, in particular by avoiding data centralization. We provide an overview of distributed signal processing algorithms that are potentially applicable in WESNs, and for illustration purposes, we also provide a more detailed case study of a distributed eye blink artifact removal algorithm. Finally, we study the power efficiency of these distributed algorithms in comparison to their centralized counterparts in which all the raw sensor signals are centralized in a near-end or far-end fusion center.
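
    A toy sketch of why avoiding data centralization saves bandwidth: each node applies a local spatial filter (a random one here, standing in for an adaptive algorithm such as the artifact-removal case study) and transmits one fused channel instead of all raw channels. All dimensions and the filter are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
fs, seconds, nodes, ch_per_node = 250, 10, 4, 8
raw = rng.standard_normal((nodes, ch_per_node, fs * seconds))

# Centralized: every node ships all raw channels to a fusion center.
centralized_samples = raw.size

# Distributed: each node locally projects its channels onto one spatial filter
# (random here, as a stand-in for an adaptive filter) and transmits the result.
filters = rng.standard_normal((nodes, ch_per_node))
fused = np.einsum("nc,nct->nt", filters, raw)
distributed_samples = fused.size

print(f"samples sent, centralized : {centralized_samples}")
print(f"samples sent, distributed : {distributed_samples}")
print(f"reduction factor          : {centralized_samples / distributed_samples:.0f}x")
```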

  15. Distributed Processing of Snort Alert Log using Hadoop

    Directory of Open Access Journals (Sweden)

    JeongJin Cheon

    2013-06-01

    Full Text Available Snort is a famous tool for Intrusion Detection Systems (IDS), which is used to gather and analyse network packets in order to detect attacks through the network. Until now, although processing a number of warning messages in real time, Snort has been executed mainly on single computer systems. Unfortunately, the current amount of network messages exceeds the processing capacity of single computer systems. In order to embrace the huge amount of network messages, we have constructed a distributed IDS using Hadoop, HDFS, and 8 working nodes. Experimental results show that our distributed IDS achieves 426% of the performance of a single computer system.
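
    The paper's cluster runs Hadoop proper; the self-contained sketch below only mimics the map and reduce phases in plain Python (toy alert lines, counting alerts per source address) to show the shape of the computation that gets distributed over the working nodes.

```python
from collections import Counter
from itertools import chain

# Toy stand-in for parsed Snort alert records spread across HDFS blocks.
blocks = [
    ["10.0.0.5 SCAN", "10.0.0.5 SCAN", "10.0.0.9 EXPLOIT"],
    ["10.0.0.9 EXPLOIT", "10.0.0.7 SCAN"],
]

def map_phase(block):
    """Mapper: emit (source_ip, 1) for every alert line in one block."""
    for line in block:
        src_ip, _signature = line.split(maxsplit=1)
        yield src_ip, 1

def reduce_phase(pairs):
    """Reducer: sum the counts per key, as Hadoop would after the shuffle."""
    totals = Counter()
    for key, value in pairs:
        totals[key] += value
    return totals

mapped = chain.from_iterable(map_phase(b) for b in blocks)
print(reduce_phase(mapped))
```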

  16. A distributed computing model for telemetry data processing

    Science.gov (United States)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-01-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  17. A distributed computing model for telemetry data processing

    Science.gov (United States)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-05-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  18. Processes controlling the depth distribution of soil organic carbon

    Science.gov (United States)

    Murphy, Brian; Wilson, Brian; Koen, Terry

    2017-04-01

    Knowledge of the processes controlling the depth distribution of soil organic carbon (SOC) has two major purposes: A. Providing insights into the dynamics of SOC that can be used for managing soil organic carbon and improving soil carbon sequestration. B. The prediction of SOC stocks from surface measurements of soil carbon. We investigated the depth distributions of SOC in a range of soils under a number of land management practices and tested how various mathematical models fitted these distributions. The mathematical models included exponential, power functions, inverse functions and multiphase exponential functions. While spline functions have been shown to fit depth distributions of SOC, the use of these functions is largely a data fitting exercise and does not necessarily provide insight into the processes of SOC dynamics. In general, soils that were depleted of SOC (under traditional tillage and land management practices that deplete the soil of SOC) had depth distributions that were fitted closely by a number of mathematical functions, including the exponential function. As the amount of SOC in the soil increased, especially in the surface soils, it became clear that the only mathematical function that could reasonably fit the depth distribution of SOC was the multiphase exponential model. To test the mathematical models further, several of the depth distributions were tested with semi-log plots of depth v log (SOC). These plots clearly showed that there were definite phases in the distribution of SOC with depth. The implication is that different processes are occurring in the addition and losses of SOC within each of these phases, and the phases identified by the semi-log plots appear to be equivalent to the zones of SOC cycling postulated by Eyles et al. (2015). The identification of these zones has implications for the management and sequestration of carbon in soils. Eyles, A, Coghlan, G, Hardie, M, Hovenden, M and Bridle, K (2015). Soil carbon sequestration
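
    A small sketch of the model comparison described above, fitting a single exponential and a two-phase (multiphase) exponential to an invented SOC depth profile with scipy; the numbers are illustrative, not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

depth = np.array([5, 15, 30, 50, 75, 100, 150])          # cm
soc = np.array([4.0, 2.8, 1.6, 1.0, 0.7, 0.55, 0.45])    # % SOC (illustrative values)

def single_exp(z, a, k):
    return a * np.exp(-k * z)

def two_phase_exp(z, a1, k1, a2, k2):
    """Sum of two exponentials: a fast-cycling surface phase plus a slower deep phase."""
    return a1 * np.exp(-k1 * z) + a2 * np.exp(-k2 * z)

p1, _ = curve_fit(single_exp, depth, soc, p0=(4, 0.02))
p2, _ = curve_fit(two_phase_exp, depth, soc, p0=(3, 0.05, 1, 0.005), maxfev=10000)

for name, model, p in [("single", single_exp, p1), ("two-phase", two_phase_exp, p2)]:
    rss = np.sum((soc - model(depth, *p)) ** 2)
    print(f"{name:9s} RSS = {rss:.3f}")
```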

  19. Supporting users through integrated retrieval, processing, and distribution systems at the Land Processes Distributed Active Archive Center

    Science.gov (United States)

    Kalvelage, Thomas A.; Willems, Jennifer

    2005-01-01

    The US Geological Survey's EROS Data Center (EDC) hosts the Land Processes Distributed Active Archive Center (LP DAAC). The LP DAAC supports NASA's Earth Observing System (EOS), which is a series of polar-orbiting and low inclination satellites for long-term global observations of the land surface, biosphere, solid Earth, atmosphere, and oceans. The EOS Data and Information Systems (EOSDIS) was designed to acquire, archive, manage and distribute Earth observation data to the broadest possible user community.

  20. Control of Grafting Density and Distribution in Graft Polymers by Living Ring-Opening Metathesis Copolymerization.

    Science.gov (United States)

    Lin, Tzu-Pin; Chang, Alice B; Chen, Hsiang-Yun; Liberman-Martin, Allegra L; Bates, Christopher M; Voegtle, Matthew J; Bauer, Christina A; Grubbs, Robert H

    2017-03-15

    Control over polymer sequence and architecture is crucial to both understanding structure-property relationships and designing functional materials. In pursuit of these goals, we developed a new synthetic approach that enables facile manipulation of the density and distribution of grafts in polymers via living ring-opening metathesis polymerization (ROMP). Discrete endo,exo-norbornenyl dialkylesters (dimethyl DME, diethyl DEE, di-n-butyl DBE) were strategically designed to copolymerize with a norbornene-functionalized polystyrene (PS), polylactide (PLA), or polydimethylsiloxane (PDMS) macromonomer mediated by the third-generation metathesis catalyst (G3). The small-molecule diesters act as diluents that increase the average distance between grafted side chains, generating polymers with variable grafting density. The grafting density (number of side chains/number of norbornene backbone repeats) could be straightforwardly controlled by the macromonomer/diluent feed ratio. To gain insight into the copolymer sequence and architecture, self-propagation and cross-propagation rate constants were determined according to a terminal copolymerization model. These kinetic analyses suggest that copolymerizing a macromonomer/diluent pair with evenly matched self-propagation rate constants favors randomly distributed side chains. As the disparity between macromonomer and diluent homopolymerization rates increases, the reactivity ratios depart from unity, leading to an increase in gradient tendency. To demonstrate the effectiveness of our method, an array of monodisperse polymers (PLA(x)-ran-DME(1-x))n bearing variable grafting densities (x = 1.0, 0.75, 0.5, 0.25) and total backbone degrees of polymerization (n = 167, 133, 100, 67, 33) were synthesized. The approach disclosed in this work therefore constitutes a powerful strategy for the synthesis of polymers spanning the linear-to-bottlebrush regimes with controlled grafting density and side chain distribution, molecular
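
    For reference, the terminal copolymerization model invoked above leads to the standard Mayo-Lewis composition equation relating the instantaneous incorporation of the two monomers (here, macromonomer and diluent) to their feed concentrations and reactivity ratios; this is quoted as general background rather than the paper's specific kinetic treatment.

```latex
\frac{d[M_1]}{d[M_2]} \;=\; \frac{[M_1]}{[M_2]}\,
\frac{r_1\,[M_1] + [M_2]}{[M_1] + r_2\,[M_2]},
\qquad
r_1 = \frac{k_{11}}{k_{12}}, \quad r_2 = \frac{k_{22}}{k_{21}}
```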

  1. Benchmarking Distributed Stream Processing Platforms for IoT Applications

    OpenAIRE

    Shukla, Anshu; Simmhan, Yogesh

    2016-01-01

    Internet of Things (IoT) is a technology paradigm where millions of sensors monitor, and help inform or manage, physical, environmental and human systems in real-time. The inherent closed-loop responsiveness and decision making of IoT applications makes them ideal candidates for using low latency and scalable stream processing platforms. Distributed Stream Processing Systems (DSPS) are becoming essential components of any IoT stack, but the efficacy and performance of contemporary DSP...

  2. Nuclear parton distributions and the Drell-Yan process

    Science.gov (United States)

    Kulagin, S. A.; Petti, R.

    2014-10-01

    We study the nuclear parton distribution functions on the basis of our recently developed semimicroscopic model, which takes into account a number of nuclear effects including nuclear shadowing, Fermi motion and nuclear binding, nuclear meson-exchange currents, and off-shell corrections to bound nucleon distributions. We discuss in detail the dependencies of nuclear effects on the type of parton distribution (nuclear sea vs valence), as well as on the parton flavor (isospin). We apply the resulting nuclear parton distributions to calculate ratios of cross sections for proton-induced Drell-Yan production off different nuclear targets. We obtain a good agreement on the magnitude, target and projectile x, and the dimuon mass dependence of proton-nucleus Drell-Yan process data from the E772 and E866 experiments at Fermilab. We also provide nuclear corrections for the Drell-Yan data from the E605 experiment.

  3. Nuclear Parton Distributions and the Drell-Yan Process

    CERN Document Server

    Kulagin, S A

    2014-01-01

    We study the nuclear parton distribution functions based on our recently developed semi-microscopic model, which takes into account a number of nuclear effects including nuclear shadowing, Fermi motion and nuclear binding, nuclear meson-exchange currents and off-shell corrections to bound nucleon distributions. We discuss in detail the dependencies of nuclear effects on the type of parton distribution (nuclear sea vs. valence) as well as on the parton flavour (isospin). The resulting nuclear parton distributions are applied to calculate the ratios of cross sections for proton-induced Drell-Yan production off different nuclear targets. We obtain a good agreement on the magnitude, target and projectile x and the dimuon mass dependence of proton-nucleus Drell-Yan process data from the E772 and E866 experiments at Fermilab.

  4. Searching for the Tracy-Widom distribution in nonequilibrium processes

    Science.gov (United States)

    Mendl, Christian B.; Spohn, Herbert

    2016-06-01

    While originally discovered in the context of the Gaussian unitary ensemble, the Tracy-Widom distribution also rules the height fluctuations of growth processes. This suggests that there might be other nonequilibrium processes in which the Tracy-Widom distribution plays an important role. In our contribution we study one-dimensional systems with domain wall initial conditions. For an appropriate choice of parameters, the profile develops a rarefaction wave while maintaining the initial equilibrium states far to the left and right, which thus serve as infinitely extended thermal reservoirs. For a Fermi-Pasta-Ulam type anharmonic chain, we will demonstrate that the time-integrated current has a deterministic contribution, linear in time t, and fluctuations of size t^{1/3} with a Tracy-Widom distributed random amplitude.

  5. Probability distributions for Poisson processes with pile-up

    CERN Document Server

    Sevilla, Diego J R

    2013-01-01

    In this paper, two parametric probability distributions capable of describing the statistics of X-ray photon detection by a CCD are presented. They are formulated from simple models that account for the pile-up phenomenon, in which two or more photons are counted as one. These models are based on the Poisson process, but they have an extra parameter which includes all the detailed mechanisms of the pile-up process and must be fitted to the data statistics simultaneously with the rate parameter. The new probability distributions, one for the number of counts per time bin (Poisson-like) and the other for waiting times (exponential-like), are tested by fitting them to the statistics of real data and against each other through numerical simulations, and the results are analyzed and compared. The probability distributions presented here can be used as background statistical models to derive likelihood functions for statistical methods in signal analysis.
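
    The simplest version of the pile-up effect described above is easy to simulate: draw Poisson photon counts per read-out frame and let the detector register at most one count per frame. The sketch below (illustrative rate, not the paper's parametric distributions) shows how the recorded mean is depressed relative to the true rate.

```python
import numpy as np

rng = np.random.default_rng(42)
rate, frames = 0.3, 200_000        # mean photons per frame (illustrative)

true_counts = rng.poisson(rate, frames)
# Simplest pile-up model: within one frame the detector cannot separate
# photons, so any frame with >= 1 photon registers exactly one count.
recorded = np.minimum(true_counts, 1)

print("true mean counts/frame    :", true_counts.mean())
print("recorded mean counts/frame:", recorded.mean())
print("Poisson prediction P(>=1) :", 1 - np.exp(-rate))
```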

  6. Eigenvalue distribution of large sample covariance matrices of linear processes

    CERN Document Server

    Pfaffel, Oliver

    2012-01-01

    We derive the distribution of the eigenvalues of a large sample covariance matrix when the data is dependent in time. More precisely, the dependence for each variable $i=1,\dots,p$ is modelled as a linear process $(X_{i,t})_{t=1,\dots,n}=(\sum_{j=0}^\infty c_j Z_{i,t-j})_{t=1,\dots,n}$, where $\{Z_{i,t}\}$ are assumed to be independent random variables with finite fourth moments. If the sample size $n$ and the number of variables $p=p_n$ both converge to infinity such that $y=\lim_{n\to\infty} n/p_n>0$, then the empirical spectral distribution of $p^{-1}XX^T$ converges to a non-random distribution which only depends on $y$ and the spectral density of $(X_{1,t})_{t\in\mathbb{Z}}$. In particular, our results apply to (fractionally integrated) ARMA processes, which we illustrate by some examples.

  7. Building Big Flares: Constraining Generating Processes of Solar Flare Distributions

    Science.gov (United States)

    Wyse Jackson, T.; Kashyap, V.; McKillop, S.

    2015-12-01

    We address mechanisms which seek to explain the observed solar flare distribution, dN/dE ~ E^(-1.8). We have compiled a comprehensive database, from GOES, NOAA, XRT, and AIA data, of solar flares and their characteristics, covering the year 2013. These datasets allow us to probe how stored magnetic energy is released over the course of an active region's evolution. We fit power laws to flare distributions over various attribute groupings. For instance, we compare flares that occur before and after an active region reaches its maximum area, and show that the corresponding flare distributions are indistinguishable; thus, the processes that lead to magnetic reconnection are similar in both cases. A turnover in the distribution is not detectable at the energies accessible to our study, suggesting that a self-organized critical (SOC) process is a valid mechanism. However, we find changes in the distributions that suggest that the simple picture of an SOC where flares draw energy from an inexhaustible reservoir of stored magnetic energy is incomplete. Following the evolution of the flare distribution over the lifetimes of active regions, we find that the distribution flattens with time, and for larger active regions, and that a single power-law model is insufficient. This implies that flares that occur later in the lifetime of the active region tend towards higher energies. We conclude that the SOC process must have an upper bound. Increasing the scope of the study to include data from other years and more instruments will increase the robustness of these results. This work was supported by the NSF-REU Solar Physics Program at SAO, grant number AGS 1263241, NASA Contract NAS8-03060 to the Chandra X-ray Center and by NASA Hinode/XRT contract NNM07AB07C to SAO.
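
    A compact way to fit such a flare-energy power law is the maximum-likelihood estimator for the index above a chosen lower cutoff; the sketch below recovers the index from synthetic data (the generating index and cutoff are arbitrary choices, not the study's catalogue).

```python
import numpy as np

rng = np.random.default_rng(0)
alpha_true, x_min, n = 1.8, 1.0, 5000

# Draw synthetic "flare energies" from a pure power law via inverse transform sampling.
u = rng.random(n)
energies = x_min * (1 - u) ** (-1.0 / (alpha_true - 1.0))

# Maximum-likelihood (Hill-type) estimator for the power-law index above x_min.
alpha_hat = 1.0 + n / np.sum(np.log(energies / x_min))
print(f"recovered index: {alpha_hat:.3f} (true {alpha_true})")
```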

  8. Just-in-time Data Distribution for Analytical Query Processing

    NARCIS (Netherlands)

    Ivanova, M.G.; Kersten, M.L.; Groffen, F.E.

    2012-01-01

    Distributed processing commonly requires data spread across machines using a priori static or hash-based data allocation. In this paper, we explore an alternative approach that starts from a master node in control of the complete database, and a variable number of worker nodes for delegated

  9. Just-In-Time Data Distribution for Analytical Query Processing

    NARCIS (Netherlands)

    Ivanova, M.; Kersten, M.; Groffen, F.

    2012-01-01

    Distributed processing commonly requires data spread across machines using a priori static or hash-based data allocation. In this paper, we explore an alternative approach that starts from a master node in control of the complete database, and a variable number of worker nodes for delegated query pr

  10. Post-processing procedure for industrial quantum key distribution systems

    Science.gov (United States)

    Kiktenko, Evgeny; Trushechkin, Anton; Kurochkin, Yury; Fedorov, Aleksey

    2016-08-01

    We present algorithmic solutions aimed at the post-processing procedure for industrial quantum key distribution systems with hardware sifting. The main steps of the procedure are error correction, parameter estimation, and privacy amplification. Authentication of the classical public communication channel is also considered.
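
    Of the steps listed above, privacy amplification is often implemented with Toeplitz hashing; the sketch below compresses a reconciled key with a seeded binary Toeplitz matrix over GF(2). The key lengths and compression ratio are illustrative assumptions, and error correction and parameter estimation are not shown.

```python
import numpy as np

rng = np.random.default_rng(7)
n_in, n_out = 1024, 256                        # reconciled key -> final key (illustrative)

key = rng.integers(0, 2, n_in)                 # error-corrected (reconciled) key bits
seed = rng.integers(0, 2, n_in + n_out - 1)    # public random seed shared by both parties

# Binary Toeplitz matrix defined by the seed: entry (i, j) depends only on i - j.
i, j = np.indices((n_out, n_in))
toeplitz = seed[i - j + n_in - 1]

# Privacy amplification: final key = T * key over GF(2).
final_key = toeplitz.dot(key) % 2
print(final_key[:16])
```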

  11. Lifetime-Based Memory Management for Distributed Data Processing Systems

    DEFF Research Database (Denmark)

    Lu, Lu; Shi, Xuanhua; Zhou, Yongluan

    2016-01-01

    In-memory caching of intermediate data and eager combining of data in shuffle buffers have been shown to be very effective in minimizing the re-computation and I/O cost in distributed data processing systems like Spark and Flink. However, it has also been widely reported that these techniques would...

  12. First-Passage-Time Distribution for Variable-Diffusion Processes

    Science.gov (United States)

    Barney, Liberty; Gunaratne, Gemunu H.

    2017-05-01

    First-passage-time distribution, which presents the likelihood of a stock reaching a pre-specified price at a given time, is useful in establishing the value of financial instruments and in designing trading strategies. First-passage-time distribution for Wiener processes has a single peak, while that for stocks exhibits a notable second peak within a trading day. This feature has only been discussed sporadically—often dismissed as due to insufficient/incorrect data or circumvented by conversion to tick time—and to the best of our knowledge has not been explained in terms of the underlying stochastic process. It was shown previously that intra-day variations in the market can be modeled by a stochastic process containing two variable-diffusion processes (Hua et al. in Physica A 419:221-233, 2015). We show here that the first-passage-time distribution of this two-stage variable-diffusion model does exhibit a behavior similar to the empirical observation. In addition, we find that an extended model incorporating overnight price fluctuations exhibits intra- and inter-day behavior similar to those of empirical first-passage-time distributions.
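
    A rough numerical counterpart of that two-stage picture: simulate random-walk price paths whose step volatility is elevated near the open and close of a 390-minute trading day, and histogram the first times each path hits a barrier. The volatility profile, barrier and units are illustrative assumptions rather than the calibrated model of Hua et al.

```python
import numpy as np

rng = np.random.default_rng(3)
n_paths, n_steps, barrier = 20_000, 390, 1.0   # 390 one-minute steps in a trading day

# Illustrative intra-day volatility profile: elevated near the open and the close.
t = np.linspace(0, 1, n_steps)
sigma = 0.05 * (1.0 + 1.5 * np.exp(-20 * t) + 1.5 * np.exp(-20 * (1 - t)))

first_passage = np.full(n_paths, np.nan)
prices = np.zeros(n_paths)
alive = np.ones(n_paths, dtype=bool)
for k in range(n_steps):
    prices[alive] += sigma[k] * rng.standard_normal(alive.sum())
    hit = alive & (prices >= barrier)
    first_passage[hit] = k
    alive &= ~hit

# Elevated late-day volatility raises the hitting hazard near the close.
counts, _ = np.histogram(first_passage[~np.isnan(first_passage)], bins=39)
print(counts)
```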

  13. Postscript: Parallel Distributed Processing in Localist Models without Thresholds

    Science.gov (United States)

    Plaut, David C.; McClelland, James L.

    2010-01-01

    The current authors reply to a response by Bowers on a comment by the current authors on the original article. Bowers (2010) mischaracterizes the goals of parallel distributed processing (PDP research)--explaining performance on cognitive tasks is the primary motivation. More important, his claim that localist models, such as the interactive…

  14. Postscript: Parallel Distributed Processing in Localist Models without Thresholds

    Science.gov (United States)

    Plaut, David C.; McClelland, James L.

    2010-01-01

    The current authors reply to a response by Bowers on a comment by the current authors on the original article. Bowers (2010) mischaracterizes the goals of parallel distributed processing (PDP research)--explaining performance on cognitive tasks is the primary motivation. More important, his claim that localist models, such as the interactive…

  15. The mass-ratio and eccentricity distributions of barium and S stars, and red giants in open clusters

    Science.gov (United States)

    Van der Swaelmen, M.; Boffin, H. M. J.; Jorissen, A.; Van Eck, S.

    2017-01-01

    Context. A complete set of orbital parameters for barium stars, including the longest orbits, has recently been obtained thanks to a radial-velocity monitoring with the HERMES spectrograph installed on the Flemish Mercator telescope. Barium stars are supposed to belong to post-mass-transfer systems. Aims: In order to identify diagnostics distinguishing between pre- and post-mass-transfer systems, the properties of barium stars (more precisely their mass-function distribution and their period-eccentricity (P-e) diagram) are compared to those of binary red giants in open clusters. As a side product, we aim to identify possible post-mass-transfer systems among the cluster giants from the presence of s-process overabundances. We investigate the relation between the s-process enrichment, the location in the (P-e) diagram, and the cluster metallicity and turn-off mass. Methods: To invert the mass-function distribution and derive the mass-ratio distribution, we used the method pioneered by Boffin et al. (1992) that relies on a Richardson-Lucy deconvolution algorithm. The derivation of s-process abundances in the open-cluster giants was performed through spectral synthesis with MARCS model atmospheres. Results: A fraction of 22% of post-mass-transfer systems is found among the cluster binary giants (with companion masses between 0.58 and 0.87 M⊙, typical for white dwarfs), and these systems occupy a wider area than barium stars in the (P-e) diagram. Barium stars have on average lower eccentricities at a given orbital period. When the sample of binary giant stars in clusters is restricted to the subsample of systems occupying the same locus as the barium stars in the (P-e) diagram, and with a mass function compatible with a WD companion, 33% (=4/12) show a chemical signature of mass transfer in the form of s-process overabundances (from rather moderate - about 0.3 dex - to more extreme - about 1 dex). The only strong barium star in our sample is found in the cluster with
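
    The deconvolution step mentioned under Methods can be illustrated generically: given a smearing kernel that maps a latent histogram to the observed one, Richardson-Lucy iterates a multiplicative correction until the forward-smeared estimate matches the data. The toy kernel and Gaussian "truth" below are illustrative; the actual inversion from mass functions to mass ratios involves the orbital-inclination kernel of Boffin et al., which is not reproduced here.

```python
import numpy as np

def richardson_lucy(observed, kernel, iterations=50):
    """Generic Richardson-Lucy iteration: recover a latent histogram u such
    that kernel @ u reproduces the observed histogram."""
    u = np.full(kernel.shape[1], observed.sum() / kernel.shape[1])
    for _ in range(iterations):
        predicted = kernel @ u
        ratio = observed / np.maximum(predicted, 1e-12)
        u *= kernel.T @ ratio
    return u

# Toy forward model: each latent bin is smeared into neighbouring observed bins.
n = 20
kernel = np.zeros((n, n))
for j in range(n):
    for i in range(max(0, j - 2), min(n, j + 3)):
        kernel[i, j] = 1.0
kernel /= kernel.sum(axis=0)          # columns normalised to unit probability

truth = np.exp(-0.5 * ((np.arange(n) - 12) / 2.0) ** 2)
observed = kernel @ truth
estimate = richardson_lucy(observed, kernel)
print(np.round(estimate, 2))
```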

  16. EQUAL DISTRIBUTION OF KNOWLEDGE-CONDITION FOR SUCCESSFUL PROCESS APPROACH

    Directory of Open Access Journals (Sweden)

    Milan J.Perović

    2009-03-01

    Full Text Available Training and qualification are important requirements of the international standards of the ISO 9000 series, as well as of the environmental management (ISO 14000) and occupational safety (ISO 18000) standards. The paper points out the significance of training and qualification, especially of an equal distribution of knowledge in an organization. The paper also emphasizes the connection between training, qualification and processes, and points out the role of management in the training of all staff and members of management. Training of staff depends on management, as does the level of knowledge. Process improvement is knowledge dependent, and the survival of an organization is improvement dependent.

  17. "BIOX" hydrogen sulfide abatement process - application analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gallup, D.L. [UNOCAL Corp., Santa Rosa, CA (United States)

    1996-12-31

    A new hydrogen sulfide abatement process, known as "BIOX," has been specifically developed for the geothermal industry. "BIOX" (biocide induced oxidation) successfully controls both primary and secondary emissions from cooling towers in pilot, demonstration, and commercial operations by air-wet oxidation. Independent laboratory tests recently controverted the efficacy of "BIOX" to catalytically oxidize sulfides to sulfate. Studies conducted in our laboratory with a simulated cooling tower indicate that the experimental conditions employed by Nardini et al. are unrealistic for geothermal cooling towers. Furthermore, our investigations demonstrate that the "BIOX" process performs optimally at near-neutral pH, a condition common to most geothermal cooling tower circulating water systems. A "BIOX" agent, trichloroisocyanuric acid (TCCA), proved to mitigate sulfide emissions much more efficiently than air, sodium hypochlorite or chlorine dioxide. "BIOX" is a proven, cost-effective H2S abatement technology.

  18. DISTRIBUTED PROCESSING TRADE-OFF MODEL FOR ELECTRIC UTILITY OPERATION

    Science.gov (United States)

    Klein, S. A.

    1994-01-01

    The Distributed Processing Trade-off Model for Electric Utility Operation is based upon a study performed for the California Institute of Technology's Jet Propulsion Laboratory. This study presented a technique that addresses the question of trade-offs between expanding a communications network and expanding the capacity of distributed computers in an electric utility Energy Management System (EMS). The technique resulted in the development of a quantitative assessment model that is presented in a Lotus 1-2-3 worksheet environment. The model gives EMS planners a macroscopic tool for evaluating distributed processing architectures and the major technical and economic trade-offs as well as interactions within these architectures. The model inputs (which may be varied according to application and need) include geographic parameters, data flow and processing workload parameters, operator staffing parameters, and technology/economic parameters. The model's outputs are total cost in various categories, a number of intermediate cost and technical calculation results, as well as graphical presentation of Costs vs. Percent Distribution for various parameters. The model has been implemented on an IBM PC using the LOTUS 1-2-3 spreadsheet environment and was developed in 1986. Also included with the spreadsheet model are a number of representative but hypothetical utility system examples.

  19. Biosecurity and Open-Source Biology: The Promise and Peril of Distributed Synthetic Biological Technologies.

    Science.gov (United States)

    Evans, Nicholas G; Selgelid, Michael J

    2015-08-01

    In this article, we raise ethical concerns about the potential misuse of open-source biology (OSB): biological research and development that progresses through an organisational model of radical openness, deskilling, and innovation. We compare this organisational structure to that of the open-source software model, and detail salient ethical implications of this model. We demonstrate that OSB, in virtue of its commitment to openness, may be resistant to governance attempts.

  20. Marketing promotion in the consumer goods’ retail distribution process

    Directory of Open Access Journals (Sweden)

    S.Bălăşescu

    2013-06-01

    Full Text Available The fundamental characteristic of contemporary marketing is the total opening towards three major directions: consumer needs, organization needs and society's needs. The continuous expansion of marketing has been accompanied by a process of differentiation and specialization. Differentiation has led to the so-called "specific marketing". In this paper, we aim to explain that in retail companies the concept of sales marketing can be distinguished as an independent marketing specialization. The main objectives of this paper are: the definition and delimitation of consumer goods' sales marketing in the retail business, and the sectoral approach of the marketing concept and its specific techniques for retail activities.

  1. Beowulf Distributed Processing and the United States Geological Survey

    Science.gov (United States)

    Maddox, Brian G.

    2002-01-01

    Introduction In recent years, the United States Geological Survey's (USGS) National Mapping Discipline (NMD) has expanded its scientific and research activities. Work is being conducted in areas such as emergency response research, scientific visualization, urban prediction, and other simulation activities. Custom-produced digital data have become essential for these types of activities. High-resolution, remotely sensed datasets are also seeing increased use. Unfortunately, the NMD is also finding that it lacks the resources required to perform some of these activities. Many of these projects require large amounts of computer processing resources. Complex urban-prediction simulations, for example, involve large amounts of processor-intensive calculations on large amounts of input data. This project was undertaken to learn and understand the concepts of distributed processing. Experience was needed in developing these types of applications. The idea was that this type of technology could significantly aid the needs of the NMD scientific and research programs. Porting a numerically intensive application currently being used by an NMD science program to run in a distributed fashion would demonstrate the usefulness of this technology. There are several benefits that this type of technology can bring to the USGS's research programs. Projects can be performed that were previously impossible due to a lack of computing resources. Other projects can be performed on a larger scale than previously possible. For example, distributed processing can enable urban dynamics research to perform simulations on larger areas without making huge sacrifices in resolution. The processing can also be done in a more reasonable amount of time than with traditional single-threaded methods (a scaled version of Chester County, Pennsylvania, took about fifty days to finish its first calibration phase with a single-threaded program). This paper has several goals regarding distributed processing

  2. Distributed situation management processing: enabling next generation C2

    Science.gov (United States)

    Dunkelberger, Kirk A.

    1996-06-01

    Increased use of joint task force concepts is expanding the battlespace and placing higher demands on interoperability. But simultaneous downsizing of forces is increasing the workload on warfighters; while there is a demand for increased decision aiding there has not been a corresponding increase in computational resources. Force wide situation management, the proactive command and control (C2) of the battlespace enabled by broad situation awareness and a deep understanding of mission context, is not likely given today's computational capability, system architecture, algorithmic, and datalink limitations. Next generation C2, e.g. decentralized, `rolling' etc., could be significantly enhanced by distributed situation management processing techniques. Presented herein is a sampling of core technologies, software architectures, cognitive processing algorithms, and datalink requirements which could enable next generation C2. Dynamic, adaptive process distribution concepts are discussed which address platform and tactical application computational capability limitations. Software and datalink architectures are then presented which facilitate situation management process distribution. Finally, required evolution of current algorithms and algorithms potentially enabled within these concepts are introduced.

  3. Regular conditional distributions of max infinitely divisible processes

    CERN Document Server

    Dombry, Clément

    2011-01-01

    This paper is devoted to the prediction problem in extreme value theory. Our main result is an explicit expression for the regular conditional distribution of a max-stable (or max-infinitely divisible) process $\{\eta(t)\}_{t\in T}$ given observations $\{\eta(t_i)=y_i,\ 1\leq i\leq k\}$. Our starting point is the point process representation of max-infinitely divisible processes by Giné, Hahn and Vatan (1990). We carefully analyze the structure of the underlying point process, introduce the notions of extremal function, sub-extremal function and hitting scenario associated to the constraints, and derive the associated distributions. This allows us to write the conditional distribution explicitly as a mixture over all hitting scenarios compatible with the conditioning constraints. This formula extends a recent related result by Wang and Stoev (2011) dealing with the case of spectrally discrete max-stable random fields. We believe this work offers new tools and perspective for prediction in extreme value theory together...

  4. Numerical Modeling of the Compression Process of Elastic Open-cell Foams

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Random models of open-cell foams that reflect the actual cell geometrical properties are constructed with the Voronoi technique. The compression process of elastic open-cell foams is simulated with the nonlinear calculation module of a finite element analysis program. In order to obtain general results applicable to this kind of material, a dimensionless compressive stress is used and the stress-strain curves of foam models with different geometrical properties are obtained. Then, the influences of open-cell geometrical properties, including the shape of the strut cross section, relative density and cell shape irregularity, on the compressive nonlinear mechanical performance are analyzed. In addition, the numerical results are compared with the predictions of the cubic staggering model. Numerical results indicate that the simulation reflects the compressive process of foams quite well and that the geometrical properties of the cells have significant influences on the nonlinear mechanical behavior of foams.

  5. Interacting discrete Markov processes with power-law probability distributions

    Science.gov (United States)

    Ridley, Kevin D.; Jakeman, Eric

    2017-09-01

    During recent years there has been growing interest in the occurrence of long-tailed distributions, also known as heavy-tailed or fat-tailed distributions, which can exhibit power-law behaviour and often characterise physical systems that undergo very large fluctuations. In this paper we show that the interaction between two discrete Markov processes naturally generates a time-series characterised by such a distribution. This possibility is first demonstrated by numerical simulation and then confirmed by a mathematical analysis that enables the parameter range over which the power-law occurs to be quantified. The results are supported by comparison of numerical results with theoretical predictions and general conclusions are drawn regarding mechanisms that can cause this behaviour.

  6. Processing TOVS Polar Pathfinder data using the distributed batch controller

    Science.gov (United States)

    Duff, James; Salem, Kenneth M.; Schweiger, Axel; Livny, Miron

    1997-09-01

    The distributed batch controller (DBC) supports scientific batch data processing. Batch jobs are distributed by the DBC over a collection of computing resources. Since these resources may be widely scattered, the DBC is well suited for collaborative research efforts whose resources are not centrally located. The DBC provides its users with centralized monitoring and control of distributed batch jobs. Version 1 of the DBC is currently being used by the TOVS Polar Pathfinder project to generate Arctic atmospheric temperature and humidity profiles. Profile-generating jobs are distributed and executed by the DBC on workstation clusters located at several sites across the US. This paper describes the data processing requirements of the TOVS Polar Pathfinder project and how the DBC is being used to meet them. It also describes Version 2 of the DBC. DBC V2 is implemented in Java and uses a number of advanced Java features such as threads and remote method invocation. It incorporates a number of functional enhancements, including a flexible mechanism supporting interoperation of the DBC with a wider variety of execution resources and an improved user interface.

  7. Teaching image processing and pattern recognition with the Intel OpenCV library

    Science.gov (United States)

    Kozłowski, Adam; Królak, Aleksandra

    2009-06-01

    In this paper we present an approach to teaching image processing and pattern recognition with the use of the OpenCV library. Image processing, pattern recognition and computer vision are important branches of science and apply to tasks ranging from critical applications, such as medical diagnostics, to everyday uses in art and entertainment. It is therefore crucial to provide students of image processing and pattern recognition with the most up-to-date solutions available. In the Institute of Electronics at the Technical University of Lodz we facilitate the teaching process in this subject with the OpenCV library, which is an open-source set of classes, functions and procedures that can be used in programming efficient and innovative algorithms for various purposes. The topics of student projects completed with the help of the OpenCV library range from automatic correction of image quality parameters or creation of panoramic images from video, to pedestrian tracking in surveillance camera video sequences and head-movement-based mouse cursor control for the motorically impaired.
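    As a concrete illustration of the kind of student exercise mentioned above (automatic correction of image quality parameters), the following sketch uses OpenCV's Python bindings. The file name and the choice of histogram equalization on the luminance channel are assumptions for the example, not the specific method taught in the course.

```python
# Minimal sketch of an "automatic image quality correction" exercise using OpenCV's
# Python bindings (cv2). Histogram equalization on the luminance channel is one
# common classroom technique; the input file name is a placeholder.
import cv2

def auto_correct(path):
    img = cv2.imread(path)                          # load BGR image from disk
    if img is None:
        raise FileNotFoundError(path)
    ycrcb = cv2.cvtColor(img, cv2.COLOR_BGR2YCrCb)  # separate luminance from chroma
    y, cr, cb = cv2.split(ycrcb)
    y_eq = cv2.equalizeHist(y)                      # stretch the luminance histogram
    return cv2.cvtColor(cv2.merge((y_eq, cr, cb)), cv2.COLOR_YCrCb2BGR)

if __name__ == "__main__":
    out = auto_correct("sample.jpg")                # hypothetical input file
    cv2.imwrite("sample_corrected.jpg", out)
```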

  8. Opening the Learning Process: The Potential Role of Feature Film in Teaching Employment Relations

    Science.gov (United States)

    Lafferty, George

    2016-01-01

    This paper explores the potential of feature film to encourage more inclusive, participatory and open learning in the area of employment relations. Evaluations of student responses in a single postgraduate course over a five-year period revealed how feature film could encourage participatory learning processes in which students reexamined their…

  9. Process of Market Strategy Optimization Using Distributed Computing Systems

    Directory of Open Access Journals (Sweden)

    Nowicki Wojciech

    2015-12-01

    Full Text Available If market repeatability is assumed, it is possible, with some real probability, to deduce short-term market changes by making some calculations. An algorithm based on a logically and statistically reasonable scheme for making decisions about opening or closing a position on a market is called an automated strategy. Due to market volatility, all parameters change from time to time, so there is a need to optimize them constantly. This article describes a team organization process for researching market strategies. Individual team members are merged into small groups, according to their responsibilities. The team members perform data processing tasks through a cascade organization, providing solutions that speed up work related to the use of remote computing resources. They also work out how to store results in a suitable way, according to the type of task, and facilitate the publication of a large amount of results.

  10. A Bird's eye view of Matrix Distributed Processing

    OpenAIRE

    Di Pierro, Massimo

    2003-01-01

    We present Matrix Distributed Processing, a C++ library for fast development of efficient parallel algorithms. MDP is based on MPI and consists of a collection of C++ classes and functions such as lattice, site and field. Once an algorithm is written using these components the algorithm is automatically parallel and no explicit call to communication functions is required. MDP is particularly suitable for implementing parallel solvers for multi-dimensional differential equations and mesh-like ...
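    MDP itself is a C++/MPI library whose lattice, site and field classes hide the communication calls. The sketch below is not the MDP API; it is only an mpi4py illustration, with assumed toy parameters, of the underlying idea: each process owns a slice of a distributed field, and global quantities are obtained through collectives rather than explicit point-to-point messaging in the application code.

```python
# Not the MDP API (which is C++): a small mpi4py illustration of a distributed field.
# Run with e.g.:  mpiexec -n 4 python field_sum.py   (file name is hypothetical)
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N = 1024                                  # global 1-D lattice size (toy value)
local_n = N // size                       # assume size divides N for simplicity
sites = np.arange(rank * local_n, (rank + 1) * local_n)   # sites owned by this rank
field = np.sin(2 * np.pi * sites / N)     # a field defined on the local sites

local_sum = field.sum()
global_sum = comm.allreduce(local_sum, op=MPI.SUM)   # collective hides the messaging

if rank == 0:
    print("global sum of the field:", global_sum)
```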

  11. Impact of established skills in open surgery on the proficiency gain process for laparoscopic surgery.

    Science.gov (United States)

    Brown, Daniel C; Miskovic, Danilo; Tang, Benjie; Hanna, George B

    2010-06-01

    Laparoscopic training traditionally follows open surgical training. This study aimed to investigate the impact of experience in open surgery on the laparoscopic proficiency gain process. A survey form investigating the importance of open experience before the start of laparoscopic training was sent to surgical experts and trainees in the United Kingdom. A separate experimental study objectively assessed the effects of open experience on laparoscopic skill acquisition using a virtual reality simulator. In the study, 11 medical students with no prior surgical experience (group A) and 14 surgical trainees with open but no laparoscopic experience (group B) performed 250 simulated laparoscopic cholecystectomies. Psychomotor skills were evaluated by motion analysis and video-based global rating scores. Before the first and after the fifth and tenth operation, knowledge of laparoscopic techniques was assessed by a written test and by self-reported confidence levels indicated on a questionnaire. The 80 experts and 282 trainees who responded to the survey believed prior open experience aids confidence levels, knowledge, and skills acquisition. In the simulation study, no intergroup difference was found for any parameter after the first procedure. Group B scored significantly higher in the laparoscopic knowledge test before training began (42.7% vs. 64.3%; p = 0.002), but no significant difference was found after five operations. The two groups did not differ significantly in terms of confidence. Group B had a significantly shorter total operation time only at the first operation (2,305.6 s vs. 1,884.6 s; p = 0.037). No significant intergroup difference in path length, number of movements, or video-based global rating scores was observed. Prior open experience does not aid the laparoscopic learning process, as demonstrated in a simulated setting. Given the wealth of evidence demonstrating translation of virtual skills to the operating theater, we propose that the safe and

  12. Generation of open biomedical datasets through ontology-driven transformation and integration processes.

    Science.gov (United States)

    Carmen Legaz-García, María Del; Miñarro-Giménez, José Antonio; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2016-06-03

    Biomedical research usually requires combining large volumes of data from multiple heterogeneous sources, which makes the integrated exploitation of such data difficult. The Semantic Web paradigm offers a natural technological space for data integration and exploitation by generating content readable by machines. Linked Open Data is a Semantic Web initiative that promotes the publication and sharing of data in machine-readable semantic formats. We present an approach for the transformation and integration of heterogeneous biomedical data with the objective of generating open biomedical datasets in Semantic Web formats. The transformation of the data is based on the mappings between the entities of the data schema and the ontological infrastructure that provides the meaning of the content. Our approach permits different types of mappings and includes the possibility of defining complex transformation patterns. Once the mappings are defined, they can be automatically applied to datasets to generate logically consistent content, and the mappings can be reused in further transformation processes. The results of our research are (1) a common transformation and integration process for heterogeneous biomedical data; (2) the application of Linked Open Data principles to generate interoperable, open, biomedical datasets; and (3) a software tool, called SWIT, that implements the approach. In this paper we also describe how we have applied SWIT in different biomedical scenarios and some lessons learned. We have presented an approach that is able to generate open biomedical repositories in Semantic Web formats. SWIT applies the Linked Open Data principles in the generation of the datasets, allowing their content to be linked to external repositories and creating linked open datasets. SWIT datasets may contain data from multiple sources and schemas, thus becoming integrated datasets.

  13. A Process for Comparing Dynamics of Distributed Space Systems Simulations

    Science.gov (United States)

    Cures, Edwin Z.; Jackson, Albert A.; Morris, Jeffery C.

    2009-01-01

    The paper describes a process that was developed for comparing the primary orbital dynamics behavior between space systems distributed simulations. This process is used to characterize and understand the fundamental fidelities and compatibilities of the modeling of orbital dynamics between spacecraft simulations. This is required for high-latency distributed simulations such as NASA's Integrated Mission Simulation and must be understood when reporting results from simulation executions. This paper presents 10 principal comparison tests along with their rationale and examples of the results. The Integrated Mission Simulation (IMSim) (formerly known as the Distributed Space Exploration Simulation (DSES)) is a NASA research and development project focusing on the technologies and processes that are related to the collaborative simulation of complex space systems involved in the exploration of our solar system. Currently, the NASA centers that are actively participating in the IMSim project are the Ames Research Center, the Jet Propulsion Laboratory (JPL), the Johnson Space Center (JSC), the Kennedy Space Center, the Langley Research Center and the Marshall Space Flight Center. In concept, each center participating in IMSim has its own set of simulation models and environment(s). These simulation tools are used to build the various simulation products that are used for scientific investigation, engineering analysis, system design, training, planning, operations and more. Working individually, these production simulations provide important data to various NASA projects.

  14. An open-source distributed mesoscale hydrologic model (mHM)

    Science.gov (United States)

    Samaniego, Luis; Kumar, Rohini; Zink, Matthias; Thober, Stephan; Mai, Juliane; Cuntz, Matthias; Schäfer, David; Schrön, Martin; Musuuza, Jude; Prykhodko, Vladyslav; Dalmasso, Giovanni; Attinger, Sabine; Spieler, Diana; Rakovec, Oldrich; Craven, John; Langenberg, Ben

    2014-05-01

    The mesoscale hydrological model (mHM) is based on numerical approximations of dominant hydrological processes that have been tested in various hydrological models such as HBV and VIC. In general, mHM simulates the following processes: canopy interception, snow accumulation and melting, soil moisture dynamics (n horizons), infiltration and surface runoff, evapotranspiration, subsurface storage and discharge generation, deep percolation and baseflow, and discharge attenuation and flood routing. The main characteristic of mHM is the treatment of the sub-grid variability of input variables and model parameters, which clearly distinguishes this model from existing precipitation-runoff models or land surface models. It uses a Multiscale Parameter Regionalization (MPR) to account for the sub-grid variability and to avoid continuous re-calibration. Effective model parameters are location and time dependent (e.g., soil porosity). They are estimated through upscaling operators that link sub-grid morphologic information (e.g., soil texture) with global transfer-function parameters, which, in turn, are found through multi-basin optimization. Global parameters estimated with the MPR technique are quasi-scale invariant and guarantee flux matching across scales. mHM is an open source code, written in Fortran 2003 (standard), fully modular, with high computational efficiency, and parallelized. It is portable to multiple platforms (Linux, OS X, Windows) and includes a number of algorithms for sensitivity analysis, analysis of parameter uncertainty (MCMC), and optimization (DDS, SA, SCE). All simulated state variables and outputs can be stored as netCDF files for further analysis and visualization. mHM has been evaluated in all major river basins in Germany and over 80 US and 250 European river basins. The model efficiency (NSE) during validation at proxy locations is on average greater than 0.6. In recent years, mHM has been used for a number of hydrologic applications such as

  15. Open End Correction for Flanged Circular Tube from the Diffusion Process

    CERN Document Server

    Ogawa, Naohisa

    2013-01-01

    In the physics education of waves and resonance phenomena at high school, we usually consider the sound waves in a tube with an open or closed end as a simple model. However, it is well known that we need an end correction $\Delta L$ for an open end. The value of the correction for a flanged circular tube was first given by L. Rayleigh and experimentally checked by several authors. In this paper we show a different method to obtain the end correction for a circular tube by using the diffusion process.

  16. Sustainability and Governance in Developing Open Source Projects as Processes of In-Becoming

    Directory of Open Access Journals (Sweden)

    Daniel Curto-Millet

    2013-01-01

    Full Text Available Sustainability is often thought of as a binary state: an open source project is either sustainable or not. In reality, sustainability is much more complex. What makes this project more sustainable than that one? Why should it be assumed in the first place that sustainability is a prolonged state of an ingraced project? The threads are pulled from their yarns in many directions. This article attempts to reconceptualize some assumed notions of the processes involved in developing open source software. It takes the stance in favour of studying the fluctuant nature of open source and the associated artefacts, not as well-defined objects, but as commons that are continually built upon, evolved, and modified; sometimes in unexpected ways. Further, the governance of these commons is an ongoing process, tightly linked with the way in which these commons are allowed to further develop. This perspective of "in-becoming" is useful in understanding the efforts and processes that need to be provided to sustainably govern the development of open source projects and the advantages for managing requirements derived therein.

  17. Release Process on Quality Improvement in Open Source Software Project Management

    Directory of Open Access Journals (Sweden)

    S. Chandra Kumar Mangalam

    2012-01-01

    Full Text Available Problem statement: The software industry has changed and developed as a consequence of the impact of Open Source Software (OSS) since the 1990s. Over a period of time, OSS has evolved in an integrated manner, and most of the participants in OSS activity are volunteers. Approach: This coordinated form of development has produced a considerable quantity of software, and the development method has often been viewed as unorganized and unstructured. Few existing studies deal with the Open Source Software phenomenon from a quality perception point of view or examine where enhancements are possible in the development process. Results: The release process in OSS plays a key role in most OSS projects. As this process is related to the evolution of quality software from the community of OSS developers, this research attempts to explore the process practices employed by OSS developers and examines the problems associated with the development process. The scope of the study is mainly confined to process management in OSS. "Prototype development and iterative development process" approaches were adopted as the methodology. Conclusion/Recommendations: The major finding and conclusion drawn is the 'lack of coordination among developers' who are geographically isolated. Hence, the study suggests the need for coordination among developers to align their development process towards achieving the goal of the software release process.

  18. Medical Image Dynamic Collaborative Processing on the Distributed Environment

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    A new trend in the development of medical image processing systems is to enhance the sharing of medical resources and the collaborative processing of medical specialists. This paper presents an architecture for medical image dynamic collaborative processing in a distributed environment by combining JAVA, CORBA (Common Object Request Broker Architecture) and the MAS (Multi-Agent System) collaborative mechanism. The architecture allows medical specialists or applications to share records and communicate with each other on the web, overcoming the shortcomings of the traditional approach using the Common Gateway Interface (CGI) and client/server architecture, and can support collaboration among remote heterogeneous systems. The new approach improves the collaborative processing of medical data and applications and is able to enhance interoperation among heterogeneous systems. Research on the system will help the collaboration and cooperation among medical application systems distributed on the web, thus supplying high-quality medical services such as diagnosis and therapy to practicing specialists regardless of their actual geographic location.

  19. BioSig: the free and open source software library for biomedical signal processing.

    Science.gov (United States)

    Vidaurre, Carmen; Sander, Tilmann H; Schlögl, Alois

    2011-01-01

    BioSig is an open source software library for biomedical signal processing. The aim of the BioSig project is to foster research in biomedical signal processing by providing free and open source software tools for many different application areas. Some of the areas where BioSig can be employed are neuroinformatics, brain-computer interfaces, neurophysiology, psychology, cardiovascular systems, and sleep research. Moreover, the analysis of biosignals such as the electroencephalogram (EEG), electrocorticogram (ECoG), electrocardiogram (ECG), electrooculogram (EOG), electromyogram (EMG), or respiration signals is a very relevant element of the BioSig project. Specifically, BioSig provides solutions for data acquisition, artifact processing, quality control, feature extraction, classification, modeling, and data visualization, to name a few. In this paper, we highlight several methods to help students and researchers to work more efficiently with biomedical signals.

  20. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    Science.gov (United States)

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets.

  1. Risk assessment of occupational groups working in open pit mining: Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Yaşar Kasap

    2017-01-01

    Full Text Available In open pit mining, it is possible to prevent industrial accidents and their consequences, such as deaths, physical disabilities and financial loss, by implementing risk analyses in advance. If the probabilities of different occupational groups encountering various hazards are determined, workers' risk of having industrial accidents and contracting occupational illnesses can be controlled. In this sense, the aim of this study was to assess the industrial accidents which occurred during open pit coal production in the Turkish Coal Enterprises (TCE) Garp Lignite unit between 2005 and 2010 and to analyze the risks using the Analytic Hierarchy Process (AHP). The analyses conducted with AHP revealed that the greatest risk in open pit mining is landslides, the most at-risk occupational group is unskilled labourers, and the most common hazards are caused by landslides and transportation/hand tools/falling.

  2. Land processes distributed active archive center product lifecycle plan

    Science.gov (United States)

    Daucsavage, John C.; Bennett, Stacie D.

    2014-01-01

    The U.S. Geological Survey (USGS) Earth Resources Observation and Science (EROS) Center and the National Aeronautics and Space Administration (NASA) Earth Science Data System Program worked together to establish, develop, and operate the Land Processes (LP) Distributed Active Archive Center (DAAC) to provide stewardship for NASA’s land processes science data. These data are critical science assets that serve the land processes science community with potential value beyond any immediate research use, and therefore need to be accounted for and properly managed throughout their lifecycle. A fundamental LP DAAC objective is to enable permanent preservation of these data and information products. The LP DAAC accomplishes this by bridging data producers and permanent archival resources while providing intermediate archive services for data and information products.

  3. Hadoop distributed batch processing for Gaia: a success story

    Science.gov (United States)

    Riello, Marco

    2015-12-01

    The DPAC Cambridge Data Processing Centre (DPCI) is responsible for the photometric calibration of the Gaia data, including the low-resolution spectra. The large data volume produced by Gaia (~26 billion transits/year), the complexity of its data stream and the self-calibrating approach pose unique challenges for the scalability, reliability and robustness of both the software pipelines and the operations infrastructure. DPCI was the first centre in DPAC to realise the potential of Hadoop and Map/Reduce and to adopt them as the core technologies for its infrastructure. This has proven a winning choice, giving DPCI processing throughput and reliability unmatched within DPAC, to the point that other DPCs have started following in our footsteps. In this talk we will present the software infrastructure developed to build the distributed and scalable batch data processing system that is currently used in production at DPCI and the excellent performance results of the system.
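    The DPCI pipeline itself runs on Hadoop in Java; purely to make the Map/Reduce programming model concrete, the toy below shows the map-shuffle-reduce pattern in plain Python. The record fields (source_id, flux) and the per-source mean are invented for the example and are not Gaia calibration quantities.

```python
# Toy illustration of the Map/Reduce style: map emits key/value pairs, the shuffle
# groups them by key, and reduce produces one aggregate per key.
from collections import defaultdict

transits = [
    {"source_id": 1, "flux": 10.2},
    {"source_id": 2, "flux": 5.1},
    {"source_id": 1, "flux": 9.8},
    {"source_id": 2, "flux": 4.9},
]

# map: emit (key, value) pairs, one per transit
mapped = ((t["source_id"], t["flux"]) for t in transits)

# shuffle: group values by key
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# reduce: one aggregate per key, here a mean flux per source
mean_flux = {key: sum(vals) / len(vals) for key, vals in groups.items()}
print(mean_flux)   # {1: 10.0, 2: 5.0}
```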

  4. Distributed query processing in flash-based sensor networks

    Institute of Scientific and Technical Information of China (English)

    Jianliang XU; Xueyan TANG; Wang-Chien LEE

    2008-01-01

    Wireless sensor networks are used in a large array of applications to capture, collect, and analyze physical environmental data. Many existing sensor systems instruct sensor nodes to report their measurements to central repositories outside the network, which is expensive in energy cost. Recent technological advances in flash memory have given rise to the development of storage-centric sensor networks, where sensor nodes are equipped with high-capacity flash memory storage such that sensor data can be stored and managed inside the network to reduce expensive communication. This novel architecture calls for new data management techniques to fully exploit distributed in-network data storage. This paper describes some of our research on distributed query processing in such flash-based sensor networks. Of particular interest are the issues that arise in the design of storage management and indexing structures combining sensor system workload and the read/write/erase characteristics of flash memory.

  5. OpenTopography: Enabling Online Access to High-Resolution Lidar Topography Data and Processing Tools

    Science.gov (United States)

    Crosby, Christopher; Nandigam, Viswanath; Baru, Chaitan; Arrowsmith, J. Ramon

    2013-04-01

    High-resolution topography data acquired with lidar (light detection and ranging) technology are revolutionizing the way we study the Earth's surface and overlying vegetation. These data, collected from airborne, tripod, or mobile-mounted scanners have emerged as a fundamental tool for research on topics ranging from earthquake hazards to hillslope processes. Lidar data provide a digital representation of the earth's surface at a resolution sufficient to appropriately capture the processes that contribute to landscape evolution. The U.S. National Science Foundation-funded OpenTopography Facility (http://www.opentopography.org) is a web-based system designed to democratize access to earth science-oriented lidar topography data. OpenTopography provides free, online access to lidar data in a number of forms, including the raw point cloud and associated geospatial-processing tools for customized analysis. The point cloud data are co-located with on-demand processing tools to generate digital elevation models, and derived products and visualizations which allow users to quickly access data in a format appropriate for their scientific application. The OpenTopography system is built using a service-oriented architecture (SOA) that leverages cyberinfrastructure resources at the San Diego Supercomputer Center at the University of California San Diego to allow users, regardless of expertise level, to access these massive lidar datasets and derived products for use in research and teaching. OpenTopography hosts over 500 billion lidar returns covering 85,000 km2. These data are all in the public domain and are provided by a variety of partners under joint agreements and memoranda of understanding with OpenTopography. Partners include national facilities such as the NSF-funded National Center for Airborne Lidar Mapping (NCALM), as well as non-governmental organizations and local, state, and federal agencies. OpenTopography has become a hub for high-resolution topography
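    One of the on-demand products mentioned above is a digital elevation model gridded from the point cloud. The sketch below is not OpenTopography's implementation; it is a generic NumPy illustration of binned averaging of synthetic lidar returns into a mean-elevation grid.

```python
# Generic sketch: average lidar point elevations into square cells to form a DEM.
import numpy as np

def points_to_dem(x, y, z, cell_size=1.0):
    """Average point elevations into square cells of the given size (map units)."""
    xi = ((x - x.min()) / cell_size).astype(int)
    yi = ((y - y.min()) / cell_size).astype(int)
    ncols, nrows = xi.max() + 1, yi.max() + 1

    z_sum = np.zeros((nrows, ncols))
    count = np.zeros((nrows, ncols))
    np.add.at(z_sum, (yi, xi), z)        # accumulate elevations per cell
    np.add.at(count, (yi, xi), 1)        # count returns per cell

    with np.errstate(invalid="ignore"):
        dem = z_sum / count              # NaN where a cell received no returns
    return dem

# tiny synthetic "point cloud" just to show the call
rng = np.random.default_rng(0)
x, y = rng.uniform(0, 10, 1000), rng.uniform(0, 10, 1000)
z = 100 + 0.5 * x + rng.normal(0, 0.1, 1000)
print(points_to_dem(x, y, z, cell_size=2.0).shape)
```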

  6. Distributed Temperature Measurement in a Self-Burning Coal Waste Pile through a GIS Open Source Desktop Application

    Directory of Open Access Journals (Sweden)

    Lia Duarte

    2017-03-01

    Full Text Available Geographical Information Systems (GIS) are often used to assess and monitor the environmental impacts caused by mining activities. The aim of this work was to develop a new application to produce dynamic maps for monitoring the temperature variations in a self-burning coal waste pile, under a GIS open source environment, GIS-ECOAL (freely available). The performance of the application was evaluated with distributed temperature measurements gathered in the S. Pedro da Cova (Portugal) coal waste pile. In order to obtain the temperature data, an optical fiber cable was laid over the affected area of the pile, with 42 location stakes acting as precisely located control points for the temperature measurements. A monthly data set from July (15-minute measurement interval) was fed into the application, and a video composed of several layouts with temperature measurements was created, allowing two main areas with higher temperatures to be recognized. The field observations also allow the identification of these zones; however, the identification of an area with higher temperatures at the top of the studied area was only possible through the visualization of the images created by this application. The generated videos make possible the dynamic and continuous visualization of the combustion process in the monitored area.
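    Each temperature layout in the video essentially maps scattered stake measurements onto the pile surface. The sketch below is not the GIS-ECOAL code; it is a generic SciPy interpolation of invented stake coordinates and temperatures onto a regular grid, just to illustrate that step.

```python
# Generic sketch: interpolate scattered stake temperatures onto a regular grid.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(1)
stakes = rng.uniform(0, 100, size=(42, 2))        # 42 stake positions (metres, fake)
temps = 20 + 60 * rng.random(42)                  # measured temperatures (degC, fake)

# regular grid covering the monitored area
gx, gy = np.meshgrid(np.linspace(0, 100, 200), np.linspace(0, 100, 200))
grid_t = griddata(stakes, temps, (gx, gy), method="linear")

print("grid shape:", grid_t.shape, "max T:", np.nanmax(grid_t))
```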

  7. On the spectral distribution of the free Jacobi process

    CERN Document Server

    Demni, Nizar; Hmidi, Taoufik

    2012-01-01

    In this paper, we are interested in the free Jacobi process starting at the unit of the compressed probability space where it takes values and associated with the parameter values $\lambda=1, \theta=1/2$. Firstly, we derive a time-dependent recurrence equation for the moments of the process (valid for any starting point and all parameter values). Secondly, we transform this equation into a nonlinear partial differential equation for the moment generating function that we solve when $\lambda = 1, \theta = 1/2$. The obtained solution, together with some tricky computations, leads to an explicit expression for the moments which shows that the free Jacobi process is distributed at any time $t$ as $(1/4)(2+Y_{2t}+Y_{2t}^{\star})$ where $Y$ is a free unitary Brownian motion. This expression is recovered by relying on enumeration techniques after proving that if $a$ is a symmetric Bernoulli random variable which is free from $\{Y, Y^{\star}\}$, then the distributions of $Y_{2t}$ and of $aY_taY_t^{\star}$ coincide.

  8. Typical features of pedestrian spatial distribution in the inflow process

    Science.gov (United States)

    Liu, Xiaodong; Song, Weiguo; Fu, Libi; Lv, Wei; Fang, Zhiming

    2016-04-01

    Pedestrian inflow is frequently observed in various pedestrian facilities. In this work, we first propose four hypotheses concerning the inflow process. Then, we perform a series of experiments to test the hypotheses. With several analytical methods, e.g., the proxemics theory and the Voronoi diagram method, the features of pedestrian inflow are analyzed in detail. Results demonstrate that the distribution of pedestrians in the room is not uniform. Boundaries are attractive to these pedestrians. The impact of two factors on the inflow is analyzed, i.e., the movement rule and the first-out reward. It is found that pedestrians can enter the room more effectively under the random rule or with two queues. Under some hurried circumstances, pedestrians may prefer to gather around the door, and the spatial distribution is not uniform, leading to imbalanced use of the room. Practical suggestions are given for pedestrians to improve travel efficiency in the inflow process. This experimental study is meaningful for revealing some fundamental phenomena of the inflow process, which can provide a realistic basis for building theoretical and mathematical-physical models.

  9. Chemical and aerosol processes in the transition from closed to open cells during VOCALS-REx

    Directory of Open Access Journals (Sweden)

    J. Kazil

    2011-02-01

    Full Text Available Chemical and aerosol processes in the transition from closed- to open-cell circulation in the remote, cloudy marine boundary layer are explored. It has previously been shown that precipitation can initiate a transition from the closed- to the open-cellular state, but that the boundary layer cannot maintain this open-cell state without a resupply of cloud condensation nuclei (CCN. Potential sources include wind-driven production of sea salt particles from the ocean, nucleation from the gas phase, and entrainment from the free troposphere. In order to investigate aerosol sources in the marine boundary layer and their role in supplying new particles, we have coupled in detail chemical, aerosol, and cloud processes in the WRF/Chem model, and added state-of-the-art representations of sea salt emissions and aerosol nucleation. We introduce the new features of the model and conduct simulations of the marine boundary layer in the transition from a closed- to an open-cell state. Results are compared with observations in the Southeast Pacific boundary layer during the VAMOS Ocean-Cloud-Atmosphere-Land Study Regional Experiment (VOCALS-REx. The transition from the closed- to the open-cell state generates conditions that are conducive to nucleation by forming a cloud-scavenged, ultra-clean layer below the inversion base. Open cell wall updrafts loft dimethyl sulfide from the ocean surface into the ultra-clean layer, where it is oxidized during daytime to SO2 and subsequently to H2SO4. Low H2SO4 condensation sink values in the ultra-clean layer allow H2SO4 to rise to concentrations at which aerosol nucleation proceeds efficiently. The existence of the ultra-clean layer is confirmed by observations. We find that the observed DMS flux from the ocean in the VOCALS-REx region can support a nucleation source of aerosol in open cells that exceeds sea salt emissions in terms of the number

  10. Diversity and distribution of Listeria monocytogenes in meat processing plants.

    Science.gov (United States)

    Martín, Belén; Perich, Adriana; Gómez, Diego; Yangüela, Javier; Rodríguez, Alicia; Garriga, Margarita; Aymerich, Teresa

    2014-12-01

    Listeria monocytogenes is a major concern for the meat processing industry because many listeriosis outbreaks have been linked to meat product consumption. The aim of this study was to elucidate L. monocytogenes diversity and distribution across different Spanish meat processing plants. L. monocytogenes isolates (N = 106) collected from food contact surfaces of meat processing plants and meat products were serotyped and then characterised by multilocus sequence typing (MLST). The isolates were serotyped as 1/2a (36.8%), 1/2c (34%), 1/2b (17.9%) and 4b (11.3%). MLST identified ST9 as the most predominant allelic profile (33% of isolates) followed by ST121 (16%), both of which were detected from several processing plants and meat products sampled in different years, suggesting that those STs are highly adapted to the meat processing environment. Food contact surfaces during processing were established as an important source of L. monocytogenes in meat products because the same STs were obtained in isolates recovered from surfaces and products. L. monocytogenes was recovered after cleaning and disinfection procedures in two processing plants, highlighting the importance of thorough cleaning and disinfection procedures. Epidemic clone (EC) marker ECI was identified in 8.5%, ECIII was identified in 2.8%, and ECV was identified in 7.5% of the 106 isolates. Furthermore, a selection of presumably unrelated ST9 isolates was analysed by multi-virulence-locus sequence typing (MVLST). Most ST9 isolates had the same virulence type (VT11), confirming the clonal origin of ST9 isolates; however, one ST9 isolate was assigned to a new VT (VT95). Consequently, MLST is a reliable tool for identification of contamination routes and niches in processing plants, and MVLST clearly differentiates EC strains, which both contribute to the improvement of L. monocytogenes control programs in the meat industry. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Parallel distributed processing: Implications for cognition and development. Technical report

    Energy Technology Data Exchange (ETDEWEB)

    McClelland, J.L.

    1988-07-11

    This paper provides a brief overview of the connectionist or parallel distributed processing framework for modeling cognitive processes, and considers the application of the connectionist framework to problems of cognitive development. Several aspects of cognitive development might result from the process of learning as it occurs in multi-layer networks. This learning process has the characteristic that it reduces the discrepancy between expected and observed events. As it does this, representations develop on hidden units which dramatically change both the way in which the network represents the environment from which it learns and the expectations that the network generates about environmental events. The learning process exhibits relatively abrupt transitions corresponding to stage shifts in cognitive development. These points are illustrated using a network that learns to anticipate which side of a balance beam will go down, based on the number of weights on each side of the fulcrum and their distance from the fulcrum on each side of the beam. The network is trained in an environment in which weight more frequently governs which side will go down. It recapitulates the stages of development seen in children, as well as the stage transitions, as it learns to represent weight and distance information.
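    A minimal sketch of the kind of network described, assuming a simplified architecture (one small hidden layer) and plain stochastic backpropagation rather than McClelland's exact model: torque determines which side of the beam goes down, and the network learns to predict it from the weights and distances on each side.

```python
# Minimal sketch (not McClelland's exact model) of a multi-layer network trained by
# backpropagation on balance-beam examples. Torque decides the target.
import numpy as np

rng = np.random.default_rng(0)

def make_example():
    wl, wr = rng.integers(1, 6, 2)                   # weights on left/right pegs
    dl, dr = rng.integers(1, 6, 2)                   # distances from the fulcrum
    target = 1.0 if wl * dl > wr * dr else 0.0       # 1 = left side goes down
    return np.array([wl, wr, dl, dr], float) / 5.0, target

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# one hidden layer of 4 units, one output unit
W1 = rng.normal(0, 0.5, (4, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 0.5, 4);      b2 = 0.0
lr = 0.5

for step in range(20000):
    x, t = make_example()
    h = sigmoid(W1 @ x + b1)                 # hidden representation
    y = sigmoid(W2 @ h + b2)                 # predicted P(left goes down)

    # backpropagate the prediction error
    dy = (y - t) * y * (1 - y)
    dh = dy * W2 * h * (1 - h)
    W2 -= lr * dy * h;  b2 -= lr * dy
    W1 -= lr * np.outer(dh, x);  b1 -= lr * dh

# quick check on fresh examples
correct = sum((sigmoid(W2 @ sigmoid(W1 @ x + b1) + b2) > 0.5) == (t > 0.5)
              for x, t in (make_example() for _ in range(500)))
print("accuracy:", correct / 500)
```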

  12. LIMITING POSSIBILITIES OF RESOURCE EXCHANGE PROCESS IN COMPLEX OPEN MICROECONOMIC SYSTEM

    Directory of Open Access Journals (Sweden)

    Serghey A. Amelkin

    2004-06-01

    Full Text Available A problem concerning the extreme performance of a microeconomic system with several firms is considered. Each firm aspires to increase its profit. Flows of the good between the firms determine the structure of the system: a sequential structure corresponds to intermediaries (dealers) operating in the market, while a parallel structure corresponds to competition in the market. The system at issue is an open economic system because of the presence of external flows from sources described by a distribution of the value of the good. The problem is solved for the basic structures: the maximal profit and the corresponding prices are found for each firm.

  13. Solid Waste Processing Center Primary Opening Cells Systems, Equipment and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, Sharon A.; Baker, Carl P.; Mullen, O Dennis; Valdez, Patrick LJ

    2006-04-17

    This document addresses the remote systems and design integration aspects of the development of the Solid Waste Processing Center (SWPC), a facility to remotely open, sort, size reduce, and repackage mixed low-level waste (MLLW) and transuranic (TRU)/TRU mixed waste that is either contact-handled (CH) waste in large containers or remote-handled (RH) waste in various-sized packages.

  14. Distributed Processing of Projections of Large Datasets: A Preliminary Study

    Science.gov (United States)

    Maddox, Brian G.

    2004-01-01

    Modern information needs have resulted in very large amounts of data being used in geographic information systems. Problems arise when trying to project these data in a reasonable amount of time and accuracy, however. Current single-threaded methods can suffer from two problems: fast projection with poor accuracy, or accurate projection with long processing time. A possible solution may be to combine accurate interpolation methods and distributed processing algorithms to quickly and accurately convert digital geospatial data between coordinate systems. Modern technology has made it possible to construct systems, such as Beowulf clusters, for a low cost and provide access to supercomputer-class technology. Combining these techniques may result in the ability to use large amounts of geographic data in time-critical situations.
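    A small illustration of the idea, not the USGS implementation: coordinate chunks are reprojected in parallel worker processes, with pyproj providing the transformation. The CRS codes, chunk count and synthetic coordinates are arbitrary example choices.

```python
# Illustrative only: split coordinates into chunks and reproject them in parallel.
from multiprocessing import Pool
import numpy as np
from pyproj import Transformer

def reproject_chunk(chunk):
    # create the transformer inside the worker so it does not have to be pickled
    t = Transformer.from_crs("EPSG:4326", "EPSG:3857", always_xy=True)
    lon, lat = chunk[:, 0], chunk[:, 1]
    x, y = t.transform(lon, lat)
    return np.column_stack([x, y])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    coords = np.column_stack([rng.uniform(-120, -70, 1_000_000),   # longitudes
                              rng.uniform(25, 50, 1_000_000)])     # latitudes
    chunks = np.array_split(coords, 16)
    with Pool(processes=4) as pool:
        projected = np.vstack(pool.map(reproject_chunk, chunks))
    print(projected.shape)
```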

  15. Processes determining the marine alkalinity and carbonate saturation distributions

    Directory of Open Access Journals (Sweden)

    B. R. Carter

    2014-07-01

    Full Text Available We introduce a composite tracer, Alk*, that has a global distribution primarily determined by CaCO3 precipitation and dissolution. Alk* also highlights riverine alkalinity plumes that are due to dissolved calcium carbonate from land. We estimate the Arctic receives approximately twice the riverine alkalinity per unit area as the Atlantic, and 8 times that of the other oceans. Riverine inputs broadly elevate Alk* in the Arctic surface and particularly near river mouths. Strong net carbonate precipitation lowers basin mean Indian and Atlantic Alk*, while upwelling of dissolved CaCO3 rich deep waters elevates Northern Pacific and Southern Ocean Alk*. We use the Alk* distribution to estimate the carbonate saturation variability resulting from CaCO3 cycling and other processes. We show regional variations in surface carbonate saturation are due to temperature changes driving CO2 fluxes and, to a lesser extent, freshwater cycling. Calcium carbonate cycling plays a tertiary role. Monitoring the Alk* distribution would allow us to isolate the impact of acidification on biological calcification and remineralization.

  16. MyChEMBL: A Virtual Platform for Distributing Cheminformatics Tools and Open Data

    National Research Council Canada - National Science Library

    Mark Davies; Michal Nowotka; George Papadatos; Francis Atkinson; Gerard J P VanWesten; Nathan Dedman; Rodrigo Ochoa; John P Overington

    2014-01-01

      MyChEMBL is an open virtual platform which provides a free, secure, standardised and easy to use chemoinformatics environment for bioactivity data mining, machine learning, application development...

  17. Breaking-bud pollination: a new pollination process in partially opened flowers by small bees.

    Science.gov (United States)

    Yamaji, Futa; Ohsawa, Takeshi A

    2015-09-01

    Plant-pollinator interactions have usually been researched in flowers that have fully opened. However, some pollinators can visit flowers before full opening and contribute to fruit and seed sets. In this paper, we researched the pollination biology of flowers just starting to open in four field experiments. We observed the insect visitors to Lycoris sanguinea var. sanguinea for 3 years at five sites. These observations revealed that only small bees, Lasioglossum japonicum, often entered through tiny spaces between the tepals of 'breaking buds' (i.e. partially opened flowers) and collected pollen. We hypothesized that they can pollinate this species at the breaking-bud stage, when the stigma is located near the anthers. To measure the pollination effect of small bees at the breaking-bud stage, we bagged several breaking buds after small bees had visited them and examined whether these buds were pollinated. In bagging experiments, 30% of the breaking buds set fruit and seeds. Fruit-set ratios of the breaking buds did not differ significantly from those of the fully opened flowers, which had been visited by several insect species. We also counted the pollen grain numbers on the body of L. japonicum and on the anthers of randomly-selected and manipulated flowers. These experiments revealed that all of the captured bees had some pollen of target plants and that L. japonicum collected most of the pollen grains at the breaking-bud stage. Our results showed that the new pollination process, breaking-bud pollination, happened in breaking buds by L. japonicum, although there is no evidence to reveal that this is the most effective pollination method for L. sanguinea var. sanguinea. In principle, this new pollination process can occur in other flowering plants and our results are a major contribution to studies of plant-pollinator interactions.

  18. OpenVPX Bus and Its Application in Radar Information Processing

    Institute of Scientific and Technical Information of China (English)

    池凌鸿; 史鸿声

    2013-01-01

    As a state-of-the-art general-purpose bus standard, the open versatile protocol switch (OpenVPX) bus offers high bandwidth, good real-time performance, flexible topology, strong generality, and resistance to harsh environments. It is becoming the development direction of a new generation of military integrated information processing systems. The technical features of the OpenVPX bus are described in detail and, in combination with a digital array radar application, a design solution for a radar integrated information processing system based on the OpenVPX standard is proposed.

  19. Application Characterization at Scale: Lessons learned from developing a distributed Open Community Runtime system for High Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Landwehr, Joshua B.; Suetterlein, Joshua D.; Marquez, Andres; Manzano Franco, Joseph B.; Gao, Guang R.

    2016-05-16

    Since 2012, the U.S. Department of Energy's X-Stack program has been developing solutions including runtime systems, programming models, languages, compilers, and tools for Exascale system software to address crucial performance and power requirements. Fine-grain programming models and runtime systems show great potential to efficiently utilize the underlying hardware, and are therefore essential to many X-Stack efforts. An abundance of small tasks can better utilize the vast parallelism available on current and future machines. Moreover, finer tasks can recover faster and adapt better, due to a decrease in state and control. Nevertheless, current applications have been written to exploit older paradigms (such as Communicating Sequential Processes and Bulk Synchronous Parallel processing). To fully utilize the advantages of these new systems, applications need to be adapted to the new paradigms. As part of the applications' porting process, in-depth characterization studies, focused on both application characteristics and runtime features, need to take place to fully understand the application performance bottlenecks and how to resolve them. This paper presents a characterization study of a novel high-performance runtime system, called the Open Community Runtime, using key HPC kernels as its vehicle. This study has the following contributions: one of the first high-performance, fine-grain, distributed memory runtime systems implementing the OCR standard (version 0.99a); and a characterization study of key HPC kernels in terms of runtime primitives running in both intra- and inter-node environments. Running on a general-purpose cluster, we have found up to a 1635x relative speed-up for a parallel tiled Cholesky kernel on 128 nodes with 16 cores each and a 1864x relative speed-up for a parallel tiled Smith-Waterman kernel on 128 nodes with 30 cores.

  20. Adaptive Dynamic Process Scheduling on Distributed Memory Parallel Computers

    Directory of Open Access Journals (Sweden)

    Wei Shu

    1994-01-01

    Full Text Available One of the challenges in programming distributed memory parallel machines is deciding how to allocate work to processors. This problem is particularly important for computations with unpredictable dynamic behaviors or irregular structures. We present a scheme for dynamic scheduling of medium-grained processes that is useful in this context. The adaptive contracting within neighborhood (ACWN) is a dynamic, distributed, load-dependent, and scalable scheme. It deals with dynamic and unpredictable creation of processes and adapts to different systems. The scheme is described and contrasted with two other schemes that have been proposed in this context, namely the randomized allocation and the gradient model. The performance of the three schemes on an Intel iPSC/2 hypercube is presented and analyzed. The experimental results show that even though the ACWN algorithm incurs somewhat larger overhead than the randomized allocation, it achieves better performance in most cases due to its adaptiveness. Its feature of quickly spreading the work helps it outperform the gradient model in performance and scalability.

  1. MULTIPROCESSOR AND DISTRIBUTED PROCESSING BIBLIOGRAPHIC DATA BASE SOFTWARE SYSTEM

    Science.gov (United States)

    Miya, E. N.

    1994-01-01

    Multiprocessors and distributed processing are undergoing increased scientific scrutiny for many reasons. It is more and more difficult to keep track of the existing research in these fields. This package consists of a large machine-readable bibliographic data base which, in addition to the usual keyword searches, can be used for producing citations, indexes, and cross-references. The data base is compiled from smaller existing multiprocessing bibliographies, and tables of contents from journals and significant conferences. There are approximately 4,000 entries covering topics such as parallel and vector processing, networks, supercomputers, fault-tolerant computers, and cellular automata. Each entry is represented by 21 fields including keywords, author, referencing book or journal title, volume and page number, and date and city of publication. The data base contains UNIX 'refer' formatted ASCII data and can be implemented on any computer running under the UNIX operating system. The data base requires approximately one megabyte of secondary storage. The documentation for this program is included with the distribution tape, although it can be purchased for the price below. This bibliography was compiled in 1985 and updated in 1988.
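    The data base is stored in the UNIX 'refer' tagged format, in which each field of an entry is a line starting with a percent code (%A author, %T title, and so on). The toy parser and sample entry below are illustrative only; they are not taken from the distributed data base.

```python
# Toy sketch of reading a UNIX 'refer'-style record. The sample entry is invented.
sample = """\
%A J. Doe
%A R. Roe
%T A Survey of Hypercube Interconnection Networks
%J Journal of Parallel Computing
%V 7
%P 101-120
%D 1986
%K multiprocessors, networks
"""

def parse_refer(text):
    record = {}
    for line in text.splitlines():
        if not line.startswith("%"):
            continue
        tag, _, value = line.partition(" ")
        record.setdefault(tag[1:], []).append(value.strip())   # repeated tags allowed
    return record

print(parse_refer(sample)["A"])   # ['J. Doe', 'R. Roe']
```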

  2. Generalized binomial multiplicative cascade processes and asymmetrical multifractal distributions

    Science.gov (United States)

    Cheng, Q.

    2014-04-01

    The concepts and models of multifractals have been employed in various fields in the geosciences to characterize singular fields caused by nonlinear geoprocesses. Several indices involved in multifractal models, i.e., asymmetry, multifractality, and range of singularity, are commonly used to characterize nonlinear properties of multifractal fields. An understanding of how these indices are related to the processes involved in the generation of multifractal fields is essential for multifractal modeling. In this paper, a five-parameter binomial multiplicative cascade model is proposed based on the anisotropic partition processes. Each partition divides the unit set (1-D length or 2-D area) into h equal subsets (segments or subareas) and m1 of them receive d1 (> 0) and m2 receive d2 (> 0) proportion of the mass in the previous subset, respectively, where m1+m2 ≤ h. The model is demonstrated via several examples published in the literature with asymmetrical fractal dimension spectra. This model demonstrates the various properties of asymmetrical multifractal distributions and multifractal indices with explicit functions, thus providing insight into and an understanding of the properties of asymmetrical binomial multifractal distributions.
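    A sketch of the 1-D version of the partition rule described above, with arbitrary example parameters: at each level every segment is split into h equal subsegments, m1 of them receive a fraction d1 of the parent mass, m2 receive d2, and the remainder receive none. The fixed ordering of the receiving subsegments is a simplification for illustration.

```python
# Sketch of the 1-D binomial multiplicative cascade described above.
import numpy as np

def binomial_cascade(h=3, m1=1, m2=1, d1=0.7, d2=0.3, levels=10):
    assert m1 + m2 <= h
    measure = np.array([1.0])                       # mass on the unit interval
    for _ in range(levels):
        children = []
        for mass in measure:
            parts = [d1] * m1 + [d2] * m2 + [0.0] * (h - m1 - m2)
            children.extend(mass * p for p in parts)
        measure = np.array(children)
    return measure                                  # h**levels cells

mu = binomial_cascade()
print(len(mu), mu.max(), mu.sum())   # total mass stays 1 when m1*d1 + m2*d2 = 1
```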

  3. Specificity and completion time distributions of biochemical processes

    Energy Technology Data Exchange (ETDEWEB)

    Munsky, Brian [Los Alamos National Laboratory; Nemenman, Ilya [Los Alamos National Laboratory; Bel, Golan [Los Alamos National Laboratory

    2009-01-01

    In order to produce specific complex structures from a large set of similar biochemical building blocks, many biochemical systems require high sensitivity to small molecular differences. The first and most common model used to explain this high specificity is kinetic proofreading, which has been extended to a variety of systems, from the detection of DNA mismatch to cell signaling processes. While the specificity properties of the kinetic proofreading model are well known and have been studied in various contexts, very little is known about its temporal behavior. In this work, we study the dynamical properties of discrete stochastic two-branch kinetic proofreading schemes. Using the Laplace transform of the corresponding chemical master equation, we obtain an analytical solution for the completion time distribution. In particular, we provide expressions for the specificity and for the mean and variance of the process completion times. We also show that, for a wide range of parameters, a process distinguishing between two different products can be reduced to a much simpler three-point process. Our results allow for the systematic study of the interplay between specificity and completion times as well as testing the validity of the kinetic proofreading model in biological systems.
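    To make the notion of a completion time distribution concrete, the sketch below runs a Gillespie-style stochastic simulation of a deliberately simplified single-substrate proofreading scheme; the states, rates and branch structure are invented for illustration and are not the two-branch scheme analysed in the paper.

```python
# Toy Gillespie-style simulation sampling completion times for a simplified scheme:
# empty -> bound (k_on); bound -> empty (k_off, unbinding; k_proof, proofreading
# discard); bound -> done (k_cat, product formation). Rates are invented.
import random

def completion_time(k_on=1.0, k_off=0.5, k_proof=2.0, k_cat=0.2):
    t, state = 0.0, "empty"
    while True:
        if state == "empty":
            rates = {"bind": k_on}
        else:  # bound
            rates = {"unbind": k_off, "proof": k_proof, "cat": k_cat}
        total = sum(rates.values())
        t += random.expovariate(total)                 # time to the next event
        r, acc = random.uniform(0, total), 0.0
        for event, rate in rates.items():              # pick the event proportionally
            acc += rate
            if r <= acc:
                break
        if event == "bind":
            state = "bound"
        elif event == "cat":
            return t                                    # completion
        else:
            state = "empty"                             # unbind or proofread: restart

times = [completion_time() for _ in range(10000)]
print("mean completion time:", sum(times) / len(times))
```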

  4. Comparison between open phase fault of arc suppression coil and single phase to earth fault in coal mine distribution network

    Institute of Scientific and Technical Information of China (English)

    LI Xiao-bo; WANG Chong-lin

    2008-01-01

    When, in a coal mine distribution network whose neutral point is grounded through an arc suppression coil (ASC), a fault occurs in the ASC itself, compensation cannot be properly realized; furthermore, the fault can compromise the safe and reliable operation of the network. We first introduce a three-phase five-column arc suppression coil (TPFCASC) and discuss its auto-tracking compensation theory. Then we compare the single-phase-to-ground fault of the coal mine distribution network with an open phase fault at the TPFCASC using Thévenin's theorem, the symmetrical-component method and the complex sequence network, respectively. The results show that a zero-sequence voltage appears in the network for both types of fault, and that its maximum magnitude differs between the two faults. Based on this, protection against the open phase fault at the TPFCASC should be established.

  5. Fiber-distributed multi-channel open-path H2S sensor based on tunable diode laser absorption spectroscopy

    Institute of Scientific and Technical Information of China (English)

    Dong Chen; Wenqing Liu; Yujun Zhang; Jianguo Liu; Ruifeng Kan; Min Wang; Xi Fang; Yiben Cui

    2007-01-01

    Tunable-diode-laser-based gas detectors are now being used in a wide variety of applications of safety and environmental interest. A fiber-distributed multi-channel open-path H2S sensor based on tunable diode laser absorption spectroscopy (TDLAS) is developed. The laser used is a telecommunication near-infrared distributed feedback (DFB) tunable diode laser, and wavelength modulation spectroscopy is combined with optical fiber techniques to distribute the measurement over multiple channels. An on-board reference cell provides on-line sensor calibration and almost maintenance-free operation. The sensor is suitable for large-area field H2S monitoring applications.

  6. Investigation and Evaluation of the open source ETL tools GeoKettle and Talend Open Studio in terms of their ability to process spatial data

    Science.gov (United States)

    Kuhnert, Kristin; Quedenau, Jörn

    2016-04-01

    Integration and harmonization of large spatial data sets has been a major issue, and not only since the introduction of the spatial data infrastructure INSPIRE. Extracting and combining spatial data from heterogeneous source formats, transforming the data to obtain the required quality for particular purposes, and loading it into a data store are common tasks. This procedure of Extraction, Transformation and Loading of data is called the ETL process. Geographic Information Systems (GIS) can take over many of these tasks but often they are not suitable for processing large datasets. ETL tools can make the implementation and execution of ETL processes convenient and efficient. One reason for choosing ETL tools for data integration is that they ease maintenance through a clear (graphical) presentation of the transformation steps. Developers and administrators are provided with tools for identifying errors, analysing processing performance and managing the execution of ETL processes. Another benefit of ETL tools is that most tasks require little or no scripting, so researchers without a programming background can also work with them easily. Investigations of ETL tools for business applications have been available for a long time; however, little work has been published on the capabilities of those tools to handle spatial data. In this work, we review and compare the open source ETL tools GeoKettle and Talend Open Studio in terms of processing spatial data sets of different formats. For evaluation, ETL processes are performed with both software packages based on air quality data measured during the BÄRLIN2014 Campaign initiated by the Institute for Advanced Sustainability Studies (IASS). The aim of the BÄRLIN2014 Campaign is to better understand the sources and distribution of particulate matter in Berlin. The air quality data are available in heterogeneous formats because they were measured with different instruments. For further data analysis

  7. THE APPLICATION OF OpenCV OPEN-SOURCE LIBRARY TO NUCLEAR DATA INFORMATION PROCESSING

    Institute of Scientific and Technical Information of China (English)

    倪宁; 曾国强; 葛良全; 林凡; 肖雪夫; 赖万昌

    2012-01-01

    This paper describes the application of the OpenCV open-source library to nuclear spectral data processing, covering commonly used methods such as spectrum smoothing, peak finding, and Kalman filtering. A test of the NASVD algorithm on 492 spectra of 256 channels each shows that the computation time of the OpenCV-based implementation is only about one third of that of the traditional C++ implementation, greatly improving data-processing efficiency. OpenCV therefore has good application prospects in the field of nuclear data processing.
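
    A small illustrative sketch (not the code from the paper) of how OpenCV routines can be applied to a 256-channel spectrum for smoothing and simple peak finding; the spectrum below is synthetic, and the kernel size and peak threshold are arbitrary assumptions.

```python
import numpy as np
import cv2

# Synthetic 256-channel spectrum: two Gaussian peaks plus Poisson noise
channels = np.arange(256)
truth = 500 * np.exp(-0.5 * ((channels - 80) / 4.0) ** 2) \
      + 300 * np.exp(-0.5 * ((channels - 170) / 6.0) ** 2) + 20
spectrum = np.random.default_rng(0).poisson(truth).astype(np.float32)

# Smooth the spectrum with an OpenCV Gaussian filter (spectrum treated as a 256x1 image)
smooth = cv2.GaussianBlur(spectrum.reshape(-1, 1), (1, 9), 0).ravel()

# Simple peak finding: local maxima above an assumed threshold of 100 counts
peaks = [i for i in range(1, 255)
         if smooth[i] > smooth[i - 1] and smooth[i] > smooth[i + 1] and smooth[i] > 100]
print("peak channels:", peaks)
```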

  8. Distribution of green open space in Malang City based on multispectral data

    Science.gov (United States)

    Hasyim, A. W.; Hernawan, F. P.

    2017-06-01

    Green open space is an important category of urban land use, and regulations set its minimum extent at 30% of a city's total area. Malang, which covers 110.6 square kilometres, is one of the major cities in East Java Province and is prone to land conversion driven by development needs. In support of the green space programme, green open space must be measured precisely, so remote sensing, which offers high accuracy, is now used for this purpose. This study aims to analyse the area of green open space in Malang using a Landsat 8 image from 2015. The method used was the Normalized Difference Vegetation Index (NDVI). The study found that calculating green open space with the vegetation index method is preferable because it avoids misclassification with other types of land use. The results of the NDVI calculation show that green open space in Malang City in 2015 reached 39% of the total area.
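
    A minimal sketch of the NDVI step used in studies of this kind; the band roles follow the Landsat 8 OLI convention (band 4 = red, band 5 = near infrared), while the reflectance values and the 0.3 threshold for counting a pixel as green space are illustrative assumptions rather than the values used by the authors.

```python
import numpy as np

def ndvi(red, nir):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    return (nir - red) / np.maximum(nir + red, 1e-9)   # avoid division by zero

# Synthetic reflectance arrays standing in for Landsat 8 band 4 (red) and band 5 (NIR)
rng = np.random.default_rng(0)
red = rng.uniform(0.02, 0.3, size=(100, 100))
nir = rng.uniform(0.05, 0.6, size=(100, 100))

index = ndvi(red, nir)
green_fraction = np.mean(index > 0.3)     # assumed threshold for vegetated pixels
print(f"green open space fraction: {green_fraction:.1%}")
```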

  9. Distributed Processing of Sentinel-2 Products using the BIGEARTH Platform

    Science.gov (United States)

    Bacu, Victor; Stefanut, Teodor; Nandra, Constantin; Mihon, Danut; Gorgan, Dorian

    2017-04-01

    The constellation of observational satellites orbiting the Earth is constantly growing, providing ever more data that need to be processed in order to extract meaningful information and knowledge from them. Sentinel-2 satellites, part of the Copernicus Earth Observation programme, are intended for use in agriculture, forestry and many other land management applications. ESA's SNAP toolbox can be used to process data gathered by Sentinel-2 satellites but is limited to the resources provided by a stand-alone computer. In this paper we present a cloud-based software platform that uses this toolbox together with other remote sensing software applications to process Sentinel-2 products. The BIGEARTH software platform [1] offers an integrated solution for processing Earth Observation data coming from different sources (such as satellites or on-site sensors). The flow of processing is defined as a chain of tasks based on the WorDeL description language [2]. Each task may rely on a different software technology (such as GRASS GIS and ESA's SNAP) to process the input data. One important feature of the BIGEARTH platform is this possibility of interconnecting and integrating, within the same processing flow, various well-known software technologies. All this integration is transparent from the user's perspective. The proposed platform extends the SNAP capabilities by enabling specialists to easily scale the processing over distributed architectures, according to their specific needs and resources. The software platform [3] can be used in multiple configurations. In the basic one the software platform runs as a standalone application inside a virtual machine. Obviously in this case the computational resources are limited, but it gives an overview of the functionalities of the software platform, as well as the possibility to define the flow of processing and later execute it on a more complex infrastructure. The most complex and robust

  10. Distributed Iterative Processing for Interference Channels with Receiver Cooperation

    CERN Document Server

    Badiu, Mihai-Alin; Bota, Vasile; Fleury, Bernard Henri

    2012-01-01

    We propose a framework for the derivation and evaluation of distributed iterative algorithms for receiver cooperation in interference-limited wireless systems. Our approach views the processing within and collaboration between receivers as the solution to an inference problem in the probabilistic model of the whole system. The probabilistic model is formulated to explicitly incorporate the receivers' ability to share information of a predefined type. We employ a recently proposed unified message-passing tool to infer the variables of interest in the factor graph representation of the probabilistic model. The exchange of information between receivers arises in the form of passing messages along some specific edges of the factor graph; the rate of updating and passing these messages determines the communication overhead associated with cooperation. Simulation results illustrate the high performance of the proposed algorithm even with a low number of message exchanges between receivers.

  11. Lifetime-Based Memory Management for Distributed Data Processing Systems

    DEFF Research Database (Denmark)

    Lu, Lu; Shi, Xuanhua; Zhou, Yongluan;

    2016-01-01

    In-memory caching of intermediate data and eager combining of data in shuffle buffers have been shown to be very effective in minimizing the re-computation and I/O cost in distributed data processing systems like Spark and Flink. However, it has also been widely reported that these techniques would create a large amount of long-living data objects in the heap, which may quickly saturate the garbage collector, especially when handling a large dataset, and hence would limit the scalability of the system. To eliminate this problem, we propose a lifetime-based memory management framework, which, by automatically analyzing the user-defined functions and data types, obtains the expected lifetime of the data objects, and then allocates and releases memory space accordingly to minimize the garbage collection overhead. In particular, we present Deca, a concrete implementation of our proposal on top of Spark...

  12. An integrated distributed processing interface for supercomputers and workstations

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, J.; McGavran, L.

    1989-01-01

    Access to documentation, communication between multiple processes running on heterogeneous computers, and animation of simulations of engineering problems are typically weak in most supercomputer environments. This presentation will describe how we are improving this situation in the Computer Research and Applications group at Los Alamos National Laboratory. We have developed a tool using UNIX filters and a SunView interface that allows users simple access to documentation via mouse-driven menus. We have also developed a distributed application that integrates a two-point boundary value problem on one of our Cray supercomputers; it is controlled and displayed graphically by a window interface running on a workstation screen. Our motivation for this research has been to improve the usual typewriter/static interface using language-independent controls to show the capabilities of the workstation/supercomputer combination. 8 refs.

  13. Lifetime-Based Memory Management for Distributed Data Processing Systems

    DEFF Research Database (Denmark)

    Lu, Lu; Shi, Xuanhua; Zhou, Yongluan;

    2016-01-01

    In-memory caching of intermediate data and eager combining of data in shuffle buffers have been shown to be very effective in minimizing the re-computation and I/O cost in distributed data processing systems like Spark and Flink. However, it has also been widely reported that these techniques would create a large amount of long-living data objects in the heap, which may quickly saturate the garbage collector, especially when handling a large dataset, and hence would limit the scalability of the system. To eliminate this problem, we propose a lifetime-based memory management framework, which, by automatically analyzing the user-defined functions and data types, obtains the expected lifetime of the data objects, and then allocates and releases memory space accordingly to minimize the garbage collection overhead. In particular, we present Deca, a concrete implementation of our proposal on top of Spark...

  14. Distributed Iterative Processing for Interference Channels with Receiver Cooperation

    DEFF Research Database (Denmark)

    Badiu, Mihai Alin; Manchón, Carles Navarro; Bota, Vasile

    2012-01-01

    We propose a method for the design and evaluation of distributed iterative algorithms for receiver cooperation in interference-limited wireless systems. Our approach views the processing within and collaboration between receivers as the solution to an inference problem in the probabilistic model of the whole system. The probabilistic model is formulated to explicitly incorporate the receivers' ability to share information of a predefined type. We employ a recently proposed unified message-passing tool to infer the variables of interest in the factor graph representation of the probabilistic model. The exchange of information between receivers arises in the form of passing messages along some specific edges of the factor graph; the rate of updating and passing these messages determines the amount of communication overhead associated with cooperation. Simulation results illustrate the high performance...

  15. Influence of particle size distribution on nanopowder cold compaction processes

    Science.gov (United States)

    Boltachev, G.; Volkov, N.; Lukyashin, K.; Markov, V.; Chingina, E.

    2017-06-01

    Nanopowder uniform and uniaxial cold compaction processes are simulated by a 2D granular dynamics method. In addition to the well-known contact laws, the interaction of particles involves dispersion forces of attraction and the possibility of interparticle solid-bridge formation, both of which are of great importance for nanopowders. Different model systems are investigated: monosized systems with particle diameters of 10, 20 and 30 nm; bidisperse systems with different contents of small (10 nm diameter) and large (30 nm) particles; and polydisperse systems corresponding to a log-normal size distribution law of varying width. A non-monotonic dependence of compact density on powder composition is revealed in the bidisperse systems. The deviations of compact density in polydisperse systems from the density of the corresponding monosized system are found to be minor, less than 1 percent.

  16. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data

    Directory of Open Access Journals (Sweden)

    Anderson Gordon A

    2009-03-01

    Full Text Available Abstract Background Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high-performance chart controls. Results With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to

  17. Applying SDN/OpenFlow in Virtualized LTE to support Distributed Mobility Management (DMM)

    NARCIS (Netherlands)

    Karimzadeh, Morteza; Valtulina, Luca; Karagiannis, Georgios

    2014-01-01

    Distributed Mobility Management (DMM) is a mobility management solution, where the mobility anchors are distributed instead of being centralized. The use of DMM can be applied in cloud-based (virtualized) Long Term Evolution (LTE) mobile network environments to (1) provide session continuity to user

  18. Distribution of meiofaunal abundances in a marine cave complex with secondary openings and freshwater filtrations

    DEFF Research Database (Denmark)

    Riera, Rodrigo; Monterroso, Óscar; Núñez, Jorge

    2017-01-01

    Meiofaunal communities (0.062–0.5-mm body size) have been largely neglected in ecological studies of communities inhabiting sea caves. In the present study, we analysed meiofaunal communities from Los Cerebros cave, a shallow marine cave (3–8 m in depth, 80 m long) with secondary openings in the inner...

  19. Hyporheic flow and dissolved oxygen distribution in fish nests: The effects of open channel velocity, permeability patterns, and groundwater upwelling

    Science.gov (United States)

    Cardenas, M. Bayani; Ford, Aimee E.; Kaufman, Matthew H.; Kessler, Adam J.; Cook, Perran L. M.

    2016-12-01

    Many fish lay their eggs in nests, or redds, which they construct in sediment. The viability of eggs depends on many factors, particularly their oxygenation. Because dissolved oxygen is typically saturated within the stream channel, the dissolved oxygen distribution within the redd depends on whether or not hyporheic flow and transport occur within the sediment. We conducted a series of flume and numerical flow and age transport modeling experiments with the aim of understanding the effects of salmonid redds on the hyporheic transport of young oxygenated water. Hyporheic flow was visualized directly through dye injections. Dissolved oxygen throughout the fish nest was measured using a planar optode. Experiments were conducted at various open channel flow velocities in order to understand their effect on dissolved oxygen, and computational simulations considered various sediment textures and ambient groundwater upwelling rates to add process-level insight. We found that, as also shown by previous studies, the redd topography induces multiscale hyporheic flow that effectively flushes the egg pocket location with younger presumably oxygenated water; older water upwells and forms anoxic zones. This pattern persists even at the lowest channel flow rates and at small upwelling velocities of older ambient groundwater which splits the multiscale hyporheic flow cells into isolated pockets. Large groundwater upwelling rates can shut down all the hyporheic flushing. The relatively coarse texture of the redd further promotes hyporheic flushing of the redd sediment with oxygenated water. Thus, redd morphology and sediment texture optimally combine to induce hyporheic exchange flow that delivers young oxygenated water to the egg pocket.

  20. Technical Note: An open source library for processing weather radar data (wradlib

    Directory of Open Access Journals (Sweden)

    T. Pfaff

    2012-11-01

    Full Text Available The potential of weather radar observations for hydrological and meteorological research and applications is undisputed, particularly with increasing world-wide radar coverage. However, several barriers impede the use of weather radar data. These barriers are of both a scientific and a technical nature: the former refers to inherent measurement errors and artefacts, the latter to aspects such as reading specific data formats, geo-referencing, and visualisation. The radar processing library wradlib is intended to lower these barriers by providing a free and open source tool for the most important steps in processing weather radar data for hydro-meteorological and hydrological applications. Moreover, the community-based development approach of wradlib allows scientists to share their knowledge about efficient processing algorithms and to make this knowledge available to the weather radar community in a transparent, structured and well-documented way.

  1. Technical Note: An open source library for processing weather radar data (wradlib

    Directory of Open Access Journals (Sweden)

    T. Pfaff

    2013-02-01

    Full Text Available The potential of weather radar observations for hydrological and meteorological research and applications is undisputed, particularly with increasing world-wide radar coverage. However, several barriers impede the use of weather radar data. These barriers are of both a scientific and a technical nature: the former refers to inherent measurement errors and artefacts, the latter to aspects such as reading specific data formats, geo-referencing, and visualisation. The radar processing library wradlib is intended to lower these barriers by providing a free and open source tool for the most important steps in processing weather radar data for hydro-meteorological and hydrological applications. Moreover, the community-based development approach of wradlib allows scientists to share their knowledge about efficient processing algorithms and to make this knowledge available to the weather radar community in a transparent, structured and well-documented way.

  2. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Sanggoo Kang

    2016-08-01

    Full Text Available Cloud computing is a base platform for the distribution of large volumes of data and high-performance image processing on the Web. Despite wide application in Web-based services and their many benefits, geo-spatial applications based on cloud computing technology are still developing. Auto-scaling realizes automatic scalability, i.e., the scale-out and scale-in of virtual servers in a cloud computing environment. This study investigates the applicability of auto-scaling to geo-based image processing algorithms by comparing the performance of a single virtual server and multiple auto-scaled virtual servers under identical experimental conditions. In this study, the cloud computing environment is built with OpenStack, and four algorithms from the Orfeo toolbox are used for practical geo-based image processing experiments. The auto-scaling results from all experimental performance tests demonstrate the practical value of auto-scaling for cloud utilization in terms of response time. Auto-scaling contributes to the development of web-based satellite image application services using cloud-based technologies.

  3. The Marginal Distributions of a Crossing Time and Renewal Numbers Related with Two Poisson Processes are as Ph-Distributions

    Directory of Open Access Journals (Sweden)

    Mir G. H. Talpur

    2006-01-01

    Full Text Available In this paper we consider how to find the marginal distributions of the crossing time and renewal numbers related to two Poisson processes by using probability arguments. The results show that the one-dimensional marginal distributions are PH-distributions of order N+1.

  4. Amplitude Distribution of Emission Wave for Cracking Process

    Directory of Open Access Journals (Sweden)

    Shahidan Shahiron

    2016-01-01

    Full Text Available The acoustic emission technique is a method of assessment for structural health monitoring systems. It is an effective tool for evaluating any system without destroying the material, enables early crack detection, and has very high sensitivity to crack growth. The crack patterns in concrete beams have been identified according to the type of cracking process, and the crack classifications based on the AE data parameters rely mainly on the AE amplitude, rise time, and average frequency. These parameters have been analysed using the statistical method of b-value analysis. This paper focuses on the use of statistical b-value analysis to evaluate the emission amplitude distribution of concrete beams. The beam specimens (150 × 250 × 1900 mm) were prepared in the laboratory and tested in a four-point bending test under cyclic loading together with an acoustic emission monitoring system. The results show that this statistical analysis is promising for determining the cracking process in concrete beams.
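
    One common way to obtain an AE b-value is to fit the cumulative amplitude distribution, log10 N(>=A) versus amplitude in dB (scaled by 1/20), with a straight line; the sketch below illustrates that idea on synthetic amplitudes and is not the authors' implementation.

```python
import numpy as np

def ae_b_value(amplitudes_db, a_min=40.0):
    """Least-squares b-value from AE peak amplitudes (dB).

    Uses the Gutenberg-Richter-type relation log10 N(>=A) = a - b * (A / 20),
    a common convention in AE b-value analysis (illustrative, not the paper's
    exact procedure)."""
    amps = np.sort(amplitudes_db[amplitudes_db >= a_min])
    n_cum = np.arange(amps.size, 0, -1)                 # N(>=A) for each sorted amplitude
    slope, intercept = np.polyfit(amps / 20.0, np.log10(n_cum), 1)
    return -slope, intercept

# Synthetic amplitudes drawn from an exponential distribution above 40 dB
rng = np.random.default_rng(0)
amplitudes = 40.0 + rng.exponential(scale=8.0, size=2000)
b, a = ae_b_value(amplitudes)
print(f"estimated b-value: {b:.2f}")
```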

  5. Designing a Repetitive Group Sampling Plan for Weibull Distributed Processes

    Directory of Open Access Journals (Sweden)

    Aijun Yan

    2016-01-01

    Full Text Available Acceptance sampling plans are useful tools for determining whether submitted lots should be accepted or rejected. An efficient and economic sampling plan is very desirable for the high quality levels required by production processes. The process capability index CL is an important quality parameter for measuring product quality. Utilizing the relationship between the CL index and the nonconforming rate, a repetitive group sampling (RGS) plan based on the CL index is developed in this paper for the case where the quality characteristic follows the Weibull distribution. The optimal parameters of the proposed RGS plan are determined by simultaneously satisfying the commonly used producer's risk and consumer's risk while minimizing the average sample number (ASN), and are tabulated for different combinations of acceptance quality level (AQL) and limiting quality level (LQL). The results show that the proposed plan outperforms the single sampling plan in terms of ASN. Finally, the proposed RGS plan is illustrated with an industrial example.

  6. Employing OpenCL to Accelerate Ab Initio Calculations on Graphics Processing Units.

    Science.gov (United States)

    Kussmann, Jörg; Ochsenfeld, Christian

    2017-06-13

    We present an extension of our graphics processing units (GPU)-accelerated quantum chemistry package to employ OpenCL compute kernels, which can be executed on a wide range of computing devices like CPUs, Intel Xeon Phi, and AMD GPUs. Here, we focus on the use of AMD GPUs and discuss differences as compared to CUDA-based calculations on NVIDIA GPUs. First illustrative timings are presented for hybrid density functional theory calculations using serial as well as parallel compute environments. The results show that AMD GPUs are as fast or faster than comparable NVIDIA GPUs and provide a viable alternative for quantum chemical applications.

  7. Theoretical study on Cold Open Die Forging Process Optimization for Multipass Workability

    Directory of Open Access Journals (Sweden)

    Gaikwad Ajitkumar

    2016-01-01

    Full Text Available Cold workability limits the strength enhancement of austenitic materials through cold deformation. Intrinsic workability is a material characteristic, whereas state-of-stress workability is governed by the nature of the applied stress, the strain rate and the geometry of the deformation zone. For Cold Open Die Forging (CODF), multipass workability is essential. In this work, the FEM tool FORGE-3 is used to optimize CODF on a hydraulic press through analysis of stress-strain profiles and use of the Latham-Cockcroft damage criterion. The study recommends optimized process parameters, die combinations and pass schedules.

  8. Parallel processing implementation for the coupled transport of photons and electrons using OpenMP

    Science.gov (United States)

    Doerner, Edgardo

    2016-05-01

    In this work the use of OpenMP to implement parallel processing of the Monte Carlo (MC) simulation of coupled photon-electron transport is presented. The implementation was carried out using a modified EGSnrc platform which enables the use of the Microsoft Visual Studio 2013 (VS2013) environment, together with the development tools available in Intel Parallel Studio XE 2015 (XE2015). The performance study of this new implementation was carried out on a desktop PC with a multi-core CPU, taking the performance of the original platform as a reference. The results were satisfactory in terms of both scalability and parallelization efficiency.

  9. Open Access, Library Subscriptions and Article Processing Charges: Hybrid journals models and issues

    OpenAIRE

    Vijayakumar, J. K.; Tamarkin, Molly

    2016-01-01

    Hybrid journals contain articles behind a pay-wall, available by subscription, as well as papers made open access when the author pays an article processing charge (APC). In such cases, an institution can end up paying twice and publishers tend to double-dip. Discussions and pilot models are emerging on pricing options, such as "offset pricing," [where APCs are adjusted or discounted with subscription costs as vouchers or reductions in next year subscriptions, APCs beyond the subscription costs are modestl...

  10. Improvement of air distribution in refrigerated vertical open front remote supermarket display cases

    Energy Technology Data Exchange (ETDEWEB)

    Gray, I.; Luscombe, P.; McLean, L.; Sarathy, C.S.P.; Sheahen, P.; Srinivasan, K. [Frigrite Refrigeration Pty. Ltd, 27 Grange Road, Cheltenham, Vic. 3192 (Australia)

    2008-08-15

    This paper presents some of the results derived from extensive experimentation on display cases for supermarkets and derives possible improvements to augment temperature uniformity and energy performance. The effect of the perforation pattern of the rear duct on the distribution of airflow between the rear duct and the front air curtain is brought out. The critical component is air infiltration across the curtain, which is governed by the internal air distribution and the curtain characteristics. It is observed that a 70-30 distribution of flow between the curtain and the rear duct perforations yields a performance that satisfies the standards. Further, a judicious distribution of perforations on the rear duct, at various levels and across its width, is necessary. Correlations available in the literature are useful in making a qualitative assessment of the results. (author)

  11. Open charm production in Double Parton Scattering processes in the forward kinematics

    CERN Document Server

    Blok, B

    2016-01-01

    We calculate the rate of double open charm production in the forward kinematics studied recently in the LHCb experiment. We find that the mean-field approximation for the double parton GPDs (generalized parton distributions), which neglects parton-parton correlations, underestimates the rate by a factor of two. The enhancement due to the perturbative QCD correlation (1⊗2) mechanism, which explains the rates of double parton interactions at central rapidities, is found to account for 60-80 percent of the discrepancy. We argue that the nonperturbative fluctuations leading to non-factorized (correlated) contributions to the initial conditions for the DGLAP collinear evolution of the double parton GPDs play an important role in this kinematics. Combined, the two correlation mechanisms provide a good description of the rate of double charm production reported by the LHCb. We also give predictions for the variation of the ratio of the double and the square of the single inclusive rates in the discussed kinematics as a function of p...

  12. A novel bio-safe phase separation process for preparing open-pore biodegradable polycaprolactone microparticles.

    Science.gov (United States)

    Salerno, Aurelio; Domingo, Concepción

    2014-09-01

    Open-pore biodegradable microparticles are objects of considerable interest for biomedical applications, particularly as cell and drug delivery carriers in tissue engineering and health care treatments. Furthermore, engineering microparticles with a well-defined size distribution and pore architecture by bio-safe fabrication routes is crucial to avoid the use of toxic compounds potentially harmful to cells and biological tissues. To address this important issue, the present study develops a straightforward and bio-safe approach for fabricating porous biodegradable microparticles with controlled morphological and structural features down to the nanometer scale. In particular, ethyl lactate is used as a non-toxic solvent for the fabrication of polycaprolactone particles via a thermally induced phase separation technique. The approach yields open-pore particles with a mean particle size in the 150-250 μm range and a specific surface area of 3.5-7.9 m(2)/g. Finally, the combination of thermally induced phase separation and porogen leaching is employed for the first time to obtain multi-scaled porous microparticles with large external and internal pore sizes and potentially improved characteristics for cell culture and tissue engineering. The samples were characterized to assess their thermal properties, morphology, crystalline structure and textural properties.

  13. Baseliner: An open-source, interactive tool for processing sap flux data from thermal dissipation probes

    Science.gov (United States)

    Oishi, A. Christopher; Hawthorne, David A.; Oren, Ram

    Estimating transpiration from woody plants using thermal dissipation sap flux sensors requires careful data processing. Currently, researchers accomplish this using spreadsheets, or by personally writing scripts for statistical software programs (e.g., R, SAS). We developed the Baseliner software to help establish a standardized protocol for processing sap flux data. Baseliner enables users to QA/QC data and process data using a combination of automated steps, visualization, and manual editing. Data processing requires establishing a zero-flow reference value, or "baseline", which varies among sensors and with time. Since no set of algorithms currently exists to reliably QA/QC and estimate the zero-flow baseline, Baseliner provides a graphical user interface to allow visual inspection and manipulation of data. Data are first automatically processed using a set of user defined parameters. The user can then view the data for additional, manual QA/QC and baseline identification using mouse and keyboard commands. The open-source software allows for user customization of data processing algorithms as improved methods are developed.
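
    For context, once a zero-flow baseline ΔTmax has been established, the temperature differences are typically converted to sap flux density with the widely used Granier calibration; the sketch below shows that step on synthetic data and is independent of Baseliner itself. The naive night-time-maximum baseline used here is an illustrative assumption, not the QA/QC procedure implemented in the software.

```python
import numpy as np

def sap_flux_density(delta_t, delta_t_max):
    """Granier-type calibration: K = (dTmax - dT)/dT,
    Fd = 118.99e-6 * K**1.231  [m3 m-2 s-1]."""
    k = (delta_t_max - delta_t) / delta_t
    return 118.99e-6 * np.maximum(k, 0.0) ** 1.231

# Synthetic half-hourly temperature differences for two days (48 values per day)
rng = np.random.default_rng(0)
hours = np.arange(0, 48, 0.5)
delta_t = 10.0 - 3.0 * np.clip(np.sin((hours % 24 - 6) / 12 * np.pi), 0, None) \
          + rng.normal(0, 0.05, hours.size)

# Naive baseline: the maximum dT observed between midnight and 5 a.m. (assumed zero flow)
night = (hours % 24) < 5
delta_t_max = delta_t[night].max()

fd = sap_flux_density(delta_t, delta_t_max)
print(f"baseline dTmax = {delta_t_max:.2f} K, daily peak Fd = {fd.max():.2e} m3 m-2 s-1")
```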

  14. Model for Understanding Flow Processes and Distribution in Rock Rubble

    Science.gov (United States)

    Green, R. T.; Manepally, C.; Fedors, R.; Gwo, J.

    2006-12-01

    Recent studies of the potential high-level nuclear waste repository at Yucca Mountain, Nevada, suggest that degradation of emplacement drifts may be caused by either persistent stresses induced by thermal decay of the spent nuclear fuel disposed in the drifts or seismic ground motion. Of significant interest to the performance of the repository is how seepage of water onto the engineered barriers in degraded emplacement drifts would be altered by rubble. Difficulty arises because of the uncertainty associated with the heterogeneity of the natural system complicated by the unknown fragment size and distribution of the rock rubble. A prototype experiment was designed to understand the processes that govern the convergence and divergence of flow in the rubble. This effort is expected to provide additional realism in the corresponding process models and performance assessment of the repository, and to help evaluate the chemistry of water contacting the waste as well as conditions affecting waste package corrosion in the presence of rubble. The rubble sample for the experiment was collected from the lower lithophysal unit of the Topopah Spring (Tptpll) unit in the Enhanced Characterization of the Repository Block Cross Drift and is used as an approximate analog. Most of the potential repository is planned to be built in the Tptpll unit. Sample fragment size varied from 1.0 mm [0.04 in] to 15 cm [6 in]. Ongoing experiments use either a single or multiple sources of infiltration at the top to simulate conditions that could exist in a degraded drift. Seepage is evaluated for variable infiltration rates, rubble particle size distribution, and rubble layering. Comparison of test results with previous bench-scale tests performed on smaller-sized fragments and different geological media will be presented. This paper is an independent product of CNWRA and does not necessarily reflect the view or regulatory position of NRC. The NRC staff views expressed herein are preliminary

  15. Optimization of valve opening process for the suppression of impulse exhaust noise

    Science.gov (United States)

    Li, Jingxiang; Zhao, Shengdun

    2017-02-01

    Impulse exhaust noise generated by the sudden impact of the discharging flow of pneumatic systems has significant temporal characteristics, including high sound pressure and a rapid sound transient. Impulse noise exposures are more hazardous to hearing than energy-equivalent uniform noise exposures. This paper presents a novel approach to suppress the peak sound pressure, a major indicator of the impulsiveness of impulse exhaust noise, by optimizing the opening process of the valve. The relationships between exhaust flow and impulse noise are described through thermodynamics and the noise generating mechanism. An optimized approach based on controlling the valve opening process is then derived under the constraint of a pre-set exhaust time. A modified servo-direct-driven valve was designed and assembled in a typical pneumatic system for verification experiments, in comparison with an original solenoid valve. Experimental results for several initial cylinder pressures and pre-set exhaust times are shown to verify the effects of the proposed optimization. Indicators of energy equivalence and impulsiveness are introduced to discuss the noise suppression effects. The relationship between noise reduction and exhaust time delay is also discussed.

  16. Dynamics of Listeria monocytogenes colonisation in a newly-opened meat processing facility.

    Science.gov (United States)

    Bolocan, Andrei Sorin; Nicolau, Anca Ioana; Alvarez-Ordóñez, Avelino; Borda, Daniela; Oniciuc, Elena Alexandra; Stessl, Beatrix; Gurgu, Leontina; Wagner, Martin; Jordan, Kieran

    2016-03-01

    This study determined the colonisation scenario of Listeria monocytogenes in a newly-opened ready-to-eat meat processing facility using a combination of classical microbiology and molecular biology techniques. Samples (n=183), including food contact surfaces, non-food contact surfaces, raw materials and food samples, collected on four sampling occasions, were analysed for L. monocytogenes by the ISO 11290:1996 standard method and by real-time PCR applied to the second enrichment broth from the ISO method. No L. monocytogenes were detected on the first sampling occasion, but by the second sampling occasion a persistent clone had colonised the facility. Analysis of the second enrichment of the ISO method by real-time PCR was more sensitive for the detection of L. monocytogenes than the ISO method alone. In order to reduce the risk of cross contamination and the public health risk, awareness and proactive measures are required to control L. monocytogenes from the first days of production in a newly opened meat processing facility. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Scaled model for simulating the opening process of semiconductor opening switches

    Institute of Scientific and Technical Information of China (English)

    林舒; 李永东; 王洪广; 刘纯亮

    2013-01-01

    A scaled model is presented for simulating the opening process of semiconductor opening switches (SOSs), suitable for use in the Silvaco ATLAS code. By scaling all external circuit parameters by the same proportion while keeping the current density through the SOS unchanged, the model preserves the physics in the SOS. Using equivalent SOSs with different cross-section scaling factors to represent the original 100 diodes connected in series, the simulations yield the same diode current and voltage waveforms in all cases. The results verify that the model not only reproduces the transient opening process of the SOS correctly, but also speeds up the computation by nearly a factor of one hundred. Analysis of the evolution of the carrier density and electric field distributions during the opening process shows that the opening process takes place in the n-n+ region.

  18. Determination of Optimal Opening Scheme for Electromagnetic Loop Networks Based on Fuzzy Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Yang Li

    2016-01-01

    Full Text Available Studying the optimization of and decision-making for the opening of electromagnetic loop networks plays an important role in the planning and operation of power grids. First, the basic principle of the fuzzy analytic hierarchy process (FAHP) is introduced; then an improved FAHP-based scheme evaluation method is proposed for decoupling electromagnetic loop networks, based on a set of indicators reflecting the performance of the candidate schemes. The proposed method combines the advantages of the analytic hierarchy process (AHP) and fuzzy comprehensive evaluation. On the one hand, AHP effectively combines qualitative and quantitative analysis to ensure the rationality of the evaluation model; on the other hand, the judgment matrix and qualitative indicators are expressed with trapezoidal fuzzy numbers to make decision-making more realistic. The effectiveness of the proposed method is validated by its application to the real power system of Liaoning Province, China.
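
    As background to the scoring step, the sketch below shows the crisp AHP building block: deriving priority weights from a reciprocal pairwise comparison matrix and checking its consistency. The trapezoidal fuzzy extension used in the paper is not reproduced here, and the example judgment matrix is hypothetical.

```python
import numpy as np

def ahp_weights(judgment):
    """Priority weights from a reciprocal pairwise comparison matrix
    (principal eigenvector method), plus Saaty's consistency ratio."""
    judgment = np.asarray(judgment, dtype=float)
    n = judgment.shape[0]
    eigvals, eigvecs = np.linalg.eig(judgment)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()
    ci = (eigvals[k].real - n) / (n - 1)                 # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # random index (Saaty)
    cr = ci / ri if ri > 0 else 0.0
    return weights, cr

# Hypothetical judgment matrix for three indicators (e.g. cost, reliability, losses)
judgment = [[1,   3,   5],
            [1/3, 1,   2],
            [1/5, 1/2, 1]]
w, cr = ahp_weights(judgment)
print("weights:", np.round(w, 3), "consistency ratio:", round(cr, 3))
```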

  19. A Process for the Representation of openEHR ADL Archetypes in OWL Ontologies.

    Science.gov (United States)

    Porn, Alex Mateus; Peres, Leticia Mara; Didonet Del Fabro, Marcos

    2015-01-01

    ADL is a formal language for expressing archetypes, independent of standards or domain. However, its specification is not precise enough with respect to the specialization and semantics of archetypes, which leads to implementation difficulties and few available tools. Archetypes may instead be implemented using other languages such as XML or OWL, increasing integration with Semantic Web tools. Exchanging and transforming data can be better implemented with semantics-oriented models, for example OWL, a W3C language for defining and instantiating Web ontologies. OWL permits the user to define significant, detailed, precise and consistent distinctions among classes, properties and relations, ensuring greater consistency of knowledge than ADL techniques. This paper presents a process for representing openEHR ADL archetypes in OWL ontologies. The process consists of converting ADL archetypes into OWL ontologies and validating the resulting ontologies using mutation testing.

  20. Automated System of Study Nonlinear Processes in Electro-vacuum Devices with Open Resonant Periodic Structures

    Directory of Open Access Journals (Sweden)

    G.S. Vorobyov

    2014-04-01

    Full Text Available The article describes the experimental equipment and the results of investigations of nonlinear processes occurring during the excitation of electromagnetic oscillations in resonant electron beam devices such as the orotron, a generator of diffraction radiation. These devices now find wide application in physics and microwave technology. The experimental technique is based on the use of universal electro-vacuum equipment, a diffraction radiation analyzer, and a microprocessor system for collecting and processing data. Experimental results are presented for the energy and frequency characteristics of the most common modes of oscillation excitation in open resonant systems such as the orotron. Recommendations are given for implementing the optimum oscillation excitation modes in such devices.

  1. Calibration process of highly parameterized semi-distributed hydrological model

    Science.gov (United States)

    Vidmar, Andrej; Brilly, Mitja

    2017-04-01

    Hydrological phenomena take place in the hydrological system, which is governed by nature, and are essentially stochastic. These phenomena are unique, non-recurring, and changeable across space and time. Since any river basin, with its own natural characteristics, and any hydrological event therein are unique, modelling them is a complex process that is not researched enough. Calibration is the procedure of determining model parameters that are not known well enough; the input and output variables and the mathematical model expressions are known, while some parameters are unknown and are determined by calibrating the model. Hydrological modelling software is nowadays equipped with sophisticated calibration algorithms that leave the modeller little possibility to manage the process, and the results are often not the best. We therefore developed a procedure for an expert-driven calibration process. We use the HBV-light-CLI hydrological model, which has a command-line interface, and couple it with PEST. PEST is a parameter estimation tool widely used in groundwater modelling that can also be applied to surface waters. A calibration process managed directly by the expert, in proportion to the expert's knowledge, affects the outcome of the inversion procedure and achieves better results than if the procedure were left entirely to the selected optimization algorithm. The first step is to properly define the spatial characteristics and structural design of the semi-distributed model, including all morphological and hydrological features such as karstic, alluvial and forested areas. This step includes and requires geological, meteorological, hydraulic and hydrological knowledge on the part of the modeller. The second step is to set the initial parameter values to their preferred values based on expert knowledge. In this step we also define all parameter and observation groups. Peak data are essential in the calibration process if we are mainly interested in flood events. Each sub-catchment in the model has its own observation group
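
    One way an expert can steer a PEST-style calibration toward flood events is to give the peak-observation group a larger weight in the objective function; the sketch below is a generic illustration of such a weighted sum of squared residuals and does not reproduce the authors' HBV-light/PEST setup (the quantile and weight values are arbitrary assumptions).

```python
import numpy as np

def weighted_sse(observed, simulated, peak_quantile=0.9, peak_weight=5.0):
    """Weighted sum of squared residuals in which discharges above the chosen
    quantile (the 'peak' observation group) get a larger weight."""
    observed = np.asarray(observed, float)
    simulated = np.asarray(simulated, float)
    weights = np.ones_like(observed)
    weights[observed >= np.quantile(observed, peak_quantile)] = peak_weight
    return np.sum(weights * (observed - simulated) ** 2)

# Toy example with a synthetic discharge series and a noisy "simulation"
rng = np.random.default_rng(0)
obs = np.abs(rng.gamma(2.0, 5.0, size=365))
sim = obs * (1 + rng.normal(0, 0.15, size=365))
print("peak-weighted objective:", round(weighted_sse(obs, sim), 1))
```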

  2. Laser scanner data processing and 3D modeling using a free and open source software

    Energy Technology Data Exchange (ETDEWEB)

    Gabriele, Fatuzzo [Dept. of Industrial and Mechanical Engineering, University of Catania (Italy); Michele, Mangiameli, E-mail: amichele.mangiameli@dica.unict.it; Giuseppe, Mussumeci; Salvatore, Zito [Dept. of Civil Engineering and Architecture, University of Catania (Italy)

    2015-03-10

    Laser scanning is a technology that makes it possible to survey the geometry of objects in a short time with a high level of detail and completeness, based on the signal emitted by the laser and the corresponding return signal. When the incident laser radiation hits the object to be surveyed, the radiation is reflected. The purpose is to build a three-dimensional digital model that allows the object to be reconstructed and studies to be conducted regarding its design, restoration and/or conservation. When the laser scanner is equipped with a digital camera, the result of the measurement process is a set of points in XYZ coordinates with high density and accuracy together with radiometric and RGB values. In this case, the set of measured points is called a "point cloud" and allows the reconstruction of the Digital Surface Model. Post-processing is usually performed with closed-source software whose copyright restricts free use, whereas free and open source software can improve performance considerably; indeed, the latter can be used freely and provides the possibility to inspect and even customize the source code. The experience started at the Faculty of Engineering in Catania is aimed at finding a valuable free and open source tool, MeshLab (an Italian software package for data processing), to be compared with a reference closed-source software package for data processing, RapidForm. In this work, we compare the results obtained with MeshLab and RapidForm through the planning of the survey and the acquisition of the point cloud of a morphologically complex statue.

  3. Classification of bacterial contamination using image processing and distributed computing.

    Science.gov (United States)

    Ahmed, W M; Bayraktar, B; Bhunia, A; Hirleman, E D; Robinson, J P; Rajwa, B

    2013-01-01

    Disease outbreaks due to contaminated food are a major concern not only for the food-processing industry but also for the public at large. Techniques for automated detection and classification of microorganisms can be a great help in preventing outbreaks and maintaining the safety of the nation's food supply. Identification and classification of foodborne pathogens using colony scatter patterns is a promising new label-free technique that utilizes image-analysis and machine-learning tools. However, the feature-extraction tools employed for this approach are computationally complex, and choosing the right combination of scatter-related features requires extensive testing with different feature combinations. In the presented work we used computer clusters to speed up the feature-extraction process, which enabled us to analyze the contribution of different scatter-based features to the overall classification accuracy. A set of 1000 scatter patterns representing ten different bacterial strains was used. Zernike and Chebyshev moments as well as Haralick texture features were computed from the available light-scatter patterns. The most promising features were first selected using Fisher's discriminant analysis, and subsequently a support-vector-machine (SVM) classifier with a linear kernel was used. With extensive testing we were able to identify a small subset of features that produced the desired results in terms of classification accuracy and execution speed. The use of distributed computing for scatter-pattern analysis, feature extraction, and selection provides a feasible mechanism for large-scale deployment of a light-scatter-based approach to bacterial classification.
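
    A compact sketch of the classification stage described above, using scikit-learn with an ANOVA F-score filter as a stand-in for the Fisher-discriminant feature ranking and a linear-kernel SVM; the feature matrix is random placeholder data rather than real Zernike, Chebyshev or Haralick features.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data: 1000 scatter patterns x 200 features, 10 bacterial strains;
# real features would come from the image-analysis step described in the abstract.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 200))
y = rng.integers(0, 10, size=1000)

clf = make_pipeline(
    StandardScaler(),
    SelectKBest(f_classif, k=50),     # keep the 50 most discriminative features
    SVC(kernel="linear"),
)
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy:", scores.mean().round(3))
```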

  4. Open-phase operating modes of power flow control topologies in a Smart Grid Distribution Network

    Science.gov (United States)

    Astashev, M. G.; Novikov, M. A.; Panfilov, D. I.; Rashitov, P. A.; Remizevich, T. V.; Fedorova, M. I.

    2015-12-01

    A power flow regulating circuit node in an alternating current system is reviewed. The circuit node is based on a thyristor controlled phase angle regulator (TCPAR) with a controlled thyristor switch. Results of research on the individual phase control of the output voltage of the TCPAR are presented. Analytical expressions for calculating the overvoltage factor in the thyristor switch circuit in open-phase operating modes are derived. Based on the evaluation of overvoltage in operational and emergency modes, the conditions under which individual phase control of the output voltage can be implemented are determined. Under these conditions, maximal performance and complete controllability are provided.

  5. Numerical model describing optimization of fibres winding process on open and closed frame

    Science.gov (United States)

    Petrů, M.; Mlýnek, J.; Martinec, T.

    2016-08-01

    This article discusses a numerical model describing the optimization of the fibre winding process on open and closed frames. The production quality of this type of composite frame depends primarily on the correct winding of the fibres on a polyurethane core. In particular, the correct fibre winding angles on the polyurethane core and the homogeneity of the individual winding layers must be ensured. The article describes a mathematical model for using an industrial robot in filament winding and shows how to calculate the robot trajectory. The polyurethane core is fastened to the robot end-effector and, during the winding process, passes through a fibre-processing head along a suitably determined end-effector trajectory. We use the described numerical model and matrix calculus to compute the end-effector trajectory that produces the desired passage of the frame through the fibre-processing head. The calculation of the trajectory was programmed in the Delphi development environment. The relations of the numerical model are essential for solving the real passage of a polyurethane core through the fibre-processing head.
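
    A minimal sketch of the kind of matrix calculus involved: a 4x4 homogeneous transform describing an end-effector pose is applied to a point of the core so that a chosen winding point is brought to a fixed fibre-processing head at the origin; the pose and the point are hypothetical and the example is not taken from the paper's Delphi implementation.

```python
import numpy as np

def pose_matrix(rotation_z_deg, translation):
    """Homogeneous 4x4 transform: rotation about z followed by a translation."""
    a = np.radians(rotation_z_deg)
    T = np.eye(4)
    T[:3, :3] = [[np.cos(a), -np.sin(a), 0],
                 [np.sin(a),  np.cos(a), 0],
                 [0,          0,         1]]
    T[:3, 3] = translation
    return T

# Hypothetical winding point on the core (core/end-effector coordinates, metres)
point_core = np.array([0.05, 0.00, 0.10, 1.0])

# End-effector pose at one step of the trajectory: rotate the core by 30 degrees and
# translate it so that the winding point sits at the fibre-processing head at the origin
T = pose_matrix(30.0, translation=[0.0, 0.0, 0.0])
T[:3, 3] = -(T[:3, :3] @ point_core[:3])     # translation that maps the point to the head

point_world = T @ point_core
print("winding point in world frame:", np.round(point_world[:3], 6))  # ~ (0, 0, 0)
```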

  6. Distributed, Modular, Network Enabled Architecture For Process Control Military Applications

    Directory of Open Access Journals (Sweden)

    Abhijit Kamble*,

    2014-02-01

    Full Text Available In the process control world, the use of a distributed modular embedded controller architecture drastically reduces the amount and complexity of cabling and at the same time increases system computing performance and real-time response compared to a centralized control system. We propose a design based on the ARM Cortex-M4 hardware architecture and Cortex Microcontroller Software Interface Standard (CMSIS) based software development. The ARM Cortex-M series ensures a compatible target processor and provides common core peripherals, whereas the CMSIS abstraction layer reduces development time, supports software reusability and provides a seamless application software interface for controllers. Being a custom design, features such as Built-In Test Equipment (BITE), single-point fault tolerance, redundancy, and 2-out-of-3 voting logic, which are desirable for military applications, can be built in. This paper describes the design of a generic embedded hardware module that can be configured as a local I/O controller, an application controller, or a Man Machine Interface (MMI). It also proposes a philosophy for step-by-step hardware and software development.

  7. Spatially distributed fiber sensor with dual processed outputs

    Science.gov (United States)

    Xu, X.; Spillman, William B., Jr.; Claus, Richard O.; Meissner, K. E.; Chen, K.

    2005-05-01

    Given the rapid aging of the world's population, improvements in technology for the automation of patient care and documentation are badly needed. We have previously demonstrated a 'smart bed' that can non-intrusively monitor a patient in bed and determine the patient's respiration, heart rate and movement without intrusive or restrictive medical measurements. This is an application of spatially distributed integrating fiber optic sensors. The basic concept is that any patient movement that also moves an optical fiber within a specified area will produce a change in the optical signal. Two modal modulation approaches were considered: a statistical mode (STM) sensor and a high order mode excitation (HOME) sensor. The present design combines an STM sensor with a HOME sensor, using both modal modulation approaches. A special lens system allows only the high order modes of the optical fiber to be excited and coupled into the sensor. For handling the output of the dual STM-HOME sensor, computer processing methods are discussed that offer comprehensive perturbation analysis for more reliable patient monitoring.

  8. Grain-size distribution in suspension over a sand-gravel bed in open channel flow

    Institute of Scientific and Technical Information of China (English)

    Koeli GHOSHAL; Debasish PAL

    2014-01-01

    Grain-size distributions of suspended load over a sand-gravel bed at two different flow velocities were studied in a laboratory flume. The experiments were performed to study the influence of flow velocity and suspension height on the grain-size distribution in suspension over a sand-gravel bed. The experimental findings show that with an increase of flow velocity, the grain-size distribution of the suspended load changed from a skewed form to a bimodal one at higher suspension heights. This study focuses on the determination of the parameter βn, the ratio of the sediment diffusion coefficient to the momentum diffusion coefficient for the n-th grain size. A new relationship is proposed involving βn, the normalized settling velocity of the sediment particles and the suspension height, which is applicable over the widest range of normalized settling velocity available in the literature so far. A similar parameter β for calculating the total suspension concentration is also developed. The classical Rouse equation is modified with βn and β and used to compute the grain-size distribution and the total concentration in suspension, respectively. The computed values show good agreement with the measured experimental data.
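
    As a reference for how β enters such computations, the sketch below evaluates the classical Rouse concentration profile with a β correction in the Rouse number; the flow parameters are hypothetical and the formula is the standard Rouse equation rather than the authors' modified form.

```python
import numpy as np

def rouse_profile(y, h, a, c_a, w_s, u_star, beta, kappa=0.41):
    """Classical Rouse suspended-sediment concentration profile.

    C(y)/C_a = [((h - y)/y) * (a/(h - a))] ** Z  with  Z = w_s / (beta * kappa * u_star),
    where beta is the ratio of sediment to momentum diffusion coefficients."""
    z = w_s / (beta * kappa * u_star)
    return c_a * (((h - y) / y) * (a / (h - a))) ** z

# Hypothetical flow: depth 0.1 m, reference level a = 0.005 m, reference concentration 1.0
y = np.linspace(0.005, 0.095, 10)
for beta in (0.5, 1.0, 2.0):
    c = rouse_profile(y, h=0.1, a=0.005, c_a=1.0, w_s=0.01, u_star=0.05, beta=beta)
    print(f"beta = {beta}: near-surface C/Ca = {c[-1]:.3f}")
```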

  9. Unity and disunity in evolutionary sciences: process-based analogies open common research avenues for biology and linguistics.

    Science.gov (United States)

    List, Johann-Mattis; Pathmanathan, Jananan Sylvestre; Lopez, Philippe; Bapteste, Eric

    2016-08-20

    For a long time biologists and linguists have been noticing surprising similarities between the evolution of life forms and languages. Most of the proposed analogies have been rejected. Some, however, have persisted, and some even turned out to be fruitful, inspiring the transfer of methods and models between biology and linguistics up to today. Most proposed analogies were based on a comparison of the research objects rather than the processes that shaped their evolution. Focusing on process-based analogies, however, has the advantage of minimizing the risk of overstating similarities, while at the same time reflecting the common strategy to use processes to explain the evolution of complexity in both fields. We compared important evolutionary processes in biology and linguistics and identified processes specific to only one of the two disciplines as well as processes which seem to be analogous, potentially reflecting core evolutionary processes. These new process-based analogies support novel methodological transfer, expanding the application range of biological methods to the field of historical linguistics. We illustrate this by showing (i) how methods dealing with incomplete lineage sorting offer an introgression-free framework to analyze highly mosaic word distributions across languages; (ii) how sequence similarity networks can be used to identify composite and borrowed words across different languages; (iii) how research on partial homology can inspire new methods and models in both fields; and (iv) how constructive neutral evolution provides an original framework for analyzing convergent evolution in languages resulting from common descent (Sapir's drift). Apart from new analogies between evolutionary processes, we also identified processes which are specific to either biology or linguistics. This shows that general evolution cannot be studied from within one discipline alone. In order to get a full picture of evolution, biologists and linguists need to

  10. Wigner distribution function and entropy of the damped harmonic oscillator within the theory of the open quantum systems

    Science.gov (United States)

    Isar, Aurelian

    1995-01-01

    The harmonic oscillator with dissipation is studied within the framework of the Lindblad theory for open quantum systems. By using the Wang-Uhlenbeck method, the Fokker-Planck equation, obtained from the master equation for the density operator, is solved for the Wigner distribution function, subject to either the Gaussian type or the delta-function type of initial conditions. The obtained Wigner functions are two-dimensional Gaussians with different widths. Then a closed expression for the density operator is extracted. The entropy of the system is subsequently calculated and its temporal behavior shows that this quantity relaxes to its equilibrium value.
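
    As background, a Gaussian Wigner function of the kind obtained in such calculations has the generic bivariate form below (a textbook expression with time-dependent means and variances, not the paper's specific solution):

    \[
    W(q,p) = \frac{1}{2\pi\sqrt{\sigma_{qq}\sigma_{pp}-\sigma_{qp}^{2}}}\,
    \exp\!\left[-\frac{\sigma_{pp}(q-\langle q\rangle)^{2} - 2\sigma_{qp}(q-\langle q\rangle)(p-\langle p\rangle) + \sigma_{qq}(p-\langle p\rangle)^{2}}{2\,(\sigma_{qq}\sigma_{pp}-\sigma_{qp}^{2})}\right],
    \]

    where the variances σ_qq, σ_pp and the covariance σ_qp evolve in time according to the dissipative dynamics.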

  11. Application of Photoshop and OpenCV in Teaching of the Digital Image Processing Course

    Institute of Scientific and Technical Information of China (English)

    林忠

    2015-01-01

    In view of the actual situation of the demonstration and experiment platforms for the digital image processing course in computer science programs, the rationality and advantages of applying PhotoShop and OpenCV in teaching the course are analyzed in detail, and PhotoShop and OpenCV are proposed as the demonstration and experiment tools for the course. PhotoShop can be used in the theoretical lectures to show the effect of various image processing algorithms, while OpenCV-based programming in the laboratory sessions enables students to master the implementation of image processing methods and gain a deeper understanding of the underlying ideas.
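
    As a rough illustration of the kind of laboratory exercise described above, the sketch below uses OpenCV's Python bindings to smooth an image and extract edges (the file name and parameter values are assumptions for illustration, not material from the paper):

        # Minimal OpenCV exercise: load an image, smooth it, and detect edges.
        # "input.png" is a hypothetical file; any image readable by OpenCV works.
        import cv2

        img = cv2.imread("input.png", cv2.IMREAD_GRAYSCALE)
        if img is None:
            raise FileNotFoundError("input.png not found")

        blurred = cv2.GaussianBlur(img, (5, 5), sigmaX=1.4)        # Gaussian smoothing
        edges = cv2.Canny(blurred, threshold1=50, threshold2=150)  # Canny edge map
        cv2.imwrite("edges.png", edges)                            # save the result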

  12. [Mathematical processing of human platelet distribution according to size for determination of cell heterogeneity].

    Science.gov (United States)

    Kosmovskiĭ, S Iu; Vasin, S L; Rozanova, I B; Sevast'ianov, V I

    1999-01-01

    The paper proposes a method for mathematical treatment of the distribution of human platelets by size to detect the heterogeneity of cell populations. Its use allowed the authors to identify three platelet populations that have different parameters of size distribution. The proposed method opens additional possibilities for analyzing the heterogeneity of platelet populations without complicating the experimental techniques.

  13. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    National Research Council Canada - National Science Library

    Kang, Sanggoo; Lee, Kiwon

    2016-01-01

    ... under identical experimental conditions. In this study, the cloud computing environment is built with OpenStack, and four algorithms from the Orfeo toolbox are used for practical geo-based image processing experiments...

  14. Distribution of Scientific Productivity and the Processes of Stratification

    OpenAIRE

    山崎, 博敏

    1982-01-01

    This paper reports that the distribution of productivity of 124 university chemists in Japan shows the best fit to a negative binomial distribution, and then considers the reasons for and the sociological implications of this result. Since the inverse square law of A. J. Lotka (1926), several mathematical models of the distribution of productivity have been proposed by Williams (1944), Simon (1955), Shockley (1957), Price (1963, 1976), Allison (1976) and Rao (1980) et al. The charact...
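
    For reference, the negative binomial distribution mentioned above has the standard probability mass function (textbook form; the paper's fitted parameter values are not reproduced here):

    \[
    P(X = k) = \binom{k + r - 1}{k}\, p^{r}\,(1-p)^{k}, \qquad k = 0, 1, 2, \ldots,
    \]

    with mean r(1-p)/p and variance r(1-p)/p^2, i.e. over-dispersed relative to a Poisson distribution with the same mean.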

  15. Open Computer Forensic Architecture a Way to Process Terabytes of Forensic Disk Images

    Science.gov (United States)

    Vermaas, Oscar; Simons, Joep; Meijer, Rob

    This chapter describes the Open Computer Forensics Architecture (OCFA), an automated system that dissects complex file types, extracts metadata from files and ultimately creates indexes on forensic images of seized computers. It consists of a set of collaborating processes, called modules. Each module is specialized in processing a certain file type. When it receives a so-called 'evidence', the information that has been extracted so far about the file together with the actual data, it either adds new information about the file or uses the file to derive a new 'evidence'. All evidence, original and derived, is sent to a router after being processed by a particular module. The router decides which module should process the evidence next, based upon the metadata associated with the evidence. Thus the OCFA system can recursively process images until, from every compound file, the embedded files, if any, are extracted, all information that the system can derive has been derived, and all extracted text is indexed. Compound files include, but are not limited to, archive and zip files, disk images, text documents of various formats and, for example, mailboxes. The output of an OCFA run is a repository full of derived files, a database containing all extracted information about the files and an index which can be used when searching. This is presented in a web interface. Moreover, processed data is easily fed to third-party software for further analysis or to be used in data mining or text mining tools. The main advantage of the OCFA system is scalability: it is able to process large amounts of data.
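
    As a rough sketch of the module/router pattern described above (a hypothetical illustration, not OCFA's actual code; the module names and metadata keys are invented):

        # Hypothetical router dispatching 'evidence' records to type-specific modules.
        from dataclasses import dataclass, field

        @dataclass
        class Evidence:
            data: bytes
            metadata: dict = field(default_factory=dict)  # e.g. {"mime": "application/zip"}

        def zip_module(ev):
            # A real module would unpack the archive and emit child evidence items;
            # here we only mark the evidence as handled.
            ev.metadata["processed_by"] = "zip_module"
            return [ev]

        def text_module(ev):
            ev.metadata["processed_by"] = "text_module"
            return [ev]

        ROUTES = {"application/zip": zip_module, "text/plain": text_module}

        def router(queue):
            """Route evidence to modules until nothing more can be derived."""
            done = []
            while queue:
                ev = queue.pop()
                handler = ROUTES.get(ev.metadata.get("mime"))
                if handler is None or "processed_by" in ev.metadata:
                    done.append(ev)            # nothing left to derive; index/store it
                else:
                    queue.extend(handler(ev))  # a module may emit new derived evidence
            return done

        print(router([Evidence(b"PK...", {"mime": "application/zip"})]))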

  16. The Effects of Cylinder Head Gasket Opening on Engine Temperature Distribution for a Water-Cooled Engine

    Science.gov (United States)

    Jang, J. Y.; Chi, G. X.

    2017-02-01

    In a liquid-cooled engine, coolant is pumped throughout the water jacket of the engine, drawing heat from the cylinder head, pistons, combustion chambers, cylinder walls, and valves, etc. If the engine temperature is too high or too low, various problems will occur. These include overheating of the lubricating oil and engine parts, excessive stresses between engine parts, loss of power, incomplete burning of fuel, etc. Thus, the engine should be maintained at the proper operating temperature. This study investigated the effects of different cylinder head gasket openings on the engine temperature distributions in a water-cooled motorcycle engine. The numerical predictions for the temperature distribution are in good agreement with the experimental data, to within 20%.

  17. Distributed Prognostic Health Management with Gaussian Process Regression

    Data.gov (United States)

    National Aeronautics and Space Administration — Distributed prognostics architecture design is an enabling step for efficient implementation of health management systems. A major challenge encountered in such...

  18. Amanzi: An Open-Source Multi-process Simulator for Environmental Applications

    Science.gov (United States)

    Moulton, J. D.; Molins, S.; Johnson, J. N.; Coon, E.; Lipnikov, K.; Day, M.; Barker, E.

    2014-12-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments begin with simplified models, and add geometric and geologic complexity as understanding is gained. The Platform toolset (Akuna) generates these conceptual models and Amanzi provides the computational engine to perform the simulations, returning the results for analysis and visualization. In this presentation we highlight key elements of the design, algorithms and implementations used in Amanzi. In particular, the hierarchical and modular design is aligned with the coupled processes being simulated, and naturally supports a wide range of model complexity. This design leverages a dynamic data manager and the synergy of two graphs (one from the high-level perspective of the models, the other from the dependencies of the variables in the model) to enable this flexible model configuration at run time. Moreover, to model sites with complex hydrostratigraphy, as well as engineered systems, we are developing a dual unstructured/structured capability. Recently, these capabilities have been collected in a framework named Arcos, and efforts have begun to improve interoperability between the unstructured and structured AMR approaches in Amanzi. To leverage a range of biogeochemistry capabilities from the community (e.g., CrunchFlow, PFLOTRAN, etc.), a biogeochemistry interface library called Alquimia was developed. To ensure that Amanzi is truly an open-source community code we require a completely open-source tool chain for our development. We will comment on elements of this tool chain, including testing and documentation development tools such as docutils and Sphinx. Finally, we will show simulation results from our phased demonstrations, including the geochemically complex Savannah River F-Area seepage basins.

  19. CAPE-OPEN Integration for Advanced Process Engineering Co-Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Zitney, S.E.

    2006-11-01

    This paper highlights the use of the CAPE-OPEN (CO) standard interfaces in the Advanced Process Engineering Co-Simulator (APECS) developed at the National Energy Technology Laboratory (NETL). The APECS system uses the CO unit operation, thermodynamic, and reaction interfaces to provide its plug-and-play co-simulation capabilities, including the integration of process simulation with computational fluid dynamics (CFD) simulation. APECS also relies heavily on the use of a CO COM/CORBA bridge for running process/CFD co-simulations on multiple operating systems. For process optimization in the face of multiple and sometimes conflicting objectives, APECS offers stochastic modeling and multi-objective optimization capabilities developed to comply with the CO software standard. At NETL, system analysts are applying APECS to a wide variety of advanced power generation systems, ranging from small fuel cell systems to commercial-scale power plants including the coal-fired, gasification-based FutureGen power and hydrogen production plant.

  20. New Neutron-Capture Measurements in 23 Open Clusters. I. The R-Process

    CERN Document Server

    Overbeek, Jamie C; Jacobson, Heather R

    2016-01-01

    Neutron-capture elements, those with Z > 35, are the least well-understood in terms of nucleosynthesis and formation environments. The rapid neutron-capture, or r-process, elements are formed in the environments and/or remnants of massive stars, while the slow neutron-capture, or s-process, elements are primarily formed in low-mass AGB stars. These elements can provide much information about Galactic star formation and enrichment, but observational data is limited. We have assembled a sample of 68 stars in 23 open clusters that we use to probe abundance trends for six neutron-capture elements (Eu, Gd, Dy, Mo, Pr, and Nd) with cluster age and location in the disk of the Galaxy. In order to keep our analysis as homogenous as possible, we use an automated synthesis fitting program, which also enables us to measure multiple (3-10) lines for each element. We find that the pure r-process elements (Eu, Gd, and Dy) have positive trends with increasing cluster age, while the mixed r- and s- process elements (Mo, Pr, a...

  1. POSIX and Object Distributed Storage Systems Performance Comparison Studies With Real-Life Scenarios in an Experimental Data Taking Context Leveraging OpenStack Swift & Ceph

    Science.gov (United States)

    Poat, M. D.; Lauret, J.; Betts, W.

    2015-12-01

    The STAR online computing infrastructure has become an intensive dynamic system used for first-hand data collection and analysis, resulting in a dense collection of data output. As we have transitioned to our current state, inefficient, limited storage systems have become an impediment to fast feedback to online shift crews. A centrally accessible, scalable and redundant distributed storage system had become a necessity in this environment. OpenStack Swift Object Storage and Ceph Object Storage are two eye-opening technologies, as community use and development have led to success elsewhere. In this contribution, OpenStack Swift and Ceph have been put to the test with single and parallel I/O tests, emulating real-world scenarios for data processing and workflows. The Ceph file system storage, offering a POSIX-compliant file system mounted similarly to an NFS share, was of particular interest as it aligned with our requirements and was retained as our solution. I/O performance tests were run against the Ceph POSIX file system and have presented surprising results indicating true potential for fast I/O and reliability. STAR's online compute farm has historically been used for job submission and first-hand data analysis. Reusing the online compute farm to maintain a storage cluster alongside job submission will be an efficient use of the current infrastructure.
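
    A minimal sketch of the kind of single-stream POSIX I/O test described above is given below (the mount point, file size and chunk size are assumptions for illustration; this is not the STAR benchmark code):

        # Hypothetical write/read throughput test against a POSIX mount (e.g. a CephFS mount).
        import os
        import time

        PATH = "/mnt/cephfs/iotest.bin"   # assumed mount point
        SIZE_MB = 256
        CHUNK = b"\0" * (1024 * 1024)     # 1 MiB per write

        t0 = time.time()
        with open(PATH, "wb") as f:
            for _ in range(SIZE_MB):
                f.write(CHUNK)
            f.flush()
            os.fsync(f.fileno())          # force data out to the storage backend
        write_rate = SIZE_MB / (time.time() - t0)

        t0 = time.time()
        with open(PATH, "rb") as f:
            while f.read(1024 * 1024):
                pass
        read_rate = SIZE_MB / (time.time() - t0)

        print(f"write: {write_rate:.1f} MiB/s, read: {read_rate:.1f} MiB/s")
        os.remove(PATH)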

  2. WASS: an open-source stereo processing pipeline for sea waves 3D reconstruction

    Science.gov (United States)

    Bergamasco, Filippo; Benetazzo, Alvise; Torsello, Andrea; Barbariol, Francesco; Carniel, Sandro; Sclavo, Mauro

    2017-04-01

    Stereo 3D reconstruction of ocean waves is gaining more and more popularity in the oceanographic community. In fact, recent advances in both computer vision algorithms and CPU processing power now allow the study of spatio-temporal wave fields with unprecedented accuracy, especially at small scales. Even if simple in theory, multiple details are difficult for a practitioner to master, so that the implementation of a 3D reconstruction pipeline is in general considered a complex task. For instance, camera calibration, reliable stereo feature matching and mean sea-plane estimation are all factors for which a well-designed implementation can make the difference to obtain valuable results. For this reason, we believe that the open availability of a well-tested software package that automates the steps from stereo images to a 3D point cloud would be a valuable addition for future research in this area. We present WASS, a completely open-source stereo processing pipeline for sea waves 3D reconstruction, available at http://www.dais.unive.it/wass/. Our tool completely automates the recovery of dense point clouds from stereo images by providing three main functionalities. First, WASS can automatically recover the extrinsic parameters of the stereo rig (up to scale) so that no delicate calibration has to be performed in the field. Second, WASS implements a fast 3D dense stereo reconstruction procedure so that an accurate 3D point cloud can be computed from each stereo pair. We rely on the well-consolidated OpenCV library both for the image stereo rectification and disparity map recovery. Lastly, a set of 2D and 3D filtering techniques, both on the disparity map and the produced point cloud, are implemented to remove the vast majority of erroneous points that can naturally arise while analyzing the optically complex nature of the water surface (examples are sun glares, large white-capped areas, fog and water aerosol, etc). Developed to be as fast as possible, WASS
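
    To illustrate the dense-stereo step such a pipeline relies on, the sketch below computes a disparity map from an already rectified stereo pair with OpenCV's semi-global block matcher (file names and parameter values are assumptions for illustration, not WASS's actual settings):

        # Hypothetical dense disparity computation on a rectified stereo pair.
        import cv2

        left = cv2.imread("left_rect.png", cv2.IMREAD_GRAYSCALE)
        right = cv2.imread("right_rect.png", cv2.IMREAD_GRAYSCALE)

        # Semi-global block matching; numDisparities must be a multiple of 16.
        matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5,
                                        uniquenessRatio=10, speckleWindowSize=100,
                                        speckleRange=2)
        disparity = matcher.compute(left, right).astype("float32") / 16.0  # fixed point -> pixels

        vis = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX).astype("uint8")
        cv2.imwrite("disparity.png", vis)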

  3. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    Science.gov (United States)

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    The C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture system data. However, few software packages can visualize and modify the full content of a C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis system using standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through a C++ API, bindings for high-level languages (Matlab, Octave, and Python), and a standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X).

  4. Open Source Science: The Gravitational Wave Processing-Enabled Archive for NANOGrav

    Science.gov (United States)

    Brazier, Adam; Cordes, James M.; Dieng, Awa; Ferdman, Robert; Garver-Daniels, Nathaniel; Hawkins, Steven; Hendrick, Justin; Huerta, Eliu; Lam, Michael T.; Lazio, T. Joseph W.; Lynch, Ryan S.; NANOGrav Consortium

    2016-01-01

    The North American Nanohertz Gravitational Wave Observatory (NANOGrav) dataset comprises pulsar timing data and data products from a continuing decades-long campaign of observations and high-precision analysis of over 40 millisecond pulsars conducted with the intent to detect nanohertz gravitational waves. Employing a team of developers, researchers and undergraduates, we have built an open source interface based on iPython/Jupyter notebooks allowing programmatic access and processing of the archived raw data and data products in order to greatly enhance science throughput. This is accomplished by: allowing instant access to the current dataset, subject to proprietary periods; providing an intuitive sandbox environment with a growing standard suite of analysis software to enhance learning opportunities for students in the NANOGrav science areas; and driving the development and sharing of new open source analysis tools. We also provide a separate web visualization interface, primarily developed by undergraduates, that allows the user to perform natural queries for data table construction and download, providing an environment for plotting both primary science and diagnostic data, with the next iteration allowing for real-time analysis tools such as model fitting and high-precision timing.

  5. DENIAL OF SERVICE ATTACK IN DISTRIBUTED WIRELESS NETWORK BY DISTRIBUTED JAMMER NETWORK: A BIRTH-DEATH RANDOM PROCESS ANALYSIS

    Directory of Open Access Journals (Sweden)

    R. Dhanasekaran

    2014-01-01

    Full Text Available Large numbers of low-power, tiny radio jammers constituting a Distributed Jammer Network (DJN) are nowadays used to cause a Denial of Service (DoS) attack on a Distributed Wireless Network (DWN). Using nanotechnologies, it is possible to build huge numbers of tiny jammers, in the millions if not more. DoS attacks on a DWN by a DJN have been studied by considering each of them as a separate Poisson random process. In an integrated approach, in this study we advocate the more natural birth-death random process route to study the impact of a DJN on the connectivity of a DWN. We show that the DJN can cause a phase transition in the performance of the target network. We use the Birth-Death Random Process (BDRP) route for this phase transition to evaluate the impact of the DJN on the connectivity and global percolation of the target network. This study confirms that global percolation of the DWN is assured when the DJN is not sufficiently significant.
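
    For reference, a birth-death random process of the kind invoked above is governed by the standard master equation (textbook form; the paper's specific birth and death rates are not reproduced here):

    \[
    \frac{dP_n(t)}{dt} = \lambda_{n-1}\,P_{n-1}(t) + \mu_{n+1}\,P_{n+1}(t) - (\lambda_n + \mu_n)\,P_n(t),
    \]

    where P_n(t) is the probability of being in state n at time t, and λ_n and μ_n are the state-dependent birth and death rates.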

  6. A model for the distribution channels planning process

    NARCIS (Netherlands)

    Neves, M.F.; Zuurbier, P.; Campomar, M.C.

    2001-01-01

    Research of existing literature reveals some models (sequence of steps) for companies that want to plan distribution channels. None of these models uses strong contributions from transaction cost economics, bringing a possibility to elaborate on a "distribution channels planning model", with these c

  7. Geosfear? - Overcoming System Boundaries by Open-source Based Monitoring of Spatio-temporal Processes

    Science.gov (United States)

    Brandt, T.; Schima, R.; Goblirsch, T.; Paschen, M.; Francyk, B.; Bumberger, J.; Zacharias, S.; Dietrich, P.; Rubin, Y.; Rinke, K.; Fleckenstein, J. H.; Schmidt, C.; Vieweg, M.

    2016-12-01

    The impact of global change, intensive agriculture and complex interactions between humans and the environment shows different effects on different scales. However, the desire to obtain a better understanding of ecosystems and process dynamics in nature accentuates the need for observing these processes at higher temporal and spatial resolutions. Especially with regard to the process dynamics and heterogeneity of rivers and catchment areas, comprehensive monitoring of the ongoing processes and effects remains a challenging issue. What we need are monitoring systems which can collect the most diverse data across different environmental compartments and scales. Today, open-source based electronics and innovative sensors and sensor components offer a promising approach to investigate new possibilities of mobile data acquisition to improve our understanding of the geosphere. To start with, we have designed and implemented a multi-operable, embedded Linux platform for fast integration of different sensors within a single infrastructure. In addition, a GPS module in combination with a GSM transmitter ensures the synchronization and geo-referencing of all data, no matter how old-fashioned the sensors are. To this end, initial field experiments were conducted at a 3rd order stream in the Central German Lowland. Here, we linked in-stream DOC inputs with subsurface metabolism by coupling miniaturized DOC sensor probes with a modified vertical oxygen profiler in situ. Starting from meteorological observations to water quality and subsurface conditions, the overarching goal is the detection of interlinked process dynamics across highly reactive biogeochemical interfaces. Overall, the field experiments demonstrated the feasibility of this emerging technology and its potential towards a cutting-edge strategy based on a holistic and integrated process. Now, we are only a few steps away from realizing adaptive and event-triggered observations close to real

  8. Floral and nesting resources, habitat structure, and fire influence bee distribution across an open-forest gradient.

    Science.gov (United States)

    Grundel, Ralph; Jean, Robert P; Frohnapple, Krystalynn J; Glowacki, Gary A; Scott, Peter E; Pavlovic, Noel B

    2010-09-01

    Given bees' central effect on vegetation communities, it is important to understand how and why bee distributions vary across ecological gradients. We examined how plant community composition, plant diversity, nesting suitability, canopy cover, land use, and fire history affected bee distribution across an open-forest gradient in northwest Indiana, USA, a gradient similar to the historic Midwest United States landscape mosaic. When considered with the other predictors, plant community composition was not a significant predictor of bee community composition. Bee abundance was negatively related to canopy cover and positively to recent fire frequency, bee richness was positively related to plant richness and abundance of potential nesting resources, and bee community composition was significantly related to plant richness, soil characteristics potentially related to nesting suitability, and canopy cover. Thus, bee abundance was predicted by a different set of environmental characteristics than was bee species richness, and bee community composition was predicted, in large part, by a combination of the significant predictors of bee abundance and richness. Differences in bee community composition along the woody vegetation gradient were correlated with relative abundance of oligolectic, or diet specialist, bees. Because oligoleges were rarer than diet generalists and were associated with open habitats, their populations may be especially affected by degradation of open habitats. More habitat-specialist bees were documented for open and forest/scrub habitats than for savanna/woodland habitats, consistent with bees responding to habitats of intermediate woody vegetation density, such as savannas, as ecotones rather than as distinct habitat types. Similarity of bee community composition, similarity of bee abundance, and similarity of bee richness between sites were not significantly related to proximity of sites to each other. Nestedness analysis indicated that species

  9. Distributed Prognostic Health Management with Gaussian Process Regression

    Science.gov (United States)

    Saha, Sankalita; Saha, Bhaskar; Saxena, Abhinav; Goebel, Kai Frank

    2010-01-01

    Distributed prognostics architecture design is an enabling step for efficient implementation of health management systems. A major challenge encountered in such design is the formulation of optimal distributed prognostics algorithms. In this paper, we present a distributed GPR-based prognostics algorithm whose target platform is a wireless sensor network. In addition to the challenges encountered in a distributed implementation, a wireless network poses constraints on communication patterns, thereby making the problem more challenging. The prognostics application used to demonstrate our new algorithms is battery prognostics. In order to present trade-offs within different prognostic approaches, we present a comparison with a distributed implementation of particle-filter-based prognostics on the same battery data.
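
    A minimal sketch of Gaussian process regression applied to a degradation trend, in the spirit of the battery prognostics described above, is shown below (scikit-learn API; the synthetic capacity data, kernel choice and end-of-life threshold are assumptions, not the paper's implementation):

        # Hypothetical GPR fit to a synthetic battery capacity fade curve, with extrapolation.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(0)
        cycles = np.arange(0, 200, 5, dtype=float)                            # observed cycles
        capacity = 1.0 - 0.0015 * cycles + rng.normal(0, 0.005, cycles.size)  # synthetic fade

        kernel = RBF(length_scale=50.0) + WhiteKernel(noise_level=1e-4)
        gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
        gpr.fit(cycles.reshape(-1, 1), capacity)

        future = np.arange(0, 400, 5, dtype=float).reshape(-1, 1)
        mean, std = gpr.predict(future, return_std=True)                      # mean + uncertainty

        # Toy end-of-life estimate: first cycle where predicted capacity drops below 70%.
        below = future[mean < 0.7]
        print("predicted EOL cycle:", below[0, 0] if below.size else "beyond horizon")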

  10. Image processing methods to obtain symmetrical distribution from projection image.

    Science.gov (United States)

    Asano, H; Takenaka, N; Fujii, T; Nakamatsu, E; Tagami, Y; Takeshima, K

    2004-10-01

    Flow visualization and measurement of the cross-sectional liquid distribution is very effective to clarify the effects of obstacles in a conduit on the heat transfer and flow characteristics of gas-liquid two-phase flow. In this study, two methods to obtain the cross-sectional distribution of void fraction are applied to vertical upward air-water two-phase flow. These methods need a projection image from only one direction. Radial distributions of void fraction in a circular tube and in a circular-tube annulus with a spacer were calculated by Abel transform based on the assumption of axial symmetry. On the other hand, cross-sectional distributions of void fraction in a circular tube with a wire coil, whose conduit configuration rotates about the tube central axis periodically, were measured by a CT method based on the assumption that the relative distribution of the liquid phase against the wire is kept along the flow direction.
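
    For reference, the Abel-transform pair underlying the axisymmetric reconstruction mentioned above has the textbook form (not the paper's specific discretization):

    \[
    F(y) = 2\int_{y}^{R}\frac{f(r)\,r}{\sqrt{r^{2}-y^{2}}}\,dr, \qquad
    f(r) = -\frac{1}{\pi}\int_{r}^{R}\frac{dF/dy}{\sqrt{y^{2}-r^{2}}}\,dy,
    \]

    where f(r) is the radial distribution (here the void fraction), F(y) is the measured line-of-sight projection, and R is the tube radius.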

  11. Development Process Patterns for Distributed Onshore/Offshore Software Projects

    Directory of Open Access Journals (Sweden)

    Ravinder Singh

    2014-07-01

    Full Text Available The globalisation of the commercial world and the use of distributed working practices (offshore/onshore/near-shore) have increased dramatically with the improvement of information and communication technologies. Many organisations, especially those that operate within knowledge-intensive industries, have turned to distributed work arrangements to facilitate information exchange and provide competitive advantage in terms of cost and quicker delivery of solutions. The information and communication technologies (ICT) must be able to provide services similar to face-to-face conditions. Additional organisational functions must be enhanced to overcome the shortcomings of ICT and also to compensate for time gaps, cultural differences, and distributed team work. Our proposed model identifies four key work models or patterns that affect the operation of distributed work arrangements, and we also propose guidelines for managing distributed work efficiently and effectively.

  12. A Scalable Infrastructure for Lidar Topography Data Distribution, Processing, and Discovery

    Science.gov (United States)

    Crosby, C. J.; Nandigam, V.; Krishnan, S.; Phan, M.; Cowart, C. A.; Arrowsmith, R.; Baru, C.

    2010-12-01

    High-resolution topography data acquired with lidar (light detection and ranging) technology have emerged as a fundamental tool in the Earth sciences, and are also being widely utilized for ecological, planning, engineering, and environmental applications. Collected from airborne, terrestrial, and space-based platforms, these data are revolutionary because they permit analysis of geologic and biologic processes at resolutions essential for their appropriate representation. Public domain lidar data collected by federal, state, and local agencies are a valuable resource to the scientific community; however, the data pose significant distribution challenges because of the volume and complexity of data that must be stored, managed, and processed. Lidar data acquisition may generate terabytes of data in the form of point clouds, digital elevation models (DEMs), and derivative products. This massive volume of data is often challenging to host for resource-limited agencies. Furthermore, these data can be technically challenging for users who lack appropriate software, computing resources, and expertise. The National Science Foundation-funded OpenTopography Facility (www.opentopography.org) has developed a cyberinfrastructure-based solution to enable online access to Earth science-oriented high-resolution lidar topography data, online processing tools, and derivative products. OpenTopography provides access to terabytes of point cloud data, standard DEMs, and Google Earth image data, all co-located with computational resources for on-demand data processing. The OpenTopography portal is built upon a cyberinfrastructure platform that utilizes a Services Oriented Architecture (SOA) to provide a modular system that is highly scalable and flexible enough to support the growing needs of the Earth science lidar community. OpenTopography strives to host and provide access to datasets as soon as they become available, and also to expose greater application level functionalities to

  13. An open-source genetic algorithm for determining optimal seed distributions for low-dose-rate prostate brachytherapy.

    Science.gov (United States)

    McGeachy, P; Madamesila, J; Beauchamp, A; Khan, R

    2015-01-01

    An open source optimizer that generates seed distributions for low-dose-rate prostate brachytherapy was designed, tested, and validated. The optimizer was a simple genetic algorithm (SGA) that, given a set of prostate and urethra contours, determines the optimal seed distribution in terms of coverage of the prostate with the prescribed dose while avoiding hotspots within the urethra. The algorithm was validated in a retrospective study on 45 previously contoured low-dose-rate prostate brachytherapy patients. Dosimetric indices were evaluated to ensure solutions adhered to clinical standards. The SGA performance was further benchmarked by comparing solutions obtained from a commercial optimizer (inverse planning simulated annealing [IPSA]) with the same cohort of 45 patients. Clinically acceptable target coverage by the prescribed dose (V100) was obtained for both SGA and IPSA, with a mean ± standard deviation of 98 ± 2% and 99.5 ± 0.5%, respectively. For the prostate D90, SGA and IPSA yielded 177 ± 8 Gy and 186 ± 7 Gy, respectively, which were both clinically acceptable. Both algorithms yielded a reasonable dose to the rectum. In summary, an open source SGA was validated that provides a research tool for the brachytherapy community. Copyright © 2015 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
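
    As a generic illustration of the simple genetic algorithm machinery described above (a hypothetical sketch with an invented fitness function; it is not the authors' optimizer and encodes no dosimetry):

        # Hypothetical simple genetic algorithm over binary "seed placement" vectors.
        # The fitness function is a stand-in; a real planner would score target coverage
        # by the prescribed dose while penalizing urethral hotspots.
        import random

        N_POSITIONS = 60       # candidate seed positions (assumed)
        POP_SIZE = 40
        GENERATIONS = 200
        MUTATION_RATE = 0.02

        def fitness(individual):
            n_seeds = sum(individual)
            adjacency = sum(a and b for a, b in zip(individual, individual[1:]))
            return -abs(n_seeds - 25) - 0.5 * adjacency   # toy objective

        def crossover(a, b):
            cut = random.randrange(1, N_POSITIONS)        # single-point crossover
            return a[:cut] + b[cut:]

        def mutate(ind):
            return [1 - g if random.random() < MUTATION_RATE else g for g in ind]

        population = [[random.randint(0, 1) for _ in range(N_POSITIONS)]
                      for _ in range(POP_SIZE)]
        for _ in range(GENERATIONS):
            population.sort(key=fitness, reverse=True)
            parents = population[:POP_SIZE // 2]          # truncation selection
            children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                        for _ in range(POP_SIZE - len(parents))]
            population = parents + children

        best = max(population, key=fitness)
        print("best fitness:", fitness(best), "seeds used:", sum(best))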

  14. OPEN SOURCE IMAGE-PROCESSING TOOLS FOR LOW-COST UAV-BASED LANDSLIDE INVESTIGATIONS

    Directory of Open Access Journals (Sweden)

    U. Niethammer

    2012-09-01

    Full Text Available In recent years, the application of unmanned aerial vehicles (UAVs) has become more common and the availability of lightweight digital cameras has enabled UAV systems to represent affordable and practical remote sensing platforms, allowing flexible and high-resolution remote sensing investigations. In the course of numerous UAV-based remote sensing campaigns, significant numbers of airborne photographs of two different landslides have been acquired. These images were used for ortho-mosaic and digital terrain model (DTM) generation, thus allowing for high-resolution landslide monitoring. Several new open source image- and DTM-processing tools now provide a complete remote sensing working cycle without the use of any commercial hardware or software.

  15. DBSproc: An open source process for DBS electrode localization and tractographic analysis.

    Science.gov (United States)

    Lauro, Peter M; Vanegas-Arroyave, Nora; Huang, Ling; Taylor, Paul A; Zaghloul, Kareem A; Lungu, Codrin; Saad, Ziad S; Horovitz, Silvina G

    2016-01-01

    Deep brain stimulation (DBS) is an effective surgical treatment for movement disorders. Although stimulation sites for movement disorders such as Parkinson's disease are established, the therapeutic mechanisms of DBS remain controversial. Recent research suggests that specific white-matter tract and circuit activation mediates symptom relief. To investigate these questions, we have developed a patient-specific open-source software pipeline called 'DBSproc' for (1) localizing DBS electrodes and contacts from postoperative CT images, (2) processing structural and diffusion MRI data, (3) registering all images to a common space, (4) estimating DBS activation volume from patient-specific voltage and impedance, and (5) understanding the DBS contact-brain connectivity through probabilistic tractography. In this paper, we explain our methodology and provide validation with anatomical and tractographic data. This method can be used to help investigate mechanisms of action of DBS, inform surgical and clinical assessments, and define new therapeutic targets.

  16. Herschel Interactive Processing Environment (HIPE): Open to the World and the Future

    Science.gov (United States)

    Balm, P.

    2012-09-01

    Herschel is ESA's space-based infrared observatory. It was launched on May 14, 2009 and is in routine science operations. The Herschel Interactive Processing Environment, HIPE, is Herschel's interactive analysis package. HIPE has a user base of approximately 1,000 users and a major new version is released twice a year. HIPE is the first open-source astronomy data analysis package written entirely in Java and Jython, which allows it to provide a modern GUI with command echoing, sophisticated interoperability and extensibility, and access to the vast ecosystem of Java libraries. HIPE includes the official data reduction scripts and allows executing and modifying them as needed. These aspects may make HIPE the seed for the astronomy working environment of the future.

  17. AIRSAR Automated Web-based Data Processing and Distribution System

    Science.gov (United States)

    Chu, Anhua; vanZyl, Jakob; Kim, Yunjin; Lou, Yunling; Imel, David; Tung, Wayne; Chapman, Bruce; Durden, Stephen

    2005-01-01

    In this paper, we present an integrated, end-to-end synthetic aperture radar (SAR) processing system that accepts data processing requests, submits processing jobs, performs quality analysis, delivers and archives processed data. This fully automated SAR processing system utilizes database and internet/intranet web technologies to allow external users to browse and submit data processing requests and receive processed data. It is a cost-effective way to manage a robust SAR processing and archival system. The integration of these functions has reduced operator errors and increased processor throughput dramatically.

  18. Quantifying atmospheric processing of mineral dust as a source of bioavailable phosphorus to the open oceans

    Science.gov (United States)

    Herbert, Ross; Stockdale, Anthony; Carslaw, Ken; Krom, Michael

    2016-04-01

    The transport and deposition of mineral dust is known to be the dominant source of phosphorus (P) to the surface waters of the open oceans. However, the fraction of this P that is deemed available for primary productivity remains a key uncertainty due to a limited understanding of the processes occurring during transport of the dust. Through a series of detailed laboratory experiments using desert dust and dust precursors, we show that the dissolution behaviour of P in these samples is controlled by a surface-bound labile pool, and an additional mineral pool primarily consisting of apatite. The acid dissolution of the apatite occurs rapidly and is controlled by the absolute number of H+ ions present in the solution surrounding the dust. Using these results we develop a new conceptual model that reproduces the major processes controlling P dissolution in the atmosphere. We then use a global aerosol microphysics model with a global soil database to quantify the deposition of bioavailable P to the open oceans and ice sheets. We show that, globally, the labile pool contributes 2.4 Gg P a-1 to the oceans and, from a potential pool of 11.5 Gg P a-1, the dissolved apatite pool contributes 0.24 Gg P a-1. A series of sensitivity studies identifying sources of acid in the atmosphere show that anthropogenic emissions of SO2 contribute 61% of the global mass of dissolved apatite, volcanic events contribute 11%, and DMS emissions contribute 10%. Finally, we show that the fraction of mineral dust P that is available for primary productivity varies regionally, reaching up to 50% in the South Pacific Ocean; this explains the variability in the fraction of bioavailable P commonly observed in important oceanic regions.

  19. Acceleration of the OpenFOAM-based MHD solver using graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    He, Qingyun; Chen, Hongli, E-mail: hlchen1@ustc.edu.cn; Feng, Jingchao

    2015-12-15

    Highlights: • A 3D PISO-MHD solver was implemented on Kepler-class graphics processing units (GPUs) using CUDA technology. • A consistent and conservative scheme is used in the code, which was validated by three basic benchmarks in rectangular and round ducts. • Parallelized CPU and GPU acceleration were compared relative to a single-core CPU for MHD and non-MHD problems. • Different preconditioners for the MHD solver were compared and the results showed that the AMG method is better for these calculations. - Abstract: The pressure-implicit with splitting of operators (PISO) magnetohydrodynamics (MHD) solver for the coupled Navier–Stokes and Maxwell equations was implemented on Kepler-class graphics processing units (GPUs) using the CUDA technology. The solver is developed on the open source code OpenFOAM based on a consistent and conservative scheme which is suitable for simulating MHD flow under a strong magnetic field in a fusion liquid metal blanket with structured or unstructured mesh. We verified the validity of the implementation on several standard cases, including benchmark I (the Shercliff and Hunt cases), benchmark II (fully developed circular pipe MHD flow cases) and benchmark III (the KIT experimental case). Computational performance of the GPU implementation was examined by comparing its double precision run times with those of essentially the same algorithms and meshes on the CPU. The results showed that a GPU (GTX 770) can outperform a server-class 4-core, 8-thread CPU (Intel Core i7-4770k) by a factor of 2 at least.

  20. An open and transparent process to select ELIXIR Node Services as implemented by ELIXIR-UK.

    Science.gov (United States)

    Hancock, John M; Game, Alf; Ponting, Chris P; Goble, Carole A

    2016-01-01

    ELIXIR is the European infrastructure established specifically for the sharing and sustainability of life science data. To provide up-to-date resources and services, ELIXIR needs to undergo a continuous process of refreshing the services provided by its national Nodes. Here we present the approach taken by ELIXIR-UK to address the advice by the ELIXIR Scientific Advisory Board that Nodes need to develop "mechanisms to ensure that each Node continues to be representative of the Bioinformatics efforts within the country". ELIXIR-UK put in place an open and transparent process to identify potential ELIXIR resources within the UK during late 2015 and early to mid-2016. Areas of strategic strength were identified and Expressions of Interest in these priority areas were requested from the UK community. A set of criteria was established, in discussion with the ELIXIR Hub, and prospective ELIXIR-UK resources were assessed by an independent committee set up by the Node for this purpose. Of 19 resources considered, 14 were judged to be immediately ready to be included in the UK ELIXIR Node's portfolio. A further five were placed on the Node's roadmap for future consideration for inclusion. ELIXIR-UK expects to repeat this process regularly to ensure its portfolio continues to reflect its community's strengths.

  1. Developing Components of a Distributed Earth System Observatory using Open Source Technology

    Science.gov (United States)

    Braswell, B.; Vorosmarty, C.; Magill, A.; Fekete, B.; Glidden, S.; Justice, D.; Schloss, A.; Proussevitch, A.; Fahnestock, M.

    2006-12-01

    We present recent progress in the development of an Earth System Observatory at the University of New Hampshire. This effort combines existing open source geoinformatics tools, and focuses on custom, lightweight, and simple interfaces for querying and performing data manipulation with mathematical and conditional queries, and displaying results as maps and data tables. We also consider the extent to which scientific data discovery can be performed within WMS-enabled spinning globes. One focus of the presentation is how much effort and time is required for a research group to adapt existing, traditional data infrastructure, which is useful but limited in functionality and in access (e.g. large geographic databases and ftp archives), to expose their data as an interoperable map or data service, yielding maps from large time series datasets as single layer or animations. Our experience focuses on gridded surface climate and weather data, as well as a variety of individual remote sensing data imagery archives from a large number of projects, and benefited from the need for interoperability with existing data and information archives and services at UNH (e.g. WEBSTER and GRANIT).

  2. MATHEMATICAL MODEL OF ELECTROMAGNETIC PROCESSES IN LEHERA LINE AT OPEN-CIRCUIT OPERATION

    Directory of Open Access Journals (Sweden)

    A.V. Chaban

    2016-06-01

    Full Text Available Purpose. The work proposes modeling of transients in a Lehera line using a modified Hamilton-Ostrogradskiy principle. This approach makes it possible to avoid decomposing a single dynamic system, which allows some subtle hidden motions to be taken into account. This is true for systems with distributed parameters, which are considered in the present work. Methodology. Based on our newly developed interdisciplinary method of mathematical modeling of dynamic systems, which rests on the modified Hamilton-Ostrogradskiy principle and its extension to non-conservative dissipative systems, a mathematical model of the Lehera line is built. The model allows the analysis of transient electromagnetic processes in power lines. Results. In this work the model is used to study transients in the Lehera line at open-circuit (no-load) operation. Analysis of the results shows that the proposed approach, and the mathematical model developed on its basis, are appropriate and consistent with the physical principles of the electrodynamics of wave processes in long power lines. The time-space distribution functions of current and voltage are presented in 3D format, which gives the most information about wave processes in the Lehera line under the no-load condition. Originality. The originality of the paper is that the method of finding boundary conditions of the third kind (Poincare conditions) takes into account all differential equations of the electric power system, i.e. finding the boundary conditions at the end of the line involves all equations of the object. This approach enables the analysis of arbitrary electric systems. Practical value. The practical value is that wave processes in lines affect various kinds of electrical devices, and the proper investigation of these wave processes is the theme of the present work.
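
    For background, wave processes on a long two-wire line of this kind are commonly described by the telegrapher's equations (textbook form, not the variational formulation used in the paper):

    \[
    \frac{\partial u}{\partial x} = -L\,\frac{\partial i}{\partial t} - R\,i, \qquad
    \frac{\partial i}{\partial x} = -C\,\frac{\partial u}{\partial t} - G\,u,
    \]

    where u(x,t) and i(x,t) are the voltage and current along the line and R, L, G, C are the per-unit-length resistance, inductance, conductance and capacitance.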

  3. Clinical Laboratory Data Management: A Distributed Data Processing Solution

    OpenAIRE

    Levin, Martin; Morgner, Raymond; Packer, Bernice

    1980-01-01

    Two turn-key systems, one for patient registration and the other for the clinical laboratory have been installed and linked together at the Hospital of the University of Pennsylvania, forming the nucleus of an evolving distributed Hospital Information System.

  4. Distributed collaborative team effectiveness: measurement and process improvement

    Science.gov (United States)

    Wheeler, R.; Hihn, J.; Wilkinson, B.

    2002-01-01

    This paper describes a measurement methodology developed for assessing the readiness, and identifying opportunities for improving the effectiveness, of distributed collaborative design teams preparing to conduct a concurrent design session.

  5. Open data within governmental organisations : Effects, benefits and challenges of the implementation process

    NARCIS (Netherlands)

    Hartog, M. (Martijn); Mulder, A.W. (Bert); Spée, B. (Bart); Visser, E. (Ed); Gribnau, A. (Antoine)

    2014-01-01

    This article describes the growth of open data, open government and the means for transparency and accountability, but aims to reflect on the bottlenecks and actual practicality of opening data to the public domain by two governmental bodies: the Municipality of The Hague and the Province of South-H

  6. Prediction of residence time distributions in food processing machinery

    DEFF Research Database (Denmark)

    Karlson, Torben; Friis, Alan; Szabo, Peter

    1996-01-01

    The velocity field in a co-rotating disc scraped surface heat exchanger (CDHE) is calculated using a finite element method. The residence time distribution for the CDHE is then obtained by tracing particles introduced in the inlet.

  7. Prediction of residence time distributions in food processing machinery

    DEFF Research Database (Denmark)

    Karlson, Torben; Friis, Alan; Szabo, Peter

    1996-01-01

    The velocity field in a co-rotating disc scraped surface heat exchanger (CDHE) is calculated using a finite element method. The residence time distribution for the CDHE is then obtained by tracing particles introduced in the inlet.

  8. Isotope biogeochemical assessment of natural biodegradation processes in open cast pit mining landscapes

    Science.gov (United States)

    Jeschke, Christina; Knöller, Kay; Koschorreck, Matthias; Ussath, Maria; Hoth, Nils

    2014-05-01

    In Germany, a major share of the energy production is based on the burning of lignite from open cast pit mines. The remediation and re-cultivation of the former mining areas in the Lusatian and Central German lignite mining district is an enormous technical and economical challenge. After mine closures, the surrounding landscapes are threatened by acid mine drainage (AMD), i.e. the acidification and mineralization of rising groundwater with metals and inorganic contaminants. The high content of sulfur (sulfuric acid, sulfate), nitrogen (ammonium) and iron compounds (iron-hydroxides) deteriorates the groundwater quality and decelerates sustainable development of tourism in (former) mining landscapes. Natural biodegradation or attenuation (NA) processes of inorganic contaminants are considered to be a technically low impact and an economically beneficial solution. The investigations of the stable isotope compositions of compounds involved in NA processes helps clarify the dynamics of natural degradation and provides specific informations on retention processes of sulfate and nitrogen-compounds in mine dump water, mine dump sediment, and residual pit lakes. In an active mine dump we investigated zones where the process of bacterial sulfate reduction, as one very important NA process, takes place and how NA can be enhanced by injecting reactive substrates. Stable isotopes signatures of sulfur and nitrogen components were examined and evaluated in concert with hydrogeochemical data. In addition, we delineated the sources of ammonium pollution in mine dump sediments and investigated nitrification by 15N-labeling techniques to calculate the limit of the conversion of harmful ammonium to nitrate in residual mining lakes. Ultimately, we provided an isotope biogeochemical assessment of natural attenuation of sulfate and ammonium at mine dump sites and mining lakes. Also, we estimated the risk potential for water in different compartments of the hydrological system. In

  9. Open charm production in double parton scattering processes in the forward kinematics

    Energy Technology Data Exchange (ETDEWEB)

    Blok, B. [Technion - Israel Institute of Technology, Department of Physics, Haifa (Israel); Strikman, M. [Pennsylvania State University, Physics Department, University Park, PA (United States)

    2016-12-15

    We calculate the rate of double open charm production in the forward kinematics studied recently in the LHCb experiment. We find that the mean field approximation for the double parton GPD (generalized parton distributions), which neglects parton-parton correlations, underestimates the rate by a factor of 2. The enhancement due to the perturbative QCD correlation 1 x 2 mechanism, which explains the rate of double parton interactions at central rapidities, is found to explain 60-80% of the discrepancy. We argue that the nonperturbative fluctuations leading to non-factorized (correlated) contributions to the initial conditions for the DGLAP collinear evolution of the double parton GPD play an important role in this kinematics. Combined, the two correlation mechanisms provide a good description of the rate of double charm production reported by the LHCb. We also give predictions for the variation of σ_eff (i.e. the ratio of the double rate and the square of the single inclusive rate) in the discussed kinematics as a function of p_t. Accounting for the two correlation mechanisms strongly reduces the sensitivity of the results to the starting point of the QCD evolution. (orig.)
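
    For reference, the effective cross section quoted above is conventionally defined through the double-parton-scattering "pocket formula" (textbook form, not the paper's corrected expression):

    \[
    \sigma^{\mathrm{DPS}}_{AB} \;=\; \frac{m}{2}\,\frac{\sigma_{A}\,\sigma_{B}}{\sigma_{\mathrm{eff}}},
    \]

    where σ_A and σ_B are the single inclusive cross sections of the two hard subprocesses, m = 1 when they are identical (as for double open charm of the same species) and m = 2 otherwise; parton-parton correlations show up as deviations of σ_eff from a universal constant.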

  10. Distributed Realtime Transaction Processing

    Institute of Scientific and Technical Information of China (English)

    刘云生; 党德鹏

    2001-01-01

    Many realtime applications are inherently distributed. With the development of centralized realtime database systems and the maturing of distributed database systems, distributed realtime database systems have become a new research focus. This paper concentrates on distributed transaction processing, which is the key problem in distributed realtime database systems (DRTDBS), and gives some possible approaches in order to provide guidance for future research.

  11. Exploring the Role of Distributed Learning in Distance Education at Allama Iqbal Open University: Academic Challenges at Postgraduate Level

    Directory of Open Access Journals (Sweden)

    Qadir BUKHSH

    2015-01-01

    Full Text Available Distributed learning is derived from the concept of distributed resources. Different institutions around the globe are connected through a network, and the learners are diverse, located in different cultures and communities. Distributed learning provides global standards of quality to all learners through synchronous and asynchronous communications, offers the opportunity of flexible and independent learning with equity and low-cost educational services, and has become the first choice of dispersed learners around the globe. The present study was undertaken to investigate the challenges faced by the faculty members of the Departments of Business Administration and Computer Science at Allama Iqbal Open University, Islamabad, Pakistan. 25 faculty members were taken as the sample of the study from both departments (100% sampling). The study was qualitative in nature and the interview was the data collection tool. Data were analyzed by the thematic analysis technique. The major challenges faced by the faculty members were: bandwidth, synchronous learning activities, irregularity of the learners, feedback on individual work, designing and managing the learning activities, quality issues, and training to use the network for teaching-learning activities.

  12. A digital open-loop Doppler processing prototype for deep-space navigation

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    A prototype based on digital radio technology with associated open-loop Doppler signal processing techniques has been developed to measure a spacecraft's line-of-sight velocity. The prototype was tested in China's Chang'E-1 lunar mission relying on S-band telemetry signals transmitted by the satellite, with results showing that the residuals had an RMS value of ~3 mm/s (1σ) using 1-sec integration, which is consistent with the Chinese conventional USB (Unified S-Band) tracking system. Such precision is mainly limited by the short-term stability of the atomic (e.g. rubidium) clock at the uplink ground station. It can also be improved with proper calibration to remove some effects of the transmission media (such as solar plasma, troposphere and ionosphere), and a longer integration time (e.g. down to 0.56 mm/s at 34 seconds) allowed by the spacecraft dynamics. The tracking accuracy can also be increased with differential methods that may effectively remove most of the long-term drifts and some of the short-term uncertainties of the uplink atomic clock, thereby further reducing the residuals to the 1 mm/s level. Our experimental tracking data have been used in orbit determination for Chang'E-1, while other applications (such as the upcoming YH-1 Mars orbiter) based on open-loop Doppler tracking will be initiated in the future. Successful application of the prototype to the Chang'E-1 mission in 2008 is believed to have great significance for China's future deep space exploration.
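
    For context, the standard conversion from a measured Doppler shift to line-of-sight velocity is (textbook relation, not the paper's full processing chain):

    \[
    v_{\mathrm{los}} \;\approx\; -\,\frac{c\, f_d}{f_0} \quad\text{(one-way)}, \qquad
    v_{\mathrm{los}} \;\approx\; -\,\frac{c\, f_d}{2 f_0} \quad\text{(two-way)},
    \]

    where f_d is the measured Doppler shift, f_0 the nominal carrier frequency, and c the speed of light; the factor 2 in the two-way case accounts for the shift being accumulated on both the uplink and the downlink.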

  13. Gap formation processes in a high-density plasma opening switch

    Science.gov (United States)

    Grossmann, J. M.; Swanekamp, S. B.; Ottinger, P. F.; Commisso, R. J.; Hinshelwood, D. D.; Weber, B. V.

    1995-01-01

    A gap opening process in plasma opening switches (POS) is examined with the aid of numerical simulations. In these simulations, a high density (n_e = 10^14-5×10^15 cm^-3) uniform plasma initially bridges a small section of the coaxial transmission line of an inductive energy storage generator. A short section of vacuum transmission line connects the POS to a short circuit load. The results presented here extend previous simulations in the n_e = 10^12-10^13 cm^-3 density regime. The simulations show that a two-dimensional (2-D) sheath forms in the plasma near a cathode. This sheath is positively charged, and electrostatic sheath potentials that are large compared to the anode-cathode voltage develop. Initially, the 2-D sheath is located at the generator edge of the plasma. As ions are accelerated out of the sheath, it retains its original 2-D structure, but migrates axially toward the load, creating a magnetically insulated gap in its wake. When the sheath reaches the load edge of the POS, the POS stops conducting current and the load current increases rapidly. At the end of the conduction phase a gap exists in the POS whose size is determined by the radial dimensions of the 2-D sheath. Simulations at various plasma densities and current levels show that the radial size of the gap scales roughly as B/n_e, where B is the magnetic field. The results of this work are discussed in the context of long-conduction-time POS physics, but exhibit the same physical gap formation mechanisms as earlier lower density simulations more relevant to short-conduction-time POS.

  14. A digital open-loop Doppler processing prototype for deep-space navigation

    Institute of Scientific and Technical Information of China (English)

    JIAN NianChuan; QIU Shi; FUNG Lai-Wo; ZHANG Hua; WANG Zhen; GOU Wei; SHANG Kun; ZHANG SuJun; WANG MingYuan; SHI Xian; PING JingSong; YAN JianGuo; TANG GeShi; LIU JunZe

    2009-01-01

    A prototype based on digital radio technology with associated open-loop Doppler signal processing techniques has been developed to measure a spacecraft's line-of-sight velocity. The prototype was tested in China's Chang'E-1 lunar mission relying on S-band telemetry signals transmitted by the satellite, with results showing that the residuals had a RMS value of ~3 mm/s (1σ) using 1-sec integration, which is consistent with the Chinese conventional USB (Unified S-Band) tracking system. Such precision is mainly limited by the short-term stability of the atomic (e.g. rubidium) clock at the uplink ground station. It can also be improved with proper calibration to remove some effects of the transmission media (such as solar plasma, troposphere and ionosphere), and a longer integration time (e.g. down to 0.56 mm/s at 34 seconds) allowed by the spacecraft dynamics. The tracking accuracy can also be increased with differential methods that may effectively remove most of the long-term drifts and some of the short-term uncertainties of the uplink atomic clock, thereby further reducing the residuals to the 1 mm/s level. Our experimental tracking data have been used in orbit determination for Chang'E-1, while other applications (such as the upcoming YH-1 Mars orbiter) based on open-loop Doppler tracking will be initiated in the future. Successful application of the prototype to the Chang'E-1 mission in 2008 is believed to have great significance for China's future deep space exploration.

  15. Query processing in distributed, taxonomy-based information sources

    CERN Document Server

    Meghini, Carlo; Coltella, Veronica; Analyti, Anastasia

    2011-01-01

    We address the problem of answering queries over a distributed information system storing objects indexed by terms organized in a taxonomy. The taxonomy consists of subsumption relationships between negation-free DNF formulas on terms and negation-free conjunctions of terms. In the first part of the paper, we consider the centralized case, deriving a hypergraph-based algorithm that is efficient in data complexity. In the second part of the paper, we consider the distributed case, presenting alternative ways of implementing the centralized algorithm. These ways derive from two basic criteria: direct vs. query-rewriting evaluation, and centralized vs. distributed data or taxonomy allocation. Combinations of these criteria cover a wide spectrum of architectures, ranging from client-server to peer-to-peer. We evaluate the performance of the various architectures by simulation on a network with O(10^4) nodes, and derive final results. An extensive review of the relevant literature is also included.
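
    To make the query model concrete, the sketch below shows centralized evaluation for the simplest possible case, atomic terms related by subsumption and conjunctive queries, which is much narrower than the negation-free DNF formulas the paper handles; all names are illustrative.

```python
# Terms related by subsumption (narrower -> broader); an object indexed
# under a term also answers queries on any broader term.
subsumption = {"laptop": {"computer"}, "computer": {"device"}, "phone": {"device"}}
index = {"o1": {"laptop"}, "o2": {"phone"}, "o3": {"computer"}}

def broader(term):
    """Transitive closure of the subsumption relation, including the term itself."""
    seen, stack = {term}, [term]
    while stack:
        for parent in subsumption.get(stack.pop(), ()):
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return seen

def answer(query_terms):
    """Objects matching a conjunctive query: every query term must be
    equal to, or broader than, one of the object's index terms."""
    result = set()
    for obj, terms in index.items():
        reachable = set().union(*(broader(t) for t in terms))
        if set(query_terms) <= reachable:
            result.add(obj)
    return result

print(answer({"device"}))      # {'o1', 'o2', 'o3'}
print(answer({"computer"}))    # {'o1', 'o3'}
```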

  16. Open Science Grid (OSG) Ticket Synchronization: Keeping Your Home Field Advantage In A Distributed Environment

    Science.gov (United States)

    Gross, Kyle; Hayashi, Soichi; Teige, Scott; Quick, Robert

    2012-12-01

    Large distributed computing collaborations, such as the Worldwide LHC Computing Grid (WLCG), face many issues when it comes to providing a working grid environment for their users. One of these is exchanging tickets between the various ticketing systems in use by grid collaborations. Ticket systems such as Footprints, RT, Remedy, and ServiceNow all have different schemas that must be addressed in order to provide a reliable exchange of information between support entities and users in different grid environments. To combat this problem, OSG Operations has created a ticket synchronization interface called GOC-TX that relies on web services instead of the error-prone email parsing methods of the past. Synchronizing tickets between different ticketing systems allows any user or support entity to work on a ticket in their home environment, thus providing a familiar and comfortable place to provide updates without having to learn another ticketing system. The interface is built to be generic enough that it can be customized for nearly any ticketing system with a web-service interface, with only minor changes. This allows us to be flexible and to rapidly bring new ticket synchronization online. Synchronization can be triggered by different methods including mail, a web-services interface, and active messaging. GOC-TX currently interfaces with Global Grid User Support (GGUS) for WLCG, Remedy at Brookhaven National Lab (BNL), and Request Tracker (RT) at the Virtual Data Toolkit (VDT). Work is progressing on the Fermi National Accelerator Laboratory (FNAL) ServiceNow synchronization. This paper will explain the problems faced by OSG and how they led OSG to create and implement this ticket synchronization system, along with the technical details that allow synchronization to be performed at a production level.
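
    The core idea, mapping each ticketing system's schema onto a common representation so that updates can be relayed between systems, can be sketched as an adapter interface. This is an illustration only; GOC-TX's actual interfaces and field names are not reproduced here.

```python
# Hypothetical adapter sketch: each backend maps its native ticket schema
# to/from a small common model, and a relay copies new comments across.
from dataclasses import dataclass, field

@dataclass
class CommonTicket:
    external_id: str
    summary: str
    status: str
    comments: list = field(default_factory=list)

class TicketBackend:                      # interface each ticket system implements
    def fetch(self, external_id) -> CommonTicket: ...
    def push_comment(self, external_id, text) -> None: ...

def synchronize(source: TicketBackend, target: TicketBackend,
                source_id: str, target_id: str, already_relayed: set):
    """Relay comments that have not yet been copied from source to target."""
    ticket = source.fetch(source_id)
    for i, comment in enumerate(ticket.comments):
        key = (source_id, i)
        if key not in already_relayed:
            target.push_comment(target_id, comment)
            already_relayed.add(key)
```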

  17. OpenGeoSys: An open-source initiative for numerical simulation of thermo-hydro-mechanical/chemical (THM/C) processes in porous media

    DEFF Research Database (Denmark)

    Kolditz, O.; Bauer, S.; Bilke, L.

    In this paper we describe the OpenGeoSys (OGS) project, which is a scientific open-source initiative for numerical simulation of thermo-hydro-mechanical/chemical processes in porous media. The basic concept is to provide a flexible numerical framework (using primarily the Finite Element Method (FEM......)) for solving multi-field problems in porous and fractured media for applications in geoscience, hydrology and energy storage. To this purpose OGS is based on an object-oriented FEM concept including a broad spectrum of interfaces for pre- and postprocessing. The OGS idea has been in development since the mid......, the parallelization of OO codes still lacks efficiency. High-performance-computing efficiency of OO codes is subject to future research (Wang et al. [2]). Currently, OGS development efforts are dedicated to visual data and model integration for complex hydrological applications (Rink et al. [3])...

  18. Orchestrating the Dynamic Adaptation of Distributed Software With Process Technology

    Science.gov (United States)

    2004-01-01


  19. Distributed XQuery and Updates Processing with Heterogeneous XQuery Engines

    NARCIS (Netherlands)

    Zhang, Y.; Boncz, P.A.

    2008-01-01

    We demonstrate XRPC, a minimal XQuery extension that enables distributed querying between heterogeneous XQuery engines. The XRPC language extension enhances the existing concept of XQuery functions with the Remote Procedure Call (RPC) paradigm. XRPC is orthogonal to all XQuery features, including th

  20. The redesign of a warranty distribution network with recovery processes

    NARCIS (Netherlands)

    Ashayeri, J.; Ma, N.; Sotirov, R.

    2015-01-01

    A warranty distribution network provides aftersales warranty services to customers and resembles a closed-loop supply chain network with specific challenges for reverse flows management like recovery, repair, and reflow of refurbished products. We present here a nonlinear and nonconvex mixed integer

  1. Effect of solid distribution on elastic properties of open-cell cellular solids using numerical and experimental methods.

    Science.gov (United States)

    Zargarian, A; Esfahanian, M; Kadkhodapour, J; Ziaei-Rad, S

    2014-09-01

    The effect of solid distribution between the edges and vertices of a three-dimensional cellular solid with an open-cell structure was investigated both numerically and experimentally. Finite element analysis (FEA) with continuum elements and appropriate periodic boundary conditions was employed to calculate the elastic properties of cellular solids using the tetrakaidecahedral (Kelvin) unit cell. Relative densities between 0.01 and 0.1 and various values of the solid fraction were considered. In order to validate the numerical model, three scaffolds with a relative density of 0.08, but different amounts of solid in the vertices, were fabricated via a 3-D printing technique. Good agreement was observed between numerical simulation and experimental results. Based on the numerical simulations at low relative densities, and considering the relative density and the solid fraction in the vertices, empirical relations were derived for Young's modulus and Poisson's ratio.

  2. MAAC: a software tool for user authentication and access control to the electronic patient record in an open distributed environment

    Science.gov (United States)

    Motta, Gustavo H.; Furuie, Sergio S.

    2004-04-01

    Designing proper models for authorization and access control for the electronic patient record (EPR) is essential to wide-scale use of the EPR in large health organizations. This work presents MAAC (Middleware for Authentication and Access Control), a tool that implements a contextual role-based access control (RBAC) authorization model. RBAC regulates users' access to computing resources based on their organizational roles. A contextual authorization uses environmental information available at access-request time, like the user/patient relationship, in order to decide whether a user has the right to access an EPR resource. The software architecture where MAAC is implemented uses the Lightweight Directory Access Protocol, the Java programming language and the CORBA/OMG standards CORBA Security Service and Resource Access Decision Facility. With those open and distributed standards, heterogeneous EPR components can request user authentication and access authorization services in a unified and consistent fashion across multiple platforms.
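
    The essence of a contextual RBAC decision, a role permission that only applies when a run-time context rule such as the user/patient relationship also holds, can be sketched as follows. Role names, permissions and the context rule are illustrative and are not MAAC's actual policy model.

```python
# Hedged sketch of a contextual role-based access decision: a role grants
# a permission only if the associated context rule also holds at request time.
ROLE_PERMISSIONS = {
    "physician": {("EPR", "read"), ("EPR", "write")},
    "nurse": {("EPR", "read")},
}

def is_treating(context):
    # contextual rule: the requesting user must be assigned to the patient
    return context.get("patient_id") in context.get("assigned_patients", set())

CONTEXT_RULES = {("EPR", "write"): is_treating}

def check_access(roles, resource, action, context):
    for role in roles:
        if (resource, action) in ROLE_PERMISSIONS.get(role, set()):
            rule = CONTEXT_RULES.get((resource, action))
            if rule is None or rule(context):
                return True
    return False

ctx = {"patient_id": "p17", "assigned_patients": {"p17", "p23"}}
print(check_access({"physician"}, "EPR", "write", ctx))   # True
print(check_access({"nurse"}, "EPR", "write", ctx))       # False
```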

  3. Image parallel processing based on CUDA and OpenCV%CUDA和OpenCV图像并行处理方法研究

    Institute of Scientific and Technical Information of China (English)

    刘鑫; 姜超; 冯存永

    2012-01-01

    NVIDIA's CUDA architecture is easier to program, more powerful, and more widely applicable than traditional general-purpose GPU computing, so introducing the CUDA architecture into image processing can improve image-processing efficiency. This paper presents a parallel processing method combining OpenCV and CUDA to implement image binarization and image fusion. The experimental results show that this method can greatly improve image-processing efficiency; finally, the method was integrated into the MFC framework so that it can be applied in practical engineering fields.
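
    The two operations named in the abstract, binarization and image fusion, look roughly as follows with OpenCV's Python bindings on the CPU; offloading them to CUDA, which is the point of the paper, requires a CUDA-enabled OpenCV build and is not shown.

```python
import cv2

# Otsu binarization and a simple weighted fusion of two same-sized images
# on the CPU. File names are placeholders; the CUDA offload described in
# the record would replace these calls with their GPU counterparts.
img_a = cv2.imread("frame_a.png", cv2.IMREAD_GRAYSCALE)
img_b = cv2.imread("frame_b.png", cv2.IMREAD_GRAYSCALE)

_, binary = cv2.threshold(img_a, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
fused = cv2.addWeighted(img_a, 0.5, img_b, 0.5, 0)

cv2.imwrite("binary.png", binary)
cv2.imwrite("fused.png", fused)
```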

  4. Open water processes of the San Francisco Estuary: From physical forcing to biological responses

    Directory of Open Access Journals (Sweden)

    Wim Kimmerer

    2004-02-01

    Full Text Available This paper reviews the current state of knowledge of the open waters of the San Francisco Estuary. This estuary is well known for the extent to which it has been altered through loss of wetlands, changes in hydrography, and the introduction of chemical and biological contaminants. It is also one of the most studied estuaries in the world, with much of the recent research effort aimed at supporting restoration efforts. In this review I emphasize the conceptual foundations for our current understanding of estuarine dynamics, particularly those aspects relevant to restoration. Several themes run throughout this paper. First is the critical role physical dynamics play in setting the stage for chemical and biological responses. Physical forcing by the tides and by variation in freshwater input combine to control the movement of the salinity field, and to establish stratification, mixing, and dilution patterns throughout the estuary. Many aspects of estuarine dynamics respond to interannual variation in freshwater flow; in particular, abundance of several estuarine-dependent species of fish and shrimp varies positively with flow, although the mechanisms behind these relationships are largely unknown. The second theme is the importance of time scales in determining the degree of interaction between dynamic processes. Physical effects tend to dominate when they operate at shorter time scales than biological processes; when the two time scales are similar, important interactions can arise between physical and biological variability. These interactions can be seen, for example, in the response of phytoplankton blooms, with characteristic time scales of days, to stratification events occurring during neap tides. The third theme is the key role of introduced species in all estuarine habitats; particularly noteworthy are introduced waterweeds and fishes in the tidal freshwater reaches of the estuary, and introduced clams there and in brackish water. The

  5. LatticeQCD using OpenCL

    CERN Document Server

    Bach, Matthias; Pinke, Christopher; Schäfer, Christian; Zeidlewicz, Lars

    2011-01-01

    We report on our implementation of LatticeQCD applications using OpenCL. We focus on the general concept and on distributing different parts on hybrid systems, consisting of both CPUs (Central Processing Units) and GPUs (Graphic Processing Units).
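
    Distributing a data-parallel kernel over all available OpenCL devices, CPUs and GPUs alike, can be sketched with PyOpenCL as below; the toy axpy kernel stands in for the far more involved lattice QCD operators, and the even split ignores the load balancing a real hybrid code needs.

```python
import numpy as np
import pyopencl as cl

# Toy stand-in for a lattice operator: y += a * x, split evenly over
# every OpenCL device found (CPUs and GPUs), one context per device.
SRC = """
__kernel void axpy(__global const float *x, __global float *y, const float a) {
    int i = get_global_id(0);
    y[i] += a * x[i];
}
"""

devices = [d for p in cl.get_platforms() for d in p.get_devices()]
assert devices, "no OpenCL devices found"

x = np.random.rand(1 << 20).astype(np.float32)
y = np.zeros_like(x)

for dev, idx in zip(devices, np.array_split(np.arange(x.size), len(devices))):
    ctx = cl.Context([dev])
    queue = cl.CommandQueue(ctx)
    prog = cl.Program(ctx, SRC).build()
    mf = cl.mem_flags
    xb = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=x[idx])
    yb = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=y[idx])
    prog.axpy(queue, (idx.size,), None, xb, yb, np.float32(2.0))
    out = np.empty(idx.size, dtype=np.float32)
    cl.enqueue_copy(queue, out, yb)       # copy this chunk back to the host
    y[idx] = out
```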

  6. Image Process Technology for Weeding Robots Based on OpenCV%基于 OpenCV 的除草机器人图像处理技术

    Institute of Scientific and Technical Information of China (English)

    刘立强; 蔡晓华; 吴泽全

    2013-01-01

    In order to achieve real-time processing of the weeding robot's video images, this paper proposes a video image processing method based on Intel's open-source vision library OpenCV and gives the overall scheme. A real-time video image processing system for the weeding robot was designed using this scheme and verified in a soil-bin test laboratory. The test results show that the system can perform high-speed processing of video images and locate crop plants, providing a solid foundation for precise motion control of the weeding robot's end effector.

  7. Database Selection for Processing k Nearest Neighbors Queries in Distributed Environments.

    Science.gov (United States)

    Yu, Clement; Sharma, Prasoon; Meng, Weiyi; Qin, Yan

    This paper considers the processing of digital library queries, consisting of a text component and a structured component in distributed environments. The paper concentrates on the processing of the structured component of a distributed query. A method is proposed to identify the databases that are likely to be useful for processing any given…

  8. A distributed resource allocation algorithm for many processes

    NARCIS (Netherlands)

    Hesselink, Wim H.

    2013-01-01

    Resource allocation is the problem that a process may enter a critical section CS of its code only when its resource requirements are not in conflict with those of other processes in their critical sections. For each execution of CS, these requirements are given anew. In the resource requirements, l

  9. A distributed resource allocation algorithm for many processes

    NARCIS (Netherlands)

    Hesselink, Wim H.

    Resource allocation is the problem that a process may enter a critical section CS of its code only when its resource requirements are not in conflict with those of other processes in their critical sections. For each execution of CS, these requirements are given anew. In the resource requirements,

  10. Exclusive processes in position space and the pion distribution amplitude

    OpenAIRE

    Braun, Vladimir M.; Müller, Dieter

    2007-01-01

    We suggest to carry out lattice calculations of current correlators in position space, sandwiched between the vacuum and a hadron state (e.g. pion), in order to access hadronic light-cone distribution amplitudes (DAs). In this way the renormalization problem for composite lattice operators is avoided altogether, and the connection to the DA is done using perturbation theory in the continuum. As an example, the correlation function of two electromagnetic currents is calculated to the next-to-n...

  11. Collisional processes and size distribution in spatially extended debris discs

    CERN Document Server

    Thebault, Philippe

    2007-01-01

    We present a new multi-annulus code for the study of collisionally evolving extended debris discs. We first aim to confirm results obtained for a single-annulus system, namely that the size distribution in "real" debris discs always departs from the theoretical collisional equilibrium $dN \propto R^{-3.5}\,dR$ power law, especially in the crucial size range of observable particles (<1cm), where it displays a characteristic wavy pattern. We also aim at studying how debris disc density distributions, scattered-light luminosity profiles, and SEDs are affected by the coupled effect of collisions and radial mixing due to radiation-pressure-affected small grains. The size distribution evolution is modeled from micron-sized grains to 50km-sized bodies. The model takes into account the crucial influence of radiation-pressure-affected small grains. We consider the collisional evolution of a fiducial a=120AU radius disc with an initial surface density of the form $\Sigma(a)\propto a^{\alpha}$. We show that the system's radial e...

  12. Simulation and analysis of ice processes in an artificial open channel

    Institute of Scientific and Technical Information of China (English)

    GUO Xin-lei; YANG Kai-lin; FU Hui; WANG Tao; GUO Yong-xin

    2013-01-01

    The Middle Route Project for South-to-North Water Transfer, which consists of a long artificial open channel and various hydraulic structures, is a large water conveyance system. A numerical model of water conveyance during the ice period for such a large-scale, long-distance water transfer project is developed based on the integration of a river ice model and an unsteady flow model with complex inner boundaries. A simplified method to obtain the same flow discharge upstream and downstream of a structure by neglecting the storage effect is proposed for dealing with the inner boundaries. According to the measured and design data for the winter-spring period, the whole ice process for the middle route, which includes the formation of the ice cover, its development, the melting and the breaking up, as well as the ice-water dynamic response during gate operation, is simulated. The ice characteristics and the water conveyance capacity are both analyzed, and thus the hydraulic control conditions for safe regulation are obtained. Finally, the uncertainties of some parameters related to the ice model are discussed.

  13. Opening up the Black Box of Sensor Processing Algorithms through New Visualizations

    Directory of Open Access Journals (Sweden)

    Alexander M. Morison

    2016-09-01

    Full Text Available Vehicles and platforms with multiple sensors connect people in multiple roles, with different responsibilities, to scenes of interest. For many of these human–sensor systems there are a variety of algorithms that transform, select, and filter the sensor data prior to human intervention. Emergency response, precision agriculture, and intelligence, surveillance and reconnaissance (ISR) are examples of these human–computation–sensor systems. The authors examined a case of the latter to understand how people in various roles utilize the algorithms' output to identify meaningful properties in data streams given uncertainty. The investigations revealed: (a) that increasingly complex interactions occur across agents in the human–computation–sensor system; and (b) that analysts struggle to interpret the output of “black box” algorithms given uncertainty and change in the scenes of interest. The paper presents a new interactive visualization concept designed to “open up the black box” of sensor processing algorithms to support human analysts as they look for meaning in feeds from sensors.

  14. Researchers’ Adoption of an Institutional Central Fund for Open-Access Article-Processing Charges

    Directory of Open Access Journals (Sweden)

    Stephen Pinfield

    2016-01-01

    Full Text Available This article analyzes researchers’ adoption of an institutional central fund (or faculty publication fund) for open-access (OA) article-processing charges (APCs) to contribute to a wider understanding of the take-up of OA journal publishing (“Gold” OA). Quantitative data, recording central fund usage at the University of Nottingham from 2006 to 2014, are analyzed alongside qualitative data from institutional documentation. The importance of the setting of U.K. national policy developments and international OA adoption trends is considered. Innovation Diffusion Theory (IDT) is used as an explanatory framework. It is shown that use of the central fund grew during the period from covering less than 1% of the University’s outputs to more than 12%. Health and Life Sciences disciplines made greatest use of the fund. Although highly variable, average APC prices rose during the period, with fully OA publishers setting lower average APCs. APCs were paid largely from internal funds, but external funding became increasingly important. Key factors in adoption are identified to be increasing awareness and changing perceptions of OA, communication, disciplinary differences, and adoption mandates. The study provides a detailed longitudinal analysis of one of the earliest central funds to be established globally, with a theoretically informed explanatory model to inform future work on managing central funds and developing institutional and national OA policies.

  15. Superplot3d: an open source GUI tool for 3d trajectory visualisation and elementary processing.

    Science.gov (United States)

    Whitehorn, Luke J; Hawkes, Frances M; Dublon, Ian An

    2013-09-30

    When acquiring simple three-dimensional (3d) trajectory data it is common to accumulate large coordinate data sets. In order to examine the integrity and consistency of object tracking, it is often necessary to rapidly visualise these data. Ordinarily, to achieve this the user must either execute 3d plotting functions in a numerical computing environment or manually inspect data in two dimensions, plotting each individual axis. Superplot3d is an open source MATLAB script which takes tab-delineated Cartesian data points in the form x, y, z and time and generates an instant visualization of the object's trajectory in freely rotatable three dimensions. Whole trajectories may be instantly presented, allowing for rapid inspection. Executable from the MATLAB command line (or deployable as a compiled standalone application), superplot3d also provides simple GUI controls to obtain rudimentary trajectory information, allow specific visualization of trajectory sections and perform elementary processing. Superplot3d thus provides a framework for non-programmers and programmers alike to recreate recently acquired 3d object trajectories in rotatable 3d space. It is intended, via the use of a preference-driven menu, to be flexible and to work with output from multiple tracking software systems. Source code and accompanying GUIDE .fig files are provided for deployment and further development.
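
    Superplot3d itself is a MATLAB script; purely as an illustration of the underlying idea (turning x, y, z, time columns into a rotatable 3-D trajectory), an equivalent few lines in Python/matplotlib might look like this, with the file name and column order assumed:

```python
import numpy as np
import matplotlib.pyplot as plt

# Load tab-delimited columns x, y, z, t and draw the trajectory in 3-D.
x, y, z, t = np.loadtxt("trajectory.tsv", delimiter="\t", unpack=True)

fig = plt.figure()
ax = fig.add_subplot(projection="3d")      # freely rotatable with the mouse
ax.plot(x, y, z, lw=0.8)
ax.scatter(x[0], y[0], z[0], color="green", label="start")
ax.scatter(x[-1], y[-1], z[-1], color="red", label="end")
ax.set_xlabel("x"); ax.set_ylabel("y"); ax.set_zlabel("z")
ax.legend()
plt.show()
```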

  16. The Open Physiology workflow: modeling processes over physiology circuitboards of interoperable tissue units

    Science.gov (United States)

    de Bono, Bernard; Safaei, Soroush; Grenon, Pierre; Nickerson, David P.; Alexander, Samuel; Helvensteijn, Michiel; Kok, Joost N.; Kokash, Natallia; Wu, Alan; Yu, Tommy; Hunter, Peter; Baldock, Richard A.

    2015-01-01

    A key challenge for the physiology modeling community is to enable the searching, objective comparison and, ultimately, re-use of models and associated data that are interoperable in terms of their physiological meaning. In this work, we outline the development of a workflow to modularize the simulation of tissue-level processes in physiology. In particular, we show how, via this approach, we can systematically extract, parcellate and annotate tissue histology data to represent component units of tissue function. These functional units are semantically interoperable, in terms of their physiological meaning. In particular, they are interoperable with respect to [i] each other and with respect to [ii] a circuitboard representation of long-range advective routes of fluid flow over which to model long-range molecular exchange between these units. We exemplify this approach through the combination of models for physiology-based pharmacokinetics and pharmacodynamics to quantitatively depict biological mechanisms across multiple scales. Links to the data, models and software components that constitute this workflow are found at http://open-physiology.org/. PMID:25759670

  17. The Seismic Tool-Kit (STK): an open source software for seismology and signal processing.

    Science.gov (United States)

    Reymond, Dominique

    2016-04-01

    We present an open source software project (GNU public license), named STK: Seismic ToolKit, that is dedicated mainly to seismology and signal processing. The STK project, which started in 2007, is hosted by SourceForge.net and counts more than 19,500 downloads at the date of writing. The STK project is composed of two main branches. First, a graphical interface dedicated to signal processing (in the SAC format (SAC_ASCII and SAC_BIN)): the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is provided). The estimation of the spectral density of the signal is performed via the Fourier transform, with visualization of the Power Spectral Density (PSD) in linear or log scale, and also the evolutive time-frequency representation (or sonagram). The 3-component signals can also be processed to estimate their polarization properties, either for a given window or for evolving windows along the time axis. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. Second, a panel of utility programs is provided for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency analysis for an entire directory of signals, focal planes and main component axes, radiation patterns of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/non-linear regression analysis of data sets. A MINimum library of Linear AlGebra (MIN-LINAG) is also provided for computing the main matrix operations, such as QR/QL decomposition, Cholesky solution of linear systems, finding eigenvalues/eigenvectors, QR-solve/Eigen-solve of linear equation systems, etc. STK is developed in C/C++, mainly under Linux OS, and it has also been partially implemented under MS-Windows. Useful links: http
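
    The spectral estimate at the heart of such a toolkit is a standard operation; a minimal Python equivalent of the PSD step (Welch's method on a synthetic signal, not STK's own code) is:

```python
import numpy as np
from scipy import signal
import matplotlib.pyplot as plt

# Power spectral density of a noisy sinusoid via Welch's method,
# plotted on a log scale as a seismology toolkit typically would.
fs = 100.0                                    # sampling rate, Hz
t = np.arange(0, 600, 1 / fs)
x = np.sin(2 * np.pi * 1.5 * t) + 0.5 * np.random.randn(t.size)

f, psd = signal.welch(x, fs=fs, nperseg=4096)
plt.semilogy(f, psd)
plt.xlabel("Frequency (Hz)")
plt.ylabel("PSD")
plt.show()
```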

  18. Integration of distributed computing into the drug discovery process.

    Science.gov (United States)

    von Korff, Modest; Rufener, Christian; Stritt, Manuel; Freyss, Joel; Bär, Roman; Sander, Thomas

    2011-02-01

    Grid computing offers an opportunity to gain massive computing power at low cost. We give a short introduction to the drug discovery process and exemplify the use of grid computing for image processing, docking and 3D pharmacophore descriptor calculations. The principle of a grid and its architecture are briefly explained. More emphasis is laid on the issues related to a company-wide grid installation and embedding the grid into the research process. The future of grid computing in drug discovery is discussed in the expert opinion section. Most needed, besides reliable algorithms to predict compound properties, is embedding the grid seamlessly into the discovery process. User-friendly access to powerful algorithms without restrictions such as a limited number of licenses has to be the goal of grid computing in drug discovery.

  19. A distributed resource allocation algorithm for many processes

    CERN Document Server

    Hesselink, Wim H

    2012-01-01

    Resource allocation is the problem that a process may enter a critical section CS of its code only when its resource requirements are not in conflict with those of other processes in their critical sections. For each execution of CS, these requirements are given anew. In the resource requirements, levels can be distinguished, such as e.g. read access or write access. We allow infinitely many processes that communicate by reliable asynchronous messages and have finite memory. A simple starvation-free solution is presented. Processes only wait for one another when they have conflicting resource requirements. The correctness of the solution is argued with invariants and temporal logic. It has been verified with the proof assistant PVS.

  20. When to make proprietary software open source

    NARCIS (Netherlands)

    Caulkins, J.P.; Feichtinger, G.; Grass, D.; Hartl, R.F.; Kort, P.M.; Seidl, A.

    2013-01-01

    Software can be distributed closed source (proprietary) or open source (developed collaboratively). While a firm cannot sell open source software, and so loses potential sales revenue, the open source software development process can have a substantial positive impact on the quality of a software, i

  1. Open Data Within governmental Organisations: Effects, Benefits and Challenges of the Implementation Process

    Directory of Open Access Journals (Sweden)

    Martijn Hartog

    2014-10-01

    Full Text Available This article describes the growth of open government, open data and the means for transparency and accountability, but aims to reflect on the bottlenecks and actual practicality of opening data to the public domain by two governmental bodies. The Municipality of The Hague and the Province of South-Holland of The Netherlands are part of two research programmes called ‘Government of the Future’, whose main goals are to explore and establish knowledge on societal innovation through new applications and the possible long-term effects of ICTs in the public sector. Part of these programmes are themes such as transparency and open data, which are viewed from the somewhat pragmatic and operational side of their applicability. The paper shows the development within the governmental bodies and captures their ‘readiness’ for open data.

  2. Ruin Probabilities and Aggregate Claims Distributions for Shot Noise Cox Processes

    DEFF Research Database (Denmark)

    Albrecher, H.; Asmussen, Søren

    We consider a risk process Rt where the claim arrival process is a superposition of a homogeneous Poisson process and a Cox process with a Poisson shot noise intensity process, capturing the effect of sudden increases of the claim intensity due to external events. The distribution of the aggregate...

  3. Control of Groundwater Remediation Process as Distributed Parameter System

    Directory of Open Access Journals (Sweden)

    Mendel M.

    2014-12-01

    Full Text Available Pollution of groundwater requires the implementation of appropriate solutions, which can be deployed for several years. Local groundwater contamination and its subsequent spread may result in contamination of drinking water sources or other disasters. This publication aims to design and demonstrate the control of pumping wells for a model groundwater remediation task. The task consists of an appropriately spaced soil domain with input parameters, pumping wells and a control system. The model of the controlled system is built in the program MODFLOW using the finite-difference method, treated as a distributed parameter system. The control problem is solved with the DPS Blockset for MATLAB & Simulink.

  4. Ultralow field emission from thinned, open-ended, and defected carbon nanotubes by using microwave hydrogen plasma processing

    Energy Technology Data Exchange (ETDEWEB)

    Deng, Jian-Hua, E-mail: jhdeng1983@163.com [College of Physics and Materials Science, Tianjin Normal University, Tianjin 300387 (China); Cheng, Lin; Wang, Fan-Jie; Yu, Bin; Li, Guo-Zheng; Li, De-Jun [College of Physics and Materials Science, Tianjin Normal University, Tianjin 300387 (China); Cheng, Guo-An [Key Laboratory of Beam Technology and Material Modification of Ministry of Education, Beijing Normal University, Beijing 100875 (China)

    2015-01-01

    Graphical abstract: Thinned, open-ended, and defected carbon nanotubes were prepared by using hydrogen plasma processing. The processed carbon nanotubes have far better field emission performance than that of the pristine ones. - Highlights: • CVD prepared CNT arrays were processed by microwave hydrogen plasma. • Thinned, open-ended, and defected CNTs were obtained. • Processed CNTs have far better field emission performance than the pristine ones. • Processed CNTs have applicable emission stability after being perfectly aged. - Abstract: Ultralow field emission is achieved from carbon nanotubes (CNTs) by using microwave hydrogen plasma processing. After the processing, typical capped CNT tips are removed, with thinned, open-ended, and defected CNTs left. Structural analyses indicate that the processed CNTs have more sp^3-hybridized defects as compared to the pristine ones. The morphology of CNTs can be readily controlled by adjusting microwave powers, which change the shape of CNTs by means of hydrogen plasma etching. Processed CNTs with optimal morphology are found to have an ultralow turn-on field of 0.566 V/μm and threshold field of 0.896 V/μm, much better than 0.948 and 1.559 V/μm of the as-grown CNTs, respectively. This improved FE performance is ascribed to the structural changes of CNTs after the processing. The thinned and open-ended shape of CNTs can facilitate electron tunneling through barriers and additionally, the increased defects at tube walls can serve as new active emission sites. Furthermore, our plasma processed CNTs exhibit excellent field emission stability at a large emission current density of 10.36 mA/cm^2 after being perfectly aged, showing promising prospects in applications as high-performance vacuum electron sources.

  5. Prescription-induced jump distributions in multiplicative Poisson processes

    Science.gov (United States)

    Suweis, Samir; Porporato, Amilcare; Rinaldo, Andrea; Maritan, Amos

    2011-06-01

    Generalized Langevin equations (GLE) with multiplicative white Poisson noise pose the usual prescription dilemma leading to different evolution equations (master equations) for the probability distribution. Contrary to the case of multiplicative Gaussian white noise, the Stratonovich prescription does not correspond to the well-known midpoint (or any other intermediate) prescription. By introducing an inertial term in the GLE, we show that the Itô and Stratonovich prescriptions naturally arise depending on two time scales, one induced by the inertial term and the other determined by the jump event. We also show that, when the multiplicative noise is linear in the random variable, one prescription can be made equivalent to the other by a suitable transformation in the jump probability distribution. We apply these results to a recently proposed stochastic model describing the dynamics of primary soil salinization, in which the salt mass balance within the soil root zone requires the analysis of different prescriptions arising from the resulting stochastic differential equation forced by multiplicative white Poisson noise, the features of which are tailored to the characters of the daily precipitation. A method is finally suggested to infer the most appropriate prescription from the data.

  6. Deeply Virtual Exclusive Processes and Generalized Parton Distributions

    Energy Technology Data Exchange (ETDEWEB)


    2011-06-01

    The goal of the comprehensive program in Deeply Virtual Exclusive Scattering at Jefferson Laboratory is to create transverse spatial images of quarks and gluons as a function of their longitudinal momentum fraction in the proton, the neutron, and in nuclei. These functions are the Generalized Parton Distributions (GPDs) of the target nucleus. Cross section measurements of the Deeply Virtual Compton Scattering (DVCS) reaction ep → epγ in Hall A support the QCD factorization of the scattering amplitude for Q^2 ≥ 2 GeV^2. Quasi-free neutron-DVCS measurements on the deuteron indicate sensitivity to the quark angular momentum sum rule. Fully exclusive H(e, e'pγ) measurements have been made in a wide kinematic range in CLAS with polarized beam, and with both unpolarized and longitudinally polarized targets. Existing models are qualitatively consistent with the JLab data, but there is a clear need for less constrained models. Deeply virtual vector meson production is studied in CLAS. The 12 GeV upgrade will be essential for these channels. The ρ and ω production channels offer the prospect of flavor sensitivity to the quark GPDs, while the φ-production channel is dominated by the gluon distribution.

  7. Automatic pre-processing for an object-oriented distributed hydrological model using GRASS-GIS

    Science.gov (United States)

    Sanzana, P.; Jankowfsky, S.; Branger, F.; Braud, I.; Vargas, X.; Hitschfeld, N.

    2012-04-01

    Landscapes are very heterogeneous, and this heterogeneity impacts the hydrological processes occurring in catchments, especially in the modeling of peri-urban catchments. Hydrological Response Units (HRUs), resulting from the intersection of different maps, such as land use, soil types and geology, and flow networks, allow these elements to be represented explicitly, preserving the natural and artificial contours of the different layers. These HRUs are used as the model mesh in some distributed object-oriented hydrological models, allowing the application of a topologically oriented approach. The connectivity between polygons and polylines provides a detailed representation of the water balance and overland flow in these distributed hydrological models, based on irregular hydro-landscape units. When computing fluxes between these HRUs, geometrical parameters, such as the distance between the centroid of gravity of an HRU and the river network, and the length of the perimeter, can impact the realism of the calculated overland, sub-surface and groundwater fluxes. Therefore, it is necessary to process the original model mesh in order to avoid these numerical problems. We present an automatic pre-processing tool implemented in the open source GRASS-GIS software, for which several Python scripts and some already available algorithms, such as the Triangle software, were used. First, some scripts were developed to improve the topology of the various elements, such as snapping the river network to the closest contours. When data are derived from remote sensing, such as vegetation areas, their perimeters have many right angles, which were smoothed. Second, the algorithms more particularly address badly shaped elements of the model mesh, such as polygons with narrow shapes, markedly irregular contours and/or a centroid outside the polygon. To identify these elements we used shape descriptors. The convexity index was considered the best descriptor to identify them with a threshold
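
    The convexity index mentioned at the end, the ratio of a polygon's area to the area of its convex hull, is close to 1 for compact shapes and small for narrow or ragged HRUs. The sketch below computes it with shapely rather than the GRASS-GIS scripts of the paper, and the threshold value is purely illustrative.

```python
from shapely.geometry import Polygon

def convexity_index(poly: Polygon) -> float:
    """Area of the polygon divided by the area of its convex hull (0..1]."""
    return poly.area / poly.convex_hull.area

# A long, bent HRU-like shape scores well below a compact square.
square = Polygon([(0, 0), (1, 0), (1, 1), (0, 1)])
lshape = Polygon([(0, 0), (4, 0), (4, 1), (1, 1), (1, 4), (0, 4)])

for name, poly in [("square", square), ("L-shape", lshape)]:
    flag = "needs reshaping" if convexity_index(poly) < 0.75 else "ok"
    print(f"{name}: {convexity_index(poly):.2f} ({flag})")
```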

  8. Open Government and (Linked) (Open) (Government) (Data)

    Directory of Open Access Journals (Sweden)

    Christian Philipp Geiger

    2012-12-01

    Full Text Available This article explores the opening and free usage of stored public sector data supplied by the state. In the age of Open Government and Open Data it is not enough just to put data online. Rather, it should be weighed whether, how and which public sector data can be published. Open Data are defined as stored data which can be made accessible in the public interest without any restrictions on usage and distribution. Such Open Data can be statistics, geo data, maps, plans, environmental data and weather data, in addition to materials of the parliaments, ministries and authorities. The preparation of, and free access to, existing data permit varied approaches to the reuse of data, which are discussed in the article. In addition, impulses can be given for Open Government – the opening of state and administration – towards more transparency, participation and collaboration as well as innovation and business development. The Open Data movement tries to get to the bottom of current publication processes in the public sector, which could be made even friendlier to citizens and enterprises.

  9. An empirical analysis of the distribution of overshoots in a stationary Gaussian stochastic process

    Science.gov (United States)

    Carter, M. C.; Madison, M. W.

    1973-01-01

    The frequency distribution of overshoots in a stationary Gaussian stochastic process is analyzed. The primary methods involved in this analysis are computer simulation and statistical estimation. Computer simulation is used to generate stationary Gaussian stochastic processes that have selected autocorrelation functions. An analysis of the simulation results reveals a frequency distribution for overshoots with a functional dependence on the mean and variance of the process. Statistical estimation is then used to estimate the mean and variance of a process. It is shown that, given an autocorrelation function and the mean and variance of the number of overshoots, a frequency distribution for overshoots can be estimated.
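
    A stationary Gaussian process with a chosen autocorrelation can be simulated and its overshoots (upcrossings of a level) counted in a few lines; this is a generic illustration of the procedure described, not the study's original code.

```python
import numpy as np

# Simulate a stationary Gaussian AR(1) process (exponential autocorrelation)
# and count overshoots, i.e. upcrossings of a fixed level.
rng = np.random.default_rng(0)
n, rho, level = 200_000, 0.9, 1.0

x = np.empty(n)
x[0] = rng.standard_normal()
noise = rng.standard_normal(n) * np.sqrt(1 - rho**2)   # keeps unit variance
for i in range(1, n):
    x[i] = rho * x[i - 1] + noise[i]

upcrossings = np.sum((x[:-1] < level) & (x[1:] >= level))
print(f"{upcrossings} overshoots of level {level} in {n} samples")
```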

  10. The certification process of the LHCb distributed computing software

    CERN Document Server

    CERN. Geneva

    2015-01-01

    DIRAC contains around 200 thousand lines of Python code, and LHCbDIRAC around 120 thousand. The testing process for each release consists of a number of steps, including static code analysis, unit tests, integration tests, regression tests and system tests. We dubbed the full p...

  11. Bubble size distribution in surface wave breaking entraining process

    Institute of Scientific and Technical Information of China (English)

    HAN; Lei; YUAN; YeLi

    2007-01-01

    From the similarity theorem, an expression for the bubble population is derived as a function of the air entrainment rate, the turbulent kinetic energy (TKE) spectrum density and the surface tension. The bubble size spectrum that we obtain has a dependence of a^(-2.5+n_d) on the bubble radius a, in which n_d is positive and dependent on the form of the TKE spectrum within the viscous dissipation range. To relate the bubble population to wave parameters, an expression for the air entrainment rate is deduced by introducing two statistical relations for wave breaking. The bubble population vertical distribution is also derived, based on two assumptions from two typical observation results.

  12. Dense distributed processing in a hindlimb scratch motor network

    DEFF Research Database (Denmark)

    Guzulaitis, Robertas; Hounsgaard, Jørn Dybkjær

    2014-01-01

    In reduced preparations, hindlimb movements can be generated by a minimal network of neurons in the limb innervating spinal segments. The network of neurons that generates real movements is less well delineated. In an ex vivo carapace-spinal cord preparation from adult turtles (Trachemys scripta elegans), we show that ventral horn interneurons in mid-thoracic spinal segments are functionally integrated in the hindlimb scratch network. First, mid-thoracic interneurons receive intense synaptic input during scratching and behave like neurons in the hindlimb enlargement. Second, some mid... of a distributed motor network that secures motor coherence.

  13. Nonlinear plasma processes and the formation of electron kappa distribution

    Science.gov (United States)

    Yoon, Peter

    2016-07-01

    The goal of nonequilibrium statistical mechanics is to establish a fundamental relationship between the time-irreversible macroscopic dynamics and the underlying time-reversible behavior of the microscopic system. The paradigm for achieving this seemingly paradoxical goal is the concept of probability. For classical systems Boltzmann accomplished this through his H theorem and his kinetic equation for dilute gas. Boltzmann's H function is the same as the classical extensive entropy aside from the minus sign, and his kinetic equation is applicable for short-range molecular interaction. For plasmas, the long-range electromagnetic force dictates the inter-particle interaction, and the underlying entropy is expected to exhibit non-extensive, or non-additive, behavior. Among potential models for the non-additive entropy, the celebrated Tsallis entropy is the most well known. One of the most useful fundamental kinetic equations that govern the long-range plasma interaction is that of weak turbulence kinetic theory. At present, however, there is no clear-cut connection between the Tsallis entropy and the kinetic equations that govern plasma behavior. This can be contrasted to Boltzmann's H theorem, which is built upon his kinetic equation. The best one can do is to show that the consequences of the Tsallis entropy and the plasma kinetic equation are the same, that is, they both imply the kappa distribution. This presentation will overview the physics of electron acceleration by beam-generated Langmuir turbulence, and discuss the asymptotic solution that can rigorously be shown to correspond to the kappa distribution. Such a finding is strong evidence, if not a water-tight proof, that there must be a profound inter-relationship between the Tsallis thermostatistical theory and the plasma kinetic theory.

  14. Open-path FTIR spectroscopy of magma degassing processes during eight lava fountains on Mount Etna

    Science.gov (United States)

    La Spina, Alessandro; Burton, Mike; Allard, Patrick; Alparone, Salvatore; Murè, Filippo

    2016-04-01

    In June-July 2001 a series of 16 discrete lava fountain paroxysms occurred at the Southeast summit crater (SEC) of Mount Etna, preceding a 28-day long violent flank eruption. Each paroxysm was preceded by lava effusion, growing seismic tremor and a crescendo of Strombolian explosive activity culminating into powerful lava fountaining up to 500m in height. During 8 of these 16 events we could measure the chemical composition of the magmatic gas phase (H2O, CO2, SO2, HCl, HF and CO), using open-path Fourier transform infrared (OP-FTIR) spectrometry at ˜1-2km distance from SEC and absorption spectra of the radiation emitted by hot lava fragments. We show that each fountaining episode was characterized by increasingly CO2-rich gas release, with CO2/SO2 and CO2/HCl ratios peaking in coincidence with maxima in seismic tremor and fountain height, whilst the SO2/HCl ratio showed a weak inverse relationship with respect to eruption intensity. Moreover, peak values in both the CO2/SO2 ratio and seismic tremor amplitude for each paroxysm were found to increase linearly in proportion with the repose interval (2-6 days) between lava fountains. These observations, together with a model of volatile degassing at Etna, support the following driving process. Prior to and during the June-July 2001 lava fountain sequence, the shallow (˜2km) magma reservoir feeding SEC received an increasing influx of deeply derived carbon dioxide, likely promoted by the deep ascent of volatile-rich primitive basalt that produced the subsequent flank eruption. This CO2-rich gas supply led to gas accumulation and overpressure in SEC reservoir, generating a bubble foam layer whose periodical collapse powered the successive fountaining events. The anti-correlation between SO2/HCl and eruption intensity is best explained by enhanced syn-eruptive degassing of chlorine from finer particles produced during more intense magma fragmentation.

  15. Log-normal distribution from a process that is not multiplicative but is additive.

    Science.gov (United States)

    Mouri, Hideaki

    2013-10-01

    The central limit theorem ensures that a sum of random variables tends to a Gaussian distribution as their total number tends to infinity. However, for a class of positive random variables, we find that the sum tends faster to a log-normal distribution. Although the sum tends eventually to a Gaussian distribution, the distribution of the sum is always close to a log-normal distribution rather than to any Gaussian distribution if the summands are numerous enough. This is in contrast to the current consensus that any log-normal distribution is due to a product of random variables, i.e., a multiplicative process, or equivalently to nonlinearity of the system. In fact, the log-normal distribution is also observable for a sum, i.e., an additive process that is typical of linear systems. We show conditions for such a sum, an analytical example, and an application to random scalar fields such as those of turbulence.
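
    The claim is easy to probe numerically: sum a modest number of positive, skewed random variables and compare the skewness of the sum with the skewness of its logarithm. A quick sketch (the choice of log-normal summands is for convenience and is not taken from the paper):

```python
import numpy as np
from scipy.stats import skew

# Sums of positive skewed variables: the log of the sum is much closer to
# symmetric (Gaussian-like) than the sum itself, for moderate numbers of terms.
rng = np.random.default_rng(1)
n_terms, n_trials = 20, 100_000

summands = rng.lognormal(mean=0.0, sigma=1.0, size=(n_trials, n_terms))
s = summands.sum(axis=1)

print(f"skewness of the sum  : {skew(s):.2f}")
print(f"skewness of log(sum) : {skew(np.log(s)):.2f}")
```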

  16. STORMSeq: an open-source, user-friendly pipeline for processing personal genomics data in the cloud.

    Directory of Open Access Journals (Sweden)

    Konrad J Karczewski

    Full Text Available The increasing public availability of personal complete genome sequencing data has ushered in an era of democratized genomics. However, read mapping and variant calling software is constantly improving and individuals with personal genomic data may prefer to customize and update their variant calls. Here, we describe STORMSeq (Scalable Tools for Open-Source Read Mapping), a graphical interface cloud computing solution that does not require a parallel computing environment or extensive technical experience. This customizable and modular system performs read mapping, read cleaning, and variant calling and annotation. At present, STORMSeq costs approximately $2 and 5-10 hours to process a full exome sequence and $30 and 3-8 days to process a whole genome sequence. We provide this open-access and open-source resource as a user-friendly interface in Amazon EC2.

  17. STORMSeq: an open-source, user-friendly pipeline for processing personal genomics data in the cloud.

    Science.gov (United States)

    Karczewski, Konrad J; Fernald, Guy Haskin; Martin, Alicia R; Snyder, Michael; Tatonetti, Nicholas P; Dudley, Joel T

    2014-01-01

    The increasing public availability of personal complete genome sequencing data has ushered in an era of democratized genomics. However, read mapping and variant calling software is constantly improving and individuals with personal genomic data may prefer to customize and update their variant calls. Here, we describe STORMSeq (Scalable Tools for Open-Source Read Mapping), a graphical interface cloud computing solution that does not require a parallel computing environment or extensive technical experience. This customizable and modular system performs read mapping, read cleaning, and variant calling and annotation. At present, STORMSeq costs approximately $2 and 5-10 hours to process a full exome sequence and $30 and 3-8 days to process a whole genome sequence. We provide this open-access and open-source resource as a user-friendly interface in Amazon EC2.

  18. The Distribution of the Interval between Events of a Cox Process with Shot Noise Intensity

    Directory of Open Access Journals (Sweden)

    Angelos Dassios

    2008-01-01

    Full Text Available Applying the theory of piecewise deterministic Markov processes, the probability generating function of a Cox process incorporating a shot noise process as the claim intensity is obtained. We also derive the Laplace transform of the distribution of the shot noise process at claim jump times, using the stationarity assumption on the shot noise process. Based on this Laplace transform and on the probability generating function of a Cox process with shot noise intensity, we obtain the distribution of the interval between events of a Cox process with shot noise intensity for insurance claims, and its moments, that is, the mean and variance.

  19. Distributed Processing and Cortical Specialization for Speech and Environmental Sounds in Human Temporal Cortex

    Science.gov (United States)

    Leech, Robert; Saygin, Ayse Pinar

    2011-01-01

    Using functional MRI, we investigated whether auditory processing of both speech and meaningful non-linguistic environmental sounds in superior and middle temporal cortex relies on a complex and spatially distributed neural system. We found that evidence for spatially distributed processing of speech and environmental sounds in a substantial…

  20. Space vehicle electrical power processing distribution and control study. Volume 1: Summary

    Science.gov (United States)

    Krausz, A.

    1972-01-01

    A concept for the processing, distribution, and control of electric power for manned space vehicles and future aircraft is presented. Emphasis is placed on the requirements of the space station and space shuttle configurations. The systems involved are referred to as the processing distribution and control system (PDCS), electrical power system (EPS), and electric power generation system (EPGS).

  1. The Maximum Surplus Distribution before Ruin in an Erlang(n)Risk Process Perturbed by Diffusion

    Institute of Scientific and Technical Information of China (English)

    Zhen Zhong ZHANG; Jie Zhong ZOU; Yuan Yuan LIU

    2011-01-01

    In this paper, we consider the distribution of the maximum surplus before ruin in a generalized Erlang(n) risk process (i.e., a convolution of n exponential distributions with possibly different parameters) perturbed by diffusion. It is shown that the maximum surplus distribution before ruin satisfies an integro-differential equation with certain boundary conditions. Explicit expressions are obtained when the claim amounts are rationally distributed. Finally, the surplus distribution at the time of ruin and the surplus distribution immediately before ruin are presented.

  2. Simulation model of fatigue crack opening/closing phenomena for predicting RPG load under arbitrary stress distribution field

    Energy Technology Data Exchange (ETDEWEB)

    Toyosada, M.; Niwa, T. [Kyushu Univ., Fukuoka (Japan)]

    1995-12-31

    In this paper, Newman's calculation model is modified to account for the effect, neglected in his model, of the change of the stress distribution ahead of a crack, and to leave elastic-plastic material along the crack surface in keeping with the compatibility of the Dugdale model. In addition to the above treatment, the authors introduce into the model plastic shrinkage at the instant new crack surfaces are generated, caused by the release of internal forces at the yield stress level during the unloading process. Moreover, the model is extended to arbitrary stress distribution fields. Using the model, the RPG load is simulated for a centre-notched specimen under constant-amplitude loading with various stress ratios and with a decreased maximum load while the minimum load is kept constant.

  3. Distributed Processing System for Restoration of Electric Power Distribution Network Using Two-Layered Contract Net Protocol

    Science.gov (United States)

    Kodama, Yu; Hamagami, Tomoki

    A distributed processing system for restoration of an electric power distribution network using a two-layered CNP is proposed. The goal of this study is to develop a restoration system suited to future power networks with distributed generators. The state of the art of this study is that the two-layered CNP is applied to a distributed computing environment in practical use. The two-layered CNP has two classes of agents in the network, named field agents and operating agents. In order to avoid conflicts between tasks, an operating agent controls the privilege for managers to send task announcement messages in the CNP. This technique realizes coordination between agents that work asynchronously and in parallel with one another. Moreover, this study implements the distributed processing system using a de facto standard multi-agent framework, JADE (Java Agent DEvelopment framework). This study conducts simulation experiments on power distribution network restoration and compares the proposed system with the previous system. The results confirm the effectiveness of the proposed system.
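
    A minimal single-round sketch of the contract-net idea with an announcement privilege is shown below. Class and method names are illustrative only and do not reproduce the paper's JADE implementation; the "cost" used to rank bids is a made-up placeholder.

```python
# Minimal sketch of a two-layered contract-net round: an "operating agent"
# serialises the right to announce tasks, "field agents" bid, best bid wins.

class FieldAgent:
    def __init__(self, name, spare_capacity):
        self.name = name
        self.spare_capacity = spare_capacity

    def bid(self, task):
        # Bid only if this agent can pick up the de-energised load.
        if self.spare_capacity >= task["load"]:
            return {"agent": self, "cost": task["load"] / self.spare_capacity}
        return None


class OperatingAgent:
    """Upper layer: grants announcement privilege to one manager at a time."""

    def __init__(self):
        self._busy = False

    def request_privilege(self):
        if self._busy:
            return False
        self._busy = True
        return True

    def release_privilege(self):
        self._busy = False


def contract_net_round(task, field_agents, operating_agent):
    if not operating_agent.request_privilege():
        return None  # another manager is announcing; avoid conflicting awards
    try:
        bids = [b for b in (a.bid(task) for a in field_agents) if b]
        if not bids:
            return None
        best = min(bids, key=lambda b: b["cost"])
        best["agent"].spare_capacity -= task["load"]
        return best["agent"].name
    finally:
        operating_agent.release_privilege()


if __name__ == "__main__":
    agents = [FieldAgent("feeder-A", 40.0), FieldAgent("feeder-B", 25.0)]
    op = OperatingAgent()
    print(contract_net_round({"load": 20.0}, agents, op))  # -> feeder-A
```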

  4. EVALUATION OF POLLUTION PREVENTION OPTIONS TO REDUCE STYRENE EMISSIONS FROM FIBER-REINFORCED PLASTIC OPEN MOLDING PROCESSES

    Science.gov (United States)

    Pollution prevention (P2) options to reduce styrene emissions, such as new materials and application equipment, are commercially available to the operators of open molding processes. However, information is lacking on the emissions reduction that these options can achieve. To me...

  5. VMStools: Open-source software for the processing, analysis and visualisation of fisheries logbook and VMS data

    NARCIS (Netherlands)

    Hintzen, N.T.; Bastardie, F.; Beare, D.J.; Piet, G.J.; Ulrich, C.; Deporte, N.; Egekvist, J.; Degel, H.

    2012-01-01

    VMStools is a package of open-source software, built using the freeware environment R, specifically developed for the processing, analysis and visualisation of landings (logbooks) and vessel location data (VMS) from commercial fisheries. Analyses start with standardized data formats for logbook (EFL

  6. VMStools: Open-source software for the processing, analysis and visualization of fisheries logbook and VMS data

    DEFF Research Database (Denmark)

    Hintzen, Niels T.; Bastardie, Francois; Beare, Doug

    2012-01-01

    VMStools is a package of open-source software, built using the freeware environment R, specifically developed for the processing, analysis and visualisation of landings (logbooks) and vessel location data (VMS) from commercial fisheries. Analyses start with standardized data formats for logbook (...

  9. Automation of the CFD Process on Distributed Computing Systems

    Science.gov (United States)

    Tejnil, Ed; Gee, Ken; Rizk, Yehia M.

    2000-01-01

    A script system was developed to automate and streamline portions of the CFD process. The system was designed to facilitate the use of CFD flow solvers on supercomputer and workstation platforms within a parametric design event. Integrating solver pre- and postprocessing phases, the fully automated ADTT script system marshalled the required input data, submitted the jobs to available computational resources, and processed the resulting output data. A number of codes were incorporated into the script system, which itself was part of a larger integrated design environment software package. The IDE and scripts were used in a design event involving a wind tunnel test. This experience highlighted the need for efficient data and resource management in all parts of the CFD process. To facilitate the use of CFD methods to perform parametric design studies, the script system was developed using UNIX shell and Perl languages. The goal of the work was to minimize the user interaction required to generate the data necessary to fill a parametric design space. The scripts wrote out the required input files for the user-specified flow solver, transferred all necessary input files to the computational resource, submitted and tracked the jobs using the resource queuing structure, and retrieved and post-processed the resulting dataset. For computational resources that did not run queueing software, the script system established its own simple first-in-first-out queueing structure to manage the workload. A variety of flow solvers were incorporated in the script system, including INS2D, PMARC, TIGER and GASP. Adapting the script system to a new flow solver was made easier through the use of object-oriented programming methods. The script system was incorporated into an ADTT integrated design environment and evaluated as part of a wind tunnel experiment. The system successfully generated the data required to fill the desired parametric design space. This stressed the computational
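
    A rough Python analogue of the simple first-in-first-out fallback queue described above is sketched here (the original system used UNIX shell and Perl; the solver invocations are placeholders only).

```python
# Simple FIFO job runner for hosts without queueing software:
# jobs are launched strictly in submission order, one at a time.

import subprocess
from collections import deque

def run_fifo(commands):
    """Run shell commands strictly in first-in-first-out order."""
    queue = deque(commands)
    results = []
    while queue:
        cmd = queue.popleft()
        print(f"submitting: {cmd}")
        completed = subprocess.run(cmd, shell=True)
        results.append((cmd, completed.returncode))
    return results

if __name__ == "__main__":
    # Placeholder commands; real runs would invoke e.g. INS2D or GASP cases.
    jobs = ["echo case-alpha", "echo case-beta", "echo case-gamma"]
    for cmd, rc in run_fifo(jobs):
        print(f"{cmd!r} finished with exit code {rc}")
```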

  10. Point process models for household distributions within small areal units

    Directory of Open Access Journals (Sweden)

    Zack W. Almquist

    2012-06-01

    Full Text Available Spatio-demographic data sets are increasingly available worldwide, permitting ever more realistic modeling and analysis of social processes ranging from mobility to disease transmission. The information provided by these data sets is typically aggregated by areal unit, for reasons of both privacy and administrative cost. Unfortunately, such aggregation does not permit fine-grained assessment of geography at the level of individual households. In this paper, we propose to partially address this problem via the development of point process models that can be used to effectively simulate the location of individual households within small areal units.
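
    The basic idea can be illustrated with a toy simulation: given the household count reported for a small areal unit, place that many points inside the unit. The sketch below uses a uniform placement within a rectangular unit; the paper's models are considerably richer.

```python
# Toy household-placement simulation for a single rectangular areal unit.

import random

def simulate_households(n_households, xmin, ymin, xmax, ymax, seed=None):
    """Return simulated household coordinates inside a rectangular areal unit."""
    rng = random.Random(seed)
    return [(rng.uniform(xmin, xmax), rng.uniform(ymin, ymax))
            for _ in range(n_households)]

if __name__ == "__main__":
    # A hypothetical census block reporting 12 households.
    points = simulate_households(12, 0.0, 0.0, 250.0, 180.0, seed=42)
    for x, y in points:
        print(f"({x:7.1f}, {y:7.1f})")
```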

  11. Web Services Implementations at Land Process and Goddard Earth Sciences Distributed Active Archive Centers

    Science.gov (United States)

    Cole, M.; Bambacus, M.; Lynnes, C.; Sauer, B.; Falke, S.; Yang, W.

    2007-12-01

    NASA's vast array of scientific data within its Distributed Active Archive Centers (DAACs) is especially valuable both to traditional research scientists and to the emerging market of Earth Science Information Partners. For example, the air quality science and management communities are increasingly using satellite-derived observations in their analyses and decision making. The Air Quality Cluster in the Federation of Earth Science Information Partners (ESIP) uses web infrastructures of interoperability, or Service Oriented Architecture (SOA), to extend data exploration, use, and analysis and provides a user environment for DAAC products. In an effort to continually offer these NASA data to the broadest research community audience, and reusing emerging technologies, both NASA's Goddard Earth Science (GES) and Land Process (LP) DAACs have engaged in a web services pilot project. Through these projects both GES and LP have exposed data through the Open Geospatial Consortium's (OGC) Web Services standards. Reusing several different existing applications and implementation techniques, GES and LP successfully exposed a variety of data through distributed systems to be ingested into multiple end-user systems. The results of this project will enable researchers worldwide to access some of NASA's GES & LP DAAC data through OGC protocols. This functionality encourages interdisciplinary research while increasing data use through advanced technologies. This paper will concentrate on the implementation and use of OGC Web Services, specifically Web Map and Web Coverage Services (WMS, WCS) at GES and LP DAACs, and the value of these services within scientific applications, including integration with the DataFed air quality web infrastructure and in the development of data analysis web applications.

  12. Does open access improve the process and outcome of podiatric care?

    Science.gov (United States)

    Wrobel, James S; Davies, Michael L; Robbins, Jeffrey M

    2011-05-19

    Open access to clinics is a management strategy to improve healthcare delivery. Providers are sometimes hesitant to adopt open access because of fear of increased visits for potentially trivial complaints. We hypothesized that open access clinics would result in decreased wait times, an increased number of podiatry visits, fewer no-shows, higher rates of acute care visits, and lower minor amputation rates compared with control clinics without open access. This study was a national retrospective case-control study of VHA (Veterans Hospital Administration) podiatry clinics in 2008. Eight case facilities reported to have open podiatry clinic access for at least one year were identified from an email survey. Sixteen control facilities with similar structural features (e.g., full-time podiatrists, health tech, residency program, reconstructive foot surgery, vascular, and orthopedic surgery) were identified in the same geographic region as the case facilities. Twenty-two percent of facilities responded to the survey. Fifty-four percent reported open access and 46% did not. There were no differences in facility or podiatry panel size, podiatry visits, or visit frequency between the cases and controls. Podiatry visits trended higher for control facilities but did not reach statistical significance. Case facilities had more new consults seen within 30 days (96%, 89%; P = 0.050) and lower minor amputation rates (0.62/1,000, 1.0/1,000; P = 0.041). The VHA is the world's largest managed care organization and it relies on clinical efficiencies as one mechanism to improve the quality of care. Open access clinics had more timely access for new patients and lower rates of minor amputations.

  13. Joint Limit Distributions of Exceedances Point Processes and Partial Sums of Gaussian Vector Sequence

    Institute of Scientific and Technical Information of China (English)

    Zuo Xiang PENG; Jin Jun TONG; Zhi Chao WENG

    2012-01-01

    In this paper, we study the joint limit distributions of point processes of exceedances and partial sums of multivariate Gaussian sequences and show that the point processes and partial sums are asymptotically independent under some mild conditions. As a result, for a sequence of standardized stationary Gaussian vectors, we obtain that the point process of exceedances formed by the sequence (centered at the sample mean) converges in distribution to a Poisson process and is asymptotically independent of the partial sums. The asymptotic joint limit distributions of order statistics and partial sums are also investigated under different conditions.

  14. The study of face image preprocessing based on OpenCV

    Institute of Scientific and Technical Information of China (English)

    Liang Yonglin

    2012-01-01

    The collected face images are preprocessed and trained in order to improve their visual quality and clarity and to make them better suited to computer processing, facilitating image segmentation and edge detection, thereby improving the accuracy of face recognition and preparing for feature extraction and recognition. The PCA face recognition method is simple to implement and achieves a high recognition rate, and OpenCV's strength is that it implements many common algorithms in image processing and computer vision. Experimental results show that recognition of the preprocessed face images is both more accurate and faster.
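
    An illustrative OpenCV (cv2) pipeline of the kind described is sketched below: grayscale conversion, histogram equalisation and resizing, followed by PCA over the flattened images. File names, image size and component count are placeholders, not the paper's settings.

```python
# Face-image preprocessing and PCA training sketch with OpenCV and NumPy.

import cv2
import numpy as np

def preprocess(path, size=(64, 64)):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)   # load as grayscale
    if img is None:
        raise FileNotFoundError(path)
    img = cv2.equalizeHist(img)                    # improve contrast/clarity
    img = cv2.resize(img, size)                    # common size for training
    return img

def pca_train(paths, n_components=2):
    data = np.vstack([preprocess(p).flatten().astype(np.float32) for p in paths])
    mean, eigenvectors = cv2.PCACompute(data, mean=None, maxComponents=n_components)
    return mean, eigenvectors

if __name__ == "__main__":
    faces = ["face01.png", "face02.png", "face03.png"]   # hypothetical files
    mean, eig = pca_train(faces)
    print("eigenface matrix shape:", eig.shape)
```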

  15. Kinetic Analysis of Dynamic Positron Emission Tomography Data using Open-Source Image Processing and Statistical Inference Tools.

    Science.gov (United States)

    Hawe, David; Hernández Fernández, Francisco R; O'Suilleabháin, Liam; Huang, Jian; Wolsztynski, Eric; O'Sullivan, Finbarr

    2012-05-01

    In dynamic mode, positron emission tomography (PET) can be used to track the evolution of injected radio-labelled molecules in living tissue. This is a powerful diagnostic imaging technique that provides a unique opportunity to probe the status of healthy and pathological tissue by examining how it processes substrates. The spatial aspect of PET is well established in the computational statistics literature. This article focuses on its temporal aspect. The interpretation of PET time-course data is complicated because the measured signal is a combination of vascular delivery and tissue retention effects. If the arterial time-course is known, the tissue time-course can typically be expressed in terms of a linear convolution between the arterial time-course and the tissue residue. In statistical terms, the residue function is essentially a survival function - a familiar life-time data construct. Kinetic analysis of PET data is concerned with estimation of the residue and associated functionals such as flow, flux, volume of distribution and transit time summaries. This review emphasises a nonparametric approach to the estimation of the residue based on a piecewise linear form. Rapid implementation of this by quadratic programming is described. The approach provides a reference for statistical assessment of widely used one- and two-compartmental model forms. We illustrate the method with data from two of the most well-established PET radiotracers, (15)O-H(2)O and (18)F-fluorodeoxyglucose, used for assessment of blood perfusion and glucose metabolism respectively. The presentation illustrates the use of two open-source tools, AMIDE and R, for PET scan manipulation and model inference.
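
    The convolution relationship referred to above takes a standard form in the tracer-kinetics literature; a generic statement (the notation here is ours, not necessarily the article's) is:

```latex
% Tissue time-course C_T as a convolution of the arterial input C_a with the
% residue function R (a survival-type function: R(0)=1, non-increasing), with
% flow-related scaling absorbed into R:
\[
  C_T(t) \;=\; \int_0^{t} C_a(s)\, R(t - s)\, \mathrm{d}s ,
\]
% Functionals such as the volume of distribution are then derived from R, e.g.
\[
  V_D \;=\; \int_0^{\infty} R(t)\, \mathrm{d}t .
\]
```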

  16. A Distributed Processing and Analysis System for Heliophysic Events

    Science.gov (United States)

    Hurlburt, N.; Cheung, M.; Bose, P.

    2008-12-01

    With several Virtual Observatories now under active development, the time is ripe to consider how they will interact to enable integrated studies that span the full range of Heliophysics. We present a solution that builds upon components of the Heliophysics Event Knowledgebase (HEK) being developed for the Solar Dynamics Observatory and the Heliophysics Event List Manager (HELMS), recently selected as part of the NASA VxO program. A Heliophysics Event Analysis and Processing System (HEAPS) could increase the scientific productivity of Heliophysics data by increasing the visibility of relevant events contained within them while decreasing the incremental costs of incorporating more events in research studies. Here we present the relevant precursors to such a system and show how it could operate within the Heliophysics Data Environment.

  17. The Solar-Type Hard-Binary Frequency and Distributions of Orbital Parameters in the Open Cluster M37

    Science.gov (United States)

    Geller, Aaron M.; Meibom, Soren; Barnes, Sydney A.; Mathieu, Robert D.

    2014-02-01

    Binary stars, and particularly the short-period ``hard'' binaries, govern the dynamical evolution of star clusters and determine the formation rates and mechanisms for exotic stars like blue stragglers and X-ray sources. Understanding the near-primordial hard-binary population of star clusters is of primary importance for dynamical models of star clusters, which have the potential to greatly advance our understanding of star cluster evolution. Yet the binary frequencies and distributions of binary orbital parameters (period, eccentricity, etc.) for young coeval stellar populations are poorly known, due to a lack of necessary observations. The young (~540 Myr) open cluster M37 hosts a rich binary population that can be used to empirically define these initial conditions. Importantly, this cluster has been the target of a comprehensive WIYN/Hydra radial-velocity (RV) survey, from which we have already identified a nearly complete sample of 329 solar-type (1.5 data with a multi-epoch RV survey using WIYN/Hydra to derive kinematic orbital solutions for these 82 binaries in M37. This project was granted time in 2013B and scheduled for later this year. We anticipate that about half of the detected binaries in M37 will acquire enough RV measurements (>=10) in 2013B to begin searching for orbital solutions. With this proposal and perhaps one additional semester we should achieve >=10 RV measurements for the remaining binaries.

  18. A Flexsim-based Optimization for the Operation Process of Cold-Chain Logistics Distribution Centre

    Directory of Open Access Journals (Sweden)

    X. Zhu

    2014-04-01

    Full Text Available With people's increasing concern about food safety, cold-chain logistics distribution centres play an important role in preventing food from spoiling. At present, cold-chain logistics distribution centres suffer from excessive transportation, a low degree of automation, unreasonable layout planning, complex distribution processes, and so on. Solving these problems is important for achieving efficient distribution. First, the operation process of a fruit and vegetable cold-chain logistics distribution centre is modelled and simulated using Flexsim software. The paper then analyses the preliminary output data and identifies bottlenecks and idle resources. Finally, the system is adjusted to obtain a better result, which is intended to serve as a reference for the modelling and simulation of the operation processes of other cold-chain logistics distribution centres.

  19. Research on image processing in scotopic vision based on OpenCV

    Institute of Scientific and Technical Information of China (English)

    Chen Yong; Zhang Hu; Li Yuan; Zheng Fan; Xie Zhengxiang

    2013-01-01

    In order to obtain more image information in low-illumination environments, video captured under low illumination is processed using human-vision contrast-resolution compensation and inter-frame temporal filtering of motion sequences, building on OpenCV, an open-source computer vision library. Processing of locally stored low-illumination video verifies that the algorithm can effectively enhance the information in the images.
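
    A purely illustrative OpenCV sketch of the two stages is given below. The paper's specific contrast-resolution compensation is not reproduced; CLAHE stands in for the contrast-enhancement step and a running average stands in for the inter-frame temporal filtering. The file name is hypothetical.

```python
# Low-light video enhancement sketch: spatial contrast enhancement + temporal filter.

import cv2

def enhance_low_light(video_path, alpha=0.25):
    cap = cv2.VideoCapture(video_path)          # locally stored low-light video
    clahe = cv2.createCLAHE(clipLimit=3.0, tileGridSize=(8, 8))
    avg = None
    frames = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        gray = clahe.apply(gray)                # spatial contrast enhancement
        avg = gray.astype("float32") if avg is None else \
            cv2.addWeighted(gray.astype("float32"), alpha, avg, 1.0 - alpha, 0)
        frames.append(avg.astype("uint8"))      # temporally filtered frame
    cap.release()
    return frames

if __name__ == "__main__":
    out = enhance_low_light("night_clip.avi")   # hypothetical file name
    print(f"processed {len(out)} frames")
```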

  20. Interaction of ecological and angler processes: experimental stocking in an open access, spatially structured fishery.

    Science.gov (United States)

    Mee, Jonathan A; Post, John R; Ward, Hillary; Wilson, Kyle L; Newton, Eric; Cantin, Ariane

    2016-09-01

    Effective management of socioecological systems requires an understanding of the complex interactions between people and the environment. In recreational fisheries, which are prime examples of socioecological systems, anglers are analogous to mobile predators in natural predator-prey systems, and individual fisheries in lakes across a region are analogous to a spatially structured landscape of prey patches. Hence, effective management of recreational fisheries across large spatial scales requires an understanding of the dynamic interactions among ecological density dependent processes, landscape-level characteristics, and angler behaviors. We focused on the stocked component of the open access rainbow trout (Oncorhynchus mykiss) fishery in British Columbia (BC), and we used an experimental approach wherein we manipulated stocking densities in a subset of 34 lakes in which we monitored angler effort, fish abundance, and fish size for up to seven consecutive years. We used an empirically derived relationship between fish abundance and fish size across rainbow trout populations in BC to provide a measure of catch-based fishing quality that accounts for the size-abundance trade off in this system. We replicated our experimental manipulation in two regions known to have different angler populations and broad-scale access costs. We hypothesized that angler effort would respond to variation in stocking density, resulting in spatial heterogeneity in angler effort but homogeneity in catch-based fishing quality within regions. We found that there is an intermediate stocking density for a given lake or region at which angler effort is maximized (i.e., an optimal stocking density), and that this stocking density depends on latent effort and lake accessibility. Furthermore, we found no clear effect of stocking density on our measure of catch-based fishing quality, suggesting that angler effort homogenizes catch-related attributes leading to an eroded relationship between

  1. BioFed: federated query processing over life sciences linked open data.

    Science.gov (United States)

    Hasnain, Ali; Mehmood, Qaiser; Sana E Zainab, Syeda; Saleem, Muhammad; Warren, Claude; Zehra, Durre; Decker, Stefan; Rebholz-Schuhmann, Dietrich

    2017-03-15

    Biomedical data, e.g. from knowledge bases and ontologies, is increasingly made available following open linked data principles, at best as RDF triple data. This is a necessary step towards unified access to biological data sets, but it still requires solutions to query multiple endpoints for their heterogeneous data to eventually retrieve all the meaningful information. Suggested solutions are based on query federation approaches, which require the submission of SPARQL queries to endpoints. Due to the size and complexity of available data, these solutions have to be optimised for efficient retrieval times and for users in life sciences research. Last but not least, over time, the reliability of data resources in terms of access and quality has to be monitored. Our solution (BioFed) federates data over 130 SPARQL endpoints in life sciences and tailors query submission according to the provenance information. BioFed has been evaluated against the state-of-the-art solution FedX and forms an important benchmark for the life science domain. The efficient cataloguing approach of the federated query processing system 'BioFed', the triple-pattern-wise source selection and the semantic source normalisation form the core of our solution. It gathers and integrates data from newly identified public endpoints for federated access. Basic provenance information is linked to the retrieved data. Moreover, BioFed makes use of the latest SPARQL standard (i.e., 1.1) to leverage the full benefits for query federation. The evaluation is based on 10 simple and 10 complex queries, which address data in 10 major and very popular data sources (e.g., DrugBank, SIDER). BioFed is a solution for a single point of access to a large number of SPARQL endpoints providing life science data. It facilitates efficient query generation for data access and provides basic provenance information in combination with the retrieved data. BioFed fully supports SPARQL 1.1 and gives access to the
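
    A plain SPARQL 1.1 federated query of the kind such systems optimise can be issued with the SPARQLWrapper package, as sketched below. The endpoint URLs and vocabulary are placeholders; BioFed's own source selection and provenance handling are not reproduced here.

```python
# Federated SPARQL 1.1 query sketch using a SERVICE clause (placeholder endpoints).

from SPARQLWrapper import SPARQLWrapper, JSON

LOCAL_ENDPOINT = "http://example.org/sparql"             # hypothetical
REMOTE_ENDPOINT = "http://example.org/drugbank/sparql"   # hypothetical

query = """
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?drug ?label WHERE {
  ?drug rdfs:label ?label .
  SERVICE <%s> {                # delegate this pattern to a second endpoint
    ?drug a <http://example.org/vocab/Drug> .
  }
} LIMIT 10
""" % REMOTE_ENDPOINT

sparql = SPARQLWrapper(LOCAL_ENDPOINT)
sparql.setQuery(query)
sparql.setReturnFormat(JSON)
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["drug"]["value"], row["label"]["value"])
```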

  2. New technical design of food packaging makes the opening process easier for patients with hand disorders.

    Science.gov (United States)

    Hensler, Stefanie; Herren, Daniel B; Marks, Miriam

    2015-09-01

    Opening packaged food is a complex daily activity carried out worldwide. Peelable packaging, as used for cheese or meat, causes real problems for many consumers, especially elderly people and those with hand disorders. Our aim was to investigate the possibility of producing meat packaging that is easier for patients with hand disorders to open. One hundred patients with hand osteoarthritis were asked to open a meat package currently available in supermarkets (Type A) and a modified, newly designed version (Type B), and rate their experiences with a consumer satisfaction index (CSI). The mean CSI of the Type B packs was 68.9%, compared with 41.9% for Type A (p food packages that afford greater consumer satisfaction. Such future packaging would benefit not only people with hand disorders but also the population as a whole.

  3. COLLABORATION FOR ENHANCING THE SYSTEM DEVELOPMENT PROCESS IN OPEN SOURCE DILIGENCE

    Directory of Open Access Journals (Sweden)

    Murtaza Hussain Shaikh

    2012-02-01

    Full Text Available According to various opponents and commercial giants in the software industry, open-source-style software development has enough capacity to complete large-scale projects successfully. But we have seen many flaws and loopholes in the collaboration and handling of mega-scale projects in the open-source environment. Collaboration is perhaps the key to successful project development. In this article we try to identify feasible and reliable solutions for better ways of collaborating in open-source system development. Some of the issues found in the development phase of open source are also identified, and a proposed solution is discussed in this research article by examining successful communities such as GNU, the Apache Software Foundation, and the Eclipse Foundation. It must be kept in mind that to improve collaboration in the open-source environment, both the development community and the people involved should be more creative.

  4. Influence of Processing Parameters on Granularity Distribution of Superalloy Powders during PREP

    Institute of Scientific and Technical Information of China (English)

    Huanming CHEN; Benfu HU; Yiwen ZHANG; Huiying LI; Quanmao YU

    2003-01-01

    In order to investigate the influence of processing parameters on the granularity distribution of superalloy powders during the atomization of plasma rotating electrode processing (PREP), FGH95 superalloy powders are prepared in this paper under different processing conditions by PREP, and the influence of the PREP processing parameters on the granularity distribution of the FGH95 superalloy powders is discussed on the basis of fractal geometry theory. The results show that with increasing rotating velocity of the self-consuming electrode, the fractal dimension of the granularity distribution increases linearly, which results in a larger proportion of smaller powders. Changing the interval between the plasma gun and the self-consuming electrode has only a slight effect on the granularity distribution, and the fractal dimension of the granularity distribution correspondingly changes only a little.

  5. Itzï (version 17.1): an open-source, distributed GIS model for dynamic flood simulation

    Science.gov (United States)

    Guillaume Courty, Laurent; Pedrozo-Acuña, Adrián; Bates, Paul David

    2017-05-01

    Worldwide, floods are acknowledged as one of the most destructive hazards. In human-dominated environments, their negative impacts are ascribed not only to the increase in frequency and intensity of floods but also to a strong feedback between the hydrological cycle and anthropogenic development. In order to advance a more comprehensive understanding of this complex interaction, this paper presents the development of a new open-source tool named Itzï that enables the 2-D numerical modelling of rainfall-runoff processes and surface flows integrated with the open-source geographic information system (GIS) software known as GRASS. Therefore, it takes advantage of the ability given by GIS environments to handle datasets with variations in both temporal and spatial resolutions. Furthermore, the presented numerical tool can handle datasets from different sources with varied spatial resolutions, facilitating the preparation and management of input and forcing data. This ability reduces the preprocessing time usually required by other models. Itzï uses a simplified form of the shallow water equations, the damped partial inertia equation, for the resolution of surface flows, and the Green-Ampt model for the infiltration. The source code is now publicly available online, along with complete documentation. The numerical model is verified against three different test cases: firstly, a comparison with an analytic solution of the shallow water equations is introduced; secondly, a hypothetical flooding event in an urban area is implemented, where results are compared to those from an established model using a similar approach; and lastly, the reproduction of a real inundation event that occurred in the city of Kingston upon Hull, UK, in June 2007, is presented. The numerical approach proved its ability to reproduce the analytic and synthetic test cases. Moreover, simulation results of the real flood event showed its suitability for identifying areas affected by flooding
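
    The abstract names the Green-Ampt model for infiltration; for readers unfamiliar with it, the standard textbook form is shown below (a generic statement, since Itzï's implementation details may differ):

```latex
% Green--Ampt infiltration rate:
\[
  f(t) \;=\; K_s \left( 1 + \frac{\psi_f\, \Delta\theta}{F(t)} \right),
\]
% where K_s is the saturated hydraulic conductivity, \psi_f the wetting-front
% suction head, \Delta\theta the soil-moisture deficit and F(t) the cumulative
% infiltrated depth.
```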

  6. The Gaia-ESO Survey: radial distribution of abundances in the Galactic disc from open clusters and young-field stars

    Science.gov (United States)

    Magrini, L.; Randich, S.; Kordopatis, G.; Prantzos, N.; Romano, D.; Chieffi, A.; Limongi, M.; François, P.; Pancino, E.; Friel, E.; Bragaglia, A.; Tautvaišienė, G.; Spina, L.; Overbeek, J.; Cantat-Gaudin, T.; Donati, P.; Vallenari, A.; Sordo, R.; Jiménez-Esteban, F. M.; Tang, B.; Drazdauskas, A.; Sousa, S.; Duffau, S.; Jofré, P.; Gilmore, G.; Feltzing, S.; Alfaro, E.; Bensby, T.; Flaccomio, E.; Koposov, S.; Lanzafame, A.; Smiljanic, R.; Bayo, A.; Carraro, G.; Casey, A. R.; Costado, M. T.; Damiani, F.; Franciosini, E.; Hourihane, A.; Lardo, C.; Lewis, J.; Monaco, L.; Morbidelli, L.; Sacco, G.; Sbordone, L.; Worley, C. C.; Zaggia, S.

    2017-06-01

    Context. The spatial distribution of elemental abundances in the disc of our Galaxy gives insights both on its assembly process and subsequent evolution, and on the stellar nucleogenesis of the different elements. Gradients can be traced using several types of objects as, for instance, (young and old) stars, open clusters, HII regions, planetary nebulae. Aims: We aim to trace the radial distributions of abundances of elements produced through different nucleosynthetic channels - the α-elements O, Mg, Si, Ca and Ti, and the iron-peak elements Fe, Cr, Ni and Sc - by use of the Gaia-ESO IDR4 results for open clusters and young-field stars. Methods: From the UVES spectra of member stars, we have determined the average composition of clusters with ages > 0.1 Gyr. We derived statistical ages and distances of field stars. We traced the abundance gradients using the cluster and field populations and compared them with a chemo-dynamical Galactic evolutionary model. Results: The adopted chemo-dynamical model, with the new generation of metallicity-dependent stellar yields for massive stars, is able to reproduce the observed spatial distributions of abundance ratios, in particular the abundance ratios of [O/Fe] and [Mg/Fe] in the inner disc (5 kpc

  7. Carrier-Grade Distributed Open Cloud Computing for Telecom Operators

    Institute of Scientific and Technical Information of China (English)

    Wang Dapeng; Xing Kai; Sun Jiafei

    2013-01-01

    This paper studies how telecom operators can develop cloud computing services. It points out the changes in operators' business models during the evolution towards cloud computing, analyses the challenges and requirements of carrier-grade cloud computing, and designs a system architecture for carrier-grade distributed open cloud computing. It also proposes the hardware and software selection, distributed deployment method, and operation and service provision model used in building a typical carrier-grade cloud computing centre. Finally, it summarizes the steps for constructing a carrier-grade cloud computing platform and proposes directions for the evolution of the platform architecture, the user experience, and the combination of cloud computing with the Internet of Things.

  8. TRACC: an open source software for processing sap flux data from thermal dissipation probes

    Science.gov (United States)

    Eric J. Ward; Jean-Christophe Domec; John King; Ge Sun; Steve McNulty; Asko Noormets

    2017-01-01

    Key message: TRACC is open-source software for standardizing the cleaning, conversion, and calibration of sap flux density data from thermal dissipation probes, which addresses issues of nighttime transpiration and water storage. Abstract: Thermal dissipation probes (TDPs) have become a widely used method of monitoring plant water use in recent years. The use of TDPs...

  9. The Positions of Virtual Knowledge Brokers in the Core Process of Open Innovation

    NARCIS (Netherlands)

    Hacievliyagil, N.K.; Maisonneuve, Y.E.; Auger, J.F.; Hartmann, L.

    2007-01-01

    Several companies are implementing the strategy of open innovation in their research and development operations. They become more dependent, therefore, on their capabilities to exchange knowledge and technology with external parties. To facilitate these exchanges, virtual knowledge brokers use web-b

  10. A Genetic Algorithm-Based Approach for Process Scheduling In Distributed Operating Systems

    Directory of Open Access Journals (Sweden)

    2012-01-01

    Full Text Available A distributed computing system comprising networked heterogeneous processors requires efficient process allocation algorithms to achieve minimum turnaround time and the highest possible throughput. To execute processes efficiently on a distributed system, processes must be correctly assigned to processors and the execution order of the processes must be determined so that the overall execution time is minimized. Even when the target processors are fully connected, communication among processors is fast, and no dependencies exist among processes, the scheduling problem is NP-complete. The complexity of the scheduling problem depends on the number of processors, the process execution times, and the processor network topology. Distributed systems may be homogeneous or heterogeneous; in heterogeneous systems, the differences between processors lead to different execution times for an individual process on different processors, which makes the scheduling problem more complex. Our proposed genetic algorithm is applicable to both homogeneous and heterogeneous systems.
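
    A compact genetic algorithm of the general kind described is sketched below. It is illustrative only, not the authors' exact operators: a chromosome assigns each process to a processor, fitness is the makespan under heterogeneous execution times, and the execution-time matrix is made up.

```python
# GA sketch for process-to-processor assignment minimizing makespan.

import random

EXEC_TIME = [           # EXEC_TIME[process][processor], heterogeneous system
    [4, 6, 9],
    [7, 3, 5],
    [2, 8, 4],
    [6, 5, 3],
    [5, 4, 7],
]
N_CPU = len(EXEC_TIME[0])

def makespan(assignment):
    loads = [0] * N_CPU
    for proc_idx, cpu in enumerate(assignment):
        loads[cpu] += EXEC_TIME[proc_idx][cpu]
    return max(loads)

def evolve(pop_size=30, generations=100, mutation_rate=0.1, seed=1):
    rng = random.Random(seed)
    n = len(EXEC_TIME)
    pop = [[rng.randrange(N_CPU) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)                       # elitist selection
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n)                # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < mutation_rate:         # mutation: move one process
                child[rng.randrange(n)] = rng.randrange(N_CPU)
            children.append(child)
        pop = parents + children
    best = min(pop, key=makespan)
    return best, makespan(best)

if __name__ == "__main__":
    assignment, span = evolve()
    print("assignment:", assignment, "makespan:", span)
```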

  11. Development of Open source-based automatic shooting and processing UAV imagery for Orthoimage Using Smart Camera UAV

    Science.gov (United States)

    Park, J. W.; Jeong, H. H.; Kim, J. S.; Choi, C. U.

    2016-06-01

    Recently, aerial photography with unmanned aerial vehicle (UAV) systems has used UAVs and remote control through a connection to a ground control system over a radio frequency (RF) modem with a bandwidth of about 430 MHz. However, as mentioned earlier, the existing method of using an RF modem has limitations for long-distance communication. The smart camera equipment's LTE (long-term evolution), Bluetooth, and Wi-Fi were used to implement a UAV with the developed UAV communication module system, which carried out close-range aerial photogrammetry with automatic shooting. The automatic shooting system is an image capturing device for drones in areas that need image capture, together with software for loading and managing a smart camera. This system is composed of automatic shooting using the sensors of the smart camera and shooting catalogue management, which manages the captured images and their information. The UAV imagery processing module used Open Drone Map. This study examined the feasibility of using the smart camera as the payload for a photogrammetric UAV system. The open-source tools used were Android, OpenCV (Open Computer Vision), RTKLIB, and Open Drone Map.

  12. Development of Open source-based automatic shooting and processing UAV imagery for Orthoimage Using Smart Camera UAV

    Directory of Open Access Journals (Sweden)

    J. W. Park

    2016-06-01

    Full Text Available Recently, aerial photography with unmanned aerial vehicle (UAV) systems has used UAVs and remote control through a connection to a ground control system over a radio frequency (RF) modem with a bandwidth of about 430 MHz. However, as mentioned earlier, the existing method of using an RF modem has limitations for long-distance communication. The smart camera equipment's LTE (long-term evolution), Bluetooth, and Wi-Fi were used to implement a UAV with the developed UAV communication module system, which carried out close-range aerial photogrammetry with automatic shooting. The automatic shooting system is an image capturing device for drones in areas that need image capture, together with software for loading and managing a smart camera. This system is composed of automatic shooting using the sensors of the smart camera and shooting catalogue management, which manages the captured images and their information. The UAV imagery processing module used Open Drone Map. This study examined the feasibility of using the smart camera as the payload for a photogrammetric UAV system. The open-source tools used were Android, OpenCV (Open Computer Vision), RTKLIB, and Open Drone Map.

  13. Distributed chemical computing using ChemStar: an open source java remote method invocation architecture applied to large scale molecular data from PubChem.

    Science.gov (United States)

    Karthikeyan, M; Krishnan, S; Pandey, Anil Kumar; Bender, Andreas; Tropsha, Alexander

    2008-04-01

    We present the application of a Java remote method invocation (RMI) based open source architecture to distributed chemical computing. This architecture was previously employed for distributed data harvesting of chemical information from the Internet via the Google application programming interface (API; ChemXtreme). Due to its open source character and its flexibility, the underlying server/client framework can be quickly adapted to virtually every computational task that can be parallelized. Here, we present the server/client communication framework as well as an application to distributed computing of chemical properties on a large scale (currently the size of PubChem; about 18 million compounds), using both the Marvin toolkit and the open source JOELib package. As an application, for this set of compounds, the agreement of log P and TPSA between the packages was compared. Outliers were found to be mostly non-druglike compounds, and differences could usually be explained by differences in the underlying algorithms. ChemStar is the first open source distributed chemical computing environment built on Java RMI, which is also easily adaptable to user demands due to its "plug-in architecture". The complete source codes as well as calculated properties along with links to PubChem resources are available on the Internet via a graphical user interface at http://moltable.ncl.res.in/chemstar/.
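
    The property-agreement comparison described above can be illustrated locally with the open-source RDKit toolkit, as in the sketch below. The paper itself compared the Marvin toolkit with JOELib over PubChem via Java RMI; RDKit and the two example molecules are used here only because they are freely available.

```python
# Compute log P and TPSA for a couple of example molecules with RDKit.

from rdkit import Chem
from rdkit.Chem import Descriptors, Crippen

smiles = {
    "aspirin": "CC(=O)OC1=CC=CC=C1C(=O)O",
    "caffeine": "CN1C=NC2=C1C(=O)N(C)C(=O)N2C",
}

for name, smi in smiles.items():
    mol = Chem.MolFromSmiles(smi)
    logp = Crippen.MolLogP(mol)      # Crippen/Wildman log P
    tpsa = Descriptors.TPSA(mol)     # topological polar surface area
    print(f"{name:10s} logP = {logp:5.2f}  TPSA = {tpsa:6.2f} Å²")
```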

  14. A Framework System for Intelligent Support in Open Distributed Learning Environments--A Look Back from 16 Years Later

    Science.gov (United States)

    Hoppe, H. Ulrich

    2016-01-01

    The 1998 paper by Martin Mühlenbrock, Frank Tewissen, and myself introduced a multi-agent architecture and a component engineering approach for building open distributed learning environments to support group learning in different types of classroom settings. It took up prior work on "multiple student modeling" as a method to configure…

  15. Limit Distributions for Generalized Jiřina Processes with Immigration

    Institute of Scientific and Technical Information of China (English)

    Yu Qiang LI

    2011-01-01

    A relationship between continuous state population-size-dependent branching (CSDB) processes with or without immigration and discrete state population-size-dependent branching (DSDB) processes with or without immigration is established via the representation of the former. Based on this relationship, some limiting distributions of CSDB processes with or without immigration are obtained.

  16. Internet-Based Software Tools for Analysis and Processing of LIDAR Point Cloud Data via the OpenTopography Portal

    Science.gov (United States)

    Nandigam, V.; Crosby, C. J.; Baru, C.; Arrowsmith, R.

    2009-12-01

    LIDAR is an excellent example of the new generation of powerful remote sensing data now available to Earth science researchers. Capable of producing digital elevation models (DEMs) more than an order of magnitude higher resolution than those currently available, LIDAR data allows earth scientists to study the processes that contribute to landscape evolution at resolutions not previously possible, yet essential for their appropriate representation. Along with these high-resolution datasets comes an increase in the volume and complexity of data that the user must efficiently manage and process in order for it to be scientifically useful. Although there are expensive commercial LIDAR software applications available, processing and analysis of these datasets are typically computationally inefficient on the conventional hardware and software that is currently available to most of the Earth science community. We have designed and implemented an Internet-based system, the OpenTopography Portal, that provides integrated access to high-resolution LIDAR data as well as web-based tools for processing of these datasets. By using remote data storage and high performance compute resources, the OpenTopography Portal attempts to simplify data access and standard LIDAR processing tasks for the Earth Science community. The OpenTopography Portal allows users to access massive amounts of raw point cloud LIDAR data as well as a suite of DEM generation tools to enable users to generate custom digital elevation models to best fit their science applications. The Cyberinfrastructure software tools for processing the data are freely available via the portal and conveniently integrated with the data selection in a single user-friendly interface. The ability to run these tools on powerful Cyberinfrastructure resources instead of their own labs provides a huge advantage in terms of performance and compute power. The system also encourages users to explore data processing methods and the
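
    A local illustration of the simplest point-cloud-to-DEM gridding step (binning returns by mean elevation) is sketched below; the portal's DEM generation tools are of course more sophisticated, and the data here are synthetic.

```python
# Grid LIDAR-like points into a regular DEM by per-cell mean elevation.

import numpy as np

def grid_points_to_dem(x, y, z, cell_size):
    """Bin point returns into a regular grid, averaging elevations per cell."""
    cols = ((x - x.min()) // cell_size).astype(int)
    rows = ((y - y.min()) // cell_size).astype(int)
    dem = np.full((rows.max() + 1, cols.max() + 1), np.nan)
    sums = np.zeros_like(dem)
    counts = np.zeros_like(dem)
    np.add.at(sums, (rows, cols), z)
    np.add.at(counts, (rows, cols), 1)
    mask = counts > 0
    dem[mask] = sums[mask] / counts[mask]
    return dem

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x, y = rng.uniform(0, 100, 5000), rng.uniform(0, 100, 5000)
    z = 200 + 0.5 * x + rng.normal(0, 0.3, 5000)   # synthetic sloping surface
    dem = grid_points_to_dem(x, y, z, cell_size=5.0)
    print("DEM shape:", dem.shape)
```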

  17. Subcritical, Critical and Supercritical Size Distributions in Random Coagulation-Fragmentation Processes

    Institute of Scientific and Technical Information of China (English)

    Dong HAN; Xin Sheng ZHANG; Wei An ZHENG

    2008-01-01

    We consider the asymptotic probability distribution of the size of a reversible random coagulation-fragmentation process in the thermodynamic limit. We prove that the distributions of small, medium and the largest clusters converge to Gaussian, Poisson and 0-1 distributions in the supercritical stage (post-gelation), respectively. We show also that the mutually dependent distributions of clusters will become independent after the occurrence of a gelation transition. Furthermore, it is proved that all the number distributions of clusters are mutually independent at the critical stage (gelation), but the distributions of medium and the largest clusters are mutually dependent with positive correlation coefficient in the supercritical stage. When the fragmentation strength goes to zero, there will exist only two types of clusters in the process: one type consists of the smallest clusters, the other is the largest one, which has a size nearly equal to the volume (total number of units).

  18. Cache-Based Aggregate Query Shipping: An Efficient Scheme of Distributed OLAP Query Processing

    Institute of Scientific and Technical Information of China (English)

    Hua-Ming Liao; Guo-Shun Pei

    2008-01-01

    Our study introduces a novel distributed query plan refinement phase in an enhanced architecture of a distributed query processing engine (DQPE). Query plan refinement generates potentially efficient distributed query plans by the reusable aggregate query shipping (RAQS) approach. The approach improves response time at the cost of pre-processing time. If the overheads cannot be compensated by the reuse of query results, RAQS is no longer favorable. Therefore a global cost estimation model is employed to select the proper operators: RR_Agg, R_Agg, or R_Scan. For the purpose of reusing the results of queries with aggregate functions in distributed query processing, a multi-level hybrid view caching (HVC) scheme is introduced. The scheme retains the advantages of partial-match and aggregate query result caching. In our solution, evaluations with distributed TPC-H queries show a significant improvement in average response time.

  19. Importing and Processing of OBJ Model Files in OpenGL

    Institute of Scientific and Technical Information of China (English)

    Lu Zhiwei

    2013-01-01

    Based on a study of the OBJ model file format and of OpenGL, a method is proposed for separating and extracting the model data from an OBJ model file and importing it into OpenGL, and then using the powerful capabilities of OpenGL to display the three-dimensional model and interact with it. This method allows OBJ models produced by 3D modelling software to be imported into OpenGL, reducing the difficulty of modelling in OpenGL.
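
    A minimal sketch of the data-separation step is given below: vertex and face records are pulled out of a Wavefront OBJ file so they can be handed to a renderer. The original work feeds them to OpenGL; here we only parse and count them, and the file name is hypothetical.

```python
# Minimal Wavefront OBJ parser: extract vertices and faces.

def load_obj(path):
    vertices, faces = [], []
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            parts = line.split()
            if not parts or parts[0].startswith("#"):
                continue
            if parts[0] == "v":                       # geometric vertex
                vertices.append(tuple(float(c) for c in parts[1:4]))
            elif parts[0] == "f":                     # face: indices are 1-based
                faces.append(tuple(int(p.split("/")[0]) - 1 for p in parts[1:]))
    return vertices, faces

if __name__ == "__main__":
    v, f = load_obj("model.obj")                      # hypothetical file
    print(f"{len(v)} vertices, {len(f)} faces")
```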

  20. A NEW INITIATIVE FOR TILING, STITCHING AND PROCESSING GEOSPATIAL BIG DATA IN DISTRIBUTED COMPUTING ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    A. Olasz

    2016-06-01

    Full Text Available Within recent years, several new approaches and solutions for Big Data processing have been developed. The Geospatial world is still facing the lack of well-established distributed processing solutions tailored to the amount and heterogeneity of geodata, especially when fast data processing is a must. The goal of such systems is to improve processing time by distributing data transparently across processing (and/or storage) nodes. These types of methodology are based on the concept of divide and conquer. Nevertheless, in the context of geospatial processing, most of the distributed computing frameworks have important limitations regarding both data distribution and data partitioning methods. Moreover, flexibility and expandability for handling various data types (often in binary formats) are also strongly required. This paper presents a concept for tiling, stitching and processing of big geospatial data. The system is based on the IQLib concept (https://github.com/posseidon/IQLib/) developed in the frame of the IQmulus EU FP7 research and development project (http://www.iqmulus.eu). The data distribution framework has no limitations on programming language environment and can execute scripts (and workflows) written in different development frameworks (e.g. Python, R or C#). It is capable of processing raster, vector and point cloud data. The above-mentioned prototype is presented through a case study dealing with country-wide processing of raster imagery. Further investigations on algorithmic and implementation details are in focus for the near future.

  1. a New Initiative for Tiling, Stitching and Processing Geospatial Big Data in Distributed Computing Environments

    Science.gov (United States)

    Olasz, A.; Nguyen Thai, B.; Kristóf, D.

    2016-06-01

    Within recent years, several new approaches and solutions for Big Data processing have been developed. The Geospatial world is still facing the lack of well-established distributed processing solutions tailored to the amount and heterogeneity of geodata, especially when fast data processing is a must. The goal of such systems is to improve processing time by distributing data transparently across processing (and/or storage) nodes. These types of methodology are based on the concept of divide and conquer. Nevertheless, in the context of geospatial processing, most of the distributed computing frameworks have important limitations regarding both data distribution and data partitioning methods. Moreover, flexibility and expandability for handling various data types (often in binary formats) are also strongly required. This paper presents a concept for tiling, stitching and processing of big geospatial data. The system is based on the IQLib concept (https://github.com/posseidon/IQLib/) developed in the frame of the IQmulus EU FP7 research and development project (http://www.iqmulus.eu). The data distribution framework has no limitations on programming language environment and can execute scripts (and workflows) written in different development frameworks (e.g. Python, R or C#). It is capable of processing raster, vector and point cloud data. The above-mentioned prototype is presented through a case study dealing with country-wide processing of raster imagery. Further investigations on algorithmic and implementation details are in focus for the near future.
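
    A toy illustration of the tiling/stitching idea is sketched below. IQLib itself targets large on-disk rasters distributed across processing nodes; here a small NumPy array stands in, and the "processing" applied per tile is just a scaling operation.

```python
# Split a raster into tiles, "process" each tile, and stitch the result back.

import numpy as np

def tile(raster, tile_size):
    """Split a 2-D array into (row, col, block) tiles covering the raster."""
    tiles = []
    for r in range(0, raster.shape[0], tile_size):
        for c in range(0, raster.shape[1], tile_size):
            tiles.append((r, c, raster[r:r + tile_size, c:c + tile_size].copy()))
    return tiles

def stitch(tiles, shape):
    """Reassemble tiles (e.g. after per-tile processing on worker nodes)."""
    out = np.empty(shape, dtype=tiles[0][2].dtype)
    for r, c, block in tiles:
        out[r:r + block.shape[0], c:c + block.shape[1]] = block
    return out

if __name__ == "__main__":
    raster = np.arange(100 * 120, dtype=np.float32).reshape(100, 120)
    parts = [(r, c, b * 2.0) for r, c, b in tile(raster, 32)]   # "process" tiles
    result = stitch(parts, raster.shape)
    assert np.allclose(result, raster * 2.0)
    print("tiles:", len(parts), "stitched shape:", result.shape)
```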

  2. Signal Processing for Fibre-optic Distributed Sensing Techniques Employing Brillouin Scattering

    Institute of Scientific and Technical Information of China (English)

    XIAO Shang-hui; LI Li

    2009-01-01

    As fibre-optic distributed scattering sensing systems are providing innovative solutions for the monitoring of large structures, Brillouin-based distributed scattering sensing techniques represent a new physical approach to structural health monitoring that seems extremely promising and is receiving much attention. This paper comprehensively presents some methods of signal interrogation for fibre-optic Brillouin-based distributed scattering sensing technology; in particular, it establishes an accurate Pseudo-Voigt model of the Brillouin gain spectrum and gives some results on spectrum analysis and data processing.
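
    A pseudo-Voigt profile is a weighted combination of a Lorentzian and a Gaussian; a generic form of the kind used to model the Brillouin gain spectrum is shown below (the paper's exact parametrisation may differ):

```latex
% Pseudo-Voigt approximation of the Brillouin gain spectrum with common centre
% \nu_B and full width at half maximum \Delta\nu_B:
\[
  g(\nu) \;=\; g_0 \left[ \eta\,
      \frac{(\Delta\nu_B/2)^2}{(\nu-\nu_B)^2 + (\Delta\nu_B/2)^2}
    \;+\; (1-\eta)\, \exp\!\left( -\,4\ln 2\,
      \frac{(\nu-\nu_B)^2}{\Delta\nu_B^{\,2}} \right) \right],
  \qquad 0 \le \eta \le 1 ,
\]
% where \nu_B is the Brillouin frequency shift, \Delta\nu_B the linewidth and
% \eta the Lorentzian weighting factor.
```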

  3. Modeling chemical and aerosol processes in the transition from closed to open cells during VOCALS-REx

    Directory of Open Access Journals (Sweden)

    J. Kazil

    2011-08-01

    Full Text Available Chemical and aerosol processes in the transition from closed- to open-cell circulation in the remote, cloudy marine boundary layer are explored. It has previously been shown that precipitation can initiate a transition from the closed- to the open-cellular state, but that the boundary layer cannot maintain this open-cell state without a resupply of cloud condensation nuclei (CCN. Potential sources of CCN include wind-driven production of sea salt from the ocean, nucleation from the gas phase, and entrainment from the free troposphere. In order to investigate CCN sources in the marine boundary layer and their role in supplying new particles, we have coupled in detail chemical, aerosol, and cloud processes in the WRF/Chem model, and added state-of-the-art representations of sea salt emissions and aerosol nucleation. We conduct numerical simulations of the marine boundary layer in the transition from a closed- to an open-cell state. Results are compared with observations in the Southeast Pacific boundary layer during the VAMOS Ocean-Cloud-Atmosphere-Land Study Regional Experiment (VOCALS-REx. The transition from the closed- to the open-cell state generates conditions that are conducive to nucleation by forming a cloud-scavenged, ultra-clean layer below the inversion base. Open cell updrafts loft dimethyl sulfide from the ocean surface into the ultra-clean layer, where it is oxidized during daytime to SO2 and subsequently to H2SO4. Low H2SO4 condensation sink values in the ultra-clean layer allow H2SO4 to rise to concentrations at which aerosol nucleation produces new aerosol in significant numbers. The existence of the ultra-clean layer is confirmed by observations. We find that the observed DMS flux from the ocean in the VOCALS-REx region can support a nucleation source of aerosol in open cells that exceeds sea salt emissions in terms of the number of particles produced

  4. Open-Source Multi-Language Audio Database for Spoken Language Processing Applications

    Science.gov (United States)

    2012-12-01

    Chinese language has a large number of dialects, with a resulting large influence on how people pronounce Mandarin [9]. Accents among Mandarin speakers...large open-source database of speech passages from web sites such as YouTube. 300 passages were collected in each of three languages—English...additional native-language listeners. The English and Mandarin were then force-aligned and labeled at the phonetic level using a combination of

  5. Brunn: An open source laboratory information system for microplates with a graphical plate layout design process

    OpenAIRE

    Larsson Rolf; Spjuth Ola; Andersson Claes; Alvarsson Jonathan; Wikberg Jarl ES

    2011-01-01

    Abstract Background Compound profiling and drug screening generates large amounts of data and is generally based on microplate assays. Current information systems used for handling this are mainly commercial, closed source, expensive, and heavyweight and there is a need for a flexible lightweight open system for handling plate design, and validation and preparation of data. Results A Bioclipse plugin consisting of a client part and a relational database was constructed. A multiple-step plate ...

  6. Open-source intelligence in the Czech military knowledge syst em and process design

    OpenAIRE

    Krejci, Roman

    2002-01-01

    Owing to the recent transitions in the Czech Republic, the Czech military must satisfy a large set of new requirements. One way the military intelligence can become more effective and can conserve resources is by increasing the efficiency of open-source intelligence (OSINT), which plays an important part in intelligence gathering in the age of information. When using OSINT effectively, the military intelligence can elevate its responsiveness to different types of crises and can also properly ...

  7. Deriving robust distributed business processes with automated transformations of fallible component processes

    NARCIS (Netherlands)

    Wang, Lei; Ferreira Pires, Luis; van Sinderen, Marten J.; Wombacher, Andreas; Chi, Chihung

    2015-01-01

    Due to the possibility of system crashes and network failures, the design of robust interactions for collaborative business processes is a challenge. If a process changes state, it sends messages to other relevant processes to inform them about this change. However, server crashes and network failur

  8. Open and Closed R&D Processes: Internal Versus External Knowledge

    Directory of Open Access Journals (Sweden)

    Mohammed Saleh Al.Ansari

    2013-02-01

    Full Text Available In an attempt to help keep up with the ever changing business environments, firms arecontinuously attempting to find ways to open up their organizations boundaries, enablingexternal sources to be used. Through a means of restructuring their R&D systems, firmswill face challenges when it comes to balancing their external and internal R&D activities,allowing them to profit from the amount of external knowledge that they retrieve.Throughout this paper, discussions on the influences that R&D configuration has on theperformance of firms and moderating R&D capacity will be discussed further.According to researched findings, firms that continue to rely on external R&D actionsexperience an increase in innovative performance, up to a certain point. Larger shares ofexternal R&D services will reduce a firm’s performance. Finding the perfect mediumground is something that all chemical engineering firms will need to work on, in order toincrease innovative performance, but not ruin their performance in the same respects. Thispaper will provide an increased amount of understanding in regards to the open innovationparadigm, which suggests that opportunity costs open for R&D borders is a lot higher forfirms that possess high quality technologically based knowledge stock (Elsevier B.V., 2012.

  9. Ratio limits and limiting conditional distributions for discrete-time birth-death processes

    NARCIS (Netherlands)

    Doorn, van Erik A.; Schrijner, Pauline

    1995-01-01

    We consider discrete-time birth-death processes with an absorbing state and study the conditional state distribution at time n given that absorption has not occurred by that time but will occur eventually. In particular, we establish conditions for the convergence of these distributions to a proper

  10. The Relationship of the Facial Nerve to the Condylar Process: A Cadaveric Study with Implications for Open Reduction Internal Fixation

    OpenAIRE

    Barham, H. P.; Collister, P.; V. D. Eusterman; Terella, A. M.

    2015-01-01

    Introduction. The mandibular condyle is the most common site of mandibular fracture. Surgical treatment of condylar fractures by open reduction and internal fixation (ORIF) demands direct visualization of the fracture. This project aimed to investigate the anatomic relationship of the tragus to the facial nerve and condylar process. Materials and Methods. Twelve fresh hemicadaver heads were used. An extended retromandibular/preauricular approach was utilized, with the incision being based pa...

  11. Stationary distributions for a class of generalized Fleming-Viot processes

    CERN Document Server

    Handa, Kenji

    2012-01-01

    We identify stationary distributions of generalized Fleming-Viot processes with jump mechanisms specified by certain beta laws together with a parameter measure. Each of these distributions is obtained from normalized stable random measures after a suitable biased transformation followed by mixing by the law of a Dirichlet random measure with the same parameter measure. The calculations are based primarily on the well-known relationship to measure-valued branching processes with immigration.

  12. ON THE ESTIMATION OF DISTANCE DISTRIBUTION FUNCTIONS FOR POINT PROCESSES AND RANDOM SETS

    Directory of Open Access Journals (Sweden)

    Dietrich Stoyan

    2011-05-01

    Full Text Available This paper discusses various estimators for the nearest neighbour distance distribution function D of a stationary point process and for the quadratic contact distribution function Hq of a stationary random closed set. It recommends the use of Hanisch's estimator of D, which is of Horvitz-Thompson type, and the minus-sampling estimator of Hq. This recommendation is based on simulations for Poisson processes and Boolean models.

  13. LWT Based Sensor Node Signal Processing in Vehicle Surveillance Distributed Sensor Network

    Science.gov (United States)

    Cha, Daehyun; Hwang, Chansik

    Previous vehicle surveillance research on distributed sensor networks focused on overcoming power limitations and communication bandwidth constraints in the sensor node. In spite of these constraints, a vehicle surveillance sensor node must perform signal compression, feature extraction, target localization, noise cancellation and collaborative signal processing with low computation and communication energy dissipation. In this paper, we introduce an algorithm for light-weight wireless sensor node signal processing based on lifting-scheme wavelet analysis feature extraction in a distributed sensor network.

  14. A distributed multiprocessor system designed for real-time image processing

    Science.gov (United States)

    Yin, Zhiyi; Heng, Wei

    2008-11-01

    In real-time image processing, a large amount of data needs to be processed at very high speed. Considering the problems faced in real-time image processing, a distributed multiprocessor system is proposed in this paper. In the design of the distributed multiprocessor system, processing tasks are allocated to various processes, which are bound to different CPUs. Several designs are discussed, and making full use of every process is very important to the system's performance. The main implementation issues concern inter-process communication, synchronization, and stability. System analysis and performance tests both show that the distributed multiprocessor system improves system performance in several respects, including delay, throughput, stability, and scalability, and that the system can be extended easily in both software and hardware. In short, the distributed multiprocessor system designed for real-time image processing, based on distributed algorithms, not only improves system performance in several respects but is also low in cost and easy to extend.
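
    The record above describes allocating image-processing tasks to separate operating-system processes bound to different CPUs. The following Python sketch is only a rough illustration of that pattern, not the authors' implementation; the strip-wise tile splitting, the thresholding stand-in for a real filter, and the use of os.sched_setaffinity (Linux-specific) are assumptions made for the example.

    import os
    import multiprocessing as mp

    import numpy as np

    def process_tile(args):
        # Pin this worker to one CPU core (Linux only), then "process" its tile.
        cpu_id, tile = args
        os.sched_setaffinity(0, {cpu_id % os.cpu_count()})
        return (tile > tile.mean()).astype(np.uint8)   # stand-in for a real image filter

    if __name__ == "__main__":
        frame = np.random.randint(0, 256, size=(1024, 1024), dtype=np.uint8)
        tiles = np.array_split(frame, os.cpu_count())   # one horizontal strip per core
        with mp.Pool(processes=os.cpu_count()) as pool:
            strips = pool.map(process_tile, list(enumerate(tiles)))
        result = np.vstack(strips)                      # reassemble the processed frame
        print(result.shape)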

  15. Ultralow field emission from thinned, open-ended, and defected carbon nanotubes by using microwave hydrogen plasma processing

    Science.gov (United States)

    Deng, Jian-Hua; Cheng, Lin; Wang, Fan-Jie; Yu, Bin; Li, Guo-Zheng; Li, De-Jun; Cheng, Guo-An

    2015-01-01

    Ultralow field emission is achieved from carbon nanotubes (CNTs) by using microwave hydrogen plasma processing. After the processing, typical capped CNT tips are removed, with thinned, open-ended, and defected CNTs left. Structural analyses indicate that the processed CNTs have more SP3-hybridized defects as compared to the pristine ones. The morphology of CNTs can be readily controlled by adjusting microwave powers, which change the shape of CNTs by means of hydrogen plasma etching. Processed CNTs with optimal morphology are found to have an ultralow turn-on field of 0.566 V/μm and threshold field of 0.896 V/μm, much better than 0.948 and 1.559 V/μm of the as-grown CNTs, respectively. This improved FE performance is ascribed to the structural changes of CNTs after the processing. The thinned and open-ended shape of CNTs can facilitate electron tunneling through barriers and additionally, the increased defects at tube walls can serve as new active emission sites. Furthermore, our plasma processed CNTs exhibit excellent field emission stability at a large emission current density of 10.36 mA/cm2 after being perfectly aged, showing promising prospects in applications as high-performance vacuum electron sources.
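
    The turn-on and threshold fields quoted above are conventionally interpreted through Fowler-Nordheim theory; the record itself does not spell the analysis out, so the standard form is reproduced here only as background. For work function φ and field-enhancement factor β, the emitted current density is approximately

        J = (A (βE)^2 / φ) · exp(−B φ^(3/2) / (βE)),

    where A and B are the usual Fowler-Nordheim constants. At fixed φ, a lower turn-on field implies a larger effective β or additional emission sites, which is consistent with the thinned, open-ended tips and the extra defect sites described in the record.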

  16. A decision support system using analytical hierarchy process (AHP) for the optimal environmental reclamation of an open-pit mine

    Science.gov (United States)

    Bascetin, A.

    2007-04-01

    The selection of an optimal reclamation method is one of the most important factors in open-pit design and production planning. It also affects economic considerations in open-pit design as a function of plan location and depth. Furthermore, the selection is a complex multi-person, multi-criteria decision problem. The group decision-making process can be improved by applying a systematic and logical approach to assess the priorities based on the inputs of several specialists from different functional areas within the mine company. The analytical hierarchy process (AHP) can be very useful in involving several decision makers with different conflicting objectives to arrive at a consensus decision. In this paper, the selection of an optimal reclamation method using an AHP-based model was evaluated for coal production in an open-pit coal mine located in the Seyitomer region in Turkey. The use of the proposed model indicates that it can be applied to improve group decision making in selecting a reclamation method that satisfies optimal specifications. Also, it is found that the decision process is systematic and that using the proposed model can reduce the time taken to select an optimal method.
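
    In AHP, each decision maker fills in pairwise comparison matrices and the priority weights are obtained from the principal eigenvector of each matrix, with a consistency ratio checked against Saaty's random index. The record does not list the matrices used for the Seyitomer mine, so the Python sketch below uses an invented 3x3 comparison matrix purely to show the weight and consistency computation.

    import numpy as np

    # Hypothetical pairwise comparison matrix (Saaty 1-9 scale) for three
    # reclamation criteria; the values are illustrative only.
    A = np.array([
        [1.0,   3.0,   5.0],
        [1/3.0, 1.0,   2.0],
        [1/5.0, 1/2.0, 1.0],
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    k = int(np.argmax(eigvals.real))             # index of the principal eigenvalue
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()                     # normalised priority weights

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)         # consistency index
    cr = ci / 0.58                               # 0.58 = Saaty's random index for n = 3
    print("weights:", weights.round(3), "consistency ratio:", round(cr, 3))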

  17. Reaction Mechanism and Distribution Behavior of Arsenic in the Bottom Blown Copper Smelting Process

    Directory of Open Access Journals (Sweden)

    Qinmeng Wang

    2017-08-01

    Full Text Available The control of arsenic, a toxic and carcinogenic element, is an important issue for all copper smelters. In this work, the reaction mechanism and distribution behavior of arsenic in the bottom blown copper smelting process (SKS process) were investigated and compared to the flash smelting process. There are obvious differences of arsenic distribution in the SKS process and flash process, resulting from the differences of oxygen potentials, volatilizations, smelting temperatures, reaction intensities, and mass transfer processes. Under stable production conditions, the distributions of arsenic among matte, slag, and gas phases are 6%, 12%, and 82%, respectively. Less arsenic is reported in the gas phase with the flash process than with the SKS process. The main arsenic species in the gas phase are AsS(g), AsO(g), and As2(g). Arsenic exists in the slag predominantly as As2O3(l), and in matte as As(l). High matte grade is harmful to the elimination of arsenic to gas. The changing of Fe/SiO2 has slight effects on the distributions of arsenic. In order to enhance the removal of arsenic from the SKS smelting system to the gas phase, low oxygen concentration, low ratios of oxygen/ore, and low matte grade should be chosen. In the SKS smelting process, no dust is recycled, and almost all dust is collected and further treated to eliminate arsenic and recover valuable metals by other process streams.

  18. Numerical prediction of temperature distribution in thermoset composites during laser curing process

    Institute of Scientific and Technical Information of China (English)

    吴存真; 孙志坚; 徐剑锋; 秦悦慧

    2002-01-01

    The temperature distribution in the advanced thermoset composite during the laser curing process was predicted with the use of the two-dimensional thermo-chemical model presented in this paper which also gives the governing equations based on the thermal history of the curing process. The finite-difference method was used to get the temperature distribution. This paper also deals with the effect of some factors (such as the winding velocity, the tape thickness and the laser heat source) on the temperature distribution.

  19. Digi-Clima Grid: image processing and distributed computing for recovering historical climate data

    Directory of Open Access Journals (Sweden)

    Sergio Nesmachnow

    2015-12-01

    Full Text Available This article describes the Digi-Clima Grid project, whose main goals are to design and implement semi-automatic techniques for digitalizing and recovering historical climate records applying parallel computing techniques over distributed computing infrastructures. The specific tool developed for image processing is described, and the implementation over grid and cloud infrastructures is reported. An experimental analysis over institutional and volunteer-based grid/cloud distributed systems demonstrates that the proposed approach is an efficient tool for recovering historical climate data. The parallel implementations make it possible to distribute the processing load, achieving accurate speedup values.

  20. The Open Method of Co-ordination and the Analysis of Mutual Learning Processes of the European Employment Strategy

    DEFF Research Database (Denmark)

    Nedergaard, Peter

    2005-01-01

    The purpose of this paper is to address two normative and interlinked methodological and theoretical questions concerning the Open Method of Coordination (OMC): First, what is the most appropriate approach to learning in the analyses of the processes of the European Employment Strategy (EES)? Second, how should mutual learning processes be diffused among the Member States in order to be efficient? In answering these two questions the paper draws on a social constructivist approach to learning thereby contributing to the debate about learning in the political science literature. At the same...

  1. All-digital signal-processing open-loop fiber-optic gyroscope with enlarged dynamic range.

    Science.gov (United States)

    Wang, Qin; Yang, Chuanchuan; Wang, Xinyue; Wang, Ziyu

    2013-12-15

    We propose and realize a new open-loop fiber-optic gyroscope (FOG) with an all-digital signal-processing (DSP) system where an all-digital phase-locked loop is employed for digital demodulation to eliminate the variation of the source intensity and suppress the bias drift. A Sagnac phase-shift tracking method is proposed to enlarge the dynamic range, and, with its aid, a new open-loop FOG, which can achieve a large dynamic range and high sensitivity at the same time, is realized. The experimental results show that, compared with the conventional open-loop FOG with the same fiber coil and optical devices, the proposed FOG reduces the bias instability from 0.259 to 0.018 deg/h and the angle random walk from 0.031 to 0.006 deg/h(1/2); moreover, it enlarges the dynamic range to ±360 deg/s, exceeding the maximum dynamic range of ±63 deg/s of the conventional open-loop FOG.

  2. Process Design and Economics for the Production of Algal Biomass: Algal Biomass Production in Open Pond Systems and Processing Through Dewatering for Downstream Conversion

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Ryan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Markham, Jennifer [National Renewable Energy Lab. (NREL), Golden, CO (United States); Kinchin, Christopher [National Renewable Energy Lab. (NREL), Golden, CO (United States); Grundl, Nicholas [National Renewable Energy Lab. (NREL), Golden, CO (United States); Tan, Eric C.D. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Humbird, David [DWH Process Consulting, Denver, CO (United States)

    2016-02-17

    This report describes in detail a set of aspirational design and process targets to better understand the realistic economic potential for the production of algal biomass for subsequent conversion to biofuels and/or coproducts, based on the use of open pond cultivation systems and a series of dewatering operations to concentrate the biomass up to 20 wt% solids (ash-free dry weight basis).

  3. On the Control of Automatic Processes: A Parallel Distributed Processing Account of the Stroop Effect.

    Science.gov (United States)

    Cohen, Jonathan D.; And Others

    1990-01-01

    It is proposed that attributes of automatization depend on the strength of a processing pathway, and that strength increases with training. With the Stroop effect as an example, automatic processes are shown through simulation to be continuous and to emerge gradually with practice. (SLD)

  4. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform.

    Science.gov (United States)

    Moutsatsos, Ioannis K; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J; Jenkins, Jeremy L; Holway, Nicholas; Tallarico, John; Parker, Christian N

    2017-03-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an "off-the-shelf," open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community.

  5. Performance enhancement of open loop gas recovery process by centrifugal separation of gases

    Science.gov (United States)

    Kalmani, S. D.; Joshi, A. V.; Bhattacharya, S.; Hunagund, P. V.

    2016-11-01

    The proposed INO-ICAL detector [1] is going to be instrumented with 28800 RPCs (Resistive Plate Chambers). These RPCs (2 × 2 m2 size) will consist of two glass electrodes separated by 2 mm and will use a gas mixture of Freon R134a, isobutane and sulphur hexafluoride (in the ratio of 95.3:4.5:0.2). An Open Ended System (OES), in which the gas mixture is vented to the atmosphere after a single passage through the detector, is most commonly used for small detector setups. However, OES cannot be used with the INO-ICAL detector due to reasons of cost and pollution. It is necessary, therefore, to recirculate the gas mixture in a closed loop. In a Closed Loop gas System (CLS) [2] the gas mixture is purified and recirculated after flowing through the RPC. The impurities which get accumulated in the gas mixture due to leaks or formation of radicals are removed by suitable filters. The Open Loop System (OLS) [3] is based on the separation and recovery of major gas components after passage of the gas mixture through the RPCs, and has the advantage that it does not need filters for removal of impurities. However, a CLS is found to be more efficient than OLS in the recovery of gases in the mixture. In this paper we discuss centrifugal separation [4] as a technique to extract major gas constituents and use this technique to improve the efficiency of OLS. Results from preliminary trial runs are reported.

  6. A Two-layer Model for the Simulation of the VARTM Process with Resin Distribution Layer

    Science.gov (United States)

    Young, Wen-Bin

    2013-12-01

    Vacuum assisted resin transfer molding (VARTM) is one of the important processes to fabricate high performance composites. In this process, resin is drawn into the mold to impregnate the fiber reinforcement and form a composite. A resin distribution layer with high permeability was often introduced on top of the fiber reinforcement to accelerate the filling speed. Due to the difference of the flow resistance in the resin distribution layer and the reinforcement, as well as the resulting through-thickness transverse flow, the filling flow field is intrinsically three-dimensional. This study developed a two-layer model with a two-dimensional formulation to simulate the filling flow of the VARTM process with a resin distribution layer. Two-dimensional flow was considered in each layer and a transverse flow in the thickness direction was estimated between the two layers. Thermal analysis including the transverse convection was also performed to better simulate the temperature distribution.

  7. An OpenMP Parallelisation of Real-time Processing of CERN LHC Beam Position Monitor Data

    CERN Document Server

    Renshall, H

    2012-01-01

    SUSSIX is a FORTRAN program for the post processing of turn-by-turn Beam Position Monitor (BPM) data, which computes the frequency, amplitude, and phase of tunes and resonant lines to a high degree of precision. For analysis of LHC BPM data a specific version run through a C steering code has been implemented in the CERN Control Centre to run on a server under the Linux operating system, but it became a real-time computational bottleneck preventing truly online study of the BPM data. Timing studies showed that the independent processing of each BPM's data was a candidate for parallelization, and the Open Multiprocessing (OpenMP) package with its simple insertion of compiler directives was tried. It proved to be easy to learn and use, problem free and efficient in this case, reaching a factor of ten reduction in real time over twelve cores on a dedicated server. This paper reviews the problem, shows the critical code fragments with their OpenMP directives and the results obtained.

  8. Distributed Temperature Measurement in a Self-Burning Coal Waste Pile through a GIS Open Source Desktop Application

    National Research Council Canada - National Science Library

    Lia Duarte; Ana Claudia Teodoro; Jose Alberto Gonçalves; Joana Ribeiro; Deolinda Flores; Alexia Lopez-Gil; Alejandro Dominguez-Lopez; Xabier Angulo-Vinuesa; Sonia Martin-Lopez; Miguel Gonzalez-Herraez

    2017-01-01

    .... The aim of this work was to develop a new application to produce dynamic maps for monitoring the temperature variations in a self-burning coal waste pile, under a GIS open source environment-GIS-ECOAL (freely available...

  9. Mobile Open-Source Solar-Powered 3-D Printers for Distributed Manufacturing in Off-Grid Communities

    National Research Council Canada - National Science Library

    Debbie L. King; Adegboyega Babasola; Joseph Rozario; Joshua M. Pearce

    2014-01-01

    .... This study designs and demonstrates the technical viability of two open-source mobile digital manufacturing facilities powered with solar photovoltaics, and capable of printing customizable OSAT in any...

  10. Effect of Polymerization Condition on Particle Size Distribution in St/BA/MAA Emulsion Polymerization Process

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    A series of St/BA/MAA emulsion polymerizations was carried out. By using PCS (photon correlation spectroscopy), the particle size distribution (PSD) over the whole St/BA/MAA emulsion polymerization process was obtained easily and quickly. The effect of polymerization conditions on the PSD in the St/BA/MAA emulsion process was discussed.

  11. Temporal bone radiology report classification using open source machine learning and natural language processing libraries.

    Science.gov (United States)

    Masino, Aaron J; Grundmeier, Robert W; Pennington, Jeffrey W; Germiller, John A; Crenshaw, E Bryan

    2016-06-06

    Radiology reports are a rich resource for biomedical research. Prior to utilization, trained experts must manually review reports to identify discrete outcomes. The Audiological and Genetic Database (AudGenDB) is a public, de-identified research database that contains over 16,000 radiology reports. Because the reports are unlabeled, it is difficult to select those with specific abnormalities. We implemented a classification pipeline using a human-in-the-loop machine learning approach and open source libraries to label the reports with one or more of four abnormality region labels: inner, middle, outer, and mastoid, indicating the presence of an abnormality in the specified ear region. Trained abstractors labeled radiology reports taken from AudGenDB to form a gold standard. These were split into training (80 %) and test (20 %) sets. We applied open source libraries to normalize and convert every report to an n-gram feature vector. We trained logistic regression, support vector machine (linear and Gaussian), decision tree, random forest, and naïve Bayes models for each ear region. The models were evaluated on the hold-out test set. Our gold-standard data set contained 726 reports. The best classifiers were linear support vector machine for inner and outer ear, logistic regression for middle ear, and decision tree for mastoid. Classifier test set accuracy was 90 %, 90 %, 93 %, and 82 % for the inner, middle, outer and mastoid regions, respectively. The logistic regression method was very consistent, achieving accuracy scores within 2.75 % of the best classifier across regions and a receiver operator characteristic area under the curve of 0.92 or greater across all regions. Our results indicate that the applied methods achieve accuracy scores sufficient to support our objective of extracting discrete features from radiology reports to enhance cohort identification in AudGenDB. The models described here are available in several free, open source libraries that
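
    The pipeline in the record converts each report to an n-gram feature vector and trains per-region classifiers, with logistic regression performing consistently well. The abstract does not name the open source libraries used, so the scikit-learn sketch below is only a hedged illustration of that kind of workflow; the toy reports and labels are invented.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Toy stand-ins for de-identified radiology report texts and a single
    # abnormality-region label (1 = inner-ear abnormality present); real training
    # data would come from the expert-labelled gold standard described above.
    reports = [
        "enlarged vestibular aqueduct noted, cochlea normal",
        "normal study, no abnormality of the temporal bone",
        "cochlear dysplasia with dilated vestibule",
        "mastoid air cells clear, ossicles intact",
    ]
    labels = [1, 0, 1, 0]

    clf = make_pipeline(
        CountVectorizer(ngram_range=(1, 2)),   # unigram + bigram features
        LogisticRegression(max_iter=1000),
    )
    clf.fit(reports, labels)
    print(clf.predict(["dilated vestibular aqueduct, mastoid clear"]))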

  12. Baseliner: an open source, interactive tool for processing sap flux data from thermal dissipation probes.

    Science.gov (United States)

    Andrew C. Oishi; David Hawthorne; Ram Oren

    2016-01-01

    Estimating transpiration from woody plants using thermal dissipation sap flux sensors requires careful data processing. Currently, researchers accomplish this using spreadsheets, or by personally writing scripts for statistical software programs (e.g., R, SAS). We developed the Baseliner software to help establish a standardized protocol for processing sap...
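
    Sap flux from thermal dissipation (Granier-type) probes is computed from the temperature difference between the heated and reference probes and a zero-flow baseline, which is exactly the baselining step the tool supports. The record does not restate the equation, so the widely used Granier (1987) calibration below is shown only as a hedged illustration of the underlying calculation.

    def granier_sap_flux(dT, dT_max):
        """Sap flux density (g m-2 s-1) from the Granier (1987) calibration.

        dT     : measured probe temperature difference (deg C)
        dT_max : zero-flow (baseline) temperature difference (deg C)
        """
        k = (dT_max - dT) / dT          # dimensionless flow index
        return 119.0 * k ** 1.231       # empirical calibration coefficient and exponent

    # Example: a night-time baseline of 10.0 deg C and a midday reading of 7.5 deg C.
    print(round(granier_sap_flux(7.5, 10.0), 1))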

  13. Numerical modeling of cold room's hinged door opening and closing processes

    Science.gov (United States)

    Carneiro, R.; Gaspar, P. D.; Silva, P. D.; Domingues, L. C.

    2016-06-01

    The need to rationalize energy consumption in the agrifood industry has hastened the development of methodologies to improve the thermal and energy performance of cold rooms. This paper presents a three-dimensional (3D) transient Computational Fluid Dynamics (CFD) modelling of a cold room to evaluate the air infiltration rate through hinged doors. A species transport model is used for modelling the tracer gas concentration decay technique. Numerical predictions indicate that the air temperature difference between spaces affects the air infiltration. For this case study, the infiltration rate increases by 0.016 m3 s-1 per K of air temperature difference. Knowledge of the evolution of air infiltration during door opening/closing times allows some conclusions to be drawn about its influence on the air conditions inside the cold room, and suggests best practices and simple technical improvements that can minimize air infiltration and consequently improve thermal performance and the rationalization of energy consumption.

  14. Electron ionization of open/closed chain isocarbonic molecules relevant in plasma processing: Theoretical cross sections

    Energy Technology Data Exchange (ETDEWEB)

    Patel, Umang R., E-mail: umangpatel193@yahoo.ca [Gandhinagar Institute of Technology, Moti Bhoyan, Gandhinagar-382721, Gujarat (India); Sardar Patel University, Vallabh Vidyanagar-388120, Gujarat (India); Joshipura, K. N.; Pandya, Siddharth H. [Sardar Patel University, Vallabh Vidyanagar-388120, Gujarat (India); Kothari, Harshit N. [Universal College of Engineering and Technology, Moti Bhoyan, Gandhinagar-382721, Gujarat (India)

    2014-01-28

    In this paper, we report theoretical electron impact ionization cross sections from threshold to 2000 eV for the isocarbonic open chain molecules C4H6, C4H8, and C4F6, including their isomers, and the closed chain molecules c-C4H8 and c-C4F8. The theoretical formalism employed presently, viz., the Complex Scattering Potential-ionization contribution method, has been used successfully for a variety of polyatomic molecules. The present ionization calculations are very important since results available for the studied targets are either scarce or none. Our work affords comparison of C4-containing hydrocarbon versus fluorocarbon molecules. Comparisons of the present ionization cross sections are made wherever possible, and new ionization data are also presented.

  15. Open Pension Funds in Poland: The Efects of the Pension Privatization Process

    Directory of Open Access Journals (Sweden)

    Oręziak Leokadia

    2014-10-01

    Full Text Available Since their establishment in 1999, the Open Pension Funds (OPFs) have comprised a mandatory capital pillar in the pension system of Poland. The paper's objective is to analyze the principles under which the OPFs function and assess their past and anticipated future impact on the state of the country's public finances, particularly on the public debt. The analysis also considers the past and potential effects of the OPFs' existence from the point of view of future levels of old-age pension. The studies are targeted at determining the threats connected with further maintenance of the OPFs from the point of view of both public finance stability and pension system security.

  16. Modeling and Analysis of Transient Processes in Open Resonant Structures New Methods and Techniques

    CERN Document Server

    Sirenko, Yuriy K; Ström, Staffan

    2007-01-01

    The principal goal of the book is to describe new accurate and robust algorithms for open resonant structures with substantially increased efficiency. These algorithms allow the extraction of complete information with estimated accuracy concerning the scattering of transient electromagnetic waves by complex objects. The determination and visualization of the electromagnetic fields, developed for realistic models, simplify and significantly speed up the solution to a wide class of fundamental and applied problems of electromagnetic field theory. The book presents a systematic approach to the study of electromagnetic waves scattering which can be introduced in undergraduate/postgraduate education in theoretical and applied radiophysics and different advanced engineering courses on antenna and wave-guide technology. On a broader level, the book should be of interest to scientists in optics, computational physics and applied mathematics.

  17. A Scalable, Open Source Platform for Data Processing, Archiving and Dissemination

    Science.gov (United States)

    2016-01-01

    ...summer workshop. This was necessary because of the sensitivity of some of the data. We worked in a compressed timeframe of just a couple of weeks in July 2013...

  18. C++Builder下基于OpenCV的数字图像处理%Digital image processing based on OpenCV in C++Builder

    Institute of Scientific and Technical Information of China (English)

    方玫; 喻擎苍; 李华强

    2008-01-01

    OpenCV, an open-source computer vision library for digital image processing, is introduced in detail. Using the digital image processing functions in OpenCV simplifies otherwise complex problems. The features and capabilities of OpenCV are described, the functions newly added in the latest version of OpenCV are discussed in detail, and the configuration of OpenCV under the C++Builder environment is explained. On this basis, two application examples are given. The work has practical value for research on digital image processing.

  19. Background Noise Distribution before and afterHigh-Resolution Processing in Ship-borne Radar

    Institute of Scientific and Technical Information of China (English)

    ZHANG Zhong

    2005-01-01

    When a high-resolution algorithm is applied in ship-borne radar, the algorithm's nonlinearity and the distributional characteristics of the background before high-resolution processing determine the distributional characteristics of the background clutter after high-resolution processing, and hence the subsequent detector design. Because the background noise before high-resolution processing has physical significance, a statistical model of the first-order Bragg lines and second-order components of sea clutter is put forward. Then, by statistical verification of measured data using higher-order cumulants, it is concluded that the background noise before high-resolution processing conforms to a normal distribution in ship-borne radar. The nonlinearity of the high-resolution algorithm means that the background noise after high-resolution processing conforms to a non-normal distribution. Non-normally distributed clutter mainly includes Weibull, lognormal and K clutter; Rayleigh clutter can be seen as a special case of Weibull clutter. These clutter types have different statistical characteristics and can be discriminated by clutter characteristics recognition. The distribution after high-resolution processing is determined by an improved minimum-entropy clutter characteristics recognition method based on the AIC rule, namely a two-parameter domain scanning method. This identification method has a higher recognition rate. It is verified that the background noise after high-resolution processing by pre-whitened MUSIC conforms to a lognormal distribution.
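
    The recognition step described above chooses among candidate clutter families (Weibull, lognormal, K) using an AIC-based criterion. As a hedged, much simplified stand-in for that step, the sketch below fits Weibull and lognormal models to synthetic data with scipy and compares them by AIC; it is not the two-parameter domain scanning method itself, and only two of the three families are compared for brevity.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    samples = rng.lognormal(mean=0.0, sigma=0.6, size=2000)   # synthetic "clutter" amplitudes

    def aic(dist, data):
        params = dist.fit(data, floc=0)                # fit with the location fixed at zero
        loglik = np.sum(dist.logpdf(data, *params))
        return 2 * len(params) - 2 * loglik

    scores = {
        "weibull": aic(stats.weibull_min, samples),
        "lognormal": aic(stats.lognorm, samples),
    }
    print("best fit:", min(scores, key=scores.get), scores)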

  20. Distributed processing; distributed functions?

    OpenAIRE

    Fox, Peter T.; FRISTON, KARL J

    2012-01-01

    After more than twenty years busily mapping the human brain, what have we learned from neuroimaging? This review (coda) considers this question from the point of view of structure–function relationships and the two cornerstones of functional neuroimaging; functional segregation and integration. Despite remarkable advances and insights into the brain’s functional architecture, the earliest and simplest challenge in human brain mapping remains unresolved: We do not have a principled way to map ...

  1. The OpenGeoSys coupling concept for THMC processes in subsurface and the neighboring hydro-compartments

    Science.gov (United States)

    Kalbacher, T.; Delfs, J. O.; Shao, H.; Boettcher, N.; Walther, M.; Kolditz, O.

    2012-12-01

    State-of-the-art computational models used for integrated water resources management are rapidly developing instruments. Advances in computational mathematics have revolutionized the variety and the nature of the problems that can be addressed by environmental scientists and engineers. For each hydro-compartment, from precipitation and surface run-off to catchment water balance and groundwater interactions, there exist many excellent simulation codes. However, their development has been isolated within different disciplines. The OpenGeoSys (OGS) project is a scientific open source initiative for numerical simulation of thermo-hydro-mechanical-chemical (THMC) processes in porous and fractured media. The basic concept is to provide a flexible numerical framework (using primarily the Finite Element Method (FEM)) for solving multi-field problems in porous and fractured media for applications in geoscience and hydrology. To this purpose, OGS is based on an object-oriented FEM concept including a broad spectrum of interfaces for pre- and post-processing. The idea includes a web-based platform for community access, outfitted with professional software engineering tools such as platform-independent compiling and fully automated benchmarking. The second strategy is to utilize an additional coupling concept that enables OGS simulations to interact sequentially with other individual modeling software in order to address coupled processes in neighboring hydrologic compartments, which includes methods of coupling different physical processes and different geometric model complexities under consideration of the spatial and temporal scale change and the required computational resources. The IWAS ToolBox concept.

  2. Exact run length distribution of the double sampling x-bar chart with estimated process parameters

    Directory of Open Access Journals (Sweden)

    Teoh, W. L.

    2016-05-01

    Full Text Available Since the run length distribution is generally highly skewed, a significant concern about focusing too much on the average run length (ARL) criterion is that we may miss some crucial information about a control chart’s performance. Thus it is important to investigate the entire run length distribution of a control chart for an in-depth understanding before implementing the chart in process monitoring. In this paper, the percentiles of the run length distribution for the double sampling (DS) X chart with estimated process parameters are computed. Knowledge of the percentiles of the run length distribution provides a more comprehensive understanding of the expected behaviour of the run length. This additional information includes the early false alarm, the skewness of the run length distribution, and the median run length (MRL). A comparison of the run length distribution between the optimal ARL-based and MRL-based DS X chart with estimated process parameters is presented in this paper. Examples of applications are given to aid practitioners to select the best design scheme of the DS X chart with estimated process parameters, based on their specific purpose.
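
    Run-length percentiles can be illustrated even for the simplest chart: the Monte Carlo sketch below simulates run lengths of an ordinary Shewhart X-bar chart with known parameters, conventional 3-sigma limits and a sustained 0.5-sigma mean shift, and reports the ARL next to the 5th, 50th and 95th percentiles. It is only a toy analogue of the exact DS chart computations in the record; all numbers are illustrative.

    import numpy as np

    rng = np.random.default_rng(1)
    n, L, shift = 5, 3.0, 0.5        # subgroup size, limit width (sigma), mean shift (sigma)

    run_lengths = []
    for _ in range(20_000):
        rl = 0
        while True:
            rl += 1
            xbar = rng.normal(loc=shift, scale=1.0 / np.sqrt(n))   # subgroup mean
            if abs(xbar) > L / np.sqrt(n):                         # signal outside the limits
                break
        run_lengths.append(rl)

    run_lengths = np.array(run_lengths)
    print("ARL:", run_lengths.mean().round(1),
          "percentiles (5, 50, 95):", np.percentile(run_lengths, [5, 50, 95]))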

  3. a Hadoop-Based Distributed Framework for Efficient Managing and Processing Big Remote Sensing Images

    Science.gov (United States)

    Wang, C.; Hu, F.; Hu, X.; Zhao, S.; Wen, W.; Yang, C.

    2015-07-01

    Various sensors from airborne and satellite platforms are producing large volumes of remote sensing images for mapping, environmental monitoring, disaster management, military intelligence, and other applications. However, it is challenging to efficiently store, query and process such big data due to data- and computing-intensive issues. In this paper, a Hadoop-based framework is proposed to manage and process big remote sensing data in a distributed and parallel manner. In particular, remote sensing data can be directly fetched from other data platforms into the Hadoop Distributed File System (HDFS). The Orfeo toolbox, a ready-to-use tool for large image processing, is integrated into MapReduce to provide affluent image processing operations. With the integration of HDFS, the Orfeo toolbox and MapReduce, these remote sensing images can be directly processed in parallel in a scalable computing environment. The experimental results show that the proposed framework can efficiently manage and process such big remote sensing data.

  4. Pyradi: an open-source toolkit for infrared calculation and data processing

    CSIR Research Space (South Africa)

    Willers, CJ

    2012-09-01

    Full Text Available Electro-optical system design, data analysis and modelling involve a significant amount of calculation and processing. Many of these calculations are of a repetitive and general nature, suitable for including in a generic toolkit. The availability...

  5. RIoTBench: A Real-time IoT Benchmark for Distributed Stream Processing Platforms

    OpenAIRE

    Shukla, Anshu; Chaturvedi, Shilpa; Simmhan, Yogesh

    2017-01-01

    The Internet of Things (IoT) is an emerging technology paradigm where millions of sensors and actuators help monitor and manage physical, environmental and human systems in real time. The inherent closed-loop responsiveness and decision making of IoT applications make them ideal candidates for using low-latency and scalable stream processing platforms. Distributed Stream Processing Systems (DSPS) hosted on Cloud data-centers are becoming the vital engine for real-time data processing and anal...

  6. Opening the black box: a study of the process of NICE guidelines implementation.

    Science.gov (United States)

    Spyridonidis, Dimitrios; Calnan, Michael

    2011-10-01

    This study informs 'evidence-based' implementation by using an innovative methodology to provide further understanding of the implementation process in the English NHS using two distinctly different NICE clinical guidelines as exemplars. The implementation process was tracked retrospectively and prospectively using a comparative case-study and longitudinal design. 74 unstructured interviews were carried out with 48 key informants (managers and clinicians) between 2007 and 2009. This study has shown that the NICE guidelines implementation process has both planned and emergent components, which was well illustrated by the use of the prospective longitudinal design in this study. The implementation process might be characterised as strategic and planned to begin with but became uncontrolled and subject to negotiation as it moved from the planning phase to adoption in everyday practice. The variations in the implementation process could be best accounted for in terms of differences in the structure and nature of the local organisational context. The latter pointed to the importance of managers as well as clinicians in decision-making about implementation. While national priorities determine the context for implementation the shape of the process is influenced by the interactions between doctors and managers, which influence the way they respond to external policy initiatives such as NICE guidelines. NICE and other national health policy-makers need to recognise that the introduction of planned change 'initiatives' in clinical practice are subject to social and political influences at the micro level as well as the macro level. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  7. Spatial distribution of juvenile and adult female Tanner crabs (Chionoecetes bairdi) in a glacial fjord ecosystem: Implications for recruitment processes

    Science.gov (United States)

    Nielsen, J.K.; Taggart, S.J.; Shirley, T.C.; Mondragon, J.

    2007-01-01

    A systematic pot survey in Glacier Bay, Alaska, was conducted to characterize the spatial distribution of juvenile and adult female Tanner crabs, and their association with depth and temperature. The information was used to infer important recruitment processes for Tanner crabs in glaciated ecosystems. High-catch areas for juvenile and adult female Tanner crabs were identified using local autocorrelation statistics. Spatial segregation by size class corresponded to features in the glacial landscape: high-catch areas for juveniles were located at the distal ends of two narrow glacial fjords, and high-catch areas for adults were located in the open waters of the central Bay. Juvenile female Tanner crabs were found at nearly all sampled depths (15-439 m) and temperatures (4-8°C), but the biggest catches were at depths crabs. © 2007 International Council for the Exploration of the Sea. Published by Oxford Journals. All rights reserved.

  8. Pentaho and Jaspersoft: A Comparative Study of Business Intelligence Open Source Tools Processing Big Data to Evaluate Performances

    Directory of Open Access Journals (Sweden)

    Victor M. Parra

    2016-10-01

    Full Text Available Regardless of the recent growth in the use of “Big Data” and “Business Intelligence” (BI) tools, little research has been undertaken about the implications involved. Analytical tools affect the development and sustainability of a company, as evaluating clientele needs to advance in the competitive market is critical. With the advancement of the population, processing large amounts of data has become too cumbersome for companies. At some stage in a company’s lifecycle, all companies need to create new and better data processing systems that improve their decision-making processes. Companies use BI Results to collect data that is drawn from interpretations grouped from cues in the data set BI information system that helps organisations with activities that give them the advantage in a competitive market. However, many organizations establish such systems without conducting a preliminary analysis of the needs and wants of a company, or without determining the benefits and targets that they aim to achieve with the implementation. They rarely measure the large costs associated with the implementation blowout of such applications, which results in these impulsive solutions that are unfinished or too complex and unfeasible, in other words unsustainable even if implemented. BI open source tools are specific tools that solve this issue for organizations in need, with data storage and management. This paper compares two of the best positioned BI open source tools in the market: Pentaho and Jaspersoft, processing big data through six different sized databases, especially focussing on their Extract Transform and Load (ETL) and Reporting processes by measuring their performances using Computer Algebra Systems (CAS). The ETL experimental analysis results clearly show that Jaspersoft BI has an increment of CPU time in the process of data over Pentaho BI, which is represented by an average of 42.28% in performance metrics over the six databases

  9. 基于OpenCV下的Visual C++数字图像处理方法%Visual C++ Digital Image Processing Method Based on OpenCV

    Institute of Scientific and Technical Information of China (English)

    滕俊; 王弟林; 文汉云

    2012-01-01

    OpenCV is a machine vision library with open source code. In application development with Visual C++, specific functions in the OpenCV vision library can be called directly to develop your own image processing program. The combination of Visual C++ and OpenCV for digital image processing is demonstrated by taking face recognition as an example.
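
    The demo program described above calls OpenCV's face-detection machinery from Visual C++; the same cascade-classifier API is exposed by OpenCV's Python binding, which is used below purely as a compact illustration. The image file name is a placeholder, and the snippet shows detection (locating faces), the first stage of a face-recognition demo.

    import cv2

    # Load the Haar cascade face detector shipped with OpenCV.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    image = cv2.imread("portrait.jpg")           # placeholder input image
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    for (x, y, w, h) in faces:                   # draw a box around each detected face
        cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imwrite("portrait_faces.jpg", image)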

  10. Peer Review Quality and Transparency of the Peer-Review Process in Open Access and Subscription Journals.

    Science.gov (United States)

    Wicherts, Jelte M

    2016-01-01

    Recent controversies highlighting substandard peer review in Open Access (OA) and traditional (subscription) journals have increased the need for authors, funders, publishers, and institutions to assure quality of peer-review in academic journals. I propose that transparency of the peer-review process may be seen as an indicator of the quality of peer-review, and develop and validate a tool enabling different stakeholders to assess transparency of the peer-review process. Based on editorial guidelines and best practices, I developed a 14-item tool to rate transparency of the peer-review process on the basis of journals' websites. In Study 1, a random sample of 231 authors of papers in 92 subscription journals in different fields rated transparency of the journals that published their work. Authors' ratings of the transparency were positively associated with quality of the peer-review process but unrelated to journal's impact factors. In Study 2, 20 experts on OA publishing assessed the transparency of established (non-OA) journals, OA journals categorized as being published by potential predatory publishers, and journals from the Directory of Open Access Journals (DOAJ). Results show high reliability across items (α = .91) and sufficient reliability across raters. Ratings differentiated the three types of journals well. In Study 3, academic librarians rated a random sample of 140 DOAJ journals and another 54 journals that had received a hoax paper written by Bohannon to test peer-review quality. Journals with higher transparency ratings were less likely to accept the flawed paper and showed higher impact as measured by the h5 index from Google Scholar. The tool to assess transparency of the peer-review process at academic journals shows promising reliability and validity. The transparency of the peer-review process can be seen as an indicator of peer-review quality allowing the tool to be used to predict academic quality in new journals.

  11. THE FEATURES OF LASER EMISSION ENERGY DISTRIBUTION AT MATHEMATIC MODELING OF WORKING PROCESS

    Directory of Open Access Journals (Sweden)

    A. Avsiyevich

    2013-01-01

    Full Text Available The spatial distribution of laser emission energy in different continuous-operation laser settings depends on many factors, above all on the design of the setting. For a more accurate description of the intensity distribution of multimode laser emission energy, an experimental-theoretical model is proposed that represents the experimentally measured laser emission distribution, to a given accuracy, as a superposition of basis functions. This model gives an approximation error of only 2.2 percent, as compared with 24.6 % and 61 % for uniform and Gaussian approximations, respectively. Using the proposed model allows the peculiarities of the interaction between the laser emission and the working surface to be taken into account more accurately and increases the accuracy of temperature field calculations in the mathematical modeling of laser treatment processes. A method for the experimental study of the laser emission energy distribution for a given source is presented, together with the mathematical apparatus for calculating the parameters of the laser emission energy intensity distribution as a function of radial distance in the surface heating zone.

  12. The distribution of particles in the plane dispersed by a simple 3-dimensional diffusion process

    DEFF Research Database (Denmark)

    Stockmarr, Anders

    2002-01-01

    Populations of particles dispersed in the 2-dimensional plane from a single point source may be grouped as focus expansion patterns, with an exponentially decreasing density, and more diffuse patterns with thicker tails. Exponentially decreasing distributions are often modelled as the result of 2-dimensional diffusion processes acting to disperse the particles, while thick-tailed distributions tend to be modelled by purely descriptive distributions. Models based on the Cauchy distribution have been suggested, but these have not been related to diffusion modelling. However, the distribution of particles dispersed from a point source by a 3-dimensional Brownian motion that incorporates a constant drift, under the condition that the particle starts at a given height and is stopped when it reaches the xy plane (zero height), may be shown to result in both slim-tailed exponentially decreasing...

  13. Igpet software for modeling igneous processes: examples of application using the open educational version

    Science.gov (United States)

    Carr, Michael J.; Gazel, Esteban

    2016-09-01

    We provide here an open version of Igpet software, called t-Igpet to emphasize its application for teaching and research in forward modeling of igneous geochemistry. There are three programs, a norm utility, a petrologic mixing program using least squares and Igpet, a graphics program that includes many forms of numerical modeling. Igpet is a multifaceted tool that provides the following basic capabilities: igneous rock identification using the IUGS (International Union of Geological Sciences) classification and several supplementary diagrams; tectonic discrimination diagrams; pseudo-quaternary projections; least squares fitting of lines, polynomials and hyperbolae; magma mixing using two endmembers, histograms, x-y plots, ternary plots and spider-diagrams. The advanced capabilities of Igpet are multi-element mixing and magma evolution modeling. Mixing models are particularly useful for understanding the isotopic variations in rock suites that evolved by mixing different sources. The important melting models include, batch melting, fractional melting and aggregated fractional melting. Crystallization models include equilibrium and fractional crystallization and AFC (assimilation and fractional crystallization). Theses, reports and proposals concerning igneous petrology are improved by numerical modeling. For reviewed publications some elements of modeling are practically a requirement. Our intention in providing this software is to facilitate improved communication and lower entry barriers to research, especially for students.

  14. Igpet software for modeling igneous processes: examples of application using the open educational version

    Science.gov (United States)

    Carr, Michael J.; Gazel, Esteban

    2017-04-01

    We provide here an open version of Igpet software, called t-Igpet to emphasize its application for teaching and research in forward modeling of igneous geochemistry. There are three programs, a norm utility, a petrologic mixing program using least squares and Igpet, a graphics program that includes many forms of numerical modeling. Igpet is a multifaceted tool that provides the following basic capabilities: igneous rock identification using the IUGS (International Union of Geological Sciences) classification and several supplementary diagrams; tectonic discrimination diagrams; pseudo-quaternary projections; least squares fitting of lines, polynomials and hyperbolae; magma mixing using two endmembers, histograms, x-y plots, ternary plots and spider-diagrams. The advanced capabilities of Igpet are multi-element mixing and magma evolution modeling. Mixing models are particularly useful for understanding the isotopic variations in rock suites that evolved by mixing different sources. The important melting models include, batch melting, fractional melting and aggregated fractional melting. Crystallization models include equilibrium and fractional crystallization and AFC (assimilation and fractional crystallization). Theses, reports and proposals concerning igneous petrology are improved by numerical modeling. For reviewed publications some elements of modeling are practically a requirement. Our intention in providing this software is to facilitate improved communication and lower entry barriers to research, especially for students.

  15. The Space Apps Challenge: Using Open Innovation Competitions to Engage The Public in the Scientific Process

    Science.gov (United States)

    Gupta, S. S.

    2016-12-01

    NASA's Space Apps Challenge encourages innovation, creativity and collaborative problem solving by gathering coders, builders, artists, designers, and storytellers in a 48-hour hackathon. Open innovation competitions such as the Space Apps Challenge bring the scientific world to members of the public, regardless of age, experience, credentials, or expertise. In the past five years, this model of public engagement has been widely employed by government, nonprofit and academic institutions, allowing the building of partnerships between the scientific community and the individuals and communities they serve. Furthermore, advances in technology and challenge models have lowered the barriers and costs to scientific collaboration with and for the public. Structured as a competition seeking solutions from the public to posed problems, NASA's Space Apps Challenge brings together teams and forges collaborations, in a short but high-intensity problem-solving session, between individuals and groups who would otherwise never have worked together. In doing so, Space Apps has created a pathway to public engagement and innovation that is often faster, cheaper, and more impactful than traditional approaches.

  16. Free and open source simulation tools for the design of power processing units for photovoltaic systems

    Directory of Open Access Journals (Sweden)

    Sergio Morales-Hernández

    2015-06-01

    Full Text Available Renewable energy sources, including solar photovoltaic, require electronic circuits that serve as an interface between the transducer device and the device or system that uses the energy. Moreover, the energy efficiency and the cost of the system can be compromised if such an electronic circuit is not designed properly. Given that the electrical characteristics of photovoltaic devices are nonlinear and that the most efficient electronic circuits for power processing are naturally discontinuous, a detailed dynamic analysis to optimize the design is required. This analysis should be supported by computer simulation tools. In this paper a comparison between two software tools for dynamic system simulation is performed to determine their usefulness in the design process of photovoltaic systems, mainly in what corresponds to the power processing units. Using a photovoltaic system for battery charging as a case study, it was determined that the Scicoslab tool was the most suitable.

  17. Welcome to Processes—A New Open Access Journal on Chemical and Biological Process Technology

    Directory of Open Access Journals (Sweden)

    Michael A. Henson

    2012-11-01

    Full Text Available As the result of remarkable technological progress, this past decade has witnessed considerable advances in our ability to manipulate natural and engineered systems, particularly at the molecular level. These advancements offer the potential to revolutionize our world through the development of novel soft and hard materials and the construction of new cellular platforms for chemical and pharmaceutical synthesis. For these technologies to truly impact society, the development of process technology that will enable effective large-scale production is essential. Improved processes are also needed for more established technologies in chemical and biochemical manufacturing, as these industries face ever increasing competitive pressure that mandates continuous improvement. [...

  18. Sunlight inactivation of viruses in open-water unit process treatment wetlands: modeling endogenous and exogenous inactivation rates.

    Science.gov (United States)

    Silverman, Andrea I; Nguyen, Mi T; Schilling, Iris E; Wenk, Jannis; Nelson, Kara L

    2015-03-03

    Sunlight inactivation is an important mode of disinfection for viruses in surface waters. In constructed wetlands, for example, open-water cells can be used to promote sunlight disinfection and remove pathogenic viruses from wastewater. To aid in the design of these systems, we developed predictive models of virus attenuation that account for endogenous and exogenous sunlight-mediated inactivation mechanisms. Inactivation rate models were developed for two viruses, MS2 and poliovirus type 3; laboratory- and field-scale experiments were conducted to evaluate the models' ability to estimate inactivation rates in a pilot-scale, open-water, unit-process wetland cell. Endogenous inactivation rates were modeled using either photoaction spectra or total, incident UVB irradiance. Exogenous inactivation rates were modeled on the basis of virus susceptibilities to singlet oxygen. Results from both laboratory- and field-scale experiments showed good agreement between measured and modeled inactivation rates. The modeling approach presented here can be applied to any sunlit surface water and utilizes easily measured inputs such as depth, solar irradiance, water matrix absorbance, singlet oxygen concentration, and the virus-specific apparent second-order rate constant with singlet oxygen (k2). Interestingly, the MS2 k2 in the open-water wetland was found to be significantly larger than k2 observed in other waters in previous studies. Examples of how the model can be used to design and optimize natural treatment systems for virus inactivation are provided.
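
    The model described above adds an endogenous term, driven by incident UVB irradiance (or an action spectrum), to an exogenous term that is first order in the steady-state singlet oxygen concentration via the virus-specific rate constant k2. The sketch below is a hedged, simplified reading of that structure; the function name and every numeric value are placeholders rather than the paper's fitted parameters.

    def inactivation_rate(uvb_irradiance, k_uvb, singlet_oxygen, k2):
        """First-order virus inactivation rate constant (1/h), endogenous + exogenous.

        uvb_irradiance : incident UVB irradiance (W m-2)
        k_uvb          : endogenous sensitivity to UVB (m2 W-1 h-1), placeholder
        singlet_oxygen : steady-state singlet oxygen concentration (mol L-1)
        k2             : apparent second-order rate constant with singlet oxygen (L mol-1 h-1)
        """
        k_endo = k_uvb * uvb_irradiance       # endogenous, sunlight-driven term
        k_exo = k2 * singlet_oxygen           # exogenous, singlet-oxygen-mediated term
        return k_endo + k_exo

    # Illustrative numbers only (not from the study):
    print(inactivation_rate(uvb_irradiance=1.5, k_uvb=0.2,
                            singlet_oxygen=1e-13, k2=1e12))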

  19. A note on the invariant distribution of a quasi-birth-and-death process

    Energy Technology Data Exchange (ETDEWEB)

    Iglesia, Manuel D de la, E-mail: mdi29@cims.nyu.edu [Courant Institute of Mathematical Sciences, New York University, 251 Mercer Street, New York, NY 10012 (United States)

    2011-04-01

    The aim of this paper is to give an explicit formula for the invariant distribution of a quasi-birth-and-death process in terms of the block entries of the transition probability matrix, using a matrix-valued orthogonal polynomials approach. We show that the invariant distribution can be computed using the squared norms of the corresponding matrix-valued orthogonal polynomials, whether or not these are diagonal matrices. We give an example where the squared norms are not diagonal matrices but the invariant distribution can nevertheless be computed.
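
    The paper's result is an explicit formula via matrix-valued orthogonal polynomials. As a purely numerical point of comparison (not the paper's method), the invariant distribution of a discrete-time QBD can also be approximated by truncating the block-tridiagonal transition matrix and extracting the normalized left eigenvector for eigenvalue 1; the blocks below are toy values chosen only so that the chain is positive recurrent.

```python
import numpy as np

def qbd_transition_matrix(A0, A1, A2, B1, levels):
    """Assemble a truncated QBD transition matrix.

    B1: transitions within level 0; A0: up one level; A1: within a level;
    A2: down one level. All blocks are m x m; `levels` is the truncation level.
    """
    m = A1.shape[0]
    P = np.zeros((levels * m, levels * m))
    for n in range(levels):
        P[n*m:(n+1)*m, n*m:(n+1)*m] = B1 if n == 0 else A1
        if n + 1 < levels:
            P[n*m:(n+1)*m, (n+1)*m:(n+2)*m] = A0
        if n > 0:
            P[n*m:(n+1)*m, (n-1)*m:n*m] = A2
    P[-m:, -m:] += A0            # fold the "up" mass back so the last level stays stochastic
    return P

def stationary_distribution(P):
    vals, vecs = np.linalg.eig(P.T)
    pi = np.abs(np.real(vecs[:, np.argmin(np.abs(vals - 1))]))
    return pi / pi.sum()

# toy 2-phase example (illustrative numbers only)
A0 = 0.2 * np.eye(2)                      # up
A2 = 0.4 * np.eye(2)                      # down
A1 = np.array([[0.1, 0.3], [0.3, 0.1]])   # within level
B1 = A1 + A2                              # boundary level absorbs the "down" mass
pi = stationary_distribution(qbd_transition_matrix(A0, A1, A2, B1, levels=200))
print(pi[:6])
```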

  20. Analysing the distribution of synaptic vesicles using a spatial point process model

    DEFF Research Database (Denmark)

    Khanmohammadi, Mahdieh; Waagepetersen, Rasmus; Nava, Nicoletta

    2014-01-01

    Stress can affect brain functionality in many ways. Because synaptic vesicles play a major role in nervous signal transport at synapses, their distribution relative to the active zone is important when studying neuronal responses. We study the effect of stress on brain functio...... in the two groups. The spatial distributions are modelled using spatial point process models with an inhomogeneous conditional intensity and repulsive pairwise interactions. Our results verify the hypothesis that the two groups have different spatial distributions.

  1. Image Processing Based on the Open Source Computer Vision Library (OpenCV)

    Institute of Scientific and Technical Information of China (English)

    贾小军; 喻擎苍

    2008-01-01

    This paper discusses the advantages of OpenCV (Open Source Computer Vision Library) over existing computer vision software packages, and describes OpenCV's environment configuration, data definitions, and ways of accessing image elements. OpenCV has become an open-source, continuously updated, platform-independent computer vision package containing a rich set of advanced mathematical, image processing, and computer vision functions. Two examples are given to illustrate some of its features.
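
    The article's own examples are not reproduced in this record, and the article targets the original C interface; as a rough present-day analogue, here is a minimal Python sketch of the kind of image access and built-in processing functions OpenCV provides (file names are placeholders).

```python
import cv2

img = cv2.imread("input.jpg")                 # load an image (path is a placeholder)
print(img.shape, img[0, 0])                   # access image elements: shape and one BGR pixel

gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)  # convert to grayscale
edges = cv2.Canny(gray, 100, 200)             # built-in computer vision function: Canny edges

cv2.imwrite("edges.png", edges)               # save the result
```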

  2. On the horizontal distribution of algal-bloom in Chaohu Lake and its formation process

    Science.gov (United States)

    Chen, Yuan-Ying; Liu, Qing-Quan

    2014-10-01

    Based on remote sensing images of algae, the present work analyzes the horizontal distribution characteristics of algal blooms in Chaohu Lake, China, and reveals the frequency of algal blooms under different wind directions. Further, an unstructured-grid, three-dimensional finite-volume coastal ocean model (FVCOM) is applied to investigate the wind-induced currents and the transport process, to explain why algal blooms occur at the detected places. We first deduce the primary distribution of biomass from overlaid satellite images and explain the formation mechanism by analyzing the pollution sources and simulating the flow field and transport process under the prevailing wind over Chaohu Lake. We then consider the adjusting action of the wind on the corresponding day and develop a two-time-scale approach to describe the whole formation process of the horizontal algae distribution in Chaohu Lake. That is, on the longer time scale, i.e., during the bloom season, the prevailing wind determines the primary distribution of biomass by inducing the characteristic flow field; on the shorter time scale, i.e., on the day a bloom occurs, the wind adjusts the primary distribution of biomass to form the final distribution of the algal bloom.

  3. Cumulative distribution functions associated with bubble-nucleation processes in cavitation

    KAUST Repository

    Watanabe, Hiroshi

    2010-11-15

    Bubble-nucleation processes of a Lennard-Jones liquid are studied by molecular dynamics simulations. Waiting time, which is the lifetime of a superheated liquid, is determined for several system sizes, and the apparent finite-size effect of the nucleation rate is observed. From the cumulative distribution function of the nucleation events, the bubble-nucleation process is found to be not a simple Poisson process but a Poisson process with an additional relaxation time. The parameters of the exponential distribution associated with the process are determined by taking the relaxation time into account, and the apparent finite-size effect is removed. These results imply that the use of the arithmetic mean of the waiting time until a bubble grows to the critical size leads to an incorrect estimation of the nucleation rate. © 2010 The American Physical Society.
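
    The abstract's key point, that the waiting-time statistics follow a Poisson process only after an additional relaxation time, suggests estimating the nucleation rate from the full waiting-time distribution rather than from its arithmetic mean. Below is a hedged sketch of that fitting idea on synthetic data, not the authors' simulation output or analysis code.

```python
import numpy as np
from scipy.optimize import curve_fit

def shifted_exponential_cdf(t, rate, t0):
    """P(waiting time <= t) for a Poisson process delayed by a relaxation time t0."""
    return np.where(t > t0, 1.0 - np.exp(-rate * (t - t0)), 0.0)

# synthetic waiting times (true rate = 2.0 per unit time, relaxation time = 0.5)
rng = np.random.default_rng(0)
waiting_times = 0.5 + rng.exponential(scale=1 / 2.0, size=500)

t_sorted = np.sort(waiting_times)
empirical_cdf = np.arange(1, len(t_sorted) + 1) / len(t_sorted)

(rate_fit, t0_fit), _ = curve_fit(shifted_exponential_cdf, t_sorted, empirical_cdf,
                                  p0=[1.0, 0.0])
naive_rate = 1.0 / waiting_times.mean()     # what the arithmetic mean alone would suggest
print(f"fitted rate {rate_fit:.2f}, relaxation time {t0_fit:.2f}, naive rate {naive_rate:.2f}")
```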

  4. A Low Overhead Minimum Process Global Snapshot Collection Algorithm for Mobile Distributed System

    CERN Document Server

    Kumar, Surender; Kumar, Parveen; 10.5121/ijma.2010.2202

    2010-01-01

    Coordinated checkpointing is an effective fault-tolerance technique in distributed systems, as it avoids the domino effect and requires minimal storage. Most earlier coordinated checkpointing algorithms either block their computation during checkpointing and force only a minimum of processes to checkpoint; or are non-blocking but force all nodes to take checkpoints even though many of them may not be necessary; or are non-blocking and minimum-process but take useless checkpoints; or reduce useless checkpoints at the cost of higher synchronization message overhead or long checkpoint request propagation time. Hence, in mobile distributed systems there is a great need to minimize the number of coordination messages and the checkpointing overhead, since mobility raises new issues such as low bandwidth of wireless channels, frequent disconnections, limited battery power and lack of reliable stable storage on mobile nodes. In this paper, we propose a minimum-process coordinated checkpointing algorithm for mobile distributed systems where no useless chec...

  5. Effect of almond processing on levels and distribution of aflatoxins in finished products and byproducts.

    Science.gov (United States)

    Zivoli, Rosanna; Gambacorta, Lucia; Perrone, Giancarlo; Solfrizzo, Michele

    2014-06-18

    The fate of aflatoxins during the processing of contaminated almonds into nougat, pastries, and almond syrup was evaluated by testing the effect of each processing step (blanching, peeling, roasting, caramelization, cooking, and water infusion) on the distribution and levels of aflatoxins. Blanching and peeling did not reduce total aflatoxins, which were distributed between peeled almonds (90-93%) and skins (7-10%). Roasting of peeled almonds reduced aflatoxins by up to 50%. A reduction of up to 70% was observed during preparation and cooking of almond nougat in caramelized sugar. Aflatoxins were substantially stable during preparation and cooking of almond pastries. The whole process of almond syrup preparation produced a marked increase in total aflatoxins (up to 270%), which were distributed between syrup (18-25%) and spent almonds (75-82%). The increase in total aflatoxins was probably due to the activation of almond enzymes during the infusion step, which released free aflatoxins from masked aflatoxins.

  6. Estimation of spatially distributed processes using mobile sensor networks with missing measurements

    Institute of Scientific and Technical Information of China (English)

    江正仙; 崔宝同

    2015-01-01

    This paper investigates the estimation problem for a spatially distributed process described by a partial differential equation with missing measurements. The randomly missing measurements are introduced in order to better reflect the reality in the sensor network. To improve the estimation performance for the spatially distributed process, a network of sensors which are allowed to move within the spatial domain is used. We aim to design the estimator which approximates the distributed process, and the mobile trajectories for the sensors, such that, for all possible missing measurements, the estimation error system is globally asymptotically stable in the mean square sense. By constructing Lyapunov functionals and using inequality analysis, the guidance scheme of every sensor and the convergence of the estimation error system are obtained. Finally, a numerical example is given to verify the effectiveness of the proposed estimator utilizing the proposed guidance scheme for the sensors.
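
    The abstract concerns a PDE-based estimator with mobile sensors and a Lyapunov-based guidance scheme, which is not reconstructed here. As a much simpler illustration of the missing-measurement aspect alone, the sketch below runs a standard Kalman filter on a spatially discretized 1-D diffusion process with fixed point sensors whose readings drop out at random; all dimensions, sensor positions and noise levels are made up.

```python
import numpy as np

rng = np.random.default_rng(1)
N, dt, diffusion = 20, 0.01, 1.0                  # grid points, time step, diffusivity

# explicit finite-difference update matrix for u_t = diffusion * u_xx (decaying boundaries)
A = np.eye(N) + diffusion * dt * (np.diag(np.ones(N-1), 1)
                                  + np.diag(np.ones(N-1), -1) - 2 * np.eye(N))
C = np.zeros((3, N)); C[0, 3] = C[1, 10] = C[2, 16] = 1.0   # three fixed point sensors
Q, R = 1e-4 * np.eye(N), 1e-2 * np.eye(3)

x = np.sin(np.linspace(0, np.pi, N))              # true initial field
x_hat, P = np.zeros(N), np.eye(N)                 # estimator state and covariance

for k in range(500):
    x = A @ x + rng.multivariate_normal(np.zeros(N), Q)
    x_hat, P = A @ x_hat, A @ P @ A.T + Q         # prediction step
    gamma = rng.random(3) < 0.7                   # each sensor delivers with probability 0.7
    if gamma.any():                               # skip the update when everything is missing
        Ck, Rk = C[gamma], R[np.ix_(gamma, gamma)]
        y = Ck @ x + rng.multivariate_normal(np.zeros(int(gamma.sum())), Rk)
        K = P @ Ck.T @ np.linalg.inv(Ck @ P @ Ck.T + Rk)
        x_hat = x_hat + K @ (y - Ck @ x_hat)
        P = (np.eye(N) - K @ Ck) @ P

print("final RMS estimation error:", np.sqrt(np.mean((x - x_hat) ** 2)))
```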

  7. High speed vision processor with reconfigurable processing element array based on full-custom distributed memory

    Science.gov (United States)

    Chen, Zhe; Yang, Jie; Shi, Cong; Qin, Qi; Liu, Liyuan; Wu, Nanjian

    2016-04-01

    In this paper, a hybrid vision processor based on a compact full-custom distributed memory for near-sensor high-speed image processing is proposed. The proposed processor consists of a reconfigurable processing element (PE) array, a row processor (RP) array, and a dual-core microprocessor. The PE array includes two-dimensional processing elements with a compact full-custom distributed memory. It supports real-time reconfiguration between the PE array and the self-organized map (SOM) neural network. The vision processor is fabricated using a 0.18 µm CMOS technology. The circuit area of the distributed memory is markedly reduced, to 1/3 of that of a conventional memory, so that the circuit area of the vision processor is reduced by 44.2%. Experimental results demonstrate that the proposed design achieves correct functions.

  8. Hierarchical charge distribution controls self-assembly process of silk in vitro

    Science.gov (United States)

    Zhang, Yi; Zhang, Cencen; Liu, Lijie; Kaplan, David L.; Zhu, Hesun; Lu, Qiang

    2015-12-01

    Silk materials with different nanostructures have been developed without an understanding of the inherent transformation mechanism. Here we attempt to reveal the conversion pathway between the various nanostructures and determine the critical regulating factors. The regulating conversion processes influenced by a hierarchical charge distribution were investigated, showing different transformations between molecules, nanoparticles and nanofibers. Various repulsive and compressive forces exist among silk fibroin molecules and aggregates due to the exterior and interior distribution of charge, which further controls their aggregating and deaggregating behaviors and finally forms nanofibers of different sizes. Synergistic action derived from molecular mobility and concentration could also tune the assembly process and the final nanostructures. It is suggested that the complicated silk fibroin assembly processes follow the same rule based on charge distribution, offering a promising way to develop silk-based materials with designed nanostructures.

  9. On the Limit Distributions of Continuous-State Branching Processes with Immigration

    CERN Document Server

    Keller-Ressel, Martin

    2011-01-01

    We consider the class of continuous-state branching processes with immigration (CBI-processes), introduced by Kawazu and Watanabe [1971], and give a deterministic characterisation for the convergence of a CBI-process to a limit distribution L, which also turns out to be the stationary distribution of the CBI-process, as time tends to infinity. We give an explicit description of the Levy-Khintchine triplet of L in terms of the characteristic triplets of the Levy subordinator and the spectrally positive Levy process, which arise in the definition of the CBI-process and determine it uniquely. We show that the Levy density of L is given by the generator of the Levy subordinator applied to the scale function of the spectrally positive Levy process. This formula allows us to describe the support of L and characterise the absolute continuity and the asymptotic behavior of the density of L at the boundary of the support. Finally we show that the class of limit distributions of CBI-processes is strictly larger (resp. ...

  10. Opening the Black Box and Searching for Smoking Guns: Process Causality in Qualitative Research

    Science.gov (United States)

    Bennett, Elisabeth E.; McWhorter, Rochell R.

    2016-01-01

    Purpose: The purpose of this paper is to explore the role of qualitative research in causality, with particular emphasis on process causality. In one paper, it is not possible to discuss all the issues of causality, but the aim is to provide useful ways of thinking about causality and qualitative research. Specifically, a brief overview of the…

  11. Using Open-Source Components to Process Interferometric TerraSAR-X Spotlight Data

    Directory of Open Access Journals (Sweden)

    Michael Jendryke

    2013-01-01

    Full Text Available We address the processing of interferometric TerraSAR-X and TanDEM-X spotlight data. The processing steps necessary to derive interferograms at high spatial resolution from bi- and monostatic satellite images are explained. The spotlight imaging mode is a beam-steering technique that focuses the antenna on a specific ground area. This results in a linear Doppler shift frequency in the azimuth direction, which has to be matched to the master image. By shifting the interpolation kernel in azimuth during resampling, the frequency spectrum of the slave image is aligned with that of the master image. We show how to process bistatic TanDEM-X images and propose an integrated processing option for monostatic TerraSAR-X data in the Delft Object-oriented Radar Interferometric Software (DORIS). The paper focuses on the implementation of this algorithm for high-resolution spotlight InSAR in a public domain tool; hence, it becomes available to a larger research community. The results are presented for three test areas: Uluru in Australia, Las Vegas in the USA, and Lüneburg in Germany.
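
    DORIS itself is not reproduced here. As a toy illustration of the final interferogram-forming step once the slave image has been resampled onto the master grid (the azimuth spectral alignment described above happens during that resampling and is omitted), consider the following sketch; the data are synthetic.

```python
import numpy as np

def form_interferogram(master_slc, slave_slc_resampled):
    """Complex interferogram and wrapped phase from two coregistered SLC images."""
    interferogram = master_slc * np.conj(slave_slc_resampled)
    return interferogram, np.angle(interferogram)   # phase wrapped to (-pi, pi]

# toy data: identical speckle with a known phase ramp between the two acquisitions
rng = np.random.default_rng(0)
master = rng.standard_normal((128, 128)) + 1j * rng.standard_normal((128, 128))
slave = master * np.exp(1j * np.linspace(0, 4 * np.pi, 128))[None, :]
_, phase = form_interferogram(master, slave)
print(phase[0, :5])   # recovers the (negated, wrapped) ramp
```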

  13. Koornwinder polynomials and the stationary multi-species asymmetric exclusion process with open boundaries

    Science.gov (United States)

    Cantini, Luigi; Garbali, Alexandr; de Gier, Jan; Wheeler, Michael

    2016-11-01

    We prove that the normalisation of the stationary state of the multi-species asymmetric simple exclusion process (mASEP) is a specialisation of a Koornwinder polynomial. As a corollary we obtain that the normalisation of mASEP factorises as a product over multiple copies of the two-species ASEP.

  14. The SCOAP3 initiative and the Open Access Article-Processing-Charge market: global partnership and competition improve value in the dissemination of science

    CERN Document Server

    Romeu, Clément; Kohls, Alexander; Mansuy, Anne; Mele, Salvatore; Vesper, Martin

    2014-01-01

    The SCOAP3 (Sponsoring Consortium for Open Access Publishing in Particle Physics) initiative is an international partnership to convert to Open Access the published literature in the field of High-Energy Physics (HEP). It has been in operation since January 2014, and covers more than 4’000 articles/year. Originally initiated by CERN, the European Organization for Nuclear Research, and now counting partners representing 41 countries and 3 intergovernmental organizations, SCOAP3 has successfully converted to Open Access all, or part of, 6 HEP journals previously restricted to subscribers. It is also supporting publication of articles in 4 existing Open Access journals. As a “Gold” Open Access initiative, SCOAP3 pays Article Processing Charges (APCs), as publishers’ source of revenue for the publication service. Differentiating itself from other Open Access initiatives, SCOAP3 set APCs through a tendering process, correlating quality and price, at consistent conditions across participating publishers. Th...

  15. The brain as a distributed intelligent processing system: an EEG study.

    Directory of Open Access Journals (Sweden)

    Armando Freitas da Rocha

    Full Text Available BACKGROUND: Various neuroimaging studies, both structural and functional, have provided support for the proposal that a distributed brain network is likely to be the neural basis of intelligence. The theory of Distributed Intelligent Processing Systems (DIPS, first developed in the field of Artificial Intelligence, was proposed to adequately model distributed neural intelligent processing. In addition, the neural efficiency hypothesis suggests that individuals with higher intelligence display more focused cortical activation during cognitive performance, resulting in lower total brain activation when compared with individuals who have lower intelligence. This may be understood as a property of the DIPS. METHODOLOGY AND PRINCIPAL FINDINGS: In our study, a new EEG brain mapping technique, based on the neural efficiency hypothesis and the notion of the brain as a Distributed Intelligence Processing System, was used to investigate the correlations between IQ evaluated with WAIS (Wechsler Adult Intelligence Scale and WISC (Wechsler Intelligence Scale for Children, and the brain activity associated with visual and verbal processing, in order to test the validity of a distributed neural basis for intelligence. CONCLUSION: The present results support these claims and the neural efficiency hypothesis.

  16. Open Knowledge Maps: Creating a Visual Interface to the World’s Scientific Knowledge Based on Natural Language Processing

    Directory of Open Access Journals (Sweden)

    Peter Kraker

    2016-11-01

    Full Text Available The goal of Open Knowledge Maps is to create a visual interface to the world’s scientific knowledge. The base for this visual interface consists of so-called knowledge maps, which enable the exploration of existing knowledge and the discovery of new knowledge. Our open source knowledge mapping software applies a mixture of summarization techniques and similarity measures on article metadata, which are iteratively chained together. After processing, the representation is saved in a database for use in a web visualization. In the future, we want to create a space for collective knowledge mapping that brings together individuals and communities involved in exploration and discovery. We want to enable people to guide each other in their discovery by collaboratively annotating and modifying the automatically created maps.
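
    The Open Knowledge Maps codebase is not shown in this record. As a generic sketch of the overall idea, combining a text-similarity measure over article metadata with clustering and a 2-D layout, one might write something like the following; the titles, library choices and pipeline are illustrative assumptions, not the project's actual implementation.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
from sklearn.cluster import KMeans
from sklearn.manifold import MDS

metadata = [
    "open access publishing in high energy physics",
    "article processing charges and open access journals",
    "virus inactivation in treatment wetlands",
    "sunlight disinfection of wastewater",
    "distributed computing framework for astronomy",
    "astronomical data processing pipelines",
]

tfidf = TfidfVectorizer(stop_words="english").fit_transform(metadata)
similarity = cosine_similarity(tfidf)                              # pairwise article similarity
dissimilarity = (1.0 - similarity).clip(min=0.0)
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dissimilarity)          # simple 2-D map layout
clusters = KMeans(n_clusters=3, n_init=10,
                  random_state=0).fit_predict(tfidf.toarray())     # thematic areas

for title, c, (x, y) in zip(metadata, clusters, coords):
    print(f"cluster {c}  ({x:+.2f}, {y:+.2f})  {title}")
```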

  17. Stability analysis of the sliding process of the west slope in Buzhaoba Open-Pit Mine

    Institute of Scientific and Technical Information of China (English)

    Ning Fang; Ji Changsheng; Garmondyu E. Crusoe Jr

    2016-01-01

    To study the stability of the west slope in Buzhaoba Open-Pit Mine and determine the aging stability coefficient during slide mass development, the deformation band of the west slope and the slide mass structure of the 34,600 profile are obtained on the basis of hydrology, geology, and monitoring data. The residual thrust method is utilized to calculate the stability coefficients, which are 1.225 and 1.00 under sound and transfixion conditions, respectively. According to the rock damage and fragmentation and the principle of mechanical parameter degradation, mechanical models of the slide mass development of the hard and soft rock slopes are established. An integrated model for calculating the slope stability coefficient is built considering water, vibration, and other external factors that pertain to the structural plane damage mechanism and the generating mechanism of the sliding mass. The change curve of the stability coefficient during slide mass development is obtained from the relevant analyses, and stability control measures are then proposed. The analysis results indicate that in the cracking stage of the hard rock, the slope stability coefficient decreases linearly with the increase in the length Lb of the hard rock crack zone, and the slope of this linear relationship is positively correlated with the rock cohesion c. In the transfixion stage of the soft rock, the rate of decrease of the stability coefficient is positively correlated with the residual strength of the soft rock. When the slope is stable, the stability coefficient is in a quadratic-linear relationship with the decreased height Dh of the side slope and in a linear relationship with the anchoring force P.
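
    The residual thrust method named above is a standard slice-based procedure. A textbook-style sketch of it is given below, with the stability coefficient found by bisection on the condition that no unbalanced thrust leaves the toe slice; the slice geometry, weights and strength parameters are invented for illustration and are not the Buzhaoba data, and the formulation may differ in detail from the integrated model developed in the paper.

```python
import numpy as np

def residual_thrust(fs, W, alpha, c, phi, L):
    """Thrust transmitted out of the last (toe) slice for a trial stability coefficient fs.

    Slices are ordered from crest to toe; alpha and phi are in radians, W in kN/m,
    c in kPa, and L (slip-surface length of each slice) in m.
    """
    P = 0.0
    for i in range(len(W)):
        driving = W[i] * np.sin(alpha[i])
        resisting = (W[i] * np.cos(alpha[i]) * np.tan(phi[i]) + c[i] * L[i]) / fs
        if i > 0:
            # transfer coefficient carrying the thrust of the previous slice
            psi = (np.cos(alpha[i-1] - alpha[i])
                   - np.sin(alpha[i-1] - alpha[i]) * np.tan(phi[i]) / fs)
            P *= psi
        P = max(P + driving - resisting, 0.0)   # the inter-slice thrust cannot be tensile
    return P

def stability_coefficient(W, alpha, c, phi, L, lo=0.5, hi=3.0):
    """Largest fs for which no unbalanced thrust leaves the toe slice (bisection)."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if residual_thrust(mid, W, alpha, c, phi, L) > 0.0:
            hi = mid        # the slope cannot mobilise enough strength at this fs
        else:
            lo = mid
    return 0.5 * (lo + hi)

# invented three-slice example, not the Buzhaoba cross-section
W = np.array([800.0, 1200.0, 600.0])        # slice weights (kN/m)
alpha = np.radians([35.0, 25.0, 10.0])      # slip-surface inclinations
c = np.array([20.0, 20.0, 20.0])            # cohesion (kPa)
phi = np.radians([18.0, 18.0, 18.0])        # friction angles
L = np.array([6.0, 8.0, 7.0])               # slip-surface lengths (m)
print(f"stability coefficient ~ {stability_coefficient(W, alpha, c, phi, L):.2f}")
```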

  18. Simulation of business processes of processing and distribution of orders in transportation

    Directory of Open Access Journals (Sweden)

    Ольга Ігорівна Проніна

    2017-06-01

    Full Text Available An analysis of modern passenger transportation in Ukraine shows that, as the urban population grows, the need to develop passenger traffic and to improve the quality of transport services grows as well. The paper examines three existing models of private passenger transportation (taxi): a model with a dispatching service, a model without a dispatching service, and a mixed model. An algorithm for receiving an order, processing it, and executing it according to the given model is considered. Several arrangement schemes that characterize the operation of the system are also shown. The interaction between the client placing an order and the driver who receives and executes it is represented, with the server acting as the connecting link between customer and driver and regulating the system as a whole. The business process of private passenger transportation without a dispatching service was simulated. Based on the simulation results, it is proposed to supplement the model of private transportation with an advice system and to improve the car selection algorithm. The advice system provides an optimal choice of car, taking many factors into account, and makes it possible to use more efficiently the specific additional services provided by drivers. Optimizing the order-handling process makes it possible to increase driver capacity and thus driver profits. The weak points of passenger transportation without a dispatching service were also identified. Application of the system will improve the transport structure under modern conditions and improve transportation based on modern operating systems.

  19. Optimal boundary control of a tracking problem for a parabolic distributed system with open-loop control using evolutionary algorithms

    Directory of Open Access Journals (Sweden)

    Russel J Stonier

    2003-08-01

    Full Text Available In this paper we examine the application of evolutionary algorithms to find open-loop control solutions of the optimal control problem arising from the semidiscretisation of a linear parabolic tracking problem with boundary control. The solution is compared with the solutions obtained by methods based upon the variational equations of the Minimum Principle and the finite element method.
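
    The abstract describes searching for an open-loop boundary control of a semidiscretised parabolic tracking problem with an evolutionary algorithm. The sketch below illustrates that idea on a small 1-D heat equation with a simple (mu + lambda) evolution strategy; the discretisation, cost weights and strategy parameters are all illustrative choices, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, dt = 20, 120, 1e-3                        # space points, time steps, time step size
dx = 1.0 / (N - 1)
r = dt / dx**2                                  # <= 0.5, so the explicit scheme is stable

def simulate(u_boundary):
    """Explicit finite-difference solution; u_boundary[k] is the right-boundary value at step k."""
    y = np.zeros(N)
    for k in range(M):
        y_new = y.copy()
        y_new[1:-1] = y[1:-1] + r * (y[2:] - 2 * y[1:-1] + y[:-2])
        y_new[0] = 0.0                          # fixed left boundary
        y_new[-1] = u_boundary[k]               # controlled right boundary
        y = y_new
    return y

target = np.linspace(0.0, 1.0, N)               # desired final profile to track

def cost(u_boundary):
    y_final = simulate(u_boundary)
    return np.sum((y_final - target) ** 2) + 1e-3 * np.sum(u_boundary ** 2)

# (mu + lambda) evolution strategy over the discretised open-loop control sequence
mu, lam, sigma = 10, 40, 0.3
population = [rng.normal(0.0, 1.0, M) for _ in range(mu)]
for generation in range(200):
    offspring = [p + sigma * rng.standard_normal(M)
                 for p in population for _ in range(lam // mu)]
    pool = population + offspring
    pool.sort(key=cost)                         # keep the mu best controls
    population = pool[:mu]
    sigma *= 0.99                               # slowly shrink the mutation step
print(f"best tracking cost: {cost(population[0]):.4f}")
```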

  20. "Industry views--timing/structure of consultative process"

    Energy Technology Data Exchange (ETDEWEB)

    Volgelsberg, T.

    1995-12-31

    This paper examines industry's perspective on the issues concerning the politics and economics of climate change. The complexity of the climate change issue goes beyond science and involves technology, economics, lifestyle, population, intergenerational equity, etc. Industry resources should be actively involved in technology, timeframes, economic assessments, and the political process. Climate mitigation options should be viewed on a holistic or total-impact basis. Technology and economic assessment should not create winners and losers. The climate change process is like peeling an onion: long timeframes are required for cultural and infrastructure changes, there are both short-term small improvements and long-term structural changes, and implementation may take generations. Interdisciplinary communication is critical, cutting across the fields of social science, physical science, economics, technology, demographics, etc. Finally, industry must play or be played: it can either help shape policy or be left to live with it.