WorldWideScience

Sample records for open distributed processing

  1. Building enterprise systems with ODP an introduction to open distributed processing

    CERN Document Server

    Linington, Peter F; Tanaka, Akira; Vallecillo, Antonio

    2011-01-01

    The Reference Model of Open Distributed Processing (RM-ODP) is an international standard that provides a solid basis for describing and building widely distributed systems and applications in a systematic way. It stresses the need to build these systems with evolution in mind by identifying the concerns of major stakeholders and then expressing the design as a series of linked viewpoints. Although RM-ODP has been a standard for more than ten years, many practitioners are still unaware of it. Building Enterprise Systems with ODP: An Introduction to Open Distributed Processing offers a gentle pa

  2. Innovation as a distributed, collaborative process of knowledge generation: open, networked innovation

    NARCIS (Netherlands)

    Sloep, Peter

    2009-01-01

    Sloep, P. B. (2009). Innovation as a distributed, collaborative process of knowledge generation: open, networked innovation. In V. Hornung-Prähauser & M. Luckmann (Eds.), Kreativität und Innovationskompetenz im digitalen Netz - Creativity and Innovation Competencies in the Web, Sammlung von

  3. Energy Consumption in the Process of Excavator-Automobile Complexes Distribution at Kuzbass Open Pit Mines

    Directory of Open Access Journals (Sweden)

    Panachev Ivan

    2017-01-01

    Full Text Available Every year, coal mining companies worldwide seek to maintain the pace of mining machine fleet renewal, and various activities are implemented to extend the service life of mining equipment already in operation. In this regard, the urgent issue is the efficient distribution of the available machines across different geological conditions. The problem of effectively distributing “excavator-automobile” complexes arises when heavy dump trucks are used in mining, since excavation and transportation of blasted rock mass are the most labor-intensive and costly processes, considering the volume of transported overburden and coal, as well as the costs of diesel fuel, electricity, fuel and lubricants, consumables for repair works and downtime, etc. Currently, it is recommended to take the number of loading buckets in the range of 3 to 5, according to which the dump trucks are distributed to faces.

  4. Further improvement in ABWR (part-4) open distributed plant process computer system

    International Nuclear Information System (INIS)

    Makino, Shigenori; Hatori, Yoshinori

    1999-01-01

    In the nuclear industry of Japan, the electric power companies have promoted plant process computer (PPC) technology for nuclear power plants (NPPs). When the PPC was first introduced to NPPs, large-scale customized computers were applied because of very tight requirements such as high reliability and high-speed processing. In the recent computer field, the large market has driven remarkable progress in engineering workstation (EWS) and personal computer (PC) technology. Moreover, because data transmission technology has been progressing at the same time, worldwide computer networks have been established. Thanks to the progress of both technologies, distributed computer systems can now be built at a reasonable price, so Tokyo Electric Power Company (TEPCO) is trying to apply them to the PPC of NPPs. (author)

  5. apART: system for the acquisition, processing, archiving, and retrieval of digital images in an open, distributed imaging environment

    Science.gov (United States)

    Schneider, Uwe; Strack, Ruediger

    1992-04-01

    apART reflects the structure of an open, distributed environment. According to the general trend in the area of imaging, network-capable, general-purpose workstations with capabilities of open-system image communication and image input are used. Several heterogeneous components like CCD cameras, slide scanners, and image archives can be accessed. The system is driven by an object-oriented user interface where devices (image sources and destinations), operators (derived from a commercial image processing library), and images (of different data types) are managed and presented uniformly to the user. Browsing mechanisms are used to traverse devices, operators, and images. An audit trail mechanism is offered to record interactive operations on low-resolution image derivatives. These operations are processed off-line on the original image. Thus, the processing of extremely high-resolution raster images is possible, and the performance of resolution-dependent operations is enhanced significantly during interaction. An object-oriented database system (APRIL), which can be browsed, is integrated into the system. Attribute retrieval is supported by the user interface. Other essential features of the system include: implementation on top of the X Window System (X11R4) and the OSF/Motif widget set; a SUN4 general-purpose workstation, including Ethernet, magneto-optical disc, etc., as the hardware platform for the user interface; complete graphical-interactive parametrization of all operators; support of different image interchange formats (GIF, TIFF, IIF, etc.); consideration of current IPI standard activities within ISO/IEC for further refinement and extensions.

  6. Inter-particle gap distribution and spectral rigidity of the totally asymmetric simple exclusion process with open boundaries

    International Nuclear Information System (INIS)

    Krbalek, Milan; Hrabak, Pavel

    2011-01-01

    We consider the one-dimensional totally asymmetric simple exclusion process (TASEP model) with open boundary conditions and present the analytical computations leading to the exact formula for distance clearance distribution, i.e. probability density for a clear distance between subsequent particles of the model. The general relation is rapidly simplified for the middle part of the one-dimensional lattice. Both the analytical formulas and their approximations are compared with the numerical representation of the TASEP model. Such a comparison is presented for particles occurring in the internal part as well as in the boundary part of the lattice. Furthermore, we introduce the pertinent estimation for the so-called spectral rigidity of the model. The results obtained are sequentially discussed within the scope of vehicular traffic theory.
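
    The gap statistics described above can also be illustrated numerically. The following is a rough Monte Carlo sketch only (not the authors' analytical computation); the lattice size and the entry/exit rates alpha and beta are illustrative choices:

    import random

    # Open-boundary TASEP with random-sequential updates: particles enter at the
    # left with probability alpha, hop right into empty sites, and leave at the
    # right with probability beta. After a warm-up, we histogram the clear
    # distances (gaps) between subsequent particles in the middle of the lattice.
    def tasep_gap_histogram(L=200, alpha=0.3, beta=0.3, sweeps=6000, warmup=1000, seed=1):
        rng = random.Random(seed)
        lattice = [0] * L
        gaps = {}
        for sweep in range(sweeps):
            for _ in range(L + 1):
                i = rng.randrange(-1, L)              # -1 stands for the entry move
                if i == -1:
                    if lattice[0] == 0 and rng.random() < alpha:
                        lattice[0] = 1
                elif i == L - 1:
                    if lattice[i] == 1 and rng.random() < beta:
                        lattice[i] = 0
                elif lattice[i] == 1 and lattice[i + 1] == 0:
                    lattice[i], lattice[i + 1] = 0, 1
            if sweep >= warmup:
                occupied = [k for k in range(L // 3, 2 * L // 3) if lattice[k]]
                for a, b in zip(occupied, occupied[1:]):
                    gaps[b - a - 1] = gaps.get(b - a - 1, 0) + 1
        total = sum(gaps.values())
        return {g: n / total for g, n in sorted(gaps.items())}

    if __name__ == "__main__":
        for gap, prob in list(tasep_gap_histogram().items())[:8]:
            print(f"gap {gap}: {prob:.4f}")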

  7. Momentum distributions: opening remarks

    International Nuclear Information System (INIS)

    Weigold, E.

    1982-01-01

    The problem of the hydrogen atom has played a central role in the development of quantum mechanics, beginning with Bohr's daring speculations. It was also the first problem tackled by Schroedinger with his new wave mechanics and similarly it was used by Heisenberg in his first papers as a prime example of the success of quantum mechanics. It has always played a central role in the teaching of quantum physics and has served as a most important heuristic tool, shaping our intuition and inspiring many expositions. The Schroedinger equation for the hydrogen atom is usually solved in the position representation, the solution to the equation being the wave functions $\psi_{nlm}(\mathbf{r})$. If Schroedinger's equation is solved in the momentum representation instead of the coordinate representation, the absolute square of the corresponding momentum-state wave function $\phi_{nlm}(\mathbf{p})$ would give the momentum probability distribution of the electron in the state defined by the quantum numbers n, l and m. Three different types of collisions which can take place in the (e,2e) reaction on atomic hydrogen, which is a three-body problem, are discussed
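
    For concreteness, the ground state (n=1, l=m=0) is a standard textbook example of the momentum representation described above. In atomic units ($\hbar = m_e = a_0 = 1$),

    $$\phi_{100}(\mathbf{p}) = \frac{2\sqrt{2}}{\pi}\,\frac{1}{(1+p^{2})^{2}}, \qquad \rho(p) = 4\pi p^{2}\,|\phi_{100}(\mathbf{p})|^{2} = \frac{32}{\pi}\,\frac{p^{2}}{(1+p^{2})^{4}},$$

    where $\rho(p)$ integrates to unity over $0 \le p < \infty$; this is the momentum probability distribution probed directly in (e,2e) experiments on atomic hydrogen.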

  8. Satellite Cloud and Radiative Property Processing and Distribution System on the NASA Langley ASDC OpenStack and OpenShift Cloud Platform

    Science.gov (United States)

    Nguyen, L.; Chee, T.; Palikonda, R.; Smith, W. L., Jr.; Bedka, K. M.; Spangenberg, D.; Vakhnin, A.; Lutz, N. E.; Walter, J.; Kusterer, J.

    2017-12-01

    Cloud Computing offers new opportunities for large-scale scientific data producers to utilize Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) IT resources to process and deliver data products in an operational environment where timely delivery, reliability, and availability are critical. The NASA Langley Research Center Atmospheric Science Data Center (ASDC) is building and testing a private and public facing cloud for users in the Science Directorate to utilize as an everyday production environment. The NASA SatCORPS (Satellite ClOud and Radiation Property Retrieval System) team processes and derives near real-time (NRT) global cloud products from operational geostationary (GEO) satellite imager datasets. To deliver these products, we will utilize the public facing cloud and OpenShift to deploy a load-balanced webserver for data storage, access, and dissemination. The OpenStack private cloud will host data ingest and computational capabilities for SatCORPS processing. This paper will discuss the SatCORPS migration towards, and usage of, the ASDC Cloud Services in an operational environment. Detailed lessons learned from use of prior cloud providers, specifically the Amazon Web Services (AWS) GovCloud and the Government Cloud administered by the Langley Managed Cloud Environment (LMCE) will also be discussed.

  9. 75 FR 14076 - Express Mail Open and Distribute and Priority Mail Open and Distribute Changes and Updates

    Science.gov (United States)

    2010-03-24

    ... tray boxes to address Open and Distribute customers' concerns on the security of their mail in a letter tray during processing. The current tray box sizes were a result of customer feedback. The decision to... processing of Open and Distribute containers. In response to customer needs, the Postal Service is...

  10. Leadership in Open and Distributed Innovation

    DEFF Research Database (Denmark)

    Haefliger, Stefan; Poetz, Marion

    demands and shorter product life cycles have triggered new forms of creation and innovative practices (von Hippel and von Krogh, 2003; Baden-Fuller and Haefliger, 2013). These new forms can be characterized by being more open, distributed, collaborative, and democratized than traditional models...... in networks of innovators, such as platform businesses (Alexy et al., 2009; Gawer and Cusumano, 2008; Füller et al., 2016). However, one aspect that has so far received little attention, both in research and in business practice is the potentially conflicting role of traditional forms of leadership in open...... innovation systems, processes and projects. Traditional approaches to leadership in innovation processes highlight the role of individual managers who lead and evaluate firm-internal team members, champion innovation projects within the organization and act as translators between various firm...

  11. Open-ocean convection process: A driver of the winter nutrient supply and the spring phytoplankton distribution in the Northwestern Mediterranean Sea

    Science.gov (United States)

    Severin, Tatiana; Kessouri, Faycal; Rembauville, Mathieu; Sánchez-Pérez, Elvia Denisse; Oriol, Louise; Caparros, Jocelyne; Pujo-Pay, Mireille; Ghiglione, Jean-François; D'Ortenzio, Fabrizio; Taillandier, Vincent; Mayot, Nicolas; Durrieu De Madron, Xavier; Ulses, Caroline; Estournel, Claude; Conan, Pascal

    2017-06-01

    This study was a part of the DeWEX project (Deep Water formation Experiment), designed to better understand the impact of dense water formation on the marine biogeochemical cycles. Here, nutrient and phytoplankton vertical and horizontal distributions were investigated during a deep open-ocean convection event and during the following spring bloom in the Northwestern Mediterranean Sea (NWM). In February 2013, the deep convection event established a surface nutrient gradient from the center of the deep convection patch to the surrounding mixed and stratified areas. In the center of the convection area, a slight but significant difference of nitrate, phosphate and silicate concentrations was observed, possibly due to the different volume of deep waters included in the mixing or to the sediment resuspension occurring where the mixing reached the bottom. One of these processes, or a combination of both, enriched the water column in silicate and phosphate, and significantly altered the stoichiometry in the center of the deep convection area. This alteration favored the local development of microphytoplankton in spring, while nanophytoplankton dominated neighboring locations where the convection reached the deep layer but not the bottom. This study shows that the convection process influences both the winter nutrient distribution and the spring phytoplankton distribution and community structure. Modifications of the convection's spatial scale and intensity (i.e., convective mixing depth) are likely to have strong consequences on phytoplankton community structure and distribution in the NWM, and thus on the marine food web. Plain Language Summary: The deep open-ocean convection in the Northwestern Mediterranean Sea is an important process for the formation and the circulation of the deep waters of the entire Mediterranean Sea, but also for the local spring phytoplankton bloom. In this study, we showed that variations of the convective mixing depth induced different supply in nitrate

  12. Risk analysis: opening the process

    International Nuclear Information System (INIS)

    Hubert, Ph.; Mays, C.

    1998-01-01

    This conference on risk analysis took place in Paris, 11-14 October 1999. Over 200 papers were presented in the following seven sessions: perception; environment and health; persuasive risks; objects and products; personal and collective involvement; assessment and valuation; management. A rational approach to risk analysis has been developed in the last three decades. Techniques for risk assessment have been thoroughly enhanced, risk management approaches have been developed, decision making processes have been clarified, and the social dimensions of risk perception and management have been investigated. Nevertheless, this construction is being challenged by recent events which reveal how deficits in stakeholder involvement, openness and democratic procedures can undermine risk management actions. Indeed, most components of the global risk analysis process may be radically called into question. Food safety has lately been a prominent issue, but now debates appear, or old debates are revisited, in the domains of public health, consumer products safety, waste management, environmental risks, nuclear installations, automobile safety and pollution. To meet the growing pressures for efficiency, openness, accountability, and multi-partner communication in risk analysis, institutional changes are underway in many European countries. However, the need for stakeholders to develop better insight into the process may lead to an evolution of all the components of risk analysis, even in its most 'technical' steps. For stakeholders of different professional backgrounds, political projects, and responsibilities, risk identification procedures must be rendered understandable, quantitative risk assessment must be intelligible and accommodated in action proposals, ranging from countermeasures to educational programs to insurance mechanisms. Management formats must be open to local and political input and other types of operational feedback. (authors)

  13. Process evaluation distributed system

    Science.gov (United States)

    Moffatt, Christopher L. (Inventor)

    2006-01-01

    The distributed system includes a database server, an administration module, a process evaluation module, and a data display module. The administration module is in communication with the database server for providing observation criteria information to the database server. The process evaluation module is in communication with the database server for obtaining the observation criteria information from the database server and collecting process data based on the observation criteria information. The process evaluation module utilizes a personal digital assistant (PDA). A data display module in communication with the database server, including a website for viewing collected process data in a desired metrics form, the data display module also for providing desired editing and modification of the collected process data. The connectivity established by the database server to the administration module, the process evaluation module, and the data display module, minimizes the requirement for manual input of the collected process data.
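
    As a purely hypothetical, minimal sketch of the data flow described in this record (the actual system, table names and fields are not specified here), the three modules can be modeled as clients of one shared database:

    import sqlite3

    # Hypothetical sketch: the administration module stores observation criteria,
    # the process evaluation module (e.g. running on a PDA) reads them and writes
    # collected process data, and the data display module queries the results.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE criteria (id INTEGER PRIMARY KEY, description TEXT)")
    db.execute("CREATE TABLE observations (criterion_id INTEGER, value REAL, note TEXT)")

    # Administration module: provide observation criteria to the database server.
    db.execute("INSERT INTO criteria (description) VALUES (?)", ("Step duration (s)",))

    # Process evaluation module: fetch criteria and collect process data against them.
    (criterion_id, description), = db.execute("SELECT id, description FROM criteria")
    db.execute("INSERT INTO observations VALUES (?, ?, ?)", (criterion_id, 12.7, "observed on PDA"))

    # Data display module: view collected process data in a simple metrics form.
    query = ("SELECT c.description, AVG(o.value), COUNT(*) FROM observations o "
             "JOIN criteria c ON c.id = o.criterion_id GROUP BY c.id")
    for description, mean_value, n in db.execute(query):
        print(f"{description}: mean={mean_value:.1f} over {n} observation(s)")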

  14. Opening up the Innovation Process

    DEFF Research Database (Denmark)

    Vujovic, Sladjana; Ulhøi, John Parm

    2008-01-01

    An organization's ability to create, retrieve, and use knowledge to innovate is a critical strategic asset. Until recently, most textbooks on business and product development argued that managers should keep their new ideas to themselves and protect knowledge from getting into competitors' hands....... Seeking, developing, and protecting knowledge is a costly endeavour. Moreover, apart from being expensive, the process of turning new knowledge into useful and well-protected innovations often slows the speed of development and increases costs. In this chapter, alternative strategies for innovation......, in which sharing and co-operation play a critical part, are discussed. In particular, we address the involvement of users in opening up the innovation process which, in turn, offers participating actors some useful strategies for product development. Four archetypal strategies are identified and classified...

  15. Distributed genetic process mining

    NARCIS (Netherlands)

    Bratosin, C.C.; Sidorova, N.; Aalst, van der W.M.P.

    2010-01-01

    Process mining aims at discovering process models from data logs in order to offer insight into the real use of information systems. Most of the existing process mining algorithms fail to discover complex constructs or have problems dealing with noise and infrequent behavior. The genetic process

  16. Opening up the innovation process: archetypal strategies

    DEFF Research Database (Denmark)

    Vujovic, Sladjana; Ulhøi, John Parm

    2005-01-01

    sharing and co-operation play a critical part. The paper addresses the involvement of users in opening up the innovation process, which in turn gives the participating actors an interesting alternative for product development. We identify and classify four archetypal strategies for opening up...

  17. HARVESTING, INTEGRATING AND DISTRIBUTING LARGE OPEN GEOSPATIAL DATASETS USING FREE AND OPEN-SOURCE SOFTWARE

    Directory of Open Access Journals (Sweden)

    R. Oliveira

    2016-06-01

    Full Text Available Federal, State and Local government agencies in the USA are investing heavily in the dissemination of Open Data sets produced by each of them. The main driver behind this thrust is to increase agencies’ transparency and accountability, as well as to improve citizens’ awareness. However, not all Open Data sets are easy to access and integrate with other Open Data sets available, even from the same agency. The City and County of Denver Open Data Portal distributes several types of geospatial datasets; one of them is the city parcels information containing 224,256 records. Although this data layer contains many pieces of information, it is incomplete for some custom purposes. Open-source software was used to first collect data from diverse City of Denver Open Data sets, then upload them to a repository in the Cloud, where they were processed using a PostgreSQL installation on the Cloud and Python scripts. Our method was able to extract non-spatial information from a ‘not-ready-to-download’ source that could then be combined with the initial data set to enhance its potential use.
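
    A hedged sketch of the kind of pipeline described above (collect an open data set, load it into a cloud-hosted PostgreSQL instance, then combine it with other layers); the URL, connection string, table and column names are placeholders, not the actual Denver Open Data endpoints:

    import csv
    import io
    import urllib.request

    import psycopg2  # assumes a reachable PostgreSQL instance, e.g. on a cloud VM

    CSV_URL = "https://example.org/denver/parcels_extra.csv"              # placeholder
    DSN = "host=cloud-db dbname=opendata user=etl password=change-me"     # placeholder

    def harvest_and_load():
        # 1. Collect: download a 'not-ready-to-download' attribute table as CSV.
        with urllib.request.urlopen(CSV_URL) as resp:
            rows = list(csv.DictReader(io.TextIOWrapper(resp, encoding="utf-8")))

        # 2. Load: push the non-spatial attributes into PostgreSQL so they can later
        #    be joined with the parcel layer (placeholder schema).
        with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
            cur.execute("CREATE TABLE IF NOT EXISTS parcel_extra "
                        "(parcel_id TEXT PRIMARY KEY, land_use TEXT)")
            cur.executemany(
                "INSERT INTO parcel_extra VALUES (%s, %s) ON CONFLICT DO NOTHING",
                [(r["parcel_id"], r["land_use"]) for r in rows])

    if __name__ == "__main__":
        harvest_and_load()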

  18. Safe Distribution of Declarative Processes

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao; Slaats, Tijs

    2011-01-01

    of projections that covers a DCR Graph that the network of synchronously communicating DCR Graphs given by the projections is bisimilar to the original global process graph. We exemplify the distribution technique on a process identified in a case study of an cross-organizational case management system carried...... process model generalizing labelled prime event structures to a systems model able to finitely represent ω-regular languages. An operational semantics given as a transition semantics between markings of the graph allows DCR Graphs to be conveniently used as both specification and execution model....... The technique for distribution is based on a new general notion of projection of DCR Graphs relative to a subset of labels and events identifying the set of external events that must be communicated from the other processes in the network in order for the distribution to be safe.We prove that for any vector...

  19. Current distribution in a plasma erosion opening switch

    International Nuclear Information System (INIS)

    Weber, B.V.; Commisso, R.J.; Meger, R.A.; Neri, J.M.; Oliphant, W.F.; Ottinger, P.F.

    1984-01-01

    The current distribution in a plasma erosion opening switch is determined from magnetic field probe data. During the closed state of the switch the current channel broadens rapidly. The width of the current channel is consistent with a bipolar current density limit imposed by the ion flux to the cathode. The effective resistivity of the current channel is anomalously large. Current is diverted to the load when a gap opens near the cathode side of the switch. The observed gap opening can be explained by erosion of the plasma. Magnetic pressure is insufficient to open the gap

  20. Current distribution in a plasma erosion opening switch

    International Nuclear Information System (INIS)

    Weber, B.V.; Commisso, R.J.; Meger, R.A.; Neri, J.M.; Oliphant, W.F.; Ottinger, P.F.

    1985-01-01

    The current distribution in a plasma erosion opening switch is determined from magnetic field probe data. During the closed state of the switch the current channel broadens rapidly. The width of the current channel is consistent with a bipolar current density limit imposed by the ion flux to the cathode. The effective resistivity of the current channel is anomalously large. Current is diverted to the load when a gap opens near the cathode side of the switch. The observed gap opening can be explained by erosion of the plasma. Magnetic pressure is insufficient to open the gap

  1. Opening up the Agile Innovation Process

    Science.gov (United States)

    Conboy, Kieran; Donnellan, Brian; Morgan, Lorraine; Wang, Xiaofeng

    The objective of this panel is to discuss how firms can operate both an open and agile innovation process. In an era of unprecedented changes, companies need to be open and agile in order to adapt rapidly and maximize their innovation processes. Proponents of agile methods claim that one of the main distinctions between agile methods and their traditional bureaucratic counterparts is their drive toward creativity and innovation. However, agile methods are rarely adopted in their textbook, "vanilla" format, and are usually adopted in part or are tailored or modified to suit the organization. While we are aware that this happens, there is still limited understanding of what is actually happening in practice. Using innovation adoption theory, this panel will discuss the issues and challenges surrounding the successful adoption of agile practices. In addition, this panel will report on the obstacles and benefits reported by over 20 industrial partners engaged in a pan-European research project into agile practices between 2006 and 2009.

  2. A Transparent Runtime Data Distribution Engine for OpenMP

    Directory of Open Access Journals (Sweden)

    Dimitrios S. Nikolopoulos

    2000-01-01

    Full Text Available This paper makes two important contributions. First, the paper investigates the performance implications of data placement in OpenMP programs running on modern NUMA multiprocessors. Data locality and minimization of the rate of remote memory accesses are critical for sustaining high performance on these systems. We show that due to the low remote-to-local memory access latency ratio of contemporary NUMA architectures, reasonably balanced page placement schemes, such as round-robin or random distribution, incur modest performance losses. Second, the paper presents a transparent, user-level page migration engine with an ability to gain back any performance loss that stems from suboptimal placement of pages in iterative OpenMP programs. The main body of the paper describes how our OpenMP runtime environment uses page migration for implementing implicit data distribution and redistribution schemes without programmer intervention. Our experimental results verify the effectiveness of the proposed framework and provide a proof of concept that it is not necessary to introduce data distribution directives in OpenMP and warrant the simplicity or the portability of the programming model.

  3. INNOVATION PROCESS IN OPEN CAPITAL BRAZILIAN COMPANIES

    Directory of Open Access Journals (Sweden)

    Ricardo Floriani

    2013-12-01

    Full Text Available This study aims to identify the innovation process used by open capital Brazilian companies and to establish a ranking of the potentially innovative ones. For this, a questionnaire was sent to 484 companies with shares traded on Bovespa, receiving responses from 22. The innovation process is based on the model of Barrett and Sexton (2006). A summary of the results is presented below. (i) Organizational capabilities – 95.5% answered that they have incentives for innovation activities and 68.2% reported having procedures for all services. The leadership has a facilitator role, encouraging initiative (86.4%) and promoting the maintenance of the group relationship (72.7%). They value risk taking, even through failures, and prioritize learning and experimenting with new ideas. (ii) Background of the innovation – reveals aspects of the capacity (internal or external). Of the respondents, 59.1% developed continuous internal R&D activities. Training to innovate is present on a continuous or occasional basis in 81.8% of the companies. The respondents characterize the economic environment as dynamic, and the majority purchased software and equipment. In only 12 cases was there a reference to obtaining patents as an innovation protection measure. (iii) Focus of innovation – the majority of the companies mentioned process or product innovation. Rewards are offered when the objectives are met and attention is drawn when this does not occur. (iv) Highlighted performance – the innovations achieved the expectations and created effects. The relevant benefits noticed were: improvement in the quality of goods and services, increase of market share, increase of goods and services, and increase of productive capacity.

  4. Distributed Processing of SETI Data

    Science.gov (United States)

    Korpela, Eric

    As you have read in prior chapters, researchers have been performing progressively more sensitive SETI searches since 1960. Each search has been limited by the technologies available at the time. As radio frequency technologies have become more efficient and computers have become faster, the searches have increased in capacity and become more sensitive. Often, the hardware that performs the calculations required to process the telescope data and expose any embedded signals is what limits the sensitivity of the search. Shortly before the start of the 21st century, projects began to appear that exploited the processing capabilities of computers connected to the Internet in order to solve problems that required a large amount of computing power. The SETI@home project, managed by myself and a group of researchers at the Space Sciences Laboratory of the University of California, Berkeley, was the first attempt to use large-scale distributed computing to solve the problems of performing a sensitive search for narrow-band radio signals from extraterrestrial civilizations (Korpela et al., 2001). A follow-on project, Astropulse, searches for extraterrestrial signals with wider bandwidths and shorter time durations. Both projects are ongoing at the present time (mid-2010).

  5. Understanding flexible and distributed software development processes

    OpenAIRE

    Agerfalk, Par J.; Fitzgerald, Brian

    2006-01-01

    peer-reviewed The minitrack on Flexible and Distributed Software Development Processes addresses two important and partially intertwined current themes in software development: process flexibility and globally distributed software development

  6. Terror in the Board Room: The Bid-Opening Process

    Science.gov (United States)

    Shoop, James

    2009-01-01

    Competitive bids and the bid-opening process are the cornerstones of public school purchasing. The bid-opening process does not begin on the day of the bid opening. It begins with good planning by the purchasing agent to ensure that the advertised bid complies with the public school contracts law. In New Jersey, that raises the following…

  7. OPEN INNOVATION PROCESSES IN SOCIAL MEDIA PLATFORMS

    OpenAIRE

    Yang, Yang

    2013-01-01

    Innovation power has become a priority concern for many enterprises. Open innovation, which acts as a new innovation method, is now applied in many companies due to its unique advantages. On the other hand, social media platforms have been widely accepted by the public and offer immense business resources. Based on these facts, there must be room to link social media and open innovation together to achieve a win-win. The objective was to research the important factors for op...

  8. Palm distributions for log Gaussian Cox processes

    DEFF Research Database (Denmark)

    Coeurjolly, Jean-Francois; Møller, Jesper; Waagepetersen, Rasmus Plenge

    2017-01-01

    This paper establishes a remarkable result regarding Palm distributions for a log Gaussian Cox process: the reduced Palm distribution for a log Gaussian Cox process is itself a log Gaussian Cox process that only differs from the original log Gaussian Cox process in the intensity function. This new...... result is used to study functional summaries for log Gaussian Cox processes....

  9. Article Processing Charges and OpenAPC

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    The publication landscape is about to change. While it was largely operated by subscription-based journals in the past, recent political decisions are forcing the publishing industry towards OpenAccess. Especially, the publication of the Finch report in 2012 put APC-based Gold OpenAccess models almost everywhere on the agenda. These models also require quite some adaptations of library workflows to handle payments, bills and centralized funds for publication fees. While sometimes handled in specialized systems (e.g. the first setups in Jülich), discussions started pretty early on about handling APCs in local repositories which would also hold the OpenAccess content resulting from these fees; e.g. the University of Regensburg uses ePrints for this purpose. Backed up by the OpenData movement, libraries also saw an opportunity to exchange data about fees paid. Thus, OpenAPC.de was born in 2014 on GitHub to facilitate this exchange and aggregate large amounts of data for evaluation and comparison. Using the repository to hold payment d...
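
    As an illustration of the kind of evaluation the aggregated data enables, a small sketch that sums reported fees per publisher from a locally downloaded copy of an OpenAPC-style CSV; the file name and the 'publisher' and 'euro' column names are assumptions about the layout, not verified here:

    import csv
    from collections import defaultdict

    # Sketch only: aggregate article processing charges per publisher from a local
    # copy of an OpenAPC-style cost data file. Adjust column names as needed.
    def apc_per_publisher(path="apc_de.csv"):
        totals = defaultdict(float)
        with open(path, newline="", encoding="utf-8") as fh:
            for row in csv.DictReader(fh):
                try:
                    totals[row["publisher"]] += float(row["euro"])
                except (KeyError, ValueError):
                    continue  # skip rows without a usable fee
        return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

    if __name__ == "__main__":
        for publisher, euros in apc_per_publisher()[:10]:
            print(f"{publisher}: {euros:,.0f} EUR")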

  10. General distributions in process algebra

    NARCIS (Netherlands)

    Katoen, Joost P.; d' Argenio, P.R.; Brinksma, Hendrik; Hermanns, H.; Katoen, Joost P.

    2001-01-01

    This paper is an informal tutorial on stochastic process algebras, i.e., process calculi where action occurrences may be subject to a delay that is governed by a (mostly continuous) random variable. Whereas most stochastic process algebras consider delays determined by negative exponential

  11. Distribution and behavior of transuranics in the open ocean

    International Nuclear Information System (INIS)

    Nakamura, Kiyoshi

    1996-01-01

    The major source of ²³⁹,²⁴⁰Pu in the open ocean is the global fallout from the atmospheric weapon testing. In 1987-1989, the latitudinal distribution of ²³⁹,²⁴⁰Pu in the surface water showed the pattern reflecting a global deposition in both the Pacific and the Atlantic Ocean. The feature of the ²³⁹,²⁴⁰Pu vertical distribution is that a subsurface maximum exists at depths from 200 m to 1000 m over a large area of the ocean. Inventories of ²³⁹,²⁴⁰Pu in the water column are often higher than those expected from global fallout and close-in fallout is suggested to be the additional inventory in the Pacific Ocean. The ²³⁹,²⁴⁰Pu inventory in sediment has values in the range of 2 to 27% of that in sea water. The use of a multiple corer is suggested for sediment sampling. (author)

  12. An Open Distributed Architecture for Sensor Networks for Risk Management

    Directory of Open Access Journals (Sweden)

    Ralf Denzer

    2008-03-01

    Full Text Available Sensors provide some of the basic input data for risk management of natural and man-made hazards. Here the word ‘sensors’ covers everything from remote sensing satellites, providing invaluable images of large regions, through instruments installed on the Earth’s surface to instruments situated in deep boreholes and on the sea floor, providing highly detailed point-based information from single sites. Data from such sensors is used in all stages of risk management, from hazard, vulnerability and risk assessment in the pre-event phase, information to provide on-site help during the crisis phase, through to data to aid in recovery following an event. Because data from sensors play such an important part in improving understanding of the causes of risk and consequently in its mitigation, considerable investment has been made in the construction and maintenance of highly sophisticated sensor networks. In spite of the ubiquitous need for information from sensor networks, the use of such data is hampered in many ways. Firstly, information about the presence and capabilities of sensor networks operating in a region is difficult to obtain due to a lack of easily available and usable meta-information. Secondly, once sensor networks have been identified, their data is often difficult to access due to a lack of interoperability between dissemination and acquisition systems. Thirdly, the transfer and processing of information from sensors is limited, again by incompatibilities between systems. Therefore, the current situation leads to a lack of efficiency and limited use of the available data that has an important role to play in risk mitigation. In view of this situation, the European Commission (EC) is funding a number of Integrated Projects within the Sixth Framework Programme concerned with improving the accessibility of data and services for risk management. Two of these projects: ‘Open Architecture and Spatial Data

  13. Description of Supply Openings in Numerical Models for Room Air Distribution

    DEFF Research Database (Denmark)

    Nielsen, Peter V.

    This paper discusses various possibilities for describing supply openings in numerical models of room air distribution.......This paper discusses various possibilities for describing supply openings in numerical models of room air distribution....

  14. Web-Based Distributed XML Query Processing

    NARCIS (Netherlands)

    Smiljanic, M.; Feng, L.; Jonker, Willem; Blanken, Henk; Grabs, T.; Schek, H-J.; Schenkel, R.; Weikum, G.

    2003-01-01

    Web-based distributed XML query processing has gained in importance in recent years due to the widespread popularity of XML on the Web. Unlike centralized and tightly coupled distributed systems, Web-based distributed database systems are highly unpredictable and uncontrollable, with a rather

  15. Palm distributions for log Gaussian Cox processes

    DEFF Research Database (Denmark)

    Coeurjolly, Jean-Francois; Møller, Jesper; Waagepetersen, Rasmus

    This paper reviews useful results related to Palm distributions of spatial point processes and provides a new result regarding the characterization of Palm distributions for the class of log Gaussian Cox processes. This result is used to study functional summary statistics for a log Gaussian Cox...

  16. On Distributed Port-Hamiltonian Process Systems

    NARCIS (Netherlands)

    Lopezlena, Ricardo; Scherpen, Jacquelien M.A.

    2004-01-01

    In this paper we use the term distributed port-Hamiltonian Process Systems (DPHPS) to refer to the result of merging the theory of distributed Port-Hamiltonian systems (DPHS) with the theory of process systems (PS). Such concept is useful for combining the systematic interconnection of PHS with the

  17. Parallel and Distributed Data Processing Using Autonomous ...

    African Journals Online (AJOL)

    Looking at the distributed nature of these networks, data is processed by remote login or Remote Procedure Calls (RPC), which causes congestion in the network bandwidth. This paper proposes a framework where software agents are assigned duties to process the distributed data concurrently and assemble the ...

  18. An open source platform for multi-scale spatially distributed simulations of microbial ecosystems

    Energy Technology Data Exchange (ETDEWEB)

    Segre, Daniel [Boston Univ., MA (United States)

    2014-08-14

    The goal of this project was to develop a tool for facilitating simulation, validation and discovery of multiscale dynamical processes in microbial ecosystems. This led to the development of an open-source software platform for Computation Of Microbial Ecosystems in Time and Space (COMETS). COMETS performs spatially distributed time-dependent flux balance based simulations of microbial metabolism. Our plan involved building the software platform itself, calibrating and testing it through comparison with experimental data, and integrating simulations and experiments to address important open questions on the evolution and dynamics of cross-feeding interactions between microbial species.

  19. KeyWare: an open wireless distributed computing environment

    Science.gov (United States)

    Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir

    1995-12-01

    Deployment of distributed applications in the wireless domain lack equivalent tools, methodologies, architectures, and network management that exist in LAN based applications. A wireless distributed computing environment (KeyWareTM) based on intelligent agents within a multiple client multiple server scheme was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput and transmission costs. A unified network management paradigm for both wireless and wireline facilitates seamless extensions of LAN- based management tools to include wireless nodes. A set of object oriented tools and methodologies enables direct asynchronous invocation of agent-based services supplemented by tool-sets matched to supported KeyWare paradigms. The open architecture embodiment of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems and infrastructures while maintaining application portability.

  20. Design Principles for Improving the Process of Publishing Open data

    NARCIS (Netherlands)

    Zuiderwijk, A.M.G.; Janssen, M.F.W.H.A.; Choenni, R.; Meijer, R.F.

    2014-01-01

    · Purpose: Governments create large amounts of data. However, the publication of open data is often cumbersome and there are no standard procedures and processes for opening data. This blocks the easy publication of government data. The purpose of this paper is to derive design principles for

  1. Open Data, Open Specifications and Free and Open Source Software: A powerful mix to create distributed Web-based water information systems

    Science.gov (United States)

    Arias, Carolina; Brovelli, Maria Antonia; Moreno, Rafael

    2015-04-01

    We are in an age when water resources are increasingly scarce and the impacts of human activities on them are ubiquitous. These problems don't respect administrative or political boundaries and they must be addressed by integrating information from multiple sources at multiple spatial and temporal scales. Communication, coordination and data sharing are critical for addressing the water conservation and management issues of the 21st century. However, different countries, provinces, local authorities and agencies dealing with water resources have diverse organizational, socio-cultural, economic, environmental and information technology (IT) contexts that raise challenges to the creation of information systems capable of integrating and distributing information across their areas of responsibility in an efficient and timely manner. Tight and disparate financial resources, and dissimilar IT infrastructures (data, hardware, software and personnel expertise) further complicate the creation of these systems. There is a pressing need for distributed interoperable water information systems that are user friendly, easily accessible and capable of managing and sharing large volumes of spatial and non-spatial data. In a distributed system, data and processes are created and maintained in different locations, each with competitive advantages to carry out specific activities. Open Data (data that can be freely distributed) is available in the water domain, and it should be further promoted across countries and organizations. Compliance with Open Specifications for data collection, storage and distribution is the first step toward the creation of systems that are capable of interacting and exchanging data in a seamless (interoperable) way. The features of Free and Open Source Software (FOSS) offer low access costs that facilitate scalability and long-term viability of information systems. The World Wide Web (the Web) will be the platform of choice to deploy and access these systems

  2. Post-Processing Resolution Enhancement of Open Skies Photographic Imagery

    National Research Council Canada - National Science Library

    Sperl, Daniel

    2000-01-01

    ...), which manages implementation of the Open Skies Treaty for the US Air Force, wants to determine if post-processing of the photographic images can improve spatial resolution beyond 30 cm, and if so...

  3. OpenLMD, multimodal monitoring and control of LMD processing

    Science.gov (United States)

    Rodríguez-Araújo, Jorge; García-Díaz, Antón

    2017-02-01

    This paper presents OpenLMD, a novel open-source solution for on-line multimodal monitoring of Laser Metal Deposition (LMD). The solution is also applicable to a wider range of laser-based applications that require on-line control (e.g. laser welding). OpenLMD is a middleware that enables the orchestration and virtualization of a LMD robot cell, using several open-source frameworks (e.g. ROS, OpenCV, PCL). The solution also allows reconfiguration by easy integration of multiple sensors and processing equipment. As a result, OpenLMD delivers significant advantages over existing monitoring and control approaches, such as improved scalability, and multimodal monitoring and data sharing capabilities.

  4. Experiments to Distribute Map Generalization Processes

    Science.gov (United States)

    Berli, Justin; Touya, Guillaume; Lokhat, Imran; Regnauld, Nicolas

    2018-05-01

    Automatic map generalization requires the use of computationally intensive processes often unable to deal with large datasets. Distributing the generalization process is the only way to make them scalable and usable in practice. But map generalization is a highly contextual process: the surroundings of a generalized map feature need to be known to generalize the feature, which is a problem as distribution might partition the dataset and parallelize the processing of each part. This paper proposes experiments to evaluate the past propositions to distribute map generalization, and to identify the main remaining issues. The past propositions to distribute map generalization are first discussed, and then the experiment hypotheses and apparatus are described. The experiments confirmed that regular partitioning was the quickest strategy, but also the least effective in taking context into account. The geographical partitioning, though less effective for now, is quite promising regarding the quality of the results as it better integrates the geographical context.
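
    A minimal sketch of the regular-partitioning strategy evaluated above: features are assigned to grid cells for parallel workers, and features falling within a buffer of a neighbouring cell are also shipped to that cell so the worker keeps some geographical context (cell size and buffer distance are illustrative):

    from collections import defaultdict

    # Sketch: regular partitioning of point-like map features for distributed
    # generalization. "own" features belong to the cell; "context" features lie
    # within buffer_dist of the cell and are copied there for context only.
    def partition(features, cell_size=1000.0, buffer_dist=100.0):
        cells = defaultdict(lambda: {"own": [], "context": []})
        for fid, (x, y) in features.items():
            cx, cy = int(x // cell_size), int(y // cell_size)
            cells[(cx, cy)]["own"].append(fid)
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    if (dx, dy) == (0, 0):
                        continue
                    nx, ny = cx + dx, cy + dy
                    # distance from the feature to the neighbour cell's edges
                    near_x = min(abs(x - nx * cell_size), abs(x - (nx + 1) * cell_size))
                    near_y = min(abs(y - ny * cell_size), abs(y - (ny + 1) * cell_size))
                    if (dx == 0 or near_x <= buffer_dist) and (dy == 0 or near_y <= buffer_dist):
                        cells[(nx, ny)]["context"].append(fid)
        return cells

    if __name__ == "__main__":
        feats = {"a": (950.0, 120.0), "b": (1020.0, 130.0), "c": (400.0, 2500.0)}
        for cell, content in sorted(partition(feats).items()):
            print(cell, content)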

  5. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    Science.gov (United States)

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  6. Assumptions of Customer Knowledge Enablement in the Open Innovation Process

    Directory of Open Access Journals (Sweden)

    Jokubauskienė Raminta

    2017-08-01

    Full Text Available In the scientific literature, open innovation is one of the most effective means to innovate and gain a competitive advantage. In practice, there is a variety of open innovation activities, but, nevertheless, customers stand as the cornerstone in this area, since customers’ knowledge is one of the most important sources of new knowledge and ideas. When evaluating the context in which open innovation and customer knowledge enablement interact, it is necessary to take into account the importance of customer knowledge management. It is increasingly highlighted that customers’ knowledge management facilitates the creation of innovations. However, other factors that influence open innovation and, at the same time, customers’ knowledge management should also be examined. This article presents a theoretical model, which reveals the assumptions of the open innovation process and its impact on the firm’s performance.

  7. Process of adoption communication openness in adoptive families: adopters’ perspective

    Directory of Open Access Journals (Sweden)

    Maria Acciaiuoli Barbosa-Ducharne

    2016-01-01

    Full Text Available Abstract Communication about adoption is a family interaction process which is more than the simple exchange of information. Adoption communication can be characterized in terms of the level of openness of family conversations regarding the child’s past and the degree of the family’s adoption social disclosure. The objective of this study is to explore the process of adoption communication openness in Portuguese adoptive families by identifying the impact of variables related to the adoption process, the adoptive parenting and the adoptee. One hundred twenty five parents of children aged 3 to 15, who were adopted on average 4 years ago, participated in this study. Data was collected during home visits using the Parents Adoption Process Interview. A cluster analysis identified three different groups of families according to the level of adoption communication openness within the family and outside. The findings also showed that the process of the adoption communication openness started when parents decided to adopt, developed in parent-child interaction and was susceptible to change under professional intervention. The relevance of training given to prospective adopters and of professional practice based on scientific evidence is highlighted.

  8. Open Source Web Based Geospatial Processing with OMAR

    Directory of Open Access Journals (Sweden)

    Mark Lucas

    2009-01-01

    Full Text Available The availability of geospatial data sets is exploding. New satellites, aerial platforms, video feeds, global positioning system tagged digital photos, and traditional GIS information are dramatically increasing across the globe. These raw materials need to be dynamically processed, combined and correlated to generate value added information products to answer a wide range of questions. This article provides an overview of OMAR web based geospatial processing. OMAR is part of the Open Source Software Image Map project under the Open Source Geospatial Foundation. The primary contributors of OSSIM make their livings by providing professional services to US Government agencies and programs. OMAR provides one example that open source software solutions are increasingly being deployed in US government agencies. We will also summarize the capabilities of OMAR and its plans for near term development.

  9. Open Distribution of Virtual Containers as a Key Framework for Open Educational Resources and STEAM Subjects

    Science.gov (United States)

    Corbi, Alberto; Burgos, Daniel

    2017-01-01

    This paper presents how virtual containers enhance the implementation of STEAM (science, technology, engineering, arts, and math) subjects as Open Educational Resources (OER). The publication initially summarizes the limitations of delivering open rich learning contents and corresponding assignments to students in college level STEAM areas. The…

  10. Analysis of multi-stage open shop processing systems

    NARCIS (Netherlands)

    Eggermont, C.E.J.; Schrijver, A.; Woeginger, G.J.; Schwentick, T.; Dürr, C.

    2011-01-01

    We study algorithmic problems in multi-stage open shop processing systems that are centered around reachability and deadlock detection questions. We characterize safe and unsafe system states. We show that it is easy to recognize system states that can be reached from the initial state (where the

  11. 78 FR 79298 - Securities Exempted; Distribution of Shares by Registered Open-End Management Investment Company...

    Science.gov (United States)

    2013-12-30

    ...] Securities Exempted; Distribution of Shares by Registered Open- End Management Investment Company...) 551-6792, Investment Company Rulemaking Office, Division of Investment Management, U.S. Securities and... Distribution of shares by registered open-end management investment company. * * * * * (g) If a plan covers...

  12. Distributed Services with OpenAFS For Enterprise and Education

    CERN Document Server

    Milicchio, Franco

    2007-01-01

    Shows in detail how to build enterprise-level secure, redundant, and highly scalable services from scratch on top of the open source Linux operating system, suitable for small companies as well as big universities. This book presents the core architecture, based on Kerberos, LDAP, AFS, and Samba.

  13. Transient flow analysis of integrated valve opening process

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Xinming; Qin, Benke; Bo, Hanliang, E-mail: bohl@tsinghua.edu.cn; Xu, Xingxing

    2017-03-15

    Highlights: • The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology and the integrated valve (IV) is the key control component. • A transient flow experiment induced by the IV is conducted and the test results are analyzed to reveal its working mechanism. • A theoretical model of the IV opening process is established and applied to obtain how the transient flow characteristic parameters vary.
    Abstract: The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology and the IV is the key control component. The working principle of the integrated valve (IV) is analyzed and an IV hydraulic experiment is conducted. A transient flow phenomenon occurs during the valve opening process. The theoretical model of the IV opening process is established from the loop system control equations and boundary conditions. The valve opening boundary condition equation is established based on the IV three-dimensional flow field analysis results and the dynamic analysis of the valve core movement. The model calculation results are in good agreement with the experimental results. On this basis, the model is used to analyze the transient flow under high-temperature conditions. The peak pressure head is consistent with the one under room temperature and the pressure fluctuation period is longer than the one under room temperature. Furthermore, the variation of the pressure transients with the fluid and loop structure parameters is analyzed. The peak pressure increases with the flow rate and decreases with an increase of the valve opening time. The pressure fluctuation period increases with the loop pipe length, and the fluctuation amplitude remains largely unchanged under different equilibrium pressure conditions. The research results lay the basis for the vibration reduction analysis of the CRHDS.
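
    The qualitative trends reported here (a peak pressure that grows with flow rate and shrinks for slower valve action, and a fluctuation period that grows with pipe length) are consistent with the classical water-hammer estimates, quoted below only as a rough orientation and not as the authors' model:

    $$\Delta p \approx \rho\, c\, \Delta v \quad \text{(Joukowsky surge, near-instantaneous valve action)}, \qquad T \approx \frac{4L}{c},$$

    where $\rho$ is the fluid density, $c$ the pressure-wave speed in the pipe, $\Delta v$ the change in flow velocity and $L$ the pipe length; for valve action slower than $2L/c$ the pressure peak is correspondingly reduced.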

  14. Fouling distribution in forward osmosis membrane process.

    Science.gov (United States)

    Lee, Junseok; Kim, Bongchul; Hong, Seungkwan

    2014-06-01

    Fouling behavior along the length of membrane module was systematically investigated by performing simple modeling and lab-scale experiments of forward osmosis (FO) membrane process. The flux distribution model developed in this study showed a good agreement with experimental results, validating the robustness of the model. This model demonstrated, as expected, that the permeate flux decreased along the membrane channel due to decreasing osmotic pressure differential across the FO membrane. A series of fouling experiments were conducted under the draw and feed solutions at various recoveries simulated by the model. The simulated fouling experiments revealed that higher organic (alginate) fouling and thus more flux decline were observed at the last section of a membrane channel, as foulants in feed solution became more concentrated. Furthermore, the water flux in FO process declined more severely as the recovery increased due to more foulants transported to membrane surface with elevated solute concentrations at higher recovery, which created favorable solution environments for organic adsorption. The fouling reversibility also decreased at the last section of the membrane channel, suggesting that fouling distribution on FO membrane along the module should be carefully examined to improve overall cleaning efficiency. Lastly, it was found that such fouling distribution observed with co-current flow operation became less pronounced in counter-current flow operation of FO membrane process. Copyright © 2014 The Research Centre for Eco-Environmental Sciences, Chinese Academy of Sciences. Published by Elsevier B.V. All rights reserved.
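
    A minimal numerical sketch of a flux-distribution model of this kind: the module is split into segments, the local water flux is taken proportional to the osmotic pressure difference, and the draw dilution and feed concentration are updated by mass balance along a co-current channel. The membrane parameters and inlet conditions below are illustrative, not the paper's values, and concentration polarization is ignored:

    # 1-D flux distribution along a co-current forward osmosis module (sketch).
    A = 1.0e-12                    # water permeability, m/(s*Pa)   (illustrative)
    VANT_HOFF = 2 * 8.314 * 298.0  # i*R*T, Pa per (mol/m^3), NaCl-like solute
    AREA = 0.5                     # membrane area per segment, m^2
    N = 20                         # number of segments along the channel

    q_feed, c_feed = 1.0e-4, 10.0    # feed flow (m^3/s) and concentration (mol/m^3)
    q_draw, c_draw = 1.0e-4, 600.0   # draw flow (m^3/s) and concentration (mol/m^3)

    fluxes = []
    for k in range(N):
        jw = A * VANT_HOFF * (c_draw - c_feed)   # local water flux, m/s
        fluxes.append(jw)
        dq = jw * AREA                           # permeate picked up in segment k
        # mass balance: solutes stay on their side, water moves feed -> draw
        c_feed = c_feed * q_feed / (q_feed - dq)
        c_draw = c_draw * q_draw / (q_draw + dq)
        q_feed -= dq
        q_draw += dq

    print("flux at inlet  : %.2e m/s" % fluxes[0])
    print("flux at outlet : %.2e m/s" % fluxes[-1])
    print("water recovery : %.1f %%" % (100 * (1 - q_feed / 1.0e-4)))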

  15. An Open-Rotor Distributed Propulsion Aircraft Study

    OpenAIRE

    Gibbs, Jonathan; Bachmann, Arne; Seyfang, George; Peebles, Patrick; May, Chris; Saracoğlu, Bayındır; Paniagua, Guillermo

    2016-01-01

    The EU-funded SOAR project analyzed the high-lift efficiency of an open-fan wing design by systematic variation of fan blade count and angle. The research project built a cross-flow fan propelled wing section and investigated it by means of fluid dynamic simulation and wind tunnel testing. The experimental data resulting from the wind tunnel model were used to generate non-dimensional parameters which were used to scale data for the full-scale SOAR wing section. Preliminary aircraft ...

  16. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    NARCIS (Netherlands)

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is

  17. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    OpenAIRE

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    1999-01-01

    A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is reviewed.

  18. Distributed Aerodynamic Sensing and Processing Toolbox

    Science.gov (United States)

    Brenner, Martin; Jutte, Christine; Mangalam, Arun

    2011-01-01

    A Distributed Aerodynamic Sensing and Processing (DASP) toolbox was designed and fabricated for flight test applications with an Aerostructures Test Wing (ATW) mounted under the fuselage of an F-15B on the Flight Test Fixture (FTF). DASP monitors and processes the aerodynamics with the structural dynamics using nonintrusive, surface-mounted, hot-film sensing. This aerodynamic measurement tool benefits programs devoted to static/dynamic load alleviation, body freedom flutter suppression, buffet control, improvement of aerodynamic efficiency through cruise control, supersonic wave drag reduction through shock control, etc. This DASP toolbox measures local and global unsteady aerodynamic load distribution with distributed sensing. It determines correlation between aerodynamic observables (aero forces) and structural dynamics, and allows control authority increase through aeroelastic shaping and active flow control. It offers improvements in flutter suppression and, in particular, body freedom flutter suppression, as well as aerodynamic performance of wings for increased range/endurance of manned/ unmanned flight vehicles. Other improvements include inlet performance with closed-loop active flow control, and development and validation of advanced analytical and computational tools for unsteady aerodynamics.

  19. Distributed data processing for public health surveillance

    Directory of Open Access Journals (Sweden)

    Yih Katherine

    2006-09-01

    Full Text Available Abstract Background Many systems for routine public health surveillance rely on centralized collection of potentially identifiable, individual personal health information (PHI) records. Although individual, identifiable patient records are essential for conditions for which there is mandated reporting, such as tuberculosis or sexually transmitted diseases, they are not routinely required for effective syndromic surveillance. Public concern about the routine collection of large quantities of PHI to support non-traditional public health functions may make alternative surveillance methods that do not rely on centralized identifiable PHI databases increasingly desirable. Methods The National Bioterrorism Syndromic Surveillance Demonstration Program (NDP) is an example of one alternative model. All PHI in this system is initially processed within the secured infrastructure of the health care provider that collects and holds the data, using uniform software distributed and supported by the NDP. Only highly aggregated count data is transferred to the datacenter for statistical processing and display. Results Detailed, patient level information is readily available to the health care provider to elucidate signals observed in the aggregated data, or for ad hoc queries. We briefly describe the benefits and disadvantages associated with this distributed processing model for routine automated syndromic surveillance. Conclusion For well-defined surveillance requirements, the model can be successfully deployed with very low risk of inadvertent disclosure of PHI – a feature that may make participation in surveillance systems more feasible for organizations and more appealing to the individuals whose PHI they hold. It is possible to design and implement distributed systems to support non-routine public health needs if required.
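
    The distributed model described above keeps patient-level records inside each provider's infrastructure and ships only aggregate counts to the datacenter. The Python toy below sketches that division of labour; the record fields, syndrome categories, and function names are hypothetical and are not taken from the NDP software.

        # Toy illustration of the distributed model: each provider aggregates its
        # own records locally and only syndrome counts (no PHI) leave the site.
        from collections import Counter
        from datetime import date

        def local_daily_counts(records, day):
            """Runs inside the provider's infrastructure; records never leave the site."""
            return Counter(r["syndrome"] for r in records if r["visit_date"] == day)

        def merge_site_counts(per_site_counts):
            """Runs at the datacenter on aggregate counts only."""
            total = Counter()
            for counts in per_site_counts:
                total.update(counts)
            return total

        site_a = [{"visit_date": date(2006, 9, 1), "syndrome": "respiratory"},
                  {"visit_date": date(2006, 9, 1), "syndrome": "gastrointestinal"}]
        site_b = [{"visit_date": date(2006, 9, 1), "syndrome": "respiratory"}]

        aggregate = merge_site_counts([local_daily_counts(site_a, date(2006, 9, 1)),
                                       local_daily_counts(site_b, date(2006, 9, 1))])
        print(aggregate)  # Counter({'respiratory': 2, 'gastrointestinal': 1})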

  20. Learning from the History of Distributed Query Processing

    DEFF Research Database (Denmark)

    Betz, Heiko; Gropengießer, Francis; Hose, Katja

    2012-01-01

    The vision of the Semantic Web has triggered the development of various new applications and opened up new directions in research. Recently, much effort has been put into the development of techniques for query processing over Linked Data. Being based upon techniques originally developed for distributed and federated databases, some of them inherit the same or similar problems. Thus, the goal of this paper is to point out pitfalls that the previous generation of researchers has already encountered and to introduce the Linked Data as a Service as an idea that has the potential to solve the problem in some scenarios. Hence, this paper discusses nine theses about Linked Data processing and sketches a research agenda for future endeavors in the area of Linked Data processing.

  1. Distribution of Selected Trace Elements in the Bayer Process

    Directory of Open Access Journals (Sweden)

    Johannes Vind

    2018-05-01

    Full Text Available The aim of this work was to achieve an understanding of the distribution of selected bauxite trace elements (gallium (Ga), vanadium (V), arsenic (As), chromium (Cr), rare earth elements (REEs), and scandium (Sc)) in the Bayer process. The assessment was designed as a case study in an alumina plant in operation to provide an overview of the trace elements behaviour in an actual industrial setup. A combination of analytical techniques was used, mainly inductively coupled plasma mass spectrometry and optical emission spectroscopy as well as instrumental neutron activation analysis. It was found that Ga, V and As as well as, to a minor extent, Cr are principally accumulated in Bayer process liquors. In addition, Ga is also fractionated to alumina at the end of the Bayer processing cycle. The rest of these elements pass to bauxite residue. REEs and Sc have the tendency to remain practically unaffected in the solid phases of the Bayer process and, therefore, at least 98% of their mass is transferred to bauxite residue. The interest in such a study originates from the fact that many of these trace constituents of bauxite ore could potentially become valuable by-products of the Bayer process; therefore, the understanding of their behaviour needs to be expanded. In fact, Ga and V are already by-products of the Bayer process, but their distribution patterns have not been provided in the existing open literature.

  2. A radial distribution function-based open boundary force model for multi-centered molecules

    KAUST Repository

    Neumann, Philipp; Eckhardt, Wolfgang; Bungartz, Hans-Joachim

    2014-01-01

    We derive an expression for radial distribution function (RDF)-based open boundary forcing for molecules with multiple interaction sites. Due to the high-dimensionality of the molecule configuration space and missing rotational invariance, a

  3. An open software system based on X Windows for process control and equipment monitoring

    International Nuclear Information System (INIS)

    Aimar, A.; Carlier, E.; Mertens, V.

    1992-01-01

    The construction and application of a configurable open software system for process control and equipment monitoring can speed up and simplify the development and maintenance of equipment specific software as compared to individual solutions. The present paper reports the status of such an approach for the distributed control systems of SPS and LEP beam transfer components, based on X Windows and the OSF/Motif tool kit and applying data modeling and software engineering methods. (author)

  4. Inelastically scattering particles and wealth distribution in an open economy

    Czech Academy of Sciences Publication Activity Database

    Slanina, František

    2004-01-01

    Roč. 69, č. 4 (2004), 046102/1-046102/7 ISSN 1539-3755 R&D Projects: GA ČR GA202/01/1091 Institutional research plan: CEZ:AV0Z1010914 Keywords : stochastic processes * econophysics Subject RIV: BE - Theoretical Physics Impact factor: 2.352, year: 2004

  5. An Effective Framework for Distributed Geospatial Query Processing in Grids

    Directory of Open Access Journals (Sweden)

    CHEN, B.

    2010-08-01

    Full Text Available The emergence of the Internet has greatly revolutionized the way that geospatial information is collected, managed, processed and integrated. There are several important research issues to be addressed for distributed geospatial applications. First, the performance of geospatial applications needs to be considered in the Internet environment. In this regard, the Grid, as an effective distributed computing paradigm, is a good choice. The Grid uses a series of middleware to interconnect and merge various distributed resources into a super-computer with high-performance computation capability. Secondly, it is necessary to ensure the secure use of independent geospatial applications in the Internet environment. The Grid provides exactly this utility of secure access to distributed geospatial resources. Additionally, it makes good sense to overcome the heterogeneity between individual geospatial information systems on the Internet. The Open Geospatial Consortium (OGC) proposes a number of generalized geospatial standards, e.g. OGC Web Services (OWS), to achieve interoperable access to geospatial applications. The OWS solution is feasible and widely adopted by both the academic and industry communities. Therefore, we propose an integrated framework that incorporates OWS standards into Grids. Within this framework, distributed geospatial queries can be performed in an interoperable, high-performance and secure Grid environment.

  6. The Reliability Estimation for the Open Function of Cabin Door Affected by the Imprecise Judgment Corresponding to Distribution Hypothesis

    Science.gov (United States)

    Yu, Z. P.; Yue, Z. F.; Liu, W.

    2018-05-01

    With the development of artificial intelligence, more and more reliability experts have noticed the role of subjective information in the reliability design of complex systems. Therefore, based on a certain number of experimental data and expert judgments, we have divided reliability estimation based on a distribution hypothesis into a cognition process and a reliability calculation. As an illustration of this modification, we have taken information fusion based on intuitional fuzzy belief functions as the diagnosis model of the cognition process, and completed the reliability estimation for the opening function of a cabin door affected by the imprecise judgment corresponding to the distribution hypothesis.

  7. Evaluation of the stress distribution on the pressure vessel head with multi-openings

    Energy Technology Data Exchange (ETDEWEB)

    Kim, K.S.; Kim, T.W.; Jeong, K.H.; Lee, G.M. [Korea Atomic Energy Research Institute, Taejon (Korea)]

    1998-06-01

    This report discusses and analyzes the stress distribution on a pressure vessel head with multiple openings (3 PSV nozzles, 2 SDS nozzles and 1 manway) according to patterns of the opening distance. The pressurizer of the Korea Standardized Nuclear Power Plant (Ulchin 3 and 4), which meets the requirements for cyclic operation and opening design defined by the ASME code, was used as the basic model. Stress changes according to the distance between openings were investigated, and the factors which should be considered in the opening design were analyzed. Also, the nozzle loads at Level A and B conditions and the internal pressure were applied in order to evaluate changes in the head stress distribution due to nozzle loads. (author). 6 refs., 29 figs., 4 tabs.

  8. Open Access, Library Subscriptions, and Article Processing Charges

    KAUST Repository

    Vijayakumar, J.K.

    2016-05-01

    Hybrid journals contain articles behind a paywall, accessible by subscription, as well as papers made open access when the author pays an article processing charge (APC). In such cases, an institution will end up paying twice and publishers tend to double-dip. Discussions and pilot models are emerging on pricing options, such as “offset pricing” [where APCs are adjusted or discounted against subscription costs as vouchers or reductions in next year's subscriptions, APCs beyond the subscription costs are modestly capped, etc.] that thus reduce institutions' costs. This presentation will explain the different models available and how we can attain a transparent costing structure, where the scholarly community can feel the fairness in publishers' pricing mechanisms. Though most offset systems are developed through national-level or consortium-level negotiations, the experience of individual institutions like KAUST, which subscribe to large e-journal collections, is important in making the right decisions to save institutional costs and support openness in scholarly communications.

  9. Open Access, Library Subscriptions, and Article Processing Charges

    KAUST Repository

    Vijayakumar, J.K.; Tamarkin, Molly

    2016-01-01

    Hybrid journals contain articles behind a paywall, accessible by subscription, as well as papers made open access when the author pays an article processing charge (APC). In such cases, an institution will end up paying twice and publishers tend to double-dip. Discussions and pilot models are emerging on pricing options, such as “offset pricing” [where APCs are adjusted or discounted against subscription costs as vouchers or reductions in next year's subscriptions, APCs beyond the subscription costs are modestly capped, etc.] that thus reduce institutions' costs. This presentation will explain the different models available and how we can attain a transparent costing structure, where the scholarly community can feel the fairness in publishers' pricing mechanisms. Though most offset systems are developed through national-level or consortium-level negotiations, the experience of individual institutions like KAUST, which subscribe to large e-journal collections, is important in making the right decisions to save institutional costs and support openness in scholarly communications.

  10. QUALITY AND PROCESSES OF BANGLADESH OPEN UNIVERSITY COURSE MATERIALS DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    K. M. Rezanur RAHMAN

    2006-04-01

    Full Text Available A new member of the mega-universities, Bangladesh Open University (BOU) introduced a course team approach for developing effective course materials for distance students. BOU teaching media include printed course books, study guides, radio and television broadcasts, audiocassettes and occasional face-to-face tutorials. Each course team comprises specialist course writer(s), an editor, a trained style editor, a graphic designer, an illustrator, an audio-visual producer and anonymous referees. An editorial board or preview committee is responsible for the final approval to publish or broadcast materials for learners. This approach has proved to be effective, but appears to be complicated and time-consuming. This report focuses on the quality and processes of BOU course materials development, taking into account the strengths and weaknesses of the current approach.

  11. AVIRIS and TIMS data processing and distribution at the land processes distributed active archive center

    Science.gov (United States)

    Mah, G. R.; Myers, J.

    1993-01-01

    The U.S. Government has initiated the Global Change Research Program, a systematic study of the Earth as a complete system. NASA's contribution to the Global Change Research Program is the Earth Observing System (EOS), a series of orbital sensor platforms and an associated data processing and distribution system. The EOS Data and Information System (EOSDIS) is the archiving, production, and distribution system for data collected by the EOS space segment and uses a multilayer architecture for processing, archiving, and distributing EOS data. The first layer consists of the spacecraft ground stations and processing facilities that receive the raw data from the orbiting platforms and then separate the data by individual sensors. The second layer consists of Distributed Active Archive Centers (DAAC) that process, distribute, and archive the sensor data. The third layer consists of a user science processing network. The EOSDIS is being developed in a phased implementation. The initial phase, Version 0, is a prototype of the operational system. Version 0 activities are based upon existing systems and are designed to provide an EOSDIS-like capability for information management and distribution. An important science support task is the creation of simulated data sets for EOS instruments from precursor aircraft or satellite data. The Land Processes DAAC, at the EROS Data Center (EDC), is responsible for archiving and processing EOS precursor data from airborne instruments such as the Thermal Infrared Multispectral Scanner (TIMS), the Thematic Mapper Simulator (TMS), and the Airborne Visible and Infrared Imaging Spectrometer (AVIRIS). AVIRIS, TIMS, and TMS are flown by the NASA-Ames Research Center (ARC) on an ER-2. The ER-2 flies at 65,000 feet and can carry up to three sensors simultaneously. Most jointly collected data sets are somewhat boresighted and roughly registered. The instrument data are being used to construct data sets that simulate the spectral and spatial

  12. Operating principle of Soft Open Points for electrical distribution network operation

    International Nuclear Information System (INIS)

    Cao, Wanyu; Wu, Jianzhong; Jenkins, Nick; Wang, Chengshan; Green, Timothy

    2016-01-01

    Highlights: • Two control modes were developed for a B2B VSCs based SOP. • The SOP operating principle was investigated under various network conditions. • The performance of the SOP using two control modes was analyzed. - Abstract: Soft Open Points (SOPs) are power electronic devices installed in place of normally-open points in electrical power distribution networks. They are able to provide active power flow control, reactive power compensation and voltage regulation under normal network operating conditions, as well as fast fault isolation and supply restoration under abnormal conditions. Two control modes were developed for the operation of an SOP, using back-to-back voltage-source converters (VSCs). A power flow control mode with current control provides independent control of real and reactive power. A supply restoration mode with a voltage controller enables power supply to loads isolated by network faults. The operating principle of the back-to-back VSC-based SOP was investigated under both normal and abnormal network operating conditions. Studies on a two-feeder medium-voltage distribution network showed the performance of the SOP under different network operating conditions: normal, during a fault and post-fault supply restoration. During the change of network operating conditions, a mode switch method based on the phase-locked loop controller was used to achieve the transitions between the two control modes. Hard transitions produced by direct mode switching were found to be unfavourable, but seamless transitions were obtained by deploying a soft cold-load pickup and voltage synchronization process.

  13. Flexible distributed architecture for semiconductor process control and experimentation

    Science.gov (United States)

    Gower, Aaron E.; Boning, Duane S.; McIlrath, Michael B.

    1997-01-01

    Semiconductor fabrication requires an increasingly expensive and integrated set of tightly controlled processes, driving the need for a fabrication facility with fully computerized, networked processing equipment. We describe an integrated, open system architecture enabling distributed experimentation and process control for plasma etching. The system was developed at MIT's Microsystems Technology Laboratories and employs in-situ CCD interferometry based analysis in the sensor-feedback control of an Applied Materials Precision 5000 Plasma Etcher (AME5000). Our system supports accelerated, advanced research involving feedback control algorithms, and includes a distributed interface that utilizes the internet to make these fabrication capabilities available to remote users. The system architecture is both distributed and modular: specific implementation of any one task does not restrict the implementation of another. The low level architectural components include a host controller that communicates with the AME5000 equipment via SECS-II, and a host controller for the acquisition and analysis of the CCD sensor images. A cell controller (CC) manages communications between these equipment and sensor controllers. The CC is also responsible for process control decisions; algorithmic controllers may be integrated locally or via remote communications. Finally, a system server manages connections from internet/intranet (web) based clients and uses a direct link with the CC to access the system. Each component communicates via a predefined set of TCP/IP socket based messages. This flexible architecture makes integration easier and more robust, and enables separate software components to run on the same or different computers independent of hardware or software platform.

  14. Distributed resource management across process boundaries

    KAUST Repository

    Suresh, Lalith

    2017-09-27

    Multi-tenant distributed systems composed of small services, such as Service-oriented Architectures (SOAs) and Micro-services, raise new challenges in attaining high performance and efficient resource utilization. In these systems, a request execution spans tens to thousands of processes, and the execution paths and resource demands on different services are generally not known when a request first enters the system. In this paper, we highlight the fundamental challenges of regulating load and scheduling in SOAs while meeting end-to-end performance objectives on metrics of concern to both tenants and operators. We design Wisp, a framework for building SOAs that transparently adapts rate limiters and request schedulers system-wide according to operator policies to satisfy end-to-end goals while responding to changing system conditions. In evaluations against production as well as synthetic workloads, Wisp successfully enforces a range of end-to-end performance objectives, such as reducing average latencies, meeting deadlines, providing fairness and isolation, and avoiding system overload.

  15. Distributed resource management across process boundaries

    KAUST Repository

    Suresh, Lalith; Bodik, Peter; Menache, Ishai; Canini, Marco; Ciucu, Florin

    2017-01-01

    Multi-tenant distributed systems composed of small services, such as Service-oriented Architectures (SOAs) and Micro-services, raise new challenges in attaining high performance and efficient resource utilization. In these systems, a request execution spans tens to thousands of processes, and the execution paths and resource demands on different services are generally not known when a request first enters the system. In this paper, we highlight the fundamental challenges of regulating load and scheduling in SOAs while meeting end-to-end performance objectives on metrics of concern to both tenants and operators. We design Wisp, a framework for building SOAs that transparently adapts rate limiters and request schedulers system-wide according to operator policies to satisfy end-to-end goals while responding to changing system conditions. In evaluations against production as well as synthetic workloads, Wisp successfully enforces a range of end-to-end performance objectives, such as reducing average latencies, meeting deadlines, providing fairness and isolation, and avoiding system overload.
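
    Wisp itself is not open for inspection here, but the kind of component it adapts system-wide, a request rate limiter, is easy to sketch. The Python token-bucket below is a hedged illustration only; the class name, the set_rate policy hook, and all parameters are hypothetical and not taken from the paper.

        # Minimal token-bucket rate limiter of the kind a framework like Wisp
        # might retune system-wide; names and parameters are hypothetical.
        import time

        class TokenBucket:
            def __init__(self, rate_per_s, burst):
                self.rate, self.capacity = rate_per_s, burst
                self.tokens, self.last = burst, time.monotonic()

            def allow(self):
                now = time.monotonic()
                self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
                self.last = now
                if self.tokens >= 1.0:
                    self.tokens -= 1.0
                    return True
                return False  # shed or queue the request

            def set_rate(self, rate_per_s):
                """Hook for a controller reacting to end-to-end latency or deadline signals."""
                self.rate = rate_per_s

        limiter = TokenBucket(rate_per_s=100.0, burst=20)
        accepted = sum(limiter.allow() for _ in range(50))
        print(f"accepted {accepted} of 50 back-to-back requests")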

  16. The Simple Concurrent Online Processing System (SCOPS) - An open-source interface for remotely sensed data processing

    Science.gov (United States)

    Warren, M. A.; Goult, S.; Clewley, D.

    2018-06-01

    Advances in technology allow remotely sensed data to be acquired with increasingly higher spatial and spectral resolutions. These data may then be used to influence government decision making and solve a number of research and application driven questions. However, such large volumes of data can be difficult to handle on a single personal computer or on older machines with slower components. Often the software required to process data is varied and can be highly technical and too advanced for the novice user to fully understand. This paper describes an open-source tool, the Simple Concurrent Online Processing System (SCOPS), which forms part of an airborne hyperspectral data processing chain that allows users accessing the tool over a web interface to submit jobs and process data remotely. It is demonstrated using Natural Environment Research Council Airborne Research Facility (NERC-ARF) instruments together with other free- and open-source tools to take radiometrically corrected data from sensor geometry into geocorrected form and to generate simple or complex band ratio products. The final processed data products are acquired via an HTTP download. SCOPS can cut data processing times and introduce complex processing software to novice users by distributing jobs across a network using a simple to use web interface.
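
    As a hedged illustration of the "simple or complex band ratio products" mentioned above, the NumPy sketch below computes a normalized-difference ratio from a hyperspectral cube; the band indices, array shape, and function name are placeholders and do not reflect the SCOPS interface itself.

        # A simple band-ratio product of the kind SCOPS generates, sketched with
        # NumPy; band indices and data handling are placeholders, not the SCOPS API.
        import numpy as np

        def band_ratio(cube, num_band, den_band, eps=1e-6):
            """Normalized difference ratio between two bands of a (bands, rows, cols) cube."""
            num = cube[num_band].astype("float64")
            den = cube[den_band].astype("float64")
            return (num - den) / (num + den + eps)

        cube = np.random.rand(100, 512, 512)  # stand-in for a geocorrected hyperspectral cube
        ndvi_like = band_ratio(cube, num_band=60, den_band=40)
        print(ndvi_like.shape, float(ndvi_like.mean()))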

  17. Behind Linus's Law: Investigating Peer Review Processes in Open Source

    Science.gov (United States)

    Wang, Jing

    2013-01-01

    Open source software has revolutionized the way people develop software, organize collaborative work, and innovate. The numerous open source software systems that have been created and adopted over the past decade are influential and vital in all aspects of work and daily life. The understanding of open source software development can enhance its…

  18. Parallel and distributed processing: applications to power systems

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Felix; Murphy, Liam [California Univ., Berkeley, CA (United States). Dept. of Electrical Engineering and Computer Sciences]

    1994-12-31

    Applications of parallel and distributed processing to power systems problems are still in the early stages. Rapid progress in computing and communications promises a revolutionary increase in the capacity of distributed processing systems. In this paper, the state-of-the art in distributed processing technology and applications is reviewed and future trends are discussed. (author) 14 refs.,1 tab.

  19. ACToR Chemical Structure processing using Open Source ...

    Science.gov (United States)

    ACToR (Aggregated Computational Toxicology Resource) is a centralized database repository developed by the National Center for Computational Toxicology (NCCT) at the U.S. Environmental Protection Agency (EPA). Free and open source tools were used to compile toxicity data from over 1,950 public sources. ACToR contains chemical structure information and toxicological data for over 558,000 unique chemicals. The database primarily includes data from NCCT research programs, in vivo toxicity data from ToxRef, human exposure data from ExpoCast, high-throughput screening data from ToxCast and high quality chemical structure information from the EPA DSSTox program. The DSSTox database is a chemical structure inventory for the NCCT programs and currently has about 16,000 unique structures. Included are also data from PubChem, ChemSpider, USDA, FDA, NIH and several other public data sources. ACToR has been a resource to various international and national research groups. Most of our recent efforts on ACToR are focused on improving the structural identifiers and Physico-Chemical properties of the chemicals in the database. Organizing this huge collection of data and improving the chemical structure quality of the database has posed some major challenges. Workflows have been developed to process structures, calculate chemical properties and identify relationships between CAS numbers. The Structure processing workflow integrates web services (PubChem and NIH NCI Cactus) to d

  20. Matrix product representation of the stationary state of the open zero range process

    Science.gov (United States)

    Bertin, Eric; Vanicat, Matthieu

    2018-06-01

    Many one-dimensional lattice particle models with open boundaries, like the paradigmatic asymmetric simple exclusion process (ASEP), have their stationary states represented in the form of a matrix product, with matrices that do not explicitly depend on the lattice site. In contrast, the stationary state of the open 1D zero-range process (ZRP) takes an inhomogeneous factorized form, with site-dependent probability weights. We show that in spite of the absence of correlations, the stationary state of the open ZRP can also be represented in a matrix product form, where the matrices are site-independent, non-commuting and determined from algebraic relations resulting from the master equation. We recover the known distribution of the open ZRP in two different ways: first, using an explicit representation of the matrices and boundary vectors; second, from the sole knowledge of the algebraic relations satisfied by these matrices and vectors. Finally, an interpretation of the relation between the matrix product form and the inhomogeneous factorized form is proposed within the framework of hidden Markov chains.
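
    For readers unfamiliar with the construction, the generic shape of such a matrix product stationary state can be written schematically in LaTeX as follows; the concrete matrices, boundary vectors, and the algebraic relations they satisfy are model-specific and are derived in the paper from the master equation.

        P(n_1,\dots,n_L) \;=\; \frac{1}{Z_L}\,\langle W \,|\, X_{n_1} X_{n_2} \cdots X_{n_L} \,|\, V \rangle ,
        \qquad
        Z_L \;=\; \sum_{n_1,\dots,n_L} \langle W \,|\, X_{n_1} \cdots X_{n_L} \,|\, V \rangle .

    For the open ZRP the same stationary weights also factorize as a product of site-dependent single-site weights, which is the inhomogeneous factorized form the abstract refers to; the paper's point is that this can coexist with site-independent matrices.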

  1. In vitro simulation of distribution processes following intramuscular injection

    Directory of Open Access Journals (Sweden)

    Probst Mareike

    2016-09-01

    Full Text Available There is an urgent need for in vitro dissolution test setups for intramuscularly applied dosage forms. Especially biorelevant methods are needed to predict the in vivo behavior of newly developed dosage forms in a realistic way. There is a lack of knowledge regarding the critical in vivo parameters influencing the release and absorption behavior of an intramuscularly applied drug. In the presented work the focus was set on the simulation of blood perfusion and muscle tissue. A solid agarose gel, incorporated in an open-pored foam, was used to mimic the gel phase of muscle tissue and implemented in a flow-through cell. An aqueous solution of fluorescein sodium was injected. Compared to recently obtained in vivo results, the distribution of the model substance was very slow. Furthermore, an agarose gel of lower viscosity, an open-pored foam and phosphate buffered saline (pH 7.4) were implemented in a multi-channel ceramic membrane serving as a holder for the muscle-imitating material. Blood-simulating release medium was perfused through the ceramic membrane including the filling materials. Transport of the dissolved fluorescein sodium was, in the case of the gel, determined not only by diffusion but also by convective transport processes. The more realistically the muscle-simulating materials were constituted, the less reproducible were the results obtained with the designed test setups.

  2. Distributed Real-Time Embedded Video Processing

    National Research Council Canada - National Science Library

    Lv, Tiehan

    2004-01-01

    .... A deployable multi-camera video system must perform distributed computation, including computation near the camera as well as remote computations, in order to meet performance and power requirements...

  3. A tutorial on Palm distributions for spatial point processes

    DEFF Research Database (Denmark)

    Coeurjolly, Jean-Francois; Møller, Jesper; Waagepetersen, Rasmus Plenge

    2017-01-01

    This tutorial provides an introduction to Palm distributions for spatial point processes. Initially, in the context of finite point processes, we give an explicit definition of Palm distributions in terms of their density functions. Then we review Palm distributions in the general case. Finally, we...

  4. Multivariate semi-logistic distribution and processes | Umar | Journal ...

    African Journals Online (AJOL)

    Multivariate semi-logistic distribution is introduced and studied. Some characterizations properties of multivariate semi-logistic distribution are presented. First order autoregressive minification processes and its generalization to kth order autoregressive minification processes with multivariate semi-logistic distribution as ...

  5. IJ-OpenCV: Combining ImageJ and OpenCV for processing images in biomedicine.

    Science.gov (United States)

    Domínguez, César; Heras, Jónathan; Pascual, Vico

    2017-05-01

    The effective processing of biomedical images usually requires the interoperability of diverse software tools that have different aims but are complementary. The goal of this work is to develop a bridge to connect two of those tools: ImageJ, a program for image analysis in life sciences, and OpenCV, a computer vision and machine learning library. Based on a thorough analysis of ImageJ and OpenCV, we detected the features of these systems that could be enhanced, and developed a library to combine both tools, taking advantage of the strengths of each system. The library was implemented on top of the SciJava converter framework. We also provide a methodology to use this library. We have developed the publicly available library IJ-OpenCV that can be employed to create applications combining features from both ImageJ and OpenCV. From the perspective of ImageJ developers, they can use IJ-OpenCV to easily create plugins that use any functionality provided by the OpenCV library and explore different alternatives. From the perspective of OpenCV developers, this library provides a link to the ImageJ graphical user interface and all its features to handle regions of interest. The IJ-OpenCV library bridges the gap between ImageJ and OpenCV, allowing the connection and the cooperation of these two systems. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Cumulative processes and quark distribution in nuclei

    International Nuclear Information System (INIS)

    Kondratyuk, L.; Shmatikov, M.

    1984-01-01

    Assuming the existence of multiquark (mainly 12q) bags in nuclei, the spectra of cumulative nucleons and mesons produced in high-energy particle-nucleus collisions are discussed. The exponential form of the quark momentum distribution in the 12q bag (agreeing well with the experimental data on lepton-nucleus interactions at large q²) is shown to result in a quasi-exponential distribution of cumulative particles over the light-cone variable α_B. The dependence of f(α_B; p_⊥) (where p_⊥ is the transverse momentum of the bag) upon p_⊥ is considered. The yields of cumulative resonances, as well as effects related to the u- and d-quark distributions in N > Z nuclei being different, are discussed

  7. Mapping innovation processes: Visual techniques for opening and presenting the black box of service innovation processes

    DEFF Research Database (Denmark)

    Olesen, Anne Rørbæk

    2017-01-01

    This chapter argues for the usefulness of visual mapping techniques for performing qualitative analysis of complex service innovation processes. Different mapping formats are presented, namely, matrices, networks, process maps, situational analysis maps and temporal situational analysis maps. For the purpose of researching service innovation processes, the three latter formats are argued to be particularly interesting. Process maps can give an overview of different periods and milestones in a process in one carefully organized location. Situational analysis maps and temporal situational analysis maps can open up complexities of service innovation processes, as well as close them down for presentational purposes. The mapping formats presented are illustrated by displaying maps from an exemplary research project, and the chapter is concluded with a brief discussion of the limitations and pitfalls...

  8. Documenting open source migration processes for re-use

    CSIR Research Space (South Africa)

    Gerber, A

    2010-10-01

    Full Text Available There are several sources that indicate a remarkable increase in the adoption of open source software (OSS) into the technology infrastructure of organizations. In fact, the number of medium to large organizations without some OSS installations...

  9. Signal processing for distributed readout using TESs

    International Nuclear Information System (INIS)

    Smith, Stephen J.; Whitford, Chris H.; Fraser, George W.

    2006-01-01

    We describe optimal filtering algorithms for determining energy and position resolution in position-sensitive Transition Edge Sensor (TES) Distributed Read-Out Imaging Devices (DROIDs). Improved algorithms, developed using a small-signal finite-element model, are based on least-squares minimisation of the total noise power in the correlated dual TES DROID. Through numerical simulations we show that significant improvements in energy and position resolution are theoretically possible over existing methods

  10. The eGo grid model: An open-source and open-data based synthetic medium-voltage grid model for distribution power supply systems

    Science.gov (United States)

    Amme, J.; Pleßmann, G.; Bühler, J.; Hülk, L.; Kötter, E.; Schwaegerl, P.

    2018-02-01

    The increasing integration of renewable energy into the electricity supply system creates new challenges for distribution grids. The planning and operation of distribution systems requires appropriate grid models that consider the heterogeneity of existing grids. In this paper, we describe a novel method to generate synthetic medium-voltage (MV) grids, which we applied in our DIstribution Network GeneratOr (DINGO). DINGO is open-source software and uses freely available data. Medium-voltage grid topologies are synthesized based on location and electricity demand in defined demand areas. For this purpose, we use GIS data containing demand areas with high-resolution spatial data on physical properties, land use, energy, and demography. The grid topology is treated as a capacitated vehicle routing problem (CVRP) combined with a local search metaheuristic. We also consider the current planning principles for MV distribution networks, paying special attention to line congestion and voltage limit violations. In the modelling process, we included power flow calculations for validation. The resulting grid model datasets contain 3608 synthetic MV grids in high resolution, covering all of Germany and taking local characteristics into account. We compared the modelled networks with real network data. In terms of number of transformers and total cable length, we conclude that the method presented in this paper generates realistic grids that could be used to implement a cost-optimised electrical energy system.

  11. A radial distribution function-based open boundary force model for multi-centered molecules

    KAUST Repository

    Neumann, Philipp

    2014-06-01

    We derive an expression for radial distribution function (RDF)-based open boundary forcing for molecules with multiple interaction sites. Due to the high-dimensionality of the molecule configuration space and missing rotational invariance, a computationally cheap, 1D approximation of the arising integral expressions as in the single-centered case is not possible anymore. We propose a simple, yet accurate model invoking standard molecule- and site-based RDFs to approximate the respective integral equation. The new open boundary force model is validated for ethane in different scenarios and shows very good agreement with data from periodic simulations. © World Scientific Publishing Company.
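
    Schematically, and with notation that is ours rather than the paper's, the single-centered starting point expresses the open-boundary force on a particle at distance d from the boundary as the RDF-weighted average of the pair forces that the missing outer fluid would exert:

        \mathbf{F}_{\mathrm{b}}(d) \;=\; \rho \int_{\Omega_{\mathrm{out}}(d)} g\!\left(\lVert \mathbf{r} \rVert\right)\, \mathbf{F}_{\mathrm{pair}}(\mathbf{r}) \,\mathrm{d}\mathbf{r},

    with ρ the fluid number density and Ω_out(d) the region outside the computational domain. The paper's contribution is the multi-site generalization, where the cheap 1D reduction of this integral is no longer available and molecule- and site-based RDFs are combined instead.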

  12. Numerical simulation of distributed parameter processes

    CERN Document Server

    Colosi, Tiberiu; Unguresan, Mihaela-Ligia; Muresan, Vlad

    2013-01-01

    The present monograph defines, interprets and uses the matrix of partial derivatives of the state vector with applications for the study of some common categories of engineering processes. The book covers broad categories of processes that are formed by systems of partial differential equations (PDEs), including systems of ordinary differential equations (ODEs). The work includes numerous applications specific to Systems Theory based on Mpdx, such as parallel, serial as well as feed-back connections for the processes defined by PDEs. For similar, more complex processes based on Mpdx with PDEs and ODEs as components, we have developed control schemes with PID effects for the propagation phenomena, in continuous media (spaces) or discontinuous ones (chemistry, power system, thermo-energetic) or in electro-mechanics (railway – traction) and so on. The monograph has a purely engineering focus and is intended for a target audience working in extremely diverse fields of application (propagation phenomena, diffusion, hydrodyn...

  13. Computer program for source distribution process in radiation facility

    International Nuclear Information System (INIS)

    Al-Kassiri, H.; Abdul Ghani, B.

    2007-08-01

    A computer simulation of dose distribution, written in Visual Basic, has been carried out according to the arrangement and activities of the Co-60 sources. The program provides the dose distribution in treated products depending on the product density and the desired dose. The program is useful for optimizing the source distribution during the loading process. There is good agreement between the data calculated by the program and the experimental data. (Author)
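
    The record above describes a Visual Basic program; as a hedged stand-in, the Python sketch below shows the kind of point-kernel superposition such a tool typically performs (inverse-square law plus exponential attenuation). The gamma constant is a nominal Co-60 value, and the attenuation coefficient, geometry and activities are assumed placeholders, not data from the facility.

        # Illustrative point-kernel dose-rate estimate for several Co-60 sources;
        # constants are placeholders, not validated dosimetry data.
        import math

        GAMMA_CONST = 0.351  # Co-60 dose-rate constant, Gy*m^2/(h*TBq), nominal value
        MU = 2.0             # effective linear attenuation in the product, 1/m (assumed)

        def dose_rate(point, sources):
            """Sum (x, y, z, activity_TBq) source contributions at one product point,
            crudely assuming the whole path lies inside the attenuating product."""
            total = 0.0
            for sx, sy, sz, activity in sources:
                d = math.dist(point, (sx, sy, sz))
                total += GAMMA_CONST * activity / d**2 * math.exp(-MU * d)
            return total

        sources = [(0.0, 0.0, z, 150.0) for z in (-0.5, 0.0, 0.5)]  # small source rack
        print(f"{dose_rate((1.0, 0.0, 0.0), sources):.2f} Gy/h")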

  14. Empirical tests of Zipf's law mechanism in open source Linux distribution.

    Science.gov (United States)

    Maillart, T; Sornette, D; Spaeth, S; von Krogh, G

    2008-11-21

    Zipf's power law is a ubiquitous empirical regularity found in many systems, thought to result from proportional growth. Here, we establish empirically the usually assumed ingredients of stochastic growth models that have been previously conjectured to be at the origin of Zipf's law. We use exceptionally detailed data on the evolution of open source software projects in Linux distributions, which offer a remarkable example of a growing complex self-organizing adaptive system, exhibiting Zipf's law over four full decades.
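
    A hedged sketch of how a Zipf exponent is commonly read off rank-size data of the kind the study uses (package or project sizes in a Linux distribution); the data below are synthetic, not the measurements analysed in the paper.

        # Rank-size estimate of a Zipf exponent on synthetic heavy-tailed data;
        # the actual study used detailed Linux-distribution package data.
        import numpy as np

        rng = np.random.default_rng(0)
        sizes = 1.0 + rng.pareto(1.0, 5000)      # Pareto-like synthetic "project sizes"
        sizes = np.sort(sizes)[::-1]
        ranks = np.arange(1, sizes.size + 1)

        # Zipf's law: size ~ rank**(-alpha); fit alpha by least squares in log-log space.
        slope, _ = np.polyfit(np.log(ranks), np.log(sizes), 1)
        print(f"estimated Zipf exponent alpha = {-slope:.2f}")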

  15. Student Support in Open Learning: Sustaining the Process.

    Science.gov (United States)

    Dearnley, Christine

    2003-01-01

    A 2-year study included interviews with 18 and survey of 160 nurses studying through open learning in the United Kingdom. They were challenged by returning to study, requiring time management and technological skills. Professional, academic, and social networks provided important support as life responsibilities and events impinged on learning.…

  16. 17 CFR 270.12b-1 - Distribution of shares by registered open-end management investment company.

    Science.gov (United States)

    2010-04-01

    ... registered open-end management investment company. 270.12b-1 Section 270.12b-1 Commodity and Securities... 1940 § 270.12b-1 Distribution of shares by registered open-end management investment company. (a)(1... the printing and mailing of sales literature; (b) A registered, open-end management investment company...

  17. “Shift-Betel”: A (very) distributed mainframe

    International Nuclear Information System (INIS)

    Segal, B.; Martin, O.; Hassine, F.; Hemmer, F.; Jouanigot, J.M.

    1994-01-01

    Over the last four years, CERN has progressively converted its central batch production facilities from classic mainframe platforms (Cray XMP, IBM, ESA, Vax 9000) to distributed RISC based facilities, which have now attained a very large size. Both a CPU-intensive system (“CSF”, the Central Simulation Facility) and an I/O-intensive system (“SHIFT”, the Scaleable Heterogeneous Integrated Facility) have been developed, plus a distributed data management subsystem allowing seamless access to CERN's central tape store and to large amounts of economical disk space. The full system is known as “CORE”, the Centrally Operated Risc Environment; at the time of writing CORE comprises around 2000 CERN Units of Computing (about 8000 MIPs) and over a TeraByte of online disk space. This distributed system is connected using standard networking technologies (IP protocols over Ethernet, FDDI and UltraNet), but which until quite recently were only implemented at sufficiently high speed in the Local Area

  18. Modelling aspects of distributed processing in telecommunication networks

    NARCIS (Netherlands)

    Tomasgard, A; Audestad, JA; Dye, S; Stougie, L; van der Vlerk, MH; Wallace, SW

    1998-01-01

    The purpose of this paper is to formally describe new optimization models for telecommunication networks with distributed processing. Modern distributed networks put more focus on the processing of information and less on the actual transportation of data than we are traditionally used to in

  19. wradlib - an Open Source Library for Weather Radar Data Processing

    Science.gov (United States)

    Pfaff, Thomas; Heistermann, Maik; Jacobi, Stephan

    2014-05-01

    for interactive data exploration and analysis. Based on the powerful scientific python stack (numpy, scipy, matplotlib) and in parts augmented by functions compiled in C or Fortran, most routines are fast enough to also allow data intensive re-analyses or even real-time applications. From the organizational point of view, wradlib is intended to be community driven. To this end, the source code is made available using a distributed version control system (DVCS) with a publicly hosted repository. Code may be contributed using the fork/pull-request mechanism available to most modern DVCS. Mailing lists were set up to allow dedicated exchange among users and developers in order to fix problems and discuss new developments. Extensive documentation is a key feature of the library, and is available online at http://wradlib.bitbucket.org. It includes an individual function reference as well as examples, tutorials and recipes, showing how those routines can be combined to create complete processing workflows. This should allow new users to achieve results quickly, even without much prior experience with weather radar data.
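
    Weather radar processing chains of the kind wradlib supports typically include a reflectivity-to-rainfall conversion; the sketch below writes the standard Marshall-Palmer Z-R relation with plain NumPy rather than quoting specific wradlib function names, and the coefficients a and b are the usual default assumptions.

        # Reflectivity-to-rainfall conversion (Marshall-Palmer Z = a * R**b), a
        # standard step in radar processing; written with plain NumPy here rather
        # than relying on any particular wradlib function signature.
        import numpy as np

        def dbz_to_rainrate(dbz, a=200.0, b=1.6):
            """Convert reflectivity (dBZ) to rain rate (mm/h) via Z = a * R**b."""
            z = 10.0 ** (dbz / 10.0)   # dBZ -> linear reflectivity factor
            return (z / a) ** (1.0 / b)

        sweep_dbz = np.array([[10.0, 25.0, 40.0], [35.0, 5.0, 50.0]])
        print(np.round(dbz_to_rainrate(sweep_dbz), 2))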

  20. A Distributed OpenCL Framework using Redundant Computation and Data Replication

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Junghyun [Seoul National University, Korea]; Gangwon, Jo [Seoul National University, Korea]; Jaehoon, Jung [Seoul National University, Korea]; Lee, Jaejin [Seoul National University, Korea]

    2016-01-01

    Applications written solely in OpenCL or CUDA cannot execute on a cluster as a whole. Most previous approaches that extend these programming models to clusters are based on a common idea: designating a centralized host node and coordinating the other nodes with the host for computation. However, the centralized host node is a serious performance bottleneck when the number of nodes is large. In this paper, we propose a scalable and distributed OpenCL framework called SnuCL-D for large-scale clusters. SnuCL-D's remote device virtualization provides an OpenCL application with an illusion that all compute devices in a cluster are confined in a single node. To reduce the amount of control-message and data communication between nodes, SnuCL-D replicates the OpenCL host program execution and data in each node. We also propose a new OpenCL host API function and a queueing optimization technique that significantly reduce the overhead incurred by the previous centralized approaches. To show the effectiveness of SnuCL-D, we evaluate SnuCL-D with a microbenchmark and eleven benchmark applications on a large-scale CPU cluster and a medium-scale GPU cluster.

  1. Leveraging human oversight and intervention in large-scale parallel processing of open-source data

    Science.gov (United States)

    Casini, Enrico; Suri, Niranjan; Bradshaw, Jeffrey M.

    2015-05-01

    The popularity of cloud computing, along with the increased availability of cheap storage, has led to the need to elaborate and transform large volumes of open-source data in parallel. One way to handle such extensive volumes of information properly is to take advantage of distributed computing frameworks like Map-Reduce. Unfortunately, an entirely automated approach that excludes human intervention is often unpredictable and error prone. Highly accurate data processing and decision-making can be achieved by supporting an automatic process through human collaboration, in a variety of environments such as warfare, cyber security and threat monitoring. Although this mutual participation seems easy to exploit, human-machine collaboration in the field of data analysis presents several challenges. First, due to the asynchronous nature of human intervention, it is necessary to verify that once a correction is made, all the necessary reprocessing is carried out along the chain. Second, it is often necessary to minimize the amount of reprocessing in order to optimize the usage of resources, given their limited availability. To address these strict requirements, this paper introduces improvements to an innovative approach for human-machine collaboration in the processing of large amounts of open-source data in parallel.

  2. Benefits analysis of Soft Open Points for electrical distribution network operation

    International Nuclear Information System (INIS)

    Cao, Wanyu; Wu, Jianzhong; Jenkins, Nick; Wang, Chengshan; Green, Timothy

    2016-01-01

    Highlights: • An analysis framework was developed to quantify the operational benefits. • The framework considers both network reconfiguration and SOP control. • Benefits were analyzed through both quantitative and sensitivity analysis. - Abstract: Soft Open Points (SOPs) are power electronic devices installed in place of normally-open points in electrical power distribution networks. They are able to provide active power flow control, reactive power compensation and voltage regulation under normal network operating conditions, as well as fast fault isolation and supply restoration under abnormal conditions. A steady state analysis framework was developed to quantify the operational benefits of a distribution network with SOPs under normal network operating conditions. A generic power injection model was developed and used to determine the optimal SOP operation using an improved Powell’s Direct Set method. Physical limits and power losses of the SOP device (based on back to back voltage-source converters) were considered in the model. Distribution network reconfiguration algorithms, with and without SOPs, were developed and used to identify the benefits of using SOPs. Test results on a 33-bus distribution network compared the benefits of using SOPs, traditional network reconfiguration and the combination of both. The results showed that using only one SOP achieved a similar improvement in network operation compared to the case of using network reconfiguration with all branches equipped with remotely controlled switches. A combination of SOP control and network reconfiguration provided the optimal network operation.
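
    As a hedged toy version of the "generic power injection model" mentioned above, the Python sketch below treats the SOP as a pair of controllable injections coupled by an active-power balance and per-terminal converter ratings. The loss coefficient, ratings and function name are assumptions for illustration; the study itself optimizes such set-points with an improved Powell's direct set method within a full network model.

        # Toy two-terminal SOP model: active power entering terminal 1 leaves
        # terminal 2 minus converter losses; each terminal supplies reactive
        # power independently, subject to its apparent-power rating.
        import math

        S_RATED = 1.0   # per-terminal converter rating, p.u. (assumed)
        K_LOSS = 0.02   # proportional loss coefficient per converter (assumed)

        def sop_transfer(p_in, q1, q2):
            losses = K_LOSS * (math.hypot(p_in, q1) + math.hypot(p_in, q2))  # crude loss model
            p_out = p_in - losses
            feasible = (math.hypot(p_in, q1) <= S_RATED
                        and math.hypot(p_out, q2) <= S_RATED)
            return feasible, p_out, losses

        print(sop_transfer(p_in=0.6, q1=0.3, q2=-0.2))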

  3. Design of distributed systems of hydrolithosphere processes management. A synthesis of distributed management systems

    Science.gov (United States)

    Pershin, I. M.; Pervukhin, D. A.; Ilyushin, Y. V.; Afanaseva, O. V.

    2017-10-01

    The paper considers the important problem of designing distributed systems for the management of hydrolithosphere processes. The control actions on the hydrolithosphere processes under consideration are implemented by a set of extraction wells. The article presents a method of defining approximation links for describing the dynamic characteristics of hydrolithosphere processes. The structure of the distributed regulators used in the management systems for the considered processes is presented. The paper analyses the results of the synthesis of the distributed management system and the results of modelling the closed-loop control system with respect to the parameters of the hydrolithosphere process.

  4. Point processes and the position distribution of infinite boson systems

    International Nuclear Information System (INIS)

    Fichtner, K.H.; Freudenberg, W.

    1987-01-01

    It is shown that to each locally normal state of a boson system one can associate a point process that can be interpreted as the position distribution of the state. The point process contains all information one can get by position measurements and is determined by the latter. On the other hand, to each so-called Σ^c-point process Q they relate a locally normal state with position distribution Q

  5. Study of the Release Process of Open Source Software: Case Study

    OpenAIRE

    Eide, Tor Erik

    2007-01-01

    This report presents the results of a case study focusing on the release process of open source projects initiated with commercial motives. The purpose of the study is to gain an increased understanding of the release process, how a community can be attracted to the project, and how the interaction with the community evolves in commercial open source initiatives. Data has been gathered from four distinct sources to form the basis of this thesis. A thorough review of the open source literatu...

  6. Processing and Distribution of STO2 Data

    Science.gov (United States)

    Goldsmith, Paul

    We propose in this ADAP to reduce the data obtained in the December 2016 flight of the STO2 Antarctic Balloon observatory. In just over 20 days of taking data, STO2 observed over 2.5 square degrees of the inner Milky Way in the 1900 GHz (158 μm) fine structure line of ionized carbon ([CII]). This includes over 320,000 spectra with velocity resolution of 0.16 km/s and angular resolution of 1′. In common with the higher bands of the Herschel HIFI instrument that also employed hot electron bolometer (HEB) mixers, there are significant baseline issues with the data that make reduction a significant challenge. Due to the year-long postponement of the STO2 launch because of weather in the 2015/16 season, funds for data analysis were largely redirected to support the team who enabled the successful launch and flight. A supplementary focused effort is thus needed to make STO2 data readily usable by the astronomical community, which is what we propose here. This ADAP will be a two-year program, including the following steps: (1) Refine and optimize algorithms for excision of bad channels, correction for receiver gain changes, removal of variable bad baselines, final baseline adjustment, and verification of calibration. (2) Develop an integrated pipeline incorporating the optimized algorithms; process the entire STO2 data set using the pipeline, and make an initial release of the data (DR1) to the public. (3) Refine the data calibration, including ancillary data sets coincident with the STO2 fields, and make the data VO-compliant. (4) Write documentation for the pipeline and publish it in an appropriate journal; release the final, second data release (DR2) to the public, and hand the data off to permanent repositories: the NASA/IPAC IRSA database, the Harvard University Dataverse, and Cyverse, led by the University of Arizona. Members of the STO2 data reduction team have extensive experience with HIFI data, and particularly with the HEB fine structure spectra. We are thus confident that we can build on this

  7. Alkali corrosion resistant coatings and ceramic foams having superfine open cell structure and method of processing

    Science.gov (United States)

    Brown, Jr., Jesse J.; Hirschfeld, Deidre A.; Li, Tingkai

    1993-12-07

    Alkali corrosion resistant coatings and ceramic foams having superfine open cell structure are created using sol-gel processes. The processes have particular application in creating calcium magnesium zirconium phosphate, CMZP, coatings and foams.

  8. Co-occurrence of Photochemical and Microbiological Transformation Processes in Open-Water Unit Process Wetlands.

    Science.gov (United States)

    Prasse, Carsten; Wenk, Jannis; Jasper, Justin T; Ternes, Thomas A; Sedlak, David L

    2015-12-15

    The fate of anthropogenic trace organic contaminants in surface waters can be complex due to the occurrence of multiple parallel and consecutive transformation processes. In this study, the removal of five antiviral drugs (abacavir, acyclovir, emtricitabine, lamivudine and zidovudine) via both bio- and phototransformation processes, was investigated in laboratory microcosm experiments simulating an open-water unit process wetland receiving municipal wastewater effluent. Phototransformation was the main removal mechanism for abacavir, zidovudine, and emtricitabine, with half-lives (t1/2,photo) in wetland water of 1.6, 7.6, and 25 h, respectively. In contrast, removal of acyclovir and lamivudine was mainly attributable to slower microbial processes (t1/2,bio = 74 and 120 h, respectively). Identification of transformation products revealed that bio- and phototransformation reactions took place at different moieties. For abacavir and zidovudine, rapid transformation was attributable to high reactivity of the cyclopropylamine and azido moieties, respectively. Despite substantial differences in kinetics of different antiviral drugs, biotransformation reactions mainly involved oxidation of hydroxyl groups to the corresponding carboxylic acids. Phototransformation rates of parent antiviral drugs and their biotransformation products were similar, indicating that prior exposure to microorganisms (e.g., in a wastewater treatment plant or a vegetated wetland) would not affect the rate of transformation of the part of the molecule susceptible to phototransformation. However, phototransformation strongly affected the rates of biotransformation of the hydroxyl groups, which in some cases resulted in greater persistence of phototransformation products.
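
    Assuming simple parallel first-order kinetics, the half-lives quoted above translate directly into rate constants via k = ln 2 / t_half; the short Python sketch below does that conversion for the dominant removal pathway of each compound. The half-life values are those reported in the abstract, and the first-order assumption is ours.

        # First-order rate constants implied by the reported half-lives
        # (k = ln2 / t_half); parallel first-order kinetics is assumed here.
        import math

        dominant_half_life_h = {          # pathway, half-life in hours (from the abstract)
            "abacavir":      ("photo", 1.6),
            "zidovudine":    ("photo", 7.6),
            "emtricitabine": ("photo", 25.0),
            "acyclovir":     ("bio",   74.0),
            "lamivudine":    ("bio",   120.0),
        }

        for drug, (pathway, t_half) in dominant_half_life_h.items():
            k = math.log(2) / t_half
            print(f"{drug}: {pathway} k = {k:.4f} 1/h")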

  9. Transparent checkpointing and process migration in a distributed system

    OpenAIRE

    2004-01-01

    A distributed system for creating a checkpoint for a plurality of processes running on the distributed system. The distributed system includes a plurality of compute nodes with an operating system executing on each compute node. A checkpoint library resides at the user level on each of the compute nodes, and the checkpoint library is transparent to the operating system residing on the same compute node and to the other compute nodes. Each checkpoint library uses a windowed messaging logging p...

  10. Tempered stable distributions stochastic models for multiscale processes

    CERN Document Server

    Grabchak, Michael

    2015-01-01

    This brief is concerned with tempered stable distributions and their associated Levy processes. It is a good text for researchers interested in learning about tempered stable distributions. A tempered stable distribution is one which takes a stable distribution and modifies its tails to make them lighter. The motivation for this class comes from the fact that infinite variance stable distributions appear to provide a good fit to data in a variety of situations, but the extremely heavy tails of these models are not realistic for most real world applications. The idea of using distributions that modify the tails of stable models to make them lighter seems to have originated in the influential paper of Mantegna and Stanley (1994). Since then, these distributions have been extended and generalized in a variety of ways. They have been applied to a wide variety of areas including mathematical finance, biostatistics, computer science, and physics.
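
    The central idea, exponential tempering of a heavy power-law tail, is easy to see numerically. The sketch below compares an untempered stable-like tail, proportional to x^-(alpha+1), with its exponentially tempered counterpart; the parameter values are arbitrary and this is a schematic of the tempering idea, not the book's full construction.

```python
import numpy as np

alpha, lam = 1.5, 0.5            # stability index and tempering rate (illustrative)
x = np.logspace(0, 3, 4)         # a few points in the tail region x >= 1

stable_tail = x ** (-alpha - 1.0)               # heavy power-law tail of a stable law
tempered_tail = np.exp(-lam * x) * stable_tail  # exponentially tempered tail

for xi, s, t in zip(x, stable_tail, tempered_tail):
    print(f"x = {xi:8.1f}   stable ~ {s:.3e}   tempered ~ {t:.3e}")
```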

  11. Limiting conditional distributions for birth-death processes

    NARCIS (Netherlands)

    Kijima, M.; Nair, M.G.; Pollett, P.K.; van Doorn, Erik A.

    1997-01-01

    In a recent paper one of us identified all of the quasi-stationary distributions for a non-explosive, evanescent birth-death process for which absorption is certain, and established conditions for the existence of the corresponding limiting conditional distributions. Our purpose is to extend these

  12. Aspects Concerning the Optimization of Authentication Process for Distributed Applications

    Directory of Open Access Journals (Sweden)

    Ion Ivan

    2008-06-01

    Full Text Available The types of distributed applications are presented. The quality characteristics of distributed applications are analyzed. Ways of assigning access rights are established, and authentication categories are analyzed. We propose an algorithm for optimizing the authentication process. The proposed algorithm is applied to the application “Evaluation of TIC projects”.

  13. Experimental investigation of the ion current distribution in microsecond plasma opening switch

    Energy Technology Data Exchange (ETDEWEB)

    Bystritskij, V; Grigor'ev, S; Kharlov, A; Sinebryukhov, A [Russian Academy of Sciences, Tomsk (Russian Federation). Institute of Electrophysics]

    1997-12-31

    This paper is devoted to investigating the properties of the microsecond plasma opening switch (MPOS) as an ion beam source for surface modification. Two plasma sources were investigated: flash-board and cable guns. Detailed measurements of the axial and azimuthal distributions of ion current density in the switch were performed. It was found that the azimuthal inhomogeneity of the ion beam increases from the beginning to the end of the MPOS. The advantages and problems of this approach are discussed. (author). 5 figs., 2 refs.

  14. On relation between distribution functions in hard and soft processes

    International Nuclear Information System (INIS)

    Kisselev, A.V.; Petrov, V.A.

    1992-10-01

    It is shown that in the particle-exchange model the hadron-hadron scattering amplitude admits a parton-like representation with the distribution functions coinciding with those extracted from deep inelastic processes. (author). 13 refs

  15. Accelerated life testing design using geometric process for Pareto distribution

    OpenAIRE

    Mustafa Kamal; Shazia Zarrin; Arif Ul Islam

    2013-01-01

    In this paper the geometric process is used for the analysis of accelerated life testing under constant stress for Pareto Distribution. Assuming that the lifetimes under increasing stress levels form a geometric process, estimates of the parameters are obtained by using the maximum likelihood method for complete data. In addition, asymptotic interval estimates of the parameters of the distribution using Fisher information matrix are also obtained. The statistical properties of the parameters ...
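
    As a minimal illustration of the estimation step, the maximum likelihood estimate of the Pareto shape parameter has a closed form when the scale is known. The sketch below simulates complete lifetime data whose scale shrinks geometrically with the stress level, mimicking the geometric-process assumption only schematically; the parameter values are arbitrary and this is not the paper's full accelerated life testing model.

```python
import numpy as np

rng = np.random.default_rng(0)

def pareto_mle_shape(sample, scale):
    """Closed-form MLE of the Pareto shape parameter when the scale is known."""
    sample = np.asarray(sample)
    return sample.size / np.sum(np.log(sample / scale))

alpha_true, base_scale, ratio = 2.5, 1.0, 1.2   # shape, scale, geometric ratio (illustrative)

# Lifetimes at successive stress levels shrink geometrically by a factor 1/ratio
for level in range(4):
    scale_k = base_scale / ratio**level
    lifetimes = scale_k * (1.0 + rng.pareto(alpha_true, size=500))   # Pareto I sample
    print(f"stress level {level}: estimated shape = {pareto_mle_shape(lifetimes, scale_k):.2f}")
```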

  16. Employer Brand Opens up for a Gender Process Model

    Directory of Open Access Journals (Sweden)

    Hans Lundkvist

    2011-11-01

    Full Text Available Regardless of a long tradition of legislation, policymaking and practical achievements, the issues of gender equality and of the segregated labor market still remain a matter of concern in Sweden. This paper describes a collaborative process between a research project and an engineering enterprise. It describes the point of departure, based on the concept of employer brand, of a long-term change process and the different phases and activities during an intensive period in 2009. The collaboration aimed to develop innovative methods, and to apply them in order to achieve increased gender awareness, and thereby to be able to retain and attract the best labor for tomorrow. Different approaches and methods such as analogies, anecdotes, and pictures were used to nourish the process. Findings showed that the interactive process contributed to increased awareness. During the process the enterprise became more conscious of the potential of being a gender equal employer.

  17. Programming Social Applications Building Viral Experiences with OpenSocial, OAuth, OpenID, and Distributed Web Frameworks

    CERN Document Server

    LeBlanc, Jonathan

    2011-01-01

    Social networking has made one thing clear: websites and applications need to provide users with experiences tailored to their preferences. This in-depth guide shows you how to build rich social frameworks, using open source technologies and specifications. You'll learn how to create third-party applications for existing sites, build engaging social graphs, and develop products to host your own socialized experience. Programming Social Apps focuses on the OpenSocial platform, along with Apache Shindig, OAuth, OpenID, and other tools, demonstrating how they work together to help you solve pra

  18. SUPERPOSITION OF STOCHASTIC PROCESSES AND THE RESULTING PARTICLE DISTRIBUTIONS

    International Nuclear Information System (INIS)

    Schwadron, N. A.; Dayeh, M. A.; Desai, M.; Fahr, H.; Jokipii, J. R.; Lee, M. A.

    2010-01-01

    Many observations of suprathermal and energetic particles in the solar wind and the inner heliosheath show that distribution functions scale approximately with the inverse of particle speed (v) to the fifth power. Although there are exceptions to this behavior, there is a growing need to understand why this type of distribution function appears so frequently. This paper develops the concept that a superposition of exponential and Gaussian distributions with different characteristic speeds and temperatures shows power-law tails. The particular type of distribution function, f ∝ v^-5, appears in a number of different ways: (1) a series of Poisson-like processes where entropy is maximized with the rates of individual processes inversely proportional to the characteristic exponential speed, (2) a series of Gaussian distributions where the entropy is maximized with the rates of individual processes inversely proportional to temperature and the density of individual Gaussian distributions proportional to temperature, and (3) a series of different diffusively accelerated energetic particle spectra with individual spectra derived from observations (1997-2002) of a multiplicity of different shocks. Thus, we develop a proof-of-concept for the superposition of stochastic processes that give rise to power-law distribution functions.
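
    The superposition argument can be checked numerically: weighting normalized exponential speed distributions exp(-v/v0)/v0 by w(v0) proportional to v0^-5 and integrating over v0 produces a v^-5 tail. The weight below is chosen purely so the illustration reproduces that exponent; the entropy-maximization reasoning of the paper is not reproduced here.

```python
import numpy as np

v0 = np.logspace(-2, 2, 4000)   # characteristic speeds of the exponential components
w = v0 ** -5.0                  # weights chosen to produce an f(v) ~ v^-5 tail

def superposed(v):
    """Superposition of normalized exponential distributions exp(-v/v0)/v0."""
    integrand = w * np.exp(-v / v0) / v0
    return np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(v0))  # trapezoid rule

v = np.logspace(0.0, 1.5, 20)   # probe the tail between v = 1 and v ~ 30
f = np.array([superposed(vi) for vi in v])
slope = np.polyfit(np.log(v), np.log(f), 1)[0]
print(f"log-log slope of the superposed tail: {slope:.2f} (expected close to -5)")
```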

  19. Developing an Open Source, Reusable Platform for Distributed Collaborative Information Management in the Early Detection Research Network

    Science.gov (United States)

    Hart, Andrew F.; Verma, Rishi; Mattmann, Chris A.; Crichton, Daniel J.; Kelly, Sean; Kincaid, Heather; Hughes, Steven; Ramirez, Paul; Goodale, Cameron; Anton, Kristen

    2012-01-01

    For the past decade, the NASA Jet Propulsion Laboratory, in collaboration with Dartmouth University has served as the center for informatics for the Early Detection Research Network (EDRN). The EDRN is a multi-institution research effort funded by the U.S. National Cancer Institute (NCI) and tasked with identifying and validating biomarkers for the early detection of cancer. As the distributed network has grown, increasingly formal processes have been developed for the acquisition, curation, storage, and dissemination of heterogeneous research information assets, and an informatics infrastructure has emerged. In this paper we discuss the evolution of EDRN informatics, its success as a mechanism for distributed information integration, and the potential sustainability and reuse benefits of emerging efforts to make the platform components themselves open source. We describe our experience transitioning a large closed-source software system to a community driven, open source project at the Apache Software Foundation, and point to lessons learned that will guide our present efforts to promote the reuse of the EDRN informatics infrastructure by a broader community.

  20. Mistaking geography for biology: inferring processes from species distributions.

    Science.gov (United States)

    Warren, Dan L; Cardillo, Marcel; Rosauer, Dan F; Bolnick, Daniel I

    2014-10-01

    Over the past few decades, there has been a rapid proliferation of statistical methods that infer evolutionary and ecological processes from data on species distributions. These methods have led to considerable new insights, but they often fail to account for the effects of historical biogeography on present-day species distributions. Because the geography of speciation can lead to patterns of spatial and temporal autocorrelation in the distributions of species within a clade, this can result in misleading inferences about the importance of deterministic processes in generating spatial patterns of biodiversity. In this opinion article, we discuss ways in which patterns of species distributions driven by historical biogeography are often interpreted as evidence of particular evolutionary or ecological processes. We focus on three areas that are especially prone to such misinterpretations: community phylogenetics, environmental niche modelling, and analyses of beta diversity (compositional turnover of biodiversity). Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.

  1. Characteristics of the Audit Processes for Distributed Informatics Systems

    Directory of Open Access Journals (Sweden)

    Marius POPA

    2009-01-01

    Full Text Available The paper addresses: the main characteristics and examples of distributed informatics systems and the main categories of differences among them; concepts, principles, techniques and fields for auditing distributed informatics systems; and the concept and classes of standards, their characteristics, and examples of standards, guidelines, procedures and controls for auditing distributed informatics systems. Distributed informatics systems are characterized by the following issues: development process, resources, implemented functionalities, architectures, system classes, and particularities. The audit framework has two sides: the audit process and the auditors. The audit process must be conducted in accordance with the standard specifications of the IT&C field. The auditors must observe ethical principles and have a high level of professional skill and competence in the IT&C field.

  2. Image exploitation and dissemination prototype of distributed image processing

    International Nuclear Information System (INIS)

    Batool, N.; Huqqani, A.A.; Mahmood, A.

    2003-05-01

    The requirements of image processing applications can best be met by using a distributed environment. This report describes how inferences can be drawn by utilizing existing LAN resources in a distributed computing environment, using Java and web technology for extensive processing in a truly system-independent way. Although the environment has been tested using image processing applications, its design and architecture are general and modular, so it can be used for other applications that require distributed processing. Images originating from the server are fed to the workers along with the desired operations to be performed on them. The server distributes the tasks among the workers, which carry out the required operations and send back the results. This application has been implemented using the Remote Method Invocation (RMI) feature of Java. Java RMI allows an object running in one Java Virtual Machine (JVM) to invoke methods on another JVM, thus providing remote communication between programs written in the Java programming language. RMI can therefore be used to develop distributed applications [1]. We undertook this project to gain a better understanding of distributed systems concepts and their uses for resource-hungry jobs. The image processing application was developed in this environment
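
    The record's system is built on Java RMI, which is not reproduced here; the sketch below only illustrates the same server/worker task-distribution pattern using Python's multiprocessing pool, with a stub standing in for the actual image operations.

```python
from multiprocessing import Pool

def apply_operation(task):
    """Worker side: apply the requested operation to one image (stubbed here)."""
    image_id, operation = task
    return image_id, f"{operation} applied to image {image_id}"

if __name__ == "__main__":
    # "Server" side: build the task list and distribute it among the workers
    tasks = [(i, "edge-detect") for i in range(8)]
    with Pool(processes=4) as pool:
        for image_id, result in pool.map(apply_operation, tasks):
            print(image_id, result)
```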

  3. Wigner Ville Distribution in Signal Processing, using Scilab Environment

    Directory of Open Access Journals (Sweden)

    Petru Chioncel

    2011-01-01

    Full Text Available The Wigner Ville distribution offers a visual display of quantitative information about the way a signal’s energy is distributed in both time and frequency. Through that, this distribution embodies the fundamental concepts of Fourier and time-domain analysis. The energy of the signal is distributed so that specific frequencies are localized in time by the group delay, and at specific instants in time the frequency is given by the instantaneous frequency. The net positive volume of the Wigner distribution is numerically equal to the signal’s total energy. The paper shows the application of the Wigner Ville distribution in the field of signal processing, using the Scilab environment.
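
    The distribution can be computed directly from its discrete definition: form the instantaneous autocorrelation x(n+tau) x*(n-tau) at each time index and take an FFT over the lag variable. The sketch below uses NumPy rather than the Scilab environment used in the paper and evaluates the distribution for a linear chirp, whose energy should concentrate along the instantaneous frequency.

```python
import numpy as np

def wigner_ville(x):
    """Discrete Wigner-Ville distribution of an analytic signal x (returns an N x N array)."""
    x = np.asarray(x, dtype=complex)
    N = x.size
    W = np.zeros((N, N))
    for n in range(N):
        tau_max = min(n, N - 1 - n)                         # largest usable half-lag
        tau = np.arange(-tau_max, tau_max + 1)
        kernel = np.zeros(N, dtype=complex)
        kernel[tau % N] = x[n + tau] * np.conj(x[n - tau])  # instantaneous autocorrelation
        W[:, n] = np.real(np.fft.fft(kernel))               # FFT over the lag variable
    return W

# Linear chirp test signal in analytic form
N = 256
t = np.arange(N)
x = np.exp(1j * 2 * np.pi * (0.05 * t + 0.1 * t**2 / N))
W = wigner_ville(x)
print(W.shape)   # energy concentrates along the chirp's instantaneous frequency
```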

  4. SignalPlant: an open signal processing software platform

    Czech Academy of Sciences Publication Activity Database

    Plešinger, Filip; Jurčo, Juraj; Halámek, Josef; Jurák, Pavel

    2016-01-01

    Vol. 37, No. 7 (2016), N38-N48 ISSN 0967-3334 R&D Projects: GA ČR GAP103/11/0933; GA MŠk(CZ) LO1212; GA ČR GAP102/12/2034 Institutional support: RVO:68081731 Keywords: data visualization * software * signal processing * ECG * EEG Subject RIV: FS - Medical Facilities; Equipment Impact factor: 2.058, year: 2016

  5. MzJava: An open source library for mass spectrometry data processing.

    Science.gov (United States)

    Horlacher, Oliver; Nikitin, Frederic; Alocci, Davide; Mariethoz, Julien; Müller, Markus; Lisacek, Frederique

    2015-11-03

    Mass spectrometry (MS) is a widely used and evolving technique for the high-throughput identification of molecules in biological samples. The need for sharing and reuse of code among bioinformaticians working with MS data prompted the design and implementation of MzJava, an open-source Java Application Programming Interface (API) for MS related data processing. MzJava provides data structures and algorithms for representing and processing mass spectra and their associated biological molecules, such as metabolites, glycans and peptides. MzJava includes functionality to perform mass calculation, peak processing (e.g. centroiding, filtering, transforming), spectrum alignment and clustering, protein digestion, fragmentation of peptides and glycans as well as scoring functions for spectrum-spectrum and peptide/glycan-spectrum matches. For data import and export MzJava implements readers and writers for commonly used data formats. For many classes support for the Hadoop MapReduce (hadoop.apache.org) and Apache Spark (spark.apache.org) frameworks for cluster computing was implemented. The library has been developed applying best practices of software engineering. To ensure that MzJava contains code that is correct and easy to use the library's API was carefully designed and thoroughly tested. MzJava is an open-source project distributed under the AGPL v3.0 licence. MzJava requires Java 1.7 or higher. Binaries, source code and documentation can be downloaded from http://mzjava.expasy.org and https://bitbucket.org/sib-pig/mzjava. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. SignalPlant: an open signal processing software platform.

    Science.gov (United States)

    Plesinger, F; Jurco, J; Halamek, J; Jurak, P

    2016-07-01

    The growing technical standard of acquisition systems allows the acquisition of large records, often reaching gigabytes or more in size, as is the case with whole-day electroencephalograph (EEG) recordings, for example. Although current 64-bit software for signal processing is able to process (e.g. filter, analyze, etc.) such data, visual inspection and labeling will probably suffer from rather long latency during the rendering of large portions of recorded signals. For this reason, we have developed SignalPlant, a stand-alone application for signal inspection, labeling and processing. The main motivation was to supply investigators with a tool allowing fast and interactive work with large multichannel records produced by EEG, electrocardiograph and similar devices. Rendering latency was compared with EEGLAB, and SignalPlant proves significantly faster when displaying an image from a large number of samples (e.g. 163 times faster for 75 × 10^6 samples). The presented SignalPlant software is available free and does not depend on any other computation software. Furthermore, it can be extended with plugins by third parties, ensuring its adaptability to future research tasks and new data formats.

  7. Distributed GIS Systems, Open Specifications and Interoperability: How do They Relate to the Sustainable Management of Natural Resources?

    Science.gov (United States)

    Rafael Moreno-Sanchez

    2006-01-01

    The aim of this paper is to provide a conceptual framework for the session “The role of web-based Geographic Information Systems in supporting sustainable management.” The concepts of sustainability, sustainable forest management, Web Services, Distributed Geographic Information Systems, interoperability, Open Specifications, and Open Source Software are defined...

  8. Distributed learning process: principles of design and implementation

    Directory of Open Access Journals (Sweden)

    G. N. Boychenko

    2016-01-01

    Full Text Available At the present stage, broad information and communication technologies (ICT) usage in educational practices is one of the leading trends of global education system development. This trend has led to the transformation of instructional interaction models. Scientists have developed the theory of distributed cognition (Salomon, G.; Hutchins, E.) and of distributed education and training (Fiore, S. M.; Salas, E.; Oblinger, D. G.; Barone, C. A.; Hawkins, B. L.). The educational process is based on two sub-processes, learning and teaching, separated in time and space, which are aimed at organizing flexible interactions between learners, teachers and educational content located in different non-centralized places. The purpose of this design research is to find a solution for the problem of formalizing distributed learning process design and realization, which is significant in instructional design. The solution to this problem should take into account the specifics of distributed interactions between team members, who become the collective subject of distributed cognition in the distributed learning process. This makes it necessary to design the roles and functions of the individual team members performing distributed educational activities. Personal educational objectives should be determined by decomposition of team objectives into the functional roles of its members, considering the personal and learning needs and interests of students. Theoretical and empirical methods used in the study: theoretical analysis of philosophical, psychological, and pedagogical literature on the issue; analysis of international standards in the e-learning domain; exploration of the practical usage of distributed learning in the academic and corporate sectors; generalization, abstraction, cognitive modelling, and ontology engineering methods. The result of the research is a methodology for the design and implementation of a distributed learning process based on the competency approach. Methodology proposed by

  9. Parallel and distributed processing in power system simulation and control

    Energy Technology Data Exchange (ETDEWEB)

    Falcao, Djalma M [Universidade Federal, Rio de Janeiro, RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia

    1994-12-31

    Recent advances in computer technology will certainly have a great impact on the methodologies used in power system expansion and operational planning as well as in real-time control. Parallel and distributed processing are among the new technologies that present great potential for application in these areas. Parallel computers use multiple functional or processing units to speed up computation, while distributed processing computer systems are collections of computers joined together by high-speed communication networks, with many objectives and advantages. The paper presents some ideas for the use of parallel and distributed processing in power system simulation and control. It also comments on some of the current research work on these topics and presents a summary of the work presently being developed at COPPE. (author) 53 refs., 2 figs.

  10. Production of Polystyrene Open-celled Microcellular Foam in Batch Process by Super Critical CO2

    Directory of Open Access Journals (Sweden)

    M.S. Enayati

    2010-12-01

    Full Text Available Open-celled foams allow the passage of fluids through their structure because of the interconnections between the open cells or bubbles, and these structures can therefore be used as membranes and filters. In this work, we have studied the production of polystyrene open-celled microcellular foam using CO2 as the blowing agent. To achieve such structures, it is necessary to control the growth stages in such a way that the cells connect to each other through pores without any coalescence. The processing conditions required to achieve open-celled structures can be predicted by an open-cell model theory. This model suggests that a saturation pressure of at least 130 bar and a foaming time between 9 and 58 s are required for this system. The temperature range was selected to be both higher than the polymer glass transition temperature and favorable to the foaming process. Experimental results for the batch foaming process have verified the model quite well. SEM and mercury porosimetry tests show the presence of pores between the cells, confirming an open-celled structure. The experimental results also show that increasing the saturation pressure and the foaming temperature reduces the time required for open-celled structure formation. A 130 bar saturation pressure, 150 °C foaming temperature and 60 s foaming time yield an open-celled microcellular foam based on the polystyrene/CO2 system in the batch process.

  11. Distributed open environment for data retrieval based on pattern recognition techniques

    International Nuclear Information System (INIS)

    Pereira, A.; Vega, J.; Castro, R.; Portas, A.

    2010-01-01

    Pattern recognition methods for data retrieval have been applied to fusion databases for the localization and extraction of similar waveforms within temporal evolution signals. In order to standardize the use of these methods, a distributed open environment has been designed. It is based on a client/server architecture that supports distribution, interoperability and portability between heterogeneous platforms. The server part is a single desktop application based on J2EE (Java 2 Enterprise Edition), which provides a mature standard framework and a modular architecture. It can handle transactions and concurrency of components that are deployed on JETTY, an embedded web container within the Java server application for providing HTTP services. The data management is based on Apache DERBY, a relational database engine also embedded on the same Java based solution. This encapsulation allows hiding of unnecessary details about the installation, distribution, and configuration of all these components but with the flexibility to create and allocate many databases on different servers. The DERBY network module increases the scope of the installed database engine by providing traditional Java database network connections (JDBC-TCP/IP). This avoids scattering several database engines (a unique embedded engine defines the rules for accessing the distributed data). Java thin clients (Java 5 or above is the unique requirement) can be executed in the same computer than the server program (for example a desktop computer) but also server and client software can be distributed in a remote participation environment (wide area networks). The thin client provides graphic user interface to look for patterns (entire waveforms or specific structural forms) and display the most similar ones. This is obtained with HTTP requests and by generating dynamic content (servlets) in response to these client requests.
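
    The retrieval step at the heart of this environment amounts to comparing a query waveform against stored signals and returning the closest matches. The sketch below reduces that to a plain Euclidean nearest-neighbour lookup over equally sampled waveforms; the system's actual feature extraction and its J2EE/DERBY plumbing are not reproduced here.

```python
import numpy as np

def most_similar(query, database, k=3):
    """Return indices of the k stored waveforms closest to the query (Euclidean distance)."""
    query = np.asarray(query, dtype=float)
    distances = [np.linalg.norm(query - np.asarray(w, dtype=float)) for w in database]
    return np.argsort(distances)[:k]

# Toy "database" of temporal-evolution signals sampled on a common grid
t = np.linspace(0.0, 1.0, 200)
database = [np.sin(2 * np.pi * f * t) for f in (1, 2, 3, 5, 8)]
query = np.sin(2 * np.pi * 3.05 * t)          # nearly matches the f = 3 waveform

print(most_similar(query, database, k=2))     # the f = 3 signal should rank first
```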

  12. Distributed Open Environment for Data Retrieval based on Pattern Recognition Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, A.; Vega, J.; Castro, R.; Portas, A. [Association EuratomCIEMAT para Fusion, Madrid (Spain)

    2009-07-01

    Full text of publication follows: Pattern recognition methods for data retrieval have been applied to fusion databases for the localization and extraction of similar waveforms within temporal evolution signals. In order to standardize the use of these methods, a distributed open environment has been designed. It is based on a client/server architecture that supports distribution, inter-operability and portability between heterogeneous platforms. The server part is a single desktop application based on J2EE, which provides a mature standard framework and a modular architecture. It can handle transactions and competition of components that are deployed on JETTY, an embedded web container within the Java server application for providing HTTP services. The data management is based on Apache DERBY, a relational database engine also embedded on the same Java based solution. This encapsulation allows concealment of unnecessary details about the installation, distribution, and configuration of all these components but with the flexibility to create and allocate many databases on different servers. The DERBY network module increases the scope of the installed database engine by providing traditional Java database network connections (JDBC-TCP/IP). This avoids scattering several database engines (a unique embedded engine defines the rules for accessing the distributed data). Java thin clients (Java 5 or above is the unique requirement) can be executed in the same computer than the server program (for example a desktop computer) but also server and client software can be distributed in a remote participation environment (wide area networks). The thin client provides graphic user interface to look for patterns (entire waveforms or specific structural forms) and display the most similar ones. This is obtained with HTTP requests and by generating dynamic content (servlets) in response to these client requests. (authors)

  13. Distributed open environment for data retrieval based on pattern recognition techniques

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, A., E-mail: augusto.pereira@ciemat.e [Asociacion EURATOM/CIEMAT para Fusion, CIEMAT, Edificio 66, Avda. Complutense, 22, 28040 Madrid (Spain); Vega, J.; Castro, R.; Portas, A. [Asociacion EURATOM/CIEMAT para Fusion, CIEMAT, Edificio 66, Avda. Complutense, 22, 28040 Madrid (Spain)

    2010-07-15

    Pattern recognition methods for data retrieval have been applied to fusion databases for the localization and extraction of similar waveforms within temporal evolution signals. In order to standardize the use of these methods, a distributed open environment has been designed. It is based on a client/server architecture that supports distribution, interoperability and portability between heterogeneous platforms. The server part is a single desktop application based on J2EE (Java 2 Enterprise Edition), which provides a mature standard framework and a modular architecture. It can handle transactions and concurrency of components that are deployed on JETTY, an embedded web container within the Java server application for providing HTTP services. The data management is based on Apache DERBY, a relational database engine also embedded on the same Java based solution. This encapsulation allows hiding of unnecessary details about the installation, distribution, and configuration of all these components but with the flexibility to create and allocate many databases on different servers. The DERBY network module increases the scope of the installed database engine by providing traditional Java database network connections (JDBC-TCP/IP). This avoids scattering several database engines (a unique embedded engine defines the rules for accessing the distributed data). Java thin clients (Java 5 or above is the unique requirement) can be executed in the same computer than the server program (for example a desktop computer) but also server and client software can be distributed in a remote participation environment (wide area networks). The thin client provides graphic user interface to look for patterns (entire waveforms or specific structural forms) and display the most similar ones. This is obtained with HTTP requests and by generating dynamic content (servlets) in response to these client requests.

  14. Watershed Modeling Applications with the Open-Access Modular Distributed Watershed Educational Toolbox (MOD-WET) and Introductory Hydrology Textbook

    Science.gov (United States)

    Huning, L. S.; Margulis, S. A.

    2014-12-01

    Traditionally, introductory hydrology courses focus on hydrologic processes as independent or semi-independent concepts that are ultimately integrated into a watershed model near the end of the term. When an "off-the-shelf" watershed model is introduced in the curriculum, this approach can result in a potential disconnect between process-based hydrology and the inherent interconnectivity of processes within the water cycle. In order to curb this and reduce the learning curve associated with applying hydrologic concepts to complex real-world problems, we developed the open-access Modular Distributed Watershed Educational Toolbox (MOD-WET). The user-friendly, MATLAB-based toolbox contains the same physical equations for hydrological processes (i.e. precipitation, snow, radiation, evaporation, unsaturated flow, infiltration, groundwater, and runoff) that are presented in the companion e-textbook (http://aqua.seas.ucla.edu/margulis_intro_to_hydro_textbook.html) and taught in the classroom. The modular toolbox functions can be used by students to study individual hydrologic processes. These functions are integrated together to form a simple spatially-distributed watershed model, which reinforces a holistic understanding of how hydrologic processes are interconnected and modeled. Therefore when watershed modeling is introduced, students are already familiar with the fundamental building blocks that have been unified in the MOD-WET model. Extensive effort has been placed on the development of a highly modular and well-documented code that can be run on a personal computer within the commonly-used MATLAB environment. MOD-WET was designed to: 1) increase the qualitative and quantitative understanding of hydrological processes at the basin-scale and demonstrate how they vary with watershed properties, 2) emphasize applications of hydrologic concepts rather than computer programming, 3) elucidate the underlying physical processes that can often be obscured with a complicated

  15. Parallel Distributed Processing theory in the age of deep networks

    OpenAIRE

    Bowers, Jeffrey

    2017-01-01

    Parallel Distributed Processing (PDP) models in psychology are the precursors of deep networks used in computer science. However, only PDP models are associated with two core psychological claims, namely, that all knowledge is coded in a distributed format, and cognition is mediated by non-symbolic computations. These claims have long been debated within cognitive science, and recent work with deep networks speaks to this debate. Specifically, single-unit recordings show that deep networks le...

  16. Hydrocarbon distributions in sediments of the open area of the Arabian Gulf following the 1991 Gulf War oil spill

    International Nuclear Information System (INIS)

    Al-Lihaibi, S.S.; Ghazi, S.J.

    1997-01-01

    Surface sediments collected from the open area of the Arabian Gulf were analysed for total petroleum hydrocarbons and specific aliphatic hydrocarbon components in order to provide information on the extent of oil contamination and the degree of weathering of the spilled oil following the Gulf War. The surface distribution of the petroleum hydrocarbons showed an increasing trend towards the north-east, and among the individual transects there was a pronounced increasing trend towards the north-west direction. Despite off-shore oil-related activities as well as a potential impact from the 1991 oil spill, the concentrations of petroleum hydrocarbons in the study area were relatively low. This finding may be attributed to the effectiveness of weathering processes. (author)

  17. Resource depletion promotes automatic processing: implications for distribution of practice.

    Science.gov (United States)

    Scheel, Matthew H

    2010-12-01

    Recent models of cognition include two processing systems: an automatic system that relies on associative learning, intuition, and heuristics, and a controlled system that relies on deliberate consideration. Automatic processing requires fewer resources and is more likely when resources are depleted. This study showed that prolonged practice on a resource-depleting mental arithmetic task promoted automatic processing on a subsequent problem-solving task, as evidenced by faster responding and more errors. Distribution of practice effects (0, 60, 120, or 180 sec. between problems) on rigidity also disappeared when groups had equal time on resource-depleting tasks. These results suggest that distribution of practice effects is reducible to resource availability. The discussion includes implications for interpreting discrepancies in the traditional distribution of practice effect.

  18. Integration of distributed plant process computer systems to nuclear power generation facilities

    International Nuclear Information System (INIS)

    Bogard, T.; Finlay, K.

    1996-01-01

    Many operating nuclear power generation facilities are replacing their plant process computers. Such replacement projects are driven by equipment obsolescence issues and associated objectives to improve plant operability, increase plant information access, improve man-machine interface characteristics, and reduce operation and maintenance costs. This paper describes a few recently completed and on-going replacement projects, with emphasis upon the application of integrated distributed plant process computer systems. The variations of distributed systems design in these recent projects show how various configurations can address needs for flexibility, open architecture, and integration of technological advancements in instrumentation and control technology. Architectural considerations for optimal integration of the plant process computer and plant process instrumentation and control are evident from the variations of design features

  19. A Technical Survey on Optimization of Processing Geo Distributed Data

    Science.gov (United States)

    Naga Malleswari, T. Y. J.; Ushasukhanya, S.; Nithyakalyani, A.; Girija, S.

    2018-04-01

    With the growth of cloud services and technology, a number of geographically distributed data centers have emerged to store large amounts of data. Analysis of geo-distributed data is required by various services for data processing, storage of essential information, etc., and processing this geo-distributed data and performing analytics on it is a challenging task. Distributed data processing is accompanied by issues in storage, computation and communication. The key issues to be dealt with are time efficiency, cost minimization and utility maximization. This paper describes various optimization methods such as end-to-end multiphase and G-MR, using techniques such as Map-Reduce, CDS (Community Detection based Scheduling), ROUT, Workload-Aware Scheduling, SAGE and AMP (Ant Colony Optimization) to handle these issues. The various optimization methods and techniques used are analyzed. It has been observed that end-to-end multiphase achieves time efficiency; cost minimization concentrates on achieving Quality of Service and on reducing computation and communication costs; and SAGE achieves performance improvement in processing geo-distributed data sets.

  20. Distribution and emission of oxamyl in a rockwool cultivation system with open drainage of the nutrient solution

    NARCIS (Netherlands)

    Runia, W.T.; Dekker, A.; Houx, N.W.H.

    1995-01-01

    On a 1.8 ha eggplant nursery with open drainage of the excess of nutrient solution the distribution of oxamyl was measured after it had been added to the nutrient solution. When it was applied via injection at a tap of a section, the distribution was more uniform than when applied via the central

  1. Implementation of Grid Tier 2 and Tier 3 facilities on a Distributed OpenStack Cloud

    International Nuclear Information System (INIS)

    Limosani, Antonio; Boland, Lucien; Crosby, Sean; Huang, Joanna; Sevior, Martin; Coddington, Paul; Zhang, Shunde; Wilson, Ross

    2014-01-01

    The Australian Government is making a $AUD 100 million investment in Compute and Storage for the academic community. The Compute facilities are provided in the form of 30,000 CPU cores located at 8 nodes around Australia in a distributed virtualized Infrastructure as a Service facility based on OpenStack. The storage will eventually consist of over 100 petabytes located at 6 nodes. All will be linked via a 100 Gb/s network. This proceeding describes the development of a fully connected WLCG Tier-2 grid site as well as a general purpose Tier-3 computing cluster based on this architecture. The facility employs an extension to Torque to enable dynamic allocations of virtual machine instances. A base Scientific Linux virtual machine (VM) image is deployed in the OpenStack cloud and automatically configured as required using Puppet. Custom scripts are used to launch multiple VMs, integrate them into the dynamic Torque cluster and to mount remote file systems. We report on our experience in developing this nation-wide ATLAS and Belle II Tier 2 and Tier 3 computing infrastructure using the national Research Cloud and storage facilities.

  2. Implementation of Grid Tier 2 and Tier 3 facilities on a Distributed OpenStack Cloud

    Science.gov (United States)

    Limosani, Antonio; Boland, Lucien; Coddington, Paul; Crosby, Sean; Huang, Joanna; Sevior, Martin; Wilson, Ross; Zhang, Shunde

    2014-06-01

    The Australian Government is making a AUD 100 million investment in Compute and Storage for the academic community. The Compute facilities are provided in the form of 30,000 CPU cores located at 8 nodes around Australia in a distributed virtualized Infrastructure as a Service facility based on OpenStack. The storage will eventually consist of over 100 petabytes located at 6 nodes. All will be linked via a 100 Gb/s network. This proceeding describes the development of a fully connected WLCG Tier-2 grid site as well as a general purpose Tier-3 computing cluster based on this architecture. The facility employs an extension to Torque to enable dynamic allocations of virtual machine instances. A base Scientific Linux virtual machine (VM) image is deployed in the OpenStack cloud and automatically configured as required using Puppet. Custom scripts are used to launch multiple VMs, integrate them into the dynamic Torque cluster and to mount remote file systems. We report on our experience in developing this nation-wide ATLAS and Belle II Tier 2 and Tier 3 computing infrastructure using the national Research Cloud and storage facilities.

  3. The Earth System Grid Federation : an Open Infrastructure for Access to Distributed Geospatial Data

    Science.gov (United States)

    Cinquini, Luca; Crichton, Daniel; Mattmann, Chris; Harney, John; Shipman, Galen; Wang, Feiyi; Ananthakrishnan, Rachana; Miller, Neill; Denvil, Sebastian; Morgan, Mark

    2012-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).

  4. A distributed computing model for telemetry data processing

    Science.gov (United States)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-05-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  5. A distributed computing model for telemetry data processing

    Science.gov (United States)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-01-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  6. Measurement of tracer gas distributions using an open-path FTIR system coupled with computed tomography

    Science.gov (United States)

    Drescher, Anushka C.; Yost, Michael G.; Park, Doo Y.; Levine, Steven P.; Gadgil, Ashok J.; Fischer, Marc L.; Nazaroff, William W.

    1995-05-01

    Optical remote sensing and iterative computed tomography (CT) can be combined to measure the spatial distribution of gaseous pollutant concentrations in a plane. We have conducted chamber experiments to test this combination of techniques using an Open Path Fourier Transform Infrared Spectrometer (OP-FTIR) and a standard algebraic reconstruction technique (ART). ART was found to converge to solutions that showed excellent agreement with the ray integral concentrations measured by the FTIR but were inconsistent with simultaneously gathered point sample concentration measurements. A new CT method was developed based on (a) the superposition of bivariate Gaussians to model the concentration distribution and (b) a simulated annealing minimization routine to find the parameters of the Gaussians that resulted in the best fit to the ray integral concentration data. This new method, named smooth basis function minimization (SBFM) generated reconstructions that agreed well, both qualitatively and quantitatively, with the concentration profiles generated from point sampling. We present one set of illustrative experimental data to compare the performance of ART and SBFM.
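
    The algebraic reconstruction technique mentioned above updates the concentration field one ray at a time, projecting the current estimate onto the hyperplane defined by that ray's measured integral. A minimal Kaczmarz-style sketch follows; the geometry matrix A (ray/pixel path lengths) and the measured ray integrals b are synthetic placeholders, and non-negativity is imposed because concentrations cannot be negative.

```python
import numpy as np

def art_reconstruct(A, b, n_sweeps=50, relax=0.5):
    """Kaczmarz-style ART: A (rays x pixels) maps pixel concentrations to ray integrals b."""
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        for a_i, b_i in zip(A, b):
            norm2 = a_i @ a_i
            if norm2 > 0.0:
                x += relax * (b_i - a_i @ x) / norm2 * a_i
        np.clip(x, 0.0, None, out=x)          # concentrations cannot be negative
    return x

# Tiny synthetic example: 4 pixels sampled by 6 rays with known path-length weights
rng = np.random.default_rng(1)
A = rng.uniform(0.0, 1.0, size=(6, 4))        # placeholder ray/pixel path lengths
x_true = np.array([0.0, 2.0, 1.0, 0.5])       # "true" concentration field
b = A @ x_true                                # simulated ray-integral measurements
print(art_reconstruct(A, b).round(2))
```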

  7. Formal specification of open distributed systems - overview and evaluation of existing methods

    International Nuclear Information System (INIS)

    Stoelen, Ketil

    1998-02-01

    This report classifies, compares and evaluates eleven specification languages for distributed systems. The eleven specification languages have been picked from a wide spectrum of areas embracing both industry and research. We have selected languages that we see as important; either because they have proved useful within the commercial software industry, or because they play or we expect them to play an important role within research. Based on literature studies, we investigate the suitability of these specification languages to describe open distributed systems. The languages are also evaluated with respect to support for refinement and the characterization of proof-obligations. The report consists of five main parts: Part 1 gives the background and motivation for the evaluation; it also introduces the basic terminology; Part 2 motivates, identifies and formulates the concrete evaluation criterions; Part 3 evaluates the specification languages with respect to the evaluation criterions formulated in Part 2; Part 4 sums up the results from the evaluation in the form of tables; it also draws some conclusions and identifies some directions for further studies; Part 5 consists of two appendices, namely a bibliography and a list of abbreviations. (author)

  8. Benchmarking Distributed Stream Processing Platforms for IoT Applications

    OpenAIRE

    Shukla, Anshu; Simmhan, Yogesh

    2016-01-01

    Internet of Things (IoT) is a technology paradigm where millions of sensors monitor, and help inform or manage, physical, environmental and human systems in real-time. The inherent closed-loop responsiveness and decision making of IoT applications makes them ideal candidates for using low latency and scalable stream processing platforms. Distributed Stream Processing Systems (DSPS) are becoming essential components of any IoT stack, but the efficacy and performance of contemporary DSP...

  9. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

    Full Text Available This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and the factors influencing verification quality are established. Software optimality verification is analyzed, and some metrics are defined for the verification process.

  10. Distributed processing and analysis of ATLAS experimental data

    CERN Document Server

    Barberis, D; The ATLAS collaboration

    2011-01-01

    The ATLAS experiment has been taking data steadily since Autumn 2009, collecting close to 1 fb^-1 of data (several petabytes of raw and reconstructed data per year of data-taking). Data are calibrated, reconstructed, distributed and analysed at over 100 different sites using the World-wide LHC Computing Grid and the tools produced by the ATLAS Distributed Computing project. In addition to event data, ATLAS produces a wealth of information on detector status, luminosity, calibrations, alignments, and data processing conditions. This information is stored in relational databases, online and offline, and made transparently available to analysers of ATLAS data world-wide through an infrastructure consisting of distributed database replicas and web servers that exploit caching technologies. This paper reports on the experience of using this distributed computing infrastructure with real data and in real time, on the evolution of the computing model driven by this experience, and on the system performance during the first...

  11. Distributed processing and analysis of ATLAS experimental data

    CERN Document Server

    Barberis, D; The ATLAS collaboration

    2011-01-01

    The ATLAS experiment has been taking data steadily since Autumn 2009, and has so far collected over 5 fb^-1 of data (several petabytes of raw and reconstructed data per year of data-taking). Data are calibrated, reconstructed, distributed and analysed at over 100 different sites using the World-wide LHC Computing Grid and the tools produced by the ATLAS Distributed Computing project. In addition to event data, ATLAS produces a wealth of information on detector status, luminosity, calibrations, alignments, and data processing conditions. This information is stored in relational databases, online and offline, and made transparently available to analysers of ATLAS data world-wide through an infrastructure consisting of distributed database replicas and web servers that exploit caching technologies. This paper reports on the experience of using this distributed computing infrastructure with real data and in real time, on the evolution of the computing model driven by this experience, and on the system performance during the...

  12. An Analysis of OpenACC Programming Model: Image Processing Algorithms as a Case Study

    Directory of Open Access Journals (Sweden)

    M. J. Mišić

    2014-06-01

    Full Text Available Graphics processing units and similar accelerators have been intensively used in general purpose computations for several years. In the last decade, GPU architecture and organization changed dramatically to support an ever-increasing demand for computing power. Along with changes in hardware, novel programming models have been proposed, such as NVIDIA’s Compute Unified Device Architecture (CUDA) and Open Computing Language (OpenCL) by the Khronos group. Although numerous commercial and scientific applications have been developed using these two models, they still pose a significant challenge for less experienced users. There are users from various scientific and engineering communities who would like to speed up their applications without the need to deeply understand a low-level programming model and underlying hardware. In 2011, the OpenACC programming model was launched. Much like OpenMP for multicore processors, OpenACC is a high-level, directive-based programming model for manycore processors like GPUs. This paper presents an analysis of the OpenACC programming model and its applicability in typical domains like image processing. Three simple image processing algorithms have been implemented for execution on the GPU with OpenACC. The results were compared with their sequential counterparts and are briefly discussed.

  13. Efficient calculation of open quantum system dynamics and time-resolved spectroscopy with distributed memory HEOM (DM-HEOM).

    Science.gov (United States)

    Kramer, Tobias; Noack, Matthias; Reinefeld, Alexander; Rodríguez, Mirta; Zelinskyy, Yaroslav

    2018-06-11

    Time- and frequency-resolved optical signals provide insights into the properties of light-harvesting molecular complexes, including excitation energies, dipole strengths and orientations, as well as the exciton energy flow through the complex. The hierarchical equations of motion (HEOM) provide a unifying theory, which allows one to study the combined effects of system-environment dissipation and non-Markovian memory without making restrictive assumptions about weak or strong couplings or separability of vibrational and electronic degrees of freedom. With increasing system size the exact solution of the open quantum system dynamics requires memory and compute resources beyond a single compute node. To overcome this barrier, we developed a scalable variant of HEOM. Our distributed memory HEOM, DM-HEOM, is a universal tool for open quantum system dynamics. It is used to accurately compute all experimentally accessible time- and frequency-resolved processes in light-harvesting molecular complexes with arbitrary system-environment couplings for a wide range of temperatures and complex sizes. © 2018 Wiley Periodicals, Inc.

  14. Just-in-time Data Distribution for Analytical Query Processing

    NARCIS (Netherlands)

    M.G. Ivanova (Milena); M.L. Kersten (Martin); F.E. Groffen (Fabian)

    2012-01-01

    Distributed processing commonly requires data spread across machines using a priori static or hash-based data allocation. In this paper, we explore an alternative approach that starts from a master node in control of the complete database, and a variable number of worker nodes

  15. Distributed Iterative Processing for Interference Channels with Receiver Cooperation

    DEFF Research Database (Denmark)

    Badiu, Mihai Alin; Manchón, Carles Navarro; Bota, Vasile

    2012-01-01

    We propose a method for the design and evaluation of distributed iterative algorithms for receiver cooperation in interference-limited wireless systems. Our approach views the processing within and collaboration between receivers as the solution to an inference problem in the probabilistic model...

  16. First-Passage-Time Distribution for Variable-Diffusion Processes

    Science.gov (United States)

    Barney, Liberty; Gunaratne, Gemunu H.

    2017-05-01

    First-passage-time distribution, which presents the likelihood of a stock reaching a pre-specified price at a given time, is useful in establishing the value of financial instruments and in designing trading strategies. First-passage-time distribution for Wiener processes has a single peak, while that for stocks exhibits a notable second peak within a trading day. This feature has only been discussed sporadically—often dismissed as due to insufficient/incorrect data or circumvented by conversion to tick time—and to the best of our knowledge has not been explained in terms of the underlying stochastic process. It was shown previously that intra-day variations in the market can be modeled by a stochastic process containing two variable-diffusion processes (Hua et al. in, Physica A 419:221-233, 2015). We show here that the first-passage-time distribution of this two-stage variable-diffusion model does exhibit a behavior similar to the empirical observation. In addition, we find that an extended model incorporating overnight price fluctuations exhibits intra- and inter-day behavior similar to those of empirical first-passage-time distributions.
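
    As a worked illustration of the quantity being studied, the sketch below estimates a first-passage-time distribution by Monte Carlo for a standard Wiener process with constant diffusion. The two-stage variable-diffusion model of the paper would replace the constant sigma with a time- and price-dependent diffusion; the barrier and all parameters here are illustrative assumptions.

    ```python
    # Minimal Monte Carlo sketch: first-passage-time (FPT) distribution of a
    # constant-diffusion Wiener process hitting a fixed barrier. The paper's
    # two-stage variable-diffusion model would make sigma state-dependent.
    import numpy as np

    rng = np.random.default_rng(0)
    n_paths, n_steps, dt = 5_000, 2_000, 1e-3
    sigma, barrier = 1.0, 0.5

    increments = sigma * np.sqrt(dt) * rng.standard_normal((n_paths, n_steps))
    paths = np.cumsum(increments, axis=1)

    hit = paths >= barrier                   # True from the first barrier crossing on
    first_hit = np.argmax(hit, axis=1)       # index of first True (0 if never hit)
    reached = hit.any(axis=1)
    fpt = first_hit[reached] * dt            # first-passage times of the hitting paths

    hist, edges = np.histogram(fpt, bins=50, density=True)
    print(f"{reached.mean():.2%} of paths hit the barrier; "
          f"modal FPT bin starts at t = {edges[np.argmax(hist)]:.3f}")
    ```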

  17. Proceedings: Distributed digital systems, plant process computers, and networks

    International Nuclear Information System (INIS)

    1995-03-01

    These are the proceedings of a workshop on Distributed Digital Systems, Plant Process Computers, and Networks held in Charlotte, North Carolina, on August 16-18, 1994. The purpose of the workshop was to provide a forum for technology transfer, technical information exchange, and education. The workshop was attended by more than 100 representatives of electric utilities, equipment manufacturers, engineering service organizations, and government agencies. The workshop consisted of three days of presentations, exhibitions, a panel discussion, and attendee interactions. Original plant process computers at the nuclear power plants are becoming obsolete, making it increasingly difficult for them to support plant operations and maintenance effectively. Some utilities have already replaced their plant process computers with more powerful modern computers, while many other utilities intend to replace their aging plant process computers in the future. Information on recent and planned implementations is presented. Choosing an appropriate communications and computing network architecture facilitates integrating new systems and provides functional modularity for both hardware and software. Control room improvements such as CRT-based distributed monitoring and control, as well as digital decision and diagnostic aids, can improve plant operations. Commercially available digital products connected to the plant communications system are now readily available to provide distributed processing where needed. Plant operations, maintenance activities, and engineering analyses can be supported in a cost-effective manner. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  18. Distributed quantum information processing via quantum dot spins

    International Nuclear Information System (INIS)

    Jun, Liu; Qiong, Wang; Le-Man, Kuang; Hao-Sheng, Zeng

    2010-01-01

    We propose a scheme to engineer a non-local two-qubit phase gate between two remote quantum-dot spins. Along with one-qubit local operations, one can in principle perform various types of distributed quantum information processing. The scheme employs a photon with linear polarisation interacting one after the other with two remote quantum-dot spins in cavities. Due to the optical spin selection rule, the photon acquires a Faraday rotation after the interaction process. By measuring the polarisation of the final output photon, a non-local two-qubit phase gate between the two remote quantum-dot spins is constituted. Our scheme may have very important applications in distributed quantum information processing.
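
    The photonic protocol itself is not modelled here; purely as an illustration of the end product the scheme realises, the sketch below writes a two-qubit controlled-phase gate as a 4x4 unitary and applies it to two spins prepared in |+>, showing that the gate plus local operations creates entanglement between the remote qubits. All values are illustrative.

    ```python
    # Illustrative sketch of the gate the scheme realises (not of the photon-spin
    # protocol): a two-qubit controlled-phase gate acting on the joint state of two
    # remote spins, written as a plain 4x4 unitary in NumPy.
    import numpy as np

    phi = np.pi                                      # phase pi gives the standard CZ gate
    cphase = np.diag([1, 1, 1, np.exp(1j * phi)])    # basis order |00>, |01>, |10>, |11>

    plus = np.array([1.0, 1.0]) / np.sqrt(2)         # each spin prepared in |+>
    state = np.kron(plus, plus)                      # separable two-spin input state

    entangled = cphase @ state                       # output is no longer a product state
    print(np.round(entangled, 3))
    ```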

  19. The Open Method of Coordination and the Implementation of the Bologna Process

    Science.gov (United States)

    Veiga, Amelia; Amaral, Alberto

    2006-01-01

    In this paper the authors argue that the use of the Open Method of Coordination (OMC) in the implementation of the Bologna process presents coordination problems that do not allow for the full coherence of the results. As the process is quite complex, involving three different levels (European, national and local) and as the final actors in the…

  20. The collaboration between and contribution of a digital open innovation platform to a local design process

    DEFF Research Database (Denmark)

    del Castillo, Jacqueline; Bhatti, Yasser; Hossain, Mokter

    2017-01-01

    We examine the potential of an open innovation digital platform to expose a local innovation process to a greater number of ideas and a more inclusive set of stakeholders. To do so we studied an online innovation challenge on the OpenIDEO to reimagine the end-of-life experience sponsored by Sutte...... it leads to a greater number of ideas from a wider and more inclusive set of stakeholders. We offer insights for the literatures on open innovation and design thinking in a healthcare context....

  1. Radar data processing using a distributed computational system

    Science.gov (United States)

    Mota, Gilberto F.

    1992-06-01

    This research specifies and validates a new concurrent decomposition scheme, called Confined Space Search Decomposition (CSSD), to exploit parallelism of Radar Data Processing algorithms using a Distributed Computational System. To formalize the specification, we propose and apply an object-oriented methodology called Decomposition Cost Evaluation Model (DCEM). To reduce the penalties of load imbalance, we propose a distributed dynamic load balance heuristic called Object Reincarnation (OR). To validate the research, we first compare our decomposition with an identified alternative using the proposed DCEM model and then develop a theoretical prediction of selected parameters. We also develop a simulation to check the Object Reincarnation Concept.

  2. Marketing promotion in the consumer goods’ retail distribution process

    Directory of Open Access Journals (Sweden)

    S.Bălăşescu

    2013-06-01

    Full Text Available The fundamental characteristic of contemporary marketing is the total opening towards three major directions: consumer needs, organization needs and society’s needs. The continuous expansion of marketing has been accompanied by a process of differentiation and specialization. Differentiation has led to the so called “specific marketing”. In this paper, we aim to explain that in the retail companies, the concept of sales marketing can be distinguished as an independent marketing specialization. The main objectives for this paper are: the definition and delimitation of consumer goods’ sales marketing in the retail business and the sectoral approach of the marketing concept and its specific techniques for the retail activities.

  3. Beowulf Distributed Processing and the United States Geological Survey

    Science.gov (United States)

    Maddox, Brian G.

    2002-01-01

    Introduction In recent years, the United States Geological Survey's (USGS) National Mapping Discipline (NMD) has expanded its scientific and research activities. Work is being conducted in areas such as emergency response research, scientific visualization, urban prediction, and other simulation activities. Custom-produced digital data have become essential for these types of activities. High-resolution, remotely sensed datasets are also seeing increased use. Unfortunately, the NMD is also finding that it lacks the resources required to perform some of these activities. Many of these projects require large amounts of computer processing resources. Complex urban-prediction simulations, for example, involve large amounts of processor-intensive calculations on large amounts of input data. This project was undertaken to learn and understand the concepts of distributed processing. Experience was needed in developing these types of applications. The idea was that this type of technology could significantly aid the needs of the NMD scientific and research programs. Porting a numerically intensive application currently being used by an NMD science program to run in a distributed fashion would demonstrate the usefulness of this technology. There are several benefits that this type of technology can bring to the USGS's research programs. Projects can be performed that were previously impossible due to a lack of computing resources. Other projects can be performed on a larger scale than previously possible. For example, distributed processing can enable urban dynamics research to perform simulations on larger areas without making huge sacrifices in resolution. The processing can also be done in a more reasonable amount of time than with traditional single-threaded methods (a scaled version of Chester County, Pennsylvania, took about fifty days to finish its first calibration phase with a single-threaded program). This paper has several goals regarding distributed processing
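
    As a hedged sketch of the basic pattern behind such a port, the snippet below splits a large raster into independent tiles and farms them out to worker processes. A Beowulf cluster would distribute the tiles across nodes with MPI; Python's multiprocessing on a single machine shows the same decomposition, and the per-tile "work" is a placeholder computation rather than any USGS algorithm.

    ```python
    # Hedged sketch: decompose a raster into tiles and process them in parallel.
    # The per-tile computation below is a stand-in for a processor-intensive step.
    import numpy as np
    from multiprocessing import Pool

    def process_tile(tile: np.ndarray) -> float:
        # placeholder for a numerically intensive per-tile calculation
        return float(np.sqrt(tile.astype(np.float64) ** 2 + 1.0).sum())

    def split_into_tiles(raster: np.ndarray, tile_size: int):
        rows, cols = raster.shape
        for r in range(0, rows, tile_size):
            for c in range(0, cols, tile_size):
                yield raster[r:r + tile_size, c:c + tile_size]

    if __name__ == "__main__":
        raster = np.random.rand(2048, 2048)
        tiles = list(split_into_tiles(raster, 512))
        with Pool(processes=4) as pool:                      # scatter ...
            partial_results = pool.map(process_tile, tiles)  # ... and gather
        print(f"{len(tiles)} tiles processed, total = {sum(partial_results):.3e}")
    ```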

  4. Biosecurity and Open-Source Biology: The Promise and Peril of Distributed Synthetic Biological Technologies.

    Science.gov (United States)

    Evans, Nicholas G; Selgelid, Michael J

    2015-08-01

    In this article, we raise ethical concerns about the potential misuse of open-source biology (OSB): biological research and development that progresses through an organisational model of radical openness, deskilling, and innovation. We compare this organisational structure to that of the open-source software model, and detail salient ethical implications of this model. We demonstrate that OSB, in virtue of its commitment to openness, may be resistant to governance attempts.

  5. Compiling software for a hierarchical distributed processing system

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2013-12-31

    Compiling software for a hierarchical distributed processing system including providing to one or more compiling nodes software to be compiled, wherein at least a portion of the software to be compiled is to be executed by one or more nodes; compiling, by the compiling node, the software; maintaining, by the compiling node, any compiled software to be executed on the compiling node; selecting, by the compiling node, one or more nodes in a next tier of the hierarchy of the distributed processing system in dependence upon whether any compiled software is for the selected node or the selected node's descendents; sending to the selected node only the compiled software to be executed by the selected node or selected node's descendent.

  6. Distributed Sensing and Processing Adaptive Collaboration Environment (D-SPACE)

    Science.gov (United States)

    2014-07-01

    AFRL/RI report AFRL-RI-RS-TR-2014-195. … “cloud” technologies are not appropriate for situation understanding in areas of denial, where computation resources are limited, data not easily … graph matching process. D-SPACE distributes graph exploitation among a network of autonomous computational resources, designs the collaboration policy

  7. A Process for Comparing Dynamics of Distributed Space Systems Simulations

    Science.gov (United States)

    Cures, Edwin Z.; Jackson, Albert A.; Morris, Jeffery C.

    2009-01-01

    The paper describes a process that was developed for comparing the primary orbital dynamics behavior between space systems distributed simulations. This process is used to characterize and understand the fundamental fidelities and compatibilities of the modeling of orbital dynamics between spacecraft simulations. This is required for high-latency distributed simulations such as NASA's Integrated Mission Simulation and must be understood when reporting results from simulation executions. This paper presents 10 principal comparison tests along with their rationale and examples of the results. The Integrated Mission Simulation (IMSim) (formerly known as the Distributed Space Exploration Simulation (DSES)) is a NASA research and development project focusing on the technologies and processes that are related to the collaborative simulation of complex space systems involved in the exploration of our solar system. Currently, the NASA centers that are actively participating in the IMSim project are the Ames Research Center, the Jet Propulsion Laboratory (JPL), the Johnson Space Center (JSC), the Kennedy Space Center, the Langley Research Center and the Marshall Space Flight Center. In concept, each center participating in IMSim has its own set of simulation models and environment(s). These simulation tools are used to build the various simulation products that are used for scientific investigation, engineering analysis, system design, training, planning, operations and more. Working individually, these production simulations provide important data to various NASA projects.

  8. Efficient LIDAR Point Cloud Data Managing and Processing in a Hadoop-Based Distributed Framework

    Science.gov (United States)

    Wang, C.; Hu, F.; Sha, D.; Han, X.

    2017-10-01

    Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering, and other fields. However, it is challenging to efficiently store, query, and analyze high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, which takes advantage of Hadoop's storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to run the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experiment results show that the proposed framework can efficiently manage and process big LiDAR data.

  9. Syntactic processing is distributed across the language system.

    Science.gov (United States)

    Blank, Idan; Balewski, Zuzanna; Mahowald, Kyle; Fedorenko, Evelina

    2016-02-15

    Language comprehension recruits an extended set of regions in the human brain. Is syntactic processing localized to a particular region or regions within this system, or is it distributed across the entire ensemble of brain regions that support high-level linguistic processing? Evidence from aphasic patients is more consistent with the latter possibility: damage to many different language regions and to white-matter tracts connecting them has been shown to lead to similar syntactic comprehension deficits. However, brain imaging investigations of syntactic processing continue to focus on particular regions within the language system, often parts of Broca's area and regions in the posterior temporal cortex. We hypothesized that, whereas the entire language system is in fact sensitive to syntactic complexity, the effects in some regions may be difficult to detect because of the overall lower response to language stimuli. Using an individual-subjects approach to localizing the language system, shown in prior work to be more sensitive than traditional group analyses, we indeed find responses to syntactic complexity throughout this system, consistent with the findings from the neuropsychological patient literature. We speculate that such distributed nature of syntactic processing could perhaps imply that syntax is inseparable from other aspects of language comprehension (e.g., lexico-semantic processing), in line with current linguistic and psycholinguistic theories and evidence. Neuroimaging investigations of syntactic processing thus need to expand their scope to include the entire system of high-level language processing regions in order to fully understand how syntax is instantiated in the human brain. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. Evolution of the Campanian Ignimbrite Magmatic System II: Trace Element and Th Isotopic Evidence for Open-System Processes

    Science.gov (United States)

    Bohrson, W. A.; Spera, F. J.; Fowler, S.; Belkin, H.; de Vivo, B.

    2005-12-01

    The Campanian Ignimbrite, a large volume (~200 km3 DRE) trachytic to phonolitic ignimbrite was deposited at ~39.3 ka and represents the largest of a number of highly explosive volcanic events in the region near Naples, Italy. Thermodynamic modeling of the major element evolution using the MELTS algorithm (see companion contribution by Fowler et al.) provides detailed information about the identity of and changes in proportions of solids along the liquid line of descent during isobaric fractional crystallization. We have derived trace element mass balance equations that explicitly accommodate changing mineral-melt bulk distribution coefficients during crystallization and also simultaneously satisfy energy and major element mass conservation. Although major element patterns are reasonably modeled assuming closed system fractional crystallization, modeling of trace elements that represent a range of behaviors (e.g. Zr, Nb, Th, U, Rb, Sm, Sr) yields trends for closed system fractionation that are distinct from those observed. These results suggest open-system processes were also important in the evolution of the Campanian magmatic system. Th isotope data yield an apparent isochron that is ~20 kyr younger than the age of the deposit, and age-corrected Th isotope data indicate that the magma body was an open-system at the time of eruption. Because open-system processes can profoundly change isotopic characteristics of a magma body, these results illustrate that it is critical to understand the contribution that open-system processes make to silicic magma bodies prior to assigning relevance to age or timescale information derived from isotope systematics. Fluid-magma interaction has been proposed as a mechanism to change isotopic and elemental characteristics of magma bodies, but an evaluation of the mass and thermal constraints on such a process suggest large-scale fluid-melt interaction at liquidus temperatures is unlikely. In the case of the magma body associated with
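
    As a worked example of the closed-system baseline that such studies compare against, the sketch below evaluates the standard Rayleigh fractional-crystallization relation, C_liq = C_0 · F^(D-1), for one incompatible and one compatible trace element. The paper's own mass-balance treatment additionally lets the bulk distribution coefficient evolve along the liquid line of descent and couples it to energy conservation; all numbers below are illustrative only.

    ```python
    # Closed-system baseline: Rayleigh fractional crystallization,
    # C_liq = C_0 * F**(D - 1), with F the melt fraction remaining and D the bulk
    # solid/melt distribution coefficient. Values are purely illustrative.
    import numpy as np

    C0 = 10.0                           # initial melt concentration (ppm)
    F = np.linspace(1.0, 0.3, 8)        # melt fraction remaining

    for element, D in [("incompatible (D = 0.05)", 0.05),
                       ("compatible   (D = 3.0) ", 3.0)]:
        C_liq = C0 * F ** (D - 1.0)
        print(element, np.round(C_liq, 2))
    # Departures of measured trends from these closed-system curves are the
    # signature of open-system processes such as recharge or assimilation.
    ```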

  11. An Educational Tool for Interactive Parallel and Distributed Processing

    DEFF Research Database (Denmark)

    Pagliarini, Luigi; Lund, Henrik Hautop

    2011-01-01

    In this paper we try to describe how the Modular Interactive Tiles System (MITS) can be a valuable tool for introducing students to interactive parallel and distributed processing programming. This is done by providing an educational hands-on tool that allows a change of representation of the abstract problems related to designing interactive parallel and distributed systems. Indeed, MITS seems to bring a series of goals into the education, such as parallel programming, distributedness, communication protocols, master dependency, software behavioral models, adaptive interactivity, feedback, connectivity, topology, island modeling, user and multiuser interaction, which can hardly be found in other tools. Finally, we introduce the system of modular interactive tiles as a tool for easy, fast, and flexible hands-on exploration of these issues, and through examples show how to implement interactive...

  12. OpenGeoSys: An open-source initiative for numerical simulation of thermo-hydro-mechanical/chemical (THM/C) processes in porous media

    DEFF Research Database (Denmark)

    Kolditz, O.; Bauer, S.; Bilke, L.

    In this paper we describe the OpenGeoSys (OGS) project, which is a scientific open-source initiative for numerical simulation of thermo-hydro-mechanical/chemical processes in porous media. The basic concept is to provide a flexible numerical framework (using primarily the Finite Element Method (FEM...

  13. Heat and work distributions for mixed Gauss–Cauchy process

    International Nuclear Information System (INIS)

    Kuśmierz, Łukasz; Gudowska-Nowak, Ewa; Rubi, J Miguel

    2014-01-01

    We analyze energetics of a non-Gaussian process described by a stochastic differential equation of the Langevin type. The process represents a paradigmatic model of a nonequilibrium system subject to thermal fluctuations and additional external noise, with both sources of perturbations considered as additive and statistically independent forcings. We define thermodynamic quantities for trajectories of the process and analyze contributions to mechanical work and heat. As a working example we consider a particle subjected to a drag force and two statistically independent Lévy white noises with stability indices α = 2 and α = 1. The fluctuations of dissipated energy (heat) and distribution of work performed by the force acting on the system are addressed by examining contributions of Cauchy fluctuations (α = 1) to either bath or external force acting on the system. (paper)
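
    A minimal Euler-Maruyama sketch of the working example described above is given below: a particle with a linear drag force driven by two independent Lévy white noises, Gaussian (α = 2, increments scaling as dt^(1/2)) and Cauchy (α = 1, increments scaling as dt). The constant external force, the noise amplitudes, and the simple running tally of the work done by that force are illustrative assumptions; the paper's exact trajectory-wise definitions of heat and work are not reproduced.

    ```python
    # Euler-Maruyama sketch of a Langevin particle with drag, an external force f,
    # and two independent noises: Gaussian (alpha = 2) and Cauchy (alpha = 1).
    import numpy as np

    rng = np.random.default_rng(1)
    gamma, f = 1.0, 0.5               # drag coefficient and constant external force
    sigma_g, sigma_c = 1.0, 0.2       # amplitudes of the alpha = 2 and alpha = 1 noises
    dt, n_steps = 1e-3, 100_000

    v, x, work = 0.0, 0.0, 0.0
    for _ in range(n_steps):
        gauss = sigma_g * np.sqrt(dt) * rng.standard_normal()   # ~ dt**(1/2)
        cauchy = sigma_c * dt * rng.standard_cauchy()            # ~ dt**(1/1)
        v += (-gamma * v + f) * dt + gauss + cauchy
        x += v * dt
        work += f * v * dt             # work done on the particle by the constant force

    print(f"final position = {x:.3f}, work by external force = {work:.3f}")
    ```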

  14. Study on Manufacturing Process of Hollow Main Shaft by Open Die Forging

    International Nuclear Information System (INIS)

    Kwon, Yong Chul; Kang, Jong Hun; Kim, Sang Sik

    2016-01-01

    The main shaft is one of the key components connecting the rotor hub and gear box of a wind power generator. Typically, main shafts are manufactured by open die forging method. However, the main shaft for large MW class wind generators is designed to be hollow in order to reduce the weight. Additionally, the main shafts are manufactured by a casting process. This study aims to develop a manufacturing process for hollow main shafts by the open die forging method. The design of a forging process for a solid main shaft and hollow shaft was prepared by an open die forging process design scheme. Finite element analyses were performed to obtain the flow stress by a hot compression test at different temperature and strain rates. The control parameters of each forging process, such as temperature and effective strain, were obtained and compared to predict the suitability of the hollow main shaft forging process. Finally, high productivity reflecting material utilization ratio, internal quality, shape, and dimension was verified by the prototypes manufactured by the proposed forging process for hollow main shafts

  15. Study on Manufacturing Process of Hollow Main Shaft by Open Die Forging

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Yong Chul [Gyeongnam Technopark, Changwon (Korea, Republic of); Kang, Jong Hun [Jungwon Univ., Goisan (Korea, Republic of); Kim, Sang Sik [Gyeongsang Natiional Univ., Jinju (Korea, Republic of)

    2016-02-15

    The main shaft is one of the key components connecting the rotor hub and gear box of a wind power generator. Typically, main shafts are manufactured by open die forging method. However, the main shaft for large MW class wind generators is designed to be hollow in order to reduce the weight. Additionally, the main shafts are manufactured by a casting process. This study aims to develop a manufacturing process for hollow main shafts by the open die forging method. The design of a forging process for a solid main shaft and hollow shaft was prepared by an open die forging process design scheme. Finite element analyses were performed to obtain the flow stress by a hot compression test at different temperature and strain rates. The control parameters of each forging process, such as temperature and effective strain, were obtained and compared to predict the suitability of the hollow main shaft forging process. Finally, high productivity reflecting material utilization ratio, internal quality, shape, and dimension was verified by the prototypes manufactured by the proposed forging process for hollow main shafts.

  16. An educational tool for interactive parallel and distributed processing

    DEFF Research Database (Denmark)

    Pagliarini, Luigi; Lund, Henrik Hautop

    2012-01-01

    In this article we try to describe how the modular interactive tiles system (MITS) can be a valuable tool for introducing students to interactive parallel and distributed processing programming. This is done by providing a hands-on educational tool that allows a change in the representation of abstract problems related to designing interactive parallel and distributed systems. Indeed, the MITS seems to bring a series of goals into education, such as parallel programming, distributedness, communication protocols, master dependency, software behavioral models, adaptive interactivity, feedback, connectivity, topology, island modeling, and user and multi-user interaction which can rarely be found in other tools. Finally, we introduce the system of modular interactive tiles as a tool for easy, fast, and flexible hands-on exploration of these issues, and through examples we show how to implement...

  17. Parallel Distributed Processing Theory in the Age of Deep Networks.

    Science.gov (United States)

    Bowers, Jeffrey S

    2017-12-01

    Parallel distributed processing (PDP) models in psychology are the precursors of deep networks used in computer science. However, only PDP models are associated with two core psychological claims, namely that all knowledge is coded in a distributed format and cognition is mediated by non-symbolic computations. These claims have long been debated in cognitive science, and recent work with deep networks speaks to this debate. Specifically, single-unit recordings show that deep networks learn units that respond selectively to meaningful categories, and researchers are finding that deep networks need to be supplemented with symbolic systems to perform some tasks. Given the close links between PDP and deep networks, it is surprising that research with deep networks is challenging PDP theory. Copyright © 2017. Published by Elsevier Ltd.

  18. Numerical investigation of the recruitment process in open marine population models

    International Nuclear Information System (INIS)

    Angulo, O; López-Marcos, J C; López-Marcos, M A; Martínez-Rodríguez, J

    2011-01-01

    The changes in the dynamics, produced by the recruitment process in an open marine population model, are investigated from a numerical point of view. The numerical method considered, based on the representation of the solution along the characteristic lines, approximates properly the steady states of the model, and is used to analyze the asymptotic behavior of the solutions of the model

  19. Transactional Distance among Open University Students: How Does it Affect the Learning Process?

    Science.gov (United States)

    Kassandrinou, Amanda; Angelaki, Christina; Mavroidis, Ilias

    2014-01-01

    This study examines the presence of transactional distance among students, the factors affecting it, as well as the way it influences the learning process of students in a blended distance learning setting in Greece. The present study involved 12 postgraduate students of the Hellenic Open University (HOU). A qualitative research was conducted,…

  20. Opening the Learning Process: The Potential Role of Feature Film in Teaching Employment Relations

    Science.gov (United States)

    Lafferty, George

    2016-01-01

    This paper explores the potential of feature film to encourage more inclusive, participatory and open learning in the area of employment relations. Evaluations of student responses in a single postgraduate course over a five-year period revealed how feature film could encourage participatory learning processes in which students reexamined their…

  1. Modeling nanoparticle uptake and intracellular distribution using stochastic process algebras

    Energy Technology Data Exchange (ETDEWEB)

    Dobay, M. P. D., E-mail: maria.pamela.david@physik.uni-muenchen.de; Alberola, A. Piera; Mendoza, E. R.; Raedler, J. O., E-mail: joachim.raedler@physik.uni-muenchen.de [Ludwig-Maximilians University, Faculty of Physics, Center for NanoScience (Germany)

    2012-03-15

    Computational modeling is increasingly important to help understand the interaction and movement of nanoparticles (NPs) within living cells, and to come to terms with the wealth of data that microscopy imaging yields. A quantitative description of the spatio-temporal distribution of NPs inside cells is, however, challenging due to the complexity of multiple compartments such as endosomes and nuclei, which themselves are dynamic and can undergo fusion and fission and exchange their content. Here, we show that stochastic pi calculus, a widely-used process algebra, is well suited for mapping surface and intracellular NP interactions and distributions. In stochastic pi calculus, each NP is represented as a process, which can adopt various states such as bound or aggregated, as well as be passed between processes representing location, as a function of predefined stochastic channels. We created a pi calculus model of gold NP uptake and intracellular movement and compared the evolution of surface-bound, cytosolic, endosomal, and nuclear NP densities with electron microscopy data. We demonstrate that the computational approach can be extended to include specific molecular binding and potential interaction with signaling cascades as characteristic for NP-cell interactions in a wide range of applications such as nanotoxicity, viral infection, and drug delivery.
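
    The paper expresses the model in stochastic pi calculus; as a rough illustration of the same idea in plain Python, the sketch below treats each NP as a discrete agent that hops between compartments with predefined stochastic rates, sampling exponential waiting times Gillespie-style. The compartment chain and the rate constants are illustrative assumptions, not the paper's fitted values.

    ```python
    # Rough illustration of the modelling idea: NPs as stochastic agents moving
    # surface -> endosome -> cytosol -> nucleus with first-order rates.
    import numpy as np

    rng = np.random.default_rng(42)
    transitions = {"surface": ("endosome", 0.8),     # illustrative per-hour rates
                   "endosome": ("cytosol", 0.3),
                   "cytosol": ("nucleus", 0.05)}

    def simulate_np(t_end: float) -> str:
        state, t = "surface", 0.0
        while state in transitions:
            target, rate = transitions[state]
            t += rng.exponential(1.0 / rate)         # waiting time for the next hop
            if t > t_end:
                break
            state = target
        return state

    population = [simulate_np(t_end=8.0) for _ in range(5_000)]
    for compartment in ("surface", "endosome", "cytosol", "nucleus"):
        frac = population.count(compartment) / len(population)
        print(f"{compartment:>8}: {frac:.2%} of NPs after 8 h")
    ```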

  2. Modeling nanoparticle uptake and intracellular distribution using stochastic process algebras

    International Nuclear Information System (INIS)

    Dobay, M. P. D.; Alberola, A. Piera; Mendoza, E. R.; Rädler, J. O.

    2012-01-01

    Computational modeling is increasingly important to help understand the interaction and movement of nanoparticles (NPs) within living cells, and to come to terms with the wealth of data that microscopy imaging yields. A quantitative description of the spatio-temporal distribution of NPs inside cells is, however, challenging due to the complexity of multiple compartments such as endosomes and nuclei, which themselves are dynamic and can undergo fusion and fission and exchange their content. Here, we show that stochastic pi calculus, a widely-used process algebra, is well suited for mapping surface and intracellular NP interactions and distributions. In stochastic pi calculus, each NP is represented as a process, which can adopt various states such as bound or aggregated, as well as be passed between processes representing location, as a function of predefined stochastic channels. We created a pi calculus model of gold NP uptake and intracellular movement and compared the evolution of surface-bound, cytosolic, endosomal, and nuclear NP densities with electron microscopy data. We demonstrate that the computational approach can be extended to include specific molecular binding and potential interaction with signaling cascades as characteristic for NP-cell interactions in a wide range of applications such as nanotoxicity, viral infection, and drug delivery.

  3. Modeling nanoparticle uptake and intracellular distribution using stochastic process algebras

    Science.gov (United States)

    Dobay, M. P. D.; Alberola, A. Piera; Mendoza, E. R.; Rädler, J. O.

    2012-03-01

    Computational modeling is increasingly important to help understand the interaction and movement of nanoparticles (NPs) within living cells, and to come to terms with the wealth of data that microscopy imaging yields. A quantitative description of the spatio-temporal distribution of NPs inside cells is, however, challenging due to the complexity of multiple compartments such as endosomes and nuclei, which themselves are dynamic and can undergo fusion and fission and exchange their content. Here, we show that stochastic pi calculus, a widely-used process algebra, is well suited for mapping surface and intracellular NP interactions and distributions. In stochastic pi calculus, each NP is represented as a process, which can adopt various states such as bound or aggregated, as well as be passed between processes representing location, as a function of predefined stochastic channels. We created a pi calculus model of gold NP uptake and intracellular movement and compared the evolution of surface-bound, cytosolic, endosomal, and nuclear NP densities with electron microscopy data. We demonstrate that the computational approach can be extended to include specific molecular binding and potential interaction with signaling cascades as characteristic for NP-cell interactions in a wide range of applications such as nanotoxicity, viral infection, and drug delivery.

  4. Distributed Open and Distance Learning: How Does E-Learning Fit? LSDA Reports.

    Science.gov (United States)

    Fletcher, Mick

    The distinctions between types of open and distance learning broadly equate to the concept of learning at a time, place, and pace that best suits the learner. Distance learning refers to geography, whereas open learning refers to time. Flexible learning is a generic term referring either to geography or time. Combining these distinctions allows…

  5. IPNS distributed-processing data-acquisition system

    International Nuclear Information System (INIS)

    Haumann, J.R.; Daly, R.T.; Worlton, T.G.; Crawford, R.K.

    1981-01-01

    The Intense Pulsed Neutron Source (IPNS) at Argonne National Laboratory is a major new user-oriented facility which has come on line for basic research in neutron scattering and neutron radiation damage. This paper describes the distributed-processing data-acquisition system which handles data collection and instrument control for the time-of-flight neutron-scattering instruments. The topics covered include the overall system configuration, each of the computer subsystems, communication protocols linking each computer subsystem, and an overview of the software which has been developed

  6. A Coordinated Initialization Process for the Distributed Space Exploration Simulation

    Science.gov (United States)

    Crues, Edwin Z.; Phillips, Robert G.; Dexter, Dan; Hasan, David

    2007-01-01

    A viewgraph presentation on the federate initialization process for the Distributed Space Exploration Simulation (DSES) is described. The topics include: 1) Background: DSES; 2) Simulation requirements; 3) Nine Step Initialization; 4) Step 1: Create the Federation; 5) Step 2: Publish and Subscribe; 6) Step 3: Create Object Instances; 7) Step 4: Confirm All Federates Have Joined; 8) Step 5: Achieve initialize Synchronization Point; 9) Step 6: Update Object Instances With Initial Data; 10) Step 7: Wait for Object Reflections; 11) Step 8: Set Up Time Management; 12) Step 9: Achieve startup Synchronization Point; and 13) Conclusions

  7. Distributed Temperature Measurement in a Self-Burning Coal Waste Pile through a GIS Open Source Desktop Application

    Directory of Open Access Journals (Sweden)

    Lia Duarte

    2017-03-01

    Full Text Available Geographical Information Systems (GIS) are often used to assess and monitor the environmental impacts caused by mining activities. The aim of this work was to develop a new application to produce dynamic maps for monitoring the temperature variations in a self-burning coal waste pile, under an open source GIS environment, GIS-ECOAL (freely available). The performance of the application was evaluated with distributed temperature measurements gathered in the S. Pedro da Cova (Portugal) coal waste pile. In order to obtain the temperature data, an optical fiber cable was laid over the affected area of the pile, with 42 location stakes acting as precisely-located control points for the temperature measurement. A monthly data set from July (15-min interval) was fed into the application, and a video composed of several layouts with temperature measurements was created, allowing two main areas with higher temperatures to be recognized. The field observations also allow the identification of these zones; however, the identification of an area with higher temperatures at the top of the studied area was only possible through the visualization of the images created by this application. The generated videos enable dynamic and continuous visualization of the combustion process in the monitored area.
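
    The map-making step behind such monitoring can be sketched as interpolating the point temperatures measured at the stakes onto a regular grid. The snippet below uses inverse-distance weighting, which is only one plausible interpolator chosen for illustration and is not necessarily the method implemented in GIS-ECOAL; stake positions and temperatures are synthetic.

    ```python
    # Hedged sketch: interpolate stake temperature measurements onto a grid with
    # inverse-distance weighting (IDW); data and method choice are illustrative.
    import numpy as np

    rng = np.random.default_rng(7)
    stakes = rng.uniform(0, 100, size=(42, 2))       # x, y positions (m), synthetic
    temps = 20 + 60 * rng.random(42)                  # measured temperatures (deg C)

    gx, gy = np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 50))
    grid_pts = np.column_stack([gx.ravel(), gy.ravel()])

    d = np.linalg.norm(grid_pts[:, None, :] - stakes[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-6) ** 2                # IDW weights, power = 2
    surface = ((w @ temps) / w.sum(axis=1)).reshape(gx.shape)

    hottest = np.unravel_index(surface.argmax(), surface.shape)
    print(f"hottest interpolated cell {hottest}: {surface.max():.1f} deg C")
    ```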

  8. The CANDU 9 distributed control system design process

    International Nuclear Information System (INIS)

    Harber, J.E.; Kattan, M.K.; Macbeth, M.J.

    1997-01-01

    Canadian designed CANDU pressurized heavy water nuclear reactors have been world leaders in electrical power generation. The CANDU 9 project is AECL's next reactor design. Plant control for the CANDU 9 station design is performed by a distributed control system (DCS) as compared to centralized control computers, analog control devices and relay logic used in previous CANDU designs. The selection of a DCS as the platform to perform the process control functions and most of the data acquisition of the plant, is consistent with the evolutionary nature of the CANDU technology. The control strategies for the DCS control programs are based on previous CANDU designs but are implemented on a new hardware platform taking advantage of advances in computer technology. This paper describes the design process for developing the CANDU 9 DCS. Various design activities, prototyping and analyses have been undertaken in order to ensure a safe, functional, and cost-effective design. (author)

  9. Design and simulation for real-time distributed processing systems

    International Nuclear Information System (INIS)

    Legrand, I.C.; Gellrich, A.; Gensah, U.; Leich, H.; Wegner, P.

    1996-01-01

    The aim of this work is to provide a proper framework for the simulation and the optimization of the event building, the on-line third level trigger, and complete event reconstruction processor farm for the future HERA-B experiment. A discrete event, process oriented, simulation developed in concurrent μC++ is used for modelling the farm nodes running with multi-tasking constraints and different types of switching elements and digital signal processors interconnected for distributing the data through the system. An adequate graphic interface to the simulation part which allows to monitor features on-line and to analyze trace files, provides a powerful development tool for evaluating and designing parallel processing architectures. Control software and data flow protocols for event building and dynamic processor allocation are presented for two architectural models. (author)
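
    The original work uses a process-oriented discrete-event simulation written in concurrent μC++; as a language-neutral illustration of the general approach, the sketch below simulates events flowing through a small processor farm, assigning each event to the earliest-free node and recording queueing delay. All rates and node counts are illustrative assumptions.

    ```python
    # Minimal discrete-event sketch of a processor farm: events arrive, are assigned
    # to the first free node, and occupy it for an exponentially distributed
    # processing time. Arrival and service rates are illustrative.
    import random

    random.seed(0)
    N_NODES, ARRIVAL_RATE, SERVICE_RATE, N_EVENTS = 8, 400.0, 60.0, 5_000

    free_at = [0.0] * N_NODES            # time at which each node becomes free
    t, waits = 0.0, []
    for _ in range(N_EVENTS):
        t += random.expovariate(ARRIVAL_RATE)             # next event arrival
        node = min(range(N_NODES), key=lambda i: free_at[i])
        start = max(t, free_at[node])                      # wait if the node is busy
        free_at[node] = start + random.expovariate(SERVICE_RATE)
        waits.append(start - t)

    print(f"mean queueing delay = {1e3 * sum(waits) / len(waits):.2f} ms, "
          f"farm busy until t = {max(free_at):.3f} s")
    ```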

  10. Software/hardware distributed processing network supporting the Ada environment

    Science.gov (United States)

    Wood, Richard J.; Pryk, Zen

    1993-09-01

    A high-performance, fault-tolerant, distributed network has been developed, tested, and demonstrated. The network is based on the MIPS Computer Systems, Inc. R3000 Risc for processing, VHSIC ASICs for high speed, reliable, inter-node communications and compatible commercial memory and I/O boards. The network is an evolution of the Advanced Onboard Signal Processor (AOSP) architecture. It supports Ada application software with an Ada- implemented operating system. A six-node implementation (capable of expansion up to 256 nodes) of the RISC multiprocessor architecture provides 120 MIPS of scalar throughput, 96 Mbytes of RAM and 24 Mbytes of non-volatile memory. The network provides for all ground processing applications, has merit for space-qualified RISC-based network, and interfaces to advanced Computer Aided Software Engineering (CASE) tools for application software development.

  11. From mineral processing to waste treatment: an open-mind process simulator

    International Nuclear Information System (INIS)

    Guillaneau, J.C.; Brochot, S.; Durance, M.V.; Villeneuve, J.; Fourniguet, G.; Vedrine, H.; Sandvik, K.; Reuter, M.

    1999-01-01

    More than two hundred companies world-wide use the USIM PAC process simulator within the mineral industry. Whether for design or plant adaptation, simulation increasingly supports the process engineer in his activities. Beyond the mineral field, new domains are being addressed by this model-based approach as new models are developed and new applications involving solid waste appear. Examples are presented in bio-processing, steel-making flue dust treatment for zinc valorisation, soil decontamination and urban waste valorisation (sorting, composting and incineration). (author)

  12. BioSig: the free and open source software library for biomedical signal processing.

    Science.gov (United States)

    Vidaurre, Carmen; Sander, Tilmann H; Schlögl, Alois

    2011-01-01

    BioSig is an open source software library for biomedical signal processing. The aim of the BioSig project is to foster research in biomedical signal processing by providing free and open source software tools for many different application areas. Some of the areas where BioSig can be employed are neuroinformatics, brain-computer interfaces, neurophysiology, psychology, cardiovascular systems, and sleep research. Moreover, the analysis of biosignals such as the electroencephalogram (EEG), electrocorticogram (ECoG), electrocardiogram (ECG), electrooculogram (EOG), electromyogram (EMG), or respiration signals is a very relevant element of the BioSig project. Specifically, BioSig provides solutions for data acquisition, artifact processing, quality control, feature extraction, classification, modeling, and data visualization, to name a few. In this paper, we highlight several methods to help students and researchers to work more efficiently with biomedical signals.

  13. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    Science.gov (United States)

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  14. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    Science.gov (United States)

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  15. Risk assessment of occupational groups working in open pit mining: Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Yaşar Kasap

    2017-01-01

    Full Text Available In open pit mining it is possible to prevent industrial accidents, and their consequences such as deaths, physical disabilities and financial loss, by implementing risk analyses in advance. If the probabilities of different occupational groups encountering various hazards are determined, workers’ risk of having industrial accidents and catching occupational illnesses can be controlled. In this sense, the aim of this study was to assess the industrial accidents which occurred during open pit coal production in the Turkish Coal Enterprises (TCE) Garp Lignite unit between 2005 and 2010 and to analyze the risks using the Analytic Hierarchy Process (AHP). The analyses conducted with AHP revealed that the greatest risk in open pit mining is landslides, the most risky occupational group is unskilled labourers and the most common hazards are caused by landslides and transportation/hand tools/falling.
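
    The core AHP calculation behind such a risk ranking can be shown in a few lines: priorities are the principal eigenvector of a pairwise comparison matrix, and a consistency ratio checks whether the judgements are usable. The 3x3 matrix below (for example, landslide vs transport vs hand-tool hazards) is purely illustrative and is not the study's data.

    ```python
    # Worked AHP sketch: priority vector and consistency ratio from an illustrative
    # pairwise comparison matrix (Saaty's 1-9 scale).
    import numpy as np

    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                 # principal (Perron) eigenvalue
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()                    # normalised priority vector

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)        # consistency index
    cr = ci / 0.58                              # random index RI = 0.58 for n = 3
    print("priorities:", np.round(weights, 3), f"consistency ratio = {cr:.3f}")
    ```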

  16. Distributed and cooperative task processing: Cournot oligopolies on a graph.

    Science.gov (United States)

    Pavlic, Theodore P; Passino, Kevin M

    2014-06-01

    This paper introduces a novel framework for the design of distributed agents that must complete externally generated tasks but also can volunteer to process tasks encountered by other agents. To reduce the computational and communication burden of coordination between agents to perfectly balance load around the network, the agents adjust their volunteering propensity asynchronously within a fictitious trading economy. This economy provides incentives for nontrivial levels of volunteering for remote tasks, and thus load is shared. Moreover, the combined effects of diminishing marginal returns and network topology lead to competitive equilibria that have task reallocations that are qualitatively similar to what is expected in a load-balancing system with explicit coordination between nodes. In the paper, topological and algorithmic conditions are given that ensure the existence and uniqueness of a competitive equilibrium. Additionally, a decentralized distributed gradient-ascent algorithm is given that is guaranteed to converge to this equilibrium while not causing any node to over-volunteer beyond its maximum task-processing rate. The framework is applied to an autonomous-air-vehicle example, and connections are drawn to classic studies of the evolution of cooperation in nature.

  17. Adaptive Dynamic Process Scheduling on Distributed Memory Parallel Computers

    Directory of Open Access Journals (Sweden)

    Wei Shu

    1994-01-01

    Full Text Available One of the challenges in programming distributed memory parallel machines is deciding how to allocate work to processors. This problem is particularly important for computations with unpredictable dynamic behaviors or irregular structures. We present a scheme for dynamic scheduling of medium-grained processes that is useful in this context. The adaptive contracting within neighborhood (ACWN is a dynamic, distributed, load-dependent, and scalable scheme. It deals with dynamic and unpredictable creation of processes and adapts to different systems. The scheme is described and contrasted with two other schemes that have been proposed in this context, namely the randomized allocation and the gradient model. The performance of the three schemes on an Intel iPSC/2 hypercube is presented and analyzed. The experimental results show that even though the ACWN algorithm incurs somewhat larger overhead than the randomized allocation, it achieves better performance in most cases due to its adaptiveness. Its feature of quickly spreading the work helps it outperform the gradient model in performance and scalability.

  18. Application of signal processing techniques for islanding detection of distributed generation in distribution network: A review

    International Nuclear Information System (INIS)

    Raza, Safdar; Mokhlis, Hazlie; Arof, Hamzah; Laghari, J.A.; Wang, Li

    2015-01-01

    Highlights: • Pros & cons of conventional islanding detection techniques (IDTs) are discussed. • Signal processing techniques (SPTs) ability in detecting islanding is discussed. • SPTs ability in improving performance of passive techniques are discussed. • Fourier, s-transform, wavelet, HHT & tt-transform based IDTs are reviewed. • Intelligent classifiers (ANN, ANFIS, Fuzzy, SVM) application in SPT are discussed. - Abstract: High penetration of distributed generation resources (DGR) in distribution network provides many benefits in terms of high power quality, efficiency, and low carbon emissions in power system. However, efficient islanding detection and immediate disconnection of DGR is critical in order to avoid equipment damage, grid protection interference, and personnel safety hazards. Islanding detection techniques are mainly classified into remote, passive, active, and hybrid techniques. From these, passive techniques are more advantageous due to lower power quality degradation, lower cost, and widespread usage by power utilities. However, the main limitations of these techniques are that they possess a large non detection zones and require threshold setting. Various signal processing techniques and intelligent classifiers have been used to overcome the limitations of passive islanding. Signal processing techniques, in particular, are adopted due to their versatility, stability, cost effectiveness, and ease of modification. This paper presents a comprehensive overview of signal processing techniques used to improve common passive islanding detection techniques. A performance comparison between the signal processing based islanding detection techniques with existing techniques are also provided. Finally, this paper outlines the relative advantages and limitations of the signal processing techniques in order to provide basic guidelines for researchers and field engineers in determining the best method for their system
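
    One of the simpler signal-processing features covered by such reviews is a Fourier-based measure of harmonic content at the point of common coupling; in a passive scheme, a jump in such a feature after the grid is lost can flag islanding. The sketch below estimates total harmonic distortion (THD) of a synthetic voltage waveform with an FFT; the waveform and trip threshold are illustrative assumptions only.

    ```python
    # Hedged sketch: FFT-based THD as a passive islanding-detection feature.
    import numpy as np

    fs, f0, cycles = 10_000, 50.0, 10                 # sampling rate (Hz), mains frequency
    t = np.arange(int(fs * cycles / f0)) / fs

    def thd(v: np.ndarray) -> float:
        spec = np.abs(np.fft.rfft(v))
        freqs = np.fft.rfftfreq(len(v), 1 / fs)
        fund = spec[np.argmin(np.abs(freqs - f0))]
        harmonics = [spec[np.argmin(np.abs(freqs - k * f0))] for k in range(2, 8)]
        return np.sqrt(sum(h ** 2 for h in harmonics)) / fund

    grid_connected = np.sin(2 * np.pi * f0 * t) + 0.01 * np.sin(2 * np.pi * 3 * f0 * t)
    islanded = (np.sin(2 * np.pi * f0 * t) + 0.08 * np.sin(2 * np.pi * 3 * f0 * t)
                + 0.05 * np.sin(2 * np.pi * 5 * f0 * t))

    for label, v in [("grid-connected", grid_connected), ("islanded", islanded)]:
        print(f"{label:>15}: THD = {thd(v):.3f}", "-> trip" if thd(v) > 0.05 else "")
    ```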

  19. Stepwise Distributed Open Innovation Contests for Software Development: Acceleration of Genome-Wide Association Analysis.

    Science.gov (United States)

    Hill, Andrew; Loh, Po-Ru; Bharadwaj, Ragu B; Pons, Pascal; Shang, Jingbo; Guinan, Eva; Lakhani, Karim; Kilty, Iain; Jelinsky, Scott A

    2017-05-01

    The association of differing genotypes with disease-related phenotypic traits offers great potential to both help identify new therapeutic targets and support stratification of patients who would gain the greatest benefit from specific drug classes. Development of low-cost genotyping and sequencing has made collecting large-scale genotyping data routine in population and therapeutic intervention studies. In addition, a range of new technologies is being used to capture numerous new and complex phenotypic descriptors. As a result, genotype and phenotype datasets have grown exponentially. Genome-wide association studies associate genotypes and phenotypes using methods such as logistic regression. As existing tools for association analysis limit the efficiency with which value can be extracted from increasing volumes of data, there is a pressing need for new software tools that can accelerate association analyses on large genotype-phenotype datasets. Using open innovation (OI) and contest-based crowdsourcing, the logistic regression analysis in a leading, community-standard genetics software package (PLINK 1.07) was substantially accelerated. Through this OI process we achieved an end-to-end speedup of 591-fold for a data set of 6678 subjects by 645,863 variants, compared to PLINK 1.07's logistic regression. This represents a reduction in run time from 4.8 hours to 29 seconds. Accelerated logistic regression code developed in this project has been incorporated into the PLINK2 project. Using iterative competition-based OI, we have developed a new, faster implementation of logistic regression for genome-wide association studies analysis. We present lessons learned and recommendations on running a successful OI process for bioinformatics. © The Author 2017. Published by Oxford University Press.
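
    The computation being accelerated is a per-variant logistic regression scan relating a binary phenotype to genotype dosages. The sketch below runs such a scan with statsmodels on tiny synthetic data; it is a hedged illustration of the statistical step only, not of PLINK's optimized implementation, and the dataset is orders of magnitude smaller than the 6678-subject by 645,863-variant set mentioned above.

    ```python
    # Hedged sketch of a GWAS logistic regression scan: one fit per variant,
    # associating a binary phenotype with genotype dosage (0/1/2). Synthetic data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n_subjects, n_variants = 500, 50
    genotypes = rng.integers(0, 3, size=(n_subjects, n_variants)).astype(float)
    phenotype = rng.integers(0, 2, size=n_subjects)        # case/control status

    p_values = []
    for j in range(n_variants):
        X = sm.add_constant(genotypes[:, j])                # intercept + dosage
        fit = sm.Logit(phenotype, X).fit(disp=0)            # one regression per variant
        p_values.append(fit.pvalues[1])                     # p-value of the dosage term

    top = int(np.argmin(p_values))
    print(f"most associated variant: #{top}, p = {p_values[top]:.3g}")
    ```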

  20. Extrusion Process by Finite Volume Method Using OpenFoam Software

    International Nuclear Information System (INIS)

    Matos Martins, Marcelo; Tonini Button, Sergio; Divo Bressan, Jose; Ivankovic, Alojz

    2011-01-01

    Computational codes are important tools for solving engineering problems. The analysis of metal forming processes, such as extrusion, is no different, because computational codes allow the process to be analyzed at reduced cost. Traditionally, the Finite Element Method is used to solve solid mechanics problems; however, the Finite Volume Method (FVM) has been gaining ground in this field of application. This paper presents velocity field and friction coefficient variation results obtained by numerical simulation using the OpenFOAM software and the FVM to solve an aluminum direct cold extrusion process.

  1. OpenTopography: Enabling Online Access to High-Resolution Lidar Topography Data and Processing Tools

    Science.gov (United States)

    Crosby, Christopher; Nandigam, Viswanath; Baru, Chaitan; Arrowsmith, J. Ramon

    2013-04-01

    High-resolution topography data acquired with lidar (light detection and ranging) technology are revolutionizing the way we study the Earth's surface and overlying vegetation. These data, collected from airborne, tripod, or mobile-mounted scanners have emerged as a fundamental tool for research on topics ranging from earthquake hazards to hillslope processes. Lidar data provide a digital representation of the earth's surface at a resolution sufficient to appropriately capture the processes that contribute to landscape evolution. The U.S. National Science Foundation-funded OpenTopography Facility (http://www.opentopography.org) is a web-based system designed to democratize access to earth science-oriented lidar topography data. OpenTopography provides free, online access to lidar data in a number of forms, including the raw point cloud and associated geospatial-processing tools for customized analysis. The point cloud data are co-located with on-demand processing tools to generate digital elevation models, and derived products and visualizations which allow users to quickly access data in a format appropriate for their scientific application. The OpenTopography system is built using a service-oriented architecture (SOA) that leverages cyberinfrastructure resources at the San Diego Supercomputer Center at the University of California San Diego to allow users, regardless of expertise level, to access these massive lidar datasets and derived products for use in research and teaching. OpenTopography hosts over 500 billion lidar returns covering 85,000 km2. These data are all in the public domain and are provided by a variety of partners under joint agreements and memoranda of understanding with OpenTopography. Partners include national facilities such as the NSF-funded National Center for Airborne Lidar Mapping (NCALM), as well as non-governmental organizations and local, state, and federal agencies. OpenTopography has become a hub for high-resolution topography

  2. Data-driven process decomposition and robust online distributed modelling for large-scale processes

    Science.gov (United States)

    Shu, Zhang; Lijuan, Li; Lijuan, Yao; Shipin, Yang; Tao, Zou

    2018-02-01

    With the increasing attention of networked control, system decomposition and distributed models show significant importance in the implementation of model-based control strategy. In this paper, a data-driven system decomposition and online distributed subsystem modelling algorithm was proposed for large-scale chemical processes. The key controlled variables are first partitioned by affinity propagation clustering algorithm into several clusters. Each cluster can be regarded as a subsystem. Then the inputs of each subsystem are selected by offline canonical correlation analysis between all process variables and its controlled variables. Process decomposition is then realised after the screening of input and output variables. When the system decomposition is finished, the online subsystem modelling can be carried out by recursively block-wise renewing the samples. The proposed algorithm was applied in the Tennessee Eastman process and the validity was verified.
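
    As an illustration of the two decomposition steps named here, the sketch below clusters output variables with affinity propagation and then scores candidate inputs per cluster with a one-component canonical correlation analysis; the data, variable counts, and scoring rule are assumptions made for the example, not details taken from the paper.

```python
# Minimal sketch of the decomposition idea described above: cluster the controlled
# (output) variables with affinity propagation, then relate candidate inputs to each
# cluster's outputs with CCA. All data below are synthetic.
import numpy as np
from sklearn.cluster import AffinityPropagation
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(2)
n_samples, n_inputs = 300, 12
U = rng.normal(size=(n_samples, n_inputs))            # candidate input variables
# Controlled (output) variables: two groups, each driven by one hidden input.
Y = np.column_stack([
    U[:, [0]] + 0.3 * rng.normal(size=(n_samples, 4)),
    U[:, [5]] + 0.3 * rng.normal(size=(n_samples, 4)),
])

# 1) Partition the outputs by affinity propagation on their correlation matrix.
similarity = np.corrcoef(Y.T)
labels = AffinityPropagation(affinity="precomputed", random_state=0).fit_predict(similarity)

# 2) For each output cluster (subsystem), rank candidate inputs with a 1-component CCA.
for cluster in np.unique(labels):
    Y_sub = Y[:, labels == cluster]
    cca = CCA(n_components=1).fit(U, Y_sub)
    top_input = int(np.argmax(np.abs(cca.x_weights_[:, 0])))
    print(f"subsystem {cluster}: outputs {np.where(labels == cluster)[0]}, top input {top_input}")
```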

  3. A Distributed, Open Source based Data Infrastructure for the Megacities Carbon Project

    Science.gov (United States)

    Verma, R.; Crichton, D. J.; Duren, R. M.; Salameh, P.; Sparks, A.; Sloop, C.

    2014-12-01

    With the goal of assessing the anthropogenic carbon-emission impact of urban centers on local and global climates, the Megacities Carbon Project has been building carbon-monitoring capabilities for the past two years around the Los Angeles metropolitan area. Hundreds of megabytes (MB) of data are generated daily, and distributed among data centers local to the sensor networks involved. We automatically pull this remotely generated data into a centralized data infrastructure local to the Jet Propulsion Laboratory (JPL), seeking to (1) provide collaboration opportunities on the data, and (2) generate refined data products through community-requested centralized data processing pipelines. The goal of this informatics effort is to ensure near real-time access to generated data products across the Los Angeles carbon monitoring sensor network and meet the data analysis needs of carbon researchers through the production of customized products. We discuss the goals of the informatics effort, its uniqueness, and assess its effectiveness in providing an insight into the carbon sphere of Los Angeles.

  4. Distributed Processing of Sentinel-2 Products using the BIGEARTH Platform

    Science.gov (United States)

    Bacu, Victor; Stefanut, Teodor; Nandra, Constantin; Mihon, Danut; Gorgan, Dorian

    2017-04-01

    The constellation of observational satellites orbiting around Earth is constantly increasing, providing more data that need to be processed in order to extract meaningful information and knowledge from it. Sentinel-2 satellites, part of the Copernicus Earth Observation program, aim to be used in agriculture, forestry and many other land management applications. ESA's SNAP toolbox can be used to process data gathered by Sentinel-2 satellites but is limited to the resources provided by a stand-alone computer. In this paper we present a cloud based software platform that makes use of this toolbox together with other remote sensing software applications to process Sentinel-2 products. The BIGEARTH software platform [1] offers an integrated solution for processing Earth Observation data coming from different sources (such as satellites or on-site sensors). The flow of processing is defined as a chain of tasks based on the WorDeL description language [2]. Each task could rely on a different software technology (such as Grass GIS and ESA's SNAP) in order to process the input data. One important feature of the BIGEARTH platform comes from this possibility of interconnection and integration, throughout the same flow of processing, of the various well known software technologies. All this integration is transparent from the user perspective. The proposed platform extends the SNAP capabilities by enabling specialists to easily scale the processing over distributed architectures, according to their specific needs and resources. The software platform [3] can be used in multiple configurations. In the basic one the software platform runs as a standalone application inside a virtual machine. Obviously in this case the computational resources are limited but it will give an overview of the functionalities of the software platform, and also the possibility to define the flow of processing and later on to execute it on a more complex infrastructure. The most complex and robust

  5. Locality-Aware Task Scheduling and Data Distribution for OpenMP Programs on NUMA Systems and Manycore Processors

    Directory of Open Access Journals (Sweden)

    Ananya Muddukrishna

    2015-01-01

    Full Text Available Performance degradation due to nonuniform data access latencies has worsened on NUMA systems and can now be felt on-chip in manycore processors. Distributing data across NUMA nodes and manycore processor caches is necessary to reduce the impact of nonuniform latencies. However, techniques for distributing data are error-prone and fragile and require low-level architectural knowledge. Existing task scheduling policies favor quick load-balancing at the expense of locality and ignore NUMA node/manycore cache access latencies while scheduling. Locality-aware scheduling, in conjunction with or as a replacement for existing scheduling, is necessary to minimize NUMA effects and sustain performance. We present a data distribution and locality-aware scheduling technique for task-based OpenMP programs executing on NUMA systems and manycore processors. Our technique relieves the programmer from thinking of NUMA system/manycore processor architecture details by delegating data distribution to the runtime system and uses task data dependence information to guide the scheduling of OpenMP tasks to reduce data stall times. We demonstrate our technique on a four-socket AMD Opteron machine with eight NUMA nodes and on the TILEPro64 processor and identify that data distribution and locality-aware task scheduling improve performance up to 69% for scientific benchmarks compared to default policies and yet provide an architecture-oblivious approach for programmers.

  6. Assessing the spatial distribution of Tuta absoluta (Lepidoptera: Gelechiidae) eggs in open-field tomato cultivation through geostatistical analysis.

    Science.gov (United States)

    Martins, Júlio C; Picanço, Marcelo C; Silva, Ricardo S; Gonring, Alfredo Hr; Galdino, Tarcísio Vs; Guedes, Raul Nc

    2018-01-01

    The spatial distribution of insects is due to the interaction between individuals and the environment. Knowledge about the within-field pattern of spatial distribution of a pest is critical to planning control tactics, developing efficient sampling plans, and predicting pest damage. The leaf miner Tuta absoluta (Meyrick) (Lepidoptera: Gelechiidae) is the main pest of tomato crops in several regions of the world. Despite the importance of this pest, the pattern of spatial distribution of T. absoluta on open-field tomato cultivation remains unknown. Therefore, this study aimed to characterize the spatial distribution of T. absoluta in 22 commercial open-field tomato cultivations with plants at the three phenological development stages by using geostatistical analysis. Geostatistical analysis revealed that there was strong evidence for spatially dependent (aggregated) T. absoluta eggs in 19 of the 22 sample tomato cultivations. The maps that were obtained demonstrated the aggregated structure of egg densities at the edges of the crops. Further, T. absoluta was found to accomplish egg dispersal along the rows more frequently than it does between rows. Our results indicate that the greatest egg densities of T. absoluta occur at the edges of tomato crops. These results are discussed in relation to the behavior of T. absoluta distribution within fields and in terms of their implications for improved sampling guidelines and precision targeting control methods that are essential for effective pest monitoring and management. © 2017 Society of Chemical Industry.
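
    As a pointer to how such a geostatistical analysis typically begins, the sketch below computes an empirical semivariogram of egg counts at sampled locations; the coordinates, counts, and lag classes are synthetic and arbitrary, so it only illustrates the general technique, not the study's actual procedure.

```python
# Minimal sketch of an empirical semivariogram for spatially sampled egg counts.
# Sample locations and counts below are synthetic.
import numpy as np

rng = np.random.default_rng(3)
coords = rng.uniform(0, 100, size=(200, 2))            # sample locations (m)
eggs = rng.poisson(lam=2 + 0.05 * coords[:, 0])        # counts with a weak spatial trend

# Pairwise separation distances and squared count differences.
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
sq_diff = (eggs[:, None] - eggs[None, :]) ** 2
iu = np.triu_indices_from(d, k=1)                      # count each pair once

bins = np.arange(0, 60, 10)                            # lag classes (m)
for lo, hi in zip(bins[:-1], bins[1:]):
    mask = (d[iu] >= lo) & (d[iu] < hi)
    gamma = 0.5 * sq_diff[iu][mask].mean()             # semivariance for this lag class
    print(f"lag {lo}-{hi} m: semivariance {gamma:.2f} ({mask.sum()} pairs)")
```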

  7. Extending the LWS Data Environment: Distributed Data Processing and Analysis

    Science.gov (United States)

    Narock, Thomas

    2005-01-01

    The final stages of this work saw changes to the original framework, as well as the completion and integration of several data processing services. Initially, it was thought that a peer-to-peer architecture was necessary to make this work possible. The peer-to-peer architecture provided many benefits including the dynamic discovery of new services that would be continually added. A prototype example was built and while it showed promise, a major disadvantage was seen in that it was not easily integrated into the existing data environment. While the peer-to-peer system worked well for finding and accessing distributed data processing services, it was found that its use was limited by the difficulty in calling it from existing tools and services. After collaborations with members of the data community, it was determined that our data processing system was of high value and that a new interface should be pursued in order for the community to take full advantage of it. As such, the framework was modified from a peer-to-peer architecture to a more traditional web service approach. Following this change, multiple data processing services were added. These services include such things as coordinate transformations and subsetting of data. A collaboration with the Virtual Heliospheric Observatory (VHO) assisted with integrating the new architecture into the VHO. This allows anyone using the VHO to search for data and then pass that data through our processing services prior to downloading it. As a second attempt at demonstrating the new system, a collaboration was established with the Collaborative Sun Earth Connector (CoSEC) group at Lockheed Martin. This group is working on a graphical user interface to the Virtual Observatories and data processing software. The intent is to provide a high-level easy-to-use graphical interface that will allow access to the existing Virtual Observatories and data processing services from one convenient application. Working with the CoSEC group we provided access to our data

  8. Seedling establishment and distribution of direct radiation in slit-shaped openings of Norway spruce forests in the intermediate Alps

    International Nuclear Information System (INIS)

    Brang, P.

    1996-01-01

    Direct radiation is crucial for Norway spruce (Picea abies (L.) Karst.) seedling establishment in high-montane and subalpine spruce forests. Fisheye photography was used to estimate the daily distribution of direct radiation in small forest openings on a north-northwest and a south facing slope near Sedrun (Grisons, Switzerland). In slit-shaped openings on the north-northwest facing slope long sunflecks mostly occurred in the afternoon, when the sun shines parallel to the slit axis. This is in accordance to the silvicultural intention. However, since the stands are clumpy and therefore pervious to sunlight, the daily sunfleck distribution is fairly even notwithstanding the slit orientation, and direct radiation at noon is the dominant form of incident energy. In small circular to rectangular openings on the south facing slope direct radiation peaks at noontide. A seeding trial imitating natural seedling establishment was set in place in openings on both slopes. Based on this trial, the relations among seedling establishment, aspect, slit shape, size, and orientation are discussed for Norway spruce forests in the intermediate Alps. The directional weather factors such as radiation and precipitation can be highly influenced by slits, which is why suitable microclimate for seedling establishment can be promoted provided the slits are oriented appropriately. Slits in which the most insolated edges are oriented windward are especially favourable

  9. Distribution of green open space in Malang City based on multispectral data

    Science.gov (United States)

    Hasyim, A. W.; Hernawan, F. P.

    2017-06-01

    Green open space is a type of land use whose existence is quite important in urban areas, where its minimum area is set at 30% of the total area of the city. Malang, which has an area of 110.6 square kilometers, is one of the major cities in East Java Province that is prone to land conversion due to development needs. In support of the green space program, green space must be calculated precisely, so remote sensing, which has high accuracy, is now used for its measurement. This study aims to analyze the area of green open space in Malang by using a Landsat 8 image from 2015. The method used was a vegetation index, namely the Normalized Difference Vegetation Index (NDVI). The study found that the calculation of green open space was better done with the vegetation index method, to avoid misclassification of other types of land use. The results of the calculation of green open space using NDVI showed that the area of green open space in Malang City in 2015 reached 39% of the total area.
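
    For reference, the NDVI computation the study relies on is sketched below, with synthetic reflectance arrays standing in for the Landsat 8 red (band 4) and near-infrared (band 5) bands; the 0.3 vegetation threshold is an assumption made for illustration, not a value from the study.

```python
# Minimal sketch of the NDVI calculation described above:
# NDVI = (NIR - Red) / (NIR + Red), using Landsat 8 band 5 (NIR) and band 4 (Red).
# Synthetic reflectance arrays stand in for bands read from a real image
# (e.g. with rasterio); the 0.3 vegetation threshold is an assumption.
import numpy as np

rng = np.random.default_rng(4)
red = rng.uniform(0.02, 0.3, size=(500, 500))     # band 4 surface reflectance
nir = rng.uniform(0.05, 0.6, size=(500, 500))     # band 5 surface reflectance

ndvi = (nir - red) / (nir + red)
green = ndvi > 0.3                                # assumed threshold for vegetation
print(f"green open space: {100 * green.mean():.1f}% of the scene")
```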

  10. Solid Waste Processing Center Primary Opening Cells Systems, Equipment and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, Sharon A.; Baker, Carl P.; Mullen, O Dennis; Valdez, Patrick LJ

    2006-04-17

    This document addresses the remote systems and design integration aspects of the development of the Solid Waste Processing Center (SWPC), a facility to remotely open, sort, size reduce, and repackage mixed low-level waste (MLLW) and transuranic (TRU)/TRU mixed waste that is either contact-handled (CH) waste in large containers or remote-handled (RH) waste in various-sized packages.

  11. Modelling of Wheat-Flour Dough Mixing as an Open-Loop Hysteretic Process

    Czech Academy of Sciences Publication Activity Database

    Anderssen, R.; Kružík, Martin

    2013-01-01

    Roč. 18, č. 2 (2013), s. 283-293 ISSN 1531-3492 R&D Projects: GA AV ČR IAA100750802 Keywords : Dissipation * Dough mixing * Rate-independent systems Subject RIV: BA - General Mathematics Impact factor: 0.628, year: 2013 http://library.utia.cas.cz/separaty/2013/MTR/kruzik-modelling of wheat-flour dough mixing as an open-loop hysteretic process.pdf

  12. Crowd innovation : The role of uncertainty for opening up the innovation process in the public sector

    OpenAIRE

    Collm, Alexandra; Schedler, Kuno

    2011-01-01

    Innovations are complex processes that can be created internally, caused externally or generated collectively with stakeholders. Integrating crowdsourcing and open innovation and supported by Web 2.0 technologies, a new innovation practice, crowd innovation, has emerged. In this paper, we illustrate empirically the practice of crowd innovation and discuss institutional obstacles, which exist for implementing crowd innovation in the public sector. Referring to the normative mode of publicness ...

  13. Calculation of the spallation product distribution in the evaporation process

    International Nuclear Information System (INIS)

    Nishida, T.; Kanno, I.; Nakahara, Y.; Takada, H.

    1989-01-01

    Some investigations are performed for the calculational model of nuclear spallation reaction in the evaporation process. A new version of a spallation reaction simulation code NUCLEUS has been developed by incorporating the newly revised Uno & Yamada's mass formula and extending the counting region of produced nuclei. The differences between the new and original mass formulas are shown in the comparisons of mass excess values. The distributions of spallation products of a uranium target nucleus bombarded by energy (0.38 - 2.9 GeV) protons have been calculated with the new and original versions of NUCLEUS. In the fission component Uno & Yamada's mass formula reproduces the measured data obtained from thin foil experiments significantly better, especially in the neutron excess side, than the combination of the Cameron's mass formula and the mass table compiled by Wapstra, et al., in the original version of NUCLEUS. Discussions are also made on how the mass-yield distribution of products varies dependent on the level density parameter α characterizing the particle evaporation. 16 refs., 7 figs., 1 tab

  14. Calculation of the spallation product distribution in the evaporation process

    International Nuclear Information System (INIS)

    Nishida, T.; Kanno, I.; Nakahara, Y.; Takada, H.

    1989-01-01

    Some investigations are performed for the calculational model of nuclear spallation reaction in the evaporation process. A new version of a spallation reaction simulation code NUCLEUS has been developed by incorporating the newly revised Uno and Yamada's mass formula and extending the counting region of produced nuclei. The differences between the new and original mass formulas are shown in the comparisons of mass excess values. The distributions of spallation products of a uranium target nucleus bombarded by energy (0.38 - 2.9 GeV) protons have been calculated with the new and original versions of NUCLEUS. In the fission component Uno and Yamada's mass formula reproduces the measured data obtained from thin foil experiments significantly better, especially in the neutron excess side, than the combination of the Cameron's mass formula and the mass table compiled by Wapstra, et al., in the original version of NUCLEUS. Discussions are also made on how the mass-yield distribution of products varies dependent on the level density parameter α characterizing the particle evaporation. (author)

  15. Fuel distribution process risk analysis in East Borneo

    Directory of Open Access Journals (Sweden)

    Laksmita Raizsa

    2018-01-01

    Full Text Available Fuel distribution is an important aspect of fulfilling the customer's need. It is risky because tardiness in delivery can cause fuel scarcity. In the distribution process, many risks occur. House of Risk is a method used for mitigating the risk. It identifies seven risk events and nine risk agents. An occurrence and severity matrix is used to eliminate the minor-impact risks. House of Risk 1 is used for determining the Aggregate Risk Potential (ARP). A Pareto diagram is applied to prioritize, based on ARP, the risks that must be mitigated by preventive actions. It identifies 4 priority risks, namely A8 (car trouble), A4 (human error), A3 (error in deposit via bank and underpayment), and A6 (traffic accident), which should be mitigated. House of Risk 2 maps the preventive actions to the risk agents. It gives the Effectiveness to Difficulty Ratio (ETD) for each mitigating action. Conducting a routine safety talk once every three days, with an ETD of 2088, is the primary preventive action.

  16. Quantifying evenly distributed states in exclusion and nonexclusion processes

    Science.gov (United States)

    Binder, Benjamin J.; Landman, Kerry A.

    2011-04-01

    Spatial-point data sets, generated from a wide range of physical systems and mathematical models, can be analyzed by counting the number of objects in equally sized bins. We find that the bin counts are related to the Pólya distribution. New measures are developed which indicate whether or not a spatial data set, generated from an exclusion process, is at its most evenly distributed state, the complete spatial randomness (CSR) state. To this end, we define an index in terms of the variance between the bin counts. Limiting values of the index are determined when objects have access to the entire domain and when there are subregions of the domain that are inaccessible to objects. Using three case studies (Lagrangian fluid particles in chaotic laminar flows, cellular automata agents in discrete models, and biological cells within colonies), we calculate the indexes and verify that our theoretical CSR limit accurately predicts the state of the system. These measures should prove useful in many biological applications.
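
    The abstract does not give the exact definition of the index, so the sketch below uses a simple stand-in: the variance of equal-bin counts compared with the variance expected under complete spatial randomness, where each of n objects falls independently into one of m bins.

```python
# Minimal sketch of the bin-counting idea described above. The published index is
# not reproduced here; as a simple stand-in, the observed variance of the bin
# counts is compared with the variance expected under complete spatial randomness
# (CSR), where counts are Binomial(n, 1/m).
import numpy as np

rng = np.random.default_rng(5)
n_objects, n_bins = 10_000, 100

positions = rng.uniform(0, 1, n_objects)               # CSR-like point positions
counts = np.histogram(positions, bins=n_bins, range=(0, 1))[0]

observed_var = counts.var(ddof=0)
csr_var = n_objects * (1 / n_bins) * (1 - 1 / n_bins)  # binomial variance under CSR
index = observed_var / csr_var                         # ~1 for CSR, <1 more even, >1 clustered
print(f"variance index: {index:.2f}")
```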

  17. Radon: Chemical and physical processes associated with its distribution

    International Nuclear Information System (INIS)

    Castleman, A.W. Jr.

    1992-01-01

    Assessing the mechanisms which govern the distribution, fate, and pathways of entry into biological systems, as well as the ultimate hazards associated with the radon progeny and their secondary reaction products, depends on knowledge of their chemistry. Our studies are directed toward developing fundamental information which will provide a basis for modeling studies that are requisite in obtaining a complete picture of growth, attachment to aerosols, and transport to the bioreceptor and ultimate incorporation within. Our program is divided into three major areas of research. These include measurement of the determination of their mobilities, study of the role of radon progeny ions in affecting reactions, including study of the influence of the degree of solvation (clustering), and examination of the important secondary reaction products, with particular attention to processes leading to chemical conversion of either the core ions or the ligands as a function of the degree of clustering

  18. Lifetime-Based Memory Management for Distributed Data Processing Systems

    DEFF Research Database (Denmark)

    Lu, Lu; Shi, Xuanhua; Zhou, Yongluan

    2016-01-01

    In-memory caching of intermediate data and eager combining of data in shuffle buffers have been shown to be very effective in minimizing the re-computation and I/O cost in distributed data processing systems like Spark and Flink. However, it has also been widely reported that these techniques would create a large amount of long-living data objects in the heap, which may quickly saturate the garbage collector, especially when handling a large dataset, and hence would limit the scalability of the system. To eliminate this problem, we propose a lifetime-based memory management framework, which is shown 1) to reduce the garbage collection time by up to 99.9%, 2) to achieve up to 22.7x speed up in terms of execution time in cases without data spilling and 41.6x speedup in cases with data spilling, and 3) to consume up to 46.6% less memory.

  19. An Open Source-Based Real-Time Data Processing Architecture Framework for Manufacturing Sustainability

    Directory of Open Access Journals (Sweden)

    Muhammad Syafrudin

    2017-11-01

    Full Text Available Currently, the manufacturing industry is experiencing a data-driven revolution. There are multiple processes in the manufacturing industry, and they eventually generate a large amount of data. Collecting, analyzing and storing large amounts of data is one of the key elements of the smart manufacturing industry. To ensure that all processes within the manufacturing industry function smoothly, big data processing is needed. Thus, in this study an open source-based real-time data processing (OSRDP) architecture framework was proposed. The OSRDP architecture framework consists of several open source technologies, including Apache Kafka, Apache Storm and NoSQL MongoDB, that are effective and cost efficient for real-time data processing. Several experiments and an impact analysis for manufacturing sustainability are provided. The results showed that the proposed system is capable of processing massive sensor data efficiently as the number of sensors and devices increases. In addition, data mining based on Random Forest is presented to predict the quality of products given the sensor data as input. The Random Forest successfully classifies defect and non-defect products, and achieves high accuracy compared to other data mining algorithms. This study is expected to support management in their decision-making for product quality inspection and support manufacturing sustainability.
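
    By way of illustration, a minimal sketch of the Random Forest quality-prediction step is given below using scikit-learn; the sensor features and defect labels are synthetic, and in the described system such data would arrive through the Kafka/Storm pipeline rather than as an in-memory array.

```python
# Minimal sketch of Random Forest defect classification from sensor readings,
# in the spirit of the quality-prediction step described above. Synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(6)
n_products, n_sensors = 2000, 10
X = rng.normal(size=(n_products, n_sensors))                         # sensor features
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.5, n_products) > 1)   # defect flag
y = y.astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("defect-classification accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```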

  20. Spatial Data Exploring by Satellite Image Distributed Processing

    Science.gov (United States)

    Mihon, V. D.; Colceriu, V.; Bektas, F.; Allenbach, K.; Gvilava, M.; Gorgan, D.

    2012-04-01

    Societal needs and environmental predictions encourage the development of applications oriented toward supervising and analyzing different Earth Science related phenomena. Satellite images can be explored to discover information concerning land cover, hydrology, air quality, and water and soil pollution. Spatial and environment related data can be acquired by imagery classification, consisting of data mining across the multispectral bands. The process takes into account a large set of variables such as satellite image types (e.g. MODIS, Landsat), the particular geographic area, soil composition, vegetation cover, and generally the context (e.g. clouds, snow, and season). All these specific and variable conditions require flexible tools and applications to support an optimal search for the appropriate solutions, as well as high-power computation resources. The research concerns experiments on solutions for using flexible and visual descriptions of satellite image processing over distributed infrastructures (e.g. Grid, Cloud, and GPU clusters). This presentation highlights the Grid based implementation of the GreenLand application. The GreenLand application development is based on simple, but powerful, notions of mathematical operators and workflows that are used in distributed and parallel executions over the Grid infrastructure. Currently it is used in three major case studies concerning the Istanbul geographical area, the Rioni River in Georgia, and the Black Sea catchment region. The GreenLand application offers a friendly user interface for viewing and editing workflows and operators. The description involves the basic operators provided by the GRASS [1] library as well as many other image related operators supported by the ESIP platform [2]. The processing workflows are represented as directed graphs giving the user a fast and easy way to describe complex parallel algorithms, without having any prior knowledge of any programming language or application commands

  1. Coalescent Processes with Skewed Offspring Distributions and Nonequilibrium Demography.

    Science.gov (United States)

    Matuszewski, Sebastian; Hildebrandt, Marcel E; Achaz, Guillaume; Jensen, Jeffrey D

    2018-01-01

    Nonequilibrium demography impacts coalescent genealogies leaving detectable, well-studied signatures of variation. However, similar genomic footprints are also expected under models of large reproductive skew, posing a serious problem when trying to make inference. Furthermore, current approaches consider only one of the two processes at a time, neglecting any genomic signal that could arise from their simultaneous effects, preventing the possibility of jointly inferring parameters relating to both offspring distribution and population history. Here, we develop an extended Moran model with exponential population growth, and demonstrate that the underlying ancestral process converges to a time-inhomogeneous psi-coalescent. However, by applying a nonlinear change of time scale-analogous to the Kingman coalescent-we find that the ancestral process can be rescaled to its time-homogeneous analog, allowing the process to be simulated quickly and efficiently. Furthermore, we derive analytical expressions for the expected site-frequency spectrum under the time-inhomogeneous psi-coalescent, and develop an approximate-likelihood framework for the joint estimation of the coalescent and growth parameters. By means of extensive simulation, we demonstrate that both can be estimated accurately from whole-genome data. In addition, not accounting for demography can lead to serious biases in the inferred coalescent model, with broad implications for genomic studies ranging from ecology to conservation biology. Finally, we use our method to analyze sequence data from Japanese sardine populations, and find evidence of high variation in individual reproductive success, but few signs of a recent demographic expansion. Copyright © 2018 by the Genetics Society of America.

  2. Applying SDN/OpenFlow in Virtualized LTE to support Distributed Mobility Management (DMM)

    NARCIS (Netherlands)

    Karimzadeh Motallebi Azar, Morteza; Valtulina, Luca; Karagiannis, Georgios

    2014-01-01

    Distributed Mobility Management (DMM) is a mobility management solution, where the mobility anchors are distributed instead of being centralized. The use of DMM can be applied in cloud-based (virtualized) Long Term Evolution (LTE) mobile network environments to (1) provide session continuity to

  3. The Marginal Distributions of a Crossing Time and Renewal Numbers Related with Two Poisson Processes are as Ph-Distributions

    Directory of Open Access Journals (Sweden)

    Mir G. H. Talpur

    2006-01-01

    Full Text Available In this paper we consider how to find the marginal distributions of a crossing time and renewal numbers related with two Poisson processes by using probability arguments. The obtained results show that the one-dimensional marginal distributions are PH-distributions of order N+1.

  4. Investigation and Evaluation of the open source ETL tools GeoKettle and Talend Open Studio in terms of their ability to process spatial data

    Science.gov (United States)

    Kuhnert, Kristin; Quedenau, Jörn

    2016-04-01

    Integration and harmonization of large spatial data sets is not only since the introduction of the spatial data infrastructure INSPIRE a big issue. The process of extracting and combining spatial data from heterogeneous source formats, transforming that data to obtain the required quality for particular purposes and loading it into a data store, are common tasks. The procedure of Extraction, Transformation and Loading of data is called ETL process. Geographic Information Systems (GIS) can take over many of these tasks but often they are not suitable for processing large datasets. ETL tools can make the implementation and execution of ETL processes convenient and efficient. One reason for choosing ETL tools for data integration is that they ease maintenance because of a clear (graphical) presentation of the transformation steps. Developers and administrators are provided with tools for identification of errors, analyzing processing performance and managing the execution of ETL processes. Another benefit of ETL tools is that for most tasks no or only little scripting skills are required so that also researchers without programming background can easily work with it. Investigations on ETL tools for business approaches are available for a long time. However, little work has been published on the capabilities of those tools to handle spatial data. In this work, we review and compare the open source ETL tools GeoKettle and Talend Open Studio in terms of processing spatial data sets of different formats. For evaluation, ETL processes are performed with both software packages based on air quality data measured during the BÄRLIN2014 Campaign initiated by the Institute for Advanced Sustainability Studies (IASS). The aim of the BÄRLIN2014 Campaign is to better understand the sources and distribution of particulate matter in Berlin. The air quality data are available in heterogeneous formats because they were measured with different instruments. For further data analysis

  5. Being or Becoming: Toward an Open-System, Process-Centric Model of Personality.

    Science.gov (United States)

    Giordano, Peter J

    2015-12-01

    Mainstream personality psychology in the West neglects the investigation of intra-individual process and variation, because it favors a Being over a Becoming ontology. A Being ontology privileges a structural (e.g., traits or selves) conception of personality. Structure-centric models in turn suggest nomothetic research strategies and the investigation of individual and group differences. This article argues for an open-system, process-centric understanding of personality anchored in an ontology of Becoming. A classical Confucian model of personality is offered as an example of a process-centric approach for investigating and appreciating within-person personality process and variation. Both quantitative and qualitative idiographic strategies can be used as methods of scientific inquiry, particularly the exploration of the Confucian exemplar of psychological health and well-being.

  6. Study of cache performance in distributed environment for data processing

    International Nuclear Information System (INIS)

    Makatun, Dzmitry; Lauret, Jérôme; Šumbera, Michal

    2014-01-01

    Processing data in distributed environment has found its application in many fields of science (Nuclear and Particle Physics (NPP), astronomy, biology to name only those). Efficiently transferring data between sites is an essential part of such processing. The implementation of caching strategies in data transfer software and tools, such as the Reasoner for Intelligent File Transfer (RIFT) being developed in the STAR collaboration, can significantly decrease network load and waiting time by reusing the knowledge of data provenance as well as data placed in transfer cache to further expand on the availability of sources for files and data-sets. Though, a great variety of caching algorithms is known, a study is needed to evaluate which one can deliver the best performance in data access considering the realistic demand patterns. Records of access to the complete data-sets of NPP experiments were analyzed and used as input for computer simulations. Series of simulations were done in order to estimate the possible cache hits and cache hits per byte for known caching algorithms. The simulations were done for cache of different sizes within interval 0.001 – 90% of complete data-set and low-watermark within 0-90%. Records of data access were taken from several experiments and within different time intervals in order to validate the results. In this paper, we will discuss the different data caching strategies from canonical algorithms to hybrid cache strategies, present the results of our simulations for the diverse algorithms, debate and identify the choice for the best algorithm in the context of Physics Data analysis in NPP. While the results of those studies have been implemented in RIFT, they can also be used when setting up cache in any other computational work-flow (Cloud processing for example) or managing data storages with partial replicas of the entire data-set
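
    To make the idea concrete, the sketch below replays a synthetic access trace through a simple LRU cache sized as a fraction of the total data set, reporting the hit rate and the share of requested bytes served from cache; the trace, file sizes, and cache fraction are invented for illustration and stand in for the real access records and the wider family of caching algorithms studied in the paper.

```python
# Minimal sketch of a cache-hit simulation over a file access trace using an LRU
# cache sized as a fraction of the total data set. All inputs are synthetic.
from collections import OrderedDict
import random

random.seed(7)
file_sizes = {f"file_{i}": random.randint(1, 100) * 10**6 for i in range(1000)}  # bytes
trace = random.choices(list(file_sizes), k=50_000)        # uniform demand; skew could be added

capacity = 0.05 * sum(file_sizes.values())                # cache = 5% of the data set
cache, used, hits, hit_bytes = OrderedDict(), 0, 0, 0

for name in trace:
    size = file_sizes[name]
    if name in cache:
        cache.move_to_end(name)                           # refresh LRU position
        hits += 1
        hit_bytes += size
    else:
        while used + size > capacity and cache:           # evict least recently used
            _, evicted_size = cache.popitem(last=False)
            used -= evicted_size
        cache[name] = size
        used += size

total_requested = sum(file_sizes[n] for n in trace)
print(f"hit rate: {hits / len(trace):.2%}, "
      f"bytes served from cache: {hit_bytes / total_requested:.2%}")
```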

  7. Process-orientated psychoanalytic work in initial interviews and the importance of the opening scene.

    Science.gov (United States)

    Wegner, Peter

    2014-06-01

    From the very first moment of the initial interview to the end of a long course of psychoanalysis, the unconscious exchange between analysand and analyst, and the analysis of the relationship between transference and countertransference, are at the heart of psychoanalytic work. Drawing on initial interviews with a psychosomatically and depressively ill student, a psychoanalytic understanding of initial encounters is worked out. The opening scene of the first interview already condenses the central psychopathology - a clinging to the primary object because it was never securely experienced as present by the patient. The author outlines the development of some psychoanalytic theories concerning the initial interview and demonstrates their specific importance as background knowledge for the clinical situation in the following domains: the 'diagnostic position', the 'therapeutic position', the 'opening scene', the 'countertransference' and the 'analyst's free-floating introspectiveness'. More recent investigations refer to 'process qualities' of the analytic relationship, such as 'synchronization' and 'self-efficacy'. The latter seeks to describe after how much time between the interview sessions constructive or destructive inner processes gain ground in the patient and what significance this may have for the decision about the treatment that follows. All these factors combined can lead to establishing a differential process-orientated indication that also takes account of the fact that being confronted with the fear of unconscious processes of exchange is specific to the psychoanalytic profession. Copyright © 2014 Institute of Psychoanalysis.

  8. Dynamical Processes in Open Quantum Systems from a TDDFT Perspective: Resonances and Electron Photoemission.

    Science.gov (United States)

    Larsen, Ask Hjorth; De Giovannini, Umberto; Rubio, Angel

    2016-01-01

    We present a review of different computational methods to describe time-dependent phenomena in open quantum systems and their extension to a density-functional framework. We focus the discussion on electron emission processes in atoms and molecules addressing excited-state lifetimes and dissipative processes. Initially we analyze the concept of an electronic resonance, a central concept in spectroscopy associated with a metastable state from which an electron eventually escapes (electronic lifetime). Resonances play a fundamental role in many time-dependent molecular phenomena but can be rationalized from a time-independent context in terms of scattering states. We introduce the method of complex scaling, which is used to capture resonant states as localized states in the spirit of usual bound-state methods, and work on its extension to static and time-dependent density-functional theory. In a time-dependent setting, complex scaling can be used to describe excitations in the continuum as well as wave packet dynamics leading to electron emission. This process can also be treated by using open boundary conditions which allow time-dependent simulations of emission processes without artificial reflections at the boundaries (i.e., borders of the simulation box). We compare in detail different schemes to implement open boundaries, namely transparent boundaries using Green functions, and absorbing boundaries in the form of complex absorbing potentials and mask functions. The last two are regularly used together with time-dependent density-functional theory to describe the electron emission dynamics of atoms and molecules. Finally, we discuss approaches to the calculation of energy and angle-resolved time-dependent pump-probe photoelectron spectroscopy of molecular systems.

  9. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data.

    Science.gov (United States)

    Jaitly, Navdeep; Mayampurath, Anoop; Littlefield, Kyle; Adkins, Joshua N; Anderson, Gordon A; Smith, Richard D

    2009-03-17

    Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high performance chart controls. With a variety of options that include peak processing, deisotoping, isotope composition, etc, Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to identify and quantify peptides in the

  10. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data

    Directory of Open Access Journals (Sweden)

    Anderson Gordon A

    2009-03-01

    Full Text Available Abstract Background Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high performance chart controls. Results With a variety of options that include peak processing, deisotoping, isotope composition, etc, Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to

  11. Stationary distributions of stochastic processes described by a linear neutral delay differential equation

    International Nuclear Information System (INIS)

    Frank, T D

    2005-01-01

    Stationary distributions of processes are derived that involve a time delay and are defined by a linear stochastic neutral delay differential equation. The distributions are Gaussian distributions. The variances of the Gaussian distributions are either monotonically increasing or decreasing functions of the time delays. The variances become infinite when fixed points of corresponding deterministic processes become unstable. (letter to the editor)

  12. Calibration process of highly parameterized semi-distributed hydrological model

    Science.gov (United States)

    Vidmar, Andrej; Brilly, Mitja

    2017-04-01

    Hydrological phenomena take place in the hydrological system, which is governed by nature, and are essentially stochastic. These phenomena are unique, non-recurring, and changeable across space and time. Since any river basin, with its own natural characteristics, and any hydrological event therein are unique, this is a complex process that is not researched enough. Calibration is a procedure for determining the parameters of a model that are not known well enough. Input and output variables and mathematical model expressions are known, while only some parameters are unknown; these are determined by calibrating the model. The software used for hydrological modelling nowadays is equipped with sophisticated calibration algorithms that leave the modeller little possibility to manage the process, and the results are often not the best. We developed a procedure for an expert-driven calibration process. We use the HBV-light-CLI hydrological model, which has a command line interface, and couple it with PEST. PEST is a parameter estimation tool that is widely used in groundwater modelling and can also be used on surface waters. A calibration process managed directly by an expert affects the outcome of the inversion procedure in proportion to the expert's knowledge, and achieves better results than if the procedure had been left to the selected optimization algorithm. The first step is to properly define the spatial characteristics and structural design of the semi-distributed model, including all morphological and hydrological phenomena, such as karstic, alluvial and forest areas. This step requires the geological, meteorological, hydraulic and hydrological knowledge of the modeller. The second step is to set the initial parameter values at their preferred values based on expert knowledge. In this step we also define all parameter and observation groups. Peak data are essential in the calibration process if we are mainly interested in flood events. Each sub-catchment in the model has its own observations group

  13. Classification of bacterial contamination using image processing and distributed computing.

    Science.gov (United States)

    Ahmed, W M; Bayraktar, B; Bhunia, A; Hirleman, E D; Robinson, J P; Rajwa, B

    2013-01-01

    Disease outbreaks due to contaminated food are a major concern not only for the food-processing industry but also for the public at large. Techniques for automated detection and classification of microorganisms can be a great help in preventing outbreaks and maintaining the safety of the nation's food supply. Identification and classification of foodborne pathogens using colony scatter patterns is a promising new label-free technique that utilizes image-analysis and machine-learning tools. However, the feature-extraction tools employed for this approach are computationally complex, and choosing the right combination of scatter-related features requires extensive testing with different feature combinations. In the presented work we used computer clusters to speed up the feature-extraction process, which enabled us to analyze the contribution of different scatter-based features to the overall classification accuracy. A set of 1000 scatter patterns representing ten different bacterial strains was used. Zernike and Chebyshev moments as well as Haralick texture features were computed from the available light-scatter patterns. The most promising features were first selected using Fisher's discriminant analysis, and subsequently a support-vector-machine (SVM) classifier with a linear kernel was used. With extensive testing we were able to identify a small subset of features that produced the desired results in terms of classification accuracy and execution speed. The use of distributed computing for scatter-pattern analysis, feature extraction, and selection provides a feasible mechanism for large-scale deployment of a light scatter-based approach to bacterial classification.
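
    A minimal sketch of this kind of feature-selection-plus-linear-SVM pipeline is given below with scikit-learn; the synthetic features stand in for Zernike/Chebyshev moments and Haralick textures, and univariate F-score selection is used here merely as a stand-in for the Fisher discriminant criterion described in the abstract.

```python
# Minimal sketch of the classification pipeline described above: select a small
# subset of scatter-pattern features and classify strains with a linear-kernel SVM.
# Features and labels are synthetic; F-score selection replaces Fisher's criterion.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(8)
n_patterns, n_features, n_strains = 1000, 60, 10
X = rng.normal(size=(n_patterns, n_features))
y = rng.integers(0, n_strains, n_patterns)
X[np.arange(n_patterns), y] += 2.0            # make the first 10 features informative

pipe = make_pipeline(StandardScaler(),
                     SelectKBest(f_classif, k=15),
                     SVC(kernel="linear"))
scores = cross_val_score(pipe, X, y, cv=5)
print("cross-validated accuracy:", scores.mean().round(3))
```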

  14. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Sanggoo Kang

    2016-08-01

    Full Text Available Cloud computing is a base platform for the distribution of large volumes of data and high-performance image processing on the Web. Despite wide applications in Web-based services and their many benefits, geo-spatial applications based on cloud computing technology are still developing. Auto-scaling realizes automatic scalability, i.e., the scale-out and scale-in processing of virtual servers in a cloud computing environment. This study investigates the applicability of auto-scaling to geo-based image processing algorithms by comparing the performance of a single virtual server and multiple auto-scaled virtual servers under identical experimental conditions. In this study, the cloud computing environment is built with OpenStack, and four algorithms from the Orfeo toolbox are used for practical geo-based image processing experiments. The auto-scaling results from all experimental performance tests demonstrate applicable significance with respect to cloud utilization concerning response time. Auto-scaling contributes to the development of web-based satellite image application services using cloud-based technologies.

  15. Does ego development increase during midlife? The effects of openness and accommodative processing of difficult events.

    Science.gov (United States)

    Lilgendahl, Jennifer Pals; Helson, Ravenna; John, Oliver P

    2013-08-01

    Although Loevinger's model of ego development is a theory of personality growth, there are few studies that have examined age-related change in ego level over developmentally significant periods of adulthood. To address this gap in the literature, we examined mean-level change and individual differences in change in ego level over 18 years of midlife. In this longitudinal study, participants were 79 predominantly White, college-educated women who completed the Washington University Sentence Completion Test in early (age 43) and late (age 61) midlife as well as measures of the trait of Openness (ages 21, 43, 52, and 61) and accommodative processing (assessed from narratives of difficult life events at age 52). As hypothesized, the sample overall showed a mean-level increase in ego level from age 43 to age 61. Additionally, a regression analysis showed that both the trait of Openness at age 21 and accommodative processing of difficult events that occurred during (as opposed to prior to) midlife were each predictive of increasing ego level from age 43 to age 61. These findings counter prior claims that ego level remains stable during adulthood and contribute to our understanding of the underlying processes involved in personality growth in midlife. © 2012 Wiley Periodicals, Inc.

  16. The WIPP decision plan: Charting the course for openness in the decision making process

    International Nuclear Information System (INIS)

    Hagers, J.

    1992-01-01

    In June of 1989, the Secretary of Energy requested that a plan be developed that would clearly outline the prerequisites to opening the Waste Isolation Pilot Plant (WIPP). It was to provide the basis for a decision making process that was not only visible to the public, but one which included public participation. It must also be dynamic enough to effectively deal with the changing legislative, regulatory, and technical environments. Based on a recognized need for openness, the Secretary's Draft Decision Plan was developed. The plan charted the course for ultimately making the decision to declare WIPP ready to receive waste for the start of test phase operations. It outlined to critics and supporters alike the rigorous and thorough process by which the internal decisions were made. The plan identified all internal prerequisites to the decision; charted the review cycles, and targeted the completion dates. It also outlined the processes outside the control of the Department, institutional issues, such as legislative land withdrawal, issuance of permits, and designation of transportation routes

  17. Parallel processing implementation for the coupled transport of photons and electrons using OpenMP

    Science.gov (United States)

    Doerner, Edgardo

    2016-05-01

    In this work the use of OpenMP to implement the parallel processing of the Monte Carlo (MC) simulation of the coupled transport for photons and electrons is presented. This implementation was carried out using a modified EGSnrc platform which enables the use of the Microsoft Visual Studio 2013 (VS2013) environment, together with the developing tools available in the Intel Parallel Studio XE 2015 (XE2015). The performance study of this new implementation was carried out in a desktop PC with a multi-core CPU, taking as a reference the performance of the original platform. The results were satisfactory, both in terms of scalability as parallelization efficiency.

  18. Land surface temperature distribution and development for green open space in Medan city using imagery-based satellite Landsat 8

    Science.gov (United States)

    Sulistiyono, N.; Basyuni, M.; Slamet, B.

    2018-03-01

    Green open space (GOS) is one of the requirements for a city to be comfortable to live in. GOS can reduce land surface temperature (LST) and air pollution. Medan is one of the biggest cities in Indonesia and has experienced rapid development. However, this early development tended to neglect the existence of GOS in the city. The objective of the study is to determine the distribution of land surface temperature and the relationship between the normalized difference vegetation index (NDVI) and the priority of GOS development in Medan City using Landsat 8 satellite imagery. The approach was to correlate the distribution of land surface temperature, derived from the digital number values of band 10, with the NDVI, computed from the ratio of bands five and four of the Landsat 8 images. The results showed that the distribution of land surface temperature in Medan City in 2016 ranged from 20.57 to 33.83 °C. The relationship between the LST distribution and NDVI was inverse, with a negative correlation of -0.543 (sig 0.000). The development of GOS in Medan City is therefore directed by the LST distribution and divided into three priority classes: the first priority class covers 5,119.71 ha, the second priority 16,935.76 ha, and the third priority 6,118.50 ha.
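
    For orientation, the sketch below shows the standard Landsat 8 band-10 conversion from digital numbers to brightness temperature, followed by a correlation with NDVI; the calibration constants are typical published band-10 values and should in practice be read from the scene's MTL metadata, and the input arrays are synthetic.

```python
# Minimal sketch of a band-10 temperature proxy and its correlation with NDVI,
# as in the study above: DN -> TOA radiance -> brightness temperature.
# Calibration constants are typical Landsat 8 band-10 values (take them from the
# scene's MTL metadata in practice); input arrays below are synthetic.
import numpy as np

rng = np.random.default_rng(9)
dn_b10 = rng.integers(20_000, 30_000, size=100_000).astype(float)  # band 10 DN
ndvi = rng.uniform(-0.1, 0.8, size=100_000)                        # from bands 5 and 4

ML, AL = 3.342e-4, 0.1               # RADIANCE_MULT/ADD_BAND_10 (from metadata)
K1, K2 = 774.8853, 1321.0789         # band 10 thermal constants (from metadata)

radiance = ML * dn_b10 + AL
bt_celsius = K2 / np.log(K1 / radiance + 1.0) - 273.15             # brightness temperature

r = np.corrcoef(bt_celsius, ndvi)[0, 1]
print(f"correlation between temperature and NDVI: {r:.3f}")
```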

  19. A novel bio-safe phase separation process for preparing open-pore biodegradable polycaprolactone microparticles.

    Science.gov (United States)

    Salerno, Aurelio; Domingo, Concepción

    2014-09-01

    Open-pore biodegradable microparticles are object of considerable interest for biomedical applications, particularly as cell and drug delivery carriers in tissue engineering and health care treatments. Furthermore, the engineering of microparticles with well definite size distribution and pore architecture by bio-safe fabrication routes is crucial to avoid the use of toxic compounds potentially harmful to cells and biological tissues. To achieve this important issue, in the present study a straightforward and bio-safe approach for fabricating porous biodegradable microparticles with controlled morphological and structural features down to the nanometer scale is developed. In particular, ethyl lactate is used as a non-toxic solvent for polycaprolactone particles fabrication via a thermal induced phase separation technique. The used approach allows achieving open-pore particles with mean particle size in the 150-250 μm range and a 3.5-7.9 m(2)/g specific surface area. Finally, the combination of thermal induced phase separation and porogen leaching techniques is employed for the first time to obtain multi-scaled porous microparticles with large external and internal pore sizes and potential improved characteristics for cell culture and tissue engineering. Samples were characterized to assess their thermal properties, morphology and crystalline structure features and textural properties. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. ACCESS TO PUBLIC OPEN SPACE: IS DISTRIBUTION EQUITABLE ACROSS DIFFERENT SOCIO-ECONOMIC AREAS

    Directory of Open Access Journals (Sweden)

    Mohammad Javad Koohsari

    2011-12-01

    Full Text Available During the past decade, the role of the built environment in physical activity has been well investigated by public health, transportation and urban design scholars, and it has been shown that different aspects of the built environment can influence physical activity. Public open spaces (POS) like parks have many health benefits, and they can be important settings and destinations for physical activity. Inequality in access to POS, which may influence the amount of physical activity, can be a reason for lower physical activity among deprived neighbourhoods. This paper aims to examine whether objective access to public open spaces (POS) like parks is distributed equally across different socio-economic status (SES) areas in the City of Melbourne. Objective access to POS was measured in network distance using geographic information systems (GIS), and area SES was obtained using the SEIFA (Socio-Economic Indexes for Areas) index. The results showed there was a significant difference in access to POS according to the SES areas. There was a significant negative correlation between access to POS and area SES, in which lower SES areas had poorer access to POS in comparison with the higher ones.

  1. Fractal-Markovian scaling of turbulent bursting process in open channel flow

    International Nuclear Information System (INIS)

    Keshavarzi, Ali Reza; Ziaei, Ali Naghi; Homayoun, Emdad; Shirvani, Amin

    2005-01-01

    The turbulent coherent structure of flow in open channels is a chaotic and stochastic process in nature. The coherent structure of the flow, or bursting process, consists of a series of eddies with a variety of different length scales and is very important for the entrainment of sediment particles from the bed. In this study, a fractal-Markovian process is applied to turbulence data measured in an open channel. The turbulence data were measured in an experimental flume using a three-dimensional acoustic Doppler velocimeter (ADV). A fractal interpolation function (FIF) algorithm was used to simulate more than 500,000 time series data of measured instantaneous velocity fluctuations and Reynolds shear stress. The fractal interpolation functions (FIF) make it possible to simulate and construct time series of u', v', and u'v' for any particular movement and state in the Markov process. The fractal dimension of the bursting events is calculated for 16 particular movements together with the transition probabilities of the events based on a 1st order Markov process. It was found that the average fractal dimensions of the streamwise flow velocity (u') are 1.73, 1.74, 1.71 and 1.74, with transition probabilities of 60.82%, 63.77%, 59.23% and 62.09% for the 1-1, 2-2, 3-3 and 4-4 movements, respectively. It was also found that the fractal dimensions of the Reynolds stress u'v' for quadrants 1, 2, 3 and 4 are 1.623, 1.623, 1.625 and 1.618, respectively.
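
    To make the quadrant bookkeeping behind such an analysis concrete, the following Python sketch classifies each (u', v') sample into the four bursting quadrants and estimates a first-order Markov transition matrix between consecutive events. It illustrates only the counting step; the fractal interpolation part of the study is not reproduced.

        import numpy as np

        def quadrant(u_prime, v_prime):
            # Q1 outward interaction, Q2 ejection, Q3 inward interaction, Q4 sweep
            if u_prime >= 0 and v_prime >= 0:
                return 0
            if u_prime < 0 and v_prime >= 0:
                return 1
            if u_prime < 0 and v_prime < 0:
                return 2
            return 3

        def transition_matrix(u, v):
            # first-order Markov transition probabilities between consecutive quadrant events
            states = np.array([quadrant(ui, vi) for ui, vi in zip(u, v)])
            counts = np.zeros((4, 4))
            for a, b in zip(states[:-1], states[1:]):
                counts[a, b] += 1
            return counts / counts.sum(axis=1, keepdims=True)

    The diagonal entries of the resulting matrix correspond to the 1-1, 2-2, 3-3 and 4-4 movement probabilities quoted in the abstract.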

  2. THE UNIQUE Na:O ABUNDANCE DISTRIBUTION IN NGC 6791: THE FIRST OPEN(?) CLUSTER WITH MULTIPLE POPULATIONS

    International Nuclear Information System (INIS)

    Geisler, D.; Villanova, S.; Cummings, J.; Carraro, G.; Pilachowski, C.; Johnson, C. I.; Bresolin, F.

    2012-01-01

    Almost all globular clusters investigated exhibit a spread in their light element abundances, the most studied being an Na:O anticorrelation. In contrast, open clusters show a homogeneous composition and are still regarded as Simple Stellar Populations. The most probable reason for this difference is that globulars had an initial mass high enough to retain primordial gas and ejecta from the first stellar generation and thus formed a second generation with a distinct composition, an initial mass exceeding that of open clusters. NGC 6791 is a massive open cluster and warrants a detailed search for chemical inhomogeneities. We collected high-resolution, high signal-to-noise spectra of 21 members covering a wide range of evolutionary status and measured their Na, O, and Fe content. We found [Fe/H] = +0.42 ± 0.01, in good agreement with previous values, and no evidence for a spread. However, the Na:O distribution is completely unprecedented. It becomes the first open cluster to show intrinsic abundance variations that cannot be explained by mixing, and thus the first discovered to host multiple populations. It is also the first star cluster to exhibit two subpopulations in the Na:O diagram with one being chemically homogeneous while the second has an intrinsic spread that follows the anticorrelation so far displayed only by globular clusters. NGC 6791 is unique in many aspects, displaying certain characteristics typical of open clusters, others more reminiscent of globulars, and yet others, in particular its Na:O behavior investigated here, that are totally unprecedented. It clearly had a complex and fascinating history.

  3. The Unique Na:O Abundance Distribution in NGC 6791: The First Open(?) Cluster with Multiple Populations

    Science.gov (United States)

    Geisler, D.; Villanova, S.; Carraro, G.; Pilachowski, C.; Cummings, J.; Johnson, C. I.; Bresolin, F.

    2012-09-01

    Almost all globular clusters investigated exhibit a spread in their light element abundances, the most studied being an Na:O anticorrelation. In contrast, open clusters show a homogeneous composition and are still regarded as Simple Stellar Populations. The most probable reason for this difference is that globulars had an initial mass high enough to retain primordial gas and ejecta from the first stellar generation and thus formed a second generation with a distinct composition, an initial mass exceeding that of open clusters. NGC 6791 is a massive open cluster and warrants a detailed search for chemical inhomogeneities. We collected high-resolution, high signal-to-noise spectra of 21 members covering a wide range of evolutionary status and measured their Na, O, and Fe content. We found [Fe/H] = +0.42 ± 0.01, in good agreement with previous values, and no evidence for a spread. However, the Na:O distribution is completely unprecedented. It becomes the first open cluster to show intrinsic abundance variations that cannot be explained by mixing, and thus the first discovered to host multiple populations. It is also the first star cluster to exhibit two subpopulations in the Na:O diagram with one being chemically homogeneous while the second has an intrinsic spread that follows the anticorrelation so far displayed only by globular clusters. NGC 6791 is unique in many aspects, displaying certain characteristics typical of open clusters, others more reminiscent of globulars, and yet others, in particular its Na:O behavior investigated here, that are totally unprecedented. It clearly had a complex and fascinating history.

  4. Wigner distribution function and entropy of the damped harmonic oscillator within the theory of the open quantum systems

    Science.gov (United States)

    Isar, Aurelian

    1995-01-01

    The harmonic oscillator with dissipation is studied within the framework of the Lindblad theory for open quantum systems. By using the Wang-Uhlenbeck method, the Fokker-Planck equation, obtained from the master equation for the density operator, is solved for the Wigner distribution function, subject to either the Gaussian type or the delta-function type of initial conditions. The obtained Wigner functions are two-dimensional Gaussians with different widths. Then a closed expression for the density operator is extracted. The entropy of the system is subsequently calculated and its temporal behavior shows that this quantity relaxes to its equilibrium value.
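
    For readers less familiar with the formalism referred to here, the generic Markovian master equation of Lindblad form, the assumed starting point of such treatments (the specific opening operators chosen by the author are not reproduced), reads:

        \frac{d\rho}{dt} = -\frac{i}{\hbar}\,[H,\rho]
            + \sum_j \Big( L_j \rho L_j^{\dagger} - \tfrac{1}{2}\{ L_j^{\dagger} L_j,\ \rho \} \Big)

    The Wigner transform of the density operator then obeys a Fokker-Planck-type equation, whose Gaussian solutions are the distributions discussed in the abstract.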

  5. An Integrated Open Approach to Capturing Systematic Knowledge for Manufacturing Process Innovation Based on Collective Intelligence

    Directory of Open Access Journals (Sweden)

    Gangfeng Wang

    2018-02-01

    Full Text Available Process innovation plays a vital role in the manufacturing realization of increasingly complex new products, especially in the context of sustainable development and cleaner production. Knowledge-based innovation design can inspire designers’ creative thinking; however, the existing scattered knowledge has not yet been properly captured and organized for Computer-Aided Process Innovation (CAPI). Therefore, this paper proposes an integrated approach to tackle this non-trivial issue. By analyzing the design process of CAPI and the technical features of open innovation, a novel holistic paradigm of process innovation knowledge capture based on collective intelligence (PIKC-CI) is constructed from the perspective of the knowledge life cycle. Then, a multi-source innovation knowledge fusion algorithm based on semantic element reconfiguration is applied to form new public knowledge. To ensure the credibility and orderliness of innovation knowledge refinement, a collaborative editing strategy based on knowledge locks and a knowledge–social trust degree is explored. Finally, a knowledge management system, MPI-OKCS, integrating the proposed techniques is implemented on the pre-built CAPI general platform, and a welding process innovation example is provided to illustrate the feasibility of the proposed approach. It is expected that our work will lay the foundation for future knowledge-inspired CAPI and smart process planning.

  6. Baseliner: An open-source, interactive tool for processing sap flux data from thermal dissipation probes

    Directory of Open Access Journals (Sweden)

    A. Christopher Oishi

    2016-01-01

    Full Text Available Estimating transpiration from woody plants using thermal dissipation sap flux sensors requires careful data processing. Currently, researchers accomplish this using spreadsheets, or by personally writing scripts for statistical software programs (e.g., R, SAS). We developed the Baseliner software to help establish a standardized protocol for processing sap flux data. Baseliner enables users to QA/QC data and process data using a combination of automated steps, visualization, and manual editing. Data processing requires establishing a zero-flow reference value, or “baseline”, which varies among sensors and with time. Since no set of algorithms currently exists to reliably QA/QC and estimate the zero-flow baseline, Baseliner provides a graphical user interface to allow visual inspection and manipulation of data. Data are first automatically processed using a set of user-defined parameters. The user can then view the data for additional, manual QA/QC and baseline identification using mouse and keyboard commands. The open-source software allows for user customization of data processing algorithms as improved methods are developed.
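
    As context for what establishing a zero-flow baseline means numerically, the sketch below applies the widely used Granier (thermal dissipation) calibration in Python: a nightly zero-flow temperature difference is taken as the baseline and each measurement is converted to sap flux density. The calibration coefficients are the commonly cited Granier values and the daily-maximum baseline is a deliberately crude stand-in, so both are assumptions rather than Baseliner's exact processing options.

        import numpy as np

        def sap_flux_density(delta_t, delta_t_max):
            # Granier thermal-dissipation calibration (commonly cited coefficients, assumed);
            # delta_t: measured probe temperature difference, delta_t_max: zero-flow baseline
            k = (delta_t_max - delta_t) / delta_t
            k = np.clip(k, 0.0, None)   # negative K means the reading sits above the baseline
            return 119e-6 * k ** 1.231  # sap flux density in m3 m-2 s-1

        def daily_baseline(delta_t, samples_per_day):
            # crude baseline: the maximum temperature difference within each day
            days = np.array_split(delta_t, max(1, len(delta_t) // samples_per_day))
            return np.concatenate([np.full(len(d), d.max()) for d in days])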

  7. Using ISO/IEC 12207 to analyze open source software development processes: an E-learning case study

    OpenAIRE

    Krishnamurthy, Aarthy; O'Connor, Rory

    2013-01-01

    peer-reviewed To date, there is no comprehensive study of open source software development process (OSSDP) carried out for open source (OS) e-learning systems. This paper presents the work which objectively analyzes the open source software development (OSSD) practices carried out by e-learning systems development communities and their results are represented using DEMO models. These results are compared using ISO/IEC 12207:2008. The comparison of DEMO models with ISO/IEC...

  8. A model for the distribution channels planning process

    NARCIS (Netherlands)

    Neves, M.F.; Zuurbier, P.; Campomar, M.C.

    2001-01-01

    Research of existing literature reveals some models (sequence of steps) for companies that want to plan distribution channels. None of these models uses strong contributions from transaction cost economics, bringing a possibility to elaborate on a "distribution channels planning model", with these

  9. Enhancement of the efficiency of the Open Cycle Phillips Optimized Cascade LNG process

    International Nuclear Information System (INIS)

    Fahmy, M.F.M.; Nabih, H.I.; El-Nigeily, M.

    2016-01-01

    Highlights: • Expanders replaced JT valves in the Phillips Optimized Cascade liquefaction process. • Improvement in plant liquefaction efficiency was evaluated in presence of expanders. • Comparison of the different optimum cases for the liquefaction process was presented. - Abstract: This study aims to improve the performance of the Open Cycle Phillips Optimized Cascade Process for the production of liquefied natural gas (LNG) through the replacement of Joule–Thomson (JT) valves by expanders. The expander has a higher thermodynamic efficiency than the JT valve. Moreover, the produced shaft power from the expander is integrated into the process. The study is conducted using the Aspen HYSYS-V7 simulation software for simulation of the Open Cycle Phillips Optimized Cascade Process having the JT valves. Simulation of several proposed cases in which expanders are used instead of JT valves at different locations in the process as at the propane cycle, ethylene cycle, methane cycle and the upstream of the heavies removal column is conducted. The optimum cases clearly indicate that expanders not only produce power, but also offer significant improvements in the process performance as shown by the total plant power consumption, LNG production, thermal efficiency, plant specific power and CO_2 emissions reduction. Results also reveal that replacing JT valves by expanders in the methane cycle has a dominating influence on all performance criteria and hence, can be considered as the main key contributor affecting the Phillips Optimized Cascade Process leading to a notable enhancement in its efficiency. This replacement of JT valves by liquid expanders at different locations of the methane cycle encounters power savings in the range of 4.92–5.72%, plant thermal efficiency of 92.64–92.97% and an increase in LNG production of 5.77–7.04%. Moreover, applying liquid expanders at the determined optimum cases for the different cycles, improves process performance and

  10. Distributed Prognostic Health Management with Gaussian Process Regression

    Science.gov (United States)

    Saha, Sankalita; Saha, Bhaskar; Saxena, Abhinav; Goebel, Kai Frank

    2010-01-01

    Distributed prognostics architecture design is an enabling step for the efficient implementation of health management systems. A major challenge encountered in such design is the formulation of optimal distributed prognostics algorithms. In this paper, we present a distributed GPR-based prognostics algorithm whose target platform is a wireless sensor network. In addition to the challenges encountered in a distributed implementation, a wireless network poses constraints on communication patterns, thereby making the problem more challenging. The prognostics application used to demonstrate our new algorithms is battery prognostics. In order to present trade-offs among different prognostic approaches, we present a comparison with the distributed implementation of a particle-filter-based prognostics algorithm for the same battery data.
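
    As a reminder of what the GPR building block computes, the following self-contained Python sketch gives the standard Gaussian process regression posterior with a squared-exponential kernel; the distributed, wireless-sensor-network partitioning described in the paper is not reproduced here, and the battery-trend numbers are synthetic.

        import numpy as np

        def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
            d2 = (a[:, None] - b[None, :]) ** 2
            return variance * np.exp(-0.5 * d2 / length_scale ** 2)

        def gpr_predict(x_train, y_train, x_test, noise=1e-2):
            # standard GP regression posterior mean and variance
            K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
            Ks = rbf_kernel(x_train, x_test)
            Kss = rbf_kernel(x_test, x_test)
            alpha = np.linalg.solve(K, y_train)
            mean = Ks.T @ alpha
            cov = Kss - Ks.T @ np.linalg.solve(K, Ks)
            return mean, np.diag(cov)

        # e.g. extrapolating a synthetic, slowly degrading battery-capacity trend
        t = np.linspace(0, 10, 30)
        cap = 1.0 - 0.03 * t + 0.01 * np.random.randn(30)
        mu, var = gpr_predict(t, cap, np.linspace(0, 15, 50))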

  11. Distributivity of the algebra of regular open subsets of .beta. R / R

    Czech Academy of Sciences Publication Activity Database

    Balcar, Bohuslav; Hrušák, M.

    2005-01-01

    Roč. 149, č. 1 (2005), s. 1-7 ISSN 0166-8641 R&D Projects: GA ČR(CZ) GA201/03/0933; GA ČR(CZ) GA201/02/0857 Institutional research plan: CEZ:AV0Z10190503 Keywords : distributivity of Boolean algebras * cardinal invariants of the continuum * Čech-Stone compactification Subject RIV: BA - General Mathematics Impact factor: 0.297, year: 2005

  12. Development of a revolving drum reactor for open-sorption heat storage processes

    International Nuclear Information System (INIS)

    Zettl, Bernhard; Englmair, Gerald; Steinmaurer, Gerald

    2014-01-01

    To evaluate the potential of an open sorption storage process using molecular sieves to provide thermal energy for space heating and hot water, an experimental study of adsorption heat generation in a rotating reactor is presented. Dehydrated zeolite of the type 4A and MSX were used in form of spherical grains and humidified room air was blown through the rotating bed. Zeolite batches of about 50 kg were able to generate an adsorption heat up to 12 kWh and temperature shifts of the process air up to 36 K depending on the inlet air water content and the state of dehydration of the storage materials. A detailed study of the heat transfer effects, the generated adsorption heat, and the evolving temperatures show the applicability of the reactor and storage concept. - Highlights: • Use of an open adsorption concept for domestic heat supply was proved. • A rotating heat drum reactor concept was successfully applied. • Zeolite batches of 50 kg generated up to 12 kWh adsorption heat (580 kJ/kg). • Temperature shift in the rotating material bed was up to 60 K during adsorption

  13. Optimization of valve opening process for the suppression of impulse exhaust noise

    Science.gov (United States)

    Li, Jingxiang; Zhao, Shengdun

    2017-02-01

    Impulse exhaust noise generated by the sudden impact of discharging flow of pneumatic systems has significant temporal characteristics including high sound pressure and rapid sound transient. The impulse noise exposures are more hazardous to hearing than the energy equivalent uniform noise exposures. This paper presents a novel approach to suppress the peak sound pressure as a major indicator of impulsiveness of the impulse exhaust noise by an optimization of the opening process of valve. Relationships between exhaust flow and impulse noise are described by thermodynamics and noise generating mechanism. Then an optimized approach by controlling the valve opening process is derived under a constraint of pre-setting exhaust time. A modified servo-direct-driven valve was designed and assembled in a typical pneumatic system for the verification experiments comparing with an original solenoid valve. Experimental results with groups of initial cylinder pressures and pre-setting exhaust times are shown to verify the effects of the proposed optimization. Some indicators of energy-equivalent and impulsiveness are introduced to discuss the effects of the noise suppressions. Relationship between noise reduction and exhaust time delay is also discussed.

  14. Gaseous material capacity of open plasma jet in plasma spray-physical vapor deposition process

    Science.gov (United States)

    Liu, Mei-Jun; Zhang, Meng; Zhang, Qiang; Yang, Guan-Jun; Li, Cheng-Xin; Li, Chang-Jiu

    2018-01-01

    Plasma spray-physical vapor deposition (PS-PVD) process, emerging as a highly efficient hybrid approach, is based on two powerful technologies of both plasma spray and physical vapor deposition. The maximum production rate is affected by the material feed rate apparently, but it is determined by the material vapor capacity of transporting plasma actually and essentially. In order to realize high production rate, the gaseous material capacity of plasma jet must be fundamentally understood. In this study, the thermal characteristics of plasma were measured by optical emission spectrometry. The results show that the open plasma jet is in the local thermal equilibrium due to a typical electron number density from 2.1 × 1015 to 3.1 × 1015 cm-3. In this condition, the temperature of gaseous zirconia can be equal to the plasma temperature. A model was developed to obtain the vapor pressure of gaseous ZrO2 molecules as a two dimensional map of jet axis and radial position corresponding to different average plasma temperatures. The overall gaseous material capacity of open plasma jet, take zirconia for example, was further established. This approach on evaluating material capacity in plasma jet would shed light on the process optimization towards both depositing columnar coating and a high production rate of PS-PVD.

  15. An open-loop system design for deep space signal processing applications

    Science.gov (United States)

    Tang, Jifei; Xia, Lanhua; Mahapatra, Rabi

    2018-06-01

    A novel open-loop system design with high performance is proposed for space positioning and navigation signal processing. Divided by function, the system has four modules: a bandwidth-selectable data recorder, a narrowband signal analyzer, a time-delay difference of arrival estimator and an ANFIS supplement processor. A hardware-software co-design approach is adopted to accelerate computation and improve system efficiency. Embedded with the proposed signal processing algorithms, the designed system is capable of handling tasks with high accuracy over long periods of continuous measurement. The experimental results show that the Doppler frequency tracking root mean square error over a 3 h observation is 0.0128 Hz, while the TDOA residual in the correlation power spectrum is 0.1166 rad.
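
    The time-delay difference of arrival between two recorded channels is commonly obtained by cross-correlation; the Python sketch below shows that generic estimator only, not the ANFIS-augmented pipeline described in the abstract, and all signals are synthetic.

        import numpy as np

        def tdoa_estimate(sig_a, sig_b, fs):
            # delay of sig_b relative to sig_a, in seconds (positive if sig_b lags)
            xcorr = np.correlate(sig_b, sig_a, mode='full')
            lag = np.argmax(xcorr) - (len(sig_a) - 1)
            return lag / fs

        # usage: recover a 2.5 ms delay imposed on a noisy chirp
        fs = 20_000
        t = np.arange(0, 0.05, 1 / fs)
        a = np.sin(2 * np.pi * (1_000 + 5_000 * t) * t) + 0.05 * np.random.randn(t.size)
        b = np.roll(a, int(0.0025 * fs))   # sig_b lags sig_a by 2.5 ms (circular shift)
        print(tdoa_estimate(a, b, fs))     # approximately 0.0025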

  16. A Process for the Representation of openEHR ADL Archetypes in OWL Ontologies.

    Science.gov (United States)

    Porn, Alex Mateus; Peres, Leticia Mara; Didonet Del Fabro, Marcos

    2015-01-01

    ADL is a formal language to express archetypes, independent of standards or domain. However, its specification is not precise enough regarding the specialization and semantics of archetypes, which leads to implementation difficulties and few available tools. Archetypes may be implemented using other languages such as XML or OWL, increasing integration with Semantic Web tools. Exchanging and transforming data can be better implemented with semantics-oriented models, for example using OWL, a language defined by the W3C to define and instantiate Web ontologies. OWL permits the user to define significant, detailed, precise and consistent distinctions among classes, properties and relations, ensuring greater consistency of knowledge than ADL techniques. This paper presents a process for representing openEHR ADL archetypes in OWL ontologies. The process consists of converting ADL archetypes into OWL ontologies and validating the resulting OWL ontologies using mutation testing.

  17. Determination of Optimal Opening Scheme for Electromagnetic Loop Networks Based on Fuzzy Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Yang Li

    2016-01-01

    Full Text Available Studying the optimization of and decision-making for opening electromagnetic loop networks plays an important role in the planning and operation of power grids. First, the basic principle of the fuzzy analytic hierarchy process (FAHP) is introduced, and then an improved FAHP-based scheme evaluation method is proposed for decoupling electromagnetic loop networks based on a set of indicators reflecting the performance of the candidate schemes. The proposed method combines the advantages of the analytic hierarchy process (AHP) and fuzzy comprehensive evaluation. On the one hand, AHP effectively combines qualitative and quantitative analysis to ensure the rationality of the evaluation model; on the other hand, the judgment matrix and qualitative indicators are expressed with trapezoidal fuzzy numbers to make decision-making more realistic. The effectiveness of the proposed method is validated by the application results on the real power system of Liaoning province of China.
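
    For orientation, the classical (crisp) AHP step that FAHP builds on is sketched below in Python: priority weights are taken from the principal eigenvector of a pairwise comparison matrix and a consistency ratio is computed. The trapezoidal-fuzzy-number extension used in the paper is not reproduced, and the random-index table holds the commonly quoted Saaty values, so it is an assumption here.

        import numpy as np

        RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

        def ahp_weights(pairwise):
            # priority weights and consistency ratio from a Saaty pairwise comparison matrix
            A = np.asarray(pairwise, dtype=float)
            eigvals, eigvecs = np.linalg.eig(A)
            k = np.argmax(eigvals.real)
            w = np.abs(eigvecs[:, k].real)
            w /= w.sum()
            n = A.shape[0]
            ci = (eigvals[k].real - n) / (n - 1)            # consistency index
            cr = ci / RI[n] if RI.get(n, 0.0) > 0 else 0.0  # consistency ratio (< 0.1 acceptable)
            return w, cr

        # example: three candidate opening schemes compared pairwise on one criterion
        w, cr = ahp_weights([[1, 3, 5],
                             [1/3, 1, 2],
                             [1/5, 1/2, 1]])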

  18. Openness and Continuous Collaboration as the Foundation for Entrepreneurial Discovery Process in Finnish Regions

    Directory of Open Access Journals (Sweden)

    Mona ROMAN

    2017-12-01

    Full Text Available Entrepreneurial discovery process (EDP is a bottom-up process engaging regional actors from academic, business, government and civil society together to identify new market opportunities and overcome the potential barriers to innovation. While EDP forms the key principle behind the smart specialization policy of European Commission, its operationalization has remained a challenge. We adopted a grounded theory approach to explore the dynamics of EDP through a case study in Finnish regions. Our aim is to identify the key underlying factors of EDP. Based on the semi-structured interviews with 10 Finnish regions during September 2016, we identified openness, engaging, focused networking and continuous collaboration as the key factors underlying EDP. Our findings contribute to the theoretical debate on what constitutes EDP in the context of smart specialization. We also provide examples for policymakers how to implement these factors based on our case study.

  19. EFFICIENT LIDAR POINT CLOUD DATA MANAGING AND PROCESSING IN A HADOOP-BASED DISTRIBUTED FRAMEWORK

    Directory of Open Access Journals (Sweden)

    C. Wang

    2017-10-01

    Full Text Available Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering and other fields. However, it is challenging to efficiently store, query and analyze high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, which takes advantage of Hadoop’s storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to run the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experimental results show that the proposed framework can efficiently manage and process big LiDAR data.
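
    The decomposition such a framework relies on can be sketched independently of Hadoop: a map step keys each point by the spatial tile it falls in, and a reduce step runs a per-tile analysis. The pure-Python sketch below illustrates that map/reduce pattern only; it stands in for, and does not use, the actual Hadoop and PCL calls of the paper.

        from collections import defaultdict

        def mapper(points, tile_size=100.0):
            # key each (x, y, z) point by the grid tile it falls in
            for x, y, z in points:
                yield (int(x // tile_size), int(y // tile_size)), (x, y, z)

        def reducer(keyed_points):
            # group points by tile, then run a per-tile analysis (mean elevation as a stand-in)
            tiles = defaultdict(list)
            for key, pt in keyed_points:
                tiles[key].append(pt)
            return {key: sum(p[2] for p in pts) / len(pts) for key, pts in tiles.items()}

        points = [(12.3, 40.1, 101.2), (150.0, 20.0, 98.7), (130.2, 55.5, 99.1)]
        print(reducer(mapper(points)))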

  20. Laser scanner data processing and 3D modeling using a free and open source software

    International Nuclear Information System (INIS)

    Gabriele, Fatuzzo; Michele, Mangiameli; Giuseppe, Mussumeci; Salvatore, Zito

    2015-01-01

    Laser scanning is a technology that makes it possible to survey geometric objects in a short time with a high level of detail and completeness, based on the signal emitted by the laser and the corresponding return signal. When the incident laser radiation hits the object to be detected, the radiation is reflected. The purpose is to build a three-dimensional digital model that allows the reality of the object to be reconstructed and studies regarding its design, restoration and/or conservation to be conducted. When the laser scanner is equipped with a digital camera, the result of the measurement process is a set of points in XYZ coordinates showing high density and accuracy with radiometric and RGB tones. In this case, the set of measured points is called a “point cloud” and allows the reconstruction of the Digital Surface Model. Although post-processing is usually performed with closed-source software, whose copyright restricts free use, free and open source software can improve performance considerably. Indeed, the latter can be used freely and offers the possibility to inspect and even customize the source code. The work started at the Faculty of Engineering in Catania is aimed at finding a valuable free and open source tool, MeshLab (Italian software for data processing), to be compared with a reference closed-source software package for data processing, i.e. RapidForm. In this work, we compare the results obtained with MeshLab and RapidForm through the planning of the survey and the acquisition of the point cloud of a morphologically complex statue.

  1. Laser scanner data processing and 3D modeling using a free and open source software

    Energy Technology Data Exchange (ETDEWEB)

    Gabriele, Fatuzzo [Dept. of Industrial and Mechanical Engineering, University of Catania (Italy); Michele, Mangiameli, E-mail: amichele.mangiameli@dica.unict.it; Giuseppe, Mussumeci; Salvatore, Zito [Dept. of Civil Engineering and Architecture, University of Catania (Italy)

    2015-03-10

    Laser scanning is a technology that makes it possible to survey geometric objects in a short time with a high level of detail and completeness, based on the signal emitted by the laser and the corresponding return signal. When the incident laser radiation hits the object to be detected, the radiation is reflected. The purpose is to build a three-dimensional digital model that allows the reality of the object to be reconstructed and studies regarding its design, restoration and/or conservation to be conducted. When the laser scanner is equipped with a digital camera, the result of the measurement process is a set of points in XYZ coordinates showing high density and accuracy with radiometric and RGB tones. In this case, the set of measured points is called a “point cloud” and allows the reconstruction of the Digital Surface Model. Although post-processing is usually performed with closed-source software, whose copyright restricts free use, free and open source software can improve performance considerably. Indeed, the latter can be used freely and offers the possibility to inspect and even customize the source code. The work started at the Faculty of Engineering in Catania is aimed at finding a valuable free and open source tool, MeshLab (Italian software for data processing), to be compared with a reference closed-source software package for data processing, i.e. RapidForm. In this work, we compare the results obtained with MeshLab and RapidForm through the planning of the survey and the acquisition of the point cloud of a morphologically complex statue.

  2. AIRSAR Automated Web-based Data Processing and Distribution System

    Science.gov (United States)

    Chu, Anhua; vanZyl, Jakob; Kim, Yunjin; Lou, Yunling; Imel, David; Tung, Wayne; Chapman, Bruce; Durden, Stephen

    2005-01-01

    In this paper, we present an integrated, end-to-end synthetic aperture radar (SAR) processing system that accepts data processing requests, submits processing jobs, performs quality analysis, delivers and archives processed data. This fully automated SAR processing system utilizes database and internet/intranet web technologies to allow external users to browse and submit data processing requests and receive processed data. It is a cost-effective way to manage a robust SAR processing and archival system. The integration of these functions has reduced operator errors and increased processor throughput dramatically.

  3. The Limitations of Access Alone: Moving Towards Open Processes in Education Technology

    Science.gov (United States)

    Knox, Jeremy

    2013-01-01

    "Openness" has emerged as one of the foremost themes in education, within which an open education movement has enthusiastically embraced digital technologies as the central means of participation and inclusion. Open Educational Resources (OERs) and Massive Open Online Courses (MOOCs) have surfaced at the forefront of this development,…

  4. A Scalable Infrastructure for Lidar Topography Data Distribution, Processing, and Discovery

    Science.gov (United States)

    Crosby, C. J.; Nandigam, V.; Krishnan, S.; Phan, M.; Cowart, C. A.; Arrowsmith, R.; Baru, C.

    2010-12-01

    High-resolution topography data acquired with lidar (light detection and ranging) technology have emerged as a fundamental tool in the Earth sciences, and are also being widely utilized for ecological, planning, engineering, and environmental applications. Collected from airborne, terrestrial, and space-based platforms, these data are revolutionary because they permit analysis of geologic and biologic processes at resolutions essential for their appropriate representation. Public domain lidar data collection by federal, state, and local agencies are a valuable resource to the scientific community, however the data pose significant distribution challenges because of the volume and complexity of data that must be stored, managed, and processed. Lidar data acquisition may generate terabytes of data in the form of point clouds, digital elevation models (DEMs), and derivative products. This massive volume of data is often challenging to host for resource-limited agencies. Furthermore, these data can be technically challenging for users who lack appropriate software, computing resources, and expertise. The National Science Foundation-funded OpenTopography Facility (www.opentopography.org) has developed a cyberinfrastructure-based solution to enable online access to Earth science-oriented high-resolution lidar topography data, online processing tools, and derivative products. OpenTopography provides access to terabytes of point cloud data, standard DEMs, and Google Earth image data, all co-located with computational resources for on-demand data processing. The OpenTopography portal is built upon a cyberinfrastructure platform that utilizes a Services Oriented Architecture (SOA) to provide a modular system that is highly scalable and flexible enough to support the growing needs of the Earth science lidar community. OpenTopography strives to host and provide access to datasets as soon as they become available, and also to expose greater application level functionalities to

  5. POSIX and Object Distributed Storage Systems Performance Comparison Studies With Real-Life Scenarios in an Experimental Data Taking Context Leveraging OpenStack Swift & Ceph

    Science.gov (United States)

    Poat, M. D.; Lauret, J.; Betts, W.

    2015-12-01

    The STAR online computing infrastructure has become an intensive dynamic system used for first-hand data collection and analysis resulting in a dense collection of data output. As we have transitioned to our current state, inefficient, limited storage systems have become an impediment to fast feedback to online shift crews. Motivation for a centrally accessible, scalable and redundant distributed storage system had become a necessity in this environment. OpenStack Swift Object Storage and Ceph Object Storage are two eye-opening technologies as community use and development have led to success elsewhere. In this contribution, OpenStack Swift and Ceph have been put to the test with single and parallel I/O tests, emulating real world scenarios for data processing and workflows. The Ceph file system storage, offering a POSIX compliant file system mounted similarly to an NFS share was of particular interest as it aligned with our requirements and was retained as our solution. I/O performance tests were run against the Ceph POSIX file system and have presented surprising results indicating true potential for fast I/O and reliability. STAR'S online compute farm historical use has been for job submission and first hand data analysis. The goal of reusing the online compute farm to maintain a storage cluster and job submission will be an efficient use of the current infrastructure.

  6. Unity and disunity in evolutionary sciences: process-based analogies open common research avenues for biology and linguistics.

    Science.gov (United States)

    List, Johann-Mattis; Pathmanathan, Jananan Sylvestre; Lopez, Philippe; Bapteste, Eric

    2016-08-20

    For a long time biologists and linguists have been noticing surprising similarities between the evolution of life forms and languages. Most of the proposed analogies have been rejected. Some, however, have persisted, and some even turned out to be fruitful, inspiring the transfer of methods and models between biology and linguistics up to today. Most proposed analogies were based on a comparison of the research objects rather than the processes that shaped their evolution. Focusing on process-based analogies, however, has the advantage of minimizing the risk of overstating similarities, while at the same time reflecting the common strategy to use processes to explain the evolution of complexity in both fields. We compared important evolutionary processes in biology and linguistics and identified processes specific to only one of the two disciplines as well as processes which seem to be analogous, potentially reflecting core evolutionary processes. These new process-based analogies support novel methodological transfer, expanding the application range of biological methods to the field of historical linguistics. We illustrate this by showing (i) how methods dealing with incomplete lineage sorting offer an introgression-free framework to analyze highly mosaic word distributions across languages; (ii) how sequence similarity networks can be used to identify composite and borrowed words across different languages; (iii) how research on partial homology can inspire new methods and models in both fields; and (iv) how constructive neutral evolution provides an original framework for analyzing convergent evolution in languages resulting from common descent (Sapir's drift). Apart from new analogies between evolutionary processes, we also identified processes which are specific to either biology or linguistics. This shows that general evolution cannot be studied from within one discipline alone. In order to get a full picture of evolution, biologists and linguists need to

  7. Standard services for the capture, processing, and distribution of packetized telemetry data

    Science.gov (United States)

    Stallings, William H.

    1989-01-01

    Standard functional services for the capture, processing, and distribution of packetized data are discussed with particular reference to the future implementation of packet processing systems, such as those for the Space Station Freedom. The major functions are listed under the following major categories: input processing, packet processing, and output processing. A functional block diagram of a packet data processing facility is presented, showing the distribution of the various processing functions as well as the primary data flow through the facility.

  8. Floral and nesting resources, habitat structure, and fire influence bee distribution across an open-forest gradient

    Science.gov (United States)

    Grundel, R.; Jean, R.P.; Frohnapple, K.J.; Glowacki, G.A.; Scott, P.E.; Pavlovic, N.B.

    2010-01-01

    Given bees' central effect on vegetation communities, it is important to understand how and why bee distributions vary across ecological gradients. We examined how plant community composition, plant diversity, nesting suitability, canopy cover, land use, and fire history affected bee distribution across an open-forest gradient in northwest Indiana, USA, a gradient similar to the historic Midwest United States landscape mosaic. When considered with the other predictors, plant community composition was not a significant predictor of bee community composition. Bee abundance was negatively related to canopy cover and positively to recent fire frequency, bee richness was positively related to plant richness and abundance of potential nesting resources, and bee community composition was significantly related to plant richness, soil characteristics potentially related to nesting suitability, and canopy cover. Thus, bee abundance was predicted by a different set of environmental characteristics than was bee species richness, and bee community composition was predicted, in large part, by a combination of the significant predictors of bee abundance and richness. Differences in bee community composition along the woody vegetation gradient were correlated with relative abundance of oligolectic, or diet specialist, bees. Because oligoleges were rarer than diet generalists and were associated with open habitats, their populations may be especially affected by degradation of open habitats. More habitat-specialist bees were documented for open and forest/scrub habitats than for savanna/woodland habitats, consistent with bees responding to habitats of intermediate woody vegetation density, such as savannas, as ecotones rather than as distinct habitat types. Similarity of bee community composition, similarity of bee abundance, and similarity of bee richness between sites were not significantly related to proximity of sites to each other. Nestedness analysis indicated that species

  9. Distributed collaborative team effectiveness: measurement and process improvement

    Science.gov (United States)

    Wheeler, R.; Hihn, J.; Wilkinson, B.

    2002-01-01

    This paper describes a measurement methodology developed for assessing the readiness, and identifying opportunities for improving the effectiveness, of distributed collaborative design teams preparing to conduct a concurrent design session.

  10. Clinical Laboratory Data Management: A Distributed Data Processing Solution

    OpenAIRE

    Levin, Martin; Morgner, Raymond; Packer, Bernice

    1980-01-01

    Two turn-key systems, one for patient registration and the other for the clinical laboratory have been installed and linked together at the Hospital of the University of Pennsylvania, forming the nucleus of an evolving distributed Hospital Information System.

  11. Radiation chemistry of polymer degradation processes: molecular weight distribution effects

    International Nuclear Information System (INIS)

    Viswanathan, N.S.

    1976-01-01

    The molecular weight distributions of poly(methyl methacrylate) irradiated at 15 and 25 MeV with electron beams were investigated. The experimental values for the effect of chain scissions on the dispersivity agreed well with theoretical predictions

  12. Distributed Multiscale Data Analysis and Processing for Sensor Networks

    National Research Council Canada - National Science Library

    Wagner, Raymond; Sarvotham, Shriram; Choi, Hyeokho; Baraniuk, Richard

    2005-01-01

    .... Second, the communication overhead of multiscale algorithms can become prohibitive. In this paper, we take a first step in addressing both shortcomings by introducing two new distributed multiresolution transforms...

  13. 40 CFR 761.80 - Manufacturing, processing and distribution in commerce exemptions.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Manufacturing, processing and..., PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Exemptions § 761.80 Manufacturing, processing and... any change in the manner of processing and distributing, importing (manufacturing), or exporting of...

  14. Parallel and distributed processing in two SGBDS: A case study

    OpenAIRE

    Francisco Javier Moreno; Nataly Castrillón Charari; Camilo Taborda Zuluaga

    2017-01-01

    Context: One of the strategies for managing large volumes of data is distributed and parallel computing. Among the tools that allow applying these characteristics are some Data Base Management Systems (DBMS), such as Oracle, DB2, and SQL Server. Method: In this paper we present a case study where we evaluate the performance of an SQL query in two of these DBMS. The evaluation is done through various forms of data distribution in a computer network with different degrees of parallelism. ...

  15. Prediction of residence time distributions in food processing machinery

    DEFF Research Database (Denmark)

    Karlson, Torben; Friis, Alan; Szabo, Peter

    1996-01-01

    The velocity field in a co-rotating disc scraped surface heat exchanger (CDHE) is calculated using a finite element method. The residence time distribution for the CDHE is then obtained by tracing particles introduced in the inlet.

  16. Frequency distributions from birth, death, and creation processes.

    Science.gov (United States)

    Bartley, David L; Ogden, Trevor; Song, Ruiguang

    2002-01-01

    The time-dependent frequency distribution of groups of individuals versus group size was investigated within a continuum approximation, assuming a simplified individual growth, death and creation model. The analogy of the system to a physical fluid exhibiting both convection and diffusion was exploited in obtaining various solutions to the distribution equation. A general solution was approximated through the application of a Green's function. More specific exact solutions were also found to be useful. The solutions were continually checked against the continuum approximation through extensive simulation of the discrete system. Over limited ranges of group size, the frequency distributions were shown to closely exhibit a power-law dependence on group size, as found in many realizations of this type of system, ranging from colonies of mutated bacteria to the distribution of surnames in a given population. As an example, the modeled distributions were successfully fit to the distribution of surnames in several countries by adjusting the parameters specifying growth, death and creation rates.
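
    As a purely illustrative stand-in (not the authors' model), the minimal discrete simulation below implements a related growth-and-creation process, essentially Simon's model without deaths, which already produces the roughly power-law group-size frequencies referred to above.

        import random
        from collections import Counter

        def simulate_groups(steps=100_000, creation_prob=0.01, seed=0):
            # with probability creation_prob a new group of size 1 is created; otherwise
            # one existing individual is picked uniformly at random and its group grows
            # by one, which favours large groups (preferential growth); deaths, present
            # in the paper's model, are omitted here for brevity
            random.seed(seed)
            sizes = [1]
            for _ in range(steps):
                if random.random() < creation_prob:
                    sizes.append(1)
                else:
                    i = random.choices(range(len(sizes)), weights=sizes)[0]
                    sizes[i] += 1
            return Counter(sizes)   # number of groups observed at each group size

        freq = simulate_groups()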

  17. Free Open Access Medical Education (FOAM in Emergency Medicine: The Global Distribution of Users in 2016

    Directory of Open Access Journals (Sweden)

    Jennifer W. Bellows

    2018-04-01

    Full Text Available Introduction: Free open-access medical education (FOAM) is a collection of interactive online medical education resources that are free and accessible to students, physicians and other learners. This novel approach to medical education has the potential to reach learners across the globe; however, the extent of its global uptake is unknown. Methods: This descriptive report evaluates the 2016 web analytics data from a convenience sample of FOAM blogs and websites with a focus on emergency medicine (EM) and critical care. The number of times a site was accessed, or “sessions”, was categorized by country of access, cross-referenced with World Bank data for population and income level, and then analyzed using simple descriptive statistics and geographic mapping. Results: We analyzed 12 FOAM blogs published from six countries, with a total reported volume of approximately 18.7 million sessions worldwide in 2016. High-income countries accounted for 73.7% of population-weighted FOAM blog and website sessions in 2016, while upper-middle income countries, lower-middle income countries and low-income countries accounted for 17.5%, 8.5% and 0.3%, respectively. Conclusion: FOAM, while largely used in high-income countries, is used in low- and middle-income countries as well. The potential to provide free, online training resources for EM in places where formal training is limited is significant and thus is prime for further investigation.

  18. Open Computer Forensic Architecture a Way to Process Terabytes of Forensic Disk Images

    Science.gov (United States)

    Vermaas, Oscar; Simons, Joep; Meijer, Rob

    This chapter describes the Open Computer Forensics Architecture (OCFA), an automated system that dissects complex file types, extracts metadata from files and ultimately creates indexes on forensic images of seized computers. It consists of a set of collaborating processes, called modules. Each module is specialized in processing a certain file type. When it receives a so-called 'evidence', the information that has been extracted so far about the file together with the actual data, it either adds new information about the file or uses the file to derive a new 'evidence'. All evidence, original and derived, is sent to a router after being processed by a particular module. The router decides which module should process the evidence next, based upon the metadata associated with the evidence. Thus the OCFA system can recursively process images until, from every compound file, the embedded files (if any) are extracted, all information that the system can derive has been derived, and all extracted text is indexed. Compound files include, but are not limited to, archive and zip files, disk images, text documents of various formats and, for example, mailboxes. The output of an OCFA run is a repository full of derived files, a database containing all extracted information about the files, and an index which can be used when searching. This is presented in a web interface. Moreover, processed data is easily fed to third-party software for further analysis or for use in data mining or text mining tools. The main advantages of the OCFA system are its scalability and its ability to process large amounts of data.

  19. WASS: an open-source stereo processing pipeline for sea waves 3D reconstruction

    Science.gov (United States)

    Bergamasco, Filippo; Benetazzo, Alvise; Torsello, Andrea; Barbariol, Francesco; Carniel, Sandro; Sclavo, Mauro

    2017-04-01

    Stereo 3D reconstruction of ocean waves is gaining more and more popularity in the oceanographic community. In fact, recent advances in both computer vision algorithms and CPU processing power now allow the study of spatio-temporal wave fields with unprecedented accuracy, especially at small scales. Even if simple in theory, multiple details are difficult for a practitioner to master, so the implementation of a 3D reconstruction pipeline is in general considered a complex task. For instance, camera calibration, reliable stereo feature matching and mean sea-plane estimation are all factors for which a well designed implementation can make the difference in obtaining valuable results. For this reason, we believe that the open availability of a well-tested software package that automates the steps from stereo images to a 3D point cloud would be a valuable addition for future research in this area. We present WASS, a completely Open-Source stereo processing pipeline for sea waves 3D reconstruction, available at http://www.dais.unive.it/wass/. Our tool completely automates the recovery of dense point clouds from stereo images by providing three main functionalities. First, WASS can automatically recover the extrinsic parameters of the stereo rig (up to scale) so that no delicate calibration has to be performed in the field. Second, WASS implements a fast 3D dense stereo reconstruction procedure so that an accurate 3D point cloud can be computed from each stereo pair. We rely on the well-consolidated OpenCV library both for the image stereo rectification and disparity map recovery. Lastly, a set of 2D and 3D filtering techniques, applied both to the disparity map and to the produced point cloud, is implemented to remove the vast majority of erroneous points that naturally arise while analyzing the optically complex nature of the water surface (examples are sun glares, large white-capped areas, fog and water aerosol, etc.). Developed to be as fast as possible, WASS
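
    Since the pipeline leans on OpenCV for rectification and disparity estimation, the Python sketch below shows the generic OpenCV calls involved (stereo rectification, semi-global block matching and reprojection to 3D). It illustrates standard library usage only; WASS's own parameter choices, mean sea-plane estimation and filtering are not reproduced, and the calibration inputs (K1, D1, K2, D2, R, T) and 8-bit grayscale images are assumed to be available.

        import cv2
        import numpy as np

        def stereo_to_points(img_l, img_r, K1, D1, K2, D2, R, T):
            size = img_l.shape[1], img_l.shape[0]
            # rectify both views so that epipolar lines become horizontal
            R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, D1, K2, D2, size, R, T)
            map1l, map2l = cv2.initUndistortRectifyMap(K1, D1, R1, P1, size, cv2.CV_32FC1)
            map1r, map2r = cv2.initUndistortRectifyMap(K2, D2, R2, P2, size, cv2.CV_32FC1)
            rect_l = cv2.remap(img_l, map1l, map2l, cv2.INTER_LINEAR)
            rect_r = cv2.remap(img_r, map1r, map2r, cv2.INTER_LINEAR)
            # dense disparity by semi-global block matching
            sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
            disparity = sgbm.compute(rect_l, rect_r).astype(np.float32) / 16.0
            # reproject valid disparities to 3D points in the rectified camera frame
            points = cv2.reprojectImageTo3D(disparity, Q)
            return points[disparity > 0]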

  20. Ionization processes in a transient hollow cathode discharge before electric breakdown: statistical distribution

    International Nuclear Information System (INIS)

    Zambra, M.; Favre, M.; Moreno, J.; Wyndham, E.; Chuaqui, H.; Choi, P.

    1998-01-01

    The charge formation processes in the hollow cathode region (HCR) of a transient hollow cathode discharge have been studied in their final phase. The statistical distributions that describe the different ionization processes have been represented by Gaussian distributions. Nevertheless, a better representation of these distributions was observed when the pressure is near its minimum value, just before breakdown.

  1. 234Th distributions in coastal and open ocean waters by non-destructive β-counting

    International Nuclear Information System (INIS)

    Miller, L.A.; Svaeren, I.

    2003-01-01

    Non-destructive β-counting analyses of particulate and dissolved 234 Th activities in seawater are simpler but no less precise than traditional radioanalytical methods. The inherent accuracy limitations of the non-destructive β-counting method, particularly in samples likely to be contaminated with anthropogenic nuclides, are alleviated by recounting the samples over several half-lives and fitting the counting data to the 234 Th decay curve. Precision (including accuracy, estimated at an average of 3%) is better than 10% for particulate or 5% for dissolved samples. Thorium-234 distributions in the Skagerrak indicated a vigorous, presumably biological, particle export from the surface waters, and while bottom sediment resuspension was not an effective export mechanism, it did strip thorium from the dissolved phase. In the Greenland and Norwegian Seas, we saw clear evidence of particulate export from the surface waters, but at 75 m, total 234 Th activities were generally in equilibrium with 238 U. (author)
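
    The recount-and-fit step described here amounts to fitting the measured count rates to the 234Th decay law. The Python sketch below shows such a fit using the 24.1-day half-life; the handling of background and detector efficiency is deliberately simplified and should be read as an assumption, not the authors' exact procedure.

        import numpy as np
        from scipy.optimize import curve_fit

        HALF_LIFE_DAYS = 24.1                 # 234Th
        LAMBDA = np.log(2) / HALF_LIFE_DAYS

        def decay_model(t_days, a0, background):
            # net count rate: initial 234Th activity decaying on a constant background
            return a0 * np.exp(-LAMBDA * t_days) + background

        def fit_activity(t_days, counts_per_min):
            popt, pcov = curve_fit(decay_model, t_days, counts_per_min,
                                   p0=(counts_per_min[0], 0.0))
            a0, bkg = popt
            return a0, bkg, np.sqrt(np.diag(pcov))[0]   # activity, background, 1-sigma on activity

        # usage: the same sample recounted over roughly three half-lives (synthetic data)
        t = np.array([0.0, 10.0, 25.0, 45.0, 70.0])
        cpm = decay_model(t, 2.0, 0.3) + 0.02 * np.random.randn(t.size)
        print(fit_activity(t, cpm))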

  2. Investigation of Velocity Distribution in Open Channel Flows Based on Conditional Average of Turbulent Structures

    Directory of Open Access Journals (Sweden)

    Yu Han

    2017-01-01

    Full Text Available We report the development of a new analytical model similar to the Reynolds-averaged Navier-Stokes equations to determine the distribution of streamwise velocity by considering the bursting phenomenon. It is found that, in two-dimensional (2D) flows, the underlying mechanism of the wake law in 2D uniform flow is actually a result of up/down events. A special experiment was conducted to examine the newly derived analytical model, and good agreement is achieved between the experimental data in the inner region and the model’s prediction. The obtained experimental data were also used to examine the DML-Law (dip-modified log law), MLW-Law (modified log-wake law), and CML-Law (Cole’s wake law), and the agreement is not very satisfactory in the outer region.

  3. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    Science.gov (United States)

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture systems data. However, few software packages can visualize and modify the integrality of the data in the C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis systems using standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through C++ API, bindings for high-level languages (Matlab, Octave, and Python), and standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
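
    For readers who want to try the toolkit, the fragment below follows BTK's documented Python binding for loading a C3D acquisition; exact class and method names can differ between BTK versions, and the file name is hypothetical, so treat this as an indicative sketch.

        import btk  # Biomechanical ToolKit Python bindings

        reader = btk.btkAcquisitionFileReader()
        reader.SetFilename("trial01.c3d")   # hypothetical file name
        reader.Update()
        acq = reader.GetOutput()

        print(acq.GetPointFrequency())      # marker sampling rate (Hz)
        print(acq.GetPointNumber())         # number of 3D points (markers)
        marker = acq.GetPoint(0)            # first point; a label string can also be used
        print(marker.GetValues().shape)     # (n_frames, 3) array of coordinates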

  4. Real-Time Processing Library for Open-Source Hardware Biomedical Sensors.

    Science.gov (United States)

    Molina-Cantero, Alberto J; Castro-García, Juan A; Lebrato-Vázquez, Clara; Gómez-González, Isabel M; Merino-Monge, Manuel

    2018-03-29

    Applications involving data acquisition from sensors need samples at a preset frequency, filtering out of noise and/or analysis of certain frequency components. We propose a novel software architecture based on open-source hardware platforms which allows programmers to create data streams from input channels and easily implement filters and frequency-analysis objects. The performance of the different classes, in terms of allocated memory and execution time (number of clock cycles), was analyzed on the low-cost Arduino Genuino platform. In addition, 11 people took part in an experiment in which they had to implement several exercises and complete a usability test. Sampling rates under 250 Hz (typical for many biomedical applications) make it feasible to implement filters, sliding windows and Fourier analysis operating in real time. Participants rated software usability at 70.2 out of 100, and the ease of use when implementing several signal processing applications was rated at just over 4.4 out of 5. Participants showed their intention of using this software because it was perceived as useful and very easy to use. The performance of the library showed that it may be appropriate for implementing small biomedical real-time applications or for human movement monitoring, even on a simple open-source hardware device like the Arduino Genuino.
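
    The library itself targets Arduino-class C++ code; purely to illustrate the kind of streaming object it provides (a sampled channel feeding a filter), here is a language-neutral Python sketch of a fixed-window moving-average filter. All names and parameters are invented for the illustration and are not part of the authors' API.

        from collections import deque

        class MovingAverageFilter:
            # streaming moving-average over the most recent `window` samples
            def __init__(self, window=16):
                self.buf = deque(maxlen=window)

            def process(self, sample):
                self.buf.append(sample)
                return sum(self.buf) / len(self.buf)

        # feeding a 250 Hz stream sample by sample, as a sensor loop would
        filt = MovingAverageFilter(window=16)
        smoothed = [filt.process(x) for x in [0.0, 0.4, 0.9, 1.3, 1.1, 0.8]]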

  5. Geosfear? - Overcoming System Boundaries by Open-source Based Monitoring of Spatio-temporal Processes

    Science.gov (United States)

    Brandt, T.; Schima, R.; Goblirsch, T.; Paschen, M.; Francyk, B.; Bumberger, J.; Zacharias, S.; Dietrich, P.; Rubin, Y.; Rinke, K.; Fleckenstein, J. H.; Schmidt, C.; Vieweg, M.

    2016-12-01

    The impact of global change, intensive agriculture and complex interactions between humans and the environment show different effects on different scales. However, the desire to obtain a better understanding of ecosystems and process dynamics in nature accentuates the need for observing these processes at higher temporal and spatial resolutions. Especially with regard to the process dynamics and heterogeneity of rivers and catchment areas, a comprehensive monitoring of the ongoing processes and effects remains to be a challenging issue. What we need are monitoring systems which can collect most diverse data across different environmental compartments and scales. Today, open-source based electronics and innovative sensors and sensor components are offering a promising approach to investigate new possibilities of mobile data acquisition to improve our understanding of the geosphere. To start with, we have designed and implemented a multi-operable, embedded Linux platform for fast integration of different sensors within a single infrastructure. In addition, a GPS module in combination with a GSM transmitter ensures the synchronization and geo-referencing of all data, no matter how old-fashioned the sensors are. To this end, initial field experiments were conducted at a 3rd order stream in the Central German Lowland. Here, we linked in-stream DOC inputs with subsurface metabolism by coupling miniaturized DOC sensor probes with a modified vertical oxygen profiler in situ. Starting from metrological observations to water quality and subsurface conditions, the overarching goal is the detection of interlinked process dynamics across highly reactive biogeochemical interfaces. Overall, the field experiments demonstrated the feasibility of this emerging technology and its potential towards a cutting-edge strategy based on a holistic and integrated process. Now, we are only a few steps away from realizing adaptive and event-triggered observations close to real

  6. Acceleration of the OpenFOAM-based MHD solver using graphics processing units

    International Nuclear Information System (INIS)

    He, Qingyun; Chen, Hongli; Feng, Jingchao

    2015-01-01

    Highlights: • A 3D PISO-MHD solver was implemented on Kepler-class graphics processing units (GPUs) using CUDA technology. • A consistent and conservative scheme is used in the code, validated by three basic benchmarks in rectangular and round ducts. • CPU parallelization and GPU acceleration were compared against a single-core CPU for MHD and non-MHD problems. • Different preconditioners for the MHD solver were compared and the results showed that the AMG method performs best. - Abstract: The pressure-implicit with splitting of operators (PISO) magnetohydrodynamics (MHD) solver for the coupled Navier–Stokes and Maxwell equations was implemented on Kepler-class graphics processing units (GPUs) using the CUDA technology. The solver is developed on the open source code OpenFOAM, is based on a consistent and conservative scheme, and is suitable for simulating MHD flow under a strong magnetic field in a fusion liquid metal blanket with structured or unstructured meshes. We verified the validity of the implementation on several standard cases, including benchmark I (the Shercliff and Hunt cases), benchmark II (fully developed circular pipe MHD flow) and benchmark III (the KIT experimental case). Computational performance of the GPU implementation was examined by comparing its double-precision run times with CPU run times for essentially the same algorithms and meshes. The results showed that a GPU (GTX 770) can outperform a server-class 4-core, 8-thread CPU (Intel Core i7-4770k) by a factor of at least 2.

  7. Acceleration of the OpenFOAM-based MHD solver using graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    He, Qingyun; Chen, Hongli, E-mail: hlchen1@ustc.edu.cn; Feng, Jingchao

    2015-12-15

    Highlights: • A 3D PISO-MHD solver was implemented on Kepler-class graphics processing units (GPUs) using CUDA technology. • A consistent and conservative scheme is used in the code, validated by three basic benchmarks in rectangular and round ducts. • CPU parallelization and GPU acceleration were compared against a single-core CPU for MHD and non-MHD problems. • Different preconditioners for the MHD solver were compared and the results showed that the AMG method performs best. - Abstract: The pressure-implicit with splitting of operators (PISO) magnetohydrodynamics (MHD) solver for the coupled Navier–Stokes and Maxwell equations was implemented on Kepler-class graphics processing units (GPUs) using the CUDA technology. The solver is developed on the open source code OpenFOAM, is based on a consistent and conservative scheme, and is suitable for simulating MHD flow under a strong magnetic field in a fusion liquid metal blanket with structured or unstructured meshes. We verified the validity of the implementation on several standard cases, including benchmark I (the Shercliff and Hunt cases), benchmark II (fully developed circular pipe MHD flow) and benchmark III (the KIT experimental case). Computational performance of the GPU implementation was examined by comparing its double-precision run times with CPU run times for essentially the same algorithms and meshes. The results showed that a GPU (GTX 770) can outperform a server-class 4-core, 8-thread CPU (Intel Core i7-4770k) by a factor of at least 2.

  8. IOC-UNEP review meeting on oceanographic processes of transport and distribution of pollutants in the sea

    International Nuclear Information System (INIS)

    1991-01-01

    The IOC-UNEP Review Meeting on Oceanographic Processes of Transfer and Distribution of Pollutants in the Sea was opened at the Ruder Boskovic Institute, Zagreb, Yugoslavia on Monday, 15 May 1989. Papers presented at the meeting dealt with physical and geochemical processes in sea-water and sediment involved in the transport, mixing and dispersal of pollutants. The importance of mesoscale eddies and gyres in the open sea, wind-driven currents and upwelling events in the coastal zone, and thermohaline processes in semi-enclosed bays and estuaries was recognized. There is strong evidence that non-local forcing can drive circulation in the coastal area. Concentrations, horizontal and vertical distributions and transport of pollutants were investigated and presented for a number of coastal areas. Riverine and atmospheric inputs of different pollutants to the western Mediterranean were discussed. Reports on two ongoing nationally/internationally coordinated projects (MEDMODEL, EROS 2000) were presented. Discussions during the meeting enabled an exchange of ideas between specialists in different disciplines. It is expected that this will promote a future interdisciplinary approach in this field. The meeting recognized the importance of physical oceanographic studies in investigating the transfer and distribution of pollutants in the sea and, in view of the importance of the interdisciplinary approach and bilateral and/or multilateral co-operation, a number of recommendations were adopted.

  9. An open and transparent process to select ELIXIR Node Services as implemented by ELIXIR-UK.

    Science.gov (United States)

    Hancock, John M; Game, Alf; Ponting, Chris P; Goble, Carole A

    2016-01-01

    ELIXIR is the European infrastructure established specifically for the sharing and sustainability of life science data. To provide up-to-date resources and services, ELIXIR needs to undergo a continuous process of refreshing the services provided by its national Nodes. Here we present the approach taken by ELIXIR-UK to address the advice by the ELIXIR Scientific Advisory Board that Nodes need to develop "mechanisms to ensure that each Node continues to be representative of the Bioinformatics efforts within the country". ELIXIR-UK put in place an open and transparent process to identify potential ELIXIR resources within the UK during late 2015 and early to mid-2016. Areas of strategic strength were identified and Expressions of Interest in these priority areas were requested from the UK community. Criteria were established, in discussion with the ELIXIR Hub, and prospective ELIXIR-UK resources were assessed by an independent committee set up by the Node for this purpose. Of 19 resources considered, 14 were judged to be immediately ready to be included in the UK ELIXIR Node's portfolio. A further five were placed on the Node's roadmap for future consideration for inclusion. ELIXIR-UK expects to repeat this process regularly to ensure its portfolio continues to reflect its community's strengths.

  10. SIproc: an open-source biomedical data processing platform for large hyperspectral images.

    Science.gov (United States)

    Berisha, Sebastian; Chang, Shengyuan; Saki, Sam; Daeinejad, Davar; He, Ziqi; Mankar, Rupali; Mayerich, David

    2017-04-10

    There has recently been significant interest within the vibrational spectroscopy community to apply quantitative spectroscopic imaging techniques to histology and clinical diagnosis. However, many of the proposed methods require collecting spectroscopic images that have a similar region size and resolution to the corresponding histological images. Since spectroscopic images contain significantly more spectral samples than traditional histology, the resulting data sets can approach hundreds of gigabytes to terabytes in size. This makes them difficult to store and process, and the tools available to researchers for handling large spectroscopic data sets are limited. Fundamental mathematical tools, such as MATLAB, Octave, and SciPy, are extremely powerful but require that the data be stored in fast memory. This memory limitation becomes impractical for even modestly sized histological images, which can be hundreds of gigabytes in size. In this paper, we propose an open-source toolkit designed to perform out-of-core processing of hyperspectral images. By taking advantage of graphical processing unit (GPU) computing combined with adaptive data streaming, our software alleviates common workstation memory limitations while achieving better performance than existing applications.
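
    The out-of-core idea described above can be sketched in a few lines of Python (a generic illustration, not the SIproc code itself; the file name, image dimensions and chunk size are hypothetical): the hyperspectral cube is memory-mapped and streamed through in row blocks, so only a small slice is ever resident in RAM.

      import numpy as np

      # Memory-map a large float32 cube of shape (rows, cols, bands); the data
      # stay on disk and are read lazily, block by block.
      rows, cols, bands = 10000, 10000, 400
      cube = np.memmap("cube.raw", dtype=np.float32, mode="r",
                       shape=(rows, cols, bands))

      chunk_rows = 128                                # rows processed per pass
      mean_spectrum = np.zeros(bands, dtype=np.float64)
      for r in range(0, rows, chunk_rows):
          block = np.asarray(cube[r:r + chunk_rows])  # only this slice is read
          mean_spectrum += block.reshape(-1, bands).sum(axis=0)
      mean_spectrum /= float(rows * cols)             # mean spectrum of the whole image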

  11. Tracking and Quantifying Developmental Processes in C. elegans Using Open-source Tools.

    Science.gov (United States)

    Dutta, Priyanka; Lehmann, Christina; Odedra, Devang; Singh, Deepika; Pohl, Christian

    2015-12-16

    Quantitatively capturing developmental processes is crucial to derive mechanistic models and key to identify and describe mutant phenotypes. Here protocols are presented for preparing embryos and adult C. elegans animals for short- and long-term time-lapse microscopy and methods for tracking and quantification of developmental processes. The methods presented are all based on C. elegans strains available from the Caenorhabditis Genetics Center and on open-source software that can be easily implemented in any laboratory independently of the microscopy system used. A reconstruction of a 3D cell-shape model using the modelling software IMOD, manual tracking of fluorescently-labeled subcellular structures using the multi-purpose image analysis program Endrov, and an analysis of cortical contractile flow using PIVlab (Time-Resolved Digital Particle Image Velocimetry Tool for MATLAB) are shown. It is discussed how these methods can also be deployed to quantitatively capture other developmental processes in different models, e.g., cell tracking and lineage tracing, tracking of vesicle flow.

  12. Exploring the Role of Distributed Learning in Distance Education at Allama Iqbal Open University: Academic Challenges at Postgraduate Level

    Directory of Open Access Journals (Sweden)

    Qadir BUKHSH

    2015-01-01

    Full Text Available Distributed learning is derived from the concept of distributed resources. Different institutions around the globe are connected through a network, and the learners are diverse, located in different cultures and communities. Distributed learning provides global standards of quality to all learners through synchronous and asynchronous communications, offers the opportunity of flexible and independent learning with equity and low-cost educational services, and has become the first choice of dispersed learners around the globe. The present study was undertaken to investigate the challenges faced by the Faculty Members of the Departments of Business Administration and Computer Science at Allama Iqbal Open University, Islamabad, Pakistan. 25 Faculty Members were taken as the sample of the study from both Departments (100% sampling). The study was qualitative in nature and interviews were the data collection tool. Data were analyzed by the thematic analysis technique. The major challenges faced by the Faculty Members were: bandwidth, synchronous learning activities, irregularity of the learners, feedback on individual work, designing and managing the learning activities, quality issues, and training to use the network for teaching and learning activities.

  13. The redesign of a warranty distribution network with recovery processes

    NARCIS (Netherlands)

    Ashayeri, J.; Ma, N.; Sotirov, R.

    A warranty distribution network provides aftersales warranty services to customers and resembles a closed-loop supply chain network with specific challenges for reverse flows management like recovery, repair, and reflow of refurbished products. We present here a nonlinear and nonconvex mixed integer

  14. Open Science Grid (OSG) Ticket Synchronization: Keeping Your Home Field Advantage In A Distributed Environment

    International Nuclear Information System (INIS)

    Gross, Kyle; Hayashi, Soichi; Teige, Scott; Quick, Robert

    2012-01-01

    Large distributed computing collaborations, such as the Worldwide LHC Computing Grid (WLCG), face many issues when it comes to providing a working grid environment for their users. One of these is exchanging tickets between the various ticketing systems in use by grid collaborations. Ticket systems such as Footprints, RT, Remedy, and ServiceNow all have different schemas that must be addressed in order to provide a reliable exchange of information between support entities and users in different grid environments. To combat this problem, OSG Operations has created a ticket synchronization interface called GOC-TX that relies on web services instead of the error-prone email parsing methods of the past. Synchronizing tickets between different ticketing systems allows any user or support entity to work on a ticket in their home environment, thus providing a familiar and comfortable place to provide updates without having to learn another ticketing system. The interface is built so that it is generic enough to be customized for nearly any ticketing system with a web-service interface with only minor changes. This allows us to be flexible and to rapidly bring new ticket synchronization online. Synchronization can be triggered by different methods including mail, a web-services interface, and active messaging. GOC-TX currently interfaces with Global Grid User Support (GGUS) for WLCG, Remedy at Brookhaven National Lab (BNL), and Request Tracker (RT) at the Virtual Data Toolkit (VDT). Work is progressing on the Fermi National Accelerator Laboratory (FNAL) ServiceNow synchronization. This paper will explain the problems faced by OSG and how they led OSG to create and implement this ticket synchronization system, along with the technical details that allow synchronization to be performed at a production level.
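
    The adapter idea behind such a synchronization interface can be sketched generically in Python (this is not GOC-TX's actual code or schema; the field names and endpoint URL are hypothetical): each ticketing system is mapped onto a neutral internal representation, which is then pushed to the peer system through its web-service interface.

      import json
      import urllib.request

      def to_common(source_ticket):
          """Map a source-system ticket (hypothetical field names) onto a neutral
          internal representation, so each system needs only one adapter."""
          return {
              "id": source_ticket["Incident_Number"],
              "summary": source_ticket["Short_Description"],
              "status": source_ticket["Status"].lower(),
          }

      def push_to_peer(common_ticket, endpoint):
          """POST the neutral ticket to a peer system's web-service endpoint."""
          data = json.dumps(common_ticket).encode("utf-8")
          req = urllib.request.Request(endpoint, data=data,
                                       headers={"Content-Type": "application/json"})
          with urllib.request.urlopen(req) as resp:
              return resp.status

      # A synchronization run triggered by an incoming ticket update.
      incoming = {"Incident_Number": "INC000123",
                  "Short_Description": "CE down at site X",
                  "Status": "Assigned"}
      push_to_peer(to_common(incoming), "https://peer.example.org/ticket-sync")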

  15. Open Science Grid (OSG) Ticket Synchronization: Keeping Your Home Field Advantage In A Distributed Environment

    Science.gov (United States)

    Gross, Kyle; Hayashi, Soichi; Teige, Scott; Quick, Robert

    2012-12-01

    Large distributed computing collaborations, such as the Worldwide LHC Computing Grid (WLCG), face many issues when it comes to providing a working grid environment for their users. One of these is exchanging tickets between the various ticketing systems in use by grid collaborations. Ticket systems such as Footprints, RT, Remedy, and ServiceNow all have different schemas that must be addressed in order to provide a reliable exchange of information between support entities and users in different grid environments. To combat this problem, OSG Operations has created a ticket synchronization interface called GOC-TX that relies on web services instead of the error-prone email parsing methods of the past. Synchronizing tickets between different ticketing systems allows any user or support entity to work on a ticket in their home environment, thus providing a familiar and comfortable place to provide updates without having to learn another ticketing system. The interface is built so that it is generic enough to be customized for nearly any ticketing system with a web-service interface with only minor changes. This allows us to be flexible and to rapidly bring new ticket synchronization online. Synchronization can be triggered by different methods including mail, a web-services interface, and active messaging. GOC-TX currently interfaces with Global Grid User Support (GGUS) for WLCG, Remedy at Brookhaven National Lab (BNL), and Request Tracker (RT) at the Virtual Data Toolkit (VDT). Work is progressing on the Fermi National Accelerator Laboratory (FNAL) ServiceNow synchronization. This paper will explain the problems faced by OSG and how they led OSG to create and implement this ticket synchronization system, along with the technical details that allow synchronization to be performed at a production level.

  16. Joint Probability Distributions for a Class of Non-Markovian Processes

    OpenAIRE

    Baule, A.; Friedrich, R.

    2004-01-01

    We consider joint probability distributions for the class of coupled Langevin equations introduced by Fogedby [H.C. Fogedby, Phys. Rev. E 50, 1657 (1994)]. We generalize well-known results for the single time probability distributions to the case of N-time joint probability distributions. It is shown that these probability distribution functions can be obtained by an integral transform from distributions of a Markovian process. The integral kernel obeys a partial differential equation with fractional time derivatives reflecting the non-Markovian character of the process.
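
    For orientation, the single-time version of this transform can be sketched in illustrative notation (not necessarily the authors' own): if f_1(x, s) is the distribution of the underlying Markovian process in the internal time s, and h(s, t) is the density of the internal time at physical time t, then

      f(x,t) = \int_0^{\infty} h(s,t)\, f_1(x,s)\, \mathrm{d}s ,

    and the N-time result generalizes the kernel to a joint density over N internal times; the fractional time derivatives mentioned above arise from the evolution equation obeyed by h.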

  17. Parallel and distributed processing in two SGBDS: A case study

    Directory of Open Access Journals (Sweden)

    Francisco Javier Moreno

    2017-04-01

    Full Text Available Context: One of the strategies for managing large volumes of data is distributed and parallel computing. Among the tools that allow applying these characteristics are some Database Management Systems (DBMS), such as Oracle, DB2, and SQL Server. Method: In this paper we present a case study in which we evaluate the performance of an SQL query in two of these DBMS. The evaluation is done through various forms of data distribution in a computer network with different degrees of parallelism. Results: The tests of the SQL query revealed the performance differences between the two DBMS analyzed. However, more thorough testing and a wider variety of queries are needed. Conclusions: The differences in performance between the two DBMSs analyzed show that, when evaluating this aspect, it is necessary to consider the particularities of each DBMS and the degree of parallelism of the queries.

  18. Process planning and accuracy distribution of marine power plant modularization

    Directory of Open Access Journals (Sweden)

    ZHANG Jinguo

    2018-02-01

    Full Text Available [Objectives] Modular shipbuilding can shorten the cycle of design and construction, lower production costs and improve the quality of products, but it requires higher shipbuilding capability, especially for the installation of power plants. Because modular shipbuilding involves high-precision docking links, long equipment installation chains and numerous docking interfaces, docking installation is very difficult: high docking deviation and low installation accuracy can lead to abnormal vibration of the equipment. In order to solve this problem, [Methods] on the basis of domestic shipbuilding capability, numerical calculation methods are used to analyze the accuracy distribution of modular installation. [Results] The results show that the accuracy distribution across the different docking links is reasonable and feasible, and that the setting of the adjusting allowance matches the requirements of shipbuilding. [Conclusions] This method provides a reference for the modular construction of marine power plants.

  19. Distributed automatic control of technological processes in conditions of weightlessness

    Science.gov (United States)

    Kukhtenko, A. I.; Merkulov, V. I.; Samoylenko, Y. I.; Ladikov-Royev, Y. P.

    1986-01-01

    Some problems associated with the automatic control of liquid metal and plasma systems under conditions of weightlessness are examined, with particular reference to the problem of stability of liquid equilibrium configurations. The theoretical fundamentals of automatic control of processes in electrically conducting continuous media are outlined, and means of using electromagnetic fields for simulating technological processes in a space environment are discussed.

  20. Processes determining the marine alkalinity and carbonate saturation distributions

    OpenAIRE

    B. R. Carter; J. R. Toggweiler; R. M. Key; J. L. Sarmiento

    2014-01-01

    We introduce a composite tracer, Alk*, that has a global distribution primarily determined by CaCO3 precipitation and dissolution. Alk* also highlights riverine alkalinity plumes that are due to dissolved calcium carbonate from land. We estimate the Arctic receives approximately twice the riverine alkalinity per unit area as the Atlantic, and 8 times that of the other oceans. Riverine inputs broadly elevate Alk* in the Arctic surface and particularly near ri...

  1. Asymmetric photoelectron angular distributions from interfering photoionization processes

    International Nuclear Information System (INIS)

    Yin, Y.; Chen, C.; Elliott, D.S.; Smith, A.V.

    1992-01-01

    We have measured asymmetric photoelectron angular distributions for atomic rubidium. Ionization is induced by a one-photon interaction with 280 nm light and by a two-photon interaction with 560 nm light. Interference between the even- and odd-parity free-electron wave functions allows us to control the direction of maximum electron flux by varying the relative phase of the two laser fields

  2. Stochastic production phase design for an open pit mining complex with multiple processing streams

    Science.gov (United States)

    Asad, Mohammad Waqar Ali; Dimitrakopoulos, Roussos; van Eldert, Jeroen

    2014-08-01

    In a mining complex, the mine is a source of supply of valuable material (ore) to a number of processes that convert the raw ore to a saleable product or a metal concentrate for production of the refined metal. In this context, expected variation in metal content throughout the extent of the orebody defines the inherent uncertainty in the supply of ore, which impacts the subsequent ore and metal production targets. Traditional optimization methods for designing production phases and ultimate pit limit of an open pit mine not only ignore the uncertainty in metal content, but, in addition, commonly assume that the mine delivers ore to a single processing facility. A stochastic network flow approach is proposed that jointly integrates uncertainty in supply of ore and multiple ore destinations into the development of production phase design and ultimate pit limit. An application at a copper mine demonstrates the intricacies of the new approach. The case study shows a 14% higher discounted cash flow when compared to the traditional approach.

  3. Open source software in a practical approach for post processing of radiologic images.

    Science.gov (United States)

    Valeri, Gianluca; Mazza, Francesco Antonino; Maggi, Stefania; Aramini, Daniele; La Riccia, Luigi; Mazzoni, Giovanni; Giovagnoni, Andrea

    2015-03-01

    The purpose of this paper is to evaluate the use of open source software (OSS) to process DICOM images. We selected 23 programs for Windows and 20 programs for Mac from 150 possible OSS programs, including DICOM viewers and various tools (converters, DICOM header editors, etc.). The programs selected all meet the basic requirements such as free availability, stand-alone operation, presence of a graphical user interface, ease of installation and advanced features beyond simple image display. The capabilities of data import, data export, metadata handling, 2D viewing, 3D viewing, supported platforms and usability of each selected program were evaluated on a scale ranging from 1 to 10 points. Twelve programs received a score higher than or equal to eight. Among them, five obtained a score of 9: 3D Slicer, MedINRIA, MITK 3M3, VolView and VR Render, while OsiriX received 10. OsiriX appears to be the only program able to perform all the operations taken into consideration, similar to a workstation equipped with proprietary software, allowing the analysis and interpretation of images in a simple and intuitive way. OsiriX is a DICOM PACS workstation for medical imaging and software for image processing for medical research, functional imaging, 3D imaging, confocal microscopy and molecular imaging. This application is also a good tool for teaching activities because it facilitates the attainment of learning objectives among students and other specialists.
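
    As a concrete illustration of the basic DICOM handling that all of the evaluated programs build on, a few lines of Python with the pydicom package (a library rather than one of the reviewed GUI programs; the file name is hypothetical) are enough to inspect a file's header and pixel data:

      import pydicom

      ds = pydicom.dcmread("slice_0001.dcm")                 # parse the DICOM file
      print(ds.PatientID, ds.Modality, ds.Rows, ds.Columns)  # a few header fields
      pixels = ds.pixel_array                                # image data as a NumPy array
      print(pixels.shape, pixels.dtype)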

  4. Why an open common-knowledge process about decommissioning funds? How transparency supports democracy

    International Nuclear Information System (INIS)

    BOVY, Michel

    2006-01-01

    Future generations will receive funds and will have to manage the financial burdens linked to the technical heritage of past nuclear activities. This raises the challenges of ethical requirements in this particular field, its cultural background, and what it stands for. Another question is how the operators or the governmental bodies will interpret their decisions and justify them based on a hierarchy of principles in which utilitarianism and egalitarianism have a central meaning. We aim to show how a comparison of common criteria for decommissioning funds could help democracy, and how common knowledge could be developed through an open expertise process. The control function of the systems [1], which favours a democratic regulatory process in each country, calls for sufficient answers with regard to decommissioning funds compared to other essential social needs. It has to respond adequately to the population with a higher degree of transparency about the priority of choices between different ways of using and controlling these funds. This asks for more social accountability and makes experts more responsible to the society for which they should work. (author)

  5. A distributive peptide cyclase processes multiple microviridin core peptides within a single polypeptide substrate.

    Science.gov (United States)

    Zhang, Yi; Li, Kunhua; Yang, Guang; McBride, Joshua L; Bruner, Steven D; Ding, Yousong

    2018-05-03

    Ribosomally synthesized and post-translationally modified peptides (RiPPs) are an important family of natural products. Their biosynthesis follows a common scheme in which the leader peptide of a precursor peptide guides the modifications of a single core peptide. Here we describe biochemical studies of the processing of multiple core peptides within a precursor peptide, which is rare in RiPP biosynthesis. In a cyanobacterial microviridin pathway, an ATP-grasp ligase, AMdnC, installs up to two macrolactones on each of the three core peptides within AMdnA. The enzyme catalysis occurs in a distributive fashion and follows a non-strict N-to-C overall directionality, but a strict order in macrolactonizing each core peptide. Furthermore, AMdnC is catalytically versatile enough to process unnatural substrates carrying one to four core peptides, and kinetic studies provide insights into its catalytic properties. Collectively, our results reveal a distinct biosynthetic logic of RiPPs, opening up the possibility of modular production via synthetic biology approaches.

  6. Gap formation processes in a high-density plasma opening switch

    International Nuclear Information System (INIS)

    Grossmann, J.M.; Swanekamp, S.B.; Ottinger, P.F.; Commisso, R.J.; Hinshelwood, D.D.; Weber, B.V.

    1995-01-01

    A gap opening process in plasma opening switches (POS) is examined with the aid of numerical simulations. In these simulations, a high-density (n_e = 10^14 to 5x10^15 cm^-3) uniform plasma initially bridges a small section of the coaxial transmission line of an inductive energy storage generator. A short section of vacuum transmission line connects the POS to a short-circuit load. The results presented here extend previous simulations in the n_e = 10^12 to 10^13 cm^-3 density regime. The simulations show that a two-dimensional (2-D) sheath forms in the plasma near a cathode. This sheath is positively charged, and electrostatic sheath potentials develop that are large compared to the anode-cathode voltage. Initially, the 2-D sheath is located at the generator edge of the plasma. As ions are accelerated out of the sheath, it retains its original 2-D structure but migrates axially toward the load, creating a magnetically insulated gap in its wake. When the sheath reaches the load edge of the POS, the POS stops conducting current and the load current increases rapidly. At the end of the conduction phase a gap exists in the POS whose size is determined by the radial dimensions of the 2-D sheath. Simulations at various plasma densities and current levels show that the radial size of the gap scales roughly as B/n_e, where B is the magnetic field. The results of this work are discussed in the context of long-conduction-time POS physics, but exhibit the same physical gap formation mechanisms as earlier lower-density simulations more relevant to short-conduction-time POS.

  7. Parallel Hyperspectral Image Processing on Distributed Multi-Cluster Systems

    NARCIS (Netherlands)

    Liu, F.; Seinstra, F.J.; Plaza, A.J.

    2011-01-01

    Computationally efficient processing of hyperspectral image cubes can be greatly beneficial in many application domains, including environmental modeling, risk/hazard prevention and response, and defense/security. As individual cluster computers often cannot satisfy the computational demands of

  8. Integration of distributed computing into the drug discovery process.

    Science.gov (United States)

    von Korff, Modest; Rufener, Christian; Stritt, Manuel; Freyss, Joel; Bär, Roman; Sander, Thomas

    2011-02-01

    Grid computing offers an opportunity to gain massive computing power at low cost. We give a short introduction to the drug discovery process and exemplify the use of grid computing for image processing, docking and 3D pharmacophore descriptor calculations. The principle of a grid and its architecture are briefly explained. More emphasis is laid on the issues related to a company-wide grid installation and embedding the grid into the research process. The future of grid computing in drug discovery is discussed in the expert opinion section. Most needed, besides reliable algorithms to predict compound properties, is seamless embedding of the grid into the discovery process. User-friendly access to powerful algorithms without restrictions such as a limited number of licenses has to be the goal of grid computing in drug discovery.

  9. Analysis of the logistics processes in the wine distribution

    OpenAIRE

    Slavkovský, Matúš

    2011-01-01

    This Master's thesis addresses the importance of logistics in the retail business and the importance of reducing logistics costs. It includes theoretical knowledge as well as an analysis of the relevant markets that produce and consume wine in the largest quantities. The thesis focuses on an analysis of the logistics processes and costs of an e-shop. Based on this analysis, measures to improve the company's logistics processes are proposed. The goal of the Master's thesis is...

  10. Sources and processes affecting the distribution of dissolved Nd isotopes and concentrations in the West Pacific

    Science.gov (United States)

    Behrens, Melanie K.; Pahnke, Katharina; Schnetger, Bernhard; Brumsack, Hans-Jürgen

    2018-02-01

    In the Atlantic, where deep circulation is vigorous, the dissolved neodymium (Nd) isotopic composition (expressed as ɛNd) is largely controlled by water mass mixing. In contrast, the factors influencing the ɛNd distribution in the Pacific, marked by sluggish circulation, are not yet clear. There are indications of regional overprints in the Pacific, based on its bordering volcanic islands. Our study aims to clarify the impact and relative importance of different Nd sources (rivers, volcanic islands), vertical (bio)geochemical processes and lateral water mass transport in controlling dissolved ɛNd and Nd concentration ([Nd]) distributions in the West Pacific between South Korea and Fiji. We find indications of unradiogenic continental input from South Korean and Chinese rivers to the East China Sea. In the tropical West Pacific, volcanic islands supply Nd to surface and subsurface waters and modify their ɛNd to radiogenic values of up to +0.7. These radiogenic signatures allow detailed tracing of currents flowing to the east and differentiation from westward currents with open-ocean Pacific ɛNd composition in the complex tropical Pacific zonal current system. Modified radiogenic ɛNd of West Pacific intermediate to bottom waters upstream of or within our section also indicates non-conservative behavior of ɛNd due to boundary exchange at volcanic island margins, submarine ridges, and with hydrothermal particles. Only subsurface to deep waters (3000 m) in the open Northwest Pacific show conservative behavior of ɛNd. In contrast, we find a striking correlation of extremely low (down to 2.77 pmol/kg Nd) and laterally constant [Nd] with the high-salinity North and South Pacific Tropical Water, indicating lateral transport of preformed [Nd] from the North and South Pacific subtropical gyres into the study area. This observation also explains the previously observed low subsurface [Nd] in the tropical West Pacific. Similarly, Western South Pacific Central Water, Antarctic

  11. Python Open source Waveform ExtractoR (POWER): an open source, Python package to monitor and post-process numerical relativity simulations

    Science.gov (United States)

    Johnson, Daniel; Huerta, E. A.; Haas, Roland

    2018-01-01

    Numerical simulations of Einstein’s field equations provide unique insights into the physics of compact objects moving at relativistic speeds, and which are driven by strong gravitational interactions. Numerical relativity has played a key role to firmly establish gravitational wave astrophysics as a new field of research, and it is now paving the way to establish whether gravitational wave radiation emitted from compact binary mergers is accompanied by electromagnetic and astro-particle counterparts. As numerical relativity continues to blend in with routine gravitational wave data analyses to validate the discovery of gravitational wave events, it is essential to develop open source tools to streamline these studies. Motivated by our own experience as users and developers of the open source, community software, the Einstein Toolkit, we present an open source, Python package that is ideally suited to monitor and post-process the data products of numerical relativity simulations, and compute the gravitational wave strain at future null infinity in high performance environments. We showcase the application of this new package to post-process a large numerical relativity catalog and extract higher-order waveform modes from numerical relativity simulations of eccentric binary black hole mergers and neutron star mergers. This new software fills a critical void in the arsenal of tools provided by the Einstein Toolkit consortium to the numerical relativity community.
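
    The central post-processing step such a package performs, turning the Weyl scalar psi4 extracted from a simulation into the strain by two time integrations, can be sketched with fixed-frequency integration in the Fourier domain. This is a generic Python illustration of that technique, not POWER's actual interface; the cutoff frequency f0 and the synthetic input below are placeholders.

      import numpy as np

      def psi4_to_strain(t, psi4, f0=0.01):
          """Double time integration of psi4: divide by (2*pi*max(|f|, f0))**2 in
          the Fourier domain to suppress low-frequency drift, then invert."""
          dt = t[1] - t[0]
          freqs = np.fft.fftfreq(len(t), d=dt)
          psi4_f = np.fft.fft(psi4)
          denom = (2.0 * np.pi * np.maximum(np.abs(freqs), f0)) ** 2
          return np.fft.ifft(-psi4_f / denom)

      # Hypothetical usage on a synthetic (l=2, m=2) mode time series.
      t = np.linspace(0.0, 1000.0, 8192)
      psi4_22 = np.exp(1j * 0.2 * t) * np.exp(-((t - 500.0) / 100.0) ** 2)
      h_22 = psi4_to_strain(t, psi4_22, f0=0.01)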

  12. Prescription-induced jump distributions in multiplicative Poisson processes.

    Science.gov (United States)

    Suweis, Samir; Porporato, Amilcare; Rinaldo, Andrea; Maritan, Amos

    2011-06-01

    Generalized Langevin equations (GLE) with multiplicative white Poisson noise pose the usual prescription dilemma leading to different evolution equations (master equations) for the probability distribution. Contrary to the case of multiplicative Gaussian white noise, the Stratonovich prescription does not correspond to the well-known midpoint (or any other intermediate) prescription. By introducing an inertial term in the GLE, we show that the Itô and Stratonovich prescriptions naturally arise depending on two time scales, one induced by the inertial term and the other determined by the jump event. We also show that, when the multiplicative noise is linear in the random variable, one prescription can be made equivalent to the other by a suitable transformation in the jump probability distribution. We apply these results to a recently proposed stochastic model describing the dynamics of primary soil salinization, in which the salt mass balance within the soil root zone requires the analysis of different prescriptions arising from the resulting stochastic differential equation forced by multiplicative white Poisson noise, the features of which are tailored to the characters of the daily precipitation. A method is finally suggested to infer the most appropriate prescription from the data.
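
    The class of equations at issue can be stated schematically (illustrative notation, not necessarily the authors' conventions) as

      \dot{x}(t) = f(x) + g(x)\,\xi(t), \qquad \xi(t) = \sum_{i} b_i\, \delta(t - t_i),

    where the t_i are the arrival times of a Poisson process and the b_i the jump amplitudes. Because x changes discontinuously at each t_i, the value of x at which g is evaluated during the jump is ambiguous; that choice is exactly the Itô versus Stratonovich prescription question analyzed in the abstract.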

  13. Prescription-induced jump distributions in multiplicative Poisson processes

    Science.gov (United States)

    Suweis, Samir; Porporato, Amilcare; Rinaldo, Andrea; Maritan, Amos

    2011-06-01

    Generalized Langevin equations (GLE) with multiplicative white Poisson noise pose the usual prescription dilemma leading to different evolution equations (master equations) for the probability distribution. Contrary to the case of multiplicative Gaussian white noise, the Stratonovich prescription does not correspond to the well-known midpoint (or any other intermediate) prescription. By introducing an inertial term in the GLE, we show that the Itô and Stratonovich prescriptions naturally arise depending on two time scales, one induced by the inertial term and the other determined by the jump event. We also show that, when the multiplicative noise is linear in the random variable, one prescription can be made equivalent to the other by a suitable transformation in the jump probability distribution. We apply these results to a recently proposed stochastic model describing the dynamics of primary soil salinization, in which the salt mass balance within the soil root zone requires the analysis of different prescriptions arising from the resulting stochastic differential equation forced by multiplicative white Poisson noise, the features of which are tailored to the characters of the daily precipitation. A method is finally suggested to infer the most appropriate prescription from the data.

  14. Process for opening up carboniferous seams for underground gasification by drilling production holes downwards

    Energy Technology Data Exchange (ETDEWEB)

    Lokschin, J L; Volk, A F; Starinskii, A A

    1977-12-01

    This process will reduce drilling costs and times by 20 to 25% and will improve gasification under the influence of a thin liquid medium connecting adjacent holes. After determining the approximate depth and thickness of the seam to be opened up, e.g. by geological means, production holes of 100 to 400 mm diameter are drilled down to a depth of 400 m or more using well-known boring bars and chisels. After passing the top of the seam (the roof of the seam), which can be recognised by discoloration of the drilling liquid, one goes 1/2 to 1 metre deeper and determines the depth of the roof of the seam exactly, by the reduced natural radioactivity at the boundary layer, by introducing a gamma sensor on the boring bar. The production holes are taken down in a second borehole to a free space 0.6 to 2 metres above the floor of the seam (the bottom of the seam), according to the thickness of the seam. After replacing the boring bar by a feed pipe, drilling continues using a boring bar of smaller cutting diameter inside this tube. This hole reaches from the foot of the feed pipe to the floor of the seam. It is preferably flushed with gas but may be flushed with liquid. A thin liquid introduced into this hole penetrates the surrounding mass of the seam horizontally (unhindered by any armouring) and provides the required connection to neighbouring bores for gasification. The process is suitable for mining coal, combustible shale oil, bituminous rock and heavy natural oil, where the process is based on gasification, melting or dissolving of these deposits.

  15. Are anonymous evaluations a better assessment of faculty teaching performance? A comparative analysis of open and anonymous evaluation processes.

    Science.gov (United States)

    Afonso, Nelia M; Cardozo, Lavoisier J; Mascarenhas, Oswald A J; Aranha, Anil N F; Shah, Chirag

    2005-01-01

    We compared teaching performance of medical school faculty using anonymous evaluations and open evaluations (in which the evaluator was not anonymous) and examined barriers to open evaluation. Residents and medical students evaluated faculty using an open evaluation instrument in which their identity was indicated in the evaluation. Following this, they completed an anonymous evaluation of the same faculty members. Aggregate outcomes using the two evaluation systems were compared. Outcomes by group of evaluators (residents and students) were analyzed. Trainees were also asked to rate the barriers to the open evaluation process. A statistically significant difference between the open and anonymous evaluations was noted across all items, with faculty receiving lower scores on the anonymous evaluations. The mean score for all the items on the open evaluations was 4.45 +/- 0.65, compared to a mean score of 4.07 +/- 0.80 on the anonymous evaluations. There was also a statistically significant difference between open and anonymous evaluations in five clinical teaching domains that were evaluated individually. Residents perceived that the three most common barriers to optimal evaluation were an apprehension of possible encounters with the same attending physician in the future, destruction of working relationships with the attending, and a feeling of frustration with the evaluation system. The evaluation of faculty teaching performance is complex. Most academic medical centers use the open evaluation format. This study supports the case for the use of the anonymous evaluation method as a more accurate reflection of teaching performance.

  16. Open Water Processes of the San Francisco Estuary: From Physical Forcing to Biological Responses

    Directory of Open Access Journals (Sweden)

    Wim Kimmerer

    2004-02-01

    Full Text Available This paper reviews the current state of knowledge of the open waters of the San Francisco Estuary. This estuary is well known for the extent to which it has been altered through loss of wetlands, changes in hydrography, and the introduction of chemical and biological contaminants. It is also one of the most studied estuaries in the world, with much of the recent research effort aimed at supporting restoration efforts. In this review I emphasize the conceptual foundations for our current understanding of estuarine dynamics, particularly those aspects relevant to restoration. Several themes run throughout this paper. First is the critical role physical dynamics play in setting the stage for chemical and biological responses. Physical forcing by the tides and by variation in freshwater input combine to control the movement of the salinity field, and to establish stratification, mixing, and dilution patterns throughout the estuary. Many aspects of estuarine dynamics respond to interannual variation in freshwater flow; in particular, abundance of several estuarine-dependent species of fish and shrimp varies positively with flow, although the mechanisms behind these relationships are largely unknown. The second theme is the importance of time scales in determining the degree of interaction between dynamic processes. Physical effects tend to dominate when they operate at shorter time scales than biological processes; when the two time scales are similar, important interactions can arise between physical and biological variability. These interactions can be seen, for example, in the response of phytoplankton blooms, with characteristic time scales of days, to stratification events occurring during neap tides. The third theme is the key role of introduced species in all estuarine habitats; particularly noteworthy are introduced waterweeds and fishes in the tidal freshwater reaches of the estuary, and introduced clams there and in brackish water. The

  17. Automatic pre-processing for an object-oriented distributed hydrological model using GRASS-GIS

    Science.gov (United States)

    Sanzana, P.; Jankowfsky, S.; Branger, F.; Braud, I.; Vargas, X.; Hitschfeld, N.

    2012-04-01

    Landscapes are very heterogeneous, which impacts the hydrological processes occurring in catchments, especially in the modeling of peri-urban catchments. The Hydrological Response Units (HRUs), resulting from the intersection of different maps, such as land use, soil types and geology, and flow networks, allow the representation of these elements in an explicit way, preserving the natural and artificial contours of the different layers. These HRUs are used as the model mesh in some distributed object-oriented hydrological models, allowing the application of a topology-oriented approach. The connectivity between polygons and polylines provides a detailed representation of the water balance and overland flow in these distributed hydrological models, based on irregular hydro-landscape units. When computing fluxes between these HRUs, geometrical parameters such as the distance between the centroid of an HRU and the river network, and the length of the perimeter, can impact the realism of the calculated overland, sub-surface and groundwater fluxes. Therefore, it is necessary to process the original model mesh in order to avoid these numerical problems. We present an automatic pre-processing tool implemented in the open source GRASS-GIS software, for which several Python scripts and some already available algorithms, such as the Triangle software, were used. First, some scripts were developed to improve the topology of the various elements, such as snapping the river network to the closest contours. When data are derived from remote sensing, such as vegetation areas, their perimeters have many right angles, which were smoothed. Second, the algorithms more particularly address badly shaped elements of the model mesh, such as polygons with narrow shapes, markedly irregular contours and/or centroids outside the polygons. To identify these elements we used shape descriptors. The convexity index was considered the best descriptor to identify them with a threshold
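
    The convexity-index test mentioned above can be sketched in a few lines of Python with the shapely package (an illustration of the descriptor, not the authors' GRASS-GIS scripts; the threshold value and the example polygon are hypothetical). The index is taken here as the ratio of a polygon's area to the area of its convex hull, so a value well below 1 flags a badly shaped HRU.

      from shapely.geometry import Polygon

      def convexity_index(poly):
          """Polygon area divided by convex-hull area (1.0 for a convex shape)."""
          return poly.area / poly.convex_hull.area

      THRESHOLD = 0.8   # hypothetical cut-off below which an HRU is reshaped

      hru = Polygon([(0, 0), (4, 0), (4, 1), (1, 1), (1, 3), (0, 3)])  # L-shaped unit
      if convexity_index(hru) < THRESHOLD:
          print("badly shaped HRU, convexity =", round(convexity_index(hru), 2))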

  18. The certification process of the LHCb distributed computing software

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    DIRAC contains around 200 thousand lines of Python code, and LHCbDIRAC around 120 thousand. The testing process for each release consists of a number of steps that include static code analysis, unit tests, integration tests, regression tests and system tests. We dubbed the full p...

  19. Novel scaling of the multiplicity distributions in the sequential fragmentation process and in the percolation

    International Nuclear Information System (INIS)

    Botet, R.

    1996-01-01

    A novel scaling of the multiplicity distributions is found in the shattering phase of the sequential fragmentation process with inhibition. The same scaling law is shown to hold in the percolation process. (author)

  20. Joint probability distributions for a class of non-Markovian processes.

    Science.gov (United States)

    Baule, A; Friedrich, R

    2005-02-01

    We consider joint probability distributions for the class of coupled Langevin equations introduced by Fogedby [H. C. Fogedby, Phys. Rev. E 50, 1657 (1994)]. We generalize well-known results for the single-time probability distributions to the case of N -time joint probability distributions. It is shown that these probability distribution functions can be obtained by an integral transform from distributions of a Markovian process. The integral kernel obeys a partial differential equation with fractional time derivatives reflecting the non-Markovian character of the process.

  1. Log-normal distribution from a process that is not multiplicative but is additive.

    Science.gov (United States)

    Mouri, Hideaki

    2013-10-01

    The central limit theorem ensures that a sum of random variables tends to a Gaussian distribution as their total number tends to infinity. However, for a class of positive random variables, we find that the sum tends faster to a log-normal distribution. Although the sum tends eventually to a Gaussian distribution, the distribution of the sum is always close to a log-normal distribution rather than to any Gaussian distribution if the summands are numerous enough. This is in contrast to the current consensus that any log-normal distribution is due to a product of random variables, i.e., a multiplicative process, or equivalently to nonlinearity of the system. In fact, the log-normal distribution is also observable for a sum, i.e., an additive process that is typical of linear systems. We show conditions for such a sum, an analytical example, and an application to random scalar fields such as those of turbulence.
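
    A small numerical illustration of this claim (a sketch, not the paper's own construction): sum a modest number of positive, right-skewed random variables and compare the upper-tail quantiles of the sum with those of a fitted log-normal and a fitted Gaussian.

      import numpy as np

      rng = np.random.default_rng(0)
      n_terms, n_samples = 20, 100_000

      # Each summand is positive and right-skewed; s is the sum of 20 of them.
      summands = rng.lognormal(mean=0.0, sigma=1.0, size=(n_samples, n_terms))
      s = summands.sum(axis=1)

      # Fit a log-normal (via log-moments) and a Gaussian (via moments) to the sums.
      mu, sigma = np.log(s).mean(), np.log(s).std()
      lognormal_fit = rng.lognormal(mean=mu, sigma=sigma, size=n_samples)
      gaussian_fit = rng.normal(loc=s.mean(), scale=s.std(), size=n_samples)

      # The log-normal tracks the upper tail of the sum much more closely.
      for q in (0.90, 0.99, 0.999):
          print(q, np.quantile(s, q), np.quantile(lognormal_fit, q),
                np.quantile(gaussian_fit, q))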

  2. Supporting open collaboration in science through explicit and linked semantic description of processes

    Science.gov (United States)

    Gil, Yolanda; Michel, Felix; Ratnakar, Varun; Read, Jordan S.; Hauder, Matheus; Duffy, Christopher; Hanson, Paul C.; Dugan, Hilary

    2015-01-01

    The Web was originally developed to support collaboration in science. Although scientists benefit from many forms of collaboration on the Web (e.g., blogs, wikis, forums, code sharing, etc.), most collaborative projects are coordinated over email, phone calls, and in-person meetings. Our goal is to develop a collaborative infrastructure for scientists to work on complex science questions that require multi-disciplinary contributions to gather and analyze data, that cannot be answered without significant coordination to synthesize findings, and that grow organically to accommodate new contributors as needed as the work evolves over time. Our approach is to develop an organic data science framework that is based on a task-centered organization of the collaboration, includes principles from the social sciences for successful on-line communities, and exposes an open science process. Our approach is implemented as an extension of a semantic wiki platform, and captures formal representations of task decomposition structures, relations between tasks and users, and other properties of tasks, data, and other relevant science objects. All these entities are captured through the semantic wiki user interface, represented as semantic web objects, and exported as linked data.

  3. Numerical simulation for arc-plasma dynamics during contact opening process in electrical circuit-breakers

    International Nuclear Information System (INIS)

    Gupta, D N; Srinivas, D; Patil, G N; Kale, S S; Potnis, S B

    2010-01-01

    The high-energy, high-current thermal plasma that develops between electric contacts in a gas circuit-breaker during circuit interruption is an important phenomenon in the power transmission industry. The high-temperature, high-pressure arc dissipates the tremendous amount of energy generated by the fault current. Simultaneously, this energy has to be transferred away from the contacts to build up the dielectric strength of the circuit-breaker. In order to interrupt the current, the arc must be weakened and finally extinguished. We model these phenomena using a computer code based on the solution of the unsteady Euler equations of gas dynamics. We consider the equations of fluid flow. These equations are solved numerically in complex circuit-breaker geometries using a finite-volume method. The domain is initially filled with SF6 gas. We begin our simulations from the cold mode, where the fault current is not present (hence no arc). An axisymmetric geometry of a 145 kV gas circuit-breaker is considered to study the pressure, density and temperature profiles during the contact opening process.
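
    For reference, the unsteady Euler equations referred to above can be written in conservation form (a generic statement; the axisymmetric geometry and the arc energy input add source terms not shown here):

      \partial_t \rho + \nabla \cdot (\rho \mathbf{u}) = 0,
      \partial_t (\rho \mathbf{u}) + \nabla \cdot (\rho \mathbf{u} \otimes \mathbf{u} + p\,\mathbf{I}) = 0,
      \partial_t E + \nabla \cdot \big[(E + p)\,\mathbf{u}\big] = 0,

    where \rho is the density, \mathbf{u} the velocity, p the pressure and E the total energy per unit volume; the finite-volume method integrates these conservation laws over each cell of the breaker geometry.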

  4. Investigation of Thermal Stress Distribution in Laser Spot Welding Process

    OpenAIRE

    Osamah F. Abdulateef

    2009-01-01

    The objective of this paper was to study the laser spot welding process for low-carbon steel sheet. The investigations were based on analytical and finite element analyses. The analytical analysis focused on a consistent set of equations representing the interaction of the laser beam with the material. The numerical analysis, based on 3-D finite element analysis of heat flow during laser spot welding, took into account the temperature dependence of the physical properties and the latent heat of transf...

  5. A trial of distributed portable data acquisition and processing system implementation: the qdpb - data processing with branchpoints

    International Nuclear Information System (INIS)

    Gritsaj, K.I.; Isupov, A.Yu.

    2001-01-01

    A trial implementation of the distributed portable data acquisition and processing system qdpb is presented. The experimental-setup-specific data- and hardware-dependent code is separated from the generic part of the qdpb system. The implementation of the generic part is described.

  6. Distributed Processing System for Restoration of Electric Power Distribution Network Using Two-Layered Contract Net Protocol

    Science.gov (United States)

    Kodama, Yu; Hamagami, Tomoki

    A distributed processing system for restoration of electric power distribution networks using a two-layered CNP is proposed. The goal of this study is to develop a restoration system that adjusts to the future power network with distributed generators. The novelty of this study is that the two-layered CNP is applied to a distributed computing environment in practical use. The two-layered CNP has two classes of agents in the network, named field agents and operating agents. In order to avoid task conflicts, the operating agent controls the privilege of managers to send task announcement messages in the CNP. This technique realizes coordination between agents that work asynchronously and in parallel. Moreover, this study implements the distributed processing system using a de-facto standard multi-agent framework, JADE (Java Agent DEvelopment framework). Simulation experiments on power distribution network restoration were conducted, comparing the proposed system with the previous system; the results confirm the effectiveness of the proposed system.
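
    The two-layered idea can be sketched in simplified Python (illustrative only; the actual system uses JADE agents exchanging messages, and the class and task names below are hypothetical): the operating agent grants the announcement privilege to one manager at a time, and that manager then runs an ordinary CNP round of announcing, collecting bids and awarding.

      class OperatingAgent:
          """Upper layer: serializes task announcements to avoid conflicting awards."""
          def __init__(self):
              self._holder = None

          def request_privilege(self, manager):
              if self._holder is None:
                  self._holder = manager
                  return True
              return False                      # another manager is already announcing

          def release_privilege(self, manager):
              if self._holder is manager:
                  self._holder = None

      class FieldAgent:
          """Lower layer: ordinary CNP participant bidding on restoration tasks."""
          def __init__(self, name, spare_capacity):
              self.name, self.spare_capacity = name, spare_capacity

          def bid(self, task_load):
              # Bid with the remaining margin, or decline if capacity is insufficient.
              return self.spare_capacity - task_load if self.spare_capacity >= task_load else None

      def run_cnp_round(manager, task_load, agents, operating_agent):
          if not operating_agent.request_privilege(manager):
              return None                       # wait and retry in a real system
          bids = {a.name: a.bid(task_load) for a in agents if a.bid(task_load) is not None}
          winner = max(bids, key=bids.get) if bids else None
          operating_agent.release_privilege(manager)
          return winner

      # Restore a 40-unit load using whichever feeder has the most spare capacity.
      feeders = [FieldAgent("feeder-A", 50), FieldAgent("feeder-B", 30), FieldAgent("feeder-C", 80)]
      print(run_cnp_round("manager-1", 40, feeders, OperatingAgent()))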

  7. SCT: Spinal Cord Toolbox, an open-source software for processing spinal cord MRI data.

    Science.gov (United States)

    De Leener, Benjamin; Lévy, Simon; Dupont, Sara M; Fonov, Vladimir S; Stikov, Nikola; Louis Collins, D; Callot, Virginie; Cohen-Adad, Julien

    2017-01-15

    For the past 25 years, the field of neuroimaging has witnessed the development of several software packages for processing multi-parametric magnetic resonance imaging (mpMRI) to study the brain. These software packages are now routinely used by researchers and clinicians, and have contributed to important breakthroughs for the understanding of brain anatomy and function. However, no software package exists to process mpMRI data of the spinal cord. Despite the numerous clinical needs for such advanced mpMRI protocols (multiple sclerosis, spinal cord injury, cervical spondylotic myelopathy, etc.), researchers have been developing specific tools that, while necessary, do not provide an integrative framework that is compatible with most usages and that is capable of reaching the community at large. This hinders cross-validation and the possibility to perform multi-center studies. In this study we introduce the Spinal Cord Toolbox (SCT), a comprehensive software dedicated to the processing of spinal cord MRI data. SCT builds on previously-validated methods and includes state-of-the-art MRI templates and atlases of the spinal cord, algorithms to segment and register new data to the templates, and motion correction methods for diffusion and functional time series. SCT is tailored towards standardization and automation of the processing pipeline, versatility, modularity, and it follows guidelines of software development and distribution. Preliminary applications of SCT cover a variety of studies, from cross-sectional area measures in large databases of patients, to the precise quantification of mpMRI metrics in specific spinal pathways. We anticipate that SCT will bring together the spinal cord neuroimaging community by establishing standard templates and analysis procedures.

  8. The Distribution of the Interval between Events of a Cox Process with Shot Noise Intensity

    Directory of Open Access Journals (Sweden)

    Angelos Dassios

    2008-01-01

    Full Text Available Applying the theory of piecewise deterministic Markov processes, we obtain the probability generating function of a Cox process with a shot noise process as the claim intensity. We also derive the Laplace transform of the distribution of the shot noise process at claim jump times, using the stationarity assumption for the shot noise process. Based on this Laplace transform and on the probability generating function of the Cox process with shot noise intensity, we obtain the distribution of the interval between events of a Cox process with shot noise intensity for insurance claims, together with its moments, that is, the mean and variance.
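
    In the form typically used for insurance claim modelling (an illustrative sketch of the setting, not the paper's derivation), the shot noise claim intensity is

      \lambda_t = \lambda_0\, e^{-\delta t} + \sum_{i=1}^{M_t} Y_i\, e^{-\delta (t - s_i)},

    where the catastrophe times s_i arrive according to a Poisson process M_t, the Y_i are the jump sizes, and \delta is the exponential decay rate; the claim arrival process is then a Cox process directed by \lambda_t.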

  9. On process capability and system availability analysis of the inverse Rayleigh distribution

    Directory of Open Access Journals (Sweden)

    Sajid Ali

    2015-04-01

    Full Text Available In this article, process capability and system availability analysis is discussed for the inverse Rayleigh lifetime distribution. A Bayesian approach with a conjugate gamma prior is adopted for the analysis. Different types of loss functions are considered to find Bayes estimates of the process capability and system availability. A simulation study is conducted to compare the different loss functions.
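
    For reference (a standard parameterisation, assumed here rather than taken from the article), the inverse Rayleigh distribution with scale parameter \theta > 0 has

      F(x;\theta) = e^{-\theta/x^{2}}, \qquad f(x;\theta) = \frac{2\theta}{x^{3}}\, e^{-\theta/x^{2}}, \qquad x > 0,

    so that for a sample x_1, ..., x_n the likelihood is proportional to \theta^{n} \exp\big(-\theta \sum_i x_i^{-2}\big), which is why a gamma prior on \theta is conjugate.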

  10. Equivalence of functional limit theorems for stationary point processes and their Palm distributions

    NARCIS (Netherlands)

    Nieuwenhuis, G.

    1989-01-01

    Let P be the distribution of a stationary point process on the real line and let P0 be its Palm distribution. In this paper we consider two types of functional limit theorems, those in terms of the number of points of the point process in (0, t] and those in terms of the location of the nth point

  11. Design of distributed systems of hydrolithosphere processes management. Selection of optimal number of extracting wells

    Science.gov (United States)

    Pershin, I. M.; Pervukhin, D. A.; Ilyushin, Y. V.; Afanaseva, O. V.

    2017-10-01

    The article considers the important issue of designing distributed systems for the management of hydrolithosphere processes. Control actions on the hydrolithosphere processes are implemented by a set of extractive wells. The article shows how to determine the optimal number of extractive wells that provide a distributed control impact on the managed object.

  12. A reconfigurable strategy for distributed digital process control

    International Nuclear Information System (INIS)

    Garcia, H.E.; Ray, A.; Edwards, R.M.

    1990-01-01

    A reconfigurable control scheme is proposed which, unlike a preprogrammed one, uses stochastic automata to learn the current operating status of the environment (i.e., the plant, controller, and communication network) by dynamically monitoring the system performance and then switching to the appropriate controller on the basis of these observations. The potential applicability of this reconfigurable control scheme to electric power plants is being investigated. The plant under consideration is the Experimental Breeder Reactor II (EBR-II) at the Argonne National Laboratory site in Idaho. The distributed control system is emulated on a ring network where the individual subsystems are hosted as follows: (1) the reconfigurable control modules are located in one of the network modules, called the Multifunction Controller; (2) the learning modules are resident in a VAX 11/785 mainframe computer; and (3) a detailed model of the plant under control is executed in the same mainframe. This configuration is a true representation of the network-based control system in the sense that it operates in real time and is capable of interacting with the actual plant.
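
    The record does not spell out the learning rule; the hedged Python sketch below illustrates one classical choice, a linear reward-inaction learning automaton that shifts probability mass toward the controller whose observed performance is acceptable. The reward test, controller names, and success rates are illustrative assumptions.

      # Linear reward-inaction (L_R-I) automaton choosing among candidate controllers.
      # After each control interval the environment reports success (reward) or failure;
      # on reward the probability of the chosen action is increased, on failure nothing changes.
      import random

      def update_probabilities(p, chosen, rewarded, a=0.1):
          """One L_R-I step: p is the action-probability vector."""
          if not rewarded:
              return p                       # inaction on penalty
          return [pi + a * (1.0 - pi) if i == chosen else pi * (1.0 - a)
                  for i, pi in enumerate(p)]

      def simulate(success_rates, steps=2000, seed=0):
          random.seed(seed)
          p = [1.0 / len(success_rates)] * len(success_rates)
          for _ in range(steps):
              chosen = random.choices(range(len(p)), weights=p)[0]
              rewarded = random.random() < success_rates[chosen]   # environment feedback
              p = update_probabilities(p, chosen, rewarded)
          return p

      if __name__ == "__main__":
          # Hypothetical per-controller probabilities of acceptable performance.
          controllers = {"PI": 0.60, "gain-scheduled": 0.75, "robust": 0.90}
          final = simulate(list(controllers.values()))
          for (name, _), prob in zip(controllers.items(), final):
              print(f"{name:>15s}: selection probability {prob:.2f}")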

  13. Open Data Within governmental Organisations: Effects, Benefits and Challenges of the Implementation Process

    Directory of Open Access Journals (Sweden)

    Martijn Hartog

    2014-10-01

    Full Text Available This article describes the growth of open government, open data and the means for transparency and accountability, but aims to reflect on the bottlenecks and the actual practicality of opening data to the public domain by two governmental bodies. The Municipality of The Hague and the Province of South Holland in The Netherlands are part of two research programmes called ‘Government of the Future’, whose main goals are to explore and establish knowledge on societal innovation through new applications and the possible long-term effects of ICTs in the public sector. Among the themes of these programmes are transparency and open data, which are viewed from the somewhat pragmatic and operational side of their applicability. The paper shows the development within the governmental bodies and captures their ‘readiness’ for open data.

  14. When to make proprietary software open source

    NARCIS (Netherlands)

    Caulkins, J.P.; Feichtinger, G.; Grass, D.; Hartl, R.F.; Kort, P.M.; Seidl, A.

    Software can be distributed closed source (proprietary) or open source (developed collaboratively). While a firm cannot sell open source software, and so loses potential sales revenue, the open source software development process can have a substantial positive impact on the quality of a software,

  15. Web Services Implementations at Land Process and Goddard Earth Sciences Distributed Active Archive Centers

    Science.gov (United States)

    Cole, M.; Bambacus, M.; Lynnes, C.; Sauer, B.; Falke, S.; Yang, W.

    2007-12-01

    NASA's vast array of scientific data within its Distributed Active Archive Centers (DAACs) is especially valuable both to traditional research scientists and to the emerging market of Earth Science Information Partners. For example, the air quality science and management communities are increasingly using satellite-derived observations in their analyses and decision making. The Air Quality Cluster in the Federation of Earth Science Information Partners (ESIP) uses web infrastructures of interoperability, or Service Oriented Architecture (SOA), to extend data exploration, use, and analysis, and provides a user environment for DAAC products. In an effort to continually offer these NASA data to the broadest research community audience, and reusing emerging technologies, both NASA's Goddard Earth Sciences (GES) and Land Processes (LP) DAACs have engaged in a web services pilot project. Through these projects both GES and LP have exposed data through the Open Geospatial Consortium's (OGC) Web Services standards. Reusing several different existing applications and implementation techniques, GES and LP successfully exposed a variety of data through distributed systems to be ingested into multiple end-user systems. The results of this project will enable researchers worldwide to access some of NASA's GES and LP DAAC data through OGC protocols. This functionality encourages interdisciplinary research while increasing data use through advanced technologies. This paper will concentrate on the implementation and use of OGC Web Services, specifically Web Map and Web Coverage Services (WMS, WCS), at the GES and LP DAACs, and the value of these services within scientific applications, including integration with the DataFed air quality web infrastructure and the development of data analysis web applications.
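
    To make the service interface concrete, the sketch below builds a generic OGC WMS 1.1.1 GetMap request in Python. The endpoint URL and layer name are placeholders, not actual DAAC addresses; only the standard WMS query parameters are assumed.

      # Build a minimal OGC WMS 1.1.1 GetMap request URL (parameter names per the WMS spec).
      from urllib.parse import urlencode

      BASE_URL = "https://example.org/wms"        # placeholder endpoint, not a real DAAC address
      params = {
          "SERVICE": "WMS",
          "VERSION": "1.1.1",
          "REQUEST": "GetMap",
          "LAYERS": "aerosol_optical_depth",      # hypothetical layer name
          "STYLES": "",
          "SRS": "EPSG:4326",
          "BBOX": "-180,-90,180,90",              # minx,miny,maxx,maxy
          "WIDTH": "1024",
          "HEIGHT": "512",
          "FORMAT": "image/png",
      }
      url = BASE_URL + "?" + urlencode(params)
      print(url)
      # A GET request to this URL (e.g. with urllib.request.urlopen) would return the rendered map image.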

  16. Ultralow field emission from thinned, open-ended, and defected carbon nanotubes by using microwave hydrogen plasma processing

    Energy Technology Data Exchange (ETDEWEB)

    Deng, Jian-Hua, E-mail: jhdeng1983@163.com [College of Physics and Materials Science, Tianjin Normal University, Tianjin 300387 (China); Cheng, Lin; Wang, Fan-Jie; Yu, Bin; Li, Guo-Zheng; Li, De-Jun [College of Physics and Materials Science, Tianjin Normal University, Tianjin 300387 (China); Cheng, Guo-An [Key Laboratory of Beam Technology and Material Modification of Ministry of Education, Beijing Normal University, Beijing 100875 (China)

    2015-01-01

    Graphical abstract: Thinned, open-ended, and defected carbon nanotubes were prepared by using hydrogen plasma processing. The processed carbon nanotubes have far better field emission performance than that of the pristine ones. - Highlights: • CVD prepared CNT arrays were processed by microwave hydrogen plasma. • Thinned, open-ended, and defected CNTs were obtained. • Processed CNTs have far better field emission performance than the pristine ones. • Processed CNTs have applicable emission stability after being perfectly aged. - Abstract: Ultralow field emission is achieved from carbon nanotubes (CNTs) by using microwave hydrogen plasma processing. After the processing, typical capped CNT tips are removed, with thinned, open-ended, and defected CNTs left. Structural analyses indicate that the processed CNTs have more sp³-hybridized defects as compared to the pristine ones. The morphology of CNTs can be readily controlled by adjusting microwave powers, which change the shape of CNTs by means of hydrogen plasma etching. Processed CNTs with optimal morphology are found to have an ultralow turn-on field of 0.566 V/μm and threshold field of 0.896 V/μm, much better than 0.948 and 1.559 V/μm of the as-grown CNTs, respectively. This improved FE performance is ascribed to the structural changes of CNTs after the processing. The thinned and open-ended shape of CNTs can facilitate electron tunneling through barriers and additionally, the increased defects at tube walls can serve as new active emission sites. Furthermore, our plasma processed CNTs exhibit excellent field emission stability at a large emission current density of 10.36 mA/cm² after being perfectly aged, showing promising prospects in applications as high-performance vacuum electron sources.

  17. Distributed Sensing and Processing for Multi-Camera Networks

    Science.gov (United States)

    Sankaranarayanan, Aswin C.; Chellappa, Rama; Baraniuk, Richard G.

    Sensor networks with large numbers of cameras are becoming increasingly prevalent in a wide range of applications, including video conferencing, motion capture, surveillance, and clinical diagnostics. In this chapter, we identify some of the fundamental challenges in designing such systems: robust statistical inference, computational efficiency, and opportunistic and parsimonious sensing. We show that the geometric constraints induced by the imaging process are extremely useful for identifying and designing optimal estimators for object detection and tracking tasks. We also derive pipelined and parallelized implementations of popular tools used for statistical inference in non-linear systems, of which multi-camera systems are examples. Finally, we highlight the use of the emerging theory of compressive sensing in reducing the amount of data sensed and communicated by a camera network.

  18. Open Government and (Linked) (Open) (Government) (Data)

    Directory of Open Access Journals (Sweden)

    Christian Philipp Geiger

    2012-12-01

    Full Text Available This article explores the opening up and free use of stored public sector data supplied by the state. In the age of Open Government and Open Data it is not enough just to put data online. Rather, it should be weighed whether, how and which public sector data can be published. Open Data are defined as stored data that can be made accessible in the public interest without any restrictions on usage and distribution. Such Open Data may include statistics, geodata, maps, plans, environmental data and weather data, in addition to materials from parliaments, ministries and authorities. The preparation of and free access to existing data permit varied approaches to the reuse of data, which are discussed in the article. In addition, impulses can be given for Open Government, the opening of state and administration, towards more transparency, participation and collaboration as well as innovation and business development. The Open Data movement tries to get to the bottom of current publication processes in the public sector, which could be made even more friendly to citizens and enterprises.

  19. "O" ring sealed process tube, Phase II, test project

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, R.E.

    1951-04-09

    The "O" ring seal has been proposed to replace the Van Stone flange and the bellows thermal expansion assembly currently used on the existing Hanford piles to achieve water and gas seals, respectively. Possible advantages of the "O" ring seal are: (1) simplification of component parts and elimination of Van Stone corrosion; (2) simplification of maintenance; (3) lower costs of initial erection; (4) increased strength. This test supplements Test Project No. 27 (a preliminary thermal cycling test) in applying the "O" ring seal assembly to actual pile operating conditions.

  20. Data processing and distribution in the PAMELA experiment

    International Nuclear Information System (INIS)

    Casolino, M.; Nagni, M.

    2007-01-01

    YODA is a semi-automated data handling and analysis system for the PAMELA space experiment. The core routines have been developed to process the stream of raw data downlinked from the Resurs DK1 satellite (which houses PAMELA) to the ground station in Moscow. Raw data consist of scientific data and engineering information. Housekeeping information is analyzed within a short time of download (∼hours) in order to monitor the status of the experiment and to support mission planning. A prototype for data visualization runs on an Apache Tomcat web application server, providing an off-line analysis tool accessible through a browser as well as part of the code for system maintenance. A quicklook system with a GUI is used for operator monitoring and fast macrocommand issuing. On a longer timescale, scientific data are analyzed, calibrations are performed and the database is updated. The data storage core is composed of CERN's ROOT file structure and MySQL as a relational database. YODA++ is currently being used in the integration and testing of ground PAMELA data.

  1. Experimental investigation of open circuit voltage during start-up process of HT-PEMFC

    International Nuclear Information System (INIS)

    Abdul Rasheed, Raj Kamal; Chan, Siew Hwa

    2015-01-01

    Highlights: • OCV reduces non-linearly with temperature under constant power input. • The reduction gradient of OCV is observed to be non-linear with time. • The Nernst equation is less accurate for HT-PEMFC start-up models. - Abstract: This paper investigates the open circuit voltage (OCV) during the warm-up process of a high temperature proton exchange membrane fuel cell (HT-PEMFC) from 140 °C to the desired temperature of 180 °C, where the temperature increases with time. The heating strategy involves the external heating of the fuel cell at a constant heat input rate. The Nernst equation, commonly used to predict the OCV of the fuel cell, is usually employed in transient start-up models. Thus, this paper highlights the limitations of using the Nernst equation where the temperature increases transiently with time. A polybenzimidazole-based HT-PEM single cell was set up and the OCV was measured under constant heating power supplied by an external source. A parametric study was done by varying the external heating power and observing the effect on the OCV. The results showed that the OCV reduces non-linearly with respect to temperature when the fuel cell is subjected to a constant heating power. This behaviour is clearly in contrast with the Nernst equation, which treats the temperature as steady state. For effective comparison, the OCV was also measured at steady-state temperatures, showing an almost constant reduction gradient of ∼ −2.3×10⁻⁴ V/°C. However, the behaviour under a constant heating power shows a curvilinear reduction of the OCV as the temperature increases. In addition, as the external heating power is increased, the degree of curvature of the OCV profile is greater. Thus, the results clearly indicate that the accuracy of using the Nernst equation in transient thermal start-up models can be improved by considering this non-linear behaviour, as shown in this paper.

  2. Auditory Processing Assessment in Children with Attention Deficit Hyperactivity Disorder: An Open Study Examining Methylphenidate Effects.

    Science.gov (United States)

    Lanzetta-Valdo, Bianca Pinheiro; Oliveira, Giselle Alves de; Ferreira, Jane Tagarro Correa; Palacios, Ester Miyuki Nakamura

    2017-01-01

    Introduction Children with Attention Deficit Hyperactivity Disorder (ADHD) can present Auditory Processing (AP) Disorder. Objective The study examined AP in ADHD children compared with non-ADHD children, and before and after 3 and 6 months of methylphenidate (MPH) treatment in ADHD children. Methods Drug-naive children diagnosed with ADHD combined subtype, aged 7 to 11 years, coming from public and private outpatient services or public and private schools, and age- and gender-matched non-ADHD children, participated in an open, non-randomized study from February 2013 to December 2013. They were submitted to a behavioral battery of AP tests comprising Speech with White Noise (SN), Dichotic Digits (DD), and Pitch Pattern Sequence (PPS) and were compared with non-ADHD children. They were followed for 3 and 6 months of MPH treatment (0.5 mg/kg/day). Results ADHD children presented a larger number of errors in DD (p < 0.01), and fewer correct responses in the PPS (p < 0.0001) and SN (p < 0.05) tests when compared with non-ADHD children. Treatment with MPH, especially over 6 months, significantly decreased the mean errors in the DD test (p < 0.01) and increased the correct responses in the PPS (p < 0.001) and SN (p < 0.01) tests when compared with performance before MPH treatment. Conclusions ADHD children show inefficient AP in the selected behavioral auditory battery, suggesting impairment in auditory closure, binaural integration, and temporal ordering. Treatment with MPH gradually improved these deficiencies and completely reversed them, reaching a performance similar to non-ADHD children at 6 months of treatment.

  3. An open source Bayesian Monte Carlo isotope mixing model with applications in Earth surface processes

    Science.gov (United States)

    Arendt, Carli A.; Aciego, Sarah M.; Hetland, Eric A.

    2015-05-01

    The implementation of isotopic tracers as constraints on source contributions has become increasingly relevant to understanding Earth surface processes. Interpretation of these isotopic tracers has become more accessible with the development of Bayesian Monte Carlo (BMC) mixing models, which allow uncertainty in mixing end-members and provide methodology for systems with multicomponent mixing. This study presents an open source multiple isotope BMC mixing model that is applicable to Earth surface environments with sources exhibiting distinct end-member isotopic signatures. Our model is first applied to new δ18O and δD measurements from the Athabasca Glacier, which showed the expected seasonal melt evolution trends and allowed the statistical relevance of the resulting fraction estimates to be rigorously assessed. To highlight the broad applicability of our model to a variety of Earth surface environments and relevant isotopic systems, we expand our model to two additional case studies: deriving melt sources from δ18O, δD, and 222Rn measurements of Greenland Ice Sheet bulk water samples and assessing nutrient sources from ɛNd and 87Sr/86Sr measurements of Hawaiian soil cores. The model produces results for the Greenland Ice Sheet and Hawaiian soil data sets that are consistent with the originally published fractional contribution estimates. The advantage of this method is that it quantifies the error induced by variability in the end-member compositions, which was not captured by the models previously applied to the above case studies. Results from all three case studies demonstrate the broad applicability of this statistical BMC isotopic mixing model for estimating source contribution fractions in a variety of Earth surface systems.
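
    The paper's own code is not reproduced here; the following is a hedged, minimal Python sketch of the general Bayesian Monte Carlo mixing idea for two end-members and one tracer: end-member compositions are drawn from their uncertainty distributions, candidate mixing fractions are proposed, and only combinations consistent with the measured mixture are retained. All numerical values are illustrative assumptions.

      # Minimal two-end-member Bayesian Monte Carlo isotope mixing sketch (illustrative values).
      import numpy as np

      rng = np.random.default_rng(42)

      # Hypothetical end-member delta-18O signatures (mean, 1-sigma) and a measured mixture.
      snow        = (-22.0, 0.8)     # end-member 1
      glacier_ice = (-18.0, 0.6)     # end-member 2
      mixture     = (-19.5, 0.3)     # measured bulk meltwater

      n = 200_000
      e1 = rng.normal(*snow, n)            # sample end-member 1 composition
      e2 = rng.normal(*glacier_ice, n)     # sample end-member 2 composition
      f  = rng.uniform(0.0, 1.0, n)        # uninformative prior on the mixing fraction

      predicted = f * e1 + (1.0 - f) * e2  # linear mixing model
      observed  = rng.normal(*mixture, n)  # measurement uncertainty on the mixture

      keep = np.abs(predicted - observed) < 0.1   # simple acceptance tolerance
      posterior_f = f[keep]
      print(f"accepted {posterior_f.size} of {n} draws")
      print(f"snow fraction: median {np.median(posterior_f):.2f}, "
            f"95% interval [{np.percentile(posterior_f, 2.5):.2f}, "
            f"{np.percentile(posterior_f, 97.5):.2f}]")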

  4. Effect of solid distribution on elastic properties of open-cell cellular solids using numerical and experimental methods.

    Science.gov (United States)

    Zargarian, A; Esfahanian, M; Kadkhodapour, J; Ziaei-Rad, S

    2014-09-01

    The effect of solid distribution between the edges and vertices of a three-dimensional cellular solid with an open-cell structure was investigated both numerically and experimentally. Finite element analysis (FEA) with continuum elements and appropriate periodic boundary conditions was employed to calculate the elastic properties of cellular solids using a tetrakaidecahedral (Kelvin) unit cell. Relative densities between 0.01 and 0.1 and various values of the solid fraction were considered. In order to validate the numerical model, three scaffolds with a relative density of 0.08, but different amounts of solid in the vertices, were fabricated via a 3-D printing technique. Good agreement was observed between numerical simulation and experimental results. Results of the numerical simulation showed that, at low relative densities (solid fraction in vertices. By fitting a curve to the data obtained from the numerical simulation and considering the relative density and solid fraction in vertices, empirical relations were derived for Young's modulus and Poisson's ratio. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. An Open Source Software and Web-GIS Based Platform for Airborne SAR Remote Sensing Data Management, Distribution and Sharing

    Science.gov (United States)

    Changyong, Dou; Huadong, Guo; Chunming, Han; Ming, Liu

    2014-03-01

    With more and more Earth observation data available to the community, how to manage and share these valuable remote sensing datasets is becoming an urgent issue. Web-based Geographical Information System (GIS) technology provides a convenient way for users in different locations to share and make use of the same dataset. In order to make efficient use of the airborne Synthetic Aperture Radar (SAR) remote sensing data acquired by the Airborne Remote Sensing Center of the Institute of Remote Sensing and Digital Earth (RADI), Chinese Academy of Sciences (CAS), a Web-GIS based platform for airborne SAR data management, distribution and sharing was designed and developed. The major features of the system include a map-based navigation search interface, full-resolution imagery shown overlaid on the map, and the exclusive use of Open Source Software (OSS) throughout the platform. The functions of the platform include browsing the imagery on the map-based navigation interface, ordering and downloading data online, and image dataset and user management. At present, the system is under testing at RADI and will come into regular operation soon.

  6. Marine mammal distribution in the open ocean: a comparison of ocean color data products and relevant time scales

    Science.gov (United States)

    Ohern, J.

    2016-02-01

    Marine mammals are generally located in areas of enhanced surface primary productivity, though they may forage much deeper within the water column and higher on the food chain. Numerous studies over the past several decades have utilized ocean color data from remote sensing instruments (CZCS, MODIS, and others) to assess both the quantity and the time scales over which surface primary productivity relates to marine mammal distribution. In areas of sustained upwelling, primary productivity may essentially grow into the secondary levels of productivity (the zooplankton and nektonic species on which marine mammals forage). However, in many open ocean habitats a simple trophic cascade does not explain the relatively short time lags between enhanced surface productivity and marine mammal presence. Other dynamic features that entrain prey or attract marine mammals may be responsible for the correlations between marine mammals and ocean color. In order to investigate these features, two MODIS (Moderate Resolution Imaging Spectroradiometer) data products, the concentration and the standard deviation of surface chlorophyll, were used in conjunction with marine mammal sightings collected within Ecuadorian waters. Time lags between enhanced surface chlorophyll and marine mammal presence were on the order of 2-4 weeks; however, correlations were much stronger when the standard deviation of spatially binned images was used rather than the chlorophyll concentrations. Time lags also varied between balaenopterid and odontocete cetaceans. Overall, the standard deviation of surface chlorophyll proved a useful tool for assessing potential relationships between marine mammal sightings and surface chlorophyll.

  7. Development and evaluation of an open-source, low-cost distributed sensor network for environmental monitoring applications

    Science.gov (United States)

    Gunawardena, N.; Pardyjak, E. R.; Stoll, R.; Khadka, A.

    2018-02-01

    Over the last decade there has been a proliferation of low-cost sensor networks that enable highly distributed sensor deployments in environmental applications. The technology is easily accessible and rapidly advancing due to the use of open-source microcontrollers. While this trend is extremely exciting, and the technology provides unprecedented spatial coverage, these sensors and associated microcontroller systems have not been well evaluated in the literature. Given the large number of new deployments and proposed research efforts using these technologies, it is necessary to quantify the overall instrument and microcontroller performance for specific applications. In this paper, an Arduino-based weather station system is presented in detail. These low-cost energy-budget measurement stations, or LEMS, have now been deployed for continuous measurements as part of several different field campaigns, which are described herein. The LEMS are low-cost, flexible, and simple to maintain. In addition to presenting the technical details of the LEMS, its errors are quantified in laboratory and field settings. A simple artificial neural network-based radiation-error correction scheme is also presented. Finally, challenges and possible improvements to microcontroller-based atmospheric sensing systems are discussed.
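
    The radiation-error correction mentioned above is not specified in this record; the sketch below shows, under assumed variable names and purely synthetic data, how a small neural network could be trained to predict the solar-heating error of a shielded temperature sensor from incoming radiation and wind speed, so the prediction can be subtracted from the raw reading.

      # Hedged sketch: learn a radiation-induced temperature error from radiation and wind speed.
      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      n = 5000
      radiation = rng.uniform(0, 1000, n)        # W m^-2 (synthetic)
      wind      = rng.uniform(0.1, 10, n)        # m s^-1 (synthetic)
      # Synthetic "true" error: heating grows with radiation, ventilation reduces it.
      error = 0.004 * radiation / np.sqrt(wind) + rng.normal(0, 0.05, n)

      X = np.column_stack([radiation, wind])
      X_tr, X_te, y_tr, y_te = train_test_split(X, error, test_size=0.25, random_state=0)

      model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
      model.fit(X_tr, y_tr)
      print(f"held-out R^2 of the error model: {model.score(X_te, y_te):.3f}")

      # Applying the correction to a raw measurement (illustrative numbers):
      raw_temp = 25.8                              # deg C from the station's shielded sensor
      predicted_error = model.predict([[850.0, 1.5]])[0]
      print(f"corrected temperature: {raw_temp - predicted_error:.2f} deg C")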

  8. The Positions of Virtual Knowledge Brokers in the Core Process of Open Innovation

    NARCIS (Netherlands)

    Hacievliyagil, N.K.; Maisonneuve, Y.E.; Auger, J.F.; Hartmann, L.

    2007-01-01

    Several companies are implementing the strategy of open innovation in their research and development operations. They become more dependent, therefore, on their capabilities to exchange knowledge and technology with external parties. To facilitate these exchanges, virtual knowledge brokers use

  9. STORMSeq: an open-source, user-friendly pipeline for processing personal genomics data in the cloud.

    Directory of Open Access Journals (Sweden)

    Konrad J Karczewski

    Full Text Available The increasing public availability of personal complete genome sequencing data has ushered in an era of democratized genomics. However, read mapping and variant calling software is constantly improving and individuals with personal genomic data may prefer to customize and update their variant calls. Here, we describe STORMSeq (Scalable Tools for Open-Source Read Mapping), a graphical interface cloud computing solution that does not require a parallel computing environment or extensive technical experience. This customizable and modular system performs read mapping, read cleaning, and variant calling and annotation. At present, STORMSeq costs approximately $2 and 5-10 hours to process a full exome sequence and $30 and 3-8 days to process a whole genome sequence. We provide this open-access and open-source resource as a user-friendly interface in Amazon EC2.

  10. STORMSeq: an open-source, user-friendly pipeline for processing personal genomics data in the cloud.

    Science.gov (United States)

    Karczewski, Konrad J; Fernald, Guy Haskin; Martin, Alicia R; Snyder, Michael; Tatonetti, Nicholas P; Dudley, Joel T

    2014-01-01

    The increasing public availability of personal complete genome sequencing data has ushered in an era of democratized genomics. However, read mapping and variant calling software is constantly improving and individuals with personal genomic data may prefer to customize and update their variant calls. Here, we describe STORMSeq (Scalable Tools for Open-Source Read Mapping), a graphical interface cloud computing solution that does not require a parallel computing environment or extensive technical experience. This customizable and modular system performs read mapping, read cleaning, and variant calling and annotation. At present, STORMSeq costs approximately $2 and 5-10 hours to process a full exome sequence and $30 and 3-8 days to process a whole genome sequence. We provide this open-access and open-source resource as a user-friendly interface in Amazon EC2.

  11. A role for distributed processing in advanced nuclear materials control and accountability systems

    International Nuclear Information System (INIS)

    Tisinger, R.M.; Whitty, W.J.; Ford, W.; Strittmatter, R.B.

    1986-01-01

    Networking and distributed processing hardware and software have the potential of greatly enhancing nuclear materials control and accountability (MC&A) systems, both from safeguards and process operations perspectives, while allowing timely integrated safeguards activities and enhanced computer security at reasonable cost. A hierarchical distributed system is proposed consisting of groups of terminals and instruments in plant production and support areas connected to microprocessors that are connected to either larger microprocessors or minicomputers. The structuring and development of a limited distributed MC&A prototype system, including human engineering concepts, are described. Implications of integrated safeguards and computer security concepts for the distributed system design are discussed.

  12. Implementation of the U.S. Environmental Protection Agency's Waste Reduction (WAR) Algorithm in Cape-Open Based Process Simulators

    Science.gov (United States)

    The Sustainable Technology Division has recently completed an implementation of the U.S. EPA's Waste Reduction (WAR) Algorithm that can be directly accessed from a Cape-Open compliant process modeling environment. The WAR Algorithm add-in can be used in AmsterChem's COFE (Cape-Op...

  13. VMStools: Open-source software for the processing, analysis and visualisation of fisheries logbook and VMS data

    NARCIS (Netherlands)

    Hintzen, N.T.; Bastardie, F.; Beare, D.J.; Piet, G.J.; Ulrich, C.; Deporte, N.; Egekvist, J.; Degel, H.

    2012-01-01

    VMStools is a package of open-source software, built using the freeware environment R, specifically developed for the processing, analysis and visualisation of landings (logbooks) and vessel location data (VMS) from commercial fisheries. Analyses start with standardized data formats for logbook

  14. EVALUATION OF POLLUTION PREVENTION OPTIONS TO REDUCE STYRENE EMISSIONS FROM FIBER-REINFORCED PLASTIC OPEN MOLDING PROCESSES

    Science.gov (United States)

    Pollution prevention (P2) options to reduce styrene emissions, such as new materials and application equipment, are commercially available to the operators of open molding processes. However, information is lacking on the emissions reduction that these options can achieve. To me...

  15. Gamma processes and peaks-over-threshold distributions for time-dependent reliability

    International Nuclear Information System (INIS)

    Noortwijk, J.M. van; Weide, J.A.M. van der; Kallen, M.J.; Pandey, M.D.

    2007-01-01

    In the evaluation of structural reliability, a failure is defined as the event in which stress exceeds a resistance that is liable to deterioration. This paper presents a method to combine the two stochastic processes of deteriorating resistance and fluctuating load for computing the time-dependent reliability of a structural component. The deterioration process is modelled as a gamma process, which is a stochastic process with independent non-negative increments having a gamma distribution with identical scale parameter. The stochastic process of loads is generated by a Poisson process. The variability of the random loads is modelled by a peaks-over-threshold distribution (such as the generalised Pareto distribution). These stochastic processes of deterioration and load are combined to evaluate the time-dependent reliability
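
    As a hedged reminder of the two standard building blocks named in the abstract (textbook forms, not necessarily the paper's exact notation): the gamma deterioration process has independent increments X(t) − X(s) with density

      f(x) = \frac{b^{\,a(t)-a(s)}}{\Gamma\big(a(t)-a(s)\big)}\; x^{\,a(t)-a(s)-1}\, e^{-b x}, \qquad x \ge 0,

    for a non-decreasing shape function a(t) and scale (rate) parameter b, while load peaks above a threshold follow the generalised Pareto distribution

      F(y) = 1 - \left(1 + \frac{\xi y}{\sigma}\right)^{-1/\xi}, \qquad y \ge 0, \ \xi \neq 0,

    with shape \xi and scale \sigma (the exponential case being the limit \xi \to 0).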

  16. Itzï (version 17.1): an open-source, distributed GIS model for dynamic flood simulation

    Science.gov (United States)

    Guillaume Courty, Laurent; Pedrozo-Acuña, Adrián; Bates, Paul David

    2017-05-01

    Worldwide, floods are acknowledged as one of the most destructive hazards. In human-dominated environments, their negative impacts are ascribed not only to the increase in frequency and intensity of floods but also to a strong feedback between the hydrological cycle and anthropogenic development. In order to advance a more comprehensive understanding of this complex interaction, this paper presents the development of a new open-source tool named Itzï that enables the 2-D numerical modelling of rainfall-runoff processes and surface flows integrated with the open-source geographic information system (GIS) software known as GRASS. Therefore, it takes advantage of the ability given by GIS environments to handle datasets with variations in both temporal and spatial resolutions. Furthermore, the presented numerical tool can handle datasets from different sources with varied spatial resolutions, facilitating the preparation and management of input and forcing data. This ability reduces the preprocessing time usually required by other models. Itzï uses a simplified form of the shallow water equations, the damped partial inertia equation, for the resolution of surface flows, and the Green-Ampt model for the infiltration. The source code is now publicly available online, along with complete documentation. The numerical model is verified against three different test cases: firstly, a comparison with an analytic solution of the shallow water equations is introduced; secondly, a hypothetical flooding event in an urban area is implemented, where results are compared to those from an established model using a similar approach; and lastly, the reproduction of a real inundation event that occurred in the city of Kingston upon Hull, UK, in June 2007, is presented. The numerical approach proved its ability to reproduce the analytic and synthetic test cases. Moreover, simulation results of the real flood event showed its suitability for identifying areas affected by flooding.
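
    For orientation, a hedged sketch of the local inertia scheme on which this family of models builds (Itzï's damped variant additionally weights in neighbouring fluxes) advances the unit-width discharge q as

      q^{t+\Delta t} = \frac{q^{t} - g\, h_f\, \Delta t\, \partial \eta / \partial x}{1 + g\, \Delta t\, n^{2}\, |q^{t}|\, h_f^{-7/3}},

    where h_f is the flow depth at the cell face, \eta the water surface elevation, n Manning's roughness coefficient, and g the gravitational acceleration; the friction term in the denominator is what damps the solution and keeps it stable at practical time steps.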

  17. Does open access improve the process and outcome of podiatric care?

    Science.gov (United States)

    Wrobel, James S; Davies, Michael L; Robbins, Jeffrey M

    2011-05-19

    Open access to clinics is a management strategy to improve healthcare delivery. Providers are sometimes hesitant to adopt open access because of fear of increased visits for potentially trivial complaints. We hypothesized that open access clinics would result in decreased wait times, an increased number of podiatry visits, fewer no-shows, higher rates of acute care visits, and lower minor amputation rates compared with control clinics without open access. This study was a national retrospective case-control study of VHA (Veterans Health Administration) podiatry clinics in 2008. Eight case facilities reported to have had open podiatry clinic access for at least one year were identified from an email survey. Sixteen control facilities with similar structural features (e.g., full-time podiatrists, health technicians, residency program, reconstructive foot surgery, vascular surgery, and orthopedic surgery) were identified in the same geographic regions as the case facilities. Twenty-two percent of facilities responded to the survey. Fifty-four percent reported open access and 46% did not. There were no differences in facility or podiatry panel size, podiatry visits, or visit frequency between the cases and controls. Podiatry visits trended higher for control facilities but the difference did not reach statistical significance. Case facilities had more new consults seen within 30 days (96% vs. 89%; P = 0.050) and lower minor amputation rates (0.62/1,000 vs. 1.0/1,000; P = 0.041). The VHA is the world's largest managed care organization, and it relies on clinical efficiencies as one mechanism to improve the quality of care. Open access clinics had more timely access for new patients and lower rates of minor amputations.

  18. Kinetic Analysis of Dynamic Positron Emission Tomography Data using Open-Source Image Processing and Statistical Inference Tools.

    Science.gov (United States)

    Hawe, David; Hernández Fernández, Francisco R; O'Suilleabháin, Liam; Huang, Jian; Wolsztynski, Eric; O'Sullivan, Finbarr

    2012-05-01

    In dynamic mode, positron emission tomography (PET) can be used to track the evolution of injected radio-labelled molecules in living tissue. This is a powerful diagnostic imaging technique that provides a unique opportunity to probe the status of healthy and pathological tissue by examining how it processes substrates. The spatial aspect of PET is well established in the computational statistics literature. This article focuses on its temporal aspect. The interpretation of PET time-course data is complicated because the measured signal is a combination of vascular delivery and tissue retention effects. If the arterial time-course is known, the tissue time-course can typically be expressed in terms of a linear convolution between the arterial time-course and the tissue residue. In statistical terms, the residue function is essentially a survival function - a familiar life-time data construct. Kinetic analysis of PET data is concerned with estimation of the residue and associated functionals such as flow, flux, volume of distribution and transit time summaries. This review emphasises a nonparametric approach to the estimation of the residue based on a piecewise linear form. Rapid implementation of this by quadratic programming is described. The approach provides a reference for statistical assessment of widely used one- and two-compartmental model forms. We illustrate the method with data from two of the most well-established PET radiotracers, (15)O-H(2)O and (18)F-fluorodeoxyglucose, used for assessment of blood perfusion and glucose metabolism respectively. The presentation illustrates the use of two open-source tools, AMIDE and R, for PET scan manipulation and model inference.
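
    As background (a hedged summary in the notation commonly used for this framework, not a quotation from the article), the measured tissue time-course C_T is modelled as a convolution of the arterial input C_a with the tissue residue R,

      C_T(t) = \int_{0}^{t} C_a(s)\, R(t-s)\, \mathrm{d}s,

    with flow recovered as K = R(0^+), flux as the large-time limit of R(t), and, for reversible tracers, volume of distribution V = \int_{0}^{\infty} R(t)\,\mathrm{d}t and mean transit time V/K.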

  19. BioFed: federated query processing over life sciences linked open data.

    Science.gov (United States)

    Hasnain, Ali; Mehmood, Qaiser; Sana E Zainab, Syeda; Saleem, Muhammad; Warren, Claude; Zehra, Durre; Decker, Stefan; Rebholz-Schuhmann, Dietrich

    2017-03-15

    Biomedical data, e.g. from knowledge bases and ontologies, is increasingly made available following open linked data principles, at best as RDF triple data. This is a necessary step towards unified access to biological data sets, but it still requires solutions to query multiple endpoints for their heterogeneous data to eventually retrieve all the meaningful information. Suggested solutions are based on query federation approaches, which require the submission of SPARQL queries to endpoints. Due to the size and complexity of available data, these solutions have to be optimised for efficient retrieval times and for users in life sciences research. Last but not least, over time, the reliability of data resources in terms of access and quality has to be monitored. Our solution (BioFed) federates data over 130 SPARQL endpoints in life sciences and tailors query submission according to the provenance information. BioFed has been evaluated against the state-of-the-art solution FedX and forms an important benchmark for the life science domain. The efficient cataloguing approach of the federated query processing system BioFed, the triple-pattern-wise source selection and the semantic source normalisation form the core of our solution. It gathers and integrates data from newly identified public endpoints for federated access. Basic provenance information is linked to the retrieved data. Finally, BioFed makes use of the latest SPARQL standard (i.e., 1.1) to leverage the full benefits for query federation. The evaluation is based on 10 simple and 10 complex queries, which address data in 10 major and very popular data sources (e.g., DrugBank, SIDER). BioFed is a single point of access for a large number of SPARQL endpoints providing life science data. It facilitates efficient query generation for data access and provides basic provenance information in combination with the retrieved data. BioFed fully supports SPARQL 1.1 and gives access to the
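
    As an illustration of the kind of query such a system dispatches, the hedged sketch below uses the SPARQLWrapper Python package against placeholder endpoints; the endpoint URLs and the query pattern are illustrative assumptions, not BioFed's actual interface.

      # Submit a SPARQL 1.1 query with a federated SERVICE clause from Python.
      from SPARQLWrapper import SPARQLWrapper, JSON

      endpoint = SPARQLWrapper("https://example.org/sparql")   # placeholder federation endpoint
      endpoint.setQuery("""
      PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
      SELECT ?drug ?label WHERE {
        ?drug rdfs:label ?label .
        SERVICE <https://example.org/drug-endpoint/sparql> {   # placeholder remote endpoint
          ?drug ?p ?o .
        }
      }
      LIMIT 10
      """)
      endpoint.setReturnFormat(JSON)
      results = endpoint.query().convert()
      for row in results["results"]["bindings"]:
          print(row["drug"]["value"], row["label"]["value"])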

  20. Distributed processing in receivers based on tensor for cooperative communications systems

    OpenAIRE

    Igor Flávio Simões de Sousa

    2014-01-01

    In this dissertation, we present a distributed data estimation and detection approach for the uplink of a network that uses CDMA at transmitters (users). The analyzed network can be represented by an undirected and connected graph, where the nodes use a distributed estimation algorithm based on consensus averaging to perform joint channel and symbol estimation using a receiver based on tensor signal processing. The centralized receiver, developed for a central base station, and the distribute...

  1. Transient Properties of Probability Distribution for a Markov Process with Size-dependent Additive Noise

    Science.gov (United States)

    Yamada, Yuhei; Yamazaki, Yoshihiro

    2018-04-01

    This study considers a stochastic model for cluster growth in a Markov process with a cluster-size-dependent additive noise. According to this model, the probability distribution of the cluster size transiently becomes an exponential or a log-normal distribution depending on the initial condition of the growth. In this letter, a master equation is obtained for this model, and the derivation of the distributions is discussed.

  2. Accelerating the output of technology for auxiliary processes in open cast mines

    Energy Technology Data Exchange (ETDEWEB)

    Matantsev, A.I.

    1984-01-01

    An analysis of the current state of track-laying operations in open-cut mines is given, and their labor-intensive nature is noted. The lag in the technological development of mechanization equipment for track-laying and repair operations is also noted. Results are given from developments by the Scientific Research Institute of Open-Pit Mining Operations in the field of mobile transportation technology and promising modular route designs. The problems of improving the routing bases in the industry are also examined. The absence of factories for manufacturing such transportation technology and equipment is reflected most evidently in the technical and cost characteristics of coal production by the open-cut method and requires an immediate solution.

  3. New technical design of food packaging makes the opening process easier for patients with hand disorders.

    Science.gov (United States)

    Hensler, Stefanie; Herren, Daniel B; Marks, Miriam

    2015-09-01

    Opening packaged food is a complex daily activity carried out worldwide. Peelable packaging, as used for cheese or meat, causes real problems for many consumers, especially elderly people and those with hand disorders. Our aim was to investigate the possibility of producing meat packaging that is easier for patients with hand disorders to open. One hundred patients with hand osteoarthritis were asked to open a meat package currently available in supermarkets (Type A) and a modified, newly designed version (Type B), and to rate their experiences with a consumer satisfaction index (CSI). The mean CSI of the Type B packs was 68.9%, compared with 41.9% for Type A. The findings indicate that it is possible to produce food packages that afford greater consumer satisfaction. Such future packaging would benefit not only people with hand disorders but also the population as a whole. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  4. A Framework System for Intelligent Support in Open Distributed Learning Environments--A Look Back from 16 Years Later

    Science.gov (United States)

    Hoppe, H. Ulrich

    2016-01-01

    The 1998 paper by Martin Mühlenbrock, Frank Tewissen, and myself introduced a multi-agent architecture and a component engineering approach for building open distributed learning environments to support group learning in different types of classroom settings. It took up prior work on "multiple student modeling" as a method to configure…

  5. Distributed chemical computing using ChemStar: an open source java remote method invocation architecture applied to large scale molecular data from PubChem.

    Science.gov (United States)

    Karthikeyan, M; Krishnan, S; Pandey, Anil Kumar; Bender, Andreas; Tropsha, Alexander

    2008-04-01

    We present the application of a Java remote method invocation (RMI) based open source architecture to distributed chemical computing. This architecture was previously employed for distributed data harvesting of chemical information from the Internet via the Google application programming interface (API; ChemXtreme). Due to its open source character and its flexibility, the underlying server/client framework can be quickly adapted to virtually every computational task that can be parallelized. Here, we present the server/client communication framework as well as an application to distributed computing of chemical properties on a large scale (currently the size of PubChem; about 18 million compounds), using both the Marvin toolkit and the open source JOELib package. As an application, the agreement of log P and TPSA values between the packages was compared for this set of compounds. Outliers were found to be mostly non-druglike compounds, and differences could usually be explained by differences in the underlying algorithms. ChemStar is the first open source distributed chemical computing environment built on Java RMI, which is also easily adaptable to user demands due to its "plug-in architecture". The complete source codes as well as calculated properties, along with links to PubChem resources, are available on the Internet via a graphical user interface at http://moltable.ncl.res.in/chemstar/.

  6. A New Initiative for Tiling, Stitching and Processing Geospatial Big Data in Distributed Computing Environments

    Science.gov (United States)

    Olasz, A.; Nguyen Thai, B.; Kristóf, D.

    2016-06-01

    Within recent years, several new approaches and solutions for Big Data processing have been developed. The geospatial world is still facing the lack of well-established distributed processing solutions tailored to the amount and heterogeneity of geodata, especially when fast data processing is a must. The goal of such systems is to improve processing time by distributing data transparently across processing (and/or storage) nodes. These types of methodology are based on the concept of divide and conquer. Nevertheless, in the context of geospatial processing, most of the distributed computing frameworks have important limitations regarding both data distribution and data partitioning methods. Moreover, flexibility and expandability for handling various data types (often in binary formats) are also strongly required. This paper presents a concept for tiling, stitching and processing of big geospatial data. The system is based on the IQLib concept (https://github.com/posseidon/IQLib/) developed in the frame of the IQmulus EU FP7 research and development project (http://www.iqmulus.eu). The data distribution framework has no limitations on programming language environment and can execute scripts (and workflows) written in different development frameworks (e.g. Python, R or C#). It is capable of processing raster, vector and point cloud data. The above-mentioned prototype is presented through a case study dealing with country-wide processing of raster imagery. Further investigations on algorithmic and implementation details are in focus for the near future.

  7. A NEW INITIATIVE FOR TILING, STITCHING AND PROCESSING GEOSPATIAL BIG DATA IN DISTRIBUTED COMPUTING ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    A. Olasz

    2016-06-01

    Full Text Available Within recent years, several new approaches and solutions for Big Data processing have been developed. The geospatial world is still facing the lack of well-established distributed processing solutions tailored to the amount and heterogeneity of geodata, especially when fast data processing is a must. The goal of such systems is to improve processing time by distributing data transparently across processing (and/or storage) nodes. These types of methodology are based on the concept of divide and conquer. Nevertheless, in the context of geospatial processing, most of the distributed computing frameworks have important limitations regarding both data distribution and data partitioning methods. Moreover, flexibility and expandability for handling various data types (often in binary formats) are also strongly required. This paper presents a concept for tiling, stitching and processing of big geospatial data. The system is based on the IQLib concept (https://github.com/posseidon/IQLib/) developed in the frame of the IQmulus EU FP7 research and development project (http://www.iqmulus.eu). The data distribution framework has no limitations on programming language environment and can execute scripts (and workflows) written in different development frameworks (e.g. Python, R or C#). It is capable of processing raster, vector and point cloud data. The above-mentioned prototype is presented through a case study dealing with country-wide processing of raster imagery. Further investigations on algorithmic and implementation details are in focus for the near future.
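
    The tiling-and-stitching idea itself is straightforward; the following hedged Python sketch (using only numpy and the standard library, not the IQLib API) splits a raster into tiles, processes them in parallel worker processes, and stitches the results back into a single array.

      # Divide-and-conquer raster processing: tile, process in parallel, stitch.
      import numpy as np
      from concurrent.futures import ProcessPoolExecutor

      TILE = 256  # tile edge length in pixels (illustrative)

      def process_tile(args):
          """Per-tile work; a simple scaling stands in for a real algorithm."""
          (r, c), tile = args
          return (r, c), tile.astype(np.float32) * 0.5   # placeholder operation

      def tile_raster(raster, size=TILE):
          rows, cols = raster.shape
          for r in range(0, rows, size):
              for c in range(0, cols, size):
                  yield (r, c), raster[r:r + size, c:c + size]

      def stitch(shape, pieces):
          out = np.empty(shape, dtype=np.float32)
          for (r, c), tile in pieces:
              out[r:r + tile.shape[0], c:c + tile.shape[1]] = tile
          return out

      if __name__ == "__main__":
          raster = np.random.default_rng(1).integers(0, 255, size=(1024, 1024)).astype(np.float32)
          with ProcessPoolExecutor() as pool:
              pieces = list(pool.map(process_tile, tile_raster(raster)))
          result = stitch(raster.shape, pieces)
          print(result.shape, float(result.mean()))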

  8. Bounds for the probability distribution function of the linear ACD process

    OpenAIRE

    Fernandes, Marcelo

    2003-01-01

    This paper derives both lower and upper bounds for the probability distribution function of stationary ACD(p, q) processes. For the purpose of illustration, I specialize the results to the main parent distributions in duration analysis. Simulations show that the lower bound is much tighter than the upper bound.
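
    For context (the standard Engle-Russell formulation, stated as background rather than quoted from the paper; the labelling of the two lag orders follows one common convention), the ACD(p, q) model for durations x_i is

      x_i = \psi_i\, \varepsilon_i, \qquad \psi_i = \omega + \sum_{j=1}^{p} \alpha_j\, x_{i-j} + \sum_{k=1}^{q} \beta_k\, \psi_{i-k},

    with i.i.d. positive innovations \varepsilon_i of unit mean and conditional expected duration \psi_i; the paper's bounds concern the distribution function of the stationary x_i.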

  9. Bridging the gap between a stationary point process and its Palm distribution

    NARCIS (Netherlands)

    Nieuwenhuis, G.

    1994-01-01

    In the context of stationary point processes, measurements are usually made from a time point chosen at random or from an occurrence chosen at random. That is, either the stationary distribution P or its Palm distribution P° is the ruling probability measure. In this paper an approach is presented to

  10. Time Optimal Run-time Evaluation of Distributed Timing Constraints in Process Control Software

    DEFF Research Database (Denmark)

    Drejer, N.; Kristensen, C.H.

    1993-01-01

    This paper considers run-time evaluation of an important class of constraints: timing constraints. These appear extensively in process control systems. Timing constraints are considered in distributed systems, i.e. systems consisting of multiple autonomous nodes.

  11. Absorption, Distribution, Metabolism, and Excretion of [14C]-Volixibat in Healthy Men: Phase 1 Open-Label Study.

    Science.gov (United States)

    Siebers, Nicholas; Palmer, Melissa; Silberg, Debra G; Jennings, Lee; Bliss, Caleb; Martin, Patrick T

    2018-02-01

    Volixibat is a potent inhibitor of the apical sodium-dependent bile acid transporter in development for the treatment of nonalcoholic steatohepatitis. This phase 1, open-label study investigated the absorption, distribution, metabolism, and excretion of [14C]-volixibat in healthy men. Eligible men (n = 8) aged 18-50 years (body mass index 18.0-30.0 kg/m²; weight >50 kg) received a single oral dose of [14C]-volixibat 50 mg containing ~5.95 µCi radioactivity. The primary objectives were to assess the pharmacokinetics of [14C]-volixibat and to determine the total radioactivity in whole blood, plasma, urine, and feces at pre-selected time points over 6 days. The secondary objectives were to characterize metabolites and to assess the safety and tolerability. Low concentrations of volixibat (range 0-0.179 ng/mL) were detected in plasma up to 8 h following administration; the pharmacokinetic parameters could not be calculated. No radioactivity was observed in plasma or whole blood. The percentage (mean ± standard deviation) of total radioactivity in urine was 0.01 ± 0.007%. The vast majority (92.3 ± 5.25%) of volixibat was recovered in feces (69.2 ± 33.1% within 24 h). Unchanged volixibat was the only radioactive component detected in feces. Adverse events were mild in severity and mostly gastrointestinal. Changes in laboratory values were not clinically meaningful. Following oral administration, [14C]-volixibat was excreted almost exclusively via the feces as unchanged parent compound, indicating that the drug is minimally absorbed. Consistent with other studies, adverse events were primarily gastrointestinal in nature. ClinicalTrials.gov identifier NCT02571192.

  12. Open Technology Development: Roadmap Plan

    National Research Council Canada - National Science Library

    Herz, J. C; Lucas, Mark; Scott, John

    2006-01-01

    .... Collaborative and distributed online tools; and 4. Technological Agility. Open standards and interfaces were initially established through ARPA and distributed via open source software reference implementations...

  13. Analysing the Outbound logistics process enhancements in Nokia-Siemens Networks Global Distribution Center

    OpenAIRE

    Marjeta, Katri

    2011-01-01

    Marjeta, Katri. 2011. Analysing the outbound logistics process enhancements in Nokia-Siemens Networks Global Distribution Center. Master's thesis. Kemi-Tornio University of Applied Sciences. Business and Culture. Pages 57. Due to confidentiality issues, this work has been modified from its original form. The aim of this Master's thesis is to describe and analyze the outbound logistics process enhancement projects executed in the Nokia-Siemens Networks Global Distribution Center after the N...

  14. ON THE ESTIMATION OF DISTANCE DISTRIBUTION FUNCTIONS FOR POINT PROCESSES AND RANDOM SETS

    Directory of Open Access Journals (Sweden)

    Dietrich Stoyan

    2011-05-01

    Full Text Available This paper discusses various estimators for the nearest neighbour distance distribution function D of a stationary point process and for the quadratic contact distribution function Hq of a stationary random closed set. It recommends the use of Hanisch's estimator of D, which is of Horvitz-Thompson type, and the minus-sampling estimator of Hq. This recommendation is based on simulations for Poisson processes and Boolean models.

  15. Pyradi: an open-source toolkit for infrared calculation and data processing

    CSIR Research Space (South Africa)

    Willers, CJ

    2012-09-01

    Full Text Available of such a toolkit facilitates and increases productivity during subsequent tool development: “develop once and use many times”. The concept of an extendible toolkit lends itself naturally to the open-source philosophy, where the toolkit user-base develops...

  16. Process and apparatus for forming a circular lip around an opening

    International Nuclear Information System (INIS)

    Fournier, Y.

    1991-01-01

    A circular lip around an opening is made by machining a spot facing and then pushing back the edge using rollers. The rollers are held by a support that displaces them along its axis. The angle that the roller's axis makes with the support's axis is progressively reduced as the support advances.

  17. Investigating the Influence of Technology Inflows on Technology Outflows in Open Innovation Processes : A Longitudinal Analysis

    NARCIS (Netherlands)

    Sikimic, U.; Chiesa, V.; Frattini, F.; Scalera, V.G.

    2016-01-01

    The open innovation (OI) paradigm emphasizes the importance of integrating inbound and outbound flows of technology to increase a firm's innovation performance. While the synergies between technology inflows and outflows have been discussed in conceptual OI articles, the majority of empirical

  18. Cross-coherent vector sensor processing for spatially distributed glider networks.

    Science.gov (United States)

    Nichols, Brendan; Sabra, Karim G

    2015-09-01

    Autonomous underwater gliders fitted with vector sensors can be used as a spatially distributed sensor array to passively locate underwater sources. However, to date, the positional accuracy required for robust array processing (especially coherent processing) is not achievable using dead-reckoning while the gliders remain submerged. To obtain such accuracy, the gliders can be temporarily surfaced to allow for global positioning system contact, but the acoustically active sea surface introduces locally additional sensor noise. This letter demonstrates that cross-coherent array processing, which inherently mitigates the effects of local noise, outperforms traditional incoherent processing source localization methods for this spatially distributed vector sensor network.
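
    As background on the terminology (a generic statement of the idea, not the paper's exact estimator): with array weight vector w(θ) and cross-spectral density matrix K estimated across the glider-mounted sensors, the conventional beamformer power is

      B(\theta) = \mathbf{w}(\theta)^{H} \mathbf{K}\, \mathbf{w}(\theta) = \sum_{m,n} w_m^{*}(\theta)\, K_{mn}\, w_n(\theta),

    and a cross-coherent variant retains only the m \neq n terms, so that noise which is uncorrelated between sensors (such as the locally generated surface noise at each glider) contributes nothing on average to the localization surface.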

  19. Reaction Mechanism and Distribution Behavior of Arsenic in the Bottom Blown Copper Smelting Process

    Directory of Open Access Journals (Sweden)

    Qinmeng Wang

    2017-08-01

    Full Text Available The control of arsenic, a toxic and carcinogenic element, is an important issue for all copper smelters. In this work, the reaction mechanism and distribution behavior of arsenic in the bottom blown copper smelting process (SKS process) were investigated and compared to the flash smelting process. There are obvious differences in arsenic distribution between the SKS process and the flash process, resulting from differences in oxygen potentials, volatilization, smelting temperatures, reaction intensities, and mass transfer processes. Under stable production conditions, the distributions of arsenic among matte, slag, and gas phases are 6%, 12%, and 82%, respectively. Less arsenic is reported in the gas phase with the flash process than with the SKS process. The main arsenic species in the gas phase are AsS(g), AsO(g), and As2(g). Arsenic exists in the slag predominantly as As2O3(l), and in matte as As(l). A high matte grade is harmful to the elimination of arsenic to gas. Changing the Fe/SiO2 ratio has only slight effects on the distribution of arsenic. In order to enhance the removal of arsenic from the SKS smelting system to the gas phase, a low oxygen concentration, a low oxygen/ore ratio, and a low matte grade should be chosen. In the SKS smelting process no dust is recycled; almost all dust is collected and further treated to eliminate arsenic and recover valuable metals in other process streams.

  20. Development of Open source-based automatic shooting and processing UAV imagery for Orthoimage Using Smart Camera UAV

    Science.gov (United States)

    Park, J. W.; Jeong, H. H.; Kim, J. S.; Choi, C. U.

    2016-06-01

    Recently, aerial photography with an unmanned aerial vehicle (UAV) system has used the UAV and remote controls connected to a ground control system over a radio frequency (RF) modem with a bandwidth of about 430 MHz. However, this existing RF-modem method has limitations in long-distance communication. The smart camera's LTE (long-term evolution), Bluetooth, and Wi-Fi were used to implement a UAV with the developed UAV communication module system, which carried out close-range aerial photogrammetry with automatic shooting. The automatic shooting system is an image-capturing device for drones in areas that need image capture, together with software for loading a smart camera and managing it. This system is composed of automatic shooting using the sensors of the smart camera and shooting catalog management, which manages the captured images and their information. The UAV imagery processing module used Open Drone Map. This study examined the feasibility of using the smart camera as the payload for a photogrammetric UAV system. The open source tools used included Android, OpenCV (Open Computer Vision), RTKLIB, and Open Drone Map.

  1. Development of Open source-based automatic shooting and processing UAV imagery for Orthoimage Using Smart Camera UAV

    Directory of Open Access Journals (Sweden)

    J. W. Park

    2016-06-01

    Full Text Available Recently, aerial photography with an unmanned aerial vehicle (UAV) system has used the UAV and remote controls connected to a ground control system over a radio frequency (RF) modem with a bandwidth of about 430 MHz. However, this existing RF-modem method has limitations in long-distance communication. The smart camera's LTE (long-term evolution), Bluetooth, and Wi-Fi were used to implement a UAV with the developed UAV communication module system, which carried out close-range aerial photogrammetry with automatic shooting. The automatic shooting system is an image-capturing device for drones in areas that need image capture, together with software for loading a smart camera and managing it. This system is composed of automatic shooting using the sensors of the smart camera and shooting catalog management, which manages the captured images and their information. The UAV imagery processing module used Open Drone Map. This study examined the feasibility of using the smart camera as the payload for a photogrammetric UAV system. The open source tools used included Android, OpenCV (Open Computer Vision), RTKLIB, and Open Drone Map.

  2. Analysing the distribution of synaptic vesicles using a spatial point process model

    DEFF Research Database (Denmark)

    Khanmohammadi, Mahdieh; Waagepetersen, Rasmus; Nava, Nicoletta

    2014-01-01

    functionality by statistically modelling the distribution of the synaptic vesicles in two groups of rats: a control group subjected to sham stress and a stressed group subjected to a single acute foot-shock (FS)-stress episode. We hypothesize that the synaptic vesicles have different spatial distributions in the two groups. The spatial distributions are modelled using spatial point process models with an inhomogeneous conditional intensity and repulsive pairwise interactions. Our results verify the hypothesis that the two groups have different spatial distributions.

  3. Digi-Clima Grid: image processing and distributed computing for recovering historical climate data

    Directory of Open Access Journals (Sweden)

    Sergio Nesmachnow

    2015-12-01

    Full Text Available This article describes the Digi-Clima Grid project, whose main goals are to design and implement semi-automatic techniques for digitizing and recovering historical climate records by applying parallel computing techniques over distributed computing infrastructures. The specific tool developed for image processing is described, and its implementation over grid and cloud infrastructures is reported. An experimental analysis over institutional and volunteer-based grid/cloud distributed systems demonstrates that the proposed approach is an efficient tool for recovering historical climate data. The parallel implementations allow the processing load to be distributed, achieving accurate speedup values.

  4. An Extended Genetic Algorithm for Distributed Integration of Fuzzy Process Planning and Scheduling

    Directory of Open Access Journals (Sweden)

    Shuai Zhang

    2016-01-01

    Full Text Available The distributed integration of process planning and scheduling (DIPPS) aims to simultaneously arrange the two most important manufacturing stages, process planning and scheduling, in a distributed manufacturing environment. Because of its advantages in representing actual situations, the triangular fuzzy number (TFN) is adopted in DIPPS to represent machine processing and transportation times. In order to solve this problem and obtain the optimal or near-optimal solution, an extended genetic algorithm (EGA) with an innovative three-class encoding method and improved crossover and mutation strategies is proposed. Furthermore, a local enhancement strategy featuring machine replacement and order exchange is added to strengthen the local search capability of the basic genetic algorithm. Experimental verification shows that EGA achieves satisfactory results in a very short time and demonstrates strong performance in dealing with the distributed integration of fuzzy process planning and scheduling (DIFPPS).
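
    The EGA itself (three-class encoding, specialized crossover and mutation, local enhancement) is not reproduced in the abstract, so the sketch below only illustrates the general shape of such an approach: triangular fuzzy processing times reduced to crisp values by a simple mean-of-TFN defuzzification, and a plain genetic algorithm evolving a job-to-machine assignment. The names, the defuzzification rule, and the fitness function are illustrative assumptions, not the paper's actual encoding.

```python
import random

# Triangular fuzzy number (a, m, b): one common defuzzification is the mean (a + m + b) / 3.
def defuzzify(tfn):
    a, m, b = tfn
    return (a + m + b) / 3.0

def makespan(assignment, fuzzy_times):
    """Crude fitness: maximum total crisp processing time over machines."""
    loads = {}
    for job, machine in enumerate(assignment):
        loads[machine] = loads.get(machine, 0.0) + defuzzify(fuzzy_times[job][machine])
    return max(loads.values())

def genetic_algorithm(fuzzy_times, n_machines, pop=30, gens=200, pmut=0.1):
    n_jobs = len(fuzzy_times)
    population = [[random.randrange(n_machines) for _ in range(n_jobs)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda ind: makespan(ind, fuzzy_times))
        parents = population[: pop // 2]                 # truncation selection
        children = []
        while len(children) < pop - len(parents):
            p1, p2 = random.sample(parents, 2)
            cut = random.randrange(1, n_jobs)            # one-point crossover
            child = p1[:cut] + p2[cut:]
            if random.random() < pmut:                   # mutation: reassign one job
                child[random.randrange(n_jobs)] = random.randrange(n_machines)
            children.append(child)
        population = parents + children
    best = min(population, key=lambda ind: makespan(ind, fuzzy_times))
    return best, makespan(best, fuzzy_times)

# toy instance: 6 jobs, 3 machines, TFN processing time per (job, machine)
random.seed(1)
times = [[(2, 3, 4), (4, 5, 7), (3, 4, 6)] for _ in range(6)]
print(genetic_algorithm(times, n_machines=3))
```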

  5. Managing Distributed Innovation Processes in Virtual Organizations by Applying the Collaborative Network Relationship Analysis

    Science.gov (United States)

    Eschenbächer, Jens; Seifert, Marcus; Thoben, Klaus-Dieter

    Distributed innovation processes are considered a new option for handling both the complexity and the speed with which new products and services need to be prepared. Indeed, most research on innovation processes has focused on multinational companies with an intra-organisational perspective. The phenomenon of innovation processes in networks - with an inter-organisational perspective - has been almost neglected. Collaborative networks present a perfect playground for such distributed innovation processes, and the authors focus specifically on Virtual Organisations because of their dynamic behaviour. Research activities supporting distributed innovation processes in Virtual Organisations are rather new, so little knowledge about their management is available. The presentation of the collaborative network relationship analysis addresses this gap. It is shown that qualitative planning of collaboration intensities can support real business cases by providing knowledge and planning data.

  6. The influence of emotion on lexical processing: insights from RT distributional analysis.

    Science.gov (United States)

    Yap, Melvin J; Seow, Cui Shan

    2014-04-01

    In two lexical decision experiments, the present study was designed to examine emotional valence effects on visual lexical decision (standard and go/no-go) performance, using traditional analyses of means and distributional analyses of response times. Consistent with an earlier study by Kousta, Vinson, and Vigliocco (Cognition 112:473-481, 2009), we found that emotional words (both negative and positive) were responded to faster than neutral words. Finer-grained distributional analyses further revealed that the facilitation afforded by valence was reflected by a combination of distributional shifting and an increase in the slow tail of the distribution. This suggests that emotional valence effects in lexical decision are unlikely to be entirely mediated by early, preconscious processes, which are associated with pure distributional shifting. Instead, our results suggest a dissociation between early preconscious processes and a later, more task-specific effect that is driven by feedback from semantically rich representations.
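
    Distributional analyses of this kind are often summarized by fitting an ex-Gaussian, in which a change of the Gaussian component reflects distributional shifting and a change of the exponential component reflects the slow tail. The abstract does not state the exact procedure used, so the sketch below is only a generic illustration on simulated response times, using scipy's exponnorm parameterization.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated RTs (ms): ex-Gaussian = Gaussian (mu, sigma) component + exponential (tau) component.
def simulate_rts(mu, sigma, tau, n=2000):
    return rng.normal(mu, sigma, n) + rng.exponential(tau, n)

neutral   = simulate_rts(mu=600, sigma=50, tau=100)
emotional = simulate_rts(mu=580, sigma=50, tau=130)   # shifted mean and heavier slow tail

for label, rt in [("neutral", neutral), ("emotional", emotional)]:
    # scipy parameterizes the ex-Gaussian as exponnorm(K, loc, scale) with tau = K * scale
    K, loc, scale = stats.exponnorm.fit(rt)
    print(f"{label}: mu~{loc:.0f}  sigma~{scale:.0f}  tau~{K * scale:.0f}")
```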

  7. Open-source intelligence in the Czech military knowledge system and process design

    OpenAIRE

    Krejci, Roman

    2002-01-01

    Owing to the recent transitions in the Czech Republic, the Czech military must satisfy a large set of new requirements. One way military intelligence can become more effective and conserve resources is by increasing the efficiency of open-source intelligence (OSINT), which plays an important part in intelligence gathering in the information age. When using OSINT effectively, military intelligence can elevate its responsiveness to different types of crises and can also properly ...

  8. Core power distribution measurement and data processing in Daya Bay Nuclear Power Station

    International Nuclear Information System (INIS)

    Zhang Hong

    1997-01-01

    For the first time in China, Daya Bay Nuclear Power Station applied the advanced technology of worldwide commercial pressurized reactors to its in-core detectors, the leading ex-core six-chamber instrumentation for precise axial power distribution, and the related data processing. This article describes the neutron flux measurement in Daya Bay Nuclear Power Station and the detailed data processing.

  9. DISCO - A concept of a system for integrated data base management in distributed data processing systems

    International Nuclear Information System (INIS)

    Holler, E.

    1980-01-01

    The development of data processing technology favors the trend towards distributed data processing systems: the large-scale integration of semiconductor devices has led to very efficient (approx. 10^6 operations per second) and relatively cheap low-end computers being offered today, which allow distributed data processing systems to be installed with a total capacity approaching that of large-scale data processing plants at a tolerable investment expenditure. The technologies of communication and of data banks have each, by themselves, reached a state of development justifying their routine application. This is made evident by the present efforts for standardization in both areas. The integration of both technologies in the development of systems for integrated distributed data bank management, however, is new territory for engineering. (orig.) [de]

  10. The Analysis of process optimization during the loading distribution test for steam turbine

    International Nuclear Information System (INIS)

    Li Jiangwei; Cao Yuhua; Li Dawei

    2014-01-01

    The loading distribution of a steam turbine needs to be performed six times in total: the first is completed when the turbine cylinder is buckled up, and the rest must be completed in order during installation of the GVP pipe. To complete the five loading distribution tests and the installation of the GVP pipe usually takes around 90 days for most nuclear plants, while Unit 1 of Fuqing Nuclear Power Station compressed this to about 45 days by optimizing the installation process. This article describes the successful experience of how Unit 1 of Fuqing Nuclear Power Station finished the five loading distribution tests and the installation of the GVP pipe in 45 days by optimizing the process. The advantages and disadvantages are analyzed by comparing this process with the one provided by the suppliers, which leads to some rationalization proposals for the installation work of the follow-up units of our plant. (authors)

  11. Distributed system for parallel data processing of ECT signals for electromagnetic flaw detection in materials

    International Nuclear Information System (INIS)

    Guliashki, Vassil; Marinova, Galia

    2002-01-01

    The paper proposes a distributed system for parallel data processing of ECT signals for flaw detection in materials. The measured data are stored in files on a host computer, where a JAVA server is located. The host computer is connected through the Internet to a set of geographically distributed client computers. The data are distributed from the host computer, by means of the JAVA server, to the client computers according to their requests. The software necessary for the data processing is installed on each client computer in advance. Organizing the data processing on many computers working simultaneously in parallel greatly reduces processing time, especially in cases when huge amounts of data must be processed in a very short time. (Author)

  12. When the mean is not enough: Calculating fixation time distributions in birth-death processes.

    Science.gov (United States)

    Ashcroft, Peter; Traulsen, Arne; Galla, Tobias

    2015-10-01

    Studies of fixation dynamics in Markov processes predominantly focus on the mean time to absorption. This may be inadequate if the distribution is broad and skewed. We compute the distribution of fixation times in one-step birth-death processes with two absorbing states. These are expressed in terms of the spectrum of the process, and we provide different representations as forward-only processes in eigenspace. These allow efficient sampling of fixation time distributions. As an application we study evolutionary game dynamics, where invading mutants can reach fixation or go extinct. We also highlight the median fixation time as a possible analog of mixing times in systems with small mutation rates and no absorbing states, whereas the mean fixation time has no such interpretation.
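
    A minimal simulation sketch of the paper's point, that the mean fixation time can be a poor summary of a broad, skewed distribution, is given below for a neutral one-step birth-death (Moran-type) process with two absorbing states. The transition rates are a generic illustrative choice, not the game-dynamical rates analysed in the paper.

```python
import numpy as np

def absorption_time(n0, N, rng):
    """One-step birth-death walk on {0,...,N}; 0 and N are absorbing.
    Neutral Moran-type rates: from state n, the state changes with probability
    2*n*(N-n)/N**2 per elementary step, and then moves up or down with equal odds."""
    n, t = n0, 0
    while 0 < n < N:
        p_step = 2 * n * (N - n) / N**2
        t += rng.geometric(p_step)            # waiting time until the next change
        n += 1 if rng.random() < 0.5 else -1
    return n, t

rng = np.random.default_rng(1)
N, n0, runs = 50, 1, 20000
fix_times = [t for n, t in (absorption_time(n0, N, rng) for _ in range(runs)) if n == N]

print("fixation prob  ~", len(fix_times) / runs)         # neutral expectation: n0/N
print("mean fixation time   ~", np.mean(fix_times))
print("median fixation time ~", np.median(fix_times))    # typically well below the mean
```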

  13. A decision support system using analytical hierarchy process (AHP) for the optimal environmental reclamation of an open-pit mine

    Science.gov (United States)

    Bascetin, A.

    2007-04-01

    The selection of an optimal reclamation method is one of the most important factors in open-pit design and production planning. It also affects economic considerations in open-pit design as a function of plan location and depth. Furthermore, the selection is a complex multi-person, multi-criteria decision problem. The group decision-making process can be improved by applying a systematic and logical approach to assess the priorities based on the inputs of several specialists from different functional areas within the mine company. The analytical hierarchy process (AHP) can be very useful in involving several decision makers with different conflicting objectives to arrive at a consensus decision. In this paper, the selection of an optimal reclamation method using an AHP-based model was evaluated for coal production in an open-pit coal mine located in the Seyitomer region in Turkey. The use of the proposed model indicates that it can be applied to improve group decision making in selecting a reclamation method that satisfies optimal specifications. Also, it is found that the decision process is systematic and that using the proposed model can reduce the time taken to select an optimal method.
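
    The computational core of AHP, deriving priority weights from a reciprocal pairwise comparison matrix via the principal eigenvector and checking judgment consistency with Saaty's consistency ratio, can be sketched as follows. The three criteria and the judgment values are illustrative assumptions, not the actual inputs used for the Seyitomer mine.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a reciprocal pairwise comparison matrix
    (principal eigenvector method) plus Saaty's consistency ratio."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    lam_max = eigvals[k].real
    ci = (lam_max - n) / (n - 1)                           # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]    # Saaty's random index
    cr = ci / ri if ri else 0.0                            # consistency ratio (< 0.1 is acceptable)
    return w, cr

# illustrative judgments for three reclamation criteria (cost, environment, land use)
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
weights, cr = ahp_weights(A)
print("weights:", np.round(weights, 3), " CR:", round(cr, 3))
```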

  14. Determination of material distribution in heading process of small bimetallic bar

    Science.gov (United States)

    Presz, Wojciech; Cacko, Robert

    2018-05-01

    Electrical connectors mostly have silver contacts joined by riveting. In order to reduce costs, the core of the contact rivet can be replaced with a cheaper material, e.g. copper. There is a wide range of commercially available bimetallic (silver-copper) rivets on the market for the production of contacts. This creates new conditions in the riveting process, because a bimetallic object is being riveted. In the analyzed example it is a small object, at the border of microforming. Based on FEM modeling of the loading process of bimetallic rivets with different material distributions, the desired distribution was chosen and the choice justified. Possible material distributions were parameterized with two parameters referring to the desired distribution characteristics. A parameter, the Coefficient of Mutual Interactions of Plastic Deformations, and a method for its determination are proposed. The parameter is determined based on two-parameter stress-strain curves and is a function of these parameters and the range of equivalent strains occurring in the analyzed process. The proposed method was applied to the upsetting process of the bimetallic head of an electrical contact. A nomogram was established to predict the distribution of materials in the head of the rivet and to support the appropriate selection of a pair of materials to achieve the desired distribution.

  15. Exact run length distribution of the double sampling x-bar chart with estimated process parameters

    Directory of Open Access Journals (Sweden)

    Teoh, W. L.

    2016-05-01

    Full Text Available Since the run length distribution is generally highly skewed, a significant concern about focusing too much on the average run length (ARL) criterion is that we may miss some crucial information about a control chart's performance. Thus it is important to investigate the entire run length distribution of a control chart for an in-depth understanding before implementing the chart in process monitoring. In this paper, the percentiles of the run length distribution for the double sampling (DS) X-bar chart with estimated process parameters are computed. Knowledge of the percentiles of the run length distribution provides a more comprehensive understanding of the expected behaviour of the run length. This additional information includes the early false alarm, the skewness of the run length distribution, and the median run length (MRL). A comparison of the run length distribution between the optimal ARL-based and MRL-based DS X-bar charts with estimated process parameters is presented in this paper. Examples of applications are given to aid practitioners in selecting the best design scheme of the DS X-bar chart with estimated process parameters, based on their specific purpose.
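
    For the much simpler case of a chart with known parameters and independent samples, the run length is geometric, so its percentiles and the median run length follow directly from the per-sample signal probability. The sketch below only illustrates that percentile idea; it does not reproduce the DS X-bar computations with estimated parameters from the paper.

```python
import math

def run_length_percentiles(p, percentiles=(0.05, 0.5, 0.95)):
    """Percentiles of a geometric run length, where p is the probability that
    any single sample signals (valid for a chart with known parameters and
    independent samples)."""
    out = {}
    for q in percentiles:
        out[q] = math.ceil(math.log(1 - q) / math.log(1 - p))
    out["ARL"] = 1 / p
    return out

# in-control example: 3-sigma limits on a normal mean give p ~ 0.0027
print(run_length_percentiles(0.0027))
```

    For the in-control example above, the median run length (about 257) lies well below the ARL of about 370, illustrating the skewness the paper warns about.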

  16. A community dataspace for distribution and processing of "long tail" high resolution topography data

    Science.gov (United States)

    Crosby, C. J.; Nandigam, V.; Arrowsmith, R.

    2016-12-01

    Topography is a fundamental observable for Earth and environmental science and engineering. High resolution topography (HRT) is revolutionary for Earth science. Cyberinfrastructure that enables users to discover, manage, share, and process these data increases the impact of investments in data collection and catalyzes scientific discovery. National Science Foundation funded OpenTopography (OT, www.opentopography.org) employs cyberinfrastructure that includes large-scale data management, high-performance computing, and service-oriented architectures, providing researchers with efficient online access to large HRT (mostly lidar) datasets, metadata, and processing tools. HRT data are collected from satellite, airborne, and terrestrial platforms at increasingly finer resolutions, greater accuracy, and shorter repeat times. There has been a steady increase in OT data holdings due to partnerships and collaborations with various organizations within the academic NSF domain and beyond. With the decreasing costs of HRT data collection, via methods such as Structure from Motion, the number of researchers collecting these data is increasing. Researchers collecting these "long-tail" topography data (of modest size but great value) face impediments, especially the costs associated with making them widely discoverable, shared, annotated, cited, managed, and archived. Also, because there are no existing central repositories or services to support storage and curation of these datasets, much of the data is isolated and difficult to locate and preserve. To overcome these barriers and provide efficient centralized access to these high-impact datasets, OT is developing a "Community DataSpace", a service built on a low-cost storage cloud (e.g. AWS S3), to make it easy for researchers to upload, curate, annotate and distribute their datasets. The system's ingestion workflow will extract metadata from data uploaded; validate it; assign a digital object identifier (DOI); and create a searchable

  17. The Semi-opened Infrastructure Model (SopIM): A Frame to Set Up an Organizational Learning Process

    Science.gov (United States)

    Grundstein, Michel

    In this paper, we introduce the "Semi-opened Infrastructure Model (SopIM)" implemented to deploy Artificial Intelligence and Knowledge-based Systems within a large industrial company. This model illustrates what could be two of the operating elements of the Model for General Knowledge Management within the Enterprise (MGKME) that are essential to set up the organizational learning process that leads people to appropriate and use concepts, methods and tools of an innovative technology: the "Ad hoc Infrastructures" element, and the "Organizational Learning Processes" element.

  18. Process Design and Economics for the Production of Algal Biomass: Algal Biomass Production in Open Pond Systems and Processing Through Dewatering for Downstream Conversion

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Ryan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Markham, Jennifer [National Renewable Energy Lab. (NREL), Golden, CO (United States); Kinchin, Christopher [National Renewable Energy Lab. (NREL), Golden, CO (United States); Grundl, Nicholas [National Renewable Energy Lab. (NREL), Golden, CO (United States); Tan, Eric C.D. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Humbird, David [DWH Process Consulting, Denver, CO (United States)

    2016-02-17

    This report describes in detail a set of aspirational design and process targets to better understand the realistic economic potential for the production of algal biomass for subsequent conversion to biofuels and/or coproducts, based on the use of open pond cultivation systems and a series of dewatering operations to concentrate the biomass up to 20 wt% solids (ash-free dry weight basis).

  19. Robust Model-based Control of Open-loop Unstable Processes

    International Nuclear Information System (INIS)

    Emad, Ali

    1999-01-01

    This paper addresses the development of new formulations for estimating modeling errors or unmeasured disturbances to be used in Model Predictive Control (MPC) algorithms during open-loop prediction. Two different formulations were developed in this paper: one is used in MPC that directly utilizes linear models, and the other in MPC that utilizes non-linear models. These estimation techniques were utilized to provide robust performance for MPC algorithms when the plant is open-loop unstable and under the influence of modeling error and/or unmeasured disturbances. For MPC that utilizes a non-linear model, the estimation technique is formulated as a fixed, small-size on-line optimization problem, while for linear MPC the unmeasured disturbances are estimated via a proposed linear disturbance model. The disturbance model coefficients are identified on-line from historical estimates of plant-model mismatch. The effectiveness of incorporating these proposed estimation techniques into MPC is tested through simulated implementation on a non-linear, unstable, exothermic fluidized-bed reactor. Closed-loop simulations proved the capability of the proposed estimation methods to stabilize and thereby improve the MPC performance in such cases. (Author)
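
    The paper's estimation formulations are not given in the abstract; a common baseline against which such schemes are compared is the constant output-disturbance (DMC-style) update, in which the current plant-model mismatch is added to all future open-loop predictions. The sketch below shows only that baseline, with assumed illustrative model matrices.

```python
import numpy as np

# Assumed discrete-time linear model x+ = A x + B u, y = C x (illustrative values only).
A = np.array([[1.0, 0.1], [0.0, 0.9]])
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])

def predict_with_disturbance(x, u_seq, y_measured):
    """Open-loop output prediction corrected by a constant output-disturbance
    estimate d = y_measured - y_model (the DMC-style bias update)."""
    d = y_measured - (C @ x)          # current plant-model mismatch
    preds = []
    for u in u_seq:
        x = A @ x + B @ np.atleast_1d(u)
        preds.append((C @ x) + d)     # the same offset is applied to every future step
    return np.array(preds).ravel()

x_model = np.array([0.0, 0.0])
print(predict_with_disturbance(x_model, u_seq=[1.0, 1.0, 1.0], y_measured=np.array([0.2])))
```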

  20. On the joint distribution of excursion duration and amplitude of a narrow-band Gaussian process

    DEFF Research Database (Denmark)

    Ghane, Mahdi; Gao, Zhen; Blanke, Mogens

    2018-01-01

    The probability density of crest amplitude and of duration of exceeding a given level are used in many theoretical and practical problems in engineering. The joint density is essential for design of constructions that are subjected to waves and wind. The presently available joint distributions of amplitude and period are limited to excursion through a mean-level or to describe the asymptotic behavior of high level excursions. This paper extends the knowledge by presenting a theoretical derivation of probability of wave exceedance amplitude and duration, for a narrow-band Gaussian process... distribution, as expected, and that the marginal distribution of excursion duration works both for asymptotic and non-asymptotic cases. The suggested model is found to be a good replacement for the empirical distributions that are widely used. Results from simulations of narrow-band Gaussian processes, real...
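
    For background (not stated in the abstract): for a narrow-band Gaussian process with variance sigma^2, the crest (envelope) amplitude is classically taken to be Rayleigh distributed, so the exceedance probability of a level a is P(A > a) = exp(-a^2 / (2 sigma^2)) for a >= 0. The paper's contribution is the joint and marginal treatment of excursion amplitude and duration beyond this standard marginal result.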

  1. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform

    Science.gov (United States)

    Moutsatsos, Ioannis K.; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J.; Jenkins, Jeremy L.; Holway, Nicholas; Tallarico, John; Parker, Christian N.

    2016-01-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an “off-the-shelf,” open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community. PMID:27899692

  2. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform.

    Science.gov (United States)

    Moutsatsos, Ioannis K; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J; Jenkins, Jeremy L; Holway, Nicholas; Tallarico, John; Parker, Christian N

    2017-03-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an "off-the-shelf," open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community.

  3. Exposure to an open-field arena increases c-Fos expression in a distributed anxiety-related system projecting to the basolateral amygdaloid complex

    DEFF Research Database (Denmark)

    Hale, M.W.; Hay-Schmidt, A.; Mikkelsen, J.D.

    2008-01-01

    Anxiety states and anxiety-related behaviors appear to be regulated by a distributed and highly interconnected system of brain structures including the basolateral amygdala. Our previous studies demonstrate that exposure of rats to an open-field in high- and low-light conditions results in a marked...... increase in c-Fos expression in the anterior part of the basolateral amygdaloid nucleus (BLA) compared with controls. The neural mechanisms underlying the anatomically specific effects of open-field exposure on c-Fos expression in the BLA are not clear, however, it is likely that this reflects activation...... to this region in combination with c-Fos immunostaining to identify cells responding to exposure to an open-field arena in low-light (8-13 lux) conditions (an anxiogenic stimulus in rats). Adult male Wistar rats received a unilateral microinjection of 4% CTb in phosphate-buffered saline into the basolateral...

  4. THE FEATURES OF LASER EMISSION ENERGY DISTRIBUTION AT MATHEMATIC MODELING OF WORKING PROCESS

    Directory of Open Access Journals (Sweden)

    A. M. Avsiyevich

    2013-01-01

    Full Text Available The spatial laser emission energy distribution of different continuous-operation settings depends on many factors, primarily on the design of the setting. To describe the intensity of a multimode laser emission energy distribution more accurately, an experimental-theoretical model is proposed, based on representing the experimentally measured distribution, to a given accuracy, as a superposition of basis functions. This model yields an approximation error of only 2.2 percent, compared with 24.6% and 61% for uniform and Gaussian approximations, respectively. Using the proposed model allows the interaction between the laser emission and the working surface to be taken into account more accurately and increases the accuracy of temperature field calculations in the mathematical modeling of laser treatment processes. A method for experimentally studying the laser emission energy distribution of a given source is presented, together with the mathematical apparatus for calculating the parameters of the distribution intensity as a function of radial distance in the surface heating zone.

  5. Distribution of chirality in the quantum walk: Markov process and entanglement

    International Nuclear Information System (INIS)

    Romanelli, Alejandro

    2010-01-01

    The asymptotic behavior of the quantum walk on the line is investigated, focusing on the probability distribution of chirality independently of position. It is shown analytically that this distribution has a long-time limit that is stationary and depends on the initial conditions. This result is unexpected in the context of the unitary evolution of the quantum walk, as it is usually linked to a Markovian process. The asymptotic value of the entanglement between the coin and the position is determined by the chirality distribution. For given asymptotic values of both the entanglement and the chirality distribution, it is possible to find the corresponding initial conditions within a particular class of spatially extended Gaussian distributions.

  6. Distributed Random Process for a Large-Scale Peer-to-Peer Lottery

    OpenAIRE

    Grumbach, Stéphane; Riemann, Robert

    2017-01-01

    Most online lotteries today fail to ensure the verifiability of the random process and rely on a trusted third party. This issue has received little attention since the emergence of distributed protocols like Bitcoin that demonstrated the potential of protocols with no trusted third party. We argue that the security requirements of online lotteries are similar to those of online voting, and propose a novel distributed online lottery protocol that applies techniques dev...

  7. Spatial distribution of juvenile and adult female Tanner crabs (Chionoecetes bairdi) in a glacial fjord ecosystem: Implications for recruitment processes

    Science.gov (United States)

    Nielsen, J.K.; Taggart, S. James; Shirley, Thomas C.; Mondragon, Jennifer

    2007-01-01

    A systematic pot survey in Glacier Bay, Alaska, was conducted to characterize the spatial distribution of juvenile and adult female Tanner crabs, and their association with depth and temperature. The information was used to infer important recruitment processes for Tanner crabs in glaciated ecosystems. High-catch areas for juvenile and adult female Tanner crabs were identified using local autocorrelation statistics. Spatial segregation by size class corresponded to features in the glacial landscape: high-catch areas for juveniles were located at the distal ends of two narrow glacial fjords, and high-catch areas for adults were located in the open waters of the central Bay. Juvenile female Tanner crabs were found at nearly all sampled depths (15–439 m) and temperatures (4–8°C), but the biggest catches were at depths <150 m where adults were scarce. Because adults may prey on or compete with juveniles, the distribution of juveniles could be influenced by the distribution of adults. Areas where adults or predators are scarce, such as glacially influenced fjords, could serve as refuges for juvenile Tanner crabs.

  8. Problem of uniqueness in the renewal process generated by the uniform distribution

    Directory of Open Access Journals (Sweden)

    D. Ugrin-Šparac

    1992-01-01

    Full Text Available The renewal process generated by the uniform distribution, when interpreted as a transformation of the uniform distribution into a discrete distribution, gives rise to the question of uniqueness of the inverse image. The paper deals with a particular problem from the described domain that arose in the construction of a complex stochastic test intended to evaluate pseudo-random number generators. The connection of the treated problem with the question of a unique integral representation of the Gamma function is also mentioned.
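
    One standard construction behind such a renewal process (an illustration, not necessarily the exact setting of the paper): with independent uniform(0,1) inter-arrival times, the number N of renewals needed for the partial sum to first exceed 1 is a discrete random variable with P(N = n) = (n-1)/n! and E[N] = e. A quick simulation check:

```python
import math
import random

def renewals_to_exceed_one(rng):
    total, n = 0.0, 0
    while total <= 1.0:
        total += rng.random()   # uniform(0,1) inter-arrival time
        n += 1
    return n

rng = random.Random(0)
samples = [renewals_to_exceed_one(rng) for _ in range(200_000)]
print("E[N] ~", sum(samples) / len(samples), "(theory: e =", round(math.e, 5), ")")
for n in (2, 3, 4):
    emp = samples.count(n) / len(samples)
    print(f"P(N={n}) ~ {emp:.4f}  (theory {(n - 1) / math.factorial(n):.4f})")
```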

  9. An OpenMP Parallelisation of Real-time Processing of CERN LHC Beam Position Monitor Data

    CERN Document Server

    Renshall, H

    2012-01-01

    SUSSIX is a FORTRAN program for the post-processing of turn-by-turn Beam Position Monitor (BPM) data, which computes the frequency, amplitude, and phase of tunes and resonant lines to a high degree of precision. For analysis of LHC BPM data, a specific version run through a C steering code has been implemented in the CERN Control Centre to run on a server under the Linux operating system, but it became a real-time computational bottleneck preventing truly online study of the BPM data. Timing studies showed that the independent processing of each BPM's data was a candidate for parallelization, and the Open Multiprocessing (OpenMP) package, with its simple insertion of compiler directives, was tried. It proved to be easy to learn and use, problem-free and efficient in this case, reaching a factor of ten reduction in real time over twelve cores on a dedicated server. This paper reviews the problem, shows the critical code fragments with their OpenMP directives and the results obtained.

  10. High speed vision processor with reconfigurable processing element array based on full-custom distributed memory

    Science.gov (United States)

    Chen, Zhe; Yang, Jie; Shi, Cong; Qin, Qi; Liu, Liyuan; Wu, Nanjian

    2016-04-01

    In this paper, a hybrid vision processor based on a compact full-custom distributed memory for near-sensor high-speed image processing is proposed. The proposed processor consists of a reconfigurable processing element (PE) array, a row processor (RP) array, and a dual-core microprocessor. The PE array includes two-dimensional processing elements with a compact full-custom distributed memory. It supports real-time reconfiguration between the PE array and the self-organized map (SOM) neural network. The vision processor is fabricated using a 0.18 µm CMOS technology. The circuit area of the distributed memory is reduced markedly, to 1/3 of that of a conventional memory, so that the circuit area of the vision processor is reduced by 44.2%. Experimental results demonstrate that the proposed design achieves correct functions.

  11. Audit trails in OpenSLEX : paving the road for process mining in healthcare

    NARCIS (Netherlands)

    González López De Murillas, E.; Helm, E.; Reijers, H.A.; Küng, J.; Bursa, M.; Holzinger, A.; Elena Renda, M.; Khuri, S.

    2017-01-01

    The analysis of organizational and medical treatment processes is crucial for the future development of the healthcare domain. Recent approaches to enable process mining on healthcare data make use of the hospital information systems' Audit Trails. In this work, methods are proposed to integrate

  12. Baseliner: an open source, interactive tool for processing sap flux data from thermal dissipation probes.

    Science.gov (United States)

    Andrew C. Oishi; David Hawthorne; Ram Oren

    2016-01-01

    Estimating transpiration from woody plants using thermal dissipation sap flux sensors requires careful data processing. Currently, researchers accomplish this using spreadsheets, or by personally writing scripts for statistical software programs (e.g., R, SAS). We developed the Baseliner software to help establish a standardized protocol for processing sap...

  13. 36 CFR 1275.42 - Processing period; notice of proposed opening.

    Science.gov (United States)

    2010-07-01

    ... abuses of governmental power information, as defined in § 1275.16(c), will be given priority processing... Agreement (see Appendix A to this Part). After the tape segments which consist of abuses of governmental power information have been released, the archivists will conduct archival processing of those tape...

  14. Cumulative distribution functions associated with bubble-nucleation processes in cavitation

    KAUST Repository

    Watanabe, Hiroshi

    2010-11-15

    Bubble-nucleation processes of a Lennard-Jones liquid are studied by molecular dynamics simulations. Waiting time, which is the lifetime of a superheated liquid, is determined for several system sizes, and the apparent finite-size effect of the nucleation rate is observed. From the cumulative distribution function of the nucleation events, the bubble-nucleation process is found to be not a simple Poisson process but a Poisson process with an additional relaxation time. The parameters of the exponential distribution associated with the process are determined by taking the relaxation time into account, and the apparent finite-size effect is removed. These results imply that the use of the arithmetic mean of the waiting time until a bubble grows to the critical size leads to an incorrect estimation of the nucleation rate. © 2010 The American Physical Society.
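
    The practical consequence described, that estimating the rate from the arithmetic mean of the waiting times is biased when a relaxation time is present, can be illustrated by fitting the shifted-exponential form F(t) = 1 - exp(-(t - t0)/tau) for t > t0 instead of a pure exponential. The sketch below uses synthetic waiting times and a crude shift estimate, not the paper's molecular dynamics data or fitting procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic waiting times: a relaxation time t0 followed by an exponential nucleation stage.
t0_true, tau_true = 5.0, 20.0
waiting = t0_true + rng.exponential(tau_true, size=5000)

# Naive estimate: treat the process as pure Poisson -> rate = 1 / mean waiting time.
rate_naive = 1.0 / waiting.mean()

# Estimate accounting for the relaxation time (simple shift-and-mean estimator):
t0_hat = waiting.min()                  # crude estimate of the delay
tau_hat = (waiting - t0_hat).mean()
rate_shifted = 1.0 / tau_hat

print(f"true rate {1/tau_true:.4f}, naive {rate_naive:.4f}, shifted {rate_shifted:.4f}")
```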

  15. Ruin Probabilities and Aggregrate Claims Distributions for Shot Noise Cox Processes

    DEFF Research Database (Denmark)

    Albrecher, H.; Asmussen, Søren

    We consider a risk process Rt where the claim arrival process is a superposition of a homogeneous Poisson process and a Cox process with a Poisson shot noise intensity process, capturing the effect of sudden increases of the claim intensity due to external events. The distribution of the aggregate claim size is investigated under these assumptions. For both light-tailed and heavy-tailed claim size distributions, asymptotic estimates for infinite-time and finite-time ruin probabilities are derived. Moreover, we discuss an extension of the model to an adaptive premium rule that is dynamically adjusted according to past claims experience.

  16. Unified theory for stochastic modelling of hydroclimatic processes: Preserving marginal distributions, correlation structures, and intermittency

    Science.gov (United States)

    Papalexiou, Simon Michael

    2018-05-01

    Hydroclimatic processes come in all "shapes and sizes". They are characterized by different spatiotemporal correlation structures and probability distributions that can be continuous, mixed-type, discrete or even binary. Simulating such processes by reproducing precisely their marginal distribution and linear correlation structure, including features like intermittency, can greatly improve hydrological analysis and design. Traditionally, modelling schemes are case specific and typically attempt to preserve few statistical moments providing inadequate and potentially risky distribution approximations. Here, a single framework is proposed that unifies, extends, and improves a general-purpose modelling strategy, based on the assumption that any process can emerge by transforming a specific "parent" Gaussian process. A novel mathematical representation of this scheme, introducing parametric correlation transformation functions, enables straightforward estimation of the parent-Gaussian process yielding the target process after the marginal back transformation, while it provides a general description that supersedes previous specific parameterizations, offering a simple, fast and efficient simulation procedure for every stationary process at any spatiotemporal scale. This framework, also applicable for cyclostationary and multivariate modelling, is augmented with flexible parametric correlation structures that parsimoniously describe observed correlations. Real-world simulations of various hydroclimatic processes with different correlation structures and marginals, such as precipitation, river discharge, wind speed, humidity, extreme events per year, etc., as well as a multivariate example, highlight the flexibility, advantages, and complete generality of the method.
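
    The core of the strategy, generating a "parent" Gaussian process and pushing it through the target marginal's inverse CDF, can be sketched in a few lines. The parametric correlation transformation functions that make the parent's autocorrelation map exactly onto the target's are the paper's contribution and are not reproduced here, so the lag-1 correlation below is only approximate; the gamma marginal and parameter values are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def simulate_target(n, rho, marginal):
    """Generate a stationary series with a given marginal distribution by
    transforming a parent Gaussian AR(1) process (lag-1 correlation rho)."""
    z = np.empty(n)
    z[0] = rng.standard_normal()
    for t in range(1, n):
        z[t] = rho * z[t - 1] + np.sqrt(1 - rho**2) * rng.standard_normal()
    u = stats.norm.cdf(z)              # uniformize the parent Gaussian process
    return marginal.ppf(u)             # back-transform to the target marginal

# example: a gamma-distributed process, a common choice for variables like streamflow
x = simulate_target(10_000, rho=0.7, marginal=stats.gamma(a=2.0, scale=1.5))
print("mean", x.mean().round(2), "lag-1 corr", np.corrcoef(x[:-1], x[1:])[0, 1].round(2))
```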

  17. Open Pension Funds in Poland: The Efects of the Pension Privatization Process

    Directory of Open Access Journals (Sweden)

    Oręziak Leokadia

    2014-10-01

    Full Text Available Since their establishment in 1999, the Open Pension Funds (OPFs) have comprised a mandatory capital pillar in the pension system of Poland. The paper's objective is to analyze the principles under which the OPFs function and assess their past and anticipated future impact on the state of the country's public finances, particularly on the public debt. The analysis also considers the past and potential effects of the OPFs' existence from the point of view of future levels of old-age pension. The studies are targeted at determining the threats connected with further maintenance of the OPFs from the point of view of both public finance stability and pension system security.

  18. Electron ionization of open/closed chain isocarbonic molecules relevant in plasma processing: Theoretical cross sections

    International Nuclear Information System (INIS)

    Patel, Umang R.; Joshipura, K. N.; Pandya, Siddharth H.; Kothari, Harshit N.

    2014-01-01

    In this paper, we report theoretical electron impact ionization cross sections from threshold to 2000 eV for the isocarbonic open-chain molecules C4H6, C4H8, and C4F6, including their isomers, and the closed-chain molecules c-C4H8 and c-C4F8. The theoretical formalism employed presently, viz., the Complex Scattering Potential-ionization contribution method, has been used successfully for a variety of polyatomic molecules. The present ionization calculations are very important since the results available for the studied targets are either scarce or nonexistent. Our work affords a comparison of C4-containing hydrocarbon versus fluorocarbon molecules. Comparisons of the present ionization cross sections are made wherever possible, and new ionization data are also presented

  19. Numerical modeling of cold room's hinged door opening and closing processes

    Science.gov (United States)

    Carneiro, R.; Gaspar, P. D.; Silva, P. D.; Domingues, L. C.

    2016-06-01

    The need to rationalize energy consumption in the agrifood industry has hastened the development of methodologies to improve the thermal and energy performance of cold rooms. This paper presents a three-dimensional (3D) transient Computational Fluid Dynamics (CFD) model of a cold room to evaluate the air infiltration rate through hinged doors. A species transport model is used for modelling the tracer gas concentration decay technique. Numerical predictions indicate that the air temperature difference between spaces affects the air infiltration. For this case study, the infiltration rate increases by 0.016 m3 s-1 per K of air temperature difference. Knowledge of the evolution of air infiltration during door opening/closing times allows some conclusions to be drawn about its influence on the air conditions inside the cold room, and suggests best practices and simple technical improvements that can minimize air infiltration and consequently improve thermal performance and the rationalization of energy consumption.

  20. The brain as a distributed intelligent processing system: an EEG study.

    Science.gov (United States)

    da Rocha, Armando Freitas; Rocha, Fábio Theoto; Massad, Eduardo

    2011-03-15

    Various neuroimaging studies, both structural and functional, have provided support for the proposal that a distributed brain network is likely to be the neural basis of intelligence. The theory of Distributed Intelligent Processing Systems (DIPS), first developed in the field of Artificial Intelligence, was proposed to adequately model distributed neural intelligent processing. In addition, the neural efficiency hypothesis suggests that individuals with higher intelligence display more focused cortical activation during cognitive performance, resulting in lower total brain activation when compared with individuals who have lower intelligence. This may be understood as a property of the DIPS. In our study, a new EEG brain mapping technique, based on the neural efficiency hypothesis and the notion of the brain as a Distributed Intelligence Processing System, was used to investigate the correlations between IQ evaluated with WAIS (Wechsler Adult Intelligence Scale) and WISC (Wechsler Intelligence Scale for Children), and the brain activity associated with visual and verbal processing, in order to test the validity of a distributed neural basis for intelligence. The present results support these claims and the neural efficiency hypothesis.

  1. The brain as a distributed intelligent processing system: an EEG study.

    Directory of Open Access Journals (Sweden)

    Armando Freitas da Rocha

    Full Text Available BACKGROUND: Various neuroimaging studies, both structural and functional, have provided support for the proposal that a distributed brain network is likely to be the neural basis of intelligence. The theory of Distributed Intelligent Processing Systems (DIPS), first developed in the field of Artificial Intelligence, was proposed to adequately model distributed neural intelligent processing. In addition, the neural efficiency hypothesis suggests that individuals with higher intelligence display more focused cortical activation during cognitive performance, resulting in lower total brain activation when compared with individuals who have lower intelligence. This may be understood as a property of the DIPS. METHODOLOGY AND PRINCIPAL FINDINGS: In our study, a new EEG brain mapping technique, based on the neural efficiency hypothesis and the notion of the brain as a Distributed Intelligence Processing System, was used to investigate the correlations between IQ evaluated with WAIS (Wechsler Adult Intelligence Scale) and WISC (Wechsler Intelligence Scale for Children), and the brain activity associated with visual and verbal processing, in order to test the validity of a distributed neural basis for intelligence. CONCLUSION: The present results support these claims and the neural efficiency hypothesis.

  2. The Brain as a Distributed Intelligent Processing System: An EEG Study

    Science.gov (United States)

    da Rocha, Armando Freitas; Rocha, Fábio Theoto; Massad, Eduardo

    2011-01-01

    Background Various neuroimaging studies, both structural and functional, have provided support for the proposal that a distributed brain network is likely to be the neural basis of intelligence. The theory of Distributed Intelligent Processing Systems (DIPS), first developed in the field of Artificial Intelligence, was proposed to adequately model distributed neural intelligent processing. In addition, the neural efficiency hypothesis suggests that individuals with higher intelligence display more focused cortical activation during cognitive performance, resulting in lower total brain activation when compared with individuals who have lower intelligence. This may be understood as a property of the DIPS. Methodology and Principal Findings In our study, a new EEG brain mapping technique, based on the neural efficiency hypothesis and the notion of the brain as a Distributed Intelligence Processing System, was used to investigate the correlations between IQ evaluated with WAIS (Wechsler Adult Intelligence Scale) and WISC (Wechsler Intelligence Scale for Children), and the brain activity associated with visual and verbal processing, in order to test the validity of a distributed neural basis for intelligence. Conclusion The present results support these claims and the neural efficiency hypothesis. PMID:21423657

  3. Simulation of business processes of processing and distribution of orders in transportation

    Directory of Open Access Journals (Sweden)

    Ольга Ігорівна Проніна

    2017-06-01

    Full Text Available Analysis of modern passenger transportation in Ukraine shows that, as the urban population grows, the need to develop passenger transport and to improve the quality of transport services grows as well. The paper examines three existing models of private passenger (taxi) transportation: a model with a dispatching service, a model without a dispatching service, and a mixed model. An algorithm for receiving an order, processing it, and executing it according to the given model is considered. Several arrangement schemes that characterize the operation of the system are also shown. The interrelation between the client making an order and the driver who receives and executes it is represented, with the server acting as the connecting link between customer and driver and regulating the system as a whole. The business process of private passenger transportation without a dispatching service was simulated. Based on the simulation results, it is proposed to supplement the model of private transportation with an advice system and to improve the car selection algorithm. The advice system provides the optimal choice of car, taking many factors into account, and also makes it possible to use more efficiently the specific additional services provided by the drivers. Optimizing the order handling process makes it possible to increase driver capacity and thus driver profits. Weak points of passenger transportation without a dispatching service were also identified. Application of the system will improve the transport structure under modern conditions and improve transportation based on a modern operating system

  4. Distributed inter process communication framework of BES III DAQ online software

    International Nuclear Information System (INIS)

    Li Fei; Liu Yingjie; Ren Zhenyu; Wang Liang; Chinese Academy of Sciences, Beijing; Chen Mali; Zhu Kejun; Zhao Jingwei

    2006-01-01

    The DAQ (Data Acquisition) system is an important part of BES III, the large-scale high-energy physics detector at BEPC. Inter-process communication (IPC) of online software in distributed environments is pivotal to the design and implementation of the DAQ system. This article introduces a distributed inter-process communication framework that is based on CORBA and used in the BES III DAQ online software. The article mainly presents the design and implementation of the IPC framework and applications based on it. (authors)

  5. Distributed password cracking

    OpenAIRE

    Crumpacker, John R.

    2009-01-01

    Approved for public release, distribution unlimited. Password cracking requires significant processing power, which in today's world is located at a workstation or home in the form of a desktop computer. Berkeley Open Infrastructure for Network Computing (BOINC) is the conduit to this significant source of processing power and John the Ripper is the key. BOINC is a distributed data processing system that incorporates client-server relationships to generically process data. The BOINC structu...

  6. Comparison of Environment Impact between Conventional and Cold Chain Management System in Paprika Distribution Process

    Directory of Open Access Journals (Sweden)

    Eidelweijs A Putri

    2012-09-01

    Full Text Available Pasir Langu village in Cisarua, West Java, is the largest central production area of paprika in Indonesia. On average, for every 200 kilograms of paprika produced, about 3 kilograms are rejected. This results in monetary losses for wholesalers and in waste; in one year, the losses can amount to approximately 11.7 million Indonesian rupiahs. Paprika wholesalers in Pasir Langu village are currently developing a cold chain management system to maintain the quality of paprika so that the number of rejections can be reduced. The objective of this study is to compare the environmental impacts of the conventional and cold chain management systems in the paprika distribution process using Life Cycle Assessment (LCA) methodology and to propose a photovoltaic (PV) system for the paprika distribution process. The results imply that the cold chain system produces more CO2 emissions than the conventional system; however, with the introduction of the PV system, these emissions would be reduced. Future research should address reducing CO2 emissions from transportation, since transportation is the biggest contributor of CO2 emissions in the whole distribution process. Keywords: LCA, environmentally friendly distribution, paprika, cold chain, PV system

  7. Poster - Thur Eve - 06: Comparison of an open source genetic algorithm to the commercially used IPSA for generation of seed distributions in LDR prostate brachytherapy.

    Science.gov (United States)

    McGeachy, P; Khan, R

    2012-07-01

    In early stage prostate cancer, low dose rate (LDR) prostate brachytherapy is a favorable treatment modality, where small radioactive seeds are permanently implanted throughout the prostate. Treatment centres currently rely on a commercial optimization algorithm, IPSA, to generate seed distributions for treatment plans. However, commercial software does not allow the user access to the source code, thus reducing the flexibility for treatment planning and impeding any implementation of new and, perhaps, improved clinical techniques. An open source genetic algorithm (GA) has been encoded in MATLAB to generate seed distributions for a simplified prostate and urethra model. To assess the quality of the seed distributions created by the GA, both the GA and IPSA were used to generate seed distributions for two clinically relevant scenarios and the quality of the GA distributions relative to IPSA distributions and clinically accepted standards for seed distributions was investigated. The first clinically relevant scenario involved generating seed distributions for three different prostate volumes (19.2 cc, 32.4 cc, and 54.7 cc). The second scenario involved generating distributions for three separate seed activities (0.397 mCi, 0.455 mCi, and 0.5 mCi). Both GA and IPSA met the clinically accepted criteria for the two scenarios, where distributions produced by the GA were comparable to IPSA in terms of full coverage of the prostate by the prescribed dose, and minimized dose to the urethra, which passed straight through the prostate. Further, the GA offered improved reduction of high dose regions (i.e hot spots) within the planned target volume. © 2012 American Association of Physicists in Medicine.

  8. Distributed process control system for remote control and monitoring of the TFTR tritium systems

    International Nuclear Information System (INIS)

    Schobert, G.; Arnold, N.; Bashore, D.; Mika, R.; Oliaro, G.

    1989-01-01

    This paper reviews the progress made in the application of a commercially available distributed process control system to support the requirements established for the Tritium REmote Control And Monitoring System (TRECAMS) of the Tokamak Fusion Test Reactor (TFTR). The system that will be discussed was purchased from the Texas Instruments (TI) Automation Controls Division, and was previously marketed by Rexnord Automation. It consists of three, fully redundant, distributed process controllers interfaced to over 1800 analog and digital I/O points. The operator consoles located throughout the facility are supported by four Digital Equipment Corporation (DEC) PDP-11/73 computers. The PDP-11/73's and the three process controllers communicate over a fully redundant one megabaud fiber optic network. All system functionality is based on a set of completely integrated databases loaded to the process controllers and the PDP-11/73's. (author). 2 refs.; 2 figs

  9. Advances in data processing for open-path Fourier transform infrared spectrometry of greenhouse gases.

    Science.gov (United States)

    Shao, Limin; Griffiths, Peter R; Leytem, April B

    2010-10-01

    The automated quantification of three greenhouse gases, ammonia, methane, and nitrous oxide, in the vicinity of a large dairy farm by open-path Fourier transform infrared (OP/FT-IR) spectrometry at intervals of 5 min is demonstrated. Spectral pretreatment, including the automated detection and correction of the effect of interruption of the infrared beam by a moving object, and the automated correction for the nonlinear detector response, is applied to the measured interferograms. Two ways of obtaining quantitative data from OP/FT-IR data are described. The first, which is installed in a recently acquired commercial OP/FT-IR spectrometer, is based on classical least-squares (CLS) regression, and the second is based on partial least-squares (PLS) regression. It is shown that CLS regression only gives accurate results if the absorption features of the analytes are located in very short spectral intervals where lines due to atmospheric water vapor are absent or very weak; of the three analytes examined, only ammonia fell into this category. On the other hand, PLS regression allowed what appeared to be accurate results to be obtained for all three analytes.
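
    As a rough illustration of the PLS step described above (not the authors' OP/FT-IR processing chain), the following sketch calibrates a PLS model on simulated spectra containing an analyte band overlapped by a broad water-vapour-like interference; the band shapes, noise levels and number of latent variables are arbitrary assumptions.

```python
# Minimal PLS calibration sketch on synthetic "spectra" (illustration only; this is
# not the authors' OP/FT-IR processing chain, and the data below are simulated).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_spectra, n_wavenumbers = 300, 400
concentrations = rng.uniform(0.0, 1.0, size=(n_spectra, 1))   # e.g. a path-averaged analyte amount

# Synthetic absorbance: analyte band + broad water-vapour-like interference + noise.
axis = np.linspace(0, 1, n_wavenumbers)
analyte_band = np.exp(-((axis - 0.3) / 0.02) ** 2)
water_band = np.exp(-((axis - 0.5) / 0.2) ** 2)
spectra = (concentrations * analyte_band
           + rng.uniform(0.5, 1.5, (n_spectra, 1)) * water_band
           + rng.normal(0, 0.01, (n_spectra, n_wavenumbers)))

X_train, X_test, y_train, y_test = train_test_split(spectra, concentrations, random_state=0)
pls = PLSRegression(n_components=5)            # number of latent variables is a tuning choice
pls.fit(X_train, y_train)
print("R^2 on held-out spectra:", round(pls.score(X_test, y_test), 3))
```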

  10. Estimating the transmission potential of supercritical processes based on the final size distribution of minor outbreaks.

    Science.gov (United States)

    Nishiura, Hiroshi; Yan, Ping; Sleeman, Candace K; Mode, Charles J

    2012-02-07

    Use of the final size distribution of minor outbreaks for the estimation of the reproduction numbers of supercritical epidemic processes has yet to be considered. We used a branching process model to derive the final size distribution of minor outbreaks, assuming a reproduction number above unity, and applying the method to final size data for pneumonic plague. Pneumonic plague is a rare disease with only one documented major epidemic in a spatially limited setting. Because the final size distribution of a minor outbreak needs to be normalized by the probability of extinction, we assume that the dispersion parameter (k) of the negative-binomial offspring distribution is known, and examine the sensitivity of the reproduction number to variation in dispersion. Assuming a geometric offspring distribution with k=1, the reproduction number was estimated at 1.16 (95% confidence interval: 0.97-1.38). When less dispersed with k=2, the maximum likelihood estimate of the reproduction number was 1.14. These estimates agreed with those published from transmission network analysis, indicating that the human-to-human transmission potential of the pneumonic plague is not very high. Given only minor outbreaks, transmission potential is not sufficiently assessed by directly counting the number of offspring. Since the absence of a major epidemic does not guarantee a subcritical process, the proposed method allows us to conservatively regard epidemic data from minor outbreaks as supercritical, and yield estimates of threshold values above unity. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
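
    A hedged illustration of the kind of data the paper's likelihood is built on: the sketch below simulates final sizes of transmission chains from a branching process with a negative-binomial offspring distribution (mean R, dispersion k), labelling chains that exceed a cap as major outbreaks. It is not the authors' estimator; the cap and sample size are arbitrary, and only the point estimate R = 1.16 with geometric offspring (k = 1) is taken from the abstract.

```python
# Simulation sketch: final sizes of minor outbreaks from a branching process with a
# negative-binomial offspring distribution (mean R, dispersion k). This only
# illustrates the data behind the paper's likelihood; it is not the authors'
# estimator, and the cap and sample size are arbitrary.
import numpy as np

rng = np.random.default_rng(2)

def final_size(R, k, cap=10_000):
    """Total number of cases before extinction; returns None if the cap is hit
    (interpreted as a major outbreak rather than a minor one)."""
    cases, active = 1, 1
    while active > 0:
        # negative binomial with mean R and dispersion k for each active case
        offspring = rng.negative_binomial(k, k / (k + R), size=active).sum()
        cases += offspring
        active = offspring
        if cases > cap:
            return None
    return cases

R, k = 1.16, 1.0        # point estimate reported in the abstract, geometric offspring (k = 1)
sizes = [final_size(R, k) for _ in range(5000)]
minor = [s for s in sizes if s is not None]
print("fraction of chains that stay minor:", round(len(minor) / len(sizes), 3))
print("mean final size of minor outbreaks:", round(np.mean(minor), 2))
```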

  11. A Java based environment to control and monitor distributed processing systems

    International Nuclear Information System (INIS)

    Legrand, I.C.

    1997-01-01

    Distributed processing systems are considered to solve the challenging requirements of triggering and data acquisition systems for future HEP experiments. The aim of this work is to present a software environment to control and monitor large scale parallel processing systems based on a distributed client-server approach developed in Java. One server task may control several processing nodes, switching elements or controllers for different sub-systems. Servers are designed as multi-thread applications for efficient communications with other objects. Servers communicate between themselves by using Remote Method Invocation (RMI) in a peer-to-peer mechanism. This distributed server layer has to provide a dynamic and transparent access from any client to all the resources in the system. The graphical user interface programs, which are platform independent, may be transferred to any client via the http protocol. In this scheme the control and monitoring tasks are distributed among servers, and the network controls the flow of information among servers and clients, providing a flexible mechanism for monitoring and controlling large heterogeneous distributed systems. (author)

  12. Peer Review Quality and Transparency of the Peer-Review Process in Open Access and Subscription Journals.

    Science.gov (United States)

    Wicherts, Jelte M

    2016-01-01

    Recent controversies highlighting substandard peer review in Open Access (OA) and traditional (subscription) journals have increased the need for authors, funders, publishers, and institutions to assure quality of peer-review in academic journals. I propose that transparency of the peer-review process may be seen as an indicator of the quality of peer-review, and develop and validate a tool enabling different stakeholders to assess transparency of the peer-review process. Based on editorial guidelines and best practices, I developed a 14-item tool to rate transparency of the peer-review process on the basis of journals' websites. In Study 1, a random sample of 231 authors of papers in 92 subscription journals in different fields rated transparency of the journals that published their work. Authors' ratings of the transparency were positively associated with quality of the peer-review process but unrelated to journal's impact factors. In Study 2, 20 experts on OA publishing assessed the transparency of established (non-OA) journals, OA journals categorized as being published by potential predatory publishers, and journals from the Directory of Open Access Journals (DOAJ). Results show high reliability across items (α = .91) and sufficient reliability across raters. Ratings differentiated the three types of journals well. In Study 3, academic librarians rated a random sample of 140 DOAJ journals and another 54 journals that had received a hoax paper written by Bohannon to test peer-review quality. Journals with higher transparency ratings were less likely to accept the flawed paper and showed higher impact as measured by the h5 index from Google Scholar. The tool to assess transparency of the peer-review process at academic journals shows promising reliability and validity. The transparency of the peer-review process can be seen as an indicator of peer-review quality allowing the tool to be used to predict academic quality in new journals.

  13. Towards a New Paradigm of Software Development: an Ambassador Driven Process in Distributed Software Companies

    Science.gov (United States)

    Kumlander, Deniss

    The globalization of companies' operations and the competition between software vendors demand improved quality of delivered software and a decrease in overall cost. At the same time, globalization introduces many problems into the software development process, as it produces distributed organizations that break the co-location rule of modern software development methodologies. Here we propose a reformulation of the ambassador position, increasing its productivity in order to bridge the communication and workflow gap by managing the entire communication process rather than concentrating purely on the communication result.

  14. Enforcement of entailment constraints in distributed service-based business processes.

    Science.gov (United States)

    Hummer, Waldemar; Gaubatz, Patrick; Strembeck, Mark; Zdun, Uwe; Dustdar, Schahram

    2013-11-01

    A distributed business process is executed in a distributed computing environment. The service-oriented architecture (SOA) paradigm is a popular option for the integration of software services and execution of distributed business processes. Entailment constraints, such as mutual exclusion and binding constraints, are important means to control process execution. Mutually exclusive tasks result from the division of powerful rights and responsibilities to prevent fraud and abuse. In contrast, binding constraints define that a subject who performed one task must also perform the corresponding bound task(s). We aim to provide a model-driven approach for the specification and enforcement of task-based entailment constraints in distributed service-based business processes. Based on a generic metamodel, we define a domain-specific language (DSL) that maps the different modeling-level artifacts to the implementation-level. The DSL integrates elements from role-based access control (RBAC) with the tasks that are performed in a business process. Process definitions are annotated using the DSL, and our software platform uses automated model transformations to produce executable WS-BPEL specifications which enforce the entailment constraints. We evaluate the impact of constraint enforcement on runtime performance for five selected service-based processes from existing literature. Our evaluation demonstrates that the approach correctly enforces task-based entailment constraints at runtime. The performance experiments illustrate that the runtime enforcement operates with an overhead that scales well up to the order of several tens of thousands of logged invocations. Using our DSL annotations, the user-defined process definition remains declarative and clean of security enforcement code. Our approach decouples the concerns of (non-technical) domain experts from technical details of entailment constraint enforcement. The developed framework integrates seamlessly with WS-BPEL and the Web
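
    To make the two constraint types concrete, the following schematic Python check scans a task-execution log for mutual-exclusion and binding violations. It is only a sketch of the concepts; the authors' approach works through a DSL and generated WS-BPEL rather than code like this, and the task and subject names are invented.

```python
# Schematic check of task-based entailment constraints over an execution log
# (illustration only; the authors' framework uses a DSL and generated WS-BPEL,
# and the task and subject names below are invented).
mutual_exclusion = [("approve_payment", "request_payment")]   # no subject may do both
binding = [("collect_evidence", "write_report")]              # same subject must do both

# log entries: (task, subject)
log = [("request_payment", "alice"),
       ("approve_payment", "bob"),
       ("collect_evidence", "carol"),
       ("write_report", "carol")]

def performers(task):
    return {subject for t, subject in log if t == task}

violations = []
for a, b in mutual_exclusion:
    overlap = performers(a) & performers(b)
    if overlap:
        violations.append(f"mutual exclusion {a}/{b} violated by {sorted(overlap)}")
for a, b in binding:
    # everyone who performed the bound task b must also have performed task a
    if performers(b) and not performers(b) <= performers(a):
        violations.append(f"binding {a}->{b} violated by {sorted(performers(b) - performers(a))}")

print(violations or "no entailment-constraint violations")
```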

  15. Welcome to Processes—A New Open Access Journal on Chemical and Biological Process Technology

    Directory of Open Access Journals (Sweden)

    Michael A. Henson

    2012-11-01

    Full Text Available As the result of remarkable technological progress, this past decade has witnessed considerable advances in our ability to manipulate natural and engineered systems, particularly at the molecular level. These advancements offer the potential to revolutionize our world through the development of novel soft and hard materials and the construction of new cellular platforms for chemical and pharmaceutical synthesis. For these technologies to truly impact society, the development of process technology that will enable effective large-scale production is essential. Improved processes are also needed for more established technologies in chemical and biochemical manufacturing, as these industries face ever increasing competitive pressure that mandates continuous improvement. [...

  16. Distributed real time data processing architecture for the TJ-II data acquisition system

    International Nuclear Information System (INIS)

    Ruiz, M.; Barrera, E.; Lopez, S.; Machon, D.; Vega, J.; Sanchez, E.

    2004-01-01

    This article describes the performance of a new model of architecture that has been developed for the TJ-II data acquisition system in order to increase its real time data processing capabilities. The current model consists of several compact PCI extension for instrumentation (PXI) standard chassis, each one with various digitizers. In this architecture, the data processing capability is restricted to the PXI controller's own performance. The controller must share its CPU resources between the data processing and the data acquisition tasks. In the new model, distributed data processing architecture has been developed. The solution adds one or more processing cards to each PXI chassis. This way it is possible to plan how to distribute the data processing of all acquired signals among the processing cards and the available resources of the PXI controller. This model allows scalability of the system. More or less processing cards can be added based on the requirements of the system. The processing algorithms are implemented in LabVIEW (from National Instruments), providing efficiency and time-saving application development when compared with other efficient solutions

  17. Dust charging processes with a Cairns-Tsallis distribution function with negative ions

    International Nuclear Information System (INIS)

    Abid, A. A.; Khan, M. Z.; Yap, S. L.; Terças, H.; Mahmood, S.

    2016-01-01

    Dust grain charging processes are presented in a non-Maxwellian dusty plasma following the Cairns-Tsallis (q, α)–distribution, whose constituents are the electrons, as well as the positive/negative ions and negatively charged dust grains. For this purpose, we have solved the current balance equation for a negatively charged dust grain to achieve an equilibrium state value (viz., q_d = constant) in the presence of Cairns-Tsallis (q, α)–distribution. In fact, the current balance equation becomes modified due to the Boltzmannian/streaming distributed negative ions. It is numerically found that the relevant plasma parameters, such as the spectral indexes q and α, the positive ion-to-electron temperature ratio, and the negative ion streaming speed (U_0) significantly affect the dust grain surface potential. It is also shown that in the limit q → 1 the Cairns-Tsallis distribution reduces to the Cairns distribution; for α = 0 the Cairns-Tsallis distribution reduces to pure Tsallis distribution and the latter reduces to Maxwellian distribution for q → 1 and α = 0.

  18. Dust charging processes with a Cairns-Tsallis distribution function with negative ions

    Energy Technology Data Exchange (ETDEWEB)

    Abid, A. A., E-mail: abidaliabid1@hotmail.com [Applied Physics Department, Federal Urdu University of Arts, Science and Technology, Islamabad Campus, Islamabad 45320 (Pakistan); Khan, M. Z., E-mail: mzk-qau@yahoo.com [Applied Physics Department, Federal Urdu University of Arts, Science and Technology, Islamabad Campus, Islamabad 45320 (Pakistan); Plasma Technology Research Center, Department of Physics, Faculty of Science, University of Malaya, Kuala Lumpur 50603 (Malaysia); Yap, S. L. [Plasma Technology Research Center, Department of Physics, Faculty of Science, University of Malaya, Kuala Lumpur 50603 (Malaysia); Terças, H., E-mail: hugo.tercas@tecnico.ul.pt [Physics of Information Group, Instituto de Telecomunicações, Av. Rovisco Pais, Lisbon 1049-001 (Portugal); Mahmood, S. [Science Place, University of Saskatchewan, Saskatoon, Saskatchewan S7N5A2 (Canada)

    2016-01-15

    Dust grain charging processes are presented in a non-Maxwellian dusty plasma following the Cairns-Tsallis (q, α)–distribution, whose constituents are the electrons, as well as the positive/negative ions and negatively charged dust grains. For this purpose, we have solved the current balance equation for a negatively charged dust grain to achieve an equilibrium state value (viz., q_d = constant) in the presence of Cairns-Tsallis (q, α)–distribution. In fact, the current balance equation becomes modified due to the Boltzmannian/streaming distributed negative ions. It is numerically found that the relevant plasma parameters, such as the spectral indexes q and α, the positive ion-to-electron temperature ratio, and the negative ion streaming speed (U_0) significantly affect the dust grain surface potential. It is also shown that in the limit q → 1 the Cairns-Tsallis distribution reduces to the Cairns distribution; for α = 0 the Cairns-Tsallis distribution reduces to pure Tsallis distribution and the latter reduces to Maxwellian distribution for q → 1 and α = 0.

  19. Opening the Black Box and Searching for Smoking Guns: Process Causality in Qualitative Research

    Science.gov (United States)

    Bennett, Elisabeth E.; McWhorter, Rochell R.

    2016-01-01

    Purpose: The purpose of this paper is to explore the role of qualitative research in causality, with particular emphasis on process causality. In one paper, it is not possible to discuss all the issues of causality, but the aim is to provide useful ways of thinking about causality and qualitative research. Specifically, a brief overview of the…

  20. Predicting cycle time distributions for integrated processing workstations : an aggregate modeling approach

    NARCIS (Netherlands)

    Veeger, C.P.L.; Etman, L.F.P.; Lefeber, A.A.J.; Adan, I.J.B.F.; Herk, van J.; Rooda, J.E.

    2011-01-01

    To predict cycle time distributions of integrated processing workstations, detailed simulation models are almost exclusively used; these models require considerable development and maintenance effort. As an alternative, we propose an aggregate model that is a lumped-parameter representation of the

  1. Distribution flow: a general process in the top layer of water repellent soils

    NARCIS (Netherlands)

    Ritsema, C.J.; Dekker, L.W.

    1995-01-01

    Distribution flow is the process of water and solute flowing in a lateral direction over and through the very first millimetre or centimetre of the soil profile. A potassium bromide tracer was applied in two water-repellent sandy soils to follow the actual flow paths of water and solutes in the

  2. The constitutive distributed parameter model of multicomponent chemical processes in gas, fluid and solid phase

    International Nuclear Information System (INIS)

    Niemiec, W.

    1985-01-01

    The literature on distributed parameter modelling of real processes does not consider the class of multicomponent chemical processes in gas, fluid and solid phase. The aim of this paper is a constitutive distributed parameter physicochemical model, constructed on the kinetics and phenomenological analysis of multicomponent chemical processes in gas, fluid and solid phase. The mass, energy and momentum aspects of these multicomponent chemical reactions and of the associated phenomena are utilized in the balance operations, under the conditions of: constitutive invariance for continuous media with space and time memories; the reciprocity principle for isotropic and anisotropic nonhomogeneous media with space and time memories; and the application of the definitions of the following derivative and of the equation of continuity, for the construction of systems of partial differential constitutive state equations in following-derivative form for the gas, fluid and solid phases. Couched in this way, all physicochemical conditions of multicomponent chemical processes in gas, fluid and solid phase form a new kind of constitutive distributed parameter model for automatic control, and its systems of equations are a new form of systems of partial differential constitutive state equations in the sense of phenomenological distributed parameter control.

  3. Kaplan-Meier estimators of distance distributions for spatial point processes

    NARCIS (Netherlands)

    Baddeley, A.J.; Gill, R.D.

    1997-01-01

    When a spatial point process is observed through a bounded window, edge effects hamper the estimation of characteristics such as the empty space function $F$, the nearest neighbour distance distribution $G$, and the reduced second order moment function $K$. Here we propose and study product-limit
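
    For readers unfamiliar with the product-limit idea the title refers to, the sketch below implements a generic Kaplan-Meier estimator for right-censored distance observations; it is not the edge-corrected spatial estimators of F, G or K developed in the paper, and the example data are invented.

```python
# Generic Kaplan-Meier (product-limit) estimator for right-censored observations,
# shown only to illustrate the estimator family the record refers to; it is not the
# edge-corrected spatial estimators of F, G or K proposed in the paper.
import numpy as np

def kaplan_meier(times, observed):
    """Return event times and the estimated survival curve S(t) (distinct times assumed)."""
    times, observed = np.asarray(times, float), np.asarray(observed, bool)
    order = np.argsort(times)
    times, observed = times[order], observed[order]
    at_risk = len(times)
    surv, curve_t, curve_s = 1.0, [], []
    for t, d in zip(times, observed):
        if d:                            # an observed (uncensored) distance
            surv *= 1.0 - 1.0 / at_risk
            curve_t.append(t)
            curve_s.append(surv)
        at_risk -= 1                     # censored or not, it leaves the risk set
    return np.array(curve_t), np.array(curve_s)

# Example: nearest-neighbour distances, censored where the window edge is closer.
dists = [0.4, 0.7, 1.1, 1.3, 2.0, 2.2]
uncensored = [True, True, False, True, False, True]
t, s = kaplan_meier(dists, uncensored)
print("estimated G(t) = 1 - S(t):", dict(zip(t, np.round(1 - s, 3))))
```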

  4. Spatial patterns in the distribution of kimberlites: relationship to tectonic processes and lithosphere structure

    DEFF Research Database (Denmark)

    Chemia, Zurab; Artemieva, Irina; Thybo, Hans

    2015-01-01

    of kimberlite melts through the lithospheric mantle, which forms the major pipe. Stage 2 (second-order process) begins when the major pipe splits into daughter sub-pipes (tree-like pattern) at crustal depths. We apply cluster analysis to the spatial distribution of all known kimberlite fields with the goal...

  5. Spatial Patterns in Distribution of Kimberlites: Relationship to Tectonic Processes and Lithosphere Structure

    DEFF Research Database (Denmark)

    Chemia, Zurab; Artemieva, Irina; Thybo, Hans

    2014-01-01

    of kimberlite melts through the lithospheric mantle, which forms the major pipe. Stage 2 (second-order process) begins when the major pipe splits into daughter sub-pipes (tree-like pattern) at crustal depths. We apply cluster analysis to the spatial distribution of all known kimberlite fields with the goal...

  6. Parallel Distributed Processing at 25: Further Explorations in the Microstructure of Cognition

    Science.gov (United States)

    Rogers, Timothy T.; McClelland, James L.

    2014-01-01

    This paper introduces a special issue of "Cognitive Science" initiated on the 25th anniversary of the publication of "Parallel Distributed Processing" (PDP), a two-volume work that introduced the use of neural network models as vehicles for understanding cognition. The collection surveys the core commitments of the PDP…

  7. Cumulative distribution functions associated with bubble-nucleation processes in cavitation

    KAUST Repository

    Watanabe, Hiroshi; Suzuki, Masaru; Ito, Nobuyasu

    2010-01-01

    of the exponential distribution associated with the process are determined by taking the relaxation time into account, and the apparent finite-size effect is removed. These results imply that the use of the arithmetic mean of the waiting time until a bubble grows

  8. Neighbor-dependent Ramachandran probability distributions of amino acids developed from a hierarchical Dirichlet process model.

    Directory of Open Access Journals (Sweden)

    Daniel Ting

    2010-04-01

    Full Text Available Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: (1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); (2) filtering of suspect conformations and outliers using B-factors or other features; (3) secondary structure of input data (e.g., whether helix and sheet are included; whether beta turns are included); (4) the method used for determining probability densities, ranging from simple histograms to modern nonparametric density estimation; and (5) whether they include nearest neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp.
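
    As a simplified stand-in for the hierarchical Dirichlet process machinery described above, the sketch below fits a single-level Dirichlet-process Gaussian mixture to synthetic (phi, psi) angles with scikit-learn; the angle clusters, truncation level and all parameter choices are assumptions, and the result is not the published densities.

```python
# Sketch: nonparametric density estimate of (phi, psi) backbone angles with a
# Dirichlet-process Gaussian mixture (scikit-learn). The paper uses *hierarchical*
# Dirichlet process priors shared across neighbour types; this simpler, single-level
# model and the synthetic angles below are stand-ins for illustration only.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(3)
# Synthetic angles clustered near typical alpha-helical and beta-sheet regions (degrees).
alpha = rng.normal([-63, -43], [10, 10], size=(500, 2))
beta = rng.normal([-120, 130], [15, 15], size=(300, 2))
angles = np.vstack([alpha, beta])

dpmm = BayesianGaussianMixture(
    n_components=20,                                  # truncation level, not the true K
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    max_iter=500,
    random_state=0,
).fit(angles)

used = dpmm.weights_ > 0.01
print("effective components:", used.sum())
print("log-density at (-63, -43):", round(dpmm.score_samples([[-63.0, -43.0]])[0], 3))
```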

  9. Third party testing : new pilot facility for mining processes opens in Fort McKay

    International Nuclear Information System (INIS)

    Jaremko, D.

    2007-01-01

    Fort McKay lies 65 kilometres north of Fort McMurray, Alberta and is the centre of operational oilsands mining activity. As such, it was chosen for a pilot testing facility created by the Geneva-based SGS Group. The reputable facility provides an opportunity for mining producers to advance their processes, including environmental performance, by allowing them to test different processes on their own oilsands. The Northern Lights partnership, led by Synenco Energy, was the first client at the facility. Due to outsourcing, clients are not obligated to make substantial capital investment into in-house research. The Northern Lights partnership will be using the facility to test extraction processes on bitumen from its leases. Although the Fort McKay facility is SGS's first venture into the oilsands industry, it operates in more than 140 companies globally, including the mineral industry, and specializes in inspection, verification, testing and certification. SGS took the experience from its minerals extraction business to identify what could be done to help the oilsands industry by using best practices developed from global operations. The facility lies on the Fort McKay industrial park owned by the Fort McKay First Nation. An existing testing facility called McMurray Resources Research and Testing was expanded by the SGS Group to include environmental analysis capabilities. The modular units that lie on 6 acres include refrigerated ore storage to maintain ore integrity; modular ore and materials handling systems; extraction equipment; and, zero discharge process water and waste disposal systems. Froth treatment will be added in the near future to cover the entire upstream side of the mining processing business. A micro-upgrader might be added in the future to manufacture synthetic crude. 3 figs

  10. Third party testing : new pilot facility for mining processes opens in Fort McKay

    Energy Technology Data Exchange (ETDEWEB)

    Jaremko, D.

    2007-12-15

    Fort McKay lies 65 kilometres north of Fort McMurray, Alberta and is the centre of operational oilsands mining activity. As such, it was chosen for a pilot testing facility created by the Geneva-based SGS Group. The reputable facility provides an opportunity for mining producers to advance their processes, including environmental performance, by allowing them to test different processes on their own oilsands. The Northern Lights partnership, led by Synenco Energy, was the first client at the facility. Due to outsourcing, clients are not obligated to make substantial capital investment into in-house research. The Northern Lights partnership will be using the facility to test extraction processes on bitumen from its leases. Although the Fort McKay facility is SGS's first venture into the oilsands industry, it operates in more than 140 companies globally, including the mineral industry, and specializes in inspection, verification, testing and certification. SGS took the experience from its minerals extraction business to identify what could be done to help the oilsands industry by using best practices developed from global operations. The facility lies on the Fort McKay industrial park owned by the Fort McKay First Nation. An existing testing facility called McMurray Resources Research and Testing was expanded by the SGS Group to include environmental analysis capabilities. The modular units that lie on 6 acres include refrigerated ore storage to maintain ore integrity; modular ore and materials handling systems; extraction equipment; and, zero discharge process water and waste disposal systems. Froth treatment will be added in the near future to cover the entire upstream side of the mining processing business. A micro-upgrader might be added in the future to manufacture synthetic crude. 3 figs.

  11. Open Knowledge Maps: Creating a Visual Interface to the World’s Scientific Knowledge Based on Natural Language Processing

    Directory of Open Access Journals (Sweden)

    Peter Kraker

    2016-11-01

    Full Text Available The goal of Open Knowledge Maps is to create a visual interface to the world’s scientific knowledge. The base for this visual interface consists of so-called knowledge maps, which enable the exploration of existing knowledge and the discovery of new knowledge. Our open source knowledge mapping software applies a mixture of summarization techniques and similarity measures on article metadata, which are iteratively chained together. After processing, the representation is saved in a database for use in a web visualization. In the future, we want to create a space for collective knowledge mapping that brings together individuals and communities involved in exploration and discovery. We want to enable people to guide each other in their discovery by collaboratively annotating and modifying the automatically created maps.
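
    The record describes chaining summarization techniques and similarity measures over article metadata; a rough sketch of one such chain (TF-IDF features, cosine similarity, agglomerative grouping) is shown below using scikit-learn. It is not the Open Knowledge Maps codebase, and the titles and parameters are invented.

```python
# Rough sketch of one similarity-and-grouping step of the kind described above:
# TF-IDF over article metadata, cosine similarity, and agglomerative grouping.
# This is not the Open Knowledge Maps implementation; titles and parameters are invented.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
from sklearn.cluster import AgglomerativeClustering

metadata = [
    "Open access publishing and peer review transparency",
    "Peer review quality in subscription journals",
    "Knowledge maps for literature discovery",
    "Visual interfaces for exploring scientific literature",
]

tfidf = TfidfVectorizer(stop_words="english").fit_transform(metadata)
similarity = cosine_similarity(tfidf)                  # pairwise article similarity
labels = AgglomerativeClustering(n_clusters=2).fit_predict(tfidf.toarray())

print(np.round(similarity, 2))
for doc, label in zip(metadata, labels):
    print(label, doc)
```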

  12. The SCOAP3 initiative and the Open Access Article-Processing-Charge market: global partnership and competition improve value in the dissemination of science

    CERN Document Server

    Romeu, Clément; Kohls, Alexander; Mansuy, Anne; Mele, Salvatore; Vesper, Martin

    2014-01-01

    The SCOAP3 (Sponsoring Consortium for Open Access Publishing in Particle Physics) initiative is an international partnership to convert to Open Access the published literature in the field of High-Energy Physics (HEP). It has been in operation since January 2014, and covers more than 4’000 articles/year. Originally initiated by CERN, the European Organization for Nuclear Research, and now counting partners representing 41 countries and 3 intergovernmental organizations, SCOAP3 has successfully converted to Open Access all, or part of, 6 HEP journals previously restricted to subscribers. It is also supporting publication of articles in 4 existing Open Access journals. As a “Gold” Open Access initiative, SCOAP3 pays Article Processing Charges (APCs), as publishers’ source of revenue for the publication service. Differentiating itself from other Open Access initiatives, SCOAP3 set APCs through a tendering process, correlating quality and price, at consistent conditions across participating publishers. Th...

  13. Using open source computational tools for predicting human metabolic stability and additional absorption, distribution, metabolism, excretion, and toxicity properties.

    Science.gov (United States)

    Gupta, Rishi R; Gifford, Eric M; Liston, Ted; Waller, Chris L; Hohman, Moses; Bunin, Barry A; Ekins, Sean

    2010-11-01

    Ligand-based computational models could be more readily shared between researchers and organizations if they were generated with open source molecular descriptors [e.g., chemistry development kit (CDK)] and modeling algorithms, because this would negate the requirement for proprietary commercial software. We initially evaluated open source descriptors and model building algorithms using a training set of approximately 50,000 molecules and a test set of approximately 25,000 molecules with human liver microsomal metabolic stability data. A C5.0 decision tree model demonstrated that CDK descriptors together with a set of Smiles Arbitrary Target Specification (SMARTS) keys had good statistics [κ = 0.43, sensitivity = 0.57, specificity = 0.91, and positive predicted value (PPV) = 0.64], equivalent to those of models built with commercial Molecular Operating Environment 2D (MOE2D) and the same set of SMARTS keys (κ = 0.43, sensitivity = 0.58, specificity = 0.91, and PPV = 0.63). Extending the dataset to ∼193,000 molecules and generating a continuous model using Cubist with a combination of CDK and SMARTS keys or MOE2D and SMARTS keys confirmed this observation. When the continuous predictions and actual values were binned to get a categorical score we observed a similar κ statistic (0.42). The same combination of descriptor set and modeling method was applied to passive permeability and P-glycoprotein efflux data with similar model testing statistics. In summary, open source tools demonstrated predictive results comparable to those of commercial software with attendant cost savings. We discuss the advantages and disadvantages of open source descriptors and the opportunity for their use as a tool for organizations to share data precompetitively, avoiding repetition and assisting drug discovery.
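
    The modelling-and-evaluation step described above can be sketched as follows with scikit-learn: a decision-tree classifier trained on a descriptor matrix and scored with kappa, sensitivity, specificity and PPV. The descriptors here are synthetic stand-ins for CDK/MOE2D descriptors and SMARTS keys, and a CART tree stands in for C5.0, so this illustrates the workflow only.

```python
# Sketch of the model-building/evaluation step: a decision-tree classifier on
# molecular descriptors scored with kappa, sensitivity, specificity and PPV.
# The descriptor matrix here is synthetic (the paper used CDK/MOE2D + SMARTS keys),
# and scikit-learn's CART tree stands in for C5.0.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import cohen_kappa_score, confusion_matrix

rng = np.random.default_rng(4)
X = rng.normal(size=(5000, 30))                       # pretend descriptor matrix
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, 5000) > 0.8).astype(int)  # "stable" class

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
pred = DecisionTreeClassifier(max_depth=6, random_state=0).fit(X_tr, y_tr).predict(X_te)

tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
print("kappa      :", round(cohen_kappa_score(y_te, pred), 2))
print("sensitivity:", round(tp / (tp + fn), 2))
print("specificity:", round(tn / (tn + fp), 2))
print("PPV        :", round(tp / (tp + fp), 2))
```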

  14. Effect of irreversible processes on the thermodynamic performance of open-cycle desiccant cooling cycles

    International Nuclear Information System (INIS)

    La, Dong; Li, Yong; Dai, Yanjun; Ge, Tianshu; Wang, Ruzhu

    2013-01-01

    Highlights: ► Effects of irreversible processes on the performance of desiccant cooling cycle are identified. ► The exergy destructions involved are classified by the properties of the individual processes. ► Appropriate indexes for thermodynamic evaluation are proposed based on thermodynamic analyses. - Abstract: Thermodynamic analyses of desiccant cooling cycle usually focus on the overall cycle performance in previous study. In this paper, the effects of the individual irreversible processes in each component on thermodynamic performance are analyzed in detail. The objective of this paper is to reveal the elemental features of the individual components, and to show their effects on the thermodynamic performance of the whole cycle in a fundamental way. Appropriate indexes for thermodynamic evaluation are derived based on the first and second law analyses. A generalized model independent of the connection of components is developed. The results indicate that as the effectiveness of the desiccant wheel increases, the cycle performance is increased principally due to the significant reduction in exergy carried out by exhaust air. The corresponding exergy destruction coefficient of the cycle with moderate performance desiccant wheel is decreased greatly to 3.9%, which is more than 50% lower than that of the cycle with low performance desiccant wheel. The effect of the heat source is similar. As the temperature of the heat source increases from 60 °C to 90 °C, the percentage of exergy destruction raised by exhaust air increases sharply from 5.3% to 21.8%. High heat exchanger effectiveness improves the cycle performance mainly by lowering the irreversibility of the heat exchanger, using less regeneration heat and pre-cooling the process air effectively

  15. The influence of drilling process automation on improvement of blasting works quality in open pit mining

    Science.gov (United States)

    Bodlak, Maciej; Dmytryk, Dominik; Mertuszka, Piotr; Szumny, Marcin; Tomkiewicz, Grzegorz

    2018-01-01

    The article describes the monitoring system for the blasthole drilling process called HNS (Hole Navigation System), which was used in blasting works performed by Maxam Poland Ltd. Developed by Atlas Copco, the HNS system - using satellite data - allows for very accurate mapping of the designed grid of blastholes. The article presents the results of several measurements of ground vibrations triggered by blasting operations designed and performed using traditional technology and using the HNS system, and reports first observations on the matter.

  16. Atmospheric processing of combustion aerosols as a source of soluble iron to the open ocean

    OpenAIRE

    伊藤, 彰記; ITO, Akinori

    2015-01-01

    The majority of bioavailable iron (Fe) from the atmosphere is delivered from arid and semiarid regions to the oceans because the global deposition of iron from combustion sources is small compared with that from mineral dust. Atmospheric processing of mineral aerosols by inorganic and organic acids from anthropogenic and natural sources has been shown to increase the iron solubility of soils (initially < 0.5%) up to about 10%. On the other hand, atmospheric observations have shown that iron i...

  17. Improving the extraction-and-loading process in the open mining operations

    Directory of Open Access Journals (Sweden)

    Cheban A. Yu.

    2017-09-01

    Full Text Available Blasting is the main way to prepare solid rock for excavation, and it results in the formation of a rock mass of uneven granulometric composition, which makes it impossible to use conveyor quarry transport without preliminary coarse crushing of the rock mass obtained from the blast. The way to achieve the greatest technical and economic effect is full conveyorization of quarry transport, which ensures a continuous flow of transport operations, automation of management and high labor productivity. The extraction-and-loading machines are the determining factor in the performance of the mining and transport machines in the technological flow of the quarry. When extracting blasted rock mass with single-bucket excavators or loaders working in combination with bottom-hole conveyors, self-propelled crushing and reloading units of various designs are used to grind oversize fragments to fractions of conditioned size. The presence of a crushing and reloading unit in the pit-face along with the excavator requires additional space for its placement, complicates the maneuvering of equipment in the pit-face, and increases the number of personnel and the cost of maintaining the extraction-and-reloading operations. The article proposes an improved method for carrying out the extraction-and-loading process, as well as the design of an extraction-and-grinding unit based on a quarry hydraulic excavator. The design of the proposed unit makes it possible to convert the cyclic process of scooping the rock mass into a continuous process of loading it onto the bottom-hole conveyor. Using the extraction-and-grinding unit allows the processes of excavation, preliminary crushing and loading of the rock mass to be combined, which ensures an increase in the efficiency of mining operations.

  18. Open Integrated Personal Learning Environment: Towards a New Conception of the ICT-Based Learning Processes

    Science.gov (United States)

    Conde, Miguel Ángel; García-Peñalvo, Francisco José; Casany, Marià José; Alier Forment, Marc

    Learning processes are changing in line with technological and sociological evolution, and taking this into account, a new learning strategy must be considered. Specifically, what is needed is an effective step towards the consolidation of eLearning 2.0 environments. This implies fusing the advantages of the traditional LMS (Learning Management System) - more oriented towards formative program control and planning - with the social learning and flexibility of web 2.0 educational applications.

  19. Distribution and interplay of geologic processes on Titan from Cassini radar data

    Science.gov (United States)

    Lopes, R.M.C.; Stofan, E.R.; Peckyno, R.; Radebaugh, J.; Mitchell, K.L.; Mitri, Giuseppe; Wood, C.A.; Kirk, R.L.; Wall, S.D.; Lunine, J.I.; Hayes, A.; Lorenz, R.; Farr, Tom; Wye, L.; Craig, J.; Ollerenshaw, R.J.; Janssen, M.; LeGall, A.; Paganelli, F.; West, R.; Stiles, B.; Callahan, P.; Anderson, Y.; Valora, P.; Soderblom, L.

    2010-01-01

    The Cassini Titan Radar Mapper is providing an unprecedented view of Titan's surface geology. Here we use Synthetic Aperture Radar (SAR) image swaths (Ta-T30) obtained from October 2004 to December 2007 to infer the geologic processes that have shaped Titan's surface. These SAR swaths cover about 20% of the surface, at a spatial resolution ranging from ???350 m to ???2 km. The SAR data are distributed over a wide latitudinal and longitudinal range, enabling some conclusions to be drawn about the global distribution of processes. They reveal a geologically complex surface that has been modified by all the major geologic processes seen on Earth - volcanism, tectonism, impact cratering, and erosion and deposition by fluvial and aeolian activity. In this paper, we map geomorphological units from SAR data and analyze their areal distribution and relative ages of modification in order to infer the geologic evolution of Titan's surface. We find that dunes and hummocky and mountainous terrains are more widespread than lakes, putative cryovolcanic features, mottled plains, and craters and crateriform structures that may be due to impact. Undifferentiated plains are the largest areal unit; their origin is uncertain. In terms of latitudinal distribution, dunes and hummocky and mountainous terrains are located mostly at low latitudes (less than 30??), with no dunes being present above 60??. Channels formed by fluvial activity are present at all latitudes, but lakes are at high latitudes only. Crateriform structures that may have been formed by impact appear to be uniformly distributed with latitude, but the well-preserved impact craters are all located at low latitudes, possibly indicating that more resurfacing has occurred at higher latitudes. Cryovolcanic features are not ubiquitous, and are mostly located between 30?? and 60?? north. We examine temporal relationships between units wherever possible, and conclude that aeolian and fluvial/pluvial/lacustrine processes are the

  20. New production processes for alpha hemihydrate open up new marketing opportunities

    International Nuclear Information System (INIS)

    Engert, W.; Lehmkaemper, O.; Bunte, H.P.

    1991-01-01

    New production processes and markets for alpha hemihydrate are discussed. Utility studies concluded that lignite gypsum is harmless in terms of public and occupational health, and is technically comparable to or superior to natural gypsum by virtue of greater purity. Semi-commercial and pilot-scale studies were carried out on the use of flue gas desulfurization (FGD) gypsum for producing alpha hemihydrate, with successful results. The process enabled pure alpha hemihydrate to be produced without dihydrate impurities, and of a constant, uniform quality. The treatment consists of forming pressed mouldings of FGD gypsum followed by steam autoclaving, drying and milling. Agents are used to stabilize the stackable moldings, and to act as growth inhibitors during transformation of dihydrate to alpha-hemihydrate. Markets for the product are found in mining, tunneling and road building, foundation work, floor systems, as hard plaster for dental and moulding applications, for construction industry use, and as structural and non-structural material. Details are presented of the production process and marketing concepts. 12 figs

  1. Strange quark distribution and parton charge symmetry violation in a semi-inclusive process

    International Nuclear Information System (INIS)

    Kitagawa, Hisashi; Sakemi, Yasuhiro

    2000-01-01

    It is possible to observe a semi-inclusive reaction with tagged charged kaons using the RICH detector at DESY-HERA. Using the semi-inclusive process we study two kinds of parton properties in the nucleon. We study relations between cross sections and strange quark distributions, which are expected to be measured more precisely in such a process than in the process in which pions are tagged. We also investigate charge symmetry violation (CSV) in the nucleon, which appears in the region x ≤ 0.1. (author)

  2. Standardization of a method to study the distribution of Americium in purex process

    International Nuclear Information System (INIS)

    Dapolikar, T.T.; Pant, D.K.; Kapur, H.N.; Kumar, Rajendra; Dubey, K.

    2017-01-01

    In the present work the distribution of Americium in PUREX process is investigated in various process streams. For this purpose a method has been standardized for the determination of Am in process samples. The method involves extraction of Am with associated actinides using 30% TRPO-NPH at 0.3M HNO 3 followed by selective stripping of Am from the organic phase into aqueous phase at 6M HNO 3 . The assay of aqueous phase for Am content is carried out by alpha radiometry. The investigation has revealed that 100% Am follows the HLLW route. (author)

  3. Bayesian inference for multivariate point processes observed at sparsely distributed times

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl; Møller, Jesper; Aukema, B.H.

    We consider statistical and computational aspects of simulation-based Bayesian inference for a multivariate point process which is only observed at sparsely distributed times. For specificity we consider a particular data set which has earlier been analyzed by a discrete time model involving unknown... normalizing constants. We discuss the advantages and disadvantages of using continuous time processes compared to discrete time processes in the setting of the present paper as well as other spatial-temporal situations. Keywords: Bark beetle, conditional intensity, forest entomology, Markov chain Monte Carlo...

  4. Modelling spatiotemporal distribution patterns of earthworms in order to indicate hydrological soil processes

    Science.gov (United States)

    Palm, Juliane; Klaus, Julian; van Schaik, Loes; Zehe, Erwin; Schröder, Boris

    2010-05-01

    Soils provide central ecosystem functions in recycling nutrients, detoxifying harmful chemicals as well as regulating microclimate and local hydrological processes. The internal regulation of these functions and therefore the development of healthy and fertile soils mainly depend on the functional diversity of plants and animals. Soil organisms drive essential processes such as litter decomposition, nutrient cycling, water dynamics, and soil structure formation. Disturbances by different soil management practices (e.g., soil tillage, fertilization, pesticide application) affect the distribution and abundance of soil organisms and hence influence regulating processes. The strong relationship between environmental conditions and soil organisms gives us the opportunity to link spatiotemporal distribution patterns of indicator species with the potential provision of essential soil processes on different scales. Earthworms are key organisms for soil function and affect, among other things, water dynamics and solute transport in soils. Through their burrowing activity, earthworms increase the number of macropores by building semi-permanent burrow systems. In the unsaturated zone, earthworm burrows act as preferential flow pathways and affect water infiltration, surface-, subsurface- and matrix flow as well as the transport of water and solutes into deeper soil layers. Thereby different ecological earthworm types have different importance. Deep burrowing anecic earthworm species (e.g., Lumbricus terrestris) affect the vertical flow and thus increase the risk of potential contamination of ground water with agrochemicals. In contrast, horizontal burrowing endogeic (e.g., Aporrectodea caliginosa) and epigeic species (e.g., Lumbricus rubellus) increase water conductivity and the diffuse distribution of water and solutes in the upper soil layers. The question which processes are more relevant is pivotal for soil management and risk assessment. Thus, finding relevant

  5. A distributed process monitoring system for nuclear powered electrical generating facilities

    International Nuclear Information System (INIS)

    Sweney, A.D.

    1991-01-01

    Duke Power Company is one of the largest investor owned utilities in the United States, with a service area of 20,000 square miles extending across North and South Carolina. Oconee Nuclear Station, one of Duke Power's three nuclear generating facilities, is a three unit pressurized water reactor site and has, over the course of its 15-year operating lifetime, effectively run out of plant processing capability. From a severely overcrowded cable spread room to an aging overtaxed Operator Aid Computer, the problems with trying to add additional process variables to the present centralized Operator Aid Computer are almost insurmountable obstacles. This paper reports that for this reason, and to realize the inherent benefits of a distributed process monitoring and control system, Oconee has embarked on a project to demonstrate the ability of a distributed system to perform in the nuclear power plant environment

  6. Method for distributed agent-based non-expert simulation of manufacturing process behavior

    Science.gov (United States)

    Ivezic, Nenad; Potok, Thomas E.

    2004-11-30

    A method for distributed agent based non-expert simulation of manufacturing process behavior on a single-processor computer comprises the steps of: object modeling a manufacturing technique having a plurality of processes; associating a distributed agent with each the process; and, programming each the agent to respond to discrete events corresponding to the manufacturing technique, wherein each discrete event triggers a programmed response. The method can further comprise the step of transmitting the discrete events to each agent in a message loop. In addition, the programming step comprises the step of conditioning each agent to respond to a discrete event selected from the group consisting of a clock tick message, a resources received message, and a request for output production message.
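
    A schematic rendering of the method described in this record - one agent per manufacturing process, each reacting to discrete events delivered in a message loop - is sketched below. The class, event handling and numbers are illustrative assumptions, not the patented implementation.

```python
# Schematic rendering of the described method: one agent per manufacturing process,
# each responding to discrete events delivered in a message loop on a single processor.
# The class, event handling and values below are illustrative, not the patent's code.
class ProcessAgent:
    def __init__(self, name, cycle_time):
        self.name, self.cycle_time = name, cycle_time
        self.stock, self.clock, self.produced = 0, 0, 0

    def handle(self, event, payload=None):
        if event == "clock_tick":
            self.clock += 1
        elif event == "resources_received":
            self.stock += payload
        elif event == "request_output_production":
            if self.stock > 0 and self.clock >= self.cycle_time:
                self.stock -= 1
                self.produced += 1
                self.clock = 0

agents = [ProcessAgent("casting", 2), ProcessAgent("machining", 3)]

# Message loop: every discrete event is broadcast to every agent.
events = [("resources_received", 4), ("clock_tick", None), ("clock_tick", None),
          ("request_output_production", None), ("clock_tick", None),
          ("request_output_production", None)]
for event, payload in events:
    for agent in agents:
        agent.handle(event, payload)

for a in agents:
    print(a.name, "produced", a.produced, "units; stock", a.stock)
```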

  7. Radial transport processes as a precursor to particle deposition in drinking water distribution systems.

    Science.gov (United States)

    van Thienen, P; Vreeburg, J H G; Blokker, E J M

    2011-02-01

    Various particle transport mechanisms play a role in the build-up of discoloration potential in drinking water distribution networks. In order to enhance our understanding of and ability to predict this build-up, it is essential to recognize and understand their role. Gravitational settling with drag has primarily been considered in this context. However, since flow in water distribution pipes is nearly always in the turbulent regime, turbulent processes should be considered also. In addition to these, single particle effects and forces may affect radial particle transport. In this work, we present an application of a previously published turbulent particle deposition theory to conditions relevant for drinking water distribution systems. We predict quantitatively under which conditions turbophoresis, including the virtual mass effect, the Saffman lift force, and the Magnus force may contribute significantly to sediment transport in radial direction and compare these results to experimental observations. The contribution of turbophoresis is mostly limited to large particles (>50 μm) in transport mains, and not expected to play a major role in distribution mains. The Saffman lift force may enhance this process to some degree. The Magnus force is not expected to play any significant role in drinking water distribution systems. © 2010 Elsevier Ltd. All rights reserved.
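
    To give a feel for when turbophoresis can matter, the back-of-the-envelope sketch below evaluates the dimensionless particle relaxation time tau+ from standard Stokes-regime and smooth-pipe expressions; the pipe diameter, velocity and particle properties are illustrative values chosen here, not figures from the paper.

```python
# Back-of-the-envelope sketch: dimensionless particle relaxation time (tau+), the
# quantity that governs whether turbophoresis matters for particle deposition.
# Formulas are the standard Stokes-regime / smooth-pipe expressions; the pipe and
# particle values below are illustrative assumptions, not those of the paper.
import math

rho_w, nu, mu = 1000.0, 1.0e-6, 1.0e-3      # water: density [kg/m3], kin. and dyn. viscosity
rho_p, d_p = 2650.0, 50e-6                  # sediment particle: density [kg/m3], diameter [m]
D, U = 0.3, 1.0                             # pipe diameter [m] and mean velocity [m/s]

Re = U * D / nu                             # pipe Reynolds number
f = 0.316 * Re ** -0.25                     # Blasius friction factor (smooth pipe, approximate)
u_star = math.sqrt(f / 8.0) * U             # friction velocity
tau_p = rho_p * d_p ** 2 / (18.0 * mu)      # Stokes particle relaxation time
tau_plus = tau_p * u_star ** 2 / nu         # dimensionless relaxation time

print(f"Re = {Re:.0f}, u* = {u_star:.3f} m/s, tau+ = {tau_plus:.2f}")
# Commonly quoted regimes: tau+ below ~0.1 diffusion-dominated, ~0.1-10 diffusion-impaction,
# above ~10 inertia-moderated; turbophoresis becomes significant in the upper two regimes.
```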

  8. The influence of drilling process automation on improvement of blasting works quality in open pit mining

    Directory of Open Access Journals (Sweden)

    Bodlak Maciej

    2018-01-01

    Full Text Available The article describes the monitoring system for the blasthole drilling process called HNS (Hole Navigation System), which was used in blasting works performed by Maxam Poland Ltd. Developed by Atlas Copco, the HNS system – using satellite data – allows for very accurate mapping of the designed grid of blastholes. The article presents the results of several measurements of ground vibrations triggered by blasting operations designed and performed using traditional technology and using the HNS system, and reports first observations on the matter.

  9. Distributed dendritic processing facilitates object detection: a computational analysis on the visual system of the fly.

    Science.gov (United States)

    Hennig, Patrick; Möller, Ralf; Egelhaaf, Martin

    2008-08-28

    Detecting objects is an important task when moving through a natural environment. Flies, for example, may land on salient objects or may avoid collisions with them. The neuronal ensemble of Figure Detection cells (FD-cells) in the visual system of the fly is likely to be involved in controlling these behaviours, as these cells are more sensitive to objects than to extended background structures. Until now the computations in the presynaptic neuronal network of FD-cells and, in particular, the functional significance of the experimentally established distributed dendritic processing of excitatory and inhibitory inputs is not understood. We use model simulations to analyse the neuronal computations responsible for the preference of FD-cells for small objects. We employed a new modelling approach which allowed us to account for the spatial spread of electrical signals in the dendrites while avoiding detailed compartmental modelling. The models are based on available physiological and anatomical data. Three models were tested each implementing an inhibitory neural circuit, but differing by the spatial arrangement of the inhibitory interaction. Parameter optimisation with an evolutionary algorithm revealed that only distributed dendritic processing satisfies the constraints arising from electrophysiological experiments. In contrast to a direct dendro-dendritic inhibition of the FD-cell (Direct Distributed Inhibition model), an inhibition of its presynaptic retinotopic elements (Indirect Distributed Inhibition model) requires smaller changes in input resistance in the inhibited neurons during visual stimulation. Distributed dendritic inhibition of retinotopic elements as implemented in our Indirect Distributed Inhibition model is the most plausible wiring scheme for the neuronal circuit of FD-cells. This microcircuit is computationally similar to lateral inhibition between the retinotopic elements. Hence, distributed inhibition might be an alternative explanation of

  10. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    OpenAIRE

    Sanggoo Kang; Kiwon Lee

    2016-01-01

    Cloud computing is a base platform for the distribution of large volumes of data and high-performance image processing on the Web. Despite wide applications in Web-based services and their many benefits, geo-spatial applications based on cloud computing technology are still developing. Auto-scaling realizes automatic scalability, i.e., the scale-out and scale-in processing of virtual servers in a cloud computing environment. This study investigates the applicability of auto-scaling to geo-bas...

  11. Applying Idea Management System (IMS Approach to Design and Implement a collaborative Environment in Public Service related open Innovation Processes

    Directory of Open Access Journals (Sweden)

    Marco Alessi

    2015-12-01

    Full Text Available Novel ideas are the key ingredients for innovation processes, and Idea Management System (IMS plays a prominent role in managing captured ideas from external stakeholders and internal actors within an Open Innovation process. By considering a specific case study, Lecce-Italy, we have designed and implemented a collaborative environment, which provides an ideal platform for government, citizens, etc. to share ideas and co-create the value of innovative public services in Lecce. In this study the application of IMS with six main steps, including: idea generation, idea improvement, idea selection, refinement, idea implementation, and monitoring, shows that this, remarkably, helps service providers to exploit the intellectual capital and initiatives of the regional stakeholders and citizens and assist service providers to stay in line with the needs of society. Moreover, we have developed two support tools to foster collaboration and transparency: sentiment analysis tool and gamification application.

  12. The Open Method of Co-ordination and the Analysis of Mutual Learning Processes of the European Employment Strategy

    DEFF Research Database (Denmark)

    Nedergaard, Peter

    2005-01-01

    The purpose of this paper is to address two normative and interlinked methodological and theoretical questions concerning the Open Method of Coordination (OMC): First, what is the most appropriate approach to learning in the analyses of the processes of the European Employment Strategy (EES......)? Second, how should mutual learning processes be diffused among the Member States in order to be efficient? In answering these two questions the paper draws on a social constructivist approach to learning thereby contributing to the debate about learning in the political science literature. At the same...... time, based on the literature and participatory observations, it is concluded that the learning effects of the EES are probably somewhat larger than what is normally suggested, but that successful diffusion still depends on a variety of contextual factors. At the end of the paper a path for empirical...

  13. VMStools: Open-source software for the processing, analysis and visualization of fisheries logbook and VMS data

    DEFF Research Database (Denmark)

    Hintzen, Niels T.; Bastardie, Francois; Beare, Doug

    2012-01-01

    VMStools is a package of open-source software, built using the freeware environment R, specifically developed for the processing, analysis and visualisation of landings (logbooks) and vessel location data (VMS) from commercial fisheries. Analyses start with standardized data formats for logbook... fishing from other activities, provide high-resolution maps of both fishing effort and -landings, interpolate vessel tracks, calculate indicators of fishing impact as listed under the Data Collection Framework at different spatio-temporal scales. Finally data can be transformed into other existing formats......, for example to populate regional databases like FishFrame. This paper describes workflow examples of these features while online material allows a head start to perform these analyses. This software incorporates state-of-the-art VMS and logbook analysing methods standardizing the process towards obtaining pan...

  14. Comparison of the depth distribution processes for 137Cs and 210Pbex in cultivated soils

    International Nuclear Information System (INIS)

    Zhang Yunqi; Zhang Xinbao; Long Yi; He Xiubin; Yu Xingxiu

    2012-01-01

    This paper focuses on the different processes of 137 Cs and 210 Pb ex depth distribution in cultivated soils. In view of their different fallout deposition processes, and considering that radionuclides will diffuse from the plough layer to the plough pan layer due to the concentration gradient between the two layers, the 137 Cs and 210 Pb ex depth distribution processes were derived theoretically. Additionally, the theoretical derivation was verified against the measured 137 Cs and 210 Pb ex values in a soil core collected from a wheat field in Fujianzhuang, Shanxi Province, China, and the variation of 137 Cs and 210 Pb ex concentrations with depth in the soils of the wheat field was explained rationally. The 137 Cs depth distribution in cultivated soils will vary continually with time owing to continual 137 Cs decay and diffusion, as it is an artificial radionuclide without sustained fallout input since the 1960s. In contrast, the 210 Pb ex depth distribution in cultivated soils will reach a steady state because of the sustained deposition of naturally occurring 210 Pb ex fallout, and it can be concluded that the differences between the theoretical and the measured values, especially for 210 Pb ex , might be associated with the history of plough depth variation or LUCC. (authors)

  15. Power Distribution Analysis For Electrical Usage In Province Area Using Olap (Online Analytical Processing

    Directory of Open Access Journals (Sweden)

    Samsinar Riza

    2018-01-01

    The distribution network is the part of the power grid closest to the customers of electric service providers such as PT. PLN. The dispatching center of a power grid company is also the data center of the grid, where a great amount of operating information is gathered. The valuable information contained in these data means a lot for power grid operating management. Data warehousing with online analytical processing (OLAP) has been used to manage and analyse this large volume of data. The specific outputs of the online analytical information system, produced by data warehouse processing with OLAP, are chart reporting and query reporting. The chart reporting consists of the load distribution chart over time, the distribution chart by area, the substation region chart and the electric load usage chart. The results of the OLAP process show the development of the electric load distribution, provide analysis of electric power consumption loads, and offer an alternative way of presenting information related to peak load.
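    As a small, hypothetical illustration of the kind of OLAP-style roll-up described above (the column names and data are invented, and pandas is used here as a stand-in for a dedicated OLAP server):

        import pandas as pd

        # Toy load records: one row per (month, area, substation) with a load reading in MW.
        records = pd.DataFrame({
            "month":      ["2017-01", "2017-01", "2017-02", "2017-02"],
            "area":       ["North", "South", "North", "South"],
            "substation": ["N-1", "S-1", "N-1", "S-2"],
            "load_mw":    [120.0, 95.5, 131.2, 101.3],
        })

        # Roll-up (aggregate) the load by month and area, as an OLAP cube query would.
        cube = records.pivot_table(index="month", columns="area",
                                   values="load_mw", aggfunc="sum")
        print(cube)

        # Slice: one area's monthly load, analogous to an OLAP drill-down report.
        print(records[records["area"] == "North"].groupby("month")["load_mw"].sum())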

  16. Power Distribution Analysis For Electrical Usage In Province Area Using Olap (Online Analytical Processing)

    Science.gov (United States)

    Samsinar, Riza; Suseno, Jatmiko Endro; Widodo, Catur Edi

    2018-02-01

    The distribution network is the part of the power grid closest to the customers of electric service providers such as PT. PLN. The dispatching center of a power grid company is also the data center of the grid, where a great amount of operating information is gathered. The valuable information contained in these data means a lot for power grid operating management. Data warehousing with online analytical processing (OLAP) has been used to manage and analyse this large volume of data. The specific outputs of the online analytical information system, produced by data warehouse processing with OLAP, are chart reporting and query reporting. The chart reporting consists of the load distribution chart over time, the distribution chart by area, the substation region chart and the electric load usage chart. The results of the OLAP process show the development of the electric load distribution, provide analysis of electric power consumption loads, and offer an alternative way of presenting information related to peak load.

  17. PROCESS CAPABILITY ESTIMATION FOR NON-NORMALLY DISTRIBUTED DATA USING ROBUST METHODS - A COMPARATIVE STUDY

    Directory of Open Access Journals (Sweden)

    Yerriswamy Wooluru

    2016-06-01

    Process capability indices are very important process quality assessment tools in the automotive industry. The common process capability indices (PCIs) Cp, Cpk and Cpm are widely used in practice. The use of these PCIs is based on the assumption that the process is in control and its output is normally distributed. In practice, normality is not always fulfilled. Indices developed on the normality assumption are very sensitive to non-normal processes. When the distribution of a product quality characteristic is non-normal, Cp and Cpk indices calculated using conventional methods often lead to erroneous interpretation of process capability. In the literature, various methods have been proposed for surrogate process capability indices under non-normality, but few literature sources offer a comprehensive evaluation and comparison of their ability to capture true capability in non-normal situations. In this paper, five methods have been reviewed and a capability evaluation is carried out for data pertaining to the resistivity of silicon wafers. The final results revealed that the Burr-based percentile method is better than the Clements method. Modelling of non-normal data and the Box-Cox transformation method using statistical software (Minitab 14) provide reasonably good results, as they are very promising methods for non-normal and moderately skewed data (skewness <= 1.5).
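    As a hedged sketch of the two families of indices being compared (the data below are synthetic; the percentile form is the generic Clements-style construction using empirical 0.135 %, 50 % and 99.865 % percentiles, rather than the exact Pearson-curve fit of the Clements method or the Burr-based variant evaluated in the paper):

        import numpy as np

        def cp_cpk_normal(x, lsl, usl):
            """Conventional indices, valid only if x is (approximately) normal."""
            mu, sigma = np.mean(x), np.std(x, ddof=1)
            cp  = (usl - lsl) / (6 * sigma)
            cpk = min(usl - mu, mu - lsl) / (3 * sigma)
            return cp, cpk

        def cp_cpk_percentile(x, lsl, usl):
            """Percentile-based (Clements-style) indices for non-normal data."""
            p_lo, med, p_hi = np.percentile(x, [0.135, 50.0, 99.865])
            cp  = (usl - lsl) / (p_hi - p_lo)
            cpk = min((usl - med) / (p_hi - med), (med - lsl) / (med - p_lo))
            return cp, cpk

        x = np.random.default_rng(0).lognormal(mean=0.0, sigma=0.4, size=5000)  # skewed data
        print(cp_cpk_normal(x, lsl=0.2, usl=3.0))
        print(cp_cpk_percentile(x, lsl=0.2, usl=3.0))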

  18. Distribution ratios on Dowex 50W resins of metal leached in the caron nickel recovery process

    International Nuclear Information System (INIS)

    Reynolds, B.A.; Metsa, J.C.; Mullins, M.E.

    1980-05-01

    Pressurized ion exchange on Dowex 50W-X8 and 50W-X12 resins was investigated using elution techniques to determine distribution ratios for copper, nickel, and cobalt complexes contained in ammonium carbonate solution, a mixture which approximates the waste liquor from the Caron nickel recovery process. Results were determined for different feed concentrations, as well as for different concentrations and pH values of the ammonium carbonate eluant. Distribution ratios were compared with those previously obtained from a continuous annular chromatographic system. Separation of copper and nickel was not conclusively observed at any of the conditions examined
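    For context, the distribution ratio referred to above is conventionally defined as the equilibrium ratio of metal taken up by the resin to metal remaining in solution; in the common weight-distribution form (a general textbook definition, not a value or formula taken from this report):

        D = \frac{\text{amount of metal sorbed per gram of dry resin}}{\text{amount of metal remaining per millilitre of solution}}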

  19. Distribution ratios on Dowex 50W resins of metal leached in the caron nickel recovery process

    Energy Technology Data Exchange (ETDEWEB)

    Reynolds, B.A.; Metsa, J.C.; Mullins, M.E.

    1980-05-01

    Pressurized ion exchange on Dowex 50W-X8 and 50W-X12 resins was investigated using elution techniques to determine distribution ratios for copper, nickel, and cobalt complexes contained in ammonium carbonate solution, a mixture which approximates the waste liquor from the Caron nickel recovery process. Results were determined for different feed concentrations, as well as for different concentrations and pH values of the ammonium carbonate eluant. Distribution ratios were compared with those previously obtained from a continuous annular chromatographic system. Separation of copper and nickel was not conclusively observed at any of the conditions examined.

  20. Distributed control and monitoring of high-level trigger processes on the LHCb online farm

    CERN Document Server

    Vannerem, P; Jost, B; Neufeld, N

    2003-01-01

    The on-line data taking of the LHCb experiment at the future LHC collider will be controlled by a fully integrated and distributed Experiment Control System (ECS). The ECS will supervise both the detector operation (DCS) and the trigger and data acquisition (DAQ) activities of the experiment. These tasks require a large distributed information management system. The aim of this paper is to show how the control and monitoring of software processes such as trigger algorithms are integrated in the ECS of LHCb.

  1. The Particle Distribution in Liquid Metal with Ceramic Particles Mould Filling Process

    Science.gov (United States)

    Dong, Qi; Xing, Shu-ming

    2017-09-01

    Adding ceramic particles to the plate hammer is an effective method of increasing the wear resistance of the hammer. The liquid-phase method is based on flow-mixing liquid forging for the composite preparation of a ZTA ceramic particle reinforced high-chromium cast iron hammer. For this preparation route, CFD simulation is used to analyse the particle distribution during the flow-mixing and mould-filling process. Taking a high-chromium cast iron hammer with a 30% volume fraction of ZTA ceramic particles as an example, the particle distribution before solidification is controlled and reasonably predicted by adjusting the speed and viscosity of the liquid metal.

  2. Charged particle multiplicity distributions in e+e--annihilation processes in the LEP experiments

    International Nuclear Information System (INIS)

    Shlyapnikov, P.V.

    1992-01-01

    Results of studies of the charged particle multiplicity distributions in the process of e + e - -annihilation into hadrons, obtained in experiments at the LEP accelerator at CERN, are reviewed. Universality in the energy dependence of the average charged particle multiplicity in e + e - and p ± p collisions, evidence for KNO scaling in e + e - data, structure in the multiplicity distribution and its relation to the jet structure of events, average particle multiplicities of quark and gluon jets, the 'clan' picture and other topics are discussed. 73 refs.; 20 figs.; 3 tabs

  3. Effect of process parameters on temperature distribution in twin-electrode TIG coupling arc

    Science.gov (United States)

    Zhang, Guangjun; Xiong, Jun; Gao, Hongming; Wu, Lin

    2012-10-01

    The twin-electrode TIG coupling arc is a new type of welding heat source, which is generated in a single welding torch that has two tungsten electrodes insulated from each other. This paper aims at determining the distribution of temperature for the coupling arc using the Fowler-Milne method under the assumption of local thermodynamic equilibrium. The influences of welding current, arc length, and distance between both electrode tips on temperature distribution of the coupling arc were analyzed. Based on the results, a better understanding of the twin-electrode TIG welding process was obtained.

  4. Biodiversity and the Lotka-Volterra theory of species interactions: open systems and the distribution of logarithmic densities.

    Science.gov (United States)

    Wilson, William G; Lundberg, Per

    2004-09-22

    Theoretical interest in the distributions of species abundances observed in ecological communities has focused recently on the results of models that assume all species are identical in their interactions with one another, and rely upon immigration and speciation to promote coexistence. Here we examine a one-trophic level system with generalized species interactions, including species-specific intraspecific and interspecific interaction strengths, and density-independent immigration from a regional species pool. Comparisons between results from numerical integrations and an approximate analytic calculation for random communities demonstrate good agreement, and both approaches yield abundance distributions of nearly arbitrary shape, including bimodality for intermediate immigration rates.
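    As a hedged sketch of the kind of model the record describes (standard generalized Lotka-Volterra dynamics for one trophic level with species-specific interaction strengths and density-independent immigration; the notation is generic and not taken from the paper):

        \frac{dN_i}{dt} = N_i \left( r_i - \alpha_{ii} N_i - \sum_{j \ne i} \alpha_{ij} N_j \right) + I_i ,

    where N_i is the density of species i, r_i its intrinsic growth rate, \alpha_{ii} and \alpha_{ij} the intraspecific and interspecific interaction strengths, and I_i the density-independent immigration rate from the regional species pool.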

  5. The Relationship of the Facial Nerve to the Condylar Process: A Cadaveric Study with Implications for Open Reduction Internal Fixation

    Directory of Open Access Journals (Sweden)

    H. P. Barham

    2015-01-01

    Introduction. The mandibular condyle is the most common site of mandibular fracture. Surgical treatment of condylar fractures by open reduction and internal fixation (ORIF) demands direct visualization of the fracture. This project aimed to investigate the anatomic relationship of the tragus to the facial nerve and condylar process. Materials and Methods. Twelve fresh hemicadaver heads were used. An extended retromandibular/preauricular approach was utilized, with the incision being based parallel to the posterior edge of the ramus. Measurements were obtained from the tragus to the facial nerve and condylar process. Results. The temporozygomatic division of the facial nerve was encountered during each approach, crossing the mandible at the condylar neck. The mean tissue depth separating the facial nerve from the condylar neck was 5.5 mm (range: 3.5 mm–7 mm, SD 1.2 mm). The upper division of the facial nerve crossed the posterior border of the condylar process on average 2.31 cm (SD 0.10 cm) anterior to the tragus. Conclusions. This study suggests that the temporozygomatic division of the facial nerve will be encountered in most approaches to the condylar process. As visualization of the relationship of the facial nerve to the condyle is often limited, recognition that, on average, 5.5 mm of tissue separates the condylar process from the nerve should help reduce the incidence of facial nerve injury during this procedure.

  6. A study on pore-opening behaviors of graphite nanofibers by a chemical activation process.

    Science.gov (United States)

    Kim, Byung-Joo; Lee, Young-Seak; Park, Soo-Jin

    2007-02-15

    In this work, porous graphite nanofibers (GNFs) were prepared by a KOH activation method in order to manufacture porous carbon nanofibers. The process was conducted in the activation temperature range of 900-1100 degrees C, and the KOH:GNFs ratio was fixed at 3.5:1. The textural properties of the porous carbons were analyzed using N2 adsorption isotherms at 77 K. The BET, D-R, and BJH equations were used to observe the specific surface areas and the micro- and mesopore structures, respectively. From the results, it was found that the textural properties, including the specific surface area and the pore volumes, were proportionally enhanced with increasing activation temperatures. However, the activation mechanisms showed quite significant differences between the samples activated at low and high temperatures.
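    For reference, the BET analysis mentioned above is based on the standard BET adsorption isotherm, given here in its usual linearized form (textbook background, not material from the record):

        \frac{1}{v\,[(p_0/p) - 1]} = \frac{c - 1}{v_m c}\,\frac{p}{p_0} + \frac{1}{v_m c} ,

    where v is the quantity of gas adsorbed at relative pressure p/p_0, v_m the monolayer capacity (from which the specific surface area follows), and c the BET constant.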

  7. Video Conferencing for Opening Classroom Doors in Initial Teacher Education: Sociocultural Processes of Mimicking and Improvisation

    Directory of Open Access Journals (Sweden)

    Rolf Wiesemes

    2010-11-01

    In this article, we present an alternative framework for conceptualising video-conferencing uses in initial teacher education and in Higher Education (HE) more generally. This alternative framework takes into account the existing models in the field, but – based on a set of interviews conducted with teacher trainees and wider analysis of the related literature – we suggest that there is a need to add to existing models the notions of 'mimicking' (copying practice) and improvisation (unplanned and spontaneous personal learning moments). These two notions are considered to be vital, as they remain valid throughout teachers' careers and constitute key affordances of video-conferencing uses in HE. In particular, we argue that improvisational processes can be considered as key for developing professional practice and lifelong learning and that video-conferencing uses in initial teacher education can contribute to an understanding of training and learning processes. Current conceptualisations of video conferencing as suggested by Coyle (2004) and Marsh et al. (2009) remain valid, but also are limited in their scope with respect to focusing predominantly on pragmatic and instrumental teacher-training issues. Our article suggests that the theoretical conceptualisations of video conferencing should be expanded to include elements of mimicking and ultimately improvisation. This allows us to consider not just etic aspects of practice, but equally emic practices and related personal professional development. We locate these arguments more widely in a sociocultural-theory framework, as it enables us to describe interactions in dialectical rather than dichotomous terms (Lantolf & Poehner, 2008).

  8. Effect of forced-air heaters on perfusion and temperature distribution during and after open-heart surgery

    NARCIS (Netherlands)

    Severens, Natascha M. W.; van Marken Lichtenbelt, Wouter D.; van Leeuwen, Gerard M. J.; Frijns, Arjan J. H.; van Steenhoven, Anton A.; de Mol, Bas A. J. M.; van Wezel, Harry B.; Veldman, Dirk J.

    2007-01-01

    OBJECTIVES: After cardiopulmonary bypass, patients often show redistribution hypothermia, also called afterdrop. Forced-air blankets help to reduce afterdrop. This study explores the effect of forced-air blankets on temperature distribution and peripheral perfusion. The blood perfusion data is used

  9. Effect of forced-air heaters on perfusion and temperature distribution during and after open-heart surgery

    NARCIS (Netherlands)

    Severens, N.M.W.; Marken Lichtenbelt, van W.; Leeuwen, van G.M.J.; Frijns, A.J.H.; Steenhoven, van A.A.; Mol, de B.A.J.M.; Wezel, H.B.; Veldman, D.J.

    2007-01-01

    Objectives: After cardiopulmonary bypass, patients often show redistribution hypothermia, also called afterdrop. Forced-air blankets help to reduce afterdrop. This study explores the effect of forced-air blankets on temperature distribution and peripheral perfusion. The blood perfusion data is used

  10. Distribution

    Science.gov (United States)

    John R. Jones

    1985-01-01

    Quaking aspen is the most widely distributed native North American tree species (Little 1971, Sargent 1890). It grows in a great diversity of regions, environments, and communities (Harshberger 1911). Only one deciduous tree species in the world, the closely related Eurasian aspen (Populus tremula), has a wider range (Weigle and Frothingham 1911)....

  11. Generation and distribution of PAHs in the process of medical waste incineration

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Ying, E-mail: echochen327@163.com [School of Environment, Tsinghua University, Beijing 100084 (China); National Center of Solid Waste Management, Ministry of Environmental Protection, Beijing 100029 (China); Zhao, Rongzhi [Civil and Environmental Engineering School, University of Science and Technology Beijing, Beijing 100083 (China); Xue, Jun [National Center of Solid Waste Management, Ministry of Environmental Protection, Beijing 100029 (China); Li, Jinhui, E-mail: jinhui@tsinghua.edu.cn [State Key Joint Laboratory of Environment Simulation and Pollution Control, School of Environment, Tsinghua University, Beijing 100084 (China)

    2013-05-15

    Highlights: ► PAHs generation and distribution features of medical waste incineration are studied. ► More PAHs were found in fly ash than that in bottom ash. ► The highest proportion of PAHs consisted of the seven most carcinogenic ones. ► Increase of free oxygen molecule and burning temperature promote PAHs degradation. ► There is a moderate positive correlation between total PCDD/Fs and total PAHs. - Abstract: After the deadly earthquake on May 12, 2008 in Wenchuan county of China, several different incineration approaches were used for medical waste disposal. This paper investigates the generation properties of polycyclic aromatic hydrocarbons (PAHs) during the incineration. Samples were collected from the bottom ash in an open burning slash site, surface soil at the open burning site, bottom ash from a simple incinerator, bottom ash generated from the municipal solid waste (MSW) incinerator used for medical waste disposal, and bottom ash and fly ash from an incinerator exclusively used for medical waste. The species of PAHs were analyzed, and the toxicity equivalency quantities (TEQs) of samples calculated. Analysis results indicate that the content of total PAHs in fly ash was 1.8 × 10³ times higher than that in bottom ash, and that the strongly carcinogenic PAHs with four or more rings accumulated sensitively in fly ash. The test results of samples gathered from open burning site demonstrate that Acenaphthylene (ACY), Acenaphthene (ACE), Fluorene (FLU), Phenanthrene (PHE), Anthracene (ANT) and other PAHs were inclined to migrate into surrounding environment along air and surface watershed corridors, while 4- to 6-ring PAHs accumulated more likely in soil. Being consistent with other studies, it has also been confirmed that increases in both free oxygen molecules and combustion temperatures could promote the decomposition of polycyclic PAHs. In addition, without the influence of combustion conditions, there is a positive correlation between
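    For context, toxic equivalency quantities (TEQs) such as those mentioned above are conventionally computed by weighting each congener's concentration with its toxic equivalency factor (TEF); the generic form (standard practice, not taken from this record) is

        \mathrm{TEQ} = \sum_i C_i \times \mathrm{TEF}_i ,

    where C_i is the concentration of compound i and TEF_i its toxic equivalency factor relative to the reference compound (for PAHs, commonly benzo[a]pyrene).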

  12. Generation and distribution of PAHs in the process of medical waste incineration

    International Nuclear Information System (INIS)

    Chen, Ying; Zhao, Rongzhi; Xue, Jun; Li, Jinhui

    2013-01-01

    Highlights: ► PAHs generation and distribution features of medical waste incineration are studied. ► More PAHs were found in fly ash than that in bottom ash. ► The highest proportion of PAHs consisted of the seven most carcinogenic ones. ► Increase of free oxygen molecule and burning temperature promote PAHs degradation. ► There is a moderate positive correlation between total PCDD/Fs and total PAHs. - Abstract: After the deadly earthquake on May 12, 2008 in Wenchuan county of China, several different incineration approaches were used for medical waste disposal. This paper investigates the generation properties of polycyclic aromatic hydrocarbons (PAHs) during the incineration. Samples were collected from the bottom ash in an open burning slash site, surface soil at the open burning site, bottom ash from a simple incinerator, bottom ash generated from the municipal solid waste (MSW) incinerator used for medical waste disposal, and bottom ash and fly ash from an incinerator exclusively used for medical waste. The species of PAHs were analyzed, and the toxicity equivalency quantities (TEQs) of samples calculated. Analysis results indicate that the content of total PAHs in fly ash was 1.8 × 10 3 times higher than that in bottom ash, and that the strongly carcinogenic PAHs with four or more rings accumulated sensitively in fly ash. The test results of samples gathered from open burning site demonstrate that Acenaphthylene (ACY), Acenaphthene (ACE), Fluorene (FLU), Phenanthrene (PHE), Anthracene (ANT) and other PAHs were inclined to migrate into surrounding environment along air and surface watershed corridors, while 4- to 6-ring PAHs accumulated more likely in soil. Being consistent with other studies, it has also been confirmed that increases in both free oxygen molecules and combustion temperatures could promote the decomposition of polycyclic PAHs. In addition, without the influence of combustion conditions, there is a positive correlation between total

  13. Research on distributed optical fiber sensing data processing method based on LabVIEW

    Science.gov (United States)

    Li, Zhonghu; Yang, Meifang; Wang, Luling; Wang, Jinming; Yan, Junhong; Zuo, Jing

    2018-01-01

    The pipeline leak detection and leak location problem has received extensive attention in industry. In this paper, a distributed optical fiber sensing system is designed for a heat supply pipeline, and the LabVIEW-based data processing method for the distributed optical fiber sensing data is studied in detail. The hardware system includes a laser, sensing optical fiber, wavelength division multiplexer, photoelectric detector, data acquisition card and computer. The software system is developed using LabVIEW and adopts a wavelet denoising method to process the temperature information, which improves the SNR. By extracting characteristic values from the fiber temperature information, the system realizes temperature measurement, leak location, and measurement signal storage and query. Compared with the traditional negative pressure wave or acoustic signal methods, the distributed optical fiber temperature measurement system can measure temperatures at many points in a single measurement and locate the leak point accurately. It has broad application prospects.
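    As a hedged sketch of the wavelet-denoising step described above (using the PyWavelets package in Python rather than LabVIEW, on an invented synthetic temperature trace; the wavelet, decomposition level and thresholding rule are illustrative choices, not those of the paper):

        import numpy as np
        import pywt  # PyWavelets

        def wavelet_denoise(signal, wavelet="db4", level=4):
            """Soft-threshold wavelet denoising of a 1-D temperature trace."""
            coeffs = pywt.wavedec(signal, wavelet, level=level)
            # Universal threshold estimated from the finest-scale detail coefficients.
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745
            thresh = sigma * np.sqrt(2 * np.log(len(signal)))
            coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
            return pywt.waverec(coeffs, wavelet)[: len(signal)]

        # Synthetic trace: a baseline temperature with a local hot spot plus noise.
        t = np.linspace(0, 1, 1024)
        noisy = 20 + 2 * np.exp(-((t - 0.6) ** 2) / 0.001) + 0.3 * np.random.randn(t.size)
        clean = wavelet_denoise(noisy)   # smoothed trace; the local hot spot is preserved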

  14. Development of laboratory and process sensors to monitor particle size distribution of industrial slurries

    Energy Technology Data Exchange (ETDEWEB)

    Pendse, H.P.

    1992-10-01

    In this paper we present a novel measurement technique for monitoring particle size distributions of industrial colloidal slurries based on ultrasonic spectroscopy and mathematical deconvolution. An on-line sensor prototype has been developed and tested extensively in laboratory and production settings using mineral pigment slurries. Evaluation to date shows that the sensor is capable of providing particle size distributions, without any assumptions regarding their functional form, over diameters ranging from 0.1 to 100 micrometers in slurries with particle concentrations of 10 to 50 volume percent. The newly developed on-line sensor allows one to obtain particle size distributions of commonly encountered inorganic pigment slurries under industrial processing conditions without dilution.
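    As a rough, hypothetical illustration of the mathematical deconvolution step (recovering a size distribution from a measured spectrum via a linear forward model; the kernel below is a made-up placeholder, not the acoustic attenuation model used by the sensor):

        import numpy as np
        from scipy.optimize import nnls

        # Forward model: the measured attenuation at each frequency is a weighted sum
        # of the contributions of each particle-size bin, measured = K @ distribution.
        rng = np.random.default_rng(1)
        n_freq, n_bins = 40, 25
        K = rng.random((n_freq, n_bins))            # placeholder kernel (frequency x size bin)
        true_dist = np.exp(-0.5 * ((np.arange(n_bins) - 10) / 3.0) ** 2)
        measured = K @ true_dist + 0.01 * rng.standard_normal(n_freq)

        # Deconvolution: non-negative least squares recovers a physically meaningful
        # (non-negative) size distribution without assuming a functional form.
        estimated, residual = nnls(K, measured)
        print(np.round(estimated, 3))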

  15. Architecture of distributed picture archiving and communication systems for storing and processing high resolution medical images

    Directory of Open Access Journals (Sweden)

    Tokareva Victoria

    2018-01-01

    New-generation medicine demands a better quality of analysis, increasing the amount of data collected during checkups while simultaneously decreasing the invasiveness of procedures. Thus it becomes urgent not only to develop advanced modern hardware, but also to implement special software infrastructure for using it in everyday clinical practice, the so-called Picture Archiving and Communication Systems (PACS). Developing a distributed PACS is a challenging task for modern medical informatics. The paper discusses the architecture of a distributed PACS server for processing large high-quality medical images, with respect to the technical specifications of modern medical imaging hardware as well as international standards in medical imaging software. The MapReduce paradigm is proposed for image reconstruction by the server, and the details of utilizing the Hadoop framework for this task are discussed in order to make the design of the distributed PACS as ergonomic and adapted to the needs of end users as possible.
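    As a loose, hypothetical sketch of the MapReduce pattern the record proposes (plain in-memory Python standing in for a Hadoop job; the tile shapes and the per-tile "processing" step are invented):

        from collections import defaultdict
        import numpy as np

        def map_phase(tile_id, tile):
            """Map: emit (row_block, (col_block, processed tile)) for one image tile."""
            row_block, col_block = tile_id
            yield row_block, (col_block, tile.astype(np.float32) * 2.0)  # stand-in processing

        def reduce_phase(row_block, keyed_tiles):
            """Reduce: order the tiles of a row block by column and stitch them together."""
            ordered = [t for _, t in sorted(keyed_tiles, key=lambda kv: kv[0])]
            return row_block, np.hstack(ordered)

        # Four 4x4 tiles of an 8x8 image, keyed by (row_block, col_block).
        tiles = {(r, c): np.ones((4, 4)) * (r * 2 + c) for r in range(2) for c in range(2)}

        grouped = defaultdict(list)
        for tile_id, tile in tiles.items():
            for key, value in map_phase(tile_id, tile):
                grouped[key].append(value)

        rows = dict(reduce_phase(k, v) for k, v in grouped.items())
        image = np.vstack([rows[r] for r in sorted(rows)])   # reconstructed 8x8 image
        print(image.shape)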

  16. Architecture of distributed picture archiving and communication systems for storing and processing high resolution medical images

    Science.gov (United States)

    Tokareva, Victoria

    2018-04-01

    New-generation medicine demands a better quality of analysis, increasing the amount of data collected during checkups while simultaneously decreasing the invasiveness of procedures. Thus it becomes urgent not only to develop advanced modern hardware, but also to implement special software infrastructure for using it in everyday clinical practice, the so-called Picture Archiving and Communication Systems (PACS). Developing a distributed PACS is a challenging task for modern medical informatics. The paper discusses the architecture of a distributed PACS server for processing large high-quality medical images, with respect to the technical specifications of modern medical imaging hardware as well as international standards in medical imaging software. The MapReduce paradigm is proposed for image reconstruction by the server, and the details of utilizing the Hadoop framework for this task are discussed in order to make the design of the distributed PACS as ergonomic and adapted to the needs of end users as possible.

  17. Distribution of radioactivity in the Esk Estuary and its relationship to sedimentary processes

    International Nuclear Information System (INIS)

    Kelly, M.; Emptage, M.

    1992-01-01

    In the Esk Estuary, Cumbria, the distribution of sediment lithology and facies has been determined and related to radionuclide surface and sub-surface distribution. The total volume of sediment contaminated with artificial radionuclides is estimated at 1.2 Mm 3 and the inventory of 137 Cs at 4.5 TBq. The fine grained sediments of the bank facies are the main reservoir for radionuclides, comprising 73% of the 137 Cs inventory. Time scales for the reworking of these sediments are estimated at tens to hundreds of years. Measurements of sediment and radionuclide deposition demonstrate that direct sediment deposition is the main method for radionuclide recruitment to the deposits but solution labelling can also occur. Bioturbation and other diagenetic processes modify the distribution of radionuclides in the deposits. Gamma dose rates in air can be related to the sediment grain size and sedimentation rate. (Author)

  18. The open access and the natural gas ducts: transport and distribution; O livre acesso e os dutos de gas natural: transporte e distribuicao

    Energy Technology Data Exchange (ETDEWEB)

    Siqueira, Mariana de; Xavier, Yanko Marcius de Alencar [Universidade Federal do Rio Grande do Norte (UFRN), Natal, RN (Brazil)]

    2008-07-01

    The present research, prompted by the economic relevance of the natural gas sector, by the lack of a law regulating it and by the structural character of natural gas duct activities, analyses in a comparative way the monopoly of natural gas duct activities and the competition mechanisms chosen to soften it: open access and the bypass. The transport and the distribution of natural gas are quite similar, but the ways of introducing competition into their areas are not. (author)

  19. Citrate-based "Talspeak" actinide-lanthanide separation process

    International Nuclear Information System (INIS)

    Del Cul, G.D.; Toth, L.M.; Bond, W.D.

    1997-01-01

    Lanthanide elements are produced in relatively high yield by fission of 235 U. Almost all the lanthanide isotopes decay to stable nonradioactive lanthanide isotopes in a relatively short time. Consequently, it is highly advantageous to separate the relatively small actinide fraction from the relatively large quantities of lanthanide isotopes. The TALSPEAK process (Trivalent Actinide Lanthanide Separations by Phosphorus-reagent Extraction from Aqueous Complexes) is one of the few means available to separate the trivalent actinides from the lanthanides. Previous work based on the use of lactic or glycolic acid has shown deleterious effects of some impurity ions such as zirconium(IV), even at concentrations on the order of 10 -4 M. Other perceived problems were the need to maintain the pH and reagent concentrations within a narrow range and a significant solubility of the organic phase at high carboxylic acid concentrations. The authors' cold experiments showed that replacing the traditional extractants glycolic or lactic acid with citric acid eliminates or greatly reduces the deleterious effects produced by impurities such as zirconium. An extensive series of batch tests was done using a wide range of reagent concentrations at different pH values, temperatures, and contact times. The results demonstrated that the citrate-based TALSPEAK can tolerate appreciable changes in pH and reagent concentrations while maintaining an adequate lanthanide extraction. Experiments using a three-stage glass mixer-settler showed a good lanthanide extraction, appropriate phase disengagement, no appreciable deleterious effects due to the presence of impurities such as zirconium, excellent pH buffering, and no significant loss of organic phase

  20. Methods, media and systems for managing a distributed application running in a plurality of digital processing devices

    Science.gov (United States)

    Laadan, Oren; Nieh, Jason; Phung, Dan

    2012-10-02

    Methods, media and systems for managing a distributed application running in a plurality of digital processing devices are provided. In some embodiments, a method includes running one or more processes associated with the distributed application in virtualized operating system environments on a plurality of digital processing devices, suspending the one or more processes, and saving network state information relating to network connections among the one or more processes. The method further includes storing process information relating to the one or more processes, recreating the network connections using the saved network state information, and restarting the one or more processes using the stored process information.
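    As a hedged, purely illustrative sketch of the suspend / save-state / recreate-connections / restart sequence the record describes (the data structures and function bodies are invented stand-ins; a real implementation would operate on virtualized operating system environments rather than in-process objects):

        from dataclasses import dataclass, field

        @dataclass
        class NetworkState:
            connections: list = field(default_factory=list)   # (src, dst, port) tuples

        @dataclass
        class ProcessState:
            pid: int
            memory_image: bytes = b""

        class DistributedAppManager:
            def __init__(self):
                self.saved_processes = []
                self.saved_network = NetworkState()

            def suspend(self, processes, connections):
                """Suspend processes and save process and network state information."""
                self.saved_processes = [ProcessState(pid=p) for p in processes]
                self.saved_network = NetworkState(connections=list(connections))

            def restart(self):
                """Recreate the saved network connections, then restart the processes."""
                for conn in self.saved_network.connections:
                    print("re-establishing connection", conn)
                for proc in self.saved_processes:
                    print("restarting process", proc.pid)

        mgr = DistributedAppManager()
        mgr.suspend(processes=[101, 102], connections=[("nodeA", "nodeB", 5000)])
        mgr.restart()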