WorldWideScience

Sample records for open distributed processing

  1. Building enterprise systems with ODP: an introduction to open distributed processing

    CERN Document Server

    Linington, Peter F; Tanaka, Akira; Vallecillo, Antonio

    2011-01-01

    The Reference Model of Open Distributed Processing (RM-ODP) is an international standard that provides a solid basis for describing and building widely distributed systems and applications in a systematic way. It stresses the need to build these systems with evolution in mind by identifying the concerns of major stakeholders and then expressing the design as a series of linked viewpoints. Although RM-ODP has been a standard for more than ten years, many practitioners are still unaware of it. Building Enterprise Systems with ODP: An Introduction to Open Distributed Processing offers a gentle pa

  2. 75 FR 14076 - Express Mail Open and Distribute and Priority Mail Open and Distribute Changes and Updates

    Science.gov (United States)

    2010-03-24

    ... tray boxes to address Open and Distribute customers' concerns on the security of their mail in a letter tray during processing. The current tray box sizes were a result of customer feedback. The decision to... processing of Open and Distribute containers. In response to customer needs, the Postal Service is...

  3. Innovation as a distributed, collaborative process of knowledge generation: open, networked innovation

    NARCIS (Netherlands)

    Sloep, Peter

    2009-01-01

    Sloep, P. B. (2009). Innovation as a distributed, collaborative process of knowledge generation: open, networked innovation. In V. Hornung-Prähauser & M. Luckmann (Eds.), Kreativität und Innovationskompetenz im digitalen Netz - Creativity and Innovation Competencies in the Web, Sammlung von

  4. Leadership in Open and Distributed Innovation

    DEFF Research Database (Denmark)

    Haefliger, Stefan; Poetz, Marion

    demands and shorter product life cycles have triggered new forms of creation and innovative practices (von Hippel and von Krogh, 2003; Baden-Fuller and Haefliger, 2013). These new forms can be characterized by being more open, distributed, collaborative, and democratized than traditional models...... in networks of innovators, such as platform businesses (Alexy et al., 2009; Gawer and Cusumano, 2008; Füller et al., 2016). However, one aspect that has so far received little attention, both in research and in business practice, is the potentially conflicting role of traditional forms of leadership in open...... innovation systems, processes and projects. Traditional approaches to leadership in innovation processes highlight the role of individual managers who lead and evaluate firm-internal team members, champion innovation projects within the organization and act as translators between various firm...

  5. HARVESTING, INTEGRATING AND DISTRIBUTING LARGE OPEN GEOSPATIAL DATASETS USING FREE AND OPEN-SOURCE SOFTWARE

    Directory of Open Access Journals (Sweden)

    R. Oliveira

    2016-06-01

    Federal, State and Local government agencies in the USA are investing heavily in the dissemination of Open Data sets produced by each of them. The main driver behind this thrust is to increase agencies’ transparency and accountability, as well as to improve citizens’ awareness. However, not all Open Data sets are easy to access and integrate with other Open Data sets, even those available from the same agency. The City and County of Denver Open Data Portal distributes several types of geospatial datasets, one of which is the city parcels layer containing 224,256 records. Although this data layer contains many pieces of information, it is incomplete for some custom purposes. Open-source software was used to first collect data from diverse City of Denver Open Data sets, then upload them to a repository in the Cloud, where they were processed using a PostgreSQL installation on the Cloud and Python scripts. Our method was able to extract non-spatial information from a ‘not-ready-to-download’ source that could then be combined with the initial data set to enhance its potential use.
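
    As a sketch of the harvest-and-load step described above, the following Python fragment downloads one portal layer as CSV and inserts it into a cloud-hosted PostgreSQL table. The portal URL, column names and table layout are hypothetical placeholders, not the authors' actual pipeline.

```python
# Minimal sketch of the harvest-and-load step described above: download one
# Open Data layer as CSV and push it into a cloud-hosted PostgreSQL table.
# The portal URL, column names and table layout are hypothetical placeholders.
import csv
import io

import psycopg2
import requests

PORTAL_CSV = "https://data.example.org/denver/parcels.csv"  # hypothetical endpoint


def harvest_and_load(dsn: str) -> None:
    # 1. Harvest: fetch the published layer over HTTP.
    response = requests.get(PORTAL_CSV, timeout=60)
    response.raise_for_status()
    rows = csv.DictReader(io.StringIO(response.text))

    # 2. Load: insert the records into PostgreSQL for later joining/processing.
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(
            """CREATE TABLE IF NOT EXISTS parcels (
                   schednum TEXT PRIMARY KEY,
                   situs_address TEXT,
                   land_value TEXT
               )"""
        )
        for row in rows:
            cur.execute(
                "INSERT INTO parcels VALUES (%s, %s, %s) ON CONFLICT DO NOTHING",
                (row["SCHEDNUM"], row["SITUS_ADDRESS"], row["LAND_VALUE"]),
            )


if __name__ == "__main__":
    harvest_and_load("dbname=denver user=postgres host=localhost")
```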

  6. Inter-particle gap distribution and spectral rigidity of the totally asymmetric simple exclusion process with open boundaries

    International Nuclear Information System (INIS)

    Krbalek, Milan; Hrabak, Pavel

    2011-01-01

    We consider the one-dimensional totally asymmetric simple exclusion process (TASEP model) with open boundary conditions and present the analytical computations leading to the exact formula for distance clearance distribution, i.e. probability density for a clear distance between subsequent particles of the model. The general relation is rapidly simplified for the middle part of the one-dimensional lattice. Both the analytical formulas and their approximations are compared with the numerical representation of the TASEP model. Such a comparison is presented for particles occurring in the internal part as well as in the boundary part of the lattice. Furthermore, we introduce the pertinent estimation for the so-called spectral rigidity of the model. The results obtained are sequentially discussed within the scope of vehicular traffic theory.
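
    The gap statistics discussed above can be reproduced numerically with a toy Monte Carlo simulation of the open-boundary TASEP; the sketch below (with illustrative injection and ejection rates, not the paper's parameters) estimates the clear-distance distribution between successive particles in the bulk of the lattice.

```python
# Toy Monte Carlo sketch of the open-boundary TASEP described above: estimate
# the distribution of clear distances (gaps) between successive particles in
# the middle of the lattice. The rates alpha, beta and the lattice size are
# illustrative choices, not the parameters used in the paper.
import random
from collections import Counter


def gap_distribution(L=200, alpha=0.4, beta=0.4, steps=2_000_000, seed=1):
    rng = random.Random(seed)
    lattice = [0] * L                 # 1 = occupied site, 0 = empty site
    gaps = Counter()
    for t in range(steps):
        i = rng.randrange(-1, L)      # -1 stands for an injection attempt
        if i == -1:
            if lattice[0] == 0 and rng.random() < alpha:
                lattice[0] = 1        # inject a particle at the left boundary
        elif i == L - 1:
            if lattice[-1] == 1 and rng.random() < beta:
                lattice[-1] = 0       # remove a particle at the right boundary
        elif lattice[i] == 1 and lattice[i + 1] == 0:
            lattice[i], lattice[i + 1] = 0, 1     # bulk hop to the right
        # Sample bulk gaps once the system has relaxed.
        if t > steps // 2 and t % 1000 == 0:
            occupied = [k for k in range(L // 4, 3 * L // 4) if lattice[k]]
            for a, b in zip(occupied, occupied[1:]):
                gaps[b - a - 1] += 1
    total = sum(gaps.values())
    return {g: n / total for g, n in sorted(gaps.items())}


print(gap_distribution())
```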

  7. Open Data, Open Specifications and Free and Open Source Software: A powerful mix to create distributed Web-based water information systems

    Science.gov (United States)

    Arias, Carolina; Brovelli, Maria Antonia; Moreno, Rafael

    2015-04-01

    We are in an age when water resources are increasingly scarce and the impacts of human activities on them are ubiquitous. These problems don't respect administrative or political boundaries and they must be addressed by integrating information from multiple sources at multiple spatial and temporal scales. Communication, coordination and data sharing are critical for addressing the water conservation and management issues of the 21st century. However, different countries, provinces, local authorities and agencies dealing with water resources have diverse organizational, socio-cultural, economic, environmental and information technology (IT) contexts that raise challenges to the creation of information systems capable of integrating and distributing information across their areas of responsibility in an efficient and timely manner. Tight and disparate financial resources, and dissimilar IT infrastructures (data, hardware, software and personnel expertise) further complicate the creation of these systems. There is a pressing need for distributed interoperable water information systems that are user friendly, easily accessible and capable of managing and sharing large volumes of spatial and non-spatial data. In a distributed system, data and processes are created and maintained in different locations, each with competitive advantages for carrying out specific activities. Open Data (data that can be freely distributed) is available in the water domain, and it should be further promoted across countries and organizations. Compliance with Open Specifications for data collection, storage and distribution is the first step toward the creation of systems that are capable of interacting and exchanging data in a seamless (interoperable) way. The features of Free and Open Source Software (FOSS) offer a low access cost that facilitates scalability and long-term viability of information systems. The World Wide Web (the Web) will be the platform of choice to deploy and access these systems.

  8. Description of Supply Openings in Numerical Models for Room Air Distribution

    DEFF Research Database (Denmark)

    Nielsen, Peter V.

    This paper discusses various possibilities for describing supply openings in numerical models of room air distribution.

  9. Open-ocean convection process: A driver of the winter nutrient supply and the spring phytoplankton distribution in the Northwestern Mediterranean Sea

    Science.gov (United States)

    Severin, Tatiana; Kessouri, Faycal; Rembauville, Mathieu; Sánchez-Pérez, Elvia Denisse; Oriol, Louise; Caparros, Jocelyne; Pujo-Pay, Mireille; Ghiglione, Jean-François; D'Ortenzio, Fabrizio; Taillandier, Vincent; Mayot, Nicolas; Durrieu De Madron, Xavier; Ulses, Caroline; Estournel, Claude; Conan, Pascal

    2017-06-01

    This study was a part of the DeWEX project (Deep Water formation Experiment), designed to better understand the impact of dense water formation on the marine biogeochemical cycles. Here, nutrient and phytoplankton vertical and horizontal distributions were investigated during a deep open-ocean convection event and during the following spring bloom in the Northwestern Mediterranean Sea (NWM). In February 2013, the deep convection event established a surface nutrient gradient from the center of the deep convection patch to the surrounding mixed and stratified areas. In the center of the convection area, a slight but significant difference of nitrate, phosphate and silicate concentrations was observed, possibly due to the different volume of deep waters included in the mixing or to the sediment resuspension occurring where the mixing reached the bottom. One of these processes, or a combination of both, enriched the water column in silicate and phosphate, and significantly altered the stoichiometry in the center of the deep convection area. This alteration favored the local development of microphytoplankton in spring, while nanophytoplankton dominated neighboring locations where the convection reached the deep layer but not the bottom. This study shows that the convection process influences both the winter nutrient distribution and the spring phytoplankton distribution and community structure. Modifications of the convection's spatial scale and intensity (i.e., convective mixing depth) are likely to have strong consequences on phytoplankton community structure and distribution in the NWM, and thus on the marine food web. Plain Language Summary: The deep open-ocean convection in the Northwestern Mediterranean Sea is an important process for the formation and the circulation of the deep waters of the entire Mediterranean Sea, but also for the local spring phytoplankton bloom. In this study, we showed that variations of the convective mixing depth induced different supply in nitrate

  10. apART: system for the acquisition, processing, archiving, and retrieval of digital images in an open, distributed imaging environment

    Science.gov (United States)

    Schneider, Uwe; Strack, Ruediger

    1992-04-01

    apART reflects the structure of an open, distributed environment. According to the general trend in the area of imaging, network-capable, general purpose workstations with capabilities of open system image communication and image input are used. Several heterogeneous components like CCD cameras, slide scanners, and image archives can be accessed. The system is driven by an object-oriented user interface where devices (image sources and destinations), operators (derived from a commercial image processing library), and images (of different data types) are managed and presented uniformly to the user. Browsing mechanisms are used to traverse devices, operators, and images. An audit trail mechanism is offered to record interactive operations on low-resolution image derivatives. These operations are processed off-line on the original image. Thus, the processing of extremely high-resolution raster images is possible, and the performance of resolution dependent operations is enhanced significantly during interaction. An object-oriented database system (APRIL), which can be browsed, is integrated into the system. Attribute retrieval is supported by the user interface. Other essential features of the system include: implementation on top of the X Window System (X11R4) and the OSF/Motif widget set; a SUN4 general-purpose workstation, including Ethernet, magneto-optical disc, etc., as the hardware platform for the user interface; complete graphical-interactive parametrization of all operators; support of different image interchange formats (GIF, TIFF, IIF, etc.); consideration of current IPI standard activities within ISO/IEC for further refinement and extensions.

  11. A Transparent Runtime Data Distribution Engine for OpenMP

    Directory of Open Access Journals (Sweden)

    Dimitrios S. Nikolopoulos

    2000-01-01

    This paper makes two important contributions. First, the paper investigates the performance implications of data placement in OpenMP programs running on modern NUMA multiprocessors. Data locality and minimization of the rate of remote memory accesses are critical for sustaining high performance on these systems. We show that due to the low remote-to-local memory access latency ratio of contemporary NUMA architectures, reasonably balanced page placement schemes, such as round-robin or random distribution, incur modest performance losses. Second, the paper presents a transparent, user-level page migration engine with an ability to gain back any performance loss that stems from suboptimal placement of pages in iterative OpenMP programs. The main body of the paper describes how our OpenMP runtime environment uses page migration for implementing implicit data distribution and redistribution schemes without programmer intervention. Our experimental results verify the effectiveness of the proposed framework and provide a proof of concept that it is not necessary to introduce data distribution directives in OpenMP and sacrifice the simplicity or the portability of the programming model.

  12. Current distribution in a plasma erosion opening switch

    International Nuclear Information System (INIS)

    Weber, B.V.; Commisso, R.J.; Meger, R.A.; Neri, J.M.; Oliphant, W.F.; Ottinger, P.F.

    1984-01-01

    The current distribution in a plasma erosion opening switch is determined from magnetic field probe data. During the closed state of the switch the current channel broadens rapidly. The width of the current channel is consistent with a bipolar current density limit imposed by the ion flux to the cathode. The effective resistivity of the current channel is anomalously large. Current is diverted to the load when a gap opens near the cathode side of the switch. The observed gap opening can be explained by erosion of the plasma. Magnetic pressure is insufficient to open the gap

  13. Current distribution in a plasma erosion opening switch

    International Nuclear Information System (INIS)

    Weber, B.V.; Commisso, R.J.; Meger, R.A.; Neri, J.M.; Oliphant, W.F.; Ottinger, P.F.

    1985-01-01

    The current distribution in a plasma erosion opening switch is determined from magnetic field probe data. During the closed state of the switch the current channel broadens rapidly. The width of the current channel is consistent with a bipolar current density limit imposed by the ion flux to the cathode. The effective resistivity of the current channel is anomalously large. Current is diverted to the load when a gap opens near the cathode side of the switch. The observed gap opening can be explained by erosion of the plasma. Magnetic pressure is insufficient to open the gap

  14. "Shift-Betel": A (very) distributed mainframe

    International Nuclear Information System (INIS)

    Segal, B.; Martin, O.; Hassine, F.; Hemmer, F.; Jouanigot, J.M.

    1994-01-01

    Over the last four years, CERN has progressively converted its central batch production facilities from classic mainframe platforms (Cray XMP, IBM, ESA, Vax 9000) to distributed RISC based facilities, which have now attained a very large size. Both a CPU-intensive system ("CSF", the Central Simulation Facility) and an I/O-intensive system ("SHIFT", the Scaleable Heterogeneous Integrated Facility) have been developed, plus a distributed data management subsystem allowing seamless access to CERN's central tape store and to large amounts of economical disk space. The full system is known as "CORE", the Centrally Operated Risc Environment; at the time of writing CORE comprises around 2000 CERN Units of Computing (about 8000 MIPs) and over a TeraByte of online disk space. This distributed system is connected using standard networking technologies (IP protocols over Ethernet, FDDI and UltraNet), but which until quite recently were only implemented at sufficiently high speed in the Local Area

  15. An open source platform for multi-scale spatially distributed simulations of microbial ecosystems

    Energy Technology Data Exchange (ETDEWEB)

    Segre, Daniel [Boston Univ., MA (United States)]

    2014-08-14

    The goal of this project was to develop a tool for facilitating simulation, validation and discovery of multiscale dynamical processes in microbial ecosystems. This led to the development of an open-source software platform for Computation Of Microbial Ecosystems in Time and Space (COMETS). COMETS performs spatially distributed time-dependent flux balance based simulations of microbial metabolism. Our plan involved building the software platform itself, calibrating and testing it through comparison with experimental data, and integrating simulations and experiments to address important open questions on the evolution and dynamics of cross-feeding interactions between microbial species.

  16. Satellite Cloud and Radiative Property Processing and Distribution System on the NASA Langley ASDC OpenStack and OpenShift Cloud Platform

    Science.gov (United States)

    Nguyen, L.; Chee, T.; Palikonda, R.; Smith, W. L., Jr.; Bedka, K. M.; Spangenberg, D.; Vakhnin, A.; Lutz, N. E.; Walter, J.; Kusterer, J.

    2017-12-01

    Cloud Computing offers new opportunities for large-scale scientific data producers to utilize Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) IT resources to process and deliver data products in an operational environment where timely delivery, reliability, and availability are critical. The NASA Langley Research Center Atmospheric Science Data Center (ASDC) is building and testing a private and public facing cloud for users in the Science Directorate to utilize as an everyday production environment. The NASA SatCORPS (Satellite ClOud and Radiation Property Retrieval System) team processes and derives near real-time (NRT) global cloud products from operational geostationary (GEO) satellite imager datasets. To deliver these products, we will utilize the public facing cloud and OpenShift to deploy a load-balanced webserver for data storage, access, and dissemination. The OpenStack private cloud will host data ingest and computational capabilities for SatCORPS processing. This paper will discuss the SatCORPS migration towards, and usage of, the ASDC Cloud Services in an operational environment. Detailed lessons learned from use of prior cloud providers, specifically the Amazon Web Services (AWS) GovCloud and the Government Cloud administered by the Langley Managed Cloud Environment (LMCE) will also be discussed.

  17. The Simple Concurrent Online Processing System (SCOPS) - An open-source interface for remotely sensed data processing

    Science.gov (United States)

    Warren, M. A.; Goult, S.; Clewley, D.

    2018-06-01

    Advances in technology allow remotely sensed data to be acquired with increasingly higher spatial and spectral resolutions. These data may then be used to influence government decision making and solve a number of research and application driven questions. However, such large volumes of data can be difficult to handle on a single personal computer or on older machines with slower components. Often the software required to process data is varied and can be highly technical and too advanced for the novice user to fully understand. This paper describes an open-source tool, the Simple Concurrent Online Processing System (SCOPS), which forms part of an airborne hyperspectral data processing chain that allows users accessing the tool over a web interface to submit jobs and process data remotely. It is demonstrated using Natural Environment Research Council Airborne Research Facility (NERC-ARF) instruments together with other free- and open-source tools to take radiometrically corrected data from sensor geometry into geocorrected form and to generate simple or complex band ratio products. The final processed data products are acquired via an HTTP download. SCOPS can cut data processing times and introduce complex processing software to novice users by distributing jobs across a network using a simple to use web interface.
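
    The submit/monitor/download workflow described above could be scripted along the following lines; the REST endpoint paths and JSON fields are hypothetical stand-ins, since SCOPS itself is normally driven through its web interface.

```python
# Illustrative client for the submit/monitor/download workflow described
# above. The endpoint paths and JSON fields are hypothetical; SCOPS itself
# is normally driven through its web interface.
import time

import requests

BASE = "https://scops.example.org/api"   # hypothetical service URL


def submit_and_fetch(flight_line: str, out_path: str) -> None:
    # Submit a processing job, e.g. geocorrection plus a band-ratio product.
    job = requests.post(
        f"{BASE}/jobs",
        json={"flight_line": flight_line, "products": ["geocorrected", "ndvi"]},
        timeout=30,
    ).json()

    # Poll until the distributed workers report completion.
    while True:
        status = requests.get(f"{BASE}/jobs/{job['id']}", timeout=30).json()
        if status["state"] in ("done", "failed"):
            break
        time.sleep(60)

    # Retrieve the processed product over HTTP, as in the described workflow.
    if status["state"] == "done":
        with requests.get(status["download_url"], stream=True, timeout=300) as r:
            r.raise_for_status()
            with open(out_path, "wb") as f:
                for chunk in r.iter_content(chunk_size=1 << 20):
                    f.write(chunk)
```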

  18. Matrix product representation of the stationary state of the open zero range process

    Science.gov (United States)

    Bertin, Eric; Vanicat, Matthieu

    2018-06-01

    Many one-dimensional lattice particle models with open boundaries, like the paradigmatic asymmetric simple exclusion process (ASEP), have their stationary states represented in the form of a matrix product, with matrices that do not explicitly depend on the lattice site. In contrast, the stationary state of the open 1D zero-range process (ZRP) takes an inhomogeneous factorized form, with site-dependent probability weights. We show that in spite of the absence of correlations, the stationary state of the open ZRP can also be represented in a matrix product form, where the matrices are site-independent, non-commuting and determined from algebraic relations resulting from the master equation. We recover the known distribution of the open ZRP in two different ways: first, using an explicit representation of the matrices and boundary vectors; second, from the sole knowledge of the algebraic relations satisfied by these matrices and vectors. Finally, an interpretation of the relation between the matrix product form and the inhomogeneous factorized form is proposed within the framework of hidden Markov chains.
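
    For a small system, the inhomogeneous factorized form of the open ZRP stationary state mentioned above can be evaluated directly by brute force; the sketch below uses an illustrative single-site weight f(n), illustrative site-dependent fugacities and a truncated occupation number, and is not tied to the paper's matrix-product construction.

```python
# Brute-force evaluation of an inhomogeneous factorized stationary state,
# p(n_1, ..., n_L) ~ prod_i z_i**n_i * f(n_i), for a small open ZRP.
# The fugacities z_i, the weight f(n) and the occupation cutoff are
# illustrative assumptions, not values taken from the paper.
from itertools import product
from math import factorial, prod

L, N_MAX = 4, 6                      # lattice size and occupation cutoff
z = [0.3, 0.4, 0.5, 0.6]             # site-dependent fugacities (illustrative)


def f(n: int) -> float:
    # Illustrative single-site weight; corresponds to hop rate w(k) = k.
    return 1.0 / factorial(n)


def weight(config) -> float:
    return prod(z[i] ** n * f(n) for i, n in enumerate(config))


configs = list(product(range(N_MAX + 1), repeat=L))
Z = sum(weight(c) for c in configs)                    # normalization constant
p = {c: weight(c) / Z for c in configs}

# Marginal occupation distribution at every site.
for i in range(L):
    marginal = [sum(pr for c, pr in p.items() if c[i] == n) for n in range(N_MAX + 1)]
    print(f"site {i}: {[round(x, 4) for x in marginal]}")
```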

  19. Further improvement in ABWR (part-4) open distributed plant process computer system

    International Nuclear Information System (INIS)

    Makino, Shigenori; Hatori, Yoshinori

    1999-01-01

    In the Japanese nuclear industry, the electric power companies have promoted plant process computer (PPC) technology for nuclear power plants (NPPs). When the PPC was first introduced to NPPs, large-scale customized computers were applied because of very tight requirements such as high reliability and high-speed processing. In the computer field, the large market has since driven remarkable progress in engineering workstation (EWS) and personal computer (PC) technology. Moreover, because data transmission technology has progressed at the same time, worldwide computer networks have been established. Thanks to the progress of both technologies, distributed computer systems can now be built at a reasonable price, and Tokyo Electric Power Company (TEPCO) is therefore trying to apply them to the PPC of an NPP. (author)

  20. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    Science.gov (United States)

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  1. Learning from the History of Distributed Query Processing

    DEFF Research Database (Denmark)

    Betz, Heiko; Gropengießer, Francis; Hose, Katja

    2012-01-01

    The vision of the Semantic Web has triggered the development of various new applications and opened up new directions in research. Recently, much effort has been put into the development of techniques for query processing over Linked Data. Being based upon techniques originally developed...... for distributed and federated databases, some of them inherit the same or similar problems. Thus, the goal of this paper is to point out pitfalls that the previous generation of researchers has already encountered and to introduce the Linked Data as a Service as an idea that has the potential to solve the problem...... in some scenarios. Hence, this paper discusses nine theses about Linked Data processing and sketches a research agenda for future endeavors in the area of Linked Data processing....

  2. Evaluation of the stress distribution on the pressure vessel head with multi-openings

    Energy Technology Data Exchange (ETDEWEB)

    Kim, K.S.; Kim, T.W.; Jeong, K.H.; Lee, G.M. [Korea Atomic Energy Research Institute, Taejon (Korea)]

    1998-06-01

    This report discusses and analyzes the stress distribution on the pressure vessel head with multi-openings (3 PSV nozzles, 2 SDS nozzles and 1 Man Way) according to patterns of the opening distance. The pressurizer of Korea Standardized Nuclear Power Plant (Ulchin 3 and 4), which meets requirements of the cyclic operation and opening design defined by ASME code, was used as the basic model for that. Stress changes according to the distance between openings were investigated and the factors which should be considered for the opening design were analyzed. Also, the nozzle loads at Level A, B conditions and internal pressure were applied in order to evaluate changes of head stress distributions due to nozzle loads. (author). 6 refs., 29 figs., 4 tabs.

  3. Integration of distributed plant process computer systems to nuclear power generation facilities

    International Nuclear Information System (INIS)

    Bogard, T.; Finlay, K.

    1996-01-01

    Many operating nuclear power generation facilities are replacing their plant process computer. Such replacement projects are driven by equipment obsolescence issues and associated objectives to improve plant operability, increase plant information access, improve man machine interface characteristics, and reduce operation and maintenance costs. This paper describes a few recently completed and on-going replacement projects with emphasis upon the application of integrated distributed plant process computer systems. By presenting a few recent projects, the variations of distributed systems design show how various configurations can address needs for flexibility, open architecture, and integration of technological advancements in instrumentation and control technology. Architectural considerations for optimal integration of the plant process computer and plant process instrumentation and control are evident from variations of design features

  4. Terror in the Board Room: The Bid-Opening Process

    Science.gov (United States)

    Shoop, James

    2009-01-01

    Competitive bids and the bid-opening process are the cornerstones of public school purchasing. The bid-opening process does not begin on the day of the bid opening. It begins with good planning by the purchasing agent to ensure that the advertised bid complies with the public school contracts law. In New Jersey, that raises the following…

  5. Opening up the innovation process: archetypal strategies

    DEFF Research Database (Denmark)

    Vujovic, Sladjana; Ulhøi, John Parm

    2005-01-01

    sharing and co-operation play a critical part. The paper addresses the involvement of users in opening up the innovation process, which in turn gives the participating actors an interesting alternative for product development. We identify and classify four archetypal strategies for opening up...

  6. 78 FR 79298 - Securities Exempted; Distribution of Shares by Registered Open-End Management Investment Company...

    Science.gov (United States)

    2013-12-30

    ...] Securities Exempted; Distribution of Shares by Registered Open- End Management Investment Company...) 551-6792, Investment Company Rulemaking Office, Division of Investment Management, U.S. Securities and... Distribution of shares by registered open-end management investment company. * * * * * (g) If a plan covers...

  7. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    NARCIS (Netherlands)

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is

  8. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    OpenAIRE

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    1999-01-01

    A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is reviewed.

  9. Energy Consumption in the Process of Excavator-Automobile Complexes Distribution at Kuzbass Open Pit Mines

    Directory of Open Access Journals (Sweden)

    Panachev Ivan

    2017-01-01

    Every year, coal mining companies worldwide seek to keep renewing their fleets of mining machines. Various measures are implemented to maintain the service life of mining equipment already in operation. In this regard, the efficient distribution of the available machines across different geological conditions is an urgent issue. The problem of effectively distributing “excavator-automobile” complexes arises when heavy dump trucks are used in mining. Excavation and transportation of blasted rock mass are the most labor-intensive and costly processes, considering the volume of transported overburden and coal as well as the costs of diesel fuel, electricity, fuel and lubricants, consumables for repair works, downtime, etc. Currently, it is recommended that the number of bucket loads per truck be in the range of 3 to 5, and the dump trucks are distributed among the faces accordingly.

  10. The Reliability Estimation for the Open Function of Cabin Door Affected by the Imprecise Judgment Corresponding to Distribution Hypothesis

    Science.gov (United States)

    Yu, Z. P.; Yue, Z. F.; Liu, W.

    2018-05-01

    With the development of artificial intelligence, more and more reliability experts have noticed the role of subjective information in the reliability design of complex systems. Therefore, based on a certain number of experimental data points and expert judgments, we have divided reliability estimation based on a distribution hypothesis into a cognition process and a reliability calculation. Consequently, as an illustration of this modification, we have taken information fusion based on intuitional fuzzy belief functions as the diagnosis model of the cognition process, and completed the reliability estimation for the open function of a cabin door affected by the imprecise judgment corresponding to the distribution hypothesis.

  11. Risk analysis: opening the process

    International Nuclear Information System (INIS)

    Hubert, Ph.; Mays, C.

    1998-01-01

    This conference on risk analysis took place in Paris, 11-14 October 1999. Over 200 papers were presented in the following seven sessions: perception; environment and health; persuasive risks; objects and products; personal and collective involvement; assessment and valuation; management. A rational approach to risk analysis has been developed over the last three decades. Techniques for risk assessment have been thoroughly enhanced, risk management approaches have been developed, decision making processes have been clarified, and the social dimensions of risk perception and management have been investigated. Nevertheless, this construction is being challenged by recent events which reveal how deficits in stakeholder involvement, openness and democratic procedures can undermine risk management actions. Indeed, most components of the overall process of risk analysis may be radically called into question. Food safety has lately been a prominent issue, but new debates appear, or old debates are revisited, in the domains of public health, consumer product safety, waste management, environmental risks, nuclear installations, automobile safety and pollution. To meet the growing pressures for efficiency, openness, accountability, and multi-partner communication in risk analysis, institutional changes are underway in many European countries. However, the need for stakeholders to develop better insight into the process may lead to an evolution of all the components of risk analysis, even in its most 'technical' steps. For stakeholders of different professional backgrounds, political projects, and responsibilities, risk identification procedures must be rendered understandable, and quantitative risk assessment must be intelligible and accommodated in action proposals, ranging from countermeasures to educational programs to insurance mechanisms. Management formats must be open to local and political input and other types of operational feedback. (authors)

  12. Operating principle of Soft Open Points for electrical distribution network operation

    International Nuclear Information System (INIS)

    Cao, Wanyu; Wu, Jianzhong; Jenkins, Nick; Wang, Chengshan; Green, Timothy

    2016-01-01

    Highlights:
    • Two control modes were developed for a back-to-back (B2B) VSC-based SOP.
    • The SOP operating principle was investigated under various network conditions.
    • The performance of the SOP using the two control modes was analyzed.

    Abstract: Soft Open Points (SOPs) are power electronic devices installed in place of normally-open points in electrical power distribution networks. They are able to provide active power flow control, reactive power compensation and voltage regulation under normal network operating conditions, as well as fast fault isolation and supply restoration under abnormal conditions. Two control modes were developed for the operation of an SOP, using back-to-back voltage-source converters (VSCs). A power flow control mode with current control provides independent control of real and reactive power. A supply restoration mode with a voltage controller enables power supply to loads isolated by network faults. The operating principle of the back-to-back VSC-based SOP was investigated under both normal and abnormal network operating conditions. Studies on a two-feeder medium-voltage distribution network showed the performance of the SOP under different network operating conditions: normal, during a fault and post-fault supply restoration. During the change of network operating conditions, a mode switch method based on the phase-locked loop controller was used to achieve transitions between the two control modes. Hard transitions by direct mode switching were found to be unfavourable, whereas seamless transitions were obtained by deploying a soft cold-load pickup and voltage synchronization process.

  13. Flexible distributed architecture for semiconductor process control and experimentation

    Science.gov (United States)

    Gower, Aaron E.; Boning, Duane S.; McIlrath, Michael B.

    1997-01-01

    Semiconductor fabrication requires an increasingly expensive and integrated set of tightly controlled processes, driving the need for a fabrication facility with fully computerized, networked processing equipment. We describe an integrated, open system architecture enabling distributed experimentation and process control for plasma etching. The system was developed at MIT's Microsystems Technology Laboratories and employs in-situ CCD interferometry based analysis in the sensor-feedback control of an Applied Materials Precision 5000 Plasma Etcher (AME5000). Our system supports accelerated, advanced research involving feedback control algorithms, and includes a distributed interface that utilizes the internet to make these fabrication capabilities available to remote users. The system architecture is both distributed and modular: specific implementation of any one task does not restrict the implementation of another. The low level architectural components include a host controller that communicates with the AME5000 equipment via SECS-II, and a host controller for the acquisition and analysis of the CCD sensor images. A cell controller (CC) manages communications between these equipment and sensor controllers. The CC is also responsible for process control decisions; algorithmic controllers may be integrated locally or via remote communications. Finally, a system server manages connections from internet/intranet (web) based clients and uses a direct link with the CC to access the system. Each component communicates via a predefined set of TCP/IP socket based messages. This flexible architecture makes integration easier and more robust, and enables separate software components to run on the same or different computers independent of hardware or software platform.
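
    The predefined TCP/IP socket messages exchanged between components can be pictured with the minimal sketch below; the port number and the GET_STATUS/STATUS message format are hypothetical illustrations of the messaging style, not the actual protocol used by the cell controller.

```python
# Minimal sketch of one predefined TCP/IP message exchange between a cell
# controller and an equipment controller. The port and the newline-terminated
# "GET_STATUS" / "STATUS ..." messages are hypothetical illustrations of the
# socket-based messaging style, not the actual protocol.
import socket
import threading

HOST, PORT = "127.0.0.1", 5050                      # hypothetical endpoint
server_sock = socket.create_server((HOST, PORT))    # listening before any client connects


def equipment_controller():
    conn, _ = server_sock.accept()
    with conn:
        request = conn.makefile("r").readline().strip()
        if request == "GET_STATUS":
            conn.sendall(b"STATUS etch_chamber=idle pressure_mtorr=12\n")


threading.Thread(target=equipment_controller, daemon=True).start()

# Cell-controller side: send one predefined request and read the reply.
with socket.create_connection((HOST, PORT)) as sock:
    sock.sendall(b"GET_STATUS\n")
    print(sock.makefile("r").readline().strip())

server_sock.close()
```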

  14. An open software system based on X Windows for process control and equipment monitoring

    International Nuclear Information System (INIS)

    Aimar, A.; Carlier, E.; Mertens, V.

    1992-01-01

    The construction and application of a configurable open software system for process control and equipment monitoring can speed up and simplify the development and maintenance of equipment specific software as compared to individual solutions. The present paper reports the status of such an approach for the distributed control systems of SPS and LEP beam transfer components, based on X Windows and the OSF/Motif tool kit and applying data modeling and software engineering methods. (author)

  15. Distribution of Selected Trace Elements in the Bayer Process

    Directory of Open Access Journals (Sweden)

    Johannes Vind

    2018-05-01

    The aim of this work was to achieve an understanding of the distribution of selected bauxite trace elements (gallium (Ga), vanadium (V), arsenic (As), chromium (Cr), rare earth elements (REEs), scandium (Sc)) in the Bayer process. The assessment was designed as a case study in an alumina plant in operation to provide an overview of the trace elements' behaviour in an actual industrial setup. A combination of analytical techniques was used, mainly inductively coupled plasma mass spectrometry and optical emission spectroscopy as well as instrumental neutron activation analysis. It was found that Ga, V and As as well as, to a minor extent, Cr are principally accumulated in Bayer process liquors. In addition, Ga is also fractionated to alumina at the end of the Bayer processing cycle. The rest of these elements pass to bauxite residue. REEs and Sc have the tendency to remain practically unaffected in the solid phases of the Bayer process and, therefore, at least 98% of their mass is transferred to bauxite residue. The interest in such a study originates from the fact that many of these trace constituents of bauxite ore could potentially become valuable by-products of the Bayer process; therefore, the understanding of their behaviour needs to be expanded. In fact, Ga and V are already by-products of the Bayer process, but their distribution patterns have not been provided in the existing open literature.

  16. Open Source Web Based Geospatial Processing with OMAR

    Directory of Open Access Journals (Sweden)

    Mark Lucas

    2009-01-01

    The availability of geospatial data sets is exploding. New satellites, aerial platforms, video feeds, global positioning system tagged digital photos, and traditional GIS information are dramatically increasing across the globe. These raw materials need to be dynamically processed, combined and correlated to generate value added information products to answer a wide range of questions. This article provides an overview of OMAR web based geospatial processing. OMAR is part of the Open Source Software Image Map project under the Open Source Geospatial Foundation. The primary contributors of OSSIM make their livings by providing professional services to US Government agencies and programs. OMAR provides one example that open source software solutions are increasingly being deployed in US government agencies. We will also summarize the capabilities of OMAR and its plans for near term development.

  17. Opening up the Agile Innovation Process

    Science.gov (United States)

    Conboy, Kieran; Donnellan, Brian; Morgan, Lorraine; Wang, Xiaofeng

    The objective of this panel is to discuss how firms can operate both an open and agile innovation process. In an era of unprecedented changes, companies need to be open and agile in order to adapt rapidly and maximize their innovation processes. Proponents of agile methods claim that one of the main distinctions between agile methods and their traditional bureaucratic counterparts is their drive toward creativity and innovation. However, agile methods are rarely adopted in their textbook, "vanilla" format, and are usually adopted in part or are tailored or modified to suit the organization. While we are aware that this happens, there is still limited understanding of what is actually happening in practice. Using innovation adoption theory, this panel will discuss the issues and challenges surrounding the successful adoption of agile practices. In addition, this panel will report on the obstacles and benefits reported by over 20 industrial partners engaged in a pan-European research project into agile practices between 2006 and 2009.

  18. OpenLMD, multimodal monitoring and control of LMD processing

    Science.gov (United States)

    Rodríguez-Araújo, Jorge; García-Díaz, Antón

    2017-02-01

    This paper presents OpenLMD, a novel open-source solution for on-line multimodal monitoring of Laser Metal Deposition (LMD). The solution is also applicable to a wider range of laser-based applications that require on-line control (e.g. laser welding). OpenLMD is a middleware that enables the orchestration and virtualization of a LMD robot cell, using several open-source frameworks (e.g. ROS, OpenCV, PCL). The solution also allows reconfiguration by easy integration of multiple sensors and processing equipment. As a result, OpenLMD delivers significant advantages over existing monitoring and control approaches, such as improved scalability, and multimodal monitoring and data sharing capabilities.

  19. When to make proprietary software open source

    NARCIS (Netherlands)

    Caulkins, J.P.; Feichtinger, G.; Grass, D.; Hartl, R.F.; Kort, P.M.; Seidl, A.

    Software can be distributed closed source (proprietary) or open source (developed collaboratively). While a firm cannot sell open source software, and so loses potential sales revenue, the open source software development process can have a substantial positive impact on the quality of a software,

  20. Developing an Open Source, Reusable Platform for Distributed Collaborative Information Management in the Early Detection Research Network

    Science.gov (United States)

    Hart, Andrew F.; Verma, Rishi; Mattmann, Chris A.; Crichton, Daniel J.; Kelly, Sean; Kincaid, Heather; Hughes, Steven; Ramirez, Paul; Goodale, Cameron; Anton, Kristen

    2012-01-01

    For the past decade, the NASA Jet Propulsion Laboratory, in collaboration with Dartmouth University has served as the center for informatics for the Early Detection Research Network (EDRN). The EDRN is a multi-institution research effort funded by the U.S. National Cancer Institute (NCI) and tasked with identifying and validating biomarkers for the early detection of cancer. As the distributed network has grown, increasingly formal processes have been developed for the acquisition, curation, storage, and dissemination of heterogeneous research information assets, and an informatics infrastructure has emerged. In this paper we discuss the evolution of EDRN informatics, its success as a mechanism for distributed information integration, and the potential sustainability and reuse benefits of emerging efforts to make the platform components themselves open source. We describe our experience transitioning a large closed-source software system to a community driven, open source project at the Apache Software Foundation, and point to lessons learned that will guide our present efforts to promote the reuse of the EDRN informatics infrastructure by a broader community.

  1. Process of adoption communication openness in adoptive families: adopters’ perspective

    Directory of Open Access Journals (Sweden)

    Maria Acciaiuoli Barbosa-Ducharne

    2016-01-01

    Communication about adoption is a family interaction process which is more than the simple exchange of information. Adoption communication can be characterized in terms of the level of openness of family conversations regarding the child’s past and the degree of the family’s adoption social disclosure. The objective of this study is to explore the process of adoption communication openness in Portuguese adoptive families by identifying the impact of variables related to the adoption process, the adoptive parenting and the adoptee. One hundred twenty five parents of children aged 3 to 15, who were adopted on average 4 years ago, participated in this study. Data was collected during home visits using the Parents Adoption Process Interview. A cluster analysis identified three different groups of families according to the level of adoption communication openness within the family and outside. The findings also showed that the process of the adoption communication openness started when parents decided to adopt, developed in parent-child interaction and was susceptible to change under professional intervention. The relevance of training given to prospective adopters and of professional practice based on scientific evidence is highlighted.

  2. A Scalable Infrastructure for Lidar Topography Data Distribution, Processing, and Discovery

    Science.gov (United States)

    Crosby, C. J.; Nandigam, V.; Krishnan, S.; Phan, M.; Cowart, C. A.; Arrowsmith, R.; Baru, C.

    2010-12-01

    High-resolution topography data acquired with lidar (light detection and ranging) technology have emerged as a fundamental tool in the Earth sciences, and are also being widely utilized for ecological, planning, engineering, and environmental applications. Collected from airborne, terrestrial, and space-based platforms, these data are revolutionary because they permit analysis of geologic and biologic processes at resolutions essential for their appropriate representation. Public domain lidar data collection by federal, state, and local agencies are a valuable resource to the scientific community, however the data pose significant distribution challenges because of the volume and complexity of data that must be stored, managed, and processed. Lidar data acquisition may generate terabytes of data in the form of point clouds, digital elevation models (DEMs), and derivative products. This massive volume of data is often challenging to host for resource-limited agencies. Furthermore, these data can be technically challenging for users who lack appropriate software, computing resources, and expertise. The National Science Foundation-funded OpenTopography Facility (www.opentopography.org) has developed a cyberinfrastructure-based solution to enable online access to Earth science-oriented high-resolution lidar topography data, online processing tools, and derivative products. OpenTopography provides access to terabytes of point cloud data, standard DEMs, and Google Earth image data, all co-located with computational resources for on-demand data processing. The OpenTopography portal is built upon a cyberinfrastructure platform that utilizes a Services Oriented Architecture (SOA) to provide a modular system that is highly scalable and flexible enough to support the growing needs of the Earth science lidar community. OpenTopography strives to host and provide access to datasets as soon as they become available, and also to expose greater application level functionalities to

  3. Benefits analysis of Soft Open Points for electrical distribution network operation

    International Nuclear Information System (INIS)

    Cao, Wanyu; Wu, Jianzhong; Jenkins, Nick; Wang, Chengshan; Green, Timothy

    2016-01-01

    Highlights:
    • An analysis framework was developed to quantify the operational benefits.
    • The framework considers both network reconfiguration and SOP control.
    • Benefits were analyzed through both quantitative and sensitivity analysis.

    Abstract: Soft Open Points (SOPs) are power electronic devices installed in place of normally-open points in electrical power distribution networks. They are able to provide active power flow control, reactive power compensation and voltage regulation under normal network operating conditions, as well as fast fault isolation and supply restoration under abnormal conditions. A steady state analysis framework was developed to quantify the operational benefits of a distribution network with SOPs under normal network operating conditions. A generic power injection model was developed and used to determine the optimal SOP operation using an improved Powell’s Direct Set method. Physical limits and power losses of the SOP device (based on back-to-back voltage-source converters) were considered in the model. Distribution network reconfiguration algorithms, with and without SOPs, were developed and used to identify the benefits of using SOPs. Test results on a 33-bus distribution network compared the benefits of using SOPs, traditional network reconfiguration and the combination of both. The results showed that using only one SOP achieved a similar improvement in network operation compared to the case of using network reconfiguration with all branches equipped with remotely controlled switches. A combination of SOP control and network reconfiguration provided the optimal network operation.
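
    The idea of determining an optimal SOP operating point by direct-search optimisation can be illustrated on a deliberately tiny model; the sketch below minimises the losses of a two-feeder toy network over the SOP power transfer using SciPy's Powell method, with illustrative resistances, loads and a converter-loss coefficient rather than the paper's 33-bus network and generic power injection model.

```python
# Toy illustration of picking an SOP operating point by direct-search
# optimisation (SciPy's Powell method). The two-feeder loss model, feeder
# resistances, loads and converter-loss coefficient are simplified
# assumptions, not the paper's 33-bus network or generic injection model.
from scipy.optimize import minimize

R1, R2 = 0.05, 0.08        # per-unit feeder resistances (illustrative)
P1, P2 = 0.9, 0.4          # per-unit feeder loads (illustrative)
K_LOSS = 0.02              # SOP converter loss coefficient (assumed)


def total_losses(x):
    p_sop = x[0]                     # power transferred from feeder 2 to feeder 1
    flow1 = P1 - p_sop               # power supplied through the feeder 1 head
    flow2 = P2 + p_sop               # power supplied through the feeder 2 head
    return R1 * flow1**2 + R2 * flow2**2 + K_LOSS * abs(p_sop)


res = minimize(total_losses, x0=[0.0], method="Powell")
print(f"optimal SOP transfer: {res.x[0]:.3f} p.u., total losses: {res.fun:.4f} p.u.")
```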

  4. MzJava: An open source library for mass spectrometry data processing.

    Science.gov (United States)

    Horlacher, Oliver; Nikitin, Frederic; Alocci, Davide; Mariethoz, Julien; Müller, Markus; Lisacek, Frederique

    2015-11-03

    Mass spectrometry (MS) is a widely used and evolving technique for the high-throughput identification of molecules in biological samples. The need for sharing and reuse of code among bioinformaticians working with MS data prompted the design and implementation of MzJava, an open-source Java Application Programming Interface (API) for MS related data processing. MzJava provides data structures and algorithms for representing and processing mass spectra and their associated biological molecules, such as metabolites, glycans and peptides. MzJava includes functionality to perform mass calculation, peak processing (e.g. centroiding, filtering, transforming), spectrum alignment and clustering, protein digestion, fragmentation of peptides and glycans as well as scoring functions for spectrum-spectrum and peptide/glycan-spectrum matches. For data import and export MzJava implements readers and writers for commonly used data formats. For many classes support for the Hadoop MapReduce (hadoop.apache.org) and Apache Spark (spark.apache.org) frameworks for cluster computing was implemented. The library has been developed applying best practices of software engineering. To ensure that MzJava contains code that is correct and easy to use the library's API was carefully designed and thoroughly tested. MzJava is an open-source project distributed under the AGPL v3.0 licence. MzJava requires Java 1.7 or higher. Binaries, source code and documentation can be downloaded from http://mzjava.expasy.org and https://bitbucket.org/sib-pig/mzjava. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Open Government and (Linked) (Open) (Government) (Data)

    Directory of Open Access Journals (Sweden)

    Christian Philipp Geiger

    2012-12-01

    This article explores the opening up and free usage of stored public sector data supplied by the state. In the age of Open Government and Open Data it is not enough just to put data online; rather, it should be carefully weighed whether, how and which public sector data can be published. Open Data are defined as stored data which can be made accessible in the public interest without any restrictions on usage and distribution. These Open Data may be statistics, geo data, maps, plans, environmental data and weather data, in addition to materials of the parliaments, ministries and authorities. The preparation of, and free access to, existing data permit varied approaches to the reuse of data, which are discussed in the article. In addition, impulses can be given for Open Government – the opening of state and administration – towards more transparency, participation and collaboration as well as towards innovation and business development. The Open Data movement tries to get to the bottom of current publication processes in the public sector, which could be made even more friendly to citizens and enterprises.

  6. A Distributed OpenCL Framework using Redundant Computation and Data Replication

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Junghyun [Seoul National University, Korea]; Gangwon, Jo [Seoul National University, Korea]; Jaehoon, Jung [Seoul National University, Korea]; Lee, Jaejin [Seoul National University, Korea]

    2016-01-01

    Applications written solely in OpenCL or CUDA cannot execute on a cluster as a whole. Most previous approaches that extend these programming models to clusters are based on a common idea: designating a centralized host node and coordinating the other nodes with the host for computation. However, the centralized host node is a serious performance bottleneck when the number of nodes is large. In this paper, we propose a scalable and distributed OpenCL framework called SnuCL-D for large-scale clusters. SnuCL-D's remote device virtualization provides an OpenCL application with an illusion that all compute devices in a cluster are confined in a single node. To reduce the amount of control-message and data communication between nodes, SnuCL-D replicates the OpenCL host program execution and data in each node. We also propose a new OpenCL host API function and a queueing optimization technique that significantly reduce the overhead incurred by the previous centralized approaches. To show the effectiveness of SnuCL-D, we evaluate SnuCL-D with a microbenchmark and eleven benchmark applications on a large-scale CPU cluster and a medium-scale GPU cluster.

  7. Transient flow analysis of integrated valve opening process

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Xinming; Qin, Benke; Bo, Hanliang, E-mail: bohl@tsinghua.edu.cn; Xu, Xingxing

    2017-03-15

    Highlights:
    • The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology and the integrated valve (IV) is its key control component.
    • The transient flow experiment induced by the IV is conducted and the test results are analyzed to obtain its working mechanism.
    • The theoretical model of the IV opening process is established and applied to obtain the variation of the transient flow characteristic parameters.

    Abstract: The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology and the IV is the key control component. The working principle of the integrated valve (IV) is analyzed and the IV hydraulic experiment is conducted. There is a transient flow phenomenon in the valve opening process. The theoretical model of the IV opening process is established by the loop system control equations and boundary conditions. The valve opening boundary condition equation is established based on the IV three-dimensional flow field analysis results and the dynamic analysis of the valve core movement. The model calculation results are in good agreement with the experimental results. On this basis, the model is used to analyze the transient flow under high temperature conditions. The peak pressure head is consistent with the one at room temperature and the pressure fluctuation period is longer than the one at room temperature. Furthermore, the variation of the pressure transients with the fluid and loop structure parameters is analyzed. The peak pressure increases with the flow rate and decreases with an increase of the valve opening time. The pressure fluctuation period increases with the loop pipe length and the fluctuation amplitude remains largely unchanged under different equilibrium pressure conditions. The research results lay the basis for the vibration reduction analysis of the CRHDS.

  8. 17 CFR 270.12b-1 - Distribution of shares by registered open-end management investment company.

    Science.gov (United States)

    2010-04-01

    ... registered open-end management investment company. 270.12b-1 Section 270.12b-1 Commodity and Securities... 1940 § 270.12b-1 Distribution of shares by registered open-end management investment company. (a)(1... the printing and mailing of sales literature; (b) A registered, open-end management investment company...

  9. An Effective Framework for Distributed Geospatial Query Processing in Grids

    Directory of Open Access Journals (Sweden)

    CHEN, B.

    2010-08-01

Full Text Available The emergence of the Internet has greatly revolutionized the way that geospatial information is collected, managed, processed and integrated. There are several important research issues to be addressed for distributed geospatial applications. First, the performance of geospatial applications needs to be considered in the Internet environment. In this regard, the Grid as an effective distributed computing paradigm is a good choice. The Grid uses a series of middleware to interconnect and merge various distributed resources into a super-computer with capability of high performance computation. Secondly, it is necessary to ensure the secure use of independent geospatial applications in the Internet environment. The Grid just provides the utility of secure access to distributed geospatial resources. Additionally, it makes good sense to overcome the heterogeneity between individual geospatial information systems in the Internet. The Open Geospatial Consortium (OGC) proposes a number of generalized geospatial standards, e.g. OGC Web Services (OWS), to achieve interoperable access to geospatial applications. The OWS solution is feasible and widely adopted by both the academic community and the industry community. Therefore, we propose an integrated framework by incorporating OWS standards into Grids. Upon the framework distributed geospatial queries can be performed in an interoperable, high-performance and secure Grid environment.
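    As a concrete flavor of the OWS interoperability referred to above, the sketch below issues a WFS GetFeature request as plain key-value parameters (Python with requests; the endpoint and layer name are hypothetical and not taken from the paper).

```python
import requests

# Hypothetical WFS endpoint and layer; any OGC-compliant WFS accepts the
# same key-value request, which is what makes such services interoperable.
WFS_URL = "https://example.org/geoserver/wfs"

params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typeName": "topp:states",               # hypothetical feature type
    "outputFormat": "application/json",      # GeoJSON output (server-dependent)
    "maxFeatures": 10,
}

response = requests.get(WFS_URL, params=params, timeout=30)
response.raise_for_status()
print(response.json()["features"][0]["properties"])
```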

  10. Safe Distribution of Declarative Processes

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao; Slaats, Tijs

    2011-01-01

... process model generalizing labelled prime event structures to a systems model able to finitely represent ω-regular languages. An operational semantics given as a transition semantics between markings of the graph allows DCR Graphs to be conveniently used as both specification and execution model. The technique for distribution is based on a new general notion of projection of DCR Graphs relative to a subset of labels and events identifying the set of external events that must be communicated from the other processes in the network in order for the distribution to be safe. We prove that for any vector of projections that covers a DCR Graph, the network of synchronously communicating DCR Graphs given by the projections is bisimilar to the original global process graph. We exemplify the distribution technique on a process identified in a case study of a cross-organizational case management system carried...

  11. IOC-UNEP review meeting on oceanographic processes of transport and distribution of pollutants in the sea

    International Nuclear Information System (INIS)

    1991-01-01

The IOC-UNEP Review Meeting on Oceanographic Processes of Transfer and Distribution of Pollutants in the Sea was opened at the Ruder Boskovic Institute, Zagreb, Yugoslavia on Monday, 15 May 1989. Papers presented at the meeting dealt with physical and geochemical processes in sea-water and sediment in the transport, mixing and dispersal of pollutants. The importance of mesoscale eddies and gyres in the open sea, wind-driven currents and upwelling events in the coastal zone, and thermohaline processes in semi-enclosed bays and estuaries was recognized. There is strong evidence that non-local forcing can drive circulation in the coastal area. Concentrations, horizontal and vertical distributions and transport of pollutants were investigated and presented for a number of coastal areas. Riverine and atmospheric inputs of different pollutants to the western Mediterranean were discussed. Reports on two on-going nationally/internationally co-ordinated projects (MEDMODEL, EROS 2000) were presented. Discussions during the meeting enabled an exchange of ideas between specialists in different disciplines. It is expected that this will promote the future interdisciplinary approach in this field. The meeting recognized the importance of physical oceanographic studies in investigating the transfer and distribution of pollutants in the sea and, in view of the importance of the interdisciplinary approach and of bilateral and/or multilateral co-operation, a number of recommendations were adopted

  12. IJ-OpenCV: Combining ImageJ and OpenCV for processing images in biomedicine.

    Science.gov (United States)

    Domínguez, César; Heras, Jónathan; Pascual, Vico

    2017-05-01

    The effective processing of biomedical images usually requires the interoperability of diverse software tools that have different aims but are complementary. The goal of this work is to develop a bridge to connect two of those tools: ImageJ, a program for image analysis in life sciences, and OpenCV, a computer vision and machine learning library. Based on a thorough analysis of ImageJ and OpenCV, we detected the features of these systems that could be enhanced, and developed a library to combine both tools, taking advantage of the strengths of each system. The library was implemented on top of the SciJava converter framework. We also provide a methodology to use this library. We have developed the publicly available library IJ-OpenCV that can be employed to create applications combining features from both ImageJ and OpenCV. From the perspective of ImageJ developers, they can use IJ-OpenCV to easily create plugins that use any functionality provided by the OpenCV library and explore different alternatives. From the perspective of OpenCV developers, this library provides a link to the ImageJ graphical user interface and all its features to handle regions of interest. The IJ-OpenCV library bridges the gap between ImageJ and OpenCV, allowing the connection and the cooperation of these two systems. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. The eGo grid model: An open-source and open-data based synthetic medium-voltage grid model for distribution power supply systems

    Science.gov (United States)

    Amme, J.; Pleßmann, G.; Bühler, J.; Hülk, L.; Kötter, E.; Schwaegerl, P.

    2018-02-01

The increasing integration of renewable energy into the electricity supply system creates new challenges for distribution grids. The planning and operation of distribution systems require appropriate grid models that consider the heterogeneity of existing grids. In this paper, we describe a novel method to generate synthetic medium-voltage (MV) grids, which we applied in our DIstribution Network GeneratOr (DINGO). DINGO is open-source software and uses freely available data. Medium-voltage grid topologies are synthesized based on location and electricity demand in defined demand areas. For this purpose, we use GIS data containing demand areas with high-resolution spatial data on physical properties, land use, energy, and demography. The grid topology is treated as a capacitated vehicle routing problem (CVRP) combined with a local search metaheuristic. We also consider the current planning principles for MV distribution networks, paying special attention to line congestion and voltage limit violations. In the modelling process, we included power flow calculations for validation. The resulting grid model datasets contain 3608 synthetic MV grids in high resolution, covering all of Germany and taking local characteristics into account. We compared the modelled networks with real network data. In terms of the number of transformers and total cable length, we conclude that the method presented in this paper generates realistic grids that could be used to implement a cost-optimised electrical energy system.
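    To make the CVRP framing above concrete, here is a toy greedy sketch (plain Python, not DINGO code; all names and parameter values are hypothetical) that attaches demand nodes to feeders one at a time and opens a new feeder whenever the capacity limit would be exceeded.

```python
import math

def plan_feeders(substation, loads, capacity):
    """loads: dict node -> (x, y, demand_kW); returns a list of feeders."""
    unserved = dict(loads)
    feeders = []
    while unserved:
        feeder, used, pos = [], 0.0, substation
        while True:
            # closest still-unserved node that fits on this feeder
            candidates = [(n, v) for n, v in unserved.items()
                          if used + v[2] <= capacity]
            if not candidates:
                break
            node, (x, y, demand) = min(
                candidates, key=lambda item: math.dist(pos, item[1][:2]))
            feeder.append(node)
            used += demand
            pos = (x, y)
            del unserved[node]
        if not feeder:   # a single load exceeds the feeder capacity
            raise ValueError("load larger than feeder capacity")
        feeders.append(feeder)
    return feeders

loads = {"a": (1, 0, 300), "b": (2, 1, 400), "c": (0, 3, 500), "d": (4, 4, 250)}
print(plan_feeders((0, 0), loads, capacity=800))   # [['a', 'b'], ['c', 'd']]
```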

  14. Study of the Release Process of Open Source Software: Case Study

    OpenAIRE

    Eide, Tor Erik

    2007-01-01

    This report presents the results of a case study focusing on the release process of open source projects initiated with commercial motives. The purpose of the study is to gain an increased understanding of the release process, how a community can be attracted to the project, and how the interaction with the community evolves in commercial open source initiatives. Data has been gathered from four distinct sources to form the basis of this thesis. A thorough review of the open source literatu...

  15. Leveraging human oversight and intervention in large-scale parallel processing of open-source data

    Science.gov (United States)

    Casini, Enrico; Suri, Niranjan; Bradshaw, Jeffrey M.

    2015-05-01

The popularity of cloud computing, along with the increased availability of cheap storage, has led to the necessity of elaborating and transforming large volumes of open-source data, all in parallel. One way to handle such extensive volumes of information properly is to take advantage of distributed computing frameworks like Map-Reduce. Unfortunately, an entirely automated approach that excludes human intervention is often unpredictable and error-prone. Highly accurate data processing and decision-making can be achieved by supporting an automatic process through human collaboration, in a variety of environments such as warfare, cyber security and threat monitoring. Although this mutual participation seems easily exploitable, human-machine collaboration in the field of data analysis presents several challenges. First, due to the asynchronous nature of human intervention, it is necessary to verify that once a correction is made, all the necessary reprocessing is carried out down the chain. Second, it is often necessary to minimize the amount of reprocessing in order to optimize the usage of resources due to their limited availability. To meet these strict requirements, this paper introduces improvements to an innovative approach for human-machine collaboration in the processing of large amounts of open-source data in parallel.
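    For readers unfamiliar with the Map-Reduce pattern mentioned above, the following minimal Python sketch (illustrative only, not the authors' system) shows the two phases: independent map tasks run in parallel, and a reduce step merges their partial results, which is the point at which a human correction would force selective reprocessing.

```python
from collections import Counter
from functools import reduce
from multiprocessing import Pool

def map_phase(document):
    # each mapper works on one document independently, so mappers can run
    # in parallel on different workers
    return Counter(document.lower().split())

def reduce_phase(left, right):
    # merge two partial results into one
    left.update(right)
    return left

if __name__ == "__main__":
    documents = ["open data from source A", "Open DATA from source B"]
    with Pool(2) as pool:
        partials = pool.map(map_phase, documents)
    totals = reduce(reduce_phase, partials, Counter())
    print(totals.most_common(3))
```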

  16. A radial distribution function-based open boundary force model for multi-centered molecules

    KAUST Repository

    Neumann, Philipp; Eckhardt, Wolfgang; Bungartz, Hans-Joachim

    2014-01-01

    We derive an expression for radial distribution function (RDF)-based open boundary forcing for molecules with multiple interaction sites. Due to the high-dimensionality of the molecule configuration space and missing rotational invariance, a

  17. Distributed password cracking

    OpenAIRE

    Crumpacker, John R.

    2009-01-01

    Approved for public release, distribution unlimited Password cracking requires significant processing power, which in today's world is located at a workstation or home in the form of a desktop computer. Berkeley Open Infrastructure for Network Computing (BOINC) is the conduit to this significant source of processing power and John the Ripper is the key. BOINC is a distributed data processing system that incorporates client-server relationships to generically process data. The BOINC structu...
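    The reason password cracking distributes so well is that candidate passwords can be tested independently. The toy sketch below (plain Python; not BOINC or John the Ripper code, and the target hash is made up for the example) splits a word list into independent work units, the same structure BOINC exploits at much larger scale.

```python
import hashlib
from multiprocessing import Pool

# Hypothetical target: the SHA-256 digest of an unknown password.
TARGET = hashlib.sha256(b"letmein").hexdigest()

def try_chunk(words):
    # each work unit tests its own slice of candidates, independently
    for word in words:
        if hashlib.sha256(word.encode()).hexdigest() == TARGET:
            return word
    return None

if __name__ == "__main__":
    wordlist = ["password", "123456", "qwerty", "letmein", "dragon"]
    chunks = [wordlist[i::4] for i in range(4)]   # 4 independent work units
    with Pool(4) as pool:
        hits = [hit for hit in pool.map(try_chunk, chunks) if hit]
    print(hits)   # ['letmein']
```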

  18. Opening up the Innovation Process

    DEFF Research Database (Denmark)

    Vujovic, Sladjana; Ulhøi, John Parm

    2008-01-01

    An organization's ability to create, retrieve, and use knowledge to innovate is a critical strategic asset. Until recently, most textbooks on business and product development argued that managers should keep their new ideas to themselves and protect knowledge from getting into competitors' hands....... Seeking, developing, and protecting knowledge is a costly endeavour. Moreover, apart from being expensive, the process of turning new knowledge into useful and well-protected innovations often slows the speed of development and increases costs. In this chapter, alternative strategies for innovation......, in which sharing and co-operation play a critical part, are discussed. In particular, we address the involvement of users in opening up the innovation process which, in turn, offers participating actors some useful strategies for product development. Four archetypal strategies are identified and classified...

  19. KeyWare: an open wireless distributed computing environment

    Science.gov (United States)

    Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir

    1995-12-01

Deployment of distributed applications in the wireless domain lacks the equivalent tools, methodologies, architectures, and network management that exist in LAN-based applications. A wireless distributed computing environment (KeyWare™) based on intelligent agents within a multiple-client multiple-server scheme was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput and transmission costs. A unified network management paradigm for both wireless and wireline facilitates seamless extensions of LAN-based management tools to include wireless nodes. A set of object-oriented tools and methodologies enables direct asynchronous invocation of agent-based services supplemented by tool-sets matched to supported KeyWare paradigms. The open architecture embodiment of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems and infrastructures while maintaining application portability.

  20. Watershed Modeling Applications with the Open-Access Modular Distributed Watershed Educational Toolbox (MOD-WET) and Introductory Hydrology Textbook

    Science.gov (United States)

    Huning, L. S.; Margulis, S. A.

    2014-12-01

    Traditionally, introductory hydrology courses focus on hydrologic processes as independent or semi-independent concepts that are ultimately integrated into a watershed model near the end of the term. When an "off-the-shelf" watershed model is introduced in the curriculum, this approach can result in a potential disconnect between process-based hydrology and the inherent interconnectivity of processes within the water cycle. In order to curb this and reduce the learning curve associated with applying hydrologic concepts to complex real-world problems, we developed the open-access Modular Distributed Watershed Educational Toolbox (MOD-WET). The user-friendly, MATLAB-based toolbox contains the same physical equations for hydrological processes (i.e. precipitation, snow, radiation, evaporation, unsaturated flow, infiltration, groundwater, and runoff) that are presented in the companion e-textbook (http://aqua.seas.ucla.edu/margulis_intro_to_hydro_textbook.html) and taught in the classroom. The modular toolbox functions can be used by students to study individual hydrologic processes. These functions are integrated together to form a simple spatially-distributed watershed model, which reinforces a holistic understanding of how hydrologic processes are interconnected and modeled. Therefore when watershed modeling is introduced, students are already familiar with the fundamental building blocks that have been unified in the MOD-WET model. Extensive effort has been placed on the development of a highly modular and well-documented code that can be run on a personal computer within the commonly-used MATLAB environment. MOD-WET was designed to: 1) increase the qualitative and quantitative understanding of hydrological processes at the basin-scale and demonstrate how they vary with watershed properties, 2) emphasize applications of hydrologic concepts rather than computer programming, 3) elucidate the underlying physical processes that can often be obscured with a complicated
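    MOD-WET itself is a MATLAB toolbox; purely to illustrate the modular idea described above (per-process functions that are later wired together into a watershed model), here is a hedged Python sketch in which every process function and parameter value is hypothetical.

```python
# Hypothetical, simplified process modules (degree-day snowmelt, capped
# infiltration, mass-balance runoff) composed into one daily time step.
def snowmelt(temp_c, swe_mm, degree_day_factor=3.0):
    melt = max(0.0, degree_day_factor * temp_c)
    return min(melt, swe_mm)

def infiltration(rain_mm, capacity_mm=20.0):
    return min(rain_mm, capacity_mm)

def runoff(rain_mm, melt_mm, infiltrated_mm):
    return max(0.0, rain_mm + melt_mm - infiltrated_mm)

def daily_step(rain_mm, temp_c, swe_mm):
    # the "watershed model" is just the composition of the process modules
    melt = snowmelt(temp_c, swe_mm)
    infil = infiltration(rain_mm)
    q = runoff(rain_mm, melt, infil)
    return {"runoff_mm": q, "swe_mm": swe_mm - melt}

print(daily_step(rain_mm=12.0, temp_c=2.5, swe_mm=40.0))
```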

  1. Experiments to Distribute Map Generalization Processes

    Science.gov (United States)

    Berli, Justin; Touya, Guillaume; Lokhat, Imran; Regnauld, Nicolas

    2018-05-01

Automatic map generalization requires the use of computationally intensive processes often unable to deal with large datasets. Distributing the generalization process is the only way to make them scalable and usable in practice. But map generalization is a highly contextual process, and the surroundings of a generalized map feature need to be known to generalize the feature, which is a problem as distribution might partition the dataset and parallelize the processing of each part. This paper proposes experiments to evaluate the past propositions to distribute map generalization, and to identify the main remaining issues. The past propositions to distribute map generalization are first discussed, and then the experiment hypotheses and apparatus are described. The experiments confirmed that regular partitioning was the quickest strategy, but also the least effective in taking context into account. The geographical partitioning, though less effective for now, is quite promising regarding the quality of the results as it better integrates the geographical context.
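    The trade-off between regular and geographical partitioning discussed above is easier to see in code. The sketch below (a hedged Python illustration, not the authors' implementation) performs regular grid partitioning and duplicates each feature into neighbouring cells within a buffer distance so that some generalization context travels with each partition.

```python
def regular_partition(features, cell_size, buffer_size):
    """features: list of (x, y, payload); returns dict cell -> features."""
    cells = {}
    for x, y, payload in features:
        # assign the feature to every cell whose buffered extent contains it,
        # so neighbouring context is available when each cell is generalized
        for cx in range(int((x - buffer_size) // cell_size),
                        int((x + buffer_size) // cell_size) + 1):
            for cy in range(int((y - buffer_size) // cell_size),
                            int((y + buffer_size) // cell_size) + 1):
                cells.setdefault((cx, cy), []).append((x, y, payload))
    return cells

features = [(12.0, 7.5, "building"), (13.1, 8.0, "road"), (55.0, 40.2, "river")]
print(regular_partition(features, cell_size=25.0, buffer_size=2.0))
```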

  2. Distribution and behavior of transuranics in the open ocean

    International Nuclear Information System (INIS)

    Nakamura, Kiyoshi

    1996-01-01

The major source of 239,240Pu in the open ocean is global fallout from atmospheric weapon testing. In 1987-1989, the latitudinal distribution of 239,240Pu in the surface water showed a pattern reflecting global deposition in both the Pacific and the Atlantic Ocean. The feature of the 239,240Pu vertical distribution is that a subsurface maximum exists at depths from 200 m to 1000 m over a large area of the ocean. Inventories of 239,240Pu in the water column are often higher than those expected from global fallout, and close-in fallout is suggested to account for the additional inventory in the Pacific Ocean. The 239,240Pu inventory in sediment has values in the range of 2 to 27% of that in sea water. The use of a multiple corer is suggested for sediment sampling. (author)

  3. On Distributed Port-Hamiltonian Process Systems

    NARCIS (Netherlands)

    Lopezlena, Ricardo; Scherpen, Jacquelien M.A.

    2004-01-01

In this paper we use the term distributed port-Hamiltonian Process Systems (DPHPS) to refer to the result of merging the theory of distributed Port-Hamiltonian systems (DPHS) with the theory of process systems (PS). Such a concept is useful for combining the systematic interconnection of PHS with the

  4. Palm distributions for log Gaussian Cox processes

    DEFF Research Database (Denmark)

    Coeurjolly, Jean-Francois; Møller, Jesper; Waagepetersen, Rasmus Plenge

    2017-01-01

    This paper establishes a remarkable result regarding Palm distributions for a log Gaussian Cox process: the reduced Palm distribution for a log Gaussian Cox process is itself a log Gaussian Cox process that only differs from the original log Gaussian Cox process in the intensity function. This new...... result is used to study functional summaries for log Gaussian Cox processes....
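    In standard LGCP notation (our notation, not copied from the paper), the result described above can be stated roughly as follows: if the point process is driven by an exponentiated Gaussian random field, its reduced Palm distribution at a location u is again an LGCP whose log-intensity mean is shifted by the covariance function.

```latex
% X is a log Gaussian Cox process with random intensity
% \Lambda(s) = \exp\{Z(s)\}, where Z is a Gaussian random field with
% mean function m and covariance function c (notation assumed).
\[
  X \sim \mathrm{LGCP}(m, c)
  \quad \Longrightarrow \quad
  X_{u}^{!} \sim \mathrm{LGCP}\bigl(m(\cdot) + c(\cdot, u),\, c\bigr),
\]
% i.e. the reduced Palm distribution at u differs from the original
% process only through its intensity function, as stated above.
```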

  5. Design Principles for Improving the Process of Publishing Open data

    NARCIS (Netherlands)

    Zuiderwijk, A.M.G.; Janssen, M.F.W.H.A.; Choenni, R.; Meijer, R.F.

    2014-01-01

    · Purpose: Governments create large amounts of data. However, the publication of open data is often cumbersome and there are no standard procedures and processes for opening data. This blocks the easy publication of government data. The purpose of this paper is to derive design principles for

  6. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Sanggoo Kang

    2016-08-01

    Full Text Available Cloud computing is a base platform for the distribution of large volumes of data and high-performance image processing on the Web. Despite wide applications in Web-based services and their many benefits, geo-spatial applications based on cloud computing technology are still developing. Auto-scaling realizes automatic scalability, i.e., the scale-out and scale-in processing of virtual servers in a cloud computing environment. This study investigates the applicability of auto-scaling to geo-based image processing algorithms by comparing the performance of a single virtual server and multiple auto-scaled virtual servers under identical experimental conditions. In this study, the cloud computing environment is built with OpenStack, and four algorithms from the Orfeo toolbox are used for practical geo-based image processing experiments. The auto-scaling results from all experimental performance tests demonstrate applicable significance with respect to cloud utilization concerning response time. Auto-scaling contributes to the development of web-based satellite image application services using cloud-based technologies.

  7. Understanding flexible and distributed software development processes

    OpenAIRE

    Agerfalk, Par J.; Fitzgerald, Brian

    2006-01-01

    peer-reviewed The minitrack on Flexible and Distributed Software Development Processes addresses two important and partially intertwined current themes in software development: process flexibility and globally distributed software development

  8. Web-Based Distributed XML Query Processing

    NARCIS (Netherlands)

    Smiljanic, M.; Feng, L.; Jonker, Willem; Blanken, Henk; Grabs, T.; Schek, H-J.; Schenkel, R.; Weikum, G.

    2003-01-01

    Web-based distributed XML query processing has gained in importance in recent years due to the widespread popularity of XML on the Web. Unlike centralized and tightly coupled distributed systems, Web-based distributed database systems are highly unpredictable and uncontrollable, with a rather

  9. Parallel and distributed processing: applications to power systems

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Felix; Murphy, Liam [California Univ., Berkeley, CA (United States). Dept. of Electrical Engineering and Computer Sciences

    1994-12-31

    Applications of parallel and distributed processing to power systems problems are still in the early stages. Rapid progress in computing and communications promises a revolutionary increase in the capacity of distributed processing systems. In this paper, the state-of-the art in distributed processing technology and applications is reviewed and future trends are discussed. (author) 14 refs.,1 tab.

  10. Plastic debris in the open ocean

    KAUST Repository

    Cozar, Andres

    2014-06-30

    There is a rising concern regarding the accumulation of floating plastic debris in the open ocean. However, the magnitude and the fate of this pollution are still open questions. Using data from the Malaspina 2010 circumnavigation, regional surveys, and previously published reports, we show a worldwide distribution of plastic on the surface of the open ocean, mostly accumulating in the convergence zones of each of the five subtropical gyres with comparable density. However, the global load of plastic on the open ocean surface was estimated to be on the order of tens of thousands of tons, far less than expected. Our observations of the size distribution of floating plastic debris point at important size-selective sinks removing millimeter-sized fragments of floating plastic on a large scale. This sink may involve a combination of fast nano-fragmentation of the microplastic into particles of microns or smaller, their transference to the ocean interior by food webs and ballasting processes, and processes yet to be discovered. Resolving the fate of the missing plastic debris is of fundamental importance to determine the nature and significance of the impacts of plastic pollution in the ocean.

  11. Plastic debris in the open ocean.

    Science.gov (United States)

    Cózar, Andrés; Echevarría, Fidel; González-Gordillo, J Ignacio; Irigoien, Xabier; Ubeda, Bárbara; Hernández-León, Santiago; Palma, Alvaro T; Navarro, Sandra; García-de-Lomas, Juan; Ruiz, Andrea; Fernández-de-Puelles, María L; Duarte, Carlos M

    2014-07-15

    There is a rising concern regarding the accumulation of floating plastic debris in the open ocean. However, the magnitude and the fate of this pollution are still open questions. Using data from the Malaspina 2010 circumnavigation, regional surveys, and previously published reports, we show a worldwide distribution of plastic on the surface of the open ocean, mostly accumulating in the convergence zones of each of the five subtropical gyres with comparable density. However, the global load of plastic on the open ocean surface was estimated to be on the order of tens of thousands of tons, far less than expected. Our observations of the size distribution of floating plastic debris point at important size-selective sinks removing millimeter-sized fragments of floating plastic on a large scale. This sink may involve a combination of fast nano-fragmentation of the microplastic into particles of microns or smaller, their transference to the ocean interior by food webs and ballasting processes, and processes yet to be discovered. Resolving the fate of the missing plastic debris is of fundamental importance to determine the nature and significance of the impacts of plastic pollution in the ocean.

  12. Plastic debris in the open ocean

    Science.gov (United States)

    Cózar, Andrés; Echevarría, Fidel; González-Gordillo, J. Ignacio; Irigoien, Xabier; Úbeda, Bárbara; Hernández-León, Santiago; Palma, Álvaro T.; Navarro, Sandra; García-de-Lomas, Juan; Ruiz, Andrea; Fernández-de-Puelles, María L.; Duarte, Carlos M.

    2014-01-01

    There is a rising concern regarding the accumulation of floating plastic debris in the open ocean. However, the magnitude and the fate of this pollution are still open questions. Using data from the Malaspina 2010 circumnavigation, regional surveys, and previously published reports, we show a worldwide distribution of plastic on the surface of the open ocean, mostly accumulating in the convergence zones of each of the five subtropical gyres with comparable density. However, the global load of plastic on the open ocean surface was estimated to be on the order of tens of thousands of tons, far less than expected. Our observations of the size distribution of floating plastic debris point at important size-selective sinks removing millimeter-sized fragments of floating plastic on a large scale. This sink may involve a combination of fast nano-fragmentation of the microplastic into particles of microns or smaller, their transference to the ocean interior by food webs and ballasting processes, and processes yet to be discovered. Resolving the fate of the missing plastic debris is of fundamental importance to determine the nature and significance of the impacts of plastic pollution in the ocean. PMID:24982135

  13. Multivariate semi-logistic distribution and processes | Umar | Journal ...

    African Journals Online (AJOL)

    Multivariate semi-logistic distribution is introduced and studied. Some characterizations properties of multivariate semi-logistic distribution are presented. First order autoregressive minification processes and its generalization to kth order autoregressive minification processes with multivariate semi-logistic distribution as ...

  14. Study on Manufacturing Process of Hollow Main Shaft by Open Die Forging

    International Nuclear Information System (INIS)

    Kwon, Yong Chul; Kang, Jong Hun; Kim, Sang Sik

    2016-01-01

    The main shaft is one of the key components connecting the rotor hub and gear box of a wind power generator. Typically, main shafts are manufactured by open die forging method. However, the main shaft for large MW class wind generators is designed to be hollow in order to reduce the weight. Additionally, the main shafts are manufactured by a casting process. This study aims to develop a manufacturing process for hollow main shafts by the open die forging method. The design of a forging process for a solid main shaft and hollow shaft was prepared by an open die forging process design scheme. Finite element analyses were performed to obtain the flow stress by a hot compression test at different temperature and strain rates. The control parameters of each forging process, such as temperature and effective strain, were obtained and compared to predict the suitability of the hollow main shaft forging process. Finally, high productivity reflecting material utilization ratio, internal quality, shape, and dimension was verified by the prototypes manufactured by the proposed forging process for hollow main shafts

  15. Study on Manufacturing Process of Hollow Main Shaft by Open Die Forging

    Energy Technology Data Exchange (ETDEWEB)

Kwon, Yong Chul [Gyeongnam Technopark, Changwon (Korea, Republic of); Kang, Jong Hun [Jungwon Univ., Goisan (Korea, Republic of); Kim, Sang Sik [Gyeongsang National Univ., Jinju (Korea, Republic of)

    2016-02-15

    The main shaft is one of the key components connecting the rotor hub and gear box of a wind power generator. Typically, main shafts are manufactured by open die forging method. However, the main shaft for large MW class wind generators is designed to be hollow in order to reduce the weight. Additionally, the main shafts are manufactured by a casting process. This study aims to develop a manufacturing process for hollow main shafts by the open die forging method. The design of a forging process for a solid main shaft and hollow shaft was prepared by an open die forging process design scheme. Finite element analyses were performed to obtain the flow stress by a hot compression test at different temperature and strain rates. The control parameters of each forging process, such as temperature and effective strain, were obtained and compared to predict the suitability of the hollow main shaft forging process. Finally, high productivity reflecting material utilization ratio, internal quality, shape, and dimension was verified by the prototypes manufactured by the proposed forging process for hollow main shafts.

  16. A radial distribution function-based open boundary force model for multi-centered molecules

    KAUST Repository

    Neumann, Philipp

    2014-06-01

    We derive an expression for radial distribution function (RDF)-based open boundary forcing for molecules with multiple interaction sites. Due to the high-dimensionality of the molecule configuration space and missing rotational invariance, a computationally cheap, 1D approximation of the arising integral expressions as in the single-centered case is not possible anymore. We propose a simple, yet accurate model invoking standard molecule- and site-based RDFs to approximate the respective integral equation. The new open boundary force model is validated for ethane in different scenarios and shows very good agreement with data from periodic simulations. © World Scientific Publishing Company.
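    Since the force model is built on radial distribution functions, a brief reminder of how an RDF is estimated may help. The following is a generic numpy sketch for single-site particles in a periodic cubic box (illustrative only; it is not the paper's multi-centered formulation).

```python
import numpy as np

def radial_distribution(positions, box_length, n_bins=50, r_max=None):
    """Estimate g(r) for particles in a periodic cubic box."""
    n = len(positions)
    r_max = r_max or box_length / 2.0
    edges = np.linspace(0.0, r_max, n_bins + 1)
    counts = np.zeros(n_bins)
    for i in range(n - 1):
        d = positions[i + 1:] - positions[i]
        d -= box_length * np.round(d / box_length)   # minimum-image convention
        r = np.linalg.norm(d, axis=1)
        counts += np.histogram(r, bins=edges)[0]
    density = n / box_length ** 3
    shell_volumes = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    ideal = density * shell_volumes * (n - 1) / 2.0   # expected pair counts
    return 0.5 * (edges[1:] + edges[:-1]), counts / ideal

rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 10.0, size=(200, 3))
r, g = radial_distribution(pos, box_length=10.0)
print(g[:5])   # close to 1 for an ideal (uniform) configuration
```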

  17. Analysis of multi-stage open shop processing systems

    NARCIS (Netherlands)

    Eggermont, C.E.J.; Schrijver, A.; Woeginger, G.J.; Schwentick, T.; Dürr, C.

    2011-01-01

    We study algorithmic problems in multi-stage open shop processing systems that are centered around reachability and deadlock detection questions. We characterize safe and unsafe system states. We show that it is easy to recognize system states that can be reached from the initial state (where the

  18. AVIRIS and TIMS data processing and distribution at the land processes distributed active archive center

    Science.gov (United States)

    Mah, G. R.; Myers, J.

    1993-01-01

The U.S. Government has initiated the Global Change Research Program, a systematic study of the Earth as a complete system. NASA's contribution to the Global Change Research Program is the Earth Observing System (EOS), a series of orbital sensor platforms and an associated data processing and distribution system. The EOS Data and Information System (EOSDIS) is the archiving, production, and distribution system for data collected by the EOS space segment and uses a multilayer architecture for processing, archiving, and distributing EOS data. The first layer consists of the spacecraft ground stations and processing facilities that receive the raw data from the orbiting platforms and then separate the data by individual sensors. The second layer consists of Distributed Active Archive Centers (DAAC) that process, distribute, and archive the sensor data. The third layer consists of a user science processing network. The EOSDIS is being developed in a phased implementation. The initial phase, Version 0, is a prototype of the operational system. Version 0 activities are based upon existing systems and are designed to provide an EOSDIS-like capability for information management and distribution. An important science support task is the creation of simulated data sets for EOS instruments from precursor aircraft or satellite data. The Land Processes DAAC, at the EROS Data Center (EDC), is responsible for archiving and processing EOS precursor data from airborne instruments such as the Thermal Infrared Multispectral Scanner (TIMS), the Thematic Mapper Simulator (TMS), and the Airborne Visible and Infrared Imaging Spectrometer (AVIRIS). AVIRIS, TIMS, and TMS are flown by the NASA-Ames Research Center (ARC) on an ER-2. The ER-2 flies at 65000 feet and can carry up to three sensors simultaneously. Most jointly collected data sets are somewhat boresighted and roughly registered. The instrument data are being used to construct data sets that simulate the spectral and spatial

  19. Assumptions of Customer Knowledge Enablement in the Open Innovation Process

    Directory of Open Access Journals (Sweden)

    Jokubauskienė Raminta

    2017-08-01

Full Text Available In the scientific literature, open innovation is one of the most effective means to innovate and gain a competitive advantage. In practice, there is a variety of open innovation activities, but, nevertheless, customers stand as the cornerstone in this area, since the customers’ knowledge is one of the most important sources of new knowledge and ideas. When evaluating the context in which open innovation and customer knowledge enablement interact, it is necessary to take into account the importance of customer knowledge management. Increasingly it is highlighted that customers’ knowledge management facilitates the creation of innovations. However, other factors that influence open innovation, and, at the same time, customers’ knowledge management, should also be examined. This article presents a theoretical model, which reveals the assumptions of the open innovation process and its impact on the firm’s performance.

  20. Palm distributions for log Gaussian Cox processes

    DEFF Research Database (Denmark)

    Coeurjolly, Jean-Francois; Møller, Jesper; Waagepetersen, Rasmus

    This paper reviews useful results related to Palm distributions of spatial point processes and provides a new result regarding the characterization of Palm distributions for the class of log Gaussian Cox processes. This result is used to study functional summary statistics for a log Gaussian Cox...

  1. Post-Processing Resolution Enhancement of Open Skies Photographic Imagery

    National Research Council Canada - National Science Library

    Sperl, Daniel

    2000-01-01

    ...), which manages implementation of the Open Skies Treaty for the US Air Force, wants to determine if post-processing of the photographic images can improve spatial resolution beyond 30 cm, and if so...

  2. Evolution of the Campanian Ignimbrite Magmatic System II: Trace Element and Th Isotopic Evidence for Open-System Processes

    Science.gov (United States)

    Bohrson, W. A.; Spera, F. J.; Fowler, S.; Belkin, H.; de Vivo, B.

    2005-12-01

    The Campanian Ignimbrite, a large volume (~200 km3 DRE) trachytic to phonolitic ignimbrite was deposited at ~39.3 ka and represents the largest of a number of highly explosive volcanic events in the region near Naples, Italy. Thermodynamic modeling of the major element evolution using the MELTS algorithm (see companion contribution by Fowler et al.) provides detailed information about the identity of and changes in proportions of solids along the liquid line of descent during isobaric fractional crystallization. We have derived trace element mass balance equations that explicitly accommodate changing mineral-melt bulk distribution coefficients during crystallization and also simultaneously satisfy energy and major element mass conservation. Although major element patterns are reasonably modeled assuming closed system fractional crystallization, modeling of trace elements that represent a range of behaviors (e.g. Zr, Nb, Th, U, Rb, Sm, Sr) yields trends for closed system fractionation that are distinct from those observed. These results suggest open-system processes were also important in the evolution of the Campanian magmatic system. Th isotope data yield an apparent isochron that is ~20 kyr younger than the age of the deposit, and age-corrected Th isotope data indicate that the magma body was an open-system at the time of eruption. Because open-system processes can profoundly change isotopic characteristics of a magma body, these results illustrate that it is critical to understand the contribution that open-system processes make to silicic magma bodies prior to assigning relevance to age or timescale information derived from isotope systematics. Fluid-magma interaction has been proposed as a mechanism to change isotopic and elemental characteristics of magma bodies, but an evaluation of the mass and thermal constraints on such a process suggest large-scale fluid-melt interaction at liquidus temperatures is unlikely. In the case of the magma body associated with

  3. Empirical tests of Zipf's law mechanism in open source Linux distribution.

    Science.gov (United States)

    Maillart, T; Sornette, D; Spaeth, S; von Krogh, G

    2008-11-21

    Zipf's power law is a ubiquitous empirical regularity found in many systems, thought to result from proportional growth. Here, we establish empirically the usually assumed ingredients of stochastic growth models that have been previously conjectured to be at the origin of Zipf's law. We use exceptionally detailed data on the evolution of open source software projects in Linux distributions, which offer a remarkable example of a growing complex self-organizing adaptive system, exhibiting Zipf's law over four full decades.
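    A quick way to see what such an empirical test involves is to fit the slope of a log-log rank-size plot; the snippet below does this on synthetic heavy-tailed counts (purely illustrative, not the authors' data or method).

```python
import numpy as np

# Synthetic "package popularity" counts drawn from a heavy-tailed law;
# a fitted rank-size slope near -1 is the classic signature of Zipf's law.
rng = np.random.default_rng(1)
counts = np.sort(rng.pareto(1.0, size=2000) + 1.0)[::-1]
ranks = np.arange(1, counts.size + 1)
slope, intercept = np.polyfit(np.log(ranks), np.log(counts), 1)
print(f"fitted rank-size exponent: {slope:.2f}")
```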

  4. Parallel and Distributed Data Processing Using Autonomous ...

    African Journals Online (AJOL)

Looking at the distributed nature of these networks, data is processed by remote login or Remote Procedure Calls (RPC); this causes congestion in the network bandwidth. This paper proposes a framework where software agents are assigned duties to process the distributed data concurrently and assemble the ...

  5. Process evaluation distributed system

    Science.gov (United States)

    Moffatt, Christopher L. (Inventor)

    2006-01-01

    The distributed system includes a database server, an administration module, a process evaluation module, and a data display module. The administration module is in communication with the database server for providing observation criteria information to the database server. The process evaluation module is in communication with the database server for obtaining the observation criteria information from the database server and collecting process data based on the observation criteria information. The process evaluation module utilizes a personal digital assistant (PDA). A data display module in communication with the database server, including a website for viewing collected process data in a desired metrics form, the data display module also for providing desired editing and modification of the collected process data. The connectivity established by the database server to the administration module, the process evaluation module, and the data display module, minimizes the requirement for manual input of the collected process data.

  6. Design of distributed systems of hydrolithosphere processes management. A synthesis of distributed management systems

    Science.gov (United States)

    Pershin, I. M.; Pervukhin, D. A.; Ilyushin, Y. V.; Afanaseva, O. V.

    2017-10-01

    The paper considers an important problem of designing distributed systems of hydrolithosphere processes management. The control actions on the hydrolithosphere processes under consideration are implemented by a set of extractive wells. The article shows the method of defining the approximation links for description of the dynamic characteristics of hydrolithosphere processes. The structure of distributed regulators, used in the management systems by the considered processes, is presented. The paper analyses the results of the synthesis of the distributed management system and the results of modelling the closed-loop control system by the parameters of the hydrolithosphere process.

  7. In vitro simulation of distribution processes following intramuscular injection

    Directory of Open Access Journals (Sweden)

    Probst Mareike

    2016-09-01

Full Text Available There is an urgent need for in vitro dissolution test setups for intramuscularly applied dosage forms. In particular, biorelevant methods are needed to predict the in vivo behavior of newly developed dosage forms in a realistic way. There is a lack of knowledge regarding the critical in vivo parameters influencing the release and absorption behavior of an intramuscularly applied drug. In the presented work the focus was set on the simulation of blood perfusion and muscle tissue. A solid agarose gel, incorporated in an open-pored foam, was used to mimic the gel phase of muscle tissue and implemented in a flow-through cell. An aqueous solution of fluorescein sodium was injected. Compared to recently obtained in vivo results, the distribution of the model substance was very slow. Furthermore, an agarose gel of lower viscosity, an open-pored foam, and phosphate-buffered saline pH 7.4 were implemented in a multi-channel ceramic membrane serving as a holder for the muscle-imitating material. Blood-simulating release medium was perfused through the ceramic membrane including the filling materials. Transport of the dissolved fluorescein sodium was, in the case of the gel, determined not only by diffusion but also by convective transport processes. The more realistically the muscle-simulating materials were constituted, the less reproducible were the results obtained with the designed test setups.

  8. Efficient calculation of open quantum system dynamics and time-resolved spectroscopy with distributed memory HEOM (DM-HEOM).

    Science.gov (United States)

    Kramer, Tobias; Noack, Matthias; Reinefeld, Alexander; Rodríguez, Mirta; Zelinskyy, Yaroslav

    2018-06-11

Time- and frequency-resolved optical signals provide insights into the properties of light-harvesting molecular complexes, including excitation energies, dipole strengths and orientations, as well as in the exciton energy flow through the complex. The hierarchical equations of motion (HEOM) provide a unifying theory, which allows one to study the combined effects of system-environment dissipation and non-Markovian memory without making restrictive assumptions about weak or strong couplings or separability of vibrational and electronic degrees of freedom. With increasing system size the exact solution of the open quantum system dynamics requires memory and compute resources beyond a single compute node. To overcome this barrier, we developed a scalable variant of HEOM. Our distributed memory HEOM, DM-HEOM, is a universal tool for open quantum system dynamics. It is used to accurately compute all experimentally accessible time- and frequency-resolved processes in light-harvesting molecular complexes with arbitrary system-environment couplings for a wide range of temperatures and complex sizes. © 2018 Wiley Periodicals, Inc.

  9. Large Data Visualization with Open-Source Tools

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Visualization and post-processing of large data have become increasingly challenging and require more and more tools to support the diversity of data to process. In this seminar, we will present a suite of open-source tools supported and developed by Kitware to perform large-scale data visualization and analysis. In particular, we will present ParaView, an open-source tool for parallel visualization of massive datasets, the Visualization Toolkit (VTK), an open-source toolkit for scientific visualization, and Tangelohub, a suite of tools for large data analytics. About the speaker Julien Jomier is directing Kitware's European subsidiary in Lyon, France, where he focuses on European business development. Julien works on a variety of projects in the areas of parallel and distributed computing, mobile computing, image processing, and visualization. He is one of the developers of the Insight Toolkit (ITK), the Visualization Toolkit (VTK), and ParaView. Julien is also leading the CDash project, an open-source co...

  10. Modelling aspects of distributed processing in telecommunication networks

    NARCIS (Netherlands)

    Tomasgard, A; Audestad, JA; Dye, S; Stougie, L; van der Vlerk, MH; Wallace, SW

    1998-01-01

The purpose of this paper is to formally describe new optimization models for telecommunication networks with distributed processing. Modern distributed networks put more focus on the processing of information and less on the actual transportation of data than we are traditionally used to in

  11. An Analysis of OpenACC Programming Model: Image Processing Algorithms as a Case Study

    Directory of Open Access Journals (Sweden)

    M. J. Mišić

    2014-06-01

    Full Text Available Graphics processing units and similar accelerators have been intensively used in general purpose computations for several years. In the last decade, GPU architecture and organization changed dramatically to support an ever-increasing demand for computing power. Along with changes in hardware, novel programming models have been proposed, such as NVIDIA’s Compute Unified Device Architecture (CUDA and Open Computing Language (OpenCL by Khronos group. Although numerous commercial and scientific applications have been developed using these two models, they still impose a significant challenge for less experienced users. There are users from various scientific and engineering communities who would like to speed up their applications without the need to deeply understand a low-level programming model and underlying hardware. In 2011, OpenACC programming model was launched. Much like OpenMP for multicore processors, OpenACC is a high-level, directive-based programming model for manycore processors like GPUs. This paper presents an analysis of OpenACC programming model and its applicability in typical domains like image processing. Three, simple image processing algorithms have been implemented for execution on the GPU with OpenACC. The results were compared with their sequential counterparts, and results are briefly discussed.

  12. A tutorial on Palm distributions for spatial point processes

    DEFF Research Database (Denmark)

    Coeurjolly, Jean-Francois; Møller, Jesper; Waagepetersen, Rasmus Plenge

    2017-01-01

    This tutorial provides an introduction to Palm distributions for spatial point processes. Initially, in the context of finite point processes, we give an explicit definition of Palm distributions in terms of their density functions. Then we review Palm distributions in the general case. Finally, we...

  13. Distribution and emission of oxamyl in a rockwool cultivation system with open drainage of the nutrient solution

    NARCIS (Netherlands)

    Runia, W.T.; Dekker, A.; Houx, N.W.H.

    1995-01-01

    On a 1.8 ha eggplant nursery with open drainage of the excess of nutrient solution the distribution of oxamyl was measured after it had been added to the nutrient solution. When it was applied via injection at a tap of a section, the distribution was more uniform than when applied via the central

  14. Distributed GIS Systems, Open Specifications and Interoperability: How do They Relate to the Sustainable Management of Natural Resources?

    Science.gov (United States)

    Rafael Moreno-Sanchez

    2006-01-01

The aim of this paper is to provide a conceptual framework for the session: “The role of web-based Geographic Information Systems in supporting sustainable management.” The concepts of sustainability, sustainable forest management, Web Services, Distributed Geographic Information Systems, interoperability, Open Specifications, and Open Source Software are defined...

  15. Distributed learning process: principles of design and implementation

    Directory of Open Access Journals (Sweden)

    G. N. Boychenko

    2016-01-01

Full Text Available At the present stage, broad information and communication technologies (ICT) usage in educational practices is one of the leading trends of global education system development. This trend has led to the transformation of instructional interaction models. Scientists have developed the theory of distributed cognition (Salomon, G., Hutchins, E.) and of distributed education and training (Fiore, S. M., Salas, E., Oblinger, D. G., Barone, C. A., Hawkins, B. L.). The educational process is based on two sub-processes, learning and teaching, separated in time and space, which are aimed at the organization of flexible interactions between learners, teachers and educational content located in different non-centralized places. The purpose of this design research is to find a solution for the problem of formalizing distributed learning process design and realization that is significant in instructional design. The solution to this problem should take into account the specifics of distributed interactions between team members, which become the collective subject of distributed cognition in the distributed learning process. This makes it necessary to design the roles and functions of the individual team members performing distributed educational activities. Personal educational objectives should be determined by decomposition of team objectives into the functional roles of its members, considering the personal and learning needs and interests of students. Theoretical and empirical methods used in the study: theoretical analysis of philosophical, psychological, and pedagogical literature on the issue; analysis of international standards in the e-learning domain; exploration of the practical usage of distributed learning in academic and corporate sectors; generalization, abstraction, cognitive modelling, and ontology engineering methods. The result of the research is a methodology for the design and implementation of a distributed learning process based on the competency approach. Methodology proposed by

  16. Image exploitation and dissemination prototype of distributed image processing

    International Nuclear Information System (INIS)

    Batool, N.; Huqqani, A.A.; Mahmood, A.

    2003-05-01

The requirements of image processing applications can best be met by using a distributed environment. This report presents an approach to drawing inferences by utilizing existing LAN resources in a distributed computing environment, using Java and web technology for extensive processing to make it truly system-independent. Although the environment has been tested using image processing applications, its design and architecture are truly general and modular so that it can be used for other applications as well, which require distributed processing. Images originating from the server are fed to the workers along with the desired operations to be performed on them. The Server distributes the task among the Workers who carry out the required operations and send back the results. This application has been implemented using the Remote Method Invocation (RMI) feature of Java. Java RMI allows an object running in one Java Virtual Machine (JVM) to invoke methods on another JVM, thus providing remote communication between programs written in the Java programming language. RMI can therefore be used to develop distributed applications [1]. We undertook this project to gain a better understanding of distributed systems concepts and their use for resource-hungry jobs. The image processing application is developed under this environment
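    The report's Java RMI approach cannot be reproduced here verbatim, but the remote-invocation pattern it relies on can be sketched with Python's standard-library XML-RPC modules (an analogy only; the function name and port below are hypothetical).

```python
import threading
from xmlrpc.client import ServerProxy
from xmlrpc.server import SimpleXMLRPCServer

def invert(pixels):
    # stand-in for an image-processing operation executed on a worker
    return [255 - p for p in pixels]

# The worker exposes the operation over the network, much like an RMI remote object.
worker = SimpleXMLRPCServer(("localhost", 8000), logRequests=False)
worker.register_function(invert, "invert")
threading.Thread(target=worker.serve_forever, daemon=True).start()

# The "server" side invokes the remote method through a proxy (an RMI-style stub).
proxy = ServerProxy("http://localhost:8000/")
print(proxy.invert([0, 64, 128, 255]))   # [255, 191, 127, 0]
```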

  17. Production of Polystyrene Open-celled Microcellular Foam in Batch Process by Super Critical CO2

    Directory of Open Access Journals (Sweden)

    M.S. Enayati

    2010-12-01

Full Text Available Open-celled foams are capable of allowing the passage of fluids through their structure because of the interconnections between the open cells or bubbles, and therefore these structures can be used as membranes and filters. In this work, we have studied the production of polystyrene open-celled microcellular foam using CO2 as the blowing agent. To achieve such structures, it is necessary to control the stages of growth in such a way that the cells connect to each other through pores without any coalescence. The processing conditions required to achieve open-celled structures can be predicted by an open-cell model theory. This model suggests that a saturation pressure of at least 130 bar and a foaming time between 9 and 58 s are required for this system. The temperature range has been selected to be both higher than the polymer glass transition temperature and to facilitate the foaming process. Experimental results in the batch foaming process have verified the model quite well. The SEM and mercury porosimetry tests show the presence of pores between the cells with an open-celled structure. Experimental results show that by increasing the saturation pressure and the foaming temperature, there is a drop in the time required for open-celled structure formation. A 130 bar saturation pressure, 150 °C foaming temperature and 60 s foaming time lead to the attainment of open-celled microcellular foam based on the polystyrene/CO2 system in the batch process.

  18. EFFICIENT LIDAR POINT CLOUD DATA MANAGING AND PROCESSING IN A HADOOP-BASED DISTRIBUTED FRAMEWORK

    Directory of Open Access Journals (Sweden)

    C. Wang

    2017-10-01

Full Text Available Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering and others. However, it is challenging to efficiently store, query and analyze the high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, which takes advantage of Hadoop’s storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to conduct the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experiment results show that the proposed framework can efficiently manage and process big LiDAR data.

  19. Efficient LIDAR Point Cloud Data Managing and Processing in a Hadoop-Based Distributed Framework

    Science.gov (United States)

    Wang, C.; Hu, F.; Sha, D.; Han, X.

    2017-10-01

Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering and others. However, it is challenging to efficiently store, query and analyze the high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, which takes advantage of Hadoop's storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to conduct the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experiment results show that the proposed framework can efficiently manage and process big LiDAR data.
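    A common way to run point-cloud work on Hadoop without special bindings is Hadoop Streaming, where the mapper and reducer are ordinary scripts reading standard input. The sketch below is not the authors' framework; the "x y z" record format and the 100 m tile size are assumptions. It bins points into tiles and counts points per tile.

```python
#!/usr/bin/env python3
import sys

def mapper(stream):
    # emit "tileX_tileY <tab> 1" for every "x y z" point record
    for line in stream:
        x, y, _z = map(float, line.split())
        print(f"{int(x // 100)}_{int(y // 100)}\t1")

def reducer(stream):
    # Hadoop sorts mapper output by key, so equal keys arrive together
    current, count = None, 0
    for line in stream:
        key, value = line.rstrip("\n").split("\t")
        if key != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = key, 0
        count += int(value)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    (mapper if sys.argv[1] == "map" else reducer)(sys.stdin)
```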

  20. SUPERPOSITION OF STOCHASTIC PROCESSES AND THE RESULTING PARTICLE DISTRIBUTIONS

    International Nuclear Information System (INIS)

    Schwadron, N. A.; Dayeh, M. A.; Desai, M.; Fahr, H.; Jokipii, J. R.; Lee, M. A.

    2010-01-01

    Many observations of suprathermal and energetic particles in the solar wind and the inner heliosheath show that distribution functions scale approximately with the inverse of particle speed (v) to the fifth power. Although there are exceptions to this behavior, there is a growing need to understand why this type of distribution function appears so frequently. This paper develops the concept that a superposition of exponential and Gaussian distributions with different characteristic speeds and temperatures shows power-law tails. The particular type of distribution function, f ∝ v⁻⁵, appears in a number of different ways: (1) a series of Poisson-like processes where entropy is maximized with the rates of individual processes inversely proportional to the characteristic exponential speed, (2) a series of Gaussian distributions where the entropy is maximized with the rates of individual processes inversely proportional to temperature and the density of individual Gaussian distributions proportional to temperature, and (3) a series of different diffusively accelerated energetic particle spectra with individual spectra derived from observations (1997-2002) of a multiplicity of different shocks. Thus, we develop a proof-of-concept for the superposition of stochastic processes that give rise to power-law distribution functions.
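    As an illustrative back-of-the-envelope version of mechanism (1) above (not the paper's full derivation), superposing exponential distributions over a distribution of rates shows how a v⁻⁵ tail can emerge; the particular rate weight, with cutoff speed v_c, is an assumed example:

    f(v) = \int_0^\infty w(\beta)\, e^{-\beta v}\, d\beta , \qquad w(\beta) \propto \beta^{4} e^{-\beta v_c}
    \;\Longrightarrow\;
    f(v) \propto \int_0^\infty \beta^{4} e^{-\beta (v+v_c)}\, d\beta = \frac{\Gamma(5)}{(v+v_c)^{5}} \sim 24\, v^{-5} \quad (v \gg v_c).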

  1. OpenGeoSys: An open-source initiative for numerical simulation of thermo-hydro-mechanical/chemical (THM/C) processes in porous media

    DEFF Research Database (Denmark)

    Kolditz, O.; Bauer, S.; Bilke, L.

    In this paper we describe the OpenGeoSys (OGS) project, which is a scientific open-source initiative for numerical simulation of thermo-hydro-mechanical/chemical processes in porous media. The basic concept is to provide a flexible numerical framework (using primarily the Finite Element Method (FEM...

  2. First-Passage-Time Distribution for Variable-Diffusion Processes

    Science.gov (United States)

    Barney, Liberty; Gunaratne, Gemunu H.

    2017-05-01

    First-passage-time distribution, which presents the likelihood of a stock reaching a pre-specified price at a given time, is useful in establishing the value of financial instruments and in designing trading strategies. First-passage-time distribution for Wiener processes has a single peak, while that for stocks exhibits a notable second peak within a trading day. This feature has only been discussed sporadically—often dismissed as due to insufficient/incorrect data or circumvented by conversion to tick time—and to the best of our knowledge has not been explained in terms of the underlying stochastic process. It was shown previously that intra-day variations in the market can be modeled by a stochastic process containing two variable-diffusion processes (Hua et al. in, Physica A 419:221-233, 2015). We show here that the first-passage-time distribution of this two-stage variable-diffusion model does exhibit a behavior similar to the empirical observation. In addition, we find that an extended model incorporating overnight price fluctuations exhibits intra- and inter-day behavior similar to those of empirical first-passage-time distributions.
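    The record above summarizes the model only verbally. As a hedged numerical illustration of the general idea (not the Hua et al. model itself), the following Monte Carlo sketch simulates a diffusion whose amplitude is elevated near the open and close of a trading day and estimates the resulting first-passage-time distribution; the diffusion profile, threshold and step count are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigma(t):
    """Assumed intra-day diffusion amplitude: elevated near open (t=0) and close (t=1)."""
    return 0.02 + 0.03 * (np.exp(-t / 0.1) + np.exp(-(1.0 - t) / 0.1))

def first_passage_times(threshold=0.01, n_paths=20000, n_steps=390):
    """Simulate log-return paths over one trading day (t in [0, 1]) and record
    the first time each path exceeds `threshold`."""
    dt = 1.0 / n_steps
    fpt = np.full(n_paths, np.nan)       # NaN marks paths that never cross
    x = np.zeros(n_paths)
    alive = np.ones(n_paths, dtype=bool)
    for t in np.linspace(0.0, 1.0, n_steps):
        x[alive] += sigma(t) * np.sqrt(dt) * rng.standard_normal(alive.sum())
        crossed = alive & (x >= threshold)
        fpt[crossed] = t
        alive &= ~crossed
    return fpt[~np.isnan(fpt)]

times = first_passage_times()
hist, edges = np.histogram(times, bins=40, range=(0.0, 1.0), density=True)
print("estimated FPT density peaks near t =", edges[np.argmax(hist)])
```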

  3. Experimental investigation of the ion current distribution in microsecond plasma opening switch

    Energy Technology Data Exchange (ETDEWEB)

    Bystritskij, V; Grigor'ev, S; Kharlov, A; Sinebryukhov, A [Russian Academy of Sciences, Tomsk (Russian Federation). Institute of Electrophysics

    1997-12-31

    This paper is devoted to investigating the properties of the microsecond plasma opening switch (MPOS) as an ion beam source for surface modification. Two plasma sources were investigated: flash-board and cable guns. Detailed measurements of the axial and azimuthal distributions of ion current density in the switch were performed. It was found that the azimuthal inhomogeneity of the ion beam increases from the beginning to the end of the MPOS. The advantages and problems of this approach are discussed. (author). 5 figs., 2 refs.

  4. Limiting conditional distributions for birth-death processes

    NARCIS (Netherlands)

    Kijima, M.; Nair, M.G.; Pollett, P.K.; van Doorn, Erik A.

    1997-01-01

    In a recent paper one of us identified all of the quasi-stationary distributions for a non-explosive, evanescent birth-death process for which absorption is certain, and established conditions for the existence of the corresponding limiting conditional distributions. Our purpose is to extend these

  5. Article Processing Charges and OpenAPC

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    The publication landscape is about to change. While largely operated by subscription-based journals in the past, recent political decisions are pushing the publishing industry towards Open Access. In particular, the publication of the Finch report in 2012 put APC-based Gold Open Access models almost everywhere on the agenda. These models also require considerable adaptation of library workflows to handle payments, bills and centralized funds for publication fees. Sometimes handled in specialized systems (e.g. the first setups in Jülich), discussions started early on about handling APCs in local repositories, which would also hold the Open Access content resulting from these fees; the University of Regensburg, for example, uses ePrints for this purpose. Backed by the Open Data movement, libraries also saw an opportunity to exchange data about fees paid. Thus, OpenAPC.de was born in 2014 on GitHub to facilitate this exchange and aggregate large amounts of data for evaluation and comparison. Using the repository to hold payment d...

  6. OpenTopography: Enabling Online Access to High-Resolution Lidar Topography Data and Processing Tools

    Science.gov (United States)

    Crosby, Christopher; Nandigam, Viswanath; Baru, Chaitan; Arrowsmith, J. Ramon

    2013-04-01

    High-resolution topography data acquired with lidar (light detection and ranging) technology are revolutionizing the way we study the Earth's surface and overlying vegetation. These data, collected from airborne, tripod, or mobile-mounted scanners have emerged as a fundamental tool for research on topics ranging from earthquake hazards to hillslope processes. Lidar data provide a digital representation of the earth's surface at a resolution sufficient to appropriately capture the processes that contribute to landscape evolution. The U.S. National Science Foundation-funded OpenTopography Facility (http://www.opentopography.org) is a web-based system designed to democratize access to earth science-oriented lidar topography data. OpenTopography provides free, online access to lidar data in a number of forms, including the raw point cloud and associated geospatial-processing tools for customized analysis. The point cloud data are co-located with on-demand processing tools to generate digital elevation models, and derived products and visualizations which allow users to quickly access data in a format appropriate for their scientific application. The OpenTopography system is built using a service-oriented architecture (SOA) that leverages cyberinfrastructure resources at the San Diego Supercomputer Center at the University of California San Diego to allow users, regardless of expertise level, to access these massive lidar datasets and derived products for use in research and teaching. OpenTopography hosts over 500 billion lidar returns covering 85,000 km2. These data are all in the public domain and are provided by a variety of partners under joint agreements and memoranda of understanding with OpenTopography. Partners include national facilities such as the NSF-funded National Center for Airborne Lidar Mapping (NCALM), as well as non-governmental organizations and local, state, and federal agencies. OpenTopography has become a hub for high-resolution topography

  7. Open Technology Development: Roadmap Plan

    National Research Council Canada - National Science Library

    Herz, J. C; Lucas, Mark; Scott, John

    2006-01-01

    .... Collaborative and distributed online tools; and 4. Technological Agility. Open standards and interfaces were initially established through ARPA and distributed via open source software reference implementations...

  8. Extrusion Process by Finite Volume Method Using OpenFoam Software

    International Nuclear Information System (INIS)

    Matos Martins, Marcelo; Tonini Button, Sergio; Divo Bressan, Jose; Ivankovic, Alojz

    2011-01-01

    Computational codes are very important tools for solving engineering problems. In the analysis of metal forming processes, such as extrusion, this is no different, because computational codes allow the process to be analyzed at reduced cost. Traditionally, the Finite Element Method is used to solve solid mechanics problems; however, the Finite Volume Method (FVM) has been gaining ground in this field of application. This paper presents velocity field and friction coefficient variation results obtained by numerical simulation, using the OpenFOAM software and the FVM to solve an aluminum direct cold extrusion process.

  9. Distributed Aerodynamic Sensing and Processing Toolbox

    Science.gov (United States)

    Brenner, Martin; Jutte, Christine; Mangalam, Arun

    2011-01-01

    A Distributed Aerodynamic Sensing and Processing (DASP) toolbox was designed and fabricated for flight test applications with an Aerostructures Test Wing (ATW) mounted under the fuselage of an F-15B on the Flight Test Fixture (FTF). DASP monitors and processes the aerodynamics with the structural dynamics using nonintrusive, surface-mounted, hot-film sensing. This aerodynamic measurement tool benefits programs devoted to static/dynamic load alleviation, body freedom flutter suppression, buffet control, improvement of aerodynamic efficiency through cruise control, supersonic wave drag reduction through shock control, etc. This DASP toolbox measures local and global unsteady aerodynamic load distribution with distributed sensing. It determines correlation between aerodynamic observables (aero forces) and structural dynamics, and allows control authority increase through aeroelastic shaping and active flow control. It offers improvements in flutter suppression and, in particular, body freedom flutter suppression, as well as aerodynamic performance of wings for increased range/endurance of manned/ unmanned flight vehicles. Other improvements include inlet performance with closed-loop active flow control, and development and validation of advanced analytical and computational tools for unsteady aerodynamics.

  10. Locality-Aware Task Scheduling and Data Distribution for OpenMP Programs on NUMA Systems and Manycore Processors

    Directory of Open Access Journals (Sweden)

    Ananya Muddukrishna

    2015-01-01

    Full Text Available Performance degradation due to nonuniform data access latencies has worsened on NUMA systems and can now be felt on-chip in manycore processors. Distributing data across NUMA nodes and manycore processor caches is necessary to reduce the impact of nonuniform latencies. However, techniques for distributing data are error-prone and fragile and require low-level architectural knowledge. Existing task scheduling policies favor quick load-balancing at the expense of locality and ignore NUMA node/manycore cache access latencies while scheduling. Locality-aware scheduling, in conjunction with or as a replacement for existing scheduling, is necessary to minimize NUMA effects and sustain performance. We present a data distribution and locality-aware scheduling technique for task-based OpenMP programs executing on NUMA systems and manycore processors. Our technique relieves the programmer from thinking of NUMA system/manycore processor architecture details by delegating data distribution to the runtime system and uses task data dependence information to guide the scheduling of OpenMP tasks to reduce data stall times. We demonstrate our technique on a four-socket AMD Opteron machine with eight NUMA nodes and on the TILEPro64 processor and identify that data distribution and locality-aware task scheduling improve performance up to 69% for scientific benchmarks compared to default policies and yet provide an architecture-oblivious approach for programmers.

  11. Open Access, Library Subscriptions, and Article Processing Charges

    KAUST Repository

    Vijayakumar, J.K.

    2016-05-01

    Hybrid journals contain articles behind a paywall, available by subscription, as well as papers made open access when the author pays an article processing charge (APC). In such cases, an institution can end up paying twice, and publishers tend to double-dip. Discussions and pilot models are emerging on pricing options, such as “offset pricing” [where APCs are adjusted or discounted against subscription costs as vouchers or reductions in next year's subscriptions, APCs beyond the subscription costs are modestly capped, etc.], which can reduce institutions' costs. This presentation will explain the different models available and how we can attain a transparent costing structure in which the scholarly community can feel the fairness of publishers' pricing mechanisms. Though most of the offset systems are developed through national-level or consortium-level negotiations, the experience of individual institutions like KAUST, which subscribe to large e-journal collections, is important in making the right decisions on saving institutional costs and supporting openness in scholarly communications.

  12. Open Access, Library Subscriptions, and Article Processing Charges

    KAUST Repository

    Vijayakumar, J.K.; Tamarkin, Molly

    2016-01-01

    Hybrid journals contain articles behind a paywall, available by subscription, as well as papers made open access when the author pays an article processing charge (APC). In such cases, an institution can end up paying twice, and publishers tend to double-dip. Discussions and pilot models are emerging on pricing options, such as “offset pricing” [where APCs are adjusted or discounted against subscription costs as vouchers or reductions in next year's subscriptions, APCs beyond the subscription costs are modestly capped, etc.], which can reduce institutions' costs. This presentation will explain the different models available and how we can attain a transparent costing structure in which the scholarly community can feel the fairness of publishers' pricing mechanisms. Though most of the offset systems are developed through national-level or consortium-level negotiations, the experience of individual institutions like KAUST, which subscribe to large e-journal collections, is important in making the right decisions on saving institutional costs and supporting openness in scholarly communications.

  13. Hydrocarbon distributions in sediments of the open area of the Arabian Gulf following the 1991 Gulf War oil spill

    International Nuclear Information System (INIS)

    Al-Lihaibi, S.S.; Ghazi, S.J.

    1997-01-01

    Surface sediments collected from the open area of the Arabian Gulf were analysed for total petroleum hydrocarbons and specific aliphatic hydrocarbon components in order to provide information on the extent of oil contamination and the degree of weathering of the spilled oil following the Gulf War. The surface distribution of the petroleum hydrocarbons showed an increasing trend towards the north-east, and among the individual transects there was a pronounced increasing trend towards the north-west direction. Despite off-shore oil-related activities as well as a potential impact from the 1991 oil spill, the concentrations of petroleum hydrocarbons in the study area were relatively low. This finding may be attributed to the effectiveness of weathering processes. (author)

  14. Characteristics of the Audit Processes for Distributed Informatics Systems

    Directory of Open Access Journals (Sweden)

    Marius POPA

    2009-01-01

    Full Text Available The paper addresses the following issues: the main characteristics of distributed informatics systems, with examples, and the main categories of differences among them; concepts, principles, techniques and fields for auditing distributed informatics systems; and the concept and classes of standards, their characteristics, and examples of standards, guidelines, procedures and controls for auditing distributed informatics systems. Distributed informatics systems are characterized by the following aspects: development process, resources, implemented functionalities, architectures, system classes and particularities. The audit framework has two sides: the audit process and the auditors. The audit process must be conducted in accordance with the standard specifications of the IT&C field. The auditors must adhere to ethical principles and have a high level of professional skill and competence in the IT&C field.

  15. Distributed open environment for data retrieval based on pattern recognition techniques

    International Nuclear Information System (INIS)

    Pereira, A.; Vega, J.; Castro, R.; Portas, A.

    2010-01-01

    Pattern recognition methods for data retrieval have been applied to fusion databases for the localization and extraction of similar waveforms within temporal evolution signals. In order to standardize the use of these methods, a distributed open environment has been designed. It is based on a client/server architecture that supports distribution, interoperability and portability between heterogeneous platforms. The server part is a single desktop application based on J2EE (Java 2 Enterprise Edition), which provides a mature standard framework and a modular architecture. It can handle transactions and concurrency of components that are deployed on JETTY, an embedded web container within the Java server application for providing HTTP services. The data management is based on Apache DERBY, a relational database engine also embedded in the same Java-based solution. This encapsulation allows hiding of unnecessary details about the installation, distribution, and configuration of all these components but with the flexibility to create and allocate many databases on different servers. The DERBY network module increases the scope of the installed database engine by providing traditional Java database network connections (JDBC-TCP/IP). This avoids scattering several database engines (a unique embedded engine defines the rules for accessing the distributed data). Java thin clients (Java 5 or above is the only requirement) can be executed on the same computer as the server program (for example a desktop computer), but server and client software can also be distributed in a remote participation environment (wide area networks). The thin client provides a graphic user interface to look for patterns (entire waveforms or specific structural forms) and display the most similar ones. This is obtained with HTTP requests and by generating dynamic content (servlets) in response to these client requests.
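    The records for this work (this one and the two that follow) describe the client/server design only at a high level. Purely as an illustrative sketch of the idea, the following self-contained Python script stands up a tiny HTTP server that returns the stored waveforms most similar to a posted query, together with a thin client that queries it; the port, JSON payload and Euclidean similarity measure are assumptions, not the actual CIEMAT servlet interface, and the in-memory dictionary stands in for the embedded DERBY database.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

import numpy as np

# Assumed in-memory "database" of named waveforms (the real system uses DERBY).
DB = {f"signal_{i}": np.sin(np.linspace(0, 2 * np.pi, 64) * (i + 1)) for i in range(5)}

class SimilarityHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Client posts {"waveform": [...], "k": n}; server returns the n closest signals.
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        query = np.asarray(body["waveform"])
        dists = {name: float(np.linalg.norm(query - wf)) for name, wf in DB.items()}
        best = sorted(dists, key=dists.get)[: body.get("k", 3)]
        reply = json.dumps({"matches": [{"name": n, "distance": dists[n]} for n in best]})
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(reply.encode())

    def log_message(self, *args):  # keep the demo quiet
        pass

def run_demo():
    server = HTTPServer(("localhost", 8642), SimilarityHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()

    # Thin client: post a query waveform and print the server's reply.
    query = np.sin(np.linspace(0, 2 * np.pi, 64) * 2.1).tolist()
    req = urllib.request.Request(
        "http://localhost:8642/search",
        data=json.dumps({"waveform": query, "k": 2}).encode(),
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp))
    server.shutdown()

if __name__ == "__main__":
    run_demo()
```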

  16. Distributed Open Environment for Data Retrieval based on Pattern Recognition Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, A.; Vega, J.; Castro, R.; Portas, A. [Association EuratomCIEMAT para Fusion, Madrid (Spain)

    2009-07-01

    Full text of publication follows: Pattern recognition methods for data retrieval have been applied to fusion databases for the localization and extraction of similar waveforms within temporal evolution signals. In order to standardize the use of these methods, a distributed open environment has been designed. It is based on a client/server architecture that supports distribution, inter-operability and portability between heterogeneous platforms. The server part is a single desktop application based on J2EE, which provides a mature standard framework and a modular architecture. It can handle transactions and concurrency of components that are deployed on JETTY, an embedded web container within the Java server application for providing HTTP services. The data management is based on Apache DERBY, a relational database engine also embedded in the same Java-based solution. This encapsulation allows concealment of unnecessary details about the installation, distribution, and configuration of all these components but with the flexibility to create and allocate many databases on different servers. The DERBY network module increases the scope of the installed database engine by providing traditional Java database network connections (JDBC-TCP/IP). This avoids scattering several database engines (a unique embedded engine defines the rules for accessing the distributed data). Java thin clients (Java 5 or above is the only requirement) can be executed on the same computer as the server program (for example a desktop computer), but server and client software can also be distributed in a remote participation environment (wide area networks). The thin client provides a graphic user interface to look for patterns (entire waveforms or specific structural forms) and display the most similar ones. This is obtained with HTTP requests and by generating dynamic content (servlets) in response to these client requests. (authors)

  17. Distributed open environment for data retrieval based on pattern recognition techniques

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, A., E-mail: augusto.pereira@ciemat.e [Asociacion EURATOM/CIEMAT para Fusion, CIEMAT, Edificio 66, Avda. Complutense, 22, 28040 Madrid (Spain); Vega, J.; Castro, R.; Portas, A. [Asociacion EURATOM/CIEMAT para Fusion, CIEMAT, Edificio 66, Avda. Complutense, 22, 28040 Madrid (Spain)

    2010-07-15

    Pattern recognition methods for data retrieval have been applied to fusion databases for the localization and extraction of similar waveforms within temporal evolution signals. In order to standardize the use of these methods, a distributed open environment has been designed. It is based on a client/server architecture that supports distribution, interoperability and portability between heterogeneous platforms. The server part is a single desktop application based on J2EE (Java 2 Enterprise Edition), which provides a mature standard framework and a modular architecture. It can handle transactions and concurrency of components that are deployed on JETTY, an embedded web container within the Java server application for providing HTTP services. The data management is based on Apache DERBY, a relational database engine also embedded in the same Java-based solution. This encapsulation allows hiding of unnecessary details about the installation, distribution, and configuration of all these components but with the flexibility to create and allocate many databases on different servers. The DERBY network module increases the scope of the installed database engine by providing traditional Java database network connections (JDBC-TCP/IP). This avoids scattering several database engines (a unique embedded engine defines the rules for accessing the distributed data). Java thin clients (Java 5 or above is the only requirement) can be executed on the same computer as the server program (for example a desktop computer), but server and client software can also be distributed in a remote participation environment (wide area networks). The thin client provides a graphic user interface to look for patterns (entire waveforms or specific structural forms) and display the most similar ones. This is obtained with HTTP requests and by generating dynamic content (servlets) in response to these client requests.

  18. Joint probability distributions for a class of non-Markovian processes.

    Science.gov (United States)

    Baule, A; Friedrich, R

    2005-02-01

    We consider joint probability distributions for the class of coupled Langevin equations introduced by Fogedby [H. C. Fogedby, Phys. Rev. E 50, 1657 (1994)]. We generalize well-known results for the single-time probability distributions to the case of N -time joint probability distributions. It is shown that these probability distribution functions can be obtained by an integral transform from distributions of a Markovian process. The integral kernel obeys a partial differential equation with fractional time derivatives reflecting the non-Markovian character of the process.
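    For the single-time case, the integral-transform structure mentioned in the abstract takes the familiar subordination form sketched below, and the N-time generalization replaces the kernel and the Markovian distribution by their multi-time counterparts; the notation (f_M for the Markovian distribution, h for the kernel, which obeys a partial differential equation with fractional time derivatives) is introduced here for illustration only and is schematic:

    f(x,t) = \int_0^\infty h(s,t)\, f_M(x,s)\, ds ,
    \qquad
    f(x_2,t_2; x_1,t_1) = \int_0^\infty \!\! \int_0^\infty h(s_2,t_2; s_1,t_1)\, f_M(x_2,s_2; x_1,s_1)\, ds_1\, ds_2 .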

  19. Exploring Coordination Structures in Open Source Software Development

    NARCIS (Netherlands)

    van Hillegersberg, Jos; Harmsen, Frank; Hegeman, J.H.; Amrit, Chintan Amrit; Geisberger, Eva; Keil, Patrick; Kuhrmann, Marco

    2007-01-01

    Coordination is difficult to achieve in a large globally distributed project setting. The problem is multiplied in open source software development projects, where most of the traditional means of coordination such as plans, system-level designs, schedules and defined process are not used. In order

  20. Standardization as an Arena for Open Innovation

    Science.gov (United States)

    Grøtnes, Endre

    This paper argues that anticipatory standardization can be viewed as an arena for open innovation and shows this through two cases from mobile telecommunication standardization. One case is the Android initiative by Google and the Open Handset Alliance, while the second case is the general standardization work of the Open Mobile Alliance. The paper shows how anticipatory standardization intentionally uses inbound and outbound streams of research and intellectual property to create new innovations. This is at the heart of the open innovation model. The standardization activities use both pooling of R&D and the distribution of freely available toolkits to create products and architectures that can be utilized by the participants and third parties to leverage their innovation. The paper shows that the technology being standardized needs to have a systemic nature to be part of an open innovation process.

  1. A Technical Survey on Optimization of Processing Geo Distributed Data

    Science.gov (United States)

    Naga Malleswari, T. Y. J.; Ushasukhanya, S.; Nithyakalyani, A.; Girija, S.

    2018-04-01

    With growing cloud services and technology, the number of geographically distributed data centers storing large amounts of data is increasing. Analysis of geo-distributed data is required by various services for data processing, storage of essential information, and similar tasks; processing this geo-distributed data and performing analytics on it is a challenging task. Distributed data processing is accompanied by issues in storage, computation and communication, the key ones being time efficiency, cost minimization and utility maximization. This paper describes various optimization methods, such as end-to-end multiphase and G-MR, using techniques like Map-Reduce, CDS (Community Detection based Scheduling), ROUT, Workload-Aware Scheduling, SAGE and AMP (Ant Colony Optimization) to handle these issues. In this paper the various optimization methods and techniques used are analyzed. It has been observed that end-to-end multiphase achieves time efficiency; cost minimization concentrates on achieving Quality of Service and reducing computation and communication costs; and SAGE achieves performance improvement in processing geo-distributed data sets.

  2. Deterministic Design Optimization of Structures in OpenMDAO Framework

    Science.gov (United States)

    Coroneos, Rula M.; Pai, Shantaram S.

    2012-01-01

    Nonlinear programming algorithms play an important role in structural design optimization. Several such algorithms have been implemented in the OpenMDAO framework developed at NASA Glenn Research Center (GRC). OpenMDAO is an open source engineering analysis framework, written in Python, for analyzing and solving Multi-Disciplinary Analysis and Optimization (MDAO) problems. It provides a number of solvers and optimizers, referred to as components and drivers, which users can leverage to build new tools and processes quickly and efficiently. Users may download, use, modify, and distribute the OpenMDAO software at no cost. This paper summarizes the process involved in analyzing and optimizing structural components by utilizing the framework's structural solvers and several gradient-based optimizers along with a multi-objective genetic algorithm. For comparison purposes, the same structural components were analyzed and optimized using CometBoards, a NASA GRC developed code. The reliability and efficiency of the OpenMDAO framework were compared and are reported here.
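    For readers who want to see what such a setup looks like in code, here is a minimal sketch using the current open-source OpenMDAO Python API (which differs from the 2012-era framework described in the report); the bar-sizing problem, variable names, bounds and material values are invented for illustration.

```python
import openmdao.api as om

# Hypothetical bar-sizing problem: minimize mass subject to a stress limit.
prob = om.Problem()
prob.model.add_subsystem(
    'bar', om.ExecComp('mass = rho * area * length', rho=2700.0, length=1.0),
    promotes=['*'])
prob.model.add_subsystem(
    'stress', om.ExecComp('sigma = load / area', load=1.0e5),
    promotes=['*'])

prob.driver = om.ScipyOptimizeDriver(optimizer='SLSQP')
prob.model.add_design_var('area', lower=1.0e-4, upper=1.0e-2)       # m^2
prob.model.add_objective('mass')
prob.model.add_constraint('sigma', upper=250.0e6, ref=1.0e8)        # Pa, scaled for the optimizer

prob.setup()
prob.set_val('area', 5.0e-3)   # feasible starting point
prob.run_driver()
print('area =', prob.get_val('area'), 'mass =', prob.get_val('mass'))
```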

  3. New insight into California’s drought through open data

    Science.gov (United States)

    Read, Emily K.; Bucknell, Mary; Hines, Megan K.; Kreft, James M.; Lucido, Jessica M.; Read, Jordan S.; Schroedl, Carl; Sibley, David M.; Stephan, Shirley; Suftin, Ivan; Thongsavanh, Phethala; Van Den Hoek, Jamon; Walker, Jordan I.; Wernimont, Martin R; Winslow, Luke A.; Yan, Andrew N.

    2015-01-01

    Historically unprecedented drought in California has brought water issues to the forefront of the nation’s attention. Crucial investigations that concern water policy, management, and research, in turn, require extensive information about the quality and quantity of California’s water. Unfortunately, key sources of pertinent data are unevenly distributed and frequently hard to find. Thankfully, the vital importance of integrating water data across federal, state, and tribal, academic, and private entities, has recently been recognized and addressed through federal initiatives such as the Climate Data Initiative of President Obama’s Climate Action Plan and the Advisory Committee on Water Information’s Open Water Data Initiative. Here, we demonstrate an application of integrated open water data, visualized and made available online using open source software, for the purpose of exploring the impact of the current California drought. Our collaborative approach and technical tools enabled a rapid, distributed development process. Many positive outcomes have resulted: the application received recognition within and outside of the Federal Government, inspired others to visualize open water data, spurred new collaborations for our group, and strengthened the collaborative relationships within the team of developers. In this article, we describe the technical tools and collaborative process that enabled the success of the application.

  4. 40 CFR 761.80 - Manufacturing, processing and distribution in commerce exemptions.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Manufacturing, processing and..., PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Exemptions § 761.80 Manufacturing, processing and... any change in the manner of processing and distributing, importing (manufacturing), or exporting of...

  5. Mistaking geography for biology: inferring processes from species distributions.

    Science.gov (United States)

    Warren, Dan L; Cardillo, Marcel; Rosauer, Dan F; Bolnick, Daniel I

    2014-10-01

    Over the past few decades, there has been a rapid proliferation of statistical methods that infer evolutionary and ecological processes from data on species distributions. These methods have led to considerable new insights, but they often fail to account for the effects of historical biogeography on present-day species distributions. Because the geography of speciation can lead to patterns of spatial and temporal autocorrelation in the distributions of species within a clade, this can result in misleading inferences about the importance of deterministic processes in generating spatial patterns of biodiversity. In this opinion article, we discuss ways in which patterns of species distributions driven by historical biogeography are often interpreted as evidence of particular evolutionary or ecological processes. We focus on three areas that are especially prone to such misinterpretations: community phylogenetics, environmental niche modelling, and analyses of beta diversity (compositional turnover of biodiversity). Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.

  6. Mapping innovation processes: Visual techniques for opening and presenting the black box of service innovation processes

    DEFF Research Database (Denmark)

    Olesen, Anne Rørbæk

    2017-01-01

    This chapter argues for the usefulness of visual mapping techniques for performing qualitative analysis of complex service innovation processes. Different mapping formats are presented, namely, matrices, networks, process maps, situational analysis maps and temporal situational analysis maps....... For the purpose of researching service innovation processes, the three latter formats are argued to be particularly interesting. Process maps can give an overview of different periods and milestones in a process in one carefully organized location. Situational analysis maps and temporal situational analysis maps...... can open up complexities of service innovation processes, as well as close them down for presentational purposes. The mapping formats presented are illustrated by displaying maps from an exemplary research project, and the chapter is concluded with a brief discussion of the limitations and pitfalls...

  7. The Open Method of Coordination and the Implementation of the Bologna Process

    Science.gov (United States)

    Veiga, Amelia; Amaral, Alberto

    2006-01-01

    In this paper the authors argue that the use of the Open Method of Coordination (OMC) in the implementation of the Bologna process presents coordination problems that do not allow for the full coherence of the results. As the process is quite complex, involving three different levels (European, national and local) and as the final actors in the…

  8. Distributed quantum information processing via quantum dot spins

    International Nuclear Information System (INIS)

    Jun, Liu; Qiong, Wang; Le-Man, Kuang; Hao-Sheng, Zeng

    2010-01-01

    We propose a scheme to engineer a non-local two-qubit phase gate between two remote quantum-dot spins. Along with one-qubit local operations, one can in principle perform various types of distributed quantum information processing. The scheme employs a linearly polarised photon interacting one after the other with two remote quantum-dot spins in cavities. Due to the optical spin selection rule, the photon acquires a Faraday rotation after the interaction process. By measuring the polarisation of the final output photon, a non-local two-qubit phase gate between the two remote quantum-dot spins is constituted. Our scheme may have very important applications in distributed quantum information processing.

  9. Open Data and Open Science for better Research in the Geo and Space Domain

    Science.gov (United States)

    Ritschel, B.; Seelus, C.; Neher, G.; Iyemori, T.; Koyama, Y.; Yatagai, A. I.; Murayama, Y.; King, T. A.; Hughes, S.; Fung, S. F.; Galkin, I. A.; Hapgood, M. A.; Belehaki, A.

    2015-12-01

    The main open data principles were worked out in the run-up to, and finally adopted in, the Open Data Charter at the G8 summit in Lough Erne, Northern Ireland in June 2013. Important principles also apply to science data, such as Open Data by Default, Quality and Quantity, Usable by All, Releasing Data for Improved Governance, and Releasing Data for Innovation. There is also an explicit relationship to such high-value areas as earth observation, education and geospatial data. The European Union implementation plan of the Open Data Charter identifies, among other things, objectives such as making data available in an open format, enabling semantic interoperability, ensuring quality, documentation and, where appropriate, reconciliation across different data sources, implementing software solutions allowing easy management, publication or visualization of datasets, and simplifying clearance of intellectual property rights. Open Science is not just a list of long-established principles but stands for many initiatives and projects around better handling of scientific data and openly shared scientific knowledge. It is also about transparency in methodology and data collection, availability and reuse of scientific data, public accessibility of scientific communication, and the use of social media to facilitate scientific collaboration. Some projects concentrate on the open sharing of free and open-source software and even hardware in the form of processing capabilities. In addition, questions about the mashup of data and publications and an open peer-review process are addressed. Following the principles of open data and open science, the newest results of the collaboration efforts in mashing up the data servers related to the Japanese IUGONET, the European Union ESPAS and the GFZ ISDC semantic Web projects will be presented here. The semantic Web based approach for the mashup is focusing on the design and implementation of a common but still distributed data

  10. OpenADR Open Source Toolkit: Developing Open Source Software for the Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    McParland, Charles

    2011-02-01

    Demand response (DR) is becoming an increasingly important part of power grid planning and operation. The advent of the Smart Grid, which mandates its use, further motivates selection and development of suitable software protocols to enable DR functionality. The OpenADR protocol has been developed and is being standardized to serve this goal. We believe that the development of a distributable, open source implementation of OpenADR will benefit this effort and motivate critical evaluation of its capabilities, by the wider community, for providing wide-scale DR services

  11. Beowulf Distributed Processing and the United States Geological Survey

    Science.gov (United States)

    Maddox, Brian G.

    2002-01-01

    Introduction In recent years, the United States Geological Survey's (USGS) National Mapping Discipline (NMD) has expanded its scientific and research activities. Work is being conducted in areas such as emergency response research, scientific visualization, urban prediction, and other simulation activities. Custom-produced digital data have become essential for these types of activities. High-resolution, remotely sensed datasets are also seeing increased use. Unfortunately, the NMD is also finding that it lacks the resources required to perform some of these activities. Many of these projects require large amounts of computer processing resources. Complex urban-prediction simulations, for example, involve large amounts of processor-intensive calculations on large amounts of input data. This project was undertaken to learn and understand the concepts of distributed processing. Experience was needed in developing these types of applications. The idea was that this type of technology could significantly aid the needs of the NMD scientific and research programs. Porting a numerically intensive application currently being used by an NMD science program to run in a distributed fashion would demonstrate the usefulness of this technology. There are several benefits that this type of technology can bring to the USGS's research programs. Projects can be performed that were previously impossible due to a lack of computing resources. Other projects can be performed on a larger scale than previously possible. For example, distributed processing can enable urban dynamics research to perform simulations on larger areas without making huge sacrifices in resolution. The processing can also be done in a more reasonable amount of time than with traditional single-threaded methods (a scaled version of Chester County, Pennsylvania, took about fifty days to finish its first calibration phase with a single-threaded program). This paper has several goals regarding distributed processing

  12. Opening access to African scholarly content: Stellenbosch University's AOARI platforms

    Directory of Open Access Journals (Sweden)

    Dr Reggie Raju

    2013-03-01

    Full Text Available Africa is viewed as a consumer of the world's knowledge production. A significant factor behind this status is low research output, and the main contributor is minimal access to the scholarly content needed to support research. Stellenbosch University, a leading research institution on the African continent, is committed to helping change this status quo by distributing its own research output through open sources. Given the challenges that have plagued Africa in developing processes for the distribution of its research, Stellenbosch University has developed the African Open Access Repository Initiative (AOARI), which uses open source software for two platforms that support the ‘green’ and ‘gold’ routes to sharing scholarly literature: Ubuntu is used as the operating system, DSpace for its repository and Open Journal Systems for its publication platform. It is anticipated that AOARI will be the bridge that facilitates the sharing of research output and nurtures a culture of research production in Africa.

  13. A distributive peptide cyclase processes multiple microviridin core peptides within a single polypeptide substrate.

    Science.gov (United States)

    Zhang, Yi; Li, Kunhua; Yang, Guang; McBride, Joshua L; Bruner, Steven D; Ding, Yousong

    2018-05-03

    Ribosomally synthesized and post-translationally modified peptides (RiPPs) are an important family of natural products. Their biosynthesis follows a common scheme in which the leader peptide of a precursor peptide guides the modifications of a single core peptide. Here we describe biochemical studies of the processing of multiple core peptides within a precursor peptide, rare in RiPP biosynthesis. In a cyanobacterial microviridin pathway, an ATP-grasp ligase, AMdnC, installs up to two macrolactones on each of the three core peptides within AMdnA. The enzyme catalysis occurs in a distributive fashion and follows an unstrict N-to-C overall directionality, but a strict order in macrolactonizing each core peptide. Furthermore, AMdnC is catalytically versatile to process unnatural substrates carrying one to four core peptides, and kinetic studies provide insights into its catalytic properties. Collectively, our results reveal a distinct biosynthetic logic of RiPPs, opening up the possibility of modular production via synthetic biology approaches.

  14. Distributed chemical computing using ChemStar: an open source java remote method invocation architecture applied to large scale molecular data from PubChem.

    Science.gov (United States)

    Karthikeyan, M; Krishnan, S; Pandey, Anil Kumar; Bender, Andreas; Tropsha, Alexander

    2008-04-01

    We present the application of a Java remote method invocation (RMI) based open source architecture to distributed chemical computing. This architecture was previously employed for distributed data harvesting of chemical information from the Internet via the Google application programming interface (API; ChemXtreme). Due to its open source character and its flexibility, the underlying server/client framework can be quickly adapted to virtually every computational task that can be parallelized. Here, we present the server/client communication framework as well as an application to distributed computing of chemical properties on a large scale (currently the size of PubChem; about 18 million compounds), using both the Marvin toolkit and the open source JOELib package. As an application, for this set of compounds, the agreement of log P and TPSA between the packages was compared. Outliers were found to be mostly non-druglike compounds, and differences could usually be explained by differences in the underlying algorithms. ChemStar is the first open source distributed chemical computing environment built on Java RMI, which is also easily adaptable to user demands due to its "plug-in architecture". The complete source codes as well as calculated properties along with links to PubChem resources are available on the Internet via a graphical user interface at http://moltable.ncl.res.in/chemstar/.
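    The comparison described above was run with the Marvin toolkit and JOELib; as a small, freely reproducible stand-in, the sketch below computes the same two descriptors (Crippen logP and TPSA) with the open-source RDKit toolkit for a few example SMILES strings. The molecules are arbitrary choices, not compounds from the PubChem run.

```python
from rdkit import Chem
from rdkit.Chem import Crippen, Descriptors

# Arbitrary example structures given as SMILES (aspirin, caffeine, 1-octanol).
smiles = {
    "aspirin": "CC(=O)Oc1ccccc1C(=O)O",
    "caffeine": "Cn1cnc2c1c(=O)n(C)c(=O)n2C",
    "1-octanol": "CCCCCCCCO",
}

for name, smi in smiles.items():
    mol = Chem.MolFromSmiles(smi)
    if mol is None:   # unparsable or unusual entries were the main source of outliers
        print(f"{name}: could not be parsed")
        continue
    logp = Crippen.MolLogP(mol)    # Crippen atom-contribution logP
    tpsa = Descriptors.TPSA(mol)   # topological polar surface area
    print(f"{name:10s}  logP={logp:6.2f}  TPSA={tpsa:6.1f}")
```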

  15. Assessing the spatial distribution of Tuta absoluta (Lepidoptera: Gelechiidae) eggs in open-field tomato cultivation through geostatistical analysis.

    Science.gov (United States)

    Martins, Júlio C; Picanço, Marcelo C; Silva, Ricardo S; Gonring, Alfredo Hr; Galdino, Tarcísio Vs; Guedes, Raul Nc

    2018-01-01

    The spatial distribution of insects is due to the interaction between individuals and the environment. Knowledge about the within-field pattern of spatial distribution of a pest is critical to planning control tactics, developing efficient sampling plans, and predicting pest damage. The leaf miner Tuta absoluta (Meyrick) (Lepidoptera: Gelechiidae) is the main pest of tomato crops in several regions of the world. Despite the importance of this pest, the pattern of spatial distribution of T. absoluta on open-field tomato cultivation remains unknown. Therefore, this study aimed to characterize the spatial distribution of T. absoluta in 22 commercial open-field tomato cultivations with plants at three phenological development stages by using geostatistical analysis. Geostatistical analysis revealed strong evidence for spatially dependent (aggregated) T. absoluta eggs in 19 of the 22 sampled tomato cultivations. The maps that were obtained demonstrated the aggregated structure of egg densities at the edges of the crops. Further, T. absoluta was found to disperse its eggs along rows more frequently than between rows. Our results indicate that the greatest egg densities of T. absoluta occur at the edges of tomato crops. These results are discussed in relation to the behavior of T. absoluta distribution within fields and in terms of their implications for improved sampling guidelines and precision targeting of control methods that are essential for effective pest monitoring and management. © 2017 Society of Chemical Industry.
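    The geostatistical workflow described above centers on variogram estimation; the numpy-only sketch below computes an empirical isotropic semivariogram from simulated egg-count samples with an artificial edge effect. The positions, counts and lag bins are synthetic placeholders, not the study's field data; a semivariance that rises with lag and then levels off is the signature of the spatial dependence (aggregation) reported in the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for field samples: (x, y) plant positions and egg counts,
# with more eggs near x = 0 to mimic the edge aggregation described in the study.
n = 200
xy = rng.uniform(0, 100, size=(n, 2))
counts = rng.poisson(lam=1.0 + 4.0 * np.exp(-xy[:, 0] / 20.0)).astype(float)

def empirical_semivariogram(xy, z, lags):
    """gamma(h) = 0.5 * mean[(z_i - z_j)^2] over pairs whose distance falls in each lag bin."""
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    iu = np.triu_indices(len(z), k=1)               # each pair counted once
    dist, sqdiff = d[iu], (z[iu[0]] - z[iu[1]]) ** 2
    gamma = []
    for lo, hi in zip(lags[:-1], lags[1:]):
        mask = (dist >= lo) & (dist < hi)
        gamma.append(0.5 * sqdiff[mask].mean() if mask.any() else np.nan)
    return np.asarray(gamma)

lags = np.linspace(0, 50, 11)
gamma = empirical_semivariogram(xy, counts, lags)
for lo, hi, g in zip(lags[:-1], lags[1:], gamma):
    print(f"lag {lo:4.0f}-{hi:4.0f} m : gamma = {g:6.2f}")
```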

  16. Gamma processes and peaks-over-threshold distributions for time-dependent reliability

    International Nuclear Information System (INIS)

    Noortwijk, J.M. van; Weide, J.A.M. van der; Kallen, M.J.; Pandey, M.D.

    2007-01-01

    In the evaluation of structural reliability, a failure is defined as the event in which stress exceeds a resistance that is liable to deterioration. This paper presents a method to combine the two stochastic processes of deteriorating resistance and fluctuating load for computing the time-dependent reliability of a structural component. The deterioration process is modelled as a gamma process, which is a stochastic process with independent non-negative increments having a gamma distribution with identical scale parameter. The stochastic process of loads is generated by a Poisson process. The variability of the random loads is modelled by a peaks-over-threshold distribution (such as the generalised Pareto distribution). These stochastic processes of deterioration and load are combined to evaluate the time-dependent reliability
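    As a hedged numerical illustration of the combination described above, the sketch below simulates gamma-process deterioration of resistance together with Poisson-arriving loads whose magnitudes follow a generalised Pareto (peaks-over-threshold) distribution, and estimates the probability of failure over time; all parameter values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_failure_times(n_sims=20000, horizon=50.0, dt=0.25,
                           r0=100.0,                  # initial resistance
                           shape_rate=0.8, scale=1.0, # gamma deterioration per unit time
                           load_rate=0.5,             # Poisson load arrivals per unit time
                           u=40.0, sigma=8.0, xi=0.2):# GPD threshold / scale / shape for loads
    times = np.arange(dt, horizon + dt, dt)
    failed_at = np.full(n_sims, np.inf)
    resistance = np.full(n_sims, r0)
    for t in times:
        # Gamma-process increment: Gamma(shape_rate*dt, scale), independent per step.
        resistance -= rng.gamma(shape_rate * dt, scale, size=n_sims)
        # Loads arriving in this step; only the largest load per step matters for failure.
        n_loads = rng.poisson(load_rate * dt, size=n_sims)
        umax = rng.uniform(size=n_sims) ** (1.0 / np.maximum(n_loads, 1))  # max of n uniforms
        peak_load = np.where(
            n_loads > 0,
            u + sigma / xi * ((1.0 - umax) ** (-xi) - 1.0),   # GPD inverse CDF
            -np.inf)
        newly_failed = (peak_load > resistance) & np.isinf(failed_at)
        failed_at[newly_failed] = t
    return failed_at

failed_at = simulate_failure_times()
for t in (10.0, 25.0, 50.0):
    print(f"P(failure by t={t:4.1f}) ~ {(failed_at <= t).mean():.3f}")
```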

  17. BioSig: the free and open source software library for biomedical signal processing.

    Science.gov (United States)

    Vidaurre, Carmen; Sander, Tilmann H; Schlögl, Alois

    2011-01-01

    BioSig is an open source software library for biomedical signal processing. The aim of the BioSig project is to foster research in biomedical signal processing by providing free and open source software tools for many different application areas. Some of the areas where BioSig can be employed are neuroinformatics, brain-computer interfaces, neurophysiology, psychology, cardiovascular systems, and sleep research. Moreover, the analysis of biosignals such as the electroencephalogram (EEG), electrocorticogram (ECoG), electrocardiogram (ECG), electrooculogram (EOG), electromyogram (EMG), or respiration signals is a very relevant element of the BioSig project. Specifically, BioSig provides solutions for data acquisition, artifact processing, quality control, feature extraction, classification, modeling, and data visualization, to name a few. In this paper, we highlight several methods to help students and researchers to work more efficiently with biomedical signals.

  18. Data-driven process decomposition and robust online distributed modelling for large-scale processes

    Science.gov (United States)

    Shu, Zhang; Lijuan, Li; Lijuan, Yao; Shipin, Yang; Tao, Zou

    2018-02-01

    With the increasing attention paid to networked control, system decomposition and distributed models are of significant importance in the implementation of model-based control strategies. In this paper, a data-driven system decomposition and online distributed subsystem modelling algorithm is proposed for large-scale chemical processes. The key controlled variables are first partitioned into several clusters by the affinity propagation clustering algorithm; each cluster can be regarded as a subsystem. The inputs of each subsystem are then selected by offline canonical correlation analysis between all process variables and the subsystem's controlled variables. Process decomposition is thus realised after the screening of input and output variables. Once the system decomposition is finished, online subsystem modelling can be carried out by recursively renewing the samples block-wise. The proposed algorithm was applied to the Tennessee Eastman process and its validity was verified.
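    The two-step decomposition described above (affinity propagation clustering of controlled variables, then canonical correlation analysis for input screening) can be sketched with scikit-learn; the synthetic data, variable counts and the 0.5 screening threshold below are assumptions chosen only to make the example self-contained.

```python
import numpy as np
from sklearn.cluster import AffinityPropagation
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(7)

# Synthetic process data: 500 samples, 6 inputs (u) and 4 controlled outputs (y).
n = 500
u = rng.standard_normal((n, 6))
y = np.column_stack([
    u[:, 0] + 0.5 * u[:, 1],         # y0, y1 depend mainly on u0, u1 -> subsystem A
    0.9 * u[:, 0] + 0.6 * u[:, 1],
    u[:, 4] + 0.3 * u[:, 5],         # y2, y3 depend mainly on u4, u5 -> subsystem B
    0.8 * u[:, 4] + 0.5 * u[:, 5],
]) + 0.05 * rng.standard_normal((n, 4))

# Step 1: cluster controlled variables by the similarity of their behaviour.
corr = np.corrcoef(y.T)
labels = AffinityPropagation(affinity="precomputed", random_state=0).fit(np.abs(corr)).labels_

# Step 2: for each cluster (subsystem), screen candidate inputs via canonical correlation.
for cluster in np.unique(labels):
    y_sub = y[:, labels == cluster]
    picked = []
    for j in range(u.shape[1]):
        u_scores, y_scores = CCA(n_components=1).fit_transform(u[:, [j]], y_sub)
        rho = np.corrcoef(u_scores.ravel(), y_scores.ravel())[0, 1]
        if abs(rho) > 0.5:            # assumed screening threshold
            picked.append(j)
    print(f"subsystem {cluster}: outputs {np.where(labels == cluster)[0].tolist()}, "
          f"selected inputs {picked}")
```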

  19. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    Science.gov (United States)

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  20. Tempered stable distributions stochastic models for multiscale processes

    CERN Document Server

    Grabchak, Michael

    2015-01-01

    This brief is concerned with tempered stable distributions and their associated Levy processes. It is a good text for researchers interested in learning about tempered stable distributions. A tempered stable distribution is one which takes a stable distribution and modifies its tails to make them lighter. The motivation for this class comes from the fact that infinite variance stable distributions appear to provide a good fit to data in a variety of situations, but the extremely heavy tails of these models are not realistic for most real world applications. The idea of using distributions that modify the tails of stable models to make them lighter seems to have originated in the influential paper of Mantegna and Stanley (1994). Since then, these distributions have been extended and generalized in a variety of ways. They have been applied to a wide variety of areas including mathematical finance, biostatistics, computer science, and physics.
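    For concreteness (this exact parameterisation is one common one-sided choice, not necessarily the book's), a tempered stable distribution can be specified through its Lévy measure, which multiplies the stable measure's power-law jump density by an exponential tempering factor that leaves small jumps essentially unchanged while making the tails light enough for all moments to exist:

    \nu_{\mathrm{stable}}(dx) = c\, x^{-1-\alpha}\, dx
    \quad\longrightarrow\quad
    \nu_{\mathrm{tempered}}(dx) = c\, e^{-\lambda x}\, x^{-1-\alpha}\, dx ,
    \qquad x > 0,\ 0 < \alpha < 2,\ \lambda > 0 .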

  1. Parallel and distributed processing in power system simulation and control

    Energy Technology Data Exchange (ETDEWEB)

    Falcao, Djalma M [Universidade Federal, Rio de Janeiro, RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia

    1994-12-31

    Recent advances in computer technology will certainly have a great impact on the methodologies used in power system expansion and operational planning as well as in real-time control. Parallel and distributed processing are among the new technologies that present great potential for application in these areas. Parallel computers use multiple functional or processing units to speed up computation, while distributed processing computer systems are collections of computers joined together by high-speed communication networks, offering many objectives and advantages. The paper presents some ideas for the use of parallel and distributed processing in power system simulation and control. It also comments on some of the current research work on these topics and presents a summary of the work presently being developed at COPPE. (author) 53 refs., 2 figs.

  2. Distributed Processing System for Restoration of Electric Power Distribution Network Using Two-Layered Contract Net Protocol

    Science.gov (United States)

    Kodama, Yu; Hamagami, Tomoki

    A distributed processing system for the restoration of an electric power distribution network using a two-layered CNP is proposed. The goal of this study is to develop a restoration system suited to future power networks with distributed generators. The novelty of this study is that the two-layered CNP is applied in a distributed computing environment for practical use. The two-layered CNP has two classes of agents in the network, named field agents and operating agents. In order to avoid task conflicts, the operating agent controls the privilege of managers to send task announcement messages in the CNP. This technique realizes coordination between agents which work asynchronously and in parallel with others. Moreover, this study implements the distributed processing system using a de facto standard multi-agent framework, JADE (Java Agent DEvelopment framework). Simulation experiments of power distribution network restoration were conducted, and the proposed system was compared with the previous system. The results confirm the effectiveness of the proposed system.
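    The record above describes the two-layered architecture only in outline, and the real implementation uses JADE in Java; the Python sketch below mimics the core idea: an operating agent hands the announcement privilege to one field-agent manager at a time, the manager announces a restoration task, contractor agents bid, and the lowest bid wins. All names, capacities and the bidding rule are illustrative assumptions.

```python
import random

random.seed(3)

class Contractor:
    """Field agent that can pick up de-energised load; bids a cost if it has spare capacity."""
    def __init__(self, name, spare_capacity):
        self.name, self.spare = name, spare_capacity

    def bid(self, demand):
        if self.spare < demand:
            return None                      # refuse: not enough spare capacity
        return random.uniform(1.0, 10.0)     # assumed cost metric

class Manager:
    """Field agent that announces a restoration task for its outage area."""
    def __init__(self, name, demand):
        self.name, self.demand = name, demand

    def run_cnp(self, contractors):
        bids = {c.name: c.bid(self.demand) for c in contractors}
        bids = {k: v for k, v in bids.items() if v is not None}
        return min(bids, key=bids.get) if bids else None

class OperatingAgent:
    """Upper layer: serialises announcement privileges so managers do not conflict."""
    def __init__(self, managers):
        self.queue = list(managers)

    def grant_privilege(self):
        yield from self.queue                # one announcement privilege at a time

contractors = [Contractor(f"feeder_{i}", spare_capacity=random.randint(2, 8)) for i in range(4)]
managers = [Manager(f"outage_{i}", demand=random.randint(1, 6)) for i in range(3)]

operator = OperatingAgent(managers)
for manager in operator.grant_privilege():   # privilege handed out sequentially
    winner = manager.run_cnp(contractors)
    print(f"{manager.name} (demand {manager.demand}) -> awarded to {winner}")
```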

  3. NCI's Distributed Geospatial Data Server

    Science.gov (United States)

    Larraondo, P. R.; Evans, B. J. K.; Antony, J.

    2016-12-01

    Earth systems, environmental and geophysics datasets are an extremely valuable source of information about the state and evolution of the Earth. However, different disciplines and applications require this data to be post-processed in different ways before it can be used. For researchers experimenting with algorithms across large datasets or combining multiple data sets, the traditional approach to batch data processing and storing all the output for later analysis rapidly becomes unfeasible, and often requires additional work to publish for others to use. Recent developments on distributed computing using interactive access to significant cloud infrastructure opens the door for new ways of processing data on demand, hence alleviating the need for storage space for each individual copy of each product. The Australian National Computational Infrastructure (NCI) has developed a highly distributed geospatial data server which supports interactive processing of large geospatial data products, including satellite Earth Observation data and global model data, using flexible user-defined functions. This system dynamically and efficiently distributes the required computations among cloud nodes and thus provides a scalable analysis capability. In many cases this completely alleviates the need to preprocess and store the data as products. This system presents a standards-compliant interface, allowing ready accessibility for users of the data. Typical data wrangling problems such as handling different file formats and data types, or harmonising the coordinate projections or temporal and spatial resolutions, can now be handled automatically by this service. The geospatial data server exposes functionality for specifying how the data should be aggregated and transformed. The resulting products can be served using several standards such as the Open Geospatial Consortium's (OGC) Web Map Service (WMS) or Web Feature Service (WFS), Open Street Map tiles, or raw binary arrays under

  4. Point processes and the position distribution of infinite boson systems

    International Nuclear Information System (INIS)

    Fichtner, K.H.; Freudenberg, W.

    1987-01-01

    It is shown that to each locally normal state of a boson system one can associate a point process that can be interpreted as the position distribution of the state. The point process contains all information one can get by position measurements and is determined by the latter. On the other hand, to each so-called Σ^c-point process Q one relates a locally normal state with position distribution Q.

  5. Seedling establishment and distribution of direct radiation in slit-shaped openings of Norway spruce forests in the intermediate Alps

    International Nuclear Information System (INIS)

    Brang, P.

    1996-01-01

    Direct radiation is crucial for Norway spruce (Picea abies (L.) Karst.) seedling establishment in high-montane and subalpine spruce forests. Fisheye photography was used to estimate the daily distribution of direct radiation in small forest openings on a north-northwest and a south facing slope near Sedrun (Grisons, Switzerland). In slit-shaped openings on the north-northwest facing slope, long sunflecks mostly occurred in the afternoon, when the sun shines parallel to the slit axis. This is in accordance with the silvicultural intention. However, since the stands are clumpy and therefore pervious to sunlight, the daily sunfleck distribution is fairly even notwithstanding the slit orientation, and direct radiation at noon is the dominant form of incident energy. In small circular to rectangular openings on the south facing slope, direct radiation peaks around noon. A seeding trial imitating natural seedling establishment was set up in openings on both slopes. Based on this trial, the relations among seedling establishment, aspect, slit shape, size, and orientation are discussed for Norway spruce forests in the intermediate Alps. Directional weather factors such as radiation and precipitation can be strongly influenced by slits, which is why a microclimate suitable for seedling establishment can be promoted provided the slits are oriented appropriately. Slits in which the most insolated edges are oriented windward are especially favourable

  6. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data.

    Science.gov (United States)

    Jaitly, Navdeep; Mayampurath, Anoop; Littlefield, Kyle; Adkins, Joshua N; Anderson, Gordon A; Smith, Richard D

    2009-03-17

    Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high performance chart controls. With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to identify and quantify peptides in the
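    Decon2LS itself is distributed as a set of .NET processing classes; purely as a language-neutral illustration of one of the functions listed above, theoretical isotope distribution modelling, the sketch below convolves per-element isotope patterns to obtain the isotopic envelope of a small molecule. The abundances are rounded textbook values and the code is not taken from Decon2LS.

```python
# Sketch of theoretical isotope-distribution modelling: convolve the isotope
# pattern of each atom to obtain the molecular pattern. Rounded textbook
# abundances; an illustration, not code from Decon2LS.
from collections import defaultdict

ISOTOPES = {  # element -> list of (mass, relative abundance)
    "C": [(12.0000, 0.9893), (13.0034, 0.0107)],
    "H": [(1.00783, 0.99988), (2.01410, 0.00012)],
    "N": [(14.00307, 0.99636), (15.00011, 0.00364)],
    "O": [(15.99491, 0.99757), (16.99913, 0.00038), (17.99916, 0.00205)],
}

def convolve(dist_a, dist_b):
    out = defaultdict(float)
    for m1, p1 in dist_a:
        for m2, p2 in dist_b:
            out[round(m1 + m2, 4)] += p1 * p2   # merge isobaric combinations
    return sorted(out.items())

def isotope_distribution(formula, threshold=1e-6):
    dist = [(0.0, 1.0)]
    for element, count in formula.items():
        for _ in range(count):
            dist = [(m, p) for m, p in convolve(dist, ISOTOPES[element]) if p > threshold]
    return dist

# Glycine, C2H5NO2: print the first few isotopic peaks (mass, abundance).
for mass, abundance in isotope_distribution({"C": 2, "H": 5, "N": 1, "O": 2})[:4]:
    print(f"{mass:10.4f}  {abundance:.4f}")
```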

  7. Experimental characterization of GIT-8 plasma opening switch

    International Nuclear Information System (INIS)

    Chuvatin, A.; Rouille, C.; Etlicher, B.; Kim, A.; Loginov, S.; Kokshenev, V.; Kovalchuk, B.

    1996-01-01

    A high-current Plasma Opening Switch was experimentally studied on the GIT-8 inductive generator. Chordal laser interferometry allowed investigation of the line-integrated POS plasma density dynamics during switch operation. Recording of the axially distributed Bremsstrahlung radiation from the plasma region was used to determine the axial position where the opening started. The monitoring of fast plasma density oscillations with a characteristic frequency of ω ≅ 5 x 10^7 - 10^8 rad/s prior to and during the opening is a new experimental achievement. A special study confirmed that such oscillations appear due to a plasma process. The oscillation frequency depended on the mean electron density as ω ∼ n_e^-0.5. (author). 5 figs., 7 refs

  8. Computer program for source distribution process in radiation facility

    International Nuclear Information System (INIS)

    Al-Kassiri, H.; Abdul Ghani, B.

    2007-08-01

    Computer simulation of the dose distribution, written in Visual Basic, has been carried out according to the arrangement and activities of Co-60 sources. This program provides the dose distribution in treated products depending on the product density and desired dose. The program is useful for optimization of the source distribution during the loading process. There is good agreement between the data calculated by the program and experimental data. (Author)

  9. Defining Success in Open Science.

    Science.gov (United States)

    Ali-Khan, Sarah E; Jean, Antoine; MacDonald, Emily; Gold, E Richard

    2018-01-01

    Mounting evidence indicates that, worldwide, innovation systems are increasingly unsustainable. Equally, concerns about inequities in the science and innovation process, and in access to its benefits, continue. Against a backdrop of growing health, economic and scientific challenges, global stakeholders are urgently seeking to spur innovation and maximize the just distribution of benefits for all. Open Science collaboration (OS) - comprising a variety of approaches to increase open, public, and rapid mobilization of scientific knowledge - is seen to be one of the most promising ways forward. Yet, many decision-makers hesitate to construct policy to support the adoption and implementation of OS without access to substantive, clear and reliable evidence. In October 2017, international thought-leaders gathered at an Open Science Leadership Forum in the Washington DC offices of the Bill and Melinda Gates Foundation to share their views on what successful Open Science looks like. Delegates from developed and developing nations, national governments, science agencies and funding bodies, philanthropy, researchers, patient organizations and the biotechnology, pharma and artificial intelligence (AI) industries discussed the outcomes that would rally them to invest in OS, as well as wider issues of policy and implementation. This first of two reports summarizes delegates' views on what they believe OS will deliver in terms of research, innovation and social impact in the life sciences. Through an open and collaborative process over the next months, we will translate these success outcomes into a toolkit of quantitative and qualitative indicators to assess when, where and how open science collaborations best advance research, innovation and social benefit. Ultimately, this work aims to develop and openly share tools to allow stakeholders to evaluate and re-invent their innovation ecosystems, to maximize value for the global public and patients, and address long-standing questions

  10. Fouling distribution in forward osmosis membrane process.

    Science.gov (United States)

    Lee, Junseok; Kim, Bongchul; Hong, Seungkwan

    2014-06-01

    Fouling behavior along the length of membrane module was systematically investigated by performing simple modeling and lab-scale experiments of forward osmosis (FO) membrane process. The flux distribution model developed in this study showed a good agreement with experimental results, validating the robustness of the model. This model demonstrated, as expected, that the permeate flux decreased along the membrane channel due to decreasing osmotic pressure differential across the FO membrane. A series of fouling experiments were conducted under the draw and feed solutions at various recoveries simulated by the model. The simulated fouling experiments revealed that higher organic (alginate) fouling and thus more flux decline were observed at the last section of a membrane channel, as foulants in feed solution became more concentrated. Furthermore, the water flux in FO process declined more severely as the recovery increased due to more foulants transported to membrane surface with elevated solute concentrations at higher recovery, which created favorable solution environments for organic adsorption. The fouling reversibility also decreased at the last section of the membrane channel, suggesting that fouling distribution on FO membrane along the module should be carefully examined to improve overall cleaning efficiency. Lastly, it was found that such fouling distribution observed with co-current flow operation became less pronounced in counter-current flow operation of FO membrane process. Copyright © 2014 The Research Centre for Eco-Environmental Sciences, Chinese Academy of Sciences. Published by Elsevier B.V. All rights reserved.
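    A minimal sketch of a flux distribution model of the kind described: the module is discretized along its length, the local water flux is taken proportional to the local osmotic pressure difference, and the feed and draw streams are updated by mass balance, so flux falls toward the end of the channel as recovery accumulates. It assumes idealized van't Hoff osmotic pressure, co-current flow and no concentration polarization; all parameter values are illustrative, not the authors'.

```python
# Minimal co-current FO flux-distribution sketch: discretize the channel,
# J_w = A * (pi_draw - pi_feed), then update both streams by mass balance.
# Idealized (van't Hoff, no concentration polarization); values illustrative.
A = 1.0e-12            # water permeability, m/(s*Pa)
R, T = 8.314, 298.0    # gas constant, temperature
width, length, n = 0.05, 1.0, 100          # channel geometry, number of segments
q_feed, q_draw = 1.0e-6, 1.0e-6            # volumetric flow rates, m^3/s
c_feed, c_draw = 10.0, 600.0               # solute concentrations, mol/m^3
dx = length / n

fluxes = []
for _ in range(n):
    pi_feed = 2 * R * T * c_feed           # van't Hoff, fully dissociating salt
    pi_draw = 2 * R * T * c_draw
    jw = A * (pi_draw - pi_feed)           # local water flux, m/s
    dv = jw * width * dx                   # water permeated in this segment, m^3/s
    # mass balance: feed loses water (concentrates), draw gains water (dilutes)
    c_feed = c_feed * q_feed / (q_feed - dv)
    c_draw = c_draw * q_draw / (q_draw + dv)
    q_feed, q_draw = q_feed - dv, q_draw + dv
    fluxes.append(jw)

print(f"inlet flux {fluxes[0]:.2e} m/s, outlet flux {fluxes[-1]:.2e} m/s")
```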

  11. Fractal-Markovian scaling of turbulent bursting process in open channel flow

    International Nuclear Information System (INIS)

    Keshavarzi, Ali Reza; Ziaei, Ali Naghi; Homayoun, Emdad; Shirvani, Amin

    2005-01-01

    The turbulent coherent structure of flow in an open channel is a chaotic and stochastic process in nature. The coherent structure of the flow, or bursting process, consists of a series of eddies with a variety of different length scales and is very important for the entrainment of sediment particles from the bed. In this study, a fractal-Markovian process is applied to turbulent data measured in an open channel. The turbulent data were measured in an experimental flume using a three-dimensional acoustic Doppler velocity meter (ADV). A fractal interpolation function (FIF) algorithm was used to simulate more than 500,000 time series data of measured instantaneous velocity fluctuations and Reynolds shear stress. The fractal interpolation functions (FIF) enable the simulation and construction of time series of u', v', and u'v' for any particular movement and state in the Markov process. The fractal dimension of the bursting events is calculated for 16 particular movements with the transition probability of the events based on a first-order Markov process. It was found that the average fractal dimensions of the streamwise flow velocity (u') are 1.73, 1.74, 1.71 and 1.74, with transition probabilities of 60.82%, 63.77%, 59.23% and 62.09% for the 1-1, 2-2, 3-3 and 4-4 movements, respectively. It was also found that the fractal dimensions of the Reynolds stress u'v' for quadrants 1, 2, 3 and 4 are 1.623, 1.623, 1.625 and 1.618, respectively
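    The bookkeeping behind these "movements" can be illustrated by classifying each (u', v') sample into one of the four quadrant events (outward interaction, ejection, inward interaction, sweep) and estimating first-order Markov transition probabilities between consecutive events. The sketch below does this on synthetic data; it does not reproduce the fractal interpolation step.

```python
# Quadrant classification of (u', v') fluctuations and first-order Markov
# transition probabilities between bursting events (synthetic data).
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(50_000)             # streamwise fluctuation u'
v = -0.4 * u + rng.standard_normal(50_000)  # wall-normal fluctuation v', anticorrelated

def quadrant(up, vp):
    # 1: outward interaction (+,+), 2: ejection (-,+), 3: inward interaction (-,-), 4: sweep (+,-)
    if up >= 0 and vp >= 0: return 1
    if up < 0 and vp >= 0:  return 2
    if up < 0 and vp < 0:   return 3
    return 4

events = np.array([quadrant(a, b) for a, b in zip(u, v)])
counts = np.zeros((4, 4))
for current, nxt in zip(events[:-1], events[1:]):
    counts[current - 1, nxt - 1] += 1
transition = counts / counts.sum(axis=1, keepdims=True)

print(np.round(transition, 3))                        # row i: P(next = j | current = i)
print("P(2->2), ejection persistence:", round(transition[1, 1], 3))
```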

  12. Floral and nesting resources, habitat structure, and fire influence bee distribution across an open-forest gradient

    Science.gov (United States)

    Grundel, R.; Jean, R.P.; Frohnapple, K.J.; Glowacki, G.A.; Scott, P.E.; Pavlovic, N.B.

    2010-01-01

    Given bees' central effect on vegetation communities, it is important to understand how and why bee distributions vary across ecological gradients. We examined how plant community composition, plant diversity, nesting suitability, canopy cover, land use, and fire history affected bee distribution across an open-forest gradient in northwest Indiana, USA, a gradient similar to the historic Midwest United States landscape mosaic. When considered with the other predictors, plant community composition was not a significant predictor of bee community composition. Bee abundance was negatively related to canopy cover and positively to recent fire frequency, bee richness was positively related to plant richness and abundance of potential nesting resources, and bee community composition was significantly related to plant richness, soil characteristics potentially related to nesting suitability, and canopy cover. Thus, bee abundance was predicted by a different set of environmental characteristics than was bee species richness, and bee community composition was predicted, in large part, by a combination of the significant predictors of bee abundance and richness. Differences in bee community composition along the woody vegetation gradient were correlated with relative abundance of oligolectic, or diet specialist, bees. Because oligoleges were rarer than diet generalists and were associated with open habitats, their populations may be especially affected by degradation of open habitats. More habitat-specialist bees were documented for open and forest/scrub habitats than for savanna/woodland habitats, consistent with bees responding to habitats of intermediate woody vegetation density, such as savannas, as ecotones rather than as distinct habitat types. Similarity of bee community composition, similarity of bee abundance, and similarity of bee richness between sites were not significantly related to proximity of sites to each other. Nestedness analysis indicated that species

  13. Formal specification of open distributed systems - overview and evaluation of existing methods

    International Nuclear Information System (INIS)

    Stoelen, Ketil

    1998-02-01

    This report classifies, compares and evaluates eleven specification languages for distributed systems. The eleven specification languages have been picked from a wide spectrum of areas embracing both industry and research. We have selected languages that we see as important, either because they have proved useful within the commercial software industry, or because they play, or we expect them to play, an important role within research. Based on literature studies, we investigate the suitability of these specification languages for describing open distributed systems. The languages are also evaluated with respect to support for refinement and the characterization of proof obligations. The report consists of five main parts: Part 1 gives the background and motivation for the evaluation and introduces the basic terminology; Part 2 motivates, identifies and formulates the concrete evaluation criteria; Part 3 evaluates the specification languages with respect to the evaluation criteria formulated in Part 2; Part 4 sums up the results from the evaluation in the form of tables, draws some conclusions and identifies some directions for further studies; Part 5 consists of two appendices, namely a bibliography and a list of abbreviations. (author)

  14. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    Science.gov (United States)

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.
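    As a toy illustration (not Image Harvest code) of what a "digital trait" can be, the sketch below measures projected shoot area from an RGB image by thresholding a simple excess-green index; in the study, traits of this kind are the phenotypes fed into genome-wide association mapping. The image filename is hypothetical.

```python
# Toy example (not Image Harvest code) of one digital trait: projected shoot
# area from an RGB image, using a simple excess-green threshold.
import numpy as np
from PIL import Image

def projected_shoot_area(path, threshold=20):
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    excess_green = 2 * g - r - b            # simple vegetation index
    mask = excess_green > threshold         # plant pixels
    return int(mask.sum())                  # trait: number of plant pixels

# area = projected_shoot_area("rice_plant.png")   # hypothetical image file
```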

  15. Resource depletion promotes automatic processing: implications for distribution of practice.

    Science.gov (United States)

    Scheel, Matthew H

    2010-12-01

    Recent models of cognition include two processing systems: an automatic system that relies on associative learning, intuition, and heuristics, and a controlled system that relies on deliberate consideration. Automatic processing requires fewer resources and is more likely when resources are depleted. This study showed that prolonged practice on a resource-depleting mental arithmetic task promoted automatic processing on a subsequent problem-solving task, as evidenced by faster responding and more errors. Distribution of practice effects (0, 60, 120, or 180 sec. between problems) on rigidity also disappeared when groups had equal time on resource-depleting tasks. These results suggest that distribution of practice effects is reducible to resource availability. The discussion includes implications for interpreting discrepancies in the traditional distribution of practice effect.

  16. The brain as a distributed intelligent processing system: an EEG study.

    Science.gov (United States)

    da Rocha, Armando Freitas; Rocha, Fábio Theoto; Massad, Eduardo

    2011-03-15

    Various neuroimaging studies, both structural and functional, have provided support for the proposal that a distributed brain network is likely to be the neural basis of intelligence. The theory of Distributed Intelligent Processing Systems (DIPS), first developed in the field of Artificial Intelligence, was proposed to adequately model distributed neural intelligent processing. In addition, the neural efficiency hypothesis suggests that individuals with higher intelligence display more focused cortical activation during cognitive performance, resulting in lower total brain activation when compared with individuals who have lower intelligence. This may be understood as a property of the DIPS. In our study, a new EEG brain mapping technique, based on the neural efficiency hypothesis and the notion of the brain as a Distributed Intelligence Processing System, was used to investigate the correlations between IQ evaluated with WAIS (Wechsler Adult Intelligence Scale) and WISC (Wechsler Intelligence Scale for Children), and the brain activity associated with visual and verbal processing, in order to test the validity of a distributed neural basis for intelligence. The present results support these claims and the neural efficiency hypothesis.

  17. STORMSeq: an open-source, user-friendly pipeline for processing personal genomics data in the cloud.

    Science.gov (United States)

    Karczewski, Konrad J; Fernald, Guy Haskin; Martin, Alicia R; Snyder, Michael; Tatonetti, Nicholas P; Dudley, Joel T

    2014-01-01

    The increasing public availability of personal complete genome sequencing data has ushered in an era of democratized genomics. However, read mapping and variant calling software is constantly improving and individuals with personal genomic data may prefer to customize and update their variant calls. Here, we describe STORMSeq (Scalable Tools for Open-Source Read Mapping), a graphical interface cloud computing solution that does not require a parallel computing environment or extensive technical experience. This customizable and modular system performs read mapping, read cleaning, and variant calling and annotation. At present, STORMSeq costs approximately $2 and 5-10 hours to process a full exome sequence and $30 and 3-8 days to process a whole genome sequence. We provide this open-access and open-source resource as a user-friendly interface in Amazon EC2.

  18. An Open Source-Based Real-Time Data Processing Architecture Framework for Manufacturing Sustainability

    Directory of Open Access Journals (Sweden)

    Muhammad Syafrudin

    2017-11-01

    Full Text Available Currently, the manufacturing industry is experiencing a data-driven revolution. The manufacturing industry comprises multiple processes that eventually generate a large amount of data. Collecting, analyzing and storing a large amount of data is one of the key elements of the smart manufacturing industry. To ensure that all processes within the manufacturing industry function smoothly, big data processing is needed. Thus, in this study an open source-based real-time data processing (OSRDP) architecture framework was proposed. The OSRDP architecture framework consists of several open source technologies, including Apache Kafka, Apache Storm and NoSQL MongoDB, that are effective and cost efficient for real-time data processing. Several experiments and an impact analysis for manufacturing sustainability are provided. The results showed that the proposed system is capable of processing massive sensor data efficiently as the number of sensors and devices increases. In addition, data mining based on Random Forest is presented to predict the quality of products given the sensor data as input. The Random Forest successfully classifies defect and non-defect products, and generates high accuracy compared to other data mining algorithms. This study is expected to support management in their decision-making on product quality inspection and support manufacturing sustainability.
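    The data-mining step described above, classifying defect versus non-defect products from sensor readings with a Random Forest, can be sketched in a few lines of scikit-learn; the sensor features and the defect rule below are synthetic stand-ins, not the paper's data.

```python
# Minimal sketch of the defect/non-defect classification step with a Random
# Forest (scikit-learn), on synthetic sensor data; feature choices hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
n = 5_000
X = np.column_stack([
    rng.normal(70, 5, n),     # temperature sensor
    rng.normal(1.2, 0.1, n),  # pressure sensor
    rng.normal(300, 20, n),   # vibration sensor
])
# Synthetic rule: hot, high-vibration conditions tend to produce defects.
y = ((X[:, 0] > 75) & (X[:, 2] > 310)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```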

  19. Internet-centric collaborative design in a distributed environment

    International Nuclear Information System (INIS)

    Kim, Hyun; Kim, Hyoung Sun; Do, Nam Chul; Lee, Jae Yeol; Lee, Joo Haeng; Myong, Jae Hyong

    2001-01-01

    Recently, advanced information technologies, including internet-related technology and distributed object technology, have opened new possibilities for collaborative design. In this paper, we discuss computer support for collaborative design in a distributed environment. The proposed system is an internet-centric system composed of an engineering framework, a collaborative virtual workspace and engineering services. It allows distributed designers to carry out their engineering tasks more efficiently and collaboratively throughout the design process

  20. Joint Probability Distributions for a Class of Non-Markovian Processes

    OpenAIRE

    Baule, A.; Friedrich, R.

    2004-01-01

    We consider joint probability distributions for the class of coupled Langevin equations introduced by Fogedby [H.C. Fogedby, Phys. Rev. E 50, 1657 (1994)]. We generalize well-known results for the single time probability distributions to the case of N-time joint probability distributions. It is shown that these probability distribution functions can be obtained by an integral transform from distributions of a Markovian process. The integral kernel obeys a partial differential equation with fr...

  1. Open Polar Server (OPS—An Open Source Infrastructure for the Cryosphere Community

    Directory of Open Access Journals (Sweden)

    Weibo Liu

    2016-03-01

    Full Text Available The Center for Remote Sensing of Ice Sheets (CReSIS) at the University of Kansas has collected approximately 1000 terabytes (TB) of radar depth sounding data over the Arctic and Antarctic ice sheets since 1993 in an effort to map the thickness of the ice sheets and ultimately understand the impacts of climate change and sea level rise. In addition to data collection, the storage, management, and public distribution of the dataset are also primary roles of the CReSIS. The Open Polar Server (OPS) project developed a free and open source infrastructure to store, manage, analyze, and distribute the data collected by CReSIS in an effort to replace its current data storage and distribution approach. The OPS infrastructure includes a spatial database management system (DBMS), map and web server, JavaScript geoportal, and MATLAB application programming interface (API) for the inclusion of data created by the cryosphere community. Open source software including GeoServer, PostgreSQL, PostGIS, OpenLayers, ExtJS, GeoEXT and others are used to build a system that modernizes the CReSIS data distribution for the entire cryosphere community and creates a flexible platform for future development. Usability analysis demonstrates the OPS infrastructure provides an improved end user experience. In addition, interpolating glacier topography is provided as an application example of the system.

  2. Wigner Ville Distribution in Signal Processing, using Scilab Environment

    Directory of Open Access Journals (Sweden)

    Petru Chioncel

    2011-01-01

    Full Text Available The Wigner Ville distribution offers a visual display of quantitative information about the way a signal's energy is distributed in both time and frequency. Through that, this distribution embodies the fundamental concepts of Fourier and time-domain analysis. The energy of the signal is distributed so that specific frequencies are localized in time by the group delay time, and at specific instants in time the frequency is given by the instantaneous frequency. The net positive volume of the Wigner distribution is numerically equal to the signal's total energy. The paper shows the application of the Wigner Ville distribution, in the field of signal processing, using the Scilab environment.
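    For reference, the continuous-time Wigner-Ville distribution of a signal x(t) discussed above can be written as follows; its integral over the time-frequency plane equals the signal's total energy, which is the property referred to in the abstract.

```latex
W_x(t, f) = \int_{-\infty}^{+\infty}
  x\!\left(t + \tfrac{\tau}{2}\right)\,
  x^{*}\!\left(t - \tfrac{\tau}{2}\right)\,
  e^{-j 2\pi f \tau}\, d\tau ,
\qquad
\iint W_x(t, f)\, dt\, df = \int_{-\infty}^{+\infty} |x(t)|^{2}\, dt .
```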

  3. An Open-Source Toolbox for PEM Fuel Cell Simulation

    Directory of Open Access Journals (Sweden)

    Jean-Paul Kone

    2018-05-01

    Full Text Available In this paper, an open-source toolbox that can be used to accurately predict the distribution of the major physical quantities that are transported within a proton exchange membrane (PEM) fuel cell is presented. The toolbox has been developed using the Open Source Field Operation and Manipulation (OpenFOAM) platform, which is an open-source computational fluid dynamics (CFD) code. The base case results for the distribution of velocity, pressure, chemical species, Nernst potential, current density, and temperature are as expected. The plotted polarization curve was compared to the results from a numerical model and experimental data taken from the literature. The conducted simulations have generated a significant amount of data and information about the transport processes that are involved in the operation of a PEM fuel cell. The key role played by the concentration constant in shaping the cell polarization curve has been explored. The development of the present toolbox is in line with the objectives outlined in the International Energy Agency (IEA, Paris, France) Advanced Fuel Cell Annex 37 that is devoted to developing open-source computational tools to facilitate fuel cell technologies. The work therefore serves as a basis for devising additional features that are not always feasible with a commercial code.
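    The polarization behaviour mentioned above is commonly summarized by subtracting activation, ohmic and concentration losses from the Nernst potential. One widely used empirical form of the concentration-loss term, in which a concentration constant m (together with an exponent n) controls the steep voltage drop at high current density, is shown below; this is a generic textbook expression, not necessarily the exact formulation implemented in the toolbox.

```latex
V_{\mathrm{cell}} = E_{\mathrm{Nernst}}
  - A \ln\!\left(\frac{i}{i_0}\right)   % activation loss
  - i\, R_{\mathrm{ohm}}                % ohmic loss
  - m \exp(n\, i)                       % concentration loss
```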

  4. Addressing Interoperability in Open Hypermedia: The Design of the Open Hypermedia Protocol

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Reich, Siegfried; Wiil, Uffe K.

    1999-01-01

    Early hypertext systems were monolithic and closed, but newer systems tend to be open, distributed, and support collaboration. While this development has resulted in increased openness and flexibility, integration or adaptation of various different tools (such as content editors, viewers, services...

  5. Distributed data processing for public health surveillance

    Directory of Open Access Journals (Sweden)

    Yih Katherine

    2006-09-01

    Full Text Available Abstract Background Many systems for routine public health surveillance rely on centralized collection of individual, potentially identifiable personal health information (PHI) records. Although individual, identifiable patient records are essential for conditions for which there is mandated reporting, such as tuberculosis or sexually transmitted diseases, they are not routinely required for effective syndromic surveillance. Public concern about the routine collection of large quantities of PHI to support non-traditional public health functions may make alternative surveillance methods that do not rely on centralized identifiable PHI databases increasingly desirable. Methods The National Bioterrorism Syndromic Surveillance Demonstration Program (NDP) is an example of one alternative model. All PHI in this system is initially processed within the secured infrastructure of the health care provider that collects and holds the data, using uniform software distributed and supported by the NDP. Only highly aggregated count data is transferred to the datacenter for statistical processing and display. Results Detailed, patient-level information is readily available to the health care provider to elucidate signals observed in the aggregated data, or for ad hoc queries. We briefly describe the benefits and disadvantages associated with this distributed processing model for routine automated syndromic surveillance. Conclusion For well-defined surveillance requirements, the model can be successfully deployed with very low risk of inadvertent disclosure of PHI – a feature that may make participation in surveillance systems more feasible for organizations and more appealing to the individuals whose PHI they hold. It is possible to design and implement distributed systems to support non-routine public health needs if required.
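    The design point of the NDP model is that only aggregated counts leave the provider's infrastructure. A minimal sketch of that provider-side aggregation step is below; the record fields, the ICD-to-syndrome mapping and the ZIP truncation are hypothetical choices for illustration only.

```python
# Sketch of the provider-side aggregation step: identifiable visit records stay
# local; only (date, zip3, syndrome) -> count tuples are sent to the datacenter.
# Field names and the syndrome mapping are hypothetical.
from collections import Counter

SYNDROME_OF = {"J06.9": "respiratory", "A09": "gastrointestinal", "R50.9": "fever"}

def aggregate(visits):
    counts = Counter()
    for v in visits:                                   # v is a local PHI record
        syndrome = SYNDROME_OF.get(v["icd_code"])
        if syndrome:
            counts[(v["date"], v["zip"][:3], syndrome)] += 1
    return [{"date": d, "zip3": z, "syndrome": s, "count": c}
            for (d, z, s), c in sorted(counts.items())]

visits = [
    {"date": "2006-09-01", "zip": "02139", "icd_code": "J06.9", "name": "..."},
    {"date": "2006-09-01", "zip": "02134", "icd_code": "J06.9", "name": "..."},
    {"date": "2006-09-01", "zip": "02139", "icd_code": "A09",   "name": "..."},
]
print(aggregate(visits))   # no names or full addresses leave the provider
```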

  6. Python Open source Waveform ExtractoR (POWER): an open source, Python package to monitor and post-process numerical relativity simulations

    Science.gov (United States)

    Johnson, Daniel; Huerta, E. A.; Haas, Roland

    2018-01-01

    Numerical simulations of Einstein's field equations provide unique insights into the physics of compact objects moving at relativistic speeds and driven by strong gravitational interactions. Numerical relativity has played a key role in firmly establishing gravitational wave astrophysics as a new field of research, and it is now paving the way to establish whether gravitational wave radiation emitted from compact binary mergers is accompanied by electromagnetic and astro-particle counterparts. As numerical relativity continues to blend in with routine gravitational wave data analyses to validate the discovery of gravitational wave events, it is essential to develop open source tools to streamline these studies. Motivated by our own experience as users and developers of the open source, community software, the Einstein Toolkit, we present an open source, Python package that is ideally suited to monitor and post-process the data products of numerical relativity simulations, and compute the gravitational wave strain at future null infinity in high performance environments. We showcase the application of this new package to post-process a large numerical relativity catalog and extract higher-order waveform modes from numerical relativity simulations of eccentric binary black hole mergers and neutron star mergers. This new software fills a critical void in the arsenal of tools provided by the Einstein Toolkit consortium to the numerical relativity community.
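    One common way to obtain the strain from the Weyl scalar Ψ4 produced by such simulations is fixed-frequency integration: a double time integration performed in the frequency domain with a low-frequency cutoff that suppresses spurious drifts. The sketch below illustrates that general technique on a synthetic signal; it is not code from the POWER package, whose exact method may differ.

```python
# Illustration of fixed-frequency integration (FFI): recover the strain h from
# the Weyl scalar psi4 (with psi4 = d^2 h / dt^2) by dividing by -(2*pi*f)^2 in
# the frequency domain; frequencies below f0 are clamped to limit noise amplification.
# General technique only; not code from the POWER package.
import numpy as np

def strain_from_psi4(t, psi4, f0):
    dt = t[1] - t[0]
    freqs = np.fft.fftfreq(len(t), d=dt)
    f_eff = np.where(np.abs(freqs) < f0, f0, freqs)    # clamp low frequencies
    h_tilde = -np.fft.fft(psi4) / (2 * np.pi * f_eff) ** 2
    return np.fft.ifft(h_tilde)

# Synthetic check: psi4 is the second derivative of a damped sinusoid;
# some discrepancy near the window edges is expected.
t = np.linspace(0, 200, 4096)
h_true = np.exp(-t / 80) * np.sin(0.5 * t)
psi4 = np.gradient(np.gradient(h_true, t), t)
h_rec = strain_from_psi4(t, psi4, f0=0.02)
print("max |h_rec - h_true|:", float(np.max(np.abs(h_rec.real - h_true))))
```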

  7. Beyond Open Source: Evaluating the Community Availability of Software

    Directory of Open Access Journals (Sweden)

    Bret Davidson

    2016-01-01

    Full Text Available The Code4Lib community has produced an increasingly impressive collection of open source software over the last decade, but much of this creative work remains out of reach for large portions of the library community. Do the relatively privileged institutions represented by a majority of Code4Lib participants have a professional responsibility to support the adoption of their innovations? Drawing from old and new software packaging and distribution approaches (from freeware to Docker), we propose extending the open source software values of collaboration and transparency to include the wide and affordable distribution of software. We believe this will not only simplify the process of sharing our applications within the library community, but also make it possible for less well-resourced institutions to actually use our software. We identify areas of need, present our experiences with the users of our own open source projects, discuss our attempts to go beyond open source, propose a preliminary set of technology availability performance indicators for evaluating software availability, and make an argument for the internal value of supporting and encouraging a vibrant library software ecosystem.

  8. Solid Waste Processing Center Primary Opening Cells Systems, Equipment and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, Sharon A.; Baker, Carl P.; Mullen, O Dennis; Valdez, Patrick LJ

    2006-04-17

    This document addresses the remote systems and design integration aspects of the development of the Solid Waste Processing Center (SWPC), a facility to remotely open, sort, size reduce, and repackage mixed low-level waste (MLLW) and transuranic (TRU)/TRU mixed waste that is either contact-handled (CH) waste in large containers or remote-handled (RH) waste in various-sized packages.

  9. Numerical investigation of the recruitment process in open marine population models

    International Nuclear Information System (INIS)

    Angulo, O; López-Marcos, J C; López-Marcos, M A; Martínez-Rodríguez, J

    2011-01-01

    The changes in the dynamics, produced by the recruitment process in an open marine population model, are investigated from a numerical point of view. The numerical method considered, based on the representation of the solution along the characteristic lines, approximates properly the steady states of the model, and is used to analyze the asymptotic behavior of the solutions of the model
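    The kind of model referred to here is typically an age- or size-structured balance law whose solution is advanced along characteristic lines, with recruitment entering through the boundary condition; in an open marine population the recruitment term R(t) is supplied externally (for example by larval influx) rather than computed solely from the local adult stock. A generic form, written for orientation only and not as the paper's exact model, is

```latex
\frac{\partial p}{\partial t} + \frac{\partial p}{\partial a} = -\mu(a,t)\, p(a,t),
\qquad p(0,t) = R(t), \qquad p(a,0) = p_0(a),
```

    where p(a, t) is the population density at age a and time t and μ is the mortality rate.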

  10. Cross-coherent vector sensor processing for spatially distributed glider networks.

    Science.gov (United States)

    Nichols, Brendan; Sabra, Karim G

    2015-09-01

    Autonomous underwater gliders fitted with vector sensors can be used as a spatially distributed sensor array to passively locate underwater sources. However, to date, the positional accuracy required for robust array processing (especially coherent processing) is not achievable using dead-reckoning while the gliders remain submerged. To obtain such accuracy, the gliders can be temporarily surfaced to allow for global positioning system contact, but the acoustically active sea surface introduces locally additional sensor noise. This letter demonstrates that cross-coherent array processing, which inherently mitigates the effects of local noise, outperforms traditional incoherent processing source localization methods for this spatially distributed vector sensor network.

  11. STORMSeq: an open-source, user-friendly pipeline for processing personal genomics data in the cloud.

    Directory of Open Access Journals (Sweden)

    Konrad J Karczewski

    Full Text Available The increasing public availability of personal complete genome sequencing data has ushered in an era of democratized genomics. However, read mapping and variant calling software is constantly improving and individuals with personal genomic data may prefer to customize and update their variant calls. Here, we describe STORMSeq (Scalable Tools for Open-Source Read Mapping), a graphical interface cloud computing solution that does not require a parallel computing environment or extensive technical experience. This customizable and modular system performs read mapping, read cleaning, and variant calling and annotation. At present, STORMSeq costs approximately $2 and 5-10 hours to process a full exome sequence and $30 and 3-8 days to process a whole genome sequence. We provide this open-access and open-source resource as a user-friendly interface in Amazon EC2.

  12. Defining an Open Source Strategy for NASA

    Science.gov (United States)

    Mattmann, C. A.; Crichton, D. J.; Lindsay, F.; Berrick, S. W.; Marshall, J. J.; Downs, R. R.

    2011-12-01

    Over the course of the past year, we have worked to help frame a strategy for NASA and open source software. This includes defining information processes to understand open source licensing, attribution, commerciality, redistribution, communities, architectures, and interactions within the agency. Specifically, we held a training session on open source software at the NASA Earth Science Data Systems Working Group meeting, covering how it relates to the NASA Earth Science data systems enterprise, including EOSDIS, the Distributed Active Archive Centers (DAACs), ACCESS proposals, and the MEASURES communities, and efforts to understand how open source software can be both consumed and produced within that ecosystem. In addition, we presented at the 1st NASA Open Source Summit (OSS) and helped to define an agency-level strategy, a set of recommendations and paths forward for how to identify healthy open source communities, how to deal with issues such as contributions originating from other agencies, and how to search out talent with the right skills to develop software for NASA in the modern age. This talk will review our current recommendations for open source at NASA, and will cover the set of thirteen recommendations output from the NASA Open Source Summit and discuss some of their implications for the agency.

  13. The brain as a distributed intelligent processing system: an EEG study.

    Directory of Open Access Journals (Sweden)

    Armando Freitas da Rocha

    Full Text Available BACKGROUND: Various neuroimaging studies, both structural and functional, have provided support for the proposal that a distributed brain network is likely to be the neural basis of intelligence. The theory of Distributed Intelligent Processing Systems (DIPS), first developed in the field of Artificial Intelligence, was proposed to adequately model distributed neural intelligent processing. In addition, the neural efficiency hypothesis suggests that individuals with higher intelligence display more focused cortical activation during cognitive performance, resulting in lower total brain activation when compared with individuals who have lower intelligence. This may be understood as a property of the DIPS. METHODOLOGY AND PRINCIPAL FINDINGS: In our study, a new EEG brain mapping technique, based on the neural efficiency hypothesis and the notion of the brain as a Distributed Intelligence Processing System, was used to investigate the correlations between IQ evaluated with WAIS (Wechsler Adult Intelligence Scale) and WISC (Wechsler Intelligence Scale for Children), and the brain activity associated with visual and verbal processing, in order to test the validity of a distributed neural basis for intelligence. CONCLUSION: The present results support these claims and the neural efficiency hypothesis.

  14. The Brain as a Distributed Intelligent Processing System: An EEG Study

    Science.gov (United States)

    da Rocha, Armando Freitas; Rocha, Fábio Theoto; Massad, Eduardo

    2011-01-01

    Background Various neuroimaging studies, both structural and functional, have provided support for the proposal that a distributed brain network is likely to be the neural basis of intelligence. The theory of Distributed Intelligent Processing Systems (DIPS), first developed in the field of Artificial Intelligence, was proposed to adequately model distributed neural intelligent processing. In addition, the neural efficiency hypothesis suggests that individuals with higher intelligence display more focused cortical activation during cognitive performance, resulting in lower total brain activation when compared with individuals who have lower intelligence. This may be understood as a property of the DIPS. Methodology and Principal Findings In our study, a new EEG brain mapping technique, based on the neural efficiency hypothesis and the notion of the brain as a Distributed Intelligence Processing System, was used to investigate the correlations between IQ evaluated with WAIS (Wechsler Adult Intelligence Scale) and WISC (Wechsler Intelligence Scale for Children), and the brain activity associated with visual and verbal processing, in order to test the validity of a distributed neural basis for intelligence. Conclusion The present results support these claims and the neural efficiency hypothesis. PMID:21423657

  15. Compiling software for a hierarchical distributed processing system

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2013-12-31

    Compiling software for a hierarchical distributed processing system, including: providing to one or more compiling nodes software to be compiled, wherein at least a portion of the software to be compiled is to be executed by one or more nodes; compiling, by the compiling node, the software; maintaining, by the compiling node, any compiled software to be executed on the compiling node; selecting, by the compiling node, one or more nodes in a next tier of the hierarchy of the distributed processing system in dependence upon whether any compiled software is for the selected node or the selected node's descendants; and sending to the selected node only the compiled software to be executed by the selected node or the selected node's descendants.

  16. Ultralow field emission from thinned, open-ended, and defected carbon nanotubes by using microwave hydrogen plasma processing

    Energy Technology Data Exchange (ETDEWEB)

    Deng, Jian-Hua, E-mail: jhdeng1983@163.com [College of Physics and Materials Science, Tianjin Normal University, Tianjin 300387 (China); Cheng, Lin; Wang, Fan-Jie; Yu, Bin; Li, Guo-Zheng; Li, De-Jun [College of Physics and Materials Science, Tianjin Normal University, Tianjin 300387 (China); Cheng, Guo-An [Key Laboratory of Beam Technology and Material Modification of Ministry of Education, Beijing Normal University, Beijing 100875 (China)

    2015-01-01

    Graphical abstract: Thinned, open-ended, and defected carbon nanotubes were prepared by using hydrogen plasma processing. The processed carbon nanotubes have far better field emission performance than that of the pristine ones. - Highlights: • CVD prepared CNT arrays were processed by microwave hydrogen plasma. • Thinned, open-ended, and defected CNTs were obtained. • Processed CNTs have far better field emission performance than the pristine ones. • Processed CNTs have applicable emission stability after being perfectly aged. - Abstract: Ultralow field emission is achieved from carbon nanotubes (CNTs) by using microwave hydrogen plasma processing. After the processing, typical capped CNT tips are removed, with thinned, open-ended, and defected CNTs left. Structural analyses indicate that the processed CNTs have more sp³-hybridized defects as compared to the pristine ones. The morphology of CNTs can be readily controlled by adjusting microwave powers, which change the shape of CNTs by means of hydrogen plasma etching. Processed CNTs with optimal morphology are found to have an ultralow turn-on field of 0.566 V/μm and threshold field of 0.896 V/μm, much better than 0.948 and 1.559 V/μm of the as-grown CNTs, respectively. This improved FE performance is ascribed to the structural changes of CNTs after the processing. The thinned and open-ended shape of CNTs can facilitate electron tunneling through barriers and additionally, the increased defects at tube walls can serve as new active emission sites. Furthermore, our plasma processed CNTs exhibit excellent field emission stability at a large emission current density of 10.36 mA/cm² after being perfectly aged, showing promising prospects in applications as high-performance vacuum electron sources.

  17. The Earth System Grid Federation : an Open Infrastructure for Access to Distributed Geospatial Data

    Science.gov (United States)

    Cinquini, Luca; Crichton, Daniel; Mattmann, Chris; Harney, John; Shipman, Galen; Wang, Feiyi; Ananthakrishnan, Rachana; Miller, Neill; Denvil, Sebastian; Morgan, Mark

    2012-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).
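    The federation's shared search API can be queried over plain HTTP. The example below is hedged: the index-node hostname is one commonly used ESGF node and the parameters shown are typical of the esg-search interface, but the authoritative parameter list should be taken from the federation's own documentation.

```python
# Hedged example of querying an ESGF index node's search API over HTTP.
# Hostname and parameters are typical of ESGF deployments but may differ
# between nodes; consult the federation documentation for specifics.
import requests

params = {
    "project": "CMIP5",          # example project hosted by the federation
    "variable": "tas",           # near-surface air temperature
    "experiment": "historical",
    "format": "application/solr+json",
    "limit": 5,
}
resp = requests.get("https://esgf-node.llnl.gov/esg-search/search", params=params)
for doc in resp.json()["response"]["docs"]:
    print(doc.get("id"))
```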

  18. Using ISO/IEC 12207 to analyze open source software development processes: an E-learning case study

    OpenAIRE

    Krishnamurthy, Aarthy; O'Connor, Rory

    2013-01-01

    peer-reviewed To date, there is no comprehensive study of the open source software development process (OSSDP) carried out for open source (OS) e-learning systems. This paper presents work that objectively analyzes the open source software development (OSSD) practices of e-learning systems development communities; the results are represented using DEMO models. These results are compared using ISO/IEC 12207:2008. The comparison of DEMO models with ISO/IEC

  19. Determination of material distribution in heading process of small bimetallic bar

    Science.gov (United States)

    Presz, Wojciech; Cacko, Robert

    2018-05-01

    Electrical connectors mostly have silver contacts joined by riveting. In order to reduce costs, the core of the contact rivet can be replaced with a cheaper material, e.g. copper. There is a wide range of commercially available bimetallic (silver-copper) rivets on the market for the production of contacts. This creates new conditions in the riveting process, because a bimetallic object is riveted. In the analyzed example it is a small object, which places the process at the border of microforming. Based on FEM modeling of the loading process of bimetallic rivets with different material distributions, the desired distribution was chosen and the choice justified. Possible material distributions were parameterized with two parameters referring to the desired distribution characteristics. A parameter, the Coefficient of Mutual Interactions of Plastic Deformations, and the method of its determination are proposed. The parameter is determined based on two-parameter stress-strain curves and is a function of these parameters and the range of equivalent strains occurring in the analyzed process. The proposed method was applied to the upsetting process of the bimetallic head of an electrical contact. A nomogram was established to predict the distribution of materials in the head of the rivet and to support the appropriate selection of a pair of materials to achieve the desired distribution.

  20. Standard services for the capture, processing, and distribution of packetized telemetry data

    Science.gov (United States)

    Stallings, William H.

    1989-01-01

    Standard functional services for the capture, processing, and distribution of packetized data are discussed with particular reference to the future implementation of packet processing systems, such as those for the Space Station Freedom. The major functions are listed under the following major categories: input processing, packet processing, and output processing. A functional block diagram of a packet data processing facility is presented, showing the distribution of the various processing functions as well as the primary data flow through the facility.

  1. Enforcement of entailment constraints in distributed service-based business processes.

    Science.gov (United States)

    Hummer, Waldemar; Gaubatz, Patrick; Strembeck, Mark; Zdun, Uwe; Dustdar, Schahram

    2013-11-01

    A distributed business process is executed in a distributed computing environment. The service-oriented architecture (SOA) paradigm is a popular option for the integration of software services and execution of distributed business processes. Entailment constraints, such as mutual exclusion and binding constraints, are important means to control process execution. Mutually exclusive tasks result from the division of powerful rights and responsibilities to prevent fraud and abuse. In contrast, binding constraints define that a subject who performed one task must also perform the corresponding bound task(s). We aim to provide a model-driven approach for the specification and enforcement of task-based entailment constraints in distributed service-based business processes. Based on a generic metamodel, we define a domain-specific language (DSL) that maps the different modeling-level artifacts to the implementation-level. The DSL integrates elements from role-based access control (RBAC) with the tasks that are performed in a business process. Process definitions are annotated using the DSL, and our software platform uses automated model transformations to produce executable WS-BPEL specifications which enforce the entailment constraints. We evaluate the impact of constraint enforcement on runtime performance for five selected service-based processes from existing literature. Our evaluation demonstrates that the approach correctly enforces task-based entailment constraints at runtime. The performance experiments illustrate that the runtime enforcement operates with an overhead that scales well up to the order of several ten thousand logged invocations. Using our DSL annotations, the user-defined process definition remains declarative and clean of security enforcement code. Our approach decouples the concerns of (non-technical) domain experts from technical details of entailment constraint enforcement. The developed framework integrates seamlessly with WS-BPEL and the Web
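    The two constraint types are mechanical to check against an execution trace of who performed which task. The sketch below is independent of the authors' DSL and WS-BPEL tooling and uses invented task names; it only illustrates the semantics of mutual-exclusion and binding constraints.

```python
# Small sketch (not the authors' DSL) of checking task-based entailment
# constraints against an execution trace of (task, subject) assignments.
MUTUAL_EXCLUSION = [("approve_payment", "issue_payment")]   # must be different subjects
BINDING = [("prepare_contract", "sign_contract")]           # must be the same subject

def violations(trace):
    performed = {task: subject for task, subject in trace}
    found = []
    for a, b in MUTUAL_EXCLUSION:
        if a in performed and b in performed and performed[a] == performed[b]:
            found.append(f"mutual exclusion violated: {performed[a]} did both {a} and {b}")
    for a, b in BINDING:
        if a in performed and b in performed and performed[a] != performed[b]:
            found.append(f"binding violated: {a} by {performed[a]} but {b} by {performed[b]}")
    return found

trace = [("prepare_contract", "alice"), ("sign_contract", "bob"),
         ("approve_payment", "carol"), ("issue_payment", "carol")]
print(violations(trace))   # reports one violation of each kind
```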

  2. Accelerated life testing design using geometric process for pareto distribution

    OpenAIRE

    Mustafa Kamal; Shazia Zarrin; Arif Ul Islam

    2013-01-01

    In this paper the geometric process is used for the analysis of accelerated life testing under constant stress for Pareto Distribution. Assuming that the lifetimes under increasing stress levels form a geometric process, estimates of the parameters are obtained by using the maximum likelihood method for complete data. In addition, asymptotic interval estimates of the parameters of the distribution using Fisher information matrix are also obtained. The statistical properties of the parameters ...
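    For orientation, the geometric-process assumption states that the successive lifetimes at increasing stress levels are scaled copies of one another; a standard statement, together with the usual two-parameter Pareto distribution (whose parametrization may differ from the paper's), is

```latex
\{X_k,\ k = 1, 2, \dots\} \text{ is a geometric process with ratio } \lambda > 0
\iff \{\lambda^{\,k-1} X_k\} \text{ forms a renewal process},
\qquad
F_{X_1}(x) = 1 - \left(\tfrac{\theta}{x}\right)^{\alpha}, \quad x \ge \theta .
```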

  3. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data

    Directory of Open Access Journals (Sweden)

    Anderson Gordon A

    2009-03-01

    Full Text Available Abstract Background Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high performance chart controls. Results With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to

  4. Transparent checkpointing and process migration in a distributed system

    OpenAIRE

    2004-01-01

    A distributed system for creating a checkpoint for a plurality of processes running on the distributed system. The distributed system includes a plurality of compute nodes with an operating system executing on each compute node. A checkpoint library resides at the user level on each of the compute nodes, and the checkpoint library is transparent to the operating system residing on the same compute node and to the other compute nodes. Each checkpoint library uses a windowed messaging logging p...

  5. Weiqi games as a tree: Zipf's law of openings and beyond

    Science.gov (United States)

    Xu, Li-Gong; Li, Ming-Xia; Zhou, Wei-Xing

    2015-06-01

    Weiqi is one of the most complex board games played by two persons. The placement strategies adopted by Weiqi players are often used as analogies for the philosophy of human wars. In contrast to Western chess, Weiqi games are less studied by academics, partially because Weiqi is popular only in East Asia, especially in China, Japan and Korea. Here, we propose to construct a directed tree using a database of extensive Weiqi games and perform a quantitative analysis of the Weiqi tree. We find that the popularity distribution of Weiqi openings with the same number of moves is distributed according to a power law and the tail exponent increases with the number of moves. Intriguingly, the superposition of the popularity distributions of Weiqi openings with a number of moves not exceeding a given number also has a power-law tail whose exponent increases with the number of moves, and the superposed distribution approaches the Zipf law. These findings are the same as for chess and support the conjecture that the popularity distribution of board game openings follows the Zipf law with a universal exponent. We also find that the distribution of out-degrees has a power-law form, the distribution of branching ratios has a very complicated pattern, and the distribution of uniqueness scores defined by the path lengths from the root vertex to the leaf vertices exhibits a unimodal shape. Our work provides a promising direction for the study of the decision-making process of Weiqi playing from the perspective of a directed branching tree.
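    The opening tree described above can be built by counting, for each k, how many games share the same first k moves (the weight of the corresponding vertex in the prefix tree); the tail of that popularity distribution is what the authors fit with a power law. The sketch below performs the counting on synthetic games with a toy move alphabet; it does not reproduce the paper's fitting procedure.

```python
# Count opening popularity: the number of games sharing the same first-k moves
# equals the weight of that opening's vertex in the prefix tree (synthetic games).
from collections import Counter
import random

random.seed(1)
MOVES = [f"{c}{r}" for c in "abcdefghij" for r in range(1, 11)]  # toy move alphabet

def random_game(length=20):
    # Biased move choice so that some openings are much more popular than others.
    return tuple(random.choices(MOVES[:30], weights=range(30, 0, -1), k=length))

games = [random_game() for _ in range(10_000)]

def opening_popularity(games, k):
    """Popularity of each distinct k-move opening = games passing through that vertex."""
    return Counter(g[:k] for g in games)

for k in (2, 4, 6):
    pops = sorted(opening_popularity(games, k).values(), reverse=True)
    print(f"{k}-move openings: {len(pops)} distinct, most popular seen {pops[0]} times")
```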

  6. THE UNIQUE Na:O ABUNDANCE DISTRIBUTION IN NGC 6791: THE FIRST OPEN(?) CLUSTER WITH MULTIPLE POPULATIONS

    International Nuclear Information System (INIS)

    Geisler, D.; Villanova, S.; Cummings, J.; Carraro, G.; Pilachowski, C.; Johnson, C. I.; Bresolin, F.

    2012-01-01

    Almost all globular clusters investigated exhibit a spread in their light element abundances, the most studied being an Na:O anticorrelation. In contrast, open clusters show a homogeneous composition and are still regarded as Simple Stellar Populations. The most probable reason for this difference is that globulars had an initial mass high enough to retain primordial gas and ejecta from the first stellar generation and thus formed a second generation with a distinct composition, an initial mass exceeding that of open clusters. NGC 6791 is a massive open cluster and warrants a detailed search for chemical inhomogeneities. We collected high-resolution, high signal-to-noise spectra of 21 members covering a wide range of evolutionary status and measured their Na, O, and Fe content. We found [Fe/H] = +0.42 ± 0.01, in good agreement with previous values, and no evidence for a spread. However, the Na:O distribution is completely unprecedented. It becomes the first open cluster to show intrinsic abundance variations that cannot be explained by mixing, and thus the first discovered to host multiple populations. It is also the first star cluster to exhibit two subpopulations in the Na:O diagram with one being chemically homogeneous while the second has an intrinsic spread that follows the anticorrelation so far displayed only by globular clusters. NGC 6791 is unique in many aspects, displaying certain characteristics typical of open clusters, others more reminiscent of globulars, and yet others, in particular its Na:O behavior investigated here, that are totally unprecedented. It clearly had a complex and fascinating history.

  7. The Unique Na:O Abundance Distribution in NGC 6791: The First Open(?) Cluster with Multiple Populations

    Science.gov (United States)

    Geisler, D.; Villanova, S.; Carraro, G.; Pilachowski, C.; Cummings, J.; Johnson, C. I.; Bresolin, F.

    2012-09-01

    Almost all globular clusters investigated exhibit a spread in their light element abundances, the most studied being an Na:O anticorrelation. In contrast, open clusters show a homogeneous composition and are still regarded as Simple Stellar Populations. The most probable reason for this difference is that globulars had an initial mass high enough to retain primordial gas and ejecta from the first stellar generation and thus formed a second generation with a distinct composition, an initial mass exceeding that of open clusters. NGC 6791 is a massive open cluster and warrants a detailed search for chemical inhomogeneities. We collected high-resolution, high signal-to-noise spectra of 21 members covering a wide range of evolutionary status and measured their Na, O, and Fe content. We found [Fe/H] = +0.42 ± 0.01, in good agreement with previous values, and no evidence for a spread. However, the Na:O distribution is completely unprecedented. It becomes the first open cluster to show intrinsic abundance variations that cannot be explained by mixing, and thus the first discovered to host multiple populations. It is also the first star cluster to exhibit two subpopulations in the Na:O diagram with one being chemically homogeneous while the second has an intrinsic spread that follows the anticorrelation so far displayed only by globular clusters. NGC 6791 is unique in many aspects, displaying certain characteristics typical of open clusters, others more reminiscent of globulars, and yet others, in particular its Na:O behavior investigated here, that are totally unprecedented. It clearly had a complex and fascinating history.

  8. A prototype for JDEM science data processing

    International Nuclear Information System (INIS)

    Gottschalk, Erik E

    2011-01-01

    Fermilab is developing a prototype science data processing and data quality monitoring system for dark energy science. The purpose of the prototype is to demonstrate distributed data processing capabilities for astrophysics applications, and to evaluate candidate technologies for trade-off studies. We present the architecture and technical aspects of the prototype, including an open source scientific execution and application development framework, distributed data processing, and publish/subscribe message passing for quality control.
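
    The publish/subscribe message passing mentioned for quality control can be pictured with a minimal in-process broker; this is a generic sketch rather than the prototype's actual framework, and the topic name and message fields are invented.

        # Minimal in-process publish/subscribe broker (illustrative only).
        from collections import defaultdict

        class Broker:
            def __init__(self):
                self._subscribers = defaultdict(list)

            def subscribe(self, topic, callback):
                self._subscribers[topic].append(callback)

            def publish(self, topic, message):
                for callback in self._subscribers[topic]:
                    callback(message)

        broker = Broker()
        broker.subscribe("dq/image_quality", lambda msg: print("QC alert:", msg))
        broker.publish("dq/image_quality", {"frame": 42, "psf_fwhm": 1.3})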

  9. A distributed computing model for telemetry data processing

    Science.gov (United States)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-05-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.
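
    One way to picture the hybrid model (a central server complemented by peer-to-peer sharing) is the toy lookup below; the node names, parameter and fallback rule are invented, and the snippet is only a schematic of the idea, not the information sharing protocol itself.

        # Toy hybrid lookup: ask the central server first, then fall back to
        # peers that may already hold the processed telemetry value.
        class Node:
            def __init__(self, name, data=None):
                self.name = name
                self.data = data or {}

            def get(self, parameter):
                return self.data.get(parameter)

        def lookup(parameter, server, peers):
            value = server.get(parameter)
            if value is not None:
                return value, server.name      # client-server path
            for peer in peers:                 # peer-to-peer path
                value = peer.get(parameter)
                if value is not None:
                    return value, peer.name
            return None, None

        server = Node("telemetry-server", {"cabin_pressure": 14.6})
        peers = [Node("console-A"), Node("console-B", {"fuel_cell_temp": 77.2})]
        print(lookup("fuel_cell_temp", server, peers))  # -> (77.2, 'console-B')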

  10. A distributed computing model for telemetry data processing

    Science.gov (United States)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-01-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  11. A Java based environment to control and monitor distributed processing systems

    International Nuclear Information System (INIS)

    Legrand, I.C.

    1997-01-01

    Distributed processing systems are considered to solve the challenging requirements of triggering and data acquisition systems for future HEP experiments. The aim of this work is to present a software environment to control and monitor large-scale parallel processing systems based on a distributed client-server approach developed in Java. One server task may control several processing nodes, switching elements or controllers for different sub-systems. Servers are designed as multi-thread applications for efficient communications with other objects. Servers communicate between themselves by using Remote Method Invocation (RMI) in a peer-to-peer mechanism. This distributed server layer has to provide dynamic and transparent access from any client to all the resources in the system. The graphical user interface programs, which are platform independent, may be transferred to any client via the http protocol. In this scheme the control and monitor tasks are distributed among servers, and the network controls the flow of information among servers and clients, providing a flexible mechanism for monitoring and controlling large heterogeneous distributed systems. (author)
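
    To keep the examples in this document in a single language, the remote-invocation pattern described above is sketched with Python's standard XML-RPC rather than Java RMI; the node name and methods are invented, and the snippet only mimics the idea of a server exposing control and monitoring methods to clients.

        # Analogue of a control server exposing remote methods (the paper uses
        # Java RMI; Python's built-in XML-RPC is used here purely to illustrate).
        from xmlrpc.server import SimpleXMLRPCServer

        class NodeController:
            def status(self):
                return {"node": "proc-07", "busy": False}

            def start_run(self, run_number):
                return "run %d started on proc-07" % run_number

        server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True)
        server.register_instance(NodeController())
        # server.serve_forever()
        # A client would call xmlrpc.client.ServerProxy("http://localhost:8000").status()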

  12. Proceedings: Distributed digital systems, plant process computers, and networks

    International Nuclear Information System (INIS)

    1995-03-01

    These are the proceedings of a workshop on Distributed Digital Systems, Plant Process Computers, and Networks held in Charlotte, North Carolina on August 16--18, 1994. The purpose of the workshop was to provide a forum for technology transfer, technical information exchange, and education. The workshop was attended by more than 100 representatives of electric utilities, equipment manufacturers, engineering service organizations, and government agencies. The workshop consisted of three days of presentations, exhibitions, a panel discussion and attendee interactions. Original plant process computers at the nuclear power plants are becoming obsolete resulting in increasing difficulties in their effectiveness to support plant operations and maintenance. Some utilities have already replaced their plant process computers by more powerful modern computers while many other utilities intend to replace their aging plant process computers in the future. Information on recent and planned implementations are presented. Choosing an appropriate communications and computing network architecture facilitates integrating new systems and provides functional modularity for both hardware and software. Control room improvements such as CRT-based distributed monitoring and control, as well as digital decision and diagnostic aids, can improve plant operations. Commercially available digital products connected to the plant communications system are now readily available to provide distributed processing where needed. Plant operations, maintenance activities, and engineering analyses can be supported in a cost-effective manner. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database

  13. Programming Social Applications Building Viral Experiences with OpenSocial, OAuth, OpenID, and Distributed Web Frameworks

    CERN Document Server

    LeBlanc, Jonathan

    2011-01-01

    Social networking has made one thing clear: websites and applications need to provide users with experiences tailored to their preferences. This in-depth guide shows you how to build rich social frameworks, using open source technologies and specifications. You'll learn how to create third-party applications for existing sites, build engaging social graphs, and develop products to host your own socialized experience. Programming Social Apps focuses on the OpenSocial platform, along with Apache Shindig, OAuth, OpenID, and other tools, demonstrating how they work together to help you solve pra

  14. Alkali corrosion resistant coatings and ceramic foams having superfine open cell structure and method of processing

    Science.gov (United States)

    Brown, Jr., Jesse J.; Hirschfeld, Deidre A.; Li, Tingkai

    1993-12-07

    Alkali corrosion resistant coatings and ceramic foams having superfine open cell structure are created using sol-gel processes. The processes have particular application in creating calcium magnesium zirconium phosphate, CMZP, coatings and foams.

  15. Stationary distributions of stochastic processes described by a linear neutral delay differential equation

    International Nuclear Information System (INIS)

    Frank, T D

    2005-01-01

    Stationary distributions of processes are derived that involve a time delay and are defined by a linear stochastic neutral delay differential equation. The distributions are Gaussian distributions. The variances of the Gaussian distributions are either monotonically increasing or decreasing functions of the time delays. The variances become infinite when fixed points of corresponding deterministic processes become unstable. (letter to the editor)

  16. Risk assessment of occupational groups working in open pit mining: Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Yaşar Kasap

    2017-01-01

    Full Text Available In open pit mining it is possible to prevent industrial accidents and their consequences, such as deaths, physical disabilities and financial loss, by implementing risk analyses in advance. If the probabilities of different occupational groups encountering various hazards are determined, workers’ risk of having industrial accidents and catching occupational illnesses can be controlled. In this sense, the aim of this study was to assess the industrial accidents which occurred during open pit coal production in the Turkish Coal Enterprises (TCE) Garp Lignite unit between 2005 and 2010 and to analyze the risks using the Analytic Hierarchy Process (AHP). The analyses conducted with AHP revealed that the greatest risk in open pit mining is landslides, the most risky occupational group is unskilled labourers, and the most common hazards are caused by landslides and transportation/hand tools/falling.
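
    For readers unfamiliar with AHP, the core computation can be sketched as follows: priority weights are the normalized principal eigenvector of a pairwise comparison matrix, checked with a consistency ratio. The matrix values below are invented and do not reproduce the TCE analysis.

        # Analytic Hierarchy Process: priority weights and consistency ratio
        # for a pairwise comparison matrix (example judgements are hypothetical).
        import numpy as np

        RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}  # random consistency index

        def ahp_weights(matrix):
            A = np.asarray(matrix, dtype=float)
            eigvals, eigvecs = np.linalg.eig(A)
            k = int(np.argmax(eigvals.real))
            weights = np.abs(eigvecs[:, k].real)
            weights /= weights.sum()
            n = A.shape[0]
            ci = (eigvals[k].real - n) / (n - 1)       # consistency index
            cr = ci / RI[n] if RI.get(n) else 0.0      # < 0.1 is usually acceptable
            return weights, cr

        # Invented pairwise judgements: landslides vs. transportation vs. hand tools.
        weights, cr = ahp_weights([[1, 3, 5],
                                   [1/3, 1, 2],
                                   [1/5, 1/2, 1]])
        print(weights, cr)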

  17. A Framework System for Intelligent Support in Open Distributed Learning Environments--A Look Back from 16 Years Later

    Science.gov (United States)

    Hoppe, H. Ulrich

    2016-01-01

    The 1998 paper by Martin Mühlenbrock, Frank Tewissen, and myself introduced a multi-agent architecture and a component engineering approach for building open distributed learning environments to support group learning in different types of classroom settings. It took up prior work on "multiple student modeling" as a method to configure…

  18. Ruin Probabilities and Aggregrate Claims Distributions for Shot Noise Cox Processes

    DEFF Research Database (Denmark)

    Albrecher, H.; Asmussen, Søren

    We consider a risk process Rt where the claim arrival process is a superposition of a homogeneous Poisson process and a Cox process with a Poisson shot noise intensity process, capturing the effect of sudden increases of the claim intensity due to external events. The distribution of the aggregate claim size is investigated under these assumptions. For both light-tailed and heavy-tailed claim size distributions, asymptotic estimates for infinite-time and finite-time ruin probabilities are derived. Moreover, we discuss an extension of the model to an adaptive premium rule that is dynamically adjusted according to past claims experience.
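
    In standard textbook notation (not necessarily the authors' exact formulation), the surplus process and the shot-noise intensity behind this model can be written as

        R_t = u + c\,t - \sum_{i=1}^{N_t} X_i,
        \qquad
        \lambda_t = \lambda_0 + \sum_{j:\, T_j \le t} h(t - T_j),

    where u is the initial capital, c the premium rate, X_i the claim sizes, N_t the claim-counting process driven by the intensity \lambda_t, \lambda_0 the homogeneous Poisson component, T_j the arrival times of external shock events and h a decaying shot-noise kernel; ruin is the event that R_t drops below zero within the horizon considered.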

  19. An OpenMP Parallelisation of Real-time Processing of CERN LHC Beam Position Monitor Data

    CERN Document Server

    Renshall, H

    2012-01-01

    SUSSIX is a FORTRAN program for the post-processing of turn-by-turn Beam Position Monitor (BPM) data, which computes the frequency, amplitude, and phase of tunes and resonant lines to a high degree of precision. For analysis of LHC BPM data a specific version run through a C steering code has been implemented in the CERN Control Centre to run on a server under the Linux operating system, but it became a real-time computational bottleneck preventing truly online study of the BPM data. Timing studies showed that the independent processing of each BPM's data was a candidate for parallelization, and the Open Multiprocessing (OpenMP) package with its simple insertion of compiler directives was tried. It proved to be easy to learn and use, problem-free and efficient in this case, reaching a factor-of-ten reduction in real time over twelve cores on a dedicated server. This paper reviews the problem, shows the critical code fragments with their OpenMP directives and the results obtained.
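
    The per-BPM parallelisation idea (each monitor's data processed independently) can be mirrored in this document's example language with a process pool; this is only a Python analogue for illustration, not the OpenMP/FORTRAN implementation described in the record, and the analysis function is a placeholder.

        # Python analogue of the per-BPM parallelisation; the real code inserts
        # OpenMP directives into FORTRAN. The analysis function is a stand-in.
        import math
        from multiprocessing import Pool

        def analyse_bpm(samples):
            # stand-in for a SUSSIX-style tune analysis: crude RMS amplitude
            return math.sqrt(sum(s * s for s in samples) / len(samples))

        if __name__ == "__main__":
            bpm_data = [[math.sin(0.31 * t + b) for t in range(1024)]
                        for b in range(16)]
            with Pool() as pool:
                results = pool.map(analyse_bpm, bpm_data)  # one task per BPM
            print(results[:3])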

  20. INNOVATION PROCESS IN OPEN CAPITAL BRAZILIAN COMPANIES

    Directory of Open Access Journals (Sweden)

    Ricardo Floriani

    2013-12-01

    Full Text Available This study aims to identify the innovation process used by the open capital Brazilian companies and establish a ranking of the potentially innovative ones. For this, a questionnaire was sent to 484 companies with shares traded in Bovespa, receiving a response from 22. The innovation process is based on the model of Barrett and Sexton (2006). A summary of the results is presented below. (i) Organizational Capabilities – 95.5% answered that they have incentives for innovation activities and 68.2% reported having procedures for all services. The leadership has a facilitator role, encouraging initiative (86.4%) and promoting the maintenance of the group relationship (72.7%). They value risk taking, even through failures, and prioritize learning and experimenting with new ideas. (ii) Background of the innovation – reveals aspects of internal or external capacity. Of the respondents, 59.1% developed continuous internal R&D activities. Training to innovate is present on a continuous or occasional basis in 81.8% of the companies. The respondents characterize the economic environment as dynamic, and the majority purchased software and equipment. In only 12 instances was there a reference to obtaining patents as an innovation protection measure. (iii) Focus of innovation – the majority of the companies mentioned process or product innovation. Rewards are offered when the objectives are met, and attention is drawn when this does not occur. (iv) Highlighted performance – the innovations achieved the expectations and created effects. The relevant benefits noticed were: improvement in quality of goods and services, increase of market share, increase of goods and services, and increase of productive capacity.

  1. Cumulative distribution functions associated with bubble-nucleation processes in cavitation

    KAUST Repository

    Watanabe, Hiroshi

    2010-11-15

    Bubble-nucleation processes of a Lennard-Jones liquid are studied by molecular dynamics simulations. Waiting time, which is the lifetime of a superheated liquid, is determined for several system sizes, and the apparent finite-size effect of the nucleation rate is observed. From the cumulative distribution function of the nucleation events, the bubble-nucleation process is found to be not a simple Poisson process but a Poisson process with an additional relaxation time. The parameters of the exponential distribution associated with the process are determined by taking the relaxation time into account, and the apparent finite-size effect is removed. These results imply that the use of the arithmetic mean of the waiting time until a bubble grows to the critical size leads to an incorrect estimation of the nucleation rate. © 2010 The American Physical Society.
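
    Read in standard terms (our hedged paraphrase, not a quotation of the paper), a Poisson process with an additional relaxation time t_relax gives a cumulative waiting-time distribution of the form

        F(t) = 1 - \exp\!\left(-\frac{t - t_{\mathrm{relax}}}{\tau}\right), \qquad t \ge t_{\mathrm{relax}},

    so the mean waiting time is t_relax + \tau while the nucleation rate per unit volume corresponds to 1/(\tau V); simply averaging waiting times without accounting for t_relax therefore overestimates \tau, which is consistent with the incorrect rate estimate the abstract warns about.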

  2. Swan: A tool for porting CUDA programs to OpenCL

    Science.gov (United States)

    Harvey, M. J.; De Fabritiis, G.

    2011-04-01

    The use of modern, high-performance graphical processing units (GPUs) for acceleration of scientific computation has been widely reported. The majority of this work has used the CUDA programming model supported exclusively by GPUs manufactured by NVIDIA. An industry standardisation effort has recently produced the OpenCL specification for GPU programming. This offers the benefits of hardware-independence and reduced dependence on proprietary tool-chains. Here we describe a source-to-source translation tool, "Swan" for facilitating the conversion of an existing CUDA code to use the OpenCL model, as a means to aid programmers experienced with CUDA in evaluating OpenCL and alternative hardware. While the performance of equivalent OpenCL and CUDA code on fixed hardware should be comparable, we find that a real-world CUDA application ported to OpenCL exhibits an overall 50% increase in runtime, a reduction in performance attributable to the immaturity of contemporary compilers. The ported application is shown to have platform independence, running on both NVIDIA and AMD GPUs without modification. We conclude that OpenCL is a viable platform for developing portable GPU applications but that the more mature CUDA tools continue to provide best performance. Program summary: Program title: Swan; Catalogue identifier: AEIH_v1_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIH_v1_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: GNU Public License version 2; No. of lines in distributed program, including test data, etc.: 17 736; No. of bytes in distributed program, including test data, etc.: 131 177; Distribution format: tar.gz; Programming language: C; Computer: PC; Operating system: Linux; RAM: 256 Mbytes; Classification: 6.5; External routines: NVIDIA CUDA, OpenCL; Nature of problem: Graphical Processing Units (GPUs) from NVIDIA are preferentially programmed with the proprietary CUDA programming toolkit. An

  3. Open Distribution of Virtual Containers as a Key Framework for Open Educational Resources and STEAM Subjects

    Science.gov (United States)

    Corbi, Alberto; Burgos, Daniel

    2017-01-01

    This paper presents how virtual containers enhance the implementation of STEAM (science, technology, engineering, arts, and math) subjects as Open Educational Resources (OER). The publication initially summarizes the limitations of delivering open rich learning contents and corresponding assignments to students in college level STEAM areas. The…

  4. Being or Becoming: Toward an Open-System, Process-Centric Model of Personality.

    Science.gov (United States)

    Giordano, Peter J

    2015-12-01

    Mainstream personality psychology in the West neglects the investigation of intra-individual process and variation, because it favors a Being over a Becoming ontology. A Being ontology privileges a structural (e.g., traits or selves) conception of personality. Structure-centric models in turn suggest nomothetic research strategies and the investigation of individual and group differences. This article argues for an open-system, process-centric understanding of personality anchored in an ontology of Becoming. A classical Confucian model of personality is offered as an example of a process-centric approach for investigating and appreciating within-person personality process and variation. Both quantitative and qualitative idiographic strategies can be used as methods of scientific inquiry, particularly the exploration of the Confucian exemplar of psychological health and well-being.

  5. Open data for democracy : Developing a theoretical framework for open data use

    NARCIS (Netherlands)

    Ruijer, Erna; Grimmelikhuijsen, Stephan; Meijer, Albert

    2017-01-01

    Open data platforms are hoped to foster democratic processes, yet recent empirical research shows that so far they have failed to do so. We argue that current open data platforms do not take into account the complexity of democratic processes which results in overly simplistic approaches to open

  6. Syntactic processing is distributed across the language system.

    Science.gov (United States)

    Blank, Idan; Balewski, Zuzanna; Mahowald, Kyle; Fedorenko, Evelina

    2016-02-15

    Language comprehension recruits an extended set of regions in the human brain. Is syntactic processing localized to a particular region or regions within this system, or is it distributed across the entire ensemble of brain regions that support high-level linguistic processing? Evidence from aphasic patients is more consistent with the latter possibility: damage to many different language regions and to white-matter tracts connecting them has been shown to lead to similar syntactic comprehension deficits. However, brain imaging investigations of syntactic processing continue to focus on particular regions within the language system, often parts of Broca's area and regions in the posterior temporal cortex. We hypothesized that, whereas the entire language system is in fact sensitive to syntactic complexity, the effects in some regions may be difficult to detect because of the overall lower response to language stimuli. Using an individual-subjects approach to localizing the language system, shown in prior work to be more sensitive than traditional group analyses, we indeed find responses to syntactic complexity throughout this system, consistent with the findings from the neuropsychological patient literature. We speculate that such a distributed nature of syntactic processing could perhaps imply that syntax is inseparable from other aspects of language comprehension (e.g., lexico-semantic processing), in line with current linguistic and psycholinguistic theories and evidence. Neuroimaging investigations of syntactic processing thus need to expand their scope to include the entire system of high-level language processing regions in order to fully understand how syntax is instantiated in the human brain. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. The collaboration between and contribution of a digital open innovation platform to a local design process

    DEFF Research Database (Denmark)

    del Castillo, Jacqueline; Bhatti, Yasser; Hossain, Mokter

    2017-01-01

    We examine the potential of an open innovation digital platform to expose a local innovation process to a greater number of ideas and a more inclusive set of stakeholders. To do so we studied an online innovation challenge on the OpenIDEO to reimagine the end-of-life experience sponsored by Sutte...... it leads to a greater number of ideas from a wider and more inclusive set of stakeholders. We offer insights for the literatures on open innovation and design thinking in a healthcare context....

  8. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

    Full Text Available This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and the factors that influence verification quality are established. Software optimality verification is analyzed and some metrics are defined for the verification process.

  9. Investigation and Evaluation of the open source ETL tools GeoKettle and Talend Open Studio in terms of their ability to process spatial data

    Science.gov (United States)

    Kuhnert, Kristin; Quedenau, Jörn

    2016-04-01

    Integration and harmonization of large spatial data sets is not only since the introduction of the spatial data infrastructure INSPIRE a big issue. The process of extracting and combining spatial data from heterogeneous source formats, transforming that data to obtain the required quality for particular purposes and loading it into a data store, are common tasks. The procedure of Extraction, Transformation and Loading of data is called ETL process. Geographic Information Systems (GIS) can take over many of these tasks but often they are not suitable for processing large datasets. ETL tools can make the implementation and execution of ETL processes convenient and efficient. One reason for choosing ETL tools for data integration is that they ease maintenance because of a clear (graphical) presentation of the transformation steps. Developers and administrators are provided with tools for identification of errors, analyzing processing performance and managing the execution of ETL processes. Another benefit of ETL tools is that for most tasks no or only little scripting skills are required so that also researchers without programming background can easily work with it. Investigations on ETL tools for business approaches are available for a long time. However, little work has been published on the capabilities of those tools to handle spatial data. In this work, we review and compare the open source ETL tools GeoKettle and Talend Open Studio in terms of processing spatial data sets of different formats. For evaluation, ETL processes are performed with both software packages based on air quality data measured during the BÄRLIN2014 Campaign initiated by the Institute for Advanced Sustainability Studies (IASS). The aim of the BÄRLIN2014 Campaign is to better understand the sources and distribution of particulate matter in Berlin. The air quality data are available in heterogeneous formats because they were measured with different instruments. For further data analysis
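
    Independently of GeoKettle or Talend Open Studio, the basic extract-transform-load pattern discussed here can be sketched with the Python standard library; the file name, column names and filter rule below are invented and do not reflect the BÄRLIN2014 schema.

        # Minimal ETL sketch: extract CSV air-quality records, transform/clean
        # them, and load the result into SQLite (all names are illustrative).
        import csv
        import sqlite3

        def extract(path):
            with open(path, newline="") as f:
                yield from csv.DictReader(f)

        def transform(rows):
            for row in rows:
                try:
                    lat, lon = float(row["lat"]), float(row["lon"])
                    pm25 = float(row["pm25"])
                except (KeyError, ValueError):
                    continue              # drop malformed records
                if pm25 >= 0:             # simple quality filter
                    yield (row["station"], lat, lon, pm25)

        def load(records, db="air_quality.sqlite"):
            con = sqlite3.connect(db)
            con.execute("CREATE TABLE IF NOT EXISTS obs"
                        "(station TEXT, lat REAL, lon REAL, pm25 REAL)")
            con.executemany("INSERT INTO obs VALUES (?, ?, ?, ?)", records)
            con.commit()
            con.close()

        # load(transform(extract("measurements.csv")))  # hypothetical input file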

  10. Modelling of Wheat-Flour Dough Mixing as an Open-Loop Hysteretic Process

    Czech Academy of Sciences Publication Activity Database

    Anderssen, R.; Kružík, Martin

    2013-01-01

    Vol. 18, No. 2 (2013), pp. 283-293. ISSN 1531-3492. R&D Projects: GA AV ČR IAA100750802. Keywords: Dissipation; Dough mixing; Rate-independent systems. Subject RIV: BA - General Mathematics. Impact factor: 0.628, year: 2013. http://library.utia.cas.cz/separaty/2013/MTR/kruzik-modelling of wheat-flour dough mixing as an open-loop hysteretic process.pdf

  11. Distributed processing and analysis of ATLAS experimental data

    CERN Document Server

    Barberis, D; The ATLAS collaboration

    2011-01-01

    The ATLAS experiment has been taking data steadily since Autumn 2009, collecting close to 1 fb-1 of data (several petabytes of raw and reconstructed data per year of data-taking). Data are calibrated, reconstructed, distributed and analysed at over 100 different sites using the World-wide LHC Computing Grid and the tools produced by the ATLAS Distributed Computing project. In addition to event data, ATLAS produces a wealth of information on detector status, luminosity, calibrations, alignments, and data processing conditions. This information is stored in relational databases, online and offline, and made transparently available to analysers of ATLAS data world-wide through an infrastructure consisting of distributed database replicas and web servers that exploit caching technologies. This paper reports on the experience of using this distributed computing infrastructure with real data and in real time, on the evolution of the computing model driven by this experience, and on the system performance during the first...

  12. Distributed processing and analysis of ATLAS experimental data

    CERN Document Server

    Barberis, D; The ATLAS collaboration

    2011-01-01

    The ATLAS experiment has been taking data steadily since Autumn 2009, and has collected so far over 5 fb-1 of data (several petabytes of raw and reconstructed data per year of data-taking). Data are calibrated, reconstructed, distributed and analysed at over 100 different sites using the World-wide LHC Computing Grid and the tools produced by the ATLAS Distributed Computing project. In addition to event data, ATLAS produces a wealth of information on detector status, luminosity, calibrations, alignments, and data processing conditions. This information is stored in relational databases, online and offline, and made transparently available to analysers of ATLAS data world-wide through an infrastructure consisting of distributed database replicas and web servers that exploit caching technologies. This paper reports on the experience of using this distributed computing infrastructure with real data and in real time, on the evolution of the computing model driven by this experience, and on the system performance during the...

  13. OPEN INNOVATION PROCESSES IN SOCIAL MEDIA PLATFORMS

    OpenAIRE

    Yang, Yang

    2013-01-01

    Innovation capability has become a priority concern for many enterprises. Open innovation, which acts as a new innovation method, is now applied in many companies due to its unique advantages. On the other hand, social media platforms have been widely accepted by the public and offer immeasurable business resources. Based on these facts, there must be room to link social media and open innovation together to achieve a win-win. The objective was to research the important factors for op...

  14. An open-loop system design for deep space signal processing applications

    Science.gov (United States)

    Tang, Jifei; Xia, Lanhua; Mahapatra, Rabi

    2018-06-01

    A novel open-loop system design with high performance is proposed for space positioning and navigation signal processing. Divided by function, the system has four modules: a bandwidth-selectable data recorder, a narrowband signal analyzer, a time-delay difference of arrival estimator and an ANFIS supplement processor. A hardware-software co-design approach is taken to accelerate computing capability and improve system efficiency. Embedded with the proposed signal processing algorithms, the designed system is capable of handling tasks with high accuracy over long periods of continuous measurement. The experiment results show the Doppler frequency tracking root mean square error during 3 h of observation is 0.0128 Hz, while the TDOA residue analysis in the correlation power spectrum is 0.1166 rad.
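
    A common textbook way to obtain a time-delay difference of arrival, given here only as background and not necessarily the estimator used in this system, is to maximise the cross-correlation of the two received signals,

        \hat{\tau} = \arg\max_{\tau} \int x_1(t)\, x_2(t + \tau)\, dt,

    with position then recovered from the set of pairwise delays and the known receiver geometry.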

  15. An open-source automated continuous condition-based maintenance platform for commercial buildings

    Energy Technology Data Exchange (ETDEWEB)

    Katipamula, Srinivas; Gowri, Krishnan; Hernandez, George

    2016-09-09

    This paper describes one such reference process that can be deployed to provide continuous automated condition-based maintenance management for buildings that have BIM, a building automation system (BAS) and a computerized maintenance management system (CMMS). The process can be deployed using an open source transactional network platform, VOLTTRON™, designed for distributed sensing and controls, which supports both energy efficiency and grid services.

  16. Open software architecture for east articulated maintenance arm

    International Nuclear Information System (INIS)

    Wu, Jing; Wu, Huapeng; Song, Yuntao; Li, Ming; Yang, Yang; Alcina, Daniel A.M.

    2016-01-01

    Highlights: • The software requirements of a serial-articulated robot for EAST assembly and maintenance are presented. • An open software architecture for the robot is developed. • A component-based model distribution system with real-time communication for the robot is constructed. - Abstract: For the inside inspection and maintenance of the vacuum vessel in EAST, an articulated maintenance arm has been developed. In this article, an open software architecture developed for the EAST articulated maintenance arm (EAMA) is described, which offers robust, proper performance and an easy-going experience based on the standard open robotic platform OROCOS. The paper presents a component-based software architecture using a multi-layer structure: end layer, upper layer, middle layer, and lower layer. In the end layer the components are defined off-line in the task planner manner. The components in the upper layer complete the trajectory planning function. CORBA is adopted as a communication framework to exchange data between the distributed components. The contributors use Real-Time Workshop from MATLAB/Simulink to generate the components in the middle layer. The Real-Time Toolkit guarantees that control applications run in hard real-time mode. Ethernet and the CAN bus are used for data transfer in the lower layer, where the components implement the hardware functions. The distributed architecture of the control system associates each processing node with a joint, which is mapped to a component with all functioning features of the framework.

  17. Open software architecture for east articulated maintenance arm

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Jing, E-mail: wujing@ipp.ac.cn [Institute of Plasma Physics Chinese Academy of Sciences, 350 Shushanhu Rd Hefei Anhui (China); Lappeenranta University of Technology, Skinnarilankatu 34 Lappeenranta (Finland); Wu, Huapeng [Lappeenranta University of Technology, Skinnarilankatu 34 Lappeenranta (Finland); Song, Yuntao [Institute of Plasma Physics Chinese Academy of Sciences, 350 Shushanhu Rd Hefei Anhui (China); Li, Ming [Lappeenranta University of Technology, Skinnarilankatu 34 Lappeenranta (Finland); Yang, Yang [Institute of Plasma Physics Chinese Academy of Sciences, 350 Shushanhu Rd Hefei Anhui (China); Alcina, Daniel A.M. [Lappeenranta University of Technology, Skinnarilankatu 34 Lappeenranta (Finland)

    2016-11-01

    Highlights: • The software requirements of a serial-articulated robot for EAST assembly and maintenance are presented. • An open software architecture for the robot is developed. • A component-based model distribution system with real-time communication for the robot is constructed. - Abstract: For the inside inspection and maintenance of the vacuum vessel in EAST, an articulated maintenance arm has been developed. In this article, an open software architecture developed for the EAST articulated maintenance arm (EAMA) is described, which offers robust, proper performance and an easy-going experience based on the standard open robotic platform OROCOS. The paper presents a component-based software architecture using a multi-layer structure: end layer, upper layer, middle layer, and lower layer. In the end layer the components are defined off-line in the task planner manner. The components in the upper layer complete the trajectory planning function. CORBA is adopted as a communication framework to exchange data between the distributed components. The contributors use Real-Time Workshop from MATLAB/Simulink to generate the components in the middle layer. The Real-Time Toolkit guarantees that control applications run in hard real-time mode. Ethernet and the CAN bus are used for data transfer in the lower layer, where the components implement the hardware functions. The distributed architecture of the control system associates each processing node with a joint, which is mapped to a component with all functioning features of the framework.

  18. Ionization processes in a transient hollow cathode discharge before electric breakdown: statistical distribution

    International Nuclear Information System (INIS)

    Zambra, M.; Favre, M.; Moreno, J.; Wyndham, E.; Chuaqui, H.; Choi, P.

    1998-01-01

    The charge formation processes in the hollow cathode region (HCR) of a transient hollow cathode discharge have been studied in the final phase. The statistical distributions that describe the different ionization processes have been represented by Gaussian distributions. Nevertheless, a better representation of these distributions was observed when the pressure is near a minimum value, just before breakdown

  19. Dynamical Processes in Open Quantum Systems from a TDDFT Perspective: Resonances and Electron Photoemission.

    Science.gov (United States)

    Larsen, Ask Hjorth; De Giovannini, Umberto; Rubio, Angel

    2016-01-01

    We present a review of different computational methods to describe time-dependent phenomena in open quantum systems and their extension to a density-functional framework. We focus the discussion on electron emission processes in atoms and molecules addressing excited-state lifetimes and dissipative processes. Initially we analyze the concept of an electronic resonance, a central concept in spectroscopy associated with a metastable state from which an electron eventually escapes (electronic lifetime). Resonances play a fundamental role in many time-dependent molecular phenomena but can be rationalized from a time-independent context in terms of scattering states. We introduce the method of complex scaling, which is used to capture resonant states as localized states in the spirit of usual bound-state methods, and work on its extension to static and time-dependent density-functional theory. In a time-dependent setting, complex scaling can be used to describe excitations in the continuum as well as wave packet dynamics leading to electron emission. This process can also be treated by using open boundary conditions which allow time-dependent simulations of emission processes without artificial reflections at the boundaries (i.e., borders of the simulation box). We compare in detail different schemes to implement open boundaries, namely transparent boundaries using Green functions, and absorbing boundaries in the form of complex absorbing potentials and mask functions. The last two are regularly used together with time-dependent density-functional theory to describe the electron emission dynamics of atoms and molecules. Finally, we discuss approaches to the calculation of energy and angle-resolved time-dependent pump-probe photoelectron spectroscopy of molecular systems.
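
    As background for readers unfamiliar with the method (this is the standard textbook statement, not new material from the review), complex scaling rotates the electronic coordinates as

        r \;\to\; r\, e^{i\theta},

    under which a resonance appears as an isolated, \theta-independent complex eigenvalue

        E = E_{\mathrm{res}} - \tfrac{i}{2}\,\Gamma,

    whose imaginary part gives the electronic lifetime \hbar/\Gamma; this is why bound-state-like techniques can capture metastable states in this framework.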

  20. The Marginal Distributions of a Crossing Time and Renewal Numbers Related with Two Poisson Processes are as Ph-Distributions

    Directory of Open Access Journals (Sweden)

    Mir G. H. Talpur

    2006-01-01

    Full Text Available In this paper we consider how to find the marginal distributions of the crossing time and renewal numbers related to two Poisson processes by using probability arguments. The obtained results show that the one-dimensional marginal distributions are PH-distributions of order N+1.
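
    As background, in the usual phase-type notation (which may differ from the authors'), a PH-distribution of order N+1 is the absorption-time law of a Markov chain with N+1 transient states:

        F(t) = 1 - \boldsymbol{\alpha}\, e^{T t}\, \mathbf{1},
        \qquad
        f(t) = \boldsymbol{\alpha}\, e^{T t}\, \mathbf{t}_0, \qquad \mathbf{t}_0 = -T\,\mathbf{1},

    where \boldsymbol{\alpha} is the initial probability vector over the transient states and T the sub-generator matrix.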

  1. Optimization of valve opening process for the suppression of impulse exhaust noise

    Science.gov (United States)

    Li, Jingxiang; Zhao, Shengdun

    2017-02-01

    Impulse exhaust noise generated by the sudden impact of the discharging flow of pneumatic systems has significant temporal characteristics, including high sound pressure and a rapid sound transient. Impulse noise exposures are more hazardous to hearing than energy-equivalent uniform noise exposures. This paper presents a novel approach to suppress the peak sound pressure, a major indicator of the impulsiveness of impulse exhaust noise, by optimizing the opening process of the valve. Relationships between exhaust flow and impulse noise are described by thermodynamics and the noise-generating mechanism. An optimized approach based on controlling the valve opening process is then derived under a constraint of pre-set exhaust time. A modified servo-direct-driven valve was designed and assembled in a typical pneumatic system for verification experiments, in comparison with an original solenoid valve. Experimental results with groups of initial cylinder pressures and pre-set exhaust times are shown to verify the effects of the proposed optimization. Indicators of energy equivalence and impulsiveness are introduced to discuss the noise suppression effects. The relationship between noise reduction and exhaust time delay is also discussed.

  2. Development of a revolving drum reactor for open-sorption heat storage processes

    International Nuclear Information System (INIS)

    Zettl, Bernhard; Englmair, Gerald; Steinmaurer, Gerald

    2014-01-01

    To evaluate the potential of an open sorption storage process using molecular sieves to provide thermal energy for space heating and hot water, an experimental study of adsorption heat generation in a rotating reactor is presented. Dehydrated zeolites of types 4A and MSX were used in the form of spherical grains, and humidified room air was blown through the rotating bed. Zeolite batches of about 50 kg were able to generate adsorption heat of up to 12 kWh and temperature shifts of the process air of up to 36 K, depending on the inlet air water content and the state of dehydration of the storage materials. A detailed study of the heat transfer effects, the generated adsorption heat, and the evolving temperatures shows the applicability of the reactor and storage concept. - Highlights: • Use of an open adsorption concept for domestic heat supply was proved. • A rotating heat drum reactor concept was successfully applied. • Zeolite batches of 50 kg generated up to 12 kWh adsorption heat (580 kJ/kg). • Temperature shift in the rotating material bed was up to 60 K during adsorption

  3. IPNS distributed-processing data-acquisition system

    International Nuclear Information System (INIS)

    Haumann, J.R.; Daly, R.T.; Worlton, T.G.; Crawford, R.K.

    1981-01-01

    The Intense Pulsed Neutron Source (IPNS) at Argonne National Laboratory is a major new user-oriented facility which has come on line for basic research in neutron scattering and neutron radiation damage. This paper describes the distributed-processing data-acquisition system which handles data collection and instrument control for the time-of-flight neutron-scattering instruments. The topics covered include the overall system configuration, each of the computer subsystems, communication protocols linking each computer subsystem, and an overview of the software which has been developed

  4. The WIPP decision plan: Charting the course for openness in the decision making process

    International Nuclear Information System (INIS)

    Hagers, J.

    1992-01-01

    In June of 1989, the Secretary of Energy requested that a plan be developed that would clearly outline the prerequisites to opening the Waste Isolation Pilot Plant (WIPP). It was to provide the basis for a decision making process that was not only visible to the public, but one which included public participation. It must also be dynamic enough to effectively deal with the changing legislative, regulatory, and technical environments. Based on a recognized need for openness, the Secretary's Draft Decision Plan was developed. The plan charted the course for ultimately making the decision to declare WIPP ready to receive waste for the start of test phase operations. It outlined to critics and supporters alike the rigorous and thorough process by which the internal decisions were made. The plan identified all internal prerequisites to the decision; charted the review cycles, and targeted the completion dates. It also outlined the processes outside the control of the Department, institutional issues, such as legislative land withdrawal, issuance of permits, and designation of transportation routes

  5. Open Source Cloud-Based Technologies for Bim

    Science.gov (United States)

    Logothetis, S.; Karachaliou, E.; Valari, E.; Stylianidis, E.

    2018-05-01

    This paper presents a Cloud-based open source system for storing and processing data from a 3D survey approach. More specifically, we provide an online service for viewing, storing and analysing BIM. Cloud technologies were used to develop a web interface as a BIM data centre, which can handle large BIM data using a server. The server can be accessed by many users through various electronic devices anytime and anywhere so they can view online 3D models using browsers. Nowadays, the Cloud computing is engaged progressively in facilitating BIM-based collaboration between the multiple stakeholders and disciplinary groups for complicated Architectural, Engineering and Construction (AEC) projects. Besides, the development of Open Source Software (OSS) has been rapidly growing and their use tends to be united. Although BIM and Cloud technologies are extensively known and used, there is a lack of integrated open source Cloud-based platforms able to support all stages of BIM processes. The present research aims to create an open source Cloud-based BIM system that is able to handle geospatial data. In this effort, only open source tools will be used; from the starting point of creating the 3D model with FreeCAD to its online presentation through BIMserver. Python plug-ins will be developed to link the two software which will be distributed and freely available to a large community of professional for their use. The research work will be completed by benchmarking four Cloud-based BIM systems: Autodesk BIM 360, BIMserver, Graphisoft BIMcloud and Onuma System, which present remarkable results.

  6. OPEN SOURCE CLOUD-BASED TECHNOLOGIES FOR BIM

    Directory of Open Access Journals (Sweden)

    S. Logothetis

    2018-05-01

    Full Text Available This paper presents a Cloud-based open source system for storing and processing data from a 3D survey approach. More specifically, we provide an online service for viewing, storing and analysing BIM. Cloud technologies were used to develop a web interface as a BIM data centre, which can handle large BIM data using a server. The server can be accessed by many users through various electronic devices anytime and anywhere so they can view online 3D models using browsers. Nowadays, the Cloud computing is engaged progressively in facilitating BIM-based collaboration between the multiple stakeholders and disciplinary groups for complicated Architectural, Engineering and Construction (AEC projects. Besides, the development of Open Source Software (OSS has been rapidly growing and their use tends to be united. Although BIM and Cloud technologies are extensively known and used, there is a lack of integrated open source Cloud-based platforms able to support all stages of BIM processes. The present research aims to create an open source Cloud-based BIM system that is able to handle geospatial data. In this effort, only open source tools will be used; from the starting point of creating the 3D model with FreeCAD to its online presentation through BIMserver. Python plug-ins will be developed to link the two software which will be distributed and freely available to a large community of professional for their use. The research work will be completed by benchmarking four Cloud-based BIM systems: Autodesk BIM 360, BIMserver, Graphisoft BIMcloud and Onuma System, which present remarkable results.

  7. An Educational Tool for Interactive Parallel and Distributed Processing

    DEFF Research Database (Denmark)

    Pagliarini, Luigi; Lund, Henrik Hautop

    2011-01-01

    In this paper we try to describe how the Modular Interactive Tiles System (MITS) can be a valuable tool for introducing students to interactive parallel and distributed processing programming. This is done by providing an educational hands-on tool that allows a change of representation of the abstract problems related to designing interactive parallel and distributed systems. Indeed, MITS seems to bring a series of goals into the education, such as parallel programming, distributedness, communication protocols, master dependency, software behavioral models, adaptive interactivity, feedback, connectivity, topology, island modeling, user and multiuser interaction, which can hardly be found in other tools. Finally, we introduce the system of modular interactive tiles as a tool for easy, fast, and flexible hands-on exploration of these issues, and through examples show how to implement interactive

  8. Secrecy versus openness : Internet security and the limits of open source and peer production

    NARCIS (Netherlands)

    Schmidt, A.

    2014-01-01

    Open source and peer production have been praised as organisational models that could change the world for the better. It is commonly asserted that almost any societal activity could benefit from distributed, bottom-up collaboration — by making societal interaction more open, more social, and more

  9. Transactional Distance among Open University Students: How Does it Affect the Learning Process?

    Science.gov (United States)

    Kassandrinou, Amanda; Angelaki, Christina; Mavroidis, Ilias

    2014-01-01

    This study examines the presence of transactional distance among students, the factors affecting it, as well as the way it influences the learning process of students in a blended distance learning setting in Greece. The present study involved 12 postgraduate students of the Hellenic Open University (HOU). A qualitative research was conducted,…

  10. POSIX and Object Distributed Storage Systems Performance Comparison Studies With Real-Life Scenarios in an Experimental Data Taking Context Leveraging OpenStack Swift & Ceph

    Science.gov (United States)

    Poat, M. D.; Lauret, J.; Betts, W.

    2015-12-01

    The STAR online computing infrastructure has become an intensive, dynamic system used for first-hand data collection and analysis, resulting in a dense collection of data output. As we have transitioned to our current state, inefficient, limited storage systems have become an impediment to fast feedback to online shift crews. A centrally accessible, scalable and redundant distributed storage system had become a necessity in this environment. OpenStack Swift Object Storage and Ceph Object Storage are two eye-opening technologies, as community use and development have led to success elsewhere. In this contribution, OpenStack Swift and Ceph have been put to the test with single and parallel I/O tests, emulating real-world scenarios for data processing and workflows. The Ceph file system storage, offering a POSIX-compliant file system mounted similarly to an NFS share, was of particular interest as it aligned with our requirements and was retained as our solution. I/O performance tests were run against the Ceph POSIX file system and have presented surprising results indicating true potential for fast I/O and reliability. The STAR online compute farm has historically been used for job submission and first-hand data analysis. Reusing the online compute farm to maintain a storage cluster alongside job submission will be an efficient use of the current infrastructure.
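
    A much-simplified version of the kind of parallel write test described can be run against any POSIX mount point; the sketch below is generic Python, not the STAR benchmark code, and the mount path, block size and worker count are invented.

        # Crude parallel write-throughput probe for a POSIX-mounted file system
        # (e.g. a CephFS mount); purely illustrative.
        import os
        import time
        from concurrent.futures import ThreadPoolExecutor

        MOUNT = "/mnt/cephfs/benchmark"      # hypothetical mount point
        BLOCK = b"\0" * (4 * 1024 * 1024)    # 4 MiB block
        BLOCKS_PER_FILE = 64                 # 256 MiB per worker

        def write_file(i):
            path = os.path.join(MOUNT, "probe_%d.dat" % i)
            with open(path, "wb") as f:
                for _ in range(BLOCKS_PER_FILE):
                    f.write(BLOCK)
                f.flush()
                os.fsync(f.fileno())
            return BLOCKS_PER_FILE * len(BLOCK)

        if __name__ == "__main__":
            start = time.time()
            with ThreadPoolExecutor(max_workers=8) as pool:
                total = sum(pool.map(write_file, range(8)))
            print("%.1f MB/s aggregate" % (total / (time.time() - start) / 1e6))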

  11. Conversion and Validation of Distribution System Model from a QSTS-Based Tool to a Real-Time Dynamic Phasor Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Chamana, Manohar; Prabakar, Kumaraguru; Palmintier, Bryan; Baggu, Murali M.

    2017-05-11

    A software process is developed to convert distribution network models from a quasi-static time-series tool (OpenDSS) to a real-time dynamic phasor simulator (ePHASORSIM). The description of this process in this paper would be helpful for researchers who intend to perform similar conversions. The converter could be utilized directly by users of real-time simulators who intend to perform software-in-the-loop or hardware-in-the-loop tests on large distribution test feeders for a range of use cases, including testing functions of advanced distribution management systems against a simulated distribution system. In the future, the developers intend to release the conversion tool as open source to enable use by others.
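
    A converter of this kind typically begins by reading the OpenDSS model text; the fragment below is a heavily simplified, hypothetical sketch that parses "New Line." statements into Python dictionaries as an intermediate representation. The file name is invented, and a real conversion to ePHASORSIM covers many more element types and validation steps.

        # Very simplified reader for OpenDSS "New Line." statements, as a first
        # step of a model conversion; real feeders need many more element types.
        import re

        LINE_RE = re.compile(r"^\s*New\s+Line\.(\S+)\s+(.*)$", re.IGNORECASE)

        def parse_lines(dss_path):
            elements = []
            with open(dss_path) as f:
                for raw in f:
                    match = LINE_RE.match(raw)
                    if not match:
                        continue
                    name, rest = match.groups()
                    props = {}
                    for token in rest.split():
                        if "=" in token:
                            key, value = token.split("=", 1)
                            props[key.lower()] = value
                    elements.append({"name": name,
                                     "bus1": props.get("bus1"),
                                     "bus2": props.get("bus2"),
                                     "length": props.get("length")})
            return elements

        # print(parse_lines("feeder_lines.dss"))  # hypothetical input file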

  12. Spatial distribution of juvenile and adult female Tanner crabs (Chionoecetes bairdi) in a glacial fjord ecosystem: Implications for recruitment processes

    Science.gov (United States)

    Nielsen, J.K.; Taggart, S. James; Shirley, Thomas C.; Mondragon, Jennifer

    2007-01-01

    A systematic pot survey in Glacier Bay, Alaska, was conducted to characterize the spatial distribution of juvenile and adult female Tanner crabs, and their association with depth and temperature. The information was used to infer important recruitment processes for Tanner crabs in glaciated ecosystems. High-catch areas for juvenile and adult female Tanner crabs were identified using local autocorrelation statistics. Spatial segregation by size class corresponded to features in the glacial landscape: high-catch areas for juveniles were located at the distal ends of two narrow glacial fjords, and high-catch areas for adults were located in the open waters of the central Bay. Juvenile female Tanner crabs were found at nearly all sampled depths (15–439 m) and temperatures (4–8°C), but the biggest catches were at depths <150 m where adults were scarce. Because adults may prey on or compete with juveniles, the distribution of juveniles could be influenced by the distribution of adults. Areas where adults or predators are scarce, such as glacially influenced fjords, could serve as refuges for juvenile Tanner crabs.

  13. Transnational Learning Processes

    DEFF Research Database (Denmark)

    Nedergaard, Peter

    This paper analyses and compares the transnational learning processes in the employment field in the European Union and among the Nordic countries. Based theoretically on a social constructivist model of learning and methodologically on a questionnaire distributed to the relevant participants, a number of hypotheses concerning transnational learning processes are tested. The paper closes with a number of suggestions regarding an optimal institutional setting for facilitating transnational learning processes. Key words: Transnational learning, Open Method of Coordination, Learning, Employment, European Employment Strategy, European Union, Nordic countries.

  14. On process capability and system availability analysis of the inverse Rayleigh distribution

    Directory of Open Access Journals (Sweden)

    Sajid Ali

    2015-04-01

    Full Text Available In this article, process capability and system availability analysis is discussed for the inverse Rayleigh lifetime distribution. A Bayesian approach with a conjugate gamma prior distribution is adopted for the analysis. Different types of loss functions are considered to find Bayes estimates of the process capability and system availability. A simulation study is conducted to compare the different loss functions.
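    For reference, one common parameterization of the inverse Rayleigh lifetime distribution is shown below; the notation is ours and may differ from the paper's. Under squared-error loss, the Bayes estimate of a quantity is simply its posterior mean.

```latex
% Inverse Rayleigh density and distribution function (one common parameterization):
f(x;\lambda) = \frac{2\lambda}{x^{3}}\, e^{-\lambda/x^{2}}, \qquad
F(x;\lambda) = e^{-\lambda/x^{2}}, \qquad x > 0,\ \lambda > 0.
% Bayes estimate under squared-error loss (posterior mean):
\hat{\theta}_{\mathrm{SEL}} = \mathbb{E}\left[\theta \mid \text{data}\right].
```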

  15. Open Innovation and Stakeholder Engagement

    Directory of Open Access Journals (Sweden)

    Robert Wayne Gould

    2012-09-01

    Full Text Available The paradox of open innovation lies in the conflict between the practical desire to reap the benefits of open innovation and concern over the risk that others will misappropriate those benefits. Stakeholder theory and recent developments in value creation through stakeholder engagement can assist with reconciliation of this inherent structural risk. The limitations of existing open innovation typologies are identified, and a process-based model of open innovation is proposed. The model is then expanded to include stakeholder engagement. When integrated with stakeholder engagement, open innovation processes can be understood to generate benefits beyond the acquisition of specific information sought from external experts. The addition of stakeholder engagement to the open innovation model allows for greater understanding and easier acceptance of the risks inherent in the open innovation process.

  16. Visualization and analysis of atomistic simulation data with OVITO–the Open Visualization Tool

    International Nuclear Information System (INIS)

    Stukowski, Alexander

    2010-01-01

    The Open Visualization Tool (OVITO) is a new 3D visualization software package designed for post-processing atomistic data obtained from molecular dynamics or Monte Carlo simulations. Unique analysis, editing and animation functions are integrated into its easy-to-use graphical user interface. The software is written in object-oriented C++, controllable via Python scripts and easily extendable through a plug-in interface. It is distributed as open-source software and can be downloaded from the website http://ovito.sourceforge.net/
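    The abstract notes that OVITO is scriptable from Python. As a rough illustration, the sketch below uses the Python module shipped with current OVITO releases, which postdates the 2010 paper, so the module and class names are assumptions about the modern API rather than the version described; the dump file name is a placeholder. It loads a LAMMPS dump and applies a common-neighbor analysis modifier.

```python
from ovito.io import import_file
from ovito.modifiers import CommonNeighborAnalysisModifier

pipeline = import_file("dump.lammpstrj")            # assumed LAMMPS dump file
pipeline.modifiers.append(CommonNeighborAnalysisModifier())

data = pipeline.compute()                           # evaluate the pipeline for frame 0
# Per-structure atom counts (e.g. FCC/BCC) are reported as global attributes.
print({k: v for k, v in data.attributes.items()
       if k.startswith("CommonNeighborAnalysis")})
```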

  17. Aspects Concerning the Optimization of Authentication Process for Distributed Applications

    Directory of Open Access Journals (Sweden)

    Ion Ivan

    2008-06-01

    Full Text Available The types of distributed applications are presented, and the quality characteristics of distributed applications are analyzed. The ways to assign access rights are established, and the authentication categories are analyzed. We propose an algorithm for the optimization of the authentication process. The proposed algorithm is applied to the application “Evaluation of TIC projects”.

  18. Easy research data handling with an OpenEarth DataLab for geo-monitoring research

    Science.gov (United States)

    Vanderfeesten, Maurice; van der Kuil, Annemiek; Prinčič, Alenka; den Heijer, Kees; Rombouts, Jeroen

    2015-04-01

    OpenEarth DataLab is an open source-based collaboration and processing platform to enable streamlined research data management from raw data ingest and transformation to interoperable distribution. It enables geo-scientists to easily synchronise, share, compute and visualise the dynamic and most up-to-date research data, scripts and models in multi-stakeholder geo-monitoring programs. This DataLab is developed by the Research Data Services team of TU Delft Library and 3TU.Datacentrum together with coastal engineers of Delft University of Technology and Deltares. Based on the OpenEarth software stack an environment has been developed to orchestrate numerous geo-related open source software components that can empower researchers and increase the overall research quality by managing research data; enabling automatic and interoperable data workflows between all the components with track & trace, hit & run data transformation processing in cloud infrastructure using MatLab and Python, synchronisation of data and scripts (SVN), and much more. Transformed interoperable data products (KML, NetCDF, PostGIS) can be used by ready-made OpenEarth tools for further analyses and visualisation, and can be distributed via interoperable channels such as THREDDS (OpenDAP) and GeoServer. An example of a successful application of OpenEarth DataLab is the Sand Motor, an innovative method for coastal protection in the Netherlands. The Sand Motor is a huge volume of sand that has been applied along the coast to be spread naturally by wind, waves and currents. Different research disciplines are involved concerned with: weather, waves and currents, sand distribution, water table and water quality, flora and fauna, recreation and management. Researchers share and transform their data in the OpenEarth DataLab, that makes it possible to combine their data and to see influence of different aspects of the coastal protection on their models. During the project the data are available only for the
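    The DataLab distributes interoperable products such as NetCDF via THREDDS/OPeNDAP. As an illustration of what consuming such a product can look like, here is a minimal sketch using the netCDF4 library; the URL and variable name are placeholders, not actual OpenEarth or Sand Motor endpoints.

```python
from netCDF4 import Dataset

URL = "https://example.org/thredds/dodsC/sandmotor/bathymetry.nc"  # assumed OPeNDAP URL

with Dataset(URL) as ds:
    print(ds.title if "title" in ds.ncattrs() else "untitled dataset")
    for name, var in ds.variables.items():
        print(name, var.dimensions, var.shape)
    # Read only a subset instead of the full grid to keep the transfer small.
    z = ds.variables["z"][0, :100, :100]            # assumed variable name "z"
    print("mean depth of subset:", float(z.mean()))
```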

  19. Radar data processing using a distributed computational system

    Science.gov (United States)

    Mota, Gilberto F.

    1992-06-01

    This research specifies and validates a new concurrent decomposition scheme, called Confined Space Search Decomposition (CSSD), to exploit parallelism of Radar Data Processing algorithms using a Distributed Computational System. To formalize the specification, we propose and apply an object-oriented methodology called Decomposition Cost Evaluation Model (DCEM). To reduce the penalties of load imbalance, we propose a distributed dynamic load balance heuristic called Object Reincarnation (OR). To validate the research, we first compare our decomposition with an identified alternative using the proposed DCEM model and then develop a theoretical prediction of selected parameters. We also develop a simulation to check the Object Reincarnation Concept.

  20. Enhancement of the efficiency of the Open Cycle Phillips Optimized Cascade LNG process

    International Nuclear Information System (INIS)

    Fahmy, M.F.M.; Nabih, H.I.; El-Nigeily, M.

    2016-01-01

    Highlights: • Expanders replaced JT valves in the Phillips Optimized Cascade liquefaction process. • Improvement in plant liquefaction efficiency was evaluated in the presence of expanders. • A comparison of the different optimum cases for the liquefaction process is presented. Abstract: This study aims to improve the performance of the Open Cycle Phillips Optimized Cascade Process for the production of liquefied natural gas (LNG) through the replacement of Joule–Thomson (JT) valves by expanders. The expander has a higher thermodynamic efficiency than the JT valve. Moreover, the shaft power produced by the expander is integrated into the process. The study is conducted using the Aspen HYSYS-V7 simulation software for simulation of the Open Cycle Phillips Optimized Cascade Process with JT valves. Several proposed cases are simulated in which expanders are used instead of JT valves at different locations in the process: the propane cycle, the ethylene cycle, the methane cycle and the upstream of the heavies removal column. The optimum cases clearly indicate that expanders not only produce power, but also offer significant improvements in process performance as shown by total plant power consumption, LNG production, thermal efficiency, plant specific power and CO2 emissions reduction. Results also reveal that replacing JT valves by expanders in the methane cycle has a dominating influence on all performance criteria and hence can be considered the main contributor to the enhancement of the Phillips Optimized Cascade Process. This replacement of JT valves by liquid expanders at different locations of the methane cycle yields power savings in the range of 4.92–5.72%, plant thermal efficiency of 92.64–92.97% and an increase in LNG production of 5.77–7.04%. Moreover, applying liquid expanders at the determined optimum cases for the different cycles improves process performance and
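    The thermodynamic motivation for the replacement can be summarized with the textbook relations below (not taken from the paper): a JT valve expands the stream isenthalpically and recovers no work, whereas an expander approaches an isentropic path and recovers shaft work.

```latex
% Throttling (JT valve): isenthalpic, no shaft work recovered.
h_{\mathrm{out}} = h_{\mathrm{in}}
% Expander: approaches an isentropic path and recovers shaft work
% (\eta_s = isentropic efficiency, h_{\mathrm{out},s} = isentropic outlet enthalpy).
w_{\mathrm{shaft}} = h_{\mathrm{in}} - h_{\mathrm{out}}
                   = \eta_s \left( h_{\mathrm{in}} - h_{\mathrm{out},s} \right) > 0
```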

  1. Conversion and Validation of Distribution System Model from a QSTS-Based Tool to a Real-Time Dynamic Phasor Simulator: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Chamana, Manohar; Prabakar, Kumaraguru; Palmintier, Bryan; Baggu, Murali M.

    2017-04-11

    A software process is developed to convert distribution network models from a quasi-static time-series tool (OpenDSS) to a real-time dynamic phasor simulator (ePHASORSIM). The description of this process in this paper would be helpful for researchers who intend to perform similar conversions. The converter could be utilized directly by users of real-time simulators who intend to perform software-in-the-loop or hardware-in-the-loop tests on large distribution test feeders for a range of use cases, including testing functions of advanced distribution management systems against a simulated distribution system. In the future, the developers intend to release the conversion tool as open source to enable use by others.

  2. Open Access to essential health care information

    Directory of Open Access Journals (Sweden)

    Pandey Manoj

    2004-12-01

    Full Text Available Abstract Open Access publishing is a valuable resource for the synthesis and distribution of essential health care information. This article discusses the potential benefits of Open Access, specifically in terms of Low and Middle Income (LAMI countries in which there is currently a lack of informed health care providers – mainly a consequence of poor availability to information. We propose that without copyright restrictions, Open Access facilitates distribution of the most relevant research and health care information. Furthermore, we suggest that the technology and infrastructure that has been put in place for Open Access could be used to publish download-able manuals, guides or basic handbooks created by healthcare providers in LAMI countries.

  3. Density and diversity of OpenStreetMap road networks in China

    Directory of Open Access Journals (Sweden)

    Yingjia Zhang

    2015-12-01

    Full Text Available OpenStreetMap is a geographic information platform designed to provide real-time updates and user-generated content related to its freely available global map, and it is one of the most widely used examples of volunteered geographic information, a technique associated with so-called neogeography. This paper, based on data from China's OpenStreetMap road network in May 2014 and taking 340 prefecture-level cities in China as its study area, presents the geometric (road density) and attribute-related (type diversity) spatial patterns of the OpenStreetMap road network, and explores their relationship. The results are as follows. (1) The distribution of OpenStreetMap road density in Shenzhen, Shanghai, Hong Kong, and Macao predominantly obeys a positive skewness distribution. OpenStreetMap data for eastern China show a higher overall density and a circular structure. In central China there are noticeable discrepancies in road density, whereas in western China the road density is low. (2) OpenStreetMap road diversity shows a normal distribution. The spatial pattern of the so-called “Hu Huanyong line” is broken by the effect of diplomatic and strategic factors, showing high diversity along the peripheral border, coastal cities, and core inland cities. (3) China's OpenStreetMap is partitioned into four parts according to road density and diversity: high density and high diversity; low density and low diversity; high density and low diversity; and low density and high diversity. (4) The OpenStreetMap geographic information-collection process and mechanism were analyzed, demonstrating that road density reflects the preponderance of traffic in the real world, while road diversity reflects the demand for and value of road-related geographic information, as well as the interests of users toward OpenStreetMap geographic information.
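    The record does not specify how "type diversity" is computed. Purely as an illustration, a common way to quantify the diversity of road types in a city is Shannon entropy over the counts of each highway class; the counts below are made up.

```python
import math
from collections import Counter

def shannon_diversity(type_counts):
    """Shannon entropy H = -sum(p_i * ln p_i) over road-type proportions."""
    total = sum(type_counts.values())
    return -sum((n / total) * math.log(n / total)
                for n in type_counts.values() if n > 0)

# Made-up counts of OSM highway tags for one city, purely for illustration.
city_roads = Counter({"motorway": 120, "primary": 450, "secondary": 800,
                      "residential": 5200, "footway": 900})
print(f"Shannon diversity: {shannon_diversity(city_roads):.3f}")
```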

  4. Implementation of Grid Tier 2 and Tier 3 facilities on a Distributed OpenStack Cloud

    Science.gov (United States)

    Limosani, Antonio; Boland, Lucien; Coddington, Paul; Crosby, Sean; Huang, Joanna; Sevior, Martin; Wilson, Ross; Zhang, Shunde

    2014-06-01

    The Australian Government is making a AUD 100 million investment in Compute and Storage for the academic community. The Compute facilities are provided in the form of 30,000 CPU cores located at 8 nodes around Australia in a distributed virtualized Infrastructure as a Service facility based on OpenStack. The storage will eventually consist of over 100 petabytes located at 6 nodes. All will be linked via a 100 Gb/s network. This proceeding describes the development of a fully connected WLCG Tier-2 grid site as well as a general purpose Tier-3 computing cluster based on this architecture. The facility employs an extension to Torque to enable dynamic allocations of virtual machine instances. A base Scientific Linux virtual machine (VM) image is deployed in the OpenStack cloud and automatically configured as required using Puppet. Custom scripts are used to launch multiple VMs, integrate them into the dynamic Torque cluster and to mount remote file systems. We report on our experience in developing this nation-wide ATLAS and Belle II Tier 2 and Tier 3 computing infrastructure using the national Research Cloud and storage facilities.
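    The record mentions custom scripts that launch VM instances in the OpenStack cloud; those scripts are not shown here. As a rough sketch of the same idea using the openstacksdk Python library, the cloud profile, image, flavor, network and key names below are placeholders and this is not the tooling described in the paper.

```python
import openstack

conn = openstack.connect(cloud="research-cloud")        # assumed clouds.yaml entry

image = conn.compute.find_image("SL6-worker")           # assumed Scientific Linux image
flavor = conn.compute.find_flavor("m1.large")
network = conn.network.find_network("tier2-net")

# Launch one worker VM and wait until it is active.
server = conn.compute.create_server(
    name="worker-node-01",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
    key_name="cluster-admin",
)
server = conn.compute.wait_for_server(server)
print("launched", server.name, server.status)
```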

  5. Implementation of Grid Tier 2 and Tier 3 facilities on a Distributed OpenStack Cloud

    International Nuclear Information System (INIS)

    Limosani, Antonio; Boland, Lucien; Crosby, Sean; Huang, Joanna; Sevior, Martin; Coddington, Paul; Zhang, Shunde; Wilson, Ross

    2014-01-01

    The Australian Government is making an AUD 100 million investment in Compute and Storage for the academic community. The Compute facilities are provided in the form of 30,000 CPU cores located at 8 nodes around Australia in a distributed virtualized Infrastructure as a Service facility based on OpenStack. The storage will eventually consist of over 100 petabytes located at 6 nodes. All will be linked via a 100 Gb/s network. This proceeding describes the development of a fully connected WLCG Tier-2 grid site as well as a general purpose Tier-3 computing cluster based on this architecture. The facility employs an extension to Torque to enable dynamic allocations of virtual machine instances. A base Scientific Linux virtual machine (VM) image is deployed in the OpenStack cloud and automatically configured as required using Puppet. Custom scripts are used to launch multiple VMs, integrate them into the dynamic Torque cluster and to mount remote file systems. We report on our experience in developing this nation-wide ATLAS and Belle II Tier 2 and Tier 3 computing infrastructure using the national Research Cloud and storage facilities.

  6. Co-occurrence of Photochemical and Microbiological Transformation Processes in Open-Water Unit Process Wetlands.

    Science.gov (United States)

    Prasse, Carsten; Wenk, Jannis; Jasper, Justin T; Ternes, Thomas A; Sedlak, David L

    2015-12-15

    The fate of anthropogenic trace organic contaminants in surface waters can be complex due to the occurrence of multiple parallel and consecutive transformation processes. In this study, the removal of five antiviral drugs (abacavir, acyclovir, emtricitabine, lamivudine and zidovudine) via both bio- and phototransformation processes, was investigated in laboratory microcosm experiments simulating an open-water unit process wetland receiving municipal wastewater effluent. Phototransformation was the main removal mechanism for abacavir, zidovudine, and emtricitabine, with half-lives (t1/2,photo) in wetland water of 1.6, 7.6, and 25 h, respectively. In contrast, removal of acyclovir and lamivudine was mainly attributable to slower microbial processes (t1/2,bio = 74 and 120 h, respectively). Identification of transformation products revealed that bio- and phototransformation reactions took place at different moieties. For abacavir and zidovudine, rapid transformation was attributable to high reactivity of the cyclopropylamine and azido moieties, respectively. Despite substantial differences in kinetics of different antiviral drugs, biotransformation reactions mainly involved oxidation of hydroxyl groups to the corresponding carboxylic acids. Phototransformation rates of parent antiviral drugs and their biotransformation products were similar, indicating that prior exposure to microorganisms (e.g., in a wastewater treatment plant or a vegetated wetland) would not affect the rate of transformation of the part of the molecule susceptible to phototransformation. However, phototransformation strongly affected the rates of biotransformation of the hydroxyl groups, which in some cases resulted in greater persistence of phototransformation products.
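    The half-lives quoted above correspond to the usual pseudo-first-order description of both photo- and biotransformation; the relation below is a standard one and not specific to this paper.

```latex
% Pseudo-first-order loss of a contaminant C with rate constant k:
\frac{d[C]}{dt} = -k\,[C]
\quad\Longrightarrow\quad
[C](t) = [C]_0\, e^{-kt},
\qquad
t_{1/2} = \frac{\ln 2}{k}.
```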

  7. Distributed inter process communication framework of BES III DAQ online software

    International Nuclear Information System (INIS)

    Li Fei; Liu Yingjie; Ren Zhenyu; Wang Liang; Chinese Academy of Sciences, Beijing; Chen Mali; Zhu Kejun; Zhao Jingwei

    2006-01-01

    The DAQ (data acquisition) system is an important part of BES III, the large-scale high-energy physics detector at BEPC. Inter-process communication (IPC) of the online software in distributed environments is pivotal to the design and implementation of the DAQ system. This article introduces a distributed inter-process communication framework that is based on CORBA and used in the BES III DAQ online software. The article mainly presents the design and implementation of the IPC framework and applications based on it. (authors)

  8. Rock fracture processes in chemically reactive environments

    Science.gov (United States)

    Eichhubl, P.

    2015-12-01

    Rock fracture is traditionally viewed as a brittle process involving damage nucleation and growth in a zone ahead of a larger fracture, resulting in fracture propagation once a threshold loading stress is exceeded. It is now increasingly recognized that coupled chemical-mechanical processes influence fracture growth in a wide range of subsurface conditions that include igneous, metamorphic, and geothermal systems, and diagenetically reactive sedimentary systems with possible applications to hydrocarbon extraction and CO2 sequestration. Fracture processes aided or driven by chemical change can affect the onset of fracture, fracture shape and branching characteristics, and fracture network geometry, thus influencing mechanical strength and flow properties of rock systems. We are investigating two fundamental modes of chemical-mechanical interactions associated with fracture growth: 1. Fracture propagation may be aided by chemical dissolution or hydration reactions at the fracture tip, allowing fracture propagation under subcritical stress loading conditions. We are evaluating effects of environmental conditions on critical (fracture toughness KIc) and subcritical (subcritical index) fracture properties using double torsion fracture mechanics tests on shale and sandstone. Depending on rock composition, the presence of reactive aqueous fluids can increase or decrease KIc and/or the subcritical index. 2. Fracture may be concurrent with distributed dissolution-precipitation reactions in the host rock beyond the immediate vicinity of the fracture tip. Reconstructing the fracture opening history recorded in crack-seal fracture cement of deeply buried sandstone, we find that fracture length growth and fracture opening can be decoupled, with a phase of initial length growth followed by a phase of dominant fracture opening. This suggests that mechanical crack-tip failure processes, possibly aided by chemical crack-tip weakening, and distributed solution-precipitation creep in the

  9. Measurement of tracer gas distributions using an open-path FTIR system coupled with computed tomography

    Science.gov (United States)

    Drescher, Anushka C.; Yost, Michael G.; Park, Doo Y.; Levine, Steven P.; Gadgil, Ashok J.; Fischer, Marc L.; Nazaroff, William W.

    1995-05-01

    Optical remote sensing and iterative computed tomography (CT) can be combined to measure the spatial distribution of gaseous pollutant concentrations in a plane. We have conducted chamber experiments to test this combination of techniques using an Open Path Fourier Transform Infrared Spectrometer (OP-FTIR) and a standard algebraic reconstruction technique (ART). ART was found to converge to solutions that showed excellent agreement with the ray integral concentrations measured by the FTIR but were inconsistent with simultaneously gathered point sample concentration measurements. A new CT method was developed based on (a) the superposition of bivariate Gaussians to model the concentration distribution and (b) a simulated annealing minimization routine to find the parameters of the Gaussians that resulted in the best fit to the ray integral concentration data. This new method, named smooth basis function minimization (SBFM) generated reconstructions that agreed well, both qualitatively and quantitatively, with the concentration profiles generated from point sampling. We present one set of illustrative experimental data to compare the performance of ART and SBFM.
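    The standard algebraic reconstruction technique used above is essentially a Kaczmarz-style row-by-row update of the concentration image. The sketch below is generic ART, not the authors' code; the ray-geometry matrix A and the measured ray integrals b are assumed inputs.

```python
import numpy as np

def art_reconstruct(A, b, n_iter=50, relax=0.5):
    """
    Generic algebraic reconstruction technique (Kaczmarz iterations).
    A: (n_rays, n_pixels) matrix of ray-path lengths through each pixel.
    b: (n_rays,) ray-integral concentrations measured by the OP-FTIR.
    Returns a non-negative pixel concentration vector.
    """
    x = np.zeros(A.shape[1])
    row_norms = np.sum(A * A, axis=1)
    for _ in range(n_iter):
        for i in range(A.shape[0]):
            if row_norms[i] == 0:
                continue
            residual = b[i] - A[i] @ x
            x += relax * residual / row_norms[i] * A[i]
        np.clip(x, 0.0, None, out=x)   # concentrations cannot be negative
    return x

# Tiny synthetic example: two rays crossing a two-pixel "plane".
A = np.array([[1.0, 0.0], [0.5, 0.5]])
b = np.array([2.0, 2.5])
print(art_reconstruct(A, b))   # converges toward [2, 3]
```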

  10. ACToR Chemical Structure processing using Open Source ...

    Science.gov (United States)

    ACToR (Aggregated Computational Toxicology Resource) is a centralized database repository developed by the National Center for Computational Toxicology (NCCT) at the U.S. Environmental Protection Agency (EPA). Free and open source tools were used to compile toxicity data from over 1,950 public sources. ACToR contains chemical structure information and toxicological data for over 558,000 unique chemicals. The database primarily includes data from NCCT research programs, in vivo toxicity data from ToxRef, human exposure data from ExpoCast, high-throughput screening data from ToxCast and high quality chemical structure information from the EPA DSSTox program. The DSSTox database is a chemical structure inventory for the NCCT programs and currently has about 16,000 unique structures. Included are also data from PubChem, ChemSpider, USDA, FDA, NIH and several other public data sources. ACToR has been a resource to various international and national research groups. Most of our recent efforts on ACToR are focused on improving the structural identifiers and Physico-Chemical properties of the chemicals in the database. Organizing this huge collection of data and improving the chemical structure quality of the database has posed some major challenges. Workflows have been developed to process structures, calculate chemical properties and identify relationships between CAS numbers. The Structure processing workflow integrates web services (PubChem and NIH NCI Cactus) to d

  11. Log-normal distribution from a process that is not multiplicative but is additive.

    Science.gov (United States)

    Mouri, Hideaki

    2013-10-01

    The central limit theorem ensures that a sum of random variables tends to a Gaussian distribution as their total number tends to infinity. However, for a class of positive random variables, we find that the sum tends faster to a log-normal distribution. Although the sum tends eventually to a Gaussian distribution, the distribution of the sum is always close to a log-normal distribution rather than to any Gaussian distribution if the summands are numerous enough. This is in contrast to the current consensus that any log-normal distribution is due to a product of random variables, i.e., a multiplicative process, or equivalently to nonlinearity of the system. In fact, the log-normal distribution is also observable for a sum, i.e., an additive process that is typical of linear systems. We show conditions for such a sum, an analytical example, and an application to random scalar fields such as those of turbulence.
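    A quick numerical illustration of the claim (our own toy example, not the paper's analysis): if the sum of positive summands is close to log-normal, the skewness of log(sum) should be much closer to zero than the skewness of the sum itself. The choice of summand distribution and sample sizes below is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

N_SUMMANDS = 10          # number of positive random variables in each sum
N_SAMPLES = 100_000      # Monte Carlo repetitions

# Positive, right-skewed summands (log-normal summands here, purely illustrative).
summands = rng.lognormal(mean=0.0, sigma=1.0, size=(N_SAMPLES, N_SUMMANDS))
sums = summands.sum(axis=1)

def skewness(x):
    x = x - x.mean()
    return np.mean(x**3) / np.mean(x**2) ** 1.5

print("skewness of sum      :", round(skewness(sums), 3))       # clearly > 0
print("skewness of log(sum) :", round(skewness(np.log(sums)), 3))  # near 0
```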

  12. The influence of emotion on lexical processing: insights from RT distributional analysis.

    Science.gov (United States)

    Yap, Melvin J; Seow, Cui Shan

    2014-04-01

    In two lexical decision experiments, the present study was designed to examine emotional valence effects on visual lexical decision (standard and go/no-go) performance, using traditional analyses of means and distributional analyses of response times. Consistent with an earlier study by Kousta, Vinson, and Vigliocco (Cognition 112:473-481, 2009), we found that emotional words (both negative and positive) were responded to faster than neutral words. Finer-grained distributional analyses further revealed that the facilitation afforded by valence was reflected by a combination of distributional shifting and an increase in the slow tail of the distribution. This suggests that emotional valence effects in lexical decision are unlikely to be entirely mediated by early, preconscious processes, which are associated with pure distributional shifting. Instead, our results suggest a dissociation between early preconscious processes and a later, more task-specific effect that is driven by feedback from semantically rich representations.

  13. Distributed processing in receivers based on tensor for cooperative communications systems

    OpenAIRE

    Igor Flávio Simões de Sousa

    2014-01-01

    In this dissertation, we present a distributed data estimation and detection approach for the uplink of a network that uses CDMA at transmitters (users). The analyzed network can be represented by an undirected and connected graph, where the nodes use a distributed estimation algorithm based on consensus averaging to perform joint channel and symbol estimation using a receiver based on tensor signal processing. The centralized receiver, developed for a central base station, and the distribute...
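    Consensus averaging, mentioned above as the basis of the distributed estimator, can be summarized by the standard iterative update shown below (a generic textbook form, not the dissertation's exact algorithm): each node repeatedly moves its local estimate toward those of its neighbours.

```latex
% Generic consensus-averaging update at node i, with neighbourhood N_i and step size epsilon:
x_i^{(k+1)} = x_i^{(k)} + \epsilon \sum_{j \in \mathcal{N}_i} \left( x_j^{(k)} - x_i^{(k)} \right),
\qquad
\lim_{k \to \infty} x_i^{(k)} = \frac{1}{n} \sum_{j=1}^{n} x_j^{(0)}
\quad \text{(connected graph, suitably small } \epsilon\text{).}
```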

  14. Towards OpenVL: Improving Real-Time Performance of Computer Vision Applications

    Science.gov (United States)

    Shen, Changsong; Little, James J.; Fels, Sidney

    Meeting constraints for real-time performance is a main issue for computer vision, especially for embedded computer vision systems. This chapter presents our progress on our open vision library (OpenVL), a novel software architecture to address efficiency through facilitating hardware acceleration, reusability, and scalability for computer vision systems. A logical image understanding pipeline is introduced to allow parallel processing. We also discuss progress on our middleware—vision library utility toolkit (VLUT)—that enables applications to operate transparently over a heterogeneous collection of hardware implementations. OpenVL works as a state machine, with an event-driven mechanism to provide users with application-level interaction. Various explicit or implicit synchronization and communication methods are supported among distributed processes in the logical pipelines. The intent of OpenVL is to allow users to quickly and easily recover useful information from multiple scenes, in a cross-platform, cross-language manner across various software environments and hardware platforms. To validate the critical underlying concepts of OpenVL, a human tracking system and a local positioning system are implemented and described. The novel architecture separates the specification of algorithmic details from the underlying implementation, allowing for different components to be implemented on an embedded system without recompiling code.

  15. An Open-Rotor Distributed Propulsion Aircraft Study

    OpenAIRE

    Gibbs, Jonathan; Bachmann, Arne; Seyfang, George; Peebles, Patrick; May, Chris; Saracoğlu, Bayındır; Paniagua, Guillermo

    2016-01-01

    The EU-funded SOAR project analyzed the high-lift efficiency of an open-fan wing design by systematic variation of fan blade count and angle. The research project built a cross-flow fan propelled wing section and investigated it by means of fluid dynamic simulation and wind tunnel testing. The experimental data resulting from the wind tunnel model were used to generate non-dimensional parameters which were used to scale data for the full-scale SOAR wing section. Preliminary aircraft ...

  16. Open discovery: An integrated live Linux platform of Bioinformatics tools.

    Science.gov (United States)

    Vetrivel, Umashankar; Pilla, Kalabharath

    2008-01-01

    Historically, live Linux distributions for Bioinformatics have paved the way for portability of the Bioinformatics workbench in a platform-independent manner. Moreover, most of the existing live Linux distributions limit their usage to sequence analysis and basic molecular visualization programs and are devoid of data persistence. Hence, Open Discovery, a live Linux distribution, has been developed with the capability to perform complex tasks like molecular modeling, docking and molecular dynamics in a swift manner. Furthermore, it is also equipped with a complete sequence analysis environment and is capable of running Windows executable programs in a Linux environment. Open Discovery portrays the advanced customizable configuration of Fedora, with data persistency accessible via USB drive or DVD. Open Discovery is distributed free under the Academic Free License (AFL) and can be downloaded from http://www.OpenDiscovery.org.in.

  17. Opening the Learning Process: The Potential Role of Feature Film in Teaching Employment Relations

    Science.gov (United States)

    Lafferty, George

    2016-01-01

    This paper explores the potential of feature film to encourage more inclusive, participatory and open learning in the area of employment relations. Evaluations of student responses in a single postgraduate course over a five-year period revealed how feature film could encourage participatory learning processes in which students reexamined their…

  18. An Extended Genetic Algorithm for Distributed Integration of Fuzzy Process Planning and Scheduling

    Directory of Open Access Journals (Sweden)

    Shuai Zhang

    2016-01-01

    Full Text Available The distributed integration of process planning and scheduling (DIPPS) aims to simultaneously arrange the two most important manufacturing stages, process planning and scheduling, in a distributed manufacturing environment. Meanwhile, because it corresponds well to actual manufacturing situations, the triangular fuzzy number (TFN) is adopted in DIPPS to represent machine processing and transportation times. In order to solve this problem and obtain the optimal or near-optimal solution, an extended genetic algorithm (EGA) with an innovative three-class encoding method and improved crossover and mutation strategies is proposed. Furthermore, a local enhancement strategy featuring machine replacement and order exchange is added to strengthen the local search capability of the basic genetic algorithm. Experimental verification shows that EGA achieves satisfactory results in a very short period of time and demonstrates its powerful performance in dealing with the distributed integration of fuzzy process planning and scheduling (DIFPPS).
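    A triangular fuzzy number, as used above for processing and transportation times, is commonly written as a triple with a piecewise-linear membership function; the definition below is the standard one, in our notation rather than the paper's.

```latex
% Triangular fuzzy number \tilde{t} = (a, b, c), with a \le b \le c, and membership function
\mu_{\tilde{t}}(x) =
\begin{cases}
\dfrac{x-a}{b-a}, & a \le x \le b,\\[1ex]
\dfrac{c-x}{c-b}, & b < x \le c,\\[1ex]
0, & \text{otherwise.}
\end{cases}
```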

  19. Building Energy Management Open Source Software

    Energy Technology Data Exchange (ETDEWEB)

    2017-06-20

    This is the repository for Building Energy Management Open Source Software (BEMOSS), which is an open source operating system that is engineered to improve sensing and control of equipment in small- and medium-sized commercial buildings. BEMOSS offers the following key features: (1) Open source, open architecture – BEMOSS is an open source operating system that is built upon VOLTTRON – a distributed agent platform developed by Pacific Northwest National Laboratory (PNNL). BEMOSS was designed to make it easy for hardware manufacturers to seamlessly interface their devices with BEMOSS. Software developers can also contribute to adding additional BEMOSS functionalities and applications. (2) Plug & play – BEMOSS was designed to automatically discover supported load controllers (including smart thermostats, VAV/RTUs, lighting load controllers and plug load controllers) in commercial buildings. (3) Interoperability – BEMOSS was designed to work with load control devices from different manufacturers that operate on different communication technologies and data exchange protocols. (4) Cost effectiveness – Implementation of BEMOSS is deemed cost-effective as it was built upon a robust open source platform that can operate on a low-cost single-board computer, such as Odroid. This feature could contribute to its rapid deployment in small- or medium-sized commercial buildings. (5) Scalability and ease of deployment – With its multi-node architecture, BEMOSS provides a distributed architecture where load controllers in a multi-floor and high occupancy building could be monitored and controlled by multiple single-board computers hosting BEMOSS. This makes it possible for a building engineer to deploy BEMOSS in one zone of a building, be comfortable with its operation, and later on expand the deployment to the entire building to make it more energy efficient. (6) Ability to provide local and remote monitoring – BEMOSS provides both local and remote monitoring

  20. Distributed Iterative Processing for Interference Channels with Receiver Cooperation

    DEFF Research Database (Denmark)

    Badiu, Mihai Alin; Manchón, Carles Navarro; Bota, Vasile

    2012-01-01

    We propose a method for the design and evaluation of distributed iterative algorithms for receiver cooperation in interference-limited wireless systems. Our approach views the processing within and collaboration between receivers as the solution to an inference problem in the probabilistic model...

  1. A role for distributed processing in advanced nuclear materials control and accountability systems

    International Nuclear Information System (INIS)

    Tisinger, R.M.; Whitty, W.J.; Ford, W.; Strittmatter, R.B.

    1986-01-01

    Networking and distributed processing hardware and software have the potential of greatly enhancing nuclear materials control and accountability (MC&A) systems, both from safeguards and process operations perspectives, while allowing timely integrated safeguards activities and enhanced computer security at reasonable cost. A hierarchical distributed system is proposed consisting of groups of terminals and instruments in plant production and support areas connected to microprocessors that are connected to either larger microprocessors or minicomputers. The structuring and development of a limited distributed MC&A prototype system, including human engineering concepts, are described. Implications of integrated safeguards and computer security concepts to the distributed system design are discussed

  2. Bounds for the probability distribution function of the linear ACD process

    OpenAIRE

    Fernandes, Marcelo

    2003-01-01

    This paper derives both lower and upper bounds for the probability distribution function of stationary ACD(p, q) processes. For the purpose of illustration, I specialize the results to the main parent distributions in duration analysis. Simulations show that the lower bound is much tighter than the upper bound.

  3. Distributed Temperature Measurement in a Self-Burning Coal Waste Pile through a GIS Open Source Desktop Application

    Directory of Open Access Journals (Sweden)

    Lia Duarte

    2017-03-01

    Full Text Available Geographical Information Systems (GIS) are often used to assess and monitor the environmental impacts caused by mining activities. The aim of this work was to develop a new application to produce dynamic maps for monitoring the temperature variations in a self-burning coal waste pile, under a GIS open source environment—GIS-ECOAL (freely available). The performance of the application was evaluated with distributed temperature measurements gathered in the S. Pedro da Cova (Portugal) coal waste pile. In order to obtain the temperature data, an optical fiber cable was laid over the affected area of the pile, with 42 location stakes acting as precisely located control points for the temperature measurement. A monthly data set from July (15-min measurement interval) was fed into the application, and a video composed of several layouts with temperature measurements was created, allowing two main areas with higher temperatures to be recognized. The field observations also allow the identification of these zones; however, identifying an area with higher temperatures at the top of the studied area was only possible through the visualization of the images created by this application. The generated videos make possible the dynamic and continuous visualization of the combustion process in the monitored area.

  4. Open source software and libraries

    OpenAIRE

    Randhawa, Sukhwinder

    2008-01-01

    Open source software is software that users have the ability to run, copy, distribute, study, change, share and improve for any purpose. Open source library software does not require the initial cost of commercial software and enables libraries to have greater control over their working environment. Library professionals should be aware of the advantages of open source software and should be involved in its development. They should have basic knowledge about the selection, installation and main...

  5. Distribution of chirality in the quantum walk: Markov process and entanglement

    International Nuclear Information System (INIS)

    Romanelli, Alejandro

    2010-01-01

    The asymptotic behavior of the quantum walk on the line is investigated, focusing on the probability distribution of chirality independently of position. It is shown analytically that this distribution has a longtime limit that is stationary and depends on the initial conditions. This result is unexpected in the context of the unitary evolution of the quantum walk as it is usually linked to a Markovian process. The asymptotic value of the entanglement between the coin and the position is determined by the chirality distribution. For given asymptotic values of both the entanglement and the chirality distribution, it is possible to find the corresponding initial conditions within a particular class of spatially extended Gaussian distributions.

  6. Distributed Processing of SETI Data

    Science.gov (United States)

    Korpela, Eric

    As you have read in prior chapters, researchers have been performing progressively more sensitive SETI searches since 1960. Each search has been limited by the technologies available at the time. As radio frequency technologies have become more efficient and computers have become faster, the searches have increased in capacity and become more sensitive. Often the hardware that performs the calculations required to process the telescope data and expose any embedded signals is what limits the sensitivity of the search. Shortly before the start of the 21st century, projects began to appear that exploited the processing capabilities of computers connected to the Internet in order to solve problems that required a large amount of computing power. The SETI@home project, managed by myself and a group of researchers at the Space Sciences Laboratory of the University of California, Berkeley, was the first attempt to use large-scale distributed computing to solve the problems of performing a sensitive search for narrow band radio signals from extraterrestrial civilizations. (Korpela et al., 2001) A follow-on project, Astropulse, searches for extraterrestrial signals with wider bandwidths and shorter time durations. Both projects are ongoing at the present time (mid-2010).

  7. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    Science.gov (United States)

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    The C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture system data. However, few software packages can visualize and modify the entirety of the data in a C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis system using standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through a C++ API, bindings for high-level languages (Matlab, Octave, and Python), and a standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
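    BTK's Python binding, mentioned above, exposes an acquisition reader for loading C3D files. The sketch below follows BTK's documented Python API, but the file name is a placeholder and the snippet is illustrative rather than taken from the paper.

```python
import btk

reader = btk.btkAcquisitionFileReader()
reader.SetFilename("gait_trial.c3d")      # assumed input file
reader.Update()
acq = reader.GetOutput()

print("point frequency :", acq.GetPointFrequency(), "Hz")
print("number of frames:", acq.GetPointFrameNumber())
for i in range(acq.GetPointNumber()):
    point = acq.GetPoint(i)
    print(point.GetLabel(), point.GetValues().shape)   # (frames, 3) coordinates
```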

  8. An Open Distributed Architecture for Sensor Networks for Risk Management

    Directory of Open Access Journals (Sweden)

    Ralf Denzer

    2008-03-01

    Full Text Available Sensors provide some of the basic input data for risk management of natural and man-made hazards. Here the word ‘sensors’ covers everything from remote sensing satellites, providing invaluable images of large regions, through instruments installed on the Earth’s surface to instruments situated in deep boreholes and on the sea floor, providing highly detailed point-based information from single sites. Data from such sensors are used in all stages of risk management, from hazard, vulnerability and risk assessment in the pre-event phase, through information to provide on-site help during the crisis phase, to data to aid in recovery following an event. Because data from sensors play such an important part in improving understanding of the causes of risk and consequently in its mitigation, considerable investment has been made in the construction and maintenance of highly sophisticated sensor networks. In spite of the ubiquitous need for information from sensor networks, the use of such data is hampered in many ways. Firstly, information about the presence and capabilities of sensor networks operating in a region is difficult to obtain due to a lack of easily available and usable meta-information. Secondly, once sensor networks have been identified, it is often difficult to access their data due to a lack of interoperability between dissemination and acquisition systems. Thirdly, the transfer and processing of information from sensors is limited, again by incompatibilities between systems. Therefore, the current situation leads to a lack of efficiency and limited use of the available data that has an important role to play in risk mitigation. In view of this situation, the European Commission (EC) is funding a number of Integrated Projects within the Sixth Framework Programme concerned with improving the accessibility of data and services for risk management. Two of these projects: ‘Open Architecture and Spatial Data

  9. Equivalence of functional limit theorems for stationary point processes and their Palm distributions

    NARCIS (Netherlands)

    Nieuwenhuis, G.

    1989-01-01

    Let P be the distribution of a stationary point process on the real line and let P0 be its Palm distribution. In this paper we consider two types of functional limit theorems, those in terms of the number of points of the point process in (0, t] and those in terms of the location of the nth point

  10. Just-in-time Data Distribution for Analytical Query Processing

    NARCIS (Netherlands)

    M.G. Ivanova (Milena); M.L. Kersten (Martin); F.E. Groffen (Fabian)

    2012-01-01

    Distributed processing commonly requires data spread across machines using a priori static or hash-based data allocation. In this paper, we explore an alternative approach that starts from a master node in control of the complete database, and a variable number of worker nodes

  11. Benchmarking Distributed Stream Processing Platforms for IoT Applications

    OpenAIRE

    Shukla, Anshu; Simmhan, Yogesh

    2016-01-01

    Internet of Things (IoT) is a technology paradigm where millions of sensors monitor, and help inform or manage, physical, environmental and human systems in real-time. The inherent closed-loop responsiveness and decision making of IoT applications makes them ideal candidates for using low latency and scalable stream processing platforms. Distributed Stream Processing Systems (DSPS) are becoming essential components of any IoT stack, but the efficacy and performance of contemporary DSP...

  12. Behind Linus's Law: Investigating Peer Review Processes in Open Source

    Science.gov (United States)

    Wang, Jing

    2013-01-01

    Open source software has revolutionized the way people develop software, organize collaborative work, and innovate. The numerous open source software systems that have been created and adopted over the past decade are influential and vital in all aspects of work and daily life. The understanding of open source software development can enhance its…

  13. GreyGuide - Guide to Good Practice in Grey Literature: A Community Driven Open Resource Project

    OpenAIRE

    Biagioni, Stefania (ISTI-CNR); Carlesi, Carlo (ISTI-CNR); Schopfel, Joachim (University of Lille); Farace, Dominic J. (GreyNet); Frantzen, Jerry (GreyNet); GreyNet, Grey Literature Network Service

    2014-01-01

    The goal of this project is to develop an open source repository of good practices in the field of grey literature. That which originated in monographic form will now open and expand to include content from the global grey literature community. Such practices will range from the production and processing of grey literature through to its distribution, uses, and preservation. The repository will contain guidelines such as those in handling theses and dissertations, how to write research report...

  14. Parallel Distributed Processing theory in the age of deep networks

    OpenAIRE

    Bowers, Jeffrey

    2017-01-01

    Parallel Distributed Processing (PDP) models in psychology are the precursors of deep networks used in computer science. However, only PDP models are associated with two core psychological claims, namely, that all knowledge is coded in a distributed format, and cognition is mediated by non-symbolic computations. These claims have long been debated within cognitive science, and recent work with deep networks speaks to this debate. Specifically, single-unit recordings show that deep networks le...

  15. Data-Intensive Text Processing with MapReduce

    CERN Document Server

    Lin, Jimmy

    2010-01-01

    Our world is being revolutionized by data-driven methods: access to large amounts of data has generated new insights and opened exciting new opportunities in commerce, science, and computing applications. Processing the enormous quantities of data necessary for these advances requires large clusters, making distributed computing paradigms more crucial than ever. MapReduce is a programming model for expressing distributed computations on massive datasets and an execution framework for large-scale data processing on clusters of commodity servers. The programming model provides an easy-to-underst
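    The programming model described above is usually introduced with the word-count example. The sketch below simulates the map, shuffle and reduce phases in plain Python on a single machine, purely to illustrate the model; it does not use an actual cluster framework.

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    """Mapper: emit (word, 1) for every word in one document."""
    return [(word.lower(), 1) for word in document.split()]

def shuffle(pairs):
    """Group intermediate pairs by key, as the framework would between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """Reducer: sum the counts for one word."""
    return key, sum(values)

documents = ["the quick brown fox", "the lazy dog", "the fox"]
intermediate = chain.from_iterable(map_phase(doc) for doc in documents)
counts = dict(reduce_phase(k, v) for k, v in shuffle(intermediate).items())
print(counts)   # {'the': 3, 'quick': 1, 'brown': 1, 'fox': 2, 'lazy': 1, 'dog': 1}
```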

  16. Free for All: Open Source Software

    Science.gov (United States)

    Schneider, Karen

    2008-01-01

    Open source software has become a catchword in libraryland. Yet many remain unclear about open source's benefits--or even what it is. So what is open source software (OSS)? It's software that is free in every sense of the word: free to download, free to use, and free to view or modify. Most OSS is distributed on the Web and one doesn't need to…

  17. 'Predatory' open access: a longitudinal study of article volumes and market characteristics.

    Science.gov (United States)

    Shen, Cenyu; Björk, Bo-Christer

    2015-10-01

    A negative consequence of the rapid growth of scholarly open access publishing funded by article processing charges is the emergence of publishers and journals with highly questionable marketing and peer review practices. These so-called predatory publishers are causing unfounded negative publicity for open access publishing in general. Reports about this branch of e-business have so far mainly concentrated on exposing lacking peer review and scandals involving publishers and journals. There is a lack of comprehensive studies about several aspects of this phenomenon, including extent and regional distribution. After an initial scan of all predatory publishers and journals included in the so-called Beall's list, a sample of 613 journals was constructed using a stratified sampling method from the total of over 11,000 journals identified. Information about the subject field, country of publisher, article processing charge and article volumes published between 2010 and 2014 were manually collected from the journal websites. For a subset of journals, individual articles were sampled in order to study the country affiliation of authors and the publication delays. Over the studied period, predatory journals have rapidly increased their publication volumes from 53,000 in 2010 to an estimated 420,000 articles in 2014, published by around 8,000 active journals. Early on, publishers with more than 100 journals dominated the market, but since 2012 publishers in the 10-99 journal size category have captured the largest market share. The regional distribution of both the publisher's country and authorship is highly skewed, in particular Asia and Africa contributed three quarters of authors. Authors paid an average article processing charge of 178 USD per article for articles typically published within 2 to 3 months of submission. Despite a total number of journals and publishing volumes comparable to respectable (indexed by the Directory of Open Access Journals) open access

  18. Open-source hardware for medical devices.

    Science.gov (United States)

    Niezen, Gerrit; Eslambolchilar, Parisa; Thimbleby, Harold

    2016-04-01

    Open-source hardware is hardware whose design is made publicly available so anyone can study, modify, distribute, make and sell the design or the hardware based on that design. Some open-source hardware projects can potentially be used as active medical devices. The open-source approach offers a unique combination of advantages, including reduced costs and faster innovation. This article compares 10 open-source healthcare projects in terms of how easy it is to obtain the required components and build the device.

  19. Digi-Clima Grid: image processing and distributed computing for recovering historical climate data

    Directory of Open Access Journals (Sweden)

    Sergio Nesmachnow

    2015-12-01

    Full Text Available This article describes the Digi-Clima Grid project, whose main goals are to design and implement semi-automatic techniques for digitalizing and recovering historical climate records by applying parallel computing techniques over distributed computing infrastructures. The specific tool developed for image processing is described, and the implementation over grid and cloud infrastructures is reported. An experimental analysis over institutional and volunteer-based grid/cloud distributed systems demonstrates that the proposed approach is an efficient tool for recovering historical climate data. The parallel implementations allow the processing load to be distributed, achieving accurate speedup values.
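    The speedup values mentioned above follow the usual definition for a parallel implementation; this is a standard relation, not one specific to the project.

```latex
% Speedup and efficiency of a parallel run on p workers,
% relative to the sequential execution time T_1:
S_p = \frac{T_1}{T_p}, \qquad E_p = \frac{S_p}{p}.
```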

  20. OpenDanubia - An integrated, modular simulation system to support regional water resource management

    Science.gov (United States)

    Muerth, M.; Waldmann, D.; Heinzeller, C.; Hennicker, R.; Mauser, W.

    2012-04-01

    The already completed, multi-disciplinary research project GLOWA-Danube has developed a regional scale, integrated modeling system, which was successfully applied on the 77,000 km2 Upper Danube basin to investigate the impact of Global Change on both the natural and anthropogenic water cycle. At the end of the last project phase, the integrated modeling system was transferred into the open source project OpenDanubia, which now provides both the core system as well as all major model components to the general public. First, this will enable decision makers from government, business and management to use OpenDanubia as a tool for proactive management of water resources in the context of global change. Secondly, the model framework to support integrated simulations and all simulation models developed for OpenDanubia in the scope of GLOWA-Danube are further available for future developments and research questions. OpenDanubia allows for the investigation of water-related scenarios considering different ecological and economic aspects to support both scientists and policy makers to design policies for sustainable environmental management. OpenDanubia is designed as a framework-based, distributed system. The model system couples spatially distributed physical and socio-economic process during run-time, taking into account their mutual influence. To simulate the potential future impacts of Global Change on agriculture, industrial production, water supply, households and tourism businesses, so-called deep actor models are implemented in OpenDanubia. All important water-related fluxes and storages in the natural environment are implemented in OpenDanubia as spatially explicit, process-based modules. This includes the land surface water and energy balance, dynamic plant water uptake, ground water recharge and flow as well as river routing and reservoirs. Although the complete system is relatively demanding on data requirements and hardware requirements, the modular structure

  1. Momentum distributions: opening remarks

    International Nuclear Information System (INIS)

    Weigold, E.

    1982-01-01

    The problem of the hydrogen atom has played a central role in the development of quantum mechanics, beginning with Bohr's daring speculations. It was also the first problem tackled by Schroedinger with his new wave mechanics and similarly it was used by Heisenberg in his first papers as a prime example of the success of quantum mechanics. It has always played a central role in the teaching of quantum physics and has served as a most important heuristic tool, shaping our intuition and inspiring many expositions. The Schroedinger equation for the hydrogen atom is usually solved in the position representation, the solution to the equation being the wave functions ψ_nlm(r). If Schroedinger's equation is solved in the momentum representation instead of the coordinate representation, the absolute square of the corresponding momentum state wave function φ_nlm(p) would give the momentum probability distribution of the electron in the state defined by the quantum numbers n, l and m. Three different types of collisions which can take place in the (e,2e) reaction on atomic hydrogen, which is a three body problem, are discussed
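    For the ground state, the momentum-space wave function referred to above has a well-known closed form, quoted here as a standard textbook result in our notation rather than the author's.

```latex
% Momentum-space wave function of the hydrogen 1s state (a_0 = Bohr radius):
\phi_{1s}(p) \;=\; \frac{2\sqrt{2}}{\pi}\,
\frac{a_0^{3/2}}{\hbar^{3/2}}\,
\frac{1}{\bigl(1 + a_0^{2}p^{2}/\hbar^{2}\bigr)^{2}},
\qquad
|\phi_{1s}(p)|^{2} \;\propto\; \bigl(1 + a_0^{2}p^{2}/\hbar^{2}\bigr)^{-4}.
```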

  2. A Process for Comparing Dynamics of Distributed Space Systems Simulations

    Science.gov (United States)

    Cures, Edwin Z.; Jackson, Albert A.; Morris, Jeffery C.

    2009-01-01

    The paper describes a process that was developed for comparing the primary orbital dynamics behavior between space systems distributed simulations. This process is used to characterize and understand the fundamental fidelities and compatibilities of the modeling of orbital dynamics between spacecraft simulations. This is required for high-latency distributed simulations such as NASA's Integrated Mission Simulation and must be understood when reporting results from simulation executions. This paper presents 10 principal comparison tests along with their rationale and examples of the results. The Integrated Mission Simulation (IMSim) (formerly known as the Distributed Space Exploration Simulation (DSES)) is a NASA research and development project focusing on the technologies and processes that are related to the collaborative simulation of complex space systems involved in the exploration of our solar system. Currently, the NASA centers that are actively participating in the IMSim project are the Ames Research Center, the Jet Propulsion Laboratory (JPL), the Johnson Space Center (JSC), the Kennedy Space Center, the Langley Research Center and the Marshall Space Flight Center. In concept, each center participating in IMSim has its own set of simulation models and environment(s). These simulation tools are used to build the various simulation products that are used for scientific investigation, engineering analysis, system design, training, planning, operations and more. Working individually, these production simulations provide important data to various NASA projects.

  3. Process-orientated psychoanalytic work in initial interviews and the importance of the opening scene.

    Science.gov (United States)

    Wegner, Peter

    2014-06-01

    From the very first moment of the initial interview to the end of a long course of psychoanalysis, the unconscious exchange between analysand and analyst, and the analysis of the relationship between transference and countertransference, are at the heart of psychoanalytic work. Drawing on initial interviews with a psychosomatically and depressively ill student, a psychoanalytic understanding of initial encounters is worked out. The opening scene of the first interview already condenses the central psychopathology - a clinging to the primary object because it was never securely experienced as present by the patient. The author outlines the development of some psychoanalytic theories concerning the initial interview and demonstrates their specific importance as background knowledge for the clinical situation in the following domains: the 'diagnostic position', the 'therapeutic position', the 'opening scene', the 'countertransference' and the 'analyst's free-floating introspectiveness'. More recent investigations refer to 'process qualities' of the analytic relationship, such as 'synchronization' and 'self-efficacy'. The latter seeks to describe after how much time between the interview sessions constructive or destructive inner processes gain ground in the patient and what significance this may have for the decision about the treatment that follows. All these factors combined can lead to establishing a differential process-orientated indication that also takes account of the fact that being confronted with the fear of unconscious processes of exchange is specific to the psychoanalytic profession. Copyright © 2014 Institute of Psychoanalysis.

  4. Analysing the distribution of synaptic vesicles using a spatial point process model

    DEFF Research Database (Denmark)

    Khanmohammadi, Mahdieh; Waagepetersen, Rasmus; Nava, Nicoletta

    2014-01-01

    functionality by statistically modelling the distribution of the synaptic vesicles in two groups of rats: a control group subjected to sham stress and a stressed group subjected to a single acute foot-shock (FS)-stress episode. We hypothesize that the synaptic vesicles have different spatial distributions in the two groups. The spatial distributions are modelled using spatial point process models with an inhomogeneous conditional intensity and repulsive pairwise interactions. Our results verify the hypothesis that the two groups have different spatial distributions.

  5. On relation between distribution functions in hard and soft processes

    International Nuclear Information System (INIS)

    Kisselev, A.V.; Petrov, V.A.

    1992-10-01

    It is shown that in the particle-exchange model the hadron-hadron scattering amplitude admits a parton-like representation, with the distribution functions coinciding with those extracted from deep inelastic processes. (author). 13 refs

  6. Real-Time Processing Library for Open-Source Hardware Biomedical Sensors.

    Science.gov (United States)

    Molina-Cantero, Alberto J; Castro-García, Juan A; Lebrato-Vázquez, Clara; Gómez-González, Isabel M; Merino-Monge, Manuel

    2018-03-29

    Applications involving data acquisition from sensors need samples at a preset frequency rate, the filtering out of noise and/or analysis of certain frequency components. We propose a novel software architecture based on open-source hardware platforms which allows programmers to create data streams from input channels and easily implement filters and frequency analysis objects. The performance of the different classes, in terms of memory allocated and execution time (number of clock cycles), was analyzed on the low-cost platform Arduino Genuino. In addition, 11 people took part in an experiment in which they had to implement several exercises and complete a usability test. Sampling rates under 250 Hz (typical for many biomedical applications) make it feasible to implement filters, sliding windows and Fourier analysis operating in real time. Participants rated software usability at 70.2 out of 100, and the ease of use when implementing several signal processing applications was rated at just over 4.4 out of 5. Participants showed their intention of using this software because it was perceived as useful and very easy to use. The performance of the library showed that it may be appropriate for implementing small biomedical real-time applications or for human movement monitoring, even in a simple open-source hardware device like the Arduino Genuino. The general perception of this library is that it is easy to use and intuitive.
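
    The record describes the library only at this level of detail, so the following Python sketch merely illustrates the concept of a fixed-rate data stream feeding a sliding-window (moving-average) filter; it is not the library's actual C++/Arduino API, and read_channel is a hypothetical placeholder for a sensor read.

      import time
      from collections import deque

      def read_channel():
          # Hypothetical placeholder for an ADC/sensor read; replace with real acquisition code.
          return 0.0

      def acquire(rate_hz=250, window_size=16, duration_s=2.0):
          period = 1.0 / rate_hz
          window = deque(maxlen=window_size)              # sliding window of the latest samples
          filtered = []
          t_end = time.time() + duration_s
          while time.time() < t_end:
              window.append(read_channel())               # new sample at the preset frequency
              filtered.append(sum(window) / len(window))  # moving-average filter output
              time.sleep(period)                          # crude fixed-rate pacing
          return filtered

      if __name__ == "__main__":
          print(len(acquire(duration_s=0.1)), "filtered samples")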

  7. Underfrequency Load Shedding for an Islanded Distribution System With Distributed Generators

    DEFF Research Database (Denmark)

    Mahat, Pukar; Chen, Zhe; Bak-Jensen, Birgitte

    2010-01-01

    Significant penetration of distributed generation in many distribution systems has opened an option of operating distribution systems in island mode for economical and technical reasons. However, balancing frequency of the islanded system is still an issue to be solved, especially when the demand...

  8. The Analysis of process optimization during the loading distribution test for steam turbine

    International Nuclear Information System (INIS)

    Li Jiangwei; Cao Yuhua; Li Dawei

    2014-01-01

    The loading distribution of the steam turbine needs to be completed six times in total: the first is completed when the turbine cylinder is buckled, and the rest must be completed in order during the installation of the GVP pipe. Completing the five loading distribution tests and the installation of the GVP pipe usually takes around 90 days at most nuclear plants, while Unit 1 of Fuqing Nuclear Power Station compressed it to about 45 days by optimizing the installation process. This article describes how Unit 1 of Fuqing Nuclear Power Station finished the five loading distribution tests and the GVP pipe installation in 45 days by optimizing the process, analyses the advantages and disadvantages by comparing it with the process provided by the supplier, and puts forward some rationalization proposals for the installation work on the follow-up units of the plant. (authors)

  9. Reaction Mechanism and Distribution Behavior of Arsenic in the Bottom Blown Copper Smelting Process

    Directory of Open Access Journals (Sweden)

    Qinmeng Wang

    2017-08-01

    Full Text Available The control of arsenic, a toxic and carcinogenic element, is an important issue for all copper smelters. In this work, the reaction mechanism and distribution behavior of arsenic in the bottom blown copper smelting process (SKS process) were investigated and compared to the flash smelting process. There are obvious differences of arsenic distribution in the SKS process and flash process, resulting from the differences of oxygen potentials, volatilizations, smelting temperatures, reaction intensities, and mass transfer processes. Under stable production conditions, the distributions of arsenic among matte, slag, and gas phases are 6%, 12%, and 82%, respectively. Less arsenic is reported in the gas phase with the flash process than with the SKS process. The main arsenic species in the gas phase are AsS(g), AsO(g), and As2(g). Arsenic exists in the slag predominantly as As2O3(l), and in matte as As(l). High matte grade is harmful to the elimination of arsenic to gas. Changing the Fe/SiO2 ratio has slight effects on the distributions of arsenic. In order to enhance the removal of arsenic from the SKS smelting system to the gas phase, low oxygen concentration, low ratios of oxygen/ore, and low matte grade should be chosen. In the SKS smelting process, no dust is recycled, and almost all dust is collected and further treated to eliminate arsenic and recover valuable metals by other process streams.

  10. OpenDrift - an open source framework for ocean trajectory modeling

    Science.gov (United States)

    Dagestad, Knut-Frode; Breivik, Øyvind; Ådlandsvik, Bjørn

    2016-04-01

    We will present a new, open source tool for modeling the trajectories and fate of particles or substances (Lagrangian Elements) drifting in the ocean, or even in the atmosphere. The software is named OpenDrift, and has been developed at Norwegian Meteorological Institute in cooperation with Institute of Marine Research. OpenDrift is a generic framework written in Python, and is openly available at https://github.com/knutfrode/opendrift/. The framework is modular with respect to three aspects: (1) obtaining input data, (2) the transport/morphological processes, and (3) exporting of results to file. Modularity is achieved through well defined interfaces between components, and use of a consistent vocabulary (CF conventions) for naming of variables. Modular input implies that it is not necessary to preprocess input data (e.g. currents, wind and waves from Eulerian models) to a particular file format. Instead "reader modules" can be written/used to obtain data directly from any original source, including files or through web based protocols (e.g. OPeNDAP/Thredds). Modularity of processes implies that a model developer may focus on the geophysical processes relevant for the application of interest, without needing to consider technical tasks such as reading, reprojecting, and colocating input data, rotation and scaling of vectors and model output. We will show a few example applications of using OpenDrift for predicting drifters, oil spills, and search and rescue objects.
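
    A minimal usage sketch of OpenDrift along the lines described above is shown below; it follows the project's documented Python interface, but module paths and argument names may differ between versions, and the forcing file name 'currents.nc' is a placeholder assumption.

      from datetime import timedelta
      from opendrift.models.oceandrift import OceanDrift
      from opendrift.readers import reader_netCDF_CF_generic

      o = OceanDrift(loglevel=20)                               # basic passive-drift model
      reader = reader_netCDF_CF_generic.Reader('currents.nc')   # "reader module" for forcing data
      o.add_reader(reader)
      o.seed_elements(lon=4.5, lat=60.0, number=1000, radius=500,
                      time=reader.start_time)                   # release 1000 Lagrangian elements
      o.run(duration=timedelta(hours=48), time_step=900,
            outfile='drift.nc')                                 # modular export of results to file
      o.plot()                                                  # quick-look trajectory plot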

  11. QUALITY AND PROCESSES OF BANGLADESH OPEN UNIVERSITY COURSE MATERIALS DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    K. M. Rezanur RAHMAN

    2006-04-01

    Full Text Available A new member of the mega-universities, Bangladesh Open University (BOU) introduced a course team approach for developing effective course materials for distance students. BOU teaching media include printed course books, study guides, radio and television broadcasts, audiocassettes and occasional face-to-face tutorials. Each course team comprises specialist course writer(s), editor, trained style editor, graphic designer, illustrator, audio-visual producer and anonymous referees. An editorial board or preview committee is responsible for the final approval for publishing or broadcasting materials for learners. This approach has proved to be effective, but appeared to be complicated and time-consuming. This report focuses on the quality and processes of BOU course materials development, taking into account the strengths and weaknesses of the current approach.

  12. Automatic pre-processing for an object-oriented distributed hydrological model using GRASS-GIS

    Science.gov (United States)

    Sanzana, P.; Jankowfsky, S.; Branger, F.; Braud, I.; Vargas, X.; Hitschfeld, N.

    2012-04-01

    Landscapes are very heterogeneous, which impacts the hydrological processes occurring in the catchments, especially in the modeling of peri-urban catchments. The Hydrological Response Units (HRUs), resulting from the intersection of different maps, such as land use, soil types and geology, and flow networks, allow the representation of these elements in an explicit way, preserving natural and artificial contours of the different layers. These HRUs are used as the model mesh in some distributed object-oriented hydrological models, allowing the application of a topologically oriented approach. The connectivity between polygons and polylines provides a detailed representation of the water balance and overland flow in these distributed hydrological models, based on irregular hydro-landscape units. When computing fluxes between these HRUs, geometrical parameters, such as the distance between the centroid of the HRUs and the river network, and the length of the perimeter, can impact the realism of the calculated overland, sub-surface and groundwater fluxes. Therefore, it is necessary to process the original model mesh in order to avoid these numerical problems. We present an automatic pre-processing procedure implemented in the open source GRASS-GIS software, for which several Python scripts were developed and some already available algorithms, such as the Triangle software, were used. First, some scripts were developed to improve the topology of the various elements, such as snapping the river network to the closest contours. When data are derived from remote sensing, such as vegetation areas, their perimeters have lots of right angles, which were smoothed. Second, the algorithms more particularly address bad-shaped elements of the model mesh, such as polygons with narrow shapes, marked irregular contours and/or a centroid outside of the polygon. To identify these elements we used shape descriptors. The convexity index was considered the best descriptor to identify them with a threshold
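
    The record does not list the exact commands, so the sketch below only indicates the kind of GRASS-GIS Python scripting involved: v.clean, v.generalize and v.to.db are standard GRASS modules, but the map names, thresholds and the specific pre-processing chain are illustrative assumptions (to be run inside a GRASS session).

      import grass.script as gs

      # Snap the river network to nearby HRU boundaries (local topology clean-up).
      gs.run_command('v.clean', input='river_network', output='river_snapped',
                     tool='snap', threshold=2.0, overwrite=True)

      # Smooth the right-angled contours of remotely sensed vegetation polygons.
      gs.run_command('v.generalize', input='vegetation', output='vegetation_smooth',
                     method='chaiken', threshold=1.0, overwrite=True)

      # Store a shape descriptor (compactness) used to flag bad-shaped HRU polygons.
      gs.run_command('v.to.db', map='hrus', option='compact', columns='compactness')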

  13. Negative binomial distribution fits to multiplicity distributions in restricted δη intervals from central O+Cu collisions at 14.6A GeV/c and their implication for "Intermittency"

    International Nuclear Information System (INIS)

    Tannenbaum, M.J.

    1993-01-01

    Experience in analyzing the data from Light and Heavy Ion Collisions in terms of distributions rather than moments suggests that conventional fluctuations of multiplicity and transverse energy can be well described by Gamma or Negative Binomial Distributions (NBD). Multiplicity distributions were obtained for central 16O+Cu collisions in bins of δη = 0.1, 0.2, 0.3, ..., 0.5, 1.0, where the bin of 1.0 covers 1.2 < η < 2.2 in the laboratory. NBD fits were performed to these distributions with excellent results in all δη bins. The κ parameter of the NBD fit increases linearly with the δη interval, which is a totally unexpected and particularly striking result. Due to the well known property of the NBD under convolution, this result indicates that the multiplicity distributions in adjacent bins of pseudorapidity δη ∼ 0.1 are largely statistically independent. The relationship to 2-particle correlations and "Intermittency" will be discussed
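
    As an illustration of the kind of fit mentioned above, the following Python sketch performs a maximum-likelihood NBD fit parameterised by the shape parameter k (the κ of the record) and the mean multiplicity; the counts are synthetic, not the measured multiplicity distributions.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import nbinom

      multiplicities = np.random.negative_binomial(n=3.0, p=0.3, size=5000)  # toy data

      def neg_log_likelihood(params):
          k, mean = params                          # NBD shape parameter k and mean multiplicity
          p = k / (k + mean)                        # scipy's (n, p) parametrisation
          return -np.sum(nbinom.logpmf(multiplicities, k, p))

      result = minimize(neg_log_likelihood, x0=[1.0, multiplicities.mean()],
                        bounds=[(1e-3, None), (1e-3, None)])
      k_hat, mean_hat = result.x
      print(f"fitted k = {k_hat:.2f}, fitted mean = {mean_hat:.2f}")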

  14. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform

    Science.gov (United States)

    Moutsatsos, Ioannis K.; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J.; Jenkins, Jeremy L.; Holway, Nicholas; Tallarico, John; Parker, Christian N.

    2016-01-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an “off-the-shelf,” open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community. PMID:27899692

  15. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform.

    Science.gov (United States)

    Moutsatsos, Ioannis K; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J; Jenkins, Jeremy L; Holway, Nicholas; Tallarico, John; Parker, Christian N

    2017-03-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an "off-the-shelf," open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community.

  16. When the mean is not enough: Calculating fixation time distributions in birth-death processes.

    Science.gov (United States)

    Ashcroft, Peter; Traulsen, Arne; Galla, Tobias

    2015-10-01

    Studies of fixation dynamics in Markov processes predominantly focus on the mean time to absorption. This may be inadequate if the distribution is broad and skewed. We compute the distribution of fixation times in one-step birth-death processes with two absorbing states. These are expressed in terms of the spectrum of the process, and we provide different representations as forward-only processes in eigenspace. These allow efficient sampling of fixation time distributions. As an application we study evolutionary game dynamics, where invading mutants can reach fixation or go extinct. We also highlight the median fixation time as a possible analog of mixing times in systems with small mutation rates and no absorbing states, whereas the mean fixation time has no such interpretation.
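
    To illustrate why the full fixation-time distribution (and not only its mean) matters, the following Monte Carlo sketch samples conditional fixation times for a neutral one-step birth-death chain with absorbing states at 0 and N; the dynamics and parameters are invented for illustration and are far simpler than the spectral representations used in the paper.

      import random

      def fixation_time(N=50, i0=1, rng=random):
          """Return (absorbed_at_N, number_of_steps) for a neutral birth-death walk from i0."""
          i, steps = i0, 0
          while 0 < i < N:
              i += 1 if rng.random() < 0.5 else -1   # neutral: up or down with equal probability
              steps += 1
          return i == N, steps

      samples = (fixation_time() for _ in range(20000))
      times_to_fixation = sorted(t for fixed, t in samples if fixed)
      median = times_to_fixation[len(times_to_fixation) // 2]
      print(f"{len(times_to_fixation)} fixation events, "
            f"median conditional fixation time = {median} steps")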

  17. Statistical identification of the confidence limits of open loop transfer functions obtained by MAR analysis

    International Nuclear Information System (INIS)

    Antonopoulos-Domis, M.; Mourtzanos, K.

    1996-01-01

    Estimators of the confidence limits of open loop transfer functions obtained via Multivariate Auto-Regressive (MAR) modelling are not available in the literature. The statistics of open loop transfer functions obtained by MAR modelling are investigated via numerical experiments. A system of known open loop transfer functions is simulated digitally and excited by random number series. The digital signals of the simulated system are then MAR modelled and the open loop transfer functions are estimated. Performing a large number of realizations, mean values and variances of the open loop transfer functions are estimated. It is found that if the record length N of each realization is long enough then the estimates of the open loop transfer functions follow a normal distribution. The variance of the open loop transfer functions is proportional to 1/N. For MAR processes the asymptotic covariance matrix of the estimate of open loop transfer functions was found to be in agreement with the theoretical prediction. (author)

  18. Managing Distributed Innovation Processes in Virtual Organizations by Applying the Collaborative Network Relationship Analysis

    Science.gov (United States)

    Eschenbächer, Jens; Seifert, Marcus; Thoben, Klaus-Dieter

    Distributed innovation processes are considered a new option to handle both the complexity and the speed with which new products and services need to be prepared. Indeed, most research on innovation processes has focused on multinational companies with an intra-organisational perspective. The phenomenon of innovation processes in networks - with an inter-organisational perspective - has been almost neglected. Collaborative networks present a perfect playground for such distributed innovation processes; the authors highlight Virtual Organisations in particular because of their dynamic behaviour. Research activities supporting distributed innovation processes in Virtual Organisations are rather new, so that little knowledge about the management of such research is available. The presentation of the collaborative network relationship analysis addresses this gap. It is shown that a qualitative planning of collaboration intensities can support real business cases by providing knowledge and planning data.

  19. Open cycle thermoacoustics

    Energy Technology Data Exchange (ETDEWEB)

    Reid, Robert Stowers [Georgia Inst. of Technology, Atlanta, GA (United States)

    2000-01-01

    A new type of thermodynamic device combining a thermodynamic cycle with the externally applied steady flow of an open thermodynamic process is discussed and experimentally demonstrated. The gas flowing through this device can be heated or cooled in a series of semi-open cyclic steps. The combination of open and cyclic flows makes possible the elimination of some or all of the heat exchangers (with their associated irreversibility). Heat is directly exchanged with the process fluid as it flows through the device when operating as a refrigerator, producing a staging effect that tends to increase First Law thermodynamic efficiency. An open-flow thermoacoustic refrigerator was built to demonstrate this concept. Several approaches are presented that describe the physical characteristics of this device. Tests have been conducted on this refrigerator with good agreement with a proposed theory.

  20. Unity and disunity in evolutionary sciences: process-based analogies open common research avenues for biology and linguistics.

    Science.gov (United States)

    List, Johann-Mattis; Pathmanathan, Jananan Sylvestre; Lopez, Philippe; Bapteste, Eric

    2016-08-20

    For a long time biologists and linguists have been noticing surprising similarities between the evolution of life forms and languages. Most of the proposed analogies have been rejected. Some, however, have persisted, and some even turned out to be fruitful, inspiring the transfer of methods and models between biology and linguistics up to today. Most proposed analogies were based on a comparison of the research objects rather than the processes that shaped their evolution. Focusing on process-based analogies, however, has the advantage of minimizing the risk of overstating similarities, while at the same time reflecting the common strategy to use processes to explain the evolution of complexity in both fields. We compared important evolutionary processes in biology and linguistics and identified processes specific to only one of the two disciplines as well as processes which seem to be analogous, potentially reflecting core evolutionary processes. These new process-based analogies support novel methodological transfer, expanding the application range of biological methods to the field of historical linguistics. We illustrate this by showing (i) how methods dealing with incomplete lineage sorting offer an introgression-free framework to analyze highly mosaic word distributions across languages; (ii) how sequence similarity networks can be used to identify composite and borrowed words across different languages; (iii) how research on partial homology can inspire new methods and models in both fields; and (iv) how constructive neutral evolution provides an original framework for analyzing convergent evolution in languages resulting from common descent (Sapir's drift). Apart from new analogies between evolutionary processes, we also identified processes which are specific to either biology or linguistics. This shows that general evolution cannot be studied from within one discipline alone. In order to get a full picture of evolution, biologists and linguists need to

  1. A NEW INITIATIVE FOR TILING, STITCHING AND PROCESSING GEOSPATIAL BIG DATA IN DISTRIBUTED COMPUTING ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    A. Olasz

    2016-06-01

    Full Text Available Within recent years, several new approaches and solutions for Big Data processing have been developed. The geospatial world is still facing the lack of well-established distributed processing solutions tailored to the amount and heterogeneity of geodata, especially when fast data processing is a must. The goal of such systems is to improve processing time by distributing data transparently across processing (and/or storage) nodes. These types of methodology are based on the concept of divide and conquer. Nevertheless, in the context of geospatial processing, most of the distributed computing frameworks have important limitations regarding both data distribution and data partitioning methods. Moreover, flexibility and expandability for handling various data types (often in binary formats) are also strongly required. This paper presents a concept for tiling, stitching and processing of big geospatial data. The system is based on the IQLib concept (https://github.com/posseidon/IQLib/) developed in the frame of the IQmulus EU FP7 research and development project (http://www.iqmulus.eu). The data distribution framework has no limitations on programming language environment and can execute scripts (and workflows) written in different development frameworks (e.g. Python, R or C#). It is capable of processing raster, vector and point cloud data. The above-mentioned prototype is presented through a case study dealing with country-wide processing of raster imagery. Further investigations on algorithmic and implementation details are in focus for the near future.

  2. a New Initiative for Tiling, Stitching and Processing Geospatial Big Data in Distributed Computing Environments

    Science.gov (United States)

    Olasz, A.; Nguyen Thai, B.; Kristóf, D.

    2016-06-01

    Within recent years, several new approaches and solutions for Big Data processing have been developed. The geospatial world is still facing the lack of well-established distributed processing solutions tailored to the amount and heterogeneity of geodata, especially when fast data processing is a must. The goal of such systems is to improve processing time by distributing data transparently across processing (and/or storage) nodes. These types of methodology are based on the concept of divide and conquer. Nevertheless, in the context of geospatial processing, most of the distributed computing frameworks have important limitations regarding both data distribution and data partitioning methods. Moreover, flexibility and expandability for handling various data types (often in binary formats) are also strongly required. This paper presents a concept for tiling, stitching and processing of big geospatial data. The system is based on the IQLib concept (https://github.com/posseidon/IQLib/) developed in the frame of the IQmulus EU FP7 research and development project (http://www.iqmulus.eu). The data distribution framework has no limitations on programming language environment and can execute scripts (and workflows) written in different development frameworks (e.g. Python, R or C#). It is capable of processing raster, vector and point cloud data. The above-mentioned prototype is presented through a case study dealing with country-wide processing of raster imagery. Further investigations on algorithmic and implementation details are in focus for the near future.
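
    The records above describe the tiling and stitching idea only in general terms, so the Python sketch below shows a generic divide-and-conquer pattern (window-based tiling with rasterio and parallel processing with multiprocessing); it is not IQLib itself, and the file name, tile size and per-tile operation are placeholder assumptions.

      import numpy as np
      import rasterio
      from rasterio.windows import Window
      from multiprocessing import Pool

      SRC = 'country_mosaic.tif'   # placeholder input raster
      TILE = 1024                  # tile edge length in pixels

      def tile_windows():
          with rasterio.open(SRC) as src:
              h, w = src.height, src.width
          for row in range(0, h, TILE):
              for col in range(0, w, TILE):
                  yield row, col, min(TILE, h - row), min(TILE, w - col)

      def process_tile(args):
          row, col, height, width = args
          with rasterio.open(SRC) as src:
              tile = src.read(1, window=Window(col, row, width, height))
          return row, col, (tile > 0).astype(np.uint8)   # dummy per-tile operation

      if __name__ == '__main__':
          with Pool() as pool:
              for row, col, result in pool.imap_unordered(process_tile, tile_windows()):
                  pass   # stitch each processed tile back into an output mosaic here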

  3. Tracking and Quantifying Developmental Processes in C. elegans Using Open-source Tools.

    Science.gov (United States)

    Dutta, Priyanka; Lehmann, Christina; Odedra, Devang; Singh, Deepika; Pohl, Christian

    2015-12-16

    Quantitatively capturing developmental processes is crucial to derive mechanistic models and key to identify and describe mutant phenotypes. Here protocols are presented for preparing embryos and adult C. elegans animals for short- and long-term time-lapse microscopy and methods for tracking and quantification of developmental processes. The methods presented are all based on C. elegans strains available from the Caenorhabditis Genetics Center and on open-source software that can be easily implemented in any laboratory independently of the microscopy system used. A reconstruction of a 3D cell-shape model using the modelling software IMOD, manual tracking of fluorescently-labeled subcellular structures using the multi-purpose image analysis program Endrov, and an analysis of cortical contractile flow using PIVlab (Time-Resolved Digital Particle Image Velocimetry Tool for MATLAB) are shown. It is discussed how these methods can also be deployed to quantitatively capture other developmental processes in different models, e.g., cell tracking and lineage tracing, tracking of vesicle flow.

  4. A Formal Approach to Run-Time Evaluation of Real-Time Behaviour in Distributed Process Control Systems

    DEFF Research Database (Denmark)

    Kristensen, C.H.

    This thesis advocates a formal approach to run-time evaluation of real-time behaviour in distributed process control systems, motivated by a growing interest in applying the increasingly popular formal methods in the application area of distributed process control systems. We propose to evaluate ... because the real-time aspects of distributed process control systems are considered to be among the hardest and most interesting to handle.

  5. High-uniformity centimeter-wide Si etching method for MEMS devices with large opening elements

    Science.gov (United States)

    Okamoto, Yuki; Tohyama, Yukiya; Inagaki, Shunsuke; Takiguchi, Mikio; Ono, Tomoki; Lebrasseur, Eric; Mita, Yoshio

    2018-04-01

    We propose a compensated mesh pattern filling method to achieve highly uniform wafer depth etching (over hundreds of microns) with a large-area opening (over centimeter). The mesh opening diameter is gradually changed between the center and the edge of a large etching area. Using such a design, the etching depth distribution depending on sidewall distance (known as the local loading effect) inversely compensates for the over-centimeter-scale etching depth distribution, known as the global or within-die(chip)-scale loading effect. Only a single DRIE with test structure patterns provides a micro-electromechanical systems (MEMS) designer with the etched depth dependence on the mesh opening size as well as on the distance from the chip edge, and the designer only has to set the opening size so as to obtain a uniform etching depth over the entire chip. This method is useful when process optimization cannot be performed, such as in the cases of using standard conditions for a foundry service and of short turn-around-time prototyping. To demonstrate, a large MEMS mirror that needed over 1 cm2 of backside etching was successfully fabricated using as-is-provided DRIE conditions.

  6. Plastic debris in the open ocean

    OpenAIRE

    Cózar, Andrés; Echevarría, Fidel; González-Gordillo, J. Ignacio; Irigoien, Xabier; Úbeda, Bárbara; Hernández-León, Santiago; Palma, Álvaro T.; Navarro, Sandra; García-de-Lomas, Juan; Ruiz, Andrea; Fernández-de-Puelles, María L.; Duarte, Carlos M.

    2014-01-01

    There is a rising concern regarding the accumulation of floating plastic debris in the open ocean. However, the magnitude and the fate of this pollution are still open questions. Using data from the Malaspina 2010 circumnavigation, regional surveys, and previously published reports, we show a worldwide distribution of plastic on the surface of the open ocean, mostly accumulating in the convergence zones of each of the five subtropical gyres with comparable density. However...

  7. ON THE ESTIMATION OF DISTANCE DISTRIBUTION FUNCTIONS FOR POINT PROCESSES AND RANDOM SETS

    Directory of Open Access Journals (Sweden)

    Dietrich Stoyan

    2011-05-01

    Full Text Available This paper discusses various estimators for the nearest neighbour distance distribution function D of a stationary point process and for the quadratic contact distribution function Hq of a stationary random closed set. It recommends the use of Hanisch's estimator of D, which is of Horvitz-Thompson type, and the minus-sampling estimator of Hq. This recommendation is based on simulations for Poisson processes and Boolean models.
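
    As a rough illustration of the quantities discussed above, the Python sketch below computes a naive empirical estimate of the nearest-neighbour distance distribution function D(r) for a point pattern in the unit square, approximating the minus-sampling idea by using only points at least r away from the window boundary; it is not Hanisch's estimator itself, and the pattern is synthetic.

      import numpy as np
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(0)
      points = rng.uniform(size=(500, 2))             # toy stationary pattern in [0, 1]^2

      tree = cKDTree(points)
      nn_dist = tree.query(points, k=2)[0][:, 1]      # distance to the nearest other point

      def D_minus_sampling(r):
          border = np.minimum(points, 1 - points).min(axis=1)   # distance to window boundary
          eligible = border >= r                                # minus-sampling reduction
          if not eligible.any():
              return np.nan
          return np.mean(nn_dist[eligible] <= r)

      for r in (0.01, 0.02, 0.05):
          print(f"D({r}) ~ {D_minus_sampling(r):.3f}")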

  8. Fiji: an open-source platform for biological-image analysis.

    Science.gov (United States)

    Schindelin, Johannes; Arganda-Carreras, Ignacio; Frise, Erwin; Kaynig, Verena; Longair, Mark; Pietzsch, Tobias; Preibisch, Stephan; Rueden, Curtis; Saalfeld, Stephan; Schmid, Benjamin; Tinevez, Jean-Yves; White, Daniel James; Hartenstein, Volker; Eliceiri, Kevin; Tomancak, Pavel; Cardona, Albert

    2012-06-28

    Fiji is a distribution of the popular open-source software ImageJ focused on biological-image analysis. Fiji uses modern software engineering practices to combine powerful software libraries with a broad range of scripting languages to enable rapid prototyping of image-processing algorithms. Fiji facilitates the transformation of new algorithms into ImageJ plugins that can be shared with end users through an integrated update system. We propose Fiji as a platform for productive collaboration between computer science and biology research communities.

  9. VMStools: Open-source software for the processing, analysis and visualisation of fisheries logbook and VMS data

    NARCIS (Netherlands)

    Hintzen, N.T.; Bastardie, F.; Beare, D.J.; Piet, G.J.; Ulrich, C.; Deporte, N.; Egekvist, J.; Degel, H.

    2012-01-01

    VMStools is a package of open-source software, built using the freeware environment R, specifically developed for the processing, analysis and visualisation of landings (logbooks) and vessel location data (VMS) from commercial fisheries. Analyses start with standardized data formats for logbook

  10. EVALUATION OF POLLUTION PREVENTION OPTIONS TO REDUCE STYRENE EMISSIONS FROM FIBER-REINFORCED PLASTIC OPEN MOLDING PROCESSES

    Science.gov (United States)

    Pollution prevention (P2) options to reduce styrene emissions, such as new materials, and application equipment, are commercially available to the operators of open molding processes. However, information is lacking on the emissions reduction that these options can achieve. To me...

  11. Development of Open source-based automatic shooting and processing UAV imagery for Orthoimage Using Smart Camera UAV

    Science.gov (United States)

    Park, J. W.; Jeong, H. H.; Kim, J. S.; Choi, C. U.

    2016-06-01

    Recently, aerial photography with unmanned aerial vehicle (UAV) systems has used UAVs and remote control through connections to a ground control system over a radio frequency (RF) modem with a bandwidth of about 430 MHz. However, as mentioned earlier, the existing method of using an RF modem has limitations in long distance communication. The Smart Camera's LTE (long-term evolution), Bluetooth, and Wi-Fi were used to implement a UAV with the developed UAV communication module system, which carried out close-range aerial photogrammetry with automatic shooting. The automatic shooting system consists of an image capturing device for the drone in areas that need image capturing, and software for loading and managing the smart camera. The system is composed of automatic shooting using the sensor of the smart camera and shooting catalog management, which manages filmed images and information. The UAV imagery processing module used Open Drone Map. This study examined the feasibility of using the Smart Camera as the payload for a photogrammetric UAV system. The open source tools used were Android, OpenCV (Open Computer Vision), RTKLIB, and Open Drone Map.

  12. Development of Open source-based automatic shooting and processing UAV imagery for Orthoimage Using Smart Camera UAV

    Directory of Open Access Journals (Sweden)

    J. W. Park

    2016-06-01

    Full Text Available Recently, aerial photography with unmanned aerial vehicle (UAV) systems has used UAVs and remote control through connections to a ground control system over a radio frequency (RF) modem with a bandwidth of about 430 MHz. However, as mentioned earlier, the existing method of using an RF modem has limitations in long distance communication. The Smart Camera's LTE (long-term evolution), Bluetooth, and Wi-Fi were used to implement a UAV with the developed UAV communication module system, which carried out close-range aerial photogrammetry with automatic shooting. The automatic shooting system consists of an image capturing device for the drone in areas that need image capturing, and software for loading and managing the smart camera. The system is composed of automatic shooting using the sensor of the smart camera and shooting catalog management, which manages filmed images and information. The UAV imagery processing module used Open Drone Map. This study examined the feasibility of using the Smart Camera as the payload for a photogrammetric UAV system. The open source tools used were Android, OpenCV (Open Computer Vision), RTKLIB, and Open Drone Map.

  13. Exact run length distribution of the double sampling x-bar chart with estimated process parameters

    Directory of Open Access Journals (Sweden)

    Teoh, W. L.

    2016-05-01

    Full Text Available Since the run length distribution is generally highly skewed, a significant concern about focusing too much on the average run length (ARL) criterion is that we may miss some crucial information about a control chart's performance. Thus it is important to investigate the entire run length distribution of a control chart for an in-depth understanding before implementing the chart in process monitoring. In this paper, the percentiles of the run length distribution for the double sampling (DS) X-bar chart with estimated process parameters are computed. Knowledge of the percentiles of the run length distribution provides a more comprehensive understanding of the expected behaviour of the run length. This additional information includes the early false alarm, the skewness of the run length distribution, and the median run length (MRL). A comparison of the run length distribution between the optimal ARL-based and MRL-based DS X-bar charts with estimated process parameters is presented in this paper. Examples of applications are given to aid practitioners in selecting the best design scheme of the DS X-bar chart with estimated process parameters, based on their specific purpose.
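
    The DS X-bar chart with estimated parameters requires the dedicated computation described in the paper, but the following simplified Python sketch shows why run-length percentiles (including the median run length, MRL) carry information beyond the ARL: for a basic Shewhart chart with known parameters the run length is geometric, so its percentiles follow directly; all numbers are illustrative.

      from math import ceil, log
      from scipy.stats import norm

      def run_length_percentile(q, L=3.0, shift=0.0):
          """q-th percentile of the run length for control limit L and standardized mean shift."""
          p_signal = norm.cdf(-L - shift) + norm.sf(L - shift)   # per-sample signal probability
          return ceil(log(1.0 - q) / log(1.0 - p_signal))        # geometric-distribution percentile

      arl = 1.0 / (norm.cdf(-3.0) + norm.sf(3.0))
      print(f"in-control ARL ~ {arl:.0f}")
      for q in (0.1, 0.5, 0.9):
          print(f"{int(q * 100)}th percentile of the run length: {run_length_percentile(q)}")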

  14. Modelling spatiotemporal distribution patterns of earthworms in order to indicate hydrological soil processes

    Science.gov (United States)

    Palm, Juliane; Klaus, Julian; van Schaik, Loes; Zehe, Erwin; Schröder, Boris

    2010-05-01

    Soils provide central ecosystem functions in recycling nutrients, detoxifying harmful chemicals as well as regulating microclimate and local hydrological processes. The internal regulation of these functions and therefore the development of healthy and fertile soils mainly depend on the functional diversity of plants and animals. Soil organisms drive essential processes such as litter decomposition, nutrient cycling, water dynamics, and soil structure formation. Disturbances by different soil management practices (e.g., soil tillage, fertilization, pesticide application) affect the distribution and abundance of soil organisms and hence influence regulating processes. The strong relationship between environmental conditions and soil organisms gives us the opportunity to link spatiotemporal distribution patterns of indicator species with the potential provision of essential soil processes on different scales. Earthworms are key organisms for soil function and affect, among other things, water dynamics and solute transport in soils. Through their burrowing activity, earthworms increase the number of macropores by building semi-permanent burrow systems. In the unsaturated zone, earthworm burrows act as preferential flow pathways and affect water infiltration, surface-, subsurface- and matrix flow as well as the transport of water and solutes into deeper soil layers. Thereby different ecological earthworm types have different importance. Deep burrowing anecic earthworm species (e.g., Lumbricus terrestris) affect the vertical flow and thus increase the risk of potential contamination of ground water with agrochemicals. In contrast, horizontal burrowing endogeic (e.g., Aporrectodea caliginosa) and epigeic species (e.g., Lumbricus rubellus) increase water conductivity and the diffuse distribution of water and solutes in the upper soil layers. The question which processes are more relevant is pivotal for soil management and risk assessment. Thus, finding relevant

  15. Distribution and interplay of geologic processes on Titan from Cassini radar data

    Science.gov (United States)

    Lopes, R.M.C.; Stofan, E.R.; Peckyno, R.; Radebaugh, J.; Mitchell, K.L.; Mitri, Giuseppe; Wood, C.A.; Kirk, R.L.; Wall, S.D.; Lunine, J.I.; Hayes, A.; Lorenz, R.; Farr, Tom; Wye, L.; Craig, J.; Ollerenshaw, R.J.; Janssen, M.; LeGall, A.; Paganelli, F.; West, R.; Stiles, B.; Callahan, P.; Anderson, Y.; Valora, P.; Soderblom, L.

    2010-01-01

    The Cassini Titan Radar Mapper is providing an unprecedented view of Titan's surface geology. Here we use Synthetic Aperture Radar (SAR) image swaths (Ta-T30) obtained from October 2004 to December 2007 to infer the geologic processes that have shaped Titan's surface. These SAR swaths cover about 20% of the surface, at a spatial resolution ranging from ~350 m to ~2 km. The SAR data are distributed over a wide latitudinal and longitudinal range, enabling some conclusions to be drawn about the global distribution of processes. They reveal a geologically complex surface that has been modified by all the major geologic processes seen on Earth - volcanism, tectonism, impact cratering, and erosion and deposition by fluvial and aeolian activity. In this paper, we map geomorphological units from SAR data and analyze their areal distribution and relative ages of modification in order to infer the geologic evolution of Titan's surface. We find that dunes and hummocky and mountainous terrains are more widespread than lakes, putative cryovolcanic features, mottled plains, and craters and crateriform structures that may be due to impact. Undifferentiated plains are the largest areal unit; their origin is uncertain. In terms of latitudinal distribution, dunes and hummocky and mountainous terrains are located mostly at low latitudes (less than 30°), with no dunes being present above 60°. Channels formed by fluvial activity are present at all latitudes, but lakes are at high latitudes only. Crateriform structures that may have been formed by impact appear to be uniformly distributed with latitude, but the well-preserved impact craters are all located at low latitudes, possibly indicating that more resurfacing has occurred at higher latitudes. Cryovolcanic features are not ubiquitous, and are mostly located between 30° and 60° north. We examine temporal relationships between units wherever possible, and conclude that aeolian and fluvial/pluvial/lacustrine processes are the

  16. Adaptive Dynamic Process Scheduling on Distributed Memory Parallel Computers

    Directory of Open Access Journals (Sweden)

    Wei Shu

    1994-01-01

    Full Text Available One of the challenges in programming distributed memory parallel machines is deciding how to allocate work to processors. This problem is particularly important for computations with unpredictable dynamic behaviors or irregular structures. We present a scheme for dynamic scheduling of medium-grained processes that is useful in this context. The adaptive contracting within neighborhood (ACWN) is a dynamic, distributed, load-dependent, and scalable scheme. It deals with dynamic and unpredictable creation of processes and adapts to different systems. The scheme is described and contrasted with two other schemes that have been proposed in this context, namely the randomized allocation and the gradient model. The performance of the three schemes on an Intel iPSC/2 hypercube is presented and analyzed. The experimental results show that even though the ACWN algorithm incurs somewhat larger overhead than the randomized allocation, it achieves better performance in most cases due to its adaptiveness. Its feature of quickly spreading the work helps it outperform the gradient model in performance and scalability.

  17. Bridging the gap between a stationary point process and its Palm distribution

    NARCIS (Netherlands)

    Nieuwenhuis, G.

    1994-01-01

    In the context of stationary point processes measurements are usually made from a time point chosen at random or from an occurrence chosen at random. That is, either the stationary distribution P or its Palm distribution P° is the ruling probability measure. In this paper an approach is presented to

  18. Analysing the Outbound logistics process enhancements in Nokia-Siemens Networks Global Distribution Center

    OpenAIRE

    Marjeta, Katri

    2011-01-01

    Marjeta, Katri. 2011. Analysing the outbound logistics process enhancements in Nokia-Siemens Networks Global Distribution Center. Master's thesis. Kemi-Tornio University of Applied Sciences. Business and Culture. Pages 57. Due to confidentiality issues, this work has been modified from its original form. The aim of this Master's thesis work is to describe and analyze the outbound logistics process enhancement projects executed in Nokia-Siemens Networks Global Distribution Center after the N...

  19. A distributed process monitoring system for nuclear powered electrical generating facilities

    International Nuclear Information System (INIS)

    Sweney, A.D.

    1991-01-01

    Duke Power Company is one of the largest investor owned utilities in the United States, with a service area of 20,000 square miles extending across North and South Carolina. Oconee Nuclear Station, one of Duke Power's three nuclear generating facilities, is a three unit pressurized water reactor site and has, over the course of its 15-year operating lifetime, effectively run out of plant processing capability. From a severely overcrowded cable spread room to an aging overtaxed Operator Aid Computer, the problems with trying to add additional process variables to the present centralized Operator Aid Computer are almost insurmountable obstacles. This paper reports that for this reason, and to realize the inherent benefits of a distributed process monitoring and control system, Oconee has embarked on a project to demonstrate the ability of a distributed system to perform in the nuclear power plant environment

  20. Exposure to an open-field arena increases c-Fos expression in a distributed anxiety-related system projecting to the basolateral amygdaloid complex

    DEFF Research Database (Denmark)

    Hale, M.W.; Hay-Schmidt, A.; Mikkelsen, J.D.

    2008-01-01

    Anxiety states and anxiety-related behaviors appear to be regulated by a distributed and highly interconnected system of brain structures including the basolateral amygdala. Our previous studies demonstrate that exposure of rats to an open-field in high- and low-light conditions results in a marked increase in c-Fos expression in the anterior part of the basolateral amygdaloid nucleus (BLA) compared with controls. The neural mechanisms underlying the anatomically specific effects of open-field exposure on c-Fos expression in the BLA are not clear, however, it is likely that this reflects activation ... to this region in combination with c-Fos immunostaining to identify cells responding to exposure to an open-field arena in low-light (8-13 lux) conditions (an anxiogenic stimulus in rats). Adult male Wistar rats received a unilateral microinjection of 4% CTb in phosphate-buffered saline into the basolateral

  1. Open Access publishing in physics gains momentum

    CERN Multimedia

    2006-01-01

    The first meeting of European particle physics funding agencies took place on 3 November at CERN to establish a consortium for Open Access publishing in particle physics, SCOAP3 (Sponsoring Consortium for Open Access Publishing in Particle Physics). Open Access could transform the academic publishing world, with a great impact on research. The traditional model of research publication is funded through reader subscriptions. Open Access will turn this model on its head by changing the funding structure of research results, without increasing the overall cost of publishing. Instead of demanding payment from readers, publications will be distributed free of charge, financed by funding agencies via laboratories and the authors. This new concept will bring greater benefits and broaden opportunities for researchers and funding agencies by providing unrestricted distribution of the results of publicly funded research. The meeting marked a positive step forward, with international support from laboratories, fundin...

  2. Open Source Business Solutions

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2008-01-01

    Full Text Available This paper analyses the open source movement. The open source development process and its management are seen differently from the classical point of view. The paper focuses on characteristics and software market tendencies for the main open source initiatives. It also points out the future evolution of the labor market for software developers.

  3. PREDICTION OF AEROSOL HAZARDS ARISING FROM THE OPENING OF AN ANTHRAX-TAINTED LETTER IN AN OPEN OFFICE ENVIRONMENT USING COMPUTATIONAL FLUID DYNAMICS

    Directory of Open Access Journals (Sweden)

    FUE-SANG LIEN

    2010-09-01

    Full Text Available Early experimental work, conducted at Defence R&D Canada–Suffield, measured and characterized the personal and environmental contamination associated with simulated anthrax-tainted letters under a number of different scenarios in order to obtain a better understanding of the physical and biological processes for detecting, assessing, and formulating potential mitigation strategies for managing the risks associated with opening an anthrax-tainted letter. These experimental investigations have been extended in the present study to simulate numerically the contamination from the opening of anthrax-tainted letters in an open office environment using computational fluid dynamics (CFD). A quantity of 0.1 g of Bacillus atrophaeus (formerly referred to as Bacillus subtilis var. globigii (BG)) spores in dry powder form, which was used here as a surrogate species for Bacillus anthracis (anthrax), was released from an opened letter in the experiment. The accuracy of the model for prediction of the spatial distribution of BG spores in the office from the opened letter is assessed qualitatively (and to the extent possible, quantitatively) by detailed comparison with measured BG concentrations obtained under a number of different scenarios, some involving people moving within the office. The observed discrepancy between the numerical predictions and experimental measurements of concentration was probably the result of a number of physical processes which were not accounted for in the numerical simulation. These include air flow leakage from cracks and crevices of the building shell; the dispersion of BG spores in the Heating, Ventilation, and Air Conditioning (HVAC) system; and the effect of deposition and re-suspension of BG spores from various surfaces in the office environment.

  4. Distribution of green open space in Malang City based on multispectral data

    Science.gov (United States)

    Hasyim, A. W.; Hernawan, F. P.

    2017-06-01

    Green open space is a type of land whose existence is quite important in urban areas, where the minimum area is set to reach 30% of the total area of the city. Malang, which has an area of 110.6 square kilometers, is one of the major cities in East Java Province and is prone to land conversion due to development needs. In support of the green space program, the green space area needs to be calculated precisely, so remote sensing, which has high accuracy, is now used for the measurement of green space. This study aims to analyze the area of green open space in Malang using a Landsat 8 image from 2015. The method used was a vegetation index, namely the Normalized Difference Vegetation Index (NDVI). The study found that the calculation of green open space is better done using the vegetation index method, to avoid misclassification with other types of land use. The results of the calculation of green open space using NDVI show that the area of green open space in Malang City in 2015 reached 39% of the total area.
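
    The NDVI itself is a simple band ratio, NDVI = (NIR - Red) / (NIR + Red); the short Python sketch below illustrates the calculation with numpy on synthetic reflectance values (the 0.3 vegetation threshold and the per-pixel share calculation are assumptions for illustration, not the paper's processing chain).

      import numpy as np

      red = np.array([[0.10, 0.12], [0.30, 0.25]])     # surface reflectance, red band
      nir = np.array([[0.40, 0.45], [0.32, 0.28]])     # surface reflectance, near-infrared band

      ndvi = (nir - red) / (nir + red + 1e-9)          # NDVI = (NIR - Red) / (NIR + Red)
      green_open_space = ndvi > 0.3                    # illustrative vegetation threshold
      share = green_open_space.mean() * 100
      print(f"green open space share: {share:.1f}% of pixels")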

  5. Method for distributed agent-based non-expert simulation of manufacturing process behavior

    Science.gov (United States)

    Ivezic, Nenad; Potok, Thomas E.

    2004-11-30

    A method for distributed agent based non-expert simulation of manufacturing process behavior on a single-processor computer comprises the steps of: object modeling a manufacturing technique having a plurality of processes; associating a distributed agent with each the process; and, programming each the agent to respond to discrete events corresponding to the manufacturing technique, wherein each discrete event triggers a programmed response. The method can further comprise the step of transmitting the discrete events to each agent in a message loop. In addition, the programming step comprises the step of conditioning each agent to respond to a discrete event selected from the group consisting of a clock tick message, a resources received message, and a request for output production message.
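
    A toy Python sketch of the idea as described is given below: each manufacturing process gets an agent that reacts to the discrete events named in the text (clock tick, resources received, request for output production) delivered through a message loop; the agent behaviour itself is invented for illustration and is not the patented implementation.

      class ProcessAgent:
          def __init__(self, name):
              self.name = name
              self.resources = 0
              self.clock = 0

          def handle(self, event):
              if event == "clock_tick":
                  self.clock += 1                      # advance local simulation time
              elif event == "resources_received":
                  self.resources += 1                  # accept an upstream delivery
              elif event == "request_for_output_production":
                  if self.resources > 0:               # produce only if inputs are available
                      self.resources -= 1
                      return f"{self.name} produced one unit at t={self.clock}"
              return None

      agents = [ProcessAgent("milling"), ProcessAgent("assembly")]
      for event in ["clock_tick", "resources_received", "request_for_output_production"]:
          for agent in agents:                         # the message loop broadcasting events
              result = agent.handle(event)
              if result:
                  print(result)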

  6. Application of signal processing techniques for islanding detection of distributed generation in distribution network: A review

    International Nuclear Information System (INIS)

    Raza, Safdar; Mokhlis, Hazlie; Arof, Hamzah; Laghari, J.A.; Wang, Li

    2015-01-01

    Highlights: • Pros & cons of conventional islanding detection techniques (IDTs) are discussed. • Signal processing techniques (SPTs) ability in detecting islanding is discussed. • SPTs ability in improving performance of passive techniques are discussed. • Fourier, s-transform, wavelet, HHT & tt-transform based IDTs are reviewed. • Intelligent classifiers (ANN, ANFIS, Fuzzy, SVM) application in SPT are discussed. - Abstract: High penetration of distributed generation resources (DGR) in distribution network provides many benefits in terms of high power quality, efficiency, and low carbon emissions in power system. However, efficient islanding detection and immediate disconnection of DGR is critical in order to avoid equipment damage, grid protection interference, and personnel safety hazards. Islanding detection techniques are mainly classified into remote, passive, active, and hybrid techniques. From these, passive techniques are more advantageous due to lower power quality degradation, lower cost, and widespread usage by power utilities. However, the main limitations of these techniques are that they possess large non-detection zones and require threshold setting. Various signal processing techniques and intelligent classifiers have been used to overcome the limitations of passive islanding. Signal processing techniques, in particular, are adopted due to their versatility, stability, cost effectiveness, and ease of modification. This paper presents a comprehensive overview of signal processing techniques used to improve common passive islanding detection techniques. A performance comparison between the signal processing based islanding detection techniques with existing techniques are also provided. Finally, this paper outlines the relative advantages and limitations of the signal processing techniques in order to provide basic guidelines for researchers and field engineers in determining the best method for their system
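
    As one hedged example of the signal-processing approach surveyed above, the Python sketch below applies a discrete wavelet decomposition (PyWavelets) to a synthetic voltage waveform and uses the detail-coefficient energies as features for an islanding decision; the signal, wavelet choice and threshold are illustrative assumptions, not a method endorsed by the review.

      import numpy as np
      import pywt

      fs = 3200                                        # sampling frequency in Hz
      t = np.arange(0, 0.5, 1 / fs)
      voltage = np.sin(2 * np.pi * 50 * t)             # nominal 50 Hz waveform
      half = len(t) // 2
      voltage[half:] *= np.exp(-3 * (t[half:] - 0.25)) # crude islanding-like amplitude decay

      coeffs = pywt.wavedec(voltage, 'db4', level=4)   # multi-level discrete wavelet transform
      detail_energy = [float(np.sum(c ** 2)) for c in coeffs[1:]]

      print("detail-band energies:", [round(e, 4) for e in detail_energy])
      print("islanding suspected:", sum(detail_energy) > 1.0)   # placeholder decision rule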

  7. Cumulative distribution functions associated with bubble-nucleation processes in cavitation

    KAUST Repository

    Watanabe, Hiroshi; Suzuki, Masaru; Ito, Nobuyasu

    2010-01-01

    of the exponential distribution associated with the process are determined by taking the relaxation time into account, and the apparent finite-size effect is removed. These results imply that the use of the arithmetic mean of the waiting time until a bubble grows

  8. Distributed Services with OpenAFS For Enterprise and Education

    CERN Document Server

    Milicchio, Franco

    2007-01-01

    Shows in detail how to build enterprise-level secure, redundant, and highly scalable services from scratch on top of the open source Linux operating system, suitable for small companies as well as big universities. This book presents the core architecture, based on Kerberos, LDAP, AFS, and Samba.

  9. The Distribution of the Interval between Events of a Cox Process with Shot Noise Intensity

    Directory of Open Access Journals (Sweden)

    Angelos Dassios

    2008-01-01

    Full Text Available Applying the theory of piecewise deterministic Markov processes, the probability generating function of a Cox process incorporating a shot noise process as the claim intensity is obtained. We also derive the Laplace transform of the distribution of the shot noise process at claim jump times, using the stationarity assumption of the shot noise process. Based on this Laplace transform and on the probability generating function of a Cox process with shot noise intensity, we obtain the distribution of the interval between events of a Cox process with shot noise intensity for insurance claims, together with its moments, that is, the mean and variance.
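
    An illustrative simulation of a Cox process whose intensity is a shot-noise process (exponentially decaying jumps), giving an empirical view of the inter-event interval distribution discussed above. It uses a simple thinning scheme; all parameter values are invented and the analytic results of the paper are not reproduced here.

```python
# Illustration only: simulate a Cox process with shot-noise intensity by
# thinning, then look at the empirical inter-event intervals.
import numpy as np

rng = np.random.default_rng(1)

def shot_noise_intensity(t, jump_times, jump_sizes, decay, base):
    mask = jump_times <= t
    return base + np.sum(jump_sizes[mask] * np.exp(-decay * (t - jump_times[mask])))

def simulate_cox(T=200.0, base=0.5, jump_rate=0.2, mean_jump=2.0, decay=0.8):
    jump_times = np.cumsum(rng.exponential(1 / jump_rate, size=int(3 * jump_rate * T)))
    jump_times = jump_times[jump_times < T]
    jump_sizes = rng.exponential(mean_jump, size=jump_times.size)
    lam_max = base + jump_sizes.sum()        # crude global bound for thinning
    events, t = [], 0.0
    while t < T:
        t += rng.exponential(1 / lam_max)
        if t < T and rng.uniform() * lam_max <= shot_noise_intensity(
                t, jump_times, jump_sizes, decay, base):
            events.append(t)
    return np.asarray(events)

claims = simulate_cox()
intervals = np.diff(claims)
print("mean interval:", intervals.mean(), "variance:", intervals.var())
```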

  10. Acceleration of the OpenFOAM-based MHD solver using graphics processing units

    International Nuclear Information System (INIS)

    He, Qingyun; Chen, Hongli; Feng, Jingchao

    2015-01-01

    Highlights: • A 3D PISO MHD solver was implemented on Kepler-class graphics processing units (GPUs) using CUDA technology. • A consistent and conservative scheme is used in the code, which was validated by three basic benchmarks in rectangular and round ducts. • Parallel CPU and GPU acceleration were compared against a single-core CPU for MHD and non-MHD problems. • Different preconditioners for the MHD solver were compared, and the results showed that the AMG method is better for these calculations. - Abstract: The pressure-implicit with splitting of operators (PISO) magnetohydrodynamics (MHD) solver for the coupled Navier–Stokes and Maxwell equations was implemented on Kepler-class graphics processing units (GPUs) using the CUDA technology. The solver is developed on the open source code OpenFOAM and is based on a consistent and conservative scheme suitable for simulating MHD flow under strong magnetic fields in fusion liquid metal blankets with structured or unstructured meshes. We verified the validity of the implementation on several standard cases, including benchmark I (the Shercliff and Hunt cases), benchmark II (fully developed circular pipe MHD flow) and benchmark III (the KIT experimental case). Computational performance of the GPU implementation was examined by comparing its double precision run times with those of essentially the same algorithms and meshes on the CPU. The results showed that a GPU (GTX 770) can outperform a server-class 4-core, 8-thread CPU (Intel Core i7-4770k) by a factor of at least 2.

  11. Acceleration of the OpenFOAM-based MHD solver using graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    He, Qingyun; Chen, Hongli, E-mail: hlchen1@ustc.edu.cn; Feng, Jingchao

    2015-12-15

    Highlights: • A 3D PISO MHD solver was implemented on Kepler-class graphics processing units (GPUs) using CUDA technology. • A consistent and conservative scheme is used in the code, which was validated by three basic benchmarks in rectangular and round ducts. • Parallel CPU and GPU acceleration were compared against a single-core CPU for MHD and non-MHD problems. • Different preconditioners for the MHD solver were compared, and the results showed that the AMG method is better for these calculations. - Abstract: The pressure-implicit with splitting of operators (PISO) magnetohydrodynamics (MHD) solver for the coupled Navier–Stokes and Maxwell equations was implemented on Kepler-class graphics processing units (GPUs) using the CUDA technology. The solver is developed on the open source code OpenFOAM and is based on a consistent and conservative scheme suitable for simulating MHD flow under strong magnetic fields in fusion liquid metal blankets with structured or unstructured meshes. We verified the validity of the implementation on several standard cases, including benchmark I (the Shercliff and Hunt cases), benchmark II (fully developed circular pipe MHD flow) and benchmark III (the KIT experimental case). Computational performance of the GPU implementation was examined by comparing its double precision run times with those of essentially the same algorithms and meshes on the CPU. The results showed that a GPU (GTX 770) can outperform a server-class 4-core, 8-thread CPU (Intel Core i7-4770k) by a factor of at least 2.

  12. Land surface temperature distribution and development for green open space in Medan city using imagery-based satellite Landsat 8

    Science.gov (United States)

    Sulistiyono, N.; Basyuni, M.; Slamet, B.

    2018-03-01

    Green open space (GOS) is one of the requirements for a city to be comfortable to live in. GOS can reduce land surface temperature (LST) and air pollution. Medan is one of the biggest cities in Indonesia and has experienced rapid development; however, early development tended to neglect the existence of GOS in the city. The objective of the study is to determine the distribution of land surface temperature and the relationship between the normalized difference vegetation index (NDVI) and the priority of GOS development in Medan City using Landsat 8 satellite imagery. The method correlates the distribution of land surface temperature, derived from the digital number values of band 10, with the NDVI, computed from the ratio of bands five and four of the Landsat 8 images. The results showed that the distribution of land surface temperature in Medan City in 2016 ranged from 20.57 to 33.83 °C. The relationship between the LST distribution and NDVI was inverse, with a negative correlation of -0.543 (sig. 0.000). The development of GOS in Medan City is therefore directed by the allocation of LST and divided into three priority classes: the first priority class covers 5,119.71 ha, the second priority 16,935.76 ha, and the third priority 6,118.50 ha.
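
    A hedged sketch of the band 10 temperature step and the LST/NDVI correlation described above. It stops at brightness temperature (often used as a proxy for LST; a full LST retrieval would add an emissivity correction), and the radiance and thermal constants are typical Landsat 8 values that in practice must be read from the scene's own MTL metadata file.

```python
# Hedged sketch: Landsat 8 band 10 DN -> brightness temperature, correlated
# with NDVI. Calibration constants are typical values, not scene-specific.
import numpy as np

ML, AL = 3.342e-4, 0.1            # band 10 radiance rescaling (typical values)
K1, K2 = 774.8853, 1321.0789      # band 10 thermal constants (typical values)

def brightness_temperature_c(dn_band10: np.ndarray) -> np.ndarray:
    radiance = ML * dn_band10 + AL
    return K2 / np.log(K1 / radiance + 1.0) - 273.15   # Kelvin -> Celsius

def ndvi(band5_nir, band4_red):
    return (band5_nir - band4_red) / (band5_nir + band4_red + 1e-12)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    dn10 = rng.uniform(20000, 30000, 10000)            # stand-in band 10 DNs
    nir, red = rng.uniform(0, 0.6, 10000), rng.uniform(0, 0.4, 10000)
    lst = brightness_temperature_c(dn10)
    r = np.corrcoef(lst, ndvi(nir, red))[0, 1]
    print(f"LST range: {lst.min():.1f}..{lst.max():.1f} degC, corr(LST, NDVI) = {r:.3f}")
```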

  13. Problem of uniqueness in the renewal process generated by the uniform distribution

    Directory of Open Access Journals (Sweden)

    D. Ugrin-Šparac

    1992-01-01

    Full Text Available The renewal process generated by the uniform distribution, when interpreted as a transformation of the uniform distribution into a discrete distribution, gives rise to the question of uniqueness of the inverse image. The paper deals with a particular problem from the described domain, that arose in the construction of a complex stochastic test intended to evaluate pseudo-random number generators. The connection of the treated problem with the question of a unique integral representation of Gamma-function is also mentioned.

  14. Distributed Random Process for a Large-Scale Peer-to-Peer Lottery

    OpenAIRE

    Grumbach, Stéphane; Riemann, Robert

    2017-01-01

    International audience; Most online lotteries today fail to ensure the verifiability of the random process and rely on a trusted third party. This issue has received little attention since the emergence of distributed protocols like Bitcoin that demonstrated the potential of protocols with no trusted third party. We argue that the security requirements of online lotteries are similar to those of online voting, and propose a novel distributed online lottery protocol that applies techniques dev...

  15. Exploring the Role of Distributed Learning in Distance Education at Allama Iqbal Open University: Academic Challenges at Postgraduate Level

    Directory of Open Access Journals (Sweden)

    Qadir BUKHSH

    2015-01-01

    Full Text Available Distributed learning is derived from the concept of distributed resources. Different institutions around the globe are connected through a network, and the learners are diverse, located in different cultures and communities. Distributed learning provides global standards of quality to all learners through synchronous and asynchronous communication, offers the opportunity for flexible and independent learning with equity and low-cost educational services, and has become the first choice of dispersed learners around the globe. The present study was undertaken to investigate the challenges faced by the faculty members of the Departments of Business Administration and Computer Science at Allama Iqbal Open University, Islamabad, Pakistan. Twenty-five faculty members from both departments were taken as the sample of the study (100% sampling). The study was qualitative in nature and interviews were the data collection tool. Data were analyzed using a thematic analysis technique. The major challenges faced by the faculty members were: bandwidth, synchronous learning activities, irregularity of the learners, feedback on individual work, designing and managing the learning activities, quality issues, and training to use the network for teaching and learning activities.

  16. An open-source hardware and software system for acquisition and real-time processing of electrophysiology during high field MRI.

    Science.gov (United States)

    Purdon, Patrick L; Millan, Hernan; Fuller, Peter L; Bonmassar, Giorgio

    2008-11-15

    Simultaneous recording of electrophysiology and functional magnetic resonance imaging (fMRI) is a technique of growing importance in neuroscience. Rapidly evolving clinical and scientific requirements have created a need for hardware and software that can be customized for specific applications. Hardware may require customization to enable a variety of recording types (e.g., electroencephalogram, local field potentials, or multi-unit activity) while meeting the stringent and costly requirements of MRI safety and compatibility. Real-time signal processing tools are an enabling technology for studies of learning, attention, sleep, epilepsy, neurofeedback, and neuropharmacology, yet real-time signal processing tools are difficult to develop. We describe an open-source system for simultaneous electrophysiology and fMRI featuring low-noise recording (tested up to 7T) and user-programmable real-time signal processing. The hardware distribution provides the complete specifications required to build an MRI-compatible electrophysiological data acquisition system, including circuit schematics, printed circuit board (PCB) layouts, Gerber files for PCB fabrication and robotic assembly, a bill of materials with part numbers, data sheets, and vendor information, and test procedures. The software facilitates rapid implementation of real-time signal processing algorithms. This system has been used in human EEG/fMRI studies at 3 and 7T examining the auditory system, visual system, sleep physiology, and anesthesia, as well as in intracranial electrophysiological studies of the non-human primate visual system during 3T fMRI, and in human hyperbaric physiology studies at depths of up to 300 feet below sea level.
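
    Not the system described above, just a small sketch of the kind of user-programmable real-time step such a framework enables: a causal band-pass filter applied block-by-block to streaming samples while carrying the filter state between blocks. The sampling rate and band edges are illustrative assumptions.

```python
# Sketch of a streaming (causal, stateful) band-pass filter applied to
# incoming sample blocks, as one might do in a real-time EEG pipeline.
import numpy as np
from scipy import signal

FS = 1000.0                                   # assumed sampling rate, Hz
b, a = signal.butter(4, [1.0, 40.0], btype="bandpass", fs=FS)
zi = signal.lfilter_zi(b, a) * 0.0            # initial filter state (zeros)

def process_block(block: np.ndarray) -> np.ndarray:
    """Filter one incoming block, keeping state so the stream stays causal."""
    global zi
    out, zi = signal.lfilter(b, a, block, zi=zi)
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    for _ in range(5):                        # pretend these blocks arrive in real time
        raw = rng.normal(size=256) + np.sin(2 * np.pi * 10 * np.arange(256) / FS)
        filtered = process_block(raw)
        print("block RMS:", round(float(np.sqrt((filtered ** 2).mean())), 3))
```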

  17. The SCOAP3 initiative and the Open Access Article-Processing-Charge market: global partnership and competition improve value in the dissemination of science

    CERN Document Server

    Romeu, Clément; Kohls, Alexander; Mansuy, Anne; Mele, Salvatore; Vesper, Martin

    2014-01-01

    The SCOAP3 (Sponsoring Consortium for Open Access Publishing in Particle Physics) initiative is an international partnership to convert to Open Access the published literature in the field of High-Energy Physics (HEP). It has been in operation since January 2014, and covers more than 4’000 articles/year. Originally initiated by CERN, the European Organization for Nuclear Research, and now counting partners representing 41 countries and 3 intergovernmental organizations, SCOAP3 has successfully converted to Open Access all, or part of, 6 HEP journals previously restricted to subscribers. It is also supporting publication of articles in 4 existing Open Access journals. As a “Gold” Open Access initiative, SCOAP3 pays Article Processing Charges (APCs), as publishers’ source of revenue for the publication service. Differentiating itself from other Open Access initiatives, SCOAP3 set APCs through a tendering process, correlating quality and price, at consistent conditions across participating publishers. Th...

  18. OpenSHS: Open Smart Home Simulator

    Directory of Open Access Journals (Sweden)

    Nasser Alshammari

    2017-05-01

    Full Text Available This paper develops a new hybrid, open-source, cross-platform 3D smart home simulator, OpenSHS, for dataset generation. OpenSHS offers an opportunity for researchers in the field of the Internet of Things (IoT) and machine learning to test and evaluate their models. Following a hybrid approach, OpenSHS combines advantages from both interactive and model-based approaches. This approach reduces the time and efforts required to generate simulated smart home datasets. We have designed a replication algorithm for extending and expanding a dataset. A small sample dataset produced, by OpenSHS, can be extended without affecting the logical order of the events. The replication provides a solution for generating large representative smart home datasets. We have built an extensible library of smart devices that facilitates the simulation of current and future smart home environments. Our tool divides the dataset generation process into three distinct phases: first design: the researcher designs the initial virtual environment by building the home, importing smart devices and creating contexts; second, simulation: the participant simulates his/her context-specific events; and third, aggregation: the researcher applies the replication algorithm to generate the final dataset. We conducted a study to assess the ease of use of our tool on the System Usability Scale (SUS).

  19. OpenSHS: Open Smart Home Simulator.

    Science.gov (United States)

    Alshammari, Nasser; Alshammari, Talal; Sedky, Mohamed; Champion, Justin; Bauer, Carolin

    2017-05-02

    This paper develops a new hybrid, open-source, cross-platform 3D smart home simulator, OpenSHS, for dataset generation. OpenSHS offers an opportunity for researchers in the field of the Internet of Things (IoT) and machine learning to test and evaluate their models. Following a hybrid approach, OpenSHS combines advantages from both interactive and model-based approaches. This approach reduces the time and efforts required to generate simulated smart home datasets. We have designed a replication algorithm for extending and expanding a dataset. A small sample dataset produced, by OpenSHS, can be extended without affecting the logical order of the events. The replication provides a solution for generating large representative smart home datasets. We have built an extensible library of smart devices that facilitates the simulation of current and future smart home environments. Our tool divides the dataset generation process into three distinct phases: first design: the researcher designs the initial virtual environment by building the home, importing smart devices and creating contexts; second, simulation: the participant simulates his/her context-specific events; and third, aggregation: the researcher applies the replication algorithm to generate the final dataset. We conducted a study to assess the ease of use of our tool on the System Usability Scale (SUS).

  20. SIproc: an open-source biomedical data processing platform for large hyperspectral images.

    Science.gov (United States)

    Berisha, Sebastian; Chang, Shengyuan; Saki, Sam; Daeinejad, Davar; He, Ziqi; Mankar, Rupali; Mayerich, David

    2017-04-10

    There has recently been significant interest within the vibrational spectroscopy community to apply quantitative spectroscopic imaging techniques to histology and clinical diagnosis. However, many of the proposed methods require collecting spectroscopic images that have a similar region size and resolution to the corresponding histological images. Since spectroscopic images contain significantly more spectral samples than traditional histology, the resulting data sets can approach hundreds of gigabytes to terabytes in size. This makes them difficult to store and process, and the tools available to researchers for handling large spectroscopic data sets are limited. Fundamental mathematical tools, such as MATLAB, Octave, and SciPy, are extremely powerful but require that the data be stored in fast memory. This memory limitation becomes impractical for even modestly sized histological images, which can be hundreds of gigabytes in size. In this paper, we propose an open-source toolkit designed to perform out-of-core processing of hyperspectral images. By taking advantage of graphical processing unit (GPU) computing combined with adaptive data streaming, our software alleviates common workstation memory limitations while achieving better performance than existing applications.
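
    An illustrative out-of-core pattern of the kind SIproc targets: streaming a large hyperspectral cube from disk in spatial chunks with a memory map instead of loading it into RAM. The file name, cube shape and the per-pixel reduction below are invented for illustration and are not SIproc's API.

```python
# Out-of-core sketch: process a large hyperspectral cube chunk by chunk with
# numpy.memmap so only a small slice is resident in memory at any time.
import numpy as np

ROWS, COLS, BANDS = 2048, 2048, 400
PATH = "cube_float32.raw"          # hypothetical band-interleaved-by-pixel file

def mean_spectrum(path=PATH, chunk_rows=64):
    cube = np.memmap(path, dtype=np.float32, mode="r", shape=(ROWS, COLS, BANDS))
    total = np.zeros(BANDS, dtype=np.float64)
    for r in range(0, ROWS, chunk_rows):       # only one chunk in RAM at a time
        chunk = np.asarray(cube[r:r + chunk_rows])
        total += chunk.reshape(-1, BANDS).sum(axis=0)
    return total / (ROWS * COLS)

# Example (requires the raw file above to exist on disk):
#   spectrum = mean_spectrum()
#   print(spectrum[:5])
```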

  1. Distributed real time data processing architecture for the TJ-II data acquisition system

    International Nuclear Information System (INIS)

    Ruiz, M.; Barrera, E.; Lopez, S.; Machon, D.; Vega, J.; Sanchez, E.

    2004-01-01

    This article describes the performance of a new architecture model that has been developed for the TJ-II data acquisition system in order to increase its real-time data processing capabilities. The current model consists of several compact PCI extensions for instrumentation (PXI) standard chassis, each one with various digitizers. In this architecture, the data processing capability is restricted to the PXI controller's own performance. The controller must share its CPU resources between the data processing and the data acquisition tasks. In the new model, a distributed data processing architecture has been developed. The solution adds one or more processing cards to each PXI chassis. This way it is possible to plan how to distribute the processing of all acquired signals among the processing cards and the available resources of the PXI controller. This model allows scalability of the system: more or fewer processing cards can be added based on the requirements of the system. The processing algorithms are implemented in LabVIEW (from National Instruments), providing efficiency and time savings in application development when compared with other solutions.

  2. Marketing promotion in the consumer goods’ retail distribution process

    Directory of Open Access Journals (Sweden)

    S.Bălăşescu

    2013-06-01

    Full Text Available The fundamental characteristic of contemporary marketing is the total opening towards three major directions: consumer needs, organization needs and society’s needs. The continuous expansion of marketing has been accompanied by a process of differentiation and specialization. Differentiation has led to the so called “specific marketing”. In this paper, we aim to explain that in the retail companies, the concept of sales marketing can be distinguished as an independent marketing specialization. The main objectives for this paper are: the definition and delimitation of consumer goods’ sales marketing in the retail business and the sectoral approach of the marketing concept and its specific techniques for the retail activities.

  3. Distributed system for parallel data processing of ECT signals for electromagnetic flaw detection in materials

    International Nuclear Information System (INIS)

    Guliashki, Vassil; Marinova, Galia

    2002-01-01

    The paper proposes a distributed system for parallel data processing of ECT signals for flaw detection in materials. The measured data are stored in files on a host computer, where a Java server is located. The host computer is connected through the Internet to a set of geographically distributed client computers. The data are distributed from the host computer, by means of the Java server, to the client computers according to their requests. The software necessary for the data processing is installed on each client computer in advance. Organizing the data processing on many computers working simultaneously in parallel leads to a great reduction in processing time, especially in cases when a huge amount of data must be processed in a very short time. (Author)

  4. A decision support system using analytical hierarchy process (AHP) for the optimal environmental reclamation of an open-pit mine

    Science.gov (United States)

    Bascetin, A.

    2007-04-01

    The selection of an optimal reclamation method is one of the most important factors in open-pit design and production planning. It also affects economic considerations in open-pit design as a function of plan location and depth. Furthermore, the selection is a complex multi-person, multi-criteria decision problem. The group decision-making process can be improved by applying a systematic and logical approach to assess the priorities based on the inputs of several specialists from different functional areas within the mine company. The analytical hierarchy process (AHP) can be very useful in involving several decision makers with different conflicting objectives to arrive at a consensus decision. In this paper, the selection of an optimal reclamation method using an AHP-based model was evaluated for coal production in an open-pit coal mine located in the Seyitomer region of Turkey. The use of the proposed model indicates that it can be applied to improve group decision making in selecting a reclamation method that satisfies the optimal specifications. Also, it is found that the decision process is systematic, and using the proposed model can reduce the time taken to select an optimal method.
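
    A minimal AHP sketch, assuming a pairwise comparison matrix for candidate reclamation methods has already been agreed by the decision makers; the matrix below is invented for illustration and is not taken from the study. It computes the priority weights from the principal eigenvector and checks consistency via the consistency ratio.

```python
# Minimal AHP sketch: priority weights and consistency ratio from a pairwise
# comparison matrix. The 3x3 matrix is a made-up example.
import numpy as np

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}   # random consistency indices

def ahp_weights(A: np.ndarray):
    """Return the priority vector (principal eigenvector) and consistency ratio."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)                # consistency index
    cr = ci / RI[n] if RI[n] else 0.0                # consistency ratio
    return w, cr

if __name__ == "__main__":
    # pairwise comparisons of three hypothetical reclamation options
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])
    weights, cr = ahp_weights(A)
    print("weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))
```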

  5. Wigner distribution function and entropy of the damped harmonic oscillator within the theory of the open quantum systems

    Science.gov (United States)

    Isar, Aurelian

    1995-01-01

    The harmonic oscillator with dissipation is studied within the framework of the Lindblad theory for open quantum systems. By using the Wang-Uhlenbeck method, the Fokker-Planck equation, obtained from the master equation for the density operator, is solved for the Wigner distribution function, subject to either the Gaussian type or the delta-function type of initial conditions. The obtained Wigner functions are two-dimensional Gaussians with different widths. Then a closed expression for the density operator is extracted. The entropy of the system is subsequently calculated and its temporal behavior shows that this quantity relaxes to its equilibrium value.
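
    For reference, a generic centred two-dimensional Gaussian of the type the abstract says solves the Fokker-Planck equation; the time-dependent widths and correlation depend on the damping coefficients and are not reproduced here.

```latex
% Generic two-dimensional Gaussian Wigner function (illustrative form only);
% \sigma_q(t), \sigma_p(t) and the correlation r(t) follow from the damping
% coefficients of the Lindblad model and are not given here.
W(q,p,t) = \frac{1}{2\pi\sigma_q\sigma_p\sqrt{1-r^2}}
  \exp\!\left[-\frac{1}{2(1-r^2)}\left(\frac{q^2}{\sigma_q^2}
  - \frac{2r\,qp}{\sigma_q\sigma_p} + \frac{p^2}{\sigma_p^2}\right)\right]
```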

  6. Open source intelligence: A tool to combat illicit trafficking

    Energy Technology Data Exchange (ETDEWEB)

    Sjoeberg, J [Swedish Armed Forces HQ, Stockholm (Sweden)

    2001-10-01

    The purpose of my presentation is to provide some thoughts on Open Sources and how Open Sources can be used as tools for detecting illicit trafficking and proliferation. To fulfill this purpose I would like to deal with the following points during my presentation: What is Open Source? How can it be defined? - Different sources - Methods. Open Source information can be defined as publicly available information as well as other unclassified information that has limited public distribution or access. It comes in print, electronic or oral form. It can be found distributed either to the mass public by print or electronic media or to a much more limited customer base like companies, experts or specialists of some kind, including the so-called gray literature. Open Source information is not a single source but a multi-source. Thus, you can say that 'Open Source' does not say anything about the information itself; it only refers to whether the information is classified or not.

  7. Open source intelligence: A tool to combat illicit trafficking

    International Nuclear Information System (INIS)

    Sjoeberg, J.

    2001-01-01

    The purpose of my presentation is to provide some thoughts on Open Sources and how Open Sources can be used as tools for detecting illicit trafficking and proliferation. To fulfill this purpose I would like to deal with the following points during my presentation: What is Open Source? How can it be defined? - Different sources - Methods. Open Source information can be defined as publicly available information as well as other unclassified information that has limited public distribution or access. It comes in print, electronic or oral form. It can be found distributed either to the mass public by print or electronic media or to a much more limited customer base like companies, experts or specialists of some kind, including the so-called gray literature. Open Source information is not a single source but a multi-source. Thus, you can say that 'Open Source' does not say anything about the information itself; it only refers to whether the information is classified or not.

  8. A framework for integration of scientific applications into the OpenTopography workflow

    Science.gov (United States)

    Nandigam, V.; Crosby, C.; Baru, C.

    2012-12-01

    virtually extending the OpenTopography service over the various infrastructures running these scientific applications and processing routines. This involves packaging and distributing a customized instance of the Opal toolkit that will wrap the software application as an OPAL-based web service and integrate it into the OpenTopography framework. We plan to make this as automated as possible. A structured specification of service inputs and outputs along with metadata annotations encoded in XML can be utilized to automate the generation of user interfaces, with appropriate tools tips and user help features, and generation of other internal software. The OpenTopography Opal toolkit will also include the customizations that will enable security authentication, authorization and the ability to write application usage and job statistics back to the OpenTopography databases. This usage information could then be reported to the original service providers and used for auditing and performance improvements. This pluggable framework will enable the application developers to continue to work on enhancing their application while making the latest iteration available in a timely manner to the earth sciences community. This will also help us establish an overall framework that other scientific application providers will also be able to use going forward.

  9. The Limitations of Access Alone: Moving Towards Open Processes in Education Technology

    Science.gov (United States)

    Knox, Jeremy

    2013-01-01

    "Openness" has emerged as one of the foremost themes in education, within which an open education movement has enthusiastically embraced digital technologies as the central means of participation and inclusion. Open Educational Resources (OERs) and Massive Open Online Courses (MOOCs) have surfaced at the forefront of this development,…

  10. Kaplan-Meier estimators of distance distributions for spatial point processes

    NARCIS (Netherlands)

    Baddeley, A.J.; Gill, R.D.

    1997-01-01

    When a spatial point process is observed through a bounded window, edge effects hamper the estimation of characteristics such as the empty space function $F$, the nearest neighbour distance distribution $G$, and the reduced second order moment function $K$. Here we propose and study product-limit
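
    A sketch of the idea behind this record: treat each nearest-neighbour distance as right-censored by the point's distance to the window boundary, then apply a Kaplan-Meier (product-limit) estimator to obtain the distance distribution G. The unit-square window and the simulated point pattern below are illustrative only, not the estimators studied in the paper.

```python
# Kaplan-Meier-style estimate of the nearest-neighbour distance distribution G
# for a point pattern in the unit square, with border censoring.
import numpy as np

def km_nearest_neighbour_G(points: np.ndarray, r_grid: np.ndarray) -> np.ndarray:
    diffs = points[:, None, :] - points[None, :, :]
    dist = np.sqrt((diffs ** 2).sum(-1))
    np.fill_diagonal(dist, np.inf)
    nn = dist.min(axis=1)                                   # nearest-neighbour distances
    border = np.minimum(points, 1.0 - points).min(axis=1)   # distance to window boundary
    obs_time = np.minimum(nn, border)                       # possibly censored observation
    observed = nn <= border                                 # True if uncensored
    G = np.empty_like(r_grid)
    for i, r in enumerate(r_grid):
        surv = 1.0
        for t in np.unique(obs_time[observed & (obs_time <= r)]):
            at_risk = np.sum(obs_time >= t)
            deaths = np.sum(observed & (obs_time == t))
            surv *= 1.0 - deaths / at_risk
        G[i] = 1.0 - surv
    return G

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    pts = rng.uniform(size=(300, 2))                        # stand-in Poisson-like pattern
    r = np.linspace(0, 0.1, 6)
    print(np.round(km_nearest_neighbour_G(pts, r), 3))
```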

  11. Computer simulation of plasma turbulence in open systems

    International Nuclear Information System (INIS)

    Sigov, Yu.S.

    1982-01-01

    A short review is given of the results of kinetic simulation of collective phenomena in open plasma systems with variable total energy and number of particles, i.e., with particle and energy fluxes across boundary surfaces and/or internal sources and channels. Three specific problems are considered in varying detail for such systems in one-dimensional geometry: the generation and evolution of double layers in a current-unstable plasma; the collisionless relaxation of strongly non-equilibrium electron distributions; and Langmuir collapse and strong electrostatic turbulence in systems with parametric excitation of the plasma by an external pumping wave and with cooling of the fast non-Maxwellian electrons. In all these cases the nonlinearity and collective character of the processes give rise to new dissipative plasma structures that substantially broaden our understanding of the nature of plasma turbulence in inhomogeneous open systems. (Auth.)

  12. DCODE: A Distributed Column-Oriented Database Engine for Big Data Analytics

    OpenAIRE

    Liu, Yanchen; Cao, Fang; Mortazavi, Masood; Chen, Mengmeng; Yan, Ning; Ku, Chi; Adnaik, Aniket; Morgan, Stephen; Shi, Guangyu; Wang, Yuhu; Fang, Fan

    2015-01-01

    Part 10: Big Data and Text Mining; International audience; We propose a novel Distributed Column-Oriented Database Engine (DCODE) for efficient analytic query processing that combines advantages of both column storage and parallel processing. In DCODE, we enhance an existing open-source columnar database engine by adding the capability for handling queries over a cluster. Specifically, we studied parallel query execution and optimization techniques such as horizontal partitioning, exchange op...

  13. OpenTURNS, an open source uncertainty engineering software

    International Nuclear Information System (INIS)

    Popelin, A.L.; Dufoy, A.

    2013-01-01

    The need to assess robust performance for complex systems has led to the emergence of a new industrial simulation challenge: taking uncertainties into account when dealing with complex numerical simulation frameworks. EDF has taken part in the development of an open source software platform dedicated to uncertainty propagation by probabilistic methods, named OpenTURNS, for Open source Treatment of Uncertainty, Risk and Statistics. OpenTURNS includes a large variety of qualified algorithms for managing uncertainties in industrial studies, from the uncertainty quantification step (with possibilities to model stochastic dependence thanks to copula theory and stochastic processes), to the uncertainty propagation step (with innovative simulation algorithms such as the ziggurat method for normal variables) and the sensitivity analysis step (with sensitivity indices based on the evaluation of means conditioned on the realization of a particular event). It also enables the building of response surfaces that can include the stochastic modeling (with the polynomial chaos method, for example). Generic wrappers to link OpenTURNS to the modeling software are proposed. Finally, OpenTURNS is extensively documented to provide guidance for its use and for contributions.
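
    A generic Monte Carlo uncertainty-propagation sketch of the kind of study OpenTURNS automates (quantification, propagation, and a rough sensitivity indication); plain NumPy is used here rather than the OpenTURNS API, and the model and distributions are invented.

```python
# Generic uncertainty-propagation sketch (not the OpenTURNS API): sample input
# distributions, push them through a model, summarize the output.
import numpy as np

rng = np.random.default_rng(5)

def model(x1, x2):
    """Hypothetical deterministic simulator wrapped for the study."""
    return x1 ** 2 + 3.0 * x2

N = 100_000
x1 = rng.normal(1.0, 0.2, N)          # quantification step: assign a
x2 = rng.uniform(0.0, 2.0, N)         # distribution to each uncertain input
y = model(x1, x2)                     # propagation step

print("mean:", y.mean(), " std:", y.std())
print("P(Y > 6):", (y > 6.0).mean())  # probability of exceeding a threshold

# crude correlation-based sensitivity indication (not a Sobol index)
for name, x in [("x1", x1), ("x2", x2)]:
    print("corr(", name, ", Y) =", round(np.corrcoef(x, y)[0, 1], 3))
```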

  14. "BPELanon": Protect Business Processes on the Cloud

    OpenAIRE

    Marigianna Skouradaki; Dieter Roller; Frank Leymann; Vincenzo Ferme; Cesare Pautasso

    2015-01-01

    The advent of Cloud computing supports the offering of many Business Process Management applications on a distributed, per-use basis through its infrastructure. Because privacy is still an open issue in the Cloud, many companies are reluctant to move their Business Processes to a public Cloud. Since the Cloud environment can be beneficial for Business Processes, privacy issues need to be examined further. In order to enforce the Business Proces...

  15. A novel bio-safe phase separation process for preparing open-pore biodegradable polycaprolactone microparticles.

    Science.gov (United States)

    Salerno, Aurelio; Domingo, Concepción

    2014-09-01

    Open-pore biodegradable microparticles are objects of considerable interest for biomedical applications, particularly as cell and drug delivery carriers in tissue engineering and health care treatments. Furthermore, engineering microparticles with a well-defined size distribution and pore architecture by bio-safe fabrication routes is crucial to avoid the use of toxic compounds potentially harmful to cells and biological tissues. To address this issue, the present study develops a straightforward and bio-safe approach for fabricating porous biodegradable microparticles with controlled morphological and structural features down to the nanometer scale. In particular, ethyl lactate is used as a non-toxic solvent for the fabrication of polycaprolactone particles via a thermally induced phase separation technique. The approach yields open-pore particles with a mean particle size in the 150-250 μm range and a specific surface area of 3.5-7.9 m(2)/g. Finally, the combination of thermally induced phase separation and porogen leaching techniques is employed for the first time to obtain multi-scaled porous microparticles with large external and internal pore sizes and potentially improved characteristics for cell culture and tissue engineering. Samples were characterized to assess their thermal properties, morphology, crystalline structure, and textural properties. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Design of distributed systems of hydrolithospere processes management. Selection of optimal number of extracting wells

    Science.gov (United States)

    Pershin, I. M.; Pervukhin, D. A.; Ilyushin, Y. V.; Afanaseva, O. V.

    2017-10-01

    The article considers the important issue of designing distributed systems for the management of hydrolithosphere processes. Control actions on the hydrolithosphere processes are implemented by a set of extraction wells. The article shows how to determine the optimal number of extraction wells that provide a distributed control action on the managed object.

  17. Gaseous material capacity of open plasma jet in plasma spray-physical vapor deposition process

    Science.gov (United States)

    Liu, Mei-Jun; Zhang, Meng; Zhang, Qiang; Yang, Guan-Jun; Li, Cheng-Xin; Li, Chang-Jiu

    2018-01-01

    The plasma spray-physical vapor deposition (PS-PVD) process, emerging as a highly efficient hybrid approach, is based on two powerful technologies, plasma spraying and physical vapor deposition. The maximum production rate is apparently affected by the material feed rate, but it is actually and essentially determined by the vapor capacity of the transporting plasma. In order to realize a high production rate, the gaseous material capacity of the plasma jet must be fundamentally understood. In this study, the thermal characteristics of the plasma were measured by optical emission spectrometry. The results show that the open plasma jet is in local thermal equilibrium, with a typical electron number density from 2.1 × 10^15 to 3.1 × 10^15 cm^-3. In this condition, the temperature of gaseous zirconia can be taken as equal to the plasma temperature. A model was developed to obtain the vapor pressure of gaseous ZrO2 molecules as a two-dimensional map of jet axial and radial position corresponding to different average plasma temperatures. The overall gaseous material capacity of the open plasma jet, taking zirconia as an example, was further established. This approach to evaluating the material capacity of the plasma jet sheds light on process optimization towards both depositing columnar coatings and achieving a high production rate in PS-PVD.

  18. Extracting message inter-departure time distributions from the human electroencephalogram.

    Directory of Open Access Journals (Sweden)

    Bratislav Mišić

    2011-06-01

    Full Text Available The complex connectivity of the cerebral cortex is a topic of much study, yet the link between structure and function is still unclear. The processing capacity and throughput of information at individual brain regions remains an open question and one that could potentially bridge these two aspects of neural organization. The rate at which information is emitted from different nodes in the network and how this output process changes under different external conditions are general questions that are not unique to neuroscience, but are of interest in multiple classes of telecommunication networks. In the present study we show how some of these questions may be addressed using tools from telecommunications research. An important system statistic for modeling and performance evaluation of distributed communication systems is the time between successive departures of units of information at each node in the network. We describe a method to extract and fully characterize the distribution of such inter-departure times from the resting-state electroencephalogram (EEG. We show that inter-departure times are well fitted by the two-parameter Gamma distribution. Moreover, they are not spatially or neurophysiologically trivial and instead are regionally specific and sensitive to the presence of sensory input. In both the eyes-closed and eyes-open conditions, inter-departure time distributions were more dispersed over posterior parietal channels, close to regions which are known to have the most dense structural connectivity. The biggest differences between the two conditions were observed at occipital sites, where inter-departure times were significantly more variable in the eyes-open condition. Together, these results suggest that message departure times are indicative of network traffic and capture a novel facet of neural activity.
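
    A sketch of the distribution-fitting step described above: fitting a two-parameter Gamma distribution to a set of inter-departure times. The intervals below are simulated stand-ins, since the actual EEG-derived intervals are not available here.

```python
# Fit a two-parameter Gamma distribution to inter-departure times and run a
# simple goodness-of-fit check. The data are simulated placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
intervals = rng.gamma(shape=2.5, scale=0.04, size=5000)   # stand-in data, seconds

# loc fixed at 0 so only shape and scale are estimated (two-parameter Gamma)
shape, loc, scale = stats.gamma.fit(intervals, floc=0.0)
print(f"shape = {shape:.2f}, scale = {scale:.4f} s")

ks = stats.kstest(intervals, "gamma", args=(shape, loc, scale))
print("KS statistic:", round(ks.statistic, 4))
```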

  19. An informatics model for guiding assembly of telemicrobiology workstations for malaria collaborative diagnostics using commodity products and open-source software

    Directory of Open Access Journals (Sweden)

    Crandall Ian

    2009-07-01

    Full Text Available Abstract Background Deficits in clinical microbiology infrastructure exacerbate global infectious disease burdens. This paper examines how commodity computation, communication, and measurement products combined with open-source analysis and communication applications can be incorporated into laboratory medicine microbiology protocols. Those commodity components are all now sourceable globally. An informatics model is presented for guiding the use of low-cost commodity components and free software in the assembly of clinically useful and usable telemicrobiology workstations. Methods The model incorporates two general principles: 1) collaborative diagnostics, where free and open communication and networking applications are used to link distributed collaborators for reciprocal assistance in organizing and interpreting digital diagnostic data; and 2) commodity engineering, which leverages globally available consumer electronics and open-source informatics applications, to build generic open systems that measure needed information in ways substantially equivalent to more complex proprietary systems. Routine microscopic examination of Giemsa and fluorescently stained blood smears for diagnosing malaria is used as an example to validate the model. Results The model is used as a constraint-based guide for the design, assembly, and testing of a functioning, open, and commoditized telemicroscopy system that supports distributed acquisition, exploration, analysis, interpretation, and reporting of digital microscopy images of stained malarial blood smears while also supporting remote diagnostic tracking, quality assessment and diagnostic process development. Conclusion The open telemicroscopy workstation design and use-process described here can address clinical microbiology infrastructure deficits in an economically sound and sustainable manner. It can boost capacity to deal with comprehensive measurement of disease and care outcomes in individuals and

  20. An informatics model for guiding assembly of telemicrobiology workstations for malaria collaborative diagnostics using commodity products and open-source software.

    Science.gov (United States)

    Suhanic, West; Crandall, Ian; Pennefather, Peter

    2009-07-17

    Deficits in clinical microbiology infrastructure exacerbate global infectious disease burdens. This paper examines how commodity computation, communication, and measurement products combined with open-source analysis and communication applications can be incorporated into laboratory medicine microbiology protocols. Those commodity components are all now sourceable globally. An informatics model is presented for guiding the use of low-cost commodity components and free software in the assembly of clinically useful and usable telemicrobiology workstations. The model incorporates two general principles: 1) collaborative diagnostics, where free and open communication and networking applications are used to link distributed collaborators for reciprocal assistance in organizing and interpreting digital diagnostic data; and 2) commodity engineering, which leverages globally available consumer electronics and open-source informatics applications, to build generic open systems that measure needed information in ways substantially equivalent to more complex proprietary systems. Routine microscopic examination of Giemsa and fluorescently stained blood smears for diagnosing malaria is used as an example to validate the model. The model is used as a constraint-based guide for the design, assembly, and testing of a functioning, open, and commoditized telemicroscopy system that supports distributed acquisition, exploration, analysis, interpretation, and reporting of digital microscopy images of stained malarial blood smears while also supporting remote diagnostic tracking, quality assessment and diagnostic process development. The open telemicroscopy workstation design and use-process described here can address clinical microbiology infrastructure deficits in an economically sound and sustainable manner. It can boost capacity to deal with comprehensive measurement of disease and care outcomes in individuals and groups in a distributed and collaborative fashion. The workstation

  1. Embracing Open Software Development in Solar Physics

    Science.gov (United States)

    Hughitt, V. K.; Ireland, J.; Christe, S.; Mueller, D.

    2012-12-01

    We discuss two ongoing software projects in solar physics that have adopted best practices of the open source software community. The first, the Helioviewer Project, is a powerful data visualization tool which includes online and Java interfaces inspired by Google Maps (tm). This effort allows users to find solar features and events of interest, and download the corresponding data. Having found data of interest, the user now has to analyze it. The dominant solar data analysis platform is an open-source library called SolarSoft (SSW). Although SSW itself is open-source, the programming language used is IDL, a proprietary language with licensing costs that are prohibitive for many institutions and individuals. SSW is composed of a collection of related scripts written by missions and individuals for solar data processing and analysis, without any consistent data structures or common interfaces. Further, at the time when SSW was initially developed, many of the best software development processes of today (mirrored and distributed version control, unit testing, continuous integration, etc.) were not standard, and have not since been adopted. The challenges inherent in developing SolarSoft led to a second software project known as SunPy. SunPy is an open-source Python-based library which seeks to create a unified solar data analysis environment including a number of core datatypes such as Maps, Lightcurves, and Spectra which have consistent interfaces and behaviors. By taking advantage of the large and sophisticated body of scientific software already available in Python (e.g. SciPy, NumPy, Matplotlib), and by adopting many of the best practices refined in open-source software development, SunPy has been able to develop at a very rapid pace while still ensuring a high level of reliability. The Helioviewer Project and SunPy represent two pioneering technologies in solar physics - simple yet flexible data visualization and a powerful, new data analysis environment. We

  2. IAU astroEDU: an open-access platform for peer-reviewed astronomy education activities

    Science.gov (United States)

    Heenatigala, Thilina; Russo, Pedro; Strubbe, Linda; Gomez, Edward

    2015-08-01

    astroEDU is an open access platform for peer-reviewed astronomy education activities. It addresses key problems of educational repositories, such as variable quality, lack of regular maintenance and updates, limited content review, and more. This is achieved through a peer-review process similar to that on which scholarly articles are based. Submitted activities are peer-reviewed by an educator and a professional astronomer, which gives the activities credibility. astroEDU activities are open-access in order to make them accessible to educators around the world while letting them discover, review, distribute and remix the activities. The activity submission process allows authors to learn how to apply enquiry-based learning in the activity, identify the process skills required, develop core goals and objectives, and evaluate the activity to determine the outcome. astroEDU is endorsed by the International Astronomical Union, meaning each activity is given an official stamp by the international organisation for professional astronomers.

  3. Microfracture spacing distributions and the evolution of fracture patterns in sandstones

    Science.gov (United States)

    Hooker, J. N.; Laubach, S. E.; Marrett, R.

    2018-03-01

    Natural fracture patterns in sandstone were sampled using scanning electron microscope-based cathodoluminescence (SEM-CL) imaging. All fractures are opening-mode and are fully or partially sealed by quartz cement. Most sampled fractures are too small to be height-restricted by sedimentary layers. At very low strains ( 100) datasets show spacings that are best fit by log-normal size distributions, compared to exponential, power law, or normal distributions. The clustering of fractures suggests that the locations of natural fractures are not determined by a random process. To investigate natural fracture localization, we reconstructed the opening history of a cluster of fractures within the Huizachal Group in northeastern Mexico, using fluid inclusions from synkinematic cements and thermal-history constraints. The largest fracture, which is the only fracture in the cluster visible to the naked eye, among 101 present, opened relatively late in the sequence. This result suggests that the growth of sets of fractures is a self-organized process, in which small, initially isolated fractures grow and progressively interact, with preferential growth of a subset of fractures developing at the expense of growth of the rest. Size-dependent sealing of fractures within sets suggests that synkinematic cementation may contribute to fracture clustering.
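
    An illustrative comparison of candidate spacing distributions of the kind described above: fit log-normal and exponential models to a set of fracture spacings and compare their log-likelihoods. The spacing data below are simulated placeholders, not the SEM-CL measurements.

```python
# Compare log-normal and exponential fits to fracture spacings by log-likelihood.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
spacings = rng.lognormal(mean=-2.0, sigma=0.8, size=200)   # stand-in data, metres

fits = {
    "log-normal":  (stats.lognorm, stats.lognorm.fit(spacings, floc=0.0)),
    "exponential": (stats.expon,   stats.expon.fit(spacings, floc=0.0)),
}
for name, (dist, params) in fits.items():
    loglik = np.sum(dist.logpdf(spacings, *params))
    print(f"{name:12s} log-likelihood = {loglik:.1f}")
```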

  4. Comparison of Environment Impact between Conventional and Cold Chain Management System in Paprika Distribution Process

    Directory of Open Access Journals (Sweden)

    Eidelweijs A Putri

    2012-09-01

    Full Text Available Pasir Langu village in Cisarua, West Java, is the largest central production area of paprika in Indonesia. On average, for every 200 kilograms of paprika produced, there is a rejection amounting to 3 kilograms. This results in monetary losses for wholesalers and in waste; in one year, this amount can reach approximately 11.7 million Indonesian rupiahs. Paprika wholesalers in Pasir Langu village are currently developing a cold chain management system to maintain the quality of paprika so that the number of rejections can be reduced. The objective of this study is to compare the environmental impacts of the conventional and cold chain management systems in the paprika distribution process using the Life Cycle Assessment (LCA) methodology and to propose a Photovoltaic (PV) system for the paprika distribution process. The result implies that the cold chain system produces more CO2 emissions than the conventional system. However, with the promotion of the PV system, the emissions would be reduced. For future research, it is necessary to reduce CO2 emissions from the transportation process, since this process is the biggest contributor of CO2 emissions in the whole distribution process. Keywords: LCA, environmentally friendly distribution, paprika, cold chain, PV system

  5. WASS: an open-source stereo processing pipeline for sea waves 3D reconstruction

    Science.gov (United States)

    Bergamasco, Filippo; Benetazzo, Alvise; Torsello, Andrea; Barbariol, Francesco; Carniel, Sandro; Sclavo, Mauro

    2017-04-01

    Stereo 3D reconstruction of ocean waves is gaining more and more popularity in the oceanographic community. In fact, recent advances in both computer vision algorithms and CPU processing power now allow the study of spatio-temporal wave fields with unprecedented accuracy, especially at small scales. Even if simple in theory, many details are difficult for a practitioner to master, so that the implementation of a 3D reconstruction pipeline is in general considered a complex task. For instance, camera calibration, reliable stereo feature matching and mean sea-plane estimation are all factors for which a well-designed implementation can make the difference in obtaining valuable results. For this reason, we believe that the open availability of a well-tested software package that automates the steps from stereo images to a 3D point cloud would be a valuable addition for future research in this area. We present WASS, a completely Open-Source stereo processing pipeline for sea waves 3D reconstruction, available at http://www.dais.unive.it/wass/. Our tool completely automates the recovery of dense point clouds from stereo images by providing three main functionalities. First, WASS can automatically recover the extrinsic parameters of the stereo rig (up to scale) so that no delicate calibration has to be performed in the field. Second, WASS implements a fast 3D dense stereo reconstruction procedure so that an accurate 3D point cloud can be computed from each stereo pair. We rely on the well-consolidated OpenCV library both for the image stereo rectification and disparity map recovery. Lastly, a set of 2D and 3D filtering techniques, both on the disparity map and the produced point cloud, are implemented to remove the vast majority of erroneous points that can naturally arise while analyzing the optically complex nature of the water surface (examples are sun-glares, large white-capped areas, fog and water aerosol, etc). Developed to be as fast as possible, WASS

  6. Predatory Open Access in Rehabilitation.

    Science.gov (United States)

    Manca, Andrea; Martinez, Gianluca; Cugusi, Lucia; Dragone, Daniele; Mercuro, Giuseppe; Deriu, Franca

    2017-05-01

    Increasingly scholars and researchers are being solicited by predatory open access journals seeking manuscript submissions and abusing the author-pays model by charging authors with publishing fees without any or proper peer review. Such questionable editorial practices are threatening the reputation and credibility of scholarly publishing. To date, no investigation has been conducted on this phenomenon in the field of rehabilitation. This study attempts to identify specific predatory journals operating in this field to quantify the phenomenon and its geographic distribution. Beall's List has been used to this end which, although not perfect, is a comprehensive and up-to-date report of predatory publishers. Of the 1113 publishers on the list, 59 journals were identified, for a total of 5610 published articles. The median number of articles published by each journal was 21, and the median amount of article processing charges was $499. Only 1 out of 59 journals was included in the Directory of Open Access Journals, whereas 7 (12%) were indexed by PubMed. Most of the publishers were based in India (36%) followed by the United States (25%) and Pakistan (5%), and 25% were without a verifiable address. The data indicate that the threat of predatory publishing in rehabilitation is real. Physiatrists, physiotherapists, researchers, and academics operating in this field are advised to use the tools available to recognize predatory practices before considering publishing in open access journals. Copyright © 2017 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  7. Biosecurity and Open-Source Biology: The Promise and Peril of Distributed Synthetic Biological Technologies.

    Science.gov (United States)

    Evans, Nicholas G; Selgelid, Michael J

    2015-08-01

    In this article, we raise ethical concerns about the potential misuse of open-source biology (OSB): biological research and development that progresses through an organisational model of radical openness, deskilling, and innovation. We compare this organisational structure to that of the open-source software model, and detail salient ethical implications of this model. We demonstrate that OSB, in virtue of its commitment to openness, may be resistant to governance attempts.

  8. Distributed Open and Distance Learning: How Does E-Learning Fit? LSDA Reports.

    Science.gov (United States)

    Fletcher, Mick

    The distinctions between types of open and distance learning broadly equate to the concept of learning at a time, place, and pace that best suits the learner. Distance learning refers to geography, whereas open learning refers to time. Flexible learning is a generic term referring either to geography or time. Combining these distinctions allows…

  9. Power Distribution Analysis For Electrical Usage In Province Area Using Olap (Online Analytical Processing

    Directory of Open Access Journals (Sweden)

    Samsinar Riza

    2018-01-01

    Full Text Available The distribution network is the part of the power grid closest to the customers of electric service providers such as PT. PLN. The dispatching center of a power grid company is also the data center of the power grid, where a great amount of operating information is gathered. The valuable information contained in these data means a lot for power grid operation management. Data warehousing and online analytical processing (OLAP) techniques have been used to manage and analyze this large volume of data. The specific methods for online analytical information systems resulting from data warehouse processing with OLAP are chart and query reporting. The chart reporting consists of the load distribution chart over repeated time periods, the load distribution chart by area, the substation region chart and the electric load usage chart. The results of the OLAP process show the development of the electric load distribution, provide an analysis of electric power consumption loads, and offer an alternative way of presenting information related to peak load.
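
    A small OLAP-style roll-up of the kind described above, using a pandas pivot table as a stand-in for the data-warehouse cube; the load records, area and substation names are entirely invented.

```python
# OLAP-style roll-up and drill-down on toy distribution-network load records.
import pandas as pd

records = pd.DataFrame({
    "area":       ["North", "North", "South", "South", "North", "South"],
    "substation": ["N-1",   "N-2",   "S-1",   "S-1",   "N-1",   "S-2"],
    "hour":       [18,      18,      18,      19,      19,      19],
    "load_mw":    [12.4,    8.1,     9.7,     10.2,    13.0,    7.6],
})

# "chart report" style roll-up: load by area over repeated time slots
by_area_hour = records.pivot_table(values="load_mw", index="area",
                                   columns="hour", aggfunc="sum")
print(by_area_hour)

# "query report" style drill-down: peak substation load per area
print(records.groupby(["area", "substation"])["load_mw"].max())
```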

  10. Power Distribution Analysis For Electrical Usage In Province Area Using Olap (Online Analytical Processing)

    Science.gov (United States)

    Samsinar, Riza; Suseno, Jatmiko Endro; Widodo, Catur Edi

    2018-02-01

    The distribution network is the part of the power grid closest to the customers of electric service providers such as PT. PLN. The dispatching center of a power grid company is also the data center of the power grid, where a great amount of operating information is gathered. The valuable information contained in these data means a lot for power grid operation management. Data warehousing and online analytical processing (OLAP) techniques have been used to manage and analyze this large volume of data. The specific methods for online analytical information systems resulting from data warehouse processing with OLAP are chart and query reporting. The chart reporting consists of the load distribution chart over repeated time periods, the load distribution chart by area, the substation region chart and the electric load usage chart. The results of the OLAP process show the development of the electric load distribution, provide an analysis of electric power consumption loads, and offer an alternative way of presenting information related to peak load.

  11. Bayesian inference for multivariate point processes observed at sparsely distributed times

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl; Møller, Jesper; Aukema, B.H.

    We consider statistical and computational aspects of simulation-based Bayesian inference for a multivariate point process which is only observed at sparsely distributed times. For specificity we consider a particular data set which has earlier been analyzed by a discrete time model involving unknown...... normalizing constants. We discuss the advantages and disadvantages of using continuous time processes compared to discrete time processes in the setting of the present paper as well as other spatial-temporal situations. Keywords: Bark beetle, conditional intensity, forest entomology, Markov chain Monte Carlo...

  12. Probabilistic Open Set Recognition

    Science.gov (United States)

    Jain, Lalit Prithviraj

    support vector machines. Building from the success of statistical EVT based recognition methods such as PI-SVM and W-SVM on the open set problem, we present a new general supervised learning algorithm for multi-class classification and multi-class open set recognition called the Extreme Value Local Basis (EVLB). The design of this algorithm is motivated by the observation that extrema from known negative class distributions are the closest negative points to any positive sample during training, and thus should be used to define the parameters of a probabilistic decision model. In the EVLB, the kernel distribution for each positive training sample is estimated via an EVT distribution fit over the distances to the separating hyperplane between positive training sample and closest negative samples, with a subset of the overall positive training data retained to form a probabilistic decision boundary. Using this subset as a frame of reference, the probability of a sample at test time decreases as it moves away from the positive class. Possessing this property, the EVLB is well-suited to open set recognition problems where samples from unknown or novel classes are encountered at test. Our experimental evaluation shows that the EVLB provides a substantial improvement in scalability compared to standard radial basis function kernel machines, as well as PI-SVM and W-SVM, with improved accuracy in many cases. We evaluate our algorithm on open set variations of the standard visual learning benchmarks, as well as with an open subset of classes from Caltech 256 and ImageNet. Our experiments show that PI-SVM, W-SVM and EVLB provide significant advances over the previous state-of-the-art solutions for the same tasks.
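
    The EVT fit at the heart of the EVLB can be pictured with a minimal sketch along the following lines: it fits an extreme value distribution to the distances from one positive sample to its nearest negatives and turns it into a falling inclusion probability. The numbers and the choice of scipy's genextreme are illustrative assumptions, not the authors' implementation.

        import numpy as np
        from scipy.stats import genextreme

        # Hypothetical distances from one positive training sample to its closest
        # negative samples (e.g. distances to the separating hyperplane).
        tail_distances = np.array([0.42, 0.47, 0.51, 0.55, 0.60, 0.66, 0.71, 0.80])

        # Fit a generalized extreme value distribution to this tail of distances.
        shape, loc, scale = genextreme.fit(tail_distances)

        def inclusion_probability(d):
            # Survival function: the further a test point lies from the positive
            # sample, the lower its probability of belonging to the positive class.
            return genextreme.sf(d, shape, loc=loc, scale=scale)

        print(inclusion_probability(0.3), inclusion_probability(1.5))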

  13. Integrable Floquet dynamics, generalized exclusion processes and "fused" matrix ansatz

    Science.gov (United States)

    Vanicat, Matthieu

    2018-04-01

    We present a general method for constructing integrable stochastic processes, with two-step discrete time Floquet dynamics, from the transfer matrix formalism. The models can be interpreted as a discrete time parallel update. The method can be applied for both periodic and open boundary conditions. We also show how the stationary distribution can be built as a matrix product state. As an illustration we construct parallel discrete time dynamics associated with the R-matrix of the SSEP and of the ASEP, and provide the associated stationary distributions in a matrix product form. We use this general framework to introduce new integrable generalized exclusion processes, where a fixed number of particles is allowed on each lattice site in opposition to the (single particle) exclusion process models. They are constructed using the fusion procedure of R-matrices (and K-matrices for open boundary conditions) for the SSEP and ASEP. We develop a new method, that we named "fused" matrix ansatz, to build explicitly the stationary distribution in a matrix product form. We use this algebraic structure to compute physical observables such as the correlation functions and the mean particle current.
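
    For readers unfamiliar with matrix product stationary states, the classical (sequential-update) open TASEP offers the simplest illustration; the fused discrete-time ansatz constructed in the paper generalises this structure. The lines below follow the standard Derrida-Evans-Hakim-Pasquier algebra rather than the paper's notation:

        P(\tau_1,\dots,\tau_L) = \frac{1}{Z_L}\,\langle W|\prod_{i=1}^{L}\bigl[\tau_i D + (1-\tau_i)\,E\bigr]|V\rangle ,
        \qquad DE = D + E , \quad D|V\rangle = \frac{1}{\beta}\,|V\rangle , \quad \langle W|E = \frac{1}{\alpha}\,\langle W| ,
        \qquad Z_L = \langle W|(D+E)^{L}|V\rangle ,

    where τ_i ∈ {0,1} is the occupation of site i and α, β are the injection and extraction rates at the open boundaries.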

  14. OpenSR: An Open-Source Stimulus-Response Testing Framework

    Directory of Open Access Journals (Sweden)

    Carolyn C. Matheus

    2015-01-01

    Full Text Available Stimulus–response (S–R) tests provide a unique way to acquire information about human perception by capturing automatic responses to stimuli and attentional processes. This paper presents OpenSR, a user-centered S–R testing framework providing a graphical user interface that can be used by researchers to customize, administer, and manage one type of S–R test, the implicit association test. OpenSR provides an extensible open-source Web-based framework that is platform independent and can be implemented on most computers using any operating system. In addition, it provides capabilities for automatically generating and assigning participant identifications, assigning participants to different condition groups, tracking responses, and facilitating the collection and export of data. The Web technologies and languages used in creating the OpenSR framework are discussed, namely, HTML5, CSS3, JavaScript, jQuery, Twitter Bootstrap, Python, and Django. OpenSR is available for free download.

  15. The constitutive distributed parameter model of multicomponent chemical processes in gas, fluid and solid phase

    International Nuclear Information System (INIS)

    Niemiec, W.

    1985-01-01

    In the literature on distributed parameter modelling of real processes, the class of multicomponent chemical processes in gas, fluid and solid phase has not been considered. The aim of this paper is a constitutive distributed parameter physicochemical model, constructed from the kinetics and a phenomenological analysis of multicomponent chemical processes in gas, fluid and solid phase. The mass, energy and momentum aspects of these multicomponent chemical reactions and the associated phenomena are utilized in the balance operations, under the conditions of constitutive invariance for continuous media with space and time memories, the reciprocity principle for isotropic and anisotropic nonhomogeneous media with space and time memories, and the application of the definitions of the following derivative and the equation of continuity, to construct systems of partial differential constitutive state equations in following-derivative form for the gas, fluid and solid phases. Formulated in this way, the complete physicochemical conditions of multicomponent chemical processes in gas, fluid and solid phase are a new form of constitutive distributed parameter model for automatic control, and its systems of equations are a new form of systems of partial differential constitutive state equations in the sense of phenomenological distributed parameter control.

  16. Reference levels for assays in opened installations of industrial radiography

    International Nuclear Information System (INIS)

    Leocadio, Joao C.; Tauhata, Luiz; Crispim, Verginia R.

    2001-01-01

    The objectives of this work were to analyse open industrial X-ray facilities, to obtain the distribution of doses to the operators, and to present a proposal for reference levels. The results of the monitoring revealed an improvement of the radiation protection conditions in the facilities and showed that the risk of potential exposure was reduced. The advantage of the proposed reference levels is that supervisors would increase the frequency of audits in open facilities in order to carry out investigations and interventions. The open facilities with 'bunkers' presented distributions with 95% of the doses below 0.2 mSv, and the distributions of the facilities with cordoned areas had 75% of the doses below 0.4 mSv. (author)

  17. Open Source Service Agent (OSSA) in the intelligence community's Open Source Architecture

    Science.gov (United States)

    Fiene, Bruce F.

    1994-01-01

    The Community Open Source Program Office (COSPO) has developed an architecture for the intelligence community's new Open Source Information System (OSIS). The architecture is a multi-phased program featuring connectivity, interoperability, and functionality. OSIS is based on a distributed architecture concept. The system is designed to function as a virtual entity. OSIS will be a restricted (non-public), user configured network employing Internet communications. Privacy and authentication will be provided through firewall protection. Connection to OSIS can be made through any server on the Internet or through dial-up modems provided the appropriate firewall authentication system is installed on the client.

  18. Modeling nanoparticle uptake and intracellular distribution using stochastic process algebras

    Energy Technology Data Exchange (ETDEWEB)

    Dobay, M. P. D., E-mail: maria.pamela.david@physik.uni-muenchen.de; Alberola, A. Piera; Mendoza, E. R.; Raedler, J. O., E-mail: joachim.raedler@physik.uni-muenchen.de [Ludwig-Maximilians University, Faculty of Physics, Center for NanoScience (Germany)

    2012-03-15

    Computational modeling is increasingly important to help understand the interaction and movement of nanoparticles (NPs) within living cells, and to come to terms with the wealth of data that microscopy imaging yields. A quantitative description of the spatio-temporal distribution of NPs inside cells, however, is challenging due to the complexity of multiple compartments such as endosomes and nuclei, which themselves are dynamic and can undergo fusion and fission and exchange their content. Here, we show that stochastic pi calculus, a widely-used process algebra, is well suited for mapping surface and intracellular NP interactions and distributions. In stochastic pi calculus, each NP is represented as a process, which can adopt various states such as bound or aggregated, as well as be passed between processes representing location, as a function of predefined stochastic channels. We created a pi calculus model of gold NP uptake and intracellular movement and compared the evolution of surface-bound, cytosolic, endosomal, and nuclear NP densities with electron microscopy data. We demonstrate that the computational approach can be extended to include specific molecular binding and potential interaction with signaling cascades as characteristic for NP-cell interactions in a wide range of applications such as nanotoxicity, viral infection, and drug delivery.

  19. Modeling nanoparticle uptake and intracellular distribution using stochastic process algebras

    International Nuclear Information System (INIS)

    Dobay, M. P. D.; Alberola, A. Piera; Mendoza, E. R.; Rädler, J. O.

    2012-01-01

    Computational modeling is increasingly important to help understand the interaction and movement of nanoparticles (NPs) within living cells, and to come to terms with the wealth of data that microscopy imaging yields. A quantitative description of the spatio-temporal distribution of NPs inside cells, however, is challenging due to the complexity of multiple compartments such as endosomes and nuclei, which themselves are dynamic and can undergo fusion and fission and exchange their content. Here, we show that stochastic pi calculus, a widely-used process algebra, is well suited for mapping surface and intracellular NP interactions and distributions. In stochastic pi calculus, each NP is represented as a process, which can adopt various states such as bound or aggregated, as well as be passed between processes representing location, as a function of predefined stochastic channels. We created a pi calculus model of gold NP uptake and intracellular movement and compared the evolution of surface-bound, cytosolic, endosomal, and nuclear NP densities with electron microscopy data. We demonstrate that the computational approach can be extended to include specific molecular binding and potential interaction with signaling cascades as characteristic for NP-cell interactions in a wide range of applications such as nanotoxicity, viral infection, and drug delivery.

  20. Modeling nanoparticle uptake and intracellular distribution using stochastic process algebras

    Science.gov (United States)

    Dobay, M. P. D.; Alberola, A. Piera; Mendoza, E. R.; Rädler, J. O.

    2012-03-01

    Computational modeling is increasingly important to help understand the interaction and movement of nanoparticles (NPs) within living cells, and to come to terms with the wealth of data that microscopy imaging yields. A quantitative description of the spatio-temporal distribution of NPs inside cells, however, is challenging due to the complexity of multiple compartments such as endosomes and nuclei, which themselves are dynamic and can undergo fusion and fission and exchange their content. Here, we show that stochastic pi calculus, a widely-used process algebra, is well suited for mapping surface and intracellular NP interactions and distributions. In stochastic pi calculus, each NP is represented as a process, which can adopt various states such as bound or aggregated, as well as be passed between processes representing location, as a function of predefined stochastic channels. We created a pi calculus model of gold NP uptake and intracellular movement and compared the evolution of surface-bound, cytosolic, endosomal, and nuclear NP densities with electron microscopy data. We demonstrate that the computational approach can be extended to include specific molecular binding and potential interaction with signaling cascades as characteristic for NP-cell interactions in a wide range of applications such as nanotoxicity, viral infection, and drug delivery.
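
    A rough feel for this modelling style can be conveyed by a continuous-time Markov sketch of a single nanoparticle hopping between compartments; the compartments and rate constants below are illustrative assumptions, and the actual model is written in stochastic pi calculus, not plain Python.

        import random

        # Hypothetical first-order rates (1/h) between nanoparticle compartments.
        rates = {("surface", "endosome"): 0.8,
                 ("endosome", "cytosol"): 0.3,
                 ("cytosol", "nucleus"): 0.05}

        def gillespie_trajectory(state="surface", t_end=24.0):
            """Simulate one NP as a continuous-time Markov chain over compartments."""
            t, path = 0.0, [(0.0, state)]
            while t < t_end:
                channels = [(dst, k) for (src, dst), k in rates.items() if src == state]
                if not channels:
                    break
                total = sum(k for _, k in channels)
                t += random.expovariate(total)      # waiting time to the next jump
                r, acc = random.uniform(0, total), 0.0
                for dst, k in channels:             # pick the jump proportionally to its rate
                    acc += k
                    if r <= acc:
                        state = dst
                        break
                path.append((t, state))
            return path

        print(gillespie_trajectory())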

  1. An educational tool for interactive parallel and distributed processing

    DEFF Research Database (Denmark)

    Pagliarini, Luigi; Lund, Henrik Hautop

    2012-01-01

    In this article we try to describe how the modular interactive tiles system (MITS) can be a valuable tool for introducing students to interactive parallel and distributed processing programming. This is done by providing a hands-on educational tool that allows a change in the representation...... of abstract problems related to designing interactive parallel and distributed systems. Indeed, the MITS seems to bring a series of goals into education, such as parallel programming, distributedness, communication protocols, master dependency, software behavioral models, adaptive interactivity, feedback......, connectivity, topology, island modeling, and user and multi-user interaction which can rarely be found in other tools. Finally, we introduce the system of modular interactive tiles as a tool for easy, fast, and flexible hands-on exploration of these issues, and through examples we show how to implement...

  2. Web Services Implementations at Land Process and Goddard Earth Sciences Distributed Active Archive Centers

    Science.gov (United States)

    Cole, M.; Bambacus, M.; Lynnes, C.; Sauer, B.; Falke, S.; Yang, W.

    2007-12-01

    NASA's vast array of scientific data within its Distributed Active Archive Centers (DAACs) is especially valuable to both traditional research scientists as well as the emerging market of Earth Science Information Partners. For example, the air quality science and management communities are increasingly using satellite derived observations in their analyses and decision making. The Air Quality Cluster in the Federation of Earth Science Information Partners (ESIP) uses web infrastructures of interoperability, or Service Oriented Architecture (SOA), to extend data exploration, use, and analysis and provides a user environment for DAAC products. In an effort to continually offer these NASA data to the broadest research community audience, and reusing emerging technologies, both NASA's Goddard Earth Science (GES) and Land Process (LP) DAACs have engaged in a web services pilot project. Through these projects both GES and LP have exposed data through the Open Geospatial Consortium's (OGC) Web Services standards. Reusing several different existing applications and implementation techniques, GES and LP successfully exposed a variety of data through distributed systems to be ingested into multiple end-user systems. The results of this project will enable researchers worldwide to access some of NASA's GES & LP DAAC data through OGC protocols. This functionality encourages inter-disciplinary research while increasing data use through advanced technologies. This paper will concentrate on the implementation and use of OGC Web Services, specifically Web Map and Web Coverage Services (WMS, WCS), at GES and LP DAACs, and the value of these services within scientific applications, including integration with the DataFed air quality web infrastructure and in the development of data analysis web applications.
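
    A client-side view of such a service can be sketched with OWSLib; the endpoint URL and layer name below are placeholders, since the actual DAAC WMS endpoints and layer identifiers are not given in the abstract.

        from owslib.wms import WebMapService

        # Placeholder endpoint; a real GES/LP DAAC WMS URL would go here.
        wms = WebMapService("https://example.nasa.gov/wms", version="1.1.1")

        img = wms.getmap(layers=["MOD11A1_LST_Day"],       # hypothetical layer name
                         srs="EPSG:4326",
                         bbox=(-125.0, 24.0, -66.0, 50.0),  # continental US
                         size=(800, 400),
                         format="image/png",
                         transparent=True)
        with open("lst_day.png", "wb") as f:
            f.write(img.read())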

  3. Bubble size distribution analysis and control in high frequency ultrasonic cleaning processes

    International Nuclear Information System (INIS)

    Hauptmann, M; Struyf, H; Mertens, P; Heyns, M; Gendt, S De; Brems, S; Glorieux, C

    2012-01-01

    In the semiconductor industry, the ongoing down-scaling of nanoelectronic elements has led to an increasing complexity of their fabrication. Hence, the individual fabrication processes become increasingly difficult to handle. To minimize cross-contamination, intermediate surface cleaning and preparation steps are inevitable parts of the semiconductor process chain. Here, one major challenge is the removal of residual nano-particulate contamination resulting from abrasive processes such as polishing and etching. In the past, physical cleaning techniques such as megasonic cleaning have been proposed as suitable solutions. However, the soaring fragility of the smallest structures is constraining the forces of the involved physical removal mechanisms. In the case of 'megasonic' cleaning – cleaning with ultrasound in the MHz domain – the main cleaning action arises from strongly oscillating microbubbles which emerge from the periodically changing tensile strain in the cleaning liquid during sonication. These bubbles grow, oscillate and collapse due to a complex interplay of rectified diffusion, bubble coalescence, non-linear pulsation and the onset of shape instabilities. Hence, the resulting bubble size distribution does not remain static but changes continuously. Only microbubbles in this distribution that show a high oscillatory response are responsible for the cleaning action. Therefore, the cleaning process efficiency can be improved by keeping the majority of bubbles around their resonance size. In this paper, we propose a method to control and characterize the bubble size distribution by means of 'pulsed' sonication and measurements of acoustic cavitation spectra, respectively. We show that the so-obtained bubble size distributions can be related to theoretical predictions of the oscillatory responses of and the onset of shape instabilities for the respective bubbles. We also propose a mechanism to explain the enhancement of both acoustic and cleaning

  4. A trial of distributed portable data acquisition and processing system implementation: the qdpb - data processing with branchpoints

    International Nuclear Information System (INIS)

    Gritsaj, K.I.; Isupov, A.Yu.

    2001-01-01

    A trial implementation of the distributed portable data acquisition and processing system qdpb is presented. The experimental-setup-specific data handling and hardware-dependent code are separated from the generic part of the qdpb system. The implementation of the generic part is described.

  5. Distributed Sensing and Processing Adaptive Collaboration Environment (D-SPACE)

    Science.gov (United States)

    2014-07-01

    ... "cloud" technologies are not appropriate for situation understanding in areas of denial, where computation resources are limited, data not easily ... graph matching process. D-SPACE distributes graph exploitation among a network of autonomous computational resources, designs the collaboration policy

  6. Heat and work distributions for mixed Gauss–Cauchy process

    International Nuclear Information System (INIS)

    Kuśmierz, Łukasz; Gudowska-Nowak, Ewa; Rubi, J Miguel

    2014-01-01

    We analyze energetics of a non-Gaussian process described by a stochastic differential equation of the Langevin type. The process represents a paradigmatic model of a nonequilibrium system subject to thermal fluctuations and additional external noise, with both sources of perturbations considered as additive and statistically independent forcings. We define thermodynamic quantities for trajectories of the process and analyze contributions to mechanical work and heat. As a working example we consider a particle subjected to a drag force and two statistically independent Lévy white noises with stability indices α = 2 and α = 1. The fluctuations of dissipated energy (heat) and distribution of work performed by the force acting on the system are addressed by examining contributions of Cauchy fluctuations (α = 1) to either bath or external force acting on the system. (paper)
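
    Schematically, the setting corresponds to an overdamped Langevin equation driven by two statistically independent Lévy white noises; the notation below is a generic sketch and may differ from the paper's conventions:

        \gamma\,\dot{x}(t) = -\partial_x U(x,t) + \xi_{\alpha=2}(t) + \xi_{\alpha=1}(t) ,
        \qquad W = \int_0^{\tau} \partial_t U\bigl(x(t),t\bigr)\,\mathrm{d}t ,
        \qquad Q = W - \Delta U ,

    with ξ_{α=2} Gaussian white noise, ξ_{α=1} Cauchy white noise, W the work performed on the particle by the time-dependent potential and Q the heat exchanged, so that a first-law balance holds along each trajectory.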

  7. Does open access improve the process and outcome of podiatric care?

    Science.gov (United States)

    Wrobel, James S; Davies, Michael L; Robbins, Jeffrey M

    2011-05-19

    Open access to clinics is a management strategy to improve healthcare delivery. Providers are sometimes hesitant to adopt open access because of fear of increased visits for potentially trivial complaints. We hypothesized open access clinics would result in decreased wait times, increased number of podiatry visits, fewer no shows, higher rates of acute care visits, and lower minor amputation rates over control clinics without open access. This study was a national retrospective case-control study of VHA (Veterans Hospital Administration) podiatry clinics in 2008. Eight case facilities reported to have open podiatry clinic access for at least one year were identified from an email survey. Sixteen control facilities with similar structural features (e.g., full time podiatrists, health tech, residency program, reconstructive foot surgery, vascular, and orthopedic surgery) were identified in the same geographic region as the case facilities. Twenty-two percent of facilities responded to the survey. Fifty-four percent reported open access and 46% did not. There were no differences in facility or podiatry panel size, podiatry visits, or visit frequency between the cases and controls. Podiatry visits trended higher for control facilities but did not reach statistical significance. Case facilities had more new consults seen within 30 days (96%, 89%; P = 0.050) and lower minor amputation rates (0.62/1,000, 1.0/1,000; P = 0.041). The VHA is the world's largest managed care organization and it relies on clinical efficiencies as one mechanism to improve the quality of care. Open access clinics had more timely access for new patients and lower rates of minor amputations.

  8. Gold open access: the best of both worlds.

    Science.gov (United States)

    van der Heyden, M A G; van Veen, T A B

    2018-01-01

    Gold open access provides free distribution of trustworthy scientific knowledge for everyone. As publication modus, it has to withstand the bad reputation of predatory journals and overcome the preconceptions of those who believe that open access is synonymous with poor quality articles and high costs. Gold open access has a bright future and will serve the scientific community, clinicians without academic affiliations and the general public.

  9. OpenSearch technology for geospatial resources discovery

    Science.gov (United States)

    Papeschi, Fabrizio; Enrico, Boldrini; Mazzetti, Paolo

    2010-05-01

    set of services for discovery, access, and processing of geospatial resources in a SOA framework. GI-cat is a distributed CSW framework implementation developed by the ESSI Lab of the Italian National Research Council (CNR-IMAA) and the University of Florence. It provides brokering and mediation functionalities towards heterogeneous resources and inventories, exposing several standard interfaces for query distribution. This work focuses on a new GI-cat interface which allows the catalog to be queried according to the OpenSearch syntax specification, thus filling the gap between the SOA architectural design of the CSW and the Web 2.0. At the moment, there is no OGC standard specification about this topic, but an official change request has been proposed in order to enable the OGC catalogues to support OpenSearch queries. In this change request, an OpenSearch extension is proposed providing a standard mechanism to query a resource based on temporal and geographic extents. Two new catalog operations are also proposed, in order to publish a suitable OpenSearch interface. This extended interface is implemented by the modular GI-cat architecture adding a new profiling module called "OpenSearch profiler". Since GI-cat also acts as a clearinghouse catalog, another component called "OpenSearch accessor" is added in order to access OpenSearch compliant services. An important role in the GI-cat extension, is played by the adopted mapping strategy. Two different kind of mappings are required: query, and response elements mapping. Query mapping is provided in order to fit the simple OpenSearch query syntax to the complex CSW query expressed by the OGC Filter syntax. GI-cat internal data model is based on the ISO-19115 profile, that is more complex than the simple XML syndication formats, such as RSS 2.0 and Atom 1.0, suggested by OpenSearch. Once response elements are available, in order to be presented, they need to be translated from the GI-cat internal data model, to the above
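
    On the client side, an OpenSearch query against such an interface reduces to an HTTP GET whose parameters instantiate the service's URL template. The endpoint and parameter names below are placeholders standing in for the searchTerms, geo:box, time:start and time:end template parameters of the proposed extension; a real service advertises its own template in its OpenSearch description document.

        import requests

        # Placeholder GI-cat OpenSearch endpoint and parameter names.
        ENDPOINT = "http://example.org/gi-cat/services/opensearch"

        params = {
            "searchTerms": "sea surface temperature",
            "bbox": "-10,35,20,60",        # geo:box (west,south,east,north)
            "timeStart": "2009-01-01",     # time:start
            "timeEnd": "2009-12-31",       # time:end
            "startIndex": 1,
            "count": 10,
        }
        response = requests.get(ENDPOINT, params=params)
        print(response.status_code, response.headers.get("Content-Type"))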

  10. Does ego development increase during midlife? The effects of openness and accommodative processing of difficult events.

    Science.gov (United States)

    Lilgendahl, Jennifer Pals; Helson, Ravenna; John, Oliver P

    2013-08-01

    Although Loevinger's model of ego development is a theory of personality growth, there are few studies that have examined age-related change in ego level over developmentally significant periods of adulthood. To address this gap in the literature, we examined mean-level change and individual differences in change in ego level over 18 years of midlife. In this longitudinal study, participants were 79 predominantly White, college-educated women who completed the Washington University Sentence Completion Test in early (age 43) and late (age 61) midlife as well as measures of the trait of Openness (ages 21, 43, 52, and 61) and accommodative processing (assessed from narratives of difficult life events at age 52). As hypothesized, the sample overall showed a mean-level increase in ego level from age 43 to age 61. Additionally, a regression analysis showed that both the trait of Openness at age 21 and accommodative processing of difficult events that occurred during (as opposed to prior to) midlife were each predictive of increasing ego level from age 43 to age 61. These findings counter prior claims that ego level remains stable during adulthood and contribute to our understanding of the underlying processes involved in personality growth in midlife. © 2012 Wiley Periodicals, Inc.

  11. Gold open access : the best of both worlds

    NARCIS (Netherlands)

    van der Heyden, M A G; van Veen, T A B

    Gold open access provides free distribution of trustworthy scientific knowledge for everyone. As publication modus, it has to withstand the bad reputation of predatory journals and overcome the preconceptions of those who believe that open access is synonymous with poor quality articles and high

  12. Grid integrated distributed PV (GridPV).

    Energy Technology Data Exchange (ETDEWEB)

    Reno, Matthew J.; Coogan, Kyle

    2013-08-01

    This manual provides the documentation of the MATLAB toolbox of functions for using OpenDSS to simulate the impact of solar energy on the distribution system. The majority of the functions are useful for interfacing OpenDSS and MATLAB, and they are of generic use for commanding OpenDSS from MATLAB and retrieving information from simulations. A set of functions is also included for modeling PV plant output and setting up the PV plant in the OpenDSS simulation. The toolbox contains functions for modeling the OpenDSS distribution feeder on satellite images with GPS coordinates. Finally, example simulations functions are included to show potential uses of the toolbox functions. Each function in the toolbox is documented with the function use syntax, full description, function input list, function output list, example use, and example output.

  13. The Semi-opened Infrastructure Model (SopIM): A Frame to Set Up an Organizational Learning Process

    Science.gov (United States)

    Grundstein, Michel

    In this paper, we introduce the "Semi-opened Infrastructure Model (SopIM)" implemented to deploy Artificial Intelligence and Knowledge-based Systems within a large industrial company. This model illustrates what could be two of the operating elements of the Model for General Knowledge Management within the Enterprise (MGKME) that are essential to set up the organizational learning process that leads people to appropriate and use concepts, methods and tools of an innovative technology: the "Ad hoc Infrastructures" element, and the "Organizational Learning Processes" element.

  14. Evaluation of the Suitability of Alluxio for Hadoop Processing Frameworks

    CERN Document Server

    Lawrie, Christopher; CERN. Geneva. IT Department

    2016-01-01

    Alluxio is an open source memory speed virtual distributed storage platform. It sits between the storage and processing framework layers for big data processing and claims to heavily improve performance when data is required to be written/read at a high throughput; for example when a dataset is used by many jobs simultaneously. This report evaluates the viability of using Alluxio at CERN for Hadoop processing frameworks.

  15. Documenting open source migration processes for re-use

    CSIR Research Space (South Africa)

    Gerber, A

    2010-10-01

    Full Text Available There are several sources that indicate a remarkable increase in the adoption of open source software (OSS) into the technology infrastructure of organizations. In fact, the number of medium to large organizations without some OSS installations...

  16. Dust charging processes with a Cairns-Tsallis distribution function with negative ions

    Energy Technology Data Exchange (ETDEWEB)

    Abid, A. A., E-mail: abidaliabid1@hotmail.com [Applied Physics Department, Federal Urdu University of Arts, Science and Technology, Islamabad Campus, Islamabad 45320 (Pakistan); Khan, M. Z., E-mail: mzk-qau@yahoo.com [Applied Physics Department, Federal Urdu University of Arts, Science and Technology, Islamabad Campus, Islamabad 45320 (Pakistan); Plasma Technology Research Center, Department of Physics, Faculty of Science, University of Malaya, Kuala Lumpur 50603 (Malaysia); Yap, S. L. [Plasma Technology Research Center, Department of Physics, Faculty of Science, University of Malaya, Kuala Lumpur 50603 (Malaysia); Terças, H., E-mail: hugo.tercas@tecnico.ul.pt [Physics of Information Group, Instituto de Telecomunicações, Av. Rovisco Pais, Lisbon 1049-001 (Portugal); Mahmood, S. [Science Place, University of Saskatchewan, Saskatoon, Saskatchewan S7N5A2 (Canada)

    2016-01-15

    Dust grain charging processes are presented in a non-Maxwellian dusty plasma following the Cairns-Tsallis (q, α)–distribution, whose constituents are the electrons, as well as the positive/negative ions and negatively charged dust grains. For this purpose, we have solved the current balance equation for a negatively charged dust grain to achieve an equilibrium state value (viz., q_d = constant) in the presence of Cairns-Tsallis (q, α)–distribution. In fact, the current balance equation becomes modified due to the Boltzmannian/streaming distributed negative ions. It is numerically found that the relevant plasma parameters, such as the spectral indexes q and α, the positive ion-to-electron temperature ratio, and the negative ion streaming speed (U_0) significantly affect the dust grain surface potential. It is also shown that in the limit q → 1 the Cairns-Tsallis reduces to the Cairns distribution; for α = 0 the Cairns-Tsallis distribution reduces to pure Tsallis distribution and the latter reduces to Maxwellian distribution for q → 1 and α = 0.

  17. Dust charging processes with a Cairns-Tsallis distribution function with negative ions

    International Nuclear Information System (INIS)

    Abid, A. A.; Khan, M. Z.; Yap, S. L.; Terças, H.; Mahmood, S.

    2016-01-01

    Dust grain charging processes are presented in a non-Maxwellian dusty plasma following the Cairns-Tsallis (q, α)–distribution, whose constituents are the electrons, as well as the positive/negative ions and negatively charged dust grains. For this purpose, we have solved the current balance equation for a negatively charged dust grain to achieve an equilibrium state value (viz., q_d = constant) in the presence of Cairns-Tsallis (q, α)–distribution. In fact, the current balance equation becomes modified due to the Boltzmannian/streaming distributed negative ions. It is numerically found that the relevant plasma parameters, such as the spectral indexes q and α, the positive ion-to-electron temperature ratio, and the negative ion streaming speed (U_0) significantly affect the dust grain surface potential. It is also shown that in the limit q → 1 the Cairns-Tsallis reduces to the Cairns distribution; for α = 0 the Cairns-Tsallis distribution reduces to pure Tsallis distribution and the latter reduces to Maxwellian distribution for q → 1 and α = 0

  18. Phase distribution measurements in narrow rectangular channels using image processing techniques

    International Nuclear Information System (INIS)

    Bentley, C.; Ruggles, A.

    1991-01-01

    Many high flux research reactor fuel assemblies are cooled by systems of parallel narrow rectangular channels. The HFIR is cooled by single phase forced convection under normal operating conditions. However, two-phase forced convection or two phase mixed convection can occur in the fueled region as a result of some hypothetical accidents. Such flow conditions would occur only at decay power levels. The system pressure would be around 0.15 MPa in such circumstances. Phase distribution of air-water flow in a narrow rectangular channel is examined using image processing techniques. Ink is added to the water and clear channel walls are used to allow high speed still photographs and video tape to be taken of the air-water flow field. Flow field images are digitized and stored in a Macintosh 2ci computer using a frame grabber board. Local grey levels are related to liquid thickness in the flow channel using a calibration fixture. Image processing shareware is used to calculate the spatially averaged liquid thickness from the image of the flow field. Time averaged spatial liquid distributions are calculated using image calculation algorithms. The spatially averaged liquid distribution is calculated from the time averaged spatial liquid distribution to formulate the combined temporally and spatially averaged fraction values. The temporally and spatially averaged liquid fractions measured using this technique compare well to those predicted from pressure gradient measurements at zero superficial liquid velocity
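
    The averaging chain described above, from grey levels to a combined temporally and spatially averaged liquid fraction, can be sketched in a few lines of NumPy; the calibration slope, offset and channel gap are placeholders rather than values from the experiment.

        import numpy as np

        # Hypothetical stack of digitized frames: (time, rows, cols) grey levels 0-255.
        frames = np.random.randint(0, 256, size=(100, 120, 40)).astype(float)

        # Linear calibration from the calibration fixture: grey level -> liquid
        # thickness (mm); slope and offset are placeholders.
        slope, offset = 0.012, -0.05
        thickness = slope * frames + offset            # liquid thickness per pixel

        time_avg_map = thickness.mean(axis=0)          # time-averaged spatial distribution
        liquid_fraction = time_avg_map.mean() / 2.5    # assume a 2.5 mm channel gap
        print(f"temporally and spatially averaged liquid fraction: {liquid_fraction:.3f}")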

  19. AMBIT RESTful web services: an implementation of the OpenTox application programming interface

    Directory of Open Access Journals (Sweden)

    Jeliazkova Nina

    2011-05-01

    downloadable web application allows researchers to setup an arbitrary number of service instances for specific purposes and at suitable locations. These services could be used as a distributed framework for processing of resource-intensive tasks and data sharing or in a fully independent way, according to the specific needs. The advantage of exposing the functionality via the OpenTox API is seamless interoperability, not only within a single web application, but also in a network of distributed services. Last, but not least, the services provide a basis for building web mashups, end user applications with friendly GUIs, as well as embedding the functionalities in existing workflow systems.

  20. AMBIT RESTful web services: an implementation of the OpenTox application programming interface.

    Science.gov (United States)

    Jeliazkova, Nina; Jeliazkov, Vedrin

    2011-05-16

    allows researchers to setup an arbitrary number of service instances for specific purposes and at suitable locations. These services could be used as a distributed framework for processing of resource-intensive tasks and data sharing or in a fully independent way, according to the specific needs. The advantage of exposing the functionality via the OpenTox API is seamless interoperability, not only within a single web application, but also in a network of distributed services. Last, but not least, the services provide a basis for building web mashups, end user applications with friendly GUIs, as well as embedding the functionalities in existing workflow systems.
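
    From a client's point of view, consuming such a REST service is a matter of plain HTTP with content negotiation; the host name, dataset identifier and paging parameters below are placeholders for whatever a concrete AMBIT deployment exposes.

        import requests

        # Placeholder AMBIT instance; real deployments expose OpenTox-style URIs
        # such as /compound, /dataset, /algorithm and /model.
        BASE = "https://ambit.example.org"

        # Ask for a dataset representation; content negotiation selects the format.
        r = requests.get(f"{BASE}/dataset/1",
                         params={"page": 0, "pagesize": 10},
                         headers={"Accept": "application/json"})
        r.raise_for_status()
        print(r.json())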

  1. Facilitating the openEHR approach - organizational structures for defining high-quality archetypes.

    Science.gov (United States)

    Kohl, Christian Dominik; Garde, Sebastian; Knaup, Petra

    2008-01-01

    Using openEHR archetypes to establish an electronic patient record promises rapid development and system interoperability by using or adopting existing archetypes. However, internationally accepted, high quality archetypes which enable a comprehensive semantic interoperability require adequate development and maintenance processes. Therefore, structures have to be created involving different health professions. In the following we present a model which facilitates and governs distributed but cooperative development and adoption of archetypes by different professionals including peer reviews. Our model consists of a hierarchical structure of professional committees and descriptions of the archetype development process considering these different committees.

  2. Barista: A Framework for Concurrent Speech Processing by USC-SAIL.

    Science.gov (United States)

    Can, Doğan; Gibson, James; Vaz, Colin; Georgiou, Panayiotis G; Narayanan, Shrikanth S

    2014-05-01

    We present Barista, an open-source framework for concurrent speech processing based on the Kaldi speech recognition toolkit and the libcppa actor library. With Barista, we aim to provide an easy-to-use, extensible framework for constructing highly customizable concurrent (and/or distributed) networks for a variety of speech processing tasks. Each Barista network specifies a flow of data between simple actors, concurrent entities communicating by message passing, modeled after Kaldi tools. Leveraging the fast and reliable concurrency and distribution mechanisms provided by libcppa, Barista lets demanding speech processing tasks, such as real-time speech recognizers and complex training workflows, to be scheduled and executed on parallel (and/or distributed) hardware. Barista is released under the Apache License v2.0.

  3. Sensitivity of open-water ice growth and ice concentration evolution in a coupled atmosphere-ocean-sea ice model

    Science.gov (United States)

    Shi, Xiaoxu; Lohmann, Gerrit

    2017-09-01

    A coupled atmosphere-ocean-sea ice model is applied to investigate to what degree the area-thickness distribution of new ice formed in open water affects the ice and ocean properties. Two sensitivity experiments are performed which modify the horizontal-to-vertical aspect ratio of open-water ice growth. The resulting changes in the Arctic sea-ice concentration strongly affect the surface albedo, the ocean heat release to the atmosphere, and the sea-ice production. The changes are further amplified through a positive feedback mechanism among the Arctic sea ice, the Atlantic Meridional Overturning Circulation (AMOC), and the surface air temperature in the Arctic, as the Fram Strait sea ice import influences the freshwater budget in the North Atlantic Ocean. Anomalies in sea-ice transport lead to changes in sea surface properties of the North Atlantic and the strength of AMOC. For the Southern Ocean, the most pronounced change is a warming along the Antarctic Circumpolar Current (ACC), owing to the interhemispheric bipolar seesaw linked to AMOC weakening. Another insight of this study lies in the improvement of our climate model. The ocean component FESOM is a newly developed ocean-sea ice model with an unstructured mesh and multi-resolution. We find that the subpolar sea-ice boundary in the Northern Hemisphere can be improved by tuning the process of open-water ice growth, which strongly influences the sea ice concentration in the marginal ice zone, the North Atlantic circulation, salinity and Arctic sea ice volume. Since the distribution of new ice on open water relies on many uncertain parameters and the knowledge of the detailed processes is currently too crude, it is a challenge to implement the processes realistically into models. Based on our sensitivity experiments, we conclude that there is a pronounced uncertainty related to open-water sea ice growth which could significantly affect the climate system sensitivity.

  4. Should Research Always be OPEN

    OpenAIRE

    Taylor, Mike

    2014-01-01

    "If I have seen further it is by standing on the shoulders of giants", said Isaac Newton. Since the earliest days of science, progress has always been achieved by the free exchange and re-use of ideas. Understanding this, scientists have always leaned in the direction of openness. Science outside of trade secrets and state secrets has a natural tendency to be open. Until recently, the principle barrier to sharing science has been the logistic difficulty of printing and distributing copies...

  5. BIG GEO DATA MANAGEMENT: AN EXPLORATION WITH SOCIAL MEDIA AND TELECOMMUNICATIONS OPEN DATA

    Directory of Open Access Journals (Sweden)

    C. Arias Munoz

    2016-06-01

    Full Text Available The term Big Data has been recently used to define big, highly varied, complex data sets, which are created and updated at a high speed and require faster processing, namely, a reduced time to filter and analyse relevant data. These data are also increasingly becoming Open Data (data that can be freely distributed) made public by the government, agencies, private enterprises, among others. There are at least two issues that can obstruct the availability and use of Open Big Datasets: Firstly, the gathering and geoprocessing of these datasets are very computationally intensive; hence, it is necessary to integrate high-performance solutions, preferably internet based, to achieve the goals. Secondly, the problems of heterogeneity and inconsistency in geospatial data are well known and affect the data integration process, but this is particularly problematic for Big Geo Data. Therefore, Big Geo Data integration will be one of the most challenging issues to solve. With these applications, we demonstrate that it is possible to provide processed Big Geo Data to common users, using open geospatial standards and technologies. NoSQL databases like MongoDB and frameworks like RASDAMAN could offer different functionalities that facilitate working with larger volumes and more heterogeneous geospatial data sources.

  6. Big Geo Data Management: AN Exploration with Social Media and Telecommunications Open Data

    Science.gov (United States)

    Arias Munoz, C.; Brovelli, M. A.; Corti, S.; Zamboni, G.

    2016-06-01

    The term Big Data has been recently used to define big, highly varied, complex data sets, which are created and updated at a high speed and require faster processing, namely, a reduced time to filter and analyse relevant data. These data are also increasingly becoming Open Data (data that can be freely distributed) made public by the government, agencies, private enterprises, among others. There are at least two issues that can obstruct the availability and use of Open Big Datasets: Firstly, the gathering and geoprocessing of these datasets are very computationally intensive; hence, it is necessary to integrate high-performance solutions, preferably internet based, to achieve the goals. Secondly, the problems of heterogeneity and inconsistency in geospatial data are well known and affect the data integration process, but this is particularly problematic for Big Geo Data. Therefore, Big Geo Data integration will be one of the most challenging issues to solve. With these applications, we demonstrate that it is possible to provide processed Big Geo Data to common users, using open geospatial standards and technologies. NoSQL databases like MongoDB and frameworks like RASDAMAN could offer different functionalities that facilitate working with larger volumes and more heterogeneous geospatial data sources.
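
    A small example of the NoSQL side mentioned above: with PyMongo, a 2dsphere index turns geotagged social media documents into a spatially queryable collection. The connection string, collection name and coordinates are placeholders.

        from pymongo import MongoClient, GEOSPHERE

        client = MongoClient("mongodb://localhost:27017")   # placeholder connection
        posts = client.bigeo.social_posts                    # hypothetical collection

        posts.create_index([("location", GEOSPHERE)])        # 2dsphere index on GeoJSON points

        # Geotagged posts within 500 m of Milan's Piazza del Duomo.
        cursor = posts.find({
            "location": {
                "$near": {
                    "$geometry": {"type": "Point", "coordinates": [9.1919, 45.4642]},
                    "$maxDistance": 500,
                }
            }
        })
        for doc in cursor.limit(5):
            print(doc["_id"])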

  7. DISCO - A concept of a system for integrated data base management in distributed data processing systems

    International Nuclear Information System (INIS)

    Holler, E.

    1980-01-01

    The development in data processing technology favors the trend towards distributed data processing systems: The large-scale integration of semiconductor devices has led to very efficient (approx. 10^6 operations per second) and relatively cheap low end computers being offered today, which make it possible to install distributed data processing systems with a total capacity approaching that of large-scale data processing plants at a tolerable investment expenditure. The technologies of communication and data banks, each by itself, have reached a state of development justifying their routine application. The integration of both technologies in the development of systems for integrated distributed data bank management, however, is new territory for engineering. (orig.) [de

  8. Fuel distribution process risk analysis in East Borneo

    Directory of Open Access Journals (Sweden)

    Laksmita Raizsa

    2018-01-01

    Full Text Available Fuel distribution is an important aspect of fulfilling the customer's needs. It is risky because delays in distribution can cause fuel scarcity. In the process of distribution, many risks occur. House of Risk is a method used for mitigating risk. It identifies seven risk events and nine risk agents. An occurrence and severity matrix is used to eliminate risks with minor impact. House of Risk 1 is used for determining the Aggregate Risk Potential (ARP). A Pareto diagram is applied to prioritize the risks that must be mitigated by preventive actions based on ARP. It identifies four priority risks, namely A8 (car trouble), A4 (human error), A3 (bank deposit error and underpayment), and A6 (traffic accident), which should be mitigated. House of Risk 2 maps the preventive actions against the risk agents and gives the Effectiveness to Difficulty ratio (ETD) for each mitigating action. Conducting a routine safety talk once every three days, with an ETD of 2088, is the primary preventive action.
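
    In the House of Risk method the Aggregate Risk Potential of a risk agent is commonly computed as its occurrence multiplied by the severity-weighted sum of its correlations with the risk events; the sketch below uses invented values only to show the calculation and the Pareto ranking, not the paper's data.

        # Hypothetical severities of risk events (1-10), occurrences of risk agents
        # (1-10) and correlation strengths (0, 1, 3, 9), as in a House of Risk 1 table.
        severity = {"E1": 7, "E2": 5}
        occurrence = {"A4": 6, "A8": 7}
        correlation = {("A4", "E1"): 9, ("A4", "E2"): 3,
                       ("A8", "E1"): 3, ("A8", "E2"): 9}

        # Aggregate Risk Potential: ARP_j = O_j * sum_i(S_i * R_ij)
        arp = {a: occurrence[a] * sum(severity[e] * correlation.get((a, e), 0)
                                      for e in severity)
               for a in occurrence}

        # Pareto ranking of agents to select those handled in House of Risk 2.
        for agent, score in sorted(arp.items(), key=lambda kv: kv[1], reverse=True):
            print(agent, score)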

  9. Selection and Management of Open Source Software in Libraries

    OpenAIRE

    Vimal Kumar, V.

    2007-01-01

    Open source software was a revolutionary concept among computer programmers and users. To a certain extent, open source solutions can provide an alternative to costly commercial software. Open source software is software that users have the ability to run, copy, distribute, study, change, share and improve for any purpose. Open source library software does not need the initial cost of commercial software and enables libraries to have greater control over their working environmen...

  10. QuTiP: An open-source Python framework for the dynamics of open quantum systems

    Science.gov (United States)

    Johansson, J. R.; Nation, P. D.; Nori, Franco

    2012-08-01

    We present an object-oriented open-source framework for solving the dynamics of open quantum systems written in Python. Arbitrary Hamiltonians, including time-dependent systems, may be built up from operators and states defined by a quantum object class, and then passed on to a choice of master equation or Monte Carlo solvers. We give an overview of the basic structure for the framework before detailing the numerical simulation of open system dynamics. Several examples are given to illustrate the build up to a complete calculation. Finally, we measure the performance of our library against that of current implementations. The framework described here is particularly well suited to the fields of quantum optics, superconducting circuit devices, nanomechanics, and trapped ions, while also being ideal for use in classroom instruction. Catalogue identifier: AEMB_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEMB_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License, version 3 No. of lines in distributed program, including test data, etc.: 16 482 No. of bytes in distributed program, including test data, etc.: 213 438 Distribution format: tar.gz Programming language: Python Computer: i386, x86-64 Operating system: Linux, Mac OSX, Windows RAM: 2+ Gigabytes Classification: 7 External routines: NumPy (http://numpy.scipy.org/), SciPy (http://www.scipy.org/), Matplotlib (http://matplotlib.sourceforge.net/) Nature of problem: Dynamics of open quantum systems. Solution method: Numerical solutions to Lindblad master equation or Monte Carlo wave function method. Restrictions: Problems must meet the criteria for using the master equation in Lindblad form. Running time: A few seconds up to several tens of minutes, depending on size of underlying Hilbert space.
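
    A minimal usage example (a leaky cavity relaxing under a single collapse operator) gives a flavour of the workflow the abstract describes; the parameter values are arbitrary.

        import numpy as np
        from qutip import basis, destroy, mesolve

        N = 10                          # truncated Fock space for a leaky cavity
        a = destroy(N)
        H = 2 * np.pi * a.dag() * a     # harmonic oscillator Hamiltonian (hbar = 1)
        psi0 = basis(N, 5)              # start with five photons
        kappa = 0.2                     # photon loss rate

        times = np.linspace(0.0, 10.0, 200)
        result = mesolve(H, psi0, times,
                         c_ops=[np.sqrt(kappa) * a],   # Lindblad collapse operator
                         e_ops=[a.dag() * a])          # track the photon number
        print(result.expect[0][-1])     # mean photon number at the final time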

  11. Laser scanner data processing and 3D modeling using a free and open source software

    International Nuclear Information System (INIS)

    Gabriele, Fatuzzo; Michele, Mangiameli; Giuseppe, Mussumeci; Salvatore, Zito

    2015-01-01

    Laser scanning is a technology that allows the geometry of objects to be surveyed in a short time with a high level of detail and completeness, based on the signal emitted by the laser and the corresponding return signal. When the incident laser radiation hits the object to be detected, the radiation is reflected. The purpose is to build a three-dimensional digital model that allows the reality of the object to be reconstructed and studies regarding its design, restoration and/or conservation to be conducted. When the laser scanner is equipped with a digital camera, the result of the measurement process is a set of points in XYZ coordinates with high density and accuracy and with radiometric RGB tones. In this case, the set of measured points is called a “point cloud” and allows the reconstruction of the Digital Surface Model. While post-processing is usually performed by closed source software, whose copyright restricts free use, free and open source software can improve performance by far: it can be used freely and provides the possibility to display and even customize the source code. The experience started at the Faculty of Engineering in Catania is aimed at finding a valuable free and open source tool, MeshLab (an Italian software package for data processing), to be compared with a reference closed source software for data processing, i.e. RapidForm. In this work, we compare the results obtained with MeshLab and RapidForm through the planning of the survey and the acquisition of the point cloud of a morphologically complex statue.

  12. Laser scanner data processing and 3D modeling using a free and open source software

    Energy Technology Data Exchange (ETDEWEB)

    Gabriele, Fatuzzo [Dept. of Industrial and Mechanical Engineering, University of Catania (Italy); Michele, Mangiameli, E-mail: amichele.mangiameli@dica.unict.it; Giuseppe, Mussumeci; Salvatore, Zito [Dept. of Civil Engineering and Architecture, University of Catania (Italy)

    2015-03-10

    Laser scanning is a technology that allows the geometry of objects to be surveyed in a short time with a high level of detail and completeness, based on the signal emitted by the laser and the corresponding return signal. When the incident laser radiation hits the object to be detected, the radiation is reflected. The purpose is to build a three-dimensional digital model that allows the reality of the object to be reconstructed and studies regarding its design, restoration and/or conservation to be conducted. When the laser scanner is equipped with a digital camera, the result of the measurement process is a set of points in XYZ coordinates with high density and accuracy and with radiometric RGB tones. In this case, the set of measured points is called a “point cloud” and allows the reconstruction of the Digital Surface Model. While post-processing is usually performed by closed source software, whose copyright restricts free use, free and open source software can improve performance by far: it can be used freely and provides the possibility to display and even customize the source code. The experience started at the Faculty of Engineering in Catania is aimed at finding a valuable free and open source tool, MeshLab (an Italian software package for data processing), to be compared with a reference closed source software for data processing, i.e. RapidForm. In this work, we compare the results obtained with MeshLab and RapidForm through the planning of the survey and the acquisition of the point cloud of a morphologically complex statue.

  13. Transient Properties of Probability Distribution for a Markov Process with Size-dependent Additive Noise

    Science.gov (United States)

    Yamada, Yuhei; Yamazaki, Yoshihiro

    2018-04-01

    This study considered a stochastic model for cluster growth in a Markov process with a cluster size dependent additive noise. According to this model, the probability distribution of the cluster size transiently becomes an exponential or a log-normal distribution depending on the initial condition of the growth. In this letter, a master equation is obtained for this model, and derivation of the distributions is discussed.
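
    For orientation, a growth law with size-dependent additive noise, dx/dt = f(x) + g(x) ξ(t), leads (in the Itô convention) to an evolution equation for the size distribution of the Fokker-Planck form; the notation is generic and may differ from the master equation actually derived in the letter:

        \frac{\partial P(x,t)}{\partial t}
          = -\frac{\partial}{\partial x}\bigl[f(x)\,P(x,t)\bigr]
            + \frac{1}{2}\,\frac{\partial^{2}}{\partial x^{2}}\bigl[g(x)^{2}\,P(x,t)\bigr] .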

  14. Commodity hardware and open source solutions in FTU data management

    International Nuclear Information System (INIS)

    Centioli, C.; Bracco, G.; Eccher, S.; Iannone, F.; Maslennikov, A.; Panella, M.; Vitale, V.

    2004-01-01

    The Frascati Tokamak Upgrade (FTU) data management system underwent several developments in the last year, mainly due to the availability of a huge amount of open source software and cheap commodity hardware. First of all, we replaced the four old and expensive SUN/SOLARIS servers running the AFS (Andrew File System) fusione.it cell with three SuperServer Supermicro SC-742 machines. Secondly, Linux 2.4 OS has been installed on our new cell servers and the OpenAFS 1.2.8 open source distributed file system has replaced the commercial IBM/Transarc AFS. A pioneering solution - SGI's XFS file system for Linux - has been adopted to format one terabyte of the FTU storage system on which the AFS volumes are based. Benchmark tests have shown the good performance of XFS compared to the classical ext3 Linux file system. Third, the data access software has been ported to Linux, together with the interfaces to Matlab and IDL, as well as the locally developed data display utility, SHOX. Finally, a new Object-Oriented Data Model (OODM) has been developed for FTU shot data to build and maintain a FTU data warehouse (DW). The FTU OODM has been developed using ROOT, an object oriented data analysis framework well-known in high energy physics. Since large volumes of data are involved, a parallel data extraction process, developed in the ROOT framework, has been implemented taking advantage of the AFS distributed environment of the FTU computing system.

  15. Commodity hardware and open source solutions in FTU data management

    Energy Technology Data Exchange (ETDEWEB)

    Centioli, C. E-mail: centioli@frascati.enea.it; Bracco, G.; Eccher, S.; Iannone, F.; Maslennikov, A.; Panella, M.; Vitale, V

    2004-06-01

    The Frascati Tokamak Upgrade (FTU) data management system underwent several developments in the last year, mainly due to the availability of a huge amount of open source software and cheap commodity hardware. First of all, we replaced the four old and expensive SUN/SOLARIS servers running the AFS (Andrew File System) fusione.it cell with three SuperServer Supermicro SC-742 machines. Secondly, Linux 2.4 OS has been installed on our new cell servers and the OpenAFS 1.2.8 open source distributed file system has replaced the commercial IBM/Transarc AFS. A pioneering solution - SGI's XFS file system for Linux - has been adopted to format one terabyte of the FTU storage system on which the AFS volumes are based. Benchmark tests have shown the good performance of XFS compared to the classical ext3 Linux file system. Third, the data access software has been ported to Linux, together with the interfaces to Matlab and IDL, as well as the locally developed data display utility, SHOX. Finally, a new Object-Oriented Data Model (OODM) has been developed for FTU shot data to build and maintain a FTU data warehouse (DW). The FTU OODM has been developed using ROOT, an object oriented data analysis framework well-known in high energy physics. Since large volumes of data are involved, a parallel data extraction process, developed in the ROOT framework, has been implemented taking advantage of the AFS distributed environment of the FTU computing system.

  16. Open source system OpenVPN in a function of Virtual Private Network

    Science.gov (United States)

    Skendzic, A.; Kovacic, B.

    2017-05-01

    Virtual Private Networks (VPNs) can establish a high level of security in network communication. VPN technology enables highly secure networking over distributed or public network infrastructure. A VPN applies different security and management rules inside networks. It can be set up using different communication channels, such as the Internet or separate ISP communication infrastructure. A VPN creates a secure communication channel over a public network between two endpoints (computers). OpenVPN is an open source software product under the GNU General Public License (GPL) that can be used to establish VPN communication between two computers inside a business local network over public communication infrastructure. It uses special security protocols and 256-bit encryption, and it is capable of traversing network address translators (NATs) and firewalls. It allows computers to authenticate each other using a pre-shared secret key, certificates, or a username and password. This work gives a review of VPN technology with special emphasis on OpenVPN. The paper also gives a comparison and discusses the financial benefits of using open source VPN software in a business environment.

  17. Converting a manned LCU into an unmanned surface vehicle (USV): an open systems architecture (OSA) case study

    OpenAIRE

    Smith, Montrell F.

    2014-01-01

    Approved for public release; distribution is unlimited. This thesis demonstrates the process by which the concepts of open systems architecture (OSA) might be applied within the context of an existing systems engineering methodology to result in a flexible system. This is accomplished by combining an existing systems engineering process model with OSA management and business principles to execute a successful asset-repurposing program. To demonstrate the utility of this OSA approach to systems ...

  18. Electrical power systems for distributed generation

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, T.A.; Huval, S.J. [Stewart & Stevenson Services, Inc., Houston, TX (United States)

    1996-12-31

    "Distributed Generation" has become the "buzz" word of an electric utility industry facing deregulation. Many industrial facilities utilize equipment in distributed installations to serve the needs of a thermal host through the capture of exhaust energy in a heat recovery steam generator. The electrical power generated is then sold as a "side benefit" to the cost-effective supply of high quality thermal energy. Distributed generation is desirable for many different reasons, each with unique characteristics of the product. Many years of experience in the distributed generation market has helped Stewart & Stevenson to define a range of product features that are crucial to most any application. The following paper will highlight a few of these applications. The paper will also examine the range of products currently available and in development. Finally, we will survey the additional services offered by Stewart & Stevenson to meet the needs of a rapidly changing power generation industry.

  19. Distributed resource management across process boundaries

    KAUST Repository

    Suresh, Lalith

    2017-09-27

    Multi-tenant distributed systems composed of small services, such as Service-oriented Architectures (SOAs) and Micro-services, raise new challenges in attaining high performance and efficient resource utilization. In these systems, a request execution spans tens to thousands of processes, and the execution paths and resource demands on different services are generally not known when a request first enters the system. In this paper, we highlight the fundamental challenges of regulating load and scheduling in SOAs while meeting end-to-end performance objectives on metrics of concern to both tenants and operators. We design Wisp, a framework for building SOAs that transparently adapts rate limiters and request schedulers system-wide according to operator policies to satisfy end-to-end goals while responding to changing system conditions. In evaluations against production as well as synthetic workloads, Wisp successfully enforces a range of end-to-end performance objectives, such as reducing average latencies, meeting deadlines, providing fairness and isolation, and avoiding system overload.
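
    Wisp's actual mechanisms are not spelled out in this record, so the following Python sketch only illustrates the kind of policy such a framework automates: a token-bucket admission controller whose rate is nudged down when observed end-to-end latency exceeds an operator-set target and nudged up otherwise. The class and parameter names are hypothetical.

```python
# Toy adaptive rate limiter: not Wisp's algorithm, just an illustrative feedback rule.
import time

class AdaptiveRateLimiter:
    def __init__(self, rate_per_s=100.0, target_latency_s=0.050):
        self.rate = rate_per_s              # current admission rate (requests/s)
        self.target = target_latency_s      # operator-set end-to-end latency goal
        self.tokens = rate_per_s
        self.last_refill = time.monotonic()

    def _refill(self):
        now = time.monotonic()
        self.tokens = min(self.rate, self.tokens + (now - self.last_refill) * self.rate)
        self.last_refill = now

    def try_admit(self):
        """Admit one request if a token is available."""
        self._refill()
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

    def report_latency(self, observed_s):
        """Multiplicative decrease / additive increase of the admission rate."""
        if observed_s > self.target:
            self.rate = max(1.0, self.rate * 0.8)
        else:
            self.rate += 1.0
```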

  20. Distributed resource management across process boundaries

    KAUST Repository

    Suresh, Lalith; Bodik, Peter; Menache, Ishai; Canini, Marco; Ciucu, Florin

    2017-01-01

    Multi-tenant distributed systems composed of small services, such as Service-oriented Architectures (SOAs) and Micro-services, raise new challenges in attaining high performance and efficient resource utilization. In these systems, a request execution spans tens to thousands of processes, and the execution paths and resource demands on different services are generally not known when a request first enters the system. In this paper, we highlight the fundamental challenges of regulating load and scheduling in SOAs while meeting end-to-end performance objectives on metrics of concern to both tenants and operators. We design Wisp, a framework for building SOAs that transparently adapts rate limiters and request schedulers system-wide according to operator policies to satisfy end-to-end goals while responding to changing system conditions. In evaluations against production as well as synthetic workloads, Wisp successfully enforces a range of end-to-end performance objectives, such as reducing average latencies, meeting deadlines, providing fairness and isolation, and avoiding system overload.

  1. High speed vision processor with reconfigurable processing element array based on full-custom distributed memory

    Science.gov (United States)

    Chen, Zhe; Yang, Jie; Shi, Cong; Qin, Qi; Liu, Liyuan; Wu, Nanjian

    2016-04-01

    In this paper, a hybrid vision processor based on a compact full-custom distributed memory for near-sensor high-speed image processing is proposed. The proposed processor consists of a reconfigurable processing element (PE) array, a row processor (RP) array, and a dual-core microprocessor. The PE array includes two-dimensional processing elements with a compact full-custom distributed memory. It supports real-time reconfiguration between the PE array and the self-organized map (SOM) neural network. The vision processor is fabricated using a 0.18 µm CMOS technology. The circuit area of the distributed memory is markedly reduced to 1/3 of that of a conventional memory, so that the circuit area of the vision processor is reduced by 44.2%. Experimental results demonstrate that the proposed design functions correctly.

  2. Peer Review Quality and Transparency of the Peer-Review Process in Open Access and Subscription Journals.

    Science.gov (United States)

    Wicherts, Jelte M

    2016-01-01

    Recent controversies highlighting substandard peer review in Open Access (OA) and traditional (subscription) journals have increased the need for authors, funders, publishers, and institutions to assure quality of peer-review in academic journals. I propose that transparency of the peer-review process may be seen as an indicator of the quality of peer-review, and develop and validate a tool enabling different stakeholders to assess transparency of the peer-review process. Based on editorial guidelines and best practices, I developed a 14-item tool to rate transparency of the peer-review process on the basis of journals' websites. In Study 1, a random sample of 231 authors of papers in 92 subscription journals in different fields rated transparency of the journals that published their work. Authors' ratings of the transparency were positively associated with quality of the peer-review process but unrelated to journal's impact factors. In Study 2, 20 experts on OA publishing assessed the transparency of established (non-OA) journals, OA journals categorized as being published by potential predatory publishers, and journals from the Directory of Open Access Journals (DOAJ). Results show high reliability across items (α = .91) and sufficient reliability across raters. Ratings differentiated the three types of journals well. In Study 3, academic librarians rated a random sample of 140 DOAJ journals and another 54 journals that had received a hoax paper written by Bohannon to test peer-review quality. Journals with higher transparency ratings were less likely to accept the flawed paper and showed higher impact as measured by the h5 index from Google Scholar. The tool to assess transparency of the peer-review process at academic journals shows promising reliability and validity. The transparency of the peer-review process can be seen as an indicator of peer-review quality allowing the tool to be used to predict academic quality in new journals.

  3. Properties of the open cluster system

    International Nuclear Information System (INIS)

    Janes, K.A.; Tilley, C.; Lynga, G.

    1988-01-01

    A system of weights corresponding to the precision of open cluster data is described. Using these weights, some properties of open clusters can be studied more accurately than was possible earlier. It is clear that there are three types of objects: unbound clusters, bound clusters in the thin disk, and older bound clusters. Galactic gradients of metallicity, longevity, and linear diameter are studied. Distributions at right angles to the galactic plane are discussed in the light of the different cluster types. The clumping of clusters in complexes is studied. An estimate of the selection effects influencing the present material of open cluster data is made in order to evaluate the role played by open clusters in the history of the galactic disk. 58 references

  4. Remote-Sensing Data Distribution and Processing in the Cloud at the ASF DAAC

    Science.gov (United States)

    Stoner, C.; Arko, S. A.; Nicoll, J. B.; Labelle-Hamer, A. L.

    2016-12-01

    The Alaska Satellite Facility (ASF) Distributed Active Archive Center (DAAC) has been tasked to archive and distribute data from both SENTINEL-1 satellites and from the NASA-ISRO Synthetic Aperture Radar (NISAR) satellite in a cost effective manner. In order to best support processing and distribution of these large data sets for users, the ASF DAAC enhanced its data system in a number of ways that will be detailed in this presentation. The SENTINEL-1 mission comprises a constellation of two polar-orbiting satellites, operating day and night performing C-band Synthetic Aperture Radar (SAR) imaging, enabling them to acquire imagery regardless of the weather. SENTINEL-1A was launched by the European Space Agency (ESA) in April 2014. SENTINEL-1B is scheduled to launch in April 2016. The NISAR satellite is designed to observe and take measurements of some of the planet's most complex processes, including ecosystem disturbances, ice-sheet collapse, and natural hazards such as earthquakes, tsunamis, volcanoes and landslides. NISAR will employ radar imaging, polarimetry, and interferometry techniques using the SweepSAR technology for full-resolution wide-swath imaging. NISAR data files are large, making storage and processing a challenge for conventional store-and-download systems. To effectively process, store, and distribute petabytes of data in a high-performance computing environment, ASF took a long view with regard to technology choices and picked a path of maximum flexibility and software re-use. To that end, this software tools and services presentation will cover Web Object Storage (WOS) and the ability to move seamlessly from local sunk-cost hardware to a public cloud, such as Amazon Web Services (AWS). A prototype of the SENTINEL-1A system that is in AWS, as well as a local hardware solution, will be examined to explain the pros and cons of each. In preparation for NISAR files, which will be even larger than SENTINEL-1A, ASF has embarked on a number of cloud

  5. Are anonymous evaluations a better assessment of faculty teaching performance? A comparative analysis of open and anonymous evaluation processes.

    Science.gov (United States)

    Afonso, Nelia M; Cardozo, Lavoisier J; Mascarenhas, Oswald A J; Aranha, Anil N F; Shah, Chirag

    2005-01-01

    We compared teaching performance of medical school faculty using anonymous evaluations and open evaluations (in which the evaluator was not anonymous) and examined barriers to open evaluation. Residents and medical students evaluated faculty using an open evaluation instrument in which their identity was indicated in the evaluation. Following this, they completed anonymous evaluation on the same faculty members. Aggregate outcomes using the two evaluation systems were compared. Outcomes by group of evaluators (residents and students) were analyzed. Trainees were also asked to rate the barriers to the open evaluation process. A statistically significant difference between the open and anonymous evaluations was noted across all items, with faculty receiving lower scores on the anonymous evaluations. The mean score for all the items on the open evaluations was 4.45 +/- 0.65, compared to mean score of 4.07 +/- 0.80 on the anonymous evaluations. There was also a statistically significant difference between open and anonymous evaluations in five clinical teaching domains that were evaluated individually. Residents perceived that the three most common barriers to optimal evaluation were an apprehension of possible encounters with the same attending physician in the future, destruction of working relationships with the attending, and a feeling of frustration with the evaluation system. The evaluation of faculty teaching performance is complex. Most academic medical centers use the open evaluation format. This study supports the case for the use of the anonymous evaluation method as a more accurate reflection of teaching performance.

  6. Spaces of Open-source Politics

    DEFF Research Database (Denmark)

    Husted, Emil; Plesner, Ursula

    2017-01-01

    The recent proliferation of Web 2.0 applications and their role in contemporary political life have inspired the coining of the term ‘open-source politics’. This article analyzes how open-source politics is organized in the case of a radical political party in Denmark called The Alternative. Inspired by the literature on organizational space, the analysis explores how different organizational spaces configure the party’s process of policy development, thereby adding to our understanding of the relationship between organizational space and political organization. We analyze three different... Curiously, it seems that physical spaces open up the political process, while digital spaces close it down by fixing meaning. Accordingly, we argue that open-source politics should not be equated with online politics but may be highly dependent on physical spaces. Furthermore, digital spaces may provide...

  7. Gap formation processes in a high-density plasma opening switch

    International Nuclear Information System (INIS)

    Grossmann, J.M.; Swanekamp, S.B.; Ottinger, P.F.; Commisso, R.J.; Hinshelwood, D.D.; Weber, B.V.

    1995-01-01

    A gap opening process in plasma opening switches (POS) is examined with the aid of numerical simulations. In these simulations, a high density (n_e = 10^14 to 5x10^15 cm^-3) uniform plasma initially bridges a small section of the coaxial transmission line of an inductive energy storage generator. A short section of vacuum transmission line connects the POS to a short circuit load. The results presented here extend previous simulations in the n_e = 10^12 to 10^13 cm^-3 density regime. The simulations show that a two-dimensional (2-D) sheath forms in the plasma near a cathode. This sheath is positively charged, and electrostatic sheath potentials that are large compared to the anode-cathode voltage develop. Initially, the 2-D sheath is located at the generator edge of the plasma. As ions are accelerated out of the sheath, it retains its original 2-D structure, but migrates axially toward the load, creating a magnetically insulated gap in its wake. When the sheath reaches the load edge of the POS, the POS stops conducting current and the load current increases rapidly. At the end of the conduction phase a gap exists in the POS whose size is determined by the radial dimensions of the 2-D sheath. Simulations at various plasma densities and current levels show that the radial size of the gap scales roughly as B/n_e, where B is the magnetic field. The results of this work are discussed in the context of long-conduction-time POS physics, but exhibit the same physical gap formation mechanisms as earlier lower density simulations more relevant to short-conduction-time POS. copyright 1995 American Institute of Physics

  8. TCP (truncated compound Poisson) process for multiplicity distributions in high energy collisions

    International Nuclear Information System (INIS)

    Srivastave, P.P.

    1990-01-01

    On using the Poisson distribution truncated at zero for intermediate cluster decay in a compound Poisson process, the authors obtain the TCP distribution, which describes quite well the multiplicity distributions in high energy collisions. A detailed comparison is made between TCP and NB for the UA5 data. The reduced moments up to the fifth agree very well with the observed ones. The TCP curves are narrower than NB in the high-multiplicity tail, look narrower at very high energy, and develop shoulders and oscillations which become increasingly pronounced as the energy grows. At lower energies, the TCP distribution also describes the data for fixed intervals of rapidity from UA5 and the data (at low energy) for e+e- annihilation and pion-proton, proton-proton and muon-proton scattering. A discussion of the compound Poisson distribution, expressions for the reduced moments, and Poisson transforms are also given. The TCP curves and curves of the reduced moments for different values of the parameters are also presented.
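
    A minimal sketch of sampling the TCP multiplicity distribution as described: the number of clusters is Poisson-distributed and each cluster decays into a zero-truncated Poisson number of particles. The parameter values lam and mu are illustrative, not fits to the UA5 data.

```python
# Sample a truncated compound Poisson (TCP) multiplicity distribution.
import numpy as np

rng = np.random.default_rng(42)

def zero_truncated_poisson(mu, size):
    """Sample Poisson(mu) conditioned on being >= 1, by simple rejection."""
    out = np.zeros(size, dtype=int)
    todo = np.arange(size)
    while todo.size:
        draw = rng.poisson(mu, todo.size)
        ok = draw > 0
        out[todo[ok]] = draw[ok]
        todo = todo[~ok]
    return out

def tcp_sample(lam, mu, size=20000):
    """Total multiplicity: sum over a Poisson number of truncated-Poisson clusters."""
    n_clusters = rng.poisson(lam, size)
    mult = np.zeros(size, dtype=int)
    for i, k in enumerate(n_clusters):
        if k > 0:
            mult[i] = zero_truncated_poisson(mu, k).sum()
    return mult

sample = tcp_sample(lam=8.0, mu=2.5)
print("mean multiplicity:", sample.mean(), "variance:", sample.var())
```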

  9. Improving flow distribution in influent channels using computational fluid dynamics.

    Science.gov (United States)

    Park, No-Suk; Yoon, Sukmin; Jeong, Woochang; Lee, Seungjae

    2016-10-01

    Although the flow distribution in an influent channel, where the inflow is split into each treatment process in a wastewater treatment plant, greatly affects the efficiency of the process, and although a weir is the typical structure used for flow distribution, to the authors' knowledge there is a paucity of research on the flow distribution in an open channel with a weir. In this study, the influent channel of a real-scale wastewater treatment plant was used, installing a suppressed rectangular weir that has a horizontal crest crossing the full channel width. The flow distribution in the influent channel was analyzed using a validated computational fluid dynamics model to investigate (1) the comparison of single-phase and two-phase simulation, (2) the improved procedure of the prototype channel, and (3) the effect of the inflow rate on flow distribution. The results show that two-phase simulation is more reliable due to its description of the free-surface fluctuations. Preventing short-circuit flow should be considered first when improving flow distribution, and the difference in kinetic energy with the inflow rate makes the flow distribution trends differ. The authors believe that this case study is helpful for improving flow distribution in an influent channel.
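
    As a back-of-the-envelope companion to the CFD study (not part of it), the discharge over a suppressed rectangular weir can be estimated with the classical Francis formula in SI units, Q = 1.84 * b * H^1.5; the channel width and heads in the sketch below are illustrative values.

```python
# Rough weir discharge estimate via the Francis formula (SI units); illustrative only.
def francis_weir_discharge(width_m: float, head_m: float) -> float:
    """Approximate discharge (m^3/s) over a suppressed rectangular weir."""
    return 1.84 * width_m * head_m ** 1.5

channel_width = 2.0  # m, illustrative
for head in (0.05, 0.10, 0.20):  # m of water above the crest
    q = francis_weir_discharge(channel_width, head)
    print(f"head {head:.2f} m -> Q ~ {q:.3f} m^3/s")
```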

  10. Sources and processes affecting the distribution of dissolved Nd isotopes and concentrations in the West Pacific

    Science.gov (United States)

    Behrens, Melanie K.; Pahnke, Katharina; Schnetger, Bernhard; Brumsack, Hans-Jürgen

    2018-02-01

    In the Atlantic, where deep circulation is vigorous, the dissolved neodymium (Nd) isotopic composition (expressed as ɛNd) is largely controlled by water mass mixing. In contrast, the factors influencing the ɛNd distribution in the Pacific, marked by sluggish circulation, are not yet clear. There are indications of regional overprints in the Pacific from its bordering volcanic islands. Our study aims to clarify the impact and relative importance of different Nd sources (rivers, volcanic islands), vertical (bio)geochemical processes and lateral water mass transport in controlling dissolved ɛNd and Nd concentration ([Nd]) distributions in the West Pacific between South Korea and Fiji. We find indications of unradiogenic continental input from South Korean and Chinese rivers to the East China Sea. In the tropical West Pacific, volcanic islands supply Nd to surface and subsurface waters and modify their ɛNd to radiogenic values of up to +0.7. These radiogenic signatures allow detailed tracing of currents flowing to the east and differentiation from westward currents with open ocean Pacific ɛNd composition in the complex tropical Pacific zonal current system. Modified radiogenic ɛNd of West Pacific intermediate to bottom waters upstream of or within our section also indicates non-conservative behavior of ɛNd due to boundary exchange at volcanic island margins, submarine ridges, and with hydrothermal particles. Only subsurface to deep waters (3000 m) in the open Northwest Pacific show conservative behavior of ɛNd. In contrast, we find a striking correlation of extremely low (down to 2.77 pmol/kg Nd) and laterally constant [Nd] with the high-salinity North and South Pacific Tropical Water, indicating lateral transport of preformed [Nd] from the North and South Pacific subtropical gyres into the study area. This observation also explains the previously observed low subsurface [Nd] in the tropical West Pacific. Similarly, Western South Pacific Central Water, Antarctic

  11. Monte Carlo simulation of second-generation open-type PET ''single-ring OpenPET'' implemented with DOI detectors

    International Nuclear Information System (INIS)

    Tashima, Hideaki; Yamaya, Taiga; Hirano, Yoshiyuki; Yoshida, Eiji; Kinouch, Shoko; Watanabe, Mitsuo; Tanaka, Eiichi

    2013-01-01

    At the National Institute of Radiological Sciences, we are developing OpenPET, an open-type positron emission tomography (PET) geometry with a physically open space, which allows easy access to the patient during PET studies. Our first-generation OpenPET system, dual-ring OpenPET, which consisted of two detector rings, could provide an extended axial field of view (FOV) including the open space. However, for applications such as in-beam PET to monitor the dose distribution in situ during particle therapy, higher sensitivity concentrated on the irradiation field is required rather than a wide FOV. In this report, we propose a second-generation OpenPET geometry, single-ring OpenPET, which can efficiently improve sensitivity while providing the required open space. When the proposed geometry was realized with block detectors, position-dependent degradation of the spatial resolution was expected because it was necessary to arrange the detector blocks in ellipsoidal rings stacked and shifted relative to one another. However, we found by Monte Carlo simulation that the use of depth-of-interaction (DOI) detectors made it feasible to achieve uniform spatial resolution in the FOV. (author)

  12. Variation in recombination frequency and distribution across eukaryotes: patterns and processes

    Science.gov (United States)

    Feulner, Philine G. D.; Johnston, Susan E.; Santure, Anna W.; Smadja, Carole M.

    2017-01-01

    Recombination, the exchange of DNA between maternal and paternal chromosomes during meiosis, is an essential feature of sexual reproduction in nearly all multicellular organisms. While the role of recombination in the evolution of sex has received theoretical and empirical attention, less is known about how recombination rate itself evolves and what influence this has on evolutionary processes within sexually reproducing organisms. Here, we explore the patterns of, and processes governing recombination in eukaryotes. We summarize patterns of variation, integrating current knowledge with an analysis of linkage map data in 353 organisms. We then discuss proximate and ultimate processes governing recombination rate variation and consider how these influence evolutionary processes. Genome-wide recombination rates (cM/Mb) can vary more than tenfold across eukaryotes, and there is large variation in the distribution of recombination events across closely related taxa, populations and individuals. We discuss how variation in rate and distribution relates to genome architecture, genetic and epigenetic mechanisms, sex, environmental perturbations and variable selective pressures. There has been great progress in determining the molecular mechanisms governing recombination, and with the continued development of new modelling and empirical approaches, there is now also great opportunity to further our understanding of how and why recombination rate varies. This article is part of the themed issue ‘Evolutionary causes and consequences of recombination rate variation in sexual organisms’. PMID:29109219

  13. Structural-Diagenetic Controls on Fracture Opening in Tight Gas Sandstone Reservoirs, Alberta Foothills

    Science.gov (United States)

    Ukar, Estibalitz; Eichhubl, Peter; Fall, Andras; Hooker, John

    2013-04-01

    relatively undeformed backlimb strata. Fracture apertures locally increase adjacent to reverse faults without an overall increase in fracture frequency. Fluid inclusion analyses of crack-seal quartz cement indicate both aqueous and methane-rich inclusions are present. Homogenization temperatures of two-phase inclusions indicate synkinematic fracture cement precipitation and fracture opening under conditions at or near maximum burial of 190-210°C in core samples, and 120-160°C in outcrop samples. In comparison with the fracture evolution in other, less deformed tight-gas sandstone reservoirs such as the Piceance and East Texas basins where fracture opening is primarily controlled by gas generation, gas charge, and pore fluid pressure, these results suggest a strong control of regional tectonic processes on fracture generation. In conjunction with timing and rate of gas charge, rates of fracture cement growth, and stratigraphic-lithological controls, these processes determine the overall distribution of open fractures in these reservoirs.

  14. Radionuclides in the study of marine processes

    International Nuclear Information System (INIS)

    Kershaw, P.J.; Woodhead, D.S.

    1991-01-01

    For many years, the radioactive properties of the naturally occurring radionuclides have been used to determine their distributions in the marine environment and, more generally, to gain an understanding of the dynamic processes which control their behaviour in attaining these distributions. More recently the inputs from human activities of both natural and artificial (i.e. man-made) radionuclides have provided additional opportunities for the study of marine processes on local, regional and global scales. The primary objective of the symposium is to provide a forum for an open discussion of the insights concerning processes in the marine environment which can be gained from studies of radionuclide behaviour. Papers have been grouped within the following principal themes; the uses of radionuclides as tracers of water transport; scavenging and particulate transport processes in the oceans as deduced from radionuclide behaviour; processes in the seabed and radionuclides in biological systems. (Author)

  15. Baseliner: An open-source, interactive tool for processing sap flux data from thermal dissipation probes

    Directory of Open Access Journals (Sweden)

    A. Christopher Oishi

    2016-01-01

    Full Text Available Estimating transpiration from woody plants using thermal dissipation sap flux sensors requires careful data processing. Currently, researchers accomplish this using spreadsheets, or by personally writing scripts for statistical software programs (e.g., R, SAS). We developed the Baseliner software to help establish a standardized protocol for processing sap flux data. Baseliner enables users to QA/QC data and process data using a combination of automated steps, visualization, and manual editing. Data processing requires establishing a zero-flow reference value, or “baseline”, which varies among sensors and with time. Since no set of algorithms currently exists to reliably QA/QC and estimate the zero-flow baseline, Baseliner provides a graphical user interface to allow visual inspection and manipulation of data. Data are first automatically processed using a set of user defined parameters. The user can then view the data for additional, manual QA/QC and baseline identification using mouse and keyboard commands. The open-source software allows for user customization of data processing algorithms as improved methods are developed.
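
    A minimal sketch of the calculation such a tool supports, assuming the widely cited Granier calibration; in practice the zero-flow baseline dT_max is chosen per sensor and per night rather than as a simple running maximum, which is what Baseliner's interactive workflow is for.

```python
# Granier-style sap flux density from thermal dissipation probe data; illustrative only.
import numpy as np

def sap_flux_density(dT, dT_max):
    """Granier (1985) calibration: Fd = 118.99e-6 * K**1.231 (m^3 m^-2 s^-1)."""
    K = (dT_max - dT) / dT
    return 118.99e-6 * np.power(np.maximum(K, 0.0), 1.231)

dT = np.array([9.8, 9.1, 8.5, 8.9, 9.7])   # probe temperature differences (deg C)
dT_max = dT.max()                            # crude zero-flow baseline
print(sap_flux_density(dT, dT_max))
```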

  16. Gene tree rooting methods give distributions that mimic the coalescent process.

    Science.gov (United States)

    Tian, Yuan; Kubatko, Laura S

    2014-01-01

    Multi-locus phylogenetic inference is commonly carried out via models that incorporate the coalescent process to model the possibility that incomplete lineage sorting leads to incongruence between gene trees and the species tree. An interesting question that arises in this context is whether data "fit" the coalescent model. Previous work (Rosenfeld et al., 2012) has suggested that rooting of gene trees may account for variation in empirical data that has been previously attributed to the coalescent process. We examine this possibility using simulated data. We show that, in the case of four taxa, the distribution of gene trees observed from rooting estimated gene trees with either the molecular clock or with outgroup rooting can be closely matched by the distribution predicted by the coalescent model with specific choices of species tree branch lengths. We apply commonly-used coalescent-based methods of species tree inference to assess their performance in these situations. Copyright © 2013 Elsevier Inc. All rights reserved.

  17. Design and simulation of parallel and distributed architectures for images processing

    International Nuclear Information System (INIS)

    Pirson, Alain

    1990-01-01

    The exploitation of visual information requires special computers. The diversity of operations and the Computing power involved bring about structures founded on the concepts of concurrency and distributed processing. This work identifies a vision computer with an association of dedicated intelligent entities, exchanging messages according to the model of parallelism introduced by the language Occam. It puts forward an architecture of the 'enriched processor network' type. It consists of a classical multiprocessor structure where each node is provided with specific devices. These devices perform processing tasks as well as inter-nodes dialogues. Such an architecture benefits from the homogeneity of multiprocessor networks and the power of dedicated resources. Its implementation corresponds to that of a distributed structure, tasks being allocated to each Computing element. This approach culminates in an original architecture called ATILA. This modular structure is based on a transputer network supplied with vision dedicated co-processors and powerful communication devices. (author) [fr

  18. Generalized parton distribution for non zero skewness

    International Nuclear Information System (INIS)

    Kumar, Narinder; Dahiya, Harleen; Teryaev, Oleg

    2012-01-01

    In the theory of strong interactions, the main open question is how the nucleon and other hadrons are built from quarks and gluons, the fundamental degrees of freedom in QCD. An essential tool to investigate hadron structure is the study of deep inelastic scattering processes, where individual quarks and gluons can be resolved. The parton densities extracted from such processes encode the distribution of longitudinal momentum and polarization carried by quarks, antiquarks and gluons within a fast moving hadron. They have done much to shape the physical picture of hadron structure. In recent years, it has become clear that appropriate exclusive scattering processes may provide such information encoded in the generalized parton distributions (GPDs). Here, we investigate the GPDs for deeply virtual Compton scattering (DVCS) at non-zero skewness. The GPDs are investigated by expressing them in terms of overlaps of light-front wave functions (LFWFs), representing a spin-1/2 system as a composite of a spin-1/2 fermion and a spin-1 boson with arbitrary masses

  19. Dimensions of open research: critical reflections on openness in the ROER4D project

    Directory of Open Access Journals (Sweden)

    Thomas William King

    2016-05-01

    Full Text Available Open Research has the potential to advance the scientific process by improving the transparency, rigour, scope and reach of research, but choosing to experiment with Open Research carries with it a set of ideological, legal, technical and operational considerations. Researchers, especially those in resource-constrained situations, may not be aware of the complex interrelations between these different domains of open practice, the additional resources required, or how Open Research can support traditional research practices. Using the Research on Open Educational Resources for Development (ROER4D) project as an example, this paper attempts to demonstrate the interrelation between ideological, legal, technical and operational openness; the resources that conducting Open Research requires; and the benefits of an iterative, strategic approach to one’s own Open Research practice. In this paper we discuss the value of a critical approach towards Open Research to ensure better coherence between ‘open’ ideology (embodied in strategic intention) and ‘open’ practice (the everyday operationalisation of open principles).

  20. Eucalyptus: an open-source cloud computing infrastructure

    International Nuclear Information System (INIS)

    Nurmi, Daniel; Wolski, Rich; Grzegorczyk, Chris; Obertelli, Graziano; Soman, Sunil; Youseff, Lamia; Zagorodnov, Dmitrii

    2009-01-01

    Utility computing, elastic computing, and cloud computing are all terms that refer to the concept of dynamically provisioning processing time and storage space from a ubiquitous 'cloud' of computational resources. Such systems allow users to acquire and release the resources on demand and provide ready access to data from processing elements, while relegating the physical location and exact parameters of the resources. Over the past few years, such systems have become increasingly popular, but nearly all current cloud computing offerings are either proprietary or depend upon software infrastructure that is invisible to the research community. In this work, we present Eucalyptus, an open-source software implementation of cloud computing that utilizes compute resources that are typically available to researchers, such as clusters and workstation farms. In order to foster community research exploration of cloud computing systems, the design of Eucalyptus emphasizes modularity, allowing researchers to experiment with their own security, scalability, scheduling, and interface implementations. In this paper, we outline the design of Eucalyptus, describe our own implementations of the modular system components, and provide results from experiments that measure performance and scalability of a Eucalyptus installation currently deployed for public use. The main contribution of our work is the presentation of the first research-oriented open-source cloud computing system focused on enabling methodical investigations into the programming, administration, and deployment of systems exploring this novel distributed computing model.
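
    Because Eucalyptus exposes an EC2-compatible interface, standard AWS client libraries can typically be pointed at a private installation. The sketch below uses boto3 against a hypothetical endpoint; the URL, credentials, image ID and instance type are placeholders, not real values.

```python
# Provision one instance on an EC2-compatible (e.g. Eucalyptus) endpoint via boto3.
import boto3

ec2 = boto3.client(
    "ec2",
    endpoint_url="https://eucalyptus.example.org:8773/services/compute",  # hypothetical
    aws_access_key_id="EXAMPLEKEY",
    aws_secret_access_key="EXAMPLESECRET",
    region_name="eucalyptus",
)

# Launch a single instance from a locally registered machine image.
resp = ec2.run_instances(ImageId="emi-0123456789abcdef", MinCount=1, MaxCount=1,
                         InstanceType="m1.small")
print(resp["Instances"][0]["InstanceId"])
```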

  1. A Process for the Representation of openEHR ADL Archetypes in OWL Ontologies.

    Science.gov (United States)

    Porn, Alex Mateus; Peres, Leticia Mara; Didonet Del Fabro, Marcos

    2015-01-01

    ADL is a formal language to express archetypes, independent of standards or domain. However, its specification is not precise enough in relation to the specialization and semantics of archetypes, presenting difficulties for implementation and offering few available tools. Archetypes may be implemented using other languages such as XML or OWL, increasing integration with Semantic Web tools. Exchanging and transforming data can be better implemented with semantics-oriented models, for example using OWL, which is a language to define and instantiate Web ontologies defined by the W3C. OWL permits the user to define significant, detailed, precise and consistent distinctions among classes, properties and relations, ensuring greater consistency of knowledge than ADL techniques. This paper presents a process for representing openEHR ADL archetypes in OWL ontologies. This process consists of converting ADL archetypes into OWL ontologies and validating the resulting OWL ontologies using mutation testing.
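
    The paper's actual mapping rules are not reproduced here; the sketch below only illustrates, with the owlready2 library, the kind of OWL class and data property an ADL archetype node might be converted into. The archetype name, attribute and ontology IRI are hypothetical.

```python
# Minimal OWL structure for a hypothetical archetype, built with owlready2.
from owlready2 import get_ontology, Thing, DataProperty

onto = get_ontology("http://example.org/openehr-archetypes.owl")  # hypothetical IRI

with onto:
    class BloodPressureObservation(Thing):      # class derived from an archetype
        pass

    class has_systolic_value(DataProperty):     # attribute from an archetype node
        domain = [BloodPressureObservation]
        range = [float]

obs = BloodPressureObservation("obs_001")       # an instance conforming to the class
obs.has_systolic_value = [120.0]
onto.save(file="openehr-archetypes.owl", format="rdfxml")
```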

  2. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    OpenAIRE

    Sanggoo Kang; Kiwon Lee

    2016-01-01

    Cloud computing is a base platform for the distribution of large volumes of data and high-performance image processing on the Web. Despite wide applications in Web-based services and their many benefits, geo-spatial applications based on cloud computing technology are still developing. Auto-scaling realizes automatic scalability, i.e., the scale-out and scale-in processing of virtual servers in a cloud computing environment. This study investigates the applicability of auto-scaling to geo-bas...

  3. The CANDU 9 distributed control system design process

    International Nuclear Information System (INIS)

    Harber, J.E.; Kattan, M.K.; Macbeth, M.J.

    1997-01-01

    Canadian designed CANDU pressurized heavy water nuclear reactors have been world leaders in electrical power generation. The CANDU 9 project is AECL's next reactor design. Plant control for the CANDU 9 station design is performed by a distributed control system (DCS) as compared to centralized control computers, analog control devices and relay logic used in previous CANDU designs. The selection of a DCS as the platform to perform the process control functions and most of the data acquisition of the plant, is consistent with the evolutionary nature of the CANDU technology. The control strategies for the DCS control programs are based on previous CANDU designs but are implemented on a new hardware platform taking advantage of advances in computer technology. This paper describes the design process for developing the CANDU 9 DCS. Various design activities, prototyping and analyses have been undertaken in order to ensure a safe, functional, and cost-effective design. (author)

  4. Biointervention makes leather processing greener: an integrated cleansing and tanning system.

    Science.gov (United States)

    Thanikaivelan, Palanisamy; Rao, Jonnalagadda Raghava; Nair, Balachandran Unni; Ramasami, Thirumalachari

    2003-06-01

    The do-undo methods adopted in conventional leather processing generate huge amounts of pollutants. In other words, conventional methods employed in leather processing subject the skin/hide to wide variations in pH. Pretanning and tanning processes alone contribute more than 90% of the total pollution from leather processing. Included in this is a great deal of solid wastes such as lime and chrome sludge. In the approach described here, the hair and flesh removal as well as fiber opening have been achieved using biocatalysts at pH 8.0 for cow hides. This was followed by a pickle-free chrome tanning, which does not require a basification step. Hence, this tanning technique involves primarily three steps, namely, dehairing, fiber opening, and tanning. It has been found that the extent of hair removal, opening up of fiber bundles, and penetration and distribution of chromium are comparable to that produced by traditional methods. This has been substantiated through scanning electron microscopic, stratigraphic chrome distribution analysis, and softness measurements. Performance of the leathers is shown to be on par with conventionally processed leathers through physical and hand evaluation. Importantly, softness of the leathers is numerically proven to be comparable with that of control. The process also demonstrates reduction in chemical oxygen demand load by 80%, total solids load by 85%, and chromium load by 80% as compared to the conventional process, thereby leading toward zero discharge. The input-output audit shows that the biocatalytic three-step tanning process employs a very low amount of chemicals, thereby reducing the discharge by 90% as compared to the conventional multistep processing. Furthermore, it is also demonstrated that the process is technoeconomically viable.

  5. Parallel Distributed Processing Theory in the Age of Deep Networks.

    Science.gov (United States)

    Bowers, Jeffrey S

    2017-12-01

    Parallel distributed processing (PDP) models in psychology are the precursors of deep networks used in computer science. However, only PDP models are associated with two core psychological claims, namely that all knowledge is coded in a distributed format and cognition is mediated by non-symbolic computations. These claims have long been debated in cognitive science, and recent work with deep networks speaks to this debate. Specifically, single-unit recordings show that deep networks learn units that respond selectively to meaningful categories, and researchers are finding that deep networks need to be supplemented with symbolic systems to perform some tasks. Given the close links between PDP and deep networks, it is surprising that research with deep networks is challenging PDP theory. Copyright © 2017. Published by Elsevier Ltd.

  6. Unified theory for stochastic modelling of hydroclimatic processes: Preserving marginal distributions, correlation structures, and intermittency

    Science.gov (United States)

    Papalexiou, Simon Michael

    2018-05-01

    Hydroclimatic processes come in all "shapes and sizes". They are characterized by different spatiotemporal correlation structures and probability distributions that can be continuous, mixed-type, discrete or even binary. Simulating such processes by reproducing precisely their marginal distribution and linear correlation structure, including features like intermittency, can greatly improve hydrological analysis and design. Traditionally, modelling schemes are case specific and typically attempt to preserve few statistical moments providing inadequate and potentially risky distribution approximations. Here, a single framework is proposed that unifies, extends, and improves a general-purpose modelling strategy, based on the assumption that any process can emerge by transforming a specific "parent" Gaussian process. A novel mathematical representation of this scheme, introducing parametric correlation transformation functions, enables straightforward estimation of the parent-Gaussian process yielding the target process after the marginal back transformation, while it provides a general description that supersedes previous specific parameterizations, offering a simple, fast and efficient simulation procedure for every stationary process at any spatiotemporal scale. This framework, also applicable for cyclostationary and multivariate modelling, is augmented with flexible parametric correlation structures that parsimoniously describe observed correlations. Real-world simulations of various hydroclimatic processes with different correlation structures and marginals, such as precipitation, river discharge, wind speed, humidity, extreme events per year, etc., as well as a multivariate example, highlight the flexibility, advantages, and complete generality of the method.
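
    A minimal sketch of the parent-Gaussian idea described above: simulate a Gaussian AR(1) series and map it through the Gaussian CDF and the inverse CDF of the target marginal (here a gamma distribution). The correlation transformation function that corrects the parent autocorrelation is omitted for brevity, and all parameters are illustrative.

```python
# Transform a parent Gaussian AR(1) process to a target gamma marginal.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def parent_gaussian_ar1(n, rho):
    """Standard-normal AR(1) series with lag-1 correlation rho."""
    z = np.empty(n)
    z[0] = rng.standard_normal()
    for t in range(1, n):
        z[t] = rho * z[t - 1] + np.sqrt(1 - rho**2) * rng.standard_normal()
    return z

z = parent_gaussian_ar1(50000, rho=0.7)
u = stats.norm.cdf(z)                          # uniform scores from the parent process
x = stats.gamma(a=0.8, scale=2.0).ppf(u)       # back-transform to the target marginal

print("target mean/var:", x.mean(), x.var())
print("lag-1 correlation of x:", np.corrcoef(x[:-1], x[1:])[0, 1])
```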

  7. Student Support in Open Learning: Sustaining the Process.

    Science.gov (United States)

    Dearnley, Christine

    2003-01-01

    A 2-year study included interviews with 18 and survey of 160 nurses studying through open learning in the United Kingdom. They were challenged by returning to study, requiring time management and technological skills. Professional, academic, and social networks provided important support as life responsibilities and events impinged on learning.…

  8. Distributed and cooperative task processing: Cournot oligopolies on a graph.

    Science.gov (United States)

    Pavlic, Theodore P; Passino, Kevin M

    2014-06-01

    This paper introduces a novel framework for the design of distributed agents that must complete externally generated tasks but also can volunteer to process tasks encountered by other agents. To reduce the computational and communication burden of coordination between agents to perfectly balance load around the network, the agents adjust their volunteering propensity asynchronously within a fictitious trading economy. This economy provides incentives for nontrivial levels of volunteering for remote tasks, and thus load is shared. Moreover, the combined effects of diminishing marginal returns and network topology lead to competitive equilibria that have task reallocations that are qualitatively similar to what is expected in a load-balancing system with explicit coordination between nodes. In the paper, topological and algorithmic conditions are given that ensure the existence and uniqueness of a competitive equilibrium. Additionally, a decentralized distributed gradient-ascent algorithm is given that is guaranteed to converge to this equilibrium while not causing any node to over-volunteer beyond its maximum task-processing rate. The framework is applied to an autonomous-air-vehicle example, and connections are drawn to classic studies of the evolution of cooperation in nature.

  9. Palantiri: a distributed real-time database system for process control

    International Nuclear Information System (INIS)

    Tummers, B.J.; Heubers, W.P.J.

    1992-01-01

    The medium-energy accelerator MEA, located in Amsterdam, is controlled by a heterogeneous computer network. A large real-time database contains the parameters involved in the control of the accelerator and the experiments. This database system was implemented about ten years ago and has since been extended several times. In response to increased needs, the database system has been redesigned. The new database environment, as described in this paper, consists of two new concepts: (1) a Palantir, which is a per-machine process that stores the locally declared data and forwards all non-local requests for data access to the appropriate machine; it acts as a storage device for data and a looking glass upon the world. (2) Golems: working units that define the data within the Palantir and that have knowledge of the hardware they control. Applications access the data of a Golem by name (names resemble Unix path names). The Palantir that runs on the same machine as the application handles the distribution of access requests. This paper focuses on the Palantir concept as a distributed data storage and event handling device for process control. (author)
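
    This is not the MEA implementation, only a toy illustration of the access pattern described: a per-machine store addressed by Unix-like path names that forwards requests for names it does not own to the responsible remote Palantir (the forwarding step is stubbed out here, and all names are hypothetical).

```python
# Toy path-named data store illustrating the Palantir access pattern.
class Palantir:
    def __init__(self, machine_name):
        self.machine = machine_name
        self.local_data = {}           # names declared by local Golems

    def declare(self, path, value):
        """A local Golem declares a named datum, e.g. '/mea/magnets/dipole1/current'."""
        self.local_data[path] = value

    def read(self, path):
        if path in self.local_data:
            return self.local_data[path]
        return self._forward(path)     # non-local name: ask the owning machine

    def _forward(self, path):
        raise NotImplementedError(f"{self.machine}: would forward request for {path}")

p = Palantir("control-node-1")
p.declare("/mea/magnets/dipole1/current", 312.5)
print(p.read("/mea/magnets/dipole1/current"))
```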

  10. OpenAPC. Open-Access-Publikationskosten als Open Data

    OpenAIRE

    Tullney, Marco

    2015-01-01

    Presentation slides for the talk „OpenAPC. Open-Access-Publikationskosten als Open Data“ (OpenAPC: open access publication costs as open data), given in the session „Ausgestaltung eines wissenschaftsadäquaten APC-Marktes: Grundsätze, Finanzierungsansätze und Management“ (shaping an APC market adequate to scholarship: principles, funding approaches and management) at the Open Access Days 2015 in Zurich (https://www.open-access.net/community/open-access-tage/open-access-tage-2015-zuerich/programm/#c1974)

  11. The textual characteristics of traditional and Open Access scientific journals are similar.

    Science.gov (United States)

    Verspoor, Karin; Cohen, K Bretonnel; Hunter, Lawrence

    2009-06-15

    Recent years have seen an increased amount of natural language processing (NLP) work on full text biomedical journal publications. Much of this work is done with Open Access journal articles. Such work assumes that Open Access articles are representative of biomedical publications in general and that methods developed for analysis of Open Access full text publications will generalize to the biomedical literature as a whole. If this assumption is wrong, the cost to the community will be large, including not just wasted resources, but also flawed science. This paper examines that assumption. We collected two sets of documents, one consisting only of Open Access publications and the other consisting only of traditional journal publications. We examined them for differences in surface linguistic structures that have obvious consequences for the ease or difficulty of natural language processing and for differences in semantic content as reflected in lexical items. Regarding surface linguistic structures, we examined the incidence of conjunctions, negation, passives, and pronominal anaphora, and found that the two collections did not differ. We also examined the distribution of sentence lengths and found that both collections were characterized by the same mode. Regarding lexical items, we found that the Kullback-Leibler divergence between the two collections was low, and was lower than the divergence between either collection and a reference corpus. Where small differences did exist, log likelihood analysis showed that they were primarily in the area of formatting and in specific named entities. We did not find structural or semantic differences between the Open Access and traditional journal collections.
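
    A minimal sketch of the lexical comparison described above: estimate smoothed unigram distributions for two document collections and compute their Kullback-Leibler divergence. The two toy collections below merely stand in for the Open Access and traditional corpora used in the study.

```python
# KL divergence between smoothed word distributions of two document collections.
from collections import Counter
import math

def word_distribution(docs, vocab, alpha=1.0):
    """Add-alpha smoothed unigram distribution over a shared vocabulary."""
    counts = Counter(w for d in docs for w in d.lower().split())
    total = sum(counts[w] + alpha for w in vocab)
    return {w: (counts[w] + alpha) / total for w in vocab}

def kl_divergence(p, q):
    return sum(p[w] * math.log(p[w] / q[w]) for w in p)

open_access = ["gene expression was measured in the cohort"]      # toy stand-in corpus
traditional = ["expression of the gene was assessed in patients"] # toy stand-in corpus
vocab = set(w for d in open_access + traditional for w in d.lower().split())

p = word_distribution(open_access, vocab)
q = word_distribution(traditional, vocab)
print("KL(P || Q) =", kl_divergence(p, q))
```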

  12. Methods, media and systems for managing a distributed application running in a plurality of digital processing devices

    Science.gov (United States)

    Laadan, Oren; Nieh, Jason; Phung, Dan

    2012-10-02

    Methods, media and systems for managing a distributed application running in a plurality of digital processing devices are provided. In some embodiments, a method includes running one or more processes associated with the distributed application in virtualized operating system environments on a plurality of digital processing devices, suspending the one or more processes, and saving network state information relating to network connections among the one or more processes. The method further includes storing process information relating to the one or more processes, recreating the network connections using the saved network state information, and restarting the one or more processes using the stored process information.

  13. Implementation of the U.S. Environmental Protection Agency's Waste Reduction (WAR) Algorithm in Cape-Open Based Process Simulators

    Science.gov (United States)

    The Sustainable Technology Division has recently completed an implementation of the U.S. EPA's Waste Reduction (WAR) Algorithm that can be directly accessed from a Cape-Open compliant process modeling environment. The WAR Algorithm add-in can be used in AmsterChem's COFE (Cape-Op...

  14. An open experiment of a submilli-PIXE camera

    International Nuclear Information System (INIS)

    Matsuyama, S.; Ishii, K.; Yamazaki, H.; Iwasaki, S.; Tokai, Y.; Sugimoto, A.; Endo, H.; Ozawa, T.

    1999-01-01

    We have held an open experiment on PIXE analysis annually since 1996 to foster public understanding of nuclear technology and radiation science. Up to the present, more than 270 participants have joined and enjoyed the open experiments. This year, we demonstrated the performance of a submilli-PIXE camera and had sixty-nine participants in the open experiment on PIXE. Elemental spatial distribution images made a deep impression on the participants. Half of the participants were high school students, since the open experiment on PIXE was held during the open campus period of Tohoku University. About ten percent of the participants were junior high school students. Our open experiment on PIXE was very effective in arousing public interest in radiation science and nuclear technology. (author)

  15. Distributed process control system for remote control and monitoring of the TFTR tritium systems

    International Nuclear Information System (INIS)

    Schobert, G.; Arnold, N.; Bashore, D.; Mika, R.; Oliaro, G.

    1989-01-01

    This paper reviews the progress made in the application of a commercially available distributed process control system to support the requirements established for the Tritium REmote Control And Monitoring System (TRECAMS) of the Tokamak Fusion Test Reactor (TFTR). The system that will be discussed was purchased from the Texas Instruments (TI) Automation Controls Division, previously marketed by Rexnord Automation. It consists of three fully redundant, distributed process controllers interfaced to over 1800 analog and digital I/O points. The operator consoles located throughout the facility are supported by four Digital Equipment Corporation (DEC) PDP-11/73 computers. The PDP-11/73's and the three process controllers communicate over a fully redundant one-megabaud fiber optic network. All system functionality is based on a set of completely integrated databases loaded to the process controllers and the PDP-11/73's. (author). 2 refs.; 2 figs

  16. Design and simulation for real-time distributed processing systems

    International Nuclear Information System (INIS)

    Legrand, I.C.; Gellrich, A.; Gensah, U.; Leich, H.; Wegner, P.

    1996-01-01

    The aim of this work is to provide a proper framework for the simulation and optimization of the event building, the on-line third level trigger, and the complete event reconstruction processor farm for the future HERA-B experiment. A discrete-event, process-oriented simulation developed in concurrent μC++ is used for modelling the farm nodes, running with multi-tasking constraints, and the different types of switching elements and digital signal processors interconnected for distributing the data through the system. An adequate graphic interface to the simulation, which allows monitoring features on-line and analyzing trace files, provides a powerful development tool for evaluating and designing parallel processing architectures. Control software and data flow protocols for event building and dynamic processor allocation are presented for two architectural models. (author)
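
    A sketch of a discrete-event, process-oriented farm model in Python/simpy rather than the concurrent μC++ used in the paper: events arrive at random, wait for one of a pool of reconstruction nodes, and occupy it for a random service time. All rates and capacities are illustrative.

```python
# Discrete-event model of a small processor farm using simpy.
import random
import simpy

random.seed(0)

def event_source(env, farm, interarrival_s=0.01):
    i = 0
    while True:
        yield env.timeout(random.expovariate(1.0 / interarrival_s))
        env.process(process_event(env, farm, i))
        i += 1

def process_event(env, farm, event_id, mean_service_s=0.03):
    with farm.request() as node:          # wait until a farm node is free
        yield node
        yield env.timeout(random.expovariate(1.0 / mean_service_s))

env = simpy.Environment()
farm = simpy.Resource(env, capacity=4)    # four reconstruction nodes
env.process(event_source(env, farm))
env.run(until=10.0)                       # simulate 10 seconds of data taking
print("simulation finished at t =", env.now)
```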

  17. Software/hardware distributed processing network supporting the Ada environment

    Science.gov (United States)

    Wood, Richard J.; Pryk, Zen

    1993-09-01

    A high-performance, fault-tolerant, distributed network has been developed, tested, and demonstrated. The network is based on the MIPS Computer Systems, Inc. R3000 Risc for processing, VHSIC ASICs for high speed, reliable, inter-node communications and compatible commercial memory and I/O boards. The network is an evolution of the Advanced Onboard Signal Processor (AOSP) architecture. It supports Ada application software with an Ada- implemented operating system. A six-node implementation (capable of expansion up to 256 nodes) of the RISC multiprocessor architecture provides 120 MIPS of scalar throughput, 96 Mbytes of RAM and 24 Mbytes of non-volatile memory. The network provides for all ground processing applications, has merit for space-qualified RISC-based network, and interfaces to advanced Computer Aided Software Engineering (CASE) tools for application software development.

  18. Open Standards, Open Source, and Open Innovation: Harnessing the Benefits of Openness

    Science.gov (United States)

    Committee for Economic Development, 2006

    2006-01-01

    Digitization of information and the Internet have profoundly expanded the capacity for openness. This report details the benefits of openness in three areas--open standards, open-source software, and open innovation--and examines the major issues in the debate over whether openness should be encouraged or not. The report explains each of these…

  19. The Different Role of Working Memory in Open-Ended versus Closed-Ended Creative Problem Solving: A Dual-Process Theory Account

    Science.gov (United States)

    Lin, Wei-Lun; Lien, Yunn-Wen

    2013-01-01

    This study examined how working memory plays different roles in open-ended versus closed-ended creative problem-solving processes, as represented by divergent thinking tests and insight problem-solving tasks. With respect to the analysis of different task demands and the framework of dual-process theories, the hypothesis was that the idea…

  20. Quantifying evenly distributed states in exclusion and nonexclusion processes

    Science.gov (United States)

    Binder, Benjamin J.; Landman, Kerry A.

    2011-04-01

    Spatial-point data sets, generated from a wide range of physical systems and mathematical models, can be analyzed by counting the number of objects in equally sized bins. We find that the bin counts are related to the Pólya distribution. New measures are developed which indicate whether or not a spatial data set, generated from an exclusion process, is at its most evenly distributed state, the complete spatial randomness (CSR) state. To this end, we define an index in terms of the variance between the bin counts. Limiting values of the index are determined when objects have access to the entire domain and when there are subregions of the domain that are inaccessible to objects. Using three case studies (Lagrangian fluid particles in chaotic laminar flows, cellular automata agents in discrete models, and biological cells within colonies), we calculate the indexes and verify that our theoretical CSR limit accurately predicts the state of the system. These measures should prove useful in many biological applications.
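
    As a rough illustration of the bin-count approach described above, the sketch below bins a two-dimensional point set and compares the variance of the counts with the variance expected under complete spatial randomness. The exact index and its Pólya-distribution limits from the paper are not reproduced; the binomial normalization and the test data are assumptions made purely for illustration.

```python
# Illustrative sketch only: bins a 2D point set and computes a variance-based
# evenness index. The Polya-distribution limits derived in the paper are not
# reproduced; normalizing by the binomial (CSR) variance is an assumption.
import numpy as np

def evenness_index(points, domain=(1.0, 1.0), bins=(10, 10)):
    """Ratio of observed bin-count variance to the variance expected under
    complete spatial randomness (CSR) for the same bin layout."""
    counts, _, _ = np.histogram2d(points[:, 0], points[:, 1], bins=bins,
                                  range=[(0, domain[0]), (0, domain[1])])
    counts = counts.ravel()
    n, m = len(points), counts.size
    # Under CSR each object falls in a given bin with probability 1/m,
    # so the bin counts are Binomial(n, 1/m).
    csr_var = n * (1.0 / m) * (1.0 - 1.0 / m)
    return counts.var() / csr_var

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    uniform_pts = rng.random((1000, 2))                          # near CSR -> index ~ 1
    clustered_pts = rng.normal(0.5, 0.05, (1000, 2)).clip(0, 1)  # clustered -> index >> 1
    print(evenness_index(uniform_pts), evenness_index(clustered_pts))
```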

  1. Torii, an Open Portal over Open Archives

    CERN Document Server

    Bertocco, S

    2001-01-01

    The world of academic publishing is undergoing many changes. Everywhere paper-based publishing is being replaced by electronic archives and ink printing by bits. Unrestricted (web) access to many resources is becoming a fundamental feature of the academic research environment. Particularly in the high-energy physics community, the pre-print distribution has moved completely away from the paper-based system into a fully electronic system based on open archives. At the same time, freely accessible peer-reviewed journals have started to challenge the more traditional, and paper-based journals showing that the entire paper-based cycle can be effectively replaced by a web-based one. The TIPS project was born in this environment and from these observations. It is based on the idea that further progress in information distribution and scientific publishing on the web requires some key ingredients: the implementation of a more extensive semantic structure in the documents that are exchanged; a unified, desktop-like, ...

  2. Spatial Data Exploring by Satellite Image Distributed Processing

    Science.gov (United States)

    Mihon, V. D.; Colceriu, V.; Bektas, F.; Allenbach, K.; Gvilava, M.; Gorgan, D.

    2012-04-01

    Our society's needs and environmental predictions encourage the development of applications oriented toward supervising and analyzing different Earth Science related phenomena. Satellite images can be explored to discover information concerning land cover, hydrology, air quality, and water and soil pollution. Spatial and environment related data can be acquired by imagery classification consisting of data mining throughout the multispectral bands. The process takes into account a large set of variables such as satellite image types (e.g. MODIS, Landsat), the particular geographic area, soil composition, vegetation cover, and, generally, the context (e.g. clouds, snow, and season). All these specific and variable conditions require flexible tools and applications to support an optimal search for the appropriate solutions, as well as high-power computation resources. The research concerns experiments on solutions for using flexible and visual descriptions of satellite image processing over distributed infrastructures (e.g. Grid, Cloud, and GPU clusters). This presentation highlights the Grid-based implementation of the GreenLand application. The GreenLand application development is based on simple, but powerful, notions of mathematical operators and workflows that are used in distributed and parallel executions over the Grid infrastructure. Currently it is used in three major case studies concerning the Istanbul geographical area, the Rioni River in Georgia, and the Black Sea catchment region. The GreenLand application offers a friendly user interface for viewing and editing workflows and operators. The description involves the basic operators provided by the GRASS [1] library as well as many other image related operators supported by the ESIP platform [2]. The processing workflows are represented as directed graphs, giving the user a fast and easy way to describe complex parallel algorithms without having any prior knowledge of any programming language or application commands

  3. On the joint distribution of excursion duration and amplitude of a narrow-band Gaussian process

    DEFF Research Database (Denmark)

    Ghane, Mahdi; Gao, Zhen; Blanke, Mogens

    2018-01-01

    The probability density of crest amplitude and of duration of exceeding a given level are used in many theoretical and practical problems in engineering. The joint density is essential for design of constructions that are subjected to waves and wind. The presently available joint distributions of amplitude and period are limited to excursion through a mean-level or to describe the asymptotic behavior of high level excursions. This paper extends the knowledge by presenting a theoretical derivation of probability of wave exceedance amplitude and duration, for a narrow-band Gaussian process... distribution, as expected, and that the marginal distribution of excursion duration works both for asymptotic and non-asymptotic cases. The suggested model is found to be a good replacement for the empirical distributions that are widely used. Results from simulations of narrow-band Gaussian processes, real...

  4. Harness That S.O.B.: Distributing Remote Sensing Analysis in a Small Office/Business

    Science.gov (United States)

    Kramer, J.; Combe, J.; McCord, T. B.

    2009-12-01

    Researchers in a small office/business (SOB) operate with limited funding, equipment, and software availability. To mitigate these issues, we developed a distributed computing framework that: 1) leverages open source software to implement functionality otherwise reliant on proprietary software and 2) harnesses the unused power of (semi-)idle office computers with mixed operating systems (OSes). This abstract outlines some reasons for the effort, its conceptual basis and implementation, and provides brief speedup results. The Multiple-Endmember Linear Spectral Unmixing Model (MELSUM)1 processes remote-sensing (hyper-)spectral images. The algorithm is computationally expensive, sometimes taking a full week or more for a 1 million pixel/100 wavelength image. Analysis of pixels is independent, so a large benefit can be gained from parallel processing techniques. Job concurrency is limited by the number of active processing units. MELSUM was originally written in the Interactive Data Language (IDL). Despite its multi-threading capabilities, an IDL instance executes on a single machine, and so concurrency is limited by the machine's number of central processing units (CPUs). Network distribution can access more CPUs to provide a greater speedup, while also taking advantage of (often) underutilized extant equipment. Appropriately integrating open source software magnifies the impact by avoiding the purchase of additional licenses. Our method of distribution breaks into four conceptual parts: 1) the top- or task-level user interface; 2) a mid-level program that manages hosts and jobs, called the distribution server; 3) a low-level executable for individual pixel calculations; and 4) a control program to synchronize sequential sub-tasks. Each part is a separate OS process, passing information via shell commands and/or temporary files. While the control and low-level executables are short-lived, the top-level program and distribution server run (at least) for the entirety of
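
    The following toy sketch mimics, on a single machine, the distribution-server idea of queueing independent pixel blocks to idle processing units. It uses Python's multiprocessing pool rather than the shell-command and temporary-file mechanism described above, and the unmix_chunk placeholder stands in for the low-level per-pixel executable; none of these names come from the original framework.

```python
# Toy single-machine analogue of the distribution server: split a spectral
# image into independent pixel blocks and process them in parallel workers.
# 'unmix_chunk' is a hypothetical placeholder for the low-level per-pixel
# executable; the real framework coordinates mixed-OS hosts over a network.
from multiprocessing import Pool
import numpy as np

def unmix_chunk(job):
    chunk_id, spectra = job            # spectra: (pixels, wavelengths)
    # Placeholder per-pixel work; the real code solves a constrained
    # linear spectral unmixing problem for every pixel.
    return chunk_id, spectra.mean(axis=1)

def distribute(image, n_workers=4, n_chunks=16):
    jobs = list(enumerate(np.array_split(image, n_chunks)))   # independent blocks
    with Pool(processes=n_workers) as pool:
        results = dict(pool.map(unmix_chunk, jobs))           # hand jobs to idle workers
    return np.concatenate([results[i] for i in range(n_chunks)])

if __name__ == "__main__":
    img = np.random.rand(10_000, 100)      # 10k pixels x 100 wavelengths
    print(distribute(img).shape)
```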

  5. Process Design and Economics for the Production of Algal Biomass: Algal Biomass Production in Open Pond Systems and Processing Through Dewatering for Downstream Conversion

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Ryan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Markham, Jennifer [National Renewable Energy Lab. (NREL), Golden, CO (United States); Kinchin, Christopher [National Renewable Energy Lab. (NREL), Golden, CO (United States); Grundl, Nicholas [National Renewable Energy Lab. (NREL), Golden, CO (United States); Tan, Eric C.D. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Humbird, David [DWH Process Consulting, Denver, CO (United States)

    2016-02-17

    This report describes in detail a set of aspirational design and process targets to better understand the realistic economic potential for the production of algal biomass for subsequent conversion to biofuels and/or coproducts, based on the use of open pond cultivation systems and a series of dewatering operations to concentrate the biomass up to 20 wt% solids (ash-free dry weight basis).

  6. Using Java for distributed computing in the Gaia satellite data processing

    Science.gov (United States)

    O'Mullane, William; Luri, Xavier; Parsons, Paul; Lammers, Uwe; Hoar, John; Hernandez, Jose

    2011-10-01

    In recent years Java has matured to a stable easy-to-use language with the flexibility of an interpreter (for reflection etc.) but the performance and type checking of a compiled language. When we started using Java for astronomical applications around 1999 they were the first of their kind in astronomy. Now a great deal of astronomy software is written in Java as are many business applications. We discuss the current environment and trends concerning the language and present an actual example of scientific use of Java for high-performance distributed computing: ESA's mission Gaia. The Gaia scanning satellite will perform a galactic census of about 1,000 million objects in our galaxy. The Gaia community has chosen to write its processing software in Java. We explore the manifold reasons for choosing Java for this large science collaboration. Gaia processing is numerically complex but highly distributable, some parts being embarrassingly parallel. We describe the Gaia processing architecture and its realisation in Java. We delve into the astrometric solution which is the most advanced and most complex part of the processing. The Gaia simulator is also written in Java and is the most mature code in the system. This has been successfully running since about 2005 on the supercomputer "Marenostrum" in Barcelona. We relate experiences of using Java on a large shared machine. Finally we discuss Java, including some of its problems, for scientific computing.

  7. 12th International Symposium on Open Collaboration Companion

    CERN Document Server

    2016-01-01

    Welcome to the proceedings of OpenSym 2016, the 12th international symposium on open collaboration! Open collaboration is collaboration that is egalitarian (everyone can join, no principled or artificial barriers to participation exist), meritocratic (decisions and status are merit-based rather than imposed) and self-organizing (processes adapt to people rather than people adapt to predefined processes).

  8. Quasi-static time-series simulation using OpenDSS in IEEE distribution feeder model with high PV penetration and its impact on solar forecasting

    Science.gov (United States)

    Mohammed, Touseef Ahmed Faisal

    Since 2000, renewable electricity installations in the United States (excluding hydropower) have more than tripled. Renewable electricity has grown at a compounded annual average of nearly 14% per year from 2000-2010. Wind, Concentrated Solar Power (CSP) and solar Photo Voltaic (PV) are the fastest growing renewable energy sectors. In 2010 in the U.S., solar PV grew over 71% and CSP grew by 18% from the previous year. Globally, renewable electricity installations have more than quadrupled from 2000-2010. Solar PV generation grew by a factor of more than 28 between 2000 and 2010. The amount of CSP and solar PV installations is increasing on the distribution grid. These PV installations transmit electrical current from the load centers to the generating stations, but the transmission and distribution grid has been designed for uni-directional flow of electrical energy from generating stations to load centers. This causes imbalances in the voltage and switchgear of the electrical circuitry. With the continuous rise in PV installations, analysis of voltage profiles and penetration levels remains an active area of research. Standard distributed photovoltaic (PV) generators represented in simulation studies do not reflect the exact location and variability properties such as the distance between interconnection points and substations, voltage regulators, solar irradiance and other environmental factors. Quasi-static simulations assist in peak load planning hour and day ahead, as they give a time-sequence analysis to help in generation allocation. Simulation models can be daily, hourly or yearly depending on the duty cycle and dynamics of the system. High penetration of PV into the power grid changes the voltage profile and power flow dynamically in the distribution circuits due to the inherent variability of PV. There are a number of modeling and simulation tools available for the study of such high penetration PV scenarios. This thesis will specifically utilize OpenDSS, an open source
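
    A hedged sketch of a quasi-static daily time-series run is given below, assuming the OpenDSSDirect.py bindings ('opendssdirect') and an IEEE test feeder master file on disk; the feeder path, bus name, PV sizing and load-shape values are illustrative assumptions rather than details taken from the thesis.

```python
# Assumed setup: OpenDSSDirect.py installed and an IEEE 123-bus master file
# at the hypothetical path below; values are illustrative, not from the thesis.
import opendssdirect as dss

dss.run_command("Compile ieee123/IEEE123Master.dss")          # hypothetical path

# Attach an assumed normalized daily PV output profile and a PV system
dss.run_command("New Loadshape.pv_profile npts=24 interval=1 "
                "mult=(0 0 0 0 0 .1 .3 .5 .7 .85 .95 1 .95 .85 .7 .5 .3 .1 0 0 0 0 0 0)")
dss.run_command("New PVSystem.pv1 bus1=83 kVA=500 Pmpp=500 daily=pv_profile")

# Quasi-static time series: one power-flow solution per hour over a day
dss.run_command("Set Mode=daily stepsize=1h number=1")
worst_voltage = []
for hour in range(24):
    dss.run_command("Solve")
    worst_voltage.append(max(dss.Circuit.AllBusMagPu()))      # highest per-unit voltage
print(worst_voltage)
```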

  9. Understanding User Behavioral Patterns in Open Knowledge Communities

    Science.gov (United States)

    Yang, Xianmin; Song, Shuqiang; Zhao, Xinshuo; Yu, Shengquan

    2018-01-01

    Open knowledge communities (OKCs) have become popular in the era of knowledge economy. This study aimed to explore how users collaboratively create and share knowledge in OKCs. In particular, this research identified the behavior distribution and behavioral patterns of users by conducting frequency distribution and lag sequential analyses. Some…

  10. Gibbs' theorem for open systems with incomplete statistics

    International Nuclear Information System (INIS)

    Bagci, G.B.

    2009-01-01

    Gibbs' theorem, which was originally intended for canonical ensembles with complete statistics, has been generalized to open systems with incomplete statistics. As a result of this generalization, it is shown that the stationary equilibrium distribution of inverse power law form associated with the incomplete statistics has maximum entropy even for open systems with energy or matter influx. The renormalized entropy definition given in this paper can also serve as a measure of self-organization in open systems described by incomplete statistics.

  11. Late Quaternary sedimentary dynamics in Western Amazonia: Implications for the origin of open vegetation/forest contrasts

    Science.gov (United States)

    Rossetti, D. F.; Bertani, T. C.; Zani, H.; Cremon, E. H.; Hayakawa, E. H.

    2012-12-01

    This work investigated the evolution of sedimentary environments during the latest Quaternary and their influence on the paradoxical occurrence of open vegetation patches in sharp contact with the Amazonian forest. The approach integrated pre-existing geological and floristic data from lowlands in the Brazilian Amazonia, with remote sensing imagery including multispectral optical images (TM, ETM+, and ASTER), Phased Array L-band Synthetic Aperture Radar (PALSAR), InSAR C-band SRTM-DEMs, and high resolution images obtained from Google Earth™. The detection of an abundance of paleomorphologies provided evidence of a scenario in which constant environmental shifts were linked to the evolution of fluvial and megafan depositional systems. In all studied areas, the open vegetation patches are not random, but associated with sedimentary deposits representative of environments either deactivated during the Holocene or presently in the process of deactivation. Sedimentary evolution would have determined the distribution of wetlands and terra firme in many areas of the Amazonian lowlands, and would have a major impact on the development of open vegetated patches within the modern rainforest. Subsiding areas were filled up with megafan deposits, and many fluvial tributaries were rearranged on the landscape. The close relationship between vegetation and the physical environment suggests that sedimentary history related to the evolution of depositional settings during the latest Quaternary played a major role in the distribution of flooded and non-flooded areas of the Amazonian lowlands, with a direct impact on the distribution of modern floristic patterns. As the depositional sites were abandoned and their sedimentary deposits were exposed to the surface, they became sites suitable for vegetation growth, first of herbaceous species and then of forest. Although climate fluctuations might have been involved, fault reactivation appears to have been the main cause of changes in

  12. Digitization and the open management of data: New prospects for electricity distributors

    International Nuclear Information System (INIS)

    Derdevet, Michel

    2017-01-01

    At the core of both the energy transition and the digital revolution are the grids for distributing electricity during this era of big data. The electricity grid is becoming smarter and smarter, as it is equipped with sensors capable of providing information and data - the leading example being the 35 million smart electricity meters to be installed in French households by 2021. Backed by recent legislation, the trend toward open data is, for distributors, both a requirement and a lever: an economic, social and environmental requirement for enabling localities to prove their sense of responsibility and for developing innovative services for citizens; but also a lever for distributors to become operators who, processing dynamic data, are open to their ecosystem - a lever for making new business models emerge for the local management of energy

  13. Open Access. Chapter 6 of Scholarly Communication for Librarians.

    OpenAIRE

    Morrison, Heather

    2008-01-01

    In-depth overview of open access, covering definitions (open access publishing, open access archives, gratis and libre, open access works versus open access processes), major statements and declarations, types of open access, major initiatives, trends, advocacy and lobbying.

  14. Production of silver nanoparticles by laser ablation in open air

    International Nuclear Information System (INIS)

    Boutinguiza, M.; Comesaña, R.; Lusquiños, F.; Riveiro, A.; Val, J. del; Pou, J.

    2015-01-01

    Highlights: • Silver nanoparticles have been obtained by laser ablation of metallic Ag in open air using a nanosecond laser. • The continuous process enables increasing the production yield. • The obtained particles are round in shape with a narrow size distribution. - Abstract: Silver nanoparticles have attracted much attention as a subject of investigation due to their well-known properties, such as good conductivity, antibacterial and catalytic effects, etc. They are used in many different areas, such as medicine, industrial applications, scientific investigation, etc. There are different techniques for producing Ag nanoparticles: chemical, electrochemical, sonochemical, etc. These methods often lead to impurities together with the nanoparticles or colloidal solutions. In this work, laser ablation of solids in open air conditions (LASOA) is used to produce silver nanoparticles and collect them on glass substrates. Production and deposition of the silver nanoparticles are integrated in the same step to simplify the process. The obtained particles are analysed and the nanoparticle formation mechanism is discussed. The obtained nanoparticles were characterized by means of transmission electron microscopy (TEM), high-resolution transmission electron microscopy (HRTEM) and UV/VIS absorption spectroscopy. The obtained nanoparticles consisted of Ag particles of rounded shape with diameters ranging from a few to 50 nm.

  15. Production of silver nanoparticles by laser ablation in open air

    Energy Technology Data Exchange (ETDEWEB)

    Boutinguiza, M., E-mail: mohamed@uvigo.es [Applied Physics Department, University of Vigo EEI, Lagoas-Marcosende, 9. Vigo, 36310 (Spain); Comesaña, R. [Materials Engineering, Applied Mechanics and Construction Dpt., University of Vigo, EEI, Lagoas-Marcosende, Vigo, 36310 (Spain); Lusquiños, F.; Riveiro, A.; Val, J. del; Pou, J. [Applied Physics Department, University of Vigo EEI, Lagoas-Marcosende, 9. Vigo, 36310 (Spain)

    2015-05-01

    Highlights: • Silver nanoparticles have been obtained by laser ablation of metallic Ag in open air using a nanosecond laser. • The continuous process enables increasing the production yield. • The obtained particles are round in shape with a narrow size distribution. - Abstract: Silver nanoparticles have attracted much attention as a subject of investigation due to their well-known properties, such as good conductivity, antibacterial and catalytic effects, etc. They are used in many different areas, such as medicine, industrial applications, scientific investigation, etc. There are different techniques for producing Ag nanoparticles: chemical, electrochemical, sonochemical, etc. These methods often lead to impurities together with the nanoparticles or colloidal solutions. In this work, laser ablation of solids in open air conditions (LASOA) is used to produce silver nanoparticles and collect them on glass substrates. Production and deposition of the silver nanoparticles are integrated in the same step to simplify the process. The obtained particles are analysed and the nanoparticle formation mechanism is discussed. The obtained nanoparticles were characterized by means of transmission electron microscopy (TEM), high-resolution transmission electron microscopy (HRTEM) and UV/VIS absorption spectroscopy. The obtained nanoparticles consisted of Ag particles of rounded shape with diameters ranging from a few to 50 nm.

  16. Predicting cycle time distributions for integrated processing workstations : an aggregate modeling approach

    NARCIS (Netherlands)

    Veeger, C.P.L.; Etman, L.F.P.; Lefeber, A.A.J.; Adan, I.J.B.F.; Herk, van J.; Rooda, J.E.

    2011-01-01

    To predict cycle time distributions of integrated processing workstations, detailed simulation models are almost exclusively used; these models require considerable development and maintenance effort. As an alternative, we propose an aggregate model that is a lumped-parameter representation of the

  17. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    Science.gov (United States)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most of the existing software is desktop-based, designed to work on a single computer, which represents a major limitation in many ways, starting from limited processing and storage power, accessibility, availability, etc. The only feasible solution lies in the web and cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first one (VM1) is running on Amazon Web Services (AWS) and the second one (VM2) is running on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards and prototype code. The cloud application presents a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application is the ultimate collaboration geospatial platform because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available at all times, accessible from everywhere, scalable, runs in a distributed computing environment, creates a real-time multiuser collaboration platform, uses interoperable programming languages and components, and is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on the two VMs, which communicate over the internet to provide services to users. The application was tested on the Zletovica river basin case study with multiple concurrent users. The application is a state

  18. The complete information for phenomenal distributed parameter control of multicomponent chemical processes in gas, fluid and solid phase

    International Nuclear Information System (INIS)

    Niemiec, W.

    1985-01-01

    A constitutive mathematical model of the distributed parameters of multicomponent chemical processes in gas, fluid and solid phase is utilized for the realization of phenomenal distributed parameter control of these processes. Original systems of partial differential constitutive state equations, in the derivative forms /I/, /II/ and /III/, are solved in this paper from the point of view of information for phenomenal distributed parameter control of the considered processes. Obtained in this way for multicomponent chemical processes in gas, fluid and solid phase: - dynamical working space-time characteristics /analytical solutions in the working space-time of chemical reactors/, - dynamical phenomenal Green functions as working space-time transfer functions, - statical working space characteristics /analytical solutions in the working space of chemical reactors/, - statical phenomenal Green functions as working space transfer functions, are applied as information for the realization of constitutive distributed parameter control of the mass, energy and momentum aspects of the above processes. Two cases are considered, according to the existence of: A^o - initial conditions, B^o - initial and boundary conditions, for multicomponent chemical processes in gas, fluid and solid phase.

  19. Crowd innovation : The role of uncertainty for opening up the innovation process in the public sector

    OpenAIRE

    Collm, Alexandra; Schedler, Kuno

    2011-01-01

    Innovations are complex processes that can be created internally, caused externally or generated collectively with stakeholders. Integrating crowdsourcing and open innovation and supported by Web 2.0 technologies, a new innovation practice, crowd innovation, has emerged. In this paper, we illustrate empirically the practice of crowd innovation and discuss institutional obstacles, which exist for implementing crowd innovation in the public sector. Referring to the normative mode of publicness ...

  20. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    Science.gov (United States)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet, and use them towards running large scale environmental simulations and models to serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications, and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Users can easily enable their websites so that visitors can volunteer their computer resources to contribute to running advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational sizes. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform to enable large scale hydrological simulations and model runs in an open and integrated environment.
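
    The sketch below illustrates only the server-side queue-management idea mentioned in the abstract: a relational table of work units that volunteer nodes would claim and complete over HTTP. The schema, status values and job parameters are assumptions for illustration and are not taken from the platform described above.

```python
# Sketch of a relational work queue for volunteer nodes (assumed schema).
import sqlite3

def init_queue(db=":memory:"):
    con = sqlite3.connect(db)
    con.execute("""CREATE TABLE IF NOT EXISTS jobs (
                     id INTEGER PRIMARY KEY,
                     params TEXT NOT NULL,
                     status TEXT NOT NULL DEFAULT 'pending',
                     result TEXT)""")
    return con

def claim_job(con):
    """Hand one pending work unit to a volunteer node and mark it running."""
    with con:
        row = con.execute(
            "SELECT id, params FROM jobs WHERE status='pending' LIMIT 1").fetchone()
        if row is not None:
            con.execute("UPDATE jobs SET status='running' WHERE id=?", (row[0],))
    return row

def complete_job(con, job_id, result):
    with con:
        con.execute("UPDATE jobs SET status='done', result=? WHERE id=?",
                    (result, job_id))

if __name__ == "__main__":
    con = init_queue()
    con.executemany("INSERT INTO jobs (params) VALUES (?)",
                    [(f'{{"subbasin": {i}}}',) for i in range(4)])
    job = claim_job(con)                       # a browser node would fetch this over HTTP
    complete_job(con, job[0], '{"runoff_mm": 12.3}')
    print(con.execute("SELECT id, status FROM jobs").fetchall())
```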

  1. Model for calorimetric measurements in an open quantum system

    Science.gov (United States)

    Donvil, Brecht; Muratore-Ginanneschi, Paolo; Pekola, Jukka P.; Schwieger, Kay

    2018-05-01

    We investigate the experimental setup proposed in New J. Phys. 15, 115006 (2013), 10.1088/1367-2630/15/11/115006 for calorimetric measurements of thermodynamic indicators in an open quantum system. As a theoretical model we consider a periodically driven qubit coupled with a large yet finite electron reservoir, the calorimeter. The calorimeter is initially at equilibrium with an infinite phonon bath. As time elapses, the temperature of the calorimeter varies in consequence of energy exchanges with the qubit and the phonon bath. We show how, under weak-coupling assumptions, the evolution of the qubit-calorimeter system can be described by a generalized quantum jump process including the temperature of the calorimeter as a dynamical variable. We study the jump process by numeric and analytic methods. Asymptotically with the duration of the drive, the qubit-calorimeter attains a steady state. In this same limit, we use multiscale perturbation theory to derive a Fokker-Planck equation governing the calorimeter temperature distribution. We examine the properties of the temperature probability distribution close to and at the steady state. In particular, we predict the behavior of measurable statistical indicators versus the qubit-calorimeter coupling constant.

  2. THE FEATURES OF LASER EMISSION ENERGY DISTRIBUTION AT MATHEMATIC MODELING OF WORKING PROCESS

    Directory of Open Access Journals (Sweden)

    A. M. Avsiyevich

    2013-01-01

    Full Text Available The spatial distribution of laser emission energy from different continuous-operation setups depends on many factors, above all on the design of the setup. For a more accurate description of the intensity distribution of multimode laser emission energy, an experimental-theoretical model is proposed, based on representing the measured laser emission distribution, to a given accuracy rating, as a superposition of basis functions. This model yields an approximation error of only 2.2 percent, compared with 24.6% and 61% for the uniform and Gaussian approximations, respectively. Using the proposed model makes it possible to take into account more accurately the peculiarities of the interaction between the laser emission and the working surface, and increases the accuracy of temperature field calculations in the mathematical modeling of laser treatment processes. A method for the experimental study of the laser emission energy distribution of a given source is presented, together with the mathematical apparatus for calculating the intensity parameters of the laser emission energy distribution as a function of radial distance within the surface heating zone.
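
    As a minimal illustration of representing a measured profile as a superposition of basis functions, the sketch below fits a one-dimensional radial intensity profile with fixed Gaussian basis functions by linear least squares. The basis choice, the synthetic test profile and the reported error are assumptions for illustration; the paper's actual basis functions and accuracy figures are not reproduced.

```python
# Sketch: fit a radial intensity profile as a superposition of Gaussian basis
# functions via linear least squares. Basis and test profile are assumptions.
import numpy as np

def gaussian_basis(r, centers, width):
    # One column per basis function, evaluated at the radial positions r
    return np.exp(-((r[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))

r = np.linspace(0.0, 3.0, 200)                                     # radial coordinate (arb. units)
measured = np.exp(-r**2) + 0.4 * np.exp(-((r - 1.2) ** 2) / 0.1)   # stand-in "multimode" profile

A = gaussian_basis(r, centers=np.linspace(0.0, 3.0, 12), width=0.25)
coeffs, *_ = np.linalg.lstsq(A, measured, rcond=None)

fit = A @ coeffs
rel_error = np.linalg.norm(fit - measured) / np.linalg.norm(measured)
print(f"relative approximation error: {rel_error:.2%}")
```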

  3. Radial transport processes as a precursor to particle deposition in drinking water distribution systems.

    Science.gov (United States)

    van Thienen, P; Vreeburg, J H G; Blokker, E J M

    2011-02-01

    Various particle transport mechanisms play a role in the build-up of discoloration potential in drinking water distribution networks. In order to enhance our understanding of and ability to predict this build-up, it is essential to recognize and understand their role. Gravitational settling with drag has primarily been considered in this context. However, since flow in water distribution pipes is nearly always in the turbulent regime, turbulent processes should be considered also. In addition to these, single particle effects and forces may affect radial particle transport. In this work, we present an application of a previously published turbulent particle deposition theory to conditions relevant for drinking water distribution systems. We predict quantitatively under which conditions turbophoresis, including the virtual mass effect, the Saffman lift force, and the Magnus force may contribute significantly to sediment transport in radial direction and compare these results to experimental observations. The contribution of turbophoresis is mostly limited to large particles (>50 μm) in transport mains, and not expected to play a major role in distribution mains. The Saffman lift force may enhance this process to some degree. The Magnus force is not expected to play any significant role in drinking water distribution systems. © 2010 Elsevier Ltd. All rights reserved.
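
    For the gravitational settling with drag mentioned above, a back-of-the-envelope Stokes-law estimate for the particle sizes discussed is sketched below. The material densities and water viscosity are assumed typical values; the turbophoresis, lift-force and Magnus-force models of the paper are not reproduced here.

```python
# Stokes-law settling velocity for small spheres in water (illustration only).
def stokes_settling_velocity(d, rho_p, rho_w=998.0, mu=1.0e-3, g=9.81):
    """Terminal settling velocity (m/s) of a sphere of diameter d (m)."""
    return (rho_p - rho_w) * g * d ** 2 / (18.0 * mu)

for d_um in (5, 25, 50, 100):                                   # diameters in micrometres
    v = stokes_settling_velocity(d_um * 1e-6, rho_p=2650.0)     # assumed sediment density
    print(f"{d_um:>4} um -> {v * 1000:.4f} mm/s")
```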

  4. Understanding Motivational System in Open Learning: Learners' Engagement with a Traditional Chinese-Based Open Educational Resource System

    Science.gov (United States)

    Huang, Wenhao David; Wu, Chorng-Guang

    2017-01-01

    Learning has embraced the "open" process in recent years, as many educational resources are made available for free online. Existing research, however, has not provided sufficient evidence to systematically improve open learning interactions and engagement in open educational resource (OER) systems. This deficiency presents two…

  5. National innovation policy and global open innovation: Exploring balances, tradeoffs and complementarities

    DEFF Research Database (Denmark)

    Bloch, Carter Walter; Sverre, Herstad; Ebersberger, Bernd

    2010-01-01

    The aim of this article is to suggest a framework for examining the way national policy mixes are responding to the challenges and opportunities of globally distributed knowledge networks, cross-sectoral technology flows and consequently open innovation processes occurring on an international scale... We argue that the purpose of public research and innovation policy remains one of developing and sustaining territorial knowledge bases capable of growing and supporting internationally competitive industries. But the rules of the game have changed. Public policy now needs to carefully balance...

  6. Poisson-process generalization for the trading waiting-time distribution in a double-auction mechanism

    Science.gov (United States)

    Cincotti, Silvano; Ponta, Linda; Raberto, Marco; Scalas, Enrico

    2005-05-01

    In this paper, empirical analyses and computational experiments are presented on high-frequency data for a double-auction (book) market. The main objective of the paper is to generalize the order waiting-time process in order to properly model such empirical evidence. The empirical study is performed on the best bid and best ask data of 7 U.S. financial markets, for 30-stock time series. In particular, statistical properties of trading waiting times have been analyzed and the quality of fits is evaluated by suitable statistical tests, i.e., comparing empirical distributions with theoretical models. Starting from the statistical studies on real data, attention has been focused on the reproducibility of such results in an artificial market. The computational experiments have been performed within the Genoa Artificial Stock Market. In the market model, heterogeneous agents trade one risky asset in exchange for cash. Agents have zero intelligence and issue random limit or market orders depending on their budget constraints. The price is cleared by means of a limit order book. The order generation is modelled with a renewal process. Based on empirical trading estimation, the distribution of waiting times between two consecutive orders is modelled by a mixture of exponential processes. Results show that the empirical waiting-time distribution can be considered as a generalization of a Poisson process. Moreover, the renewal process can approximate real data and the implementation on the artificial stock market can reproduce the trading activity in a realistic way.
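
    A minimal sketch of the order-generation idea, assuming illustrative weights and rates rather than the values estimated from market data, is shown below: waiting times drawn from a two-component exponential mixture are compared with a plain Poisson (single exponential) process, whose coefficient of variation equals one.

```python
# Waiting times from a two-component exponential mixture vs. a plain Poisson
# process; the weights and rates below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

weights, rates = np.array([0.7, 0.3]), np.array([2.0, 0.2])    # events per second
component = rng.choice(2, size=n, p=weights)
mixture_wt = rng.exponential(1.0 / rates[component])

poisson_wt = rng.exponential(mixture_wt.mean(), size=n)        # same mean waiting time

for name, wt in (("mixture", mixture_wt), ("poisson", poisson_wt)):
    cv = wt.std() / wt.mean()          # coefficient of variation; 1 for an exponential
    print(f"{name}: mean = {wt.mean():.3f} s, CV = {cv:.2f}")
```

    The mixture's coefficient of variation exceeds one, i.e. the waiting times are over-dispersed relative to a Poisson process, which is the kind of departure that motivates generalizing the Poisson assumption.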

  7. Novel scaling of the multiplicity distributions in the sequential fragmentation process and in the percolation

    International Nuclear Information System (INIS)

    Botet, R.

    1996-01-01

    A novel scaling of the multiplicity distributions is found in the shattering phase of the sequential fragmentation process with inhibition. The same scaling law is shown to hold in the percolation process. (author)

  8. Dimensions of Open Research: Critical Reflections on Openness in the ROER4D Project

    Science.gov (United States)

    King, Thomas; Hodgkinson-Williams, Cheryl; Willmers, Michelle; Walji, Sukaina

    2016-01-01

    Open Research has the potential to advance the scientific process by improving the transparency, rigour, scope and reach of research, but choosing to experiment with Open Research carries with it a set of ideological, legal, technical and operational considerations. Researchers, especially those in resource-constrained situations, may not be aware…

  9. Open Computer Forensic Architecture a Way to Process Terabytes of Forensic Disk Images

    Science.gov (United States)

    Vermaas, Oscar; Simons, Joep; Meijer, Rob

    This chapter describes the Open Computer Forensics Architecture (OCFA), an automated system that dissects complex file types, extracts metadata from files and ultimately creates indexes on forensic images of seized computers. It consists of a set of collaborating processes, called modules. Each module is specialized in processing a certain file type. When it receives a so-called 'evidence', the information that has been extracted so far about the file together with the actual data, it either adds new information about the file or uses the file to derive a new 'evidence'. All evidence, original and derived, is sent to a router after being processed by a particular module. The router decides which module should process the evidence next, based upon the metadata associated with the evidence. Thus the OCFA system can recursively process images until, from every compound file, the embedded files, if any, are extracted, all information that the system can derive has been derived, and all extracted text is indexed. Compound files include, but are not limited to, archive- and zip-files, disk images, text documents of various formats and, for example, mailboxes. The output of an OCFA run is a repository full of derived files, a database containing all extracted information about the files and an index which can be used when searching. This is presented in a web interface. Moreover, processed data is easily fed to third party software for further analysis or to be used in data mining or text mining tools. The main advantages of the OCFA system are scalability, as it is able to process large amounts of data.
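
    The toy sketch below illustrates the metadata-driven router/module idea in a few lines of Python; the module names, dispatch rule and evidence dictionary are hypothetical simplifications and are not part of the OCFA code base.

```python
# Hypothetical, heavily simplified router/module pipeline; not OCFA code.
import io
import zipfile

def zip_module(evidence):
    """Expand an archive into new pieces of derived evidence."""
    with zipfile.ZipFile(io.BytesIO(evidence["data"])) as zf:
        derived = [{"name": n, "data": zf.read(n), "meta": {}} for n in zf.namelist()]
    evidence["meta"]["expanded"] = True
    return derived

def text_module(evidence):
    """Extract indexable text; derives no further evidence."""
    evidence["meta"]["text"] = evidence["data"].decode("utf-8", errors="replace")
    return []

def route(evidence):
    """Pick the next module from the evidence metadata (simplified rule)."""
    if evidence["name"].endswith(".zip"):
        return None if evidence["meta"].get("expanded") else zip_module
    return None if "text" in evidence["meta"] else text_module

def process(evidence):
    queue, index = [evidence], []
    while queue:
        ev = queue.pop()
        module = route(ev)
        if module is None:                 # nothing left to derive: index it
            index.append((ev["name"], ev["meta"].get("text", "")[:40]))
            continue
        queue.extend(module(ev))           # recurse into derived evidence
        queue.append(ev)                   # re-route the enriched evidence
    return index

if __name__ == "__main__":
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        zf.writestr("note.txt", "seized laptop, user documents")
    print(process({"name": "image.zip", "data": buf.getvalue(), "meta": {}}))
```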

  10. Open Access Publishing: What Authors Want

    Science.gov (United States)

    Nariani, Rajiv; Fernandez, Leila

    2012-01-01

    Campus-based open access author funds are being considered by many academic libraries as a way to support authors publishing in open access journals. Article processing fees for open access have been introduced recently by publishers and have not yet been widely accepted by authors. Few studies have surveyed authors on their reasons for publishing…

  11. Sudden transition and sudden change from open spin environments

    International Nuclear Information System (INIS)

    Hu, Zheng-Da; Xu, Jing-Bo; Yao, Dao-Xin

    2014-01-01

    We investigate the necessary conditions for the existence of sudden transition or sudden change phenomenon for appropriate initial states under dephasing. As illustrative examples, we study the behaviors of quantum correlation dynamics of two noninteracting qubits in independent and common open spin environments, respectively. For the independent environments case, we find that the quantum correlation dynamics is closely related to the Loschmidt echo and the dynamics exhibits a sudden transition from classical to quantum correlation decay. It is also shown that the sudden change phenomenon may occur for the common environment case and stationary quantum discord is found at the high temperature region of the environment. Finally, we investigate the quantum criticality of the open spin environment by exploring the probability distribution of the Loschmidt echo and the scaling transformation behavior of quantum discord, respectively. - Highlights: • Sudden transition or sudden change from open spin baths are studied. • Quantum discord is related to the Loschmidt echo in independent open spin baths. • Steady quantum discord is found in a common open spin bath. • The probability distribution of the Loschmidt echo is analyzed. • The scaling transformation behavior of quantum discord is displayed

  12. Distributed genetic process mining

    NARCIS (Netherlands)

    Bratosin, C.C.; Sidorova, N.; Aalst, van der W.M.P.

    2010-01-01

    Process mining aims at discovering process models from data logs in order to offer insight into the real use of information systems. Most of the existing process mining algorithms fail to discover complex constructs or have problems dealing with noise and infrequent behavior. The genetic process

  13. A Coordinated Initialization Process for the Distributed Space Exploration Simulation (DSES)

    Science.gov (United States)

    Phillips, Robert; Dexter, Dan; Hasan, David; Crues, Edwin Z.

    2007-01-01

    This document describes the federate initialization process that was developed at the NASA Johnson Space Center with the HIIA Transfer Vehicle Flight Controller Trainer (HTV FCT) simulations and refined in the Distributed Space Exploration Simulation (DSES). These simulations use the High Level Architecture (HLA) IEEE 1516 to provide the communication and coordination between the distributed parts of the simulation. The purpose of the paper is to describe a generic initialization sequence that can be used to create a federate that can: 1. Properly initialize all HLA objects, object instances, interactions, and time management 2. Check for the presence of all federates 3. Coordinate startup with other federates 4. Robustly initialize and share initial object instance data with other federates.

  14. Methods of Run-Time Error Detection in Distributed Process Control Software

    DEFF Research Database (Denmark)

    Drejer, N.

    In this thesis, methods of run-time error detection in application software for distributed process control are designed. The error detection is based upon a monitoring approach in which application software is monitored by system software during the entire execution. The thesis includes definition... and constraint evaluation is designed for the most interesting error types. These include: a) semantical errors in data communicated between application tasks; b) errors in the execution of application tasks; and c) errors in the timing of distributed events emitted by the application software. The design... of error detection methods includes a high-level software specification. This has the purpose of illustrating that the design can be used in practice....

  15. Why open innovation is easier said than done

    DEFF Research Database (Denmark)

    Mahdad, Maral; Piccaluga, Andrea; Di Minin, Alberto

    innovation implementation suggests the following: (1) The process of open innovation through mobility of skilled R&D employees triggers organizational identity ambiguity and change, (2) Organizational identity ambiguity phase in the process of open innovation can be shortened by the support of parent company...... and managerial skills highlighting sensemaking mechanisms, (3) Constructing a shared organizational identity with university members involved in this process is an undeniable element of OI success for this strategy. We contribute to the literature by establishing linkages among organizational identity and open...

  16. The open innovation research landscape

    DEFF Research Database (Denmark)

    Bogers, Marcel; Zobel, Ann-Kristin; Afuah, Allan

    2017-01-01

    This paper provides an overview of the main perspectives and themes emerging in research on open innovation (OI). The paper is the result of a collaborative process among several OI scholars – having a common basis in the recurrent Professional Development Workshop on ‘Researching Open Innovation...

  17. Open Access for International Criminal Lawyers

    NARCIS (Netherlands)

    van Laer, Coen

    2016-01-01

    This study investigates to what extent Open Access is useful for international criminal lawyers. Free reuse and distribution may be particularly advantageous for the audience in less resourceful countries. And individual authors need visibility to promote their academic reputation. However, many

  18. Core power distribution measurement and data processing in Daya Bay Nuclear Power Station

    International Nuclear Information System (INIS)

    Zhang Hong

    1997-01-01

    For the first time in China, Daya Bay Nuclear Power Station applied the advanced technology of worldwide commercial pressurized reactors to its in-core detectors, the leading ex-core six-chamber instrumentation for precise axial power distribution, and the related data processing. Described in this article are the neutron flux measurement in Daya Bay Nuclear Power Station and the detailed data processing.

  19. Combining Different Tools for EEG Analysis to Study the Distributed Character of Language Processing

    Directory of Open Access Journals (Sweden)

    Armando Freitas da Rocha

    2015-01-01

    Full Text Available Recent studies on language processing indicate that language cognition is better understood if assumed to be supported by a distributed intelligent processing system enrolling neurons located all over the cortex, in contrast to reductionism that proposes to localize cognitive functions to specific cortical structures. Here, brain activity was recorded using electroencephalography while volunteers were listening to or reading small texts and had to select pictures that translate the meaning of these texts. Several techniques for EEG analysis were used to show this distributed character of neuronal enrollment associated with the comprehension of oral and written descriptive texts. Low Resolution Tomography identified the many different sets (si) of neurons activated in several distinct cortical areas by text understanding. Linear correlation was used to calculate the information H(ei) provided by each electrode of the 10/20 system about the identified si. H(ei) Principal Component Analysis (PCA) was used to study the temporal and spatial activation of these sources si. This analysis evidenced 4 different patterns of H(ei) covariation that are generated by neurons located at different cortical locations. These results clearly show that the distributed character of language processing is evidenced by combining available EEG technologies.
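
    Schematically, the analysis path described above can be illustrated as follows, assuming placeholder random data in place of the recorded EEG: per-electrode information values H(ei) across epochs are decomposed with principal component analysis to look for a small number of covariation patterns.

```python
# Placeholder data in place of real EEG-derived H(ei) values; PCA then looks
# for a few covariation patterns across the 10/20 electrodes.
import numpy as np
from sklearn.decomposition import PCA

electrodes = ["Fp1", "Fp2", "F3", "F4", "C3", "C4", "P3", "P4", "O1", "O2",
              "F7", "F8", "T3", "T4", "T5", "T6", "Fz", "Cz", "Pz"]
rng = np.random.default_rng(1)
H = rng.random((120, len(electrodes)))        # 120 epochs x 19 electrodes of H(ei)

pca = PCA(n_components=4)                     # four patterns, as reported in the study
scores = pca.fit_transform(H)
for k, var in enumerate(pca.explained_variance_ratio_, start=1):
    print(f"component {k}: {var:.1%} of H(ei) variance")
```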

  20. Vivaldi: A Domain-Specific Language for Volume Processing and Visualization on Distributed Heterogeneous Systems.

    Science.gov (United States)

    Choi, Hyungsuk; Choi, Woohyuk; Quan, Tran Minh; Hildebrand, David G C; Pfister, Hanspeter; Jeong, Won-Ki

    2014-12-01

    As the size of image data from microscopes and telescopes increases, the need for high-throughput processing and visualization of large volumetric data has become more pressing. At the same time, many-core processors and GPU accelerators are commonplace, making high-performance distributed heterogeneous computing systems affordable. However, effectively utilizing GPU clusters is difficult for novice programmers, and even experienced programmers often fail to fully leverage the computing power of new parallel architectures due to their steep learning curve and programming complexity. In this paper, we propose Vivaldi, a new domain-specific language for volume processing and visualization on distributed heterogeneous computing systems. Vivaldi's Python-like grammar and parallel processing abstractions provide flexible programming tools for non-experts to easily write high-performance parallel computing code. Vivaldi provides commonly used functions and numerical operators for customized visualization and high-throughput image processing applications. We demonstrate the performance and usability of Vivaldi on several examples ranging from volume rendering to image segmentation.