WorldWideScience

Sample records for end-to-end foodweb control

  1. End to end adaptive congestion control in TCP/IP networks

    CERN Document Server

    Houmkozlis, Christos N

    2012-01-01

    This book provides an adaptive control theory perspective on designing congestion controls for packet-switching networks. Relevant to a wide range of disciplines and industries, including the music industry, computers, image trading, and virtual groups, the text extensively discusses source-oriented, or end-to-end, congestion control algorithms. The book empowers readers with a clear understanding of the characteristics of packet-switching networks and their effects on system stability and performance. It provides schemes capable of controlling congestion and ensuring fairness and presents real-world app...

  2. Integrating end-to-end threads of control into object-oriented analysis and design

    Science.gov (United States)

    Mccandlish, Janet E.; Macdonald, James R.; Graves, Sara J.

    1993-01-01

    Current object-oriented analysis and design methodologies fall short in their use of mechanisms for identifying threads of control for the system being developed. The scenarios that typically describe a system are global in scope, whereas object-oriented models concentrate on individual objects and their behavior. Unlike conventional methodologies that use data flow and process-dependency diagrams, object-oriented methodologies do not provide a model for representing these global threads end-to-end. Tracing through threads of control is key to ensuring that a system is complete and timing constraints are addressed. The existence of multiple threads of control in a system necessitates a partitioning of the system into processes. This paper describes the application and representation of end-to-end threads of control to the object-oriented analysis and design process using object-oriented constructs. The issue of representation is viewed as a grouping problem, that is, how to group classes/objects at a higher level of abstraction so that the system may be viewed as a whole with both classes/objects and their associated dynamic behavior. Existing object-oriented development methodology techniques are extended by adding design-level constructs termed logical composite classes and process composite classes. Logical composite classes are design-level classes which group classes/objects both logically and by thread of control information. Process composite classes further refine the logical composite class groupings by using process partitioning criteria to produce optimum concurrent execution results. The goal of these design-level constructs is to ultimately provide the basis for a mechanism that can support the creation of process composite classes in an automated way. Using an automated mechanism makes it easier to partition a system into concurrently executing elements that can be run in parallel on multiple processors.

  3. End to End Travel

    Data.gov (United States)

    US Agency for International Development — E2 Solutions is a web-based end-to-end travel management tool that includes paperless travel authorization and voucher document submissions, document approval...

  4. Telephony Over IP: A QoS Measurement-Based End to End Control Algorithm

    Directory of Open Access Journals (Sweden)

    Luigi Alcuri

    2004-12-01

    This paper presents a method for admitting voice calls in Telephony over IP (ToIP) scenarios. This method, called QoS-Weighted CAC, aims to guarantee Quality of Service to telephony applications. We use a measurement-based call admission control algorithm, which detects congested network links through feedback on overall link utilization. This feedback is based on measures of packet delivery latencies related to voice over IP connections at the edges of the transport network. In this way we introduce a closed-loop control method, which is able to auto-adapt the quality margin on the basis of network load and specific service-level requirements. Moreover, we evaluate the difference in performance achieved by different queue management configurations that guarantee Quality of Service to telephony applications, our goal being to evaluate the weight of edge-router queue configuration in a complex and realistic Telephony over IP scenario. We compare several well-known queue scheduling algorithms, such as SFQ, WRR, RR, WIRR, and Priority. This comparison aims to place queue schedulers in a more general control-scheme context where different elements such as DiffServ marking and admission control algorithms contribute to the overall Quality of Service required by real-time voice conversations. By means of software simulations we compare this solution with other call admission methods already described in the scientific literature. On the basis of the results we highlight the possible advantages of this QoS-Weighted solution in comparison with other similar CAC solutions (in particular Measured Sum, Bandwidth Equivalent with Hoeffding Bounds, and Simple Measure CAC) in terms of complexity, stability, management, tunability to service-level requirements, and compatibility with actual network implementations.
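
    A minimal Python sketch of the general idea, measurement-based admission with latency feedback: the class name, thresholds, and the rule that widens the safety margin when edge latency exceeds the target are illustrative assumptions, not the paper's QoS-Weighted CAC algorithm.

```python
# Toy measurement-based call admission controller (CAC).
# Illustrative only: all numbers and the margin-update rule are assumptions.

VOICE_CALL_BW = 0.064  # Mbit/s, e.g. one G.711 voice call (assumption)

class MeasurementBasedCAC:
    def __init__(self, link_capacity_mbps, target_latency_ms):
        self.capacity = link_capacity_mbps
        self.target_latency = target_latency_ms
        self.measured_load = 0.0   # Mbit/s, updated from edge measurements
        self.margin = 0.10         # fraction of capacity held back

    def update_measurements(self, measured_load_mbps, edge_latency_ms):
        """Feedback step: widen the quality margin when measured edge
        latency exceeds the service-level target, relax it otherwise."""
        self.measured_load = measured_load_mbps
        if edge_latency_ms > self.target_latency:
            self.margin = min(0.5, self.margin * 1.5)   # back off
        else:
            self.margin = max(0.05, self.margin * 0.9)  # relax slowly

    def admit(self, new_call_bw=VOICE_CALL_BW):
        """Admit only if measured load plus the new call still fits under
        the margin-adjusted link capacity."""
        return self.measured_load + new_call_bw <= self.capacity * (1 - self.margin)

cac = MeasurementBasedCAC(link_capacity_mbps=10.0, target_latency_ms=50.0)
cac.update_measurements(measured_load_mbps=8.6, edge_latency_ms=65.0)
print(cac.admit())  # False: the latency feedback widened the margin
```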

  5. End-to-end verifiability

    OpenAIRE

    Ryan, Peter; Benaloh, Josh; Rivest, Ronald; Stark, Philip; Teague, Vanessa; Vora, Poorvi

    2016-01-01

    This pamphlet describes end-to-end election verifiability (E2E-V) for a nontechnical audience: election officials, public policymakers, and anyone else interested in secure, transparent, evidence-based electronic elections. This work is part of the Overseas Vote Foundation’s End-to-End Verifiable Internet Voting: Specification and Feasibility Assessment Study (E2E VIV Project), funded by the Democracy Fund.

  6. National Renewable Energy Laboratory (NREL) Topic 2 Final Report: End-to-End Communication and Control System to Support Clean Energy Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Hudgins, Andrew P [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Carrillo, Ismael M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Jin, Xin [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Simmins, John [Electric Power Research Institute (EPRI)]

    2018-02-21

    This document is the final report of a two-year development, test, and demonstration project, 'Cohesive Application of Standards-Based Connected Devices to Enable Clean Energy Technologies.' The project was part of the National Renewable Energy Laboratory's (NREL's) Integrated Network Testbed for Energy Grid Research and Technology (INTEGRATE) initiative hosted at the Energy Systems Integration Facility (ESIF). This project demonstrated techniques to control distribution grid events using the coordination of traditional distribution grid devices and high-penetration renewable resources and demand response. Using standard communication protocols and semantic standards, the project examined the use cases of high/low distribution voltage, requests for volt-ampere-reactive (VAR) power support, and transactive energy strategies using Volttron. Open source software, written by EPRI to control distributed energy resources (DER) and demand response (DR), was used by an advanced distribution management system (ADMS) to abstract the resources reporting to a collection of capabilities rather than needing to know specific resource types. This architecture allows for scaling both horizontally and vertically. Several new technologies were developed and tested. Messages from the ADMS based on the common information model (CIM) were developed to control the DER and DR management systems. The OpenADR standard was used to help manage grid events by turning loads off and on. Volttron technology was used to simulate a homeowner choosing the price at which to enter the demand response market. Finally, the ADMS used newly developed algorithms to coordinate these resources with a capacitor bank and voltage regulator to respond to grid events.

  7. Arcus end-to-end simulations

    Science.gov (United States)

    Wilms, Joern; Guenther, H. Moritz; Dauser, Thomas; Huenemoerder, David P.; Ptak, Andrew; Smith, Randall; Arcus Team

    2018-01-01

    We present an overview of the end-to-end simulation environment that we are implementing as part of the Arcus phase A Study. With the Arcus simulator, we aim to model the imaging, detection, and event reconstruction properties of the spectrometer. The simulator uses a Monte Carlo ray-trace approach, projecting photons onto the Arcus focal plane from the silicon pore optic mirrors and critical-angle transmission gratings. We simulate the detection and read-out of the photons in the focal plane CCDs with software originally written for the eROSITA and Athena-WFI detectors; we include all relevant detector physics, such as charge splitting, and effects of the detector read-out, such as out-of-time events. The output of the simulation chain is an event list that closely resembles the data expected during flight. This event list is processed using a prototype event reconstruction chain for the order separation, wavelength calibration, and effective area calibration. The output is compatible with standard X-ray astronomical analysis software. During phase A, the end-to-end simulation approach is used to demonstrate the overall performance of the mission, including a full simulation of the calibration effort. Continued development during later phases of the mission will ensure that the simulator remains a faithful representation of the true mission capabilities, and it will ultimately be used as the Arcus calibration model.

  8. TROPOMI end-to-end performance studies

    Science.gov (United States)

    Voors, Robert; de Vries, Johan; Veefkind, Pepijn; Gloudemans, Annemieke; Mika, Àgnes; Levelt, Pieternel

    2008-10-01

    The TROPOspheric Monitoring Instrument (TROPOMI) is a UV/VIS/NIR/SWIR non-scanning nadir-viewing imaging spectrometer that combines a wide swath (110°) with high spatial resolution (8 x 8 km). Its main heritages are from the Ozone Monitoring Instrument (OMI) and from SCIAMACHY. Since its launch in 2004, OMI has been providing, on a daily basis and on a global scale, a wealth of data on ozone, NO2 and minor trace gases, aerosols and local pollution. In the framework of development programs for a follow-up mission for the successful Ozone Monitoring Instrument, we have developed the so-called TROPOMI Integrated Development Environment. This is a GRID-based software simulation tool for OMI follow-up missions. It includes scene generation, an instrument simulator, a level 0-1b processing chain, as well as several level 1b-2 processing chains. In addition it contains an error analyzer, i.e. a tool to feed the level 2 results back to the input of the scene generator. The paper gives a description of the TROPOMI instrument and focuses on design aspects as well as on the performance, as tested in the end-to-end development environment TIDE.

  9. Experimental demonstration of software defined data center optical networks with Tbps end-to-end tunability

    Science.gov (United States)

    Zhao, Yongli; Zhang, Jie; Ji, Yuefeng; Li, Hui; Wang, Huitao; Ge, Chao

    2015-10-01

    End-to-end tunability is important for provisioning elastic channels for the burst traffic of data center optical networks. How, then, can end-to-end tunability be achieved over elastic optical networks? A software-defined networking (SDN) based end-to-end tunability solution is proposed for software-defined data center optical networks, and the protocol extension and implementation procedure are designed accordingly. For the first time, flexible-grid all-optical networks with a Tbps end-to-end tunable transport and switch system have been demonstrated online for data center interconnection, controlled by an OpenDaylight (ODL) based controller. The performance of the end-to-end tunable transport and switch system has been evaluated with wavelength number tuning, bit rate tuning, and transmit power tuning procedures.
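
    Purely as a hypothetical illustration of what tuning an elastic channel through an SDN controller's northbound REST interface could look like, the Python snippet below pushes a flexi-grid channel configuration. The endpoint path, JSON field names, and host are invented for this sketch and are not OpenDaylight's actual API.

```python
# Hypothetical REST call for end-to-end channel tuning; the URL layout and
# JSON schema below are invented, NOT OpenDaylight's real northbound API.

import json
import urllib.request

def tune_channel(controller, path_id, center_freq_thz, slot_width_ghz,
                 bit_rate_gbps, tx_power_dbm):
    """Push one flexi-grid channel configuration to the controller."""
    body = json.dumps({
        "path-id": path_id,
        "center-frequency-thz": center_freq_thz,  # wavelength tuning
        "slot-width-ghz": slot_width_ghz,         # elastic spectrum slot
        "bit-rate-gbps": bit_rate_gbps,           # bit-rate tuning
        "tx-power-dbm": tx_power_dbm,             # transmit-power tuning
    }).encode()
    req = urllib.request.Request(
        f"http://{controller}/flexi-channels/{path_id}",  # invented path
        data=body, method="PUT",
        headers={"Content-Type": "application/json"})
    return urllib.request.urlopen(req)

# Example (not executed): tune a 400 Gbit/s channel between two data centers.
# tune_channel("controller.example:8181", "dc1-dc2", 193.1, 75.0, 400, -1.0)
```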

  10. Understanding TCP over TCP: effects of TCP tunneling on end-to-end throughput and latency

    Science.gov (United States)

    Honda, Osamu; Ohsaki, Hiroyuki; Imase, Makoto; Ishizuka, Mika; Murayama, Junichi

    2005-10-01

    A TCP tunnel is a technology that aggregates and transfers packets sent between end hosts as a single TCP connection. By using a TCP tunnel, the fairness among aggregated flows can be improved and several protocols can be transparently transmitted through a firewall. Currently, many applications such as SSH, VTun, and HTun use a TCP tunnel. However, since most applications running on end hosts generally use TCP, two TCP congestion controls (i.e., end-to-end TCP and tunnel TCP) operate simultaneously and interfere with each other. Under certain conditions, it has been known that using a TCP tunnel severely degrades the end-to-end TCP performance; namely, it can drastically degrade the end-to-end TCP throughput for some time, which is called the TCP meltdown problem. On the contrary, under other conditions, it has been known that using a TCP tunnel significantly improves the end-to-end TCP performance. However, it is still an open issue how, when, and why a TCP tunnel is harmful to end-to-end TCP performance. In this paper, we therefore investigate the effect of a TCP tunnel on end-to-end TCP performance using simulation experiments. Specifically, we quantitatively reveal the effects of several factors (e.g., the propagation delay, usage of the SACK option, TCP socket buffer size, and sender buffer size of the TCP tunnel) on the performance of end-to-end TCP and tunnel TCP.
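
    To make the trade-off concrete, here is a back-of-the-envelope Python sketch using the classical Mathis approximation BW ≈ (MSS/RTT)·C/√p for steady-state TCP throughput. This is our illustration, not the paper's simulation model; the residual loss and RTT inflation attributed to the tunnel are assumptions.

```python
# Why a TCP tunnel can help or hurt, via the Mathis throughput formula.
# Not the paper's model; all numbers are illustrative assumptions.

from math import sqrt

C = sqrt(3.0 / 2.0)  # ~1.22, Mathis constant for periodic loss

def tcp_throughput(mss_bytes, rtt_s, loss_rate):
    """Mathis approximation of steady-state TCP throughput in bytes/s."""
    return (mss_bytes / rtt_s) * C / sqrt(loss_rate)

MSS = 1460      # bytes
RTT = 0.100     # s, end-to-end round-trip time
P_LOSS = 0.01   # loss rate on one lossy segment of the path

# Case 1: plain end-to-end TCP sees the lossy segment directly.
plain = tcp_throughput(MSS, RTT, P_LOSS)

# Case 2: a TCP tunnel recovers losses on that segment, so the inner TCP
# sees an almost loss-free path, but tunnel retransmissions inflate the
# effective RTT. Residual loss 1e-4 and +50% RTT are assumptions.
tunneled = tcp_throughput(MSS, RTT * 1.5, 1e-4)

print(f"plain end-to-end TCP : {plain / 1e6:.2f} MB/s")
print(f"TCP over TCP tunnel  : {tunneled / 1e6:.2f} MB/s")
# When the tunnel's own congestion control stalls (e.g. its send buffer
# fills), the inner TCP times out too: the 'meltdown' regime, which this
# closed-form sketch does not capture.
```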

  11. Model Scaling Approach for the GOCE End to End Simulator

    Science.gov (United States)

    Catastini, G.; De Sanctis, S.; Dumontel, M.; Parisch, M.

    2007-08-01

    The Gravity field and steady-state Ocean Circulation Explorer (GOCE) is the first core Earth explorer of ESA's Earth observation programme of satellites for research in the Earth sciences. The objective of the mission is to produce high-accuracy, high-resolution, global measurements of the Earth's gravity field, leading to improved geopotential and geoid (the equipotential surface corresponding to the steady-state sea level) models for use in a wide range of geophysical applications. More precisely, the GOCE mission is designed to provide a global reconstruction of the geopotential model and geoid with high spatial resolution (better than 0.1 cm at degree and order l = 50 and better than 1.0 cm at degree and order l = 200). Such a scientific performance scenario requires at least the computation of 200 harmonics of the gravitational field and a simulated time span covering a minimum of 60 days (corresponding to a full coverage of the Earth surface). Thales Alenia Space Italia (TAS-I) is responsible, as Prime Contractor, for the GOCE satellite. The GOCE mission objective is the high-accuracy retrieval of the Earth gravity field. The idea of an End-to-End simulator (E2E) was conceived in the early stages of the GOCE programme as an essential tool for supporting the design and verification activities as well as for assessing the satellite system performance. The simulator in its present form has been developed at TAS-I for ESA since the beginning of Phase B and is currently used for: checking the consistency of spacecraft and payload specifications with the overall system requirements; supporting trade-off, sensitivity and worst-case analyses; supporting design and pre-validation testing of the Drag-Free and Attitude Control (DFAC) laws; preparing and testing the on-ground and in-flight gradiometer calibration concepts; and prototyping the post-processing algorithms, transforming the scientific data from Level 0 (raw telemetry format) to Level 1B (i.e. geo-located gravity...
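
    A small arithmetic aside (ours, not from the record) on what "degree and order 200" implies for the simulator: a gravity field complete to degree and order L has (L+1)² real spherical-harmonic coefficients.

```python
# Number of spherical-harmonic coefficients in a gravity field model
# complete to degree and order L: (L+1)^2, counting all Cnm and Snm terms.
for L in (50, 200):
    print(f"degree/order {L}: {(L + 1) ** 2} coefficients")
# degree/order 50: 2601 coefficients
# degree/order 200: 40401 coefficients
```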

  12. Standardizing an End-to-end Accounting Service

    Science.gov (United States)

    Greenberg, Edward; Kazz, Greg

    2006-01-01

    Currently there are no space system standards available for space agencies to accomplish end-to-end accounting. No such standard exists for spacecraft operations or for tracing the relationship between the mission planning activities, the command sequences designed to perform those activities, the commands formulated to initiate those activities, and the mission data, specifically the mission data products, created by those activities. In order for space agencies to cross-support one another for data accountability/data tracing and for inter-agency spacecraft to interoperate with each other, an international CCSDS standard for end-to-end data accountability/tracing needs to be developed. We will first describe the end-to-end accounting service model and the functionality that supports the service. This model will describe how science plans that are ultimately transformed into commands can be associated with the telemetry products generated as a result of their execution. Moreover, the interaction between end-to-end accounting and service management will be explored. Finally, we will show how the standard end-to-end accounting service can be applied to a real-life flight project, i.e., the Mars Reconnaissance Orbiter project.

  13. End-to-End Security for Personal Telehealth

    NARCIS (Netherlands)

    Koster, R.P.; Asim, M.; Petkovic, M.

    2011-01-01

    Personal telehealth is in rapid development with innovative emerging applications like disease management. With personal telehealth, people participate in their own care, supported by an open distributed system with health services. This poses new end-to-end security and privacy challenges. In this...

  14. Utilizing Domain Knowledge in End-to-End Audio Processing

    DEFF Research Database (Denmark)

    Tax, Tycho; Antich, Jose Luis Diez; Purwins, Hendrik

    2017-01-01

    End-to-end neural network based approaches to audio modelling are generally outperformed by models trained on high-level data representations. In this paper we present preliminary work that shows the feasibility of training the first layers of a deep convolutional neural network (CNN) model...

  15. Measurements and analysis of end-to-end Internet dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Paxson, Vern [Univ. of California, Berkeley, CA (United States). Computer Science Division

    1997-04-01

    Accurately characterizing end-to-end Internet dynamics - the performance that a user actually obtains from the lengthy series of network links that comprise a path through the Internet - is exceptionally difficult, due to the network's immense heterogeneity. At the heart of this work is a 'measurement framework' in which a number of sites around the Internet host a specialized measurement service. By coordinating 'probes' between pairs of these sites one can measure end-to-end behavior along O(N²) paths for a framework consisting of N sites. Consequently, one obtains a superlinear scaling that allows measuring a rich cross-section of Internet behavior without requiring huge numbers of observation points. 37 sites participated in this study, allowing the author to measure more than 1,000 distinct Internet paths. The first part of this work looks at the behavior of end-to-end routing: the series of routers over which a connection's packets travel. Based on 40,000 measurements made using this framework, the author analyzes: routing 'pathologies' such as loops, outages, and flutter; the stability of routes over time; and the symmetry of routing along the two directions of an end-to-end path. The author finds that pathologies increased significantly over the course of 1995 and that Internet paths are heavily dominated by a single route. The second part of this work studies end-to-end Internet packet dynamics. The author analyzes 20,000 TCP transfers of 100 Kbyte each to investigate the performance of both the TCP endpoints and the Internet paths. The measurements used for this part of the study are much richer than those for the first part, but require a great degree of attention to issues of calibration, which are addressed by applying self-consistency checks to the measurements whenever possible. The author finds that packet filters are capable of a wide range of measurement errors, some of which, if undetected, can significantly taint subsequent analysis.
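
    The framework's superlinear scaling is easy to restate in code: N mutually probing sites yield N(N-1) directed source-destination paths. A minimal sketch (ours):

```python
# N sites probing each other give N(N-1) directed end-to-end paths.
from itertools import permutations

sites = [f"site{i}" for i in range(37)]   # 37 sites, as in the study
paths = list(permutations(sites, 2))      # ordered (source, destination)

print(len(paths))  # 37 * 36 = 1332 directed paths
# Even probing only a subset of pairs, the study covered more than 1,000
# distinct Internet paths from these 37 observation points.
```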

  16. End-to-end plasma bubble PIC simulations on GPUs

    Science.gov (United States)

    Germaschewski, Kai; Fox, William; Matteucci, Jackson; Bhattacharjee, Amitava

    2017-10-01

    Accelerator technologies play a crucial role in eventually achieving exascale computing capabilities. The current and upcoming leadership machines at ORNL (Titan and Summit) employ Nvidia GPUs, which provide vast computational power but also need specifically adapted computational kernels to fully exploit them. In this work, we will show end-to-end particle-in-cell simulations of the formation, evolution and coalescence of laser-generated plasma bubbles. This work showcases the GPU capabilities of the PSC particle-in-cell code, which has been adapted for this problem to support particle injection, a heating operator and a collision operator on GPUs.

  17. Toward End-to-End Face Recognition Through Alignment Learning

    Science.gov (United States)

    Zhong, Yuanyi; Chen, Jiansheng; Huang, Bo

    2017-08-01

    Plenty of effective methods have been proposed for face recognition during the past decade. Although these methods differ essentially in many aspects, a common practice among them is to specifically align the facial area based on prior knowledge of human face structure before feature extraction. In most systems, the face alignment module is implemented independently. This has actually caused difficulties in the designing and training of end-to-end face recognition models. In this paper we study the possibility of alignment learning in end-to-end face recognition, in which neither prior knowledge of facial landmarks nor artificially defined geometric transformations are required. Specifically, spatial transformer layers are inserted in front of the feature extraction layers in a Convolutional Neural Network (CNN) for face recognition. Only human identity clues are used for driving the neural network to automatically learn the most suitable geometric transformation and the most appropriate facial area for the recognition task. To ensure reproducibility, our model is trained purely on the publicly available CASIA-WebFace dataset, and is tested on the Labeled Faces in the Wild (LFW) dataset. We have achieved a verification accuracy of 99.08% which is comparable to state-of-the-art single-model based methods.
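
    A minimal PyTorch sketch of the core idea, a spatial transformer layer placed in front of the feature extractor so the network learns its own alignment. Layer sizes and the localization net are placeholders, not the paper's architecture.

```python
# Spatial transformer layer sketch: the localization net regresses a 2x3
# affine matrix, which warps the input before feature extraction.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatialTransformer(nn.Module):
    def __init__(self):
        super().__init__()
        # Small localization net (placeholder sizes, not the paper's).
        self.loc = nn.Sequential(
            nn.Conv2d(3, 8, 7, stride=2), nn.ReLU(),
            nn.Conv2d(8, 16, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(16, 6),
        )
        # Initialize to the identity transform so training starts stable.
        self.loc[-1].weight.data.zero_()
        self.loc[-1].bias.data.copy_(
            torch.tensor([1, 0, 0, 0, 1, 0], dtype=torch.float))

    def forward(self, x):
        theta = self.loc(x).view(-1, 2, 3)
        grid = F.affine_grid(theta, x.size(), align_corners=False)
        return F.grid_sample(x, grid, align_corners=False)

stn = SpatialTransformer()
faces = torch.randn(4, 3, 112, 96)   # batch of face crops
aligned = stn(faces)                 # same shape, learned alignment
print(aligned.shape)                 # torch.Size([4, 3, 112, 96])
```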

  18. End-to-end network/application performance troubleshooting methodology

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Wenji; Bobyshev, Andrey; Bowden, Mark; Crawford, Matt; Demar, Phil; Grigaliunas, Vyto; Grigoriev, Maxim; Petravick, Don; /Fermilab

    2007-09-01

    The computing models for HEP experiments are globally distributed and grid-based. Obstacles to good network performance arise from many causes and can be a major impediment to the success of the computing models for HEP experiments. Factors that affect overall network/application performance exist on the hosts themselves (application software, operating system, hardware), in the local area networks that support the end systems, and within the wide area networks. Since the computer and network systems are globally distributed, it can be very difficult to locate and identify the factors that are hurting application performance. In this paper, we present an end-to-end network/application performance troubleshooting methodology developed and in use at Fermilab. The core of our approach is to narrow down the problem scope with a divide and conquer strategy. The overall complex problem is split into two distinct sub-problems: host diagnosis and tuning, and network path analysis. After satisfactorily evaluating, and if necessary resolving, each sub-problem, we conduct end-to-end performance analysis and diagnosis. The paper will discuss tools we use as part of the methodology. The long term objective of the effort is to enable site administrators and end users to conduct much of the troubleshooting themselves, before (or instead of) calling upon network and operating system 'wizards,' who are always in short supply.

  19. Performance Enhancements of UMTS networks using end-to-end QoS provisioning

    DEFF Research Database (Denmark)

    Wang, Haibo; Prasad, Devendra; Teyeb, Oumer

    2005-01-01

    This paper investigates end-to-end (E2E) quality of service (QoS) provisioning approaches for UMTS networks together with a DiffServ IP network. The effort focused on QoS class mapping from DiffServ to UMTS, Access Control (AC), and buffering and scheduling optimization. The DiffServ Code Point (DSCP...

  20. Probing end-to-end cyclization beyond Willemski and Fixman.

    Science.gov (United States)

    Chen, Shaohua; Duhamel, Jean; Winnik, Mitchell A

    2011-04-07

    A series of poly(ethylene oxide)s labeled at both ends with pyrene (PEO(X)-Py(2), where X represents the number average molecular weight (M(n)) of the PEO chains and equals 2, 5, 10, and 16.5 K) was prepared together with a one-pyrene-monolabeled PEO (PEO(2K)-Py). The process of end-to-end cyclization (EEC) was investigated by monitoring intramolecular excimer formation in seven organic solvents with viscosities (η) ranging from 0.32 to 1.92 mPa·s. The steady-state fluorescence spectra showed that excimer formation of PEO(X)-Py(2) decreased strongly with increasing η and M(n). The monomer and excimer time-resolved fluorescence decays were analyzed according to the traditional Birks' scheme. Birks' scheme analysis indicated that the decrease in excimer formation with increasing M(n) and η was due partly to a decrease in the rate constant of EEC, but most importantly, to a large increase in the fraction of pyrenes that did not form excimer (f(Mfree)). This result is in itself incompatible with Birks' scheme analysis, which requires that f(Mfree) be the molar fraction of chains bearing a single pyrene at one chain end; in short, f(Mfree) does not depend on M(n) and η within the framework of Birks' scheme analysis. In turn, this unexpected result agrees with the framework of the fluorescence blob model (FBM), which predicts that quenching takes place inside a blob, the finite volume probed by an excited chromophore during its lifetime. Increasing M(n) and η results in a larger fraction of chains having a conformation where the quencher is located outside the blob, resulting in an increase in f(Mfree). Equations were derived to apply the FBM analysis, originally designed to study randomly labeled polymers, to the end-labeled PEO(X)-Py(2) series. FBM analysis was found to describe satisfactorily the data obtained with the longer PEO(X)-Py(2) samples.
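
    The Birks scheme predicts a biexponential monomer decay, so the analysis reduces to fitting two exponentials and asking how much signal is left over. A synthetic Python illustration (ours; parameter values and names are not from the paper):

```python
# Fit a biexponential monomer decay, as Birks' scheme predicts, and crudely
# estimate the non-cyclizing fraction. Synthetic data; all parameter values
# and variable names are our assumptions, not the paper's.

import numpy as np
from scipy.optimize import curve_fit

def monomer_decay(t, a1, lam1, a2, lam2):
    """Birks' scheme: monomer fluorescence decays as a double exponential."""
    return a1 * np.exp(-lam1 * t) + a2 * np.exp(-lam2 * t)

# Synthetic decay: a fast component from chains that cyclize and a slow
# component (lifetime 1/0.005 = 200 ns) from monomers that never do.
t = np.linspace(0, 800, 400)  # ns
rng = np.random.default_rng(0)
data = monomer_decay(t, 0.7, 0.02, 0.3, 0.005) + rng.normal(0, 0.002, t.size)

popt, _ = curve_fit(monomer_decay, t, data, p0=[0.5, 0.01, 0.5, 0.004])
a1, lam1, a2, lam2 = popt
# With negligible excimer dissociation, the cyclization rate is roughly the
# difference of the two decay rates; a2/(a1+a2) is a crude f(Mfree) proxy.
print(f"k_EEC ~ {lam1 - lam2:.4f} 1/ns, f_Mfree ~ {a2 / (a1 + a2):.2f}")
```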

  1. End-to-end security in telemedical networks--a practical guideline.

    Science.gov (United States)

    Wozak, Florian; Schabetsberger, Thomas; Ammmenwerth, Elske

    2007-01-01

    The interconnection of medical networks in different healthcare institutions will be constantly increasing over the next few years, which will require concepts for securing medical data during transfer, since transmitting patient related data via potentially insecure public networks is considered a violation of data privacy. The aim of our work was to develop a model-based approach towards end-to-end security which is defined as continuous security from point of origin to point of destination in a communication process. We show that end-to-end security must be seen as a holistic security concept, which comprises the following three major parts: authentication and access control, transport security, as well as system security. For integration into existing security infrastructures abuse case models were used, which extend UML use cases, by elements necessary to describe abusive interactions. Abuse case models can be constructed for each part mentioned above, allowing for potential security risks in communication from point of origin to point of destination to be identified and counteractive measures to be directly derived from the abuse case models. The model-based approach is a guideline to continuous risk assessment and improvement of end-to-end security in medical networks. Validity and relevance to practice will be systematically evaluated using close-to-reality test networks as well as in production environments.

  2. Direct muscle neurotization after end-to-end and end-to-side neurorrhaphy

    Science.gov (United States)

    Papalia, Igor; Ronchi, Giulia; Muratori, Luisa; Mazzucco, Alessandra; Magaudda, Ludovico; Geuna, Stefano

    2012-01-01

    The need for continuous research into new tools for improving motor function recovery after nerve injury is justified by the still often unsatisfactory clinical outcome in these patients. It has been previously shown that the combined use of two reconstructive techniques, namely end-to-side neurorrhaphy and direct muscle neurotization in the rat hindlimb model, can lead to good results in terms of skeletal muscle reinnervation. Here we show that, in the rat forelimb model, the combined use of direct muscle neurotization with either end-to-end or end-to-side neurorrhaphy to reinnervate the denervated flexor digitorum muscles leads to muscle atrophy prevention over a long postoperative time lapse (10 months). By contrast, very little motor recovery (in case of end-to-end neurorrhaphy) and almost no motor recovery (in case of end-to-side neurorrhaphy) were observed in the grasping activity controlled by the flexor digitorum muscles. It can thus be concluded that, at least in the rat, direct muscle neurotization after both end-to-end and end-to-side neurorrhaphy represents a good strategy for preventing denervation-related muscle atrophy but not for regaining the lost motor function. PMID:25538749

  3. OGC standards for end-to-end sensor network integration

    Science.gov (United States)

    Headley, K. L.; Broering, A.; O'Reilly, T. C.; Toma, D.; Del Rio, J.; Bermudez, L. E.; Zedlitz, J.; Johnson, G.; Edgington, D.

    2010-12-01

    technology, and can communicate with any sensor whose protocol can be described by a SID. The SID interpreter transfers retrieved sensor data to a Sensor Observation Service, and transforms tasks submitted to a Sensor Planning Service into actual sensor commands. The proposed SWE PUCK protocol complements SID by providing a standard way to associate a sensor with a SID, thereby completely automating the sensor integration process. PUCK protocol is implemented in sensor firmware, and provides a means to retrieve a universally unique identifier, metadata and other information from the device itself through its communication interface. Thus the SID interpreter can retrieve a SID directly from the sensor through PUCK protocol. Alternatively the interpreter can retrieve the sensor’s SID from an external source, based on the unique sensor ID provided by PUCK protocol. In this presentation, we describe the end-to-end integration of several commercial oceanographic instruments into a sensor network using PUCK, SID and SWE services. We also present a user-friendly, graphical tool to generate SIDs and tools to visualize sensor data.

  4. End-to-End Assessment of a Large Aperture Segmented Ultraviolet Optical Infrared (UVOIR) Telescope Architecture

    Science.gov (United States)

    Feinberg, Lee; Bolcar, Matt; Liu, Alice; Guyon, Olivier; Stark, Chris; Arenberg, Jon

    2016-01-01

    Key challenges of a future large aperture, segmented Ultraviolet Optical Infrared (UVOIR) Telescope capable of performing a spectroscopic survey of hundreds of Exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high-yield Exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end-to-end architecture including a high-throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage-based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an Exo-Earth yield assessment to evaluate potential performance.

  5. End-to-End Beam Dynamics Simulations for the ANL-RIA Driver Linac

    CERN Document Server

    Ostroumov, P N

    2004-01-01

    The proposed Rare Isotope Accelerator (RIA) Facility consists of a superconducting (SC) 1.4 GV driver linac capable of producing 400 kW beams of any ion from hydrogen to uranium. The driver is configured as an array of ~350 SC cavities, each with an independently controllable rf phase. For the end-to-end beam dynamics design and simulation we use a dedicated code, TRACK. The code integrates ion motion through the three-dimensional fields of all elements of the driver linac, beginning from the exit of the electron cyclotron resonance (ECR) ion source to the production targets. TRACK has been parallelized and is able to track large numbers of particles in randomly seeded accelerators with misalignments and a comprehensive set of errors. The simulation starts with multi-component dc ion beams extracted from the ECR. Beam losses are obtained by tracking up to a million particles in hundreds of randomly seeded accelerators. To control beam losses a set of collimators is applied in designated areas. The end-to-end simulat...

  6. Comparison of postoperative motility in hand-sewn end-to-end anastomosis and functional end-to-end anastomosis: an experimental study in conscious dogs.

    Science.gov (United States)

    Toyomasu, Yoshitaka; Mochiki, Erito; Ando, Hiroyuki; Yanai, Mitsuhiro; Ogata, Kyoichi; Tabe, Yuichi; Ohno, Tetsuro; Aihara, Ryuusuke; Kuwano, Hiroyuki

    2010-09-01

    The objective of this study is to compare the postoperative motility between hand-sewn end-to-end anastomosis and functional end-to-end anastomosis. Fifteen conscious dogs were divided into three groups: a normal intact dog group, an end-to-end anastomosis group (EE), and a functional end-to-end anastomosis group (FEE). In the EE and FEE groups, the dogs underwent a transection of the jejunum 30 cm distal to the Treitz ligament and anastomosis by each method. To compare the gastrointestinal motility, the time to the appearance and the rate of propagation of interdigestive migrating motor contractions (IMC) across the anastomosis, as well as the motility index (MI) at the oral and anal sides of the anastomosis, were measured using strain gauge force transducers. Furthermore, the intrinsic nerve fibers were examined histologically. The time to the appearance of propagation of IMC in the EE and FEE was not significantly different. The propagation rates of IMC in the EE and FEE completely recovered within 4 weeks of the surgery. The MI in the EE and FEE was not significantly different. In addition, no continuity of intrinsic nerve fibers across the anastomosis could be identified in either group. In the present study, there are no significant differences between the EE and FEE with regard to the time of the appearance and the rate of propagation of IMC. These results suggest that the effect of functional end-to-end anastomosis on postoperative motility does not differ from that of hand-sewn end-to-end anastomosis.

  7. End-to-End Assessment of a Large Aperture Segmented Ultraviolet Optical Infrared (UVOIR) Telescope Architecture

    Science.gov (United States)

    Feinberg, Lee; Rioux, Norman; Bolcar, Matthew; Liu, Alice; Guyon, Olivier; Stark, Chris; Arenberg, Jon

    2016-01-01

    Key challenges of a future large aperture, segmented Ultraviolet Optical Infrared (UVOIR) Telescope capable of performing a spectroscopic survey of hundreds of Exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high-yield Exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end-to-end architecture including a high-throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage-based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an Exo-Earth yield assessment to evaluate potential performance. These efforts are combined through integrated modeling, coronagraph evaluations, and Exo-Earth yield calculations to assess the potential performance of the selected architecture. In addition, we discuss the scalability of this architecture to larger apertures and the technological tall poles to enabling it.

  8. The end-to-end testbed of the optical metrology system on-board LISA Pathfinder

    Energy Technology Data Exchange (ETDEWEB)

    Steier, F; Guzmán Cervantes, F; García Marín, A F; Heinzel, G; Danzmann, K [Max-Planck-Institut fuer Gravitationsphysik (Albert-Einstein-Institut) and Universitaet Hannover (Germany); Gerardi, D, E-mail: frank.steier@aei.mpg.d [EADS Astrium Satellites GmbH, Friedrichshafen (Germany)

    2009-05-07

    LISA Pathfinder is a technology demonstration mission for the Laser Interferometer Space Antenna (LISA). The main experiment on-board LISA Pathfinder is the so-called LISA Technology Package (LTP), which has the aim of measuring the differential acceleration between two free-falling test masses with an accuracy of 3 × 10^-14 m s^-2 Hz^-1/2 between 1 mHz and 30 mHz. This measurement is performed interferometrically by the optical metrology system (OMS) on-board LISA Pathfinder. In this paper, we present the development of an experimental end-to-end testbed of the entire OMS. It includes the interferometer and its sub-units, the interferometer backend, which is a phasemeter, and the processing of the phasemeter output data. Furthermore, three-axis piezo-actuated mirrors are used instead of the free-falling test masses for the characterization of the dynamic behaviour of the system and some parts of the drag-free and attitude control system (DFACS) which controls the test masses and the satellite. The end-to-end testbed includes all parts of the LTP that can reasonably be tested on earth without free-falling test masses. At its present status it consists mainly of breadboard components. Some of those have already been replaced by engineering models of the LTP experiment. In the next steps, further engineering and flight models will also be inserted in this testbed and tested against well-characterized breadboard components. The presented testbed is an important reference for the unit tests and can also be used for validation of the on-board experiment during the mission.

  9. Portable end-to-end ground system for low-cost mission support

    Science.gov (United States)

    Lam, Barbara

    1996-11-01

    This paper presents a revolutionary architecture of the end-to-end ground system to reduce overall mission support costs. The present ground system of the Jet Propulsion Laboratory (JPL) is costly to operate, maintain, deploy, reproduce, and document. In the present climate of shrinking NASA budgets, this proposed architecture takes on added importance as it should dramatically reduce all of the above costs. Currently, the ground support functions (i.e., receiver, tracking, ranging, telemetry, command, monitor and control) are distributed among several subsystems that are housed in individual rack-mounted chassis. These subsystems can be integrated into one portable laptop system using established Multi Chip Module (MCM) packaging technology and object-based software libraries. The large scale integration of subsystems into a small portable system connected to the World Wide Web (WWW) will greatly reduce operations, maintenance and reproduction costs. Several of the subsystems can be implemented using Commercial Off-The-Shelf (COTS) products further decreasing non-recurring engineering costs. The inherent portability of the system will open up new ways for using the ground system at the "point-of-use" site as opposed to maintaining several large centralized stations. This eliminates the propagation delay of the data to the Principal Investigator (PI), enabling the capture of data in real-time and performing multiple tasks concurrently from any location in the world. Sample applications are to use the portable ground system in remote areas or mobile vessels for real-time correlation of satellite data with earth-bound instruments; thus, allowing near real-time feedback and control of scientific instruments. This end-to-end portable ground system will undoubtedly create opportunities for better scientific observation and data acquisition.

  10. VisualCommander for Rapid End-to-End Mission Design and Simulation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal is for the development of a highly extensible and user-configurable software application for end-to-end mission simulation and design. We will leverage...

  11. Atlantis model outputs - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  12. Model outputs - Developing end-to-end models of the Gulf of California

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the northern Gulf of California, linking oceanography, biogeochemistry, food web...

  13. Physical oceanography - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  14. An end-to-end anastomosis model of guinea pig bile duct: A 6-mo observation

    Science.gov (United States)

    Zhang, Xiao-Qing; Tian, Yuan-Hu; Xu, Zhi; Wang, Li-Xin; Hou, Chun-Sheng; Ling, Xiao-Feng; Zhou, Xiao-Si

    2011-01-01

    AIM: To establish an end-to-end anastomosis (EEA) model of the guinea pig bile duct and evaluate the healing process of the bile duct. METHODS: Thirty-two male guinea pigs were randomly divided into a control group and 2-, 3-, and 6-mo groups after establishment of the EEA model. Histological, immunohistochemical and serologic tests as well as measurements of bile contents were performed. The bile duct diameter and the diameter ratio (DR) were measured to assess the formation of relative stricture. RESULTS: Acute and chronic inflammatory reactions occurred throughout the healing process of the bile duct. Serologic tests and bile content measurements showed no formation of persistent stricture in the 6-mo group. The DR revealed a transient formation of relative stricture in the 2-mo group in comparison with the control group (2.94 ± 0.17 vs 1.89 ± 0.27, P = 0.004). However, this relative stricture was released in the 6-mo group (2.14 ± 0.18, P = 0.440). CONCLUSION: A simple and reliable EEA model of the guinea pig bile duct can be established with good reproducibility and a satisfactory survival rate. PMID:21390151

  15. End-to-End Optimization of High-Throughput DNA Sequencing.

    Science.gov (United States)

    O'Reilly, Eliza; Baccelli, Francois; De Veciana, Gustavo; Vikalo, Haris

    2016-10-01

    At the core of Illumina's high-throughput DNA sequencing platforms lies a biophysical surface process, bridge amplification, that results in a random geometry of clusters of homogeneous short DNA fragments, typically hundreds of base pairs long. The statistical properties of this random process and the lengths of the fragments are critical, as they affect the information that can subsequently be extracted, that is, the density of successfully inferred DNA fragment reads. The ensembles of overlapping DNA fragment reads are then used to computationally reconstruct the much longer target genome sequence. The success of the reconstruction in turn depends on having a sufficiently large ensemble of DNA fragments that are sufficiently long. In this article, using stochastic geometry, we model and optimize the end-to-end flow cell synthesis and target genome sequencing process, linking and partially controlling the statistics of the physical processes to the success of the final computational step. Based on a rough calibration of our model, we provide, for the first time, a mathematical framework capturing the salient features of the sequencing platform that serves as a basis for optimizing cost, performance, and/or sensitivity analysis to various parameters.
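
    As a simple stand-in for the article's stochastic-geometry model, the classical Lander-Waterman coverage statistics capture the same end-to-end dependence of reconstruction success on read count and length (our illustration; it ignores the random cluster geometry of bridge amplification):

```python
# Classical Lander-Waterman coverage statistics (not the article's model).
from math import exp

def coverage_stats(n_reads, read_len, genome_len):
    c = n_reads * read_len / genome_len   # mean coverage depth
    p_uncovered = exp(-c)                 # Poisson: P(base has no read)
    contigs = n_reads * exp(-c)           # expected contigs, ignoring the
                                          # minimum-overlap correction
    return c, p_uncovered, contigs

c, p_gap, contigs = coverage_stats(n_reads=100_000, read_len=100,
                                   genome_len=5_000_000)
print(f"coverage {c:.1f}x, P(base uncovered) {p_gap:.3f}, "
      f"expected contigs {contigs:.0f}")
# coverage 2.0x, P(base uncovered) 0.135, expected contigs 13534
```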

  16. End-to-end Information Flow Security Model for Software-Defined Networks

    Directory of Open Access Journals (Sweden)

    D. Ju. Chaly

    2015-01-01

    Software-defined networks (SDN) are a novel networking paradigm that has become an enabler technology for many modern applications such as network virtualization, policy-based access control and many others. Software can provide flexibility and fast-paced innovation in networking; however, it has a complex nature, so there is an increasing need for means of assuring its correctness and security. Abstract models for SDN can tackle these challenges. This paper addresses confidentiality and some integrity properties of SDNs. These are critical properties for multi-tenant SDN environments, since the network management software must ensure that no confidential data of one tenant are leaked to other tenants in spite of using the same physical infrastructure. We define a notion of end-to-end security in the context of software-defined networks and propose a semantic model where reasoning is possible about confidentiality, and where we can check that confidential information flows do not interfere with non-confidential ones. We show that the model can be extended in order to reason about networks with secure and insecure links, which can arise, for example, in wireless environments. The article is published in the authors' wording.
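
    A deliberately tiny Python toy in the spirit of the paper's setting (not its semantics): check that a confidential tenant flow neither crosses tenants nor traverses an insecure link. The topology, labels, and rules below are our assumptions.

```python
# Toy end-to-end confidentiality check over a multi-tenant SDN topology.
# A simplified stand-in, not the paper's formal information-flow model.

SECURE, INSECURE = "secure", "insecure"

# (endpoint_a, endpoint_b, link_security) edges of an assumed topology.
links = [
    ("h1", "s1", SECURE), ("s1", "s2", INSECURE), ("s2", "h2", SECURE),
    ("s1", "h3", SECURE),
]
tenant = {"h1": "A", "h2": "B", "h3": "A"}  # host-to-tenant assignment

def violates(flow_path, confidential=True):
    """A confidential flow violates end-to-end security if it traverses an
    insecure link or terminates at another tenant's host."""
    owner = tenant[flow_path[0]]
    for a, b in zip(flow_path, flow_path[1:]):
        sec = next(s for (x, y, s) in links if {x, y} == {a, b})
        if confidential and sec == INSECURE:
            return True
    return confidential and tenant.get(flow_path[-1], owner) != owner

print(violates(["h1", "s1", "s2", "h2"]))  # True: insecure hop, tenant B
print(violates(["h1", "s1", "h3"]))        # False: stays within tenant A
```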

  17. An end-to-end communications architecture for condition-based maintenance applications

    Science.gov (United States)

    Kroculick, Joseph

    2014-06-01

    This paper explores challenges in implementing an end-to-end communications architecture for Condition-Based Maintenance Plus (CBM+) data transmission which aligns with the Army's Network Modernization Strategy. The Army's Network Modernization strategy is based on rolling out network capabilities which connect the smallest unit and Soldier level to enterprise systems. CBM+ is a continuous improvement initiative over the life cycle of a weapon system or equipment to improve the reliability and maintenance effectiveness of Department of Defense (DoD) systems. CBM+ depends on the collection, processing and transport of large volumes of data. An important capability that enables CBM+ is an end-to-end network architecture that enables data to be uploaded from the platform at the tactical level to enterprise data analysis tools. To connect end-to-end maintenance processes in the Army's supply chain, a CBM+ network capability can be developed from available network capabilities.

  18. End-to-end requirements management for multiprojects in the construction industry

    DEFF Research Database (Denmark)

    Wörösch, Michael

    industries. The research at the same time gives managers of construction projects a tool with which to manage their requirements end-to-end. In order to investigate how construction companies handle requirements, a case project – a Danish construction syndicate producing sandwich elements made from High... Performance Concrete and insulation materials – is used. By means of action research and interviews of case project staff it has become evident that many elements of formalized requirements management are missing in the case project. To fill those gaps and be able to manage requirements end-to-end...

  19. Security Considerations around End-to-End Security in the IP-based Internet of Things

    NARCIS (Netherlands)

    Brachmann, M.; Garcia-Mochon, O.; Keoh, S.L.; Kumar, S.S.

    2012-01-01

    The IP-based Internet of Things refers to the interconnection of smart objects in a Low-power and Lossy Network (LLN) with the Internet by means of protocols such as 6LoWPAN or CoAP. The provisioning of an end-to-end security connection is the key to ensuring basic functionalities such as software...

  20. Human Assisted Robotic Vehicle Studies - A conceptual end-to-end mission architecture

    NARCIS (Netherlands)

    Lehner, B.; Mazzotta, D. G.; Teeney, L.; Spina, F.; Filosa, A.; Pou, A. Canals; Schlechten, J.; Campbell, S.; Soriano, P. López

    2017-01-01

    With current space exploration roadmaps indicating the Moon as a proving ground on the way to human exploration of Mars, it is clear that human-robotic partnerships will play a key role in successful future human space missions. This paper details a conceptual end-to-end architecture for an...

  1. Hardware Support for Malware Defense and End-to-End Trust

    Science.gov (United States)

    2017-02-01

    Final technical report (February 2017) by International Business Machines Corporation (IBM), T.J. Watson Research Center, 1101 Kitchawan Rd, Yorktown Heights, NY, on hardware support for malware defense and end-to-end trust. Only report front matter (cover page, table-of-contents and acknowledgments fragments) is available in place of an abstract.

  2. Coupling of a single quantum emitter to end-to-end aligned silver nanowires

    DEFF Research Database (Denmark)

    Kumar, Shailesh; Huck, Alexander; Chen, Yuntian

    2013-01-01

    We report on the observation of coupling a single nitrogen vacancy (NV) center in a nanodiamond crystal to a propagating plasmonic mode of silver nanowires. The nanocrystal is placed either near the apex of a single silver nanowire or in the gap between two end-to-end aligned silver nanowires. We...

  3. Strategic design issues of IMS versus end-to-end architectures

    NARCIS (Netherlands)

    Braet, O.; Ballon, P.

    2007-01-01

    Purpose - The paper aims to discuss the business issues surrounding the choice between the end-to-end internet architecture, in particular peer-to-peer networks, versus managed telecommunications architectures, in particular IMS, for the migration towards a next-generation mobile system.

  4. End-to-End Availability Analysis of IMS-Based Networks

    DEFF Research Database (Denmark)

    Kamyod, Chayapol; Nielsen, Rasmus Hjorth; Prasad, Neeli R.

    2013-01-01

    Generation Networks (NGNs). In this paper, an end-to-end availability model is proposed and evaluated using a combination of Reliability Block Diagrams (RBD) and a proposed five-state Markov model. The overall availability for intra- and inter-domain communication in IMS is analyzed, and the state...

  5. Integrated Information and Network Management for End-to-End Quality of Service

    Science.gov (United States)

    2011-11-01

    Report by Marco Carvalho and Adrian Granados, Florida Institute for... Only front-matter fragments are available in place of an abstract, including the reference: Carvalho, M., Granados, A., Naqwi, W., Brothers, A., Hanna, J., and Turck, K., "A cross-layer communications substrate for tactical information management."

  6. End-to-end Configuration of Wireless Realtime Communication over Heterogeneous Protocols

    DEFF Research Database (Denmark)

    Malinowsky, B.; Grønbæk, Jesper; Schwefel, Hans-Peter

    2015-01-01

    This paper describes a wireless real-time communication system design using two Time Division Multiple Access (TDMA) protocols. Messages are subject to prioritization and queuing. For this interoperation scenario, we show a method for end-to-end configuration of protocols and queue sizes...

  7. End-to-end delay analysis in wireless sensor networks with service vacation

    KAUST Repository

    Alabdulmohsin, Ibrahim

    2014-04-01

    In this paper, a delay-sensitive multi-hop wireless sensor network is considered, employing an M/G/1 with vacations framework. Sensors transmit measurements to a predefined data sink subject to a maximum end-to-end delay constraint. In order to prolong the battery lifetime, a sleeping scheme is adopted throughout the network nodes. The objective of our proposed framework is to present an expression for the maximum hop count as well as an approximate expression for the probability of blocking at the sink node upon violating a certain end-to-end delay threshold. Using numerical simulations, we validate the proposed analytical model and demonstrate that the blocking probability of the system for various vacation time distributions matches the simulation results.
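
    The mean-value side of this setup can be sketched with the classical decomposition result for an M/G/1 queue with multiple vacations, W = λE[S²]/(2(1−ρ)) + E[V²]/(2E[V]). The Python below bounds the hop count from a delay budget; the parameters are illustrative, and the paper itself derives a blocking probability rather than this mean-value bound.

```python
# Per-hop delay under M/G/1 with multiple vacations, then a hop-count bound
# from an end-to-end delay budget. Parameter values are assumptions.

def mg1_vacation_wait(lam, es, es2, ev, ev2):
    """Mean waiting time: Pollaczek-Khinchine term plus vacation term."""
    rho = lam * es
    assert rho < 1, "queue must be stable"
    return lam * es2 / (2 * (1 - rho)) + ev2 / (2 * ev)

lam = 5.0              # packets/s arriving at a sensor node
es, es2 = 0.05, 0.005  # service time: mean 50 ms, second moment (s^2)
ev, ev2 = 0.2, 0.06    # sleep (vacation) time moments (s, s^2)

per_hop = es + mg1_vacation_wait(lam, es, es2, ev, ev2)  # sojourn per hop
budget = 2.0           # end-to-end delay constraint in seconds
print(f"mean per-hop delay {per_hop * 1e3:.0f} ms, "
      f"max hops ~ {int(budget // per_hop)}")   # 217 ms -> 9 hops
```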

  8. Building dialogue POMDPs from expert dialogues an end-to-end approach

    CERN Document Server

    Chinaei, Hamidreza

    2016-01-01

    This book discusses the Partially Observable Markov Decision Process (POMDP) framework applied in dialogue systems. It presents POMDP as a formal framework to represent uncertainty explicitly while supporting automated policy solving. The authors propose and implement an end-to-end learning approach for dialogue POMDP model components. Starting from scratch, they present the state, the transition model, the observation model and finally the reward model, learned from unannotated and noisy dialogues. Altogether these form a significant set of contributions that can potentially inspire substantial further work. This concise manuscript is written in simple language, full of illustrative examples, figures, and tables. It provides insights on building dialogue systems to be applied in real domains; illustrates learning dialogue POMDP model components from unannotated dialogues in a concise format; and introduces an end-to-end approach that makes use of unannotated and noisy dialogue for learning each component of dialogue POM...

  9. Increasing Army Supply Chain Performance: Using an Integrated End to End Metrics System

    Science.gov (United States)

    2017-01-01

    Army Materiel Command and the University of Alabama in Huntsville partnered to develop an integrated end-to-end performance metrics system...their supply chains, Army Materiel Command (AMC), headquartered in Huntsville, Alabama, partnered with the University of Alabama–Huntsville (UAH) in...chain metrics system. This research project, named the Enterprise Supply Chain Analysis & Logistics Engine (eSCALE) project, was managed by AMC...

  10. Financing the End-to-end Supply Chain: A Reference Guide to Supply Chain Finance

    OpenAIRE

    Templar, Simon; Hofmann, Erik; Findlay, Charles

    2016-01-01

    Financing the End to End Supply Chain provides readers with a real insight into the increasingly important area of supply chain finance. It demonstrates the importance of the strategic relationship between the physical supply of goods and services and the associated financial flows. The book provides a clear introduction, demonstrating the importance of the strategic relationship between supply chain and financial communities within an organization. It contains vital information on how supply...

  11. WiMAX security and quality of service an end-to-end perspective

    CERN Document Server

    Tang, Seok-Yee; Sharif, Hamid

    2010-01-01

    WiMAX is the first standard technology to deliver true broadband mobility at speeds that enable powerful multimedia applications such as Voice over Internet Protocol (VoIP), online gaming, mobile TV, and personalized infotainment. WiMAX Security and Quality of Service focuses on the interdisciplinary subject of advanced security and Quality of Service (QoS) in WiMAX wireless telecommunication systems, including its models, standards, implementations, and applications. The book is split into four parts; Part A is an end-to-end overview of the WiMAX architecture, protocol, and system requirements.

  12. On end-to-end performance of MIMO multiuser in cognitive radio networks

    KAUST Repository

    Yang, Yuli

    2011-12-01

    In this paper, a design for multiple-input multiple-output (MIMO) multiuser transmission in the cognitive radio network is developed and its end-to-end performance is investigated under spectrum-sharing constraints. First, the overall average packet error rate is analyzed, taking into account channel state information feedback delay and multiuser scheduling. We then provide corresponding numerical results to evaluate the performance in several separate scenarios, providing a convenient tool for the design of cognitive radio networks with multiple secondary MIMO users. © 2011 IEEE.

  13. END-TO-END DEPTH FROM MOTION WITH STABILIZED MONOCULAR VIDEOS

    Directory of Open Access Journals (Sweden)

    C. Pinard

    2017-08-01

    Full Text Available We propose a depth map inference system for monocular videos based on a novel navigation dataset that mimics aerial footage from a gimbal-stabilized monocular camera in rigid scenes. Unlike most navigation datasets, the absence of rotation yields an easier structure-from-motion problem, which can be leveraged for tasks such as depth inference and obstacle avoidance. We also propose an architecture for end-to-end depth inference with a fully convolutional network. Results show that, although tied to the camera's intrinsic parameters, the problem is locally solvable and leads to good quality depth prediction.
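
    A minimal PyTorch sketch of the kind of fully convolutional depth network the abstract describes is given below; the layer sizes, the stacked-frame input and the Softplus output are our illustrative assumptions, not the paper's architecture.

    ```python
    import torch
    import torch.nn as nn

    class TinyDepthNet(nn.Module):
        """Minimal fully convolutional encoder-decoder for per-pixel depth.

        Illustrative only. Input: a pair of stabilized frames stacked along
        the channel axis (2 x 3 = 6 channels).
        """
        def __init__(self):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Conv2d(6, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            )
            self.decoder = nn.Sequential(
                nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
                nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1), nn.Softplus(),  # depth > 0
            )

        def forward(self, frame_pair):
            return self.decoder(self.encoder(frame_pair))

    net = TinyDepthNet()
    depth = net(torch.randn(1, 6, 128, 128))  # -> (1, 1, 128, 128) depth map
    ```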

  14. End-to-End Simulation and Verification of Rendezvous and Docking/Berthing Systems using Robotics

    OpenAIRE

    Benninghoff, Heike

    2016-01-01

    The rendezvous and docking/berthing (RvD/B) phase is one of the most complex and critical parts of future on-orbit servicing missions. Especially the operations during the final approach (separation distance < 20m) have to be verified and tested in detail. Such tests involve on-board systems, communication systems and ground systems. In the framework of an end-to-end simulation of the final approach to and capture of a tumbling client satellite, the necessary components are developed and t...

  15. Analytical Framework for End-to-End Delay Based on Unidirectional Highway Scenario

    Directory of Open Access Journals (Sweden)

    Aslinda Hassan

    2015-01-01

    Full Text Available In a sparse vehicular ad hoc network, a vehicle normally employs a carry-and-forward approach, holding the message it wants to transmit until it meets other vehicles or roadside units. A number of analyses in the literature have investigated the time delay when packets are carried by vehicles on both unidirectional and bidirectional highways. However, these analyses focus on the delay between either two disconnected vehicles or two disconnected vehicle clusters, and the majority consider only the expected value of the end-to-end delay under the carry-and-forward approach. Using regression analysis, we establish that the time delay between two disconnected vehicle clusters follows an exponential distribution. A new distribution is then derived to represent the number of clusters on a highway using a vehicular traffic model. From there, we formulate an end-to-end delay model that extends the two-cluster delay model to multiple disconnected clusters on a unidirectional highway. The analytical results obtained from the analytical model are then validated through simulation results.
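
    The delay model lends itself to a quick Monte Carlo check: draw a random number of clusters, sample an exponential delay for each inter-cluster gap, and sum. The Poisson cluster count and all numeric values below are illustrative stand-ins for the distributions derived in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def e2e_delay_samples(mean_gap_delay=8.0, mean_clusters=4.0, n=20_000):
        """Monte Carlo sketch of the paper's idea (parameter values are made up).

        The delay between two disconnected clusters is exponential (as fitted by
        the regression analysis); the number of clusters is drawn from a Poisson
        purely for illustration. End-to-end delay is the sum of per-gap delays.
        """
        n_gaps = np.maximum(rng.poisson(mean_clusters, n) - 1, 0)  # gaps = clusters - 1
        return np.array([rng.exponential(mean_gap_delay, k).sum() for k in n_gaps])

    d = e2e_delay_samples()
    print(f"mean end-to-end delay: {d.mean():.2f} s, "
          f"95th percentile: {np.percentile(d, 95):.2f} s")
    ```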

  16. The end-to-end simulator for the E-ELT HIRES high resolution spectrograph

    Science.gov (United States)

    Genoni, M.; Landoni, M.; Riva, M.; Pariani, G.; Mason, E.; Di Marcantonio, P.; Disseau, K.; Di Varano, I.; Gonzalez, O.; Huke, P.; Korhonen, H.; Li Causi, Gianluca

    2017-06-01

    We present the design, architecture and results of the end-to-end simulator model of the high resolution spectrograph HIRES for the European Extremely Large Telescope (E-ELT). The system can be used by both engineers and scientists as a tool to characterize the spectrograph. The model simulates the behavior of photons from the scientific object (modeled with the main science drivers in mind) to the detector, also considering calibration light sources, and allows evaluation of the different parameters of the spectrograph design. In this paper, we detail the architecture of the simulator and the computational model, which are strongly characterized by the modularity and flexibility that will be crucial in next-generation astronomical observation projects such as the E-ELT, given their high complexity and long design and development cycles. Finally, we present synthetic images obtained with the current version of the end-to-end simulator based on the E-ELT HIRES requirements (in particular, high radial velocity accuracy). Once ingested in the Data Reduction Software (DRS), they will allow verification that the instrument design can achieve the radial velocity accuracy needed by the HIRES science cases.

  17. An End-To-End Test of A Simulated Nuclear Electric Propulsion System

    Science.gov (United States)

    VanDyke, Melissa; Hrbud, Ivana; Goddfellow, Keith; Rodgers, Stephen L. (Technical Monitor)

    2002-01-01

    The Safe Affordable Fission Engine (SAFE) test series addresses Phase I Space Fission Systems issues, in particular non-nuclear testing and system integration issues, leading to the testing and non-nuclear demonstration of a 400-kW fully integrated flight unit. The first part of the SAFE 30 test series demonstrated operation of the simulated nuclear core and heat pipe system. Experimental data acquired in a number of different test scenarios will validate existing computational models, demonstrate system flexibility (fast start-ups, multiple start-ups/shut-downs), and simulate predictable failure modes and operating environments. The objective of the second part is to demonstrate an integrated propulsion system consisting of a core, a conversion system and a thruster, in which the system converts thermal heat into jet power. This end-to-end system demonstration sets a precedent for ground testing of nuclear electric propulsion systems. The paper describes the SAFE 30 end-to-end system demonstration and its subsystems.

  18. End-to-End Airplane Detection Using Transfer Learning in Remote Sensing Images

    Directory of Open Access Journals (Sweden)

    Zhong Chen

    2018-01-01

    Full Text Available Airplane detection in remote sensing images remains a challenging problem due to the complexity of backgrounds. In recent years, with the development of deep learning, object detection has achieved major breakthroughs. For object detection tasks in natural images, such as the PASCAL (Pattern Analysis, Statistical Modelling and Computational Learning) VOC (Visual Object Classes) Challenge, the prevailing approach is to pre-train a deep neural network on a large amount of labeled classification data as a base network, and then fine-tune the network for detection using a small amount of annotated detection data. In this paper, we apply deep-learning-based object detection to airplane detection in remote sensing images. In addition to exploiting characteristics of remote sensing images, we propose new data augmentation techniques. We also use transfer learning and adopt a single deep convolutional neural network with limited training samples to implement end-to-end trainable airplane detection. Classification and localization are no longer divided into multistage tasks; end-to-end detection optimizes them jointly, which ensures an optimal solution at the final stage. In our experiments, we use remote sensing images of airports collected from Google Earth. The experimental results show that the proposed algorithm is highly accurate and meaningful for remote sensing object detection.
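
    The pre-train-then-fine-tune recipe the abstract describes can be sketched with torchvision: load a detector pre-trained on natural images and replace its classification head for the airplane/background task. This is a generic transfer-learning sketch, not the authors' exact network or data pipeline.

    ```python
    import torch
    import torchvision
    from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

    # Start from a detector pre-trained on natural images, then fine-tune its
    # head on a small annotated airplane set (2 classes: background + airplane).
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes=2)

    # One illustrative training step on dummy data (a real loader would supply
    # remote sensing chips and their airplane boxes).
    images = [torch.rand(3, 512, 512)]
    targets = [{"boxes": torch.tensor([[100., 120., 180., 190.]]),
                "labels": torch.tensor([1])}]
    model.train()
    losses = model(images, targets)   # dict of classification/regression losses
    sum(losses.values()).backward()
    ```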

  19. Ocean Acidification Scientific Data Stewardship: An approach for end-to-end data management and integration

    Science.gov (United States)

    Arzayus, K. M.; Garcia, H. E.; Jiang, L.; Michael, P.

    2012-12-01

    As the designated Federal permanent oceanographic data center in the United States, NOAA's National Oceanographic Data Center (NODC) has been providing scientific stewardship for national and international marine environmental and ecosystem data for over 50 years. NODC is supporting NOAA's Ocean Acidification Program and the science community by providing end-to-end scientific data management of ocean acidification (OA) data, dedicated online data discovery, and user-friendly access to a diverse range of historical and modern OA and other chemical, physical, and biological oceanographic data. This effort is being catalyzed by the NOAA Ocean Acidification Program, but the intended reach is for the broader scientific ocean acidification community. The first three years of the project will be focused on infrastructure building. A complete ocean acidification data content standard is being developed to ensure that a full spectrum of ocean acidification data and metadata can be stored and utilized for optimal data discovery and access in usable data formats. We plan to develop a data access interface capable of allowing users to constrain their search based on real-time and delayed mode measured variables, scientific data quality, their observation types, the temporal coverage, methods, instruments, standards, collecting institutions, and the spatial coverage. In addition, NODC seeks to utilize the existing suite of international standards (including ISO 19115-2 and CF-compliant netCDF) to help our data producers use those standards for their data, and help our data consumers make use of the well-standardized metadata-rich data sets. These tools will be available through our NODC Ocean Acidification Scientific Data Stewardship (OADS) web page at http://www.nodc.noaa.gov/oceanacidification. NODC also has a goal to provide each archived dataset with a unique ID, to ensure a means of providing credit to the data provider. Working with partner institutions, such as the

  20. SensorKit: An End-to-End Solution for Environmental Sensor Networking

    Science.gov (United States)

    Silva, F.; Graham, E.; Deschon, A.; Lam, Y.; Goldman, J.; Wroclawski, J.; Kaiser, W.; Benzel, T.

    2008-12-01

    Modern day sensor network technology has shown great promise to transform environmental data collection. However, despite the promise, these systems have remained the purview of the engineers and computer scientists who design them rather than a useful tool for the environmental scientists who need them. SensorKit is conceived of as a way to make wireless sensor networks accessible to The People: it is an advanced, powerful tool for sensor data collection that does not require advanced technological know-how. We are aiming to make wireless sensor networks for environmental science as simple as setting up a standard home computer network by providing simple, tested configurations of commercially-available hardware, free and easy-to-use software, and step-by-step tutorials. We designed and built SensorKit using a simplicity-through-sophistication approach, supplying users with a powerful sensor-to-database end-to-end system with a simple and intuitive user interface. Our objective in building SensorKit was to make the prospect of using environmental sensor networks as simple as possible. We built SensorKit from off-the-shelf hardware components, using the Compact RIO platform from National Instruments for data acquisition due to its modular architecture and flexibility to support a large number of sensor types. In SensorKit, we support various types of analog, digital and networked sensors. Our modular software architecture allows us to abstract sensor details and provide users a common way to acquire data and to command different types of sensors. SensorKit is built on top of the Sensor Processing and Acquisition Network (SPAN), a modular framework for acquiring data in the field, moving it reliably to the scientist's institution, and storing it in an easily-accessible database. SPAN allows real-time access to the data in the field by providing various options for long haul communication, such as cellular and satellite links. Our system also features reliable data storage

  1. Cyberinfrastructure to support Real-time, End-to-End, High Resolution, Localized Forecasting

    Science.gov (United States)

    Ramamurthy, M. K.; Lindholm, D.; Baltzer, T.; Domenico, B.

    2004-12-01

    From natural disasters such as flooding and forest fires to man-made disasters such as toxic gas releases, the impact of weather-influenced severe events on society can be profound. Understanding, predicting, and mitigating such local, mesoscale events calls for a cyberinfrastructure to integrate multidisciplinary data, tools, and services, as well as the capability to generate and use high resolution data (such as wind and precipitation) from localized models. The need for such end-to-end systems -- including data collection, distribution, integration, assimilation, regionalized mesoscale modeling, analysis, and visualization -- has been realized to some extent in many academic and quasi-operational environments, especially for atmospheric sciences data. However, many challenges still remain in the integration and synthesis of data from multiple sources and the development of interoperable data systems and services across those disciplines. Over the years, the Unidata Program Center has developed several tools that have either directly or indirectly facilitated these local modeling activities. For example, the community is using Unidata technologies such as the Internet Data Distribution (IDD) system, Local Data Manager (LDM), decoders, netCDF libraries, Thematic Realtime Environmental Distributed Data Services (THREDDS), and the Integrated Data Viewer (IDV) in their real-time prediction efforts. In essence, these technologies for data reception and processing, local and remote access, cataloging, and analysis and visualization, coupled with technologies from others in the community, are becoming the foundation of a cyberinfrastructure to support an end-to-end regional forecasting system. To build on these capabilities, the Unidata Program Center is pleased to be a significant contributor to the Linked Environments for Atmospheric Discovery (LEAD) project, an NSF-funded multi-institutional large Information Technology Research effort. The goal of LEAD is to create an

  2. Sieving of H2 and D2 Through End-to-End Nanotubes

    Science.gov (United States)

    Dasgupta, Devagnik; Searles, Debra J.; Rondoni, Lamberto; Bernardi, Stefano

    2014-10-01

    We study the quantum molecular sieving of H2 and D2 through two nanotubes placed end-to-end. An analytic treatment is considered, assuming that the particles move classically along the axis of the nanotube and are confined in a potential well in the radial direction. Using this idealized model, it is found that, under certain conditions, this device can act as a complete sieve, allowing chemically pure deuterium to be isolated from an isotope mixture. We also consider a more realistic model of two carbon nanotubes and carry out molecular dynamics simulations using a Feynman–Hibbs potential to model the quantum effects on the dynamics of H2 and D2. Sieving is also observed in this case, but is caused by a different process.
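
    For reference, the quadratic Feynman–Hibbs effective pair potential (a standard literature form, quoted here from general knowledge rather than from the paper) augments the classical potential U(r) with a mass-dependent quantum correction:

    \[
      U_{\mathrm{FH}}(r) \;=\; U(r) \;+\; \frac{\hbar^{2}}{24\,\mu k_{B} T}\left(U''(r) + \frac{2}{r}\,U'(r)\right),
    \]

    where \(\mu\) is the reduced mass of the interacting pair. The lighter H2 receives the larger correction, effectively seeing a narrower channel than D2, which is the origin of the sieving effect described above.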

  3. End-to-end interoperability and workflows from building architecture design to one or more simulations

    Science.gov (United States)

    Chao, Tian-Jy; Kim, Younghun

    2015-02-10

    End-to-end interoperability and workflows from building architecture design to one or more simulations may, in one aspect, comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.

  4. End-to-end simulations of the E-ELT/METIS coronagraphs

    Science.gov (United States)

    Carlomagno, Brunella; Absil, Olivier; Kenworthy, Matthew; Ruane, Garreth; Keller, Christoph U.; Otten, Gilles; Feldt, Markus; Hippler, Stefan; Huby, Elsa; Mawet, Dimitri; Delacroix, Christian; Surdej, Jean; Habraken, Serge; Forsberg, Pontus; Karlsson, Mikael; Vargas Catalan, Ernesto; Brandl, Bernhard R.

    2016-07-01

    The direct detection of low-mass planets in the habitable zone of nearby stars is an important science case for future E-ELT instruments such as the mid-infrared imager and spectrograph METIS, which features vortex phase masks and apodizing phase plates (APP) in its baseline design. In this work, we present end-to-end performance simulations, using Fourier propagation, of several METIS coronagraphic modes, including focal-plane vortex phase masks and pupil-plane apodizing phase plates, for the centrally obscured, segmented E-ELT pupil. The atmosphere and the AO contributions are taken into account. Hybrid coronagraphs combining the advantages of vortex phase masks and APPs are considered to improve the METIS coronagraphic performance.

  5. End-to-End Beam Simulations for the New Muon G-2 Experiment at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Korostelev, Maxim [Cockcroft Inst. Accel. Sci. Tech.]; Bailey, Ian [Lancaster U.]; Herrod, Alexander [Liverpool U.]; Morgan, James [Fermilab]; Morse, William [RIKEN BNL]; Stratakis, Diktys [RIKEN BNL]; Tishchenko, Vladimir [RIKEN BNL]; Wolski, Andrzej [Cockcroft Inst. Accel. Sci. Tech.]

    2016-06-01

    The aim of the new muon g-2 experiment at Fermilab is to measure the anomalous magnetic moment of the muon with an unprecedented uncertainty of 140 ppb. A beam of positive muons required for the experiment is created by pion decay. Detailed studies of the beam dynamics and spin polarization of the muons are important to predict systematic uncertainties in the experiment. In this paper, we present the results of beam simulations and spin tracking from the pion production target to the muon storage ring. The end-to-end beam simulations are developed in Bmad and include the processes of particle decay, collimation (with accurate representation of all apertures) and spin tracking.

  6. Establishing end-to-end security in a nationwide network for telecooperation.

    Science.gov (United States)

    Staemmler, Martin; Walz, Michael; Weisser, Gerald; Engelmann, Uwe; Weininger, Robert; Ernstberger, Antonio; Sturm, Johannes

    2012-01-01

    Telecooperation is used to support care for trauma patients by facilitating a mutual exchange of treatment and image data in use-cases such as emergency consultation, second opinion, transfer, rehabilitation and outpatient after-treatment. To comply with data protection legislation, a two-factor authentication using ownership and knowledge has been implemented to ensure personalized access rights. End-to-end security is achieved by symmetric encryption in combination with external trusted services which provide the symmetric key solely at runtime. Telecooperation partners may be chosen at departmental level, but access is granted only to individuals of that department, as determined by checking the organizational assignments maintained by LDAP services. The data protection officers of a federal state have approved the data protection measures. The telecooperation platform is in routine operation and designed to serve up to 800 trauma centers in Germany, organized in more than 50 trauma networks.
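
    The key-handling pattern described above (symmetric encryption with the key supplied only at runtime by an external trusted service) can be sketched as follows. The service call is a hypothetical placeholder, and the snippet uses the Python cryptography package rather than whatever library the platform actually employs.

    ```python
    from cryptography.fernet import Fernet

    def fetch_key_from_trusted_service() -> bytes:
        # Placeholder: in the real system this would be an authenticated call
        # to the external trusted key service, gated by two-factor authentication.
        return Fernet.generate_key()

    # The symmetric key exists only at runtime; stored data stays opaque without it.
    key = fetch_key_from_trusted_service()
    f = Fernet(key)
    ciphertext = f.encrypt(b"image data + treatment notes")  # exchanged between centers
    plaintext = f.decrypt(ciphertext)                        # only with the runtime key
    ```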

  7. Kinetics of contact formation and end-to-end distance distributions of swollen disordered peptides.

    Science.gov (United States)

    Soranno, Andrea; Longhi, Renato; Bellini, Tommaso; Buscaglia, Marco

    2009-02-18

    Unstructured polypeptide chains are subject to various degrees of swelling or compaction depending on the combination of solvent condition and amino acid sequence. Highly denatured proteins generally behave like random coils with excluded-volume repulsion, whereas in aqueous buffer more compact conformations have been observed for the low-populated unfolded state of globular proteins as well as for naturally disordered sequences. To quantitatively account for the different mechanisms inducing the swelling of polypeptides, we have examined three 14-residue peptides in aqueous buffer and in denaturant solutions, including the well characterized AGQ repeat as a reference and two variants, in which we have successively introduced charged side chains and removed the glycines. Quenching of the triplet state of tryptophan by close contact with cysteine has been used in conjunction with Förster resonance energy transfer to study the equilibrium and kinetic properties of the peptide chains. The experiments give access to the end-to-end root-mean-square distance, the probability of end-to-end contact formation and the intrachain diffusion coefficient. The data can be coherently interpreted on the basis of a simple chain model with backbone angles obtained from a library of coil segments of proteins and hard-sphere repulsion at each Cα position. In buffered water, we find that introducing charges in a glycine-rich sequence induces a mild chain swelling and a significant speed-up of the intrachain dynamics, whereas the removal of the glycines results in almost a two-fold increase in the chain volume and a drastic slowing down. In denaturants we observe a pronounced swelling of all the chains, with significant differences between the effects of urea and guanidinium chloride.
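
    For context, the swelling reported here is conventionally quantified through the scaling of the root-mean-square end-to-end distance with the number of residues N (a textbook polymer-physics relation, not a formula from this paper):

    \[
      \sqrt{\langle R^{2} \rangle} \;\propto\; N^{\nu}, \qquad \nu = \tfrac{1}{2} \ \text{(ideal chain)}, \qquad \nu \approx 0.588 \ \text{(excluded-volume chain in good solvent)}
    \]

    Denaturant-induced swelling then appears as an increase of the effective exponent toward the excluded-volume value.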

  8. The MARS pathfinder end-to-end information system: A pathfinder for the development of future NASA planetary missions

    Science.gov (United States)

    Cook, Richard A.; Kazz, Greg J.; Tai, Wallace S.

    1996-01-01

    The development of the Mars Pathfinder is considered with emphasis on the End-to-End Information System (EEIS) development approach. The primary mission objective is to successfully develop and deliver a single flight system to the Martian surface, demonstrating entry, descent and landing. The EEIS is a set of functions distributed throughout the flight, ground and Mission Operation Systems (MOS) that inter-operate in order to control, collect, transport, process, store and analyze the uplink and downlink information flows of the mission. Coherence between the mission systems is achieved through the EEIS architecture. The key characteristics of the system are: a concurrent engineering approach for the development of flight, ground and mission operation systems; the fundamental EEIS architectural heuristics; a phased incremental EEIS development and test approach; and an EEIS design deploying flight, ground and MOS operability features, including integrated ground- and flight-based toolsets.

  9. MO-B-BRB-04: 3D Dosimetry in End-To-End Dosimetry QA

    Energy Technology Data Exchange (ETDEWEB)

    Ibbott, G. [UT MD Anderson Cancer Center (United States)]

    2016-06-15

    ...irradiated volume can help understand interplay effects during TomoTherapy or VMAT.
    Titania Juang (Special techniques in the clinic and research): Understand the potential for 3D dosimetry in validating dose accumulation in deformable systems, and observe the benefits of high-resolution measurements for precision therapy in SRS and in MicroSBRT for small-animal irradiators.
    Geoffrey S. Ibbott (3D Dosimetry in end-to-end dosimetry QA): Understand the potential of 3D dosimetry for end-to-end radiation therapy process validation in in-house and external credentialing settings.
    Funding and disclosures: Canadian Institutes of Health Research; L. Schreiner, Modus QA, London, ON, Canada; T. Juang, NIH R01CA100835.

  10. End-to-End Printed-Circuit Board Assembly Design Using Altium Designer and Solid Works Systems

    National Research Council Canada - National Science Library

    A. M. Goncharenko; A. E. Kurnosenko; V. G. Kostikov; A. V. Lavrov; V. A. Soloviev

    2015-01-01

    .... The article offers an alternate approach to the end-to-end simultaneous development in Altium Designer / Solid Works CAD/CAE systems, which enables a radically shortened time to design new devices...

  11. An end-to-end microfluidic platform for engineering life supporting microbes in space exploration missions Project

    Data.gov (United States)

    National Aeronautics and Space Administration — HJ Science & Technology proposes a programmable, low-cost, and compact microfluidic platform capable of running automated end-to-end processes and optimization...

  12. End-to-end information extraction without token-level supervision

    DEFF Research Database (Denmark)

    Palm, Rasmus Berg; Hovy, Dirk; Laws, Florian

    2017-01-01

    Most state-of-the-art information extraction approaches rely on token-level labels to find the areas of interest in text. Unfortunately, these labels are time-consuming and costly to create, and consequently, not available for many real-life IE tasks. To make matters worse, token-level labels...... are usually not the desired output, but just an intermediary step. End-to-end (E2E) models, which take raw text as input and produce the desired output directly, need not depend on token-level labels. We propose an E2E model based on pointer networks, which can be trained directly on pairs of raw input...... and output text. We evaluate our model on the ATIS data set, MIT restaurant corpus and the MIT movie corpus and compare to neural baselines that do use token-level labels. We achieve competitive results, within a few percentage points of the baselines, showing the feasibility of E2E information extraction...
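
    A minimal pointer-network decoder of the kind the abstract mentions can be sketched in PyTorch as below: at each output step, additive attention over the encoder states yields a distribution that points back into the raw input. The sizes and the greedy decoding loop are our simplifications, not the authors' published model.

    ```python
    import torch
    import torch.nn as nn

    class PointerNet(nn.Module):
        def __init__(self, vocab_size, emb=64, hid=128):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb)
            self.encoder = nn.LSTM(emb, hid, batch_first=True)
            self.decoder = nn.LSTMCell(emb, hid)
            self.W_enc = nn.Linear(hid, hid, bias=False)
            self.W_dec = nn.Linear(hid, hid, bias=False)
            self.v = nn.Linear(hid, 1, bias=False)

        def forward(self, src, n_steps):
            # src: (B, T) token ids of the raw input text
            enc_emb = self.embed(src)
            enc_out, (h, c) = self.encoder(enc_emb)   # (B, T, H)
            h, c = h.squeeze(0), c.squeeze(0)
            inp = enc_emb[:, 0]                        # seed with the first token
            pointers = []
            for _ in range(n_steps):
                h, c = self.decoder(inp, (h, c))
                # additive attention over encoder states -> pointer distribution
                scores = self.v(torch.tanh(self.W_enc(enc_out)
                                           + self.W_dec(h).unsqueeze(1))).squeeze(-1)
                idx = torch.softmax(scores, dim=-1).argmax(-1)  # greedy pointer
                pointers.append(idx)
                inp = self.embed(torch.gather(src, 1, idx.unsqueeze(1)).squeeze(1))
            return torch.stack(pointers, dim=1)        # (B, n_steps) input positions

    net = PointerNet(vocab_size=100)
    out = net(torch.randint(0, 100, (2, 12)), n_steps=3)  # indices into the input
    ```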

  13. Functional Partitioning to Optimize End-to-End Performance on Many-core Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Li, Min [Virginia Polytechnic Institute and State University (Virginia Tech); Vazhkudai, Sudharshan S [ORNL; Butt, Ali R [Virginia Polytechnic Institute and State University (Virginia Tech); Meng, Fei [ORNL; Ma, Xiaosong [ORNL; Kim, Youngjae [ORNL; Engelmann, Christian [ORNL; Shipman, Galen M [ORNL

    2010-01-01

    Scaling computations on emerging massive-core supercomputers is a daunting task, which, coupled with significantly lagging system I/O capabilities, degrades applications' end-to-end performance. The I/O bottleneck often negates potential performance benefits of assigning additional compute cores to an application. In this paper, we address this issue via a novel functional partitioning (FP) runtime environment that allocates cores to specific application tasks - checkpointing, de-duplication, and scientific data format transformation - so that the deluge of cores can be brought to bear on the entire gamut of application activities. The focus is on utilizing the extra cores to support HPC application I/O activities and also to leverage solid-state disks in this context. For example, our evaluation shows that dedicating 1 core on an octa-core machine to checkpointing and its assist tasks using FP can improve the overall execution time of a FLASH benchmark on 80 and 160 cores by 43.95% and 41.34%, respectively.
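
    The core idea (dedicating cores to I/O assist tasks such as checkpointing) can be illustrated with CPU affinity on Linux; the core ids, file name and plain file write below are placeholders for the paper's actual runtime, not its implementation.

    ```python
    import os
    import multiprocessing as mp

    def checkpoint_service(queue):
        os.sched_setaffinity(0, {7})          # dedicate core 7 to checkpointing (Linux-only)
        while True:
            state = queue.get()
            if state is None:
                break
            with open("checkpoint.bin", "wb") as f:
                f.write(state)                # stand-in for dedup/format conversion work

    if __name__ == "__main__":
        q = mp.Queue()
        worker = mp.Process(target=checkpoint_service, args=(q,))
        worker.start()
        os.sched_setaffinity(0, set(range(7)))  # compute stays on cores 0-6
        q.put(b"application state")             # asynchronous checkpoint request
        q.put(None)
        worker.join()
    ```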

  14. End-to-End Multimodal Emotion Recognition Using Deep Neural Networks

    Science.gov (United States)

    Tzirakis, Panagiotis; Trigeorgis, George; Nicolaou, Mihalis A.; Schuller, Bjorn W.; Zafeiriou, Stefanos

    2017-12-01

    Automatic affect recognition is a challenging task due to the various modalities through which emotions can be expressed. Applications can be found in many domains, including multimedia retrieval and human-computer interaction. In recent years, deep neural networks have been used with great success in determining emotional states. Inspired by this success, we propose an emotion recognition system using auditory and visual modalities. To capture the emotional content of various styles of speaking, robust features need to be extracted. To this purpose, we utilize a Convolutional Neural Network (CNN) to extract features from the speech, while for the visual modality we use a deep residual network (ResNet) of 50 layers. In addition to the importance of feature extraction, the machine learning algorithm also needs to be insensitive to outliers while being able to model the context. To tackle this problem, Long Short-Term Memory (LSTM) networks are utilized. The system is then trained in an end-to-end fashion where, by also taking advantage of the correlations between the streams, we manage to significantly outperform traditional approaches based on auditory and visual handcrafted features for the prediction of spontaneous and natural emotions on the RECOLA database of the AVEC 2016 research challenge on emotion recognition.
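
    A toy PyTorch sketch of the described pipeline (CNN audio features and ResNet-50 visual features, fused and fed to an LSTM with a regression head) follows; all dimensions and layer choices are illustrative, not the published configuration.

    ```python
    import torch
    import torch.nn as nn
    import torchvision

    class AVEmotionNet(nn.Module):
        """Toy audio-visual model in the spirit of the paper: CNN audio features,
        ResNet-50 visual features, LSTM over time, regression head (e.g. arousal/valence)."""
        def __init__(self):
            super().__init__()
            self.audio_cnn = nn.Sequential(
                nn.Conv1d(1, 32, 9, stride=4), nn.ReLU(),
                nn.AdaptiveAvgPool1d(1), nn.Flatten(),   # -> (B*T, 32)
            )
            resnet = torchvision.models.resnet50(weights=None)
            resnet.fc = nn.Identity()                     # -> (B*T, 2048)
            self.visual_cnn = resnet
            self.lstm = nn.LSTM(32 + 2048, 128, batch_first=True)
            self.head = nn.Linear(128, 2)                 # arousal, valence

        def forward(self, audio, video):
            B, T = audio.shape[:2]
            a = self.audio_cnn(audio.reshape(B * T, 1, -1)).reshape(B, T, -1)
            v = self.visual_cnn(video.reshape(B * T, *video.shape[2:])).reshape(B, T, -1)
            out, _ = self.lstm(torch.cat([a, v], dim=-1))
            return self.head(out)                         # per-frame predictions

    net = AVEmotionNet()
    pred = net(torch.randn(2, 4, 1600), torch.randn(2, 4, 3, 224, 224))  # (2, 4, 2)
    ```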

  15. End-to-End Image Simulator for Optical Imaging Systems: Equations and Simulation Examples

    Directory of Open Access Journals (Sweden)

    Peter Coppo

    2013-01-01

    Full Text Available The theoretical description of a simplified end-to-end software tool for simulation of data produced by optical instruments, starting from either synthetic or airborne hyperspectral data, is described and some simulation examples of hyperspectral and panchromatic images for existing and future instrument designs are also reported. High spatial/spectral resolution images with low intrinsic noise and the sensor/mission specifications are used as inputs for the simulations. The examples reported in this paper show the capabilities of the tool for simulating target detection scenarios, data quality assessment with respect to classification performance and class discrimination, impact of optical design on image quality, and 3D modelling of optical performances. The simulator is conceived as a tool (for use during phase 0/A) for the specification and early development of new Earth observation optical instruments, whose compliance to users' requirements is achieved through a process of cost/performance trade-off. The Selex Galileo simulator, as compared with other existing image simulators for phase C/D projects of space-borne instruments, implements all modules necessary for a complete panchromatic and hyperspectral image simulation, and it allows excellent flexibility and expandability for new integrated functions because of the adopted IDL-ENVI software environment.

  16. End to End Digitisation and Analysis of Three-Dimensional Coral Models, from Communities to Corallites.

    Science.gov (United States)

    Gutierrez-Heredia, Luis; Benzoni, Francesca; Murphy, Emma; Reynaud, Emmanuel G

    2016-01-01

    Coral reefs host nearly 25% of all marine species and provide food sources for half a billion people worldwide, yet only a very small percentage have been surveyed. Advances in technology and processing, along with affordable underwater cameras and Internet availability, give us the possibility to provide tools and software to survey entire coral reefs. Holistic ecological analyses of corals require not only the community view (10s to 100s of meters), but also single-colony analysis as well as corallite identification. As corals are three-dimensional, classical approaches to determining percent cover and structural complexity across spatial scales are inefficient, time-consuming and limited to experts. Here we propose an end-to-end approach to estimate these parameters using low-cost equipment (GoPro, Canon) and freeware (123D Catch, Meshmixer and Netfabb), allowing every community to participate in surveys and monitoring of their coral ecosystem. We demonstrate our approach on 9 species of underwater colonies ranging in size and morphology. 3D models of underwater colonies, fresh samples and bleached skeletons with high-quality texture mapping and detailed topographic morphology were produced, and surface area and volume measurements (parameters widely used for ecological and coral health studies) were calculated and analysed. Moreover, we integrated collected sample models with micro-photogrammetry models of individual corallites to aid identification and colony- and polyp-scale analysis.
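
    The colony-scale measurement step can be reproduced with open tooling: once photogrammetry yields a mesh, a few lines with the Python trimesh package (our choice, not one of the tools named above) recover surface area and volume. The file name is a placeholder, and the reported units depend on the model's scale calibration.

    ```python
    import trimesh

    mesh = trimesh.load("colony.obj")         # photogrammetry output (placeholder name)
    if not mesh.is_watertight:
        mesh.fill_holes()                     # volume is only meaningful when closed

    print(f"surface area: {mesh.area:.1f}")   # units follow the model's scale
    print(f"volume:       {mesh.volume:.1f}")
    print(f"SA/V ratio:   {mesh.area / mesh.volume:.2f}")  # common coral-health metric
    ```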

  17. End to End Digitisation and Analysis of Three-Dimensional Coral Models, from Communities to Corallites.

    Directory of Open Access Journals (Sweden)

    Luis Gutierrez-Heredia

    Full Text Available Coral reefs host nearly 25% of all marine species and provide food sources for half a billion people worldwide, yet only a very small percentage have been surveyed. Advances in technology and processing, along with affordable underwater cameras and Internet availability, give us the possibility to provide tools and software to survey entire coral reefs. Holistic ecological analyses of corals require not only the community view (10s to 100s of meters), but also single-colony analysis as well as corallite identification. As corals are three-dimensional, classical approaches to determining percent cover and structural complexity across spatial scales are inefficient, time-consuming and limited to experts. Here we propose an end-to-end approach to estimate these parameters using low-cost equipment (GoPro, Canon) and freeware (123D Catch, Meshmixer and Netfabb), allowing every community to participate in surveys and monitoring of their coral ecosystem. We demonstrate our approach on 9 species of underwater colonies ranging in size and morphology. 3D models of underwater colonies, fresh samples and bleached skeletons with high-quality texture mapping and detailed topographic morphology were produced, and surface area and volume measurements (parameters widely used for ecological and coral health studies) were calculated and analysed. Moreover, we integrated collected sample models with micro-photogrammetry models of individual corallites to aid identification and colony- and polyp-scale analysis.

  18. Semantic Complex Event Processing over End-to-End Data Flows

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi [University of Southern California]; Simmhan, Yogesh; Prasanna, Viktor K.

    2012-04-01

    Emerging Complex Event Processing (CEP) applications in cyber-physical systems like Smart Power Grids present novel challenges for end-to-end analysis over events flowing from heterogeneous information sources to persistent knowledge repositories. CEP for these applications must support two distinctive features - easy specification of patterns over diverse information streams, and integrated pattern detection over real-time and historical events. Existing work on CEP has been limited to relational query patterns and engines that match events arriving after the query has been registered. We propose SCEPter, a semantic complex event processing framework which uniformly processes queries over continuous and archived events. SCEPter is built around an existing CEP engine with innovative support for semantic event pattern specification and allows their seamless detection over past, present and future events. Specifically, we describe a unified semantic query model that can operate over data flowing through event streams to event repositories. Compile-time and runtime semantic patterns are distinguished and addressed separately for efficiency. Query rewriting is examined and analyzed in the context of the temporal boundaries that exist between event streams and their repository, to avoid duplicate or missing results. The design and prototype implementation of SCEPter are analyzed using latency and throughput metrics for scenarios from the Smart Grid domain.

  19. An end-to-end assessment of range uncertainty in proton therapy using animal tissues

    Science.gov (United States)

    Zheng, Yuanshui; Kang, Yixiu; Zeidan, Omar; Schreuder, Niek

    2016-11-01

    Accurate assessment of range uncertainty is critical in proton therapy. However, there is a lack of data and consensus on how to evaluate the appropriate amount of uncertainty. The purpose of this study is to quantify the range uncertainty in various treatment conditions in proton therapy, using transmission measurements through various animal tissues. Animal tissues, including a pig head, beef steak, and lamb leg, were used in this study. For each tissue, an end-to-end test closely imitating patient treatments was performed. This included CT scan simulation, treatment planning, image-guided alignment, and beam delivery. Radio-chromic films were placed at various depths in the distal dose falloff region to measure depth dose. Comparisons between measured and calculated doses were used to evaluate range differences. The dose difference at the distal falloff between measurement and calculation depends on tissue type and treatment conditions. The estimated range difference was up to 5, 6 and 4 mm for the pig head, beef steak, and lamb leg irradiation, respectively. Our study shows that the TPS was able to calculate proton range within about 1.5% plus 1.5 mm. Accurate assessment of range uncertainty in treatment planning would allow better optimization of proton beam treatment, thus fully achieving proton beams’ superior dose advantage over conventional photon-based radiation therapy.
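
    Read as a margin recipe, the study's finding that the TPS calculates proton range "within about 1.5% plus 1.5 mm" translates to a one-line calculation; treating it as a distal margin in this way is our simplification, and clinical margins are site-specific.

    ```python
    def range_margin_mm(beam_range_mm: float, pct: float = 1.5, fixed_mm: float = 1.5) -> float:
        """Range-uncertainty margin of the '1.5% + 1.5 mm' form reported above."""
        return beam_range_mm * pct / 100.0 + fixed_mm

    print(range_margin_mm(150.0))  # 15 cm beam range -> 3.75 mm margin
    ```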

  20. The optical performance of the PILOT instrument from ground end-to-end tests

    Science.gov (United States)

    Misawa, R.; Bernard, J.-Ph.; Longval, Y.; Ristorcelli, I.; Ade, P.; Alina, D.; André, Y.; Aumont, J.; Bautista, L.; de Bernardis, P.; Boulade, O.; Bousqet, F.; Bouzit, M.; Buttice, V.; Caillat, A.; Chaigneau, M.; Charra, M.; Crane, B.; Douchin, F.; Doumayrou, E.; Dubois, J. P.; Engel, C.; Griffin, M.; Foenard, G.; Grabarnik, S.; Hargrave, P.; Hughes, A.; Laureijs, R.; Leriche, B.; Maestre, S.; Maffei, B.; Marty, C.; Marty, W.; Masi, S.; Montel, J.; Montier, L.; Mot, B.; Narbonne, J.; Pajot, F.; Pérot, E.; Pimentao, J.; Pisano, G.; Ponthieu, N.; Rodriguez, L.; Roudil, G.; Salatino, M.; Savini, G.; Simonella, O.; Saccoccio, M.; Tauber, J.; Tucker, C.

    2017-06-01

    The Polarized Instrument for Long-wavelength Observation of the Tenuous interstellar medium (PILOT) is a balloon-borne astronomy experiment designed to study the linear polarization of thermal dust emission in two photometric bands centred at wavelengths 240 μm (1.2 THz) and 550 μm (545 GHz), with an angular resolution of a few arcminutes. Several end-to-end tests of the instrument were performed on the ground between 2012 and 2014, in order to prepare for the first scientific flight of the experiment that took place in September 2015 from Timmins, Ontario, Canada. This paper presents the results of those tests, focussing on an evaluation of the instrument's optical performance. We quantify image quality across the extent of the focal plane, and describe the tests that we conducted to determine the focal plane geometry, the optimal focus position, and sources of internal straylight. We present estimates of the detector response, obtained using an internal calibration source, and estimates of the background intensity and background polarization.

  1. Telecommunications end-to-end systems monitoring on TOPEX/Poseidon: Tools and techniques

    Science.gov (United States)

    Calanche, Bruno J.

    1994-01-01

    The TOPEX/Poseidon Project Satellite Performance Analysis Team's (SPAT) roles and responsibilities have grown to include functions that are typically performed by other teams on JPL Flight Projects. In particular, SPAT Telecommunications' role has expanded beyond the nominal function of monitoring, assessing, characterizing, and trending the spacecraft (S/C) RF/Telecom subsystem to one of End-to-End Information Systems (EEIS) monitoring. This has been accomplished by taking advantage of the spacecraft and ground data system structures and protocols. By processing both the received spacecraft telemetry minor-frame ground-generated CRC flags and the NASCOM block poly error flags, bit error rates (BER) for each link segment can be determined. This provides the capability to characterize the separate link segments, determine science data recovery, and perform fault/anomaly detection and isolation. By monitoring and managing the links, TOPEX has successfully recovered approximately 99.9 percent of the science data with an integrity (BER) of better than 1 × 10⁻⁸. This paper presents the algorithms used to process the above flags and the techniques used for EEIS monitoring.
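
    The link-segment BER bookkeeping described above can be approximated from frame-level flags alone: assuming independent bit errors, the frame error rate FER = 1 - (1 - BER)^n inverts to BER = 1 - (1 - FER)^(1/n) for n bits per frame. A sketch, with made-up frame size and counts rather than TOPEX's actual telemetry format:

    ```python
    def estimate_ber(errored_frames: int, total_frames: int, bits_per_frame: int) -> float:
        """Back out an approximate BER from frame-level CRC failures,
        assuming independent bit errors within each frame."""
        fer = errored_frames / total_frames
        return 1.0 - (1.0 - fer) ** (1.0 / bits_per_frame)

    # e.g. 3 flagged minor frames out of 10 million, with 8960-bit frames
    print(f"{estimate_ber(3, 10_000_000, 8960):.2e}")
    ```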

  2. A Workflow-based Intelligent Network Data Movement Advisor with End-to-end Performance Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Michelle M. [Southern Illinois Univ., Carbondale, IL (United States)]; Wu, Chase Q. [Univ. of Memphis, TN (United States)]

    2013-11-07

    Next-generation eScience applications often generate large amounts of simulation, experimental, or observational data that must be shared and managed by collaborative organizations. Advanced networking technologies and services have been rapidly developed and deployed to facilitate such massive data transfer. However, these technologies and services have not been fully utilized, mainly because their use typically requires significant domain knowledge and, in many cases, application users are not even aware of their existence. By leveraging the functionalities of an existing Network-Aware Data Movement Advisor (NADMA) utility, we propose a new Workflow-based Intelligent Network Data Movement Advisor (WINDMA) with end-to-end performance optimization for this DOE-funded project. This WINDMA system integrates three major components: resource discovery, data movement, and status monitoring, and supports the sharing of common data movement workflows through account and database management. The system provides a web interface and interacts with existing data/space management and discovery services such as Storage Resource Management, transport methods such as GridFTP and GlobusOnline, and network resource provisioning brokers such as ION and OSCARS. We demonstrate the efficacy of the proposed transport-support workflow system in several use cases based on its implementation and deployment in DOE wide-area networks.

  3. Human Assisted Robotic Vehicle Studies - A conceptual end-to-end mission architecture

    Science.gov (United States)

    Lehner, B. A. E.; Mazzotta, D. G.; Teeney, L.; Spina, F.; Filosa, A.; Pou, A. Canals; Schlechten, J.; Campbell, S.; Soriano, P. López

    2017-11-01

    With current space exploration roadmaps indicating the Moon as a proving ground on the way to human exploration of Mars, it is clear that human-robotic partnerships will play a key role in successful future human space missions. This paper details a conceptual end-to-end architecture for an exploration mission in cis-lunar space with a focus on human-robot interactions, called Human Assisted Robotic Vehicle Studies (HARVeSt). HARVeSt will build on knowledge of plant growth in space gained from experiments on board the ISS and test the first growth of plants on the Moon. A planned deep space habitat will be utilised as the base of operations for the human-robotic elements of the mission. The mission will serve as a technology demonstrator not only for autonomous tele-operations in cis-lunar space but also for key enabling technologies for future human surface missions. This mission will build on the successful international cooperation model of the ISS. Mission assets such as a modular rover will allow the mission to be extended and the area to be scouted and prepared for the start of an international Moon Village.

  4. Impacts of the Deepwater Horizon oil spill evaluated using an end-to-end ecosystem model.

    Science.gov (United States)

    Ainsworth, Cameron H; Paris, Claire B; Perlin, Natalie; Dornberger, Lindsey N; Patterson, William F; Chancellor, Emily; Murawski, Steve; Hollander, David; Daly, Kendra; Romero, Isabel C; Coleman, Felicia; Perryman, Holly

    2018-01-01

    We use a spatially explicit biogeochemical end-to-end ecosystem model, Atlantis, to simulate impacts from the Deepwater Horizon oil spill and subsequent recovery of fish guilds. Dose-response relationships with expected oil concentrations were utilized to estimate the impact on fish growth and mortality rates. We also examine the effects of fisheries closures and impacts on recruitment. We validate predictions of the model by comparing population trends and age structure before and after the oil spill with fisheries independent data. The model suggests that recruitment effects and fishery closures had little influence on biomass dynamics. However, at the assumed level of oil concentrations and toxicity, impacts on fish mortality and growth rates were large and commensurate with observations. Sensitivity analysis suggests the biomass of large reef fish decreased by 25% to 50% in areas most affected by the spill, and biomass of large demersal fish decreased even more, by 40% to 70%. Impacts on reef and demersal forage caused starvation mortality in predators and increased reliance on pelagic forage. Impacts on the food web translated effects of the spill far away from the oiled area. Effects on age structure suggest possible delayed impacts on fishery yields. Recovery of high-turnover populations generally is predicted to occur within 10 years, but some slower-growing populations may take 30+ years to fully recover.

  5. End-To-End Simulation of Launch Vehicle Trajectories Including Stage Separation Dynamics

    Science.gov (United States)

    Albertson, Cindy W.; Tartabini, Paul V.; Pamadi, Bandu N.

    2012-01-01

    The development of methodologies, techniques, and tools for analysis and simulation of stage separation dynamics is critically needed for successful design and operation of multistage reusable launch vehicles. As a part of this activity, the Constraint Force Equation (CFE) methodology was developed and implemented in the Program to Optimize Simulated Trajectories II (POST2). The objective of this paper is to demonstrate the capability of POST2/CFE to simulate a complete end-to-end mission. The vehicle configuration selected was the Two-Stage-To-Orbit (TSTO) Langley Glide Back Booster (LGBB) bimese configuration, an in-house concept consisting of a reusable booster and an orbiter having identical outer mold lines. The proximity and isolated aerodynamic databases used for the simulation were assembled using wind-tunnel test data for this vehicle. POST2/CFE simulation results are presented for the entire mission, from lift-off, through stage separation, orbiter ascent to orbit, and booster glide back to the launch site. Additionally, POST2/CFE stage separation simulation results are compared with results from industry standard commercial software used for solving dynamics problems involving multiple bodies connected by joints.

  6. AAL Security and Privacy: transferring XACML policies for end-to-end access and usage control

    NARCIS (Netherlands)

    Vlamings, H.G.M.; Koster, R.P.

    2010-01-01

    Ambient Assisted Living (AAL) systems and services aim to provide a solution for growing healthcare expenses and the degradation of quality of life of the elderly using information and communication technology. In particular, AAL solutions are being created that are heavily based on web services and sensor

  7. SPoRT - An End-to-End R2O Activity

    Science.gov (United States)

    Jedlovec, Gary J.

    2009-01-01

    Established in 2002 to demonstrate the weather and forecasting application of real-time EOS measurements, the Short-term Prediction Research and Transition (SPoRT) program has grown to be an end-to-end research-to-operations activity focused on the use of advanced NASA modeling and data assimilation approaches, nowcasting techniques, and unique high-resolution multispectral observational data applications from EOS satellites to improve short-term weather forecasts on a regional and local scale. SPoRT currently partners with several universities and other government agencies for access to real-time data and products, and works collaboratively with them and with operational end users at 13 WFOs to develop and test the new products and capabilities in a "test-bed" mode. The test-bed simulates key aspects of the operational environment without putting constraints on the forecaster workload. Products and capabilities which show utility in the test-bed environment are then transitioned experimentally into the operational environment for further evaluation and assessment. SPoRT focuses on a suite of data and products from MODIS, AMSR-E, and AIRS on the NASA Terra and Aqua satellites, and total lightning measurements from ground-based networks. Some of the observations are assimilated into or used with various versions of the WRF model to provide supplemental forecast guidance to operational end users. SPoRT is enhancing partnerships with NOAA/NESDIS for new product development and data access to exploit the remote sensing capabilities of instruments on the NPOESS satellites to address short-term weather forecasting problems. The VIIRS and CrIS instruments on the NPP and follow-on NPOESS satellites provide observing capabilities similar to those of the MODIS and AIRS instruments on Terra and Aqua. SPoRT will be transitioning existing and new capabilities into the AWIPS II environment to ensure the continuity of its activities.

  8. jade: An End-To-End Data Transfer and Catalog Tool

    Science.gov (United States)

    Meade, P.

    2017-10-01

    The IceCube Neutrino Observatory is a cubic kilometer neutrino telescope located at the Geographic South Pole. IceCube collects 1 TB of data every day. An online filtering farm processes this data in real time and selects 10% to be sent via satellite to the main data center at the University of Wisconsin-Madison. IceCube has two year-round on-site operators. New operators are hired every year, due to the hard conditions of wintering at the South Pole. These operators are tasked with the daily operations of running a complex detector in serious isolation conditions. One of the systems they operate is the data archiving and transfer system. Due to these challenging operational conditions, the data archive and transfer system must above all be simple and robust. It must also share the limited resource of satellite bandwidth, and collect and preserve useful metadata. The original data archive and transfer software for IceCube was written in 2005. After running in production for several years, the decision was taken to fully rewrite it, in order to address a number of structural drawbacks. The new data archive and transfer software (JADE2) has been in production for several months providing improved performance and resiliency. One of the main goals for JADE2 is to provide a unified system that handles the IceCube data end-to-end: from collection at the South Pole, all the way to long-term archive and preservation in dedicated repositories at the North. In this contribution, we describe our experiences and lessons learned from developing and operating the data archive and transfer software for a particle physics experiment in extreme operational conditions like IceCube.

  9. A Mechanistic End-to-End Concussion Model That Translates Head Kinematics to Neurologic Injury

    Directory of Open Access Journals (Sweden)

    Laurel J. Ng

    2017-06-01

    Full Text Available Past concussion studies have focused on understanding the injury processes occurring on discrete length scales (e.g., tissue-level stresses and strains, cell-level stresses and strains, or injury-induced cellular pathology). A comprehensive approach that connects all length scales and relates measurable macroscopic parameters to neurological outcomes is the first step toward rationally unraveling the complexity of this multi-scale system, for better guidance of future research. This paper describes the development of the first quantitative end-to-end (E2E) multi-scale model that links gross head motion to neurological injury by integrating fundamental elements of tissue and cellular mechanical response with axonal dysfunction. The model quantifies axonal stretch (i.e., tension) injury in the corpus callosum, with axonal functionality parameterized in terms of axonal signaling. An internal injury correlate is obtained by calculating a neurological injury measure (the average reduction in the axonal signal amplitude over the corpus callosum). By using a neurologically based quantity rather than externally measured head kinematics, the E2E model is able to unify concussion data across a range of exposure conditions and species with greater sensitivity and specificity than correlates based on external measures. In addition, this model quantitatively links injury of the corpus callosum to observed specific neurobehavioral outcomes that reflect clinical measures of mild traumatic brain injury. This comprehensive modeling framework provides a basis for the systematic improvement and expansion of this mechanistic-based understanding, including widening the range of neurological injury estimation, improving concussion risk correlates, guiding the design of protective equipment, and setting safety standards.
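
    One plausible reading of the injury correlate (the average reduction in axonal signal amplitude over the corpus callosum) in code, with synthetic amplitudes standing in for the model's actual outputs:

    ```python
    import numpy as np

    def injury_measure(baseline_amp: np.ndarray, injured_amp: np.ndarray) -> float:
        """Hypothetical reconstruction of the paper's correlate: the mean relative
        reduction in axonal signal amplitude across the corpus callosum."""
        return float(np.mean(1.0 - injured_amp / baseline_amp))

    rng = np.random.default_rng(42)
    base = np.ones(1000)                                     # normalized healthy amplitudes
    inj = np.clip(base - rng.exponential(0.05, 1000), 0, 1)  # stochastic amplitude loss
    print(f"neurological injury measure: {injury_measure(base, inj):.3f}")
    ```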

  10. SME2EM: Smart mobile end-to-end monitoring architecture for life-long diseases.

    Science.gov (United States)

    Serhani, Mohamed Adel; Menshawy, Mohamed El; Benharref, Abdelghani

    2016-01-01

    Monitoring life-long diseases requires continuous measurements and recording of physical vital signs. Most of these diseases are manifested through unexpected and non-uniform occurrences and behaviors. It is impractical to keep patients in hospitals, health-care institutions, or even at home for long periods of time. Monitoring solutions based on smartphones combined with mobile sensors and wireless communication technologies are a potential candidate to support complete mobility-freedom, not only for patients, but also for physicians. However, existing monitoring architectures based on smartphones and modern communication technologies are not suitable to address some challenging issues, such as intensive and big data, resource constraints, data integration, and context awareness in an integrated framework. This manuscript provides a novel mobile-based end-to-end architecture for live monitoring and visualization of life-long diseases. The proposed architecture provides smartness features to cope with continuous monitoring, data explosion, dynamic adaptation, unlimited mobility, and constrained devices resources. The integration of the architecture's components provides information about diseases' recurrences as soon as they occur to expedite taking necessary actions, and thus prevent severe consequences. Our architecture system is formally model-checked to automatically verify its correctness against designers' desirable properties at design time. Its components are fully implemented as Web services with respect to the SOA architecture to be easy to deploy and integrate, and supported by Cloud infrastructure and services to allow high scalability, availability of processes and data being stored and exchanged. The architecture's applicability is evaluated through concrete experimental scenarios on monitoring and visualizing states of epileptic diseases. The obtained theoretical and experimental results are very promising and efficiently satisfy the proposed

  11. A NASA Climate Model Data Services (CDS) End-to-End System to Support Reanalysis Intercomparison

    Science.gov (United States)

    Carriere, L.; Potter, G. L.; McInerney, M.; Nadeau, D.; Shen, Y.; Duffy, D.; Schnase, J. L.; Maxwell, T. P.; Huffer, E.

    2014-12-01

    The NASA Climate Model Data Service (CDS) and the NASA Center for Climate Simulation (NCCS) are collaborating to provide an end-to-end system for the comparative study of the major Reanalysis projects, currently, ECMWF ERA-Interim, NASA/GMAO MERRA, NOAA/NCEP CFSR, NOAA/ESRL 20CR, and JMA JRA25. Components of the system include the full spectrum of Climate Model Data Services; Data, Compute Services, Data Services, Analytic Services and Knowledge Services. The Data includes standard Reanalysis model output, and will be expanded to include gridded observations, and gridded Innovations (O-A and O-F). The NCCS High Performance Science Cloud provides the compute environment (storage, servers, and network). Data Services are provided through an Earth System Grid Federation (ESGF) data node complete with Live Access Server (LAS), Web Map Service (WMS) and Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) for visualization, as well as a collaborative interface through the Earth System CoG. Analytic Services include UV-CDAT for analysis and MERRA/AS, accessed via the CDS API, for computation services, both part of the CDS Climate Analytics as a Service (CAaaS). Knowledge Services include access to an Ontology browser, ODISEES, for metadata search and data retrieval. The result is a system that provides the ability for both reanalysis scientists and those scientists in need of reanalysis output to identify the data of interest, compare, compute, visualize, and research without the need for transferring large volumes of data, performing time consuming format conversions, and writing code for frequently run computations and visualizations.

  12. Including 10-Gigabit-capable Passive Optical Network under End-to-End Generalized Multi-Protocol Label Switching Provisioned Quality of Service

    DEFF Research Database (Denmark)

    Brewka, Lukasz Jerzy; Gavler, Anders; Wessing, Henrik

    2012-01-01

    End-to-end quality of service provisioning is still a challenging task despite many years of research and development in this area. Considering a generalized multi-protocol label switching based core/metro network and resource reservation protocol capable home gateways, it is the access part of the network where quality of service signaling is bridged. This article proposes strategies for generalized multi-protocol label switching control over the next emerging passive optical network standard, i.e., the 10-gigabit-capable passive optical network. Node management and resource allocation approaches are discussed, and possible issues are raised. The analysis shows that consideration of a 10-gigabit-capable passive optical network as a generalized multi-protocol label switching controlled domain is valid and may advance end-to-end quality of service provisioning for passive optical network based customers.

  13. End-to-End Performance of the future MOMA instrument aboard the ExoMars mission

    Science.gov (United States)

    Buch, A.; Pinnick, V. T.; Szopa, C.; Grand, N.; Danell, R.; van Amerom, F. H. W.; Freissinet, C.; Glavin, D. P.; Stalport, F.; Arevalo, R. D., Jr.; Coll, P. J.; Steininger, H.; Raulin, F.; Goesmann, F.; Mahaffy, P. R.; Brinckerhoff, W. B.

    2016-12-01

    After the SAM experiment aboard the Curiosity rover, the Mars Organic Molecule Analyzer (MOMA) experiment aboard the future ExoMars mission will continue the search for the organic composition of the Mars surface, with the advantage that the sample will be extracted from as deep as 2 meters below the Martian surface to minimize the effects of radiation and oxidation on organic materials. To analyse the wide range of organic compounds (volatile and non-volatile) in the Martian soil, MOMA combines UV laser desorption/ionization (LDI) and pyrolysis gas chromatography ion trap mass spectrometry (pyr-GC-ITMS). In order to analyse refractory organic compounds and chirality, samples which undergo GC-ITMS analysis may be submitted to a derivatization process, consisting of the reaction of the sample components with specific reactants (MTBSTFA [1], DMF-DMA [2] or TMAH [3]). To optimize and test the performance of the GC-ITMS instrument we have performed several coupling test campaigns between the GC, provided by the French team (LISA, LATMOS, CentraleSupelec), and the MS, provided by the US team (NASA, GSFC). The last campaign was carried out with the ITU model, which is similar to the flight model and which includes the oven and the tapping station provided by the German team (MPS). The results obtained demonstrate the current status of the end-to-end performance of the gas chromatography-mass spectrometry mode of operation. References: [1] Buch, A. et al. (2009) J. Chrom. A, 43, 143-151. [2] Freissinet et al. (2011) J. Chrom. A, 1306, 59-71. [3] Geffroy-Rodier, C. et al. (2009) JAAP, 85, 454-459. Acknowledgements: Funding provided by the Mars Exploration Program (point of contact, George Tahu, NASA/HQ). MOMA is a collaboration between NASA and ESA (PI Goesmann, MPS). The MOMA-GC team acknowledges support from the French Space Agency (CNES), the French National Programme of Planetology (PNP), the National French Council (CNRS), and the Pierre Simon Laplace Institute.

  14. Urban Biomining Meets Printable Electronics: End-To-End at Destination Biological Recycling and Reprinting

    Science.gov (United States)

    Rothschild, Lynn J. (Principal Investigator); Koehne, Jessica; Gandhiraman, Ram; Navarrete, Jesica; Spangle, Dylan

    2017-01-01

    Space missions rely utterly on metallic components, from the spacecraft to electronics. Yet metals add mass, and electronics have the additional problem of a limited lifespan. Thus, current mission architectures must compensate for replacement. In space, spent electronics are discarded; on Earth, there is some recycling, but current processes are toxic and environmentally hazardous. Imagine instead an end-to-end recycling of spent electronics at low mass, low cost, room temperature, and in a non-toxic manner. Here, we propose a solution that will not only enhance mission success by decreasing upmass and providing a fresh supply of electronics, but also has immediate applications to a serious environmental issue on Earth. Spent electronics will be used as feedstock to make fresh electronic components, a process we will accomplish with so-called 'urban biomining' using synthetically enhanced microbes to bind metals with elemental specificity. To create new electronics, the microbes will be used as 'bioink' to print a new IC chip, using plasma-jet electronics printing. The plasma-jet electronics printing technology will have the potential to use Martian atmospheric gas to print and to tailor the electronic and chemical properties of the materials. Our preliminary results suggest that this process also serves as a purification step to enhance the proportion of metals in the 'bioink'. The presence of an electric field and plasma can ensure printing in a microgravity environment while also providing material morphology and electronic structure tunability, and thus optimization. Here we propose to increase the TRL level of the concept by engineering microbes to dissolve the siliceous matrix in the IC, extract copper from a mixture of metals, and use the microbes as feedstock to print interconnects using Mars gas simulant. To assess the ability of this concept to influence mission architecture, we will do an analysis of the infrastructure required to execute this concept.

  15. Astra: Interdisciplinary study on enhancement of the end-to-end accuracy for spacecraft tracking techniques

    Science.gov (United States)

    Iess, Luciano; Di Benedetto, Mauro; James, Nick; Mercolino, Mattia; Simone, Lorenzo; Tortora, Paolo

    2014-02-01

    Navigation of deep-space probes is accomplished through a variety of different radio observables, namely Doppler, ranging, and Delta-Differential One-Way Ranging (Delta-DOR). The particular mix of observations used for navigation mainly depends on the available on-board radio system, the mission phase, and orbit determination requirements. The accuracy of current ESA and NASA tracking systems is at the level of 0.1 mm/s at 60 s integration time for Doppler, 1-5 m for ranging, and 6-15 nrad for Delta-DOR measurements in a wide range of operational conditions. The ASTRA study, funded under ESA's General Studies Programme (GSP), addresses ways to improve the end-to-end accuracy of Doppler, ranging, and Delta-DOR systems by roughly a factor of 10. The target accuracies were set to 0.01 mm/s at 60 s integration time for Doppler, 20 cm for ranging, and 1 nrad for Delta-DOR. The companies and universities that took part in the study were the University of Rome Sapienza, ALMASpace, BAE Systems, and Thales Alenia Space Italy. The analysis of an extensive data set of radio-metric observables and dedicated tests of the ground station made it possible to consolidate the error budget for each measurement technique. The radio-metric data set comprises X/X, X/Ka, and Ka/Ka range and Doppler observables from the Cassini and Rosetta missions. It also includes measurements from the Advanced Media Calibration System (AMCS) developed by JPL for the radio science experiments of the Cassini mission. The error budget for the three radio-metric observables was consolidated by comparing the statistical properties of the data set with the expected error models. The analysis confirmed the contribution from some error sources, but also revealed some discrepancies and ultimately led to improved error models. The error budget reassessment provides adequate information for building guidelines and strategies to effectively improve the navigation accuracies of future deep space missions. We report both on updated

  16. An End-to-End System to Enable Quick, Easy and Inexpensive Deployment of Hydrometeorological Stations

    Science.gov (United States)

    Celicourt, P.; Piasecki, M.

    2014-12-01

    The high cost of hydro-meteorological data acquisition, communication, and publication systems, along with limited qualified human resources, is considered the main reason why hydro-meteorological data collection remains a challenge, especially in developing countries. Despite significant advances in sensor network technologies, which in the last two decades gave rise to open hardware and software and to low-cost (less than $50), low-power (a few milliwatts) sensor platforms, sensor and sensor network deployment remains a labor-intensive, time-consuming, cumbersome, and thus expensive task. These factors give rise to the need for an affordable, simple-to-deploy, scalable, and self-organizing end-to-end (from sensor to publication) system suitable for deployment in such countries. The design of the envisioned system will consist of a few Sensed-And-Programmed Arduino-based sensor nodes with low-cost sensors measuring parameters relevant to hydrological processes, and a Raspberry Pi micro-computer hosting the in-the-field back-end data management. The latter comprises the Python/Django model of the CUAHSI Observations Data Model (ODM), namely DjangODM, backed by a PostgreSQL database server. We are also developing a Python-based data processing script which will be paired with the data autoloading capability of Django to populate the DjangODM database with the incoming data. To publish the data, we will use WOFpy (WaterOneFlow Web Services in Python), developed by the Texas Water Development Board for 'Water Data for Texas', which can produce WaterML web services from a variety of back-end databases such as SQLite, MySQL, and PostgreSQL. A step further would be the development of an appealing online visualization tool using Python statistics and analytics tools (Scipy, Numpy, Pandas) showing the spatial distribution of variables across an entire watershed as a time-variant layer on top of a basemap.
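
    A minimal sketch of the in-the-field ingest step described above, using SQLite in place of the PostgreSQL/DjangODM stack for self-containedness; the table layout, field names, and the comma-separated message format are simplified stand-ins, not the actual ODM schema.

```python
# Hypothetical ingest step: an Arduino node emits
# "<site>,<variable>,<iso-timestamp>,<value>" lines and a script on the
# Raspberry Pi appends them to an ODM-like table. Names are illustrative.
import sqlite3

conn = sqlite3.connect("hydromet.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS datavalues (
           site_code TEXT, variable_code TEXT,
           local_datetime TEXT, data_value REAL)"""
)

def ingest(line: str) -> None:
    """Parse one sensor report and append it to the data table."""
    site, variable, stamp, value = line.strip().split(",")
    conn.execute(
        "INSERT INTO datavalues VALUES (?, ?, ?, ?)",
        (site, variable, stamp, float(value)),
    )
    conn.commit()

ingest("SITE01,RainfallTB3,2014-10-02T14:05:00,0.2")
print(conn.execute("SELECT * FROM datavalues").fetchall())
```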

  17. Vision-based mobile robot navigation through deep convolutional neural networks and end-to-end learning

    Science.gov (United States)

    Zhang, Yachu; Zhao, Yuejin; Liu, Ming; Dong, Liquan; Kong, Lingqin; Liu, Lingling

    2017-09-01

    In contrast to humans, who use only visual information for navigation, many mobile robots use laser scanners and ultrasonic sensors along with vision cameras to navigate. This work proposes a vision-based robot control algorithm based on deep convolutional neural networks. We create a large 15-layer convolutional neural network learning system and achieve advanced recognition performance. Our system is trained end to end to map raw input images to directions in supervised mode. The images in the data sets are collected under a wide variety of weather and lighting conditions. In addition, the data sets are augmented by adding Gaussian noise and salt-and-pepper noise to avoid overfitting. The algorithm is verified in two experiments: line tracking and obstacle avoidance. The line tracking experiment is conducted to track a desired path composed of straight and curved lines; the goal of the obstacle avoidance experiment is to avoid obstacles indoors. We obtain a 3.29% error rate on the training set and a 5.1% error rate on the test set in the line tracking experiment, and a 1.8% error rate on the training set and less than a 5% error rate on the test set in the obstacle avoidance experiment. During the actual tests, the robot follows the runway centerline outdoors and avoids obstacles indoors accurately. The results confirm the effectiveness of the algorithm and of our improvements to the network structure and training parameters.
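
    For illustration, a much smaller network than the paper's 15-layer system, showing the supervised end-to-end image-to-direction mapping; the layer sizes and the three-way left/straight/right output are assumptions for the sketch, not the authors' architecture.

```python
# Illustrative sketch (not the paper's network): a small convolutional
# net trained end to end to map a raw RGB camera frame to one of three
# steering directions (left, straight, right).
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
    nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(64, 3),                     # logits: left / straight / right
)

images = torch.randn(8, 3, 120, 160)      # a batch of camera frames
labels = torch.randint(0, 3, (8,))        # supervised direction labels
loss = nn.CrossEntropyLoss()(model(images), labels)
loss.backward()                           # end-to-end gradient step
print(loss.item())
```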

  18. Direct muscle neurotization after end-to-end and end-to-side neurorrhaphy: An experimental study in the rat forelimb model.

    Science.gov (United States)

    Papalia, Igor; Ronchi, Giulia; Muratori, Luisa; Mazzucco, Alessandra; Magaudda, Ludovico; Geuna, Stefano

    2012-10-15

    The need for continued research into new tools for improving motor function recovery after nerve injury is justified by the still often unsatisfactory clinical outcome in these patients. It has been previously shown that the combined use of two reconstructive techniques, namely end-to-side neurorrhaphy and direct muscle neurotization, in the rat hindlimb model can lead to good results in terms of skeletal muscle reinnervation. Here we show that, in the rat forelimb model, the combined use of direct muscle neurotization with either end-to-end or end-to-side neurorrhaphy to reinnervate the denervated flexor digitorum muscles prevents muscle atrophy over a long postoperative period (10 months). By contrast, very little motor recovery (in the case of end-to-end neurorrhaphy) and almost no motor recovery (in the case of end-to-side neurorrhaphy) were observed in the grasping activity controlled by the flexor digitorum muscles. It can thus be concluded that, at least in the rat, direct muscle neurotization after both end-to-end and end-to-side neurorrhaphy represents a good strategy for preventing denervation-related muscle atrophy but not for regaining lost motor function.

  19. Unidata's Vision for Providing Comprehensive and End-to-end Data Services

    Science.gov (United States)

    Ramamurthy, M. K.

    2009-05-01

    This paper presents Unidata's vision for providing comprehensive, well-integrated, and end-to-end data services for the geosciences. These include an array of functions for collecting, finding, and accessing data; data management tools for generating, cataloging, and exchanging metadata; and submitting or publishing, sharing, analyzing, visualizing, and integrating data. When this vision is realized, users, no matter where they are or how they are connected to the Internet, will be able to find and access a plethora of geosciences data and use Unidata-provided tools and services both productively and creatively in their research and education. What that vision means for the Unidata community is elucidated by drawing a simple analogy. Most users are familiar with the Amazon and eBay e-commerce sites and with content-sharing sites like YouTube and Flickr. On the eBay marketplace, people can sell practically anything at any time, and buyers can share their experience of purchasing a product or the reputation of a seller. Likewise, at Amazon, thousands of merchants sell their goods and millions of customers not only buy those goods, but provide reviews or opinions of the products they buy and share their experiences as purchasers. Similarly, YouTube and Flickr are sites tailored to video and photo sharing, respectively, where users can upload their own content and share it with millions of other users, including family and friends. What all these sites, together with social-networking applications like MySpace and Facebook, have enabled is a sense of a virtual community in which users can search and browse products or content, comment on and rate those products from anywhere, at any time, and via any Internet-enabled device like an iPhone, laptop, or desktop computer. In essence, these enterprises have fundamentally altered people's buying modes and behavior toward purchases. Unidata believes that similar approaches, appropriately tailored to meet the needs of the scientific

  20. On the importance of risk knowledge for an end-to-end tsunami early warning system

    Science.gov (United States)

    Post, Joachim; Strunz, Günter; Riedlinger, Torsten; Mück, Matthias; Wegscheider, Stephanie; Zosseder, Kai; Steinmetz, Tilmann; Gebert, Niklas; Anwar, Herryal

    2010-05-01

    context has been worked out. The generated results contribute significantly in the fields of (1) warning decision and warning levels, (2) warning dissemination and warning message content, (3) early warning chain planning, (4) increasing response capabilities and protective systems, (5) emergency relief, and (6) enhancing communities' awareness of and preparedness for tsunami threats. Additionally, examples will be given of the potential operational use of risk information in early warning systems, as first experiences exist for the tsunami early warning center in Jakarta, Indonesia. Besides this, the importance of linking national-level early warning information with tsunami risk information available at the local level (e.g., linking warning message information on expected intensity with the respective tsunami hazard zone maps at community level for effective evacuation) will be demonstrated through experiences gained in three pilot areas in Indonesia. The presentation seeks to provide new insights into the benefits of using risk information in early warning and will provide further evidence that practical use of risk information is an important and indispensable component of end-to-end early warning.

  1. An end-to-end coupled model ROMS-N2P2Z2D2-OSMOSE of ...

    African Journals Online (AJOL)

    In order to better understand ecosystem functioning under simultaneous pressures (e.g. both climate change and fishing pressures), integrated modelling approaches are advocated. We developed an end-to-end model of the southern Benguela ecosystem by coupling the high trophic level model OSMOSE with a ...

  2. Sutureless functional end-to-end anastomosis using a linear stapler with polyglycolic acid felt for intestinal anastomoses

    Directory of Open Access Journals (Sweden)

    Masanori Naito, MD, PhD

    2017-05-01

    Conclusion: Sutureless functional end-to-end anastomosis using the Endo GIA™ Reinforced appears to be safe, efficacious, and straightforward. Reinforcement of the crotch site with a bioabsorbable polyglycolic acid sheet appears to mitigate conventional problems with crotch-site vulnerability.

  3. A vision for end-to-end data services to foster international partnerships through data sharing

    Science.gov (United States)

    Ramamurthy, M.; Yoksas, T.

    2009-04-01

    Increasingly, the conduct of science requires scientific partnerships and sharing of knowledge, information, and other assets. This is particularly true in our field where the highly-coupled Earth system and its many linkages have heightened the importance of collaborations across geographic, disciplinary, and organizational boundaries. The climate system, for example, is far too complex a puzzle to be unraveled by individual investigators or nations. As articulated in the NSF Strategic Plan: FY 2006-2011, "…discovery increasingly requires expertise of individuals from different disciplines, with diverse perspectives, and often from different nations, working together to accommodate the extraordinary complexity of today's science and engineering challenges." The Nobel Prize winning IPCC assessments are a prime example of such an effort. Earth science education is also uniquely suited to drawing connections between the dynamic Earth system and societal issues. Events like the 2004 Indian Ocean tsunami and Hurricane Katrina provide ample evidence of this relevance, as they underscore the importance of timely and interdisciplinary integration and synthesis of data. Our success in addressing such complex problems and advancing geosciences depends on the availability of a state-of-the-art and robust cyberinfrastructure, transparent and timely access to high-quality data from diverse sources, and requisite tools to integrate and use the data effectively, toward creating new knowledge. To that end, Unidata's vision calls for providing comprehensive, well-integrated, and end-to-end data services for the geosciences. These include an array of functions for collecting, finding, and accessing data; data management tools for generating, cataloging, and exchanging metadata; and submitting or publishing, sharing, analyzing, visualizing, and integrating data. When this vision is realized, users — no matter where they are, how they are connected to the Internet, or what

  4. End-to-End demonstrator of the Safe Affordable Fission Engine (SAFE) 30: Power conversion and ion engine operation

    Science.gov (United States)

    Hrbud, Ivana; van Dyke, Melissa; Houts, Mike; Goodfellow, Keith

    2002-01-01

    The Safe Affordable Fission Engine (SAFE) test series addresses Phase 1 Space Fission Systems issues, in particular non-nuclear testing and system integration issues, leading to the testing and non-nuclear demonstration of a 400-kW fully integrated flight unit. The first part of the SAFE 30 test series demonstrated operation of the simulated nuclear core and heat pipe system. Experimental data acquired in a number of different test scenarios will validate existing computational models, demonstrate system flexibility (fast start-ups, multiple start-ups/shut-downs), and simulate predictable failure modes and operating environments. The objective of the second part is to demonstrate an integrated propulsion system consisting of a core, a conversion system, and a thruster, in which the system converts thermal heat into jet power. This end-to-end system demonstration sets a precedent for ground testing of nuclear electric propulsion systems. The paper describes the SAFE 30 end-to-end system demonstration and its subsystems.

  5. A GMPLS/OBS network architecture enabling QoS-aware end-to-end burst transport

    OpenAIRE

    Pedroso, Pedro; Perelló Muntan, Jordi; Spadaro, Salvatore; Careglio, Davide; Solé Pareta, Josep; Klinkowski, Miroslaw

    2010-01-01

    This paper introduces a Generalized Multi-Protocol Label Switching (GMPLS)-enabled Optical Burst Switched (OBS) network architecture featuring end-to-end QoS-aware burst transport services. This is achieved by setting up burst Label Switched Paths (LSPs) properly dimensioned to match specific burst drop probability requirements. These burst LSPs are used for specific guaranteed QoS levels, whereas the remaining network capacity can be left for best-effort burst support. Aiming to ensure...

  6. Minimizing End-to-End Interference in I/O Stacks Spanning Shared Multi-Level Buffer Caches

    Science.gov (United States)

    Patrick, Christina M.

    2011-01-01

    This thesis presents a uniquely designed, end-to-end interference-minimizing, high-performance I/O stack that spans multi-level shared buffer cache hierarchies accessing shared I/O servers. In this thesis, I show that I can build a superior I/O stack which minimizes the inter-application interference…

  7. Hybrid monitoring scheme for end-to-end performance enhancement of multicast-based real-time media

    Science.gov (United States)

    Park, Ju-Won; Kim, JongWon

    2004-10-01

    As real-time media applications based on IP multicast networks spread widely, end-to-end QoS (quality of service) provisioning for these applications has become very important. To guarantee the end-to-end QoS of multi-party media applications, it is essential to monitor the time-varying status of both network metrics (i.e., delay, jitter, and loss) and system metrics (i.e., CPU and memory utilization). In this paper, targeting the multicast-enabled AG (Access Grid), a next-generation group collaboration tool based on multi-party media services, we investigate the applicability of a hybrid monitoring scheme that combines active and passive monitoring. Active monitoring measures network-layer metrics (i.e., network condition) with probe packets, while passive monitoring checks both application-layer metrics (i.e., user traffic condition, by analyzing RTCP packets) and system metrics. By comparing these hybrid results, we attempt to pinpoint the causes of performance degradation and explore corresponding reactions to improve the end-to-end performance. The experimental results show that the proposed hybrid monitoring can provide useful information to coordinate the performance improvement of multi-party real-time media applications.
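
    A toy illustration of the hybrid idea, under stated assumptions: an active probe samples a network-layer metric (TCP connect latency), a passive reading of host load stands in for the system metrics, and comparing the two suggests whether degradation is network- or host-induced. The thresholds are arbitrary and the probe is not the paper's RTCP-based scheme.

```python
# Hybrid active/passive sketch; thresholds are invented for illustration.
import os
import socket
import time

def active_probe(host: str, port: int = 80) -> float:
    """Return the TCP connect latency to host:port in milliseconds."""
    start = time.monotonic()
    with socket.create_connection((host, port), timeout=2.0):
        pass
    return (time.monotonic() - start) * 1000.0

rtt_ms = active_probe("example.com")       # active, network-layer metric
load1, _, _ = os.getloadavg()              # passive host metric (Unix only)

if rtt_ms > 200.0:
    print(f"network-side degradation suspected ({rtt_ms:.1f} ms)")
elif load1 > (os.cpu_count() or 1):
    print(f"host-side degradation suspected (load {load1:.2f})")
else:
    print(f"nominal: rtt={rtt_ms:.1f} ms, load={load1:.2f}")
```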

  8. Sleep/wake scheduling scheme for minimizing end-to-end delay in multi-hop wireless sensor networks

    Directory of Open Access Journals (Sweden)

    Madani Sajjad

    2011-01-01

    We present a sleep/wake scheduling protocol for minimizing end-to-end delay in event-driven multi-hop wireless sensor networks. In contrast to generic sleep/wake scheduling schemes, our proposed algorithm performs scheduling that depends on traffic loads. Nodes adapt their sleep/wake schedule to traffic loads in response to three important factors: (a) the distance of the node from the sink node, (b) the importance of the node's location from a connectivity perspective, and (c) whether the node is in the proximity of an event. Using these heuristics, the proposed scheme reduces end-to-end delay and maximizes throughput by minimizing congestion at nodes with heavy traffic loads. Simulations are carried out to evaluate the performance of the proposed protocol against the S-MAC and Anycast protocols. Simulation results demonstrate that the proposed protocol significantly reduces end-to-end delay and improves other QoS parameters, such as average energy per packet, average delay, packet loss ratio, throughput, and coverage lifetime.
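
    A hedged sketch of the kind of traffic-adaptive duty cycle the abstract describes: nodes nearer the sink, nodes critical for connectivity, and nodes near an active event stay awake longer. The weights and the mapping are invented for illustration, not the protocol's actual rules.

```python
# Toy duty-cycle heuristic combining the three factors listed above.
def wake_fraction(hops_to_sink: int, is_cut_vertex: bool,
                  near_event: bool) -> float:
    """Return the fraction of each cycle a node keeps its radio on."""
    duty = 0.1                           # baseline duty cycle (assumption)
    duty += 0.4 / max(hops_to_sink, 1)   # relays near the sink carry more load
    if is_cut_vertex:
        duty += 0.2                      # node is critical for connectivity
    if near_event:
        duty += 0.3                      # event traffic must move quickly
    return min(duty, 1.0)

print(wake_fraction(hops_to_sink=1, is_cut_vertex=True, near_event=False))
print(wake_fraction(hops_to_sink=6, is_cut_vertex=False, near_event=True))
```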

  9. Intelligent End-To-End Resource Virtualization Using Service Oriented Architecture

    NARCIS (Netherlands)

    Onur, E.; Sfakianakis, E.; Papagianni, C.; Karagiannis, Georgios; Kontos, T.; Niemegeers, I.G.M.M.; Niemegeers, I.; Chochliouros, I.; Heemstra de Groot, S.M.; Sjödin, P.; Hidell, M.; Cinkler, T.; Maliosz, M.; Kaklamani, D.I.; Carapinha, J.; Belesioti, M.; Futrps, E.

    2009-01-01

    Service-oriented architecture can be considered a philosophy or paradigm for organizing and utilizing services and capabilities that may be under the control of different ownership domains. Virtualization provides abstraction and isolation of lower-level functionalities, enabling portability of

  10. End-to-End Concurrent Multipath Transfer Using Transport Layer Multihoming

    Science.gov (United States)

    2006-07-01

    To provide insight into the ambient conditions under which cwnd overgrowth can be observed with SCTP, we develop an analytical model of this behavior and analyze ... example in Section 6.2. The goal of this model is to provide insight into the ambient conditions under which cwnd overgrowth can be observed, thus ... to congestion control. While some initial work in the area demonstrates feasibility [53], further work is needed to determine how these techniques

  11. End-to-end observatory software modeling using domain specific languages

    Science.gov (United States)

    Filgueira, José M.; Bec, Matthieu; Liu, Ning; Peng, Chien; Soto, José

    2014-07-01

    The Giant Magellan Telescope (GMT) is a 25-meter extremely large telescope that is being built by an international consortium of universities and research institutions. Its software and control system is being developed using a set of Domain Specific Languages (DSLs) that support a model-driven development methodology integrated with an Agile management process. This approach promotes the use of standardized models that capture the component architecture of the system, facilitate the construction of technical specifications in a uniform way, ease communication between developers and domain experts, and provide a framework to ensure the successful integration of the software subsystems developed by the GMT partner institutions.

  12. CUSat: An End-to-End In-Orbit Inspection System University Nanosatellite Program

    Science.gov (United States)

    2007-01-01


  13. End-to-end workflow for finite element analysis of tumor treating fields in glioblastomas

    Science.gov (United States)

    Timmons, Joshua J.; Lok, Edwin; San, Pyay; Bui, Kevin; Wong, Eric T.

    2017-11-01

    Tumor Treating Fields (TTFields) therapy is an approved modality of treatment for glioblastoma. Patient anatomy-based finite element analysis (FEA) has the potential to reveal not only how these fields affect tumor control but also how to improve efficacy. While automated tools for segmentation speed up the generation of FEA models, multi-step manual corrections are required, including removal of disconnected voxels, incorporation of unsegmented structures, and the addition of 36 electrodes plus gel layers matching the TTFields transducers. Existing approaches are also not scalable for the high-throughput analysis of large patient volumes. A semi-automated workflow was developed to prepare FEA models for TTFields mapping in the human brain. Magnetic resonance imaging (MRI) pre-processing, segmentation, electrode and gel placement, and post-processing were all automated. The material properties of each tissue were applied to their corresponding mask in silico using COMSOL Multiphysics (COMSOL, Burlington, MA, USA). The fidelity of the segmentations with and without post-processing was compared against the full semi-automated segmentation workflow using Dice coefficient analysis. The average relative differences for the electric fields generated by COMSOL were calculated, in addition to observed differences in electric field-volume histograms. Furthermore, the MPHTXT and NASTRAN mesh file formats were also compared using the differences in the electric field-volume histogram. The Dice coefficient was lower for auto-segmentation without post-processing than for auto-segmentation with post-processing, indicating convergence on a manually corrected model. A measurable but marginal relative difference between the electric field maps from models with and without manual correction was identified, and a clear advantage of using the NASTRAN mesh file format was found. The software and workflow outlined in this article may be used to accelerate the investigation of TTFields in
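
    The fidelity comparison above rests on the Dice coefficient between segmentation masks; a minimal NumPy version for binary masks looks like the sketch below (the masks themselves are random placeholders, not patient data).

```python
# Dice coefficient 2|A∩B|/(|A|+|B|) for boolean segmentation masks.
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

auto = np.random.rand(64, 64, 64) > 0.5      # stand-in auto-segmented mask
corrected = auto.copy()
corrected[:4] ^= True                        # simulate manual corrections
print(f"Dice = {dice(auto, corrected):.3f}")
```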

  14. Understanding Effect of Constraint Release Environment on End-to-End Vector Relaxation of Linear Polymer Chains

    KAUST Repository

    Shivokhin, Maksim E.

    2017-05-30

    We propose and verify methods based on the slip-spring (SSp) model [Macromolecules 2005, 38, 14] for predicting the effect of any monodisperse, binary, or ternary environment of topological constraints on the relaxation of the end-to-end vector of a linear probe chain. For this purpose we first validate the ability of the model to consistently predict both the viscoelastic and dielectric response of monodisperse and binary mixtures of type A polymers, based on published experimental data. We also report the synthesis of new binary and ternary polybutadiene systems, the measurement of their linear viscoelastic response, and the prediction of these data by the SSp model. We next clarify the relaxation mechanisms of probe chains in these constraint release (CR) environments by analyzing a set of "toy" SSp models with simplified constraint release rates and by examining fluctuations of the end-to-end vector. In our analysis, the longest relaxation time of the probe chain is determined by a competition between the longest relaxation times of the effective CR motions of the fat and thin tubes and the motion of the chain itself in the thin tube. This picture is tested by the analysis of four model systems designed to separate and estimate every single contribution involved in the relaxation of the probe's end-to-end vector in polydisperse systems. We follow the CR picture of Viovy et al. [Macromolecules 1991, 24, 3587] and refine the effective chain friction in the thin and fat tubes based on Read et al. [J. Rheol. 2012, 56, 823]. The derived analytical equations form a basis for generalizing the proposed methodology to polydisperse mixtures of linear and branched polymers. The consistency between the SSp model and tube model predictions is a strong indicator of the compatibility between these two distinct mesoscopic frameworks.
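
    For reference, the quantity tracked throughout this abstract is the end-to-end vector autocorrelation; its standard textbook definition (not specific to this paper) is

```latex
% Normalized end-to-end vector autocorrelation; its terminal decay
% defines the probe chain's longest relaxation time \tau.
\[
  \Phi(t) = \frac{\langle \mathbf{R}(t)\cdot\mathbf{R}(0)\rangle}
                 {\langle \mathbf{R}^{2}\rangle},
  \qquad
  \Phi(t) \sim e^{-t/\tau} \quad (t \to \infty).
\]
```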

  15. End-to-end testing. [To verify electrical equipment failure due to carbon fibers released in aircraft-fuel fires]

    Science.gov (United States)

    Pride, R. A.

    1979-01-01

    The principal objective of the kinds of demonstration tests that are discussed is to verify whether or not carbon fibers released by burning composite parts in aircraft-fuel fires can produce failures in electrical equipment. A secondary objective discussed is to experimentally validate the analytical models for some of the key elements in the risk analysis. The approach to this demonstration testing is twofold: limited end-to-end tests are to be conducted in a shock tube, and planning for some large outdoor burn tests is under way.

  16. Automated Design of Propellant-Optimal, End-to-End, Low-Thrust Trajectories for Trojan Asteroid Tours

    Science.gov (United States)

    Stuart, Jeffrey; Howell, Kathleen; Wilson, Roby

    2013-01-01

    The Sun-Jupiter Trojan asteroids are celestial bodies of great scientific interest as well as potential sources of water and other mineral resources for long-term human exploration of the solar system. Previous investigations under this project have addressed the automated design of tours within the asteroid swarm. This investigation expands the current automation scheme by incorporating options for a complete trajectory design approach to the Trojan asteroids. Computational aspects of the design procedure are automated such that end-to-end trajectories are generated with a minimum of human interaction after key elements and constraints associated with a proposed mission concept are specified.

  17. AN AUTOMATED END-TO-END MULTI-AGENT QOS BASED ARCHITECTURE FOR SELECTION OF GEOSPATIAL WEB SERVICES

    Directory of Open Access Journals (Sweden)

    M. Shah

    2012-07-01

    With the proliferation of web services published over the internet, multiple web services may provide similar functionality but with different non-functional properties. Thus, Quality of Service (QoS) offers a metric to differentiate the services and their service providers. In a quality-driven selection of web services, it is important to consider the non-functional properties of a web service so as to satisfy the constraints or requirements of the end users. The main intent of this paper is to build an automated end-to-end multi-agent based solution that provides the best-fit web service to the service requester based on QoS.
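
    A toy version of the QoS-driven selection step described above: each candidate service exposes non-functional metrics, and the best fit maximizes a weighted score. The service names, metrics, weights, and values are all invented for illustration.

```python
# Hypothetical QoS records for three candidate geospatial web services.
candidates = {
    "GeoWS-A": {"response_ms": 120, "availability": 0.999, "cost": 0.02},
    "GeoWS-B": {"response_ms": 45,  "availability": 0.990, "cost": 0.10},
    "GeoWS-C": {"response_ms": 300, "availability": 0.999, "cost": 0.01},
}

def score(q: dict) -> float:
    # Lower latency and cost are better; higher availability is better.
    return (0.4 * (1.0 - min(q["response_ms"], 1000) / 1000)
            + 0.4 * q["availability"]
            + 0.2 * (1.0 - min(q["cost"], 1.0)))

best = max(candidates, key=lambda name: score(candidates[name]))
print(best, round(score(candidates[best]), 3))
```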

  18. End-to-End Printed-Circuit Board Assembly Design Using Altium Designer and Solid Works Systems

    Directory of Open Access Journals (Sweden)

    A. M. Goncharenko

    2015-01-01

    The main goal of this white paper is to investigate methods to accelerate the end-to-end simultaneous development of electronic PC assemblies in MCAD/ECAD systems. With rising production rates and quantities of electronic equipment, there is a need to speed up the delivery of new products. The article offers an alternative approach to end-to-end simultaneous development in the Altium Designer / Solid Works CAD/CAE systems, which enables a radically shortened time to design new devices and component databases. The first part of the paper analyses the methods and models used to solve the tasks of end-to-end simultaneous development of PC assemblies using the Circuit Works module for Solid Works. It examines the problems of traditional data exchange between Altium Designer and Solid Works arising from the limitations of the IDF 2.0 format used, from the problems of 3D models of components, and from the need to maintain two different databases. The second part gives guidelines and an example of end-to-end simultaneous PC assembly development using the Altium Modeler module for Solid Works aimed at Altium Designer, and presents a brief review of the algorithms. The proposed method neither requires an additional database nor uses an intermediate format such as IDF. The module translates the PCB model directly to Solid Works to generate the assembly model. The Altium Modeler can also update the assembly it has created in Solid Works, which is very useful when components or the PCB itself are modified. This approach is better tailored to end-to-end development in terms of acceleration, easing simultaneous work in different MCAD/ECAD systems, and eliminating errors arising from the need to maintain two CAD databases of the same functionality. In conclusion, the paper gives suggestions for using the modules for simultaneous development of electronic PC assemblies in Altium Designer and Solid Works.

  19. End-to-End Joint Antenna Selection Strategy and Distributed Compress and Forward Strategy for Relay Channels

    Directory of Open Access Journals (Sweden)

    Rahul Vaze

    2009-01-01

    Multihop relay channels use multiple relay stages, each with multiple relay nodes, to facilitate communication between a source and a destination. Previously, distributed space-time codes were proposed to maximize the achievable diversity-multiplexing tradeoff; however, they fail to achieve all the points of the optimal diversity-multiplexing tradeoff. In the presence of a low-rate feedback link from the destination to each relay stage and the source, this paper proposes an end-to-end antenna selection (EEAS) strategy as an alternative to distributed space-time codes. The EEAS strategy uses a subset of antennas of each relay stage for transmission of the source signal to the destination, with amplifying and forwarding at each relay stage. The subsets are chosen to maximize the end-to-end mutual information at the destination. The EEAS strategy achieves the corner points of the optimal diversity-multiplexing tradeoff (corresponding to maximum diversity gain and maximum multiplexing gain) and achieves better diversity gain at intermediate values of multiplexing gain than the best-known distributed space-time coding strategies. A distributed compress-and-forward (CF) strategy is also proposed to achieve all points of the optimal diversity-multiplexing tradeoff for a two-hop relay channel with multiple relay nodes.
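
    An illustrative brute-force rendering of the selection rule for a single two-hop amplify-and-forward path: try antenna subsets at the relay stage and keep the one maximizing end-to-end mutual information. The per-antenna SNRs are made-up numbers, and summing SNRs over a subset is a crude combining stand-in, so this is a sketch of the idea rather than the paper's scheme; only the two-hop AF composition g1*g2/(g1+g2+1) is the standard formula.

```python
from itertools import combinations
from math import log2

hop1 = {"a1": 8.0, "a2": 3.5, "a3": 6.0}   # source->relay SNR per antenna
hop2 = {"a1": 2.0, "a2": 9.0, "a3": 4.0}   # relay->destination SNR per antenna

def mutual_info(subset) -> float:
    g1 = sum(hop1[a] for a in subset)      # crude combining assumption
    g2 = sum(hop2[a] for a in subset)
    g_eff = g1 * g2 / (g1 + g2 + 1.0)      # standard two-hop AF effective SNR
    return log2(1.0 + g_eff)

best = max((s for k in (1, 2, 3) for s in combinations(hop1, k)),
           key=mutual_info)
print(best, round(mutual_info(best), 3))
```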

  20. Context-driven, prescription-based personal activity classification: methodology, architecture, and end-to-end implementation.

    Science.gov (United States)

    Xu, James Y; Chang, Hua-I; Chien, Chieh; Kaiser, William J; Pottie, Gregory J

    2014-05-01

    Enabling large-scale monitoring and classification of a range of motion activities is of primary importance due to the need by healthcare and fitness professionals to monitor exercises for quality and compliance. Past work has not fully addressed the unique challenges that arise from scaling. This paper presents a novel end-to-end system solution to some of these challenges. The system is built on the prescription-based context-driven activity classification methodology. First, we show that by refining the definition of context, and introducing the concept of scenarios, a prescription model can provide personalized activity monitoring. Second, through a flexible architecture constructed from interface models, we demonstrate the concept of a context-driven classifier. Context classification is achieved through a classification committee approach, and activity classification follows by means of context specific activity models. Then, the architecture is implemented in an end-to-end system featuring an Android application running on a mobile device, and a number of classifiers as core classification components. Finally, we use a series of experimental field evaluations to confirm the expected benefits of the proposed system in terms of classification accuracy, rate, and sensor operating life.
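
    A minimal sketch of the context-driven idea described above: a committee of simple context detectors votes on the current context, which then routes the sensor features to a context-specific activity model. The contexts, features, and decision rules here are invented placeholders, not the paper's models.

```python
# Committee-based context classification followed by a context-specific
# activity model; all names and thresholds are hypothetical.
from collections import Counter

def committee_vote(features: dict) -> str:
    votes = [
        "gym" if features["accel_var"] > 2.0 else "home",
        "gym" if features["heart_rate"] > 110 else "home",
        "gym" if features["gps_speed"] < 0.5 else "outdoor",
    ]
    return Counter(votes).most_common(1)[0][0]

ACTIVITY_MODELS = {
    "gym":     lambda f: "rowing" if f["accel_var"] > 3.0 else "lifting",
    "home":    lambda f: "resting",
    "outdoor": lambda f: "walking" if f["gps_speed"] < 2.5 else "running",
}

sample = {"accel_var": 3.4, "heart_rate": 125, "gps_speed": 0.1}
context = committee_vote(sample)
print(context, "->", ACTIVITY_MODELS[context](sample))
```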

  1. Risk Factors for Dehiscence of Stapled Functional End-to-End Intestinal Anastomoses in Dogs: 53 Cases (2001-2012).

    Science.gov (United States)

    Snowdon, Kyle A; Smeak, Daniel D; Chiang, Sharon

    2016-01-01

    To identify risk factors for dehiscence in stapled functional end-to-end anastomoses (SFEEA) in dogs. Retrospective case series. Dogs (n = 53) requiring an enterectomy. Medical records from a single institution for all dogs undergoing an enterectomy (2001-2012) were reviewed. Surgeries were included when gastrointestinal (GIA) and thoracoabdominal (TA) stapling equipment was used to create a functional end-to-end anastomosis between segments of small intestine or small and large intestine in dogs. Information regarding preoperative, surgical, and postoperative factors was recorded. Anastomotic dehiscence was noted in 6 of 53 cases (11%), with a mortality rate of 83%. The only preoperative factor significantly associated with dehiscence was the presence of inflammatory bowel disease (IBD). Surgical factors significantly associated with dehiscence included the presence, duration, and number of intraoperative hypotensive periods, and location of anastomosis, with greater odds of dehiscence in anastomoses involving the large intestine. IBD, location of anastomosis, and intraoperative hypotension are risk factors for intestinal anastomotic dehiscence after SFEEA in dogs. Previously suggested risk factors (low serum albumin concentration, preoperative septic peritonitis, and intestinal foreign body) were not confirmed in this study. © Copyright 2015 by The American College of Veterinary Surgeons.

  2. Benefits of Intraluminal Agarose Stents during End-to-End Intestinal Anastomosis in New Zealand White Rabbits.

    Science.gov (United States)

    Kuo, Wen-Yao; Huang, Hsiao-Chun; Huang, Shih-Wei; Yu, Kuan-Hua; Cheng, Feng-Pang; Wang, Jiann-Hsiung; Wu, Jui-Te

    2017-12-01

    In the present study, we evaluated the utility of an intraluminal agarose stent (IAS) for end-to-end intestinal anastomoses in rabbits. Female New Zealand white rabbits (n = 14) underwent conventional sutured anastomosis (CSA) with or without an IAS. The IAS was used to maintain the luminal diameter for more rapid and accurate suturing and was then squeezed transluminally to crush it into fragments, which passed through the intestines and were eliminated. The rabbits were euthanized on postoperative day 21. At necropsy, the anastomoses were assessed for adhesion formation, stenosis, and bursting pressure and were examined histologically for collagen content and blood vessel formation. Anastomosis surgery took less time in the IAS group (15.0 ± 2.6 min) than in the CSA-only group (30.1 ± 7.9 min). Only 1 postoperative death occurred (in the CSA group), and postmortem examination revealed evidence of anastomotic leakage. Adhesion formation and stenosis did not differ between groups, but bursting pressure, collagen content, and blood vessel formation were all significantly increased in the IAS group. IAS may decrease the operative time by maintaining a clear surgical field at the anastomotic site. In addition, the use of IAS promotes rapid healing and maintains the luminal diameter during end-to-end intestinal anastomosis.

  3. End-to-end self-assembly of gold nanorods in isopropanol solution: experimental and theoretical studies

    Energy Technology Data Exchange (ETDEWEB)

    Gordel, M., E-mail: marta.gordel@pwr.edu.pl [Wrocław University of Technology, Advanced Materials Engineering and Modelling Group, Faculty of Chemistry (Poland); Piela, K., E-mail: katarzyna.piela@pwr.edu.pl [Wrocław University of Technology, Department of Physical and Quantum Chemistry (Poland); Kołkowski, R. [Wrocław University of Technology, Advanced Materials Engineering and Modelling Group, Faculty of Chemistry (Poland); Koźlecki, T. [Wrocław University of Technology, Department of Chemical Engineering, Faculty of Chemistry (Poland); Buckle, M. [CNRS, École Normale Supérieure de Cachan, Laboratoire de Biologie et Pharmacologie Appliquée (France); Samoć, M. [Wrocław University of Technology, Advanced Materials Engineering and Modelling Group, Faculty of Chemistry (Poland)

    2015-12-15

    We describe here a modification of the properties of colloidal gold nanorods (NRs) resulting from the chemical treatment used to carry out their transfer into isopropanol (IPA) solution. The NRs acquire a tendency to attach to one another by their ends (end-to-end assembly). We focus on the investigation of the change in position and shape of the longitudinal surface plasmon (l-SPR) band after self-assembly. The experimental results are supported by a theoretical calculation, which rationalizes the dramatic change in optical properties when the NRs are positioned end-to-end at short distances. The detailed spectroscopic characterization performed at the consecutive stages of transfer of the NRs from water into IPA solution revealed the features of the interaction between the polymers used as ligands and their contribution to the final stage, when the NRs were dispersed in IPA solution. The efficient method of aligning the NRs detailed here may facilitate applications of the self-assembled NRs as building blocks for optical materials and biological sensing.

  4. End-to-end Structural Restriction of α-Synuclein and Its Influence on Amyloid Fibril Formation

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Chul Suk; Park, Jae Hyung; Choe, Young Jun; Paik, Seung R. [Seoul National University, Seoul (Korea, Republic of)

    2014-09-15

    The relationship between the molecular freedom of an amyloidogenic protein and its self-assembly into amyloid fibrils has been evaluated with α-synuclein, an intrinsically unfolded protein related to Parkinson's disease, by restricting its structural plasticity through an end-to-end disulfide bond formed between two newly introduced cysteine residues at the N- and C-termini. Although the resulting circular form of α-synuclein exhibited an impaired fibrillation propensity, the restriction did not completely block the protein's interactive core, since co-incubation with wild-type α-synuclein dramatically facilitated fibrillation by producing distinctive forms of amyloid fibrils. The suppressed fibrillation propensity was instantly restored when the structural restriction was released with β-mercaptoethanol. Conformational flexibility of the accreting amyloidogenic protein to pre-existing seeds has been demonstrated to be critical for the fibrillar extension process, which requires structural adjustment to a complementary structure for assembly.

  5. Self-assembled nanogaps via seed-mediated growth of end-to-end linked gold nanorods

    DEFF Research Database (Denmark)

    Jain, Titoo; Westerlund, Axel Rune Fredrik; Johnson, Erik

    2009-01-01

    Gold nanorods (AuNRs) are of interest for a wide range of applications, ranging from imaging to molecular electronics, and they have been studied extensively for the past decade. An important issue in AuNR applications is the ability to self-assemble the rods in predictable structures on the nanoscale. We here present a new way to end-to-end link AuNRs with a single or few linker molecules. Whereas methods reported in the literature so far rely on modification of the AuNRs after the synthesis, we here dimerize gold nanoparticle seeds with a water-soluble dithiol-functionalized polyethylene ... that a large fraction of the rods are flexible around the hinging molecule in solution, as expected for a molecularly linked nanogap. By using excess of gold nanoparticles relative to the linking dithiol molecule, this method can provide a high probability that a single molecule is connecting the two rods ...

  6. Long-wavelength optical properties of a plasmonic crystal composed of end-to-end nanorod dimers

    Directory of Open Access Journals (Sweden)

    X. Q. Yu

    2013-06-01

    We theoretically investigate the long-wavelength optical properties of a plasmonic crystal composed of end-to-end gold nanorod dimers. The strong coupling between incident light and the electron oscillations inside the nanorods gives rise to a plasmon polariton, which can be regarded as analogous to the phonon polariton in an ionic crystal. Huang-Kun-like equations are employed to explore the underlying physical mechanism for both symmetrical and asymmetrical geometries. In the long-wavelength limit, the macroscopic dielectric response of the proposed structure is deduced analytically. The polariton dispersion curve shows a typical anticrossing profile in the strong-coupling regime, and adjacent branches are separated by a Rabi splitting. The resultant polaritonic stop band is validated by numerical simulations.
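
    For orientation, the anticrossing described here has the generic two-coupled-modes form (a standard result, not taken from this paper), in which the two branches repel with minimum separation equal to the Rabi splitting:

```latex
% Generic coupled-mode dispersion: \omega_1(k) is the photon-like branch,
% \omega_2 the plasmon resonance, \Omega_R the Rabi splitting at resonance.
\[
  \omega_{\pm}(k) = \frac{\omega_1(k) + \omega_2}{2}
    \pm \frac{1}{2}\sqrt{\bigl(\omega_1(k) - \omega_2\bigr)^{2} + \Omega_R^{2}}
\]
```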

  7. End-to-End Trajectory for Conjunction Class Mars Missions Using Hybrid Solar-Electric/Chemical Transportation System

    Science.gov (United States)

    Chai, Patrick R.; Merrill, Raymond G.; Qu, Min

    2016-01-01

    NASA's Human Spaceflight Architecture Team is developing a reusable hybrid transportation architecture in which both chemical and solar-electric propulsion systems are used to deliver crew and cargo to exploration destinations. By combining chemical and solar-electric propulsion into a single spacecraft and applying each where it is most effective, the hybrid architecture enables a series of Mars trajectories that are more fuel-efficient than an all-chemical propulsion architecture without significant increases in trip time. The architecture calls for the aggregation of exploration assets in cislunar space prior to departure for Mars and utilizes high-energy, lunar-distant high Earth orbits for the final staging prior to departure. This paper presents the detailed analysis of various cislunar operations for the EMC Hybrid architecture, as well as the results of the higher-fidelity end-to-end trajectory analysis, to understand the implications of the design choices on the Mars exploration campaign.

  8. End-to-end performance of cooperative relaying in spectrum-sharing systems with quality of service requirements

    KAUST Repository

    Asghari, Vahid Reza

    2011-07-01

    We propose adopting a cooperative relaying technique in spectrum-sharing cognitive radio (CR) systems to more effectively and efficiently utilize available transmission resources, such as power, rate, and bandwidth, while adhering to the quality of service (QoS) requirements of the licensed (primary) users of the shared spectrum band. In particular, we first consider that the cognitive (secondary) user's communication is assisted by an intermediate relay that implements the decode-and-forward (DF) technique onto the secondary user's relayed signal to help with communication between the corresponding source and the destination nodes. In this context, we obtain first-order statistics pertaining to the first- and second-hop transmission channels, and then, we investigate the end-to-end performance of the proposed spectrum-sharing cooperative relaying system under resource constraints defined to assure that the primary QoS is unaffected. Specifically, we investigate the overall average bit error rate (BER), ergodic capacity, and outage probability of the secondary user's communication subject to appropriate constraints on the interference power at the primary receivers. We then consider a general scenario where a cluster of relays is available between the secondary source and destination nodes. In this case, making use of the partial relay selection method, we generalize our results for the single-relay scheme and obtain the end-to-end performance of the cooperative spectrum-sharing system with a cluster of L available relays. Finally, we examine our theoretical results through simulations and comparisons, illustrating the overall performance of the proposed spectrum-sharing cooperative system and quantify its advantages for different operating scenarios and conditions. © 2011 IEEE.
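
    A Monte Carlo sketch, under simplifying assumptions (unit-mean Rayleigh fading and a single primary receiver seen from each hop), of the outage probability of a two-hop decode-and-forward secondary link whose transmit power is capped by an interference constraint Q; this is illustrative, not the paper's closed-form analysis.

```python
# Estimate secondary-link outage under an interference-power cap.
import random

def outage_prob(rate=1.0, p_max=10.0, q_int=1.0, noise=1.0, trials=100_000):
    snr_min = 2.0 ** rate - 1.0          # SNR needed to support `rate` b/s/Hz
    outages = 0
    for _ in range(trials):
        failed = False
        for _hop in range(2):                    # source->relay, relay->dest
            g_link = random.expovariate(1.0)     # desired-link power gain
            g_prim = random.expovariate(1.0)     # gain toward primary receiver
            power = min(p_max, q_int / g_prim)   # respect interference cap Q
            if power * g_link / noise < snr_min: # DF: either hop can fail
                failed = True
                break
        outages += failed
    return outages / trials

print(f"estimated outage probability: {outage_prob():.3f}")
```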

  9. Recipient vessel selection in the difficult neck: Outcomes of external carotid artery transposition and end-to-end microvascular anastomosis.

    Science.gov (United States)

    Garg, Ravi K; Poore, Samuel O; Wieland, Aaron M; Sanchez, Ruston; Baskaya, Mustafa K; Hartig, Gregory K

    2017-02-01

    Selection of recipient vessels for head and neck microvascular surgery may be limited in the previously dissected or irradiated neck. When distal branches of the external carotid artery (ECA) are unavailable, additional options for arterial inflow are needed. Here we propose high ligation of the ECA and transposition toward the lower neck as an alternative. After obtaining institutional approval, patients who underwent head and neck tumor resection and simultaneous free flap reconstruction were identified over a 5-year period. Patients whose recipient artery was listed in the operative report were included. Chart review was performed to identify patient demographics, operative details, and patient and flap complications. In cases where the ECA was used, the artery was traced distally with care taken to protect the hypoglossal nerve. The ECA was then divided and transposed toward the lower neck, where an end-to-end microvascular anastomosis was performed. The recipient artery used for head and neck microsurgery was available for 176 flaps, and the facial (n = 127, 72.2%) and external carotid (n = 19, 10.8%) arteries were most commonly used. There were no flap thromboses in the ECA group, compared with 3 flap thromboses with other recipient arteries (P = 1.00). No cases of first bite syndrome or hypoglossal nerve injury were identified. The ECA may be transposed toward the lower neck and used for end-to-end microvascular anastomosis without the complications of hypoglossal nerve injury or first bite syndrome. This method may be considered an alternative in patients with limited recipient vessel options for head and neck microsurgery. © 2015 Wiley Periodicals, Inc. Microsurgery 37:96-100, 2017.

  10. Evaluation of Techniques to Detect Significant Network Performance Problems using End-to-End Active Network Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Cottrell, R.Les; Logg, Connie; Chhaparia, Mahesh; /SLAC; Grigoriev, Maxim; /Fermilab; Haro, Felipe; /Chile U., Catolica; Nazir, Fawad; /NUST, Rawalpindi; Sandford, Mark

    2006-01-25

    End-to-end fault and performance problem detection in wide-area production networks is becoming increasingly hard as the complexity of the paths, the diversity of the performance, and the dependency on the network increase. Several monitoring infrastructures have been built to monitor different network metrics and collect monitoring information from thousands of hosts around the globe. Typically there are hundreds to thousands of time-series plots of network metrics which need to be inspected to identify network performance problems or anomalous variations in the traffic. Furthermore, most commercial products rely on comparison with user-configured static thresholds and often require access to SNMP-MIB information, to which a typical end user does not usually have access. In this paper we propose new techniques to detect network performance problems proactively, in close to real time, without relying on static thresholds or SNMP-MIB information. We describe and compare several different algorithms that we have implemented to detect persistent network problems using anomalous-variation analysis in real end-to-end Internet performance measurements. We also provide methods and guidance for setting the user-settable parameters. The measurements are based on active probes running on 40 production network paths with bottlenecks varying from 0.5 Mbit/s to 1000 Mbit/s. For well-behaved data (no missed measurements and no very large outliers) with small seasonal changes, most algorithms identify similar events. We compare the algorithms' robustness with respect to false positives and missed events, especially when there are large seasonal effects in the data. Our proposed techniques cover a wide variety of network paths and traffic patterns. We also discuss the applicability of the algorithms in terms of their intuitiveness, their speed of execution as implemented, and their areas of applicability. Our encouraging results compare and evaluate the accuracy of our
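
    As a simple illustration of threshold-free detection on a latency series, the sketch below flags points that deviate from a trailing window by several standard deviations; it is a generic z-score detector on synthetic data, not one of the specific algorithms evaluated in the paper.

```python
# Trailing-window z-score change detector for a time series of RTTs.
import statistics

def detect_events(series, window=20, k=4.0):
    events = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu = statistics.fmean(hist)
        sigma = statistics.pstdev(hist) or 1e-9   # avoid division by zero
        if abs(series[i] - mu) > k * sigma:
            events.append(i)
    return events

rtts = [100.0 + (i % 5) for i in range(60)] + [180.0] * 10  # step change
print(detect_events(rtts))
```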

  11. End-to-end models for marine ecosystems: Are we on the precipice of a significant advance or just putting lipstick on a pig?

    Directory of Open Access Journals (Sweden)

    Kenneth A. Rose

    2012-02-01

    Full Text Available There has been a rapid rise in the development of end-to-end models for marine ecosystems over the past decade. Some reasons for this rise include the need to predict the effects of climate change on biota and dissatisfaction with existing models. While the benefits of a well-implemented end-to-end model are straightforward, there are many challenges. In the short term, my view is that the major role of end-to-end models is to push the modelling community forward and to identify critical data, so that these data can be collected now and thus be available for the next generation of end-to-end models. I think we should emulate physicists and build theoretically-oriented models first, and then collect the data. In the long term, end-to-end models will increase their skill, data collection will catch up, and end-to-end models will move towards site-specific applications with forecasting and management capabilities. One pathway into the future is individual efforts, over-promising, and repackaging of poorly performing component submodels (“lipstick on a pig”). The other pathway is a community-based collaborative effort, with appropriate caution and thoughtfulness, so that the needed improvements are achieved (“significant advance”). The promise of end-to-end modelling is great. We should act now to avoid missing a great opportunity.

  12. End-to-End Information System design at the NASA Jet Propulsion Laboratory. [data transmission between user and space-based sensor

    Science.gov (United States)

    Hooke, A. J.

    1978-01-01

    In recognition of a pressing need of the 1980s to optimize the two-way flow of information between a ground-based user and a remote space-based sensor, an end-to-end approach to the design of information systems has been adopted at JPL. This paper reviews End-to-End Information System (EEIS) activity at JPL, with attention given to the scope of the EEIS transfer function and the functional and physical elements of the EEIS. The relationship between the EEIS and the NASA End-to-End Data System program is discussed.

  13. End-to-end Cyberinfrastructure and Data Services for Earth System Science Education and Research: Unidata's Plans and Directions

    Science.gov (United States)

    Ramamurthy, M.

    2005-12-01

    This new approach, based on XML, enables applications, computer systems, and information processes to work together in a fundamentally different way. Likewise, the advent of digital libraries, grid computing platforms, interoperable frameworks, standards and protocols, open-source software, and community atmospheric models have been important drivers in shaping the use of a new generation of end-to-end cyberinfrastructure for solving some of the most challenging scientific and educational problems. In this talk, I will present an overview of the scientific, technological, and educational drivers and discuss recent developments in cyberinfrastructure and Unidata's role and directions in providing robust, end-to-end data services for solving geoscientific problems and advancing student learning.

  14. Unmanned Aircraft Systems Minimum Operations Performance Standards End-to-End Verification and Validation (E2-V2) Simulation

    Science.gov (United States)

    Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Vincent, Michael J.; Sturdy, James L.; Munoz, Cesar A.; Hoffler, Keith D.; Dutle, Aaron M.; Myer, Robert R.; Dehaven, Anna M.

    2017-01-01

    As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on board the aircraft. The current NAS relies on pilots' vigilance and judgement to remain Well Clear (14 CFR 91.113) of other aircraft. RTCA SC-228 has defined DAA Well Clear (DAAWC) to provide a quantified Well Clear volume that systems can be designed and measured against. Extended research efforts have been conducted to understand and quantify the system requirements needed to support a UAS pilot's ability to remain well clear of other aircraft. These efforts have included developing and testing sensor, algorithm, alerting, and display requirements. More recently, sensor uncertainty and uncertainty mitigation strategies have been evaluated. This paper discusses results and lessons learned from an End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS). NASA Langley Research Center (LaRC) was called upon to develop a system that evaluates a specific set of encounters, in a variety of geometries, with end-to-end DAA functionality including the use of sensor and tracker models, a sensor uncertainty mitigation model, DAA algorithmic guidance in both vertical and horizontal maneuvering, and a pilot model which maneuvers the ownship aircraft to remain well clear of intruder aircraft, having received collective input from the previous modules of the system. LaRC developed a functioning batch simulation and added a sensor/tracker model from the Federal Aviation Administration (FAA) William J. Hughes Technical Center, an in-house developed sensor uncertainty mitigation strategy, and implemented a pilot
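
    The end-to-end chain described above (sensor/tracker, uncertainty mitigation, DAA guidance, pilot model) can be pictured as a sequence of modules, each consuming the previous module's output. The Python sketch below is a toy pipeline written for this summary; every structure, threshold, and function name in it is invented and is not part of the E2-V2 testbed or the MOPS algorithms.

```python
from dataclasses import dataclass

@dataclass
class Track:
    range_m: float
    bearing_deg: float
    closure_mps: float

# Hypothetical stand-ins for the simulation's modules; the real MOPS models
# (sensor/tracker, uncertainty mitigation, guidance, pilot) are far richer.
def sensor_tracker(truth: Track, noise=0.05) -> Track:
    return Track(truth.range_m * (1 + noise), truth.bearing_deg, truth.closure_mps)

def mitigate_uncertainty(track: Track) -> Track:
    # e.g., shrink the estimated range to cover sensor error conservatively
    return Track(track.range_m * 0.95, track.bearing_deg, track.closure_mps)

def daa_guidance(track: Track) -> str:
    tau = track.range_m / max(track.closure_mps, 1e-6)  # crude time-to-encounter
    return "maneuver" if tau < 35.0 else "monitor"      # 35 s is an invented limit

def pilot_model(advice: str) -> str:
    return "turn/climb per guidance" if advice == "maneuver" else "hold course"

truth = Track(range_m=9000.0, bearing_deg=10.0, closure_mps=300.0)
print(pilot_model(daa_guidance(mitigate_uncertainty(sensor_tracker(truth)))))
```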

  15. A novel end-to-end classifier using domain transferred deep convolutional neural networks for biomedical images.

    Science.gov (United States)

    Pang, Shuchao; Yu, Zhezhou; Orgun, Mehmet A

    2017-03-01

    Highly accurate classification of biomedical images is an essential task in the clinical diagnosis of the numerous medical diseases identified from those images. Traditional image classification methods, which combine hand-crafted image feature descriptors with various classifiers, cannot effectively improve the accuracy rate or meet the high requirements of biomedical image classification. The same holds true for artificial neural network models that are either trained directly on limited biomedical images or used as a black box to extract deep features learned on another, distant dataset. In this study, we propose a highly reliable and accurate end-to-end classifier for all kinds of biomedical images via deep learning and transfer learning. We first apply a domain-transferred deep convolutional neural network to build a deep model, and then develop an overall deep learning architecture trained with supervision on the raw pixels of the original biomedical images. In our model, we need neither manual design of the feature space, nor an effective feature-vector classifier, nor segmentation of specific detection objects and image patches, which are the main technological difficulties in traditional image classification methods. Moreover, we do not need to be concerned with whether there are large training sets of annotated biomedical images, affordable parallel computing resources featuring GPUs, or long training times to obtain a perfect deep model, which are the main obstacles to training deep neural networks for biomedical image classification observed in recent works. With a simple data augmentation method and fast convergence speed, our algorithm can achieve the best accuracy rate and outstanding classification ability for biomedical images. We have evaluated our classifier on several well-known public biomedical datasets and compared it with several state-of-the-art approaches. We propose a robust
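
    As a rough illustration of the domain-transfer idea (not the authors' architecture), the following PyTorch sketch reuses convolutional features learned on a large source dataset and trains only a new classification head on raw image pixels; the backbone choice, class count, and hyperparameters are assumptions.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a backbone pretrained on a distant source domain (ImageNet here).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False                 # freeze transferred features
model.fc = nn.Linear(model.fc.in_features, 3)  # new head, e.g. 3 classes

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def train_step(images, labels):
    # images: (N, 3, 224, 224) raw-pixel batch after simple augmentation
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()

# Example with random stand-in data:
x, y = torch.randn(4, 3, 224, 224), torch.tensor([0, 1, 2, 0])
print(train_step(x, y))
```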

  16. End-to-end Cyberinfrastructure and Data Services for Earth System Science Education and Research: A vision for the future

    Science.gov (United States)

    Ramamurthy, M. K.

    2006-05-01

    yet revolutionary way of building applications and methods to connect and exchange information over the Web. This new approach, based on XML - a widely accepted format for exchanging data and the corresponding semantics over the Internet - enables applications, computer systems, and information processes to work together in fundamentally different ways. Likewise, the advent of digital libraries, grid computing platforms, interoperable frameworks, standards and protocols, open-source software, and community atmospheric models have been important drivers in shaping the use of a new generation of end-to-end cyberinfrastructure for solving some of the most challenging scientific and educational problems. In this talk, I will present an overview of the scientific, technological, and educational landscape, discuss recent developments in cyberinfrastructure, and describe Unidata's role in and vision for providing easy-to-use, robust, end-to-end data services for solving geoscientific problems and advancing student learning.

  17. Profiling wind and greenhouse gases by infrared-laser occultation: results from end-to-end simulations in windy air

    Directory of Open Access Journals (Sweden)

    A. Plach

    2015-07-01

    Full Text Available The new mission concept of microwave and infrared-laser occultation between low-Earth-orbit satellites (LMIO) is designed to provide accurate and long-term stable profiles of atmospheric thermodynamic variables, greenhouse gases (GHGs), and line-of-sight (l.o.s.) wind speed, with focus on the upper troposphere and lower stratosphere (UTLS). While the unique quality of GHG retrievals enabled by LMIO over the UTLS has recently been demonstrated based on end-to-end simulations, the promise of l.o.s. wind retrieval, and of joint GHG and wind retrieval, has not yet been analyzed in any realistic simulation setting. Here we use a newly developed l.o.s. wind retrieval algorithm, which we embedded in an end-to-end simulation framework that also includes the retrieval of thermodynamic variables and GHGs, and analyze the performance of both stand-alone wind retrieval and joint wind and GHG retrieval. The wind algorithm utilizes LMIO laser signals placed on the inflection points at the wings of the highly symmetric C18OO absorption line near 4767 cm⁻¹ and exploits transmission differences from a wind-induced Doppler shift. Based on realistic example cases for a diversity of atmospheric conditions, ranging from tropical to high-latitude winter, we find that the retrieved l.o.s. wind profiles are of high quality over the lower stratosphere under all conditions, i.e., unbiased and accurate to within about 2 m s⁻¹ over about 15 to 35 km. The wind accuracy degrades into the upper troposphere due to the decreasing signal-to-noise ratio of the wind-induced differential transmission signals. The GHG retrieval in windy air is not vulnerable to wind speed uncertainties up to about 10 m s⁻¹, but is found to benefit in the case of higher speeds from the integrated wind retrieval, which enables correction of the wind-induced Doppler shift of the GHG signals. Overall, both the l.o.s. wind and GHG retrieval results are strongly encouraging towards further development and
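
    The size of the effect being exploited can be checked with the classical Doppler relation Δν̃ = ν̃₀·v/c applied to the 4767 cm⁻¹ line. The short Python sketch below is a back-of-the-envelope calculation added for this summary, not the retrieval algorithm itself.

```python
C = 299_792_458.0          # speed of light, m/s
NU0 = 4767.0               # line-centre wavenumber, cm^-1 (C18OO line above)

def doppler_shift_wavenumber(v_los_mps: float) -> float:
    """Wind-induced Doppler shift (cm^-1) of the line centre for a
    line-of-sight speed v_los; positive v_los means a closing geometry."""
    return NU0 * v_los_mps / C

# A 10 m/s l.o.s. wind shifts the line by ~1.6e-4 cm^-1; the retrieval
# senses this as a transmission difference between the two signals placed
# on the inflection points of the line wings.
for v in (2.0, 10.0):
    print(f"v_los = {v:5.1f} m/s -> d(nu) = {doppler_shift_wavenumber(v):.2e} cm^-1")
```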

  18. SU-F-J-177: A Novel Image Analysis Technique (center Pixel Method) to Quantify End-To-End Tests

    Energy Technology Data Exchange (ETDEWEB)

    Wen, N; Chetty, I [Henry Ford Health System, Detroit, MI (United States); Snyder, K [Henry Ford Hospital System, Detroit, MI (United States); Scheib, S [Varian Medical System, Barton (Switzerland); Qin, Y; Li, H [Henry Ford Health System, Detroit, Michigan (United States)

    2016-06-15

    Purpose: To implement a novel image analysis technique, the “center pixel method”, to quantify the accuracy of end-to-end tests of a frameless, image-guided stereotactic radiosurgery system. Methods: The localization accuracy was determined by delivering radiation to an end-to-end prototype phantom. The phantom was scanned with 0.8 mm slice thickness, and the treatment isocenter was placed at the center of the phantom. In the treatment room, CBCT images of the phantom (kVp = 77, mAs = 1022, slice thickness 1 mm) were acquired and registered to the reference CT images. 6D couch corrections were applied based on the registration results. Electronic Portal Imaging Device (EPID)-based Winston-Lutz (WL) tests were performed to quantify the targeting accuracy of the system at 15 combinations of gantry, collimator, and couch positions. The images were analyzed using two different methods: (a) the classic method, in which the deviation was calculated by measuring the radial distance between the center of the central BB and the full width at half maximum of the radiation field; and (b) the center pixel method, in which, since the imager projection offset from the treatment isocenter was known from the IsoCal calibration, the deviation was determined between the center of the BB and the central pixel of the imager panel. Results: Using the automatic registration method to localize the phantom and the classic method of measuring the deviation of the BB center, the mean and standard deviation of the radial distance were 0.44 ± 0.25, 0.47 ± 0.26, and 0.43 ± 0.13 mm for the jaw-, MLC- and cone-defined field sizes, respectively. When the center pixel method was used, the mean and standard deviation were 0.32 ± 0.18, 0.32 ± 0.17, and 0.32 ± 0.19 mm, respectively. Conclusion: Our results demonstrate that the center pixel method accurately analyzes the WL images to evaluate the targeting accuracy of the radiosurgery system. The work was supported by a Research Scholar Grant, RSG-15-137-01-CCE from the American
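
    The difference between the two analysis methods reduces to which reference point the detected BB centroid is compared against. The Python sketch below illustrates that comparison; the pixel pitch and coordinates are invented example values, not data from the study.

```python
import numpy as np

def radial_deviation_mm(bb_center_px, ref_px, pixel_pitch_mm=0.336):
    """Radial distance between the detected BB centroid and a reference
    point on the EPID, converted to mm at the imager plane. ref_px is the
    field centroid (classic method) or the offset-corrected central pixel
    (center pixel method); pixel_pitch_mm is an assumed EPID pixel size."""
    d_px = np.asarray(bb_center_px, float) - np.asarray(ref_px, float)
    return float(np.hypot(*d_px)) * pixel_pitch_mm

# Classic method: compare the BB centroid with the radiation field centroid.
print(radial_deviation_mm((511.2, 640.8), (512.0, 641.5)))
# Center pixel method: compare the BB centroid with the imager's central
# pixel after applying the known IsoCal projection offset.
print(radial_deviation_mm((511.2, 640.8), (512.4, 639.8)))
```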

  19. Gulf of California species and catch spatial distributions and historical time series - Developing end-to-end models of the Gulf of California

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the northern Gulf of California, linking oceanography, biogeochemistry, food web...

  20. West Coast fish, mammal, bird life history and abunance parameters - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  1. West Coast fish, mammal, and bird species diets - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  2. Secondary link adaptation in cognitive radio networks: End-to-end performance with cross-layer design

    KAUST Repository

    Ma, Hao

    2012-04-01

    Under spectrum-sharing constraints, we consider a secondary link exploiting cross-layer combining of adaptive modulation and coding (AMC) at the physical layer with truncated automatic repeat request (T-ARQ) at the data link layer in cognitive radio networks. Both basic AMC and aggressive AMC are adopted to optimize the overall average spectral efficiency, subject to the interference constraints imposed by the primary user of the shared spectrum band and a target packet loss rate. We derive the optimal boundary points in closed form for choosing the AMC transmission modes, taking into account the channel state information from the secondary transmitter to both the primary receiver and the secondary receiver. Moreover, numerical results substantiate that, without any cost in transmitter/receiver design or end-to-end delay, the scheme with aggressive AMC outperforms that with conventional AMC. The main reason is that, with aggressive AMC, the different transmission modes utilized in the initial packet transmission and the following retransmissions match the time-varying channel conditions better than the basic pattern does. © 2012 IEEE.
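
    Conceptually, the closed-form boundary points partition the SNR axis into regions, one per AMC mode, and aggressive AMC may use different modes for the initial transmission and the retransmissions. The Python sketch below illustrates this mode-selection logic with invented boundaries and mode names; it is not the paper's derived solution.

```python
# Illustrative SNR boundary points (dB) for five AMC modes; the paper derives
# such boundaries in closed form under interference and packet-loss-rate
# constraints, and these numbers are placeholders, not its results.
BOUNDARIES_DB = [1.5, 5.0, 9.3, 13.8, 18.2]
MODES = ["BPSK 1/2", "QPSK 1/2", "QPSK 3/4", "16QAM 3/4", "64QAM 3/4"]

def select_mode(snr_db: float, aggressive: bool, retransmission: bool) -> str:
    """Pick an AMC mode by SNR region. With aggressive AMC, retransmissions
    may use a different mode (here: one notch higher rate) than the initial
    packet, mimicking a better match to time-varying channels."""
    idx = sum(snr_db >= b for b in BOUNDARIES_DB) - 1
    if idx < 0:
        return "outage (no transmission)"
    if aggressive and retransmission:
        idx = min(idx + 1, len(MODES) - 1)
    return MODES[idx]

print(select_mode(10.0, aggressive=False, retransmission=False))  # QPSK 3/4
print(select_mode(10.0, aggressive=True,  retransmission=True))   # 16QAM 3/4
```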

  3. SampleCNN: End-to-End Deep Convolutional Neural Networks Using Very Small Filters for Music Classification

    Directory of Open Access Journals (Sweden)

    Jongpil Lee

    2018-01-01

    Full Text Available Convolutional Neural Networks (CNNs) have been applied to diverse machine learning tasks for different modalities of raw data in an end-to-end fashion. In the audio domain, a raw waveform-based approach has been explored to directly learn hierarchical characteristics of audio. However, the majority of previous studies have limited their model capacity by taking a frame-level structure similar to short-time Fourier transforms. We previously proposed a CNN architecture which learns representations using sample-level filters beyond typical frame-level input representations. The architecture showed performance comparable to the spectrogram-based CNN model in music auto-tagging. In this paper, we extend the previous work in three ways. First, since the sample-level model requires much longer training time, we progressively downsample the input signals and examine how this affects performance. Second, we extend the model using a multi-level and multi-scale feature aggregation technique and subsequently conduct transfer learning for several music classification tasks. Finally, we visualize the filters learned by the sample-level CNN in each layer to identify hierarchically learned features and show that they are sensitive to log-scaled frequency.
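
    The core idea, stacking 1-D convolutions with very small (size-3) filters directly on the raw waveform, can be sketched in a few lines of PyTorch. The block below is an illustrative miniature in the spirit of SampleCNN; layer counts, channel widths, and the classifier head are assumptions, not the published configuration.

```python
import torch
import torch.nn as nn

class TinySampleCNN(nn.Module):
    """Sample-level 1-D CNN: size-3 filters on raw audio, downsampling by 3
    per block, global average pooling, and a linear tagging head."""
    def __init__(self, n_classes=50):
        super().__init__()
        blocks, ch = [], 1
        for out_ch in (64, 64, 128, 128, 256):
            blocks += [nn.Conv1d(ch, out_ch, kernel_size=3, padding=1),
                       nn.BatchNorm1d(out_ch), nn.ReLU(),
                       nn.MaxPool1d(3)]       # downsample time axis by 3
            ch = out_ch
        self.features = nn.Sequential(*blocks)
        self.head = nn.Linear(ch, n_classes)

    def forward(self, x):                     # x: (batch, 1, samples)
        h = self.features(x).mean(dim=-1)     # global average over time
        return self.head(h)

model = TinySampleCNN()
waveform = torch.randn(2, 1, 59049)           # 3^10 samples per excerpt
print(model(waveform).shape)                  # torch.Size([2, 50])
```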

  4. An anthropomorphic multimodality (CT/MRI) head phantom prototype for end-to-end tests in ion radiotherapy.

    Science.gov (United States)

    Gallas, Raya R; Hünemohr, Nora; Runz, Armin; Niebuhr, Nina I; Jäkel, Oliver; Greilich, Steffen

    2015-12-01

    With the increasing complexity of external beam therapy, "end-to-end" tests are intended to cover every step from therapy planning through to follow-up in order to fulfill the higher demands on quality assurance. As magnetic resonance imaging (MRI) has become an important part of the treatment process, established phantoms such as the Alderson head cannot fully be used for those tests, and novel phantoms have to be developed. Here, we present a feasibility study of a customizable multimodality head phantom. It is initially intended for ion radiotherapy but may also be used in photon therapy. As the basis for the anthropomorphic head shape we used a set of patient computed tomography (CT) images. The phantom recipient, consisting of epoxy resin, was produced using a 3D printer. It includes a nasal air cavity, a cranial bone surrogate (based on dipotassium phosphate), a brain surrogate (based on agarose gel), and a surrogate for cerebrospinal fluid (based on distilled water). Furthermore, a volume filled with normoxic dosimetric gel mimicked a tumor. The entire workflow of a proton therapy could be successfully applied to the phantom. CT measurements revealed CT numbers agreeing with reference values for all surrogates in the range from 2 HU to 978 HU (120 kV). MRI showed the desired contrasts between the different phantom materials, especially in T2-weighted images (except for the bone surrogate). T2-weighted readout of the polymerization gel dosimeter allowed approximate range verification. Copyright © 2015. Published by Elsevier GmbH.

  5. An anthropomorphic multimodality (CT/MRI) head phantom prototype for end-to-end tests in ion radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Gallas, Raya R.; Huenemohr, Nora; Runz, Armin; Niebuhr, Nina I.; Greilich, Steffen [German Cancer Research Center (DKFZ), Heidelberg (Germany). Div. of Medical Physics in Radiation Oncology; National Center for Radiation Research in Oncology, Heidelberg (Germany). Heidelberg Institute of Radiation Oncology (HIRO); Jaekel, Oliver [German Cancer Research Center (DKFZ), Heidelberg (Germany). Div. of Medical Physics in Radiation Oncology; National Center for Radiation Research in Oncology, Heidelberg (Germany). Heidelberg Institute of Radiation Oncology (HIRO); Heidelberg University Hospital (Germany). Dept. of Radiation Oncology; Heidelberg Ion-Beam Therapy Center (HIT), Heidelberg (Germany)

    2015-07-01

    With the increasing complexity of external beam therapy, "end-to-end" tests are intended to cover every step from therapy planning through to follow-up in order to fulfill the higher demands on quality assurance. As magnetic resonance imaging (MRI) has become an important part of the treatment process, established phantoms such as the Alderson head cannot fully be used for those tests, and novel phantoms have to be developed. Here, we present a feasibility study of a customizable multimodality head phantom. It is initially intended for ion radiotherapy but may also be used in photon therapy. As the basis for the anthropomorphic head shape we used a set of patient computed tomography (CT) images. The phantom recipient, consisting of epoxy resin, was produced using a 3D printer. It includes a nasal air cavity, a cranial bone surrogate (based on dipotassium phosphate), a brain surrogate (based on agarose gel), and a surrogate for cerebrospinal fluid (based on distilled water). Furthermore, a volume filled with normoxic dosimetric gel mimicked a tumor. The entire workflow of a proton therapy could be successfully applied to the phantom. CT measurements revealed CT numbers agreeing with reference values for all surrogates in the range from 2 HU to 978 HU (120 kV). MRI showed the desired contrasts between the different phantom materials, especially in T2-weighted images (except for the bone surrogate). T2-weighted readout of the polymerization gel dosimeter allowed approximate range verification.

  6. Towards a cross-platform software framework to support end-to-end hydrometeorological sensor network deployment

    Science.gov (United States)

    Celicourt, P.; Sam, R.; Piasecki, M.

    2016-12-01

    Global phenomena such as climate change and large-scale environmental degradation require the collection of accurate environmental data at detailed spatial and temporal scales, from which knowledge and actionable insights can be derived using data science methods. Despite significant advances in sensor network technologies, sensor and sensor network deployment remains a labor-intensive, time-consuming, cumbersome, and expensive task. These factors explain why environmental data collection remains a challenge, especially in developing countries where technical infrastructure, expertise, and pecuniary resources are scarce, and why dense, long-term environmental data collection has historically been quite difficult. Moreover, hydrometeorological data collection efforts usually overlook the critically important inclusion of a standards-based system for storing, managing, organizing, indexing, documenting, and sharing sensor data. We are developing a cross-platform software framework, using the Python programming language, that will allow us to build a low-cost end-to-end (from sensor to publication) system for monitoring hydrometeorological conditions. The software framework provides for the description of sensors, sensor platforms, calibration, and network protocols; sensor programming; data storage; data publication and visualization; and, more importantly, data retrieval in a desired unit system. It is being tested with a Raspberry Pi microcomputer as the end node and a laptop PC as the base station in a wireless setting.

  7. Innovative strategy for effective critical laboratory result management: end-to-end process using automation and manual call centre.

    Science.gov (United States)

    Ti, Lian Kah; Ang, Sophia Bee Leng; Saw, Sharon; Sethi, Sunil Kumar; Yip, James W L

    2012-08-01

    Timely reporting and acknowledgement are crucial steps in critical laboratory results (CLR) management. The authors previously showed that an automated pathway incorporating short messaging system (SMS) texts, auto-escalation, and manual telephone back-up improved the rate and speed of physician acknowledgement compared with manual telephone calling alone. This study investigated whether it also improved the rate and speed of physician intervention in response to CLR, and whether utilizing the manual back-up affected intervention rates. Data from seven audits between November 2007 and January 2011 were analysed. These audits were carried out to assess the robustness of the CLR reporting process in the authors' institution. Comparisons were made in the rate and speed of acknowledgement and intervention between the audits performed before and after automation. Using the automation audits, the authors compared intervention data between communication with SMS only and cases where manual intervention was required. 1680 CLR were reported during the audit periods. Automation improved the rate of physician intervention (100% vs 84.2%). In the automation audits, the use of SMS alone did not improve physician intervention rates. The automated communication pathway improved physician intervention rate and time in tandem with improved acknowledgement rate and time when compared with manual telephone calling. The use of manual intervention to augment automation did not adversely affect the physician intervention rate, implying that an end-to-end pathway was more important than automation alone.
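
    The reported pathway combines automated SMS notification, auto-escalation on timeout, and a manual call centre as the final back-up. The Python sketch below illustrates that control flow only; the chain, timeouts, and function names are hypothetical and are not taken from the study.

```python
import time

ESCALATION_CHAIN = ["primary physician (SMS)",
                    "covering physician (SMS)",
                    "manual call centre"]

def report_clr(result_id, acknowledged, timeout_s=600, poll_s=30):
    """Notify each recipient in turn until one acknowledges the critical
    laboratory result; escalate when an acknowledgement times out."""
    for recipient in ESCALATION_CHAIN:
        print(f"CLR {result_id}: notifying {recipient}")
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            if acknowledged(result_id, recipient):
                return recipient
            time.sleep(poll_s)               # poll the acknowledgement store
    raise RuntimeError(f"CLR {result_id} unacknowledged after full escalation")

# Example with a stub acknowledgement store: only the covering physician reacts.
who = report_clr("K-4711", lambda rid, r: "covering" in r,
                 timeout_s=0.2, poll_s=0.05)
print("acknowledged by:", who)
```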

  8. A real-time 3D end-to-end augmented reality system (and its representation transformations)

    Science.gov (United States)

    Tytgat, Donny; Aerts, Maarten; De Busser, Jeroen; Lievens, Sammy; Rondao Alface, Patrice; Macq, Jean-Francois

    2016-09-01

    The new generation of HMDs coming to the market is expected to enable many new applications that allow free viewpoint experiences with captured video objects. Current applications usually rely on 3D content that is manually created or captured in an offline manner. In contrast, this paper focuses on augmented reality applications that use live captured 3D objects while maintaining free viewpoint interaction. We present a system that allows live dynamic 3D objects (e.g. a person who is talking) to be captured in real-time. Real-time performance is achieved by traversing a number of representation formats and exploiting their specific benefits. For instance, depth images are maintained for fast neighborhood retrieval and occlusion determination, while implicit surfaces are used to facilitate multi-source aggregation for both geometry and texture. The result is a 3D reconstruction system that outputs multi-textured triangle meshes at real-time rates. An end-to-end system is presented that captures and reconstructs live 3D data and allows for this data to be used on a networked (AR) device. For allocating the different functional blocks onto the available physical devices, a number of alternatives are proposed considering the available computational power and bandwidth for each of the components. As we will show, the representation format can play an important role in this functional allocation and allows for a flexible system that can support a highly heterogeneous infrastructure.
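
    One of the representation transformations such a pipeline traverses is back-projecting a depth image into 3-D geometry before surface extraction. The Python sketch below shows that single step under a pinhole camera model; the intrinsics are invented example values, and this is not the authors' implementation.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) into a 3-D point cloud using a
    pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# Example: a 4x4 synthetic depth patch at ~2 m with assumed intrinsics.
pts = depth_to_points(np.full((4, 4), 2.0), fx=525.0, fy=525.0, cx=2.0, cy=2.0)
print(pts.shape)  # (16, 3)
```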

  9. End-to-End Data Rate Performance of Decode-and-Forward Relaying with Different Resource Allocation Schemes

    Directory of Open Access Journals (Sweden)

    Inam Ullah

    2017-01-01

    Full Text Available This paper studies the end-to-end (e2e) data rate of dual-hop Decode-and-Forward (DF) infrastructure relaying under different resource allocation schemes. In this context, we first provide a comparative analysis of the optimal resource allocation scheme with respect to several other approaches, in order to provide insights into the system behavior and show the benefits of each alternative. Then, assuming optimal resource allocation, a closed-form expression for the distribution of the mean and outage data rates is derived. It turns out that the corresponding mean e2e data rate formula takes the form of an integral that does not admit a closed-form solution. Therefore, a tight lower-bound formula for the mean e2e data rate is presented. The results can be used to select the most convenient resource allocation scheme and to perform link dimensioning in the network planning phase, showing the explicit relationships between component link bandwidths, SNR values, and mean data rate.
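
    The quantity being optimized can be made concrete: with decode-and-forward, the e2e rate is the minimum of the two hops' time-scaled rates, so the best time split equalizes them. The Python sketch below illustrates this with a coarse grid search standing in for the paper's closed-form treatment; the SNR values are arbitrary examples.

```python
import numpy as np

def e2e_rate_df(snr1_db, snr2_db, alpha=0.5, bandwidth_hz=1.0):
    """End-to-end rate of dual-hop decode-and-forward relaying when a
    fraction alpha of the time resource goes to hop 1 and (1 - alpha) to
    hop 2; the e2e rate is limited by the weaker (time-scaled) hop."""
    r1 = alpha * bandwidth_hz * np.log2(1 + 10 ** (snr1_db / 10))
    r2 = (1 - alpha) * bandwidth_hz * np.log2(1 + 10 ** (snr2_db / 10))
    return min(r1, r2)

# Grid search over the time split; the optimum equalizes the two hop rates.
alphas = np.linspace(0.01, 0.99, 99)
best = max(alphas, key=lambda a: e2e_rate_df(10.0, 4.0, a))
print(best, e2e_rate_df(10.0, 4.0, best))
```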

  10. An anthropomorphic multimodality (CT/MRI) phantom prototype for end-to-end tests in radiation therapy

    CERN Document Server

    Gallas, Raya R; Runz, Armin; Niebuhr, Nina I; Jäkel, Oliver; Greilich, Steffen

    2014-01-01

    With the increasing complexity of external beam therapy, so-called "end-to-end" tests are intended to cover all steps from therapy planning to follow-up, to fulfill the high demands on quality assurance. As magnetic resonance imaging (MRI) gains growing importance in the treatment process and established phantoms (such as the Alderson head) cannot be used for those tests, novel multimodality phantoms have to be developed. Here, we present a feasibility study for such a customizable multimodality head phantom. We used a set of patient CT images as the basis for the anthropomorphic head shape. The recipient - consisting of an epoxy resin - was produced using rapid prototyping (3D printing). The phantom recipient includes a nasal air cavity, two soft tissue volumes and cranial bone. Additionally, a spherical tumor volume was positioned in the center. The volumes were filled with dipotassium phosphate-based cranial bone surrogate, agarose gel, and distilled water. The tumor volume was filled with normoxic dosimetr...

  11. WARP (workflow for automated and rapid production): a framework for end-to-end automated digital print workflows

    Science.gov (United States)

    Joshi, Parag

    2006-02-01

    The publishing industry is experiencing a major paradigm shift with the advent of digital publishing technologies. A large number of components in the publishing and print production workflow are transformed in this shift. However, the process as a whole requires a great deal of human intervention for decision making and for resolving exceptions during job execution. Furthermore, a majority of the best-of-breed applications for publishing and print production are intrinsically designed and developed to be driven by humans. Thus, the human-intensive nature of the current prepress process accounts for a very significant amount of the overhead costs in the fulfillment of jobs on press. It is a challenge to automate the functionality of applications built on a model of human-driven execution. Another challenge is to orchestrate the various components in the publishing and print production pipeline so that they work together seamlessly, enabling the system to detect potential failures automatically and take corrective actions in a proactive manner. Thus, there is a great need for a coherent and unifying workflow architecture that streamlines the process and automates it as a whole, in order to create an end-to-end digital automated print production workflow that does not involve any human intervention. This paper describes an architecture and building blocks that lay the foundation for a plurality of automated print production workflows.

  12. On cryptographic security of end-to-end encrypted connections in WhatsApp and Telegram messengers

    Directory of Open Access Journals (Sweden)

    Sergey V. Zapechnikov

    2017-11-01

    Full Text Available The aim of this work is to analyze the available possibilities for improving the security of messaging over end-to-end encrypted connections under the actions of an external adversary and a distrusted service provider. We performed a comparative analysis of the cryptographic security mechanisms of two widely used messengers: Telegram and WhatsApp. Telegram is based on the MTProto protocol, while WhatsApp is based on the alternative Signal protocol. We examine the specific implementation features of the messengers associated with random number generation on the most popular mobile platform, Android. It is shown that Signal has better security properties; it is used in WhatsApp and in several other popular messengers, such as TextSecure, RedPhone, Google Allo, Facebook Messenger, and Signal itself. A number of possible attacks on both messengers were analyzed in detail. In particular, we demonstrate that metadata are poorly protected in both messengers. Metadata security may be one of the goals for further studies.

  13. Effects of collagen membranes enriched with in vitro-differentiated N1E-115 cells on rat sciatic nerve regeneration after end-to-end repair

    Directory of Open Access Journals (Sweden)

    Fornaro Michele

    2010-02-01

    Full Text Available Abstract Peripheral nerves possess the capacity for self-regeneration after traumatic injury, but the extent of regeneration is often poor and may benefit from exogenous factors that enhance growth. The use of cellular systems is a rational approach for delivering neurotrophic factors at the nerve lesion site, and in the present study we investigated the effects of enwrapping the site of end-to-end rat sciatic nerve repair with an equine type III collagen membrane enriched or not with N1E-115 pre-differentiated neural cells. After neurotmesis, the sciatic nerve was repaired by end-to-end suture (End-to-End group); by end-to-end suture enwrapped with an equine collagen type III membrane (End-to-EndMemb group); or by end-to-end suture enwrapped with an equine collagen type III membrane previously covered with neural cells pre-differentiated in vitro from N1E-115 cells (End-to-EndMembCell group). During the postoperative period, motor and sensory functional recovery was evaluated using extensor postural thrust (EPT), withdrawal reflex latency (WRL), and ankle kinematics. After 20 weeks the animals were sacrificed and the repaired sciatic nerves were processed for histological and stereological analysis. Results showed that enwrapment of the repair site with a collagen membrane, with or without neural cell enrichment, did not lead to any significant improvement in most of the functional and stereological predictors of nerve regeneration that we assessed, with the exception of EPT, which recovered significantly better after neural cell enriched membrane employment. It can thus be concluded that this particular type of nerve tissue engineering approach has very limited effects on nerve regeneration after sciatic end-to-end nerve reconstruction in the rat.

  14. Adaptation and validation of a commercial head phantom for cranial radiosurgery dosimetry end-to-end audit.

    Science.gov (United States)

    Dimitriadis, Alexis; Palmer, Antony L; Thomas, Russell A S; Nisbet, Andrew; Clark, Catharine H

    2017-06-01

    To adapt and validate an anthropomorphic head phantom for use in a cranial radiosurgery audit. Two bespoke inserts were produced for the phantom: one providing the target and organ at risk for delineation and the other for performing dose measurements. The inserts were tested to assess their positional accuracy. A basic treatment plan dose verification with an ionization chamber was performed to establish a baseline accuracy for the phantom and beam model. The phantom and inserts were then used to perform dose verification measurements of a radiosurgery plan. The dose was measured with alanine pellets, EBT extended dose film, and a plastic scintillation detector (PSD). Both inserts showed reproducible positioning (±0.5 mm) and good positional agreement between them (±0.6 mm). The basic treatment plan measurements agreed with the treatment planning system (TPS) within 0.5%. Repeated film measurements showed consistent gamma passing rates with good agreement to the TPS. For a 2%/2 mm global gamma criterion, the mean passing rate was 96.7% and the variation in passing rates did not exceed 2.1%. The alanine pellets and PSD showed good agreement with the TPS (-0.1% and 0.3% dose difference in the target) and good agreement with each other (within 1%). The adaptations to the phantom showed acceptable accuracy. The presence of alanine and the PSD does not affect film measurements significantly, enabling simultaneous measurements by all three detectors. Advances in knowledge: A novel method for a thorough end-to-end test of radiosurgery, with the capability to incorporate all steps of the clinical pathway in a time-efficient and reproducible manner, suitable for a national audit.
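
    The film results above rest on the standard gamma analysis, which combines a dose-difference criterion with a distance-to-agreement criterion. The Python sketch below implements a simplified 1-D global gamma on synthetic profiles to make the 2%/2 mm criterion concrete; it is illustrative only and much simpler than the 2-D film analysis actually performed.

```python
import numpy as np

def gamma_pass_rate_1d(measured, reference, dx_mm=1.0, dose_pct=2.0, dta_mm=2.0):
    """Global 1-D gamma analysis: for each measured point, take the minimum
    gamma over all reference points; the point passes if gamma <= 1."""
    x = np.arange(len(reference)) * dx_mm
    d_crit = dose_pct / 100.0 * reference.max()   # global dose criterion
    passed = []
    for i, dm in enumerate(measured):
        dist2 = ((x - x[i]) / dta_mm) ** 2
        dose2 = ((reference - dm) / d_crit) ** 2
        passed.append(np.sqrt(np.min(dist2 + dose2)) <= 1.0)
    return 100.0 * np.mean(passed)

ref = np.exp(-np.linspace(-3, 3, 121) ** 2)       # synthetic Gaussian profile
meas = ref * 1.01                                 # 1% global dose offset
print(f"{gamma_pass_rate_1d(meas, ref):.1f}% passing at 2%/2 mm")
```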

  15. First Demonstration of Real-Time End-to-End 40 Gb/s PAM-4 System using 10-G Transmitter for Next Generation Access Applications

    DEFF Research Database (Denmark)

    Wei, Jinlong; Eiselt, Nicklas; Griesser, Helmut

    We demonstrate the first known experiment of a real-time end-to-end 40-Gb/s PAM-4 system for next generation access applications using 10G class transmitters only. Up to 25-dB upstream link budget for 20 km SMF is achieved.

  16. Evaluation of a composite Gel-Alanine phantom on an end-to-end test to treat multiple brain metastases by a single isocenter VMAT technique.

    Science.gov (United States)

    Pavoni, Juliana Fernandes; Neves-Junior, Wellington Furtado Pimenta; da Silveira, Matheus Antonio; Haddad, Cecília Maria Kalil; Baffa, Oswaldo

    2017-09-01

    This work aims to evaluate the application of a cylindrical phantom made of dosimetric gel, containing alanine pellets distributed inside the gel volume, during an end-to-end test of a single-isocenter VMAT for simultaneous treatment of multiple brain metastases. The evaluation is based on comparing the results obtained with the composite phantom against the treatment planning system (TPS) dose distribution, validated using conventional clinical quality control with point and planar dose measurements. A cylindrical MAGIC-f gel phantom containing alanine dosimeters (composite phantom) was used to design the VMAT plan in the TPS. The alanine dosimeters were pellets with a radius of 2.5 mm and a height of 3 mm, and played the role of brain metastases inside the gel cylinder, which simulated the cerebral tissue. Five of the alanine dosimeters were selected to simulate five lesions; five planning target volumes (PTVs) were created including the dosimeters and irradiated with different doses. Conventional quality assurance (QA) was performed on the TPS plan and on the composite phantom; a phantom containing only gel (Gel 1 phantom) was also irradiated. One day after irradiation, magnetic resonance images of both phantoms were acquired on a 3T scanner. An electron spin resonance spectrometer was used to evaluate the alanine doses. Calibration curves were constructed for the alanine and the gel dosimeters. The gel-only measurement was repeated (Gel 2 phantom) in order to confirm the previous gel measurement. The VMAT treatment plan was approved by the conventional QA. The doses measured by the alanine dosimeters in the composite gel phantom agreed with the TPS within 3.3% on average. The alanine dose for each lesion was used to calibrate the gel dosimeter measurements of the corresponding PTV. The gel dose volume histograms (DVH) achieved for each PTV were in agreement with the expected TPS DVH, except for a small discrepancy observed for the Gel 2

  17. End-to-end renal vein anastomosis to preserve renal venous drainage following inferior vena cava radical resection due to leiomyosarcoma.

    Science.gov (United States)

    Araujo, Raphael L C; Gaujoux, Sébastien; D'Albuquerque, Luiz Augusto Carneiro; Sauvanet, Alain; Belghiti, Jacques; Andraus, Wellington

    2014-05-01

    When retrohepatic inferior vena cava (IVC) resection is required, for example for IVC leiomyosarcoma, reconstruction is recommended, particularly when the renal vein confluence is resected, in order to preserve venous outflow, including that of the right kidney. Two patients with retrohepatic IVC leiomyosarcoma involving the renal vein confluence underwent hepatectomy with en bloc IVC resection below the renal vein confluence. IVC reconstruction was not performed, but end-to-end renal vein anastomoses were, including a prosthetic graft in 1 case. The postoperative course was uneventful with respect to kidney function and anastomosis patency, assessed using Doppler ultrasonography and computerized tomography, apart from transient lower limb edema. End-to-end renal vein anastomosis after a retrohepatic IVC resection including the renal vein confluence should be considered as an alternative option for preserving right kidney drainage through the left renal vein when IVC reconstruction is not possible or should be avoided. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. An integrated end-to-end modeling framework for testing ecosystem-wide effects of human-induced pressures in the Baltic Sea

    DEFF Research Database (Denmark)

    Palacz, Artur; Nielsen, J. Rasmus; Christensen, Asbjørn

    We present an integrated end-to-end modeling framework that enables whole-of-ecosystem climate, eutrophication, and spatial management scenario exploration in the Baltic Sea. The framework is built around the Baltic implementation of the spatially explicit end-to-end ATLANTIS model, linked to the high-resolution coupled physical-biological model HBM-ERGOM and the fisheries bio-economic FishRent model. We investigate ecosystem-wide responses to changes in human-induced pressures by simulating several eutrophication scenarios that are relevant to existing Baltic Sea management plans (e.g., EU BSAP, EU CFP). We further present the structure and calibration of the Baltic ATLANTIS model and the operational linkage to the other models. Using the results of the eutrophication scenarios, and focusing on the relative changes in fish and fishery production, we discuss the robustness of the model linking

  19. An Integrated End-to-End Modeling Framework for Testing Ecosystem-Wide Effects of Human-Induced Pressures in the Baltic Sea

    DEFF Research Database (Denmark)

    Palacz, Artur; Maar, Marie; Nielsen, Rasmus

    We present an integrated end-to-end modeling framework that enables whole-of-ecosystem climate, eutrophication, and spatial management scenario exploration in the Baltic Sea. The framework is built around the Baltic implementation of the spatially explicit end-to-end ATLANTIS model, linked to the high-resolution coupled physical-biological model HBM-ERGOM and the fisheries bio-economic FishRent model. We investigate ecosystem-wide responses to changes in human-induced pressures by simulating several eutrophication scenarios that are relevant to existing Baltic Sea management plans (e.g., EU BSAP, EU CFP). We further present the structure and calibration of the Baltic ATLANTIS model and the operational linkage to the other models. Using the results of the eutrophication scenarios, and focusing on the relative changes in fish and fishery production, we discuss the robustness of the model linking

  20. One stage functional end-to-end stapled intestinal anastomosis and resection performed by nonexpert surgeons for the treatment of small intestinal obstruction in 30 dogs.

    Science.gov (United States)

    Jardel, Nicolas; Hidalgo, Antoine; Leperlier, Dimitri; Manassero, Mathieu; Gomes, Aymeric; Bedu, Anne Sophie; Moissonnier, Pierre; Fayolle, Pascal; Begon, Dominique; Riquois, Elisabeth; Viateau, Véronique

    2011-02-01

    To describe stapled one-stage functional end-to-end intestinal anastomosis for treatment of small intestinal obstruction in dogs, and to evaluate outcome when the technique is performed by nonexpert surgeons after limited training in the technique. Case series. Dogs (n=30) with intestinal lesions requiring an enterectomy. Stapled one-stage functional end-to-end anastomosis and resection using GIA-60 and TA-55 stapling devices were performed under supervision of senior residents and faculty surgeons by junior surgeons previously trained in the technique on pigs. Procedure duration and technical problems were recorded. Short-term results were collected during hospitalization and at suture removal. Long-term outcome was established by clinical and ultrasonographic examinations at least 2 months after surgery and from written questionnaires completed by owners. Mean ± SD procedure duration was 15 ± 12 minutes. Postoperative recovery was uneventful in 25 dogs. One dog had anastomotic leakage, one had a localized abscess at the transverse staple line, and 3 dogs developed an incisional abdominal wall abscess. No long-term complications occurred (follow-up, 2-32 months). Stapled one-stage functional end-to-end anastomosis and resection is a fast and safe procedure in the hands of nonexpert but trained surgeons. © Copyright 2011 by The American College of Veterinary Surgeons.

  1. Update on ORNL TRANSFORM Tool: Simulating Multi-Module Advanced Reactor with End-to-End I&C

    Energy Technology Data Exchange (ETDEWEB)

    Hale, Richard Edward [ORNL; Fugate, David L [ORNL; Cetiner, Sacit M [ORNL; Qualls, A L [ORNL

    2015-05-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the fourth year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled reactor) concepts, including the use of multiple coupled reactors at a single site. The focus of this report is the development of a steam generator and drum system model that includes the complex dynamics of typical steam drum systems, the development of instrumentation and controls for the steam generator with drum system model, and the development of multi-reactor module models that reflect the full power reactor innovative small module design concept. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor models; ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface technical area; and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the TRANSFORM tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the Advanced Reactors Technology program; (2) developing a library of baseline component modules that can be assembled into full plant models using available geometry, design, and thermal-hydraulic data; (3) defining modeling conventions for interconnecting component models; and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  2. Towards a Software Framework to Support Deployment of Low Cost End-to-End Hydroclimatological Sensor Network

    Science.gov (United States)

    Celicourt, P.; Piasecki, M.

    2015-12-01

    Deployment of environmental sensor assemblies based on cheap platforms such as the Raspberry Pi and Arduino has gained much attention over the past few years. While these platforms are attractive because they can be controlled with any of several programming languages, the configuration task can become quite complex due to the need to learn several different proprietary data formats and protocols, which constitutes a bottleneck for the expansion of sensor networks. In response to this rising complexity, the Institute of Electrical and Electronics Engineers (IEEE) sponsored the development of the IEEE 1451 standard in an attempt to introduce a common standard. The most innovative concept of the standard is the Transducer Electronic Data Sheet (TEDS), which enables transducers to self-identify, self-describe, self-calibrate, exhibit plug-and-play functionality, etc. We used Python to develop an IEEE 1451.0 platform-independent graphical user interface to generate and provide sufficient information about almost any sensor and sensor platform for sensor programming purposes, automatic calibration of sensor data, and incorporation of back-end demands on data management in TEDS for automatic standards-based data storage, search, and discovery purposes. These features are paramount to making data management much less onerous in large-scale sensor networks. Along with the TEDS Creator, we developed a tool named HydroUnits for three specific purposes: encoding of physical units in the TEDS, dimensional analysis, and on-the-fly conversion of time series, allowing users to retrieve data in a desired equivalent unit while accommodating unforeseen and user-defined units. In addition, our back-end data management comprises the Python/Django equivalent of the CUAHSI Observations Data Model (ODM), DjangODM, which will be hosted by a MongoDB database server that offers more convenience for our application. We are also developing a data which will be paired with the data
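
    The on-the-fly unit conversion that HydroUnits provides can be pictured as a factor table plus a conversion routine applied at retrieval time. The Python toy below was written for this summary; the factor table and function names are illustrative, not the HydroUnits API.

```python
# Multiplicative factors to SI base units; affine units (e.g., degC) need
# offset handling and are only guarded against here.
TO_SI = {
    "mm/h": 0.001 / 3600,    # rainfall intensity -> m/s
    "in/h": 0.0254 / 3600,
    "degC": None,
}

def convert(values, src, dst):
    """Convert a sequence of measurements from src unit to dst unit."""
    if src == dst:
        return list(values)
    if src == "degC" or dst == "degC":
        raise NotImplementedError("affine units need offset handling")
    factor = TO_SI[src] / TO_SI[dst]
    return [v * factor for v in values]

# Retrieve a rainfall series recorded in mm/h in the user's preferred in/h:
print(convert([2.0, 12.5, 0.4], "mm/h", "in/h"))
```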

  3. A fully automatic end-to-end method for content-based image retrieval of CT scans with similar liver lesion annotations.

    Science.gov (United States)

    Spanier, A B; Caplan, N; Sosna, J; Acar, B; Joskowicz, L

    2018-01-01

    The goal of medical content-based image retrieval (M-CBIR) is to assist radiologists in the decision-making process by retrieving medical cases similar to a given image. One of the key interests of radiologists is lesions and their annotations, since patient treatment depends on the lesion diagnosis. Therefore, a key feature of M-CBIR systems is the retrieval of scans with the most similar lesion annotations. To be of value, M-CBIR systems should be fully automatic so they can handle large case databases. We present a fully automatic end-to-end method for the retrieval of CT scans with similar liver lesion annotations. The input is a database of abdominal CT scans labeled with liver lesions, a query CT scan, and optionally one radiologist-specified lesion annotation of interest. The output is an ordered list of the database CT scans with the most similar liver lesion annotations. The method starts by automatically segmenting the liver in the scan. It then extracts a histogram-based feature vector from the segmented region, learns the features' relative importance, and ranks the database scans according to the relative importance measure. The main advantages of our method are that it fully automates the end-to-end querying process, that it uses simple and efficient techniques that are scalable to large datasets, and that it produces quality retrieval results using an unannotated CT scan. Our experimental results on 9 CT queries over a dataset of 41 volumetric CT scans from the 2014 ImageCLEF Liver Annotation Task yield an average retrieval accuracy (Normalized Discounted Cumulative Gain index) of 0.77 and 0.84 without and with annotation, respectively. Fully automatic end-to-end retrieval of similar cases based on image information alone, rather than on disease diagnosis, may help radiologists to better diagnose liver lesions.
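
    The retrieval core, a histogram feature extracted from the segmented liver plus an importance-weighted ranking, is simple enough to sketch directly. The Python fragment below is an illustrative reconstruction of that idea; the bin count, HU range, and weighting scheme are assumptions, not the paper's learned parameters.

```python
import numpy as np

def histogram_feature(ct_liver_voxels, bins=32, hu_range=(-100, 400)):
    """Normalized intensity histogram of the (already segmented) liver
    region: a simple, scalable feature vector."""
    h, _ = np.histogram(ct_liver_voxels, bins=bins, range=hu_range)
    return h / max(h.sum(), 1)

def rank_database(query_vec, db_vecs, weights=None):
    """Order database scans by weighted L1 distance to the query; 'weights'
    stands in for the learned relative importance of histogram bins."""
    w = np.ones_like(query_vec) if weights is None else weights
    dists = [np.sum(w * np.abs(query_vec - v)) for v in db_vecs]
    return np.argsort(dists)

# Example with synthetic voxel intensities standing in for segmented livers.
rng = np.random.default_rng(0)
db = [histogram_feature(rng.normal(60, 25, 10_000)) for _ in range(5)]
q = histogram_feature(rng.normal(60, 25, 10_000))
print(rank_database(q, db))
```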

  4. Experience of using MOSFET detectors for dose verification measurements in an end-to-end 192Ir brachytherapy quality assurance system.

    Science.gov (United States)

    Persson, Maria; Nilsson, Josef; Carlsson Tedgren, Åsa

    2017-10-27

    Establishment of an end-to-end system for the brachytherapy (BT) dosimetric chain could be valuable in clinical quality assurance. Here, the development of such a system using MOSFET (metal oxide semiconductor field effect transistor) detectors, and the experience gained during 2 years of use, are reported, with a focus on the performance of the MOSFET detectors. A bolus phantom was constructed with two implants, mimicking prostate and head & neck treatments, using steel needles and plastic catheters to guide the 192Ir source and house the MOSFET detectors. The phantom was taken through the BT treatment chain from image acquisition to dose evaluation. During the 2-year evaluation period, delivered doses were verified a total of 56 times using MOSFET detectors which had been calibrated in an external 60Co beam. An initial experimental investigation of beam quality differences between 192Ir and 60Co is reported. The standard deviation in repeated MOSFET measurements was below 3% in the six measurement points with dose levels above 2 Gy. MOSFET measurements overestimated treatment planning system doses by 2-7%. Distance-dependent experimental beam quality correction factors, derived in a phantom of similar size to that used for the end-to-end tests, improved the agreement when applied to a time-resolved measurement. MOSFET detectors provide values stable over time and function well as detectors for end-to-end quality assurance purposes in 192Ir BT. Beam quality correction factors should address not only the distance from the source but also the phantom dimensions. Copyright © 2017 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  5. Interoperable End-to-End Remote Patient Monitoring Platform based on IEEE 11073 PHD and ZigBee Health Care Profile.

    Science.gov (United States)

    Clarke, Malcolm; de Folter, Joost; Verma, Vivek; Gokalp, Hulya

    2017-08-07

    This paper describes the implementation of an end-to-end remote monitoring platform based on the IEEE 11073 standards for Personal Health Devices (PHD). It provides an overview of the concepts and approaches, and describes how the standard has been optimized for small devices with limited processor, memory, and power resources that use short-range wireless technology. It explains aspects of IEEE 11073, including the Domain Information Model, the state model, and the nomenclature, and how these support its plug-and-play architecture. It shows how these aspects underpin a much larger ecosystem of interoperable devices and systems that includes IHE PCD-01, HL7, and Bluetooth LE medical devices, and describes the relationship to the Continua Guidelines, advocating the adoption of data standards and nomenclature to support semantic interoperability between health and ambient assisted living (AAL) in future platforms. The paper further describes the adaptations made to implement the standard on the ZigBee Health Care Profile, and the experience of implementing an end-to-end platform that has been deployed to frail elderly patients with chronic disease(s) and patients with diabetes.

  6. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    Science.gov (United States)

    Trevino, Luis; Johnson, Stephen B.; Patterson, Jonathan; Teare, David

    2015-01-01

    The development of the Space Launch System (SLS) launch vehicle requires cross-discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on-orbit operations. The characteristics of these systems must be matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large, complex systems engineering challenge, which is being addressed in part by focusing on how specific subsystems handle off-nominal missions and fault tolerance. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML), the Mission and Fault Management (M&FM) algorithms are crafted and vetted in specialized Integrated Development Teams composed of multiple development disciplines. NASA has also formed an M&FM team for addressing fault management early in the development life cycle. This team has developed a dedicated Vehicle Management End-to-End Testbed (VMET) that integrates specific M&FM algorithms, specialized nominal and off-nominal test cases, and vendor-supplied physics-based launch vehicle subsystem models. The flexibility of VMET enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the algorithms utilizing actual subsystem models. The intent is to validate the algorithms and substantiate them with performance baselines for each of the vehicle subsystems in an independent platform exterior to the flight software test processes. In any software development process there is inherent risk in the interpretation and implementation of concepts into software through requirements and test processes. Risk reduction is addressed by working with other organizations such as S

  7. Effective link quality estimation as a means to improved end-to-end packet delivery in high traffic mobile ad hoc networks

    Directory of Open Access Journals (Sweden)

    Syed Rehan Afzal

    2017-08-01

    Full Text Available Accurate link quality estimation is a fundamental building block in quality-aware multi-hop routing. In an inherently lossy, unreliable and dynamic medium such as wireless, the task of accurate estimation becomes very challenging. Over the years, ETX has been widely used as a reliable link quality estimation metric. However, it has more recently been established that ETX performance degrades significantly under heavy traffic loads. We examine the ETX metric's behavior in detail with respect to the MAC layer and UDP data, and identify the causes of its unreliability. Motivated by the observations made in our analysis, we present the design and implementation of our link quality measurement metric xDDR – a variation of ETX. This article extends xDDR to support network mobility. Our experiments show that xDDR substantially outperforms minimum hop count, ETX and HETX in terms of end-to-end packet delivery ratio in static as well as mobile scenarios.
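
    For reference, the abstract builds on ETX, whose standard definition (expected transmissions for a successful delivery plus acknowledgment) is computed from measured forward and reverse probe delivery ratios. The sketch below shows baseline ETX only; the paper's xDDR variant is not reproduced, and the probe counts are illustrative.

    ```python
    # Baseline ETX (De Couto et al.) from probe statistics; the xDDR variant
    # described in the paper is not reproduced here, only standard ETX.
    def delivery_ratio(received: int, sent: int) -> float:
        return received / sent if sent else 0.0

    def etx(df: float, dr: float) -> float:
        """Expected transmission count for one link; inf if either direction is dead."""
        return float('inf') if df * dr == 0 else 1.0 / (df * dr)

    # Example: 8 of 10 forward probes and 9 of 10 reverse probes got through.
    print(etx(delivery_ratio(8, 10), delivery_ratio(9, 10)))  # ~1.39 transmissions
    ```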

  8. Demonstration of a fully-coupled end-to-end model for small pelagic fish using sardine and anchovy in the California Current

    Science.gov (United States)

    Rose, Kenneth A.; Fiechter, Jerome; Curchitser, Enrique N.; Hedstrom, Kate; Bernal, Miguel; Creekmore, Sean; Haynie, Alan; Ito, Shin-ichi; Lluch-Cota, Salvador; Megrey, Bernard A.; Edwards, Chris A.; Checkley, Dave; Koslow, Tony; McClatchie, Sam; Werner, Francisco; MacCall, Alec; Agostini, Vera

    2015-11-01

    We describe and document an end-to-end model of anchovy and sardine population dynamics in the California Current as a proof of principle that such coupled models can be developed and implemented. The end-to-end model is 3-dimensional, time-varying, and multispecies, and consists of four coupled submodels: hydrodynamics, Eulerian nutrient-phytoplankton-zooplankton (NPZ), an individual-based full life cycle anchovy and sardine submodel, and an agent-based fishing fleet submodel. A predator roughly mimicking albacore was included as individuals that consumed anchovy and sardine. All submodels were coded within the ROMS open-source community model, used the same resolution spatial grid, and were all solved simultaneously to allow for possible feedbacks among the submodels. We used a super-individual approach and solved the coupled models on a distributed memory parallel computer, both of which created difficult but resolvable bookkeeping challenges. The anchovy and sardine growth, mortality, reproduction, and movement, and the fishing fleet submodel, were each calibrated using simplified grids before being inserted into the full end-to-end model. A historical simulation of 1959-2008 was performed, and the latter 45 years analyzed. Sea surface height (SSH) and sea surface temperature (SST) for the historical simulation showed strong horizontal gradients and multi-year scale temporal oscillations related to various climate indices (PDO, NPGO), and both showed responses to ENSO variability. Simulated total phytoplankton was lower during strong El Niño events and higher for the strong 1999 La Niña event. The three zooplankton groups generally corresponded to the spatial and temporal variation in simulated total phytoplankton. Simulated biomasses of anchovy and sardine were within the historical range of observed biomasses, but predicted biomasses showed much less inter-annual variation. Anomalies of annual biomasses of anchovy and sardine showed a switch in the mid
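
    To make the NPZ submodel concept concrete, here is a minimal zero-dimensional nutrient-phytoplankton-zooplankton sketch with assumed rate parameters. The paper's actual Eulerian NPZ submodel is three-dimensional and coupled to ROMS hydrodynamics, so this is only a structural illustration; note the formulation conserves total nitrogen by construction.

    ```python
    # A minimal 0-D NPZ sketch with assumed parameter values; the ROMS-coupled
    # submodel in the paper is 3-D and far more detailed.
    def npz_step(N, P, Z, dt=0.1):
        uptake  = 1.0 * N / (0.5 + N) * P        # Michaelis-Menten nutrient uptake
        grazing = 0.6 * P / (0.8 + P) * Z        # Holling type-II grazing
        p_mort, z_mort, assim = 0.05 * P, 0.08 * Z, 0.7
        dN = -uptake + (1 - assim) * grazing + p_mort + z_mort  # recycling closes the loop
        dP = uptake - grazing - p_mort
        dZ = assim * grazing - z_mort
        return N + dN * dt, P + dP * dt, Z + dZ * dt

    N, P, Z = 2.0, 0.5, 0.2
    for _ in range(1000):
        N, P, Z = npz_step(N, P, Z)
    print(round(N, 3), round(P, 3), round(Z, 3))
    ```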

  9. Application of Modified Direct Denitration to Support the ORNL Coupled-End-to-End Demonstration in Production of Mixed Oxides Suitable for Pellet Fabrication

    Energy Technology Data Exchange (ETDEWEB)

    Walker, Elisabeth A [ORNL]; Vedder, Raymond James [ORNL]; Felker, Leslie Kevin [ORNL]; Marschman, Steve [ORNL]

    2007-01-01

    The current and future development of the Modified Direct Denitration (MDD) process is in support of Oak Ridge National Laboratory's (ORNL) Coupled End-to-End (CETE) research, development, and demonstration (R&D) of proposed advanced fuel reprocessing and fuel fabrication processes. This work will involve the co-conversion of the U/Pu/Np product streams from the UREX+3 separation flow sheet utilizing the existing MDD glove-box setup and the in-cell co-conversion of the U/Pu/Np/Am/Cm product streams from the UREX+1a flow sheet. Characterization equipment is being procured and installed. Oxide powder studies are being done on calcination/reduction variables, as well as pressing and sintering of pellets to permit metallographic examinations.

  10. Food-web dynamics under climate change

    DEFF Research Database (Denmark)

    Zhang, L.; Takahashi, M.; Hartvig, Martin

    2017-01-01

    Climate change affects ecological communities through its impact on the physiological performance of individuals. However, the population dynamics of species well inside their thermal niche are also determined by competitors, prey and predators, in addition to being influenced by temperature changes. … We use a trait-based food-web model to examine how the interplay between the direct physiological effects from temperature and the indirect effects due to changing interactions between populations shapes the ecological consequences of climate change for populations and for entire communities. Our … climatically well-adapted species may be brought to extinction by the changed food-web topology. Our results highlight that the impact of climate change on specific populations is largely unpredictable, and apparently well-adapted species may be severely impacted…

  11. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    Science.gov (United States)

    Trevino, Luis; Johnson, Stephen B.; Patterson, Jonathan; Teare, David

    2015-01-01

    The engineering development of the National Aeronautics and Space Administration's (NASA) new Space Launch System (SLS) requires cross-discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on-orbit operations. The nominal and off-nominal characteristics of SLS's elements and subsystems must be understood and matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large and complex systems engineering challenge, which is being addressed in part by focusing on the specific subsystems involved in handling off-nominal missions, fault tolerance, and response management. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML) and Systems Modeling Language (SysML), the Mission and Fault Management (M&FM) algorithms for the vehicle are crafted and vetted in Integrated Development Teams (IDTs) composed of multiple development disciplines such as Systems Engineering (SE), Flight Software (FSW), Safety and Mission Assurance (S&MA) and the major subsystems and vehicle elements such as Main Propulsion Systems (MPS), boosters, avionics, Guidance, Navigation, and Control (GNC), Thrust Vector Control (TVC), and liquid engines. These model-based algorithms and their development lifecycle from inception through FSW certification are an important focus of SLS's development effort to further ensure reliable detection and response to off-nominal vehicle states during all phases of vehicle operation from pre-launch through end of flight. To test and validate these M&FM algorithms a dedicated test-bed was developed for full Vehicle Management End-to-End Testing (VMET). For addressing fault management (FM

  12. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    Science.gov (United States)

    Trevino, Luis; Patterson, Jonathan; Teare, David; Johnson, Stephen

    2015-01-01

    …integrates specific M&FM algorithms, specialized nominal and off-nominal test cases, and vendor-supplied physics-based launch vehicle subsystem models. Additionally, the team has developed processes for implementing and validating these algorithms for concept validation and risk reduction for the SLS program. The flexibility of the Vehicle Management End-to-end Testbed (VMET) enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the developed algorithms utilizing actual subsystem models such as MPS. The intent of VMET is to validate the M&FM algorithms and substantiate them with performance baselines for each of the target vehicle subsystems in an independent platform exterior to the flight software development infrastructure and its related testing entities. In any software development process there is inherent risk in the interpretation and implementation of concepts into flight software through requirements and test cases, compounded by potential human errors throughout the development lifecycle. Risk reduction is addressed by the M&FM analysis group working with other organizations such as S&MA, Structures and Environments, GNC, Orion, the Crew Office, Flight Operations, and Ground Operations by assessing the performance of the M&FM algorithms in terms of their ability to reduce Loss of Mission and Loss of Crew probabilities. In addition, through state machine and diagnostic modeling, analysis efforts investigate a broader suite of failure effects and associated detection and responses that can be tested in VMET to ensure that failures can be detected, and confirm that responses do not create additional risks or cause undesired states through interactive dynamic effects with other algorithms and systems. VMET further contributes to risk reduction by prototyping and exercising the M&FM algorithms early in their implementation and without any inherent hindrances such as meeting FSW
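
    As a purely illustrative sketch of the kind of detection-and-response logic such a testbed exercises, the snippet below implements a small persistence-filtered limit check that transitions a vehicle state machine to a safing response. All names, thresholds, and states are invented; they do not reflect actual SLS M&FM algorithms.

    ```python
    # Hypothetical sketch of a persistence-filtered fault monitor; names,
    # thresholds, and states are invented for illustration only.
    from enum import Enum, auto

    class VehicleState(Enum):
        NOMINAL = auto()
        SAFING = auto()

    class FaultMonitor:
        """N consecutive limit violations trip a safing response."""
        def __init__(self, limit: float, persistence: int = 3):
            self.limit, self.persistence = limit, persistence
            self.strikes, self.state = 0, VehicleState.NOMINAL

        def ingest(self, sensor_value: float) -> VehicleState:
            # Counting consecutive exceedances avoids tripping on one noisy sample.
            self.strikes = self.strikes + 1 if sensor_value > self.limit else 0
            if self.strikes >= self.persistence:
                self.state = VehicleState.SAFING
            return self.state

    monitor = FaultMonitor(limit=100.0)
    for reading in [98.0, 101.2, 103.4, 105.9]:
        print(monitor.ingest(reading))   # trips to SAFING on the third exceedance
    ```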

  13. Ecosystem limits to food web fluxes and fisheries yields in the North Sea simulated with an end-to-end food web model

    Science.gov (United States)

    Heath, Michael R.

    2012-09-01

    Equilibrium yields from an exploited fish stock represent the surplus production remaining after accounting for losses due to predation. However, most estimates of maximum sustainable yield, upon which fisheries management targets are partly based, assume that productivity and predation rates are constant in time or at least stationary. This means that there is no recognition of the potential for interaction between different fishing sectors. Here, an end-to-end ecosystem model is developed to explore the possible scale and mechanisms of interactions between pelagic and demersal fishing in the North Sea. The model simulates fluxes of nitrogen between detritus, inorganic nutrient and guilds of taxa spanning phytoplankton to mammals. The structure strikes a balance between graininess in space, taxonomy and demography, and the need to constrain the parameter count sufficiently to enable automatic parameter optimization. Simulated annealing is used to locate the maximum likelihood parameter set, given the model structure and a suite of observations of annual rates of production and fluxes between guilds. Simulations of the impact of fishery harvesting rates showed that equilibrium yields of pelagic and demersal fish were strongly interrelated due to a variety of top-down and bottom-up food web interactions. The results clearly show that management goals based on simultaneously achieving maximum sustainable biomass yields from all commercial fish stocks are simply unattainable. Trade-offs between, for example, pelagic and demersal fishery sectors and other properties of the ecosystem have to be considered in devising an overall harvesting strategy.
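
    The parameter-fitting step named above (simulated annealing to locate a maximum likelihood parameter set) follows a generic pattern, sketched below with an invented toy objective; the paper's likelihood function and parameter vector are, of course, far larger.

    ```python
    # Generic simulated-annealing skeleton for maximizing an objective; the
    # objective here is a toy stand-in, not the paper's likelihood.
    import math, random

    def anneal(objective, x0, step=0.1, t0=1.0, cooling=0.995, iters=20000):
        x, best = list(x0), list(x0)
        fx = fbest = objective(x)
        t = t0
        for _ in range(iters):
            cand = [xi + random.gauss(0, step) for xi in x]
            fc = objective(cand)
            # Accept improvements always, worse moves with Boltzmann probability.
            if fc > fx or random.random() < math.exp((fc - fx) / t):
                x, fx = cand, fc
                if fx > fbest:
                    best, fbest = list(x), fx
            t *= cooling
        return best, fbest

    # Toy log-likelihood surface with a maximum at (1, -2).
    print(anneal(lambda p: -((p[0] - 1) ** 2 + (p[1] + 2) ** 2), [0.0, 0.0]))
    ```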

  14. Partial QoS-Aware Opportunistic Relay Selection Over Two-Hop Channels: End-to-End Performance Under Spectrum-Sharing Requirements

    KAUST Repository

    Yuli Yang

    2014-10-01

    In this paper, we propose a partial quality-of-service (QoS)-oriented relay selection scheme with a decode-and-forward (DF) relaying protocol, to reduce the feedback amount required for relay selection. In the proposed scheme, the activated relay is the one with the maximum signal-to-noise power ratio (SNR) in the second hop among those whose packet loss rates (PLRs) in the first hop achieve a predetermined QoS level. To evaluate the performance of the proposed scheme, we consider transmission constraints imposed on the transmit power budget and on the interference caused to other users. By analyzing the statistics of received SNRs in the first and second hops, we obtain the end-to-end PLR of this scheme in closed form under the considered scenario. Moreover, to compare the proposed scheme with popular relay selection schemes, we also derive the closed-form PLR expressions for the partial relay selection (PRS) and opportunistic relay selection (ORS) criteria in the same scenario under study. Illustrative numerical results demonstrate the accuracy of our derivations and substantiate that the proposed relay selection scheme is a promising alternative with respect to the tradeoff between performance and complexity.
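
    The selection rule described above reduces to a two-stage filter-then-argmax, sketched here with made-up relay statistics: keep the relays whose first-hop PLR meets the QoS target, then activate the one with the best second-hop SNR.

    ```python
    # Sketch of the described rule: among relays whose first-hop packet loss
    # rate meets the QoS target, activate the one with the best second-hop SNR.
    def select_relay(relays, plr_target):
        """relays: list of (relay_id, first_hop_plr, second_hop_snr_db)."""
        qualified = [r for r in relays if r[1] <= plr_target]
        if not qualified:
            return None                      # no relay satisfies the QoS level
        return max(qualified, key=lambda r: r[2])

    relays = [("R1", 0.02, 12.5), ("R2", 0.10, 18.0), ("R3", 0.04, 15.1)]
    print(select_relay(relays, plr_target=0.05))  # -> ('R3', 0.04, 15.1)
    ```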

  15. The Swarm End-to-End mission simulator study: A demonstration of separating the various contributions to Earth's magnetic field using synthetic data

    DEFF Research Database (Denmark)

    Olsen, Nils; Haagmans, R.; Sabaka, T.J.

    2006-01-01

    Swarm, a satellite constellation to measure Earth's magnetic field with unprecedented accuracy, has been selected by ESA for launch in 2009. The mission will provide the best ever survey of the geomagnetic field and its temporal evolution, in order to gain new insights into the Earth system by improving our understanding of the Earth's interior and climate. An End-to-End mission performance simulation was carried out during Phase A of the mission, with the aim of analyzing the key system requirements, particularly with respect to the number of Swarm satellites and their orbits related … applied to the synthetic data to analyze various aspects of field recovery in relation to different numbers of satellites, different constellations and realistic noise sources. This paper gives an overview of the study activities, describes the generation of the synthetic data, and assesses the obtained

  16. Modeling the interaction of IEEE 802.3x hop-by-hop flow control and TCP end-to-end flow control

    NARCIS (Netherlands)

    R. Malhotra; R. van Haalen; M.R.H. Mandjes (Michel); R. Núñez Queija (Rudesindo (Sindo))

    2005-01-01

    Ethernet is rapidly expanding beyond its niche of local area networks. However, its success in larger metropolitan area networks will be determined by its ability to combine simplicity, low costs and quality of service. A key element in successfully transporting bursty traffic and at the
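
    As a toy illustration of the hop-by-hop mechanism named in the title, the sketch below models a single queue whose occupancy crossing a high watermark pauses the upstream sender (an 802.3x PAUSE) and resumes it below a low watermark. All rates and thresholds are invented for illustration.

    ```python
    # Toy discrete-time queue with 802.3x-style high/low watermarks; values
    # are illustrative, not derived from the paper's model.
    def simulate(arrivals, service_rate=3, high=20, low=10):
        buf, paused = 0, False
        for a in arrivals:
            admitted = 0 if paused else a      # a PAUSE frame halts the upstream hop
            buf = max(0, buf + admitted - service_rate)
            if buf >= high:
                paused = True                  # high watermark crossed: send PAUSE
            elif buf <= low:
                paused = False                 # low watermark reached: resume
            print(f"buffer={buf:3d} paused={paused}")

    simulate([5] * 15)  # constant offered load above the service rate
    ```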

  17. The Hurricane-Flood-Landslide Continuum: An Integrated, End-to-end Forecast and Warning System for Mountainous Islands in the Tropics

    Science.gov (United States)

    Golden, J.; Updike, R. G.; Verdin, J. P.; Larsen, M. C.; Negri, A. J.; McGinley, J. A.

    2004-12-01

    In the 10 days of 21-30 September 1998, Hurricane Georges left a trail of destruction in the Caribbean region and U.S. Gulf Coast. Later in the same year, Hurricane Mitch caused widespread destruction and loss of life in four Central American nations, and in December 1999 a tropical disturbance impacted the north coast of Venezuela, causing hundreds of deaths and several million dollars of property loss. More recently, an off-season disturbance in the Central Caribbean dumped nearly 250 mm of rainfall over Hispaniola during the 24-hr period of May 23, 2004. The resultant flash floods and debris flows in the Dominican Republic and Haiti killed at least 1400 people. In each instance, the tropical system served as the catalyst for major flooding and landslides at landfall. Our goal is to develop and transfer an end-to-end warning system for a prototype region in the Central Caribbean, specifically the islands of Puerto Rico and Hispaniola, which experience frequent tropical cyclones and other disturbances. The envisioned system would include satellite and surface-based observations to track and nowcast dangerous levels of precipitation, atmospheric and hydrological models to predict short-term runoff and streamflow changes, geological models to warn when and where landslides and debris flows are imminent, and the capability to communicate forecast guidance products via satellite to vital government offices in Puerto Rico, Haiti, and the Dominican Republic. In this paper, we shall present a preliminary proof-of-concept study for the May 21-24, 2004 floods and debris flows over Hispaniola to show that the envisaged flow of data, models and graphical products can produce the desired warning outputs. The multidisciplinary research and technology transfer effort will require blending the talents of hydrometeorologists, geologists, remote sensing and GIS experts, and social scientists to ensure timely delivery of tailored graphical products to both weather offices and local

  18. Automated segmentation of 3D anatomical structures on CT images by using a deep convolutional network based on end-to-end learning approach

    Science.gov (United States)

    Zhou, Xiangrong; Takayama, Ryosuke; Wang, Song; Zhou, Xinxin; Hara, Takeshi; Fujita, Hiroshi

    2017-02-01

    We have proposed an end-to-end learning approach that trained a deep convolutional neural network (CNN) for automatic CT image segmentation, which accomplished voxel-wise multi-class classification to directly map each voxel on 3D CT images to an anatomical label automatically. The novelties of our proposed method were (1) transforming the anatomical structures segmentation on 3D CT images into a majority voting of the results of 2D semantic image segmentation on a number of 2D slices from different image orientations, and (2) using "convolution" and "deconvolution" networks to achieve the conventional "coarse recognition" and "fine extraction" functions, which were integrated into a compact all-in-one deep CNN for CT image segmentation. The advantage compared to previous works was its capability to accomplish real-time image segmentation on 2D slices of arbitrary CT scan range (e.g. body, chest, abdomen) and to produce correspondingly sized output. In this paper, we propose an improvement of our approach by adding an organ localization module to limit the CT image range for training and testing the deep CNNs. A database consisting of 240 3D CT scans and human-annotated ground truth was used for training (228 cases) and testing (the remaining 12 cases). We applied the improved method to segment the pancreas and left kidney regions, respectively. The preliminary results showed that the accuracy of the segmentation results improved significantly (the Jaccard index increased by 34% for the pancreas and 8% for the kidney relative to our previous results). The effectiveness and usefulness of the proposed improvement for CT image segmentation were confirmed.
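
    The first novelty above (fusing orientation-wise 2D segmentations by majority voting) can be sketched per voxel as below. The deep CNN inference that would produce the three label volumes is assumed and replaced here by random arrays.

    ```python
    # Majority voting across per-orientation 2D segmentations, in the spirit
    # of the paper's scheme: three label volumes (from axial, coronal, and
    # sagittal slice-wise inference) are fused voxel-wise.
    import numpy as np

    def majority_vote(axial, coronal, sagittal):
        stacked = np.stack([axial, coronal, sagittal])          # (3, Z, Y, X)
        # Per-voxel mode over the three orientation-specific predictions.
        votes = np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, stacked)
        return votes.astype(axial.dtype)

    rng = np.random.default_rng(0)
    vols = [rng.integers(0, 3, size=(4, 4, 4), dtype=np.int64) for _ in range(3)]
    fused = majority_vote(*vols)
    print(fused.shape)  # (4, 4, 4)
    ```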

  19. End-to-end process of hollow spacecraft structures with high frequency and low mass obtained with in-house structural optimization tool and additive manufacturing

    Directory of Open Access Journals (Sweden)

    Alexandru-Mihai CISMILIANU

    2017-09-01

    Full Text Available In the space sector the most decisive elements are: mass reduction, cost saving and minimum lead time; here, structural optimization and additive layer manufacturing (ALM) fit best. The design must be driven by stiffness, because an important requirement for spacecraft (S/C) structures is to reduce the dynamic coupling between the S/C and the launch vehicle. The objective is to create an end-to-end process, from the input given by the customer to the manufacturing of an aluminum part as light as possible but at the same time considerably stiffer, while taking full advantage of the design flexibility given by ALM. To design and optimize the parts, a specialized in-house tool was used, guaranteeing a load-sufficient material distribution. Using topological optimization, the iterations between the design and the stress departments were diminished, thus greatly reducing the lead time. In order to improve and lighten the obtained structure, a design with internal cavities and hollow beams was considered. This implied developing a procedure for powder evacuation through iterations with the manufacturer while optimizing the design for ALM. The resulting part can then be manufactured via ALM with no need for further design adjustments. To achieve a high-quality part with maximum efficiency, it is essential to have a loop between the design team and the manufacturer. Topological optimization and ALM work hand in hand if used properly. The team achieved a more efficient structure using topology optimization and ALM than using conventional design and manufacturing methods.

  20. POTION: an end-to-end pipeline for positive Darwinian selection detection in genome-scale data through phylogenetic comparison of protein-coding genes.

    Science.gov (United States)

    Hongo, Jorge A; de Castro, Giovanni M; Cintra, Leandro C; Zerlotini, Adhemar; Lobo, Francisco P

    2015-08-01

    Detection of genes evolving under positive Darwinian evolution in genome-scale data is nowadays a prevailing strategy in comparative genomics studies to identify genes potentially involved in adaptation processes. Despite the large number of studies aiming to detect and contextualize such gene sets, there is virtually no software available to perform this task in a general, automatic, large-scale and reliable manner. This is largely due to the computational challenges involved in this task, such as the appropriate modeling of the data under analysis, the computation time required to perform several of the necessary steps when dealing with genome-scale data, and the highly error-prone nature of the sequence and alignment data structures needed for genome-wide positive selection detection. We present POTION, an open source, modular and end-to-end software for genome-scale detection of positive Darwinian selection in groups of homologous coding sequences. Our software represents a key step towards genome-scale, automated detection of positive selection, from predicted coding sequences and their homology relationships to high-quality groups of positively selected genes. POTION reduces false positives through several sophisticated sequence and group filters based on numeric, phylogenetic, quality and conservation criteria to remove spurious data, and through multiple hypothesis corrections, and considerably reduces computation time thanks to a parallelized design. Our software achieved a high classification performance when used to evaluate a curated dataset of Trypanosoma brucei paralogs previously surveyed for positive selection. When used to analyze predicted groups of homologous genes of 19 strains of Mycobacterium tuberculosis as a case study, we demonstrated the filters implemented in POTION to remove sources of errors that commonly inflate errors in positive selection detection. A thorough literature review found no other software similar to POTION in terms of customization
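
    To illustrate the final classification idea (not POTION's actual code), the sketch below flags homologous groups as positively selected when their estimated dN/dS (ω) exceeds 1 and their p-value survives a Bonferroni-style multiple-testing correction. Inputs are assumed to come from an upstream codeml-style analysis.

    ```python
    # Minimal filter in the spirit of a positive-selection classification
    # step; assumed, not POTION's implementation.
    def positively_selected(groups, alpha=0.05):
        """groups: list of (group_id, omega, p_value). Bonferroni-corrected."""
        threshold = alpha / max(len(groups), 1)
        return [g for g, omega, p in groups if omega > 1.0 and p < threshold]

    tests = [("g1", 1.8, 0.0001), ("g2", 0.4, 0.0001), ("g3", 2.1, 0.2)]
    print(positively_selected(tests))  # -> ['g1']
    ```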

  1. SU-E-J-25: End-To-End (E2E) Testing On TomoHDA System Using a Real Pig Head for Intracranial Radiosurgery

    Energy Technology Data Exchange (ETDEWEB)

    Corradini, N; Leick, M; Bonetti, M; Negretti, L [Clinica Luganese, Radiotherapy Center, Lugano (Switzerland)

    2015-06-15

    Purpose: To determine the MVCT imaging uncertainty on the TomoHDA system for intracranial radiosurgery treatments, and to determine the end-to-end (E2E) overall accuracy of the TomoHDA system for intracranial radiosurgery. Methods: A pig head was obtained from the butcher, cut coronally through the brain, and preserved in formaldehyde. The base of the head was fixed to a positioning plate allowing precise movement, i.e. translation and rotation, in all 6 axes. A repeatability test was performed on the pig head to determine the uncertainty in the image bone registration algorithm. Furthermore, the test studied images with MVCT slice thicknesses of 1 and 3 mm in combination with differing scan lengths. A sensitivity test was performed to determine the registration algorithm's ability to find the absolute position of known translations/rotations of the pig head. The algorithm's ability to determine absolute position was compared against that of manual operators, i.e. a radiation therapist and a radiation oncologist. Finally, E2E tests for intracranial radiosurgery were performed by measuring the delivered dose distributions within the pig head using Gafchromic films. Results: The repeatability test uncertainty was lowest for the MVCTs of 1-mm slice thickness, which measured less than 0.10 mm and 0.12 deg for all axes. For the sensitivity tests, the bone registration algorithm performed better than human eyes, and a maximum difference of 0.3 mm and 0.4 deg was observed across the axes. E2E test results for absolute position difference measured 0.03 ± 0.21 mm in the x-axis and 0.28 ± 0.18 mm in the y-axis. A maximum difference of 0.32 and 0.66 mm was observed in x and y, respectively. The average peak dose difference between measured and calculated dose was 2.7 cGy or 0.4%. Conclusion: Our tests using a pig head phantom estimate the TomoHDA system to have submillimeter overall accuracy for intracranial radiosurgery.

  2. SU-E-T-360: End-To-End Dosimetric Testing of a Versa HD Linear Accelerator with the Agility Head Modeled in Pinnacle3

    Energy Technology Data Exchange (ETDEWEB)

    Saenz, D; Narayanasamy, G; Cruz, W; Papanikolaou, N; Stathakis, S [University of Texas Health Science Center at San Antonio, San Antonio, TX (United States)

    2015-06-15

    Purpose: The Versa HD incorporates a variety of upgrades, primarily including the Agility head. The distinct dosimetric properties of the head from its predecessors, combined with flattening-filter-free (FFF) beams, require a new investigation of modeling in planning systems and verification of modeling accuracy. Methods: A model was created in Pinnacle{sup 3} v9.8 with commissioned beam data. Leaf transmission was modeled as <0.5% with a maximum leaf speed of 3 cm/s. Photon spectra were tuned for FFF beams, for which profiles were modeled with arbitrary profiles rather than with cones. For verification, a variety of plans with varied parameters were devised, and point dose measurements were compared to calculated values. A phantom of several plastic water and Styrofoam slabs was scanned and imported into Pinnacle{sup 3}. Beams of different field sizes, SSD, wedges, and gantry angles were created. All available photon energies (6 MV, 10 MV, 18 MV, 6 FFF, 10 FFF) as well as four clinical electron energies (6, 9, 12, and 15 MeV) were investigated. The plans were verified at a calculation point (8 cm deep for photons, variable for electrons) by measurement with a PTW Semiflex ionization chamber. In addition, IMRT testing was performed with three standard plans (step and shoot IMRT, small and large field VMAT plans). The plans were delivered on the Delta4 IMRT QA phantom (ScandiDos, Uppsala, Sweden). Results: Homogeneous point dose measurement agreed within 2% for all photon and electron beams. Open field photon measurements along the central axis at 100 cm SSD passed within 1%. Gamma passing rates were >99.5% for all plans with a 3%/3mm tolerance criteria. The IMRT QA results for the first 23 patients yielded gamma passing rates of 97.4±2.3%. Conclusion: The end-to-end testing ensured confidence in the ability of Pinnacle{sup 3} to model photon and electron beams with the Agility head.

  3. SU-E-T-19: A New End-To-End Test Method for ExacTrac for Radiation and Plan Isocenter Congruence

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S; Nguyen, N; Liu, F; Huang, Y [Rhode Island Hospital / Warren Alpert Medical, Providence, RI (United States); Sio, T [Mayo Clinic, Rochester, MN (United States); Jung, J [East Carolina University, Greenville, North Carolina (United States); Pyakuryal, A [University of Illinois at Chicago, Chicago, IL (United States); Jang, S [Princeton Radiation Oncology Ctr., Jamesburg, NJ (United States)

    2014-06-01

    Purpose: To combine and integrate quality assurance (QA) of target localization and the radiation isocenter End-to-End (E2E) test of the BrainLAB ExacTrac system, a new QA approach was devised using an anthropomorphic head and neck phantom. This test ensures target localization as well as radiation isocenter congruence, which is one step ahead of the current ExacTrac QA procedures. Methods: The head and neck phantom typically used for CyberKnife E2E tests was irradiated at the sphere target that was visible in CT-sim images. The CT-sim was performed using 1 mm slice thickness with a helical scanning technique. The sphere was 3 cm in diameter and was contoured as a target volume using iPlan V.4.5.2. A conformal arc plan was generated using MLC-based delivery with 7 fields, five of which included couch rotations. The prescription dose was 5 Gy with 95% coverage of the target volume. For the irradiation, two Gafchromic films were perpendicularly inserted into the cube that holds the sphere inside. The linac used for the irradiation was a TrueBeam STx equipped with an HD120 MLC. In order to use ExacTrac, an infrared head-array was used to correlate the orthogonal X-ray images. Results: The phantom was positioned using the orthogonal X-rays of ExacTrac. For each field, the phantom was checked again with X-rays and repositioned if necessary. After each setup using ExacTrac, the target was irradiated. The films were analyzed to determine the deviation of the radiation isocenter in all three dimensions: superior-inferior, left-right and anterior-posterior. The total combined error was found to be 0.76 mm ± 0.05 mm, which is within sub-millimeter accuracy. Conclusion: Until now, the E2E test for ExacTrac was implemented separately to test image localization and radiation isocenter. This new method can be used for periodic QA procedures.

  4. Greenhouse gas profiling by infrared-laser and microwave occultation: retrieval algorithm and demonstration results from end-to-end simulations

    Science.gov (United States)

    Proschek, V.; Kirchengast, G.; Schweitzer, S.

    2011-10-01

    Measuring greenhouse gas (GHG) profiles with global coverage and high accuracy and vertical resolution in the upper troposphere and lower stratosphere (UTLS) is key for improved monitoring of GHG concentrations in the free atmosphere. In this respect a new satellite mission concept, adding an infrared-laser part to the already well studied microwave occultation technique, exploits the joint propagation of infrared-laser and microwave signals between Low Earth Orbit (LEO) satellites. This synergetic combination, referred to as the LEO-LEO microwave and infrared-laser occultation (LMIO) method, enables retrieval of thermodynamic profiles (pressure, temperature, humidity) and accurate altitude levels from the microwave signals, and GHG profiles from the simultaneously measured infrared-laser signals. However, due to the novelty of the LMIO method, a retrieval algorithm for GHG profiling was not yet available. Here we introduce such an algorithm for retrieving GHGs from LEO-LEO infrared-laser occultation (LIO) data, applied as a second step after retrieving thermodynamic profiles from LEO-LEO microwave occultation (LMO) data. We thoroughly describe the LIO retrieval algorithm and unveil the synergy with the LMO-retrieved pressure, temperature, and altitude information. We furthermore demonstrate the effective independence of the GHG retrieval results from background (a priori) information by discussing demonstration results from LMIO end-to-end simulations for a representative set of GHG profiles, including carbon dioxide (CO2), water vapor (H2O), methane (CH4), and ozone (O3). The GHGs except for ozone are well retrieved throughout the UTLS, while ozone is well retrieved from about 10 km to 15 km upwards, since the ozone layer resides in the lower stratosphere. The GHG retrieval errors are generally smaller than 1% to 3% r.m.s., at a vertical resolution of about 1 km. The retrieved profiles also appear unbiased, which points to the climate benchmarking capability of the LMIO
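
    The physical core of an infrared-laser absorption retrieval can be caricatured with the Beer-Lambert law: comparing an on-line (absorbed) and off-line (reference) signal yields the absorber's number density along the path. The numbers below are illustrative, not LMIO mission parameters, and the real algorithm handles broadening, defocusing and scintillation effects this sketch ignores.

    ```python
    # Toy differential-absorption retrieval under Beer-Lambert assumptions;
    # all values are invented for illustration.
    import math

    def number_density(I_on, I_off, sigma_diff, path_length_m):
        """n = -ln(I_on / I_off) / (delta_sigma * L); sigma in m^2, L in m."""
        return -math.log(I_on / I_off) / (sigma_diff * path_length_m)

    # Example: 5% extra on-line attenuation, 1e-25 m^2 differential
    # cross-section, 200 km effective path through the UTLS.
    print(number_density(0.95, 1.00, 1e-25, 2.0e5))  # ~2.6e18 molecules per m^3
    ```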

  5. Demonstration of the First Real-Time End-to-End 40-Gb/s PAM-4 for Next-Generation Access Applications using 10-Gb/s Transmitter

    DEFF Research Database (Denmark)

    Wei, J. L.; Eiselt, Nicklas; Griesser, Helmut

    2016-01-01

    We demonstrate the first known experiment of a real-time end-to-end 40-Gb/s PAM-4 system for next-generation access applications using 10-Gb/s class transmitters only. Based on the measurement of a real-time 40-Gb/s PAM system, low-cost upstream and downstream link power budgets are estimated. Up...

  6. InterDomain-QOSM: The NSIS QOS Model for Inter-domain Signaling to Enable End-to-End QoS Provisioning Over Heterogeneous Network Domains

    NARCIS (Netherlands)

    Zhang, J.; Monteiro, E.; Mendes, P.; Karagiannis, Georgios; Andres-Colas, J.

    2006-01-01

    This document has three goals. First of all, it presents our analysis of how to use NSIS signaling (inter-domain QOSM and intra-domain QOSM) to fulfill QoS control in accordance with the ITU-T RACF functional architecture. For this goal, we discuss how the ITU-T RACF entities in the ITU-T RACF

  7. Food-web dynamics in a large river discontinuum

    Science.gov (United States)

    Cross, Wyatt F.; Baxter, Colden V.; Rosi-Marshall, Emma J.; Hall, Robert O.; Kennedy, Theodore A.; Donner, Kevin C.; Kelly, Holly A. Wellard; Seegert, Sarah E.Z.; Behn, Kathrine E.; Yard, Michael D.

    2013-01-01

    Nearly all ecosystems have been altered by human activities, and most communities are now composed of interacting species that have not co-evolved. These changes may modify species interactions, energy and material flows, and food-web stability. Although structural changes to ecosystems have been widely reported, few studies have linked such changes to dynamic food-web attributes and patterns of energy flow. Moreover, there have been few tests of food-web stability theory in highly disturbed and intensely managed freshwater ecosystems. Such synthetic approaches are needed for predicting the future trajectory of ecosystems, including how they may respond to natural or anthropogenic perturbations. We constructed flow food webs at six locations along a 386-km segment of the Colorado River in Grand Canyon (Arizona, USA) for three years. We characterized food-web structure and production, trophic basis of production, energy efficiencies, and interaction-strength distributions across a spatial gradient of perturbation (i.e., distance from Glen Canyon Dam), as well as before and after an experimental flood. We found strong longitudinal patterns in food-web characteristics that strongly correlated with the spatial position of large tributaries. Above tributaries, food webs were dominated by nonnative New Zealand mudsnails (62% of production) and nonnative rainbow trout (100% of fish production). The simple structure of these food webs led to few dominant energy pathways (diatoms to few invertebrate taxa to rainbow trout) and large energy inefficiencies (i.e., …). Below large tributaries, invertebrate production declined ∼18-fold, while fish production remained similar to upstream sites and comprised predominantly native taxa (80–100% of production). Sites below large tributaries had increasingly reticulate and detritus-based food webs with a higher prevalence of omnivory, as well as interaction strength distributions more typical of theoretically stable food webs (i

  8. Microbial Food-Web Drivers in Tropical Reservoirs.

    Science.gov (United States)

    Domingues, Carolina Davila; da Silva, Lucia Helena Sampaio; Rangel, Luciana Machado; de Magalhães, Leonardo; de Melo Rocha, Adriana; Lobão, Lúcia Meirelles; Paiva, Rafael; Roland, Fábio; Sarmento, Hugo

    2017-04-01

    Element cycling in aquatic systems is driven chiefly by planktonic processes, and the structure of the planktonic food web determines the efficiency of carbon transfer through trophic levels. However, few studies have comprehensively evaluated all planktonic food-web components in tropical regions. The aim of this study was to unravel the top-down controls (metazooplankton community structure), bottom-up controls (resource availability), and hydrologic (water residence time) and physical (temperature) variables that affect different components of the microbial food web (MFW) carbon stock in tropical reservoirs, through structural equation models (SEM). We conducted a field study in four deep Brazilian reservoirs (Balbina, Tucuruí, Três Marias, and Funil) with different trophic states (oligo-, meso-, and eutrophic). We found evidence of a high contribution of the MFW (up to 50% of total planktonic carbon), especially in the less-eutrophic reservoirs (Balbina and Tucuruí). Bottom-up and top-down effects assessed through SEM indicated negative interactions between soluble reactive phosphorus and phototrophic picoplankton (PPP), dissolved inorganic nitrogen, and heterotrophic nanoflagellates (HNF). Copepods positively affected ciliates, and cladocerans positively affected heterotrophic bacteria (HB) and PPP. Higher copepod/cladoceran ratios and an indirect positive effect of copepods on HB might strengthen HB-HNF coupling. We also found low values for the degree of uncoupling (D) and a low HNF/HB ratio compared with literature data (mostly from temperate regions). This study demonstrates the importance of evaluating the whole size spectrum (including microbial compartments) of the different planktonic compartments, in order to capture the complex carbon dynamics of tropical aquatic ecosystems.

  9. SU-F-P-39: End-To-End Validation of a 6 MV High Dose Rate Photon Beam, Configured for Eclipse AAA Algorithm Using Golden Beam Data, for SBRT Treatments Using RapidArc

    Energy Technology Data Exchange (ETDEWEB)

    Ferreyra, M; Salinas Aranda, F; Dodat, D; Sansogne, R; Arbiser, S [Vidt Centro Medico, Ciudad Autonoma de Buenos Aires (Argentina)

    2016-06-15

    Purpose: To use end-to-end testing to validate a 6 MV high dose rate photon beam, configured for the Eclipse AAA algorithm using Golden Beam Data (GBD), for SBRT treatments using RapidArc. Methods: Beam data was configured for the Varian Eclipse AAA algorithm using the GBD provided by the vendor. Transverse and diagonal dose profiles, PDDs and output factors down to a field size of 2×2 cm² were measured on a Varian Trilogy linac and compared with the GBD library using 2%/2 mm 1D gamma analysis. The MLC transmission factor and dosimetric leaf gap were determined to characterize the MLC in Eclipse. Mechanical and dosimetric tests were performed combining different gantry rotation speeds, dose rates and leaf speeds to evaluate the delivery system performance against VMAT accuracy requirements. An end-to-end test was implemented by planning several SBRT RapidArc treatments on a CIRS 002LFC IMRT Thorax Phantom. The CT scanner calibration curve was acquired and loaded in Eclipse. A PTW 31013 ionization chamber was used with a Keithley 35617EBS electrometer for absolute point dose measurements in water and lung equivalent inserts. TPS-calculated planar dose distributions were compared to those measured using EPID and MapCheck, as an independent verification method. Results were evaluated with gamma criteria of 2% dose difference and 2 mm DTA for 95% of points. Results: The GBD set vs. measured data passed 2%/2 mm 1D gamma analysis even for small fields. Machine performance tests showed results independent of machine delivery configuration, as expected. Absolute point dosimetry comparison resulted within 4% for the worst-case scenario in lung. Over 97% of the points evaluated in dose distributions passed the gamma index analysis. Conclusion: Eclipse AAA algorithm configuration of the 6 MV high dose rate photon beam using GBD proved efficient. End-to-end test dose calculation results indicate it can be used clinically for SBRT using RapidArc.
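
    The gamma analyses quoted throughout these abstracts combine a dose-difference criterion with a distance-to-agreement (DTA) criterion (Low et al.); a minimal 1-D version is sketched below with local dose normalization and invented profile data. Clinical tools operate on 2-D/3-D dose grids with interpolation.

    ```python
    # 1-D gamma-index sketch matching the 2%/2 mm style criteria above;
    # doses in the same units, positions in mm. Illustrative data only.
    import math

    def gamma_1d(ref, meas, dd=0.02, dta=2.0):
        """ref/meas: lists of (position_mm, dose). Returns per-point gamma values."""
        gammas = []
        for xm, dm in meas:
            g = min(math.sqrt(((xm - xr) / dta) ** 2 +
                              ((dm - dr) / (dd * dr)) ** 2)
                    for xr, dr in ref)
            gammas.append(g)
        return gammas

    ref  = [(0.0, 1.00), (1.0, 0.98), (2.0, 0.95)]
    meas = [(0.0, 1.01), (1.0, 0.99), (2.0, 0.97)]
    g = gamma_1d(ref, meas)
    print([round(v, 2) for v in g], "pass rate:", sum(v <= 1 for v in g) / len(g))
    ```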

  10. End-to-end energy efficient communication

    DEFF Research Database (Denmark)

    Dittmann, Lars

    Awareness of energy consumption in communication networks such as the Internet is currently gaining momentum, as it is commonly acknowledged that increased network capacity (currently driven by video applications) requires significantly more electrical power. This paper stresses the importance

  11. End-to-end image quality assessment

    Science.gov (United States)

    Raventos, Joaquin

    2012-05-01

    An innovative computerized benchmarking approach (US patent pending, Sep 2011), based on extensive application of photometry, geometrical optics, and digital media, uses a randomized target to let a standard observer assess the image quality of video imaging systems at different daytime and low-light luminance levels. It takes into account the target's contrast and color characteristics, as well as the observer's visual acuity and dynamic response. This includes human vision as part of the "extended video imaging system" (EVIS), and allows image quality assessment by several standard observers simultaneously.

  12. Using food-web theory to conserve ecosystems.

    Science.gov (United States)

    McDonald-Madden, E; Sabbadin, R; Game, E T; Baxter, P W J; Chadès, I; Possingham, H P

    2016-01-18

    Food-web theory can be a powerful guide to the management of complex ecosystems. However, we show that indices of species importance common in food-web and network theory can be a poor guide to ecosystem management, resulting in significantly more extinctions than necessary. We use Bayesian Networks and Constrained Combinatorial Optimization to find optimal management strategies for a wide range of real and hypothetical food webs. This Artificial Intelligence approach provides the ability to test the performance of any index for prioritizing species management in a network. While no single network theory index provides an appropriate guide to management for all food webs, a modified version of the Google PageRank algorithm reliably minimizes the chance and severity of negative outcomes. Our analysis shows that by prioritizing ecosystem management based on the network-wide impact of species protection rather than species loss, we can substantially improve conservation outcomes.
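
    The modified Google PageRank index mentioned above builds on the standard power-iteration algorithm, sketched here over a toy prey-to-predator link structure; the authors' food-web modification is not reproduced.

    ```python
    # Power-iteration PageRank over a small food web; only the standard
    # algorithm the paper's modified variant builds on.
    def pagerank(links, d=0.85, iters=100):
        """links: dict node -> list of nodes it points to (e.g. prey -> predators)."""
        nodes = list(links)
        rank = {n: 1.0 / len(nodes) for n in nodes}
        for _ in range(iters):
            new = {n: (1 - d) / len(nodes) for n in nodes}
            for n, outs in links.items():
                share = rank[n] / len(outs) if outs else 0.0
                for m in outs:
                    new[m] += d * share
            rank = new
        return rank

    web = {"phytoplankton": ["zooplankton"], "zooplankton": ["anchovy"],
           "anchovy": ["tuna"], "tuna": []}
    ranks = pagerank(web)
    print(sorted(ranks, key=ranks.get, reverse=True))
    ```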

  13. Using food-web theory to conserve ecosystems

    Science.gov (United States)

    McDonald-Madden, E.; Sabbadin, R.; Game, E. T.; Baxter, P. W. J.; Chadès, I.; Possingham, H. P.

    2016-01-01

    Food-web theory can be a powerful guide to the management of complex ecosystems. However, we show that indices of species importance common in food-web and network theory can be a poor guide to ecosystem management, resulting in significantly more extinctions than necessary. We use Bayesian Networks and Constrained Combinatorial Optimization to find optimal management strategies for a wide range of real and hypothetical food webs. This Artificial Intelligence approach provides the ability to test the performance of any index for prioritizing species management in a network. While no single network theory index provides an appropriate guide to management for all food webs, a modified version of the Google PageRank algorithm reliably minimizes the chance and severity of negative outcomes. Our analysis shows that by prioritizing ecosystem management based on the network-wide impact of species protection rather than species loss, we can substantially improve conservation outcomes. PMID:26776253

  14. Unmanned Aircraft Systems Detect and Avoid System: End-to-End Verification and Validation Simulation Study of Minimum Operations Performance Standards for Integrating Unmanned Aircraft into the National Airspace System

    Science.gov (United States)

    Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Sturdy, James L.; Vincent, Michael J.; Hoffler, Keith D.; Myer, Robert R.; DeHaven, Anna M.

    2017-01-01

    As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on-board the aircraft. The technique, results, and lessons learned from a detailed End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS), based on specific test vectors and encounter cases, will be presented in this paper.

  15. USE OF ARTIFICIAL AQUATIC FOOD-WEB FOR POST-TREATMENT OF DOMESTIC WASTEWATER IN COLD CLIMATE: A REVIEW

    Directory of Open Access Journals (Sweden)

    Aigars Lavrinovičs

    2017-11-01

    Full Text Available Eutrophication-promoting phosphorus loads originating from wastewater treatment plants are commonly controlled by chemical precipitation in the tertiary treatment phase. However, this approach is costly and generates additional waste. Therefore, inexpensive and sustainable methods for wastewater post-treatment are wanted. The artificial aquatic food-web, which performs wastewater phycoremediation by algae and subsequent biomass harvest by filter-feeding organisms, not only requires low energy consumption but also produces biomass for treatment-process cost recovery. Still, knowledge of its performance at different scales and under different climates is limited. In this review, the application possibilities of the artificial aquatic food-web for domestic wastewater post-treatment are discussed, focusing on its use in cold climate regions. Considering the reduced biological activity of aquatic organisms at low ambient temperature, possible solutions for its performance and prospective application at low temperatures are suggested. Finally, directions for future research regarding the practical use of artificial aquatic food-webs are highlighted.

  16. SU-F-P-37: Implementation of An End-To-End QA Test of the Radiation Therapy Imaging, Planning and Delivery Process to Identify and Correct Possible Sources of Deviation

    Energy Technology Data Exchange (ETDEWEB)

    Salinas Aranda, F; Suarez, V; Arbiser, S; Sansogne, R [Vidt Centro Medico, Ciudad Autonoma de Buenos Aires (Argentina)

    2016-06-15

    Purpose: To implement an end-to-end QA test of the radiation therapy imaging, planning and delivery process, aimed at assessing the dosimetric agreement between planned and delivered treatment in order to identify and correct possible sources of deviation, and to establish an internal standard for machine commissioning acceptance. Methods: A test involving all steps of the radiation therapy imaging, planning and delivery process was designed. The test includes analysis of point dose and planar dose distribution agreement between TPS-calculated and measured dose. An ad hoc 16 cm diameter PMMA phantom was constructed with one central and four peripheral bores that can accommodate calibrated electron density inserts. Using the Varian Eclipse 10.0 and Elekta XiO 4.50 planning systems, IMRT, RapidArc and 3DCRT plans with hard and dynamic wedges were planned on the phantom and tested. An Exradin A1SL chamber was used with a Keithley 35617EBS electrometer for point dose measurements in the phantom. 2D dose distributions were acquired using MapCheck and a Varian aS1000 EPID. Gamma analysis was performed for evaluation of 2D dose distribution agreement using MapCheck software and the Varian Portal Dosimetry application. Varian high-energy Clinacs (Trilogy, 2100C/CD, 2000CR) and a low-energy 6X/EX were tested. TPS CT-number vs. electron density tables were checked for the CT scanners used. Results: Calculated point doses were accurate to 0.127% SD: 0.93%, 0.507% SD: 0.82%, 0.246% SD: 1.39% and 0.012% SD: 0.01% for LoX-3DCRT, HiX-3DCRT, IMRT and RapidArc plans respectively. Planar doses passed gamma 3%/3 mm in all cases and 2%/2 mm for VMAT plans. Conclusion: Implementation of a simple and reliable quality assurance tool was accomplished. The end-to-end test proved efficient, showing excellent agreement between planned and delivered dose and evidencing strong consistency of the whole process from imaging through planning to delivery. This test can be used as a first step in beam model acceptance for clinical

  17. A concept of food-web structure in organic arable farming systems

    NARCIS (Netherlands)

    Smeding, F.W.; Snoo, de G.R.

    2003-01-01

    A proposal for a descriptive or topological farm food web is derived from field observations and from references in the literature. Important themes in food-web theory are tentatively applied to this preliminary model, explaining differences between local farm food-web structures and how they are

  18. Can You Build It? Using Manipulatives to Assess Student Understanding of Food-Web Concepts

    Science.gov (United States)

    Grumbine, Richard

    2012-01-01

    This article outlines an exercise that assesses student knowledge of food-web and energy-flow concepts. Students work in teams and use manipulatives to build food-web models based on criteria assigned by the instructor. The models are then peer reviewed according to guidelines supplied by the instructor.

  19. Towards ecosystem-based management: Identifying operational food-web indicators for marine ecosystems

    DEFF Research Database (Denmark)

    Tam, Jamie C.; Link, Jason S.; Rossberg, Axel G.

    2017-01-01

    … are an important aspect of all marine ecosystems and biodiversity. Here we describe and discuss a process to evaluate the selection of operational food-web indicators for use in evaluating marine ecosystem status. This process brought together experts in food-web ecology, marine ecology, and resource management, to identify available indicators that can be used to inform marine management. Standard evaluation criteria (availability and quality of data, conceptual basis, communicability, relevancy to management) were implemented to identify practical food-web indicators ready for operational use and indicators that hold promise for future use in policy and management. The major attributes of the final suite of operational food-web indicators were structure and functioning. Indicators that represent resilience of the marine ecosystem were less developed. Over 60 potential food-web indicators were evaluated…

  20. Experimental demonstration of a record high 11.25Gb/s real-time optical OFDM transceiver supporting 25km SMF end-to-end transmission in simple IMDD systems.

    Science.gov (United States)

    Giddings, R P; Jin, X Q; Hugues-Salas, E; Giacoumidis, E; Wei, J L; Tang, J M

    2010-03-15

    The fastest ever 11.25Gb/s real-time FPGA-based optical orthogonal frequency division multiplexing (OOFDM) transceivers utilizing 64-QAM encoding/decoding and significantly improved variable power loading are experimentally demonstrated, for the first time, incorporating advanced functionalities of on-line performance monitoring, live system parameter optimization and channel estimation. Real-time end-to-end transmission of an 11.25Gb/s 64-QAM-encoded OOFDM signal with a high electrical spectral efficiency of 5.625bit/s/Hz over 25km of standard and MetroCor single-mode fibres is successfully achieved with respective power penalties of 0.3dB and -0.2dB at a BER of 1.0 × 10^-3 in a directly modulated DFB laser-based intensity modulation and direct detection system without in-line optical amplification and chromatic dispersion compensation. The impacts of variable power loading as well as electrical and optical components on the transmission performance of the demonstrated transceivers are experimentally explored in detail. In addition, numerical simulations also show that variable power loading is an extremely effective means of escalating system performance to its maximum potential.
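
    Variable power loading assigns per-subcarrier power according to measured channel quality. One textbook realization is water-filling, sketched below with invented channel gains; the transceiver's on-line adaptive loading logic is not described in this abstract and is not reproduced here.

    ```python
    # Water-filling power allocation as one concrete form of variable power
    # loading; gains and budget are illustrative.
    def water_filling(gains, total_power=1.0, steps=60):
        # Bisect on the water level mu so allocated power matches the budget.
        lo, hi = 0.0, total_power + max(1.0 / g for g in gains)
        for _ in range(steps):
            mu = (lo + hi) / 2
            used = sum(max(0.0, mu - 1.0 / g) for g in gains)
            if used < total_power:
                lo = mu
            else:
                hi = mu
        return [max(0.0, mu - 1.0 / g) for g in gains]

    # Three tones with linear channel gains; the weakest tone gets no power.
    print([round(p, 3) for p in water_filling([2.0, 1.0, 0.25])])
    ```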

  1. Non-deterministic modelling of food-web dynamics.

    Directory of Open Access Journals (Sweden)

    Benjamin Planque

    Full Text Available A novel approach to model food-web dynamics, based on a combination of chance (randomness) and necessity (system constraints), was presented by Mullon et al. in 2009. Based on simulations for the Benguela ecosystem, they concluded that observed patterns of ecosystem variability may simply result from basic structural constraints within which the ecosystem functions. To date, and despite the importance of these conclusions, this work has received little attention. The objective of the present paper is to replicate this original model and evaluate the conclusions that were derived from its simulations. For this purpose, we revisit the equations and input parameters that form the structure of the original model and implement a comparable simulation model. We restate the model principles and provide a detailed account of the model structure, equations, and parameters. Our model can reproduce several ecosystem dynamic patterns: pseudo-cycles, variation and volatility, diet, stock-recruitment relationships, and correlations between species biomass series. The original conclusions are supported to a large extent by the current replication of the model. Model parameterisation and computational aspects remain difficult and these need to be investigated further. Hopefully, the present contribution will make this approach available to a larger research community and will promote the use of non-deterministic network-dynamics models as 'null models of food-webs', as originally advocated.
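
    In the spirit of the chance-and-necessity principle restated above, the sketch below draws random trophic fluxes (chance) and rejects proposals that violate a simple non-negativity constraint (necessity). Species names, rates, and the constraint set are invented; see the original model for the full formulation.

    ```python
    # Chance-and-necessity caricature: random fluxes, constraint-based
    # rejection. All values invented for illustration.
    import random

    def step(biomass, links, growth=1.25, cap=150.0, eff=0.2):
        new = dict(biomass)
        new["plankton"] = min(cap, new["plankton"] * growth)   # primary production
        for prey, pred in links:
            f = random.uniform(0, 1.2) * biomass[prey]         # chance: may overdraw
            new[prey] -= f
            new[pred] += eff * f                               # trophic transfer efficiency
        # Necessity: reject the proposal if any biomass would go negative.
        return new if all(b >= 0 for b in new.values()) else biomass

    b = {"plankton": 100.0, "sardine": 20.0, "hake": 5.0}
    for _ in range(100):
        b = step(b, [("plankton", "sardine"), ("sardine", "hake")])
    print({k: round(v, 1) for k, v in b.items()})
    ```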

  2. More Than Bar Codes: Integrating Global Standards-Based Bar Code Technology Into National Health Information Systems in Ethiopia and Pakistan to Increase End-to-End Supply Chain Visibility.

    Science.gov (United States)

    Hara, Liuichi; Guirguis, Ramy; Hummel, Keith; Villanueva, Monica

    2017-12-28

    The United Nations Population Fund (UNFPA) and the United States Agency for International Development (USAID) DELIVER PROJECT work together to strengthen public health commodity supply chains by standardizing bar coding under a single set of global standards. From 2015, UNFPA and USAID collaborated to pilot test how tracking and tracing of bar coded health products could be operationalized in the public health supply chains of Ethiopia and Pakistan and inform the ecosystem needed to begin full implementation. Pakistan had been using proprietary bar codes for inventory management of contraceptive supplies but transitioned to global standards-based bar codes during the pilot. The transition allowed Pakistan to leverage the original bar codes that were preprinted by global manufacturers as opposed to printing new bar codes at the central warehouse. However, barriers at lower service delivery levels prevented full realization of end-to-end data visibility. Key barriers at the district level were the lack of a digital inventory management system and absence of bar codes at the primary-level packaging level, such as single blister packs. The team in Ethiopia developed an open-sourced smartphone application that allowed the team to scan bar codes using the mobile phone's camera and to push the captured data to the country's data mart. Real-time tracking and tracing occurred from the central warehouse to the Addis Ababa distribution hub and to 2 health centers. These pilots demonstrated that standardized product identification and bar codes can significantly improve accuracy over manual stock counts while significantly streamlining the stock-taking process, resulting in efficiencies. The pilots also showed that bar coding technology by itself is not sufficient to ensure data visibility. Rather, by using global standards for identification and data capture of pharmaceuticals and medical devices, and integrating the data captured into national and global tracking systems
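
    Global standards-based bar codes of the kind piloted here encode data as GS1 Application Identifier (AI) element strings. The sketch below parses three common AIs — (01) GTIN-14, (17) expiry date YYMMDD, (10) batch/lot — from a scanned string; the sample data is illustrative, and real deployments use validated GS1 libraries.

    ```python
    # Minimal GS1 element-string parser for three common AIs. Variable-length
    # fields end at the FNC1 separator (ASCII GS, 0x1D) or end of data.
    GS = "\x1d"
    FIXED = {"01": 14, "17": 6}      # AI -> fixed payload length
    VARIABLE = {"10"}                # AI -> FNC1-terminated payload

    def parse_gs1(data: str) -> dict:
        out, i = {}, 0
        while i < len(data):
            ai = data[i:i + 2]; i += 2
            if ai in FIXED:
                out[ai], i = data[i:i + FIXED[ai]], i + FIXED[ai]
            elif ai in VARIABLE:
                end = data.find(GS, i)
                end = len(data) if end == -1 else end
                out[ai], i = data[i:end], end + 1
            else:
                raise ValueError(f"unsupported AI {ai!r}")
        return out

    print(parse_gs1("0109506000134352172012311012345AB"))
    # -> {'01': '09506000134352', '17': '201231', '10': '12345AB'}
    ```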

  3. Experimental demonstrations of record high REAM intensity modulator-enabled 19.25Gb/s real-time end-to-end dual-band optical OFDM colorless transmissions over 25km SSMF IMDD systems.

    Science.gov (United States)

    Zhang, Q W; Hugues-Salas, E; Giddings, R P; Wang, M; Tang, J M

    2013-04-08

    Record-high 19.25Gb/s real-time end-to-end dual-band optical OFDM (OOFDM) colorless transmissions across the entire C-band are experimentally demonstrated, for the first time, in reflective electro-absorption modulator (REAM)-based 25km standard SMF systems using intensity modulation and direct detection. Adaptively modulated baseband (0-2GHz) and passband (6.125 ± 2GHz) OFDM RF sub-bands, supporting signal line rates of 9.75Gb/s and 9.5Gb/s respectively, are independently generated and detected with FPGA-based DSP clocked at only 100MHz as well as DACs/ADCs operating at sampling speeds as low as 4GS/s. The two OFDM sub-bands are electrically multiplexed for intensity modulation of a single optical carrier by an 8GHz REAM. The REAM colorlessness is experimentally characterized, and from this the optimum REAM operating conditions are identified. To maximize and balance the signal transmission performance of each sub-band, on-line adaptive transceiver optimization functions and live performance monitoring are fully exploited to optimize key OOFDM transceiver and system parameters. For different wavelengths within the C-band, the corresponding minimum received optical powers at the FEC limit vary over a range of <0.5dB, and bit error rate performances for both baseband and passband signals are almost identical. Furthermore, the sensitivity of the maximum aggregated signal line rate to electrical sub-band power variation is investigated in detail; the system shows approximately 3dB tolerance to RF sub-band power variation.
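    The dual-band arrangement is easy to visualize numerically: one OFDM sub-band stays at baseband while a second is upconverted onto an RF carrier, and the sum drives the intensity modulator. The snippet below is a simplified illustration of that multiplexing (real-valued OFDM via a Hermitian-symmetric IFFT, simulated at 32 GS/s so the 6.125 GHz carrier is representable); it ignores cyclic prefixes, adaptive bit-loading and all the DSP/DAC details of the actual transceiver, and every number is chosen only for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def real_ofdm(n_sym, n_sc, n_fft):
    """Real-valued OFDM waveform via Hermitian-symmetric IFFT (no cyclic prefix)."""
    bits = rng.integers(0, 2, (n_sym, n_sc, 2))
    qpsk = ((2 * bits[..., 0] - 1) + 1j * (2 * bits[..., 1] - 1)) / np.sqrt(2)
    spec = np.zeros((n_sym, n_fft), complex)
    spec[:, 1:n_sc + 1] = qpsk
    spec[:, n_fft - n_sc:] = np.conj(qpsk[:, ::-1])   # Hermitian symmetry -> real output
    return np.fft.ifft(spec, axis=1).real.ravel()

fs = 32e9                                   # simulation rate (illustrative, not the 4 GS/s DACs)
n_fft = 512                                 # subcarrier spacing fs/n_fft = 62.5 MHz
base = real_ofdm(100, 32, n_fft)            # ~0-2 GHz baseband sub-band
rf = real_ofdm(100, 32, n_fft)
t = np.arange(base.size) / fs
drive = base + rf * np.cos(2 * np.pi * 6.125e9 * t)   # passband sub-band at 6.125 ± 2 GHz
# 'drive' is the electrically multiplexed signal that would feed the REAM.
```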

  4. SU-F-J-150: Development of An End-To-End Chain Test for the First-In-Man MR-Guided Treatments with the MRI Linear Accelerator by Using the Alderson Phantom

    Energy Technology Data Exchange (ETDEWEB)

    Hoogcarspel, S; Kerkmeijer, L; Lagendijk, J; Van Vulpen, M; Raaymakers, B [University Medical Center Utrecht, Utrecht, Utrecht (Netherlands)

    2016-06-15

    The Alderson phantom is a human-shaped quality assurance tool that has been used for over 30 years in radiotherapy. The phantom can provide integrated tests of the entire chain of treatment planning and delivery. The purpose of this research was to investigate whether this phantom can be used to chain test a treatment on the MRI linear accelerator (MRL), which is currently being developed at the UMC Utrecht in collaboration with Elekta and Philips. This was demonstrated by chain testing the future First-in-Man treatments with this system. An Alderson phantom was used to chain test an entire treatment with the MRL. First, a CT was acquired of the phantom with additional markers that are visible on both MR and CT. A treatment plan for treating bone metastases in the sacrum was made. The phantom was subsequently placed in the MRL. For MR imaging, a 3D volume was acquired. The initially developed treatment plan was then simulated on the new MRI dataset. For simulation, both the MR and CT data were used by registering them together. Before treatment delivery an MV image was acquired and compared with a DRR that was calculated from the MR/CT registration data. Finally, the treatment was delivered. Figure 1 shows both the T1-weighted MR image of the phantom and the CT that was registered to the MR image. Figure 2 shows both the calculated and measured MV images acquired by the MV panel. Figure 3 shows the simulated dose distribution. The total elapsed time for the entire procedure, excluding irradiation, was 13:35 minutes. The Alderson phantom yields sufficient MR contrast and can be used for full MR-guided radiotherapy treatment chain testing. As a result, we are able to perform an end-to-end chain test of the future First-in-Man treatments.

  5. Rearrangement of potassium ions and Kv1.1/Kv1.2 potassium channels in regenerating axons following end-to-end neurorrhaphy: ionic images from TOF-SIMS.

    Science.gov (United States)

    Liu, Chiung-Hui; Chang, Hung-Ming; Wu, Tsung-Huan; Chen, Li-You; Yang, Yin-Shuo; Tseng, To-Jung; Liao, Wen-Chieh

    2017-10-01

    The voltage-gated potassium channels Kv1.1 and Kv1.2 that cluster at juxtaparanodal (JXP) regions are essential in the regulation of nerve excitability and play a critical role in axonal conduction. When demyelination occurs, Kv1.1/Kv1.2 activity increases, suppressing the membrane potential nearly to the equilibrium potential of K+, which results in an axonal conduction blockade. The recovery of K+-dependent communication signals and proper clustering of Kv1.1/Kv1.2 channels at JXP regions may directly reflect nerve regeneration following peripheral nerve injury. However, little is known about potassium channel expression and its relationship with the dynamic potassium ion distribution at the node of Ranvier during the regenerative process of peripheral nerve injury (PNI). In the present study, end-to-end neurorrhaphy (EEN) was performed using an in vivo model of PNI. The distribution of K+ at regenerating axons following EEN was detected by time-of-flight secondary-ion mass spectrometry. The specific localization and expression of Kv1.1/Kv1.2 channels were examined by confocal microscopy and western blotting. Our data showed that the re-establishment of K+ distribution and intensity was correlated with the functional recovery of compound muscle action potential morphology in EEN rats. Furthermore, the re-clustering of Kv1.1/1.2 channels 1 and 3 months after EEN at the nodal region of the regenerating nerve corresponded to changes in the K+ distribution. This study provided direct evidence of K+ distribution in regenerating axons for the first time. We proposed that the Kv1.1/Kv1.2 channels re-clustered at the JXP regions of regenerating axons are essential for modulating the proper patterns of K+ distribution in axons for maintaining membrane potential stability after EEN.

  6. Food-web stability signals critical transitions in temperate shallow lakes.

    Science.gov (United States)

    Kuiper, Jan J; van Altena, Cassandra; de Ruiter, Peter C; van Gerven, Luuk P A; Janse, Jan H; Mooij, Wolf M

    2015-07-15

    A principal aim of ecologists is to identify critical levels of environmental change beyond which ecosystems undergo radical shifts in their functioning. Both food-web theory and alternative stable states theory provide fundamental clues to mechanisms conferring stability to natural systems. Yet, it is unclear how the concept of food-web stability is associated with the resilience of ecosystems susceptible to regime change. Here, we use a combination of food web and ecosystem modelling to show that impending catastrophic shifts in shallow lakes are preceded by a destabilizing reorganization of interaction strengths in the aquatic food web. Analysis of the intricate web of trophic interactions reveals that only few key interactions, involving zooplankton, diatoms and detritus, dictate the deterioration of food-web stability. Our study exposes a tight link between food-web dynamics and the dynamics of the whole ecosystem, implying that trophic organization may serve as an empirical indicator of ecosystem resilience.
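    Destabilization of the kind described here is typically diagnosed from the interaction-strength matrix: the web is locally stable while the leading eigenvalue of the Jacobian has negative real part, and stability erodes as that value approaches zero. The sketch below illustrates the diagnostic on a random matrix; it is a generic toy, not the authors' PCLake-derived food web, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 12
J = rng.normal(0.0, 0.2, (n, n))        # hypothetical interaction-strength (Jacobian) matrix
np.fill_diagonal(J, -1.0)               # self-regulation on the diagonal

def leading_real_part(J):
    """Largest real part of the eigenvalues; negative means locally stable."""
    return np.max(np.linalg.eigvals(J).real)

print("baseline stability:", leading_real_part(J))
J2 = J.copy()
J2[0, 1] *= 4.0                         # intensify one key interaction (e.g. zooplankton-diatom)
J2[1, 0] *= 4.0
print("after reorganization:", leading_real_part(J2))
```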

  7. Study and Implementation of the End-to-End Data Pipeline for the Virtis Imaging Spectrometer Onboard Venus Express: "From Science Operations Planning to Data Archiving and Higher Level Processing"

    Science.gov (United States)

    Cardesín Moinelo, Alejandro

    2010-04-01

    This PhD Thesis describes the activities performed during the Research Program undertaken for two years at the Istituto Nazionale di AstroFisica in Rome, Italy, as an active member of the VIRTIS Technical and Scientific Team, and one additional year at the European Space Astronomy Center in Madrid, Spain, as a member of the Mars Express Science Ground Segment. This document presents a study of all sections of the Science Ground Segment of the Venus Express mission, from the planning of the scientific operations to the generation, calibration and archiving of the science data, including the production of valuable high level products. We present and discuss the end-to-end diagram of the ground segment from the technical and scientific point of view, in order to describe the overall flow of information: from the original scientific requests of the principal investigator and interdisciplinary teams, up to the spacecraft, and down again for the analysis of the measurements and interpretation of the scientific results. These scientific results drive new and more elaborate scientific requests, which are used as feedback to the planning cycle, closing the circle. Special attention is given to the implementation and development of the data pipeline for the VIRTIS instrument onboard Venus Express. During the research program, both the raw data generation pipeline and the data calibration pipeline were developed and automated in order to produce the final raw and calibrated data products from the input telemetry of the instrument. The final raw and calibrated products presented in this work are currently being used by the VIRTIS Science Team for data analysis and are distributed to the whole scientific community via the Planetary Science Archive. More than 20,000 raw data files and 10,000 calibrated products have already been generated after almost 4 years of mission. In the final part of the Thesis, we also present some high level data products.

  8. TOWARD END-TO-END MODELING FOR NUCLEAR EXPLOSION MONITORING: SIMULATION OF UNDERGROUND NUCLEAR EXPLOSIONS AND EARTHQUAKES USING HYDRODYNAMIC AND ANELASTIC SIMULATIONS, HIGH-PERFORMANCE COMPUTING AND THREE-DIMENSIONAL EARTH MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Rodgers, A; Vorobiev, O; Petersson, A; Sjogreen, B

    2009-07-06

    This paper describes new research being performed to improve understanding of seismic waves generated by underground nuclear explosions (UNE) by using full waveform simulation, high-performance computing and three-dimensional (3D) earth models. The goal of this effort is to develop an end-to-end modeling capability to cover the range of wave propagation required for nuclear explosion monitoring (NEM), from the buried nuclear device to the seismic sensor, and thereby to improve understanding of the physical basis and prediction capabilities of seismic observables for NEM, including source and path-propagation effects. We are pursuing research along three main thrusts. Firstly, we are modeling the non-linear hydrodynamic response of geologic materials to underground explosions in order to better understand how source emplacement conditions impact the seismic waves that emerge from the source region and are ultimately observed hundreds or thousands of kilometers away. Empirical evidence shows that the amplitudes and frequency content of seismic waves at all distances are strongly impacted by the physical properties of the source region (e.g. density, strength, porosity). To model the near-source shock-wave motions of a UNE, we use GEODYN, an Eulerian Godunov (finite volume) code incorporating thermodynamically consistent non-linear constitutive relations, including cavity formation, yielding, porous compaction, tensile failure, bulking and damage. In order to propagate motions to seismic distances we are developing a one-way coupling method to pass motions to WPP (a Cartesian anelastic finite difference code). Preliminary investigations of UNEs in canonical materials (granite, tuff and alluvium) confirm that emplacement conditions have a strong effect on seismic amplitudes and the generation of shear waves. Specifically, we find that motions from an explosion in high-strength, low-porosity granite have high compressional wave amplitudes and weak shear waves.

  9. Indiscriminate Fisheries: Understanding the Foodweb of the Great Tonle Sap Lake, Cambodia

    Science.gov (United States)

    Hannah, L.; Kaufman, L.

    2014-12-01

    Indiscriminate fisheries target multiple species with multiple gear types. In contrast to well-studied, industrialized single-species, single-gear fisheries, little theory exists for indiscriminate fisheries, and the literature on practice is small but growing. Indiscriminate fisheries are disproportionately important in low-income countries, providing most of the animal protein intake in countries such as Cambodia and Bangladesh. Indiscriminate fisheries may be either freshwater or marine, but here we focus on what may be the largest freshwater indiscriminate fishery in the world. Cambodia's freshwater fishery stands out because it provides the majority of animal protein to over 3 million people living in poverty. The fishery of the Tonle Sap lake is one of the largest, if not the largest, contributors to this freshwater fish take, and is perhaps the largest freshwater fishery in the world. In contrast to its importance, very little is known about the foodweb ecology of this system, or how community management, which now governs the entire fishery, interacts with biological and physical factors such as climate change. The foodweb of the Tonle Sap has changed dramatically due to high fishing pressure. A system that once harbored giant catfish, barbs and stingrays is now dominated by fish under 20cm in length. The simplification of the system may not have reduced its productivity. Theory of indiscriminate fisheries suggests that r-selected species may be favored and that biomass available for harvest may be maximized, while the system becomes more sensitive to environmental fluctuations such as climate change due to food web simplification. The r-selection and size predictions of theory have been confirmed by observations of the Tonle Sap. Early model results suggest sensitivity to environmental stochasticity. The interaction of these ecological changes with social systems will be tested in the Tonle Sap. Fisheries management across the lake has been transferred to community management

  10. Ecological-network models link diversity, structure and function in the plankton food-web

    Science.gov (United States)

    D’Alelio, Domenico; Libralato, Simone; Wyatt, Timothy; Ribera d’Alcalà, Maurizio

    2016-01-01

    A planktonic food-web model including sixty-three functional nodes (representing auto-, mixo- and heterotrophs) was developed to integrate most of the trophic diversity present in the plankton. The model was implemented in two variants - which we named ‘green’ and ‘blue’ - characterized by opposite amounts of phytoplankton biomass and representing, respectively, bloom and non-bloom states of the system. The taxonomically disaggregated food-webs described herein allowed us to shed light on how components of the plankton community changed their trophic behavior in the two different conditions and modified the overall functioning of the plankton food web. The green and blue food-webs showed distinct organizations in terms of the trophic roles of the nodes and the carbon fluxes between them. Such reorganization stemmed from switches in selective grazing by both metazoan and protozoan consumers. Switches in food-web structure resulted in relatively small differences in the efficiency of material transfer towards higher trophic levels. For instance, from green to blue states, a seven-fold decrease in phytoplankton biomass translated into only a two-fold decrease in potential planktivorous fish biomass. By linking diversity, structure and function in the plankton food-web, we discuss the role of internal mechanisms, relying on species-specific functionalities, in driving the ‘adaptive’ responses of plankton communities to perturbations. PMID:26883643

  11. Food-web models predict species abundances in response to habitat change.

    Directory of Open Access Journals (Sweden)

    Nicholas J Gotelli

    2006-10-01

    Plant and animal population sizes inevitably change following habitat loss, but the mechanisms underlying these changes are poorly understood. We experimentally altered habitat volume and eliminated top trophic levels of the food web of invertebrates that inhabit rain-filled leaves of the carnivorous pitcher plant Sarracenia purpurea. Path models that incorporated food-web structure better predicted population sizes of food-web constituents than did simple keystone species models, models that included only autecological responses to habitat volume, or models including both food-web structure and habitat volume. These results provide the first experimental confirmation that trophic structure can determine species abundances in the face of habitat loss.

  12. Food-web structure in the hypertrophic Rietvlei Dam based on ...

    African Journals Online (AJOL)

    2013-08-06

    ... nektonic (fish) food-web components, collected from 3 to 7 shallow inshore locations (with additional plankton samples at 1 or 2 deep offshore sites) in Rietvlei Dam over a period of 30 months. The resulting δ13C values did not indicate significant consumption of zooplankton by fish, while the δ15N values ...

  13. Food-web stability signals critical transitions in temperate shallow lakes

    NARCIS (Netherlands)

    Kuiper, Jan J.; van Altena, Cassandra; de Ruiter, P.C.; Van Gerven, Luuk P.A.; Janse, Jan H.; Mooij, Wolf M.

    2015-01-01

    A principal aim of ecologists is to identify critical levels of environmental change beyond which ecosystems undergo radical shifts in their functioning. Both food-web theory and alternative stable states theory provide fundamental clues to mechanisms conferring stability to natural systems. Yet, it

  14. Contribution by microbes to the foodweb of a mangrove biotope: the ...

    African Journals Online (AJOL)

    The contribution of mangroves as a source of nutrients to the foodweb of fish is under debate worldwide. An analysis of stable isotopes of carbon and nitrogen in various consumers and producers of a mangrove biotope in southern India revealed that the microbes associated with mangrove sediment contribute significantly ...

  15. Modelling the impacts of semi-intensive aquaculture on the foodweb ...

    African Journals Online (AJOL)

    Nutrient loadings are an important component of aquaculture impacts as they can lead to cascade effects at the ecosystem level. An evaluation of these effects on foodweb functioning is presented and discussed for the case study of Lake Burullus in the Nile Delta, Egypt, where semi-intensive aquaculture in earthen ponds ...

  16. Food-web structure in the hypertrophic Rietvlei Dam based on ...

    African Journals Online (AJOL)

    ... major planktonic (phytoplankton, zooplankton), benthic (submerged macrophytes and associated epiphytes, benthic macro-invertebrates) and nektonic (fish) food-web components, collected from 3 to 7 shallow inshore locations (with additional plankton samples at 1 or 2 deep offshore sites) in Rietvlei Dam over a period ...

  17. End-to-End Fault Tolerance Using Transport Layer Multihoming

    Science.gov (United States)

    2005-01-01

    2.1 Simulation network topology with cross-traffic, congestion-based loss, and no failures ... aggressive failovers that reduce stalls during network congestion and failure events. ... 1.1 Problem Statement: routes can take a long time to converge after a link failure is detected; Labovitz et al. [72] show that the Internet's interdomain routers may take ...

  18. End-to-End Service Oriented Architectures (SOA) Security Project

    Science.gov (United States)

    2012-02-01

    ... in Proceedings of IFIP International Information Security (SEC 2011) Conference, June 2011, Lucerne, Switzerland (P. Angin, B. Bhargava, et al.). ... The coolkey download is available online [CAC3]. Below is the link used to download the DoD Root CA certificates; the procedure to install these certificates is given in the link. Install all the ...

  19. End-to-end visual speech recognition with LSTMs

    NARCIS (Netherlands)

    Petridis, Stavros; Li, Zuwei; Pantic, Maja

    2017-01-01

    Traditional visual speech recognition systems consist of two stages, feature extraction and classification. Recently, several deep learning approaches have been presented which automatically extract features from the mouth images and aim to replace the feature extraction stage. However, research on

  20. End-to-end experiment management in HPC

    Energy Technology Data Exchange (ETDEWEB)

    Bent, John M [Los Alamos National Laboratory; Kroiss, Ryan R [Los Alamos National Laboratory; Torrez, Alfred [Los Alamos National Laboratory; Wingate, Meghan [Los Alamos National Laboratory

    2010-01-01

    Experiment management in any domain is challenging. There is a perpetual feedback loop cycling through planning, execution, measurement, and analysis. The lifetime of a particular experiment can be limited to a single cycle, although many require myriad more cycles before definitive results can be obtained. Within each cycle, a large number of subexperiments may be executed in order to measure the effects of one or more independent variables. Experiment management in high performance computing (HPC) follows this general pattern but also has three unique characteristics. One, computational science applications running on large supercomputers must deal with frequent platform failures which can interrupt, perturb, or terminate running experiments. Two, these applications typically run in parallel using MPI as their communication medium. Three, there is typically a scheduling system (e.g. Condor, Moab, SGE, etc.) acting as a gate-keeper for the HPC resources. In this paper, we introduce LANL Experiment Management (LEM), an experiment management framework simplifying all four phases of experiment management. LEM simplifies experiment planning by allowing the user to describe their experimental goals without having to fully construct the individual parameters for each task. To simplify execution, LEM dispatches the subexperiments itself, thereby freeing the user from remembering the often arcane methods for interacting with the various scheduling systems. LEM provides transducers for experiments that automatically measure and record important information about each subexperiment; these transducers can easily be extended to collect additional measurements specific to each experiment. Finally, experiment analysis is simplified by providing a general database visualization framework that allows users to quickly and easily interact with their measured data.
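    The pattern the abstract describes (declare goals, expand them into subexperiments, dispatch, record transducer measurements) can be sketched generically. The snippet below is a hypothetical illustration of that workflow in Python, not LEM's actual interface; the parameter names and the echo stand-in for a scheduler submission are invented.

```python
import itertools, json, subprocess, time

# Hypothetical plan: goals declared as parameter ranges, not enumerated tasks.
space = {"nodes": [1, 2, 4], "stripe_size": ["64k", "1m"]}

results = []
for combo in itertools.product(*space.values()):
    params = dict(zip(space, combo))
    t0 = time.time()
    # Stand-in for submission to a scheduler (Condor/Moab/SGE in the abstract).
    subprocess.run(["echo", json.dumps(params)], check=True)
    # Transducer-style record: a measurement attached automatically to each subexperiment.
    results.append({**params, "wall_seconds": round(time.time() - t0, 4)})

print(json.dumps(results, indent=2))
```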

  1. CMDS System Integration and IAMD End-to-End Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Cruise Missile Defense Systems (CMDS) Project Office is establishing a secure System Integration Laboratory at the AMRDEC. This lab will contain tactical Signal...

  2. End-to-End Multi-View Lipreading

    NARCIS (Netherlands)

    Petridis, Stavros; Wang, Yujiang; Li, Zuwei; Pantic, Maja

    2017-01-01

    Non-frontal lip views contain useful information which can be used to enhance the performance of frontal view lipreading. However, the vast majority of recent lipreading works, including the deep learning approaches which significantly outperform traditional approaches, have focused on frontal mouth

  3. END-TO-END INDIA-UK TRANSNATIONAL WIRELESS TESTBED

    National Research Council Canada - National Science Library

    Budhiraja, Rohit; Ramamurthi, Bhaskar; Narayanan, Babu; A, Oredope

    2011-01-01

    .... The India-UK Advanced Technology Centre initiative is a collaborative research project between various institutes and companies across UK and India, which envisages, apart from several research...

  4. End to End Inter-domain Quality of Service Provisioning

    DEFF Research Database (Denmark)

    Brewka, Lukasz Jerzy

    This thesis addresses selected topics of Quality of Service (QoS) provisioning in heterogeneous data networks that construct the communication environment of today's Internet. In the vast range of protocols available in different domains of network infrastructures, a few chosen ones are discussed ... and their key QoS features are analysed. This thesis mainly focuses on home and access networks, and their interaction. Considering home networks, UPnP-QoS Architecture was chosen in order to analyse the possibilities of QoS provisioning at users' premises using service oriented architectures. First ...

  5. End to End Beam Dynamics of the ESS Linac

    DEFF Research Database (Denmark)

    Thomsen, Heine Dølrath

    2012-01-01

    The European Spallation Source, ESS, uses a linear accelerator to deliver a high intensity proton beam to the target station. The nominal beam power on target will be 5 MW at an energy of 2.5 GeV. We briefly describe the individual accelerating structures and transport lines through which we have carried out multiparticle beam dynamics simulations. We will present a review of the beam dynamics from the source to the target.

  6. End-to-End Privacy for Open Big Data Markets

    OpenAIRE

    Perera, Charith; Ranjan, Rajiv; Wang, Lizhe

    2015-01-01

    The idea of an open data market envisions the creation of a data trading model to facilitate exchange of data between different parties in the Internet of Things (IoT) domain. The data collected by IoT products and solutions are expected to be traded in these markets. Data owners will collect data using IoT products and solutions. Data consumers who are interested will negotiate with the data owners to get access to such data. Data captured by IoT products will allow data consumers to further...

  7. 40 Gigabit ethernet: prototyping transparent end-to-end connectivity

    NARCIS (Netherlands)

    Dumitru, C.; Koning, R.; de Laat, C.

    2011-01-01

    The ever increasing demands of data intensive eScience applications have pushed the limits of computer networks. With the launch of the new 40 Gigabit Ethernet (40GE) standard, 802.3ba, applications can go beyond the common 10 Gigabit/s per data stream barrier for both local area, and as

  8. Evidence of butyltin biomagnification along the Northern Adriatic food-web (Mediterranean Sea) elucidated by stable isotope ratios.

    Science.gov (United States)

    Fortibuoni, Tomaso; Noventa, Seta; Rampazzo, Federico; Gion, Claudia; Formalewicz, Malgorzata; Berto, Daniela; Raicevich, Saša

    2013-04-02

    The biomagnification of tributyltin (TBT), dibutyltin (DBT), monobutyltin (MBT), and total butyltins (ΣBT) was analyzed in the Northern Adriatic food-web (Mediterranean), considering trophodynamic interactions among species and carbon sources in the food-web. Although it is acknowledged that these contaminants bioaccumulate in marine organisms, it is still controversial whether they biomagnify along food-webs. A wide range of species was considered, from plankton feeders to top predators, whose trophic level (TL) was assessed by measuring the biological enrichment of nitrogen stable isotopes (δ(15)N). The carbon isotopic signature (δ(13)C) was used to trace carbon sources in the food-web (terrestrial vs marine). At least one butyltin species was detected in the majority of samples, and TBT was the predominant contaminant. A significant positive relationship was found between TL and butyltin concentrations, implying food-web biomagnification. Consistently, the Trophic Magnification Factor was higher than 1, ranging from 3.88 for ΣBT to 4.62 for DBT. A negative but non-significant correlation was instead found between δ(13)C and butyltin concentrations, indicating a slightly decreasing gradient of contaminant concentrations with increasing coastal influence as a carbon source in the diet. However, trophodynamic mechanisms are likely the more important factors in determining butyltin distribution in the Northern Adriatic food-web.
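    Both quantities used here have standard formulations: trophic level is inferred from δ15N with an assumed per-step enrichment (3.4 per mille is the conventional value), and the Trophic Magnification Factor is 10 raised to the slope of log10(concentration) regressed on trophic level, so TMF > 1 implies biomagnification. A worked Python sketch with invented numbers, not the study's data:

```python
import numpy as np

d15n = np.array([6.1, 8.9, 11.5, 14.2])   # hypothetical consumer signatures (per mille)
d15n_base, enrich = 5.0, 3.4              # assumed baseline and per-step enrichment
conc = np.array([2.0, 9.5, 31.0, 140.0])  # hypothetical butyltin concentrations (ng/g)

tl = 2.0 + (d15n - d15n_base) / enrich    # trophic level inferred from d15N
slope, _ = np.polyfit(tl, np.log10(conc), 1)
tmf = 10 ** slope                          # TMF > 1 implies biomagnification
print(f"TMF = {tmf:.2f}")
```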

  9. Visualizing the Food-Web Effects of Fishing for Tunas in the Pacific Ocean

    OpenAIRE

    Hinke, Jefferson T.; Kaplan, Isaac C.; Kerim Aydin; Watters, George M.; Olson, Robert J.; James F. K. Kitchell

    2004-01-01

    We use food-web models to develop visualizations to compare and evaluate the interactions of tuna fisheries with their supporting food webs in the eastern tropical Pacific (ETP) and the central north Pacific (CNP) Oceans. In the ETP and CNP models, individual fisheries use slightly different food webs that are defined by the assemblage of targeted tuna species. Distinct energy pathways are required to support different tuna species and, consequently, the specific fisheries that target differe...

  10. Plankton food-webs: to what extent can they be simplified?

    Directory of Open Access Journals (Sweden)

    Domenico D'Alelio

    2016-05-01

    Plankton is a hugely diverse community including both unicellular and multicellular organisms, whose individual dimensions span over seven orders of magnitude. Plankton is a fundamental part of biogeochemical cycles and food-webs in aquatic systems. While knowledge has progressively accumulated at the level of single species and single trophic processes, the overwhelming biological diversity of plankton interactions is insufficiently known and a coherent and unifying trophic framework is virtually lacking. We performed an extensive review of the plankton literature to provide a compilation of data suitable for implementing food-web models including plankton trophic processes at high taxonomic resolution. We identified the components of the plankton community at the Long Term Ecological Research Station MareChiara in the Gulf of Naples. These components represented the sixty-three nodes of a plankton food-web. To each node we attributed biomass and vital rates, i.e. production, consumption and assimilation rates and the ratio between autotrophy and heterotrophy in mixotrophic protists. Biomass and rate values were defined for two opposite system conditions: relatively eutrophic and oligotrophic states. We finally identified 817 possible trophic links within the web and provided each of them with a relative weight, in order to define a diet matrix, valid for both trophic states, which included all consumers, from nanoflagellates to carnivorous plankton. Vital rates for plankton spanned, as expected, very wide ranges; this strongly contrasts with the narrow ranges considered in plankton system models implemented so far. Moreover, the amount and variety of trophic links highlighted by our review is largely excluded by state-of-the-art biogeochemical and food-web models for aquatic systems. Plankton models could potentially benefit from the integration of the trophic diversity outlined in this paper: first, by using more realistic rates; second, by better

  11. Macroalgal detritus and food-web subsidies along an Arctic fjord depth-gradient

    Directory of Open Access Journals (Sweden)

    Paul E Renaud

    2015-06-01

    Tight coupling between pelagic and benthic communities is accepted as a general principle on Arctic shelves. Whereas this paradigm has been useful for guiding ecological research, it has perhaps led to a disproportionate focus on POM and ice algae as the most likely sources of carbon for the benthic food web. Arctic shelves are complex systems, including banks, fjords, and trough systems up to 350 m or more in depth. In this stable-isotope study, thirteen different potential carbon sources were analysed for their contribution to the food-webs of Isfjorden, Svalbard. A mixing model with herbivorous copepods and grazing sea urchins as end-members was applied to determine the relative contributions of the most likely carbon sources to pelagic and benthic taxa. Most taxa from the benthos feed on a broad mixture of POM and macroalgal detritus, even at depths down to 410 m. Most suspension-feeding bivalves had isotopic signals consistent with more than a 50% contribution from kelps and rockweeds. In contrast, nearly all pelagic species had diets consistent with an overwhelming contribution of pelagic POM. These results indicate that macroalgal detritus can contribute significantly to near-shore Arctic food-webs, a trophic link that may increase if macroalgae increase in the Arctic as predicted. These weaker quantitative links between pelagic and benthic components of coastal systems highlight the need for thorough sampling of potential carbon-baselines in food-web studies. A large detrital-carbon component in diets of Arctic benthos may dampen the impacts of strong seasonality in polar primary producers, leading to higher ecosystem resilience, but may also result in lower secondary productivity.
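    With herbivorous copepods anchoring the pelagic end-member and grazing sea urchins the macroalgal one, a two end-member mixing model reduces to a one-line calculation per isotope. The sketch below uses hypothetical δ13C end-member values, not the study's measurements:

```python
def macroalgal_fraction(d13c_consumer, d13c_pelagic=-24.0, d13c_macroalgal=-17.0):
    """Two end-member d13C mixing: fraction of macroalgal carbon in a consumer's diet.
    The default end-member values are hypothetical placeholders, not the study's data."""
    f = (d13c_consumer - d13c_pelagic) / (d13c_macroalgal - d13c_pelagic)
    return min(max(f, 0.0), 1.0)   # clamp to the physically meaningful range [0, 1]

print(macroalgal_fraction(-20.2))  # ~0.54, i.e. a >50% macroalgal contribution
```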

  12. A foodweb model to explore uncertainties in the South Georgia shelf pelagic ecosystem

    Science.gov (United States)

    Hill, Simeon L.; Keeble, Kathryn; Atkinson, Angus; Murphy, Eugene J.

    2012-01-01

    Foodweb models provide a useful framework for compiling data on biomass, production, consumption and feeding relationships. They are particularly useful for identifying gaps and inconsistencies in the data, and for exploring plausible scenarios of change. We compiled data on the pelagic foodweb of the South Georgia shelf, which is one of the most intensively studied areas in the Southern Ocean. The data suggest that current average annual copepod production is three times that of Antarctic krill and that flying seabirds and fish are, respectively, responsible for 25% and 21% of local krill consumption. The most striking inconsistency was that estimated consumption of fish was 5 times their estimated production. We developed a static mass balance model of the foodweb representing one of many possible solutions to the inconsistencies in the data. The model included sufficient fish biomass to balance the original consumption estimate, and consequently fish became the main krill consumers. Nonetheless, only 74% of local krill production was consumed by predators, suggesting that there are additional mortality sources that we did not explicitly model. We developed further models to explore scenarios incorporating plausible climate-driven reductions in krill biomass. In scenarios with unchanged predator diets, an 80% reduction in krill biomass resulted in a 73% reduction in vertebrate biomass. However, when predators with diverse diets were able to switch to feeding on alternative zooplankton prey, total vertebrate biomass was maintained at current levels. Scenarios in which 80% of krill biomass was replaced with copepod biomass required 28% more primary production because the estimated consumption rate of copepods is higher than that of krill. The additional copepod biomass did not alter the consequences for vertebrates. These scenarios illustrate the wide range of potential consequences of a shift from a krill- to a copepod-dominated system in a warming climate. They
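    A static mass-balance model of this kind typically balances, for every group $i$, production against predation mortality, catches and other losses. The paper does not spell out its exact formulation, but assuming the standard Ecopath-style master equation, the balance reads

    \[ B_i \left(\tfrac{P}{B}\right)_i EE_i \;=\; \sum_j B_j \left(\tfrac{Q}{B}\right)_j DC_{ji} + Y_i, \]

    where $B$ is biomass, $P/B$ the production rate, $EE_i$ the ecotrophic efficiency (the fraction of production used within the system), $Q/B$ the consumption rate, $DC_{ji}$ the fraction of prey $i$ in the diet of predator $j$, and $Y_i$ the fishery catch. The inconsistency reported above (fish consumption five times fish production) means the right-hand side far exceeded what the left-hand side could supply for the fish group, which is why extra fish biomass had to be added to balance the model.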

  13. Adaptive behaviour, tri-trophic food-web stability and damping of chaos

    OpenAIRE

    Visser, André W.; Mariani, Patrizio; Pigolotti, Simone

    2011-01-01

    We examine the effect of adaptive foraging behaviour within a tri-trophic food web with intra-guild predation. The intra-guild prey is allowed to adjust its foraging effort so as to achieve an optimal per capita growth rate in the face of realized feeding, predation risk and foraging cost. Adaptive fitness-seeking behaviour of the intra-guild prey has a stabilizing effect on the tri-trophic food-web dynamics provided that (i) a finite optimal foraging effort exists and (ii) the trophic transf...

  14. Quantifying foodweb interactions with simultaneous linear equations: Stable isotope models of the Truckee River, USA

    Science.gov (United States)

    Saito, L.; Redd, C.; Chandra, S.; Atwell, L.; Fritsen, C.H.; Rosen, Michael R.

    2007-01-01

    Aquatic foodweb models for 2 seasons (relatively high- [March] and low-flow [August] conditions) were constructed for 4 reaches on the Truckee River using δ13C and δ15N data from periphyton, macroinvertebrate, and fish samples collected in 2003 and 2004. The models were constructed with isotope values that included measured periphyton signatures and calculated mean isotope values for detritus and seston as basal food sources of each food web. The pseudo-optimization function in Excel's Solver module was used to minimize the sum of squared error between predicted and observed stable-isotope values while simultaneously solving for diet proportions for all foodweb consumers and estimating δ13C and δ15N trophic enrichment factors. This approach used an underdetermined set of simultaneous linear equations and was tested by running the pseudo-optimization procedure for 500 randomly selected sets of initial conditions. Estimated diet proportions had average standard deviations (SDs) of 0.03 to 0.04, and SDs of the estimated trophic enrichment factors were similarly small. Some basal sources were effectively dead ends because they generally were not consumed. Predatory macroinvertebrate diets varied along the river and affected estimated trophic positions of fish that consumed them. Differences in complexity and composition of the food webs appeared to be related to season, but could also have been caused by interactions with nonnative species, especially invasive crayfish. © 2007 by The North American Benthological Society.
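    The core of this method is a set of mixing equations per consumer: for each isotope, the consumer's signature equals the diet-weighted average of its enrichment-corrected sources, with proportions summing to one. The sketch below solves one such consumer with SciPy instead of Excel's Solver; the source signatures, enrichment factors and consumer value are invented for illustration.

```python
import numpy as np
from scipy.optimize import lsq_linear

# Hypothetical source signatures (rows: sources; columns: d13C, d15N)
sources = np.array([
    [-30.0, 2.0],   # detritus
    [-24.0, 4.5],   # periphyton
    [-26.5, 6.0],   # seston
])
tef = np.array([0.4, 3.4])         # assumed trophic enrichment factors (d13C, d15N)
consumer = np.array([-26.0, 7.8])  # observed consumer signature (invented)

# Mixing equations: sum_i p_i * (source_i + tef) = consumer, with sum_i p_i = 1
A = np.vstack([(sources + tef).T, np.ones(len(sources))])
b = np.append(consumer, 1.0)
res = lsq_linear(A, b, bounds=(0, 1))  # proportions constrained to [0, 1]
print("diet proportions:", res.x.round(3))  # ~[0.25, 0.39, 0.35]
```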

  15. A framework for evaluating food-web responses to hydrological manipulations in riverine systems.

    Science.gov (United States)

    Rolls, Robert J; Baldwin, Darren S; Bond, Nick R; Lester, Rebecca E; Robson, Barbara J; Ryder, Daren S; Thompson, Ross M; Watson, Garth A

    2017-12-01

    Environmental flows are used to restore elements of the hydrological regime altered by human use of water. One of the primary justifications and purposes for environmental flows is the maintenance of target species populations but, paradoxically, there has been little emphasis on incorporating the food-web and trophic dynamics that determine population-level responses into the monitoring and evaluation of environmental flow programs. We develop a generic framework for incorporating trophic dynamics into monitoring programs to identify the food-web linkages between hydrological regimes and population-level objectives of environmental flows. These linkages form the basis for objective setting, ecological targets and indicator selection that are necessary for planning monitoring programs with a rigorous scientific basis. Because there are multiple facets of trophic dynamics that influence energy production and transfer through food webs, the specific objectives of environmental flows need to be defined during the development of monitoring programs. A multitude of analytical methods exist that each quantify distinct aspects of food webs (e.g. energy production, prey selection, energy assimilation), but no single method can provide a basis for holistic understanding of food webs. Our paper critiques a range of analytical methods for quantifying attributes of food webs to inform the setting, monitoring and evaluation of trophic outcomes of environmental flows and advance the conceptual understanding of trophic dynamics in river-floodplain systems. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  16. Food-web structure and ecosystem services: insights from the Serengeti.

    Science.gov (United States)

    Dobson, Andy

    2009-06-27

    The central organizing theme of this paper is to discuss the dynamics of the Serengeti grassland ecosystem from the perspective of recent developments in food-web theory. The seasonal rainfall patterns that characterize the East African climate create an annually oscillating, large-scale, spatial mosaic of feeding opportunities for the larger ungulates in the Serengeti; this in turn creates a significant annual variation in the food available for their predators. At a smaller spatial scale, periodic fires during the dry season create patches of highly nutritious grazing that are eaten in preference to the surrounding older patches of less palatable vegetation. The species interactions between herbivores and plants, and carnivores and herbivores, are hierarchically nested in the Serengeti food web, with the largest bodied consumers on each trophic level having the broadest diets that include species from a large variety of different habitats in the ecosystem. The different major habitats of the Serengeti are also used in a nested fashion; the highly nutritious forage of the short grass plains is available only to the larger migratory species for a few months each year. The longer grass areas, the woodlands and kopjes (large partially wooded rocky islands in the surrounding mosaic of grassland) contain species that are resident throughout the year; these species often have smaller body size and more specialized diets than the migratory species. Only the larger herbivores and carnivores obtain their nutrition from all the different major habitat types in the ecosystem. The net effect of this is to create a nested hierarchy of subchains of energy flow within the larger Serengeti food web; these flows are seasonally forced by rainfall and operate at different rates in different major branches of the web. The nested structure that couples sequential trophic levels together interacts with annual seasonal variation in the fast and slow chains of nutrient flow in a way that

  17. Effects of fish and nutrient additions on food-web stability in a charophyte-dominated lake

    NARCIS (Netherlands)

    van de Bund, W.; Van Donk, E.

    2004-01-01

    1. The response of major food-web constituents to combinations of nutrient addition and zooplanktivorous fish abundance was tested during two subsequent years in the shallow charophyte-dominated lake Naardermeer in the Netherlands, using in situ enclosures. 2. Treatment effects differed sharply

  18. Reductions in fish-community contamination following lowhead dam removal linked more to shifts in food-web structure than sediment pollution.

    Science.gov (United States)

    Davis, Robert P; Sullivan, S Mažeika P; Stefanik, Kay C

    2017-12-01

    Recent increases in dam removals have prompted research on ecological and geomorphic river responses, yet contaminant dynamics following dam removals are poorly understood. We investigated changes in sediment concentrations and fish-community body burdens of mercury (Hg), selenium (Se), polychlorinated biphenyls (PCB), and chlorinated pesticides before and after two lowhead dam removals in the Scioto and Olentangy Rivers (Columbus, Ohio). These changes were then related to documented shifts in fish food-web structure. Seven study reaches were surveyed from 2011 to 2015, including controls, upstream and downstream of the previous dams, and upstream restored vs. unrestored. For most contaminants, fish-community body burdens declined following dam removal and converged across study reaches by the last year of the study in both rivers. Aldrin and dieldrin body burdens in the Olentangy River declined more rapidly in the upstream-restored vs. the upstream-unrestored reach, but were indistinguishable by year three post dam removal. No upstream-downstream differences were observed in body burdens in the Olentangy River, but aldrin and dieldrin body burdens were 138 and 148% higher, respectively, in downstream reaches than in upstream reaches of the Scioto River following dam removal. The strongest relationships between trophic position and body burdens were observed with PCBs and Se in the Scioto River, and with dieldrin in the Olentangy River. Food-chain length - a key measure of trophic structure - was only weakly related to aldrin body burdens, and unrelated to other contaminants. Overall, we demonstrate that lowhead dam removal may effectively reduce ecosystem contamination, largely via shifts in fish food-web dynamics versus sediment contaminant concentrations. This study presents some of the first findings documenting ecosystem contamination following dam removal and will be useful in informing future dam removals. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Impacts of elevated terrestrial nutrient loads and temperature on pelagic food-web efficiency and fish production.

    Science.gov (United States)

    Lefébure, R; Degerman, R; Andersson, A; Larsson, S; Eriksson, L-O; Båmstedt, U; Byström, P

    2013-05-01

    Both temperature and terrestrial organic matter have strong impacts on aquatic food-web dynamics and production. Temperature affects vital rates of all organisms, and terrestrial organic matter can act both as an energy source for lower trophic levels, while simultaneously reducing light availability for autotrophic production. As climate change predictions for the Baltic Sea and elsewhere suggest increases in both terrestrial matter runoff and increases in temperature, we studied the effects on pelagic food-web dynamics and food-web efficiency in a plausible future scenario with respect to these abiotic variables in a large-scale mesocosm experiment. Total basal (phytoplankton plus bacterial) production was slightly reduced when only increasing temperatures, but was otherwise similar across all other treatments. Separate increases in nutrient loads and temperature decreased the ratio of autotrophic:heterotrophic production, but the combined treatment of elevated temperature and terrestrial nutrient loads increased both fish production and food-web efficiency. CDOM: Chl a ratios strongly indicated that terrestrial and not autotrophic carbon was the main energy source in these food webs and our results also showed that zooplankton biomass was positively correlated with increased bacterial production. Concomitantly, biomass of the dominant calanoid copepod Acartia sp. increased as an effect of increased temperature. As the combined effects of increased temperature and terrestrial organic nutrient loads were required to increase zooplankton abundance and fish production, conclusions about effects of climate change on food-web dynamics and fish production must be based on realistic combinations of several abiotic factors. Moreover, our results question established notions on the net inefficiency of heterotrophic carbon transfer to the top of the food web. © 2013 Blackwell Publishing Ltd.
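    Food-web efficiency (FWE) in pelagic studies of this type is commonly defined as the ratio of production at the top trophic level to total basal production; assuming that standard definition (the abstract does not state its exact formula),

    \[ \mathrm{FWE} \;=\; \frac{P_{\mathrm{fish}}}{P_{\mathrm{basal}}} \;=\; \frac{P_{\mathrm{fish}}}{P_{\mathrm{phytoplankton}} + P_{\mathrm{bacteria}}}. \]

    On this definition, the combined treatment raised FWE because fish production increased while total basal production stayed roughly constant, implying more efficient transfer of the (largely heterotrophic, terrestrially subsidized) carbon up the web.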

  20. Adaptive behaviour, tri-trophic food-web stability and damping of chaos

    DEFF Research Database (Denmark)

    Visser, Andre; Mariani, Patrizio; Pigolotti, Simone

    2012-01-01

    We examine the effect of adaptive foraging behaviour within a tri-trophic food web with intra-guild predation. The intra-guild prey is allowed to adjust its foraging effort so as to achieve an optimal per capita growth rate in the face of realized feeding, predation risk and foraging cost. Adaptive fitness-seeking behaviour of the intra-guild prey has a stabilizing effect on the tri-trophic food-web dynamics provided that (i) a finite optimal foraging effort exists and (ii) the trophic transfer efficiency from resource to predator via the intra-guild prey is greater than that from the resource directly. The latter condition is a general criterion for the feasibility of intra-guild predation as a trophic mode. Under these conditions, we demonstrate rigorously that adaptive behaviour will always promote stability of community dynamics in the sense that the region of parameter space in which...

  1. Comparação entre dois fios de sutura não absorvíveis na anastomose traqueal término-terminal em cães Comparison of two nonabsorbable suture materials in the end-to-end tracheal anastomosis in dogs

    Directory of Open Access Journals (Sweden)

    Sheila Canevese Rahal

    1995-01-01

    Twelve mongrel dogs, aged between 1 and 6 years and weighing 6 to 20 kg, were submitted to tracheal resection and end-to-end anastomosis, in which braided non-capillary polyester and monofilament nylon sutures were tested. Six animals, three per suture material, underwent excision equivalent to three tracheal rings. After 15 days a new intervention was performed, in which the equivalent of six more rings was resected, for a total of nine; at the end of another 15 days these animals were sacrificed. The other six animals, three per suture material, underwent excision equivalent to three tracheal rings and were maintained for 43 days. The tracheas were evaluated by clinical, radiographic, macroscopic and histopathological examination. Monofilament nylon produced less tissue reaction than braided non-capillary polyester and provided a secure anastomosis with a lower chance of granuloma formation.

  2. Food-Web Complexity in Guaymas Basin Hydrothermal Vents and Cold Seeps.

    Directory of Open Access Journals (Sweden)

    Marie Portail

    In the Guaymas Basin, the presence of cold seeps and hydrothermal vents in close proximity, similar sedimentary settings and comparable depths offers a unique opportunity to assess and compare the functioning of these deep-sea chemosynthetic ecosystems. The food webs of five seep and four vent assemblages were studied using stable carbon and nitrogen isotope analyses. Although the two ecosystems shared similar potential basal sources, their food webs differed: seeps relied predominantly on methanotrophy and thiotrophy via the Calvin-Benson-Bassham (CBB) cycle, and vents on petroleum-derived organic matter and thiotrophy via the CBB and reductive tricarboxylic acid (rTCA) cycles. In contrast to symbiotic species, the heterotrophic fauna exhibited high trophic flexibility among assemblages, suggesting weak trophic links to the metabolic diversity of chemosynthetic primary producers. At both ecosystems, food webs did not appear to be organised through predator-prey links but rather through weak trophic relationships among co-occurring species. Examples of trophic or spatial niche differentiation highlighted the importance of species-sorting processes within chemosynthetic ecosystems. Variability in food web structure, addressed through Bayesian metrics, revealed consistent trends across ecosystems. Food-web complexity significantly decreased with increasing methane concentrations, a common proxy for the intensity of seep and vent fluid fluxes. Although high fluid-fluxes have the potential to enhance primary productivity, they generate environmental constraints that may limit microbial diversity, colonisation of consumers and the structuring role of competitive interactions, leading to an overall reduction of food-web complexity and an increase in trophic redundancy. Heterogeneity provided by foundation species was identified as an additional structuring factor. According to their biological activities, foundation species may have the potential to

  3. Radiocarbon Depression in Aquatic Foodwebs of the Colorado River, USA: Coupling Between Carbonate Weathering and the Biosphere

    Science.gov (United States)

    Sickman, J. O.; Huang, W.; Lucero, D.; Anderson, M.

    2012-12-01

    The 14C isotopic composition of living organisms is generally considered to be in isotopic equilibrium with atmospheric CO2. During the course of investigations of aquatic foodwebs of the Colorado River, we measured substantial radiocarbon depression of organisms within planktonic and benthic foodwebs of Copper Basin Reservoir, a short residence-time water body at the intake to the Colorado River Aqueduct. All trophic levels had depressed radiocarbon content with an inferred "age" of ca. 1,200 radiocarbon years (range: 0.85 to 0.87 fraction modern carbon (fmc)). Additional measurements of the radiocarbon content of dissolved organic carbon (DOC) and dissolved inorganic carbon (DIC) were made in other major rivers in California (New (near Salton Sea), Santa Ana (near Riverside), San Joaquin (near Fresno) and Salinas (near San Luis Obispo)). In the New River (which is composed primarily of irrigation tailwater derived from the Colorado River), the radiocarbon values for DIC closely matched those found in biota of the Copper Basin Reservoir (0.85 to 0.87 fmc), but radiocarbon values for DOC were slightly higher (0.91 to 0.95 fmc). In the other California rivers, radiocarbon concentrations in DIC were generally below modern and lower than corresponding levels in DOC; in the case of the Santa Ana River, DOC was older than DIC as a result of wastewater inputs from upstream treatment plants. Together these data suggest that the carbonate equilibrium of California rivers is influenced by weathering of carbonate minerals, which produces HCO3- with no 14C. We hypothesize that this dead carbon can move into aquatic foodwebs via algal and phytoplankton uptake during photosynthesis, depressing the 14C content of aquatic foodwebs below that of the atmosphere. Based on a simple two-component mixing model incorporating carbonate weathering and atmospheric CO2, we estimate that 15-17% of the carbon in the aquatic foodweb of Copper Basin is derived directly from mineral weathering of carbonates.
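    The two-component mixing model is simple arithmetic: radiocarbon-dead bicarbonate from carbonate weathering ($F = 0$ fmc) dilutes atmospheric carbon. Writing $F$ for fraction modern carbon and assuming a contemporary atmospheric value of roughly 1.03 fmc (slightly above 1 because of bomb carbon; the exact value is an assumption here),

    \[ F_{\text{foodweb}} = x\,F_{\text{atm}} + (1 - x)\cdot 0 \quad\Rightarrow\quad 1 - x = 1 - \frac{F_{\text{foodweb}}}{F_{\text{atm}}} \approx 1 - \frac{0.85\text{--}0.87}{1.03} \approx 0.15\text{--}0.17, \]

    consistent with the 15-17% weathering-derived carbon estimated above.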

  4. Adaptive behaviour, tri-trophic food-web stability and damping of chaos.

    Science.gov (United States)

    Visser, André W; Mariani, Patrizio; Pigolotti, Simone

    2012-06-07

    We examine the effect of adaptive foraging behaviour within a tri-trophic food web with intra-guild predation. The intra-guild prey is allowed to adjust its foraging effort so as to achieve an optimal per capita growth rate in the face of realized feeding, predation risk and foraging cost. Adaptive fitness-seeking behaviour of the intra-guild prey has a stabilizing effect on the tri-trophic food-web dynamics provided that (i) a finite optimal foraging effort exists and (ii) the trophic transfer efficiency from resource to predator via the intra-guild prey is greater than that from the resource directly. The latter condition is a general criterion for the feasibility of intra-guild predation as a trophic mode. Under these conditions, we demonstrate rigorously that adaptive behaviour will always promote stability of community dynamics in the sense that the region of parameter space in which stability is achieved is larger than for the non-adaptive counterpart of the system.
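    One simple way to encode the ingredients listed here (realized feeding, predation risk and foraging cost) is a Lotka-Volterra intra-guild-predation system in which the intra-guild prey's effort u scales both its feeding gain and its exposure to the predator, with a quadratic cost guaranteeing that a finite optimal effort exists. The sketch below is a generic illustration under those assumptions, with invented parameters; it is not the authors' model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Assumed parameters: resource R, intra-guild prey N, intra-guild predator P
r, K = 1.0, 1.0
aN, eN, mN = 2.0, 0.6, 0.2   # IG prey feeding on the resource
aP, eP, mP = 0.5, 0.4, 0.15  # IG predator feeding on the resource
aG, eG = 1.5, 0.5            # IG predation on the IG prey
c = 0.5                      # quadratic foraging-cost coefficient

def optimal_effort(R, P):
    """Effort maximizing per-capita growth eN*aN*R*u - aG*P*u - c*u**2."""
    return max(0.0, (eN * aN * R - aG * P) / (2 * c))

def rhs(t, y):
    R, N, P = np.maximum(y, 0.0)
    u = optimal_effort(R, P)   # instantaneous adaptive behaviour of the IG prey
    dR = r * R * (1 - R / K) - u * aN * R * N - aP * R * P
    dN = N * (eN * u * aN * R - mN - u * aG * P - c * u ** 2)
    dP = P * (eP * aP * R + eG * u * aG * N - mP)
    return [dR, dN, dP]

sol = solve_ivp(rhs, (0, 400), [0.5, 0.3, 0.2], max_step=0.1)
print("final state (R, N, P):", sol.y[:, -1].round(3))
```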

  5. Eighty years of food-web response to interannual variation in discharge recorded in river diatom frustules from an ocean sediment core.

    Science.gov (United States)

    Sculley, John B; Lowe, Rex L; Nittrouer, Charles A; Drexler, Tina M; Power, Mary E

    2017-09-19

    Little is known about the importance of food-web processes as controls of river primary production due to the paucity of both long-term studies and of depositional environments which would allow retrospective fossil analysis. To investigate how freshwater algal production in the Eel River, northern California, varied over eight decades, we quantified siliceous shells (frustules) of freshwater diatoms from a well-dated undisturbed sediment core in a nearshore marine environment. Abundances of freshwater diatom frustules exported to Eel Canyon sediment from 1988 to 2001 were positively correlated with annual biomass of Cladophora surveyed over these years in upper portions of the Eel basin. Over 28 years of contemporary field research, peak algal biomass was generally higher in summers following bankfull, bed-scouring winter floods. Field surveys and experiments suggested that bed-mobilizing floods scour away overwintering grazers, releasing algae from spring and early summer grazing. During wet years, growth conditions for algae could also be enhanced by increased nutrient loading from the watershed, or by sustained summer base flows. Total annual rainfall and frustule densities in laminae over a longer 83-year record were weakly and negatively correlated, however, suggesting that positive effects of floods on annual algal production were primarily mediated by "top-down" (consumer release) rather than "bottom-up" (growth promoting) controls.

  6. Visualizing the Food-Web Effects of Fishing for Tunas in the Pacific Ocean

    Directory of Open Access Journals (Sweden)

    Jefferson T. Hinke

    2004-06-01

    We use food-web models to develop visualizations to compare and evaluate the interactions of tuna fisheries with their supporting food webs in the eastern tropical Pacific (ETP) and the central north Pacific (CNP) Oceans. In the ETP and CNP models, individual fisheries use slightly different food webs that are defined by the assemblage of targeted tuna species. Distinct energy pathways are required to support different tuna species and, consequently, the specific fisheries that target different tuna assemblages. These simulations suggest that catches of tunas, sharks, and billfishes have lowered the biomass of the upper trophic levels in both systems, whereas increases in intermediate and lower trophic level animals have accompanied the decline of top predators. Trade-offs between fishing and predation mortality rates that occur when multiple fisheries interact with their respective food webs may lead to smaller changes in biomass than if only the effect of a single fishery is considered. Historical simulations and hypothetical management scenarios further demonstrate that the effects of longline and purse seine fisheries have been strongest in upper trophic levels, but that lower trophic levels may respond more strongly to purse-seine fisheries. The apex predator guild has responded most strongly to longlining. Simulations of alternative management strategies that attempt to rebuild shark and billfish populations in each ecosystem reveal that (1) changes in longlining more effectively recover top predator populations than do changes in purse seining and (2) restrictions on both shallow-set longline gear and shark finning may do more to recover top predators than do simple reductions in fishing effort.

  7. Food-web traits of the North Aegean Sea ecosystem (Eastern Mediterranean) and comparison with other Mediterranean ecosystems

    Science.gov (United States)

    Tsagarakis, K.; Coll, M.; Giannoulaki, M.; Somarakis, S.; Papaconstantinou, C.; Machias, A.

    2010-06-01

    A mass-balance trophic model was built to describe the food-web traits of the North Aegean Sea (Strymonikos Gulf and Thracian Sea, Greece, Eastern Mediterranean) during the mid-2000s and to explore the impacts of fishing. This is the first food-web model representing the Aegean Sea, and results are presented and discussed in comparison with previously modelled ecosystems from the western and central areas of the basin (the South Catalan and North-Central Adriatic Seas). Forty functional groups were defined, covering the entire trophic spectrum from lower to higher trophic levels. Emphasis was placed on commercial invertebrates and fish. The potential ecological role of the invasive ctenophore Mnemiopsis leidyi and of several vulnerable groups (e.g., dolphins) was also explored. Results confirmed the spatial productivity patterns known for the Mediterranean Sea, showing, for example, that total biomass is highest in the N.C. Adriatic and lowest in the N. Aegean Sea. Accordingly, food-web flows and several ecosystem indicators, such as the mean transfer efficiency, were influenced by these patterns. Nevertheless, all three systems shared some common features evidencing similarities among Mediterranean Sea ecosystems, such as dominance of the pelagic fraction in terms of flows and strong benthic-pelagic coupling of zooplankton and benthic invertebrates through detritus. The importance of detritus highlighted the role of the microbial food web, which was indirectly considered through detritus dynamics. Ciliates, mesozooplankton and several benthic invertebrate groups emerged as important elements of the ecosystem, linking primary producers and detritus with higher trophic levels in the N. Aegean Sea. Adult anchovy was the most important fish group in terms of production, consumption and overall effect on the rest of the ecological groups in the model, in line with results from the Western Mediterranean Sea. The five fishing fleets considered (both artisanal and
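
    For readers unfamiliar with this class of model, the bookkeeping behind a mass-balance trophic model is a per-group balance of production against predation and catch. A generic Ecopath-style master equation (the standard textbook form, not an equation reproduced from this paper, and omitting export and biomass-accumulation terms) reads, in LaTeX:

      B_i \left(\frac{P}{B}\right)_i EE_i = \sum_{j} B_j \left(\frac{Q}{B}\right)_j DC_{ji} + Y_i

    where B_i is the biomass of group i, (P/B)_i its production rate, EE_i its ecotrophic efficiency, (Q/B)_j the consumption rate of predator j, DC_{ji} the fraction of group i in predator j's diet, and Y_i the fishery catch of group i.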

  8. Bioaccumulation factors and the steady state assumption for cesium isotopes in aquatic foodwebs near nuclear facilities.

    Science.gov (United States)

    Rowan, D J

    2013-07-01

    Steady state approaches, such as transfer coefficients or bioaccumulation factors, are commonly used to model the bioaccumulation of (137)Cs in aquatic foodwebs from routine operations and releases from nuclear generating stations and other nuclear facilities. Routine releases from nuclear generating stations and facilities, however, often consist of pulses as liquid waste is stored, analyzed to ensure regulatory compliance and then released. The effect of repeated pulse releases on the steady state assumption inherent in the bioaccumulation factor approach has not been evaluated. In this study, I examine the steady state assumption for aquatic biota by analyzing data for two cesium isotopes in the same biota, one isotope in steady state (stable (133)Cs) from geologic sources and the other released in pulses ((137)Cs) from reactor operations. I also compare (137)Cs bioaccumulation factors for similar upstream populations from the same system exposed solely to weapon test (137)Cs, and assumed to be in steady state. The steady state assumption appears to be valid for small organisms at lower trophic levels (zooplankton, rainbow smelt and 0+ yellow perch) but not for older and larger fish at higher trophic levels (walleye). Attempts to account for previous exposure and retention through a biokinetics approach had a similar effect on steady state, upstream and non-steady state, downstream populations of walleye, but were ineffective in explaining the more or less constant deviation between fish with steady state exposures and non-steady state exposures of about 2-fold for all age classes of walleye. These results suggest that for large, piscivorous fish, repeated exposure to short duration, pulse releases leads to much higher (137)Cs BAFs than expected from (133)Cs BAFs for the same fish or (137)Cs BAFs for similar populations in the same system not impacted by reactor releases. These results suggest that the steady state approach should be used with caution in any
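
    As a quick illustration of the bioaccumulation-factor bookkeeping the study relies on, the Python fragment below computes a BAF for each isotope and the roughly 2-fold steady-state deviation described above; all concentrations are invented for the example, not data from the paper.

      # Bioaccumulation factor: BAF = C_biota / C_water, per isotope.
      # Values below are illustrative placeholders, not measured data.
      c133_water, c133_fish = 2.0e-11, 4.0e-9   # stable 133Cs (steady state)
      c137_water, c137_fish = 1.0e-3, 4.0e-1    # 137Cs from pulsed releases

      baf133 = c133_fish / c133_water
      baf137 = c137_fish / c137_water
      print(f"BAF(133Cs) = {baf133:.0f}, BAF(137Cs) = {baf137:.0f}")
      # A ratio near 2 would mirror the non-steady-state deviation reported
      # above for large piscivorous fish such as walleye.
      print(f"BAF ratio = {baf137 / baf133:.1f}")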

  9. Effects of internal phosphorus loadings and food-web structure on the recovery of a deep lake from eutrophication

    Science.gov (United States)

    Lepori, Fabio; Roberts, James J.

    2017-01-01

    We used monitoring data from Lake Lugano (Switzerland and Italy) to assess key ecosystem responses to three decades of nutrient management (1983–2014). We investigated whether reductions in external phosphorus loadings (Lext) caused declines in lake phosphorus concentrations (P) and phytoplankton biomass (Chl a), as assumed by the predictive models that underpinned the management plan. Additionally, we examined the hypothesis that deep lakes respond quickly to Lext reductions. During the study period, nutrient management reduced Lext by approximately a half. However, the effects of such reduction on P and Chl a were complex. Far from the scenarios predicted by classic nutrient-management approaches, the responses of P and Chl a did not only reflect changes in Lext, but also variation in internal P loadings (Lint) and food-web structure. In turn, Lint varied depending on basin morphometry and climatic effects, whereas food-web structure varied due to apparently stochastic events of colonization and near-extinction of key species. Our results highlight the complexity of the trajectory of deep-lake ecosystems undergoing nutrient management. From an applied standpoint, they also suggest that [i] the recovery of warm monomictic lakes may be slower than expected due to the development of Lint, and that [ii] classic P and Chl a models based on Lext may be useful in nutrient management programs only if their predictions are used as starting points within adaptive frameworks.
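
    The "classic P and Chl a models based on Lext" mentioned above are typically Vollenweider-type mass balances. The sketch below is a generic steady-state version with an explicit internal-loading term added, purely to illustrate why ignoring Lint biases the prediction; the formula is the textbook form and all parameter values are invented.

      # Steady-state lake phosphorus, Vollenweider-style mass balance:
      #   P_ss = (L_ext + L_int) / (z * (1/tau + sigma))
      # L: areal P loading (g m-2 yr-1), z: mean depth (m),
      # tau: water residence time (yr), sigma: net P loss rate (yr-1).
      z, tau, sigma = 130.0, 8.0, 0.5   # illustrative deep-lake values

      def p_steady(l_ext, l_int=0.0):
          return (l_ext + l_int) / (z * (1.0 / tau + sigma)) * 1000.0  # mg m-3

      print(p_steady(1.0))         # prediction from external loading only
      print(p_steady(1.0, 0.4))    # same lake once internal loading develops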

  10. The cold-water coral community as a hot spot of carbon cycling on continental margins: a food-web analysis from Rockall Bank (northeast Atlantic)

    NARCIS (Netherlands)

    Van Oevelen, D.J.; Duineveld, G.; Lavaleye, M.; Mienis, F.; Soetaert, K.E.R.; Heip, C.H.R.

    2009-01-01

    We present a quantitative food-web analysis of the cold-water coral community, i.e., the assembly of living corals, dead coral branches and sediment beneath, associated with the reef-building Lophelia pertusa on the giant carbonate mounds at ~800-m depth at Rockall Bank. Carbon flows, 140 flows

  11. Early Cretaceous terrestrial ecosystems in East Asia based on food-web and energy-flow models

    Science.gov (United States)

    Matsukawa, M.; Saiki, K.; Ito, M.; Obata, I.; Nichols, D.J.; Lockley, M.G.; Kukihara, R.; Shibata, K.

    2006-01-01

    In recent years, there has been global interest in environments and ecosystems around the world. Reconstructing past environments and ecosystems helps us understand them in the present and anticipate them in the future, since present environments and ecosystems form an evolving continuum with those of the past and the future. This paper demonstrates the contribution of geology and paleontology to such continua. Using fossils, we can estimate past population density as an ecosystem index based on food-web and energy-flow models. Late Mesozoic nonmarine deposits are distributed widely on the eastern Asian continent and contain various kinds of fossils such as fishes, amphibians, reptiles, dinosaurs, mammals, bivalves, gastropods, insects, ostracodes, conchostracans, terrestrial plants, and others. These fossil organisms are useful for late Mesozoic terrestrial ecosystem reconstruction using food-web and energy-flow models. We chose Early Cretaceous fluvio-lacustrine basins in the Choyr area, southeastern Mongolia, and the Tetori area, Japan, for these analyses and as a potential model for reconstruction of other similar basins in East Asia. The food-web models are restored based on taxa that occurred in these basins. They form four or five trophic levels in an energy pyramid consisting of rich primary producers at its base and smaller biotas higher in the food web. This is the general energy pyramid of a typical ecosystem. Comparing estimated population densities of vertebrate taxa per km2 in these basins, some differences are recognized between the Early Cretaceous and the present. For example, Cretaceous estimates suggest 2.3 to 4.8 times as many herbivores and 26.0 to 105.5 times the carnivore population. These differences are useful for the evaluation of past population densities of vertebrate taxa. Such differences may also be caused by the different metabolism of different taxa. Preservation may also be a factor, and we recognize that various problems occur in

  12. Using marine reserves to manage impact of bottom trawl fisheries requires consideration of benthic food-web interactions

    DEFF Research Database (Denmark)

    van Denderen, Pieter Daniël; Rijnsdorp, Adriaan D.; van Kooten, Tobias

    2016-01-01

    Marine protected areas (MPAs) are widely used to protect exploited fish species as well as to conserve marine habitats and their biodiversity. They have become a popular management tool also for bottom trawl fisheries, a common fishing technique on continental shelves worldwide. The effects of bottom trawling go far beyond the impact on target species, as trawls also affect other components of the benthic ecosystem and the seabed itself. This means that for bottom trawl fisheries, MPAs can potentially be used not only to conserve target species but also to reduce the impact of these side-effects of the fishery. However, predicting the protective effects of MPAs is complicated because the side-effects of trawling potentially alter the food-web interactions between target and non-target species. These changes in predatory and competitive interactions among fish and benthic invertebrates may have important...

  13. Emerging and Legacy Contaminants in The Foodweb in The Lower Columbia River: USGS ConHab Project

    Science.gov (United States)

    Nilsen, E. B.; Alvarez, D.; Counihan, T.; Elias, E.; Gelfenbaum, G. R.; Hardiman, J.; Jenkins, J.; Mesa, M.; Morace, J.; Patino, R.; Torres, L.; Waite, I.; Zaugg, S.

    2012-12-01

    An interdisciplinary study, the USGS Columbia River Contaminants and Habitat Characterization (ConHab) project, investigates transport pathways, chemical fate, and effects of polybrominated diphenyl ethers (PBDEs) and other endocrine disrupting chemicals (EDCs) in aquatic media and the foodweb in the lower Columbia River, Oregon and Washington. Polar organic chemical integrative samplers (POCIS) and semipermeable membrane devices (SPMDs) were co-deployed at each of 10 sites in 2008 to provide a measure of the dissolved concentrations of select PBDEs, chlorinated pesticides, and other EDCs. PBDE-47 was the most prevalent of the PBDEs detected. Numerous organochlorine pesticides, both banned and current-use, including hexachlorobenzene, pentachloroanisole, dichlorodiphenyltrichloroethane (DDT) and its degradates, chlorpyrifos, endosulfan, and the endosulfan degradation products, were measured at each site. EDCs commonly detected included a series of polycyclic aromatic hydrocarbons (PAHs), fragrances (galaxolide), pesticides (chlorpyrifos and atrazine), plasticizers (phthalates), and flame retardants (phosphates). The downstream sites tended to have the highest concentrations of contaminants in the lower Columbia River. In 2009 and 2010, passive samplers were deployed and resident largescale suckers (Catostomus macrocheilus) and surface bed sediments were collected at three of the original sites representing a gradient of exposure based on the 2008 results. Brain, fillet, liver, stomach, and gonad tissues were analyzed. Chemical concentrations were highest in livers, followed by brain, stomach, gonad, and, lastly, fillet. Concentrations of halogenated compounds in tissue samples decreased in the order PBDE-100 > PBDE-154 > PBDE-153. Concentrations in tissues and in sediments increased moving downstream from Skamania, WA to Columbia City, OR to Longview, WA. Preliminary biomarker results indicate that fish at the downstream sites experience greater stress relative to the upstream site

  14. Trophic transference of microplastics under a low exposure scenario: Insights on the likelihood of particle cascading along marine food-webs.

    Science.gov (United States)

    Santana, M F M; Moreira, F T; Turra, A

    2017-08-15

    Microplastics are emergent pollutants in marine environments, whose risks along food webs still need to be understood. Within this knowledge gap, the transference and persistence of microplastics along trophic levels are key processes. We assessed the potential occurrence of these processes considering a less extreme exposure scenario than used previously, with microplastics present only in the hemolymph of prey (the mussel Perna perna) and absent in the gut cavity. Predators were the crab Callinectes ornatus and the puffer fish Spheoeroides greeleyi. Transference of microplastics occurred from prey to predators, but without evidence of particle persistence in their tissues after 10 days of exposure. This suggests a reduced likelihood of trophic cascading of particles and, consequently, a reduced risk of direct impacts of microplastics on higher trophic levels. However, the contact with microplastics along food webs is still concerning, modulated by the concentration of particles in prey and by the predators' depuration capacity and rate.

  15. Ubiquitous Monitoring Solution for Wireless Sensor Networks with Push Notifications and End-to-End Connectivity

    Directory of Open Access Journals (Sweden)

    Luis M. L. Oliveira

    2014-01-01

    Wireless Sensor Networks (WSNs) belong to a new trend in technology in which tiny and resource-constrained devices are wirelessly interconnected and are able to interact with the surrounding environment by collecting data such as temperature and humidity. Recently, due to the huge growth in the use of mobile devices with Internet connection, smartphones are becoming the center of future ubiquitous wireless networks. Interconnecting WSNs with smartphones and the Internet is a big challenge, and new architectures are required due to the heterogeneity of these devices. Taking into account that people are using smartphones with Internet connection, there is a good opportunity to propose a new architecture for wireless sensor monitoring using push notifications and smartphones. This paper therefore proposes a ubiquitous approach for WSN monitoring based on a REST Web Service, a relational database, and an Android mobile application. Real-time data sensed by WSNs are sent directly to a smartphone or stored in a database and requested by the mobile application using a well-defined RESTful interface. A push notification system was created in order to alert mobile users when a sensor parameter exceeds a given threshold. The proposed architecture and mobile application were evaluated and validated using a laboratory WSN testbed and are ready for use.
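
    A minimal sketch of the server side of such an architecture is given below in Python, using Flask as a stand-in for the paper's REST Web Service; the endpoint names, the threshold, and the notify() stub are all our assumptions, and a real deployment would hand alerts to a push-notification service rather than printing them.

      from flask import Flask, request, jsonify

      app = Flask(__name__)
      THRESHOLD_C = 35.0     # assumed alert threshold for temperature
      readings = []          # in-memory stand-in for the relational database

      def notify(msg):
          # Placeholder: a real system would call a push service here.
          print("PUSH:", msg)

      @app.route("/readings", methods=["POST"])
      def add_reading():
          r = request.get_json()           # e.g. {"node": 3, "temp": 36.2}
          readings.append(r)
          if r.get("temp", 0.0) > THRESHOLD_C:
              notify(f"node {r['node']}: temperature {r['temp']} exceeds threshold")
          return jsonify(ok=True), 201

      @app.route("/readings", methods=["GET"])
      def list_readings():
          return jsonify(readings)         # RESTful read for the mobile app

      if __name__ == "__main__":
          app.run()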

  16. Development of an End-to-End Active Debris Removal (ADR) Mission Strategic Plan Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Since the majority of the potential ADR targets are large (>meters) upper stages and payloads between 800 and 1100 km altitude, they are relatively bright, with...

  17. Stock assessment and end-to-end ecosystem models alter dynamics of fisheries data.

    Science.gov (United States)

    Storch, Laura S; Glaser, Sarah M; Ye, Hao; Rosenberg, Andrew A

    2017-01-01

    Although all models are simplified approximations of reality, they remain useful tools for understanding, predicting, and managing populations and ecosystems. However, a model's utility is contingent on its suitability for a given task. Here, we examine two model types: single-species fishery stock assessment and multispecies marine ecosystem models. Both are efforts to predict trajectories of populations and ecosystems to inform fisheries management and conceptual understanding. However, many of these ecosystems exhibit nonlinear dynamics, which may not be represented in the models. As a result, model outputs may underestimate variability and overestimate stability. Using nonlinear forecasting methods, we compare predictability and nonlinearity of model outputs against model inputs using data and models for the California Current System. Compared with model inputs, time series of model-processed outputs show more predictability but a higher prevalence of linearity, suggesting that the models misrepresent the actual predictability of the modeled systems. Thus, caution is warranted: using such models for management or scenario exploration may produce unforeseen consequences, especially in the context of unknown future impacts.
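
    The nonlinear forecasting methods referred to here are in the spirit of empirical dynamic modelling. As a hedged illustration of the core idea, the Python sketch below measures one-step forecast skill by simplex projection on a time-delay embedding; it is a textbook-style simplification for a synthetic series, not the authors' analysis code.

      import numpy as np

      def embed(x, E):
          # Time-delay embedding with unit lag: rows are (x_t, ..., x_{t+E-1}).
          n = len(x) - E + 1
          return np.column_stack([x[i:i + n] for i in range(E)])

      def simplex_skill(x, E):
          # Leave-one-out, one-step-ahead simplex forecasts with E+1 neighbours.
          X = embed(x, E)[:-1]      # last row has no one-step target
          y = x[E:]                 # value following each embedded row
          preds = np.empty(len(y))
          for i in range(len(X)):
              d = np.linalg.norm(X - X[i], axis=1)
              d[i] = np.inf                        # exclude the point itself
              nn = np.argsort(d)[:E + 1]
              w = np.exp(-d[nn] / max(d[nn].min(), 1e-12))
              preds[i] = np.sum(w * y[nn]) / np.sum(w)
          return np.corrcoef(preds, y)[0, 1]

      # Chaotic logistic map: skill should peak at a low embedding dimension.
      x = np.empty(500)
      x[0] = 0.4
      for t in range(499):
          x[t + 1] = 3.9 * x[t] * (1 - x[t])
      for E in (1, 2, 3, 5):
          print(E, round(simplex_skill(x, E), 3))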

  18. Integration of DST's for non-conflicting end-to-end flight scheduling Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In this SBIR effort we propose an innovative approach for the integration of Decision Support Tools (DSTs) for increased situational awareness, improved cooperative...

  19. Network Slicing in Industry 4.0 Applications: Abstraction Methods and End-to-End Analysis

    DEFF Research Database (Denmark)

    Nielsen, Jimmy Jessen; Popovski, Petar; Kalør, Anders Ellersgaard

    2018-01-01

    Industry 4.0 refers to the fourth industrial revolution, and introduces modern communication and computation technologies such as 5G, cloud computing and the Internet of Things to industrial manufacturing systems. As a result, many devices, machines and applications will rely on connectivity, while having different requirements from the network, ranging from high reliability and low latency to high data rates. Furthermore, these industrial networks will be highly heterogeneous as they will feature a number of diverse communication technologies. In this article, we propose network slicing as a mechanism to handle the diverse set of requirements to the network. We present methods for slicing deterministic and packet-switched industrial communication protocols at an abstraction level which is decoupled from the specific implementation of the underlying technologies, and hence simplifies the slicing...

  20. Multi-hop Relaying: An End-to-End Delay Analysis

    KAUST Repository

    Chaaban, Anas

    2015-12-01

    The impact of multi-hopping schemes on the communication latency in a relay channel is studied. The main aim is to characterize conditions under which such schemes decrease the communication latency given a reliability requirement. Both decode-forward (DF) and amplify-forward (AF) with block coding are considered, and are compared with the point-to-point (P2P) scheme which ignores the relay. Latency expressions for the three schemes are derived, and conditions under which DF and AF reduce latency are obtained for high signal-to-noise ratio (SNR). Interestingly, these conditions are stricter than the conditions under which the same multi-hopping schemes achieve higher long-term (information-theoretic) rates than P2P. It turns out that the relation between the source-destination SNR and the harmonic mean of the SNRs of the channels to and from the relay dictates whether multi-hopping reduces latency or not.

  1. Deep View-Sensitive Pedestrian Attribute Inference in an end-to-end Model

    OpenAIRE

    Sarfraz, M. Saquib; Schumann, Arne; Wang, Yan; Stiefelhagen, Rainer

    2017-01-01

    Pedestrian attribute inference is a demanding problem in visual surveillance that can facilitate person retrieval, search and indexing. To exploit semantic relations between attributes, recent research treats it as a multi-label image classification task. The visual cues hinting at attributes can be strongly localized and inference of person attributes such as hair, backpack, shorts, etc., are highly dependent on the acquired view of the pedestrian. In this paper we assert this dependence in ...

  2. End-to-end integrated security and performance analysis on the DEGAS Choreographer platform

    DEFF Research Database (Denmark)

    Buchholtz, Mikael; Gilmore, Stephen; Haenel, Valentin

    2005-01-01

    We present a software tool platform which facilitates security and performance analysis of systems, starting and ending with UML model descriptions. A UML project is presented to the platform for analysis, formal content is extracted in the form of process calculi descriptions, analysed with the analysers of the calculi, and the results of the analysis are reflected back into a modified version of the input UML model. The design platform supporting the methodology, Choreographer, interoperates with state-of-the-art UML modelling tools. We illustrate the approach with a well known protocol...

  3. Caius: Synthetic Observations Using a Robust End-to-End Radiative Transfer Pipeline

    Science.gov (United States)

    Simeon Barrow, Kirk Stuart; Wise, John H.; O'Shea, Brian; Norman, Michael L.; Xu, Hao

    2018-01-01

    We present synthetic observations for the first generations of galaxies in the Universe and make predictions for future deep field observations for redshifts greater than 6. Due to the strong impact of nebular emission lines and the relatively compact scale of HII regions, high resolution cosmological simulations and a robust suite of analysis tools are required to properly simulate spectra. We created a software pipeline consisting of FSPS, Yggdrasil, Hyperion, Cloudy and our own tools to generate synthetic IR observations from a fully three-dimensional arrangement of gas, dust, and stars. Our prescription allows us to include emission lines for a complete chemical network and tackle the effect of dust extinction and scattering in the various lines of sight. We provide spectra, 2-D binned photon imagery for both HST and JWST IR filters, luminosity relationships, and emission line strengths for a large sample of high redshift galaxies in the Renaissance Simulations (Xu et al. 2013). We also pay special attention to contributions from Population III stars and high-mass X-ray binaries and explore a direct-collapse black hole simulation (Aykutalp et al. 2014). Our resulting synthetic spectra show high variability between galactic halos with a strong dependence on stellar mass, viewing angle, metallicity, gas mass fraction, and formation history.

  4. Design and Evaluation for the End-to-End Detection of TCP/IP Header Manipulation

    Science.gov (United States)

    2014-06-01

    Only fragments of this report's text are preserved here: an acronym list (RST: reset; RTT: round trip time; SACK: selective acknowledgment; SDN: software-defined networking; SIMPLE: Software-defIned Middlebox PoLicy) and passages indicating that the report surveys software-defined networking (SDN) and middlebox architectures as part of the solution space for end-to-end detection of TCP/IP header manipulation, with SDN described as a new and emerging architectural design strategy that decouples network intelligence from traffic forwarding.

  5. Improving End-To-End Tsunami Warning for Risk Reduction on Canada’s West Coast

    Science.gov (United States)

    2015-01-01

    Only fragments of this report's text are preserved here. They indicate that coastal communities recognize that their long-term survival and livelihoods rest on balancing the sustainability of coastal industries with increased diversification of coastal economies, and that several new initiatives toward improved tsunami warning have consequently been launched by federal, provincial, regional and local authorities and First Nations.

  6. Adaptive end-to-end optimization of mobile video streaming using QoS negotiation

    NARCIS (Netherlands)

    Taal, Jacco R.; Langendoen, Koen; van der Schaaf, Arjen; van Dijk, H.W.; Lagendijk, R. (Inald) L.

    Video streaming over wireless links is a non-trivial problem due to the large and frequent changes in the quality of the underlying radio channel combined with latency constraints. We believe that every layer in a mobile system must be prepared to adapt its behavior to its environment. Thus layers

  7. PICASSO: an end-to-end image simulation tool for space and airborne imaging systems

    Science.gov (United States)

    Cota, Stephen A.; Bell, Jabin T.; Boucher, Richard H.; Dutton, Tracy E.; Florio, Christopher J.; Franz, Geoffrey A.; Grycewicz, Thomas J.; Kalman, Linda S.; Keller, Robert A.; Lomheim, Terrence S.; Paulson, Diane B.; Wilkinson, Timothy S.

    2010-06-01

    The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real world constraints such as cost, schedule and overall risk. Image chain analysis - the prediction of image quality from fundamental design parameters - is an important part of this design process. At The Aerospace Corporation we have been using a variety of image chain analysis tools for many years, the Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) among them. In this paper we describe our PICASSO tool, showing how, starting with a high quality input image and hypothetical design descriptions representative of the current state of the art in commercial imaging satellites, PICASSO can generate standard metrics of image quality in support of the decision processes of designers and program managers alike.
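
    Image chain analysis of the kind PICASSO performs rests on cascading component transfer functions: to first order, the system MTF is the product of the component MTFs. The Python fragment below illustrates that bookkeeping with textbook diffraction-limited optics and detector-footprint terms; the parameter values are invented and this is not PICASSO's actual model.

      import numpy as np

      wavelength = 0.55e-6    # m (visible band, illustrative)
      f_number = 10.0         # optics f/#
      pitch = 5.0e-6          # detector pixel pitch, m
      f_cut = 1.0 / (wavelength * f_number)   # optics cutoff, cycles/m

      def mtf_optics(f):
          # Diffraction-limited MTF for a circular aperture.
          u = np.clip(f / f_cut, 0.0, 1.0)
          return (2.0 / np.pi) * (np.arccos(u) - u * np.sqrt(1.0 - u**2))

      def mtf_detector(f):
          # Rectangular pixel footprint; numpy sinc is sin(pi x)/(pi x).
          return np.abs(np.sinc(f * pitch))

      f_nyq = 1.0 / (2.0 * pitch)             # detector Nyquist frequency
      system_mtf = mtf_optics(f_nyq) * mtf_detector(f_nyq)
      print(f"system MTF at Nyquist: {system_mtf:.3f}")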

  8. Using Voice Over Internet Protocol to Create True End-to-End Security

    Science.gov (United States)

    2011-09-01

    Only fragments of this thesis's text are preserved here. They reference the Wikileaks disclosures (Fildes, 2010) as motivation, noting that the leaked data sets were not made public by a foreign spy or a curious teenager hacking into classified networks; they describe Jitsi, an audio/video and chat client that supports SIP, XMPP/Jabber, AIM/ICQ, Windows Live, Yahoo!, Bonjour, and other protocols; and they mention Wireshark, a network protocol analyzer, as part of the evaluation toolset.

  9. End-to-End Architecture Modularisation and Slicing for Next Generation Networks

    OpenAIRE

    An, Xueli; Trivisonno, Riccardo; Einsiedler, Hans; von Hugo, Dirk; Haensge, Kay; Huang, Xiaofeng; Shen, Qing; Corujo, Daniel; Mahmood, Kashif; Trossen, Dirk; Liebsch, Marco; Leitao, Filipe; Phan, Cao-Thanh; Klamm, Frederic

    2016-01-01

    The journey towards the deployment of next generation networks has recently accelerated, driven by the joint effort of research and standards organisations. Despite this fact, the overall picture is still unclear as prioritization and understanding on several key concepts are not yet agreed by major vendors and network providers. Network Slicing is one of the central topics of the debate, and it is expected to become the key feature of next generation networks, providing the flexibility requi...

  10. An end-to-end assessment of extreme weather impacts on food security

    Science.gov (United States)

    Chavez, Erik; Conway, Gordon; Ghil, Michael; Sadler, Marc

    2015-11-01

    Both governments and the private sector urgently require better estimates of the likely incidence of extreme weather events, their impacts on food crop production and the potential consequent social and economic losses. Current assessments of climate change impacts on agriculture mostly focus on average crop yield vulnerability to climate and adaptation scenarios. Also, although new-generation climate models have improved and there has been an exponential increase in available data, the uncertainties in their projections over years and decades, and at regional and local scale, have not decreased. We need to understand and quantify the non-stationary, annual and decadal climate impacts using simple and communicable risk metrics that will help public and private stakeholders manage the hazards to food security. Here we present an 'end-to-end' methodological construct based on weather indices and machine learning that integrates current understanding of the various interacting systems of climate, crops and the economy to determine short- to long-term risk estimates of crop production loss, in different climate and adaptation scenarios. For provinces north and south of the Yangtze River in China, we have found that risk profiles for crop yields that translate climate into economic variability follow marked regional patterns, shaped by drivers of continental-scale climate. We conclude that to be cost-effective, region-specific policies have to be tailored to optimally combine different categories of risk management instruments.
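
    In the spirit of the weather-index-plus-machine-learning construct described above, the following Python sketch fits a regression from simple seasonal indices to crop loss on synthetic data; the two indices, the model choice (a random forest), and all numbers are illustrative assumptions, not the study's data or method.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(0)
      n_years = 200
      # Illustrative weather indices per growing season (z-scores).
      heat_index = rng.normal(0.0, 1.0, n_years)      # extreme-heat degree days
      drought_index = rng.normal(0.0, 1.0, n_years)   # rainfall deficit
      X = np.column_stack([heat_index, drought_index])

      # Synthetic "truth": losses grow nonlinearly with compound heat + drought.
      loss = (0.05
              + 0.10 * np.maximum(heat_index, 0) ** 2
              + 0.08 * np.maximum(drought_index, 0) * np.maximum(heat_index, 0)
              + rng.normal(0, 0.02, n_years))

      model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, loss)
      # Risk metric: expected loss under a hypothetical hot-and-dry scenario.
      print("predicted loss (hot+dry):",
            round(float(model.predict([[2.0, 2.0]])[0]), 3))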

  11. IMS Intra- and Inter Domain End-to-End Resilience Analysis

    DEFF Research Database (Denmark)

    Kamyod, Chayapol; Nielsen, Rasmus Hjorth; Prasad, Neeli R.

    2013-01-01

    This paper evaluates the resilience of a reference IMS-based network topology in operation through key reliability parameters via OPNET. The reliability behaviours of communication within similar and across registered home IMS domains were simulated and compared. In addition, the reliability effect...

  12. Research on the Establishment and Evaluation of End-to-End Service Quality Index System

    Science.gov (United States)

    Wei, Chen; Jing, Tao; Ji, Yutong

    2018-01-01

    From the perspective of power data networks, this paper puts forward an index system model for measuring end-to-end service quality, covering user experience, business performance, network capacity support, and related factors, and describes how the indices at each layer of the model are established and used.

  13. The Challenge of Ensuring Human Rights in the End-to-End Supply Chain

    DEFF Research Database (Denmark)

    Wieland, Andreas; Handfield, Robert B.

    2014-01-01

    Certification programs have their merits and their limitations. With the growing availability of social media, analytics tools, and supply chain data, a smarter set of solutions could soon be possible.

  14. MONTAGE: A Methodology for Designing Composable End-to-End Secure Distributed Systems

    Science.gov (United States)

    2012-08-01


  15. End-to-End simulation study of a full magnetic gradiometry mission

    DEFF Research Database (Denmark)

    Kotsiaros, Stavros; Olsen, Nils

    2014-01-01

    In this paper, we investigate space magnetic gradiometry as a possible path for future exploration of the Earth's magnetic field with satellites. Synthetic observations of the magnetic field vector and of six elements of the magnetic gradient tensor are calculated for times and positions of a simulated low Earth orbiting satellite. The observations are synthesized from realistic models based upon a combination of the major sources contributing to the Earth's magnetic field. From those synthetic data, we estimate field models using either the magnetic vector field observations only or the full...

  16. Composable Mission Framework for Rapid End-to-End Mission Design and Simulation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is the Composable Mission Framework (CMF)?a model-based software framework that shall enable seamless continuity of mission design and...

  17. Location Assisted Vertical Handover Algorithm for QoS Optimization in End-to-End Connections

    DEFF Research Database (Denmark)

    Dam, Martin S.; Christensen, Steffen R.; Mikkelsen, Lars M.

    2012-01-01

    The scenarios considered in this paper focus on 1) peer-to-peer in a WLAN setting, 2) P2P behind NAT and 3) what we call a server bounce mechanism. The algorithm is supported by a User-specific Virtual Network to obtain required network state information. Experimental tests are conducted, using both simulations and actual...

  18. A secure searcher for end-to-end encrypted email communication

    OpenAIRE

    Mani, Balamaruthu

    2015-01-01

    Email has become a common mode of communication for confidential personal as well as business needs. There are different approaches to authenticate the sender of an email message at the receiver's client and ensure that the message can be read only by the intended recipient. A typical approach is to use an email encryption standard to encrypt the message on the sender's client and decrypt it on the receiver's client for secure communication. A major drawback of this approach is that only the ...

  19. Integrating end-to-end encryption and authentication technology into broadband networks

    Energy Technology Data Exchange (ETDEWEB)

    Pierson, L.G.

    1995-11-01

    BISDN services will involve the integration of high speed data, voice, and video functionality delivered via technology similar to Asynchronous Transfer Mode (ATM) switching and SONET optical transmission systems. Customers of BISDN services may need a variety of data authenticity and privacy assurances via Asynchronous Transfer Mode (ATM) services. Cryptographic methods can be used to assure authenticity and privacy, but are hard to scale for implementation at high speed. The incorporation of these methods into computer networks can severely impact functionality, reliability, and performance. While there are many design issues associated with the serving of public keys for authenticated signaling and for establishment of session cryptovariables, this paper is concerned with the impact of encryption itself on such communications once the signaling and setup have been completed. Network security protections should be carefully matched to the threats against which protection is desired. Even after eliminating unnecessary protections, the remaining customer-required network security protections can impose severe performance penalties. These penalties (further discussed below) usually involve increased communication processing for authentication or encryption, increased error rate, increased communication delay, and decreased reliability/availability. Protection measures involving encryption should be carefully engineered so as to impose the least performance, reliability, and functionality penalties, while achieving the required security protection. To study these trade-offs, a prototype encryptor/decryptor was developed. This effort demonstrated the viability of implementing certain encryption techniques in high speed networks. The research prototype processes ATM cells in a SONET OC-3 payload. This paper describes the functionality, reliability, security, and performance design trade-offs investigated with the prototype.

  20. End-to-end unsupervised deformable image registration with a convolutional neural network

    NARCIS (Netherlands)

    de Vos, Bob D.; Berendsen, Floris; Viergever, Max A.; Staring, Marius; Išgum, Ivana

    2017-01-01

    In this work we propose a deep learning network for deformable image registration (DIRNet). The DIRNet consists of a convolutional neural network (ConvNet) regressor, a spatial transformer, and a resampler. The ConvNet analyzes a pair of fixed and moving images and outputs parameters for the spatial transformer.
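
    As a hedged sketch of the unsupervised idea (a regressor plus a spatial transformer and resampler, trained on image similarity alone), the PyTorch fragment below uses an affine transformer and an MSE loss for brevity; the published DIRNet predicts a deformable transformation, so treat this as a simplified illustration rather than the authors' network.

      import torch
      import torch.nn as nn
      import torch.nn.functional as F

      class AffineRegNet(nn.Module):
          # ConvNet regressor predicting a 2x3 affine matrix; affine_grid +
          # grid_sample act as the spatial transformer and resampler.
          def __init__(self):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(2, 16, 3, stride=2, padding=1), nn.ReLU(),
                  nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
                  nn.AdaptiveAvgPool2d(1),
              )
              self.fc = nn.Linear(32, 6)
              self.fc.weight.data.zero_()   # initialize to the identity transform
              self.fc.bias.data.copy_(
                  torch.tensor([1, 0, 0, 0, 1, 0], dtype=torch.float))

          def forward(self, fixed, moving):
              h = self.features(torch.cat([fixed, moving], 1)).flatten(1)
              theta = self.fc(h).view(-1, 2, 3)
              grid = F.affine_grid(theta, moving.size(), align_corners=False)
              return F.grid_sample(moving, grid, align_corners=False)

      # Unsupervised training: similarity loss between fixed and warped moving.
      model = AffineRegNet()
      opt = torch.optim.Adam(model.parameters(), lr=1e-3)
      fixed, moving = torch.rand(4, 1, 64, 64), torch.rand(4, 1, 64, 64)
      for _ in range(10):
          loss = F.mse_loss(model(fixed, moving), fixed)
          opt.zero_grad()
          loss.backward()
          opt.step()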

  1. End-to-End Verification of Information-Flow Security for C and Assembly Programs

    Science.gov (United States)

    2016-04-01

    Only fragments of this report's text are preserved here. They indicate that the work verifies information-flow security end-to-end across a code base written in both C and assembly ("we verify all of the code, regardless of language"), illustrating its policies with examples such as bounding the number of children a process may spawn and determining child process IDs, and discussing how a policy expressed in terms of high-level syscall specifications constrains the flows allowed between various domains.

  2. Asynchronous food-web pathways could buffer the response of Serengeti predators to El Niño Southern Oscillation.

    Science.gov (United States)

    Sinclair, A R E; Metzger, Kristine L; Fryxell, John M; Packer, Craig; Byrom, Andrea E; Craft, Meggan E; Hampson, Katie; Lembo, Tiziana; Durant, Sarah M; Forrester, Guy J; Bukombe, John; Mchetto, John; Dempewolf, Jan; Hilborn, Ray; Cleaveland, Sarah; Nkwabi, Ally; Mosser, Anna; Mduma, Simon A R

    2013-05-01

    Understanding how entire ecosystems maintain stability in the face of climatic and human disturbance is one of the most fundamental challenges in ecology. Theory suggests that a crucial factor determining the degree of ecosystem stability is simply the degree of synchrony with which different species in ecological food webs respond to environmental stochasticity. Ecosystems in which all food-web pathways are affected similarly by external disturbance should amplify variability in top carnivore abundance over time due to population interactions, whereas ecosystems in which a large fraction of pathways are nonresponsive or even inversely responsive to external disturbance will have more constant levels of abundance at upper trophic levels. To test the mechanism underlying this hypothesis, we used over half a century of demographic data for multiple species in the Serengeti (Tanzania) ecosystem to measure the degree of synchrony to variation imposed by an external environmental driver, the El Niño Southern Oscillation (ENSO). ENSO effects were mediated largely via changes in dry-season vs. wet-season rainfall and consequent changes in vegetation availability, propagating via bottom-up effects to higher levels of the Serengeti food web to influence herbivores, predators and parasites. Some species in the Serengeti food web responded to the influence of ENSO in opposite ways, whereas other species were insensitive to variation in ENSO. Although far from conclusive, our results suggest that a diffuse mixture of herbivore responses could help buffer top carnivores, such as Serengeti lions, from variability in climate. Future global climate changes that favor some pathways over others, however, could alter the effectiveness of such processes in the future.

  3. Linking Carbon-Nitrogen-Phosphorus Cycle and Foodweb Models of an Estuarine Lagoon Ecosystem

    Directory of Open Access Journals (Sweden)

    Ali Ertürk

    2015-09-01

    In this study, an NPZD model and a trophic network model that contains organism groups at the higher trophic levels were developed and linked using the "bottom-up control" approach. Such a linkage of models makes it possible to use the advantages of both: reproducing the erratic behaviour of nutrients and plankton as realistically as possible, while still including the more complex organisms of the trophic network, which respond to external forcing on longer time scales. The models developed in this study were applied to the Curonian Lagoon, an important estuarine ecosystem for Lithuania. The tests and simulations showed that the results of the NPZD model were accurate enough to represent the nutrient and phytoplankton dynamics in the Curonian Lagoon, as well as spatial differences of ecological interest. Linkage with the trophic network model demonstrated the NPZD model results to be consistent with the Curonian Lagoon ecosystem. The modelling results showed that primary production is relatively high in the Curonian Lagoon and is unlikely to be controlled by the organisms at the higher trophic levels of the food web. Analysis of the NPZD model scenarios with different nutrient inputs revealed that phosphorus is the main limiting nutrient for primary production in the Curonian Lagoon. However, different combinations of nitrogen and phosphorus inputs control the relative abundance of different phytoplankton groups. Investigation of the ecosystem's response to increased water temperature showed that warming ultimately decreases the phytoplankton available to the upper levels of the food web. DOI: 10.15181/csat.v3i1.1093
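
    A minimal NPZD core of the kind linked here can be written in a few lines. The Python sketch below integrates one standard formulation (Michaelis-Menten nutrient uptake, linear grazing and remineralization) in a closed system; the functional forms and parameter values are generic textbook choices, not the calibrated Curonian Lagoon model.

      import numpy as np
      from scipy.integrate import solve_ivp

      # N: nutrient, P: phytoplankton, Z: zooplankton, D: detritus (mmol m-3).
      vmax, kn = 1.0, 0.5            # max uptake rate (d-1), half-saturation
      g, beta = 0.6, 0.6             # grazing rate (d-1), assimilation efficiency
      mp, mz, rem = 0.05, 0.1, 0.1   # mortalities and remineralization (d-1)

      def rhs(t, y):
          N, P, Z, D = np.maximum(y, 0.0)
          uptake = vmax * N / (kn + N) * P
          grazing = g * P * Z
          dN = rem * D - uptake
          dP = uptake - grazing - mp * P
          dZ = beta * grazing - mz * Z
          dD = (1 - beta) * grazing + mp * P + mz * Z - rem * D
          return [dN, dP, dZ, dD]   # sums to zero: total N is conserved

      sol = solve_ivp(rhs, (0, 120), [4.0, 0.2, 0.05, 0.0], max_step=0.5)
      print("day-120 state (N, P, Z, D):", sol.y[:, -1].round(3))

    In a bottom-up linkage of the kind described above, outputs such as phytoplankton production from this layer would be passed as forcing to the slower trophic network model rather than fed back.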

  4. Optical mass memory system (AMM-13). AMM/DBMS interface control document

    Science.gov (United States)

    Bailey, G. A.

    1980-01-01

    The baseline for the external interfaces of a 10^13-bit optical archival mass memory system (AMM-13) is established. The types of interfaces addressed include data transfer; the AMM-13, Data Base Management System, and NASA End-to-End Data System computer interconnect; data/control input and output interfaces; test input data source; file management; and facilities interfaces.

  5. Portable air quality sensor unit for participatory monitoring: an end-to-end VESNA-AQ based prototype

    Science.gov (United States)

    Vucnik, Matevz; Robinson, Johanna; Smolnikar, Miha; Kocman, David; Horvat, Milena; Mohorcic, Mihael

    2015-04-01

    Key words: portable air quality sensor, CITI-SENSE, participatory monitoring, VESNA-AQ. The emergence of low-cost, easy-to-use portable air quality sensor units is opening new possibilities for individuals to assess their exposure to air pollutants at a specific place and time, and to share this information through an Internet connection. Such portable sensor units are being used in an ongoing citizen science project called CITI-SENSE, which enables citizens to measure and share the data. Through the creation of citizens' observatories, the project aims to empower citizens to contribute to and participate in environmental governance, enabling them to support and influence community and societal priorities as well as associated decision making. An air quality measurement system based on the VESNA sensor platform was primarily designed within the project for use as a portable sensor unit in selected pilot cities (Belgrade, Ljubljana and Vienna) for monitoring outdoor exposure to pollutants. However, functionally the same unit with a different set of sensors could be used, for example, as an indoor platform. The version designed for the pilot studies was equipped with the following sensors: NO2, O3, CO, temperature, relative humidity, pressure and accelerometer. The personal sensor unit is battery powered and housed in a plastic box. The VESNA-based air quality (AQ) monitoring system comprises the VESNA-AQ portable sensor unit, a smartphone app and the remote server. The personal sensor unit supports wireless connection to an Android smartphone via built-in Wi-Fi. The smartphone in turn serves also as the communication gateway towards the remote server, using any of the available data connections. Besides the gateway functionality, the role of the smartphone is to enrich data coming from the personal sensor unit with the GPS location, timestamps and user-defined context. This, together with an accelerometer, enables the user to better estimate one's exposure in relation to physical activities, time and location. The end user can monitor the measured parameters through a smartphone application. The smartphone app implements a custom-developed LCSP (Lightweight Client Server Protocol) protocol, which is used to send requests to the VESNA-AQ unit and to exchange information. When the data is obtained from the VESNA-AQ unit, the mobile application visualizes the data. It also has an option to forward the data to the remote server in a custom JSON structure over an HTTP POST request. The server stores the data in the database and in parallel translates the data to WFS and forwards it to the main CITI-SENSE platform over WFS-T in a common XML format over an HTTP POST request. From there, data can be accessed through the Internet and visualised in different forms and web applications developed by the CITI-SENSE project. In the course of the project, the collected data will be made publicly available, enabling the citizens to participate in environmental governance. Acknowledgements: CITI-SENSE is a Collaborative Project partly funded by the EU FP7-ENV-2012 under grant agreement no 308524 (www.citi-sense.eu).
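
    The gateway role described above (enriching unit data with GPS and timestamps, then forwarding JSON over HTTP POST) reduces to a few lines of client code. The Python sketch below mimics it with the requests library; the URL and payload fields are invented stand-ins, and the project's actual LCSP exchange with the sensor unit is not reproduced here.

      import time
      import requests  # third-party HTTP client (pip install requests)

      SERVER_URL = "https://example.org/citi-sense/observations"  # placeholder

      def forward_reading(no2_ppb, lat, lon):
          # Enrich the raw sensor value with location and a timestamp,
          # then forward it to the remote server as JSON over HTTP POST.
          payload = {
              "sensor": "NO2",
              "value_ppb": no2_ppb,
              "lat": lat,
              "lon": lon,
              "timestamp": int(time.time()),
          }
          resp = requests.post(SERVER_URL, json=payload, timeout=10)
          resp.raise_for_status()

      forward_reading(41.5, 46.05, 14.51)   # illustrative Ljubljana fix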

  6. An Anthological Review of Research Utilizing MontyLingua: a Python-Based End-to-End Text Processor

    Directory of Open Access Journals (Sweden)

    2008-06-01

    MontyLingua, an integral part of ConceptNet, which is currently the largest commonsense knowledge base, is an English text processor developed using the Python programming language at the MIT Media Lab. The main feature of MontyLingua is its coverage of all aspects of English text processing, from raw input text to semantic meanings and summary generation, yet each component in MontyLingua is loosely coupled to the others at the architectural and code level, which enables individual components to be used independently or substituted. However, there has been no review exploring the role of MontyLingua in recent research work utilizing it. This paper reviews the use of and roles played by MontyLingua and its components in research work published in 19 articles between October 2004 and August 2006. We observed a diversified use of MontyLingua in many different areas, both generic and domain-specific. Although use of the text summarizing component has not been observed, we are optimistic that it will have a crucial role in managing the current trend of information overload in future research.

  7. An integrated healthcare information system for end-to-end standardized exchange and homogeneous management of digital ECG formats.

    Science.gov (United States)

    Trigo, Jesús Daniel; Martínez, Ignacio; Alesanco, Alvaro; Kollmann, Alexander; Escayola, Javier; Hayn, Dieter; Schreier, Günter; García, José

    2012-07-01

    This paper investigates the application of the enterprise information system (EIS) paradigm to standardized cardiovascular condition monitoring. There are many specifications in cardiology, particularly in the ECG standardization arena. The existence of ECG formats, however, does not guarantee the implementation of homogeneous, standardized solutions for ECG management. In fact, hospital management services need to cope with various ECG formats and, moreover, several different visualization applications. This heterogeneity hampers the normalization of integrated, standardized healthcare information systems, hence the need for finding an appropriate combination of ECG formats and a suitable EIS-based software architecture that enables standardized exchange and homogeneous management of ECG formats. Determining such a combination is one objective of this paper. The second aim is to design and develop the integrated healthcare information system that satisfies the requirements posed by the previous determination. The ECG formats selected include ISO/IEEE11073, Standard Communications Protocol for Computer-Assisted Electrocardiography, and an ECG ontology. The EIS-enabling techniques and technologies selected include web services, simple object access protocol, extensible markup language, or business process execution language. Such a selection ensures the standardized exchange of ECGs within, or across, healthcare information systems while providing modularity and accessibility.

  8. Building the tree of life from scratch: an end-to-end work flow for phylogenomic studies

    Science.gov (United States)

    Whole genome sequences are rich sources of information about organisms that are superbly useful for addressing a wide variety of evolutionary questions. Recent progress in genomics has enabled the de novo decoding of the genome of virtually any organism, greatly expanding its potential for understan...

  9. Investigating end-to-end security in the fifth generation wireless capabilities and IoT extensions

    Science.gov (United States)

    Uher, J.; Harper, J.; Mennecke, R. G.; Patton, P.; Farroha, B.

    2016-05-01

    The emerging 5th generation wireless network will be architected and specified to meet the vision of allowing the billions of devices and millions of human users to share spectrum to communicate and deliver services. The expansion of wireless networks from its current role to serve these diverse communities of interest introduces new paradigms that require multi-tiered approaches. The introduction of inherently low security components, like IoT devices, necessitates that critical data be better secured to protect the networks and users. Moreover high-speed communications that are meant to enable the autonomous vehicles require ultra reliable and low latency paths. This research explores security within the proposed new architectures and the cross interconnection of the highly protected assets with low cost/low security components forming the overarching 5th generation wireless infrastructure.

  10. End-to-end encryption in on-line payment systems : The industry reluctance and the role of laws

    NARCIS (Netherlands)

    Kasiyanto, Safari

    2016-01-01

    Various security breaches at third-party payment processors show that online payment systems are the primary target for cyber-criminals. In general, the security of online payment systems relies on a number of factors, namely technical factors, processing factors, and legal factors. The industry

  11. End-to-end encryption in on-line payment systems: The industry reluctance and the role of laws

    OpenAIRE

    Kasiyanto, Safari

    2016-01-01

    Various security breaches at third-party payment processors show that online payment systems are the primary target for cyber-criminals. In general, the security of online payment systems relies on a number of factors, namely technical factors, processing factors, and legal factors. The industry gives its best endeavors to strengthen the technical and processing factors, while the government has been called upon to improve the legal factors. However, a breach of consumer's data and financial ...

  12. Themis-ml: A Fairness-aware Machine Learning Interface for End-to-end Discrimination Discovery and Mitigation

    OpenAIRE

    Bantilan, Niels

    2017-01-01

    As more industries integrate machine learning into socially sensitive decision processes like hiring, loan-approval, and parole-granting, we are at risk of perpetuating historical and contemporary socioeconomic disparities. This is a critical problem because on the one hand, organizations who use but do not understand the discriminatory potential of such systems will facilitate the widening of social disparities under the assumption that algorithms are categorically objective. On the other ha...

  13. End-To-End Solution for Integrated Workload and Data Management using glideinWMS and Globus Online

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Grid computing has enabled scientific communities to effectively share computing resources distributed over many independent sites. Several such communities, or Virtual Organizations (VOs), in the Open Science Grid and the European Grid Infrastructure use the glideinWMS system to run complex application workflows. GlideinWMS is a pilot-based workload management system (WMS) that creates an on-demand, dynamically-sized overlay Condor batch system on Grid resources. While the WMS addresses the management of compute resources, data management in the Grid is still the responsibility of the VO. In general, large VOs have resources to develop complex custom solutions, while small VOs would rather push this responsibility to the infrastructure. The latter requires a tight integration of the WMS and the data management layers, an approach still not common in modern Grids. In this paper we describe a solution developed to address this shortcoming in the context of Center for Enabling Distributed Petascale Scienc...

  14. SU-E-T-268: Proton Radiosurgery End-To-End Testing Using Lucy 3D QA Phantom

    Energy Technology Data Exchange (ETDEWEB)

    Choi, D; Gordon, I; Ghebremedhin, A; Wroe, A; Schulte, R; Bush, D; Slater, J; Patyal, B [Loma Linda UniversityMedical Center, Loma Linda, CA (United States)

    2014-06-01

    Purpose: To check the overall accuracy of proton radiosurgery treatment delivery using ready-made circular collimator inserts and fixed-thickness compensating boluses. Methods: A Lucy 3D QA phantom (Standard Imaging Inc., WI, USA) loaded with Gafchromic film was irradiated with laterally scattered and longitudinally spread-out 126.8 MeV proton beams. The tests followed every step in the proton radiosurgery treatment delivery process: CT scan (GE Lightspeed VCT), target contouring, treatment planning (Odyssey 5.0, Optivus, CA), portal calibration, target localization using a robotic couch with image guidance, and dose delivery at planned gantry angles. A 2 cm diameter collimator insert in a 4 cm diameter radiosurgery cone and a 1.2 cm thick compensating flat bolus were used for all beams. Film dosimetry (RIT114 v5.0, Radiological Imaging Technology, CO, USA) was used to evaluate the accuracy of target localization and relative dose distributions compared to those calculated by the treatment planning system. Results: The localization accuracy was estimated by analyzing the Gafchromic films irradiated at gantry angles of 0, 90 and 270 degrees. We observed a 0.5 mm shift in the lateral direction (patient left), a ±0.9 mm shift in the AP direction and a ±1.0 mm shift in the vertical direction (gantry dependent). The isodose overlays showed good agreement (<2 mm at the 50% isodose lines) between measured and calculated doses. Conclusion: Localization accuracy depends on gantry sag, CT resolution and distortion, DRRs from the treatment planning computer, the localization accuracy of the image guidance system, and the fabrication of the ready-made aperture and cone housing. The total deviation from the isocenter was 1.4 mm. Dose distribution uncertainty comes from distal end error due to bolus and CT density, in addition to localization error. The planned dose distribution matched the measured values well (>90% with 2%/2 mm criteria). Our test showed the robustness of our proton radiosurgery treatment delivery system using ready-made collimator inserts and fixed-thickness compensating boluses.

  15. Bridging automatic speech recognition and psycholinguistics: Extending Shortlist to an end-to-end model of human speech recognition (L)

    Science.gov (United States)

    Scharenborg, Odette; ten Bosch, Louis; Boves, Lou; Norris, Dennis

    2003-12-01

    This letter evaluates potential benefits of combining human speech recognition (HSR) and automatic speech recognition by building a joint model of an automatic phone recognizer (APR) and a computational model of HSR, viz., Shortlist [Norris, Cognition 52, 189-234 (1994)]. Experiments based on "real-life" speech highlight critical limitations posed by some of the simplifying assumptions made in models of human speech recognition. These limitations could be overcome by avoiding hard phone decisions at the output side of the APR, and by using a match between the input and the internal lexicon that flexibly copes with deviations from canonical phonemic representations.

  16. An end-to-end model of the Earth Radiation Budget Experiment (ERBE) Earth-viewing nonscanning radiometric channels

    OpenAIRE

    Priestly, Kory James

    1993-01-01

    The Earth Radiation Budget Experiment (ERBE) active-cavity radiometers are used to measure the incoming solar, reflected solar, and emitted longwave radiation from the Earth and its atmosphere. The radiometers are carried by the National Aeronautics and Space Administration's Earth Radiation Budget Satellite (ERBS) and the National Oceanic and Atmospheric Administration's NOAA-9 and NOAA-10 spacecraft. Four Earth-viewing nonscanning active-cavity radiometers are carried by e...

  17. Bridging Automatic Speech Recognition and Psycholinguistics: Extending Shortlist to an End-to-End Model of Human Speech Recognition

    NARCIS (Netherlands)

    Scharenborg, O.E.; Bosch, L.F.M. ten; Boves, L.W.J.; Norris, D.

    2003-01-01

    This letter evaluates potential benefits of combining human speech recognition (HSR) and automatic speech recognition by building a joint model of an automatic phone recognizer (APR) and a computational model of HSR, viz. Shortlist (Norris, 1994). Experiments based on 'real-life' speech highlight

  18. Food-web dynamics and isotopic niches in deep-sea communities residing in a submarine canyon and on the adjacent open slopes

    Science.gov (United States)

    Demopoulos, Amanda W.J.; McClain-Counts, Jennifer; Ross, Steve W.; Brooke, Sandra; Mienis, Furu

    2017-01-01

    Examination of food webs and trophic niches provides insights into organisms' functional ecology, yet few studies have examined trophodynamics within submarine canyons, where the interaction of canyon morphology and oceanography influences habitat provision and food deposition. Using stable isotope analysis and Bayesian ellipses, we documented deep-sea food-web structure and trophic niches in Baltimore Canyon and the adjacent open slopes in the US Mid-Atlantic Region. Results revealed isotopically diverse feeding groups, comprising approximately 5 trophic levels. Regression analysis indicated that consumer isotope data are structured by habitat (canyon vs. slope), feeding group, and depth. Benthic feeders were enriched in 13C and 15N relative to suspension feeders, consistent with consuming older, more refractory organic matter. In contrast, canyon suspension feeders had the largest and most distinct isotopic niche, indicating they consume an isotopically discrete food source, possibly fresher organic material. The wider isotopic niche observed for canyon consumers indicated the presence of feeding specialists and generalists. High dispersion in δ13C values for canyon consumers suggests that the isotopic composition of particulate organic matter changes, which is linked to depositional dynamics, resulting in discrete zones of organic matter accumulation or resuspension. Heterogeneity in habitat and food availability likely enhances trophic diversity in canyons. Given their abundance in the world's oceans, our results from Baltimore Canyon suggest that submarine canyons may represent important havens for trophic diversity.
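
    A minimal sketch of the standard-ellipse idea behind such isotopic niche metrics: the niche area follows from the eigenvalues of the δ13C/δ15N covariance matrix. This mirrors the common SIBER-style computation in spirit only; the data, the function name standard_ellipse_area, and the use of a plain (non-Bayesian) sample covariance are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def standard_ellipse_area(d13c, d15n):
    """Standard ellipse area (SEA) of a bivariate isotope sample.

    SEA = pi * a * b, where a and b are the semi-axes given by the
    square roots of the eigenvalues of the sample covariance matrix.
    """
    cov = np.cov(np.vstack([d13c, d15n]))  # 2x2 covariance of (d13C, d15N)
    eigvals = np.linalg.eigvalsh(cov)      # real eigenvalues, ascending
    return np.pi * np.sqrt(eigvals[0]) * np.sqrt(eigvals[1])

# Example with made-up values for two feeding groups
rng = np.random.default_rng(0)
canyon = standard_ellipse_area(rng.normal(-18, 1.5, 30), rng.normal(12, 1.0, 30))
slope = standard_ellipse_area(rng.normal(-17, 0.6, 30), rng.normal(11, 0.8, 30))
print(f"SEA canyon: {canyon:.2f} per-mil^2, slope: {slope:.2f} per-mil^2")
```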

  19. Fish tissue lipid-C:N relationships for correcting δ(13) C values and estimating lipid content in aquatic food-web studies.

    Science.gov (United States)

    Hoffman, Joel C; Sierszen, Michael E; Cotter, Anne M

    2015-11-15

    Normalizing δ(13) C values of animal tissue for lipid content is necessary to accurately interpret food-web relationships from stable isotope analysis. To reduce the effort and expense associated with chemical extraction of lipids, various studies have tested arithmetic mass balance to mathematically normalize δ(13) C values for lipid content; however, the approach assumes that lipid content is related to the tissue C:N ratio. We evaluated two commonly used models for estimating tissue lipid content based on C:N ratio (a mass balance model and a stoichiometric model) by comparing model predictions to the measured lipid content of white muscle tissue. We then determined the effect of lipid model choice on δ(13) C values normalized using arithmetic mass balance. To do so, we used a collection of fish from Lake Superior spanning a wide range in lipid content (5% to 73% lipid). We found that the lipid content was positively related to the bulk muscle tissue C:N ratio. The two different lipid models produced similar estimates of lipid content based on tissue C:N, within 6% for tissue C:N values 1.0‰. Published in 2015. This article is a U.S. Government work and is in the public domain in the U.S.A.
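
    The arithmetic mass-balance normalization evaluated here can be sketched as follows: estimate the lipid fraction from the bulk C:N ratio, then shift the bulk δ13C by the lipid contribution. All parameter values (a protein C:N of 3.0, a −6‰ lipid offset) and the simplification of weighting the balance by tissue fractions rather than carbon fractions are illustrative assumptions, not the fitted values from the study.

```python
def lipid_fraction_from_cn(cn_bulk, cn_protein=3.0):
    """Rough stoichiometric estimate of tissue lipid fraction from bulk C:N.

    Assumes lipid-free (protein) tissue has a fixed C:N and that lipid
    adds carbon but no nitrogen; cn_protein = 3.0 is an assumed value.
    """
    return max(0.0, 1.0 - cn_protein / cn_bulk)

def normalize_d13c(d13c_bulk, cn_bulk, delta_lipid=-6.0):
    """Arithmetic mass-balance correction of bulk d13C for lipid content.

    delta_lipid is the assumed d13C offset of lipid relative to lipid-free
    tissue (lipids are typically several per-mil lighter). Mass balance:
    bulk = L * (lipid_free + delta_lipid) + (1 - L) * lipid_free,
    so lipid_free = bulk - L * delta_lipid.
    """
    lipid = lipid_fraction_from_cn(cn_bulk)
    return d13c_bulk - lipid * delta_lipid

# A lipid-rich sample is corrected toward a heavier (less negative) d13C
print(normalize_d13c(d13c_bulk=-24.5, cn_bulk=6.2))
```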

  20. Evidence for a (15)N positive excursion in terrestrial foodwebs at the Middle to Upper Palaeolithic transition in south-western France: Implications for early modern human palaeodiet and palaeoenvironment.

    Science.gov (United States)

    Bocherens, Hervé; Drucker, Dorothée G; Madelaine, Stéphane

    2014-04-01

    The Middle to Upper Palaeolithic transition around 35,000 years ago coincides with the replacement of Neanderthals by anatomically modern humans in Europe. Several hypotheses have been suggested to explain this replacement, one of them being the ability of anatomically modern humans to broaden their dietary spectrum beyond the large ungulate prey that Neanderthals consumed exclusively. This scenario is notably based on higher nitrogen-15 amounts in early Upper Palaeolithic anatomically modern human bone collagen compared with late Neanderthals. In this paper, we document a clear increase of nitrogen-15 in bone collagen of terrestrial herbivores during the early Aurignacian associated with anatomically modern humans compared with the stratigraphically older Châtelperronian and late Mousterian fauna associated with Neanderthals. Carnivores such as wolves also exhibit a significant increase in nitrogen-15, which is similar to that documented for early anatomically modern humans compared with Neanderthals in Europe. A shift in nitrogen-15 at the base of the terrestrial foodweb is responsible for such a pattern, with a preserved foodweb structure before and after the Middle to Upper Palaeolithic transition in south-western France. Such an isotopic shift in the terrestrial ecosystem may be due to an increase in aridity during the time of deposition of the early Aurignacian layers. If it occurred across Europe, such a shift in nitrogen-15 in terrestrial foodwebs would be enough to explain the observed isotopic trend between late Neanderthals and early anatomically modern humans, without any significant change in the diet composition at the Middle to Upper Palaeolithic transition. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Food-web and ecosystem structure of the open-ocean and deep-sea environments of the Azores, NE Atlantic

    Directory of Open Access Journals (Sweden)

    Telmo Morato

    2016-12-01

    Full Text Available The Marine Strategy Framework Directive intends to adopt ecosystem-based management for resources, biodiversity and habitats that puts emphasis on maintaining the health of the ecosystem alongside appropriate human use of the marine environment, for the benefit of current and future generations. Within the overall framework of ecosystem-based management, ecosystem models are tools to evaluate and gain insights into ecosystem properties. The low data availability and complexity of modelling deep-water ecosystems has limited the application of ecosystem models to a few deep-water ecosystems. Here, we aim to develop an ecosystem model for the deep sea and open ocean in the Azores exclusive economic zone with the overarching objective of characterising the food-web and ecosystem structure of the ecosystem. An ecosystem model with 45 functional groups, including a detritus group, two primary producer groups, eight invertebrate groups, 29 fish groups, three marine mammal groups, a turtle and a seabird group was built. Overall data quality measured by the pedigree index was estimated to be higher than the mean value of all published models. Therefore, the model was built with source data of an overall reasonable quality, especially considering the normally low data availability for deep-sea ecosystems. The total biomass (excluding detritus) of the modelled ecosystem for the whole area was calculated as 24.7 t km⁻². The mean trophic level for the total marine catch of the Azores was estimated to be 3.95, similar to the trophic level of the bathypelagic and medium-size pelagic fish. Trophic levels for the different functional groups were estimated to be similar to those obtained with stable isotopes and stomach contents analyses, with some exceptions on both ends of the trophic spectra. Omnivory indices were in general low, indicating prey specialization for the majority of the groups. Cephalopods, pelagic sharks and toothed whales were identified as groups with
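
    For illustration, fractional trophic levels in Ecopath-type models of this kind are obtained by solving a small linear system over the diet-composition matrix, TL = 1 + DC · TL. A toy sketch with an assumed four-group diet matrix (not the 45-group Azores model):

```python
import numpy as np

# Toy diet-composition matrix DC[i, j]: fraction of consumer i's diet
# made up of group j. Rows of pure producers/detritus are zero, which
# fixes their trophic level at 1; consumer rows sum to 1.
groups = ["detritus", "phytoplankton", "zooplankton", "fish"]
DC = np.array([
    [0.0, 0.0, 0.0, 0.0],   # detritus
    [0.0, 0.0, 0.0, 0.0],   # phytoplankton
    [0.2, 0.8, 0.0, 0.0],   # zooplankton
    [0.0, 0.1, 0.9, 0.0],   # fish
])

# Ecopath-style fractional trophic levels: TL = 1 + DC @ TL,
# rearranged to the linear system (I - DC) TL = 1.
tl = np.linalg.solve(np.eye(len(groups)) - DC, np.ones(len(groups)))
for g, t in zip(groups, tl):
    print(f"{g}: TL = {t:.2f}")   # detritus 1.00 ... fish 2.90
```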

  2. An Efficient Congestion Control Protocol for Wired/Wireless Networks

    OpenAIRE

    Hanaa Torkey; Gamal ATTIYA; Ahmed Abdel Nabi

    2014-01-01

    Recently, a wide spectrum of heterogeneous wireless access networks integrate with high speed wired networks to deliver Internet services. End-to-end service delivery with satisfactory quality is a challenging issue in such network architectures. Although the Internet transport control protocol (TCP) addresses such challenges, it has poor performance with high speed wired networks (i.e. high bandwidth-delay product). Moreover, it behaves badly with wireless access networks (i.e. misinterpretation ...

  3. Combined Admission Control and Scheduling for QoS Differentiation in LTE Uplink

    DEFF Research Database (Denmark)

    Anas, Mohmmad; Rosa, Claudio; Calabrese, Francesco Davide

    2008-01-01

    Long term evolution (LTE) architecture shall support end-to-end quality of service (QoS). For QoS support and service differentiation it is important that the admission control and packet scheduling functionalities are QoS-aware. In this paper a combined admission control and a decoupled time-frequency domain scheduling framework for LTE uplink is presented. The proposed framework is shown to effectively differentiate QoS user classes in a mixed traffic scenario.
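
    A minimal sketch of the QoS-aware admission decision such a framework implies: admit a new guaranteed-bit-rate (GBR) bearer only if all committed rates still fit within the cell capacity minus a reserve for non-GBR traffic. The capacity figure, reserve factor, and function name are illustrative assumptions, not the paper's algorithm.

```python
def admit(new_gbr_kbps, active_gbrs, cell_capacity_kbps=10000, reserve=0.1):
    """QoS-aware admission control (sketch): admit a new GBR bearer only
    if the sum of guaranteed bit rates fits within the cell capacity
    minus a reserve kept for non-GBR (best-effort) traffic."""
    committed = sum(active_gbrs) + new_gbr_kbps
    return committed <= (1.0 - reserve) * cell_capacity_kbps

active = [1500, 2000, 3000]    # kbit/s already admitted
print(admit(2000, active))     # True: 8500 <= 9000
print(admit(3000, active))     # False: 9500 > 9000
```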

  4. Estimation of RTT and bandwidth for congestion Control Applications in Communication Networks

    OpenAIRE

    Jacobsson, Krister; Hjalmarsson, Håkan; Möller, Niels; Johansson, Karl Henrik

    2004-01-01

    Heterogeneous communication networks, with their variety of application demands, uncertain time-varying traffic load, and mixture of wired and wireless links, pose several challenging problems in modeling and control. In this paper we focus on the round-trip time (RTT), which is a particularly important variable for efficient end-to-end congestion control, and on bandwidth estimation. Based on a simple aggregated model of the network, an algorithm combining a Kalman filter and a change detection...
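
    A rough sketch of the combination the abstract describes, under assumed noise parameters: a scalar Kalman filter smooths the RTT samples while a CUSUM-style detector watches the filter innovations for abrupt changes in network conditions. This is not the authors' algorithm; the model, thresholds, and data are illustrative.

```python
def kalman_rtt(measurements, q=1e-4, r=4.0):
    """Scalar Kalman filter for smoothed RTT (random-walk state model).

    q: process noise variance, r: measurement noise variance (assumed).
    Yields (estimate, innovation) per sample after the first.
    """
    x, p = measurements[0], 1.0
    for z in measurements[1:]:
        p += q                      # predict: state assumed nearly constant
        k = p / (p + r)             # Kalman gain
        innov = z - x
        x += k * innov              # measurement update
        p *= (1 - k)
        yield x, innov

def cusum(innovations, drift=0.5, threshold=8.0):
    """Yield indices where the innovation sequence drifts upward."""
    s = 0.0
    for i, e in enumerate(innovations):
        s = max(0.0, s + e - drift)
        if s > threshold:
            yield i
            s = 0.0

rtts = [10, 11, 10, 12, 11, 10, 25, 26, 27, 26, 25]   # ms, with a step change
ests, innovs = zip(*kalman_rtt(rtts))
print([round(e, 1) for e in ests])
print(list(cusum(innovs)))   # the detector fires around the step
```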

  5. An End-to-End DNA Taxonomy Methodology for Benthic Biodiversity Survey in the Clarion-Clipperton Zone, Central Pacific Abyss

    Directory of Open Access Journals (Sweden)

    Adrian G. Glover

    2015-12-01

    Full Text Available Recent years have seen increased survey and sampling expeditions to the Clarion-Clipperton Zone (CCZ), central Pacific Ocean abyss, driven by commercial interests from contractors in the potential extraction of polymetallic nodules in the region. Part of the International Seabed Authority (ISA) regulatory requirements is that these contractors undertake environmental research expeditions to their CCZ exploration claims following guidelines approved by the ISA Legal and Technical Commission (ISA, 2010). Section 9(e) of these guidelines instructs contractors to “…collect data on the sea floor communities specifically relating to megafauna, macrofauna, meiofauna, microfauna, nodule fauna and demersal scavengers”. There are a number of methodological challenges to this, including the water depth (4000–5000 m), extremely warm surface waters (~28 °C) compared to bottom water (~1.5 °C), and great distances to ports requiring a large and long seagoing expedition with only a limited number of scientists. Both scientists and regulators have recently realized that a major gap in our knowledge of the region is the fundamental taxonomy of the animals that live there; this is essential to inform our knowledge of the biogeography, natural history and ultimately our stewardship of the region. Recognising this, the ISA is currently sponsoring a series of taxonomic workshops on the CCZ fauna, and to assist in this process we present here a series of methodological pipelines for DNA taxonomy (incorporating both molecular and morphological data) of the macrofauna and megafauna from the CCZ benthic habitat in the recent ABYSSLINE cruise program to the UK-1 exploration claim. A major problem on recent CCZ cruises has been the collection of high-quality samples suitable for both morphology and DNA taxonomy, coupled with a workflow that ensures these data are made available. The DNA sequencing techniques themselves are relatively standard, once good samples have been obtained. The key to quality taxonomic work on macrofaunal animals from the tropical abyss is careful extraction of the animals (in cold, filtered seawater), microscopic observation and preservation of live specimens, from a variety of sampling devices by experienced zoologists at sea. Essential to the long-term iterative building of taxonomic knowledge from the CCZ is an “end-to-end” methodology to the taxonomic science that takes into account careful sampling design, at-sea taxonomic identification and fixation, post-cruise laboratory work with both DNA and morphology and finally a careful sample and data management pipeline that results in specimens and data in accessible open museum collections and online repositories.

  6. Perceptual Objective Listening Quality Assessment (POLQA), The Third Generation ITU-T Standard for End-to-End Speech Quality Measurement Part I—Temporal Alignment

    NARCIS (Netherlands)

    Beerends, J.G.; Schmidmer, C.; Berger, J.; Obermann, M.; Ullman, R.; Pomy, J.; Keyhl, M.

    2013-01-01

    In this and the companion paper Part II, the authors present the Perceptual Objective Listening Quality Assessment (POLQA), the third-generation speech quality measurement algorithm, standardized by the International Telecommunication Union in 2011 as Recommendation P.863. In contrast to the

  7. Perceptual Objective Listening Quality Assessment (POLQA), The Third Generation ITU-T Standard for End-to-End Speech Quality Measurement : Part II – Perceptual Model

    NARCIS (Netherlands)

    Beerends, J.G.; Schmidmer, C.; Berger, J.; Obermann, M.; Ullman, R.; Pomy, J.; Keyhl, M.

    2013-01-01

    In this and the companion paper Part I, the authors present the Perceptual Objective Listening Quality Assessment (POLQA), the third-generation speech quality measurement algorithm, standardized by the International Telecommunication Union in 2011 as Recommendation P.863. This paper describes the

  8. FROM UAS DATA ACQUISITION TO ACTIONABLE INFORMATION – HOW AN END-TO-END SOLUTION HELPS OIL PALM PLANTATION OPERATORS TO PERFORM A MORE SUSTAINABLE PLANTATION MANAGEMENT

    Directory of Open Access Journals (Sweden)

    C. Hoffmann

    2016-06-01

    The research results describe how operators can successfully make use of a UAS-based solution together with the developed software solution to improve their efficiency in oil palm plantation management.

  9. From Ambient Sensing to IoT-based Context Computing: An Open Framework for End to End QoC Management.

    Science.gov (United States)

    Marie, Pierrick; Desprats, Thierry; Chabridon, Sophie; Sibilla, Michelle; Taconet, Chantal

    2015-06-16

    Quality of Context (QoC) awareness is recognized as a key point for the success of context-aware computing. At the time where the combination of the Internet of Things, Cloud Computing, and Ambient Intelligence paradigms offer together new opportunities for managing richer context data, the next generation of Distributed Context Managers (DCM) is facing new challenges concerning QoC management. This paper presents our model-driven QoCIM framework. QoCIM is the acronym for Quality of Context Information Model. We show how it can help application developers to manage the whole QoC life-cycle by providing genericity, openness and uniformity. Its usages are illustrated, both at design time and at runtime, in the case of an urban pollution context- and QoC-aware scenario.

  10. SU-E-J-55: End-To-End Effectiveness Analysis of 3D Surface Image Guided Voluntary Breath-Holding Radiotherapy for Left Breast

    Energy Technology Data Exchange (ETDEWEB)

    Lin, M; Feigenberg, S [University of Maryland School of Medicine, Baltimore, MD (United States)

    2015-06-15

    Purpose: To evaluate the effectiveness of using 3D-surface imaging to guide breath-holding (BH) left-side breast treatment. Methods: Two 3D-surface-image-guided BH procedures were implemented and evaluated: normal-BH, taking BH at a comfortable level, and deep-inspiration breath-holding (DIBH). A total of 20 patients (10 normal-BH and 10 DIBH) were recruited. Patients received a BH evaluation using a commercial 3D-surface-tracking system (VisionRT, London, UK) to quantify the reproducibility of BH positions prior to CT scan. Tangential 3D/IMRT plans were conducted. Patients were initially set up under free-breathing (FB) conditions using the FB surface obtained from the untagged CT to ensure a correct patient position. Patients were then guided to reach the planned BH position using the BH surface obtained from the BH CT. Action levels were set at each phase of the treatment process based on the information provided by the 3D-surface-tracking system for proper interventions (eliminate/re-setup/re-coaching). We reviewed the frequency of interventions to evaluate its effectiveness. The FB-CBCT and port film were utilized to evaluate the accuracy of 3D-surface-guided setups. Results: 25% of BH candidates with BH positioning uncertainty > 2 mm were eliminated prior to CT scan. For >90% of fractions, based on the setup deltas from the 3D-surface-tracking system, adjustments of patient setup were needed after the initial laser-based setup. 3D-surface-guided setup accuracy is comparable to CBCT. For the BH guidance, the frequency of interventions (re-coaching/re-setup) is 40% (normal-BH) / 91% (DIBH) of treatments for the first 5 fractions and then drops to 16% (normal-BH) / 46% (DIBH). The necessity of re-setup is highly patient-specific for normal-BH but highly random among patients for DIBH. Overall, a −0.8±2.4 mm accuracy of the anterior pericardial shadow position was achieved. Conclusion: 3D-surface-image technology provides effective intervention to the treatment process and ensures favorable day-to-day setup accuracy. DIBH setup appears to be more uncertain, and this is the patient group that will benefit most from the extra information provided by 3D surface setup.

  11. Web-based bioinformatics workflows for end-to-end RNA-seq data computation and analysis in agricultural animal species

    Science.gov (United States)

    Remarkable advances in next-generation sequencing (NGS) technologies, bioinformatics algorithms, and computational technologies have significantly accelerated genomic research. However, complicated NGS data analysis still remains as a major bottleneck. RNA-seq, as one of the major area in the NGS fi...

  12. Ex vivo proof-of-concept of end-to-end scaffold-enhanced laser-assisted vascular anastomosis of porcine arteries

    NARCIS (Netherlands)

    Pabittei, Dara R.; Heger, Michal; van Tuijl, Sjoerd; Simonet, Marc; de Boon, Wadim; van der Wal, Allard C.; Balm, Ron; de Mol, Bas A.

    2015-01-01

    The low welding strength of laser-assisted vascular anastomosis (LAVA) has hampered the clinical application of LAVA as an alternative to suture anastomosis. To improve welding strength, LAVA in combination with solder and polymeric scaffolds (ssLAVA) has been optimized in vitro. Currently, ssLAVA

  13. Optimizing end-to-end system performance for millimeter and submillimeter spectroscopy of protostars : wideband heterodyne receivers and sideband-deconvolution techniques for rapid molecular-line surveys

    Science.gov (United States)

    Sumner, Matthew Casey

    This thesis describes the construction, integration, and use of a new 230-GHz ultra-wideband heterodyne receiver, as well as the development and testing of a new sideband-deconvolution algorithm, both designed to enable rapid, sensitive molecular-line surveys. The 230-GHz receiver, known as Z-Rex, is the first of a new generation of wideband receivers to be installed at the Caltech Submillimeter Observatory (CSO). Intended as a proof-of-concept device, it boasts an ultra-wide IF output range of ~6–18 GHz, offering as much as a twelvefold increase in the spectral coverage that can be achieved with a single LO setting. A similarly wideband IF system has been designed to couple this receiver to an array of WASP2 spectrometers, allowing the full bandwidth of the receiver to be observed at low resolution, ideal for extra-galactic redshift surveys. A separate IF system feeds a high-resolution 4-GHz AOS array frequently used for performing unbiased line surveys of galactic objects, particularly star-forming regions. The design and construction of the wideband IF system are presented, as is the work done to integrate the receiver and the high-resolution spectrometers into a working system. The receiver is currently installed at the CSO where it is available for astronomers' use. In addition to demonstrating wideband design principles, the receiver also serves as a testbed for a synthesizer-driven, active LO chain that is under consideration for future receiver designs. Several lessons have been learned, including the importance of driving the final amplifier of the LO chain into saturation and the absolute necessity of including a high-Q filter to remove spurious signals from the synthesizer output. The on-telescope performance of the synthesizer-driven LO chain is compared to that of the Gunn-oscillator units currently in use at the CSO. Although the frequency agility of the synthesized LO chain gives it a significant advantage for unbiased line surveys, the cleaner signal and broader tuning range of the Gunn continue to make it the preferred choice. The receiver and high-resolution spectrometer system were brought into a fully operational state late in 2007, when they were used to perform unbiased molecular-line surveys of several galactic sources, including the Orion KL hot core and a position in the L1157 outflow. In order to analyze these data, a new data pipeline was needed to deconvolve the double-sideband signals from the receiver and to model the molecular spectra. A highly automated sideband-deconvolution system has been created, and spectral-analysis tools are currently being developed. The sideband deconvolution relies on chi-square minimization to determine the optimal single-sideband spectrum in the presence of unknown sideband-gain imbalances and spectral baselines. Analytic results are presented for several different methods of approaching the problem, including direct optimization, nonlinear root finding, and a hybrid approach that utilizes a two-stage process to separate out the relatively weak nonlinearities so that the majority of the parameters can be found with a fast linear solver. Analytic derivations of the Jacobian matrices for all three cases are presented, along with a new Mathematica utility that enables the calculation of arbitrary gradients. The direct-optimization method has been incorporated into software, along with a spectral simulation engine that allows different deconvolution scenarios to be tested.
The software has been validated through the deconvolution of simulated data sets, and initial results from L1157 and Orion are presented. Both surveys demonstrate the power of the wideband receivers and improved data pipeline to enable exciting scientific studies. The L1157 survey was completed in only 20 hours of telescope time and offers moderate sensitivity over a > 50-GHz range, from 220 GHz to approximately 270 or 280 GHz. The speed with which this survey was completed implies that the new systems will permit unbiased line surveys to become a standard observational tool. The Orion survey is expected to offer ~30 mK sensitivity over a similar frequency range, improving previous results by an order of magnitude. The new receiver's ability to cover such broad bandwidths permits very deep surveys to be completed in a reasonable time, and the sideband-deconvolution algorithm is capable of preserving these low noise levels. Combined, these tools can provide line spectra with the sensitivity required for constraining astrochemical models and investigating prebiotic molecules.
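
    The linear core of the sideband-deconvolution problem can be sketched as follows: each double-sideband channel is a gain-weighted sum of two single-sideband channels, so scans at several LO offsets yield an overdetermined system solvable by least squares. With the sideband gains held fixed and known the problem is exactly linear; the thesis's hybrid method additionally fits the gains and baselines nonlinearly. The channel counts, LO offsets, and gains below are toy assumptions.

```python
import numpy as np

# Toy setup: a single-sideband spectrum of n_chan channels observed
# through DSB scans. Each scan folds channel pairs together with
# sideband gains (gl, gu); several LO offsets make A @ s = d solvable.
n_chan, n_scan_chan = 40, 30
offsets = [0, 3, 6, 9]            # LO shifts in channels (assumed)
gl, gu = 0.55, 0.45               # sideband gains, assumed known here

rng = np.random.default_rng(1)
truth = np.exp(-0.5 * ((np.arange(n_chan) - 20) / 2.0) ** 2)  # one "line"

rows, data = [], []
for off in offsets:
    for i in range(n_scan_chan - max(offsets)):
        row = np.zeros(n_chan)
        row[i] += gl              # lower-sideband channel
        row[i + off + 5] += gu    # upper-sideband channel (5-ch separation)
        rows.append(row)
        data.append(row @ truth + rng.normal(0, 0.01))  # noisy DSB sample

A, d = np.array(rows), np.array(data)
s_hat, *_ = np.linalg.lstsq(A, d, rcond=None)  # least-squares SSB estimate
print(np.round(s_hat[15:25], 2))  # recovered line profile near channel 20
```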

  14. Ex vivo proof-of-concept of end-to-end scaffold-enhanced laser-assisted vascular anastomosis of porcine arteries.

    Science.gov (United States)

    Pabittei, Dara R; Heger, Michal; van Tuijl, Sjoerd; Simonet, Marc; de Boon, Wadim; van der Wal, Allard C; Balm, Ron; de Mol, Bas A

    2015-07-01

    The low welding strength of laser-assisted vascular anastomosis (LAVA) has hampered the clinical application of LAVA as an alternative to suture anastomosis. To improve welding strength, LAVA in combination with solder and polymeric scaffolds (ssLAVA) has been optimized in vitro. Currently, ssLAVA requires proof-of-concept in a physiologically representative ex vivo model before advancing to in vivo studies. This study therefore investigated the feasibility of ex vivo ssLAVA in medium-sized porcine arteries. Scaffolds composed of poly(ε-caprolactone) (PCL) or poly(lactic-co-glycolic acid) (PLGA) were impregnated with semisolid solder and placed over coapted aortic segments. ssLAVA was performed with a 670-nm diode laser. In the first substudy, the optimum number of laser spots was determined by bursting pressure analysis. The second substudy investigated the resilience of the welds in a Langendorff-type pulsatile pressure setup, monitoring the number of failed vessels. The type of failure (cohesive vs adhesive) was confirmed by electron microscopy, and thermal damage was assessed histologically. The third substudy compared breaking strength of aortic repairs made with PLGA and semisolid genipin solder (ssLAVR) to repairs made with BioGlue. ssLAVA with 11 lasing spots and PLGA scaffold yielded the highest bursting pressure (923 ± 56 mm Hg vs 703 ± 96 mm Hg with PCL ssLAVA; P = .0002) and exhibited the fewest failures (20% vs 70% for PCL ssLAVA; P = .0218). The two failed PLGA ssLAVA arteries leaked at 19 and 22 hours, whereas the seven failed PCL ssLAVA arteries burst between 12 and 23 hours. PLGA anastomoses broke adhesively, whereas PCL welds failed cohesively. Both modalities exhibited full-thickness thermal damage. Repairs with PLGA scaffold yielded higher breaking strength than BioGlue repairs (323 ± 28 N/cm(2) vs 25 ± 4 N/cm(2), respectively; P = .0003). PLGA ssLAVA yields greater anastomotic strength and fewer anastomotic failures than PCL ssLAVA. Aortic repairs with BioGlue were inferior to those produced with PLGA ssLAVR. The results demonstrate the feasibility of ssLAVA/R as an alternative method to suture anastomosis or tissue sealant. Further studies should focus on reducing thermal damage. Copyright © 2015 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.

  15. The CTTC 5G End-to-End Experimental Platform : Integrating Heterogeneous Wireless/Optical Networks, Distributed Cloud, and IoT Devices

    OpenAIRE

    Muñóz, Raul; Mangues-Bafalluy, Josep; Vilalta, Ricard; Verikoukis, Christos; Alonso-Zarate, Jesús; Bartzoudis, Nikolaos; Georgiadis, Apostolos; Payaró, Miquel; Pérez-Neira, Ana; Casellas, Ramon; Martínez, Ricardo; Núñez-Martínez, Jose; Manuel Requena Esteso, Manuel; Pubill, David; Font-Batch, Oriol

    2016-01-01

    The Internet of Things (IoT) will facilitate a wide variety of applications in different domains, such as smart cities, smart grids, industrial automation (Industry 4.0), smart driving, assistance of the elderly, and home automation. Billions of heterogeneous smart devices with different application requirements will be connected to the networks and will generate huge aggregated volumes of data that will be processed in distributed cloud infrastructures. On the other hand, there is also a gen...

  16. PICASSO: an end-to-end image simulation tool for space and airborne imaging systems II. Extension to the thermal infrared: equations and methods

    Science.gov (United States)

    Cota, Stephen A.; Lomheim, Terrence S.; Florio, Christopher J.; Harbold, Jeffrey M.; Muto, B. Michael; Schoolar, Richard B.; Wintz, Daniel T.; Keller, Robert A.

    2011-10-01

    In a previous paper in this series, we described how The Aerospace Corporation's Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) tool may be used to model space and airborne imaging systems operating in the visible to near-infrared (VISNIR). PICASSO is a systems-level tool, representative of a class of such tools used throughout the remote sensing community. It is capable of modeling systems over a wide range of fidelity, anywhere from conceptual design level (where it can serve as an integral part of the systems engineering process) to as-built hardware (where it can serve as part of the verification process). In the present paper, we extend the discussion of PICASSO to the modeling of Thermal Infrared (TIR) remote sensing systems, presenting the equations and methods necessary to modeling in that regime.

  17. Perspectives with the GCT end-to-end prototype of the small-sized telescope proposed for the Cherenkov telescope array

    Science.gov (United States)

    Costantini, H.; Dournaux, J.-L.; Ernenwein, J.-P.; Laporte, P.; Sol, H.

    2017-01-01

    In the framework of the Cherenkov Telescope Array (CTA), the GCT (Gamma-ray Cherenkov Telescope) team is building a dual-mirror telescope as one of the proposed prototypes for the CTA small size class of telescopes. The telescope is based on a Schwarzschild-Couder (SC) optical design, an innovative solution for ground-based Cherenkov astronomy, which allows a compact telescope structure, a lightweight large Field of View (FoV) camera and enables good angular resolution across the entire FoV. We review the different mechanical and optical components of the telescope. In order to characterise them, the Paris prototype will be operated for several weeks in 2016. In this framework, an estimate of the expected performance of this prototype has been made, based on Monte Carlo simulations. In particular, the observability of the Crab Nebula in the context of high Night Sky Background (NSB) is presented.

  18. An Intrinsic TE Approach for End-to-End QoS Provisioning in OBS Networks Using Static Load-Balanced Routing Strategies

    Directory of Open Access Journals (Sweden)

    Alvaro L. Barradas

    2010-10-01

    Full Text Available Optical burst switching provides a feasible paradigm for the next IP over optical backbones. However, its burst loss performance can be highly affected by burst contention. In this paper we discuss traffic engineering approaches for path selection with the objective to minimize contention using only topological information. The discussed strategies are based on balancing the traffic across the network in order to reduce congestion without incurring link-state protocol penalties. The routing strategies are evaluated by simulation on an optical burst switching model specifically developed for the purpose with OMNeT++. Results show that our strategies outperform the traditionally used shortest-path routing to an extent that depends on the network connectivity.
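
    One plausible reading of such a static, topology-only strategy, sketched under assumptions: enumerate the k shortest paths for each node pair and greedily pick the candidate whose busiest link currently carries the fewest paths. The use of networkx, the value of k, and the greedy order are illustrative choices, not the paper's exact algorithm.

```python
import itertools
import networkx as nx

def balanced_paths(g, k=3):
    """Pick one of the k shortest paths per node pair so that the most
    heavily shared link carries as few paths as possible (greedy)."""
    load = {tuple(sorted(e)): 0 for e in g.edges}
    chosen = {}
    for s, t in itertools.combinations(g.nodes, 2):
        candidates = list(itertools.islice(
            nx.shortest_simple_paths(g, s, t), k))
        # cost of a candidate = current load of its busiest link
        def cost(path):
            return max(load[tuple(sorted(e))] for e in zip(path, path[1:]))
        best = min(candidates, key=cost)
        for e in zip(best, best[1:]):
            load[tuple(sorted(e))] += 1
        chosen[(s, t)] = best
    return chosen, load

g = nx.cycle_graph(6)        # toy OBS topology: a ring...
g.add_edge(0, 3)             # ...plus one chord
paths, load = balanced_paths(g)
print(max(load.values()), "paths on the busiest link")
```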

  19. From Ambient Sensing to IoT-based Context Computing: An Open Framework for End to End QoC Management

    Directory of Open Access Journals (Sweden)

    Pierrick Marie

    2015-06-01

    Full Text Available Quality of Context (QoC awareness is recognized as a key point for the success of context-aware computing. At the time where the combination of the Internet of Things, Cloud Computing, and Ambient Intelligence paradigms offer together new opportunities for managing richer context data, the next generation of Distributed Context Managers (DCM is facing new challenges concerning QoC management. This paper presents our model-driven QoCIM framework. QoCIM is the acronym for Quality of Context Information Model. We show how it can help application developers to manage the whole QoC life-cycle by providing genericity, openness and uniformity. Its usages are illustrated, both at design time and at runtime, in the case of an urban pollution context- and QoC-aware scenario.

  20. FROM UAS DATA ACQUISITION TO ACTIONABLE INFORMATION – HOW AN END-TO-END SOLUTION HELPS OIL PALM PLANTATION OPERATORS TO PERFORM A MORE SUSTAINABLE PLANTATION MANAGEMENT

    Directory of Open Access Journals (Sweden)

    C. Hoffmann

    2016-06-01

    Full Text Available Palm oil represents the most efficient oilseed crop in the world, but the production of palm oil involves plantation operations in one of the most fragile environments: the tropical lowlands. Deforestation, the drying-out of swampy lowlands and chemical fertilizers lead to environmental problems that are putting pressure on this industry. Unmanned aircraft systems (UAS), together with the latest photogrammetric processing and image analysis capabilities, represent an emerging technology that was identified as suitable for optimizing oil palm plantation operations. This paper focuses on two key elements of a UAS-based oil palm monitoring system. The first is the accuracy of the acquired data that is necessary to achieve meaningful results in later analysis steps. High performance GNSS technology was utilized to achieve those accuracies while decreasing the demand for cost-intensive GCP measurements. The second key topic is the analysis of the resulting data in order to optimize plantation operations. By automatically extracting information on a block level as well as on a single-tree level, operators can utilize the developed application to increase their productivity. The research results describe how operators can successfully make use of a UAS-based solution together with the developed software solution to improve their efficiency in oil palm plantation management.

  1. SU-E-T-508: End to End Testing of a Prototype Eclipse Module for Planning Modulated Arc Therapy On the Siemens Platform

    Energy Technology Data Exchange (ETDEWEB)

    Huang, L [Huntsman Cancer Hospital, Salt Lake City, UT (United States); Sarkar, V [University of Utah Hospitals, Salt Lake City, UT (United States); Spiessens, S [Varian Medical Systems France, Buc Cedex (France); Rassiah-Szegedi, P; Huang, Y; Salter, B [University Utah, Salt Lake City, UT (United States); Zhao, H [University of Utah, Salt Lake City, UT (United States); Szegedi, M [Huntsman Cancer Hospital, The University of Utah, Salt Lake City, UT (United States)

    2014-06-01

    Purpose: The latest clinical implementation of the Siemens Artiste linac allows for delivery of modulated arcs (mARC) using full-field flattening-filter-free (FFF) photon beams. The maximum dose rate of 2000 MU/min is well suited for high dose treatments such as SBRT. We tested and report on the performance of a prototype Eclipse TPS module supporting mARC capability on the Artiste platform. Methods: Our spine SBRT patients originally treated with 12/13-field static-gantry IMRT (SGIMRT) were chosen for this study. These plans were designed to satisfy RTOG0631 guidelines with a prescription of 16 Gy in a single fraction. The cases were re-planned as mARC plans in the prototype Eclipse module using the 7MV FFF beam and required to satisfy RTOG0631 requirements. All plans were transferred from Eclipse, delivered on a Siemens Artiste linac and dose-validated using the Delta4 system. Results: All treatment plans were straightforwardly developed, in a timely fashion, without challenge or inefficiency using the prototype module. Due to the limited number of segments in a single arc, mARC plans required 2-3 full arcs to yield plan quality comparable to SGIMRT plans containing over 250 total segments. The average (3%/3mm) gamma pass rate for all arcs was 98.5±1.1%, demonstrating both excellent dose prediction by the AAA dose algorithm and excellent delivery fidelity. Mean delivery times for the mARC plans (10.5±1.7 min) were 50-70% lower than for the SGIMRT plans (26±2 min), with both delivered at 2000 MU/min. Conclusion: A prototype Eclipse module capable of planning for Burst Mode modulated arc delivery on the Artiste platform has been tested and found to perform efficiently and accurately for treatment plan development and delivered-dose prediction. Further investigation of more treatment sites is being carried out and data will be presented.
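
    For reference, the (3%/3mm) gamma pass rate quoted above derives from the standard gamma index; a simplified 1D, globally normalized version is sketched below. The clinical Delta4 comparison is 3D and vendor-implemented; the profiles here are made up for illustration.

```python
import numpy as np

def gamma_1d(dose_ref, dose_eval, x, dose_tol=0.03, dist_tol=3.0):
    """Global 1D gamma index: for each reference point, minimize the
    combined dose/distance metric over all evaluated points.
    dose_tol is a fraction of the reference maximum; dist_tol in mm."""
    d_max = dose_ref.max()
    gammas = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        dd = (dose_eval - di) / (dose_tol * d_max)   # dose difference term
        dx = (x - xi) / dist_tol                     # distance term
        gammas[i] = np.sqrt(dx**2 + dd**2).min()
    return gammas

x = np.linspace(0, 100, 201)                          # mm
ref = np.exp(-0.5 * ((x - 50) / 10) ** 2)             # planned profile
meas = np.exp(-0.5 * ((x - 50.7) / 10) ** 2) * 1.01   # shifted, rescaled
g = gamma_1d(ref, meas, x)
print(f"pass rate: {100 * (g <= 1).mean():.1f}%")     # points with gamma <= 1
```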

  2. Wide Area Recovery and Resiliency Program (WARRP) Biological Attack Response and Recovery: End to End Medical Countermeasure Distribution and Dispensing Processes

    Science.gov (United States)

    2012-04-24

    emergency planning. "Public health departments are suffering from brain drain." "For planning purposes, the assumption is that 60% of volunteers will...concerto practice. "In a concerto, a musician practices the difficult parts one at a time, then they put the whole process together." POD elements

  3. Design and implementation of a secure and user-friendly broker platform supporting the end-to-end provisioning of e-homecare services.

    Science.gov (United States)

    Van Hoecke, Sofie; Steurbaut, Kristof; Taveirne, Kristof; De Turck, Filip; Dhoedt, Bart

    2010-01-01

    We designed a broker platform for e-homecare services using web service technology. The broker allows efficient data communication and guarantees quality requirements such as security, availability and cost-efficiency by dynamic selection of services, minimizing user interactions and simplifying authentication through a single user sign-on. A prototype was implemented, with several e-homecare services (alarm, telemonitoring, audio diary and video-chat). It was evaluated by patients with diabetes and multiple sclerosis. The patients found that the start-up time and overhead imposed by the platform was satisfactory. Having all e-homecare services integrated into a single application, which required only one login, resulted in a high quality of experience for the patients.

  4. Operating performance of the gamma-ray Cherenkov telescope: An end-to-end Schwarzschild–Couder telescope prototype for the Cherenkov Telescope Array

    Energy Technology Data Exchange (ETDEWEB)

    Dournaux, J.L., E-mail: jean-laurent.dournaux@obspm.fr [GEPI, Observatoire de Paris, PSL Research University, CNRS, Sorbonne Paris Cité, Université Paris Diderot, Place J. Janssen, 92190 Meudon (France); De Franco, A. [Department of Physics, University of Oxford, Keble Road, Oxford OX1 3RH (United Kingdom); Laporte, P. [GEPI, Observatoire de Paris, PSL Research University, CNRS, Sorbonne Paris Cité, Université Paris Diderot, Place J. Janssen, 92190 Meudon (France); White, R. [Max-Planck-Institut für Kernphysik, Saupfercheckweg 1, 69117 Heidelberg (Germany); Greenshaw, T. [University of Liverpool, Oliver Lodge Laboratory, P.O. Box 147, Oxford Street, Liverpool L69 3BX (United Kingdom); Sol, H. [LUTH, Observatoire de Paris, PSL Research University, CNRS, Université Paris Diderot, Place J. Janssen, 92190 Meudon (France); Abchiche, A. [CNRS, Division technique DT-INSU, 1 Place Aristide Briand, 92190 Meudon (France); Allan, D. [Department of Physics and Centre for Advanced Instrumentation, Durham University, South Road, Durham DH1 3LE (United Kingdom); Amans, J.P. [GEPI, Observatoire de Paris, PSL Research University, CNRS, Sorbonne Paris Cité, Université Paris Diderot, Place J. Janssen, 92190 Meudon (France); Armstrong, T.P. [Department of Physics and Centre for Advanced Instrumentation, Durham University, South Road, Durham DH1 3LE (United Kingdom); Balzer, A.; Berge, D. [GRAPPA, University of Amsterdam, Science Park 904, 1098 XH Amsterdam (Netherlands); Boisson, C. [LUTH, Observatoire de Paris, PSL Research University, CNRS, Université Paris Diderot, Place J. Janssen, 92190 Meudon (France); and others

    2017-02-11

    The Cherenkov Telescope Array (CTA) consortium aims to build the next-generation ground-based very-high-energy gamma-ray observatory. The array will feature different sizes of telescopes allowing it to cover a wide gamma-ray energy band from about 20 GeV to above 100 TeV. The highest energies, above 5 TeV, will be covered by a large number of Small-Sized Telescopes (SSTs) with a field-of-view of around 9°. The Gamma-ray Cherenkov Telescope (GCT), based on Schwarzschild–Couder dual-mirror optics, is one of the three proposed SST designs. The GCT is described in this contribution and the first images of Cherenkov showers obtained using the telescope and its camera are presented. These were obtained in November 2015 in Meudon, France.

  5. ALIVE-IN-RANGE MEDIUM ACCESS CONTROL PROTOCOL TO MINIMIZE DELAY IN UNDERWATER WIRELESS SENSOR NETWORK COMMUNICATION AT A FREQUENCY OF 2.4 GHz

    Directory of Open Access Journals (Sweden)

    VIKAS RAINA

    2017-11-01

    Full Text Available Time synchronization between the sensor nodes to reduce end-to-end delay for critical and real-time data monitoring can be achieved by cautiously monitoring the mobility of the mobile sink node in underwater wireless sensor networks. The proposed Alive-in-Range (AR-MAC) medium access control protocol monitors delay-sensitive, critical and real-time data. The idea is that a reduction in duty cycle, precise time scheduling of the sensors' active/sleep cycles, and monitoring of the sink node's mobility, along with the selection of appropriate queues and schedulers, can reduce the end-to-end delay while enhancing other performance metrics. Two algorithms, effective path determination and optimum throughput path determination, are proposed. It is assumed that the sensors are properly anchored to limit their movement due to waves within the permissible limits to follow these algorithms. This paper attempts to utilize electromagnetic waves at a resonance frequency of 2.4 GHz for underwater communication. The results verify that the implementation of the Alive-in-Range MAC protocol has reduced the average end-to-end delay significantly, making it appropriate for critical and real-time data monitoring. This work proves the suitability of electromagnetic waves as an effective alternative for underwater wireless communication. The main objective is to mitigate the sink neighbourhood problem and the distance-constrained mobile sink problem, to reduce the average end-to-end delay by implementing the Alive-in-Range (AR-MAC) medium access control protocol in underwater sensor networks, and to draw the attention of researchers to this area.

  6. Transfer of PCBs from bottom sediment to freshwater river fish: a food-web modelling approach in the Rhône River (France) in support of sediment management.

    Science.gov (United States)

    Lopes, C; Persat, H; Babut, M

    2012-07-01

    Since 2005, restrictions have been placed on fish consumption along the Rhône River because of high polychlorobiphenyl (PCB) concentrations, which have resulted in adverse economic consequences for professional fisheries in affected areas. French environmental authorities have expended considerable effort to research sediment remediation strategies and to develop sediment quality guidelines designed to protect the health of humans consuming Rhône River fish. Here we: (1) develop a bioaccumulation food-web model that describes PCB concentrations in three common freshwater fish species of the Rhône River, using Bayesian inference to estimate the input parameters; (2) test the predictive power of the model in terms of risk assessment for fish consumption; and (3) discuss the use of this approach to develop sediment quality guidelines that protect the health of humans consuming Rhône River fish. The bioaccumulation model predictions are protective for human consumers of fish and are efficient for use in risk assessment. For example, 85% of the predicted values were within a factor of 5 of measured CB153 concentrations in fish. Using sensitivity analyses, the major roles played by sediment and dietary processes in bioaccumulation are illustrated: the parameters involved in the respiratory process (contamination from water) have little impact on model outputs, whereas the parameters related to diet and digestion processes are the most sensitive. The bioaccumulation model was applied to derive sediment concentrations compatible with safe fish consumption. The resulting PCB sediment thresholds (expressed as the sum of seven PCB indicator congeners) that are protective for the consumption of the fish species ranged from 0.7 to 3 ng/g (dw). Copyright © 2012 Elsevier Inc. All rights reserved.
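
    The steady-state balance underlying single-species bioaccumulation models of this kind can be sketched as below: uptake from water and diet against elimination and growth dilution. The rate constants and concentrations are illustrative assumptions, not the Bayesian posterior estimates of the paper.

```python
def steady_state_pcb(c_water, c_diet, k_uptake=50.0, k_diet=0.02,
                     k_elim=0.005, k_growth=0.002):
    """Steady-state PCB concentration in fish (ng/g wet weight).

    dC/dt = k_uptake * C_water + k_diet * C_diet - (k_elim + k_growth) * C,
    set to zero and solved for C. Rate constants here are illustrative:
    k_uptake (L/g/d), k_diet (g/g/d), k_elim and k_growth (1/d).
    """
    return (k_uptake * c_water + k_diet * c_diet) / (k_elim + k_growth)

# Example: contaminated sediment drives the diet term via benthic prey
print(steady_state_pcb(c_water=1e-4, c_diet=15.0))   # ~43.6 ng/g in fish
```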

  7. Linear quadratic Gaussian control for adaptive optics and multiconjugate adaptive optics: experimental and numerical analysis.

    Science.gov (United States)

    Petit, Cyril; Conan, Jean-Marc; Kulcsár, Caroline; Raynaud, Henri-François

    2009-06-01

    We present a comprehensive analysis of the linear quadratic Gaussian (LQG) control approach applied to adaptive optics (AO) and multiconjugate AO (MCAO), based on numerical and experimental validations. The structure of the control law is presented and its main properties discussed. We then propose an extended experimental validation of this control law in AO and a simplified MCAO configuration. Performance is compared with end-to-end numerical simulations. Sensitivity of the performance with respect to tuning parameters is tested. Finally, extension to full MCAO and laser tomographic AO (LTAO) through numerical simulation is presented and analyzed.
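
    A minimal discrete-time LQG sketch of the kind of controller analyzed here: an LQR gain and a Kalman gain, each obtained from a discrete algebraic Riccati equation and combined via the separation principle. The second-order turbulence-like model and all weights are toy assumptions (scipy assumed available), not the AO/MCAO system matrices of the paper.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

# Toy plant: x_{k+1} = A x_k + B u_k + w, y_k = C x_k + v
A = np.array([[1.6, -0.64], [1.0, 0.0]])    # AR(2)-like turbulence model
B = np.array([[1.0], [0.0]])
C = np.array([[1.0, 0.0]])
Q, R = np.eye(2), np.array([[0.1]])         # LQR weights (assumed)
W, V = 0.01 * np.eye(2), np.array([[0.1]])  # noise covariances (assumed)

# LQR gain from the control Riccati equation
P = solve_discrete_are(A, B, Q, R)
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)

# Steady-state Kalman gain from the dual (estimation) Riccati equation
S = solve_discrete_are(A.T, C.T, W, V)
L = S @ C.T @ np.linalg.inv(C @ S @ C.T + V)

# One closed-loop cycle (separation principle)
x_hat = np.zeros((2, 1))
y = np.array([[0.3]])                  # a wavefront-sensor-like measurement
x_hat = x_hat + L @ (y - C @ x_hat)    # measurement update
u = -K @ x_hat                         # LQR feedback on the estimate
x_hat = A @ x_hat + B @ u              # time update for the next cycle
print(u)
```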

  8. Bitrate adaptation flow control and client-based congestion control for multimedia-on-demand

    Science.gov (United States)

    Chan, Siu-Ping; Kok, Chi-Wah

    2003-06-01

    A flow control scheme for streaming multimedia data over UDP on IP networks is presented. The bitrate adaptation algorithm embedded in the protocol is an end-to-end mechanism at the application level. The flow control system constantly maintains the streaming buffer at a prescribed capacity, even under bursty network losses, by adapting the multimedia bitrate B from the streaming codec. A congestion control algorithm, located at a lower level than the flow control mechanism, is also presented. It works together with the flow control to resolve network congestion while maintaining a degree of TCP-friendliness by changing the sending rate R at the server. Simulation results obtained from NS2 show that better resource allocation can be obtained, and an overall increase in the average sending rate, and hence better quality of streaming media, is observed.
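
    A toy sketch of buffer-driven bitrate adaptation in the spirit described: a proportional controller nudges the codec bitrate to hold the client buffer at a prescribed level. The gains, limits, and crude buffer dynamics are assumptions, not the paper's algorithm.

```python
def adapt_bitrate(b_current, buf_level, buf_target=0.5, gain=0.8,
                  b_min=200, b_max=4000):
    """Proportional bitrate adaptation (kbit/s).

    If the client buffer is below target, lower the bitrate so the
    buffer refills; above target, raise it. gain < 1 damps oscillation.
    """
    error = buf_level - buf_target           # fraction of buffer capacity
    b_new = b_current * (1.0 + gain * error)
    return min(b_max, max(b_min, b_new))

# Bursty losses drain the buffer; the controller backs off, then recovers
b, buf = 2000.0, 0.5
for loss in [0.0, 0.3, 0.3, 0.1, 0.0, 0.0]:
    buf = max(0.0, min(1.0, buf + 0.1 - loss))   # crude buffer dynamics
    b = adapt_bitrate(b, buf)
    print(f"buffer {buf:.2f} -> bitrate {b:.0f} kbit/s")
```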

  9. Nutrient cycling and foodwebs in Dutch estuaries

    NARCIS (Netherlands)

    Nienhuis, P.H.

    1993-01-01

    In this review several aspects of the functioning of the Dutch estuaries (Ems-Dollard, Wadden Sea, Oosterschelde, Westerschelde, Grevelingen and Veerse Meer) have been compared. A number of large European rivers (especially Rhine) have a prevailing influence on the nutrient cycling of most Dutch

  10. A Software Architecture for Control of Value Production in Federated Systems

    Directory of Open Access Journals (Sweden)

    Jay S. Bayne

    2003-08-01

    Full Text Available Federated enterprises are defined as interactive commercial entities that produce products and consume resources through a network of open, free-market transactions. Value production in such entities is defined as the real-time computation of enterprise value propositions. These computations are increasingly taking place in a grid-connected space – a space that must provide for secure, real-time, reliable end-to-end transactions governed by formal trading protocols. We present the concept of a value production unit (VPU as a key element of federated trading systems, and a software architecture for automation and control of federations of such VPUs.

  11. A Rate-Based Congestion Control Algorithm for the SURAP 4 Packet Radio Architecture (SRNTN-72)

    Science.gov (United States)

    1990-01-01

    traffic in the network. Flow control is exercised at the transport layer by means of TCP windows and at the link layer by means of the... determine which of the end-to-end traffic flows it carries, if any, is a major contributor to the heavy traffic load. A node measuring high forwarding... neighbor is ready for it. The spacing interval is typically short and of low variance. The moment-of-silence algorithm gives acknowledgements high

  12. Simulation Control Graphical User Interface Logging Report

    Science.gov (United States)

    Hewling, Karl B., Jr.

    2012-01-01

    One of the many tasks of my project was to revise the code of the Simulation Control Graphical User Interface (SIM GUI) to enable logging functionality to a file. I was also tasked with developing a script that directed the startup and initialization flow of the various LCS software components. This makes sure that a software component will not spin up until all the appropriate dependencies have been configured properly. I was also able to assist hardware modelers in verifying the configuration of models after they had been upgraded to a new software version. I developed some code that analyzes the MDL files to determine if any errors were generated due to the upgrade process. Another one of the projects assigned to me was supporting the End-to-End Hardware/Software Daily Tag-up meeting.

  13. RoCoMAR: Robots’ Controllable Mobility Aided Routing and Relay Architecture for Mobile Sensor Networks

    Directory of Open Access Journals (Sweden)

    Seokhoon Yoon

    2013-07-01

    Full Text Available In a practical deployment, a mobile sensor network (MSN) suffers from low performance due to high node mobility, time-varying wireless channel properties, and obstacles between communicating nodes. In order to tackle the problem of low network performance and provide a desired end-to-end data transfer quality, in this paper we propose a novel ad hoc routing and relaying architecture, namely RoCoMAR (Robots' Controllable Mobility Aided Routing), which uses robotic nodes' controllable mobility. RoCoMAR repeatedly performs a link reinforcement process with the objective of maximizing the network throughput, in which the link with the lowest quality on the path is identified and replaced with high quality links by placing a robotic node as a relay at an optimal position. The robotic node resigns as a relay if the objective is achieved or no more gain can be obtained with a new relay. Once placed as a relay, the robotic node performs adaptive link maintenance by adjusting its position according to the movements of regular nodes. The simulation results show that RoCoMAR outperforms existing ad hoc routing protocols for MSNs in terms of network throughput and end-to-end delay.
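
    The link-reinforcement step can be sketched as below: find the weakest link on the active path and, if its quality falls under a threshold, dispatch a robotic relay toward that link. The midpoint placement, the PDR-style quality map, and the threshold are illustrative assumptions; RoCoMAR's actual placement is optimized, not simply the midpoint.

```python
def weakest_link(path, quality):
    """Return the (link, quality) of the worst link on the path."""
    links = list(zip(path, path[1:]))
    q = [quality[frozenset(l)] for l in links]
    i = q.index(min(q))
    return links[i], q[i]

def relay_position(pos, link):
    """Midpoint placement for the relay (assumed quality-optimal here)."""
    (x1, y1), (x2, y2) = pos[link[0]], pos[link[1]]
    return ((x1 + x2) / 2, (y1 + y2) / 2)

pos = {"A": (0, 0), "B": (80, 0), "C": (120, 0)}            # node positions (m)
quality = {frozenset("AB"): 0.3, frozenset("BC"): 0.8}      # e.g. PDR estimates

link, q = weakest_link(["A", "B", "C"], quality)
if q < 0.5:    # reinforcement threshold (assumed)
    print("place relay at", relay_position(pos, link), "on link", link)
```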

  14. QoS Management and Control for an All-IP WiMAX Network Architecture: Design, Implementation and Evaluation

    Directory of Open Access Journals (Sweden)

    Thomas Michael Bohnert

    2008-01-01

    Full Text Available The IEEE 802.16 standard provides a specification for a fixed and mobile broadband wireless access system, offering high data rate transmission of multimedia services with different Quality-of-Service (QoS) requirements through the air interface. The WiMAX Forum, going beyond the air interface, defined an end-to-end WiMAX network architecture based on an all-IP platform in order to complete the standards required for a commercial rollout of WiMAX as a broadband wireless access solution. As the WiMAX network architecture is only a functional specification, this paper focuses on an innovative end-to-end WiMAX network architecture in compliance with the WiMAX Forum specification. To the best of our knowledge, this is the first WiMAX architecture built by a research consortium globally; the work was performed within the framework of the European IST project WEIRD (WiMAX Extension to Isolated Research Data networks). One of the principal features of our architecture is support for end-to-end QoS, achieved by the integration of resource control in the WiMAX wireless link and resource management in the wired domains of the network core. In this paper we present the architectural design of these QoS features in the overall WiMAX all-IP framework and their functional as well as performance evaluation. The presented results can safely be considered as unique and timely for any WiMAX system integrator.

  15. REAL TIME ANALYSIS OF WIRELESS CONTROLLER AREA NETWORK

    Directory of Open Access Journals (Sweden)

    Gerardine Immaculate Mary

    2014-09-01

    Full Text Available It is widely known that Controller Area Networks (CANs) are used in real-time, distributed and parallel processing applications covering manufacturing plants, humanoid robots, networking fields, etc. In applications where wireless conditions are encountered, it is convenient to continue the exchange of CAN frames within a Wireless CAN (WCAN). The WCAN considered in this research is based on the wireless token ring protocol (WTRP), a MAC protocol for wireless networks that reduces the number of retransmissions due to collision, and on the wired CAN's attribute of message-based communication. WCAN uses a token frame method to provide channel access to the nodes in the system. This method allows all the nodes to share a common broadcast channel by taking turns in transmitting upon receiving the token frame, which circulates within the network for a specified amount of time. This method provides high throughput in a bounded-latency environment, consistent and predictable delays, and a good packet delivery ratio. The most important factor to consider when evaluating a control network is the end-to-end time delay between sensors, controllers, and actuators. The correct operation of a control system depends on the timeliness of the data coming over the network, and thus a control network should be able to guarantee message delivery within a bounded transmission time. The proposed WCAN is modeled and simulated using QualNet, and its average end-to-end delay and packet delivery ratio (PDR) are calculated. The parameter boundaries of WCAN are evaluated to guarantee maximum throughput and minimum latency for wireless communications, specifically WCAN.

  16. TCP congestion control in input-queued crossbar switch

    Science.gov (United States)

    Zheng, Hongyun; Zhao, Yongxiang; Chen, Changjia

    2005-02-01

    In this paper, we consider congestion control in an input-queued crossbar switch environment where each input port has finite buffer space and the TCP protocol is employed for end-to-end congestion control. We find that it is impossible to achieve efficiency and fairness among TCP flows at the same time by queue management alone. We then propose hFS&rEDF, a scheme that combines heuristic fair switch arbitration (hFS) with a random early-drop-front queue management policy (rEDF). In our proposed scheme, the hFS arbitration strategy grants input ports uneven access to output ports, while head-of-line packets at the other input ports involved in conflicts are dropped by rEDF with some probability. Simulation results show that our proposed scheme achieves a better tradeoff between throughput and fairness.
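
    A toy sketch of the rEDF side of such a scheme, for a single output port: the arbiter grants one contending input (a static weight stands in for the hFS heuristic) and the losing head-of-line packets are dropped from the queue front with some probability. The weights and drop probability are assumptions, not the paper's parameters.

```python
import random

def arbitrate(voqs, weights, p_drop=0.3, rng=random.Random(7)):
    """One arbitration round for a single output port.

    voqs: dict input_id -> list of queued packets (head at index 0).
    weights: dict input_id -> arbitration weight (hFS stand-in).
    Returns the transmitted packet; losing heads dropped w.p. p_drop (rEDF).
    """
    contenders = [i for i, q in voqs.items() if q]
    if not contenders:
        return None
    # highest weight wins; random tiebreak
    winner = max(contenders, key=lambda i: (weights[i], rng.random()))
    sent = voqs[winner].pop(0)
    for i in contenders:
        if i != winner and rng.random() < p_drop:
            voqs[i].pop(0)        # early drop from the front
    return sent

voqs = {0: ["p0a", "p0b"], 1: ["p1a"], 2: ["p2a", "p2b"]}
print(arbitrate(voqs, weights={0: 3, 1: 1, 2: 2}))
print(voqs)
```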

  17. Guaranteeing Isochronous Control of Networked Motion Control Systems Using Phase Offset Adjustment

    Directory of Open Access Journals (Sweden)

    Ikhwan Kim

    2015-06-01

    Full Text Available Guaranteeing isochronous transfer of control commands is an essential function for networked motion control systems. The adoption of real-time Ethernet (RTE) technologies may be profitable in guaranteeing deterministic transfer of control messages. However, unpredictable behavior of software in the motion controller often results in unexpectedly large deviations in control message transmission intervals, and thus leads to imprecise motion. This paper presents a simple and efficient heuristic to guarantee end-to-end isochronous control with very small jitter. The key idea of our approach is to adjust the phase offset of the control message transmission time in the motion controller by investigating the behavior of the motion control task. In realizing the idea, we performed a pre-runtime analysis to determine a safe and reliable phase offset and applied the phase offset to the runtime code of the motion controller by customizing an open-source based integrated development environment (IDE). We also constructed an EtherCAT-based motion control system testbed and performed extensive experiments on the testbed to verify the effectiveness of our approach. The experimental results show that our heuristic is highly effective even for low-end embedded controllers implemented with open-source software components, under various configurations of control period and number of motor drives.
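
    The pre-runtime analysis might look like the following sketch: measure the control task's release-to-transmit latencies offline, then place the transmit phase offset beyond the worst observed latency plus a margin. Treating the observed maximum as the worst case is a simplifying assumption; a real tool would bound it more rigorously.

```python
def safe_phase_offset(latencies_us, cycle_us=1000, margin_us=20):
    """Choose a transmit phase offset within the RTE cycle (microseconds).

    latencies_us: measured task release-to-transmit latencies. The offset
    is placed after the worst observed latency plus a margin, so the
    frame is always ready when its slot begins (assumes the worst case
    was actually observed during profiling).
    """
    worst = max(latencies_us)
    offset = worst + margin_us
    if offset >= cycle_us:
        raise ValueError("task too slow for this cycle time")
    return offset

measured = [120, 135, 128, 190, 142, 131]   # microseconds, sampled offline
print(safe_phase_offset(measured), "us after cycle start")
```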

  18. Video Classification and Adaptive QoP/QoS Control for Multiresolution Video Applications on IPTV

    Directory of Open Access Journals (Sweden)

    Huang Shyh-Fang

    2012-01-01

    Full Text Available With the development of heterogeneous networks and video coding standards, multiresolution video applications over networks have become important. It is critical to ensure the service quality of the network for time-sensitive video services. Worldwide Interoperability for Microwave Access (WIMAX) is a good candidate for delivering video signals because through WIMAX the delivery quality based on the quality-of-service (QoS) setting can be guaranteed. The selection of suitable QoS parameters is, however, not trivial for service users. What a video service user is really concerned with is the video quality of presentation (QoP), which includes the video resolution, the fidelity, and the frame rate. In this paper, we present a quality control mechanism for multiresolution video coding structures over WIMAX networks and also investigate the relationship between QoP and QoS in end-to-end connections. Consequently, the video presentation quality can be simply mapped to the network requirements by a mapping table, and the end-to-end QoS is thereby achieved. We performed experiments with multiresolution MPEG coding over WIMAX networks. In addition to the QoP parameters, video characteristics, such as the picture activity and the video mobility, also affect the QoS significantly.
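
    A hypothetical QoP-to-QoS mapping table in the spirit described above: the user picks presentation parameters and the table yields the network requirement. All entries are placeholders, not values from the paper:

    ```python
    # Map user-visible presentation quality (QoP) to a network-level
    # requirement (QoS) via a lookup table, as the abstract describes.

    QOP_TO_QOS = {
        # (resolution, frame_rate): required_bandwidth_kbps (illustrative)
        ("QCIF", 15): 128,
        ("CIF", 15): 384,
        ("CIF", 30): 768,
        ("4CIF", 30): 2048,
    }

    def qos_for(resolution, frame_rate):
        return QOP_TO_QOS[(resolution, frame_rate)]

    print(qos_for("CIF", 30))  # -> bandwidth to request from the WIMAX link
    ```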

  19. A unifying framework for systems modeling, control systems design, and system operation

    Science.gov (United States)

    Dvorak, Daniel L.; Indictor, Mark B.; Ingham, Michel D.; Rasmussen, Robert D.; Stringfellow, Margaret V.

    2005-01-01

    Current engineering practice in the analysis and design of large-scale multi-disciplinary control systems is typified by some form of decomposition, whether functional, physical, or discipline-based, that enables multiple teams to work in parallel and in relative isolation. Too often, the resulting system after integration is an awkward marriage of different control and data mechanisms with poor end-to-end accountability. System-of-systems engineering, which faces this problem on a large scale, cries out for a unifying framework to guide analysis, design, and operation. This paper describes such a framework, based on a state-, model-, and goal-based architecture for semi-autonomous control systems, that guides analysis and modeling, shapes control system software design, and directly specifies operational intent. This paper illustrates the key concepts in the context of a large-scale, concurrent, globally distributed system of systems: NASA's proposed Array-based Deep Space Network.

  20. Control and management of distribution system with integrated DERs via IEC 61850 based communication

    Directory of Open Access Journals (Sweden)

    Ikbal Ali

    2017-06-01

    Full Text Available Distributed Energy Resources (DERs) are being increasingly integrated into distribution systems, resulting in complex power flow scenarios. In such cases, effective control, management and protection of distribution systems become highly challenging. Standardized and interoperable communication in distribution systems has the potential to deal with such challenges and to achieve higher energy efficiency and reliability. Edition 2 of the IEC 61850 standards for utility automation standardizes the exchange of data among different substations, DERs, control centers, PMUs and PDCs. This paper demonstrates the modelling of the information and services needed for control, management and protection of distribution systems with integrated DERs. This paper uses IP tunnels and/or mapping over the IP layer for transferring IEC 61850 messages, such as sampled values (SVs) and GOOSE (Generic Object Oriented Substation Event) messages, over the distribution system Wide Area Network (WAN). Finally, the performance of the proposed communication configurations for different applications is analyzed by calculating End-to-End (ETE) delay, throughput and jitter.

  1. Source and Channel Adaptive Rate Control for Multicast Layered Video Transmission Based on a Clustering Algorithm

    Directory of Open Access Journals (Sweden)

    Jérôme Viéron

    2004-03-01

    Full Text Available This paper introduces source-channel adaptive rate control (SARC), a new congestion control algorithm for layered video transmission in large multicast groups. In order to solve the well-known feedback implosion problem in large multicast groups, we first present a mechanism for filtering RTCP receiver reports sent from receivers to the whole session. The proposed filtering mechanism provides a classification of receivers according to a predefined similarity measure. An end-to-end source and FEC rate control scheme based on this distributed feedback aggregation mechanism, coupled with a layered video coding system, is then described. The number of layers, their rates, and their levels of protection are adapted dynamically to the aggregated feedback. The algorithms have been validated with the NS2 network simulator.
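
    A sketch of classifying receiver reports by similarity, assuming each receiver reports a (loss rate, RTT) pair and receivers are grouped when both values fall into the same quantized bin; the bin widths are illustrative and the paper's actual similarity measure may differ:

    ```python
    # Aggregate RTCP-style receiver reports into similarity classes so
    # that feedback is sent once per class instead of once per receiver.

    def classify(reports, loss_bin=0.02, rtt_bin=50):
        groups = {}
        for rid, (loss, rtt) in reports.items():
            key = (int(loss / loss_bin), int(rtt / rtt_bin))
            groups.setdefault(key, []).append(rid)
        return groups

    reports = {"r1": (0.01, 80), "r2": (0.012, 95), "r3": (0.08, 300)}
    print(classify(reports))   # r1 and r2 share a class; r3 is separate
    ```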

  2. Controlling

    OpenAIRE

    Tůmová, Jana

    2009-01-01

    The diploma thesis focuses on the problems of operational controllership and the instruments used to solve them. In connection with these issues, the thesis also defines the role of the controller, in particular his authorities and responsibilities. The theoretical part of the work looks at the academic conception of controllership. The practical part describes the main functions, duties, activities and the system of cooperation between controllership and other departments in particular...

  3. Analysis of the end-to-end process of material sourcing from an operational excellence perspective. Deduction of a decision model for the logistics process based on a fixed set of materials

    OpenAIRE

    Brú Artieda, Berta

    2014-01-01

    In recent years, new trends in Logistics and Supply Chain Management have arisen as a reaction to changes in demand patterns. Manufacturers are in constant search of new methods and strategies for designing a Supply Chain that is not only efficient but also agile, from supplier to customer. Quality and pricing are no longer the only influential factors for a company; Logistics has become a new tool for gaining leverage over competitors. The present work aims to apply these new approa...

  4. Enhancing LTE with Cloud-RAN and Load-Controlled Parasitic Antenna Arrays

    DEFF Research Database (Denmark)

    Artuso, Matteo; Boviz, Dora; Checko, Aleksandra

    2016-01-01

    ...implementations of C-RANs tackle fundamental technical and economic challenges. In this article, we present an end-to-end solution for practically implementable C-RANs by providing innovative solutions to key issues such as the design of cost-effective hardware and power-effective signals for RRHs; efficient design and distribution of data and control traffic for coordinated communications; and conception of a flexible and elastic architecture supporting dynamic allocation of both the densely distributed RRHs and the centralized processing resources in the cloud to create virtual base stations. More specifically, we propose a novel antenna array architecture called load-controlled parasitic antenna array (LCPAA), where multiple antennas are fed by a single RF chain. Energy- and spectral-efficient modulation as well as signaling schemes that are easy to implement are also provided. Additionally, the design...

  5. Process control and recovery in the Link Monitor and Control Operator Assistant

    Science.gov (United States)

    Lee, Lorrine; Hill, Randall W., Jr.

    1993-01-01

    This paper describes our approach to providing process control and recovery functions in the Link Monitor and Control Operator Assistant (LMCOA). The focus of the LMCOA is to provide semi-automated monitor and control to support station operations in the Deep Space Network. The LMCOA will be demonstrated with precalibration operations for Very Long Baseline Interferometry on a 70-meter antenna. Precalibration, the task of setting up the equipment to support a communications link with a spacecraft, is a manual, time-consuming and error-prone process. One problem with the current system is that it does not provide explicit feedback about the effects of control actions. The LMCOA uses a Temporal Dependency Network (TDN) to represent an end-to-end sequence of operational procedures and a Situation Manager (SM) module to provide process control, diagnosis, and recovery functions. The TDN is a directed network representing precedence, parallelism, precondition, and postcondition constraints. The SM maintains an internal model of the expected and actual states of the subsystems in order to determine whether each control action executed successfully and to provide feedback to the user. The LMCOA is implemented on a NeXT workstation using Objective C, Interface Builder and the C Language Integrated Production System.
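
    A minimal sketch of executing a TDN-like network, assuming steps run once their predecessors have completed, with precondition checks before each control action and postcondition checks after it (mirroring the feedback the Situation Manager provides); the class and function names are assumptions:

    ```python
    # Execute steps in dependency order, verifying pre/postconditions.

    class Step:
        def __init__(self, name, pre, action, post):
            self.name, self.pre, self.action, self.post = name, pre, action, post

    def run_tdn(steps, deps, state):
        """deps maps a step name to the set of step names it depends on."""
        done = set()
        while len(done) < len(steps):
            progressed = False
            for s in steps:
                if s.name in done or not deps.get(s.name, set()) <= done:
                    continue
                if not s.pre(state):
                    raise RuntimeError(f"{s.name}: precondition failed")
                s.action(state)
                if not s.post(state):
                    raise RuntimeError(f"{s.name}: postcondition failed")
                done.add(s.name)
                progressed = True
            if not progressed:
                raise RuntimeError("cyclic or unsatisfiable dependencies")
    ```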

  6. Temperature effects on seaweed-sustaining top-down control vary with season.

    Science.gov (United States)

    Werner, Franziska J; Graiff, Angelika; Matthiessen, Birte

    2016-03-01

    Rising seawater temperature and CO2 concentrations (ocean acidification) represent two of the most influential factors impacting marine ecosystems in the face of global climate change. In ecological climate change research, full-factorial experiments performed across seasons in multispecies, cross-trophic-level settings are essential as they permit a more realistic estimation of direct and indirect effects as well as the relative importance of the effects of both major environmental stressors on ecosystems. In benthic mesocosm experiments, we tested the responses of coastal Baltic Sea Fucus vesiculosus communities to elevated seawater temperature and CO2 concentrations across four seasons of one year. While increasing [CO2] levels had only minor effects, warming had strong and persistent effects on grazers, and the resulting effects on the Fucus community were found to be season dependent. In late summer, a temperature-driven collapse of grazers caused a cascading effect from the consumers to the foundation species, resulting in overgrowth of Fucus thalli by epiphytes. In fall/winter (outside the growing season of epiphytes), intensified grazing under warming resulted in a significant reduction in Fucus biomass. Thus, we were able to confirm the prediction that future increases in water temperatures will influence marine food-web processes by altering top-down control, but we were also able to show that specific consequences for food-web structure depend on the season. Since F. vesiculosus is the dominant habitat-forming brown algal system in the Baltic Sea, its potential decline under global warming implies a loss of key functions and services such as provision of nutrient storage, substrate, food, shelter, and nursery grounds for a diverse community of marine invertebrates and fish in Baltic Sea coastal waters.

  7. Cross-Layer Control with Worst Case Delay Guarantees in Multihop Wireless Networks

    Directory of Open Access Journals (Sweden)

    Shu Fan

    2016-01-01

    Full Text Available Guaranteeing delay is a challenge in meeting the different real-time requirements of applications in backpressure-based wireless multihop networks; researchers are therefore interested in the possibility of providing bounded end-to-end delay. In this paper, a new cross-layer control algorithm with worst-case delay guarantees is proposed. The utility maximization algorithm is developed using a Lyapunov optimization framework. Virtual queues that ensure the worst-case delay of non-dropped packets are designed. It is proved through rigorous theoretical analysis, and verified by simulations, that the time-average overall utility achieved by the new algorithm can be arbitrarily close to the optimal solution with finite queue backlogs. Simulation results obtained with Matlab show that the proposed algorithm achieves higher throughput utility with less dropped data compared with existing work.
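
    A sketch of the virtual-queue idea commonly used for worst-case delay guarantees in Lyapunov-based control, assuming a per-flow virtual queue that grows by a constant epsilon whenever the real backlog persists; the parameter value is illustrative and the paper's exact construction may differ:

    ```python
    # Alongside the real backlog Q, a virtual queue Z accumulates EPSILON
    # per slot while Q is nonempty. If the scheduler keeps Q + Z <= B at
    # all times, every non-dropped packet leaves within about B / EPSILON
    # slots, which is the worst-case delay bound.

    EPSILON = 0.5  # illustrative persistence parameter

    def update_queues(Q, Z, arrivals, service):
        Q_next = max(Q - service, 0) + arrivals
        Z_next = max(Z - service + (EPSILON if Q > 0 else 0), 0)
        return Q_next, Z_next

    Q, Z = 0.0, 0.0
    for arrivals, service in [(3, 1), (2, 2), (0, 2), (0, 2)]:
        Q, Z = update_queues(Q, Z, arrivals, service)
        print(Q, Z)
    ```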

  8. A Cloud-Assisted Random Linear Network Coding Medium Access Control Protocol for Healthcare Applications

    Directory of Open Access Journals (Sweden)

    Elli Kartsakli

    2014-03-01

    Full Text Available Relay sensor networks are often employed in end-to-end healthcare applications to facilitate the information flow between patient-worn sensors and the medical data center. Medium access control (MAC) protocols based on random linear network coding (RLNC) are a novel and suitable approach to efficiently handle data dissemination. However, several challenges arise, such as additional delays introduced by the intermediate relay nodes and decoding failures due to channel errors. In this paper, we tackle these issues by adopting a cloud architecture in which the set of relays is connected to a coordinating entity called the cloud manager. We propose a cloud-assisted RLNC-based MAC protocol (CLNC-MAC) and develop a mathematical model for the calculation of the key performance metrics, namely the system throughput, the mean completion time for data delivery and the energy efficiency. We show the importance of central coordination in fully exploiting the gain of RLNC under error-prone channels.

  9. Deep Space Network (DSN), Network Operations Control Center (NOCC) computer-human interfaces

    Science.gov (United States)

    Ellman, Alvin; Carlton, Magdi

    1993-01-01

    The Network Operations Control Center (NOCC) of the DSN is responsible for scheduling the resources of the DSN and for monitoring all multi-mission spacecraft tracking activities in real time. Operations performs this job with computer systems at JPL connected to over 100 computers at Goldstone, Australia and Spain. The old computer system became obsolete, and the first version of the new system was installed in 1991. Significant improvement of the computer-human interfaces became the dominant theme of the replacement project. Major issues required innovative problem solving. Among these issues were: How to present several thousand data elements on displays without overloading the operator? What is the best graphical representation of DSN end-to-end data flow? How to operate the system without memorizing the mnemonics of hundreds of operator directives? Which computing environment will meet the competing performance requirements? This paper presents the technical challenges, engineering solutions, and results of the NOCC computer-human interface design.

  10. Designing a delay-based adaptive congestion control mechanism using control theory and system identification for TCP/IP networks

    Science.gov (United States)

    Morita, Mitsushige; Ohsaki, Hiroyuki; Murata, Masayuki

    2002-07-01

    In the Internet, TCP (Transmission Control Protocol) has been used as an end-to-end congestion control mechanism. Of the several TCP implementations, TCP Reno is the most popular. TCP Reno uses a loss-based approach, since it estimates the severity of congestion by detecting packet losses in the network. By contrast, another implementation, TCP Vegas, uses a delay-based approach. The main advantage of a delay-based approach is that, if it is properly designed, packet losses can be prevented by anticipating impending congestion from increasing packet delays. However, TCP Vegas was designed using not a theoretical approach but an ad hoc one. In this paper, we therefore design a delay-based congestion control mechanism by utilizing classical control theory. Our rate-based congestion control mechanism dynamically adjusts the packet transmission rate to stabilize the round-trip time, both to utilize the network resources and to prevent packet losses in the network. Presenting several simulation results for two network configurations, we quantitatively show the robustness and effectiveness of our delay-based congestion control mechanism.
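
    An illustrative rate-based, delay-driven controller in the spirit of the abstract: adjust the sending rate so that the measured RTT tracks a target, anticipating congestion before losses occur. The gains and target are assumptions, not the authors' design values:

    ```python
    # Simple PI controller: positive RTT error (RTT below target) lets
    # the rate grow; RTT above target pulls the rate down before loss.

    TARGET_RTT = 0.060          # seconds (assumed)
    KP, KI = 2e3, 5e2           # gains in packets/s per second of error

    def make_controller(base_rate=1000.0):
        state = {"integral": 0.0}
        def on_rtt_sample(rtt, dt):
            err = TARGET_RTT - rtt
            state["integral"] += err * dt
            rate = base_rate + KP * err + KI * state["integral"]
            return max(1.0, rate)           # packets per second
        return on_rtt_sample

    ctrl = make_controller()
    print(ctrl(0.080, 0.1))     # RTT above target -> rate drops below base
    ```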

  11. Gold nanorod linking to control plasmonic properties in solution and polymer nanocomposites.

    Science.gov (United States)

    Ferrier, Robert C; Lee, Hyun-Su; Hore, Michael J A; Caporizzo, Matthew; Eckmann, David M; Composto, Russell J

    2014-02-25

    A novel, solution-based method is presented to prepare bifunctional gold nanorods (B-NRs), assemble B-NRs end-to-end in various solvents, and disperse linked B-NRs in a polymer matrix. The B-NRs have poly(ethylene glycol) grafted along their long axes and cysteine adsorbed at their ends. By controlling the cysteine coverage, bifunctional ligands or polymers can be end-grafted to the AuNRs. Here, two dithiol ligands (C6DT and C9DT) are used to link the B-NRs in organic solvents. With increasing incubation time, the nanorod chain length increases linearly as the longitudinal surface plasmon resonance shifts toward longer absorption wavelengths (i.e., red shifts). Analogous to step-growth polymerization, the polydispersity in chain length also increases. Upon adding poly(ethylene glycol) or poly(methyl methacrylate) to a chloroform solution with linked B-NRs, the nanorod chains are shown to retain end-to-end linking upon spin-casting into PEO or PMMA films. Using quartz crystal microbalance with dissipation (QCM-D), the mechanism of nanorod linking is investigated on planar gold surfaces. At submonolayer coverage of cysteine, C6DT molecules can insert between cysteines and reach an areal density of 3.4 molecules per nm². To mimic the linking of Au NRs, this planar surface is exposed to cysteine-coated Au nanoparticles, which graft at 7 NPs per μm². This solution-based method to prepare, assemble, and disperse Au nanorods is applicable to other nanorod systems (e.g., CdSe) and presents a new strategy for assembling anisotropic particles in organic solvents and polymer coatings.

  12. Simulations of the Impact of Controlled Mobility for Routing Protocols

    Directory of Open Access Journals (Sweden)

    Loscrí Valeria

    2010-01-01

    Full Text Available This paper addresses mobility control routing in wireless networks. Given a data flow request between a source-destination pair, the problem is to move nodes towards the best placement such that the performance of the network is improved. Our purpose is to find the best node selection by minimizing the maximum distance that nodes have to travel to reach their final positions. We propose a routing protocol, the Routing Protocol based on Controlled Mobility (RPCM), in which the chosen nodes' paths minimize the total travelled distance to reach the desired positions. Specifically, controlled mobility is intended as a new network design dimension that allows driving nodes to specific best positions in order to achieve common objectives. The main aim of this paper is to show by simulation the effectiveness of controlled mobility when it is used as a new design dimension in wireless networks. Extensive simulations are conducted to evaluate the proposed routing algorithm. Results show how our protocol outperforms a well-known routing protocol, Ad hoc On-Demand Distance Vector routing (AODV), in terms of throughput, average end-to-end data packet delay and energy spent to send a packet unit.
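
    A geometry sketch of the node-selection criterion, assuming candidate relays are assigned to evenly spaced target positions on the source-destination segment and a selection is scored by the farthest distance any chosen node must travel; the greedy assignment is a simplification of whatever optimization the paper uses:

    ```python
    # Score a relay selection by the maximum travel distance to evenly
    # spaced targets on the source-destination line (min-max criterion).
    import math

    def targets(src, dst, n_relays):
        return [(src[0] + (dst[0] - src[0]) * k / (n_relays + 1),
                 src[1] + (dst[1] - src[1]) * k / (n_relays + 1))
                for k in range(1, n_relays + 1)]

    def max_travel(nodes, src, dst, n_relays):
        """Greedy assignment; assumes len(nodes) >= n_relays."""
        free = list(nodes)
        worst = 0.0
        for t in targets(src, dst, n_relays):
            best = min(free, key=lambda p: math.dist(p, t))
            worst = max(worst, math.dist(best, t))
            free.remove(best)
        return worst   # a controlled-mobility scheme prefers small values

    print(max_travel([(1, 2), (4, 0), (7, 3)], (0, 0), (10, 0), 2))
    ```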

  13. Squid are important components of foodwebs in most marine ...

    African Journals Online (AJOL)

    spamer

    CAPITOLI, R. R. and M. HAIMOVICI 1993 — Alimentacion del besugo (Pagrus pagrus) en el extremo sur de Brasil. Frente mar., Sec. A 14: 81–86. CARNEIRO, M. H. 1995 — Reprodução e alimentação dos linguados Paralichthys patagonicus e P. orbignyanus (Pleuronectiformes: Bothidae), no Rio Grande do Sul, Brasil.

  14. A Prosthetic Hand Body Area Controller Based on Efficient Pattern Recognition Control Strategies.

    Science.gov (United States)

    Benatti, Simone; Milosevic, Bojan; Farella, Elisabetta; Gruppioni, Emanuele; Benini, Luca

    2017-04-15

    Polyarticulated prosthetic hands represent a powerful tool to restore functionality and improve quality of life for upper limb amputees. Such devices offer, on the same wearable node, sensing and actuation capabilities that are not equally supported by natural interaction and control strategies. The control in state-of-the-art solutions is still performed mainly through complex encoding of gestures as bursts of contractions of the residual forearm muscles, resulting in a non-intuitive Human-Machine Interface (HMI). Recent research efforts explore the use of myoelectric gesture recognition for innovative interaction solutions; however, there persists a considerable gap between research evaluation and implementation in successful complete systems. In this paper, we present the design of a wearable prosthetic hand controller based on intuitive gesture recognition and a custom control strategy. The wearable node directly actuates a polyarticulated hand and wirelessly interacts with a personal gateway (i.e., a smartphone) for training and personalization of the recognition algorithm. Throughout the system development, we address the challenge of integrating an efficient embedded gesture classifier with a control strategy tailored for intuitive interaction between the user and the prosthesis. We demonstrate that this combined approach outperforms systems based on mere pattern recognition, since those target the accuracy of a classification algorithm rather than the control of a gesture. The system was fully implemented, tested on healthy and amputee subjects, and compared against benchmark repositories. The proposed approach achieves an error rate of 1.6% in the end-to-end real-time control of commonly used hand gestures, while complying with the power and performance budget of a low-cost microcontroller.

  15. Autonomous terrain characterization and modelling for dynamic control of unmanned vehicles

    Science.gov (United States)

    Talukder, A.; Manduchi, R.; Castano, R.; Owens, K.; Matthies, L.; Castano, A.; Hogg, R.

    2002-01-01

    This end-to-end obstacle negotiation system is envisioned to be useful for optimized path planning and vehicle navigation in terrain cluttered with vegetation, bushes, rocks, etc. Results on natural terrain with various natural materials are presented.

  16. The efficacy of centralized flow rate control in 802.11-based wireless mesh networks

    KAUST Repository

    Jamshaid, K.

    2013-06-13

    Commodity WiFi-based wireless mesh networks (WMNs) can be used to provide last-mile Internet access. These networks exhibit extreme unfairness with backlogged traffic sources. Current solutions propose distributed source-rate control algorithms requiring link-layer or transport-layer changes on all mesh nodes. This is often infeasible in large practical deployments. In wireline networks, router-assisted rate control techniques have been proposed for use alongside end-to-end mechanisms. We wish to evaluate the feasibility of establishing similar centralized control via gateways in WMNs. In this paper, we focus on the efficacy of this control rather than the specifics of the controller design mechanism. We answer the question: given sources that react predictably to congestion notification, can we enforce a desired rate allocation through a single centralized controller? The answer is not obvious because flows experience varying contention levels, and transmissions are scheduled by each node using imperfect local knowledge. We find that common router-assisted flow control schemes used in wired networks fail in WMNs because they assume that (1) links are independent, and (2) router queue buildups are sufficient for detecting congestion. We show that non-work-conserving, rate-based centralized scheduling can effectively enforce rate allocation. It can achieve results comparable to source rate limiting, without requiring any modifications to mesh routers or client devices.
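
    A sketch of non-work-conserving, rate-based scheduling at the gateway, assuming each flow is released at its allocated rate even when the link is idle, which is what enforces the desired allocation over the mesh; the class name, rates and time step are illustrative:

    ```python
    # Per-flow shaper: packets are released only as credit accrues at the
    # allocated rate, never faster, even if capacity is available.
    from collections import deque

    class RateShaper:
        def __init__(self, rate_pps):
            self.rate, self.credit, self.q = rate_pps, 0.0, deque()

        def enqueue(self, pkt):
            self.q.append(pkt)

        def tick(self, dt):
            """Called every dt seconds; returns packets released this tick."""
            self.credit = min(self.credit + self.rate * dt, 1.0)  # burst <= 1
            out = []
            while self.q and self.credit >= 1.0:
                out.append(self.q.popleft())
                self.credit -= 1.0
            return out

    flow = RateShaper(rate_pps=50)
    flow.enqueue("pkt-1"); flow.enqueue("pkt-2")
    print(flow.tick(0.02), flow.tick(0.02))   # one packet per 20 ms
    ```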

  17. Wireless cellular control of time-shared robotic web cameras

    Science.gov (United States)

    Abrams, David H.; Prokopowicz, Peter N.

    2003-01-01

    We present a novel user interface and distributed imaging system for controlling robotic web cameras via a wireless cellular phone. A user scrolls an image canvas to select a new live picture. The cellular phone application (a Java MIDlet) sends a URL request, which encodes the new pan/tilt/optical-zoom setting of a live picture, to a web-camera server. The user downloads a new live picture centered on the user's new viewpoint. The web-camera server mediates requests from users by time-sharing control of the physical robotic hardware. By processing a queue of user requests at different pan/tilt/zoom locations, the server can capture a single photograph for each user. While one user downloads a new live image, the robotic camera moves to capture images and service other independent user requests. The end-to-end system enables each user to independently steer the robotic camera, viewing live snapshot pictures from a cellular phone.

  18. Optical network control plane for multi-domain networking

    DEFF Research Database (Denmark)

    Manolova, Anna Vasileva

    This thesis focuses on multi-domain routing for traffic engineering and survivability support in optical transport networks under the Generalized Multi-Protocol Label Switching (GMPLS) control framework. First, different extensions to the Border Gateway Protocol for multi-domain Traffic Engineering ... process are not enough for efficient TE in mesh multi-domain networks. Enhancing the protocol with multi-path dissemination capability, combined with the employment of an end-to-end TE metric, proves to be a highly efficient solution. Simulation results show good performance characteristics of the proposed ... is not as essential for improved network performance as the length of the provided paths. Second, the issue of multi-domain survivability support is analyzed. AS-disjoint paths are beneficial not only for resilience support, but also for facilitating adequate network reactions to changes in the network, which...

  19. Conservation biological control of pests in the molecular era: new opportunities to address old constraints

    Directory of Open Access Journals (Sweden)

    Gurr Geoff

    2016-01-01

    Full Text Available Biological control has long been considered a potential alternative to pesticidal strategies for pest management, but its impact and level of use globally remain modest and inconsistent. A rapidly expanding range of molecular (particularly DNA-related) techniques is currently revolutionizing many life sciences. This review identifies a series of constraints on the development and uptake of conservation biological control and considers the contemporary and likely future influence of molecular methods on these constraints. Molecular approaches are now often used to complement morphological taxonomic methods for the identification and study of biological control agents, including microbes. A succession of molecular techniques has been applied to 'who eats whom' questions in food-web ecology. Polymerase chain reaction (PCR) approaches have largely superseded immunological approaches such as enzyme-linked immunosorbent assay (ELISA) and are now, in turn, being overtaken by next-generation sequencing (NGS)-based approaches that offer unparalleled power at a rapidly diminishing cost. There is scope also to use molecular techniques to manipulate biological control agents, which will be accelerated with the advent of gene editing tools, the CRISPR/Cas9 system in particular. Gene editing tools also offer unparalleled power to both elucidate and manipulate plant defence mechanisms, including those that involve natural enemy attraction to attacked plants. Rapid advances in technology will allow the development of still more novel pest management options, for which uptake is likely to be limited chiefly by regulatory hurdles.

  20. Improving SCTP Performance by Jitter-Based Congestion Control over Wired-Wireless Networks

    Directory of Open Access Journals (Sweden)

    Chen Jyh-Ming

    2011-01-01

    Full Text Available With the advances in wireless communication technologies, wireless networks have gradually become the most widely adopted communication networks in the new-generation Internet. Computing and mobile devices may be equipped with multiple wired and/or wireless network interfaces. The Stream Control Transmission Protocol (SCTP) has been proposed for reliable data transport, and its multihoming feature makes effective use of network interfaces to improve performance and reliability. However, like TCP, SCTP suffers unnecessary performance degradation over wired-wireless heterogeneous networks. The main reason is that the original congestion control scheme of SCTP cannot differentiate between loss events, so SCTP reduces the congestion window inappropriately. To solve this problem and improve performance, we propose a jitter-based congestion control scheme with end-to-end semantics for wired-wireless networks. In addition, we solve the ineffective jitter ratio problem, which may cause the original jitter-based congestion control scheme to misjudge congestion loss as wireless loss. An available bandwidth estimation scheme is integrated into our congestion control mechanism to make the bottleneck more stable. Simulation experiments reveal that our scheme (JSCTP) improves performance markedly over wired-wireless networks.
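
    A sketch of jitter-based loss differentiation, assuming a window of recent one-way delay samples and a fixed threshold; the metric and threshold are illustrative stand-ins for the paper's jitter ratio:

    ```python
    # Classify a detected loss as congestion (queues building, so back
    # off) or wireless error (random loss, so keep the window) from the
    # relative delay variation over recent samples.

    JITTER_THRESHOLD = 0.1  # assumed

    def jitter_ratio(delays):
        avg = sum(delays) / len(delays)
        return (max(delays) - min(delays)) / avg if avg else 0.0

    def classify_loss(recent_delays):
        if jitter_ratio(recent_delays) > JITTER_THRESHOLD:
            return "congestion"
        return "wireless"

    print(classify_loss([0.050, 0.051, 0.049]))   # -> 'wireless'
    print(classify_loss([0.050, 0.070, 0.090]))   # -> 'congestion'
    ```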

  1. Integration of congestion control and adaptive playout for unicast media streaming

    Science.gov (United States)

    Kim, Yoonyoung; Kim, JongWon

    2003-11-01

    To ease multimedia streaming over the QoS-deficient Internet, network-adaptive streaming has been introduced recently. For a unicast media streaming environment, TCP-friendly end-to-end congestion control is widely suggested to handle network congestion. However, congestion control usually produces abrupt changes in the available bandwidth between streaming systems, and the performance of media streaming can be degraded. To help in this situation, we can adaptively control the playback speed of audio and video by adopting the time-scale modification technique. It can mitigate the effect of network variations in delay and loss, with particular relevance to low-latency video streaming. In this paper, we attempt to improve the streaming quality under congestion control by taking advantage of the adaptive playout mechanism: by estimating the expected buffer level and the change in transmission rate, the system can proactively prepare for imminent changes and control the playback rate accordingly.
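
    A sketch of adaptive playout via time-scale modification, assuming the playback speed is scaled smoothly around 1.0 according to how far the receiver buffer deviates from a target level; the bounds and gain are assumptions chosen to keep speed changes imperceptible:

    ```python
    # Buffer below target -> play slower (buffer refills); buffer above
    # target -> play faster (latency drains). Speed is clamped so the
    # time-scale modification stays unobtrusive.

    TARGET_MS, GAIN = 200.0, 0.002
    MIN_SPEED, MAX_SPEED = 0.85, 1.15

    def playout_speed(buffer_ms):
        speed = 1.0 + GAIN * (buffer_ms - TARGET_MS)
        return max(MIN_SPEED, min(MAX_SPEED, speed))

    print(playout_speed(120))  # low buffer -> slower playback
    print(playout_speed(300))  # high buffer -> faster playback
    ```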

  2. Power flow control based solely on slow feedback loop for heart pump applications.

    Science.gov (United States)

    Wang, Bob; Hu, Aiguo Patrick; Budgett, David

    2012-06-01

    This paper proposes a new control method for regulating power flow via transcutaneous energy transfer (TET) for implantable heart pumps. Previous work on power flow control required a fast feedback loop, with additional switching devices and resonant capacitors added to the primary converter. The proposed power flow controller eliminates these additional components and relies solely on a slow feedback loop that directly drives the primary converter to meet the heart pump power demand and ensure zero-voltage switching. A controlled change in switching frequency varies the resonant tank shorting period of a current-fed push-pull resonant converter, thus changing the magnitude of the primary resonant voltage as well as the tuning between the primary and secondary resonant tanks. The proposed controller has been implemented successfully using an analogue circuit and has reached an end-to-end power efficiency of 79.6% at 10 W, with a switching frequency regulation range of 149.3 kHz to 182.2 kHz.

  3. JWST Wavefront Sensing and Control: Operations Plans, Demonstrations, and Status

    Science.gov (United States)

    Perrin, Marshall; Acton, D. Scott; Lajoie, Charles-Philippe; Knight, J. Scott; Myers, Carey; Stark, Chris; JWST Wavefront Sensing & Control Team

    2018-01-01

    After JWST launches and unfolds in space, its telescope optics will be aligned through a complex series of wavefront sensing and control (WFSC) steps to achieve diffraction-limited performance. This iterative process will comprise about half of the observatory commissioning time (~ 3 out of 6 months). We summarize the JWST WFSC process, schedule, and expectations for achieved performance, and discuss our team’s activities to prepare for an effective & efficient telescope commissioning. During the recently-completed OTIS cryo test at NASA JSC, WFSC demonstrations showed the flight-like operation of the entire JWST active optics and WFSC system from end to end, including all hardware and software components. In parallel, the same test data were processed through the JWST Mission Operations Center at STScI to demonstrate the readiness of ground system components there (such as the flight operations system, data pipelines, archives, etc). Moreover, using the Astronomer’s Proposal Tool (APT), the entire telescope commissioning program has been implemented, reviewed, and is ready for execution. Between now and launch our teams will continue preparations for JWST commissioning, including further rehearsals and testing, to ensure a successful alignment of JWST’s telescope optics.

  4. Congestion-Aware Routing and Fuzzy-based Rate Controller for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    M. Hatamian

    2016-04-01

    Full Text Available In this paper, congestion-aware routing and a fuzzy-based rate controller for wireless sensor networks (WSNs) are proposed. The proposed method distinguishes between locally generated data and transit data by using a priority-based mechanism that provides a novel queueing model. Furthermore, a novel congestion-aware routing scheme using a greedy approach is proposed, which tries to find more affordable routes. Moreover, a fuzzy rate controller is utilized for rate control, using two criteria as its inputs: congestion score and buffer occupancy. These two parameters are based on the total packet input rate, the packet forwarding rate at the MAC layer, the number of packets in the queue buffer, and the total buffer size at each node. As soon as congestion is detected, a notification signal is sent to offspring nodes so that they can adjust their data transmission rates. Simulation results clearly show that the proposed method, implemented using a greedy approach and fuzzy logic, significantly reduces packet loss rate, end-to-end delay and average energy consumption.
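
    A minimal sketch of a two-input fuzzy rate controller, assuming congestion score and buffer occupancy are both normalized to [0, 1], fuzzified into low/high memberships, and combined into a rate-adjustment factor by a weighted rule base; the membership shapes and rule outputs are illustrative, not the paper's design:

    ```python
    # Fuzzify inputs, fire a small rule base, defuzzify to a multiplier
    # that scales the node's data transmission rate.

    def mu_low(x):  return max(0.0, 1.0 - 2.0 * x)   # 1 at 0, 0 from 0.5
    def mu_high(x): return max(0.0, 2.0 * x - 1.0)   # 0 up to 0.5, 1 at 1

    def rate_factor(congestion, occupancy):
        rules = [
            (min(mu_low(congestion),  mu_low(occupancy)),  1.2),  # speed up
            (min(mu_high(congestion), mu_high(occupancy)), 0.5),  # throttle
            (max(mu_high(congestion), mu_high(occupancy)), 0.8),  # caution
        ]
        num = sum(w * out for w, out in rules)
        den = sum(w for w, _ in rules)
        return num / den if den else 1.0   # weighted-average defuzzification

    print(rate_factor(0.2, 0.3))   # light load -> factor > 1
    print(rate_factor(0.9, 0.8))   # heavy load -> factor < 1
    ```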

  5. Investigations of an Accelerometer-based Disturbance Feedforward Control for Vibration Suppression in Adaptive Optics of Large Telescopes

    Science.gov (United States)

    Glück, Martin; Pott, Jörg-Uwe; Sawodny, Oliver

    2017-06-01

    Adaptive Optics (AO) systems in large telescopes correct not only atmospheric phase disturbances but also telescope structure vibrations induced by wind or telescope motions. Often the additional wavefront error due to mirror vibrations can dominate the disturbance power and contribute significantly to the total tip-tilt Zernike mode error budget. Presently, these vibrations are compensated for by common feedback control laws. However, when observing faint natural guide stars (NGS) at reduced control bandwidth, high-frequency vibrations (>5 Hz) cannot be fully compensated for by feedback control. In this paper, we present an additional accelerometer-based disturbance feedforward control (DFF), which is independent of the NGS wavefront sensor exposure time, to enlarge the “effective servo bandwidth”. The DFF is studied in a realistic AO end-to-end simulation and compared with commonly used suppression concepts. For observations in the faint (>13 mag) NGS regime, we obtain a Strehl ratio two to four times larger than with classical feedback control. The simulation realism is verified with real measurement data from the Large Binocular Telescope (LBT); application for on-sky testing at the LBT and implementation at the E-ELT in the MICADO instrument are discussed.

  6. TCP flow control using link layer information in mobile networks

    Science.gov (United States)

    Koga, Hiroyuki; Kawahara, Kenji; Oie, Yuji

    2002-07-01

    Mobile networks have been expanding, and IMT-2000 further increases their available bandwidth over wireless links. However, TCP, a reliable end-to-end transport protocol, is tuned to perform well in wired networks, where bit error rates are very low and packet loss occurs mostly because of congestion. Although a TCP sender can execute flow control to utilize as much available bandwidth as possible in wired networks, it cannot work well in wireless networks characterized by high bit error rates. In next-generation mobile systems, sophisticated error recovery technologies such as FEC and ARQ are indeed employed over wireless links, i.e., at Layer 2, to avoid performance degradation of the upper layers. However, multiple retransmissions by Layer 2 ARQ can adversely increase the transmission delay of TCP segments, which will further make TCP unnecessarily increase its RTO (Retransmission TimeOut). Furthermore, the link bandwidth assigned to TCP flows can change in response to changing air conditions so that wireless links are used efficiently. TCP thus has to adapt its transmission rate to the changing available bandwidth. The major goal of this study is to develop a receiver-based, effective TCP flow control that requires no modification of TCP senders, which are likely connected to wired networks. To this end, we propose a TCP flow control employing Layer 2 information on a wireless link at the mobile station. Our performance evaluation of the proposed TCP shows that the receiver-based TCP flow control moderates the performance degradation very well even if the FER on Layer 2 is high.
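
    A sketch of the receiver-side idea, assuming the mobile station knows the link-layer assigned bandwidth and the extra ARQ delay and uses them to size the window it advertises, so the wired-side sender never overruns the wireless link; the function and field names are assumptions:

    ```python
    # Clamp the advertised window to the wireless bandwidth-delay
    # product, including the Layer 2 ARQ delay contribution.

    MSS = 1460  # bytes

    def advertised_window(link_bw_bps, rtt_s, l2_arq_delay_s):
        bdp_bytes = link_bw_bps / 8.0 * (rtt_s + l2_arq_delay_s)
        return max(MSS, int(bdp_bytes // MSS) * MSS)

    # e.g. a 384 kbps link, 300 ms RTT, 120 ms ARQ delay:
    print(advertised_window(384_000, 0.3, 0.12))  # bytes to advertise
    ```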

  7. Hybrid Control of Supply Chains: a Structured Exploration from a Systems Perspective

    Directory of Open Access Journals (Sweden)

    Paul W. P. J. Grefen

    2013-07-01

    Full Text Available Supply chains are becoming increasingly complex these days, both in the structure of the chains and in the need for fine-grained, real-time control. This development occurs in many industries, such as manufacturing, logistics, and the service industry. The increasing structural complexity is caused by larger numbers of participating companies in supply chains, owing to the increasing complexity of products and services. Increasing control requirements are caused by developments like mass customization, pressure on delivery times, and smaller margins for waste. Maintaining well-structured strategic, tactical, and operational control over these complex supply chains is not an easy task, especially as they are pressured by end-to-end synchronization requirements and just-in-time demands. Things become even more complex when chains need to be flexible to react to changing requirements on the products or services they deliver. To enable the design of well-structured control, clear models of control topologies are required. In this paper, we address this need by exploring supply chain control topologies in an organized fashion. The exploration is based on integrating a supply chain model and a control model in two alternative ways to obtain two extreme models of supply chain control. These two models are then combined to obtain a hybrid chain control model in which control parameters can be adapted to accommodate different circumstances, hence facilitating agility in supply chains and networks. We apply the developed model to a number of case studies to show its usability. The contribution of this paper is the structured analysis of the design space for chain-level control models, not the description of individual new models.

  8. Smart Control of Multiple Evaporator Systems with Wireless Sensor and Actuator Networks

    Directory of Open Access Journals (Sweden)

    Apolinar González-Potes

    2016-02-01

    Full Text Available This paper describes the complete integration of fuzzy control of multiple evaporator systems with the IEEE 802.15.4 standard. We study several important aspects of this kind of system: a detailed analysis of end-to-end real-time flows over wireless sensor and actuator networks (WSAN), a real-time kernel with an earliest deadline first (EDF) scheduler, periodic and aperiodic tasking models for the nodes, lightweight and flexible compensation-based control algorithms for WSANs that exhibit packet dropouts, an event-triggered sampling scheme, and design methodologies. We address the control problem of the multi-evaporators in the presence of uncertainties, tackled through a wireless fuzzy control approach, and show the advantages of this concept: it can easily perform optimization for a set of multiple evaporators controlled by the same smart controller, which should have an intelligent and flexible architecture based on multi-agent systems (MAS) that allows one to add or remove evaporators online, without reconfiguration, while maintaining the temporal and functional restrictions of the system. We show clearly how to obtain greater scalability, self-configuration of the network, and the least overhead with the non-beacon (unslotted) mode of the IEEE 802.15.4 protocol, as well as wireless communications and distributed architectures, which can be extremely helpful in the development of networked control systems in large, spatially distributed plants involving many sensors and actuators. For this purpose, a fuzzy scheme is used to control a set of parallel evaporator air-conditioning systems, with temperature and relative humidity control as a multi-input, multi-output closed-loop system; in addition, a general architecture is presented which implements multiple control loops closed over a communication network, integrating the analysis and validation method for multi...

  9. Biogeographical patterns and environmental controls of phytoplankton communities from contrasting hydrographical zones of the Labrador Sea

    Science.gov (United States)

    Fragoso, Glaucia M.; Poulton, Alex J.; Yashayaev, Igor M.; Head, Erica J. H.; Stinchcombe, Mark C.; Purdie, Duncan A.

    2016-02-01

    The Labrador Sea is an important oceanic sink for atmospheric CO2 because of intensive convective mixing during winter and extensive phytoplankton blooms that occur during spring and summer. Therefore, a broad-scale investigation of the responses of phytoplankton community composition to environmental forcing is essential for understanding planktonic food-web organisation and biogeochemical functioning in the Labrador Sea. Here, we investigated the phytoplankton community structure (>4 μm) from near-surface blooms (1.2 mg chl a m-3) that occurred on and near the shelves in May and in offshore waters of the central Labrador Sea in June, due to haline and thermal stratification, respectively. Sea-ice-related (Fragilariopsis cylindrus and F. oceanica) and Arctic diatoms (Fossula arctica, Bacterosira bathyomphala and Thalassiosira hyalina) dominated the relatively cold ... Ephemera planamembranacea and Fragilariopsis atlantica. The data presented here demonstrate that the Labrador Sea spring and early summer blooms are composed of contrasting phytoplankton communities, whose taxonomic segregation appears to be controlled by the physical and biogeochemical characteristics of the dominant water masses.

  10. CRRT: Congestion-Aware and Rate-Controlled Reliable Transport in Wireless Sensor Networks

    Science.gov (United States)

    Alam, Muhammad Mahbub; Hong, Choong Seon

    For successful data collection in wireless sensor networks, it is important to ensure that the required delivery ratio is maintained while keeping a fair rate for every sensor. Furthermore, emerging high-rate applications might require complete reliability and the transfer of large volumes of data, where persistent congestion might occur. These requirements demand a complete but efficient solution for data transport in sensor networks: one that reliably transports data from many sources to one or more sinks, avoids congestion, and maintains fairness. In this paper, we propose congestion-aware and rate-controlled reliable transport (CRRT), an efficient and low-overhead data transport mechanism for sensor networks. CRRT uses efficient MAC retransmission to increase one-hop reliability and end-to-end retransmission for loss recovery. It also controls the total rate of the sources centrally, avoids congestion at the bottleneck based on congestion notifications from intermediate nodes, and centrally assigns rates to the sources based on the rate assignment policy of the applications. The performance of CRRT is evaluated in NS-2, and simulation results demonstrate its effectiveness.
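
    A sketch of centralized rate assignment in the spirit of CRRT, assuming the sink recomputes per-source rates from a reported bottleneck capacity under a simple weighted-fair policy; the weights and capacity are illustrative, and the paper's actual assignment policy is application-defined:

    ```python
    # On a congestion notification, divide the bottleneck capacity among
    # sources in proportion to application-assigned weights.

    def assign_rates(bottleneck_pps, source_weights):
        total = sum(source_weights.values())
        return {src: bottleneck_pps * w / total
                for src, w in source_weights.items()}

    # e.g. a high-rate application gets twice the share of ordinary sensors
    print(assign_rates(200, {"s1": 1, "s2": 1, "s3": 2}))
    ```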

  11. Integrated operational control and dynamic task allocation of unattended distributed sensor systems

    Science.gov (United States)

    Talukder, Ashit

    2009-05-01

    Unattended autonomous systems of the future will involve groups of static and mobile sensors functioning in coordination to achieve overall task objectives. Such systems can be viewed as wirelessly networked unmanned heterogeneous sensor networks. We discuss a distributed heterogeneous sensing system with static sensors and mobile robots, with novel control optimization algorithms for dynamic adaptation, coordinated control and end-to-end resource management of all sensors in response to detected events, in order to achieve overall system goals and objectives. Our system is designed for a host of applications, such as unmediated data monitoring and record keeping of the environment, battlefield monitoring using integrated ground, ocean and air sensors, reactive operation in response to threats or changing conditions, and homeland security or border/road surveillance systems where unmanned vehicles can be deployed autonomously in response to detected events. Results for large-area coastal monitoring are presented. Offline results using modeled data from in-situ sensory measurements demonstrate how the sensor parameters can be adapted to maximize observability of a freshwater plume while ensuring that individual system components operate within their physical limitations.

  12. Cyber-Physical Attack-Resilient Wide-Area Monitoring, Protection, and Control for the Power Grid

    Energy Technology Data Exchange (ETDEWEB)

    Ashok, Aditya; Govindarasu, Manimaran; Wang, Jianhui

    2017-07-01

    Cyber security and resiliency of Wide-Area Monitoring, Protection and Control (WAMPAC) applications are critically important to ensure secure, reliable, and economic operation of the bulk power system. WAMPAC relies heavily on the security of measurements and control commands transmitted over wide-area communication networks for real-time operational, protection, and control functions. Moreover, the current “N-1 security criterion” for grid operation is inadequate to address malicious cyber events; it is therefore important to fundamentally redesign WAMPAC and to enhance Energy Management System (EMS) applications to make them attack-resilient. In this paper, we propose an end-to-end defense-in-depth architecture for attack-resilient WAMPAC that addresses resilience at both the infrastructure layer and the application layer. We also propose an attack-resilient cyber-physical security framework that encompasses the entire security life cycle, including risk assessment, attack prevention, attack detection, attack mitigation, and attack resilience. The overarching objective of this paper is to provide a broad scope that comprehensively describes most of the major research issues and potential solutions in the context of the cyber-physical security of WAMPAC for the power grid.

  13. Mount control system of the ASTRI SST-2M prototype for the Cherenkov Telescope Array

    Science.gov (United States)

    Antolini, Elisa; Tosti, Gino; Tanci, Claudio; Bagaglia, Marco; Canestrari, Rodolfo; Cascone, Enrico; Gambini, Giorgio; Nucciarelli, Giuliano; Pareschi, Giovanni; Scuderi, Salvo; Stringhetti, Luca; Busatta, Andrea; Giacomel, Stefano; Marchiori, Gianpietro; Manfrin, Cristiana; Marcuzzi, Enrico; Di Michele, Daniele; Grigolon, Carlo; Guarise, Paolo

    2016-08-01

    The ASTRI SST-2M telescope is an end-to-end prototype proposed for the Small Size class of Telescopes (SST) of the future Cherenkov Telescope Array (CTA). The prototype is installed in Italy at the INAF observing station located at Serra La Nave on Mount Etna (Sicily) and was inaugurated in September 2014. This paper presents the software and hardware architecture and development of the system dedicated to the control of the mount and of the health, safety and monitoring systems of the ASTRI SST-2M telescope prototype. The mount control system installed on the ASTRI SST-2M telescope prototype makes use of standard and widely deployed industrial hardware and software. State-of-the-art components from the control and automation industries were selected in order to fulfill the mount-related functional and safety requirements with assembly compactness, high reliability, and reduced maintenance. The software package was implemented in the Beckhoff TwinCAT version 3 environment for the software Programmable Logic Controller (PLC), while the control electronics were chosen to maximize the homogeneity and real-time performance of the system. Integration with the high-level controller (Telescope Control System) was carried out using the open platform communications Unified Architecture (UA) protocol, which supports a rich data model while offering compatibility with the PLC platform. In this contribution we show how the ASTRI approach to the design and implementation of the mount control system has made the ASTRI SST-2M prototype a standalone intelligent machine, able to fulfill requirements and easy to integrate into an array configuration such as the future ASTRI mini-array proposed for installation at the southern site of the Cherenkov Telescope Array (CTA).

  14. A survey of performance enhancement of transmission control protocol (TCP) in wireless ad hoc networks

    Directory of Open Access Journals (Sweden)

    Mast Noor

    2011-01-01

    Full Text Available Transmission control protocol (TCP), which provides reliable end-to-end data delivery, performs well in traditional wired network environments, but it does not perform well in wireless ad hoc networks. Compared to wired networks, wireless ad hoc networks have some specific characteristics, such as node mobility and a shared medium. Owing to these specific characteristics, TCP faces particular problems with, for example, route failure, channel contention and high bit error rates. These factors are responsible for the performance degradation of TCP in wireless ad hoc networks. The research community has produced a wide range of proposals to improve the performance of TCP in wireless ad hoc networks. This article presents a survey of these proposals (approaches). A classification of TCP improvement proposals for wireless ad hoc networks is presented, which makes it easy to compare the proposals falling under the same category. Tables which summarize the approaches for a quick overview are provided. Possible directions for further improvements in this area are suggested in the conclusions. The aim of the article is to enable the reader to quickly acquire an overview of the state of TCP in wireless ad hoc networks.

  15. Increasing operational command and control security by the implementation of device independent quantum key distribution

    Science.gov (United States)

    Bovino, Fabio Antonio; Messina, Angelo

    2016-10-01

    In a very simplistic way, the Command and Control functions can be summarized as the need to provide decision makers with an exhaustive, real-time situation picture and the capability to convey their decisions down to the operational forces. This two-way data and information flow is vital to the execution of current operations and goes far beyond the border of military operations, stretching to police work and disaster recovery as well. The availability of off-the-shelf technology has enabled hostile elements to endanger the security of communication networks by violating traditional security protocols and devices and by hacking sensitive databases. In this paper, an innovative approach based on implementing a Device Independent Quantum Key Distribution system is presented. The use of this technology would prevent security breaches due to a stolen crypto device placed in an end-to-end communication chain. The system, operating with an attenuated laser, is practical and increases the distance between the legitimate users.

  16. Do differences in food web structure between organic and conventional farms affect the ecosystem service of pest control?

    Science.gov (United States)

    Macfadyen, Sarina; Gibson, Rachel; Polaszek, Andrew; Morris, Rebecca J; Craze, Paul G; Planqué, Robert; Symondson, William O C; Memmott, Jane

    2009-03-01

    While many studies have demonstrated that organic farms support greater levels of biodiversity, it is not known whether this translates into better provision of ecosystem services. Here we use a food-web approach to analyse community structure and function at the whole-farm scale. Quantitative food webs from 10 replicate pairs of organic and conventional farms showed that organic farms have significantly more species at three trophic levels (plant, herbivore and parasitoid) and significantly different network structure. Herbivores on organic farms were attacked by more parasitoid species than herbivores on conventional farms. However, differences in network structure did not translate into differences in robustness to simulated species loss, and we found no difference in percentage parasitism (natural pest control) across a variety of host species. Furthermore, a manipulative field experiment demonstrated that the higher species richness of parasitoids on the organic farms did not increase mortality of a novel herbivore used to bioassay the ecosystem service. The explanation for these differences is likely to include inherent differences in management strategies and landscape structure between the two farming systems.

  17. Controlling for anthropogenically induced atmospheric variation in stable carbon isotope studies.

    Science.gov (United States)

    Long, Eric S; Sweitzer, Richard A; Diefenbach, Duane R; Ben-David, Merav

    2005-11-01

    Increased use of stable isotope analysis to examine food-web dynamics, migration, transfer of nutrients, and behavior will likely result in an expansion of stable isotope studies investigating human-induced global changes. Recent elevation of the atmospheric CO2 concentration, related primarily to fossil fuel combustion, has reduced the δ13C (13C/12C) of atmospheric CO2, and this change in isotopic baseline has, in turn, reduced the tissue δ13C of terrestrial and aquatic organisms. Such depletion in CO2 δ13C and its effects on tissue δ13C may introduce bias into δ13C investigations, and if this variation is not controlled, it may confound interpretation of results obtained from tissue samples collected over a span of time. To control for this source of variation, we used a high-precision record of atmospheric CO2 δ13C from ice cores and direct atmospheric measurements to model the modern change in CO2 δ13C. From this model, we estimated a correction factor that controls for atmospheric change; this correction reduces bias associated with changes in the atmospheric isotopic baseline and facilitates comparison of tissue δ13C collected over multiple years. To exemplify the importance of accounting for atmospheric CO2 δ13C depletion, we applied the correction to a dataset of collagen δ13C obtained from mountain lion (Puma concolor) bone samples collected in California between 1893 and 1995. Before correction, in three of four ecoregions collagen δ13C decreased significantly, concurrent with the depletion of atmospheric CO2 δ13C (n ≥ 32, P ...); applying the correction to the collagen δ13C data removed the trends from regions demonstrating significant declines, and measurement error associated with the correction did not add substantial variation to the adjusted estimates. Controlling for long-term atmospheric variation and correcting tissue samples for changes in isotopic baseline facilitate the analysis of samples that span a large temporal range.
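
    A sketch of the correction idea, not the authors' fitted model: tissue δ13C sampled in different years is adjusted to a common reference year by subtracting the modeled change in atmospheric CO2 δ13C. The tabulated atmospheric values below are rough illustrative figures, and the linear interpolation is a stand-in for the high-precision ice-core record:

    ```python
    # Adjust tissue d13C for the atmospheric (Suess-effect) baseline
    # shift so samples from different years become comparable.

    ATMOS_D13C = {1850: -6.5, 1950: -6.9, 1995: -7.8}  # per mil (approx.)

    def atmos(year):
        """Linear interpolation within the tabulated range."""
        ys = sorted(ATMOS_D13C)
        y0 = max(y for y in ys if y <= year)
        y1 = min(y for y in ys if y >= year)
        if y0 == y1:
            return ATMOS_D13C[y0]
        f = (year - y0) / (y1 - y0)
        return ATMOS_D13C[y0] + f * (ATMOS_D13C[y1] - ATMOS_D13C[y0])

    def corrected_d13c(tissue_d13c, sample_year, ref_year=1900):
        return tissue_d13c - (atmos(sample_year) - atmos(ref_year))

    print(corrected_d13c(-21.5, 1995))   # comparable with 1900 samples
    ```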

  18. The telescope control of the ASTRI SST-2M prototype for the Cherenkov telescope Array: hardware and software design architecture

    Science.gov (United States)

    Antolini, Elisa; Cascone, Enrico; Schwarz, Joseph; Stringhetti, Luca; Tanci, Claudio; Tosti, Gino; Aisa, Damiano; Aisa, Simone; Bagaglia, Marco; Busatta, Andrea; Campeggi, Carlo; Cefala, Marco; Farnesini, Lucio; Giacomel, Stefano; Marchiori, Gianpiero; Marcuzzi, Enrico; Nucciarelli, Giuliano; Piluso, Antonfranco

    2014-07-01

    ASTRI (Astrofisica con Specchi a Tecnologia Replicante Italiana) is a flagship project of the Italian Ministry of Research, led by the Italian National Institute of Astrophysics (INAF). One of its aims is to develop, within the Cherenkov Telescope Array (CTA) framework, an end-to-end small-sized telescope prototype in a dual-mirror configuration (SST-2M) in order to investigate the energy range E ~ 1-100 TeV. A long-term goal of the ASTRI program is the production of an ASTRI/CTA mini-array composed of seven SST-2M telescopes. The prototype, named ASTRI SST-2M, is seen as a standalone system that needs only network and power connections to work. The software system being developed to control the prototype is the base for the Mini-Array Software System (MASS), whose task is to make possible the operation of both the ASTRI SST-2M prototype and the ASTRI/CTA mini-array. The scope of this contribution is to give an overview of the hardware and software architecture adopted for the ASTRI SST-2M prototype, showing how state-of-the-art industrial technologies can be applied to telescope control and monitoring systems.

  19. Towards Controlling Latency in Wireless Networks

    KAUST Repository

    Bouacida, Nader

    2017-04-24

    Wireless networks have undergone an unprecedented revolution over the last decade. With the explosion of delay-sensitive applications on the Internet (i.e., online gaming and VoIP), latency has become a major issue for the development of wireless technology. Taking advantage of the significant decline in memory prices, manufacturers equip network devices with larger buffering capacities to improve network throughput by limiting packet drops. Over-buffering increases the time that packets spend in queues and, thus, introduces more latency into networks. This phenomenon is known as "bufferbloat". While throughput is the dominant performance metric, latency also has a huge impact on user experience, not only for real-time applications but also for common applications like web browsing, which is sensitive to latencies on the order of hundreds of milliseconds. Concerns have arisen about designing sophisticated queue management schemes to mitigate the effects of this phenomenon. My thesis research aims to solve the bufferbloat problem in both traditional half-duplex and cutting-edge full-duplex wireless systems by reducing delay while maximizing wireless link utilization and fairness. Our work sheds light on the behavior of buffer management algorithms in wireless networks and their ability to reduce the latency resulting from excessive queuing delays inside oversized static network buffers without a significant loss in other network metrics. First of all, we address the problem of buffer management in wireless full-duplex networks by using Wireless Queue Management (WQM), an active queue management technique for wireless networks. Our solution is based on Relay Full-Duplex MAC (RFD-MAC), an asynchronous media access control protocol designed for relay full-duplexing. Compared to the default case, our solution reduces the end-to-end delay by two orders of magnitude while achieving similar throughput in most of the cases. In the second part of this thesis
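
    The buffer-sizing intuition behind schemes like WQM can be put in a few lines: cap the queue so that, at the link's current drain rate, the queued packets never represent more than a target delay. The sketch below is a minimal illustration of that idea under assumed names and constants, not the thesis' RFD-MAC-specific algorithm.

        # Delay-based buffer sizing: keep only as many packets as the link can
        # drain within a queuing-delay budget, instead of a huge static buffer.
        # All names and constants here are illustrative assumptions.

        TARGET_DELAY_S = 0.02  # 20 ms queuing-delay budget (assumed)

        def queue_limit_packets(drain_rate_bps, mean_packet_bits):
            """Largest queue (in packets) that drains within the budget."""
            return max(1, int(drain_rate_bps * TARGET_DELAY_S / mean_packet_bits))

        def enqueue(queue, packet, drain_rate_bps, mean_packet_bits):
            """Drop-tail against the dynamic limit."""
            if len(queue) >= queue_limit_packets(drain_rate_bps, mean_packet_bits):
                return False  # drop: queuing this packet would exceed the budget
            queue.append(packet)
            return True

        # Example: a 10 Mb/s link with 1500-byte (12,000-bit) packets.
        print(queue_limit_packets(10e6, 12000))  # -> 16 packets, not thousands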

  20. Human-level control through deep reinforcement learning.

    Science.gov (United States)

    Mnih, Volodymyr; Kavukcuoglu, Koray; Silver, David; Rusu, Andrei A; Veness, Joel; Bellemare, Marc G; Graves, Alex; Riedmiller, Martin; Fidjeland, Andreas K; Ostrovski, Georg; Petersen, Stig; Beattie, Charles; Sadik, Amir; Antonoglou, Ioannis; King, Helen; Kumaran, Dharshan; Wierstra, Daan; Legg, Shane; Hassabis, Demis

    2015-02-26

    The theory of reinforcement learning provides a normative account, deeply rooted in psychological and neuroscientific perspectives on animal behaviour, of how agents may optimize their control of an environment. To use reinforcement learning successfully in situations approaching real-world complexity, however, agents are confronted with a difficult task: they must derive efficient representations of the environment from high-dimensional sensory inputs, and use these to generalize past experience to new situations. Remarkably, humans and other animals seem to solve this problem through a harmonious combination of reinforcement learning and hierarchical sensory processing systems, the former evidenced by a wealth of neural data revealing notable parallels between the phasic signals emitted by dopaminergic neurons and temporal difference reinforcement learning algorithms. While reinforcement learning agents have achieved some successes in a variety of domains, their applicability has previously been limited to domains in which useful features can be handcrafted, or to domains with fully observed, low-dimensional state spaces. Here we use recent advances in training deep neural networks to develop a novel artificial agent, termed a deep Q-network, that can learn successful policies directly from high-dimensional sensory inputs using end-to-end reinforcement learning. We tested this agent on the challenging domain of classic Atari 2600 games. We demonstrate that the deep Q-network agent, receiving only the pixels and the game score as inputs, was able to surpass the performance of all previous algorithms and achieve a level comparable to that of a professional human games tester across a set of 49 games, using the same algorithm, network architecture and hyperparameters. This work bridges the divide between high-dimensional sensory inputs and actions, resulting in the first artificial agent that is capable of learning to excel at a diverse array of challenging tasks.
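
    The learning rule at the core of the deep Q-network is the published temporal-difference update toward the target y = r + gamma * max_a' Q_target(s', a'), with actions chosen epsilon-greedily. The sketch below shows just that target computation and action selection in plain NumPy; the actual agent wraps these around convolutional networks, an experience-replay buffer of Atari frames, and a periodically frozen target network.

        import numpy as np

        # Schematic DQN pieces: TD targets for a minibatch of transitions
        # (s, a, r, s', done) and the epsilon-greedy behaviour policy.
        # q_net / q_target stand in for the online and frozen networks.

        GAMMA = 0.99  # discount factor
        rng = np.random.default_rng()

        def td_targets(q_target, rewards, next_states, dones):
            """y_i = r_i + gamma * max_a' Q_target(s'_i, a'), zeroed at episode end."""
            next_q = q_target(next_states).max(axis=1)
            return rewards + GAMMA * next_q * (1.0 - dones)

        def epsilon_greedy(q_net, state, epsilon, n_actions):
            """Random action with probability epsilon, else the greedy action."""
            if rng.random() < epsilon:
                return int(rng.integers(n_actions))
            return int(np.argmax(q_net(state[None])[0]))

    The online network is then regressed toward these targets by stochastic gradient descent on the squared error, with the target network's weights refreshed from the online network at fixed intervals.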

  2. Manual Khalifa Therapy Improves Functional and Morphological Outcome of Patients with Anterior Cruciate Ligament Rupture in the Knee: A Randomized Controlled Trial

    Directory of Open Access Journals (Sweden)

    Michael Ofner

    2014-01-01

    Full Text Available Rupture of the anterior cruciate ligament (ACL) is a high-incidence injury usually treated surgically. According to common knowledge it does not heal spontaneously, although some claim the opposite. Regeneration therapy by Khalifa was developed for injuries of the musculoskeletal system, using specific pressure applied to the skin. This randomized, controlled, observer-blinded, multicentre study was performed to validate this assumption. Thirty patients with complete ACL rupture, verified by magnetic resonance imaging (MRI), were included. Study examinations (e.g., the International Knee Documentation Committee (IKDC) score) were performed at inclusion (t0). Patients were randomized to receive either standardised physiotherapy (ST) or, additionally, 1 hour of Khalifa therapy at the first session (STK). Twenty-four hours later, study examinations were performed again (t1). Three months later, control MRI and follow-up examinations were performed (t2). Initial status was comparable between both groups. There was a highly significant difference in mean IKDC scores at t1 and t2. After 3 months, 47% of the STK patients, but no ST patient, demonstrated an end-to-end homogeneous ACL on MRI. Clinical and physical examinations differed significantly at t1 and t2. ACL healing can be improved with manual therapy. Physical activity could be performed without pain and with nearly normal range of motion after one treatment of specific pressure.

  3. Group scheduling based on control-packet batch processing in optical burst switched networks

    Science.gov (United States)

    Yuan, Chi; Li, Zhengbin; He, Yongqi; Xu, Anshi

    2007-11-01

    Optical burst switching (OBS) is proposed as a high-speed, flexible, and transparent technology. It is considered the best way to carry bursty IP traffic over optical wavelength division multiplexing (WDM) networks, facilitating the efficient integration of IP and WDM. It provides statistical multiplexing gains and avoids the long end-to-end setup time of traditional virtual circuit configuration. However, many challenges remain, one of which is burst contention. Because random-access-memory-like buffering is not available in the optical domain at present, there is a real possibility that bursts contend with one another at a switching node. Many contention-resolution schemes have been proposed; the major ones in the literature are wavelength conversion, fiber delay lines, and deflection routing. In this paper, a new data burst scheduling scheme, called group scheduling based on control-packet batch processing (GSCBP), is proposed to reduce burst contention. Like the transmission control protocol, GSCBP has a batch processing window: control packets located in the window are processed as a batch, and a heuristic scheduling algorithm arranges the routes of the corresponding bursts based on the processing result and the available network resources. A new node architecture supporting group scheduling is presented, in which the GSCBP algorithm is combined with wavelength converters and/or fiber delay lines shared among data channels. In addition, an extended open shortest path first (E-OSPF) routing strategy is proposed for OBS. Both GSCBP and E-OSPF are evaluated on the 14-node National Science Foundation network by means of simulations, measuring end-to-end (ETE) delay, burst blocking probability, and burst dropping probability. Results show that GSCBP reduces the drop rate of higher-priority traffic by one order of magnitude, at the cost of a higher drop rate and ETE delay for lower-priority traffic.

  4. Control mechanisms on the ctenophore Mnemiopsis population dynamics: A modeling study

    Science.gov (United States)

    Salihoglu, B.; Fach, B. A.; Oguz, T.

    2011-07-01

    A comprehensive understanding of the mechanisms that control ctenophore Mnemiopsis blooms in the Black Sea is gained with a zero-dimensional, population-based model. The stage-resolving model considers detailed mass and population growth dynamics of four stages of a model ctenophore: egg, juvenile, transitional and adult, each with different growth characteristics. The dietary patterns of the different stages follow observations given in the literature. The model is able to represent consistent development patterns while reflecting the physiological complexity of a population of Mnemiopsis species. The model is used to analyze the influence of temperature and food variability on Mnemiopsis reproduction and outbursts. Model results imply strong temperature control on all stages of Mnemiopsis, and that high growth rates at high temperatures can only be reached if food sources are not limited (i.e., 25 mg C m⁻³ and 90 mg C m⁻³ of mesozooplankton and microplankton, respectively). A decrease of 5 °C can result in a considerable decrease in biomass of all stages, whereas at a temperature of 25 °C a 40% decrease in food concentrations could terminate transfer between stages. Model results demonstrate the strong role of mesozooplankton in controlling the adult ctenophore biomass capable of reproduction, and show that the different nutritional requirements of each stage can be critical for population growth. High overall population growth rates may occur only when growth conditions are favorable for both larval and lobate stages. The current model allows the flexibility to assess the effect of changing temperature and food conditions on different ctenophore stages. Without including this structure in end-to-end models it is not possible to analyze the influence of ctenophores on different trophic levels of the ecosystem.
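
    A zero-dimensional, stage-resolved model of this kind reduces to coupled balance equations, one per stage, with temperature- and food-dependent transfer between them. The sketch below mirrors the four-stage structure (egg, juvenile, transitional, adult), but every rate function and constant is a hypothetical placeholder rather than the paper's calibrated model.

        import numpy as np

        # Caricature of a stage-resolved Mnemiopsis model: biomass flows
        # egg -> juvenile -> transitional -> adult with temperature (Q10)
        # and food (Monod) dependence. All constants are illustrative.

        def growth_rate(temp_c, food_mgC, g_max=0.5, k_food=30.0, q10=2.0, t_ref=25.0):
            """Per-day growth: Q10 temperature scaling times food limitation."""
            return g_max * q10 ** ((temp_c - t_ref) / 10.0) * food_mgC / (k_food + food_mgC)

        def step(b, temp_c, food_mgC, transfer=0.2, mortality=0.05, repro=0.3, dt=1.0):
            """One Euler step (dt in days) of the four stage biomasses."""
            g = growth_rate(temp_c, food_mgC)
            new = b.copy()
            for i in range(3):                   # promotion to the next stage
                flux = transfer * g * b[i] * dt
                new[i] -= flux
                new[i + 1] += flux
            new[0] += repro * g * b[3] * dt      # adult reproduction seeds eggs
            return new * (1.0 - mortality * dt)  # uniform background mortality

        state = np.array([1.0, 0.5, 0.2, 0.1])   # arbitrary initial biomasses
        for _ in range(30):
            state = step(state, temp_c=25.0, food_mgC=25.0)

    Lowering temp_c by 5 °C or cutting food_mgC by 40% in such a toy immediately damps the stage-transfer fluxes, which is the qualitative behaviour the abstract reports.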

  5. Proposal of QoS enabled GMPLS controlled XG-PON

    DEFF Research Database (Denmark)

    Brewka, Lukasz Jerzy; Gavler, Anders; Wessing, Henrik

    2011-01-01

    End-to-end QoS provisioning is still a challenging task despite many years of research and development in this area. Considering a GMPLS-based core/metro network and RSVP-capable Home Gateways, it is the access part of the network where QoS signaling is bridged. In this paper we propose strategies

  6. Do differences in food web structure between organic and conventional farms affect the ecosystem service of pest control?

    NARCIS (Netherlands)

    Macfadyen, S.; Gibson, R.; Polaszek, A.; Morris, R.J.; Craze, P.G.; Planque, R.; Symondson, W.O.C.; Memmott, J.

    2009-01-01

    While many studies have demonstrated that organic farms support greater levels of biodiversity, it is not known whether this translates into better provision of ecosystem services. Here we use a food-web approach to analyse the community structure and function at the whole-farm scale. Quantitative

  7. Long-term ecological consequences of herbicide treatment to control the invasive grass, Spartina anglica, in an Australian saltmarsh

    Science.gov (United States)

    Shimeta, Jeff; Saint, Lynnette; Verspaandonk, Emily R.; Nugegoda, Dayanthi; Howe, Steffan

    2016-07-01

    Invasive plants acting as habitat modifiers in coastal wetlands can have extensive ecological impacts. Control of invasive plants often relies on herbicides, although little is known about subsequent environmental impacts. Studying effects of herbicides on non-target species and long-term cascading consequences may yield insights into the ecology of invasive species by revealing interactions with native species. We conducted a long-term field experiment measuring effects of treating the invasive saltmarsh grass, Spartina anglica, with the herbicide Fusilade Forte®. No changes in sedimentary macrofaunal abundances or species richness, diversity, or assemblages were detected 1-2 months after spraying, despite known toxicity of Fusilade Forte® to fauna. This lack of impact may have been due to low exposure, since the herbicide was taken up primarily by plant leaves, with the small amount that reached the sediment hydrolyzing rapidly. Six months after spraying, however, total macrofauna in treated plots was more than four times more abundant than in unsprayed control plots, due to a fifteen-fold increase in annelids. This population growth correlated with increased sedimentary organic matter in treated plots, likely due to decomposition of dead S. anglica leaves serving as food for annelids. After another year, no differences in macrofauna or organic matter remained between treatments. The indirect effect on annelid populations from herbicide treatment could benefit management efforts by providing greater food resources for wading birds, in addition to improving birds' access to sediments by reducing plant cover. This study shows that an invasive grass can have a significant impact on native fauna through food-web interactions, influenced by herbicide usage.

  8. Pancreaticojejunostomy is comparable to pancreaticogastrostomy after pancreaticoduodenectomy: an updated meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Crippa, Stefano; Cirocchi, Roberto; Randolph, Justus; Partelli, Stefano; Belfiori, Giulio; Piccioli, Alessandra; Parisi, Amilcare; Falconi, Massimo

    2016-06-01

    To perform an up-to-date meta-analysis of randomized controlled trials (RCTs) comparing pancreaticojejunostomy (PJ) and pancreaticogastrostomy (PG) in order to determine the safer anastomotic technique. Compared to existing meta-analyses, new RCTs were evaluated and subgroup analyses of different anastomotic techniques were carried out. We conducted a bibliographic search of the National Library of Medicine's PubMed database for RCTs from January 1990 to July 2015. Only RCTs, in English, that compared PG versus all types of PJ were selected. Data were independently extracted by two authors. We performed a quantitative systematic review following the guidelines of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses statement. A random-effects model was applied, and statistical heterogeneity was assessed using the I² and χ² tests. Primary outcomes were the rates of overall and clinically significant pancreatic fistula (POPF). Ten RCTs were identified, including 1629 patients, 826 undergoing PG and 803 undergoing PJ. The RCTs showed significant heterogeneity regarding definitions of POPF, perioperative management, and characteristics of the pancreatic gland. No significant differences were found in the rates of overall and clinically significant POPF, morbidity, mortality, reoperation, and intra-abdominal sepsis when PG was compared with all types of PJ, or in the subgroup analyses (double-layer PG with or without anterior gastrotomy versus duct-to-mucosa PJ, and single-layer PG versus single-layer end-to-end/end-to-side PJ). PG is not superior to PJ in the prevention of POPF. Current RCTs have major methodological limitations, with significant heterogeneity in surgical techniques, definitions of POPF/complications, and perioperative management.
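
    The heterogeneity statistics cited here are standard: Cochran's Q sums the inverse-variance-weighted squared deviations of study effects from the pooled effect, and I² = max(0, (Q - df)/Q) expresses the share of variability beyond chance. A minimal sketch with fixed-effect weights follows; the paper's random-effects pooling additionally estimates a between-study variance.

        import numpy as np

        # Cochran's Q and I^2 for a set of study effects with variances.
        # Fixed-effect (inverse-variance) weights only; a random-effects
        # model would add a between-study variance tau^2 to each study.

        def heterogeneity(effects, variances):
            effects, variances = np.asarray(effects), np.asarray(variances)
            w = 1.0 / variances
            pooled = np.sum(w * effects) / np.sum(w)
            q = np.sum(w * (effects - pooled) ** 2)
            df = len(effects) - 1
            i2 = max(0.0, (q - df) / q) if q > 0 else 0.0
            return pooled, q, 100.0 * i2  # I^2 as a percentage

        # Example with three hypothetical log odds ratios for POPF.
        print(heterogeneity([0.10, -0.25, 0.40], [0.04, 0.06, 0.05]))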

  9. Non-random food-web assembly at habitat edges increases connectivity and functional redundancy.

    Science.gov (United States)

    Peralta, Guadalupe; Frost, Carol M; Didham, Raphael K; Rand, Tatyana A; Tylianakis, Jason M

    2017-04-01

    Habitat fragmentation dramatically alters the spatial configuration of landscapes, with the creation of artificial edges affecting community structure and dynamics. Despite this, it is not known how the different food webs in adjacent habitats assemble at their boundaries. Here we demonstrate that the composition and structure of herbivore-parasitoid food webs across edges between native and plantation forests are not randomly assembled from those of the adjacent communities. Rather, elevated proportions of abundant, interaction-generalist parasitoid species at habitat edges allowed considerable interaction rewiring, which led to higher linkage density and less modular networks, with higher parasitoid functional redundancy. This was despite high overlap in host composition between edges and interiors. We also provide testable hypotheses for how food webs may assemble between habitats with lower species overlap. In an increasingly fragmented world, non-random assembly of food webs at edges may increasingly affect community dynamics at the landscape level. © 2016 by the Ecological Society of America.

  10. Life in the fast lane: fish and foodweb structure in the main channel of large rivers

    Science.gov (United States)

    Dettmers, J.M.; Wahl, David H.; Soluk, D.A.; Gutreuter, S.

    2001-01-01

    We studied the main channel of the lower Illinois River and of the Mississippi River just upstream and downstream of its confluence with the Illinois River to describe the abundance, composition, and/or seasonal appearance of components of the main-channel community. Abundance of fishes in the main channel was high, especially of adults. Most adult fishes were present in the main channel for either 3 or 4 seasons/y, indicating that fishes regularly reside in the main channel. We documented abundant zooplankton and benthic invertebrates in the main channel, and the presence of these food types in the diets of channel catfish and freshwater drum. All trophic levels were well represented in the main channel, indicating that the main channel supports a unique food web. The main channel also serves as an important energetic link with other riverine habitats (e.g., floodplains, secondary channels, backwater lakes) because of the mobility of resident fishes and the varied energy sources supplying this food web. It may be more realistic to view energy flow in large-river systems as a combination of 3 existing concepts: the river continuum concept (downstream transport), the flood pulse concept (lateral transport to the floodplain), and the riverine productivity model (autochthonous production). We urge additional research to quantify the links between the main channel and other habitat types in large rivers because of the apparent importance of main-channel processes in the overall structure and function of large-river ecosystems.

  11. EPA Townetting CTD casts - Evaluating the ecological health of Puget Sound's pelagic foodweb

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — To evaluate effects of human influence on the health of Puget Sound's pelagic ecosystems, we propose a sampling program across multiple oceanographic basins...

  12. EPA2011 Microbial & nutrient database - Evaluating the ecological health of Puget Sound's pelagic foodweb

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — To evaluate effects of human influence on the health of Puget Sound's pelagic ecosystems, we propose a sampling program across multiple oceanographic basins...

  13. Interactive effects of warming, eutrophication and size structure: impacts on biodiversity and food-web structure.

    Science.gov (United States)

    Binzer, Amrei; Guill, Christian; Rall, Björn C; Brose, Ulrich

    2016-01-01

    Warming and eutrophication are two of the most important global change stressors for natural ecosystems, but their interaction is poorly understood. We used a dynamic model of complex, size-structured food webs to assess interactive effects on diversity and network structure. We found antagonistic impacts: Warming increases diversity in eutrophic systems and decreases it in oligotrophic systems. These effects interact with the community size structure: Communities of similarly sized species such as parasitoid-host systems are stabilized by warming and destabilized by eutrophication, whereas the diversity of size-structured predator-prey networks decreases strongly with warming, but decreases only weakly with eutrophication. Nonrandom extinction risks for generalists and specialists lead to higher connectance in networks without size structure and lower connectance in size-structured communities. Overall, our results unravel interactive impacts of warming and eutrophication and suggest that size structure may serve as an important proxy for predicting the community sensitivity to these global change stressors. © 2015 John Wiley & Sons Ltd.

  14. Unpacking brown food-webs: Animal trophic identity reflects rampant microbivory.

    Science.gov (United States)

    Steffan, Shawn A; Chikaraishi, Yoshito; Dharampal, Prarthana S; Pauli, Jonathan N; Guédot, Christelle; Ohkouchi, Naohiko

    2017-05-01

    Detritivory is the dominant trophic paradigm in most terrestrial, aquatic, and marine ecosystems, yet accurate measurement of consumer trophic position within detrital (="brown") food webs has remained unresolved. Measurement of detritivore trophic position is complicated by the fact that detritus is suffused with microbes, creating a detrital complex of living and nonliving biomass. Given that microbes and metazoans are trophic analogues of each other, animals feeding on detrital complexes are ingesting other detritivores (microbes), which should elevate metazoan trophic position and should be rampant within brown food webs. We tested these hypotheses using isotopic (15N) analyses of amino acids extracted from wild and laboratory-cultured consumers. Vertebrate (fish) and invertebrate detritivores (beetles and moths) were reared on detritus, with and without microbial colonization. In the field, detritivorous animal specimens were collected and analyzed to compare trophic identities among laboratory-reared and free-roaming detritivores. When colonized by bacteria or fungi, the trophic positions of detrital complexes increased significantly over time. The magnitude of trophic inflation was mediated by the extent of microbial consumption of detrital substrates. When detrital complexes were fed to vertebrate and invertebrate animals, the consumers registered similar degrees of trophic inflation, albeit one trophic level higher than their diets. The wild-collected detritivore fauna in our study exhibited significantly elevated trophic positions. Our findings suggest that the trophic positions of detrital complexes rise predictably as microbes convert nonliving organic matter into living microbial biomass. Animals consuming such detrital complexes exhibit similar trophic inflation, directly attributable to the assimilation of microbe-derived amino acids. Our data demonstrate that detritivorous microbes elevate metazoan trophic position, suggesting that detritivory among animals is, functionally, omnivory. By quantifying the impacts of microbivory on the trophic positions of detritivorous animals and then tracking how these effects propagate "up" food chains, we reveal the degree to which microbes influence consumer groups within trophic hierarchies. The trophic inflation observed among our field-collected fauna further suggests that microbial proteins represent an immense contribution to metazoan biomass. Collectively, these findings provide an empirical basis to interpret detritivore trophic identity, and further illuminate the magnitude of microbial contributions to food webs.
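
    The amino-acid approach used here rests on a simple arithmetic relation: a "source" amino acid such as phenylalanine barely fractionates 15N with each trophic step, while a "trophic" amino acid such as glutamic acid is enriched by a roughly constant amount per step. A sketch in the commonly cited glutamic acid/phenylalanine formulation (e.g., Chikaraishi et al. 2009) follows; the constants below are widely used aquatic values, and the study's exact parameters may differ.

        # Trophic position from compound-specific amino acid delta15N:
        #   TP = (d15N_Glu - d15N_Phe - beta) / TDF + 1
        # beta and TDF are commonly cited aquatic values; terrestrial
        # systems take a different beta, and a given study may differ.

        BETA_AQUATIC = 3.4   # per mil: Glu - Phe offset in primary producers
        TDF = 7.6            # per mil: trophic discrimination per level

        def trophic_position(d15n_glu, d15n_phe, beta=BETA_AQUATIC, tdf=TDF):
            return (d15n_glu - d15n_phe - beta) / tdf + 1.0

        # Example: Glu enriched 18.6 per mil over Phe -> TP of 3.0.
        print(trophic_position(d15n_glu=25.0, d15n_phe=6.4))

    On this arithmetic, microbial colonization of detritus raises the Glu-Phe spread of the whole detrital complex, which is exactly the "trophic inflation" the study measures propagating up the food chain.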

  15. Human Impacts and Climate Change Influence Nestedness and Modularity in Food-Web and Mutualistic Networks.

    Science.gov (United States)

    Takemoto, Kazuhiro; Kajihara, Kosuke

    2016-01-01

    Theoretical studies have indicated that nestedness and modularity-non-random structural patterns of ecological networks-influence the stability of ecosystems against perturbations; as such, climate change and human activity, as well as other sources of environmental perturbations, affect the nestedness and modularity of ecological networks. However, the effects of climate change and human activities on ecological networks are poorly understood. Here, we used a spatial analysis approach to examine the effects of climate change and human activities on the structural patterns of food webs and mutualistic networks, and found that ecological network structure is globally affected by climate change and human impacts, in addition to current climate. In pollination networks, for instance, nestedness increased and modularity decreased in response to increased human impacts. Modularity in seed-dispersal networks decreased with temperature change (i.e., warming), whereas food web nestedness increased and modularity declined in response to global warming. Although our findings are preliminary owing to data-analysis limitations, they enhance our understanding of the effects of environmental change on ecological communities.

  16. Eutrophication, Nile perch and food-web interactions in south-east Lake Victoria

    NARCIS (Netherlands)

    Cornelissen, I.J.M.

    2015-01-01

    The increasing eutrophication, the introduction of Nile perch (Lates niloticus) and the increasing fishing pressure have changed Lake Victoria tremendously over the last century. Since the 1960s, eutrophication increased primary production, enabling an increase in fish production. However, eutrophication

  17. Townet database - Evaluating the ecological health of Puget Sound's pelagic foodweb

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — To evaluate effects of human influence on the health of Puget Sound's pelagic ecosystems, we propose a sampling program across multiple oceanographic basins...

  18. Human Impacts and Climate Change Influence Nestedness and Modularity in Food-Web and Mutualistic Networks.

    Directory of Open Access Journals (Sweden)

    Kazuhiro Takemoto

    Full Text Available Theoretical studies have indicated that nestedness and modularity-non-random structural patterns of ecological networks-influence the stability of ecosystems against perturbations; as such, climate change and human activity, as well as other sources of environmental perturbations, affect the nestedness and modularity of ecological networks. However, the effects of climate change and human activities on ecological networks are poorly understood. Here, we used a spatial analysis approach to examine the effects of climate change and human activities on the structural patterns of food webs and mutualistic networks, and found that ecological network structure is globally affected by climate change and human impacts, in addition to current climate. In pollination networks, for instance, nestedness increased and modularity decreased in response to increased human impacts. Modularity in seed-dispersal networks decreased with temperature change (i.e., warming), whereas food web nestedness increased and modularity declined in response to global warming. Although our findings are preliminary owing to data-analysis limitations, they enhance our understanding of the effects of environmental change on ecological communities.

  19. Food-web complexity across hydrothermal vents on the Azores triple junction

    Science.gov (United States)

    Portail, Marie; Brandily, Christophe; Cathalot, Cécile; Colaço, Ana; Gélinas, Yves; Husson, Bérengère; Sarradin, Pierre-Marie; Sarrazin, Jozée

    2018-01-01

    The assessment and comparison of food webs across various hydrothermal vent sites can enhance our understanding of the ecological processes involved in the structure and function of biodiversity. The Menez Gwen, Lucky Strike and Rainbow vent fields are located on the Azores triple junction of the Mid-Atlantic Ridge. These fields have distinct depths (from 850 to 2320 m) and geological contexts (basaltic and ultramafic), but share similar faunal assemblages defined by the presence of foundation species that include Bathymodiolus azoricus, alvinocarid shrimp and gastropods. We compared the food webs of 13 faunal assemblages at these three sites using carbon and nitrogen stable isotope analyses (SIA). Results showed that photosynthesis-derived organic matter is a negligible basal source for vent food webs at all depths. The contribution of methanotrophy versus autotrophy based on Calvin-Benson-Bassham (CBB) or reductive tricarboxylic acid (rTCA) cycles varied between and within vent fields according to the concentrations of reduced compounds (e.g. CH4, H2S). Species that were common to vent fields showed high trophic flexibility, suggesting weak trophic links to the metabolism of chemosynthetic primary producers. At the community level, a comparison of SIA-derived metrics between mussel assemblages from two vent fields (Menez Gwen and Lucky Strike) showed that the functional structure of food webs was highly similar in terms of basal niche diversification, functional specialization and redundancy. Coupling SIA with functional trait approaches introduced more variability into the analyses, but the functional structures remained highly comparable. These results suggest that despite variable environmental conditions (physico-chemical factors and basal sources) and faunal community structure, functional complexity remained relatively constant among mussel assemblages. This functional similarity may be favoured by the propensity of species to adapt to fluid variations and practise trophic flexibility. Furthermore, the different pools of species at vent fields may play similar functions in the community, such that changes in composition do not affect the overall functional structure. Finally, the absence of a relationship between functional structure and taxonomic diversity, as well as the high overlap between species' isotopic niches within communities, indicates that co-occurring species may have redundant functions; the addition of species within a functional group therefore does not necessarily lead to more complexity. Overall, this study highlights the complexity of food webs within chemosynthetic communities and emphasizes the need to better characterize species' ecological niches and biotic interactions.

  20. Primary productivity as a control over soil microbial diversity along environmental gradients in a polar desert ecosystem

    Directory of Open Access Journals (Sweden)

    Kevin M. Geyer

    2017-07-01

    Full Text Available Primary production is the fundamental source of energy to foodwebs and ecosystems, and is thus an important constraint on soil communities. This coupling is particularly evident in polar terrestrial ecosystems, where biological diversity and activity are tightly constrained by edaphic gradients of productivity (e.g., soil moisture, organic carbon availability) and geochemical severity (e.g., pH, electrical conductivity). In the McMurdo Dry Valleys of Antarctica, environmental gradients determine numerous properties of soil communities, and yet relatively few estimates of gross or net primary productivity (GPP, NPP) exist for this region. Here we describe a survey utilizing pulse amplitude modulation (PAM) fluorometry to estimate rates of GPP across a broad environmental gradient, along with belowground microbial diversity and decomposition. PAM estimates of GPP ranged from an average of 0.27 μmol O2/m2/s in the most arid soils to an average of 6.97 μmol O2/m2/s in the most productive soils, the latter equivalent to 217 g C/m2/y in annual NPP assuming a 60-day growing season. A diversity index of four carbon-acquiring enzyme activities also increased with soil productivity, suggesting that the diversity of organic substrates in mesic environments may be an additional driver of microbial diversity. Overall, soil productivity was a stronger predictor of microbial diversity and enzymatic activity than any estimate of geochemical severity. These results highlight the fundamental role of environmental gradients in controlling community diversity and the dynamics of ecosystem-scale carbon pools in arid systems.
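
    The annual figure quoted above follows from straightforward unit conversion. The arithmetic below reproduces 217 g C/m2/y from the 6.97 μmol O2/m2/s flux under the stated 60-day growing season, assuming a 12-hour photoperiod and one mole of carbon fixed per mole of O2 evolved (photosynthetic quotient of 1); these two assumptions are inferred rather than stated in the abstract.

        # 6.97 umol O2/m2/s -> ~217 g C/m2/y, assuming a 12 h photoperiod,
        # a 60-day season, and a photosynthetic quotient of 1 (assumed).

        FLUX = 6.97e-6          # mol O2 / m2 / s
        DAYLIGHT_S = 12 * 3600  # productive seconds per day (assumed)
        SEASON_DAYS = 60        # growing season from the abstract
        MOLAR_MASS_C = 12.01    # g C / mol

        daily_c = FLUX * DAYLIGHT_S * MOLAR_MASS_C  # ~3.6 g C / m2 / day
        print(round(daily_c * SEASON_DAYS))         # -> 217 g C / m2 / y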

  1. The control, monitor, and alarm system for the ICT equipment of the ASTRI SST-2M telescope prototype for the Cherenkov Telescope Array

    Science.gov (United States)

    Gianotti, Fulvio; Fioretti, Valentina; Tanci, Claudio; Conforti, Vito; Tacchini, Alessandro; Leto, Giuseppe; Gallozzi, Stefano; Bulgarelli, Andrea; Trifoglio, Massimo; Malaguti, Giuseppe; Zoli, Andrea

    2014-07-01

    ASTRI is an Italian flagship project whose first goal is the realization of an end-to-end telescope prototype, named ASTRI SST-2M, for the Cherenkov Telescope Array (CTA). The prototype will be installed in Italy during Fall 2014. A second goal will be the realization of the ASTRI/CTA mini-array, which will be composed of seven SST-2M telescopes placed at the CTA Southern Site. The Information and Communication Technology (ICT) equipment necessary to drive the infrastructure for the ASTRI SST-2M prototype is being designed as a complete and stand-alone computer center. The design goal is to obtain basic ICT equipment that might be scaled, with a low level of redundancy, for the ASTRI/CTA mini-array, taking into account the necessary control, monitor and alarm system requirements. The ICT equipment envisaged at the Serra La Nave observing station in Italy, where the ASTRI SST-2M telescope prototype will operate, includes computers, servers and workstations, network devices, an uninterruptible power supply system, and air conditioning systems. Suitable hardware and software tools will allow the parameters related to the behavior and health of each item of equipment to be controlled and monitored. This paper presents the proposed architecture and technical solutions that integrate the ICT equipment in the framework of the Observatory Control System package of the ASTRI/CTA Mini-Array Software System (MASS) to allow local and remote control and monitoring. An end-to-end test case using an Internet Protocol thermometer is reported in detail.

  2. Controller Design (Perancangan Controller)

    Directory of Open Access Journals (Sweden)

    Djunaidi Santoso

    2010-12-01

    Full Text Available Building a controller requires knowledge of the component control functions used in controller design. Within the control unit, a simple method such as microprogramming is needed: a microprogram written in binary is used to control the pins of components both inside and outside the controller. Controller design in general has several prerequisites, namely digital systems, controllers, and assembly language.

  3. Control Areas

    Data.gov (United States)

    Department of Homeland Security — This feature class represents electric power Control Areas. Control Areas, also known as Balancing Authority Areas, are controlled by Balancing Authorities, who are...

  4. Dream controller

    Science.gov (United States)

    Cheng, George Shu-Xing; Mulkey, Steven L; Wang, Qiang; Chow, Andrew J

    2013-11-26

    A method and apparatus for intelligently controlling continuous process variables. A Dream Controller comprises an Intelligent Engine mechanism and a number of Model-Free Adaptive (MFA) controllers, each of which is suitable to control a process with specific behaviors. The Intelligent Engine can automatically select the appropriate MFA controller and its parameters so that the Dream Controller can be easily used by people with limited control experience and those who do not have the time to commission, tune, and maintain automatic controllers.

  5. Controllable dose; Dosis controlable

    Energy Technology Data Exchange (ETDEWEB)

    Alvarez R, J.T.; Anaya M, R.A. [ININ, A.P. 18-1027, 11801 Mexico D.F. (Mexico)]. E-mail: jtar@nuclear.inin.mx

    2004-07-01

    To resolve the controversy over the linear no-threshold hypothesis that underlies the dose-limitation systems of the ICRP 26 and 60 recommendations, at the end of the last decade R. Clarke, president of the ICRP, proposed the concept of Controllable Dose: the dose, or sum of doses, that an individual receives from a particular source and that can reasonably be controlled by any means. The concept proposes a change in the philosophy of radiological protection from a societal concern to an individual focus. This work presents an overview of the foundations, advantages, and disadvantages that this proposal has raised in the international radiological protection community, with the aim of familiarizing the Mexican radiological protection community with these new concepts. (Author)

  6. High-performance flat data center network architecture based on scalable and flow-controlled optical switching system

    Science.gov (United States)

    Calabretta, Nicola; Miao, Wang; Dorren, Harm

    2016-03-01

    Traffic in data center networks (DCNs) is steadily growing to support various applications and virtualization technologies. Multi-tenancy enabling efficient resource utilization is considered a key requirement for next-generation DCs, resulting from the growing demand for services and applications. Virtualization mechanisms and technologies can leverage statistical multiplexing and fast switch reconfiguration to further extend DC efficiency and agility. We present a novel high-performance flat DCN employing bufferless, distributed, fast (sub-microsecond) optical switches with wavelength, space, and time switching operation. The fast optical switches can enhance the performance of DCNs by providing large-capacity switching capability and efficiently sharing the data plane resources by exploiting statistical multiplexing. Benefiting from Software-Defined Networking (SDN) control of the optical switches, virtual DCNs can be flexibly created and reconfigured by the DCN provider. Numerical and experimental investigations of the DCN based on the fast optical switches show the successful setup of virtual network slices for intra-data-center interconnections. Experimental results assessing the DCN performance in terms of latency and packet loss show less than 10^-5 packet loss and 640 ns end-to-end latency with 0.4 load and a 16-packet buffer. Numerical investigation of system performance when the optical switch is scaled to a 32x32 port count indicates that more than 1000 ToRs, each with a Terabit/s interface, can be interconnected, providing Petabit/s capacity. The roadmap to photonic integration of large-port-count optical switches will also be presented.

  7. Mosquito Control

    Science.gov (United States)

    U.S. Environmental Protection Agency page covering general information about mosquitoes and their life cycle, choosing a repellent (including DEET), and mosquito control methods, including an integrated approach to successful mosquito control.

  8. Birth Control

    Science.gov (United States)

    Birth control, also known as contraception, is designed to prevent pregnancy. Birth control methods may work in a number of different ... eggs that could be fertilized. Types include birth control pills, patches, shots, vaginal rings, and emergency contraceptive ...

  9. Controllable circuit

    DEFF Research Database (Denmark)

    2010-01-01

    A switch-mode power circuit comprises a controllable element and a control unit. The controllable element is configured to control a current in response to a control signal supplied to the controllable element. The control unit is connected to the controllable element and provides the control signal. The control unit comprises a first signal processing unit, a second signal processing unit, and a combiner unit. The first signal processing unit has an output and is supplied with a first carrier signal and an input signal. The second signal processing unit has an output and is supplied with a second carrier signal and the input signal. The combiner unit is connected to the first and second signal processing units, combining the outputs of the first and the second signal processing units to form a signal representative of the control signal.

  10. Field Testing of Telemetry for Demand Response Control of Small Loads

    Energy Technology Data Exchange (ETDEWEB)

    Lanzisera, Steven [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Weber, Adam [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Liao, Anna [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Schetrit, Oren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kiliccote, Sila [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Piette, Mary Ann [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2018-01-30

    The electricity system in California, from generation through loads, must be prepared for high renewable penetration and increased electrification of end uses while providing increased resilience and lower operating cost. California has an aggressive renewable portfolio standard that is complemented by world-leading greenhouse gas goals. The goal of this project was to evaluate methods of enabling fast demand response (DR) signaling to small loads with low-cost site enablement. We used OpenADR 2.0 to meet telemetry requirements for providing ancillary services, and we used a variety of low-cost devices coupled with open-source software to enable end-to-end fast DR. The devices, architecture, implementation, and testing of the system are discussed in this report. We demonstrate that the emerging Internet of Things (IoT) and Smart Home movements provide an opportunity for diverse small loads to provide fast, low-cost demand response. We used Internet-connected lights, thermostats, load interruption devices, and water heaters to demonstrate an ecosystem of controllable devices. The system demonstrated is capable of providing fast load shed for between $20 and $300 per kilowatt (kW) of available load. The wide range arises because some loads have very low cost but also very little shed capability (a 10 watt [W] LED light can only shed a maximum of 10 W), while other loads (e.g., water heaters or air conditioners) can shed several kilowatts but have a higher initial cost. These costs compare well with other fast demand response costs, which are typically over $100 per kW of shed. We contend these loads are even more attractive than their price suggests because many of them will be installed for energy efficiency or non-energy benefits (e.g., improved lighting quality or controllability), and the ability to use them for fast DR is a secondary benefit. Therefore the cost of enabling them for DR may approach zero if a software-only solution can be
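
    The $/kW spread follows directly from dividing enablement cost by sheddable load, which is why a cheap-but-tiny load and an expensive-but-large one can land at opposite ends of the range. An illustrative calculation with hypothetical prices:

        # Cost-effectiveness of DR enablement = device cost / sheddable kW.
        # Prices and shed sizes below are hypothetical illustrations of the
        # report's roughly $20-$300/kW range, not measured project costs.

        devices = {
            # name: (enablement cost in $, sheddable load in kW)
            "smart LED bulb":       (5.0, 0.010),  # only 10 W to shed
            "connected thermostat": (200.0, 1.0),  # assumed 1 kW AC shed
            "water heater switch":  (100.0, 4.5),  # assumed 4.5 kW element
        }

        for name, (cost, shed_kw) in devices.items():
            print(f"{name}: ${cost / shed_kw:,.0f} per kW of shed")
        # The bulb alone lands at $500/kW; the water heater near $22/kW.
        # Costs approach zero when the device is installed anyway for
        # efficiency or comfort and DR is only a secondary benefit.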

  11. Neurofuzzy Control

    DEFF Research Database (Denmark)

    Jantzen, Jan

    1997-01-01

    These notes are for a course in fuzzy control and neural networks. By neural networks we more precisely mean neurofuzzy systems rather than pure neural network theory. The notes are an extension to the existing notes on fuzzy control (Jantzen, Fuzzy Control, 1994).

  12. Helicopter controllability

    OpenAIRE

    Carico, Dean

    1989-01-01

    Approved for public release; distribution is unlimited The concept of helicopter controllability is explained. A background study reviews helicopter development in the U.S. General helicopter configurations, linearized equations of motion, stability, and piloting requirements are discussed. Helicopter flight controls, handling qualities, and associated specification are reviewed. Analytical, simulation and flight test methods for evaluating helicopter automatic flight control systems ar...

  13. Gaining control

    NARCIS (Netherlands)

    Enden, van der E.; Laan, van der R.

    2008-01-01

    The article reports on the efforts of companies to find a solution for tax risk management, tax accounting and being in control. In trying to find a solution, companies work towards an integrated tax control framework (TCF), a tax risk management and control environment embedded in the internal

  14. Optimal control

    CERN Document Server

    Aschepkov, Leonid T; Kim, Taekyun; Agarwal, Ravi P

    2016-01-01

    This book is based on lectures from a one-year course at the Far Eastern Federal University (Vladivostok, Russia) as well as on workshops on optimal control offered to students at various mathematical departments at the university level. The main themes of the theory of linear and nonlinear systems are considered, including the basic problem of establishing the necessary and sufficient conditions of optimal processes. In the first part of the course, the theory of linear control systems is constructed on the basis of the separation theorem and the concept of a reachability set. The authors prove the closure of a reachability set in the class of piecewise continuous controls, and the problems of controllability, observability, identification, performance and terminal control are also considered. The second part of the course is devoted to nonlinear control systems. Using the method of variations and the Lagrange multipliers rule of nonlinear problems, the authors prove the Pontryagin maximum principle for prob...

  15. ORGANIZATIONAL CONTROL (CONTROL ORGANIZACIONAL)

    OpenAIRE

    Granda Carazas, Segundo Eloy; Facultad de Ciencias Contables, Universidad Nacional Mayor de San Marcos

    2014-01-01

    The concept of control can be very general and can serve as a focal point for the administrative system. For example, planning can be seen as a means of achieving control over individual or organizational behavior. Similarly, the task of organizing can be structured so as to provide a means for controlling activities.

  16. Smart building temperature control using occupant feedback

    Science.gov (United States)

    Gupta, Santosh K.

    structure necessary for truthful comfort feedback from the occupants. Finally, we present an end-to-end framework designed for enabling occupant feedback collection and incorporating the feedback data towards energy-efficient operation of a building. We have designed a mobile application that occupants can use on their smart phones to provide their thermal preference feedback. When relaying the occupant feedback to the central server, the mobile application also uses indoor localization techniques to tie the occupant preference to their current thermal zone. Texas Instruments SensorTags are used for real-time zonal temperature readings. The mobile application relays the occupant preference, along with the location, to a central server that also hosts our learning algorithm, which learns the environment and uses occupant feedback to calculate the optimal temperature set point. The entire process is triggered upon change of occupancy, environmental conditions, and/or occupant preference. The learning algorithm is scheduled to run at regular intervals to respond dynamically to environmental and occupancy changes. We describe results from experimental studies in two different settings: a single-family residential home and a university laboratory space. (Abstract shortened by UMI.).
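
    The feedback-to-setpoint step can be reduced to a simple aggregation rule; the sketch below nudges a zone's setpoint toward the majority preference within a bounded comfort band. The dissertation's learning algorithm is more sophisticated, and every name and constant here is a hypothetical stand-in.

        # Toy zonal setpoint update from occupant votes: +1 = "warmer",
        # -1 = "cooler". Step size and comfort band are assumed values.

        STEP_C = 0.5                # degrees C moved per update (assumed)
        BAND_C = (20.0, 26.0)       # allowed comfort band (assumed)

        def update_setpoint(current_c, votes):
            net = sum(votes)
            direction = 1 if net > 0 else -1 if net < 0 else 0
            proposed = current_c + STEP_C * direction
            return min(max(proposed, BAND_C[0]), BAND_C[1])

        print(update_setpoint(22.0, [+1, +1, -1]))  # -> 22.5 (majority warmer)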

  17. Birth Control

    Science.gov (United States)

    Covers talking to your parents about sex, deciding about sex, and types of birth control. Some young people are afraid their parents will find out they're having sex. If you get birth control from a doctor, ...

  18. Experiencing control

    NARCIS (Netherlands)

    Monaci, G.; Braspenning, R.A.C.; Meerbeek, B.W.; Bingley, P.; Rajagopalan, R.; Triki, M.

    2009-01-01

    This report describes the activities carried out in the first part of the Experiencing Control project (2008-324). The guiding idea of the project is to make control part of the experience, exploring new interaction solutions for complex, engaging interactions with Philips devices in the living

  19. Taking Control

    Centers for Disease Control (CDC) Podcasts

    2007-11-01

    This podcast gives action steps and reasons to control diabetes.  Created: 11/1/2007 by National Diabetes Education Program (NDEP), a joint program of the Centers for Disease Control and Prevention and the National Institutes of Health.   Date Released: 11/2/2007.

  20. Detonation control

    Science.gov (United States)

    Mace, Jonathan L.; Seitz, Gerald J.; Bronisz, Lawrence E.

    2016-10-25

    Detonation control modules and detonation control circuits are provided herein. A trigger input signal can cause a detonation control module to trigger a detonator. A detonation control module can include a timing circuit, a light-producing diode such as a laser diode, an optically triggered diode, and a high-voltage capacitor. The trigger input signal can activate the timing circuit. The timing circuit can control activation of the light-producing diode. Activation of the light-producing diode illuminates and activates the optically triggered diode. The optically triggered diode can be coupled between the high-voltage capacitor and the detonator. Activation of the optically triggered diode causes a power pulse to be released from the high-voltage capacitor that triggers the detonator.

  1. Control theory

    CERN Document Server

    Simrock, S

    2008-01-01

    In engineering and mathematics, control theory deals with the behaviour of dynamical systems. The desired output of a system is called the reference. When one or more output variables of a system need to follow a certain reference over time, a controller manipulates the inputs to the system to obtain the desired effect on its output. Rapid advances in digital system technology have radically altered the control design options. It has become routinely practicable to design very complicated digital controllers and to carry out the extensive calculations required for their design. These advances in implementation and design capability can be obtained at low cost because of the widespread availability of inexpensive and powerful digital processing platforms and high-speed analog I/O devices.
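
    The reference-tracking loop described here is conventionally closed with a feedback controller, and the discrete PID controller is the canonical digital example. A generic sketch follows, not tied to any particular plant in the text; the gains and the toy first-order plant are illustrative assumptions.

        # Discrete PID loop on a toy first-order plant y' = (u - y) / tau.
        # Gains and plant are generic illustrations, not a tuned design.

        def make_pid(kp, ki, kd, dt):
            state = {"integral": 0.0, "prev_err": 0.0}
            def pid(reference, measurement):
                err = reference - measurement
                state["integral"] += err * dt
                deriv = (err - state["prev_err"]) / dt
                state["prev_err"] = err
                return kp * err + ki * state["integral"] + kd * deriv
            return pid

        pid, y, dt, tau = make_pid(2.0, 0.5, 0.05, 0.01), 0.0, 0.01, 0.5
        for _ in range(2000):
            u = pid(reference=1.0, measurement=y)  # controller output
            y += dt * (u - y) / tau                # plant integration step
        print(round(y, 3))  # settles near the reference value 1.0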

  2. Institutional Controls

    Data.gov (United States)

    U.S. Environmental Protection Agency — This dataset consists of institutional control data from multiple Superfund sites in U.S. EPA Region 8. These data were acquired from multiple sources at different...

  3. Losing control

    DEFF Research Database (Denmark)

    Leppink, Eric; Odlaug, Brian Lawrence; Lust, Katherine

    2014-01-01

    OBJECTIVE: Assaultive behaviors are common among young people and have been associated with a range of other unhealthy, impulsive behaviors such as substance use and problem gambling. This study sought to determine the predictive ability of single assaultive incidents for impulse control disorders. Compared with respondents without lifetime assaultive behavior, those with a history of assaultive or destructive behavior reported more depressive symptoms, more stress, and higher rates of a range of impulse control disorders (intermittent explosive disorder, compulsive sexual behavior, compulsive buying, and skin picking disorder). CONCLUSIONS: Assaultive behavior appears fairly common among college students and is associated with symptoms of depression and impulse control disorders. Significant distress and diminished behavioral control suggest that assaultive behaviors may often be associated with significant

  4. Depth gradients in food-web processes linking habitats in large lakes: Lake Superior as an exemplar ecosystem

    Science.gov (United States)

    Sierszen, Michael E.; Hrabik, Thomas R.; Stockwell, Jason D.; Cotter, Anne M; Hoffman, Joel C.; Yule, Daniel L.

    2014-01-01

    In large lakes around the world, depth-based changes in the abundance and distribution of invertebrate and fish species suggest that there may be concomitant changes in patterns of resource allocation. Using Lake Superior of the Laurentian Great Lakes as an example, we explored this idea through stable isotope analyses of 13 major fish taxa.

  5. The structure of mammalian food-webs: interpreting, predicting, and informing estimates of species interactions in paleontological and modern communities

    OpenAIRE

    Yeakel, Justin Douglas

    2012-01-01

    Patterns of species interactions are both the cause and consequence of ecosystem dynamics. Understanding the origin and function of specific interaction patterns at the ecosystem scale has been a long-term goal in ecology. These efforts are often limited by the enormous size of biological systems, the temporal transience of ecological interactions, difficulties in obtaining reliable measurements [3], knowing what is important to measure in the first place, and the time-scale over which ob...

  6. Jellyfish decomposition at the seafloor rapidly alters biogeochemical cycling and carbon flow through benthic food-webs

    NARCIS (Netherlands)

    Sweetman, A.K.; Chelsky, A.; Pitt, K.A.; Andrade, H.; Van Oevelen, D.; Renaud, P.E.

    2016-01-01

    Jellyfish blooms have increased in magnitude in several locations around the world, including in fjords. While the factors that promote jellyfish blooms and the impacts of live blooms on marine ecosystems are often investigated, the post-bloom effects from the sinking and accumulation of dead

  7. Radioactive cesium. Dynamics and transport in forestal food-webs; Radioaktivt cesium. Dynamik och transport i skogliga naeringsvaevar

    Energy Technology Data Exchange (ETDEWEB)

    Palo, T.; Nelin, P. [Swedish Univ. of Agricultural Sciences, Umeaa (Sweden). Dept. of Animal Ecology; Bergman, R.; Nylen, T. [FOA NBC Defence, Umeaa (Sweden)

    1995-12-01

    This report summarises results from a radioecological study during 1994-1995 concerning turnover, redistribution and loss of radioactive Cesium (134 and 137) in boreal forest ecosystems, as well as uptake and transfer in important food-chains over moose, vole and vegetation. The basis for this report is 9 publications published in 1994-95. These reports are presented in summary form. 9 refs, 17 figs.

  8. Importance of the aquatic detritus foodweb in the transfer of radionuclides from water to higher trophic levels

    Energy Technology Data Exchange (ETDEWEB)

    Bird, G.A., E-mail: gbird@ecometrix.ca [EcoMetrix Inc., Mississauga, Ontario (Canada)

    2012-07-01

    Uptake of radionuclides from water to decaying elm-leaf discs and their transfer to the amphipod Gammarus lacustris was investigated. Uptake by decomposing elm leaves is microbially mediated. Uptake coefficients, k, for G. lacustris fed contaminated leaf discs were 3.66 d{sup -1} for {sup 60}Co, 4.34 d{sup -1} for {sup 131}I and 9.74 d{sup -1} for {sup 134}Cs, whereas loss rates were -1.08 d{sup -1} for {sup 60}Co, -1.07 d{sup -1} for {sup 131}I and -1.04 d{sup -1} for {sup 134}Cs. Assimilation efficiencies were 11% for leaf material, 53% for {sup 60}Co, 55% for {sup 131}I and 75% for {sup 134}Cs. These results indicate that there is considerable potential for the transfer of radionuclides to higher trophic levels by the detritus-food web. (author)

  9. Jellyfish decomposition at the seafloor rapidly alters biogeochemical cycling and carbon flow through benthic food-webs

    OpenAIRE

    A. K. Sweetman; Chelsky, A.; Pitt, K.A.; H. Andrade; van Oevelen, D.; Renaud, P.E.

    2016-01-01

    Jellyfish blooms have increased in magnitude in several locations around the world, including in fjords. While the factors that promote jellyfish blooms and the impacts of live blooms on marine ecosystems are often investigated, the post-bloom effects from the sinking and accumulation of dead jellyfish at the seafloor remain poorly known. Here, we quantified the effect of jellyfish deposition on short-term benthic carbon cycling dynamics in benthic cores taken from a cold and deep fjord environme...

  10. Historic Habitat Opportunities and Food-Web Linkages of Juvenile Salmon in the Columbia River Estuary, Annual Report of Research.

    Energy Technology Data Exchange (ETDEWEB)

    Bottom, Daniel L.; Simenstad, Charles A.; Campbell, Lance [Northwest Fisheries Science Center

    2009-05-15

    In 2002 with support from the U.S. Army Corps of Engineers (USACE), an interagency research team began investigating salmon life histories and habitat use in the lower Columbia River estuary to fill significant data gaps about the estuary's potential role in salmon decline and recovery. The Bonneville Power Administration (BPA) provided additional funding in 2004 to reconstruct historical changes in estuarine habitat opportunities and food web linkages of Columbia River salmon (Oncorhynchus spp.). Together these studies constitute the estuary's first comprehensive investigation of shallow-water habitats, including selected emergent, forested, and scrub-shrub wetlands. Among other findings, this research documented the importance of wetlands as nursery areas for juvenile salmon; quantified historical changes in the amounts and distributions of diverse habitat types in the lower estuary; documented estuarine residence times, ranging from weeks to months for many juvenile Chinook salmon (O. tshawytscha); and provided new evidence that contemporary salmonid food webs are supported disproportionately by wetland-derived prey resources. The results of these lower-estuary investigations also raised many new questions about habitat functions, historical habitat distributions, and salmon life histories in other areas of the Columbia River estuary that have not been adequately investigated. For example, quantitative estimates of historical habitat changes are available only for the lower 75 km of the estuary, although tidal influence extends 217 km upriver to Bonneville Dam. Because the otolith techniques used to reconstruct salmon life histories rely on detection of a chemical signature (strontium) for salt water, the estuarine residency information we have collected to date applies only to the lower 30 or 35 km of the estuary, where fish first encounter ocean water. We lack information about salmon habitat use, life histories, and growth within the long tidal-fresh reaches of the main-stem river and many tidally-influenced estuary tributaries. Finally, our surveys to date characterize wetland habitats within island complexes distributed in the main channel of the lower estuary. Yet some of the most significant wetland losses have occurred along the estuary's periphery, including shoreline areas and tributary junctions. These habitats may or may not function similarly to the island complexes that we have surveyed to date. In 2007 we initiated a second phase of the BPA estuary study (Phase II) to address specific uncertainties about salmon in tidal-fresh and tributary habitats of the Columbia River estuary. This report summarizes 2007 and 2008 Phase II results and addresses three principal research questions: (1) What was the historic distribution of estuarine and floodplain habitats from Astoria to Bonneville Dam? (2) Do individual patterns of estuarine residency and growth of juvenile Chinook salmon vary among wetland habitat types along the estuarine tidal gradient? (3) Are salmon rearing opportunities and life histories in the restoring wetland landscape of lower Grays River similar to those documented for island complexes of the main-stem estuary? Phase II extended our analysis of historical habitat distribution in the estuary above Rkm 75 to near Bonneville Dam. For this analysis we digitized the original nineteenth-century topographic (T-sheets) and hydrographic (H-sheets) survey maps for the entire estuary.
Although all T-sheets (Rkm 0 to Rkm 206) were converted to GIS in 2005 with support from the USACE estuary project, final reconstruction of historical habitats throughout the estuary requires completion of the remaining H-sheet GIS maps above Rkm 75 and their integration with the T-sheets. This report summarizes progress to date on compiling the upper estuary H-sheets above Rkm 75. For the USACE estuary project, we analyzed otoliths from Chinook salmon collected near the estuary mouth in 2003-05 to estimate variability in estuary residence times among juvenile outmigrants. In Phase II we expanded these analyses to compare growth and residency among individuals collected in tidal-fresh water wetlands of the lower main-stem estuary. Although no known otolith structural or chemical indicators currently exist to define entry into tidal-fresh environments, our previous analyses indicate that otolith barium concentrations frequently increase before individuals encounter salt water. Here we evaluate whether otolith barium levels may provide a valid indicator of tidal-fresh water entry by Columbia River Chinook salmon. We also examine otolith growth increments to quantify and compare recent (i.e., the previous 30 d) growth rates among individuals sampled in different wetland habitats along the estuarine tidal gradient.

  11. Feedbacks between protistan single-cell activity and bacterial physiological structure reinforce the predator/prey link in microbial foodwebs

    Directory of Open Access Journals (Sweden)

    Eva Sintes

    2014-09-01

    Full Text Available The trophic interactions between bacteria and their main predators, the heterotrophic nanoflagellates (HNF), play a key role in the structuring and functioning of aquatic microbial food webs. Grazing regulation of bacterial communities, both of biomass and community structure, has been frequently reported. Additionally, bottom-up responses of the HNF at the population level (numerical responses) have also been extensively described. However, the functional response of HNF at the single-cell level has not been well explored. In this study, we concurrently measured the physiological structure of bacterial communities and HNF single-cell activities during re-growth cultures of natural aquatic communities. We found that changes in the abundance and proportion of the preferred, highly-active bacterial prey, caused by the feeding activity of their predators (HNF), induced a negative feedback effect on the single-cell activity of these HNF. These shifts in the specific cellular activity of HNF occur at a much shorter time scale than population-level shifts in flagellate abundance, and offer a complementary mechanism to explain not only the tight coupling between bacteria and HNF, but also the relative constancy of bacterial abundance in aquatic ecosystems.
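
    The feedback loop described here can be caricatured with a predator-prey model in which grazing depletes the highly active prey fraction, which in turn suppresses the grazers' per-cell activity (a toy sketch with invented parameters, not the study's data):

        # Toy predator-prey sketch: HNF grazing depletes the active bacterial
        # fraction, which feeds back negatively on HNF per-cell activity.
        def step(bacteria_active, hnf, dt=0.01):
            grazing = 0.5 * bacteria_active * hnf                  # predation on active cells
            activity = bacteria_active / (bacteria_active + 0.2)   # saturating activity feedback
            d_bact = 1.0 * bacteria_active * (1 - bacteria_active) - grazing
            d_hnf = 0.8 * grazing * activity - 0.3 * hnf           # growth scaled by activity
            return bacteria_active + dt * d_bact, hnf + dt * d_hnf

        b, h = 0.5, 0.1
        for _ in range(5000):
            b, h = step(b, h)
        print(round(b, 3), round(h, 3))   # settles toward a coupled equilibrium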

  12. CYCLE CONTROL

    African Journals Online (AJOL)

    changed to gestodene. Although large-scale comparative trials are needed to confirm this finding, evidence suggests that cycle control with gestodene is better than for monophasic preparations containing desogestrel, norgestimate or levonorgestrel,10 as well as for levonorgestrel- or norethisterone-containing triphasics.

  13. Associational control

    DEFF Research Database (Denmark)

    Hvid, Helge Søndergaard; Lund, Henrik Lambrecht; Grosen, Sidsel Lond

    2010-01-01

    significance in the modern workplace which is simultaneously characterized by self-management and standardization. It is concluded that the concept of control remains important, but needs to evolve from its focus on the work of individuals to a focus on the associational aspects of work if it is to retain its...

  14. Control Cognitivo

    Directory of Open Access Journals (Sweden)

    María Pavón

    2013-12-01

    Full Text Available The responses obtained made it possible to elucidate both the various situations in which people try to cognitively control their emotions and the strategies they use to do so. From a content analysis centred on the responses, two classification systems emerged: the first refers to the reasons that lead a person to cognitively control his or her emotions and contains the following dimensions: valence of the emotion, intensity of the emotion, type of relationship, situation or context, and degree of symmetry. The second classification system refers to the control mechanisms used by the people interviewed, and comprises four dimensions: distraction, avoidance, reflection and emotional suppression. The results indicate both similarities and differences between the groups. Regarding the differences, the most relevant were found between the group of people aged 20 to 30 and those aged 80 to 90. It was found that, over the years, people come to place emphasis on the value of experience in coping with and controlling different situations.

  15. Effects of damage control surgery on systemic inflammatory response and early mortality in a pig model of severe abdominal gunshot injury

    Directory of Open Access Journals (Sweden)

    You-sheng LI

    2011-05-01

    Full Text Available Objective: To compare the difference in systemic inflammatory response and early mortality between damage control surgery (DCS) and traditional surgery in a pig model of severe abdominal gunshot injury. Methods: Thirty-two female domestic outbreeding pigs with multiple bowel injuries and exsanguination followed by the lethal triad (low body temperature, acidosis and coagulopathy) were randomly divided into two groups (16 each): a conventional surgery (CS) group and a DCS group. Laparotomy was performed after transfusion of an appropriate amount of autologous blood and fluid infusion; pigs in the CS group then underwent end-to-end intestinal anastomosis, peritoneal lavage and definitive abdominal closure, while in the DCS group the pigs received clamping of intestinal lumens, ligation of bleeding vessels, peritoneal lavage, and temporary abdominal closure. The operation time, blood loss and volume of fluid infusion were recorded. The hemodynamic parameters, arterial blood gases and coagulation parameters were examined. Six pigs of each group were sacrificed 6 hours after operation, and myocardium, lung, small intestine and liver were harvested for histological observation. The remaining 10 pigs of each group were used for assessment of the 24h survival rate. Results: No significant difference was detected in blood loss or mean arterial pressure (MAP) between the two groups. Compared with the CS group, the operation time in the DCS group was shorter (1.1±0.2h vs 2.4±0.3h, P < 0.01), and the amount of infusion was less (3204±254ml vs 3756±313ml, P < 0.05). Pigs in the DCS group showed higher urinary output, lower heart rate, and a lower degree of acidosis and coagulopathy. The tissue injuries and neutrophil infiltration were milder in the DCS group than in the CS group. Although the 24h survival rate was higher in the DCS group (80%) than in the CS group (50%), the difference was not statistically significant (P=0.175). Conclusion: Damage control surgery, compared with conventional surgery, may shorten the operation time, reduce fluid

  16. COPD - control drugs

    Science.gov (United States)

    Chronic obstructive pulmonary disease - control drugs; Bronchodilators - COPD - control drugs; Beta agonist inhaler - COPD - control drugs; Anticholinergic inhaler - COPD - control drugs; Long-acting inhaler - COPD - control drugs; Corticosteroid inhaler - COPD - control ...

  17. Control Prenatal

    Directory of Open Access Journals (Sweden)

    P. Susana Aguilera, DRA.

    2014-11-01

    Full Text Available The main objectives of prenatal care are to identify patients at higher risk, in order to intervene in a timely manner to prevent those risks and thus achieve a good perinatal outcome. This is accomplished through the woman's medical and reproductive history, physical examination, selected laboratory tests and ultrasound examinations. It is also important to promote healthy lifestyles, folic acid supplementation, nutritional counselling and related education.

  18. Robotic Automation in Computer Controlled Polishing

    Science.gov (United States)

    Walker, D. D.; Yu, G.; Bibby, M.; Dunn, C.; Li, H.; Wu, Y.; Zheng, X.; Zhang, P.

    2016-02-01

    We first present a Case Study - the manufacture of 1.4 m prototype mirror-segments for the European Extremely Large Telescope, undertaken by the National Facility for Ultra Precision Surfaces, at the OpTIC facility operated by Glyndwr University. Scale-up to serial-manufacture demands delivery of 1.4 m off-axis aspheric hexagonal segments to stringent surface precision. We then describe three development topics. The first is the use of robots to automate currently-manual operations on CNC polishing machines, to improve work-throughput, mitigate risk of damage to parts, and reduce dependence on highly-skilled staff. Second is the use of robots to pre-process surfaces prior to CNC polishing, to reduce total process time. The third draws the threads together, describing our vision of the automated manufacturing cell, where the operator interacts at cell rather than machine level. This promises to deliver a step-change in end-to-end manufacturing times and costs, compared with either platform used on its own or, indeed, the state-of-the-art used elsewhere.

  19. Custom controls

    Science.gov (United States)

    Butell, Bart

    1996-02-01

    Microsoft's Visual Basic (VB) and Borland's Delphi provide an extremely robust programming environment for delivering multimedia solutions for interactive kiosks, games and titles. Their object oriented use of standard and custom controls enable a user to build extremely powerful applications. A multipurpose, database enabled programming environment that can provide an event driven interface functions as a multimedia kernel. This kernel can provide a variety of authoring solutions (e.g. a timeline based model similar to Macromedia Director or a node authoring model similar to Icon Author). At the heart of the kernel is a set of low level multimedia components providing object oriented interfaces for graphics, audio, video and imaging. Data preparation tools (e.g., layout, palette and Sprite Editors) could be built to manage the media database. The flexible interface for VB allows the construction of an infinite number of user models. The proliferation of these models within a popular, easy to use environment will allow the vast developer segment of 'producer' types to bring their ideas to the market. This is the key to building exciting, content rich multimedia solutions. Microsoft's VB and Borland's Delphi environments combined with multimedia components enable these possibilities.

  20. Noise-Assisted Concurrent Multipath Traffic Distribution in Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    Narun Asvarujanon

    2013-01-01

    Full Text Available The concept of biologically inspired networking has been introduced to tackle unpredictable and unstable situations in computer networks, especially in wireless ad hoc networks where network conditions are continuously changing, resulting in the need of robustness and adaptability of control methods. Unfortunately, existing methods often rely heavily on the detailed knowledge of each network component and the preconfigured, that is, fine-tuned, parameters. In this paper, we utilize a new concept, called attractor perturbation (AP, which enables controlling the network performance using only end-to-end information. Based on AP, we propose a concurrent multipath traffic distribution method, which aims at lowering the average end-to-end delay by only adjusting the transmission rate on each path. We demonstrate through simulations that, by utilizing the attractor perturbation relationship, the proposed method achieves a lower average end-to-end delay compared to other methods which do not take fluctuations into account.
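
    A rough sketch of controlling traffic with end-to-end information only: shift send rate toward paths whose measured end-to-end delay is below the rate-weighted average. This illustrates the idea, not the authors' attractor-perturbation formulation; all values are invented:

        # Redistribute send rates across paths using only measured end-to-end
        # delays: paths faster than average gain rate, slower ones lose it.
        def redistribute(rates, delays, gain=0.1):
            avg = sum(d * r for d, r in zip(delays, rates)) / sum(rates)
            new = [r * (1 + gain * (avg - d) / avg) for r, d in zip(rates, delays)]
            scale = sum(rates) / sum(new)        # keep aggregate rate constant
            return [r * scale for r in new]

        rates = [1.0, 1.0, 1.0]
        delays = [30.0, 50.0, 100.0]             # ms, measured end to end
        for _ in range(20):
            rates = redistribute(rates, delays)
        print([round(r, 2) for r in rates])      # traffic drifts to faster paths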

  1. Hierarchical Control of Thermostatically Controlled Loads for Primary Frequency Control

    DEFF Research Database (Denmark)

    Zhao, Haoran; Wu, Qiuwei; Huang, Shaojun

    2016-01-01

    This paper proposes a hierarchical control of Thermostatically Controlled Loads (TCLs) to provide primary frequency control support. The control architecture is comprised of three levels. At the high level, an aggregator coordinates multiple distribution substations and dispatches the primary...... reserve references. At the middle level, distribution substations estimate the available power of TCLs based on the aggregated bin model, and dispatch control signals to individual TCLs. At the local level, a supplementary frequency control loop is implemented at the local controller, which makes TCLs...
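
    The local supplementary loop can be pictured as a frequency-deviation override on the thermostat's hysteresis; a toy sketch for a heating-type TCL (set points, deadband and thresholds are invented for illustration, not taken from the paper):

        # Local supplementary frequency control for one TCL: override the
        # thermostat band when grid frequency deviates (illustrative values).
        def tcl_on(temperature, freq_hz, on_state,
                   set_point=20.0, deadband=0.5, f_nominal=50.0, f_threshold=0.1):
            if freq_hz < f_nominal - f_threshold:
                return False                  # under-frequency: shed load
            if freq_hz > f_nominal + f_threshold:
                return True                   # over-frequency: absorb power
            if temperature < set_point - deadband:
                return True                   # normal thermostat hysteresis
            if temperature > set_point + deadband:
                return False
            return on_state                   # inside the deadband: keep state

        print(tcl_on(20.0, 49.85, True))      # -> False, primary reserve delivered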

  2. Adaptive Vehicle Traction Control

    OpenAIRE

    Lee, Hyeongcheol; Tomizuka, Masayoshi

    1995-01-01

    This report presents two different control algorithms for adaptive vehicle traction control, which includes wheel slip control, optimal time control, anti-spin acceleration and anti-skid control, and longitudinal platoon control. The two control algorithms are respectively based on adaptive fuzzy logic control and sliding mode control with on-line road condition estimation. Simulations of the two control methods are conducted using a complex nonlinear vehicle model as well as a simple linear ...

  3. Controlling Separation in Turbomachines

    Science.gov (United States)

    Evans, Simon; Himmel, Christoph; Power, Bronwyn; Wakelam, Christian; Xu, Liping; Hynes, Tom; Hodson, Howard

    2010-01-01

    Four examples of flow control: 1) Passive control of LP turbine blades (Laminar separation control). 2) Aspiration of a conventional axial compressor blade (Turbulent separation control). 3) Compressor blade designed for aspiration (Turbulent separation control). 4) Control of intakes in crosswinds (Turbulent separation control).

  4. thesis that carnivorous fish

    African Journals Online (AJOL)

    spamer

    The work of Iverson (1990) supported the hypothesis that carnivorous fish (and squid) production is controlled by the quantity of new nitrogen (NO3-N) incorporated annually into phytoplankton biomass (annual new production) and transferred through marine foodwebs. This hypothesis is tested here in the context of the ...

  5. Ecotoxicological effects on zooplankton-phytoplankton interactions in eutrophied waters

    NARCIS (Netherlands)

    Scholten, M.C.T.

    2004-01-01

    This thesis focuses on the cascade effects of reduced grazing of algae by daphnids in eutrophic waters due to toxic stress. Daphnids (and other zoöplankton) play a critical role in the pelagic foodweb by balancing the top-down control over the lower trophic levels (their food source, i.e. algae) and

  6. Identity and access management business performance through connected intelligence

    CERN Document Server

    Osmanoglu, Ertem

    2013-01-01

    Identity and Access Management: Controlling Your Network provides you with a practical, in-depth walkthrough of how to plan, assess, design, and deploy IAM solutions. This book breaks down IAM into manageable components to ease systemwide implementation. The hands-on, end-to-end approach includes a proven step-by-step method for deploying IAM that has been used successfully in over 200 deployments. The book also provides reusable templates and source code examples in Java, XML, and SPML. Focuses on real-world implementations. Provides end-to-end coverage of IAM from business drivers, requirements

  7. Contact Control, Version 1

    Energy Technology Data Exchange (ETDEWEB)

    2016-07-21

    The contact control code is a generalized force control scheme meant to interface with a robotic arm being controlled using the Robot Operating System (ROS). The code allows the user to specify a control scheme for each control dimension in a way that many different control task controllers could be built from the same generalized controller. The input to the code includes maximum velocity, maximum force, maximum displacement, and a control law assigned to each direction and the output is a 6 degree of freedom velocity command that is sent to the robot controller.
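
    The per-dimension scheme might look like the following sketch, where each of the six axes gets its own control law and limits and the outputs are assembled into a 6-DOF velocity command (the names and the simple force law are assumptions, not the released code):

        # Per-axis generalized force controller: each of the 6 dimensions has its
        # own law and limits; the result is a 6-DOF velocity (twist) command.
        def axis_velocity(force, law):
            if law["type"] == "force":        # drive measured force toward a target
                v = law["gain"] * (law["target_force"] - force)
            else:                             # "free": no force regulation on this axis
                v = law.get("velocity", 0.0)
            return max(-law["max_velocity"], min(law["max_velocity"], v))

        laws = [
            {"type": "force", "gain": 0.002, "target_force": 5.0, "max_velocity": 0.05},  # x
            {"type": "free", "velocity": 0.01, "max_velocity": 0.05},                     # y
        ] + [{"type": "free", "max_velocity": 0.05}] * 4                                  # z, rx, ry, rz

        wrench = [2.0, 0.0, 0.0, 0.0, 0.0, 0.0]        # measured force/torque
        twist = [axis_velocity(f, law) for f, law in zip(wrench, laws)]
        print(twist)                                    # sent to the ROS robot controller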

  8. Quantum feedback control and classical control theory

    OpenAIRE

    Doherty, Andrew C.; Habib, Salman; Jacobs, Kurt; Mabuchi, Hideo; Tan, Sze M.

    2000-01-01

    We introduce and discuss the problem of quantum feedback control in the context of established formulations of classical control theory, examining conceptual analogies and essential differences. We describe the application of state-observer based control laws, familiar in classical control theory, to quantum systems and apply our methods to the particular case of switching the state of a particle in a double-well potential.

  9. PID control with robust disturbance feedback control

    DEFF Research Database (Denmark)

    Kawai, Fukiko; Vinther, Kasper; Andersen, Palle

    2015-01-01

    Disturbance Feedback Control (DFC) is a technique, originally proposed by Fuji Electric, for augmenting existing control systems with an extra feedback for attenuation of disturbances and model errors. In this work, we analyze the robustness and performance of a PID-based control system with DFC...... and performance (if such gains exist). Finally, two different simulation case studies are evaluated and compared. Our numerical studies indicate that better performance can be achieved with the proposed method compared with a conservatively tuned PID controller and comparable performance can be achieved when...... compared with an H-infinity controller....
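
    A minimal sketch of the DFC structure as the abstract describes it: an extra feedback built from the mismatch between the plant and a nominal model, used to cancel disturbances and model errors (the plant, observer gain and PID gains below are assumptions for illustration, not Fuji Electric's design):

        # PID augmented with a disturbance-feedback loop: the mismatch between
        # plant output and a nominal model yields a disturbance estimate d_hat,
        # which is subtracted from the PID output.
        def simulate(steps=2000, dt=0.01, disturbance=0.5):
            y = y_model = integral = d_hat = prev_err = 0.0
            for _ in range(steps):
                err = 1.0 - y                            # reference is 1.0
                integral += err * dt
                deriv = (err - prev_err) / dt
                prev_err = err
                u_pid = 1.5 * err + 0.8 * integral + 0.05 * deriv
                u = u_pid - d_hat                        # DFC correction
                y += dt * (-y + u + disturbance)         # true plant + disturbance
                y_model += dt * (-y_model + u + d_hat)   # observer includes estimate
                d_hat += 2.0 * (y - y_model) * dt        # update disturbance estimate
            return y

        print(round(simulate(), 3))                      # near 1.0 despite the disturbance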

  10. Adaptive Extremum Control and Wind Turbine Control

    DEFF Research Database (Denmark)

    Ma, Xin

    1997-01-01

    This thesis is divided into two parts, i.e., adaptive extremum control and modelling and control of a wind turbine. The first part of the thesis deals with the design of adaptive extremum controllers for some processes which have the behaviour that the process should have as high efficiency as possible...... in parameters, and thus directly lends itself to parameter estimation and adaptive control. The extremum control law is derived based on static optimization of a performance function. For a process with nonlinearity at the output the intermediate signal between the linear part and nonlinear part plays an important...... role. If it can be emphasis on control design. The models have been validated by experimental data obtained from an existing wind turbine. The effective wind speed experienced by the rotor of a wind turbine, which is often required by some control methods, is estimated by using a wind turbine as a wind...

  11. Motion control report

    CERN Document Server

    2013-01-01

    Please note this is a short discount publication. In today's manufacturing environment, Motion Control plays a major role in virtually every project.The Motion Control Report provides a comprehensive overview of the technology of Motion Control:* Design Considerations* Technologies* Methods to Control Motion* Examples of Motion Control in Systems* A Detailed Vendors List

  12. PID controllers' fragility.

    Science.gov (United States)

    Alfaro, Víctor M

    2007-10-01

    In this paper, an index for measuring fragility of proportional integral derivative (PID) controllers is proposed. This index relates the losses of robustness of the control loop when controller parameters change, to the nominal robustness of the control loop. Furthermore, it defines when a PID controller is fragile, nonfragile or resilient.
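
    The record does not reproduce the index itself; a hedged reconstruction consistent with the description (relating the loss of robustness under controller-parameter changes to the nominal robustness) is:

        \[
        FI_{\Delta\epsilon} \;=\; \frac{MS_{\Delta\epsilon}}{MS_{0}} \;-\; 1,
        \]

    where \(MS_0\) is the nominal maximum sensitivity of the loop and \(MS_{\Delta\epsilon}\) the maximum sensitivity after perturbing the PID parameters by a fraction \(\epsilon\); larger values indicate a more fragile controller.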

  13. Control Measure Dataset

    Data.gov (United States)

    U.S. Environmental Protection Agency — The EPA Control Measure Dataset is a collection of documents describing air pollution control available to regulated facilities for the control and abatement of air...

  14. Gross motor control

    Science.gov (United States)

    Gross motor control is the ability to make large, general movements (such as waving an arm or lifting a ... Gross motor control is a milestone in the development of an infant. Infants develop gross motor control before they ...

  15. Riot Control Agents

    Science.gov (United States)

    Facts About Riot Control Agents (interim document). FACT SHEET: What riot control agents are. Riot control agents (sometimes referred to ...

  16. Essure Permanent Birth Control

    Science.gov (United States)

    Essure is a permanently implanted birth control device for women (female sterilization). Implantation of Essure ...

  17. Birth control pills - combination

    Science.gov (United States)

    //medlineplus.gov/ency/patientinstructions/000655.htm: Birth control pills - combination ... both progestin and estrogen. What Are Combination Birth Control Pills? Birth control pills help keep you from ...

  18. Birth Control Explorer

    Science.gov (United States)

    Birth Control Explorer (sort by all methods or most effective methods) ... If you’re having sex —or if ...

  19. Wisdom Appliance Control System

    Science.gov (United States)

    Hendrick; Jheng, Jyun-Teng; Tsai, Chen-Chai; Liou, Jia-Wei; Wang, Zhi-Hao; Jong, Gwo-Jia

    2017-07-01

    Intelligent appliance wisdom involves security, home care, convenience and energy saving, but the home automation system is still one of the core units. It uses micro-processing electronics technology to centralize and control home electrical products and systems, such as lighting, television, fans, air conditioning and stereos. It is composed of front-controller systems and back-controller panels: the user issues commands through the front controller, which then powers the devices through the back controller.

  20. Applying Dataflow Analysis to Dimension Buffers for Guaranteed Performance in Networks on Chip

    NARCIS (Netherlands)

    Hansson, A.; Hansson, Andreas; Wiggers, M.H.; Moonen, Arno; Goossens, Kees; Bekooij, Marco Jan Gerrit; Bekooij, Marco

    A Network on Chip (NoC) with end-to-end flow control is modelled by a cyclo-static dataflow graph. Using the proposed model together with state-of-the-art dataflow analysis algorithms, we size the buffers in the network interfaces. We show, for a range of NoC designs, that buffer sizes are

  1. A survey of TCP over ad hoc networks

    NARCIS (Netherlands)

    Al Hanbali, Ahmad; Altman, Eitan; Nain, Philippe

    2005-01-01

    The Transmission Control Protocol (TCP) was designed to provide reliable end-to-end delivery of data over unreliable networks. In practice, most TCP deployments have been carefully designed in the context of wired networks. Ignoring the properties of wireless ad hoc networks can lead to TCP

  2. Remote management of non-TR-069 UPnP end-user devices in a private network

    NARCIS (Netherlands)

    Hillen, B.A.G.; Passchier, I.; Schoonhoven, B.H.A. van; Hartog, F.T.H. den

    2009-01-01

    End-to-end broadband service delivery requires remote management of devices in the home network, beyond the home gateway (HG). The service provider can only put limited requirements to these off-the-shelf devices, and therefore has to make intelligent use of their given control and management

  3. Remote discovery and management of end-user devices in heterogeneous private networks

    NARCIS (Netherlands)

    Delphinanto, A.; Hillen, B.A.G.; Passchier, I.; Schoonhoven, B.H.A. van; Hartog, F.T.H. den

    2009-01-01

    End-to-end broadband service delivery requires remote management of devices in the home network, beyond the home gateway (HG). The service provider can only put limited requirements to these off-the-shelf devices, and therefore has to make intelligent use of their given control and management

  4. Dynamic Flow Migration for Delay Constrained Traffic in Software-Defined Networks

    NARCIS (Netherlands)

    Berger, Andre; Gross, James; Danielis, Peter; Dán, György

    2017-01-01

    Various industrial control applications have stringent end-to-end latency requirements in the order of a few milliseconds. Software-defined networking (SDN) is a promising solution in order to meet these stringent requirements under varying traffic patterns, as it enables the flexible management of

  5. Towards a format-agnostic approach for production, delivery and rendering of immersive media

    NARCIS (Netherlands)

    Niamut, O.A.; Kochale, A.; Hidalgo, J.R.; Kaiser, R.; Spille, J.; Macq, J.F.; Kienast, G.; Schreer, O.; Shirley, B.

    2013-01-01

    The media industry is currently being pulled in the often-opposing directions of increased realism (high resolution, stereoscopic, large screen) and personalization (selection and control of content, availability on many devices). We investigate the feasibility of an end-to-end format-agnostic

  6. Iterative Available Bandwidth Estimation for Mobile Transport Networks

    DEFF Research Database (Denmark)

    Ubeda Castellanos, Carlos; López Villa, Dimas; Teyeb, Oumer Mohammed

    2007-01-01

    Available bandwidth estimation has lately been proposed to be used for end-to-end resource management in existing and emerging mobile communication systems, whose transport networks could end up being the bottleneck rather than the air interface. Algorithms for admission control, handover...

  7. Test Control Center (TCC)

    Data.gov (United States)

    Federal Laboratory Consortium — The Test Control Center (TCC) provides a consolidated facility for planning, coordinating, controlling, monitoring, and analyzing distributed test events. The TCC...

  8. Fuzzy controller adaptation

    Science.gov (United States)

    Myravyova, E. A.; Sharipov, M. I.; Radakina, D. S.

    2017-10-01

    In this work, a fuzzy controller with a double rule base was studied and applied to the synthesis of an automated control system. A method for fuzzy controller adaptation has been developed. The adaptation allows the fuzzy controller to automatically compensate for parametric disturbances that occur at the control object. Specifically, the fuzzy controller regulated the outlet steam temperature in the boiler unit BKZ-75-39 GMA. The software code for the fuzzy controller adaptation was written in the Unity Pro XL programming environment.

  9. Dynamic power flow controllers

    Science.gov (United States)

    Divan, Deepakraj M.; Prasai, Anish

    2017-03-07

    Dynamic power flow controllers are provided. A dynamic power flow controller may comprise a transformer and a power converter. The power converter is subject to low voltage stresses and not floated at line voltage. In addition, the power converter is rated at a fraction of the total power controlled. A dynamic power flow controller controls both the real and the reactive power flow between two AC sources having the same frequency. A dynamic power flow controller inserts a voltage with controllable magnitude and phase between two AC sources; thereby effecting control of active and reactive power flows between two AC sources.
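
    For context, the steady-state power exchange between two AC sources of the same frequency across a reactance \(X\) follows the classical relations below; inserting a series voltage of controllable magnitude and phase effectively reshapes \(V_1\) and \(\delta\), which is how such a controller steers both active and reactive flow (standard textbook relations, not equations from the patent):

        \[
        P \;=\; \frac{V_1 V_2}{X}\,\sin\delta,
        \qquad
        Q \;=\; \frac{V_1\,(V_1 - V_2\cos\delta)}{X},
        \]

    where \(V_1, V_2\) are the source voltage magnitudes and \(\delta\) the angle between them.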

  10. Robust control charts in statistical process control

    OpenAIRE

    Nazir, H.Z.

    2014-01-01

    The presence of outliers and contaminations in the output of the process highly affects the performance of the design structures of commonly used control charts and hence makes them of less practical use. One of the solutions to deal with this problem is to use control charts which are robust against violations of the basic assumptions, like normality. The effect of using robust or other alternative estimators has not been investigated thoroughly in perspective of control charting literature....

  11. Space Digital Controller for Improved Motor Control

    Science.gov (United States)

    Alves-Nunes, Samuel; Daras, Gaetan; Dehez, Bruno; Maillard, Christophe; Bekemans, Marc; Michel, Raymond

    2014-08-01

    Performing digital motor control in space equipment is a new challenge. The new DPC (Digital Programmable Controller) is the first chip that we can use as a micro-controller, allowing us to drive motors with digital control schemes. In this paper, the digital control of hybrid stepper motors is considered. This kind of motor is used for solar array rotation and antenna actuation. The new digital control technology brings many advantages, allowing an important reduction of thermal losses inside the motor and a reduction of thermal constraints on power drive electronic components. The opportunity to drive motors with a digital controller also brings many new functionalities, such as post-failure torque analysis, micro-vibration and cogging torque reduction, or electro-mechanical damping of solar array oscillations. To evaluate the performance of the system, Field-Oriented Control (FOC) is implemented on a hybrid stepper motor. A test bench, made of an active load, has been built to emulate the mechanical behaviour of the solar array through a torsionally-compliant model. The experimental results show that we can drastically reduce electrical power consumption, compared with the currently used open-loop control scheme.
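
    Field-oriented control regulates the motor currents in a rotor-aligned d-q frame; a compact sketch of the Park transform and current loop (gains and structure are illustrative assumptions, not the DPC firmware):

        import math

        # Park transform: project stator-frame currents (alpha, beta) onto the
        # rotor-aligned d-q frame, then regulate i_d -> 0 and i_q -> torque demand.
        def park(i_alpha, i_beta, theta):
            i_d = i_alpha * math.cos(theta) + i_beta * math.sin(theta)
            i_q = -i_alpha * math.sin(theta) + i_beta * math.cos(theta)
            return i_d, i_q

        def current_loop(i_d, i_q, i_q_ref, kp=0.5):
            v_d = kp * (0.0 - i_d)          # no flux-producing current wanted
            v_q = kp * (i_q_ref - i_q)      # torque-producing current follows demand
            return v_d, v_q

        i_d, i_q = park(0.3, 0.4, math.radians(30))
        print(current_loop(i_d, i_q, i_q_ref=0.5))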

  12. Control and optimal control theories with applications

    CERN Document Server

    Burghes, D N

    2004-01-01

    This sound introduction to classical and modern control theory concentrates on fundamental concepts. Employing the minimum of mathematical elaboration, it investigates the many applications of control theory to varied and important present-day problems, e.g. economic growth, resource depletion, disease epidemics, exploited population, and rocket trajectories. An original feature is the amount of space devoted to the important and fascinating subject of optimal control. The work is divided into two parts. Part one deals with the control of linear time-continuous systems, using both transfer fun

  13. Improving Control of Two Motor Controllers

    Science.gov (United States)

    Toland, Ronald W.

    2004-01-01

    A computer program controls motors that drive translation stages in a metrology system that consists of a pair of two-axis cathetometers. This program is specific to Compumotor Gemini (or equivalent) motors and the Compumotor 6K-series (or equivalent) motor controller. Relative to the software supplied with the controller, this program affords more capabilities and is easier to use. Written as a Virtual Instrument in the LabVIEW software system, the program presents an imitation control panel that the user can manipulate by use of a keyboard and mouse. There are three modes of operation: command, movement, and joystick. In command mode, single commands are sent to the controller for troubleshooting. In movement mode, distance, speed, and/or acceleration commands are sent to the controller. Position readouts from the motors and from position encoders on the translation stages are displayed in marked fields. At any time, the position readouts can be recorded in a file named by the user. In joystick mode, the program yields control of the motors to a joystick. The program sends commands to, and receives data from, the controller via a serial cable connection, using the serial-communication portion of the software supplied with the controller.

  14. Robust control charts in statistical process control

    NARCIS (Netherlands)

    Nazir, H.Z.

    2014-01-01

    The presence of outliers and contaminations in the output of the process highly affects the performance of the design structures of commonly used control charts and hence makes them of less practical use. One of the solutions to deal with this problem is to use control charts which are robust

  15. Control of Bioprocesses

    DEFF Research Database (Denmark)

    Huusom, Jakob Kjøbsted

    2015-01-01

    industries. It also provides a number of typical control loops for different objectives. A brief introduction to the general principles of process control, the PID control algorithm is discussed, and the design and effect of tuning are shown in an example. Finally, a discussion of novel, model-free control...

  16. Aircraft adaptive learning control

    Science.gov (United States)

    Lee, P. S. T.; Vanlandingham, H. F.

    1979-01-01

    The optimal control theory of stochastic linear systems is discussed in terms of the advantages of distributed-control systems, and the control of randomly-sampled systems. An optimal solution to longitudinal control is derived and applied to the F-8 DFBW aircraft. A randomly-sampled linear process model with additive process and measurement noise is developed.

  17. Applied predictive control

    CERN Document Server

    Sunan, Huang; Heng, Lee Tong

    2002-01-01

    The presence of considerable time delays in the dynamics of many industrial processes, leading to difficult problems in the associated closed-loop control systems, is a well-recognized phenomenon. The performance achievable in conventional feedback control systems can be significantly degraded if an industrial process has a relatively large time delay compared with the dominant time constant. Under these circumstances, advanced predictive control is necessary to improve the performance of the control system significantly. The book is a focused treatment of the subject matter, including the fundamentals and some state-of-the-art developments in the field of predictive control. Three main schemes for advanced predictive control are addressed in this book: • Smith Predictive Control; • Generalised Predictive Control; • a form of predictive control based on Finite Spectrum Assignment. A substantial part of the book addresses application issues in predictive control, providing several interesting case studie...

  18. Foundations Of Fuzzy Control

    DEFF Research Database (Denmark)

    Jantzen, Jan

    The objective of this textbook is to acquire an understanding of the behaviour of fuzzy logic controllers. Under certain conditions a fuzzy controller is equivalent to a proportional-integral-derivative (PID) controller. Using that equivalence as a link, the book applies analysis methods from...... linear and nonlinear control theory. In the linear domain, PID tuning methods and stability analyses are transferred to linear fuzzy controllers. The Nyquist plot shows the robustness of different settings of the fuzzy gain parameters. As a result, a fuzzy controller is guaranteed to perform as well...... as any PID controller. In the nonlinear domain, the stability of four standard control surfaces is analysed by means of describing functions and Nyquist plots. The self-organizing controller (SOC) is shown to be a model reference adaptive controller. There is a possibility that a nonlinear fuzzy PID...

  19. Self adapting control charts

    OpenAIRE

    Albers, Willem/Wim; Kallenberg, W. C. M.

    2004-01-01

    When the distributional form of the observations differs from normality, standard control charts are often seriously in error. Such model errors can be avoided with (modified) nonparametric control charts. Unfortunately, these control charts suffer from large stochastic errors due to estimation. In between are so called parametric control charts. All three of them are discussed in this paper as well as a combined chart, which chooses one of the three control charts according to the appropriat...

  20. Robo-Wrist Controller

    OpenAIRE

    Faraquddin Ahamed, Mohamed Athiq

    2015-01-01

    Robo-wrist controller is a Graphical User Interface (GUI) based application designed and developed to control, and monitor a wrist exoskeleton driven by micro-controllers. The software application is developed to better assist a physiotherapist in administering physical therapy to stroke patients with the help of the exoskeleton. Since the micro-controller device that directly controls the exoskeleton is not very intuitive to use and lacks visual feedback, a software application fills these g...

  1. Controller Architectures for Switching

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Poulsen, Niels Kjølstad

    2009-01-01

    This paper investigates different controller architectures in connection with controller switching. The controller switching is derived by using the Youla-Jabr-Bongiorno-Kucera (YJBK) parameterization. A number of different architectures for the implementation of the YJBK parameterization...... are described and applied in connection with controller switching. An architecture that does not include inversion of the coprime factors is introduced. This architecture will make controller switching particularly simple....
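
    For reference, the YJBK parameterization expresses every stabilizing controller for a plant \(G = NM^{-1}\) (a coprime factorization over stable transfer functions) in terms of one nominal controller \(K_0 = UV^{-1}\) and a free stable parameter \(Q\) (standard textbook form, not necessarily this paper's notation):

        \[
        K(Q) \;=\; \left(U + M\,Q\right)\left(V + N\,Q\right)^{-1},
        \qquad Q \ \text{stable},
        \]

    where \(U\) and \(V\) satisfy a Bezout identity with \(N\) and \(M\); switching between controllers then amounts to changing \(Q\), which preserves closed-loop stability for any stable \(Q\).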

  2. Fuzzy and neural control

    Science.gov (United States)

    Berenji, Hamid R.

    1992-01-01

    Fuzzy logic and neural networks provide new methods for designing control systems. Fuzzy logic controllers do not require a complete analytical model of a dynamic system and can provide knowledge-based heuristic controllers for ill-defined and complex systems. Neural networks can be used for learning control. In this chapter, we discuss hybrid methods using fuzzy logic and neural networks which can start with an approximate control knowledge base and refine it through reinforcement learning.

  3. Insight into the Physical and Dynamical Processes that Control Rapid Increases in Total Flash Rate

    Science.gov (United States)

    Schultz, Christopher J.; Carey, Lawrence D.; Schultz, Elise V.; Blakeslee, Richard J.; Goodman, Steven J.

    2015-01-01

    Rapid increases in total lightning (also termed "lightning jumps") have been observed for many decades. Lightning jumps have been well correlated to severe and hazardous weather occurrence. The main focus of lightning jump work has been on the development of lightning algorithms to be used in real-time assessment of storm intensity. However, in these studies it is typically assumed that the updraft "increases" without direct measurements of the vertical motion, or specification of which updraft characteristic actually increases (e.g., average speed, maximum speed, or convective updraft volume). Therefore, an end-to-end physical and dynamical basis for coupling rapid increases in total flash rate to increases in updraft speed and volume must be understood in order to ultimately relate lightning occurrence to severe storm metrics. Herein, we use polarimetric, multi-Doppler, and lightning mapping array measurements to provide physical context as to why rapid increases in total lightning are closely tied to severe and hazardous weather.

  4. Vehicle Dynamics and Control

    CERN Document Server

    Rajamani, Rajesh

    2012-01-01

    Vehicle Dynamics and Control provides a comprehensive coverage of vehicle control systems and the dynamic models used in the development of these control systems. The control system applications covered in the book include cruise control, adaptive cruise control, ABS, automated lane keeping, automated highway systems, yaw stability control, engine control, passive, active and semi-active suspensions, tire-road friction coefficient estimation, rollover prevention, and hybrid electric vehicle. In developing the dynamic model for each application, an effort is made to both keep the model simple enough for control system design but at the same time rich enough to capture the essential features of the dynamics. A special effort has been made to explain the several different tire models commonly used in literature and to interpret them physically. In the second edition of the book, chapters on roll dynamics, rollover prevention and hybrid electric vehicles have been added, and the chapter on electronic stability co...

  5. Switching Between Multivariable Controllers

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Stoustrup, Jakob; Abrahamsen, Rune

    2004-01-01

    it is possible to smoothly switch between multivariable controllers with guaranteed closed-loop stability. This includes also the case where one or more controllers are unstable. The concept for smooth online changes of multivariable controllers based on the YJBK architecture can also handle the start up......A concept for implementation of multivariable controllers is presented in this paper. The concept is based on the Youla-Jabr-Bongiorno-Kucera (YJBK) parameterization of all stabilizing controllers. By using this architecture for implementation of multivariable controllers, it is shown how...... and shut down of multivariable systems. Furthermore, the start up of unstable multivariable controllers can be handled as well. Finally, implementation of (unstable) controllers as a stable Q parameter in a Q-parameterized controller can also be achieved....

  6. An Efficient Algorithm for Congestion Control in Highly Loaded DiffServ/MPLS Networks

    Directory of Open Access Journals (Sweden)

    Srecko Krile

    2009-06-01

    Full Text Available The optimal QoS path provisioning of coexisting and aggregated traffic in networks is still a demanding problem. All traffic flows in a domain are distributed among LSPs (Label Switching Paths) related to N service classes, but the congestion problem of concurrent flows can appear. As we know, the IGP (Interior Gateway Protocol) uses simple on-line routing algorithms (e.g. OSPF, IS-IS) based on shortest-path methodology. In QoS end-to-end provisioning, where some links may be reserved for certain traffic classes (for a particular set of users), it becomes an insufficient technique. On the other hand, constraint-based explicit routing (CR) based on the IGP metric ensures traffic engineering (TE) capabilities. The algorithm proposed in this paper may find a longer but lightly loaded path, better than the heavily loaded shortest path. The LSP can be pre-computed much earlier, possibly during the SLA (Service Level Agreement) negotiation process. As we need firm correlation with bandwidth management and traffic engineering (TE), the initial (pro-active) routing can be pre-computed in the context of all priority traffic flows (former contracted SLAs) traversing the network simultaneously. It could be a very good solution for congestion avoidance and for load-balancing purposes where links are running close to capacity. Also, such a technique could be useful in inter-domain end-to-end provisioning, where bandwidth reservation has to be negotiated with neighbour ASes (Autonomous Systems). To be acceptable for real applications, such a complicated routing algorithm has to be significantly improved. The algorithm was tested on a network of M core routers on the path (between edge routers), and results are given for N=3 service classes. Further improvements through a heuristic approach are made and results are discussed.
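
    The paper's core idea, preferring a longer but lightly loaded path over a heavily loaded shortest path, can be caricatured as Dijkstra over a load-penalized link cost (a toy illustration, not the authors' algorithm; topology and loads are invented):

        import heapq

        # Dijkstra over a load-aware cost: each link's IGP metric is inflated as
        # its utilization grows, so a longer but lightly loaded LSP can win.
        def best_path(links, src, dst):
            # links: {node: [(neighbor, igp_metric, utilization 0..1), ...]}
            dist, prev = {src: 0.0}, {}
            heap = [(0.0, src)]
            while heap:
                d, u = heapq.heappop(heap)
                if u == dst:
                    break
                if d > dist.get(u, float("inf")):
                    continue
                for v, metric, util in links.get(u, []):
                    cost = d + metric / max(1e-6, 1.0 - util)   # penalize loaded links
                    if cost < dist.get(v, float("inf")):
                        dist[v], prev[v] = cost, u
                        heapq.heappush(heap, (cost, v))
            path, node = [], dst
            while node != src:
                path.append(node)
                node = prev[node]
            return [src] + path[::-1]

        links = {"A": [("B", 1, 0.9), ("C", 2, 0.1)],
                 "B": [("D", 1, 0.9)], "C": [("D", 2, 0.1)]}
        print(best_path(links, "A", "D"))   # -> ['A', 'C', 'D'], avoiding congestion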

  7. Commissioning and quality control of a dedicated wide bore 3T MRI simulator for radiotherapy planning

    Directory of Open Access Journals (Sweden)

    Aitang Xing

    2016-06-01

    Full Text Available Purpose: The purpose of this paper is to describe a practical approach to commissioning and quality assurance (QA) of a dedicated wide-bore 3 Tesla (3T) magnetic resonance imaging (MRI) scanner for radiotherapy planning. Methods: A comprehensive commissioning protocol focusing on radiotherapy (RT) specific requirements was developed and performed. RT-specific tests included: uniformity characteristics of the radio-frequency (RF) coil, couch top attenuation, geometric distortion, laser and couch movement, and an end-to-end radiotherapy treatment planning test. General tests for overall system performance and safety measurements were also performed. Results: The use of pre-scan based intensity correction increased the uniformity from 61.7% to 97% (body flexible coil), from 50% to 90% (large flexible coil) and from 51% to 98% (small flexible coil). The RT flat top couch decreased the signal-to-noise ratio (SNR) by an average of 42%. The mean and maximum geometric distortion was found to be 1.25 mm and 4.08 mm for three-dimensional (3D) corrected image acquisition, and 2.07 mm and 7.88 mm for two-dimensional (2D) corrected image acquisition, over a 500 mm × 375 mm × 252 mm field of view (FOV). The accuracy of the laser and couch movement was less than ±1 mm. The standard deviation of registration parameters for the end-to-end test was less than 0.41 mm. An on-going QA program was developed to monitor the system’s performance. Conclusion: A number of RT-specific tests have been described for commissioning and subsequent performance monitoring of a dedicated MRI simulator (MRI-Sim). These tests have been important in establishing and maintaining its operation for RT planning.
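
    The uniformity figures quoted above are typically computed as integral uniformity over a region of interest in a phantom image; the record does not state the exact formula used in the protocol, so the common definition is sketched here:

        # Integral (percent) image uniformity over a region of interest:
        # U = (1 - (S_max - S_min) / (S_max + S_min)) * 100
        def integral_uniformity(roi_values):
            s_max, s_min = max(roi_values), min(roi_values)
            return (1.0 - (s_max - s_min) / (s_max + s_min)) * 100.0

        print(round(integral_uniformity([980, 1000, 1010, 995, 985]), 1))  # ~98.5%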

  8. Robot welding process control

    Science.gov (United States)

    Romine, Peter L.

    1991-01-01

    This final report documents the development and installation of software and hardware for Robotic Welding Process Control. Primary emphasis is on serial communications between the CYRO 750 robotic welder, Heurikon minicomputer running Hunter & Ready VRTX, and an IBM PC/AT, for offline programming and control and closed-loop welding control. The requirements for completion of the implementation of the Rocketdyne weld tracking control are discussed. The procedure for downloading programs from the Intergraph, over the network, is discussed. Conclusions are made on the results of this task, and recommendations are made for efficient implementation of communications, weld process control development, and advanced process control procedures using the Heurikon.

  9. Control system design method

    Science.gov (United States)

    Wilson, David G [Tijeras, NM; Robinett, III, Rush D.

    2012-02-21

    A control system design method and concomitant control system comprising representing a physical apparatus to be controlled as a Hamiltonian system, determining elements of the Hamiltonian system representation which are power generators, power dissipators, and power storage devices, analyzing stability and performance of the Hamiltonian system based on the results of the determining step and determining necessary and sufficient conditions for stability of the Hamiltonian system, creating a stable control system based on the results of the analyzing step, and employing the resulting control system to control the physical apparatus.
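
    Schematically, the stability argument rests on the Hamiltonian's power balance (notation assumed for illustration, not the patent's equations):

        \[
        \dot{H} \;=\; P_{\mathrm{gen}} - P_{\mathrm{diss}},
        \]

    with \(H\) bounded below: shaping the power generators and dissipators so that \(\dot{H} \le 0\) along trajectories yields Lyapunov stability of the closed loop.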

  10. Control integral systems; Sistemas integrales de control

    Energy Technology Data Exchange (ETDEWEB)

    Burgos, Estrella [Instituto de Investigaciones Electricas, Cuernavaca (Mexico)

    1998-12-31

    Almost two thirds of the electric power generation in Mexico is obtained from hydrocarbons; for that reason the Comision Federal de Electricidad (CFE) made a special commitment to modernizing the operation of its fossil fuel central stations. In attaining this objective the control systems play a fundamental role: on them depend a good share of the reliability and the efficiency of the electric power generation process, as well as the extension of the equipment's useful life. Since 1984 the Instituto de Investigaciones Electricas (IIE) has been working, at the request of CFE, on the development of digital control systems. To date it has designed and implemented a logic control system for gas burners, which controls 32 burners of the Unit 4 boiler of the Valle de Mexico generation station, and two distributed control systems for two combined cycle central stations: Dos Bocas, Veracruz, and Gomez Palacio, Durango. With these two developments the IIE joins the worldwide tendency of implementing distributed control systems for fossil fuel power station modernization.

  11. Switching Systems: Controllability and Control Design

    Science.gov (United States)

    2009-04-25

    Keywords: EOARD, Navigation, Communications & Guidance, Complex Systems. ABSTRACT: Motivated by the need of dealing with physical systems that exhibit a more complicated behavior than those normally described by

  12. Ontology Based Access Control

    Directory of Open Access Journals (Sweden)

    Özgü CAN

    2010-02-01

    As computer technologies become pervasive, the need for access control mechanisms grows. The purpose of access control is to limit the operations that a computer system user can perform, and so to prevent activity that could lead to a security breach. Access control mechanisms are also needed for the success of the Semantic Web, which lets machines share and reuse information through formal semantics and communicate with one another. An access control mechanism enforces the constraints a user must satisfy before performing an operation, providing a secure Semantic Web. In this work, unlike traditional access control mechanisms, an "Ontology Based Access Control" mechanism has been developed using Semantic Web based policies. In this mechanism, ontologies model the access control knowledge, and domain knowledge is used to create policy ontologies.
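
    A minimal sketch of the idea, assuming a toy ontology encoded as plain (subject, predicate, object) triples rather than the authors' actual policy ontologies; the class names, roles, and policy format are invented for illustration.

        # Ontology-based access control sketch: a policy grants an action on a
        # resource to every user whose role is subsumed (via subClassOf) by the
        # policy's role class. All names here are invented for illustration.

        ONTOLOGY = {
            ("Physician", "subClassOf", "MedicalStaff"),
            ("Nurse", "subClassOf", "MedicalStaff"),
            ("alice", "hasRole", "Physician"),
            ("bob", "hasRole", "Visitor"),
        }

        POLICIES = [
            # (role class, action, resource)
            ("MedicalStaff", "read", "PatientRecord"),
        ]

        def is_subclass(ontology, cls, ancestor):
            """True if cls equals ancestor or reaches it via subClassOf edges."""
            if cls == ancestor:
                return True
            return any(is_subclass(ontology, parent, ancestor)
                       for c, p, parent in ontology
                       if c == cls and p == "subClassOf")

        def is_permitted(user, action, resource):
            roles = [o for s, p, o in ONTOLOGY if s == user and p == "hasRole"]
            return any(is_subclass(ONTOLOGY, role, role_cls)
                       for role in roles
                       for role_cls, act, res in POLICIES
                       if act == action and res == resource)

        print(is_permitted("alice", "read", "PatientRecord"))  # True
        print(is_permitted("bob", "read", "PatientRecord"))    # False

    A real system would express both the domain model and the policies in OWL and evaluate them with a reasoner; the point of the sketch is only that access decisions reduce to subsumption queries over the ontology.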

  13. Controls and Interfaces

    CERN Document Server

    King, Q

    2015-01-01

    Reliable powering of accelerator magnets requires reliable power converters and controls, able to meet the powering specifications in the long term. In this paper, some of the issues that will challenge a power converter controls engineer are discussed.
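
    Purely as an illustration of the kind of regulation loop such converter controls implement (the paper itself contains no code), the sketch below simulates a discrete PI current regulator for a magnet modeled as a series R-L load; the load parameters, gains, and sample time are all invented.

        # Illustrative discrete PI current regulator for a magnet modeled as a
        # series R-L load driven by a voltage-source converter. All parameters
        # (R, L, gains, sample time) are invented for this sketch.
        R, L = 0.05, 0.2      # ohms, henries (hypothetical magnet load)
        TS = 1e-3             # control period, seconds
        KP, KI = 2.0, 30.0    # PI gains (hypothetical tuning)

        def simulate(i_ref=100.0, steps=2000):
            i, integ = 0.0, 0.0
            for _ in range(steps):
                err = i_ref - i
                integ += err * TS
                v = KP * err + KI * integ    # converter voltage command
                i += TS * (v - R * i) / L    # forward-Euler R-L dynamics
            return i

        print(f"current after {2000 * TS:.1f} s: {simulate():.2f} A")

    The integral term drives the steady-state current error to zero despite the resistive drop, which is one of the basic specifications a magnet powering loop has to meet in the long term.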

  14. Switching Between Multivariable Controllers

    DEFF Research Database (Denmark)

    Niemann, H.; Stoustrup, Jakob; Abrahamsen, R.B.

    2004-01-01

    A concept for the implementation of multivariable controllers is presented in this paper. The concept is based on the Youla-Jabr-Bongiorno-Kucera (YJBK) parameterization of all stabilizing controllers. By using this architecture for the implementation of multivariable controllers, it is shown how it is possible to smoothly switch between multivariable controllers with guaranteed closed-loop stability, including the case where one or more of the controllers are unstable.
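
    For reference, the standard form of the YJBK parameterization in textbook notation (not reproduced from the paper): given a right coprime factorization of the plant, \(G = N M^{-1}\), and a nominal stabilizing controller \(K_0 = U V^{-1}\) satisfying the Bezout identity, every stabilizing controller can be written as

        \[
          K(Q) \;=\; (U + M\,Q)\,(V + N\,Q)^{-1},
          \qquad Q \in \mathcal{RH}_\infty,
        \]

    so switching between controllers amounts to moving the stable parameter Q, which is what makes smooth, stability-preserving transitions possible even when the individual controllers are themselves unstable.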

  15. Scabies: Prevention and Control

    Science.gov (United States)

    When a ... avoid outbreaks. Institutional outbreaks can be difficult to control and require a rapid, aggressive, and sustained response. ...

  16. Poison Control Centers

    Science.gov (United States)

    ... 1222 immediately. A state-by-state directory of poison control centers by name and address, including the American Association of Poison Control Centers (AAPCC Central Office; not for emergency use) and the ASPCA Animal Poison Control Center, 1717 S. Philo Road, Suite 36, Urbana, ...

  17. Tight Diabetes Control

    Science.gov (United States)

    Managing blood glucose can prevent or slow the ... prevents or slows some complications. For the Diabetes Control and Complications Trial (DCCT), researchers followed 1,441 ...

  18. Birth control pill - slideshow

    Science.gov (United States)

    Birth control pill - series: Normal female anatomy (//medlineplus.gov/ency/presentations/100108.htm). To use the ... produce a successful pregnancy. To prevent pregnancy, birth control pills affect how these organs normally function. ...

  19. TIPONLINE Control Table

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The tiponline application is quite complex. It contains code and control information, and it uses local lists and control data. Those who care about the inner workings of the application...

  20. NGS Survey Control Map

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NGS Survey Control Map provides a map of the US which allows you to find and display geodetic survey control points stored in the database of the National Geodetic Survey (NGS).