WorldWideScience

Sample records for dna end-to-end distance

  1. Changes in the end-to-end distance distribution in an oligonucleotide following hybridization

    Science.gov (United States)

    Parkhurst, Lawrence J.; Parkhurst, Kay M.

    1994-08-01

A 16-mer deoxyoligonucleotide was labeled at the 5' end with x-rhodamine and at the 3' end with fluorescein. The fluorescence lifetime of the donor, fluorescein, under conditions for resonance energy transfer was studied using the SLM 4850 multiharmonic frequency phase fluorometer in order to obtain information on the end-to-end distance distribution P(R) in the oligomer. When this doubly labeled oligonucleotide was hybridized to its 16-mer complement, the fluorescein fluorescence decay could be very well described by a P(R) that was a symmetric shifted Gaussian centered at 68.4 Å with σ = 6.4 Å. Simulations suggested that part of the width might be attributable to a distribution in κ². In the single-stranded labeled oligomer there was enhanced energy transfer from the fluorescein to the rhodamine, and the best-fitting symmetric shifted-Gaussian representation of P(R) was centered at 53.8 Å with σ = 6.9 Å; there was, however, a significant lack of fit with this model. A model-independent procedure was developed for extracting P(R) as a sum of weighted Hermite polynomials. This procedure gave a P(R) with a large negative region at R < 20 Å, suggesting that rotational averaging of κ² was not quite complete prior to significant decay of the donor excited state.
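
For orientation, a shifted-Gaussian P(R) like the ones above can be turned into an average transfer efficiency by integrating the Förster expression E(R) = 1/(1 + (R/R0)^6) over P(R). The sketch below assumes a Förster radius R0 = 55 Å for the fluorescein/rhodamine pair (a typical literature value, not taken from this abstract):

```python
import math

def gaussian_p(r, mu, sigma):
    """Shifted-Gaussian end-to-end distance distribution P(R)."""
    return math.exp(-0.5 * ((r - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def mean_fret_efficiency(mu, sigma, r0, r_max=200.0, n=4000):
    """Average E = <1/(1 + (R/R0)^6)> over P(R), by a simple Riemann sum."""
    dr = r_max / n
    num = den = 0.0
    for i in range(1, n + 1):
        r = i * dr
        p = gaussian_p(r, mu, sigma)
        num += p / (1.0 + (r / r0) ** 6) * dr
        den += p * dr
    return num / den

# Parameters from the abstract; R0 = 55 A is an assumed Forster radius.
e_duplex = mean_fret_efficiency(68.4, 6.4, 55.0)   # hybridized duplex
e_single = mean_fret_efficiency(53.8, 6.9, 55.0)   # single strand
```

As expected, the shorter mean distance in the single strand gives the higher average transfer efficiency, consistent with the enhanced transfer reported in the abstract.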

  2. End-to-End Optimization of High-Throughput DNA Sequencing.

    Science.gov (United States)

    O'Reilly, Eliza; Baccelli, Francois; De Veciana, Gustavo; Vikalo, Haris

    2016-10-01

At the core of Illumina's high-throughput DNA sequencing platforms lies a biophysical surface process, bridge amplification, which results in a random geometry of clusters of homogeneous short DNA fragments, typically hundreds of base pairs long. The statistical properties of this random process and the lengths of the fragments are critical, as they affect the information that can subsequently be extracted, that is, the density of successfully inferred DNA fragment reads. The ensembles of overlapping DNA fragment reads are then used to computationally reconstruct the much longer target genome sequence. The success of the reconstruction in turn depends on having a sufficiently large ensemble of sufficiently long DNA fragments. In this article, using stochastic geometry, we model and optimize the end-to-end flow cell synthesis and target genome sequencing process, linking and partially controlling the statistics of the physical processes to the success of the final computational step. Based on a rough calibration of our model, we provide, for the first time, a mathematical framework capturing the salient features of the sequencing platform that serves as a basis for optimizing cost, performance, and/or sensitivity analysis to various parameters.
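
The dependence of reconstruction success on read number and length can be illustrated with the classical Lander-Waterman coverage estimate; this is a far simpler model than the paper's stochastic-geometry framework and is shown only for intuition:

```python
import math

def expected_covered_fraction(n_reads, read_len, genome_len):
    """Lander-Waterman estimate: the fraction of the genome covered by at
    least one read, assuming uniformly random, independent read placement."""
    c = n_reads * read_len / genome_len   # mean per-base coverage depth
    return 1.0 - math.exp(-c)

# 1x nominal depth covers only ~63% of the genome; doubling the read count
# (or read length) raises coverage, but with diminishing returns.
f_1x = expected_covered_fraction(10000, 100, 1000000)
f_2x = expected_covered_fraction(20000, 100, 1000000)
```

The diminishing-returns shape is one reason end-to-end optimization of the physical process, rather than simply producing more reads, pays off.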

  3. End to End Travel

    Data.gov (United States)

US Agency for International Development — E2 Solutions is a web-based end-to-end travel management tool that includes paperless travel authorization and voucher document submissions, document approval...

  4. End-to-End Distance and Its Probability Distribution of Polymer Chains near a Flat Barrier

    Institute of Scientific and Technical Information of China (English)

    黄建花; 蒋文华; 韩世钧

    2001-01-01

The problem of polymer chains near an impenetrable plane is investigated by means of the probability method. It is shown that the 2k-th moment A2k of the reduced normal component of the end-to-end distance depends only on the reduced distance AZ0 of the first segment from the plane, where the reduction factor is A = 1/(l·n^(1/2)), n is the chain length, and l is the bond length (fixed to unity); that is, A2k = f(AZ0). When AZ0 ≈ 0, A2k attains its maximum (A2k = k!); it then decreases rapidly to a minimum as AZ0 increases, after which A2k rises gradually and reaches the limiting value [(2k-1)(2k-3)···1]/2^k when AZ0 is large enough. This suggests that, because of the barrier, the polymer chain is significantly elongated for small Z0 and contracted over an intermediate range of Z0. The distribution of the end-to-end distance likewise depends on the distance Z0 of the first segment from the plane.
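
A minimal Monte-Carlo sketch of the k = 1 case: chains whose segments must all stay above the plane are elongated when started at the wall and recover the free-chain moment far from it. The per-step z-variance of l²/2 is an assumption chosen here so that the free-chain limit equals the abstract's value of 1/2 for k = 1; this is an illustration, not the paper's analytic probability method:

```python
import math
import random

def reduced_second_moment(z0, n_steps=200, n_chains=20000, seed=7):
    """Monte-Carlo estimate of A2 = <(z_n - z_0)^2> / (n * l^2), l = 1,
    for a chain every segment of which must stay above the plane z = 0.
    Chains that touch the barrier are discarded (impenetrable wall)."""
    rng = random.Random(seed)
    sd = math.sqrt(0.5)               # assumed z-std-dev per bond (l^2/2 variance)
    total, kept = 0.0, 0
    for _ in range(n_chains):
        z, alive = z0, True
        for _ in range(n_steps):
            z += rng.gauss(0.0, sd)
            if z < 0.0:               # segment crossed the barrier: discard chain
                alive = False
                break
        if alive:
            total += (z - z0) ** 2
            kept += 1
    return total / kept / n_steps

near = reduced_second_moment(0.5)                  # first segment near the wall
far = reduced_second_moment(50.0, n_chains=3000)   # far away: free-chain limit 1/2
```

The near-wall estimate approaches the elongated value A2 = k! = 1, while the distant estimate approaches the free-chain limit 1/2, matching the trend the abstract describes.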

  5. An End-to-End DNA Taxonomy Methodology for Benthic Biodiversity Survey in the Clarion-Clipperton Zone, Central Pacific Abyss

    Directory of Open Access Journals (Sweden)

    Adrian G. Glover

    2015-12-01

Recent years have seen increased survey and sampling expeditions to the Clarion-Clipperton Zone (CCZ), central Pacific Ocean abyss, driven by commercial interests from contractors in the potential extraction of polymetallic nodules in the region. Part of the International Seabed Authority (ISA) regulatory requirements are that these contractors undertake environmental research expeditions to their CCZ exploration claims following guidelines approved by the ISA Legal and Technical Commission (ISA, 2010). Section 9(e) of these guidelines instructs contractors to "…collect data on the sea floor communities specifically relating to megafauna, macrofauna, meiofauna, microfauna, nodule fauna and demersal scavengers". There are a number of methodological challenges to this, including the water depth (4000–5000 m), extremely warm surface waters (~28 °C) compared to bottom water (~1.5 °C), and great distances to ports, requiring a large and long seagoing expedition with only a limited number of scientists. Both scientists and regulators have recently realized that a major gap in our knowledge of the region is the fundamental taxonomy of the animals that live there; this is essential to inform our knowledge of the biogeography, natural history and ultimately our stewardship of the region. Recognising this, the ISA is currently sponsoring a series of taxonomic workshops on the CCZ fauna, and to assist in this process we present here a series of methodological pipelines for DNA taxonomy (incorporating both molecular and morphological data) of the macrofauna and megafauna from the CCZ benthic habitat in the recent ABYSSLINE cruise program to the UK-1 exploration claim. A major problem on recent CCZ cruises has been the collection of high-quality samples suitable for both morphology and DNA taxonomy, coupled with a workflow that ensures these data are made available. The DNA sequencing techniques themselves are relatively standard, once good samples have been

  6. Internet end-to-end delay dynamics

    Institute of Scientific and Technical Information of China (English)

    Zhu Changhua; Pei Changxing; Li Jiandong; Chen Nan; Yi Yunhui

    2006-01-01

End-to-end delay is one of the most important characteristics of Internet end-to-end packet dynamics and can be applied to quality-of-service (QoS) management, service level agreement (SLA) management, congestion control algorithm development, etc. Nonstationarity and nonlinearity are found by analysis of various delay series measured on different links. It is also found that different types of links exhibit different degrees of self-similarity. By constructing appropriate network architectures and neural functions, functional networks can be used to model Internet end-to-end nonlinear delay time series. Furthermore, by using an adaptive parameter-learning algorithm, the nonstationarity can also be well modeled. The numerical results show that the proposed functional network architecture and adaptive algorithm can precisely characterize Internet end-to-end delay dynamics.

  7. Applying Trustworthy Computing to End-to-End Electronic Voting

    Science.gov (United States)

    Fink, Russell A.

    2010-01-01

    "End-to-End (E2E)" voting systems provide cryptographic proof that the voter's intention is captured, cast, and tallied correctly. While E2E systems guarantee integrity independent of software, most E2E systems rely on software to provide confidentiality, availability, authentication, and access control; thus, end-to-end integrity is not…

  8. Research on End-to-End Encryption of TETRA

    Institute of Scientific and Technical Information of China (English)

    ZHANG Zhi-hui; YANG Yi-xian

    2006-01-01

The Terrestrial Trunked Radio (TETRA) system uses end-to-end encryption in addition to air-interface encryption to provide enhanced security. TETRA uses a synchronization technique known as frame stealing to synchronize end-to-end encrypted data. However, the frame-stealing process degrades video quality. This paper proposes an end-to-end encryption system that uses frame stealing for voice and frame insertion for video. A block cipher in output feedback (OFB) mode is used to implement the end-to-end key stream generator. Moreover, in Short Data Service (SDS) message encryption, a block cipher in Cipher Block Chaining (CBC) mode is used to calculate a cryptographically secure checksum, which is sufficient to verify integrity.
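
The two cipher modes named above can be sketched as follows. The block cipher here is a hash-based stand-in for illustration only (a real TETRA deployment would use an approved cipher); the mode logic, OFB for key-stream generation and CBC for the checksum, is what the abstract describes:

```python
import hashlib

BLOCK = 16  # block size in bytes

def toy_block_encrypt(key: bytes, block: bytes) -> bytes:
    """Hash-based stand-in for a real block cipher (illustration only)."""
    return hashlib.sha256(key + block).digest()[:BLOCK]

def ofb_keystream(key: bytes, iv: bytes, n_bytes: int) -> bytes:
    """Output-feedback mode: the cipher repeatedly encrypts its own output,
    yielding a key stream that is independent of the plaintext."""
    out, fb = bytearray(), iv
    while len(out) < n_bytes:
        fb = toy_block_encrypt(key, fb)
        out.extend(fb)
    return bytes(out[:n_bytes])

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def cbc_mac(key: bytes, msg: bytes) -> bytes:
    """CBC-MAC checksum: CBC-encrypt with a zero IV, keep the last block."""
    padded = msg + b"\x00" * (-len(msg) % BLOCK)
    state = bytes(BLOCK)
    for i in range(0, len(padded), BLOCK):
        state = toy_block_encrypt(key, xor_bytes(state, padded[i:i + BLOCK]))
    return state
```

Encryption is then `ciphertext = xor_bytes(keystream, plaintext)`, and decryption is the same XOR, which is why key-stream synchronization (the frame-stealing/insertion problem above) matters.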

  9. Standardizing an End-to-end Accounting Service

    Science.gov (United States)

    Greenberg, Edward; Kazz, Greg

    2006-01-01

Currently there are no space system standards available for space agencies to accomplish end-to-end accounting. No such standard exists for spacecraft operations, nor for tracing the relationships among mission planning activities, the command sequences designed to perform those activities, the commands formulated to initiate them, and the mission data, specifically the mission data products, created by those activities. In order for space agencies to cross-support one another in data accountability and tracing, and for inter-agency spacecraft to interoperate, an international CCSDS standard for end-to-end data accountability/tracing needs to be developed. We will first describe the end-to-end accounting service model and the functionality that supports the service. This model will describe how science plans that are ultimately transformed into commands can be associated with the telemetry products generated as a result of their execution. Moreover, the interaction between end-to-end accounting and service management will be explored. Finally, we will show how the standard end-to-end accounting service can be applied to a real-life flight project, the Mars Reconnaissance Orbiter.

  10. End-to-end delay analysis for networked systems

    Institute of Scientific and Technical Information of China (English)

    Jie SHEN; Wen-bo HE; Xue LIU; Zhi-bo WANG; Zhi WANG; Jian-guo YAO

    2015-01-01

End-to-end delay measurement has been an essential element in the deployment of real-time services in networked systems. Traditional methods of delay measurement based on time-domain analysis, however, become inefficient as network scale and complexity increase. We propose a novel theoretical framework to analyze the end-to-end delay distributions of networked systems in the frequency domain. We use a signal flow graph to model the delay distribution of a networked system and prove that the end-to-end delay distribution is the inverse Laplace transform of the transfer function of the signal flow graph. Two efficient methods, one based on Cramer's rule and one based on Mason's gain rule, are adopted to obtain the transfer function. By analyzing the time responses of the transfer function, we obtain the end-to-end delay distribution. Based on our framework, we propose an efficient method that uses the dominant poles of the transfer function to identify the bottleneck links of the network. Moreover, we use the framework to study network protocol performance. Theoretical analysis and extensive evaluations show the effectiveness of the proposed approach.
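
The frequency-domain idea can be checked on the smallest example: two links in series with independent exponential per-hop delays of rates a and b. The transfer function is the product a/(s+a) · b/(s+b), and its inverse Laplace transform is the hypoexponential density below (a textbook special case, not the paper's general algorithm):

```python
import math

def hypoexp_pdf(t, a, b):
    """End-to-end delay density for two links in series with exponential
    per-hop delays of rates a and b (a != b): the inverse Laplace transform
    of H(s) = a/(s+a) * b/(s+b), by partial fractions."""
    return a * b / (b - a) * (math.exp(-a * t) - math.exp(-b * t))

def numeric_mean(a, b, t_max=200.0, n=200000):
    """Mean end-to-end delay by midpoint-rule integration of t*f(t)."""
    dt = t_max / n
    return sum(hypoexp_pdf((i + 0.5) * dt, a, b) * (i + 0.5) * dt * dt
               for i in range(n))
```

The numerically integrated mean matches the obvious series-system answer 1/a + 1/b, confirming that the transfer-function product correctly encodes the convolution of per-hop delays.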

  11. End-to-End Security for Personal Telehealth

    NARCIS (Netherlands)

    Koster, R.P.; Asim, M.; Petkovic, M.

    2011-01-01

Personal telehealth is in rapid development with innovative emerging applications like disease management. With personal telehealth, people participate in their own care, supported by an open distributed system with health services. This poses new end-to-end security and privacy challenges.

  13. End-To-End Verifiability in Electronic Elections

    OpenAIRE

    Risvik, Chris Csomos

    2016-01-01

Voting has traditionally been performed by casting paper ballots in public polling places. However, advancements in computer technology in recent decades have enabled voters to cast their votes electronically. By 2016, eleven countries had run trials enabling voters to cast votes over the Internet using personal devices. End-to-end verifiability is regarded by election experts as an important property for retaining democratic principles in Internet elections. Nevertheless...

  14. Measurements and analysis of end-to-end Internet dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Paxson, Vern [Univ. of California, Berkeley, CA (United States). Computer Science Division

    1997-04-01

Accurately characterizing end-to-end Internet dynamics - the performance that a user actually obtains from the lengthy series of network links that comprise a path through the Internet - is exceptionally difficult, due to the network's immense heterogeneity. At the heart of this work is a "measurement framework" in which a number of sites around the Internet host a specialized measurement service. By coordinating "probes" between pairs of these sites one can measure end-to-end behavior along O(N²) paths for a framework consisting of N sites. Consequently, one obtains a superlinear scaling that allows measuring a rich cross-section of Internet behavior without requiring huge numbers of observation points. 37 sites participated in this study, allowing the author to measure more than 1,000 distinct Internet paths. The first part of this work looks at the behavior of end-to-end routing: the series of routers over which a connection's packets travel. Based on 40,000 measurements made using this framework, the author analyzes: routing "pathologies" such as loops, outages, and flutter; the stability of routes over time; and the symmetry of routing along the two directions of an end-to-end path. The author finds that pathologies increased significantly over the course of 1995 and that Internet paths are heavily dominated by a single route. The second part of this work studies end-to-end Internet packet dynamics. The author analyzes 20,000 TCP transfers of 100 Kbyte each to investigate the performance of both the TCP endpoints and the Internet paths. The measurements used for this part of the study are much richer than those for the first part, but require a great degree of attention to issues of calibration, which are addressed by applying self-consistency checks to the measurements whenever possible. The author finds that packet filters are capable of a wide range of measurement errors, some of which, if undetected, can significantly taint subsequent analysis.
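
The superlinear scaling follows from counting ordered site pairs; with the study's 37 sites this already exceeds 1,000 distinct paths:

```python
def measurable_paths(n_sites):
    """Each ordered pair of measurement sites yields a distinct end-to-end
    path, so N cooperating sites cover N * (N - 1) paths."""
    return n_sites * (n_sites - 1)

paths = measurable_paths(37)  # the 37-site framework described in the abstract
```

Doubling the number of sites roughly quadruples the number of measurable paths, which is why a modest framework yields a rich cross-section of Internet behavior.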

  16. End-to-end network/application performance troubleshooting methodology

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Wenji; Bobyshev, Andrey; Bowden, Mark; Crawford, Matt; Demar, Phil; Grigaliunas, Vyto; Grigoriev, Maxim; Petravick, Don; /Fermilab

    2007-09-01

    The computing models for HEP experiments are globally distributed and grid-based. Obstacles to good network performance arise from many causes and can be a major impediment to the success of the computing models for HEP experiments. Factors that affect overall network/application performance exist on the hosts themselves (application software, operating system, hardware), in the local area networks that support the end systems, and within the wide area networks. Since the computer and network systems are globally distributed, it can be very difficult to locate and identify the factors that are hurting application performance. In this paper, we present an end-to-end network/application performance troubleshooting methodology developed and in use at Fermilab. The core of our approach is to narrow down the problem scope with a divide and conquer strategy. The overall complex problem is split into two distinct sub-problems: host diagnosis and tuning, and network path analysis. After satisfactorily evaluating, and if necessary resolving, each sub-problem, we conduct end-to-end performance analysis and diagnosis. The paper will discuss tools we use as part of the methodology. The long term objective of the effort is to enable site administrators and end users to conduct much of the troubleshooting themselves, before (or instead of) calling upon network and operating system 'wizards,' who are always in short supply.

  17. The CarbonSat End-to-End Simulator

    Science.gov (United States)

    Bramstedt, Klaus; Noel, Stefan; Bovensmann, Heinrich; Reuter, Max; Burrows, John P.; Jurado Lozano, Pedro Jose; Meijer, Yasjka; Loescher, Armin; Acarreta, Juan R.; Sturm, Philipp; Tesmer, Volker; Sanchez Monero, Ana Maria; Atapuerca Rodreiguez de Dios, Francisco Javier; Toledano Sanchez, Daniel; Boesch, Hartmut

    2016-08-01

The objective of the CarbonSat mission is to improve our knowledge of natural and anthropogenic sources and sinks of CO2 and CH4. CarbonSat was one of the two candidate missions selected for definition studies for Earth Explorer 8 (EE8). The CarbonSat End-to-End Simulator (CSE2ES) simulates the full data flow of the mission with a set of modules embedded in ESA's generic simulation framework OpenSF. A Geometry Module (GM) defines the orbital geometry and related parameters. A Scene Generation Module (SGM) provides simulated radiances and irradiances for the selected scenes. The Level 1 Module (L1M) comprises the instrument simulator and the Level 1b processor, and provides calibrated spectra as its main output; the L1M is implemented in two versions, reflecting the instrument concepts from the two competing industrial system studies. The Level 2 Retrieval Module (L2M) performs the retrieval from the input Level 1b spectra to the atmospheric parameters (CO2 and CH4). In this paper, we show sensitivity studies with respect to atmospheric parameters, simulations along the orbit, and a case study on the detection of a point source emitting carbon dioxide. In summary, the end-to-end simulation with CSE2ES demonstrates the capability of the CarbonSat concept to meet its requirements.

  18. Toward End-to-End Face Recognition Through Alignment Learning

    Science.gov (United States)

    Zhong, Yuanyi; Chen, Jiansheng; Huang, Bo

    2017-08-01

Plenty of effective methods have been proposed for face recognition during the past decade. Although these methods differ essentially in many aspects, a common practice among them is to specifically align the facial area, based on prior knowledge of human face structure, before feature extraction. In most systems the face alignment module is implemented independently, which has caused difficulties in the design and training of end-to-end face recognition models. In this paper we study the possibility of alignment learning in end-to-end face recognition, in which neither prior knowledge of facial landmarks nor artificially defined geometric transformations are required. Specifically, spatial transformer layers are inserted in front of the feature extraction layers in a Convolutional Neural Network (CNN) for face recognition. Only human identity clues are used to drive the neural network to automatically learn the most suitable geometric transformation and the most appropriate facial area for the recognition task. To ensure reproducibility, our model is trained purely on the publicly available CASIA-WebFace dataset and tested on the Labeled Faces in the Wild (LFW) dataset. We achieve a verification accuracy of 99.08%, which is comparable to state-of-the-art single-model based methods.

  19. Context Aware End-to-End Connectivity Management

    CERN Document Server

    Sen, Jaydip; Chandra, M Girish; G., Harihara S; Reddy, Harish

    2010-01-01

    In a dynamic heterogeneous environment, such as pervasive and ubiquitous computing, context-aware adaptation is a key concept to meet the varying requirements of different users. Connectivity is an important context source that can be utilized for optimal management of diverse networking resources. Application QoS (Quality of service) is another important issue that should be taken into consideration for design of a context-aware system. This paper presents connectivity from the view point of context awareness, identifies various relevant raw connectivity contexts, and discusses how high-level context information can be abstracted from the raw context information. Further, rich context information is utilized in various policy representation with respect to user profile and preference, application characteristics, device capability, and network QoS conditions. Finally, a context-aware end-to-end evaluation algorithm is presented for adaptive connectivity management in a multi-access wireless network. Unlike t...

  20. Euclid end-to-end straylight performance assessment

    Science.gov (United States)

    Gaspar Venancio, Luis M.; Pachot, Charlotte; Carminati, Lionel; Lorenzo Alvarez, Jose; Amiaux, Jérôme; Prieto, Eric; Bonino, Luciana; Salvignol, Jean-Christophe; Short, Alex; Boenke, Tobias; Strada, Paulo; Laureijs, Rene

    2016-07-01

In the Euclid mission, straylight was identified at an early stage as the main driver of the telescope's final imaging quality. Assessing by simulation the final straylight in the focal planes of both instruments in Euclid's payload has required a complex workflow involving all stakeholders in the mission, from industry to the scientific community. The straylight is defined as a Normalized Detector Irradiance (NDI), a convenient definition that separates the contributions of the telescope and of the instruments. The end-to-end straylight of the payload is then simply the sum of the NDIs of the telescope and of each instrument. The NDIs for both instruments are presented in this paper for photometry and spectrometry.

  1. END-TO-END INDIA-UK TRANSNATIONAL WIRELESS TESTBED

    Directory of Open Access Journals (Sweden)

    Rohit Budhiraja

    2011-06-01

Wireless communication is a fast-growing technology area in which a tremendous amount of research is ongoing, and one where the use of technology in the market has had wide and far-reaching impact. The India-UK Advanced Technology Centre initiative is a collaborative research project between various institutes and companies across the UK and India which envisages, apart from several research outcomes, putting in place a support infrastructure for facilitating R&D of next-generation networks, systems and services. As part of this project, an end-to-end trans-national advanced wireless testbed is being developed to facilitate and support research and implementation of new ideas, concepts and technologies. The testbed provides a framework that can be used to rapidly prototype and evaluate emerging concepts and technologies, and enables researchers to investigate and demonstrate the feasibility of new ideas in a realistic test environment. The testbed complements the analytical and simulation-based studies undertaken when new ideas are first proposed. This paper gives the details of the testbed and shows how a 4G technology, LTE, has been implemented as one realisation of the testbed.

  2. CHEETAH: circuit-switched high-speed end-to-end transport architecture

    Science.gov (United States)

    Veeraraghavan, Malathi; Zheng, Xuan; Lee, Hyuk; Gardner, M.; Feng, Wuchun

    2003-10-01

    Leveraging the dominance of Ethernet in LANs and SONET/SDH in MANs and WANs, we propose a service called CHEETAH (Circuit-switched High-speed End-to-End Transport ArcHitecture). The service concept is to provide end hosts with high-speed, end-to-end circuit connectivity on a call-by-call shared basis, where a "circuit" consists of Ethernet segments at the ends that are mapped into Ethernet-over-SONET long-distance circuits. This paper focuses on the file-transfer application for such circuits. For this application, the CHEETAH service is proposed as an add-on to the primary Internet access service already in place for enterprise hosts. This allows an end host that is sending a file to first attempt setting up an end-to-end Ethernet/EoS circuit, and if rejected, fall back to the TCP/IP path. If the circuit setup is successful, the end host will enjoy a much shorter file-transfer delay than on the TCP/IP path. To determine the conditions under which an end host with access to the CHEETAH service should attempt circuit setup, we analyze mean file-transfer delays as a function of call blocking probability in the circuit-switched network, probability of packet loss in the IP network, round-trip times, link rates, and so on.
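
The circuit-attempt decision described above reduces to an expected-delay comparison. The sketch below is a hypothetical simplification (the setup attempt cost is paid whether or not the call is blocked, with immediate TCP fallback), not the paper's full analysis of blocking probabilities, packet loss, and round-trip times:

```python
def expected_circuit_first_delay(p_block, d_circuit, d_setup, d_tcp):
    """Mean file-transfer delay when the host first attempts circuit setup
    and falls back to the TCP/IP path if the call is blocked.
    All parameter names are illustrative, not from the paper."""
    return (1 - p_block) * (d_setup + d_circuit) + p_block * (d_setup + d_tcp)

def should_try_circuit(p_block, d_circuit, d_setup, d_tcp):
    """Attempt circuit setup only if it lowers the expected delay
    relative to going straight to TCP/IP."""
    return expected_circuit_first_delay(p_block, d_circuit, d_setup, d_tcp) < d_tcp
```

With a low blocking probability and a fast circuit, attempting setup wins; with high blocking or costly setup, going straight to TCP/IP is better, which is the trade-off the CHEETAH analysis quantifies.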

  3. A Bayes Theory-Based Modeling Algorithm to End-to-end Network Traffic

    Directory of Open Access Journals (Sweden)

    Zhao Hong-hao

    2016-01-01

Recently, network traffic has been increasing exponentially due to all kinds of applications, such as the mobile Internet, smart cities, smart transportation, and the Internet of Things, so end-to-end network traffic has become more important for traffic engineering. End-to-end traffic estimation is usually highly difficult. This paper proposes a Bayes theory-based method to model end-to-end network traffic. First, the end-to-end network traffic is described as an independent, identically distributed normal process. Bayes' theorem is then used to characterize the end-to-end network traffic; estimating the parameters determines the model. Simulation results show that our approach is feasible and effective.
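
The modeling step described above corresponds to the standard conjugate Bayes update for the mean of a normal process with known variance; the numbers below are illustrative, not from the paper:

```python
def posterior_normal_mean(prior_mu, prior_var, samples, noise_var):
    """Conjugate Bayes update for the mean of an i.i.d. normal process
    with known observation variance: precisions add, and the posterior
    mean is a precision-weighted blend of prior and data."""
    n = len(samples)
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mu = post_var * (prior_mu / prior_var + sum(samples) / noise_var)
    return post_mu, post_var

# Illustrative traffic estimate (e.g. Mb/s): vague prior, three observations.
mu, var = posterior_normal_mean(100.0, 400.0, [120.0, 130.0, 110.0], 100.0)
```

Each observation tightens the posterior (smaller variance) and pulls the estimated mean traffic level from the prior toward the sample mean.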

  4. VisualCommander for Rapid End-to-End Mission Design and Simulation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal is for the development of a highly extensible and user-configurable software application for end-to-end mission simulation and design. We will leverage...

  5. Model outputs - Developing end-to-end models of the Gulf of California

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the northern Gulf of California, linking oceanography, biogeochemistry, food web...

  6. Simulation study on delay of end-to-end data communication for protective relaying in substations

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

The end-to-end delay of protective relaying data flow in a substation was studied by dynamic simulation modeling. The distribution characteristics of protective relaying data flow and the constitution of the end-to-end delay of messages were analyzed. A simulation model for digital communication between protective relaying equipment and bay-level monitoring equipment is proposed, and the end-to-end delay of protective relaying data flow in different network configurations is analyzed. It is found that the size and interval of data frames, background link utilization, and higher-layer protocols are the key factors for real-time performance. Detailed analysis results are presented, and a network configuration is proposed to reduce the end-to-end delay of protective relaying data flow.

  7. Physical oceanography - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  8. Atlantis model outputs - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  9. Sleep/wake scheduling scheme for minimizing end-to-end delay in multi-hop wireless sensor networks

    Directory of Open Access Journals (Sweden)

    Madani Sajjad

    2011-01-01

We present a sleep/wake scheduling protocol for minimizing end-to-end delay in event-driven multi-hop wireless sensor networks. In contrast to generic sleep/wake scheduling schemes, our proposed algorithm performs scheduling that depends on traffic loads. Nodes adapt their sleep/wake schedules to traffic loads in response to three important factors: (a) the distance of the node from the sink node, (b) the importance of the node's location from a connectivity perspective, and (c) whether the node is in the proximity of an occurring event. Using these heuristics, the proposed scheme reduces end-to-end delay and maximizes throughput by minimizing congestion at nodes with heavy traffic loads. Simulations are carried out to evaluate the performance of the proposed protocol against the S-MAC and Anycast protocols. Simulation results demonstrate that the proposed protocol significantly reduces end-to-end delay and improves other QoS parameters, such as average energy per packet, average delay, packet loss ratio, throughput, and coverage lifetime.

  10. End-to-end test of spatial accuracy in Gamma Knife treatments for trigeminal neuralgia

    Energy Technology Data Exchange (ETDEWEB)

    Brezovich, Ivan A., E-mail: ibrezovich@uabmc.edu; Wu, Xingen; Duan, Jun; Popple, Richard A.; Shen, Sui; Benhabib, Sidi; Huang, Mi; Christian Dobelbower, M. [Department of Radiation Oncology, University of Alabama at Birmingham, Birmingham, Alabama 35249 (United States); Fisher III, Winfield S. [Department of Neurosurgery, University of Alabama at Birmingham, Birmingham, Alabama 35249 (United States)

    2014-11-01

    Purpose: Spatial accuracy is most crucial when small targets like the trigeminal nerve are treated. Although current quality assurance procedures typically verify that individual apparatus, like the MRI scanner, CT scanner, Gamma Knife, etc., are meeting specifications, the cumulative error of all equipment and procedures combined may exceed safe margins. This study uses an end-to-end approach to assess the overall targeting errors that may have occurred in individual patients previously treated for trigeminal neuralgia. Methods: The trigeminal nerve is simulated by a 3 mm long, 3.175 mm (1/8 in.) diameter MRI-contrast filled cavity embedded within a PMMA plastic capsule. The capsule is positioned within the head frame such that the location of the cavity matches the Gamma Knife coordinates of an arbitrarily chosen, previously treated patient. Gafchromic EBT2 film is placed at the center of the cavity in coronal and sagittal orientations. The films are marked with a pinprick to identify the cavity center. Treatments are planned for radiation delivery with 4 mm collimators according to MRI and CT scans using the clinical localizer boxes and acquisition protocols. Shots are planned so that the 50% isodose surface encompasses the cavity. Following irradiation, the films are scanned and analyzed. Targeting errors are defined as the distance between the pinprick, which represents the intended target, and the centroid of the 50% isodose line, which is the center of the radiation field that was actually delivered. Results: Averaged over ten patient simulations, targeting errors along the x, y, and z coordinates (patient’s left-to-right, posterior-to-anterior, and head-to-foot) were, respectively, −0.060 ± 0.363, −0.350 ± 0.253, and 0.348 ± 0.204 mm when MRI was used for treatment planning. Planning according to CT exhibited generally smaller errors, namely, 0.109 ± 0.167, −0.191 ± 0.144, and 0.211 ± 0.094 mm. The largest errors along individual axes in MRI
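The targeting-error metric described above (distance between the pinprick mark and the centroid of the 50% isodose region) can be sketched for a 2D film scan as follows. The array layout, threshold handling, and pixel size are illustrative assumptions, not the authors' analysis code:

```python
import numpy as np

def targeting_error(dose, pinprick_xy, pixel_mm):
    """Distance (mm) between the intended target (pinprick) and the centroid
    of the delivered field, taken as the region at or above 50% of max dose."""
    mask = dose >= 0.5 * dose.max()
    ys, xs = np.nonzero(mask)
    centroid = np.array([xs.mean(), ys.mean()])     # pixel coordinates (x, y)
    return np.linalg.norm((centroid - np.asarray(pinprick_xy)) * pixel_mm)

# Synthetic film: a uniform circular field centered at (52, 50), pinprick at (50, 50),
# 0.2 mm pixels -> a 2-pixel offset corresponds to a 0.4 mm targeting error.
yy, xx = np.mgrid[0:100, 0:100]
dose = ((xx - 52) ** 2 + (yy - 50) ** 2 <= 15 ** 2).astype(float)
print(round(targeting_error(dose, (50, 50), pixel_mm=0.2), 2))  # 0.4
```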

  11. Effect of end-to-end invagination pancreaticojejunostomy with circle discontinuous U suture in pancreatic surgery

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xuewen; XUAN Wei; JIANG Tao; JI Degang; YANG Yongsheng; ZHANG Dan; XIE Yingjun; MENG Zihui; ZHAO Jisheng

    2007-01-01

    The aim of this paper is to summarize the methods of pancreaticojejunostomy in pancreatic operations and to study the safety and feasibility of a new operative method, end-to-end invagination pancreaticojejunostomy with circle discontinuous U suture, for preventing fistula of the pancreaticojejunostomy. Eighty-three patients who underwent pancreaticoduodenectomy in the 3rd Hospital, Jilin University from January 2001 to April 2006 were reviewed. The incidences of pancreatic fistula with different types of pancreaticojejunostomy were compared. The overall incidence rate of pancreatic fistula was 26.5% (22/83). No pancreatic fistula occurred with end-to-end invagination pancreaticojejunostomy with circle discontinuous U suture. The incidence rate of fistula following end-to-end invagination pancreaticojejunostomy with circle discontinuous U suture was significantly lower than that of traditional end-to-end pancreaticojejunostomy [40% (10/25), P < 0.01] and end-to-side pancreaticojejunostomy [27.3% (12/44), P < 0.05], but no significant difference (P > 0.05) was found between traditional end-to-end and end-to-side pancreaticojejunostomy. End-to-end invagination pancreaticojejunostomy with circle discontinuous U suture has a definite effect in avoiding pancreatic fistula following pancreaticojejunostomy and is worth recommending. However, the number of cases was limited, so this method still needs further observation and confirmation.

  12. Characterizing End-to-End Delay Performance of Randomized TCP Using an Analytical Model

    Directory of Open Access Journals (Sweden)

    Mohammad Shorfuzzaman

    2016-03-01

    TCP (Transmission Control Protocol) is the main transport protocol used in high-speed networks. In the OSI model, TCP resides in the transport layer and serves as a connection-oriented protocol that performs handshaking to create a connection. In addition, TCP provides end-to-end reliability. There are different standard variants of TCP (e.g., TCP Reno, TCP NewReno) that implement mechanisms to dynamically control the size of the congestion window, but they do not have any control over the sending times of successive packets. TCP pacing introduces the concept of controlling the packet sending time at TCP sources to reduce packet loss in a bursty traffic network. Randomized TCP is a new TCP pacing scheme that has shown better performance (considering throughput and fairness) than other TCP variants in bursty networks. The end-to-end delay of Randomized TCP is a very important performance measure that has not yet been addressed. In current high-speed networks, it is increasingly important to have mechanisms that keep end-to-end delay within an acceptable range. In this paper, we present a performance evaluation of the end-to-end delay of Randomized TCP. To this end, we have used an analytical and a simulation model to characterize the end-to-end delay performance of Randomized TCP.
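The pacing idea in the abstract above can be illustrated with a toy sender that spreads a congestion window over one RTT, perturbing each inter-packet gap at random. This is a generic sketch of randomized pacing; the actual distribution used by Randomized TCP is defined in the paper, not here:

```python
import random

def paced_send_times(cwnd_pkts, rtt_ms, jitter=0.5, seed=42):
    """Spread cwnd packets over one RTT: nominal gap rtt/cwnd, each gap
    perturbed by a uniform random factor to avoid burst synchronization."""
    rng = random.Random(seed)
    gap = rtt_ms / cwnd_pkts
    t, times = 0.0, []
    for _ in range(cwnd_pkts):
        times.append(t)
        t += gap * rng.uniform(1 - jitter, 1 + jitter)
    return times

times = paced_send_times(10, rtt_ms=100.0)
print(len(times), all(b > a for a, b in zip(times, times[1:])))  # 10 True
```

Compared with sending the whole window back-to-back, the randomized spacing smooths queue occupancy at the bottleneck, which is the mechanism behind the loss reduction claimed for pacing schemes.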

  13. Automatic provisioning of end-to-end QoS into the home

    DEFF Research Database (Denmark)

    Brewka, Lukasz Jerzy; Skoldström, Pontus; Nelis, Jelle;

    2011-01-01

    Due to a growing number of high-bandwidth applications today (such as HDTV) and an increasing amount of network- and cloud-based applications, service providers need to pay attention to QoS in their networks. We believe there is a need for an end-to-end approach reaching into the home as well. The Home Gateway (HG), as a key component of the home network, is crucial for enabling end-to-end solutions. UPnP-QoS has been proposed as an in-home solution for resource reservations. In this paper we assess a solution for automatic QoS reservations on behalf of non-UPnP-QoS-aware applications. Additionally, we focus on an integrated end-to-end solution, combining GMPLS-based reservations in, e.g., access/metro networks and UPnP-QoS-based reservations in the home network.

  14. End-to-End Information System design at the NASA Jet Propulsion Laboratory

    Science.gov (United States)

    Hooke, A. J.

    1978-01-01

    Recognizing a pressing need of the 1980s to optimize the two-way flow of information between a ground-based user and a remote space-based sensor, an end-to-end approach to the design of information systems has been adopted at the Jet Propulsion Laboratory. The objectives of this effort are to ensure that all flight projects adequately cope with information flow problems at an early stage of system design, and that cost-effective, multi-mission capabilities are developed when capital investments are made in supporting elements. The paper reviews the End-to-End Information System (EEIS) activity at the Laboratory, and notes the ties to the NASA End-to-End Data System program.

  15. Measuring End-To-End Bandwidth with Iperf Using Web100

    Energy Technology Data Exchange (ETDEWEB)

    Cottrell, Les

    2003-04-30

    End-to-end bandwidth estimation tools like Iperf, though fairly accurate, are intrusive. In this paper, we describe how, with an instrumented TCP stack (Web100), we can estimate the end-to-end bandwidth accurately while consuming significantly less network bandwidth and time. We modified Iperf to use Web100 to detect the end of slow-start and to estimate the end-to-end bandwidth by measuring the amount of data sent for a short period (1 second) after slow-start, when the TCP throughput is relatively stable. We obtained bandwidth estimates differing by less than 10% from those of running Iperf for 20 seconds, with savings of up to 94% in bandwidth estimation time and up to 92% in network traffic.
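The estimate described above (bytes transferred in a short window after slow-start ends, divided by the window length) reduces to a one-line calculation once the stack's byte counters are available. The counter names here are illustrative assumptions, not actual Web100 variable names:

```python
def post_slowstart_bandwidth(bytes_at_end_ss, bytes_after_window, window_s=1.0):
    """Estimate end-to-end bandwidth (Mbit/s) from the bytes transferred in a
    short window after slow-start, when TCP throughput is relatively stable."""
    return (bytes_after_window - bytes_at_end_ss) * 8 / window_s / 1e6

# 12.5 MB transferred in the 1-second window after slow-start -> 100 Mbit/s.
print(post_slowstart_bandwidth(50_000_000, 62_500_000))  # 100.0
```

The savings come from stopping the transfer after this short window instead of running a full 20-second Iperf test.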

  16. An end-to-end communications architecture for condition-based maintenance applications

    Science.gov (United States)

    Kroculick, Joseph

    2014-06-01

    This paper explores challenges in implementing an end-to-end communications architecture for Condition-Based Maintenance Plus (CBM+) data transmission which aligns with the Army's Network Modernization Strategy. The Army's Network Modernization strategy is based on rolling out network capabilities which connect the smallest unit and Soldier level to enterprise systems. CBM+ is a continuous improvement initiative over the life cycle of a weapon system or equipment to improve the reliability and maintenance effectiveness of Department of Defense (DoD) systems. CBM+ depends on the collection, processing and transport of large volumes of data. An important capability that enables CBM+ is an end-to-end network architecture that enables data to be uploaded from the platform at the tactical level to enterprise data analysis tools. To connect end-to-end maintenance processes in the Army's supply chain, a CBM+ network capability can be developed from available network capabilities.

  17. IPTV Resource and Performance Management using End-to-End Available Bandwidth Estimation Techniques

    OpenAIRE

    2014-01-01

    Over-The-Top IPTV services have seen a huge increase in popularity in recent years. This fact, coupled with the ever increasing resource requirements of IPTV services, has created a necessity for efficient and effective management of these IPTV services. This thesis presents contributions and findings on the use of end-to-end Available Bandwidth estimation to help govern Over-The-Top IPTV service delivery. An examination is presented of the conditions under which end-to-end Avail...

  18. Coupling of a single quantum emitter to end-to-end aligned silver nanowires

    DEFF Research Database (Denmark)

    Kumar, Shailesh; Huck, Alexander; Chen, Yuntian;

    2013-01-01

    observe an enhancement of the NV-centers' decay rate in both cases as a result of the coupling to the plasmons. The devices are nano-assembled with a scanning probe technique. Through simulations, we show that end-to-end aligned silver nanowires can be used as a controllable splitter for emission from...

  19. A Robust Method to Integrate End-to-End Mission Architecture Optimization Tools

    Science.gov (United States)

    Lugo, Rafael; Litton, Daniel; Qu, Min; Shidner, Jeremy; Powell, Richard

    2016-01-01

    End-to-end mission simulations include multiple phases of flight. For example, an end-to-end Mars mission simulation may include launch from Earth, interplanetary transit to Mars, and entry, descent and landing. Each phase of flight is optimized to meet specified constraints and often depends on and impacts subsequent phases. The design and optimization tools and methodologies used to combine different aspects of the end-to-end framework, and their impact on mission planning, are presented. This work focuses on a robust implementation of a Multidisciplinary Design Analysis and Optimization (MDAO) method that offers the flexibility to quickly adapt to changing mission design requirements. Different simulations tailored to the liftoff, ascent, and atmospheric entry phases of a trajectory are integrated and optimized in the MDAO program Isight, which provides the user a graphical interface to link simulation inputs and outputs. This approach provides many advantages to mission planners, as it is easily adapted to different mission scenarios and can improve the understanding of the integrated system performance within a particular mission configuration. A Mars direct entry mission using the Space Launch System (SLS) is presented as a generic end-to-end case study. For the given launch period, the SLS launch performance is traded for improved orbit geometry alignment, resulting in an optimized net payload that is comparable to that in the SLS Mission Planner's Guide.

  20. End-To-End Esophagojejunostomy Versus Standard End-To-Side Esophagojejunostomy: Which One Is Preferable?

    Directory of Open Access Journals (Sweden)

    A. Kavyani

    2007-11-01

    Background: End-to-side esophagojejunostomy has almost always been associated with some degree of dysphagia. To overcome this complication we decided to perform an end-to-end anastomosis and compare it with end-to-side Roux-en-Y esophagojejunostomy. Methods: In this prospective study, between 1998 and 2005, 71 patients with a diagnosis of gastric adenocarcinoma underwent total gastrectomy. Standard esophagojejunostomy in an end-to-side fashion was performed in 41 patients and compared with our recommended technique of end-to-end esophagojejunostomy in 30 patients. Results: This study showed that esophagojejunostomy in an end-to-end fashion has a low incidence of postoperative dysphagia (33.3%), whereas in those with an end-to-side anastomosis the rate of dysphagia is very high (83%). Conclusion: A Roux-en-Y esophagojejunostomy with an end-to-end anastomosis has a low incidence of postoperative dysphagia, and we strongly recommend using this technique.

  1. End-to-End Availability Analysis of IMS-Based Networks

    DEFF Research Database (Denmark)

    Kamyod, Chayapol; Nielsen, Rasmus Hjorth; Prasad, Neeli R.

    2013-01-01

    Generation Networks (NGNs). In this paper, an end-to-end availability model is proposed and evaluated using a combination of Reliability Block Diagrams (RBD) and a proposed five-state Markov model. The overall availability for intra- and inter domain communication in IMS is analyzed, and the state...

  2. Location Assisted Vertical Handover Algorithm for QoS Optimization in End-to-End Connections

    DEFF Research Database (Denmark)

    Dam, Martin S.; Christensen, Steffen R.; Mikkelsen, Lars M.

    2012-01-01

    implementation on Android based tablets. The simulations cover a wide range of scenarios for two mobile users in an urban area with ubiquitous cellular coverage, and shows our algorithm leads to increased throughput, with fewer handovers, when considering the end-to-end connection than to other handover schemes...

  3. Security Considerations around End-to-End Security in the IP-based Internet of Things

    NARCIS (Netherlands)

    Brachmann, M.; Garcia-Mochon, O.; Keoh, S.L.; Kumar, S.S.

    2012-01-01

    The IP-based Internet of Things refers to the interconnection of smart objects in a Low-power and Lossy Network (LLN) with the Internet by means of protocols such as 6LoWPAN or CoAP. The provisioning of an end-to-end security connection is the key to ensuring basic functionalities such as software upda

  4. End-to-end Configuration of Wireless Realtime Communication over Heterogeneous Protocols

    DEFF Research Database (Denmark)

    Malinowsky, B.; Grønbæk, Jesper; Schwefel, Hans-Peter

    2015-01-01

    This paper describes a wireless real-time communication system design using two Time Division Multiple Access (TDMA) protocols. Messages are subject to prioritization and queuing. For this interoperation scenario, we show a method for end-to-end configuration of protocols and queue sizes. Such co...

  5. End-to-end self-assembly of gold nanorods in isopropanol solution: experimental and theoretical studies

    Science.gov (United States)

    Gordel, M.; Piela, K.; Kołkowski, R.; Koźlecki, T.; Buckle, M.; Samoć, M.

    2015-12-01

    We describe here a modification of properties of colloidal gold nanorods (NRs) resulting from the chemical treatment used to carry out their transfer into isopropanol (IPA) solution. The NRs acquire a tendency to attach one to another by their ends (end-to-end assembly). We focus on the investigation of the change in position and shape of the longitudinal surface plasmon (l-SPR) band after self-assembly. The experimental results are supported by a theoretical calculation, which rationalizes the dramatic change in optical properties when the NRs are positioned end-to-end at short distances. The detailed spectroscopic characterization performed at the consecutive stages of transfer of the NRs from water into IPA solution revealed the features of the interaction between the polymers used as ligands and their contribution to the final stage, when the NRs were dispersed in IPA solution. The efficient method of aligning the NRs detailed here may facilitate applications of the self-assembled NRs as building blocks for optical materials and biological sensing.

  6. End-to-end self-assembly of gold nanorods in isopropanol solution: experimental and theoretical studies

    Energy Technology Data Exchange (ETDEWEB)

    Gordel, M., E-mail: marta.gordel@pwr.edu.pl [Wrocław University of Technology, Advanced Materials Engineering and Modelling Group, Faculty of Chemistry (Poland); Piela, K., E-mail: katarzyna.piela@pwr.edu.pl [Wrocław University of Technology, Department of Physical and Quantum Chemistry (Poland); Kołkowski, R. [Wrocław University of Technology, Advanced Materials Engineering and Modelling Group, Faculty of Chemistry (Poland); Koźlecki, T. [Wrocław University of Technology, Department of Chemical Engineering, Faculty of Chemistry (Poland); Buckle, M. [CNRS, École Normale Supérieure de Cachan, Laboratoire de Biologie et Pharmacologie Appliquée (France); Samoć, M. [Wrocław University of Technology, Advanced Materials Engineering and Modelling Group, Faculty of Chemistry (Poland)

    2015-12-15

    We describe here a modification of properties of colloidal gold nanorods (NRs) resulting from the chemical treatment used to carry out their transfer into isopropanol (IPA) solution. The NRs acquire a tendency to attach one to another by their ends (end-to-end assembly). We focus on the investigation of the change in position and shape of the longitudinal surface plasmon (l-SPR) band after self-assembly. The experimental results are supported by a theoretical calculation, which rationalizes the dramatic change in optical properties when the NRs are positioned end-to-end at short distances. The detailed spectroscopic characterization performed at the consecutive stages of transfer of the NRs from water into IPA solution revealed the features of the interaction between the polymers used as ligands and their contribution to the final stage, when the NRs were dispersed in IPA solution. The efficient method of aligning the NRs detailed here may facilitate applications of the self-assembled NRs as building blocks for optical materials and biological sensing.

  7. Building dialogue POMDPs from expert dialogues an end-to-end approach

    CERN Document Server

    Chinaei, Hamidreza

    2016-01-01

    This book discusses the Partially Observable Markov Decision Process (POMDP) framework applied in dialogue systems. It presents POMDP as a formal framework to represent uncertainty explicitly while supporting automated policy solving. The authors propose and implement an end-to-end learning approach for dialogue POMDP model components. Starting from scratch, they present the state, the transition model, the observation model and finally the reward model from unannotated and noisy dialogues. Altogether these form a significant set of contributions that can potentially inspire substantial further work. This concise manuscript is written in simple language, full of illustrative examples, figures, and tables. It provides insights on building dialogue systems to be applied in real domains; illustrates learning dialogue POMDP model components from unannotated dialogues in a concise format; and introduces an end-to-end approach that makes use of unannotated and noisy dialogue for learning each component of dialogue POM...

  8. End-to-end delay analysis in wireless sensor networks with service vacation

    KAUST Repository

    Alabdulmohsin, Ibrahim

    2014-04-01

    In this paper, a delay-sensitive multi-hop wireless sensor network is considered, employing an M/G/1-with-vacations framework. Sensors transmit measurements to a predefined data sink subject to a maximum end-to-end delay constraint. In order to prolong battery lifetime, a sleeping scheme is adopted throughout the network nodes. The objective of our proposed framework is to present an expression for the maximum hop count as well as an approximate expression for the probability of blocking at the sink node upon violation of a certain end-to-end delay threshold. Using numerical simulations, we validate the proposed analytical model and demonstrate that the blocking probability of the system for various vacation time distributions matches the simulation results.
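Under the M/G/1-with-vacations framework named above, the classical mean waiting time decomposes into the Pollaczek-Khinchine term plus a residual-vacation term. The sketch below applies that textbook decomposition to derive a per-hop delay and a maximum hop count; it is an illustration of the modeling framework, not the paper's exact derivation:

```python
def mg1_vacation_delay(lam, ES, ES2, EV, EV2):
    """Mean per-hop sojourn time for M/G/1 with multiple vacations:
    P-K waiting time + residual vacation time + mean service time."""
    rho = lam * ES
    assert rho < 1, "queue must be stable"
    wait = lam * ES2 / (2 * (1 - rho)) + EV2 / (2 * EV)
    return wait + ES

def max_hops(delay_budget, per_hop_delay):
    """Largest hop count meeting an end-to-end delay constraint."""
    return int(delay_budget // per_hop_delay)

# Exponential service, mean 10 ms (so E[S^2] = 2*E[S]^2 = 200);
# deterministic 20 ms vacations (E[V^2] = E[V]^2 = 400); arrival rate 0.05/ms.
d = mg1_vacation_delay(lam=0.05, ES=10.0, ES2=200.0, EV=20.0, EV2=400.0)
print(round(d, 2), max_hops(500.0, d))  # 30.0 16
```

With a 500 ms end-to-end budget and 30 ms per hop, at most 16 hops satisfy the delay constraint, which is the flavor of bound the abstract describes.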

  9. Providing end-to-end QoS for multimedia applications in 3G wireless networks

    Science.gov (United States)

    Guo, Katherine; Rangarajan, Samapth; Siddiqui, M. A.; Paul, Sanjoy

    2003-11-01

    As the usage of wireless packet data services increases, wireless carriers today are faced with the challenge of offering multimedia applications with QoS requirements within current 3G data networks. End-to-end QoS requires support at the application, network, link and medium access control (MAC) layers. We discuss existing CDMA2000 network architecture and show its shortcomings that prevent supporting multiple classes of traffic at the Radio Access Network (RAN). We then propose changes in RAN within the standards framework that enable support for multiple traffic classes. In addition, we discuss how Session Initiation Protocol (SIP) can be augmented with QoS signaling for supporting end-to-end QoS. We also review state of the art scheduling algorithms at the base station and provide possible extensions to these algorithms to support different classes of traffic as well as different classes of users.

  10. Chaos Based Joint Compression and Encryption Framework for End-to-End Communication Systems

    Directory of Open Access Journals (Sweden)

    Nidhi Goel

    2014-01-01

    Augmentation in communication and coding technology has made encryption an integral part of secure multimedia communication systems. A security solution for end-to-end image transmission requires content adaptation at intermediate nodes, which consumes significant resources to decrypt, process, and re-encrypt the secured data. To save computational resources, this paper proposes a network-friendly encryption technique which can be implemented transparently to content adaptation techniques. The proposed encryption technique maintains the compression efficiency of the underlying entropy coder and enables the processing of encrypted data. Thorough analysis of the technique, with regard to various standard evaluation parameters and attack scenarios, demonstrates its ability to withstand known-plaintext, ciphertext-only, and approximation attacks. This justifies its implementation for secure image transmission in end-to-end communication systems.
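A minimal example of the chaotic-keystream idea such schemes build on is the logistic map. This is a generic textbook sketch, not the paper's cipher, and it is not cryptographically secure on its own:

```python
def logistic_keystream(x0, n, r=3.99):
    """Generate n pseudo-random bytes by iterating the logistic map
    x <- r*x*(1-x) and quantizing each iterate to 8 bits."""
    x, out = x0, bytearray()
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)

def xor_cipher(data, key0):
    """Encrypt/decrypt by XOR with the chaotic keystream (symmetric)."""
    ks = logistic_keystream(key0, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))

msg = b"end-to-end"
ct = xor_cipher(msg, 0.3141592653)
print(xor_cipher(ct, 0.3141592653) == msg)  # True
```

The initial condition x0 acts as the shared key: tiny changes in x0 produce an entirely different keystream, which is the sensitivity property chaos-based ciphers exploit.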

  11. End-to-end calculation of the radiation characteristics of VVER-1000 spent fuel assemblies

    Science.gov (United States)

    Linge, I. I.; Mitenkova, E. F.; Novikov, N. V.

    2012-12-01

    The results of end-to-end calculation of the radiation characteristics of VVER-1000 spent nuclear fuel are presented. Details of formation of neutron and gamma-radiation sources are analyzed. Distributed sources of different types of radiation are considered. A comparative analysis of calculated radiation characteristics is performed with the use of nuclear data from different ENDF/B and EAF files and ANSI/ANS and ICRP standards.

  12. An End-to-End QoS Control Model for Enhanced Internet

    Institute of Scientific and Technical Information of China (English)

    张尧学; 王晓春; 顾钧

    2000-01-01

    This paper describes an end-to-end QoS (Quality of Services) control model for distributed multimedia computing on the enhanced Internet and gives the design and implementation of this model, including hosts and routers. The architecture, the mathematical definition of QoS parameters, and the mapping between Integrated Services (IS) and Differentiated Services (DS) are discussed in this paper. The simulation shows that this model can improve the performance of audio streams when used in the IPhone system.

  13. Financing the End-to-end Supply Chain: A Reference Guide to Supply Chain Finance

    OpenAIRE

    Templar, Simon; Hofmann, Erik; Findlay, Charles

    2016-01-01

    Financing the End to End Supply Chain provides readers with a real insight into the increasingly important area of supply chain finance. It demonstrates the importance of the strategic relationship between the physical supply of goods and services and the associated financial flows. The book provides a clear introduction, demonstrating the importance of the strategic relationship between supply chain and financial communities within an organization. It contains vital information on how supply...

  14. Performance Enhancements of UMTS networks using end-to-end QoS provisioning

    DEFF Research Database (Denmark)

    Wang, Haibo; Prasad, Devendra; Teyeb, Oumer;

    2005-01-01

    This paper investigates end-to-end (E2E) quality of service (QoS) provisioning approaches for UMTS networks together with a DiffServ IP network. The effort was put on QoS class mapping from DiffServ to UMTS, Access Control (AC), and buffering and scheduling optimization. The DiffServ Code Point (DSCP...... algorithm provides flexible and efficient QoS guarantees for multiple UMTS classes.

  15. Towards End-to-End Learning for Dialog State Tracking and Management using Deep Reinforcement Learning

    OpenAIRE

    Zhao, Tiancheng; Eskenazi, Maxine

    2016-01-01

    This paper presents an end-to-end framework for task-oriented dialog systems using a variant of Deep Recurrent Q-Networks (DRQN). The model is able to interface with a relational database and jointly learn policies for both language understanding and dialog strategy. Moreover, we propose a hybrid algorithm that combines the strength of reinforcement learning and supervised learning to achieve faster learning speed. We evaluated the proposed model on a 20 Question Game conversational game simu...

  16. Testing Application (End-to-End Performance of Networks With EFT Traffic

    Directory of Open Access Journals (Sweden)

    Vlatko Lipovac

    2009-01-01

    This paper studies how end-to-end application performance (of Electronic Financial Transaction traffic, in particular) depends on the actual protocol stacks, operating systems, and network transmission rates. In this respect, simulation tests of the performance of TCP and UDP protocols running on various operating systems, ranging from Windows and Sun Solaris to Linux, have been implemented, and the differences in performance are addressed, focusing on throughput and response time.

  17. The International Space Station Alpha (ISSA) End-to-End On-Orbit Maintenance Process Flow

    Science.gov (United States)

    Zingrebe, Kenneth W., II

    1995-01-01

    As a tool for construction and refinement of the on-orbit maintenance system to sustain the International Space Station Alpha (ISSA), the Mission Operations Directorate (MOD) developed an end-to-end on-orbit maintenance process flow. This paper discusses and demonstrates that process flow. This tool is being used by MOD to identify areas which require further work in preparation for MOD's role in the conduct of on-orbit maintenance operations.

  18. Direct muscle neurotization after end-to end and end-to-side neurorrhaphy

    Science.gov (United States)

    Papalia, Igor; Ronchi, Giulia; Muratori, Luisa; Mazzucco, Alessandra; Magaudda, Ludovico; Geuna, Stefano

    2012-01-01

    The need for the continuous research of new tools for improving motor function recovery after nerve injury is justified by the still often unsatisfactory clinical outcome in these patients. It has been previously shown that the combined use of two reconstructive techniques, namely end-to-side neurorrhaphy and direct muscle neurotization in the rat hindlimb model, can lead to good results in terms of skeletal muscle reinnervation. Here we show that, in the rat forelimb model, the combined use of direct muscle neurotization with either end-to-end or end-to-side neurorrhaphy to reinnervate the denervated flexor digitorum muscles, leads to muscle atrophy prevention over a long postoperative time lapse (10 months). By contrast, very little motor recovery (in case of end-to-end neurorrhaphy) and almost no motor recovery (in case of end-to-side neurorrhaphy) were observed in the grasping activity controlled by flexor digitorum muscles. It can thus be concluded that, at least in the rat, direct muscle neurotization after both end-to-end and end-to-side neurorrhaphy represents a good strategy for preventing denervation-related muscle atrophy but not for regaining the lost motor function. PMID:25538749

  19. An end-to-end robust approach for scalable video over the Internet

    Institute of Scientific and Technical Information of China (English)

    WANG Guijin; ZHANG Qian; ZHU Wenwu; LIN Xinggang

    2004-01-01

    This paper introduces an end-to-end robust approach for scalable video over the Internet. The traditional method considers only congestion control and error control, and is unable to achieve end-to-end high-quality video transmission in an error-prone environment like the Internet, since it does not consider packetization behavior, network conditions, and media characteristics simultaneously. This paper presents an end-to-end approach for scalable video over the Internet, combining network-adaptive congestion control and unequal error control. Considering the requirements of multimedia transmission, this paper introduces multimedia congestion control to estimate the available bandwidth and smooth the media sending rate. Specifically, in the transport layer we propose an unequal interleaving packetization method and an unequal error protection scheme, which can alleviate the effect of packet loss well. Further, we develop rate-distortion theory for scalable video over the Internet. Thereafter, optimal bit allocation is presented to determine the bit budgets for the source part and the error control part. Simulation shows our scheme can achieve good performance for scalable video over the Internet.

  20. Sutureless end-to-end bowel anastomosis in rabbit using low-power CO2 laser

    Institute of Scientific and Technical Information of China (English)

    Zhong Rong Li; Yong Long Chi; Run Cong Ke

    2000-01-01

    The use of laser energy to weld biological tissues and produce sutureless anastomosis has advantages over conventional silk-sutured anastomosis, as first reported for small vessels[1] and fallopian tubes[2] in the late 1970s. Since then, investigators have welded a larger variety of tissues[3-13] and have expanded its application to welding trials of enterotomies of rabbit and rat small intestine[14-17]. Sauer et al[18] reported results from Nd:YAG laser reconstruction of end-to-end welding in rabbit small intestine. Recently, controlled temperature during YAG and argon laser-assisted welding of enterotomies of rabbit and rat was implemented to eliminate exponential increases in the rate of denaturation associated with rapidly increasing temperature[19,20]. Yet there has been no report of sutureless end-to-end bowel anastomosis using a low-power CO2 laser. This is a report of a circumferential end-to-end laser-welded bowel anastomosis in rabbit using 3 different CO2 laser powers, to explore the feasibility of CO2 laser welding of circumferential intestinal tissue and to determine the optimal laser-welding parameters. The appropriate CO2 laser power was then chosen to weld bowels in rabbit, and its long-term healing effect was evaluated.

  1. QoS Modeling for End-to-End Performance Evaluation over Networks with Wireless Access

    Directory of Open Access Journals (Sweden)

    Gerardo Gómez

    2010-01-01

    Full Text Available This paper presents an end-to-end Quality of Service (QoS) model for assessing the performance of data services over networks with wireless access. The proposed model deals with performance degradation across protocol layers using a bottom-up strategy, starting with the physical layer and moving up to the application layer. This approach makes it possible to analytically assess performance at different layers, thereby facilitating a possible end-to-end optimization process. As a representative case, a scenario in which a set of mobile terminals is connected to a streaming server through an IP access node was studied. UDP, TCP, and the new TCP-Friendly Rate Control (TFRC) protocol were analyzed at the transport layer. The radio interface consisted of a variable-rate multiuser and multichannel subsystem, including retransmissions and adaptive modulation and coding. The proposed analytical QoS model was validated on a real-time emulator of an end-to-end network with wireless access and proved to be very useful for service performance estimation and optimization.
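The bottom-up, layer-by-layer composition described above can be illustrated with a minimal sketch (not the paper's actual model): an assumed physical-layer block error rate is reduced by link-layer retransmissions, and the residual loss is fed into the well-known Mathis approximation for steady-state TCP throughput. All numeric parameters here are illustrative assumptions.

```python
import math

def tcp_throughput_mathis(mss_bytes, rtt_s, loss_prob):
    """Mathis et al. approximation of steady-state TCP throughput (bytes/s)."""
    if loss_prob <= 0:
        raise ValueError("loss probability must be positive")
    return (mss_bytes / rtt_s) * math.sqrt(1.5 / loss_prob)

def residual_loss(bler, max_retx):
    """Residual packet loss after link-layer retransmissions,
    assuming independent transmission attempts."""
    return bler ** (max_retx + 1)

# Bottom-up: physical-layer BLER -> link-layer residual loss -> TCP throughput
bler = 0.1                             # raw block error rate (assumed)
p = residual_loss(bler, max_retx=2)    # 0.1^3 = 1e-3 seen by the transport layer
rate = tcp_throughput_mathis(mss_bytes=1460, rtt_s=0.1, loss_prob=p)
print(f"residual loss = {p:.1e}, TCP rate ≈ {rate / 1e6:.2f} MB/s")
```

Replacing the transport stage with a TFRC rate equation would follow the same pattern: each layer consumes the loss and delay statistics exposed by the layer below.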

  2. End-to-end modeling: a new modular and flexible approach

    Science.gov (United States)

    Genoni, M.; Riva, M.; Landoni, M.; Pariani, G.

    2016-08-01

    In this paper we present an innovative philosophy for developing the End-to-End model for astronomical observation projects, i.e. the architecture which allows physical modeling of the whole system from the light source to the reduced data. This philosophy foresees developing the physical model of the different modules that compose the entire End-to-End system directly during the project design phase. The approach is strongly characterized by modularity and flexibility; these aspects will be of particular importance in next-generation astronomical observation projects like the E-ELT (European Extremely Large Telescope), because of their high complexity and long design and development times. With this approach it will be possible to keep the whole system and its different modules efficiently under control during every project phase, and to exploit a reliable system-engineering-level tool for evaluating the effects on final performance of both the main parameters and different instrument architectures and technologies. This philosophy will also allow the scientific community to perform simulations and tests on the science drivers in advance. This will translate into continuous feedback to the (system) design process, with a resulting improvement in the effectively achievable scientific goals, and a consistent tool for efficiently planning observation proposals and programs. We present the application case for this End-to-End modeling technique: the high-resolution spectrograph at the E-ELT (E-ELT HIRES). In particular, we present the definition of the system's modular architecture, describing the interface parameters of the modules.

  3. Screening California Current fishery management scenarios using the Atlantis end-to-end ecosystem model

    Science.gov (United States)

    Kaplan, Isaac C.; Horne, Peter J.; Levin, Phillip S.

    2012-09-01

    End-to-end marine ecosystem models link climate and oceanography to the food web and human activities. These models can be used as forecasting tools, to strategically evaluate management options and to support ecosystem-based management. Here we report the results of such forecasts in the California Current, using an Atlantis end-to-end model. We worked collaboratively with fishery managers at NOAA’s regional offices and staff at the National Marine Sanctuaries (NMS) to explore the impact of fishery policies on management objectives at different spatial scales, from single Marine Sanctuaries to the entire Northern California Current. In addition to examining Status Quo management, we explored the consequences of several gear switching and spatial management scenarios. Of the scenarios that involved large scale management changes, no single scenario maximized all performance metrics. Any policy choice would involve trade-offs between stakeholder groups and policy goals. For example, a coast-wide 25% gear shift from trawl to pot or longline appeared to be one possible compromise between an increase in spatial management (which sacrificed revenue) and scenarios such as the one consolidating bottom impacts to deeper areas (which did not perform substantially differently from Status Quo). Judged on a coast-wide scale, most of the scenarios that involved minor or local management changes (e.g. within Monterey Bay NMS only) yielded results similar to Status Quo. When impacts did occur in these cases, they often involved local interactions that were difficult to predict a priori based solely on fishing patterns. However, judged on the local scale, deviation from Status Quo did emerge, particularly for metrics related to stationary species or variables (i.e. habitat and local metrics of landed value or bycatch). We also found that isolated management actions within Monterey Bay NMS would cause local fishers to pay a cost for conservation, in terms of reductions in landed

  4. End-to-End Assessment of a Large Aperture Segmented Ultraviolet Optical Infrared (UVOIR) Telescope Architecture

    Science.gov (United States)

    Feinberg, Lee; Bolcar, Matt; Liu, Alice; Guyon, Olivier; Stark, Chris; Arenberg, Jon

    2016-01-01

    Key challenges of a future large aperture, segmented Ultraviolet Optical Infrared (UVOIR) Telescope capable of performing a spectroscopic survey of hundreds of Exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high yield Exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end to end architecture including a high throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an Exo-Earth yield assessment to evaluate potential performance.

  5. Using Network Calculus to compute end-to-end delays in SpaceWire networks

    OpenAIRE

    Ferrandiz, Thomas; Frances, Fabrice; Fraboul, Christian

    2011-01-01

    The SpaceWire network standard is promoted by the ESA and is scheduled to be used as the sole on-board network for future satellites. This network uses a wormhole routing mechanism that can lead to packet blocking in routers and consequently to variable end-to-end delays. As the network will be shared by real-time and non real- time traffic, network designers require a tool to check that temporal constraints are verified for all the critical messages. Network Calculus can be used for evaluati...

  6. END-TO-END DEPTH FROM MOTION WITH STABILIZED MONOCULAR VIDEOS

    Directory of Open Access Journals (Sweden)

    C. Pinard

    2017-08-01

    Full Text Available We propose a depth map inference system from monocular videos, based on a novel navigation dataset that mimics aerial footage from a gimbal-stabilized monocular camera in rigid scenes. Unlike most navigation datasets, the lack of rotation implies an easier structure-from-motion problem, which can be leveraged for tasks such as depth inference and obstacle avoidance. We also propose an architecture for end-to-end depth inference with a fully convolutional network. Results show that although tied to camera intrinsic parameters, the problem is locally solvable and leads to good-quality depth prediction.

  7. End to end adaptive congestion control in TCP/IP networks

    CERN Document Server

    Houmkozlis, Christos N

    2012-01-01

    This book provides an adaptive control theory perspective on designing congestion controls for packet-switching networks. Relevant to a wide range of disciplines and industries, including the music industry, computers, image trading, and virtual groups, the text extensively discusses source oriented, or end to end, congestion control algorithms. The book empowers readers with clear understanding of the characteristics of packet-switching networks and their effects on system stability and performance. It provides schemes capable of controlling congestion and fairness and presents real-world app

  8. Deriving comprehensive error breakdown for wide field adaptive optics systems using end-to-end simulations

    Science.gov (United States)

    Ferreira, F.; Gendron, E.; Rousset, G.; Gratadour, D.

    2016-07-01

    The future European Extremely Large Telescope (E-ELT) adaptive optics (AO) systems will aim at wide-field correction and large sky coverage. Their performance will be improved by using post-processing techniques, such as point spread function (PSF) deconvolution. The PSF estimation involves characterization of the different error sources in the AO system. Such error contributors are difficult to estimate, and simulation tools are a good way to do so. Within COMPASS (COMputing Platform for Adaptive opticS Systems), an end-to-end simulation tool using GPU (Graphics Processing Unit) acceleration, we have developed an estimation tool that provides a comprehensive error budget from the outputs of a single simulation run.

  9. On end-to-end performance of MIMO multiuser in cognitive radio networks

    KAUST Repository

    Yang, Yuli

    2011-12-01

    In this paper, a design for the multiple-input-multiple-output (MIMO) multiuser transmission in the cognitive radio network is developed and its end-to-end performance is investigated under spectrum-sharing constraints. Firstly, the overall average packet error rate is analyzed by considering the channel state information feedback delay and the multiuser scheduling. Then, we provide corresponding numerical results to measure the performance evaluation for several separate scenarios, which presents a convenient tool for the cognitive radio network design with multiple secondary MIMO users. © 2011 IEEE.

  10. WiMAX security and quality of service an end-to-end perspective

    CERN Document Server

    Tang, Seok-Yee; Sharif, Hamid

    2010-01-01

    WiMAX is the first standard technology to deliver true broadband mobility at speeds that enable powerful multimedia applications such as Voice over Internet Protocol (VoIP), online gaming, mobile TV, and personalized infotainment. WiMAX Security and Quality of Service focuses on the interdisciplinary subject of advanced security and Quality of Service (QoS) in WiMAX wireless telecommunication systems, including its models, standards, implementations, and applications. The book is split into 4 parts; Part A is an end-to-end overview of the WiMAX architecture, protocol, and system requirements.

  11. Satellite/Terrestrial Networks: End-to-End Communication Interoperability Quality of Service Experiments

    Science.gov (United States)

    Ivancic, William D.

    1998-01-01

    Various issues associated with satellite/terrestrial end-to-end communication interoperability are presented in viewgraph form. Specific topics include: 1) Quality of service; 2) ATM performance characteristics; 3) MPEG-2 transport stream mapping to AAL-5; 4) Observation and discussion of compressed video tests over ATM; 5) Digital video over satellites status; 6) Satellite link configurations; 7) MPEG-2 over ATM with binomial errors; 8) MPEG-2 over ATM channel characteristics; 9) MPEG-2 over ATM over emulated satellites; 10) MPEG-2 transport stream with errors; and 11) a dual decoder test.

  12. Analytical Framework for End-to-End Delay Based on Unidirectional Highway Scenario

    Directory of Open Access Journals (Sweden)

    Aslinda Hassan

    2015-01-01

    Full Text Available In a sparse vehicular ad hoc network, a vehicle normally employs a carry-and-forward approach, holding the message it wants to transmit until it meets other vehicles or roadside units. A number of analyses in the literature have investigated the time delay when packets are carried by vehicles on both unidirectional and bidirectional highways. However, these analyses focus on the delay between either two disconnected vehicles or two disconnected vehicle clusters. Furthermore, the majority of the analyses concentrate only on the expected value of the end-to-end delay when the carry-and-forward approach is used. Using regression analysis, we establish that the time delay between two disconnected vehicle clusters follows an exponential distribution. A distribution is then derived for the number of clusters on a highway using a vehicular traffic model. From there, we formulate an end-to-end delay model that extends the time delay model for two disconnected vehicle clusters to multiple disconnected clusters on a unidirectional highway. The analytical model is then validated through simulation.
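A hedged Monte Carlo sketch of the delay structure described above, with assumed parameters rather than the paper's calibrated ones: if each gap between disconnected clusters contributes an exponentially distributed carry-and-forward delay, the end-to-end delay is their sum, and its mean is the number of gaps times the mean gap delay.

```python
import random

def end_to_end_delay(n_gaps, mean_gap_delay_s, rng):
    """End-to-end carry-and-forward delay over a highway with n_gaps
    disconnected-cluster gaps, each gap delay an exponential r.v."""
    return sum(rng.expovariate(1.0 / mean_gap_delay_s) for _ in range(n_gaps))

rng = random.Random(42)
samples = [end_to_end_delay(n_gaps=4, mean_gap_delay_s=30.0, rng=rng)
           for _ in range(100_000)]
mean = sum(samples) / len(samples)
print(f"mean end-to-end delay ≈ {mean:.1f} s (analytic: {4 * 30.0:.1f} s)")
```

In the paper the number of gaps is itself random (drawn from the derived cluster-count distribution); conditioning the sketch on that draw would reproduce the full model.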

  13. End-to-End Beam Dynamics Simulations for the ANL-RIA Driver Linac

    CERN Document Server

    Ostroumov, P N

    2004-01-01

    The proposed Rare Isotope Accelerator (RIA) Facility consists of a superconducting (SC) 1.4 GV driver linac capable of producing 400 kW beams of any ion from hydrogen to uranium. The driver is configured as an array of ~350 SC cavities, each with independently controllable rf phase. For the end-to-end beam dynamics design and simulation we use a dedicated code, TRACK. The code integrates ion motion through the three-dimensional fields of all elements of the driver linac, beginning from the exit of the electron cyclotron resonance (ECR) ion source to the production targets. TRACK has been parallelized and is able to track large numbers of particles in randomly seeded accelerators with misalignments and a comprehensive set of errors. The simulation starts with multi-component dc ion beams extracted from the ECR. Beam losses are obtained by tracking up to a million particles in hundreds of randomly seeded accelerators. To control beam losses, a set of collimators is applied in designated areas. The end-to-end simulat...

  14. Statistical End-to-end Performance Bounds for Networks under Long Memory FBM Cross Traffic

    CERN Document Server

    Rizk, Amr

    2009-01-01

    Fractional Brownian motion (fBm) emerged as a useful model for self-similar and long-range dependent Internet traffic. Approximate performance measures are known from large deviations theory for single queuing systems with fBm through traffic. In this paper we derive end-to-end performance bounds for a through flow in a network of tandem queues under fBm cross traffic. To this end, we prove a rigorous sample path envelope for fBm that complements previous approximate results. We find that both approaches agree in their outcome that overflow probabilities for fBm traffic have a Weibullian tail. We employ the sample path envelope and the concept of leftover service curves to model the remaining service after scheduling fBm cross traffic at a system. Using composition results for tandem systems from the stochastic network calculus we derive end-to-end statistical performance bounds for individual flows in networks under fBm cross traffic. We discover that these bounds grow in O(n (log n)^(1/(2-2H))) for n system...

  15. The end-to-end testbed of the Optical Metrology System on-board LISA Pathfinder

    CERN Document Server

    Steier, Frank; Marín, Antonio F García; Gerardi, Domenico; Heinzel, Gerhard; Danzmann, Karsten; 10.1088/0264-9381/26/9/094010

    2012-01-01

    LISA Pathfinder is a technology demonstration mission for the Laser Interferometer Space Antenna (LISA). The main experiment on-board LISA Pathfinder is the so-called LISA Technology Package (LTP) which has the aim to measure the differential acceleration between two free-falling test masses with an accuracy of 3x10^(-14) ms^(-2)/sqrt[Hz] between 1 mHz and 30 mHz. This measurement is performed interferometrically by the Optical Metrology System (OMS) on-board LISA Pathfinder. In this paper we present the development of an experimental end-to-end testbed of the entire OMS. It includes the interferometer and its sub-units, the interferometer back-end which is a phasemeter and the processing of the phasemeter output data. Furthermore, 3-axes piezo actuated mirrors are used instead of the free-falling test masses for the characterisation of the dynamic behaviour of the system and some parts of the Drag-free and Attitude Control System (DFACS) which controls the test masses and the satellite. The end-to-end testbe...

  16. Semantic Service Composition with QoS End-to-End Constraints via AND/OR Graphs

    Directory of Open Access Journals (Sweden)

    Xhemal Zenuni

    2012-03-01

    Full Text Available In this paper we present AND/OR graphs as a unifying framework for semantic service composition that considers users' QoS constraints. The main virtues of this representation, among others, are its ability to express semantic inference and to deal with QoS constraints from different perspectives. In addition, it correctly handles multiple inputs/outputs of services and allows a high degree of automation. Once service dependencies and QoS features are formalized as an AND/OR graph, we apply a search algorithm to discover composite services that considers the user's end-to-end QoS preferences. The implementation of a prototype system and the experimental results support our underlying hypothesis that AND/OR graphs are not only an elegant and expressive formalism for addressing QoS-aware semantic service composition, but an efficient one as well.

  17. SPOKES: an End-to-End Simulation Facility for Spectroscopic Cosmological Surveys

    CERN Document Server

    Nord, B; Refregier, A; Gamper, La; Gamper, Lu; Hambrecht, B; Chang, C; Forero-Romero, J E; Serrano, S; Cunha, C; Coles, O; Nicola, A; Busha, M; Bauer, A; Saunders, W; Jouvel, S; Kirk, D; Wechsler, R

    2016-01-01

    The nature of dark matter, dark energy and large-scale gravity pose some of the most pressing questions in cosmology today. These fundamental questions require highly precise measurements, and a number of wide-field spectroscopic survey instruments are being designed to meet this requirement. A key component in these experiments is the development of a simulation tool to forecast science performance, define requirement flow-downs, optimize implementation, demonstrate feasibility, and prepare for exploitation. We present SPOKES (SPectrOscopic KEn Simulation), an end-to-end simulation facility for spectroscopic cosmological surveys designed to address this challenge. SPOKES is based on an integrated infrastructure, modular function organization, coherent data handling and fast data access. These key features allow reproducibility of pipeline runs, enable ease of use and provide flexibility to update functions within the pipeline. The cyclic nature of the pipeline offers the possibility to make the science outpu...

  18. End-to-end interoperability and workflows from building architecture design to one or more simulations

    Energy Technology Data Exchange (ETDEWEB)

    Chao, Tian-Jy; Kim, Younghun

    2015-02-10

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.

  19. End-to-end simulations of the E-ELT/METIS coronagraphs

    Science.gov (United States)

    Carlomagno, Brunella; Absil, Olivier; Kenworthy, Matthew; Ruane, Garreth; Keller, Christoph U.; Otten, Gilles; Feldt, Markus; Hippler, Stefan; Huby, Elsa; Mawet, Dimitri; Delacroix, Christian; Surdej, Jean; Habraken, Serge; Forsberg, Pontus; Karlsson, Mikael; Vargas Catalan, Ernesto; Brandl, Bernhard R.

    2016-07-01

    The direct detection of low-mass planets in the habitable zone of nearby stars is an important science case for future E-ELT instruments such as the mid-infrared imager and spectrograph METIS, which features vortex phase masks and apodizing phase plates (APP) in its baseline design. In this work, we present end-to-end performance simulations, using Fourier propagation, of several METIS coronagraphic modes, including focal-plane vortex phase masks and pupil-plane apodizing phase plates, for the centrally obscured, segmented E-ELT pupil. The atmosphere and the AO contributions are taken into account. Hybrid coronagraphs combining the advantages of vortex phase masks and APPs are considered to improve the METIS coronagraphic performance.

  20. Genetic algorithm for autonomic joint radio resource management in end-to-end reconfigurable systems

    Institute of Scientific and Technical Information of China (English)

    ZENG Xian; MA Tao; LIN Yue-wei; FENG Zhi-yong

    2008-01-01

    This article presents the genetic algorithm (GA) as an autonomic approach for joint radio resource management (JRRM) amongst heterogeneous radio access technologies (RATs) in end-to-end reconfigurable systems. Joint session admission control (JOSAC) and bandwidth allocation are combined into a specific decision made by the operations of the genetic algorithm, with certain advisable modifications. The proposed algorithm is triggered on the following two conditions. When a session is initiated, it is triggered so that the session camps on the most appropriate RAT and selects the most suitable bandwidth for the desired service. When a session terminates, it is also used to adjust the distribution of the ongoing sessions through handovers. This increases the adjustment frequency of the JRRM controller for the best system performance. Simulation results indicate that the proposed autonomic JRRM scheme not only effectively reduces the number of handovers, but also achieves a good trade-off between spectrum utility and blocking probability.
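A minimal genetic-algorithm sketch in the spirit of the scheme above, assigning sessions to RATs so that admitted demand is maximized and overload (blocking) is penalized. The capacities, demands, fitness function and GA settings are all assumptions for illustration, not the authors' design.

```python
import random

RAT_CAPACITY = [10, 6]               # capacity units per RAT (assumed)
SESSION_DEMAND = [2, 3, 1, 4, 2, 3]  # bandwidth demand per session (assumed)

def fitness(assignment):
    """Reward admitted demand, penalise overloading a RAT (blocking)."""
    load = [0] * len(RAT_CAPACITY)
    for demand, rat in zip(SESSION_DEMAND, assignment):
        load[rat] += demand
    admitted = sum(min(l, c) for l, c in zip(load, RAT_CAPACITY))
    overload = sum(max(l - c, 0) for l, c in zip(load, RAT_CAPACITY))
    return admitted - 2 * overload

def evolve(pop_size=30, generations=60, rng=random.Random(1)):
    n = len(SESSION_DEMAND)
    pop = [[rng.randrange(len(RAT_CAPACITY)) for _ in range(n)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)      # elitist selection
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]            # one-point crossover
            if rng.random() < 0.2:               # mutation: move one session
                child[rng.randrange(n)] = rng.randrange(len(RAT_CAPACITY))
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print("assignment:", best, "fitness:", fitness(best))
```

The paper's version additionally encodes per-session bandwidth choices in the chromosome and re-runs the GA on session arrival and departure; the selection/crossover/mutation loop has the same shape.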

  1. Data analysis Pipeline for EChO end-to-end simulations

    CERN Document Server

    Waldmann, Ingo P

    2014-01-01

    Atmospheric spectroscopy of extrasolar planets is an intricate business. Atmospheric signatures typically require a photometric precision of $1 \times 10^{-4}$ in flux over several hours. Such precision demands high instrument stability as well as an understanding of stellar variability and an optimal data reduction and removal of systematic noise. In the context of the EChO mission concept, we here discuss the data reduction and analysis pipeline developed for the EChO end-to-end simulator EChOSim. We present and discuss the step-by-step procedures required to obtain the final exoplanetary spectrum from the EChOSim 'raw data', using a simulated observation of the secondary eclipse of the hot Neptune 55 Cnc e.

  2. End to end numerical simulations of the MAORY multiconjugate adaptive optics system

    CERN Document Server

    Arcidiacono, Carmelo; Bregoli, Giovanni; Diolaiti, Emiliano; Foppiani, Italo; Cosentino, Giuseppe; Lombini, Matteo; Butler, R C; Ciliegi, Paolo

    2014-01-01

    MAORY is the adaptive optics module of the E-ELT that will feed the MICADO imaging camera through a gravity-invariant exit port. MAORY is foreseen to implement MCAO correction through three high-order deformable mirrors driven by the reference signals of six Laser Guide Stars (LGSs) feeding as many Shack-Hartmann Wavefront Sensors. A three Natural Guide Star (NGS) system will provide the low-order correction. We have developed a code for the end-to-end simulation of the MAORY adaptive optics (AO) system in order to obtain high-fidelity modeling of the system performance. It is based on the IDL language and makes extensive use of GPUs. Here we present the architecture of the simulation tool and its achieved and expected performance.

  3. The Kepler End-to-End Data Pipeline: From Photons to Far Away Worlds

    Science.gov (United States)

    Cooke, Brian; Thompson, Richard; Standley, Shaun

    2012-01-01

    Launched by NASA on 6 March 2009, the Kepler Mission has been observing more than 100,000 targets in a single patch of sky between the constellations Cygnus and Lyra almost continuously for the last two years, looking for planetary systems using the transit method. As of October 2011, the Kepler spacecraft has collected and returned to Earth just over 290 GB of data, identifying 1235 planet candidates, with 25 of these candidates confirmed as planets via ground observation. Extracting the telltale signature of a planetary system from stellar photometry, where valid signal transients can be as small as 40 ppm, is a difficult and exacting task. The end-to-end processing of determining planetary candidates from noisy, raw photometric measurements is discussed.

  4. End-to-end requirements management for multiprojects in the construction industry

    DEFF Research Database (Denmark)

    Wörösch, Michael

    The research described in this PhD thesis focuses on the phenomenon that formalized requirements management, as many studies have shown, has yet to find its way into the construction industry, even though it is effectively used in other fields e.g. software development and the aerospace and defence...... structure is aimed at covering the entire life cycle of a building by considering future events. However, the developed requirements structure is not enough for managing requirements. Therefore an intensive literature study on requirements management in general and in particular requirements management...... industries. The research gives at the same time managers of construction projects a tool with which to manage their requirements end-to-end. In order to investigate how construction companies handle requirements, a case project – a Danish construction syndicate producing sandwich elements made from High...

  5. An Extended Model Driven Framework for End-to-End Consistent Model Transformation

    Directory of Open Access Journals (Sweden)

    Mr. G. Ramesh

    2016-08-01

    Full Text Available Model Driven Development (MDD) results in quick transformation from models to corresponding systems. Forward-engineering features of modelling tools can help in generating source code from models. To build a robust system it is important to have consistency checking in the design models, and likewise between the design model and the transformed implementation. Our framework, named Extensible Real Time Software Design Inconsistency Checker (XRTSDIC) and proposed in our previous papers, supports consistency checking in design models. This paper focuses on automatic model transformation. An algorithm and transformation rules for model transformation from UML class diagrams to ERD and SQL are proposed. Model transformation bestows many advantages, such as reducing development cost, improving quality, enhancing productivity and increasing customer satisfaction. The proposed framework has been enhanced to ensure that transformed implementations conform to their model counterparts, besides checking end-to-end consistency.
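A toy sketch of the class-model-to-SQL direction of such a transformation. The type map, naming rules and example model below are illustrative assumptions, not the paper's actual transformation rules:

```python
# Toy model: classes represented as dicts; the transformation flattens
# each class into a CREATE TABLE statement.
TYPE_MAP = {"String": "VARCHAR(255)", "Integer": "INT", "Date": "DATE"}

def class_to_ddl(cls):
    """Map one class of the model to a CREATE TABLE statement."""
    table = cls["name"].lower()
    cols = [f"{table}_id INT PRIMARY KEY"]                    # surrogate key
    cols += [f"{attr} {TYPE_MAP[t]}" for attr, t in cls["attributes"]]
    # A many-to-one association becomes a foreign-key column on this side.
    cols += [f"{t.lower()}_id INT REFERENCES {t.lower()}({t.lower()}_id)"
             for t in cls.get("many_to_one", [])]
    return f"CREATE TABLE {table} (\n  " + ",\n  ".join(cols) + "\n);"

invoice = {"name": "Invoice",
           "attributes": [("issued_on", "Date"), ("total", "Integer")],
           "many_to_one": ["Customer"]}
print(class_to_ddl(invoice))
```

Consistency checking in the framework's sense would then compare such generated artifacts back against the source model, flagging attributes or associations that fail to round-trip.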

  6. Force Characteristics of the Rat Sternomastoid Muscle Reinnervated with End-to-End Nerve Repair

    Directory of Open Access Journals (Sweden)

    Stanislaw Sobotka

    2011-01-01

    Full Text Available The goal of this study was to establish force data for the rat sternomastoid (SM) muscle after reinnervation with nerve end-to-end anastomosis (EEA), which could be used as a baseline for evaluating the efficacy of new reinnervation techniques. The SM muscle on one side was paralyzed by transecting its nerve, and EEA was then performed at different time points: immediate EEA, and EEA delayed by 1 month or 3 months. At the end of the 3-month recovery period, the magnitude of functional recovery of the reinnervated SM muscle was evaluated by measuring muscle force and comparing it with the force of the contralateral control muscle. Our results demonstrated that the immediately reinnervated SM produced approximately 60% of the maximal tetanic force of the control. The SM with delayed nerve repair yielded approximately 40% of the maximal force. Suboptimal recovery of muscle force after EEA demonstrates the importance of developing alternative surgical techniques to treat muscle paralysis.

  7. Single and Multi-bunch End-to-end Tracking in the LHeC

    CERN Document Server

    Pellegrini, D; Latina, A; Schulte, D

    2015-01-01

    The LHeC study aims at delivering an electron beam for collision with the LHC proton beam. The current baseline design consists of a multi-pass superconductive energy-recovery linac operating in continuous-wave mode. The high-current beam ($\sim$100 mA) in the linacs excites long-range wake-fields between bunches of different turns, which induce instabilities and might cause beam losses. PLACET2, a novel version of the tracking code PLACET capable of handling recirculation and time dependencies, has been employed to perform the first LHeC end-to-end tracking. The impact of long-range wake-fields, synchrotron radiation, and beam-beam effects has been assessed. The simulation results and recent improvements in the lattice design are presented and discussed in this paper.

  8. End-to-end assessment of a large aperture segmented ultraviolet optical infrared (UVOIR) telescope architecture

    Science.gov (United States)

    Feinberg, Lee; Rioux, Norman; Bolcar, Matthew; Liu, Alice; Guyon, Olivier; Stark, Chris; Arenberg, Jon

    2016-07-01

    Key challenges of a future large aperture, segmented Ultraviolet Optical Infrared (UVOIR) Telescope capable of performing a spectroscopic survey of hundreds of Exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high yield exo-earth spectroscopic detection. Our team has collectively assessed an optimized end to end architecture including a high throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an exo-earth yield assessment to evaluate potential performance. These efforts are combined through integrated modeling, coronagraph evaluations, and exo-earth yield calculations to assess the potential performance of the selected architecture. In addition, we discuss the scalability of this architecture to larger apertures and the technological tall poles to enabling these missions.

  9. End-to-End Beam Simulations for the New Muon G-2 Experiment at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Korostelev, Maxim [Cockcroft Inst. Accel. Sci. Tech.; Bailey, Ian [Lancaster U.; Herrod, Alexander [Liverpool U.; Morgan, James [Fermilab; Morse, William [RIKEN BNL; Stratakis, Diktys [RIKEN BNL; Tishchenko, Vladimir [RIKEN BNL; Wolski, Andrzej [Cockcroft Inst. Accel. Sci. Tech.

    2016-06-01

    The aim of the new muon g-2 experiment at Fermilab is to measure the anomalous magnetic moment of the muon with an unprecedented uncertainty of 140 ppb. A beam of positive muons required for the experiment is created by pion decay. Detailed studies of the beam dynamics and spin polarization of the muons are important to predict systematic uncertainties in the experiment. In this paper, we present the results of beam simulations and spin tracking from the pion production target to the muon storage ring. The end-to-end beam simulations are developed in Bmad and include the processes of particle decay, collimation (with accurate representation of all apertures) and spin tracking.

  10. End-to-end rate-based congestion control with random loss: convergence and stability

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The convergence and stability of two end-to-end rate-based congestion control algorithms subject to unavoidable random packet loss (caused, for example, by errors on wireless links) are analyzed. Because both algorithms are globally stable and converge to unique equilibrium points, their convergence rates are obtained by linearizing them around those equilibria. Sufficient conditions for local stability in the presence of round-trip delay are derived from the generalized Nyquist stability criterion. These conditions are more general than previous results: when random loss is not considered in the first algorithm, they reduce to the local stability conditions already reported in the literature. Sufficient conditions for local stability of a new congestion control algorithm are also obtained when random loss is not considered in the second algorithm.
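    The analysis approach, linearizing a rate update around its equilibrium, can be illustrated with a toy Kelly-style primal controller (not one of the paper's two algorithms; the gain, price function, and constants are invented for illustration):

```python
def rate_update(x, w=1.0, c=100.0, k=0.1):
    """One step of a Kelly-style primal update with congestion price p(x) = x / c."""
    return x + k * (w - x * (x / c))

x = 1.0
for _ in range(2000):
    x = rate_update(x)
# the equilibrium solves w = x * p(x), i.e. x* = sqrt(w * c) = 10;
# the linearization factor 1 - 2*k*x*/c = 0.98 < 1 gives local stability
```

The contraction factor 0.98 obtained by linearizing at x* = 10 is exactly the kind of quantity the paper's convergence-rate analysis produces.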

  11. 'Smart' from a NEEDS perspective. [sensors in NASA End-to-End Data System

    Science.gov (United States)

    Blanchard, D. L.

    1978-01-01

    A description is presented of the characteristics which are considered essential for the NASA information system if it is to satisfy the sensor/user needs of the 1980's. The present data/information (DI) system is a complex arrangement of functions which has evolved to meet the increased demands placed upon it by the changing nature of the space program. Even though today's ground system is handling today's DI flow, it has certain limitations, and in some cases lacks consistency in organization and structure that would allow it to cope with the new requirements. A systematic analysis and overview of the total system is necessary for the planning of an efficient and effective information system for the 1980's. The NASA End-to-End Data System (NEEDS) program is an attempt to significantly increase the effectiveness and efficiency of the system that couples the user of the space data with the sensors.

  12. The Consolidation of the End-to-End Avionics Systems Testbench

    Science.gov (United States)

    Wijnands, Quirien; Torelli, Felice; Blommestijn, Robert; Kranz, Stephan; Koster, Jean-Paul

    2014-08-01

    Over the past years, the Avionics System Test Bench (ATB) has been used to support the demonstration and validation of upcoming space avionics related standards and technologies in a representative environment. Another main use case of the facility has been to support projects in assessing particular technology-related issues. In doing so, it was necessary to add activity- and project-specific elements to different configurations of the ATB, leading to a proliferation of facilities and technologies. In some cases, however, the results and lessons learned from these efforts and activities were considered valuable to the ATB concept in general and therefore needed to be preserved in the ATB mainstream for future reuse. Activities are currently ongoing to consolidate the End-To-End Avionics Systems TestBench (E2E-ATB). In this paper the resulting details of these activities are described as enhancements and improvements per ATB configuration.

  13. Enhancing End-to-End Performance of Information Services Over Ka-Band Global Satellite Networks

    Science.gov (United States)

    Bhasin, Kul B.; Glover, Daniel R.; Ivancic, William D.; vonDeak, Thomas C.

    1997-01-01

    The Internet has been growing at a rapid rate as the key medium for information services such as e-mail, the WWW, and multimedia; however, its global reach is limited. Ka-band communication satellite networks are being developed to increase the accessibility of information services via the Internet on a global scale. There is a need to assess satellite networks in their ability to provide these services and to interconnect seamlessly with existing and proposed terrestrial telecommunication networks. In this paper the significant issues and requirements for providing end-to-end high performance in the delivery of information services over satellite networks are identified, organized by the layers of the OSI reference model. Key experiments have been performed to evaluate the performance of digital video and Internet traffic over satellite-like testbeds. The results of early developments in ATM and TCP protocols over satellite networks are summarized.
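    One of the core satellite-TCP issues alluded to is the bandwidth-delay product: over a geostationary path, a window-limited TCP connection cannot exceed window/RTT regardless of link capacity. A back-of-the-envelope sketch (the RTT and window values are illustrative, not from the paper):

```python
def window_limited_throughput(window_bytes, rtt_s):
    """Maximum TCP throughput (bit/s) when limited by the receive window."""
    return window_bytes * 8 / rtt_s

GEO_RTT_S = 0.55        # illustrative GEO round-trip time, seconds
CLASSIC_WINDOW = 65535  # maximum window without TCP window scaling, bytes

bps = window_limited_throughput(CLASSIC_WINDOW, GEO_RTT_S)
# stays under 1 Mbit/s no matter how fast the Ka-band link is
```

This is why window scaling and protocol tuning were central to the early satellite TCP experiments summarized above.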

  14. End-to-end performance analysis using engineering confidence models and a ground processor prototype

    Science.gov (United States)

    Kruse, Klaus-Werner; Sauer, Maximilian; Jäger, Thomas; Herzog, Alexandra; Schmitt, Michael; Huchler, Markus; Wallace, Kotska; Eisinger, Michael; Heliere, Arnaud; Lefebvre, Alain; Maher, Mat; Chang, Mark; Phillips, Tracy; Knight, Steve; de Goeij, Bryan T. G.; van der Knaap, Frits; Van't Hof, Adriaan

    2015-10-01

    The European Space Agency (ESA) and the Japan Aerospace Exploration Agency (JAXA) are co-operating to develop the EarthCARE satellite mission with the fundamental objective of improving the understanding of the processes involving clouds, aerosols and radiation in the Earth's atmosphere. The EarthCARE Multispectral Imager (MSI) is relatively compact for a space-borne imager. As a consequence, the point-spread function (PSF) of the instrument will be determined mainly by the diffraction caused by the relatively small optical aperture. In order to nevertheless achieve a high-contrast image, de-convolution processing is applied to remove the impact of diffraction on the PSF. A Lucy-Richardson algorithm has been chosen for this purpose. This paper describes the system setup and the necessary data pre-processing and post-processing steps applied in order to compare the end-to-end image quality with the L1b performance required by the science community.
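    The Lucy-Richardson algorithm named above is a multiplicative, flux-preserving iteration. A minimal 1-D sketch (not the MSI ground processor; the PSF and signal are invented) shows how it sharpens a diffraction-blurred point source:

```python
import numpy as np

def richardson_lucy(data, psf, iterations=200):
    """1-D Richardson-Lucy deconvolution (illustrative only)."""
    est = np.full_like(data, data.mean())        # flat first guess
    psf_mirror = psf[::-1]
    for _ in range(iterations):
        conv = np.convolve(est, psf, mode="same")
        ratio = data / np.maximum(conv, 1e-12)   # avoid division by zero
        est = est * np.convolve(ratio, psf_mirror, mode="same")
    return est

psf = np.array([0.05, 0.2, 0.5, 0.2, 0.05])      # normalized 5-tap blur kernel
truth = np.zeros(31)
truth[15] = 1.0                                  # a single point source
blurred = np.convolve(truth, psf, mode="same")
restored = richardson_lucy(blurred, psf)
```

The restored peak is much closer to the original unit impulse than the blurred peak of 0.5, while the total flux stays at 1, which is the property that makes the algorithm attractive for radiometrically calibrated imagery.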

  15. Sieving of H2 and D2 Through End-to-End Nanotubes

    Science.gov (United States)

    Dasgupta, Devagnik; Searles, Debra J.; Rondoni, Lamberto; Bernardi, Stefano

    2014-10-01

    We study the quantum molecular sieving of H2 and D2 through two nanotubes placed end-to-end. An analytic treatment, assuming that the particles have classical motion along the axis of the nanotube and are confined in a potential well in the radial direction, is considered. Using this idealized model, and under certain conditions, it is found that this device can act as a complete sieve, allowing chemically pure deuterium to be isolated from an isotope mixture. We also consider a more realistic model of two carbon nanotubes and carry out molecular dynamics simulations using a Feynman-Hibbs potential to model the quantum effects on the dynamics of H2 and D2. Sieving is also observed in this case, but is caused by a different process.
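    The Feynman-Hibbs approach referenced above replaces the classical pair potential with an effective one whose quantum correction is proportional to hbar^2/(24 m kB T); because the correction scales as 1/m, H2 feels a stronger effect than D2, which is the origin of the sieving. A sketch with an illustrative Lennard-Jones parameterization (the well depth, diameter, and temperature are assumed values, not from the paper):

```python
import math

HBAR = 1.0545718e-34   # J s
KB = 1.380649e-23      # J/K
AMU = 1.66053906e-27   # kg

def lj(r, eps, sig):
    """Classical Lennard-Jones pair potential."""
    return 4.0 * eps * ((sig / r) ** 12 - (sig / r) ** 6)

def fh_correction(r, eps, sig, mass_kg, temp_k, h=1e-13):
    """Quadratic Feynman-Hibbs term (hbar^2 / 24 m kB T) * (U'' + 2 U'/r),
    with derivatives taken by central finite differences."""
    u1 = (lj(r + h, eps, sig) - lj(r - h, eps, sig)) / (2.0 * h)
    u2 = (lj(r + h, eps, sig) - 2.0 * lj(r, eps, sig) + lj(r - h, eps, sig)) / h**2
    return HBAR**2 / (24.0 * mass_kg * KB * temp_k) * (u2 + 2.0 * u1 / r)

EPS = 36.7 * KB        # illustrative H2-H2 well depth
SIG = 2.958e-10        # illustrative LJ diameter, m
T = 77.0               # K
R = 1.1 * SIG
c_h2 = fh_correction(R, EPS, SIG, 2.0 * AMU, T)
c_d2 = fh_correction(R, EPS, SIG, 4.0 * AMU, T)
# the correction scales as 1/m: H2 is "more quantum" than D2 by a factor of 2
```

The factor-of-two mass ratio between the corrections is what makes the effective H2 and D2 potentials, and hence their transport through the tubes, differ.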

  16. Flexible end-to-end system design for synthetic aperture radar applications

    Science.gov (United States)

    Zaugg, Evan C.; Edwards, Matthew C.; Bradley, Joshua P.

    2012-06-01

    This paper presents ARTEMIS, Inc.'s approach to the development of end-to-end synthetic aperture radar systems for multiple applications and platforms. The flexible design of the radar and the image processing tools facilitates their inclusion in a variety of application-specific end-to-end systems. Any given application comes with certain requirements that must be met in order to achieve success. A concept of operation is defined which states how the technology is used to meet the requirements of the application; this drives the design decisions. Key to adapting our system to multiple applications is the flexible SlimSAR radar system, which is programmable on-the-fly to meet the imaging requirements of a wide range of altitudes, swath widths, and platform velocities. The processing software can be used for real-time imagery production or post-flight processing. The ground station is adaptable, and the radar controls can be run by an operator on the ground, on-board the aircraft, or even automated as part of the aircraft autopilot controls. System integration takes the whole operation into account, seeking to work seamlessly with data links and on-board data storage, aircraft and payload control systems, mission planning, and image processing and exploitation. Examples of applications are presented, including using a small unmanned aircraft at low altitude with a line-of-sight data link, a long-endurance UAV maritime surveillance mission with on-board processing, and a manned ground moving target indicator application with the radar using multiple receive channels.

  17. Portable end-to-end ground system for low-cost mission support

    Science.gov (United States)

    Lam, Barbara

    1996-11-01

    This paper presents a revolutionary architecture of the end-to-end ground system to reduce overall mission support costs. The present ground system of the Jet Propulsion Laboratory (JPL) is costly to operate, maintain, deploy, reproduce, and document. In the present climate of shrinking NASA budgets, this proposed architecture takes on added importance as it should dramatically reduce all of the above costs. Currently, the ground support functions (i.e., receiver, tracking, ranging, telemetry, command, monitor and control) are distributed among several subsystems that are housed in individual rack-mounted chassis. These subsystems can be integrated into one portable laptop system using established Multi Chip Module (MCM) packaging technology and object-based software libraries. The large scale integration of subsystems into a small portable system connected to the World Wide Web (WWW) will greatly reduce operations, maintenance and reproduction costs. Several of the subsystems can be implemented using Commercial Off-The-Shelf (COTS) products further decreasing non-recurring engineering costs. The inherent portability of the system will open up new ways for using the ground system at the "point-of-use" site as opposed to maintaining several large centralized stations. This eliminates the propagation delay of the data to the Principal Investigator (PI), enabling the capture of data in real-time and performing multiple tasks concurrently from any location in the world. Sample applications are to use the portable ground system in remote areas or mobile vessels for real-time correlation of satellite data with earth-bound instruments; thus, allowing near real-time feedback and control of scientific instruments. This end-to-end portable ground system will undoubtedly create opportunities for better scientific observation and data acquisition.

  18. Integrating end-to-end threads of control into object-oriented analysis and design

    Science.gov (United States)

    Mccandlish, Janet E.; Macdonald, James R.; Graves, Sara J.

    1993-01-01

    Current object-oriented analysis and design methodologies fall short in their mechanisms for identifying threads of control for the system being developed. The scenarios that typically describe a system are more global than the representation of individual objects and their behavior. Unlike conventional methodologies that use data flow and process-dependency diagrams, object-oriented methodologies do not provide a model for representing these global threads end-to-end. Tracing through threads of control is key to ensuring that a system is complete and that timing constraints are addressed. The existence of multiple threads of control in a system necessitates a partitioning of the system into processes. This paper describes the application and representation of end-to-end threads of control in the object-oriented analysis and design process using object-oriented constructs. The issue of representation is viewed as a grouping problem, that is, how to group classes/objects at a higher level of abstraction so that the system may be viewed as a whole with both classes/objects and their associated dynamic behavior. Existing object-oriented development methodology techniques are extended by adding design-level constructs termed logical composite classes and process composite classes. Logical composite classes are design-level classes which group classes/objects both logically and by thread of control information. Process composite classes further refine the logical composite class groupings by using process partitioning criteria to produce optimum concurrent execution results. The goal of these design-level constructs is to ultimately provide the basis for a mechanism that can support the creation of process composite classes in an automated way. Using an automated mechanism makes it easier to partition a system into concurrently executing elements that can be run in parallel on multiple processors.

  19. Internet end-to-end performance monitoring for the High Energy Nuclear and Particle Physics community

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, W.

    2000-02-22

    Modern High Energy Nuclear and Particle Physics (HENP) experiments at laboratories around the world present a significant challenge to wide area networks. Petabytes (10^15 bytes) or exabytes (10^18 bytes) of data will be generated during the lifetime of the experiment. Much of this data will be distributed via the Internet to the experiment's collaborators at universities and institutes throughout the world for analysis. In order to assess the feasibility of the computing goals of these and future experiments, the HENP networking community is actively monitoring performance across a large part of the Internet used by its collaborators. Since 1995, the pingER project has been collecting data on ping packet loss and round trip times. In January 2000, there are 28 monitoring sites in 15 countries gathering data on over 2,000 end-to-end pairs. HENP labs such as SLAC, Fermi Lab and CERN are using Advanced Network's Surveyor project to monitor performance based on the one-way delay of UDP packets. More recently several HENP sites have become involved with NLANR's active measurement program (AMP). In addition SLAC and CERN are part of the RIPE test-traffic project and SLAC is home for a NIMI machine. This large end-to-end performance monitoring infrastructure allows the HENP networking community to chart long term trends and closely examine short term glitches across a wide range of networks and connections. The different methodologies provide opportunities to compare results based on different protocols and statistical samples. Understanding agreement and discrepancies between results provides particular insight into the nature of the network. This paper will highlight the practical side of monitoring by reviewing the special needs of High Energy Nuclear and Particle Physics experiments and provide an overview of the experience of measuring performance across a large number of interconnected networks throughout the world with various methodologies. In particular, results
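    The pingER-style metrics mentioned, packet loss and round-trip time per monitor-remote pair, reduce to simple aggregation over ping samples; a minimal sketch (the sample data are invented):

```python
def ping_summary(rtts_ms):
    """Loss fraction and median RTT from a list of ping results,
    where None marks a lost packet (pingER-style aggregation, simplified)."""
    lost = sum(1 for r in rtts_ms if r is None)
    ok = sorted(r for r in rtts_ms if r is not None)
    loss = lost / len(rtts_ms)
    median = ok[len(ok) // 2] if ok else None
    return loss, median

# ten pings to one remote host; two were lost
loss, med = ping_summary([120, 118, None, 130, 121, None, 119, 125, 122, 117])
```

Each monitor-remote pair contributes one such summary per measurement interval, and the long-term trends in the paper are built from these aggregates.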

  20. Fistulotomy with end-to-end primary sphincteroplasty for anal fistula: results from a prospective study.

    Science.gov (United States)

    Ratto, Carlo; Litta, Francesco; Parello, Angelo; Zaccone, Giuseppe; Donisi, Lorenza; De Simone, Veronica

    2013-02-01

    Fistulotomy plus primary sphincteroplasty for complex anal fistulas is regarded with scepticism, mainly because of the risk of postoperative incontinence. The aim of this study was to evaluate the safety and effectiveness of this technique in medium-term follow-up and to identify potential predictive factors of success and postoperative continence impairment. This was a prospective observational study conducted at a tertiary care university hospital in Italy. A total of 72 patients with complex anal fistula of cryptoglandular origin underwent fistulotomy and end-to-end primary sphincteroplasty; patients were followed up at 1 week, 1 and 3 months, and 1 year, and were invited to participate in a recent follow-up session. Success regarding healing of the fistula was assessed with 3-dimensional endoanal ultrasound and clinical evaluation. Continence status was evaluated using the Cleveland Clinic fecal incontinence score and by patient report of post-defecation soiling. Of the 72 patients, 12 (16.7%) had recurrent fistulas and 29 patients (40.3%) had undergone seton drainage before definitive surgery. At a mean follow-up of 29.4 (SD, 23.7; range, 6-91) months, the success rate of treatment was 95.8% (69 patients). Fistula recurrence was observed in 3 patients at a mean of 17.3 (SD, 10.3; range, 6-26) months of follow-up. The Cleveland Clinic fecal incontinence score did not change significantly (p = 0.16). Eight patients (11.6% of those with no baseline incontinence) reported de novo postdefecation soiling. None of the investigated factors was a significant predictor of success. Patients with recurrent fistula after previous fistula surgery had a 5-fold increased probability of impaired continence (relative risk = 5.00, 95% CI, 1.45-17.27, p = 0.02). The study was limited by potential single-institution bias, lack of anorectal manometry, and lack of quality of life assessment. Fistulotomy with end-to-end primary sphincteroplasty can be considered to be an effective

  1. An end-to-end microfluidic platform for engineering life supporting microbes in space exploration missions Project

    Data.gov (United States)

    National Aeronautics and Space Administration — HJ Science & Technology proposes a programmable, low-cost, and compact microfluidic platform capable of running automated end-to-end processes and optimization...

  2. Availability and End-to-end Reliability in Low Duty Cycle MultihopWireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Timo D. Hämäläinen

    2009-03-01

    A wireless sensor network (WSN) is an ad-hoc technology that may consist of even thousands of nodes, which necessitates autonomic, self-organizing and multihop operation. A typical WSN node is battery powered, which makes network lifetime the primary concern. The highest energy efficiency is achieved with low duty cycle operation; however, this alone is not enough. WSNs are deployed for different uses, each requiring acceptable Quality of Service (QoS). Due to the unique characteristics of WSNs, such as dynamic wireless multihop routing and resource constraints, the legacy QoS metrics are not feasible as such. We give a new definition to measure and implement QoS in low duty cycle WSNs, namely availability and reliability. We then analyze the effect of duty cycling on achieving availability and reliability. The results are obtained by simulations with ZigBee and proprietary TUTWSN protocols. Based on the results, we also propose a data forwarding algorithm suitable for resource-constrained WSNs that guarantees end-to-end reliability while adding a small overhead that is relative to the packet error rate (PER). The forwarding algorithm guarantees reliability up to 30% PER.
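    The quoted guarantees can be sketched with elementary per-hop retry arithmetic: with packet error rate per and at most max_tx attempts per hop, hop success is 1 - per^max_tx, and the mean number of transmissions (the overhead that grows with PER) follows the geometric distribution. This is a generic model, not TUTWSN's actual forwarding algorithm:

```python
def hop_success(per, max_tx):
    """Probability that a packet crosses one hop within max_tx attempts."""
    return 1.0 - per ** max_tx

def end_to_end_reliability(per, hops, max_tx):
    """Success probability over a multihop path, links assumed independent."""
    return hop_success(per, max_tx) ** hops

def expected_tx_per_hop(per):
    """Mean attempts of the geometric retry process; the extra traffic
    (overhead) grows with the packet error rate, as noted above."""
    return 1.0 / (1.0 - per)

r = end_to_end_reliability(per=0.3, hops=5, max_tx=3)  # at the 30% PER limit
```

At 30% PER and three attempts per hop, each hop succeeds with probability 0.973, so a five-hop path still delivers about 87% of packets while the per-hop overhead is only 1/(1 - 0.3), roughly 1.43 transmissions.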

  3. SIMsim: An End-to-End Simulation of The Space Interferometer Mission

    CERN Document Server

    Meier, D L; Meier, David L.; Folkner, William M.

    2003-01-01

    We present the basic elements and first results of an end-to-end simulation package whose purpose is to test the validity of the Space Interferometer Mission design. The fundamental simulation time step is one millisecond, with substructure at 1/8 ms, and the total duration of the simulation is five years. The end product of a given "wide-angle" astrometry run is an estimated grid star catalog over the entire sky with an accuracy of about 4 micro-arcseconds. SIMsim is divided into five separate modules that communicate via data pipes. The first generates the 'truth' data on the spacecraft structure and laser metrology. The second module generates uncorrupted fringes for the three SIM interferometers, based on the current spacecraft orientation, target stars' positions, etc. The third module reads out the CCD white light fringe data at specified times, corrupting that and the metrology data with appropriate errors. The data stream out of this module represents the basic data stream on the simulated spacecraft....

  4. End-to-end simulation of bunch merging for a muon collider

    Energy Technology Data Exchange (ETDEWEB)

    Bao, Yu [Univ. of California, Riverside, CA (United States); Stratakis, Diktys [Brookhaven National Lab. (BNL), Upton, NY (United States); Hanson, Gail G. [Univ. of California, Riverside, CA (United States); Palmer, Robert B. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2015-05-03

    Muon accelerator beams are commonly produced indirectly through pion decay by interaction of a charged particle beam with a target. Efficient muon capture requires the muons to be first phase-rotated by rf cavities into a train of 21 bunches with much reduced energy spread. Since luminosity is proportional to the square of the number of muons per bunch, it is crucial for a Muon Collider to use relatively few bunches with many muons per bunch. In this paper we will describe a bunch merging scheme that should achieve this goal. We present for the first time a complete end-to-end simulation of a 6D bunch merger for a Muon Collider. The 21 bunches arising from the phase-rotator, after some initial cooling, are merged in longitudinal phase space into seven bunches, which then go through seven paths with different lengths and reach the final collecting "funnel" at the same time. The final single bunch has a transverse and a longitudinal emittance that matches well with the subsequent 6D rectilinear cooling scheme.
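    The stated motivation, luminosity scaling with the square of the bunch population, makes the factor gained by merging easy to quantify; an idealized, lossless sketch (real merging incurs emittance growth and losses not modeled here):

```python
def relative_luminosity(bunch_populations):
    """Single-turn luminosity is proportional to the sum of N^2 over bunches."""
    return sum(n * n for n in bunch_populations)

before = relative_luminosity([1.0] * 21)  # 21 bunches of N muons each
after = relative_luminosity([21.0])       # one merged bunch (lossless, idealized)
gain = after / before                     # factor equal to the number of bunches
```

For a fixed total number of muons, merging 21 bunches into one multiplies the attainable luminosity by 21, which is why the 6D merger is central to the collider design.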

  5. The optical performance of the PILOT instrument from ground end-to-end tests

    Science.gov (United States)

    Misawa, R.; Bernard, J.-Ph.; Longval, Y.; Ristorcelli, I.; Ade, P.; Alina, D.; André, Y.; Aumont, J.; Bautista, L.; de Bernardis, P.; Boulade, O.; Bousqet, F.; Bouzit, M.; Buttice, V.; Caillat, A.; Chaigneau, M.; Charra, M.; Crane, B.; Douchin, F.; Doumayrou, E.; Dubois, J. P.; Engel, C.; Griffin, M.; Foenard, G.; Grabarnik, S.; Hargrave, P.; Hughes, A.; Laureijs, R.; Leriche, B.; Maestre, S.; Maffei, B.; Marty, C.; Marty, W.; Masi, S.; Montel, J.; Montier, L.; Mot, B.; Narbonne, J.; Pajot, F.; Pérot, E.; Pimentao, J.; Pisano, G.; Ponthieu, N.; Rodriguez, L.; Roudil, G.; Salatino, M.; Savini, G.; Simonella, O.; Saccoccio, M.; Tauber, J.; Tucker, C.

    2017-06-01

    The Polarized Instrument for Long-wavelength Observation of the Tenuous interstellar medium (PILOT) is a balloon-borne astronomy experiment designed to study the linear polarization of thermal dust emission in two photometric bands centred at wavelengths 240 μm (1.2 THz) and 550 μm (545 GHz), with an angular resolution of a few arcminutes. Several end-to-end tests of the instrument were performed on the ground between 2012 and 2014, in order to prepare for the first scientific flight of the experiment that took place in September 2015 from Timmins, Ontario, Canada. This paper presents the results of those tests, focussing on an evaluation of the instrument's optical performance. We quantify image quality across the extent of the focal plane, and describe the tests that we conducted to determine the focal plane geometry, the optimal focus position, and sources of internal straylight. We present estimates of the detector response, obtained using an internal calibration source, and estimates of the background intensity and background polarization.

  6. Characterisation of residual ionospheric errors in bending angles using GNSS RO end-to-end simulations

    Science.gov (United States)

    Liu, C. L.; Kirchengast, G.; Zhang, K. F.; Norman, R.; Li, Y.; Zhang, S. C.; Carter, B.; Fritzer, J.; Schwaerz, M.; Choy, S. L.; Wu, S. Q.; Tan, Z. X.

    2013-09-01

    Global Navigation Satellite System (GNSS) radio occultation (RO) is an innovative meteorological remote sensing technique for measuring atmospheric parameters such as refractivity, temperature, water vapour and pressure for the improvement of numerical weather prediction (NWP) and global climate monitoring (GCM). GNSS RO has many unique characteristics including global coverage, long-term stability of observations, as well as high accuracy and high vertical resolution of the derived atmospheric profiles. One of the main error sources in GNSS RO observations that significantly affect the accuracy of the derived atmospheric parameters in the stratosphere is the ionospheric error. In order to mitigate the effect of this error, the linear ionospheric correction approach for dual-frequency GNSS RO observations is commonly used. However, the residual ionospheric errors (RIEs) can be still significant, especially when large ionospheric disturbances occur and prevail such as during the periods of active space weather. In this study, the RIEs were investigated under different local time, propagation direction and solar activity conditions and their effects on RO bending angles are characterised using end-to-end simulations. A three-step simulation study was designed to investigate the characteristics of the RIEs through comparing the bending angles with and without the effects of the RIEs. This research forms an important step forward in improving the accuracy of the atmospheric profiles derived from the GNSS RO technique.
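    The linear dual-frequency correction mentioned above combines the bending angles at the two GNSS frequencies so that the first-order (1/f^2) ionospheric term cancels; the RIEs studied in the paper are what remains beyond this first order. A sketch using the GPS L1/L2 frequencies and an invented dispersive term:

```python
# GPS L1/L2 carrier frequencies (Hz)
F1, F2 = 1575.42e6, 1227.60e6

def linear_iono_correction(alpha1, alpha2, f1=F1, f2=F2):
    """First-order dual-frequency combination of bending angles; the
    residual ionospheric error is whatever this linear step cannot remove."""
    return (f1**2 * alpha1 - f2**2 * alpha2) / (f1**2 - f2**2)

alpha_n = 0.01   # invented neutral-atmosphere bending angle, rad
b = 1.0e10       # invented dispersive term: alpha_iono = b / f^2
a1 = alpha_n + b / F1**2
a2 = alpha_n + b / F2**2
corrected = linear_iono_correction(a1, a2)  # recovers alpha_n exactly here
```

Because the toy ionospheric term is exactly proportional to 1/f^2, the combination recovers the neutral bending angle perfectly; in reality, higher-order terms survive, and those residuals are what the end-to-end simulations characterise.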

  7. End-To-End performance test of the LINC-NIRVANA Wavefront-Sensor system.

    Science.gov (United States)

    Berwein, Juergen; Bertram, Thomas; Conrad, Al; Briegel, Florian; Kittmann, Frank; Zhang, Xiangyu; Mohr, Lars

    2011-09-01

    LINC-NIRVANA is an imaging Fizeau interferometer for use at near-infrared wavelengths, being built for the Large Binocular Telescope. Multi-conjugate adaptive optics (MCAO) increases the sky coverage and the field of view over which diffraction-limited images can be obtained. For its MCAO implementation, LINC-NIRVANA utilizes four wavefront sensors in total: each of the two beams is corrected by both a ground-layer wavefront sensor (GWS) and a high-layer wavefront sensor (HWS). The GWS controls the adaptive secondary deformable mirror (DM), which is driven by a DSP-based slope computing unit, whereas the HWS controls an internal DM via computations provided by an off-the-shelf multi-core Linux system. Using wavefront sensor data collected from a prior lab experiment, we have shown via simulation that the Linux-based system is sufficient to operate at 1 kHz, with jitter well below the needs of the final system. Based on that setup we tested the end-to-end performance and latency through all parts of the system, which include the camera, the wavefront controller, and the deformable mirror. We will present our loop control structure and the results of those performance tests.

  8. An end-to-end architecture for distributing weather alerts to wireless handsets

    Science.gov (United States)

    Jones, Karen L.; Nguyen, Hung

    2005-06-01

    This paper describes the National Weather Service's (NWS) current system for providing weather alerts in the U.S. and reviews how the existing end-to-end architecture is being leveraged to provide non-weather alerts, also known as "all-hazard alerts", to the general public. The paper then describes how a legacy system that transmits weather and all-hazard alerts can be extended via commercial wireless networks and protocols to reach the 154 million Americans who carry cell phones. This approach uses commercial SATCOM and existing wireless carriers and services such as the Short Messaging Service (SMS) for text and the emerging Multimedia Messaging Service (MMS) protocol, which would allow photos, maps, audio and video alerts to be sent to end users. This wireless broadcast alert delivery architecture is designed to be open and to embrace the National Weather Service's mandate to become an "all-hazards" warning system for the general public. Examples of other public- and private-sector applications that require timely and intelligent push mechanisms using this alert dissemination approach are also given.

  9. End-to-end Information Flow Security Model for Software-Defined Networks

    Directory of Open Access Journals (Sweden)

    D. Ju. Chaly

    2015-01-01

    Software-defined networks (SDN) are a novel networking paradigm that has become an enabler technology for many modern applications, such as network virtualization and policy-based access control. Software can provide flexibility and fast-paced innovation in networking; however, it has a complex nature, so there is an increasing need for means of assuring its correctness and security. Abstract models for SDN can tackle these challenges. This paper addresses confidentiality and some integrity properties of SDNs. These are critical properties for multi-tenant SDN environments, since the network management software must ensure that no confidential data of one tenant are leaked to other tenants in spite of using the same physical infrastructure. We define a notion of end-to-end security in the context of software-defined networks and propose a semantic model in which reasoning about confidentiality is possible, so that we can check that confidential information flows do not interfere with non-confidential ones. We show that the model can be extended to reason about networks with secure and insecure links, which can arise, for example, in wireless environments.

  10. Telemetry Ranging: Laboratory Validation Tests and End-to-End Performance

    Science.gov (United States)

    Hamkins, J.; Kinman, P.; Xie, H.; Vilnrotter, V.; Dolinar, S.; Adams, N.; Sanchez, E.; Millard, W.

    2016-08-01

    This article reports on a set of laboratory tests of telemetry ranging conducted at Development Test Facility 21 (DTF-21) in Monrovia, California. An uplink pseudorandom noise (PN) ranging signal was generated by DTF-21, acquired by the Frontier Radio designed and built at the Johns Hopkins University Applied Physics Laboratory, and downlink telemetry frames from the radio were recorded by an open-loop receiver. In four of the tests, the data indicate that telemetry ranging can resolve the two-way time delay to a standard deviation of 2.1-3.4 ns, corresponding to about 30 to 51 cm in (one-way) range accuracy, when 30 s averaging of timing estimates is used. Other tests performed worse because of unsatisfactory receiver sampling rate, quantizer resolution, dc bias, improper configuration, or other reasons. The article also presents an analysis of the expected end-to-end performance of the telemetry ranging system. In one case considered, the theoretically-predicted performance matches the test results, within 10 percent, which provides a reasonable validation that the expected performance was achieved by the test. The analysis also shows that in one typical ranging scenario, one-way range accuracy of 1 m can be achieved with telemetry ranging when the data rate is above 2 kbps.
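    The conversion from the reported two-way delay scatter to one-way range accuracy is simply range sigma = c * sigma_t / 2, which reproduces the 30 to 51 cm figures quoted above:

```python
C = 299792458.0  # speed of light, m/s

def one_way_range_sigma_m(two_way_delay_sigma_s):
    """One-way range uncertainty implied by the two-way delay scatter."""
    return C * two_way_delay_sigma_s / 2.0

low = one_way_range_sigma_m(2.1e-9)   # from the best-case 2.1 ns std dev
high = one_way_range_sigma_m(3.4e-9)  # from the worst of the four good tests
```

The 2.1 and 3.4 ns standard deviations map to roughly 0.31 and 0.51 m of one-way range, consistent with the article's figures.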

  11. End to End Delay Improvement in Heterogeneous Multicast Network using Genetic Optimization

    Directory of Open Access Journals (Sweden)

    V. Chandrasekar

    2012-01-01

    Problem statement: Multicast is a form of group communication in which the same data or messages are transmitted from a source to multiple destinations in the network. This one-to-many group communication is a generalization of one-to-one unicast and one-to-all broadcast. To deliver data from the sender to all receivers efficiently, routing plays an important role in multicast communication. In QoS multicast, every receiver must receive the data within its own specified QoS constraints. This becomes challenging especially if the network is a heterogeneous network made up of wired and wireless devices. Approach: This study investigates the performance of the Protocol Independent Multicast-Sparse Mode (PIM-SM) protocol in a heterogeneous network running a video conferencing application and proposes an enhanced routing protocol using genetic optimization techniques to improve QoS parameters in the wireless part. Results and Conclusion: Extensive simulations were carried out using the proposed technique and the existing PIM-SM. The proposed optimization technique not only improves the throughput of the network but also decreases the end-to-end delay.
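    As a flavor of the genetic approach, here is a toy GA that picks one of two candidate routes per hop so as to minimize total end-to-end delay, followed by a greedy bit-flip refinement (a memetic touch that, for this separable toy fitness, is guaranteed to reach the optimum). The delays and GA parameters are invented, not those of the study:

```python
import random

random.seed(42)
# hypothetical per-hop delays (ms) of two candidate routes at each of 8 hops
DELAYS = [(12, 7), (5, 9), (8, 8), (14, 3), (6, 10), (11, 4), (9, 2), (7, 13)]

def fitness(genome):
    """Total end-to-end delay of the path encoded by the genome (lower is better)."""
    return sum(DELAYS[i][g] for i, g in enumerate(genome))

def evolve(pop_size=30, generations=60, p_mut=0.1):
    pop = [[random.randint(0, 1) for _ in DELAYS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                       # elitist selection
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)      # pick two parents
            cut = random.randrange(1, len(DELAYS))  # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < p_mut:             # point mutation
                i = random.randrange(len(DELAYS))
                child[i] ^= 1
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
# greedy bit-flip refinement; for this separable fitness it always
# lands on the global optimum (42 ms total delay)
improved = True
while improved:
    improved = False
    for i in range(len(best)):
        trial = best[:]
        trial[i] ^= 1
        if fitness(trial) < fitness(best):
            best, improved = trial, True
```

A real multicast-tree encoding is more elaborate (trees rather than bitstrings, multiple QoS constraints), but the select-crossover-mutate loop is the same.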

  12. An end-to-end assessment of range uncertainty in proton therapy using animal tissues

    Science.gov (United States)

    Zheng, Yuanshui; Kang, Yixiu; Zeidan, Omar; Schreuder, Niek

    2016-11-01

    Accurate assessment of range uncertainty is critical in proton therapy. However, there is a lack of data and consensus on how to evaluate the appropriate amount of uncertainty. The purpose of this study is to quantify the range uncertainty in various treatment conditions in proton therapy, using transmission measurements through various animal tissues. Animal tissues, including a pig head, beef steak, and lamb leg, were used in this study. For each tissue, an end-to-end test closely imitating patient treatments was performed. This included CT scan simulation, treatment planning, image-guided alignment, and beam delivery. Radiochromic films were placed at various depths in the distal dose falloff region to measure depth dose. Comparisons between measured and calculated doses were used to evaluate range differences. The dose difference at the distal falloff between measurement and calculation depends on tissue type and treatment conditions. The estimated range difference was up to 5, 6 and 4 mm for the pig head, beef steak, and lamb leg irradiation, respectively. Our study shows that the TPS was able to calculate proton range within about 1.5% plus 1.5 mm. Accurate assessment of range uncertainty in treatment planning would allow better optimization of proton beam treatment, thus fully achieving proton beams’ superior dose advantage over conventional photon-based radiation therapy.

  13. End-to-End Beam Simulations for the MSU RIA Driver Linac

    CERN Document Server

    Wu, X; Gorelov, D; Grimm, T L; Marti, F; York, R C; Zhao, Q

    2004-01-01

    The Rare Isotope Accelerator (RIA) driver linac proposed by Michigan State University (MSU) will use a 10th sub-harmonic based, superconducting, cw linac to accelerate light and heavy ions to final energies of ≤400 MeV/u with beam powers of 100 to 400 kW. The driver linac uses superconducting quarter-wave, half-wave, and six-cell elliptical cavities with frequencies ranging from 80.5 MHz to 805 MHz for acceleration, and superconducting solenoids and room temperature quadrupoles for transverse focusing. For the heavier ions, two stages of charge-stripping and multiple-charge-state acceleration will be used to meet the beam power requirements and to minimize the requisite accelerating voltage. End-to-end, three-dimensional (3D), beam dynamics simulations from the ECR to the radioactive beam production targets have been performed. These studies include a 3D analysis of multi-charge-state beam acceleration, evaluation of transverse misalignment and rf errors on the machine performance, modeling of the c...

  14. Telecommunications end-to-end systems monitoring on TOPEX/Poseidon: Tools and techniques

    Science.gov (United States)

    Calanche, Bruno J.

    1994-01-01

    The TOPEX/Poseidon Project Satellite Performance Analysis Team's (SPAT) roles and responsibilities have grown to include functions that are typically performed by other teams on JPL Flight Projects. In particular, SPAT Telecommunication's role has expanded beyond the nominal function of monitoring, assessing, characterizing, and trending the spacecraft (S/C) RF/Telecom subsystem to one of End-to-End Information Systems (EEIS) monitoring. This has been accomplished by taking advantage of the spacecraft and ground data system structures and protocols. By processing both the received spacecraft telemetry minor frame ground generated CRC flags and NASCOM block poly error flags, bit error rates (BER) for each link segment can be determined. This provides the capability to characterize the separate link segments, determine science data recovery, and perform fault/anomaly detection and isolation. By monitoring and managing the links, TOPEX has successfully recovered approximately 99.9 percent of the science data with an integrity (BER) of better than 1 x 10(exp -8). This paper presents the algorithms used to process the above flags and the techniques used for EEIS monitoring.
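The per-segment BER estimation described above relies on counting error-flagged frames. A minimal sketch of the idea, under the simplifying assumption (valid only at low error rates) that each CRC-flagged frame carries a single bit error; the frame counts and sizes below are hypothetical, not TOPEX values:

```python
def estimated_ber(flagged_frames: int, total_frames: int, bits_per_frame: int) -> float:
    """Approximate BER from frame CRC flags, assuming at low error rates
    each flagged frame contains a single bit error (illustrative only)."""
    frame_error_rate = flagged_frames / total_frames
    return frame_error_rate / bits_per_frame

# Hypothetical numbers: 3 flagged frames out of 10 million, 8000-bit frames.
print(f"{estimated_ber(3, 10_000_000, 8000):.1e}")
```

Computing this separately from the ground-generated CRC flags and the NASCOM block poly error flags is what lets each link segment be characterized independently.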

  15. End-to-End Image Simulator for Optical Imaging Systems: Equations and Simulation Examples

    Directory of Open Access Journals (Sweden)

    Peter Coppo

    2013-01-01

    Full Text Available The theoretical description of a simplified end-to-end software tool for simulation of data produced by optical instruments, starting from either synthetic or airborne hyperspectral data, is described and some simulation examples of hyperspectral and panchromatic images for existing and future design instruments are also reported. High spatial/spectral resolution images with low intrinsic noise and the sensor/mission specifications are used as inputs for the simulations. The examples reported in this paper show the capabilities of the tool for simulating target detection scenarios, data quality assessment with respect to classification performance and class discrimination, impact of optical design on image quality, and 3D modelling of optical performances. The simulator is conceived as a tool (during phase 0/A for the specification and early development of new Earth observation optical instruments, whose compliance to user’s requirements is achieved through a process of cost/performance trade-off. The Selex Galileo simulator, as compared with other existing image simulators for phase C/D projects of space-borne instruments, implements all modules necessary for a complete panchromatic and hyperspectral image simulation, and it allows excellent flexibility and expandability for new integrated functions because of the adopted IDL-ENVI software environment.

  16. End-To-End Simulation of Launch Vehicle Trajectories Including Stage Separation Dynamics

    Science.gov (United States)

    Albertson, Cindy W.; Tartabini, Paul V.; Pamadi, Bandu N.

    2012-01-01

    The development of methodologies, techniques, and tools for analysis and simulation of stage separation dynamics is critically needed for successful design and operation of multistage reusable launch vehicles. As a part of this activity, the Constraint Force Equation (CFE) methodology was developed and implemented in the Program to Optimize Simulated Trajectories II (POST2). The objective of this paper is to demonstrate the capability of POST2/CFE to simulate a complete end-to-end mission. The vehicle configuration selected was the Two-Stage-To-Orbit (TSTO) Langley Glide Back Booster (LGBB) bimese configuration, an in-house concept consisting of a reusable booster and an orbiter having identical outer mold lines. The proximity and isolated aerodynamic databases used for the simulation were assembled using wind-tunnel test data for this vehicle. POST2/CFE simulation results are presented for the entire mission, from lift-off, through stage separation, orbiter ascent to orbit, and booster glide back to the launch site. Additionally, POST2/CFE stage separation simulation results are compared with results from industry standard commercial software used for solving dynamics problems involving multiple bodies connected by joints.

  17. Addressing the Bandwidth issue in End-to-End Header Compression over IPv6 Tunneling Mechanism

    Directory of Open Access Journals (Sweden)

    Dipti Chauhan

    2015-08-01

    Full Text Available One day IPv6 is going to be the default protocol used over the Internet. But until then we will have networks that run IPv4, IPv6, or both. There are a number of migration technologies which support this transition, such as dual stack, tunneling, and header translation. In this paper we improve the efficiency of the IPv6 tunneling mechanism by compressing the IPv6 header of the tunneled packet, since the IPv6 base header has a fixed length of 40 bytes. Here the tunnel is a multi-hop wireless tunnel and results are analyzed for varying bandwidths of the wireless network. Different network performance parameters, such as throughput, end-to-end delay, jitter, and packet delivery ratio, are taken into account and the results are compared with the uncompressed network. We have used the Qualnet 5.1 simulator, and the simulation results show that using header compression over a multi-hop IPv6 tunnel gives better network performance and bandwidth savings than an uncompressed network.
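A back-of-the-envelope sketch of why compressing the 40-byte inner IPv6 header saves bandwidth in a tunnel: the compressed-header size below is an illustrative assumption, not a value from the study.

```python
# Bandwidth overhead of IPv6-in-IPv6 tunneling, with and without compressing
# the 40-byte inner IPv6 header (compressed size of 4 bytes is assumed here
# purely for illustration).
def overhead(payload: int, inner_header: int, outer_header: int = 40) -> float:
    """Fraction of each tunneled packet consumed by headers."""
    total = payload + inner_header + outer_header
    return (inner_header + outer_header) / total

for payload in (64, 512, 1460):
    full = overhead(payload, inner_header=40)
    compressed = overhead(payload, inner_header=4)
    print(payload, f"{full:.1%} -> {compressed:.1%}")
```

As the loop shows, the savings are largest for small payloads (e.g., VoIP-sized packets), which is where header compression matters most on bandwidth-constrained wireless hops.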

  18. Semantic Complex Event Processing over End-to-End Data Flows

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi [University of Southern California; Simmhan, Yogesh; Prasanna, Viktor K.

    2012-04-01

    Emerging Complex Event Processing (CEP) applications in cyber physical systems like SmartPower Grids present novel challenges for end-to-end analysis over events, flowing from heterogeneous information sources to persistent knowledge repositories. CEP for these applications must support two distinctive features - easy specification patterns over diverse information streams, and integrated pattern detection over realtime and historical events. Existing work on CEP has been limited to relational query patterns, and engines that match events arriving after the query has been registered. We propose SCEPter, a semantic complex event processing framework which uniformly processes queries over continuous and archived events. SCEPter is built around an existing CEP engine with innovative support for semantic event pattern specification and allows their seamless detection over past, present and future events. Specifically, we describe a unified semantic query model that can operate over data flowing through event streams to event repositories. Compile-time and runtime semantic patterns are distinguished and addressed separately for efficiency. Query rewriting is examined and analyzed in the context of temporal boundaries that exist between event streams and their repository to avoid duplicate or missing results. The design and prototype implementation of SCEPter are analyzed using latency and throughput metrics for scenarios from the Smart Grid domain.

  19. A Workflow-based Intelligent Network Data Movement Advisor with End-to-end Performance Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Michelle M. [Southern Illinois Univ., Carbondale, IL (United States); Wu, Chase Q. [Univ. of Memphis, TN (United States)

    2013-11-07

    Next-generation eScience applications often generate large amounts of simulation, experimental, or observational data that must be shared and managed by collaborative organizations. Advanced networking technologies and services have been rapidly developed and deployed to facilitate such massive data transfer. However, these technologies and services have not been fully utilized mainly because their use typically requires significant domain knowledge and in many cases application users are not even aware of their existence. By leveraging the functionalities of an existing Network-Aware Data Movement Advisor (NADMA) utility, we propose a new Workflow-based Intelligent Network Data Movement Advisor (WINDMA) with end-to-end performance optimization for this DOE-funded project. This WINDMA system integrates three major components: resource discovery, data movement, and status monitoring, and supports the sharing of common data movement workflows through account and database management. This system provides a web interface and interacts with existing data/space management and discovery services such as Storage Resource Management, transport methods such as GridFTP and GlobusOnline, and network resource provisioning brokers such as ION and OSCARS. We demonstrate the efficacy of the proposed transport-support workflow system in several use cases based on its implementation and deployment in DOE wide-area networks.

  20. Vascular Coupling System for End-to-End Anastomosis: An In Vivo Pilot Case Report.

    Science.gov (United States)

    Li, Huizhong; Gale, Bruce; Shea, Jill; Sant, Himanshu; Terry, Christi M; Agarwal, Jay

    2017-03-01

    This paper presents the latest in vivo findings of a novel vascular coupling system. Vascular anastomosis is a common procedure in reconstructive surgeries and traditional hand suturing is very time consuming. The vascular coupling system described herein was designed to be used on arteries for a rapid and error-free anastomosis. The system consists of an engaging ring made from high density polyethylene using computer numerical control machining and a back ring made from polymethylmethacrylate using laser cutting. The vascular coupling system and its corresponding installation tools were tested in a pilot animal study to evaluate their efficacy in completing arterial anastomosis. A segment of expanded polytetrafluoroethylene (ePTFE) tubing was interposed into a transected carotid artery by anastomosis using two couplers in a pig. Two end-to-end anastomoses were accomplished. Ultrasound images were obtained to evaluate the blood flow at the anastomotic site immediately after the surgery. MRI was also performed 2 weeks after the surgery to evaluate vessel and ePTFE graft patency. This anastomotic system demonstrated high efficacy and easy usability, which should facilitate vascular anastomosis procedures in trauma and reconstructive surgeries.

  1. Status report of the end-to-end ASKAP software system: towards early science operations

    Science.gov (United States)

    Guzman, Juan Carlos; Chapman, Jessica; Marquarding, Malte; Whiting, Matthew

    2016-08-01

    300 MHz bandwidth for Array Release 1; followed by the deployment of the real-time data processing components. In addition to the Central Processor, the first production release of the CSIRO ASKAP Science Data Archive (CASDA) has also been deployed in one of the Pawsey Supercomputing Centre facilities and it is integrated to the end-to-end ASKAP data flow system. This paper describes the current status of the "end-to-end" data flow software system from preparing observations to data acquisition, processing and archiving; and the challenges of integrating an HPC facility as a key part of the instrument. It also shares some lessons learned since the start of integration activities and the challenges ahead in preparation for the start of the Early Science program.

  2. End-to-end self-assembly of RADA 16-I nanofibrils in aqueous solutions.

    Science.gov (United States)

    Arosio, Paolo; Owczarz, Marta; Wu, Hua; Butté, Alessandro; Morbidelli, Massimo

    2012-04-04

    RADARADARADARADA (RADA 16-I) is a synthetic amphiphilic peptide designed to self-assemble in a controlled way into fibrils and higher ordered structures depending on pH. In this work, we use various techniques to investigate the state of the peptide dispersed in water under dilute conditions at different pH and in the presence of trifluoroacetic acid or hydrochloric acid. We have identified stable RADA 16-I fibrils at pH 2.0-4.5, which have a length of ∼200-400 nm and diameter of 10 nm. The fibrils have the characteristic antiparallel β-sheet structure of amyloid fibrils, as measured by circular dichroism and Fourier transform infrared spectrometry. During incubation at pH 2.0-4.5, the fibrils elongate very slowly via an end-to-end fibril-fibril aggregation mechanism, without changing their diameter, and the kinetics of such aggregation depends on pH and anion type. At pH 2.0, we also observed a substantial amount of monomers in the system, which do not participate in the fibril elongation and degrade to fragments. The fibril-fibril elongation kinetics has been simulated using a Smoluchowski kinetic model based on population balance equations, and the simulation results are in good agreement with the experimental data. It is also found that the aggregation process is not limited by diffusion but rather is an activated process with an energy barrier on the order of 20 kcal/mol.
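The population-balance approach referred to above can be sketched with the simplest case: constant-kernel Smoluchowski equations, in which two fibrils of any lengths join end to end at the same rate. The rate constant, initial concentration, and size cutoff below are illustrative, not the fitted RADA 16-I values.

```python
import numpy as np

# Constant-kernel Smoluchowski (population balance) equations for end-to-end
# fibril-fibril aggregation, integrated with forward Euler.
K, N0 = 1.0, 1.0     # illustrative rate constant and initial concentration
MAX_SIZE = 50        # truncate the size distribution
dt, steps = 1e-3, 2000

n = np.zeros(MAX_SIZE + 1)
n[1] = N0            # start from unit-length fibrils
for _ in range(steps):
    c = np.convolve(n, n)                 # c[i] = sum_j n[j] * n[i-j]
    gain = 0.5 * K * c[: MAX_SIZE + 1]    # two shorter fibrils joining end to end
    gain[:2] = 0.0
    loss = K * n * n.sum()                # fibril consumed by any aggregation event
    n = n + dt * (gain - loss)

t = dt * steps
N_exact = N0 / (1 + 0.5 * K * N0 * t)     # analytic total number, constant kernel
mean_len = float((np.arange(MAX_SIZE + 1) * n).sum() / n.sum())
print(f"N(t)={n.sum():.3f} (exact {N_exact:.3f}), mean length={mean_len:.2f}")
```

The total fibril number decays toward the analytic solution while the mean length grows, which is the signature of elongation by end-to-end joining at constant diameter; a fit to data like that in the abstract would use a pH- and anion-dependent (activated) kernel instead of a constant one.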

  3. SPoRT - An End-to-End R2O Activity

    Science.gov (United States)

    Jedlovec, Gary J.

    2009-01-01

    Established in 2002 to demonstrate the weather and forecasting application of real-time EOS measurements, the Short-term Prediction Research and Transition (SPoRT) program has grown to be an end-to-end research to operations activity focused on the use of advanced NASA modeling and data assimilation approaches, nowcasting techniques, and unique high-resolution multispectral observational data applications from EOS satellites to improve short-term weather forecasts on a regional and local scale. SPoRT currently partners with several universities and other government agencies for access to real-time data and products, and works collaboratively with them and operational end users at 13 WFOs to develop and test the new products and capabilities in a "test-bed" mode. The test-bed simulates key aspects of the operational environment without putting constraints on the forecaster workload. Products and capabilities which show utility in the test-bed environment are then transitioned experimentally into the operational environment for further evaluation and assessment. SPoRT focuses on a suite of data and products from MODIS, AMSR-E, and AIRS on the NASA Terra and Aqua satellites, and total lightning measurements from ground-based networks. Some of the observations are assimilated into or used with various versions of the WRF model to provide supplemental forecast guidance to operational end users. SPoRT is enhancing partnerships with NOAA / NESDIS for new product development and data access to exploit the remote sensing capabilities of instruments on the NPOESS satellites to address short term weather forecasting problems. The VIIRS and CrIS instruments on the NPP and follow-on NPOESS satellites provide similar observing capabilities to the MODIS and AIRS instruments on Terra and Aqua. SPoRT will be transitioning existing and new capabilities into the AWIPS II environment to maintain the continuity of its activities.

  4. SME2EM: Smart mobile end-to-end monitoring architecture for life-long diseases.

    Science.gov (United States)

    Serhani, Mohamed Adel; Menshawy, Mohamed El; Benharref, Abdelghani

    2016-01-01

    Monitoring life-long diseases requires continuous measurements and recording of physical vital signs. Most of these diseases are manifested through unexpected and non-uniform occurrences and behaviors. It is impractical to keep patients in hospitals, health-care institutions, or even at home for long periods of time. Monitoring solutions based on smartphones combined with mobile sensors and wireless communication technologies are a potential candidate to support complete mobility-freedom, not only for patients, but also for physicians. However, existing monitoring architectures based on smartphones and modern communication technologies are not suitable to address some challenging issues, such as intensive and big data, resource constraints, data integration, and context awareness in an integrated framework. This manuscript provides a novel mobile-based end-to-end architecture for live monitoring and visualization of life-long diseases. The proposed architecture provides smartness features to cope with continuous monitoring, data explosion, dynamic adaptation, unlimited mobility, and constrained devices resources. The integration of the architecture's components provides information about diseases' recurrences as soon as they occur to expedite taking necessary actions, and thus prevent severe consequences. Our architecture system is formally model-checked to automatically verify its correctness against designers' desirable properties at design time. Its components are fully implemented as Web services with respect to the SOA architecture to be easy to deploy and integrate, and supported by Cloud infrastructure and services to allow high scalability, availability of processes and data being stored and exchanged. The architecture's applicability is evaluated through concrete experimental scenarios on monitoring and visualizing states of epileptic diseases. The obtained theoretical and experimental results are very promising and efficiently satisfy the proposed

  5. A NASA Climate Model Data Services (CDS) End-to-End System to Support Reanalysis Intercomparison

    Science.gov (United States)

    Carriere, L.; Potter, G. L.; McInerney, M.; Nadeau, D.; Shen, Y.; Duffy, D.; Schnase, J. L.; Maxwell, T. P.; Huffer, E.

    2014-12-01

    The NASA Climate Model Data Service (CDS) and the NASA Center for Climate Simulation (NCCS) are collaborating to provide an end-to-end system for the comparative study of the major Reanalysis projects, currently, ECMWF ERA-Interim, NASA/GMAO MERRA, NOAA/NCEP CFSR, NOAA/ESRL 20CR, and JMA JRA25. Components of the system include the full spectrum of Climate Model Data Services: Data, Compute Services, Data Services, Analytic Services and Knowledge Services. The Data includes standard Reanalysis model output, and will be expanded to include gridded observations, and gridded Innovations (O-A and O-F). The NCCS High Performance Science Cloud provides the compute environment (storage, servers, and network). Data Services are provided through an Earth System Grid Federation (ESGF) data node complete with Live Access Server (LAS), Web Map Service (WMS) and Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) for visualization, as well as a collaborative interface through the Earth System CoG. Analytic Services include UV-CDAT for analysis and MERRA/AS, accessed via the CDS API, for computation services, both part of the CDS Climate Analytics as a Service (CAaaS). Knowledge Services include access to an Ontology browser, ODISEES, for metadata search and data retrieval. The result is a system that provides the ability for both reanalysis scientists and those scientists in need of reanalysis output to identify the data of interest, compare, compute, visualize, and research without the need for transferring large volumes of data, performing time consuming format conversions, and writing code for frequently run computations and visualizations.

  6. The relationship between suture number and the healing process of end-to-end arterial anastomosis

    Directory of Open Access Journals (Sweden)

    Winston B Yoshida

    1997-06-01

    Full Text Available In spite of the great experience accumulated in vascular repair, the ideal number of sutures for vascular anastomosis remains controversial. It is generally thought that the more stitches applied in a vascular anastomosis, the less resistant the anastomosis will be. The purpose of this study was to test this hypothesis in 20 rabbits, in which both carotid arteries were cross-sectioned and repaired by end-to-end anastomosis with 8 interrupted sutures on one side (G1) and 16 on the other side (G2). After 3 and 15 days, the animals were randomly allocated for tensile strength and hydroxyproline determination (7 animals) and for histologic analysis of the anastomosis (3 animals). Conventional staining procedures (hematoxylin-eosin and Masson methods) and the picrosirius red polarization (PSP) technique for collagen type determination were used. From 3 to 15 days, the tensile strength increased in both groups, from 265.0±44.4 g to 391.2±49.0 g in G1 and from 310.0±71.5 g to 348.7±84.0 g in G2 (p<0.005), with no statistical difference between the groups in each period of study. The hydroxyproline content, expressed as the hydroxyproline/protein ratio, varied from 0.04±0.01 to 0.05±0.02 in G1 and from 0.05±0.01 to 0.05±0.02 in G2, with no significant difference between periods and groups. The histology showed similar inflammatory and reparative aspects in both groups. In both groups and periods the PSP technique demonstrated predominantly type I collagen in relation to type III collagen in the anastomosis. We concluded that even doubling the number of stitches did not change the healing process and strength of the arterial anastomosis.

  7. Weaving marine food webs from end to end under global change

    Science.gov (United States)

    Moloney, Coleen L.; St John, Michael A.; Denman, Kenneth L.; Karl, David M.; Köster, Friedrich W.; Sundby, Svein; Wilson, Rory P.

    2011-02-01

    Marine food web dynamics are determined by interactions within and between species and between species and their environment. Global change directly affects abiotic conditions and living organisms, impinging on all trophic levels in food webs. Different groups of marine researchers traditionally study different aspects of these changes. However, over medium to long time scales perturbations affecting food webs need to be considered across the full range from nutrients to top predators. Studies of end-to-end marine food webs not only span organism sizes and trophic levels, but should also help align multidisciplinary research to common goals and perspectives. Topics are described that bridge disciplinary gaps and are needed to develop new understanding of the reciprocal impacts of global change on marine food webs and ocean biogeochemistry. These include (1) the effects of nutrients on biomass and production, (2) the effects of varying element ratios on food web structure and food quality, (3) bulk flows of energy and material in food webs and their efficiencies of transfer, (4) the ecological effects of species richness and the roles of microbial organisms, (5) the role of feeding behaviour in food web dynamics and trophic controls, (6) the spatial dynamics of communities and links between different food webs, (7) the combined effects of body size and behaviour in determining dynamics of food webs, and (8) the extent to which the ability of marine organisms (and communities) to adapt will influence food web dynamics. An overriding issue that influences all topics concerns the time and space scales of ecosystem variability. Threads link different nodes of information among various topics, emphasizing the importance of tackling food web studies with a variety of modelling approaches and through a combination of field and experimental studies with a strong comparative approach.

  8. End-to-End Models for Effects of System Noise on LIMS Analysis of Igneous Rocks

    Energy Technology Data Exchange (ETDEWEB)

    Clegg, Samuel M [Los Alamos National Laboratory; Bender, Steven [Los Alamos National Laboratory; Wiens, R. C. [Los Alamos National Laboratory; Carmosino, Marco L [MT. HOLYOKE COLLEGE; Speicher, Elly A [MT. HOLYOKE COLLEGE; Dyar, M. D. [MT. HOLYOKE COLLEGE

    2010-12-23

    The ChemCam instrument on the Mars Science Laboratory will be the first extraterrestrial deployment of laser-induced breakdown spectroscopy (LIBS) for remote geochemical analysis. LIBS instruments are also being proposed for future NASA missions. In quantitative LIBS applications using multivariate analysis techniques, it is essential to understand the effects of key instrument parameters and their variability on the elemental predictions. Baseline experiments were run on a laboratory instrument in conditions reproducing ChemCam performance on Mars. These experiments employed a Nd:YAG laser producing 17 mJ/pulse on target with a 200 µm FWHM spot size on the surface of a sample. The emission is collected by a telescope, imaged onto a fiber optic and then interfaced to a demultiplexer capable of >40% transmission into each spectrometer. We report here on an integrated end-to-end system performance model that simulates the effects of output signal degradation that might result from the input signal chain and the impact on multivariate model predictions. There are two approaches to modifying signal-to-noise ratio (SNR): degrade the signal and/or increase the noise. Ishibashi used a much smaller data set to show that the addition of noise had significant impact while degradation of spectral resolution had much less impact on accuracy and precision. Here, we specifically focus on aspects of remote LIBS instrument performance as they relate to various types of signal degradation. To assess the sensitivity of LIBS analysis to SNR and spectral resolution, the signal in each spectrum from a suite of 50 laboratory spectra of igneous rocks was variably degraded by increasing the peak widths (simulating misalignment) and decreasing the spectral amplitude (simulating decreases in SNR).
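The two degradation modes named above (peak broadening for misalignment, amplitude reduction for SNR loss) can be illustrated on a synthetic spectrum; this is a sketch of the general technique, not the actual processing applied to the 50 igneous-rock spectra.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic emission spectrum: three Gaussian peaks on a flat baseline.
x = np.linspace(0, 100, 2001)
spectrum = np.zeros_like(x)
for center, height in [(20, 1.0), (45, 0.6), (70, 0.9)]:
    spectrum += height * np.exp(-0.5 * ((x - center) / 0.3) ** 2)

def degrade(y, broaden_sigma_pts=0, amplitude=1.0, noise_rms=0.01):
    """Broaden peaks (Gaussian convolution), scale amplitude, add noise."""
    if broaden_sigma_pts > 0:
        k = np.arange(-4 * broaden_sigma_pts, 4 * broaden_sigma_pts + 1)
        kernel = np.exp(-0.5 * (k / broaden_sigma_pts) ** 2)
        kernel /= kernel.sum()                     # unit-area smoothing kernel
        y = np.convolve(y, kernel, mode="same")
    return amplitude * y + rng.normal(0, noise_rms, y.size)

degraded = degrade(spectrum, broaden_sigma_pts=10, amplitude=0.5)
print(spectrum.max(), degraded.max())  # peak amplitude drops after degradation
```

Because convolution conserves peak area, broadening alone lowers peak heights while keeping total signal; combined with amplitude scaling it mimics both resolution loss and SNR loss at once.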

  9. A Mechanistic End-to-End Concussion Model That Translates Head Kinematics to Neurologic Injury

    Directory of Open Access Journals (Sweden)

    Laurel J. Ng

    2017-06-01

    Full Text Available Past concussion studies have focused on understanding the injury processes occurring on discrete length scales (e.g., tissue-level stresses and strains, cell-level stresses and strains, or injury-induced cellular pathology). A comprehensive approach that connects all length scales and relates measurable macroscopic parameters to neurological outcomes is the first step toward rationally unraveling the complexity of this multi-scale system, for better guidance of future research. This paper describes the development of the first quantitative end-to-end (E2E) multi-scale model that links gross head motion to neurological injury by integrating fundamental elements of tissue and cellular mechanical response with axonal dysfunction. The model quantifies axonal stretch (i.e., tension) injury in the corpus callosum, with axonal functionality parameterized in terms of axonal signaling. An internal injury correlate is obtained by calculating a neurological injury measure (the average reduction in the axonal signal amplitude over the corpus callosum). By using a neurologically based quantity rather than externally measured head kinematics, the E2E model is able to unify concussion data across a range of exposure conditions and species with greater sensitivity and specificity than correlates based on external measures. In addition, this model quantitatively links injury of the corpus callosum to observed specific neurobehavioral outcomes that reflect clinical measures of mild traumatic brain injury. This comprehensive modeling framework provides a basis for the systematic improvement and expansion of this mechanistic-based understanding, including widening the range of neurological injury estimation, improving concussion risk correlates, guiding the design of protective equipment, and setting safety standards.

  10. In vivo laser assisted end-to-end anastomosis with ICG-infused chitosan patches

    Science.gov (United States)

    Rossi, Francesca; Matteini, Paolo; Esposito, Giuseppe; Scerrati, Alba; Albanese, Alessio; Puca, Alfredo; Maira, Giulio; Rossi, Giacomo; Pini, Roberto

    2011-07-01

    Laser assisted vascular repair is a new optimized technique based on the use of an ICG-infused chitosan patch to close a vessel wound, with or even without few supporting single stitches. We present an in vivo experimental study on an innovative end-to-end laser assisted vascular anastomotic (LAVA) technique, performed with the application of ICG-infused chitosan patches. The photostability and the mechanical properties of ICG-infused chitosan films were preliminarily measured. The in vivo study was performed in 10 New Zealand rabbits. After anesthesia, a 3-cm segment of the right common carotid artery was exposed, thus clamped proximally and distally. The artery was then interrupted by means of a full thickness cut. Three single microsutures were used to approximate the two vessel edges. The ICG-infused chitosan patch was rolled all over the anastomotic site and welded by the use of a diode laser emitting at 810 nm and equipped with a 300 μm diameter optical fiber. Welding was obtained by delivering single laser spots to induce local patch/tissue adhesion. The result was an immediate closure of the anastomosis, with no bleeding at clamps release. The animals then underwent different follow-up periods, in order to evaluate the welded vessels over time. At follow-up examinations, all the anastomoses were patent and no bleeding signs were documented. Samples of welded vessels underwent histological examinations. Results showed that this technique offers several advantages over conventional suturing methods: simplification of the surgical procedure, shortening of the operative time, better re-endothelialization and an optimal vascular healing process.

  11. The determination of the link with the smallest end-to-end network latency in ethernet architecture

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Ethernet fundamentals and its data transmission model are briefly introduced, and end-to-end network latency is analyzed in this paper. On the premise of not considering transmission quality and transmission cost, latency is a function of the remaining network resource parameters (NRP). The relation between the number of nodes and the number of end-to-end links is presented. In Ethernet architecture, determining the link with the smallest latency is a polynomial problem when the number of network nodes is limited, so it can be solved by way of polynomial equations. Latency measurement is the key issue in determining the link with the smallest network latency. A 3-node brigade (regiment) level network centric warfare (NCW) demonstration platform was studied, and the latency between the detectors and weapon control stations was taken as an example. The algorithm for end-to-end network latency and link information in NCW was presented. An algorithm program based on Server/Client architecture was developed. The optimal data transmission link is the one whose end-to-end latency is the smallest. This paper solves the key issue of determining the link with the smallest end-to-end latency in Ethernet architecture. The study can be widely applied to determine the optimal link in complex network environments with multiple service provision points.
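Once per-link latencies have been measured, selecting the smallest-latency end-to-end link reduces to a shortest-path computation. A minimal sketch using Dijkstra's algorithm follows; the node names and latency values are illustrative assumptions, not the paper's algorithm:

```python
import heapq

def min_latency_path(graph, src, dst):
    """Dijkstra's algorithm over measured per-link latencies.

    graph: {node: [(neighbor, latency), ...]}
    Returns (path, total_latency) for the smallest-latency route.
    """
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    # reconstruct the path by walking predecessors back to the source
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[dst]

# 3-node example: detector D, relay R, weapon control station W (latencies in ms)
net = {"D": [("R", 2.0), ("W", 7.0)], "R": [("W", 3.0)]}
path, latency = min_latency_path(net, "D", "W")
print(path, latency)  # ['D', 'R', 'W'] 5.0
```

Here the two-hop route through the relay (5.0 ms) beats the direct 7.0 ms link, which is exactly the kind of decision the latency-measurement step enables.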

  12. SU-E-T-150: End to End Tests On the First Clinical EDGETM

    Energy Technology Data Exchange (ETDEWEB)

    Scheib, S; Schmelzer, P [Varian Medical Systems, Baden - Daettwil (Switzerland); Vieira, S; Greco, C [Champalimaud Foundation, Lisbon (Portugal)

    2014-06-01

    Purpose: To quantify the sub-millimeter overall accuracy of EDGE™, the dedicated linac-based SRS/SABR treatment platform from Varian, using a novel End-to-End (E2E) test phantom. Methods: The new E2E test phantom developed by Varian consists of a cube with an outer dimension of 15×15×15 cm3. The phantom is equipped with an exchangeable inner cube (7×7×7 cm3) to hold radiochromic films or a tungsten ball (diameter = 5 mm) for Winston-Lutz tests. 16 ceramic balls (diameter = 5 mm) are embedded in the outer cube. Three embedded Calypso transponders allow for Calypso-based monitoring. The outer surface of the phantom is tracked using the Optical Surface Monitoring System (OSMS). The phantom is positioned using kV, MV and CBCT images. A simCT of the phantom was acquired and SRS/SABR plans were treated using the new phantom on the first clinically installed EDGE™. As a first step, a series of EPID-based Winston-Lutz tests was performed. As a second step, the calculated dose distribution applied to the phantom was verified with radiochromic films in orthogonal planes. The measured dose distribution is compared with the calculated (Eclipse) one based on the known isocenter of both dose distributions. The geometrical shift needed to match both dose distributions is the overall accuracy and is determined using dose profiles, isodose lines or gamma pass rates (3%, 1 mm). Results: Winston-Lutz tests using the central tungsten BB demonstrated a targeting accuracy of 0.44±0.18 mm for jaw-defined (2 cm × 2 cm), 0.39±0.19 mm for MLC-defined (2 cm × 2 cm) and 0.37±0.15 mm for cone-defined (12.5 mm) fields. A treated patient plan (spinal metastasis lesion with integrated boost) showed a dosimetric dose localization accuracy of 0.6 mm. Conclusion: Geometric and dosimetric E2E tests on EDGE™ show sub-millimeter E2E targeting and dose localization accuracy.
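Targeting accuracies such as 0.44±0.18 mm are mean ± standard deviation statistics over repeated Winston-Lutz measurements of the 3D offset between the imaged ball and the field center. A minimal sketch of that aggregation, with made-up per-field offsets, is:

```python
import math
import statistics

def targeting_stats(offsets_mm):
    """Mean and sample standard deviation of 3D targeting offset magnitudes.

    offsets_mm: list of (x, y, z) offsets in mm between the imaged
    tungsten BB center and the radiation field center, one per field.
    """
    magnitudes = [math.sqrt(x * x + y * y + z * z) for x, y, z in offsets_mm]
    return statistics.mean(magnitudes), statistics.stdev(magnitudes)

# Hypothetical per-field offsets (mm); real values come from EPID images.
offsets = [(0.2, 0.3, 0.1), (0.1, 0.4, 0.2), (0.3, 0.2, 0.2), (0.2, 0.2, 0.3)]
mean_err, sd_err = targeting_stats(offsets)
print(f"{mean_err:.2f} +/- {sd_err:.2f} mm")
```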

  13. Astra: Interdisciplinary study on enhancement of the end-to-end accuracy for spacecraft tracking techniques

    Science.gov (United States)

    Iess, Luciano; Di Benedetto, Mauro; James, Nick; Mercolino, Mattia; Simone, Lorenzo; Tortora, Paolo

    2014-02-01

    Navigation of deep-space probes is accomplished through a variety of different radio observables, namely Doppler, ranging and Delta-Differential One-Way Ranging (Delta-DOR). The particular mix of observations used for navigation mainly depends on the available on-board radio system, the mission phase and orbit determination requirements. The accuracy of current ESA and NASA tracking systems is at the level of 0.1 mm/s at 60 s integration time for Doppler, 1-5 m for ranging and 6-15 nrad for Delta-DOR measurements in a wide range of operational conditions. The ASTRA study, funded under ESA's General Studies Programme (GSP), addresses the ways to improve the end-to-end accuracy of Doppler, ranging and Delta-DOR systems by roughly a factor of 10. The target accuracies were set to 0.01 mm/s at 60 s integration time for Doppler, 20 cm for ranging and 1 nrad for Delta-DOR. The companies and universities that took part in the study were the University of Rome Sapienza, ALMASpace, BAE Systems and Thales Alenia Space Italy. The analysis of an extensive data set of radio-metric observables and dedicated tests of the ground station allowed consolidating the error budget for each measurement technique. The radio-metric data set comprises X/X, X/Ka and Ka/Ka range and Doppler observables from the Cassini and Rosetta missions. It also includes measurements from the Advanced Media Calibration System (AMCS) developed by JPL for the radio science experiments of the Cassini mission. The error budget for the three radio-metric observables was consolidated by comparing the statistical properties of the data set with the expected error models. The analysis confirmed the contribution from some error sources, but also revealed some discrepancies and ultimately led to improved error models. The error budget reassessment provides adequate information for building guidelines and strategies to effectively improve the navigation accuracies of future deep space missions. We report both on updated

  14. An End-to-End System to Enable Quick, Easy and Inexpensive Deployment of Hydrometeorological Stations

    Science.gov (United States)

    Celicourt, P.; Piasecki, M.

    2014-12-01

    The high cost of hydro-meteorological data acquisition, communication and publication systems, along with limited qualified human resources, is considered the main reason why hydro-meteorological data collection remains a challenge, especially in developing countries. Despite significant advances in sensor network technologies, which in the last two decades gave birth to open hardware and software and to low-cost (less than $50) and low-power (on the order of a few milliwatts) sensor platforms, sensor and sensor network deployment remains a labor-intensive, time-consuming, cumbersome, and thus expensive task. These factors give rise to the need to develop an affordable, simple-to-deploy, scalable and self-organizing end-to-end (from sensor to publication) system suitable for deployment in such countries. The design of the envisioned system will consist of a few Sensed-And-Programmed Arduino-based sensor nodes with low-cost sensors measuring parameters relevant to hydrological processes, and a Raspberry Pi micro-computer hosting the in-the-field back-end data management. The latter comprises the Python/Django model of the CUAHSI Observations Data Model (ODM), namely DjangODM, backed by a PostgreSQL database server. We are also developing a Python-based data processing script which will be paired with the data autoloading capability of Django to populate the DjangODM database with the incoming data. To publish the data, WOFpy (WaterOneFlow Web Services in Python), developed by the Texas Water Development Board for 'Water Data for Texas', which can produce WaterML web services from a variety of back-end database installations such as SQLite, MySQL, and PostgreSQL, will be used. A step further would be the development of an appealing online visualization tool using Python statistics and analytics tools (SciPy, NumPy, Pandas) showing the spatial distribution of variables across an entire watershed as a time-variant layer on top of a basemap.
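The autoloading step described above can be sketched in miniature: parse incoming sensor readings and insert them into a relational store. The stdlib sqlite3 module stands in here for the PostgreSQL-backed DjangODM database; the table name, columns, and sample readings are illustrative assumptions, not the ODM schema:

```python
import csv
import io
import sqlite3

# Simulated incoming readings from a field node (CSV is an assumed format).
incoming = io.StringIO(
    "timestamp,site,variable,value\n"
    "2014-10-01T00:00:00,station1,rainfall_mm,1.2\n"
    "2014-10-01T00:05:00,station1,rainfall_mm,0.8\n"
)

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE datavalues (ts TEXT, site TEXT, variable TEXT, value REAL)"
)
rows = [
    (r["timestamp"], r["site"], r["variable"], float(r["value"]))
    for r in csv.DictReader(incoming)
]
conn.executemany("INSERT INTO datavalues VALUES (?, ?, ?, ?)", rows)

# A downstream service (WOFpy in the real system) would then query the store.
total = conn.execute(
    "SELECT SUM(value) FROM datavalues WHERE variable = 'rainfall_mm'"
).fetchone()[0]
print(total)  # total rainfall over the ingested window
```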

  15. Urban Biomining Meets Printable Electronics: End-To-End at Destination Biological Recycling and Reprinting

    Science.gov (United States)

    Rothschild, Lynn J. (Principal Investigator); Koehne, Jessica; Gandhiraman, Ram; Navarrete, Jesica; Spangle, Dylan

    2017-01-01

    Space missions rely utterly on metallic components, from the spacecraft to electronics. Yet metals add mass, and electronics have the additional problem of a limited lifespan. Thus, current mission architectures must plan for replacement. In space, spent electronics are discarded; on Earth, there is some recycling, but current processes are toxic and environmentally hazardous. Imagine instead end-to-end recycling of spent electronics at low mass, low cost, room temperature, and in a non-toxic manner. Here, we propose a solution that will not only enhance mission success by decreasing upmass and providing a fresh supply of electronics, but also has immediate applications to a serious environmental issue on Earth. Spent electronics will be used as feedstock to make fresh electronic components, a process we will accomplish with so-called 'urban biomining' using synthetically enhanced microbes to bind metals with elemental specificity. To create new electronics, the microbes will be used as 'bioink' to print a new IC chip, using plasma jet electronics printing. The plasma jet electronics printing technology will have the potential to use Martian atmospheric gas to print and to tailor the electronic and chemical properties of the materials. Our preliminary results have suggested that this process also serves as a purification step to enhance the proportion of metals in the 'bioink'. The presence of an electric field and plasma can ensure printing in a microgravity environment while also providing material morphology and electronic structure tunability, and thus optimization. Here we propose to increase the TRL level of the concept by engineering microbes to dissolve the siliceous matrix in the IC, extract copper from a mixture of metals, and use the microbes as feedstock to print interconnects using Mars gas simulant. To assess the ability of this concept to influence mission architecture, we will do an analysis of the infrastructure required to execute

  16. Achieving End-to-End QoS in the Next Generation Internet: Integrated Services over Differentiated Service Networks

    Science.gov (United States)

    Bai, Haowei; Atiquzzaman, Mohammed; Ivancic, William

    2001-01-01

    Currently there are two approaches to providing Quality of Service (QoS) in the next generation Internet: an earlier one is Integrated Services (IntServ), with the goal of allowing end-to-end QoS to be provided to applications; the other is the Differentiated Services (DiffServ) architecture, which provides QoS in the backbone. In this context, a DiffServ network may be viewed as a network element in the total end-to-end path. The objective of this paper is to investigate the possibility of providing end-to-end QoS when IntServ runs over a DiffServ backbone in the next generation Internet. Our results show that the QoS requirements of IntServ applications can be successfully achieved when IntServ traffic is mapped to the DiffServ domain in the next generation Internet.
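Running IntServ over a DiffServ backbone requires mapping IntServ service classes onto DiffServ per-hop behaviors at the domain edge. A toy sketch of such a mapping follows; the EF and AF11 code points are the standard DSCP values, but the mapping policy itself is an illustrative assumption, not the paper's scheme:

```python
# Standard DSCP code points for the named per-hop behaviors:
# EF (Expedited Forwarding) = 46, AF11 (Assured Forwarding 1,1) = 10,
# BE (Best Effort, default PHB) = 0.
DSCP = {"EF": 46, "AF11": 10, "BE": 0}

def map_intserv_to_dscp(service_class):
    """Map an IntServ flow's service class to a DSCP code point.

    Unrecognized classes fall back to best effort.
    """
    policy = {
        "guaranteed": "EF",         # Guaranteed Service -> Expedited Forwarding
        "controlled-load": "AF11",  # Controlled-Load -> Assured Forwarding
    }
    return DSCP[policy.get(service_class, "BE")]

print(map_intserv_to_dscp("guaranteed"))       # 46
print(map_intserv_to_dscp("controlled-load"))  # 10
print(map_intserv_to_dscp("best-effort"))      # 0
```

The edge router marks each admitted IntServ flow's packets with the chosen DSCP, and interior DiffServ routers then treat them per aggregate class rather than per flow.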

  17. Differentiated CW Policy and Strict Priority Policy for Location-Independent End-to-End Delay in Multi-Hop Wireless Mesh Networks

    Science.gov (United States)

    Bae, Yun Han; Kim, Kyung Jae; Park, Jin Soo; Choi, Bong Dae

    We investigate delay analysis of a multi-hop wireless mesh network (WMN) where nodes have multiple channels and multiple transceivers to increase the network capacity. The functionality of the multiple channels and transceivers allows the whole WMN to be decomposed into disjoint zones in such a way that i) nodes in a zone are within one-hop distance, and relay nodes and end nodes with different CWmins contend to access the channel based on IEEE 802.11e EDCA, ii) different channels are assigned to neighboring zones to prevent the hidden node problem, iii) relay nodes can transmit and receive packets simultaneously via the multiple channels and transceivers. With this decomposition of the network, we focus on the delay at a single zone; the end-to-end delay can then be obtained as the sum of zone delays. In order to achieve a location-independent end-to-end delay to the gateway regardless of source nodes' locations, we propose two packet management schemes, called the differentiated CW policy and the strict priority policy, at each relay node, where relay packets with longer hop counts are buffered in higher-priority queues according to their experienced hop count. For the differentiated CW policy, a relay node adopts the functionality of IEEE 802.11e EDCA, where a higher-priority queue has a shorter minimum contention window. We model a typical zone as a one-hop IEEE 802.11e EDCA network under non-saturation conditions, where priority queues have different packet arrival rates and different minimum contention window sizes. First, we find the PGF (probability generating function) of the HoL-delay of packets at priority queues in a zone. Second, by modeling each queue as an M/G/1 queue with the HoL-delay as a service time, we obtain the packet delay (the sum of the queueing delay and the HoL-delay) of each priority queue in a zone. Third, the average end-to-end delay of a packet generated at an end node in each zone is obtained by summing up the packet delays at each zone. For
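The final step above, summing per-zone M/G/1 packet delays into an end-to-end delay, can be sketched with the Pollaczek-Khinchine mean-delay formula, W = E[S] + λ·E[S²] / (2(1 − ρ)) with ρ = λ·E[S]. The per-zone arrival rates and service-time moments below are illustrative assumptions, not values from the paper:

```python
def mg1_mean_delay(lam, es, es2):
    """Mean M/G/1 sojourn time (queueing + service) via Pollaczek-Khinchine.

    lam: arrival rate, es: E[S] (mean service time, here the HoL-delay),
    es2: E[S^2] (second moment of service time).
    """
    rho = lam * es
    assert rho < 1.0, "queue must be stable"
    return es + lam * es2 / (2.0 * (1.0 - rho))

def end_to_end_delay(zone_params):
    """End-to-end delay as the sum of per-zone M/G/1 delays.

    zone_params: list of (lam, E[S], E[S^2]) per hop toward the gateway.
    """
    return sum(mg1_mean_delay(lam, es, es2) for lam, es, es2 in zone_params)

# Three-hop example toward the gateway (units: seconds); values are assumed.
zones = [(50.0, 0.004, 3e-5), (40.0, 0.005, 4e-5), (30.0, 0.006, 5e-5)]
print(end_to_end_delay(zones))
```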

  18. On the importance of risk knowledge for an end-to-end tsunami early warning system

    Science.gov (United States)

    Post, Joachim; Strunz, Günter; Riedlinger, Torsten; Mück, Matthias; Wegscheider, Stephanie; Zosseder, Kai; Steinmetz, Tilmann; Gebert, Niklas; Anwar, Herryal

    2010-05-01

    context has been worked out. The generated results contribute significantly in the fields of (1) warning decision and warning levels, (2) warning dissemination and warning message content, (3) early warning chain planning, (4) increasing response capabilities and protective systems, (5) emergency relief and (6) enhancing communities' awareness of and preparedness for tsunami threats. Additionally, examples will be given of the potential for operational use of risk information in early warning systems, as first experiences exist for the tsunami early warning center in Jakarta, Indonesia. Besides this, the importance of linking national-level early warning information with tsunami risk information available at the local level (e.g. linking warning message information on expected intensity with respective tsunami hazard zone maps at community level for effective evacuation) will be demonstrated through experiences gained in three pilot areas in Indonesia. The presentation seeks to provide new insights on the benefits of using risk information in early warning and will provide further evidence that practical use of risk information is an important and indispensable component of end-to-end early warning.

  19. Ocean Acidification Scientific Data Stewardship: An approach for end-to-end data management and integration

    Science.gov (United States)

    Arzayus, K. M.; Garcia, H. E.; Jiang, L.; Michael, P.

    2012-12-01

    As the designated Federal permanent oceanographic data center in the United States, NOAA's National Oceanographic Data Center (NODC) has been providing scientific stewardship for national and international marine environmental and ecosystem data for over 50 years. NODC is supporting NOAA's Ocean Acidification Program and the science community by providing end-to-end scientific data management of ocean acidification (OA) data, dedicated online data discovery, and user-friendly access to a diverse range of historical and modern OA and other chemical, physical, and biological oceanographic data. This effort is being catalyzed by the NOAA Ocean Acidification Program, but the intended reach is for the broader scientific ocean acidification community. The first three years of the project will be focused on infrastructure building. A complete ocean acidification data content standard is being developed to ensure that a full spectrum of ocean acidification data and metadata can be stored and utilized for optimal data discovery and access in usable data formats. We plan to develop a data access interface capable of allowing users to constrain their search based on real-time and delayed mode measured variables, scientific data quality, their observation types, the temporal coverage, methods, instruments, standards, collecting institutions, and the spatial coverage. In addition, NODC seeks to utilize the existing suite of international standards (including ISO 19115-2 and CF-compliant netCDF) to help our data producers use those standards for their data, and help our data consumers make use of the well-standardized metadata-rich data sets. These tools will be available through our NODC Ocean Acidification Scientific Data Stewardship (OADS) web page at http://www.nodc.noaa.gov/oceanacidification. NODC also has a goal to provide each archived dataset with a unique ID, to ensure a means of providing credit to the data provider. Working with partner institutions, such as the

  20. SensorKit: An End-to-End Solution for Environmental Sensor Networking

    Science.gov (United States)

    Silva, F.; Graham, E.; Deschon, A.; Lam, Y.; Goldman, J.; Wroclawski, J.; Kaiser, W.; Benzel, T.

    2008-12-01

    Modern day sensor network technology has shown great promise to transform environmental data collection. However, despite the promise, these systems have remained the purview of the engineers and computer scientists who design them rather than a useful tool for the environmental scientists who need them. SensorKit is conceived of as a way to make wireless sensor networks accessible to The People: it is an advanced, powerful tool for sensor data collection that does not require advanced technological know-how. We are aiming to make wireless sensor networks for environmental science as simple as setting up a standard home computer network by providing simple, tested configurations of commercially-available hardware, free and easy-to-use software, and step-by-step tutorials. We designed and built SensorKit using a simplicity-through-sophistication approach, supplying users a powerful sensor-to-database end-to-end system with a simple and intuitive user interface. Our objective in building SensorKit was to make the prospect of using environmental sensor networks as simple as possible. We built SensorKit from off-the-shelf hardware components, using the Compact RIO platform from National Instruments for data acquisition due to its modular architecture and flexibility to support a large number of sensor types. In SensorKit, we support various types of analog, digital and networked sensors. Our modular software architecture allows us to abstract sensor details and provide users a common way to acquire data and to command different types of sensors. SensorKit is built on top of the Sensor Processing and Acquisition Network (SPAN), a modular framework for acquiring data in the field, moving it reliably to the scientist's institution, and storing it in an easily-accessible database. SPAN allows real-time access to the data in the field by providing various options for long haul communication, such as cellular and satellite links. Our system also features reliable data storage

  1. Unidata's Vision for Providing Comprehensive and End-to-end Data Services

    Science.gov (United States)

    Ramamurthy, M. K.

    2009-05-01

    This paper presents Unidata's vision for providing comprehensive, well-integrated, and end-to-end data services for the geosciences. These include an array of functions for collecting, finding, and accessing data; data management tools for generating, cataloging, and exchanging metadata; and submitting or publishing, sharing, analyzing, visualizing, and integrating data. When this vision is realized, users, no matter where they are or how they are connected to the Internet, will be able to find and access a plethora of geosciences data and use Unidata-provided tools and services both productively and creatively in their research and education. What that vision means for the Unidata community is elucidated by drawing a simple analogy. Most users are familiar with the Amazon and eBay e-commerce sites and content sharing sites like YouTube and Flickr. On the eBay marketplace, people can sell practically anything at any time and buyers can share their experience of purchasing a product or the reputation of a seller. Likewise, at Amazon, thousands of merchants sell their goods and millions of customers not only buy those goods, but provide a review or opinion of the products they buy and share their experiences as purchasers. Similarly, YouTube and Flickr are sites tailored to video- and photo-sharing, respectively, where users can upload their own content and share it with millions of other users, including family and friends. What all these sites, together with social-networking applications like MySpace and Facebook, have enabled is a sense of a virtual community in which users can search and browse products or content, comment on and rate those products from anywhere, at any time, and via any Internet-enabled device like an iPhone, laptop, or desktop computer. In essence, these enterprises have fundamentally altered people's buying modes and behavior toward purchases. Unidata believes that similar approaches, appropriately tailored to meet the needs of the scientific

  2. Integration proposal through standard-based design of an end-to-end platform for p-Health environments.

    Science.gov (United States)

    Martínez, I; Trigo, J D; Martínez-Espronceda, M; Escayola, J; Muñoz, P; Serrano, L; García, J

    2009-01-01

    Interoperability among medical devices and compute engines in the personal environment of the patient, and with healthcare information systems in the remote monitoring and management process, is a key need that requires developments supported by standards-based design. Even though there have been some international initiatives to combine different standards, the vision of an entire end-to-end standards-based system is the next challenge. This paper presents the implementation guidelines of a ubiquitous platform for Personal Health (p-Health). It is standards-based, using the two main medical standards in this context: ISO/IEEE 11073 in the patient environment for medical device interoperability, and EN 13606 to allow the interoperable communication of the Electronic Healthcare Record of the patient. Furthermore, a new protocol for End-to-End Standard Harmonization (E2ESHP) is proposed in order to make end-to-end standard integration possible. The platform has been designed to comply with the latest available ISO/IEEE 11073 and EN 13606 versions, and tested in a laboratory environment as a proof-of-concept to illustrate its feasibility as an end-to-end standards-based solution.

  3. SLA calculus for end-to-end QOS of TCP-based applications in a multi-domain environment

    NARCIS (Netherlands)

    Kooij, R.E.; Berg, J.L. van den; Yang, R.; Mei, R.D. van der

    2006-01-01

    Next-generation communication services will be offered over distributed information and communication infrastructures consisting of a multitude of administrative domains, owned by different parties. This raises the problem for service providers to provide satisfactory levels of end-to-end Quality of

  4. A vision for end-to-end data services to foster international partnerships through data sharing

    Science.gov (United States)

    Ramamurthy, M.; Yoksas, T.

    2009-04-01

    Increasingly, the conduct of science requires scientific partnerships and sharing of knowledge, information, and other assets. This is particularly true in our field where the highly-coupled Earth system and its many linkages have heightened the importance of collaborations across geographic, disciplinary, and organizational boundaries. The climate system, for example, is far too complex a puzzle to be unraveled by individual investigators or nations. As articulated in the NSF Strategic Plan: FY 2006-2011, "…discovery increasingly requires expertise of individuals from different disciplines, with diverse perspectives, and often from different nations, working together to accommodate the extraordinary complexity of today's science and engineering challenges." The Nobel Prize winning IPCC assessments are a prime example of such an effort. Earth science education is also uniquely suited to drawing connections between the dynamic Earth system and societal issues. Events like the 2004 Indian Ocean tsunami and Hurricane Katrina provide ample evidence of this relevance, as they underscore the importance of timely and interdisciplinary integration and synthesis of data. Our success in addressing such complex problems and advancing geosciences depends on the availability of a state-of-the-art and robust cyberinfrastructure, transparent and timely access to high-quality data from diverse sources, and requisite tools to integrate and use the data effectively, toward creating new knowledge. To that end, Unidata's vision calls for providing comprehensive, well-integrated, and end-to-end data services for the geosciences. These include an array of functions for collecting, finding, and accessing data; data management tools for generating, cataloging, and exchanging metadata; and submitting or publishing, sharing, analyzing, visualizing, and integrating data. When this vision is realized, users — no matter where they are, how they are connected to the Internet, or what

  5. WE-G-BRD-08: End-To-End Targeting Accuracy of the Gamma Knife for Trigeminal Neuralgia

    Energy Technology Data Exchange (ETDEWEB)

    Brezovich, I; Wu, X; Duan, J; Benhabib, S; Huang, M; Shen, S; Cardan, R; Popple, R [University Alabama Birmingham, Birmingham, AL (United States)

    2014-06-15

    Purpose: Current QA procedures verify the accuracy of individual equipment parameters, but may not include CT and MRI localizers. This study uses an end-to-end approach to measure the overall targeting errors in individual patients previously treated for trigeminal neuralgia. Methods: The trigeminal nerve is simulated by a 3 mm long, 3.175 mm (1/8 inch) diameter MRI contrast-filled cavity embedded within a PMMA plastic capsule. The capsule is positioned within the head frame such that the cavity position matches the Gamma Knife coordinates of 10 previously treated patients. Gafchromic EBT2 film is placed at the center of the cavity in coronal and sagittal orientations. The films are marked with a pin prick to identify the cavity center. Treatments are planned for delivery with 4 mm collimators using MRI and CT scans acquired with the clinical localizer boxes and acquisition protocols. Coordinates of shots are chosen so that the cavity is centered within the 50% isodose volume. Following irradiation, the films are scanned and analyzed. Targeting errors are defined as the distance between the pin prick and the centroid of the 50% isodose line. Results: Averaged over 10 patient simulations, targeting errors along the x, y and z coordinates (patient left-to-right, posterior-anterior, head-to-foot) were, respectively, −0.060 ± 0.363, −0.350 ± 0.253, and 0.364 ± 0.191 mm when MRI was used for treatment planning. Planning according to CT exhibited generally smaller errors, namely 0.109 ± 0.167, −0.191 ± 0.144, and 0.211 ± 0.94 mm. The largest errors in MRI- and CT-planned treatments were, respectively, y = −0.761 and x = 0.428 mm. Conclusion: Unless patient motion or stronger MRI image distortion in actual treatments caused additional errors, all patients received the prescribed dose, i.e., the targeted section of the trigeminal nerve was contained within the 50% isodose surface in all cases.
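The targeting-error definition above, the distance from the pin prick to the centroid of the 50% isodose region on the scanned film, can be sketched on a toy dose grid. The grid values, pixel size, and pin position are illustrative assumptions, not measured film data:

```python
import math

def isodose_centroid(dose, pixel_mm, threshold_frac=0.5):
    """Centroid (x, y) in mm of pixels at/above threshold_frac * max dose."""
    dmax = max(max(row) for row in dose)
    pts = [
        (c * pixel_mm, r * pixel_mm)
        for r, row in enumerate(dose)
        for c, d in enumerate(row)
        if d >= threshold_frac * dmax
    ]
    x = sum(p[0] for p in pts) / len(pts)
    y = sum(p[1] for p in pts) / len(pts)
    return x, y

# Toy 4x4 dose grid (arbitrary units); the 50% region is the central 2x2 block.
dose = [
    [0, 10, 10, 0],
    [10, 100, 100, 10],
    [10, 100, 100, 10],
    [0, 10, 10, 0],
]
cx, cy = isodose_centroid(dose, pixel_mm=0.5)
pin = (0.75, 0.75)  # assumed pin-prick position in mm
err = math.hypot(cx - pin[0], cy - pin[1])
print(round(err, 3))  # 0.0 for this perfectly centered toy case
```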

  6. Minimizing End-to-End Interference in I/O Stacks Spanning Shared Multi-Level Buffer Caches

    Science.gov (United States)

    Patrick, Christina M.

    2011-01-01

    This thesis presents an end-to-end interference minimizing uniquely designed high performance I/O stack that spans multi-level shared buffer cache hierarchies accessing shared I/O servers to deliver a seamless high performance I/O stack. In this thesis, I show that I can build a superior I/O stack which minimizes the inter-application interference…

  8. Self-assembled nanogaps via seed-mediated growth of end-to-end linked gold nanorods

    DEFF Research Database (Denmark)

    Jain, Titoo; Westerlund, Axel Rune Fredrik; Johnson, Erik

    2009-01-01

    on the nanoscale. We here present a new way to end-to-end link AuNRs with a single or few linker molecules. Whereas methods reported in the literature so far rely on modification of the AuNRs after the synthesis, we here dimerize gold nanoparticle seeds with a water-soluble dithiol-functionalized polyethylene....... In essence, our methods hence demonstrate the fabrication of a nanostructure with a molecule connected to two nanoelectrodes by bottom-up chemical assembly....

  9. Understanding Effect of Constraint Release Environment on End-to-End Vector Relaxation of Linear Polymer Chains

    KAUST Repository

    Shivokhin, Maksim E.

    2017-05-30

    We propose and verify methods based on the slip-spring (SSp) model [Macromolecules 2005, 38, 14] for predicting the effect of any monodisperse, binary, or ternary environment of topological constraints on the relaxation of the end-to-end vector of a linear probe chain. For this purpose we first validate the ability of the model to consistently predict both the viscoelastic and dielectric response of monodisperse and binary mixtures of type A polymers, based on published experimental data. We also report the synthesis of new binary and ternary polybutadiene systems, the measurement of their linear viscoelastic response, and the prediction of these data by the SSp model. We next clarify the relaxation mechanisms of probe chains in these constraint release (CR) environments by analyzing a set of "toy" SSp models with simplified constraint release rates, by examining fluctuations of the end-to-end vector. In our analysis, the longest relaxation time of the probe chain is determined by a competition between the longest relaxation times of the effective CR motions of the fat and thin tubes and the motion of the chain itself in the thin tube. This picture is tested by the analysis of four model systems designed to separate and estimate every single contribution involved in the relaxation of the probe's end-to-end vector in polydisperse systems. We follow the CR picture of Viovy et al. [Macromolecules 1991, 24, 3587] and refine the effective chain friction in the thin and fat tubes based on Read et al. [J. Rheol. 2012, 56, 823]. The derived analytical equations form a basis for generalizing the proposed methodology to polydisperse mixtures of linear and branched polymers. The consistency between the SSp model and tube model predictions is a strong indicator of the compatibility between these two distinct mesoscopic frameworks.
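The central observable here, relaxation of the probe chain's end-to-end vector, is typically quantified through its autocorrelation ⟨R(t)·R(0)⟩/⟨R(0)·R(0)⟩, whose decay sets the longest relaxation time. The sketch below computes it for a synthetic single-exponential trajectory; the trajectory is an illustrative assumption, not slip-spring model output:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def ee_autocorrelation(trajectory):
    """Normalized autocorrelation of the end-to-end vector against lag 0.

    trajectory: list of end-to-end vectors R(t), one per time step.
    """
    r0 = trajectory[0]
    norm = dot(r0, r0)
    return [dot(r, r0) / norm for r in trajectory]

# Synthetic trajectory: R(t) = R(0) * exp(-t / tau), tau = 5 time units.
r0 = (3.0, 4.0, 0.0)
tau = 5.0
traj = [tuple(x * math.exp(-t / tau) for x in r0) for t in range(11)]
acf = ee_autocorrelation(traj)
print(round(acf[5], 3))  # ~0.368, i.e. exp(-1) at t = tau
```

Fitting the late-time decay of such a curve (or of the SSp model's measured fluctuations) is how a longest relaxation time would be extracted in practice.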

  10. End-to-End Printed-Circuit Board Assembly Design Using Altium Designer and Solid Works Systems

    Directory of Open Access Journals (Sweden)

    A. M. Goncharenko

    2015-01-01

    Full Text Available The main goal of this paper is to investigate methods to accelerate the end-to-end simultaneous development of electronic PC assemblies in MCAD/ECAD systems. As the production rates and quantities of electronic equipment rise, there is a need to speed up the delivery of new products. The article offers an alternative approach to end-to-end simultaneous development in the Altium Designer / Solid Works CAD/CAE systems, which radically shortens the time needed to design new devices and component databases. The first part of the paper analyses the methods and models used to solve the tasks of end-to-end simultaneous development of PC assemblies using the Circuit Works module for Solid Works. It examines the problems of traditional data exchange between Altium Designer and Solid Works arising from the limitations of the IDF 2.0 format, from the handling of 3D models of components, and from the need to support two different databases. The second part gives guidelines and an example of end-to-end simultaneous PC assembly development using the Altium Modeler module for Solid Works aimed at Altium Designer, and presents a brief review of its algorithms. The proposed method neither requires an additional database nor uses an intermediate format such as IDF: the module translates the PCB model directly to Solid Works to generate the assembly model. The Altium Modeler can also update the assembly it has created in Solid Works, which is very useful when components or the PCB itself are modified. This approach is better suited to end-to-end development in terms of speed, ease of simultaneous work in different MCAD/ECAD systems, and elimination of errors arising from the need to maintain two CAD databases of the same functionality. In conclusion, the paper gives suggestions for using these modules for simultaneous development of electronic PC assemblies in Altium Designer and Solid Works.

  11. Effect of swirling flow on platelet concentration distribution in small-caliber artificial grafts and end-to-end anastomoses

    Institute of Scientific and Technical Information of China (English)

    Fan Zhan; Yu-Bo Fan; Xiao-Yan Deng

    2011-01-01

    Platelet concentration near the blood vessel wall is one of the major factors in the adhesion of platelets to the wall. In our previous studies, it was found that swirling flows could suppress platelet adhesion in small-caliber artificial grafts and end-to-end anastomoses. In order to better understand the beneficial effect of the swirling flow, we numerically analyzed the near-wall concentration distribution of platelets in a straight tube and a sudden tubular expansion tube under both swirling flow and normal flow conditions. The numerical models were created based on our previous experimental studies. The simulation results revealed that when compared with the normal flow, the swirling flow could significantly reduce the near-wall concentration of platelets in both the straight tube and the expansion tube. The present numerical study therefore indicates that the reduction in platelet adhesion under swirling flow conditions in small-caliber arterial grafts, or in end-to-end anastomoses as observed in our previous experimental study, was possibly through a mechanism of platelet transport, in which the swirling flow reduced the near-wall concentration of platelets.

  12. End-to-End Joint Antenna Selection Strategy and Distributed Compress and Forward Strategy for Relay Channels

    Directory of Open Access Journals (Sweden)

    Rahul Vaze

    2009-01-01

    Full Text Available Multihop relay channels use multiple relay stages, each with multiple relay nodes, to facilitate communication between a source and destination. Previously, distributed space-time codes were proposed to maximize the achievable diversity-multiplexing tradeoff; however, they fail to achieve all the points of the optimal diversity-multiplexing tradeoff. In the presence of a low-rate feedback link from the destination to each relay stage and the source, this paper proposes an end-to-end antenna selection (EEAS) strategy as an alternative to distributed space-time codes. The EEAS strategy uses a subset of antennas of each relay stage for transmission of the source signal to the destination with amplifying and forwarding at each relay stage. The subsets are chosen such that they maximize the end-to-end mutual information at the destination. The EEAS strategy achieves the corner points of the optimal diversity-multiplexing tradeoff (corresponding to maximum diversity gain and maximum multiplexing gain) and achieves better diversity gain at intermediate values of multiplexing gain, versus the best-known distributed space-time coding strategies. A distributed compress and forward (CF) strategy is also proposed to achieve all points of the optimal diversity-multiplexing tradeoff for a two-hop relay channel with multiple relay nodes.
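
    The subset search at the heart of such antenna selection can be illustrated with a deliberately simplified two-hop sketch: brute force over k-antenna subsets, scoring each by the capacity of its weaker hop. The gain model and function names here are illustrative assumptions, not the paper's system model.

```python
import itertools
import math

def select_antennas(h1, h2, k, snr=10.0):
    """Brute-force antenna selection for a two-hop link.
    h1[i], h2[i]: channel power gains to/from relay antenna i.
    Returns the k-antenna subset maximizing a min-per-hop capacity
    surrogate: the weaker hop limits the end-to-end rate."""
    best_subset, best_rate = None, -1.0
    for subset in itertools.combinations(range(len(h1)), k):
        c1 = math.log2(1.0 + snr * sum(h1[i] for i in subset))
        c2 = math.log2(1.0 + snr * sum(h2[i] for i in subset))
        rate = min(c1, c2)  # bottleneck hop
        if rate > best_rate:
            best_subset, best_rate = subset, rate
    return best_subset, best_rate
```

    For a small number of antennas the exhaustive search is cheap; for example, with gains [1.0, 0.1, 0.9] and [0.9, 0.1, 1.0] the selection avoids the weak middle antenna.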

  13. Weighted-DESYNC and Its Application to End-to-End Throughput Fairness in Wireless Multihop Network

    Directory of Open Access Journals (Sweden)

    Ui-Seong Yu

    2017-01-01

    Full Text Available The end-to-end throughput of a routing path in a wireless multihop network is restricted by a bottleneck node that has the smallest bandwidth among the nodes on the routing path. In this study, we propose a method for resolving the bottleneck-node problem in multihop networks, which is based on the multihop DESYNC (MH-DESYNC) algorithm, a bioinspired resource allocation method developed for use in multihop environments that enables fair resource allocation among nearby (up to two-hop) neighbors. Based on MH-DESYNC, we newly propose weighted-DESYNC (W-DESYNC) as a tool to artificially control the amount of resource allocated to a specific user and thus to achieve throughput fairness over a routing path. The proposed W-DESYNC employs the weight factor of a link to determine the amount of bandwidth allocated to a node. By letting the weight factor be the link quality of a routing path and making it the same across the routing path via the Cucker-Smale flocking model, we can obtain throughput fairness over the routing path. The simulation results show that the proposed algorithm achieves throughput fairness over a routing path and can increase total end-to-end throughput in wireless multihop networks.
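
    The weight-factor idea can be sketched minimally: each node's share of medium time is its weight divided by the sum of contending weights, and a route's end-to-end throughput is the bottleneck of per-hop effective rates. This is a hypothetical simplification for illustration, not the W-DESYNC protocol itself.

```python
def weighted_shares(weights):
    """Split one frame of medium time proportionally to node weights,
    the way a weighted firing pattern would space desync phases."""
    total = float(sum(weights))
    return [w / total for w in weights]

def path_throughput(link_rates, shares):
    """End-to-end throughput of a route = bottleneck of per-hop
    effective rates (link rate x time share of the forwarding node)."""
    return min(r * s for r, s in zip(link_rates, shares))
```

    With equal weights every node gets an equal share, which reproduces plain DESYNC fairness; skewing weights toward the low-rate hop raises the route's bottleneck rate.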

  14. Adjusting Sink Location to Reduce End-to-End Delay in Low-Duty-Cycle Wireless Sensor Networks

    Institute of Scientific and Technical Information of China (English)

    Yu-Yuan Lin; Kuo-Feng Ssu; Hau-Yu Chiang

    2015-01-01

    Low-duty-cycle mechanisms can significantly reduce energy consumption in wireless sensor networks (WSNs). Sensors stay dormant most of the time to save energy and wake up based on their needs. However, such a technique, while prolonging the network lifetime, poses serious challenges for reducing the end-to-end (E2E) delay within the network. In this paper, the centralized cluster-based location finding (CCLF) algorithm is proposed to reduce the high latency in low-duty-cycle WSNs by finding a suitable position for the sink. The algorithm is mainly composed of three steps: a) cluster construction, b) fast look-up table (FLU-table) construction, and c) sink location decision. The simulation results show that the performance of the CCLF algorithm comes very close to that of the optimal algorithm. Moreover, the CCLF algorithm requires less operation time than the optimal algorithm.
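
    As a rough stand-in for the sink-placement step (not the CCLF algorithm itself, which uses clusters and a look-up table), one can brute-force candidate sink positions against a worst-case distance proxy for the E2E delay:

```python
import math

def best_sink_location(nodes, candidates):
    """Pick the candidate sink position minimizing the worst-case
    node-to-sink Euclidean distance, a crude proxy for E2E delay in
    a low-duty-cycle WSN. Brute force over the candidate set."""
    def worst_case(c):
        return max(math.dist(c, n) for n in nodes)
    return min(candidates, key=worst_case)
```

    For two nodes on a line, the midpoint candidate wins; in a real deployment the cost function would be the duty-cycle-aware delay rather than raw distance.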

  15. A New Fault Detection Method Using End-to-End Data and Sequential Testing for Computer Networks

    Directory of Open Access Journals (Sweden)

    Mohammad Sadeq Garshasbi

    2013-12-01

    Full Text Available Fault localization, a central part of network fault management, is the process of deducing the exact source of a failure from a set of observed failure indications. In a network, end systems and hosts communicate through the routers and links connecting them. When a link or a router encounters a fault, the information sent through these components is damaged. Hence, faulty components in a network need to be detected and repaired to sustain the health of the network. In this paper we introduce an end-to-end method that detects and repairs faulty components in the network. The proposed method is a heuristic algorithm that uses embedded information retrieved from data disseminated over the network to detect and repair faulty components. Simulation results show that our heuristic scheme requires testing only a very small set of network components to detect and repair all faults in the network.
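
    The core inference step behind end-to-end fault detection of this kind can be sketched with classic boolean reasoning over probe outcomes: components carried by any successful probe are exonerated, and whatever remains of the failed paths is suspect. This is a minimal illustration of the principle, not the paper's heuristic.

```python
def localize_faults(paths, failed_paths):
    """Boolean fault localization from end-to-end probe outcomes.
    paths: dict path_name -> set of components the path traverses.
    failed_paths: set of path names whose probes failed.
    Returns the set of components that remain suspect."""
    exonerated = set()
    for name, components in paths.items():
        if name not in failed_paths:  # probe succeeded
            exonerated |= components
    suspects = set()
    for name in failed_paths:
        suspects |= paths[name]
    return suspects - exonerated
```

    With paths A = {r1, r2}, B = {r2, r3}, C = {r1, r4} and only B failing, r2 is cleared by the working path A, leaving r3 as the sole suspect.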

  16. Cross-Layer Design for End-to-End Throughput Maximization and Fairness in MIMO Multihop Wireless Networks

    Directory of Open Access Journals (Sweden)

    Jain-Shing Liu

    2010-01-01

    Full Text Available MIMO links can significantly improve network throughput by supporting multiple concurrent data streams between a pair of nodes and suppressing wireless interference. In this paper, we study joint rate control, routing, and scheduling in MIMO-based multihop wireless networks, which are traditionally known as transport layer, network layer, and MAC layer issues, respectively. Our aim is to find a rate allocation along with a flow allocation and a transmission schedule for a set of end-to-end communication sessions so as to maximize the network throughput and also to achieve proportional or weighted fairness among these sessions. To this end, we develop Transmission Mode Generating Algorithms (TMGAs), and Linear Programming (LP)- and Convex Programming (CP)-based optimization schemes for MIMO networks. The performances of the proposed schemes are verified by simulation experiments, and the results show that the different schemes have different performance benefits when achieving a tradeoff between throughput and fairness.
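
    The fairness side of this throughput-fairness tradeoff can be illustrated independently of the LP/CP formulations with the classic progressive-filling algorithm for max-min fair rate allocation on a shared bottleneck. This is an illustrative stand-in, not the authors' optimization scheme.

```python
def max_min_fair(demands, capacity):
    """Progressive-filling max-min fair allocation of one shared
    resource: satisfy the smallest demands first, then split the
    remaining capacity equally among the still-unsatisfied flows."""
    alloc = [0.0] * len(demands)
    order = sorted(range(len(demands)), key=lambda i: demands[i])
    remaining, left = float(capacity), len(demands)
    for i in order:
        fair_share = remaining / left
        alloc[i] = min(demands[i], fair_share)
        remaining -= alloc[i]
        left -= 1
    return alloc
```

    For demands [2, 2.6, 4, 5] on a capacity-10 link, the small demands are met in full and the two large flows split the remainder equally at 2.7 each.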

  17. End-to-end Structural Restriction of α-Synuclein and Its Influence on Amyloid Fibril Formation

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Chul Suk; Park, Jae Hyung; Choe, Young Jun; Paik, Seung R. [Seoul National University, Seoul (Korea, Republic of)

    2014-09-15

    Relationship between molecular freedom of amyloidogenic protein and its self-assembly into amyloid fibrils has been evaluated with α-synuclein, an intrinsically unfolded protein related to Parkinson's disease, by restricting its structural plasticity through an end-to-end disulfide bond formation between two newly introduced cysteine residues on the N- and C-termini. Although the resulting circular form of α-synuclein exhibited an impaired fibrillation propensity, the restriction did not completely block the protein's interactive core since co-incubation with wild-type α-synuclein dramatically facilitated the fibrillation by producing distinctive forms of amyloid fibrils. The suppressed fibrillation propensity was instantly restored as the structural restriction was unleashed with β-mercaptoethanol. Conformational flexibility of the accreting amyloidogenic protein to pre-existing seeds has been demonstrated to be critical for fibrillar extension process by exerting structural adjustment to a complementary structure for the assembly.

  18. The ZEPLIN-III dark matter detector: performance study using an end-to-end simulation tool

    CERN Document Server

    Araújo, H M; Alner, G J; Bewick, A; Bungau, C; Camanzi, B; Carson, M J; Chagani, H; Chepel, V Yu; Cunha, J P; Davidge, D; Davies, J C; Daw, E; Dawson, J; Durkin, T; Edwards, B; Gamble, T; Ghag, C; Hollingworth, R; Howard, A S; Jones, W G; Joshi, M; Kirkpatrick, J; Kovalenko, A; Kudryavtsev, V A; Lawson, T; Lebedenko, V N; Lewin, J D; Lightfoot, P; Lindote, A; Liubarsky, I; Lopes, M I; Lüscher, R; Majewski, P; Mavrokoridis, K; McMillan, J; Morgan, B; Muna, D; Murphy, A S; Neves, F; Nicklin, G; Paling, S; Preece, R; Quenby, J J; Robinson, M; Silva, C; Smith, N J T; Smith, P F; Solovov, V N; Spooner, N J C; Stekhanov, V; Sumner, T J; Thorne, C; Tovey, Daniel R; Tziaferi, E; Walker, R J

    2006-01-01

    We present results from a GEANT4-based Monte Carlo tool for end-to-end simulations of the ZEPLIN-III dark matter experiment. ZEPLIN-III is a two-phase detector which measures both the scintillation light and the ionisation charge generated in liquid xenon by interacting particles and radiation. The software models the instrument response to radioactive backgrounds and calibration sources, including the generation, ray-tracing and detection of the primary and secondary scintillations in liquid and gaseous xenon, and subsequent processing by data acquisition electronics. A flexible user interface allows easy modification of detector parameters at run time. Realistic datasets can be produced to help with data analysis, an example of which is the position reconstruction algorithm developed from simulated data. We present a range of simulation results confirming the original design sensitivity of a few times $10^{-8}$ pb to the WIMP-nucleon cross-section.

  19. HITSZ_CDR: an end-to-end chemical and disease relation extraction system for BioCreative V

    Science.gov (United States)

    Li, Haodi; Tang, Buzhou; Chen, Qingcai; Chen, Kai; Wang, Xiaolong; Wang, Baohua; Wang, Zhe

    2016-01-01

    In this article, an end-to-end system was proposed for the challenge task of disease named entity recognition (DNER) and chemical-induced disease (CID) relation extraction in BioCreative V, where DNER includes disease mention recognition (DMR) and normalization (DN). Evaluation on the challenge corpus showed that our system achieved the highest F1-scores: 86.93% on DMR, 84.11% on DN, and 43.04% on CID relation extraction. The F1-score on DMR is higher than our previous one reported by the challenge organizers (86.76%), which was the highest F1-score of the challenge. Database URL: http://database.oxfordjournals.org/content/2016/baw077 PMID:27270713

  20. SU-E-T-282: Dose Measurements with An End-To-End Audit Phantom for Stereotactic Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Jones, R; Artschan, R [Calvary Mater Newcastle, Newcastle, NSW (Australia); Thwaites, D [University of Sydney, Sydney, NSW (Australia); Lehmann, J [Calvary Mater Newcastle, Newcastle, NSW (Australia); University of Sydney, Sydney, NSW (Australia)

    2015-06-15

    Purpose: Report on dose measurements as part of an end-to-end test for stereotactic radiotherapy, using a new audit tool, which allows audits to be performed efficiently either by an onsite team or as a postal audit. Methods: Film measurements have been performed with a new Stereotactic Cube Phantom. The phantom has been designed to perform Winston-Lutz type position verification measurements and dose measurements in one setup. It comprises a plastic cube with a high density ball in its centre (used for MV imaging with film or EPID) and low density markers in the periphery (used for Cone Beam Computed Tomography, CBCT imaging). It also features strategically placed gold markers near the posterior and right surfaces, which can be used to calculate phantom rotations on MV images. Slit-like openings allow insertion of film or other detectors. The phantom was scanned and small field treatment plans were created. The fields do not traverse any inhomogeneities of the phantom on their paths to the measurement location. The phantom was set up at the delivery system using CBCT imaging. The calculated treatment fields were delivered, each with a piece of radiochromic film (EBT3) placed in the anterior film holder of the phantom. MU had been selected in planning to achieve similar exposures on all films. Calibration films were exposed in solid water for dose levels around the expected doses. Films were scanned and analysed following established procedures. Results: Setup of the cube showed excellent suitability for CBCT 3D alignment. MV imaging with EPID allowed for clear identification of all markers. Film based dose measurements showed good agreement for MLC created fields down to 0.5 mm × 0.5 mm. Conclusion: An end-to-end audit phantom for stereotactic radiotherapy has been developed and tested.

  1. Evaluation of Techniques to Detect Significant Network Performance Problems using End-to-End Active Network Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Cottrell, R.Les; Logg, Connie; Chhaparia, Mahesh; /SLAC; Grigoriev, Maxim; /Fermilab; Haro, Felipe; /Chile U., Catolica; Nazir, Fawad; /NUST, Rawalpindi; Sandford, Mark

    2006-01-25

    End-to-end fault and performance problem detection in wide area production networks is becoming increasingly hard as the complexity of the paths, the diversity of the performance, and dependency on the network increase. Several monitoring infrastructures have been built to monitor different network metrics and collect monitoring information from thousands of hosts around the globe. Typically there are hundreds to thousands of time-series plots of network metrics which need to be looked at to identify network performance problems or anomalous variations in the traffic. Furthermore, most commercial products rely on a comparison with user-configured static thresholds and often require access to SNMP-MIB information, to which a typical end-user does not usually have access. In this paper we propose new techniques to detect network performance problems proactively in close to real time, without relying on static thresholds or SNMP-MIB information. We describe and compare several different algorithms that we have implemented to detect persistent network problems using anomalous-variation analysis of real end-to-end Internet performance measurements. We also provide methods and/or guidance for how to set the user-settable parameters. The measurements are based on active probes running on 40 production network paths with bottlenecks varying from 0.5 Mbit/s to 1000 Mbit/s. For well-behaved data (no missed measurements and no very large outliers) with small seasonal changes, most algorithms identify similar events. We compare the algorithms' robustness with respect to false positives and missed events, especially when there are large seasonal effects in the data. Our proposed techniques cover a wide variety of network paths and traffic patterns. We also discuss the applicability of the algorithms in terms of their intuitiveness, their speed of execution as implemented, and areas of applicability. Our encouraging results compare and evaluate the accuracy of our
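
    One simple member of the family of threshold-free detectors argued for above is an exponentially weighted moving average with an adaptive deviation band: the threshold tracks the data itself, so no static limit or SNMP access is needed. The parameters below are illustrative assumptions, not the values studied in the paper.

```python
def ewma_anomalies(series, alpha=0.2, k=4.0, warmup=10):
    """Flag indices whose value deviates more than k adaptive
    deviations from an exponentially weighted moving mean.
    Both the mean and the deviation estimate track the data,
    replacing user-configured static thresholds."""
    mean, dev, flags = series[0], 0.0, []
    for i, x in enumerate(series):
        err = x - mean
        if i >= warmup and dev > 0 and abs(err) > k * dev:
            flags.append(i)
        mean += alpha * err
        dev = (1 - alpha) * dev + alpha * abs(err)
    return flags
```

    On a steady round-trip-time series with one large spike, only the spike is flagged; after the spike the deviation band widens and then decays, which suppresses alarm storms.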

  2. End-to-end performance of cooperative relaying in spectrum-sharing systems with quality of service requirements

    KAUST Repository

    Asghari, Vahid Reza

    2011-07-01

    We propose adopting a cooperative relaying technique in spectrum-sharing cognitive radio (CR) systems to more effectively and efficiently utilize available transmission resources, such as power, rate, and bandwidth, while adhering to the quality of service (QoS) requirements of the licensed (primary) users of the shared spectrum band. In particular, we first consider that the cognitive (secondary) user's communication is assisted by an intermediate relay that implements the decode-and-forward (DF) technique onto the secondary user's relayed signal to help with communication between the corresponding source and the destination nodes. In this context, we obtain first-order statistics pertaining to the first- and second-hop transmission channels, and then, we investigate the end-to-end performance of the proposed spectrum-sharing cooperative relaying system under resource constraints defined to assure that the primary QoS is unaffected. Specifically, we investigate the overall average bit error rate (BER), ergodic capacity, and outage probability of the secondary's communication subject to appropriate constraints on the interference power at the primary receivers. We then consider a general scenario where a cluster of relays is available between the secondary source and destination nodes. In this case, making use of the partial relay selection method, we generalize our results for the single-relay scheme and obtain the end-to-end performance of the cooperative spectrum-sharing system with a cluster of L available relays. Finally, we examine our theoretical results through simulations and comparisons, illustrating the overall performance of the proposed spectrum-sharing cooperative system and quantifying its advantages for different operating scenarios and conditions. © 2011 IEEE.
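
    For the basic decode-and-forward building block, end-to-end outage over Rayleigh fading has a well-known closed form that a quick Monte Carlo run can be checked against. This sketch covers only a plain two-hop DF link under exponential SNRs; it does not model the spectrum-sharing interference constraints studied in the paper.

```python
import math
import random

def df_outage_exact(avg_snr_1, avg_snr_2, rate_bps_hz):
    """Closed form for a two-hop DF link over Rayleigh fading:
    P_out = 1 - exp(-t/g1) * exp(-t/g2), with t = 2^R - 1."""
    t = 2.0 ** rate_bps_hz - 1.0
    return 1.0 - math.exp(-t / avg_snr_1) * math.exp(-t / avg_snr_2)

def df_outage_mc(avg_snr_1, avg_snr_2, rate_bps_hz, trials=100_000, seed=1):
    """Monte Carlo estimate: the link is in outage if either hop's
    instantaneous capacity log2(1 + snr) falls below the target rate,
    i.e. if the weaker hop's SNR falls below the threshold."""
    rng = random.Random(seed)
    threshold = 2.0 ** rate_bps_hz - 1.0
    outages = 0
    for _ in range(trials):
        snr1 = rng.expovariate(1.0 / avg_snr_1)  # Rayleigh fading => exp SNR
        snr2 = rng.expovariate(1.0 / avg_snr_2)
        if min(snr1, snr2) < threshold:
            outages += 1
    return outages / trials
```

    At 10 dB-scale average SNRs the two estimates agree to within Monte Carlo noise, a useful sanity check before layering on interference constraints.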

  3. End-to-end models for marine ecosystems: Are we on the precipice of a significant advance or just putting lipstick on a pig?

    Directory of Open Access Journals (Sweden)

    Kenneth A. Rose

    2012-02-01

    Full Text Available There has been a rapid rise in the development of end-to-end models for marine ecosystems over the past decade. Some reasons for this rise include the need for predicting effects of climate change on biota and dissatisfaction with existing models. While the benefits of a well-implemented end-to-end model are straightforward, there are many challenges. In the short term, my view is that the major role of end-to-end models is to push the modelling community forward, and to identify critical data so that these data can be collected now and thus be available for the next generation of end-to-end models. I think we should emulate physicists and build theoretically-oriented models first, and then collect the data. In the long term, end-to-end models will increase their skill, data collection will catch up, and end-to-end models will move towards site-specific applications with forecasting and management capabilities. One pathway into the future is individual efforts, over-promise, and repackaging of poorly performing component submodels (“lipstick on a pig”). The other pathway is a community-based collaborative effort, with appropriate caution and thoughtfulness, so that the needed improvements are achieved (“significant advance”). The promise of end-to-end modelling is great. We should act now to avoid missing a great opportunity.

  4. End-to-End Information System design at the NASA Jet Propulsion Laboratory. [data transmission between user and space-based sensor

    Science.gov (United States)

    Hooke, A. J.

    1978-01-01

    In recognition of a pressing need of the 1980s to optimize the two-way flow of information between a ground-based user and a remote-space-based sensor, an end-to-end approach to the design of information systems has been adopted at the JPL. This paper reviews End-to-End Information System (EEIS) activity at the JPL, with attention given to the scope of the EEIS transfer function, and functional and physical elements of the EEIS. The relationship between the EEIS and the NASA End-to-End Data System program is discussed.

  5. A novel end-to-end classifier using domain transferred deep convolutional neural networks for biomedical images.

    Science.gov (United States)

    Pang, Shuchao; Yu, Zhezhou; Orgun, Mehmet A

    2017-03-01

    Highly accurate classification of biomedical images is an essential task in the clinical diagnosis of numerous medical diseases identified from those images. Traditional image classification methods that combine hand-crafted image feature descriptors with various classifiers are not able to effectively improve the accuracy rate and meet the high requirements of classification of biomedical images. The same also holds true for artificial neural network models directly trained with limited biomedical images used as training data, or directly used as a black box to extract deep features based on another distant dataset. In this study, we propose a highly reliable and accurate end-to-end classifier for all kinds of biomedical images via deep learning and transfer learning. We first apply a domain-transferred deep convolutional neural network to build a deep model, and then develop an overall deep learning architecture based on the raw pixels of the original biomedical images using supervised training. In our model, we do not need to manually design the feature space, seek an effective feature-vector classifier, or segment specific detection objects and image patches, which are the main technological difficulties in traditional image classification methods. Moreover, we do not need to be concerned with whether there are large training sets of annotated biomedical images, affordable parallel computing resources featuring GPUs, or long waits to train a perfect deep model, which are the main problems in training deep neural networks for biomedical image classification as observed in recent works. With the utilization of a simple data augmentation method and fast convergence speed, our algorithm can achieve the best accuracy rate and outstanding classification ability for biomedical images. We have evaluated our classifier on several well-known public biomedical datasets and compared it with several state-of-the-art approaches.
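
    The underlying transfer-learning recipe (freeze the transferred feature extractor and train only a light classifier head on the target images) can be sketched with a toy stand-in: a frozen random projection plays the role of the pretrained CNN, logistic regression plays the head, and two Gaussian blobs stand in for a labeled biomedical dataset. Everything here is synthetic and illustrative, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "pretrained" feature extractor stand-in: random projection + ReLU.
W_frozen = rng.normal(size=(2, 64))

def features(x):
    """Domain-transferred features; frozen, never updated."""
    return np.maximum(x @ W_frozen, 0.0)

# Toy stand-in for a labeled dataset: two well-separated Gaussian blobs.
x0 = rng.normal(loc=-2.0, size=(100, 2))
x1 = rng.normal(loc=2.0, size=(100, 2))
X = np.vstack([x0, x1])
y = np.array([0] * 100 + [1] * 100)

# Train only the classifier head: logistic regression by gradient descent.
phi = features(X)
w = np.zeros(phi.shape[1])
b = 0.0
for _ in range(300):
    z = np.clip(phi @ w + b, -30.0, 30.0)  # clip to keep exp() stable
    p = 1.0 / (1.0 + np.exp(-z))
    grad = p - y
    w -= 0.1 * phi.T @ grad / len(y)
    b -= 0.1 * grad.mean()

acc = (((phi @ w + b) > 0).astype(int) == y).mean()
```

    Because only the small head is trained, the procedure needs neither a large annotated set nor long training runs, which is the practical appeal of the transfer-learning approach described above.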

  6. Profiling wind and greenhouse gases by infrared-laser occultation: results from end-to-end simulations in windy air

    Directory of Open Access Journals (Sweden)

    A. Plach

    2015-07-01

    Full Text Available The new mission concept of microwave and infrared-laser occultation between low-Earth-orbit satellites (LMIO) is designed to provide accurate and long-term stable profiles of atmospheric thermodynamic variables, greenhouse gases (GHGs), and line-of-sight (l.o.s.) wind speed with focus on the upper troposphere and lower stratosphere (UTLS). While the unique quality of GHG retrievals enabled by LMIO over the UTLS has been recently demonstrated based on end-to-end simulations, the promise of l.o.s. wind retrieval, and of joint GHG and wind retrieval, has not yet been analyzed in any realistic simulation setting. Here we use a newly developed l.o.s. wind retrieval algorithm, which we embedded in an end-to-end simulation framework that also includes the retrieval of thermodynamic variables and GHGs, and analyze the performance of both stand-alone wind retrieval and joint wind and GHG retrieval. The wind algorithm utilizes LMIO laser signals placed on the inflection points at the wings of the highly symmetric C18OO absorption line near 4767 cm−1 and exploits transmission differences from a wind-induced Doppler shift. Based on realistic example cases for a diversity of atmospheric conditions, ranging from tropical to high-latitude winter, we find that the retrieved l.o.s. wind profiles are of high quality over the lower stratosphere under all conditions, i.e., unbiased and accurate to within about 2 m s−1 over about 15 to 35 km. The wind accuracy degrades into the upper troposphere due to the decreasing signal-to-noise ratio of the wind-induced differential transmission signals. The GHG retrieval in windy air is not vulnerable to wind speed uncertainties up to about 10 m s−1 but is found to benefit in the case of higher speeds from the integrated wind retrieval that enables correction of the wind-induced Doppler shift of GHG signals. Overall both the l.o.s. wind and GHG retrieval results are strongly encouraging towards further development and
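
    The size of the effect being exploited follows from the first-order Doppler formula delta_nu = nu0 * v / c. The sketch below assumes only that formula and the 4767 cm−1 line center quoted above; the retrieval itself works from differential transmission, not from this direct inversion.

```python
C_M_PER_S = 299_792_458.0  # speed of light

def doppler_shift_wavenumber(nu0_cm1, los_wind_m_per_s):
    """First-order Doppler shift (in cm^-1) of an absorption line at
    wavenumber nu0 (cm^-1) induced by a line-of-sight wind (m/s)."""
    return nu0_cm1 * los_wind_m_per_s / C_M_PER_S

def wind_from_shift(nu0_cm1, delta_nu_cm1):
    """Invert the measured shift back to a l.o.s. wind speed (m/s)."""
    return delta_nu_cm1 * C_M_PER_S / nu0_cm1
```

    A 10 m/s l.o.s. wind shifts the 4767 cm−1 line by only about 1.6e-4 cm−1, which is why the signals are placed on the steep inflection points of the line wings, where a tiny shift produces a measurable transmission difference.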

  7. Gulf of California species and catch spatial distributions and historical time series - Developing end-to-end models of the Gulf of California

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the northern Gulf of California, linking oceanography, biogeochemistry, food web...

  8. West Coast fish, mammal, bird life history and abunance parameters - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  9. West Coast fish, mammal, and bird species diets - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  10. An anthropomorphic multimodality (CT/MRI) head phantom prototype for end-to-end tests in ion radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Gallas, Raya R.; Huenemohr, Nora; Runz, Armin; Niebuhr, Nina I.; Greilich, Steffen [German Cancer Research Center (DKFZ), Heidelberg (Germany). Div. of Medical Physics in Radiation Oncology; National Center for Radiation Research in Oncology, Heidelberg (Germany). Heidelberg Institute of Radiation Oncology (HIRO); Jaekel, Oliver [German Cancer Research Center (DKFZ), Heidelberg (Germany). Div. of Medical Physics in Radiation Oncology; National Center for Radiation Research in Oncology, Heidelberg (Germany). Heidelberg Institute of Radiation Oncology (HIRO); Heidelberg University Hospital (Germany). Dept. of Radiation Oncology; Heidelberg Ion-Beam Therapy Center (HIT), Heidelberg (Germany)

    2015-07-01

    With the increasing complexity of external beam therapy, "end-to-end" tests are intended to cover every step from therapy planning through to follow-up in order to fulfill the higher demands on quality assurance. As magnetic resonance imaging (MRI) has become an important part of the treatment process, established phantoms such as the Alderson head cannot fully be used for those tests and novel phantoms have to be developed. Here, we present a feasibility study of a customizable multimodality head phantom. It is initially intended for ion radiotherapy but may also be used in photon therapy. As the basis for the anthropomorphic head shape we have used a set of patient computed tomography (CT) images. The phantom container, consisting of epoxy resin, was produced by using a 3D printer. It includes a nasal air cavity, a cranial bone surrogate (based on dipotassium phosphate), a brain surrogate (based on agarose gel), and a surrogate for cerebrospinal fluid (based on distilled water). Furthermore, a volume filled with normoxic dosimetric gel mimicked a tumor. The entire workflow of a proton therapy could be successfully applied to the phantom. CT measurements revealed CT numbers agreeing with reference values for all surrogates in the range from 2 HU to 978 HU (120 kV). MRI showed the desired contrasts between the different phantom materials, especially in T2-weighted images (except for the bone surrogate). T2-weighted readout of the polymerization gel dosimeter allowed approximate range verification.

  11. Minimizing Barriers in Learning for On-Call Radiology Residents-End-to-End Web-Based Resident Feedback System.

    Science.gov (United States)

    Choi, Hailey H; Clark, Jennifer; Jay, Ann K; Filice, Ross W

    2017-08-24

    Feedback is an essential part of medical training, where trainees are provided with information regarding their performance and further directions for improvement. In diagnostic radiology, feedback entails a detailed review of the differences between the residents' preliminary interpretation and the attendings' final interpretation of imaging studies. While the on-call experience of independently interpreting complex cases is important to resident education, the more traditional synchronous "read-out" or joint review is impossible due to multiple constraints. Without an efficient method to compare reports, grade discrepancies, convey salient teaching points, and view images, valuable lessons in image interpretation and report construction are lost. We developed a streamlined web-based system, including report comparison and image viewing, to minimize barriers in asynchronous communication between attending radiologists and on-call residents. Our system provides real-time, end-to-end delivery of case-specific and user-specific feedback in a streamlined, easy-to-view format. We assessed quality improvement subjectively through surveys and objectively through participation metrics. Our web-based feedback system improved user satisfaction for both attending and resident radiologists, and increased attending participation, particularly with regard to cases where substantive discrepancies were identified.

  12. WARP (workflow for automated and rapid production): a framework for end-to-end automated digital print workflows

    Science.gov (United States)

    Joshi, Parag

    2006-02-01

    The publishing industry is experiencing a major paradigm shift with the advent of digital publishing technologies. A large number of components in the publishing and print production workflow are transformed in this shift. However, the process as a whole requires a great deal of human intervention for decision making and for resolving exceptions during job execution. Furthermore, a majority of the best-of-breed applications for publishing and print production are intrinsically designed and developed to be driven by humans. Thus, the human-intensive nature of the current prepress process accounts for a very significant amount of the overhead costs in fulfillment of jobs on press. It is a challenge to automate the functionality of applications built with the model of human-driven execution. Another challenge is to orchestrate the various components in the publishing and print production pipeline so that they work in a seamless manner, enabling the system to perform automatic detection of potential failures and take corrective actions in a proactive manner. Thus, there is a great need for a coherent and unifying workflow architecture that streamlines the process and automates it as a whole in order to create an end-to-end digital automated print production workflow that does not involve any human intervention. This paper describes an architecture and building blocks that lay the foundation for a plurality of automated print production workflows.

  13. Acute traumatic subclavian artery thrombosis and its successful repair via resection and end-to-end anastomosis

    Institute of Scientific and Technical Information of China (English)

    Saulat H Fatimi; Amna Anees; Marium Muzaffar; Hashim M Hanif

    2010-01-01

    Subclavian artery thrombosis is a rare complication of clavicle fractures. We report a 20-year-old man who was admitted to the emergency room after a road traffic accident. He was a pedestrian who was initially hit by a bus and, after he fell down on the road, was run over by a car. On evaluation, he was found to have multiple facial and rib fractures, a distal right humerus fracture, and a right clavicle fracture. Significantly, the right radial pulse was absent. After further evaluation, including Doppler studies and an angiography which revealed complete obstruction of the right subclavian artery just distal to its first portion, the patient was urgently taken to the operating room. A midclavicular fracture was adjacent to the injured vessel. We established proximal and distal control and removed the damaged part. After mobilizing the subclavian artery, an end-to-end anastomosis was performed. Then open reduction and internal fixation of the right distal humerus was performed. The rest of the postoperative course was unremarkable. To prevent complications of subclavian artery thrombosis, different treatment modalities can be used, including anticoagulation therapy, angioplasty, stenting and bypass procedures.

  14. An anthropomorphic multimodality (CT/MRI) phantom prototype for end-to-end tests in radiation therapy

    CERN Document Server

    Gallas, Raya R; Runz, Armin; Niebuhr, Nina I; Jäkel, Oliver; Greilich, Steffen

    2014-01-01

    With the increasing complexity of external beam therapy, so-called "end-to-end" tests are intended to cover all steps from therapy planning to follow-up to fulfill the high demands on quality assurance. As magnetic resonance imaging (MRI) gains growing importance in the treatment process and established phantoms (such as the Alderson head) cannot be used for those tests, novel multimodality phantoms have to be developed. Here, we present a feasibility study for such a customizable multimodality head phantom. We used a set of patient CT images as the basis for the anthropomorphic head shape. The recipient - consisting of an epoxy resin - was produced using rapid prototyping (3D printing). The phantom recipient includes a nasal air cavity, two soft tissue volumes, and cranial bone. Additionally, a spherical tumor volume was positioned in the center. The volumes were filled with dipotassium phosphate-based cranial bone surrogate, agarose gel, and distilled water. The tumor volume was filled with normoxic dosimetr...

  15. A real-time 3D end-to-end augmented reality system (and its representation transformations)

    Science.gov (United States)

    Tytgat, Donny; Aerts, Maarten; De Busser, Jeroen; Lievens, Sammy; Rondao Alface, Patrice; Macq, Jean-Francois

    2016-09-01

    The new generation of HMDs coming to the market is expected to enable many new applications that allow free viewpoint experiences with captured video objects. Current applications usually rely on 3D content that is manually created or captured in an offline manner. In contrast, this paper focuses on augmented reality applications that use live captured 3D objects while maintaining free viewpoint interaction. We present a system that allows live dynamic 3D objects (e.g. a person who is talking) to be captured in real-time. Real-time performance is achieved by traversing a number of representation formats and exploiting their specific benefits. For instance, depth images are maintained for fast neighborhood retrieval and occlusion determination, while implicit surfaces are used to facilitate multi-source aggregation for both geometry and texture. The result is a 3D reconstruction system that outputs multi-textured triangle meshes at real-time rates. An end-to-end system is presented that captures and reconstructs live 3D data and allows for this data to be used on a networked (AR) device. For allocating the different functional blocks onto the available physical devices, a number of alternatives are proposed considering the available computational power and bandwidth for each of the components. As we will show, the representation format can play an important role in this functional allocation and allows for a flexible system that can support a highly heterogeneous infrastructure.

  16. Towards a cross-platform software framework to support end-to-end hydrometeorological sensor network deployment

    Science.gov (United States)

    Celicourt, P.; Sam, R.; Piasecki, M.

    2016-12-01

    Global phenomena such as climate change and large-scale environmental degradation require the collection of accurate environmental data at detailed spatial and temporal scales, from which knowledge and actionable insights can be derived using data science methods. Despite significant advances in sensor network technologies, sensor and sensor network deployment remains a labor-intensive, time-consuming, cumbersome and expensive task. These factors explain why environmental data collection remains a challenge, especially in developing countries where technical infrastructure, expertise and pecuniary resources are scarce, and why dense, long-term environmental data collection has historically been quite difficult. Moreover, hydrometeorological data collection efforts usually overlook the critically important inclusion of a standards-based system for storing, managing, organizing, indexing, documenting and sharing sensor data. We are developing a cross-platform software framework using the Python programming language that will allow us to develop a low-cost end-to-end (from sensor to publication) system for hydrometeorological conditions monitoring. The software framework contains provisions for describing sensors, sensor platforms, calibration and network protocols, as well as for sensor programming, data storage, data publication and visualization and, importantly, data retrieval in a desired unit system. It is being tested on the Raspberry Pi microcomputer as the end node and a laptop PC as the base station in a wireless setting.
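    The "data retrieval in a desired unit system" idea mentioned in the abstract can be sketched in Python, the framework's stated language. This is an illustration only, not the authors' code: the class, method names, and conversion table below are invented for demonstration.

```python
# Minimal sketch: store raw sensor readings with their units and
# convert on retrieval to a requested unit system.

class SensorStore:
    # unit converters: (from_unit, to_unit) -> callable (assumed pairs)
    CONVERTERS = {
        ("degC", "degF"): lambda v: v * 9.0 / 5.0 + 32.0,
        ("mm", "in"): lambda v: v / 25.4,
    }

    def __init__(self):
        self._readings = []  # list of (sensor_id, value, unit)

    def record(self, sensor_id, value, unit):
        self._readings.append((sensor_id, value, unit))

    def retrieve(self, sensor_id, unit):
        """Return all readings for sensor_id, converted to `unit`."""
        out = []
        for sid, value, u in self._readings:
            if sid != sensor_id:
                continue
            if u != unit:
                value = self.CONVERTERS[(u, unit)](value)
            out.append(value)
        return out

store = SensorStore()
store.record("rain_gauge_1", 12.7, "mm")
print(store.retrieve("rain_gauge_1", "in"))  # [0.5]
```

    In a full framework, the converter table would be generated from a units ontology rather than hand-written, but the retrieval path would look similar.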

  17. Secondary link adaptation in cognitive radio networks: End-to-end performance with cross-layer design

    KAUST Repository

    Ma, Hao

    2012-04-01

    Under spectrum-sharing constraints, we consider a secondary link exploiting cross-layer combining of adaptive modulation and coding (AMC) at the physical layer with truncated automatic repeat request (T-ARQ) at the data link layer in cognitive radio networks. Both basic AMC and aggressive AMC are adopted to optimize the overall average spectral efficiency, subject to the interference constraints imposed by the primary user of the shared spectrum band and a target packet loss rate. We obtain the optimal boundary points for choosing the AMC transmission modes in closed form, taking into account the channel state information from the secondary transmitter to both the primary receiver and the secondary receiver. Moreover, numerical results substantiate that, at no cost in transmitter/receiver design or end-to-end delay, the scheme with aggressive AMC outperforms that with basic AMC. The main reason is that, with aggressive AMC, the different transmission modes used in the initial packet transmission and the subsequent retransmissions match the time-varying channel conditions better than the basic pattern does. © 2012 IEEE.
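    The AMC mode selection described above is, generically, a lookup against SNR boundary points. The following sketch illustrates that mechanism only; the mode table, thresholds, and the 3 dB retransmission margin are invented for illustration, whereas the paper derives its boundary points in closed form under interference and packet-loss constraints.

```python
# (threshold_dB, bits per symbol) -- hypothetical boundary points
MODES = [(5.0, 1), (10.0, 2), (15.0, 4), (20.0, 6)]

def select_mode(snr_db, retransmission=False, aggressive=False):
    """Pick the highest-rate mode whose boundary point the SNR exceeds.

    With 'aggressive' AMC, retransmissions may use a higher-rate (less
    protected) mode, modeled here as a relaxed effective boundary.
    """
    margin = 3.0 if (aggressive and retransmission) else 0.0
    rate = 0  # mode 0: no transmission (outage)
    for threshold, bits in MODES:
        if snr_db + margin >= threshold:
            rate = bits
    return rate

print(select_mode(12.0))  # 2
print(select_mode(12.0, retransmission=True, aggressive=True))  # 4
```

    The point of the aggressive scheme is visible even in this toy version: at the same channel state, a retransmission can ride a faster mode because T-ARQ provides a second chance if it fails.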

  18. End-to-end simulations of the Visible Tunable Filter for the Daniel K. Inouye Solar Telescope

    CERN Document Server

    Schmidt, Wolfgang; Ellwarth, Monika; Baumgartner, Jörg; Bell, Alexander; Fischer, Andreas; Halbgewachs, Clemens; Heidecke, Frank; Kentischer, Thomas; von der Lühe, Oskar; Scheiffelen, Thomas; Sigwarth, Michael

    2016-01-01

    The Visible Tunable Filter (VTF) is a narrowband tunable filter system for imaging spectroscopy and spectropolarimetry. The instrument will be one of the first-light instruments of the Daniel K. Inouye Solar Telescope that is currently under construction on Maui (Hawaii). The VTF is being developed by the Kiepenheuer Institut fuer Sonnenphysik in Freiburg as a German contribution to the DKIST. We perform end-to-end simulations of spectropolarimetric observations with the VTF to verify the science requirements of the instrument. The instrument is simulated with two Etalons, and with a single Etalon. The clear aperture of the Etalons is 250 mm, corresponding to a field of view with a diameter of 60 arcsec in the sky (42,000 km on the Sun). To model the large-scale figure errors we employ low-order Zernike polynomials (power and spherical aberration) with amplitudes of 2.5 nm RMS. We use an ideal polarization modulator with equal modulation coefficients of 3^(-1/2) for the polarization modulation. We synthesiz...

  19. Scaffold-integrated microchips for end-to-end in vitro tumor cell attachment and xenograft formation.

    Science.gov (United States)

    Lee, Jungwoo; Kohl, Nathaniel; Shanbhang, Sachin; Parekkadan, Biju

    2015-12-01

    Microfluidic technologies have substantially advanced cancer research by enabling the isolation of rare circulating tumor cells (CTCs) for diagnostic and prognostic purposes. The characterization of isolated CTCs has been limited due to the difficulty in recovering and growing isolated cells with high fidelity. Here, we present a strategy that uses a 3D scaffold, integrated into a microfluidic device, as a transferable substrate that can be readily isolated after device operation for serial use in vivo as a transplanted tissue bed. Hydrogel scaffolds were incorporated into a PDMS fluidic chamber prior to bonding and were rehydrated in the chamber after fluid contact. The hydrogel matrix completely filled the fluid chamber, significantly increasing the surface area to volume ratio, and could be directly visualized under a microscope. Computational modeling defined different flow and pressure regimes that guided the conditions used to operate the chip. As a proof of concept using a model cell line, we confirmed human prostate tumor cell attachment in the microfluidic scaffold chip, retrieval of the scaffold en masse, and serial implantation of the scaffold into a mouse model with preserved xenograft development. With further improvement in capture efficiency, this approach can offer an end-to-end platform for the continuous study of isolated cancer cells from a biological fluid to a xenograft in mice.

  20. DynaChanAl: Dynamic Channel Allocation with Minimal End-to-end Delay for Wireless Sensor Networks

    CERN Document Server

    Ko, JeongGil

    2010-01-01

    With recent advances in wireless communication, networking, and low-power sensor technology, wireless sensor network (WSN) systems have begun to take significant roles in various applications ranging from environmental sensing to mobile healthcare sensing. While some WSN applications only require a limited amount of bandwidth, new emerging applications operate with a noticeably large amount of data transfers. One way to deal with such applications is to maximize the available capacity by utilizing multiple wireless channels. This work proposes DynaChannAl, a distributed dynamic wireless channel algorithm with the goal of effectively distributing nodes on multiple wireless channels in WSN systems. Specifically, DynaChannAl targets applications where mobile nodes connect to a pre-existing wireless backbone, and takes the expected end-to-end queuing delay as its core metric. We use the link quality indicator (LQI) values provided by IEEE 802.15.4 radios to white-list potential links with good link...
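    The two ingredients the abstract names, an LQI white-list and an expected end-to-end queuing delay metric, combine naturally into a greedy channel choice. The sketch below is a guess at that combination, not the DynaChannAl algorithm itself; the threshold and the use of per-channel load as a delay proxy are assumptions (IEEE 802.15.4 radios commonly report LQI on a 0-255 scale).

```python
LQI_THRESHOLD = 200  # assumed white-list cutoff

def assign_channel(node_links, channel_load):
    """node_links: {channel: lqi} for backbone links the node can hear.
    channel_load: {channel: queued packets} on each backbone channel.
    Returns the chosen channel, or None if no link passes the white-list."""
    candidates = [ch for ch, lqi in node_links.items() if lqi >= LQI_THRESHOLD]
    if not candidates:
        return None
    # expected queuing delay grows with channel load, so pick the lightest
    return min(candidates, key=lambda ch: channel_load.get(ch, 0))

load = {11: 5, 15: 2, 20: 9}
print(assign_channel({11: 230, 15: 210, 20: 190}, load))  # 15
```

    Channel 20 is excluded by the white-list despite being audible, and of the remaining candidates the least-loaded channel wins.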

  1. Effects of collagen membranes enriched with in vitro-differentiated N1E-115 cells on rat sciatic nerve regeneration after end-to-end repair

    Directory of Open Access Journals (Sweden)

    Fornaro Michele

    2010-02-01

    Peripheral nerves possess the capacity for self-regeneration after traumatic injury, but the extent of regeneration is often poor and may benefit from exogenous factors that enhance growth. The use of cellular systems is a rational approach for delivering neurotrophic factors at the nerve lesion site, and in the present study we investigated the effects of enwrapping the site of end-to-end rat sciatic nerve repair with an equine type III collagen membrane enriched or not with N1E-115 pre-differentiated neural cells. After neurotmesis, the sciatic nerve was repaired by end-to-end suture (End-to-End group), end-to-end suture enwrapped with an equine collagen type III membrane (End-to-EndMemb group), or end-to-end suture enwrapped with an equine collagen type III membrane previously covered with neural cells pre-differentiated in vitro from N1E-115 cells (End-to-EndMembCell group). During the postoperative period, motor and sensory functional recovery was evaluated using extensor postural thrust (EPT), withdrawal reflex latency (WRL), and ankle kinematics. After 20 weeks the animals were sacrificed and the repaired sciatic nerves were processed for histological and stereological analysis. Results showed that enwrapment of the repair site with a collagen membrane, with or without neural cell enrichment, did not lead to any significant improvement in most of the functional and stereological predictors of nerve regeneration that we assessed, with the exception of EPT, which recovered significantly better after neural cell enriched membrane employment. It can thus be concluded that this particular type of nerve tissue engineering approach has very limited effects on nerve regeneration after sciatic end-to-end nerve reconstruction in the rat.

  2. End-to-End Simulation for a Forest-Dedicated Full-Waveform Lidar Onboard a Satellite Initialized from Airborne Ultraviolet Lidar Experiments

    OpenAIRE

    Xiaoxia Shang; Patrick Chazette

    2015-01-01

    In order to study forests at the global scale, a detailed link budget for a lidar system onboard a satellite is presented. It is based on an original approach coupling airborne lidar observations and an end-to-end simulator. The simulator is initialized by airborne lidar measurements performed over temperate and tropical forests on French territory, representing a wide range of forest ecosystems. Considering two complementary wavelengths of 355 and 1064 nm, the end-to-end simulator compute...

  3. Surgical repair of acute Achilles tendon rupture with an end-to-end tendon suture and tendon flap.

    Science.gov (United States)

    Corradino, B; Di Lorenzo, S; Calamia, C; Moschella, F

    2015-08-01

    Achilles tendon ruptures are becoming more common. Complications after open or minimally invasive surgery include recurrent rupture (2-8%), wound breakdown, deep infections, granuloma, and fistulas. The authors present their experience with a personal technique. In 8 patients with acute rupture of the Achilles tendon, surgery was performed at least 25 days after trauma. Clinical examination and MR demonstrated a total lesion of the tendon in all cases. After a posterolateral skin incision, the tendon stumps were debrided and sutured in an end-to-end fashion. A tendon flap was harvested from the proximal part of the tendon in order to protect and reinforce the suture itself. A plaster cast was applied for 3 weeks and the patients started the rehabilitation protocol. After 4 months all patients returned to pre-injury daily activities. The mean follow-up was 13 months (range 6 to 24 months). No major complications occurred. The posterolateral skin incision, not directly above the tendon, preserves the vascularity of the soft tissues, allows the sural nerve to be identified and not accidentally injured, and prevents the cutaneous scar from overlapping the tendon; in this way physiological tendon sliding is favoured. The preparation of the tendon flap does not weaken the overall strength of the tendon and protects the tendon suture; the tension on the sutured stumps is reduced by being spread over a larger area. In our sample of 8 patients, the absence of short- and long-term complications and the rapid functional recovery after surgery suggest that the technique used is safe and effective. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. End-to-end simulations and planning of a small space telescope: Galaxy Evolution Spectroscopic Explorer: a case study

    Science.gov (United States)

    Heap, Sara; Folta, David; Gong, Qian; Howard, Joseph; Hull, Tony; Purves, Lloyd

    2016-08-01

    Large astronomical missions are usually general-purpose telescopes with a suite of instruments optimized for different wavelength regions, spectral resolutions, etc. Their end-to-end (E2E) simulations are typically photons-in to flux-out calculations made to verify that each instrument meets its performance specifications. In contrast, smaller space missions are usually single-purpose telescopes, and their E2E simulations start with the scientific question to be answered and end with an assessment of the effectiveness of the mission in answering the scientific question. Thus, E2E simulations for small missions consist of a longer string of calculations than those for large missions, as they include not only the telescope and instrumentation, but also the spacecraft, orbit, and external factors such as coordination with other telescopes. Here, we illustrate the strategy and organization of small-mission E2E simulations using the Galaxy Evolution Spectroscopic Explorer (GESE) as a case study. GESE is an Explorer/Probe-class space mission concept with the primary aim of understanding galaxy evolution. Operation of a small survey telescope in space like GESE is usually simpler than the operation of large telescopes driven by the varied scientific programs of the observers or by transient events. Nevertheless, both types of telescopes share two common challenges: maximizing the integration time on target while minimizing operation costs, including communication costs and staffing on the ground. We show in the case of GESE how these challenges can be met through a custom orbit and a system design emphasizing simplification and leveraging information from ground-based telescopes.

  5. SBSS Demonstrator: A design for efficient demonstration of Space-based Space Surveillance end-to-end capabilities

    Science.gov (United States)

    Utzmann, Jens; Flohrer, Tim; Schildknecht, Thomas; Wagner, Axel; Silha, Jiri; Willemsen, Philip; Teston, Frederic

    This paper presents the capabilities of a Space-Based Space Surveillance (SBSS) demonstration mission for Space Surveillance and Tracking (SST) based on a micro-satellite platform. The results have been produced in the frame of ESA's "Assessment Study for Space Based Space Surveillance Demonstration Mission" performed by the Airbus Defence and Space consortium. Space Surveillance and Tracking is part of Space Situational Awareness (SSA) and covers the detection, tracking and cataloguing of space debris and satellites. Derived SST services comprise a catalogue of these man-made objects, collision warning, detection and characterisation of in-orbit fragmentations, sub-catalogue debris characterisation, etc. The assessment of SBSS in a SST system architecture has shown that both an operational SBSS and also a well-designed space-based demonstrator can provide substantial performance in terms of surveillance and tracking of beyond-LEO objects. Especially the early deployment of a demonstrator, possible by using standard equipment, could boost initial operating capability and create a self-maintained object catalogue. Furthermore, unique statistical information about small-size LEO debris (mm size) can be collected in-situ. Unlike classical technology demonstration missions, the primary goal is the demonstration and optimisation of the functional elements in a complex end-to-end chain (mission planning, observation strategies, data acquisition, processing and fusion, etc.) until the final products can be offered to the users. Also past and current missions by the US (SBV, SBSS) and Canada (Sapphire, NEOSSat) underline the advantages of space-based space surveillance. The presented SBSS system concept takes the ESA SST System Requirements (derived within the ESA SSA Preparatory Program) into account and aims at fulfilling SST core requirements in a stand-alone manner. Additionally, requirements for detection and characterisation of small-sized LEO debris are

  6. The End-To-End Safety Verification Process Implemented to Ensure Safe Operations of the Columbus Research Module

    Science.gov (United States)

    Arndt, J.; Kreimer, J.

    2010-09-01

    The European Space Laboratory COLUMBUS was launched in February 2008 with NASA Space Shuttle Atlantis. Since successful docking and activation, this manned laboratory forms part of the International Space Station (ISS). Depending on the objectives of the Mission Increments, the on-orbit configuration of the COLUMBUS Module varies with each increment. This paper describes the end-to-end verification which has been implemented to ensure safe operations under the condition of a changing on-orbit configuration. That verification process has to cover not only the configuration changes as foreseen by the Mission Increment planning, but also those configuration changes on short notice which become necessary due to near real-time requests initiated by crew or Flight Control, and changes - most challenging since unpredictable - due to on-orbit anomalies. Subject of the safety verification is, on one hand, the on-orbit configuration itself, including the hardware and software products, and on the other hand the related ground facilities needed for commanding of and communication to the on-orbit system. The operational products, e.g. the procedures prepared for crew and ground control in accordance with increment planning, are also subject of the overall safety verification. In order to analyse the on-orbit configuration for potential hazards and to verify the implementation of the related safety-required hazard controls, a hierarchical approach is applied. The key element of the analytical safety integration of the whole COLUMBUS Payload Complement, including hardware owned by International Partners, is the Integrated Experiment Hazard Assessment (IEHA). The IEHA especially identifies those hazardous scenarios which could potentially arise through physical and operational interaction of experiments. A major challenge is the implementation of a safety process which must have considerable rigidity in order to provide reliable verification of on-board safety and which likewise provides enough

  7. Transport distance of invertebrate environmental DNA in a natural river.

    Directory of Open Access Journals (Sweden)

    Kristy Deiner

    Environmental DNA (eDNA) monitoring is a novel molecular technique to detect species in natural habitats. Many eDNA studies in aquatic systems have focused on lakes or ponds and/or on large vertebrate species, but applications to invertebrates in river systems are emerging. A challenge in applying eDNA monitoring in flowing waters is that a species' DNA can be transported downstream. Whether and how far eDNA can be detected due to downstream transport remains largely unknown. In this study we tested for downstream detection of eDNA for two invertebrate species, Daphnia longispina and Unio tumidus, which are lake-dwelling species in our study area. The goal was to determine how far away from the source population in a lake their eDNA could be detected in an outflowing river. We sampled water from eleven river sites at regular intervals up to 12.3 km downstream of the lake, developed new eDNA probes for both species, and used a standard PCR and Sanger sequencing detection method to confirm the presence of each species' eDNA in the river. We detected D. longispina at all locations and across two time points (July and October), whereas with U. tumidus we observed a decreased detection rate and did not detect its eDNA after 9.1 km. We also observed a difference in detection for this species at different times of year. The observed movement of eDNA from the source, amounting to nearly 10 km for these species, indicates that the resolution of an eDNA sample can be large in river systems. Our results indicate that there may be species-specific transport distances for eDNA and demonstrate for the first time that invertebrate eDNA can persist over relatively large distances in a natural river system.
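    A simple way to reason about a species-specific transport distance is a first-order decay model. This is an assumption for illustration, not the study's analysis: if eDNA concentration falls roughly exponentially with downstream distance, C(x) = C0 · exp(-x/L), a species stops being detectable once C drops below the assay's detection limit. The numbers below are chosen only so the detectable range lands near the ~9-10 km scale reported in the abstract.

```python
import math

def concentration(x_km, c0=100.0, decay_km=3.0):
    """Relative eDNA concentration at x_km downstream of the source."""
    return c0 * math.exp(-x_km / decay_km)

def max_detectable_km(c0, decay_km, limit):
    """Distance at which C(x) falls to the assay's detection limit:
    solve c0 * exp(-x/L) = limit for x."""
    return decay_km * math.log(c0 / limit)

print(round(max_detectable_km(100.0, 3.0, 5.0), 1))  # 9.0
```

    Under such a model, a species-specific detectable range reflects both the decay length L (shedding, degradation, settling) and the detection limit of the probe.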

  8. First Demonstration of Real-Time End-to-End 40 Gb/s PAM-4 System using 10-G Transmitter for Next Generation Access Applications

    DEFF Research Database (Denmark)

    Wei, Jinlong; Eiselt, Nicklas; Griesser, Helmut

    We demonstrate the first known experiment of a real-time end-to-end 40-Gb/s PAM-4 system for next generation access applications using 10G class transmitters only. Up to 25-dB upstream link budget for 20 km SMF is achieved.

  9. Evolving a Fully Integrated Lean Six Sigma Continuous Process Improvement Systems Approach for Enterprise End-to-End Value Stream Excellence

    Science.gov (United States)

    2007-05-01

    To ensure proper strategic "balance," map business Ys to the Balanced Scorecard (business Ys = LSS/TOC major north stars and areas of strategic emphasis). Reinforce AIRSpeed commitment: tools, methodologies and strategies. Apply HICVS end-to-end across the NAE, including linkage to DoN and

  10. First examples of two ferromagnetic end-to-end cyanate bridged 1D linear coordination polymers of nickel(II) containing an unsymmetrical diamine.

    Science.gov (United States)

    Choubey, Somnath; Bhar, Kishalay; Chattopadhyay, Soumi; Hazra, Arpan; Maji, Tapas Kumar; Ribas, Joan; Ghosh, Barindra Kumar

    2012-10-14

    Two new end-to-end (EE) cyanate bridged 1D coordination polymers of Ni(II) are isolated which contain linear (180°) Ni-N-C and Ni-O-C angles in Ni-NCO-Ni bridges and show ferromagnetic (F) coupling in agreement with the reported theoretical model for linear EE bridges.

  11. Analysis of outcome of end-to-end and end-to-side internal iliac artery anastomosis in renal transplantation: Our initial experience with a case series.

    Science.gov (United States)

    Pal, Dilip Kumar; Sanki, Prakash Kumar; Roy, Sayak

    2017-01-01

    In renal transplantation, the renal artery is anastomosed either end-to-side to the external iliac artery or end-to-end to the internal iliac artery. End-to-end internal iliac artery anastomosis can be associated with complications due to compromised distal vascular supply to the limbs and penile erectile tissue. A method of end-to-side anastomosis can overcome them. To date, no case series or trial has studied the effect of end-to-side anastomosis. This study aimed to compare the outcomes of end-to-side and end-to-end anastomosis in order to evaluate the efficacy of the end-to-side technique. A total of 40 renal transplant recipients with internal iliac artery anastomosis were divided into two groups: 20 patients with end-to-end and 20 patients with end-to-side anastomosis. The cold ischemia time, arterial anastomosis time, post-operative bleeding and urine leak, claudication, saddle anesthesia and erectile dysfunction, and follow-up recipient creatinine, eGFR, and Doppler assessment of graft renal artery patency (at 6 months post-transplant) were compared between the two groups. The intraoperative cold ischemia time was slightly longer in the end-to-end group, but the difference was not statistically significant (P = 0.22). The arterial anastomosis time was comparable in both groups (P = 0.65). In the end-to-end group, 15%, 20% and 15% of patients had post-operative saddle anesthesia, claudication, and mild-to-moderate erectile dysfunction, respectively, all of which were absent in the end-to-side group. On follow-up, the mean recipient serum creatinine and eGFR were comparable between the two groups, as was graft renal artery patency on Doppler. The end-to-side technique can definitely be applied for renal transplantation, with some advantages over the end-to-end technique and without compromising efficacy.

  12. (YIP-07) Economic Models for End-to-End Decision Making in an AD HOC Network Environment

    Science.gov (United States)

    2010-02-28


  13. End-to-End System Test and Optical Performance Evaluation for the Solar and Heliospheric Observatory (SOHO) Ultraviolet Coronagraph Spectrometer (UVCS)

    Science.gov (United States)

    Carosso, Paolo A.; Gardner, Larry D.; Jhabvala, Marzy; Nicolosi, P.

    1997-01-01

    The UVCS is one of the instruments carried by the Solar and Heliospheric Observatory (SOHO), a joint NASA/ESA Spacecraft launched in November 1995. It is designed to perform ultraviolet spectroscopy and visible light polarimetry of the extended solar corona. The primary scientific objectives of the UVCS investigation are to study the physical processes occurring in the extended solar corona, such as: the mechanism of acceleration of the solar wind, the mechanism of coronal plasma heating, the identification of solar wind sources, and the investigation of the plasma properties of the solar wind. The UVCS End-to-End test activities included a comprehensive set of system level functional and optical tests. Although performed under severe schedule constraints, the End-to-End System Test was very successful and served to fully validate the UVCS optical design. All test results showed that the primary scientific objectives of the UVCS Mission were achievable.

  14. Including 10-Gigabit-capable Passive Optical Network under End-to-End Generalized Multi-Protocol Label Switching Provisioned Quality of Service

    DEFF Research Database (Denmark)

    Brewka, Lukasz Jerzy; Gavler, Anders; Wessing, Henrik

    2012-01-01

    End-to-end quality of service provisioning is still a challenging task despite many years of research and development in this area. Considering a generalized multi-protocol label switching based core/metro network and resource reservation protocol capable home gateways, it is the access part...... of the network where quality of service signaling is bridged. This article proposes strategies for generalized multi-protocol label switching control over next emerging passive optical network standard, i.e., the 10-gigabit-capable passive optical network. Node management and resource allocation approaches...... are discussed, and possible issues are raised. The analysis shows that consideration of a 10-gigabit-capable passive optical network as a generalized multi-protocol label switching controlled domain is valid and may advance end-to-end quality of service provisioning for passive optical network based customers....

  15. Software system requirements for the Army Tactical Missile System (ATACMS) End-To-End System using the Computer Aided Prototyping System (CAPS) multi-file approach

    OpenAIRE

    Angrisani, David Stuart; Whitbeck, George Steven.

    1996-01-01

    The Department of Defense (DOD) is seeking software system requirements for the Army Tactical Missile System (ATACMS) End to End System, which comprises both ATACMS and all sensors, links, and command centers which enable integration across system and service boundaries. The complexity, multiple interfaces, and joint nature of planned ATACMS operations demand accurate specification of software system requirements. DOD also desires automated tools capable of developing rapid prototypes to ass...

  16. Direct muscle neurotization after end-to-end and end-to-side neurorrhaphy: An experimental study in the rat forelimb model.

    Science.gov (United States)

    Papalia, Igor; Ronchi, Giulia; Muratori, Luisa; Mazzucco, Alessandra; Magaudda, Ludovico; Geuna, Stefano

    2012-10-15

    The need for the continuous research of new tools for improving motor function recovery after nerve injury is justified by the still often unsatisfactory clinical outcome in these patients. It has been previously shown that the combined use of two reconstructive techniques, namely end-to-side neurorrhaphy and direct muscle neurotization in the rat hindlimb model, can lead to good results in terms of skeletal muscle reinnervation. Here we show that, in the rat forelimb model, the combined use of direct muscle neurotization with either end-to-end or end-to-side neurorrhaphy to reinnervate the denervated flexor digitorum muscles leads to muscle atrophy prevention over a long postoperative time lapse (10 months). By contrast, very little motor recovery (in case of end-to-end neurorrhaphy) and almost no motor recovery (in case of end-to-side neurorrhaphy) were observed in the grasping activity controlled by flexor digitorum muscles. It can thus be concluded that, at least in the rat, direct muscle neurotization after both end-to-end and end-to-side neurorrhaphy represents a good strategy for preventing denervation-related muscle atrophy but not for regaining the lost motor function.

  17. Direct muscle neurotization after end-to-end and end-to-side neurorrhaphy An experimental study in the rat forelimb model

    Institute of Scientific and Technical Information of China (English)

    Igor Papalia; Giulia Ronchi; Luisa Muratori; Alessandra Mazzucco; Ludovico Magaudda; Stefano Geuna

    2012-01-01

    The need for the continuous research of new tools for improving motor function recovery after nerve injury is justified by the still often unsatisfactory clinical outcome in these patients. It has been previously shown that the combined use of two reconstructive techniques, namely end-to-side neurorrhaphy and direct muscle neurotization in the rat hindlimb model, can lead to good results in terms of skeletal muscle reinnervation. Here we show that, in the rat forelimb model, the combined use of direct muscle neurotization with either end-to-end or end-to-side neurorrhaphy to reinnervate the denervated flexor digitorum muscles leads to muscle atrophy prevention over a long postoperative time lapse (10 months). By contrast, very little motor recovery (in case of end-to-end neurorrhaphy) and almost no motor recovery (in case of end-to-side neurorrhaphy) were observed in the grasping activity controlled by flexor digitorum muscles. It can thus be concluded that, at least in the rat, direct muscle neurotization after both end-to-end and end-to-side neurorrhaphy represents a good strategy for preventing denervation-related muscle atrophy but not for regaining the lost motor function.

  18. An end-to-end system in support of a broad scope of GOES-R sensor and data processing study

    Science.gov (United States)

    Huang, Hung-Lung

    2005-08-01

    The mission of NOAA's Geostationary Operational Environmental Satellite System (GOES) R series satellites, in the 2012 time frame, is to provide continuous, near real-time meteorological, oceanographic, solar, and space environment data that supports NOAA's strategic mission goals. It presents an exciting opportunity to explore new instruments, satellite designs, and system architectures utilizing new communication and instrument technologies in order to meet the ever-increasing demands made of Earth observation systems by national agencies and end users alike. The GOES-R sensor suite includes a 16 spectral band Advanced Baseline Imager (ABI), an approximately 1500 high spectral resolution band Hyperspectral Environmental Suite (HES), plus other sensors designed to detect lightning and to explore the ocean, solar and space environment. The Cooperative Institute for Meteorological Satellite Studies (CIMSS), part of the Space Science and Engineering Center (SSEC) of the University of Wisconsin-Madison and a long-time partner of NOAA, has developed the first operational end-to-end processing system for GOES. Based on this heritage, and with recent support from the NASA/NOAA Geosynchronous Imaging FTS (GIFTS) project, the Navy's Multiple University Research Initiative (MURI), and NOAA's GOES-R Risk Reduction program, SSEC has built a near-complete end-to-end system that is capable of simulating sensor measurements from top of atmosphere radiances, raw sensor data (level 0) through calibrated and navigated sensor physical measurements (level 1) to the processed products (level 2). In this paper, the SSEC Hyperspectral Imaging and Sounding Simulator and Processor (HISSP) will be presented in detail. HISSP is capable of demonstrating most of the processing functions such as data compression/decompression, sensor calibration, data processing, algorithm development, and product generation. In summary, HISSP is an end-to-end system designed to support both government and

  19. X-ray follow up of end-to-end nerve repair site: A new technique, Nerve Repair Site Marking (NRSM)

    Directory of Open Access Journals (Sweden)

    Aydin Yuceturk

    2016-12-01

    Conclusion: To get the best results following end-to-end nerve repair, nerve continuity must be maintained, though there is always a risk of rupture at the repair site. Although ultrasonography and MRI can be utilized to examine the repair site, they are not helpful in brachial plexus repairs, are expensive, and can be time-consuming when employed with peripheral nerves. NRSM is an easy, objective, and cheap follow-up technique after nerve repair and provides a chance for early re-repair. [Hand Microsurg 2016; 5(3): 118-123]

  20. DNA as a molecular wire: distance and sequence dependence.

    Science.gov (United States)

    Wohlgamuth, Chris H; McWilliams, Marc A; Slinker, Jason D

    2013-09-17

    Functional nanowires and nanoelectronics are sought for their use in next generation integrated circuits, but several challenges limit the use of most nanoscale devices on large scales. DNA has great potential for use as a molecular wire due to high yield synthesis, near-unity purification, and nanoscale self-organization. Nonetheless, a thorough understanding of ground state DNA charge transport (CT) in electronic configurations under biologically relevant conditions, where the fully base-paired, double-helical structure is preserved, is lacking. Here, we explore the fundamentals of CT through double-stranded DNA monolayers on gold by assessing 17 base pair bridges at discrete points with a redox active probe conjugated to a modified thymine. This assessment is performed under temperature-controlled and biologically relevant conditions with cyclic and square wave voltammetry, and redox peaks are analyzed to assess transfer rate and yield. We demonstrate that the yield of transport is strongly tied to the stability of the duplex, linearly correlating with the melting temperature. Transfer rate is found to be temperature-activated and to follow an inverse distance dependence, consistent with a hopping mechanism of transport. These results establish the governing factors of charge transfer speed and throughput in DNA molecular wires for device configurations, guiding subsequent application for nanoscale electronics.
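
    The temperature-activated, inverse-distance behavior reported above is the signature of multistep hopping rather than single-step tunneling. A minimal sketch of such a rate law (the activation energy and prefactor below are hypothetical illustrative values, not fitted to this study's data):

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant, eV/K

def hopping_rate(n_steps, temperature_k, e_act_ev=0.2, prefactor_hz=1.0e9):
    """Illustrative multistep-hopping charge transfer rate: Arrhenius
    (temperature-activated) behavior combined with an inverse dependence
    on the number of hopping steps, i.e., on bridge length."""
    return prefactor_hz * math.exp(-e_act_ev / (K_B_EV * temperature_k)) / n_steps
```

Under this model the rate rises with temperature and falls only algebraically (here as 1/N) with distance, in contrast to the exponential fall-off expected for coherent tunneling.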

  1. Interoperable End-to-End Remote Patient Monitoring Platform based on IEEE 11073 PHD and ZigBee Health Care Profile.

    Science.gov (United States)

    Clarke, Malcolm; de Folter, Joost; Verma, Vivek; Gokalp, Hulya

    2017-08-07

    This paper describes the implementation of an end-to-end remote monitoring platform based on the IEEE 11073 standards for Personal Health Devices (PHD). It provides an overview of the concepts and approaches and describes how the standard has been optimized for small devices with limited resources of processor, memory and power and that use short range wireless technology. It explains aspects of IEEE 11073, including the Domain Information Model, state model and nomenclature, and how these support its plug-and-play architecture. It shows how these aspects underpin a much larger eco-system of interoperable devices and systems that include IHE PCD-01, HL7 and Bluetooth LE medical devices, and the relationship to the Continua Guidelines, advocating the adoption of data standards and nomenclature to support semantic interoperability between health and ambient assisted living (AAL) in future platforms. The paper further describes the adaptations that have been made in order to implement the standard on the ZigBee Health Care Profile and the experiences of implementing an end-to-end platform that has been deployed to frail elderly patients with chronic disease(s) and patients with diabetes.

  2. End-to-End Study of the Transfer of Energy from Magnetosheath Ion Precipitation to the Ionospheric Cusp and Resulting Ion Outflow to the Magnetosphere

    Science.gov (United States)

    Coffey, Victoria; Chandler, Michael; Singh, Nagendra; Avanov, Levon

    2003-01-01

    We will show results from an end-to-end study of the energy transfer from injected magnetosheath plasmas to the near-Earth magnetospheric and ionospheric plasmas and the resulting ion outflow to the magnetosphere. This study includes modeling of the evolution of the magnetosheath precipitation in the cusp using a kinetic code with a realistic magnetic field configuration. These evolved, highly non-Maxwellian distributions are used as input to a 2D PIC code to analyze the resulting wave generation. The wave analysis is used in the kinetic code as input to the cold ionospheric ions to study the transfer of energy to these ions and their outflow to the magnetosphere. Observations from the Thermal Ion Dynamics Experiment (TIDE) and other instruments on the Polar Spacecraft will be compared to the modeling.

  3. Demonstration of a fully-coupled end-to-end model for small pelagic fish using sardine and anchovy in the California Current

    Science.gov (United States)

    Rose, Kenneth A.; Fiechter, Jerome; Curchitser, Enrique N.; Hedstrom, Kate; Bernal, Miguel; Creekmore, Sean; Haynie, Alan; Ito, Shin-ichi; Lluch-Cota, Salvador; Megrey, Bernard A.; Edwards, Chris A.; Checkley, Dave; Koslow, Tony; McClatchie, Sam; Werner, Francisco; MacCall, Alec; Agostini, Vera

    2015-11-01

    We describe and document an end-to-end model of anchovy and sardine population dynamics in the California Current as a proof of principle that such coupled models can be developed and implemented. The end-to-end model is 3-dimensional, time-varying, and multispecies, and consists of four coupled submodels: hydrodynamics, Eulerian nutrient-phytoplankton-zooplankton (NPZ), an individual-based full life cycle anchovy and sardine submodel, and an agent-based fishing fleet submodel. A predator roughly mimicking albacore was included as individuals that consumed anchovy and sardine. All submodels were coded within the ROMS open-source community model, used the same resolution spatial grid, and were all solved simultaneously to allow for possible feedbacks among the submodels. We used a super-individual approach and solved the coupled models on a distributed memory parallel computer, both of which created challenging but resolvable bookkeeping challenges. The anchovy and sardine growth, mortality, reproduction, and movement, and the fishing fleet submodel, were each calibrated using simplified grids before being inserted into the full end-to-end model. An historical simulation of 1959-2008 was performed, and the latter 45 years analyzed. Sea surface height (SSH) and sea surface temperature (SST) for the historical simulation showed strong horizontal gradients and multi-year scale temporal oscillations related to various climate indices (PDO, NPGO), and both showed responses to ENSO variability. Simulated total phytoplankton was lower during strong El Nino events and higher for the strong 1999 La Nina event. The three zooplankton groups generally corresponded to the spatial and temporal variation in simulated total phytoplankton. Simulated biomasses of anchovy and sardine were within the historical range of observed biomasses but predicted biomasses showed much less inter-annual variation. Anomalies of annual biomasses of anchovy and sardine showed a switch in the mid
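
    The Eulerian NPZ submodel mentioned above couples nutrient, phytoplankton, and zooplankton pools. A minimal single-box sketch with forward-Euler time stepping (all rate constants are hypothetical, and this is far simpler than the ROMS implementation):

```python
def npz_step(n, p, z, dt=0.1, vm=2.0, ks=1.0, g=1.5, kp=0.5, mp=0.1, mz=0.2):
    """One forward-Euler step of a minimal closed NPZ box model.

    uptake  = vm * N/(ks+N) * P   (Michaelis-Menten nutrient uptake)
    grazing = g * P/(kp+P) * Z    (saturating zooplankton grazing)
    Mortality (mp, mz) is remineralized back to the nutrient pool,
    so total N + P + Z is conserved.
    """
    uptake = vm * n / (ks + n) * p
    grazing = g * p / (kp + p) * z
    dn = -uptake + mp * p + mz * z
    dp = uptake - grazing - mp * p
    dz = grazing - mz * z
    return n + dt * dn, p + dt * dp, z + dt * dz
```

Because every loss term reappears as a source elsewhere, the total mass is invariant under the step, which is a useful sanity check for any NPZ implementation.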

  4. The Swarm End-to-End mission simulator study: A demonstration of separating the various contributions to Earth's magnetic field using synthetic data

    DEFF Research Database (Denmark)

    Olsen, Nils; Haagmans, R.; Sabaka, T.J.

    2006-01-01

    Swarm, a satellite constellation to measure Earth's magnetic field with unprecedented accuracy, has been selected by ESA for launch in 2009. The mission will provide the best ever survey of the geomagnetic field and its temporal evolution, in order to gain new insights into the Earth system...... by improving our understanding of the Earth's interior and climate. An End-to-End mission performance simulation was carried out during Phase A of the mission, with the aim of analyzing the key system requirements, particularly with respect to the number of Swarm satellites and their orbits related...... to the science objectives of Swarm. In order to be able to use realistic parameters of the Earth's environment, the mission simulation starts at January 1, 1997 and lasts until re-entry of the lower satellites five years later. Synthetic magnetic field values were generated for all relevant contributions...

  5. Current transformer verification system for in situ, live line end-to-end calibration of medium and high voltage current transformers

    Energy Technology Data Exchange (ETDEWEB)

    Gunn, C.; Irwin, L.; Marr, D. [Schneider Electric Canada, Toronto, ON (Canada)

    2007-07-01

    IEC Instrument Transformer Standard 60044-1 was created in response to recent industrial and regulatory trends for improved metering performance. The accuracy of high voltage (HV) metering depends on the quality of the instrument transformer signals feeding the energy meters. The complete current transformer (CT) system, including the end-to-end interconnection and metering burden, must be routinely verified in order to claim ongoing compliance. However, utilities rarely perform routine verification and calibration of medium voltage (MV) and HV current instrument transformers because of the complexities associated with CT primary circuit removal and subsequent primary injection. In some cases, routine verification and calibration is not performed due to the belief that conventional CT performance is a constant that never changes. However, conventional CT accuracies may change over time due to transformer insulation breakdown, electric field effects, core magnetization, and burden wiring effects. In addition, CT secondary circuits may be shared with multiple metering and protection devices that present a complex burden to the CT. Changes in burden impedance can also have an effect on CT accuracy. This paper presented the Current Transformer Verification System (CTVS), which provides a unique and novel means to precisely characterize and verify the actual end-to-end performance of MV and HV current transformers while still energized. The CTVS system takes into account any voltage or specific installed site burden dependency that may be present. It provides the utility with the ability to verify CTs in-situ without system interruption through live-line deployment and to readily check the accuracy of metering systems to ensure compliance with industry requirements. 2 tabs., 9 figs.

  6. End-to-End Simulation for a Forest-Dedicated Full-Waveform Lidar Onboard a Satellite Initialized from Airborne Ultraviolet Lidar Experiments

    Directory of Open Access Journals (Sweden)

    Xiaoxia Shang

    2015-04-01

    Full Text Available In order to study forests at the global scale, a detailed link budget for a lidar system onboard satellite is presented. It is based on an original approach coupling airborne lidar observations and an end-to-end simulator. The simulator is initialized by airborne lidar measurements performed over temperate and tropical forests on the French territory, representing a wide range of forest ecosystems. Considering two complementary wavelengths of 355 and 1064 nm, the end-to-end simulator computes the performance of spaceborne lidar systems for different orbits. The analysis is based on forest structural (tree top height, quadratic mean canopy height) and optical (forest optical thickness) parameters. Although an ultraviolet lidar appears to be a good candidate for airborne measurements, our results show that the limited energy is not favorable for spaceborne missions with such a wavelength. A near infrared wavelength at 1064 nm is preferable, requiring ~100 mJ laser emitted energy, which is in agreement with current and future spaceborne missions involving a lidar. We find that the signal-to-noise ratio at the ground level to extract both the structural and optical parameters of forests must be larger than 10. Hence, considering the presence of clouds and aerosols in the atmosphere and assuming a stationary forest, a good detection probability of 99% can be reached when 4 or 5 satellite revisits are considered for a lidar system onboard the ISS or ICESat, respectively. This concerns ~90% of forest covers observed from the lidar, which have an optical thickness less than 3.
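
    The 99% detection probability quoted above follows from treating each satellite revisit as an independent trial: with single-pass success probability p, the probability of at least one good observation after n revisits is 1 - (1 - p)^n. A sketch (the single-pass probabilities used below are assumed illustrative values, not figures from the study):

```python
import math

def revisits_for_detection(p_single_pass, target_prob=0.99):
    """Smallest number of independent revisits n such that
    1 - (1 - p)^n >= target_prob."""
    if not 0 < p_single_pass < 1:
        raise ValueError("single-pass probability must be in (0, 1)")
    return math.ceil(math.log(1 - target_prob) / math.log(1 - p_single_pass))
```

For example, an assumed single-pass success probability of 0.65 (clear line of sight through clouds and aerosols) yields 5 required revisits, while 0.70 yields 4, consistent in spirit with the 4-5 revisits reported for the two orbits.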

  7. Imaging and dosimetric errors in 4D PET/CT-guided radiotherapy from patient-specific respiratory patterns: a dynamic motion phantom end-to-end study

    Science.gov (United States)

    Bowen, S. R.; Nyflot, M. J.; Herrmann, C.; Groh, C. M.; Meyer, J.; Wollenweber, S. D.; Stearns, C. W.; Kinahan, P. E.; Sandison, G. A.

    2015-05-01

    Effective positron emission tomography / computed tomography (PET/CT) guidance in radiotherapy of lung cancer requires estimation and mitigation of errors due to respiratory motion. An end-to-end workflow was developed to measure patient-specific motion-induced uncertainties in imaging, treatment planning, and radiation delivery with respiratory motion phantoms and dosimeters. A custom torso phantom with inserts mimicking normal lung tissue and lung lesion was filled with [18F]FDG. The lung lesion insert was driven by six different patient-specific respiratory patterns or kept stationary. PET/CT images were acquired under motionless ground truth, tidal breathing motion-averaged (3D), and respiratory phase-correlated (4D) conditions. Target volumes were estimated by standardized uptake value (SUV) thresholds that accurately defined the ground-truth lesion volume. Non-uniform dose-painting plans using volumetrically modulated arc therapy were optimized for fixed normal lung and spinal cord objectives and variable PET-based target objectives. Resulting plans were delivered to a cylindrical diode array at rest, in motion on a platform driven by the same respiratory patterns (3D), or motion-compensated by a robotic couch with an infrared camera tracking system (4D). Errors were estimated relative to the static ground truth condition for mean target-to-background (T/Bmean) ratios, target volumes, planned equivalent uniform target doses, and 2%-2 mm gamma delivery passing rates. Relative to motionless ground truth conditions, PET/CT imaging errors were on the order of 10-20%, treatment planning errors were 5-10%, and treatment delivery errors were 5-30% without motion compensation. Errors from residual motion following compensation methods were reduced to 5-10% in PET/CT imaging, <5% in treatment planning, and <2% in treatment delivery. We have demonstrated that estimation of respiratory motion uncertainty and its propagation from PET/CT imaging to RT planning, and RT

  8. Profiling wind and greenhouse gases by infrared-laser occultation: algorithm and results from end-to-end simulations in windy air

    Directory of Open Access Journals (Sweden)

    A. Plach

    2015-01-01

    Full Text Available The new mission concept of microwave and infrared-laser occultation between low-Earth-orbit satellites (LMIO) is designed to provide accurate and long-term stable profiles of atmospheric thermodynamic variables, greenhouse gases (GHGs), and line-of-sight (l.o.s.) wind speed with focus on the upper troposphere and lower stratosphere (UTLS). While the unique quality of GHG retrievals enabled by LMIO over the UTLS has been recently demonstrated based on end-to-end simulations, the promise of l.o.s. wind retrieval, and of joint GHG and wind retrieval, has not yet been analyzed in any realistic simulation setting so far. Here we describe a newly developed l.o.s. wind retrieval algorithm, which we embedded in an end-to-end simulation framework that also includes the retrieval of thermodynamic variables and GHGs, and analyze the performance of both standalone wind retrieval and joint wind and GHG retrieval. The wind algorithm utilizes LMIO laser signals placed on the inflection points at the wings of the highly symmetric C18OO absorption line near 4767 cm−1 and exploits transmission differences from wind-induced Doppler shift. Based on realistic example cases for a diversity of atmospheric conditions, ranging from tropical to high-latitude winter, we find that the retrieved l.o.s. wind profiles are of high quality over the lower stratosphere under all conditions, i.e., unbiased and accurate to within about 2 m s−1 over about 15 to 35 km. The wind accuracy degrades into the upper troposphere due to decreasing signal-to-noise ratio of the wind-induced differential transmission signals. The GHG retrieval in windy air is not vulnerable to wind speed uncertainties up to about 10 m s−1 but is found to benefit in case of higher speeds from the integrated wind retrieval that enables correction of wind-induced Doppler shift of GHG signals. Overall both the l.o.s. wind and GHG retrieval results are strongly encouraging towards further development and
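
    The retrieval above rests on the first-order Doppler shift of the absorption line, Δσ = σ₀ · v/c. A quick sketch of its magnitude at the 2 m s⁻¹ accuracy level quoted in the abstract:

```python
C_M_S = 299_792_458.0  # speed of light, m/s

def doppler_shift_wavenumber(sigma0_cm1, los_wind_m_s):
    """First-order Doppler shift (in cm^-1) of a line at wavenumber
    sigma0 for a line-of-sight wind speed v: d_sigma = sigma0 * v / c."""
    return sigma0_cm1 * los_wind_m_s / C_M_S

# Shift of the C18OO line near 4767 cm^-1 for a 2 m/s l.o.s. wind
shift = doppler_shift_wavenumber(4767.0, 2.0)
```

A 2 m s⁻¹ wind shifts the 4767 cm⁻¹ line by only about 3×10⁻⁵ cm⁻¹, which is why the algorithm senses the shift indirectly, as a transmission difference between signals placed on the steep inflection points of the line wings.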

  9. Comparison Between End-to-end Anastomosis and Buccal Mucosa Graft in Short Segment Bulbar Urethral Stricture: a Meta-analysis Study

    Directory of Open Access Journals (Sweden)

    Prahara Yuri

    2016-09-01

    Full Text Available Aim: to compare long-term follow-up between end-to-end urethroplasty and buccal mucosa graft for the management of patients with short bulbar urethral stricture. Methods: we conducted a meta-analysis of cohort studies. A literature search was performed on the MEDLINE, Science Direct, and EMBASE databases, including studies from 1980 through 2014. The inclusion criteria were patients with short bulbar urethral stricture (sized ≤3 cm) undergoing end-to-end anastomosis (EE) or buccal mucosa graft (BMG), with the complications of voiding symptoms and sexual dysfunction assessed at ≥12 months. Pooled risk ratios (RRs) and 95% confidence intervals (CIs) were calculated using the Mantel-Haenszel method, while heterogeneity was determined through the I2 value. Data analysis was done using Stata software version 10.0 (StataCorp). Results: we analyzed 10 studies in this meta-analysis. Sexual dysfunction following EE and BMG was found in 24.6% (45/183) and 9.1% (11/122) of patients, respectively (overall RR 2.54; 95% CI: 1.44-4.47; p=0.001). Voiding symptoms following EE and BMG were found in 14% (8/57) and 12.5% (7/56) of patients, respectively (overall RR 0.77; 95% CI: 0.3-2.0; p=0.591). Furthermore, stricture recurrence following EE and BMG was 8.4% (8/107) and 30% (14/46), respectively (overall RR 0.38; 95% CI: 0.17-0.84; p=0.016). The effectiveness of EE and BMG was found to be equal, as both demonstrated few complications. BMG was found to be superior to EE in terms of minimal sexual dysfunction complications. On the contrary, EE was found to be superior to BMG in terms of stricture recurrence following short bulbar urethral stricture surgery. Conclusion: BMG can be considered as the primary treatment rather than EE for managing short urethral stricture cases.
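
    The Mantel-Haenszel pooling used above can be sketched directly; for a single 2x2 table it reduces to the crude risk ratio. (The 45/183 vs 11/122 totals below are taken from the abstract, but the reported pooled RR of 2.54 comes from the individual study tables, which are not reproduced here.)

```python
def mantel_haenszel_rr(studies):
    """Mantel-Haenszel pooled risk ratio over a list of 2x2 tables.

    Each study is (a, n1, c, n2): events/total in the exposed arm and
    events/total in the control arm.
    RR_MH = sum(a * n2 / N) / sum(c * n1 / N), with N = n1 + n2.
    """
    num = sum(a * n2 / (n1 + n2) for a, n1, c, n2 in studies)
    den = sum(c * n1 / (n1 + n2) for a, n1, c, n2 in studies)
    return num / den
```

With a single table the weights cancel, so the pooled estimate equals the crude RR; with several tables the estimate is a weighted compromise between the study-level ratios.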

  10. The PLATO End-to-End CCD Simulator -- Modelling space-based ultra-high precision CCD photometry for the assessment study of the PLATO Mission

    CERN Document Server

    Zima, W; De Ridder, J; Salmon, S; Catala, C; Kjeldsen, H; Aerts, C

    2010-01-01

    The PLATO satellite mission project is a next generation ESA Cosmic Vision satellite project dedicated to the detection of exo-planets and to asteroseismology of their host-stars using ultra-high precision photometry. The main goal of the PLATO mission is to provide a full statistical analysis of exo-planetary systems around stars that are bright and close enough for detailed follow-up studies. Many aspects concerning the design trade-off of a space-based instrument and its performance can best be tackled through realistic simulations of the expected observations. The complex interplay of various noise sources in the course of the observations made such simulations an indispensable part of the assessment study of the PLATO Payload Consortium. We created an end-to-end CCD simulation software-tool, dubbed PLATOSim, which simulates photometric time-series of CCD images by including realistic models of the CCD and its electronics, the telescope optics, the stellar field, the pointing uncertainty of the satellite ...

  11. Partial QoS-Aware Opportunistic Relay Selection Over Two-Hop Channels: End-to-End Performance Under Spectrum-Sharing Requirements

    KAUST Repository

    Yuli Yang

    2014-10-01

    In this paper, we propose a partial quality-of-service (QoS)-oriented relay selection scheme with a decode-and-forward (DF) relaying protocol, to reduce the feedback amount required for relay selection. In the proposed scheme, the activated relay is the one with the maximum signal-to-noise power ratio (SNR) in the second hop among those whose packet loss rates (PLRs) in the first hop achieve a predetermined QoS level. For the purpose of evaluating the performance of the proposed scheme, we exploit it with transmission constraints imposed on the transmit power budget and interference to other users. By analyzing the statistics of received SNRs in the first and second hops, we obtain the end-to-end PLR of this scheme in closed form under the considered scenario. Moreover, to compare the proposed scheme with popular relay selection schemes, we also derive the closed-form PLR expressions for partial relay selection (PRS) and opportunistic relay selection (ORS) criteria in the same scenario under study. Illustrative numerical results demonstrate the accuracy of our derivations and substantiate that the proposed relay selection scheme is a promising alternative with respect to the tradeoff between performance and complexity.
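
    The selection rule described above is simple to state: filter relays by the first-hop PLR-based QoS constraint, then pick the best second-hop SNR among the survivors, so only qualified relays ever need to feed back second-hop channel state. A sketch (relay names and channel numbers are hypothetical):

```python
def select_relay(relays, plr_threshold):
    """Partial QoS-aware opportunistic relay selection (sketch).

    Each relay is (name, first_hop_plr, second_hop_snr_db). Among relays
    whose first-hop packet loss rate meets the QoS threshold, return the
    one with the largest second-hop SNR; None if no relay qualifies.
    """
    qualified = [r for r in relays if r[1] <= plr_threshold]
    if not qualified:
        return None
    return max(qualified, key=lambda r: r[2])
```

Note the trade-off the paper analyzes: a relay with the globally best second-hop SNR may be excluded by the QoS filter, which reduces feedback overhead at some cost in end-to-end PLR.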

  12. Distributed Large Data-Object Environments: End-to-End Performance Analysis of High Speed Distributed Storage Systems in Wide Area ATM Networks

    Science.gov (United States)

    Johnston, William; Tierney, Brian; Lee, Jason; Hoo, Gary; Thompson, Mary

    1996-01-01

    We have developed and deployed a distributed-parallel storage system (DPSS) in several high speed asynchronous transfer mode (ATM) wide area networks (WAN) testbeds to support several different types of data-intensive applications. Architecturally, the DPSS is a network striped disk array, but is fairly unique in that its implementation allows applications complete freedom to determine optimal data layout, replication and/or coding redundancy strategy, security policy, and dynamic reconfiguration. In conjunction with the DPSS, we have developed a 'top-to-bottom, end-to-end' performance monitoring and analysis methodology that has allowed us to characterize all aspects of the DPSS operating in high speed ATM networks. In particular, we have run a variety of performance monitoring experiments involving the DPSS in the MAGIC testbed, which is a large scale, high speed, ATM network and we describe our experience using the monitoring methodology to identify and correct problems that limit the performance of high speed distributed applications. Finally, the DPSS is part of an overall architecture for using high speed, WAN's for enabling the routine, location independent use of large data-objects. Since this is part of the motivation for a distributed storage system, we describe this architecture.
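
    The "network striped disk array" idea can be illustrated with a round-robin block-to-server map. The DPSS itself deliberately leaves layout policy to the application, so the fixed mapping below is just one assumed strategy:

```python
def stripe_layout(object_size, block_size, n_servers):
    """Map each block of a large data object to a server, round-robin.

    Block i of the object lands on server i mod n_servers, so a
    sequential read of the object is served by all servers in parallel.
    Returns a list of (block_index, server_index) pairs.
    """
    n_blocks = -(-object_size // block_size)  # ceiling division
    return [(i, i % n_servers) for i in range(n_blocks)]
```

With this layout, aggregate read bandwidth scales with the number of servers until the client's network link saturates, which is exactly the regime the end-to-end monitoring methodology is designed to diagnose.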

  13. Determination of anions using monolithic capillary column ion chromatography with end-to-end differential contactless conductometric detectors under resonance approach.

    Science.gov (United States)

    Zhang, Zhenli; Li, Dongdong; Liu, Xueyong; Subhani, Qamar; Zhu, Yan; Kang, Qi; Shen, Dazhong

    2012-06-21

    An end-to-end differential measurement approach with capacitively coupled contactless conductivity detection (C(4)D) was applied to anion-exchange monolithic capillary column ion chromatography. The column was prepared by thermally initiated radical polymerization of poly(glycidyl methacrylate) in a fused-silica capillary of 320 μm i.d. and modified by quaternary ammonium latex surface coating. Two C(4)Ds were placed near both ends of the capillary column and the output difference between them was measured. With 15 mM potassium hydrogen phthalate used as the eluent, good separation of a mixture of inorganic anions (F(-), Cl(-), NO(2)(-), NO(3)(-)) was achieved. The detection limits of conventional C(4)D are 1.6, 0.28, 0.53, and 0.47 mg L(-1) for F(-), Cl(-), NO(2)(-), and NO(3)(-), respectively. To further enhance the sensitivity, the capacitive impedance from C(4)D was neutralized by an inductive impedance from a piezoelectric resonator. An increase in sensitivity by a factor of 7-8 was achieved in the resonating C(4)D in comparison with the conventional C(4)D. The detection limits of the resonating C(4)D are 0.23, 0.041, 0.065, and 0.059 mg L(-1) for F(-), Cl(-), NO(2)(-), and NO(3)(-), respectively. The response of the resonating C(4)D was analyzed based on an equivalent circuit model.

  14. An end-to-end examination of geometric accuracy of IGRT using a new digital accelerator equipped with onboard imaging system.

    Science.gov (United States)

    Wang, Lei; Kielar, Kayla N; Mok, Ed; Hsu, Annie; Dieterich, Sonja; Xing, Lei

    2012-02-01

    Varian's new digital linear accelerator (LINAC), TrueBeam STx, is equipped with a high dose rate flattening filter free (FFF) mode (6 MV and 10 MV), a high definition multileaf collimator (2.5 mm leaf width), as well as onboard imaging capabilities. A series of end-to-end phantom tests of TrueBeam-based image guided radiation therapy (IGRT) were performed to determine the geometric accuracy of the image-guided setup and dose delivery process for all beam modalities delivered using intensity modulated radiation therapy (IMRT) and RapidArc. In these tests, an anthropomorphic phantom with a Ball Cube II insert and the analysis software (FilmQA (3cognition)) were used to evaluate the accuracy of TrueBeam image-guided setup and dose delivery. Laser-cut EBT2 films with 0.15 mm accuracy were embedded into the phantom. The phantom with the film inserted was first scanned with a GE Discovery-ST CT scanner, and the images were then imported to the planning system. Plans with steep dose fall-off surrounding hypothetical targets of different sizes were created using RapidArc and IMRT with FFF and WFF (with flattening filter) beams. Four RapidArc plans (6 MV and 10 MV FFF) and five IMRT plans (6 MV and 10 MV FFF; 6 MV, 10 MV and 15 MV WFF) were studied. The RapidArc plans with 6 MV FFF were planned with target diameters of 1 cm (0.52 cc), 2 cm (4.2 cc) and 3 cm (14.1 cc), and all other plans with a target diameter of 3 cm. Both onboard planar and volumetric imaging procedures were used for phantom setup and target localization. The IMRT and RapidArc plans were then delivered, and the film measurements were compared with the original treatment plans using gamma criteria of 3%/1 mm and 3%/2 mm. The shifts required to align the film-measured dose with the calculated dose distributions were taken as the targeting error. Targeting accuracy of image-guided treatment using TrueBeam was found to be within 1 mm. 
For irradiation of the 3 cm target, the gammas (3%, 1
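The film-to-plan comparison above uses the gamma index, which combines a dose-difference tolerance with a distance-to-agreement tolerance. A minimal 1D sketch of a global 3%/1 mm gamma computation follows; it is illustrative only, not the FilmQA (3cognition) implementation, and the dose profiles are synthetic.

```python
import numpy as np

def gamma_1d(ref_dose, eval_dose, x, dose_tol=0.03, dist_tol=1.0):
    """Minimal 1D gamma index (global 3%/1 mm by default).

    ref_dose, eval_dose: dose profiles sampled at positions x (mm).
    A reference point passes when its gamma value is <= 1.
    """
    d_max = ref_dose.max()                           # global dose normalisation
    gammas = np.empty_like(ref_dose, dtype=float)
    for i, (xr, dr) in enumerate(zip(x, ref_dose)):
        dd = (eval_dose - dr) / (dose_tol * d_max)   # dose-difference term
        dx = (x - xr) / dist_tol                     # distance-to-agreement term
        gammas[i] = np.sqrt(dd**2 + dx**2).min()
    return gammas

# toy profiles: evaluation shifted by 0.5 mm relative to reference
x = np.arange(0.0, 50.0, 0.5)
ref = np.exp(-((x - 25.0) / 8.0) ** 2)
ev = np.exp(-((x - 25.5) / 8.0) ** 2)
g = gamma_1d(ref, ev, x)
print((g <= 1.0).mean())   # fraction of points passing 3%/1 mm
```

A 0.5 mm shift sits inside the 1 mm distance tolerance, so all points pass in this toy case.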

  15. Design of a satellite end-to-end mission performance simulator for imaging spectrometers and its application to the ESA's FLEX/Sentinel-3 tandem mission

    Science.gov (United States)

    Vicent, Jorge; Sabater, Neus; Tenjo, Carolina; Acarreta, Juan R.; Manzano, María.; Rivera, Juan P.; Jurado, Pedro; Franco, Raffaella; Alonso, Luis; Moreno, Jose

    2015-09-01

    The performance analysis of a satellite mission requires specific tools that can simulate the behavior of the platform, its payload, and the acquisition of scientific data from synthetic scenes. These software tools, called End-to-End Mission Performance Simulators (E2ES), are promoted by the European Space Agency (ESA) with the goal of consolidating the instrument and mission requirements as well as optimizing the implemented data processing algorithms. Nevertheless, most developed E2ES are designed for a specific satellite mission and can hardly be adapted to other satellite missions. In the frame of ESA's FLEX mission activities, an E2ES is being developed based on a generic architecture for passive optical missions. FLEX E2ES implements a state-of-the-art synthetic scene generator that is coupled with dedicated algorithms that model the platform and instrument characteristics. This work will describe the flexibility of the FLEX E2ES to simulate complex synthetic scenes with a variety of land cover classes, topography and cloud cover that are observed separately by each instrument (FLORIS, OLCI and SLSTR). The implemented algorithms allow modelling of the sensor behavior, i.e. the spectral/spatial resampling of the input scene, the geometry of acquisition, the sensor noises and non-uniformity effects (e.g. stray light, spectral smile and radiometric noise), and the full retrieval scheme up to Level-2 products. It is expected that the design methodology implemented in FLEX E2ES can be used as a baseline for other imaging spectrometer missions and will be further expanded towards a generic E2ES software tool.
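One of the sensor-modelling steps named above, spectral resampling of the input scene, can be sketched by weighting a fine-grid scene spectrum with each band's spectral response function (SRF). The band centres, FWHM, and scene spectrum below are assumptions for illustration, not FLORIS/OLCI/SLSTR values.

```python
import numpy as np

wl = np.arange(500.0, 800.0, 0.1)            # scene wavelength grid (nm)
radiance = 100.0 + 20.0 * np.sin(wl / 15.0)  # synthetic scene spectrum

centers = np.arange(510.0, 790.0, 10.0)      # assumed band centres (nm)
fwhm = 10.0                                  # assumed band FWHM (nm)
sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))

def resample(wl, spec, centers, sigma):
    """Weight the scene spectrum by each band's normalised Gaussian SRF."""
    out = []
    for c in centers:
        srf = np.exp(-0.5 * ((wl - c) / sigma) ** 2)
        out.append((srf * spec).sum() / srf.sum())  # uniform grid: sums suffice
    return np.array(out)

bands = resample(wl, radiance, centers, sigma)
print(bands.shape)   # one band-averaged radiance per instrument band
```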

  16. Automated segmentation of 3D anatomical structures on CT images by using a deep convolutional network based on end-to-end learning approach

    Science.gov (United States)

    Zhou, Xiangrong; Takayama, Ryosuke; Wang, Song; Zhou, Xinxin; Hara, Takeshi; Fujita, Hiroshi

    2017-02-01

    We have proposed an end-to-end learning approach that trains a deep convolutional neural network (CNN) for automatic CT image segmentation, accomplishing a voxel-wise multi-class classification that directly maps each voxel on 3D CT images to an anatomical label. The novelties of our proposed method were (1) transforming the segmentation of anatomical structures on 3D CT images into a majority voting of the results of 2D semantic image segmentation on a number of 2D slices from different image orientations, and (2) using "convolution" and "deconvolution" networks to achieve the conventional "coarse recognition" and "fine extraction" functions, which were integrated into a compact all-in-one deep CNN for CT image segmentation. The advantage compared to previous work was its capability to accomplish real-time image segmentation on 2D slices of arbitrary CT scan range (e.g. body, chest, abdomen) and to produce correspondingly sized output. In this paper, we propose an improvement of our approach by adding an organ localization module to limit the CT image range for training and testing the deep CNNs. A database consisting of 240 3D CT scans and human-annotated ground truth was used for training (228 cases) and testing (the remaining 12 cases). We applied the improved method to segment the pancreas and left kidney regions, respectively. The preliminary results showed that the accuracy of the segmentation results improved significantly (the Jaccard index increased by 34% for the pancreas and 8% for the kidney relative to our previous results). The effectiveness and usefulness of the proposed improvement for CT image segmentation were confirmed.
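The fusion step described in novelty (1), combining 2D segmentations from different image orientations into one 3D label volume by per-voxel majority voting, can be sketched as follows. The label volumes here are synthetic stand-ins for re-stacked 2D CNN outputs.

```python
import numpy as np

rng = np.random.default_rng(0)
n_labels = 4
vol_shape = (8, 8, 8)

# Pretend each orientation (axial/coronal/sagittal) produced a full label
# volume after slicing, 2D segmentation, and re-stacking.
axial, coronal, sagittal = (rng.integers(0, n_labels, vol_shape) for _ in range(3))

def majority_vote(*label_vols):
    """Per-voxel majority vote over candidate label volumes."""
    stacked = np.stack(label_vols)                        # (n_views, z, y, x)
    counts = np.stack([(stacked == k).sum(axis=0) for k in range(n_labels)])
    return counts.argmax(axis=0)                          # winning label per voxel

fused = majority_vote(axial, coronal, sagittal)
print(fused.shape)
```

Where at least two of the three orientations agree, the fused voxel takes that label; ties fall back to the lowest label index in this simple sketch.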

  17. Assessing the value of seasonal climate forecast information through an end-to-end forecasting framework: Application to U.S. 2012 drought in central Illinois

    Science.gov (United States)

    Shafiee-Jood, Majid; Cai, Ximing; Chen, Ligang; Liang, Xin-Zhong; Kumar, Praveen

    2014-08-01

    This study proposes an end-to-end forecasting framework that incorporates operational seasonal climate forecasts to help farmers improve decisions that are made prior to the crop growth season and are vulnerable to unanticipated drought conditions. The framework couples a crop growth model with a decision-making model for rainfed agriculture and translates probabilistic seasonal forecasts into more user-related information that can be used to support farmers' decisions on crop type and some market choices (e.g., contracts with an ethanol refinery). The regional Climate-Weather Research and Forecasting model (CWRF) driven by two operational general circulation models (GCMs) is used to provide the seasonal forecasts of weather parameters. To better assess the developed framework, CWRF is also driven by observational reanalysis data, which theoretically can be considered the best seasonal forecast. The proposed framework is applied to the Salt Creek watershed in Illinois, which experienced an extreme drought event during the 2012 crop growth season. The results show that the forecasts cannot capture the 2012 drought condition in Salt Creek and therefore the suggested decisions can make farmers worse off if adopted. Alternatively, the optimal decisions based on reanalysis-based CWRF forecasts, which can capture the 2012 drought conditions, make farmers better off by suggesting "no contract" with ethanol refineries. This study suggests that the conventional metric used for ex ante value assessment is not capable of providing meaningful information in the case of extreme drought. Also, it is observed that institutional interventions (e.g., crop insurance) strongly influence farmers' decisions and, thereby, the assessment of forecast value.

  18. OpenCyto: an open source infrastructure for scalable, robust, reproducible, and automated, end-to-end flow cytometry data analysis.

    Science.gov (United States)

    Finak, Greg; Frelinger, Jacob; Jiang, Wenxin; Newell, Evan W; Ramey, John; Davis, Mark M; Kalams, Spyros A; De Rosa, Stephen C; Gottardo, Raphael

    2014-08-01

    Flow cytometry is used increasingly in clinical research for cancer, immunology and vaccines. Technological advances in cytometry instrumentation are increasing the size and dimensionality of data sets, posing a challenge for traditional data management and analysis. Automated analysis methods, despite a general consensus of their importance to the future of the field, have been slow to gain widespread adoption. Here we present OpenCyto, a new BioConductor infrastructure and data analysis framework designed to lower the barrier of entry to automated flow data analysis algorithms by addressing key areas that we believe have held back wider adoption of automated approaches. OpenCyto supports end-to-end data analysis that is robust and reproducible while generating results that are easy to interpret. We have improved the existing, widely used core BioConductor flow cytometry infrastructure by allowing analysis to scale in a memory efficient manner to the large flow data sets that arise in clinical trials, and integrating domain-specific knowledge as part of the pipeline through the hierarchical relationships among cell populations. Pipelines are defined through a text-based csv file, limiting the need to write data-specific code, and are data agnostic to simplify repetitive analysis for core facilities. We demonstrate how to analyze two large cytometry data sets: an intracellular cytokine staining (ICS) data set from a published HIV vaccine trial focused on detecting rare, antigen-specific T-cell populations, where we identify a new subset of CD8 T-cells with a vaccine-regimen specific response that could not be identified through manual analysis, and a CyTOF T-cell phenotyping data set where a large staining panel and many cell populations are a challenge for traditional analysis. The substantial improvements to the core BioConductor flow cytometry packages give OpenCyto the potential for wide adoption. It can rapidly leverage new developments in computational

  19. Stereological and biochemical analysis of the urethral edges in patients submitted to end-to-end anastomosis for bulbar urethral stricture

    Directory of Open Access Journals (Sweden)

    Joao P. M. de Carvalho

    2012-10-01

    Full Text Available PURPOSE: To study the morphologic alterations in the proximal and distal urethral edges from patients submitted to end-to-end bulbar urethroplasty. MATERIALS AND METHODS: We analyzed 12 patients submitted to anastomotic urethroplasty to treat bulbar strictures less than 2.0 cm in length. After excision of the fibrotic segment to a 28Fr urethral caliber, we obtained biopsies from the spongious tissue of the free edges (proximal: PROX; distal: DIST). Controls included normal bulbar urethras obtained from autopsies of 10 age-matched individuals. The samples were histologically processed for smooth muscle cells (SMC), elastic system fibers and collagen. Stereological analysis was performed to determine the volumetric density (Vv) of each element. Also, a biochemical analysis was performed to quantify the total collagen content. RESULTS: Vv of SMC was reduced in PROX (31.48 ± 7.01%; p < 0.05) and similar to controls (55.65 ± 9.60%) in DIST, with no statistical difference. Elastic fibers were increased in PROX (25.70 ± 3.21%; p < 0.05) and were similar to controls in DIST (15.87 ± 4.26%). Total collagen concentration in PROX (46.39 ± 8.20 μg/mg) and DIST (47.96 ± 9.42 μg/mg) did not differ from controls (48.85 ± 6.91 μg/mg). Type III collagen was similarly present in all samples. CONCLUSIONS: After excision of the stenotic segment to a caliber of 28Fr, the exposed and macroscopically normal urethral edges may present altered amounts of elastic fibers and SMC, but are free from fibrotic tissue. When excising the peri-stenotic tissue, the surgeon should be more careful with the proximal end, which is the most altered.

  20. Enzymatic reaction modulated gold nanorod end-to-end self-assembly for ultrahigh sensitively colorimetric sensing of cholinesterase and organophosphate pesticides in human blood.

    Science.gov (United States)

    Lu, Linlin; Xia, Yunsheng

    2015-08-18

    We present herein the first reported self-assembly modulation of gold nanorods (AuNRs) by enzymatic reaction, which is further employed for colorimetric assays of cholinesterase (ChE) and organophosphate pesticides (OPs) in human blood. ChE catalyzes its substrate (acetylthiocholine) and produces thiocholine and acetic acid. The resulting thiols then react with the tips of the AuNRs by S-Au conjunction and prevent subsequent cysteine-induced AuNR end-to-end (EE) self-assembly. Correspondingly, the AuNR surface plasmon resonance is regulated, which results in a distinctly ratiometric signal output. Under optimal conditions, the linear range is 0.042 to 8.4 μU/mL, and the detection limit is as low as 0.018 μU/mL. As ChE is incubated with OPs, the enzymatic activity is inhibited. So, the cysteine-induced assembly is observed again. On the basis of this principle, OPs can be well determined ranging from 0.12 to 40 pM with a 0.039 pM detection limit. To our knowledge, the present quasi pU/mL level sensitivity for ChE and the quasi femtomolar level sensitivity for OPs are at least 500 and 7000 times lower than those of previous colorimetric methods, respectively. The ultrahigh sensitivity results from (1) the rational choice of anisotropic AuNRs as building blocks and reporters and (2) the specific structure of the enzymatic thiocholine. Because of ultrahigh sensitivity, serum samples are allowed to be extremely diluted in the assay. Accordingly, various nonspecific interactions, even from glutathione/cysteine, are well avoided. So, both ChE and OPs in human blood can be directly assayed without any prepurification, indicating the simplicity and practical promise of the proposed method.
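Detection limits like those quoted above are conventionally estimated from a calibration line. A generic sketch using the common 3σ/slope rule follows; the calibration data and blank noise are synthetic assumptions, not the paper's AuNR measurements.

```python
import numpy as np

# Synthetic ratiometric calibration: signal vs. ChE activity (assumed units).
conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])     # activity (uU/mL), assumed
signal = (0.12 * conc + 0.05
          + np.array([0.001, -0.002, 0.002, -0.001, 0.001, -0.001]))  # noise

slope, intercept = np.polyfit(conc, signal, 1)        # linear calibration fit

blank_sd = 0.0008        # std. dev. of repeated blank measurements, assumed
lod = 3.0 * blank_sd / slope   # 3*sigma/slope detection-limit estimate
print(round(lod, 4))     # detection limit, same units as conc
```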

  1. The Hurricane-Flood-Landslide Continuum: An Integrated, End-to-end Forecast and Warning System for Mountainous Islands in the Tropics

    Science.gov (United States)

    Golden, J.; Updike, R. G.; Verdin, J. P.; Larsen, M. C.; Negri, A. J.; McGinley, J. A.

    2004-12-01

    In the 10 days of 21-30 September 1998, Hurricane Georges left a trail of destruction in the Caribbean region and U.S. Gulf Coast. Subsequently, in the same year, Hurricane Mitch caused widespread destruction and loss of life in four Central American nations, and in December 1999 a tropical disturbance impacted the north coast of Venezuela, causing hundreds of deaths and several million dollars of property loss. More recently, an off-season disturbance in the Central Caribbean dumped nearly 250 mm of rainfall over Hispaniola during the 24-hr period on May 23, 2004. Resultant flash floods and debris flows in the Dominican Republic and Haiti killed at least 1400 people. In each instance, the tropical system served as the catalyst for major flooding and landslides at landfall. Our goal is to develop and transfer an end-to-end warning system for a prototype region in the Central Caribbean, specifically the islands of Puerto Rico and Hispaniola, which experience frequent tropical cyclones and other disturbances. The envisioned system would include satellite and surface-based observations to track and nowcast dangerous levels of precipitation, atmospheric and hydrological models to predict short-term runoff and streamflow changes, geological models to warn when and where landslides and debris flows are imminent, and the capability to communicate forecast guidance products via satellite to vital government offices in Puerto Rico, Haiti, and the Dominican Republic. In this paper, we shall present a preliminary proof-of-concept study for the May 21-24, 2004 floods and debris-flows over Hispaniola to show that the envisaged flow of data, models and graphical products can produce the desired warning outputs. 
The multidisciplinary research and technology transfer effort will require blending the talents of hydrometeorologists, geologists, remote sensing and GIS experts, and social scientists to ensure timely delivery of tailored graphical products to both weather offices and local

  2. Combined fishing and climate forcing in the southern Benguela upwelling ecosystem: an end-to-end modelling approach reveals dampened effects.

    Directory of Open Access Journals (Sweden)

    Morgane Travers-Trolet

    Full Text Available The effects of climate and fishing on marine ecosystems have usually been studied separately, but their interactions make ecosystem dynamics difficult to understand and predict. Of particular interest to management, the potential synergism or antagonism between fishing pressure and climate forcing is analysed in this paper, using an end-to-end ecosystem model of the southern Benguela ecosystem, built from coupling hydrodynamic, biogeochemical and multispecies fish models (ROMS-N2P2Z2D2-OSMOSE). Scenarios of different intensities of upwelling-favourable wind stress combined with scenarios of fishing top-predator fish were tested. Analyses of isolated drivers show that the bottom-up effect of the climate forcing propagates up the food chain whereas the top-down effect of fishing cascades down to zooplankton in unfavourable environmental conditions but dampens before it reaches phytoplankton. When considering both climate and fishing drivers together, it appears that top-down control dominates the link between top-predator fish and forage fish, whereas interactions between the lower trophic levels are dominated by bottom-up control. The forage fish functional group appears to be a central component of this ecosystem, being the meeting point of two opposite trophic controls. The set of combined scenarios shows that fishing pressure and upwelling-favourable wind stress have mostly dampened effects on fish populations, compared to predictions from the separate effects of the stressors. Dampened effects result in biomass accumulation at the top predator fish level but a depletion of biomass at the forage fish level. This should draw our attention to the evolution of this functional group, which appears as both structurally important in the trophic functioning of the ecosystem, and very sensitive to climate and fishing pressures. 
In particular, diagnoses considering fishing pressure only might be more optimistic than those that consider combined effects

  3. Combined fishing and climate forcing in the southern Benguela upwelling ecosystem: an end-to-end modelling approach reveals dampened effects.

    Science.gov (United States)

    Travers-Trolet, Morgane; Shin, Yunne-Jai; Shannon, Lynne J; Moloney, Coleen L; Field, John G

    2014-01-01

    The effects of climate and fishing on marine ecosystems have usually been studied separately, but their interactions make ecosystem dynamics difficult to understand and predict. Of particular interest to management, the potential synergism or antagonism between fishing pressure and climate forcing is analysed in this paper, using an end-to-end ecosystem model of the southern Benguela ecosystem, built from coupling hydrodynamic, biogeochemical and multispecies fish models (ROMS-N2P2Z2D2-OSMOSE). Scenarios of different intensities of upwelling-favourable wind stress combined with scenarios of fishing top-predator fish were tested. Analyses of isolated drivers show that the bottom-up effect of the climate forcing propagates up the food chain whereas the top-down effect of fishing cascades down to zooplankton in unfavourable environmental conditions but dampens before it reaches phytoplankton. When considering both climate and fishing drivers together, it appears that top-down control dominates the link between top-predator fish and forage fish, whereas interactions between the lower trophic levels are dominated by bottom-up control. The forage fish functional group appears to be a central component of this ecosystem, being the meeting point of two opposite trophic controls. The set of combined scenarios shows that fishing pressure and upwelling-favourable wind stress have mostly dampened effects on fish populations, compared to predictions from the separate effects of the stressors. Dampened effects result in biomass accumulation at the top predator fish level but a depletion of biomass at the forage fish level. This should draw our attention to the evolution of this functional group, which appears as both structurally important in the trophic functioning of the ecosystem, and very sensitive to climate and fishing pressures. In particular, diagnoses considering fishing pressure only might be more optimistic than those that consider combined effects of fishing and

  4. SU-E-J-25: End-To-End (E2E) Testing On TomoHDA System Using a Real Pig Head for Intracranial Radiosurgery

    Energy Technology Data Exchange (ETDEWEB)

    Corradini, N; Leick, M; Bonetti, M; Negretti, L [Clinica Luganese, Radiotherapy Center, Lugano (Switzerland)

    2015-06-15

    Purpose: To determine the MVCT imaging uncertainty on the TomoHDA system for intracranial radiosurgery treatments. To determine the end-to-end (E2E) overall accuracy of the TomoHDA system for intracranial radiosurgery. Methods: A pig head was obtained from the butcher, cut coronally through the brain, and preserved in formaldehyde. The base of the head was fixed to a positioning plate allowing precise movement, i.e. translation and rotation, in all 6 axes. A repeatability test was performed on the pig head to determine uncertainty in the image bone registration algorithm. Furthermore, the test studied images with MVCT slice thicknesses of 1 and 3 mm in combination with differing scan lengths. A sensitivity test was performed to determine the registration algorithm’s ability to find the absolute position of known translations/rotations of the pig head. The algorithm’s ability to determine absolute position was compared against that of manual operators, i.e. a radiation therapist and radiation oncologist. Finally, E2E tests for intracranial radiosurgery were performed by measuring the delivered dose distributions within the pig head using Gafchromic films. Results: The repeatability test uncertainty was lowest for the MVCTs of 1-mm slice thickness, which measured less than 0.10 mm and 0.12 deg for all axes. For the sensitivity tests, the bone registration algorithm performed better than human eyes and a maximum difference of 0.3 mm and 0.4 deg was observed for the axes. E2E test results showed absolute position differences of 0.03 ± 0.21 mm in the x-axis and 0.28 ± 0.18 mm in the y-axis. A maximum difference of 0.32 and 0.66 mm was observed in x and y, respectively. The average peak dose difference between measured and calculated dose was 2.7 cGy or 0.4%. Conclusion: Our tests using a pig head phantom estimate the TomoHDA system to have a submillimeter overall accuracy for intracranial radiosurgery.

  5. Greenhouse gas profiling by infrared-laser and microwave occultation: retrieval algorithm and demonstration results from end-to-end simulations

    Directory of Open Access Journals (Sweden)

    V. Proschek

    2011-04-01

    Full Text Available Measuring greenhouse gas (GHG) profiles with global coverage and high accuracy and vertical resolution in the upper troposphere and lower stratosphere (UTLS) is key for improved monitoring of GHG concentrations in the free atmosphere. In this respect a new satellite mission concept adding an infrared-laser part to the already well studied microwave occultation technique exploits the joint propagation of infrared-laser and microwave signals between Low Earth Orbit (LEO) satellites. This synergetic combination, referred to as the LEO-LEO microwave and infrared-laser occultation (LMIO) method, enables retrieval of thermodynamic profiles (pressure, temperature, humidity) and accurate altitude levels from the microwave signals and GHG profiles from the simultaneously measured infrared-laser signals. However, due to the novelty of the LMIO method, a retrieval algorithm for GHG profiling did not yet exist. Here we introduce such an algorithm for retrieving GHGs from LEO-LEO infrared-laser occultation (LIO) data, applied as a second step after retrieving thermodynamic profiles from LEO-LEO microwave occultation (LMO) data as recently introduced in detail by Schweitzer et al. (2011b). We thoroughly describe the LIO retrieval algorithm and unveil the synergy with the LMO-retrieved pressure, temperature, and altitude information. We furthermore demonstrate the effective independence of the GHG retrieval results from background (a priori) information in discussing demonstration results from LMIO end-to-end simulations for a representative set of GHG profiles, including carbon dioxide (CO2), water vapor (H2O), methane (CH4), and ozone (O3). The GHGs except for ozone are well retrieved throughout the UTLS, while ozone is well retrieved from 10 km to 15 km upwards, since the ozone layer resides in the lower stratosphere. The GHG retrieval errors are generally smaller than 1% to 3% r.m.s., at a vertical resolution of about 1 km. The

  6. Greenhouse gas profiling by infrared-laser and microwave occultation: retrieval algorithm and demonstration results from end-to-end simulations

    Directory of Open Access Journals (Sweden)

    V. Proschek

    2011-10-01

    Full Text Available Measuring greenhouse gas (GHG) profiles with global coverage and high accuracy and vertical resolution in the upper troposphere and lower stratosphere (UTLS) is key for improved monitoring of GHG concentrations in the free atmosphere. In this respect a new satellite mission concept adding an infrared-laser part to the already well studied microwave occultation technique exploits the joint propagation of infrared-laser and microwave signals between Low Earth Orbit (LEO) satellites. This synergetic combination, referred to as the LEO-LEO microwave and infrared-laser occultation (LMIO) method, enables retrieval of thermodynamic profiles (pressure, temperature, humidity) and accurate altitude levels from the microwave signals and GHG profiles from the simultaneously measured infrared-laser signals. However, due to the novelty of the LMIO method, a retrieval algorithm for GHG profiling is not yet available. Here we introduce such an algorithm for retrieving GHGs from LEO-LEO infrared-laser occultation (LIO) data, applied as a second step after retrieving thermodynamic profiles from LEO-LEO microwave occultation (LMO) data. We thoroughly describe the LIO retrieval algorithm and unveil the synergy with the LMO-retrieved pressure, temperature, and altitude information. We furthermore demonstrate the effective independence of the GHG retrieval results from background (a priori) information in discussing demonstration results from LMIO end-to-end simulations for a representative set of GHG profiles, including carbon dioxide (CO2), water vapor (H2O), methane (CH4), and ozone (O3). The GHGs except for ozone are well retrieved throughout the UTLS, while ozone is well retrieved from about 10 km to 15 km upwards, since the ozone layer resides in the lower stratosphere. The GHG retrieval errors are generally smaller than 1% to 3% r.m.s., at a vertical resolution of about 1 km. The retrieved profiles also appear unbiased, which points

  7. End-to-end available bandwidth measurement methodology

    Institute of Scientific and Technical Information of China (English)

    石祥滨; 谭俏男; 杜玲

    2011-01-01

    The measurement of end-to-end available bandwidth has wide application in server selection, overlay network routing, and network traffic engineering. By analyzing the relationship between the probing rate and the one-way delay (OWD), this paper presents PFAB (Polynomial Fitting for Available Bandwidth), a measurement method based on polynomial fitting. PFAB sends probe packets at a gradually decreasing probing rate and monitors the change in OWD to infer the relationship between the probing rate and the available bandwidth: while the probing rate is larger than the available bandwidth, the OWD keeps increasing; when the probing rate equals the available bandwidth, the OWD reaches its maximum and then begins to decrease. NS-2 simulation results show that this methodology reduces the number of probe packets, shortens the probing time, and causes little interference to the network; when the instantaneous probing rate changes sharply, it can quickly locate the extreme point and reduce measurement error.
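The PFAB principle, fitting a polynomial to OWD versus probing rate and taking the rate at the fitted maximum as the available-bandwidth estimate, can be sketched as follows. The OWD data are synthetic, generated under an assumed available bandwidth of 40 Mbps; this is an illustration of the idea, not the paper's NS-2 setup.

```python
import numpy as np

# While the probing rate exceeds the available bandwidth, queues build and
# OWD rises; once the (decreasing) rate falls below it, queues drain and
# OWD falls.  The OWD therefore peaks where rate == available bandwidth.
true_abw = 40.0                                   # Mbps, assumed ground truth
rates = np.linspace(80.0, 10.0, 30)               # decreasing probe rates (Mbps)
owd = np.where(rates > true_abw,
               5.0 + 0.2 * (80.0 - rates),                    # queue building
               5.0 + 0.2 * 40.0 - 0.3 * (true_abw - rates))   # queue draining
owd += np.random.default_rng(1).normal(0.0, 0.1, rates.size)  # measurement noise

coeffs = np.polyfit(rates, owd, 3)                # low-order polynomial fit
fine = np.linspace(rates.min(), rates.max(), 1000)
est_abw = fine[np.polyval(coeffs, fine).argmax()] # rate at the fitted OWD maximum
print(est_abw)                                    # estimate near 40 Mbps
```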

  8. OpenCyto: an open source infrastructure for scalable, robust, reproducible, and automated, end-to-end flow cytometry data analysis.

    Directory of Open Access Journals (Sweden)

    Greg Finak

    2014-08-01

    Full Text Available Flow cytometry is used increasingly in clinical research for cancer, immunology and vaccines. Technological advances in cytometry instrumentation are increasing the size and dimensionality of data sets, posing a challenge for traditional data management and analysis. Automated analysis methods, despite a general consensus of their importance to the future of the field, have been slow to gain widespread adoption. Here we present OpenCyto, a new BioConductor infrastructure and data analysis framework designed to lower the barrier of entry to automated flow data analysis algorithms by addressing key areas that we believe have held back wider adoption of automated approaches. OpenCyto supports end-to-end data analysis that is robust and reproducible while generating results that are easy to interpret. We have improved the existing, widely used core BioConductor flow cytometry infrastructure by allowing analysis to scale in a memory efficient manner to the large flow data sets that arise in clinical trials, and integrating domain-specific knowledge as part of the pipeline through the hierarchical relationships among cell populations. Pipelines are defined through a text-based csv file, limiting the need to write data-specific code, and are data agnostic to simplify repetitive analysis for core facilities. We demonstrate how to analyze two large cytometry data sets: an intracellular cytokine staining (ICS) data set from a published HIV vaccine trial focused on detecting rare, antigen-specific T-cell populations, where we identify a new subset of CD8 T-cells with a vaccine-regimen specific response that could not be identified through manual analysis, and a CyTOF T-cell phenotyping data set where a large staining panel and many cell populations are a challenge for traditional analysis. The substantial improvements to the core BioConductor flow cytometry packages give OpenCyto the potential for wide adoption. 
It can rapidly leverage new developments in

  9. Two Dimensional Yau-Hausdorff Distance with Applications on Comparison of DNA and Protein Sequences.

    Directory of Open Access Journals (Sweden)

    Kun Tian

    Full Text Available Comparing DNA or protein sequences plays an important role in the functional analysis of genomes. Despite many methods available for sequences comparison, few methods retain the information content of sequences. We propose a new approach, the Yau-Hausdorff method, which considers all translations and rotations when seeking the best match of graphical curves of DNA or protein sequences. The complexity of this method is lower than that of any other two dimensional minimum Hausdorff algorithm. The Yau-Hausdorff method can be used for measuring the similarity of DNA sequences based on two important tools: the Yau-Hausdorff distance and graphical representation of DNA sequences. The graphical representations of DNA sequences conserve all sequence information and the Yau-Hausdorff distance is mathematically proved as a true metric. Therefore, the proposed distance can precisely measure the similarity of DNA sequences. The phylogenetic analyses of DNA sequences by the Yau-Hausdorff distance show the accuracy and stability of our approach in similarity comparison of DNA or protein sequences. This study demonstrates that Yau-Hausdorff distance is a natural metric for DNA and protein sequences with high level of stability. The approach can also be applied to similarity analysis of protein sequences by graphic representations, as well as general two dimensional shape matching.

  10. Two Dimensional Yau-Hausdorff Distance with Applications on Comparison of DNA and Protein Sequences.

    Science.gov (United States)

    Tian, Kun; Yang, Xiaoqian; Kong, Qin; Yin, Changchuan; He, Rong L; Yau, Stephen S-T

    2015-01-01

    Comparing DNA or protein sequences plays an important role in the functional analysis of genomes. Despite the many methods available for sequence comparison, few retain the information content of sequences. We propose a new approach, the Yau-Hausdorff method, which considers all translations and rotations when seeking the best match of graphical curves of DNA or protein sequences. The complexity of this method is lower than that of any other two dimensional minimum Hausdorff algorithm. The Yau-Hausdorff method can be used for measuring the similarity of DNA sequences based on two important tools: the Yau-Hausdorff distance and the graphical representation of DNA sequences. The graphical representations of DNA sequences conserve all sequence information, and the Yau-Hausdorff distance is mathematically proven to be a true metric. Therefore, the proposed distance can precisely measure the similarity of DNA sequences. The phylogenetic analyses of DNA sequences by the Yau-Hausdorff distance show the accuracy and stability of our approach in similarity comparison of DNA or protein sequences. This study demonstrates that the Yau-Hausdorff distance is a natural metric for DNA and protein sequences with a high level of stability. The approach can also be applied to similarity analysis of protein sequences via graphical representations, as well as general two dimensional shape matching.
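    As a minimal illustration of the metric underlying this method: the Hausdorff distance between two point sets is the largest distance from a point in one set to its nearest neighbour in the other. The sketch below computes the plain Hausdorff distance between two toy graphical curves (the coordinates are made up); the Yau-Hausdorff method additionally minimizes this quantity over all translations and rotations of one curve.

```python
import numpy as np

def directed_hausdorff(a, b):
    # largest distance from a point of a to its nearest neighbour in b
    d = np.sqrt(((a[:, None, :] - b[None, :, :]) ** 2).sum(-1))
    return d.min(axis=1).max()

def hausdorff(a, b):
    # symmetric Hausdorff distance between point sets a and b
    return max(directed_hausdorff(a, b), directed_hausdorff(b, a))

# toy 2D graphical curves (illustrative coordinates only)
curve1 = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.5]])
curve2 = np.array([[0.0, 0.1], [1.0, 0.9], [2.0, 0.6]])
d = hausdorff(curve1, curve2)   # ~0.1 for these curves
```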

  11. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    Science.gov (United States)

    Trevino, Luis; Johnson, Stephen B.; Patterson, Jonathan; Teare, David

    2015-01-01

    The engineering development of the National Aeronautics and Space Administration's (NASA) new Space Launch System (SLS) requires cross discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on orbit operations. The nominal and off-nominal characteristics of SLS's elements and subsystems must be understood and matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large and complex systems engineering challenge, which is being addressed in part by focusing on the specific subsystems involved in the handling of off-nominal mission and fault tolerance with response management. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML) and Systems Modeling Language (SysML), the Mission and Fault Management (M&FM) algorithms for the vehicle are crafted and vetted in Integrated Development Teams (IDTs) composed of multiple development disciplines such as Systems Engineering (SE), Flight Software (FSW), Safety and Mission Assurance (S&MA) and the major subsystems and vehicle elements such as Main Propulsion Systems (MPS), boosters, avionics, Guidance, Navigation, and Control (GNC), Thrust Vector Control (TVC), and liquid engines. These model-based algorithms and their development lifecycle from inception through FSW certification are an important focus of SLS's development effort to further ensure reliable detection and response to off-nominal vehicle states during all phases of vehicle operation from pre-launch through end of flight. To test and validate these M&FM algorithms a dedicated test-bed was developed for full Vehicle Management End-to-End Testing (VMET). For addressing fault management (FM

  12. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    Science.gov (United States)

    Trevino, Luis; Johnson, Stephen B.; Patterson, Jonathan; Teare, David

    2015-01-01

    The development of the Space Launch System (SLS) launch vehicle requires cross discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on orbit operations. The characteristics of these systems must be matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large, complex systems engineering challenge, addressed in part by focusing on the specific subsystems' handling of off-nominal mission and fault tolerance. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML), the Mission and Fault Management (M&FM) algorithms are crafted and vetted in specialized Integrated Development Teams composed of multiple development disciplines. NASA has also formed an M&FM team for addressing fault management early in the development lifecycle. This team has developed a dedicated Vehicle Management End-to-End Testbed (VMET) that integrates specific M&FM algorithms, specialized nominal and off-nominal test cases, and vendor-supplied physics-based launch vehicle subsystem models. The flexibility of VMET enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the algorithms utilizing actual subsystem models. The intent is to validate the algorithms and substantiate them with performance baselines for each of the vehicle subsystems in an independent platform exterior to flight software test processes. In any software development process there is inherent risk in the interpretation and implementation of concepts into software through requirements and test processes. Risk reduction is addressed by working with other organizations such as S

  13. Genomic Signal Processing Methods for Computation of Alignment-Free Distances from DNA Sequences

    Science.gov (United States)

    Borrayo, Ernesto; Mendizabal-Ruiz, E. Gerardo; Vélez-Pérez, Hugo; Romo-Vázquez, Rebeca; Mendizabal, Adriana P.; Morales, J. Alejandro

    2014-01-01

    Genomic signal processing (GSP) refers to the use of digital signal processing (DSP) tools for analyzing genomic data such as DNA sequences. A possible application of GSP that has not been fully explored is the computation of the distance between a pair of sequences. In this work we present GAFD, a novel GSP alignment-free distance computation method. We introduce a DNA sequence-to-signal mapping function based on doublet values, which increases the number of possible amplitude values for the generated signal. Additionally, we explore the use of three DSP distance metrics as descriptors for categorizing DNA signal fragments. Our results indicate the feasibility of employing GAFD for computing sequence distances and of using descriptors for characterizing DNA fragments. PMID:25393409

  14. Genomic signal processing methods for computation of alignment-free distances from DNA sequences.

    Science.gov (United States)

    Borrayo, Ernesto; Mendizabal-Ruiz, E Gerardo; Vélez-Pérez, Hugo; Romo-Vázquez, Rebeca; Mendizabal, Adriana P; Morales, J Alejandro

    2014-01-01

    Genomic signal processing (GSP) refers to the use of digital signal processing (DSP) tools for analyzing genomic data such as DNA sequences. A possible application of GSP that has not been fully explored is the computation of the distance between a pair of sequences. In this work we present GAFD, a novel GSP alignment-free distance computation method. We introduce a DNA sequence-to-signal mapping function based on doublet values, which increases the number of possible amplitude values for the generated signal. Additionally, we explore the use of three DSP distance metrics as descriptors for categorizing DNA signal fragments. Our results indicate the feasibility of employing GAFD for computing sequence distances and of using descriptors for characterizing DNA fragments.
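    To make the pipeline concrete, here is a small sketch of the idea: map overlapping doublets (dinucleotides) to amplitudes, then compare the resulting signals with a DSP metric. The doublet-to-amplitude table and the spectral metric below are illustrative assumptions, not the paper's actual mapping or metrics.

```python
import numpy as np

# Hypothetical doublet-to-amplitude table: each of the 16 dinucleotides
# gets a distinct amplitude, giving the signal more levels than a
# one-base-per-value encoding (the paper's actual table may differ).
BASES = "ACGT"
DOUBLET = {a + b: i for i, (a, b) in
           enumerate((x, y) for x in BASES for y in BASES)}

def to_signal(seq):
    # overlapping doublets -> numeric signal of length len(seq) - 1
    return np.array([DOUBLET[seq[i:i + 2]] for i in range(len(seq) - 1)],
                    dtype=float)

def spectral_distance(s1, s2, n=32):
    # one simple alignment-free DSP metric: Euclidean distance between
    # zero-padded DFT magnitude spectra of the two signals
    f1 = np.abs(np.fft.rfft(s1, n))
    f2 = np.abs(np.fft.rfft(s2, n))
    return float(np.linalg.norm(f1 - f2))

d = spectral_distance(to_signal("ACGTACGT"), to_signal("ACGTACGA"))
```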

  15. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    Science.gov (United States)

    Trevino, Luis; Patterson, Jonathan; Teare, David; Johnson, Stephen

    2015-01-01

    integrates specific M&FM algorithms, specialized nominal and off-nominal test cases, and vendor-supplied physics-based launch vehicle subsystem models. Additionally, the team has developed processes for implementing and validating these algorithms for concept validation and risk reduction for the SLS program. The flexibility of the Vehicle Management End-to-end Testbed (VMET) enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the developed algorithms utilizing actual subsystem models such as MPS. The intent of VMET is to validate the M&FM algorithms and substantiate them with performance baselines for each of the target vehicle subsystems in an independent platform exterior to the flight software development infrastructure and its related testing entities. In any software development process there is inherent risk in the interpretation and implementation of concepts into software through requirements and test cases into flight software compounded with potential human errors throughout the development lifecycle. Risk reduction is addressed by the M&FM analysis group working with other organizations such as S&MA, Structures and Environments, GNC, Orion, the Crew Office, Flight Operations, and Ground Operations by assessing performance of the M&FM algorithms in terms of their ability to reduce Loss of Mission and Loss of Crew probabilities. In addition, through state machine and diagnostic modeling, analysis efforts investigate a broader suite of failure effects and associated detection and responses that can be tested in VMET to ensure that failures can be detected, and confirm that responses do not create additional risks or cause undesired states through interactive dynamic effects with other algorithms and systems. VMET further contributes to risk reduction by prototyping and exercising the M&FM algorithms early in their implementation and without any inherent hindrances such as meeting FSW

  16. Coarse-grained modelling of strong DNA bending I: Thermodynamics and comparison to an experimental "molecular vice"

    OpenAIRE

    Harrison, Ryan M.; Romano, Flavio; Thomas E. Ouldridge; Louis, Ard A.; Doye, Jonathan P. K.

    2015-01-01

    DNA bending is biologically important for genome regulation and is relevant to a range of nanotechnological systems. Recent results suggest that sharp bending is much easier than implied by the widely used worm-like chain model; many of these studies, however, remain controversial. We use a coarse-grained model, previously fitted to DNA's basic thermodynamic and mechanical properties, to explore strongly bent systems. We find that as the end-to-end distance is decreased sufficiently short dup...

  17. TOWARD END-TO-END MODELING FOR NUCLEAR EXPLOSION MONITORING: SIMULATION OF UNDERGROUND NUCLEAR EXPLOSIONS AND EARTHQUAKES USING HYDRODYNAMIC AND ANELASTIC SIMULATIONS, HIGH-PERFORMANCE COMPUTING AND THREE-DIMENSIONAL EARTH MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Rodgers, A; Vorobiev, O; Petersson, A; Sjogreen, B

    2009-07-06

    This paper describes new research being performed to improve understanding of seismic waves generated by underground nuclear explosions (UNE) by using full waveform simulation, high-performance computing and three-dimensional (3D) earth models. The goal of this effort is to develop an end-to-end modeling capability to cover the range of wave propagation required for nuclear explosion monitoring (NEM) from the buried nuclear device to the seismic sensor, and thereby to improve understanding of the physical basis and prediction capabilities of seismic observables for NEM, including source and path-propagation effects. We are pursuing research along three main thrusts. Firstly, we are modeling the non-linear hydrodynamic response of geologic materials to underground explosions in order to better understand how source emplacement conditions impact the seismic waves that emerge from the source region and are ultimately observed hundreds or thousands of kilometers away. Empirical evidence shows that the amplitudes and frequency content of seismic waves at all distances are strongly impacted by the physical properties of the source region (e.g. density, strength, porosity). To model the near-source shock-wave motions of an UNE, we use GEODYN, an Eulerian Godunov (finite volume) code incorporating thermodynamically consistent non-linear constitutive relations, including cavity formation, yielding, porous compaction, tensile failure, bulking and damage. In order to propagate motions to seismic distances we are developing a one-way coupling method to pass motions to WPP (a Cartesian anelastic finite difference code). Preliminary investigations of UNEs in canonical materials (granite, tuff and alluvium) confirm that emplacement conditions have a strong effect on seismic amplitudes and the generation of shear waves. Specifically, we find that motions from an explosion in high-strength, low-porosity granite have high compressional wave amplitudes and weak

  18. Adhoc: an R package to calculate ad hoc distance thresholds for DNA barcoding identification

    Directory of Open Access Journals (Sweden)

    Gontran Sonet

    2013-12-01

    Full Text Available Identification by DNA barcoding is more likely to be erroneous when it is based on a large distance between the query (the barcode sequence of the specimen to identify) and its best match in a reference barcode library. The number of such false positive identifications can be decreased by setting a distance threshold above which identification has to be rejected. To this end, we recently proposed using an ad hoc distance threshold producing identifications with an estimated relative error probability that can be fixed by the user (e.g. 5%). Here we introduce two R functions that automate the calculation of ad hoc distance thresholds for reference libraries of DNA barcodes. The scripts of both functions, a user manual and an example file are available on the JEMU website (http://jemu.myspecies.info/computer-programs) as well as on the Comprehensive R Archive Network (CRAN, http://cran.r-project.org).
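    The threshold-selection idea can be sketched as follows (in Python rather than R; `adhoc_threshold` and the toy distances are illustrative stand-ins, not the package's actual functions): given best-match distances for queries known to be correctly and erroneously identified, pick the largest threshold whose estimated relative error stays within the user-fixed bound.

```python
import numpy as np

def adhoc_threshold(true_dist, false_dist, max_error=0.05):
    """Largest distance threshold t such that, among best matches at
    distance <= t, the estimated fraction of erroneous identifications
    (relative error) does not exceed max_error."""
    true_dist = np.asarray(true_dist, dtype=float)
    false_dist = np.asarray(false_dist, dtype=float)
    # try candidate thresholds from largest to smallest observed distance
    for t in np.sort(np.concatenate([true_dist, false_dist]))[::-1]:
        n_true = int((true_dist <= t).sum())
        n_false = int((false_dist <= t).sum())
        if n_true + n_false and n_false / (n_true + n_false) <= max_error:
            return float(t)
    return 0.0

# toy data: query-to-best-match distances for identifications known to be
# correct (clustering low) and erroneous (clustering high)
threshold = adhoc_threshold([0.01] * 19, [0.2])   # 1/20 = 5% -> accepts 0.2
```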

  19. Effect of Cisplatin on the Flexibility of Linear DNA

    Institute of Scientific and Technical Information of China (English)

    JI Chao; ZHANG Ling-Yun; HOU Xi-Miao; DOU Shuo-Xing; WANG Peng-Ye

    2011-01-01

    With the aid of an atomic force microscope (AFM), we study the interaction between a linear DNA fragment and cisplatin. AFM imaging shows a gradual change in DNA conformation as the cisplatin concentration is varied. The contour length, the end-to-end distance and the local bend angles of the linear DNA fragment can be accurately measured. The persistence length of DNA interacting with cisplatin decreases with increasing cisplatin concentration. Furthermore, it is demonstrated that the local bend angles of DNA chains are increased by the binding interaction of cisplatin.
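    From such AFM measurements the persistence length is commonly obtained by inverting the worm-like-chain relation between the mean-square end-to-end distance and the contour length. A sketch, assuming chains equilibrated in two dimensions on the surface (the 2D WLC expression ⟨R²⟩ = 4PL[1 − (2P/L)(1 − e^(−L/2P))]); the function names and the example numbers are illustrative:

```python
import math

def r2_wlc_2d(P, L):
    # mean-square end-to-end distance of a worm-like chain equilibrated
    # in 2D, with persistence length P and contour length L (same units)
    x = L / (2.0 * P)
    return 4.0 * P * L * (1.0 - (1.0 / x) * (1.0 - math.exp(-x)))

def persistence_length(L, r2_measured, lo=1.0, hi=500.0):
    # bisection: r2_wlc_2d increases monotonically with P at fixed L
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if r2_wlc_2d(mid, L) < r2_measured:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# e.g. a 300 nm fragment whose <R^2> corresponds to P = 50 nm
P = persistence_length(L=300.0, r2_measured=r2_wlc_2d(50.0, 300.0))
```

A drop in the fitted P with increasing cisplatin concentration quantifies the softening reported above.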

  20. LZ Complexity Distance of DNA Sequences and Its Application in Phylogenetic Tree Reconstruction

    Institute of Scientific and Technical Information of China (English)

    Bin Li; Yi-Bing Li; Hong-Bo He

    2005-01-01

    DNA sequences can be treated as finite-length symbol strings over a four-letter alphabet (A, C, T, G). As a universal and computable complexity measure, LZ complexity is valid to describe the complexity of DNA sequences. In this study, a concept of conditional LZ complexity between two sequences is proposed according to the principle of the LZ complexity measure. An LZ complexity distance metric between two nonnull sequences is defined by utilizing conditional LZ complexity. Based on the LZ complexity distance, a phylogenetic tree of 26 species of placental mammals (Eutheria) with three outgroup species was reconstructed from their complete mitochondrial genomes. Regarding the debate over which two of the three main groups of placental mammals, namely Primates, Ferungulates, and Rodents, are more closely related, the phylogenetic tree reconstructed from the LZ complexity distance supports the suggestion that Primates and Ferungulates are more closely related.
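    LZ76 complexity and a distance built from it can be sketched as follows; the normalization below is one common form of an LZ-based distance and may differ in detail from the metric defined in the paper:

```python
def lz_complexity(s):
    """LZ76 phrase count: scan left to right, extending the current phrase
    while it already occurs in the text seen so far."""
    i, count, n = 0, 0, len(s)
    while i < n:
        k = 1
        while i + k <= n and s[i:i + k] in s[:i + k - 1]:
            k += 1
        count += 1
        i += k
    return count

def lz_distance(s, q):
    # conditional increments c(sq) - c(s) and c(qs) - c(q), normalized by
    # the larger individual complexity (one common normalization)
    cs, cq = lz_complexity(s), lz_complexity(q)
    return max(lz_complexity(s + q) - cs,
               lz_complexity(q + s) - cq) / max(cs, cq)

d = lz_distance("ACGTACGTAC", "ACGTTTGTAC")
```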

  1. Cognitive Network End-to-End Situational Evaluation Algorithm Based on BP-DBN

    Institute of Scientific and Technical Information of China (English)

    蒋云洁; 王莉

    2014-01-01

    An end-to-end situational evaluation algorithm for cognitive networks based on a BP deep belief network (BP-DBN) is proposed to judge the end-to-end situational level. Based on a distributed situational evaluation architecture, BP-DBN is used to construct the mapping relations among network-element evaluation values in the cognitive domain, local situational evaluation values and end-to-end situational evaluation values, enabling qualitative evaluation of the end-to-end situational level. Simulation results show that the test error rate of BP-DBN is low even with few labeled training samples, which ensures evaluation accuracy, and that the proposed algorithm can estimate the end-to-end situational level effectively.

  2. SU-F-P-39: End-To-End Validation of a 6 MV High Dose Rate Photon Beam, Configured for Eclipse AAA Algorithm Using Golden Beam Data, for SBRT Treatments Using RapidArc

    Energy Technology Data Exchange (ETDEWEB)

    Ferreyra, M; Salinas Aranda, F; Dodat, D; Sansogne, R; Arbiser, S [Vidt Centro Medico, Ciudad Autonoma De Buenos Aires, Ciudad Autonoma de Buenos Aire (Argentina)

    2016-06-15

    Purpose: To use end-to-end testing to validate a 6 MV high dose rate photon beam, configured for the Eclipse AAA algorithm using Golden Beam Data (GBD), for SBRT treatments using RapidArc. Methods: Beam data was configured for the Varian Eclipse AAA algorithm using the GBD provided by the vendor. Transverse and diagonal dose profiles, PDDs and output factors down to a field size of 2×2 cm² were measured on a Varian Trilogy Linac and compared with the GBD library using 2%/2 mm 1D gamma analysis. The MLC transmission factor and dosimetric leaf gap were determined to characterize the MLC in Eclipse. Mechanical and dosimetric tests were performed combining different gantry rotation speeds, dose rates and leaf speeds to evaluate the delivery system performance according to VMAT accuracy requirements. An end-to-end test was implemented planning several SBRT RapidArc treatments on a CIRS 002LFC IMRT Thorax Phantom. The CT scanner calibration curve was acquired and loaded in Eclipse. A PTW 31013 ionization chamber was used with a Keithley 35617EBS electrometer for absolute point dose measurements in water and lung equivalent inserts. TPS calculated planar dose distributions were compared to those measured using EPID and MapCheck, as an independent verification method. Results were evaluated with gamma criteria of 2% dose difference and 2 mm DTA for 95% of points. Results: The GBD set vs. measured data passed 2%/2 mm 1D gamma analysis even for small fields. Machine performance tests show results are independent of machine delivery configuration, as expected. Absolute point dosimetry comparison resulted within 4% for the worst case scenario in lung. Over 97% of the points evaluated in dose distributions passed gamma index analysis. Conclusion: Eclipse AAA algorithm configuration of the 6 MV high dose rate photon beam using GBD proved efficient. End-to-end test dose calculation results indicate it can be used clinically for SBRT using RapidArc.
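    The gamma analysis used throughout this workflow can be sketched in one dimension (a simplified global gamma without interpolation; the Gaussian profiles and tolerance values below are illustrative):

```python
import numpy as np

def gamma_1d(ref_dose, eval_dose, x, dose_tol=0.02, dist_tol=2.0):
    """Simplified global 1D gamma index: for each reference point, take
    the minimum combined dose-difference / distance-to-agreement metric
    over all evaluated points. Points pass where gamma <= 1."""
    dmax = ref_dose.max()
    gamma = np.empty_like(ref_dose, dtype=float)
    for i, (xi, di) in enumerate(zip(x, ref_dose)):
        dose_term = (eval_dose - di) / (dose_tol * dmax)   # 2% global
        dist_term = (x - xi) / dist_tol                    # 2 mm DTA
        gamma[i] = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
    return gamma

x = np.linspace(-50, 50, 201)                   # positions in mm
ref = np.exp(-x ** 2 / (2 * 15.0 ** 2))         # toy Gaussian profile
meas = np.exp(-(x - 0.5) ** 2 / (2 * 15.0 ** 2))  # shifted by 0.5 mm
passing = (gamma_1d(ref, meas, x) <= 1).mean() * 100  # % of points passing
```

A small spatial shift well inside the 2 mm DTA tolerance yields a 100% pass rate, as expected.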

  3. Demonstration of the First Real-Time End-to-End 40-Gb/s PAM-4 for Next-Generation Access Applications using 10-Gb/s Transmitter

    DEFF Research Database (Denmark)

    Wei, J. L.; Eiselt, Nicklas; Griesser, Helmut

    2016-01-01

    We demonstrate the first known experiment of a real-time end-to-end 40-Gb/s PAM-4 system for next-generation access applications using 10-Gb/s class transmitters only. Based on the measurement of a real-time 40-Gb/s PAM system, low-cost upstream and downstream link power budgets are estimated. We also show that colorless 40 Gb/s PAM-4 transmission over 20 km SMF in the C-band is achievable...

  4. Genetic distances and phylogenetic trees of different Awassi sheep populations based on DNA sequencing.

    Science.gov (United States)

    Al-Atiyat, R M; Aljumaah, R S

    2014-01-01

    This study aimed to estimate evolutionary distances and to reconstruct phylogeny trees between different Awassi sheep populations. Thirty-two sheep individuals from three different geographical areas of Jordan and the Kingdom of Saudi Arabia (KSA) were randomly sampled. DNA was extracted from the tissue samples and sequenced using the T7 promoter universal primer. Different phylogenetic trees were reconstructed from 0.64-kb DNA sequences using the MEGA software with the best general time reverse distance model. Three methods of distance estimation were then used. The maximum composite likelihood test was considered for reconstructing maximum likelihood, neighbor-joining and UPGMA trees. The maximum likelihood tree indicated three major clusters separated by cytosine (C) and thymine (T). The greatest distance was shown between the South sheep and North sheep. On the other hand, the KSA sheep as an outgroup showed shorter evolutionary distance to the North sheep population than to the others. The neighbor-joining and UPGMA trees showed quite reliable clusters of evolutionary differentiation of Jordan sheep populations from the Saudi population. The overall results support geographical information and ecological types of the sheep populations studied. Summing up, the resulting phylogeny trees may contribute to the limited information about the genetic relatedness and phylogeny of Awassi sheep in nearby Arab countries.

  5. End-to-end energy efficient communication

    DEFF Research Database (Denmark)

    Dittmann, Lars

    Awareness of energy consumption in communication networks such as the Internet is currently gaining momentum, as it is commonly acknowledged that increased network capacity (currently driven by video applications) requires significantly more electrical power. This paper stresses the importance

  6. CASTOR end-to-end monitoring

    CERN Document Server

    Rekatsinas, T; Pokorski, W; Ponce, S; Rabaçal, B; Waldron, D; Wojcieszuk, J

    2010-01-01

    With the start of the Large Hadron Collider approaching, storage and management of raw event data, as well as reconstruction and analysis data, is of crucial importance for the researchers. The CERN Advanced STORage system (CASTOR) is a hierarchical system developed at CERN, used to store physics production files and user files. CASTOR, as one of the essential software tools used by the LHC experiments, has to provide reliable services for storing and managing data. Monitoring of this complicated system is mandatory in order to assure its stable operation and improve its future performance. This paper presents the new monitoring system of CASTOR, which provides operation- and user-request-specific metrics. This system is built around a dedicated, optimized database schema. The schema is populated by PL/SQL procedures, which process a stream of incoming raw metadata from different CASTOR components, initially collected by the Distributed Logging Facility (DLF). A web interface has been developed for the visualizatio...

  7. Regulation at a distance of biomolecular interactions using a DNA origami nanoactuator.

    Science.gov (United States)

    Ke, Yonggang; Meyer, Travis; Shih, William M; Bellot, Gaetan

    2016-03-18

    The creation of nanometre-sized structures that exhibit controllable motions and functions is a critical step towards building nanomachines. Recent developments in the field of DNA nanotechnology have begun to address these goals, demonstrating complex static or dynamic nanostructures made of DNA. Here we have designed and constructed a rhombus-shaped DNA origami 'nanoactuator' that uses mechanical linkages to copy distance changes induced on one half ('the driver') to be propagated to the other half ('the mirror'). By combining this nanoactuator with split enhanced green fluorescent protein (eGFP), we have constructed a DNA-protein hybrid nanostructure that demonstrates tunable fluorescent behaviours via long-range allosteric regulation. In addition, the nanoactuator can be used as a sensor that responds to specific stimuli, including changes in buffer composition and the presence of restriction enzymes or specific nucleic acids.

  8. Elasticity of DNA and the effect of Dendrimer Binding

    CERN Document Server

    Mogurampelly, Santosh; Netz, Roland R; Maiti, Prabal K

    2013-01-01

    Negatively charged DNA can be compacted by positively charged dendrimers, and the degree of compaction is a delicate balance between the strength of the electrostatic interaction and the elasticity of DNA. We report various elastic properties of short double stranded DNA (dsDNA) and the effect of dendrimer binding using fully atomistic molecular dynamics and numerical simulations. In equilibrium at room temperature, the contour length distribution P(L) and end-to-end distance distribution P(R) are nearly Gaussian; the former gives an estimate of the stretch modulus γ1 of dsDNA in quantitative agreement with the literature value. The bend angle distribution P(θ) of the dsDNA also has a Gaussian form and allows us to extract a persistence length L_p of 43 nm. When the dsDNA is compacted by a positively charged dendrimer, the stretch modulus stays invariant but the effective bending rigidity estimated from the end-to-end distance distribution decreases dramatically due to backbone charge neutralization...
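    The stretch modulus can be extracted from a Gaussian contour-length distribution via equipartition for a harmonic stretching mode, Var(L) = k_BT·L0/γ1. A sketch with synthetic data (the numbers are illustrative, not the paper's):

```python
import numpy as np

def stretch_modulus(contour_lengths_nm, kBT_pN_nm=4.114):
    """Stretch modulus in pN from contour-length fluctuations:
    gamma = kBT * L0 / Var(L), with kBT ~ 4.114 pN nm at ~298 K."""
    L = np.asarray(contour_lengths_nm, dtype=float)
    return kBT_pN_nm * L.mean() / L.var()

# synthetic Gaussian P(L) for a short duplex: L0 = 6.8 nm, gamma = 1000 pN
rng = np.random.default_rng(1)
L0, gamma = 6.8, 1000.0
samples = rng.normal(L0, np.sqrt(4.114 * L0 / gamma), 100_000)
est = stretch_modulus(samples)   # recovers roughly 1000 pN
```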

  9. Horses for courses: a DNA-based test for race distance aptitude in thoroughbred racehorses.

    Science.gov (United States)

    Hill, Emmeline W; Ryan, Donal P; MacHugh, David E

    2012-12-01

    Variation at the myostatin (MSTN) gene locus has been shown to influence racing phenotypes in Thoroughbred horses, and in particular, early skeletal muscle development and the aptitude for racing at short distances. Specifically, a single nucleotide polymorphism (SNP) in the first intron of MSTN (g.66493737C/T) is highly predictive of best race distance among Flat racing Thoroughbreds: homozygous C/C horses are best suited to short distance races, heterozygous C/T horses are best suited to middle distance races, and homozygous T/T horses are best suited to longer distance races. Patent applications for this gene marker association, and other linked markers, have been filed. The information contained within the patent applications is exclusively licensed to the commercial biotechnology company Equinome Ltd, which provides a DNA-based test to the international Thoroughbred horse racing and breeding industry. The application of this information in the industry enables informed decision making in breeding and racing and can be used to assist selection to accelerate the rate of change of genetic types among distinct populations (Case Study 1) and within individual breeding operations (Case Study 2).
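    The genotype-to-aptitude association described above reduces to a simple lookup on the g.66493737C/T SNP; the sketch below is purely illustrative and is not Equinome's actual test:

```python
# Distance aptitude implied by the MSTN g.66493737C/T genotype, per the
# association described in the record (illustrative lookup only)
MSTN_APTITUDE = {
    ("C", "C"): "short distance",
    ("C", "T"): "middle distance",
    ("T", "C"): "middle distance",
    ("T", "T"): "long distance",
}

def race_distance_aptitude(allele1, allele2):
    # normalize case so "c"/"t" inputs also resolve
    return MSTN_APTITUDE[(allele1.upper(), allele2.upper())]

aptitude = race_distance_aptitude("C", "T")   # middle distance
```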

  10. Syntheses, structures, and magnetic properties of three one-dimensional end-to-end azide/cyanate-bridged copper(II) compounds exhibiting ferromagnetic interaction: new type of solid state isomerism.

    Science.gov (United States)

    Sasmal, Sujit; Sarkar, Sohini; Aliaga-Alcalde, Núria; Mohanta, Sasankasekhar

    2011-06-20

    The work in this paper presents the syntheses, structures, and magnetic properties of three end-to-end (EE) azide/cyanate-bridged copper(II) compounds [Cu(II)L(1)(μ(1,3)-NCO)](n)·2nH(2)O (1), [Cu(II)L(1)(μ(1,3)-N(3))](n)·2nH(2)O (2), and [Cu(II)L(2)(μ(1,3)-N(3))](n) (3), where the ligands used to achieve these species, HL(1) and HL(2), are the tridentate Schiff base ligands obtained from [1 + 1] condensations of salicylaldehyde with 4-(2-aminoethyl)-morpholine and 3-methoxy salicylaldehyde with 1-(2-aminoethyl)-piperidine, respectively. Compounds 1 and 2 crystallize in the monoclinic P2(1)/c space group, while compound 3 crystallizes in the orthorhombic Pbca space group. The metal center in 1-3 is in all cases pentacoordinated. Three coordination positions of the metal center in 1, 2, or 3 are satisfied by the phenoxo oxygen atom, imine nitrogen atom, and morpholine (for 1 and 2) or piperidine (for 3) nitrogen atom of one deprotonated ligand, [L(1)](-) or [L(2)](-). The remaining two coordination positions are satisfied by two nitrogen atoms of two end-to-end bridging azide ligands for 2 and 3 and one nitrogen atom and one oxygen atom of two end-to-end bridging cyanate ligands for 1. The coordination geometry of the metal ion is distorted square pyramidal in which one EE azide/cyanate occupies the apical position. Variable-temperature (2-300 K) magnetic susceptibilities of 1-3 have been measured under magnetic fields of 0.05 T (from 2 to 30 K) and 1.0 T (from 30 to 300 K). The simulation reveals a ferromagnetic interaction in all three compounds with J values of +0.19 ± 0.01, +0.79 ± 0.01, and +1.25 ± 0.007 cm(-1) for 1, 2, and 3, respectively. Compound 1 is the sole example of a ferromagnetically coupled EE cyanate-bridged 1-D copper(II) system. In addition, a rare example of supramolecular isomerism and a nice example of magnetic isomerism have been observed and most interestingly a new type of solid state isomerism has emerged as a result of the comparison

  11. Unmanned Aircraft Systems Detect and Avoid System: End-to-End Verification and Validation Simulation Study of Minimum Operations Performance Standards for Integrating Unmanned Aircraft into the National Airspace System

    Science.gov (United States)

    Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Sturdy, James L.; Vincent, Michael J.; Hoffler, Keith D.; Myer, Robert R.; DeHaven, Anna M.

    2017-01-01

    As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on-board the aircraft. The technique, results, and lessons learned from a detailed End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS), based on specific test vectors and encounter cases, will be presented in this paper.

  12. End to End Service Quality Model Study and Application Based on IPTV Business

    Institute of Scientific and Technical Information of China (English)

    刘辉

    2011-01-01

    Amid the current booming development of IPTV services, the main challenge encountered is service quality management, especially end-to-end quality of service. The chief difficulty is that when high-definition video pictures, sound, and instant channel-switching information are carried over an IP network, service quality becomes very hard to control and guarantee.

  13. Research on end-to-end network link delay inference based on link reconstruction-deconstruction

    Institute of Scientific and Technical Information of China (English)

    梁永生; 高波; 邹粤; 张基宏; 张乃通

    2014-01-01

    Based on two assumptions about network delay inference, a delay inference model, and a path delay data acquisition method, an approach to end-to-end network link delay inference based on link reconstruction-deconstruction (LRD) is proposed. Pseudo-likelihood estimation (PLE) is applied to decompose the overall problem into several independent sub-problems that are solved separately, and LRD is used to determine inference units with definite solutions. By controlling the average sampling precision and reducing the number of links per inference unit, the computational complexity of link delay inference is significantly lowered. Experimental studies based on model computation and the NS2 simulation platform verify that the inference approach is accurate and effective.
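The covariance idea underlying delay tomography can be illustrated on the smallest tree. This is a generic sketch, not the paper's LRD/PLE algorithm: two receivers share one upstream link, so the covariance of their end-to-end delays, measured with back-to-back probes, estimates the shared link's delay variance. Link delay distributions below are invented for illustration.

```python
# Generic delay-tomography sketch on a two-leaf tree (not the paper's
# LRD/PLE algorithm): cov(path1, path2) estimates the shared link's
# delay variance because the leaf-link delays are independent.
import random

random.seed(7)
N = 100_000

# Hypothetical per-link delays (ms); exponential with mean m has variance m^2.
shared = [random.expovariate(1 / 4.0) for _ in range(N)]  # var = 16
leaf1 = [random.expovariate(1 / 2.0) for _ in range(N)]   # var = 4
leaf2 = [random.expovariate(1 / 3.0) for _ in range(N)]   # var = 9

path1 = [s + a for s, a in zip(shared, leaf1)]  # end-to-end delay, receiver 1
path2 = [s + b for s, b in zip(shared, leaf2)]  # end-to-end delay, receiver 2

def cov(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

est_shared_var = cov(path1, path2)                   # ~16: shared link
est_leaf1_var = cov(path1, path1) - est_shared_var   # ~4: leaf link 1
```

The same decomposition generalizes to larger trees, which is where sub-problem splitting and solvable inference units become necessary.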

  14. SU-F-P-37: Implementation of An End-To-End QA Test of the Radiation Therapy Imaging, Planning and Delivery Process to Identify and Correct Possible Sources of Deviation

    Energy Technology Data Exchange (ETDEWEB)

    Salinas Aranda, F; Suarez, V; Arbiser, S; Sansogne, R [Vidt Centro Medico, Ciudad Autonoma De Buenos Aires, Ciudad Autonoma de Buenos Aire (Argentina)

    2016-06-15

    Purpose: To implement an end-to-end QA test of the radiation therapy imaging, planning and delivery process, aimed at assessing the dosimetric agreement between planned and delivered treatment, in order to identify and correct possible sources of deviation, and to establish an internal standard for machine commissioning acceptance. Methods: A test involving all steps of the radiation therapy process (imaging, planning and delivery) was designed. The test includes analysis of point dose and planar dose distribution agreement between TPS-calculated and measured dose. An ad hoc 16 cm diameter PMMA phantom was constructed with one central and four peripheral bores that can accommodate calibrated electron density inserts. Using the Varian Eclipse 10.0 and Elekta XiO 4.50 planning systems, IMRT, RapidArc and 3DCRT plans with hard and dynamic wedges were planned on the phantom and tested. An Exradin A1SL chamber with a Keithley 35617EBS electrometer was used for point dose measurements in the phantom. 2D dose distributions were acquired using MapCheck and a Varian aS1000 EPID. Gamma analysis of 2D dose distribution agreement was performed using the MapCheck software and the Varian Portal Dosimetry application. Varian high-energy Clinacs (Trilogy, 2100C/CD, 2000CR) and a low-energy 6X/EX were tested. TPS CT-number vs. electron density tables were checked for the CT scanners used. Results: Calculated point doses were accurate to 0.127% (SD 0.93%), 0.507% (SD 0.82%), 0.246% (SD 1.39%) and 0.012% (SD 0.01%) for LoX-3DCRT, HiX-3DCRT, IMRT and RapidArc plans, respectively. Planar doses passed gamma 3%/3 mm in all cases and 2%/2 mm for VMAT plans. Conclusion: Implementation of a simple and reliable quality assurance tool was accomplished. The end-to-end test proved efficient, showing excellent agreement between planned and delivered dose and evidencing strong consistency of the whole process from imaging through planning to delivery. This test can be used as a first step in beam model acceptance for clinical
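The gamma criterion quoted above (e.g. 3%/3 mm) can be sketched in one dimension. This is a simplified global-normalization implementation for illustration only; clinical tools such as MapCheck work on 2-D grids with interpolation.

```python
# Minimal 1-D gamma-index sketch (global 3%/3 mm criterion): each measured
# point searches the reference profile for the best combined dose/distance
# agreement; a point passes if its gamma is <= 1.
import math

def gamma_1d(ref, meas, spacing_mm, dose_tol=0.03, dist_tol_mm=3.0):
    """Per-point gamma for a measured profile vs a reference profile."""
    d_max = max(ref)  # global normalization dose
    gammas = []
    for i, dm in enumerate(meas):
        best = math.inf
        for j, dr in enumerate(ref):
            dose_term = (dm - dr) / (dose_tol * d_max)
            dist_term = (i - j) * spacing_mm / dist_tol_mm
            best = min(best, math.hypot(dose_term, dist_term))
        gammas.append(best)
    return gammas

# A uniform 2% dose error alone gives gamma ~0.67, i.e. a 3%/3 mm pass.
ref = [0.2, 0.5, 0.9, 1.0, 0.9, 0.5, 0.2]
meas = [d * 1.02 for d in ref]
g = gamma_1d(ref, meas, spacing_mm=1.0)
print(max(g) <= 1.0)  # True
```

The distance term is what distinguishes gamma analysis from a plain dose-difference map: a steep-gradient mismatch can pass via a nearby reference point.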

  15. End-to-end traffic calculation for cloud computing data centre networks

    Institute of Scientific and Technical Information of China (English)

    韩冰; 宋正江; 鲁阳; 陈建成

    2016-01-01

    Traffic characteristics of cloud computing data centre networks are the basis for the research and design of cloud computing networks. Existing traffic measurement methods usually require switches to support additional functional modules or to be programmable, a requirement that most switches in current cloud computing data centre networks do not meet. In this paper, we propose an end-to-end traffic inference algorithm based on network tomography. It can rapidly and accurately calculate end-to-end traffic information using only SNMP (Simple Network Management Protocol) data, which is ubiquitously supported by switches. Simulation experiments comparing the algorithm with an existing network tomography algorithm show that the new algorithm is better suited to large-scale cloud computing data centre networks and obtains more accurate results in less time, providing an important reference for the design and study of cloud computing networks.
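A common baseline for inferring an end-to-end traffic matrix from aggregate SNMP counters is the gravity model. This sketch is a generic tomography baseline, not the paper's algorithm; the counter values are invented.

```python
# Gravity-model sketch of traffic-matrix estimation from SNMP byte counters:
# the flow from source i to sink j is apportioned by i's egress share and
# j's ingress share of the total traffic.
def gravity_matrix(out_bytes, in_bytes):
    total = sum(out_bytes)  # conservation: total egress == total ingress
    return [[o * i / total for i in in_bytes] for o in out_bytes]

# Hypothetical per-switch SNMP counters (bytes in one polling interval).
out_bytes = [600, 300, 100]   # traffic sourced at switches A, B, C
in_bytes = [500, 400, 100]    # traffic sunk at switches A, B, C

tm = gravity_matrix(out_bytes, in_bytes)
# Row sums reproduce the egress counters, column sums the ingress counters.
print([round(sum(row)) for row in tm])  # [600, 300, 100]
```

Network-tomography methods refine such a prior with routing information and link-load equations; the gravity estimate is only the consistent starting point.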

  16. Distance-dependent interactions between gold nanoparticles and fluorescent molecules with DNA as tunable spacers

    Energy Technology Data Exchange (ETDEWEB)

    Chhabra, Rahul; Sharma, Jaswinder; Lin Su; Yan Hao; Lindsay, Stuart; Liu Yan [Biodesign Institute, Arizona State University, Tempe, AZ 85287 (United States); Wang Haining; Zou Shengli, E-mail: stuart.lindsay@asu.ed, E-mail: yan_liu@asu.ed [Department of Chemistry, University of Central Florida, Orlando, FL 32816 (United States)

    2009-12-02

    Stoichiometrically controlled 1:1 functionalization of gold nanoparticles with fluorescent dye molecules, in which the dye molecule is held away from the particle surface by a rigid DNA spacer, allows precise determination of the distance-dependent effect of the metal nanoparticles on fluorescence intensity. Two dyes were studied, Cy3 and Cy5, with two sizes of nanoparticles, 5 and 10 nm. The larger the particle, the more quenching of the photoluminescence (PL) intensity, due to increased overlap of the dye's emission spectrum with the Au surface plasmon resonance. Fluorescence is quenched significantly for distances somewhat larger than the particle diameter, in good agreement with the predictions of an electrodynamics model based on interacting dipoles. The distance dependence of surface energy transfer behavior, i.e. quenching efficiency, is proportional to 1/d^4, which involves no consideration of the size of the particle or the spectral overlap of the dye and AuNP. This surface energy transfer model agrees qualitatively with the electrodynamic model, though the exponent is greater than 4 for the smaller nanoparticles (5 nm) and smaller than 4 for the larger nanoparticles (10 nm).
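The two distance laws in play, 1/d^4 surface energy transfer (SET) versus 1/d^6 FRET, can be compared numerically. The d0 and R0 values below are illustrative assumptions, not fits to this paper's data.

```python
# Quenching efficiency vs dye-particle distance for the SET (1/d^4) and
# FRET (1/d^6) laws; characteristic distances d0 and R0 are illustrative.
def set_eff(d, d0):
    """SET: E = 1 / (1 + (d/d0)^4)."""
    return 1.0 / (1.0 + (d / d0) ** 4)

def fret_eff(r, r0):
    """FRET: E = 1 / (1 + (r/r0)^6)."""
    return 1.0 / (1.0 + (r / r0) ** 6)

d0 = 8.0  # nm, assumed SET 50%-quenching distance
r0 = 6.0  # nm, assumed Forster radius
for d in (4.0, 8.0, 16.0):
    print(f"d={d:5.1f} nm  SET={set_eff(d, d0):.3f}  FRET={fret_eff(d, r0):.3f}")
# The smaller exponent makes SET fall off more slowly with distance,
# which is why metal-particle quenching reaches beyond typical FRET range.
```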

  17. An improved algorithm for reconfiguring processes with end-to-end QoS constraints

    Institute of Scientific and Technical Information of China (English)

    刘玮玮

    2011-01-01

    This paper presents an improved algorithm to resolve violations of end-to-end quality of service (QoS) constraints that occur when some services in a running SOA process fail. By searching for a reconfiguration region containing replaceable services, the algorithm replaces only the faulty services and some of their neighboring services. Replacements can be implemented as one-to-one, one-to-many, or many-to-one function mappings. Because only services in the reconfiguration region, rather than the whole service process, are replaced, reconfiguration overhead and the risk of service disruption are lowered. A simulation experiment in the Llama ESB middleware shows that the algorithm can efficiently repair SOA processes.

  18. End to end simulation of the large span cable control system in FAST

    Institute of Scientific and Technical Information of China (English)

    孙京海; 朱文白; 李辉

    2012-01-01

    The support scheme of the Five-hundred-meter Aperture Spherical radio Telescope (FAST), which combines flexible cable drives with a secondary fine-adjustment mechanism, poses great challenges to the structure and control system design when the receiver must be moved accurately over a large range. Building on the existing concept, a complete prototype-scale model of the feed support mechanism and its environmental disturbances was established, a corresponding control strategy was proposed, and a controller model was designed to achieve precise positioning and pointing. End-to-end numerical simulation was used to study the system's dynamic response and the influence of various disturbance parameters on control performance. The simulation results show that the current feed support design delivers good control performance in compensating position and orientation errors, confirming the feasibility of this engineering concept and providing a reference for further design optimization.

  19. Evaluation of Acridine Orange Derivatives as DNA-Targeted Radiopharmaceuticals for Auger Therapy: Influence of the Radionuclide and Distance to DNA

    Science.gov (United States)

    Pereira, Edgar; Do Quental, Letícia; Palma, Elisa; Oliveira, Maria Cristina; Mendes, Filipa; Raposinho, Paula; Correia, Isabel; Lavrado, João; di Maria, Salvatore; Belchior, Ana; Vaz, Pedro; Santos, Isabel; Paulo, António

    2017-02-01

    A new family of 99mTc(I)-tricarbonyl complexes and 125I-heteroaromatic compounds bearing an acridine orange (AO) DNA-targeting unit was evaluated for Auger therapy. Characterization of the DNA interaction, performed with the non-radioactive Re and 127I congeners, confirmed that all compounds act as DNA intercalators. Both classes of compounds induce double strand breaks (DSB) in plasmid DNA but the extent of DNA damage is strongly dependent on the linker between the Auger emitter (99mTc or 125I) and the AO moiety. The in vitro evaluation was complemented with molecular docking studies and Monte Carlo simulations of the energy deposited at the nanometric scale, which corroborated the experimental data. Two of the tested compounds, 125I-C5 and 99mTc-C3, place the corresponding radionuclide at similar distances to DNA and produce comparable DSB yields in plasmid and cellular DNA. These results provide the first evidence that 99mTc can induce DNA damage with similar efficiency to that of 125I, when both are positioned at comparable distances to the double helix. Furthermore, the high nuclear retention of 99mTc-C3 in tumoral cells suggests that 99mTc-labelled AO derivatives are more promising for the design of Auger-emitting radiopharmaceuticals than the 125I-labelled congeners.

  20. Research on end-to-end data flow control strategy for mobile Internet

    Institute of Scientific and Technical Information of China (English)

    张高毅

    2013-01-01

    This paper analyzes new wireless network technologies and architectures. For the 2G/3G/LTE/WLAN multi-network environment, it studies how to apply QoS control according to user type, service type, time period, and accumulated traffic volume, so as to improve service experience and user satisfaction. Based on the PCC bearer architecture and the ANDSF multi-connection management mechanism, end-to-end traffic control technology is used to increase the utilization of Internet bandwidth. Borrowing from the principles of the Dujiangyan water conservancy project, network traffic control is applied to smooth network traffic and to reduce and effectively control traffic peaks. A converged policy control scheme based on mobile Internet network cooperation is proposed, which allocates network resources reasonably and controls network investment rationally and effectively, creating conditions for the development of mobile Internet services and building a full technical reserve to cope with the coming flood of data.

  1. Research on Approach to End-to-end Network Link Delay Inference Based on PLE with Definite Solution

    Institute of Scientific and Technical Information of China (English)

    梁永生; 邹粤; 张基宏

    2011-01-01

    Network delay is an important network performance parameter. End-to-end network delay inference can overcome the drawbacks of traditional measurement techniques that rely on internal routers or router cooperation. Under two assumptions, that the network topology is known and stable and that link performance is temporally and spatially independent, a network delay inference model is presented, and a new approach to inferring internal link delays based on pseudo-likelihood estimation (PLE) with a definite solution is proposed. Building on PLE solved with the expectation-maximization (EM) algorithm, the back-to-back packet-sending scheme is controlled so as to determine inference units that can be solved, which resolves the problem of topologies without a definite solution and effectively lowers the computational complexity. Model-based computation verifies that the approach is accurate and effective.

  2. The Real maccoyii: Identifying Tuna Sushi with DNA Barcodes – Contrasting Characteristic Attributes and Genetic Distances

    Science.gov (United States)

    Lowenstein, Jacob H.; Amato, George; Kolokotronis, Sergios-Orestis

    2009-01-01

    Background The use of DNA barcodes for the identification of described species is one of the least controversial and most promising applications of barcoding. There is no consensus, however, as to what constitutes an appropriate identification standard and most barcoding efforts simply attempt to pair a query sequence with reference sequences and deem identification successful if it falls within the bounds of some pre-established cutoffs using genetic distance. Since the Renaissance, however, most biological classification schemes have relied on the use of diagnostic characters to identify and place species. Methodology/Principal Findings Here we developed a cytochrome c oxidase subunit I character-based key for the identification of all tuna species of the genus Thunnus, and compared its performance with distance-based measures for identification of 68 samples of tuna sushi purchased from 31 restaurants in Manhattan (New York City) and Denver, Colorado. Both the character-based key and GenBank BLAST successfully identified 100% of the tuna samples, while the Barcode of Life Database (BOLD) as well as genetic distance thresholds, and neighbor-joining phylogenetic tree building performed poorly in terms of species identification. A piece of tuna sushi has the potential to be an endangered species, a fraud, or a health hazard. All three of these cases were uncovered in this study. Nineteen restaurant establishments were unable to clarify or misrepresented what species they sold. Five out of nine samples sold as a variant of “white tuna” were not albacore (T. alalunga), but escolar (Lepidocybium flavobrunneum), a gempylid species banned for sale in Italy and Japan due to health concerns. Nineteen samples were northern bluefin tuna (T. thynnus) or the critically endangered southern bluefin tuna (T. maccoyii), though nine restaurants that sold these species did not state these species on their menus. Conclusions/Significance The Convention on International Trade
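The character-based identification contrasted with distance thresholds above can be sketched as a lookup of diagnostic states at fixed alignment positions. The positions and nucleotide states below are invented for illustration; they are not the real Thunnus COI diagnostics.

```python
# Toy character-based key: a species is called only when all of its
# diagnostic positions match, unlike distance-threshold identification.
# Positions/states are hypothetical, not the paper's actual diagnostics.
DIAGNOSTICS = {
    "T. alalunga": {12: "A", 57: "T"},
    "T. thynnus":  {12: "G", 57: "T"},
    "T. maccoyii": {12: "G", 57: "C"},
}

def identify(seq):
    """Return the unique species whose diagnostic states all match, else None."""
    hits = [sp for sp, diag in DIAGNOSTICS.items()
            if all(len(seq) > pos and seq[pos] == base
                   for pos, base in diag.items())]
    return hits[0] if len(hits) == 1 else None

query = ["N"] * 100
query[12], query[57] = "G", "C"
print(identify("".join(query)))  # T. maccoyii
```

A character key gives a categorical yes/no per species, so it degrades differently from distance measures when reference sampling is sparse.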

  3. The real maccoyii: identifying tuna sushi with DNA barcodes--contrasting characteristic attributes and genetic distances.

    Directory of Open Access Journals (Sweden)

    Jacob H Lowenstein

    Full Text Available BACKGROUND: The use of DNA barcodes for the identification of described species is one of the least controversial and most promising applications of barcoding. There is no consensus, however, as to what constitutes an appropriate identification standard and most barcoding efforts simply attempt to pair a query sequence with reference sequences and deem identification successful if it falls within the bounds of some pre-established cutoffs using genetic distance. Since the Renaissance, however, most biological classification schemes have relied on the use of diagnostic characters to identify and place species. METHODOLOGY/PRINCIPAL FINDINGS: Here we developed a cytochrome c oxidase subunit I character-based key for the identification of all tuna species of the genus Thunnus, and compared its performance with distance-based measures for identification of 68 samples of tuna sushi purchased from 31 restaurants in Manhattan (New York City) and Denver, Colorado. Both the character-based key and GenBank BLAST successfully identified 100% of the tuna samples, while the Barcode of Life Database (BOLD) as well as genetic distance thresholds, and neighbor-joining phylogenetic tree building performed poorly in terms of species identification. A piece of tuna sushi has the potential to be an endangered species, a fraud, or a health hazard. All three of these cases were uncovered in this study. Nineteen restaurant establishments were unable to clarify or misrepresented what species they sold. Five out of nine samples sold as a variant of "white tuna" were not albacore (T. alalunga), but escolar (Lepidocybium flavobrunneum), a gempylid species banned for sale in Italy and Japan due to health concerns. Nineteen samples were northern bluefin tuna (T. thynnus) or the critically endangered southern bluefin tuna (T. maccoyii), though nine restaurants that sold these species did not state these species on their menus. CONCLUSIONS/SIGNIFICANCE: The Convention on

  4. Evaluating the Relationship between FRET Changes and Distance Changes Using DNA Length and Restriction Enzyme Specificity

    Science.gov (United States)

    Pazhani, Yogitha; Horn, Abigail E.; Grado, Lizbeth; Kugel, Jennifer F.

    2016-01-01

    FRET (Förster resonance energy transfer) involves the transfer of energy from an excited donor fluorophore to an acceptor molecule in a manner that is dependent on the distance between the two. A biochemistry laboratory experiment is described that teaches students how to use FRET to evaluate distance changes in biological molecules. Students…
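The distance-from-efficiency calculation at the heart of such a lab exercise is a direct inversion of the FRET relation. The Förster radius below is a generic assumed value, not one specified in the abstract.

```python
# Inverting E = 1/(1 + (R/R0)^6) to estimate donor-acceptor distance from a
# measured transfer efficiency; R0 = 55 A is an assumed Forster radius.
def efficiency(r, r0):
    return 1.0 / (1.0 + (r / r0) ** 6)

def distance_from_efficiency(e, r0):
    # E = 1/(1 + (R/R0)^6)  =>  R = R0 * ((1 - E)/E)^(1/6)
    return r0 * ((1.0 - e) / e) ** (1.0 / 6.0)

r0 = 55.0  # angstrom, assumed
for e in (0.9, 0.5, 0.1):
    print(f"E={e:.1f} -> R={distance_from_efficiency(e, r0):.1f} A")
# E = 0.5 recovers R = R0 exactly; round-trip consistency check:
assert abs(efficiency(distance_from_efficiency(0.3, r0), r0) - 0.3) < 1e-9
```

The sixth-power dependence means FRET is most sensitive near R0 and nearly blind far from it, which is why probes are chosen so expected distances bracket R0.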

  5. When Maxwellian demon meets action at a distance. Comment on "Disentangling DNA molecules" by Alexander Vologodskii

    Science.gov (United States)

    Rybenkov, Valentin V.

    2016-09-01

    The ability of living systems to defy thermodynamics without explicitly violating it is a continued source of inspiration to many biophysicists. The story of type-2 DNA topoisomerases is a beautiful example from that book. DNA topoisomerases catalyze a concerted DNA cleavage-religation reaction, which is interjected by a strand passage event. This sequence of events results in a seemingly unhindered transfer of one piece of DNA through another upon their random collision. An obvious consequence of such transfer is a change in the topological state of the colliding DNAs; hence the name of the enzymes, topoisomerases. There are several classes of topoisomerases, which differ in how they capture the cleaved and transported DNA segments (which are often referred to as the gate and transfer segments; or the G- and T-segments, to be short). Type-2 topoisomerases have two cleavage-religation centers. They open a gate in double stranded DNA and transfer another piece of double stranded DNA through it [1]. And in doing so, they manage to collect information about the rest of the DNA and perform strand passage in a directional manner so as to take the molecule away from the thermodynamic equilibrium [2].

  6. iDNA-Prot|dis: identifying DNA-binding proteins by incorporating amino acid distance-pairs and reduced alphabet profile into the general pseudo amino acid composition.

    Directory of Open Access Journals (Sweden)

    Bin Liu

    Full Text Available Playing crucial roles in various cellular processes, such as recognition of specific nucleotide sequences, regulation of transcription, and regulation of gene expression, DNA-binding proteins are essential ingredients of both eukaryotic and prokaryotic proteomes. With the avalanche of protein sequences generated in the postgenomic age, it is a critical challenge to develop automated methods for accurately and rapidly identifying DNA-binding proteins based on their sequence information alone. Here, a novel predictor, called "iDNA-Prot|dis", was established by incorporating the amino acid distance-pair coupling information and the amino acid reduced alphabet profile into the general pseudo amino acid composition (PseAAC) vector. The former can capture the characteristics of DNA-binding proteins so as to enhance its prediction quality, while the latter can reduce the dimension of the PseAAC vector so as to speed up its prediction process. Rigorous jackknife and independent dataset tests showed that the new predictor outperformed the existing predictors for the same purpose. As a user-friendly web-server, iDNA-Prot|dis is accessible to the public at http://bioinformatics.hitsz.edu.cn/iDNA-Prot_dis/. Moreover, for the convenience of the vast majority of experimental scientists, a step-by-step protocol guide is provided on how to use the web-server to get their desired results without the need to follow the complicated mathematical equations that are presented in this paper just for the integrity of its developing process. It is anticipated that the iDNA-Prot|dis predictor may become a useful high-throughput tool for large-scale analysis of DNA-binding proteins, or at the very least, play a complementary role to the existing predictors in this regard.
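The kind of encoding iDNA-Prot|dis builds on, distance-pair composition over a reduced amino acid alphabet, can be sketched as follows. The four-group reduction and gap range here are simplified assumptions, not the paper's exact scheme.

```python
# Sketch of distance-pair features over a reduced amino acid alphabet:
# normalized counts of residue-group pairs separated by 1..max_gap positions.
# The 4-group reduction (hydrophobic/polar/positive/negative) is assumed.
from itertools import product

GROUPS = {**{a: "h" for a in "AVLIMFWPGC"},   # hydrophobic
          **{a: "p" for a in "STYNQH"},       # polar
          **{a: "+" for a in "KR"},           # positively charged
          **{a: "-" for a in "DE"}}           # negatively charged

def distance_pair_features(seq, max_gap=2):
    """Normalized counts of reduced-letter pairs at each gap 1..max_gap."""
    reduced = [GROUPS[a] for a in seq]
    feats = {}
    for gap in range(1, max_gap + 1):
        pairs = list(zip(reduced, reduced[gap:]))
        n = len(pairs)
        for x, y in product("hp+-", repeat=2):
            feats[f"{x}{y}|gap{gap}"] = pairs.count((x, y)) / n
    return feats

f = distance_pair_features("MKKRDEAV")
print(f["++|gap1"])  # fraction of adjacent positive-positive pairs
```

The reduction shrinks the pair alphabet from 20x20 to 4x4 per gap, which is exactly the dimension-vs-information trade-off the abstract describes.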

  7. Probing Nucleosome Stability with a DNA Origami Nanocaliper.

    Science.gov (United States)

    Le, Jenny V; Luo, Yi; Darcy, Michael A; Lucas, Christopher R; Goodwin, Michelle F; Poirier, Michael G; Castro, Carlos E

    2016-07-26

    The organization of eukaryotic DNA into nucleosomes and chromatin undergoes dynamic structural changes to regulate genome processing, including transcription and DNA repair. Critical chromatin rearrangements occur over a wide range of distances, including the mesoscopic length scale of tens of nanometers. However, there is a lack of methodologies that probe changes over this mesoscopic length scale within chromatin. We have designed, constructed, and implemented a DNA-based nanocaliper that probes this mesoscopic length scale. We developed an approach of integrating nucleosomes into our nanocaliper at two attachment points with over 50% efficiency. Here, we focused on attaching the two DNA ends of the nucleosome to the ends of the two nanocaliper arms, so the hinge angle is a readout of the nucleosome end-to-end distance. We demonstrate that nucleosomes integrated with 6, 26, and 51 bp linker DNA are partially unwrapped by the nanocaliper by an amount consistent with previously observed structural transitions. In contrast, the nucleosomes integrated with the longer 75 bp linker DNA remain fully wrapped. We found that the nanocaliper angle is a sensitive measure of nucleosome disassembly and can read out transcription factor (TF) binding to its target site within the nucleosome. Interestingly, the nanocaliper not only detects TF binding but also significantly increases the probability of TF occupancy at its site by partially unwrapping the nucleosome. These studies demonstrate the feasibility of using DNA nanotechnology to both detect and manipulate nucleosome structure, which provides a foundation of future mesoscale studies of nucleosome and chromatin structural dynamics.
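The hinge-angle readout described above follows simple triangle geometry: with the nucleosome's two DNA ends attached at the tips of two hinged arms, the angle is fixed by the end-to-end distance via the law of cosines. The equal 50 nm arm length below is illustrative, not the device's published dimension.

```python
# Geometry sketch of a hinged nanocaliper: the angle between two equal rigid
# arms whose tips are bridged by a molecule of given end-to-end distance.
# Arm length is an assumed illustrative value.
import math

def hinge_angle_deg(end_to_end_nm, arm_nm=50.0):
    """Law of cosines: d^2 = 2*L^2 - 2*L^2*cos(theta)."""
    cos_t = (2 * arm_nm**2 - end_to_end_nm**2) / (2 * arm_nm**2)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

# A fully wrapped nucleosome (short end-to-end distance) closes the caliper;
# partial unwrapping (longer distance) opens it.
for d in (10.0, 25.0, 50.0):
    print(f"end-to-end {d:4.1f} nm -> hinge angle {hinge_angle_deg(d):5.1f} deg")
```

Because the angle grows monotonically with the spanned distance, measuring the angle distribution (e.g. by TEM) reads out the nucleosome end-to-end distance distribution.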

  8. Historical DNA documents long-distance natal homing in marine fish.

    Science.gov (United States)

    Bonanomi, Sara; Overgaard Therkildsen, Nina; Retzel, Anja; Berg Hedeholm, Rasmus; Pedersen, Martin Waever; Meldrup, Dorte; Pampoulie, Christophe; Hemmer-Hansen, Jakob; Grønkjaer, Peter; Nielsen, Einar Eg

    2016-06-01

    The occurrence of natal homing in marine fish remains a fundamental question in fish ecology as its unequivocal demonstration requires tracking of individuals from fertilization to reproduction. Here, we provide evidence of long-distance natal homing (>1000 km) over more than 60 years in Atlantic cod (Gadus morhua), through genetic analysis of archived samples from marked and recaptured individuals. Using a high differentiation single-nucleotide polymorphism assay, we demonstrate that the vast majority of cod tagged in West Greenland and recaptured on Icelandic spawning grounds belonged to the Iceland offshore population, strongly supporting a hypothesis of homing. The high degree of natal fidelity observed provides the evolutionary settings for development of locally adapted populations in marine fish and emphasize the need to consider portfolio effects in marine fisheries management strategies.

  9. CMDS System Integration and IAMD End-to-End Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Cruise Missile Defense Systems (CMDS) Project Office is establishing a secure System Integration Laboratory at the AMRDEC. This lab will contain tactical Signal...

  10. End to End Inter-domain Quality of Service Provisioning

    DEFF Research Database (Denmark)

    Brewka, Lukasz Jerzy

    This thesis addresses selected topics of Quality of Service (QoS) provisioning in heterogeneous data networks that construct the communication environment of today's Internet. In the vast range of protocols available in different domains of network infrastructures, a few chosen ones are discussed......, the general UPnP-QoS performance was assessed analytically and confirmed by simulation results. The results validate the usability of UPnP-QoS, but some open issues in the specification were identified. To address the mentioned shortcomings of UPnP-QoS, a few pre-emption algorithms for home gateways...... were designed and compared. As for the general UPnP-QoS assessment, analysis and intensive simulations were used to verify the proposed pre-emption techniques. The other proposed extension for UPnP-QoS was an integration of traffic auto-classification within the UPnP-QoS architecture. Simulation

  11. Study on End-to-End Web Performance

    Institute of Scientific and Technical Information of China (English)

    GAO Ke-li; DAI Li-zhong

    2004-01-01

    While many papers discuss one or more aspects of web performance, few treat web performance as a whole. This paper discusses the aspects that influence web performance and currently known web techniques. In addition, we discuss general methods of web performance measurement and explain the discrepancies between our results and those of others. Finally, we analyze the bottlenecks of the web and propose possible solutions.

  12. End-to-end experiment management in HPC

    Energy Technology Data Exchange (ETDEWEB)

    Bent, John M [Los Alamos National Laboratory; Kroiss, Ryan R [Los Alamos National Laboratory; Torrez, Alfred [Los Alamos National Laboratory; Wingate, Meghan [Los Alamos National Laboratory

    2010-01-01

    Experiment management in any domain is challenging. There is a perpetual feedback loop cycling through planning, execution, measurement, and analysis. The lifetime of a particular experiment can be limited to a single cycle, although many require myriad more cycles before definitive results can be obtained. Within each cycle, a large number of subexperiments may be executed in order to measure the effects of one or more independent variables. Experiment management in high performance computing (HPC) follows this general pattern but also has three unique characteristics. One, computational science applications running on large supercomputers must deal with frequent platform failures which can interrupt, perturb, or terminate running experiments. Two, these applications typically run in parallel using MPI as their communication medium. Three, there is typically a scheduling system (e.g. Condor, Moab, SGE, etc.) acting as a gate-keeper for the HPC resources. In this paper, we introduce LANL Experiment Management (LEM), an experiment management framework simplifying all four phases of experiment management. LEM simplifies experiment planning by allowing the user to describe their experimental goals without having to fully construct the individual parameters for each task. To simplify execution, LEM dispatches the subexperiments itself, thereby freeing the user from remembering the often arcane methods for interacting with the various scheduling systems. LEM provides transducers for experiments that automatically measure and record important information about each subexperiment; these transducers can easily be extended to collect additional measurements specific to each experiment. Finally, experiment analysis is simplified by providing a general database visualization framework that allows users to quickly and easily interact with their measured data.

  13. A PDTB-Styled End-to-End Discourse Parser

    CERN Document Server

    Lin, Ziheng; Kan, Min-Yen

    2010-01-01

    We have developed a full discourse parser in the Penn Discourse Treebank (PDTB) style. Our trained parser first identifies all discourse and non-discourse relations, locates and labels their arguments, and then classifies their relation types. When appropriate, the attribution spans to these relations are also determined. We present a comprehensive evaluation from both component-wise and error-cascading perspectives.

  14. VT Linear Referencing System - End-to-End 2014

    Data.gov (United States)

    Vermont Center for Geographic Information — LRS2014 is a Linear Referencing System layer that includes interstate, U.S., state (VT), and other transportation routes logged by the Vermont Agency of...

  15. End to End Beam Dynamics of the ESS Linac

    DEFF Research Database (Denmark)

    Thomsen, Heine Dølrath

    2012-01-01

    The European Spallation Source, ESS, uses a linear accelerator to deliver a high intensity proton beam to the target station. The nominal beam power on target will be 5 MW at an energy of 2.5 GeV. We briefly describe the individual accelerating structures and transport lines through which we have...

  16. On end-to-end safety for mobile COTS devices

    Directory of Open Access Journals (Sweden)

    Markus Kucera

    2015-09-01

    Full Text Available Today, ubiquitous mobile devices have not only arrived but have entered the safety-critical domain, where systems are controlled that put human health or even human life at risk. For example, in automation systems first ideas are surfacing to control parts of the system via a COTS smartphone. Another example is the idea of controlling the autonomous parking function of a car via a COTS smartphone. As beneficial and convenient as these ideas seem at first thought, on second thought the dangers of these approaches become obvious. Especially in case of failures, the system's safety has to be maintained. The open question is how to achieve this mandatory requirement with COTS components, e.g. smartphones, that are not developed following the development process necessary for safety-critical systems. This paper presents a concept to reliably detect human interaction while activating safety-critical functions via COTS mobile devices, thus providing a means to detect erroneous activation requests for the safety-critical function.

  17. Testing DNA barcode performance in 1000 species of European lepidoptera: large geographic distances have small genetic impacts.

    Science.gov (United States)

    Huemer, Peter; Mutanen, Marko; Sefc, Kristina M; Hebert, Paul D N

    2014-01-01

    This study examines the performance of DNA barcodes (mt cytochrome c oxidase 1 gene) in the identification of 1004 species of Lepidoptera shared by two localities (Finland, Austria) that are 1600 km apart. Maximum intraspecific distances for the pooled data were less than 2% for 880 species (87.6%), while deeper divergence was detected in 124 species. Despite such variation, the overall DNA barcode library possessed diagnostic COI sequences for 98.8% of the taxa. Because a reference library based on Finnish specimens was highly effective in identifying specimens from Austria, we conclude that barcode libraries based on regional sampling can often be effective for a much larger area. Moreover, dispersal ability (poor, good) and distribution patterns (disjunct, fragmented, continuous, migratory) had little impact on levels of intraspecific geographic divergence. Furthermore, the present study revealed that, despite the intensity of past taxonomic work on European Lepidoptera, nearly 20% of the species shared by Austria and Finland require further work to clarify their status. Particularly discordant BIN (Barcode Index Number) cases should be checked to ascertain possible explanatory factors such as incorrect taxonomy, hybridization, introgression, and Wolbachia infections.
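
The distance-based identification criterion described above (flagging species whose maximum intraspecific divergence exceeds 2%) can be sketched in a few lines. The choice of an uncorrected p-distance and the toy 10-bp sequences below are illustrative assumptions, not details of the study:

```python
from itertools import combinations

def p_distance(a, b):
    """Uncorrected pairwise distance: fraction of differing sites
    over compared (non-gap) positions of two aligned sequences."""
    pairs = [(x, y) for x, y in zip(a, b) if x != '-' and y != '-']
    diffs = sum(1 for x, y in pairs if x != y)
    return diffs / len(pairs)

def max_intraspecific(records, threshold=0.02):
    """records: list of (species, aligned COI sequence) pairs.
    Returns species whose maximum intraspecific p-distance exceeds
    the threshold (deep-divergence candidates needing further work)."""
    by_species = {}
    for sp, seq in records:
        by_species.setdefault(sp, []).append(seq)
    flagged = {}
    for sp, seqs in by_species.items():
        if len(seqs) < 2:
            continue
        d = max(p_distance(a, b) for a, b in combinations(seqs, 2))
        if d > threshold:
            flagged[sp] = d
    return flagged

# Toy example with hypothetical 10-bp barcodes
records = [("sp_A", "ACGTACGTAC"), ("sp_A", "ACGTACGTAT"),
           ("sp_B", "ACGTACGTAC"), ("sp_B", "TTGTACGTAC")]
print(max_intraspecific(records))
```

With real data the sequences would be full-length COI barcodes and the per-species lists would come from the reference library.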

  18. Flexibility of short DNA helices under mechanical stretching

    CERN Document Server

    Zoli, Marco

    2016-01-01

    The flexibility of short DNA fragments is studied by a Hamiltonian model which treats the inter-strand and intra-strand forces at the level of the base pair. The elastic response of a set of homogeneous helices to externally applied forces is obtained by computing the average bending angles between adjacent base pairs along the molecule axis. The ensemble averages are performed over a room-temperature equilibrium distribution of base pair separations and bending fluctuations. The analysis of the end-to-end distances and persistence lengths shows that even short sequences with fewer than 100 base pairs maintain a significant bendability, ascribed to thermal fluctuational effects and kinks with large bending angles. The discrepancies between the outcomes of the discrete model and those of the worm-like chain model are examined, pointing out the inadequacy of the latter on short length scales.
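
For reference, the worm-like chain benchmark that the discrete model is tested against has a closed-form mean-squared end-to-end distance (the Kratky-Porod result). A minimal sketch, assuming the textbook values lp ≈ 50 nm and a 0.34 nm rise per base pair (assumptions here, not parameters from this paper):

```python
import math

def wlc_r2(L, lp):
    """Mean-squared end-to-end distance of a worm-like chain with
    contour length L and persistence length lp (Kratky-Porod):
    <R^2> = 2*lp*L * (1 - (lp/L)*(1 - exp(-L/lp)))."""
    x = L / lp
    return 2 * lp * L * (1 - (1 - math.exp(-x)) / x)

# dsDNA: lp ~ 50 nm (~150 bp), rise ~0.34 nm per bp (textbook values).
lp = 50.0
for n_bp in (50, 100, 500, 5000):
    L = 0.34 * n_bp
    r = math.sqrt(wlc_r2(L, lp))
    print(f"{n_bp:5d} bp: L = {L:7.1f} nm, sqrt(<R^2>) = {r:6.1f} nm "
          f"({r / L:.2f} of contour length)")
```

Short fragments (L << lp) come out nearly rod-like, sqrt(&lt;R^2&gt;) ≈ L, while long chains approach the Gaussian-coil limit &lt;R^2&gt; ≈ 2·lp·L; the paper's point is that real short helices are more bendable than this formula predicts.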

  19. Application of the Carnation-22 photon therapeutic apparatus in the postoperative nursing of patients with end-to-end anastomosis of the urethra

    Institute of Scientific and Technical Information of China (English)

    陈斌; 阮琦; 张丽

    2016-01-01

    Objective To observe the nursing effect of the Carnation-22 photon therapeutic apparatus in patients with end-to-end anastomosis of the urethra. Methods From October 2014 to October 2015, 200 patients with urethral stricture who underwent end-to-end anastomosis of the urethra were randomly selected and divided into two groups according to the admission order. The 100 patients in the control group were given conventional nursing after the operation; the 100 patients in the observation group were additionally treated with the Carnation-22 photon therapeutic apparatus. The swelling and exudation of the perineal incision, results of midstream urine culture, patient comfort and patient satisfaction were compared between the two groups. Results The healing rate in the observation group (90.0%) was higher than that in the control group (76.0%); the positive rate of midstream urine culture in the observation group (11.0%) was lower than that in the control group (23.0%); the proportion of patients reporting pain in the observation group (38.0%) was lower than that in the control group (78.0%); and the percentage of patients with satisfaction scores >90 in the observation group (95.0%) was higher than that in the control group (78.0%); the differences were statistically significant (P<0.05). Conclusion After end-to-end anastomosis of the urethra, irradiation with the photon therapeutic apparatus promotes wound healing, reduces the infection rate and shortens the hospital stay, with no obvious adverse reactions and good patient compliance.

  20. Evaluation of severe stenosis of end-to-end and end-to-side anastomoses of the transplant renal artery with color Doppler sonography

    Institute of Scientific and Technical Information of China (English)

    李建初; Robert J Min; Amelia Ng; David Trost; Michael Goldstein; Sandip Kupur; John Wang; David Serur; 姜玉新; 高敬; 张丽娜; 戴睛; 孟华; 蔡胜; 吕珂; 孝梦甦; 张一休

    2008-01-01

    Objective To investigate differences in Doppler parameters between severe transplant renal artery stenosis (TRAS, arterial lumen reduction ≥80%) with end-to-end anastomosis and that with end-to-side anastomosis. Methods Color Doppler sonography (CDS) and digital subtraction angiography (DSA) images were reviewed retrospectively in 38 patients with severe TRAS (19 cases with end-to-end anastomosis and 19 cases with end-to-side anastomosis). All 38 cases were initially diagnosed with CDS and confirmed by DSA afterwards. Doppler parameters, including the peak systolic velocity (PSV) in the renal, iliac, anastomotic and segmental or interlobar arteries, the pre-PSV ratio (the ratio of the PSV at the stenotic site to that in the iliac artery) and the acceleration time (AT) in the intrarenal arteries, were measured or calculated. Results DSA demonstrated an arterial lumen reduction ≥80% in all patients; the stenosis was located in the iliac artery in 4 cases, at the anastomosis in 20 cases and in the transplant renal artery in 14 cases. The PSV at the stenosis, the PSV in the iliac artery and the pre-PSV ratio differed significantly between the two anastomosis types (P<0.05), whereas AT did not (P>0.05). Conclusion Haemodynamic differences between the renal arteries of the two anastomosis types are most likely the main cause of the differences in stenotic PSV and pre-PSV ratio between them. To improve the diagnostic accuracy for severe TRAS, diagnostic thresholds for the pre-PSV ratio should be established according to the anastomosis type, whereas a single AT diagnostic threshold is probably suitable for both.

  1. Exploring an Effective Design of Secure Dynamic Source Routing (DSR) Protocol Using Request Sequence Number and End-to-End Acknowledgement Principle

    Institute of Scientific and Technical Information of China (English)

    王建平; 史浩山

    2011-01-01

    Aim. The introduction of the full paper analyzes a type of black-hole attack on route request (RREQ) packets. To avoid such attacks, it proposes a secure DSR protocol design, which is explained in sections 1 and 2. Section 1 explains that the sequence number of the RREQ packet is monotonically increasing. The core of section 2 consists of: (1) we perform the secure DSR routing protocol design using the sequence-number increment principle and the end-to-end acknowledgement principle, so that the route can effectively resist the black-hole attack; (2) to establish a credible routing-information list, we design the procedural steps for processing the nodes in the RREQ packet and the end-to-end acknowledgement packet, illustrated by the block diagram shown in Fig. 2. To validate the effectiveness against the black-hole attack, section 3 simulates the DSR protocol design obtained with our method; the simulation results, given in Figs. 6 and 7, and their analysis show preliminarily that: (1) the secure DSR protocol design obtained with our method can effectively resist the black-hole attack on RREQ packets and guarantee secure routing information without expending many resources; (2) compared with the conventional DSR protocol design, our design has a higher packet delivery ratio and a smaller average delay.

  2. Representing distance, consuming distance

    DEFF Research Database (Denmark)

    Larsen, Gunvor Riber

    to mobility and its social context. Such an understanding can be approached through representations, as distance is being represented in various ways, most noticeably in maps and through the notions of space and Otherness. The question this talk subsequently asks is whether these representations of distance...... are being consumed in the contemporary society, in the same way as places, media, cultures and status are being consumed (Urry 1995, Featherstone 2007). An exploration of distance and its representations through contemporary consumption theory could expose what role distance plays in forming...... are present in theoretical and empirical elaborations on mobility, but these remain largely implicit and unchallenged (Bauman 1998). This talk will endeavour to unmask distance as a theoretical entity by exploring ways in which distance can be understood and by discussing distance through its representations...

  3. Patchiness of ion-exchanged mica revealed by DNA binding dynamics at short length scales

    Science.gov (United States)

    Billingsley, D. J.; Lee, A. J.; Johansson, N. A. B.; Walton, A.; Stanger, L.; Crampton, N.; Bonass, W. A.; Thomson, N. H.

    2014-01-01

    The binding of double-stranded (ds) DNA to mica can be controlled through ion-exchanging the mica with divalent cations. Measurements of the end-to-end distance of linear DNA molecules discriminate whether the binding mechanism occurs through 2D surface equilibration or kinetic trapping. A range of linear dsDNA fragments have been used to investigate length dependences of binding. Mica, ion-exchanged with Ni(II) usually gives rise to kinetically trapped DNA molecules, however, short linear fragments (ion-exchanged mica is heterogeneous, and contains patches or domains, separating different ionic species. These results correlate with imaging of dsDNA under aqueous buffer on Ni(II)-mica and indicate that binding domains are of the order of 100 nm in diameter. Shorter DNA fragments behave intermediate to the two extreme cases of 2D equilibration and kinetic trapping. Increasing the incubation time of Ni(II) on mica, from minutes to hours, brings the conformations of the shorter DNA fragments closer to the theoretical value for kinetic trapping, indicating that long timescale kinetics play a role in ion-exchange. X-ray photoelectron spectroscopy (XPS) was used to confirm that the relative abundance of Ni(II) ions on the mica surface increases with time. These findings can be used to enhance spatial control of binding of DNA to inorganic surfaces with a view to patterning high densities arrays.
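
Discriminating 2D surface equilibration from kinetic trapping via measured end-to-end distances is commonly done with two closed-form expectations (a Rivetti-style analysis). The functional forms and lp = 50 nm below are standard-literature assumptions, not values from this abstract:

```python
import math

def r2_2d_equilibrated(L, lp):
    """<R^2> for a worm-like chain fully equilibrated in 2D on the
    surface (Rivetti et al.-style expression)."""
    x = L / (2 * lp)
    return 4 * lp * L * (1 - (1 - math.exp(-x)) / x)

def r2_3d_projected(L, lp):
    """<R^2> for a 3D worm-like chain conformation projected onto the
    surface (kinetic-trapping limit): 2/3 of the 3D Kratky-Porod value."""
    x = L / lp
    return (2.0 / 3.0) * 2 * lp * L * (1 - (1 - math.exp(-x)) / x)

def classify(r2_measured, L, lp=50.0):
    """Assign a measured mean-squared end-to-end distance to whichever
    binding regime it lies closer to."""
    d_eq = abs(r2_measured - r2_2d_equilibrated(L, lp))
    d_tr = abs(r2_measured - r2_3d_projected(L, lp))
    return "2D-equilibrated" if d_eq < d_tr else "kinetically trapped"

L = 0.34 * 1000  # 1000-bp fragment, ~340 nm contour length
print(r2_2d_equilibrated(L, 50.0) > r2_3d_projected(L, 50.0))  # 2D value is larger
```

Equilibrated molecules relax into larger end-to-end distances than trapped ones, which is why the measured &lt;R^2&gt; of imaged fragments discriminates the two binding mechanisms.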

  4. An optimised protocol to isolate high-quality genomic DNA from seed tissues streamlines the workflow to obtain direct estimates of seed dispersal distances in gymnosperms.

    Science.gov (United States)

    García, C; Escribano-Ávila, G

    2016-05-01

    Genotyping of maternally derived seed tissues from georeferenced seeds that moved away from their source tree yields direct estimates of seed dispersal distances when the location and the genotype of the fruiting tree are available. These estimates are instrumental in forecasting the response of plant communities to drivers of global change, such as fragmentation or the expansion of invasive species. Obtaining robust assessments of seed dispersal distances requires comparing reliable multilocus genotypes of maternally derived seed tissues and fruiting trees, as previously shown for angiosperm species. However, robust estimates of seed dispersal distances based on direct methods are rare in non-model gymnosperms due to the difficulty in isolating high-quality DNA from inconspicuous maternally derived seed tissues. These tissues tend to yield low DNA quantities that increase the frequency of genotyping errors. Here, we deliver a step-by-step visual protocol used to identify and isolate different seed tissues of interest for dispersal studies: embryos (2n, bi-parentally derived), seed coats (2n, maternally derived), and megagametophytes (n, maternally derived). We also provide an optimised lab protocol used to obtain multilocus genotypes from the target seed tissue. These broadly applicable protocols proved successful both in avoiding contamination among different seed tissues and in providing reliable multilocus genotypes.

  5. The potential of distance-based thresholds and character-based DNA barcoding for defining problematic taxonomic entities by CO1 and ND1.

    Science.gov (United States)

    Bergmann, T; Rach, J; Damm, S; Desalle, R; Schierwater, B; Hadrys, H

    2013-11-01

    The mitochondrial CO1 gene (cytochrome c oxidase I) is a widely accepted metazoan barcode region. In insects, the mitochondrial NADH dehydrogenase subunit 1 (ND1) gene region has proved to be another suitable marker especially for the identification of lower level taxonomic entities such as populations and sister species. To evaluate the potential of distance-based thresholds and character-based DNA barcoding for the identification of problematic species-rich taxa, both markers, CO1 and ND1, were used as test parameters in odonates. We sequenced and compared gene fragments of CO1 and ND1 for 271 odonate individuals representing 51 species, 22 genera and eight families. Our data suggests that (i) the combination of the CO1 and ND1 fragment forms a better identifier than a single region alone; and (ii) the character-based approach provides higher resolution than the distance-based method in Odonata especially in closely related taxonomic entities.

  6. Comparison of two nonabsorbable suture materials in end-to-end tracheal anastomosis in dogs

    Directory of Open Access Journals (Sweden)

    Sheila Canevese Rahal

    1995-01-01

    Full Text Available Twelve mongrel dogs, aged 1 to 6 years and weighing 6 to 20 kg, were submitted to tracheal resection and end-to-end anastomosis, in which braided non-capillary polyester and monofilament nylon sutures were tested. Six animals, three for each suture material, underwent excision equivalent to three tracheal rings. After 15 days, a new intervention resected the equivalent of six more rings, for a total of nine; at the end of another 15 days the animals were sacrificed. The other six animals, three for each suture material, underwent excision equivalent to three tracheal rings and were maintained for 43 days. The tracheal anastomoses were evaluated by clinical, radiographic, macroscopic and histopathologic examination. Monofilament nylon produced less tissue reaction than braided non-capillary polyester and provided a secure anastomosis with a lower risk of granuloma formation.

  7. Single-molecule imaging of DNA pairing by RecA reveals a three-dimensional homology search.

    Science.gov (United States)

    Forget, Anthony L; Kowalczykowski, Stephen C

    2012-02-08

    DNA breaks can be repaired with high fidelity by homologous recombination. A ubiquitous protein that is essential for this DNA template-directed repair is RecA. After resection of broken DNA to produce single-stranded DNA (ssDNA), RecA assembles on this ssDNA into a filament with the unique capacity to search and find DNA sequences in double-stranded DNA (dsDNA) that are homologous to the ssDNA. This homology search is vital to recombinational DNA repair, and results in homologous pairing and exchange of DNA strands. Homologous pairing involves DNA sequence-specific target location by the RecA-ssDNA complex. Despite decades of study, the mechanism of this enigmatic search process remains unknown. RecA is a DNA-dependent ATPase, but ATP hydrolysis is not required for DNA pairing and strand exchange, eliminating active search processes. Using dual optical trapping to manipulate DNA, and single-molecule fluorescence microscopy to image DNA pairing, we demonstrate that both the three-dimensional conformational state of the dsDNA target and the length of the homologous RecA-ssDNA filament have important roles in the homology search. We discovered that as the end-to-end distance of the target dsDNA molecule is increased, constraining the available three-dimensional (3D) conformations of the molecule, the rate of homologous pairing decreases. Conversely, when the length of the ssDNA in the nucleoprotein filament is increased, homology is found faster. We propose a model for the DNA homology search process termed 'intersegmental contact sampling', in which the intrinsic multivalent nature of the RecA nucleoprotein filament is used to search DNA sequence space within 3D domains of DNA, exploiting multiple weak contacts to rapidly search for homology. Our findings highlight the importance of the 3D conformational dynamics of DNA, reveal a previously unknown facet of the homology search, and provide insight into the mechanism of DNA target location by this member of a

  8. Long-distance entanglement and quantum communication in coupled cavity arrays

    CERN Document Server

    Giampaolo, S M

    2009-01-01

    We introduce quantum spin models that allow for long-distance end-to-end entanglement and long-distance, high-fidelity teleportation, even at moderately high temperatures. We show how these models, that realize an optimal compromise between scalability and resilience to decoherence, can be implemented in simply engineered arrays of coupled optical cavities. We demonstrate how the latter can be used to realize a quasi-deterministic scheme of long-distance quantum communication with high success rate, without direct projection on Bell states and Bell measurements.

  9. Direct and Auger Electron-Induced, Single- and Double-Strand Breaks on Plasmid DNA Caused by 99mTc-Labeled Pyrene Derivatives and the Effect of Bonding Distance

    Science.gov (United States)

    Reissig, Falco; Mamat, Constantin; Steinbach, Joerg; Pietzsch, Hans-Juergen; Freudenberg, Robert; Navarro-Retamal, Carlos; Caballero, Julio; Kotzerke, Joerg; Wunderlich, Gerd

    2016-01-01

    It is evident that 99mTc causes radical-mediated DNA damage due to Auger electrons, which are emitted simultaneously with the known γ-emission of 99mTc. We have synthesized a series of new 99mTc-labeled pyrene derivatives with varied distances between the pyrene moiety and the radionuclide. The pyrene motif is a common DNA intercalator and allowed us to test the influence of the radionuclide distance on damage to the DNA helix. In general, pUC 19 plasmid DNA enables the investigation of the unprotected interactions between the radiotracers and DNA that result in single-strand breaks (SSB) or double-strand breaks (DSB). The resulting DNA fragments were separated by gel electrophoresis and quantified by fluorescent staining. Direct DNA damage and radical-induced indirect DNA damage by radiolysis products of water were evaluated in the presence or absence of the radical scavenger DMSO. We demonstrated that Auger electrons directly induced both SSB and DSB with high efficiency when 99mTc was tightly bound to the plasmid DNA, and this damage could not be completely prevented by DMSO, a free radical scavenger. For the first time, we were able to minimize this effect by increasing the carbon chain length between the pyrene moiety and the 99mTc nuclide. However, a critical distance between the 99mTc atom and the DNA helix could not be determined due to the significantly lowered DSB generation resulting from the interaction, which depends on the type of the 99mTc binding motif. The effect of variable DNA damage caused by the different chain lengths between the pyrene residue and the Tc core, as well as the possible conformations of the applied Tc complexes, was supplemented with molecular dynamics (MD) calculations. The effectiveness of the DNA-binding 99mTc-labeled pyrene derivatives was demonstrated by comparison to non-DNA-binding 99mTcO4–, since nearly all DNA damage caused by 99mTcO4– was prevented by incubating with DMSO. PMID:27583677

  10. Distance Learning

    Science.gov (United States)

    1997-12-01

    A study reviewing the existing Army Distance Learning Plan (ADLP) and current Distance Learning practices, with a focus on the Army's training and educational challenges and the benefits of applying Distance Learning techniques. The ASB study panel makes six specific recommendations, the most

  11. Use of Plasmon Coupling to Reveal the Dynamics of DNA Bending andCleavage by Single EcoRV Restriction Enzymes

    Energy Technology Data Exchange (ETDEWEB)

    Reinhard, Bjorn; Sheikholeslami, Sassan; Mastroianni, Alexander; Alivisatos, A. Paul; Liphardt, Jan

    2006-09-06

    Pairs of Au nanoparticles have recently been proposed as plasmon rulers based on the dependence of their light scattering on the interparticle distance. Preliminary work has suggested that plasmon rulers can be used to measure and monitor dynamic distance changes over the 1 to 100 nm length scale in biology. Here, we substantiate that plasmon rulers can be used to effectively measure dynamical biophysical processes by applying the ruler to a system that has been investigated extensively using ensemble kinetic measurements: the cleavage of DNA by the restriction enzyme EcoRV. Temporal resolutions of up to 240 Hz were obtained, and the end-to-end extension of up to 1000 individual dsDNA enzyme substrates could be monitored in parallel for hours. The single-molecule cleavage trajectories acquired here agree well with values obtained in bulk through other methods, and confirm well-known features of the cleavage process, such as the fact that the DNA is bent prior to cleavage. New dynamical information is revealed as well, for instance, the degree of softening of the DNA just prior to cleavage. The unlimited lifetime, high temporal resolution, and high signal/noise make the plasmon ruler an excellent tool for studying macromolecular assemblies and conformational changes at the single-molecule level.
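
A common way to turn a plasmon ruler reading into a distance is the empirical single-exponential "plasmon ruler equation" relating the fractional spectral shift to the interparticle gap. The sketch below assumes that form and constants (A ≈ 0.2, τ ≈ 0.23) reported for Au nanosphere pairs in the wider literature, not values from this study:

```python
import math

# Empirical plasmon ruler equation (assumed form):
#   fractional shift  dl/l0 ~ A * exp(-(s/D) / tau)
# with gap s, particle diameter D, and constants A ~ 0.2, tau ~ 0.23
# reported for Au nanosphere pairs. All three are assumptions here.
A, TAU = 0.2, 0.23

def gap_to_shift(s_nm, diameter_nm):
    """Predicted fractional plasmon resonance shift for gap s_nm."""
    return A * math.exp(-(s_nm / diameter_nm) / TAU)

def shift_to_gap(frac_shift, diameter_nm):
    """Invert the exponential decay to estimate the interparticle
    gap (nm) from a measured fractional plasmon shift."""
    return -TAU * diameter_nm * math.log(frac_shift / A)

d = 40.0                           # 40 nm particles (illustrative)
s = shift_to_gap(gap_to_shift(10.0, d), d)
print(round(s, 6))                 # round trip recovers the 10 nm gap
```

In an experiment like the EcoRV study, a time series of scattering spectra would be converted to a time series of gaps this way, turning spectral trajectories into end-to-end extension trajectories.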

  12. [Rapid construction of full-length MnSOD cDNA of chickens by one-step 3'RACE].

    Science.gov (United States)

    Bu, You-Quan; Luo, Xu-Gang; Liu, Bin; Li, Su-Fen

    2004-07-01

    RACE (rapid amplification of cDNA ends) is a popular technique for rapidly obtaining full-length cDNA. After obtaining 3' cDNA and 5' cDNA fragments with an overlapping region by 3' RACE and 5' RACE, the full-length cDNA can be generated by end-to-end PCR or subcloning. In this study, 3' RACE combined with touchdown PCR was successfully used for the rapid construction of the full-length MnSOD cDNA of chickens. Compared with conventional end-to-end PCR or subcloning, this method, called one-step 3' RACE, is fast, economical and highly specific. It is especially suited to the rapid construction of full-length cDNA by the RACE method.

  13. End-to-End Thiocyanato-Bridged Helical Chain Polymer and Dichlorido-Bridged Copper(II) Complexes with a Hydrazone Ligand: Synthesis, Characterisation by Electron Paramagnetic Resonance and Variable-Temperature Magnetic Studies, and Inhibitory Effects on Human Colorectal Carcinoma Cells.

    Science.gov (United States)

    Das, Kuheli; Datta, Amitabha; Sinha, Chittaranjan; Huang, Jui-Hsien; Garribba, Eugenio; Hsiao, Ching-Sheng; Hsu, Chin-Lin

    2012-04-01

    The reactions of the tridentate hydrazone ligand, N'-[1-(pyridin-2-yl)ethylidene]acetohydrazide (HL), obtained by condensation of 2-acetylpyridine with acetic hydrazide, with copper nitrate trihydrate in the presence of thiocyanate, or with CuCl2, produce two distinct coordination compounds, namely a one-dimensional helical coordination chain of [CuL(NCS)]n (1) units, and a doubly chlorido-bridged dinuclear complex [Cu2L2Cl2] (2) (where L=CH3C(O)=N-N=CCH3C5H4N). Single-crystal X-ray structural determination studies reveal that in complex 1, a deprotonated hydrazone ligand L(-) coordinates a copper(II) ion that is bridged to two neighbouring metal centres by SCN(-) anions, generating a one-dimensional helical coordination chain. In complex 2, two symmetry-related, adjacent copper(II) coordination entities are doubly chlorido-bridged, producing a dicopper entity with a Cu⋅⋅⋅Cu distance of 3.402(1) Å. The two coordination compounds have been fully characterised by elemental analysis, spectroscopic techniques including IR, UV-vis and electron paramagnetic resonance, and variable-temperature magnetic studies. The biological effects of 1 and 2 on the viability of human colorectal carcinoma cells (COLO-205 and HT-29) were evaluated using an MTT assay, and the results indicate that these complexes induce a decrease in cell-population growth of human colorectal carcinoma cells through apoptosis.

  14. Information Distance

    CERN Document Server

    Bennett, Charles H; Li, Ming; Vitanyi, Paul M B; Zurek, Wojciech H

    2010-01-01

    While Kolmogorov complexity is the accepted absolute measure of information content in an individual finite object, a similarly absolute notion is needed for the information distance between two individual objects, for example, two pictures. We give several natural definitions of a universal information metric, based on length of shortest programs for either ordinary computations or reversible (dissipationless) computations. It turns out that these definitions are equivalent up to an additive logarithmic term. We show that the information distance is a universal cognitive similarity distance. We investigate the maximal correlation of the shortest programs involved, the maximal uncorrelation of programs (a generalization of the Slepian-Wolf theorem of classical information theory), and the density properties of the discrete metric spaces induced by the information distances. A related distance measures the amount of nonreversibility of a computation. Using the physical theory of reversible computation, we give...
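
Because Kolmogorov complexity is uncomputable, the information distance sketched in this abstract is approximated in practice by the normalized compression distance (NCD), with a real compressor standing in for shortest-program length. A minimal sketch using zlib (the choice of compressor and test strings is illustrative):

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: a practical approximation of
    the normalized information distance, replacing Kolmogorov
    complexity K(.) with compressed length C(.):
        NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"the quick brown fox jumps over the lazy dog" * 20
b_ = b"colorless green ideas sleep furiously" * 20
print(ncd(a, a) < ncd(a, b_))  # identical objects are closer than unrelated ones
```

NCD values near 0 indicate the two objects share most of their information; values near 1 indicate little shared structure, mirroring the universal cognitive similarity distance the paper develops.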

  15. DNA

    Science.gov (United States)

    Stent, Gunther S.

    1970-01-01

    This history for molecular genetics and its explanation of DNA begins with an analysis of the Golden Jubilee essay papers, 1955. The paper ends stating that the higher nervous system is the one major frontier of biological inquiry which still offers some romance of research. (Author/VW)

  16. DBLAR: A DISTANCE-BASED LOCATION-AIDED ROUTING FOR MANET

    Institute of Scientific and Technical Information of China (English)

    Wang Kun; Wu Meng

    2009-01-01

    In location-aided routing for Mobile Ad hoc NETworks (MANET), node mobility and inaccurate location information may result in constant flooding, which reduces network performance. In this paper, a Distance-Based Location-Aided Routing (DBLAR) protocol for MANET is proposed. By tracking the location information of destination nodes and referring to distance changes between nodes to adjust route discovery dynamically, the proposed routing algorithm avoids flooding the whole network. In addition, a Distance Update Threshold (DUT) is set up to balance the timeliness of node location information against update overhead; meanwhile, detection of the relative distance vector adjusts the forwarding condition. Simulation results reveal that DBLAR outperforms LAR1 in terms of packet delivery ratio, average end-to-end delay and routing load, and that the settings of the DUT and the relative distance vector have a significant impact on the algorithm.
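
The two mechanisms described, distance-based forwarding and the Distance Update Threshold, can be sketched as follows. The function names, 2D geometry and the 50-unit DUT default are illustrative assumptions, not details taken from the paper:

```python
import math

def dist(a, b):
    """Euclidean distance between two 2D positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def should_forward(node_pos, prev_hop_pos, dest_pos):
    """Distance-based forwarding (sketch of the DBLAR idea): a node
    relays a route request only if it is closer to the destination's
    last known location than the previous hop was, so route discovery
    does not flood the whole network."""
    return dist(node_pos, dest_pos) < dist(prev_hop_pos, dest_pos)

def needs_location_update(last_advertised, current, dut=50.0):
    """Distance Update Threshold (DUT): a node re-advertises its
    location only after moving farther than DUT from the last
    advertised position, trading location accuracy against update
    overhead. The 50-unit default is illustrative."""
    return dist(last_advertised, current) > dut

print(should_forward((10, 0), (0, 0), (100, 0)))  # closer to dest -> forward
print(needs_location_update((0, 0), (30, 40)))    # moved exactly 50 -> not yet
```

A larger DUT reduces control traffic but lets the advertised location drift, which is exactly the trade-off the paper tunes.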

  17. Structural characteristics of oligomeric DNA strands adsorbed onto single-walled carbon nanotubes.

    Science.gov (United States)

    Roxbury, Daniel; Jagota, Anand; Mittal, Jeetain

    2013-01-10

    The single-stranded DNA to single-walled carbon nanotube (SWCNT) hybrid continues to attract significant interest as an exemplary biological molecule-nanomaterial conjugate. In addition to their many biomedical uses, such as in vivo sensing and delivery of molecular cargo, DNA-SWCNT hybrids enable the sorting of SWCNTs according to their chirality. Current experimental methods have fallen short of identifying the actual structural ensemble of DNA adsorbed onto SWCNTs that enables and controls several of these phenomena. Molecular dynamics (MD) simulation has been a useful tool for studying the structure of these hybrid molecules. In recent studies, using replica exchange MD (REMD) simulation we have shown that novel secondary structures emerge and that these structures are DNA-sequence and SWCNT-type dependent. Here, we use REMD to investigate in detail the structural characteristics of two DNA-SWCNT recognition pairs: (TAT)4-(6,5)-SWCNT, i.e., DNA sequence TATTATTATTAT bound to the (6,5) chirality SWCNT, and (CCG)2CC-(8,7)-SWCNT, as well as the off-recognition pairs (TAT)4-(8,7)-SWCNT and (CCG)2CC-(6,5)-SWCNT. From a structural clustering analysis, dominant equilibrium structures are identified and show a right-handed self-stitched motif for (TAT)4-(6,5) in contrast to a left-handed β-barrel for (CCG)2CC-(8,7). Additionally, characteristics such as DNA end-to-end distance, solvent-accessible SWCNT surface area, DNA hydrogen bonding between bases, and DNA dihedral distributions have been probed in detail as a function of the number of DNA strands adsorbed onto the nanotube. We find that the DNA structures adsorbed onto a nanotube are also stabilized by significant numbers of non-Watson-Crick hydrogen bonds (intrastrand and interstrand) in addition to π-π stacking between DNA bases and the nanotube surface and Watson-Crick pairs. Finally, we provide a summary of DNA structures observed for various DNA-SWCNT hybrids as a preliminary set of motifs that may be

  18. Understanding the origin of liquid crystal ordering of ultrashort double-stranded DNA

    Science.gov (United States)

    Saurabh, Suman; Lansac, Yves; Jang, Yun Hee; Glaser, Matthew A.; Clark, Noel A.; Maiti, Prabal K.

    2017-03-01

    Recent experiments have shown that short double-stranded DNA (dsDNA) fragments of six to 20 base pairs exhibit various liquid crystalline phases. This violates the condition of minimum molecular shape anisotropy that analytical theories demand for liquid crystalline ordering. It has been hypothesized that the liquid crystalline ordering results from end-to-end stacking of dsDNA into long supramolecular columns, which do satisfy the shape anisotropy criterion necessary for ordering. To probe the thermodynamic feasibility of this process, we perform molecular dynamics simulations on ultrashort (four-base-pair) dsDNA fragments, quantify the strong end-to-end attraction between them, and demonstrate that the nematic ordering of the self-assembled stacked columns is retained over a large range of temperature and salt concentration.
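
Whether the stacked columns order nematically is typically quantified by the nematic order parameter S = ⟨(3 cos²θ − 1)/2⟩, with θ the angle between each column axis and the director. A self-contained sketch (the sampling scheme below is illustrative, not the authors' analysis):

```python
import math
import random

def nematic_order(axes, director):
    """Nematic order parameter S = <(3 cos^2 theta - 1)/2>, where theta
    is the angle between each molecular (column) axis and the director.
    Perfect alignment gives S = 1; an isotropic sample gives S ~ 0."""
    n = math.sqrt(sum(c * c for c in director))
    d = [c / n for c in director]
    s = 0.0
    for ax in axes:
        m = math.sqrt(sum(c * c for c in ax))
        cos_t = sum(a * b for a, b in zip(ax, d)) / m
        s += (3 * cos_t ** 2 - 1) / 2
    return s / len(axes)

random.seed(1)
# Hypothetical samples: columns nearly parallel to z vs. random directions.
aligned = [(0.05 * random.gauss(0, 1), 0.05 * random.gauss(0, 1), 1.0)
           for _ in range(2000)]
isotropic = [(random.gauss(0, 1), random.gauss(0, 1), random.gauss(0, 1))
             for _ in range(2000)]
print(nematic_order(aligned, (0, 0, 1)) > 0.9)      # strongly ordered
print(abs(nematic_order(isotropic, (0, 0, 1))) < 0.1)  # nearly isotropic
```

In a simulation like the one described, the axes would come from the end-to-end vectors of the stacked dsDNA columns, and S tracked across temperature and salt concentration.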

  19. Nearest-neighbor nitrogen and oxygen distances in the iron(II)-DNA complex studied by extended X-ray absorption fine structure.

    Science.gov (United States)

    Bertoncini, Clelia R A; Meneghini, Rogerio; Tolentino, Helio

    2010-11-01

    In mammalian cells, DNA-bound Fe(II) reacts with H₂O₂, producing the highly reactive hydroxyl radical (·OH) in situ. Since ·OH attacks nearby DNA residues, generating oxidative DNA damage, many questions have arisen regarding iron-DNA complex formation and its implication in pre-malignant mutations and aging. In this work, a solid sample of the Fe(II)-DNA complex containing one Fe(II) per 10 nucleotides was analyzed from extended X-ray absorption fine structure (EXAFS) spectra collected at a synchrotron radiation light source. Best-fitting parameters of the EXAFS signal for the first two shells provide evidence of five oxygen atoms at 1.99 ± 0.02 Å and one nitrogen atom at 2.20 ± 0.02 Å in the inner coordination sphere of the Fe(II)-DNA complex. Considering that both purine base moieties bearing nitrogen atoms are prone to chelate iron, these results are consistent with the previously observed lower levels of DNA damage at cytosine nucleotides relative to adenine and guanine sites in cells under more physiological conditions of the Fe(II) Fenton reaction.

  20. Long-distance entanglement in many-body atomic and optical systems

    Energy Technology Data Exchange (ETDEWEB)

    Giampaolo, Salvatore M; Illuminati, Fabrizio [Dipartimento di Matematica e Informatica, Universita degli Studi di Salerno, Via Ponte don Melillo, I-84084 Fisciano, SA (Italy)], E-mail: illuminati@sa.infn.it

    2010-02-15

    We discuss the phenomenon of long-distance entanglement (LDE) in the ground state of quantum spin models, its use in high-fidelity and robust quantum communication, and its realization in many-body systems of ultracold atoms in optical lattices and in arrays of coupled optical cavities. We investigate XX quantum spin models on one-dimensional lattices with open ends and different patterns of site-dependent interaction couplings, singling out two general settings: patterns that allow for perfect LDE in the ground state of the system, namely such that the end-to-end entanglement remains finite in the thermodynamic limit, and patterns of quasi-long-distance entanglement (QLDE) in the ground state of the system, namely such that the end-to-end entanglement vanishes with a very slow power-law decay as the length of the spin chain is increased. We discuss physical realizations of these models in ensembles of ultracold bosonic atoms loaded in optical lattices. We show how, using either suitably engineered super-lattice structures or exploiting the presence of edge impurities in lattices with single periodicity, it is possible to realize models endowed with nonvanishing LDE or QLDE. We then study how to realize models that optimize the robustness of QLDE at finite temperature and in the presence of imperfections using suitably engineered arrays of coupled optical cavities. For both cases the numerical estimates of the end-to-end entanglement in the actual physical systems are thoroughly compared with the analytical results obtained for the spin model systems. We finally introduce LDE-based schemes of long-distance quantum teleportation in linear arrays of coupled cavities, and show that they allow for high-fidelity and high success rates even at moderately high temperatures.

  1. End-to-End Key Exchange through Disjoint Paths in P2P Networks

    Directory of Open Access Journals (Sweden)

    Daouda Ahmat

    2015-01-01

    Full Text Available Due to their inherent features, P2P networks have proven to be effective in the exchange of data between autonomous peers. Unfortunately, these networks are subject to various security threats that cannot be addressed readily since traditional security infrastructures, which are centralized, cannot be applied to them. Furthermore, communication reliability across the Internet is threatened by various attacks, including usurpation of identity, eavesdropping or traffic modification. Thus, in order to overcome these security issues and allow peers to securely exchange data, we propose a new key management scheme over P2P networks. Our approach introduces a new method that enables a secret key exchange through disjoint paths in the absence of a trusted central coordination point which would be required in traditional centralized security systems.
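    The abstract does not specify the share-splitting primitive; a minimal sketch of one common approach is XOR secret sharing, where each share travels over a different disjoint path and all shares are required to recover the key (an eavesdropper on fewer than all paths learns nothing):

    ```python
    import secrets
    from functools import reduce

    def xor_bytes(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    def split_key(key: bytes, n_paths: int) -> list[bytes]:
        """Split `key` into n_paths XOR shares, one per disjoint path."""
        shares = [secrets.token_bytes(len(key)) for _ in range(n_paths - 1)]
        # Last share is chosen so that the XOR of all shares equals the key.
        return shares + [reduce(xor_bytes, shares, key)]

    def recover_key(shares: list[bytes]) -> bytes:
        """Receiver XORs the shares collected from every path."""
        return reduce(xor_bytes, shares)
    ```

    This is an illustrative sketch only; the paper's actual scheme over P2P routing may differ in how shares are derived and authenticated.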

  2. Integration of DST's for non-conflicting end-to-end flight scheduling Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In this SBIR effort we propose an innovative approach for the integration of Decision Support Tools (DSTs) for increased situational awareness, improved cooperative...

  3. IMS Intra- and Inter Domain End-to-End Resilience Analysis

    DEFF Research Database (Denmark)

    Kamyod, Chayapol; Nielsen, Rasmus Hjorth; Prasad, Neeli R.

    2013-01-01

    This paper evaluated the resilience of a reference IMS-based network topology in operation through key reliability parameters via OPNET. The reliability behaviors of communication within the same registered home IMS domain and across domains were simulated and compared. Besides, the reliability effects...

  4. Multi-hop Relaying: An End-to-End Delay Analysis

    KAUST Repository

    Chaaban, Anas

    2015-12-01

    The impact of multi-hopping schemes on the communication latency in a relay channel is studied. The main aim is to characterize conditions under which such schemes decrease the communication latency given a reliability requirement. Both decode-forward (DF) and amplify-forward (AF) with block coding are considered, and are compared with the point-to-point (P2P) scheme which ignores the relay. Latency expressions for the three schemes are derived, and conditions under which DF and AF reduce latency are obtained for high signal-to-noise ratio (SNR). Interestingly, these conditions are stricter than the conditions under which the same multi-hopping schemes achieve higher long-term (information-theoretic) rates than P2P. It turns out that the relation between the source-destination SNR and the harmonic mean of the SNRs of the channels to and from the relay dictates whether multi-hopping reduces latency or not.
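    The decision criterion above compares the direct-link SNR against the harmonic mean of the two relay-hop SNRs; the exact threshold is in the paper, but the harmonic mean itself is straightforward to compute. A minimal sketch with hypothetical SNR values (linear scale, not dB):

    ```python
    def harmonic_mean(snr_sr: float, snr_rd: float) -> float:
        """Harmonic mean of the source-relay and relay-destination SNRs.
        It is dominated by the weaker hop, which is why a single bad hop
        can make multi-hopping unattractive."""
        return 2 * snr_sr * snr_rd / (snr_sr + snr_rd)

    # Hypothetical example: a strong hop cannot compensate for a weak one.
    hm = harmonic_mean(100.0, 4.0)   # ~7.69, close to the weak hop's SNR
    ```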

  5. Comparison of Reconstruction and Control algorithms on the ESO end-to-end simulator OCTOPUS

    Science.gov (United States)

    Montilla, I.; Béchet, C.; Lelouarn, M.; Correia, C.; Tallon, M.; Reyes, M.; Thiébaut, É.

    Extremely Large Telescopes are very challenging concerning their Adaptive Optics requirements. Their diameters, the specifications demanded by the science they are designed for, and the planned use of Extreme Adaptive Optics systems imply a huge increase in the number of degrees of freedom in the deformable mirrors. It is necessary to study new reconstruction algorithms to implement the real-time control in Adaptive Optics at the required speed. We have studied the performance, applied to the case of the European ELT, of three different algorithms: the matrix-vector multiplication (MVM) algorithm, considered as a reference; the Fractal Iterative Method (FrIM); and the Fourier Transform Reconstructor (FTR). The algorithms have been tested on ESO's OCTOPUS software, which simulates the atmosphere, the deformable mirror, the sensor and the closed-loop control. The MVM is the default reconstruction and control method implemented in OCTOPUS, but it scales in O(N²) operations per loop, so it is not considered a fast algorithm for wave-front reconstruction and control on an Extremely Large Telescope. The two other methods are the fast algorithms studied in the E-ELT Design Study. The performance of the algorithms, as well as their response in the presence of noise and under various atmospheric conditions, has been compared using a Single Conjugate Adaptive Optics configuration for a 42 m diameter ELT, with a total of 5402 actuators. These comparisons on a common simulator highlight the pros and cons of the various methods, and give a better understanding of the type of reconstruction algorithm that an ELT demands.
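    As a toy illustration of the MVM baseline (not the OCTOPUS implementation, and with hypothetical tiny dimensions), the reconstructor is a precomputed pseudo-inverse of the interaction matrix, and each closed-loop iteration costs one matrix-vector product, hence the O(N²) scaling:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_slopes, n_act = 40, 20                  # toy sizes; the E-ELT case has 5402 actuators
    D = rng.normal(size=(n_slopes, n_act))    # interaction ("poke") matrix: commands -> slopes
    R = np.linalg.pinv(D)                     # precomputed reconstructor, shape (n_act, n_slopes)

    true_cmd = rng.normal(size=n_act)
    slopes = D @ true_cmd                     # simulated noiseless wavefront-sensor measurement
    cmd = R @ slopes                          # one MVM per loop: O(n_act * n_slopes) operations
    assert np.allclose(cmd, true_cmd)         # exact recovery in the noiseless case
    ```

    With noise, one would typically regularize the inverse; the point here is only the per-iteration cost of the matrix-vector product.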

  6. End-to-End Concurrent Multipath Transfer Using Transport Layer Multihoming

    Science.gov (United States)

    2006-07-01


  7. Improving End-To-End Tsunami Warning for Risk Reduction on Canada’s West Coast

    Science.gov (United States)

    2015-01-01


  8. End-to-End Verification of Information-Flow Security for C and Assembly Programs

    Science.gov (United States)

    2016-04-01


  9. End-to-end integrated security and performance analysis on the DEGAS Choreographer platform

    DEFF Research Database (Denmark)

    Buchholtz, Mikael; Gilmore, Stephen; Haenel, Valentin

    2005-01-01

    with the analysers of the calculi, and the results of the analysis are reflected back into a modified version of the input UML model. The design platform supporting the methodology, Choreographer, interoperates with state-of-the-art UML modelling tools. We illustrate the approach with a well known protocol...... and report on the experience of industrial users who have applied Choreographer in their development work....

  10. Development of an End-to-End Active Debris Removal (ADR) Mission Strategic Plan Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Since the majority of the potential ADR targets are large (>meters) upper stages and payloads between 800 and 1100 km altitude, they are relatively bright, with...

  11. From End to End: tRNA Editing at 5'- and 3'-Terminal Positions

    Science.gov (United States)

    Betat, Heike; Long, Yicheng; Jackman, Jane E.; Mörl, Mario

    2014-01-01

    During maturation, tRNA molecules undergo a series of individual processing steps, ranging from exo- and endonucleolytic trimming reactions at their 5'- and 3'-ends, specific base modifications and intron removal to the addition of the conserved 3'-terminal CCA sequence. Especially in mitochondria, this plethora of processing steps is completed by various editing events, where base identities at internal positions are changed and/or nucleotides at 5'- and 3'-ends are replaced or incorporated. In this review, we will focus predominantly on the latter reactions, where a growing number of cases indicate that these editing events represent a rather frequent and widespread phenomenon. While the mechanistic basis for 5'- and 3'-end editing differs dramatically, both reactions represent an absolute requirement for generating a functional tRNA. Current in vivo and in vitro model systems support a scenario in which these highly specific maturation reactions might have evolved out of ancient promiscuous RNA polymerization or quality control systems. PMID:25535083

  12. From End to End: tRNA Editing at 5'- and 3'-Terminal Positions

    Directory of Open Access Journals (Sweden)

    Heike Betat

    2014-12-01

    Full Text Available During maturation, tRNA molecules undergo a series of individual processing steps, ranging from exo- and endonucleolytic trimming reactions at their 5'- and 3'-ends, specific base modifications and intron removal to the addition of the conserved 3'-terminal CCA sequence. Especially in mitochondria, this plethora of processing steps is completed by various editing events, where base identities at internal positions are changed and/or nucleotides at 5'- and 3'-ends are replaced or incorporated. In this review, we will focus predominantly on the latter reactions, where a growing number of cases indicate that these editing events represent a rather frequent and widespread phenomenon. While the mechanistic basis for 5'- and 3'-end editing differs dramatically, both reactions represent an absolute requirement for generating a functional tRNA. Current in vivo and in vitro model systems support a scenario in which these highly specific maturation reactions might have evolved out of ancient promiscuous RNA polymerization or quality control systems.

  13. Telephony Over IP: A QoS Measurement-Based End to End Control Algorithm

    Directory of Open Access Journals (Sweden)

    Luigi Alcuri

    2004-12-01

    Full Text Available This paper presents a method for admitting voice calls in Telephony over IP (ToIP) scenarios. This method, called QoS-Weighted CAC, aims to guarantee Quality of Service to telephony applications. We use a measurement-based call admission control algorithm, which detects congested network links through feedback on overall link utilization. This feedback is based on measurements of packet delivery latencies of voice over IP connections at the edges of the transport network. In this way we introduce a closed-loop control method that auto-adapts the quality margin on the basis of network load and specific service level requirements. Moreover, we evaluate the difference in performance achieved by different queue management configurations to guarantee Quality of Service to telephony applications; our goal was to evaluate the weight of edge router queue configuration in a complex, realistic Telephony over IP scenario. We compare many well-known queue scheduling algorithms, such as SFQ, WRR, RR, WIRR, and Priority. This comparison aims to place queue schedulers in a more general control scheme context where different elements such as DiffServ marking and admission control algorithms contribute to the overall Quality of Service required by real-time voice conversations. By means of software simulations we compare this solution with other call admission methods already described in the scientific literature (in particular Measured Sum, Bandwidth Equivalent with Hoeffding Bounds, and Simple Measure CAC) on the planes of complexity, stability, management, tunability to service level requirements, and compatibility with actual network implementations, in order to evidence the possible advantages of this QoS-Weighted solution.
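    Of the CAC baselines named in the abstract, Measured Sum is the simplest to state; a minimal sketch with hypothetical numbers (the paper's QoS-Weighted variant additionally adapts the margin from measured latencies):

    ```python
    def measured_sum_cac(measured_load_bps: float,
                         new_call_bps: float,
                         link_capacity_bps: float,
                         utilization_target: float = 0.9) -> bool:
        """Measured Sum rule: admit the new call only if the measured
        aggregate load plus the call's rate stays within a target
        fraction of link capacity (the margin absorbs estimation error)."""
        return measured_load_bps + new_call_bps <= utilization_target * link_capacity_bps

    # Hypothetical 100 Mbit/s link carrying 64 kbit/s voice calls:
    admit = measured_sum_cac(80e6, 64e3, 100e6)     # well under the 90% target
    reject = measured_sum_cac(89.99e6, 64e3, 100e6)  # would cross the target
    ```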

  14. SecMon: End-to-End Quality and Security Monitoring System

    CERN Document Server

    Ciszkowski, Tomasz; Fiedler, Markus; Kotulski, Zbigniew; Lupu, Radu; Mazurczyk, Wojciech

    2008-01-01

    The Voice over Internet Protocol (VoIP) is becoming a more available and popular way of communicating for Internet users. This also applies to Peer-to-Peer (P2P) systems, and merging these two has already proven to be successful (e.g. Skype). Even the existing standards of VoIP provide an assurance of security and Quality of Service (QoS); however, these features are usually optional and supported by a limited number of implementations. As a result, the lack of mandatory and widely applicable QoS and security guarantees makes contemporary VoIP systems vulnerable to attacks and network disturbances. In this paper we face these issues and propose the SecMon system, which simultaneously provides a lightweight security mechanism and improves quality parameters of the call. SecMon is intended especially for VoIP service over P2P networks and its main advantage is that it provides authentication, data integrity services, adaptive QoS and (D)DoS attack detection. Moreover, the SecMon approach represents a low...

  15. HIDE & SEEK: End-to-End Packages to Simulate and Process Radio Survey Data

    CERN Document Server

    Akeret, Joel; Chang, Chihway; Monstein, Christian; Amara, Adam; Refregier, Alexandre

    2016-01-01

    As several large radio surveys begin operation within the coming decade, a wealth of radio data will become available and provide a new window to the Universe. In order to fully exploit the potential of these data sets, it is important to understand the systematic effects associated with the instrument and the analysis pipeline. A common approach to tackle this is to forward-model the entire system - from the hardware to the analysis of the data products. For this purpose, we introduce two newly developed, open-source Python packages: the HI Data Emulator (HIDE) and the Signal Extraction and Emission Kartographer (SEEK) for simulating and processing radio survey data. HIDE forward-models the process of collecting astronomical radio signals in a single dish radio telescope instrument and outputs pixel-level time-ordered-data. SEEK processes the time-ordered-data, removes artifacts from Radio Frequency Interference (RFI), automatically applies flux calibration, and aims to recover the astronomical radio signal....

  16. Composable Mission Framework for Rapid End-to-End Mission Design and Simulation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is the Composable Mission Framework (CMF)?a model-based software framework that shall enable seamless continuity of mission design and...

  17. End-to-end Encryption for SMS Messages in the Health Care Domain.

    Science.gov (United States)

    Hassinen, Marko; Laitinen, Pertti

    2005-01-01

    The health care domain has a high level of expectation on security and privacy of patient information. The security, privacy, and confidentiality issues are consistent all over the domain. Technical development and increasing use of mobile phones has led us to a situation in which SMS messages are used in the electronic interactions between health care professionals and patients. We will show that it is possible to send, receive and store text messages securely with a mobile phone with no additional hardware required. More importantly we will show that it is possible to obtain a reliable user authentication in systems using text message communication. Programming language Java is used for realization of our goals. This paper describes the general application structure, while details for the technical implementation and encryption methods are described in the referenced articles. We also propose some crucial areas where the implementation of encrypted SMS can solve previous lack of security.
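    The paper's implementation is in Java and its encryption details are in the referenced articles; as a language-neutral sketch of the authentication side only, a shared-key HMAC tag appended to each message lets the receiver verify sender and integrity (confidentiality would additionally require a cipher; the message and key below are hypothetical):

    ```python
    import hmac
    import hashlib

    def protect_sms(message: str, shared_key: bytes) -> str:
        """Append an HMAC-SHA256 tag so the receiver can authenticate
        the sender and detect tampering."""
        tag = hmac.new(shared_key, message.encode(), hashlib.sha256).hexdigest()
        return f"{message}|{tag}"

    def verify_sms(wire: str, shared_key: bytes) -> str | None:
        """Return the message if the tag verifies, else None."""
        message, _, tag = wire.rpartition("|")
        expected = hmac.new(shared_key, message.encode(), hashlib.sha256).hexdigest()
        return message if hmac.compare_digest(tag, expected) else None
    ```

    `hmac.compare_digest` avoids timing side channels when comparing tags; a real deployment would also need key distribution and replay protection, which the paper addresses at the system level.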

  18. Network Slicing in Industry 4.0 Applications: Abstraction Methods and End-to-End Analysis

    DEFF Research Database (Denmark)

    Nielsen, Jimmy Jessen; Popovski, Petar; Kalør, Anders Ellersgaard

    2017-01-01

    Industry 4.0 refers to the fourth industrial revolution, and introduces modern communication and computation technologies such as 5G, cloud computing and Internet of Things to industrial manufacturing systems. As a result, many devices, machines and applications will rely on connectivity, while...... having different requirements from the network, ranging from high reliability and low latency to high data rates. Furthermore, these industrial networks will be highly heterogeneous as they will feature a number of diverse communication technologies. In this article, we propose network slicing...... as a mechanism to handle the diverse set of requirements to the network. We present methods for slicing deterministic and packet-switched industrial communication protocols at an abstraction level which is decoupled from the specific implementation of the underlying technologies, and hence simplifies the slicing...

  19. An autonomic joint radio resource management algorithm in end-to-end reconfigurable system

    Institute of Scientific and Technical Information of China (English)

    Lin Yuewei; Le Yanbien; Xue Yuan; Feng Zhiyong; Zhang Yongjing

    2008-01-01

    This paper presents the multi-step Q-learning (MQL) algorithm as an autonomic approach to joint radio resource management (JRRM) among heterogeneous radio access technologies (RATs) in the B3G environment. Through a "trial-and-error" on-line learning process, the JRRM controller converges to the optimized admission control policy. The JRRM controller learns to give the best allocation for each session in terms of both the access RAT and the service bandwidth. Simulation results show that the proposed algorithm realizes the autonomy of JRRM and achieves a good trade-off between spectrum utility and blocking probability compared to the load-balancing and utility-maximizing algorithms. Besides, the proposed algorithm has better online performance and convergence speed than the one-step Q-learning (QL) algorithm, so the user satisfaction degree is also improved.
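    The JRRM-specific states, actions and rewards are defined in the paper; a generic sketch of the multi-step (n-step) Q-learning backup it builds on, using a dictionary-backed Q-table (all names and values hypothetical):

    ```python
    def n_step_q_update(Q, transitions, next_state, actions, alpha=0.1, gamma=0.95):
        """One n-step Q-learning backup. `transitions` is the list of
        (state, action, reward) for the last n steps; the first entry holds
        the (state, action) pair being updated. The return is the discounted
        sum of the n rewards plus the greedy value of `next_state`."""
        G = sum((gamma ** i) * r for i, (_, _, r) in enumerate(transitions))
        G += (gamma ** len(transitions)) * max(Q.get((next_state, a), 0.0) for a in actions)
        s0, a0, _ = transitions[0]
        old = Q.get((s0, a0), 0.0)
        Q[(s0, a0)] = old + alpha * (G - old)

    # Hypothetical JRRM-style step: action = which RAT admitted the session.
    Q = {}
    n_step_q_update(Q, [("s0", "wlan", 1.0), ("s1", "umts", 0.0)],
                    "s2", ["wlan", "umts"], alpha=1.0, gamma=0.5)
    ```

    Propagating n rewards per backup is what gives MQL its faster convergence over one-step QL in the abstract's comparison.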

  20. Intelligent end-to-end resource virtualization using Service Oriented Architecture

    NARCIS (Netherlands)

    Onur, E.; Sfakianakis, E.; Papagianni, C.; Karagiannis, G.; Kontos, T.; Niemegeers, I.; Chochliouros, I.; Heemstra de Groot, S.; Sjödin, P.; Hidell, M.; Cinkler, T.; Maliosz, M.; Kaklamani, D.I.; Carapinha, J.; Belesioti, M.; Futrps, E.

    2009-01-01

    Service-oriented architecture can be considered as a philosophy or paradigm in organizing and utilizing services and capabilities that may be under the control of different ownership domains. Virtualization provides abstraction and isolation of lower level functionalities, enabling portability of hi

  1. End-to-End Data Movement Using MPI-IO Over Routed Terabots Infrastructures

    Energy Technology Data Exchange (ETDEWEB)

    Vallee, Geoffroy R [ORNL; Atchley, Scott [ORNL; Kim, Youngjae [ORNL; Shipman, Galen M [ORNL

    2013-01-01

    Scientific discovery is nowadays driven by large-scale simulations running on massively parallel high-performance computing (HPC) systems. These applications each generate a large amount of data, which then needs to be post-processed, for example for data mining or visualization. Unfortunately, the computing platform used for post-processing might be different from the one on which the data is initially generated, introducing the challenge of moving large amounts of data between computing platforms. This is especially challenging when these two platforms are geographically separated, since the data needs to be moved between computing facilities. This is even more critical when scientists tightly couple their domain-specific applications with a post-processing application. The paper presents a solution for data transfer between MPI applications using a dedicated wide area network (WAN) terabit infrastructure. The proposed solution is based on parallel access to data files and the Message Passing Interface (MPI) over the Common Communication Infrastructure (CCI) for data transfer over a routed infrastructure. In the context of this research, the Energy Sciences Network (ESnet) of the U.S. Department of Energy (DOE) is targeted for the transfer of data between DOE national laboratories.

  2. Architecting end-to-end convergence of Web and Telco services

    OpenAIRE

    Nicolas, Gérard; Sbata, Karim; Najm, Elie

    2011-01-01

    International audience; Over the last few years, significant evolutions such as the mobile phones' enhanced Web-browsing capabilities and the technical incursion of Web major players into the Telco world (e.g. Google, Facebook) have reduced the gap between Telecom and Web worlds. In this context, converging IMS or Internet Protocol Multimedia Subsystem and Web service platforms has become a key challenge that needs to be addressed by both Web and telecom players. Several interesting solutions...

  3. Websocket Enabler: achieving IMS and Web services end-to-end convergence

    OpenAIRE

    Nicolas, Gérard; Sbata, Karim; Najm, Elie

    2011-01-01

    International audience; Over the last few years, significant evolutions such as the mobile phones' enhanced web-browsing capabilities and the technical incursion of Web major players into the Telco world (e.g. Google, Facebook) have reduced the gap between Telecom and Web worlds. In this context, converging IMS and Web service platforms has become a key challenge that needs to be addressed by both Web and Telecom players. Several interesting solutions, illustrating different convergence appro...

  4. Assessing Natural Product-Drug Interactions: An End-to-End Safety Framework.

    Science.gov (United States)

    Roe, Amy L; Paine, Mary F; Gurley, Bill J; Brouwer, Kenneth R; Jordan, Scott; Griffiths, James C

    2016-04-01

    The use of natural products (NPs), including herbal medicines and other dietary supplements, by North Americans continues to increase across all age groups. This population has access to conventional medications, with significant polypharmacy observed in older adults. Thus, the safety of the interactions between multi-ingredient NPs and drugs is a topic of paramount importance. Considerations such as history of safe use, literature data from animal toxicity and human clinical studies, and NP constituent characterization would provide guidance on whether to assess NP-drug interactions experimentally. The literature is replete with reports of various NP extracts and constituents as potent inhibitors of drug-metabolizing enzymes and transporters. However, without standard methods for NP characterization or in vitro testing, extrapolating these reports to clinically relevant NP-drug interactions is difficult. This lack of a clear definition of risk precludes clinicians and consumers from making informed decisions about the safety of taking NPs with conventional medications. A framework is needed that describes an integrated, robust approach for assessing NP-drug interactions, and for translating the data into formulation alterations, dose adjustment, labelling, and/or post-marketing surveillance strategies. A session was held at the 41st Annual Summer Meeting of the Toxicology Forum in Colorado Springs, CO, to highlight the challenges and critical components that should be included in a framework approach.

  5. An Algorithm for End-to-End Performance Analysis of Network Based on Traffic Engineering

    Institute of Scientific and Technical Information of China (English)

    Liu Huailiang; Zhang Xin; Wang Dong; Xu Guohua

    2003-01-01

    Based on traffic engineering, the network topology is described as a network graph. An algorithm is given for deriving the data communication capability of network links and analyzing the connectivity performance between node pairs, through standardized transformation of the original matrix describing link performance and resolution of the transitive closure of the network's adjacency-incidence matrix, taking the randomness of network events into account. This provides a feasible way to analyze and improve network performance.
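    The transitive-closure step for pairwise connectivity can be sketched with Warshall's algorithm on a boolean adjacency matrix (a hypothetical 3-node example; the paper additionally weights links by measured performance):

    ```python
    def transitive_closure(adj):
        """Warshall's algorithm: closure[i][j] is True iff node j is
        reachable from node i. O(n^3) over a boolean adjacency matrix."""
        n = len(adj)
        closure = [row[:] for row in adj]
        for k in range(n):
            for i in range(n):
                if closure[i][k]:
                    for j in range(n):
                        if closure[k][j]:
                            closure[i][j] = True
        return closure

    # Hypothetical directed chain 0 -> 1 -> 2:
    adj = [[False, True, False],
           [False, False, True],
           [False, False, False]]
    closure = transitive_closure(adj)
    ```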

  6. Hardware Support for Malware Defense and End-to-End Trust

    Science.gov (United States)

    2017-02-01


  7. Improved sample filtering method for measuring end-to-end path capacity

    Institute of Scientific and Technical Information of China (English)

    LI Wen-wei; TANG Jun-long; ZHANG Da-fang; XIE Gao-gang

    2007-01-01

    By analyzing the effect of cross traffic (CT) on packet delay, an improved path capacity measurement method, the pcapminp algorithm, was proposed. With this method, path capacity is measured by filtering probe samples based on the measured minimum packet-pair delay. The measurability of minimum packet-pair delay was also analyzed by simulation. The results show that, compared with pathrate, if the CT load is light, both pcapminp and pathrate have similar accuracy; but under heavy CT load, pcapminp is more accurate than pathrate. When the CT load reaches 90%, the pcapminp algorithm has only 5% measurement error, which is 10% lower than that of the pathrate algorithm. At any CT load level, the probe cost of the pcapminp algorithm is two orders of magnitude smaller than that of pathrate, and the measurement duration is one order of magnitude shorter than that of the pathrate algorithm.
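    The exact filtering rule is in the paper; the general shape of minimum-delay filtering for packet-pair capacity estimation can be sketched as follows (sample values hypothetical): keep only pairs observed at (near-)minimum delay, since those suffered the least cross-traffic interference, then estimate capacity as packet size over dispersion:

    ```python
    def estimate_capacity(samples, packet_size_bits, delay_tolerance=1e-6):
        """samples: list of (pair_delay_s, dispersion_s) from packet-pair probes.
        Keep pairs whose delay is within `delay_tolerance` of the observed
        minimum, then estimate capacity C = L / dispersion (bits/s)."""
        d_min = min(d for d, _ in samples)
        kept = [disp for d, disp in samples if d <= d_min + delay_tolerance]
        avg_dispersion = sum(kept) / len(kept)
        return packet_size_bits / avg_dispersion

    # Hypothetical probes: two clean pairs, one inflated by cross traffic.
    samples = [(0.010, 0.001), (0.010, 0.001), (0.015, 0.004)]
    capacity = estimate_capacity(samples, 12_000)  # 1500-byte probes -> ~12 Mbit/s
    ```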

  8. End-to-End Network QoS via Scheduling of Flexible Resource Reservation Requests

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, S.; Katramatos, D.; Yu, D.

    2011-11-14

    Modern data-intensive applications move vast amounts of data between multiple locations around the world. To enable predictable and reliable data transfer, next generation networks allow such applications to reserve network resources for exclusive use. In this paper, we solve an important problem (called SMR3) to accommodate multiple and concurrent network reservation requests between a pair of end-sites. Given the varying availability of bandwidth within the network, our goal is to accommodate as many reservation requests as possible while minimizing the total time needed to complete the data transfers. We first prove that SMR3 is an NP-hard problem. Then we solve it by developing a polynomial-time heuristic, called RRA. The RRA algorithm hinges on an efficient mechanism to accommodate a large number of requests by minimizing bandwidth wastage. Finally, via numerical results, we show that RRA constructs schedules that accommodate a significantly larger number of requests compared to other, seemingly efficient, heuristics.
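    RRA's internals are described in the paper; a toy first-fit stand-in illustrates the shape of the scheduling problem over a slot-based bandwidth-availability timeline (all numbers hypothetical, and first-fit is far simpler than RRA's wastage-minimizing placement):

    ```python
    def schedule_requests(avail, requests):
        """avail: per-slot available bandwidth (mutated in place).
        requests: list of (rate, duration_slots). Greedy first-fit:
        place each request in the earliest window where `rate` fits for
        `duration_slots` consecutive slots; None marks a rejected request."""
        starts = []
        for rate, dur in requests:
            placed = None
            for s in range(len(avail) - dur + 1):
                if all(avail[t] >= rate for t in range(s, s + dur)):
                    placed = s
                    for t in range(s, s + dur):
                        avail[t] -= rate
                    break
            starts.append(placed)
        return starts

    # Hypothetical 4-slot horizon with 10 units of bandwidth per slot:
    avail = [10, 10, 10, 10]
    starts = schedule_requests(avail, [(6, 2), (6, 2), (6, 2)])  # third cannot fit
    ```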

  9. Effect of 3 Key Factors on Average End to End Delay and Jitter in MANET

    Directory of Open Access Journals (Sweden)

    Saqib Hakak

    2015-01-01

    Full Text Available A mobile ad-hoc network (MANET) is a self-configuring, infrastructure-less network of mobile devices connected by wireless links, where each node or mobile device is independent to move in any desired direction and thus the links keep moving from one node to another. In such a network, the mobile nodes are equipped with CSMA/CA (carrier sense multiple access with collision avoidance) transceivers and communicate with each other via radio. In MANETs, routing is considered one of the most difficult and challenging tasks. Because of this, most studies on MANETs have focused on comparing protocols under varying network conditions. But to the best of our knowledge no one has studied the effect of other factors on network performance indicators like throughput, jitter and so on, revealing how much influence a particular factor or group of factors has on each network performance indicator. Thus, in this study the effects of three key factors, i.e. routing protocol, packet size and DSSS rate, were evaluated on key network performance metrics, i.e. average delay and average jitter, as these parameters are crucial for network performance and directly affect the buffering requirements for all video devices and downstream networks.

  10. An end-to-end assessment of extreme weather impacts on food security

    Science.gov (United States)

    Chavez, Erik; Conway, Gordon; Ghil, Michael; Sadler, Marc

    2015-11-01

    Both governments and the private sector urgently require better estimates of the likely incidence of extreme weather events, their impacts on food crop production and the potential consequent social and economic losses. Current assessments of climate change impacts on agriculture mostly focus on average crop yield vulnerability to climate and adaptation scenarios. Also, although new-generation climate models have improved and there has been an exponential increase in available data, the uncertainties in their projections over years and decades, and at regional and local scale, have not decreased. We need to understand and quantify the non-stationary, annual and decadal climate impacts using simple and communicable risk metrics that will help public and private stakeholders manage the hazards to food security. Here we present an 'end-to-end' methodological construct based on weather indices and machine learning that integrates current understanding of the various interacting systems of climate, crops and the economy to determine short- to long-term risk estimates of crop production loss, in different climate and adaptation scenarios. For provinces north and south of the Yangtze River in China, we have found that risk profiles for crop yields that translate climate into economic variability follow marked regional patterns, shaped by drivers of continental-scale climate. We conclude that to be cost-effective, region-specific policies have to be tailored to optimally combine different categories of risk management instruments.

  11. Implementation of an End-to-End Simulator for the BepiColombo Rotation Experiment

    Science.gov (United States)

    Palli, A.; Bevilacqua, A.; Genova, A.; Gherardi, A.; Iess, L.; Meriggiola, R.; Tortora, P.

    2012-09-01

    Fundamental information on the interior of Mercury can be inferred from its rotational state, in terms of obliquity and amplitude of physical libration in longitude. For this reason a dedicated Rotation Experiment will be performed by the ESA mission BepiColombo. A system-level experiment simulator has been developed in order to optimize the observation strategy and is here presented. In particular, this abstract will focus on the estimation process, the optimization algorithms and the selection of optimal pattern matching strategies.

  12. CUSat: An End-to-End In-Orbit Inspection System University Nanosatellite Program

    Science.gov (United States)

    2007-01-01

    [Status-table residue: PPTs delivered; nozzles awaiting Ultem stock; lifting harness not yet manufactured; assembly base and peripheral harnesses delivered.] Fit checks have been performed on all mechanical hardware. PPT nozzles have been delayed due to processing issues with the Ultem stock. Once the Ultem order goes through and the stock arrives, manufacturing will begin immediately. This will most likely not affect the delivery

  13. Mattress sutures for the modification of end-to-end dunking pancreaticojejunostomy

    Institute of Scientific and Technical Information of China (English)

    Nurkan Torer

    2013-01-01

    Despite the improvement of surgical techniques, the rate of anastomotic failure of pancreaticojejunostomy remains high (30%-50%). Here we describe the use of vertical mattress sutures in the modification of dunking pancreaticojejunal anastomosis. In 7 patients in whom this technique was used, neither anastomotic failure nor any major postsurgical complication developed. This technique is an easy, safe, and promising approach to the performance of pancreaticojejunostomy.

  14. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    Science.gov (United States)

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-12-01

    The scientific discovery process can be advanced by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components to these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  15. End-to-end integrated security and performance analysis on the DEGAS Choreographer platform

    DEFF Research Database (Denmark)

    Buchholtz, Mikael; Gilmore, Stephen; Haenel, Valentin

    2005-01-01

    We present a software tool platform which facilitates security and performance analysis of systems which starts and ends with UML model descriptions. A UML project is presented to the platform for analysis, formal content is extracted in the form of process calculi descriptions, analysed with the analysers of the calculi, and the results of the analysis are reflected back into a modified version of the input UML model. The design platform supporting the methodology, Choreographer, interoperates with state-of-the-art UML modelling tools. We illustrate the approach with a well known protocol...

  16. HIDE & SEEK: End-to-end packages to simulate and process radio survey data

    Science.gov (United States)

    Akeret, J.; Seehars, S.; Chang, C.; Monstein, C.; Amara, A.; Refregier, A.

    2017-01-01

    As several large single-dish radio surveys begin operation within the coming decade, a wealth of radio data will become available and provide a new window to the Universe. In order to fully exploit the potential of these datasets, it is important to understand the systematic effects associated with the instrument and the analysis pipeline. A common approach to tackle this is to forward-model the entire system, from the hardware to the analysis of the data products. For this purpose, we introduce two newly developed, open-source Python packages: the HI Data Emulator (HIDE) and the Signal Extraction and Emission Kartographer (SEEK) for simulating and processing single-dish radio survey data. HIDE forward-models the process of collecting astronomical radio signals in a single-dish radio telescope instrument and outputs pixel-level time-ordered-data. SEEK processes the time-ordered-data, removes artifacts from Radio Frequency Interference (RFI), automatically applies flux calibration, and aims to recover the astronomical radio signal. The two packages can be used separately or together depending on the application. Their modular and flexible nature allows easy adaptation to other instruments and datasets. We describe the basic architecture of the two packages and examine in detail the noise and RFI modeling in HIDE, as well as the implementation of gain calibration and RFI mitigation in SEEK. We then apply HIDE & SEEK to forward-model a Galactic survey in the frequency range 990-1260 MHz based on data taken at the Bleien Observatory. For this survey, we expect to cover 70% of the full sky and achieve a median signal-to-noise ratio of approximately 5-6 in the cleanest channels including systematic uncertainties. However, we also point out the potential challenges of high RFI contamination and baseline removal when examining the early data from the Bleien Observatory. The fully documented HIDE & SEEK packages are available at http://hideseek.phys.ethz.ch/ and are published under the GPLv3 license on GitHub.

  17. End-to-end interstellar communication system design for power efficiency

    CERN Document Server

    Messerschmitt, David G

    2013-01-01

    Interstellar radio communication accounting for known impairments due to radio propagation in the interstellar medium (attenuation, noise, dispersion, and scattering) and motion is studied. Large propagation losses and large transmitted powers motivate us to maximize the power efficiency, defined as the ratio of information rate to average signal power. The fundamental limit on power efficiency is determined. The narrow-bandwidth signals assumed in many current SETI searches carry a power-efficiency penalty of four to five orders of magnitude. A set of five power-efficient design principles can asymptotically approach the fundamental limit, and in practice increase the power efficiency by three to four orders of magnitude. The most fundamental is to trade higher bandwidth for lower average power. In addition to improving the power efficiency, average power can be reduced by lowering the information rate. The resulting low-power signals have characteristics diametrically opposite to those...

  18. The Use of End-to-End Multicast Measurements for Characterizing Internal Network Behavior

    Science.gov (United States)

    2002-08-01

    [Figure residue: Figure 2 (MODEL) and Figure 3 (ns SIMULATION): (a) simulation topology; (b) fraction of correctly classified trees vs. number of probes, for the loss, utilization, average delay, delay variance and delay covariance metrics.]

  19. End-to-end performance analysis using engineering confidence models and a ground processor prototype

    NARCIS (Netherlands)

    Kruse, K.W.; Sauer, M.; Jäger, T.; Herzog, A.; Schmitt, M.; Huchler, M.; Wallace, K.; Eisinger, M.; Heliere, A.; Lefebvre, A.; Maher, M.; Chang, M.; Phillips, T.; Knight, S.; Goeij, B.T.G. de; Knaap, F.G.P. van der; Hof, C.A. van 't

    2015-01-01

    The European Space Agency (ESA) and the Japan Aerospace Exploration Agency (JAXA) are co-operating to develop the EarthCARE satellite mission with the fundamental objective of improving the understanding of the processes involving clouds, aerosols and radiation in the Earth's atmosphere. The

  20. Design and Evaluation for the End-to-End Detection of TCP/IP Header Manipulation

    Science.gov (United States)

    2014-06-01

    [Acronym-glossary residue: ...-based Integrity Check of Critical Underlying Protocol Semantics; HTTP: Hypertext Transfer Protocol; ICMP: Internet Control Message Protocol; ICSI: ...] ...crosses the wire. This particular packet is a Hypertext Transfer Protocol (HTTP) GET request. Web browsers send this type of request in order to fetch a... [Reference residue: ...et al. (1999, Jun.). Hypertext transfer protocol. RFC 2616 (Draft Standard). [Online]. Available: http://tools.ietf.org/html/rfc2616; [31] J. Postel. (1981...]

  1. Ubiquitous Monitoring Solution for Wireless Sensor Networks with Push Notifications and End-to-End Connectivity

    Directory of Open Access Journals (Sweden)

    Luis M. L. Oliveira

    2014-01-01

    Full Text Available Wireless Sensor Networks (WSNs) belong to a new trend in technology in which tiny, resource-constrained devices are wirelessly interconnected and are able to interact with the surrounding environment by collecting data such as temperature and humidity. Recently, due to the huge growth in the use of mobile devices with Internet connections, smartphones are becoming the center of future ubiquitous wireless networks. Interconnecting WSNs with smartphones and the Internet is a big challenge, and new architectures are required due to the heterogeneity of these devices. Given that people already use smartphones with Internet connections, there is a good opportunity to propose a new architecture for wireless sensor monitoring using push notifications and smartphones. This paper therefore proposes a ubiquitous approach for WSN monitoring based on a REST Web Service, a relational database, and an Android mobile application. Real-time data sensed by WSNs are sent directly to a smartphone or stored in a database and requested by the mobile application using a well-defined RESTful interface. A push notification system was created in order to alert mobile users when a sensor parameter exceeds a given threshold. The proposed architecture and mobile application were evaluated and validated using a laboratory WSN testbed and are ready for use.
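    The threshold-alert flow described in this record (readings persisted, then a push notification fired when a sensor parameter exceeds its threshold) can be sketched in a few lines. This is only an in-memory illustration: the `ThresholdMonitor` name and callback signature are hypothetical, and the paper's actual REST/Android/database stack is not reproduced.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class ThresholdMonitor:
    """Hypothetical sketch of the alerting logic: store sensor readings
    and invoke a push callback when a parameter crosses its threshold."""
    thresholds: dict[str, float]
    notify: Callable[[str, float], None]
    readings: dict[str, list[float]] = field(default_factory=dict)

    def ingest(self, sensor: str, value: float) -> None:
        # Persist the reading (stands in for the relational database).
        self.readings.setdefault(sensor, []).append(value)
        # Fire a push notification if the configured threshold is exceeded.
        limit = self.thresholds.get(sensor)
        if limit is not None and value > limit:
            self.notify(sensor, value)
```

    In use, the `notify` callback would hand the alert to a real push service; here a plain function receiving the sensor name and offending value is enough to exercise the logic.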

  2. MONTAGE: A Methodology for Designing Composable End-to-End Secure Distributed Systems

    Science.gov (United States)

    2012-08-01

    of the specification itself. Examples include the seL4 microkernel work by Klein et al. [KEH+09], which presents the experience of formally proving... electromagnetic, optical, acoustic or other nature [AARR02, Kuh02, ST04], or temperature drift [ZBA10]. We make the assumption that virtual machines have no

  3. Increasing Army Supply Chain Performance: Using an Integrated End to End Metrics System

    Science.gov (United States)

    2017-01-01

    in the information age. Harvard Business Review, 72(1), 100–107. Information Technology Association of America. (2008). Reliability program standard... Balanced Scorecard—Measures that drive performance. Harvard Business Review, 70(1), 71–79. Keebler, J. S., Manrodt, K. B., Durtsche, D. A., & Ledyard... America, 2008). Reliability in this project includes initial quality and fielded reliability, using the definition above. A detailed review of the quality

  4. An optimal quality adaptation mechanism for end-to-end FGS video transmission

    Institute of Scientific and Technical Information of China (English)

    FENG Shun; ER Gui-hua; DAI Qiong-hai; LIU Ye-bin

    2006-01-01

    In this paper, we propose a novel optimal quality adaptation algorithm for MPEG-4 fine granular scalability (FGS) streams over wired networks. Our algorithm maximizes perceptual video quality by minimizing video quality variation and increasing the available bandwidth usage rate. Under the condition that the whole bandwidth evolution is known, we design an optimal algorithm for layer selection. When knowledge of future bandwidth is not available, we also develop an online algorithm based on the optimal algorithm. Simulations showed that both the optimal algorithm and the online algorithm can offer smooth video quality evolution.

  5. End-to-End simulation study of a full magnetic gradiometry mission

    DEFF Research Database (Denmark)

    Kotsiaros, Stavros; Olsen, Nils

    2014-01-01

    In this paper, we investigate space magnetic gradiometry as a possible path for future exploration of the Earth’s magnetic field with satellites. Synthetic observations of the magnetic field vector and of six elements of the magnetic gradient tensor are calculated for times and positions of a simulated low Earth orbiting satellite. The observations are synthesized from realistic models based upon a combination of the major sources contributing to the Earth’s magnetic field. From those synthetic data, we estimate field models using either the magnetic vector field observations only or the full...

  6. Low-overhead end-to-end performance measurement for next generation networks

    OpenAIRE

    Pezaros, D.P.; Hoerdt, M.; Hutchison, D

    2011-01-01

    Internet performance measurement is commonly perceived as a high-cost control-plane activity and until now it has tended to be implemented on top of the network’s forwarding operation. Consequently, measurement mechanisms have often had to trade relevance and accuracy over non-intrusiveness and cost effectiveness. In this paper, we present the software implementation of an inline measurement mechanism that uses native structures of the Internet Protocol version 6 (IPv6) stack to pigg...

  7. A new approach for Evolution of end to end Security in Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    S. Anjali Devi

    2011-06-01

    Full Text Available A wireless sensor network (WSN) is a network consisting of spatially distributed autonomous devices using sensors to cooperatively monitor physical or environmental conditions, such as temperature, sound, vibration, pressure, motion or pollutants, at different locations. Data security is essential for these mission-critical applications to work in unattended and even hostile environments. Providing desirable data security, that is, confidentiality, authenticity, and availability, in wireless sensor networks (WSNs) is therefore a challenge. A WSN consists of a large number of sensor nodes. These sensor nodes are small and low-cost, with limited memory and low bandwidth. Existing security designs are vulnerable to many types of Denial of Service (DoS) attacks, such as report disruption attacks and selective forwarding attacks. In this paper, we seek to overcome these vulnerabilities for large-scale static WSNs.

  8. The Challenge of Ensuring Human Rights in the End-to-End Supply Chain

    DEFF Research Database (Denmark)

    Wieland, Andreas; Handfield, Robert B.

    2014-01-01

    Certification programs have their merits and their limitations. With the growing availability of social media, analytics tools, and supply chain data, a smarter set of solutions could soon be possible....

  9. Long-distance entanglement in many-body atomic and optical systems

    CERN Document Server

    Giampaolo, Salvatore M

    2009-01-01

    We discuss the phenomenon of long-distance entanglement in the ground state of quantum spin models, its use in high-fidelity and robust quantum communication, and its realization in many-body systems of ultracold atoms in optical lattices and in arrays of coupled optical cavities. We investigate different patterns of site-dependent interaction couplings, singling out two general settings: Patterns that allow for perfect long-distance entanglement (LDE) in the ground state of the system, namely such that the end-to-end entanglement remains finite in the thermodynamic limit, and patterns of quasi long-distance entanglement (QLDE) in the ground state of the system, namely, such that the end-to-end entanglement vanishes with a very slow power-law decay as the length of the spin chain is increased. We discuss physical realizations of these models in ensembles of ultracold bosonic atoms loaded in optical lattices. We show how, using either suitably engineered super-lattice structures or exploiting the presence...

  10. Moving Large Data Sets Over High-Performance Long Distance Networks

    Energy Technology Data Exchange (ETDEWEB)

    Hodson, Stephen W [ORNL; Poole, Stephen W [ORNL; Ruwart, Thomas [ORNL; Settlemyer, Bradley W [ORNL

    2011-04-01

    In this project we look at the performance characteristics of three tools used to move large data sets over dedicated long distance networking infrastructure. Although performance studies of wide area networks have been a frequent topic of interest, performance analyses have tended to focus on network latency characteristics and peak throughput using network traffic generators. In this study we instead perform an end-to-end long distance networking analysis that includes reading large data sets from a source file system and committing large data sets to a destination file system. An evaluation of end-to-end data movement is also an evaluation of the system configurations employed and the tools used to move the data. For this paper, we have built several storage platforms and connected them with a high performance long distance network configuration. We use these systems to analyze the capabilities of three data movement tools: BBcp, GridFTP, and XDD. Our studies demonstrate that existing data movement tools do not provide efficient performance levels or exercise the storage devices in their highest performance modes. We describe the device information required to achieve high levels of I/O performance and discuss how this data is applicable in use cases beyond data movement performance.

  11. Fast computation of distance estimators

    Directory of Open Access Journals (Sweden)

    Lagergren Jens

    2007-03-01

    Full Text Available Abstract Background Distance methods are among the most commonly used methods for reconstructing phylogenetic trees from sequence data. The input to a distance method is a distance matrix containing estimated pairwise distances between all pairs of taxa. Distance methods themselves are often fast; e.g., the famous and popular Neighbor Joining (NJ) algorithm reconstructs a phylogeny of n taxa in time O(n³). Unfortunately, the fastest practical algorithms known for computing the distance matrix from n sequences of length l take time proportional to l·n². Since the sequence length is typically much larger than the number of taxa, the distance estimation is the bottleneck in phylogeny reconstruction. This bottleneck is especially apparent in reconstruction of large phylogenies or in applications where many trees have to be reconstructed, e.g., bootstrapping and genome-wide applications. Results We give an advanced algorithm for computing the number of mutational events between DNA sequences which is significantly faster than both Phylip and Paup. Moreover, we give a new method for estimating pairwise distances between sequences which contain ambiguity symbols. This new method is shown to be more accurate as well as faster than earlier methods. Conclusion Our novel algorithm for computing distance estimators provides a valuable tool in phylogeny reconstruction. Since the running time of our distance estimation algorithm is comparable to that of most distance methods, the previous bottleneck is removed. All distance methods, such as NJ, require a distance matrix as input and, hence, our novel algorithm significantly improves the overall running time of all distance methods. In particular, we show for real-world biological applications how the running time of phylogeny reconstruction using NJ is improved from a matter of hours to a matter of seconds.
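    The distance-matrix computation this record describes (estimated pairwise distances over all pairs of n sequences of length l, with ambiguity symbols handled separately) can be illustrated with a naive sketch. Assumptions: the Jukes-Cantor correction is used here purely as an example of a distance estimator; the record's own faster algorithm and its specific treatment of ambiguity symbols are not reproduced, and the function names are hypothetical.

```python
from itertools import combinations
from math import log

def jc_distance(seq_a: str, seq_b: str) -> float:
    """Jukes-Cantor corrected distance between two aligned DNA sequences.

    Sites where either sequence carries an ambiguity symbol (anything
    outside A/C/G/T) are simply skipped in this sketch.
    """
    valid = [(a, b) for a, b in zip(seq_a, seq_b)
             if a in "ACGT" and b in "ACGT"]
    if not valid:
        raise ValueError("no comparable sites")
    p = sum(a != b for a, b in valid) / len(valid)  # observed mismatch fraction
    if p >= 0.75:
        raise ValueError("distance undefined (saturated divergence)")
    return -0.75 * log(1.0 - 4.0 * p / 3.0)

def distance_matrix(seqs: list[str]) -> list[list[float]]:
    """Naive O(l*n^2) all-pairs matrix -- the bottleneck the paper targets."""
    n = len(seqs)
    d = [[0.0] * n for _ in range(n)]
    for i, j in combinations(range(n), 2):
        d[i][j] = d[j][i] = jc_distance(seqs[i], seqs[j])
    return d
```

    The resulting symmetric matrix is exactly the input a distance method such as NJ expects, which is why speeding up this step accelerates every downstream distance method.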

  12. The Arginine Pairs and C-Termini of the Sso7c4 from Sulfolobus solfataricus Participate in Binding and Bending DNA

    Science.gov (United States)

    Huang, Chun-Hsiang; Ko, Tzu-Ping; Chiang, Cheng-Hung; Lin, Kuan-Fu; Chang, Yuan-Chih; Lin, Po-Yen; Tsai, Hui-Hsu Gavin; Wang, Andrew H.-J.

    2017-01-01

    The Sso7c4 from Sulfolobus solfataricus forms a dimer, which is believed to function as a chromosomal protein involved in genomic DNA compaction and gene regulation. Here, we present the crystal structure of wild-type Sso7c4 at a high resolution of 1.63 Å, showing that the two basic C-termini are disordered. Based on the fluorescence polarization (FP) binding assay, two arginine pairs, R11/R22′ and R11′/R22, on the top surface participate in binding DNA. As shown in electron microscopy (EM) images, wild-type Sso7c4 compacts DNA through bridging and bending interactions, whereas the binding of C-terminally truncated proteins rigidifies and opens DNA molecules, and no compaction of the DNA occurs. Moreover, the FP, EM and fluorescence resonance energy transfer (FRET) data indicated that the two basic and flexible C-terminal arms of the Sso7c4 dimer play a crucial role in binding and bending DNA. Sso7c4 has been classified as a repressor-like protein because of its similarity to Escherichia coli Ecrep 6.8 and Ecrep 7.3 as well as Agrobacterium tumefaciens ACCR in amino acid sequence. Based on these data, we proposed a model of the Sso7c4-DNA complex using a curved DNA molecule in the catabolite activator protein-DNA complex. The DNA end-to-end distance measured with FRET upon wild-type Sso7c4 binding is almost equal to the distance measured in the model, which supports the fidelity of the proposed model. The FRET data also confirm the EM observation showing that the binding of wild-type Sso7c4 reduces the DNA length while the C-terminal truncation does not. A functional role for Sso7c4 in the organization of chromosomal DNA and/or the regulation of gene expression through bridging and bending interactions is suggested. PMID:28068385

  13. Mapping of single-base differences between two DNA strands in a single molecule using holliday junction nanomechanics.

    Directory of Open Access Journals (Sweden)

    Camille Brème

    Full Text Available OBJECTIVE: The aim of this work is to demonstrate a novel single-molecule DNA sequence comparison assay that is purely based on DNA mechanics. METHODS: A molecular construct that contained the two homologous but non-identical DNA sequences that were to be compared was prepared such that a four-way (Holliday junction could be formed by the formation of heteroduplexes through the inter-recombination of the strands. Magnetic tweezers were used to manipulate the force and the winding applied to this construct for inducing both the formation and the migration of a Holliday junction. The end-to-end distance of the construct was measured as a function of the winding and was used to monitor the behavior of the Holliday junction in different regions of the intra-molecular recombination. MAIN RESULTS: In the appropriate buffer, the magnet rotation induces the migration of the Holliday junction in the regions where there is no sequence difference between the recombining sequences. In contrast, even a single-base difference between the recombining sequences leads to a long-lasting blockage of the migration in the same buffer; this effect was obtained when the junction was positioned near this locus (the site of the single-base difference and forced toward the formation of heteroduplexes that comprise the locus. The migration blockages were detected through the identification of the formation of plectonemes. The detection of the presence of sequence differences and their respective mappings were obtained from the series of blockages that were detected. SIGNIFICANCE: This work presents a novel single-molecule sequence comparison assay that is based on the use of a Holliday junction as an ultra-sensitive nanomechanism; the mismatches act as blocking grains of sand in the Holliday "DNA gearbox". This approach will potentially have future applications in biotechnology.

  14. Distance determinations to shield galaxies from Hubble space telescope imaging

    Energy Technology Data Exchange (ETDEWEB)

    McQuinn, Kristen B. W.; Skillman, Evan D. [Minnesota Institute for Astrophysics, School of Physics and Astronomy, University of Minnesota, 116 Church Street, S.E., Minneapolis, MN 55455 (United States); Cannon, John M.; Cave, Ian [Department of Physics and Astronomy, Macalester College, 1600 Grand Avenue, Saint Paul, MN 55105 (United States); Dolphin, Andrew E. [Raytheon Company, 1151 E. Hermans Road, Tucson, AZ 85756 (United States); Salzer, John J. [Department of Astronomy, Indiana University, 727 East 3rd Street, Bloomington, IN 47405 (United States); Haynes, Martha P.; Adams, Elizabeth; Giovanelli, Riccardo [Center for Radiophysics and Space Research, Space Sciences Building, Cornell University, Ithaca, NY 14853 (United States); Elson, Ed C. [Astrophysics, Cosmology and Gravity Centre (ACGC), Department of Astronomy, University of Cape Town, Private Bag X3, Rondebosch 7701 (South Africa); Ott, Jürgen [National Radio Astronomy Observatory, P.O. Box O, 1003 Lopezville Road, Socorro, NM 87801 (United States); Saintonge, Amélie, E-mail: kmcquinn@astro.umn.edu [Max-Planck-Institute for Astrophysics, D-85741 Garching (Germany)

    2014-04-10

    The Survey of H I in Extremely Low-mass Dwarf (SHIELD) galaxies is an ongoing multi-wavelength program to characterize the gas, star formation, and evolution in gas-rich, very low-mass galaxies. The galaxies were selected from the first ∼10% of the H I Arecibo Legacy Fast ALFA (ALFALFA) survey based on their inferred low H I mass and low baryonic mass, and all systems have recent star formation. Thus, the SHIELD sample probes the faint end of the galaxy luminosity function for star-forming galaxies. Here, we measure the distances to the 12 SHIELD galaxies to be between 5 and 12 Mpc by applying the tip of the red giant method to the resolved stellar populations imaged by the Hubble Space Telescope. Based on these distances, the H I masses in the sample range from 4 × 10⁶ to 6 × 10⁷ M☉, with a median H I mass of 1 × 10⁷ M☉. The tip of the red giant branch distances are up to 73% farther than flow-model estimates in the ALFALFA catalog. Because of the relatively large uncertainties of flow-model distances, we are biased toward selecting galaxies from the ALFALFA catalog where the flow model underestimates the true distances. The measured distances allow for an assessment of the native environments around the sample members. Five of the galaxies are part of the NGC 672 and NGC 784 groups, which together constitute a single structure. One galaxy is part of a larger linear ensemble of nine systems that stretches 1.6 Mpc from end to end. Three galaxies reside in regions with 1-9 neighbors, and four galaxies are truly isolated with no known system identified within a radius of 1 Mpc.

  15. Flexibility of short ds-DNA intercalated by a dipyridophenazine ligand

    Science.gov (United States)

    Jia, Fuchao; Despax, Stéphane; Munch, Jean-Pierre; Hébraud, Pascal

    2015-04-01

    We use Förster Resonant Energy Transfer (FRET) in order to measure the increase of flexibility of short ds-DNA induced by the intercalation of dipyridophenazine (dppz) ligand in between DNA base pairs. By using a DNA double strand fluorescently labeled at its extremities, it is shown that the end-to-end length increase of DNA due to the intercalation of one dppz ligand is smaller than the DNA base pair interdistance. This may be explained either by a local bending of the DNA or by an increase of its flexibility. The persistence length of the formed DNA/ligand complex is evaluated. The described structure may have implications in the photophysical damage induced by the complexation of DNA by organometallic molecules.
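    In FRET experiments like the one above, the donor-acceptor transfer efficiency E depends on the end-to-end separation R through E = 1/(1 + (R/R0)^6), where R0 is the Förster radius of the dye pair, so a measured efficiency can be inverted to a distance. The following is a minimal sketch of that inversion (the function names are hypothetical, and a real analysis would also average over the distance distribution P(R) and the orientation factor κ², as the records in this listing discuss):

```python
def fret_efficiency(r: float, r0: float) -> float:
    """FRET transfer efficiency for a donor-acceptor separation r
    (same units as the Förster radius r0, e.g. angstroms)."""
    return 1.0 / (1.0 + (r / r0) ** 6)

def distance_from_efficiency(e: float, r0: float) -> float:
    """Invert E = 1 / (1 + (R/R0)^6) to recover the separation R."""
    if not 0.0 < e < 1.0:
        raise ValueError("efficiency must lie strictly between 0 and 1")
    return r0 * (1.0 / e - 1.0) ** (1.0 / 6.0)
```

    At R = R0 the efficiency is exactly 0.5, which is why FRET is most sensitive to distances near the Förster radius (typically tens of angstroms, matching the DNA end-to-end distances reported above).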

  16. Flexibility of short ds-DNA intercalated by a dipyridophenazine ligand

    Directory of Open Access Journals (Sweden)

    Fuchao Jia

    2015-04-01

    Full Text Available We use Förster Resonant Energy Transfer (FRET) in order to measure the increase of flexibility of short ds-DNA induced by the intercalation of dipyridophenazine (dppz) ligand in between DNA base pairs. By using a DNA double strand fluorescently labeled at its extremities, it is shown that the end-to-end length increase of DNA due to the intercalation of one dppz ligand is smaller than the DNA base pair interdistance. This may be explained either by a local bending of the DNA or by an increase of its flexibility. The persistence length of the formed DNA/ligand complex is evaluated. The described structure may have implications in the photophysical damage induced by the complexation of DNA by organometallic molecules.

  17. Site-directed spin-labeling of DNA by the azide-alkyne 'click' reaction: nanometer distance measurements on 7-deaza-2'-deoxyadenosine and 2'-deoxyuridine nitroxide conjugates spatially separated or linked to a 'dA-dT' base pair.

    Science.gov (United States)

    Ding, Ping; Wunnicke, Dorith; Steinhoff, Heinz-Jürgen; Seela, Frank

    2010-12-27

    Nucleobase-directed spin-labeling by the azide-alkyne 'click' (CuAAC) reaction has been performed for the first time with oligonucleotides. 7-Deaza-7-ethynyl-2'-deoxyadenosine (1) and 5-ethynyl-2'-deoxyuridine (2) were chosen to incorporate terminal triple bonds into DNA. Oligonucleotides containing 1 or 2 were synthesized on a solid phase and spin labeling with 4-azido-2,2,6,6-tetramethylpiperidine 1-oxyl (4-azido-TEMPO, 3) was performed by post-modification in solution. Two spin labels (3) were incorporated with high efficiency into the DNA duplex at spatially separated positions or into a 'dA-dT' base pair. Modification at the 5-position of the pyrimidine base or at the 7-position of the 7-deazapurine residue gave steric freedom to the spin label in the major groove of duplex DNA. By applying cw and pulse EPR spectroscopy, very accurate distances between spin labels, within the range of 1-2 nm, were measured. The spin-spin distance was 1.8±0.2 nm for DNA duplex 17(dA*(7,10))⋅11 containing two spin labels that are separated by two nucleotides within one individual strand. A distance of 1.4±0.2 nm was found for the spin-labeled 'dA-dT' base pair 15(dA*(7))⋅16(dT*(6)). The 'click' approach has the potential to be applied to all four constituents of DNA, which indicates the universal applicability of the method. New insights into the structural changes of canonical or modified DNA are expected to provide additional information on novel DNA structures, protein interaction, DNA architecture, and synthetic biology.

  18. Encyclopedia of distances

    CERN Document Server

    Deza, Michel Marie

    2016-01-01

    This 4th edition of the leading reference volume on distance metrics is characterized by updated and rewritten sections on some items suggested by experts and readers, as well as a general streamlining of content and the addition of essential new topics. Though the structure remains unchanged, the new edition also explores recent advances in the use of distances and metrics for e.g. generalized distances, probability theory, graph theory, coding theory, data analysis. New topics in the purely mathematical sections include e.g. the Vitanyi multiset-metric, algebraic point-conic distance, triangular ratio metric, Rossi-Hamming metric, Taneja distance, spectral semimetric between graphs, channel metrization, and Maryland bridge distance. The multidisciplinary sections have also been supplemented with new topics, including: dynamic time warping distance, memory distance, allometry, atmospheric depth, elliptic orbit distance, VLBI distance measurements, the astronomical system of units, and walkability distance. Lea...

  19. Distance Education Council.

    Science.gov (United States)

    Indira Gandhi National Open University, New Delhi (India). Distance Education Council.

    Since its inception in India in 1962, distance education has grown in popularity. The Distance Education Council (DEC) directs distance learning within India's higher education system. The DEC's promotion, coordination, and maintenance of standards for distance education are its three major roles. Its initiatives include grants, support for…

  20. Training for Distance Teaching through Distance Learning.

    Science.gov (United States)

    Cadorath, Jill; Harris, Simon; Encinas, Fatima

    2002-01-01

    Describes a mixed-mode bachelor degree course in English language teaching at the Universidad Autonoma de Puebla (Mexico) that was designed to help practicing teachers write appropriate distance education materials by giving them the experience of being distance students. Includes a course outline and results of a course evaluation. (Author/LRW)

  1. Encyclopedia of distances

    CERN Document Server

    Deza, Michel Marie

    2014-01-01

    This updated and revised third edition of the leading reference volume on distance metrics includes new items from very active research areas in the use of distances and metrics such as geometry, graph theory, probability theory and analysis. Among the new topics included are, for example, polyhedral metric space, nearness matrix problems, distances between belief assignments, distance-related animal settings, diamond-cutting distances, natural units of length, Heidegger’s de-severance distance, and brain distances. The publication of this volume coincides with intensifying research efforts into metric spaces and especially distance design for applications. Accurate metrics have become a crucial goal in computational biology, image analysis, speech recognition and information retrieval. Leaving aside the practical questions that arise during the selection of a ‘good’ distance function, this work focuses on providing the research community with an invaluable comprehensive listing of the main available di...

  2. Distances from Planetary Nebulae

    CERN Document Server

    Ciardullo, R

    2003-01-01

    The [O III] 5007 planetary nebula luminosity function (PNLF) occupies an important place on the extragalactic distance ladder. Since it is the only method that is applicable to all the large galaxies of the Local Supercluster, it is uniquely useful for cross-checking results and linking the Population I and Population II distance scales. We review the physics underlying the method, demonstrate its precision, and illustrate its value by comparing its distances to distances obtained from Cepheids and the Surface Brightness Fluctuation (SBF) method. We use the Cepheid and PNLF distances to 13 galaxies to show that the metallicity dependence of the PNLF cutoff is in excellent agreement with that predicted from theory, and that no additional systematic corrections are needed for either method. However, when we compare the Cepheid-calibrated PNLF distance scale with the Cepheid-calibrated SBF distance scale, we find a significant offset: although the relative distances of both methods are in excellent agreement, th...

  3. A generalized evidence distance

    Institute of Scientific and Technical Information of China (English)

    Hongming Mo; Xi Lu; Yong Deng

    2016-01-01

    How to efficiently measure the distance between two basic probability assignments (BPAs) is an open issue. In this paper, a new method to measure the distance between two BPAs is proposed, based on two existing measures of evidence distance. The new proposed method is comprehensive and generalized. Numerical examples are used to illustrate the effectiveness of the proposed method.
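The abstract does not give the underlying formulas. One of the best-known evidence distances of the kind such proposals build on is Jousselme's distance (an assumption here, not named in the abstract): d(m1, m2) = sqrt(0.5 (m1-m2)ᵀ D (m1-m2)), where the Jaccard matrix D weighs the overlap of focal sets. A minimal sketch:

```python
from math import sqrt

# Jousselme's distance between two basic probability assignments (BPAs),
# represented as dicts mapping frozenset focal elements to mass.
# D[A][B] = |A & B| / |A | B| is the Jaccard similarity of focal sets.

def jousselme_distance(m1, m2):
    focals = list(set(m1) | set(m2))
    diff = [m1.get(a, 0.0) - m2.get(a, 0.0) for a in focals]
    total = 0.0
    for i, a in enumerate(focals):
        for j, b in enumerate(focals):
            jac = len(a & b) / len(a | b)
            total += diff[i] * jac * diff[j]
    return sqrt(0.5 * total)
```

Two BPAs committing all mass to disjoint singletons, e.g. `{frozenset({'a'}): 1.0}` and `{frozenset({'b'}): 1.0}`, are at distance 1.0, the maximum.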

  4. Investigating end-to-end security in the fifth generation wireless capabilities and IoT extensions

    Science.gov (United States)

    Uher, J.; Harper, J.; Mennecke, R. G.; Patton, P.; Farroha, B.

    2016-05-01

    The emerging 5th generation wireless network will be architected and specified to meet the vision of allowing billions of devices and millions of human users to share spectrum to communicate and deliver services. The expansion of wireless networks from their current role to serve these diverse communities of interest introduces new paradigms that require multi-tiered approaches. The introduction of inherently low-security components, like IoT devices, necessitates that critical data be better secured to protect the networks and users. Moreover, the high-speed communications meant to enable autonomous vehicles require ultra-reliable, low-latency paths. This research explores security within the proposed new architectures and the cross-interconnection of highly protected assets with low-cost/low-security components forming the overarching 5th generation wireless infrastructure.

  5. The NOAO Data Products Program: Developing an End-to-End Data Management System in Support of the Virtual Observatory

    Science.gov (United States)

    Smith, R. C.; Boroson, T.; Seaman, R.

    2007-10-01

    The NOAO Data Products Program (DPP) is responsible for the development and operation of the data management system for NOAO and affiliated observatories, and for the scientific support of users accessing our data holdings and using our tools and services. At the core of this mission is the capture of data from instruments at these observatories and the delivery of that content to both the Principal Investigators (PIs) who proposed for the observations and, after an appropriate proprietary period, to users worldwide who are interested in using the data for their own (often very different) scientific projects. However, delivery of raw and/or reduced images to users only scratches the surface of the extensive potential which the international Virtual Observatory (VO) initiative has to offer. By designing the whole NOAO/DPP program around not only VO standards, but more importantly around VO principles, the program becomes not an exercise in data management and NOAO user support, but rather a VO-centric program which serves the growing world-wide VO community. It is this more global aspect that drives NOAO/DPP planning, as well as more specifically the design, development, and operations of the various components of our system. In the following sections we discuss these components and how they work together to form our VO-centric program.

  6. An integrated healthcare information system for end-to-end standardized exchange and homogeneous management of digital ECG formats.

    Science.gov (United States)

    Trigo, Jesús Daniel; Martínez, Ignacio; Alesanco, Alvaro; Kollmann, Alexander; Escayola, Javier; Hayn, Dieter; Schreier, Günter; García, José

    2012-07-01

    This paper investigates the application of the enterprise information system (EIS) paradigm to standardized cardiovascular condition monitoring. There are many specifications in cardiology, particularly in the ECG standardization arena. The existence of ECG formats, however, does not guarantee the implementation of homogeneous, standardized solutions for ECG management. In fact, hospital management services need to cope with various ECG formats and, moreover, several different visualization applications. This heterogeneity hampers the normalization of integrated, standardized healthcare information systems, hence the need for finding an appropriate combination of ECG formats and a suitable EIS-based software architecture that enables standardized exchange and homogeneous management of ECG formats. Determining such a combination is one objective of this paper. The second aim is to design and develop the integrated healthcare information system that satisfies the requirements posed by the previous determination. The ECG formats selected include ISO/IEEE 11073, the Standard Communications Protocol for Computer-Assisted Electrocardiography, and an ECG ontology. The EIS-enabling techniques and technologies selected include web services, the Simple Object Access Protocol, the Extensible Markup Language, and the Business Process Execution Language. Such a selection ensures the standardized exchange of ECGs within, or across, healthcare information systems while providing modularity and accessibility.

  7. Building the tree of life from scratch: an end-to-end work flow for phylogenomic studies

    Science.gov (United States)

    Whole genome sequences are rich sources of information about organisms that are superbly useful for addressing a wide variety of evolutionary questions. Recent progress in genomics has enabled the de novo decoding of the genome of virtually any organism, greatly expanding its potential for understan...

  8. An Anthological Review of Research Utilizing MontyLingua: a Python-Based End-to-End Text Processor

    Directory of Open Access Journals (Sweden)

    2008-06-01

    Full Text Available MontyLingua, an integral part of ConceptNet, which is currently the largest commonsense knowledge base, is an English text processor developed in the Python programming language at the MIT Media Lab. The main feature of MontyLingua is its coverage of all aspects of English text processing, from raw input text to semantic meanings and summary generation, yet each component in MontyLingua is loosely coupled to the others at the architectural and code level, which enables individual components to be used independently or substituted. However, there has been no review exploring the role of MontyLingua in recent research work utilizing it. This paper aims to review the use of and roles played by MontyLingua and its components in research work published in 19 articles between October 2004 and August 2006. We observed a diversified use of MontyLingua in many different areas, both generic and domain-specific. Although the use of the text summarization component has not been observed, we are optimistic that it will have a crucial role in managing the current trend of information overload in future research.

  9. SU-E-T-268: Proton Radiosurgery End-To-End Testing Using Lucy 3D QA Phantom

    Energy Technology Data Exchange (ETDEWEB)

    Choi, D; Gordon, I; Ghebremedhin, A; Wroe, A; Schulte, R; Bush, D; Slater, J; Patyal, B [Loma Linda University Medical Center, Loma Linda, CA (United States)

    2014-06-01

    Purpose: To check the overall accuracy of proton radiosurgery treatment delivery using ready-made circular collimator inserts and fixed-thickness compensating boluses. Methods: A Lucy 3D QA phantom (Standard Imaging Inc., WI, USA) loaded with GafChromic film was irradiated with laterally scattered and longitudinally spread-out 126.8 MeV proton beams. The tests followed every step in the proton radiosurgery treatment delivery process: CT scan (GE Lightspeed VCT), target contouring, treatment planning (Odyssey 5.0, Optivus, CA), portal calibration, target localization using a robotic couch with image guidance, and dose delivery at planned gantry angles. A 2 cm diameter collimator insert in a 4 cm diameter radiosurgery cone and a 1.2 cm thick compensating flat bolus were used for all beams. Film dosimetry (RIT114 v5.0, Radiological Imaging Technology, CO, USA) was used to evaluate the accuracy of target localization and relative dose distributions compared to those calculated by the treatment planning system. Results: The localization accuracy was estimated by analyzing the GafChromic films irradiated at gantry angles of 0, 90 and 270 degrees. We observed a 0.5 mm shift in the lateral direction (patient left), a ±0.9 mm shift in the AP direction and a ±1.0 mm shift in the vertical direction (gantry dependent). The isodose overlays showed good agreement (<2 mm for the 50% isodose lines) between measured and calculated doses. Conclusion: Localization accuracy depends on gantry sag, CT resolution and distortion, DRRs from the treatment planning computer, the localization accuracy of the image guidance system, and the fabrication of the ready-made aperture and cone housing. The total deviation from the isocenter was 1.4 mm. Dose distribution uncertainty comes from distal-end error due to bolus and CT density, in addition to localization error. The planned dose distribution matched the measured values well (>90% passing rate under 2%/2mm criteria). Our test showed the robustness of our proton radiosurgery treatment delivery system using ready-made collimator inserts and fixed-thickness compensating boluses.
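The 2%/2mm criterion quoted above is conventionally evaluated with the gamma index (Low et al.): for each measured point, gamma is the minimum over reference points of the combined dose-difference/distance-to-agreement deviation, and a point passes when gamma ≤ 1. The following 1-D sketch illustrates the idea; it is not the algorithm of the RIT film dosimetry software.

```python
from math import sqrt

# 1-D gamma index: dd is the dose criterion as a fraction of the maximum
# reference dose (0.02 for 2%), dta is the distance criterion in mm (2.0).

def gamma_index(positions, measured, reference, dd=0.02, dta=2.0):
    d_norm = dd * max(reference)
    gammas = []
    for xm, dm in zip(positions, measured):
        g = min(sqrt(((xr - xm) / dta) ** 2 + ((dr - dm) / d_norm) ** 2)
                for xr, dr in zip(positions, reference))
        gammas.append(g)
    return gammas

def passing_rate(gammas):
    """Fraction of points with gamma <= 1, the usual pass criterion."""
    return sum(g <= 1.0 for g in gammas) / len(gammas)
```

A ">90% passing rate" then means that more than 90% of film pixels have gamma at or below 1 under the chosen 2%/2mm tolerances.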

  10. End-to-end encryption in on-line payment systems : The industry reluctance and the role of laws

    NARCIS (Netherlands)

    Kasiyanto, Safari

    2016-01-01

    Various security breaches at third-party payment processors show that online payment systems are the primary target for cyber-criminals. In general, the security of online payment systems relies on a number of factors, namely technical factors, processing factors, and legal factors. The industry giv

  11. End-to-end 9-D polarized bunch transport in eRHIC energy-recovery recirculator, some aspects

    Energy Technology Data Exchange (ETDEWEB)

    Meot, F. [Brookhaven National Lab. (BNL), Upton, NY (United States). Collider-Accelerator Dept.; Meot, F. [Brookhaven National Lab. (BNL), Upton, NY (United States). Collider-Accelerator Dept.; Brooks, S. [Brookhaven National Lab. (BNL), Upton, NY (United States). Collider-Accelerator Dept.; Ptitsyn, V. [Brookhaven National Lab. (BNL), Upton, NY (United States). Collider-Accelerator Dept.; Trbojevic, D. [Brookhaven National Lab. (BNL), Upton, NY (United States). Collider-Accelerator Dept.; Tsoupas, N. [Brookhaven National Lab. (BNL), Upton, NY (United States). Collider-Accelerator Dept.

    2015-05-03

    This paper is a brief overview of some of the numerous beam and spin dynamics investigations undertaken in the framework of the design of the FFAG-based electron energy-recovery re-circulator ring of the eRHIC electron-ion collider project.

  12. End-To-End Solution for Integrated Workload and Data Management using glideinWMS and Globus Online

    CERN Document Server

    CERN. Geneva

    2012-01-01

    Grid computing has enabled scientific communities to effectively share computing resources distributed over many independent sites. Several such communities, or Virtual Organizations (VO), in the Open Science Grid and the European Grid Infrastructure use the glideinWMS system to run complex application work-flows. GlideinWMS is a pilot-based workload management system (WMS) that creates an on-demand, dynamically-sized overlay Condor batch system on Grid resources. While the WMS addresses the management of compute resources, data management in the Grid is still the responsibility of the VO. In general, large VOs have resources to develop complex custom solutions, while small VOs would rather push this responsibility to the infrastructure. The latter requires a tight integration of the WMS and the data management layers, an approach still not common in modern Grids. In this paper we describe a solution developed to address this shortcoming in the context of Center for Enabling Distributed Petascale Scienc...

  13. Rethinking the Design of the Internet: The End-to-End Arguments vs. the Brave New World

    Science.gov (United States)

    2001-08-01

  14. SPAN: A Network Providing Integrated, End-to-End, Sensor-to-Database Solutions for Environmental Sciences

    Science.gov (United States)

    Benzel, T.; Cho, Y. H.; Deschon, A.; Gullapalli, S.; Silva, F.

    2009-12-01

    In recent years, advances in sensor network technology have shown great promise to revolutionize environmental data collection. Still, widespread adoption of these systems by domain experts has been lacking, and they have remained the purview of the engineers who design them. While there are currently many data logging options for basic data collection in the field, scientists are often required to visit the deployment sites to retrieve their data and manually import it into spreadsheets. Some advanced commercial software systems do allow scientists to collect data remotely, but most of these systems only allow point-to-point access and require proprietary hardware. Furthermore, these commercial solutions preclude the use of sensors from other manufacturers or integration with internet-based database repositories and compute engines. Therefore, scientists often must download and manually reformat their data before uploading it to the repositories if they wish to share their data. We present an open-source, low-cost, extensible, turnkey solution called the Sensor Processing and Acquisition Network (SPAN), which provides a robust and flexible sensor network service. At the deployment site, SPAN leverages low-power generic embedded processors to integrate a variety of commercially available sensor hardware into the network of environmental observation systems. By bringing intelligence close to the sensed phenomena, we can remotely control configuration and re-use, establish rules to trigger sensor activity, manage power requirements, and control the two-way flow of sensed data as well as control information to the sensors. Key features of our design include (1) adoption of a hardware-agnostic architecture: our solutions are compatible with several programmable platforms, sensor systems, communication devices and protocols; (2) information standardization: our system supports several popular communication protocols and data formats; and (3) extensible data support: our system works with several existing data storage systems, data models and web-based services as needed by the domain experts; examples include standard MySQL databases, Sensorbase (from UCLA), as well as SPAN Cloud, a system built using Google's App Engine that allows scientists to use Google's cloud computing cyber-infrastructure. We provide a simple, yet flexible data access control mechanism that allows groups of researchers to share their data in SPAN Cloud. In this talk, we will describe the SPAN architecture, its components, our development plans, our vision for the future, and results from current deployments that continue to drive the design of our system.

  15. Multicast Routing with End-to-End Delay for Number of Tardy Members of a Multicast Group

    Institute of Scientific and Technical Information of China (English)

    ZHOU Xianwei; CHEN Changjia; ZHU Gang

    2001-01-01

    The problem of constructing multicast trees to meet the quality-of-service requirements of real-time interactive applications operating in high-speed packet-switched environments is presented. In particular, the concept of delay is redefined: the delay of a link (or path) and the delay of a destination are distinguished. The routine end-to-end delay is defined as a "deadline delay" or "bounded delay", while the delay governing the number of tardy members is defined in this paper as a "slack delay". The slack delay of a destination refers to the feature that the accumulated delay from the source to any destination along the tree may exceed the value of the slack delay. The problem of determining such a constrained tree is NP-complete. A heuristic is presented that demonstrates good average-case behavior in terms of the two objectives. We also show that it is possible to dynamically reorganize the initial tree in response to changes in the destination set, in a way that is minimally disruptive to the multicast session.
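The paper's own heuristic is not specified in the abstract. A common baseline for delay-bounded multicast, sketched below under that assumption, is to merge the least-delay path from the source to each destination into a tree, then flag any destination whose accumulated delay exceeds its deadline as tardy:

```python
import heapq

# Baseline delay-bounded multicast sketch (not the authors' algorithm):
# compute least-delay paths with Dijkstra, merge them into a tree, and
# report destinations that miss the deadline delay. All destinations are
# assumed reachable from the source.

def least_delay_paths(graph, source):
    """graph: {u: {v: link_delay}}. Returns (delay, predecessor) maps."""
    dist, prev, heap = {source: 0.0}, {}, [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    return dist, prev

def multicast_tree(graph, source, destinations, deadline):
    dist, prev = least_delay_paths(graph, source)
    edges, tardy = set(), []
    for t in destinations:
        if dist[t] > deadline:
            tardy.append(t)  # destination exceeds its deadline delay
        node = t
        while node != source:  # trace the path back to the source
            edges.add((prev[node], node))
            node = prev[node]
    return edges, tardy
```

Because all merged paths share the same predecessor map, their union is a tree rooted at the source; the `tardy` list corresponds to the "number of tardy members" objective.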

  16. The End-to-end Demonstrator for improved decision making in the water sector in Europe (EDgE)

    Science.gov (United States)

    Wood, Eric; Wanders, Niko; Pan, Ming; Sheffield, Justin; Samaniego, Luis; Thober, Stephan; Kumar, Rohinni; Prudhomme, Christel; Houghton-Carr, Helen

    2017-04-01

    High-resolution simulations of water resources from hydrological models are vital to supporting important climate services. Apart from a high level of detail, both spatially and temporally, it is important to provide simulations that consistently cover a range of timescales, from historical reanalysis to seasonal forecast and future projections. In the new EDgE project commissioned by the ECMWF (C3S) we try to fulfill these requirements. EDgE is a proof-of-concept project which combines climate data and state-of-the-art hydrological modelling to demonstrate a water-oriented information system implemented through a web application. EDgE is working with key European stakeholders representative of private and public sectors to jointly develop and tailor approaches and techniques. With these tools, stakeholders are assisted in using improved climate information in decision-making, and supported in the development of climate change adaptation and mitigation policies. Here, we present the first results of the EDgE modelling chain, which is divided into three main processes: 1) pre-processing and downscaling; 2) hydrological modelling; 3) post-processing. Consistent downscaling and bias corrections for historical simulations, seasonal forecasts and climate projections ensure that the results across scales are robust. The daily temporal resolution and 5km spatial resolution ensure locally relevant simulations. With the use of four hydrological models (PCR-GLOBWB, VIC, mHM, Noah-MP), uncertainty between models is properly addressed, while consistency is guaranteed by using identical input data for static land surface parameterizations. The forecast results are communicated to stakeholders via Sectoral Climate Impact Indicators (SCIIs) that have been created in collaboration with the end-user community of the EDgE project. 
The final product of this project is composed of 15 years of seasonal forecast and 10 climate change projections, all combined with four hydrological models. These unique high-resolution climate information simulations in the EDgE project provide an unprecedented information system for decision-making over Europe.

  17. Update on ORNL TRANSFORM Tool: Simulating Multi-Module Advanced Reactor with End-to-End I&C

    Energy Technology Data Exchange (ETDEWEB)

    Hale, Richard Edward [ORNL; Fugate, David L [ORNL; Cetiner, Sacit M [ORNL; Qualls, A L [ORNL

    2015-05-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the fourth year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled reactor) concepts, including the use of multiple coupled reactors at a single site. The focus of this report is the development of a steam generator and drum system model that includes the complex dynamics of typical steam drum systems, the development of instrumentation and controls for the steam generator with drum system model, and the development of multi-reactor module models that reflect the full power reactor innovative small module design concept. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor models; ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface technical area; and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the TRANSFORM tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the Advanced Reactors Technology program; (2) developing a library of baseline component modules that can be assembled into full plant models using available geometry, design, and thermal-hydraulic data; (3) defining modeling conventions for interconnecting component models; and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  19. Portable air quality sensor unit for participatory monitoring: an end-to-end VESNA-AQ based prototype

    Science.gov (United States)

    Vucnik, Matevz; Robinson, Johanna; Smolnikar, Miha; Kocman, David; Horvat, Milena; Mohorcic, Mihael

    2015-04-01

    Key words: portable air quality sensor, CITI-SENSE, participatory monitoring, VESNA-AQ. The emergence of low-cost, easy-to-use portable air quality sensor units is opening new possibilities for individuals to assess their exposure to air pollutants at a specific place and time, and to share this information through an Internet connection. Such portable sensor units are being used in an ongoing citizen science project called CITI-SENSE, which enables citizens to measure and share the data. Through the creation of "citizens' observatories", the project aims to empower citizens to contribute to and participate in environmental governance, enabling them to support and influence community and societal priorities as well as associated decision making. An air quality measurement system based on the VESNA sensor platform was primarily designed within the project for use as a portable sensor unit in selected pilot cities (Belgrade, Ljubljana and Vienna) for monitoring outdoor exposure to pollutants. However, functionally the same unit with a different set of sensors could be used, for example, as an indoor platform. The version designed for the pilot studies was equipped with the following sensors: NO2, O3, CO, temperature, relative humidity, pressure and an accelerometer. The personal sensor unit is battery powered and housed in a plastic box. The VESNA-based air quality (AQ) monitoring system comprises the VESNA-AQ portable sensor unit, a smartphone app and the remote server. The personal sensor unit supports a wireless connection to an Android smartphone via built-in Wi-Fi. The smartphone in turn also serves as the communication gateway towards the remote server, using any of the available data connections. Besides the gateway functionality, the role of the smartphone is to enrich data coming from the personal sensor unit with the GPS location, timestamps and user-defined context.
This, together with the accelerometer, enables the user to better estimate one's exposure in relation to physical activities, time and location. The end user can monitor the measured parameters through a smartphone application. The smartphone app implements a custom-developed LCSP (Lightweight Client Server Protocol), which is used to send requests to the VESNA-AQ unit and to exchange information. When the data is obtained from the VESNA-AQ unit, the mobile application visualizes the data. It also has an option to forward the data to the remote server in a custom JSON structure over an HTTP POST request. The server stores the data in the database and in parallel translates the data to WFS and forwards it to the main CITI-SENSE platform over WFS-T in a common XML format over an HTTP POST request. From there the data can be accessed through the Internet and visualised in different forms and web applications developed by the CITI-SENSE project. In the course of the project, the collected data will be made publicly available, enabling the citizens to participate in environmental governance. Acknowledgements: CITI-SENSE is a Collaborative Project partly funded by the EU FP7-ENV-2012 under grant agreement no 308524 (www.citi-sense.eu).
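The custom JSON structure and the LCSP protocol are not published in the abstract, so the payload fields and endpoint in the sketch below are purely hypothetical illustrations of the described step: enriching readings with location, timestamp and context, then forwarding them over HTTP POST.

```python
import json
import urllib.request

# Hypothetical sketch of the smartphone-gateway forwarding step. All
# field names and the endpoint URL are invented for illustration; the
# project's actual JSON structure is not given in the abstract.

def build_payload(readings, lat, lon, timestamp, context=""):
    """Enrich raw sensor readings with GPS location and a timestamp."""
    return {
        "readings": readings,          # e.g. {"no2": 21.4, "o3": 64.0}
        "location": {"lat": lat, "lon": lon},
        "timestamp": timestamp,        # ISO 8601 string from the phone
        "context": context,            # user-defined activity label
    }

def post_payload(url, payload):
    """Forward the enriched payload as JSON over HTTP POST."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

The server side would then persist the JSON and, as described above, translate it to WFS for the main CITI-SENSE platform.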

  20. End-To-End Solution for Integrated Workload and Data Management using GlideinWMS and Globus Online

    Science.gov (United States)

    Mhashilkar, Parag; Miller, Zachary; Kettimuthu, Rajkumar; Garzoglio, Gabriele; Holzman, Burt; Weiss, Cathrin; Duan, Xi; Lacinski, Lukasz

    2012-12-01

    Grid computing has enabled scientific communities to effectively share computing resources distributed over many independent sites. Several such communities, or Virtual Organizations (VO), in the Open Science Grid and the European Grid Infrastructure use the GlideinWMS system to run complex application work-flows. GlideinWMS is a pilot-based workload management system (WMS) that creates an on-demand, dynamically-sized overlay Condor batch system on Grid resources. While the WMS addresses the management of compute resources, data management in the Grid is still the responsibility of the VO. In general, large VOs have resources to develop complex custom solutions, while small VOs would rather push this responsibility to the infrastructure. The latter requires a tight integration of the WMS and the data management layers, an approach still not common in modern Grids. In this paper we describe a solution developed to address this shortcoming in the context of the Center for Enabling Distributed Peta-scale Science (CEDPS) by integrating GlideinWMS with Globus Online (GO). Globus Online is a fast, reliable file transfer service that makes it easy for any user to move data. The solution eliminates the need for the users to provide custom data transfer solutions in the application by making this functionality part of the GlideinWMS infrastructure. To achieve this, GlideinWMS uses the file transfer plug-in architecture of Condor. The paper describes the system architecture and how this solution can be extended to support data transfer services other than Globus Online when used with Condor or GlideinWMS.
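The Condor file transfer plug-in architecture mentioned above works roughly as follows (a hedged sketch, not the actual GlideinWMS/Globus Online plugin): Condor probes a plugin's capabilities by invoking it with `-classad`, expecting attributes such as `SupportedMethods`; otherwise it passes a source URL and a destination path and checks the exit code. The method name `"globusonline"` and the delegated transfer command below are assumptions.

```python
import subprocess
import sys

# Skeleton of a Condor file-transfer plugin. The -classad probe protocol
# is from the Condor plugin convention; the advertised method name and
# the delegated command are placeholders, not the paper's implementation.

def plugin_classad():
    """ClassAd advertising which transfer methods this plugin handles."""
    return ('PluginVersion = "0.1"\n'
            'PluginType = "FileTransfer"\n'
            'SupportedMethods = "globusonline"\n')

def transfer(source_url, dest_path):
    """Delegate the copy to an external tool; 'echo' is a placeholder."""
    result = subprocess.run(["echo", source_url, dest_path])
    return result.returncode  # 0 signals success to Condor

def main(argv):
    if argv[1:] == ["-classad"]:
        sys.stdout.write(plugin_classad())
        return 0
    return transfer(argv[1], argv[2])

if __name__ == "__main__":
    sys.exit(main(sys.argv))
```

Hooking a new transfer service into GlideinWMS then amounts to advertising its method name and delegating to the corresponding client, which is the extension path the paper describes.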

  1. Towards a Software Framework to Support Deployment of Low Cost End-to-End Hydroclimatological Sensor Network

    Science.gov (United States)

    Celicourt, P.; Piasecki, M.

    2015-12-01

    The deployment of environmental sensor assemblies based on cheap platforms such as the Raspberry Pi and Arduino has gained much attention over the past few years. While these platforms are attractive because they can be controlled with any of several programming languages, the configuration task can become quite complex due to the need to learn several different proprietary data formats and protocols, which constitutes a bottleneck for the expansion of sensor networks. In response to this rising complexity, the Institute of Electrical and Electronics Engineers (IEEE) has sponsored the development of the IEEE 1451 standard in an attempt to introduce a common standard. The most innovative concept of the standard is the Transducer Electronic Data Sheet (TEDS), which enables transducers to self-identify, self-describe, self-calibrate, exhibit plug-and-play functionality, etc. We used Python to develop an IEEE 1451.0 platform-independent graphical user interface to generate and provide sufficient information about almost any sensor and sensor platform for sensor programming purposes, automatic calibration of sensor data, and incorporation of back-end demands on data management in TEDS for automatic standards-based data storage, search and discovery purposes. These features are paramount to making data management much less onerous in large-scale sensor networks. Along with the TEDS Creator, we developed a tool named HydroUnits for three specific purposes: encoding of physical units in the TEDS, dimensional analysis, and on-the-fly conversion of time series, allowing users to retrieve data in a desired equivalent unit while accommodating unforeseen and user-defined units. In addition, our back-end data management comprises the Python/Django equivalent of the CUAHSI Observations Data Model (ODM), namely DjangODM, which will be hosted by a MongoDB database server that offers more convenience for our application.
We are also developing a data which will be paired with the data autoloading capability of Django and a TEDS processing script to populate the database with the incoming data. The Python WaterOneFlow Web Services developed by the Texas Water Development Board will be used to publish the data. The software suite is being tested on the Raspberry Pi as end node and a laptop PC as the base station in a wireless setting.

  2. End-to-end encryption in on-line payment systems : The industry reluctance and the role of laws

    NARCIS (Netherlands)

    Kasiyanto, Safari

    2016-01-01

    Various security breaches at third-party payment processors show that online payment systems are the primary target for cyber-criminals. In general, the security of online payment systems relies on a number of factors, namely technical factors, processing factors, and legal factors. The industry giv

  3. Tourists consuming distance

    DEFF Research Database (Denmark)

    Larsen, Gunvor Riber

    The environmental impact of tourism mobility is linked to the distances travelled in order to reach a holiday destination, and with tourists travelling more and further than previously, an understanding of how the tourists view the distance they travel across becomes relevant. Based on interviews...... contribute to an understanding of how it is possible to change tourism travel behaviour towards becoming more sustainable. How tourists 'consume distance' is discussed, from the practical level of actually driving the car or sitting in the airplane, to the symbolic consumption of distance that occurs when...... travelling on holiday becomes part of a lifestyle and a social positioning game. Further, different types of tourist distance consumers are identified, ranging from the reluctant to the deliberate and nonchalant distance consumers, who display very differing attitudes towards the distance they all travel...

  4. Learning string edit distance

    CERN Document Server

    Ristad, E S; Ristad, Eric Sven; Yianilos, Peter N.

    1996-01-01

    In many applications, it is necessary to determine the similarity of two strings. A widely-used notion of string similarity is the edit distance: the minimum number of insertions, deletions, and substitutions required to transform one string into the other. In this report, we provide a stochastic model for string edit distance. Our stochastic model allows us to learn a string edit distance function from a corpus of examples. We illustrate the utility of our approach by applying it to the difficult problem of learning the pronunciation of words in conversational speech. In this application, we learn a string edit distance with one fourth the error rate of the untrained Levenshtein distance. Our approach is applicable to any string classification problem that may be solved using a similarity function against a database of labeled prototypes. Keywords: string edit distance, Levenshtein distance, stochastic transduction, syntactic pattern recognition, prototype dictionary, spelling correction, string correction, ...
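The untrained Levenshtein baseline referred to above is the textbook dynamic-programming edit distance; a minimal Python sketch (function name ours, not from the report):

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance: the minimum number of insertions,
    deletions, and substitutions that turn string a into string b."""
    n = len(b)
    prev = list(range(n + 1))  # distances from the empty prefix of a to each prefix of b
    for i in range(1, len(a) + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            curr[j] = min(prev[j] + 1,         # delete a[i-1]
                          curr[j - 1] + 1,     # insert b[j-1]
                          prev[j - 1] + cost)  # substitute (or match)
        prev = curr
    return prev[n]

print(edit_distance("kitten", "sitting"))  # → 3
```

The stochastic model described in the report replaces these unit edit costs with probabilities learned from a corpus of string pairs.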

  5. Interface Simulation Distances

    Directory of Open Access Journals (Sweden)

    Pavol Černý

    2012-10-01

    Full Text Available The classical (boolean) notion of refinement for behavioral interfaces of system components is the alternating refinement preorder. In this paper, we define a distance for interfaces, called interface simulation distance. It makes the alternating refinement preorder quantitative by, intuitively, tolerating errors (while counting them) in the alternating simulation game. We show that the interface simulation distance satisfies the triangle inequality, that the distance between two interfaces does not increase under parallel composition with a third interface, and that the distance between two interfaces can be bounded from above and below by distances between abstractions of the two interfaces. We illustrate the framework, and the properties of the distances under composition of interfaces, with two case studies.

  6. The Carnegie-Chicago Hubble Program. I. An Independent Approach to the Extragalactic Distance Scale Using Only Population II Distance Indicators

    Science.gov (United States)

    Beaton, Rachael L.; Freedman, Wendy L.; Madore, Barry F.; Bono, Giuseppe; Carlson, Erika K.; Clementini, Gisella; Durbin, Meredith J.; Garofalo, Alessia; Hatt, Dylan; Jang, In Sung; Kollmeier, Juna A.; Lee, Myung Gyoon; Monson, Andrew J.; Rich, Jeffrey A.; Scowcroft, Victoria; Seibert, Mark; Sturch, Laura; Yang, Soung-Chul

    2016-12-01

    We present an overview of the Carnegie-Chicago Hubble Program, an ongoing program to obtain a 3% measurement of the Hubble constant (H0) using alternative methods to the traditional Cepheid distance scale. We aim to establish a completely independent route to H0 using RR Lyrae variables, the tip of the red giant branch (TRGB), and Type Ia supernovae (SNe Ia). This alternative distance ladder can be applied to galaxies of any Hubble type, of any inclination, and, using old stars in low-density environments, is robust to the degenerate effects of metallicity and interstellar extinction. Given the relatively small number of SNe Ia host galaxies with independently measured distances, these properties provide a great systematic advantage in the measurement of H0 via the distance ladder. Initially, the accuracy of our value of H0 will be set by the five Galactic RR Lyrae calibrators with Hubble Space Telescope Fine-Guidance Sensor parallaxes. With Gaia, both the RR Lyrae zero-point and TRGB method will be independently calibrated, the former with at least an order of magnitude more calibrators and the latter directly through parallax measurement of tip red giants. As the first end-to-end “distance ladder” completely independent of both Cepheid variables and the Large Magellanic Cloud, this path to H0 will allow for the high-precision comparison at each rung of the traditional distance ladder that is necessary to understand tensions between this and other routes to H0. Based on observations made with the NASA/ESA Hubble Space Telescope, obtained at the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS 5-26555. These observations are associated with programs #13472 and #13691.

  7. Keeping Your Distance

    Directory of Open Access Journals (Sweden)

    Glen Gatin

    2013-06-01

    Full Text Available This analysis began with inquiries into the substantive area of distance education using the classic grounded theory method. Analysis revealed a pattern of problem-solving behavior, from which the theory Keeping Your Distance emerged. The theory is an integrated set of concepts referring to the conscious and unconscious strategies that people use to regulate distance, physical and representative, in their everyday lives. Strategies are used to control physical, emotional, and psychological realities and to conserve personal energy in interactions with individuals and/or institutions. For all social interactions, people use a personalized algorithm of engagement that mitigates conditions and consequences and preserves optimal distance. Keeping Your Distance provides a theoretical starting point for considerations of the changing notions of distance. In part, these changes have been brought about by developments in the fields of Information and Communication Technology (ICT) and online social networking.

  8. Numerical distance protection

    CERN Document Server

    Ziegler, Gerhard

    2011-01-01

    Distance protection provides the basis for network protection in transmission systems and meshed distribution systems. This book covers the fundamentals of distance protection and the special features of numerical technology. The emphasis is placed on the application of numerical distance relays in distribution and transmission systems.This book is aimed at students and engineers who wish to familiarise themselves with the subject of power system protection, as well as the experienced user, entering the area of numerical distance protection. Furthermore it serves as a reference guide for s

  9. Normalized information distance

    NARCIS (Netherlands)

    Vitányi, P.M.B.; Balbach, F.J.; Cilibrasi, R.L.; Li, M.; Emmert-Streib, F.; Dehmer, M.

    2009-01-01

    The normalized information distance is a universal distance measure for objects of all kinds. It is based on Kolmogorov complexity and thus uncomputable, but there are ways to utilize it. First, compression algorithms can be used to approximate the Kolmogorov complexity if the objects have a string

  10. Normalized information distance

    NARCIS (Netherlands)

    Vitányi, P.M.B.; Balbach, F.J.; Cilibrasi, R.L.; Li, M.

    2008-01-01

    The normalized information distance is a universal distance measure for objects of all kinds. It is based on Kolmogorov complexity and thus uncomputable, but there are ways to utilize it. First, compression algorithms can be used to approximate the Kolmogorov complexity if the objects have a string

  11. Incremental Distance Transforms (IDT)

    NARCIS (Netherlands)

    Schouten, Theo E.; van den Broek, Egon; Erçil, A.; Çetin, M.; Boyer, K.; Lee, S.-W.

    2010-01-01

    A new generic scheme for incremental implementations of distance transforms (DT) is presented: Incremental Distance Transforms (IDT). This scheme is applied on the cityblock, Chamfer, and three recent exact Euclidean DT (E2DT). A benchmark shows that for all five DT, the incremental implementation r
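For context, the batch computation that such incremental schemes avoid repeating can be sketched for the city-block metric with the classic two-pass algorithm (an illustrative sketch, not the paper's implementation):

```python
def cityblock_dt(grid):
    """Two-pass city-block (L1) distance transform of a binary image.
    Cells equal to 1 are feature pixels (distance 0); every other cell
    receives the L1 distance to its nearest feature pixel."""
    INF = 10 ** 9
    h, w = len(grid), len(grid[0])
    d = [[0 if grid[y][x] else INF for x in range(w)] for y in range(h)]
    # Forward pass: propagate distances from the top-left.
    for y in range(h):
        for x in range(w):
            if y > 0: d[y][x] = min(d[y][x], d[y - 1][x] + 1)
            if x > 0: d[y][x] = min(d[y][x], d[y][x - 1] + 1)
    # Backward pass: propagate distances from the bottom-right.
    for y in range(h - 1, -1, -1):
        for x in range(w - 1, -1, -1):
            if y < h - 1: d[y][x] = min(d[y][x], d[y + 1][x] + 1)
            if x < w - 1: d[y][x] = min(d[y][x], d[y][x + 1] + 1)
    return d

for row in cityblock_dt([[0, 0, 0], [0, 1, 0], [0, 0, 0]]):
    print(row)
```

An incremental variant, in the spirit of the IDT scheme, would update only the neighbourhood affected when a few feature pixels are added or removed, instead of rerunning both passes over the whole image.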

  12. ORDERED WEIGHTED DISTANCE MEASURE

    Institute of Scientific and Technical Information of China (English)

    Zeshui XU; Jian CHEN

    2008-01-01

    The aim of this paper is to develop an ordered weighted distance (OWD) measure, which is the generalization of some widely used distance measures, including the normalized Hamming distance, the normalized Euclidean distance, the normalized geometric distance, the max distance, the median distance and the min distance, etc. Moreover, the ordered weighted averaging operator, the generalized ordered weighted aggregation operator, the ordered weighted geometric operator, the averaging operator, the geometric mean operator, the ordered weighted square root operator, the square root operator, the max operator, the median operator and the min operator are also special cases of the OWD measure. Some methods depending on the input arguments are given to determine the weights associated with the OWD measure. The prominent characteristic of the OWD measure is that it can relieve (or intensify) the influence of unduly large or unduly small deviations on the aggregation results by assigning them low (or high) weights. This desirable characteristic makes the OWD measure very suitable for use in many fields, including group decision making, medical diagnosis, data mining, and pattern recognition. Finally, based on the OWD measure, we develop a group decision making approach and illustrate it with a numerical example.
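A sketch of the OWD measure as described (function and parameter names ours): the coordinate deviations are sorted in decreasing order and combined in a weighted power mean, so the weight vector and the exponent select the special cases listed above.

```python
def owd(x, y, weights, lam=2.0):
    """Ordered weighted distance: sort the per-coordinate deviations
    |x_i - y_i| in decreasing order, then take a weighted power mean
    with exponent lam. The weights are assumed to sum to 1."""
    devs = sorted((abs(a - b) for a, b in zip(x, y)), reverse=True)
    return sum(w * d ** lam for w, d in zip(weights, devs)) ** (1.0 / lam)

x, y = [0.1, 0.8, 0.4], [0.3, 0.2, 0.4]
n = len(x)
print(owd(x, y, [1 / n] * n, lam=1))  # normalized Hamming distance
print(owd(x, y, [1 / n] * n, lam=2))  # normalized Euclidean distance
print(owd(x, y, [1, 0, 0], lam=1))    # max distance (all weight on the largest deviation)
```

Placing all weight on the median or smallest ordered deviation recovers the median and min distances in the same way.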

  13. Duty and Distance

    NARCIS (Netherlands)

    C. Binder (C.); C. Heilmann (Conrad)

    2017-01-01

    Ever since the publication of Peter Singer's article "Famine, Affluence, and Morality", the question of whether the (geographical) distance to people in need affects our moral duties towards them has been a hotly debated issue. Does geographical distance affect our moral duties?

  14. Biomechanics of Distance Running.

    Science.gov (United States)

    Cavanagh, Peter R., Ed.

    Contributions from researchers in the field of running mechanics are included in the 13 chapters of this book. The following topics are covered: (1) "The Mechanics of Distance Running: A Historical Perspective" (Peter Cavanagh); (2) "Stride Length in Distance Running: Velocity, Body Dimensions, and Added Mass Effects" (Peter Cavanagh, Rodger…

  15. Distance Learning Environment Demonstration.

    Science.gov (United States)

    1996-11-01

    The Distance Learning Environment Demonstration (DLED) was a comparative study of distributed multimedia computer-based training using low cost high...measurement. The DLED project provides baseline research in the effective use of distance learning and multimedia communications over a wide area ATM/SONET

  17. Estimating distances from parallaxes

    Science.gov (United States)

    Astraatmadja, Tri L.; Bailer-Jones, Coryn

    2017-01-01

    In astrometric surveys such as Gaia and LSST, parallaxes will be measured for about a billion stars, but no distances will be measured. Distances must be inferred from the parallaxes, and the common inference practice is to invert the parallax. This, however, is only appropriate when there is no noise present. As noise will always be present and most stars in future surveys will have non-negligible fractional parallax uncertainties, we must treat distance estimation as an inference problem. The use of prior assumptions becomes unavoidable. In this talk I will present a method for inferring distances using Bayesian inference. Three minimalist, isotropic priors are used, as well as an anisotropic prior derived from the observability of stars in a Milky Way model. The performance of these priors is investigated using a simulated Gaia-like catalogue. Recent results of distance estimation using the parallaxes of 2 million Gaia DR1 stars will also be discussed.

  18. Normalized Information Distance

    CERN Document Server

    Vitanyi, Paul M B; Cilibrasi, Rudi L; Li, Ming

    2008-01-01

    The normalized information distance is a universal distance measure for objects of all kinds. It is based on Kolmogorov complexity and thus uncomputable, but there are ways to utilize it. First, compression algorithms can be used to approximate the Kolmogorov complexity if the objects have a string representation. Second, for names and abstract concepts, page count statistics from the World Wide Web can be used. These practical realizations of the normalized information distance can then be applied to machine learning tasks, especially clustering, to perform feature-free and parameter-free data mining. This chapter discusses the theoretical foundations of the normalized information distance and both practical realizations. It presents numerous examples of successful real-world applications based on these distance measures, ranging from bioinformatics to music clustering to machine translation.
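The compression-based realization mentioned above is commonly written as the normalized compression distance (NCD), substituting a real compressor's output length for the uncomputable Kolmogorov complexity. A minimal sketch using zlib as the compressor:

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance, an approximation of the
    normalized information distance with C(.) = zlib-compressed length:
    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
    cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

s1 = b"the quick brown fox jumps over the lazy dog" * 20
s2 = b"the quick brown fox jumps over the lazy cat" * 20
s3 = bytes(range(256)) * 4
# Similar strings compress well together, so their NCD is smaller.
print(ncd(s1, s2) < ncd(s1, s3))  # → True
```

Better compressors (bzip2, PPM) approximate the ideal distance more closely; the choice of compressor is the main tuning knob of this realization.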

  19. On using Multiple Quality Link Metrics with Destination Sequenced Distance Vector Protocol for Wireless Multi-Hop Networks

    CERN Document Server

    Javaid, N; Khan, Z A; Djouani, K

    2012-01-01

    In this paper, we compare and analyze the performance of five quality link metrics for Wireless Multi-hop Networks (WMhNs). The metrics are based on loss probability measurements: ETX, ETT, InvETX, ML and MD, in a distance vector routing protocol, DSDV. Among these selected metrics, we have implemented ML, MD, InvETX and ETT in DSDV; these were previously implemented with different protocols: ML, MD and InvETX with OLSR, and ETT in MR-LQSR. For our comparison, we have selected Throughput, Normalized Routing Load (NRL) and End-to-End Delay (E2ED) as performance parameters. Finally, we deduce that InvETX, due to its low computational burden and link-asymmetry measurement, outperforms all the other metrics.
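For readers unfamiliar with the baseline metric: ETX (De Couto et al.) is the expected number of transmissions, including retransmissions, needed to deliver a packet over a link. A small illustration of the standard definition (not the paper's code):

```python
def etx(d_f: float, d_r: float) -> float:
    """Expected transmission count of a link: a transmission succeeds
    when the data frame (forward delivery ratio d_f) and its ACK
    (reverse delivery ratio d_r) both get through, so the expected
    number of attempts is 1 / (d_f * d_r)."""
    return 1.0 / (d_f * d_r)

def path_etx(links):
    """Route metric: the sum of the per-link ETX values."""
    return sum(etx(df, dr) for df, dr in links)

print(etx(1.0, 1.0))                       # lossless link: 1 transmission
print(path_etx([(0.9, 0.8), (0.5, 1.0)]))  # two lossy hops
```

The other metrics in the study refine this idea: ETT additionally weights by packet size and link bandwidth, while ML multiplies delivery probabilities along the route instead of summing costs.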

  20. On the inversion-indel distance.

    Science.gov (United States)

    Willing, Eyla; Zaccaria, Simone; Braga, Marília D V; Stoye, Jens

    2013-01-01

    The inversion distance, that is the distance between two unichromosomal genomes with the same content allowing only inversions of DNA segments, can be computed thanks to a pioneering approach of Hannenhalli and Pevzner in 1995. In 2000, El-Mabrouk extended the inversion model to allow the comparison of unichromosomal genomes with unequal contents, thus insertions and deletions of DNA segments besides inversions. However, an exact algorithm was presented only for the case in which we have insertions alone and no deletion (or vice versa), while a heuristic was provided for the symmetric case, that allows both insertions and deletions and is called the inversion-indel distance. In 2005, Yancopoulos, Attie and Friedberg started a new branch of research by introducing the generic double cut and join (DCJ) operation, that can represent several genome rearrangements (including inversions). Among others, the DCJ model gave rise to two important results. First, it has been shown that the inversion distance can be computed in a simpler way with the help of the DCJ operation. Second, the DCJ operation originated the DCJ-indel distance, that allows the comparison of genomes with unequal contents, considering DCJ, insertions and deletions, and can be computed in linear time. In the present work we put these two results together to solve an open problem, showing that, when the graph that represents the relation between the two compared genomes has no bad components, the inversion-indel distance is equal to the DCJ-indel distance. We also give a lower and an upper bound for the inversion-indel distance in the presence of bad components.

  1. Estimating distances from parallaxes

    CERN Document Server

    Bailer-Jones, C A L

    2015-01-01

    Astrometric surveys such as Gaia and LSST will measure parallaxes for hundreds of millions of stars. Yet they will not measure a single distance. Rather, a distance must be estimated from a parallax. In this didactic article, I show that doing this is not trivial once the fractional parallax error is larger than about 20%, which will be the case for about 80% of stars in the Gaia catalogue. Estimating distances is an inference problem in which the use of prior assumptions is unavoidable. I investigate the properties and performance of various priors and examine their implications. A supposed uninformative uniform prior in distance is shown to give very poor distance estimates (large bias and variance). Any prior with a sharp cut-off at some distance has similar problems. The choice of prior depends on the information one has available - and is willing to use - concerning, for example, the survey and the Galaxy. I demonstrate that a simple prior which decreases asymptotically to zero at infinite distance has g...
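The flavour of such an inference can be sketched in a few lines. This toy version (our own; the scale length and grid settings are illustrative) finds the posterior mode of the distance under an exponentially decreasing space density prior of the kind the article investigates:

```python
import math

def posterior_mode_distance(parallax_mas, sigma_mas,
                            L_kpc=1.35, r_max_kpc=50.0, n_grid=20000):
    """Grid search for the posterior mode of distance r (in kpc) given
    a parallax measurement (in mas) with Gaussian error sigma, under
    the prior p(r) ∝ r^2 exp(-r/L). The model parallax is 1/r because
    a parallax of 1 mas corresponds to a distance of 1 kpc."""
    best_r, best_lp = None, -math.inf
    for i in range(1, n_grid + 1):
        r = r_max_kpc * i / n_grid
        lp = (2.0 * math.log(r) - r / L_kpc                         # log prior
              - 0.5 * ((parallax_mas - 1.0 / r) / sigma_mas) ** 2)  # log likelihood
        if lp > best_lp:
            best_r, best_lp = r, lp
    return best_r

# 10% fractional error: the mode sits near the naive 1/parallax distance.
print(posterior_mode_distance(2.0, 0.2))
# 100% fractional error: the prior dominates and sets the distance scale.
print(posterior_mode_distance(0.5, 0.5))
```

This illustrates the article's point: at small fractional error the data dominate and 1/parallax is a fair estimate, while at large fractional error the answer is mostly a statement about the prior.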

  2. MOTIVATION FOR DISTANCE EDUCATION

    Directory of Open Access Journals (Sweden)

    Răzvan ȘTEFĂNESCU

    2009-10-01

    Full Text Available Beginning in the 1980s, new information, communication and computer-based technologies stimulated the development of distance education. In Romania the universities adapted rapidly to this type of learning, which became an important financing source for most of them. In this article we approach the causes of the attraction of distance education. For this purpose we use an investigation we conducted on a group of distance education students, including interviews regarding their reasons for choosing this type of learning.

  3. Influence of packing interactions on the average conformation of B-DNA in crystalline structures.

    Science.gov (United States)

    Tereshko, V; Subirana, J A

    1999-04-01

    The molecular interactions in crystals of oligonucleotides in the B form have been analysed and in particular the end-to-end interactions. Phosphate-phosphate interactions in dodecamers are also reviewed. A strong influence of packing constraints on the average conformation of the double helix is found. There is a strong relationship between the space group, the end-to-end interactions and the average conformation of DNA. Dodecamers must have a B-form average conformation with 10 +/- 0.1 base pairs per turn in order to crystallize in the P212121 and related space groups usually found. Decamers show a wider range of conformational variation, with 9.7-10.6 base pairs per turn, depending on the terminal sequence and the space group. The influence of the space group in decamers is quite striking and remains unexplained. Only small variations are allowed in each case. Thus, crystal packing is strongly related to the average DNA conformation in the crystals and deviations from the average are rather limited. The constraints imposed by the crystal lattice explain why the average twist of the DNA in solution (10.6 base pairs per turn) is seldom found in oligonucleotides crystallized in the B form.

  4. Distance learning perspectives.

    Science.gov (United States)

    Pandza, Haris; Masic, Izet

    2013-01-01

    The development of modern technology and the Internet has enabled the explosive growth of distance learning. Distance learning is a process that is increasingly present in the world. It is the field of education focused on educating students who are not physically present in traditional classrooms or on a student campus. It is described as a process where the source of information is separated from the students in space and time. If there are situations that require the physical presence of students, such as when a student is required to physically attend an exam, this is called a hybrid form of distance learning. This technology is increasingly used worldwide. The Internet has become the main communication channel for the development of distance learning.

  5. Learning Pullback HMM Distances.

    Science.gov (United States)

    Cuzzolin, Fabio; Sapienza, Michael

    2014-07-01

    Recent work in action recognition has exposed the limitations of methods which directly classify local features extracted from spatio-temporal video volumes. In opposition, encoding the actions' dynamics via generative dynamical models has a number of attractive features: however, using all-purpose distances for their classification does not necessarily deliver good results. We propose a general framework for learning distance functions for generative dynamical models, given a training set of labelled videos. The optimal distance function is selected among a family of pullback ones, induced by a parametrised automorphism of the space of models. We focus here on hidden Markov models and their model space, and design an appropriate automorphism there. Experimental results are presented which show how pullback learning greatly improves action recognition performances with respect to base distances.

  6. Presence at a distance.

    Science.gov (United States)

    Haddouk, Lise

    2015-01-01

    Nowadays, in the context of the cyberculture, computer-mediated inter-subjective relationships are part of our everyday lives, in both the professional and personal spheres, and for all age groups. In the clinical field, many applications have been developed to facilitate the exchange of information and mediate the relationship between patient and therapist. In psychology, more or less immersive technologies are used to encourage the feeling of presence among users and to trigger certain psychological processes. In our research, we have explored the remote clinical interview through videoconferencing, with the development and utilisation of the iPSY platform, totally focused on this objective. In this context, we have considered the notion of intersubjectivity, despite the physical absence. This research is leading us today to envision the notions of distance and presence, and possibly to redefine them. Thus, can we still oppose physical distance to psychological distance? Can we still affirm that physical absence does not permit a psychological co-presence in certain interactions, like that observed in video interviews? The results show that the psychological processes activated in this context are similar to those observed in "traditional" clinical consultations between patient and therapist. However, certain specifics have led us to consider the concept of distance, here influenced by the framework, and to observe its effects. This distance could possibly constitute a therapeutic lever for some patients, notably for those who have difficulties establishing the right psychological distance in their relationships with others. According to these results, can "distance" still be opposed to "presence", or could it be redefined? This also opens up questions on the more general concept of digital relationships and the definition of their specificities.

  7. Organization and dynamics of the nonhomologous end-joining machinery during DNA double-strand break repair.

    Science.gov (United States)

    Reid, Dylan A; Keegan, Sarah; Leo-Macias, Alejandra; Watanabe, Go; Strande, Natasha T; Chang, Howard H; Oksuz, Betul Akgol; Fenyo, David; Lieber, Michael R; Ramsden, Dale A; Rothenberg, Eli

    2015-05-19

    Nonhomologous end-joining (NHEJ) is a major repair pathway for DNA double-strand breaks (DSBs), involving synapsis and ligation of the broken strands. We describe the use of in vivo and in vitro single-molecule methods to define the organization and interaction of NHEJ repair proteins at DSB ends. Super-resolution fluorescence microscopy allowed the precise visualization of XRCC4, XLF, and DNA ligase IV filaments adjacent to DSBs, which bridge the broken chromosome and direct rejoining. We show, by single-molecule FRET analysis of the Ku/XRCC4/XLF/DNA ligase IV NHEJ ligation complex, that end-to-end synapsis involves a dynamic positioning of the two ends relative to one another. Our observations form the basis of a new model for NHEJ that describes the mechanism whereby filament-forming proteins bridge DNA DSBs in vivo. In this scheme, the filaments at either end of the DSB interact dynamically to achieve optimal configuration and end-to-end positioning and ligation.

  8. Experimental phase diagram of negatively supercoiled DNA measured by magnetic tweezers and fluorescence

    Science.gov (United States)

    Vlijm, Rifka; Mashaghi, Alireza; Bernard, Stéphanie; Modesti, Mauro; Dekker, Cees

    2015-02-01

    The most common form of DNA is the well-known B-structure of double-helix DNA. Many processes in the cell, however, exert force and torque, inducing structural changes to the DNA that are vital to biological function. Virtually all DNA in cells is in a state of negative supercoiling, with a DNA structure that is complex. Using magnetic tweezers combined with fluorescence imaging, we here study DNA structure as a function of negative supercoiling at the single-molecule level. We classify DNA phases based on DNA length as a function of supercoiling, down to a very high negative supercoiling density σ of -2.5, and forces up to 4.5 pN. We characterize plectonemes using fluorescence imaging. DNA bubbles are visualized by the binding of fluorescently labelled RPA, a eukaryotic single-strand-binding protein. The presence of Z-DNA, a left-handed form of DNA, is probed by the binding of Zα77, the minimal binding domain of a Z-DNA-binding protein. Without supercoiling, DNA is in the relaxed B-form. Upon going toward negative supercoiling, plectonemic B-DNA is being formed below 0.6 pN. At higher forces and supercoiling densities down to about -1.9, a mixed state occurs with plectonemes, multiple bubbles and left-handed L-DNA. Around σ = -1.9, a buckling transition occurs after which the DNA end-to-end length linearly decreases when applying more negative turns, into a state that we interpret as plectonemic L-DNA. By measuring DNA length, Zα77 binding, plectoneme and ssDNA visualisation, we thus have mapped the co-existence of many DNA structures and experimentally determined the DNA phase diagram at (extreme) negative supercoiling.
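The supercoiling density σ quoted in such experiments is defined relative to the relaxed linking number of the molecule; for B-DNA that is roughly one turn per 10.5 base pairs. A small illustrative helper (ours, not from the paper):

```python
def supercoiling_density(n_bp: int, delta_lk: float, bp_per_turn: float = 10.5) -> float:
    """sigma = delta_Lk / Lk0, where Lk0 = N / bp_per_turn is the
    relaxed linking number of an N-bp B-DNA duplex (~10.5 bp/turn)."""
    lk0 = n_bp / bp_per_turn
    return delta_lk / lk0

# A 10.5-kb molecule unwound by 100 turns reaches sigma = -0.1.
print(supercoiling_density(10500, -100))  # → -0.1
```

On this scale the extreme densities reported above (σ down to -2.5) correspond to removing more turns than the relaxed duplex contains, which is why the molecule is driven into bubbles, L-DNA and Z-DNA rather than remaining B-form.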

  9. Distance learning for similarity estimation

    NARCIS (Netherlands)

    Yu, J.; Amores, J.; Sebe, N.; Radeva, P.; Tian, Q.

    2008-01-01

    In this paper, we present a general guideline to find a better distance measure for similarity estimation based on statistical analysis of distribution models and distance functions. A new set of distance measures are derived from the harmonic distance, the geometric distance, and their generalized

  10. Stereoscopic distance perception

    Science.gov (United States)

    Foley, John M.

    1989-01-01

    Limited cue, open-loop tasks in which a human observer indicates distances or relations among distances are discussed. By open-loop tasks is meant tasks in which the observer gets no feedback as to the accuracy of the responses. What happens when cues are added and when the loop is closed is also considered. The implications of this research for the effectiveness of visual displays are discussed. Errors in visual distance tasks do not necessarily mean that the percept is in error. The error could arise in transformations that intervene between the percept and the response. It is argued that the percept is in error. It is also argued that there exist post-perceptual transformations that may contribute to the error or be modified by feedback to correct for the error.

  12. Distances to Dark Clouds: Comparing Extinction Distances to Maser Parallax Distances

    CERN Document Server

    Foster, Jonathan B; Benjamin, Robert A; Hoare, Melvin G; Jackson, James M

    2012-01-01

    We test two different methods of using near-infrared extinction to estimate distances to dark clouds in the first quadrant of the Galaxy using large near infrared (2MASS and UKIDSS) surveys. VLBI parallax measurements of masers around massive young stars provide the most direct and bias-free measurement of the distance to these dark clouds. We compare the extinction distance estimates to these maser parallax distances. We also compare these distances to kinematic distances, including recent re-calibrations of the Galactic rotation curve. The extinction distance methods agree with the maser parallax distances (within the errors) between 66% and 100% of the time (depending on method and input survey) and between 85% and 100% of the time outside of the crowded Galactic center. Although the sample size is small, extinction distance methods reproduce maser parallax distances better than kinematic distances; furthermore, extinction distance methods do not suffer from the kinematic distance ambiguity. This validatio...

  13. Sets avoiding integral distances

    CERN Document Server

    Kurz, Sascha

    2012-01-01

    We study open point sets in Euclidean spaces $\\mathbb{R}^d$ without a pair of points an integral distance apart. By a result of Furstenberg, Katznelson, and Weiss such sets must be of Lebesgue upper density zero. We are interested in how large such sets can be in $d$-dimensional volume. We determine the lower and upper bounds for the volumes of the sets in terms of the number of their connected components and dimension, and also give some exact values. Our problem can be viewed as a kind of inverse to known problems on sets with pairwise rational or integral distances.

  14. Distance Teaching on Bornholm

    DEFF Research Database (Denmark)

    Hansen, Finn J. S.; Clausen, Christian

    2001-01-01

    The case study represents an example of a top-down introduction of distance teaching as part of Danish trials with the introduction of multimedia in education. The study is concerned with the background, aim and context of the trial as well as the role and working of the technology and the organisational set-up.

  15. Distances to star forming regions

    CERN Document Server

    Loinard, Laurent

    2014-01-01

    The determination of accurate distances to star-forming regions is discussed in the broader historical context of astronomical distance measurements. We summarize recent results for regions within 1 kpc and present perspectives for the near and more distant future.

  16. Bacteriophage lambda DNA packaging: DNA site requirements for termination and processivity.

    Science.gov (United States)

    Cue, D; Feiss, M

    2001-08-10

    Bacteriophage lambda chromosomes are processively packaged into preformed shells, using end-to-end multimers of intracellular viral DNA as the packaging substrate. A 200 bp long DNA segment, cos, contains all the sequences needed for DNA packaging. The work reported here shows that efficient DNA packaging termination requires cos's I2 segment, in addition to the required termination subsite, cosQ, and the nicking site, cosN. Efficient processivity requires cosB, in addition to cosQ and cosN. An initiation-defective mutant form of cosB sponsored efficient processivity, indicating that the terminase-cosB interactions required for termination are less stringent than those required at initiation. The finding that an initiation-defective form of cosB is functional for processivity allows a re-interpretation of a similar finding, obtained previously, that the initiation-defective cosB of phage 21 is functional for processivity by the lambda packaging machinery. The cosBphi21 result can now be interpreted as indicating that interactions between cosBphi21 and lambda terminase, while insufficient for initiation, function for processivity.

  17. Signaling Over Distances.

    Science.gov (United States)

    Saito, Atsushi; Cavalli, Valeria

    2016-02-01

    Neurons are extremely polarized cells. Axon lengths often exceed the dimension of the neuronal cell body by several orders of magnitude. These extreme axonal lengths imply that neurons have mastered efficient mechanisms for long distance signaling between soma and synaptic terminal. These elaborate mechanisms are required for neuronal development and maintenance of the nervous system. Neurons can fine-tune long distance signaling through calcium wave propagation and bidirectional transport of proteins, vesicles, and mRNAs along microtubules. The signal transmission over extreme lengths also ensures that information about axon injury is communicated to the soma and allows for repair mechanisms to be engaged. This review focuses on the different mechanisms employed by neurons to signal over long axonal distances and how signals are interpreted in the soma, with an emphasis on proteomic studies. We also discuss how proteomic approaches could help further deciphering the signaling mechanisms operating over long distance in axons. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.

  18. Weighted Feature Distance

    DEFF Research Database (Denmark)

    Ortiz-Arroyo, Daniel; Yazdani, Hossein

    2017-01-01

    The accuracy of machine learning methods for clustering depends on the optimal selection of similarity functions. Conventional distance functions for the vector space might cause an algorithm to be affected by some dominant features that may skew its final results. This paper introduces a flexib...
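The truncated abstract describes de-emphasising dominant features; one common way to express that idea is a per-feature weighted Euclidean distance. A sketch under that assumption (an illustration, not the paper's actual formulation):

```python
import math

def weighted_distance(x, y, weights):
    """Euclidean distance with per-feature weights, so dominant features can be down-weighted."""
    return math.sqrt(sum(w * (a - b) ** 2 for w, a, b in zip(weights, x, y)))

# Uniform weights reduce to the ordinary Euclidean distance:
d = weighted_distance([0, 0], [3, 4], [1, 1])  # 5.0
```

Setting a feature's weight to zero removes its influence entirely, which is the limiting case of the skew-reduction the abstract alludes to.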

  19. Encyclopedia of Distance Learning

    Science.gov (United States)

    Howard, Caroline, Ed.; Boettecher, Judith, Ed.; Justice, Lorraine, Ed.; Schenk, Karen, Ed.; Rogers, Patricia, Ed.; Berg, Gary, Ed.

    2005-01-01

    The innovations in computer and communications technologies, combined with on-going needs to deliver educational programs to students regardless of their physical locations, have led to the development of distance education programs and technologies. To keep up with recent developments in both areas of technologies and techniques related to…

  20. Accreditation of Distance Learning

    Science.gov (United States)

    Demirel, Ergün

    2016-01-01

    Higher education institutes aspire to gain a reputation for quality by having accreditation from internationally recognized awarding bodies. The accreditation process leads to and provides quality assurance for education. Although distance learning has become a significant part of the education system in the 21st century, there is still a common opinion that the…

  1. Prospect of Distance Learning

    Science.gov (United States)

    Rahman, Monsurur; Karim, Reza; Byramjee, Framarz

    2015-01-01

    Many educational institutions in the United States are currently offering programs through distance learning, and that trend is rising. In almost all spheres of education a developing country like Bangladesh needs to make available the expertise of the most qualified faculty to her distant people. But the fundamental question remains as to whether…

  2. Rapport in Distance Education

    Science.gov (United States)

    Murphy, Elizabeth; Rodriguez-Manzanares, Maria A.

    2012-01-01

    Rapport has been recognized as important in learning in general but little is known about its importance in distance education (DE). The study we report on in this paper provides insights into the importance of rapport in DE as well as challenges to and indicators of rapport-building in DE. The study relied on interviews with 42 Canadian…

  3. Misconceptions of Astronomical Distances

    Science.gov (United States)

    Miller, Brian W.; Brewer, William F.

    2010-01-01

    Previous empirical studies using multiple-choice procedures have suggested that there are misconceptions about the scale of astronomical distances. The present study provides a quantitative estimate of the nature of this misconception among US university students by asking them, in an open-ended response format, to make estimates of the distances…

  4. Ancient DNA

    DEFF Research Database (Denmark)

    Willerslev, Eske; Cooper, Alan

    2004-01-01

    ancient DNA, palaeontology, palaeoecology, archaeology, population genetics, DNA damage and repair

  5. Fuzzy clustering with Minkowski distance

    NARCIS (Netherlands)

    P.J.F. Groenen (Patrick); U. Kaymak (Uzay); J.M. van Rosmalen (Joost)

    2006-01-01

    Distances in the well-known fuzzy c-means algorithm of Bezdek (1973) are measured by the squared Euclidean distance. Other distances have been used as well in fuzzy clustering. For example, Jajuga (1991) proposed to use the L_1-distance and Bobrowski and Bezdek (1991) also used the L_inf
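The L_1, Euclidean, and L_inf distances mentioned in this record are all members of the Minkowski family, parameterised by p. A small illustrative implementation:

```python
def minkowski(x, y, p):
    """Minkowski distance: p=1 gives L_1, p=2 the Euclidean distance,
    and p=inf the Chebyshev (L_inf) distance."""
    diffs = [abs(a - b) for a, b in zip(x, y)]
    if p == float("inf"):
        return max(diffs)
    return sum(d ** p for d in diffs) ** (1.0 / p)

# For the points (0,0) and (3,4): L_1 = 7, L_2 = 5, L_inf = 4.
```

Fuzzy clustering with a general Minkowski distance, as in this record, amounts to swapping this function in for the squared Euclidean distance in the c-means objective.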

  6. The sound of distance.

    Science.gov (United States)

    Rabaglia, Cristina D; Maglio, Sam J; Krehm, Madelaine; Seok, Jin H; Trope, Yaacov

    2016-07-01

    Human languages may be more than completely arbitrary symbolic systems. A growing literature supports sound symbolism, or the existence of consistent, intuitive relationships between speech sounds and specific concepts. Prior work establishes that these sound-to-meaning mappings can shape language-related judgments and decisions, but do their effects generalize beyond merely the linguistic and truly color how we navigate our environment? We examine this possibility, relating a predominant sound symbolic distinction (vowel frontness) to a novel associate (spatial proximity) in five studies. We show that changing one vowel in a label can influence estimations of distance, impacting judgment, perception, and action. The results (1) provide the first experimental support for a relationship between vowels and spatial distance and (2) demonstrate that sound-to-meaning mappings have outcomes that extend beyond just language and can - through a single sound - influence how we perceive and behave toward objects in the world. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Tolman's Luminosity-Distance, Poincare's Light-Distance and Cayley-Klein's Hyperbolic Distance

    CERN Document Server

    Pierseaux, Yves

    2009-01-01

    We deduce Tolman's formula of luminosity-distance in Cosmology from Poincare's definition of light-distance with the Lorentz Transformation (LT). In the Minkowskian metric, if distance is proper time (as is often argued) then light-distance must also be the shortest distance, like proper duration (unlike Einstein's longest length within the rest system). By introducing Poincare's proper light-distance into Einstein's basic synchronization we deduce a dilated distance between observer and receding mirror (with the relativistic Doppler factor). Such a distance corresponds not to a Euclidean distance (Einstein's rigid rod) but to a hyperbolic distance (Cayley-Klein) with a Lobatchevskian horizon. From a basic proportionality between hyperbolic distance and velocity, we deduce the law of Hubble. By following Penrose's Lobatchevskian representation of LT, we transform Special Relativity (SR) into a Hyperbolic Cosmological Relativity (HCR), using only the LT, but the whole LT. In Hyperbolic Rotation motion (basic active LT or Einstein's ...

  8. Distance, Borders, and Time

    DEFF Research Database (Denmark)

    Skillicorn, David; Walther, Olivier; Zheng, Quan

    … is a combination of the physical geography of the target environment, and the mental and physical cost of following a seemingly random pattern of attacks. Focusing on the distance and time between attacks and taking into consideration the transaction costs that state boundaries impose, we wish to understand what… …of North and West Africa that depicts the permeability to violence. A better understanding of how location, time, and borders condition attacks enables planning, prepositioning, and response.

  9. Monge Distance between Quantum States

    CERN Document Server

    Zyczkowski, Karol; Slomczynski, Wojciech

    1998-01-01

    We define a metric in the space of quantum states taking the Monge distance between corresponding Husimi distributions (Q--functions). This quantity fulfills the axioms of a metric and satisfies the following semiclassical property: the distance between two coherent states is equal to the Euclidean distance between corresponding points in the classical phase space. We compute analytically distances between certain states (coherent, squeezed, Fock and thermal) and discuss a scheme for numerical computation of Monge distance for two arbitrary quantum states.

  10. The DDO IVC Distance Project

    CERN Document Server

    Gladders, Michael D.; Clarke, Tracy E.; Burns, Christopher R.; Attard, Allen; Casey, Michael P.; Hamilton, Devon; Mallen-Ornelas, Gabriela; Karr, Jennifer L.; Poirier, Sara M.; Sawicki, Marcin; Barrientos, Felipe; Barkhouse, Wayne; Brodwin, Mark; Clark, Jason; McNaughton, Rosemary; Ruetalo-Pacheco, Marcelo; Mochnacki, Stefan W.

    1998-01-01

    We present the first set of distance limits from the David Dunlap Observatory Intermediate Velocity Cloud (DDO IVC) distance project. Such distance measures are crucial to understanding the origins and dynamics of IVCs, as the distances set most of the basic physical parameters for the clouds. Currently there are very few IVCs with reliably known distances. This paper describes in some detail the basic techniques used to measure distances, with particular emphasis on the analysis of interstellar absorption line data, which forms the basis of our distance determinations. As an example, we provide a detailed description of our distance determination for the Draco Cloud. Preliminary distance limits for a total of eleven clouds are provided.

  11. Advanced hierarchical distance sampling

    Science.gov (United States)

    Royle, Andy

    2016-01-01

    In this chapter, we cover a number of important extensions of the basic hierarchical distance-sampling (HDS) framework from Chapter 8. First, we discuss the inclusion of “individual covariates,” such as group size, in the HDS model. This is important in many surveys where animals form natural groups that are the primary observation unit, with the size of the group expected to have some influence on detectability. We also discuss HDS integrated with time-removal and double-observer or capture-recapture sampling. These “combined protocols” can be formulated as HDS models with individual covariates, and thus they have a commonality with HDS models involving group structure (group size being just another individual covariate). We cover several varieties of open-population HDS models that accommodate population dynamics. On one end of the spectrum, we cover models that allow replicate distance sampling surveys within a year, which estimate abundance relative to availability and temporary emigration through time. We consider a robust design version of that model. We then consider models with explicit dynamics based on the Dail and Madsen (2011) model and the work of Sollmann et al. (2015). The final major theme of this chapter is relatively newly developed spatial distance sampling models that accommodate explicit models describing the spatial distribution of individuals known as Point Process models. We provide novel formulations of spatial DS and HDS models in this chapter, including implementations of those models in the unmarked package using a hack of the pcount function for N-mixture models.

  12. Distance Metric Tracking

    Science.gov (United States)

    2016-03-02

  13. Long distance tunneling

    CERN Document Server

    Ivlev, B I

    2005-01-01

    Quantum tunneling between two potential wells in a magnetic field can be strongly increased when the potential barrier varies in the direction perpendicular to the line connecting the two wells and remains constant along this line. A periodic structure of the wave function is formed in the direction joining the wells. The resulting motion can be coherent like motion in a conventional narrow band periodic structure. A particle penetrates the barrier over a long distance which strongly contrasts to WKB-like tunneling. The whole problem is stationary. The coherent process can be influenced by dissipation.

  14. Distances on Lozenge Tilings

    CERN Document Server

    Bodini, Olivier; Fernique, Thomas

    2009-01-01

    In this paper, a structural property of the set of lozenge tilings of a 2n-gon is highlighted. We introduce a simple combinatorial value called the Hamming-distance, which is a lower bound for the flip-distance (i.e. the number of necessary local transformations involving three lozenges) between two given tilings. It is here proven that, for n ≥ 5, there are deficient pairs of tilings for which the flip connection needs more flips than the combinatorial lower bound indicates.

  15. Distance Teaching on Bornholm

    DEFF Research Database (Denmark)

    Hansen, Finn J. S.; Clausen, Christian

    2001-01-01

    The case study represents an example of a top-down introduction of distance teaching as part of Danish trials with the introduction of multimedia in education. The study is concerned with the background, aim and context of the trial as well as the role and working of the technology and the organisational set-up. It is debated which kind of social learning has taken place. The innovation process was based on the implementation of an inflexible video-conference system without any proactive considerations of organisational change or pedagogical development. User appropriation of the technology...

  16. Distance Learning. Volume I: Distance Learning Analysis Study.

    Science.gov (United States)

    1998-09-01

    The primary focus of this project is the determination of the feasibility and cost effectiveness of applying Distance Learning strategies to 22 selected PPSCP courses and the development of a Distance Learning Analysis Procedures Manual.

  17. A COMPARISON OF EUCLIDEAN DISTANCE AND CANBERRA DISTANCE FOR FACE RECOGNITION

    Directory of Open Access Journals (Sweden)

    Sendhy Rachmat Wurdianarto

    2014-08-01

    Full Text Available Computer science is developing very rapidly. One sign of this is that computer science has entered the field of biometrics. Biometrics refers to human characteristics that can be used to distinguish one person from another. One way a characteristic or organ present in every person can be used for identification (recognition) is through the face. Against this background, this paper explores a Matlab application for face recognition using the Euclidean Distance and Canberra Distance methods. The application was developed using the waterfall model, a sequence of process activities consisting of requirements analysis, design using UML (Unified Modeling Language), and processing of input images with Euclidean Distance and Canberra Distance. The conclusion that can be drawn is that a face recognition application using the Euclidean Distance and Canberra Distance methods has strengths and weaknesses for each method. In future work, the application could be developed further using video or other objects as input. Keywords: Euclidean Distance, Face Recognition, Biometrics, Canberra Distance
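For reference, the two distance measures compared in this record can be computed as follows (a generic sketch, not the authors' Matlab code):

```python
import math

def euclidean(x, y):
    """Ordinary Euclidean (L_2) distance between two feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def canberra(x, y):
    """Canberra distance: sum of |a-b| / (|a|+|b|); a coordinate where both are 0 contributes 0."""
    total = 0.0
    for a, b in zip(x, y):
        denom = abs(a) + abs(b)
        if denom:
            total += abs(a - b) / denom
    return total
```

The Canberra distance normalises each coordinate difference by the coordinate magnitudes, making it far more sensitive to differences near zero than the Euclidean distance, which is one reason the two behave differently on face-feature vectors.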

  18. Minimal distances between SCFTs

    Energy Technology Data Exchange (ETDEWEB)

    Buican, Matthew [Department of Physics and Astronomy, Rutgers University,Piscataway, NJ 08854 (United States)

    2014-01-28

    We study lower bounds on the minimal distance in theory space between four-dimensional superconformal field theories (SCFTs) connected via broad classes of renormalization group (RG) flows preserving various amounts of supersymmetry (SUSY). For N=1 RG flows, the ultraviolet (UV) and infrared (IR) endpoints of the flow can be parametrically close. On the other hand, for RG flows emanating from a maximally supersymmetric SCFT, the distance to the IR theory cannot be arbitrarily small regardless of the amount of (non-trivial) SUSY preserved along the flow. The case of RG flows from N=2 UV SCFTs is more subtle. We argue that for RG flows preserving the full N=2 SUSY, there are various obstructions to finding examples with parametrically close UV and IR endpoints. Under reasonable assumptions, these obstructions include: unitarity, known bounds on the c central charge derived from associativity of the operator product expansion, and the central charge bounds of Hofman and Maldacena. On the other hand, for RG flows that break N=2→N=1, it is possible to find IR fixed points that are parametrically close to the UV ones. In this case, we argue that if the UV SCFT possesses a single stress tensor, then such RG flows excite of order all the degrees of freedom of the UV theory. Furthermore, if the UV theory has some flavor symmetry, we argue that the UV central charges should not be too large relative to certain parameters in the theory.

  19. DNA Bending elasticity

    Science.gov (United States)

    Sivak, David Alexander

    DNA bending elasticity on length scales of tens of basepairs is of critical importance in numerous biological contexts. Even the simplest models of DNA bending admit of few simple analytic results, thus there is a need for numerical methods to calculate experimental observables, such as distance distributions, forces, FRET efficiencies, and timescales of particular large-scale motions. We have implemented and helped develop a coarse-grained representation of DNA and various other covalently-linked groups that allows simple calculation of such observables for varied experimental systems. The simple freely-jointed chain (FJC) model and extremely coarse resolution proved useful in understanding DNA threading through nanopores, identifying steric occlusion by other parts of the chain as a prime culprit for slower capture as distance to the pore decreased. Enhanced sampling techniques of a finer resolution discrete wormlike chain (WLC) model permitted calculation of cyclization rates for small chains and identified the ramifications of a thermodynamically-sound treatment of thermal melts. Adding treatment of double-stranded DNA's helical nature and single-stranded DNA provided a model system that helped demonstrate the importance of statistical fluctuations in even highly-stressed DNA mini-loops, and allowed us to verify that even these constructs show no evidence of excitation-induced softening. Additional incorporation of salt-sensitivity to the model allowed us to calculate forces and FRET efficiencies for such mini-loops and their uncircularized precursors, thereby furthering the understanding of the nature of IHF binding and bending of its recognition sequence. Adding large volume-excluding spheres linked to the ends of the dsDNA permits calculation of distance distributions and thus small-angle X-ray scattering, whereby we demonstrated the validity of the WLC in describing bending fluctuations in DNA chains as short as 42 bp. We also make important connections
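The freely-jointed chain (FJC) model mentioned above predicts a mean-squared end-to-end distance of ⟨R²⟩ = N·b² for N links of length b. A small Monte Carlo sketch of that prediction (illustrative, far coarser than the simulations described in the record):

```python
import math
import random

def fjc_end_to_end(n_links: int, b: float, rng: random.Random) -> float:
    """End-to-end distance of one freely-jointed chain: the sum of n_links
    independent, uniformly oriented segments of length b."""
    x = y = z = 0.0
    for _ in range(n_links):
        # Uniform direction on the sphere: cos(theta) uniform in [-1,1], azimuth uniform.
        cz = rng.uniform(-1.0, 1.0)
        phi = rng.uniform(0.0, 2.0 * math.pi)
        s = math.sqrt(1.0 - cz * cz)
        x += b * s * math.cos(phi)
        y += b * s * math.sin(phi)
        z += b * cz
    return math.sqrt(x * x + y * y + z * z)

rng = random.Random(0)
samples = [fjc_end_to_end(100, 1.0, rng) for _ in range(2000)]
mean_r2 = sum(r * r for r in samples) / len(samples)  # close to N * b**2 = 100
```

The same sampling skeleton extends to the wormlike chain by correlating successive segment directions, which is essentially the discrete WLC refinement the abstract describes.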

  20. The Distance to M51

    CERN Document Server

    McQuinn, Kristen B W; Dolphin, Andrew E; Berg, Danielle; Kennicutt, Robert

    2016-01-01

    Great investments of observing time have been dedicated to the study of nearby spiral galaxies with diverse goals ranging from understanding the star formation process to characterizing their dark matter distributions. Accurate distances are fundamental to interpreting observations of these galaxies, yet many of the best studied nearby galaxies have distances based on methods with relatively large uncertainties. We have started a program to derive accurate distances to these galaxies. Here we measure the distance to M51 - the Whirlpool galaxy - from newly obtained Hubble Space Telescope optical imaging using the tip of the red giant branch method. We measure the distance to be 8.58+/-0.10 Mpc (statistical), corresponding to a distance modulus of 29.67+/-0.02 mag. Our distance is an improvement over previous results as we use a well-calibrated, stable distance indicator, precision photometry in an optimally selected field of view, and a Bayesian Maximum Likelihood technique that reduces measurement unce...
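The quoted distance and distance modulus are related by μ = 5·log10(d / 10 pc); a quick consistency check on the numbers in the record:

```python
import math

def distance_modulus(d_mpc: float) -> float:
    """Distance modulus mu = 5 * log10(d / 10 pc) for a distance given in Mpc."""
    d_pc = d_mpc * 1.0e6
    return 5.0 * math.log10(d_pc / 10.0)

mu = distance_modulus(8.58)  # ~29.67 mag, matching the quoted modulus
```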

  1. Ad Hoc on-Demand Distance Vector (AODV Routing Protocol Performance Evaluation on Hybrid Ad Hoc Network: Comparison of Result of Ns-2 Simulation and Implementation on Testbed using PDA

    Directory of Open Access Journals (Sweden)

    Riri Sari

    2010-10-01

    Full Text Available In a Mobile Ad hoc NETwork (MANET), nodes equipped with wireless interfaces have the capacity to manage and organise themselves autonomously, without the presence of network infrastructure. A hybrid ad hoc network enables several nodes to move freely (mobile) and create instant communication independent of infrastructure, while still being able to access the Local Area Network (LAN) or the Internet. The functionality of an ad hoc network depends heavily on the routing protocol that determines routing between nodes. Ad hoc On-demand Distance Vector (AODV) is one of the routing protocols for ad hoc networks and has a reactive characteristic; it is the most commonly researched and used protocol. In this research, the AODV protocol was investigated by developing a testbed using a personal computer, several laptops (running Linux Red Hat 9.0 and Fedora Core 2), and a Personal Digital Assistant (PDA). This research also produced a complete package, by means of cross-compilation, for the iPAQ PDA. In general, the results obtained from simulating the AODV protocol using Network Simulator NS-2 are a packet delivery ratio of 99.89%, an end-to-end delay of 0.14 seconds and a routing overhead of 1,756.61 bytes per second. The simulation results were then compared to results from the testbed: a packet delivery ratio of 99.57%, an end-to-end delay of 1.004 seconds and a routing overhead of 1,360.36 bytes per second.
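The metrics quoted in this record are simple ratios over the simulation trace; a sketch of how they are typically computed from trace counters (function and argument names are hypothetical, not the NS-2 scripts used by the authors):

```python
def packet_delivery_ratio(received: int, sent: int) -> float:
    """Packet delivery ratio as a percentage of data packets delivered."""
    return 100.0 * received / sent

def routing_overhead(routing_bytes: int, duration_s: float) -> float:
    """Routing overhead in bytes of control traffic per second of simulated time."""
    return routing_bytes / duration_s

pdr = packet_delivery_ratio(9989, 10000)  # 99.89, as in the NS-2 run quoted above
```

End-to-end delay is computed analogously, as the mean of (receive time - send time) over all delivered data packets.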

  2. Improved directional-distance filter

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    This paper proposes a new spatial-distance weighting function. By combining the weighting function and the traditional directional-distance filter (DDF) in a novel way, a new vector filter - the adaptive distance-weighted directional-distance filter (ADWDDF) - is presented. The experimental results show that the proposed solution provides better filtering performance and better preserves image chromaticity and edge or detail information compared with the traditional DDF and some other typical vector filters.

  3. Isolation by distance, web service

    OpenAIRE

    Bohonak Andrew J; Jensen Jeffrey L; Kelley Scott T

    2005-01-01

    Background: The population genetic pattern known as "isolation by distance" results from spatially limited gene flow and is a commonly observed phenomenon in natural populations. However, few software programs exist for estimating the degree of isolation by distance among populations, and they tend not to be user-friendly. Results: We have created the Isolation by Distance Web Service (IBDWS), a user-friendly web interface for determining patterns of isolation by distance. Using this site, ...

  4. Reconsidering Moore's Transactional Distance Theory

    Science.gov (United States)

    Giossos, Yiannis; Koutsouba, Maria; Lionarakis, Antonis; Skavantzos, Kosmas

    2009-01-01

    One of the core theories of distance education is Michael Graham Moore's "Theory of Transactional Distance", which provides the broad framework of the pedagogy of distance education and allows the generation of an almost infinite number of hypotheses for research. However, the review of the existing studies relating to the theory showed the use of a…

  5. Comparative DNA binding abilities and phosphatase-like activities of mono-, di-, and trinuclear Ni(II) complexes: the influence of ligand denticity, metal-metal distance, and coordinating solvent/anion on kinetics studies.

    Science.gov (United States)

    Bhardwaj, Vimal K; Singh, Ajnesh

    2014-10-06

    Six novel Ni(II) complexes, namely, [Ni2(HL(1))(OAc)2] (1), [Ni3L(1)2]·H2O·2CH3CN (2), [Ni2(L(2))(L(3))(CH3CN)] (3), [Ni2(L(2))2(H2O)2] (4), [Ni2(L(2))2(DMF)2]2·2H2O (5), and [Ni(HL(2))2]·H2O (6), were synthesized by reacting nitrophenol-based tripodal (H3L(1)) and dipodal (H2L(2)) Schiff base ligands with Ni(II) metal salts under ambient conditions. All the complexes were fully characterized with different techniques such as elemental analyses, IR and UV-vis spectroscopy, and electrospray ionization mass spectrometry. The solid-state structures of 2, 3, 5, and 6 were determined using single-crystal X-ray crystallography. The compounds 1, 3, 4, and 5 are dinuclear complexes where the two Ni(II) centers have octahedral geometry with bridging phenoxo groups. Compound 2 is a trinuclear complex with two different types of Ni(II) centers. In compound 3 one of the Ni(II) centers has a coordinated acetonitrile molecule, whereas in compound 4, a water molecule has occupied one coordination site of each Ni(II) center. In complex 5, the coordinated water of complex 4 was displaced by dimethylformamide (DMF) during its crystallization. Complex 6 is mononuclear with two amine-bis(phenolate) ligands in a scissorlike fashion around the Ni(II) metal center. The single crystals of 1 and 4 could not be obtained; however, from the spectroscopic data and physicochemical properties (electronic and redox properties) it was assumed that the structures of these complexes are quite similar to the other analogues. DNA binding abilities and phosphatase-like activities of all characterized complexes were also investigated. The ligand denticity, coordinated anions/solvents (such as acetate, acetonitrile, water, and DMF), and cooperative action of two metal centers play a significant role in the phosphate ester bond cleavage of 2-hydroxypropyl-p-nitrophenylphosphate by a transesterification mechanism. Complex 3 exhibits the highest activity among complexes 1-6 with 3.86 × 10(5) times

  6. Keeping Your Distance is Hard

    OpenAIRE

    Burke, Kyle; Heubach, Silvia; Huggan, Melissa; Huntemann, Svenja

    2016-01-01

    We study the computational complexity of distance games, a class of combinatorial games played on graphs. A move consists of colouring an uncoloured vertex subject to it not being at certain distances determined by two sets, D and S: D is the set of forbidden distances for colouring vertices in different colours, while S is the set of forbidden distances for the same colour. The last player to move wins. Well-known examples of distance games are Node-Kayles, Snort, and Col, whose complexities ...

  7. Distance learning and perioperative nursing.

    Science.gov (United States)

    Gruendemann, Barbara J

    2007-03-01

    Distance learning in nursing education is arriving with unprecedented speed, which has led to much uncertainty among educators. This article provides an overview of distance learning and its application to perioperative nursing. Lack of face-to-face interaction is of foremost concern in distance learning, and educators must develop new teaching strategies to address this problem. Models for assessing outcomes and effectiveness are important tools to use when implementing a distance learning program. Basic perioperative nursing concepts, skills, procedures, and recommended practices can be introduced effectively with online distance learning modalities and then reinforced through a clinical component.

  8. New distances to RAVE stars

    CERN Document Server

    Binney, James; Kordopatis, Georges; McMillan, Paul J; Sharma, Sanjib; Zwitter, Tomaz; Bienayme, Olivier; Bland-Hawthorn, Joss; Steinmetz, Matthias; Gilmore, Gerry; Williams, Mary E K; Navarro, Julio; Grebel, Eva K; Helmi, Amina; Parker, Quentin; Reid, Warren A; Seabroke, George; Watson, Fred; Wyse, Rosie F G

    2013-01-01

    Probability density functions are determined from new stellar parameters for the distance moduli of stars for which the RAdial Velocity Experiment (RAVE) has obtained spectra with S/N>=10. The expectation value of distance is larger than the distance implied by the expectation of distance modulus; the latter is itself larger than the distance implied by the expectation value of the parallax. Our parallaxes of Hipparcos stars agree well with the values measured by Hipparcos, so the expectation of parallax is the most reliable distance indicator. These estimates are improved by taking extinction into account. We provide one-, two-, or three-Gaussian fits to the distance pdfs. The effective temperature absolute-magnitude diagram of our stars is significantly improved when these pdfs are used to make the diagram. We use the method of kinematic corrections devised by Schoenrich, Binney & Asplund to check for systematic errors in our estimators for ordinary stars and confirm the conclusion reached from the Hipparcos s...

  9. Forced unraveling of chromatin fibers with nonuniform linker DNA lengths

    Science.gov (United States)

    Ozer, Gungor; Collepardo-Guevara, Rosana; Schlick, Tamar

    2015-02-01

    The chromatin fiber undergoes significant structural changes during the cell's life cycle to modulate DNA accessibility. Detailed mechanisms of such structural transformations of chromatin fibers, as affected by various internal and external conditions such as the ionic conditions of the medium, the linker DNA length, and the presence of linker histones, constitute an open challenge. Here we utilize Monte Carlo (MC) simulations of a coarse-grained model of chromatin with nonuniform linker DNA lengths as found in vivo to help explain some aspects of this challenge. We investigate the unfolding mechanisms of chromatin fibers with alternating linker lengths of 26-62 bp and 44-79 bp using a series of end-to-end stretching trajectories with and without linker histones and compare results to uniform-linker-length fibers. We find that linker histones increase the overall resistance of nonuniform fibers and lead to fiber unfolding with superbeads-on-a-string cluster transitions. Chromatin fibers with nonuniform linker DNA lengths display a more complex, multi-step yet smoother process of unfolding compared to their uniform counterparts, likely due to the existence of a more continuous range of nucleosome-nucleosome interactions. This finding echoes the theme that some heterogeneity in fiber components is biologically advantageous.

  10. Planning with Reachable Distances

    KAUST Repository

    Tang, Xinyu

    2009-01-01

    Motion planning for spatially constrained robots is difficult due to additional constraints placed on the robot, such as closure constraints for closed chains or requirements on end effector placement for articulated linkages. It is usually computationally too expensive to apply sampling-based planners to these problems since it is difficult to generate valid configurations. We overcome this challenge by redefining the robot's degrees of freedom and constraints into a new set of parameters, called reachable distance space (RD-space), in which all configurations lie in the set of constraint-satisfying subspaces. This enables us to directly sample the constrained subspaces with complexity linear in the robot's number of degrees of freedom. In addition to supporting efficient sampling, we show that the RD-space formulation naturally supports planning, and in particular, we design a local planner suitable for use by sampling-based planners. We demonstrate the effectiveness and efficiency of our approach for several systems including closed chain planning with multiple loops, restricted end effector sampling, and on-line planning for drawing/sculpting. We can sample single-loop closed chain systems with 1000 links in time comparable to open chain sampling, and we can generate samples for 1000-link multi-loop systems of varying topology in less than a second. © 2009 Springer-Verlag.

  11. Distance learning for similarity estimation.

    Science.gov (United States)

    Yu, Jie; Amores, Jaume; Sebe, Nicu; Radeva, Petia; Tian, Qi

    2008-03-01

    In this paper, we present a general guideline to find a better distance measure for similarity estimation based on statistical analysis of distribution models and distance functions. A new set of distance measures is derived from the harmonic distance, the geometric distance, and their generalized variants according to the Maximum Likelihood theory. These measures can provide a more accurate feature model than the classical Euclidean and Manhattan distances. We also find that the feature elements are often from heterogeneous sources that may have different influence on similarity estimation. Therefore, the assumption of a single isotropic distribution model is often inappropriate. To alleviate this problem, we use a boosted distance measure framework that finds multiple distance measures which fit the distribution of selected feature elements best for accurate similarity estimation. The new distance measures for similarity estimation are tested on two applications: stereo matching and motion tracking in video sequences. The performance of the boosted distance measure is further evaluated on several benchmark data sets from the UCI repository and two image retrieval applications. In all the experiments, robust results are obtained based on the proposed methods.
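
    The abstract does not reproduce the paper's derivations, but the flavour of comparing distance functions can be sketched. Below, `minkowski` covers the classical Euclidean (p=2) and Manhattan (p=1) distances, while `harmonic_aggregate` and `geometric_aggregate` are illustrative stand-ins for harmonic- and geometric-style measures, here read as harmonic and geometric means of per-coordinate deviations (this reading, the function names, and the `eps` smoothing are our assumptions, not the paper's definitions):

```python
import math

def minkowski(x, y, p):
    """Classical Minkowski distance: p=1 is Manhattan, p=2 is Euclidean."""
    return sum(abs(a - b) ** p for a, b in zip(x, y)) ** (1.0 / p)

def harmonic_aggregate(x, y, eps=1e-9):
    """Illustrative 'harmonic' measure: harmonic mean of per-coordinate deviations.
    eps avoids division by zero on identical coordinates (an assumption of ours)."""
    diffs = [abs(a - b) + eps for a, b in zip(x, y)]
    return len(diffs) / sum(1.0 / d for d in diffs)

def geometric_aggregate(x, y, eps=1e-9):
    """Illustrative 'geometric' measure: geometric mean of per-coordinate deviations."""
    diffs = [abs(a - b) + eps for a, b in zip(x, y)]
    return math.exp(sum(math.log(d) for d in diffs) / len(diffs))
```

    The harmonic mean is dominated by the smallest deviations and the geometric mean by their product, so the two aggregates weight outlier coordinates very differently from the Euclidean norm.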

  12. Partners in Long Distance Interactions

    NARCIS (Netherlands)

    S. Krpic (Sanja)

    2009-01-01

    textabstractThe genome of higher eukaryotes consists of DNA, which in case of the human genome measures 2m in length and is divided over 46 chromosomes. These long DNA molecules are packed in a nucleus that measures about 10μm in diameter. In order to fit the complete DNA into such a small volume,

  13. On distances between phylogenetic trees

    Energy Technology Data Exchange (ETDEWEB)

    DasGupta, B. [Rutgers Univ., Camden, NJ (United States)]; He, X. [SUNY, Buffalo, NY (United States)]; Jiang, T. [McMaster Univ., Hamilton, Ontario (Canada)]; and others

    1997-06-01

    Different phylogenetic trees for the same group of species are often produced either by procedures that use diverse optimality criteria or from different genes in the study of molecular evolution. Comparing these trees to find their similarities and dissimilarities, i.e. distance, is thus an important issue in computational molecular biology. The nearest neighbor interchange (nni) distance and the subtree-transfer distance are two major distance metrics that have been proposed and extensively studied for different reasons. Despite their many appealing aspects such as simplicity and sensitivity to tree topologies, computing these distances has remained very challenging. This article studies the complexity and efficient approximation algorithms for computing the nni distance and a natural extension of the subtree-transfer distance, called the linear-cost subtree-transfer distance. The linear-cost subtree-transfer model is more logical than the subtree-transfer model and in fact coincides with the nni model under certain conditions. The following results have been obtained as part of our project of building a comprehensive software package for computing distances between phylogenies. (1) Computing the nni distance is NP-complete. This solves a 25-year-old open question that has appeared again and again in the literature, under the complexity-theoretic assumption of P ≠ NP. We also answer an open question regarding the nni distance between unlabeled trees, for which an erroneous proof has appeared in the literature. We give an algorithm to compute the optimal nni sequence in time O(n² log n + n · 2^(O(d))), where the nni distance is at most d. (2) Biological applications require us to extend the nni and linear-cost subtree-transfer models to weighted phylogenies, where edge weights indicate the length of evolution along each edge. We present a logarithmic-ratio approximation algorithm for nni and a ratio-2 approximation algorithm for linear-cost subtree-transfer, on weighted trees.

  14. DISTANCE EDUCATOR: A Multiskill Personality

    Directory of Open Access Journals (Sweden)

    Sangeeta MALIK

    2013-01-01

    Full Text Available When we talk about a distance educator and a conventional educator, the difference we find between them is that a distance educator needs to play multiple roles compared to a conventional educator. Distance educators require more skills and knowledge to cater to the needs of the learner. In this article we cover all the areas of responsibility of a distance educator and explain why we should consider them a multiskill personality.

  15. Atomic force microscopy reveals two phases in single stranded DNA self-assembled monolayers

    Science.gov (United States)

    Kosaka, Priscila M.; González, Sheila; Domínguez, Carmen M.; Cebollada, Alfonso; San Paulo, Alvaro; Calleja, Montserrat; Tamayo, Javier

    2013-07-01

    We have investigated the structure of single-stranded (ss) DNA self-assembled monolayers (SAMs) on gold by combining peak force tapping, Kelvin probe and phase contrast atomic force microscopy (AFM) techniques. The adhesion, surface potential and phase shift signals show heterogeneities in the DNA film structure at two levels: microscale and nanoscale; which cannot be clearly discerned in the topography. Firstly, there is multilayer aggregation covering less than 5% of the surface. The DNA multilayers seem to be ordered phases and their existence suggests that DNA end-to-end interaction can play a role in the self-assembly process. Secondly, we find the formation of two phases in the DNA monolayer, which differ both in surface energy and surface potential. We relate the two domains to differences in the packing density and in the ssDNA conformation. The discovered heterogeneities in ssDNA SAMs provide a new scenario in our vision of these relevant films that have direct consequences on their biological, chemical and physical properties.

  16. Doctoral education from a distance.

    Science.gov (United States)

    Effken, Judith A

    2008-12-01

    This article describes the environmental factors that have contributed to the recent rapid growth of nursing doctoral education at a distance. Early and recent efforts to deliver distance doctoral education are discussed, using The University of Arizona College of Nursing experience as the key exemplar. The Community of Inquiry model is introduced as an appropriate model for doctoral education and then used as a framework to evaluate the current state of the art in distance doctoral nursing education. Successes and challenges in delivering doctoral education from a distance are described.

  17. The Extended Edit Distance Metric

    CERN Document Server

    Fuad, Muhammad Marwan Muhammad

    2007-01-01

    Similarity search is an important problem in information retrieval. This similarity is based on a distance measure. Symbolic representation of time series has attracted many researchers recently, since it reduces the dimensionality of these high-dimensional data objects. We propose a new distance metric that is applied to symbolic data objects, and we test it on time series databases in a classification task. We compare it to other distances that are well known in the literature for symbolic data objects. We also prove, mathematically, that our distance is a metric.
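
    For reference, the classical edit (Levenshtein) distance that such extended metrics build on is computed by dynamic programming over prefixes; the sketch below is the standard algorithm, not the authors' extension:

```python
def edit_distance(s, t):
    """Levenshtein distance between two symbol sequences: the minimum number of
    insertions, deletions, and substitutions turning s into t. O(len(s)*len(t))
    time, O(len(t)) space via two rolling rows of the DP table."""
    m, n = len(s), len(t)
    prev = list(range(n + 1))  # distances from the empty prefix of s to every prefix of t
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cur[j] = min(prev[j] + 1,                           # delete s[i-1]
                         cur[j - 1] + 1,                        # insert t[j-1]
                         prev[j - 1] + (s[i - 1] != t[j - 1]))  # substitute (free on match)
        prev = cur
    return prev[n]
```

    It works on any sequences of comparable symbols, e.g. the discretized alphabet strings produced by a symbolic time-series representation.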

  18. Transactional Distance and Autonomy in a Distance Learning Environment

    Science.gov (United States)

    Vasiloudis, G.; Koutsouba, M.; Giossos, Y.; Mavroidis, I.

    2015-01-01

    This paper studies the transactional distance between the students and the tutor as well as the autonomy of students in a postgraduate course of the Hellenic Open University (HOU). The aim of the paper is to examine how the relation between autonomy and transactional distance evolves during an academic year and how this relation is affected by…

  19. Distance spectra and Distance energy of Integral Circulant Graphs

    CERN Document Server

    Ilić, Aleksandar

    2011-01-01

    The distance energy of a graph $G$ is a recently developed energy-type invariant, defined as the sum of absolute values of the eigenvalues of the distance matrix of $G$. There has been extensive research on pairs and families of non-cospectral graphs having equal distance energy, and most of these constructions were based on the join of graphs. A graph is called circulant if it is a Cayley graph on a circulant group, i.e. its adjacency matrix is circulant. A graph is called integral if all eigenvalues of its adjacency matrix are integers. Integral circulant graphs play an important role in modeling quantum spin networks supporting perfect state transfer. In this paper, we characterize the distance spectra of integral circulant graphs and prove that these graphs have integral eigenvalues of the distance matrix $D$. Furthermore, we calculate the distance spectra and distance energy of unitary Cayley graphs. In conclusion, we present two families of pairs $(G_1, G_2)$ of integral circulant graphs with equal distanc...
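
    As a concrete illustration of the invariant (not of the paper's circulant-graph results), the sketch below computes a distance energy from scratch: Floyd-Warshall builds the distance matrix, and a plain Jacobi rotation routine (a self-contained stand-in for a linear-algebra library) supplies the eigenvalues. For the complete graph K_n the distance matrix is J - I, with spectrum {n-1, -1, ..., -1} and hence distance energy 2(n-1).

```python
import math

def distance_matrix(n, edges):
    """All-pairs hop distances of an unweighted graph via Floyd-Warshall."""
    INF = float("inf")
    d = [[0.0 if i == j else INF for j in range(n)] for i in range(n)]
    for u, v in edges:
        d[u][v] = d[v][u] = 1.0
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

def symmetric_eigenvalues(A, tol=1e-12, max_iter=1000):
    """Eigenvalues of a symmetric matrix by classical Jacobi rotations."""
    A = [row[:] for row in A]
    n = len(A)
    for _ in range(max_iter):
        # locate the largest off-diagonal entry
        p, q, big = 0, 1, 0.0
        for i in range(n):
            for j in range(i + 1, n):
                if abs(A[i][j]) > big:
                    big, p, q = abs(A[i][j]), i, j
        if big < tol:
            break
        # rotation angle chosen to annihilate A[p][q]
        theta = (A[q][q] - A[p][p]) / (2.0 * A[p][q])
        t = math.copysign(1.0, theta) / (abs(theta) + math.sqrt(theta * theta + 1.0))
        c = 1.0 / math.sqrt(t * t + 1.0)
        s = t * c
        for k in range(n):  # update columns p and q
            akp, akq = A[k][p], A[k][q]
            A[k][p], A[k][q] = c * akp - s * akq, s * akp + c * akq
        for k in range(n):  # update rows p and q
            apk, aqk = A[p][k], A[q][k]
            A[p][k], A[q][k] = c * apk - s * aqk, s * apk + c * aqk
        A[p][q] = A[q][p] = 0.0
    return sorted(A[i][i] for i in range(n))

def distance_energy(n, edges):
    """Sum of absolute eigenvalues of the graph's distance matrix."""
    return sum(abs(lam) for lam in symmetric_eigenvalues(distance_matrix(n, edges)))
```

    In practice one would call an established eigenvalue routine; the hand-rolled Jacobi loop is only there to keep the example dependency-free.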

  20. Virtual Bioinformatics Distance Learning Suite

    Science.gov (United States)

    Tolvanen, Martti; Vihinen, Mauno

    2004-01-01

    Distance learning as a computer-aided concept allows students to take courses from anywhere at any time. In bioinformatics, computers are needed to collect, store, process, and analyze massive amounts of biological and biomedical data. We have applied the concept of distance learning in virtual bioinformatics to provide university course material…

  1. The Psychology of Psychic Distance

    DEFF Research Database (Denmark)

    Håkanson, Lars; Ambos, Björn; Schuster, Anja

    2016-01-01

    and their theoretical underpinnings assume psychic distances to be symmetric. Building on insights from psychology and sociology, this paper demonstrates how national factors and cognitive processes interact in the formation of asymmetric distance perceptions. The results suggest that exposure to other countries...

  2. The Distance to M104

    Science.gov (United States)

    McQuinn, Kristen. B. W.; Skillman, Evan D.; Dolphin, Andrew E.; Berg, Danielle; Kennicutt, Robert

    2016-11-01

    M104 (NGC 4594; the Sombrero galaxy) is a nearby, well-studied elliptical galaxy included in scores of surveys focused on understanding the details of galaxy evolution. Despite the importance of observations of M104, a consensus distance has not yet been established. Here, we use newly obtained Hubble Space Telescope optical imaging to measure the distance to M104 based on the tip of the red giant branch (TRGB) method. Our measurement yields a distance to M104 of 9.55 ± 0.13 ± 0.31 Mpc, equivalent to a distance modulus of 29.90 ± 0.03 ± 0.07 mag. Our distance is an improvement over previous results as we use a well-calibrated, stable distance indicator, precision photometry in an optimally selected field of view, and a Bayesian maximum likelihood technique that reduces measurement uncertainties. The most discrepant previous results are due to Tully-Fisher method distances, which are likely inappropriate for M104 given its peculiar morphology and structure. Our results are part of a larger program to measure accurate distances to a sample of well-known spiral galaxies (including M51, M74, and M63) using the TRGB method. Based on observations made with the NASA/ESA Hubble Space Telescope, obtained from the Data Archive at the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS 5-26555.
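
    The quoted pairing of distance and distance modulus follows from the standard relation mu = 5 log10(d / 10 pc); a quick sketch of the conversion in both directions (helper names are ours):

```python
import math

def distance_modulus(d_mpc):
    """mu = 5 * log10(d / 10 pc), with the distance d given in megaparsecs."""
    return 5.0 * math.log10(d_mpc * 1.0e6) - 5.0

def distance_from_modulus(mu):
    """Invert the relation; returns the distance in megaparsecs."""
    return 10.0 ** ((mu + 5.0) / 5.0) / 1.0e6
```

    Plugging in the TRGB distance above, `distance_modulus(9.55)` evaluates to about 29.90 mag, matching the quoted modulus.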

  3. The Distance to M51

    Science.gov (United States)

    McQuinn, Kristen. B. W.; Skillman, Evan D.; Dolphin, Andrew E.; Berg, Danielle; Kennicutt, Robert

    2016-07-01

    Great investments of observing time have been dedicated to the study of nearby spiral galaxies with diverse goals ranging from understanding the star formation process to characterizing their dark matter distributions. Accurate distances are fundamental to interpreting observations of these galaxies, yet many of the best studied nearby galaxies have distances based on methods with relatively large uncertainties. We have started a program to derive accurate distances to these galaxies. Here we measure the distance to M51—the Whirlpool galaxy—from newly obtained Hubble Space Telescope optical imaging using the tip of the red giant branch method. We measure the distance to be 8.58 ± 0.10 Mpc (statistical), corresponding to a distance modulus of 29.67 ± 0.02 mag. Our distance is an improvement over previous results as we use a well-calibrated, stable distance indicator, precision photometry in an optimally selected field of view, and a Bayesian Maximum Likelihood technique that reduces measurement uncertainties. Based on observations made with the NASA/ESA Hubble Space Telescope, obtained from the Data Archive at the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS 5-26555.

  4. Quality Content in Distance Education

    Science.gov (United States)

    Yildiz, Ezgi Pelin; Isman, Aytekin

    2016-01-01

    In parallel with technological advances in today's world of education activities can be conducted without the constraints of time and space. One of the most important of these activities is distance education. The success of the distance education is possible with content quality. The proliferation of e-learning environment has brought a need for…

  5. Virtual Bioinformatics Distance Learning Suite

    Science.gov (United States)

    Tolvanen, Martti; Vihinen, Mauno

    2004-01-01

    Distance learning as a computer-aided concept allows students to take courses from anywhere at any time. In bioinformatics, computers are needed to collect, store, process, and analyze massive amounts of biological and biomedical data. We have applied the concept of distance learning in virtual bioinformatics to provide university course material…

  6. Distance labeling schemes for trees

    DEFF Research Database (Denmark)

    Alstrup, Stephen; Gørtz, Inge Li; Bistrup Halvorsen, Esben;

    2016-01-01

    We consider distance labeling schemes for trees: given a tree with n nodes, label the nodes with binary strings such that, given the labels of any two nodes, one can determine, by looking only at the labels, the distance in the tree between the two nodes. A lower bound by Gavoille et al. [Gavoill...

  7. New distances to RAVE stars

    NARCIS (Netherlands)

    Binney, J.; Burnett, B.; Kordopatis, G.; McMillan, P. J.; Sharma, S.; Zwitter, T.; Bienayme, O.; Bland-Hawthorn, J.; Steinmetz, M.; Gilmore, G.; Williams, M. E. K.; Navarro, J.; Grebel, E. K.; Helmi, A.; Parker, Q.; Reid, W. A.; Seabroke, G.; Watson, F.; Wyse, R. F. G.

    Probability density functions (pdfs) are determined from new stellar parameters for the distance moduli of stars for which the RAdial Velocity Experiment (RAVE) has obtained spectra with S/N >= 10. Single-Gaussian fits to the pdf in distance modulus suffice for roughly half the stars, with most of

  8. Distance labeling schemes for trees

    DEFF Research Database (Denmark)

    Alstrup, Stephen; Gørtz, Inge Li; Bistrup Halvorsen, Esben

    2016-01-01

    variants such as, for example, small distances in trees [Alstrup et al., SODA, 2003]. We improve the known upper and lower bounds of exact distance labeling by showing that 1/4 log2(n) bits are needed and that 1/2 log2(n) bits are sufficient. We also give (1 + ε)-stretch labeling schemes using Theta...

  9. Faculty Attitudes about Distance Education

    Science.gov (United States)

    Smidt, Esther; McDyre, Brian; Bunk, Jennifer; Li, Rui; Gatenby, Tanya

    2014-01-01

    In recent years, there has been a dramatic increase in distance learning in higher education. Given this, it is extremely important to understand faculty attitudes about distance education, not only because they can vary widely, but also because it is the faculty, through their design and implementation of online courses, that will shape the…

  10. Distance-constrained grid colouring

    Directory of Open Access Journals (Sweden)

    Aszalós László

    2016-06-01

    Full Text Available Distance-constrained colouring is a mathematical model of the frequency assignment problem. This colouring can be treated as an optimization problem, so we can use the toolbox of optimization to solve concrete problems. In this paper, we show the performance of distance-constrained grid colouring for two methods that perform well in map colouring.
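
    The abstract does not specify its two methods, but a minimal greedy baseline for distance-constrained grid colouring is easy to sketch: scan the grid in row-major order and give each cell the smallest colour not already used within Manhattan distance d (the function name and the "closer than d" constraint are our illustrative choices):

```python
def grid_colouring(rows, cols, d):
    """Greedy distance-constrained colouring of a rows x cols grid:
    cells at Manhattan distance < d never share a colour (frequency-assignment
    style constraint). Returns a dict (row, col) -> colour index."""
    colour = {}
    for r in range(rows):
        for c in range(cols):
            # colours already taken by conflicting (too-close) cells
            used = {colour[cell] for cell in colour
                    if abs(cell[0] - r) + abs(cell[1] - c) < d}
            k = 0
            while k in used:  # first-fit: smallest free colour
                k += 1
            colour[(r, c)] = k
    return colour
```

    For d = 2 this reduces to ordinary proper colouring of the grid graph, and first-fit recovers the two-colour checkerboard.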

  11. Hierarchical traits distances explain grassland Fabaceae species’ ecological niches distances

    Directory of Open Access Journals (Sweden)

    Florian eFort

    2015-02-01

    Full Text Available Fabaceae species play a key role in ecosystem functioning through their capacity to fix atmospheric nitrogen via their symbiosis with Rhizobium bacteria. To increase benefits of using Fabaceae in agricultural systems, it is necessary to find ways to evaluate species or genotypes having potential adaptations to sub-optimal growth conditions. We evaluated the relevance of phylogenetic distance, absolute trait distance and hierarchical trait distance for comparing the adaptation of 13 grassland Fabaceae species to different habitats, i.e. ecological niches. We measured a wide range of functional traits (root traits, leaf traits and whole plant traits in these species. Species phylogenetic and ecological distances were assessed from a species-level phylogenetic tree and species’ ecological indicator values, respectively. We demonstrated that differences in ecological niches between grassland Fabaceae species were related more to their hierarchical trait distances than to their phylogenetic distances. We showed that grassland Fabaceae functional traits tend to converge among species with the same ecological requirements. Species with acquisitive root strategies (thin roots, shallow root systems are competitive species adapted to non-stressful meadows, while conservative ones (coarse roots, deep root systems are able to tolerate stressful continental climates. In contrast, acquisitive species appeared to be able to tolerate low soil-P availability, while conservative ones need high P availability. Finally we highlight that traits converge along the ecological gradient, providing the assumption that species with similar root-trait values are better able to coexist, regardless of their phylogenetic distance.

  12. Distance labeling schemes for trees

    DEFF Research Database (Denmark)

    Alstrup, Stephen; Gørtz, Inge Li; Bistrup Halvorsen, Esben

    2016-01-01

    We consider distance labeling schemes for trees: given a tree with n nodes, label the nodes with binary strings such that, given the labels of any two nodes, one can determine, by looking only at the labels, the distance in the tree between the two nodes. A lower bound by Gavoille et al. [Gavoille...... variants such as, for example, small distances in trees [Alstrup et al., SODA, 2003]. We improve the known upper and lower bounds of exact distance labeling by showing that 1/4 log2(n) bits are needed and that 1/2 log2(n) bits are sufficient. We also give (1 + ε)-stretch labeling schemes using Theta......(log(n)) bits for constant ε> 0. (1 + ε)-stretch labeling schemes with polylogarithmic label size have previously been established for doubling dimension graphs by Talwar [Talwar, STOC, 2004]. In addition, we present matching upper and lower bounds for distance labeling for caterpillars, showing that labels...
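
    A naive scheme makes the labeling contract concrete, although its labels cost far more than the log-squared bounds discussed above: label every node with its root-to-node path, and recover the distance from two labels alone via their longest common prefix (the lowest common ancestor). All names below are ours.

```python
def make_labels(tree, root=0):
    """Assign each node of a tree (adjacency dict) its root-to-node path as a label.
    Deliberately naive: labels hold O(depth) node ids, but distance queries
    need nothing except the two labels."""
    labels = {}
    stack = [(root, [root])]
    seen = {root}
    while stack:
        u, path = stack.pop()
        labels[u] = path
        for v in tree[u]:
            if v not in seen:
                seen.add(v)
                stack.append((v, path + [v]))
    return labels

def label_distance(label_u, label_v):
    """Tree distance from labels only: hops from each node up to the LCA,
    found as the end of the longest common prefix of the two root paths."""
    k = 0
    while k < min(len(label_u), len(label_v)) and label_u[k] == label_v[k]:
        k += 1
    return (len(label_u) - k) + (len(label_v) - k)
```

    The schemes in the paper achieve the same query interface with labels of only Theta(log^2 n) bits; this sketch only demonstrates what "determine the distance by looking only at the labels" means.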

  13. Molecular evidence for long-distance colonization in an Indo-Pacific seahorse lineage

    Digital Repository Service at National Institute of Oceanography (India)

    Teske, P.R.; Hamilton, H.; Palsboll, P.J.; Choo, C.K.; Gabr, H.; Lourie, S.A.; Santos, M.; Sreepada, R.A.; Cherry, M.I.; Matthee, C.A.

    of relative population ages, tests for evidence of population expansion, pair-wise migration rates and divergence times, as well as relationships between genetic and geographic distances. The mtDNA data indicate that all populations have undergone recent...

  14. The Distance Field Model and Distance Constrained MAP Adaptation Algorithm

    Institute of Scientific and Technical Information of China (English)

    YUPeng; WANGZuoying

    2003-01-01

    Spatial structure information, i.e., the relative position information of phonetic states in the feature space, has yet to be researched carefully. In this paper, a new model named "Distance Field" is proposed to describe the spatial structure information. Based on this model, a modified MAP adaptation algorithm named distance constrained maximum a posteriori (DCMAP) is introduced. The distance field model gives a large penalty when the spatial structure is destroyed. As a result, DCMAP preserves the spatial structure information in the adaptation process. Experiments show the Distance Field Model improves the performance of MAP adaptation. Further results show DCMAP has strong cross-state estimation ability, which is used to train a well-performed speaker-dependent model by data from only part of pho-

  15. The Distance to M104

    CERN Document Server

    McQuinn, Kristen B W; Dolphin, Andrew E; Berg, Danielle; Kennicutt, Robert

    2016-01-01

    M104 (NGC 4594; the Sombrero galaxy) is a nearby, well-studied elliptical galaxy included in scores of surveys focused on understanding the details of galaxy evolution. Despite the importance of observations of M104, a consensus distance has not yet been established. Here, we use newly obtained Hubble Space Telescope optical imaging to measure the distance to M104 based on the tip of the red giant branch method. Our measurement yields the distance to M104 to be 9.55 +/- 0.13 +/- 0.31 Mpc equivalent to a distance modulus of 29.90 +/- 0.03 +/- 0.07 mag. Our distance is an improvement over previous results as we use a well-calibrated, stable distance indicator, precision photometry in a optimally selected field of view, and a Bayesian Maximum Likelihood technique that reduces measurement uncertainties. The most discrepant previous results are due to Tully-Fisher method distances, which are likely inappropriate for M104 given its peculiar morphology and structure. Our results are part of a larger program to measu...

  16. Reducing the distance in distance-caregiving by technology innovation

    Directory of Open Access Journals (Sweden)

    Lazelle E Benefield

    2007-07-01

    Full Text Available Lazelle E Benefield(1), Cornelia Beck(2); (1)College of Nursing, University of Oklahoma Health Sciences Center, Oklahoma City, Oklahoma, USA; (2)Pat & Willard Walker Family Memory Research Center, University of Arkansas for Medical Sciences, Little Rock, Arkansas, USA. Abstract: Family caregivers are responsible for the home care of over 34 million older adults in the United States. For many, the elder family member lives more than an hour’s distance away. Distance caregiving is a growing alternative to more familiar models where: (1) the elder and the family caregiver(s) may reside in the same household; or (2) the family caregiver may live nearby but not in the same household as the elder. The distance caregiving model involves elders and their family caregivers who live at some distance, defined as more than a 60-minute commute, from one another. Evidence suggests that distance caregiving is a distinct phenomenon, differs substantially from on-site family caregiving, and requires additional assistance to support the physical, social, and contextual dimensions of the caregiving process. Technology-based assists could virtually connect the caregiver and elder and provide strong support that addresses the elder’s physical, social, cognitive, and/or sensory impairments. Therefore, in today’s era of high technology, it is surprising that so few affordable innovations are being marketed for distance caregiving. This article addresses distance caregiving, proposes the use of technology innovation to support caregiving, and suggests a research agenda to better inform policy decisions related to the unique needs of this situation. Keywords: caregiving, family, distance, technology, elders

  17. Quantitive DNA Fiber Mapping

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Chun-Mei; Wang, Mei; Greulich-Bode, Karin M.; Weier, Jingly F.; Weier, Heinz-Ulli G.

    2008-01-28

    Several hybridization-based methods used to delineate single copy or repeated DNA sequences in larger genomic intervals take advantage of the increased resolution and sensitivity of free chromatin, i.e., chromatin released from interphase cell nuclei. Quantitative DNA fiber mapping (QDFM) differs from the majority of these methods in that it applies FISH to purified, clonal DNA molecules which have been bound with at least one end to a solid substrate. The DNA molecules are then stretched by the action of a receding meniscus at the water-air interface resulting in DNA molecules stretched homogeneously to about 2.3 kb/{micro}m. When non-isotopically, multicolor-labeled probes are hybridized to these stretched DNA fibers, their respective binding sites are visualized in the fluorescence microscope, their relative distance can be measured and converted into kilobase pairs (kb). The QDFM technique has found useful applications ranging from the detection and delineation of deletions or overlap between linked clones to the construction of high-resolution physical maps to studies of stalled DNA replication and transcription.
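
    Given the homogeneous stretch of about 2.3 kb/µm reported above, converting a measured inter-probe distance on a fiber to a genomic interval is a one-line calculation; a sketch with assumed helper names:

```python
def fiber_distance_kb(measured_um, stretch_kb_per_um=2.3):
    """Convert an inter-probe distance measured on a stretched DNA fiber (µm)
    to kilobase pairs, assuming the ~2.3 kb/µm homogeneous stretch of QDFM."""
    return measured_um * stretch_kb_per_um

def fiber_distance_um(length_kb, stretch_kb_per_um=2.3):
    """Expected separation on the fiber (µm) for a genomic interval given in kb."""
    return length_kb / stretch_kb_per_um
```

    So two probe signals 10 µm apart correspond to roughly a 23 kb genomic separation under this stretching factor.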

  18. FROM UAS DATA ACQUISITION TO ACTIONABLE INFORMATION – HOW AN END-TO-END SOLUTION HELPS OIL PALM PLANTATION OPERATORS TO PERFORM A MORE SUSTAINABLE PLANTATION MANAGEMENT

    Directory of Open Access Journals (Sweden)

    C. Hoffmann

    2016-06-01

    The research results describe how operators can successfully make use of a UAS-based solution together with the developed software solution to improve their efficiency in oil palm plantation management.

  19. From Ambient Sensing to IoT-based Context Computing: An Open Framework for End to End QoC Management

    Directory of Open Access Journals (Sweden)

    Pierrick Marie

    2015-06-01

    Full Text Available Quality of Context (QoC) awareness is recognized as a key point for the success of context-aware computing. At a time when the combination of the Internet of Things, Cloud Computing, and Ambient Intelligence paradigms offers new opportunities for managing richer context data, the next generation of Distributed Context Managers (DCM) is facing new challenges concerning QoC management. This paper presents our model-driven QoCIM framework. QoCIM is the acronym for Quality of Context Information Model. We show how it can help application developers to manage the whole QoC life-cycle by providing genericity, openness and uniformity. Its usages are illustrated, both at design time and at runtime, in the case of an urban pollution context- and QoC-aware scenario.

  20. Perspectives With The GCT End-to-end Prototype Of The Small-Sized Telescope Proposed For The Cherenkov Telescope Array

    CERN Document Server

    Costantini, H; Ernenwein, J -P; Laporte, Ph; Sol, H

    2016-01-01

    In the framework of the Cherenkov Telescope Array (CTA), the GCT (Gamma-ray Cherenkov Telescope) team is building a dual-mirror telescope as one of the proposed prototypes for the CTA small size class of telescopes. The telescope is based on a Schwarzschild-Couder (SC) optical design, an innovative solution for ground-based Cherenkov astronomy, which allows a compact telescope structure, a lightweight large Field of View (FoV) camera and enables good angular resolution across the entire FoV. We review the different mechanical and optical components of the telescope. In order to characterise them, the Paris prototype will be operated during several weeks in 2016. In this framework, an estimate of the expected performance of this prototype has been made, based on Monte Carlo simulations. In particular the observability of the Crab Nebula in the context of high Night Sky Background (NSB) is presented.

  1. An integrated end-to-end modeling framework for testing ecosystem-wide effects of human-induced pressures in the Baltic Sea

    DEFF Research Database (Denmark)

    Palacz, Artur; Nielsen, J. Rasmus; Christensen, Asbjørn

    to the high-resolution coupled physical-biological model HBM-ERGOM and the fisheries bio-economic FishRent model. We investigate ecosystem-wide responses to changes in human-induced pressures by simulating several eutrophication scenarios that are relevant to existing Baltic Sea management plans (e.g. EU BSAP......, EU CFP). We further present the structure and calibration of the Baltic ATLANTIS model and the operational linkage to the other models. Using the results of eutrophication scenarios, and focusing on the relative changes in fish and fishery production, we discuss the robustness of the model linking...

  2. An Integrated End-to-End Modeling Framework for Testing Ecosystem-Wide Effects of Human-Induced Pressures in the Baltic Sea

    DEFF Research Database (Denmark)

    Palacz, Artur; Maar, Marie; Nielsen, Rasmus

    to the high-resolution coupled physical-biological model HBM-ERGOM and the fisheries bio-economic FishRent model. We investigate ecosystem-wide responses to changes in human-induced pressures by simulating several eutrophication scenarios that are relevant to existing Baltic Sea management plans (e.g. EU BSAP......, EU CFP). We further present the structure and calibration of the Baltic ATLANTIS model and the operational linkage to the other models. Using the results of eutrophication scenarios, and focusing on the relative changes in fish and fishery production, we discuss the robustness of the model linking...

  3. SU-E-T-508: End to End Testing of a Prototype Eclipse Module for Planning Modulated Arc Therapy On the Siemens Platform

    Energy Technology Data Exchange (ETDEWEB)

    Huang, L [Huntsman Cancer Hospital, Salt Lake City, UT (United States); Sarkar, V [University of Utah Hospitals, Salt Lake City, UT (United States); Spiessens, S [Varian Medical Systems France, Buc Cedex (France); Rassiah-Szegedi, P; Huang, Y; Salter, B [University Utah, Salt Lake City, UT (United States); Zhao, H [University of Utah, Salt Lake City, UT (United States); Szegedi, M [Huntsman Cancer Hospital, The University of Utah, Salt Lake City, UT (United States)

    2014-06-01

    Purpose: The latest clinical implementation of the Siemens Artiste linac allows for delivery of modulated arcs (mARC) using full-field flattening filter free (FFF) photon beams. The maximum dose rate of 2000 MU/min is well suited for high dose treatments such as SBRT. We tested and report on the performance of a prototype Eclipse TPS module supporting mARC capability on the Artiste platform. Methods: Our spine SBRT patients originally treated with 12/13-field static-gantry IMRT (SGIMRT) were chosen for this study. These plans were designed to satisfy RTOG0631 guidelines with a prescription of 16 Gy in a single fraction. The cases were re-planned as mARC plans in the prototype Eclipse module using the 7 MV FFF beam and required to satisfy RTOG0631 requirements. All plans were transferred from Eclipse, delivered on a Siemens Artiste linac and dose-validated using the Delta4 system. Results: All treatment plans were straightforwardly developed, in timely fashion, without challenge or inefficiency using the prototype module. Due to the limited number of segments in a single arc, mARC plans required 2-3 full arcs to yield plan quality comparable to SGIMRT plans containing over 250 total segments. The average (3%/3 mm) gamma pass rate for all arcs was 98.5 ± 1.1%, demonstrating both excellent dose prediction by the AAA dose algorithm and excellent delivery fidelity. Mean delivery times for the mARC plans (10.5 ± 1.7 min) were 50-70% lower than for the SGIMRT plans (26 ± 2 min), with both delivered at 2000 MU/min. Conclusion: A prototype Eclipse module capable of planning for Burst Mode modulated arc delivery on the Artiste platform has been tested and found to perform efficiently and accurately for treatment plan development and delivered-dose prediction. Further investigation of more treatment sites is being carried out and data will be presented.

  4. Wide Area Recovery and Resiliency Program (WARRP) Biological Attack Response and Recovery: End to End Medical Countermeasure Distribution and Dispensing Processes

    Science.gov (United States)

    2012-04-24

    medications from the SNS. "States need supply and resupply strategies. Initiating the VMI (Vendor Managed Inventory) has not been tested. In a true emergency, would they be able..."

  5. Robot-assisted segmental resection of tubal pregnancy followed by end-to-end reanastomosis for preserving tubal patency and fertility

    Science.gov (United States)

    Park, Joo Hyun; Cho, SiHyun; Choi, Young Sik; Seo, Seok Kyo; Lee, Byung Seok

    2016-01-01

    Abstract The objective of this study was to evaluate whether robotic tubal reanastomosis after segmental resection of tubal pregnancy is a feasible means of preserving tubal integrity and natural fertility in those with compromised contralateral tubal condition. The study was performed at a university medical center in a retrospective manner where da Vinci robotic system-guided segmental resection of the tubal ectopic mass followed by reanastomosis was performed to salvage tubal patency and fertility in those with a single viable fallopian tube. Of the 17 patients with tubal pregnancies that were selected, 14 patients with successful tubal segmental resection and reanastomosis were followed up. The reproducibility of anastomosis success and cumulative pregnancy rates of up to 24 months were analyzed. Patient mean age was 28.88 ± 4.74 years, mean amenorrheic period was 7.01 ± 1.57 weeks and mean human chorionic gonadotropin (hCG) level was 9289.00 ± 7510.00 mIU/mL. The overall intraoperative cancellation rate due to unfavorable positioning or size of the tubal mass was 17.65% (3/17), and these cases were converted to either salpingectomy or milking of the ectopic mass. Anastomosis was successful in all 14 attempted cases, with 1 anastomotic leakage. One patient wishing to postpone pregnancy and 2 patients in whom patency of the contralateral tube was confirmed during the operation were excluded from the pregnancy outcome analysis. Cumulative pregnancy rate was 63.64% (7/11), with 3 (27.27%) ongoing pregnancies, 3 (27.27%) livebirths, and 1 missed abortion at 24 months. During the follow-up, hysterosalpingography (HSG) was performed at 6 months for those who consented, and all 10 fallopian tubes tested were patent. No subsequent tubal pregnancies occurred in the reanastomosed tube for up to a period of 24 months. 
For patients with absent or defective contralateral tubal function, da Vinci-guided reanastomosis after segmental resection of tubal pregnancy is feasible for salvaging tubal patency and fertility. PMID:27741101

  6. EXSdetect: an end-to-end software for extended source detection in X-ray images: application to Swift-XRT data

    Science.gov (United States)

    Liu, T.; Tozzi, P.; Tundo, E.; Moretti, A.; Wang, J.-X.; Rosati, P.; Guglielmetti, F.

    2013-01-01

    Aims: We present a stand-alone software (named EXSdetect) for the detection of extended sources in X-ray images. Our goal is to provide a flexible tool capable of detecting extended sources down to the lowest flux levels attainable within instrumental limitations, while maintaining robust photometry, high completeness, and low contamination, regardless of source morphology. EXSdetect was developed mainly to exploit the ever-increasing wealth of archival X-ray data, but is also ideally suited to explore the scientific capabilities of future X-ray facilities, with a strong focus on investigations of distant groups and clusters of galaxies. Methods: EXSdetect combines a fast Voronoi tessellation code with a friends-of-friends algorithm and an automated deblending procedure. The values of key parameters are matched to fundamental telescope properties such as angular resolution and instrumental background. In addition, the software is designed to permit extensive tests of its performance via simulations of a wide range of observational scenarios. Results: We applied EXSdetect to simulated data fields modeled to realistically represent the Swift X-ray Cluster Survey (SXCS), which is based on archival data obtained by the X-ray telescope onboard the Swift satellite. We achieve more than 90% completeness for extended sources comprising at least 80 photons in the 0.5-2 keV band, a limit that corresponds to 10^-14 erg cm^-2 s^-1 for the deepest SXCS fields. This detection limit is comparable to the one attained by the most sensitive cluster surveys conducted with much larger X-ray telescopes. While evaluating the performance of EXSdetect, we also explored the impact of improved angular resolution and discuss the ideal properties of the next generation of X-ray survey missions. The Python code EXSdetect is available on the SXCS website http://adlibitum.oats.inaf.it/sxcs
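    The friends-of-friends step described in the Methods above can be illustrated with a minimal sketch. This is not the EXSdetect code itself: the function name, the linking length, and the example point set are assumptions made for illustration; two points are "friends" if they lie within the linking length, and sources are the connected components of the friendship graph.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def friends_of_friends(points, linking_length):
        """Label connected groups of points: two points are 'friends' if they
        lie within linking_length of each other, and a group is a connected
        component of the resulting friendship graph."""
        tree = cKDTree(points)
        pairs = tree.query_pairs(r=linking_length)

        # Union-find over point indices
        parent = list(range(len(points)))

        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]  # path halving
                i = parent[i]
            return i

        for i, j in pairs:
            ri, rj = find(i), find(j)
            if ri != rj:
                parent[ri] = rj

        roots = np.array([find(i) for i in range(len(points))])
        # Relabel roots to consecutive group ids 0..k-1
        _, labels = np.unique(roots, return_inverse=True)
        return labels

    # Two well-separated clumps of "photons" -> two groups
    pts = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
    labels = friends_of_friends(pts, linking_length=0.5)
    print(labels)  # → [0 0 1 1]
    ```

    The real pipeline additionally weights the grouping by the Voronoi cell densities and deblends overlapping sources; this sketch shows only the grouping kernel.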

  7. From Ambient Sensing to IoT-based Context Computing: An Open Framework for End to End QoC Management.

    Science.gov (United States)

    Marie, Pierrick; Desprats, Thierry; Chabridon, Sophie; Sibilla, Michelle; Taconet, Chantal

    2015-06-16

    Quality of Context (QoC) awareness is recognized as a key point for the success of context-aware computing. At a time when the combination of the Internet of Things, Cloud Computing, and Ambient Intelligence paradigms together offers new opportunities for managing richer context data, the next generation of Distributed Context Managers (DCM) is facing new challenges concerning QoC management. This paper presents our model-driven QoCIM framework. QoCIM is the acronym for Quality of Context Information Model. We show how it can help application developers to manage the whole QoC life-cycle by providing genericity, openness and uniformity. Its usages are illustrated, both at design time and at runtime, in the case of an urban pollution context- and QoC-aware scenario.

  8. Linear Programming Based Estimation of Internet End-to-End Delay

    Institute of Scientific and Technical Information of China (English)

    朱畅华; 裴昌幸; 李建东; 肖海云

    2004-01-01

    Measuring the characteristics of Internet end-to-end delay is an important part of studying Internet end-to-end packet behavior, with applications in QoS (Quality of Service) and SLA (Service Level Agreement) management, research on congestion control algorithms, and many other areas. Most common end-to-end delay measurement methods rely on GPS receivers or on the NTP protocol to synchronize the sender and receiver clocks; however, GPS receivers are too expensive to equip on every host, and the accuracy of NTP does not meet the requirements. This paper uses a linear-programming-based method to estimate parameters such as the frequency difference (skew) and relative offset between the sender and receiver clocks, and thereby obtains an estimate of the end-to-end delay. The authors tested the method on several different links; the results show that it can effectively eliminate the effect of unsynchronized sender and receiver clocks.
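    The linear-programming idea can be sketched as follows: fit a line a·t + b that lies below all measured one-way delays while staying as close to them as possible; the slope a estimates the relative clock skew, the intercept b the clock offset, and subtracting the fitted line removes the desynchronization trend. This is a hedged reconstruction of the general approach, not the authors' code, and the synthetic data (skew, offset, noise model) are invented for illustration.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def estimate_skew_offset(t, d):
        """Fit the line a*t + b lying below all measured delays d while
        maximizing sum(a*t + b): a linear program whose solution tracks
        the lower envelope of the delay samples. Returns (skew, offset)."""
        n = len(t)
        # maximize sum(a*t_i + b)  <=>  minimize -(sum(t)*a + n*b)
        c = -np.array([t.sum(), float(n)])
        # constraints: a*t_i + b <= d_i for every sample
        A_ub = np.column_stack([t, np.ones(n)])
        res = linprog(c, A_ub=A_ub, b_ub=d,
                      bounds=[(None, None), (None, None)], method="highs")
        return res.x

    # Synthetic example: true skew 1e-4 s/s, offset 0.5 s, exponential queueing noise
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 100.0, 500)
    d = 1e-4 * t + 0.5 + rng.exponential(0.01, size=t.size)

    skew, offset = estimate_skew_offset(t, d)
    corrected = d - (skew * t + offset)   # delays with the clock trend removed
    ```

    Because the fitted line is constrained to lie below every sample, the corrected delays are non-negative (up to solver tolerance), which is what makes the lower-envelope fit robust to queueing spikes.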

  9. SU-E-J-55: End-To-End Effectiveness Analysis of 3D Surface Image Guided Voluntary Breath-Holding Radiotherapy for Left Breast

    Energy Technology Data Exchange (ETDEWEB)

    Lin, M; Feigenberg, S [University of Maryland School of Medicine, Baltimore, MD (United States)

    2015-06-15

    Purpose: To evaluate the effectiveness of using 3D surface imaging to guide breath-holding (BH) treatment of the left breast. Methods: Two 3D-surface-image-guided BH procedures were implemented and evaluated: normal-BH, taking a BH at a comfortable level, and deep-inspiration breath-holding (DIBH). A total of 20 patients (10 normal-BH and 10 DIBH) were recruited. Patients received a BH evaluation using a commercialized 3D-surface-tracking system (VisionRT, London, UK) to quantify the reproducibility of BH positions prior to CT scan. Tangential 3D/IMRT plans were conducted. Patients were initially set up under free-breathing (FB) conditions using the FB surface obtained from the untagged CT to ensure a correct patient position. Patients were then guided to reach the planned BH position using the BH surface obtained from the BH CT. Action levels were set at each phase of the treatment process, based on the information provided by the 3D-surface-tracking system, for proper interventions (eliminate/re-setup/re-coaching). We reviewed the frequency of interventions to evaluate its effectiveness. The FB-CBCT and port film were utilized to evaluate the accuracy of 3D-surface-guided setups. Results: 25% of BH candidates with BH positioning uncertainty > 2 mm were eliminated prior to CT scan. For > 90% of fractions, based on the setup deltas from the 3D-surface-tracking system, adjustments of the patient setup were needed after the initial laser-based setup. 3D-surface-guided setup accuracy was comparable to that of CBCT. For the BH guidance, the frequency of interventions (re-coaching/re-setup) was 40% (normal-BH) / 91% (DIBH) of treatments for the first 5 fractions and then dropped to 16% (normal-BH) / 46% (DIBH). The necessity of re-setup was highly patient-specific for normal-BH but highly random among patients for DIBH. Overall, an accuracy of −0.8 ± 2.4 mm in the anterior pericardial shadow position was achieved. 
    Conclusion: 3D-surface-imaging technology provides effective intervention in the treatment process and ensures favorable day-to-day setup accuracy. The DIBH setup appears to be more uncertain, and this would be the patient group that will definitely benefit from the extra information of the 3D surface setup.

  10. Perceptual Objective Listening Quality Assessment (POLQA), The Third Generation ITU-T Standard for End-to-End Speech Quality Measurement : Part II – Perceptual Model

    NARCIS (Netherlands)

    Beerends, J.G.; Schmidmer, C.; Berger, J.; Obermann, M.; Ullman, R.; Pomy, J.; Keyhl, M.

    2013-01-01

    In this and the companion paper Part I, the authors present the Perceptual Objective Listening Quality Assessment (POLQA), the third-generation speech quality measurement algorithm, standardized by the International Telecommunication Union in 2011 as Recommendation P.863. This paper describes the

  11. EXSdetect: an end-to-end software for extended source detection in X-ray images: application to Swift-XRT data

    CERN Document Server

    Liu, Teng; Tundo, Elena; Moretti, A; Wang, Jun-Xian; Rosati, Piero; Guglielmetti, Fabrizia; 10.1051/0004-6361/201219866

    2012-01-01

    Aims. We present a stand-alone software (named EXSdetect) for the detection of extended sources in X-ray images. Our goal is to provide a flexible tool capable of detecting extended sources down to the lowest flux levels attainable within instrumental limitations, while maintaining robust photometry, high completeness, and low contamination, regardless of source morphology. EXSdetect was developed mainly to exploit the ever-increasing wealth of archival X-ray data, but is also ideally suited to explore the scientific capabilities of future X-ray facilities, with a strong focus on investigations of distant groups and clusters of galaxies. Methods. EXSdetect combines a fast Voronoi tessellation code with a friends-of-friends algorithm and an automated deblending procedure. The values of key parameters are matched to fundamental telescope properties such as angular resolution and instrumental background. In addition, the software is designed to permit extensive tests of its performance via simulations of a wide ...

  12. The CTTC 5G End-to-End Experimental Platform : Integrating Heterogeneous Wireless/Optical Networks, Distributed Cloud, and IoT Devices

    OpenAIRE

    Muñóz, Raul; Mangues-Bafalluy, Josep; Vilalta, Ricard; Verikoukis, Christos; Alonso-Zarate, Jesús; Bartzoudis, Nikolaos; Georgiadis, Apostolos; Payaró, Miquel; Pérez-Neira, Ana; Casellas, Ramon; Martínez, Ricardo; Núñez-Martínez, Jose; Manuel Requena Esteso, Manuel; Pubill, David; Font-Batch, Oriol

    2016-01-01

    The Internet of Things (IoT) will facilitate a wide variety of applications in different domains, such as smart cities, smart grids, industrial automation (Industry 4.0), smart driving, assistance of the elderly, and home automation. Billions of heterogeneous smart devices with different application requirements will be connected to the networks and will generate huge aggregated volumes of data that will be processed in distributed cloud infrastructures. On the other hand, there is also a gen...

  13. The Italian ASTRI program: an end-to-end dual-mirror telescope prototype for the CTA Small System telescope array

    Science.gov (United States)

    Caraveo, Patrizia; Pareschi, Giovanni; Catalano, Osvaldo; Vercellone, Stefano; Sacco, Bruno; Conconi, Paolo; Fiorini, Mauro; Canestrari, Rodolfo

    2012-07-01

    ASTRI (Astrofisica con Specchi a Tecnologia Replicante Italiana) is a flagship project of the Italian Ministry of Education, University and Research related to the next generation IACT (Imaging Atmospheric Cherenkov Telescope), within the framework of the CTA International Observatory. In this context, INAF (Italian National Institute of Astrophysics) is currently developing a scientific and technological breakthrough to allow the study of the uppermost end of the VHE domain (from a few TeV to hundreds of TeV). The ASTRI project timeframe is about 3 years, and foresees the full development, installation and calibration of a Small Size class Telescope prototype compliant with the requirements of the High Energy array of CTA. The ASTRI prototype will adopt an aplanatic, wide field, double reflection optical layout in a Schwarzschild-Couder configuration. Moreover, the focal plane instrument will explore small pixelated detector sensors such as Silicon PMs. Among a number of technological challenges, this telescope will be the very first instrument implementing both the Schwarzschild-Couder optical configuration and the double reflection for air Cherenkov imaging. In this paper we describe the status of the project, and we present the results obtained so far in the different technological developments.

  14. Perspectives with the GCT end-to-end prototype of the small-sized telescope proposed for the Cherenkov telescope array

    Science.gov (United States)

    Costantini, H.; Dournaux, J.-L.; Ernenwein, J.-P.; Laporte, P.; Sol, H.

    2017-01-01

    In the framework of the Cherenkov Telescope Array (CTA), the GCT (Gamma-ray Cherenkov Telescope) team is building a dual-mirror telescope as one of the proposed prototypes for the CTA small size class of telescopes. The telescope is based on a Schwarzschild-Couder (SC) optical design, an innovative solution for ground-based Cherenkov astronomy, which allows a compact telescope structure, a lightweight large Field of View (FoV) camera and enables good angular resolution across the entire FoV. We review the different mechanical and optical components of the telescope. In order to characterise them, the Paris prototype will be operated during several weeks in 2016. In this framework, an estimate of the expected performance of this prototype has been made, based on Monte Carlo simulations. In particular the observability of the Crab Nebula in the context of high Night Sky Background (NSB) is presented.

  15. 'End to end' planktonic trophic web and its implications for the mussel farms in the Mar Piccolo of Taranto (Ionian Sea, Italy).

    Science.gov (United States)

    Karuza, Ana; Caroppo, Carmela; Monti, Marina; Camatti, Elisa; Di Poi, Elena; Stabili, Loredana; Auriemma, Rocco; Pansera, Marco; Cibic, Tamara; Del Negro, Paola

    2016-07-01

    The Mar Piccolo is a semi-enclosed basin subject to different natural and anthropogenic stressors. In order to better understand plankton dynamics and preferential carbon pathways within the planktonic trophic web, an integrated approach was adopted for the first time by examining all trophic levels (virioplankton, the heterotrophic and phototrophic fractions of pico-, nano- and microplankton, as well as mesozooplankton). Plankton abundance and biomass were investigated during four surveys in the period 2013-2014. Besides unveiling the dynamics of different plankton groups in the Mar Piccolo, the study revealed that a high portion of the plankton carbon (C) pool was constituted by small-sized plankton. Mussel farming in the Mar Piccolo exerts a profound impact on plankton communities, not only due to the important sequestration of the plankton biomass but also by strongly influencing its structure.

  16. Distance Learning for Special Populations

    Science.gov (United States)

    Bates, Rodger A.

    2012-01-01

    Distance education strategies for remotely deployed, highly mobile, or institutionalized populations are reviewed and critiqued. Specifically, asynchronous, offline responses for special military units, Native Americans on remote reservations, prison populations and other geographically, temporally or technologically isolated niche populations are…

  17. ECONOMICS OF DISTANCE EDUCATION RECONSIDERED

    Directory of Open Access Journals (Sweden)

    Wolfram LAASER

    2008-07-01

    Full Text Available According to Gartner, the initial hype around e-learning was followed by a downturn, but e-learning will continue to be an important factor in learning scenarios. However, the economic viability of e-learning projects will be questioned with more scrutiny than in earlier periods. It therefore seems a good opportunity to see what can be learned from past experience in costing distance-learning projects and what aspects are added by current attempts to measure economic efficiency. After reviewing early research on costing distance learning, some more recent approaches are discussed, such as e-learning ROI calculators and the concept of total cost of ownership. Furthermore, some microeconomic effects of localizing distance-learning courses are outlined. Finally, several unsolved issues in costing distance education are summarized.
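    As a concrete illustration of the ROI-calculator idea mentioned above, a minimal computation might look like the following. The formula is the generic ROI definition, and the figures are hypothetical assumptions, not taken from the paper:

    ```python
    def elearning_roi(total_benefits, total_cost_of_ownership):
        """Return on investment: net benefit relative to total cost of
        ownership (TCO), the basic formula behind most ROI calculators."""
        return (total_benefits - total_cost_of_ownership) / total_cost_of_ownership

    # Hypothetical figures: 120,000 saved in travel/trainer costs against an
    # 80,000 TCO (development, hosting, maintenance, learner time)
    roi = elearning_roi(120_000, 80_000)
    print(f"ROI = {roi:.0%}")  # → ROI = 50%
    ```

    The total-cost-of-ownership view matters because it pulls hidden recurring costs (hosting, maintenance, learner time) into the denominator rather than counting development costs alone.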

  18. Distance Education in Technological Age

    Directory of Open Access Journals (Sweden)

    R .C. SHARMA

    2005-04-01

    Full Text Available Distance Education in Technological Age, Romesh Verma (Editor), New Delhi: Anmol Publications, 2005, ISBN 81-261-2210-2, pp. 419. Reviewed by R. C. Sharma, Regional Director, Indira Gandhi National Open University, India. The advancements in information and communication technologies have brought significant changes in the way open and distance learning are provided to the learners. The impact of such changes is quite visible in both developed and developing countries. Switching over to the online mode, joining hands with private initiatives, and making a presence in foreign waters are some of the hallmarks of open and distance education (ODE) institutions in developing countries. The compilation of twenty-six essays on themes applicable to ODE has resulted in the book "Distance Education in Technological Age". These essays follow a progressive style of narration, starting from the conceptual framework of distance education and how distance education emerged on the global scene and in India, and then going on to discuss the emergence of online distance education and research aspects in ODE. The initial four chapters provide a detailed account of the historical development and growth of distance education in India, and of the State Open University and National Open University models in India. Student support services are pivotal to any distance education, and much of its success depends on how well these support services are provided. These are discussed from national and international perspectives. The issues of collaborative learning, learning on demand, lifelong learning, the learning-unlearning and re-learning model, and strategic alliances are also given due space by the authors. An assortment of technologies, such as communication technology, domestic technology, information technology, mass media and entertainment technology, media technology and educational technology, gives an idea of how these technologies are being adopted in the open universities. 
The study

  19. KNOWLEDGE DISTANCE IN INFORMATION SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    Yuhua QIAN; Jiye LIANG; Chuangyin DANG; Feng WANG; Wei XU

    2007-01-01

    In this paper, we first introduce the concepts of knowledge closeness and knowledge distance for measuring the sameness and the difference among knowledge in an information system, respectively. The relationship between these two concepts is a strictly mutual complement relation. We then investigate some important properties of knowledge distance and perform experimental analyses on two public data sets, which show that the presented measure is well suited to characterize the nature of knowledge in an information system. Finally, we establish the relationship between the knowledge distance and knowledge granulation, which shows that two variants of the knowledge distance can also be used to construct the knowledge granulation. These results will be helpful for studying uncertainty in information systems.

  20. Graph distance for complex networks

    Science.gov (United States)

    Shimada, Yutaka; Hirata, Yoshito; Ikeguchi, Tohru; Aihara, Kazuyuki

    2016-10-01

    Networks are widely used as a tool for describing diverse real complex systems and have been successfully applied to many fields. The distance between networks is one of the most fundamental concepts for properly classifying real networks, detecting temporal changes in network structures, and effectively predicting their temporal evolution. However, this distance has rarely been discussed in the theory of complex networks. Here, we propose a graph distance between networks based on a Laplacian matrix that reflects the structural and dynamical properties of networked dynamical systems. Our results indicate that the Laplacian-based graph distance effectively quantifies the structural difference between complex networks. We further show that our approach successfully elucidates the temporal properties underlying temporal networks observed in the context of face-to-face human interactions.
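    One common way to realize a Laplacian-based graph distance, offered here as an illustrative sketch rather than the authors' exact definition, is to compare the eigenvalue spectra of the two graph Laplacians (assuming graphs on the same number of nodes):

    ```python
    import numpy as np

    def laplacian(adj):
        """Combinatorial graph Laplacian L = D - A for a symmetric
        0/1 adjacency matrix."""
        return np.diag(adj.sum(axis=1)) - adj

    def spectral_distance(adj1, adj2):
        """Euclidean distance between the sorted Laplacian eigenvalue
        sequences of two graphs with the same number of nodes."""
        e1 = np.sort(np.linalg.eigvalsh(laplacian(adj1)))
        e2 = np.sort(np.linalg.eigvalsh(laplacian(adj2)))
        return float(np.linalg.norm(e1 - e2))

    # 3-node path vs. triangle: Laplacian spectra {0, 1, 3} vs. {0, 3, 3}
    path = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
    tri = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
    print(spectral_distance(path, tri))  # → 2.0 (up to floating-point error)
    ```

    The Laplacian spectrum reflects connectivity and diffusion dynamics on the graph, which is why spectral comparisons of this kind are a natural starting point for the structural-plus-dynamical distance the abstract describes.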