WorldWideScience

Sample records for end-to-end performance modeling

  1. Integrated Ray Tracing Model for End-to-end Performance Verification of Amon-Ra Instrument

    Science.gov (United States)

    Lee, Jae-Min; Park, Won Hyun; Ham, Sun-Jeong; Yi, Hyun-Su; Yoon, Jee Yeon; Kim, Sug-Whan; Choi, Ki-Hyuk; Kim, Zeen Chul; Lockwood, Mike

    2007-03-01

    The international EARTHSHINE mission aims to measure the 1% anomaly of the Earth's global albedo and the total solar irradiance (TSI) using the Amon-Ra instrument around Lagrange point 1. We developed a new ray-tracing-based integrated end-to-end simulation tool that overcomes the shortcomings of existing end-to-end performance simulation techniques. We then studied the in-orbit radiometric performance of the breadboard Amon-Ra visible-channel optical system. The TSI variation and the Earth albedo anomaly, reported elsewhere, were used as the key input variables in the simulation. The output flux at the instrument focal plane confirms that the integrated ray-tracing-based end-to-end science simulation delivers the correct level of incident power to the Amon-Ra instrument, well within the required measurement error budget of better than ±0.28%. Using the global angular distribution model (ADM), the incident flux is then used to estimate the Earth's global albedo and the TSI variation, confirming the validity of the primary science cases at the L1 halo orbit. These results imply that the integrated end-to-end ray-tracing technique reported here can serve as an effective and powerful building block of the on-line science analysis tool in support of the international EARTHSHINE mission currently being developed.

  2. TROPOMI end-to-end performance studies

    Science.gov (United States)

    Voors, Robert; de Vries, Johan; Veefkind, Pepijn; Gloudemans, Annemieke; Mika, Àgnes; Levelt, Pieternel

    2008-10-01

    The TROPOspheric Monitoring Instrument (TROPOMI) is a UV/VIS/NIR/SWIR non-scanning nadir-viewing imaging spectrometer that combines a wide swath (110°) with high spatial resolution (8 x 8 km). Its main heritages are the Ozone Monitoring Instrument (OMI), launched in 2004, and SCIAMACHY. Since its launch, OMI has been providing, on a daily basis and on a global scale, a wealth of data on ozone, NO2 and minor trace gases, aerosols and local pollution. In the framework of development programs for a follow-up mission to the successful Ozone Monitoring Instrument, we have developed the so-called TROPOMI Integrated Development Environment (TIDE). This is a GRID-based software simulation tool for OMI follow-up missions. It includes scene generation, an instrument simulator, a level 0-1b processing chain, as well as several level 1b-2 processing chains. In addition, it contains an error analyzer, i.e. a tool to feed the level 2 results back to the input of the scene generator. The paper gives a description of the TROPOMI instrument and focuses on design aspects as well as on the performance, as tested in the end-to-end development environment TIDE.

  3. SIP end to end performance metrics

    OpenAIRE

    Vozňák, Miroslav; Rozhon, Jan

    2012-01-01

    The paper deals with a SIP performance testing methodology. The main contribution to the field of performance testing of SIP infrastructure is the possibility to perform standardized stress tests with the developed SIP TesterApp without deeper knowledge of SIP communication. The developed tool exploits several open-source technologies such as jQuery, Python and JSON, and the cornerstone SIP generator SIPp; the result is highly modifiable and the ...

  4. End-to-End Neural Segmental Models for Speech Recognition

    Science.gov (United States)

    Tang, Hao; Lu, Liang; Kong, Lingpeng; Gimpel, Kevin; Livescu, Karen; Dyer, Chris; Smith, Noah A.; Renals, Steve

    2017-12-01

    Segmental models are an alternative to frame-based models for sequence prediction, where hypothesized path weights are based on entire segment scores rather than a single frame at a time. Neural segmental models are segmental models that use neural network-based weight functions. Neural segmental models have achieved competitive results for speech recognition, and their end-to-end training has been explored in several studies. In this work, we review neural segmental models, which can be viewed as consisting of a neural network-based acoustic encoder and a finite-state transducer decoder. We study end-to-end segmental models with different weight functions, including ones based on frame-level neural classifiers and on segmental recurrent neural networks. We study how reducing the search space size impacts performance under different weight functions. We also compare several loss functions for end-to-end training. Finally, we explore training approaches, including multi-stage vs. end-to-end training and multitask training that combines segmental and frame-level losses.
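
The segment-versus-frame distinction the abstract describes can be made concrete with a toy sketch (the frame scores, label set, and segment weight function below are invented for illustration; they are not the paper's neural weight functions):

```python
# Illustrative sketch: a segmental model assigns a weight to each hypothesized
# segment (label, start, end) and scores a full path as the sum of its segment
# weights, instead of scoring one frame at a time.

def segment_weight(frames, label, start, end):
    """Toy weight: average per-frame score for `label` over [start, end)."""
    span = frames[start:end]
    return sum(f[label] for f in span) / len(span)

def path_weight(frames, segmentation):
    """Weight of a path = sum of its segment weights.

    `segmentation` is a list of (label, start, end) spans covering the input.
    """
    return sum(segment_weight(frames, lab, s, e) for lab, s, e in segmentation)

# Tiny example: 4 frames, 2 labels, made-up per-frame scores.
frames = [
    {"a": 0.9, "b": 0.1},
    {"a": 0.8, "b": 0.2},
    {"a": 0.2, "b": 0.7},
    {"a": 0.1, "b": 0.9},
]
hyp = [("a", 0, 2), ("b", 2, 4)]  # "a" spans frames 0-1, "b" spans frames 2-3
print(round(path_weight(frames, hyp), 3))  # 0.85 + 0.8 = 1.65
```

In the paper's setting the weight function would be a neural network over an acoustic encoding of the whole segment; the decoding structure is the same.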

  5. End-to-end network/application performance troubleshooting methodology

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Wenji; Bobyshev, Andrey; Bowden, Mark; Crawford, Matt; Demar, Phil; Grigaliunas, Vyto; Grigoriev, Maxim; Petravick, Don; /Fermilab

    2007-09-01

    The computing models for HEP experiments are globally distributed and grid-based. Obstacles to good network performance arise from many causes and can be a major impediment to the success of the computing models for HEP experiments. Factors that affect overall network/application performance exist on the hosts themselves (application software, operating system, hardware), in the local area networks that support the end systems, and within the wide area networks. Since the computer and network systems are globally distributed, it can be very difficult to locate and identify the factors that are hurting application performance. In this paper, we present an end-to-end network/application performance troubleshooting methodology developed and in use at Fermilab. The core of our approach is to narrow down the problem scope with a divide and conquer strategy. The overall complex problem is split into two distinct sub-problems: host diagnosis and tuning, and network path analysis. After satisfactorily evaluating, and if necessary resolving, each sub-problem, we conduct end-to-end performance analysis and diagnosis. The paper will discuss tools we use as part of the methodology. The long term objective of the effort is to enable site administrators and end users to conduct much of the troubleshooting themselves, before (or instead of) calling upon network and operating system 'wizards,' who are always in short supply.
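
The divide-and-conquer structure of the methodology can be sketched schematically (all function names, checks, and thresholds below are placeholders, not Fermilab's actual tooling):

```python
# Schematic of the methodology: split the end-to-end problem into host
# diagnosis and network path analysis, resolve each, then re-test end to end.

def diagnose_host(host):
    """Placeholder host-side checks (application, OS tuning, hardware)."""
    findings = []
    if host.get("tcp_buffer_kb", 8192) < 4096:
        findings.append("TCP buffers too small for high bandwidth-delay paths")
    if host.get("cpu_busy_pct", 0) > 90:
        findings.append("host CPU saturated")
    return findings

def analyze_path(path):
    """Placeholder network-path checks."""
    findings = []
    if path.get("loss_pct", 0) > 0.1:
        findings.append("packet loss on WAN path")
    return findings

def troubleshoot(sender, receiver, path):
    """Divide and conquer: diagnose both sub-problems before re-testing."""
    issues = {"hosts": diagnose_host(sender) + diagnose_host(receiver),
              "path": analyze_path(path)}
    issues["next_step"] = ("end-to-end re-test" if not any(issues.values())
                           else "resolve sub-problems, then re-run end-to-end analysis")
    return issues

report = troubleshoot({"tcp_buffer_kb": 512}, {"cpu_busy_pct": 95}, {"loss_pct": 0.0})
print(report["next_step"])
```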

  6. End-to-End Delay Model for Train Messaging over Public Land Mobile Networks

    Directory of Open Access Journals (Sweden)

    Franco Mazzenga

    2017-11-01

    Full Text Available Modern train control systems rely on a dedicated radio network for train-to-ground communications. A number of possible alternatives have been analysed for adopting the European Rail Traffic Management System/European Train Control System (ERTMS/ETCS) on local/regional lines to improve transport capacity. Among them, a communication system based on public networks (cellular and satellite) provides an interesting and effective alternative to proprietary and expensive radio networks. To analyse the performance of this solution, it is necessary to model the end-to-end delay and message loss to fully characterize the message transfer process from train to ground and vice versa. Starting from the results of a railway test campaign over a 300 km railway line, covering a cumulative 12,000 traveled km in 21 days, in this paper we derive a statistical model for the end-to-end delay required for delivering messages. In particular, we propose a two-state model that reproduces the main behavioral characteristics of the end-to-end delay as observed experimentally. The model formulation was derived after an in-depth analysis of the recorded experimental data. When applied to model a realistic scenario, it explicitly accounts for the radio coverage characteristics, the received power level, the handover points along the line, and the serving radio technology. As an example, the proposed model is used to generate the end-to-end delay profile in a realistic scenario.
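
The two-state idea can be sketched as a simple Markov model (the transition probabilities and delay distributions below are illustrative placeholders, not the values fitted from the test campaign):

```python
# Minimal two-state delay model sketch: a GOOD state emits short delays (good
# radio coverage) and a BAD state emits long delays (handover / poor coverage).
import random

def simulate_delays(n, p_gb=0.05, p_bg=0.3, seed=42):
    """Return n end-to-end delays (seconds) from a two-state Markov model.

    p_gb and p_bg are the GOOD->BAD and BAD->GOOD transition probabilities
    per delivered message.
    """
    rng = random.Random(seed)
    state = "GOOD"
    delays = []
    for _ in range(n):
        if state == "GOOD":
            delays.append(rng.expovariate(1 / 0.2))   # mean 0.2 s (assumed)
            if rng.random() < p_gb:
                state = "BAD"
        else:
            delays.append(rng.expovariate(1 / 2.0))   # mean 2.0 s (assumed)
            if rng.random() < p_bg:
                state = "GOOD"
    return delays

d = simulate_delays(10_000)
print(f"mean delay: {sum(d)/len(d):.2f} s")
```

A fitted version of such a model would tie the state transitions to coverage, received power, and handover points along the line, as the paper does.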

  7. Model outputs - Developing end-to-end models of the Gulf of California

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the northern Gulf of California, linking oceanography, biogeochemistry, food web...

  8. Atlantis model outputs - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  9. Physical oceanography - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  10. Enhancing End-to-End Performance of Information Services Over Ka-Band Global Satellite Networks

    Science.gov (United States)

    Bhasin, Kul B.; Glover, Daniel R.; Ivancic, William D.; vonDeak, Thomas C.

    1997-01-01

    The Internet has been growing at a rapid rate as the key medium for information services such as e-mail, the WWW and multimedia; however, its global reach is limited. Ka-band communication satellite networks are being developed to increase the accessibility of information services via the Internet on a global scale. There is a need to assess satellite networks in their ability to provide these services and to interconnect seamlessly with existing and proposed terrestrial telecommunication networks. In this paper, the significant issues and requirements in providing end-to-end high performance for the delivery of information services over satellite networks are identified, based on the various layers of the OSI reference model. Key experiments have been performed to evaluate the performance of digital video and Internet traffic over satellite-like testbeds. The results of early developments in ATM and TCP protocols over satellite networks are summarized.
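
One concrete issue behind the TCP-over-satellite experiments the paper summarizes is the bandwidth-delay product of a geostationary link; a back-of-the-envelope sketch (link rate and RTT are illustrative round numbers):

```python
# The bandwidth-delay product (BDP) is the amount of data that must be in
# flight to keep a link fully utilized: rate x round-trip time.

def bandwidth_delay_product(rate_bps, rtt_s):
    """Bytes in flight needed to keep the pipe full."""
    return rate_bps * rtt_s / 8

# GEO satellite path: ~0.5 s round-trip time, 10 Mbit/s link (assumed values).
bdp = bandwidth_delay_product(10e6, 0.5)
print(f"BDP = {bdp/1024:.0f} KiB")  # far above the classic 64 KiB TCP window
```

This is why unmodified TCP window sizes of the era throttled throughput on Ka-band paths, motivating the protocol work the paper describes.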

  11. End-to-end modeling: a new modular and flexible approach

    Science.gov (United States)

    Genoni, M.; Riva, M.; Landoni, M.; Pariani, G.

    2016-08-01

    In this paper we present an innovative philosophy for developing the End-to-End model for astronomical observation projects, i.e. the architecture that allows physical modeling of the whole system from the light source to the reduced data. This philosophy foresees developing the physical models of the different modules that compose the entire End-to-End system directly during the project design phase. The approach is strongly characterized by modularity and flexibility; these aspects will be of particular importance in next-generation astronomical observation projects such as the E-ELT (European Extremely Large Telescope) because of their high complexity and long design and development times. With this approach it will be possible to keep the whole system and its different modules efficiently under control during every project phase, and to exploit a reliable tool at the system-engineering level to evaluate the effects on the final performance of both the main parameters and of different instrument architectures and technologies. It will also allow the scientific community to perform simulations and tests on the science drivers in advance. This will translate into continuous feedback to the (system) design process, with a resulting improvement in the effectively achievable scientific goals, and into a consistent tool for efficiently planning observation proposals and programs. We present the application case for this End-to-End modeling technique: the high-resolution spectrograph at the E-ELT (E-ELT HIRES). In particular, we present the definition of the system's modular architecture, describing the interface parameters of the modules.

  12. Screening California Current fishery management scenarios using the Atlantis end-to-end ecosystem model

    Science.gov (United States)

    Kaplan, Isaac C.; Horne, Peter J.; Levin, Phillip S.

    2012-09-01

    End-to-end marine ecosystem models link climate and oceanography to the food web and human activities. These models can be used as forecasting tools, to strategically evaluate management options and to support ecosystem-based management. Here we report the results of such forecasts in the California Current, using an Atlantis end-to-end model. We worked collaboratively with fishery managers at NOAA’s regional offices and staff at the National Marine Sanctuaries (NMS) to explore the impact of fishery policies on management objectives at different spatial scales, from single Marine Sanctuaries to the entire Northern California Current. In addition to examining Status Quo management, we explored the consequences of several gear switching and spatial management scenarios. Of the scenarios that involved large scale management changes, no single scenario maximized all performance metrics. Any policy choice would involve trade-offs between stakeholder groups and policy goals. For example, a coast-wide 25% gear shift from trawl to pot or longline appeared to be one possible compromise between an increase in spatial management (which sacrificed revenue) and scenarios such as the one consolidating bottom impacts to deeper areas (which did not perform substantially differently from Status Quo). Judged on a coast-wide scale, most of the scenarios that involved minor or local management changes (e.g. within Monterey Bay NMS only) yielded results similar to Status Quo. When impacts did occur in these cases, they often involved local interactions that were difficult to predict a priori based solely on fishing patterns. However, judged on the local scale, deviation from Status Quo did emerge, particularly for metrics related to stationary species or variables (i.e. habitat and local metrics of landed value or bycatch). We also found that isolated management actions within Monterey Bay NMS would cause local fishers to pay a cost for conservation, in terms of reductions in landed

  13. On end-to-end performance of MIMO multiuser in cognitive radio networks

    KAUST Repository

    Yang, Yuli

    2011-12-01

    In this paper, a design for multiple-input multiple-output (MIMO) multiuser transmission in a cognitive radio network is developed, and its end-to-end performance under spectrum-sharing constraints is investigated. First, the overall average packet error rate is analyzed, taking into account the channel state information feedback delay and multiuser scheduling. We then provide numerical results for several separate scenarios to evaluate the performance, presenting a convenient tool for the design of cognitive radio networks with multiple secondary MIMO users. © 2011 IEEE.
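
The multiuser-scheduling ingredient of such an analysis can be sketched in a few lines (the SNR model and parameters below are invented for illustration; the paper's analysis is analytical and also models feedback delay and spectrum-sharing constraints):

```python
# Toy opportunistic multiuser scheduling: each slot, the secondary user that
# reports the best channel quality is scheduled.
import random

def schedule(reported_snr_db):
    """Return the index of the user with the highest reported SNR."""
    return max(range(len(reported_snr_db)), key=lambda u: reported_snr_db[u])

rng = random.Random(0)
slots, n_users = 1000, 4
picked = [schedule([rng.gauss(10, 3) for _ in range(n_users)])
          for _ in range(slots)]
share = [picked.count(u) / slots for u in range(n_users)]
print("scheduling share per user:", [round(s, 2) for s in share])
```

With i.i.d. channels each user is scheduled roughly equally often; stale (delayed) channel reports would cause the scheduled user's actual SNR to differ from the reported one, which is the effect the paper's packet-error-rate analysis captures.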

  14. End-to-end modeling as part of an integrated research program in the Bering Sea

    Science.gov (United States)

    Punt, André E.; Ortiz, Ivonne; Aydin, Kerim Y.; Hunt, George L.; Wiese, Francis K.

    2016-12-01

    Traditionally, the advice provided to fishery managers has focused on the trade-offs between short- and long-term yields, and between future resource size and expected future catches. The harvest control rules that are used to provide management advice consequently relate catches to stock biomass levels expressed relative to reference biomass levels. There are, however, additional trade-offs. Ecosystem-based fisheries management (EBFM) aims to consider fish and fisheries in their ecological context, taking into account physical, biological, economic, and social factors. However, making EBFM operational remains challenging. It is generally recognized that end-to-end modeling should be a key part of implementing EBFM, along with harvest control rules that use information in addition to estimates of stock biomass to provide recommendations for management actions. Here we outline the process for selecting among alternative management strategies in an ecosystem context and summarize a Field-integrated End-To-End modeling program, or FETE, intended to implement this process as part of the Bering Sea Project. A key aspect of this project was that, from the start, the FETE included a management strategy evaluation component to compare management strategies. Effective use of end-to-end modeling requires that the models developed for a system are indeed integrated across climate drivers, lower trophic levels, fish population dynamics, and fisheries and their management. We summarize the steps taken by the program managers to promote integration of modeling efforts by multiple investigators and highlight the lessons learned during the project that can be used to guide future use and design of end-to-end models.

  15. TOWARD END-TO-END MODELING FOR NUCLEAR EXPLOSION MONITORING: SIMULATION OF UNDERGROUND NUCLEAR EXPLOSIONS AND EARTHQUAKES USING HYDRODYNAMIC AND ANELASTIC SIMULATIONS, HIGH-PERFORMANCE COMPUTING AND THREE-DIMENSIONAL EARTH MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Rodgers, A; Vorobiev, O; Petersson, A; Sjogreen, B

    2009-07-06

    This paper describes new research being performed to improve understanding of seismic waves generated by underground nuclear explosions (UNEs) using full waveform simulation, high-performance computing and three-dimensional (3D) earth models. The goal of this effort is to develop an end-to-end modeling capability covering the range of wave propagation required for nuclear explosion monitoring (NEM), from the buried nuclear device to the seismic sensor, and thereby to improve understanding of the physical basis and prediction capabilities of seismic observables for NEM, including source and path-propagation effects. We are pursuing research along three main thrusts. Firstly, we are modeling the non-linear hydrodynamic response of geologic materials to underground explosions in order to better understand how source emplacement conditions impact the seismic waves that emerge from the source region and are ultimately observed hundreds or thousands of kilometers away. Empirical evidence shows that the amplitudes and frequency content of seismic waves at all distances are strongly impacted by the physical properties of the source region (e.g. density, strength, porosity). To model the near-source shock-wave motions of a UNE, we use GEODYN, an Eulerian Godunov (finite volume) code incorporating thermodynamically consistent non-linear constitutive relations, including cavity formation, yielding, porous compaction, tensile failure, bulking and damage. In order to propagate motions to seismic distances we are developing a one-way coupling method to pass motions to WPP (a Cartesian anelastic finite difference code). Preliminary investigations of UNEs in canonical materials (granite, tuff and alluvium) confirm that emplacement conditions have a strong effect on seismic amplitudes and the generation of shear waves. Specifically, we find that motions from an explosion in high-strength, low-porosity granite have high compressional wave amplitudes and weak

  16. Design and end-to-end modelling of a deployable telescope

    Science.gov (United States)

    Dolkens, Dennis; Kuiper, Hans

    2017-09-01

    a closed-loop system based on measurements of the image sharpness as well as measurements obtained with edge sensors placed between the mirror segments. In addition, a phase diversity system will be used to recover residual wavefront aberrations. To aid the design of the deployable telescope, an end-to-end performance model was developed. The model is built around a dedicated ray-trace program written in Matlab. This program was built from the ground up for the purpose of modelling segmented telescope systems and allows for surface data computed with Finite Element Models (FEM) to be imported in the model. The program also contains modules which can simulate the closed-loop calibration of the telescope and it can use simulated images as an input for phase diversity and image processing algorithms. For a given thermo-mechanical state, the end-to-end model can predict the image quality that will be obtained after the calibration has been completed and the image has been processed. As such, the model is a powerful systems engineering tool, which can be used to optimize the in-orbit performance of a segmented, deployable telescope.

  17. Internet end-to-end performance monitoring for the High Energy Nuclear and Particle Physics community

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, W.

    2000-02-22

    Modern High Energy Nuclear and Particle Physics (HENP) experiments at laboratories around the world present a significant challenge to wide area networks. Petabytes (10^15 bytes) or exabytes (10^18 bytes) of data will be generated during the lifetime of an experiment. Much of this data will be distributed via the Internet to the experiment's collaborators at universities and institutes throughout the world for analysis. In order to assess the feasibility of the computing goals of these and future experiments, the HENP networking community is actively monitoring performance across a large part of the Internet used by its collaborators. Since 1995, the pingER project has been collecting data on ping packet loss and round-trip times. As of January 2000, there are 28 monitoring sites in 15 countries gathering data on over 2,000 end-to-end pairs. HENP labs such as SLAC, Fermilab and CERN are using Advanced Network's Surveyor project to monitor performance based on the one-way delay of UDP packets. More recently, several HENP sites have become involved with NLANR's active measurement program (AMP). In addition, SLAC and CERN are part of the RIPE test-traffic project, and SLAC is home to a NIMI machine. This large end-to-end performance monitoring infrastructure allows the HENP networking community to chart long-term trends and closely examine short-term glitches across a wide range of networks and connections. The different methodologies provide opportunities to compare results based on different protocols and statistical samples. Understanding agreements and discrepancies between results provides particular insight into the nature of the network. This paper will highlight the practical side of monitoring by reviewing the special needs of High Energy Nuclear and Particle Physics experiments and provide an overview of the experience of measuring performance across a large number of interconnected networks throughout the world with various methodologies. In particular, results
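
The core metrics pingER collects per monitored end-to-end pair, packet loss and round-trip time, reduce to a simple computation over ping samples (the sample data below is invented; a `None` entry stands for a lost ping):

```python
# Summarize one monitored end-to-end pair: loss percentage and RTT statistics.
rtts_ms = [142, 138, None, 150, 139, 141, None, 145, 137, 140]

received = [r for r in rtts_ms if r is not None]
loss_pct = 100 * (len(rtts_ms) - len(received)) / len(rtts_ms)
print(f"loss: {loss_pct:.0f}%  min/avg/max RTT: "
      f"{min(received)}/{sum(received)/len(received):.1f}/{max(received)} ms")
```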

  18. Practical End-to-End Performance Testing Tool for High Speed 3G-Based Networks

    Science.gov (United States)

    Shinbo, Hiroyuki; Tagami, Atsushi; Ano, Shigehiro; Hasegawa, Toru; Suzuki, Kenji

    High-speed IP communication is a killer application for 3rd-generation (3G) mobile systems. Thus 3G network operators should perform extensive tests to check whether the expected end-to-end performance is provided to customers under various environments. An important objective of such tests is to check whether network nodes fulfill requirements on packet-processing durations, because long processing durations cause performance degradation. This requires testers (the persons conducting the tests) to know precisely how long a packet is held by various network nodes. Without a tool's help, this task is time-consuming and error-prone. We therefore propose a multi-point packet header analysis tool which extracts and records packet headers with synchronized timestamps at multiple observation points. The recorded packet headers enable testers to calculate such holding durations. The notable feature of this tool is that it is implemented on off-the-shelf hardware platforms, i.e., laptop personal computers. The key challenges of the implementation are precise clock synchronization without any special hardware and a sophisticated header extraction algorithm that avoids packet drops.
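
The tool's core computation is straightforward once headers carry synchronized timestamps: match the same packet at two observation points and subtract (the data layout below is invented for illustration):

```python
# Holding duration at a node = timestamp at the downstream observation point
# minus timestamp at the upstream point, matched per packet ID.

def holding_durations(point_a, point_b):
    """Match packets by ID; return {packet_id: t_b - t_a} in seconds."""
    seen_a = {pid: t for pid, t in point_a}
    return {pid: t - seen_a[pid] for pid, t in point_b if pid in seen_a}

# (packet_id, timestamp) pairs captured before and after a network node.
before = [(1, 0.000), (2, 0.010), (3, 0.020)]
after  = [(1, 0.004), (2, 0.013)]          # packet 3 not yet observed

durations = holding_durations(before, after)
print({p: round(d, 6) for p, d in durations.items()})  # {1: 0.004, 2: 0.003}
```

The hard part, which the paper addresses, is making the timestamps comparable across laptops without special clock hardware; the subtraction itself is trivial.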

  19. Functional Partitioning to Optimize End-to-End Performance on Many-core Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Li, Min [Virginia Polytechnic Institute and State University (Virginia Tech); Vazhkudai, Sudharshan S [ORNL; Butt, Ali R [Virginia Polytechnic Institute and State University (Virginia Tech); Meng, Fei [ORNL; Ma, Xiaosong [ORNL; Kim, Youngjae [ORNL; Engelmann, Christian [ORNL; Shipman, Galen M [ORNL

    2010-01-01

    Scaling computations on emerging massive-core supercomputers is a daunting task, and the significantly lagging system I/O capabilities exacerbate applications' end-to-end performance. The I/O bottleneck often negates the potential performance benefits of assigning additional compute cores to an application. In this paper, we address this issue via a novel functional partitioning (FP) runtime environment that allocates cores to specific application tasks - checkpointing, de-duplication, and scientific data format transformation - so that the deluge of cores can be brought to bear on the entire gamut of application activities. The focus is on utilizing the extra cores to support HPC application I/O activities and also on leveraging solid-state disks in this context. For example, our evaluation shows that dedicating one core on an octo-core machine to checkpointing and its assist tasks using FP can improve the overall execution time of a FLASH benchmark on 80 and 160 cores by 43.95% and 41.34%, respectively.
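
The functional-partitioning idea can be sketched with threads standing in for dedicated cores (a minimal sketch only: the FP runtime in the paper dedicates whole cores and also handles de-duplication, format conversion, and SSD staging, none of which is modeled here):

```python
# One service thread absorbs checkpoint I/O so compute workers never block on
# writes; workers just enqueue their state and keep computing.
import queue
import threading

def checkpoint_service(q, written):
    """Dedicated 'I/O core': drain checkpoint requests and record them."""
    while True:
        item = q.get()
        if item is None:              # sentinel: shut down
            break
        written.append(item)          # stand-in for a write to stable storage

def worker(rank, q, steps=3):
    for step in range(steps):
        # ... compute phase would go here ...
        q.put((rank, step))           # hand off the checkpoint, keep computing

q = queue.Queue()
written = []
service = threading.Thread(target=checkpoint_service, args=(q, written))
service.start()
workers = [threading.Thread(target=worker, args=(r, q)) for r in range(2)]
for w in workers:
    w.start()
for w in workers:
    w.join()
q.put(None)
service.join()
print(f"{len(written)} checkpoints handled by the dedicated service thread")
```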

  20. A Workflow-based Intelligent Network Data Movement Advisor with End-to-end Performance Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Michelle M. [Southern Illinois Univ., Carbondale, IL (United States); Wu, Chase Q. [Univ. of Memphis, TN (United States)

    2013-11-07

    Next-generation eScience applications often generate large amounts of simulation, experimental, or observational data that must be shared and managed by collaborative organizations. Advanced networking technologies and services have been rapidly developed and deployed to facilitate such massive data transfer. However, these technologies and services have not been fully utilized, mainly because their use typically requires significant domain knowledge and, in many cases, application users are not even aware of their existence. By leveraging the functionalities of an existing Network-Aware Data Movement Advisor (NADMA) utility, we propose a new Workflow-based Intelligent Network Data Movement Advisor (WINDMA) with end-to-end performance optimization for this DOE-funded project. This WINDMA system integrates three major components: resource discovery, data movement, and status monitoring, and supports the sharing of common data movement workflows through account and database management. The system provides a web interface and interacts with existing data/space management and discovery services such as Storage Resource Management, transport methods such as GridFTP and GlobusOnline, and network resource provisioning brokers such as ION and OSCARS. We demonstrate the efficacy of the proposed transport-support workflow system in several use cases based on its implementation and deployment in DOE wide-area networks.

  1. End-to-End Traffic Flow Modeling of the Integrated SCaN Network

    Science.gov (United States)

    Cheung, K.-M.; Abraham, D. S.

    2012-05-01

    In this article, we describe the analysis and simulation effort of the end-to-end traffic flow for the Integrated Space Communications and Navigation (SCaN) Network. Using the network traffic derived for the 30-day period of July 2018 from the Space Communications Mission Model (SCMM), we generate the wide-area network (WAN) bandwidths of the ground links for different architecture options of the Integrated SCaN Network. We also develop a new analytical scheme to model the traffic flow and buffering mechanism of a store-and-forward network. It is found that the WAN bandwidth of the Integrated SCaN Network is an important differentiator of different architecture options, as the recurring circuit costs of certain architecture options can be prohibitively high.
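
The store-and-forward buffering question the article models analytically can be illustrated with a discrete-time sketch (the traffic profile, link rate, and time step below are invented round numbers, not SCMM-derived values):

```python
# Drain-at-fixed-rate buffer model: data arriving faster than the WAN link
# rate accumulates in the ground-station buffer and drains afterwards.

def buffer_occupancy(arrivals_mbps, wan_rate_mbps, dt_s=1.0):
    """Return buffer occupancy (Mbit) after each time step."""
    backlog, trace = 0.0, []
    for a in arrivals_mbps:
        backlog = max(0.0, backlog + (a - wan_rate_mbps) * dt_s)
        trace.append(backlog)
    return trace

# A contact pass: 5 s burst at 300 Mbit/s into a 100 Mbit/s WAN link.
traffic = [300] * 5 + [0] * 12
trace = buffer_occupancy(traffic, wan_rate_mbps=100)
print(f"peak backlog: {max(trace):.0f} Mbit, "
      f"buffer non-empty for {sum(t > 0 for t in trace)} s")
```

Sweeping the WAN rate in such a model shows the bandwidth/buffering trade-off that makes WAN bandwidth a cost differentiator between architecture options.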

  2. Performance Enhancements of UMTS networks using end-to-end QoS provisioning

    DEFF Research Database (Denmark)

    Wang, Haibo; Prasad, Devendra; Teyeb, Oumer

    2005-01-01

    This paper investigates end-to-end (E2E) quality of service (QoS) provisioning approaches for UMTS networks together with DiffServ IP networks. The effort was put on QoS class mapping from DiffServ to UMTS, Access Control (AC), and buffering and scheduling optimization. The DiffServ Code Point (DSCP...

  3. End-to-end models for marine ecosystems: Are we on the precipice of a significant advance or just putting lipstick on a pig?

    Directory of Open Access Journals (Sweden)

    Kenneth A. Rose

    2012-02-01

    Full Text Available There has been a rapid rise in the development of end-to-end models for marine ecosystems over the past decade. Reasons for this rise include the need to predict effects of climate change on biota and dissatisfaction with existing models. While the benefits of a well-implemented end-to-end model are straightforward, there are many challenges. In the short term, my view is that the major role of end-to-end models is to push the modelling community forward, and to identify critical data so that these data can be collected now and thus be available for the next generation of end-to-end models. I think we should emulate physicists and build theoretically-oriented models first, and then collect the data. In the long term, end-to-end models will increase their skill, data collection will catch up, and end-to-end models will move towards site-specific applications with forecasting and management capabilities. One pathway into the future is individual efforts, over-promising, and repackaging of poorly performing component submodels (“lipstick on a pig”). The other pathway is a community-based collaborative effort, with appropriate caution and thoughtfulness, so that the needed improvements are achieved (“significant advance”). The promise of end-to-end modelling is great. We should act now to avoid missing a great opportunity.

  4. Prediction, scenarios and insight: The uses of an end-to-end model

    Science.gov (United States)

    Steele, John H.

    2012-09-01

    A major function of ecosystem models is to provide extrapolations from observed data in terms of predictions, scenarios or insight. These models can operate at various levels of taxonomic resolution, such as total community production, abundance of functional groups, or species composition, depending on the data input as drivers. A 40-year dynamic simulation of end-to-end processes in the Georges Bank food web is used to illustrate the input/output relations and the insights gained at the three levels of food web aggregation. The focus is on the intermediate level and the longer-term changes in three functional fish guilds - planktivores, benthivores and piscivores - in terms of three ecosystem-based metrics - nutrient input, relative productivity of plankton and benthos, and food intake by juvenile fish. These simulations can describe the long-term constraints imposed on guild structure and productivity by energy fluxes over the 40 years, but cannot explain concurrent switches in abundance of individual species within guilds. Comparing time series data for individual species with model output provides insights, but including the data in the model would confer only limited extra information. The advantages and limitations of the three levels of model resolution in relation to ecosystem-based management are: (1) The correlations between primary production and total yield of fish imply a “bottom-up” constraint on end-to-end energy flow through the food web that can provide predictions of such yields. (2) Functionally defined metrics such as nutrient input, relative productivity of plankton and benthos, and food intake by juvenile fish represent bottom-up, mid-level and top-down forcing of the food web. Model scenarios using these metrics can demonstrate constraints on the productivity of these functionally defined guilds within the limits set by (1). (3) Comparisons of guild simulations with time series of fish species provide insight into the switches in species dominance

  5. Impacts of the Deepwater Horizon oil spill evaluated using an end-to-end ecosystem model.

    Science.gov (United States)

    Ainsworth, Cameron H; Paris, Claire B; Perlin, Natalie; Dornberger, Lindsey N; Patterson, William F; Chancellor, Emily; Murawski, Steve; Hollander, David; Daly, Kendra; Romero, Isabel C; Coleman, Felicia; Perryman, Holly

    2018-01-01

    We use a spatially explicit biogeochemical end-to-end ecosystem model, Atlantis, to simulate impacts from the Deepwater Horizon oil spill and subsequent recovery of fish guilds. Dose-response relationships with expected oil concentrations were utilized to estimate the impact on fish growth and mortality rates. We also examine the effects of fisheries closures and impacts on recruitment. We validate predictions of the model by comparing population trends and age structure before and after the oil spill with fisheries independent data. The model suggests that recruitment effects and fishery closures had little influence on biomass dynamics. However, at the assumed level of oil concentrations and toxicity, impacts on fish mortality and growth rates were large and commensurate with observations. Sensitivity analysis suggests the biomass of large reef fish decreased by 25% to 50% in areas most affected by the spill, and biomass of large demersal fish decreased even more, by 40% to 70%. Impacts on reef and demersal forage caused starvation mortality in predators and increased reliance on pelagic forage. Impacts on the food web translated effects of the spill far away from the oiled area. Effects on age structure suggest possible delayed impacts on fishery yields. Recovery of high-turnover populations generally is predicted to occur within 10 years, but some slower-growing populations may take 30+ years to fully recover.
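The dose-response coupling described above can be sketched in isolation. The following is a minimal illustration, not Atlantis code: a hypothetical log-logistic dose-response curve (the `ec50` and `slope` values and the maximum rate changes are invented for the example) scales a guild's baseline mortality up and its growth rate down as oil concentration rises.

```python
def dose_response(conc_ppm, ec50=50.0, slope=2.0):
    """Log-logistic dose-response: fraction of the maximum effect at a
    given oil concentration (hypothetical EC50 and slope)."""
    if conc_ppm <= 0:
        return 0.0
    return 1.0 / (1.0 + (ec50 / conc_ppm) ** slope)

def adjusted_rates(base_mortality, base_growth, conc_ppm,
                   max_mort_increase=0.5, max_growth_decrease=0.4):
    """Scale a baseline mortality rate up and a growth rate down by the
    dose-response effect (caps are illustrative, not the paper's)."""
    effect = dose_response(conc_ppm)
    return (base_mortality * (1.0 + max_mort_increase * effect),
            base_growth * (1.0 - max_growth_decrease * effect))

# at the EC50 the effect is exactly 0.5
m, g = adjusted_rates(0.2, 1.0, conc_ppm=50.0)
```

In a full model these adjusted rates would feed back into the population dynamics each time step; here they simply show the shape of the perturbation.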

  7. End-to-end integrated security and performance analysis on the DEGAS Choreographer platform

    DEFF Research Database (Denmark)

    Buchholtz, Mikael; Gilmore, Stephen; Haenel, Valentin

    2005-01-01

We present a software tool platform which facilitates security and performance analysis of systems which starts and ends with UML model descriptions. A UML project is presented to the platform for analysis, formal content is extracted in the form of process calculi descriptions, analysed with the...

  8. End to End Travel

    Data.gov (United States)

    US Agency for International Development — E2 Solutions is a web based end-to-end travel management tool that includes paperless travel authorization and voucher document submissions, document approval...

  9. Increasing Army Supply Chain Performance: Using an Integrated End to End Metrics System

    Science.gov (United States)

    2017-01-01

view of strategic, tactical, and operational measures. With the right measures and analytical capabilities, individual supply chain managers will... for supply chain performance measurement. International Journal of Production Economics, 87(3), 333–347. Henderson, R. (1994). Managing innovation... in operations management: Dealing with the metrics maze. Journal of Operations Management, 22(3), 209–217. Neidert, L. (2011). Stock-out as it

  10. Evaluation of Techniques to Detect Significant Network Performance Problems using End-to-End Active Network Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Cottrell, R.Les; Logg, Connie; Chhaparia, Mahesh; /SLAC; Grigoriev, Maxim; /Fermilab; Haro, Felipe; /Chile U., Catolica; Nazir, Fawad; /NUST, Rawalpindi; Sandford, Mark

    2006-01-25

End-to-end detection of fault and performance problems in wide-area production networks is becoming increasingly hard as the complexity of the paths, the diversity of the performance, and dependency on the network increase. Several monitoring infrastructures have been built to monitor different network metrics and collect monitoring information from thousands of hosts around the globe. Typically there are hundreds to thousands of time-series plots of network metrics which need to be examined to identify network performance problems or anomalous variations in the traffic. Furthermore, most commercial products rely on a comparison with user-configured static thresholds and often require access to SNMP-MIB information, to which a typical end-user does not usually have access. In this paper we propose new techniques to detect network performance problems proactively in close to real time, without relying on static thresholds or SNMP-MIB information. We describe and compare several different algorithms that we have implemented to detect persistent network problems using anomalous-variation analysis in real end-to-end Internet performance measurements. We also provide methods and guidance for setting the user-settable parameters. The measurements are based on active probes running on 40 production network paths with bottlenecks varying from 0.5 Mbit/s to 1000 Mbit/s. For well-behaved data (no missed measurements and no very large outliers) with small seasonal changes, most algorithms identify similar events. We compare the algorithms' robustness with respect to false positives and missed events, especially when there are large seasonal effects in the data. Our proposed techniques cover a wide variety of network paths and traffic patterns. We also discuss the applicability of the algorithms in terms of their intuitiveness, their speed of execution as implemented, and areas of applicability. Our encouraging results compare and evaluate the accuracy of our...
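A minimal illustration of threshold-free detection in this spirit (not one of the paper's algorithms): an exponentially weighted moving average tracks each time series, and a point is flagged when it deviates from the running mean by more than k adaptive standard deviations. The smoothing factor, k, and warm-up length are illustrative choices.

```python
def ewma_anomalies(series, alpha=0.2, k=3.0, warmup=5):
    """Flag points deviating more than k adaptive standard deviations
    from an exponentially weighted moving average (no static thresholds,
    no SNMP-MIB data needed)."""
    mean = series[0]
    var = 0.0
    flags = []
    for i, x in enumerate(series):
        std = var ** 0.5
        # test the point against the state learned so far
        flags.append(i >= warmup and std > 0 and abs(x - mean) > k * std)
        # then fold the point into the EWMA mean and variance
        diff = x - mean
        mean += alpha * diff
        var = (1 - alpha) * (var + alpha * diff * diff)
    return flags

# a flat RTT series with small jitter and one sharp spike
data = [10.0] * 20 + [10.2, 9.9, 30.0, 10.1]
flags = ewma_anomalies(data)
```

Because the baseline adapts, slow drifts are absorbed while abrupt persistent changes stand out; a production detector would also handle seasonality, which this sketch does not.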

  11. End-to-end performance of cooperative relaying in spectrum-sharing systems with quality of service requirements

    KAUST Repository

    Asghari, Vahid Reza

    2011-07-01

We propose adopting a cooperative relaying technique in spectrum-sharing cognitive radio (CR) systems to more effectively and efficiently utilize available transmission resources, such as power, rate, and bandwidth, while adhering to the quality of service (QoS) requirements of the licensed (primary) users of the shared spectrum band. In particular, we first consider that the cognitive (secondary) user's communication is assisted by an intermediate relay that implements the decode-and-forward (DF) technique onto the secondary user's relayed signal to help with communication between the corresponding source and the destination nodes. In this context, we obtain first-order statistics pertaining to the first- and second-hop transmission channels, and then, we investigate the end-to-end performance of the proposed spectrum-sharing cooperative relaying system under resource constraints defined to assure that the primary QoS is unaffected. Specifically, we investigate the overall average bit error rate (BER), ergodic capacity, and outage probability of the secondary's communication subject to appropriate constraints on the interference power at the primary receivers. We then consider a general scenario where a cluster of relays is available between the secondary source and destination nodes. In this case, making use of the partial relay selection method, we generalize our results for the single-relay scheme and obtain the end-to-end performance of the cooperative spectrum-sharing system with a cluster of L available relays. Finally, we examine our theoretical results through simulations and comparisons, illustrating the overall performance of the proposed spectrum-sharing cooperative system and quantifying its advantages for different operating scenarios and conditions. © 2011 IEEE.
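The outage behaviour of such a spectrum-sharing dual-hop DF link can be approximated numerically. The sketch below is a Monte Carlo stand-in for the paper's closed-form analysis: each transmit power is capped by the interference constraint Q at the primary receiver, and an outage is declared when the weaker hop falls below the SNR threshold. Unit-mean Rayleigh fading and unit noise power are simplifying assumptions of the sketch.

```python
import random

def df_outage_probability(q_db=0.0, snr_th_db=3.0, trials=20000, seed=1):
    """Monte Carlo outage estimate for a two-hop decode-and-forward
    link whose transmit power is limited by an interference cap Q at
    the primary receiver. Unit-mean Rayleigh fading (exponential
    channel power gains) and unit noise power are assumed."""
    rng = random.Random(seed)
    q = 10 ** (q_db / 10)
    snr_th = 10 ** (snr_th_db / 10)
    outages = 0
    for _ in range(trials):
        g_sp = rng.expovariate(1.0)   # source -> primary receiver
        g_rp = rng.expovariate(1.0)   # relay  -> primary receiver
        g_sr = rng.expovariate(1.0)   # source -> relay
        g_rd = rng.expovariate(1.0)   # relay  -> destination
        snr1 = (q / g_sp) * g_sr      # transmit power capped at Q/g_sp
        snr2 = (q / g_rp) * g_rd
        if min(snr1, snr2) < snr_th:  # DF end-to-end SNR = min of hops
            outages += 1
    return outages / trials

p_out = df_outage_probability()
```

Raising the interference cap Q loosens the power constraint, so the estimated outage probability should fall, which is a quick sanity check on the simulation.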

  12. Far-Infrared Therapy Promotes Nerve Repair following End-to-End Neurorrhaphy in Rat Models of Sciatic Nerve Injury

    Directory of Open Access Journals (Sweden)

    Tai-Yuan Chen

    2015-01-01

Full Text Available This study employed a rat model of sciatic nerve injury to investigate the effects of postoperative low-power far-infrared (FIR) radiation therapy on nerve repair following end-to-end neurorrhaphy. The rat models were divided into the following 3 groups: (1) nerve injury without FIR biostimulation (NI/sham group); (2) nerve injury with FIR biostimulation (NI/FIR group); and (3) noninjured controls (normal group). Walking-track analysis results showed that the NI/FIR group exhibited significantly higher sciatic functional indices at 8 weeks after surgery (P<0.05) compared with the NI/sham group. The decreased expression of CD4 and CD8 in the NI/FIR group indicated that FIR irradiation modulated the inflammatory process during recovery. Compared with the NI/sham group, the NI/FIR group exhibited a significant reduction in muscle atrophy (P<0.05). Furthermore, histomorphometric assessment indicated that the nerves regenerated more rapidly in the NI/FIR group than in the NI/sham group; furthermore, the NI/FIR group regenerated neural tissue over a larger area, as well as nerve fibers of greater diameter and with thicker myelin sheaths. Functional recovery, inflammatory response, muscular reinnervation, and histomorphometric assessment all indicated that FIR radiation therapy can accelerate nerve repair following end-to-end neurorrhaphy of the sciatic nerve.

  13. West Coast fish, mammal, bird life history and abundance parameters - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  14. West Coast fish, mammal, and bird species diets - Developing end-to-end models of the California Current Large Marine Ecosystem

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the California Current LME, linking oceanography, biogeochemistry, food web...

  15. Gulf of California species and catch spatial distributions and historical time series - Developing end-to-end models of the Gulf of California

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to develop spatially discrete end-to-end models of the northern Gulf of California, linking oceanography, biogeochemistry, food web...

  16. Secondary link adaptation in cognitive radio networks: End-to-end performance with cross-layer design

    KAUST Repository

    Ma, Hao

    2012-04-01

Under spectrum-sharing constraints, we consider a secondary link exploiting cross-layer combining of adaptive modulation and coding (AMC) at the physical layer with truncated automatic repeat request (T-ARQ) at the data link layer in cognitive radio networks. Both basic AMC and aggressive AMC are adopted to optimize the overall average spectral efficiency, subject to the interference constraints imposed by the primary user of the shared spectrum band and a target packet loss rate. We obtain the optimal boundary points in closed form for choosing the AMC transmission modes, taking into account the channel state information from the secondary transmitter to both the primary receiver and the secondary receiver. Moreover, numerical results substantiate that, without any cost in transmitter/receiver design or end-to-end delay, the scheme with aggressive AMC outperforms that with conventional AMC. The main reason is that, with aggressive AMC, the different transmission modes used in the initial packet transmission and the following retransmissions match the time-varying channel conditions better than the basic pattern. © 2012 IEEE.
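Mode selection against SNR boundary points can be sketched as a simple lookup. The boundaries and the mode set below are illustrative placeholders, not the closed-form optima derived in the paper:

```python
def select_amc_mode(snr_db, boundaries_db=(-1e9, 5.0, 10.0, 15.0, 20.0)):
    """Pick the highest-rate AMC transmission mode whose SNR boundary
    point is met. The boundary values and mode table are example
    numbers, not the paper's optimized thresholds."""
    # modes: (name, spectral efficiency in bits/s/Hz) -- example set
    modes = [("BPSK 1/2", 0.5), ("QPSK 1/2", 1.0),
             ("QPSK 3/4", 1.5), ("16QAM 3/4", 3.0), ("64QAM 3/4", 4.5)]
    mode = modes[0]
    for bound, m in zip(boundaries_db, modes):
        if snr_db >= bound:
            mode = m   # keep climbing while the boundary is met
    return mode

name, rate = select_amc_mode(12.0)   # falls in the [10, 15) dB region
```

In the aggressive variant described above, the thresholds used for retransmissions would differ from those of the initial transmission; this sketch shows only the basic lookup.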

  17. Demonstration of a fully-coupled end-to-end model for small pelagic fish using sardine and anchovy in the California Current

    Science.gov (United States)

    Rose, Kenneth A.; Fiechter, Jerome; Curchitser, Enrique N.; Hedstrom, Kate; Bernal, Miguel; Creekmore, Sean; Haynie, Alan; Ito, Shin-ichi; Lluch-Cota, Salvador; Megrey, Bernard A.; Edwards, Chris A.; Checkley, Dave; Koslow, Tony; McClatchie, Sam; Werner, Francisco; MacCall, Alec; Agostini, Vera

    2015-11-01

We describe and document an end-to-end model of anchovy and sardine population dynamics in the California Current as a proof of principle that such coupled models can be developed and implemented. The end-to-end model is 3-dimensional, time-varying, and multispecies, and consists of four coupled submodels: hydrodynamics, Eulerian nutrient-phytoplankton-zooplankton (NPZ), an individual-based full life cycle anchovy and sardine submodel, and an agent-based fishing fleet submodel. A predator roughly mimicking albacore was included as individuals that consumed anchovy and sardine. All submodels were coded within the ROMS open-source community model, used the same resolution spatial grid, and were solved simultaneously to allow for possible feedbacks among the submodels. We used a super-individual approach and solved the coupled models on a distributed-memory parallel computer, both of which posed challenging but resolvable bookkeeping problems. The anchovy and sardine growth, mortality, reproduction, and movement, and the fishing fleet submodel, were each calibrated using simplified grids before being inserted into the full end-to-end model. An historical simulation of 1959-2008 was performed, and the latter 45 years analyzed. Sea surface height (SSH) and sea surface temperature (SST) for the historical simulation showed strong horizontal gradients and multi-year scale temporal oscillations related to various climate indices (PDO, NPGO), and both showed responses to ENSO variability. Simulated total phytoplankton was lower during strong El Nino events and higher for the strong 1999 La Nina event. The three zooplankton groups generally corresponded to the spatial and temporal variation in simulated total phytoplankton. Simulated biomasses of anchovy and sardine were within the historical range of observed biomasses, but predicted biomasses showed much less inter-annual variation. Anomalies of annual biomasses of anchovy and sardine showed a switch in the mid

  18. End-to-end verifiability

    OpenAIRE

    Benaloh, Josh; Rivest, Ronald; Ryan, Peter Y. A.; Stark, Philip; Teague, Vanessa; Vora, Poorvi

    2015-01-01

This pamphlet describes end-to-end election verifiability (E2E-V) for a nontechnical audience: election officials, public policymakers, and anyone else interested in secure, transparent, evidence-based electronic elections. This work is part of the Overseas Vote Foundation's End-to-End Verifiable Internet Voting: Specification and Feasibility Assessment Study (E2E VIV Project), funded by the Democracy Fund.

  19. One stage functional end-to-end stapled intestinal anastomosis and resection performed by nonexpert surgeons for the treatment of small intestinal obstruction in 30 dogs.

    Science.gov (United States)

    Jardel, Nicolas; Hidalgo, Antoine; Leperlier, Dimitri; Manassero, Mathieu; Gomes, Aymeric; Bedu, Anne Sophie; Moissonnier, Pierre; Fayolle, Pascal; Begon, Dominique; Riquois, Elisabeth; Viateau, Véronique

    2011-02-01

To describe stapled 1-stage functional end-to-end intestinal anastomosis for treatment of small intestinal obstruction in dogs and evaluate outcome when the technique is performed by nonexpert surgeons after limited training in the technique. Case series. Dogs (n=30) with intestinal lesions requiring an enterectomy. Stapled 1-stage functional end-to-end anastomosis and resection using GIA-60 and TA-55 stapling devices were performed under supervision of senior residents and faculty surgeons by junior surgeons previously trained in the technique on pigs. Procedure duration and technical problems were recorded. Short-term results were collected during hospitalization and at suture removal. Long-term outcome was established by clinical and ultrasonographic examinations at least 2 months after surgery and from written questionnaires completed by owners. Mean±SD procedure duration was 15±12 minutes. Postoperative recovery was uneventful in 25 dogs. One dog had anastomotic leakage, 1 had a localized abscess at the transverse staple line, and 3 dogs developed an incisional abdominal wall abscess. No long-term complications occurred (follow-up, 2-32 months). Stapled 1-stage functional end-to-end anastomosis and resection is a fast and safe procedure in the hands of nonexpert but trained surgeons. © Copyright 2011 by The American College of Veterinary Surgeons.

  20. An integrated end-to-end modeling framework for testing ecosystem-wide effects of human-induced pressures in the Baltic Sea

    DEFF Research Database (Denmark)

    Palacz, Artur; Nielsen, J. Rasmus; Christensen, Asbjørn

We present an integrated end-to-end modeling framework that enables whole-of-ecosystem climate, eutrophication, and spatial management scenario exploration in the Baltic Sea. The framework is built around the Baltic implementation of the spatially-explicit end-to-end ATLANTIS model, linked to the high-resolution coupled physical-biological model HBM-ERGOM and the fisheries bio-economic FishRent model. We investigate ecosystem-wide responses to changes in human-induced pressures by simulating several eutrophication scenarios that are relevant to existing Baltic Sea management plans (e.g. EU BSAP, EU CFP). We further present the structure and calibration of the Baltic ATLANTIS model and the operational linkage to the other models. Using the results of eutrophication scenarios, and focusing on the relative changes in fish and fishery production, we discuss the robustness of the model linking with respect to the underlying assumptions, strengths and weaknesses of individual models.

  1. An integrated end-to-end modeling framework for testing ecosystem-wide effects of human-induced pressures in the Baltic Sea

    DEFF Research Database (Denmark)

    Palacz, Artur; Nielsen, J. Rasmus; Christensen, Asbjørn

We present an integrated end-to-end modeling framework that enables whole-of-ecosystem climate, eutrophication, and spatial management scenario exploration in the Baltic Sea. The framework is built around the Baltic implementation of the spatially-explicit end-to-end ATLANTIS model, linked to the high-resolution coupled physical-biological model HBM-ERGOM and the fisheries bio-economic FishRent model. Using the results of eutrophication scenarios, and focusing on the relative changes in fish and fishery production, we discuss the robustness of the model linking with respect to the underlying assumptions, strengths and weaknesses of individual models. Furthermore, we describe how to possibly expand the framework to account for spatial impacts and economic consequences, for instance by linking to the individual-vessel-based DISPLACE modeling approach. We conclude that the proposed model integration and management scenario evaluation scheme lays the foundations for developing a robust framework for management strategy evaluation that is of strategic importance to stakeholders from around the Baltic Sea.

  2. Arcus end-to-end simulations

    Science.gov (United States)

    Wilms, Joern; Guenther, H. Moritz; Dauser, Thomas; Huenemoerder, David P.; Ptak, Andrew; Smith, Randall; Arcus Team

    2018-01-01

We present an overview of the end-to-end simulation environment that we are implementing as part of the Arcus phase A study. With the Arcus simulator, we aim to model the imaging, detection, and event reconstruction properties of the spectrometer. The simulator uses a Monte Carlo ray-trace approach, projecting photons onto the Arcus focal plane from the silicon pore optic mirrors and critical-angle transmission gratings. We simulate the detection and read-out of the photons in the focal plane CCDs with software originally written for the eROSITA and Athena-WFI detectors; we include all relevant detector physics, such as charge splitting, and effects of the detector read-out, such as out-of-time events. The output of the simulation chain is an event list that closely resembles the data expected during flight. This event list is processed using a prototype event reconstruction chain for the order separation, wavelength calibration, and effective area calibration. The output is compatible with standard X-ray astronomical analysis software. During phase A, the end-to-end simulation approach is used to demonstrate the overall performance of the mission, including a full simulation of the calibration effort. Continued development during later phases of the mission will ensure that the simulator remains a faithful representation of the true mission capabilities, and will ultimately be used as the Arcus calibration model.

  3. AN OPTIMIZATION MODEL TO MINIMIZE THE EXPECTED END-TO-END TRANSMISSION TIME IN WIRELESS MESH NETWORKS

    Directory of Open Access Journals (Sweden)

    Marlon da Silva

Full Text Available Time metrics are extremely important to evaluate the transmission performance of Wireless Mesh Networks (WMNs), whose main characteristic is the use of multihop technology to extend the network coverage area. One such metric is WCETT (Weighted Cumulative Expected Transmission Time), in which transmission times per hop are weighted for both proactive and reactive conditions. Furthermore, such metrics are able to detect delays that can degrade some network services. This paper presents an optimization model to minimize WCETT in a WMN, subject to constraints grouped by bandwidth, flow control and power control. As the model includes nonlinear constraints, we propose a heuristic to solve it, which divides the problem into two subproblems. The first subproblem maximizes the network link capacity and is solved with a Simulated Annealing algorithm. Given the link capacities obtained, the second subproblem minimizes the WCETTs and is formulated as a linear programming model. Some numerical results are presented, based on randomly generated WMN instances. Some of these results are compared with the results obtained by a commercial simulator in order to verify the coherence of the proposed heuristic for realistic scenarios.
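The WCETT metric itself is straightforward to compute once per-hop expected transmission times (ETTs) and channel assignments are known: it blends the cumulative ETT of the path with the ETT of the busiest channel via a weight beta. A minimal sketch:

```python
def wcett(hops, beta=0.5):
    """Weighted Cumulative Expected Transmission Time of a multi-hop
    path. hops is a list of (ett_seconds, channel_id) pairs; beta
    trades off total path time against the bottleneck channel."""
    total = sum(ett for ett, _ in hops)
    # sum ETTs per channel; the busiest channel is the bottleneck
    per_channel = {}
    for ett, ch in hops:
        per_channel[ch] = per_channel.get(ch, 0.0) + ett
    bottleneck = max(per_channel.values())
    return (1 - beta) * total + beta * bottleneck

# three hops, two of them sharing channel 1
path = [(0.02, 1), (0.03, 2), (0.05, 1)]
metric = wcett(path, beta=0.5)  # 0.5*0.10 + 0.5*0.07 = 0.085
```

The optimization model in the paper minimizes this quantity over routing and power decisions; the function above only evaluates it for a given path.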

  4. Healing of esophageal anastomoses performed with the biofragmentable anastomosis ring versus the end-to-end anastomosis stapler: comparative experimental study in dogs.

    Science.gov (United States)

    Kovács, Tibor; Köves, István; Orosz, Zsolt; Németh, Tibor; Pandi, Erzsébet; Kralovanszky, Judit

    2003-04-01

    The biofragmentable anastomosis ring (BAR) has been used successfully for anastomoses from the stomach to the upper rectum. The healing of intrathoracic esophageal anastomoses performed with the BAR or an end-to-end anastomosis (EEA) stapler on an experimental model was compared. Parameters of tissue repair were evaluated: macroscopic examination, bursting strength (BS), collagen (hydroxyproline, or HP), histology (H&E and Picrosirius red staining for collagen). A series of 48 mongrel dogs were randomly separated into two groups (30 BAR, 18 stapler) and subgroups according to the time of autopsy (days 4, 7, 14, 28). Mortality was 13.3% (4 BAR cases) with two deaths not related to surgery (excluded). There were four leaks in the BAR group (14.3%) and no leaks or deaths but two strictures in the stapler group. BS was significantly higher in the BAR group during the first week, and values were almost equal from the second week with both methods. The HP rate was significantly reduced on days 4 and 7 in both groups compared to the reference values; the values were close to reference values from the second week (lower in the BAR group). Stapled anastomoses caused less pronounced inflammation and were associated with an earlier start of regeneration, but the difference was not significant compared to that in the BAR group. Accumulation of new collagen (green polarization) started on day 7 in both groups, but maturation (orange-red polarization) was significantly more advanced in the BAR group after the second week. A strong linear correlation between the BS and HP rate was found with both methods. There was no significant difference in the complication rate or healing of intrathoracic BAR and stapled anastomoses. The BAR method is simple, quick, and safe; and it seems to be a feasible procedure for creating intrathoracic esophageal anastomoses in dogs.

  5. An end-to-end coupled model ROMS-N2P2Z2D2-OSMOSE of ...

    African Journals Online (AJOL)

    African Journal of Marine Science ... It is linked to the biogeochemical model through the predation process; plankton groups are food for fish and fish apply a predation mortality on plankton. Here we ... Keywords: individual-based model, model validation, pattern-oriented modelling, trophic interactions, two-way coupling

  6. Bridging Automatic Speech Recognition and Psycholinguistics: Extending Shortlist to an End-to-End Model of Human Speech Recognition

    NARCIS (Netherlands)

    Scharenborg, O.E.; Bosch, L.F.M. ten; Boves, L.W.J.; Norris, D.

    2003-01-01

    This letter evaluates potential benefits of combining human speech recognition (HSR) and automatic speech recognition by building a joint model of an automatic phone recognizer (APR) and a computational model of HSR, viz. Shortlist (Norris, 1994). Experiments based on 'real-life' speech highlight

  7. Bridging automatic speech recognition and psycholinguistics: Extending Shortlist to an end-to-end model of human speech recognition (L)

    Science.gov (United States)

    Scharenborg, Odette; ten Bosch, Louis; Boves, Lou; Norris, Dennis

    2003-12-01

This letter evaluates potential benefits of combining human speech recognition (HSR) and automatic speech recognition by building a joint model of an automatic phone recognizer (APR) and a computational model of HSR, viz., Shortlist [Norris, Cognition 52, 189-234 (1994)]. Experiments based on "real-life" speech highlight critical limitations posed by some of the simplifying assumptions made in models of human speech recognition. These limitations could be overcome by avoiding hard phone decisions at the output side of the APR, and by using a match between the input and the internal lexicon that flexibly copes with deviations from canonical phonemic representations.

  8. Utilizing Domain Knowledge in End-to-End Audio Processing

    DEFF Research Database (Denmark)

    Tax, Tycho; Antich, Jose Luis Diez; Purwins, Hendrik

    2017-01-01

End-to-end neural network based approaches to audio modelling are generally outperformed by models trained on high-level data representations. In this paper we present preliminary work that shows the feasibility of training the first layers of a deep convolutional neural network (CNN) model to learn the commonly-used log-scaled mel-spectrogram transformation. Secondly, we demonstrate that upon initializing the first layers of an end-to-end CNN classifier with the learned transformation, convergence and performance on the ESC-50 environmental sound classification dataset are similar to a CNN...
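The transformation the first layers are trained to approximate can be written compactly. The sketch below is a generic log-scaled mel spectrogram (frame length, hop size, and filter count are arbitrary choices, not the paper's configuration):

```python
import numpy as np

def log_mel_spectrogram(signal, sr=16000, n_fft=512, hop=256, n_mels=40):
    """Minimal log-scaled mel spectrogram: windowed power spectrogram
    followed by a triangular mel filterbank and a log. Illustrative
    parameters, not the paper's exact front end."""
    # short-time power spectrogram
    window = np.hanning(n_fft)
    frames = [signal[i:i + n_fft] * window
              for i in range(0, len(signal) - n_fft + 1, hop)]
    spec = np.abs(np.fft.rfft(np.array(frames), axis=1)) ** 2
    # triangular mel filterbank
    def hz_to_mel(f): return 2595 * np.log10(1 + f / 700)
    def mel_to_hz(m): return 700 * (10 ** (m / 2595) - 1)
    mel_pts = mel_to_hz(np.linspace(0, hz_to_mel(sr / 2), n_mels + 2))
    bins = np.floor((n_fft + 1) * mel_pts / sr).astype(int)
    fbank = np.zeros((n_mels, n_fft // 2 + 1))
    for m in range(1, n_mels + 1):
        l, c, r = bins[m - 1], bins[m], bins[m + 1]
        fbank[m - 1, l:c] = (np.arange(l, c) - l) / max(c - l, 1)
        fbank[m - 1, c:r] = (r - np.arange(c, r)) / max(r - c, 1)
    return np.log(spec @ fbank.T + 1e-10)

# one second of a 440 Hz tone
sig = np.sin(2 * np.pi * 440 * np.arange(16000) / 16000)
logmel = log_mel_spectrogram(sig)  # shape: (frames, n_mels)
```

A CNN front end that reproduces this mapping is essentially learning the filterbank matrix and the compressive nonlinearity from data.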

  9. Standardizing an End-to-end Accounting Service

    Science.gov (United States)

    Greenberg, Edward; Kazz, Greg

    2006-01-01

Currently there are no space system standards available for space agencies to accomplish end-to-end accounting. Such a standard does not exist for spacecraft operations nor for tracing the relationship between the mission planning activities, the command sequences designed to perform those activities, the commands formulated to initiate those activities, and the mission data and specifically the mission data products created by those activities. In order for space agencies to cross-support one another for data accountability/data tracing and for inter-agency spacecraft to interoperate with each other, an international CCSDS standard for end-to-end data accountability/tracing needs to be developed. We will first describe the end-to-end accounting service model and the functionality that supports the service. This model will describe how science plans that are ultimately transformed into commands can be associated with the telemetry products generated as a result of their execution. Moreover, the interaction between end-to-end accounting and service management will be explored. Finally, we will show how the standard end-to-end accounting service can be applied to a real-life flight project, i.e., the Mars Reconnaissance Orbiter project.

  10. Mixed integer nonlinear programming model of wireless pricing scheme with QoS attribute of bandwidth and end-to-end delay

    Science.gov (United States)

    Irmeilyana, Puspita, Fitri Maya; Indrawati

    2016-02-01

The pricing for wireless networks is developed by considering linearity factors, price elasticity, and price factors. A mixed-integer nonlinear programming model of the wireless pricing scheme is proposed as a nonlinear programming problem that can be solved optimally using LINGO 13.0. The solutions are expected to give some information about the connections between the acceptance factor and the price. Previous models focused on bandwidth as the sole QoS attribute. The models attempt to maximize the total price for a connection based on QoS parameters. The QoS attributes used here are the bandwidth and the end-to-end delay that affect the traffic. The maximum price is achieved when the provider determines the required increment or decrement of the price due to QoS changes and the amount of QoS value.
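As a toy stand-in for the MINLP model (which the abstract does not reproduce), the sketch below brute-forces a price over discrete bandwidth and end-to-end delay levels; the pricing function and acceptance factor are invented for illustration:

```python
from itertools import product

def best_price(bandwidth_levels, delay_levels, base_price=1.0,
               alpha=0.5, beta=0.3):
    """Toy version of the QoS pricing problem: the provider picks a
    bandwidth level b and an end-to-end delay level d (both normalized
    to (0, 1]) to maximize price * acceptance, where the price rises
    with bandwidth and falls with delay. alpha and beta are invented
    coefficients, not those of the paper's model."""
    best = None
    for b, d in product(bandwidth_levels, delay_levels):
        price = base_price * (1 + alpha * b - beta * d)
        acceptance = b * (1 - d)          # users prefer high b, low d
        revenue = price * acceptance
        if best is None or revenue > best[0]:
            best = (revenue, b, d, price)
    return best

revenue, b, d, price = best_price([0.25, 0.5, 1.0], [0.1, 0.3, 0.6])
```

The actual model is solved with LINGO over continuous and integer variables; brute force over a small grid is only meant to make the price/QoS trade-off concrete.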

  11. SU-E-T-360: End-To-End Dosimetric Testing of a Versa HD Linear Accelerator with the Agility Head Modeled in Pinnacle3

    Energy Technology Data Exchange (ETDEWEB)

    Saenz, D; Narayanasamy, G; Cruz, W; Papanikolaou, N; Stathakis, S [University of Texas Health Science Center at San Antonio, San Antonio, TX (United States)

    2015-06-15

Purpose: The Versa HD incorporates a variety of upgrades, primarily including the Agility head. The dosimetric properties of the head, distinct from its predecessors, combined with flattening-filter-free (FFF) beams require a new investigation of modeling in planning systems and verification of modeling accuracy. Methods: A model was created in Pinnacle3 v9.8 with commissioned beam data. Leaf transmission was modeled as <0.5% with a maximum leaf speed of 3 cm/s. Photon spectra were tuned for FFF beams, for which profiles were modeled with arbitrary profiles rather than with cones. For verification, a variety of plans with varied parameters were devised, and point dose measurements were compared to calculated values. A phantom of several plastic water and Styrofoam slabs was scanned and imported into Pinnacle3. Beams of different field sizes, SSD, wedges, and gantry angles were created. All available photon energies (6 MV, 10 MV, 18 MV, 6 FFF, 10 FFF) as well as four clinical electron energies (6, 9, 12, and 15 MeV) were investigated. The plans were verified at a calculation point (8 cm deep for photons, variable for electrons) by measurement with a PTW Semiflex ionization chamber. In addition, IMRT testing was performed with three standard plans (step-and-shoot IMRT, small and large field VMAT plans). The plans were delivered on the Delta4 IMRT QA phantom (ScandiDos, Uppsala, Sweden). Results: Homogeneous point dose measurements agreed within 2% for all photon and electron beams. Open field photon measurements along the central axis at 100 cm SSD passed within 1%. Gamma passing rates were >99.5% for all plans with a 3%/3mm tolerance criterion. The IMRT QA results for the first 23 patients yielded gamma passing rates of 97.4±2.3%. Conclusion: The end-to-end testing ensured confidence in the ability of Pinnacle3 to model photon and electron beams with the Agility head.
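The 3%/3mm gamma criterion used for the IMRT comparisons can be illustrated with a simplified one-dimensional global gamma analysis (real QA software such as the Delta4's operates on 2-D/3-D dose grids with subsample interpolation; this is only a sketch):

```python
def gamma_pass_rate(measured, calculated, spacing_mm=1.0,
                    dose_tol=0.03, dist_tol_mm=3.0):
    """Simplified 1-D global gamma analysis (3%/3mm by default):
    for each measured point, search the calculated profile for the
    minimum combined dose-difference/distance deviation; the point
    passes if the resulting gamma is <= 1."""
    ref_max = max(calculated)          # global normalization dose
    passed = 0
    for i, dm in enumerate(measured):
        best = float("inf")
        for j, dc in enumerate(calculated):
            dist = abs(i - j) * spacing_mm
            dose_diff = (dm - dc) / (dose_tol * ref_max)
            gamma_sq = (dist / dist_tol_mm) ** 2 + dose_diff ** 2
            best = min(best, gamma_sq)
        if best <= 1.0:
            passed += 1
    return 100.0 * passed / len(measured)

# a calculated profile and a measurement with small dose deviations
calc = [0.0, 0.2, 0.6, 1.0, 0.6, 0.2, 0.0]
meas = [0.0, 0.21, 0.61, 1.0, 0.59, 0.2, 0.0]
rate = gamma_pass_rate(meas, calc)  # all points within tolerance
```

The quoted passing rates (>99.5%, 97.4±2.3%) are the fraction of points with gamma at or below 1 under exactly this kind of combined dose/distance test, evaluated on full dose distributions.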

  12. Distributed Large Data-Object Environments: End-to-End Performance Analysis of High Speed Distributed Storage Systems in Wide Area ATM Networks

    Science.gov (United States)

    Johnston, William; Tierney, Brian; Lee, Jason; Hoo, Gary; Thompson, Mary

    1996-01-01

    We have developed and deployed a distributed-parallel storage system (DPSS) in several high-speed asynchronous transfer mode (ATM) wide area network (WAN) testbeds to support several different types of data-intensive applications. Architecturally, the DPSS is a network striped disk array, but it is fairly unique in that its implementation allows applications complete freedom to determine optimal data layout, replication and/or coding redundancy strategy, security policy, and dynamic reconfiguration. In conjunction with the DPSS, we have developed a 'top-to-bottom, end-to-end' performance monitoring and analysis methodology that has allowed us to characterize all aspects of the DPSS operating in high-speed ATM networks. In particular, we have run a variety of performance monitoring experiments involving the DPSS in the MAGIC testbed, which is a large-scale, high-speed ATM network, and we describe our experience using the monitoring methodology to identify and correct problems that limit the performance of high-speed distributed applications. Finally, the DPSS is part of an overall architecture for using high-speed WANs to enable the routine, location-independent use of large data-objects. Since this is part of the motivation for a distributed storage system, we describe this architecture.

  13. Partial QoS-Aware Opportunistic Relay Selection Over Two-Hop Channels: End-to-End Performance Under Spectrum-Sharing Requirements

    KAUST Repository

    Yuli Yang

    2014-10-01

    In this paper, we propose a partial quality-of-service (QoS)-oriented relay selection scheme with a decode-and-forward (DF) relaying protocol, to reduce the feedback amount required for relay selection. In the proposed scheme, the activated relay is the one with the maximum signal-to-noise power ratio (SNR) in the second hop among those whose packet loss rates (PLRs) in the first hop achieve a predetermined QoS level. To evaluate the performance of the proposed scheme, we study it under transmission constraints imposed on the transmit power budget and on the interference to other users. By analyzing the statistics of received SNRs in the first and second hops, we obtain the end-to-end PLR of this scheme in closed form under the considered scenario. Moreover, to compare the proposed scheme with popular relay selection schemes, we also derive the closed-form PLR expressions for the partial relay selection (PRS) and opportunistic relay selection (ORS) criteria in the same scenario. Illustrative numerical results demonstrate the accuracy of our derivations and substantiate that the proposed relay selection scheme is a promising alternative with respect to the tradeoff between performance and complexity.
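
The selection rule described above fits in a few lines; the relay records and field names here are illustrative, not from the paper:

```python
def select_relay(relays, plr_threshold):
    """Partial QoS-aware selection (sketch): among relays whose first-hop
    packet loss rate meets the QoS target, activate the one with the
    maximum second-hop SNR. Only the qualified subset needs to feed back
    its second-hop SNR, which is what reduces the feedback amount."""
    qualified = [r for r in relays if r["plr_hop1"] <= plr_threshold]
    if not qualified:
        return None  # no relay satisfies the first-hop QoS level
    return max(qualified, key=lambda r: r["snr_hop2"])

relays = [
    {"id": "R1", "plr_hop1": 0.01, "snr_hop2": 12.0},
    {"id": "R2", "plr_hop1": 0.20, "snr_hop2": 25.0},  # best SNR, fails QoS
    {"id": "R3", "plr_hop1": 0.03, "snr_hop2": 18.5},
]
print(select_relay(relays, plr_threshold=0.05)["id"])  # R3
```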

  14. The role of environmental controls in determining sardine and anchovy population cycles in the California Current: Analysis of an end-to-end model

    Science.gov (United States)

    Fiechter, Jerome; Rose, Kenneth A.; Curchitser, Enrique N.; Hedstrom, Katherine S.

    2015-11-01

    Sardine and anchovy are two forage species of particular interest because of their low-frequency cycles in adult abundance in boundary current regions, combined with a commercially relevant contribution to the global marine food catch. While several hypotheses have been put forth to explain decadal shifts in sardine and anchovy populations, a mechanistic basis for how the physics, biogeochemistry, and biology combine to produce patterns of synchronous variability across widely separated systems has remained elusive. The present study uses a 50-year (1959-2008) simulation of a fully coupled end-to-end ecosystem model configured for sardine and anchovy in the California Current System to investigate how environmental processes control their population dynamics. The results illustrate that slightly different temperature and diet preferences can lead to significantly different responses to environmental variability. Simulated adult population fluctuations are associated with age-1 growth (via age-2 egg production) and prey availability for anchovy, while they depend primarily on age-0 survival and temperature for sardine. The analysis also hints at potential linkages to known modes of climate variability, whereby changes in adult abundance are related to ENSO for anchovy and to the PDO for sardine. The connection to the PDO and ENSO is consistent with modes of interannual and decadal variability that would alternatively favor anchovy during years of cooler temperatures and higher prey availability, and sardine during years of warmer temperatures and lower prey availability. While the end-to-end ecosystem model provides valuable insight on potential relationships between environmental conditions and sardine and anchovy population dynamics, understanding the complex interplay, and potential lags, between the full array of processes controlling their abundances in the California Current System remains an on-going challenge.

  15. FROM UAS DATA ACQUISITION TO ACTIONABLE INFORMATION – HOW AN END-TO-END SOLUTION HELPS OIL PALM PLANTATION OPERATORS TO PERFORM A MORE SUSTAINABLE PLANTATION MANAGEMENT

    Directory of Open Access Journals (Sweden)

    C. Hoffmann

    2016-06-01

    Palm oil is the most efficient oilseed crop in the world, but its production involves plantation operations in one of the most fragile environments: the tropical lowlands. Deforestation, the drying-out of swampy lowlands and chemical fertilizers lead to environmental problems that are putting pressure on this industry. Unmanned aircraft systems (UAS), together with the latest photogrammetric processing and image analysis capabilities, represent an emerging technology identified as suitable for optimizing oil palm plantation operations. This paper focuses on two key elements of a UAS-based oil palm monitoring system. The first is the accuracy of the acquired data, which is necessary to achieve meaningful results in later analysis steps. High-performance GNSS technology was utilized to achieve those accuracies while decreasing the demand for cost-intensive GCP measurements. The second key topic is the analysis of the resulting data in order to optimize plantation operations. By automatically extracting information on a block level as well as on a single-tree level, operators can utilize the developed application to increase their productivity. The research results describe how operators can successfully combine a UAS-based solution with the developed software to improve their efficiency in oil palm plantation management.

  16. End-to-End Availability Analysis of IMS-Based Networks

    DEFF Research Database (Denmark)

    Kamyod, Chayapol; Nielsen, Rasmus Hjorth; Prasad, Neeli R.

    2013-01-01

    Various methods and models have recently been proposed and applied for evaluating IP Multimedia Subsystem (IMS) in terms of reliability, availability and performance. End-to-end availability is one of the key factors that can improve the quality of service (QoS) and increase the reliability of Next… State-of-the-art models are compared with the proposed novel Markov-based reliability modeling for both simplex and redundant systems. The proposed model is proved to efficiently represent the system behaviors. The effect of the end-to-end availability model when applying redundancy in the IMS core system is investigated…

  17. NPP Information Model as an Innovative Approach to End-to-End Lifecycle Management of the NPP and Nuclear Knowledge Management Proven in Russia

    International Nuclear Information System (INIS)

    Tikhonovsky, V.; Kanischev, A.; Kononov, V.; Salnikov, N.; Shkarin, A.; Dorobin, D.

    2016-01-01

    Full text: Managing engineering data for an industrial facility, including integration and maintenance of all engineering and technical data, ensuring fast and convenient access to that information and its analysis, proves to be necessary in order to perform the following tasks: 1) to increase economic efficiency of the plant during its lifecycle, including the decommissioning stage; 2) to ensure strict adherence to industrial safety requirements, radiation safety requirements (in case of nuclear facilities) and environmental safety requirements during operation (including refurbishment and restoration projects) and decommissioning. While performing tasks 1) and 2), one faces a range of challenges: 1. A huge amount of information describing the plant configuration. 2. Complexity of engineering procedures, step-by-step commissioning and significant geographical distribution of industrial infrastructure. 3. High importance of plant refurbishment projects. 4. The need to ensure comprehensive knowledge transfer between different generations of operational personnel and, which is especially important for the nuclear energy industry, between the commissioning personnel generations. The NPP information model is an innovative method of NPP knowledge management throughout the whole plant lifecycle. It is an integrated database with all NPP technical engineering information (design, construction, operation, diagnosing, maintenance, refurbishment). (author)

  18. Measurements and analysis of end-to-end Internet dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Paxson, Vern [Univ. of California, Berkeley, CA (United States). Computer Science Division

    1997-04-01

    Accurately characterizing end-to-end Internet dynamics - the performance that a user actually obtains from the lengthy series of network links that comprise a path through the Internet - is exceptionally difficult, due to the network's immense heterogeneity. At the heart of this work is a 'measurement framework' in which a number of sites around the Internet host a specialized measurement service. By coordinating 'probes' between pairs of these sites one can measure end-to-end behavior along O(N²) paths for a framework consisting of N sites. Consequently, one obtains a superlinear scaling that allows measuring a rich cross-section of Internet behavior without requiring huge numbers of observation points. 37 sites participated in this study, allowing the author to measure more than 1,000 distinct Internet paths. The first part of this work looks at the behavior of end-to-end routing: the series of routers over which a connection's packets travel. Based on 40,000 measurements made using this framework, the author analyzes: routing 'pathologies' such as loops, outages, and flutter; the stability of routes over time; and the symmetry of routing along the two directions of an end-to-end path. The author finds that pathologies increased significantly over the course of 1995 and that Internet paths are heavily dominated by a single route. The second part of this work studies end-to-end Internet packet dynamics. The author analyzes 20,000 TCP transfers of 100 Kbyte each to investigate the performance of both the TCP endpoints and the Internet paths. The measurements used for this part of the study are much richer than those for the first part, but require a great degree of attention to issues of calibration, which are addressed by applying self-consistency checks to the measurements whenever possible. The author finds that packet filters are capable of a wide range of measurement errors, some of which, if undetected, can significantly taint subsequent analysis.
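
The O(N²) scaling of measurable paths follows directly from taking ordered pairs of sites: routing need not be symmetric, so each direction between a pair of sites counts as a distinct path. A small sketch (site names invented):

```python
from itertools import permutations

def measurement_paths(sites):
    """All ordered source->destination pairs among N sites: N*(N-1)
    end-to-end paths obtainable from pairwise probes."""
    return list(permutations(sites, 2))

sites = [f"site{i}" for i in range(37)]  # 37 sites, as in the study
paths = measurement_paths(sites)
print(len(paths))  # 1332 = 37*36, consistent with "more than 1,000 paths"
```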

  19. End-to-End Operations in the ELT Era

    Science.gov (United States)

    Hainaut, O. R.; Bierwirth, T.; Brillant, S.; Mieske, S.; Patat, F.; Rejkuba, M.; Romaniello, M.; Sterzik, M.

    2018-03-01

    The Data Flow System is the infrastructure on which Very Large Telescope (VLT) observations are performed at the Observatory, before and after the observations themselves take place. Since its original conception in the late 1990s, it has evolved to accommodate new observing modes and new instruments on La Silla and Paranal. Several updates and upgrades are needed to overcome its obsolescence and to integrate requirements from the new instruments from the community and, of course, from ESO's Extremely Large Telescope (ELT), which will be integrated into Paranal's operations. We describe the end-to-end operations and the resulting roadmap guiding their further development.

  20. An end to end secure CBIR over encrypted medical database.

    Science.gov (United States)

    Bellafqira, Reda; Coatrieux, Gouenou; Bouslimi, Dalel; Quellec, Gwenole

    2016-08-01

    In this paper, we propose a new secure content-based image retrieval (SCBIR) system adapted to the cloud framework. This solution allows a physician to retrieve images of similar content within an outsourced and encrypted image database, without decrypting them. In contrast to current CBIR approaches in the encrypted domain, the originality of the proposed scheme lies in the fact that the features extracted from the encrypted images are themselves encrypted. This is achieved by means of homomorphic encryption and two non-colluding servers, both of which we consider honest but curious. In that way an end-to-end secure CBIR process is ensured. Experimental results carried out on a diabetic retinopathy database encrypted with the Paillier cryptosystem indicate that our SCBIR achieves retrieval performance as good as if the images were processed in their non-encrypted form.
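
The additive homomorphism that makes such encrypted-domain processing possible can be illustrated with a toy Paillier implementation. The parameters below are textbook-tiny and wholly insecure (real keys use ~2048-bit primes); the point is only to show that multiplying ciphertexts modulo n² decrypts to the sum of the plaintexts:

```python
import math
import random

p, q = 61, 53            # demo primes only
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
g = n + 1                # standard simple choice for g

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # modular inverse (Python 3.8+)

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

a, b = 42, 99
# E(a) * E(b) mod n^2 decrypts to a + b: the server adds without decrypting
print(decrypt((encrypt(a) * encrypt(b)) % n2))  # 141
```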

  1. End-to-End Security for Personal Telehealth

    NARCIS (Netherlands)

    Koster, R.P.; Asim, M.; Petkovic, M.

    2011-01-01

    Personal telehealth is in rapid development with innovative emerging applications like disease management. With personal telehealth people participate in their own care supported by an open distributed system with health services. This poses new end-to-end security and privacy challenges. In this

  2. End to End Inter-domain Quality of Service Provisioning

    DEFF Research Database (Denmark)

    Brewka, Lukasz Jerzy

    The Generalized Multi-Protocol Label Switching (GMPLS) protocol suite was selected. Its growing popularity in access networks, together with its maturity and wide adoption in core networks, makes it a great candidate as an end-to-end QoS provisioning mechanism. As a consequence of the UPnP-QoS/GMPLS mapping analysis… of resource reservation that could be used in core networks. Teletraffic engineering theorems are used for management of resources reserved for traffic of different priorities and rates in nodes, open and closed networks. As a whole, this thesis can be seen as a QoS analysis starting from home networks… through Home Gateways towards access links and finally reaching core networks, in this way constituting a path with end-to-end inter-domain provisioned QoS…

  3. End-to-end interoperability and workflows from building architecture design to one or more simulations

    Energy Technology Data Exchange (ETDEWEB)

    Chao, Tian-Jy; Kim, Younghun

    2015-02-10

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
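
As a rough illustration of the idea of a data definition language that creates a table schema for the data model's entities and entity relationships, here is a sketch using SQLite; the table and column names are invented, not the patent's actual schema:

```python
import sqlite3

# Hypothetical DDL: entities of a building model and their relationships
ddl = """
CREATE TABLE entity (
    id   INTEGER PRIMARY KEY,
    kind TEXT NOT NULL,          -- e.g. 'wall', 'floor', 'sensor'
    name TEXT
);
CREATE TABLE entity_relation (
    parent_id INTEGER REFERENCES entity(id),
    child_id  INTEGER REFERENCES entity(id),
    relation  TEXT NOT NULL      -- e.g. 'contains', 'adjacent_to'
);
"""

con = sqlite3.connect(":memory:")
con.executescript(ddl)
con.execute("INSERT INTO entity VALUES (1, 'wall', 'north wall')")
print(con.execute("SELECT kind FROM entity").fetchone()[0])  # wall
```

Data management services or web APIs would then wrap queries against such a schema.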

  4. Estimating Average End-to-End Delays in IEEE 802.11 Multihop Wireless Networks

    OpenAIRE

    Sarr , Cheikh; Guérin-Lassous , Isabelle

    2007-01-01

    In this paper, we present a new analytic model for evaluating the average end-to-end delay in IEEE 802.11 multihop wireless networks. Our model gives closed expressions for the end-to-end delay as a function of arrival and service time patterns. Each node is modeled as an M/M/1/K queue, from which we can derive expressions for service time via queueing theory. By combining this delay evaluation with different admission controls, we design a protocol called DEAN (Delay Estimation in Ad hoc Networks)…
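
A sketch of the per-node delay under the M/M/1/K assumption, using the standard stationary distribution and Little's law; the hop parameters are invented, and this is the textbook formula rather than necessarily the paper's exact closed form:

```python
def mm1k_mean_delay(lam, mu, K):
    """Mean sojourn time of an M/M/1/K queue (requires rho != 1),
    via the mean queue length and Little's law: W = L / lambda_eff."""
    rho = lam / mu
    # Mean number in system for a finite buffer of size K
    L = rho / (1 - rho) - (K + 1) * rho ** (K + 1) / (1 - rho ** (K + 1))
    # Blocking probability: arrivals finding the buffer full are lost
    pK = (1 - rho) * rho ** K / (1 - rho ** (K + 1))
    lam_eff = lam * (1 - pK)
    return L / lam_eff

# Summing per-node delays approximates the end-to-end delay along a path
path_rates = [(4.0, 10.0), (6.0, 10.0), (3.0, 10.0)]  # (lambda, mu) per hop
delay = sum(mm1k_mean_delay(l, m, K=10) for l, m in path_rates)
print(round(delay, 4))
```

As K grows, the result converges to the infinite-buffer M/M/1 delay 1/(mu - lambda).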

  5. End-to-End Assessment of a Large Aperture Segmented Ultraviolet Optical Infrared (UVOIR) Telescope Architecture

    Science.gov (United States)

    Feinberg, Lee; Rioux, Norman; Bolcar, Matthew; Liu, Alice; Guyon, Oliver; Stark, Chris; Arenberg, Jon

    2016-01-01

    Key challenges of a future large aperture, segmented Ultraviolet Optical Infrared (UVOIR) telescope capable of performing a spectroscopic survey of hundreds of exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high-yield Exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end-to-end architecture including a high-throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage-based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an Exo-Earth yield assessment to evaluate potential performance. These efforts are combined through integrated modeling, coronagraph evaluations, and Exo-Earth yield calculations to assess the potential performance of the selected architecture. In addition, we discuss the scalability of this architecture to larger apertures and the technological tall poles to enabling it.

  6. DNA synthesis generates terminal duplications that seal end-to-end chromosome fusions.

    Science.gov (United States)

    Lowden, Mia Rochelle; Flibotte, Stephane; Moerman, Donald G; Ahmed, Shawn

    2011-04-22

    End-to-end chromosome fusions that occur in the context of telomerase deficiency can trigger genomic duplications. For more than 70 years, these duplications have been attributed solely to breakage-fusion-bridge cycles. To test this hypothesis, we examined end-to-end fusions isolated from Caenorhabditis elegans telomere replication mutants. Genome-level rearrangements revealed fused chromosome ends having interrupted terminal duplications accompanied by template-switching events. These features are very similar to disease-associated duplications of interstitial segments of the human genome. A model termed Fork Stalling and Template Switching has been proposed previously to explain such duplications, where promiscuous replication of large, noncontiguous segments of the genome occurs. Thus, a DNA synthesis-based process may create duplications that seal end-to-end fusions, in the absence of breakage-fusion-bridge cycles.

  7. End-to-end plasma bubble PIC simulations on GPUs

    Science.gov (United States)

    Germaschewski, Kai; Fox, William; Matteucci, Jackson; Bhattacharjee, Amitava

    2017-10-01

    Accelerator technologies play a crucial role in eventually achieving exascale computing capabilities. The current and upcoming leadership machines at ORNL (Titan and Summit) employ Nvidia GPUs, which provide vast computational power but also need specifically adapted computational kernels to fully exploit them. In this work, we will show end-to-end particle-in-cell simulations of the formation, evolution and coalescence of laser-generated plasma bubbles. This work showcases the GPU capabilities of the PSC particle-in-cell code, which has been adapted for this problem to support particle injection, a heating operator and a collision operator on GPUs.

  8. OGC standards for end-to-end sensor network integration

    Science.gov (United States)

    Headley, K. L.; Broering, A.; O'Reilly, T. C.; Toma, D.; Del Rio, J.; Bermudez, L. E.; Zedlitz, J.; Johnson, G.; Edgington, D.

    2010-12-01

    technology, and can communicate with any sensor whose protocol can be described by a SID. The SID interpreter transfers retrieved sensor data to a Sensor Observation Service, and transforms tasks submitted to a Sensor Planning Service to actual sensor commands. The proposed SWE PUCK protocol complements SID by providing a standard way to associate a sensor with a SID, thereby completely automating the sensor integration process. PUCK protocol is implemented in sensor firmware, and provides a means to retrieve a universally unique identifier, metadata and other information from the device itself through its communication interface. Thus the SID interpreter can retrieve a SID directly from the sensor through PUCK protocol. Alternatively the interpreter can retrieve the sensor's SID from an external source, based on the unique sensor ID provided by PUCK protocol. In this presentation, we describe the end-to-end integration of several commercial oceanographic instruments into a sensor network using PUCK, SID and SWE services. We also present a user-friendly, graphical tool to generate SIDs and tools to visualize sensor data.

  9. End-to-end validation process for the INTA-Nanosat-1B Attitude Control System

    Science.gov (United States)

    Polo, Óscar R.; Esteban, Segundo; Cercos, Lorenzo; Parra, Pablo; Angulo, Manuel

    2014-01-01

    This paper describes the end-to-end validation process for the Attitude Control Subsystem (ACS) of the satellite INTA-NanoSat-1B (NS-1B). This satellite was launched in July 2009 and it has been fully operative since then. The development of its ACS modules required an exhaustive integration and a system-level validation program. Some of the tests were centred on the validation of the drivers of sensors and actuators and were carried out on the flying model of the satellite. Others, more complex, constituted end-to-end tests where the concurrency of modules, the real-time control requirements and even the well-formedness of the telemetry data were verified. This work presents an incremental and highly automatised way of performing the ACS validation program based on two development suites and an end-to-end validation environment. The validation environment combines a Flat Satellite (FlatSat) configuration and a real-time emulator working in closed loop. The FlatSat is built using the NS-1B Qualification Model (QM) hardware and it can run a complete version of the on-board software with the ACS modules fully integrated. The real-time emulator, running on an industrial PC, samples the actuation signals and emulates the sensor signals to close the control loop with the FlatSat. This validation environment constitutes a low-cost alternative to the classical three-axis tilt table, with the advantage of being easily configured for working under specific orbit conditions, in accordance with any of the selected tests. The approach has been successfully applied to the NS-1B in order to verify different ACS modes under multiple orbit scenarios, providing exhaustive coverage and reducing the risk of eventual errors during the satellite's lifetime. The strategy was applied also during the validation of the maintenance and reconfiguration procedures required once the satellite was launched. This paper describes in detail the complete ACS validation process that was…

  10. On routing algorithms with end-to-end delay guarantees

    Energy Technology Data Exchange (ETDEWEB)

    Rao, N.S.V.; Batsell, S.G.

    1998-11-01

    The authors consider the transmission of a message of size r from a source to a destination with guarantees on the end-to-end delay over a computer network with n nodes and m links. There are three sources of delays: (a) propagation delays along the links, (b) delays due to bandwidth availability on the links, and (c) queuing delays at the intermediate nodes. First, the authors consider that delays on various links and nodes are given as functions of the message size. If the delay in (b) is a non-increasing function of the bandwidth, they propose an O(m² + mn log n) time algorithm to compute a path with the minimum end-to-end delay for any given message size r. They then consider that the queuing delay in (c) is a random variable correlated with the message size according to an unknown distribution. At each node, measurements of queuing delays and message sizes are available. They propose two algorithms to compute paths whose delays are close to optimal with a high probability, irrespective of the distribution of the delays, and based entirely on measurements of sufficient size.
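
For cases (a) and (b), where each link's delay is its propagation delay plus a transmission time that is non-increasing in the bandwidth, the minimum end-to-end delay for a given message size r can be found with a shortest-path search once r is fixed. A sketch under the simple delay model delay = prop + r/bandwidth (the paper's algorithm handles more general delay functions):

```python
import heapq

def min_delay_path(links, src, dst, r):
    """Dijkstra over link delays that depend on the message size r.
    links: {(u, v): (prop_delay, bandwidth)}; each link's delay is its
    propagation delay plus the transmission time r / bandwidth."""
    adj = {}
    for (u, v), (prop, bw) in links.items():
        adj.setdefault(u, []).append((v, prop + r / bw))
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")

# Two low-latency, low-bandwidth hops vs. one slow, high-bandwidth link:
links = {(0, 1): (1.0, 10.0), (1, 2): (1.0, 10.0), (0, 2): (5.0, 100.0)}
print(min_delay_path(links, 0, 2, r=1.0))     # 2.2: short hops win for small r
print(min_delay_path(links, 0, 2, r=1000.0))  # 15.0: direct link wins for large r
```

The optimal path depends on r, which is why the algorithm takes the message size as input.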

  11. An end-to-end vehicle classification pipeline using vibrometry data

    Science.gov (United States)

    Smith, Ashley; Mendoza-Schrock, Olga; Kangas, Scott; Dierking, Matthew; Shaw, Arnab

    2014-06-01

    This paper evaluates and expands upon the existing end-to-end process used for vibrometry target classification and identification. A fundamental challenge in vehicle classification using vibrometry signature data is the determination of robust signal features. The methodology used in this paper involves comparing the performance of features taken from automatic speech recognition, seismology, and structural analysis work. These features provide a means to reduce the dimensionality of the data for the possibility of improved separability. The performances of different groups of features are compared to determine the best feature set for vehicle classification. Standard performance metrics are implemented to provide a method of evaluation. The contributions of this paper are to (1) thoroughly explain the time-domain and frequency-domain features that have recently been applied to vehicle classification using laser-vibrometry data, (2) build an end-to-end classification pipeline for Aided Target Recognition (ATR) with common and easily accessible tools, and (3) apply feature selection methods to the end-to-end pipeline. The end-to-end process used here provides a structured path for accomplishing vibrometry-based target identification. The results are compared with two studies in the public domain. The techniques in this paper were used to analyze a small in-house database of several different vehicles.
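
To give a flavor of frequency-domain feature extraction of this kind, here is a minimal sketch; the three features chosen are a hypothetical illustration, not the paper's feature set:

```python
import numpy as np

def spectral_features(signal, fs):
    """Toy frequency-domain feature vector for a vibration signal:
    dominant frequency, spectral centroid, and total power. Reduces a
    long time series to a few numbers for a downstream classifier."""
    spec = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    dominant = freqs[np.argmax(spec[1:]) + 1]  # skip the DC bin
    centroid = np.sum(freqs * spec) / np.sum(spec)
    return np.array([dominant, centroid, np.sum(spec)])

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
sig = np.sin(2 * np.pi * 60 * t)  # a synthetic 60 Hz "engine line"
print(spectral_features(sig, fs)[0])  # 60.0
```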

  12. End-to-end tests using alanine dosimetry in scanned proton beams

    Science.gov (United States)

    Carlino, A.; Gouldstone, C.; Kragl, G.; Traneus, E.; Marrale, M.; Vatnitsky, S.; Stock, M.; Palmans, H.

    2018-03-01

    This paper describes end-to-end test procedures as the last fundamental step of medical commissioning before starting clinical operation of the MedAustron synchrotron-based pencil beam scanning (PBS) therapy facility with protons. One in-house homogeneous phantom and two anthropomorphic heterogeneous (head and pelvis) phantoms were used for end-to-end tests at MedAustron. The phantoms were equipped with alanine detectors, radiochromic films and ionization chambers. The correction for the ‘quenching’ effect of alanine pellets was implemented in the Monte Carlo platform of the evaluation version of RayStation TPS. During the end-to-end tests, the phantoms were transferred through the workflow like real patients to simulate the entire clinical workflow: immobilization, imaging, treatment planning and dose delivery. Different clinical scenarios of increasing complexity were simulated: delivery of a single beam, two oblique beams without and with range shifter. In addition to the dose comparison in the plastic phantoms the dose obtained from alanine pellet readings was compared with the dose determined with the Farmer ionization chamber in water. A consistent systematic deviation of about 2% was found between alanine dosimetry and the ionization chamber dosimetry in water and plastic materials. Acceptable agreement of planned and delivered doses was observed together with consistent and reproducible results of the end-to-end testing performed with different dosimetric techniques (alanine detectors, ionization chambers and EBT3 radiochromic films). The results confirmed the adequate implementation and integration of the new PBS technology at MedAustron. This work demonstrates that alanine pellets are suitable detectors for end-to-end tests in proton beam therapy and the developed procedures with customized anthropomorphic phantoms can be used to support implementation of PBS technology in clinical practice.

  13. Identification of the main processes underlying ecosystem functioning in the Eastern English Channel, with a focus on flatfish species, as revealed through the application of the Atlantis end-to-end model

    Science.gov (United States)

    Girardin, Raphaël; Fulton, Elizabeth A.; Lehuta, Sigrid; Rolland, Marie; Thébaud, Olivier; Travers-Trolet, Morgane; Vermard, Youen; Marchal, Paul

    2018-02-01

    The ecosystem model Atlantis was used to investigate the key dynamics and processes that structure the Eastern English Channel ecosystem, with a particular focus on two commercial flatfish species, sole (Solea solea) and plaice (Pleuronectes platessa). This complex model was parameterized with data collected from diverse sources (a literature review, survey data, as well as landings and stock assessment information) and tuned so both simulated biomass and catch fit 2002-2011 observations. Here, the outputs are mainly presented for the two focus species and for some other vertebrates found to be important in the trophic network. The calibration process revealed the importance of coastal areas in the Eastern English Channel and of nutrient inputs from estuaries: a lack of river nutrients decreases the productivity of nursery grounds and adversely affects the production of sole and plaice. The role of discards in the trophic network is also highlighted. While sole and plaice did not have a strong influence on the trophic network of vertebrates, they are important predators for benthic invertebrates and compete for food with crustaceans, whiting (Merlangius merlangus) and other demersal fish. We also found that two key species, cod (Gadus morhua) and whiting, thoroughly structured the Eastern English Channel trophic network.

  14. END-TO-END INDIA-UK TRANSNATIONAL WIRELESS TESTBED

    Directory of Open Access Journals (Sweden)

    Rohit Budhiraja

    2011-06-01

    Wireless communication is a fast-growing technology area in which a tremendous amount of research is ongoing. It is also an area where the use of technology in the market has seen wide and far-reaching impact. The India-UK Advanced Technology Centre initiative is a collaborative research project between various institutes and companies across the UK and India, which envisages, apart from several research outcomes, putting in place a support infrastructure for facilitating R&D of next-generation networks, systems and services. As part of this project, an end-to-end trans-national advanced wireless testbed is being developed which will facilitate and support research and implementation of new ideas, concepts and technologies. The testbed will provide a framework which can be used to rapidly prototype and evaluate emerging concepts and technologies, and enables researchers to investigate and demonstrate the feasibility of new ideas in a realistic test environment. The testbed complements the analytical and simulation-based studies undertaken as part of the initial study when new ideas are proposed. This paper gives the details of the testbed and shows how a 4G technology like LTE has been implemented as one of the realisations of the testbed.

  15. Building dialogue POMDPs from expert dialogues an end-to-end approach

    CERN Document Server

    Chinaei, Hamidreza

    2016-01-01

This book discusses the Partially Observable Markov Decision Process (POMDP) framework applied in dialogue systems. It presents POMDP as a formal framework to represent uncertainty explicitly while supporting automated policy solving. The authors propose and implement an end-to-end learning approach for the dialogue POMDP model components. Starting from scratch, they learn the state, the transition model, the observation model and finally the reward model from unannotated and noisy dialogues. Together these form a significant set of contributions that can potentially inspire substantial further work. This concise manuscript is written in simple language, full of illustrative examples, figures, and tables. It provides insights on building dialogue systems to be applied in real domains, illustrates learning dialogue POMDP model components from unannotated dialogues in a concise format, and introduces an end-to-end approach that makes use of unannotated and noisy dialogues for learning each component of the dialogue POMDP.
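As a toy illustration of one such component, maximum-likelihood transition probabilities can be counted directly from (state, action, next-state) triples. The function below is a hypothetical sketch, not the book's algorithm, and the dialogue states and actions are invented:

```python
from collections import Counter, defaultdict

def learn_transition_model(dialogues):
    """Maximum-likelihood estimates of T(s' | s, a) from observed
    (state, action, next_state) triples.
    Hypothetical sketch of one POMDP model component; the book's
    procedure for noisy, unannotated dialogues is more involved."""
    counts = defaultdict(Counter)
    for episode in dialogues:              # episode: [(s, a, s_next), ...]
        for s, a, s_next in episode:
            counts[(s, a)][s_next] += 1
    # Normalize counts into conditional probabilities per (state, action)
    return {sa: {s2: n / sum(c.values()) for s2, n in c.items()}
            for sa, c in counts.items()}
```

For example, two observed transitions from ("greet", "ask") that split evenly would yield probability 0.5 for each successor state.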

  16. Rectovaginal fistula following colectomy with an end-to-end anastomosis stapler for a colorectal adenocarcinoma.

    Science.gov (United States)

    Klein, A; Scotti, S; Hidalgo, A; Viateau, V; Fayolle, P; Moissonnier, P

    2006-12-01

An 11-year-old neutered female Labrador retriever was presented with a micro-invasive differentiated papillary adenocarcinoma at the colorectal junction. A colorectal end-to-end anastomosis stapler device was used to perform resection and anastomosis using a transanal technique. A rectovaginal fistula was diagnosed two days later. An exploratory laparotomy was conducted and the fistula was identified and closed. Early dehiscence of the colon was also suspected, and another colorectal anastomosis was performed using a manual technique. Compared with a conventional manual technique of intestinal surgery, the use of an automatic stapling device was quicker and easier. To the authors' knowledge, this is the first report of a rectovaginal fistula occurring after end-to-end stapled colorectal resection-anastomosis in the dog. To minimise the risk of this potential complication associated with the limited surgical visibility, adequate tissue retraction and inspection of the anastomosis site are essential.

  17. End-to-end security in telemedical networks--a practical guideline.

    Science.gov (United States)

Wozak, Florian; Schabetsberger, Thomas; Ammenwerth, Elske

    2007-01-01

The interconnection of medical networks in different healthcare institutions will constantly increase over the next few years, which will require concepts for securing medical data during transfer, since transmitting patient-related data via potentially insecure public networks is considered a violation of data privacy. The aim of our work was to develop a model-based approach towards end-to-end security, defined as continuous security from the point of origin to the point of destination in a communication process. We show that end-to-end security must be seen as a holistic security concept comprising three major parts: authentication and access control, transport security, and system security. For integration into existing security infrastructures, abuse case models were used; these extend UML use cases with the elements necessary to describe abusive interactions. Abuse case models can be constructed for each of the parts mentioned above, allowing potential security risks in communication from point of origin to point of destination to be identified and counteractive measures to be derived directly from the abuse case models. The model-based approach is a guideline for continuous risk assessment and improvement of end-to-end security in medical networks. Validity and relevance to practice will be systematically evaluated using close-to-reality test networks as well as production environments.

  18. Circular myotomy as an aid to resection and end-to-end anastomosis of the esophagus.

    Science.gov (United States)

    Attum, A A; Hankins, J R; Ngangana, J; McLaughlin, J S

    1979-08-01

    Segments ranging from 40 to 70% of the thoracic esophagus were resected in 80 mongrel dogs. End-to-end anastomosis was effected after circular myotomy either proximal or distal, or both proximal and distal, to the anastomosis. Among dogs undergoing resection of 60% of the esophagus, distal myotomy enabled 6 of 8 animals to survive, and combined proximal and distal myotomy permitted 8 of 10 to survive. Cineesophagography was performed in a majority of the 50 surviving animals and showed no appreciable delay of peristalsis at the myotomy sites. When these sites were examined at postmortem examination up to 13 months after operation, 1 dog showed a small diverticulum but none showed dilatation or stricture. It is concluded that circular myotomy holds real promise as a means of extending the clinical application of esophageal resection with end-to-end anastomosis.

  19. End-to-end delay analysis in wireless sensor networks with service vacation

    KAUST Repository

    Alabdulmohsin, Ibrahim

    2014-04-01

In this paper, a delay-sensitive multi-hop wireless sensor network is considered, modeled as an M/G/1 queue with vacations. Sensors transmit measurements to a predefined data sink subject to a maximum end-to-end delay constraint. In order to prolong battery lifetime, a sleeping scheme is adopted throughout the network nodes. The objective of our proposed framework is to derive an expression for the maximum hop count as well as an approximate expression for the probability of blocking at the sink node upon violating a given end-to-end delay threshold. Using numerical simulations, we validate the proposed analytical model and demonstrate that the blocking probability of the system for various vacation time distributions matches the simulation results.
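The classical mean-delay decomposition for an M/G/1 queue with multiple vacations gives a feel for the kind of expression involved. The sketch below uses that textbook result, not the paper's exact derivation, to bound the hop count under a delay budget; all parameter values are invented:

```python
def mg1_vacation_wait(lam, es, es2, ev, ev2):
    """Mean waiting time in an M/G/1 queue with multiple vacations:
    the Pollaczek-Khinchine term plus the vacation residual term.
    lam: arrival rate; es/es2: E[S], E[S^2]; ev/ev2: E[V], E[V^2]."""
    rho = lam * es                      # server utilization, must be < 1
    assert rho < 1, "queue is unstable"
    return lam * es2 / (2 * (1 - rho)) + ev2 / (2 * ev)

def max_hops(delay_budget, lam, es, es2, ev, ev2):
    """Largest hop count whose total mean delay fits the end-to-end budget,
    treating each hop as an identical independent queue (an assumption)."""
    per_hop = mg1_vacation_wait(lam, es, es2, ev, ev2) + es
    return int(delay_budget // per_hop)

# Example: exponential service (mean 1 ms) and deterministic 5 ms vacations
print(max_hops(100.0, lam=0.5, es=1.0, es2=2.0, ev=5.0, ev2=25.0))
```

With these numbers the per-hop mean delay is 4.5 ms, so a 100 ms budget admits 22 hops.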

  20. End-to-End Image Simulator for Optical Imaging Systems: Equations and Simulation Examples

    Directory of Open Access Journals (Sweden)

    Peter Coppo

    2013-01-01

A theoretical description of a simplified end-to-end software tool for the simulation of data produced by optical instruments, starting from either synthetic or airborne hyperspectral data, is given, and some simulation examples of hyperspectral and panchromatic images for existing and future instrument designs are reported. High spatial/spectral resolution images with low intrinsic noise, together with the sensor/mission specifications, are used as inputs for the simulations. The examples reported in this paper show the capabilities of the tool for simulating target detection scenarios, data quality assessment with respect to classification performance and class discrimination, the impact of optical design on image quality, and 3D modelling of optical performance. The simulator is conceived as a tool (during phase 0/A) for the specification and early development of new Earth observation optical instruments, whose compliance with users' requirements is achieved through a process of cost/performance trade-off. The Selex Galileo simulator, compared with other existing image simulators for phase C/D projects of space-borne instruments, implements all the modules necessary for a complete panchromatic and hyperspectral image simulation, and it allows excellent flexibility and expandability for new integrated functions because of the adopted IDL-ENVI software environment.

  1. End-to-end integrated service provision and protection scheme in IP over WDM networks

    Science.gov (United States)

    Zhu, Yonghua; Lin, Rujian

    2005-02-01

Generalized Multiprotocol Label Switching (GMPLS), which is developed to support common control of packet, TDM, wavelength, and fiber services, is the key enabler of the new network model. The survivability of IP over WDM networks gains importance as network traffic keeps growing. In this paper, we propose an integrated provisioning scheme to dynamically allocate Label Switched Paths (LSPs) in IP over WDM networks. This scheme takes advantage of information sharing between layers (e.g., link state information, bandwidth usage, and protection capability) to eliminate the redundancies and inefficiencies of traditional layer-independent service provisioning. The integration of information is facilitated by GMPLS signaling. The proposed scheme also uses GMPLS capabilities to provide end-to-end survivability against network failures. The ability to provision across all network layers ensures efficient bandwidth usage. We propose two integrated routing algorithms: an availability-based integrated routing algorithm and a joint availability-based integrated routing algorithm. Simulations were performed to evaluate the performance of our proposed integrated provisioning mechanism. As a result, network performance can be optimized across all layers, which could lead to significant cost savings for service providers.
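One way to realize availability-based routing, sketched under the assumption that each link advertises an availability in (0, 1], is to run Dijkstra on -log(availability), so the selected path maximizes the product of link availabilities. The function and graph below are illustrative, not the authors' algorithm:

```python
import heapq
import math

def most_available_path(graph, src, dst):
    """Return (path, availability) maximizing the product of link
    availabilities, via Dijkstra on -log(availability).
    graph: {node: {neighbor: availability}} with availability in (0, 1]."""
    dist = {src: 0.0}
    prev = {}
    pq = [(0.0, src)]
    done = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in done:
            continue
        done.add(u)
        if u == dst:
            break
        for v, avail in graph.get(u, {}).items():
            nd = d - math.log(avail)    # additive cost of this link
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    if dst not in dist:
        return None, 0.0
    path = [dst]
    while path[-1] != src:
        path.append(prev[path[-1]])
    return path[::-1], math.exp(-dist[dst])
```

For instance, with links A-C and C-D at 0.99 availability each versus A-B and B-D at 0.9, the A-C-D path wins with a 0.9801 end-to-end availability.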

  2. Airport Detection Using End-to-End Convolutional Neural Network with Hard Example Mining

    Directory of Open Access Journals (Sweden)

    Bowen Cai

    2017-11-01

Deep convolutional neural networks (CNNs) achieve outstanding performance in the field of target detection. As one of the most typical targets in remote sensing images (RSIs), airports have attracted increasing attention in recent years. However, the essential challenge in using a deep CNN to detect airports is the great imbalance between the number of airport and background examples in large-scale RSIs, which may lead to over-fitting. In this paper, we develop a hard example mining and weight-balancing strategy to construct a novel end-to-end convolutional neural network for airport detection. The initial motivation of the proposed method is that the background contains an overwhelming number of easy examples and a few hard ones. Therefore, we design a hard example mining layer that automatically selects hard examples by their losses, and we implement a new weight-balanced loss function to optimize the CNN. Meanwhile, the cascade design of proposal extraction and object detection in our network relaxes the constraint on input image size and reduces spurious false positives. Compared with geometric characteristics and low-level manually designed features, the hard-example-mining-based network can extract high-level features, which are more robust for airport detection in complex environments. The proposed method is validated on a multi-scale dataset with complex backgrounds collected from Google Earth. The experimental results demonstrate that our proposed method is robust and superior to state-of-the-art airport detection models.
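The core idea of mining hard negatives by loss and rebalancing their contribution can be sketched as follows. This is a simplified numpy illustration of the general technique, not the paper's layer, and the 50/50 weighting scheme is an assumption:

```python
import numpy as np

def hard_example_mining(losses, labels, num_hard_neg):
    """Keep all positives plus the highest-loss negatives, then apply a
    weight-balanced loss so positives and negatives contribute equally.
    Returns (kept indices, balanced loss). Illustrative sketch only."""
    losses = np.asarray(losses, dtype=float)
    labels = np.asarray(labels)
    pos = np.flatnonzero(labels == 1)
    neg = np.flatnonzero(labels == 0)
    # Sort negatives by descending loss and keep only the hardest ones
    hardest = neg[np.argsort(-losses[neg])][:num_hard_neg]
    keep = np.concatenate([pos, hardest])
    # Each class gets half of the total weight, split among its examples
    w_pos = 0.5 / max(len(pos), 1)
    w_neg = 0.5 / max(len(hardest), 1)
    weights = np.where(labels[keep] == 1, w_pos, w_neg)
    return keep, float(np.sum(weights * losses[keep]))
```

A batch with two positives and three negatives, mined down to the two hardest negatives, thus yields a loss in which each retained example carries equal weight.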

  3. End-to-End Assessment of a Large Aperture Segmented Ultraviolet Optical Infrared (UVOIR) Telescope Architecture

    Science.gov (United States)

Feinberg, Lee; Bolcar, Matt; Liu, Alice; Guyon, Olivier; Stark, Chris; Arenberg, Jon

    2016-01-01

Key challenges for a future large aperture, segmented Ultraviolet Optical Infrared (UVOIR) telescope capable of performing a spectroscopic survey of hundreds of exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high-yield exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end-to-end architecture including a high-throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage-based stable segmented telescope, a control architecture that minimizes the amount of new technology, and an exo-Earth yield assessment to evaluate potential performance.

  4. Satellite/Terrestrial Networks: End-to-End Communication Interoperability Quality of Service Experiments

    Science.gov (United States)

    Ivancic, William D.

    1998-01-01

Various issues associated with satellite/terrestrial end-to-end communication interoperability are presented in viewgraph form. Specific topics include: 1) quality of service; 2) ATM performance characteristics; 3) MPEG-2 transport stream mapping to AAL-5; 4) observation and discussion of compressed video tests over ATM; 5) digital video over satellites status; 6) satellite link configurations; 7) MPEG-2 over ATM with binomial errors; 8) MPEG-2 over ATM channel characteristics; 9) MPEG-2 over ATM over emulated satellites; 10) MPEG-2 transport stream with errors; and 11) a dual decoder test.

  5. End to end adaptive congestion control in TCP/IP networks

    CERN Document Server

    Houmkozlis, Christos N

    2012-01-01

This book provides an adaptive control theory perspective on designing congestion controls for packet-switching networks. Relevant to a wide range of disciplines and industries, including the music industry, computers, image trading, and virtual groups, the text extensively discusses source-oriented, or end-to-end, congestion control algorithms. The book empowers readers with a clear understanding of the characteristics of packet-switching networks and their effects on system stability and performance. It provides schemes capable of controlling congestion and fairness and presents real-world applications.

  6. WiMAX security and quality of service an end-to-end perspective

    CERN Document Server

    Tang, Seok-Yee; Sharif, Hamid

    2010-01-01

WiMAX is the first standard technology to deliver true broadband mobility at speeds that enable powerful multimedia applications such as Voice over Internet Protocol (VoIP), online gaming, mobile TV, and personalized infotainment. WiMAX Security and Quality of Service focuses on the interdisciplinary subject of advanced security and Quality of Service (QoS) in WiMAX wireless telecommunication systems, including its models, standards, implementations, and applications. The book is split into four parts; Part A is an end-to-end overview of the WiMAX architecture, protocol, and system requirements.

  7. Analytical Framework for End-to-End Delay Based on Unidirectional Highway Scenario

    Directory of Open Access Journals (Sweden)

    Aslinda Hassan

    2015-01-01

In a sparse vehicular ad hoc network, a vehicle normally employs a carry-and-forward approach, where it holds the message it wants to transmit until it meets other vehicles or roadside units. A number of analyses in the literature have investigated the time delay when packets are carried by vehicles on both unidirectional and bidirectional highways. However, these analyses focus on the delay between either two disconnected vehicles or two disconnected vehicle clusters. Furthermore, the majority of the analyses concentrate only on the expected value of the end-to-end delay when the carry-and-forward approach is used. Using regression analysis, we establish that the time delay between two disconnected vehicle clusters follows an exponential distribution. A distribution is then derived for the number of clusters on a highway using a vehicular traffic model. From there, we are able to formulate an end-to-end delay model which extends the time delay model for two disconnected vehicle clusters to multiple disconnected clusters on a unidirectional highway. The analytical results obtained from the model are then validated through simulation.
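Under the fitted model, the end-to-end delay is a sum of independent exponential gap delays, one per pair of adjacent disconnected clusters. The sketch below simulates that sum; the parameter values are invented, and this illustrates the modelling idea rather than the paper's closed-form analysis:

```python
import random

def end_to_end_delay(num_clusters, mean_gap_delay, seed=None):
    """One sample of the total carry-and-forward delay over
    (num_clusters - 1) gaps, each an exponential random variable."""
    rng = random.Random(seed)
    return sum(rng.expovariate(1.0 / mean_gap_delay)
               for _ in range(num_clusters - 1))

def mean_delay_estimate(num_clusters, mean_gap_delay, trials=10000, seed=0):
    """Monte Carlo estimate of the expected end-to-end delay; should
    approach (num_clusters - 1) * mean_gap_delay for large trial counts."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += sum(rng.expovariate(1.0 / mean_gap_delay)
                     for _ in range(num_clusters - 1))
    return total / trials
```

With 5 clusters and a 10 s mean gap delay, the estimate converges to roughly 40 s, matching the sum-of-exponentials expectation.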

  8. Outcome of end-to-end urethroplasty in post-traumatic stricture of posterior urethra.

    Science.gov (United States)

    Hussain, Akbar; Pansota, Mudassar Saeed; Rasool, Mumtaz; Tabassum, Shafqat Ali; Ahmad, Iftikhar; Saleem, Muhammad Shahzad

    2013-04-01

To determine the outcome of delayed end-to-end anastomotic urethroplasty in blind post-traumatic stricture of the posterior urethra at our setup. Case series. Department of Urology and Renal Transplantation, Quaid-e-Azam Medical College/Bahawal Victoria Hospital, Bahawalpur, from January 2009 to June 2011. Adult patients with completely obliterated post-traumatic stricture of the posterior urethra ≤ 2 cm were included in the study. Patients with post-prostatectomy (TUR-P, TVP) stricture, stricture more than 2 cm in length, stricture with neurogenic bladder, or any perineal disease were excluded from the study. Retrograde urethrography and voiding cysto-urethrography were done in every patient to assess stricture length and location. Stricture excision and delayed end-to-end anastomosis of the urethra with spatulation was performed in every patient. The minimum follow-up period was 6 months and the maximum 18 months. There were 26 cases with road traffic accident (indirect trauma) and 14 with a history of fall/direct trauma to the perineum or urethra. The majority of the patients (57.5%) were between 16 and 30 years of age. Twelve (30.0%) patients developed complications postoperatively. The early complication of wound infection occurred in 1 (2.5%) patient. Late complications were seen in 11 (27.5%) patients, i.e., stricture recurrence in 7 (17.5%), erectile dysfunction in 2 (5.0%), and urethrocutaneous fistula and urinary incontinence in 1 patient (2.5%) each. The success rate was 70.0% initially and 87.5% overall. Delayed end-to-end anastomotic urethroplasty is an effective procedure for traumatic posterior urethral strictures, with a success rate of about 87.5%.

  9. Outcome of end-to-end urethroplasty in post-traumatic stricture of posterior urethra

    International Nuclear Information System (INIS)

    Hussain, A.; Pansota, M. S.; Rasool, M.; Tabassum, S. A.; Ahmad, I.; Saleem, M. S.

    2013-01-01

Objective: To determine the outcome of delayed end-to-end anastomotic urethroplasty in blind post-traumatic stricture of the posterior urethra at our setup. Study Design: Case series. Place and Duration of Study: Department of Urology and Renal Transplantation, Quaid-e-Azam Medical College/Bahawal Victoria Hospital, Bahawalpur, from January 2009 to June 2011. Methodology: Adult patients with completely obliterated post-traumatic stricture of the posterior urethra ≤ 2 cm were included in the study. Patients with post-prostatectomy (TUR-P, TVP) stricture, stricture more than 2 cm in length, stricture with neurogenic bladder, or any perineal disease were excluded from the study. Retrograde urethrography and voiding cysto-urethrography were done in every patient to assess stricture length and location. Stricture excision and delayed end-to-end anastomosis of the urethra with spatulation was performed in every patient. The minimum follow-up period was 6 months and the maximum 18 months. Results: There were 26 cases with road traffic accident (indirect trauma) and 14 with a history of fall/direct trauma to the perineum or urethra. The majority of the patients (57.5%) were between 16 and 30 years of age. Twelve (30.0%) patients developed complications postoperatively. The early complication of wound infection occurred in 1 (2.5%) patient. Late complications were seen in 11 (27.5%) patients, i.e., stricture recurrence in 7 (17.5%), erectile dysfunction in 2 (5.0%), and urethrocutaneous fistula and urinary incontinence in 1 patient (2.5%) each. The success rate was 70.0% initially and 87.5% overall. Conclusion: Delayed end-to-end anastomotic urethroplasty is an effective procedure for traumatic posterior urethral strictures, with a success rate of about 87.5%. (author)

  10. End-to-end experiment management in HPC

    Energy Technology Data Exchange (ETDEWEB)

    Bent, John M [Los Alamos National Laboratory; Kroiss, Ryan R [Los Alamos National Laboratory; Torrez, Alfred [Los Alamos National Laboratory; Wingate, Meghan [Los Alamos National Laboratory

    2010-01-01

Experiment management in any domain is challenging. There is a perpetual feedback loop cycling through planning, execution, measurement, and analysis. The lifetime of a particular experiment can be limited to a single cycle, although many require myriad more cycles before definite results can be obtained. Within each cycle, a large number of subexperiments may be executed in order to measure the effects of one or more independent variables. Experiment management in high performance computing (HPC) follows this general pattern but also has three unique characteristics. One, computational science applications running on large supercomputers must deal with frequent platform failures which can interrupt, perturb, or terminate running experiments. Two, these applications typically run in parallel using MPI as their communication medium. Three, there is typically a scheduling system (e.g. Condor, Moab, SGE, etc.) acting as a gate-keeper for the HPC resources. In this paper, we introduce LANL Experiment Management (LEM), an experiment management framework simplifying all four phases of experiment management. LEM simplifies experiment planning by allowing users to describe their experimental goals without having to fully construct the individual parameters for each task. To simplify execution, LEM dispatches the subexperiments itself, thereby freeing users from remembering the often arcane methods for interacting with the various scheduling systems. LEM provides transducers that automatically measure and record important information about each subexperiment; these transducers can easily be extended to collect additional measurements specific to each experiment. Finally, experiment analysis is simplified by a general database visualization framework that allows users to quickly and easily interact with their measured data.

  11. SME2EM: Smart mobile end-to-end monitoring architecture for life-long diseases.

    Science.gov (United States)

    Serhani, Mohamed Adel; Menshawy, Mohamed El; Benharref, Abdelghani

    2016-01-01

Monitoring life-long diseases requires continuous measurement and recording of physical vital signs. Most of these diseases are manifested through unexpected and non-uniform occurrences and behaviors. It is impractical to keep patients in hospitals, health-care institutions, or even at home for long periods of time. Monitoring solutions based on smartphones combined with mobile sensors and wireless communication technologies are a potential candidate to support complete freedom of mobility, not only for patients but also for physicians. However, existing monitoring architectures based on smartphones and modern communication technologies are not suitable to address some challenging issues, such as intensive and big data, resource constraints, data integration, and context awareness, in an integrated framework. This manuscript provides a novel mobile-based end-to-end architecture for live monitoring and visualization of life-long diseases. The proposed architecture provides smartness features to cope with continuous monitoring, data explosion, dynamic adaptation, unlimited mobility, and constrained device resources. The integration of the architecture's components provides information about disease recurrences as soon as they occur, to expedite taking the necessary actions and thus prevent severe consequences. Our architecture is formally model-checked to automatically verify its correctness against the designers' desirable properties at design time. Its components are fully implemented as Web services with respect to the SOA architecture, making them easy to deploy and integrate, and are supported by Cloud infrastructure and services to allow high scalability and availability of the processes and the data being stored and exchanged. The architecture's applicability is evaluated through concrete experimental scenarios on monitoring and visualizing the states of epileptic diseases. The obtained theoretical and experimental results are very promising and efficiently satisfy the proposed requirements.

  12. End-to-End Airplane Detection Using Transfer Learning in Remote Sensing Images

    Directory of Open Access Journals (Sweden)

    Zhong Chen

    2018-01-01

Airplane detection in remote sensing images remains a challenging problem due to the complexity of backgrounds. In recent years, with the development of deep learning, object detection has also seen great breakthroughs. For object detection tasks on natural images, such as the PASCAL (Pattern Analysis, Statistical Modelling and Computational Learning) VOC (Visual Object Classes) Challenge, the major trend of current development is to use a large amount of labeled classification data to pre-train a deep neural network as a base network, and then use a small amount of annotated detection data to fine-tune the network for detection. In this paper, we use object detection technology based on deep learning for airplane detection in remote sensing images. In addition to exploiting some characteristics of remote sensing images, some new data augmentation techniques are proposed. We also use transfer learning and adopt a single deep convolutional neural network and limited training samples to implement end-to-end trainable airplane detection. Classification and positioning are no longer divided into multistage tasks; end-to-end detection attempts to combine them for joint optimization, which ensures an optimal solution for the final stage. In our experiment, we use remote sensing images of airports collected from Google Earth. The experimental results show that the proposed algorithm is highly accurate and meaningful for remote sensing object detection.

  13. End-to-End Trade-space Analysis for Designing Constellation Missions

    Science.gov (United States)

    LeMoigne, J.; Dabney, P.; Foreman, V.; Grogan, P.; Hache, S.; Holland, M. P.; Hughes, S. P.; Nag, S.; Siddiqi, A.

    2017-12-01

The cost model represents an aggregate model consisting of Cost Estimating Relationships (CERs) from widely accepted models. The current GUI automatically generates graphics representing metrics such as average revisit time or coverage as a function of cost. The end-to-end system will be demonstrated as part of the presentation.

  14. End-to-end simulations of the E-ELT/METIS coronagraphs

    Science.gov (United States)

    Carlomagno, Brunella; Absil, Olivier; Kenworthy, Matthew; Ruane, Garreth; Keller, Christoph U.; Otten, Gilles; Feldt, Markus; Hippler, Stefan; Huby, Elsa; Mawet, Dimitri; Delacroix, Christian; Surdej, Jean; Habraken, Serge; Forsberg, Pontus; Karlsson, Mikael; Vargas Catalan, Ernesto; Brandl, Bernhard R.

    2016-07-01

    The direct detection of low-mass planets in the habitable zone of nearby stars is an important science case for future E-ELT instruments such as the mid-infrared imager and spectrograph METIS, which features vortex phase masks and apodizing phase plates (APP) in its baseline design. In this work, we present end-to-end performance simulations, using Fourier propagation, of several METIS coronagraphic modes, including focal-plane vortex phase masks and pupil-plane apodizing phase plates, for the centrally obscured, segmented E-ELT pupil. The atmosphere and the AO contributions are taken into account. Hybrid coronagraphs combining the advantages of vortex phase masks and APPs are considered to improve the METIS coronagraphic performance.

  15. An end-to-end secure patient information access card system.

    Science.gov (United States)

    Alkhateeb, A; Singer, H; Yakami, M; Takahashi, T

    2000-03-01

The rapid development of the Internet and the increasing interest in Internet-based solutions have promoted the idea of creating Internet-based health information applications. This will force a change in the role of IC cards in healthcare card systems, from a data carrier to an access key medium. At the Medical Informatics Department of Kyoto University Hospital we are developing a smart card patient information project in which patient databases are accessed via the Internet. Strong end-to-end data encryption via Secure Sockets Layer (SSL) is performed transparently when transmitting patient information. The smart card plays the crucial role of access key to the database: user authentication is performed internally without ever revealing the actual key. For easy acceptance by healthcare professionals, the user interface is integrated as a plug-in for two familiar Web browsers, Netscape Navigator and MS Internet Explorer.

  16. Single and Multi-bunch End-to-end Tracking in the LHeC

    CERN Document Server

    Pellegrini, D; Latina, A; Schulte, D

    2015-01-01

The LHeC study aims at delivering an electron beam for collision with the LHC proton beam. The current baseline design consists of a multi-pass superconducting energy-recovery linac operating in continuous-wave mode. The high-current beam (~100 mA) in the linacs excites long-range wake-fields between bunches of different turns, which induce instabilities and might cause beam losses. PLACET2, a novel version of the tracking code PLACET capable of handling recirculation and time dependencies, has been employed to perform the first LHeC end-to-end tracking. The impact of long-range wake-fields, synchrotron radiation, and beam-beam effects has been assessed. The simulation results and recent improvements in the lattice design are presented and discussed in this paper.

  17. Force Characteristics of the Rat Sternomastoid Muscle Reinnervated with End-to-End Nerve Repair

    Directory of Open Access Journals (Sweden)

    Stanislaw Sobotka

    2011-01-01

The goal of this study was to establish force data for the rat sternomastoid (SM) muscle after reinnervation with end-to-end nerve anastomosis (EEA), which could be used as a baseline for evaluating the efficacy of new reinnervation techniques. The SM muscle on one side was paralyzed by transecting its nerve, and EEA was then performed at different time points: immediate EEA, and EEA after a 1-month or 3-month delay. At the end of the 3-month recovery period, the magnitude of functional recovery of the reinnervated SM muscle was evaluated by measuring muscle force and comparing it with the force of the contralateral control muscle. Our results demonstrated that the immediately reinnervated SM produced approximately 60% of the maximal tetanic force of the control. The SM with delayed nerve repair yielded approximately 40% of the maximal force. This suboptimal recovery of muscle force after EEA demonstrates the importance of developing alternative surgical techniques to treat muscle paralysis.

  18. VisualCommander for Rapid End-to-End Mission Design and Simulation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal is for the development of a highly extensible and user-configurable software application for end-to-end mission simulation and design. We will leverage...

  19. Utility-Optimal Dynamic Rate Allocation under Average End-to-End Delay Requirements

    OpenAIRE

    Hajiesmaili, Mohammad H.; Talebi, Mohammad Sadegh; Khonsari, Ahmad

    2015-01-01

    QoS-aware networking applications such as real-time streaming and video surveillance systems require nearly fixed average end-to-end delay over long periods to communicate efficiently, although may tolerate some delay variations in short periods. This variability exhibits complex dynamics that makes rate control of such applications a formidable task. This paper addresses rate allocation for heterogeneous QoS-aware applications that preserves the long-term end-to-end delay constraint while, s...

  20. Minimizing End-to-End Interference in I/O Stacks Spanning Shared Multi-Level Buffer Caches

    Science.gov (United States)

    Patrick, Christina M.

    2011-01-01

    This thesis presents an end-to-end interference minimizing uniquely designed high performance I/O stack that spans multi-level shared buffer cache hierarchies accessing shared I/O servers to deliver a seamless high performance I/O stack. In this thesis, I show that I can build a superior I/O stack which minimizes the inter-application interference…

  1. SampleCNN: End-to-End Deep Convolutional Neural Networks Using Very Small Filters for Music Classification

    Directory of Open Access Journals (Sweden)

    Jongpil Lee

    2018-01-01

Full Text Available Convolutional Neural Networks (CNNs) have been applied to diverse machine learning tasks for different modalities of raw data in an end-to-end fashion. In the audio domain, a raw waveform-based approach has been explored to directly learn hierarchical characteristics of audio. However, the majority of previous studies have limited their model capacity by taking a frame-level structure similar to short-time Fourier transforms. We previously proposed a CNN architecture which learns representations using sample-level filters beyond typical frame-level input representations. The architecture showed comparable performance to the spectrogram-based CNN model in music auto-tagging. In this paper, we extend the previous work in three ways. First, considering that the sample-level model requires much longer training time, we progressively downsample the input signals and examine how this affects the performance. Second, we extend the model using a multi-level and multi-scale feature aggregation technique and subsequently conduct transfer learning for several music classification tasks. Finally, we visualize filters learned by the sample-level CNN in each layer to identify hierarchically learned features and show that they are sensitive to log-scaled frequency.
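The sample-level design above can be made concrete with a little arithmetic. The sketch below is our own illustration (not the paper's code), assuming filter size 3 and stride 3 at every layer, in the spirit of the paper's 3^n configurations: a 59,049-sample excerpt collapses to a single output frame whose receptive field spans the whole input.

```python
# Illustrative sketch: receptive-field growth in a sample-level CNN.
# Assumption (ours): every layer is a 1-D convolution with filter size 3
# and stride 3, no padding.

def receptive_field(num_layers: int, filter_size: int = 3, stride: int = 3) -> int:
    """Number of input samples seen by one output unit after num_layers conv layers."""
    rf = 1
    for _ in range(num_layers):
        rf = rf * stride + (filter_size - stride)  # standard top-down recurrence
    return rf

def output_length(n_samples: int, num_layers: int, stride: int = 3) -> int:
    """Frames remaining after num_layers stride-3 convolutions
    (assumes the length divides evenly at each layer)."""
    for _ in range(num_layers):
        n_samples //= stride
    return n_samples

print(receptive_field(10))       # 59049 -> one unit covers the whole excerpt
print(output_length(59049, 10))  # 1
```

With filter size equal to stride, the receptive field simply triples per layer, which is why sample-level models need deep stacks to summarize even a few seconds of audio.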

  2. Advanced Camera Image Cropping Approach for CNN-Based End-to-End Controls on Sustainable Computing

    Directory of Open Access Journals (Sweden)

    Yunsick Sung

    2018-03-01

Full Text Available Recent research on deep learning has been applied to a diversity of fields. In particular, numerous studies have been conducted on self-driving vehicles using end-to-end approaches based on images captured by a single camera. End-to-end controls learn the output vectors of output devices directly from the input vectors of available input devices. In other words, an end-to-end approach learns not by analyzing the meaning of input vectors, but by extracting optimal output vectors based on input vectors. Generally, when end-to-end control is applied to self-driving vehicles, the steering wheel and pedals are controlled autonomously by learning from the images captured by a camera. However, high-resolution images captured from a car cannot be directly used as inputs to Convolutional Neural Networks (CNNs) owing to memory limitations; the image size needs to be efficiently reduced. Therefore, it is necessary to extract features from captured images automatically and to generate input images by merging the parts of the images that contain the extracted features. This paper proposes a learning method for end-to-end control that generates input images for CNNs by extracting road parts from input images, identifying the edges of the extracted road parts, and merging the parts of the images that contain the detected edges. In addition, a CNN model for end-to-end control is introduced. Experiments involving the Open Racing Car Simulator (TORCS), a sustainable computing environment for cars, confirmed the effectiveness of the proposed method for self-driving by comparing the accumulated difference in the steering-wheel angle in the images generated by it with those of resized images containing the entire captured area and cropped images containing only a part of the captured area. The results showed that the proposed method reduced the accumulated difference by 0.839% and 0.850% compared to the resized images and cropped images, respectively.

  3. End-to-End Neural Optical Music Recognition of Monophonic Scores

    Directory of Open Access Journals (Sweden)

    Jorge Calvo-Zaragoza

    2018-04-01

Full Text Available Optical Music Recognition is a field of research that investigates how to computationally decode music notation from images. Despite the efforts made so far, there are hardly any complete solutions to the problem. In this work, we study the use of neural networks that work in an end-to-end manner. This is achieved by using a neural model that combines the capabilities of convolutional neural networks, which work on the input image, and recurrent neural networks, which deal with the sequential nature of the problem. Thanks to the use of the so-called Connectionist Temporal Classification loss function, these models can be trained directly from input images accompanied by their corresponding transcripts into music symbol sequences. We also present the Printed Music Scores dataset, containing more than 80,000 monodic single-staff real scores in common western notation, which is used to train and evaluate the neural approach. Our experiments demonstrate that this formulation can be carried out successfully. Additionally, we study several considerations about the codification of the output musical sequences, the convergence and scalability of the neural models, as well as the ability of this approach to locate symbols in the input score.

  4. Sleep/wake scheduling scheme for minimizing end-to-end delay in multi-hop wireless sensor networks

    Directory of Open Access Journals (Sweden)

    Madani Sajjad

    2011-01-01

Full Text Available We present a sleep/wake schedule protocol for minimizing end-to-end delay in event-driven multi-hop wireless sensor networks. In contrast to generic sleep/wake scheduling schemes, our proposed algorithm performs scheduling that is dependent on traffic loads. Nodes adapt their sleep/wake schedule based on traffic loads in response to three important factors: (a) the distance of the node from the sink node, (b) the importance of the node's location from a connectivity perspective, and (c) whether the node is in the proximity of an occurring event. Using these heuristics, the proposed scheme reduces end-to-end delay and maximizes throughput by minimizing congestion at nodes with heavy traffic loads. Simulations are carried out to evaluate the performance of the proposed protocol by comparing it with the S-MAC and Anycast protocols. Simulation results demonstrate that the proposed protocol significantly reduces end-to-end delay and improves other QoS parameters, such as average energy per packet, average delay, packet loss ratio, throughput, and coverage lifetime.
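As an illustration of how the three load-dependent factors could shape a schedule, a toy duty-cycle rule might look like the following. The formula, weights, and function name are purely hypothetical assumptions of ours, not the protocol's actual equations:

```python
# Hypothetical sketch of a traffic-aware wake-interval rule: nodes that are
# likely to relay more traffic wake more often (shorter interval).
# All weights below are illustrative assumptions.

def wake_interval(base_ms, hops_to_sink, is_cut_vertex, near_event):
    """Wake interval in ms; smaller means the node is awake more often."""
    interval = base_ms * (1 + hops_to_sink / 10)  # far nodes relay less, sleep longer
    if is_cut_vertex:
        interval /= 2   # connectivity-critical nodes stay alert
    if near_event:
        interval /= 4   # the event region generates the traffic
    return interval

# A connectivity-critical node five hops from the sink, no nearby event:
print(wake_interval(100.0, 5, True, False))   # 75.0
# A leaf node at the sink's edge with default load:
print(wake_interval(100.0, 0, False, False))  # 100.0
```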

  5. MRI simulation: end-to-end testing for prostate radiation therapy using geometric pelvic MRI phantoms

    International Nuclear Information System (INIS)

    Sun, Jidi; Menk, Fred; Lambert, Jonathan; Martin, Jarad; Denham, James W; Greer, Peter B; Dowling, Jason; Rivest-Henault, David; Pichler, Peter; Parker, Joel; Arm, Jameen; Best, Leah

    2015-01-01

To clinically implement MRI simulation or MRI-alone treatment planning requires comprehensive end-to-end testing to ensure an accurate process. The purpose of this study was to design and build a geometric phantom simulating a human male pelvis that is suitable for both CT and MRI scanning, and to use it to test geometric and dosimetric aspects of MRI simulation, including treatment planning and digitally reconstructed radiograph (DRR) generation. A liquid-filled pelvic-shaped phantom with simulated pelvic organs was scanned in a 3T MRI simulator with dedicated radiotherapy couch-top, laser bridge and pelvic coil mounts. A second phantom with the same external shape but with an internal distortion grid was used to quantify the distortion of the MR image. Both phantoms were also CT scanned as the gold standard for both geometry and dosimetry. Deformable image registration was used to quantify the MR distortion. Dose comparison was made using a seven-field IMRT plan developed on the CT scan, with the fluences copied to the MR image and recalculated using bulk electron densities. Without correction, the maximum distortion of the MR compared with the CT scan was 7.5 mm across the pelvis, while this was reduced to 2.6 and 1.7 mm by the vendor's 2D and 3D correction algorithms, respectively. Within the locations of the internal organs of interest, the distortion was <1.5 and <1 mm with the 2D and 3D correction algorithms, respectively. The dose at the prostate isocentre calculated on CT and MRI images differed by 0.01% (1.1 cGy). Positioning shifts were within 1 mm when setup was performed using MRI-generated DRRs compared to setup using CT DRRs. The MRI pelvic phantom allows end-to-end testing of the MRI simulation workflow with comparison to the gold-standard CT-based process. MRI simulation was found to be geometrically accurate, with organ dimensions, dose distributions and DRR-based setup within acceptable limits compared to CT. (paper)
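The distortion figures above reduce to displacement-vector magnitudes over the deformation field produced by the registration; the reported values are maxima over the phantom or over organ regions. A minimal sketch of that reduction, with made-up displacement values:

```python
# Sketch of distortion quantification from a deformable-registration result:
# distortion at a grid point is the magnitude of its displacement vector (mm),
# and the reported figure is the maximum over a region. Values are made up.

import math

def max_distortion(displacements_mm):
    """Largest displacement-vector magnitude (mm) in a deformation field."""
    return max(math.sqrt(dx * dx + dy * dy + dz * dz)
               for dx, dy, dz in displacements_mm)

field = [(0.5, 0.0, 0.0), (3.0, 4.0, 0.0), (0.3, 0.4, 0.0)]
print(max_distortion(field))  # 5.0
```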

  6. Circumferential resection and "Z"-shape plastic end-to-end anastomosis of canine trachea.

    Science.gov (United States)

    Zhao, H; Li, Z; Fang, J; Fang, C

    1999-03-01

To prevent anastomotic stricture of the trachea. Forty young mongrel dogs, weighing 5-7 kg, were randomly divided into two groups, an experimental group and a control group, with 20 dogs in each. Four tracheal rings were removed from each dog. In the experimental group, two "Z"-shape tracheoplastic anastomoses were performed on each dog, one on the anterior wall and the other on the membranous part of the trachea. In the control group, each dog received only simple end-to-end anastomosis. Vicryl 3-0 absorbable suture and OB fibrin glue were used for both groups. All dogs were killed when their body weight doubled. The average sagittal stenotic ratios were 1.20 +/- 0.12 for the experimental group and 0.83 +/- 0.05 for the control group. The average cross-sectional area stenotic ratios were 0.90 +/- 0.12 and 0.69 +/- 0.09, and T values were 8.71 and 4.57 for the two groups, a statistically significant difference. "Z"-shape tracheoplastic anastomosis is thus superior to simple end-to-end anastomosis in preventing anastomotic stricture of the canine trachea.

  7. An end-to-end assessment of range uncertainty in proton therapy using animal tissues

    Science.gov (United States)

    Zheng, Yuanshui; Kang, Yixiu; Zeidan, Omar; Schreuder, Niek

    2016-11-01

    Accurate assessment of range uncertainty is critical in proton therapy. However, there is a lack of data and consensus on how to evaluate the appropriate amount of uncertainty. The purpose of this study is to quantify the range uncertainty in various treatment conditions in proton therapy, using transmission measurements through various animal tissues. Animal tissues, including a pig head, beef steak, and lamb leg, were used in this study. For each tissue, an end-to-end test closely imitating patient treatments was performed. This included CT scan simulation, treatment planning, image-guided alignment, and beam delivery. Radio-chromic films were placed at various depths in the distal dose falloff region to measure depth dose. Comparisons between measured and calculated doses were used to evaluate range differences. The dose difference at the distal falloff between measurement and calculation depends on tissue type and treatment conditions. The estimated range difference was up to 5, 6 and 4 mm for the pig head, beef steak, and lamb leg irradiation, respectively. Our study shows that the TPS was able to calculate proton range within about 1.5% plus 1.5 mm. Accurate assessment of range uncertainty in treatment planning would allow better optimization of proton beam treatment, thus fully achieving proton beams’ superior dose advantage over conventional photon-based radiation therapy.
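The reported agreement, about 1.5% plus 1.5 mm, amounts to a combined relative-plus-absolute tolerance on the proton range. A small sketch of that acceptance test (the function name and example values are ours):

```python
# Sketch of the acceptance criterion implied by the finding that the TPS
# calculated proton range within about 1.5% plus 1.5 mm.

def within_range_tolerance(measured_mm, calculated_mm,
                           rel_tol=0.015, abs_tol_mm=1.5):
    """True if measured range agrees with the TPS-calculated range."""
    allowed = rel_tol * calculated_mm + abs_tol_mm
    return abs(measured_mm - calculated_mm) <= allowed

# A 150 mm calculated range allows 0.015 * 150 + 1.5 = 3.75 mm disagreement.
print(within_range_tolerance(153.0, 150.0))  # True
print(within_range_tolerance(156.0, 150.0))  # False
```

The observed 4-6 mm differences in the animal-tissue measurements above would fail this test for shallow targets but fall within it for sufficiently deep ones, which is why range margins are usually stated in exactly this mixed form.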

  8. An end-to-end communications architecture for condition-based maintenance applications

    Science.gov (United States)

    Kroculick, Joseph

    2014-06-01

    This paper explores challenges in implementing an end-to-end communications architecture for Condition-Based Maintenance Plus (CBM+) data transmission which aligns with the Army's Network Modernization Strategy. The Army's Network Modernization strategy is based on rolling out network capabilities which connect the smallest unit and Soldier level to enterprise systems. CBM+ is a continuous improvement initiative over the life cycle of a weapon system or equipment to improve the reliability and maintenance effectiveness of Department of Defense (DoD) systems. CBM+ depends on the collection, processing and transport of large volumes of data. An important capability that enables CBM+ is an end-to-end network architecture that enables data to be uploaded from the platform at the tactical level to enterprise data analysis tools. To connect end-to-end maintenance processes in the Army's supply chain, a CBM+ network capability can be developed from available network capabilities.

  9. End-to-End Multimodal Emotion Recognition Using Deep Neural Networks

    Science.gov (United States)

    Tzirakis, Panagiotis; Trigeorgis, George; Nicolaou, Mihalis A.; Schuller, Bjorn W.; Zafeiriou, Stefanos

    2017-12-01

Automatic affect recognition is a challenging task due to the various modalities emotions can be expressed with. Applications can be found in many domains, including multimedia retrieval and human computer interaction. In recent years, deep neural networks have been used with great success in determining emotional states. Inspired by this success, we propose an emotion recognition system using auditory and visual modalities. To capture the emotional content of various styles of speaking, robust features need to be extracted. To this end, we utilize a Convolutional Neural Network (CNN) to extract features from the speech, while for the visual modality we use a deep residual network (ResNet) of 50 layers. In addition to the importance of feature extraction, a machine learning algorithm also needs to be insensitive to outliers while being able to model the context. To tackle this problem, Long Short-Term Memory (LSTM) networks are utilized. The system is then trained in an end-to-end fashion where - by also taking advantage of the correlations of each of the streams - we manage to significantly outperform the traditional approaches based on auditory and visual handcrafted features for the prediction of spontaneous and natural emotions on the RECOLA database of the AVEC 2016 research challenge on emotion recognition.

  10. Semantic Complex Event Processing over End-to-End Data Flows

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi [University of Southern California; Simmhan, Yogesh; Prasanna, Viktor K.

    2012-04-01

Emerging Complex Event Processing (CEP) applications in cyber physical systems like Smart Power Grids present novel challenges for end-to-end analysis over events flowing from heterogeneous information sources to persistent knowledge repositories. CEP for these applications must support two distinctive features - easy specification of patterns over diverse information streams, and integrated pattern detection over real-time and historical events. Existing work on CEP has been limited to relational query patterns, and to engines that match only events arriving after the query has been registered. We propose SCEPter, a semantic complex event processing framework which uniformly processes queries over continuous and archived events. SCEPter is built around an existing CEP engine with innovative support for semantic event pattern specification, and allows their seamless detection over past, present and future events. Specifically, we describe a unified semantic query model that can operate over data flowing through event streams to event repositories. Compile-time and runtime semantic patterns are distinguished and addressed separately for efficiency. Query rewriting is examined and analyzed in the context of the temporal boundaries that exist between event streams and their repository, to avoid duplicate or missing results. The design and prototype implementation of SCEPter are analyzed using latency and throughput metrics for scenarios from the Smart Grid domain.

  11. Human Assisted Robotic Vehicle Studies - A conceptual end-to-end mission architecture

    NARCIS (Netherlands)

    Lehner, B.; Mazzotta, D. G.; Teeney, L.; Spina, F.; Filosa, A.; Pou, A. Canals; Schlechten, J.; Campbell, S.; Soriano, P. López

    2017-01-01

With current space exploration roadmaps indicating the Moon as a proving ground on the way to human exploration of Mars, it is clear that human-robotic partnerships will play a key role in successful future human space missions. This paper details a conceptual end-to-end architecture for an…

  12. Strategic design issues of IMS versus end-to-end architectures

    NARCIS (Netherlands)

    Braet, O.; Ballon, P.

    2007-01-01

    Purpose - The paper aims to discuss the business issues surrounding the choice between the end-to-end internet architecture, in particular peer-to-peer networks, versus managed telecommunications architectures, in particular IMS, for the migration towards a next-generation mobile system.

  13. Coupling of a single quantum emitter to end-to-end aligned silver nanowires

    DEFF Research Database (Denmark)

    Kumar, Shailesh; Huck, Alexander; Chen, Yuntian

    2013-01-01

    We report on the observation of coupling a single nitrogen vacancy (NV) center in a nanodiamond crystal to a propagating plasmonic mode of silver nanowires. The nanocrystal is placed either near the apex of a single silver nanowire or in the gap between two end-to-end aligned silver nanowires. We...

  14. Status report of the end-to-end ASKAP software system: towards early science operations

    Science.gov (United States)

    Guzman, Juan Carlos; Chapman, Jessica; Marquarding, Malte; Whiting, Matthew

    2016-08-01

…300 MHz bandwidth for Array Release 1, followed by the deployment of the real-time data processing components. In addition to the Central Processor, the first production release of the CSIRO ASKAP Science Data Archive (CASDA) has also been deployed in one of the Pawsey Supercomputing Centre facilities and is integrated into the end-to-end ASKAP data flow system. This paper describes the current status of the "end-to-end" data flow software system, from preparing observations to data acquisition, processing and archiving, and the challenges of integrating an HPC facility as a key part of the instrument. It also shares some lessons learned since the start of integration activities and the challenges ahead in preparation for the start of the Early Science program.

  15. SPoRT - An End-to-End R2O Activity

    Science.gov (United States)

    Jedlovec, Gary J.

    2009-01-01

Established in 2002 to demonstrate the weather and forecasting applications of real-time EOS measurements, the Short-term Prediction Research and Transition (SPoRT) program has grown into an end-to-end research-to-operations activity focused on the use of advanced NASA modeling and data assimilation approaches, nowcasting techniques, and unique high-resolution multispectral observational data applications from EOS satellites to improve short-term weather forecasts on a regional and local scale. SPoRT currently partners with several universities and other government agencies for access to real-time data and products, and works collaboratively with them and operational end users at 13 WFOs to develop and test the new products and capabilities in a "test-bed" mode. The test-bed simulates key aspects of the operational environment without putting constraints on the forecaster workload. Products and capabilities which show utility in the test-bed environment are then transitioned experimentally into the operational environment for further evaluation and assessment. SPoRT focuses on a suite of data and products from MODIS, AMSR-E, and AIRS on the NASA Terra and Aqua satellites, and total lightning measurements from ground-based networks. Some of the observations are assimilated into or used with various versions of the WRF model to provide supplemental forecast guidance to operational end users. SPoRT is enhancing partnerships with NOAA/NESDIS for new product development and data access to exploit the remote sensing capabilities of instruments on the NPOESS satellites to address short-term weather forecasting problems. The VIIRS and CrIS instruments on the NPP and follow-on NPOESS satellites provide similar observing capabilities to the MODIS and AIRS instruments on Terra and Aqua. SPoRT will be transitioning existing and new capabilities into the AWIPS II environment to ensure the continuity of its activities.

  16. Understanding Effect of Constraint Release Environment on End-to-End Vector Relaxation of Linear Polymer Chains

    KAUST Repository

    Shivokhin, Maksim E.

    2017-05-30

We propose and verify methods based on the slip-spring (SSp) model [Macromolecules 2005, 38, 14] for predicting the effect of any monodisperse, binary, or ternary environment of topological constraints on the relaxation of the end-to-end vector of a linear probe chain. For this purpose we first validate the ability of the model to consistently predict both the viscoelastic and dielectric response of monodisperse and binary mixtures of type A polymers, based on published experimental data. We also report the synthesis of new binary and ternary polybutadiene systems, the measurement of their linear viscoelastic response, and the prediction of these data by the SSp model. We next clarify the relaxation mechanisms of probe chains in these constraint release (CR) environments by analyzing a set of "toy" SSp models with simplified constraint release rates and by examining fluctuations of the end-to-end vector. In our analysis, the longest relaxation time of the probe chain is determined by a competition between the longest relaxation times of the effective CR motions of the fat and thin tubes and the motion of the chain itself in the thin tube. This picture is tested by the analysis of four model systems designed to separate and estimate every single contribution involved in the relaxation of the probe's end-to-end vector in polydisperse systems. We follow the CR picture of Viovy et al. [Macromolecules 1991, 24, 3587] and refine the effective chain friction in the thin and fat tubes based on Read et al. [J. Rheol. 2012, 56, 823]. The derived analytical equations form a basis for generalizing the proposed methodology to polydisperse mixtures of linear and branched polymers. The consistency between the SSp model and tube model predictions is a strong indicator of the compatibility between these two distinct mesoscopic frameworks.

  17. jade: An End-To-End Data Transfer and Catalog Tool

    Science.gov (United States)

    Meade, P.

    2017-10-01

    The IceCube Neutrino Observatory is a cubic kilometer neutrino telescope located at the Geographic South Pole. IceCube collects 1 TB of data every day. An online filtering farm processes this data in real time and selects 10% to be sent via satellite to the main data center at the University of Wisconsin-Madison. IceCube has two year-round on-site operators. New operators are hired every year, due to the hard conditions of wintering at the South Pole. These operators are tasked with the daily operations of running a complex detector in serious isolation conditions. One of the systems they operate is the data archiving and transfer system. Due to these challenging operational conditions, the data archive and transfer system must above all be simple and robust. It must also share the limited resource of satellite bandwidth, and collect and preserve useful metadata. The original data archive and transfer software for IceCube was written in 2005. After running in production for several years, the decision was taken to fully rewrite it, in order to address a number of structural drawbacks. The new data archive and transfer software (JADE2) has been in production for several months providing improved performance and resiliency. One of the main goals for JADE2 is to provide a unified system that handles the IceCube data end-to-end: from collection at the South Pole, all the way to long-term archive and preservation in dedicated repositories at the North. In this contribution, we describe our experiences and lessons learned from developing and operating the data archive and transfer software for a particle physics experiment in extreme operational conditions like IceCube.

  18. End-to-End Tracking and Semantic Segmentation Using Recurrent Neural Networks

    OpenAIRE

    Ondruska, Peter; Dequaire, Julie; Wang, Dominic Zeng; Posner, Ingmar

    2016-01-01

    In this work we present a novel end-to-end framework for tracking and classifying a robot's surroundings in complex, dynamic and only partially observable real-world environments. The approach deploys a recurrent neural network to filter an input stream of raw laser measurements in order to directly infer object locations, along with their identity in both visible and occluded areas. To achieve this we first train the network using unsupervised Deep Tracking, a recently proposed theoretical f...

  19. A Network-based End-to-End Trainable Task-oriented Dialogue System

    OpenAIRE

    Wen, Tsung-Hsien; Vandyke, David; Mrksic, Nikola; Gasic, Milica; Rojas-Barahona, Lina M.; Su, Pei-Hao; Ultes, Stefan; Young, Steve

    2016-01-01

    Teaching machines to accomplish tasks by conversing naturally with humans is challenging. Currently, developing task-oriented dialogue systems requires creating multiple components and typically this involves either a large amount of handcrafting, or acquiring costly labelled datasets to solve a statistical learning problem for each component. In this work we introduce a neural network-based text-in, text-out end-to-end trainable goal-oriented dialogue system along with a new way of collectin...

  20. Adaptation and validation of a commercial head phantom for cranial radiosurgery dosimetry end-to-end audit.

    Science.gov (United States)

    Dimitriadis, Alexis; Palmer, Antony L; Thomas, Russell A S; Nisbet, Andrew; Clark, Catharine H

    2017-06-01

    To adapt and validate an anthropomorphic head phantom for use in a cranial radiosurgery audit. Two bespoke inserts were produced for the phantom: one for providing the target and organ at risk for delineation and the other for performing dose measurements. The inserts were tested to assess their positional accuracy. A basic treatment plan dose verification with an ionization chamber was performed to establish a baseline accuracy for the phantom and beam model. The phantom and inserts were then used to perform dose verification measurements of a radiosurgery plan. The dose was measured with alanine pellets, EBT extended dose film and a plastic scintillation detector (PSD). Both inserts showed reproducible positioning (±0.5 mm) and good positional agreement between them (±0.6 mm). The basic treatment plan measurements showed agreement to the treatment planning system (TPS) within 0.5%. Repeated film measurements showed consistent gamma passing rates with good agreement to the TPS. For 2%-2 mm global gamma, the mean passing rate was 96.7% and the variation in passing rates did not exceed 2.1%. The alanine pellets and PSD showed good agreement with the TPS (-0.1% and 0.3% dose difference in the target) and good agreement with each other (within 1%). The adaptations to the phantom showed acceptable accuracies. The presence of alanine and PSD do not affect film measurements significantly, enabling simultaneous measurements by all three detectors. Advances in knowledge: A novel method for thorough end-to-end test of radiosurgery, with capability to incorporate all steps of the clinical pathway in a time-efficient and reproducible manner, suitable for a national audit.
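The film comparisons above use global gamma analysis with 2%-2 mm criteria. A minimal one-dimensional implementation illustrates the computation; this is our simplified sketch (global normalization to the reference maximum), not the audit's actual software, which operates on 2-D film planes:

```python
# Illustrative 1-D global gamma analysis (2% dose, 2 mm distance-to-agreement).

def gamma_index(x_eval, d_eval, x_ref, d_ref, dose_crit=0.02, dist_mm=2.0):
    """Gamma value for one evaluated point against a reference dose profile."""
    d_max = max(d_ref)  # global normalization
    best = float("inf")
    for xr, dr in zip(x_ref, d_ref):
        g = ((x_eval - xr) / dist_mm) ** 2 + ((d_eval - dr) / (dose_crit * d_max)) ** 2
        best = min(best, g)
    return best ** 0.5

def passing_rate(xs, doses, x_ref, d_ref):
    """Percentage of evaluated points with gamma <= 1."""
    ok = sum(1 for x, d in zip(xs, doses) if gamma_index(x, d, x_ref, d_ref) <= 1.0)
    return 100.0 * ok / len(xs)

x_ref = [0.0, 1.0, 2.0, 3.0]          # positions in mm
d_ref = [100.0, 100.0, 50.0, 0.0]     # TPS-calculated doses
measured = [101.0, 99.0, 52.0, 0.5]   # film-measured doses
print(passing_rate(x_ref, measured, x_ref, d_ref))  # 100.0
```

A point passes if some reference point agrees with it within the combined dose-distance ellipse, which is why gamma tolerates small spatial shifts in steep gradients that a pure dose-difference test would fail.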

  1. CHEETAH: circuit-switched high-speed end-to-end transport architecture

    Science.gov (United States)

    Veeraraghavan, Malathi; Zheng, Xuan; Lee, Hyuk; Gardner, M.; Feng, Wuchun

    2003-10-01

    Leveraging the dominance of Ethernet in LANs and SONET/SDH in MANs and WANs, we propose a service called CHEETAH (Circuit-switched High-speed End-to-End Transport ArcHitecture). The service concept is to provide end hosts with high-speed, end-to-end circuit connectivity on a call-by-call shared basis, where a "circuit" consists of Ethernet segments at the ends that are mapped into Ethernet-over-SONET long-distance circuits. This paper focuses on the file-transfer application for such circuits. For this application, the CHEETAH service is proposed as an add-on to the primary Internet access service already in place for enterprise hosts. This allows an end host that is sending a file to first attempt setting up an end-to-end Ethernet/EoS circuit, and if rejected, fall back to the TCP/IP path. If the circuit setup is successful, the end host will enjoy a much shorter file-transfer delay than on the TCP/IP path. To determine the conditions under which an end host with access to the CHEETAH service should attempt circuit setup, we analyze mean file-transfer delays as a function of call blocking probability in the circuit-switched network, probability of packet loss in the IP network, round-trip times, link rates, and so on.
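The delay analysis described above can be illustrated with a toy model we added: the expected file-transfer delay when circuit setup, blocked with probability p_block, is attempted before falling back to the TCP/IP path. The fixed setup cost, the absence of a packet-loss model, and all parameter values are our simplifying assumptions:

```python
# Toy model (ours) of mean file-transfer delay under the CHEETAH fallback
# scheme: try an end-to-end circuit first; on blocking, use the TCP/IP path.

def mean_delay(size_bits, p_block, circuit_rate, tcp_rate,
               setup_s=0.1, rtt_s=0.05):
    """Expected delay (s) with circuit-first transfer and TCP fallback."""
    circuit = setup_s + size_bits / circuit_rate       # successful circuit
    tcp = rtt_s + size_bits / tcp_rate                 # fallback path
    return (1 - p_block) * circuit + p_block * (setup_s + tcp)

# A 1 Gbit file over a 1 Gb/s circuit vs a 100 Mb/s TCP path,
# with 20% call blocking in the circuit-switched network.
d = mean_delay(1e9, p_block=0.2, circuit_rate=1e9, tcp_rate=1e8)
print(round(d, 3))  # 2.91
```

Even with substantial blocking, the expected delay stays far below the pure TCP figure here, which is the quantitative argument for attempting circuit setup first.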

  2. An End-to-End System to Enable Quick, Easy and Inexpensive Deployment of Hydrometeorological Stations

    Science.gov (United States)

    Celicourt, P.; Piasecki, M.

    2014-12-01

The high cost of hydro-meteorological data acquisition, communication and publication systems, along with limited qualified human resources, is considered the main reason why hydro-meteorological data collection remains a challenge, especially in developing countries. Despite significant advances in sensor network technologies, which in the last two decades gave birth to open hardware and software and to low-cost (less than $50), low-power (on the order of a few milliwatts) sensor platforms, sensor and sensor network deployment remains a labor-intensive, time-consuming, cumbersome, and thus expensive task. These factors give rise to the need for an affordable, simple-to-deploy, scalable and self-organizing end-to-end (from sensor to publication) system suitable for deployment in such countries. The envisioned system will consist of a few Sensed-And-Programmed Arduino-based sensor nodes with low-cost sensors measuring parameters relevant to hydrological processes, and a Raspberry Pi micro-computer hosting the in-the-field back-end data management. The latter comprises the Python/Django model of the CUAHSI Observations Data Model (ODM), namely DjangODM, backed by a PostgreSQL database server. We are also developing a Python-based data processing script which will be paired with the data autoloading capability of Django to populate the DjangODM database with the incoming data. To publish the data, we will use WOFpy (WaterOneFlow Web Services in Python), developed by the Texas Water Development Board for 'Water Data for Texas', which can produce WaterML web services from a variety of back-end database installations such as SQLite, MySQL, and PostgreSQL. A step further would be the development of an appealing online visualization tool using Python statistics and analytics tools (SciPy, NumPy, Pandas) showing the spatial distribution of variables across an entire watershed as a time-variant layer on top of a basemap.

  3. IDENTIFYING ELUSIVE ELECTROMAGNETIC COUNTERPARTS TO GRAVITATIONAL WAVE MERGERS: AN END-TO-END SIMULATION

    International Nuclear Information System (INIS)

    Nissanke, Samaya; Georgieva, Alexandra; Kasliwal, Mansi

    2013-01-01

Combined gravitational wave (GW) and electromagnetic (EM) observations of compact binary mergers should enable detailed studies of astrophysical processes in the strong-field gravity regime. This decade, ground-based GW interferometers promise to routinely detect compact binary mergers. Unfortunately, networks of GW interferometers have poor angular resolution on the sky and their EM signatures are predicted to be faint. Therefore, a challenging goal will be to unambiguously pinpoint the EM counterparts of GW mergers. We perform the first comprehensive end-to-end simulation that focuses on: (1) GW sky localization, distance measures, and volume errors with two compact binary populations and four different GW networks; (2) subsequent EM detectability by a slew of multiwavelength telescopes; and (3) final identification of the merger counterpart amidst a sea of possible astrophysical false positives. First, we find that double neutron star binary mergers can be detected out to a maximum distance of 400 Mpc (or 750 Mpc) by three (or five) detector GW networks, respectively. Neutron-star-black-hole binary mergers can be detected a factor of 1.5 further out; their median to maximum sky localizations are 50-170 deg² (or 6-65 deg²) for a three (or five) detector GW network. Second, by optimizing depth, cadence, and sky area, we quantify relative fractions of optical counterparts that are detectable by a suite of different aperture-size telescopes across the globe. Third, we present five case studies to illustrate the diversity of scenarios in secure identification of the EM counterpart. We discuss the case of a typical binary, neither beamed nor nearby, and the challenges associated with identifying an EM counterpart at both low and high Galactic latitudes. For the first time, we demonstrate how construction of low-latency GW volumes in conjunction with local universe galaxy catalogs can help solve the problem of false positives. We conclude with strategies that would…

  4. Analysis of the relationship between end-to-end distance and activity of single-chain antibody against colorectal carcinoma.

    Science.gov (United States)

    Zhang, Jianhua; Liu, Shanhong; Shang, Zhigang; Shi, Li; Yun, Jun

    2012-08-22

    We investigated the relationship between the end-to-end distance of VH and VL with different peptide linkers and the activity of single-chain antibodies by computer-aided simulation. First, we developed (G4S)n (where n = 1-9) as the linker to connect VH and VL, and estimated the 3D structure of the single-chain Fv antibody (scFv) by homology modeling. After the molecular models were evaluated and optimized, the coordinate system of every protein was built and unified into one coordinate system, and end-to-end distances were calculated using 3D space coordinates. After expression and purification of scFv-n with (G4S)n as n = 1, 3, 5, 7 or 9, the immunoreactivity of purified ND-1 scFv-n was determined by ELISA. A multi-factorial relationship model was employed to analyze the structural factors affecting scFv: r_n = (AB_n - AB_O)^2 + (CD_n - CD_O)^2 + (BC_n - BC_st)^2. The relationship between immunoreactivity and r-values revealed that the fusion protein structure approached the desired state when the r-value = 3. The immunoreactivity declined as the r-value increased, but when the r-value exceeded a certain threshold, it stabilized. We used a linear relationship to analyze structural factors affecting scFv immunoreactivity.
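
Once every model shares one coordinate system, as described above, the end-to-end distance computation reduces to a Euclidean distance between two 3D points. A minimal Python sketch (the coordinates below are invented for illustration, not taken from the paper):

```python
import math

def end_to_end_distance(p1, p2):
    """Euclidean distance between two 3D points, e.g. the C-terminus of VH
    and the N-terminus of VL taken from the unified coordinate system."""
    return math.dist(p1, p2)

# Hypothetical coordinates (in angstroms) for the two linker termini:
vh_c_term = (12.4, 3.1, -7.8)
vl_n_term = (18.0, -1.2, 2.5)
print(round(end_to_end_distance(vh_c_term, vl_n_term), 2))  # ≈ 12.49
```

The same function applied to each scFv-n model yields the distance series that is then related to the measured immunoreactivity.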

  5. End-to-end self-assembly of gold nanorods in isopropanol solution: experimental and theoretical studies

    Energy Technology Data Exchange (ETDEWEB)

    Gordel, M., E-mail: marta.gordel@pwr.edu.pl [Wrocław University of Technology, Advanced Materials Engineering and Modelling Group, Faculty of Chemistry (Poland); Piela, K., E-mail: katarzyna.piela@pwr.edu.pl [Wrocław University of Technology, Department of Physical and Quantum Chemistry (Poland); Kołkowski, R. [Wrocław University of Technology, Advanced Materials Engineering and Modelling Group, Faculty of Chemistry (Poland); Koźlecki, T. [Wrocław University of Technology, Department of Chemical Engineering, Faculty of Chemistry (Poland); Buckle, M. [CNRS, École Normale Supérieure de Cachan, Laboratoire de Biologie et Pharmacologie Appliquée (France); Samoć, M. [Wrocław University of Technology, Advanced Materials Engineering and Modelling Group, Faculty of Chemistry (Poland)

    2015-12-15

    We describe here a modification of properties of colloidal gold nanorods (NRs) resulting from the chemical treatment used to carry out their transfer into isopropanol (IPA) solution. The NRs acquire a tendency to attach one to another by their ends (end-to-end assembly). We focus on the investigation of the change in position and shape of the longitudinal surface plasmon (l-SPR) band after self-assembly. The experimental results are supported by a theoretical calculation, which rationalizes the dramatic change in optical properties when the NRs are positioned end-to-end at short distances. The detailed spectroscopic characterization performed at the consecutive stages of transfer of the NRs from water into IPA solution revealed the features of the interaction between the polymers used as ligands and their contribution to the final stage, when the NRs were dispersed in IPA solution. The efficient method of aligning the NRs detailed here may facilitate applications of the self-assembled NRs as building blocks for optical materials and biological sensing.

  6. End-to-end simulation and verification of GNC and robotic systems considering both space segment and ground segment

    Science.gov (United States)

    Benninghoff, Heike; Rems, Florian; Risse, Eicke; Brunner, Bernhard; Stelzer, Martin; Krenn, Rainer; Reiner, Matthias; Stangl, Christian; Gnat, Marcin

    2018-01-01

    In the framework of a project called on-orbit servicing end-to-end simulation, the final approach and capture of a tumbling client satellite in an on-orbit servicing mission are simulated. The necessary components are developed and the entire end-to-end chain is tested and verified. This involves both on-board and on-ground systems. The space segment comprises a passive client satellite and an active service satellite with its rendezvous and berthing payload. The space segment is simulated using a software satellite simulator and two robotic, hardware-in-the-loop test beds, the European Proximity Operations Simulator (EPOS) 2.0 and the OOS-Sim. The ground segment is established as for a real servicing mission, such that realistic operations can be performed from the different consoles in the control room. During the simulation of the telerobotic operation, it is important to provide a realistic communication environment, with parameters such as delay and jitter set to the values that occur in the real world.

  7. End-to-end self-assembly of gold nanorods in isopropanol solution: experimental and theoretical studies

    Science.gov (United States)

    Gordel, M.; Piela, K.; Kołkowski, R.; Koźlecki, T.; Buckle, M.; Samoć, M.

    2015-12-01

    We describe here a modification of properties of colloidal gold nanorods (NRs) resulting from the chemical treatment used to carry out their transfer into isopropanol (IPA) solution. The NRs acquire a tendency to attach one to another by their ends (end-to-end assembly). We focus on the investigation of the change in position and shape of the longitudinal surface plasmon (l-SPR) band after self-assembly. The experimental results are supported by a theoretical calculation, which rationalizes the dramatic change in optical properties when the NRs are positioned end-to-end at short distances. The detailed spectroscopic characterization performed at the consecutive stages of transfer of the NRs from water into IPA solution revealed the features of the interaction between the polymers used as ligands and their contribution to the final stage, when the NRs were dispersed in IPA solution. The efficient method of aligning the NRs detailed here may facilitate applications of the self-assembled NRs as building blocks for optical materials and biological sensing.

  8. END-TO-END DEPTH FROM MOTION WITH STABILIZED MONOCULAR VIDEOS

    Directory of Open Access Journals (Sweden)

    C. Pinard

    2017-08-01

    We propose a depth map inference system from monocular videos based on a novel dataset for navigation that mimics aerial footage from a gimbal-stabilized monocular camera in rigid scenes. Unlike most navigation datasets, the lack of rotation implies an easier structure-from-motion problem, which can be leveraged for different kinds of tasks such as depth inference and obstacle avoidance. We also propose an architecture for end-to-end depth inference with a fully convolutional network. Results show that although tied to the camera's intrinsic parameters, the problem is locally solvable and leads to good quality depth prediction.

  9. Weighted-DESYNC and Its Application to End-to-End Throughput Fairness in Wireless Multihop Network

    Directory of Open Access Journals (Sweden)

    Ui-Seong Yu

    2017-01-01

    The end-to-end throughput of a routing path in a wireless multihop network is restricted by a bottleneck node that has the smallest bandwidth among the nodes on the routing path. In this study, we propose a method for resolving the bottleneck-node problem in multihop networks, based on the multihop DESYNC (MH-DESYNC) algorithm, a bioinspired resource allocation method developed for use in multihop environments that enables fair resource allocation among nearby (up to two hops) neighbors. Based on MH-DESYNC, we newly propose weighted-DESYNC (W-DESYNC) as a tool to artificially control the amount of resource allocated to a specific user and thus to achieve throughput fairness over a routing path. The proposed W-DESYNC employs the weight factor of a link to determine the amount of bandwidth allocated to a node. By letting the weight factor be the link quality of a routing path and making it the same across the routing path via the Cucker-Smale flocking model, we can obtain throughput fairness over the routing path. The simulation results show that the proposed algorithm achieves throughput fairness over a routing path and can increase total end-to-end throughput in wireless multihop networks.
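
Driving one weight factor to a common value across the path is, at its core, a consensus process. A toy Python sketch of that equalization step (the interaction rule and constants here are illustrative, not the paper's exact Cucker-Smale kernel):

```python
def equalize_weights(weights, coupling=0.5, iters=50):
    """Cucker-Smale-style consensus: each link on the routing path nudges
    its weight toward the mean of its neighbors' weights until all agree.
    Illustrative sketch only; the paper's interaction kernel is not shown."""
    w = list(weights)
    n = len(w)
    for _ in range(iters):
        new = w[:]
        for i in range(n):
            nbrs = [w[j] for j in (i - 1, i + 1) if 0 <= j < n]
            mean_nbr = sum(nbrs) / len(nbrs)
            new[i] = w[i] + coupling * (mean_nbr - w[i])
        w = new
    return w

# Four links with unequal quality converge to a single shared weight:
w = equalize_weights([0.9, 0.5, 0.7, 0.3])
print(max(w) - min(w))  # spread shrinks toward zero
```

Because each update is a convex combination of neighboring values, the weights stay within the initial range while converging.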

  10. Common Patterns with End-to-end Interoperability for Data Access

    Science.gov (United States)

    Gallagher, J.; Potter, N.; Jones, M. B.

    2010-12-01

    At first glance, using common storage formats and open standards should be enough to ensure interoperability between data servers and client applications, but that is often not the case. In the REAP (Realtime Environment for Analytical Processing; NSF #0619060) project we integrated access to data from OPeNDAP servers into the Kepler workflow system and found that, as in previous cases, we spent the bulk of our effort addressing the twin issues of data model compatibility and integration strategies. Implementing seamless data access between a remote data source and a client application (data sink) can be broken down into two kinds of issues. First, the solution must address any differences in the data models used by the data source (OPeNDAP) and the data sink (the Kepler workflow system). If these models match completely, there is little work to be done. However, that is rarely the case. To map OPeNDAP's data model to Kepler's, we used two techniques (ignoring trivial conversions): On-the-fly type mapping and out-of-band communication. Type conversion takes place both for data and metadata because Kepler requires a priori knowledge of some aspects (e.g., syntactic metadata) of the data to build a workflow. In addition, OPeNDAP's constraint expression syntax was used to send out-of-band information to restrict the data requested from the server, facilitating changes in the returned data's type. This technique provides a way for users to exert fine-grained control over the data request, a potentially useful technique, at the cost of requiring that users understand a little about the data source's processing capabilities. The second set of issues for end-to-end data access are integration strategies. OPeNDAP provides several different tools for bringing data into an application: C++, C and Java libraries that provide functions for newly written software; The netCDF library which enables existing applications to read from servers using an older interface; and simple
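
As a concrete illustration of the out-of-band constraint mechanism, a DAP2 request URL carries the constraint expression after the `?`, listing the projected variables and their index ranges. A hypothetical example (server address and variable names are invented):

```python
def opendap_url(base, projections):
    """Build an OPeNDAP DAP2 data request URL with a constraint expression.
    Projections such as 'SST[0:1:10][0:1:20]' (start:stride:stop hyperslabs)
    restrict which variables and index ranges the server returns, so the
    client (e.g., a workflow actor) receives only the data it needs."""
    return base + ".dods?" + ",".join(projections)

# Hypothetical dataset: request a subset of SST plus its time coordinate.
url = opendap_url("http://example.org/opendap/sst.nc",
                  ["SST[0:1:10][0:1:20]", "time[0:1:10]"])
print(url)
```

The `.dods` suffix selects the binary data response; the constraint both shrinks the transfer and, as noted above, can change the type of the returned data.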

  11. End-to-End Beam Dynamics Simulations for the ANL-RIA Driver Linac

    CERN Document Server

    Ostroumov, P N

    2004-01-01

    The proposed Rare Isotope Accelerator (RIA) Facility consists of a superconducting (SC) 1.4 GV driver linac capable of producing 400 kW beams of any ion from hydrogen to uranium. The driver is configured as an array of ~350 SC cavities, each with an independently controllable rf phase. For the end-to-end beam dynamics design and simulation we use a dedicated code, TRACK. The code integrates ion motion through the three-dimensional fields of all elements of the driver linac, beginning from the exit of the electron cyclotron resonance (ECR) ion source to the production targets. TRACK has been parallelized and is able to track large numbers of particles in randomly seeded accelerators with misalignments and a comprehensive set of errors. The simulation starts with multi-component dc ion beams extracted from the ECR. Beam losses are obtained by tracking up to a million particles in hundreds of randomly seeded accelerators. To control beam losses, a set of collimators is applied in designated areas. The end-to-end simulat...

  12. The Swarm End-to-End mission simulator study: A demonstration of separating the various contributions to Earth's magnetic field using synthetic data

    DEFF Research Database (Denmark)

    Olsen, Nils; Haagmans, R.; Sabaka, T.J.

    2006-01-01

    by improving our understanding of the Earth's interior and climate. An End-to-End mission performance simulation was carried out during Phase A of the mission, with the aim of analyzing the key system requirements, particularly with respect to the number of Swarm satellites and their orbits related...

  13. Ocean Acidification Scientific Data Stewardship: An approach for end-to-end data management and integration

    Science.gov (United States)

    Arzayus, K. M.; Garcia, H. E.; Jiang, L.; Michael, P.

    2012-12-01

    As the designated Federal permanent oceanographic data center in the United States, NOAA's National Oceanographic Data Center (NODC) has been providing scientific stewardship for national and international marine environmental and ecosystem data for over 50 years. NODC is supporting NOAA's Ocean Acidification Program and the science community by providing end-to-end scientific data management of ocean acidification (OA) data, dedicated online data discovery, and user-friendly access to a diverse range of historical and modern OA and other chemical, physical, and biological oceanographic data. This effort is being catalyzed by the NOAA Ocean Acidification Program, but the intended reach is for the broader scientific ocean acidification community. The first three years of the project will be focused on infrastructure building. A complete ocean acidification data content standard is being developed to ensure that a full spectrum of ocean acidification data and metadata can be stored and utilized for optimal data discovery and access in usable data formats. We plan to develop a data access interface capable of allowing users to constrain their search based on real-time and delayed mode measured variables, scientific data quality, their observation types, the temporal coverage, methods, instruments, standards, collecting institutions, and the spatial coverage. In addition, NODC seeks to utilize the existing suite of international standards (including ISO 19115-2 and CF-compliant netCDF) to help our data producers use those standards for their data, and help our data consumers make use of the well-standardized metadata-rich data sets. These tools will be available through our NODC Ocean Acidification Scientific Data Stewardship (OADS) web page at http://www.nodc.noaa.gov/oceanacidification. NODC also has a goal to provide each archived dataset with a unique ID, to ensure a means of providing credit to the data provider. Working with partner institutions, such as the

  14. The Kepler End-to-End Data Pipeline: From Photons to Far Away Worlds

    Science.gov (United States)

    Cooke, Brian; Thompson, Richard; Standley, Shaun

    2012-01-01

    Launched by NASA on 6 March 2009, the Kepler Mission has been observing more than 100,000 targets in a single patch of sky between the constellations Cygnus and Lyra almost continuously for the last two years, looking for planetary systems using the transit method. As of October 2011, the Kepler spacecraft has collected and returned to Earth just over 290 GB of data, identifying 1235 planet candidates, with 25 of these candidates confirmed as planets via ground observation. Extracting the telltale signature of a planetary system from stellar photometry, where valid signal transients can be as small as 40 ppm, is a difficult and exacting task. The end-to-end process of determining planetary candidates from noisy, raw photometric measurements is discussed.
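
To get a feel for the scale of the problem, the core of transit detection can be caricatured as folding a noisy light curve at a candidate period and comparing in-transit to out-of-transit flux. A toy numpy sketch with invented numbers (not Kepler's actual pipeline, which is far more elaborate):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic light curve: unit flux with 100 ppm noise and a 500 ppm dip
# repeating at a known period (all values illustrative, not Kepler's).
n, period, duration, depth = 20000, 500, 25, 500e-6
t = np.arange(n)
flux = 1.0 + 100e-6 * rng.standard_normal(n)
in_transit = (t % period) < duration
flux[in_transit] -= depth

# Fold at the period and compare in-transit vs out-of-transit medians.
phase = t % period
dip = np.median(flux[phase < duration])
baseline = np.median(flux[phase >= duration])
print(f"recovered depth: {(baseline - dip) * 1e6:.0f} ppm")
```

Even in this idealized setting, the dip is invisible in any single transit and only emerges by combining many folded transits, which hints at why a 40 ppm signal demands such an exacting pipeline.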

  15. Establishing end-to-end security in a nationwide network for telecooperation.

    Science.gov (United States)

    Staemmler, Martin; Walz, Michael; Weisser, Gerald; Engelmann, Uwe; Weininger, Robert; Ernstberger, Antonio; Sturm, Johannes

    2012-01-01

    Telecooperation is used to support care for trauma patients by facilitating a mutual exchange of treatment and image data in use-cases such as emergency consultation, second-opinion, transfer, rehabilitation and out-patient after-treatment. To comply with data protection legislation, a two-factor authentication scheme using ownership and knowledge has been implemented to assure personalized access rights. End-to-end security is achieved by symmetric encryption in combination with external trusted services which provide the symmetric key solely at runtime. Telecooperation partners may be chosen at departmental level, but only individuals of that department, as a result of checking the organizational assignments maintained by LDAP services, are granted access. Data protection officers of a federal state have approved the data protection measures. The telecooperation platform is in routine operation and designed to serve up to 800 trauma centers in Germany, organized in more than 50 trauma networks.
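
The pattern described, symmetric encryption with a key supplied only at runtime by an external trusted service, can be sketched as follows. This is a deliberately simplified illustration: the key service is stubbed out, and the SHA-256-based XOR keystream stands in for a real authenticated cipher such as AES-GCM, which a production system would use.

```python
import hashlib
import secrets

def fetch_session_key():
    """Stand-in for the external trusted key service: in the real system the
    symmetric key is delivered solely at runtime and never persisted."""
    return secrets.token_bytes(32)

def keystream_xor(key, nonce, data):
    """Toy XOR stream cipher built from SHA-256 counter blocks. Illustrative
    only; not a substitute for an authenticated cipher like AES-GCM."""
    out, counter = bytearray(), 0
    while len(out) < len(data):
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

key = fetch_session_key()          # obtained at runtime, held only in memory
nonce = secrets.token_bytes(16)
plaintext = b"CT series, patient #4711"
ciphertext = keystream_xor(key, nonce, plaintext)
assert keystream_xor(key, nonce, ciphertext) == plaintext  # symmetric round-trip
```

The essential property is that neither endpoint stores the key: both fetch it from the trusted service for the duration of the exchange.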

  16. End-to-End Beam Simulations for the New Muon G-2 Experiment at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Korostelev, Maxim [Cockcroft Inst. Accel. Sci. Tech.; Bailey, Ian [Lancaster U.; Herrod, Alexander [Liverpool U.; Morgan, James [Fermilab; Morse, William [RIKEN BNL; Stratakis, Diktys [RIKEN BNL; Tishchenko, Vladimir [RIKEN BNL; Wolski, Andrzej [Cockcroft Inst. Accel. Sci. Tech.

    2016-06-01

    The aim of the new muon g-2 experiment at Fermilab is to measure the anomalous magnetic moment of the muon with an unprecedented uncertainty of 140 ppb. A beam of positive muons required for the experiment is created by pion decay. Detailed studies of the beam dynamics and spin polarization of the muons are important to predict systematic uncertainties in the experiment. In this paper, we present the results of beam simulations and spin tracking from the pion production target to the muon storage ring. The end-to-end beam simulations are developed in Bmad and include the processes of particle decay, collimation (with accurate representation of all apertures) and spin tracking.

  17. "Container" MIB for end-to-end management of ADSL networks

    Science.gov (United States)

    Lean, Andy G.; Schaffa, Frank A.; Seidman, David

    1998-09-01

    This document presents a MIB (Management Information Base) for use in the end-to-end management of networks which utilize ADSL (Asymmetric Digital Subscriber Line) technology for the 'last mile' (i.e. communication between the PSTN central office and users' premises). The 'Container' MIB is useful to the Network Management System (NMS) in abstracting information derived from lower network layers and in cross-referencing elements of disparate network layers. The Container MIB is described here in the context of networks which use ATM (Asynchronous Transfer Mode) technology for the backbone. Conceptually, the Container MIB is above the ATM, ADSL, and entity MIBs in the management hierarchy, and bridges between ATM and ADSL network segments. We demonstrate how the Container MIB can be used in the implementation of several unique network management functions endemic to ATM/ADSL networks.

  18. Availability and End-to-end Reliability in Low Duty Cycle MultihopWireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Timo D. Hämäläinen

    2009-03-01

    A wireless sensor network (WSN) is an ad-hoc technology that may even consist of thousands of nodes, which necessitates autonomic, self-organizing and multihop operations. A typical WSN node is battery powered, which makes the network lifetime the primary concern. The highest energy efficiency is achieved with low duty cycle operation, however, this alone is not enough. WSNs are deployed for different uses, each requiring acceptable Quality of Service (QoS). Due to the unique characteristics of WSNs, such as dynamic wireless multihop routing and resource constraints, the legacy QoS metrics are not feasible as such. We give a new definition to measure and implement QoS in low duty cycle WSNs, namely availability and reliability. Then, we analyze the effect of duty cycling for reaching the availability and reliability. The results are obtained by simulations with ZigBee and proprietary TUTWSN protocols. Based on the results, we also propose a data forwarding algorithm suitable for resource constrained WSNs that guarantees end-to-end reliability while adding a small overhead that is relative to the packet error rate (PER). The forwarding algorithm guarantees reliability up to 30% PER.
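
The claimed trade-off, overhead that grows with the packet error rate while end-to-end reliability is guaranteed, can be made concrete with a small calculation. Assuming independent losses and a uniform per-hop retransmission limit (a simplification of the paper's forwarding algorithm):

```python
def retries_per_hop(per, hops, target):
    """Smallest per-hop retransmission limit r such that the end-to-end
    delivery probability (1 - per**(r + 1))**hops meets the target.
    Illustrates why the forwarding overhead scales with the PER."""
    r = 0
    while (1 - per ** (r + 1)) ** hops < target:
        r += 1
    return r

# At the paper's 30% PER limit, over a 5-hop path, a handful of retries
# per hop already yields 99% end-to-end delivery:
print(retries_per_hop(0.30, 5, 0.99))
```

With a perfect channel the overhead vanishes (`retries_per_hop(0.0, 5, 0.99)` is 0), and it rises steeply as the PER approaches the 30% bound quoted above.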

  19. End-To-End Simulation of Launch Vehicle Trajectories Including Stage Separation Dynamics

    Science.gov (United States)

    Albertson, Cindy W.; Tartabini, Paul V.; Pamadi, Bandu N.

    2012-01-01

    The development of methodologies, techniques, and tools for analysis and simulation of stage separation dynamics is critically needed for successful design and operation of multistage reusable launch vehicles. As a part of this activity, the Constraint Force Equation (CFE) methodology was developed and implemented in the Program to Optimize Simulated Trajectories II (POST2). The objective of this paper is to demonstrate the capability of POST2/CFE to simulate a complete end-to-end mission. The vehicle configuration selected was the Two-Stage-To-Orbit (TSTO) Langley Glide Back Booster (LGBB) bimese configuration, an in-house concept consisting of a reusable booster and an orbiter having identical outer mold lines. The proximity and isolated aerodynamic databases used for the simulation were assembled using wind-tunnel test data for this vehicle. POST2/CFE simulation results are presented for the entire mission, from lift-off, through stage separation, orbiter ascent to orbit, and booster glide back to the launch site. Additionally, POST2/CFE stage separation simulation results are compared with results from industry standard commercial software used for solving dynamics problems involving multiple bodies connected by joints.

  20. Mucociliary clearance following tracheal resection and end-to-end anastomosis.

    Science.gov (United States)

    Toomes, H; Linder, A

    1989-10-01

    Mucociliary clearance is an important cleaning system of the bronchial tree. The complex transport system reacts sensitively to medicinal stimuli and inhaled substances. A disturbance causes secretion retention, which encourages the development of acute and chronic pulmonary diseases. It is not yet known in which way sectional resection of the central airway affects mucociliary clearance. A large number of the surgical failures are attributable to septic complications in the area of the anastomosis. In order to study the transportation process over the anastomosis, ten dogs underwent a tracheal resection with end-to-end anastomosis, and the mucociliary activity was recorded using a bronchoscopic video-technical method. Recommencement of mucous transport was observed on the third, and transport over the anastomosis from the sixth to tenth, postoperative days. The mucociliary clearance had completely recovered on the twenty-first day in the majority of dogs. Histological examination of the anastomoses nine months postoperatively showed a flat substitute epithelium without cilia-bearing cells in all dogs. This contrasts with the quick restitution of the transport function. In the case of undamaged respiratory mucosa, a good adaptation of the resection margins suffices for the mucous film to slide over the anastomosis.

  1. Mechanics of spatulated end-to-end artery-to-vein anastomoses.

    Science.gov (United States)

    Morasch, M D; Dobrin, P B; Dong, Q S; Mrkvicka, R

    1998-01-01

    It previously has been shown that in straight end-to-end artery-to-vein anastomoses, maximum dimensions are obtained with an interrupted suture line. Nearly equivalent dimensions are obtained with a continuous compliant polybutester suture (Novafil), and the smallest dimensions are obtained with a continuous noncompliant polypropylene suture (Surgilene). The present study was undertaken to examine these suture techniques in a spatulated or beveled anastomosis in living dogs. Anastomoses were constructed using continuous 6-0 polypropylene (Surgilene), continuous 6-0 polybutester (Novafil), or interrupted 6-0 polypropylene or polybutester. Thirty minutes after construction, the artery, vein, and beveled anastomoses were excised, restored to in situ length and pressurized with the lumen filled with a dilute suspension of barium sulfate. High resolution radiographs were obtained at 25 mmHg pressure increments up to 200 mmHg. Dimensions and compliance were determined from the radiographic images. Results showed that, unlike straight artery-to-vein anastomoses, there were no differences in the dimensions or compliance of spatulated anastomoses with continuous Surgilene, continuous Novafil, or interrupted suture techniques. Therefore a continuous suture technique is acceptable when constructing spatulated artery-to-vein anastomoses in patients.

  2. Human Assisted Robotic Vehicle Studies - A conceptual end-to-end mission architecture

    Science.gov (United States)

    Lehner, B. A. E.; Mazzotta, D. G.; Teeney, L.; Spina, F.; Filosa, A.; Pou, A. Canals; Schlechten, J.; Campbell, S.; Soriano, P. López

    2017-11-01

    With current space exploration roadmaps indicating the Moon as a proving ground on the way to human exploration of Mars, it is clear that human-robotic partnerships will play a key role for successful future human space missions. This paper details a conceptual end-to-end architecture for an exploration mission in cis-lunar space with a focus on human-robot interactions, called Human Assisted Robotic Vehicle Studies (HARVeSt). HARVeSt will build on knowledge of plant growth in space gained from experiments on-board the ISS and test the first growth of plants on the Moon. A planned deep space habitat will be utilised as the base of operations for human-robotic elements of the mission. The mission will serve as a technology demonstrator not only for autonomous tele-operations in cis-lunar space but also for key enabling technologies for future human surface missions. The successful approach of the ISS will be built on in this mission with international cooperation. Mission assets such as a modular rover will allow for an extendable mission and to scout and prepare the area for the start of an international Moon Village.

  3. Parallel and Other Simulations in R Made Easy: An End-to-End Study

    Directory of Open Access Journals (Sweden)

    Marius Hofert

    2016-02-01

    It is shown how to set up, conduct, and analyze large simulation studies with the new R package simsalapar (= simulations simplified and launched parallel). A simulation study typically starts with determining a collection of input variables and their values on which the study depends. Computations are desired for all combinations of these variables. If conducting these computations sequentially is too time-consuming, parallel computing can be applied over all combinations of select variables. The final result object of a simulation study is typically an array. From this array, summary statistics can be derived and presented in terms of flat contingency or LaTeX tables or visualized in terms of matrix-like figures. The R package simsalapar provides several tools to achieve the above tasks. Warnings and errors are dealt with correctly, various seeding methods are available, and run time is measured. Furthermore, tools for analyzing the results via tables or graphics are provided. In contrast to rather minimal examples typically found in R packages or vignettes, an end-to-end, not-so-minimal simulation problem from the realm of quantitative risk management is given. The concepts presented and solutions provided by simsalapar may be of interest to students, researchers, and practitioners as a how-to for conducting realistic, large-scale simulation studies in R.
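
simsalapar itself is an R package, but the workflow it automates, enumerating all combinations of input variables, computing them in parallel, and collecting the results for summary, is easy to sketch in Python for comparison (variable names and the toy computation are invented):

```python
import itertools
from concurrent.futures import ThreadPoolExecutor

# Input variables of a toy study; all combinations are computed, mirroring
# simsalapar's variable-list idea (names and values are made up).
grid = {"n": [10, 100], "dist": ["normal", "uniform"], "seed": [1, 2, 3]}

def one_run(combo):
    n, dist, seed = combo
    # Stand-in for the real, expensive computation; returns one summary row.
    return {"n": n, "dist": dist, "seed": seed, "stat": n * seed}

def run_study(grid):
    combos = list(itertools.product(*grid.values()))
    # Parallel over all combinations (threads here; processes or a cluster
    # would suit CPU-bound work).
    with ThreadPoolExecutor() as pool:
        return list(pool.map(one_run, combos))

results = run_study(grid)
print(len(results))  # one result per combination: 2 * 2 * 3 = 12
```

The list of per-combination results plays the role of simsalapar's result array, from which tables or figures would then be derived.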

  4. Availability and End-to-end Reliability in Low Duty Cycle Multihop Wireless Sensor Networks.

    Science.gov (United States)

    Suhonen, Jukka; Hämäläinen, Timo D; Hännikäinen, Marko

    2009-01-01

    A wireless sensor network (WSN) is an ad-hoc technology that may even consist of thousands of nodes, which necessitates autonomic, self-organizing and multihop operations. A typical WSN node is battery powered, which makes the network lifetime the primary concern. The highest energy efficiency is achieved with low duty cycle operation, however, this alone is not enough. WSNs are deployed for different uses, each requiring acceptable Quality of Service (QoS). Due to the unique characteristics of WSNs, such as dynamic wireless multihop routing and resource constraints, the legacy QoS metrics are not feasible as such. We give a new definition to measure and implement QoS in low duty cycle WSNs, namely availability and reliability. Then, we analyze the effect of duty cycling for reaching the availability and reliability. The results are obtained by simulations with ZigBee and proprietary TUTWSN protocols. Based on the results, we also propose a data forwarding algorithm suitable for resource constrained WSNs that guarantees end-to-end reliability while adding a small overhead that is relative to the packet error rate (PER). The forwarding algorithm guarantees reliability up to 30% PER.

  5. An end-to-end microfluidic platform for engineering life supporting microbes in space exploration missions, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — HJ Science & Technology proposes a programmable, low-cost, and compact microfluidic platform capable of running automated end-to-end processes and optimization...

  6. NCAR Earth Observing Laboratory - An End-to-End Observational Science Enterprise

    Science.gov (United States)

    Rockwell, A.; Baeuerle, B.; Grubišić, V.; Hock, T. F.; Lee, W. C.; Ranson, J.; Stith, J. L.; Stossmeister, G.

    2017-12-01

    Researchers who want to understand and describe the Earth System require high-quality observations of the atmosphere, ocean, and biosphere. Making these observations not only requires capable research platforms and state-of-the-art instrumentation but also benefits from comprehensive in-field project management and data services. NCAR's Earth Observing Laboratory (EOL) is an end-to-end observational science enterprise that provides leadership in observational research to scientists from universities, U.S. government agencies, and NCAR. Deployment: EOL manages the majority of the NSF Lower Atmosphere Observing Facilities, which includes research aircraft, radars, lidars, profilers, and surface and sounding systems. This suite is designed to address a wide range of Earth system science - from microscale to climate process studies and from the planet's surface into the Upper Troposphere/Lower Stratosphere. EOL offers scientific, technical, operational, and logistics support to small and large field campaigns across the globe. Development: By working closely with the scientific community, EOL's engineering and scientific staff actively develop the next generation of observing facilities, staying abreast of emerging trends, technologies, and applications in order to improve our measurement capabilities. Through our Design and Fabrication Services, we also offer high-level engineering and technical expertise, mechanical design, and fabrication to the atmospheric research community. Data Services: EOL's platforms and instruments collect unique datasets that must be validated, archived, and made available to the research community. EOL's Data Management and Services deliver high-quality datasets and metadata in ways that are transparent, secure, and easily accessible. We are committed to the highest standard of data stewardship from collection to validation to archival. Discovery: EOL promotes curiosity about Earth science, and fosters advanced understanding of the

  7. Research on the Establishment and Evaluation of End-to-End Service Quality Index System

    Science.gov (United States)

    Wei, Chen; Jing, Tao; Ji, Yutong

    2018-01-01

From the perspective of power data networks, this paper puts forward an index system model for measuring quality of service, covering user experience, business performance, network capacity support, and related factors, and describes how the indices at each layer of the model are established and used.
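The layered roll-up such an index system implies can be sketched as follows. This is a hedged illustration only: the layer names, leaf metrics, and weights are invented for the example and are not taken from the paper.

```python
# Hypothetical layered service-quality index: leaf metrics are normalized
# to [0, 1] and aggregated upward with illustrative weights.

def rollup(scores, weights):
    """Weighted aggregate of normalized sub-indices (weights sum to 1)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(scores[k] * weights[k] for k in weights)

leaf_scores = {
    "user_experience": 0.90,       # e.g. normalized opinion score (assumed)
    "business_performance": 0.80,  # e.g. transaction success rate (assumed)
    "network_capacity": 0.70,      # e.g. utilization headroom (assumed)
}
leaf_weights = {
    "user_experience": 0.40,
    "business_performance": 0.35,
    "network_capacity": 0.25,
}

service_quality_index = rollup(leaf_scores, leaf_weights)
print(round(service_quality_index, 3))
```

The same `rollup` can be applied recursively, layer by layer, if each leaf score is itself an aggregate of lower-level measurements.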

  8. End-to-end Cyberinfrastructure and Data Services for Earth System Science Education and Research: A vision for the future

    Science.gov (United States)

    Ramamurthy, M. K.

    2006-05-01

    yet revolutionary way of building applications and methods to connect and exchange information over the Web. This new approach, based on XML - a widely accepted format for exchanging data and corresponding semantics over the Internet - enables applications, computer systems, and information processes to work together in fundamentally different ways. Likewise, the advent of digital libraries, grid computing platforms, interoperable frameworks, standards and protocols, open-source software, and community atmospheric models have been important drivers in shaping the use of a new generation of end-to-end cyberinfrastructure for solving some of the most challenging scientific and educational problems. In this talk, I will present an overview of the scientific, technological, and educational landscape, discuss recent developments in cyberinfrastructure, and Unidata's role in and vision for providing easy-to use, robust, end-to-end data services for solving geoscientific problems and advancing student learning.

  9. SU-F-J-177: A Novel Image Analysis Technique (center Pixel Method) to Quantify End-To-End Tests

    Energy Technology Data Exchange (ETDEWEB)

    Wen, N; Chetty, I [Henry Ford Health System, Detroit, MI (United States); Snyder, K [Henry Ford Hospital System, Detroit, MI (United States); Scheib, S [Varian Medical System, Barton (Switzerland); Qin, Y; Li, H [Henry Ford Health System, Detroit, Michigan (United States)

    2016-06-15

Purpose: To implement a novel image analysis technique, the “center pixel method”, to quantify the end-to-end test accuracy of a frameless, image-guided stereotactic radiosurgery system. Methods: The localization accuracy was determined by delivering radiation to an end-to-end prototype phantom. The phantom was scanned with 0.8 mm slice thickness. The treatment isocenter was placed at the center of the phantom. In the treatment room, CBCT images of the phantom (kVp=77, mAs=1022, slice thickness 1 mm) were acquired and registered to the reference CT images. 6D couch corrections were applied based on the registration results. Electronic Portal Imaging Device (EPID)-based Winston-Lutz (WL) tests were performed to quantify the targeting errors of the system at 15 combinations of gantry, collimator, and couch positions. The images were analyzed using two different methods. a) The classic method: the deviation was calculated by measuring the radial distance between the center of the central BB and the center of the radiation field determined from its full width at half maximum. b) The center pixel method: since the imager projection offset from the treatment isocenter was known from the IsoCal calibration, the deviation was determined between the center of the BB and the central pixel of the imager panel. Results: Using the automatic registration method to localize the phantom and the classic method of measuring the deviation of the BB center, the mean and standard deviation of the radial distance were 0.44 ± 0.25, 0.47 ± 0.26, and 0.43 ± 0.13 mm for the jaw-, MLC-, and cone-defined field sizes, respectively. When the center pixel method was used, the mean and standard deviation were 0.32 ± 0.18, 0.32 ± 0.17, and 0.32 ± 0.19 mm, respectively. Conclusion: Our results demonstrated that the center pixel method accurately analyzes the WL images to evaluate the targeting accuracy of the radiosurgery system. The work was supported by a Research Scholar Grant, RSG-15-137-01-CCE from the American
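The geometry of the center pixel measurement can be sketched in a few lines. This is not the authors' code: the pixel pitch, panel coordinates, and offset values below are made-up stand-ins for the calibrated quantities the method relies on.

```python
import math

# Illustrative "center pixel" deviation: measure from the detected BB
# centroid to the imager's central pixel, after correcting for the known
# projection offset (e.g. from an IsoCal-style calibration).

PIXEL_PITCH_MM = 0.392  # assumed effective EPID pixel size, not a quoted value

def deviation_mm(bb_px, center_px, offset_px=(0.0, 0.0)):
    """Radial distance (mm) between BB centroid and offset-corrected central pixel."""
    dx = bb_px[0] - (center_px[0] + offset_px[0])
    dy = bb_px[1] - (center_px[1] + offset_px[1])
    return math.hypot(dx, dy) * PIXEL_PITCH_MM

# Example: BB found 0.6 px right and 0.8 px above the corrected panel center.
d = deviation_mm(bb_px=(512.6, 383.2), center_px=(512.0, 384.0))
print(round(d, 3))
```

Because the reference point is a fixed calibrated pixel rather than a field edge fit, the measurement avoids the extra uncertainty of locating the radiation field's full width at half maximum.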

  10. Innovative strategy for effective critical laboratory result management: end-to-end process using automation and manual call centre.

    Science.gov (United States)

    Ti, Lian Kah; Ang, Sophia Bee Leng; Saw, Sharon; Sethi, Sunil Kumar; Yip, James W L

    2012-08-01

Timely reporting and acknowledgement are crucial steps in critical laboratory results (CLR) management. The authors previously showed that an automated pathway incorporating short messaging system (SMS) texts, auto-escalation, and manual telephone back-up improved the rate and speed of physician acknowledgement compared with manual telephone calling alone. This study investigated whether it also improved the rate and speed of physician intervention in response to CLR and whether utilising the manual back-up affected intervention rates. Data from seven audits between November 2007 and January 2011 were analysed. These audits were carried out to assess the robustness of the CLR reporting process in the authors' institution. Comparisons were made in the rate and speed of acknowledgement and intervention between the audits performed before and after automation. Using the automation audits, the authors compared intervention data between communication with SMS only and when manual intervention was required. 1680 CLR were reported during the audit periods. Automation improved the rate of acknowledgement (100% vs 84.2%). In the automation audits, the use of SMS only did not improve physician intervention rates. The automated communication pathway improved physician intervention rate and time in tandem with improved acknowledgement rate and time when compared with manual telephone calling. The use of manual intervention to augment automation did not adversely affect physician intervention rate, implying that an end-to-end pathway was more important than automation alone.
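The auto-escalation pathway described can be sketched as a simple timeout-driven chain. The contact roles, timeout, and acknowledgement times below are illustrative assumptions, not details from the study.

```python
# Hedged sketch of an SMS auto-escalation chain with manual call-centre
# back-up: each contact gets a fixed window to acknowledge before the
# result escalates to the next contact.

ESCALATION_CHAIN = ["primary_physician", "covering_physician", "call_centre"]
SMS_TIMEOUT_MIN = 10  # assumed escalation window, minutes

def escalate(ack_times_min):
    """Return (handler, total elapsed minutes) for the first timely acknowledgement.

    ack_times_min maps contact -> minutes until acknowledgement (None = no ack).
    """
    elapsed = 0
    for contact in ESCALATION_CHAIN:
        t = ack_times_min.get(contact)
        if t is not None and t <= SMS_TIMEOUT_MIN:
            return contact, elapsed + t
        elapsed += SMS_TIMEOUT_MIN  # window expires, escalate to next contact
    return None, elapsed

# Primary never responds; covering physician acknowledges 4 min into their window.
handler, minutes = escalate({"primary_physician": None, "covering_physician": 4})
print(handler, minutes)
```

The end-to-end property the authors emphasize corresponds to the final `call_centre` entry: the chain always terminates in a manual step rather than silently dropping an unacknowledged result.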

  11. Vision-based mobile robot navigation through deep convolutional neural networks and end-to-end learning

    Science.gov (United States)

    Zhang, Yachu; Zhao, Yuejin; Liu, Ming; Dong, Liquan; Kong, Lingqin; Liu, Lingling

    2017-09-01

In contrast to humans, who use only visual information for navigation, many mobile robots use laser scanners and ultrasonic sensors along with vision cameras to navigate. This work proposes a vision-based robot control algorithm based on deep convolutional neural networks. We create a large 15-layer convolutional neural network learning system and achieve advanced recognition performance. Our system is trained end to end to map raw input images to directions in supervised mode. The images in the data sets were collected under a wide variety of weather and lighting conditions. In addition, the data sets are augmented by adding Gaussian noise and salt-and-pepper noise to avoid overfitting. The algorithm is verified by two experiments: line tracking and obstacle avoidance. The line tracking experiment is performed to track a desired path composed of straight and curved lines. The goal of the obstacle avoidance experiment is to avoid obstacles indoors. Finally, we obtain a 3.29% error rate on the training set and a 5.1% error rate on the test set in the line tracking experiment, and a 1.8% error rate on the training set and less than a 5% error rate on the test set in the obstacle avoidance experiment. During the actual test, the robot can follow the runway centerline outdoors and avoid obstacles in the room accurately. The result confirms the effectiveness of the algorithm and our improvements to the network structure and training parameters.
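The augmentation step mentioned (Gaussian plus salt-and-pepper noise) can be sketched as below. The noise levels and image size are illustrative choices, not the paper's values.

```python
import numpy as np

# Sketch of training-image augmentation to reduce overfitting:
# additive Gaussian noise, then salt-and-pepper corruption.

rng = np.random.default_rng(0)

def add_gaussian_noise(img, sigma=10.0):
    """Add zero-mean Gaussian noise and clip back to 8-bit range."""
    noisy = img.astype(np.float64) + rng.normal(0.0, sigma, img.shape)
    return np.clip(noisy, 0, 255).astype(np.uint8)

def add_salt_and_pepper(img, amount=0.05):
    """Set a random fraction of pixels to pure black (pepper) or white (salt)."""
    noisy = img.copy()
    mask = rng.random(img.shape)
    noisy[mask < amount / 2] = 0
    noisy[mask > 1 - amount / 2] = 255
    return noisy

img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)  # stand-in camera frame
aug = add_salt_and_pepper(add_gaussian_noise(img), amount=0.05)
print(aug.shape, aug.dtype)
```

In practice each training image would be augmented several times with fresh noise draws, multiplying the effective size of the data set.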

  12. Influence of suture technique and suture material selection on the mechanics of end-to-end and end-to-side anastomoses.

    Science.gov (United States)

    Baumgartner, N; Dobrin, P B; Morasch, M; Dong, Q S; Mrkvicka, R

    1996-05-01

Experiments were performed in dogs to evaluate the mechanics of 26 end-to-end and 42 end-to-side artery-vein graft anastomoses constructed with continuous polypropylene sutures (Surgilene; Davis & Geck, Division of American Cyanamid Co., Danbury, Conn.), continuous polybutester sutures (Novafil; Davis & Geck), and interrupted stitches with either suture material. After construction, the grafts and adjoining arteries were excised, mounted in vitro at in situ length, filled with a dilute barium sulfate suspension, and pressurized in 25 mm Hg steps up to 200 mm Hg. Radiographs were obtained at each pressure. The computed cross-sectional areas of the anastomoses were compared with those of the native arteries at corresponding pressures. Results showed that for the end-to-end anastomoses at 100 mm Hg, the cross-sectional areas of the continuous Surgilene anastomoses were 70% of the native artery cross-sectional areas, the cross-sectional areas of the continuous Novafil anastomoses were 90% of the native artery cross-sectional areas, and the cross-sectional areas of the interrupted anastomoses were 107% of the native artery cross-sectional areas. The end-to-side anastomoses demonstrated no differences in cross-sectional area or compliance among the three suture techniques. This suggests that, unlike with end-to-end anastomoses, when constructing an end-to-side anastomosis in patients any of the three suture techniques may be acceptable.

  13. On the importance of risk knowledge for an end-to-end tsunami early warning system

    Science.gov (United States)

    Post, Joachim; Strunz, Günter; Riedlinger, Torsten; Mück, Matthias; Wegscheider, Stephanie; Zosseder, Kai; Steinmetz, Tilmann; Gebert, Niklas; Anwar, Herryal

    2010-05-01

    context has been worked out. The generated results contribute significantly in the fields of (1) warning decision and warning levels, (2) warning dissemination and warning message content, (3) early warning chain planning, (4) increasing response capabilities and protective systems, (5) emergency relief and (6) enhancing communities' awareness and preparedness towards tsunami threats. Additionally examples will be given on the potentials of an operational use of risk information in early warning systems as first experiences exist for the tsunami early warning center in Jakarta, Indonesia. Beside this the importance of linking national level early warning information with tsunami risk information available at the local level (e.g. linking warning message information on expected intensity with respective tsunami hazard zone maps at community level for effective evacuation) will be demonstrated through experiences gained in three pilot areas in Indonesia. The presentation seeks to provide new insights on benefits using risk information in early warning and will provide further evidence that practical use of risk information is an important and indispensable component of end-to-end early warning.

  14. Unidata's Vision for Providing Comprehensive and End-to-end Data Services

    Science.gov (United States)

    Ramamurthy, M. K.

    2009-05-01

This paper presents Unidata's vision for providing comprehensive, well-integrated, and end-to-end data services for the geosciences. These include an array of functions for collecting, finding, and accessing data; data management tools for generating, cataloging, and exchanging metadata; and submitting or publishing, sharing, analyzing, visualizing, and integrating data. When this vision is realized, users, no matter where they are or how they are connected to the Internet, will be able to find and access a plethora of geosciences data and use Unidata-provided tools and services both productively and creatively in their research and education. What that vision means for the Unidata community is elucidated by drawing a simple analogy. Most users are familiar with Amazon and eBay e-commerce sites and content sharing sites like YouTube and Flickr. On the eBay marketplace, people can sell practically anything at any time and buyers can share their experience of purchasing a product or the reputation of a seller. Likewise, at Amazon, thousands of merchants sell their goods and millions of customers not only buy those goods, but provide a review or opinion of the products they buy and share their experiences as purchasers. Similarly, YouTube and Flickr are sites tailored to video- and photo-sharing, respectively, where users can upload their own content and share it with millions of other users, including family and friends. What all these sites, together with social-networking applications like MySpace and Facebook, have enabled is a sense of a virtual community in which users can search and browse products or content, comment and rate those products from anywhere, at any time, and via any Internet-enabled device like an iPhone, laptop, or a desktop computer. In essence, these enterprises have fundamentally altered people's buying modes and behavior toward purchases. Unidata believes that similar approaches, appropriately tailored to meet the needs of the scientific

  15. SensorKit: An End-to-End Solution for Environmental Sensor Networking

    Science.gov (United States)

    Silva, F.; Graham, E.; Deschon, A.; Lam, Y.; Goldman, J.; Wroclawski, J.; Kaiser, W.; Benzel, T.

    2008-12-01

Modern day sensor network technology has shown great promise to transform environmental data collection. However, despite the promise, these systems have remained the purview of the engineers and computer scientists who design them rather than a useful tool for the environmental scientists who need them. SensorKit is conceived of as a way to make wireless sensor networks accessible to The People: it is an advanced, powerful tool for sensor data collection that does not require advanced technological know-how. We are aiming to make wireless sensor networks for environmental science as simple as setting up a standard home computer network by providing simple, tested configurations of commercially available hardware, free and easy-to-use software, and step-by-step tutorials. We designed and built SensorKit using a simplicity-through-sophistication approach, supplying users a powerful sensor-to-database end-to-end system with a simple and intuitive user interface. Our objective in building SensorKit was to make the prospect of using environmental sensor networks as simple as possible. We built SensorKit from off-the-shelf hardware components, using the Compact RIO platform from National Instruments for data acquisition due to its modular architecture and flexibility to support a large number of sensor types. In SensorKit, we support various types of analog, digital and networked sensors. Our modular software architecture allows us to abstract sensor details and provide users a common way to acquire data and to command different types of sensors. SensorKit is built on top of the Sensor Processing and Acquisition Network (SPAN), a modular framework for acquiring data in the field, moving it reliably to the scientist's institution, and storing it in an easily-accessible database. SPAN allows real-time access to the data in the field by providing various options for long haul communication, such as cellular and satellite links. 
Our system also features reliable data storage
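The sensor abstraction the description implies, where heterogeneous sensors are exposed through one common acquisition interface, can be sketched as follows. The class and method names are assumptions for illustration, not SensorKit's actual API.

```python
from abc import ABC, abstractmethod

# Hypothetical common interface: the acquisition loop reads every sensor
# the same way, regardless of whether it is analog or networked.

class Sensor(ABC):
    @abstractmethod
    def read(self) -> float:
        """Return one calibrated sample in engineering units."""

class AnalogSensor(Sensor):
    def __init__(self, raw_volts, scale, offset):
        self.raw_volts, self.scale, self.offset = raw_volts, scale, offset

    def read(self):
        return self.raw_volts * self.scale + self.offset  # volts -> units

class NetworkedSensor(Sensor):
    def __init__(self, payload):
        self.payload = payload  # already in engineering units

    def read(self):
        return float(self.payload["value"])

sensors = [
    AnalogSensor(1.5, scale=10.0, offset=-5.0),
    NetworkedSensor({"value": 21.7}),
]
samples = [s.read() for s in sensors]  # uniform acquisition loop
print(samples)
```

Adding a new sensor type then only requires implementing `read`, leaving the acquisition and storage pipeline untouched, which is the kind of modularity the abstract describes.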

  16. A vision for end-to-end data services to foster international partnerships through data sharing

    Science.gov (United States)

    Ramamurthy, M.; Yoksas, T.

    2009-04-01

    Increasingly, the conduct of science requires scientific partnerships and sharing of knowledge, information, and other assets. This is particularly true in our field where the highly-coupled Earth system and its many linkages have heightened the importance of collaborations across geographic, disciplinary, and organizational boundaries. The climate system, for example, is far too complex a puzzle to be unraveled by individual investigators or nations. As articulated in the NSF Strategic Plan: FY 2006-2011, "…discovery increasingly requires expertise of individuals from different disciplines, with diverse perspectives, and often from different nations, working together to accommodate the extraordinary complexity of today's science and engineering challenges." The Nobel Prize winning IPCC assessments are a prime example of such an effort. Earth science education is also uniquely suited to drawing connections between the dynamic Earth system and societal issues. Events like the 2004 Indian Ocean tsunami and Hurricane Katrina provide ample evidence of this relevance, as they underscore the importance of timely and interdisciplinary integration and synthesis of data. Our success in addressing such complex problems and advancing geosciences depends on the availability of a state-of-the-art and robust cyberinfrastructure, transparent and timely access to high-quality data from diverse sources, and requisite tools to integrate and use the data effectively, toward creating new knowledge. To that end, Unidata's vision calls for providing comprehensive, well-integrated, and end-to-end data services for the geosciences. These include an array of functions for collecting, finding, and accessing data; data management tools for generating, cataloging, and exchanging metadata; and submitting or publishing, sharing, analyzing, visualizing, and integrating data. When this vision is realized, users — no matter where they are, how they are connected to the Internet, or what

  17. GROWTH OF THE HYPOPLASTIC AORTIC-ARCH AFTER SIMPLE COARCTATION RESECTION AND END-TO-END ANASTOMOSIS

    NARCIS (Netherlands)

    BROUWER, MHJ; CROMMEDIJKHUIS, AH; EBELS, T; EIJGELAAR, A

    Surgical treatment of a hypoplastic aortic arch associated with an aortic coarctation is controversial. The controversy concerns the claimed need to surgically enlarge the diameter of the hypoplastic arch, in addition to resection and end-to-end anastomosis. The purpose of this prospective study is

  18. Sutureless functional end-to-end anastomosis using a linear stapler with polyglycolic acid felt for intestinal anastomoses

    Directory of Open Access Journals (Sweden)

    Masanori Naito, MD, PhD

    2017-05-01

    Conclusion: Sutureless functional end-to-end anastomosis using the Endo GIA™ Reinforced appears to be safe, efficacious, and straightforward. Reinforcement of the crotch site with a bioabsorbable polyglycolic acid sheet appears to mitigate conventional problems with crotch-site vulnerability.

  19. Histochemical alterations of re-innervated rat extensor digitorum longus muscle after end-to-end or graft repair: a comparative histomorphological study

    Science.gov (United States)

    Lehnert, M; Steudel, WI; Marzi, I; Mautes, A

    2003-01-01

Changes in the histochemical profile of 43 rat extensor digitorum longus muscles undergoing de-innervation and re-innervation were recorded. Assessment of fibre type composition and muscle fibre cross-sectional area was performed at 15, 30, 90 and 180 days postoperative (p.o.) after either primary end-to-end repair or autologous graft repair of the common peroneal nerve (n = 5 per time point and type of repair). The size and histochemical profile of single muscle fibres were analysed by computer-assisted quantification on the basis of their myofibrillar ATPase (pH 4.3) and succinate dehydrogenase (SDH) activities in serial, whole-muscle cross-sections. Accordingly, four muscle-fibre types could be functionally identified: (1) slow oxidative (SO, type I); (2) fast-oxidative glycolytic (FOG, type IIA); (3) fast glycolytic (FG, type IIB); and (4) succinate dehydrogenase intermediate (SDH-INT). At 15 days following end-to-end repair, the SDH-INT muscle fibre type was observed. By contrast, 15 days following graft repair, no changes in fibre type composition were observed (vs. control). At 30 days p.o. in the group that received end-to-end repair, type SDH-INT reached its maximum and was significantly higher than in the group that underwent graft repair. At 90 days p.o., the amount of SDH-INT fibres declined after end-to-end repair, but it was still significantly higher than in the group treated with a nerve graft. The increase of the SDH-INT fibre type was mirrored by a proportional disappearance of FG and FOG fibres. These changes were time-dependent, not reversible at 180 days p.o. and largely blunted after nerve graft. Muscle-fibre size decreased at 15 and 30 days after both types of nerve repair. This decrease was transient and reversible within 90 days p.o. These findings reflect the fact that the reorganization of the histochemical profile in re-innervated muscles is both time dependent and long lasting. 
The degree of this reorganization is significantly higher
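The computer-assisted quantification step described, classifying fibres into SO, FOG, FG, and SDH-INT and summarizing proportions and cross-sectional areas, reduces to simple counting once each fibre has been typed. The data below are invented for illustration.

```python
from collections import Counter

# Hypothetical per-fibre classifications from ATPase/SDH staining,
# with made-up cross-sectional areas in square micrometres.
fibres = [
    {"type": "SO", "area_um2": 1800},
    {"type": "FOG", "area_um2": 2100},
    {"type": "FG", "area_um2": 3200},
    {"type": "SDH-INT", "area_um2": 2500},
    {"type": "SDH-INT", "area_um2": 2400},
]

counts = Counter(f["type"] for f in fibres)
proportions = {t: counts[t] / len(fibres) for t in counts}  # fibre type composition
mean_area = sum(f["area_um2"] for f in fibres) / len(fibres)  # mean fibre size

print(proportions["SDH-INT"], mean_area)
```

Tracking `proportions["SDH-INT"]` across time points is exactly the kind of summary the study reports, e.g. its rise at 30 days p.o. after end-to-end repair.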

  20. End-to-end simulations and planning of a small space telescopes: Galaxy Evolution Spectroscopic Explorer: a case study

    Science.gov (United States)

    Heap, Sara; Folta, David; Gong, Qian; Howard, Joseph; Hull, Tony; Purves, Lloyd

    2016-08-01

Large astronomical missions are usually general-purpose telescopes with a suite of instruments optimized for different wavelength regions, spectral resolutions, etc. Their end-to-end (E2E) simulations are typically photons-in to flux-out calculations made to verify that each instrument meets its performance specifications. In contrast, smaller space missions are usually single-purpose telescopes, and their E2E simulations start with the scientific question to be answered and end with an assessment of the effectiveness of the mission in answering the scientific question. Thus, E2E simulations for small missions consist of a longer string of calculations than those for large missions, as they include not only the telescope and instrumentation, but also the spacecraft, orbit, and external factors such as coordination with other telescopes. Here, we illustrate the strategy and organization of small-mission E2E simulations using the Galaxy Evolution Spectroscopic Explorer (GESE) as a case study. GESE is an Explorer/Probe-class space mission concept with the primary aim of understanding galaxy evolution. Operation of a small survey telescope in space like GESE is usually simpler than the operation of large telescopes, which is driven by the varied scientific programs of the observers or by transient events. Nevertheless, both types of telescopes share two common challenges: maximizing the integration time on target, while minimizing operating costs, including communication costs and staffing on the ground. We show in the case of GESE how these challenges can be met through a custom orbit and a system design emphasizing simplification and leveraging information from ground-based telescopes.

  1. Is "functional end-to-end anastomosis" really functional? A review of the literature on stapled anastomosis using linear staplers.

    Science.gov (United States)

    Kano, Masayuki; Hanari, Naoyuki; Gunji, Hisashi; Hayano, Koichi; Hayashi, Hideki; Matsubara, Hisahiro

    2017-01-01

    Anastomosis is one of the basic skills of a gastrointestinal surgeon. Stapling devices are widely used because stapled anastomosis (SA) can shorten operation times. Antiperistaltic stapled side-to-side anastomosis (SSSA) using linear staplers is a popular SA technique that is often referred to as "functional end-to-end anastomosis (FEEA)." The term "FEEA" has spread without any definite validation of its "function." The aim of this review is to show the heterogeneity of SA and conventional hand-sewn end-to-end anastomosis (HEEA) and to advocate the renaming of "FEEA." We conducted a narrative review of the literature on SSSA. We reviewed the literature on ileocolic and small intestinal anastomosis in colonic cancer, Crohn's disease and ileostomy closure due to the simplicity of the technique. The superiority of SSSA in comparison to HEEA has been demonstrated in previous clinical studies concerning gastrointestinal anastomosis. Additionally, experimental studies have shown the differences between the two anastomotic techniques on peristalsis and the intestinal bacteria at the anastomotic site. SSSA and HEEA affect the postoperative clinical outcome, electrophysiological peristalsis, and bacteriology in different manners; no current studies have shown the functional equality of SSSA and HEEA. However, the use of the terms "functional end-to-end anastomosis" and/or "FEEA" could cause confusion for surgeons and researchers and should therefore be avoided.

  2. JADS JT&E End-to-End Test Interim Report Phase 1

    National Research Council Canada - National Science Library

    McCall, James

    1998-01-01

    .... Northrop Grumman, the developer of the E-8C, performed the engineering and development of both a laboratory emulation of the E-8C radar subsystem and the capability to integrate the E-8C into a synthetic environment...

  3. Composable Mission Framework for Rapid End-to-End Mission Design and Simulation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation proposed here is the Composable Mission Framework (CMF) a model-based software framework that shall enable seamless continuity of mission design and...

  4. End-to-End simulation study of a full magnetic gradiometry mission

    DEFF Research Database (Denmark)

    Kotsiaros, Stavros; Olsen, Nils

    2014-01-01

In this paper, we investigate space magnetic gradiometry as a possible path for future exploration of the Earth's magnetic field with satellites. Synthetic observations of the magnetic field vector and of six elements of the magnetic gradient tensor are calculated for times and positions of a simulated low Earth orbiting satellite. The observations are synthesized from realistic models based upon a combination of the major sources contributing to the Earth's magnetic field. From those synthetic data, we estimate field models using either the magnetic vector field observations only or the full…

  5. End-to-end information extraction without token-level supervision

    DEFF Research Database (Denmark)

    Palm, Rasmus Berg; Hovy, Dirk; Laws, Florian

    2017-01-01

    and output text. We evaluate our model on the ATIS data set, MIT restaurant corpus and the MIT movie corpus and compare to neural baselines that do use token-level labels. We achieve competitive results, within a few percentage points of the baselines, showing the feasibility of E2E information extraction...

  6. End-to-end workflow for finite element analysis of tumor treating fields in glioblastomas.

    Science.gov (United States)

    Timmons, Joshua J; Lok, Edwin; San, Pyay; Bui, Kevin; Wong, Eric T

    2017-10-12

    Tumor Treating Fields (TTFields) therapy is an approved modality of treatment for glioblastoma. Patient anatomy-based finite element analysis (FEA) has the potential to reveal not only how these fields affect tumor control but also how to improve efficacy. While the automated tools for segmentation speed up the generation of FEA models, multi-step manual corrections are required, including removal of disconnected voxels, incorporation of unsegmented structures and the addition of 36 electrodes plus gel layers matching the TTFields transducers. Existing approaches are also not scalable for the high throughput analysis of large patient volumes. A semi-automated workflow was developed to prepare FEA models for TTFields mapping in the human brain. Magnetic resonance imaging (MRI) pre-processing, segmentation, electrode and gel placement, and post-processing were all automated. The material properties of each tissue were applied to their corresponding mask in silico using COMSOL Multiphysics (COMSOL, Burlington, MA, USA). The fidelity of the segmentations with and without post-processing was compared against the full semi-automated segmentation workflow approach using Dice coefficient analysis. The average relative differences for the electric fields generated by COMSOL were calculated in addition to observed differences in electric field-volume histograms. Furthermore, the mesh file formats in MPHTXT and NASTRAN were also compared using the differences in the electric field-volume histogram. The Dice coefficient was less for auto-segmentation without versus auto-segmentation with post-processing, indicating convergence on a manually corrected model. An existent but marginal relative difference of electric field maps from models with manual correction versus those without was identified, and a clear advantage of using the NASTRAN mesh file format was found. The software and workflow outlined in this article may be used to accelerate the investigation of TTFields in
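The Dice coefficient used above to compare segmentations with and without post-processing is a standard overlap measure, 2|A∩B| / (|A|+|B|). A minimal sketch with made-up masks:

```python
import numpy as np

def dice(a, b):
    """Dice coefficient between two boolean masks: 2|A∩B| / (|A|+|B|)."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Illustrative 8x8 masks: an "auto-segmented" square vs a version shifted
# one voxel, standing in for the manually corrected model.
auto = np.zeros((8, 8), dtype=bool)
auto[2:6, 2:6] = True            # 16 voxels
corrected = np.zeros((8, 8), dtype=bool)
corrected[3:7, 2:6] = True       # same size, shifted one row

print(round(dice(auto, corrected), 3))
```

A Dice value of 1.0 indicates identical masks; the workflow's finding that post-processing raises the Dice coefficient against the manually corrected model is a convergence claim in exactly this metric.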

  7. End-to-end workflow for finite element analysis of tumor treating fields in glioblastomas

    Science.gov (United States)

    Timmons, Joshua J.; Lok, Edwin; San, Pyay; Bui, Kevin; Wong, Eric T.

    2017-11-01

    Tumor Treating Fields (TTFields) therapy is an approved modality of treatment for glioblastoma. Patient anatomy-based finite element analysis (FEA) has the potential to reveal not only how these fields affect tumor control but also how to improve efficacy. While the automated tools for segmentation speed up the generation of FEA models, multi-step manual corrections are required, including removal of disconnected voxels, incorporation of unsegmented structures and the addition of 36 electrodes plus gel layers matching the TTFields transducers. Existing approaches are also not scalable for the high throughput analysis of large patient volumes. A semi-automated workflow was developed to prepare FEA models for TTFields mapping in the human brain. Magnetic resonance imaging (MRI) pre-processing, segmentation, electrode and gel placement, and post-processing were all automated. The material properties of each tissue were applied to their corresponding mask in silico using COMSOL Multiphysics (COMSOL, Burlington, MA, USA). The fidelity of the segmentations with and without post-processing was compared against the full semi-automated segmentation workflow approach using Dice coefficient analysis. The average relative differences for the electric fields generated by COMSOL were calculated in addition to observed differences in electric field-volume histograms. Furthermore, the mesh file formats in MPHTXT and NASTRAN were also compared using the differences in the electric field-volume histogram. The Dice coefficient was less for auto-segmentation without versus auto-segmentation with post-processing, indicating convergence on a manually corrected model. An existent but marginal relative difference of electric field maps from models with manual correction versus those without was identified, and a clear advantage of using the NASTRAN mesh file format was found. The software and workflow outlined in this article may be used to accelerate the investigation of TTFields in

  8. End-to-end requirements management for multiprojects in the construction industry

    DEFF Research Database (Denmark)

    Wörösch, Michael

…end-to-end, a requirements structure is developed and tested as a starting point. This requirements structure is able to handle the encountered standard and non-standard situations, such as product development and technology development in parallel with executing a construction project. At the same time the requirements… A literature study of requirements management in construction is performed. The results of this literature study show that very little has been written about applying requirements management to the field of construction, even though some authors have proposed to do so. This is a first indication that the entire field of construction lacks research…

  9. End-to-end simulation of a visible 1 kW FEL

    International Nuclear Information System (INIS)

    Parazzoli, Claudio G.; Koltenbah, Benjamin E.C.

    2000-01-01

    In this paper we present the complete numerical simulation of the 1 kW visible Free Electron Laser under construction in Seattle. We show that the goal of producing 1.0 kW at 0.7 μm is well within the hardware capabilities. We simulate in detail the evolution of the electron bunch phase space in the entire e-beam line. The e-beam line includes the photo-injector cavities, the 433.33 MHz accelerator, the magnetic buncher, the 1300 MHz accelerator, the 180 deg. bend and the matching optics into the wiggler. The computed phase space is input for a three-dimensional time-dependent code that predicts the FEL performance. All the computations are based on state of the art software, and the limitations of the current software are discussed. We believe that this is the first time that such a thorough numerical simulation has been carried out and that such a realistic electron phase space has been used in FEL performance calculations

  10. HIDE & SEEK: End-to-end packages to simulate and process radio survey data

    Science.gov (United States)

    Akeret, J.; Seehars, S.; Chang, C.; Monstein, C.; Amara, A.; Refregier, A.

    2017-01-01

    As several large single-dish radio surveys begin operation within the coming decade, a wealth of radio data will become available and provide a new window to the Universe. In order to fully exploit the potential of these datasets, it is important to understand the systematic effects associated with the instrument and the analysis pipeline. A common approach to tackle this is to forward-model the entire system, from the hardware to the analysis of the data products. For this purpose, we introduce two newly developed, open-source Python packages: the HI Data Emulator (HIDE) and the Signal Extraction and Emission Kartographer (SEEK) for simulating and processing single-dish radio survey data. HIDE forward-models the process of collecting astronomical radio signals in a single-dish radio telescope instrument and outputs pixel-level time-ordered data. SEEK processes the time-ordered data, removes artifacts from Radio Frequency Interference (RFI), automatically applies flux calibration, and aims to recover the astronomical radio signal. The two packages can be used separately or together depending on the application. Their modular and flexible nature allows easy adaptation to other instruments and datasets. We describe the basic architecture of the two packages and examine in detail the noise and RFI modeling in HIDE, as well as the implementation of gain calibration and RFI mitigation in SEEK. We then apply HIDE & SEEK to forward-model a Galactic survey in the frequency range 990-1260 MHz based on data taken at the Bleien Observatory. For this survey, we expect to cover 70% of the full sky and achieve a median signal-to-noise ratio of approximately 5-6 in the cleanest channels including systematic uncertainties. However, we also point out the potential challenges of high RFI contamination and baseline removal when examining the early data from the Bleien Observatory. The fully documented HIDE & SEEK packages are available at http://hideseek.phys.ethz.ch/ and are published

  11. An end-to-end assessment of extreme weather impacts on food security

    Science.gov (United States)

    Chavez, Erik; Conway, Gordon; Ghil, Michael; Sadler, Marc

    2015-11-01

    Both governments and the private sector urgently require better estimates of the likely incidence of extreme weather events, their impacts on food crop production and the potential consequent social and economic losses. Current assessments of climate change impacts on agriculture mostly focus on average crop yield vulnerability to climate and adaptation scenarios. Also, although new-generation climate models have improved and there has been an exponential increase in available data, the uncertainties in their projections over years and decades, and at regional and local scale, have not decreased. We need to understand and quantify the non-stationary, annual and decadal climate impacts using simple and communicable risk metrics that will help public and private stakeholders manage the hazards to food security. Here we present an 'end-to-end' methodological construct based on weather indices and machine learning that integrates current understanding of the various interacting systems of climate, crops and the economy to determine short- to long-term risk estimates of crop production loss, in different climate and adaptation scenarios. For provinces north and south of the Yangtze River in China, we have found that risk profiles for crop yields that translate climate into economic variability follow marked regional patterns, shaped by drivers of continental-scale climate. We conclude that to be cost-effective, region-specific policies have to be tailored to optimally combine different categories of risk management instruments.

  12. Telephony Over IP: A QoS Measurement-Based End to End Control Algorithm

    Directory of Open Access Journals (Sweden)

    Luigi Alcuri

    2004-12-01

    Full Text Available This paper presents a method for admitting voice calls in Telephony over IP (ToIP) scenarios. This method, called QoS-Weighted CAC, aims to guarantee Quality of Service to telephony applications. We use a measurement-based call admission control algorithm, which detects congested network links through feedback on overall link utilization. This feedback is based on measures of packet delivery latencies related to voice over IP connections at the edges of the transport network. In this way we introduce a closed-loop control method, which is able to auto-adapt the quality margin on the basis of network load and specific service level requirements. Moreover, we evaluate the difference in performance achieved by different queue management configurations in guaranteeing Quality of Service to telephony applications; our goal here is to evaluate the weight of edge router queue configuration in a complex, realistic Telephony over IP scenario. We compare several well-known queue scheduling algorithms, such as SFQ, WRR, RR, WIRR, and Priority. This comparison aims to locate queue schedulers in a more general control scheme context where different elements, such as DiffServ marking and admission control algorithms, contribute to the overall Quality of Service required by real-time voice conversations. By means of software simulations we compare this solution with other call admission methods already described in the scientific literature (in particular Measured Sum, Bandwidth Equivalent with Hoeffding Bounds, and Simple Measure CAC) on the planes of complexity, stability, management, tunability to service level requirements, and compatibility with actual network implementations.
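The measurement-based admission idea above, admitting a new call only while edge-measured latency leaves room under an adaptive quality margin, can be sketched as follows; the class name, thresholds and margin rule are illustrative assumptions, not the paper's exact algorithm:

```python
from collections import deque

class QoSWeightedCAC:
    """Sketch of a measurement-based call admission controller: admit a
    new voice call only while recent packet-delivery latency (measured
    at the network edges) stays under an adaptive quality margin."""

    def __init__(self, target_ms: float = 150.0, window: int = 50):
        self.target_ms = target_ms           # service-level latency bound
        self.samples = deque(maxlen=window)  # sliding window of edge measurements

    def report_latency(self, latency_ms: float) -> None:
        self.samples.append(latency_ms)

    def margin(self) -> float:
        # Closed-loop margin: shrinks as measured load approaches the bound
        if not self.samples:
            return self.target_ms
        avg = sum(self.samples) / len(self.samples)
        return max(0.0, self.target_ms - avg)

    def admit_call(self, est_added_ms: float = 5.0) -> bool:
        # Admit only if the call's estimated extra delay fits in the margin
        return est_added_ms <= self.margin()

cac = QoSWeightedCAC()
for s in [40, 45, 50]:
    cac.report_latency(s)
print(cac.admit_call())  # → True (plenty of margin left)
for s in [148, 150, 149] * 20:
    cac.report_latency(s)
print(cac.admit_call())  # → False (margin exhausted)
```

The feedback loop is the key design choice: the controller never inspects per-link state, only end-to-end delivery latency, which is what makes the scheme deployable at the network edges.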

  13. Interoperable End-to-End Remote Patient Monitoring Platform based on IEEE 11073 PHD and ZigBee Health Care Profile.

    Science.gov (United States)

    Clarke, Malcolm; de Folter, Joost; Verma, Vivek; Gokalp, Hulya

    2017-08-07

    This paper describes the implementation of an end-to-end remote monitoring platform based on the IEEE 11073 standards for Personal Health Devices (PHD). It provides an overview of the concepts and approaches and describes how the standard has been optimized for small devices with limited resources of processor, memory and power and that use short range wireless technology. It explains aspects of IEEE 11073, including the Domain Information Model, state model and nomenclature, and how these support its plug-and-play architecture. It shows how these aspects underpin a much larger eco-system of interoperable devices and systems that include IHE PCD-01, HL7 and BlueTooth LE medical devices, and the relationship to the Continua Guidelines, advocating the adoption of data standards and nomenclature to support semantic interoperability between health and ambient assisted living (AAL) in future platforms. The paper further describes the adaptions that have been made in order to implement the standard on the ZigBee Health Care Profile and the experiences of implementing an end-to-end platform that has been deployed to frail elderly patients with chronic disease(s) and patients with diabetes.

  14. AN AUTOMATED END-TO-END MULTI-AGENT QOS BASED ARCHITECTURE FOR SELECTION OF GEOSPATIAL WEB SERVICES

    Directory of Open Access Journals (Sweden)

    M. Shah

    2012-07-01

    With the proliferation of web services published over the internet, multiple web services may provide similar functionality but with different non-functional properties. Thus, Quality of Service (QoS) offers a metric to differentiate the services and their service providers. In a quality-driven selection of web services, it is important to consider the non-functional properties of the web service so as to satisfy the constraints or requirements of the end users. The main intent of this paper is to build an automated end-to-end multi-agent based solution that provides the best-fit web service to the service requester based on QoS.
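The QoS-driven selection described above, ranking functionally equivalent services by their non-functional properties, can be sketched as a weighted score; the service list, attribute names and weights below are hypothetical:

```python
def select_best_service(services, weights):
    """Weighted-sum QoS score over non-functional attributes.
    Negative weights encode 'lower is better' attributes (e.g. latency)."""
    def score(service):
        return sum(w * service["qos"][attr] for attr, w in weights.items())
    return max(services, key=score)

# Hypothetical candidate geospatial web services with identical functionality
services = [
    {"name": "geo-A", "qos": {"availability": 0.99, "latency_ms": 120.0}},
    {"name": "geo-B", "qos": {"availability": 0.95, "latency_ms": 40.0}},
    {"name": "geo-C", "qos": {"availability": 0.90, "latency_ms": 35.0}},
]
weights = {"availability": 100.0, "latency_ms": -0.5}  # end-user preferences
print(select_best_service(services, weights)["name"])  # → geo-B
```

In a multi-agent setting the weight vector would come from the requester's constraints, while each provider agent advertises its measured QoS attributes.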

  15. A multicentre 'end to end' dosimetry audit of motion management (4DCT-defined motion envelope) in radiotherapy.

    Science.gov (United States)

    Palmer, Antony L; Nash, David; Kearton, John R; Jafari, Shakardokht M; Muscat, Sarah

    2017-12-01

    External dosimetry audit is valuable for the assurance of radiotherapy quality. However, motion management has not been rigorously audited, despite its complexity and importance for accuracy. We describe the first end-to-end dosimetry audit for non-SABR (stereotactic ablative body radiotherapy) lung treatments, measuring dose accumulation in a moving target and assessing the adequacy of target dose coverage. A respiratory-motion lung phantom with a custom-designed insert was used. Dose was measured with radiochromic film, employing triple-channel dosimetry and uncertainty reduction. The host centre's 4DCT scan, outlining and planning techniques were used. Measurements with the phantom static and then moving at treatment delivery separated inherent treatment uncertainties from motion effects. Calculated and measured dose distributions were compared by isodose overlay and gamma analysis, and we introduce the concept of 'dose plane histograms' for clinically relevant interpretation of film dosimetry. Twelve radiotherapy centres and 19 plans were audited: conformal, IMRT (intensity modulated radiotherapy) and VMAT (volumetric modulated radiotherapy). Excellent agreement between planned and static-phantom results was seen (mean gamma pass 98.7% at 3%/2 mm). Dose blurring was evident in the moving-phantom measurements (mean gamma pass 88.2% at 3%/2 mm). Planning techniques for motion management were adequate to deliver the intended moving-target dose coverage. A novel, clinically relevant, end-to-end dosimetry audit of motion management strategies in radiotherapy is reported. Copyright © 2017 Elsevier B.V. All rights reserved.
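The gamma analysis used in the audit combines a dose-difference criterion with a distance-to-agreement criterion (the 3%/2 mm figures quoted above). A simplified 1-D global-gamma sketch, for illustration only (clinical film analysis is 2-D and uses interpolation):

```python
import numpy as np

def gamma_pass_rate(dose_ref, dose_eval, spacing_mm, dd_percent=3.0, dta_mm=2.0):
    """1-D global gamma: for each reference point take the minimum over
    evaluated points of sqrt((dx/DTA)^2 + (dD/dd)^2); a point passes
    when that minimum is <= 1."""
    dose_ref = np.asarray(dose_ref, dtype=float)
    dose_eval = np.asarray(dose_eval, dtype=float)
    dd = dd_percent / 100.0 * dose_ref.max()   # global dose criterion
    x = np.arange(len(dose_ref)) * spacing_mm
    passed = 0
    for xi, di in zip(x, dose_ref):
        dist2 = ((x - xi) / dta_mm) ** 2
        dose2 = ((dose_eval - di) / dd) ** 2
        passed += np.sqrt((dist2 + dose2).min()) <= 1.0
    return passed / len(dose_ref)

profile = [0.0, 10.0, 50.0, 100.0, 50.0, 10.0, 0.0]   # toy dose profile
print(gamma_pass_rate(profile, profile, spacing_mm=1.0))  # identical → 1.0
```

Dose blurring from target motion shows up as points whose combined dose/distance metric exceeds 1, lowering the pass rate exactly as seen in the moving-phantom measurements.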

  16. Risk Factors for Dehiscence of Stapled Functional End-to-End Intestinal Anastomoses in Dogs: 53 Cases (2001-2012).

    Science.gov (United States)

    Snowdon, Kyle A; Smeak, Daniel D; Chiang, Sharon

    2016-01-01

    To identify risk factors for dehiscence in stapled functional end-to-end anastomoses (SFEEA) in dogs. Retrospective case series. Dogs (n = 53) requiring an enterectomy. Medical records from a single institution for all dogs undergoing an enterectomy (2001-2012) were reviewed. Surgeries were included when gastrointestinal (GIA) and thoracoabdominal (TA) stapling equipment was used to create a functional end-to-end anastomosis between segments of small intestine or small and large intestine in dogs. Information regarding preoperative, surgical, and postoperative factors was recorded. Anastomotic dehiscence was noted in 6 of 53 cases (11%), with a mortality rate of 83%. The only preoperative factor significantly associated with dehiscence was the presence of inflammatory bowel disease (IBD). Surgical factors significantly associated with dehiscence included the presence, duration, and number of intraoperative hypotensive periods, and location of anastomosis, with greater odds of dehiscence in anastomoses involving the large intestine. IBD, location of anastomosis, and intraoperative hypotension are risk factors for intestinal anastomotic dehiscence after SFEEA in dogs. Previously suggested risk factors (low serum albumin concentration, preoperative septic peritonitis, and intestinal foreign body) were not confirmed in this study. © Copyright 2015 by The American College of Veterinary Surgeons.

  17. End-to-End Joint Antenna Selection Strategy and Distributed Compress and Forward Strategy for Relay Channels

    Directory of Open Access Journals (Sweden)

    Rahul Vaze

    2009-01-01

    Full Text Available Multihop relay channels use multiple relay stages, each with multiple relay nodes, to facilitate communication between a source and destination. Previously, distributed space-time codes were proposed to maximize the achievable diversity-multiplexing tradeoff; however, they fail to achieve all the points of the optimal diversity-multiplexing tradeoff. In the presence of a low-rate feedback link from the destination to each relay stage and the source, this paper proposes an end-to-end antenna selection (EEAS) strategy as an alternative to distributed space-time codes. The EEAS strategy uses a subset of antennas of each relay stage for transmission of the source signal to the destination, with amplify-and-forward at each relay stage. The subsets are chosen such that they maximize the end-to-end mutual information at the destination. The EEAS strategy achieves the corner points of the optimal diversity-multiplexing tradeoff (corresponding to maximum diversity gain and maximum multiplexing gain) and achieves better diversity gain at intermediate values of multiplexing gain than the best-known distributed space-time coding strategies. A distributed compress-and-forward (CF) strategy is also proposed to achieve all points of the optimal diversity-multiplexing tradeoff for a two-hop relay channel with multiple relay nodes.
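The subset-selection step underlying such a strategy, choosing the antenna subset that maximizes mutual information, can be sketched for a single hop; the brute-force helper below is a hypothetical illustration, not the paper's multi-stage scheme:

```python
import itertools
import numpy as np

def eeas_select(H, snr, k):
    """Pick the size-k transmit-antenna subset (columns of H) maximizing
    the mutual information log2 det(I + (snr/k) * H_S H_S^H).
    Brute-force sketch for a single hop."""
    n_rx, n_tx = H.shape
    best_subset, best_mi = None, -np.inf
    for subset in itertools.combinations(range(n_tx), k):
        Hs = H[:, list(subset)]
        gram = np.eye(n_rx) + (snr / k) * (Hs @ Hs.conj().T)
        mi = float(np.log2(np.linalg.det(gram)).real)
        if mi > best_mi:
            best_subset, best_mi = subset, mi
    return best_subset, best_mi

rng = np.random.default_rng(0)
H = rng.standard_normal((2, 4))  # 2 receive antennas, 4 transmit antennas
subset, mi = eeas_select(H, snr=10.0, k=2)
print(subset, round(mi, 2))
```

Exhaustive search is exponential in the number of antennas; the low-rate feedback link in the paper only needs to carry the index of the chosen subset, not the channel matrix itself.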

  18. Self-organization of nanorods into ultra-long range two-dimensional monolayer end-to-end network.

    Science.gov (United States)

    Kim, Dahin; Kim, Whi Dong; Kang, Moon Sung; Kim, Shin-Hyun; Lee, Doh C

    2015-01-14

    Highly uniform large-scale assembly of nanoscale building blocks can enable unique collective properties for practical electronic and photonic devices. We present a two-dimensional (2-D), millimeter-scale network of colloidal CdSe nanorods (NRs) of monolayer thickness formed through end-to-end linking. The colloidal CdSe NRs are sterically stabilized with tetradecylphosphonic acid (TDPA), and their tips are partially etched in the presence of gold chloride (AuCl3) and didecyldimethylammonium bromide (DDAB), which makes them unwetted in toluene. This change in surface wetting property leads to spontaneous adsorption at the 2-D air/toluene interface. Anisotropy in both the geometry and the surface property of the CdSe NRs causes deformation of the NR/toluene/air interface, which drives capillary attraction between the tips of neighboring NRs. As a result, the NRs confined at the interface spontaneously form a 2-D network composed of end-to-end linkages. We employ a vertical-deposition approach to maintain a consistent rate of NR supply to the interface during the assembly. This rate control turns out to be pivotal in the preparation of a highly uniform, large-scale 2-D network without aggregation. In addition, unprecedented control of the NR density in the network was possible by adjusting either the lift-up speed of the immersed substrate or the relative concentration of AuCl3 to DDAB. Our findings provide important design criteria for 2-D assembly of anisotropic nanobuilding blocks.

  19. POTION: an end-to-end pipeline for positive Darwinian selection detection in genome-scale data through phylogenetic comparison of protein-coding genes.

    Science.gov (United States)

    Hongo, Jorge A; de Castro, Giovanni M; Cintra, Leandro C; Zerlotini, Adhemar; Lobo, Francisco P

    2015-08-01

    Detection of genes evolving under positive Darwinian evolution in genome-scale data is nowadays a prevailing strategy in comparative genomics studies to identify genes potentially involved in adaptation processes. Despite the large number of studies aiming to detect and contextualize such gene sets, there is virtually no software available to perform this task in a general, automatic, large-scale and reliable manner. This is certainly due to the computational challenges involved, such as the appropriate modeling of the data under analysis, the computation time required for several of the steps when dealing with genome-scale data, and the highly error-prone nature of the sequence and alignment data structures needed for genome-wide positive selection detection. We present POTION, an open-source, modular and end-to-end software for genome-scale detection of positive Darwinian selection in groups of homologous coding sequences. Our software represents a key step towards genome-scale, automated detection of positive selection, from predicted coding sequences and their homology relationships to high-quality groups of positively selected genes. POTION reduces false positives through several sophisticated sequence and group filters based on numeric, phylogenetic, quality and conservation criteria to remove spurious data, and through multiple hypothesis corrections, and considerably reduces computation time thanks to a parallelized design. Our software achieved high classification performance when used to evaluate a curated dataset of Trypanosoma brucei paralogs previously surveyed for positive selection. When used to analyze predicted groups of homologous genes of 19 strains of Mycobacterium tuberculosis as a case study, we demonstrated that the filters implemented in POTION remove sources of error that commonly inflate error rates in positive selection detection. A thorough literature review found no other software similar to POTION in terms of customization
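The multiple-hypothesis correction step mentioned above can be illustrated with the Benjamini-Hochberg false-discovery-rate procedure; this is a generic sketch of that technique, not POTION's exact (configurable) implementation:

```python
def benjamini_hochberg(pvalues, alpha=0.05):
    """Benjamini-Hochberg FDR: sort p-values, find the largest rank k
    with p_(k) <= k/m * alpha, and reject the k smallest hypotheses.
    Returns the indices of rejected hypotheses."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank / m * alpha:
            k_max = rank
    return sorted(order[:k_max])

# Hypothetical per-gene p-values from a positive-selection scan
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205]
print(benjamini_hochberg(pvals, alpha=0.05))  # → [0, 1]
```

Without such a correction, scanning thousands of gene groups at a fixed per-test threshold would inflate the number of spuriously "positively selected" genes.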

  20. End-to-end Structural Restriction of α-Synuclein and Its Influence on Amyloid Fibril Formation

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Chul Suk; Park, Jae Hyung; Choe, Young Jun; Paik, Seung R. [Seoul National University, Seoul (Korea, Republic of)

    2014-09-15

    Relationship between molecular freedom of amyloidogenic protein and its self-assembly into amyloid fibrils has been evaluated with α-synuclein, an intrinsically unfolded protein related to Parkinson's disease, by restricting its structural plasticity through an end-to-end disulfide bond formation between two newly introduced cysteine residues on the N- and C-termini. Although the resulting circular form of α-synuclein exhibited an impaired fibrillation propensity, the restriction did not completely block the protein's interactive core since co-incubation with wild-type α-synuclein dramatically facilitated the fibrillation by producing distinctive forms of amyloid fibrils. The suppressed fibrillation propensity was instantly restored as the structural restriction was unleashed with β-mercaptoethanol. Conformational flexibility of the accreting amyloidogenic protein to pre-existing seeds has been demonstrated to be critical for fibrillar extension process by exerting structural adjustment to a complementary structure for the assembly.

  1. End-to-end Structural Restriction of α-Synuclein and Its Influence on Amyloid Fibril Formation

    International Nuclear Information System (INIS)

    Hong, Chul Suk; Park, Jae Hyung; Choe, Young Jun; Paik, Seung R.

    2014-01-01

    Relationship between molecular freedom of amyloidogenic protein and its self-assembly into amyloid fibrils has been evaluated with α-synuclein, an intrinsically unfolded protein related to Parkinson's disease, by restricting its structural plasticity through an end-to-end disulfide bond formation between two newly introduced cysteine residues on the N- and C-termini. Although the resulting circular form of α-synuclein exhibited an impaired fibrillation propensity, the restriction did not completely block the protein's interactive core since co-incubation with wild-type α-synuclein dramatically facilitated the fibrillation by producing distinctive forms of amyloid fibrils. The suppressed fibrillation propensity was instantly restored as the structural restriction was unleashed with β-mercaptoethanol. Conformational flexibility of the accreting amyloidogenic protein to pre-existing seeds has been demonstrated to be critical for fibrillar extension process by exerting structural adjustment to a complementary structure for the assembly.

  2. Self-assembled nanogaps via seed-mediated growth of end-to-end linked gold nanorods

    DEFF Research Database (Denmark)

    Jain, Titoo; Westerlund, Axel Rune Fredrik; Johnson, Erik

    2009-01-01

    Gold nanorods (AuNRs) are of interest for a wide range of applications, ranging from imaging to molecular electronics, and they have been studied extensively for the past decade. An important issue in AuNR applications is the ability to self-assemble the rods in predictable structures … on the nanoscale. We here present a new way to end-to-end link AuNRs with a single or few linker molecules. Whereas methods reported in the literature so far rely on modification of the AuNRs after the synthesis, we here dimerize gold nanoparticle seeds with a water-soluble dithiol-functionalized polyethylene … that a large fraction of the rods are flexible around the hinging molecule in solution, as expected for a molecularly linked nanogap. By using excess of gold nanoparticles relative to the linking dithiol molecule, this method can provide a high probability that a single molecule is connecting the two rods …

  3. End-to-End Trajectory for Conjunction Class Mars Missions Using Hybrid Solar-Electric/Chemical Transportation System

    Science.gov (United States)

    Chai, Patrick R.; Merrill, Raymond G.; Qu, Min

    2016-01-01

    NASA's Human Spaceflight Architecture Team is developing a reusable hybrid transportation architecture in which both chemical and solar-electric propulsion systems are used to deliver crew and cargo to exploration destinations. By combining chemical and solar-electric propulsion into a single spacecraft and applying each where it is most effective, the hybrid architecture enables a series of Mars trajectories that are more fuel-efficient than an all-chemical propulsion architecture without significant increases to trip time. The architecture calls for the aggregation of exploration assets in cislunar space prior to departure for Mars and utilizes high-energy lunar-distant high Earth orbits for the final staging prior to departure. This paper presents a detailed analysis of various cislunar operations for the EMC Hybrid architecture, as well as the results of the higher-fidelity end-to-end trajectory analysis, to understand the implications of the design choices on the Mars exploration campaign.

  4. Reconstruction after ureteral resection during HIPEC surgery: Re-implantation with uretero-neocystostomy seems safer than end-to-end anastomosis.

    Science.gov (United States)

    Pinar, U; Tremblay, J-F; Passot, G; Dazza, M; Glehen, O; Tuech, J-J; Pocard, M

    2017-09-01

    Resection of the pelvic ureter may be necessary in cytoreductive surgery for peritoneal carcinomatosis in combination with hyperthermic intraperitoneal chemotherapy (HIPEC). As the morbidity for cytoreductive surgery with HIPEC has decreased, expert teams have begun to perform increasingly complex surgical procedures associated with HIPEC, including pelvic reconstructions. After ureteral resection, two types of reconstruction are possible: uretero-ureteral end-to-end anastomosis and uretero-vesical re-implantation or uretero-neocystostomy (the so-called psoas hitch technique). By compiling the experience of three surgical teams that perform HIPEC surgeries, we have tried to compare the effectiveness of these two techniques. A retrospective comparative case-matched multicenter study was conducted for patients undergoing operation between 2005 and 2014. Patients included had undergone resection of the pelvic ureter during cytoreductive surgery with HIPEC for peritoneal carcinomatosis; ureteral reconstruction was by either end-to-end anastomosis (EEA group) or re-implantation uretero-neocystostomy (RUC group). The primary endpoint was the occurrence of urinary fistula in postoperative follow-up. There were 14 patients in the EEA group and 14 in the RUC group. The groups were comparable for age, extent of carcinomatosis (PCI index) and operative duration. Four urinary fistulas occurred in the EEA group (28.5%) versus zero fistulas in the RUC group (0%) (P=0.0308). Re-implantation with uretero-neocystostomy during cytoreductive surgery with HIPEC is the preferred technique for reconstruction after ureteral resection in case of renal conservation. Copyright © 2017. Published by Elsevier Masson SAS.

  5. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    Science.gov (United States)

    Trevino, Luis; Johnson, Stephen B.; Patterson, Jonathan; Teare, David

    2015-01-01

    The development of the Space Launch System (SLS) launch vehicle requires cross-discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on-orbit operations. The characteristics of these systems must be matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large, complex systems-engineering challenge, being addressed in part by focusing on the specific subsystems' handling of off-nominal mission and fault tolerance. Using traditional model based system and software engineering design principles from the Unified Modeling Language (UML), the Mission and Fault Management (M&FM) algorithms are crafted and vetted in specialized Integrated Development Teams composed of multiple development disciplines. NASA also has formed an M&FM team for addressing fault management early in the development lifecycle. This team has developed a dedicated Vehicle Management End-to-End Testbed (VMET) that integrates specific M&FM algorithms, specialized nominal and off-nominal test cases, and vendor-supplied physics-based launch vehicle subsystem models. The flexibility of VMET enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the algorithms utilizing actual subsystem models. The intent is to validate the algorithms and substantiate them with performance baselines for each of the vehicle subsystems in an independent platform exterior to flight software test processes. In any software development process there is inherent risk in the interpretation and implementation of concepts into software through requirements and test processes. Risk reduction is addressed by working with other organizations such as S

  6. On cryptographic security of end-to-end encrypted connections in WhatsApp and Telegram messengers

    Directory of Open Access Journals (Sweden)

    Sergey V. Zapechnikov

    2017-11-01

    Full Text Available The aim of this work is to analyze the available possibilities for improving secure messaging with end-to-end connections under conditions of external violator actions and a distrusted service provider. We made a comparative analysis of the cryptographic security mechanisms of two widely used messengers: Telegram and WhatsApp. It was found that Telegram is based on the MTProto protocol, while WhatsApp is based on the alternative Signal protocol. We examine the specific features of the messengers' implementations associated with random number generation on the most popular mobile platform, Android. It was shown that Signal has better security properties. It is used in several other popular messengers, such as TextSecure, RedPhone, Google Allo, Facebook Messenger and Signal itself, along with WhatsApp. A number of possible attacks on both messengers were analyzed in detail. In particular, we demonstrate that metadata are poorly protected in both messengers. Metadata security may be one of the goals for further studies.

  7. Delayed primary end-to-end anastomosis for traumatic long segment urethral stricture and its short-term outcomes

    Directory of Open Access Journals (Sweden)

    Rajarshi Kumar

    2017-01-01

    Full Text Available Background: The purpose of this study is to evaluate the aetiology of posterior urethral stricture in children and analyse the results of delayed primary repair with extensive distal urethral mobilisation. Materials and Methods: This was a retrospective study carried out in a tertiary care centre from January 2009 to December 2013. Results: Eight children with median age 7.5 years (range 4–11 years) underwent delayed anastomotic urethroplasty: six through a perineal and two through a combined perineal and transpubic approach. All eight children had long-segment (>2 cm) strictures: three posterior and five anterior urethral strictures. Over a mean follow-up period of 33 months (range 24–48 months), all were passing urine with good flow and stream. Conclusion: End-to-end anastomosis in post-traumatic long-segment posterior urethral stricture between the prostatic and penile urethra in children is possible by a perineal or combined perineal and transpubic approach, with good results and without any urethral replacement.

  8. An anthropomorphic multimodality (CT/MRI) head phantom prototype for end-to-end tests in ion radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Gallas, Raya R.; Huenemohr, Nora; Runz, Armin; Niebuhr, Nina I.; Greilich, Steffen [German Cancer Research Center (DKFZ), Heidelberg (Germany). Div. of Medical Physics in Radiation Oncology; National Center for Radiation Research in Oncology, Heidelberg (Germany). Heidelberg Institute of Radiation Oncology (HIRO); Jaekel, Oliver [German Cancer Research Center (DKFZ), Heidelberg (Germany). Div. of Medical Physics in Radiation Oncology; National Center for Radiation Research in Oncology, Heidelberg (Germany). Heidelberg Institute of Radiation Oncology (HIRO); Heidelberg University Hospital (Germany). Dept. of Radiation Oncology; Heidelberg Ion-Beam Therapy Center (HIT), Heidelberg (Germany)

    2015-07-01

    With the increasing complexity of external beam therapy "end-to-end" tests are intended to cover every step from therapy planning through to follow-up in order to fulfill the higher demands on quality assurance. As magnetic resonance imaging (MRI) has become an important part of the treatment process, established phantoms such as the Alderson head cannot fully be used for those tests and novel phantoms have to be developed. Here, we present a feasibility study of a customizable multimodality head phantom. It is initially intended for ion radiotherapy but may also be used in photon therapy. As basis for the anthropomorphic head shape we have used a set of patient computed tomography (CT) images. The phantom recipient consisting of epoxy resin was produced by using a 3D printer. It includes a nasal air cavity, a cranial bone surrogate (based on dipotassium phosphate), a brain surrogate (based on agarose gel), and a surrogate for cerebrospinal fluid (based on distilled water). Furthermore, a volume filled with normoxic dosimetric gel mimicked a tumor. The entire workflow of a proton therapy could be successfully applied to the phantom. CT measurements revealed CT numbers agreeing with reference values for all surrogates in the range from 2 HU to 978 HU (120 kV). MRI showed the desired contrasts between the different phantom materials especially in T2-weighted images (except for the bone surrogate). T2-weighted readout of the polymerization gel dosimeter allowed approximate range verification.

  9. Towards a cross-platform software framework to support end-to-end hydrometeorological sensor network deployment

    Science.gov (United States)

    Celicourt, P.; Sam, R.; Piasecki, M.

    2016-12-01

    Global phenomena such as climate change and large-scale environmental degradation require the collection of accurate environmental data at detailed spatial and temporal scales, from which knowledge and actionable insights can be derived using data science methods. Despite significant advances in sensor network technologies, sensor and sensor-network deployment remains a labor-intensive, time-consuming, cumbersome and expensive task. These factors explain why environmental data collection remains a challenge, especially in developing countries where technical infrastructure, expertise and pecuniary resources are scarce, and why dense, long-term environmental data collection has historically been quite difficult. Moreover, hydrometeorological data collection efforts usually overlook the critically important inclusion of a standards-based system for storing, managing, organizing, indexing, documenting and sharing sensor data. We are developing a cross-platform software framework, using the Python programming language, that will allow us to build a low-cost end-to-end (from sensor to publication) system for monitoring hydrometeorological conditions. The software framework contains provisions for describing sensors, sensor platforms, calibration and network protocols, as well as for sensor programming, data storage, data publication and visualization and, importantly, data retrieval in a desired unit system. It is being tested on the Raspberry Pi microcomputer as the end node and a laptop PC as the base station in a wireless setting.
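The unit-system-aware retrieval mentioned above can be sketched in the framework's own language, Python; the class, method names and conversion table below are illustrative assumptions, not the framework's actual API:

```python
from dataclasses import dataclass, field

# Hypothetical unit-conversion table; only the pairs needed here.
CONVERSIONS = {
    ("degC", "degF"): lambda c: c * 9.0 / 5.0 + 32.0,
    ("mm", "in"): lambda mm: mm / 25.4,
}

@dataclass
class Sensor:
    """Illustrative sensor description: readings are stored in the
    sensor's native unit and converted only on retrieval."""
    name: str
    quantity: str
    native_unit: str
    readings: list = field(default_factory=list)

    def record(self, value: float) -> None:
        self.readings.append(value)

    def retrieve(self, unit: str) -> list:
        if unit == self.native_unit:
            return list(self.readings)
        convert = CONVERSIONS[(self.native_unit, unit)]
        return [convert(v) for v in self.readings]

rain_gauge = Sensor("tipping-bucket", "precipitation", "mm")
rain_gauge.record(12.7)
print(rain_gauge.retrieve("in"))  # → [0.5]
```

Storing in the native unit and converting at read time keeps the stored record an unmodified account of what the instrument measured, which matters for calibration and provenance.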

  10. An anthropomorphic multimodality (CT/MRI) head phantom prototype for end-to-end tests in ion radiotherapy.

    Science.gov (United States)

    Gallas, Raya R; Hünemohr, Nora; Runz, Armin; Niebuhr, Nina I; Jäkel, Oliver; Greilich, Steffen

    2015-12-01

    With the increasing complexity of external beam therapy "end-to-end" tests are intended to cover every step from therapy planning through to follow-up in order to fulfill the higher demands on quality assurance. As magnetic resonance imaging (MRI) has become an important part of the treatment process, established phantoms such as the Alderson head cannot fully be used for those tests and novel phantoms have to be developed. Here, we present a feasibility study of a customizable multimodality head phantom. It is initially intended for ion radiotherapy but may also be used in photon therapy. As basis for the anthropomorphic head shape we have used a set of patient computed tomography (CT) images. The phantom recipient consisting of epoxy resin was produced by using a 3D printer. It includes a nasal air cavity, a cranial bone surrogate (based on dipotassium phosphate), a brain surrogate (based on agarose gel), and a surrogate for cerebrospinal fluid (based on distilled water). Furthermore, a volume filled with normoxic dosimetric gel mimicked a tumor. The entire workflow of a proton therapy could be successfully applied to the phantom. CT measurements revealed CT numbers agreeing with reference values for all surrogates in the range from 2 HU to 978 HU (120 kV). MRI showed the desired contrasts between the different phantom materials especially in T2-weighted images (except for the bone surrogate). T2-weighted readout of the polymerization gel dosimeter allowed approximate range verification. Copyright © 2015. Published by Elsevier GmbH.

  11. Hardware and Methods of the Optical End-to-End Test of the Far Ultraviolet Spectroscopic Explorer (FUSE)

    Science.gov (United States)

    Conard, Steven J.; Redman, Kevin W.; Barkhouser, Robert H.; McGuffey, Doug B.; Smee, Stephen; Ohl, Raymond G.; Kushner, Gary

    1999-01-01

    The Far Ultraviolet Spectroscopic Explorer (FUSE), currently being tested and scheduled for a 1999 launch, is an astrophysics satellite designed to provide high spectral resolving power (Lambda/(Delta)Lambda = 24,000-30,000) over the interval 90.5-118.7 nm. The FUSE optical path consists of four co-aligned, normal incidence, off-axis parabolic, primary mirrors which illuminate separate Rowland circle spectrograph channels equipped with holographic gratings and delay line microchannel plate detectors. We describe the hardware and methods used for the optical end-to-end test of the FUSE instrument during satellite integration and test. Cost and schedule constraints forced us to devise a simplified version of the planned optical test which occurred in parallel with satellite thermal-vacuum testing. The optical test employed a collimator assembly which consisted of four co-aligned, 15" Cassegrain telescopes which were positioned above the FUSE instrument, providing a collimated beam for each optical channel. A windowed UV light source, remotely adjustable in three axes, was mounted at the focal plane of each collimator. Problems with the UV light sources, including high F-number and window failures, were the only major difficulties encountered during the test. The test succeeded in uncovering a significant problem with the secondary structure used for the instrument closeout cavity and, furthermore, showed that the mechanical solution was successful. The hardware was also used extensively for simulations of science observations, providing both UV light for spectra and visible light for the fine error sensor camera.

  12. Chinese Medical Question Answer Matching Using End-to-End Character-Level Multi-Scale CNNs

    Directory of Open Access Journals (Sweden)

    Sheng Zhang

    2017-07-01

This paper focuses mainly on the problem of Chinese medical question answer matching, which is arguably more challenging than open-domain question answer matching in English due to the combination of its domain-restricted nature and the language-specific features of Chinese. We present an end-to-end character-level multi-scale convolutional neural framework in which character embeddings instead of word embeddings are used to avoid Chinese word segmentation in text preprocessing, and multi-scale convolutional neural networks (CNNs are then introduced to extract contextual information from either question or answer sentences over different scales. The proposed framework can be trained with minimal human supervision and does not require any handcrafted features, rule-based patterns, or external resources. To validate our framework, we create a new text corpus, named cMedQA, by harvesting questions and answers from an online Chinese health and wellness community. The experimental results on the cMedQA dataset show that our framework significantly outperforms several strong baselines, and achieves an improvement of top-1 accuracy by up to 19%.
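The multi-scale idea (convolutions of several widths over character embeddings, each max-pooled over time and then concatenated into one sentence vector) can be sketched in plain Python. The sizes, function names, and random filters below are illustrative assumptions, not the paper's actual architecture.

```python
# Illustrative sketch of character-level multi-scale convolution + pooling.
import random

random.seed(0)
EMB_DIM, N_FILTERS = 4, 3

def embed(chars, table):
    """Look up (or lazily create) an embedding vector per character."""
    for c in chars:
        if c not in table:
            table[c] = [random.uniform(-1, 1) for _ in range(EMB_DIM)]
    return [table[c] for c in chars]

def conv_maxpool(embs, filters, width):
    """1-D convolution of a given width followed by max-over-time pooling."""
    feats = []
    for f in filters:  # each filter: width * EMB_DIM flat weights
        acts = []
        for t in range(len(embs) - width + 1):
            window = [x for e in embs[t:t + width] for x in e]
            acts.append(sum(w * x for w, x in zip(f, window)))
        feats.append(max(acts))
    return feats

def encode(sentence, table, widths=(2, 3)):
    """Multi-scale encoding: concatenate pooled features over all widths."""
    embs = embed(sentence, table)
    out = []
    for w in widths:
        filters = [[random.uniform(-1, 1) for _ in range(w * EMB_DIM)]
                   for _ in range(N_FILTERS)]
        out += conv_maxpool(embs, filters, w)
    return out

vec = encode("端到端", {})  # 3-character input, no word segmentation needed
print(len(vec))            # 2 widths x 3 filters = 6 features
```

Because the input is a character sequence, no Chinese word segmentation is required; in the real framework the filters would of course be learned rather than random.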

  13. Presence of calcium in the vessel walls after end-to-end arterial anastomoses with polydioxanone and polypropylene sutures in growing dogs.

    Science.gov (United States)

    Gersak, B

    1993-10-01

The presence of calcium in the vessel walls after end-to-end arterial anastomoses performed with polydioxanone and polypropylene interrupted sutures was studied in 140 anastomoses in 35 10-week-old German shepherd dogs. Histologic examination with hematoxylin and eosin, van Gieson, and von Kossa staining techniques was performed after the animals were killed 6 months after the operation. Ketamine hydrochloride was used as an anesthetic agent. At the start of the investigation the dogs weighed 14.5 +/- 2.6 kg (mean +/- standard deviation, n = 35), and after 6 months they weighed 45.3 +/- 3.1 kg (mean +/- standard deviation, n = 35). The diameter of the sutured arteries in the first operation was 2.6 +/- 0.5 mm (mean +/- standard deviation, n = 140). With each dog, both brachial and both femoral arteries were used--one artery for each different type of suture. In different dogs, different arteries were used for the same type of suture. The prevalence of calcifications after 6 months was determined from the numeric density of calcifications with standard stereologic techniques. The sutured and sutureless parts taken from longitudinal sections from each artery were studied, and t test values were calculated as follows: In paired samples, statistically significant differences in numerical density of calcifications were seen between sutured and sutureless arterial parts for both materials (sutureless part versus part with polydioxanone sutures, p < 0.05, n = 70), whereas no significant difference was found between sutureless parts (p > 0.05, n = 70).

  14. Albert-Lembert versus hybrid-layered suture in hand sewn end-to-end cervical esophagogastric anastomosis after esophageal squamous cell carcinoma resection.

    Science.gov (United States)

    Feng, Fan; Sun, Li; Xu, Guanghui; Hong, Liu; Yang, Jianjun; Cai, Lei; Li, Guocai; Guo, Man; Lian, Xiao; Zhang, Hongwei

    2015-11-01

    Hand sewn cervical esophagogastric anastomosis (CEGA) is regarded as preferred technique by surgeons after esophagectomy. However, considering the anastomotic leakage and stricture, the optimal technique for performing this anastomosis is still under debate. Between November 2010 and September 2012, 230 patients who underwent esophagectomy with hand sewn end-to-end (ETE) CEGA for esophageal squamous cell carcinoma (ESCC) were analyzed retrospectively, including 111 patients underwent Albert-Lembert suture anastomosis and 119 patients underwent hybrid-layered suture anastomosis. Anastomosis construction time was recorded during operation. Anastomotic leakage was recorded through upper gastrointestinal water-soluble contrast examination. Anastomotic stricture was recorded during follow up. The hybrid-layered suture was faster than Albert-Lembert suture (29.40±1.24 min vs. 33.83±1.41 min, P=0.02). The overall anastomotic leak rate was 7.82%, the leak rate in hybrid-layered suture group was significantly lower than that in Albert-Lembert suture group (3.36% vs. 12.61%, P=0.01). The overall anastomotic stricture rate was 9.13%, the stricture rate in hybrid-layered suture group was significantly lower than that in Albert-Lembert suture group (5.04% vs. 13.51%, P=0.04). Hand sewn ETE CEGA with hybrid-layered suture is associated with lower anastomotic leakage and stricture rate compared to hand sewn ETE CEGA with Albert-Lembert suture.

  15. Land Mobile Satellite Service (LMSS) channel simulator: An end-to-end hardware simulation and study of the LMSS communications links

    Science.gov (United States)

    Salmasi, A. B. (Editor); Springett, J. C.; Sumida, J. T.; Richter, P. H.

    1984-01-01

The design and implementation of the Land Mobile Satellite Service (LMSS) channel simulator as a facility for an end-to-end hardware simulation of the LMSS communications links, primarily with the mobile terminal, is described. A number of studies are reported which show the applications of the channel simulator as a facility for validation and assessment of the LMSS design requirements and capabilities by performing quantitative measurements and qualitative audio evaluations for various link design parameters and channel impairments under simulated LMSS operating conditions. As a first application, the LMSS channel simulator was used in the evaluation of a system based on the voice processing and modulation (e.g., NBFM with 30 kHz of channel spacing and a 2 kHz rms frequency deviation for average talkers) selected for the Bell System's Advanced Mobile Phone Service (AMPS). The various details of the hardware design, qualitative audio evaluation techniques, signal-to-channel-impairment measurement techniques, the justifications for the selection of different parameters with regard to the voice processing and modulation methods, and the results of a number of parametric studies are further described.

  16. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    Science.gov (United States)

    Trevino, Luis; Johnson, Stephen B.; Patterson, Jonathan; Teare, David

    2015-01-01

    The engineering development of the National Aeronautics and Space Administration's (NASA) new Space Launch System (SLS) requires cross discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on orbit operations. The nominal and off-nominal characteristics of SLS's elements and subsystems must be understood and matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large and complex systems engineering challenge, which is being addressed in part by focusing on the specific subsystems involved in the handling of off-nominal mission and fault tolerance with response management. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML) and Systems Modeling Language (SysML), the Mission and Fault Management (M&FM) algorithms for the vehicle are crafted and vetted in Integrated Development Teams (IDTs) composed of multiple development disciplines such as Systems Engineering (SE), Flight Software (FSW), Safety and Mission Assurance (S&MA) and the major subsystems and vehicle elements such as Main Propulsion Systems (MPS), boosters, avionics, Guidance, Navigation, and Control (GNC), Thrust Vector Control (TVC), and liquid engines. These model-based algorithms and their development lifecycle from inception through FSW certification are an important focus of SLS's development effort to further ensure reliable detection and response to off-nominal vehicle states during all phases of vehicle operation from pre-launch through end of flight. To test and validate these M&FM algorithms a dedicated test-bed was developed for full Vehicle Management End-to-End Testing (VMET). For addressing fault management (FM

  17. An End-to-End Modeling and Simulation Testbed (EMAST) to Support Detailed Quantitative Evaluations of GIG Transport Services

    National Research Council Canada - National Science Library

    Comparetto, G; Schult, N; Mirhakkak, M; Chen, L; Wade, R; Duffalo, S

    2005-01-01

    .... A variety of services must be provided to the users including management of resources to support QoS, a transition path from IPv4 to IPv6, and efficient networking across heterogeneous networks (i.e...

  18. First Demonstration of Real-Time End-to-End 40 Gb/s PAM-4 System using 10-G Transmitter for Next Generation Access Applications

    DEFF Research Database (Denmark)

    Wei, Jinlong; Eiselt, Nicklas; Griesser, Helmut

We demonstrate the first known experiment of a real-time end-to-end 40-Gb/s PAM-4 system for next generation access applications using 10G class transmitters only. Up to 25-dB upstream link budget for 20 km SMF is achieved.
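The PAM-4 format behind this result maps each pair of bits to one of four amplitude levels, doubling the bit rate per symbol relative to binary NRZ. A minimal sketch follows, assuming Gray-coded level mapping (a common choice, not a detail taken from the paper):

```python
# PAM-4 sketch: Gray-coded bit pair -> normalized amplitude level.
GRAY_MAP = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}
INV_MAP = {v: k for k, v in GRAY_MAP.items()}

def pam4_modulate(bits):
    """Map a bit sequence (even length) to PAM-4 symbols."""
    assert len(bits) % 2 == 0
    return [GRAY_MAP[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

def pam4_demodulate(symbols):
    """Slice received symbols to the nearest level and recover the bits."""
    out = []
    for s in symbols:
        level = min(INV_MAP, key=lambda v: abs(v - s))
        out.extend(INV_MAP[level])
    return out

tx = [0, 1, 1, 0, 1, 1]
rx = pam4_demodulate([s + 0.3 for s in pam4_modulate(tx)])  # mild noise
print(rx == tx)  # True
```

Gray coding ensures that a slicing error to an adjacent level corrupts only one bit, which is why it is widely used in multilevel signaling.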

  19. SU-F-P-37: Implementation of An End-To-End QA Test of the Radiation Therapy Imaging, Planning and Delivery Process to Identify and Correct Possible Sources of Deviation

    International Nuclear Information System (INIS)

    Salinas Aranda, F; Suarez, V; Arbiser, S; Sansogne, R

    2016-01-01

Purpose: To implement an end-to-end QA test of the radiation therapy imaging, planning and delivery process, aimed at assessing the dosimetric agreement between planned and delivered treatment, in order to identify and correct possible sources of deviation. To establish an internal standard for machine commissioning acceptance. Methods: A test involving all steps of the radiation therapy imaging, planning and delivery process was designed. The test includes analysis of point-dose and planar dose distribution agreement between TPS-calculated and measured dose. An ad hoc 16 cm diameter PMMA phantom was constructed with one central and four peripheral bores that can accommodate calibrated electron density inserts. Using the Varian Eclipse 10.0 and Elekta XiO 4.50 planning systems, IMRT, RapidArc, and 3DCRT plans with hard and dynamic wedges were planned on the phantom and tested. An Exradin A1SL chamber was used with a Keithley 35617EBS electrometer for point-dose measurements in the phantom. 2D dose distributions were acquired using MapCheck and the Varian aS1000 EPID. Gamma analysis was performed to evaluate 2D dose distribution agreement using the MapCheck software and the Varian Portal Dosimetry Application. Varian high-energy Clinacs (Trilogy, 2100C/CD, 2000CR) and a low-energy 6X/EX were tested. TPS CT-number vs. electron density tables were checked for the CT scanners used. Results: Calculated point doses were accurate to 0.127% SD: 0.93%, 0.507% SD: 0.82%, 0.246% SD: 1.39% and 0.012% SD: 0.01% for LoX-3DCRT, HiX-3DCRT, IMRT and RapidArc plans respectively. Planar doses passed gamma 3%/3 mm in all cases and 2%/2 mm for VMAT plans. Conclusion: Implementation of a simple and reliable quality assurance tool was accomplished. The end-to-end test proved efficient, showing excellent agreement between planned and delivered dose, evidencing strong consistency of the whole process from imaging through planning to delivery. This test can be used as a first step in beam model acceptance for clinical
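The gamma analysis used here for planar-dose comparison combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. Below is a one-dimensional sketch with invented toy profiles; clinical tools work in 2D/3D with fine interpolation of the reference distribution.

```python
# 1-D gamma analysis sketch (global dose normalization, direct search).
import math

def gamma_index(meas, ref, spacing_mm, dose_tol, dta_mm):
    """Return the gamma value for each measured point."""
    d_max = max(ref)
    gammas = []
    for i, dm in enumerate(meas):
        best = float("inf")
        for j, dr in enumerate(ref):
            dr_mm = (i - j) * spacing_mm                # spatial offset
            dd = (dm - dr) / (dose_tol * d_max)         # normalized dose diff
            best = min(best, math.hypot(dr_mm / dta_mm, dd))
        gammas.append(best)
    return gammas

def pass_rate(gammas):
    """Fraction of points with gamma <= 1."""
    return sum(g <= 1.0 for g in gammas) / len(gammas)

ref = [0.0, 0.5, 1.0, 0.5, 0.0]      # toy calculated profile (a.u.)
meas = [0.0, 0.52, 1.01, 0.49, 0.0]  # toy measured profile
g = gamma_index(meas, ref, spacing_mm=1.0, dose_tol=0.03, dta_mm=3.0)
print(pass_rate(g))  # 1.0 for this near-identical pair at 3%/3 mm
```

A point passes if there is any reference point close enough in both dose and distance, which is why gamma is more forgiving than a pure dose-difference map in high-gradient regions.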

  20. The Hurricane-Flood-Landslide Continuum: An Integrated, End-to-end Forecast and Warning System for Mountainous Islands in the Tropics

    Science.gov (United States)

    Golden, J.; Updike, R. G.; Verdin, J. P.; Larsen, M. C.; Negri, A. J.; McGinley, J. A.

    2004-12-01

In the 10 days of 21-30 September 1998, Hurricane Georges left a trail of destruction in the Caribbean region and U.S. Gulf Coast. Subsequently, in the same year, Hurricane Mitch caused widespread destruction and loss of life in four Central American nations, and in December 1999 a tropical disturbance impacted the north coast of Venezuela causing hundreds of deaths and several million dollars of property loss. More recently, an off-season disturbance in the Central Caribbean dumped nearly 250 mm of rainfall over Hispaniola during the 24-hr period on May 23, 2004. Resultant flash floods and debris flows in the Dominican Republic and Haiti killed at least 1400 people. In each instance, the tropical system served as the catalyst for major flooding and landslides at landfall. Our goal is to develop and transfer an end-to-end warning system for a prototype region in the Central Caribbean, specifically the islands of Puerto Rico and Hispaniola, which experience frequent tropical cyclones and other disturbances. The envisioned system would include satellite and surface-based observations to track and nowcast dangerous levels of precipitation, atmospheric and hydrological models to predict short-term runoff and streamflow changes, geological models to warn when and where landslides and debris flows are imminent, and the capability to communicate forecast guidance products via satellite to vital government offices in Puerto Rico, Haiti, and the Dominican Republic. In this paper, we shall present a preliminary proof-of-concept study for the May 21-24, 2004 floods and debris-flows over Hispaniola to show that the envisaged flow of data, models and graphical products can produce the desired warning outputs.
The multidisciplinary research and technology transfer effort will require blending the talents of hydrometeorologists, geologists, remote sensing and GIS experts, and social scientists to ensure timely delivery of tailored graphical products to both weather offices and local

  1. Overcoming End-to-End Vessel Mismatch During Superficial Temporal Artery-Radial Artery-M2 Interposition Grafting for Cerebral Ischemia: Tapering Technique.

    Science.gov (United States)

    Cikla, Ulas; Mukherjee, Debraj; Tumturk, Abdulfettah; Baskaya, Mustafa K

    2018-02-01

    Cerebral revascularization procedures, such as the external carotid-internal carotid bypass, have been used in the clinical management of cerebral ischemic states. Among the most commonly performed bypasses is the superficial temporal artery-middle cerebral artery (STA-MCA) bypass to restore cerebral blood flow. In cases of a foreshortened STA donor vessel, a radial artery (RA) graft is often used as an interposition graft between the STA and MCA. However, addressing the vessel size mismatch between the radial artery and donor can be problematic and challenging. We present the case of an 80-year-old male presenting with positional-onset expressive aphasia and right-sided hemiparesis. Computed tomography perfusion demonstrated a diffusion-perfusion mismatch in a left MCA distribution. Angiography showed a complete left internal cerebral artery occlusion and poor distal filling of the STA. We performed an external carotid artery-to-internal carotid artery bypass through interposing an RA graft to the STA proximally with an end-to-end anastomosis and to the MCA distally using an end-to-side anastomosis. The mismatch between 2 bypass vessel sizes was corrected by removing a small piece from the RA graft at 1 margin and suturing it to itself to reduce the size of the RA vessel diameter opening on the side used to sew to the STA. The patient did well clinically with improved right-sided strength, a patent graft, and no postoperative complications. Addressing vessel mismatch when using RA interposition grafts for bypass is challenging. Various operative approaches to address mismatch should be individualized on the basis of the particular vascular anatomy and needs of the case. Nevertheless, our method of cutting and suturing 1 side of the RA graft into a semiblind end to match donor vessel diameter may be of use to cerebrovascular surgeons in select cases. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Evaluation of a composite Gel-Alanine phantom on an end-to-end test to treat multiple brain metastases by a single isocenter VMAT technique.

    Science.gov (United States)

    Pavoni, Juliana Fernandes; Neves-Junior, Wellington Furtado Pimenta; da Silveira, Matheus Antonio; Haddad, Cecília Maria Kalil; Baffa, Oswaldo

    2017-09-01

This work aims to evaluate the application of a cylindrical phantom made of dosimetric gel containing alanine pellets distributed inside the gel volume during an end-to-end test of a single isocenter VMAT for simultaneous treatment of multiple brain metastases. The evaluation is based on the comparison of the results obtained with the composite phantom with the treatment planning system (TPS) dose distribution validated by using the clinical conventional quality control with point and planar dose measurements. A cylindrical MAGIC-f gel phantom containing alanine dosimeters (composite phantom) was used to design the VMAT plan in the treatment planning system (TPS). The alanine dosimeters were pellets with radius of 2.5 mm and height of 3 mm, and played the role of brain metastases inside the gel cylinder, which simulated the cerebral tissue. Five of the alanine dosimeters were selected to simulate five lesions; five planning target volumes (PTVs) were created including the dosimeters and irradiated with different doses. Conventional quality assurance (QA) was performed on the TPS plan and on the composite phantom; a phantom containing only gel (Gel 1 phantom) was also irradiated. One day after irradiation, magnetic resonance images were acquired for both phantoms on a 3T scanner. An electron spin resonance spectrometer was used to evaluate alanine doses. Calibration curves were constructed for the alanine and the gel dosimeters. The gel-only measurement was then repeated (Gel 2 phantom) in order to confirm the previous gel measurement. The VMAT treatment plan was approved by the conventional QA. The doses measured by alanine dosimeters on the composite gel phantom agreed with the TPS on average within 3.3%. The alanine dose for each lesion was used to calibrate the gel dosimeter measurements of the concerned PTV. 
Both gel dose volume histograms (DVH) achieved for each PTV were in agreement with the expected TPS DVH, except for a small discrepancy observed for the Gel 2
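The calibration-curve step described above amounts to fitting dosimeter signal against known dose and inverting the fit to read out unknown doses. A minimal least-squares sketch with invented sample values:

```python
# Linear dose calibration sketch: fit signal vs. known dose, then invert.
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def calibrate(signal, a, b):
    """Invert the calibration line: signal -> dose."""
    return (signal - b) / a

doses = [0.0, 2.0, 4.0, 6.0, 8.0]    # known delivered doses (Gy)
signals = [0.1, 2.1, 4.1, 6.1, 8.1]  # measured signal (a.u.), offset 0.1
a, b = linear_fit(doses, signals)
print(round(calibrate(5.1, a, b), 3))  # -> 5.0 Gy
```

In practice, gel dosimeter response can be nonlinear over a wide dose range, so a higher-order fit may be needed; the linear form here is only the simplest case.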

  3. Update on ORNL TRANSFORM Tool: Simulating Multi-Module Advanced Reactor with End-to-End I&C

    Energy Technology Data Exchange (ETDEWEB)

    Hale, Richard Edward [ORNL; Fugate, David L [ORNL; Cetiner, Sacit M [ORNL; Qualls, A L [ORNL

    2015-05-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the fourth year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled reactor) concepts, including the use of multiple coupled reactors at a single site. The focus of this report is the development of a steam generator and drum system model that includes the complex dynamics of typical steam drum systems, the development of instrumentation and controls for the steam generator with drum system model, and the development of multi-reactor module models that reflect the full power reactor innovative small module design concept. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor models; ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface technical area; and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the TRANSFORM tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the Advanced Reactors Technology program; (2) developing a library of baseline component modules that can be assembled into full plant models using available geometry, design, and thermal-hydraulic data; (3) defining modeling conventions for interconnecting component models; and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  4. Including 10-Gigabit-capable Passive Optical Network under End-to-End Generalized Multi-Protocol Label Switching Provisioned Quality of Service

    Science.gov (United States)

    Brewka, Lukasz; Gavler, Anders; Wessing, Henrik; Dittmann, Lars

    2012-04-01

    End-to-end quality of service provisioning is still a challenging task despite many years of research and development in this area. Considering a generalized multi-protocol label switching based core/metro network and resource reservation protocol capable home gateways, it is the access part of the network where quality of service signaling is bridged. This article proposes strategies for generalized multi-protocol label switching control over next emerging passive optical network standard, i.e., the 10-gigabit-capable passive optical network. Node management and resource allocation approaches are discussed, and possible issues are raised. The analysis shows that consideration of a 10-gigabit-capable passive optical network as a generalized multi-protocol label switching controlled domain is valid and may advance end-to-end quality of service provisioning for passive optical network based customers.

  5. Including 10-Gigabit-capable Passive Optical Network under End-to-End Generalized Multi-Protocol Label Switching Provisioned Quality of Service

    DEFF Research Database (Denmark)

    Brewka, Lukasz Jerzy; Gavler, Anders; Wessing, Henrik

    2012-01-01

End-to-end quality of service provisioning is still a challenging task despite many years of research and development in this area. Considering a generalized multi-protocol label switching based core/metro network and resource reservation protocol capable home gateways, it is the access part of the network where quality of service signaling is bridged. This article proposes strategies for generalized multi-protocol label switching control over the next emerging passive optical network standard, i.e., the 10-gigabit-capable passive optical network. Node management and resource allocation approaches are discussed, and possible issues are raised. The analysis shows that consideration of a 10-gigabit-capable passive optical network as a generalized multi-protocol label switching controlled domain is valid and may advance end-to-end quality of service provisioning for passive optical network based customers.

  6. SU-E-J-25: End-To-End (E2E) Testing On TomoHDA System Using a Real Pig Head for Intracranial Radiosurgery

    International Nuclear Information System (INIS)

    Corradini, N; Leick, M; Bonetti, M; Negretti, L

    2015-01-01

    Purpose: To determine the MVCT imaging uncertainty on the TomoHDA system for intracranial radiosurgery treatments. To determine the end-to-end (E2E) overall accuracy of the TomoHDA system for intracranial radiosurgery. Methods: A pig head was obtained from the butcher, cut coronally through the brain, and preserved in formaldehyde. The base of the head was fixed to a positioning plate allowing precise movement, i.e. translation and rotation, in all 6 axes. A repeatability test was performed on the pig head to determine uncertainty in the image bone registration algorithm. Furthermore, the test studied images with MVCT slice thicknesses of 1 and 3 mm in unison with differing scan lengths. A sensitivity test was performed to determine the registration algorithm’s ability to find the absolute position of known translations/rotations of the pig head. The algorithm’s ability to determine absolute position was compared against that of manual operators, i.e. a radiation therapist and radiation oncologist. Finally, E2E tests for intracranial radiosurgery were performed by measuring the delivered dose distributions within the pig head using Gafchromic films. Results: The repeatability test uncertainty was lowest for the MVCTs of 1-mm slice thickness, which measured less than 0.10 mm and 0.12 deg for all axes. For the sensitivity tests, the bone registration algorithm performed better than human eyes and a maximum difference of 0.3 mm and 0.4 deg was observed for the axes. E2E test results in absolute position difference measured 0.03 ± 0.21 mm in x-axis and 0.28 ± 0.18 mm in y-axis. A maximum difference of 0.32 and 0.66 mm was observed in x and y, respectively. The average peak dose difference between measured and calculated dose was 2.7 cGy or 0.4%. Conclusion: Our tests using a pig head phantom estimate the TomoHDA system to have a submillimeter overall accuracy for intracranial radiosurgery

  7. Graft repair of the peroneal nerve restores histochemical profile after long-term reinnervation of the rat extensor digitorum longus muscle in contrast to end-to-end repair

    Science.gov (United States)

    Lehnert, M; Maier, B; Frank, J M; Steudel, W I; Marzi, I; Mautes, A

    2004-01-01

    Declining motor function is a prominent feature of ageing physiology. One reason for this is a reduction in plasticity that normally compensates for ongoing reorganization of motor units under physiological conditions. Previous work from our laboratory has shown that microsurgical repair of the transected peroneal nerve is followed by considerable changes in the histochemical profile of the reinnervated extensor digitorum longus (EDL) muscle and that these changes are dependent on both the time and the type of nerve repair. At 6 months postoperatively, a trend toward reversibility could be discerned. In the present work, we analysed the long-term reorganization of histochemical motor unit distribution patterns 15 months after performing either end-to-end repair or grafting of the peroneal nerve in 3-month-old rats. In addition, the EDL muscles of an age-matched control group (age 18 months) were analysed for age-dependent changes. We observed a loss of histochemical organization of motor units leading to an additional fibre type (SDH-INT) in the control group. Fifteen months after end-to-end repair, the histochemical profile showed a decrease in fibre type IIA and an increase in fibre type SDH-INT (P < 0.05), indicating a profound histochemical disorganization of motor units. In contrast, nerve grafting largely restored the histochemical profile of reinnervated EDL muscles. Fibre type grouping was present after both types of nerve repair. These findings show that reorganization of the histochemical profile in reinnervated muscles is dependent on the time and type of nerve repair and is long lasting. In this study, grafting provided superior results compared with end-to-end repair. These long-term results after peripheral nerve repair are influenced by age-dependent changes. Accordingly, nerve repair reduces the normal functional plasticity of motor unit organization. This reduction is enhanced by increasing age. PMID:15610394

  8. The End-to-end Demonstrator for improved decision making in the water sector in Europe (EDgE)

    Science.gov (United States)

    Wood, Eric; Wanders, Niko; Pan, Ming; Sheffield, Justin; Samaniego, Luis; Thober, Stephan; Kumar, Rohinni; Prudhomme, Christel; Houghton-Carr, Helen

    2017-04-01

High-resolution simulations of water resources from hydrological models are vital to supporting important climate services. Apart from a high level of detail, both spatially and temporally, it is important to provide simulations that consistently cover a range of timescales, from historical reanalysis to seasonal forecast and future projections. In the new EDgE project commissioned by the ECMWF (C3S) we try to fulfill these requirements. EDgE is a proof-of-concept project which combines climate data and state-of-the-art hydrological modelling to demonstrate a water-oriented information system implemented through a web application. EDgE is working with key European stakeholders representative of private and public sectors to jointly develop and tailor approaches and techniques. With these tools, stakeholders are assisted in using improved climate information in decision-making, and supported in the development of climate change adaptation and mitigation policies. Here, we present the first results of the EDgE modelling chain, which is divided into three main processes: 1) pre-processing and downscaling; 2) hydrological modelling; 3) post-processing. Consistent downscaling and bias corrections for historical simulations, seasonal forecasts and climate projections ensure that the results across scales are robust. The daily temporal resolution and 5 km spatial resolution ensure locally relevant simulations. With the use of four hydrological models (PCR-GLOBWB, VIC, mHM, Noah-MP), uncertainty between models is properly addressed, while consistency is guaranteed by using identical input data for static land surface parameterizations. The forecast results are communicated to stakeholders via Sectoral Climate Impact Indicators (SCIIs) that have been created in collaboration with the end-user community of the EDgE project. The final product of this project is composed of 15 years of seasonal forecast and 10 climate change projections, all combined with four hydrological models.
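Quantile mapping is one widely used technique for the bias-correction step mentioned above; whether EDgE uses this exact variant is an assumption. The sketch replaces each raw model value by the observed-climatology value at the same empirical quantile.

```python
# Empirical quantile-mapping bias correction sketch.
def empirical_quantile(sorted_vals, q):
    """Linearly interpolated q-th quantile of pre-sorted values."""
    pos = q * (len(sorted_vals) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(sorted_vals) - 1)
    return sorted_vals[lo] + (pos - lo) * (sorted_vals[hi] - sorted_vals[lo])

def quantile_map(value, model_clim, obs_clim):
    """Map a model value to the observed value at the same quantile."""
    m = sorted(model_clim)
    o = sorted(obs_clim)
    rank = sum(v <= value for v in m)   # empirical CDF position in model clim
    q = min(rank / len(m), 1.0)
    return empirical_quantile(o, q)

model = [2.0, 3.0, 4.0, 5.0, 6.0]  # model climatology (e.g. mm/day), biased high
obs = [1.0, 2.0, 3.0, 4.0, 5.0]    # observed climatology
print(round(quantile_map(4.0, model, obs), 2))  # -> 3.4
```

Applying the same mapping consistently to reanalysis, seasonal forecasts, and projections is what keeps the corrected products comparable across timescales.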

  9. Rearrangement of potassium ions and Kv1.1/Kv1.2 potassium channels in regenerating axons following end-to-end neurorrhaphy: ionic images from TOF-SIMS.

    Science.gov (United States)

    Liu, Chiung-Hui; Chang, Hung-Ming; Wu, Tsung-Huan; Chen, Li-You; Yang, Yin-Shuo; Tseng, To-Jung; Liao, Wen-Chieh

    2017-10-01

    The voltage-gated potassium channels Kv1.1 and Kv1.2 that cluster at juxtaparanodal (JXP) regions are essential in the regulation of nerve excitability and play a critical role in axonal conduction. When demyelination occurs, Kv1.1/Kv1.2 activity increases, suppressing the membrane potential nearly to the equilibrium potential of K+, which results in an axonal conduction blockade. The recovery of K+-dependent communication signals and proper clustering of Kv1.1/Kv1.2 channels at JXP regions may directly reflect nerve regeneration following peripheral nerve injury. However, little is known about potassium channel expression and its relationship with the dynamic potassium ion distribution at the node of Ranvier during the regenerative process after peripheral nerve injury (PNI). In the present study, end-to-end neurorrhaphy (EEN) was performed in an in vivo model of PNI. The distribution of K+ in regenerating axons following EEN was detected by time-of-flight secondary-ion mass spectrometry. The specific localization and expression of Kv1.1/Kv1.2 channels were examined by confocal microscopy and western blotting. Our data showed that the re-establishment of K+ distribution and intensity was correlated with the functional recovery of compound muscle action potential morphology in EEN rats. Furthermore, the re-clustering of Kv1.1/Kv1.2 channels at the nodal region of the regenerating nerve 1 and 3 months after EEN corresponded to changes in the K+ distribution. This study provides direct evidence of the K+ distribution in regenerating axons for the first time. We propose that the Kv1.1/Kv1.2 channels re-clustered at the JXP regions of regenerating axons are essential for modulating the proper patterns of K+ distribution in axons and for maintaining membrane potential stability after EEN.

  10. SPAN: A Network Providing Integrated, End-to-End, Sensor-to-Database Solutions for Environmental Sciences

    Science.gov (United States)

    Benzel, T.; Cho, Y. H.; Deschon, A.; Gullapalli, S.; Silva, F.

    2009-12-01

    system works with several existing data storage systems, data models and web-based services as needed by the domain experts; examples include standard MySQL databases, Sensorbase (from UCLA), as well as SPAN Cloud, a system built using Google's App Engine that allows scientists to use Google's cloud-computing infrastructure. We provide a simple, yet flexible data access control mechanism that allows groups of researchers to share their data in SPAN Cloud. In this talk, we will describe the SPAN architecture, its components, our development plans, our vision for the future and results from current deployments that continue to drive the design of our system.
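    The group-based sharing described for SPAN Cloud can be sketched as a simple access-control check; this is a hypothetical illustration, not the actual SPAN Cloud API (all names below are made up):

```python
# dataset -> research groups granted read access (illustrative data)
SHARES = {
    "soil_moisture_2009": {"hydro-lab", "eco-group"},
}

def can_read(dataset, user_groups):
    """A user may read a dataset if any of their groups has been granted
    access to it."""
    return bool(SHARES.get(dataset, set()) & set(user_groups))

can_read("soil_moisture_2009", ["hydro-lab"])   # -> True
can_read("soil_moisture_2009", ["other-team"])  # -> False
```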

  11. Towards a Software Framework to Support Deployment of Low Cost End-to-End Hydroclimatological Sensor Network

    Science.gov (United States)

    Celicourt, P.; Piasecki, M.

    2015-12-01

    Deployment of environmental sensor assemblies based on cheap platforms such as the Raspberry Pi and Arduino has gained much attention over the past few years. While these platforms are attractive because they can be controlled with any of several programming languages, the configuration task can become quite complex due to the need to learn several different proprietary data formats and protocols, which constitutes a bottleneck for the expansion of sensor networks. In response to this rising complexity, the Institute of Electrical and Electronics Engineers (IEEE) has sponsored the development of the IEEE 1451 standard in an attempt to introduce a common standard. The most innovative concept of the standard is the Transducer Electronic Data Sheet (TEDS), which enables transducers to self-identify, self-describe, self-calibrate, and exhibit plug-and-play functionality. We used Python to develop an IEEE 1451.0 platform-independent graphical user interface to generate and provide sufficient information about almost any sensor and sensor platform for sensor programming purposes, automatic calibration of sensor data, and incorporation of back-end demands on data management in TEDS for automatic standards-based data storage, search and discovery purposes. These features are paramount to making data management much less onerous in large-scale sensor networks. Along with the TEDS Creator, we developed a tool named HydroUnits for three specific purposes: encoding of physical units in the TEDS, dimensional analysis, and on-the-fly conversion of time series, allowing users to retrieve data in a desired equivalent unit while accommodating unforeseen and user-defined units. In addition, our back-end data management comprises the Python/Django equivalent of the CUAHSI Observations Data Model (ODM), namely DjangODM, which will be hosted by a MongoDB database server that offers more convenience for our application. We are also developing a data which will be paired with the data
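    The on-the-fly unit conversion HydroUnits performs can be sketched as factor-based conversion through SI base units; the table and function below are an illustration, not the actual HydroUnits API:

```python
# conversion factors into SI base units (illustrative subset)
TO_SI = {
    "m": 1.0, "ft": 0.3048, "in": 0.0254,   # length
    "s": 1.0, "min": 60.0, "hr": 3600.0,    # time
}

def convert(series, src, dst):
    """Convert a time series between two units of the same dimension."""
    factor = TO_SI[src] / TO_SI[dst]
    return [v * factor for v in series]

stage_m = convert([1.0, 2.5, 3.0], "ft", "m")   # water stage logged in feet
```

A real implementation additionally needs the dimensional analysis the abstract mentions, so that a request like feet-to-seconds is rejected rather than silently converted.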

  12. A fully automatic end-to-end method for content-based image retrieval of CT scans with similar liver lesion annotations.

    Science.gov (United States)

    Spanier, A B; Caplan, N; Sosna, J; Acar, B; Joskowicz, L

    2018-01-01

    The goal of medical content-based image retrieval (M-CBIR) is to assist radiologists in the decision-making process by retrieving medical cases similar to a given image. One of the key interests of radiologists is lesions and their annotations, since the patient treatment depends on the lesion diagnosis. Therefore, a key feature of M-CBIR systems is the retrieval of scans with the most similar lesion annotations. To be of value, M-CBIR systems should be fully automatic to handle large case databases. We present a fully automatic end-to-end method for the retrieval of CT scans with similar liver lesion annotations. The input is a database of abdominal CT scans labeled with liver lesions, a query CT scan, and optionally one radiologist-specified lesion annotation of interest. The output is an ordered list of the database CT scans with the most similar liver lesion annotations. The method starts by automatically segmenting the liver in the scan. It then extracts a histogram-based features vector from the segmented region, learns the features' relative importance, and ranks the database scans according to the relative importance measure. The main advantages of our method are that it fully automates the end-to-end querying process, that it uses simple and efficient techniques that are scalable to large datasets, and that it produces quality retrieval results using an unannotated CT scan. Our experimental results on 9 CT queries on a dataset of 41 volumetric CT scans from the 2014 Image CLEF Liver Annotation Task yield an average retrieval accuracy (Normalized Discounted Cumulative Gain index) of 0.77 and 0.84 without/with annotation, respectively. Fully automatic end-to-end retrieval of similar cases based on image information alone, rather than on disease diagnosis, may help radiologists to better diagnose liver lesions.
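    The retrieval core described above (histogram features from the segmented region, weighted by learned relative importance, then ranking) can be sketched as follows; the feature layout, weights and toy data are illustrative, not the paper's exact formulation:

```python
def histogram_features(intensities, bins=8, lo=0.0, hi=256.0):
    """Normalized intensity histogram of a segmented region."""
    hist = [0] * bins
    width = (hi - lo) / bins
    for v in intensities:
        hist[min(int((v - lo) / width), bins - 1)] += 1
    return [h / len(intensities) for h in hist]

def rank_database(query_feat, database, weights):
    """Order database cases by weighted L1 distance to the query features."""
    def dist(feat):
        return sum(w * abs(a - b)
                   for w, a, b in zip(weights, query_feat, feat))
    return sorted(database, key=lambda case: dist(case["features"]))

query = histogram_features([10, 10, 200, 200], bins=2)        # -> [0.5, 0.5]
database = [{"id": "A", "features": [0.5, 0.5]},
            {"id": "B", "features": [1.0, 0.0]}]
ranked = rank_database(query, database, weights=[1.0, 1.0])   # "A" first
```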

  13. The Gamma-ray Cherenkov Telescope, an end-to end Schwarzschild-Couder telescope prototype proposed for the Cherenkov Telescope Array

    Science.gov (United States)

    Dournaux, J. L.; Abchiche, A.; Allan, D.; Amans, J. P.; Armstrong, T. P.; Balzer, A.; Berge, D.; Boisson, C.; Bousquet, J.-J.; Brown, A. M.; Bryan, M.; Buchholtz, G.; Chadwick, P. M.; Costantini, H.; Cotter, G.; Dangeon, L.; Daniel, M. K.; De Franco, A.; De Frondat, F.; Dumas, D.; Ernenwein, J. P.; Fasola, G.; Funk, S.; Gironnet, J.; Graham, J. A.; Greenshaw, T.; Hameau, B.; Hervet, O.; Hidaka, N.; Hinton, J. A.; Huet, J. M.; Jégouzo, I.; Jogler, T.; Kawashima, T.; Kraush, M.; Lapington, J. S.; Laporte, P.; Lefaucheur, J.; Markoff, S.; Melse, T.; Mohrmann, L.; Molyneux, P.; Nolan, S. J.; Okumura, A.; Osborne, J. P.; Parsons, R. D.; Rosen, S.; Ross, D.; Rowell, G.; Rulten, C. B.; Sato, Y.; Sayède, F.; Schmoll, J.; Schoorlemmer, H.; Servillat, M.; Sol, H.; Stamatescu, V.; Stephan, M.; Stuik, R.; Sykes, J.; Tajima, H.; Thornhill, J.; Tibaldo, L.; Trichard, C.; Vink, J.; Watson, J. J.; White, R.; Yamane, N.; Zech, A.; Zink, A.

    2016-08-01

    The GCT (Gamma-ray Cherenkov Telescope) is a dual-mirror prototype of the Small-Sized Telescopes proposed for the Cherenkov Telescope Array (CTA) and built by an Australian-Dutch-French-German-Indian-Japanese-UK-US consortium. The integration of this end-to-end telescope was achieved in 2015. On-site tests and measurements of the first Cherenkov images on the night sky began in November 2015. This contribution describes the telescope and the plans for pre-production and large-scale production within CTA.

  14. THE EXPOSURE OF GUINEA PIGS TO PRESSURE-PULSES GENERATED DURING THE END-TO-END TEST (NO. 2) OF ATLAS MISSILE 8-D

    Science.gov (United States)

    ... extent of the blast hazard ... charge. Three guinea pigs were placed on the pressure control unit, which was located beneath the ramp 90 ft from the missile. In addition, ten guinea pigs were ... pressure pulse was slow rising (9-14 msec) and endured for about 25 msec. The three guinea pigs at that location were unharmed. At the 30-ft ranges ...

  15. End-to-end single cyanato and thiocyanato bridged Cu(II) polymers with a new tridentate Schiff base ligand: crystal structure and magnetic properties.

    Science.gov (United States)

    Talukder, Pritha; Datta, Amitabha; Mitra, Samiran; Rosair, Georgina; El Fallah, M Salah; Ribas, Joan

    2004-12-21

    A new tridentate Schiff base ligand HL (L = C14H19N2O), derived from the condensation of benzoylacetone and 2-dimethylaminoethylamine in a 1:1 ratio, reacts with copper(II) acetate and cyanate, thiocyanate or azide to give rise to several end-to-end polymeric complexes of formulae [CuL(mu(1,3)-NCO)]n (1) and [CuL(mu(1,3)-NCS)]n (2), while complex 3 has two crystallographically independent units of formula [CuL(N3)] in the asymmetric unit cell. Complex 3 exists in dimeric form rather than as a polymeric chain. Compound 1 is the first report of a singly end-to-end cyanate-bridged polymeric chain of Cu(II) with a Schiff base as a co-ligand. There are many examples of doubly NCS-bridged polymeric chains, but fewer singly bridged ones such as compound 2. We have characterized these complexes by analytical, spectroscopic, structural and variable-temperature magnetic susceptibility measurements. The coordination geometry around the Cu(II) centers is distorted square pyramidal for 1 and 2 and square planar for complex 3. The magnetic susceptibility data show slight antiferromagnetic coupling for the polymers, with J values of -0.19 and -0.57 cm(-1) for complexes 1 and 2 respectively. The low values of J are consistent with the equatorial-axial disposition of the bridges in the polymers.

  16. Crystal structure of Aquifex aeolicus gene product Aq1627: a putative phosphoglucosamine mutase reveals a unique C-terminal end-to-end disulfide linkage.

    Science.gov (United States)

    Sridharan, Upasana; Kuramitsu, Seiki; Yokoyama, Shigeyuki; Kumarevel, Thirumananseri; Ponnuraj, Karthe

    2017-06-27

    The Aq1627 gene from Aquifex aeolicus, a hyperthermophilic bacterium, has been cloned and overexpressed in Escherichia coli. The protein was purified to homogeneity and its X-ray crystal structure was determined to 1.3 Å resolution using multi-wavelength anomalous dispersion phasing. The structural and sequence analysis of Aq1627 suggests it is a putative phosphoglucosamine mutase. Its structural features further indicate that it could belong to a new subclass of the phosphoglucosamine mutase family. The Aq1627 structure contains a unique C-terminal end-to-end disulfide bond that links two monomers, and this structural information can be used in protein engineering to make proteins more stable for different applications.

  17. Imaging and dosimetric errors in 4D PET/CT-guided radiotherapy from patient-specific respiratory patterns: a dynamic motion phantom end-to-end study

    International Nuclear Information System (INIS)

    Bowen, S R; Nyflot, M J; Meyer, J; Sandison, G A; Herrmann, C; Groh, C M; Wollenweber, S D; Stearns, C W; Kinahan, P E

    2015-01-01

    Effective positron emission tomography/computed tomography (PET/CT) guidance in radiotherapy of lung cancer requires estimation and mitigation of errors due to respiratory motion. An end-to-end workflow was developed to measure patient-specific motion-induced uncertainties in imaging, treatment planning, and radiation delivery with respiratory motion phantoms and dosimeters. A custom torso phantom with inserts mimicking normal lung tissue and lung lesion was filled with [18F]FDG. The lung lesion insert was driven by six different patient-specific respiratory patterns or kept stationary. PET/CT images were acquired under motionless ground truth, tidal breathing motion-averaged (3D), and respiratory phase-correlated (4D) conditions. Target volumes were estimated by standardized uptake value (SUV) thresholds that accurately defined the ground-truth lesion volume. Non-uniform dose-painting plans using volumetrically modulated arc therapy were optimized for fixed normal lung and spinal cord objectives and variable PET-based target objectives. Resulting plans were delivered to a cylindrical diode array at rest, in motion on a platform driven by the same respiratory patterns (3D), or motion-compensated by a robotic couch with an infrared camera tracking system (4D). Errors were estimated relative to the static ground truth condition for mean target-to-background (T/Bmean) ratios, target volumes, planned equivalent uniform target doses, and 2%-2 mm gamma delivery passing rates. Relative to motionless ground truth conditions, PET/CT imaging errors were on the order of 10–20%, treatment planning errors were 5–10%, and treatment delivery errors were 5–30% without motion compensation. Errors from residual motion following compensation methods were reduced to 5–10% in PET/CT imaging, <5% in treatment planning, and <2% in treatment delivery. We have demonstrated that estimation of respiratory motion uncertainty and its propagation from PET/CT imaging to RT
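    The 2%-2 mm gamma passing rate used as the delivery metric combines a dose-difference term and a distance-to-agreement term in quadrature. A much-simplified 1D global-gamma sketch follows; it is illustrative only, not the clinical implementation, and the dose profiles are made up:

```python
def gamma_1d(ref, meas, spacing, dose_tol=0.02, dist_tol=2.0):
    """Simplified 1D global gamma: ref/meas are dose profiles on the same
    grid, spacing in mm; dose_tol is a fraction of the reference maximum."""
    d_max = max(ref)
    gammas = []
    for i, dr in enumerate(ref):
        best = float("inf")
        for j, dm in enumerate(meas):
            dd = (dm - dr) / (dose_tol * d_max)   # dose-difference term
            dx = (j - i) * spacing / dist_tol     # distance-to-agreement term
            best = min(best, (dd * dd + dx * dx) ** 0.5)
        gammas.append(best)
    return gammas

def passing_rate(gammas):
    """Fraction of points with gamma <= 1."""
    return sum(g <= 1.0 for g in gammas) / len(gammas)

ref = [0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0]
ideal = passing_rate(gamma_1d(ref, ref, spacing=1.0))   # identical -> 1.0
```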

  18. Stapled side-to-side anastomosis might be better than handsewn end-to-end anastomosis in ileocolic resection for Crohn's disease: a meta-analysis.

    Science.gov (United States)

    He, Xiaosheng; Chen, Zexian; Huang, Juanni; Lian, Lei; Rouniyar, Santosh; Wu, Xiaojian; Lan, Ping

    2014-07-01

    Ileocolic anastomosis is an essential step in the treatment to restore continuity of the gastrointestinal tract following ileocolic resection in patients with Crohn's disease (CD). However, the association between anastomotic type and surgical outcome is controversial. The aim of this meta-analysis is to compare surgical outcomes between stapled side-to-side anastomosis (SSSA) and handsewn end-to-end anastomosis (HEEA) after ileocolic resection in patients with CD. Studies comparing SSSA with HEEA after ileocolic resection in patients with CD were identified in PubMed and EMBASE. Outcomes such as complication, recurrence, and re-operation were evaluated. Eight studies (three randomized controlled trials, one prospective non-randomized trial, and four non-randomized retrospective trials) comparing SSSA (396 cases) and HEEA (425 cases) were included. As compared with HEEA, SSSA was superior in terms of overall postoperative complications [odds ratio (OR) 0.54; 95% confidence interval (CI) 0.32-0.93], anastomotic leak (OR 0.45; 95% CI 0.20-1.00), recurrence (OR 0.20; 95% CI 0.07-0.55), and re-operation for recurrence (OR 0.18; 95% CI 0.07-0.45). Postoperative hospital stay, mortality, and complications other than anastomotic leak were comparable. Based on the results of our meta-analysis, SSSA would appear to be the preferred procedure after ileocolic resection for CD, with reduced overall postoperative complications, especially anastomotic leak, and a decreased recurrence and re-operation rate.
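    For readers unfamiliar with how such pooled odds ratios are produced, a fixed-effect (inverse-variance) pooling can be sketched as below. The 2x2 counts are hypothetical, and the meta-analysis itself may have used a different estimator (e.g. Mantel-Haenszel or a random-effects model):

```python
import math

def study_log_or(a, b, c, d):
    """Log odds ratio and its variance from a 2x2 table:
    a/b = events/non-events in group 1, c/d = events/non-events in group 2."""
    log_or = math.log((a * d) / (b * c))
    var = 1 / a + 1 / b + 1 / c + 1 / d
    return log_or, var

def pooled_or(tables):
    """Fixed-effect (inverse-variance) pooled OR with a 95% CI."""
    num = den = 0.0
    for t in tables:
        log_or, var = study_log_or(*t)
        w = 1.0 / var                 # inverse-variance weight
        num += w * log_or
        den += w
    mean = num / den
    se = (1.0 / den) ** 0.5
    return (math.exp(mean), math.exp(mean - 1.96 * se),
            math.exp(mean + 1.96 * se))

tables = [(18, 180, 30, 182), (12, 120, 25, 130)]   # hypothetical counts
or_est, ci_lo, ci_hi = pooled_or(tables)            # OR < 1 favors group 1
```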

  19. Poly(ethyl glyoxylate)-Poly(ethylene oxide) Nanoparticles: Stimuli-Responsive Drug Release via End-to-End Polyglyoxylate Depolymerization.

    Science.gov (United States)

    Fan, Bo; Gillies, Elizabeth R

    2017-08-07

    The ability to disrupt polymer assemblies in response to specific stimuli provides the potential to release drugs selectively at certain sites or conditions in vivo. However, most stimuli-responsive delivery systems require many stimuli-initiated events to release drugs. "Self-immolative polymers" offer the potential to provide amplified responses to stimuli as they undergo complete end-to-end depolymerization following the cleavage of a single end-cap. Herein, linker end-caps were developed to conjugate self-immolative poly(ethyl glyoxylate) (PEtG) with poly(ethylene oxide) (PEO) to form amphiphilic block copolymers. These copolymers were self-assembled to form nanoparticles in aqueous solution. Cleavage of the linker end-caps was triggered by a thiol reducing agent, UV light, H2O2, and combinations of these stimuli, resulting in nanoparticle disintegration. Low stimuli concentrations were effective in rapidly disrupting the nanoparticles. Nile red, doxorubicin, and curcumin were encapsulated into the nanoparticles and were selectively released upon application of the appropriate stimulus. The ability to tune the stimuli-responsiveness simply by changing the linker end-cap makes this new platform highly attractive for applications in drug delivery.

  20. [Incidence of painful neuroma after end-to-end nerve suture wrapped into a collagen conduit. A prospective study of 185 cases].

    Science.gov (United States)

    Thomsen, L; Schlur, C

    2013-10-01

    Three to 5% of nerves that are directly and correctly sutured evolve towards significant neuropathic pain. The psychological, social and economic impact of such an outcome is very important. The purpose of this study was to evaluate the incidence of the occurrence of a trigger zone or a neuroma, at a maximum follow-up of 6 months, after direct nerve suture wrapped in a type 1 collagen tube. Every patient treated for a traumatic nerve injury from November 2008 to March 2012 was included in the study. The exclusion criteria were any replantation, nerve tissue defect, or any distal nerve stump that could not technically be wrapped. The only conduit used was made of collagen type 1 (Revolnerv(®), Orthomed™). All patients were examined after one, three and six months for a clinical evaluation made by the same surgeon. The appearance of a trigger zone or a true neuroma was clinically assessed. One hundred and seventy-four patients, for a total of 197 sutured nerves, were included in the study. At the 6-month follow-up, 163 patients were evaluated for a total of 185 nerves. No patient suffered from a neuroma at this time. As the treatment of neuroma is very difficult, considering the cost and the results, wrapping direct end-to-end sutures in a collagen type 1 tube seems to help prevent the appearance of a neuroma. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  1. End-to-end process of hollow spacecraft structures with high frequency and low mass obtained with in-house structural optimization tool and additive manufacturing

    Directory of Open Access Journals (Sweden)

    Alexandru-Mihai CISMILIANU

    2017-09-01

    Full Text Available In the space sector the most decisive elements are mass reduction, cost saving and minimum lead time; here, structural optimization and additive layer manufacturing (ALM) fit best. The design must be driven by stiffness, because an important requirement for spacecraft (S/C) structures is to reduce the dynamic coupling between the S/C and the launch vehicle. The objective is to create an end-to-end process, from the input given by the customer to the manufacturing of an aluminum part that is as light as possible but at the same time considerably stiffer, while taking full advantage of the design flexibility given by ALM. To design and optimize the parts, a specialized in-house tool was used, guaranteeing a load-sufficient material distribution. Using topological optimization, the iterations between the design and the stress departments were diminished, thus greatly reducing the lead time. In order to improve and lighten the obtained structure, a design with internal cavities and hollow beams was considered. This implied developing a procedure for powder evacuation through iterations with the manufacturer while optimizing the design for ALM. The resulting part can then be manufactured via ALM with no need of further design adjustments. To achieve a high-quality part with maximum efficiency, it is essential to have a loop between the design team and the manufacturer. Topological optimization and ALM work hand in hand if used properly. The team achieved a more efficient structure using topology optimization and ALM than using conventional design and manufacturing methods.

  2. Automated segmentation of 3D anatomical structures on CT images by using a deep convolutional network based on end-to-end learning approach

    Science.gov (United States)

    Zhou, Xiangrong; Takayama, Ryosuke; Wang, Song; Zhou, Xinxin; Hara, Takeshi; Fujita, Hiroshi

    2017-02-01

    We have proposed an end-to-end learning approach that trains a deep convolutional neural network (CNN) for automatic CT image segmentation, accomplishing a voxel-wise multiple classification that directly maps each voxel on 3D CT images to an anatomical label automatically. The novelties of our proposed method were (1) transforming the anatomical structure segmentation on 3D CT images into a majority voting of the results of 2D semantic image segmentation on a number of 2D slices from different image orientations, and (2) using "convolution" and "deconvolution" networks to achieve the conventional "coarse recognition" and "fine extraction" functions, which were integrated into a compact all-in-one deep CNN for CT image segmentation. The advantage compared to previous works was its capability to accomplish real-time image segmentation on 2D slices of arbitrary CT scan range (e.g. body, chest, abdomen) and produce correspondingly sized output. In this paper, we propose an improvement of our approach by adding an organ localization module to limit the CT image range for training and testing the deep CNNs. A database consisting of 240 3D CT scans and human-annotated ground truth was used for training (228 cases) and testing (the remaining 12 cases). We applied the improved method to segment pancreas and left kidney regions, respectively. The preliminary results showed that the segmentation accuracy improved significantly (the Jaccard index increased by 34% for pancreas and 8% for kidney relative to our previous results). The effectiveness and usefulness of the proposed improvement for CT image segmentation were confirmed.
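    The first novelty, fusing per-orientation 2D segmentations by majority voting, can be sketched independently of any deep-learning framework; the flattened voxel grid below is a toy example, not the paper's data:

```python
def majority_vote(label_maps):
    """Fuse per-orientation segmentation labels for one voxel grid by
    majority voting; ties are broken toward the smaller label value."""
    fused = []
    for votes in zip(*label_maps):           # one tuple of labels per voxel
        counts = {}
        for lab in votes:
            counts[lab] = counts.get(lab, 0) + 1
        fused.append(max(counts, key=lambda l: (counts[l], -l)))
    return fused

# three orientations' predicted labels for five voxels
axial    = [0, 1, 1, 2, 2]
coronal  = [0, 1, 2, 2, 0]
sagittal = [1, 1, 1, 2, 2]
fused = majority_vote([axial, coronal, sagittal])   # -> [0, 1, 1, 2, 2]
```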

  3. Operating performance of the gamma-ray Cherenkov telescope: An end-to-end Schwarzschild–Couder telescope prototype for the Cherenkov Telescope Array

    Energy Technology Data Exchange (ETDEWEB)

    Dournaux, J.L., E-mail: jean-laurent.dournaux@obspm.fr [GEPI, Observatoire de Paris, PSL Research University, CNRS, Sorbonne Paris Cité, Université Paris Diderot, Place J. Janssen, 92190 Meudon (France); De Franco, A. [Department of Physics, University of Oxford, Keble Road, Oxford OX1 3RH (United Kingdom); Laporte, P. [GEPI, Observatoire de Paris, PSL Research University, CNRS, Sorbonne Paris Cité, Université Paris Diderot, Place J. Janssen, 92190 Meudon (France); White, R. [Max-Planck-Institut für Kernphysik, Saupfercheckweg 1, 69117 Heidelberg (Germany); Greenshaw, T. [University of Liverpool, Oliver Lodge Laboratory, P.O. Box 147, Oxford Street, Liverpool L69 3BX (United Kingdom); Sol, H. [LUTH, Observatoire de Paris, PSL Research University, CNRS, Université Paris Diderot, Place J. Janssen, 92190 Meudon (France); Abchiche, A. [CNRS, Division technique DT-INSU, 1 Place Aristide Briand, 92190 Meudon (France); Allan, D. [Department of Physics and Centre for Advanced Instrumentation, Durham University, South Road, Durham DH1 3LE (United Kingdom); Amans, J.P. [GEPI, Observatoire de Paris, PSL Research University, CNRS, Sorbonne Paris Cité, Université Paris Diderot, Place J. Janssen, 92190 Meudon (France); Armstrong, T.P. [Department of Physics and Centre for Advanced Instrumentation, Durham University, South Road, Durham DH1 3LE (United Kingdom); Balzer, A.; Berge, D. [GRAPPA, University of Amsterdam, Science Park 904, 1098 XH Amsterdam (Netherlands); Boisson, C. [LUTH, Observatoire de Paris, PSL Research University, CNRS, Université Paris Diderot, Place J. Janssen, 92190 Meudon (France); and others

    2017-02-11

    The Cherenkov Telescope Array (CTA) consortium aims to build the next-generation ground-based very-high-energy gamma-ray observatory. The array will feature different sizes of telescopes allowing it to cover a wide gamma-ray energy band from about 20 GeV to above 100 TeV. The highest energies, above 5 TeV, will be covered by a large number of Small-Sized Telescopes (SSTs) with a field-of-view of around 9°. The Gamma-ray Cherenkov Telescope (GCT), based on Schwarzschild–Couder dual-mirror optics, is one of the three proposed SST designs. The GCT is described in this contribution and the first images of Cherenkov showers obtained using the telescope and its camera are presented. These were obtained in November 2015 in Meudon, France.

  4. WE-DE-BRA-11: A Study of Motion Tracking Accuracy of Robotic Radiosurgery Using a Novel CCD Camera Based End-To-End Test System

    Energy Technology Data Exchange (ETDEWEB)

    Wang, L; M Yang, Y [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, CA (United States); Nelson, B [Logos Systems Intl, Scotts Valley, CA (United States)

    2016-06-15

    Purpose: A novel end-to-end test system using a CCD camera and a scintillator-based phantom (XRV-124, Logos Systems Int’l) capable of measuring the beam-by-beam delivery accuracy of robotic radiosurgery (CyberKnife) was developed and reported in our previous work. This work investigates its application in assessing the motion tracking (Synchrony) accuracy of CyberKnife. Methods: A QA plan with anterior and lateral beams (with 4 different collimator sizes) was created (Multiplan v5.3) for the XRV-124 phantom. The phantom was placed on a motion platform (superior and inferior movement), and the plans were delivered on the CyberKnife M6 system using four motion patterns: static, sine wave, sine with 15° phase shift, and a patient breathing pattern composed of 2 cm maximum motion with a 4 second breathing cycle. Under integral recording mode, the time-averaged beam vectors (X, Y, Z) were measured by the phantom and compared with static delivery. In dynamic recording mode, the beam spots were recorded at a rate of 10 frames/second. The beam vector deviation from the average position was evaluated against the various breathing patterns. Results: The average beam positions of the six deliveries with no motion and three deliveries with Synchrony tracking on ideal motion (sine wave without phase shift) all agree within −0.03±0.00, 0.10±0.04, and 0.04±0.03 mm in the X, Y, and Z directions. Radiation beam width (FWHM) variations are within ±0.03 mm. Dynamic video recording showed submillimeter tracking stability for both regular and irregular breathing patterns; however, tracking errors up to 3.5 mm were observed when a 15 degree phase shift was introduced. Conclusion: The XRV-124 system is able to provide 3D and 4D targeting accuracy for CyberKnife delivery with Synchrony. The experimental results showed sub-millimeter delivery accuracy in phantom with excellent correlation between target and breathing motion. The accuracy was degraded when irregular motion and phase shift were introduced.

  5. OpenCyto: an open source infrastructure for scalable, robust, reproducible, and automated, end-to-end flow cytometry data analysis.

    Directory of Open Access Journals (Sweden)

    Greg Finak

    2014-08-01

    Full Text Available Flow cytometry is used increasingly in clinical research for cancer, immunology and vaccines. Technological advances in cytometry instrumentation are increasing the size and dimensionality of data sets, posing a challenge for traditional data management and analysis. Automated analysis methods, despite a general consensus of their importance to the future of the field, have been slow to gain widespread adoption. Here we present OpenCyto, a new BioConductor infrastructure and data analysis framework designed to lower the barrier of entry to automated flow data analysis algorithms by addressing key areas that we believe have held back wider adoption of automated approaches. OpenCyto supports end-to-end data analysis that is robust and reproducible while generating results that are easy to interpret. We have improved the existing, widely used core BioConductor flow cytometry infrastructure by allowing analysis to scale in a memory-efficient manner to the large flow data sets that arise in clinical trials, and by integrating domain-specific knowledge as part of the pipeline through the hierarchical relationships among cell populations. Pipelines are defined through a text-based csv file, limiting the need to write data-specific code, and are data agnostic, to simplify repetitive analysis for core facilities. We demonstrate how to analyze two large cytometry data sets: an intracellular cytokine staining (ICS) data set from a published HIV vaccine trial focused on detecting rare, antigen-specific T-cell populations, where we identify a new subset of CD8 T-cells with a vaccine-regimen-specific response that could not be identified through manual analysis, and a CyTOF T-cell phenotyping data set where a large staining panel and many cell populations are a challenge for traditional analysis. The substantial improvements to the core BioConductor flow cytometry packages give OpenCyto the potential for wide adoption. It can rapidly leverage new developments in
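    The idea of a csv-defined, hierarchy-aware gating pipeline can be illustrated with a toy threshold-gating sketch. OpenCyto itself is an R/BioConductor framework and its real template has more columns; the schema, channels and data below are simplified and hypothetical:

```python
import csv
import io

# Toy gating template: each row defines a 1D threshold gate on a parent
# population (columns are simplified, not OpenCyto's actual schema).
TEMPLATE = """population,parent,channel,threshold
CD3+,root,CD3,0.5
CD8+,CD3+,CD8,0.5
"""

def run_pipeline(events, template_csv):
    """Apply a hierarchy of threshold gates; each event is a dict of
    channel -> fluorescence value. Template rows must be ordered
    parent-before-child."""
    populations = {"root": events}
    for gate in csv.DictReader(io.StringIO(template_csv)):
        parent = populations[gate["parent"]]
        thr = float(gate["threshold"])
        populations[gate["population"]] = [
            e for e in parent if e[gate["channel"]] > thr
        ]
    return populations

events = [{"CD3": 0.9, "CD8": 0.8},
          {"CD3": 0.9, "CD8": 0.1},
          {"CD3": 0.2, "CD8": 0.9}]
pops = run_pipeline(events, TEMPLATE)   # CD3+: 2 events, CD8+: 1 event
```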

  6. RTEMP: Exploring an end-to-end, agnostic platform for multidisciplinary real-time analytics in the space physics community and beyond

    Science.gov (United States)

    Chaddock, D.; Donovan, E.; Spanswick, E.; Jackel, B. J.

    2014-12-01

    Large-scale, real-time, sensor-driven analytics are a highly effective set of tools in many research environments; however, the barrier to entry is expensive and the learning curve is steep. These systems need to operate efficiently from end to end, with the key aspects being data transmission, acquisition, management and organization, and retrieval. When building a generic multidisciplinary platform, acquisition and data management need to be designed with scalability and flexibility as the primary focus. Additionally, in order to leverage current sensor web technologies, the integration of common sensor data standards (i.e. SensorML and SWE services) should be supported. Perhaps most importantly, researchers should be able to get started and integrate the platform into their set of research tools as easily and quickly as possible. The largest issue with current platforms is that the sensor data must be formed and described using the previously mentioned standards. As useful as these standards are for organizing data, they are cumbersome to adopt, often restrictive, and required to be geospatially driven. Our solution, RTEMP (Real-time Environment Monitoring Platform), is a real-time analytics platform with over ten years and an estimated two million dollars of investment. It has been developed for our continuously expanding requirements of operating and building remote sensors and supporting equipment for space physics research. A key benefit of our approach is RTEMP's ability to manage agnostic data. This allows data that flows through the system to be structured in any way that best addresses the needs of the sensor operators and data users, enabling extensive flexibility and streamlined development and research. Here we begin with an overview of RTEMP and how it is structured. Additionally, we will showcase the ways that we are using RTEMP and how it is being adopted by researchers in an increasingly broad range of other research fields. We will lay out a

  7. A new 1D manganese(II) coordination polymer with end-to-end azide bridge and isonicotinoylhydrazone Schiff base ligand: Crystal structure, Hirshfeld surface, NBO and thermal analyses

    Science.gov (United States)

    Khani, S.; Montazerozohori, M.; Masoudiasl, A.; White, J. M.

    2018-02-01

    A new manganese(II) coordination polymer, [MnL2 (μ-1,3-N3)2]n, with co-ligands comprising the azide anion and a Schiff base derived from isonicotinoylhydrazone has been synthesized and characterized. The crystal structure determination shows that the azide ligand acts as an end-to-end (EE) bridging ligand and generates a one-dimensional coordination polymer. In this compound, each manganese(II) metal center is hexa-coordinated by four azide nitrogens and two pyridinic nitrogens, forming an octahedral geometry. The analysis of the crystal packing indicates that the 1D chain of [MnL2 (μ-1,3-N3)2]n is stabilized into a 3D supramolecular network by intra- and inter-chain intermolecular interactions of the type X-H···Y (X = N and C; Y = O and N). Hirshfeld surface analysis and 2D fingerprint plots have been used for a more detailed investigation of the intermolecular interactions. Natural bond orbital (NBO) analysis was also performed to obtain information about atomic charge distributions, hybridizations and the strength of interactions. Finally, thermal analysis showed complete decomposition of the compound over three thermal steps.

  8. Perceptual Objective Listening Quality Assessment (POLQA), The Third Generation ITU-T Standard for End-to-End Speech Quality Measurement : Part II – Perceptual Model

    NARCIS (Netherlands)

    Beerends, J.G.; Schmidmer, C.; Berger, J.; Obermann, M.; Ullman, R.; Pomy, J.; Keyhl, M.

    2013-01-01

    In this and the companion paper Part I, the authors present the Perceptual Objective Listening Quality Assessment (POLQA), the third-generation speech quality measurement algorithm, standardized by the International Telecommunication Union in 2011 as Recommendation P.863. This paper describes the

  9. Demonstration of the First Real-Time End-to-End 40-Gb/s PAM-4 for Next-Generation Access Applications using 10-Gb/s Transmitter

    DEFF Research Database (Denmark)

    Wei, J. L.; Eiselt, Nicklas; Griesser, Helmut

    2016-01-01

    We demonstrate the first known experiment of a real-time end-to-end 40-Gb/s PAM-4 system for next-generation access applications using 10-Gb/s class transmitters only. Based on the measurement of a real-time 40-Gb/s PAM system, low-cost upstream and downstream link power budgets are estimated. Up...
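For readers unfamiliar with PAM-4, the modulation at the heart of the experiment can be sketched as a Gray-coded bit-to-level mapping: two bits per symbol double the bit rate relative to NRZ for the same symbol rate, and Gray coding ensures a single-level slicer error corrupts only one bit. The mapping below is the conventional one and is an assumption, since the exact coding used in the paper is not stated in this abstract.

```python
# Gray-coded PAM-4 mapping: adjacent amplitude levels differ by one bit.
GRAY_PAM4 = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}
INV = {v: k for k, v in GRAY_PAM4.items()}

def modulate(bits):
    """Map a bit sequence (even length) onto PAM-4 amplitude levels."""
    assert len(bits) % 2 == 0
    return [GRAY_PAM4[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

def demodulate(levels):
    """Hard-decision slicer: pick the nearest nominal level, emit its bits."""
    out = []
    for lvl in levels:
        nearest = min(INV, key=lambda v: abs(v - lvl))
        out.extend(INV[nearest])
    return out

bits = [1, 0, 0, 1, 1, 1, 0, 0]
symbols = modulate(bits)              # 4 symbols carry 8 bits
print(symbols)                        # [3, -1, 1, -3]
# Small additive noise still decodes correctly with a nearest-level slicer:
assert demodulate([s + 0.4 for s in symbols]) == bits
```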

  10. SU-E-T-109: Development of An End-To-End Test for the Varian TrueBeamtm with a Novel Multiple-Dosimetric Modality H and N Phantom

    Energy Technology Data Exchange (ETDEWEB)

    Zakjevskii, V; Knill, C; Rakowski, J; Snyder, M [Wayne State University, Karmanos Cancer Institute, Detroit, MI (United States)

    2014-06-01

    Purpose: To develop a comprehensive end-to-end test for Varian's TrueBeam linear accelerator for head and neck IMRT using a custom phantom designed to utilize multiple dosimetry devices. Methods: The initial end-to-end test and custom H and N phantom were designed to yield maximum information in anatomical regions significant to H and N plans with respect to: i) geometric accuracy, ii) dosimetric accuracy, and iii) treatment reproducibility. The phantom was designed in collaboration with Integrated Medical Technologies. A CT image was taken with a 1 mm slice thickness. The CT was imported into Varian's Eclipse treatment planning system, where OARs and the PTV were contoured. A clinical template was used to create an eight-field static-gantry-angle IMRT plan. After optimization, dose was calculated using the Analytic Anisotropic Algorithm with inhomogeneity correction. Plans were delivered with a TrueBeam equipped with a high-definition MLC. Preliminary end-to-end results were measured using film and ion chambers. Ion chamber dose measurements were compared to the TPS. Films were analyzed with FilmQAPro using the composite gamma index. Results: Film analysis for the initial end-to-end plan with a geometrically simple PTV showed average gamma pass rates >99% with a passing criterion of 3% / 3 mm. Film analysis of a plan with a more realistic, i.e. complex, PTV yielded pass rates >99% in clinically important regions containing the PTV, spinal cord and parotid glands. Ion chamber measurements were on average within 1.21% of calculated dose for both plans. Conclusion: Trials have demonstrated that our end-to-end testing methods provide baseline values for the dosimetric and geometric accuracy of Varian's TrueBeam system.
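The film analysis above relies on the composite gamma index with a 3%/3 mm criterion. A minimal one-dimensional sketch of the gamma computation (after Low et al.) is shown below; it illustrates the metric only and is not the FilmQAPro implementation.

```python
import math

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dd=0.03, dta=3.0):
    """Toy 1-D gamma index: for each reference point, find the minimum
    combined dose-difference / distance-to-agreement metric over the
    evaluated distribution. dd = fractional dose criterion (3%, local
    normalization), dta = distance criterion in mm (3 mm)."""
    gammas = []
    for rp, rd in zip(ref_pos, ref_dose):
        g2 = min(((ep - rp) / dta) ** 2 + ((ed - rd) / (dd * rd)) ** 2
                 for ep, ed in zip(eval_pos, eval_dose))
        gammas.append(math.sqrt(g2))
    return gammas

def pass_rate(gammas):
    # A point passes when gamma <= 1, i.e. it is within the combined criterion.
    return 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)

pos = [float(i) for i in range(10)]                           # positions, mm
ref = [100.0] * 10                                            # planned dose
meas = [100.0 + (1.5 if i == 4 else 0.0) for i in range(10)]  # 1.5% local error
print(pass_rate(gamma_1d(pos, ref, pos, meas)))               # 100.0: within 3%/3mm
```

Note how the distance-to-agreement term lets a point dose error pass when a nearby evaluated point agrees in dose, which is exactly why gamma is preferred over a plain dose-difference map for steep-gradient IMRT fields.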

  11. SU-F-J-150: Development of An End-To-End Chain Test for the First-In-Man MR-Guided Treatments with the MRI Linear Accelerator by Using the Alderson Phantom

    Energy Technology Data Exchange (ETDEWEB)

    Hoogcarspel, S; Kerkmeijer, L; Lagendijk, J; Van Vulpen, M; Raaymakers, B [University Medical Center Utrecht, Utrecht, Utrecht (Netherlands)

    2016-06-15

    The Alderson phantom is a human-shaped quality assurance tool that has been used for over 30 years in radiotherapy. The phantom can provide integrated tests of the entire chain of treatment planning and delivery. The purpose of this research was to investigate whether this phantom can be used to chain test a treatment on the MRI linear accelerator (MRL) which is currently being developed at the UMC Utrecht, in collaboration with Elekta and Philips. This was demonstrated by chain testing the future First-in-Man treatments with this system. An Alderson phantom was used to chain test an entire treatment with the MRL. First, a CT was acquired of the phantom with additional markers that are visible on both MR and CT. A treatment plan for treating bone metastases in the sacrum was made. The phantom was subsequently placed in the MRL. For MR imaging, a 3D volume was acquired. The initially developed treatment plan was then simulated on the new MRI dataset. For simulation, both the MR and CT data were used by registering them together. Before treatment delivery an MV image was acquired and compared with a DRR that was calculated from the MR/CT registration data. Finally, the treatment was delivered. Figure 1 shows both the T1-weighted MR image of the phantom and the CT that was registered to the MR image. Figure 2 shows both the calculated and measured MV image that was acquired by the MV panel. Figure 3 shows the dose distribution that was simulated. The total elapsed time for the entire procedure, excluding irradiation, was 13:35 minutes. The Alderson phantom yields sufficient MR contrast and can be used for full MR-guided radiotherapy treatment chain testing. As a result, we are able to perform an end-to-end chain test of the future First-in-Man treatments.

  12. Performance Models and Risk Management in Communications Systems

    CERN Document Server

    Harrison, Peter; Rüstem, Berç

    2011-01-01

    This volume covers recent developments in the design, operation, and management of telecommunication and computer network systems in performance engineering and addresses issues of uncertainty, robustness, and risk. Uncertainty regarding loading and system parameters leads to challenging optimization and robustness issues. Stochastic modeling combined with optimization theory ensures the optimum end-to-end performance of telecommunication or computer network systems. In view of the diverse design options possible, supporting models have many adjustable parameters, and choosing the best set for a particular performance objective is delicate and time-consuming. An optimization-based approach determines the optimal possible allocation for these parameters. Researchers and graduate students working at the interface of telecommunications and operations research will benefit from this book. Thanks to its practical approach, this book will also serve as a reference tool for scientists and engineers in telecommunication ...
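The interplay between adjustable parameters and end-to-end objectives that the book addresses can be illustrated with a textbook queueing example: under product-form (Jackson) assumptions, the mean end-to-end delay through a tandem of M/M/1 queues is the sum of per-hop sojourn times, and link capacities become parameters to optimize. This is a standard illustration, not an example taken from the book itself.

```python
def end_to_end_delay(service_rates, arrival_rate):
    """Mean end-to-end sojourn time through a tandem of M/M/1 queues
    (Jackson/product-form assumption): T = sum_i 1/(mu_i - lambda)."""
    if any(arrival_rate >= mu for mu in service_rates):
        raise ValueError("unstable: arrival rate must be below every service rate")
    return sum(1.0 / (mu - arrival_rate) for mu in service_rates)

def best_two_hop_split(total_capacity, arrival_rate):
    """Tiny grid search over 2-hop capacity splits; confirms numerically
    that the symmetric split minimizes end-to-end delay."""
    best = None
    for k in range(1, 100):
        mu1 = total_capacity * k / 100.0
        mu2 = total_capacity - mu1
        if min(mu1, mu2) <= arrival_rate:
            continue  # skip unstable allocations
        d = end_to_end_delay([mu1, mu2], arrival_rate)
        if best is None or d < best[0]:
            best = (d, mu1, mu2)
    return best

print(round(end_to_end_delay([10.0, 10.0], 8.0), 3))  # 1.0 (two hops, 0.5 each)
d, mu1, mu2 = best_two_hop_split(20.0, 8.0)
print(mu1 == mu2)  # True: the symmetric split is optimal
```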

  13. Efficacy and safety of a NiTi CAR 27 compression ring for end-to-end anastomosis compared with conventional staplers: A real-world analysis in Chinese colorectal cancer patients

    OpenAIRE

    Zhenhai Lu; Jianhong Peng; Cong Li; Fulong Wang; Wu Jiang; Wenhua Fan; Junzhong Lin; Xiaojun Wu; Desen Wan; Zhizhong Pan

    2016-01-01

    OBJECTIVES: This study aimed to evaluate the safety and efficacy of a new nickel-titanium shape memory alloy compression anastomosis ring, NiTi CAR 27, in constructing an anastomosis for colorectal cancer resection compared with conventional staples. METHODS: In total, 234 consecutive patients diagnosed with colorectal cancer receiving sigmoidectomy and anterior resection for end-to-end anastomosis from May 2010 to June 2012 were retrospectively analyzed. The postoperative clinical parameter...

  14. Systems Engineering and Application of System Performance Modeling in SIM Lite Mission

    Science.gov (United States)

    Moshir, Mehrdad; Murphy, David W.; Milman, Mark H.; Meier, David L.

    2010-01-01

    The SIM Lite Astrometric Observatory will be the first space-based Michelson interferometer operating in the visible wavelength, with the ability to perform ultra-high precision astrometric measurements on distant celestial objects. SIM Lite data will address in a fundamental way questions such as characterization of Earth-mass planets around nearby stars. To accomplish these goals it is necessary to rely on a model-based systems engineering approach - much more so than most other space missions. This paper will describe in further detail the components of this end-to-end performance model, called "SIM-sim", and show how it has helped the systems engineering process.

  15. Wide Area Recovery and Resiliency Program (WARRP) Biological Attack Response and Recovery: End to End Medical Countermeasure Distribution and Dispensing Processes

    Science.gov (United States)

    2012-04-24

    Bioterrorism?" Hearing Before the United States Senate. 110th Cong. (2007). Print. Courtney, Brooke, Eric Toner, and Richard Waldhorn. "Preparing the... "Sverdlovsk Revisited: Modeling Human Inhalation Anthrax." Proceedings of the National Academy of Sciences 103.20 (2006): 7589-594. Print. 3 IOM (Institute of Medicine). "Prepositioning Antibiotics for Anthrax." Washington, DC: The National Academies Press, 2011. Print. 4 Wide Area Recovery

  16. National Renewable Energy Laboratory (NREL) Topic 2 Final Report: End-to-End Communication and Control System to Support Clean Energy Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Hudgins, Andrew P [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Carrillo, Ismael M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Jin, Xin [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Simmins, John [Electric Power Research Institute (EPRI)

    2018-02-21

    This document is the final report of a two-year development, test, and demonstration project, 'Cohesive Application of Standards-Based Connected Devices to Enable Clean Energy Technologies.' The project was part of the National Renewable Energy Laboratory's (NREL's) Integrated Network Testbed for Energy Grid Research and Technology (INTEGRATE) initiative hosted at the Energy Systems Integration Facility (ESIF). This project demonstrated techniques to control distribution grid events using the coordination of traditional distribution grid devices with high-penetration renewable resources and demand response. Using standard communication protocols and semantic standards, the project examined the use cases of high/low distribution voltage, requests for volt-ampere-reactive (VAR) power support, and transactive energy strategies using Volttron. Open source software, written by EPRI to control distributed energy resources (DER) and demand response (DR), was used by an advanced distribution management system (ADMS) to abstract the reporting resources into a collection of capabilities rather than needing to know specific resource types. This architecture allows for scaling both horizontally and vertically. Several new technologies were developed and tested. Messages from the ADMS based on the common information model (CIM) were developed to control the DER and DR management systems. The OpenADR standard was used to help manage grid events by turning loads off and on. Volttron technology was used to simulate a homeowner choosing the price at which to enter the demand response market. Finally, the ADMS used newly developed algorithms to coordinate these resources with a capacitor bank and voltage regulator to respond to grid events.

  17. End-to-end energy efficient communication

    DEFF Research Database (Denmark)

    Dittmann, Lars

    Awareness of energy consumption in communication networks such as the Internet is currently gaining momentum, as it is commonly acknowledged that increased network capacity (currently driven by video applications) requires significantly more electrical power. This paper stresses the importance

  18. Electronic remote blood issue: a combination of remote blood issue with a system for end-to-end electronic control of transfusion to provide a "total solution" for a safe and timely hospital blood transfusion service.

    Science.gov (United States)

    Staves, Julie; Davies, Amanda; Kay, Jonathan; Pearson, Oliver; Johnson, Tony; Murphy, Michael F

    2008-03-01

    The rapid provision of red cell (RBC) units to patients needing blood urgently is an issue of major importance in transfusion medicine. The development of electronic issue (sometimes termed "electronic crossmatch") has facilitated rapid provision of RBC units by avoidance of the serologic crossmatch in eligible patients. A further development is the issue of blood under electronic control at blood refrigerators remote from the blood bank. This study evaluated a system for electronic remote blood issue (ERBI) developed as an enhancement of a system for end-to-end electronic control of hospital transfusion. Practice was evaluated before and after its introduction in cardiac surgery. Before the implementation of ERBI, the median time to deliver urgently required RBC units to the patient was 24 minutes. After its implementation, RBC units were obtained from the nearby blood refrigerator in a median time of 59 seconds (range, 30 sec to 2 min). The study also found that unused requests were reduced significantly from 42 to 20 percent, the number of RBC units issued was reduced by 52 percent, the proportion of issued units that were transfused increased from 40 to 62 percent, and there was a significant reduction in the workload of both blood bank and clinical staff. This study evaluated a combination of remote blood issue with an end-to-end electronically controlled hospital transfusion process, ERBI. ERBI reduced the time to make blood available for surgical patients and improved the efficiency of hospital transfusion.
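Electronic issue avoids the serologic crossmatch by applying rule-based checks against the patient's laboratory record. A sketch of typical eligibility criteria is given below; the exact rules are laboratory- and guideline-dependent, and the field names and thresholds here are illustrative, not those of the system studied.

```python
from dataclasses import dataclass

@dataclass
class PatientRecord:
    abo_rh_determinations: int      # independent ABO/RhD typings on file
    antibody_screen_negative: bool  # current screen, within validity period
    historical_antibodies: bool     # any clinically significant antibody ever
    sample_in_date: bool            # sample within the lab's validity window

def eligible_for_electronic_issue(p: PatientRecord) -> bool:
    """Sketch of typical electronic-issue ("electronic crossmatch")
    eligibility; illustrative only, not the study site's actual rule set."""
    return (p.abo_rh_determinations >= 2
            and p.antibody_screen_negative
            and not p.historical_antibodies
            and p.sample_in_date)

ok = PatientRecord(2, True, False, True)
flagged = PatientRecord(2, True, True, True)   # e.g. a historical anti-K
print(eligible_for_electronic_issue(ok), eligible_for_electronic_issue(flagged))
# True False
```

In an ERBI deployment these checks gate whether the remote refrigerator will release a unit against the patient's wristband scan, which is what collapses the 24-minute delivery time to under a minute.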

  19. Safety and efficacy of the NiTi Shape Memory Compression Anastomosis Ring (CAR/ColonRing) for end-to-end compression anastomosis in anterior resection or low anterior resection.

    Science.gov (United States)

    Kang, Jeonghyun; Park, Min Geun; Hur, Hyuk; Min, Byung Soh; Lee, Kang Young; Kim, Nam Kyu

    2013-04-01

    Compression anastomoses may represent an improvement over traditional hand-sewn or stapled techniques. This prospective exploratory study aimed to assess the efficacy and complication rates in patients undergoing anterior resection (AR) or low anterior resection (LAR) anastomosed with a novel end-to-end compression anastomosis ring, the ColonRing. In all, 20 patients (13 male) undergoing AR or LAR were enrolled to be anastomosed using the NiTi Shape Memory End-to-End Compression Anastomosis Ring (NiTi Medical Technologies Ltd, Netanya, Israel). Demographic, intraoperative, and postoperative data were collected. Patients underwent AR (11/20) or LAR via laparoscopic (75%), robotic (10%), or open laparotomy (15%) approaches, with a median anastomotic level of 14.5 cm (range, 4-25 cm). Defunctioning loop ileostomies were formed in 6 patients for low anastomoses. Surgeons rated the ColonRing device as either easy or very easy to use. One patient developed an anastomotic leakage in the early postoperative period; there were no late postoperative complications. Mean time to passage of first flatus and commencement of oral fluids was 2.5 days and 3.2 days, respectively. Average hospital stay was 12.6 days (range, 8-23 days). Finally, the device was expelled on average 15.3 days postoperatively without difficulty. This is the first study reporting results in a significant number of LAR patients and the first reported experience from South Korea; it shows that the compression technique is surgically feasible, easy to use, and without significant complication rates. A large randomized controlled trial is warranted to investigate the benefits of the ColonRing over traditional stapling techniques.

  20. Efficacy and safety of a NiTi CAR 27 compression ring for end-to-end anastomosis compared with conventional staplers: A real-world analysis in Chinese colorectal cancer patients

    Science.gov (United States)

    Lu, Zhenhai; Peng, Jianhong; Li, Cong; Wang, Fulong; Jiang, Wu; Fan, Wenhua; Lin, Junzhong; Wu, Xiaojun; Wan, Desen; Pan, Zhizhong

    2016-01-01

    OBJECTIVES: This study aimed to evaluate the safety and efficacy of a new nickel-titanium shape memory alloy compression anastomosis ring, NiTi CAR 27, in constructing an anastomosis for colorectal cancer resection compared with conventional staples. METHODS: In total, 234 consecutive patients diagnosed with colorectal cancer receiving sigmoidectomy and anterior resection for end-to-end anastomosis from May 2010 to June 2012 were retrospectively analyzed. The postoperative clinical parameters, postoperative complications and 3-year overall survival in 77 patients using a NiTi CAR 27 compression ring (CAR group) and 157 patients with conventional circular staplers (STA group) were compared. RESULTS: There were no statistically significant differences between the patients in the two groups in terms of general demographics and tumor features. A clinically apparent anastomotic leak occurred in 2 patients (2.6%) in the CAR group and in 5 patients (3.2%) in the STA group (p=0.804). These eight patients received a temporary diverting ileostomy. One patient (1.3%) in the CAR group was diagnosed with anastomotic stricture through an electronic colonoscopy after 3 months postoperatively. The incidence of postoperative intestinal obstruction was comparable between the two groups (p=0.192). With a median follow-up duration of 39.6 months, the 3-year overall survival rate was 83.1% in the CAR group and 89.0% in the STA group (p=0.152). CONCLUSIONS: NiTi CAR 27 is safe and effective for colorectal end-to-end anastomosis. Its use is equivalent to that of the conventional circular staplers. This study suggests that NiTi CAR 27 may be a beneficial alternative in colorectal anastomosis in Chinese colorectal cancer patients. PMID:27276395
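The group comparisons above reduce to tests on proportions, e.g. anastomotic leak in 2/77 versus 5/157 patients. The abstract does not state which statistical test produced its p-values; a stdlib-only two-sided Fisher exact test, sketched below, is one standard way to verify that such a difference is not significant.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact test for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of all tables as extreme as or
    less probable than the observed one. (Illustrative re-analysis; the
    paper's own test is not specified in the abstract.)"""
    row1, row2, col1, n = a + b, c + d, a + c, a + b + c + d

    def p_table(x):  # probability of the table whose top-left cell is x
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    return sum(p_table(x)
               for x in range(max(0, col1 - row2), min(row1, col1) + 1)
               if p_table(x) <= p_obs + 1e-12)

# Anastomotic leak: 2/77 (CAR group) vs 5/157 (STA group).
p = fisher_exact_two_sided(2, 75, 5, 152)
print(p > 0.05)  # True: consistent with "no significant difference"
```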

  1. Efficacy and safety of a NiTi CAR 27 compression ring for end-to-end anastomosis compared with conventional staplers: A real-world analysis in Chinese colorectal cancer patients.

    Science.gov (United States)

    Lu, Zhenhai; Peng, Jianhong; Li, Cong; Wang, Fulong; Jiang, Wu; Fan, Wenhua; Lin, Junzhong; Wu, Xiaojun; Wan, Desen; Pan, Zhizhong

    2016-05-01

    This study aimed to evaluate the safety and efficacy of a new nickel-titanium shape memory alloy compression anastomosis ring, NiTi CAR 27, in constructing an anastomosis for colorectal cancer resection compared with conventional staples. In total, 234 consecutive patients diagnosed with colorectal cancer receiving sigmoidectomy and anterior resection for end-to-end anastomosis from May 2010 to June 2012 were retrospectively analyzed. The postoperative clinical parameters, postoperative complications and 3-year overall survival in 77 patients using a NiTi CAR 27 compression ring (CAR group) and 157 patients with conventional circular staplers (STA group) were compared. There were no statistically significant differences between the patients in the two groups in terms of general demographics and tumor features. A clinically apparent anastomotic leak occurred in 2 patients (2.6%) in the CAR group and in 5 patients (3.2%) in the STA group (p=0.804). These eight patients received a temporary diverting ileostomy. One patient (1.3%) in the CAR group was diagnosed with anastomotic stricture through an electronic colonoscopy after 3 months postoperatively. The incidence of postoperative intestinal obstruction was comparable between the two groups (p=0.192). With a median follow-up duration of 39.6 months, the 3-year overall survival rate was 83.1% in the CAR group and 89.0% in the STA group (p=0.152). NiTi CAR 27 is safe and effective for colorectal end-to-end anastomosis. Its use is equivalent to that of the conventional circular staplers. This study suggests that NiTi CAR 27 may be a beneficial alternative in colorectal anastomosis in Chinese colorectal cancer patients.

  2. Efficacy and safety of a NiTi CAR 27 compression ring for end-to-end anastomosis compared with conventional staplers: A real-world analysis in Chinese colorectal cancer patients

    Directory of Open Access Journals (Sweden)

    Zhenhai Lu

    2016-05-01

    Full Text Available OBJECTIVES: This study aimed to evaluate the safety and efficacy of a new nickel-titanium shape memory alloy compression anastomosis ring, NiTi CAR 27, in constructing an anastomosis for colorectal cancer resection compared with conventional staples. METHODS: In total, 234 consecutive patients diagnosed with colorectal cancer receiving sigmoidectomy and anterior resection for end-to-end anastomosis from May 2010 to June 2012 were retrospectively analyzed. The postoperative clinical parameters, postoperative complications and 3-year overall survival in 77 patients using a NiTi CAR 27 compression ring (CAR group) and 157 patients with conventional circular staplers (STA group) were compared. RESULTS: There were no statistically significant differences between the patients in the two groups in terms of general demographics and tumor features. A clinically apparent anastomotic leak occurred in 2 patients (2.6%) in the CAR group and in 5 patients (3.2%) in the STA group (p=0.804). These eight patients received a temporary diverting ileostomy. One patient (1.3%) in the CAR group was diagnosed with anastomotic stricture through an electronic colonoscopy after 3 months postoperatively. The incidence of postoperative intestinal obstruction was comparable between the two groups (p=0.192). With a median follow-up duration of 39.6 months, the 3-year overall survival rate was 83.1% in the CAR group and 89.0% in the STA group (p=0.152). CONCLUSIONS: NiTi CAR 27 is safe and effective for colorectal end-to-end anastomosis. Its use is equivalent to that of the conventional circular staplers. This study suggests that NiTi CAR 27 may be a beneficial alternative in colorectal anastomosis in Chinese colorectal cancer patients.

  3. More Than Bar Codes: Integrating Global Standards-Based Bar Code Technology Into National Health Information Systems in Ethiopia and Pakistan to Increase End-to-End Supply Chain Visibility.

    Science.gov (United States)

    Hara, Liuichi; Guirguis, Ramy; Hummel, Keith; Villanueva, Monica

    2017-12-28

    The United Nations Population Fund (UNFPA) and the United States Agency for International Development (USAID) DELIVER PROJECT work together to strengthen public health commodity supply chains by standardizing bar coding under a single set of global standards. From 2015, UNFPA and USAID collaborated to pilot test how tracking and tracing of bar coded health products could be operationalized in the public health supply chains of Ethiopia and Pakistan and inform the ecosystem needed to begin full implementation. Pakistan had been using proprietary bar codes for inventory management of contraceptive supplies but transitioned to global standards-based bar codes during the pilot. The transition allowed Pakistan to leverage the original bar codes that were preprinted by global manufacturers as opposed to printing new bar codes at the central warehouse. However, barriers at lower service delivery levels prevented full realization of end-to-end data visibility. Key barriers at the district level were the lack of a digital inventory management system and absence of bar codes at the primary-level packaging level, such as single blister packs. The team in Ethiopia developed an open-sourced smartphone application that allowed the team to scan bar codes using the mobile phone's camera and to push the captured data to the country's data mart. Real-time tracking and tracing occurred from the central warehouse to the Addis Ababa distribution hub and to 2 health centers. These pilots demonstrated that standardized product identification and bar codes can significantly improve accuracy over manual stock counts while significantly streamlining the stock-taking process, resulting in efficiencies. The pilots also showed that bar coding technology by itself is not sufficient to ensure data visibility. Rather, by using global standards for identification and data capture of pharmaceuticals and medical devices, and integrating the data captured into national and global tracking systems
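The "global standards-based bar codes" referred to above encode GS1 application identifiers (AIs) such as GTIN, expiry date, and lot number. A simplified parser for a handful of AIs is sketched below, based on the public GS1 General Specifications; the GTIN digits in the example are made up and the check digit is not validated.

```python
# Minimal GS1 element-string parser for the AIs most relevant to
# pharmaceutical track-and-trace: (01) GTIN, (17) expiry YYMMDD,
# (10) lot, (21) serial. Fixed-length AIs are followed directly by their
# data; variable-length AIs end at the FNC1 separator (ASCII GS, \x1d)
# or at the end of the string. Simplified sketch, not a full GS1 parser.
FIXED = {"01": 14, "17": 6}
VARIABLE = {"10", "21"}
GS = "\x1d"

def parse_gs1(data):
    out, i = {}, 0
    while i < len(data):
        ai = data[i:i + 2]
        i += 2
        if ai in FIXED:
            out[ai], i = data[i:i + FIXED[ai]], i + FIXED[ai]
        elif ai in VARIABLE:
            end = data.find(GS, i)
            end = len(data) if end == -1 else end
            out[ai], i = data[i:end], end + 1
        else:
            raise ValueError(f"unsupported AI {ai!r}")
    return out

label = ("0108691234567894"   # (01) GTIN-14 -- illustrative digits only
         "17201231"           # (17) expiry 2020-12-31
         "10ABC123" + GS +    # (10) lot, FNC1-terminated
         "210001")            # (21) serial
parsed = parse_gs1(label)
print(parsed["01"], parsed["10"])  # 08691234567894 ABC123
```

This is the kind of decoding the Ethiopian smartphone application performs after the camera scan, before pushing the captured fields to the data mart.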

  4. A TOD dataset to validate human observer models for target acquisition modeling and objective sensor performance testing

    NARCIS (Netherlands)

    Bijl, P.; Kooi, F.L.; Hogervorst, M.A.

    2014-01-01

    End-to-end Electro-Optical system performance tests such as TOD, MRTD and MTDP require the effort of several trained human observers, each performing a series of visual judgments on the displayed output of the system. This significantly contributes to the costs of sensor testing. Currently, several

  5. An Efficient Framework Model for Optimizing Routing Performance in VANETs

    Science.gov (United States)

    Zulkarnain, Zuriati Ahmad; Subramaniam, Shamala

    2018-01-01

    Routing in Vehicular Ad hoc Networks (VANETs) is challenging because of their highly dynamic mobility. The efficiency of a routing protocol is influenced by a number of factors, such as network density, bandwidth constraints, traffic load, and mobility patterns, resulting in frequent changes in network topology. Therefore, Quality of Service (QoS) support is strongly needed to enhance the capability of the routing protocol and improve overall network performance. In this paper, we introduce a statistical framework model to address the problem of optimizing routing configuration parameters in Vehicle-to-Vehicle (V2V) communication. Our framework solution is based on the utilization of network resources to reflect the current state of the network and to balance the trade-off between frequent changes in network topology and the QoS requirements. It consists of three stages: a network simulation stage used to execute different urban scenarios, a function stage used as a competitive approach to aggregate the weighted cost of the factors into a single value, and an optimization stage used to evaluate the communication cost and to obtain the optimal configuration based on the competitive cost. The simulation results show significant performance improvements in terms of Packet Delivery Ratio (PDR), Normalized Routing Load (NRL), Packet Loss (PL), and End-to-End Delay (E2ED). PMID:29462884
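The function stage aggregates weighted factor costs into a single value, and the optimization stage selects the configuration minimizing that cost. Since the exact aggregation function is not given in the abstract, the sketch below assumes a simple normalized weighted sum; the weights, metric values, and the `hello_interval` parameter are illustrative inventions.

```python
def weighted_cost(metrics, weights):
    """Aggregate normalized performance factors into one scalar cost.
    The benefit metric (PDR) is inverted so that lower is uniformly
    better. (Assumed form; not the paper's exact aggregation function.)"""
    return (weights["pdr"] * (1.0 - metrics["pdr"])   # PDR in [0,1], maximize
            + weights["nrl"] * metrics["nrl"]         # normalized routing load
            + weights["pl"] * metrics["pl"]           # packet-loss fraction
            + weights["e2ed"] * metrics["e2ed"])      # delay, pre-normalized

def best_configuration(candidates, weights):
    # "Optimization stage": pick the parameter set with the lowest cost.
    return min(candidates, key=lambda c: weighted_cost(c["metrics"], weights))

weights = {"pdr": 0.4, "nrl": 0.2, "pl": 0.2, "e2ed": 0.2}
candidates = [
    {"hello_interval": 1.0,
     "metrics": {"pdr": 0.91, "nrl": 0.30, "pl": 0.09, "e2ed": 0.20}},
    {"hello_interval": 2.0,
     "metrics": {"pdr": 0.88, "nrl": 0.18, "pl": 0.12, "e2ed": 0.25}},
]
print(best_configuration(candidates, weights)["hello_interval"])  # 1.0
```

Each candidate's metrics would come from the simulation stage (urban scenario runs); the scalar cost then makes otherwise incomparable trade-offs, such as delivery ratio versus routing overhead, directly comparable.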

  6. Well performance model

    International Nuclear Information System (INIS)

    Thomas, L.K.; Evans, C.E.; Pierson, R.G.; Scott, S.L.

    1992-01-01

    This paper describes the development and application of a comprehensive oil or gas well performance model. The model contains six distinct sections: stimulation design, tubing and/or casing flow, reservoir and near-wellbore calculations, production forecasting, wellbore heat transmission, and economics. These calculations may be performed separately or in an integrated fashion with data and results shared among the different sections. The model analysis allows evaluation of all aspects of well completion design, including the effects on future production and overall well economics

  7. NIF capsule performance modeling

    Directory of Open Access Journals (Sweden)

    Weber S.

    2013-11-01

    Full Text Available Post-shot modeling of NIF capsule implosions was performed in order to validate our physical and numerical models. Cryogenic layered target implosions and experiments with surrogate targets produce an abundance of capsule performance data, including implosion velocity, remaining ablator mass, times of peak x-ray and neutron emission, core image size, core symmetry, neutron yield, and x-ray spectra. We have attempted to match the integrated data set with capsule-only simulations by adjusting the drive and other physics parameters within expected uncertainties. The simulations include interface roughness, time-dependent symmetry, and a model of mix. We were able to match many of the measured performance parameters for a selection of shots.

  8. Signal and image processing systems performance evaluation, simulation, and modeling; Proceedings of the Meeting, Orlando, FL, Apr. 4, 5, 1991

    Science.gov (United States)

    Nasr, Hatem N.; Bazakos, Michael E.

    The various aspects of the evaluation and modeling problems in algorithms, sensors, and systems are addressed. Consideration is given to a generic modular imaging IR signal processor, real-time architecture based on the image-processing module family, application of the Proto Ware simulation testbed to the design and evaluation of advanced avionics, development of a fire-and-forget imaging infrared seeker missile simulation, an adaptive morphological filter for image processing, laboratory development of a nonlinear optical tracking filter, a dynamic end-to-end model testbed for IR detection algorithms, wind tunnel model aircraft attitude and motion analysis, an information-theoretic approach to optimal quantization, parametric analysis of target/decoy performance, neural networks for automated target recognition parameters adaptation, performance evaluation of a texture-based segmentation algorithm, evaluation of image tracker algorithms, and multisensor fusion methodologies. (No individual items are abstracted in this volume)

  9. Comparação entre dois fios de sutura não absorvíveis na anastomose traqueal término-terminal em cães Comparison of two nonabsorbable suture materials in the end-to-end tracheal anastomosis in dogs

    Directory of Open Access Journals (Sweden)

    Sheila Canevese Rahal

    1995-01-01

    Full Text Available Twelve mongrel dogs, aged between 1 and 6 years and weighing 6 to 20 kg, were submitted to tracheal resection and end-to-end anastomosis, in which braided non-capillary polyester and monofilament nylon suture materials were tested. Six animals, in groups of three per suture material, underwent excision of the equivalent of three tracheal rings. After 15 days a second intervention was performed in which the equivalent of six more rings was resected, for a total of nine; after another 15 days these animals were sacrificed. The other six animals, again three per suture material, underwent excision of the equivalent of three tracheal rings and were maintained for 43 days. The tracheal anastomoses were evaluated by clinical, radiographic, macroscopic and histopathologic examination. Monofilament nylon produced less tissue reaction than braided non-capillary polyester and provided a secure anastomosis with a lower risk of granuloma formation.

  10. CMDS System Integration and IAMD End-to-End Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Cruise Missile Defense Systems (CMDS) Project Office is establishing a secure System Integration Laboratory at the AMRDEC. This lab will contain tactical Signal...

  11. End-to-End Encryption for Personal Telehealth Systems.

    Science.gov (United States)

    Ollerer, Gerald; Mense, Alexander

    2014-01-01

    Data from personal health devices is expected to be an important part of personalized care in the future, but communication frameworks for such data create new challenges for security and privacy. The Continua Health Alliance has been very active and successful in defining guidelines and a reference architecture for transmitting personal health device data based on well-known international standards. But looking at the security definitions, the concepts still face open issues and weaknesses, such as identity management and missing end-to-end (E2E) encryption. This paper presents an approach for an E2E encryption framework based on Continua's reference architecture and the underlying base standards. It introduces the basic process and proposes necessary extensions to the architecture as well as to the standardized protocols of ISO/IEEE 11073 and HL7 version 2.

  12. End-to-end visual speech recognition with LSTMs

    NARCIS (Netherlands)

    Petridis, Stavros; Li, Zuwei; Pantic, Maja

    2017-01-01

    Traditional visual speech recognition systems consist of two stages, feature extraction and classification. Recently, several deep learning approaches have been presented which automatically extract features from the mouth images and aim to replace the feature extraction stage. However, research on

  13. End-to-end simulation: The front end

    International Nuclear Information System (INIS)

    Haber, I.; Bieniosek, F.M.; Celata, C.M.; Friedman, A.; Grote, D.P.; Henestroza, E.; Vay, J.-L.; Bernal, S.; Kishek, R.A.; O'Shea, P.G.; Reiser, M.; Herrmannsfeldt, W.B.

    2002-01-01

    For the intense beams in heavy ion fusion accelerators, details of the beam distribution as it emerges from the source region can determine the beam behavior well downstream. This occurs because collective space-charge modes excited as the beam is born remain undamped for many focusing periods. Traditional studies of the source region in particle beam systems have emphasized the behavior of averaged beam characteristics, such as total current, rms beam size, or emittance, rather than the details of the full beam distribution function that are necessary to predict the excitation of these modes. Simulations of the beam in the source region and comparisons to experimental measurements at LBNL and the University of Maryland are presented to illustrate some of the complexity in beam characteristics that has been uncovered as increased attention has been devoted to developing a detailed understanding of the source region. Also discussed are methods of using the simulations to infer characteristics of the beam distribution that can be difficult to measure directly

  14. An integrated radar model solution for mission level performance and cost trades

    Science.gov (United States)

    Hodge, John; Duncan, Kerron; Zimmerman, Madeline; Drupp, Rob; Manno, Mike; Barrett, Donald; Smith, Amelia

    2017-05-01

    A fully integrated Mission-Level Radar model is in development as part of a multi-year effort under the Northrop Grumman Mission Systems (NGMS) sector's Model Based Engineering (MBE) initiative to digitally interconnect and unify previously separate performance and cost models. In 2016, an NGMS internal research and development (IR and D) funded multidisciplinary team integrated radio frequency (RF), power, control, size, weight, thermal, and cost models together using commercial off-the-shelf software, ModelCenter, for an Active Electronically Scanned Array (AESA) radar system. Each represented model was digitally connected with standard interfaces and unified to allow end-to-end mission system optimization and trade studies. The radar model was then linked to the Air Force's own mission modeling framework (AFSIM). The team first had to identify the necessary models, and with the aid of subject matter experts (SMEs) understand and document the inputs, outputs, and behaviors of the component models. This agile development process and collaboration enabled rapid integration of disparate models and the validation of their combined system performance. This MBE framework will allow NGMS to design systems more efficiently and affordably, optimize architectures, and provide increased value to the customer. The model integrates detailed component models that validate cost and performance at the physics level with high-level models that provide visualization of a platform mission. This connectivity of component to mission models allows hardware and software design solutions to be better optimized to meet mission needs, creating cost-optimal solutions for the customer, while reducing design cycle time through risk mitigation and early validation of design decisions.

  15. Progress in sensor performance testing, modeling and range prediction using the TOD method: an overview

    Science.gov (United States)

    Bijl, Piet; Hogervorst, Maarten A.; Toet, Alexander

    2017-05-01

    The Triangle Orientation Discrimination (TOD) methodology includes i) a widely applicable, accurate end-to-end EO/IR sensor test, ii) an image-based sensor system model and iii) a Target Acquisition (TA) range model. The method has been extensively validated against TA field performance for a wide variety of well- and under-sampled imagers, systems with advanced image processing techniques such as dynamic super resolution and local adaptive contrast enhancement, and sensors showing smear or noise drift, for both static and dynamic test stimuli and as a function of target contrast. Recently, significant progress has been made in various directions. Dedicated visual and NIR test charts for lab and field testing are available and thermal test benches are on the market. Automated sensor testing using an objective synthetic human observer is within reach. Both an analytical and an image-based TOD model have recently been developed and are being implemented in the European Target Acquisition model ECOMOS and in the EOSTAR TDA. Further, the methodology is being applied for design optimization of high-end security camera systems. Finally, results from a recent perception study suggest that DRI ranges for real targets can be predicted by replacing the relevant distinctive target features by TOD test patterns of the same characteristic size and contrast, enabling a new TA modeling approach. This paper provides an overview.

  16. Principles of Sonar Performance Modeling

    NARCIS (Netherlands)

    Ainslie, M.A.

    2010-01-01

    Sonar performance modelling (SPM) is concerned with the prediction of quantitative measures of sonar performance, such as probability of detection. It is a multidisciplinary subject, requiring knowledge and expertise in the disparate fields of underwater acoustics, acoustical oceanography, sonar

  17. Characterising performance of environmental models

    NARCIS (Netherlands)

    Bennett, N.D.; Croke, B.F.W.; Guariso, G.; Guillaume, J.H.A.; Hamilton, S.H.; Jakeman, A.J.; Marsili-Libelli, S.; Newham, L.T.H.; Norton, J.; Perrin, C.; Pierce, S.; Robson, B.; Seppelt, R.; Voinov, A.; Fath, B.D.; Andreassian, V.

    2013-01-01

    In order to use environmental models effectively for management and decision-making, it is vital to establish an appropriate level of confidence in their performance. This paper reviews techniques available across various fields for characterising the performance of environmental models with focus

  18. Multiprocessor performance modeling with ADAS

    Science.gov (United States)

    Hayes, Paul J.; Andrews, Asa M.

    1989-01-01

    A graph managing strategy referred to as the Algorithm to Architecture Mapping Model (ATAMM) appears useful for the time-optimized execution of application algorithm graphs in embedded multiprocessors and for the performance prediction of graph designs. This paper reports the modeling of ATAMM in the Architecture Design and Assessment System (ADAS) to make an independent verification of ATAMM's performance prediction capability and to provide a user framework for the evaluation of arbitrary algorithm graphs. Following an overview of ATAMM and its major functional rules are descriptions of the ADAS model of ATAMM, methods to enter an arbitrary graph into the model, and techniques to analyze the simulation results. The performance of a 7-node graph example is evaluated using the ADAS model and verifies the ATAMM concept by substantiating previously published performance results.
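    The kind of graph-based performance prediction ATAMM/ADAS performs can be illustrated with a minimal critical-path calculation over a task graph. The graph, node names, and execution times below are invented for this sketch and are not the paper's 7-node example.

    ```python
    # Hypothetical sketch: lower-bound schedule length (critical path) of a
    # dataflow graph, the kind of figure of merit a graph-performance analysis
    # produces. Node names and execution times are invented for illustration.
    from functools import lru_cache

    # adjacency: node -> list of successors; times: node -> execution time
    graph = {"read": ["filter"], "filter": ["fft", "log"],
             "fft": ["merge"], "log": ["merge"], "merge": []}
    times = {"read": 2.0, "filter": 3.0, "fft": 5.0, "log": 1.0, "merge": 2.0}

    @lru_cache(maxsize=None)
    def finish(node):
        """Earliest finish time of `node`, assuming unlimited processors."""
        preds = [u for u, vs in graph.items() if node in vs]
        start = max((finish(u) for u in preds), default=0.0)
        return start + times[node]

    # Longest path (read -> filter -> fft -> merge) bounds the graph's latency.
    critical_path = max(finish(n) for n in graph)
    ```

    A scheduler cannot finish the graph sooner than this bound no matter how many processors it has, which is why such graph models are useful for performance prediction before hardware exists.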

  19. Firm Sustainability Performance Index Modeling

    Directory of Open Access Journals (Sweden)

    Che Wan Jasimah Bt Wan Mohamed Radzi

    2015-12-01

    Full Text Available The main objective of this paper is to develop a model for a firm sustainability performance index by applying both classical and Bayesian structural equation modeling (parametric and semi-parametric modeling). Both techniques are applied to research data collected from a survey directed to the food manufacturing industry in China, Taiwan, and Malaysia. For estimating the firm sustainability performance index we consider three main indicators: knowledge management, organizational learning, and business strategy. Based on both the Bayesian and classical methodology, we confirmed that knowledge management and business strategy have a significant impact on the firm sustainability performance index.

  20. MODELING SUPPLY CHAIN PERFORMANCE VARIABLES

    Directory of Open Access Journals (Sweden)

    Ashish Agarwal

    2005-01-01

    Full Text Available In order to understand the dynamic behavior of the variables that can play a major role in performance improvement in a supply chain, a System Dynamics-based model is proposed. The model provides an effective framework for analyzing the different variables affecting supply chain performance, and a causal relationship among these variables has been identified. Variables emanating from performance measures such as gaps in customer satisfaction, cost minimization, lead-time reduction, service level improvement and quality improvement have been identified as goal-seeking loops. The proposed System Dynamics-based model analyzes the effect of the dynamic behavior of these variables over a period of 10 years on the performance of a case supply chain in the auto business.

  1. Air Conditioner Compressor Performance Model

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Ning; Xie, YuLong; Huang, Zhenyu

    2008-09-05

    During the past three years, the Western Electricity Coordinating Council (WECC) Load Modeling Task Force (LMTF) has led the effort to develop the new modeling approach. As part of this effort, the Bonneville Power Administration (BPA), Southern California Edison (SCE), and Electric Power Research Institute (EPRI) Solutions tested 27 residential air-conditioning units to assess their response to delayed voltage recovery transients. After completing these tests, different modeling approaches were proposed, among them a performance modeling approach that proved to be one of the three favored for its simplicity and ability to recreate different SVR events satisfactorily. Funded by the California Energy Commission (CEC) under its load modeling project, researchers at Pacific Northwest National Laboratory (PNNL) led the follow-on task to analyze the motor testing data to derive the parameters needed to develop a performance model for the single-phase air-conditioning (SPAC) unit. To derive the performance model, PNNL researchers first used the motor voltage and frequency ramping test data to obtain the real (P) and reactive (Q) power versus voltage (V) and frequency (f) curves. Then, curve fitting was used to develop the P-V, Q-V, P-f, and Q-f relationships for motor running and stalling states. The resulting performance model ignores the dynamic response of the air-conditioning motor. Because the inertia of the air-conditioning motor is very small (H<0.05), the motor reaches from one steady state to another in a few cycles. So, the performance model is a fair representation of the motor behaviors in both running and stalling states.
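    The curve-fitting step described above can be sketched in a few lines. The voltage ramp and the P-V relationship below are synthetic stand-ins, not the actual PNNL motor test data.

    ```python
    # Hedged sketch of a P-V curve fit for a performance-model running state.
    # The "measured" data is generated from an invented quadratic plus noise.
    import numpy as np

    v = np.linspace(0.8, 1.1, 31)                    # per-unit voltage ramp
    p_true = 1.0 - 0.3 * (v - 1.0) + 0.5 * (v - 1.0) ** 2
    rng = np.random.default_rng(0)
    p_meas = p_true + rng.normal(0.0, 1e-3, v.size)  # add measurement noise

    # Fit a quadratic P-V relationship around nominal voltage (v = 1.0 pu).
    coeffs = np.polyfit(v - 1.0, p_meas, deg=2)
    p_model = np.polyval(coeffs, v - 1.0)

    rms_err = np.sqrt(np.mean((p_model - p_meas) ** 2))
    ```

    The same fit would be repeated for Q-V, P-f, and Q-f, and separately for the stalled state, to populate the full static performance model.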

  2. Composite and Cascaded Generalized-K Fading Channel Modeling and Their Diversity and Performance Analysis

    KAUST Repository

    Ansari, Imran Shafique

    2010-12-01

    The introduction of new schemes that are based on communication among nodes has motivated the use of composite fading models, because the nodes experience different multipath fading and shadowing statistics, which in turn determine the statistics required for the performance analysis of different transceivers. The end-to-end signal-to-noise ratio (SNR) statistics play an essential role in determining the performance of cascaded digital communication systems. In this thesis, a closed-form expression for the probability density function (PDF) of the end-to-end SNR for independent but not necessarily identically distributed (i.n.i.d.) cascaded generalized-K (GK) composite fading channels is derived. The developed PDF expression, in terms of the Meijer G-function, allows the derivation of subsequent performance metrics applicable to different modulation schemes, including outage probability, bit error rate for coherent as well as non-coherent systems, and average channel capacity, providing insight into the performance of a digital communication system operating in an N-cascaded GK composite fading environment. Another line of research motivated by the introduction of composite fading channels is error performance. Error performance is one of the main performance measures, and the derivation of its closed-form expression has proved to be quite involved for certain systems. Hence, in this thesis, a unified closed-form expression, applicable to different binary modulation schemes, for the bit error rate of dual-branch selection-diversity-based systems undergoing i.n.i.d. GK fading is derived in terms of the extended generalized bivariate Meijer G-function.
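    The cascaded-GK setup can be checked numerically: a generalized-K fading gain is the product of two gamma variates (multipath times shadowing), and the end-to-end SNR of N cascaded links is a product over links. This Monte Carlo sketch uses arbitrary parameters and estimates an outage probability empirically; it is not the thesis's closed-form Meijer-G result.

    ```python
    # Monte Carlo sketch of end-to-end SNR for cascaded generalized-K fading.
    # Each link's gain is modeled as (multipath gamma) x (shadowing gamma),
    # both normalized to unit mean; all parameter values are arbitrary.
    import numpy as np

    rng = np.random.default_rng(42)
    n_links, n_samples = 3, 200_000
    m, k = 2.0, 3.0        # per-link multipath / shadowing shape parameters
    avg_snr = 10.0         # per-link mean SNR (linear scale, i.e. 10 dB)

    # Unit-mean gamma variates: Gamma(shape, scale=1/shape).
    multipath = rng.gamma(m, 1.0 / m, (n_links, n_samples))
    shadowing = rng.gamma(k, 1.0 / k, (n_links, n_samples))

    # End-to-end SNR is the product of the per-link SNRs.
    snr_e2e = np.prod(avg_snr * multipath * shadowing, axis=0)

    # Empirical outage probability at a 0 dB (linear 1.0) threshold.
    outage = np.mean(snr_e2e < 1.0)
    ```

    A closed-form PDF such as the one derived in the thesis replaces this sampling loop with an exact expression, but the simulation is a convenient sanity check for such derivations.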

  3. NREL's System Advisor Model Simplifies Complex Energy Analysis (Fact Sheet)

    Energy Technology Data Exchange (ETDEWEB)

    2015-01-01

    NREL has developed a tool -- the System Advisor Model (SAM) -- that can help decision makers analyze cost, performance, and financing of any size grid-connected solar, wind, or geothermal power project. Manufacturers, engineering and consulting firms, research and development firms, utilities, developers, venture capital firms, and international organizations use SAM for end-to-end analysis that helps determine whether and how to make investments in renewable energy projects.

  4. Laser performance and modeling of RE3+:YAG double-clad crystalline fiber waveguides

    Science.gov (United States)

    Li, Da; Lee, Huai-Chuan; Meissner, Stephanie K.; Meissner, Helmuth E.

    2018-02-01

    We report on laser performance of ceramic Yb:YAG and single crystal Tm:YAG double-clad crystalline fiber waveguide (CFW) lasers towards the goal of demonstrating the design and manufacturing strategy of scaling to high output power. The laser component is a double-clad CFW, with RE3+:YAG (RE = Yb, Tm respectively) core, un-doped YAG inner cladding, and ceramic spinel or sapphire outer cladding. Laser performance of the CFW has been demonstrated with 53.6% slope efficiency and 27.5-W stable output power at 1030-nm for Yb:YAG CFW, and 31.6% slope efficiency and 46.7-W stable output power at 2019-nm for Tm:YAG CFW, respectively. Adhesive-Free Bond (AFB®) technology enables a designable refractive index difference between core and inner cladding, and designable core and inner cladding sizes, which are essential for single transverse mode CFW propagation. To guide further development of CFW designs, we present thermal modeling, power scaling and design of single transverse mode operation of double-clad CFWs and redefine the single-mode operation criterion for the double-clad structure design. The power scaling modeling of double-clad CFW shows that in order to achieve the maximum possible output power limited by the physical properties, including diode brightness, thermal lens effect, and stimulated Brillouin scattering, the length of the waveguide is in the range of 0.5 to 2 meters. The length of an individual CFW is limited by single crystal growth and doping uniformity to about 100 to 200 mm, and also by availability of starting crystals and manufacturing complexity. To overcome the limitation of CFW lengths, end-to-end proximity-coupling of CFWs is introduced.

  5. Behavior model for performance assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Brown-VanHoozer, S. A.

    1999-07-23

    Every individual channels information differently, based on a preference for the sensory modality or representational system (visual, auditory, or kinesthetic) we tend to favor most (our primary representational system (PRS)). Therefore, some of us access and store our information primarily visually, some auditorily, and others kinesthetically (through feel and touch), which in turn establishes our information-processing patterns and strategies and our external-to-internal (and subsequently vice versa) experiential language representation. Because of the different ways we channel our information, each of us will respond differently to a task: the way we gather and process the external information (input), our response time (process), and the outcome (behavior). Traditional human models of decision making and response time focus on perception, cognitive, and motor systems stimulated and influenced by the three sensory modalities: visual, auditory, and kinesthetic. For us, these are the building blocks to knowing how someone is thinking. Being aware of what is taking place and how to ask questions is essential in assessing performance toward reducing human errors. Existing models give predictions based on time values or response times for a particular event, which may be summed and averaged for a generalization of behavior(s). However, without a basic understanding of how the behavior was predicated through a decision-making strategy process, predictive models are overall inefficient in their analysis of the means by which behavior was generated. What is seen is the end result.

  6. Behavior model for performance assessment

    International Nuclear Information System (INIS)

    Brown-VanHoozer, S. A.

    1999-01-01

    Every individual channels information differently, based on a preference for the sensory modality or representational system (visual, auditory, or kinesthetic) we tend to favor most (our primary representational system (PRS)). Therefore, some of us access and store our information primarily visually, some auditorily, and others kinesthetically (through feel and touch), which in turn establishes our information-processing patterns and strategies and our external-to-internal (and subsequently vice versa) experiential language representation. Because of the different ways we channel our information, each of us will respond differently to a task: the way we gather and process the external information (input), our response time (process), and the outcome (behavior). Traditional human models of decision making and response time focus on perception, cognitive, and motor systems stimulated and influenced by the three sensory modalities: visual, auditory, and kinesthetic. For us, these are the building blocks to knowing how someone is thinking. Being aware of what is taking place and how to ask questions is essential in assessing performance toward reducing human errors. Existing models give predictions based on time values or response times for a particular event, which may be summed and averaged for a generalization of behavior(s). However, without a basic understanding of how the behavior was predicated through a decision-making strategy process, predictive models are overall inefficient in their analysis of the means by which behavior was generated. What is seen is the end result.

  7. Model Performance Evaluation and Scenario Analysis (MPESA)

    Science.gov (United States)

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM)

  8. Calibration of PMIS pavement performance prediction models.

    Science.gov (United States)

    2012-02-01

    Improve the accuracy of TxDOT's existing pavement performance prediction models by calibrating them using actual field data obtained from the Pavement Management Information System (PMIS). Ensure logical performance superiority patte...

  9. Human Performance Models of Pilot Behavior

    Science.gov (United States)

    Foyle, David C.; Hooey, Becky L.; Byrne, Michael D.; Deutsch, Stephen; Lebiere, Christian; Leiden, Ken; Wickens, Christopher D.; Corker, Kevin M.

    2005-01-01

    Five modeling teams from industry and academia were chosen by the NASA Aviation Safety and Security Program to develop human performance models (HPM) of pilots performing taxi operations and runway instrument approaches with and without advanced displays. One representative from each team will serve as a panelist to discuss their team's model architecture, augmentations and advancements to HPMs, and aviation-safety-related lessons learned. Panelists will discuss how modeling results are influenced by a model's architecture and structure, the role of the external environment, specific modeling advances, and future directions and challenges for human performance modeling in aviation.

  10. Modelling and Motivating Academic Performance.

    Science.gov (United States)

    Brennan, Geoffrey; Pettit, Philip

    1991-01-01

    Three possible motivators for college teachers (individual economic interest, academic virtue, and academic honor) suggest mechanisms that can be used to improve performance. Policies need to address all three motivators; economic levers alone may undermine alternative ways of supporting good work. (MSE)

  11. Cognitive performance modeling based on general systems performance theory.

    Science.gov (United States)

    Kondraske, George V

    2010-01-01

    General Systems Performance Theory (GSPT) was initially motivated by problems associated with quantifying different aspects of human performance. It has proved to be invaluable for measurement development and understanding quantitative relationships between human subsystem capacities and performance in complex tasks. It is now desired to bring focus to the application of GSPT to modeling of cognitive system performance. Previous studies involving two complex tasks (i.e., driving and performing laparoscopic surgery) and incorporating measures that are clearly related to cognitive performance (information processing speed and short-term memory capacity) were revisited. A GSPT-derived method of task analysis and performance prediction termed Nonlinear Causal Resource Analysis (NCRA) was employed to determine the demand on basic cognitive performance resources required to support different levels of complex task performance. This approach is presented as a means to determine a cognitive workload profile and the subsequent computation of a single number measure of cognitive workload (CW). Computation of CW may be a viable alternative to measuring it. Various possible "more basic" performance resources that contribute to cognitive system performance are discussed. It is concluded from this preliminary exploration that a GSPT-based approach can contribute to defining cognitive performance models that are useful for both individual subjects and specific groups (e.g., military pilots).

  12. Assembly line performance and modeling

    Science.gov (United States)

    Rane, Arun B.; Sunnapwar, Vivek K.

    2017-09-01

    The automobile sector forms the backbone of the manufacturing sector. The vehicle assembly line is an important section of an automobile plant, where repetitive tasks are performed one after another at different workstations. In this thesis, a methodology is proposed to reduce cycle time and time loss due to important factors like equipment failure, shortage of inventory, absenteeism, set-up, material handling, rejection, and fatigue, in order to improve output within given cost constraints. Various relationships between these factors, the corresponding cost, and output are established by a scientific approach. The methodology is validated in three different vehicle assembly plants, and may help practitioners optimize the assembly line using lean techniques.

  13. Generalization performance of regularized neural network models

    DEFF Research Database (Denmark)

    Larsen, Jan; Hansen, Lars Kai

    1994-01-01

    Architecture optimization is a fundamental problem of neural network modeling. The optimal architecture is defined as the one which minimizes the generalization error. This paper addresses estimation of the generalization performance of regularized, complete neural network models. Regularization...

  14. Constrained bayesian inference of project performance models

    OpenAIRE

    Sunmola, Funlade

    2013-01-01

    Project performance models play an important role in the management of project success. When used for monitoring projects, they can offer predictive ability such as indications of possible delivery problems. Approaches for monitoring project performance rely on available project information including restrictions imposed on the project, particularly the constraints of cost, quality, scope and time. We study in this paper a Bayesian inference methodology for project performance modelling in ...

  15. ORGANIZATIONAL LEARNING AND PERFORMANCE. A CONCEPTUAL MODEL

    OpenAIRE

    Alexandra Luciana GUÞÃ

    2013-01-01

    Through this paper, our main objective is to propose a conceptual model that links the notions of organizational learning (as a capability and as a process) and organizational performance. Our contribution consists in analyzing the literature on organizational learning and organizational performance and in proposing an integrated model that comprises: organizational learning capability, the process of organizational learning, organizational performance, human capital (the value and uniqueness...

  16. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

    Full Text Available In this paper a new model validation procedure for a logistic regression model is presented. First, we give a brief review of different techniques of model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for assessing the performance of a given model using an example taken from a management study.
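    Two of the quantitative performance measures such a validation typically uses, classification accuracy and the area under the ROC curve, can be computed directly from predicted probabilities. The outcomes and probabilities below are invented for illustration, not the paper's management-study data.

    ```python
    # Minimal sketch: accuracy and ROC AUC for a binary classifier's
    # predicted probabilities, using hypothetical data.
    import numpy as np

    y = np.array([0, 0, 1, 1, 0, 1, 0, 1, 1, 0])       # observed outcomes
    p = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.9,
                  0.3, 0.65, 0.7, 0.05])               # model probabilities

    # Accuracy at a 0.5 decision threshold.
    accuracy = np.mean((p >= 0.5) == y)

    # AUC via the rank (Mann-Whitney) formulation: the probability that a
    # randomly chosen positive case is scored above a random negative case.
    pos, neg = p[y == 1], p[y == 0]
    auc = np.mean(pos[:, None] > neg[None, :])
    ```

    Accuracy depends on the chosen threshold, while AUC summarizes discrimination across all thresholds, which is why validation procedures usually report both.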

  17. JELO: A Model of Joint Expeditionary Logistics Operations

    National Research Council Canada - National Science Library

    Boensel, Matthew

    2004-01-01

    JELO is an Excel spreadsheet model of joint expeditionary logistics operations and allows end-to-end analysis of the options for closing forces from CONUS, through the sea base, to objectives ashore...

  18. Photovoltaic performance models - A report card

    Science.gov (United States)

    Smith, J. H.; Reiter, L. R.

    1985-01-01

    Models for the analysis of photovoltaic (PV) systems' designs, implementation policies, and economic performance, have proliferated while keeping pace with rapid changes in basic PV technology and extensive empirical data compiled for such systems' performance. Attention is presently given to the results of a comparative assessment of ten well documented and widely used models, which range in complexity from first-order approximations of PV system performance to in-depth, circuit-level characterizations. The comparisons were made on the basis of the performance of their subsystem, as well as system, elements. The models fall into three categories in light of their degree of aggregation into subsystems: (1) simplified models for first-order calculation of system performance, with easily met input requirements but limited capability to address more than a small variety of design considerations; (2) models simulating PV systems in greater detail, encompassing types primarily intended for either concentrator-incorporating or flat plate collector PV systems; and (3) models not specifically designed for PV system performance modeling, but applicable to aspects of electrical system design. Models ignoring subsystem failure or degradation are noted to exclude operating and maintenance characteristics as well.

  19. Performance Analysis of IEEE 802.15.6 CSMA/CA Protocol for WBAN Medical Scenario through DTMC Model.

    Science.gov (United States)

    Kumar, Vivek; Gupta, Bharat

    2016-12-01

    The newly drafted IEEE 802.15.6 standard for Wireless Body Area Networks (WBAN) targets numerous medical and non-medical applications. This short-range wireless communication standard offers ultra-low power consumption with variable data rates, from a few Kbps to Mbps, in, on, or around the proximity of the human body. In this paper, a performance analysis of the carrier sense multiple access with collision avoidance (CSMA/CA) scheme of the IEEE 802.15.6 standard is presented in terms of throughput, reliability, clear channel assessment (CCA) failure probability, packet drop probability, and end-to-end delay. We have developed a discrete-time Markov chain (DTMC) to evaluate the performance of IEEE 802.15.6 CSMA/CA under non-ideal channel conditions with saturated traffic, including node wait time and service time. We also show that as the payload length increases, the CCA failure probability increases, which lowers node reliability. We have calculated the end-to-end delay in order to account for the node wait time caused by backoff and retransmission. A user-priority (UP) wise DTMC analysis has been performed to show the importance of the standard, especially for the medical scenario.
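    The DTMC machinery behind such an analysis reduces to finding a stationary distribution. The 3-state chain below (idle, backoff, transmit) and its transition probabilities are made up for the sketch and are far simpler than the standard's UP-dependent backoff structure.

    ```python
    # Toy DTMC sketch: solve the stationary distribution pi = pi P of a
    # small backoff-style chain. States and probabilities are invented.
    import numpy as np

    P = np.array([[0.5, 0.5, 0.0],    # idle     -> idle / backoff
                  [0.2, 0.3, 0.5],    # backoff  -> idle / backoff / transmit
                  [1.0, 0.0, 0.0]])   # transmit -> idle

    # Stationary distribution: solve (P^T - I) pi = 0 subject to sum(pi) = 1,
    # stacked as an overdetermined least-squares system.
    A = np.vstack([P.T - np.eye(3), np.ones(3)])
    b = np.array([0.0, 0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)

    throughput_proxy = pi[2]   # long-run fraction of slots spent transmitting
    ```

    In a full 802.15.6 analysis the chain has many more states (per-UP contention windows, retry stages, channel states), but throughput, delay, and drop probability are all read off the stationary distribution in the same way.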

  20. Iowa calibration of MEPDG performance prediction models.

    Science.gov (United States)

    2013-06-01

    This study aims to improve the accuracy of AASHTO Mechanistic-Empirical Pavement Design Guide (MEPDG) pavement performance predictions for Iowa pavement systems through local calibration of MEPDG prediction models. A total of 130 representative p...

  1. Assessing Ecosystem Model Performance in Semiarid Systems

    Science.gov (United States)

    Thomas, A.; Dietze, M.; Scott, R. L.; Biederman, J. A.

    2017-12-01

    In ecosystem process modelling, comparing outputs to benchmark datasets observed in the field is an important way to validate models, allowing the modelling community to track model performance over time and compare models at specific sites. Multi-model comparison projects as well as models themselves have largely been focused on temperate forests and similar biomes. Semiarid regions, on the other hand, are underrepresented in land surface and ecosystem modelling efforts, and yet will be disproportionately impacted by disturbances such as climate change due to their sensitivity to changes in the water balance. Benchmarking models at semiarid sites is an important step in assessing and improving models' suitability for predicting the impact of disturbance on semiarid ecosystems. In this study, several ecosystem models were compared at a semiarid grassland in southwestern Arizona using PEcAn, or the Predictive Ecosystem Analyzer, an open-source eco-informatics toolbox ideal for creating the repeatable model workflows necessary for benchmarking. Models included SIPNET, DALEC, JULES, ED2, GDAY, LPJ-GUESS, MAESPA, CLM, CABLE, and FATES. Comparison between model output and benchmarks such as net ecosystem exchange (NEE) tended to produce high root mean square error and low correlation coefficients, reflecting poor simulation of seasonality and the tendency for models to create much higher carbon sources than observed. These results indicate that ecosystem models do not currently adequately represent semiarid ecosystem processes.
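    The benchmark comparison described above boils down to metrics such as root mean square error and correlation between modeled and observed flux series. The NEE series below are synthetic stand-ins for model output and tower observations, not PEcAn output or the Arizona site data.

    ```python
    # Sketch of benchmarking a modeled NEE series against observations.
    # Both series are synthetic: a seasonal cycle plus noise for the "obs",
    # and a deliberately biased, damped cycle for the "model".
    import numpy as np

    rng = np.random.default_rng(7)
    t = np.arange(365)                                    # day of year
    nee_obs = -2.0 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 0.3, t.size)
    nee_mod = -1.5 * np.sin(2 * np.pi * t / 365) + 0.5    # biased model run

    # Two standard benchmark metrics: RMSE and Pearson correlation.
    rmse = np.sqrt(np.mean((nee_mod - nee_obs) ** 2))
    corr = np.corrcoef(nee_mod, nee_obs)[0, 1]
    ```

    High correlation with high RMSE, as in this sketch, is exactly the pattern the study reports: the model tracks seasonality's shape while its mean carbon balance is off.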

  2. Individualized Biomathematical Modeling of Fatigue and Performance

    Science.gov (United States)

    2008-05-29

    Figure caption fragments only: (a) data from the waking period, with sleep-inertia confounds omitted; gray bars indicate scheduled sleep periods. (b) Corresponding performance predictions according to the new model, including total sleep deprivation; light gray areas indicate nocturnal sleep periods.

  3. Driver Performance Model: 1. Conceptual Framework

    National Research Council Canada - National Science Library

    Heimerl, Joseph

    2001-01-01

    ... At the present time, no such comprehensive model exists. This report discusses a conceptual framework designed to encompass the relationships, conditions, and constraints related to direct, indirect, and remote modes of driving, and thus provides a guide or 'road map' for the construction and creation of a comprehensive driver performance model.

  4. Performance of hedging strategies in interval models

    NARCIS (Netherlands)

    Roorda, Berend; Engwerda, Jacob; Schumacher, J.M.

    2005-01-01

    For a proper assessment of risks associated with the trading of derivatives, the performance of hedging strategies should be evaluated not only in the context of the idealized model that has served as the basis of strategy development, but also in the context of other models. In this paper we

  5. Analysing the temporal dynamics of model performance for hydrological models

    NARCIS (Netherlands)

    Reusser, D.E.; Blume, T.; Schaefli, B.; Zehe, E.

    2009-01-01

    The temporal dynamics of hydrological model performance gives insights into errors that cannot be obtained from global performance measures assigning a single number to the fit of a simulated time series to an observed reference series. These errors can include errors in data, model parameters, or

  6. Biofilm carrier migration model describes reactor performance.

    Science.gov (United States)

    Boltz, Joshua P; Johnson, Bruce R; Takács, Imre; Daigger, Glen T; Morgenroth, Eberhard; Brockmann, Doris; Kovács, Róbert; Calhoun, Jason M; Choubert, Jean-Marc; Derlon, Nicolas

    2017-06-01

    The accuracy of a biofilm reactor model depends on the extent to which physical system conditions (particularly bulk-liquid hydrodynamics and their influence on biofilm dynamics) deviate from the ideal conditions upon which the model is based. It follows that an improved capacity to model a biofilm reactor does not necessarily rely on an improved biofilm model, but does rely on an improved mathematical description of the biofilm reactor and its components. Existing biofilm reactor models typically include a one-dimensional biofilm model, a process (biokinetic and stoichiometric) model, and a continuous flow stirred tank reactor (CFSTR) mass balance that, when organizing CFSTRs in series, creates a pseudo two-dimensional (2-D) model of bulk-liquid hydrodynamics approaching plug flow. In such a biofilm reactor model, the user-defined biofilm area is specified for each CFSTR; thereby, X carrier particles do not exit the boundaries of the CFSTR to which they are assigned or exchange boundaries with other CFSTRs in the series. The error introduced by this pseudo 2-D biofilm reactor modeling approach may adversely affect model results and limit the model user's capacity to accurately calibrate a model. This paper presents a new sub-model that describes the migration of X carrier particles and associated biofilms, and evaluates the impact that X carrier migration and axial dispersion have on simulated system performance. The relevance of the new biofilm reactor model to engineering situations is discussed by applying it to known biofilm reactor types and operational conditions.
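    The pseudo plug-flow behaviour of CFSTRs organized in series, which the abstract's new sub-model refines, can be illustrated with the classical tanks-in-series residence-time distribution. The sketch below is a generic textbook illustration (not the authors' sub-model); the mean residence time `tau` and the tank counts are arbitrary choices.

```python
import numpy as np
from math import factorial

def tanks_in_series_rtd(n: int, tau: float, t: np.ndarray) -> np.ndarray:
    """Residence-time distribution E(t) of n equal CFSTRs in series with
    total mean residence time tau. As n grows, E(t) narrows toward the
    plug-flow limit (a spike at t = tau)."""
    k = n / tau
    return (k ** n) * t ** (n - 1) * np.exp(-k * t) / factorial(n - 1)

t = np.linspace(0.0, 20.0, 40001)
dt = t[1] - t[0]
tau = 2.0
variance = {}
for n in (1, 4, 16):
    E = tanks_in_series_rtd(n, tau, t)
    mean = np.sum(t * E) * dt                       # ~ tau for every n
    variance[n] = np.sum((t - mean) ** 2 * E) * dt  # ~ tau**2 / n
# variance[1] > variance[4] > variance[16]: the series approaches plug flow
```

    The shrinking variance (tau^2/n) is exactly why stacking CFSTRs mimics plug flow, and why confining carriers to one CFSTR in that chain misstates their true axial dispersion.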

  7. Performance modeling, loss networks, and statistical multiplexing

    CERN Document Server

    Mazumdar, Ravi

    2009-01-01

    This monograph presents a concise mathematical approach for modeling and analyzing the performance of communication networks with the aim of understanding the phenomenon of statistical multiplexing. The novelty of the monograph is the fresh approach and insights provided by a sample-path methodology for queueing models that highlights the important ideas of Palm distributions associated with traffic models and their role in performance measures. Also presented are recent ideas of large buffer, and many sources asymptotics that play an important role in understanding statistical multiplexing. I

  8. Advances in HTGR fuel performance models

    International Nuclear Information System (INIS)

    Stansfield, O.M.; Goodin, D.T.; Hanson, D.L.; Turner, R.F.

    1985-01-01

    Advances in HTGR fuel performance models have improved the agreement between observed and predicted performance and contributed to an enhanced position of the HTGR with regard to investment risk and passive safety. Heavy metal contamination is the source of about 55% of the circulating activity in the HTGR during normal operation, and the remainder comes primarily from particles which failed because of defective or missing buffer coatings. These failed particles make up about a 5 × 10^-4 fraction of the total core inventory. In addition to prediction of fuel performance during normal operation, the models are used to determine fuel failure and fission product release during core heat-up accident conditions. The mechanistic nature of the models, which incorporate all important failure modes, permits the prediction of performance from the relatively modest accident temperatures of a passively safe HTGR to the much more severe accident conditions of the larger 2240-MW(t) HTGR. (author)

  9. Performance Evaluation Model for Application Layer Firewalls.

    Directory of Open Access Journals (Sweden)

    Shichang Xuan

    Full Text Available Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.

  10. Performance Evaluation Model for Application Layer Firewalls.

    Science.gov (United States)

    Xuan, Shichang; Yang, Wu; Dong, Hui; Zhang, Jiangchuan

    2016-01-01

    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.
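    The Erlang-style analysis the abstract describes can be illustrated with the standard Erlang-C formulas for an M/M/c service station. The sketch below is a generic queueing example with made-up arrival and service rates, not the paper's full three-layer firewall model; it does show why pooling limited service-desk resources reduces mean delay.

```python
from math import factorial

def erlang_c(c: int, lam: float, mu: float) -> float:
    """Probability an arriving request must wait in an M/M/c queue
    (Erlang-C formula). lam = arrival rate, mu = per-server service
    rate, c = number of servers; requires rho = lam / (c * mu) < 1."""
    a = lam / mu                       # offered load in Erlangs
    rho = a / c
    assert rho < 1, "queue is unstable"
    top = a ** c / (factorial(c) * (1 - rho))
    bottom = sum(a ** k / factorial(k) for k in range(c)) + top
    return top / bottom

def mean_delay(c: int, lam: float, mu: float) -> float:
    """Mean sojourn time: waiting time (via Little's law) plus service."""
    wq = erlang_c(c, lam, mu) / (c * mu - lam)
    return wq + 1.0 / mu

# Two ways of allocating four identical service-desk resources:
d_pooled = mean_delay(c=4, lam=3.0, mu=1.0)     # one pooled M/M/4 queue
d_split = mean_delay(c=1, lam=0.75, mu=1.0)     # four separate M/M/1 queues
```

    Here `d_pooled < d_split`: a single shared queue over all four servers outperforms statically partitioned ones, which is the kind of trade-off a resource-allocation evaluation like the paper's must quantify per layer.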

  11. Tailored model abstraction in performance assessments

    International Nuclear Information System (INIS)

    Kessler, J.H.

    1995-01-01

    Total System Performance Assessments (TSPAs) are likely to be one of the most significant parts of making safety cases for the continued development and licensing of geologic repositories for the disposal of spent fuel and HLW. Thus, it is critical that the TSPA model capture the 'essence' of the physical processes relevant to demonstrating the appropriate regulation is met. But how much detail about the physical processes must be modeled and understood before there is enough confidence that the appropriate essence has been captured? In this summary the level of model abstraction that is required is discussed. Approaches for subsystem and total system performance analyses are outlined, and the role of best estimate models is examined. It is concluded that a conservative approach for repository performance, based on limited amount of field and laboratory data, can provide sufficient confidence for a regulatory decision

  12. Estuarine modeling: Does a higher grid resolution improve model performance?

    Science.gov (United States)

    Ecological models are useful tools to explore cause effect relationships, test hypothesis and perform management scenarios. A mathematical model, the Gulf of Mexico Dissolved Oxygen Model (GoMDOM), has been developed and applied to the Louisiana continental shelf of the northern ...

  13. Critical review of glass performance modeling

    International Nuclear Information System (INIS)

    Bourcier, W.L.

    1994-07-01

    Borosilicate glass is to be used for permanent disposal of high-level nuclear waste in a geologic repository. Mechanistic chemical models are used to predict the rate at which radionuclides will be released from the glass under repository conditions. The most successful and useful of these models link reaction path geochemical modeling programs with a glass dissolution rate law that is consistent with transition state theory. These models have been used to simulate several types of short-term laboratory tests of glass dissolution and to predict the long-term performance of the glass in a repository. Although mechanistically based, the current models are limited by a lack of unambiguous experimental support for some of their assumptions. The most severe problem of this type is the lack of an existing validated mechanism that controls long-term glass dissolution rates. Current models can be improved by performing carefully designed experiments and using the experimental results to validate the rate-controlling mechanisms implicit in the models. These models should be supported with long-term experiments to be used for model validation. The mechanistic basis of the models should be explored by using modern molecular simulations such as molecular orbital and molecular dynamics to investigate both the glass structure and its dissolution process
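    The kind of TST-consistent dissolution rate law these models couple to geochemical codes can be written as an affinity law; the sketch below uses illustrative numbers only (the rate constant, surface area, and ion-activity products are invented, not data from the review).

```python
def dissolution_rate(k_plus: float, area: float, Q: float, K: float) -> float:
    """Affinity-based rate law consistent with transition state theory:
    net rate = k+ * S * (1 - Q/K), where Q is the ion activity product
    of the rate-controlling reaction and K its equilibrium constant.
    The net rate vanishes as the solution saturates (Q -> K)."""
    return k_plus * area * (1.0 - Q / K)

# Illustrative numbers only: far from equilibrium the glass dissolves at
# the forward rate; near saturation the net rate goes to zero.
r_dilute = dissolution_rate(k_plus=1e-9, area=1.0, Q=1e-6, K=1e-3)
r_saturated = dissolution_rate(k_plus=1e-9, area=1.0, Q=1e-3, K=1e-3)
```

    The review's central caveat maps directly onto this form: if the mechanism behind k+ and the (1 - Q/K) term is not the one actually controlling long-term dissolution, the extrapolation to repository timescales is unsupported.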

  14. Critical review of glass performance modeling

    Energy Technology Data Exchange (ETDEWEB)

    Bourcier, W.L. [Lawrence Livermore National Lab., CA (United States)

    1994-07-01

    Borosilicate glass is to be used for permanent disposal of high-level nuclear waste in a geologic repository. Mechanistic chemical models are used to predict the rate at which radionuclides will be released from the glass under repository conditions. The most successful and useful of these models link reaction path geochemical modeling programs with a glass dissolution rate law that is consistent with transition state theory. These models have been used to simulate several types of short-term laboratory tests of glass dissolution and to predict the long-term performance of the glass in a repository. Although mechanistically based, the current models are limited by a lack of unambiguous experimental support for some of their assumptions. The most severe problem of this type is the lack of an existing validated mechanism that controls long-term glass dissolution rates. Current models can be improved by performing carefully designed experiments and using the experimental results to validate the rate-controlling mechanisms implicit in the models. These models should be supported with long-term experiments to be used for model validation. The mechanistic basis of the models should be explored by using modern molecular simulations such as molecular orbital and molecular dynamics to investigate both the glass structure and its dissolution process.

  15. Performance modeling, stochastic networks, and statistical multiplexing

    CERN Document Server

    Mazumdar, Ravi R

    2013-01-01

    This monograph presents a concise mathematical approach for modeling and analyzing the performance of communication networks with the aim of introducing an appropriate mathematical framework for modeling and analysis as well as understanding the phenomenon of statistical multiplexing. The models, techniques, and results presented form the core of traffic engineering methods used to design, control and allocate resources in communication networks.The novelty of the monograph is the fresh approach and insights provided by a sample-path methodology for queueing models that highlights the importan

  16. A statistical model for predicting muscle performance

    Science.gov (United States)

    Byerly, Diane Leslie De Caix

    The objective of these studies was to develop a capability for predicting muscle performance and fatigue to be utilized for both space- and ground-based applications. To develop this predictive model, healthy test subjects performed a defined, repetitive dynamic exercise to failure using a Lordex spinal machine. Throughout the exercise, surface electromyography (SEMG) data were collected from the erector spinae using a Mega Electronics ME3000 muscle tester and surface electrodes placed on both sides of the back muscle. These data were analyzed using a 5th order Autoregressive (AR) model and statistical regression analysis. It was determined that an AR derived parameter, the mean average magnitude of AR poles, significantly correlated with the maximum number of repetitions (designated Rmax) that a test subject was able to perform. Using the mean average magnitude of AR poles, a test subject's performance to failure could be predicted as early as the sixth repetition of the exercise. This predictive model has the potential to provide a basis for improving post-space flight recovery, monitoring muscle atrophy in astronauts and assessing the effectiveness of countermeasures, monitoring astronaut performance and fatigue during Extravehicular Activity (EVA) operations, providing pre-flight assessment of the ability of an EVA crewmember to perform a given task, improving the design of training protocols and simulations for strenuous International Space Station assembly EVA, and enabling EVA work task sequences to be planned enhancing astronaut performance and safety. Potential ground-based, medical applications of the predictive model include monitoring muscle deterioration and performance resulting from illness, establishing safety guidelines in the industry for repetitive tasks, monitoring the stages of rehabilitation for muscle-related injuries sustained in sports and accidents, and enhancing athletic performance through improved training protocols while reducing
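    The study's key statistic, the mean magnitude of the AR poles, can be computed from a Yule-Walker fit as sketched below. This is a generic implementation with an invented test signal, not the authors' exact SEMG pipeline; only the AR order (5) follows the abstract.

```python
import numpy as np

def ar_pole_magnitudes(x, order: int = 5) -> float:
    """Fit an AR(order) model by the Yule-Walker method and return the
    mean magnitude of the AR poles (roots of the characteristic
    polynomial z**p - a1*z**(p-1) - ... - ap)."""
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    # biased autocorrelation estimates r[0..order] (guarantee a stable fit)
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:order + 1])      # Yule-Walker equations
    poles = np.roots(np.concatenate(([1.0], -a)))
    return float(np.mean(np.abs(poles)))

# A strongly rhythmic signal places a pole pair close to the unit circle;
# as the signal degrades toward noise, pole magnitudes shrink.
rng = np.random.default_rng(0)
t = np.arange(2000)
sig = np.sin(2 * np.pi * t / 25) + 0.1 * rng.standard_normal(2000)
m = ar_pole_magnitudes(sig, order=5)
```

    Tracking `m` repetition by repetition is the sort of feature the study then regresses against the repetitions-to-failure count Rmax.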

  17. Evaluation of models in performance assessment

    International Nuclear Information System (INIS)

    Dormuth, K.W.

    1993-01-01

    The reliability of models used for performance assessment for high-level waste repositories is a key factor in making decisions regarding the management of high-level waste. Model reliability may be viewed as a measure of the confidence that regulators and others have in the use of these models to provide information for decision making. The degree of reliability required for the models will increase as implementation of disposal proceeds and decisions become increasingly important to safety. Evaluation of the models by using observations of real systems provides information that assists the assessment analysts and reviewers in establishing confidence in the conclusions reached in the assessment. A continuing process of model calibration, evaluation, and refinement should lead to increasing reliability of models as implementation proceeds. However, uncertainty in the model predictions cannot be eliminated, so decisions will always be made under some uncertainty. Examples from the Canadian program illustrate the process of model evaluation using observations of real systems and its relationship to performance assessment. 21 refs., 2 figs

  18. Generating Performance Models for Irregular Applications

    Energy Technology Data Exchange (ETDEWEB)

    Friese, Ryan D.; Tallent, Nathan R.; Vishnu, Abhinav; Kerbyson, Darren J.; Hoisie, Adolfy

    2017-05-30

    Many applications have irregular behavior --- non-uniform input data, input-dependent solvers, irregular memory accesses, unbiased branches --- that cannot be captured using today's automated performance modeling techniques. We describe new hierarchical critical path analyses for the Palm model generation tool. To create a model's structure, we capture tasks along representative MPI critical paths. We create a histogram of critical tasks with parameterized task arguments and instance counts. To model each task, we identify hot instruction-level sub-paths and model each sub-path based on data flow, instruction scheduling, and data locality. We describe application models that generate accurate predictions for strong scaling when varying CPU speed, cache speed, memory speed, and architecture. We present results for the Sweep3D neutron transport benchmark; Page Rank on multiple graphs; Support Vector Machine with pruning; and PFLOTRAN's reactive flow/transport solver with domain-induced load imbalance.

  19. Performance Measurement Model A TarBase model with ...

    Indian Academy of Sciences (India)

    rohit

    The above results for TEST-1 show details for our two models (Model A and Model B):
    Model A: 8.0, 2.0, 94.52%, 88.46%, 76, 108, 12, 12, 0.86, 0.91, 0.78, 0.94
    Model B: 2.0, 2.0, 93.18%, 89.33%, 64, 95, 10, 9, 0.88, 0.90, 0.75, 0.98
    (column headings are not recoverable from this record). Performance of Model A after adding the 32 negative datasets of MiRTif on our testing set (MiRecords) ...

  20. Performance Evaluation and Modelling of Container Terminals

    Science.gov (United States)

    Venkatasubbaiah, K.; Rao, K. Narayana; Rao, M. Malleswara; Challa, Suresh

    2018-02-01

    The present paper evaluates and analyzes the performance of 28 container terminals in South East Asia through data envelopment analysis (DEA), principal component analysis (PCA), and a hybrid DEA-PCA method. The DEA technique is utilized to identify efficient decision making units (DMUs) and to rank DMUs in a peer appraisal mode. PCA is a multivariate statistical method to evaluate the performance of container terminals. In the hybrid method, DEA is integrated with PCA to arrive at the ranking of container terminals. Based on the composite ranking, performance modelling and optimization of container terminals is carried out through response surface methodology (RSM).
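    A minimal version of the PCA side of such a hybrid ranking can be sketched as follows. The indicator matrix is invented for illustration, and weighting component scores by explained-variance shares is one common convention, not necessarily the authors' exact scheme.

```python
import numpy as np

def pca_composite_scores(X: np.ndarray) -> np.ndarray:
    """Composite score per decision-making unit: standardize the
    indicator matrix X (rows = terminals, cols = indicators), take the
    principal components of the correlation matrix, and weight the
    component scores by their explained-variance shares."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    eigval, eigvec = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
    order = np.argsort(eigval)[::-1]             # descending variance
    eigval, eigvec = eigval[order], eigvec[:, order]
    signs = np.sign(eigvec.sum(axis=0))
    signs[signs == 0] = 1.0                      # fix the PCA sign ambiguity
    eigvec = eigvec * signs
    weights = eigval / eigval.sum()
    return (Z @ eigvec) @ weights

# Invented indicators for 5 terminals (e.g. throughput, berths, utilization):
X = np.array([[120.0, 40.0, 0.80],
              [200.0, 55.0, 0.90],
              [ 90.0, 35.0, 0.60],
              [150.0, 60.0, 0.70],
              [220.0, 70.0, 0.95]])
scores = pca_composite_scores(X)
ranking = np.argsort(scores)[::-1]               # best terminal first
```

    In the hybrid method such composite scores would be combined with DEA efficiency results before fitting the response-surface model.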

  1. Utilities for high performance dispersion model PHYSIC

    International Nuclear Information System (INIS)

    Yamazawa, Hiromi

    1992-09-01

    The description and usage of the utilities for the dispersion calculation model PHYSIC were summarized. The model was developed in the study of developing high performance SPEEDI with the purpose of introducing meteorological forecast function into the environmental emergency response system. The procedure of PHYSIC calculation consists of three steps; preparation of relevant files, creation and submission of JCL, and graphic output of results. A user can carry out the above procedure with the help of the Geographical Data Processing Utility, the Model Control Utility, and the Graphic Output Utility. (author)

  2. A practical model for sustainable operational performance

    International Nuclear Information System (INIS)

    Vlek, C.A.J.; Steg, E.M.; Feenstra, D.; Gerbens-Leenis, W.; Lindenberg, S.; Moll, H.; Schoot Uiterkamp, A.; Sijtsma, F.; Van Witteloostuijn, A.

    2002-01-01

    By means of a concrete model for sustainable operational performance, enterprises can report uniformly on the sustainability of their contributions to the economy, welfare, and the environment. The development and design of a three-dimensional monitoring system are presented and discussed.

  3. Data Model Performance in Data Warehousing

    Science.gov (United States)

    Rorimpandey, G. C.; Sangkop, F. I.; Rantung, V. P.; Zwart, J. P.; Liando, O. E. S.; Mewengkang, A.

    2018-02-01

    Data warehouses have become increasingly important in organizations that hold large amounts of data. A data warehouse is not a product but part of a solution for the decision support system in such organizations. The data model is the starting point for designing and developing data warehouse architectures; thus, the data model needs stable interfaces and must remain consistent for a longer period of time. The aim of this research is to determine which data model in data warehousing has the best performance. The research method is descriptive analysis, with three main tasks: data collection and organization, analysis of data, and interpretation of data. The results, examined with statistical analysis, show no statistically significant difference among the data models used in data warehousing, so an organization can utilize any of the four proposed data models when designing and developing a data warehouse.

  4. Performance model for a CCTV-MTI

    International Nuclear Information System (INIS)

    Dunn, D.R.; Dunbar, D.L.

    1978-01-01

    CCTV-MTI (closed circuit television--moving target indicator) monitors represent typical components of access control systems, as for example in a material control and accounting (MC and A) safeguards system. This report describes a performance model for a CCTV-MTI monitor. The performance of a human in an MTI role is a separate problem and is not addressed here. This work was done in conjunction with the NRC sponsored LLL assessment procedure for MC and A systems which is presently under development. We develop a noise model for a generic camera system and a model for the detection mechanism for a postulated MTI design. These models are then translated into an overall performance model. Measures of performance are probabilities of detection and false alarm as a function of intruder-induced grey level changes in the protected area. Sensor responsivity, lens F-number, source illumination and spectral response were treated as design parameters. Some specific results are illustrated for a postulated design employing a camera with a Si-target vidicon. Reflectance or light level changes in excess of 10% due to an intruder will be detected with a very high probability for the portion of the visible spectrum with wavelengths above 500 nm. The resulting false alarm rate was less than one per year. We did not address sources of nuisance alarms due to adverse environments, reliability, resistance to tampering, nor did we examine the effects of the spatial frequency response of the optics. All of these are important and will influence overall system detection performance
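    For a single-threshold detector in Gaussian noise, detection and false-alarm probabilities of the kind this model reports reduce to Gaussian tail integrals. The sketch below is textbook detection theory with made-up numbers, not the report's calibrated camera noise model.

```python
from math import erfc, sqrt

def q(x: float) -> float:
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * erfc(x / sqrt(2))

def detection_performance(delta: float, sigma: float, threshold: float):
    """Threshold test on a grey-level change of size delta in Gaussian
    noise with standard deviation sigma: returns (P_detect, P_false_alarm)."""
    pd = q((threshold - delta) / sigma)   # change present
    pfa = q(threshold / sigma)            # change absent
    return pd, pfa

# Illustrative: a 10% reflectance change against 2% noise, threshold at 8%
# (4 noise standard deviations above the background level).
pd, pfa = detection_performance(delta=0.10, sigma=0.02, threshold=0.08)
```

    Raising the threshold trades detection probability for false-alarm rate, which is exactly the curve a design study sweeps when choosing sensor responsivity, F-number, and illumination.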

  5. Assessing The Performance of Hydrological Models

    Science.gov (United States)

    van der Knijff, Johan

    The performance of hydrological models is often characterized using the coefficient of efficiency, E. The sensitivity of E to extreme streamflow values, and the difficulty of deciding what value of E should be used as a threshold to identify 'good' models or model parameterizations, have proven to be serious shortcomings of this index. This paper reviews some alternative performance indices that have appeared in the literature. Legates and McCabe (1999) suggested a more generalized form of E, E'(j,B). Here, j is a parameter that controls how much emphasis is put on extreme streamflow values, and B defines a benchmark or 'null hypothesis' against which the results of the model are tested. E'(j,B) was used to evaluate a large number of parameterizations of a conceptual rainfall-runoff model, using 6 different combinations of j and B. First, the effect of j and B is explained. Second, it is demonstrated how the index can be used to explicitly test hypotheses about the model and the data. This approach appears to be particularly attractive if the index is used as a likelihood measure within a GLUE-type analysis.
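    A direct implementation of the generalized index is short. The sketch below follows the Legates-McCabe form E'(j,B) = 1 - Σ|O-P|^j / Σ|O-B|^j with the observed mean as the default benchmark; the streamflow values are invented, and the paper's specific j and B combinations are not reproduced.

```python
import numpy as np

def gen_efficiency(obs, pred, j: float = 2.0, benchmark=None) -> float:
    """Generalized coefficient of efficiency E'(j, B):
    1 - sum(|O - P|**j) / sum(|O - B|**j). j controls the emphasis on
    extreme values (j = 2 recovers the classic Nash-Sutcliffe E); the
    benchmark B defaults to the observed mean ('climatology')."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    if benchmark is None:
        benchmark = np.full_like(obs, obs.mean())
    num = np.sum(np.abs(obs - pred) ** j)
    den = np.sum(np.abs(obs - benchmark) ** j)
    return float(1.0 - num / den)

obs = np.array([1.0, 3.0, 2.0, 5.0, 4.0])
e_perfect = gen_efficiency(obs, obs)                         # 1 by construction
e_climatology = gen_efficiency(obs, np.full(5, obs.mean()))  # 0: no better than B
```

    Choosing j = 1 down-weights extreme flows, and supplying a seasonal benchmark for B turns the index into an explicit hypothesis test against that benchmark, as the paper advocates.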

  6. System Advisor Model: Flat Plate Photovoltaic Performance Modeling Validation Report

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, Janine [National Renewable Energy Lab. (NREL), Golden, CO (United States); Whitmore, Jonathan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Kaffine, Leah [National Renewable Energy Lab. (NREL), Golden, CO (United States); Blair, Nate [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dobos, Aron P. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    The System Advisor Model (SAM) is a free software tool that performs detailed analysis of both system performance and system financing for a variety of renewable energy technologies. This report provides detailed validation of the SAM flat plate photovoltaic performance model by comparing SAM-modeled PV system generation data to actual measured production data for nine PV systems ranging from 75 kW to greater than 25 MW in size. The results show strong agreement between SAM predictions and field data, with annualized prediction error below 3% for all fixed tilt cases and below 8% for all one axis tracked cases. The analysis concludes that snow cover and system outages are the primary sources of disagreement, and other deviations resulting from seasonal biases in the irradiation models and one axis tracking issues are discussed in detail.

  7. New Diagnostics to Assess Model Performance

    Science.gov (United States)

    Koh, Tieh-Yong

    2013-04-01

    The comparison of model performance between the tropics and the mid-latitudes is particularly problematic for observables like temperature and humidity: in the tropics, these observables have little variation and so may give an apparent impression that model predictions are often close to observations; on the contrary, they vary widely in mid-latitudes and so the discrepancy between model predictions and observations might be unnecessarily over-emphasized. We have developed a suite of mathematically rigorous diagnostics that measures normalized errors accounting for the observed and modeled variability of the observables themselves. Another issue in evaluating model performance is the relative importance of getting the variance of an observable right versus getting the modeled variation to be in phase with the observed. The correlation-similarity diagram was designed to analyse the pattern error of a model by breaking it down into contributions from amplitude and phase errors. A final and important question pertains to the generalization of scalar diagnostics to analyse vector observables like wind. In particular, measures of variance and correlation must be properly derived to avoid the mistake of ignoring the covariance between north-south and east-west winds (hence wrongly assuming that the north-south and east-west directions form a privileged vector basis for error analysis). There is also a need to quantify systematic preferences in the direction of vector wind errors, which we make possible by means of an error anisotropy diagram. Although the suite of diagnostics is mentioned with reference to model verification here, it is generally applicable to quantify differences between two datasets (e.g. from two observation platforms). Reference publication: Koh, T. Y. et al. (2012), J. Geophys. Res., 117, D13109, doi:10.1029/2011JD017103. also available at http://www.ntu.edu.sg/home/kohty
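    Two of the ideas described, normalizing errors by the observed variability and treating wind as a genuine 2-D vector, can be sketched as below. These are simple generic implementations with invented data, not necessarily the exact definitions of Koh et al. (2012).

```python
import numpy as np

def normalized_rmse(obs, mod) -> float:
    """RMSE scaled by the observed standard deviation, so that errors in
    low-variability (tropical) and high-variability (mid-latitude)
    regions are judged on a comparable footing."""
    obs, mod = np.asarray(obs, float), np.asarray(mod, float)
    return float(np.sqrt(np.mean((mod - obs) ** 2)) / obs.std())

def vector_variance(uv) -> float:
    """Total variance of a 2-D wind series (shape (n, 2)): the trace of
    the full covariance matrix, which retains the u-v covariance instead
    of treating the components as an independent, privileged basis."""
    return float(np.trace(np.cov(np.asarray(uv, float), rowvar=False)))

uv = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])   # invented (u, v) samples
total_var = vector_variance(uv)
obs = np.array([1.0, 3.0, 5.0])
err = normalized_rmse(obs, obs + obs.std())   # constant one-sigma bias -> ~1
```

    Because the trace of the covariance matrix is invariant under rotation of the (u, v) axes, this variance measure does not depend on the north-south / east-west coordinate choice, which is the pitfall the abstract warns against.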

  8. Performance modeling of network data services

    Energy Technology Data Exchange (ETDEWEB)

    Haynes, R.A.; Pierson, L.G.

    1997-01-01

    Networks at major computational organizations are becoming increasingly complex. The introduction of large massively parallel computers and supercomputers with gigabyte memories are requiring greater and greater bandwidth for network data transfers to widely dispersed clients. For networks to provide adequate data transfer services to high performance computers and remote users connected to them, the networking components must be optimized from a combination of internal and external performance criteria. This paper describes research done at Sandia National Laboratories to model network data services and to visualize the flow of data from source to sink when using the data services.

  9. Probabilistic Radiological Performance Assessment Modeling and Uncertainty

    Science.gov (United States)

    Tauxe, J.

    2004-12-01

    A generic probabilistic radiological Performance Assessment (PA) model is presented. The model, built using the GoldSim systems simulation software platform, concerns contaminant transport and dose estimation in support of decision making with uncertainty. Both the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE) require assessments of potential future risk to human receptors of disposal of LLW. Commercially operated LLW disposal facilities are licensed by the NRC (or agreement states), and the DOE operates such facilities for disposal of DOE-generated LLW. The type of PA model presented is probabilistic in nature, and hence reflects the current state of knowledge about the site by using probability distributions to capture what is expected (central tendency or average) and the uncertainty (e.g., standard deviation) associated with input parameters, and propagating through the model to arrive at output distributions that reflect expected performance and the overall uncertainty in the system. Estimates of contaminant release rates, concentrations in environmental media, and resulting doses to human receptors well into the future are made by running the model in Monte Carlo fashion, with each realization representing a possible combination of input parameter values. Statistical summaries of the results can be compared to regulatory performance objectives, and decision makers are better informed of the inherently uncertain aspects of the model which supports their decision-making. While this information may make some regulators uncomfortable, they must realize that uncertainties which were hidden in a deterministic analysis are revealed in a probabilistic analysis, and the chance of making a correct decision is now known rather than hoped for. The model includes many typical features and processes that would be part of a PA, but is entirely fictitious. This does not represent any particular site and is meant to be a generic example. 

  10. Performance assessment modeling of pyrometallurgical process wasteforms

    International Nuclear Information System (INIS)

    Nutt, W.M.; Hill, R.N.; Bullen, D.B.

    1995-01-01

    Performance assessment analyses have been completed to estimate the behavior of high-level nuclear wasteforms generated from the pyrometallurgical processing of liquid metal reactor (LMR) and light water reactor (LWR) spent nuclear fuel. Waste emplaced in the proposed repository at Yucca Mountain is investigated as the basis for the study. The resulting cumulative actinide and fission product releases to the accessible environment within a 100,000-year period from the various pyrometallurgical process wasteforms are compared to those of directly disposed LWR spent fuel using the same total repository system model. The impact of differing radionuclide transport models on the overall release characteristics is investigated.

  11. Model description and evaluation of model performance: DOSDIM model

    International Nuclear Information System (INIS)

    Lewyckyj, N.; Zeevaert, T.

    1996-01-01

    DOSDIM was developed to assess the impact to man from routine and accidental atmospheric releases. It is a compartmental, deterministic, radiological model. For an accidental release, dynamic transfer factors are used, as opposed to a routine release, for which equilibrium transfer factors are used. Parameter values were chosen to be conservative. Transfers between compartments are described by first-order differential equations. 2 figs

  12. Modelling and evaluation of surgical performance using hidden Markov models.

    Science.gov (United States)

    Megali, Giuseppe; Sinigaglia, Stefano; Tonet, Oliver; Dario, Paolo

    2006-10-01

    Minimally invasive surgery has become very widespread in the last ten years. Since surgeons experience difficulties in learning and mastering minimally invasive techniques, the development of training methods is of great importance. While virtual reality-based simulators have introduced a new paradigm in surgical training, skill evaluation methods are far from being objective. This paper proposes a method for defining a model of surgical expertise and an objective metric to evaluate performance in laparoscopic surgery. Our approach is based on the processing of kinematic data describing movements of surgical instruments. We use hidden Markov model theory to define an expert model that describes expert surgical gestures. The model is trained on kinematic data related to exercises performed on a surgical simulator by experienced surgeons. Subsequently, we use this expert model as a reference model in the definition of an objective metric to evaluate performance of surgeons with different abilities. Preliminary results show that, using different topologies for the expert model, the method can be efficiently used both for the discrimination between experienced and novice surgeons, and for the quantitative assessment of surgical ability.
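
The likelihood-based scoring scheme the abstract describes can be illustrated with the standard HMM forward algorithm: posit (or train) an expert model, then score new gesture sequences by their log-likelihood under it. The two-state model, the quantization of instrument motion into two symbols, and all probabilities below are hypothetical toy values, not the paper's trained model.

```python
import math

def forward_log_likelihood(obs, start_p, trans_p, emit_p):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the forward algorithm and per-step rescaling."""
    n_states = len(start_p)
    alpha = [start_p[s] * emit_p[s][obs[0]] for s in range(n_states)]
    log_lik = 0.0
    for t in range(1, len(obs)):
        scale = sum(alpha)           # rescale to avoid numerical underflow
        log_lik += math.log(scale)
        alpha = [a / scale for a in alpha]
        alpha = [
            sum(alpha[i] * trans_p[i][j] for i in range(n_states)) * emit_p[j][obs[t]]
            for j in range(n_states)
        ]
    return log_lik + math.log(sum(alpha))

# Hypothetical 2-state expert model ("smooth" vs "jerky" instrument motion),
# observations quantized into 2 symbols (0 = small move, 1 = large move).
start = [0.8, 0.2]
trans = [[0.9, 0.1], [0.2, 0.8]]
emit = [[0.9, 0.1], [0.3, 0.7]]

expert_like = forward_log_likelihood([0, 0, 1, 0, 0], start, trans, emit)
novice_like = forward_log_likelihood([1, 1, 1, 1, 1], start, trans, emit)
```

A mostly-smooth sequence scores higher under the expert model than a uniformly jerky one, which is the discrimination principle the paper builds its metric on.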

  13. Hybrid Modeling Improves Health and Performance Monitoring

    Science.gov (United States)

    2007-01-01

    Scientific Monitoring Inc. was awarded a Phase I Small Business Innovation Research (SBIR) project by NASA's Dryden Flight Research Center to create a new, simplified health-monitoring approach for flight vehicles and flight equipment. The project developed a hybrid physical model concept that provided a structured approach to simplifying complex design models for use in health monitoring, allowing the output or performance of the equipment to be compared to what the design models predicted, so that deterioration or impending failure could be detected before there would be an impact on the equipment's operational capability. Based on the original modeling technology, Scientific Monitoring released I-Trend, a commercial health- and performance-monitoring software product named for its intelligent trending, diagnostics, and prognostics capabilities, as part of the company's complete ICEMS (Intelligent Condition-based Equipment Management System) suite of monitoring and advanced alerting software. I-Trend uses the hybrid physical model to better characterize the nature of health or performance alarms that result in "no fault found" false alarms. Additionally, the use of physical principles helps I-Trend identify problems sooner. I-Trend technology is currently in use in several commercial aviation programs, and the U.S. Air Force recently tapped Scientific Monitoring to develop next-generation engine health-management software for monitoring its fleet of jet engines. Scientific Monitoring has continued the original NASA work, this time under a Phase III SBIR contract with a joint NASA-Pratt & Whitney aviation security program on propulsion-controlled aircraft under missile-damaged aircraft conditions.

  14. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    Science.gov (United States)

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-12-01

    The scientific discovery process can be advanced by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components to these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  15. Intelligent End-To-End Resource Virtualization Using Service Oriented Architecture

    NARCIS (Netherlands)

    Onur, E.; Sfakianakis, E.; Papagianni, C.; Karagiannis, Georgios; Kontos, T.; Niemegeers, I.G.M.M.; Niemegeers, I.; Chochliouros, I.; Heemstra de Groot, S.M.; Sjödin, P.; Hidell, M.; Cinkler, T.; Maliosz, M.; Kaklamani, D.I.; Carapinha, J.; Belesioti, M.; Futrps, E.

    2009-01-01

    Service-oriented architecture can be considered as a philosophy or paradigm in organizing and utilizing services and capabilities that may be under the control of different ownership domains. Virtualization provides abstraction and isolation of lower level functionalities, enabling portability of

  16. Designing a holistic end-to-end intelligent network analysis and security platform

    Science.gov (United States)

    Alzahrani, M.

    2018-03-01

    A firewall protects a network from outside attacks; however, once an attack enters a network, it is difficult to detect. Significant incidents have happened recently: millions of Yahoo email accounts were stolen, and crucial data from institutions are held for ransom. For two years, Yahoo's system administrators were not aware that there were intruders inside the network. This happened due to the lack of intelligent tools to monitor user behaviour in the internal network. This paper discusses the design of an intelligent anomaly/malware detection system with proper proactive actions. The aim is to equip the system administrator with a proper tool to battle insider attackers. The proposed system adopts machine learning to analyse user behaviour through the runtime behaviour of each node in the network. The machine learning techniques include: deep learning, an evolving machine learning perceptron, a hybrid of neural networks and fuzzy logic, as well as predictive memory techniques. The proposed system is expanded to deal with larger networks using agent techniques.

  17. Location Assisted Vertical Handover Algorithm for QoS Optimization in End-to-End Connections

    DEFF Research Database (Denmark)

    Dam, Martin S.; Christensen, Steffen R.; Mikkelsen, Lars M.

    2012-01-01

    When devices are mobile, they potentially move within range of several different wireless networks, which are not utilized due to security restrictions, inactive radios, or because it is just plain cumbersome for the user to exploit them. This paper proposes a vertical handover algorithm to utilize these networks, which...

  18. Multi-hop Relaying: An End-to-End Delay Analysis

    KAUST Repository

    Chaaban, Anas

    2015-12-01

    The impact of multi-hopping schemes on the communication latency in a relay channel is studied. The main aim is to characterize conditions under which such schemes decrease the communication latency given a reliability requirement. Both decode-forward (DF) and amplify-forward (AF) with block coding are considered, and are compared with the point-to-point (P2P) scheme which ignores the relay. Latency expressions for the three schemes are derived, and conditions under which DF and AF reduce latency are obtained for high signal-to-noise ratio (SNR). Interestingly, these conditions are more strict when compared to the conditions under which the same multi-hopping schemes achieve higher long-term (information-theoretic) rates than P2P. It turns out that the relation between the source-destination SNR and the harmonic mean of the SNRs of the channels to and from the relay dictates whether multi-hopping reduces latency or not.
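
The trade-off the abstract analyzes can be illustrated with a toy latency comparison: deliver a fixed number of bits either over the direct link or over two decode-forward hops, taking each link's rate as the Shannon rate log2(1 + SNR). This is only a sketch of the P2P-vs-DF comparison; the block-coding and reliability machinery of the paper is deliberately omitted.

```python
import math

def tx_time(bits, snr):
    # Channel uses needed to deliver `bits` at the Shannon rate log2(1 + SNR).
    return bits / math.log2(1.0 + snr)

def p2p_latency(bits, snr_sd):
    # Point-to-point: one transmission over the source-destination link.
    return tx_time(bits, snr_sd)

def df_latency(bits, snr_sr, snr_rd):
    # Two-hop decode-forward: the relay must fully decode before it can
    # forward, so the two hop times add up.
    return tx_time(bits, snr_sr) + tx_time(bits, snr_rd)

# Strong relay links (SNR 15, about 11.8 dB) vs a weak direct link (SNR 1):
direct = p2p_latency(1000, 1.0)         # 1000 / log2(2)  = 1000 channel uses
relayed = df_latency(1000, 15.0, 15.0)  # 250 + 250       =  500 channel uses
```

Even though the hop times add, two sufficiently strong hops beat a weak direct link — and because both hop terms enter as reciprocals of the per-hop rates, it is a harmonic-mean-like combination of the relay links that competes against the direct link, mirroring the condition stated in the abstract.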

  19. Mechanisms for Differentiating End-to-End Loss Due to Channel Corruption and Network Congestion

    National Research Council Canada - National Science Library

    Romaniak, Gregory

    2001-01-01

    .... The experiment team established a test configuration at the NRL to better understand available mechanisms for the intelligent notification of loss due to channel errors as opposed to loss due to congestion...

  20. Improving End-To-End Tsunami Warning for Risk Reduction on Canada’s West Coast

    Science.gov (United States)

    2015-01-01

    Universal Generic Codes; UNESCO: United Nations Educational, Scientific and Cultural Organization; USGS: United States Geological Survey; VTEC: Valid... A UNESCO World Heritage Site and accessible only by boat or plane, the terrestrial portion of Gwaii Haanas covers the southern portion of Moresby Island... RSS subscription • USGS (@USGSted) Twitter or SMS subscriptions • UNESCO tsunami-information-ioc@lists.unesco.org email. The response results for...

  1. Towards End-to-End Lane Detection: an Instance Segmentation Approach

    OpenAIRE

    Neven, Davy; De Brabandere, Bert; Georgoulis, Stamatios; Proesmans, Marc; Van Gool, Luc

    2018-01-01

    Modern cars are incorporating an increasing number of driver assist features, among which automatic lane keeping. The latter allows the car to properly position itself within the road lanes, which is also crucial for any subsequent lane departure or trajectory planning decision in fully autonomous cars. Traditional lane detection methods rely on a combination of highly-specialized, hand-crafted features and heuristics, usually followed by post-processing techniques, that are computationally e...

  2. Development of an End-to-End Active Debris Removal (ADR) Mission Strategic Plan

    Data.gov (United States)

    National Aeronautics and Space Administration — The original proposal was to develop an ADR mission strategic plan. However, the task was picked up by the OCT. Subsequently the award was de-scoped to $30K to...

  3. Hardware Support for Malware Defense and End-to-End Trust

    Science.gov (United States)

    2017-02-01

    ...formed IoT division, and with selected partners. Standards: “Trust Dust”, the early prototype of the value of embedded TPM for cyber-physical systems... Computing Group (TCG) for the definition of the Trusted Platform Module (TPM). It is noted that for long-running systems, establishing trust at boot... may be insufficient for continuation of trust at an arbitrary point in the future. If the design of the system permits undetectable attacks against

  4. Using Voice Over Internet Protocol to Create True End-to-End Security

    Science.gov (United States)

    2011-09-01

    ...all UAs were forced to use the PCMU audio codec (see Figure 22). Also on this screen is the ability to turn on SRTP encryption. Jitsi and Linphone... call is established. Additionally, the... 1 Linphone is an open source audio/video and text messaging... client that uses SIP. 2 Blink is an open source audio SIP client that is available for Mac, Windows, and Linux. 3 Jitsi is an open source audio

  5. End-to-end unsupervised deformable image registration with a convolutional neural network

    NARCIS (Netherlands)

    de Vos, Bob D.; Berendsen, Floris; Viergever, Max A.; Staring, Marius; Išgum, Ivana

    2017-01-01

    In this work we propose a deep learning network for deformable image registration (DIRNet). The DIRNet consists of a convolutional neural network (ConvNet) regressor, a spatial transformer, and a resampler. The ConvNet analyzes a pair of fixed and moving images and outputs parameters for the spatial

  6. Design and Evaluation for the End-to-End Detection of TCP/IP Header Manipulation

    Science.gov (United States)

    2014-06-01

    prevent IPsec from being securely applied to general Internet paths (e.g., between a user at a coffee shop and each of the servers of websites they... asymmetric) behaviors, including paths that modify, delete, or insert: sequence numbers, IPID or receive window, ECN, MSS, SACK permitted, timestamps

  7. Integration of DST's for non-conflicting end-to-end flight scheduling Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In this SBIR effort we propose an innovative approach for the integration of Decision Support Tools (DSTs) for increased situational awareness, improved cooperative...

  8. IMS Intra- and Inter Domain End-to-End Resilience Analysis

    DEFF Research Database (Denmark)

    Kamyod, Chayapol; Nielsen, Rasmus Hjorth; Prasad, Neeli R.

    2013-01-01

    This paper evaluated the resilience of the reference IMS-based network topology in operation through the key reliability parameters via OPNET. The reliability behaviors of communication within similar and across registered home IMS domains were simulated and compared. Besides, the reliability effects... when increasing requested traffic were presented; as well, the results were compared when applying equal load balancing of a 1:1 redundancy of the S-CSCF unit into the core registered domain. The results exposed insight into the reliability behaviors of communication within similar and different registered...

  9. Mining Fashion Outfit Composition Using An End-to-End Deep Learning Approach on Set Data

    OpenAIRE

    Li, Yuncheng; Cao, LiangLiang; Zhu, Jiang; Luo, Jiebo

    2016-01-01

    Composing fashion outfits involves deep understanding of fashion standards while incorporating creativity for choosing multiple fashion items (e.g., Jewelry, Bag, Pants, Dress). In fashion websites, popular or high-quality fashion outfits are usually designed by fashion experts and followed by large audiences. In this paper, we propose a machine learning system to compose fashion outfits automatically. The core of the proposed automatic composition system is to score fashion outfit candidates...

  10. AAL Security and Privacy: transferring XACML policies for end-to-end access and usage control

    NARCIS (Netherlands)

    Vlamings, H.G.M.; Koster, R.P.

    2010-01-01

    Ambient Assisted Living (AAL) systems and services aim to provide a solution for growing healthcare expenses and degradation of life quality of elderly using information and communication technology. In particular, AAL solutions are being created that are heavily based on web services and sensor

  11. Network Slicing in Industry 4.0 Applications: Abstraction Methods and End-to-End Analysis

    DEFF Research Database (Denmark)

    Nielsen, Jimmy Jessen; Popovski, Petar; Kalør, Anders Ellersgaard

    2018-01-01

    having different requirements from the network, ranging from high reliability and low latency to high data rates. Furthermore, these industrial networks will be highly heterogeneous as they will feature a number of diverse communication technologies. In this article, we propose network slicing...

  12. End-to-end Configuration of Wireless Realtime Communication over Heterogeneous Protocols

    DEFF Research Database (Denmark)

    Malinowsky, B.; Grønbæk, Jesper; Schwefel, Hans-Peter

    2015-01-01

    . Such configuration meets requirements on probability of message delivery and deadlines for real-time data. We provide a closed-form expression to calculate viable configurations for lossy environments, and apply it to IEEE 802.11 and XBee 868 MHz technologies for contended channel conditions. Simulation shows...

  13. The Challenge of Ensuring Human Rights in the End-to-End Supply Chain

    DEFF Research Database (Denmark)

    Wieland, Andreas; Handfield, Robert B.

    2014-01-01

    Certification programs have their merits and their limitations. With the growing availability of social media, analytics tools, and supply chain data, a smarter set of solutions could soon be possible.

  14. Building an End-to-end System for Long Term Soil Monitoring

    Science.gov (United States)

    Szlavecz, K.; Terzis, A.; Musaloiu-E., R.; Cogan, J.; Szalay, A.; Gray, J.

    2006-05-01

    We have developed and deployed an experimental soil monitoring system in an urban forest. Wireless sensor nodes collect data on soil temperature, soil moisture, air temperature, and light. Data are uploaded into a SQL Server database, where they are calibrated and reorganized into an OLAP data cube. The data are accessible on-line using a web services interface with various visual tools. Our prototype system of ten nodes has been live since Sep 2005, and in 5 months of operation over 6 million measurements have been collected. At a high level, our experiment was a success: we detected variations in soil condition corresponding to topography and external environmental parameters as expected. However, we encountered a number of challenging technical problems: the need for low-level programming at multiple levels, calibration across space and time, and cross-reference of measurements with external sources. Based upon the experience with this system, we are now deploying 200 mote nodes with close to a thousand sensors spread over multiple sites in the context of the Baltimore Ecosystem Study LTER.
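
The store-then-calibrate pattern the abstract describes can be sketched with SQLite standing in for SQL Server: raw measurements land in a relational table as received, and calibration is applied afterwards. The table schema, sensor names, and linear calibration coefficients below are hypothetical.

```python
import sqlite3

# Raw measurements are stored first, exactly as received from the nodes.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw (node_id INTEGER, sensor TEXT, raw_value REAL)")
conn.executemany(
    "INSERT INTO raw VALUES (?, ?, ?)",
    [(1, "soil_temp", 512), (1, "soil_moisture", 301), (2, "soil_temp", 498)],
)

# Hypothetical per-sensor linear calibration: value = gain * raw + offset.
CALIBRATION = {"soil_temp": (0.05, -5.0), "soil_moisture": (0.1, 0.0)}

def calibrate(sensor, raw):
    gain, offset = CALIBRATION[sensor]
    return gain * raw + offset

# Calibrated readings, ready to be reorganized into an OLAP cube or served
# through a web services interface.
readings = {
    (node, sensor): calibrate(sensor, raw)
    for node, sensor, raw in conn.execute(
        "SELECT node_id, sensor, raw_value FROM raw"
    )
}
```

Keeping raw values in the database and calibrating downstream is what makes the paper's "calibration across space and time" problem tractable: coefficients can be revised later and reapplied to the stored history.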

  15. Caius: Synthetic Observations Using a Robust End-to-End Radiative Transfer Pipeline

    Science.gov (United States)

    Simeon Barrow, Kirk Stuart; Wise, John H.; O'Shea, Brian; Norman, Michael L.; Xu, Hao

    2018-01-01

    We present synthetic observations for the first generations of galaxies in the Universe and make predictions for future deep field observations for redshifts greater than 6. Due to the strong impact of nebular emission lines and the relatively compact scale of HII regions, high resolution cosmological simulations and a robust suite of analysis tools are required to properly simulate spectra. We created a software pipeline consisting of FSPS, Yggdrasil, Hyperion, Cloudy and our own tools to generate synthetic IR observations from a fully three-dimensional arrangement of gas, dust, and stars. Our prescription allows us to include emission lines for a complete chemical network and tackle the effect of dust extinction and scattering in the various lines of sight. We provide spectra, 2-D binned photon imagery for both HST and JWST IR filters, luminosity relationships, and emission line strengths for a large sample of high redshift galaxies in the Renaissance Simulations (Xu et al. 2013). We also pay special attention to contributions from Population III stars and high-mass X-ray binaries and explore a direct-collapse black hole simulation (Aykutalp et al. 2014). Our resulting synthetic spectra show high variability between galactic halos with a strong dependence on stellar mass, viewing angle, metallicity, gas mass fraction, and formation history.

  16. Hoe kunnen end-to-end processen worden geborgd in de organisatie?

    NARCIS (Netherlands)

    Strikwerda, H.

    2017-01-01

    Processes in which knowledge, information and material are transformed into goods and services form the core of organizing. That is one of the oldest principles in business administration. In scientific management, and hence in lean six sigma, processes are the object of analysis and improvement

  17. SecMon: End-to-End Quality and Security Monitoring System

    OpenAIRE

    Ciszkowski, Tomasz; Eliasson, Charlott; Fiedler, Markus; Kotulski, Zbigniew; Lupu, Radu; Mazurczyk, Wojciech

    2008-01-01

    The Voice over Internet Protocol (VoIP) is becoming a more available and popular way of communicating for Internet users. This also applies to Peer-to-Peer (P2P) systems, and merging these two has already proven to be successful (e.g. Skype). Even the existing standards of VoIP provide an assurance of security and Quality of Service (QoS); however, these features are usually optional and supported by a limited number of implementations. As a result, the lack of mandatory and widely applicable Q...

  18. Urban biomining meets printable electronics: end-to-end at destination biological recycling and reprinting

    Data.gov (United States)

    National Aeronautics and Space Administration — Space missions rely utterly on metallic components, from the spacecraft to electronics. Yet, metals add mass, and electronics have the additional problem of a...

  19. An End-to-End Compression Framework Based on Convolutional Neural Networks

    OpenAIRE

    Jiang, Feng; Tao, Wen; Liu, Shaohui; Ren, Jie; Guo, Xun; Zhao, Debin

    2017-01-01

    Deep learning, e.g., convolutional neural networks (CNNs), has achieved great success in image processing and computer vision especially in high level vision applications such as recognition and understanding. However, it is rarely used to solve low-level vision problems such as image compression studied in this paper. Here, we move forward a step and propose a novel compression framework based on CNNs. To achieve high-quality image compression at low bit rates, two CNNs are seamlessly integr...

  20. Ubiquitous Monitoring Solution for Wireless Sensor Networks with Push Notifications and End-to-End Connectivity

    Directory of Open Access Journals (Sweden)

    Luis M. L. Oliveira

    2014-01-01

    Full Text Available Wireless Sensor Networks (WSNs) belong to a new trend in technology in which tiny and resource-constrained devices are wirelessly interconnected and are able to interact with the surrounding environment by collecting data such as temperature and humidity. Recently, due to the huge growth in the use of mobile devices with Internet connection, smartphones are becoming the center of future ubiquitous wireless networks. Interconnecting WSNs with smartphones and the Internet is a big challenge, and new architectures are required due to the heterogeneity of these devices. Taking into account that people are using smartphones with Internet connection, there is a good opportunity to propose a new architecture for wireless sensor monitoring using push notifications and smartphones. This paper therefore proposes a ubiquitous approach for WSN monitoring based on a REST Web Service, a relational database, and an Android mobile application. Real-time data sensed by WSNs are sent directly to a smartphone or stored in a database and requested by the mobile application using a well-defined RESTful interface. A push notification system was created in order to alert mobile users when a sensor parameter exceeds a given threshold. The proposed architecture and mobile application were evaluated and validated using a laboratory WSN testbed and are ready for use.
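
The threshold-based alerting rule described above (push a notification when a sensed parameter exceeds its limit) reduces to a small server-side check. The parameter names and limits here are hypothetical, and the actual push delivery to the Android client is stubbed out as a returned list of alerts.

```python
def check_thresholds(reading, thresholds):
    """Return (parameter, value, limit) triples that should trigger a push
    notification for one sensor reading. Fields without a configured
    threshold (e.g. node identifiers) are ignored."""
    alerts = []
    for param, value in reading.items():
        limit = thresholds.get(param)
        if limit is not None and value > limit:
            alerts.append((param, value, limit))
    return alerts

# Hypothetical limits and one reading arriving over the RESTful interface.
THRESHOLDS = {"temperature_c": 35.0, "humidity_pct": 90.0}
reading = {"node_id": 7, "temperature_c": 38.2, "humidity_pct": 61.0}
alerts = check_thresholds(reading, THRESHOLDS)  # temperature exceeds 35.0
```

In the architecture the abstract describes, each returned triple would be handed to the push-notification service rather than collected in a list.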

  1. LISA Pathfinder E2E performance simulation: optical and self-gravity stability analysis

    Science.gov (United States)

    Brandt, N.; Fichter, W.; Kersten, M.; Lucarelli, S.; Montemurro, F.

    2005-05-01

    End-to-end (E2E) modelling and simulation, i.e. verifying the science performance of LISA Pathfinder (spacecraft and payload), is mandatory in order to minimize mission risks. In this paper, focus is on two particular applications of the E2E performance simulator currently being developed at EADS Astrium GmbH: the opto-dynamical stability and the self-gravity disturbance stability analysis. The E2E models applied here comprise the opto-dynamical modelling of the optical metrology systems (OMS) laser interferometry, the thermo-elastic distortion modelling of the OMS optical elements and the self-gravity disturbance model accounting for structural distortions. Preliminary analysis results are presented in detail, identifying shortcomings of the current LISA technology package (LTP) mounting baseline. As a consequence, the design is now being revised.

  2. LISA Pathfinder E2E performance simulation: optical and self-gravity stability analysis

    International Nuclear Information System (INIS)

    Brandt, N; Fichter, W; Kersten, M; Lucarelli, S; Montemurro, F

    2005-01-01

    End-to-end (E2E) modelling and simulation, i.e. verifying the science performance of LISA Pathfinder (spacecraft and payload), is mandatory in order to minimize mission risks. In this paper, focus is on two particular applications of the E2E performance simulator currently being developed at EADS Astrium GmbH: the opto-dynamical stability and the self-gravity disturbance stability analysis. The E2E models applied here comprise the opto-dynamical modelling of the optical metrology systems (OMS) laser interferometry, the thermo-elastic distortion modelling of the OMS optical elements and the self-gravity disturbance model accounting for structural distortions. Preliminary analysis results are presented in detail, identifying shortcomings of the current LISA technology package (LTP) mounting baseline. As a consequence, the design is now being revised

  3. Modelling fuel cell performance using artificial intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Ogaji, S.O.T.; Singh, R.; Pilidis, P.; Diacakis, M. [Power Propulsion and Aerospace Engineering Department, Centre for Diagnostics and Life Cycle Costs, Cranfield University (United Kingdom)

    2006-03-09

    Over the last few years, fuel cell technology has been promisingly increasing its share in the generation of stationary power. Numerous pilot projects are operating worldwide, continuously increasing the amount of operating hours either as stand-alone devices or as part of gas turbine combined cycles. An essential tool for the adequate and dynamic analysis of such systems is a software model that enables the user to assess a large number of alternative options in the least possible time. On the other hand, the sphere of application of artificial neural networks has widened, covering such endeavours of life as medicine, finance and, unsurprisingly, engineering (diagnostics of faults in machines). Artificial neural networks have been described as a diagrammatic representation of a mathematical equation that receives values (inputs) and gives out results (outputs). Artificial neural network systems have the capacity to recognise and associate patterns and, because of their inherent design features, they can be applied to linear and non-linear problem domains. In this paper, the performance of the fuel cell is modelled using artificial neural networks. The inputs to the network are variables that are critical to the performance of the fuel cell, while the outputs are the result of changes in any one or all of the fuel cell design variables on its performance. Critical parameters for the cell include the geometrical configuration as well as the operating conditions. For the neural network, various network design parameters such as the network size, training algorithm and activation functions, and their effects on the effectiveness of the performance modelling, are discussed. Results from the analysis as well as the limitations of the approach are presented and discussed. (author)
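
The abstract's setup — operating variables in, cell performance out, with network size and training choices left to tune — can be illustrated with a minimal one-hidden-layer network trained by gradient descent. The "polarization" data below are synthetic and the architecture is arbitrary; a real study would train on measured fuel cell data and compare network designs as the paper does.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical surrogate task: predict cell voltage from normalized current
# density and operating temperature. The targets come from a made-up
# polarization-like relation, standing in for experimental measurements.
X = rng.uniform(0.0, 1.0, size=(200, 2))          # [current density, temperature]
y = (0.9 - 0.4 * X[:, 0] + 0.1 * X[:, 1]).reshape(-1, 1)

# One hidden layer of tanh units, trained by full-batch gradient descent.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros((1, 1))
lr = 0.1
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)            # forward pass
    pred = h @ W2 + b2
    err = pred - y
    # Backpropagation of the (half) mean-squared-error gradient.
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0, keepdims=True)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0, keepdims=True)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

mse = float(((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2).mean())
```

The hidden-layer width, learning rate, and iteration count here play the role of the "network design parameters" whose effect on modelling effectiveness the paper investigates.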

  4. Modelling fuel cell performance using artificial intelligence

    Science.gov (United States)

    Ogaji, S. O. T.; Singh, R.; Pilidis, P.; Diacakis, M.

    Over the last few years, fuel cell technology has been promisingly increasing its share in the generation of stationary power. Numerous pilot projects are operating worldwide, continuously increasing the amount of operating hours either as stand-alone devices or as part of gas turbine combined cycles. An essential tool for the adequate and dynamic analysis of such systems is a software model that enables the user to assess a large number of alternative options in the least possible time. On the other hand, the sphere of application of artificial neural networks has widened, covering such endeavours of life as medicine, finance and, unsurprisingly, engineering (diagnostics of faults in machines). Artificial neural networks have been described as a diagrammatic representation of a mathematical equation that receives values (inputs) and gives out results (outputs). Artificial neural network systems have the capacity to recognise and associate patterns and, because of their inherent design features, they can be applied to linear and non-linear problem domains. In this paper, the performance of the fuel cell is modelled using artificial neural networks. The inputs to the network are variables that are critical to the performance of the fuel cell, while the outputs are the result of changes in any one or all of the fuel cell design variables on its performance. Critical parameters for the cell include the geometrical configuration as well as the operating conditions. For the neural network, various network design parameters such as the network size, training algorithm and activation functions, and their effects on the effectiveness of the performance modelling, are discussed. Results from the analysis as well as the limitations of the approach are presented and discussed.

  5. Modelling the predictive performance of credit scoring

    Directory of Open Access Journals (Sweden)

    Shi-Wei Shen

    2013-07-01

    Research purpose: The purpose of this empirical paper was to examine the predictive performance of credit scoring systems in Taiwan. Motivation for the study: Corporate lending remains a major business line for financial institutions. However, in light of the recent global financial crises, it has become extremely important for financial institutions to implement rigorous means of assessing clients seeking access to credit facilities. Research design, approach and method: Using a data sample of 10 349 observations drawn between 1992 and 2010, logistic regression models were utilised to examine the predictive performance of credit scoring systems. Main findings: A test of goodness of fit demonstrated that credit scoring models that incorporated the Taiwan Corporate Credit Risk Index (TCRI) and micro- as well as macroeconomic variables possessed greater predictive power. This suggests that macroeconomic variables do have explanatory power for default credit risk. Practical/managerial implications: The originality of the study is that three models were developed to predict corporate firms' defaults based on different microeconomic and macroeconomic factors such as the TCRI, asset growth rates, stock index and gross domestic product. Contribution/value-add: The study utilises different goodness-of-fit measures and receiver operating characteristics during the examination of the robustness of the predictive power of these factors.
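
The modelling approach described — logistic regression of default against firm-level and macroeconomic covariates — can be sketched in plain Python. The two features (leverage, asset growth) and the synthetic data-generating process below are hypothetical stand-ins for the paper's TCRI and macroeconomic variables.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(rows, labels, lr=0.5, epochs=500):
    """Plain full-batch gradient-descent logistic regression (an illustrative
    stand-in for the paper's estimated credit scoring models)."""
    n_feat = len(rows[0])
    w = [0.0] * n_feat
    b = 0.0
    for _ in range(epochs):
        gw = [0.0] * n_feat
        gb = 0.0
        for x, label in zip(rows, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            for i in range(n_feat):
                gw[i] += (p - label) * x[i]     # gradient of the log-loss
            gb += p - label
        w = [wi - lr * gi / len(rows) for wi, gi in zip(w, gw)]
        b -= lr * gb / len(rows)
    return w, b

# Synthetic sample: default probability rises with leverage, falls with growth.
random.seed(1)
rows, labels = [], []
for _ in range(400):
    leverage = random.uniform(0, 1)
    growth = random.uniform(0, 1)
    p_default = sigmoid(4 * leverage - 4 * growth)
    rows.append([leverage, growth])
    labels.append(1 if random.random() < p_default else 0)

w, b = fit_logistic(rows, labels)  # w recovers the signs of the true effects
```

The fitted coefficient signs are what a credit analyst reads off such a model: a positive weight on leverage and a negative weight on growth mark them as risk-increasing and risk-decreasing factors respectively.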

  6. High Performance Modeling of Novel Diagnostics Configuration

    Science.gov (United States)

    Smith, Dalton; Gibson, John; Lodes, Rylie; Malcolm, Hayden; Nakamoto, Teagan; Parrack, Kristina; Trujillo, Christopher; Wilde, Zak; Los Alamos Laboratories Q-6 Students Team

    2017-06-01

    A novel diagnostics method to measure the Hayes Electric Effect was tested and verified against computerized models. Where standard PVDF diagnostics utilize piezoelectric materials to measure detonation pressure through strain-induced electrical signals, here the PVDF was used in a novel technique that also detects the detonation's induced electric field. The ALE-3D hydrocode predicted the performance by calculating detonation velocities, pressures, and arrival times. These theoretical results then validated the experimental use of the PVDF repurposed specifically to track the Hayes Electric Effect. Los Alamos National Laboratories Q-6.

  7. A Combat Mission Team Performance Model: Development and initial Application

    National Research Council Canada - National Science Library

    Silverman, Denise

    1997-01-01

    ... realistic combat scenarios. We present a conceptual model of team performance measurement in which aircrew coordination, team performance, mission performance and their interrelationships are operationally defined...

  8. The COD Model: Simulating Workgroup Performance

    Science.gov (United States)

    Biggiero, Lucio; Sevi, Enrico

    Though the question of the determinants of workgroup performance is one of the most central in organization science, precise theoretical frameworks and formal demonstrations are still missing. To fill this gap, the COD agent-based simulation model is presented here and used to study the effects of task interdependence and bounded rationality on workgroup performance. The first relevant finding is an algorithmic demonstration of the ordering of interdependencies in terms of complexity, showing that the parallel mode is the simplest, followed by the sequential and then by the reciprocal. This result is far from new in organization science, but what is remarkable is that it now has the strength of an algorithmic demonstration instead of resting on the authoritativeness of some scholar or on episodic empirical findings. The second important result is that the progressive introduction of realistic limits to agents' rationality dramatically reduces workgroup performance and leads to a rather interesting result: when agents' rationality is severely bounded, simple norms work better than complex norms. The third main finding is that when the complexity of interdependence is high, the appropriate coordination mechanism is agents' direct and active collaboration, which means teamwork.

  9. Retrosynthetic Reaction Prediction Using Neural Sequence-to-Sequence Models.

    Science.gov (United States)

    Liu, Bowen; Ramsundar, Bharath; Kawthekar, Prasad; Shi, Jade; Gomes, Joseph; Luu Nguyen, Quang; Ho, Stephen; Sloane, Jack; Wender, Paul; Pande, Vijay

    2017-10-25

    We describe a fully data-driven model that learns to perform a retrosynthetic reaction prediction task, which is treated as a sequence-to-sequence mapping problem. The end-to-end trained model has an encoder-decoder architecture consisting of two recurrent neural networks, an architecture that has previously shown great success in solving other sequence-to-sequence prediction tasks such as machine translation. The model is trained on 50,000 experimental reaction examples from the United States patent literature, which span 10 broad reaction types that are commonly used by medicinal chemists. We find that our model performs comparably with a rule-based expert system baseline model, and also overcomes certain limitations associated with rule-based expert systems and with any machine learning approach that contains a rule-based expert system component. Our model provides an important first step toward solving the challenging problem of computational retrosynthetic analysis.

  10. Model for measuring complex performance in an aviation environment

    International Nuclear Information System (INIS)

    Hahn, H.A.

    1988-01-01

    An experiment was conducted to identify models of pilot performance through the attainment and analysis of concurrent verbal protocols. Sixteen models were identified. Novice and expert pilots differed with respect to the models they used. Models were correlated to performance, particularly in the case of expert subjects. Models were not correlated to performance shaping factors (i.e. workload). 3 refs., 1 tab

  11. Numerical modeling capabilities to predict repository performance

    International Nuclear Information System (INIS)

    1979-09-01

    This report presents a summary of current numerical modeling capabilities that are applicable to the design and performance evaluation of underground repositories for the storage of nuclear waste. The report includes codes that are available in-house, within Golder Associates and Lawrence Livermore Laboratories, as well as those that are generally available within industry and universities. The first listing of programs covers in-house codes in the subject areas of hydrology, solute transport, thermal and mechanical stress analysis, and structural geology. The second listing of programs is divided by subject into the following categories: site selection, structural geology, mine structural design, mine ventilation, hydrology, and mine design/construction/operation. These programs are not specifically designed for use in the design and evaluation of an underground repository for nuclear waste, but several or most of them may be so used.

  12. Performance Analysis of Mixed Nakagami- m and Gamma–Gamma Dual-Hop FSO Transmission Systems

    KAUST Repository

    Zedini, Emna

    2015-02-01

    In this paper, we carry out a unified performance analysis of a dual-hop relay system over the asymmetric links composed of both radio-frequency (RF) and unified free-space optical (FSO) links under the effect of pointing errors. Both fixed and variable gain relay systems are studied. The RF link is modeled by the Nakagami-m fading channel and the FSO link by the Gamma-Gamma fading channel subject to both types of detection techniques (i.e., heterodyne detection and intensity modulation with direct detection). In particular, we derive new unified closed-form expressions for the cumulative distribution function, the probability density function, the moment generating function (MGF), and the moments of the end-to-end signal-to-noise ratio (SNR) of these systems in terms of the Meijer's G function. Based on these formulas, we offer exact closed-form expressions for the outage probability (OP), the higher order amount of fading, and the average bit error rate (BER) of a variety of binary modulations in terms of the Meijer's G function. Furthermore, an exact closed-form expression of the end-to-end ergodic capacity is derived in terms of the bivariate G function. Additionally, by using the asymptotic expansion of the Meijer's G function at the high-SNR regime, we derive new asymptotic results for the OP, the MGF, and the average BER in terms of simple elementary functions.
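
    Closed-form results like these can be cross-checked numerically. The sketch below is a Monte Carlo stand-in, not the paper's Meijer's G-function expressions: hop 1 SNR is gamma-distributed (Nakagami-m fading), hop 2 irradiance is the product of two unit-mean gamma variates (the standard Gamma-Gamma construction, with heterodyne detection assumed so SNR scales linearly with irradiance), and the usual variable-gain end-to-end SNR bound γ₁γ₂/(γ₁+γ₂+1) is applied. All parameter values are hypothetical.

```python
import random

def gamma_gamma_sample(alpha, beta):
    """Gamma-Gamma irradiance: product of two unit-mean gamma variates."""
    return (random.gammavariate(alpha, 1.0 / alpha) *
            random.gammavariate(beta, 1.0 / beta))

def outage_probability(avg_snr1, avg_snr2, m, alpha, beta, gamma_th,
                       trials=20000):
    """Monte Carlo outage of a CSI-assisted (variable gain) dual-hop link.

    Hop 1: Nakagami-m fading, so SNR is gamma-distributed with shape m.
    Hop 2: Gamma-Gamma turbulence, SNR linear in irradiance (heterodyne).
    """
    outages = 0
    for _ in range(trials):
        g1 = random.gammavariate(m, avg_snr1 / m)
        g2 = avg_snr2 * gamma_gamma_sample(alpha, beta)
        g_e2e = g1 * g2 / (g1 + g2 + 1.0)  # variable-gain end-to-end SNR
        if g_e2e < gamma_th:
            outages += 1
    return outages / trials

random.seed(0)
p_low = outage_probability(10.0, 10.0, m=2, alpha=4.0, beta=2.0, gamma_th=1.0)
p_high = outage_probability(100.0, 100.0, m=2, alpha=4.0, beta=2.0, gamma_th=1.0)
```

    Raising the average SNR of both hops by 10 dB should push the simulated outage probability down, mirroring the high-SNR asymptotics the paper derives analytically.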

  13. Dengue human infection model performance parameters.

    Science.gov (United States)

    Endy, Timothy P

    2014-06-15

    Dengue is a global health problem and of concern to travelers and deploying military personnel with development and licensure of an effective tetravalent dengue vaccine a public health priority. The dengue viruses (DENVs) are mosquito-borne flaviviruses transmitted by infected Aedes mosquitoes. Illness manifests across a clinical spectrum with severe disease characterized by intravascular volume depletion and hemorrhage. DENV illness results from a complex interaction of viral properties and host immune responses. Dengue vaccine development efforts are challenged by immunologic complexity, lack of an adequate animal model of disease, absence of an immune correlate of protection, and only partially informative immunogenicity assays. A dengue human infection model (DHIM) will be an essential tool in developing potential dengue vaccines or antivirals. The potential performance parameters needed for a DHIM to support vaccine or antiviral candidates are discussed. © The Author 2014. Published by Oxford University Press on behalf of the Infectious Diseases Society of America. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  14. High-performance phase-field modeling

    KAUST Repository

    Vignal, Philippe

    2015-04-27

    Many processes in engineering and the sciences involve the evolution of interfaces. Among the mathematical frameworks developed to model these types of problems, the phase-field method has emerged as a possible solution. Phase-fields nonetheless lead to complex nonlinear, high-order partial differential equations, whose solution poses mathematical and computational challenges. Guaranteeing some of the physical properties of the equations has led to the development of efficient algorithms and discretizations capable of recovering said properties by construction [2, 5]. This work builds on these ideas and proposes novel discretization strategies that guarantee numerical energy dissipation for both conserved and non-conserved phase-field models. The temporal discretization is based on a novel method which relies on Taylor series and ensures strong energy stability. It is second-order accurate, and can also be rendered linear to speed up the solution process [4]. The spatial discretization relies on Isogeometric Analysis, a finite element method that possesses the k-refinement technology and enables the generation of high-order, high-continuity basis functions. These basis functions are well suited to handle the high-order operators present in phase-field models. Two-dimensional and three-dimensional results of the Allen-Cahn, Cahn-Hilliard, Swift-Hohenberg and phase-field crystal equations will be presented, which corroborate the theoretical findings and illustrate the robustness of the method. Results related to more challenging examples, namely the Navier-Stokes Cahn-Hilliard and a diffusion-reaction Cahn-Hilliard system, will also be presented. The implementation was done in PetIGA and PetIGA-MF, high-performance Isogeometric Analysis frameworks [1, 3], designed to handle non-linear, time-dependent problems.
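
    The energy-dissipation property at the heart of this work can be illustrated with a far simpler scheme than the paper's Taylor-series/Isogeometric method: a 1-D Allen-Cahn equation discretized with centered differences and explicit Euler. With a small enough time step, the discrete Ginzburg-Landau energy decreases over time. Grid size, ε and dt below are arbitrary illustrative choices.

```python
import math

def allen_cahn_energy(u, h, eps):
    """Discrete Ginzburg-Landau free energy with periodic boundaries."""
    n = len(u)
    E = 0.0
    for i in range(n):
        du = (u[(i + 1) % n] - u[i]) / h
        E += (0.5 * eps**2 * du**2 + 0.25 * (u[i]**2 - 1.0)**2) * h
    return E

def allen_cahn_step(u, h, eps, dt):
    """One explicit Euler step of u_t = eps^2 u_xx - (u^3 - u)."""
    n = len(u)
    return [u[i] + dt * (eps**2 * (u[(i + 1) % n] - 2 * u[i]
                                   + u[(i - 1) % n]) / h**2
                         - (u[i]**3 - u[i]))
            for i in range(n)]

n, L, eps, dt = 64, 1.0, 0.05, 1e-4
h = L / n
u = [math.sin(2 * math.pi * i * h) for i in range(n)]
energies = [allen_cahn_energy(u, h, eps)]
for _ in range(200):
    u = allen_cahn_step(u, h, eps, dt)
    energies.append(allen_cahn_energy(u, h, eps))
```

    Explicit Euler only dissipates energy under a time-step restriction; the schemes the abstract describes are built so that dissipation holds by construction, without such a restriction.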

  15. Delay and cost performance analysis of the diffie-hellman key exchange protocol in opportunistic mobile networks

    Science.gov (United States)

    Soelistijanto, B.; Muliadi, V.

    2018-03-01

    Diffie-Hellman (DH) provides an efficient key exchange system by reducing the number of cryptographic keys distributed in the network. In this method, a node broadcasts a single public key to all nodes in the network, and in turn each peer uses this key to establish a shared secret key which then can be utilized to encrypt and decrypt traffic between the peer and the given node. In this paper, we evaluate the key transfer delay and cost performance of DH in opportunistic mobile networks, a specific scenario of MANETs where complete end-to-end paths rarely exist between sources and destinations; consequently, the end-to-end delays in these networks are much greater than typical MANETs. Simulation results, driven by a random node movement model and real human mobility traces, showed that DH outperforms a typical key distribution scheme based on the RSA algorithm in terms of key transfer delay, measured by average key convergence time; however, DH performs as well as the benchmark in terms of key transfer cost, evaluated by total key (copies) forwards.
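
    The exchange described above — each node broadcasts one public key, and every peer derives a pairwise shared secret from it — reduces to modular exponentiation. The sketch below uses a toy 127-bit Mersenne prime so it runs instantly; real deployments use standardized groups of 2048 bits or more.

```python
import secrets

# Toy group: a 127-bit Mersenne prime and a small generator. Illustration
# only -- real deployments use standardized groups of 2048+ bits.
P = 2**127 - 1
G = 5

def dh_keypair():
    """Ephemeral private exponent plus the public key a node would broadcast."""
    priv = secrets.randbelow(P - 3) + 2
    return priv, pow(G, priv, P)

def dh_shared(priv, peer_pub):
    """Shared secret from our private key and a peer's broadcast public key."""
    return pow(peer_pub, priv, P)

a_priv, a_pub = dh_keypair()
b_priv, b_pub = dh_keypair()
# Both sides derive the same secret without ever transmitting it.
assert dh_shared(a_priv, b_pub) == dh_shared(b_priv, a_pub)
```

    This is why DH suits opportunistic networks: only the single public key per node must survive the long, unreliable end-to-end path, while each pairwise secret is computed locally.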

  16. DETRA: Model description and evaluation of model performance

    International Nuclear Information System (INIS)

    Suolanen, V.

    1996-01-01

    The computer code DETRA is a generic tool for environmental transfer analyses of radioactive or stable substances. The code has been applied for various purposes, mainly problems related to the biospheric transfer of radionuclides both in safety analyses of disposal of nuclear wastes and in consideration of foodchain exposure pathways in the analyses of off-site consequences of reactor accidents. For each specific application an individually tailored conceptual model can be developed. The biospheric transfer analyses performed by the code are typically carried out for terrestrial, aquatic and food chain applications. 21 refs, 35 figs, 15 tabs

  17. Multivisceral transplantation in pigs: a model for research and training

    Directory of Open Access Journals (Sweden)

    André Ibrahim David

    2011-09-01

    Full Text Available Objective: To present a model for research and training in multivisceral transplantation in pigs. Methods: Eight Large White pigs (four donors and four recipients) were operated. The multivisceral transplant with stomach, duodenum, pancreas, liver and intestine was performed similarly to transplantation in humans, with a few differences, described below. Anastomoses were performed as follows: end-to-end from the supra-hepatic vena cava of the graft to the recipient juxta-diaphragmatic vena cava; end-to-end from the infra-hepatic vena cava of the graft to the inferior (suprarenal) vena cava of the recipient; and end-to-side patch of the aorta of the graft to the infrarenal aorta of the recipient, plus digestive reconstruction. Results: The performance of the multivisceral transplantation was possible in all four animals. Reperfusion of the multivisceral graft led to a severe ischemia-reperfusion syndrome, despite flushing of the graft. The animals presented with hypotension and the need for high doses of vasoactive drugs, and all of them were sacrificed after discontinuing these drugs. Conclusion: Some alternatives to minimize the ischemia-reperfusion syndrome, such as the use of another vasoactive drug, use of a third pig merely for blood transfusion, presence of an anesthesia team in the operating room, and reduction of the graft, will be the next steps to enable experimental studies.

  18. The landscape of GPGPU performance modeling tools

    NARCIS (Netherlands)

    Madougou, S.; Varbanescu, A.; de Laat, C.; van Nieuwpoort, R.

    GPUs are gaining fast adoption as high-performance computing architectures, mainly because of their impressive peak performance. Yet most applications only achieve small fractions of this performance. While both programmers and architects have clear opinions about the causes of this performance gap,

  19. Large-scale, high-performance and cloud-enabled multi-model analytics experiments in the context of the Earth System Grid Federation

    Science.gov (United States)

    Fiore, S.; Płóciennik, M.; Doutriaux, C.; Blanquer, I.; Barbera, R.; Williams, D. N.; Anantharaj, V. G.; Evans, B. J. K.; Salomoni, D.; Aloisio, G.

    2017-12-01

    At the time this contribution is being written, the proposed testbed represents the first implementation of a distributed large-scale, multi-model experiment in the ESGF/CMIP context, joining together server-side approaches for scientific data analysis, HPDA frameworks, end-to-end workflow management, and cloud computing.

  20. The Five Key Questions of Human Performance Modeling.

    Science.gov (United States)

    Wu, Changxu

    2018-01-01

    By building computational (typically mathematical and computer simulation) models, human performance modeling (HPM) quantifies, predicts, and maximizes human performance, human-machine system productivity and safety. This paper describes and summarizes the five key questions of human performance modeling: 1) Why do we build models of human performance? 2) What are the expectations of a good human performance model? 3) What are the procedures and requirements for building and verifying a human performance model? 4) How do we integrate a human performance model with system design? 5) What are the possible future directions of human performance modeling research? Recent and classic HPM findings are addressed within the five questions to provide new thinking on HPM's motivations, expectations, procedures, system integration and future directions.

  1. PV Performance Modeling Methods and Practices: Results from the 4th PV Performance Modeling Collaborative Workshop.

    Energy Technology Data Exchange (ETDEWEB)

    Stein, Joshua [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-03-01

    In 2014, the IEA PVPS Task 13 added the PVPMC as a formal activity to its technical work plan for 2014-2017. The goal of this activity is to expand the reach of the PVPMC to a broader international audience and help to reduce PV performance modeling uncertainties worldwide. One of the main deliverables of this activity is to host one or more PVPMC workshops outside the US to foster more international participation within this collaborative group. This report reviews the results of the first in a series of these joint IEA PVPS Task 13/PVPMC workshops. The 4th PV Performance Modeling Collaborative Workshop was held in Cologne, Germany at the headquarters of TÜV Rheinland on October 22-23, 2015.

  2. Modelling and simulating fire tube boiler performance

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    A model for a flue gas boiler covering the flue gas and the water-/steam side has been formulated. The model has been formulated as a number of sub models that are merged into an overall model for the complete boiler. Sub models have been defined for the furnace, the convection zone (split in two: a zone submerged in water and a zone covered by steam), a model for the material in the boiler (the steel) and two models for, respectively, the water/steam zone (the boiling) and the steam. The dynamic model has been developed as a number of Differential-Algebraic-Equation systems (DAE). Subsequently MatLab/Simulink has been applied for carrying out the simulations. To be able to verify the simulated results, experiments have been carried out on a full scale boiler plant.

  3. Modeling the marketing strategy-performance relationship : towards an hierarchical marketing performance framework

    OpenAIRE

    Huizingh, Eelko K.R.E.; Zengerink, Evelien

    2001-01-01

    Accurate measurement of marketing performance is an important topic for both marketing academics and marketing managers. Many researchers have recognized that marketing performance measurement should go beyond financial measurement. In this paper we propose a conceptual framework that models marketing performance as a sequence of intermediate performance measures ultimately leading to financial performance. This framework, called the Hierarchical Marketing Performance (HMP) framework, starts ...

  5. Models and criteria for waste repository performance

    International Nuclear Information System (INIS)

    Smith, C.F.; Cohen, J.J.

    1981-03-01

    A primary objective of the Waste Management Program is to assure that public health is protected. Predictive modeling, to some extent, will play a role in assuring that this objective is met. This paper considers the requirements and limitations of predictive modeling in providing useful inputs to waste management decision making. Criteria development needs and the relation between criteria and models are also discussed

  6. Unified Performance Analysis of Mixed Line of Sight RF-FSO Fixed Gain Dual-Hop Transmission Systems

    KAUST Repository

    Zedini, Emna

    2014-04-03

    In this work, we carry out a unified performance analysis of a dual-hop fixed gain relay system over asymmetric links composed of both radio-frequency (RF) and unified free-space optics (FSO) under the effect of pointing errors. The RF link is modeled by the Nakagami-m fading channel and the FSO link by the Gamma-Gamma fading channel subject to both types of detection techniques (i.e. heterodyne detection and intensity modulation with direct detection (IM/DD)). In particular, we derive new unified closed-form expressions for the cumulative distribution function, the probability density function, the moment generating function, and the moments of the end-to-end signal-to-noise ratio of these systems in terms of the Meijer’s G function. Based on these formulas, we offer exact closed-form expressions for the outage probability, the higher-order amount of fading, and the average bit-error rate of a variety of binary modulations in terms of the Meijer’s G function. Further, an exact closed-form expression for the end-to-end ergodic capacity for the Nakagami-m-unified FSO relay links is derived in terms of the bivariate G function. All the given results are verified via computer-based Monte Carlo simulations.

  7. On the performance of dual-hop mixed RF/FSO wireless communication system in urban area over aggregated exponentiated Weibull fading channels with pointing errors

    Science.gov (United States)

    Wang, Yue; Wang, Ping; Liu, Xiaoxia; Cao, Tian

    2018-03-01

    The performance of decode-and-forward dual-hop mixed radio frequency / free-space optical system in urban area is studied. The RF link is modeled by the Nakagami-m distribution and the FSO link is described by the composite exponentiated Weibull (EW) fading channels with nonzero boresight pointing errors (NBPE). For comparison, the ABER results without pointing errors (PE) and those with zero boresight pointing errors (ZBPE) are also provided. The closed-form expression for the average bit error rate (ABER) in RF link is derived with the help of hypergeometric function, and that in FSO link is obtained by Meijer's G and generalized Gauss-Laguerre quadrature functions. Then, the end-to-end ABERs with binary phase shift keying modulation are achieved on the basis of the computed ABER results of RF and FSO links. The end-to-end ABER performance is further analyzed with different Nakagami-m parameters, turbulence strengths, receiver aperture sizes and boresight displacements. The result shows that with ZBPE and NBPE considered, FSO link suffers a severe ABER degradation and becomes the dominant limitation of the mixed RF/FSO system in urban area. However, aperture averaging can bring significant ABER improvement of this system. Monte Carlo simulation is provided to confirm the validity of the analytical ABER expressions.

  8. Models and criteria for LLW disposal performance

    International Nuclear Information System (INIS)

    Smith, C.F.; Cohen, J.J.

    1980-12-01

    A primary objective of the Low Level Waste (LLW) Management Program is to assure that public health is protected. Predictive modeling, to some extent, will play a role in meeting this objective. This paper considers the requirements and limitations of predictive modeling in providing useful inputs to waste management decision making. In addition, criteria development needs and the relation between criteria and models are discussed

  9. Models and criteria for LLW disposal performance

    International Nuclear Information System (INIS)

    Smith, C.F.; Cohen, J.J.

    1980-01-01

    A primary objective of the Low Level Waste (LLW) Management Program is to assure that public health is protected. Predictive modeling, to some extent, will play a role in meeting this objective. This paper considers the requirements and limitations of predictive modeling in providing useful inputs to waste management decision making. In addition, criteria development needs and the relation between criteria and models are discussed

  10. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    Science.gov (United States)

    The model performance evaluation consists of metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude-only, sequence-only, and combined magnitude and sequence errors.
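
    One simple way to realize a magnitude-only versus sequence-only split (an illustrative construction, not necessarily MPESA's exact algorithm) is to compare the sorted series, which ignores timing, and attribute the remainder of the total squared error to sequencing:

```python
def mse(a, b):
    """Mean squared error between two equal-length series."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def error_decomposition(obs, sim):
    """Split total MSE into a magnitude part (sorted series, timing ignored)
    and a residual sequence part (timing/ordering mismatch)."""
    total = mse(obs, sim)
    magnitude = mse(sorted(obs), sorted(sim))
    return {"total": total, "magnitude": magnitude,
            "sequence": max(total - magnitude, 0.0)}

obs = [1.0, 3.0, 2.0, 5.0]
sim_shifted = [2.0, 4.0, 3.0, 6.0]    # correct ordering, biased magnitudes
sim_scrambled = [5.0, 2.0, 3.0, 1.0]  # correct values, wrong timing
d_bias = error_decomposition(obs, sim_shifted)
d_timing = error_decomposition(obs, sim_scrambled)
```

    The two hypothetical simulations isolate the two error types: the uniformly shifted series has pure magnitude error, while the permuted series has pure sequence error even though its values are exactly right.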

  11. Modelling Flexible Pavement Response and Performance

    DEFF Research Database (Denmark)

    Ullidtz, Per

    This textbook is primarily concerned with models for predicting the future condition of flexible pavements, as a function of traffic loading, climate, materials, etc., using analytical-empirical methods.

  12. HANDOVER MANAGEABILITY AND PERFORMANCE MODELING IN

    African Journals Online (AJOL)

    SOFTLINKS DIGITAL

    This work develops a model for interpreting implementation progress. The proposed progress monitoring model uses existing implementation artefact metrics, tries .... determine implementation velocity. As noted by McConnell [28], this velocity increases at the beginning and decreases near the end. A formal implementation.

  13. Modeling, simulation and performance evaluation of parabolic ...

    African Journals Online (AJOL)

    Model of a parabolic trough power plant, taking into consideration the different losses associated with collection of the solar irradiance and thermal losses is presented. MATLAB software is employed to model the power plant at reference state points. The code is then used to find the different reference values which are ...

  14. Detailed Performance Model for Photovoltaic Systems: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Tian, H.; Mancilla-David, F.; Ellis, K.; Muljadi, E.; Jenkins, P.

    2012-07-01

    This paper presents a modified current-voltage relationship for the single diode model. The single-diode model has been derived from the well-known equivalent circuit for a single photovoltaic cell. The modification presented in this paper accounts for both parallel and series connections in an array.
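
    The single-diode equation is implicit in the current, so solving it requires iteration; series/parallel array scaling then divides the voltage across the series cells and multiplies the current by the parallel strings. The sketch below uses plain fixed-point iteration and hypothetical cell parameters; it does not reproduce this paper's specific modification.

```python
import math

def cell_current(v, iph=8.0, i0=1e-9, n=1.3, vt=0.02585, rs=0.005,
                 rsh=100.0, iters=60):
    """Fixed-point solve of the implicit single-diode equation for one cell:
    i = iph - i0*(exp((v + i*rs)/(n*vt)) - 1) - (v + i*rs)/rsh.
    Parameter values are hypothetical; vt is the thermal voltage at ~300 K."""
    i = iph
    for _ in range(iters):
        i = (iph - i0 * math.expm1((v + i * rs) / (n * vt))
             - (v + i * rs) / rsh)
    return i

def array_current(v_array, ns=36, np_=2, **cell):
    """Series cells divide the voltage; parallel strings multiply the current."""
    return np_ * cell_current(v_array / ns, **cell)

i_sc = cell_current(0.0)    # short-circuit current of one cell
i_mid = cell_current(0.6)   # current near the knee of the I-V curve
```

    Fixed-point iteration converges for operating points below open-circuit voltage with these parameters; a production solver would use Newton's method or the Lambert-W reformulation instead.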

  15. Model for Agile Software Development Performance Monitoring

    OpenAIRE

    Žabkar, Nataša

    2013-01-01

    Agile methodologies have been in use for more than ten years and during this time they have proved to be efficient, even though empirical research is scarce, especially regarding agile software development performance monitoring. The most popular agile framework, Scrum, uses only one measure of performance: the amount of work remaining for implementation of a User Story from the Product Backlog or for implementation of a Task from the Sprint Backlog. In time the need for additional me...

  16. Modeling and optimization of LCD optical performance

    CERN Document Server

    Yakovlev, Dmitry A; Kwok, Hoi-Sing

    2015-01-01

    The aim of this book is to present the theoretical foundations of modeling the optical characteristics of liquid crystal displays, critically reviewing modern modeling methods and examining areas of applicability. The modern matrix formalisms of optics of anisotropic stratified media, most convenient for solving problems of numerical modeling and optimization of LCD, will be considered in detail. The benefits of combined use of the matrix methods will be shown, which generally provides the best compromise between physical adequacy and accuracy with computational efficiency and optimization fac

  17. A unified tool for performance modelling and prediction

    International Nuclear Information System (INIS)

    Gilmore, Stephen; Kloul, Leila

    2005-01-01

    We describe a novel performability modelling approach, which facilitates the efficient solution of performance models extracted from high-level descriptions of systems. The notation which we use for our high-level designs is the Unified Modelling Language (UML) graphical modelling language. The technology which provides the efficient representation capability for the underlying performance model is the multi-terminal binary decision diagram (MTBDD)-based PRISM probabilistic model checker. The UML models are compiled through an intermediate language, the stochastic process algebra PEPA, before translation into MTBDDs for solution. We illustrate our approach on a real-world analysis problem from the domain of mobile telephony

  18. Integrated thermodynamic model for ignition target performance

    Directory of Open Access Journals (Sweden)

    Springer P.T.

    2013-11-01

    Full Text Available We have derived a 3-dimensional synthetic model for NIF implosion conditions, by predicting and optimizing fits to a broad set of x-ray and nuclear diagnostics obtained on each shot. By matching x-ray images, burn width, neutron time-of-flight ion temperature, yield, and fuel ρr, we obtain nearly unique constraints on conditions in the hotspot and fuel in a model that is entirely consistent with the observables. This model allows us to determine hotspot density, pressure, areal density (ρr), total energy, and other ignition-relevant parameters not available from any single diagnostic. This article describes the model and its application to National Ignition Facility (NIF) tritium–hydrogen–deuterium (THD) and DT implosion data, and provides an explanation for the large yield and ρr degradation compared to numerical code predictions.

  19. Mathematical Modeling of Circadian/Performance Countermeasures

    Data.gov (United States)

    National Aeronautics and Space Administration — We developed and refined our current mathematical model of circadian rhythms to incorporate melatonin as a marker rhythm. We used an existing physiologically based...

  20. Hydrologic Evaluation of Landfill Performance (HELP) Model

    Science.gov (United States)

    The program models rainfall, runoff, infiltration, and other water pathways to estimate how much water builds up above each landfill liner. It can incorporate data on vegetation, soil types, geosynthetic materials, initial moisture conditions, slopes, etc.
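
    The water-routing idea — rainfall split into runoff and infiltration, storage depleted by evapotranspiration, the excess percolating downward — can be caricatured as a daily bucket model. This is a toy sketch of the concept only, with invented parameters; the actual HELP model is far more detailed (vegetation, soil layers, geosynthetics, slopes).

```python
def daily_water_balance(rain_series, storage=50.0, capacity=100.0,
                        runoff_coef=0.3, et_rate=2.0, drain_rate=0.1):
    """Toy bucket model in the spirit of landfill water routing: each day a
    fraction of rainfall runs off, the rest infiltrates into storage, storage
    loses a fixed evapotranspiration amount, and a fraction of the remaining
    water drains (percolates) toward the liner. All parameters invented."""
    percolation = []
    for rain in rain_series:
        runoff = runoff_coef * rain
        storage += rain - runoff          # infiltration
        storage = max(storage - et_rate, 0.0)  # evapotranspiration
        drained = drain_rate * storage    # percolation toward the liner
        storage -= drained
        storage = min(storage, capacity)  # bucket cannot overfill
        percolation.append(drained)
    return percolation, storage

perc, final_storage = daily_water_balance([0.0, 10.0, 25.0, 0.0, 5.0])
```

    The daily percolation series is the quantity of interest for liner design: it is the water load that builds up above each landfill liner.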

  1. Evaluating Models of Human Performance: Safety-Critical Systems Applications

    Science.gov (United States)

    Feary, Michael S.

    2012-01-01

    This presentation is part of panel discussion on Evaluating Models of Human Performance. The purpose of this panel is to discuss the increasing use of models in the world today and specifically focus on how to describe and evaluate models of human performance. My presentation will focus on discussions of generating distributions of performance, and the evaluation of different strategies for humans performing tasks with mixed initiative (Human-Automation) systems. I will also discuss issues with how to provide Human Performance modeling data to support decisions on acceptability and tradeoffs in the design of safety critical systems. I will conclude with challenges for the future.

  2. Comparison of performance of simulation models for floor heating

    DEFF Research Database (Denmark)

    Weitzmann, Peter; Svendsen, Svend

    2005-01-01

    This paper describes the comparison of performance of simulation models for floor heating with different level of detail in the modelling process. The models are compared in an otherwise identical simulation model containing room model, walls, windows, ceiling and ventilation system. By exchanging...

  3. Developing an Energy Performance Modeling Startup Kit

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2012-10-01

    In 2011, the NAHB Research Center began assessing the needs and motivations of residential remodelers regarding energy performance remodeling. This report outlines: the current remodeling industry and the role of energy efficiency; gaps and barriers to adding energy efficiency into remodeling; and support needs of professional remodelers to increase sales and projects involving improving home energy efficiency.

  4. Modeling vibrato and portamento in music performance

    NARCIS (Netherlands)

    Desain, P.W.M.; Honing, H.J.

    1999-01-01

    Research in the psychology of music dealing with expression is often concerned with the discrete aspects of music performance, and mainly concentrates on the study of piano music (partly because of the ease with which piano music can be reduced to discrete note events). However, on other

  5. WirelessHART modeling and performance evaluation

    NARCIS (Netherlands)

    Remke, Anne Katharina Ingrid; Wu, Xian

    2013-01-01

    In process industries wired supervisory and control networks are more and more replaced by wireless systems. Wireless communication inevitably introduces time delays and message losses, which may degrade the system reliability and performance. WirelessHART, as the first international standard for

  6. Impact of Antenna Correlation on a New Dual-Hop MIMO AF Relaying Model

    Directory of Open Access Journals (Sweden)

    Amarasuriya Gayan

    2010-01-01

A novel system model is proposed for dual-hop multiple-input multiple-output (MIMO) amplify-and-forward relay networks, and the impact of antenna correlation on performance is studied. For a semi-arbitrary correlated source-relay channel and an arbitrarily correlated relay-destination channel, the complementary cumulative distribution function (CCDF) and the moment-generating function (MGF) approximations of the end-to-end signal-to-noise ratio (SNR) are derived. The outage probability, the average symbol error rate (SER), and the ergodic capacity approximations are also derived. Two special cases are treated explicitly: (1) a dual-antenna relay with a multiple-antenna destination, and (2) uncorrelated antennas at the relay with correlated antennas at the destination. For the first case, the CCDF, the MGF, and the average SER of an upper bound on the end-to-end SNR are derived in closed form. For the second case, the CCDF, the MGF, the average SER, and the moments of the SNR are derived in closed form; the high-SNR approximations for the outage probability and the average SER are also derived, and the diversity gain and coding gain are developed. Extensive numerical results and Monte Carlo simulation results are presented to verify the analytical results and to quantify the detrimental impact of antenna correlation on system performance.
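
The end-to-end SNR quantities in the abstract above can be illustrated with a small Monte Carlo sketch. This is not the paper's correlated-MIMO derivation: it assumes uncorrelated Rayleigh hops (exponential per-hop SNRs) and uses the standard variable-gain AF expression g1*g2/(g1+g2+1) together with its min(g1, g2) upper bound; all parameter values are illustrative.

```python
import math
import random

def outage_af_relay(mean_snr1, mean_snr2, gamma_th, trials=200_000, seed=1):
    """Monte Carlo outage probability of a dual-hop AF relay link.

    Per-hop SNRs are drawn as exponentials (uncorrelated Rayleigh fading);
    the exact end-to-end SNR of a variable-gain AF relay is
    g1*g2/(g1+g2+1), upper-bounded by min(g1, g2)."""
    rng = random.Random(seed)
    out_exact = out_bound = 0
    for _ in range(trials):
        g1 = rng.expovariate(1.0 / mean_snr1)
        g2 = rng.expovariate(1.0 / mean_snr2)
        if g1 * g2 / (g1 + g2 + 1.0) < gamma_th:
            out_exact += 1
        if min(g1, g2) < gamma_th:
            out_bound += 1
    return out_exact / trials, out_bound / trials

# For i.i.d. Rayleigh hops, P[min(g1, g2) < x] = 1 - exp(-2x/mean) in closed form,
# which gives an independent check on the simulated bound.
exact, bound = outage_af_relay(10.0, 10.0, 1.0)
analytic_bound = 1.0 - math.exp(-2.0 * 1.0 / 10.0)
```

Because min(g1, g2) over-estimates the true SNR, its outage probability lower-bounds the exact one, which the simulation reproduces.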

  7. Neuro-fuzzy model for evaluating the performance of processes ...

    Indian Academy of Sciences (India)

    In this work an Adaptive Neuro-Fuzzy Inference System (ANFIS) was used to model the periodic performance of some multi-input single-output (MISO) processes, namely: brewery operations (case study 1) and soap production (case study 2) processes. Two ANFIS models were developed to model the performance of the ...

  8. Persistence Modeling for Assessing Marketing Strategy Performance

    NARCIS (Netherlands)

    M.G. Dekimpe (Marnik); D.M. Hanssens (Dominique)

    2003-01-01

The question of long-run market response lies at the heart of any marketing strategy that tries to create a sustainable competitive advantage for the firm or brand. A key challenge, however, is that only short-run results of marketing actions are readily observable. Persistence modeling

  9. Some useful characteristics of performance models

    International Nuclear Information System (INIS)

    Worledge, D.H.

    1985-01-01

This paper examines the demands placed upon models of human cognitive decision processes in application to Probabilistic Risk Assessment. Successful models for this purpose should: (1) be based on proven or plausible psychological knowledge, e.g., Rasmussen's mental schematic; (2) incorporate opportunities for slips; (3) take account of the recursive nature, in time, of corrections to mistaken actions; and (4) depend on the crew's predominant mental states that accompany such recursions. The latter is equivalent to an explicit coupling between the input and output of Rasmussen's mental schematic. A family of such models is proposed, with observable rate processes mediating the (conscious) mental states involved. It is expected that the cumulative probability distributions corresponding to the individual rate processes can be identified with probability-time correlations of the HCR (Human Cognitive Reliability) type discussed elsewhere in this session. The functional forms of the conditional rates are intuitively shown to have simple characteristics that lead to a strongly recursive stochastic process with significant predictive capability. Models of the type proposed have few parts and form a representation that is intentionally far short of a fully transparent exposition of the mental process, in order to avoid making impossible demands on data.
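
The probability-time correlations of the HCR type mentioned above are commonly written as a Weibull curve in normalized time. The following sketch assumes that functional form with illustrative, uncalibrated coefficients, purely to show the shape of a crew non-response curve.

```python
import math

def hcr_nonresponse(t, t_half, c_gamma, c_eta, c_beta):
    """Weibull-type time-reliability correlation of the HCR family:
    probability that the crew has NOT yet responded by time t.
    t_half is the median response time; c_gamma, c_eta, c_beta are
    shape/scale/delay coefficients (values here are illustrative)."""
    x = t / t_half  # normalized time
    if x <= c_gamma:
        return 1.0  # before the delay threshold, no response yet
    return math.exp(-(((x - c_gamma) / c_eta) ** c_beta))

# Non-response probability is 1 before the delay and then falls monotonically.
curve = [hcr_nonresponse(t, t_half=60.0, c_gamma=0.7, c_eta=0.4, c_beta=1.2)
         for t in (30, 60, 120, 300)]
```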

  10. Evaluating Performances of Traffic Noise Models | Oyedepo ...

    African Journals Online (AJOL)

Traffic noise levels in decibels dB(A) were measured at six locations using a 407780A Integrating Sound Level Meter, while spot speeds and traffic volumes were collected with a cine-camera. The predicted sound exposure level (SEL) was evaluated using the Burgess, British and FHWA models. The average noise level obtained is 77.64 ...

  11. HANDOVER MANAGEABILITY AND PERFORMANCE MODELING IN

    African Journals Online (AJOL)

    SOFTLINKS DIGITAL

situations. Library Management Design Using Use Case: to model software using object-oriented design, a case study of the Bingham University Library Management System is used. Software is developed to automate the Bingham University manual library. The system will be stand-alone and will be designed with the ...

  12. Sustaining Team Performance: A Systems Model

    Science.gov (United States)

    1979-07-31

    a "forgetting curve." Three cases were tested and retested under four different research schedules. A description of the test cases follows. ... available to show fluctuation in Ph due to the unit yearly training cycle. Another real-world military example of the classical forgetting curve is found in the ... no practice between the acquisition and subsequent test of performance. The classical forgetting curve is believed to apply. The shape of the curve depends

  13. Modelling swimming hydrodynamics to enhance performance

    OpenAIRE

    Marinho, D.A.; Rouboa, A.; Barbosa, Tiago M.; Silva, A.J.

    2010-01-01

    Swimming assessment is one of the most complex but outstanding and fascinating topics in biomechanics. Computational fluid dynamics (CFD) methodology is one of the different methods that have been applied in swimming research to observe and understand water movements around the human body and its application to improve swimming performance. CFD has been applied attempting to understand deeply the biomechanical basis of swimming. Several studies have been conducted willing to analy...

  14. An Empirical Study of a Solo Performance Assessment Model

    Science.gov (United States)

    Russell, Brian E.

    2015-01-01

    The purpose of this study was to test a hypothesized model of solo music performance assessment. Specifically, this study investigates the influence of technique and musical expression on perceptions of overall performance quality. The Aural Musical Performance Quality (AMPQ) measure was created to measure overall performance quality, technique,…

  15. Developing an Energy Performance Modeling Startup Kit

    Energy Technology Data Exchange (ETDEWEB)

    Wood, A.

    2012-10-01

    In 2011, the NAHB Research Center began the first part of the multi-year effort by assessing the needs and motivations of residential remodelers regarding energy performance remodeling. The scope is multifaceted - all perspectives will be sought related to remodeling firms ranging in size from small-scale, sole proprietor to national. This will allow the Research Center to gain a deeper understanding of the remodeling and energy retrofit business and the needs of contractors when offering energy upgrade services. To determine the gaps and the motivation for energy performance remodeling, the NAHB Research Center conducted (1) an initial series of focus groups with remodelers at the 2011 International Builders' Show, (2) a second series of focus groups with remodelers at the NAHB Research Center in conjunction with the NAHB Spring Board meeting in DC, and (3) quantitative market research with remodelers based on the findings from the focus groups. The goal was threefold, to: Understand the current remodeling industry and the role of energy efficiency; Identify the gaps and barriers to adding energy efficiency into remodeling; and Quantify and prioritize the support needs of professional remodelers to increase sales and projects involving improving home energy efficiency. This report outlines all three of these tasks with remodelers.

  16. Modeling the Mechanical Performance of Die Casting Dies

    Energy Technology Data Exchange (ETDEWEB)

    R. Allen Miller

    2004-02-27

The following report covers work performed at Ohio State on modeling the mechanical performance of dies. The focus of the project was the development and, particularly, verification of finite element techniques used to model and predict displacements and stresses in die casting dies. The work entails a major case study performed with an industrial partner on a production die, and laboratory experiments performed at Ohio State.

  17. Modeling the marketing strategy-performance relationship : towards an hierarchical marketing performance framework

    NARCIS (Netherlands)

    Huizingh, Eelko K.R.E.; Zengerink, Evelien

    2001-01-01

    Accurate measurement of marketing performance is an important topic for both marketing academics and marketing managers. Many researchers have recognized that marketing performance measurement should go beyond financial measurement. In this paper we propose a conceptual framework that models

  18. Modeling and analysis to quantify MSE wall behavior and performance.

    Science.gov (United States)

    2009-08-01

    To better understand potential sources of adverse performance of mechanically stabilized earth (MSE) walls, a suite of analytical models was studied using the computer program FLAC, a numerical modeling computer program widely used in geotechnical en...

  19. Characterization uncertainty and its effects on models and performance

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Treadway, A.H.

    1991-01-01

    Geostatistical simulation is being used to develop multiple geologic models of rock properties at the proposed Yucca Mountain repository site. Because each replicate model contains the same known information, and is thus essentially indistinguishable statistically from others, the differences between models may be thought of as representing the uncertainty in the site description. The variability among performance measures, such as ground water travel time, calculated using these replicate models therefore quantifies the uncertainty in performance that arises from uncertainty in site characterization.

  20. Metrics for evaluating performance and uncertainty of Bayesian network models

    Science.gov (United States)

    Bruce G. Marcot

    2012-01-01

    This paper presents a selected set of existing and new metrics for gauging Bayesian network model performance and uncertainty. Selected existing and new metrics are discussed for conducting model sensitivity analysis (variance reduction, entropy reduction, case file simulation); evaluating scenarios (influence analysis); depicting model complexity (numbers of model...

  1. Modeling and Performance Analysis of Manufacturing Systems in ...

    African Journals Online (AJOL)

This study deals with modeling and performance analysis of footwear manufacturing using the Arena simulation modeling software. Modeling and simulation proved to be a potential tool for the analysis of manufacturing assembly lines such as footwear manufacturing because it allows the researcher to ...

  2. Identifying the connective strength between model parameters and performance criteria

    Directory of Open Access Journals (Sweden)

    B. Guse

    2017-11-01

In hydrological models, parameters are used to represent the time-invariant characteristics of catchments and to capture different aspects of hydrological response. Hence, model parameters need to be identified based on their role in controlling the hydrological behaviour. For the identification of meaningful parameter values, multiple and complementary performance criteria are used that compare modelled and measured discharge time series. The reliability of the identification of hydrologically meaningful model parameter values depends on how distinctly a model parameter can be assigned to one of the performance criteria. To investigate this, we introduce the new concept of connective strength between model parameters and performance criteria. The connective strength assesses the intensity of the interrelationship between model parameters and performance criteria in a bijective way. In our analysis of connective strength, model simulations are carried out based on a Latin hypercube sampling. Ten performance criteria, including Nash–Sutcliffe efficiency (NSE), Kling–Gupta efficiency (KGE) and its three components (alpha, beta and r), as well as RSR (the ratio of the root mean square error to the standard deviation) for different segments of the flow duration curve (FDC), are calculated. With a joint analysis of two regression tree (RT) approaches, we derive how a model parameter is connected to different performance criteria. At first, RTs are constructed using each performance criterion as the target variable to detect the most relevant model parameters for each performance criterion. Secondly, RTs are constructed using each parameter as the target variable to detect which performance criteria are impacted by changes in the values of one distinct model parameter. Based on this, appropriate performance criteria are identified for each model parameter. In this study, a high bijective connective strength between model parameters and performance criteria
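
The NSE and KGE criteria named above have compact closed forms, so a minimal sketch can make the KGE decomposition into r, alpha and beta concrete. The observation series below is invented for illustration.

```python
import math
from statistics import mean, pstdev

def nse(obs, sim):
    """Nash–Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    ob = mean(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sst = sum((o - ob) ** 2 for o in obs)
    return 1.0 - sse / sst

def kge(obs, sim):
    """Kling–Gupta efficiency and its three components:
    r (correlation), alpha (variability ratio), beta (bias ratio)."""
    mo, ms = mean(obs), mean(sim)
    so, ss = pstdev(obs), pstdev(sim)
    r = (sum((o - mo) * (s - ms) for o, s in zip(obs, sim))
         / (len(obs) * so * ss))
    alpha, beta = ss / so, ms / mo
    k = 1.0 - math.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)
    return k, r, alpha, beta

obs = [2.0, 3.5, 5.0, 4.0, 2.5, 2.0]  # invented discharge series
perfect = nse(obs, obs)  # identical series score 1 by construction
```

A purely additive bias leaves r and alpha at 1 but pushes beta above 1, which is exactly the kind of criterion-specific signal the connective-strength analysis exploits.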

  3. Performance Measurement Model A TarBase model with ...

    Indian Academy of Sciences (India)

    rohit

C=Cost; G=Gamma; CV=Cross Validation; MCC=Matthew Correlation Coefficient. Test 1: C G CV Accuracy TP TN FP FN ... Conclusion: Without considering the MirTif negative dataset for training the Model A and B classifiers, our Model A and B ...

  4. Ecosystem Model Skill Assessment. Yes We Can!

    Science.gov (United States)

    Olsen, Erik; Fay, Gavin; Gaichas, Sarah; Gamble, Robert; Lucey, Sean; Link, Jason S.

    2016-01-01

    Need to Assess the Skill of Ecosystem Models Accelerated changes to global ecosystems call for holistic and integrated analyses of past, present and future states under various pressures to adequately understand current and projected future system states. Ecosystem models can inform management of human activities in a complex and changing environment, but are these models reliable? Ensuring that models are reliable for addressing management questions requires evaluating their skill in representing real-world processes and dynamics. Skill has been evaluated for just a limited set of some biophysical models. A range of skill assessment methods have been reviewed but skill assessment of full marine ecosystem models has not yet been attempted. Northeast US Atlantis Marine Ecosystem Model We assessed the skill of the Northeast U.S. (NEUS) Atlantis marine ecosystem model by comparing 10-year model forecasts with observed data. Model forecast performance was compared to that obtained from a 40-year hindcast. Multiple metrics (average absolute error, root mean squared error, modeling efficiency, and Spearman rank correlation), and a suite of time-series (species biomass, fisheries landings, and ecosystem indicators) were used to adequately measure model skill. Overall, the NEUS model performed above average and thus better than expected for the key species that had been the focus of the model tuning. Model forecast skill was comparable to the hindcast skill, showing that model performance does not degenerate in a 10-year forecast mode, an important characteristic for an end-to-end ecosystem model to be useful for strategic management purposes. Skill Assessment Is Both Possible and Advisable We identify best-practice approaches for end-to-end ecosystem model skill assessment that would improve both operational use of other ecosystem models and future model development. 
We show that it is possible to not only assess the skill of a complicated marine ecosystem model, but that

  5. A Systemic Cause Analysis Model for Human Performance Technicians

    Science.gov (United States)

    Sostrin, Jesse

    2011-01-01

    This article presents a systemic, research-based cause analysis model for use in the field of human performance technology (HPT). The model organizes the most prominent barriers to workplace learning and performance into a conceptual framework that explains and illuminates the architecture of these barriers that exist within the fabric of everyday…

  6. First-Principles Simulation and Comparison with Beam Tests for Transverse Instabilities and Damper Performance in the Fermilab Main Injector

    CERN Document Server

    Nicklaus, Dennis J; Kashikhin, Vladimir

    2005-01-01

    An end-to-end performance calculation and comparison with beam tests was performed for the bunch-by-bunch digital transverse damper in the Fermilab Main Injector. Time dependent magnetic wakefields responsible for "Resistive Wall" transverse instabilities in the Main Injector were calculated with OPERA-2D using the actual beam pipe and dipole magnet lamination geometry. The leading order dipole component was parameterized and used as input to a bunch-by-bunch simulation which included the filling pattern and injection errors experienced in high-intensity operation of the Main Injector. The instability growth times, and the spreading of the disturbance due to newly mis-injected batches was compared between simulations and beam data collected by the damper system. Further simulation models the effects of the damper system on the beam.

  7. Multitasking TORT under UNICOS: Parallel performance models and measurements

    International Nuclear Information System (INIS)

    Barnett, A.; Azmy, Y.Y.

    1999-01-01

The existing parallel algorithms in the TORT discrete ordinates code were updated to function in a UNICOS environment. A performance model for the parallel overhead was derived for the existing algorithms. The largest contributors to the parallel overhead were identified and a new algorithm was developed. A parallel overhead model was also derived for the new algorithm. The parallel performance models were then compared to applications of the code to two TORT standard test problems and a large production problem. The parallel performance models agree well with the measured parallel overhead.

  8. Multitasking TORT Under UNICOS: Parallel Performance Models and Measurements

    International Nuclear Information System (INIS)

    Azmy, Y.Y.; Barnett, D.A.

    1999-01-01

The existing parallel algorithms in the TORT discrete ordinates code were updated to function in a UNICOS environment. A performance model for the parallel overhead was derived for the existing algorithms. The largest contributors to the parallel overhead were identified and a new algorithm was developed. A parallel overhead model was also derived for the new algorithm. The parallel performance models were then compared to applications of the code to two TORT standard test problems and a large production problem. The parallel performance models agree well with the measured parallel overhead.
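
The abstract above does not give the overhead model itself, so the following is only a generic sketch of a parallel-overhead performance model: serial work, perfectly divided parallel work, and an overhead term that grows with processor count. All coefficients are invented for illustration.

```python
def runtime(p, t_serial=2.0, t_parallel=96.0, c_comm=0.5):
    """Modelled runtime on p processors: fixed serial work, evenly
    divided parallel work, and a communication/synchronization overhead
    that grows linearly with p (illustrative coefficients, in seconds)."""
    return t_serial + t_parallel / p + c_comm * p

def speedup(p, **kw):
    """Speedup relative to a single-processor run of the same model."""
    return runtime(1, **kw) / runtime(p, **kw)

# Speedup first rises with p, then falls once overhead dominates:
curve = [(p, speedup(p)) for p in (1, 8, 16, 64)]
```

With these coefficients the model predicts a best processor count near sqrt(t_parallel / c_comm), after which added processors only add overhead.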

  9. Ecosystem Model Skill Assessment. Yes We Can!

    Science.gov (United States)

    Olsen, Erik; Fay, Gavin; Gaichas, Sarah; Gamble, Robert; Lucey, Sean; Link, Jason S

    2016-01-01

Accelerated changes to global ecosystems call for holistic and integrated analyses of past, present and future states under various pressures to adequately understand current and projected future system states. Ecosystem models can inform management of human activities in a complex and changing environment, but are these models reliable? Ensuring that models are reliable for addressing management questions requires evaluating their skill in representing real-world processes and dynamics. Skill has been evaluated for just a limited set of some biophysical models. A range of skill assessment methods have been reviewed but skill assessment of full marine ecosystem models has not yet been attempted. We assessed the skill of the Northeast U.S. (NEUS) Atlantis marine ecosystem model by comparing 10-year model forecasts with observed data. Model forecast performance was compared to that obtained from a 40-year hindcast. Multiple metrics (average absolute error, root mean squared error, modeling efficiency, and Spearman rank correlation) and a suite of time series (species biomass, fisheries landings, and ecosystem indicators) were used to adequately measure model skill. Overall, the NEUS model performed above average and thus better than expected for the key species that had been the focus of the model tuning. Model forecast skill was comparable to the hindcast skill, showing that model performance does not degenerate in a 10-year forecast mode, an important characteristic for an end-to-end ecosystem model to be useful for strategic management purposes. We identify best-practice approaches for end-to-end ecosystem model skill assessment that would improve both operational use of other ecosystem models and future model development. We show that it is possible not only to assess the skill of a complicated marine ecosystem model, but that it is necessary to do so to instill confidence in model results and encourage their use for strategic management. 
Our methods are applicable
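
The four skill metrics listed above (average absolute error, RMSE, modelling efficiency, and Spearman rank correlation) can be sketched in a few lines. The rank computation below assumes no tied values, and the biomass series are invented for illustration.

```python
import math

def skill_metrics(obs, sim):
    """Forecast skill metrics of the kind listed above: average absolute
    error, root mean squared error, modelling efficiency (MEF, analogous
    to Nash–Sutcliffe), and Spearman rank correlation."""
    n = len(obs)
    aae = sum(abs(o - s) for o, s in zip(obs, sim)) / n
    rmse = math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / n)
    ob = sum(obs) / n
    mef = 1.0 - (sum((o - s) ** 2 for o, s in zip(obs, sim))
                 / sum((o - ob) ** 2 for o in obs))

    def ranks(x):  # simple ranking; assumes distinct values (no ties)
        order = sorted(range(len(x)), key=lambda i: x[i])
        r = [0.0] * len(x)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r

    # Spearman rho is the Pearson correlation of the ranks.
    ro, rs = ranks(obs), ranks(sim)
    mo, ms = sum(ro) / n, sum(rs) / n
    num = sum((a - mo) * (b - ms) for a, b in zip(ro, rs))
    den = math.sqrt(sum((a - mo) ** 2 for a in ro)
                    * sum((b - ms) ** 2 for b in rs))
    return aae, rmse, mef, num / den

biomass_obs = [12.0, 15.0, 11.0, 18.0, 22.0]   # invented observed series
biomass_sim = [11.5, 15.5, 10.0, 17.0, 21.0]   # invented forecast series
aae, rmse, mef, rho = skill_metrics(biomass_obs, biomass_sim)
```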

  10. Cost and Performance Assumptions for Modeling Electricity Generation Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Tidball, R.; Bluestein, J.; Rodriguez, N.; Knoke, S.

    2010-11-01

    The goal of this project was to compare and contrast utility scale power plant characteristics used in data sets that support energy market models. Characteristics include both technology cost and technology performance projections to the year 2050. Cost parameters include installed capital costs and operation and maintenance (O&M) costs. Performance parameters include plant size, heat rate, capacity factor or availability factor, and plant lifetime. Conventional, renewable, and emerging electricity generating technologies were considered. Six data sets, each associated with a different model, were selected. Two of the data sets represent modeled results, not direct model inputs. These two data sets include cost and performance improvements that result from increased deployment as well as resulting capacity factors estimated from particular model runs; other data sets represent model input data. For the technologies contained in each data set, the levelized cost of energy (LCOE) was also evaluated, according to published cost, performance, and fuel assumptions.
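
As a hedged illustration of how the LCOE in the paragraph above combines cost and performance parameters, here is a minimal sketch using the standard capital-recovery-factor formulation; the input numbers are invented and are not from the report's data sets.

```python
def crf(rate, years):
    """Capital recovery factor: annualizes an up-front cost at a
    given discount rate over a plant lifetime."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def lcoe(capital, fixed_om, var_om, fuel_price, heat_rate,
         capacity_factor, lifetime, discount_rate):
    """Levelized cost of energy in $/MWh.
    capital: $/kW installed; fixed_om: $/kW-yr; var_om: $/MWh;
    fuel_price: $/MMBtu; heat_rate: Btu/kWh."""
    annual_mwh_per_kw = 8.76 * capacity_factor  # MWh generated per kW-yr
    fixed = (capital * crf(discount_rate, lifetime) + fixed_om) / annual_mwh_per_kw
    fuel = fuel_price * heat_rate / 1000.0      # Btu/kWh -> MMBtu/MWh
    return fixed + var_om + fuel

# Illustrative gas-plant-like inputs (not from the report):
cost = lcoe(capital=1000.0, fixed_om=20.0, var_om=2.0, fuel_price=4.0,
            heat_rate=8000.0, capacity_factor=0.85, lifetime=30,
            discount_rate=0.07)
```

Holding cost inputs fixed, a higher capacity factor spreads the annualized capital over more MWh, which is why the performance parameters in the data sets matter as much as the cost parameters.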

  11. The GEDI Performance Tool

    Science.gov (United States)

    Hancock, S.; Armston, J.; Tang, H.; Patterson, P. L.; Healey, S. P.; Marselis, S.; Duncanson, L.; Hofton, M. A.; Kellner, J. R.; Luthcke, S. B.; Sun, X.; Blair, J. B.; Dubayah, R.

    2017-12-01

NASA's Global Ecosystem Dynamics Investigation will mount a multi-track, full-waveform lidar on the International Space Station (ISS) that is optimised for the measurement of forest canopy height and structure. GEDI will use ten laser tracks, two 10 mJ "power beams" and eight 5 mJ "coverage beams", to produce global (51.5°S to 51.5°N) maps of above-ground biomass (AGB), canopy height, vegetation structure and other biophysical parameters. The mission has a requirement to generate a 1 km AGB map with 80% of pixels with ≤ 20% standard error or 20 Mg·ha⁻¹, whichever is greater. To assess performance and compare to mission requirements, an end-to-end simulator has been developed. The simulator brings together tools to propagate the effects of measurement and sampling error on GEDI data products. The simulator allows us to evaluate the impact of instrument performance, ISS orbits, processing algorithms and losses of data that may occur due to clouds, snow, leaf-off conditions, and areas with an insufficient signal-to-noise ratio (SNR). By evaluating the consequences of operational decisions on GEDI data products, this tool provides a quantitative framework for decision-making and mission planning. Here we demonstrate the performance tool by using it to evaluate the trade-off between measurement and sampling error on the 1 km AGB data product. Results demonstrate that the use of coverage beams during the day (the lowest GEDI SNR case) over very dense forests (>95% canopy cover) will result in some measurement bias. Omitting these low-SNR cases increased the sampling error. Through this, an SNR threshold for a given expected canopy cover can be set. Other applications of the performance tool are also discussed, such as assessing the impact of decisions made in the AGB modelling and signal processing stages on the accuracy of final data products.
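
The 1 km requirement quoted above (standard error no greater than 20% of mean AGB or 20 Mg·ha⁻¹, whichever is larger) can be checked per cell with a few lines. This sketch assumes simple random sampling of footprint-level estimates within a cell, which ignores GEDI's along-track correlation; the numbers are invented.

```python
import math

def meets_agb_requirement(agb_estimates):
    """Check the stated requirement for one hypothetical 1 km cell:
    standard error of the mean AGB must be <= max(20% of mean, 20 Mg/ha).
    agb_estimates: footprint-level AGB values (Mg/ha) sampled in the cell.
    Assumes independent footprints (no along-track correlation)."""
    n = len(agb_estimates)
    m = sum(agb_estimates) / n
    var = sum((x - m) ** 2 for x in agb_estimates) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                                   # standard error
    threshold = max(0.2 * m, 20.0)
    return se, threshold, se <= threshold

# A well-sampled, homogeneous cell passes; a sparse, heterogeneous one fails.
se_ok, th_ok, passes = meets_agb_requirement([90.0, 110.0] * 25)
se_bad, th_bad, fails = meets_agb_requirement([10.0, 300.0, 20.0, 250.0])
```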

  12. A Probabilistic Approach to Symbolic Performance Modeling of Parallel Systems

    NARCIS (Netherlands)

    Gautama, H.

    2004-01-01

Performance modeling plays a significant role in predicting the effects of a particular design choice or in diagnosing the cause of some observed performance behavior. Especially for complex systems such as parallel computers, an intended performance typically cannot be achieved without recourse to

  13. Asymptotic performance modelling of DCF protocol with prioritized channel access

    Science.gov (United States)

    Choi, Woo-Yong

    2017-11-01

    Recently, the modification of the DCF (Distributed Coordination Function) protocol by the prioritized channel access was proposed to resolve the problem that the DCF performance worsens exponentially as more nodes exist in IEEE 802.11 wireless LANs. In this paper, an asymptotic analytical performance model is presented to analyze the MAC performance of the DCF protocol with the prioritized channel access.
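
The degradation the paragraph describes can be illustrated with a toy slotted-contention model, not the paper's asymptotic analysis: if every station transmits in a slot with a fixed probability tau, the chance of a collision-free slot collapses as the station count grows, while prioritized access that limits contention to a few stations does not.

```python
def success_prob(n, tau):
    """Probability that a slot carries exactly one transmission when n
    stations each transmit with probability tau (toy slotted model,
    not the DCF backoff analysis itself)."""
    return n * tau * (1 - tau) ** (n - 1)

tau = 0.05  # illustrative per-slot transmission probability
# With fixed tau, the collision-free probability collapses as n grows:
many = [success_prob(n, tau) for n in (10, 50, 200)]
# Prioritized channel access that lets only 8 stations contend keeps it
# flat regardless of how many stations exist in total:
prioritized = success_prob(8, tau)
```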

  14. Performance Analysis of Multi-Hop Heterodyne FSO Systems over Malaga Turbulent Channels with Pointing Error Using Mixture Gamma Distribution

    KAUST Repository

    Alheadary, Wael Ghazy

    2017-11-16

This work investigates the end-to-end performance of a free-space optical amplify-and-forward relaying system using heterodyne detection over Malaga turbulence channels in the presence of pointing errors. To overcome the analytical difficulties of the proposed composite channel model, we employ the mixture Gamma (MG) distribution. The proposed model gives a highly accurate and tractable approximation simply by adjusting some parameters. More specifically, we derive a new closed-form expression for the average bit error rate employing rectangular quadrature amplitude modulation in terms of the MG distribution and generalized power series of the Meijer's G-function. The closed-form expression has been validated numerically and asymptotically at high signal-to-noise ratio.
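
A mixture-Gamma approximation of the kind used above replaces an intractable channel pdf with a weighted sum of Gamma pdfs. The sketch below only demonstrates the form of the mixture and checks numerically that it integrates to one; the component parameters are invented, not fitted to a Malaga channel.

```python
import math

def mg_pdf(x, components):
    """Mixture-Gamma pdf: each component (w, beta, zeta) contributes
    w * zeta**beta * x**(beta-1) * exp(-zeta*x) / Gamma(beta),
    with the weights w summing to one."""
    return sum(w * z ** b * x ** (b - 1) * math.exp(-z * x) / math.gamma(b)
               for w, b, z in components)

# Invented two-component mixture (weights sum to 1):
comps = [(0.6, 1.5, 1.0), (0.4, 3.0, 2.0)]

# Midpoint-rule check over [0, 40] that the mixture integrates to ~1
# (the exponential tail beyond 40 is negligible for these rates).
step = 0.001
area = sum(mg_pdf(i * step + step / 2, comps) for i in range(40000)) * step
```

Because each component integrates to one, any convex combination does too, which is what makes the MG form a convenient stand-in for composite fading pdfs.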

  15. Maintenance personnel performance simulation (MAPPS): a model for predicting maintenance performance reliability in nuclear power plants

    International Nuclear Information System (INIS)

    Knee, H.E.; Krois, P.A.; Haas, P.M.; Siegel, A.I.; Ryan, T.G.

    1983-01-01

    The NRC has developed a structured, quantitative, predictive methodology in the form of a computerized simulation model for assessing maintainer task performance. Objective of the overall program is to develop, validate, and disseminate a practical, useful, and acceptable methodology for the quantitative assessment of NPP maintenance personnel reliability. The program was organized into four phases: (1) scoping study, (2) model development, (3) model evaluation, and (4) model dissemination. The program is currently nearing completion of Phase 2 - Model Development

  16. Performance and reliability model checking and model construction

    NARCIS (Netherlands)

    Hermanns, H.; Gnesi, Stefania; Schieferdecker, Ina; Rennoch, Axel

    2000-01-01

    Continuous-time Markov chains (CTMCs) are widely used to describe stochastic phenomena in many diverse areas. They are used to estimate performance and reliability characteristics of various nature, for instance to quantify throughputs of manufacturing systems, to locate bottlenecks in communication

  17. Automatic Performance Model Generation for Java Enterprise Edition (EE) Applications

    OpenAIRE

Brunnert, Andreas; Vögele, Christian; Krcmar, Helmut

    2015-01-01

    The effort required to create performance models for enterprise applications is often out of proportion compared to their benefits. This work aims to reduce this effort by introducing an approach to automatically generate component-based performance models for running Java EE applications. The approach is applicable for all Java EE server products as it relies on standardized component types and interfaces to gather the required data for modeling an application. The feasibility of the approac...

  18. ECOPATH: Model description and evaluation of model performance

    International Nuclear Information System (INIS)

    Bergstroem, U.; Nordlinder, S.

    1996-01-01

The model is based upon compartment theory and is run in combination with a statistical error propagation method (PRISM, Gardner et al. 1983). It is intended to be generic for application at other sites by simply changing parameter values. It was constructed especially for this scenario; however, it is based upon an earlier model designed for calculating relations between released amounts of radioactivity and doses to critical groups (used for Swedish regulations concerning annual reports of radioactivity released from routine operation of Swedish nuclear power plants; Bergstroem and Nordlinder, 1991). The model handles exposure from deposition on terrestrial areas as well as deposition on lakes, starting with deposition values. 14 refs, 16 figs, 7 tabs
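
As a hedged sketch of the compartment-theory approach described above (not ECOPATH's actual structure or rates), consider two compartments with first-order transfer and radioactive decay, integrated by forward Euler and checked against the closed-form solution for the first compartment.

```python
import math

# Minimal two-compartment sketch with invented rates: activity deposited
# on soil transfers to a lake compartment while both decay radioactively.
lam = math.log(2) / 30.0   # decay constant for a 30-year half-life (1/yr)
k12 = 0.05                 # soil -> lake transfer rate (1/yr)
dt, t_end = 0.01, 10.0     # Euler step and horizon (years)

soil, lake = 1.0, 0.0      # initial inventories (arbitrary units)
t = 0.0
while t < t_end - 1e-9:
    d_soil = -(k12 + lam) * soil          # loss to lake plus decay
    d_lake = k12 * soil - lam * lake      # gain from soil minus decay
    soil += d_soil * dt
    lake += d_lake * dt
    t += dt

# The soil compartment has the closed form exp(-(k12 + lam) * t),
# giving an independent check on the numerical integration.
analytic_soil = math.exp(-(k12 + lam) * t_end)
```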

  19. Indonesian Private University Lecturer Performance Improvement Model to Improve a Sustainable Organization Performance

    Science.gov (United States)

    Suryaman

    2018-01-01

Lecturer performance affects the quality and the carrying capacity of the sustainability of an organization, in this case the university. Many models have been developed to measure the performance of teachers, but few discuss the influence of lecturer performance itself on the sustainability of an organization. This study was conducted in…

  20. A Spectral Evaluation of Models Performances in Mediterranean Oak Woodlands

    Science.gov (United States)

    Vargas, R.; Baldocchi, D. D.; Abramowitz, G.; Carrara, A.; Correia, A.; Kobayashi, H.; Papale, D.; Pearson, D.; Pereira, J.; Piao, S.; Rambal, S.; Sonnentag, O.

    2009-12-01

Ecosystem processes are influenced by climatic trends at multiple temporal scales, including diel patterns and other mid-term climatic modes such as interannual and seasonal variability. Because interactions between biophysical components of ecosystem processes are complex, it is important to test how models perform in the frequency domain (e.g. hours, days, weeks, months, years) and in the time domain (i.e. day of the year), in addition to traditional tests of annual or monthly sums. Here we present a spectral evaluation, using wavelet time series analysis, of model performance in seven Mediterranean oak woodlands that encompass three deciduous and four evergreen sites. We tested the performance of five models (CABLE, ORCHIDEE, BEPS, Biome-BGC, and JULES) on measured variables of gross primary production (GPP) and evapotranspiration (ET). In general, model performance fails at intermediate periods (e.g. weeks to months), likely because these models do not represent the water pulse dynamics that influence GPP and ET in these Mediterranean systems. To improve the performance of a model it is critical first to identify where and when the model fails. Only by identifying where a model fails can we improve model performance, use models as prognostic tools, and generate further hypotheses that can be tested by new experiments and measurements.
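
The idea of evaluating a model in the frequency domain can be sketched without wavelets by comparing discrete power spectra: two series can agree at low (seasonal-scale) frequency while disagreeing at intermediate frequency, which is exactly the kind of failure described above. The signals below are synthetic.

```python
import cmath
import math

def power_spectrum(x):
    """Naive DFT power spectrum; O(n^2) but fine for a short
    illustrative series."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2 / n
            for k in range(n // 2)]

n = 64
# "Observed" flux: a slow annual-like cycle plus a faster weekly-like cycle.
obs = [math.sin(2 * math.pi * t / 64) + 0.5 * math.sin(2 * math.pi * t / 8)
       for t in range(n)]
# "Modelled" flux: captures the slow cycle but misses the fast mode.
mod = [math.sin(2 * math.pi * t / 64) for t in range(n)]

po, pm = power_spectrum(obs), power_spectrum(mod)
# po and pm agree at the slow bin (k=1) but diverge at the fast bin (k=8).
```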

  1. Atomic scale simulations for improved CRUD and fuel performance modeling

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Anders David Ragnar [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Cooper, Michael William Donald [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-06

    A more mechanistic description of fuel performance codes can be achieved by deriving models and parameters from atomistic scale simulations rather than fitting models empirically to experimental data. The same argument applies to modeling deposition of corrosion products on fuel rods (CRUD). Here are some results from publications in 2016 carried out using the CASL allocation at LANL.

  2. Modeling the Effect of Bandwidth Allocation on Network Performance

    African Journals Online (AJOL)

    In this paper, a new channel capacity model for interference-limited systems was obtained ... congestion admission control, with the intent of minimizing energy consumption at each terminal.

  3. Modelling of Box Type Solar Cooker Performance in a Tropical ...

    African Journals Online (AJOL)

    A thermal performance model of a box-type solar cooker loaded with water is presented. The model was developed using the method of Funk to estimate cooking power in terms of climatic and design parameters for a box-type solar cooker in a tropical environment. Coefficients for each term used in the model were determined ...
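
    Funk's cooking-power procedure reduces to an interval power computed from the temperature rise of the water load, normalized to a standard insolation of 700 W/m². A minimal sketch (input values are illustrative, not from the paper):

```python
# Standardized cooking power per Funk's procedure (a sketch; numbers are illustrative)
def cooking_power(mass_kg, dT_K, interval_s, insolation_W_m2):
    CP_WATER = 4186.0                            # specific heat of water, J/(kg*K)
    p = mass_kg * CP_WATER * dT_K / interval_s   # interval cooking power, W
    return p * 700.0 / insolation_W_m2           # normalized to 700 W/m^2 insolation

# 2 kg of water warming 5 K over 10 min under 850 W/m^2 insolation
print(round(cooking_power(2.0, 5.0, 600.0, 850.0), 1))
```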

  4. Comparisons of Faulting-Based Pavement Performance Prediction Models

    Directory of Open Access Journals (Sweden)

    Weina Wang

    2017-01-01

    Faulting prediction is the core of concrete pavement maintenance and design. Highway agencies are always faced with the problem of low prediction accuracy, which causes costly maintenance. Although many researchers have developed performance prediction models, the accuracy of prediction has remained a challenge. This paper reviews performance prediction models and JPCP faulting models that have been used in past research. Then three models, the multivariate nonlinear regression (MNLR) model, the artificial neural network (ANN) model, and the Markov chain (MC) model, are tested and compared using a set of actual pavement survey data taken on an interstate highway with varying design features, traffic, and climate data. It is found that the MNLR model needs further recalibration, while the ANN model needs more data for training the network. The MC model seems to be a good tool for pavement performance prediction when data is limited, but it is based on visual inspections and not explicitly related to quantitative physical parameters. This paper then suggests that a further direction for developing performance prediction models is to combine the advantages of the different models to obtain better accuracy.
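
    A Markov-chain pavement model of the kind compared above propagates a condition-state distribution through a transition matrix. A minimal sketch with a hypothetical four-state matrix (not the paper's calibrated values):

```python
import numpy as np

# Hypothetical 4-state faulting condition model (state 0 = best, 3 = worst).
# P[i][j] = one-year probability of moving from state i to state j.
P = np.array([
    [0.80, 0.20, 0.00, 0.00],
    [0.00, 0.85, 0.15, 0.00],
    [0.00, 0.00, 0.90, 0.10],
    [0.00, 0.00, 0.00, 1.00],
])

state = np.array([1.0, 0.0, 0.0, 0.0])   # new pavement starts in the best state
for year in range(10):
    state = state @ P                    # propagate one year at a time
print([round(p, 3) for p in state])      # condition distribution after 10 years
```

    This is why the MC approach works with limited data: only the transition probabilities, typically estimated from successive visual inspections, are needed.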

  5. FARMLAND: Model description and evaluation of model performance

    International Nuclear Information System (INIS)

    Attwood, C.; Fayers, C.; Mayall, A.; Brown, J.; Simmonds, J.R.

    1996-01-01

    The FARMLAND model was originally developed for use in connection with continuous, routine releases of radionuclides, but because it has many time-dependent features it has been developed further for a single accidental release. The most recent version of FARMLAND is flexible and can be used to predict activity concentrations in food as a function of time after both accidental and routine releases of radionuclides. The effect of deposition at different times of the year can be taken into account. FARMLAND contains a suite of models which simulate radionuclide transfer through different parts of the foodchain. The models can be used in different combinations and offer the flexibility to assess a variety of radiological situations. The main foods considered are green vegetables, grain products, root vegetables, milk, meat and offal from cattle, and meat and offal from sheep. A large variety of elements can be considered although the degree of complexity with which some are modelled is greater than others; isotopes of caesium, strontium and iodine are treated in greatest detail. 22 refs, 12 figs, 10 tabs
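
    Foodchain models such as FARMLAND are built from coupled first-order compartments. A toy two-compartment sketch after a single deposition event, integrated with Euler steps (rate constants and the pasture-to-milk pathway are illustrative, not FARMLAND's; activities are in arbitrary units):

```python
import numpy as np

LAMBDA_CS137 = np.log(2) / (30.2 * 365)   # Cs-137 radioactive decay, 1/day
weathering = np.log(2) / 14.0             # loss from pasture surface, 1/day (illustrative)
transfer = 0.005                          # pasture -> milk pathway, 1/day (illustrative)

pasture, milk = 1000.0, 0.0               # activity deposited on pasture at day 0
dt = 0.1
for _ in range(int(60 / dt)):             # simulate 60 days after deposition
    d_pasture = -(weathering + LAMBDA_CS137 + transfer) * pasture
    d_milk = transfer * pasture - LAMBDA_CS137 * milk
    pasture += d_pasture * dt
    milk += d_milk * dt
print(round(pasture, 1), round(milk, 1))
```

    Chaining more compartments (soil, grain, meat) in the same first-order pattern gives the time-dependent behavior the abstract refers to.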

  6. Performance Testing of Massive MIMO Base Station with Multi-Probe Anechoic Chamber Setups

    DEFF Research Database (Denmark)

    Zhang, Fengchun; Fan, Wei; Ji, Yilin

    2018-01-01

    The utilization of massive multiple-input multiple-output (MIMO) antenna arrays at the base station (BS) side has been identified as an enabling technique for 5G communication systems. To evaluate the true end-to-end performance of BSs, an over-the-air (OTA) radiated method is required. In this p...

  7. Formal Modeling of Service Session Management

    NARCIS (Netherlands)

    Le, V.M.; van Beijnum, Bernhard J.F.; de Goede, Leo; Almeroth, Kevin C.; Hasan, Masum

    2002-01-01

    This paper proposes a concept to apply modeling tools to Multi-Provider Telematics Service Management. The service architecture is based on the framework called “Open Service Components” which serves as building blocks to compose end-to-end telematics services in terms of service components offered

  8. Conceptual adsorption models and open issues pertaining to performance assessment

    International Nuclear Information System (INIS)

    Serne, R.J.

    1992-01-01

    Recently several articles have been published that question the appropriateness of the distribution coefficient, Rd, concept for quantifying radionuclide migration. This paper addresses several distinct issues surrounding the modeling of nuclide retardation. The first section defines adsorption terminology and discusses various adsorption processes. The next section describes five commonly used adsorption conceptual models, specifically emphasizing which attributes that affect adsorption are explicitly accommodated in each model. I also review efforts to incorporate each adsorption model into performance assessment transport computer codes. The five adsorption conceptual models are (1) the constant Rd model, (2) the parametric Rd model, (3) isotherm adsorption models, (4) mass action adsorption models, and (5) surface-complexation with electrostatics models. The final section discusses the adequacy of the distribution ratio concept, the adequacy of transport calculations that rely on constant retardation factors, and the status of incorporating sophisticated adsorption models into transport codes. 86 refs, 1 fig, 1 tab
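
    In the constant-Rd model, sorption collapses to a single retardation factor, R = 1 + (ρ_b/θ)·Kd, which scales the nuclide's transport velocity relative to the groundwater. A quick sketch with illustrative soil properties:

```python
# Constant-Rd retardation sketch: R = 1 + (rho_b / theta) * Kd
# Bulk density and porosity values below are illustrative defaults.
def retardation_factor(kd_ml_g, bulk_density_g_cm3=1.6, porosity=0.3):
    return 1.0 + (bulk_density_g_cm3 / porosity) * kd_ml_g

# A sorbing nuclide moves at 1/R of the groundwater velocity
for kd in (0.0, 1.0, 100.0):   # mL/g; illustrative values
    R = retardation_factor(kd)
    print(f"Kd={kd:>5} mL/g  R={R:8.1f}  v_nuclide/v_water={1 / R:.4f}")
```

    The critiques mentioned in the abstract target exactly this collapse: a single Kd cannot track changes in chemistry along the flow path, which the more sophisticated models (isotherm, mass action, surface complexation) attempt to capture.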

  9. High Performance Interactive System Dynamics Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Bush, Brian W [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Brunhart-Lupo, Nicholas J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gruchalla, Kenny M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Duckworth, Jonathan C [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-14

    This brochure describes a system dynamics (SD) simulation framework that supports an end-to-end analysis workflow optimized for deployment on ESIF facilities (Peregrine and the Insight Center). It includes (i) parallel and distributed simulation of SD models, (ii) real-time 3D visualization of running simulations, and (iii) comprehensive database-oriented persistence of simulation metadata, inputs, and outputs.

  10. High Performance Interactive System Dynamics Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Bush, Brian W [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Brunhart-Lupo, Nicholas J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gruchalla, Kenny M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Duckworth, Jonathan C [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-14

    This presentation describes a system dynamics (SD) simulation framework that supports an end-to-end analysis workflow optimized for deployment on ESIF facilities (Peregrine and the Insight Center). It includes (i) parallel and distributed simulation of SD models, (ii) real-time 3D visualization of running simulations, and (iii) comprehensive database-oriented persistence of simulation metadata, inputs, and outputs.

  11. Emerging Carbon Nanotube Electronic Circuits, Modeling, and Performance

    OpenAIRE

    Xu, Yao; Srivastava, Ashok; Sharma, Ashwani K.

    2010-01-01

    Current transport and dynamic models of carbon nanotube field-effect transistors are presented. A model of single-walled carbon nanotube as interconnect is also presented and extended in modeling of single-walled carbon nanotube bundles. These models are applied in studying the performances of circuits such as the complementary carbon nanotube inverter pair and carbon nanotube as interconnect. Cadence/Spectre simulations show that carbon nanotube field-effect transistor circuits can operate a...

  12. CORPORATE FORESIGHT AND PERFORMANCE: A CHAIN-OF-EFFECTS MODEL

    DEFF Research Database (Denmark)

    Jissink, Tymen; Huizingh, Eelko K.R.E.; Rohrbeck, René

    2015-01-01

    In this paper we develop and validate a measurement scale for corporate foresight and examine its impact on performance in a chain-of-effects model. We conceptualize corporate foresight as an organizational ability consisting of five distinct dimensions: information scope, method usage, people, formal organization, and culture. We investigate the relation of corporate foresight with three innovation performance dimensions – new product success, new product innovativeness, and financial performance. We use partial-least-squares structural equations modelling to assess our measurement models ... performance dimensions. Implications of our findings, and limitations and future research avenues, are discussed.

  13. Models used to assess the performance of photovoltaic systems.

    Energy Technology Data Exchange (ETDEWEB)

    Stein, Joshua S.; Klise, Geoffrey T.

    2009-12-01

    This report documents the various photovoltaic (PV) performance models and software developed and utilized by researchers at Sandia National Laboratories (SNL) in support of the Photovoltaics and Grid Integration Department. In addition to PV performance models, hybrid system and battery storage models are discussed. A hybrid system using other distributed sources and energy storage can help reduce the variability inherent in PV generation, and due to the complexity of combining multiple generation sources and system loads, these models are invaluable for system design and optimization. Energy storage plays an important role in reducing PV intermittency, and battery storage models are used to understand the best configurations and technologies to store PV generated electricity. Other researchers' models used by SNL are discussed, including some widely known models that incorporate algorithms developed at SNL. There are other models included in the discussion that are not used by or were not adopted from SNL research but may provide some benefit to researchers working on PV array performance, hybrid system models and energy storage. The paper is organized into three sections to describe the different software models as applied to photovoltaic performance, hybrid systems, and battery storage. For each model, there is a description which includes where to find the model, whether it is currently maintained and any references that may be available. Modeling improvements underway at SNL include quantifying the uncertainty of individual system components, the overall uncertainty in modeled vs. measured results and modeling large PV systems. SNL is also conducting research into the overall reliability of PV systems.
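
    As a flavor of what the simplest PV performance models compute, here is a generic irradiance-and-temperature derating sketch (not one of the Sandia models; the NOCT cell-temperature estimate and coefficients are common textbook defaults, used here as assumptions):

```python
# Minimal PV array performance sketch: output scales with plane-of-array
# irradiance and is derated by cell temperature above 25 C (STC).
def pv_power_w(g_poa, t_amb, p_stc=5000.0, gamma=-0.004, noct=45.0):
    """g_poa: irradiance W/m^2; t_amb: ambient C; gamma: power temp. coeff. 1/C."""
    t_cell = t_amb + (noct - 20.0) / 800.0 * g_poa   # NOCT cell-temperature estimate
    return p_stc * (g_poa / 1000.0) * (1.0 + gamma * (t_cell - 25.0))

# Full sun at 25 C ambient: the hot cell keeps output below nameplate
print(round(pv_power_w(1000.0, 25.0), 1))   # 4375.0 W (12.5% thermal derate)
```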

  14. Performance comparison of hydrological model structures during low flows

    Science.gov (United States)

    Staudinger, Maria; Stahl, Kerstin; Tallaksen, Lena M.; Clark, Martyn P.; Seibert, Jan

    2010-05-01

    Low flows are still poorly reproduced by common hydrological models, since these models are traditionally designed to reproduce peak flows as well as possible. As low flows become increasingly important in several application areas, there is a need to improve the available models. We present a study that assesses the impact of model structure on low flow simulations. This is done using the Framework for Understanding Structural Errors (FUSE), which identifies the set of (subjective) decisions made when building a hydrological model and provides multiple options for each modeling decision. 79 models were built using the FUSE framework and applied to simulate stream flows in the Narsjø catchment in Norway (119 km²). To allow comparison, all new models were calibrated using an automatic optimization method. Low flow and recession analysis of the new models enables us to evaluate model performance with a focus on different aspects by using various objective functions. Additionally, model structures responsible for poor performance, and hence unsuitable, can be detected. We focused on elucidating model performance during summer (August - October) and winter low flows, which evolve from entirely different hydrological processes in the Narsjø catchment: summer low flows develop out of a lack of precipitation, while winter low flows are due to water storage in ice and snow. The results showed that simulations of summer low flows were consistently poorer than simulations of winter low flows when evaluated with an objective function focusing on low flows; here, the model structure influencing winter low flow simulations is the lower layer architecture. Different model structures were found to influence model performance during the summer season. The choice of other objective functions has the potential to affect such an evaluation. These findings call for the use of different model structures tailored to particular needs.

  15. A performance comparison of atmospheric dispersion models over complex topography

    International Nuclear Information System (INIS)

    Kido, Hiroko; Oishi, Ryoko; Hayashi, Keisuke; Kanno, Mitsuhiro; Kurosawa, Naohiro

    2007-01-01

    A code system using a mass-consistent wind field model and a Gaussian puff model was improved as a new option for atmospheric dispersion research. There are several atmospheric dispersion models for radionuclides. Because different models have both merits and disadvantages, it is necessary to choose the model that is most suitable for the surface conditions of the estimated region, while considering the calculation time, accuracy, and purpose of the calculations being performed. Some models are less accurate when the topography is complex, so it is important to understand the differences between the models for smooth and complex surfaces. In this study, the performances of the following four models were compared: (1) Gaussian plume model; (2) Gaussian puff model; (3) mass-consistent wind fields and Gaussian puff model, improved in this study from the one presented in Aomori Energy Society of Japan, 2005 Fall Meeting, D21; (4) meso-scale meteorological model (RAMS: The Regional Atmospheric Modeling System) and particle-type model (HYPACT: The RAMS Hybrid Particle and Concentration Transport Model) (Reference: ATMET). (author)
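
    The Gaussian plume option in such comparisons is the analytically simplest: concentration follows a bivariate Gaussian spread about the plume centerline, with an image term for ground reflection. A sketch in which the dispersion coefficients grow linearly with distance, a crude stand-in for Pasquill-Gifford stability-class curves (all parameters illustrative):

```python
import math

def gaussian_plume(q, u, x, y, z, h, ay=0.08, az=0.06):
    """Concentration from a continuous point source (Gaussian plume).
    q: release rate (Bq/s), u: wind speed (m/s), x/y/z: receptor position (m),
    h: effective release height (m). sigma_y = ay*x and sigma_z = az*x are a
    linear stand-in for stability-class dispersion curves."""
    sy, sz = ay * x, az * x
    return (q / (2.0 * math.pi * u * sy * sz)
            * math.exp(-y ** 2 / (2.0 * sy ** 2))
            * (math.exp(-(z - h) ** 2 / (2.0 * sz ** 2))       # direct plume
               + math.exp(-(z + h) ** 2 / (2.0 * sz ** 2))))   # ground reflection

# Ground-level centerline concentration 1 km downwind of a 50 m stack
c = gaussian_plume(1e6, 3.0, 1000.0, 0.0, 0.0, 50.0)
print(f"{c:.3e}")
```

    Over complex topography the flat-terrain assumptions behind this formula break down, which is exactly why the study adds the mass-consistent and RAMS/HYPACT options.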

  16. Confirming the Value of Swimming-Performance Models for Adolescents.

    Science.gov (United States)

    Dormehl, Shilo J; Robertson, Samuel J; Barker, Alan R; Williams, Craig A

    2017-10-01

    To evaluate the efficacy of existing performance models to assess the progression of male and female adolescent swimmers through a quantitative and qualitative mixed-methods approach. Fourteen published models were tested using retrospective data from an independent sample of Dutch junior national-level swimmers from when they were 12-18 y of age (n = 13). The degree of association by Pearson correlations was compared between the calculated differences from the models and quadratic functions derived from the Dutch junior national qualifying times. Swimmers were grouped based on their differences from the models and compared with their swimming histories that were extracted from questionnaires and follow-up interviews. Correlations of the deviations from both the models and quadratic functions derived from the Dutch qualifying times were all significant except for the 100-m breaststroke and butterfly and the 200-m freestyle for females (P backstroke for males and 200-m freestyle for males and females were almost directly proportional. In general, deviations from the models were accounted for by the swimmers' training histories. Higher levels of retrospective motivation appeared to be synonymous with higher-level career performance. This mixed-methods approach helped confirm the validity of the models that were found to be applicable to adolescent swimmers at all levels, allowing coaches to track performance and set goals. The value of the models in being able to account for the expected performance gains during adolescence enables quantification of peripheral factors that could affect performance.

  17. A performance model of the OSI communication architecture

    Science.gov (United States)

    Kritzinger, P. S.

    1986-06-01

    An analytical model aiming at predicting the performance of software implementations which would be built according to the OSI basic reference model is proposed. The model uses the peer protocol standard of a layer as the reference description of an implementation of that layer. The model is basically a closed multiclass multichain queueing network with a processor-sharing center, modeling process contention at the processor, and a delay center, modeling times spent waiting for responses from the corresponding peer processes. Each individual transition of the protocol constitutes a different class and each layer of the architecture forms a closed chain. Performance statistics include queue lengths and response times at the processor as a function of processor speed and the number of open connections. It is shown how to reduce the model should the protocol state space become very large. Numerical results based upon the derived formulas are given.
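
    Closed queueing networks of this shape can be solved by exact mean-value analysis (MVA). The sketch below is a simplified single-class version with two queueing stations and one delay center (demands and the connection count are illustrative), not the multiclass multichain model of the paper:

```python
# Exact mean-value analysis (MVA) for a closed, single-class queueing network.
def mva(service_demands, n_customers, think_time=0.0):
    """service_demands: total demand D_k (s) at each queueing station;
    think_time: delay-center time (s). Returns (throughput, mean queue lengths)."""
    q = [0.0] * len(service_demands)
    x = 0.0
    for n in range(1, n_customers + 1):
        # residence time: demand inflated by the queue an arrival finds (arrival theorem)
        r = [d * (1.0 + qk) for d, qk in zip(service_demands, q)]
        x = n / (think_time + sum(r))           # system throughput with n customers
        q = [x * rk for rk in r]                # Little's law per station
    return x, q

# Two queueing stations (10 ms and 5 ms demand) plus a 100 ms delay center,
# 20 open connections -- all numbers illustrative.
x, q = mva([0.010, 0.005], n_customers=20, think_time=0.1)
print(round(x, 1), [round(v, 2) for v in q])
```

    As connections are added, throughput saturates at 1/D_max (here 100/s) and the queue builds at the bottleneck station, which is the qualitative behavior such OSI-layer models predict for processor contention.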

  18. Performance evaluation:= (process algebra + model checking) x Markov chains

    NARCIS (Netherlands)

    Hermanns, H.; Larsen, K.G.; Nielsen, Mogens; Katoen, Joost P.

    2001-01-01

    Markov chains are widely used in practice to determine system performance and reliability characteristics. The vast majority of applications considers continuous-time Markov chains (CTMCs). This tutorial paper shows how successful model specification and analysis techniques from concurrency theory

  19. Thermal Model Predictions of Advanced Stirling Radioisotope Generator Performance

    Science.gov (United States)

    Wang, Xiao-Yen J.; Fabanich, William Anthony; Schmitz, Paul C.

    2014-01-01

    This presentation describes the capabilities of a three-dimensional thermal power model of the Advanced Stirling Radioisotope Generator (ASRG). The performance of the ASRG is presented for different scenarios, such as a Venus flyby with or without the auxiliary cooling system.

  20. Practical Techniques for Modeling Gas Turbine Engine Performance

    Science.gov (United States)

    Chapman, Jeffryes W.; Lavelle, Thomas M.; Litt, Jonathan S.

    2016-01-01

    The cost and risk associated with the design and operation of gas turbine engine systems has led to an increasing dependence on mathematical models. In this paper, the fundamentals of engine simulation will be reviewed, an example performance analysis will be performed, and relationships useful for engine control system development will be highlighted. The focus will be on thermodynamic modeling utilizing techniques common in industry, such as: the Brayton cycle, component performance maps, map scaling, and design point criteria generation. In general, these topics will be viewed from the standpoint of an example turbojet engine model; however, demonstrated concepts may be adapted to other gas turbine systems, such as gas generators, marine engines, or high bypass aircraft engines. The purpose of this paper is to provide an example of gas turbine model generation and system performance analysis for educational uses, such as curriculum creation or student reference.
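
    The Brayton-cycle starting point mentioned above yields the familiar ideal thermal efficiency, which depends only on pressure ratio under cold-air-standard assumptions. A spot calculation:

```python
# Ideal Brayton-cycle thermal efficiency: eta = 1 - r_p^(-(gamma-1)/gamma)
# (cold-air-standard assumptions: ideal gas, constant gamma, isentropic components)
GAMMA = 1.4   # ratio of specific heats for air

def brayton_efficiency(pressure_ratio):
    return 1.0 - pressure_ratio ** (-(GAMMA - 1.0) / GAMMA)

for pr in (5, 10, 20):
    print(pr, round(brayton_efficiency(pr), 3))
```

    Real-engine analyses then layer on component performance maps and map scaling, as the paper describes, but this closed form is the design-point sanity check.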

  1. Testing algorithms for a passenger train braking performance model.

    Science.gov (United States)

    2011-09-01

    "The Federal Railroad Administrations Office of Research and Development funded a project to establish performance model to develop, analyze, and test positive train control (PTC) braking algorithms for passenger train operations. With a good brak...

  2. Dynamic vehicle model for handling performance using experimental data

    Directory of Open Access Journals (Sweden)

    SangDo Na

    2015-11-01

    An analytical vehicle model is essential for the development of vehicle design and performance. Vehicle models have different complexities, assumptions and limitations depending on the type of vehicle analysis. An accurate full vehicle model is essential to represent the behaviour of the vehicle and to estimate vehicle dynamic system performance such as ride comfort and handling. An experimental vehicle model is developed in this article, which employs experimental kinematic and compliance data measured between the wheel and chassis. From these data, a vehicle model which includes dynamic effects due to vehicle geometry changes has been developed. The experimental vehicle model was validated using an instrumented experimental vehicle and data such as a step-change steering input. This article shows a process to develop and validate an experimental vehicle model that enhances the accuracy of handling performance predictions through a precise suspension model derived from measured vehicle data. The experimental force data obtained from a suspension parameter measuring device are employed for precise modelling of the steering and handling response. The steering system is modelled by a lumped model, with stiffness coefficients defined and identified by comparison with the measured steering stiffness. The outputs, specifically the yaw rate and lateral acceleration of the vehicle, are verified by experimental results.

  3. Evaluation Model of Organizational Performance for Small and Medium Enterprises

    Directory of Open Access Journals (Sweden)

    Carlos Augusto Passos

    2014-12-01

    In the 1980s, many tools for evaluating organizational performance were created. However, most of them are useful only to large companies and do not foster results in small and medium-sized enterprises (SMEs). In light of this fact, this article aims to propose an Organizational Performance Assessment (OPA) model which is flexible and adaptable to the reality of SMEs, based on the theoretical framework of various models and on comparisons against three major authors' criteria for evaluating OPA models. The research has a descriptive and exploratory character, with a qualitative nature. The MADE-O model, according to the criteria described in the bibliography, is the one that best fits the needs of SMEs, and it is used as a baseline for the model proposed in this study, with adaptations pertaining to the BSC model. The proposed model, called the Overall Performance Indicator – Environment (IDG-E), has as its main differential, in addition to the base models mentioned above, the assessment of the external and internal environment weighted in modules of OPA. As SMEs are characterized by having few processes and people, the small number of performance indicators is another positive aspect. Evaluated against the criteria subscribed by the authors, the model proved to be quite feasible for use in SMEs.

  4. Model of service-oriented catering supply chain performance evaluation

    OpenAIRE

    Gou, Juanqiong; Shen, Guguan; Chai, Rui

    2013-01-01

    Purpose: The aim of this paper is to construct a performance evaluation model for the service-oriented catering supply chain. Design/methodology/approach: Based on research into the current situation of the catering industry, this paper summarizes the characteristics of the catering supply chain and then presents a service-oriented catering supply chain model based on a platform of logistics and information. Finally, the fuzzy AHP method is used to evaluate the performance of the service-oriented catering ...
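
    For the evaluation step, the AHP family of methods derives criterion weights from a pairwise-comparison matrix. The sketch below uses the crisp geometric-mean variant with made-up comparisons of three supply-chain criteria; the paper's fuzzy AHP adds a fuzzification/defuzzification layer on top of this idea:

```python
# Crisp AHP weights via the geometric-mean method (a simplification of fuzzy AHP).
# Rows/columns: service quality, cost, delivery -- comparison values are illustrative.
comparisons = [
    [1.0,   3.0,   5.0],   # service quality is moderately-to-strongly preferred
    [1 / 3, 1.0,   2.0],
    [1 / 5, 1 / 2, 1.0],
]

# Geometric mean of each row, then normalize to get priority weights
geo_means = [(row[0] * row[1] * row[2]) ** (1 / 3) for row in comparisons]
total = sum(geo_means)
weights = [g / total for g in geo_means]
print([round(w, 3) for w in weights])
```

    The resulting weights then multiply the scored performance indicators to produce the overall supply-chain evaluation.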

  5. ASSESSING INDIVIDUAL PERFORMANCE ON INFORMATION TECHNOLOGY ADOPTION: A NEW MODEL

    OpenAIRE

    Diah Hari Suryaningrum

    2012-01-01

    This paper aims to propose a new model for assessing individual performance in information technology adoption. The new model was derived from two different theories: the decomposed theory of planned behavior and task-technology fit theory. Although many researchers have tried to expand these theories, some of their efforts might lack sound theoretical assumptions. To overcome this problem and enhance the coherence of the integration, I used a theory from social scien...

  6. Performance of Air Pollution Models on Massively Parallel Computers

    DEFF Research Database (Denmark)

    Brown, John; Hansen, Per Christian; Wasniewski, Jerzy

    1996-01-01

    To compare the performance and use of three massively parallel SIMD computers, we implemented a large air pollution model on the computers. Using a realistic large-scale model, we gain detailed insight into the performance of the three computers when used to solve large-scale scientific problems that involve several types of numerical computations. The computers considered in our study are the Connection Machines CM-200 and CM-5, and the MasPar MP-2216.

  7. A Mathematical Model to Improve the Performance of Logistics Network

    Directory of Open Access Journals (Sweden)

    Muhammad Izman Herdiansyah

    2012-01-01

    The role of logistics nowadays is expanding from just providing transportation and warehousing to offering total integrated logistics. To remain competitive in the global market environment, business enterprises need to improve their logistics operations performance. The improvement will be achieved when we can provide a comprehensive analysis and optimize network performance. In this paper, a mixed integer linear model for optimizing logistics network performance is developed. It provides a single-product multi-period multi-facility model, as well as the multi-product concept. The problem is modeled in the form of a network flow problem with the main objective of minimizing total logistics cost. The problem can be solved using a commercial linear programming package like CPLEX or LINDO. For small cases, the solver in Excel may also be used to solve such a model. Keywords: logistics network, integrated model, mathematical programming, network optimization
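
    For small instances, a network-flow formulation like the one described can even be solved without a commercial solver. Below, a successive-shortest-path minimum-cost-flow routine routes 12 units from a plant through two distribution centers to a customer (node numbering, capacities, and unit costs are all made up for illustration):

```python
# Minimum-cost flow by successive shortest paths (Bellman-Ford on the residual graph).
def min_cost_flow(n, arcs, s, t, flow_target):
    """arcs: [u, v, capacity, unit_cost]. Returns (flow_delivered, total_cost)."""
    graph = []                      # flat edge list; graph[e] = [to, cap, cost]
    adj = [[] for _ in range(n)]    # adj[u] = edge indices leaving u
    for u, v, cap, cost in arcs:
        adj[u].append(len(graph)); graph.append([v, cap, cost])
        adj[v].append(len(graph)); graph.append([u, 0, -cost])   # residual arc
    flow, total_cost = 0, 0
    while flow < flow_target:
        # cheapest augmenting path (Bellman-Ford handles negative residual costs)
        dist = [float("inf")] * n
        prev = [-1] * n
        dist[s] = 0
        for _ in range(n - 1):
            for u in range(n):
                if dist[u] == float("inf"):
                    continue
                for e in adj[u]:
                    v, cap, cost = graph[e]
                    if cap > 0 and dist[u] + cost < dist[v]:
                        dist[v] = dist[u] + cost
                        prev[v] = e
        if dist[t] == float("inf"):
            break                   # demand cannot be fully met
        push, v = flow_target - flow, t
        while v != s:               # bottleneck capacity along the path
            push = min(push, graph[prev[v]][1]); v = graph[prev[v] ^ 1][0]
        v = t
        while v != s:               # apply the augmentation to the residual graph
            graph[prev[v]][1] -= push; graph[prev[v] ^ 1][1] += push
            v = graph[prev[v] ^ 1][0]
        flow += push
        total_cost += push * dist[t]
    return flow, total_cost

# Plant (0) ships via two DCs (1, 2) to a customer (3); 12 units demanded.
arcs = [[0, 1, 10, 4], [0, 2, 10, 6], [1, 3, 8, 3], [2, 3, 8, 2]]
print(min_cost_flow(4, arcs, s=0, t=3, flow_target=12))   # (12, 88)
```

    The full multi-period, multi-product model of the paper adds time and commodity indices to the same flow-conservation backbone, which is where an LP/MIP solver becomes necessary.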

  8. Construction Of A Performance Assessment Model For Zakat Management Institutions

    Directory of Open Access Journals (Sweden)

    Sri Fadilah

    2016-12-01

    The objective of the research is to examine performance evaluation using a Balanced Scorecard model. The research is motivated by the large gap between the potential of zakat (alms and religious tax in Islam), estimated at as much as 217 trillion rupiahs, and the realized zakat collection of only three trillion. This indicates that the performance of zakat management organizations in collecting zakat is still very low; on the other hand, the quantity and the quality of zakat management organizations have to be improved, which means a performance evaluation model is needed as an evaluation tool. The aim is therefore to construct a performance evaluation model that can be implemented by zakat management organizations. Organizational performance evaluation with the Balanced Scorecard model will be effective if it is supported by three aspects, namely PI, BO and TQM. This research uses an explanatory method and the data analysis tool of SEM/PLS. Data collection techniques are questionnaires, interviews and documentation. The result of this research shows that PI, BO and TQM, simultaneously and partially, have a significant effect on organizational performance.

  9. Team performance modeling for HRA in dynamic situations

    International Nuclear Information System (INIS)

    Shu Yufei; Furuta, Kazuo; Kondo, Shunsuke

    2002-01-01

    This paper proposes a team behavior network model that can simulate and analyze the response of an operator team to an incident in a dynamic and context-sensitive situation. The model is composed of four sub-models, which describe the context of team performance: a task model, an event model, a team model and a human-machine interface model. Each operator demonstrates aspects of his/her specific cognitive behavior and interacts with other operators and the environment in order to deal with an incident. Individual human factors, which determine the basis of communication and interaction between individuals, and the cognitive processes of an operator, such as information acquisition, state recognition, decision-making and action execution during the development of an event scenario, are modeled. A case of feed-and-bleed operation in a pressurized water reactor under an emergency situation was studied, and the result was compared with an experiment to check the validity of the proposed model.

  10. Port performance evaluation tool based on microsimulation model

    Directory of Open Access Journals (Sweden)

    Tsavalista Burhani Jzolanda

    2017-01-01

    As port performance is becoming correlated with national competitiveness, the issue of port performance evaluation has gained significance. Port performance can simply be indicated by port service levels to ships (e.g., throughput, waiting time for berthing, etc.) as well as the utilization level of equipment and facilities within a certain period. The performance evaluation can then be used as a tool to develop related policies for improving the port's performance to be more effective and efficient. However, the evaluation is frequently conducted based on a deterministic approach, which hardly captures the natural variations of port parameters. Therefore, this paper presents a stochastic microsimulation model for investigating the impacts of port parameter variations on port performance. The variations are derived from actual data in order to provide more realistic results. The model is developed using MATLAB and Simulink based on queuing theory.
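
    The core of such a berth microsimulation can be sketched in a few lines: ships arrive at random intervals and queue for a limited number of berths, and the mean waiting time emerges from the stochastic parameters. This toy M/M/c-style version uses illustrative rates, not the paper's MATLAB/Simulink model:

```python
import random

# Toy berth queue: exponential interarrival and service times, FIFO assignment
# of each ship to the earliest-available berth (equivalent to a single M/M/c queue).
def simulate_port(n_ships, n_berths, mean_interarrival_h, mean_service_h, seed=1):
    rng = random.Random(seed)
    berth_free_at = [0.0] * n_berths      # time each berth next becomes available
    t, total_wait = 0.0, 0.0
    for _ in range(n_ships):
        t += rng.expovariate(1.0 / mean_interarrival_h)          # next arrival
        berth = min(range(n_berths), key=lambda b: berth_free_at[b])
        start = max(t, berth_free_at[berth])                     # wait if occupied
        total_wait += start - t
        berth_free_at[berth] = start + rng.expovariate(1.0 / mean_service_h)
    return total_wait / n_ships           # mean waiting time per ship, hours

# Two berths, a ship every ~6 h on average, ~10 h average service (illustrative)
print(round(simulate_port(20000, 2, 6.0, 10.0), 1))
```

    At these rates the berths are about 83% utilized, and the simulated mean wait lands near the Erlang-C analytical value of roughly 23 hours, the kind of sensitivity-to-variation result a deterministic evaluation would miss.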

  11. Toward a Subjective Measurement Model for Firm Performance

    Directory of Open Access Journals (Sweden)

    Luiz Artur Ledur Brito

    2012-05-01

    Firm performance is a relevant construct in strategic management research and is frequently used as a dependent variable. Despite this relevance, there is hardly a consensus about its definition, dimensionality and measurement, which limits advances in research and understanding of the concept. This article proposes and tests a measurement model for firm performance based on subjective indicators. The model is grounded in stakeholder theory and a review of empirical articles. Confirmatory factor analyses, using data from 116 Brazilian senior managers, were used to test its fit and psychometric properties. The final model had six first-order dimensions: profitability, growth, customer satisfaction, employee satisfaction, social performance, and environmental performance. A second-order financial performance construct, influencing growth and profitability, correlated with the first-order intercorrelated, non-financial dimensions. Results suggest dimensions cannot be used interchangeably, since they represent different aspects of firm performance, and corroborate the idea that stakeholders have different demands that need to be managed independently. Researchers and practitioners may use the model to fully treat performance in empirical studies and to understand the impact of strategies on multiple performance facets.

  12. A Composite Model for Employees' Performance Appraisal and Improvement

    Science.gov (United States)

    Manoharan, T. R.; Muralidharan, C.; Deshmukh, S. G.

    2012-01-01

    Purpose: The purpose of this paper is to develop an innovative method of performance appraisal that will be useful for designing a structured training programme. Design/methodology/approach: Employees' performance appraisals are conducted using new approaches, namely data envelopment analysis and an integrated fuzzy model. Interpretive structural…

  13. A Model for Effective Performance in the Indonesian Navy.

    Science.gov (United States)

    1987-06-01

    NAVY LEADERSHIP AND MANAGEMENT COMPETENCY MODEL ... MCBER COMPETENT MANAGERS MODEL ... leadership and managerial skills which emphasize effective performance of the officers in managing the human resources under their command and supervision. By effective performance we mean officers who not only know about management theories, but who possess the characteristics, knowledge, skill, and

  14. Discussion of various models related to cloud performance

    OpenAIRE

    Kande, Chaitanya Krishna

    2015-01-01

    This paper discusses the various models related to cloud computing. Knowing the metrics related to infrastructure is very critical to enhance the performance of cloud services. Various metrics related to clouds such as pageview response time, admission control and enforcing elasticity to cloud infrastructure are very crucial in analyzing the characteristics of the cloud to enhance the cloud performance.

  15. Performance Implications of Business Model Change: A Case Study

    Directory of Open Access Journals (Sweden)

    Jana Poláková

    2015-01-01

    Full Text Available The paper deals with the change in performance level introduced by a change of business model. The selected case is a small family business undergoing substantial changes in response to structural changes in its markets. The authors use the business model concept to describe the value creation processes within the selected family business; by contrasting these processes before and after the change, they demonstrate the role of the business model as a performance differentiator. This is illustrated with business model canvases constructed on the basis of interviews, observations and document analysis. The two canvases allow an explanation of the cause-and-effect relationships within the business that led to the change in performance. The change in performance is assessed by financial analysis of the business over the period 2006–2012: ROA, ROE and ROS reached their lowest levels before the change of business model was introduced and grew after its introduction, with similar developments in the activity indicators of the family business. The described case study contributes to the concept of business modeling with arguments supporting its value as a strategic tool facilitating decisions related to value creation within the business.

  17. Using Trust to Establish a Secure Routing Model in Cognitive Radio Network

    Science.gov (United States)

    Zhang, Guanghua; Chen, Zhenguo; Tian, Liqin; Zhang, Dongwen

    2015-01-01

    To counter the selective forwarding attack on routing in cognitive radio networks, this paper proposes a trust-based secure routing model. By monitoring nodes’ forwarding behaviors, trust values are constructed to identify malicious nodes. Since routing selection must be closely coordinated with spectrum allocation, a route request piggybacking the available spectrum opportunities is sent to non-malicious nodes. In the routing decision phase, node trusts are used to construct available path trusts, which are combined with delay measurements to make routing decisions. At the same time, different responses are made to nodes’ service requests according to their trust classification. Adopting stricter punishment of malicious behaviors by non-trusted nodes stimulates the cooperation of nodes in routing. Simulation results and analysis indicate that this model performs well in network throughput and end-to-end delay under the selective forwarding attack. PMID:26421843
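    The trust mechanism described in the abstract can be sketched in a few lines. This is a hedged illustration, not the authors' algorithm: the beta-reputation trust update, the multiplicative path trust, and the trust floor of 0.5 are assumptions chosen for clarity.

```python
def update_trust(forwarded, dropped):
    """Beta-reputation trust from observed forwarding behaviour."""
    return (forwarded + 1) / (forwarded + dropped + 2)

def path_trust(path, trust):
    """Trust of a path as the product of its nodes' trusts."""
    t = 1.0
    for node in path:
        t *= trust[node]
    return t

def select_route(paths, trust, delay, trust_floor=0.5):
    """Exclude paths containing low-trust nodes, then prefer the most
    trusted path, breaking ties by lower measured delay."""
    candidates = [p for p in paths if all(trust[n] >= trust_floor for n in p)]
    if not candidates:
        return None
    return min(candidates, key=lambda p: (-path_trust(p, trust), delay[tuple(p)]))
```

    For example, a node observed to drop 8 of 10 packets gets trust (2 + 1)/(10 + 2) = 0.25 and is excluded, so a longer but trustworthy path is selected instead.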

  18. Conceptual Modeling of Performance Indicators of Higher Education Institutions

    OpenAIRE

    Kahveci, Tuba Canvar; Taşkın, Harun; Toklu, Merve Cengiz

    2013-01-01

    Measuring and analyzing any type of organization is carried out by different actors within it. The performance indicators of a performance management system multiply with the organization's products or services, and these indicators should be defined for all levels of the organization. All of these characteristics make the performance evaluation process more complex. In order to manage this complexity, the process should be modeled at the beginning...

  19. Gold-standard performance for 2D hydrodynamic modeling

    Science.gov (United States)

    Pasternack, G. B.; MacVicar, B. J.

    2013-12-01

    Two-dimensional, depth-averaged hydrodynamic (2D) models are emerging as an increasingly useful tool for environmental water resources engineering. One of the remaining technical hurdles to the wider adoption and acceptance of 2D modeling is the lack of standards for 2D model performance evaluation when the riverbed undulates, causing lateral flow divergence and convergence. The goal of this study was to establish a gold-standard that quantifies the upper limit of model performance for 2D models of undulating riverbeds when topography is perfectly known and surface roughness is well constrained. A review was conducted of published model performance metrics and the value ranges exhibited by models thus far for each one. Typically predicted velocity differs from observed by 20 to 30 % and the coefficient of determination between the two ranges from 0.5 to 0.8, though there tends to be a bias toward overpredicting low velocity and underpredicting high velocity. To establish a gold standard as to the best performance possible for a 2D model of an undulating bed, two straight, rectangular-walled flume experiments were done with no bed slope and only different bed undulations and water surface slopes. One flume tested model performance in the presence of a porous, homogenous gravel bed with a long flat section, then a linear slope down to a flat pool bottom, and then the same linear slope back up to the flat bed. The other flume had a PVC plastic solid bed with a long flat section followed by a sequence of five identical riffle-pool pairs in close proximity, so it tested model performance given frequent undulations. Detailed water surface elevation and velocity measurements were made for both flumes. Comparing predicted versus observed velocity magnitude for 3 discharges with the gravel-bed flume and 1 discharge for the PVC-bed flume, the coefficient of determination ranged from 0.952 to 0.987 and the slope for the regression line was 0.957 to 1.02. Unsigned velocity

  20. The Social Responsibility Performance Outcomes Model: Building Socially Responsible Companies through Performance Improvement Outcomes.

    Science.gov (United States)

    Hatcher, Tim

    2000-01-01

    Considers the role of performance improvement professionals and human resources development professionals in helping organizations realize the ethical and financial power of corporate social responsibility. Explains the social responsibility performance outcomes model, which incorporates the concepts of societal needs and outcomes. (LRW)

  1. Faculty Performance Evaluation: The CIPP-SAPS Model.

    Science.gov (United States)

    Mitcham, Maralynne

    1981-01-01

    The issues of faculty performance evaluation for allied health professionals are addressed. Daniel Stufflebeam's CIPP (context-input-process-product) model is introduced and its development into a CIPP-SAPS (self-administrative-peer-student) model is pursued. (Author/CT)

  2. Technical performance of percutaneous and laminectomy leads analyzed by modeling

    NARCIS (Netherlands)

    Manola, L.; Holsheimer, J.

    2004-01-01

    The objective of this study was to compare, by computer modeling, the technical performance of laminectomy and percutaneous spinal cord stimulation leads with similar contact spacing. Monopolar and tripolar (guarded cathode) stimulation with both lead types was simulated in a low-thoracic spine model.

  3. Neuro-fuzzy model for evaluating the performance of processes ...

    Indian Academy of Sciences (India)

    CHIDOZIE CHUKWUEMEKA NWOBI-OKOYE

    2017-11-16

    Nov 16, 2017 ... In this work an Adaptive Neuro-Fuzzy Inference System (ANFIS) was used to model the periodic performance of some ... Every node i in this layer is an adaptive node with a node function. Neuro-fuzzy model for ... spectral analysis and parameter optimization using genetic algorithm, the values of v10. and ...

  4. UNCONSTRAINED HANDWRITING RECOGNITION : LANGUAGE MODELS, PERPLEXITY, AND SYSTEM PERFORMANCE

    NARCIS (Netherlands)

    Marti, U-V.; Bunke, H.

    2004-01-01

    In this paper we present a number of language models and their behavior in the recognition of unconstrained handwritten English sentences. We use the perplexity to compare the different models and their prediction power, and relate it to the performance of a recognition system under different

  5. Mathematical Models of Elementary Mathematics Learning and Performance. Final Report.

    Science.gov (United States)

    Suppes, Patrick

    This project was concerned with the development of mathematical models of elementary mathematics learning and performance. Probabilistic finite automata and register machines with a finite number of registers were developed as models and extensively tested with data arising from the elementary-mathematics strand curriculum developed by the…

  6. Activity-Based Costing Model for Assessing Economic Performance.

    Science.gov (United States)

    DeHayes, Daniel W.; Lovrinic, Joseph G.

    1994-01-01

    An economic model for evaluating the cost performance of academic and administrative programs in higher education is described. Examples from its application at Indiana University-Purdue University Indianapolis are used to illustrate how the model has been used to control costs and reengineer processes. (Author/MSE)

  7. Longitudinal modeling in sports: young swimmers' performance and biomechanics profile.

    Science.gov (United States)

    Morais, Jorge E; Marques, Mário C; Marinho, Daniel A; Silva, António J; Barbosa, Tiago M

    2014-10-01

    New theories about dynamical systems highlight the multi-factorial interplay among the factors that determine higher sports performance, including in swimming. Longitudinal research provides useful information on swimmers' changes and on how training helps them to excel. These questions may be addressed in one single procedure such as latent growth modeling. The aim of the study was to model a latent growth curve of young swimmers' performance and biomechanics over a season. Fourteen boys (12.33 ± 0.65 years old) and 16 girls (11.15 ± 0.55 years old) were evaluated. Performance, stroke frequency, speed fluctuation, arm's propelling efficiency, active drag, active drag coefficient and power to overcome drag were collected at four different moments of the season. Latent growth curve modeling was computed to understand the longitudinal variation of performance (endogenous variables) over the season according to the biomechanics (exogenous variables). Latent growth curve modeling showed a high inter- and intra-subject variability in the performance growth. Gender had a significant effect at the baseline and during the performance growth. In each evaluation moment, different variables had a meaningful effect on performance (M1: Da, β = -0.62; M2: Da, β = -0.53; M3: ηp, β = 0.59; M4: SF, β = -0.57; all P < .001). The models' goodness-of-fit was 1.40 ⩽ χ²/df ⩽ 3.74 (good-reasonable). Latent modeling is a comprehensive way to gather insight into young swimmers' performance over time. Different variables were mainly responsible for the performance improvement. A gender gap and intra- and inter-subject variability were verified. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Impact of Pointing Errors on the Performance of Mixed RF/FSO Dual-Hop Transmission Systems

    KAUST Repository

    Ansari, Imran Shafique

    2013-02-20

    In this work, the performance analysis of a dual-hop relay transmission system composed of asymmetric radio-frequency (RF)/free-space optical (FSO) links with pointing errors is presented. More specifically, we build on the system model presented in [1] to derive new exact closed-form expressions for the cumulative distribution function, probability density function, moment generating function, and moments of the end-to-end signal-to-noise ratio in terms of the Meijer's G function. We then capitalize on these results to offer new exact closed-form expressions for the higher-order amount of fading, average error rate for binary and M-ary modulation schemes, and the ergodic capacity, all in terms of Meijer's G functions. Our new analytical results were also verified via computer-based Monte-Carlo simulation results.

  9. Comparison of the performance of net radiation calculation models

    DEFF Research Database (Denmark)

    Kjærsgaard, Jeppe Hvelplund; Cuenca, R.H.; Martinez-Cob, A.

    2009-01-01

    Daily values of net radiation are used in many applications of crop-growth modeling and agricultural water management. Measurements of net radiation are not part of the routine measurement program at many weather stations and are commonly estimated based on other meteorological parameters. Daily....... The performance of the empirical models was nearly identical at all sites. Since the empirical models were easier to use and simpler to calibrate than the physically based models, the results indicate that the empirical models can be used as a good substitute for the physically based ones when available...

  10. Review of Methods for Buildings Energy Performance Modelling

    Science.gov (United States)

    Krstić, Hrvoje; Teni, Mihaela

    2017-10-01

    Research presented in this paper gives a brief review of methods used for modelling the energy performance of buildings. It also gives a comprehensive review of the advantages and disadvantages of the available methods, as well as the input parameters used for modelling buildings' energy performance. The European EPBD Directive obliges the implementation of an energy certification procedure, which gives an insight into buildings' energy performance via existing energy certificate databases. Some of the methods for modelling buildings' energy performance mentioned in this paper are developed by employing data sets of buildings which have already undergone an energy certification procedure. Such a database is used in this paper; the majority of buildings in the database have already undergone some form of partial retrofitting – replacement of windows or installation of thermal insulation – but still have poor energy performance. The case study presented in this paper utilizes an energy certificates database obtained from residential units in Croatia (over 400 buildings) in order to determine the dependence between buildings' energy performance and variables from the database by using statistical dependency tests. Building energy performance in the database is presented by a building energy efficiency rate (from A+ to G), which is based on the specific annual energy needed for heating for referential climatic data [kWh/(m2a)]. Independent variables in the database are the surfaces and volume of the conditioned part of the building, building shape factor, energy used for heating, CO2 emission, building age and year of reconstruction. The research results presented in this paper give an insight into the possibilities of the methods used for modelling buildings' energy performance. They further provide an analysis of the dependencies between buildings' energy performance as a dependent variable and the independent variables from the database. The presented results could be used for development of new building energy performance

  11. An analytical model of the HINT performance metric

    Energy Technology Data Exchange (ETDEWEB)

    Snell, Q.O.; Gustafson, J.L. [Scalable Computing Lab., Ames, IA (United States)

    1996-10-01

    The HINT benchmark was developed to provide a broad-spectrum metric for computers and to measure performance over the full range of memory sizes and time scales. We have extended our understanding of why HINT performance curves look the way they do and can now predict the curves using an analytical model based on simple hardware specifications as input parameters. Conversely, by fitting the experimental curves with the analytical model, hardware specifications such as memory performance can be inferred to provide insight into the nature of a given computer system.

  12. Disaggregation of Rainy Hours: Compared Performance of Various Models.

    Science.gov (United States)

    Ben Haha, M.; Hingray, B.; Musy, A.

In the urban environment, the response times of catchments are usually short. To design or to diagnose waterworks in that context, it is necessary to describe rainfall events with a good time resolution: a 10mn time step is often necessary. Such information is not always available. Rainfall disaggregation models have thus to be applied to produce that short-time-resolution information from rough rainfall data. The communication will present the performance obtained with several rainfall disaggregation models that allow for the disaggregation of rainy hours into six 10mn rainfall amounts. The ability of the models to reproduce some statistical characteristics of rainfall (mean, variance, overall distribution of 10mn-rainfall amounts; extreme values of maximal rainfall amounts over different durations) is evaluated using different graphical and numerical criteria. The performance of simple models presented in some scientific papers or developed in the Hydram laboratory, as well as the performance of more sophisticated ones, is compared with the performance of the basic constant disaggregation model. The compared models are either deterministic or stochastic; for some of them the disaggregation is based on scaling properties of rainfall. The compared models are, in increasing order of complexity: constant model, linear model (Ben Haha, 2001), Ormsbee Deterministic model (Ormsbee, 1989), Artificial Neural Network based model (Burian et al. 2000), Hydram Stochastic 1 and Hydram Stochastic 2 (Ben Haha, 2001), Multiplicative Cascade based model (Olsson and Berndtsson, 1998), Ormsbee Stochastic model (Ormsbee, 1989). The 625 rainy hours used for that evaluation (with an hourly rainfall amount greater than 5mm) were extracted from the 21-year chronological rainfall series (10mn time step) observed at the Pully meteorological station, Switzerland.
The models were also evaluated when applied to different rainfall classes depending on the season first and on the
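    As an illustration of the two simplest schemes in that complexity ordering, the sketch below implements a constant disaggregation of a rainy hour into six 10mn amounts and a hypothetical mass-conserving linear variant. The exact form of the linear model of Ben Haha (2001) is not given in the abstract, so the interpolation between neighbouring hourly totals is an assumption.

```python
def constant_disaggregate(hour_total, steps=6):
    """Split an hourly rainfall amount into equal 10mn amounts."""
    return [hour_total / steps] * steps

def linear_disaggregate(prev_total, hour_total, next_total, steps=6):
    """Hypothetical mass-conserving linear shape: weights interpolate from
    the previous hourly total to the next one, then are rescaled so the six
    10mn amounts sum exactly to the hour's amount."""
    w = [prev_total + (next_total - prev_total) * (i + 0.5) / steps
         for i in range(steps)]
    s = sum(w)
    if s <= 0:  # degenerate neighbourhood: fall back to the constant model
        return constant_disaggregate(hour_total, steps)
    return [hour_total * wi / s for wi in w]
```

    For an hour with 6 mm of rain between neighbours of 2 mm and 10 mm, the linear variant produces six increasing amounts that still sum to 6 mm.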

  13. A network application for modeling a centrifugal compressor performance map

    Science.gov (United States)

    Nikiforov, A.; Popova, D.; Soldatova, K.

    2017-08-01

    The approximation of aerodynamic performance of a centrifugal compressor stage and vaneless diffuser by neural networks is presented. Advantages, difficulties and specific features of the method are described. An example of a neural network and its structure is shown. The performances in terms of efficiency, pressure ratio and work coefficient of 39 model stages within the range of flow coefficient from 0.01 to 0.08 were modeled with mean squared error 1.5 %. In addition, the loss and friction coefficients of vaneless diffusers of relative widths 0.014-0.10 are modeled with mean squared error 2.45 %.
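    A performance-map approximation of this kind can be sketched with a one-hidden-layer network trained by plain gradient descent. The parabolic efficiency curve below is synthetic stand-in data, and the network size and learning rate are arbitrary choices, not the configuration used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a stage performance map: efficiency peaking near
# the design flow coefficient (real data would come from test rigs).
phi = np.linspace(0.01, 0.08, 60).reshape(-1, 1)   # flow coefficient
eta = 0.85 - 120.0 * (phi - 0.045) ** 2            # efficiency curve

x = (phi - phi.mean()) / phi.std()                 # normalise inputs

# One-hidden-layer network: 1 -> 8 -> 1, tanh activation.
W1 = rng.normal(0, 0.5, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

_, pred0 = forward(x)
loss0 = np.mean((pred0 - eta) ** 2)                # loss before training

lr = 0.05
for _ in range(2000):                              # batch gradient descent
    h, pred = forward(x)
    g = 2 * (pred - eta) / len(x)                  # dL/dpred
    gW2 = h.T @ g; gb2 = g.sum(0)
    gh = g @ W2.T * (1 - h ** 2)                   # back through tanh
    gW1 = x.T @ gh; gb1 = gh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(x)
loss = np.mean((pred - eta) ** 2)                  # loss after training
```

    The mean squared error drops well below the variance of the target curve, i.e. the network has learned the map's shape rather than just its mean level.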

  14. Assessment of performance of survival prediction models for cancer prognosis

    Directory of Open Access Journals (Sweden)

    Chen Hung-Chia

    2012-07-01

    Full Text Available Abstract Background Cancer survival studies are commonly analyzed using survival-time prediction models for cancer prognosis. A number of different performance metrics are used to ascertain the concordance between the predicted risk score of each patient and the actual survival time, but these metrics can sometimes conflict. Alternatively, patients are sometimes divided into two classes according to a survival-time threshold, and binary classifiers are applied to predict each patient’s class. Although this approach has several drawbacks, it does provide natural performance metrics such as positive and negative predictive values to enable unambiguous assessments. Methods We compare the survival-time prediction and survival-time threshold approaches to analyzing cancer survival studies. We review and compare common performance metrics for the two approaches. We present new randomization tests and cross-validation methods to enable unambiguous statistical inferences for several performance metrics used with the survival-time prediction approach. We consider five survival prediction models consisting of one clinical model, two gene expression models, and two models from combinations of clinical and gene expression models. Results A public breast cancer dataset was used to compare several performance metrics using five prediction models. (1) For some prediction models, the hazard ratio from fitting a Cox proportional hazards model was significant, but the two-group comparison was insignificant, and vice versa. (2) The randomization test and cross-validation were generally consistent with the p-values obtained from the standard performance metrics. (3) Binary classifiers highly depended on how the risk groups were defined; a slight change of the survival threshold for assignment of classes led to very different prediction results. Conclusions (1) Different performance metrics for evaluation of a survival prediction model may give different conclusions in
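    One standard concordance metric for the survival-time prediction approach, Harrell's C-index, is simple to compute from scratch. The sketch below uses the common convention that a pair of patients is comparable when the earlier of the two times is an observed event; ties in predicted risk count as half-concordant.

```python
def concordance_index(times, events, risk):
    """Harrell's C: fraction of comparable patient pairs in which the
    patient with the higher predicted risk experienced the event earlier.

    times  - observed survival/censoring times
    events - 1 if the event was observed, 0 if censored
    risk   - predicted risk scores (higher = worse prognosis)
    """
    num = den = 0.0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # pair is comparable if patient i had an observed event first
            if events[i] and times[i] < times[j]:
                den += 1
                if risk[i] > risk[j]:
                    num += 1
                elif risk[i] == risk[j]:
                    num += 0.5
    return num / den
```

    A perfectly concordant risk ordering yields C = 1.0, a perfectly reversed one yields 0.0, and random scores hover around 0.5, which is one reason two models can disagree under different metrics while both sitting near the middle of this scale.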

  15. Hybrid Corporate Performance Prediction Model Considering Technical Capability

    Directory of Open Access Journals (Sweden)

    Joonhyuck Lee

    2016-07-01

    Full Text Available Many studies have tried to predict corporate performance and stock prices to enhance investment profitability using qualitative approaches such as the Delphi method. However, developments in data processing technology and machine-learning algorithms have resulted in efforts to develop quantitative prediction models in various managerial subject areas. We propose a quantitative corporate performance prediction model that applies the support vector regression (SVR algorithm to solve the problem of the overfitting of training data and can be applied to regression problems. The proposed model optimizes the SVR training parameters based on the training data, using the genetic algorithm to achieve sustainable predictability in changeable markets and managerial environments. Technology-intensive companies represent an increasing share of the total economy. The performance and stock prices of these companies are affected by their financial standing and their technological capabilities. Therefore, we apply both financial indicators and technical indicators to establish the proposed prediction model. Here, we use time series data, including financial, patent, and corporate performance information of 44 electronic and IT companies. Then, we predict the performance of these companies as an empirical verification of the prediction performance of the proposed model.
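    The genetic-algorithm hyper-parameter search described above can be sketched compactly. To stay self-contained, this illustration tunes the regularisation strength of a ridge-regression stand-in rather than true SVR parameters; the synthetic indicator data, population size, and mutation scale are all assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for financial + technical indicators -> performance.
X = rng.normal(size=(120, 6))
true_w = np.array([1.5, -2.0, 0.0, 0.5, 0.0, 1.0])
y = X @ true_w + rng.normal(scale=0.5, size=120)
X_tr, y_tr, X_va, y_va = X[:80], y[:80], X[80:], y[80:]

def ridge_fit(X, y, lam):
    """Closed-form ridge regression weights."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def fitness(log_lam):
    """Negative validation MSE of the model trained with lambda = 10**log_lam."""
    w = ridge_fit(X_tr, y_tr, 10.0 ** log_lam)
    return -np.mean((X_va @ w - y_va) ** 2)

# Tiny genetic algorithm over the single hyper-parameter log10(lambda).
pop = rng.uniform(-4, 4, size=12)
for _ in range(30):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-6:]]               # keep the best half
    children = parents + rng.normal(scale=0.3, size=6)   # mutate
    pop = np.concatenate([parents, children])

best = pop[np.argmax([fitness(p) for p in pop])]
```

    The same select-and-mutate loop applies unchanged when `fitness` instead cross-validates an SVR with candidate (C, epsilon, gamma) triples, which is the configuration the paper describes.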

  16. Facial Performance Transfer via Deformable Models and Parametric Correspondence.

    Science.gov (United States)

    Asthana, Akshay; de la Hunty, Miles; Dhall, Abhinav; Goecke, Roland

    2012-09-01

    The issue of transferring facial performance from one person's face to another's has been an area of interest for the movie industry and the computer graphics community for quite some time. In recent years, deformable face models, such as the Active Appearance Model (AAM), have made it possible to track and synthesize faces in real time. Not surprisingly, deformable face model-based approaches for facial performance transfer have gained tremendous interest in the computer vision and graphics community. In this paper, we focus on the problem of real-time facial performance transfer using the AAM framework. We propose a novel approach of learning the mapping between the parameters of two completely independent AAMs, using them to facilitate the facial performance transfer in a more realistic manner than previous approaches. The main advantage of modeling this parametric correspondence is that it allows a "meaningful" transfer of both the nonrigid shape and texture across faces irrespective of the speakers' gender, shape, and size of the faces, and illumination conditions. We explore linear and nonlinear methods for modeling the parametric correspondence between the AAMs and show that the sparse linear regression method performs the best. Moreover, we show the utility of the proposed framework for a cross-language facial performance transfer that is an area of interest for the movie dubbing industry.
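    In the linear case the authors found best, learning a parametric correspondence between two models reduces to a regularised least-squares mapping between parameter vectors. The sketch below uses synthetic stand-in data for the two AAMs' parameters; the dimensions and noise level are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: per-frame parameter vectors of two independent
# AAMs, related by an unknown linear map plus a little noise.
A_true = rng.normal(size=(5, 4))               # ground-truth mapping
src = rng.normal(size=(200, 5))                # source actor's AAM parameters
dst = src @ A_true + rng.normal(scale=0.01, size=(200, 4))

# Learn the mapping by ridge-regularised least squares.
lam = 1e-3
A_hat = np.linalg.solve(src.T @ src + lam * np.eye(5), src.T @ dst)

# Transfer: map a new source performance into the target's parameter space,
# which the target AAM would then render as shape and texture.
new_src = rng.normal(size=(1, 5))
transferred = new_src @ A_hat
```

    With paired training frames, the recovered map closely matches the true one; the paper's nonlinear variants replace the single matrix with a richer regressor but keep this same fit-then-transfer structure.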

  17. Real-time individualization of the unified model of performance.

    Science.gov (United States)

    Liu, Jianbo; Ramakrishnan, Sridhar; Laxminarayan, Srinivas; Balkin, Thomas J; Reifman, Jaques

    2017-12-01

    Existing mathematical models for predicting neurobehavioural performance are not suited for mobile computing platforms because they cannot adapt model parameters automatically in real time to reflect individual differences in the effects of sleep loss. We used an extended Kalman filter to develop a computationally efficient algorithm that continually adapts the parameters of the recently developed Unified Model of Performance (UMP) to an individual. The algorithm accomplishes this in real time as new performance data for the individual become available. We assessed the algorithm's performance by simulating real-time model individualization for 18 subjects subjected to 64 h of total sleep deprivation (TSD) and 7 days of chronic sleep restriction (CSR) with 3 h of time in bed per night, using psychomotor vigilance task (PVT) data collected every 2 h during wakefulness. This UMP individualization process produced parameter estimates that progressively approached the solution produced by a post-hoc fitting of model parameters using all data. The minimum number of PVT measurements needed to individualize the model parameters depended upon the type of sleep-loss challenge, with ~30 required for TSD and ~70 for CSR. However, model individualization depended upon the overall duration of data collection, yielding increasingly accurate model parameters with greater number of days. Interestingly, reducing the PVT sampling frequency by a factor of two did not notably hamper model individualization. The proposed algorithm facilitates real-time learning of an individual's trait-like responses to sleep loss and enables the development of individualized performance prediction models for use in a mobile computing platform. © 2017 European Sleep Research Society.
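    The recursive parameter-adaptation idea can be illustrated with a scalar Kalman filter on a toy linear observation model (the UMP itself is nonlinear, hence the authors' extended Kalman filter). Everything below, including the linear lapses-versus-hours-awake model and the noise variances, is an assumption made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stand-in: observed PVT lapses grow linearly with hours awake, with an
# individual slope theta (the trait-like parameter to learn online).
theta_true = 0.8
hours = np.arange(2, 66, 2, dtype=float)   # a PVT every 2 h across 64 h TSD
obs = theta_true * hours + rng.normal(scale=2.0, size=len(hours))

theta = 0.4        # population-average prior for the individual's slope
P = 1.0            # prior variance of theta
Q, R = 1e-4, 4.0   # process and measurement noise variances (assumed)

for h, z in zip(hours, obs):
    P += Q                           # predict: parameter may drift slightly
    H = h                            # observation Jacobian d(obs)/d(theta)
    K = P * H / (H * P * H + R)      # Kalman gain
    theta += K * (z - H * theta)     # update with the innovation
    P *= (1 - K * H)                 # posterior variance shrinks
```

    Each new measurement pulls the population-average prior toward the individual's true slope while the posterior variance `P` contracts, mirroring how the UMP parameters are progressively individualized as PVT data accumulate.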

  18. Observer analysis and its impact on task performance modeling

    Science.gov (United States)

    Jacobs, Eddie L.; Brown, Jeremy B.

    2014-05-01

    Fire fighters use relatively low cost thermal imaging cameras to locate hot spots and fire hazards in buildings. This research describes the analyses performed to study the impact of thermal image quality on fire fighter fire hazard detection task performance. Using human perception data collected by the National Institute of Standards and Technology (NIST) for fire fighters detecting hazards in a thermal image, an observer analysis was performed to quantify the sensitivity and bias of each observer. Using this analysis, the subjects were divided into three groups representing three different levels of performance. The top-performing group was used for the remainder of the modeling. Models were developed which related image quality factors such as contrast, brightness, spatial resolution, and noise to task performance probabilities. The models were fitted to the human perception data using logistic regression, as well as probit regression. Probit regression was found to yield superior fits and showed that models with not only 2nd order parameter interactions, but also 3rd order parameter interactions performed the best.
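    A logistic-regression fit of detection probability against an image-quality factor, of the kind described above, can be sketched with plain gradient ascent. The contrast-driven synthetic perception data below are an assumption, not the NIST data, and the single-predictor model omits the interaction terms the study examined.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic perception data: detection more likely at higher contrast.
contrast = rng.uniform(0.0, 1.0, 400)
p_true = 1.0 / (1.0 + np.exp(-(6.0 * contrast - 3.0)))
detected = (rng.uniform(size=400) < p_true).astype(float)

# Fit P(detect | contrast) by maximising the log-likelihood.
w, b = 0.0, 0.0
lr = 0.5
for _ in range(3000):
    p = 1.0 / (1.0 + np.exp(-(w * contrast + b)))
    g_w = np.mean((detected - p) * contrast)   # log-likelihood gradients
    g_b = np.mean(detected - p)
    w += lr * g_w
    b += lr * g_b

def p_detect(c):
    """Fitted task-performance probability at contrast c."""
    return 1.0 / (1.0 + np.exp(-(w * c + b)))
```

    A probit fit, which the study found superior, differs only in replacing the logistic link with the normal CDF.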

  19. System Level Modelling and Performance Estimation of Embedded Systems

    DEFF Research Database (Denmark)

    Tranberg-Hansen, Anders Sejer

    The advances seen in the semiconductor industry within the last decade have brought the possibility of integrating evermore functionality onto a single chip forming functionally highly advanced embedded systems. These integration possibilities also imply that as the design complexity increases, so...... an efficient system level design methodology, a modelling framework for performance estimation and design space exploration at the system level is required. This thesis presents a novel component based modelling framework for system level modelling and performance estimation of embedded systems. The framework...... is performed by having the framework produce detailed quantitative information about the system model under investigation. The project is part of the national Danish research project, Danish Network of Embedded Systems (DaNES), which is funded by the Danish National Advanced Technology Foundation. The project...

  20. Causal Analysis for Performance Modeling of Computer Programs

    Directory of Open Access Journals (Sweden)

    Jan Lemeire

    2007-01-01

    Full Text Available Causal modeling and the accompanying learning algorithms provide useful extensions for in-depth statistical investigation and automation of performance modeling. We enlarged the scope of existing causal structure learning algorithms by using the form-free information-theoretic concept of mutual information and by introducing the complexity criterion for selecting direct relations among equivalent relations. The underlying probability distribution of experimental data is estimated by kernel density estimation. We then reported on the benefits of a dependency analysis and the decompositional capacities of causal models. Useful qualitative models, providing insight into the role of every performance factor, were inferred from experimental data. This paper reports on the results for a LU decomposition algorithm and on the study of the parameter sensitivity of the Kakadu implementation of the JPEG-2000 standard. Next, the analysis was used to search for generic performance characteristics of the applications.
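    The form-free dependency analysis rests on estimating mutual information from samples. A minimal histogram (plug-in) estimator is sketched below in place of the paper's kernel density estimation, and the synthetic performance-factor data are assumptions.

```python
import numpy as np

rng = np.random.default_rng(11)

def mutual_information(x, y, bins=16):
    """Histogram (plug-in) estimate of I(X;Y) in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                              # joint distribution
    px = pxy.sum(axis=1, keepdims=True)           # marginals
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    # I(X;Y) = KL( p(x,y) || p(x)p(y) ), so the estimate is non-negative
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# A performance factor that drives runtime nonlinearly vs an irrelevant one.
factor = rng.normal(size=5000)
runtime = factor ** 2 + 0.1 * rng.normal(size=5000)
noise = rng.normal(size=5000)
```

    Because the quadratic dependence is uncorrelated in the linear sense, a correlation-based screen would miss it, while the mutual-information estimate clearly separates the relevant factor from the irrelevant one; this is the "form-free" property the paper exploits for structure learning.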

  1. Impact of reactive settler models on simulated WWTP performance.

    Science.gov (United States)

    Gernaey, K V; Jeppsson, U; Batstone, D J; Ingildsen, P

    2006-01-01

    Including a reactive settler model in a wastewater treatment plant model allows representation of the biological reactions taking place in the sludge blanket in the settler, something that is neglected in many simulation studies. The idea of including a reactive settler model is investigated for an ASM1 case study. Simulations with a whole plant model including the non-reactive Takács settler model are used as a reference, and are compared to simulation results considering two reactive settler models. The first is a return sludge model block removing oxygen and a user-defined fraction of nitrate, combined with a non-reactive Takács settler. The second is a fully reactive ASM1 Takács settler model. Simulations with the ASM1 reactive settler model predicted a 15.3% and 7.4% improvement of the simulated N removal performance, for constant (steady-state) and dynamic influent conditions respectively. The oxygen/nitrate return sludge model block predicts a 10% improvement of N removal performance under dynamic conditions, and might be the better modelling option for ASM1 plants: it is computationally more efficient and it will not overrate the importance of decay processes in the settler.

  2. Some considerations for validation of repository performance assessment models

    International Nuclear Information System (INIS)

    Eisenberg, N.

    1991-01-01

    Validation is an important aspect of the regulatory uses of performance assessment. A substantial body of literature exists indicating the manner in which validation of models is usually pursued. Because performance models for a nuclear waste repository cannot be tested over the long time periods for which the model must make predictions, the usual avenue for model validation is precluded. Further impediments to model validation include a lack of fundamental scientific theory to describe important aspects of repository performance and an inability to easily deduce the complex, intricate structures characteristic of a natural system. A successful strategy for validation must attempt to resolve these difficulties in a direct fashion. Although some procedural aspects will be important, the main reliance of validation should be on scientific substance and logical rigor. The level of validation needed will be mandated, in part, by the uses to which these models are put, rather than by the ideal of validation of a scientific theory. Because of the importance of the validation of performance assessment models, the NRC staff has engaged in a program of research and international cooperation to seek progress in this important area. 2 figs., 16 refs

  3. Evaluating Flight Crew Performance by a Bayesian Network Model

    Directory of Open Access Journals (Sweden)

    Wei Chen

    2018-03-01

    Full Text Available Flight crew performance is of great significance in keeping flights safe and sound. When evaluating crew performance, quantitative detailed behavior information may not be available. The present paper introduces a Bayesian Network (BN) approach to flight crew performance evaluation, which permits the use of multidisciplinary sources of objective and subjective information despite sparse behavioral data. In this paper, the causal factors are selected based on an analysis of 484 aviation accidents caused by human factors. Then, a network termed the Flight Crew Performance Model is constructed. The Delphi technique helps to gather subjective data as a supplement to objective data from accident reports. The conditional probabilities are elicited by the leaky noisy MAX model. Two modes of BN inference, probability prediction and probabilistic diagnosis, are applied, and some interesting conclusions are drawn that could support interventions for human error management in aviation safety.
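
    For binary variables, the leaky noisy MAX model named above reduces to a leaky noisy-OR gate, which can be sketched in a few lines. The factor names and probabilities below are hypothetical illustrations, not values elicited in the paper:

```python
def leaky_noisy_or(cause_probs, active, leak=0.01):
    """P(effect present) under a leaky noisy-OR gate.

    cause_probs: dict mapping factor -> P(effect | only that factor active)
    active: set of factors currently present
    leak: P(effect | no modelled factor active)
    """
    p_none = 1.0 - leak
    for factor in active:
        p_none *= 1.0 - cause_probs[factor]
    return 1.0 - p_none
```

    For example, with two independent causes each contributing probability 0.5 and no leak, the gate predicts 0.75, since the effect is absent only when both causal mechanisms fail.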

  4. Relative performance analysis of IR FPA technologies from the perspective of system level performance

    Science.gov (United States)

    Haran, Terence L.; James, J. Christopher; Cincotta, Tomas E.

    2017-08-01

    The majority of high performance infrared systems today utilize FPAs composed of intrinsic direct bandgap semiconductor photon detectors such as MCT or InSb. Quantum well detector technologies such as QWIPs, QDIPs, and SLS photodetectors are potentially lower cost alternatives to MCT and InSb, but the relative performance of these technologies has not been sufficiently high to allow widespread adoption outside of a handful of applications. While detectors are often evaluated using figures of merit such as NETD or D∗, these metrics, which include many underlying aspects such as spectral quantum efficiency, dark current, well size, MTF, and array response uniformity, may be far removed from the performance metrics used to judge performance of a system in an operationally relevant scenario. True comparisons of performance for various detector technologies from the perspective of end-to-end system performance have rarely been conducted, especially considering the rapid progress of the newer quantum well technologies. System level models such as the US Army's Night Vision Integrated Performance Model (NV-IPM) can calculate image contrast and spatial frequency content using data from the target/background, intervening atmosphere, and system components. This paper includes results from a performance parameter sensitivity analysis using NV-IPM to determine the relative importance of various FPA performance parameters to the overall performance of a long-range imaging system. Parameters included are: QE, dark current density, quantum well capacity, downstream readout noise, well fill, image frame rate, frame averaging, and residual fixed pattern noise. The state of the art for XBn, QWIP, and SLS detector technologies operating in the MWIR and LWIR bands will be surveyed to assess performance of quantum structures compared to MCT and InSb. The intent is to provide a comprehensive assessment of quantum detector performance and to identify areas where increased research
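
    A parameter sensitivity analysis of this kind can be sketched as a one-at-a-time perturbation study. The figure of merit below is a toy shot-noise-limited SNR, not the NV-IPM formulation, and the baseline parameter values are invented for illustration:

```python
def one_at_a_time(metric, baseline, rel_step=0.1):
    """Rank parameters by |relative change in metric| for a +10% perturbation."""
    base = metric(**baseline)
    sens = {}
    for name, value in baseline.items():
        perturbed = dict(baseline, **{name: value * (1 + rel_step)})
        sens[name] = abs(metric(**perturbed) - base) / abs(base)
    return sorted(sens.items(), key=lambda kv: kv[1], reverse=True)

def toy_snr(qe, dark_current, read_noise, well_fill):
    """Illustrative shot-noise-limited SNR; not the NV-IPM image metric."""
    signal = qe * well_fill  # collected photoelectrons
    noise = (signal + dark_current + read_noise ** 2) ** 0.5
    return signal / noise
```

    Ranking the resulting relative changes identifies which FPA parameters the chosen system metric is most sensitive to, which is the qualitative output such a study produces.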

  5. Human performance modeling for system of systems analytics: combat performance-shaping factors.

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, Craig R.; Miller, Dwight Peter

    2006-01-01

    The US military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives. To support this goal, Sandia National Laboratories (SNL) has undertaken a program of HPM as an integral augmentation to its system-of-system (SoS) analytics capabilities. The previous effort, reported in SAND2005-6569, evaluated the effects of soldier cognitive fatigue on SoS performance. The current effort began with a very broad survey of any performance-shaping factors (PSFs) that also might affect soldiers' performance in combat situations. The work included consideration of three different approaches to cognition modeling and how appropriate each would be for application to SoS analytics. The bulk of this report categorizes 47 PSFs into three groups (internal, external, and task-related) and provides brief descriptions of how each affects combat performance, according to the literature. The PSFs were then assembled into a matrix with 22 representative military tasks and assigned one of four levels of estimated negative impact on task performance, based on the literature. Blank versions of the matrix were then sent to two ex-military subject-matter experts to be filled out based on their personal experiences. Data analysis was performed to identify the most influential PSFs by consensus. Results indicate that combat-related injury, cognitive fatigue, inadequate training, physical fatigue, thirst, stress, poor perceptual processing, and presence of chemical agents are among the PSFs with the most negative impact on combat performance.
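
    The consensus ranking of PSFs from expert-filled matrices can be sketched as averaging impact scores across raters and tasks. The rater names, PSF labels, and impact values below are invented for illustration, not data from the report:

```python
from statistics import mean

def top_psfs(ratings, k=3):
    """ratings: {rater: {psf: [impact per task, 0 (none) .. 3 (severe)]}}.

    Returns the k PSFs with the highest mean impact across raters and tasks.
    """
    psfs = next(iter(ratings.values())).keys()
    scored = {p: mean(v for r in ratings.values() for v in r[p]) for p in psfs}
    return sorted(scored, key=scored.get, reverse=True)[:k]
```
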

  6. Global climate model performance over Alaska and Greenland

    DEFF Research Database (Denmark)

    Walsh, John E.; Chapman, William L.; Romanovsky, Vladimir

    2008-01-01

    The performance of a set of 15 global climate models used in the Coupled Model Intercomparison Project is evaluated for Alaska and Greenland, and compared with the performance over broader pan-Arctic and Northern Hemisphere extratropical domains. Root-mean-square errors relative to the 1958...... of the models are generally much larger than the biases of the composite output, indicating that the systematic errors differ considerably among the models. There is a tendency for the models with smaller errors to simulate a larger greenhouse warming over the Arctic, as well as larger increases of Arctic...... to narrowing the uncertainty and obtaining more robust estimates of future climate change in regions such as Alaska, Greenland, and the broader Arctic....
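
    The finding that the composite (multi-model mean) output has smaller errors than most individual models can be reproduced with a toy calculation; the model runs below are synthetic, not CMIP output:

```python
from statistics import mean

def rmse(pred, obs):
    """Root-mean-square error of a prediction series against observations."""
    return mean((p - o) ** 2 for p, o in zip(pred, obs)) ** 0.5

def composite_vs_individual(model_runs, obs):
    """Compare each model's RMSE with the RMSE of the multi-model mean."""
    individual = {name: rmse(run, obs) for name, run in model_runs.items()}
    composite = [mean(vals) for vals in zip(*model_runs.values())]
    return individual, rmse(composite, obs)
```

    When the models' systematic errors differ in sign, as the abstract notes they do, averaging cancels part of the error, so the composite RMSE falls below the individual RMSEs.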

  7. Human performance modeling for system of systems analytics.

    Energy Technology Data Exchange (ETDEWEB)

    Dixon, Kevin R.; Lawton, Craig R.; Basilico, Justin Derrick; Longsine, Dennis E. (INTERA, Inc., Austin, TX); Forsythe, James Chris; Gauthier, John Henry; Le, Hai D.

    2008-10-01

    A Laboratory-Directed Research and Development project was initiated in 2005 to investigate Human Performance Modeling in a System of Systems analytic environment. SAND2006-6569 and SAND2006-7911 document interim results from this effort; this report documents the final results. The problem is difficult because of the number of humans involved in a System of Systems environment and the generally poorly defined nature of the tasks that each human must perform. A two-pronged strategy was followed: one prong was to develop human models using a probability-based method similar to that first developed for relatively well-understood, probability-based performance modeling; the other prong was to investigate more state-of-the-art human cognition models. The probability-based modeling resulted in a comprehensive addition of human-modeling capability to the existing SoSAT computer program. The cognitive modeling resulted in an increased understanding of what is necessary to incorporate cognition-based models into a System of Systems analytic environment.

  8. Human performance modeling for system of systems analytics :soldier fatigue.

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, Craig R.; Campbell, James E.; Miller, Dwight Peter

    2005-10-01

    The military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives as can be seen in the Department of Defense's (DoD) Defense Modeling and Simulation Office's (DMSO) Master Plan (DoD 5000.59-P 1995). To this goal, the military is currently spending millions of dollars on programs devoted to HPM in various military contexts. Examples include the Human Performance Modeling Integration (HPMI) program within the Air Force Research Laboratory, which focuses on integrating HPMs with constructive models of systems (e.g. cockpit simulations) and the Navy's Human Performance Center (HPC) established in September 2003. Nearly all of these initiatives focus on the interface between humans and a single system. This is insufficient in the era of highly complex network centric SoS. This report presents research and development in the area of HPM in a system-of-systems (SoS). Specifically, this report addresses modeling soldier fatigue and the potential impacts soldier fatigue can have on SoS performance.

  9. Performance modeling of parallel algorithms for solving neutron diffusion problems

    International Nuclear Information System (INIS)

    Azmy, Y.Y.; Kirk, B.L.

    1995-01-01

    Neutron diffusion calculations are the most common computational methods used in the design, analysis, and operation of nuclear reactors and related activities. Here, mathematical performance models are developed for the parallel algorithm used to solve the neutron diffusion equation on message passing and shared memory multiprocessors represented by the Intel iPSC/860 and the Sequent Balance 8000, respectively. The performance models are validated through several test problems, and these models are used to estimate the performance of each of the two considered architectures in situations typical of practical applications, such as fine meshes and a large number of participating processors. While message passing computers are capable of producing speedup, the parallel efficiency deteriorates rapidly as the number of processors increases. Furthermore, the speedup fails to improve appreciably for massively parallel computers so that only small- to medium-sized message passing multiprocessors offer a reasonable platform for this algorithm. In contrast, the performance model for the shared memory architecture predicts very high efficiency over a wide range of number of processors reasonable for this architecture. Furthermore, the model efficiency of the Sequent remains superior to that of the hypercube if its model parameters are adjusted to make its processors as fast as those of the iPSC/860. It is concluded that shared memory computers are better suited for this parallel algorithm than message passing computers
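
    A mathematical performance model of the kind described, with an Amdahl-style serial fraction and a communication overhead term that grows with processor count, can be sketched as follows. The functional form and parameter values are illustrative assumptions, not the validated models from the paper:

```python
def speedup(t1, p, serial_frac, comm_per_proc):
    """Fixed-size speedup with an Amdahl serial fraction and a
    communication term that grows linearly with processor count."""
    tp = t1 * serial_frac + t1 * (1 - serial_frac) / p + comm_per_proc * p
    return t1 / tp

def efficiency(t1, p, serial_frac, comm_per_proc):
    """Parallel efficiency: speedup divided by processor count."""
    return speedup(t1, p, serial_frac, comm_per_proc) / p
```

    With a nonzero communication cost per processor, the model reproduces the qualitative behavior reported for the message-passing architecture: efficiency deteriorates as processors are added, and beyond some processor count the speedup itself declines.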

  10. Product Data Model for Performance-driven Design

    Science.gov (United States)

    Hu, Guang-Zhong; Xu, Xin-Jian; Xiao, Shou-Ne; Yang, Guang-Wu; Pu, Fan

    2017-09-01

    When designing large-sized complex machinery products, the design focus is always on overall performance; however, no performance-driven design theory and method exists. In view of this deficiency in existing design theory, and according to the performance features of complex mechanical products, performance indices are introduced into the traditional design theory of "Requirement-Function-Structure" to construct a new five-domain design theory of "Client Requirement-Function-Performance-Structure-Design Parameter". To support design practice based on this new theory, a product data model is established using performance indices and the mapping relationships between them and the other four domains. When the product data model is applied to high-speed train design, combined with existing research results and relevant standards, the corresponding data model and its structure involving the five domains of high-speed trains are established, which can provide technical support for studying the relationships between typical performance indices and design parameters and for quickly achieving a high-speed train scheme design. The five domains provide a reference for the design specification and evaluation criteria of high-speed trains and a new idea for the train's parameter design.

  11. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  12. The better model to predict and improve pediatric health care quality: performance or importance-performance?

    Science.gov (United States)

    Olsen, Rebecca M; Bryant, Carol A; McDermott, Robert J; Ortinau, David

    2013-01-01

    The perpetual search for ways to improve pediatric health care quality has resulted in a multitude of assessments and strategies; however, there is little research evidence as to their conditions for maximum effectiveness. A major reason for the lack of evaluation research and successful quality improvement initiatives is the methodological challenge of measuring quality from the parent perspective. Comparison of performance-only and importance-performance models was done to determine the better predictor of pediatric health care quality and the more successful method for improving the quality of care provided to children. Fourteen pediatric health care centers serving approximately 250,000 patients in 70,000 households in three West Central Florida counties were studied. A cross-sectional design was used to determine the importance and performance of 50 pediatric health care attributes and four global assessments of pediatric health care quality. Exploratory factor analysis revealed five dimensions of care (physician care, access, customer service, timeliness of services, and health care facility). Hierarchical multiple regression compared the performance-only and the importance-performance models. In-depth interviews, participant observations, and a direct cognitive structural analysis identified 50 health care attributes included in a mailed survey to parents (n = 1,030). The tailored design method guided survey development and data collection. The importance-performance multiplicative additive model was a better predictor of pediatric health care quality. Attribute importance moderates performance and quality, making the importance-performance model superior for measuring and providing a deeper understanding of pediatric health care quality and a better method for improving the quality of care provided to children. Regardless of attribute performance, if the level of attribute importance is not taken into consideration, health care organizations may spend valuable
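
    The comparison between a performance-only and an importance-performance (multiplicative) predictor can be sketched with ordinary least squares on synthetic ratings; the data below are simulated, not the survey responses from the study:

```python
from statistics import mean

def r_squared(x, y):
    """R^2 of a simple least-squares fit y ~ a + b*x."""
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    b = sxy / sxx
    ss_res = sum((yi - (my + b * (xi - mx))) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot
```

    Fitting quality ratings once against performance alone and once against the importance-performance product shows a higher R^2 for the multiplicative predictor whenever importance genuinely moderates the performance-quality relationship, which is the study's central claim.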

  13. Four-Stroke, Internal Combustion Engine Performance Modeling

    Science.gov (United States)

    Wagner, Richard C.

    In this thesis, two models of four-stroke, internal combustion engines are created and compared. The first model predicts the intake and exhaust processes using isentropic flow equations augmented by discharge coefficients. The second model predicts the intake and exhaust processes using a compressible, time-accurate, Quasi-One-Dimensional (Q1D) approach. Both models employ the same heat release and reduced-order modeling of the cylinder charge. Both include friction and cylinder loss models so that the predicted performance values can be compared to measurements. The results indicate that the isentropic-based model neglects important fluid mechanics and returns inaccurate results. The Q1D flow model, combined with the reduced-order model of the cylinder charge, is able to capture the dominant intake and exhaust fluid mechanics and produces results that compare well with measurement. Fluid friction, convective heat transfer, piston ring and skirt friction and temperature-varying specific heats in the working fluids are all shown to be significant factors in engine performance predictions. Charge blowby is shown to play a lesser role.
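
    The first model's isentropic valve flow with a discharge coefficient can be sketched as the standard compressible orifice equation, clamped at the choked pressure ratio. The gas properties and geometry below are illustrative assumptions, not values from the thesis:

```python
import math

def valve_mass_flow(cd, area, p0, t0, p_down, gamma=1.4, r_gas=287.0):
    """Isentropic mass flow [kg/s] through a valve of effective area [m^2],
    discharge coefficient cd, upstream stagnation p0 [Pa] and T0 [K]."""
    pr_crit = (2.0 / (gamma + 1.0)) ** (gamma / (gamma - 1.0))
    pr = max(p_down / p0, pr_crit)  # clamp to the choked pressure ratio
    term = (2.0 * gamma / (r_gas * t0 * (gamma - 1.0))) * (
        pr ** (2.0 / gamma) - pr ** ((gamma + 1.0) / gamma))
    return cd * area * p0 * math.sqrt(term)
```

    Once the downstream pressure falls below the critical ratio (about 0.528 for air), the flow is choked and further lowering it has no effect, a behavior the clamp reproduces.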

  14. Aircraft Anomaly Detection Using Performance Models Trained on Fleet Data

    Science.gov (United States)

    Gorinevsky, Dimitry; Matthews, Bryan L.; Martin, Rodney

    2012-01-01

    This paper describes an application of data mining technology called Distributed Fleet Monitoring (DFM) to Flight Operational Quality Assurance (FOQA) data collected from a fleet of commercial aircraft. DFM transforms the data into aircraft performance models, flight-to-flight trends, and individual flight anomalies by fitting a multi-level regression model to the data. The model represents aircraft flight performance and takes into account flight-to-flight and vehicle-to-vehicle variability. The regression parameters include aerodynamic coefficients and other aircraft performance parameters that are usually identified by aircraft manufacturers in flight tests. Using DFM, a multi-terabyte FOQA data set with half a million flights was processed in a few hours. The anomalies found include wrong values of computed variables (e.g., aircraft weight), and sensor failures, biases, and trends in flight actuators. These anomalies were missed by the airline's existing monitoring of FOQA data exceedances.
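
    Flagging individual flights against a fitted performance model can be sketched with a simple regression on synthetic data. The single-predictor form, the variable names, and the 3-sigma threshold are simplifying assumptions, not the DFM multi-level model:

```python
from statistics import mean, stdev

def fit_line(x, y):
    """Ordinary least-squares fit; returns (intercept, slope)."""
    mx, my = mean(x), mean(y)
    b = sum((a - mx) * (c - my) for a, c in zip(x, y)) / \
        sum((a - mx) ** 2 for a in x)
    return my - b * mx, b

def flag_anomalies(flights, threshold=3.0):
    """flights: list of (airspeed, fuel_flow) pairs, one per flight.

    Flags indices of flights whose residual from the fitted performance
    line exceeds `threshold` standard deviations.
    """
    x, y = zip(*flights)
    a, b = fit_line(x, y)
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    s = stdev(resid)
    return [i for i, r in enumerate(resid) if abs(r) > threshold * s]
```
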

  15. End-to-End Privacy Protection for Facebook Mobile Chat based on AES with Multi-Layered MD5

    Directory of Open Access Journals (Sweden)

    Wibisono Sukmo Wardhono

    2018-01-01

    Full Text Available As social media environments become more interactive and the number of users grows tremendously, privacy is a matter of increasing concern. When personal data become a commodity, a social media company can share user data with another party such as a government. Facebook, Inc. is one social media company that is frequently asked for user data. Although this private-data request mechanism goes through a formal and valid legal process, it still undermines the fundamental right to information privacy. In this case, social media users need protection against privacy violations by the social media platform provider itself. Private chat is the most popular feature of social media, and inside a chat room users can share private information. Cryptography is one of the data protection methods that can be used to hide private communication from unauthorized parties. In our study, we propose a system that encrypts chat content based on AES and multi-layered MD5 to ensure that social media users have privacy protection against a social media company that treats user information as a commodity. In addition, this system makes it convenient for users to share private information through a social media platform.
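
    The multi-layered MD5 step can be sketched in stdlib Python as repeated hashing to derive a 16-byte key; the layer count is an assumption, and the AES encryption itself would come from a cryptographic library (e.g., PyCryptodome), which is omitted here:

```python
import hashlib

def layered_md5_key(passphrase, layers=5):
    """Derive a 16-byte key by applying MD5 repeatedly, as a stand-in for
    the paper's multi-layered MD5 step (the layer count is an assumption)."""
    digest = passphrase.encode("utf-8")
    for _ in range(layers):
        digest = hashlib.md5(digest).digest()
    return digest  # 16 bytes, suitable as an AES-128 key
```

    Note that MD5 is cryptographically broken; it appears here only to mirror the scheme named in the abstract, and a modern key-derivation function such as PBKDF2 or scrypt would be preferred in practice.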

  16. An integrated healthcare information system for end-to-end standardized exchange and homogeneous management of digital ECG formats.

    Science.gov (United States)

    Trigo, Jesús Daniel; Martínez, Ignacio; Alesanco, Alvaro; Kollmann, Alexander; Escayola, Javier; Hayn, Dieter; Schreier, Günter; García, José

    2012-07-01

    This paper investigates the application of the enterprise information system (EIS) paradigm to standardized cardiovascular condition monitoring. There are many specifications in cardiology, particularly in the ECG standardization arena. The existence of ECG formats, however, does not guarantee the implementation of homogeneous, standardized solutions for ECG management. In fact, hospital management services need to cope with various ECG formats and, moreover, several different visualization applications. This heterogeneity hampers the normalization of integrated, standardized healthcare information systems, hence the need for finding an appropriate combination of ECG formats and a suitable EIS-based software architecture that enables standardized exchange and homogeneous management of ECG formats. Determining such a combination is one objective of this paper. The second aim is to design and develop the integrated healthcare information system that satisfies the requirements posed by the previous determination. The ECG formats selected include ISO/IEEE11073, Standard Communications Protocol for Computer-Assisted Electrocardiography, and an ECG ontology. The EIS-enabling techniques and technologies selected include web services, simple object access protocol, extensible markup language, or business process execution language. Such a selection ensures the standardized exchange of ECGs within, or across, healthcare information systems while providing modularity and accessibility.
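
    The standardized exchange of ECGs between systems can be sketched as serializing a record into an XML envelope and parsing it back. The element and attribute names below are hypothetical, not a real ISO/IEEE 11073 or SCP-ECG schema:

```python
import xml.etree.ElementTree as ET

def ecg_to_xml(patient_id, fmt, samples):
    """Serialize an ECG record into a minimal, hypothetical XML envelope."""
    root = ET.Element("ecgRecord", {"patientId": patient_id, "format": fmt})
    ET.SubElement(root, "waveform").text = " ".join(str(s) for s in samples)
    return ET.tostring(root, encoding="unicode")

def xml_to_ecg(xml_text):
    """Parse the envelope back into (patient_id, format, samples)."""
    root = ET.fromstring(xml_text)
    samples = [int(s) for s in root.findtext("waveform").split()]
    return root.get("patientId"), root.get("format"), samples
```

    A round trip through serialization and parsing preserving the record is the basic property any such exchange format must satisfy, regardless of which ECG standard carries the payload.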

  17. Supporting end-to-end resource virtualization for Web 2.0 applications using Service Oriented Architecture

    NARCIS (Netherlands)

    Papagianni, C.; Karagiannis, Georgios; Tselikas, N. D.; Sfakianakis, E.; Chochliouros, I. P.; Kabilafkas, D.; Cinkler, T.; Westberg, L.; Sjödin, P.; Hidell, M.; Heemstra de Groot, S.M.; Kontos, T.; Katsigiannis, C.; Pappas, C.; Antonakopoulou, A.; Venieris, I.S.

    2008-01-01

    In recent years, technologies have been introduced offering a large amount of computing and networking resources. New applications such as Google AdSense and BitTorrent can profit from the use of these resources. An efficient way of discovering and reserving these resources is by using the Service

  18. An Anthological Review of Research Utilizing MontyLingua: a Python-Based End-to-End Text Processor

    Directory of Open Access Journals (Sweden)

    2008-06-01

    Full Text Available MontyLingua, an integral part of ConceptNet, which is currently the largest commonsense knowledge base, is an English text processor developed using the Python programming language at the MIT Media Lab. The main feature of MontyLingua is its coverage of all aspects of English text processing, from raw input text to semantic meanings and summary generation, yet each component in MontyLingua is loosely coupled to the others at the architectural and code level, which enables individual components to be used independently or substituted. However, there has been no review exploring the role of MontyLingua in recent research work utilizing it. This paper aims to review the use of and roles played by MontyLingua and its components in research work published in 19 articles between October 2004 and August 2006. We observed diversified use of MontyLingua in many different areas, both generic and domain-specific. Although use of the text summarizing component has not been observed, we are optimistic that it will have a crucial role in managing the current trend of information overload in future research.
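
    The loose coupling that lets MontyLingua components be used independently or substituted can be sketched as a pipeline of interchangeable stages; the tokenizer and summarizer below are naive stand-ins, not MontyLingua's actual components:

```python
import re
from collections import Counter

def tokenize(text):
    """Naive tokenizer standing in for a text processor's first stage."""
    return re.findall(r"[A-Za-z']+", text)

def summarize(tokens, k=3):
    """Crude summary: the k most frequent non-stopword tokens."""
    stop = {"the", "a", "an", "of", "and", "to", "is", "in", "on"}
    counts = Counter(t.lower() for t in tokens if t.lower() not in stop)
    return [word for word, _ in counts.most_common(k)]

def pipeline(data, stages):
    """Chain loosely coupled stages; any stage can be swapped independently."""
    for stage in stages:
        data = stage(data)
    return data
```

    Because each stage only consumes the previous stage's output, any stage can be replaced (e.g., a better tokenizer) without touching the rest, which is the architectural property the review highlights.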

  19. End-to-end encryption in on-line payment systems : The industry reluctance and the role of laws

    NARCIS (Netherlands)

    Kasiyanto, Safari

    2016-01-01

    Various security breaches at third-party payment processors show that online payment systems are the primary target for cyber-criminals. In general, the security of online payment systems relies on a number of factors, namely technical factors, processing factors, and legal factors. The industry

  20. An End-To-End Microfluidic Platform for Engineering Life Supporting Microbes in Space Exploration Missions, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — HJ Science & Technology (HJS&T) and Lawrence Berkeley National Laboratory (LBNL) propose a highly integrated, programmable, and miniaturized microfluidic...