WorldWideScience

Sample records for based hdtv post-processor

  1. NX-based NC-code post-processor for a 4-axis machining center with a FANUC CNC system

    Institute of Scientific and Technical Information of China (English)

    谭大庆

    2013-01-01

      A customized NC-code post-processor, built on the universal template of the NX Post Builder, is designed to meet the requirements of a 4-axis machining center equipped with a FANUC CNC system.

  2. Development of a universal STEP-NC post-processor based on XSLT

    Institute of Scientific and Technical Information of China (English)

    肖文磊; 郇极

    2012-01-01

    To make the STandard for the Exchange of Product model data-Numerical Control (STEP-NC) compatible with traditional NC systems, a dedicated STEP-NC post-processor is needed for each traditional NC system. A general post-processor was therefore created to reduce the development workload and difficulty of building such dedicated post-processors. By using STEP-NC code in eXtensible Markup Language (XML) form as the post-processor input and a transformation mechanism based on eXtensible Stylesheet Language Transformations (XSLT), post-processing is unified across different NC machining equipment. Following the XSLT transformation principles and requirements, data converters for EXPRESS-X and for P21-to-P28 conversion were developed. A general post-processor was then constructed that takes a P28 file as data input and an XSLT stylesheet as the machine-equipment interface format. A cutting robot and a 3-axis NC milling machine were used as application examples, producing robot language and G-code, respectively, as the post-processed cutter location output. The results verify the functionality and feasibility of the proposed post-processor.
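
    The record above describes post-processing as an XSLT transformation applied to XML-encoded (P28) STEP-NC programs. The sketch below illustrates that idea in Python with lxml; the element names in the toy program and the stylesheet are hypothetical placeholders rather than ISO 14649 / ISO 10303-28 tags, and the G-code template is only one possible machine-interface definition.

```python
# Minimal sketch, not the authors' implementation: post-process an XML-encoded
# (P28-style) machining program with an XSLT stylesheet acting as the
# machine-interface definition. Element names such as <workplan>/<operation>
# are hypothetical placeholders.
from lxml import etree

PROGRAM = b"""
<workplan>
  <operation name="drill_hole_1" x="10.0" y="5.0" z="-12.5" feed="120"/>
  <operation name="drill_hole_2" x="30.0" y="5.0" z="-12.5" feed="120"/>
</workplan>
"""

# This stylesheet emits G-code; a robot-language stylesheet could be swapped in
# without touching the post-processor itself.
STYLESHEET = b"""
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="text"/>
  <xsl:template match="/workplan">
    <xsl:text>G21 G90&#10;</xsl:text>
    <xsl:apply-templates select="operation"/>
    <xsl:text>M30&#10;</xsl:text>
  </xsl:template>
  <xsl:template match="operation">
    <xsl:text>G01 X</xsl:text><xsl:value-of select="@x"/>
    <xsl:text> Y</xsl:text><xsl:value-of select="@y"/>
    <xsl:text> Z</xsl:text><xsl:value-of select="@z"/>
    <xsl:text> F</xsl:text><xsl:value-of select="@feed"/>
    <xsl:text>&#10;</xsl:text>
  </xsl:template>
</xsl:stylesheet>
"""

transform = etree.XSLT(etree.fromstring(STYLESHEET))
print(str(transform(etree.fromstring(PROGRAM))))
```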

  3. A post-processor for Gurmukhi OCR

    Indian Academy of Sciences (India)

    G S Lehal; Chandan Singh

    2002-02-01

    A post-processing system for OCR of Gurmukhi script has been developed. Statistical information of Punjabi language syllable combinations, corpora look-up and certain heuristics based on Punjabi grammar rules have been combined to design the post-processor. An improvement of 3% in recognition rate, from 94.35% to 97.34%, has been reported on clean images using the post-processing techniques.
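
    As a rough illustration of the corpus look-up component of such a post-processor (not the authors' system, which also uses syllable statistics and grammar heuristics), the sketch below replaces an unknown OCR word with the most frequent, sufficiently similar word from a corpus-derived frequency list; the toy word list and similarity threshold are assumptions.

```python
# Rough sketch of corpus look-up correction, with a toy frequency list standing
# in for a Punjabi corpus; threshold and word list are illustrative assumptions.
from difflib import SequenceMatcher

CORPUS_FREQ = {"word_a": 120, "word_b": 85, "word_c": 10}

def similarity(a, b):
    return SequenceMatcher(None, a, b).ratio()

def correct(ocr_word, min_sim=0.75):
    """Return the OCR word if it is in the corpus, else the most frequent
    sufficiently similar corpus word, else the word unchanged."""
    if ocr_word in CORPUS_FREQ:
        return ocr_word
    candidates = [(w, f) for w, f in CORPUS_FREQ.items() if similarity(ocr_word, w) >= min_sim]
    if not candidates:
        return ocr_word
    return max(candidates, key=lambda wf: wf[1])[0]

print(correct("word_x"))  # falls back to the closest frequent corpus entry
```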

  4. Developments in projection lenses for HDTV

    Science.gov (United States)

    Rudolph, John D.

    1991-08-01

    Recent focus on the development of HDTV systems worldwide has raised a critical concern--the economic viability of HDTV for the home marketplace. While projection systems performing at or above HDTV-quality levels exist today, they are designed for the institutional market and are priced far above the threshold for the individual consumer. Manufacturers will be under considerable pressure to significantly reduce the cost of HDTV projectors, as will the suppliers of key components such as lenses. Fortunately, recent developments have been made in the design, development and manufacturing technologies used to produce hybrid lenses for high-performance projection systems. This is particularly true for CRT-based front- and rear-projection systems for data and graphics applications. Extending these advances to HDTV would suggest that by the time HDTV is ready for high volume mass production, cost-effective projection lenses will be enhancing, not retarding, the market acceptance of HDTV.

  5. Multiple MIPS 4Kc cores based interrupt controller design and its implementation on HDTV SoC platform

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    An interrupt processing system based on multiple MIPS 4Kc processor cores is introduced. The interrupt controller plays a key role in the high definition television (HDTV) system-on-a-chip (SoC) platform, especially in a multiprocessor system. After a general introduction to the whole HDTV SoC platform, a layered interrupt controller and its implementation are discussed in detail. The proposed scheme was implemented on our FPGA verification board, and the results indicate that it is reliable and efficient. Meanwhile, as a functional intellectual property (IP) block, the layered interrupt controller is reusable and expandable.

  6. Grand alliance HDTV

    Science.gov (United States)

    Petajan, Eric D.

    1995-12-01

    Terrestrial broadcast television in the United States has remained essentially unchanged in the last fifty years except for the addition of color and stereo sound. Today, personal computers are addressing the need for random access of high resolution images and CD quality audio. Furthermore, advances in digital video compression and digital communication technology have cleared the way toward offering high resolution video and audio services to consumers using traditional analog communications channels. In 1987, the U.S. Federal Communications Commission (FCC) chartered an advisory committee to recommend an advanced television system for the United States. From 1990 to 1992, the Advanced Television Test Center tested four all-digital systems, one analog High Definition Television (HDTV) system, and one enhancement NTSC system using broadcast and cable television environment simulators. The formation of the HDTV Grand Alliance in May of 1993 resulted from the withdrawal of the only analog HDTV system from the competition and a stalemate between the other four all-digital systems. The HDTV Grand Alliance system is composed of the best components from previously competing digital systems demonstrated to the FCC. Moving Pictures Experts Group (MPEG-2) syntax is used with novel encoding techniques to deliver a set of video scanning formats for a variety of applications. This paper describes the important features and concepts embodied in the HDTV Grand Alliance system.

  7. Damage 90: A post processor for crack initiation

    Science.gov (United States)

    Lemaitre, Jean; Doghri, Issam

    1994-05-01

    A post-processor is fully described which allows the calculation of crack initiation conditions from the history of strain components taken as the output of a finite element calculation. It is based upon damage mechanics, using coupled strain-damage constitutive equations for linear isotropic elasticity, perfect plasticity and a unified kinetic law of damage evolution. The localization of damage allows this coupling to be considered only at the damaging point, for which the input strain history is taken from a classical structural calculation in elasticity or elastoplasticity. The listing of the code, a 'friendly' code with fewer than 600 FORTRAN instructions, is given, and some examples show its ability to model ductile failure in one or several dimensions, brittle failure, low- and high-cycle fatigue with non-linear accumulation, and multi-axial fatigue.
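
    A minimal sketch of how such a post-processor can integrate a Lemaitre-type kinetic damage law over a strain history supplied by an uncoupled FE run is given below; it is not the Damage 90 code itself, and the material constants, threshold and critical damage value are illustrative assumptions.

```python
# Sketch of crack-initiation post-processing with a Lemaitre-type damage law;
# NOT the Damage 90 code. Material constants and the strain history are
# illustrative assumptions.
E, sigma_y = 200e3, 300.0          # Young's modulus and yield stress [MPa]
S, eps_pD, D_c = 0.5, 0.05, 0.3    # damage strength [MPa], threshold strain, critical damage

def crack_initiation(plastic_strain_history, triaxiality=1.0 / 3.0, nu=0.3):
    """Integrate dD = (Y/S) dp above the threshold; return the step where D >= D_c."""
    R_nu = 2.0 / 3.0 * (1.0 + nu) + 3.0 * (1.0 - 2.0 * nu) * triaxiality ** 2
    D, p_prev = 0.0, 0.0
    for step, p in enumerate(plastic_strain_history):   # accumulated plastic strain per step
        dp = max(p - p_prev, 0.0)
        p_prev = p
        if p < eps_pD or dp == 0.0:
            continue                                     # no damage growth below the threshold
        Y = sigma_y ** 2 * R_nu / (2.0 * E * (1.0 - D) ** 2)  # energy density release rate
        D = min(D + (Y / S) * dp, 1.0)                   # explicit Euler damage update
        if D >= D_c:
            return step, D                               # crack initiation condition reached
    return None, D

step, D = crack_initiation([i * 0.002 for i in range(400)])
print(step, round(D, 3))
```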

  8. A Bayesian joint probability post-processor for reducing errors and quantifying uncertainty in monthly streamflow predictions

    OpenAIRE

    P. Pokhrel; Robertson, D E; Q. J. Wang

    2013-01-01

    Hydrologic model predictions are often biased and subject to heteroscedastic errors originating from various sources including data, model structure and parameter calibration. Statistical post-processors are applied to reduce such errors and quantify uncertainty in the predictions. In this study, we investigate the use of a statistical post-processor based on the Bayesian joint probability (BJP) modelling approach to reduce errors and quantify uncertainty in streamflow predi...

  9. A Subband Coding Method for HDTV

    Science.gov (United States)

    Chung, Wilson; Kossentini, Faouzi; Smith, Mark J. T.

    1995-01-01

    This paper introduces a new HDTV coder based on motion compensation, subband coding, and high order conditional entropy coding. The proposed coder exploits the temporal and spatial statistical dependencies inherent in the HDTV signal by using intra- and inter-subband conditioning for coding both the motion coordinates and the residual signal. The new framework provides an easy way to control the system complexity and performance, and inherently supports multiresolution transmission. Experimental results show that the coder outperforms MPEG-2, while still maintaining relatively low complexity.

  10. Activities to develop digital SDTV/HDTV standards in Korea

    Science.gov (United States)

    Ahn, Chieteuk; Kim, Yong Han; Park, Sang Gyu; Yang, Jae-Woo; Nam, Jae Y.

    1995-12-01

    In this paper we first introduce some R&D activities to develop the technical standards for digital standard definition TV (SDTV) and high definition TV (HDTV) as well as MPEG related activities in Korea. Then we present the key elements of the technical standards of SDTV and HDTV via satellite, which are based on the MPEG-2 international standard. We describe the design and implementation of a prototype DTV encoding system we developed. We also explain the system architecture and design considerations for the development of the prototype HDTV encoding system with application-specific integrated circuits. Both of these prototype systems will be used to verify the technical standards, which need to be prepared for the introduction of digital video services.

  11. DVB-C HDTV Box

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The DVB-C HDTV Box is a PC-based PVR launched by 永新视博: a cable high-definition digital set-top box that integrates standard-definition, high-definition, time-shift and scheduled-recording functions. Combining cable and Internet technologies, it is a core component of a digital multimedia operations platform.

  12. A challenge for traditional video conferencing: HDTV?

    Science.gov (United States)

    Mark, G.; DeFlorio, P.

    2001-01-01

    In this paper we present an experiment in which we examine how life-size HDTV as a window connecting two conference rooms might overcome some of the problems found with using traditional video conferencing in meeting rooms across distance.

  13. Radiation-Hardened HDTV Sensors Project

    Data.gov (United States)

    National Aeronautics and Space Administration — High-performance HDTV cameras are commercially widespread, but are not presently available in radiation-hard versions. The objective of the proposed SBIR effort is...

  14. Channel coding for digital HDTV terrestrial broadcasting

    Science.gov (United States)

    Beakley, Guy W.

    1991-01-01

    The Federal Communications Commission of the United States has ruled that high-definition television (HDTV) will occupy no more than 6 MHz of the VHF and UHF bands now used for conventional TV. In order to transmit the HDTV signal in 6 MHz, the four United States digital HDTV proponents, the DigiCipher, DSC-HDTV, ADTV, and ATVA-P systems, are reducing the video data rate of HDTV to 15-17 Mb/s, a compression ratio of approximately 60-70 times. The high compression dictates that channel coding be used to avoid block errors and multiframe error propagation. High efficiency in channel utilization required by the 6-MHz limitation means that the channel must be properly equalized and that the multipath and interfering signals must be severely limited. The channel coding techniques used for error reduction include data interleaving, error detection and replacement, and error correction at different levels of protection for bits and blocks of unequal importance.
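
    The sketch below illustrates only the data-interleaving idea mentioned above (it is not any proponent's actual scheme): a block interleaver spreads a burst of channel errors over widely separated positions so that a per-block error-correcting code can repair them; the block dimensions and burst position are illustrative.

```python
# Sketch of block interleaving only; dimensions and burst position are illustrative.
def interleave(symbols, rows, cols):
    """Write row-wise, read column-wise."""
    return [symbols[r * cols + c] for c in range(cols) for r in range(rows)]

def deinterleave(symbols, rows, cols):
    """Inverse mapping back to the original order."""
    return [symbols[c * rows + r] for r in range(rows) for c in range(cols)]

data = list(range(24))                                               # e.g. 24 bytes of a video packet
sent = interleave(data, rows=4, cols=6)
received = ["X" if 8 <= i < 12 else s for i, s in enumerate(sent)]   # 4-symbol burst error
# After de-interleaving the corrupted symbols are widely separated, so a
# per-block error-correcting code sees at most one error per block:
print([i for i, s in enumerate(deinterleave(received, rows=4, cols=6)) if s == "X"])
```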

  15. The ISS Water Processor Catalytic Reactor as a Post Processor for Advanced Water Reclamation Systems

    Science.gov (United States)

    Nalette, Tim; Snowdon, Doug; Pickering, Karen D.; Callahan, Michael

    2007-01-01

    Advanced water processors being developed for NASA's Exploration Initiative rely on phase change technologies and/or biological processes as the primary means of water reclamation. As a result of the phase change, volatile compounds will also be transported into the distillate product stream. The catalytic reactor assembly used in the International Space Station (ISS) water processor assembly, referred to as the Volatile Removal Assembly (VRA), has demonstrated high-efficiency oxidation of many of these volatile contaminants, such as low molecular weight alcohols and acetic acid, and is considered a viable post-treatment system for all advanced water processors. To support this investigation, two ersatz solutions were defined for further evaluation of the VRA. The first solution was developed as part of an internal research and development project at Hamilton Sundstrand (HS) and is based primarily on ISS experience related to the development of the VRA. The second ersatz solution was defined by NASA in support of a study contract to Hamilton Sundstrand to evaluate the VRA as a potential post-processor for the Cascade Distillation system being developed by Honeywell. This second ersatz solution contains several low molecular weight alcohols, organic acids, and several inorganic species. A range of residence times, oxygen concentrations and operating temperatures has been studied with both ersatz solutions to provide additional performance data on the VRA catalyst.

  16. SDRAM bus schedule of HDTV video decoder

    Science.gov (United States)

    Wang, Hui; He, Yan L.; Yu, Lu

    2001-12-01

    In this paper, a time-division-multiplexed (TDM) task schedule for an HDTV video decoder is proposed. Three tasks share the bus: fetching decoded data from SDRAM for display (DIS), reading reference data from SDRAM for motion compensation (REF), and writing motion-compensated data back to SDRAM (WB). The proposed schedule is based on a novel four-bank interleaved SDRAM storage structure, which results in less read/write overhead. Two 64-Mbit SDRAMs (4 banks × 512K × 32 bits) are used. Compared with a two-bank organization, the four-bank storage strategy reads and writes data in 45% less time, so the required data rates for the three tasks are reduced. The TDM schedule is built from round-robin scheduling and fixed slot allocation, using both MB (macroblock) slots and task slots. As a result, bus conflicts are avoided and the buffer size is reduced by 48% compared with priority-based bus scheduling. Moreover, a compact bus schedule can be maintained even in the worst case of stuffing, owing to the reduced task execution time. The buffer size is reduced and the control logic is simplified.
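
    A minimal sketch of a fixed-slot, round-robin TDM bus schedule for the three tasks (DIS, REF, WB) is shown below; the slot table and slot length are illustrative assumptions rather than the paper's actual timing.

```python
# Sketch of a fixed-slot TDM bus schedule; the slot table and slot length are
# illustrative assumptions, not the paper's timing.
MB_SLOT = ["DIS", "REF", "WB", "REF"]     # task slots inside one macroblock (MB) slot
CYCLES_PER_TASK_SLOT = 32                 # SDRAM clock cycles granted per task slot

def bus_owner(cycle):
    """Bus ownership is a pure function of the cycle count, so no run-time
    arbitration and only small, bounded buffers are needed."""
    task_slot = (cycle // CYCLES_PER_TASK_SLOT) % len(MB_SLOT)
    return MB_SLOT[task_slot]

for c in (0, 40, 70, 100, 130):
    print(c, bus_owner(c))
```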

  17. The Mission Assessment Post Processor (MAPP): A New Tool for Performance Evaluation of Human Lunar Missions

    Science.gov (United States)

    Williams, Jacob; Stewart, Shaun M.; Lee, David E.; Davis, Elizabeth C.; Condon, Gerald L.; Senent, Juan

    2010-01-01

    The National Aeronautics and Space Administration's (NASA) Constellation Program paves the way for a series of lunar missions leading to a sustained human presence on the Moon. The proposed mission design includes an Earth Departure Stage (EDS), a Crew Exploration Vehicle (Orion) and a lunar lander (Altair) which support the transfer to and from the lunar surface. This report addresses the design, development and implementation of a new mission scan tool called the Mission Assessment Post Processor (MAPP) and its use to provide insight into the integrated (i.e., EDS, Orion, and Altair based) mission cost as a function of various mission parameters and constraints. The Constellation architecture calls for semiannual launches to the Moon and will support a number of missions, beginning with 7-day sortie missions, culminating in a lunar outpost at a specified location. The operational lifetime of the Constellation Program can cover a period of decades over which the Earth-Moon geometry (particularly, the lunar inclination) will go through a complete cycle (i.e., the lunar nodal cycle lasting 18.6 years). This geometry variation, along with other parameters such as flight time, landing site location, and mission related constraints, affect the outbound (Earth to Moon) and inbound (Moon to Earth) translational performance cost. The mission designer must determine the ability of the vehicles to perform lunar missions as a function of this complex set of interdependent parameters. Trade-offs among these parameters provide essential insights for properly assessing the ability of a mission architecture to meet desired goals and objectives. These trades also aid in determining the overall usable propellant required for supporting nominal and off-nominal missions over the entire operational lifetime of the program, thus they support vehicle sizing.

  18. HEROIC: 3D general relativistic radiative post-processor with comptonization for black hole accretion discs

    Science.gov (United States)

    Narayan, Ramesh; Zhu, Yucong; Psaltis, Dimitrios; Sadowski, Aleksander

    2016-03-01

    We describe Hybrid Evaluator for Radiative Objects Including Comptonization (HEROIC), an upgraded version of the relativistic radiative post-processor code HERO described in a previous paper, but which now Includes Comptonization. HEROIC models Comptonization via the Kompaneets equation, using a quadratic approximation for the source function in a short characteristics radiation solver. It employs a simple form of accelerated lambda iteration to handle regions of high scattering opacity. In addition to solving for the radiation field, HEROIC also solves for the gas temperature by applying the condition of radiative equilibrium. We present benchmarks and tests of the Comptonization module in HEROIC with simple 1D and 3D scattering problems. We also test the ability of the code to handle various relativistic effects using model atmospheres and accretion flows in a black hole space-time. We present two applications of HEROIC to general relativistic magnetohydrodynamics simulations of accretion discs. One application is to a thin accretion disc around a black hole. We find that the gas below the photosphere in the multidimensional HEROIC solution is nearly isothermal, quite different from previous solutions based on 1D plane parallel atmospheres. The second application is to a geometrically thick radiation-dominated accretion disc accreting at 11 times the Eddington rate. Here, the multidimensional HEROIC solution shows that, for observers who are on axis and look down the polar funnel, the isotropic equivalent luminosity could be more than 10 times the Eddington limit, even though the spectrum might still look thermal and show no signs of relativistic beaming.

  19. Methane Post-Processor Development to Increase Oxygen Recovery beyond State-of-the-Art Carbon Dioxide Reduction Technology

    Science.gov (United States)

    Abney, Morgan B.; Greenwood, Zachary; Miller, Lee A.; Alvarez, Giraldo; Iannantuono, Michelle; Jones, Kenny

    2013-01-01

    State-of-the-art life support carbon dioxide (CO2) reduction technology, based on the Sabatier reaction, is theoretically capable of 50% recovery of oxygen from metabolic CO2. This recovery is constrained by the limited availability of reactant hydrogen. Post-processing of the methane byproduct from the Sabatier reactor results in hydrogen recycle and a subsequent increase in oxygen recovery. For this purpose, a Methane Post-Processor Assembly containing three sub-systems has been developed and tested. The assembly includes a Methane Purification Assembly (MePA) to remove residual CO2 and water vapor from the Sabatier product stream, a Plasma Pyrolysis Assembly (PPA) to partially pyrolyze methane into hydrogen and acetylene, and an Acetylene Separation Assembly (ASepA) to purify the hydrogen product for recycle. The results of partially integrated testing of the sub-systems are reported

  1. PP: A graphics post-processor for the EQ6 reaction path code

    Energy Technology Data Exchange (ETDEWEB)

    Stockman, H.W.

    1994-09-01

    The PP code is a graphics post-processor and plotting program for EQ6, a popular reaction-path code. PP runs on personal computers, allocates memory dynamically, and can handle very large reaction path runs. Plots of simple variable groups, such as fluid and solid phase composition, can be obtained with as few as two keystrokes. Navigation through the list of reaction path variables is simple and efficient. Graphics files can be exported for inclusion in word processing documents and spreadsheets, and experimental data may be imported and superposed on the reaction path runs. The EQ6 thermodynamic database can be searched from within PP, to simplify interpretation of complex plots.

  2. Systematic evaluation of autoregressive error models as post-processors for a probabilistic streamflow forecast system

    Science.gov (United States)

    Morawietz, Martin; Xu, Chong-Yu; Gottschalk, Lars; Tallaksen, Lena

    2010-05-01

    A post-processor that accounts for hydrologic uncertainty is a necessary component of a probabilistic streamflow forecast system, since it represents the uncertainty introduced by the hydrological model. In this study different variants of an autoregressive error model that can be used as a post-processor for short- to medium-range streamflow forecasts are evaluated. The deterministic HBV model is used to form the basis for the streamflow forecast. The general structure of the error models then used as post-processor is a first-order autoregressive model of the form d_t = α d_(t-1) + σ ε_t, where d_t is the model error (observed minus simulated streamflow) at time t, α and σ are the parameters of the error model, and ε_t is the residual error described through a probability distribution. The following aspects are investigated: (1) Use of constant parameters α and σ versus the use of state-dependent parameters. The state-dependent parameters vary depending on the states of temperature, precipitation, snow water equivalent and simulated streamflow. (2) Use of a standard normal distribution for ε_t versus use of an empirical distribution function constituted through the normalized residuals of the error model in the calibration period. (3) Comparison of two different transformations, i.e. logarithmic versus square root, that are applied to the streamflow data before the error model is applied. The reason for applying a transformation is to make the residuals of the error model homoscedastic over the range of streamflow values of different magnitudes. Through combination of these three characteristics, eight variants of the autoregressive post-processor are generated. These are calibrated and validated in 55 catchments throughout Norway. The discrete ranked probability score with 99 flow percentiles as standardized thresholds is used for evaluation. In addition, a non-parametric bootstrap is used to construct confidence intervals and evaluate the significance of the results. The main
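
    A minimal sketch of one such variant (square-root transformation, constant parameters, standard normal residuals) is given below; the parameter values are illustrative, not calibrated values from the study.

```python
# Sketch of one post-processor variant (square-root transformation, constant
# parameters, standard normal residuals); parameter values are illustrative.
import math
import random

ALPHA, SIGMA = 0.8, 0.15                 # assumed calibrated error-model parameters

def forecast_distribution(q_sim, d_prev, n_samples=1000):
    """Sample streamflow values for one time step.

    q_sim  : streamflow simulated by the hydrological model for time t
    d_prev : previous error d_(t-1) in transformed (square-root) space
    """
    z_sim = math.sqrt(q_sim)                        # transform the simulation
    samples = []
    for _ in range(n_samples):
        eps = random.gauss(0.0, 1.0)                # epsilon_t ~ N(0, 1)
        d_t = ALPHA * d_prev + SIGMA * eps          # d_t = alpha d_(t-1) + sigma epsilon_t
        samples.append(max(z_sim + d_t, 0.0) ** 2)  # observed = simulated + error, back-transformed
    return samples

qs = sorted(forecast_distribution(q_sim=25.0, d_prev=0.3))
print(qs[len(qs) // 20], qs[len(qs) // 2], qs[-len(qs) // 20])   # ~5%, 50%, 95% quantiles
```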

  3. HERO - A 3D general relativistic radiative post-processor for accretion discs around black holes

    Science.gov (United States)

    Zhu, Yucong; Narayan, Ramesh; Sadowski, Aleksander; Psaltis, Dimitrios

    2015-08-01

    HERO (Hybrid Evaluator for Radiative Objects) is a 3D general relativistic radiative transfer code which has been tailored to the problem of analysing radiation from simulations of relativistic accretion discs around black holes. HERO is designed to be used as a post-processor. Given some fixed fluid structure for the disc (i.e. density and velocity as a function of position from a hydrodynamic or magnetohydrodynamic simulation), the code obtains a self-consistent solution for the radiation field and for the gas temperatures using the condition of radiative equilibrium. The novel aspect of HERO is that it combines two techniques: (1) a short-characteristics (SC) solver that quickly converges to a self-consistent disc temperature and radiation field, with (2) a long-characteristics (LC) solver that provides a more accurate solution for the radiation near the photosphere and in the optically thin regions. By combining these two techniques, we gain both the computational speed of SC and the high accuracy of LC. We present tests of HERO on a range of 1D, 2D, and 3D problems in flat space and show that the results agree well with both analytical and benchmark solutions. We also test the ability of the code to handle relativistic problems in curved space. Finally, we discuss the important topic of ray defects, a major limitation of the SC method, and describe our strategy for minimizing the induced error.

  4. A Bayesian joint probability post-processor for reducing errors and quantifying uncertainty in monthly streamflow predictions

    Directory of Open Access Journals (Sweden)

    P. Pokhrel

    2012-10-01

    Hydrological post-processors refer here to statistical models that are applied to hydrological model predictions to further reduce prediction errors and to quantify remaining uncertainty. For streamflow predictions, post-processors are generally applied to daily or sub-daily time scales. For many applications such as seasonal streamflow forecasting and water resources assessment, monthly volumes of streamflows are of primary interest. While it is possible to aggregate post-processed daily or sub-daily predictions to monthly time scales, the monthly volumes so produced may not have the least errors achievable and may not be reliable in uncertainty distributions. Post-processing directly at the monthly time scale is likely to be more effective. In this study, we investigate the use of a Bayesian joint probability modelling approach to directly post-process model predictions of monthly streamflow volumes. We apply the BJP post-processor to 18 catchments located in eastern Australia and demonstrate its effectiveness in reducing prediction errors and quantifying prediction uncertainty.
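
    The sketch below illustrates the joint-probability idea in its simplest form (log transformation, bivariate normal fitted by moments); the actual BJP post-processor uses more elaborate transformations and Bayesian parameter inference, so this is only a conceptual stand-in with toy data.

```python
# Conceptual sketch only: log transformation and a moment-fitted bivariate normal
# stand in for the BJP's transformations and Bayesian inference; data are toys.
import numpy as np
from scipy.stats import norm

def fit(pred, obs):
    """Estimate bivariate-normal parameters of (log prediction, log observation)."""
    x, y = np.log(pred), np.log(obs)
    return x.mean(), y.mean(), x.std(ddof=1), y.std(ddof=1), np.corrcoef(x, y)[0, 1]

def post_process(new_pred, params, quantiles=(0.05, 0.5, 0.95)):
    """Quantiles of the conditional distribution of the observation given the prediction."""
    mx, my, sx, sy, rho = params
    x0 = np.log(new_pred)
    cond_mean = my + rho * sy / sx * (x0 - mx)     # conditional mean in log space
    cond_std = sy * np.sqrt(1.0 - rho ** 2)        # conditional spread in log space
    return {q: float(np.exp(norm.ppf(q, cond_mean, cond_std))) for q in quantiles}

pred = np.array([12.0, 30.0, 55.0, 20.0, 80.0, 15.0])   # toy monthly volumes
obs = np.array([10.0, 36.0, 48.0, 25.0, 70.0, 18.0])
print(post_process(40.0, fit(pred, obs)))
```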

  5. HDTV satellite broadcasting in the EHF domain: Feasibility Study and Quality Assessment

    OpenAIRE

    Conci, Nicola; Rossi, Tommaso; Sacchi, Claudio

    2012-01-01

    In this work we propose a simulation-based feasibility study for the efficient exploitation of W band (75-110GHz) for high quality HDTV broadcasting applications. In order to obtain a reliable and realistic simulation environment, we have considered the DVB-S2 standard specifications, introducing the typical W-band impairments such as phase noise, rain attenuation, as well as non-linearities. For testing purposes, we have adopted common High-Definition benchmark video sequences, so as to eval...

  6. Design and Implementation of the Motion Compensation Module for HDTV Video Decoder

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    This paper presents a new solution for the motion compensation module in the high definition television (HDTV) video decoder. The overall architecture and the design of the major functional units, such as the motion vector decoder, the predictor, and the mixer, are discussed. Based on the exploitation of the special characteristics inherent in the motion compensation algorithm, the motion compensation module and its functional units adopt various novel architectures in order to allow the module to meet real-time constraints. This solution resolves the problems of high hardware cost, low bus efficiency and complex control schemes in conventional designs.

  7. Evaluation of Dual-Launch Lunar Architectures Using the Mission Assessment Post Processor

    Science.gov (United States)

    Stewart, Shaun M.; Senent, Juan; Williams, Jacob; Condon, Gerald L.; Lee, David E.

    2010-01-01

    The National Aeronautics and Space Administration's (NASA) Constellation Program is currently designing a new transportation system to replace the Space Shuttle, support human missions to both the International Space Station (ISS) and the Moon, and enable the eventual establishment of an outpost on the lunar surface. The present Constellation architecture is designed to meet nominal capability requirements and provide flexibility sufficient for handling a host of contingency scenarios including (but not limited to) launch delays at the Earth. This report summarizes a body of work performed in support of the Review of U.S. Human Space Flight Committee. It analyzes three lunar orbit rendezvous dual-launch architecture options which incorporate differing methodologies for mitigating the effects of launch delays at the Earth. NASA employed the recently-developed Mission Assessment Post Processor (MAPP) tool to quickly evaluate vehicle performance requirements for several candidate approaches for conducting human missions to the Moon. The MAPP tool enabled analysis of Earth perturbation effects and Earth-Moon geometry effects on the integrated vehicle performance as it varies over the 18.6-year lunar nodal cycle. Results are provided summarizing best-case and worst-case vehicle propellant requirements for each architecture option. Additionally, the associated vehicle payload mass requirements at launch are compared between each architecture and against those of the Constellation Program. The current Constellation Program architecture assumes that the Altair lunar lander and Earth Departure Stage (EDS) vehicles are launched on a heavy lift launch vehicle. The Orion Crew Exploration Vehicle (CEV) is separately launched on a smaller man-rated vehicle. This strategy relaxes man-rating requirements for the heavy lift launch vehicle and has the potential to significantly reduce the cost of the overall architecture over the operational lifetime of the program. The crew launch

  8. Stereoscopic HDTV Research at NHK Science and Technology Research Laboratories

    CERN Document Server

    Yamanoue, Hirokazu; Nojiri, Yuji

    2012-01-01

    This book focuses on the two psychological factors of naturalness and ease of viewing of three-dimensional high-definition television (3D HDTV) images. It has been said that distortions peculiar to stereoscopic images, such as the “puppet theater” effect or the “cardboard” effect, spoil the sense of presence. Whereas many earlier studies have focused on geometrical calculations about these distortions, this book instead describes the relationship between the naturalness of reproduced 3D HDTV images and the nonlinearity of depthwise reproduction. The ease of viewing of each scene is regarded as one of the causal factors of visual fatigue. Many of the earlier studies have been concerned with the accurate extraction of local parallax; however, this book describes the typical spatiotemporal distribution of parallax in 3D images. The purpose of the book is to examine the correlations between the psychological factors and amount of characteristics of parallax distribution in order to understand the characte...

  9. The Nordic HDTV system HD-DIVINE

    Institute of Scientific and Technical Information of China (English)

    1995-01-01

    At the 1994 IBC exhibition, the HDTV development group formed by the Nordic broadcasting organizations and their development departments demonstrated the Nordic HDTV system HD-DIVINE, using a modem of their own prototype manufacture. The group was not expected to finalize the detailed specification of the HD-DIVINE system until February 1995.

  10. Windows Media Player can also play HDTV

    Institute of Scientific and Technical Information of China (English)

    李学昌

    2005-01-01

    Many readers will have heard of HDTV (High Definition TV), which is short for high-definition television. Besides high-definition picture quality, it offers 5.1-channel surround sound. HDTV content is usually played with Media Player Classic, but the Windows Media Player that ships with Windows can also play HDTV.

  11. Design and implementation of an efficient SDRAM controller for HDTV decoder

    Institute of Scientific and Technical Information of China (English)

    Wang Xiaohui; Zhao Yiqiang; Xie Xiaodong; Wu Di; Zhang Peng

    2007-01-01

    A high performance SDRAM controller for an HDTV decoder is designed. MB-based (macroblock) address mapping, adaptive precharge and command interleaving are adopted in this controller. MB-based address mapping reduces the precharge operations of the video processing unit in one access; adaptive precharge avoids unnecessary precharge operations; and command interleaving inserts the precharge and activate commands of the next access into the command sequence of the current access, thus reducing the no-operation (NOP) cycles. The combination of these three schemes effectively improves SDRAM performance. Compared with the precharge-all scheme, adaptive precharge and command interleaving reduce the SDRAM overhead cycles by 70% and increase SDRAM performance by up to 19.2% in the best case. This controller has been implemented in an AVS SoC and operates at 200 MHz.

  12. Understanding the diffusion of HDTV through an analysis of risks and uncertainties of supply and demand in the Netherlands

    NARCIS (Netherlands)

    Baaren, Eva; Huizer, E.; van de Wijngaert, Lidwien; Vuorimaa, Petri; Naranen, Pertti

    2010-01-01

    This paper analyses the diffusion of HDTV in The Netherlands. The research provides an analysis of the supply side of the broadcast value chain as well as an analysis of consumer acceptance of HDTV in The Netherlands. The research is part of longitudinal research effort and uses both qualitative and

  13. The emperor's clothes in high resolution: an experimental study of the framing effect and the diffusion of HDTV

    NARCIS (Netherlands)

    Joor, Dirkjan; Beekhuizen, Wilco; Wijngaert, van de Lidwien; Baaren, Eva

    2009-01-01

    In this article, an experiment was conducted to measure the effect of framing a high definition television (HDTV) clip. One group of participants was told they were watching a brand new HDTV clip, while the other group was told they were watching a digital DVD clip. Both groups were in fact watching

  14. Automation of ORIGEN2 calculations for the transuranic waste baseline inventory database using a pre-processor and a post-processor

    Energy Technology Data Exchange (ETDEWEB)

    Liscum-Powell, J. [Sandia National Labs., Albuquerque, NM (United States). Nuclear Safety and Systems Analysis]

    1997-06-01

    The purpose of the work described in this report was to automate ORIGEN2 calculations for the Waste Isolation Pilot Plant (WIPP) Transuranic Waste Baseline Inventory Database (WTWBID); this was done by developing a pre-processor to generate ORIGEN2 input files from WTWBID inventory files and a post-processor to remove excess information from the ORIGEN2 output files. The calculations performed with ORIGEN2 estimate the radioactive decay and buildup of various radionuclides in the waste streams identified in the WTWBID. The resulting radionuclide inventories are needed for performance assessment calculations for the WIPP site. The work resulted in the development of PreORG, which requires interaction with the user to generate ORIGEN2 input files on a site-by-site basis, and PostORG, which processes ORIGEN2 output into more manageable files. Both programs are written in the FORTRAN 77 computer language. After running PreORG, the user will run ORIGEN2 to generate the desired data; upon completion of ORIGEN2 calculations, the user can run PostORG to process the output to make it more manageable. All the programs run on a 386 PC or higher with a math co-processor or on a computer platform running under the VMS operating system. The pre- and post-processors for ORIGEN2 were generated for use with Rev. 1 data of the WTWBID and can also be used with Rev. 2 and 3 data of the TWBID (Transuranic Waste Baseline Inventory Database).

  15. A Hierarchical Joint Optimized Bit-allocation Strategy for HDTV Encoder with Parallel Coding Architecture

    Institute of Scientific and Technical Information of China (English)

    XIONG Hongkai; YU Songyu; YE Wei

    2003-01-01

    Because real-time compression and high-speed digital processing circuitry are crucial for digital high definition television (HDTV) coding, parallel processing has become a feasible scheme in most applications so far. This paper presents a novel bit-allocation strategy for an HDTV encoder system with parallel architecture, in which the original HDTV picture is divided into six horizontal sub-pictures. It is shown that the MPEG-2 Test Model 5 (TM5) rate control scheme would not only give rise to inconsistent visual quality across the sub-pictures of a composite HDTV frame, but also make the coding quality degrade abruptly and the buffer underflow at scene changes. How to allocate bit rates among the sub-pictures has therefore become a significant challenge in the literature. The proposed strategy is dedicated to a hierarchical joint optimized bit allocation based on the sub-pictures' average complexity and average bits measures, and is moreover capable of alleviating serious picture quality inconsistency at scene changes. The optimized bit allocation and its complementary rate-adaptive procedures are formulated and described. In the paper, the proposed strategy is compared with independent coding, in which each sub-picture sequence is assigned the same proportion of the channel bandwidth. Experimental results demonstrate that the proposed scheme not only alleviates the boundary effect but also ensures consistent quality across the sub-pictures.
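
    The core idea of splitting the frame budget by sub-picture complexity instead of equally can be sketched as below; this is a simplification of the paper's hierarchical scheme, and the complexity measure, floor share and numbers are illustrative assumptions.

```python
# Sketch of complexity-proportional bit allocation among the six sub-pictures;
# the complexity measure, floor share and numbers are illustrative assumptions.
def allocate_bits(frame_budget_bits, complexities, min_share=0.05):
    """Split a frame-level bit budget among sub-pictures by relative complexity."""
    total = float(sum(complexities))
    shares = [max(c / total, min_share) for c in complexities]   # floor avoids starving easy bands
    norm = sum(shares)
    return [int(frame_budget_bits * s / norm) for s in shares]

complexities = [1.2e6, 0.4e6, 2.5e6, 2.3e6, 0.6e6, 0.5e6]        # per sub-picture activity
print(allocate_bits(frame_budget_bits=800_000, complexities=complexities))
```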

  16. IR Sensor Synchronizing Active Shutter Glasses for 3D HDTV with Flexible Liquid Crystal Lenses

    Directory of Open Access Journals (Sweden)

    Jeong In Han

    2013-12-01

    IR sensor synchronizing active shutter glasses for three-dimensional high definition television (3D HDTV) were developed using a flexible liquid crystal (FLC) lens. The FLC lens was made on a polycarbonate (PC) substrate using conventional liquid crystal display (LCD) processes. The flexible liquid crystal lens displayed a maximum transmission of 32% and a total response time of 2.56 ms. The transmittance, contrast ratio and response time of the flexible liquid crystal lens were superior to those of glass liquid crystal lenses. A microcontroller unit and drivers were developed as part of a reception module with power supply for the IR sensor synchronizing active shutter glasses with the flexible liquid crystal lens prototypes. IR sensor synchronizing active shutter glasses for 3D HDTV with flexible liquid crystal lenses produced excellent 3D image viewing characteristics.

  17. SDI HDTV video transmission unit; SDI high vision eizo denso sochi

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    The number of broadcasting stations that use HDTV (high-definition television) signals for program production is increasing, in order to improve the video quality of general broadcasting programs. The unit named above, which transmits broadcast-material HDTV signals over long distances with their high quality well preserved, is capable of the optical transmission of SDI (serial digital interface) signals meeting the standards of the BTA (Broadcasting Technologies Association) on one video channel and two digital audio channels. It can also operate on an optical wavelength multiplexing system. It was developed under the guidance of TTNet (Tokyo Telecommunication Network Co., Inc.), and gained public favor when it successfully transmitted programs over a distance of approximately 400 km from Nagano to Tokyo during the Nagano Olympic Games. It now transmits various high-quality video signals, for instance in relays of baseball games, thereby contributing to the improvement of video quality for broadcasting. (translated by NEDO)

  18. Highly parallel implementation of sub-pixel interpolation for AVS HDTV decoder

    Institute of Scientific and Technical Information of China (English)

    Wan-yi LI; Lu YU

    2008-01-01

    In this paper, we propose an effective VLSI architecture of sub-pixel interpolation for motion compensation in the AVS HDTV decoder. To utilize the similar arithmetical operations of the 15 luma sub-pixel positions, three types of interpolation filters are proposed. A simplified multiplier is presented due to the limited range of input in the chroma interpolation process. To improve the processing throughput, a parallel and pipelined computing architecture is adopted. The simulation results show that the proposed hardware implementation can satisfy the real-time constraint for the AVS HDTV (1920×1088) 30 fps decoder by operating at 108 MHz with 38.18k logic gates. Meanwhile, it costs only 216 cycles to accomplish one macroblock, which means the B-frame sub-pixel interpolation can be realized by using only one set of the proposed architecture under real-time constraints.

  19. IR Sensor Synchronizing Active Shutter Glasses for 3D HDTV with Flexible Liquid Crystal Lenses

    OpenAIRE

    Jeong In Han

    2013-01-01

    IR sensor synchronizing active shutter glasses for three-dimensional high definition television (3D HDTV) were developed using a flexible liquid crystal (FLC) lens. The FLC lens was made on a polycarbonate (PC) substrate using conventional liquid crystal display (LCD) processes. The flexible liquid crystal lens displayed a maximum transmission of 32% and total response time of 2.56 ms. The transmittance, the contrast ratio and the response time of the flexible liquid crystal lens were superio...

  20. FIRINPC and FIRACPC graphics post-processor support user's guide and programmer's reference

    Energy Technology Data Exchange (ETDEWEB)

    Hensel, E. [New Mexico State Univ., Las Cruces, NM (United States). Dept. of Mechanical Engineering]

    1992-03-01

    FIRIN is a computer program used by DOE fire protection engineers to simulate hypothetical fire accidents in compartments at DOE facilities. The FIRIN code is typically used in conjunction with a ventilation system code such as FIRAC, which models the impact of the fire compartment upon the rest of the system. The code described here, FIRINPC, is a PC-based implementation of the full mainframe code FIRIN. In addition, FIRINPC contains graphics support for monitoring the progress of the simulation during execution and for reviewing the complete results of the simulation upon completion of the run. This document describes how to install, test, and subsequently use the code FIRINPC, and addresses differences in usage between the PC version of the code and its mainframe predecessor. The PC version contains all of the modeling capabilities of the earlier version, with additional graphics support. This user's guide is a supplement to the original FIRIN report published by the NRC. FIRAC is a computer program used by DOE fire protection engineers to simulate the transient response of a complete ventilation system to fire-induced transients. FIRAC has the ability to use the FIRIN code as the driving function or source term for the ventilation system response. The current version of FIRAC does not contain interactive graphics capabilities. A third program, called POST, is made available for reviewing the results of a previous FIRIN or FIRAC simulation, without having to recompute the numerical simulation. POST uses the output data files created by FIRINPC and FIRACPC to avoid recomputation.

  1. The Real Time Mixing Module Design for HDTV Data of SMPTE 274M and PC Video Data

    Institute of Scientific and Technical Information of China (English)

    魏江力; 赵保军; 韩月秋

    2003-01-01

    A real time mixing module for high definition television (HDTV) data of SMPTE 274M and PC video data is designed. The hardware implementation, algorithm and simulation of the mixing module are given. In order to improve the capability of data processing, an anti-fuse FPGA chip and a mechanism of pipelining and modularization are adopted. With 6 parallel LUTs and a fast algorithm, it can mix 4:2:2 component signals in luminance and chrominance space respectively in real time. According to the simulation, the module has the ability to mix uncompressed HDTV data with PC video data in real time, which cannot be fulfilled by current ASIC chips. Furthermore, it can be extended to multi-stage mixing following the ideas implied by the design. The mixing module can be widely used in HDTV production systems.

  2. High-definition television (HDTV)

    OpenAIRE

    Pozo Meneses, Fanny Alicia; Pérez Ramos, Tania

    1997-01-01

    With the development of new broadcast transmission technologies and the need to improve the video quality of current television systems, the focus for the future is on achieving a substantial improvement in video resolution. Systems with this superior video quality have been named High Definition Television (HDTV) systems. The introduction of digital technology into the television field is lagging behind ...

  3. A Method for Representing AFD Aspect Ratio Conversion Information between HDTV and SDTV in MXF Files

    Institute of Scientific and Technical Information of China (English)

    凌坚; 周春燕

    2012-01-01

    Aspect ratio conversion is necessary to achieve HDTV and SDTV simulcast and material sharing. In this paper, based on an analysis of the characteristics of the AFD standard and of the MXF file structure and data encapsulation method, a method is put forward for embedding aspect ratio conversion information that conforms to the AFD standard. The AFD data are treated as metadata and, in accordance with the MXF file metadata extension specification, the method achieves both the aspect ratio conversion between HDTV and SDTV and the binding of the conversion information to the material data.

  4. 2002 Blue Marble and Developments in HDTV Technology for Public Outreach

    Science.gov (United States)

    Hasler, Fritz; Starr, David OC. (Technical Monitor)

    2001-01-01

    Fritz Hasler (NASA/Goddard) will demonstrate the latest Blue Marble Digital Earth technology. We will fly in from space through Terra, Landsat 7, to 1 m Ikonos "Spy Satellite" data of Disney World and the Orlando Convention Center. You will see the complete global cloud free and cloudy 500 m datasets from the EOS Terra satellite. Spectacular new animations from Terra, Landsat 7, and SeaWiFS will be presented. See also animations of the hurricanes & tropical storms of the 2001 season, as well as Floyd, Georges, and Mitch, etc. from GOES & TRMM supported by MM5 3-D nested numerical model results. See movies assembled using new low cost HDTV nonlinear editing equipment that is revolutionizing the way we communicate scientific results. See climate change in action with Global Land & Ocean productivity changes over the last 20 years. Remote sensing observations of ocean SST, height, winds, color, and El Nino from GOES, AVHRR, SSMI & SeaWiFS are put in context with atmospheric and ocean simulations. Compare symmetrical equatorial eddies observed by GOES with the simulations.

  5. INVESTIGATING THE FACTORS THAT INFLUENCE THE ADOPTION OF HDTV IN BRAZIL

    Directory of Open Access Journals (Sweden)

    Daniel da Hora Alves Lima

    2012-12-01

    Three years after the launch of digital TV in Brazil, its penetration is still small. Consumers of high-definition content have mainly been pay-TV subscribers, generally high-income households but potential opinion leaders. To identify incentives and barriers to the adoption of high-definition (HD) subscription TV, fifteen potential users were interviewed. A survey was then conducted on a sample of 348 pay-TV subscribers who had not contracted HD packages. The data were analyzed with data-mining and decision-tree techniques, identifying the relationship between perceived attributes of the HD service and the intention to subscribe to it. The results suggest that the availability of HD content, perceived features, household income, and the perceived ease of use of the technology have a significant impact on the intention to adopt an HDTV subscription service.

  6. Application of the IDT 72V2113 FIFO within an HDTV encoder

    Institute of Scientific and Technical Information of China (English)

    彭京湘

    2000-01-01

    In an HDTV (high-definition television) encoder, the input video stream is compressed with MPEG-2 and the output is a compressed HDTV video image. FIFO (first-in, first-out) memories can be used at different points along the HDTV encoder path, mainly to provide frequency coupling. The FIFOs are set up in ×9 and ×18 configurations and connected into a 1M-deep memory bank. The FIFO write and read clocks run at 27 MHz to 74 MHz, and FIFO read and write operations are performed simultaneously.

  7. Live Broadcasting of High Definition Audiovisual Content Using HDTV over Broadband IP Networks

    Directory of Open Access Journals (Sweden)

    C. E. Vegiris

    2008-01-01

    The current paper focuses on validating an implementation of a state-of-the-art audiovisual (AV) technologies setup for live broadcasting of cultural shows via broadband Internet. The main objective of the work was to study, configure, and set up dedicated audio-video equipment for the processes of capturing, processing, and transmission of extended-resolution and high-fidelity AV content in order to increase realism and achieve maximum audience sensation. Internet2 and GEANT broadband telecommunication networks were selected as the most applicable technology to deliver such traffic workloads. Validation procedures were conducted in combination with metric-based quality of service (QoS) and quality of experience (QoE) evaluation experiments for the quantification and the perceptual interpretation of the quality achieved during content reproduction. The implemented system was successfully applied in real-world applications, such as the transmission of cultural events from the Thessaloniki Concert Hall throughout Greece as well as the reproduction of Philadelphia Orchestra performances (USA) via the Internet2 and GEANT backbones.

  8. Post-processor for simulations of the ORIGEN program and calculation of the activity composition of a fuel core burned in a BWR-type reactor

    Energy Technology Data Exchange (ETDEWEB)

    Sandoval V, S. [IIE, Av. Reforma 113, Col. Palmira, 62490 Cuernavaca, Morelos (Mexico)]. e-mail: sandoval@iie.org.mx

    2006-07-01

    Calculating the composition and activity of nuclear materials subjected to burnup, irradiation and decay periods is useful for diverse activities within the nuclear industry, such as the design of processes and operations that handle radioactive material, the calculation of the inventory and activity of a burned nuclear fuel core, probabilistic safety analysis (APS) studies, and the regulation and licensing of nuclear facilities. ORIGEN is a computer program that calculates the composition and activity of nuclear materials subjected to periods of burnup, irradiation and decay. ORIGEN generates a large amount of information whose processing and analysis are laborious and require thoroughness to avoid errors. Automating the extraction, conditioning and classification of that information is of great utility for the analyst. The post-processor presented in this work facilitates, speeds up and broadens the capacity for analysis of results, since diverse queries can be made with several options for classifying and filtering the results. As an illustration of the utility of the post-processor, and as an analysis of interest in itself, this work also presents the composition of the activity of a core burned in a BWR-type reactor according to the following classification criteria: by type of radioisotope (fission products, activation products and actinides), by species type (gaseous, volatile, semi-volatile and non-volatile), by element and by chemical group. The results show that the total activity of the studied core is dominated by the fission products and by the actinides, in a proportion of four to one, and that the gaseous and volatile species make up a fifth of the total activity of the core. (Author)

  9. The HDTV digital audio matrix

    Science.gov (United States)

    Mason, A. J.

    Multichannel sound systems are being studied as part of the Eureka 95 and Radiocommunication Bureau TG10-1 investigations into high definition television. One emerging sound system has five channels: three at the front and two at the back. This raises some compatibility issues. The listener might have only, say, two loudspeakers, or the material to be broadcast may have fewer than five channels. The problem is how best to produce a set of signals to be broadcast, suitable for all listeners, from those that are available. To investigate this area, a device has been designed and built which has six input channels and six output channels. Each output signal is a linear combination of the input signals. The inputs and outputs are in AES/EBU digital audio format using BBC-designed AESIC chips. The matrix operation, to produce the six outputs from the six inputs, is performed by a Motorola DSP56001. The user interface and 'housekeeping' are managed by a T222 transputer. The operator of the matrix uses a VDU to enter sets of coefficients and a rotary switch to select which set to use. A set of analog controls is also available and is used to control operations other than the simple compatibility matrixing. The matrix has been very useful for simple tasks: mixing a stereo signal into mono, creating a stereo signal from a mono signal, applying a fixed gain or attenuation to a signal, exchanging the A and B channels of an AES/EBU bitstream, and so on. These are readily achieved using simple sets of coefficients. Additions to the user interface software have led to several more sophisticated applications which still consist of a matrix operation. Different multichannel panning laws have been evaluated: the analog controls adjust the panning while the audio signals are processed digitally using a matrix operation. A digital SoundField microphone decoder has also been implemented. The design of the digital audio matrix is such that it can be applied to a wide variety of signal processing tasks. The combination of a dedicated DSP chip programmed in assembly language for speed of operation and a general purpose processor for user interface tasks programmed in a high level language has been found to be extremely useful.
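
    A minimal numpy sketch of the underlying matrix operation is given below: each output channel is a linear combination of the six inputs, so downmixing, gain and channel exchange are just different coefficient sets; the coefficients shown are illustrative, not the BBC device's values.

```python
# Sketch of the 6-in/6-out matrix operation; coefficient values are illustrative.
import numpy as np

def mix(frame, coeffs):
    """frame: six input samples for one sample period; coeffs: 6x6 matrix."""
    return coeffs @ frame

stereo_to_mono = np.eye(6)                 # start from straight pass-through
stereo_to_mono[0, 0:2] = [0.5, 0.5]        # output 0 = 0.5*L + 0.5*R
stereo_to_mono[1, 0:2] = [0.5, 0.5]        # output 1 carries the same mono signal

frame = np.array([0.8, -0.2, 0.1, 0.0, 0.3, 0.3])
print(mix(frame, stereo_to_mono))
```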

  10. H.264 SVC Complexity Reduction Based on Likelihood Mode Decision

    Directory of Open Access Journals (Sweden)

    L. Balaji

    2015-01-01

    H.264 Advanced Video Coding (AVC) was extended to Scalable Video Coding (SVC). SVC runs on a range of electronic devices such as personal computers, HDTV, SDTV, IPTV, and full-HD TV, where users demand various scalings of the same content. The scaling dimensions include resolution, frame rate, quality, heterogeneous networks, bandwidth, and so forth. Scalability increases encoding time and computational complexity during mode selection. In this paper, to reduce encoding time and computational complexity, a fast mode decision algorithm based on likelihood mode decision (LMD) is proposed. LMD is evaluated for both temporal and spatial scalability. From the results, we conclude that LMD performs well when compared with previous fast mode decision algorithms. The comparison parameters are time, PSNR, and bit rate. LMD achieves a time saving of 66.65% with a 0.05% loss in PSNR and a 0.17% increase in bit rate compared with the full search method.

  11. HDTV Picture Quality Improvement Based on the HQV Vida Processor

    Institute of Scientific and Technical Information of China (English)

    卫建华; 贺新华

    2010-01-01

    When watching Internet and other video signals through a high-definition television receiver, most video content is heavily compressed because of bandwidth and other limitations, and the image signal contains a large amount of noise; when viewed on a large HDTV screen, compression artifacts are visible everywhere. To solve this problem, an HQV Vida processor featuring the two new technologies Auto HQV and HQV StreamClean can be used to de-noise the image signal. The HQV Vida processor automatically improves the quality of the video image and provides powerful source-video clean-up that removes noise, helping to present as sharp and clean an image as possible and giving television users a true high-definition viewing experience.

  12. Design and Implementation of HDTV Broadcasting System Improvement for Guangdong TV Station

    Institute of Scientific and Technical Information of China (English)

    王希

    2013-01-01

    To expand its broadcast scale, the broadcast center of Guangdong TV station upgraded its original HDTV broadcasting system. This paper describes in detail the new system architecture and the design ideas for the video/audio system, the broadcast control system and the video server system, and gives examples of the special technical designs adopted for the new system's features and for broadcast security, which give the new system strong technical advancement, expansibility and compatibility.

  13. Study and development on the application of ABAQUS post processor in thin-walled tube NC bending based on Python%基于Python的ABAQUS后处理研究开发及其在薄壁管数控弯曲中的应用

    Institute of Scientific and Technical Information of China (English)

    郭玲; 杨合; 邱晞; 李恒; 詹梅; 郭良刚

    2007-01-01

    To improve the efficiency and accuracy of extracting and analysing results from the nonlinear finite element software ABAQUS, post-processing programs for ABAQUS were developed using the object-oriented Python language. The general approach and steps of such program development are presented and, on this basis, an ABAQUS post-processing program was developed for the specific application of thin-walled tube NC bending. It is used to judge whether wrinkling occurs and to locate the wrinkled region, and to calculate the maximum wall-thickness thinning and its location, laying a foundation for optimising this complex forming process. The results show that the developed program can efficiently and accurately extract and manipulate the useful information in the very large ABAQUS result files, enabling quantitative analysis and summary of the simulation results. The paper also provides guidance and reference for Python-based development in other fields.
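
    For orientation, a script of the kind described above queries the ABAQUS output database (.odb) from the ABAQUS Python interpreter. The sketch below is illustrative only: the job name, step name, nominal thickness and the use of the shell-thickness field output 'STH' are assumptions, not details taken from the paper, and the script must be run with ABAQUS's own Python (e.g. `abaqus python extract_thinning.py`).

```python
# extract_thinning.py -- minimal sketch; requires an ABAQUS installation
from odbAccess import openOdb

ODB_PATH = 'tube_bend.odb'   # hypothetical job name
STEP_NAME = 'Bend'           # hypothetical step name
T0 = 1.0                     # assumed nominal wall thickness (mm)

odb = openOdb(ODB_PATH, readOnly=True)
frame = odb.steps[STEP_NAME].frames[-1]      # last increment of the bending step
thickness = frame.fieldOutputs['STH']        # current shell thickness (must be requested as output)

worst = min(thickness.values, key=lambda v: v.data)
thinning_pct = (T0 - worst.data) / T0 * 100.0
print('max wall thinning: %.2f%% at element %d' % (thinning_pct, worst.elementLabel))
odb.close()
```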

  14. Finite element based simulation of dry sliding wear

    Science.gov (United States)

    Hegadekatte, V.; Huber, N.; Kraft, O.

    2005-01-01

    In order to predict wear and eventually the life-span of complex mechanical systems, several hundred thousand operating cycles have to be simulated. Therefore, a finite element (FE) post-processor is the optimum choice, considering the computational expense. A wear simulation approach based on Archard's wear law is implemented in an FE post-processor that works in association with a commercial FE package, ABAQUS, for solving the general deformable-deformable contact problem. Local wear is computed and then integrated over the sliding distance using the Euler integration scheme. The wear simulation tool works in a loop and performs a series of static FE-simulations with updated surface geometries to get a realistic contact pressure distribution on the contacting surfaces. It will be demonstrated that this efficient approach can simulate wear on both two-dimensional and three-dimensional surface topologies. The wear on both the interacting surfaces is computed using the contact pressure distribution from a two-dimensional or three-dimensional simulation, depending on the case. After every wear step the geometry is re-meshed to correct the deformed mesh due to wear, thus ensuring a fairly uniform mesh for further processing. The importance and suitability of such a wear simulation tool will be enunciated in this paper.
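
    The core update in such a wear post-processor is Archard's law integrated over sliding distance, dh = k * p * ds, applied node by node between FE contact solves. A generic sketch of that loop is given below (not the authors' implementation; solve_contact and update_geometry stand in for the FE solve and the re-meshing step):

```python
import numpy as np

def archard_step(wear_depth, pressure, k_wear, d_slide):
    """One explicit Euler step of Archard's law per surface node: dh = k * p * ds."""
    return wear_depth + k_wear * pressure * d_slide

def simulate_wear(n_steps, k_wear, d_slide, solve_contact, update_geometry):
    wear = None
    for _ in range(n_steps):
        pressure = solve_contact()            # static FE contact solve (placeholder callable)
        if wear is None:
            wear = np.zeros_like(pressure)
        wear = archard_step(wear, pressure, k_wear, d_slide)
        update_geometry(wear)                 # move surface nodes, re-mesh as needed
    return wear
```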

  15. Residual uncertainty estimation using instance-based learning with applications to hydrologic forecasting

    Science.gov (United States)

    Wani, Omar; Beckers, Joost V. L.; Weerts, Albrecht H.; Solomatine, Dimitri P.

    2017-08-01

    A non-parametric method is applied to quantify residual uncertainty in hydrologic streamflow forecasting. This method acts as a post-processor on deterministic model forecasts and generates a residual uncertainty distribution. Based on instance-based learning, it uses a k nearest-neighbour search for similar historical hydrometeorological conditions to determine uncertainty intervals from a set of historical errors, i.e. discrepancies between past forecast and observation. The performance of this method is assessed using test cases of hydrologic forecasting in two UK rivers: the Severn and Brue. Forecasts in retrospect were made and their uncertainties were estimated using kNN resampling and two alternative uncertainty estimators: quantile regression (QR) and uncertainty estimation based on local errors and clustering (UNEEC). Results show that kNN uncertainty estimation produces accurate and narrow uncertainty intervals with good probability coverage. Analysis also shows that the performance of this technique depends on the choice of search space. Nevertheless, the accuracy and reliability of uncertainty intervals generated using kNN resampling are at least comparable to those produced by QR and UNEEC. It is concluded that kNN uncertainty estimation is an interesting alternative to other post-processors, like QR and UNEEC, for estimating forecast uncertainty. Apart from its concept being simple and well understood, an advantage of this method is that it is relatively easy to implement.
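
    The kNN resampling idea can be stated compactly: find the k historical situations most similar to the current hydrometeorological conditions and build the interval from the quantiles of their forecast errors. A minimal sketch follows (not the authors' code; the Euclidean distance and the 5-95% interval are assumptions):

```python
import numpy as np

def knn_interval(current_features, hist_features, hist_errors, k=50, q=(0.05, 0.95)):
    """Residual-uncertainty interval from the k most similar historical cases."""
    d = np.linalg.norm(hist_features - current_features, axis=1)  # similarity search
    nearest = np.argsort(d)[:k]
    lo, hi = np.quantile(hist_errors[nearest], q)
    return lo, hi

# usage: bounds around a deterministic forecast value `det`
# lo, hi = knn_interval(x_now, X_hist, err_hist); interval = (det + lo, det + hi)
```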

  16. Octave: A MARSYAS post-processor for computer-aideed control system design

    Science.gov (United States)

    Hodel, A. Scottedward

    1993-01-01

    MARSYAS is a computer-aided control system analysis package for the simulation and analysis of dynamic systems. In the summer of 1991, MARSYAS was updated to allow for the analysis of sampled-data systems in terms of frequency response, stability, etc. This update was continued during the summer of 1992 in order to further extend MARSYAS commands to the study of sampled-data systems. Further work was done to examine the computation of open-loop transfer functions, root loci and omega-plane frequency response plots. At the conclusion of the summer 1992 work, it was proposed that control-system design capability be incorporated into the MARSYAS package. It was decided at that time to develop a separate 'stand-alone' computer-aided control system design (CACSD) package. This report is a brief description of such a package.

  17. Pre- and post-processor for the ICOOL muon transport code

    CERN Document Server

    Fawley, W M

    2001-01-01

    ICOOL is a Fortran77 macroparticle transport code widely used by researchers to study the front end of a neutrino factory/muon collider. In part due to the desire that ICOOL be usable over multiple computer platforms and operating systems, the code uses simple text files for input/output services. This choice, together with user-driven requests for an ever greater choice of lattice element types and configurations, has led to ICOOL input decks becoming rather difficult to compose and modify easily. Moreover, the lack of a standard graphical postprocessor has prevented many ICOOL users from extracting all but the most simple results from the output files. Here I present two attempts to improve this situation: first, a simple but quite general graphical pre-processor (NIME) written in Tcl/Tk that permits users to write and maintain ASCII-formatted input files by use of simple macro definitions and expansions; second, an interactive postprocessor written in Fortran90 and NCAR graphics, which allows users to def...

  18. Modernization of the graphics post-processors of the Hamburg German Climate Computer Center Carbon Cycle Codes

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, E.J.; McNeilly, G.S.

    1994-03-01

    The existing National Center for Atmospheric Research (NCAR) code in the Hamburg Oceanic Carbon Cycle Circulation Model and the Hamburg Large-Scale Geostrophic Ocean General Circulation Model was modernized and reduced in size while still producing an equivalent end result. A reduction in the size of the existing code from more than 50,000 lines to approximately 7,500 lines in the new code has made the new code much easier to maintain. The existing code in the Hamburg models uses legacy NCAR graphics (including even emulated CALCOMP subroutines) to display graphical output. The new code uses only current (version 3.1) NCAR subroutines.

  20. Wartime Requirements for Ammunition, Materiel and Personnel (WARRAMP). Volume III. Ammunition Post Processor User’s Manual (APP-UM).

    Science.gov (United States)

    1981-12-01

    preparation of the Distribution Requirement Report and the 3-Day Report; these are produced and copied into the output file REPORTI by the program, and is... day in the study, the number of samples or postures that are being played, and the AMMO (or DATA) expenditure file. The main output file REPORTI is... 34 is to be used instead of the Program File name for the remainder of the processing; o The old output file REPORTI (56REPORT) is deleted; o The input...

  1. Wartime Requirements for Ammunition, Materiel, and Personnel (WARRAMP). Volume IV. Ammunition Post-Processor Program Maintenance Manual.

    Science.gov (United States)

    1982-02-01

    ... flag in logic tests for output printing. JEVEC(1): a one-dimensional integer array set to the value of IESD, reserved as 50. The array serves as a vector... 6: days 121-150; 7: days 151-180. SEVEC(3,I): a two-dimensional real array (utilized as a vector) set to the stylized expenditures of a single...

  2. Fast Second Degree Total Variation Method for Image Compressive Sensing.

    Science.gov (United States)

    Liu, Pengfei; Xiao, Liang; Zhang, Jun

    2015-01-01

    This paper presents a computationally efficient algorithm for image compressive sensing reconstruction using second degree total variation (HDTV2) regularization. Firstly, an equivalent formulation of the HDTV2 functional is derived, which can be expressed as a weighted L1-L2 mixed norm of second degree image derivatives under the spectral decomposition framework. Secondly, using this equivalent formulation of HDTV2, we introduce an efficient forward-backward splitting (FBS) scheme to solve the HDTV2-based image reconstruction model. Furthermore, from the averaged non-expansive operator point of view, we analyse the convergence of the proposed FBS algorithm in detail. Experiments on medical images demonstrate that the proposed method outperforms several fast algorithms for the TV and HDTV2 reconstruction models in terms of peak signal to noise ratio (PSNR), structural similarity index (SSIM) and convergence speed.
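
    Forward-backward splitting alternates a gradient step on the smooth data-fidelity term with a proximal step on the regulariser. As a generic illustration of the scheme only (the HDTV2 proximal operator itself is more involved and is not reproduced here), an FBS iteration for an L1-regularised least-squares problem looks like this:

```python
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fbs_l1(A, y, lam, step, n_iter=200):
    """min_x 0.5*||Ax - y||^2 + lam*||x||_1 via forward-backward splitting.

    step should satisfy step <= 1 / ||A^T A|| for convergence.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)                         # forward (gradient) step
        x = soft_threshold(x - step * grad, step * lam)  # backward (proximal) step
    return x
```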

  3. Application of genetic algorithm to hexagon-based motion estimation.

    Science.gov (United States)

    Kung, Chih-Ming; Cheng, Wan-Shu; Jeng, Jyh-Horng

    2014-01-01

    With the improvement of science and technology, the development of networks, and the arrival of HDTV, the demand for audio and video services keeps growing, and video coding technology is the key to meeting these requirements. Motion estimation, which removes the redundancy between video frames, plays an important role in video coding, so many researchers have devoted themselves to the problem. Existing fast algorithms rely on the assumption that the matching error decreases monotonically as the searched point moves closer to the global optimum. A genetic algorithm, however, is not fundamentally limited by this restriction; this property helps the proposed scheme reach a mean square error closer to that of full search than the other fast algorithms. The aim of this paper is to propose a new technique that combines the hexagon-based search algorithm, which is faster than diamond search, with a genetic algorithm. Experiments are performed to demonstrate the encoding speed and accuracy of the hexagon-based search pattern method and the proposed method.
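
    For reference, the hexagon-based search the paper builds on proceeds roughly as sketched below (a simplified version, without the genetic-algorithm component the authors add): evaluate a large six-point hexagon around the current best candidate, recentre until the centre wins, then refine with a small four-point pattern. The block size and the SAD cost are assumptions of the sketch.

```python
import numpy as np

LARGE_HEX = [(0, 0), (2, 0), (-2, 0), (1, 2), (-1, 2), (1, -2), (-1, -2)]
SMALL_PAT = [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]

def sad(block, ref, x, y, bs):
    """Sum of absolute differences; infinite cost outside the reference frame."""
    h, w = ref.shape
    if x < 0 or y < 0 or y + bs > h or x + bs > w:
        return np.inf
    return np.abs(block - ref[y:y + bs, x:x + bs]).sum()

def hexagon_search(block, ref, x0, y0, bs=16):
    cx, cy = x0, y0
    while True:                                   # large hexagon stage
        costs = [(sad(block, ref, cx + dx, cy + dy, bs), dx, dy) for dx, dy in LARGE_HEX]
        _, dx, dy = min(costs)
        if (dx, dy) == (0, 0):
            break
        cx, cy = cx + dx, cy + dy
    costs = [(sad(block, ref, cx + dx, cy + dy, bs), dx, dy) for dx, dy in SMALL_PAT]
    _, dx, dy = min(costs)
    return cx + dx - x0, cy + dy - y0             # motion vector
```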

  4. The Developments and Application of Computer Simulation in Heat Treatment Process Based on Metallo-Thermo-Mechanical Theory

    Institute of Scientific and Technical Information of China (English)

    Dong-Ying Ju

    2000-01-01

    The aim of this paper is to introduce the application and future of computer simulation in heat treatment, an important method in materials manufacturing. Developments of the finite-element-based code "HEARTS" are presented, with experiments applied to the simulation of general quenching, quenching and tempering, and quenching with carburization or induction heating. The code "HEARTS" for 3-D simulation of heat treatment processes based on a metallo-thermo-mechanical theory is presented. Coupled equations of heat conduction, inelastic stresses and the kinetics of phase transformations are derived, as well as the diffusion equation during carburization, followed by the finite element formulation. The program supports 2-D and 3-D simulation of various heat treatment processes, such as quenching, tempering and so on. The system is used in a CAE environment with pre- and post-processors based on "I-DEAS", "PATRAN" or "FEMAP". Examples of simulated results of carburized quenching of a cylindrical rod, a ring and a gear wheel are also presented.

  5. Dislocation-based plasticity model and micro-beam Laue diffraction analysis of polycrystalline Ni foil: A forward prediction

    Science.gov (United States)

    Song, Xu; Hofmann, Felix; Korsunsky, Alexander M.

    2010-10-01

    A physically-based, rate and length-scale dependent strain gradient crystal plasticity framework was employed to simulate the polycrystalline plastic deformation at the microscopic level in a large-grained, commercially pure Ni sample. The latter was characterised in terms of the grain morphology and orientation (in the bulk) by micro-beam Laue diffraction experiments carried out on beamline B16 at Diamond Light Source. The corresponding finite element model was developed using a grain-based mesh with the specific grain orientation assignment appropriate for the sample considered. Sample stretching to 2% plastic strain was simulated, and a post-processor was developed to extract the information about the local lattice misorientation (curvature), enabling forward-prediction of the Laue diffraction patterns. The 'streaking' phenomenon of the Laue spots (anisotropic broadening of two-dimensional (2D) diffraction peaks observed on the 2D detector) was correctly captured by the simulation, as constructed by direct superposition of reflections from different integration points within the diffraction gauge volume. Good agreement was found between the images collected from experiments and simulation patterns at various positions in the sample.

  6. Development of a Microwave Regenerative Sorbent-Based Hydrogen Purifier

    Science.gov (United States)

    Wheeler, Richard R., Jr.; Dewberry, Ross H.; McCurry, Bryan D.; Abney, Morgan B.; Greenwood, Zachary W.

    2016-01-01

    This paper describes the design and fabrication of a Microwave Regenerative Sorbent-based Hydrogen Purifier (MRSHP). This unique microwave powered technology was developed for the purification of a hydrogen stream produced by the Plasma Pyrolysis Assembly (PPA). The PPA is a hydrogen recovery (from methane) post processor for NASA's Sabatier-based carbon dioxide reduction process. Embodied in the Carbon dioxide Reduction Assembly (CRA), currently aboard the International Space Station (ISS), the Sabatier reaction employs hydrogen to catalytically recover oxygen, in the form of water, from respiratory carbon dioxide produced by the crew. This same approach is base-lined for future service in the Air Revitalization system on extended missions into deep space where resupply is not practical. Accordingly, manned exploration to Mars may only become feasible with further closure of the air loop as afforded by the greater hydrogen recovery permitted by the PPA with subsequent hydrogen purification. By utilizing the well-known high sorbate loading capacity of molecular sieve 13x, coupled with microwave dielectric heating phenomenon, MRSHP technology is employed as a regenerative filter for a contaminated hydrogen gas stream. By design, freshly regenerated molecular sieve 13x contained in the MRSHP will remove contaminants from the effluent of a 1-CM scale PPA for several hours prior to breakthrough. By reversing flow and pulling a relative vacuum the MRSHP prototype then uses 2.45 GHz microwave power, applied through a novel coaxial antenna array, to rapidly heat the sorbent bed and drive off the contaminants in a short duration vacuum/thermal contaminant desorption step. Finally, following rapid cooling via room temperature cold plates, the MRSHP is again ready to serve as a hydrogen filter.

  7. 基于Python的数控编程后置处理器设计%The Postprocessor of NC Programming Design Based on Python

    Institute of Scientific and Technical Information of China (English)

    崔传辉

    2012-01-01

    A design method for a Python-based NC programming post-processor is proposed, and the standard APT cutter location (CL) file format is analysed. Taking the DMU70eV five-axis machine tool with a double rotary table as the target machine, a practical five-axis milling post-processor was designed and developed using Python's file management, string processing and numerical computation functions. The method was shown to be correct and feasible in practice.
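
    The heart of such a post-processor is reading APT-style GOTO records (tool tip position plus tool-axis vector) and converting the axis vector into the machine's two rotary angles before emitting G-code. The toy sketch below is not the paper's implementation: the A/C kinematic convention, the word formats and the sample record are all assumptions, and a real table-table post would also have to transform X/Y/Z into the rotated workpiece frame.

```python
import math

def axis_to_ac(i, j, k):
    """Tool-axis vector (i, j, k) -> A/C rotary angles (degrees), assumed table-table layout."""
    a = math.degrees(math.acos(max(-1.0, min(1.0, k))))          # tilt away from +Z
    c = math.degrees(math.atan2(i, j)) if (abs(i) > 1e-9 or abs(j) > 1e-9) else 0.0
    return a, c

def apt_goto_to_gcode(record):
    # e.g. "GOTO/ 12.5, 3.0, 40.0, 0.0, 0.258819, 0.965926"
    x, y, z, i, j, k = (float(v) for v in record.split('/')[1].split(','))
    a, c = axis_to_ac(i, j, k)
    return 'G01 X%.3f Y%.3f Z%.3f A%.3f C%.3f' % (x, y, z, a, c)

print(apt_goto_to_gcode('GOTO/ 12.5, 3.0, 40.0, 0.0, 0.258819, 0.965926'))
```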

  8. The Application of EPLD on HDTV Video Signal Post-Processing

    Institute of Scientific and Technical Information of China (English)

    JIANG Linhua; LIU Yuehua; CHAI Zhenming; LIU Geng

    2001-01-01

    Because of the flexibility of programmable devices as well as other merits, an EPLD (erasable programmable logic device) chip was used to implement the matrix transformation of the HDTV video signal. In this work, Altera's EPLD chip EPM9400LC84-15 was used. A series of computer simulations and optimizations were carried out. First, a 54 MHz processing speed was achieved by use of an improved carry-selection parallel addition structure. Second, two techniques, fine adjustment of the transformation equations and their coefficients, and conversion of the floating-point algorithm to a fixed-point algorithm, prevent loss of precision and overflow of intermediate results. Third, many combinations of different resource and precision requirements were compared and evaluated at the final transformation output stage, and a good compromise between chip logic resources and transformation precision was obtained. Furthermore, a concise correlation curve between them is illustrated in detail and a precision equation is defined, using an 8 x 8-bit multiplier as an example. After simple modification, this kind of correlation approach can also be applied to similar EPLD designs and to other video processing fields.
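
    One of the steps mentioned above, converting the floating-point matrix coefficients to fixed point and checking how much precision a given word length preserves, can be sketched as follows. The coefficient values shown (a BT.709-style luma row) and the word lengths are examples only, not the values used in the EPLD design.

```python
def to_fixed(value, frac_bits=8):
    """Quantise a coefficient to a signed integer with frac_bits fractional bits."""
    return int(round(value * (1 << frac_bits)))

def max_coeff_error(coeffs, frac_bits=8):
    """Worst-case coefficient quantisation error for a given fractional word length."""
    scale = float(1 << frac_bits)
    return max(abs(c - to_fixed(c, frac_bits) / scale) for row in coeffs for c in row)

coeffs = [[0.2126, 0.7152, 0.0722]]      # example luma coefficients
for bits in (6, 8, 10):
    print(bits, 'fractional bits -> max coefficient error', max_coeff_error(coeffs, bits))
```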

  9. LG Electronics to launch HDTVs supporting the Netflix video streaming service

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    According to Nikkei Electronics, on 5 January 2009 (US time) LG Electronics of South Korea announced high-definition digital televisions (HDTVs) that support the video streaming service of the large US online DVD rental company Netflix. The LCD and plasma TVs announced have Netflix's streaming software embedded directly, so Netflix content can be viewed without any external device.

  10. DVB-T2: An Outline of HDTV and UHDTV Programmes Broadcasting

    Directory of Open Access Journals (Sweden)

    M. Milivojević

    2015-11-01

    Full Text Available Increasing the video resolution of HD programmes requires a higher bit rate. On the other hand, continuous enhancement of existing source coding techniques and the introduction of new ones are evident. The technical evolution of the digital terrestrial television platform improves channel coding and RF channel utilization, and some optional solutions have also been proposed. This paper discusses the possibilities of allocating HD programmes in the formats expected in the near future. The calculations performed provide guidance for an accurate examination of the efficiency of allocating high definition programmes in the DVB-T2 multiplex, in line with existing and expected enhancements.

  11. Understanding technology adoption through individual and context characteristics: the case of HDTV

    NARCIS (Netherlands)

    Baaren, Eva; Wijngaert, van de Lidwien; Huizer, Erik

    2011-01-01

    Technology adoption research has a tradition of using and improving Davis' (1989) “Technology Acceptance Model” (TAM) and extended versions of it. This article suggests a break with this tradition by showing that the TAM is limited in its understanding of technology adoption. Two alternative approac

  12. A method of groundwater quality assessment based on fuzzy network-CANFIS and geographic information system (GIS)

    Science.gov (United States)

    Gholami, V.; Khaleghi, M. R.; Sebghati, M.

    2016-12-01

    Water quality testing is a costly, time-consuming and difficult but important stage of routine monitoring. Therefore, the use of models has become commonplace in simulating water quality. In this study, the coactive neuro-fuzzy inference system (CANFIS) was used to simulate groundwater quality. Further, a geographic information system (GIS) was used as the pre-processor and post-processor tool to demonstrate the spatial variation of groundwater quality. All important factors were quantified and a groundwater quality index (GWQI) was developed. The proposed model was trained and validated using a case study of the Mazandaran Plain in the northern part of Iran. The factors affecting groundwater quality were the input variables for the simulation, whereas the GWQI was the output. The developed model was validated for simulating groundwater quality, with network validation performed by comparing the estimated and actual GWQI values. In GIS, the study area was converted to raster format with a pixel size of 1 km, and by incorporating the input data layers of the Fuzzy Network-CANFIS model, geo-referenced layers of the factors affecting groundwater quality were obtained. The numeric values of each pixel, together with their geographical coordinates, were then fed into the Fuzzy Network-CANFIS model, and the groundwater quality across the study area was simulated. Finally, the GWQI values simulated by the Fuzzy Network-CANFIS model were entered into GIS, and a groundwater quality map (raster layer) based on the network simulation was produced. The study's results confirm the high efficiency of combining neuro-fuzzy techniques with GIS. It is also worth noting that the general quality of the groundwater in most of the studied plain is fairly low.

  13. 基于Android平台利用Wi-Fi Direct设计并实现音视频共享系统%Design and Implement the system to share Audio and Video using Wi-Fi Direct based on Android platform

    Institute of Scientific and Technical Information of China (English)

    泰凯文

    2012-01-01

    An audio and video sharing system using Wi-Fi Direct on the Android platform is introduced in this paper. The system runs on Android and complies with the Wi-Fi Direct standard. It establishes a wireless P2P connection between the Android device and a wireless adapter, and the audio and video stream is transported over the Wi-Fi module so that the content of the Android device is displayed on an HDTV connected to the adapter.

  14. Base

    DEFF Research Database (Denmark)

    Hjulmand, Lise-Lotte; Johansson, Christer

    2004-01-01

    BASE - Engelsk basisgrammatik (an English basic grammar) is the result of Lise-Lotte Hjulmand's thorough adaptation and extensive revision of Christer Johansson's Engelska basgrammatik. The grammar differs from the Swedish original on a large number of points. Among other things, it has been adapted to a Danish audience and to the Danish...

  15. 基于JPEG硬编码的嵌入式无线图像传输处理终端%Embedded Wireless Image Transmitting and Processing Terminal Based on JPEG Hardware Encoding

    Institute of Scientific and Technical Information of China (English)

    沈龙梅; 张立文; 国珊珊; 宋占伟

    2013-01-01

    In a modern multimedia teaching system, students obtain visual information only from the slides on the big screen; because of distance and other factors they cannot see the material clearly, nor can they automatically store the content of the teacher's lecture. To meet this need, an embedded processor, an image acquisition module and WiFi (Wireless Fidelity) were combined, and JPEG (Joint Photographic Experts Group) hardware encoding together with a PP (Post Processor) supporting image scaling and format conversion were introduced, to design an embedded wireless image transmission and processing terminal based on JPEG hardware encoding. On the student side, the images broadcast by the teacher's terminal can be decoded, displayed and stored automatically, improving learning efficiency and realising image information interaction between terminals. Finally, in the Linux environment, Ad-Hoc networking and communication programs were written to test the design; the experimental data verify the feasibility and completeness of the system architecture, network protocols and core algorithms. Connection between terminals is real-time and efficient, and the speed of image acquisition, compression, display and wireless transmission meets the required standards.

  16. New approach to color calibration of high fidelity color digital camera by using unique wide gamut color generator based on LED diodes

    Science.gov (United States)

    Kretkowski, M.; Shimodaira, Y.; Jabłoński, R.

    2008-11-01

    Development of a high accuracy color reproduction system requires certain instrumentation and a reference for color calibration. Our research led to the development of a high fidelity color digital camera with built-in filters that realize the color matching functions. The output signal returns XYZ values, which provide an absolute description of color. In order to produce XYZ output, a mathematical conversion based on a conversion matrix must be applied to the CCD output values. The conversion matrix coefficients are calculated using a color reference with known XYZ values and the corresponding CCD sensor outputs under each filter, acquired from a number of color samples. The most important feature of the camera is its ability to acquire colors from the complete theoretically visible color gamut, thanks to the implemented filters. However, commercially available color references such as the various color checkers are enclosed within the HDTV gamut, which is insufficient for calibration over the whole operating color range. This led to the development of a unique color reference based on LED diodes called the LED Color Generator (LED CG). It is capable of displaying colors in a wide color gamut estimated by the chromaticity coordinates of 12 primary colors. The total number of colors that can be produced is 255^12. The biggest advantage is the possibility of displaying colors with a desired spectral distribution (with certain approximations) due to the multiple primary colors it comprises. The average color difference obtained for test colors was found to be ΔE ≈ 0.78 for calibration with the LED CG. The result is much better and more repeatable than with the Macbeth ColorChecker™, which typically gives ΔE ≈ 1.2 and in the best case ΔE ≈ 0.83 with specially developed techniques.
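
    The conversion matrix described above is typically obtained by least squares from the calibration samples. A minimal sketch follows (not the authors' procedure; the column-wise stacking of samples and the use of an unconstrained least-squares fit are assumptions):

```python
import numpy as np

def fit_conversion_matrix(cam_responses, xyz_ref):
    """Least-squares 3x3 matrix M such that M @ cam ~= XYZ.

    cam_responses : (3, N) sensor outputs under the three filters for N colour samples
    xyz_ref       : (3, N) known XYZ values of the same samples (e.g. from the LED CG)
    """
    M, *_ = np.linalg.lstsq(cam_responses.T, xyz_ref.T, rcond=None)
    return M.T

def camera_to_xyz(M, cam):
    """Convert a (3,) or (3, N) block of camera responses to XYZ."""
    return M @ cam
```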

  17. Converting multiple OC-3c ATM streams to HIPPI to drive an HDTV frame buffer from a workstation cluster

    Energy Technology Data Exchange (ETDEWEB)

    Tolmie, D.E.; Dornhoff, A.G.; DuBois, A.J.

    1994-12-01

    A group of eight Digital Equipment Corporation Alpha workstations is interconnected with ATM to form a cluster with supercomputer power. For output, each workstation drives a single "tile" on an 8-tile high-resolution frame buffer. A special purpose adapter is used to convert the workstation's ATM format to the frame buffer's HIPPI format. This paper discusses the rationale behind the workstation farm, and then describes the visualization output path in detail. To provide the system quickly, special emphasis was placed on making the design as simple as possible. The design choices are examined, and the resultant system is described.

  18. Applications and Innovations for Use of High Definition and High Resolution Digital Motion Imagery in Space Operations

    Science.gov (United States)

    Grubbs, Rodney

    2016-01-01

    The first live High Definition Television (HDTV) from a spacecraft was in November, 2006, nearly ten years before the 2016 SpaceOps Conference. Much has changed since then. Now, live HDTV from the International Space Station (ISS) is routine. HDTV cameras stream live video views of the Earth from the exterior of the ISS every day on UStream, and HDTV has even flown around the Moon on a Japanese Space Agency spacecraft. A great deal has been learned about the operations applicability of HDTV and high resolution imagery since that first live broadcast. This paper will discuss the current state of real-time and file based HDTV and higher resolution video for space operations. A potential roadmap will be provided for further development and innovations of high-resolution digital motion imagery, including gaps in technology enablers, especially for deep space and unmanned missions. Specific topics to be covered in the paper will include: An update on radiation tolerance and performance of various camera types and sensors and ramifications on the future applicability of these types of cameras for space operations; Practical experience with downlinking very large imagery files with breaks in link coverage; Ramifications of larger camera resolutions like Ultra-High Definition, 6,000 [pixels] and 8,000 [pixels] in space applications; Enabling technologies such as the High Efficiency Video Codec, Bundle Streaming Delay Tolerant Networking, Optical Communications and Bayer Pattern Sensors and other similar innovations; Likely future operations scenarios for deep space missions with extreme latency and intermittent communications links.

  19. InP-based monolithically integrated 1310/1550nm diplexer/triplexer

    Science.gov (United States)

    Silfvenius, C.; Swillo, M.; Claesson, J.; Forsberg, E.; Akram, N.; Chacinski, M.; Thylén, L.

    2008-11-01

    Multiple streams of high definition television (HDTV) and improved home-working infrastructure are currently driving forces for potential fiber to the home (FTTH) customers [1]. There is an interest in reducing the cost and physical size of the FTTH equipment, and current fabrication methods have reached a cost minimum. We have addressed the cost challenge by developing 1310/(1490)/1550nm bidirectional diplexers, by monolithic seamless integration of lasers, photodiodes and wavelength division multiplexing (WDM) couplers into one single InP-based device. A 250nm wide optical gain profile covers the spectrum from 1310 to 1550nm and is the principal building block. The device fabrication is based on the established configuration of using split contacts on continuous waveguides. Optical and electrical cross-talk is further addressed by using a Y-configuration to physically separate the components from each other and to avoid in-line configurations in which the incoming signal travels through the laser component or vice versa. By eliminating butt-joint interfaces, which can reflect light between components or provide a current leakage path, and by leaving optically absorbing (unpumped active) material around the components to absorb spontaneous emission and unintentional reflections, the devices are optically and electrically isolated from each other. Ridge waveguides (RWG) form the waveguides and also maintain the absorbing material between them. The WDM functionality is designed for a large optical bandwidth, complying with the wide spectral range in FTTH applications and also reducing the polarization dependence of the WDM coupler. Lasing is achieved by forming facet-free, λ/4-shifted DFB (distributed feedback) lasers emitting directly into the waveguide. The photodiodes are waveguide photodiodes (WGPD). Our seamless technology is also able to array the single channel diplexers into 4 to 12 channel diplexer arrays with 250μm fiber port

  20. Using the PCRaster-POLFLOW approach to GIS-based modelling of coupled groundwater-surface water hydrology in the Forsmark Area

    Energy Technology Data Exchange (ETDEWEB)

    Jarsjoe, Jerker; Shibuo, Yoshihiro; Destouni, Georgia [Stockholm Univ. (Sweden). Dept. of Physical Geography and Quaternary Geology

    2004-09-01

    The catchment-scale hydrologic modelling approach PCRaster-POLFLOW permits the integration of environmental process modelling functions with classical GIS functions such as database maintenance and screen display. It has previously successfully been applied at relatively large river basins and catchments, such as Rhine, Elbe and Norrstroem, for modelling stream water flow and nutrient transport. In this study, we review the PCRaster-POLFLOW modelling approach and apply it using a relatively fine spatial resolution to the smaller catchment of Forsmark. As input we use data from SKB's database, which includes detailed data from Forsmark (and Simpevarp), since these locations are being investigated as part of the process to find a suitable location for a deep repository for spent nuclear fuel. We show, by comparison with independently measured, area-averaged runoff data, that the PCRaster-POLFLOW model produces results that, without using site-specific calibration, agree well with these independent measurements. In addition, we deliver results for four planned hydrological stations within the Forsmark catchment thus allowing for future direct comparisons with streamflow monitoring. We also show that, and how, the PCRaster-POLFLOW model in its present state can be used for predicting average seasonal streamflow. The present modelling exercise provided insights into possible ways of extending and using the PCRaster-POLFLOW model for applications beyond its current main focus of surface water hydrology. In particular, regarding analysis of possible surface water-groundwater interactions, we identify the Analytic Element Method for groundwater modelling together with its GIS-based pre- and post processor ArcFlow as suitable and promising for use in combination with the PCRaster-POLFLOW modelling approach. Furthermore, for transport modelling, such as that of radionuclides entering the coupled shallow groundwater-surface water hydrological system from possible deep

  1. LCD-TV in China-The New Must-have Appliance%液晶电视将成为中国电视机主流

    Institute of Scientific and Technical Information of China (English)

    Ken Tompkins

    2006-01-01

    With the 2008 Summer Olympics coming to Beijing, China, most are predicting an uptick in interest in HDTV and flat panels. What may be a little surprising is the very large growth this market will experience, at least according to a report in the China-based subsidiary of the newspaper, Fuji-Kezai.

  2. A Methodology for Platform Based High—Level System—on—Chip Verification

    Institute of Scientific and Technical Information of China (English)

    GAOFeng; LIUPeng; YAOQingdong

    2003-01-01

    The time-to-market challenge has increased the need for shortening the co-verification time in system-on-chip development. In this article, a new methodology for high-level hardware/software co-verification is introduced. With the help of the real-time operating system, the application program can easily be migrated from the software simulator to the hardware emulation board. The hierarchical architecture can be used to separate the application program from the implementation of the platform during the verification process. The high-level verification platform has been used successfully in developing an HDTV decoding chip.

  3. Forecast-Based Operations Support Tool for the New York City Water Supply System

    Science.gov (United States)

    Pyke, G.; Porter, J.

    2012-12-01

    USGS and ensemble inflow forecasts from the National Weather Service (NWS). Incoming data passes through an automated flagging/filling process, and data is presented to operators for approval prior to use as model input. OST allows the user to drive operational runs with two types of ensemble inflow forecasts. Statistical forecasts are based on historical inflows that are conditioned on antecedent hydrology. The statistical algorithm is relatively simple and versatile and is useful for longer-term projections. For improved short-term skill, OST will rely on NWS meteorologically-based ensemble forecasts. A post-processor within OST will provide bias correction for the NWS ensembles. OST applications to date have included routine short-term operational projections to support release decisions, analysis of tradeoffs between water supply and water quality during turbidity events, facility outage planning, development of operating rules and release policies, long-term water supply planning, and climate change assessment. The structure and capabilities of OST are expected to be a useful template for drinking water utilities and water system managers seeking to integrate forecasts into system operations and balance tradeoffs between competing objectives in both near-term operations and long-term planning.

  4. Improving high-resolution quantitative precipitation estimation via fusion of multiple radar-based precipitation products

    Science.gov (United States)

    Rafieeinasab, Arezoo; Norouzi, Amir; Seo, Dong-Jun; Nelson, Brian

    2015-12-01

    without post hoc bias correction of the ingredient QPEs. The results show that only SE passes the evaluation criterion consistently. The performance of DE and BC are generally comparable; while DE is more attractive for computational economy, BC is more attractive for reducing occurrences of negative estimates. The performance of RBC is poor as it does not account for magnitude-dependent biases in the QPE products. SE assumes that the higher-resolution QPE product is skillful in capturing spatiotemporal variability of precipitation at its native resolution, and that the lower-resolution QPE product provides skill at its native resolution. While the above assumptions may not always be met, the simplicity and robustness observed in this work make SE an extremely attractive choice as a simple post-processor to the QPE process. Also, unlike the other procedures considered in this work, it is extremely easy to update the statistical parameters of SE in real time, similarly to the real-time bias correction currently used in MPE, for improved performance via self-learning.

  5. Accurate dose assessment system for an exposed person utilising radiation transport calculation codes in emergency response to a radiological accident.

    Science.gov (United States)

    Takahashi, F; Shigemori, Y; Seki, A

    2009-01-01

    A system has been developed to assess the radiation dose distribution inside the body of an exposed person in a radiological accident by utilising the radiation transport calculation codes MCNP and MCNPX. The system consists mainly of two parts, a pre-processor and a post-processor for the radiation transport calculation. Programs in the pre-processor are used to set up a 'problem-dependent' input file, which defines the accident conditions and the dosimetric quantities to be estimated. The program developed for the post-processor part can effectively present dose information based upon the output file of the code. All of the programs in the dosimetry system can be executed on an ordinary personal computer and accurately give the dose profile of an exposed person in a radiological accident without complicated procedures. An experiment using a physical phantom was carried out to verify the availability of the dosimetry system with the developed programs in a gamma-ray irradiation field.

  6. Fiscal 1997 report on the verification survey of geothermal exploration technology. 5-1. Development of the reservoir variation survey method (technology of prediction of reservoir variation); 1997 nendo chinetsu tansa gijutsu nado kensho chosa. Choryuso hendo tansaho kaihatsu (choryuso hendo yosoku gijutsu) hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    This report describes the fiscal 1997 results of the study on reservoir variation prediction technology, aimed at reservoir evaluation at the initial development stage and at stabilising and maintaining output after the start of operation. Using the existing post-processor, reservoir models, post-development behaviour, and the resulting variations in gravity, self-potential and resistivity were computed to assess feasibility. Variation in the seismic wave velocity structure appeared clearly in the travel-time change distribution; a measurement accuracy of 1 ms is required to obtain sufficient detection resolution. A conceptual design for the post-processor development was produced for a system running on Windows. Based on reservoir numerical simulation technology, a reservoir modelling technique whose accuracy is improved by history matching was developed on a trial basis, taking variation parameters such as gravity and self-potential as new model constraints. Using the existing reservoir model of the Oguni area, reservoir behaviour over 50 years was predicted for a simulated 20 MW development. The effectiveness of the post-processor could be demonstrated, although the results were influenced by characteristics such as permeability and resistivity. 74 refs., 95 figs., 12 tabs.

  7. The Hardware Implementation of Digital HDTV Source Decoder%数字HDTV信源解码器的硬件实现

    Institute of Scientific and Technical Information of China (English)

    王承宁; 俞斯乐; 李华; 国澄明

    2000-01-01

    This paper presents an overall design scheme and a hardware implementation method for a digital HDTV source decoder based mainly on FPGAs. The test system and test results of the complete unit are also introduced. The source decoder has been completed and took part in the successful public demonstration of China's HDTV functional prototype system held in Beijing from 8 to 11 September 1998.

  8. 中央电视台高清后期网络制作系统%HDTV Postproduction Network System in CCTV

    Institute of Scientific and Technical Information of China (English)

    马悦

    2007-01-01

    This paper describes the programme production methods, overall system structure, production workflow and innovations of CCTV's HDTV post-production network system. The system makes useful explorations in large-scale, high-quality HD production while also supporting a broad range of HD applications, optimising the ingest workflow, integrating HD production with the media asset management system, producing captions and pictures separately while playing them out in sync, network-based remote programme review, and interconnection in practice.

  9. Some Aspects of Analysis of a Micromirror

    OpenAIRE

    Dipti Razdan; A.B. Chattopadhyay

    2015-01-01

    A micromirror is a very small mirror based on the principles of Micro Electro Mechanical Systems (MEMS). Micromirror applications in areas such as laser scanning displays, DLP projection systems and HDTV are realized using MEMS technology. In this study, an electrostatically controlled micromirror is designed using COMSOL multiphysics software. The structural and mechanical properties of the actuation mechanism of various shapes of micromirror will be studied. The base materials used will include...

  10. Design and Optimization of the VLSI Architecture for Discrete Cosine Transform Used in Image Compression

    OpenAIRE

    Kovač, Mario; Žagar, Mario; Ranganathan, N.

    1996-01-01

    Presentation of images plays a significant role in today's information exchange. Numerous applications introduced in the last few years, such as video teleconferencing, HDTV, wirephoto, fax, computer tomography, interactive visualization, multimedia and others, are based on image presentation and distribution procedures. A disadvantage of using digital images in these applications is the enormous amount of space needed for image storage. For example, a 1024 x 1024 color image with 24 bit...

  11. A Unified Approach to Restoration, Deinterlacing and Resolution Enhancement in Decoding MPEG-2 Video

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Martins, Bo

    2002-01-01

    The quality and spatial resolution of video can be improved by combining multiple pictures to form a single superresolution picture. We address the special problems associated with pictures of variable but somehow parameterized quality such as MPEG-decoded video. Our algorithm provides a unified......) interlaced high-definition TV (HDTV) in 4:2:0; (5) progressive HDTV in 4:2:0. These conversions also provide features such as freeze frame and zoom. The algorithm is mainly targeted at bit rates of 4-8 Mb/s. The algorithm is based on motion-compensated spatial upsampling from multiple images and decimation....... The superresolution pictures obtained by the algorithm are of much higher visual quality and have lower MSE than superresolution pictures obtained by simple spatial interpolation....

  12. Real-time IP-hologram conversion hardware based on floating point DSPs

    Science.gov (United States)

    Oi, Ryutaro; Mishina, Tomoyuki; Yamamoto, Kenji; Okui, Makoto

    2009-02-01

    Holography is a 3-D display method that fully satisfies the visual characteristics of the human eye. However, the hologram must be developed in a darkroom under laser illumination. We attempted hologram generation under white light by adopting an integral photography (IP) technique as the input. In this research, we developed a hardware converter to convert IP input (with 120×66 elemental images) to a hologram with high definition television (HDTV) resolution (approximately 2 million pixels). This conversion could be carried out in real time. In this conversion method, each elemental image can be independently extracted and processed. Our hardware contains twenty 300-MHz floating-point digital signal processors (DSPs) operating in parallel. We verified real-time conversion operations by the implemented hardware.

  13. SEAWAT: A Computer Program for Simulation of Variable-Density Groundwater Flow and Multi-Species Solute and Heat Transport

    Science.gov (United States)

    Langevin, Christian D.

    2009-01-01

    SEAWAT is a MODFLOW-based computer program designed to simulate variable-density groundwater flow coupled with multi-species solute and heat transport. The program has been used for a wide variety of groundwater studies including saltwater intrusion in coastal aquifers, aquifer storage and recovery in brackish limestone aquifers, and brine migration within continental aquifers. SEAWAT is relatively easy to apply because it uses the familiar MODFLOW structure. Thus, most commonly used pre- and post-processors can be used to create datasets and visualize results. SEAWAT is a public domain computer program distributed free of charge by the U.S. Geological Survey.

  14. Fatigue design of welded joints using the finite element method and the 2007 ASME Div. 2 Master curve

    Directory of Open Access Journals (Sweden)

    G. Nicoletto

    2009-07-01

    Full Text Available Fatigue design of welded structures is primarily based on nominal stress, hot spot stress methods or local approaches, each having several limitations when coupled with finite element modeling. An alternative recent structural stress definition is discussed and implemented in a post-processor. It provides an effective means for the direct coupling of finite element results to the fatigue assessment of welded joints in complex structures. The applications presented in this work confirm the main features of the method: mesh-insensitivity, accurate crack location and life-to-failure predictions.

  15. Experimental service of 3DTV broadcasting relay in Korea

    Science.gov (United States)

    Hur, Namho; Ahn, Chung-Hyun; Ahn, Chieteuk

    2002-11-01

    This paper introduces 3D HDTV relay broadcasting experiments of 2002 FIFA World Cup Korea/Japan using a terrestrial and satellite network. We have developed 3D HDTV cameras, 3D HDTV video multiplexer/demultiplexer, a 3D HDTV receiver, and a 3D HDTV OB van for field productions. By using a terrestrial and satellite network, we distributed a compressed 3D HDTV signal to predetermined demonstration venues which are approved by host broadcast services (HBS), KirchMedia, and FIFA. In this case, we transmitted a 40Mbps MPEG-2 transport stream (DVB-ASI) over a DS-3 network specified in ITU-T Rec. G.703. The video/audio compression formats are MPEG-2 main-profile, high-level and Dolby Digital AC-3 respectively. Then at venues, the recovered left and right images by the 3D HDTV receiver are displayed on a screen with polarized beam projectors.

  16. Effect of Adding a Regenerator to Kornhauser's MIT "Two-Space" (Gas-Spring+Heat Exchanger) Test Rig

    Science.gov (United States)

    Ebiana, Asuquo B.; Gidugu, Praveen

    2008-01-01

    This study employed entropy-based second law post-processing analysis to characterize the various thermodynamic losses inside a 3-space solution domain (gas spring+heat exchanger+regenerator) operating under conditions of oscillating pressure and oscillating flow. The 3- space solution domain is adapted from the 2-space solution domain (gas spring+heat exchanger) in Kornhauser's MIT test rig by modifying the heat exchanger space to include a porous regenerator system. A thermal nonequilibrium model which assumes that the regenerator porous matrix and gas average temperatures can differ by several degrees at a given axial location and time during the cycle is employed. An important and primary objective of this study is the development and application of a thermodynamic loss post-processor to characterize the major thermodynamic losses inside the 3-space model. It is anticipated that the experience gained from thermodynamic loss analysis of the simple 3-space model can be extrapolated to more complex systems like the Stirling engine. It is hoped that successful development of loss post-processors will facilitate the improvement of the optimization capability of Stirling engine analysis codes through better understanding of the heat transfer and power losses. It is also anticipated that the incorporation of a successful thermal nonequilibrium model of the regenerator in Stirling engine CFD analysis codes, will improve our ability to accurately model Stirling regenerators relative to current multidimensional thermal-equilibrium porous media models.

  17. 浅析高清电视时代灯光与化妆的关系%A Brief Analysis of Relationship between Lighting and Makeup in HDTV Era

    Institute of Scientific and Technical Information of China (English)

    文艺

    2014-01-01

    In the context of HD technology, the relationship between lighting and makeup, two keys to visual presentation in TV production, is discussed.

  18. Lotus Base

    DEFF Research Database (Denmark)

    Mun, Terry; Bachmann, Asger; Gupta, Vikas

    2016-01-01

    exploration of Lotus genomic and transcriptomic data. Equally important are user-friendly in-browser tools designed for data visualization and interpretation. Here, we present Lotus Base, which opens to the research community a large, established LORE1 insertion mutant population containing an excess of 120...... such data, allowing users to construct, visualize, and annotate co-expression gene networks. Lotus Base takes advantage of modern advances in browser technology to deliver powerful data interpretation for biologists. Its modular construction and publicly available application programming interface enable...... developers to tap into the wealth of integrated Lotus data. Lotus Base is freely accessible at: https://lotus.au.dk....

  19. Touch BASE

    CERN Multimedia

    Antonella Del Rosso

    2015-01-01

    In a recent Nature article (see here), the BASE collaboration reported the most precise comparison of the charge-to-mass ratio of the proton to its antimatter equivalent, the antiproton. This result is just the beginning and many more challenges lie ahead.   CERN's AD Hall, where the BASE experiment is set-up. The Baryon Antibaryon Symmetry Experiment (BASE) was approved in June 2013 and was ready to take data in August 2014. During these 14 months, the BASE collaboration worked hard to set up its four cryogenic Penning traps, which are the heart of the whole experiment. As their name indicates, these magnetic devices are used to trap antiparticles – antiprotons coming from the Antiproton Decelerator – and particles of matter – negative hydrogen ions produced in the system by interaction with a degrader that slows the antiprotons down, allowing scientists to perform their measurements. “We had very little time to set up the wh...

  20. Web based foundry knowledge base

    Directory of Open Access Journals (Sweden)

    A. Stawowy

    2009-01-01

    Full Text Available The main assumptions and functions of proposed Foundry Knowledge Base (FKB are presented in this paper. FKB is a framework forinformation exchange of casting products and manufacturing methods. We use CMS (Content Management System to develope andmaintain our web-based system. The CastML – XML dialect developed by authors for description of casting products and processes – isused as a tool for information interchange between ours and outside systems, while SQL is used to store and edit knowledge rules and alsoto solve the basic selection problems in the rule-based module. Besides the standard functions (companies data, news, events, forums and media kit, our website contains a number of nonstandard functions; the intelligent search module based on expert system is the main advantage of our solution. FKB is to be a social portal which content will be developed by foundry community.

  1. Simulation and Analysis of Ring Compression Using RP/RVP Meshless Method

    Institute of Scientific and Technical Information of China (English)

    YANG Yuying; LI Jing

    2006-01-01

    To avoid mesh distortion and iterative remeshing in mesh-based numerical analysis, a meshless approach based on the element-free Galerkin (EFG) method is applied to the metal forming analysis of ring compression. Discrete equations are formulated upon the moving least-squares (MLS) approximation and modified Markov variational principles for rigid-plastic/rigid-viscoplastic (RP/RVP) material models. A penalty function is used to enforce the incompressibility condition without volumetric locking. Based on the axisymmetric mechanical model, ring tests with different friction coefficients are studied. The deformed nodal configurations and shaded contours of equivalent strain are shown by the developed meshless post-processor. The comparison of meshless and finite element (FE) results validates the feasibility and accuracy of the meshless method for simulating metal forming processes.

  2. Efficient visualization of unsteady and huge scalar and vector fields

    Science.gov (United States)

    Vetter, Michael; Olbrich, Stephan

    2016-04-01

    and methods, we are developing a stand-alone post-processor, adding further data structures and mapping algorithms, and cooperating with the ICON developers and users. With the implementation of a DSVR-based post-processor, a milestone was achieved. By using the DSVR post-processor, the three processes mentioned are completely separated: the data set is processed in batch mode - e.g. on the same supercomputer on which the data are generated - and the interactive 3D rendering is done afterwards on the scientist's local system. At the current state of implementation, the DSVR post-processor supports the generation of isosurfaces and colored slicers on volume data set time series based on rectilinear grids, as well as the visualization of pathlines on time-varying flow fields based on either rectilinear grids or prism grids. The software implementation and evaluation are done on the supercomputers at DKRZ, including scalability tests using ICON output files in NetCDF format. The next milestones will be (a) the in-situ integration of the DSVR library in the ICON model and (b) the implementation of an isosurface algorithm for prism grids.

  3. Foundation: Transforming data bases into knowledge bases

    Science.gov (United States)

    Purves, R. B.; Carnes, James R.; Cutts, Dannie E.

    1987-01-01

    One approach to transforming information stored in relational data bases into knowledge based representations and back again is described. This system, called Foundation, allows knowledge bases to take advantage of vast amounts of pre-existing data. A benefit of this approach is inspection, and even population, of data bases through an intelligent knowledge-based front-end.

  4. Algorithms for Academic Search and Recommendation Systems

    DEFF Research Database (Denmark)

    Amolochitis, Emmanouil

    2014-01-01

    In this work we present novel algorithms for academic search, recommendation and association rules mining. In the first part of the work we introduce a novel hierarchical heuristic scheme for re-ranking academic publications. The scheme is based on the hierarchical combination of a custom...... are part of a developed Movie Recommendation system, the first such system to be commercially deployed in Greece by a major Triple Play services provider. In the third part of the work we present the design of a quantitative association rule mining algorithm. The introduced mining algorithm processes...... a specific number of user histories in order to generate a set of association rules with a minimally required support and confidence value. We have introduced a post processor that uses the generated association rules and improves the quality (in terms of recall) of the original recommendation functionality....
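
    The support/confidence filtering that such a rule miner performs can be illustrated with a minimal pair-rule example (illustrative only, not the thesis implementation; real miners such as Apriori handle itemsets of arbitrary size):

```python
from itertools import combinations
from collections import Counter

def mine_pair_rules(histories, min_support=0.05, min_confidence=0.3):
    """Mine A -> B rules over items appearing in user histories (lists of item ids)."""
    n = len(histories)
    item_count, pair_count = Counter(), Counter()
    for h in histories:
        items = set(h)
        item_count.update(items)
        pair_count.update(combinations(sorted(items), 2))
    rules = []
    for (a, b), c in pair_count.items():
        if c / n < min_support:          # prune pairs below the minimum support
            continue
        for x, y in ((a, b), (b, a)):
            conf = c / item_count[x]     # confidence of rule x -> y
            if conf >= min_confidence:
                rules.append((x, y, c / n, conf))
    return rules
```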

  5. TOGA - A GNSS Reflections Instrument for Remote Sensing Using Beamforming

    Science.gov (United States)

    Esterhuizen, S.; Meehan, T. K.; Robison, D.

    2009-01-01

    Remotely sensing the Earth's surface using GNSS signals as bi-static radar sources is one of the most challenging applications for radiometric instrument design. As part of NASA's Instrument Incubator Program, our group at JPL has built a prototype instrument, TOGA (Time-shifted, Orthometric, GNSS Array), to address a variety of GNSS science needs. Observing GNSS reflections is a major focus of the design/development effort. The TOGA design features a steerable beam antenna array which can form a high-gain antenna pattern in multiple directions simultaneously. Multiple FPGAs provide flexible digital signal processing logic to process both GPS and Galileo reflections. A Linux OS based science processor serves as experiment scheduler and data post-processor. This paper outlines the TOGA design approach as well as preliminary results of reflection data collected from test flights over the Pacific Ocean. These reflection data demonstrate observation of the GPS L1/L2C/L5 signals.

  6. Computational Controls Workstation: Algorithms and hardware

    Science.gov (United States)

    Venugopal, R.; Kumar, M.

    1993-01-01

    The Computational Controls Workstation provides an integrated environment for the modeling, simulation, and analysis of Space Station dynamics and control. Using highly efficient computational algorithms combined with a fast parallel processing architecture, the workstation makes real-time simulation of flexible body models of the Space Station possible. A consistent, user-friendly interface and state-of-the-art post-processing options are combined with powerful analysis tools and model databases to provide users with a complete environment for Space Station dynamics and control analysis. The software tools available include a solid modeler, graphical data entry tool, O(n) algorithm-based multi-flexible body simulation, and 2D/3D post-processors. This paper describes the architecture of the workstation while a companion paper describes performance and user perspectives.

  7. Fast and accurate database searches with MS-GF+Percolator.

    Science.gov (United States)

    Granholm, Viktor; Kim, Sangtae; Navarro, José C F; Sjölund, Erik; Smith, Richard D; Käll, Lukas

    2014-02-07

    One can interpret fragmentation spectra stemming from peptides in mass-spectrometry-based proteomics experiments using so-called database search engines. Frequently, one also runs post-processors such as Percolator to assess the confidence, infer unique peptides, and increase the number of identifications. A recent search engine, MS-GF+, has shown promising results due to a new and efficient scoring algorithm. However, MS-GF+ provides few statistical estimates about the peptide-spectrum matches, hence limiting the biological interpretation. Here, we enabled Percolator processing for MS-GF+ output and observed an increased number of identified peptides for a wide variety of data sets. In addition, Percolator directly reports p values and false discovery rate estimates, such as q values and posterior error probabilities, for peptide-spectrum matches, peptides, and proteins, functions that are useful for the whole proteomics community.
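
    For readers unfamiliar with these statistics, the following is a hedged sketch of the standard target-decoy q-value estimate (not Percolator's semi-supervised algorithm); the scores below are made up:

```python
def target_decoy_qvalues(psms):
    """Estimate q values for PSMs from a target-decoy search.

    `psms` is a list of (score, is_decoy) pairs; higher scores are better.
    This is only an illustrative sketch of the common target-decoy FDR
    estimate, not Percolator's actual procedure.
    """
    ranked = sorted(psms, key=lambda p: p[0], reverse=True)
    targets = decoys = 0
    fdrs = []
    for score, is_decoy in ranked:
        if is_decoy:
            decoys += 1
        else:
            targets += 1
        fdrs.append(decoys / max(targets, 1))
    # q value: minimum FDR at which the PSM would still be accepted
    qvals = fdrs[:]
    for i in range(len(qvals) - 2, -1, -1):
        qvals[i] = min(qvals[i], qvals[i + 1])
    return [(score, q) for (score, _), q in zip(ranked, qvals)]

psms = [(42.0, False), (40.5, False), (39.9, True), (38.0, False), (30.1, True)]
for score, q in target_decoy_qvalues(psms):
    print(f"score={score:5.1f}  q={q:.2f}")
```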
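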

  8. TOGA - A GNSS Reflections Instrument for Remote Sensing Using Beamforming

    Science.gov (United States)

    Esterhuizen, S.; Meehan, T. K.; Robison, D.

    2009-01-01

    Remotely sensing the Earth's surface using GNSS signals as bi-static radar sources is one of the most challenging applications for radiometric instrument design. As part of NASA's Instrument Incubator Program, our group at JPL has built a prototype instrument, TOGA (Time-shifted, Orthometric, GNSS Array), to address a variety of GNSS science needs. Observing GNSS reflections is a major focus of the design/development effort. The TOGA design features a steerable beam antenna array which can form a high-gain antenna pattern in multiple directions simultaneously. Multiple FPGAs provide flexible digital signal processing logic to process both GPS and Galileo reflections. A Linux OS based science processor serves as experiment scheduler and data post-processor. This paper outlines the TOGA design approach as well as preliminary results of reflection data collected from test flights over the Pacific Ocean. These reflection data demonstrate observation of the GPS L1/L2C/L5 signals.

  9. Plenoptic Imager for Automated Surface Navigation

    Science.gov (United States)

    Zollar, Byron; Milder, Andrew; Milder, Andrew; Mayo, Michael

    2010-01-01

    An electro-optical imaging device is capable of autonomously determining the range to objects in a scene without the use of active emitters or multiple apertures. The novel, automated, low-power imaging system is based on a plenoptic camera design that was constructed as a breadboard system. Nanohmics proved feasibility of the concept by designing an optical system for a prototype plenoptic camera, developing simulated plenoptic images and range-calculation algorithms, constructing a breadboard prototype plenoptic camera, and processing images (including range calculations) from the prototype system. The breadboard demonstration included an optical subsystem comprised of a main aperture lens, a mechanical structure that holds an array of micro lenses at the focal distance from the main lens, and a structure that mates a CMOS imaging sensor at the correct distance from the micro lenses. The demonstrator also featured embedded electronics for camera readout, and a post-processor executing image-processing algorithms to provide ranging information.

  10. Radiation and scattering analysis of piezoelectric transducers using finite and infinite wave envelope elements

    Science.gov (United States)

    Kim, Jaehwan; Jung, Eunmi; Choi, Seung-Bok

    2002-07-01

    This paper presents a numerical modeling technique for piezoelectric transducers that takes into account wave radiation and scattering. It is based on finite element modeling. Coupling problems between piezoelectric and elastic materials as well as fluid and structure systems, associated with the modeling of piezoelectric underwater acoustic sensors, are formulated. In the finite element modeling of the unbounded acoustic fluid, IWEE (Infinite Wave Envelope Element) is adopted to take into account the infinite domain. The IWEE code is added to an in-house finite element program, and commercial pre- and post-processors are used for mesh generation and for viewing the output. The numerical modeling is validated through an example, and scattering and radiation analysis of a Tonpilz transducer is performed. The scattered wave on the sensor is calculated, and the sensor response, the so-called RVS (Receiving Voltage Sensitivity), is predicted.

  11. TOOKUIL: A case study in user interface development for safety code application

    Energy Technology Data Exchange (ETDEWEB)

    Gray, D.L.; Harkins, C.K.; Hoole, J.G. [and others]

    1997-07-01

    Traditionally, there has been a very high learning curve associated with using nuclear power plant (NPP) analysis codes. Even for seasoned plant analysts and engineers, the process of building or modifying an input model for present-day NPP analysis codes is tedious, error-prone, and time consuming. Current cost constraints and performance demands place an additional burden on today's safety analysis community. Advances in graphical user interface (GUI) technology have been applied to obtain significant productivity and quality assurance improvements for the Transient Reactor Analysis Code (TRAC) input model development. KAPL Inc. has developed an X Windows-based graphical user interface named TOOKUIL which supports the design and analysis process, acting as a preprocessor, runtime editor, help system, and post-processor for TRAC. This paper summarizes the objectives of the project, the GUI development process and experiences, and the resulting end product, TOOKUIL.

  12. A non-parametric 2D deformable template classifier

    DEFF Research Database (Denmark)

    Schultz, Nette; Nielsen, Allan Aasbjerg; Conradsen, Knut;

    2005-01-01

    We introduce an interactive segmentation method for a sea floor survey. The method is based on a deformable template classifier and is developed to segment data from an echo sounder post-processor called RoxAnn. RoxAnn collects two different measures for each observation point, and in this 2D...... feature space the ship-master will be able to interactively define a segmentation map, which is refined and optimized by the deformable template algorithms. The deformable templates are defined as two-dimensional vector-cycles. Local random transformations are applied to the vector-cycles, and stochastic...... relaxation in a Bayesian scheme is used. In the Bayesian likelihood a class density function and its estimate hereof is introduced, which is designed to separate the feature space. The method is verified on data collected in Øresund, Scandinavia. The data come from four geographically different areas. Two...

  13. Ascent Heating Thermal Analysis on Spacecraft Adaptor Fairings

    Science.gov (United States)

    Wang, Xiao Yen; Yuko, James; Motil, Brian

    2011-01-01

    When the Crew Exploration Vehicle (CEV) is launched, the spacecraft adaptor (SA) fairings that cover the CEV service module (SM) are exposed to aero heating. Thermal analysis is performed to compute the fairing temperatures and to investigate whether the temperatures are within the material limits for the nominal ascent aeroheating case. The ascent heating is analyzed by using computational fluid dynamics (CFD) and engineering codes at Marshall Space Flight Center. The aeroheating environment data used for this work is known as Thermal Environment 3 (TE3) heating data. One of the major concerns is with the SA fairings covering the CEV SM and the SM/crew launch vehicle (CLV) flange interface. The TE3 heating rate is a function of time, wall temperature, and spatial location, so implementing it as a boundary condition in the thermal analysis becomes challenging. The ascent heating thermal analysis on the SA fairings and the SM/CLV flange interface is performed using two commercial software packages: Cullimore & Ring (C&R) Thermal Desktop (TD) 5.1 and MSC Patran 2007r1b. TD is the pre- and post-processor for SINDA, which is a finite-difference-based solver. In TD, the geometry is built and meshed, the boundary conditions are defined, and then SINDA is used to compute temperatures. MSC Pthermal is a finite-element-based thermal solver, and MSC Patran is the pre- and post-processor for Pthermal. Regarding boundary conditions, convection, contact resistance, and heat loads can be imposed in different ways in both programs. These two software packages are used to build thermal models for the same analysis, to validate each other and to show the differences in the modeling details.

  14. INVOLUTIVE BASES UNDER COMPOSITION

    Institute of Scientific and Technical Information of China (English)

    Zailiang TANG

    2007-01-01

    In this paper, the behavior of involutive bases under the composition operation is studied. For two kinds of involutive bases, namely Pommaret bases and Janet bases, we study their behavior under composition. Some further problems are also proposed.

  15. User guide to SUNDT. A simulation tool for ultrasonic NDT

    Energy Technology Data Exchange (ETDEWEB)

    Wirdelius, H. [SAQ Kontroll AB, Moelndal (Sweden)

    2000-08-01

    Mathematical modelling of the ultrasonic NDT situation has become an emerging discipline, with broadening industrial interest in the recent decade. New and stronger demands on the reliability of procedures and methods applied in e.g. the nuclear and pressure vessel industries have enforced this fact. To qualify the procedures, extensive experimental work on test blocks is normally required. A thoroughly validated model can be an alternative and a complement to the experimental work, reducing the extensive cost associated with that procedure. The present report describes the SUNDT software (Simulation tool for Ultrasonic NDT). Besides being a user guide to the software, it also pinpoints its modelling capabilities and restrictions. The SUNDT software is a Windows-based pre- and post-processor using the UTDefect model as its mathematical kernel. The software simulates the whole testing procedure with contact probes (of arbitrary type, angle and size) acting in pulse-echo or tandem inspection situations and includes a large number of defect models. The simulated test piece is at present restricted to a homogeneous and isotropic material, and the model does not include attenuation due to absorption (viscous effects) or grain boundary scattering. The report also incorporates a short account of previous validations and verifications against experimental investigations and comparisons with other existing simulation software. The major part of the report presents and visualises the various options within the pre- and post-processor. In order to exemplify its capability, a specific simulation is followed from setting the parameters, through running the kernel, to visualisation of the result.

  16. Identifying functional transcription factor binding sites in yeast by considering their positional preference in the promoters.

    Directory of Open Access Journals (Sweden)

    Fu-Jou Lai

    Full Text Available Transcription factor binding site (TFBS identification plays an important role in deciphering gene regulatory codes. With comprehensive knowledge of TFBSs, one can understand molecular mechanisms of gene regulation. In the recent decades, various computational approaches have been proposed to predict TFBSs in the genome. The TFBS dataset of a TF generated by each algorithm is a ranked list of predicted TFBSs of that TF, where top ranked TFBSs are statistically significant ones. However, whether these statistically significant TFBSs are functional (i.e. biologically relevant is still unknown. Here we develop a post-processor, called the functional propensity calculator (FPC, to assign a functional propensity to each TFBS in the existing computationally predicted TFBS datasets. It is known that functional TFBSs reveal strong positional preference towards the transcriptional start site (TSS. This motivates us to take TFBS position relative to the TSS as the key idea in building our FPC. Based on our calculated functional propensities, the TFBSs of a TF in the original TFBS dataset could be reordered, where top ranked TFBSs are now the ones with high functional propensities. To validate the biological significance of our results, we perform three published statistical tests to assess the enrichment of Gene Ontology (GO terms, the enrichment of physical protein-protein interactions, and the tendency of being co-expressed. The top ranked TFBSs in our reordered TFBS dataset outperform the top ranked TFBSs in the original TFBS dataset, justifying the effectiveness of our post-processor in extracting functional TFBSs from the original TFBS dataset. More importantly, assigning functional propensities to putative TFBSs enables biologists to easily identify which TFBSs in the promoter of interest are likely to be biologically relevant and are good candidates to do further detailed experimental investigation. The FPC is implemented as a web tool at http://santiago.ee.ncku.edu.tw/FPC/.

  17. Automatic provisioning of end-to-end QoS into the home

    DEFF Research Database (Denmark)

    Brewka, Lukasz Jerzy; Skoldström, Pontus; Nelis, Jelle;

    2011-01-01

    Due to a growing number of high bandwidth applications today (such as HDTV), and an increasing amount of network and cloud based applications, service providers need to pay attention to QoS in their networks. We believe there is a need for an end-to-end approach reaching into the home as well....... The Home Gateway (HG) as a key component of the home network is crucial for enabling the end-to-end solutions. UPnP-QoS has been proposed as an inhome solution for resource reservations. In this paper we assess a solution for automatic QoS reservations, on behalf of non-UPnP-QoS aware applications....... Additionally we focus on an integrated end-to-end solution, combining GMPLS-based reservations in e.g., access/metro and UPnP-QoS based reservation in the home network....

  18. Some Aspects of Analysis of a Micromirror

    Directory of Open Access Journals (Sweden)

    Dipti Razdan

    2015-06-01

    Full Text Available A micromirror is a very small mirror based on the principles of Micro Electro Mechanical Systems (MEMS). Micromirror applications in areas like laser scanning displays, DLP projection systems and HDTV are realized using MEMS technology. In this study, an electrostatically controlled micromirror is designed using the COMSOL multiphysics software. The structural and mechanical properties of the actuation mechanism are studied for various shapes of the micromirror. The base materials used include copper, silicon and aluminum. To make the cantilever more efficient, structural steel is introduced along with the base materials listed above so as to obtain the displacement of the mirror. In order to evaluate the mirror further, analytical formulations for capacitance and torque are developed and compared to the calculated theoretical values. The final results are shown as a range of wavelengths, obtained by taking into consideration the tilting angles achieved with the various materials used for designing the mirrors.
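
    As a hedged back-of-the-envelope companion to the analytical formulations mentioned above (a lumped parallel-plate sketch, not the COMSOL model; all dimensions and the lever-arm approximation are assumptions of this example):

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def parallel_plate_capacitance(area, gap):
    """Parallel-plate estimate C = eps0 * A / d (fringing fields ignored)."""
    return EPS0 * area / gap

def electrostatic_torque(voltage, area, gap, lever_arm):
    """Rough electrostatic torque estimate for a tilting plate.

    Uses the parallel-plate force F = eps0*A*V^2 / (2*d^2) applied at an
    effective lever arm; this lumped geometry is an assumption of the
    sketch, not the geometry analysed in the cited study.
    """
    force = EPS0 * area * voltage ** 2 / (2.0 * gap ** 2)
    return force * lever_arm

area = (100e-6) ** 2      # 100 um x 100 um plate (assumed)
gap = 2e-6                # 2 um electrode gap (assumed)
print(f"C = {parallel_plate_capacitance(area, gap) * 1e15:.2f} fF")
print(f"T = {electrostatic_torque(10.0, area, gap, 50e-6):.3e} N*m")
```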

  19. A dynamic knowledge base based search engine

    Institute of Scientific and Technical Information of China (English)

    WANG Hui-jin; HU Hua; LI Qing

    2005-01-01

    Search engines have greatly helped us to find the desired information from the Internet. Most search engines use a keyword-matching technique. This paper discusses a Dynamic Knowledge Base based Search Engine (DKBSE), which can expand the user's query using the keywords' concepts or meaning. To do this, the DKBSE needs to construct and maintain the knowledge base dynamically, via the system's search results and the user's feedback information. The DKBSE expands the user's initial query using the knowledge base, and returns the retrieved information for the expanded query.
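
    A toy sketch of the query-expansion idea (not the DKBSE implementation; the knowledge-base entries and function names are made up for illustration):

```python
# A toy dynamic knowledge base mapping keywords to related concepts; in the
# DKBSE such entries would be built and refined from search results and user
# feedback, which this sketch only hints at with `add_feedback`.
knowledge_base = {
    "hdtv": {"high definition television", "1080i", "720p"},
    "codec": {"video compression", "encoder", "decoder"},
}

def expand_query(query, kb):
    """Expand the user's keywords with related concepts from the knowledge base."""
    terms = set(query.lower().split())
    for term in list(terms):
        terms |= kb.get(term, set())
    return terms

def add_feedback(kb, keyword, related_term):
    """Update the knowledge base from user feedback (highly simplified)."""
    kb.setdefault(keyword, set()).add(related_term)

print(expand_query("HDTV codec", knowledge_base))
add_feedback(knowledge_base, "hdtv", "mpeg-2 transport stream")
print(expand_query("HDTV codec", knowledge_base))
```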

  20. Analysis and implementation of the Large Scale Video-on-Demand System

    CERN Document Server

    Kanrar, Soumen

    2012-01-01

    Next Generation Network (NGN) provides multimedia services over broadband based networks, which support high definition TV (HDTV) and DVD-quality video-on-demand content. The video services are thus seen as merging mainly three areas: computing, communication, and broadcasting. They have numerous advantages, but more exploration of the large-scale deployment of video-on-demand systems is still needed because of economic and design constraints: full service provision needs significant initial investments. This paper presents different estimations for different topologies, since a VOD system network requires efficient planning. The methodology investigates the network bandwidth requirements of a VOD system based on centralized servers and distributed local proxies. Network traffic models are developed to evaluate the VOD system's operational bandwidth requirements for these two network architectures. This paper presents an efficient estimation of the bandwidth requirement for ...
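
    A hedged sketch of the kind of bandwidth estimate involved (illustrative numbers only, not the traffic models developed in the paper):

```python
def centralized_bandwidth(users, stream_mbps):
    """Backbone bandwidth if every active stream is served from a central site."""
    return users * stream_mbps

def proxy_bandwidth(users, stream_mbps, hit_ratio):
    """Backbone bandwidth when local proxies absorb a fraction of requests.

    `hit_ratio` is the fraction of streams served from proxy caches; only
    misses traverse the backbone. All numbers here are illustrative and not
    taken from the cited paper.
    """
    return users * stream_mbps * (1.0 - hit_ratio)

users, rate = 10_000, 8.0   # 10k concurrent viewers, 8 Mb/s HDTV streams
print(f"centralized : {centralized_bandwidth(users, rate) / 1000:.1f} Gb/s")
print(f"with proxies: {proxy_bandwidth(users, rate, hit_ratio=0.7) / 1000:.1f} Gb/s")
```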

  1. Sonar In-Situ Mode Assessment System (SIMAS) AN/UYQ-25 Data Processing System Software Life Cycle Support Plan.

    Science.gov (United States)

    1980-02-01

    conjunction with SSSMPG and is to be used on an interim basis until implementation of the POST PROCESSOR with the MTASS System Generator. 8. System ... in hard copy output of AN/UYQ-25. 12. Post Processor ( ) - converts output of MTASS system generator to format of input to loaders; written in Fortran ... Generator (SSSGEN) - produces a bootable system tape from (1) a library of modules which have been processed through OISPR, and in conjunction with (2) a

  2. Modular Air Defense Effectiveness Model, Program Documentation and User’s Guide. Volume II. MADEM Programmer Manual.

    Science.gov (United States)

    1980-01-31

    4) Common block dump 5) Other data, according to parameters, which may include: * subroutine trace messages * ISPACE dump * data structure dumps ... post processor 3. Event trace listing (printed) 4. Common block dump (printed) 5. DEBUG messages (if chosen) 6. ISPACE 'DUMP' (if chosen) 250 WBOMBSM ... TAPE11, TAPE12, TAPE13 2. TAPE4 - for the post processor 3. Event trace listing (printed) 4. Common block dump (printed) 5. DEBUG messages (if chosen) 6

  3. Solid Base Catalysis

    CERN Document Server

    Ono, Yoshio

    2011-01-01

    The importance of solid base catalysts has come to be recognized for their environmentally benign qualities, and much significant progress has been made over the past two decades in catalytic materials and solid base-catalyzed reactions. The book is focused on solid bases. Because of their advantages over liquid bases, the use of solid base catalysts in organic synthesis is expanding. Solid bases are easier to dispose of than liquid bases, separation and recovery of products, catalysts and solvents are less difficult, and they are non-corrosive. Furthermore, base-catalyzed reactions can be performed without using solvents and even in the gas phase, opening up more possibilities for discovering novel reaction systems. Using numerous examples, the present volume describes the remarkable role solid base catalysis can play, given the ever increasing worldwide importance of "green" chemistry. The reader will obtain an overall view of solid base catalysis and gain insight into the versatility of the reactions to whic...

  4. Knowledge Based Strategies for Knowledge Based Organizations

    National Research Council Canada - National Science Library

    Madalina Cristina Tocan

    2012-01-01

    At present, we can observe that a new economy is arising. It is an economy based on knowledge and ideas, in which the key factor for prosperity and for the creation of new jobs is knowledge capitalization...

  5. Knowledge Based Strategies for Knowledge Based Organizations

    OpenAIRE

    2012-01-01

    At present, we can observe that a new economy is arising. It is an economy based on knowledge and ideas, in which the key factor for prosperity and for the creation of new jobs is knowledge capitalization. Knowledge capitalization, intellectual capital and obtaining prosperity in the market economy impose a new terminology, new managerial methods and techniques, new technologies and also new strategies. In other words, the knowledge-based economy, as a new type of economy, imposes a new type...

  6. Arenas of development

    DEFF Research Database (Denmark)

    Sørensen, Ole Henning; Jørgensen, Ulrik

    1999-01-01

    In this paper the notion 'development arena' is presented and discussed. The notion is suggested to function as a cognitive space for research. It seeks to catch and describe the relational, unstable and heterogeneous character of the development process. It should sensitize both researchers...... and managers towards processes of technology development that are poorly accounted for in economic and management theory. Thereby, we wish to contribute to critical discussions about the role of management and the directions it chooses for development of technologies and products. The elements...... and transformations involved in shaping and restructuring activities in a development arena are described and discussed based on a case study of the development of HDTV as the next generation television. It exemplifies different moments and aspects of technology development. A number of configurations of specific...

  7. Video Format Conversion Chip Design

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper introduces the design of an IC which is capable of cross-converting between various DTV standards, up to HDTV resolution. A multi-phase FIR-based filtering algorithm is developed to perform the video scaling tasks. A dedicated fast SDRAM interface is designed into the system, providing economical high-density storage for the frame buffer. Meanwhile, film-material pre-processing and frame/field rate up-conversion are also implemented in the memory control block. Finally, all the programmable parameters, such as the filter properties, can be set dynamically at run-time through an I2C interface, making the IC a very flexible system. This design has been verified through an FPGA emulation system. Subjective tests of the output images indicate that the IC is a suitable and high-quality solution for consumer applications.
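
    As a hedged illustration of multi-phase FIR scaling (a 1D sketch; the Catmull-Rom kernel, the 4-tap length and the 64 phases are assumptions of this example, not the chip's actual filter coefficients):

```python
import numpy as np

def catmull_rom_taps(frac):
    """4-tap Catmull-Rom interpolation kernel for a fractional offset `frac`."""
    t = frac
    return np.array([
        -0.5 * t**3 + t**2 - 0.5 * t,
         1.5 * t**3 - 2.5 * t**2 + 1.0,
        -1.5 * t**3 + 2.0 * t**2 + 0.5 * t,
         0.5 * t**3 - 0.5 * t**2,
    ])

def scale_line(line, out_len, num_phases=64):
    """Scale one video line with a bank of precomputed filter phases.

    Sketch of multi-phase FIR scaling: the fractional source position
    selects one of `num_phases` precomputed 4-tap filters.
    """
    phases = [catmull_rom_taps(p / num_phases) for p in range(num_phases)]
    in_len = len(line)
    padded = np.pad(np.asarray(line, dtype=float), (1, 2), mode="edge")
    out = np.empty(out_len)
    for n in range(out_len):
        x = n * (in_len - 1) / (out_len - 1)      # source coordinate
        i, frac = int(np.floor(x)), x - np.floor(x)
        taps = phases[int(frac * num_phases)]
        out[n] = taps @ padded[i : i + 4]          # pixels i-1 .. i+2 after padding
    return out

line = np.arange(8, dtype=float)                   # a simple ramp
print(scale_line(line, 15))                        # up-scaled ramp (nearly linear)
```

    A hardware implementation would typically store fixed-point versions of these phase coefficients in a small ROM rather than computing them at run time.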

  8. Improved FFSBM Algorithm and Its VLSI Architecture for AVS Video Standard

    Institute of Scientific and Technical Information of China (English)

    Li Zhang; Don Xie; Di Wu

    2006-01-01

    The video part of AVS (Audio Video Coding Standard) has been finalized recently. It has adopted variable block size motion compensation to improve its coding efficiency. This brings a heavy computation burden when it is applied to compress HDTV (high definition television) content. Based on the original FFSBM (fast full search block matching), this paper proposes an improved FFSBM algorithm to adaptively reduce the complexity of motion estimation according to the actual motion intensity. The main idea of the proposed algorithm is to use the statistical distribution of the MVD (motion vector difference). A VLSI (very large scale integration) architecture is also proposed to implement the improved motion estimation algorithm. Experimental results show that this algorithm-hardware co-design gives a better tradeoff of gate count and throughput than existing ones and is a suitable solution for variable block size motion estimation in AVS.
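
    For reference, a plain full-search SAD block-matching sketch (the baseline that FFSBM-style algorithms accelerate; block size, search range and test data are illustrative choices, not the AVS implementation):

```python
import numpy as np

def full_search_block_match(cur_block, ref, top, left, search_range=8):
    """Exhaustive SAD block matching inside a +/- search_range window."""
    bh, bw = cur_block.shape
    best_sad, best_mv = None, (0, 0)
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + bh > ref.shape[0] or x + bw > ref.shape[1]:
                continue  # candidate block falls outside the reference frame
            sad = np.abs(cur_block.astype(int) - ref[y:y + bh, x:x + bw].astype(int)).sum()
            if best_sad is None or sad < best_sad:
                best_sad, best_mv = sad, (dy, dx)
    return best_mv, best_sad

ref = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
cur_block = ref[20:36, 24:40]                  # a block displaced by (-4, +8)
print(full_search_block_match(cur_block, ref, top=24, left=16))
```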

  9. Teknologi Televisi Digital

    Directory of Open Access Journals (Sweden)

    Nick

    2011-03-01

    Full Text Available For more than fifty years, television has been the most common medium for people to get the latest information from around the world. The development of television has advanced rapidly, even bringing the fun of watching movies into the home. With HDTV technology, consumers can now enjoy much more detailed images with their families. This article is based on a literature study of printed and electronic references. A higher TV resolution does not automatically mean that the picture will look better. The HD format has 720 lines, while full HD has 1080 lines. A low-resolution TV may still deliver good picture quality, but not all details can be enjoyed as they can in the HD or full HD formats.

  10. Evaluation of packet loss impairment on streaming video

    Institute of Scientific and Technical Information of China (English)

    RUI Hua-xia; LI Chong-rong; QIU Sheng-ke

    2006-01-01

    Video compression technologies are essential in video streaming applications because they save a great amount of network resources. However, compressed videos are also extremely sensitive to packet loss, which is inevitable in today's best-effort IP networks. Therefore, we think accurate evaluation of packet loss impairment on compressed video is very important. In this work, we develop an analytic model to describe these impairments without reference to the original video (NR), and propose an impairment metric based on the model which takes into account both impairment length and impairment strength. To evaluate an impaired frame or video, we design a detection and evaluation algorithm (DE algorithm) to compute the above metric value. The DE algorithm has low computational complexity and is currently being implemented in the real-time monitoring module of our HDTV over IP system. The impairment metric and DE algorithm could also be used in adaptive systems or to compare different error concealment strategies.
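
    A hedged sketch of how impairment length and strength might be combined into a single score (an illustrative weighted-product form only, not the metric defined in this paper):

```python
def impairment_score(frame_impairments, alpha=0.6):
    """Combine impairment length and strength into one score.

    `frame_impairments` is a list of (length_in_frames, mean_strength)
    tuples for impaired regions of a video. The product form and `alpha`
    are assumptions of this sketch, not the authors' model.
    """
    score = 0.0
    for length, strength in frame_impairments:
        score += (length ** alpha) * strength
    return score

# two short weak impairments vs. one long strong impairment
print(impairment_score([(2, 0.1), (3, 0.2)]))
print(impairment_score([(30, 0.8)]))
```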

  11. VectorBase

    Data.gov (United States)

    U.S. Department of Health & Human Services — VectorBase is a Bioinformatics Resource Center for invertebrate vectors. It is one of four Bioinformatics Resource Centers funded by NIAID to provide web-based...

  12. Mobile Inquiry Based Learning

    NARCIS (Netherlands)

    Specht, Marcus

    2012-01-01

    Specht, M. (2012, 8 November). Mobile Inquiry Based Learning. Presentation given at the Workshop "Mobile inquiry-based learning" at the Mobile Learning Day 2012 at the Fernuniversität Hagen, Hagen, Germany.

  13. Mobile Inquiry Based Learning

    NARCIS (Netherlands)

    Specht, Marcus

    2012-01-01

    Specht, M. (2012, 8 November). Mobile Inquiry Based Learning. Presentation given at the Workshop "Mobile inquiry-based learning" at the Mobile Learning Day 2012 at the Fernuniversität Hagen, Hagen, Germany.

  14. "Education-based Research"

    DEFF Research Database (Denmark)

    Degn Johansson, Troels

    This paper lays out a concept of education-based research-the production of research knowledge within the framework of tertiary design education-as an integration of problem-based learning and research-based education. This leads to a critique of reflective practice as the primary way to facilitate...... learning at this level, a discussion of the nature of design problems in the instrumentalist tradition, and some suggestions as to how design studies curricula may facilitate education-based research....

  15. Convergent Filter Bases

    OpenAIRE

    Coghetto Roland

    2015-01-01

    We are inspired by the work of Henri Cartan [16], Bourbaki [10] (TG. I Filtres) and Claude Wagschal [34]. We define the base of filter, image filter, convergent filter bases, limit filter and the filter base of tails (fr: filtre des sections).

  16. Stolen Base Physics

    Science.gov (United States)

    Kagan, David

    2013-01-01

    Few plays in baseball are as consistently close and exciting as the stolen base. While there are several studies of sprinting, the art of base stealing is much more nuanced. This article describes the motion of the base-stealing runner using a very basic kinematic model. The model will be compared to some data from a Major League game. The…

  17. Convergent Filter Bases

    Directory of Open Access Journals (Sweden)

    Coghetto Roland

    2015-09-01

    Full Text Available We are inspired by the work of Henri Cartan [16], Bourbaki [10] (TG. I Filtres) and Claude Wagschal [34]. We define the base of filter, image filter, convergent filter bases, limit filter and the filter base of tails (fr: filtre des sections).

  18. ANS Based Submarine Simulation

    Science.gov (United States)

    1994-08-01

    computer based simulation program supplied by Dr. John Ware at Computer Sciences Corporation (CSC). There are two reasons to use simulated data instead ... ANS (Artificial Neural System) capable of modeling submarine performance based on full scale data generated using a computer based simulation program ... The Optimized Entropy algorithm enables the solutions to difficult problems on a desktop computer within an acceptable time frame. Objective for w

  19. Pattern Based Morphometry

    OpenAIRE

    Gaonkar, Bilwaj; Pohl, Kilian; Davatzikos, Christos

    2011-01-01

    Voxel based morphometry (VBM) is widely used in the neuroimaging community to infer group differences in brain morphology. VBM is effective in quantifying group differences highly localized in space. However it is not equally effective when group differences might be based on interactions between multiple brain networks. We address this by proposing a new framework called pattern based morphometry (PBM). PBM is a data driven technique. It uses a dictionary learning algorithm to extract global...

  20. Case-based reasoning

    CERN Document Server

    Kolodner, Janet

    1993-01-01

    Case-based reasoning is one of the fastest growing areas in the field of knowledge-based systems and this book, authored by a leader in the field, is the first comprehensive text on the subject. Case-based reasoning systems are systems that store information about situations in their memory. As new problems arise, similar situations are searched out to help solve these problems. Problems are understood and inferences are made by finding the closest cases in memory, comparing and contrasting the problem with those cases, making inferences based on those comparisons, and asking questions whe

  1. Imagery Data Base Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Imagery Data Base Facility supports AFRL and other government organizations by providing imagery interpretation and analysis to users for data selection, imagery...

  2. Soy-based polyols

    Science.gov (United States)

    Suppes, Galen; Lozada, Zueica; Lubguban, Arnold

    2013-06-25

    The invention provides processes for preparing soy-based oligomeric polyols or substituted oligomeric polyols, as well as urethane bioelasteromers comprising the oligomeric polyols or substituted oligomeric polyols.

  3. Gasification-based biomass

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2009-01-18

    The gasification-based biomass section of the Renewable Energy Technology Characterizations describes the technical and economic status of this emerging renewable energy option for electricity supply.

  4. Synthetic Base Fluids

    Science.gov (United States)

    Brown, M.; Fotheringham, J. D.; Hoyes, T. J.; Mortier, R. M.; Orszulik, S. T.; Randles, S. J.; Stroud, P. M.

    The chemical nature and technology of the main synthetic lubricant base fluids are described, covering polyalphaolefins, alkylated aromatics, gas-to-liquid (GTL) base fluids, polybutenes, aliphatic diesters, polyolesters, polyalkylene glycols or PAGs, and phosphate esters. Other synthetic lubricant base oils, such as the silicones, borate esters, perfluoroethers and polyphenylene ethers, are considered to have restricted applications due to either high cost or performance limitations and are not considered here. Each of the main synthetic base fluids is described in terms of its chemical and physical properties, manufacture and production, its chemistry, key properties, applications, and its implications when used in the environment.

  5. Monitoring Knowledge Base (MKB)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Monitoring Knowledge Base (MKB) is a compilation of emissions measurement and monitoring techniques associated with air pollution control devices, industrial...

  6. Empirically Based, Agent-based models

    Directory of Open Access Journals (Sweden)

    Elinor Ostrom

    2006-12-01

    Full Text Available There is an increasing drive to combine agent-based models with empirical methods. An overview is provided of the various empirical methods that are used for different kinds of questions. Four categories of empirical approaches are identified in which agent-based models have been empirically tested: case studies, stylized facts, role-playing games, and laboratory experiments. We discuss how these different types of empirical studies can be combined. The various ways empirical techniques are used illustrate the main challenges of contemporary social sciences: (1) how to develop models that are generalizable and still applicable in specific cases, and (2) how to scale up the processes of interactions of a few agents to interactions among many agents.

  7. Research and Implementation of Multicast Technology in HFC%HFC网络组播技术研究与实现

    Institute of Scientific and Technical Information of China (English)

    王沁; 袁玲玲; 张燕; 许娜; 李翀

    2009-01-01

    A network access device (CM) based on DOCSIS 1.1 must suppress multicast traffic. The CM controls the forwarding of multicast by administratively setting parameters for the policy-filter service and by a specific multicast tracking algorithm; the latter includes a passive IGMP mode and an active IGMP mode. According to the demands of the passive IGMP mode and the characteristic that the CM has a fixed host port and a fixed router port, a Data-Over-Cable IGMP Snooping protocol was designed and implemented. The protocol works at the MAC layer, can snoop IGMP packets received by the CM, maintains a list of multicast groups and can filter multicast data. The new protocol was tested on the bidirectional system platform for HDTV (High Definition Television), which consists of a self-developed PHY and MAC chip.%A network access device (CM) conforming to the DOCSIS 1.1 specification needs to suppress multicast. Such a CM uses two mechanisms to control multicast forwarding, namely setting policy-filter service parameters and a dedicated multicast tracking algorithm, the latter having passive IGMP and active IGMP modes. Based on the requirements of passive IGMP and the characteristic that the CM has a fixed host port and a fixed router port, a cable-network IGMP Snooping protocol was designed and implemented. The protocol runs at the MAC layer and implements snooping of the IGMP packets received by the CM, maintenance of the multicast group list, and filtering of multicast data. The design was tested on an HDTV (high-definition television) bidirectional system platform composed of independently developed physical layer and MAC layer chips.
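
    A hedged sketch of the snooping idea at the packet level (parsing an IGMPv2 header and updating a group table); the table layout, port handling and example data are assumptions of this illustration, not the protocol implementation described above:

```python
import struct

MEMBERSHIP_REPORT_V2 = 0x16
LEAVE_GROUP = 0x17

def handle_igmp(packet, host_port, group_table):
    """Update a multicast group table from one IGMPv2 message.

    Parses type and group address from the 8-byte IGMPv2 header and records
    which host port joined or left the group (checksum checks omitted).
    """
    msg_type, _max_resp, _checksum, group = struct.unpack("!BBH4s", packet[:8])
    group_addr = ".".join(str(b) for b in group)
    if msg_type == MEMBERSHIP_REPORT_V2:
        group_table.setdefault(group_addr, set()).add(host_port)
    elif msg_type == LEAVE_GROUP:
        group_table.get(group_addr, set()).discard(host_port)
    return group_table

table = {}
report = struct.pack("!BBH4s", MEMBERSHIP_REPORT_V2, 0, 0, bytes([239, 1, 1, 1]))
print(handle_igmp(report, host_port=1, group_table=table))  # {'239.1.1.1': {1}}
```

    A production implementation would additionally verify the IGMP checksum, age out group entries, and handle IGMPv3 membership reports.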

  8. Secure base stations

    NARCIS (Netherlands)

    Bosch, Peter; Brusilovsky, Alec; McLellan, Rae; Mullender, Sape; Polakos, Paul

    2009-01-01

    With the introduction of the third generation (3G) Universal Mobile Telecommunications System (UMTS) base station router (BSR) and fourth generation (4G) base stations, such as the 3rd Generation Partnership Project (3GPP) Long Term Evolution (LTE) Evolved Node B (eNB), it has become important to se

  9. Diffusion Based Photon Mapping

    DEFF Research Database (Denmark)

    Schjøth, Lars; Olsen, Ole Fogh; Sporring, Jon

    2006-01-01

    . To address this problem we introduce a novel photon mapping algorithm based on nonlinear anisotropic diffusion. Our algorithm adapts according to the structure of the photon map such that smoothing occurs along edges and structures and not across. In this way we preserve the important illumination features......, while eliminating noise. We call our method diffusion based photon mapping....

  10. Diffusion Based Photon Mapping

    DEFF Research Database (Denmark)

    Schjøth, Lars; Fogh Olsen, Ole; Sporring, Jon

    2007-01-01

    . To address this problem we introduce a novel photon mapping algorithm based on nonlinear anisotropic diffusion. Our algorithm adapts according to the structure of the photon map such that smoothing occurs along edges and structures and not across. In this way we preserve the important illumination features......, while eliminating noise. We call our method diffusion based photon mapping....

  11. Content-Based Instruction

    Science.gov (United States)

    DelliCarpini, M.; Alonso, O.

    2013-01-01

    DelliCarpini and Alonso's book "Content-Based Instruction" explores different approaches to teaching content-based instruction (CBI) in the English language classroom. They provide a comprehensive overview of how to teach CBI in an easy-to-follow guide that language teachers will find very practical for their own contexts. Topics…

  12. Arizona Geophysical Data Base

    OpenAIRE

    McLeod, Ronald G.

    1981-01-01

    A series of digital data sets were compiled for input into a geophysical data base for a one degree quadrangle in Arizona. Using a Landsat digital mosaic as a base, information on topography, geology, gravity as well as Seasat radar imagery were registered. Example overlays and tabulations are performed.

  13. Revolutionary Base Spurs Development

    Institute of Scientific and Technical Information of China (English)

    1997-01-01

    MAO Zedong’s Autumn Harvest Uprising spurred not only revolution but development and innovation among the masses. In October 1927, Mao Zedong led troops to Jinggang Mountain, establishing the first revolutionary base. During the 1960s, many young people went to work at Jinggang Mountain and devoted their youth to this revolutionary base. The open-minded and shrewd Tong

  14. Model-based geostatistics

    CERN Document Server

    Diggle, Peter J

    2007-01-01

    Model-based geostatistics refers to the application of general statistical principles of modeling and inference to geostatistical problems. This volume provides a treatment of model-based geostatistics with an emphasis on statistical methods and applications. It also features analyses of datasets from a range of scientific contexts.

  15. Content-Based Instruction

    Science.gov (United States)

    DelliCarpini, M.; Alonso, O.

    2013-01-01

    DelliCarpini and Alonso's book "Content-Based Instruction" explores different approaches to teaching content-based instruction (CBI) in the English language classroom. They provide a comprehensive overview of how to teach CBI in an easy-to-follow guide that language teachers will find very practical for their own contexts. Topics…

  16. Hydrogel based occlusion systems

    NARCIS (Netherlands)

    Stam, F.A.; Jackson, N.; Dubruel, P.; Adesanya, K.; Embrechts, A.; Mendes, E.; Neves, H.P.; Herijgers, P.; Verbrugghe, Y.; Shacham, Y.; Engel, L.; Krylov, V.

    2013-01-01

    A hydrogel based occlusion system, a method for occluding vessels, appendages or aneurysms, and a method for hydrogel synthesis are disclosed. The hydrogel based occlusion system includes a hydrogel having a shrunken and a swollen state and a delivery tool configured to deliver the hydrogel to a target occlusion location.

  17. Game-Based Teaching

    DEFF Research Database (Denmark)

    Hanghøj, Thorkild

    2013-01-01

    This chapter outlines theoretical and empirical perspectives on how Game-Based Teaching can be integrated within the context of formal schooling. Initially, this is done by describing game scenarios as models for possible actions that need to be translated into curricular knowledge practices...... approaches to game-based teaching, which may or may not correspond with the pedagogical models of particular games....

  18. Skull base tumours

    Energy Technology Data Exchange (ETDEWEB)

    Borges, Alexandra [Instituto Portugues de Oncologia Francisco Gentil, Servico de Radiologia, Rua Professor Lima Basto, 1093 Lisboa Codex (Portugal)], E-mail: borgesalexandra@clix.pt

    2008-06-15

    With the advances of cross-sectional imaging, radiologists have gained an increasing responsibility in the management of patients with skull base pathology. As this anatomic area is hidden from clinical exam, surgeons and radiation oncologists have to rely on imaging studies to plan the most adequate treatment. To fulfil this endeavour, radiologists need to be knowledgeable about skull base anatomy and about the main treatment options available, their indications and contra-indications, and need to be aware of the wide gamut of pathologies seen in this anatomic region. This article will provide a radiologist-friendly approach to the central skull base and will review the most common central skull base tumours and tumours intrinsic to the bony skull base.

  19. Data base management study

    Science.gov (United States)

    1976-01-01

    Data base management techniques and applicable equipment are described. Recommendations which will assist potential NASA data users in selecting and using appropriate data base management tools and techniques are presented. Classes of currently available data processing equipment ranging from basic terminals to large minicomputer systems were surveyed as they apply to the needs of potential SEASAT data users. Cost and capabilities projections for this equipment through 1985 were presented. A test of a typical data base management system was described, as well as the results of this test and recommendations to assist potential users in determining when such a system is appropriate for their needs. The representative system tested was UNIVAC's DMS 1100.

  20. Value-based pricing

    Directory of Open Access Journals (Sweden)

    Netseva-Porcheva Tatyana

    2010-01-01

    Full Text Available The main aim of the paper is to present value-based pricing. Therefore, a comparison between two approaches to pricing is made - cost-based pricing and value-based pricing. The 'price sensitivity meter' is presented. The other topic of the paper is perceived value - the meaning of perceived value, the components of perceived value, the determination of perceived value and the increasing of perceived value. In addition, the best company strategies in the 'value-cost' matrix are outlined.

  1. Skull Base Anatomy.

    Science.gov (United States)

    Patel, Chirag R; Fernandez-Miranda, Juan C; Wang, Wei-Hsin; Wang, Eric W

    2016-02-01

    The anatomy of the skull base is complex with multiple neurovascular structures in a small space. Understanding all of the intricate relationships begins with understanding the anatomy of the sphenoid bone. The cavernous sinus contains the carotid artery and some of its branches; cranial nerves III, IV, VI, and V1; and transmits venous blood from multiple sources. The anterior skull base extends to the frontal sinus and is important to understand for sinus surgery and sinonasal malignancies. The clivus protects the brainstem and posterior cranial fossa. A thorough appreciation of the anatomy of these various areas allows for endoscopic endonasal approaches to the skull base.

  2. Pattern based morphometry.

    Science.gov (United States)

    Gaonkar, Bilwaj; Pohl, Kilian; Davatzikos, Christos

    2011-01-01

    Voxel based morphometry (VBM) is widely used in the neuroimaging community to infer group differences in brain morphology. VBM is effective in quantifying group differences highly localized in space. However it is not equally effective when group differences might be based on interactions between multiple brain networks. We address this by proposing a new framework called pattern based morphometry (PBM). PBM is a data driven technique. It uses a dictionary learning algorithm to extract global patterns that characterize group differences. We test this approach on simulated and real data obtained from ADNI. In both cases PBM is able to uncover complex global patterns effectively.

  3. Lidar base specification

    Science.gov (United States)

    Heidemann, Hans Karl.

    2012-01-01

    In late 2009, a $14.3 million allocation from the “American Recovery and Reinvestment Act” for new light detection and ranging (lidar) elevation data prompted the U.S. Geological Survey (USGS) National Geospatial Program (NGP) to develop a common base specification for all lidar data acquired for The National Map. Released as a draft in 2010 and formally published in 2012, the USGS–NGP “Lidar Base Specification Version 1.0” (now Lidar Base Specification) was quickly embraced as the foundation for numerous state, county, and foreign country lidar specifications.

  4. QuickBase

    CERN Document Server

    Conner, Nancy

    2007-01-01

    Ready to put Intuit's QuickBase to work? Our new Missing Manual shows you how to capture, modify, share, and manage data and documents with this web-based data-sharing program quickly and easily. No longer do you have to coordinate your team through a blizzard of emails or play frustrating games of "guess which document is the right one."QuickBase saves your organization time and money, letting you manage and share the information that makes your business tick: sales figures, project timelines, drafts of documents, purchase or work requests--whatever information you need to keep business flowi

  5. Cheboygan Vessel Base

    Data.gov (United States)

    Federal Laboratory Consortium — Cheboygan Vessel Base (CVB), located in Cheboygan, Michigan, is a field station of the USGS Great Lakes Science Center (GLSC). CVB was established by congressional...

  6. Community Based Distribution

    African Journals Online (AJOL)

    Community Based Distribution (CBD) is a relatively new concept. It is a service that reaches ... neration; Resupply systems; Pricing of contraceptives; Mix of services ... tion on how best to design and implement the project and the community in ...

  7. trimethylammoniumpropane-based Liposomes

    African Journals Online (AJOL)

    antibody, vaccines, gene based drugs which are subject to enzymic degradation by proteinases, peptidases, nucleases ... Lyophilized La Sota® vaccine containing 200 doses/vial was ... Each sample was diluted with bi-distilled water and the ...

  8. WormBase

    Data.gov (United States)

    U.S. Department of Health & Human Services — WormBase is an international consortium of biologists and computer scientists dedicated to providing the research community with accurate, current, accessible...

  9. Biomimetics: nature based innovation

    National Research Council Canada - National Science Library

    Bar-Cohen, Yoseph

    2012-01-01

    "Based on the concept that nature offers numerous sources of inspiration for inventions related to mechanisms, materials, processes, and algorithms, this book covers the topic of biomimetics and the inspired innovation...

  10. Kelomehele preemia Baseli festivalil

    Index Scriptorium Estoniae

    2000-01-01

    At the Basel festival "VIPER - International Festival for Film Video and New Media", Gustav Deutsch and Anna Schimek's "Odysee today" was named the best CD-ROM, the Italians' "01.ORG" the best net project, and an honourable mention went to Raivo Kelomees' "Videoweaver".

  11. Hanscom Air Force Base

    Data.gov (United States)

    Federal Laboratory Consortium — MIT Lincoln Laboratory occupies 75 acres (20 acres of which are MIT property) on the eastern perimeter of Hanscom Air Force Base, which is at the nexus of Lexington,...

  12. Mutually unbiased bases

    Indian Academy of Sciences (India)

    S Chaturvedi

    2002-08-01

    After a brief review of the notion of a full set of mutually unbiased bases in an N-dimensional Hilbert space, we summarize the work of Wootters and Fields (W K Wootters and B C Fields, Ann. Phys. 191, 363 (1989)) which gives an explicit construction for such bases for the case N = p^r, where p is a prime. Further, we show how, by exploiting certain freedom in the Wootters–Fields construction, the task of explicitly writing down such bases can be simplified for the case when p is an odd prime. In particular, we express the results entirely in terms of the character vectors of the cyclic group of order p. We also analyse the connection between mutually unbiased bases and the representations of this group.
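
    As a hedged numerical companion (a sketch of the standard character-sum construction for an odd prime dimension; it is not necessarily the exact parametrization used in the paper):

```python
import numpy as np

def mub_bases(p):
    """Construct p+1 mutually unbiased bases for an odd prime dimension p.

    Basis b (b = 0..p-1) has vectors with components
    omega**(b*j*j + j*k) / sqrt(p), plus the computational basis.
    """
    omega = np.exp(2j * np.pi / p)
    j = np.arange(p)
    bases = [np.eye(p, dtype=complex)]                      # computational basis
    for b in range(p):
        cols = [omega ** (b * j * j + j * k) / np.sqrt(p) for k in range(p)]
        bases.append(np.column_stack(cols))
    return bases

# check unbiasedness: |<u|v>|^2 = 1/p for vectors taken from different bases
p = 5
bases = mub_bases(p)
overlap = abs(bases[1].conj().T @ bases[4]) ** 2
print(np.allclose(overlap, 1.0 / p))   # True
```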

  13. BaseMap

    Data.gov (United States)

    California Department of Resources — The goal of this project is to provide a convenient base map that can be used as a starting point for CA projects. It's simple, but designed to work at a number of...

  14. Kelomehele preemia Baseli festivalil

    Index Scriptorium Estoniae

    2000-01-01

    At the Basel festival "VIPER - International Festival for Film Video and New Media", Gustav Deutsch and Anna Schimek's "Odysee today" was named the best CD-ROM, the Italians' "01.ORG" the best net project, and an honourable mention went to Raivo Kelomees' "Videoweaver".

  15. Graphene-based composites.

    Science.gov (United States)

    Huang, Xiao; Qi, Xiaoying; Boey, Freddy; Zhang, Hua

    2012-01-21

    Graphene has attracted tremendous research interest in recent years, owing to its exceptional properties. The scaled-up and reliable production of graphene derivatives, such as graphene oxide (GO) and reduced graphene oxide (rGO), offers a wide range of possibilities to synthesize graphene-based functional materials for various applications. This critical review presents and discusses the current development of graphene-based composites. After introduction of the synthesis methods for graphene and its derivatives as well as their properties, we focus on the description of various methods to synthesize graphene-based composites, especially those with functional polymers and inorganic nanostructures. Particular emphasis is placed on strategies for the optimization of composite properties. Lastly, the advantages of graphene-based composites in applications such as the Li-ion batteries, supercapacitors, fuel cells, photovoltaic devices, photocatalysis, as well as Raman enhancement are described (279 references).

  16. Electrochemical Based Biosensors

    OpenAIRE

    Chung Chiun Liu

    2012-01-01

    This editorial summarizes the general approaches of the electrochemical based biosensors described in the manuscripts published in this Special Issue. Electrochemical based biosensors are scientifically and economically important for the detection and early diagnosis of many diseases, and they will be increasingly used and developed in the coming years. The importance of the selection of recognition processes, fabrication techniques and biosensor materials will be introduced.

  17. Participatory design based research

    DEFF Research Database (Denmark)

    Dau, Susanne; Falk, Lars; Jensen, Louise Bach

    2014-01-01

    This poster reveals how participatory design based research, by the use of a CoED inspired creative process, can be used for designing solutions to problems regarding students' study activities outside campus....

  18. DSP Based Waveform Generator

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The DSP Based Waveform Generator is used in the CSR control system to control special objects such as the pulsed power supplies for magnets, the RF system, injection and extraction synchronization, global CSR synchronization, etc. This intelligent controller, based on a 4800 MIPS DSP and 256M SDRAM technology, will supply the highly stable and highly accurate reference waveforms used by the magnet power supplies. The specifications are as follows:

  19. Image Based Indoor Navigation

    OpenAIRE

    Noreikis, Marius

    2014-01-01

    Over the last years researchers have proposed numerous indoor localisation and navigation systems. However, solutions that use WiFi or Radio Frequency Identification require infrastructure to be deployed in the navigation area, and infrastructure-less techniques, e.g. the ones based on mobile cell ID or dead reckoning, suffer from large accuracy errors. In this Thesis, we present a novel approach to infrastructure-less indoor navigation based on computer vision Structure from Motion techniques...

  20. Hydrogel based occlusion systems

    OpenAIRE

    Stam, F.A.; Jackson, N.; Dubruel, P.; Adesanya, K.; Embrechts, A; Mendes, E.; Neves, H.P.; Herijgers, P; Verbrugghe, Y.; Shacham, Y.; Engel, L.; Krylov, V

    2013-01-01

    A hydrogel based occlusion system, a method for occluding vessels, appendages or aneurysms, and a method for hydrogel synthesis are disclosed. The hydrogel based occlusion system includes a hydrogel having a shrunken and a swollen state and a delivery tool configured to deliver the hydrogel to a target occlusion location. The hydrogel is configured to permanently occlude the target occlusion location in the swollen state. The hydrogel may be an electro-activated hydrogel (EAH) which could be ...

  1. Participatory design based research

    DEFF Research Database (Denmark)

    Dau, Susanne; Bach Jensen, Louise; Falk, Lars

    2014-01-01

    This poster reveals how participatory design based research, by the use of a CoED inspired creative process, can be used for designing solutions to problems regarding students' study activities outside campus....

  2. REST based mobile applications

    Science.gov (United States)

    Rambow, Mark; Preuss, Thomas; Berdux, Jörg; Conrad, Marc

    2008-02-01

    Simplicity is the major advantage of REST based webservices. Whereas SOAP is widespread in complex, security-sensitive business-to-business applications, REST is widely used for mashups and end-user centric applications. In that context we give an overview of REST and compare it to SOAP. Furthermore we apply the GeoDrawing application as an example for REST based mobile applications and emphasize pros and cons for the use of REST in mobile application scenarios.

  3. Layered nickel based superconductors

    Energy Technology Data Exchange (ETDEWEB)

    Ronning, Filip [Los Alamos National Laboratory; Bauer, Eric D [Los Alamos National Laboratory; Park, Tuson [Los Alamos National Laboratory; Kurita, Nobuyuki [Los Alamos National Laboratory; Klimczuk, T [Los Alamos National Laboratory; Movshovich, R [Los Alamos National Laboratory; Thompson, J D [Los Alamos National Laboratory; Sefat, A S [ORNL; Mandrus, D [ORNL

    2009-01-01

    We review the properties of Ni-based superconductors which contain Ni{sub 2}X{sub 2} (X=As, P, Bi, Si, Ge, B) planes, a common structural element to the recently discovered FeAs superconductors. We also compare the properties of the Ni- and Fe-based systems from a perspective of electronic structure as well as structure-property relations.

  4. ACTIVITY - BASED COSTING DESIGNING

    OpenAIRE

    Wioletta Skibińska; Marta Kadłubek

    2010-01-01

    The traditional costing system sometimes does not give accurate information about the consumption of different resources and the activities of the organisation. The activity-based costing system is an information-rich costing system which is more and more necessary for the success of many European companies. The basics of designing and implementing an ABC system in enterprises are presented in the article.

  5. Base de datos FIA

    OpenAIRE

    Clemente Iturriaga, José Alfredo

    2015-01-01

    This work is based on the development of an Oracle database for the FIA (Federación Internacional de Automovilismo), starting from scratch. Bachelor thesis for the Computer Science program on Databases.

  6. LDEF materials data bases

    Science.gov (United States)

    Funk, Joan G.; Strickland, John W.; Davis, John M.

    1993-01-01

    The Long Duration Exposure Facility (LDEF) and the accompanying experiments were composed of and contained a wide variety of materials representing the largest collection of materials flown in low Earth orbit (LEO) and retrieved for ground based analysis to date. The results and implications of the mechanical, thermal, optical, and electrical data from these materials are the foundation on which future LEO space missions will be built. The LDEF Materials Special Investigation Group (MSIG) has been charged with establishing and developing data bases to document these materials and their performance to assure not only that the data are archived for future generations but also that the data are available to the spacecraft user community in an easily accessed, user-friendly form. This paper discusses the format and content of the three data bases developed or being developed to accomplish this task. The hardware and software requirements for each of these three data bases are discussed along with current availability of the data bases. This paper also serves as a user's guide to the MAPTIS LDEF Materials Data Base.

  7. Swarm-based medicine.

    Science.gov (United States)

    Putora, Paul Martin; Oldenburg, Jan

    2013-09-19

    Occasionally, medical decisions have to be taken in the absence of evidence-based guidelines. Other sources can be drawn upon to fill in the gaps, including experience and intuition. Authorities or experts, with their knowledge and experience, may provide further input--known as "eminence-based medicine". Due to the Internet and digital media, interactions among physicians now take place at a higher rate than ever before. With the rising number of interconnected individuals and their communication capabilities, the medical community is obtaining the properties of a swarm. The way individual physicians act depends on other physicians; medical societies act based on their members. Swarm behavior might facilitate the generation and distribution of knowledge as an unconscious process. As such, "swarm-based medicine" may add a further source of information to the classical approaches of evidence- and eminence-based medicine. How to integrate swarm-based medicine into practice is left to the individual physician, but even this decision will be influenced by the swarm.

  8. Proceedings of airborne reconnaissance 14

    Energy Technology Data Exchange (ETDEWEB)

    Henkel, P.A. (General Dynamics Corp. (US)); LaGasse, F.R.; Schurter, W.W. (McDonnell Aircraft Co., St. Louis, MO (United States))

    1990-01-01

    This book is covered under the following topics: HDTV/High-Resolution Video Overview; Image Acquisition and Recording; Image Processing and Exploitation; Reconnaissance Requirements; Reconnaissance Platforms; and Advanced Development.

  9. 78 FR 42487 - Announcement of Grant Application Deadlines; Deadlines and Funding Levels

    Science.gov (United States)

    2013-07-16

    ... and high definition television (HDTV) programming, at both the interim and final channel and power... produces an area's only local news or if the station has been historically active in producing...

  10. On mutually unbiased bases

    CERN Document Server

    Durt, Thomas; Bengtsson, Ingemar; Zyczkowski, Karol

    2010-01-01

    Mutually unbiased bases for quantum degrees of freedom are central to all theoretical investigations and practical exploitations of complementary properties. Much is known about mutually unbiased bases, but there are also a fair number of important questions that have not been answered in full as yet. In particular, one can find maximal sets of $N+1$ mutually unbiased bases in Hilbert spaces of prime-power dimension $N=p^m$, with $p$ prime and $m$ a positive integer, and there is a continuum of mutually unbiased bases for a continuous degree of freedom, such as motion along a line. But not a single example of a maximal set is known if the dimension is another composite number ($N=6,10,12,\ldots$). In this review, we present a unified approach in which the basis states are labeled by numbers $0,1,2,\ldots,N-1$ that are both elements of a Galois field and ordinary integers. This dual nature permits a compact systematic construction of maximal sets of mutually unbiased bases when they are known to exist but th...
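
    For reference, the property behind the term is standard: two orthonormal bases $\{|e_i\rangle\}$ and $\{|f_j\rangle\}$ of an $N$-dimensional Hilbert space are mutually unbiased when every overlap has the same modulus,

        \left| \langle e_i | f_j \rangle \right|^2 = \frac{1}{N}, \qquad i, j = 0, 1, \ldots, N-1 .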

  11. Paper based electronics platform

    KAUST Repository

    Nassar, Joanna Mohammad

    2017-07-20

    A flexible and non-functionalized low cost paper-based electronic system platform fabricated from common paper, such as paper based sensors, and methods of producing paper based sensors, and methods of sensing using the paper based sensors are provided. A method of producing a paper based sensor can include the steps of: a) providing a conventional paper product to serve as a substrate for the sensor or as an active material for the sensor or both, the paper product not further treated or functionalized; and b) applying a sensing element to the paper substrate, the sensing element selected from the group consisting of a conductive material, the conductive material providing contacts and interconnects, sensitive material film that exhibits sensitivity to pH levels, a compressible and/or porous material disposed between a pair of opposed conductive elements, or a combination of two or more of said sensing elements. The method of sensing can further include measuring, using the sensing element, a change in resistance, a change in voltage, a change in current, a change in capacitance, or a combination of any two or more thereof.

  12. Design-Based Research

    DEFF Research Database (Denmark)

    Gynther, Karsten; Christensen, Ove; Petersen, Trine Brun

    2012-01-01

    This article introduces Design Based Research for the first time in Danish in a scientific journal. The article presents the basic assumptions underlying the Design Based Research tradition and discusses the principles governing the conduct of a DBR research project. Taking the research and development project ELYK (E-læring, Yderområder og Klyngedannelse: e-learning, peripheral areas and clustering) as its point of departure, the article presents the innovation model that the project has developed on the basis of the Design Based Research tradition. ELYK's DBR innovation model has proven effective with respect to project progress, user involvement and knowledge generation, and it may inspire others with an interest in research-based development of didactic designs mediated by digital technologies.

  13. Compression-based Similarity

    CERN Document Server

    Vitanyi, Paul M B

    2011-01-01

    First we consider pair-wise distances for literal objects consisting of finite binary files. These files are taken to contain all of their meaning, like genomes or books. The distances are based on compression of the objects concerned, normalized, and can be viewed as similarity distances. Second, we consider pair-wise distances between names of objects, like "red" or "christianity." In this case the distances are based on searches of the Internet. Such a search can be performed by any search engine that returns aggregate page counts. We can extract a code length from the numbers returned, use the same formula as before, and derive a similarity or relative semantics between names for objects. The theory is based on Kolmogorov complexity. We test both similarities extensively experimentally.
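
    A minimal sketch of the compression-based distance described above, assuming zlib as a stand-in compressor (any real compressor only approximates the ideal Kolmogorov-complexity-based quantity); the test strings are invented for illustration.

        import zlib

        def c(data: bytes) -> int:
            """Compressed length of a byte string."""
            return len(zlib.compress(data, 9))

        def ncd(x: bytes, y: bytes) -> float:
            """Normalized compression distance: (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
            cx, cy, cxy = c(x), c(y), c(x + y)
            return (cxy - min(cx, cy)) / max(cx, cy)

        print(ncd(b"the quick brown fox " * 20, b"the quick brown fox " * 20))  # near 0: similar
        print(ncd(b"the quick brown fox " * 20, bytes(range(256)) * 2))         # closer to 1: dissimilar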

  14. Cantilever Based Mass Sensing

    DEFF Research Database (Denmark)

    Dohn, Søren

    2007-01-01

    Cantilever based mass sensors utilize that a change in vibrating mass will cause a change in the resonant frequency. This can be used for very accurate sensing of adsorption and desorption processes on the cantilever surface. The change in resonant frequency caused by a single molecule depends...... on various parameters including the vibrating mass of the cantilever and the frequency at which it vibrates. The minimum amount of molecules detectable is highly dependent on the noise of the system as well as the method of readout. The aim of this Ph.D. thesis has been twofold: To develop a readout method...... suitable for a portable device and to investigate the possibility of enhancing the functionality and sensitivity of cantilever based mass sensors. A readout method based on the hard contact between the cantilever and a biased electrode placed in close proximity to the cantilever is proposed. The viability...
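
    The frequency-shift principle described above can be summarized by the standard harmonic-oscillator relation, with $m_{\mathrm{eff}}$ the effective vibrating mass and $k$ the cantilever stiffness (assumed unchanged by the adsorbed mass):

        f_0 = \frac{1}{2\pi}\sqrt{\frac{k}{m_{\mathrm{eff}}}}, \qquad
        \frac{\Delta f}{f_0} \approx -\frac{\Delta m}{2 m_{\mathrm{eff}}}
        \quad\Longrightarrow\quad
        \Delta m \approx -2\, m_{\mathrm{eff}}\, \frac{\Delta f}{f_0}.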

  15. Evidence-Based Development

    DEFF Research Database (Denmark)

    Hertzum, Morten; Simonsen, Jesper

    2004-01-01

    Systems development is replete with projects that represent substantial resource investments but result in systems that fail to meet users’ needs. Evidence-based development is an emerging idea intended to provide means for managing customer-vendor relationships and working systematically toward...... meeting customer needs. We are suggesting that the effects of the use of a system should play a prominent role in the contractual definition of IT projects and that contract fulfilment should be determined on the basis of evidence of these effects. Based on two ongoing studies of home-care management...

  16. Inkjet-based micromanufacturing

    CERN Document Server

    Korvink, Jan G; Shin, Dong-Youn; Brand, Oliver; Fedder, Gary K; Hierold, Christofer; Tabata, Osamu

    2012-01-01

    Inkjet-based Micromanufacturing Inkjet technology goes way beyond putting ink on paper: it enables simpler, faster and more reliable manufacturing processes in the fields of micro- and nanotechnology. Modern inkjet heads are per se precision instruments that deposit droplets of fluids on a variety of surfaces in programmable, repeating patterns, allowing, after suitable modifications and adaptations, the manufacturing of devices such as thin-film transistors, polymer-based displays and photovoltaic elements. Moreover, inkjet technology facilitates the large-scale production of flexible RFID tr

  17. Identity-based encryption

    CERN Document Server

    Chatterjee, Sanjit

    2011-01-01

    Identity Based Encryption (IBE) is a type of public key encryption and has been intensely researched in the past decade. Identity-Based Encryption summarizes the available research for IBE and the main ideas that would enable users to pursue further work in this area. This book will also cover a brief background on Elliptic Curves and Pairings, security against chosen-ciphertext attacks, standards and more. Advanced-level students in computer science and mathematics who specialize in cryptology, and the general community of researchers in the area of cryptology and data security will find Ide

  18. Iron-based superconductivity

    CERN Document Server

    Johnson, Peter D; Yin, Wei-Guo

    2015-01-01

    This volume presents an in-depth review of experimental and theoretical studies on the newly discovered Fe-based superconductors.  Following the Introduction, which places iron-based superconductors in the context of other unconventional superconductors, the book is divided into three sections covering sample growth, experimental characterization, and theoretical understanding.  To understand the complex structure-property relationships of these materials, results from a wide range of experimental techniques and theoretical approaches are described that probe the electronic and magnetic proper

  19. Video-based rendering

    CERN Document Server

    Magnor, Marcus A

    2005-01-01

    Driven by consumer-market applications that enjoy steadily increasing economic importance, graphics hardware and rendering algorithms are a central focus of computer graphics research. Video-based rendering is an approach that aims to overcome the current bottleneck in the time-consuming modeling process and has applications in areas such as computer games, special effects, and interactive TV. This book offers an in-depth introduction to video-based rendering, a rapidly developing new interdisciplinary topic employing techniques from computer graphics, computer vision, and telecommunication en

  20. Process-based costing.

    Science.gov (United States)

    Lee, Robert H; Bott, Marjorie J; Forbes, Sarah; Redford, Linda; Swagerty, Daniel L; Taunton, Roma Lee

    2003-01-01

    Understanding how quality improvement affects costs is important. Unfortunately, low-cost, reliable ways of measuring direct costs are scarce. This article builds on the principles of process improvement to develop a costing strategy that meets both criteria. Process-based costing has 4 steps: developing a flowchart, estimating resource use, valuing resources, and calculating direct costs. To illustrate the technique, this article uses it to cost the care planning process in 3 long-term care facilities. We conclude that process-based costing is easy to implement; generates reliable, valid data; and allows nursing managers to assess the costs of new or modified processes.
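
    A minimal sketch of the arithmetic behind steps two to four (estimating resource use, valuing resources, calculating direct costs); the resource names and prices below are hypothetical and not taken from the study.

        # Hypothetical figures only: direct cost = sum over resources of (use x unit value).
        resource_use = {            # step 2: estimated resource use per care-planning episode
            "RN hours": 1.5,
            "aide hours": 0.75,
            "supplies (units)": 3,
        }
        unit_values = {             # step 3: value assigned to each resource
            "RN hours": 32.00,
            "aide hours": 14.50,
            "supplies (units)": 2.25,
        }
        direct_cost = sum(resource_use[r] * unit_values[r] for r in resource_use)   # step 4
        print(f"Direct cost per episode: ${direct_cost:.2f}")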

  1. Evaluation of high-definition television for remote task performance

    Energy Technology Data Exchange (ETDEWEB)

    Draper, J.V.; Fujita, Y.; Herndon, J.N.

    1987-04-01

    High-definition television (HDTV) transmits a video image with more than twice the number (1125 for HDTV to 525 for standard-resolution TV) of horizontal scan lines that standard-resolution TV provides. The improvement in picture quality (compared to standard-resolution TV) that the extra scan lines provide is impressive. Objects in the HDTV picture have more sharply defined edges, better contrast, and more accurate reproduction of shading and color patterns than do those in the standard-resolution TV picture. Because the TV viewing system is a key component for teleoperator performance, an improvement in TV picture quality could mean an improvement in the speed and accuracy with which teleoperators perform tasks. This report describes three experiments designed to evaluate the impact of HDTV on the performance of typical remote tasks. The performance of HDTV was compared to that of standard-resolution, monochromatic TV and standard-resolution, stereoscopic, monochromatic TV in the context of judgment of depth in a televised scene, visual inspection of an object, and performance of a typical remote handling task. The results of the three experiments show that in some areas HDTV can lead to improvement in teleoperator performance. Observers inspecting a small object for a flaw were more accurate with HDTV than with either of the standard-resolution systems. High resolution is critical for detection of small-scale flaws of the type in the experiment (a scratch on a glass bottle). These experiments provided an evaluation of HDTV television for use in tasks that must be routinely performed to remotely maintain a nuclear fuel reprocessing facility. 5 refs., 7 figs., 9 tabs.

  2. Economic evaluation of broadband distribution networks to the home

    Science.gov (United States)

    Merk, Charles A.

    1992-02-01

    Economic wideband, linear fiber optic transmitters and receivers pave the way for broadband to the home. The diamond network architecture (DNA) delivers 1 GHz bandwidth. This provides standard video, HDTV, and switched two-way broadband digital services to the home. An economic model is presented using the DNA that considers the impact of digital TV, HDTV, and the evolution of switched voice and data services on a CATV system.

  3. Faunal Biogeography Community Structure and Genetic Connectivity of North Atlantic Seamounts

    Science.gov (United States)

    2008-09-01

    equipped with three HMI lights with a total of 2000 W of power and a high definition Insite Pacific Zeus HDTV camera (resolution 1035i), and acted as a...definition Insite Pacific Zeus HDTV camera (resolution 1035i) that provided video and still imagery for analysis, an Insite Pacific Titan tilt-rotate...centrally located. The other haplotypes often differed by several nucleotide substitutions (Figure 4). Geographical representation of these data give

  4. Distribution high definition television channels in the former Yugoslavia

    OpenAIRE

    Petrović, Mile B.; Jakšić, Branimir S.; Jakšić, Krsto M.; Spalević, Žaklina S.; Pavlović, Miroslav P.

    2013-01-01

    This paper presents an analysis of the distribution of high definition television (HDTV) channels in the countries of the former Yugoslavia: Serbia, Bosnia and Herzegovina, Montenegro, Croatia, Slovenia and Macedonia. It gives an overview of the number of channels on different forms of digital distribution: DVB-T, DVB-S, DVB-C and IPTV. The share of HDTV channels is compared with that of SDTV (Standard Definition) channels in the various forms of distribution. Also featured is a representation of ...

  5. European Science Notes Information Bulletin Reports on Current European/ Middle Eastern Science

    Science.gov (United States)

    1990-05-01

    under serious pressure. This results, also in part, from the growing use of distributed systems and the resulting need for compatibility between... Corporation and Sematech. Two important announcements in the first quarter of 1990 concerning HDTV and the JESSI programs were: " NBC, a division of...RCA Corporation, and Thomson Consumer Electronics announced that they would join with North American Philips Corporation on HDTV development " On

  6. Performance based fault diagnosis

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik

    2002-01-01

    Different aspects of fault detection and fault isolation in closed-loop systems are considered. It is shown that using the standard setup known from feedback control, it is possible to formulate fault diagnosis problems based on a performance index in this general standard setup. It is also shown...

  7. Surfel Based Geometry Reconstruction

    DEFF Research Database (Denmark)

    Andersen, Vedrana; Aanæs, Henrik; Bærentzen, Jakob Andreas

    2010-01-01

    We propose a method for retrieving a piecewise smooth surface from noisy data. In data acquired by a scanning process sampled points are almost never on the discontinuities making reconstruction of surfaces with sharp features difficult. Our method is based on a Markov Random Field (MRF) formulat...

  8. Diffusion Based Photon Mapping

    DEFF Research Database (Denmark)

    Schjøth, Lars; Sporring, Jon; Fogh Olsen, Ole

    2008-01-01

    To address this problem, we introduce a photon mapping algorithm based on nonlinear anisotropic diffusion. Our algorithm adapts according to the structure of the photon map such that smoothing occurs along edges and structures and not across. In this way, we preserve important illumination features, while...

  9. REST based service composition

    DEFF Research Database (Denmark)

    Grönvall, Erik; Ingstrup, Mads; Pløger, Morten

    2011-01-01

    This paper presents ongoing work developing and testing a Service Composition framework based upon the REST architecture named SECREST. A minimalistic approach has been favored instead of creating a complete infrastructure. One focus has been on the system's interaction model. Indeed, an aim...

  10. Internet based benchmarking

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Nielsen, Kurt

    2005-01-01

    We discuss the design of interactive, internet based benchmarking using parametric (statistical) as well as nonparametric (DEA) models. The user receives benchmarks and improvement potentials. The user is also given the possibility to search different efficiency frontiers and hereby to explore...
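
    For background, nonparametric (DEA) efficiency frontiers of the kind mentioned are commonly computed with the input-oriented CCR envelopment model shown below; this is a generic illustration and may differ from the authors' exact formulation. Here $x_{ij}$ and $y_{rj}$ are the inputs and outputs of unit $j$, unit $0$ is the unit being benchmarked, and the optimal $\theta \le 1$ is its efficiency score:

        \min_{\theta,\,\lambda}\; \theta
        \quad \text{s.t.} \quad
        \sum_{j} \lambda_j x_{ij} \le \theta\, x_{i0} \;\; \forall i, \qquad
        \sum_{j} \lambda_j y_{rj} \ge y_{r0} \;\; \forall r, \qquad
        \lambda_j \ge 0 .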

  11. Technology based Education System

    DEFF Research Database (Denmark)

    Kant Hiran, Kamal; Doshi, Ruchi; Henten, Anders

    2016-01-01

    Abstract - Education plays a very important role in the development of a country. Education has multiple dimensions, from schooling to higher education and research. In all these domains, there is invariably a need for technology based teaching and learning tools, which are highly demanded in the acad...

  12. Formula Based Compensation.

    Science.gov (United States)

    Sears, Doug; Picus, Lawrence O.

    1999-01-01

    Recognizing that traditional salary bargaining is divisive and that teacher salaries should remain competitive, Temple City (California) Unified School District has been experimenting with formula-based compensation for the past four years. Primary benefits are lack of conflict over salary increases, which are determined before negotiating other…

  13. Speckle-based wavemeter

    DEFF Research Database (Denmark)

    Hanson, Steen Grüner; Jakobsen, Michael Linde; Chakrabarti, Maumita

    2015-01-01

    A spectrometer based on the application of dynamic speckles will be disclosed. The method relies on scattering of primarily coherent radiation from a slanted rough surface. The scattered radiation is collected on a detector array and the speckle displacement is monitored during a change in the in...

  14. Model-based consensus

    NARCIS (Netherlands)

    Boumans, Marcel

    2014-01-01

    The aim of the rational-consensus method is to produce “rational consensus”, that is, “mathematical aggregation”, by weighing the performance of each expert on the basis of his or her knowledge and ability to judge relevant uncertainties. The measurement of the performance of the experts is based on

  15. Model-based consensus

    NARCIS (Netherlands)

    M. Boumans

    2014-01-01

    The aim of the rational-consensus method is to produce "rational consensus", that is, "mathematical aggregation", by weighing the performance of each expert on the basis of his or her knowledge and ability to judge relevant uncertainties. The measurement of the performance of the experts is based on

  16. School Based Health Centers

    Science.gov (United States)

    Children's Aid Society, 2012

    2012-01-01

    School Based Health Centers (SBHC) are considered by experts as one of the most effective and efficient ways to provide preventive health care to children. Few programs are as successful in delivering health care to children at no cost to the patient, and where they are: in school. For many underserved children, The Children's Aid Society's…

  17. Protein Crystal Based Nanomaterials

    Science.gov (United States)

    Bell, Jeffrey A.; VanRoey, Patrick

    2001-01-01

    This is the final report on a NASA Grant. It concerns a description of work done, which includes: (1) Protein crystals cross-linked to form fibers; (2) Engineering of protein to favor crystallization; (3) Better knowledge-based potentials for protein-protein contacts; (4) Simulation of protein crystallization.

  18. Evidence-based guidelines

    DEFF Research Database (Denmark)

    Rovira, Àlex; Wattjes, Mike P; Tintoré, Mar

    2015-01-01

    diagnosis in patients with MS. The aim of this article is to provide guidelines for the implementation of MRI of the brain and spinal cord in the diagnosis of patients who are suspected of having MS. These guidelines are based on an extensive review of the recent literature, as well as on the personal...

  19. XML Based UIScript

    Institute of Scientific and Technical Information of China (English)

    CAI Bin; LIAO Jian-xin; SHEN Qi-wei

    2004-01-01

    In this paper, after analyzing the UIScript mechanism in the intelligent peripheral, a new approach of XML-based UIScript is put forward. The related issues such as the design of the UIScript language, the execution environment and its relationship with other script languages are discussed.

  20. Animation-based Sketching

    DEFF Research Database (Denmark)

    Vistisen, Peter

    This thesis is based on the results of a three-year long PhD-study at the Department of Communication and Psychology at Aalborg University. The thesis consists of five original papers, a book manuscript, as well as a linking text with the thesis’ research questions, research design, and summary...

  1. Lunar Base Siting

    Science.gov (United States)

    Staehle, Robert L.; Burke, James D.; Snyder, Gerald C.; Dowling, Richard; Spudis, Paul D.

    1993-12-01

    Speculation with regard to a permanent lunar base has been with us since Robert Goddard was working on the first liquid-fueled rockets in the 1920's. With the infusion of data from the Apollo Moon flights, a once speculative area of space exploration has become an exciting possibility. A Moon base is not only a very real possibility, but is probably a critical element in the continuation of our piloted space program. This article, originally drafted by World Space Foundation volunteers in conjunction with various academic and research groups, examines some of the strategies involved in selecting an appropriate site for such a lunar base. Site selection involves a number of complex variables, including raw materials for possible rocket propellant generation, hot and cold cycles, view of the sky (for astronomical considerations, among others), geological makeup of the region, and more. This article summarizes the key base siting considerations and suggests some alternatives. Availability of specific resources, including energy and certain minerals, is critical to success.

  2. Knowledge-Based Abstracting.

    Science.gov (United States)

    Black, William J.

    1990-01-01

    Discussion of automatic abstracting of technical papers focuses on a knowledge-based method that uses two sets of rules. Topics discussed include anaphora; text structure and discourse; abstracting techniques, including the keyword method and the indicator phrase method; and tools for text skimming. (27 references) (LRW)

  3. Nanocarbon-based photovoltaics.

    Science.gov (United States)

    Bernardi, Marco; Lohrman, Jessica; Kumar, Priyank V; Kirkeminde, Alec; Ferralis, Nicola; Grossman, Jeffrey C; Ren, Shenqiang

    2012-10-23

    Carbon materials are excellent candidates for photovoltaic solar cells: they are Earth-abundant, possess high optical absorption, and maintain superior thermal and photostability. Here we report on solar cells with active layers made solely of carbon nanomaterials that present the same advantages of conjugated polymer-based solar cells, namely, solution processable, potentially flexible, and chemically tunable, but with increased photostability and the possibility to revert photodegradation. The device active layer composition is optimized using ab initio density functional theory calculations to predict type-II band alignment and Schottky barrier formation. The best device fabricated is composed of PC(70)BM fullerene, semiconducting single-walled carbon nanotubes, and reduced graphene oxide. This active-layer composition achieves a power conversion efficiency of 1.3%-a record for solar cells based on carbon as the active material-and we calculate efficiency limits of up to 13% for the devices fabricated in this work, comparable to those predicted for polymer solar cells employing PCBM as the acceptor. There is great promise for improving carbon-based solar cells considering the novelty of this type of device, the high photostability, and the availability of a large number of carbon materials with yet untapped potential for photovoltaics. Our results indicate a new strategy for efficient carbon-based, solution-processable, thin film, photostable solar cells.

  4. ISFET based enzyme sensors

    NARCIS (Netherlands)

    van der Schoot, Bart H.; Bergveld, Piet

    1987-01-01

    This paper reviews the results that have been reported on ISFET based enzyme sensors. The most important improvement that results from the application of ISFETs instead of glass membrane electrodes is in the method of fabrication. Problems with regard to the pH dependence of the response and the

  5. Cotton-based nonwovens

    Science.gov (United States)

    This article is an abbreviated description of a new cotton-based nonwovens research program at the Southern Regional Research Center, which is one of the four regional research centers of the Agricultural Research Service, U.S. Department of Agriculture. Since cotton is a significant cash crop inte...

  6. Evidence based practice

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2011-01-01

    making that is established in research as well as an optimization of every link in documentation and search processes. EBP is based on the philosophical doctrine of empiricism and, therefore, it is subject to the criticism that has been raised against empiricism. The main criticism of EBP...

  7. BASE - Progress Report 2015

    CERN Document Server

    Ulmer, S; Mooser, A; Sellner, S; Nagahama, H; Higuchi, T; Borchert, M; Schneider, G; Tanaka, T; Blaum, K; Matsuda, Y; Ospelkaus, C; Quint, W; Walz, J; Yamazaki, Y; CERN. Geneva. SPS and PS Experiments Committee; SPSC

    2016-01-01

    The BASE collaboration aims at high-precision comparisons of the fundamental properties of the proton and the antiproton, namely, the magnetic g-factors as well as the charge-to-mass ratios of the particles. This annual report summarizes the achievements made in CERN's 2015 antiproton run.

  8. Acid-Base Homeostasis.

    Science.gov (United States)

    Hamm, L Lee; Nakhoul, Nazih; Hering-Smith, Kathleen S

    2015-12-07

    Acid-base homeostasis and pH regulation are critical for both normal physiology and cell metabolism and function. The importance of this regulation is evidenced by a variety of physiologic derangements that occur when plasma pH is either high or low. The kidneys have the predominant role in regulating the systemic bicarbonate concentration and hence, the metabolic component of acid-base balance. This function of the kidneys has two components: reabsorption of virtually all of the filtered HCO3(-) and production of new bicarbonate to replace that consumed by normal or pathologic acids. This production or generation of new HCO3(-) is done by net acid excretion. Under normal conditions, approximately one-third to one-half of net acid excretion by the kidneys is in the form of titratable acid. The other one-half to two-thirds is the excretion of ammonium. The capacity to excrete ammonium under conditions of acid loads is quantitatively much greater than the capacity to increase titratable acid. Multiple, often redundant pathways and processes exist to regulate these renal functions. Derangements in acid-base homeostasis, however, are common in clinical medicine and can often be related to the systems involved in acid-base transport in the kidneys.
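
    The quantitative bookkeeping described above is conventionally written as net acid excretion (NAE): the sum of titratable acid and ammonium excretion minus any bicarbonate lost in the urine, where $U$ denotes urinary concentration and $V$ urine flow:

        \mathrm{NAE} \;=\; U_{\mathrm{TA}}\,V \;+\; U_{\mathrm{NH_4^+}}\,V \;-\; U_{\mathrm{HCO_3^-}}\,V .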

  9. Web Based VRML Modelling

    NARCIS (Netherlands)

    Kiss, S.

    2001-01-01

    Presents a method to connect VRML (Virtual Reality Modeling Language) and Java components in a Web page using EAI (External Authoring Interface), which makes it possible to interactively generate and edit VRML meshes. The meshes used are based on regular grids, to provide an interaction and modeling

  10. Aptamer-based nanobiosensors.

    Science.gov (United States)

    Kim, Yeon Seok; Raston, Nurul Hanun Ahmad; Gu, Man Bock

    2016-02-15

    It has been more than two decades since aptamer and the systematic evolution of ligands by exponential enrichment (SELEX) method were discovered by Larry Gold and Andrew Ellington in 1990, respectively. Based on the various advantages of aptamers, they have become a potent rival of antibodies in therapeutics and bio-analysis. Especially, the recent advances in aptamer biosensor application are remarkable due to its intrinsic properties of aptamers as nucleic acids and target induced conformational changes, in addition to the introduction of graphene oxide-based easy and simple immobilization-free screening method even for dual aptamers. In addition, the incorporation of various nanomaterials such as metallic nanoparticles, carbon materials, and functional nanospheres in aptasensors has facilitated the improvement of analytical performance and commercial application of aptasensors. In this review, recent prominent reports on aptasensors utilizing nanomaterials were introduced to understand the principle of aptamer-based biosensors and provide an insight for new strategies of aptasensors and the application of various nanomaterials. The perspective on aptamer-based biosensors and diagnostics was also discussed in view of technology and market.

  11. Orff-Based Improvisation.

    Science.gov (United States)

    Thomas, Judith

    1980-01-01

    Described are improvisational activities based on the Orff-Schulwerk teaching technique which include: sound and movement; sound and movement plus visuals; interpretation of designs from nature, architecture, paintings, rotating rondo, singing hands, and moving from words to song. Pictures of children participating in these activities are…

  12. Refractive index based measurements

    DEFF Research Database (Denmark)

    2014-01-01

    A refractive index based measurement of a property of a fluid is measured in an apparatus comprising a variable wavelength coherent light source (16), a sample chamber (12), a wavelength controller (24), a light sensor (20), a data recorder (26) and a computation apparatus (28), by - directing...

  13. Refractive index based measurements

    DEFF Research Database (Denmark)

    2014-01-01

    In a method for performing a refractive index based measurement of a property of a fluid such as chemical composition or temperature by observing an apparent angular shift in an interference fringe pattern produced by back or forward scattering interferometry, ambiguities in the measurement caused...

  14. Refractive index based measurements

    DEFF Research Database (Denmark)

    2014-01-01

    In a method for performing a refractive index based measurement of a property of a fluid such as chemical composition or temperature, a chirp in the local spatial frequency of interference fringes of an interference pattern is reduced by mathematical manipulation of the recorded light intensity...

  15. Dictionary Based Image Segmentation

    DEFF Research Database (Denmark)

    Dahl, Anders Bjorholm; Dahl, Vedrana Andersen

    2015-01-01

    We propose a method for weakly supervised segmentation of natural images, which may contain both textured or non-textured regions. Our texture representation is based on a dictionary of image patches. To divide an image into separated regions with similar texture we use an implicit level sets...
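
    A rough sketch of the kind of patch dictionary such a method could start from, assuming scikit-learn for the clustering step; this only illustrates the dictionary idea and is not the authors' segmentation pipeline (which additionally uses an implicit level-set formulation).

        # Build a dictionary of image patches by clustering, then label patches by
        # their nearest dictionary element. Illustration only; not the paper's method.
        import numpy as np
        from sklearn.cluster import MiniBatchKMeans
        from sklearn.feature_extraction.image import extract_patches_2d

        def patch_dictionary_labels(image, patch_size=7, n_atoms=32, seed=0):
            patches = extract_patches_2d(image, (patch_size, patch_size),
                                         max_patches=5000, random_state=seed)
            flat = patches.reshape(len(patches), -1).astype(float)
            kmeans = MiniBatchKMeans(n_clusters=n_atoms, random_state=seed, n_init=3).fit(flat)
            return kmeans.cluster_centers_, kmeans.predict(flat)

        image = np.random.rand(128, 128)        # stand-in for a grayscale image
        atoms, labels = patch_dictionary_labels(image)
        print(atoms.shape, labels[:10])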

  16. Problem Based Learning

    DEFF Research Database (Denmark)

    de Graaff, Erik; Guerra, Aida

    Problem-Based Learning (PBL) is an innovative method to organize the learning process in such a way that the students actively engage in finding answers by themselves. During the past 40 years PBL has evolved and diversified, resulting in a multitude of variations in models and practices. However, the key principles remain the same everywhere. Graaff & Kolmos (2003) identify the main PBL principles as follows: 1. Problem orientation 2. Project organization through teams or group work 3. Participant-directed 4. Experiential learning 5. Activity-based learning 6. Interdisciplinary learning and 7. Exemplary practice. The University of Aalborg in Denmark started with PBL right from the start when the school was founded merging several educational institutes in Northern Denmark in 1974. The Aalborg PBL model is recognized around the world as an example or a source of inspiration, in particular...

  17. Sparse approximation with bases

    CERN Document Server

    2015-01-01

    This book systematically presents recent fundamental results on greedy approximation with respect to bases. Motivated by numerous applications, the last decade has seen great successes in studying nonlinear sparse approximation. Recent findings have established that greedy-type algorithms are suitable methods of nonlinear approximation in both sparse approximation with respect to bases and sparse approximation with respect to redundant systems. These insights, combined with some previous fundamental results, form the basis for constructing the theory of greedy approximation. Taking into account the theoretical and practical demand for this kind of theory, the book systematically elaborates a theoretical framework for greedy approximation and its applications.  The book addresses the needs of researchers working in numerical mathematics, harmonic analysis, and functional analysis. It quickly takes the reader from classical results to the latest frontier, but is written at the level of a graduate course and do...

  18. Conducting Polymer Based Nanobiosensors

    Directory of Open Access Journals (Sweden)

    Chul Soon Park

    2016-06-01

    In recent years, conducting polymer (CP) nanomaterials have been used in a variety of fields, such as in energy, environmental, and biomedical applications, owing to their outstanding chemical and physical properties compared to conventional metal materials. In particular, nanobiosensors based on CP nanomaterials exhibit excellent performance in sensing target molecules. The performance of CP nanobiosensors varies based on their size, shape, conductivity, and morphology, among other characteristics. Therefore, in this review, we provide an overview of the techniques commonly used to fabricate novel CP nanomaterials and their biosensor applications, including aptasensors, field-effect transistor (FET) biosensors, human sense mimicking biosensors, and immunoassays. We also discuss prospects for state-of-the-art nanobiosensors using CP nanomaterials by focusing on strategies to overcome the current limitations.

  19. Flow Based Algorithm

    Directory of Open Access Journals (Sweden)

    T. Karpagam

    2012-01-01

    Problem statement: Network topology design problems find application in several real-life scenarios. Approach: Most designs in the past optimize for a single criterion such as shortest path, cost minimization or maximum flow. Results: This study discusses solving a multi-objective network topology design problem for a realistic traffic model, specifically in pipeline transportation. Here, a flow based algorithm is developed to transport liquid goods at maximum capacity over the shortest distance, drawing on basic PERT and critical path method ideas. Conclusion/Recommendations: This flow based algorithm helps to give an optimal result for transporting maximum capacity at minimum cost. It could be used in juice factories and the milk industry, and is a good alternative for the vehicle routing problem.
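
    A toy sketch of the combined objective (shortest route subject to a capacity requirement), written as a plain Dijkstra search restricted to edges that can carry the required flow; the pipeline network and all numbers are invented for illustration.

        import heapq

        def shortest_route_with_capacity(edges, source, target, required_capacity):
            """edges: list of (u, v, length, capacity) for an undirected pipeline network."""
            graph = {}
            for u, v, length, cap in edges:
                if cap >= required_capacity:     # drop edges that cannot carry the flow
                    graph.setdefault(u, []).append((v, length))
                    graph.setdefault(v, []).append((u, length))
            dist, queue = {source: 0.0}, [(0.0, source)]
            while queue:
                d, node = heapq.heappop(queue)
                if node == target:
                    return d
                if d > dist.get(node, float("inf")):
                    continue
                for nxt, length in graph.get(node, []):
                    nd = d + length
                    if nd < dist.get(nxt, float("inf")):
                        dist[nxt] = nd
                        heapq.heappush(queue, (nd, nxt))
            return None                          # no feasible route

        edges = [("A", "B", 4, 100), ("B", "D", 3, 80), ("A", "C", 2, 50), ("C", "D", 2, 50)]
        print(shortest_route_with_capacity(edges, "A", "D", required_capacity=80))  # 7.0 via A-B-D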

  20. Ontology Based Access Control

    Directory of Open Access Journals (Sweden)

    Özgü CAN

    2010-02-01

    As computer technologies become pervasive, the need for access control mechanisms grows. The purpose of access control is to limit the operations that a computer system user can perform. Thus, access control helps prevent activity which can lead to a security breach. For the success of the Semantic Web, which allows machines to share and reuse information by using formal semantics to communicate with other machines, access control mechanisms are needed. An access control mechanism specifies constraints which must be satisfied by the user before performing an operation, in order to provide a secure Semantic Web. In this work, unlike traditional access control mechanisms, an "Ontology Based Access Control" mechanism has been developed by using Semantic Web based policies. In this mechanism, ontologies are used to model the access control knowledge and domain knowledge is used to create policy ontologies.
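
    As a toy illustration only (not the system described in the paper), the snippet below checks an operation against policy assertions defined over a small ontology-like class hierarchy, so that a permission granted to a class also applies to its subclasses; all class and policy names are made up.

        SUBCLASS_OF = {            # toy class hierarchy: child class -> parent class
            "PhDStudent": "Student",
            "Student": "Person",
            "Professor": "Person",
        }
        POLICIES = {               # permitted (subject class, action, resource class) assertions
            ("Student", "read", "CourseMaterial"),
            ("Professor", "write", "CourseMaterial"),
        }

        def ancestors(cls):
            """The class itself followed by all of its superclasses."""
            chain = [cls]
            while cls in SUBCLASS_OF:
                cls = SUBCLASS_OF[cls]
                chain.append(cls)
            return chain

        def is_permitted(subject_cls, action, resource_cls):
            """Permit if the subject's class or any superclass carries a matching assertion."""
            return any((c, action, resource_cls) in POLICIES for c in ancestors(subject_cls))

        print(is_permitted("PhDStudent", "read", "CourseMaterial"))   # True, inherited from Student
        print(is_permitted("PhDStudent", "write", "CourseMaterial"))  # False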

  1. Spintronics-based computing

    CERN Document Server

    Prenat, Guillaume

    2015-01-01

    This book provides a comprehensive introduction to spintronics-based computing for the next generation of ultra-low power/highly reliable logic, which is widely considered a promising candidate to replace conventional, pure CMOS-based logic. It will cover aspects from device to system-level, including magnetic memory cells, device modeling, hybrid circuit structure, design methodology, CAD tools, and technological integration methods. This book is accessible to a variety of readers and little or no background in magnetism and spin electronics are required to understand its content.  The multidisciplinary team of expert authors from circuits, devices, computer architecture, CAD and system design reveal to readers the potential of spintronics nanodevices to reduce power consumption, improve reliability and enable new functionality.  .

  2. Content Based Video Retrieval

    Directory of Open Access Journals (Sweden)

    B. V. Patel

    2012-10-01

    Content based video retrieval is an approach for facilitating the searching and browsing of large image collections over the World Wide Web. In this approach, video analysis is conducted on low level visual properties extracted from video frames. We believed that in order to create an effective video retrieval system, visual perception must be taken into account. We conjectured that a technique which employs multiple features for indexing and retrieval would be more effective in the discrimination and search tasks of videos. In order to validate this claim, content based indexing and retrieval systems were implemented using color histograms, various texture features and other approaches. Videos were stored in an Oracle 9i database and a user study measured correctness of response.
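
    A small sketch of the color-histogram indexing idea mentioned above, assuming NumPy; the "frames" here are synthetic stand-ins, and a real system would decode actual video frames before indexing.

        import numpy as np

        def color_histogram(frame, bins=8):
            """Concatenated per-channel histogram, normalized to sum to 1."""
            hists = [np.histogram(frame[..., ch], bins=bins, range=(0, 256))[0] for ch in range(3)]
            h = np.concatenate(hists).astype(float)
            return h / h.sum()

        def similarity(h1, h2):
            """Histogram intersection: 1.0 means identical color distributions."""
            return float(np.minimum(h1, h2).sum())

        rng = np.random.default_rng(0)
        # Four indexed "frames", each dominated by a different gray level.
        database = [np.full((120, 160, 3), level, dtype=np.uint8) for level in (30, 90, 150, 210)]
        # A noisy query frame that resembles frame 2.
        query = np.clip(database[2].astype(int) + rng.integers(-10, 10, size=database[2].shape), 0, 255)
        scores = [similarity(color_histogram(query), color_histogram(f)) for f in database]
        print(int(np.argmax(scores)), [round(s, 3) for s in scores])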

  3. Knowledge based maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Sturm, A. [Hamburgische Electacitaets-Werke AG Hamburg (Germany)

    1997-12-31

    The establishment of maintenance strategies is of crucial significance for the reliability of a plant and the economic efficiency of maintenance measures. Knowledge about the condition of components and plants from the technical and business management point of view therefore becomes one of the fundamental questions and the key to efficient management and maintenance. A new way to determine the maintenance strategy can be called: Knowledge Based Maintenance. A simple method for determining strategies, which takes the technical condition of the components of the production process into account to the greatest possible degree, is shown. Software with an algorithm for Knowledge Based Maintenance guides the user through the complex work of determining maintenance strategies for these complex plant components. (orig.)

  4. Situation based housing

    DEFF Research Database (Denmark)

    Duelund Mortensen, Peder; Welling, Helen; Wiell Nordberg, Lene

    2007-01-01

    of approaches to these goals. This working paper reviews not only a selection of new housing types, but also dwellings from the past, which each contain an aspect of changeability. Our study is based on information from users in the selected housing schemes, gathered from questionnaires, information about...... personal furnishing and zoning as well as interviews. The study is also based on analyses of the architectural configurations of space, light and materiality. Our main question is: can the goal of architectural quality be maintained together with greater possibilities for individual development...... research results will be employed to create a categorization of housing suitable for changing life conditions and with a strong emphasis on high architectural quality...

  5. Evidence-based policy

    DEFF Research Database (Denmark)

    Vohnsen, Nina Holm

    2013-01-01

    A current ambition in welfare states as diverse as Denmark, the UK, and in the USA is to base political decision making on rigorous research (Cartwright et al 2009; Mulgan 2009; Bason 2010). Sound as this might seem, the ambition has nevertheless been problematized by both policy-makers and the re... (for a full account, see Vohnsen 2011). These insights will be relevant for the anthropological researcher of legislative processes who wishes to move beyond a merely discursive approach to the study of policy and politics.

  6. Problem Based Learning

    Directory of Open Access Journals (Sweden)

    Paola Cappola

    2014-01-01

    In this work, I propose a general close examination of Problem Based Learning as a student centered educational method in which a problem constitutes the starting point of the learning process. Such a method provides students with suitable knowledge for problem solving and presents numerous and significant differences compared to traditional education. In particular, I analyze the theoretical aspects of problem based learning by tracing its history and presenting its structure, clarifying the role of the tutor in the various phases of the learning process. The method has found wide diffusion since the beginning of the 1970s and numerous studies have confirmed its advantages. The effectiveness of PBL can be understood in terms of, and is based on, principles of constructivism and cognitivism.

  7. Arduino based laser control

    OpenAIRE

    Bernal Muñoz, Ferran

    2015-01-01

    ARDUINO is a very useful platform for prototypes. In this project ARDUINO will be used for controlling a Semiconductor Tuneable Laser. Diode laser for communications control based on an Arduino board. Temperature control implementation. Software and hardware protection for the laser implementation. Control of an optical communications laser from a computer using an Arduino board, with implementation of temperature control and software and hardware protection ...

  8. Plasmonics based VLSI processes

    Directory of Open Access Journals (Sweden)

    Shreya Bhattacharya

    2013-04-01

    In continuation of my previous paper titled 'Implementation of plasmonics in VLSI', this paper attempts to explore further the actual physical realization of an all-plasmonic chip. In this paper, various methods of plasmon-based photolithography are discussed and an observation is made with respect to their cost effectiveness and ease of adaptability. Also, a plasmonics based active element is discussed which would help unravel further approaches and methods towards the realization of an all-plasmonic chip.

  9. Sea Basing Logistiek

    Science.gov (United States)

    2007-11-01

    something that is currently not possible for heavy goods with the means available within the Dutch armed forces. The Netherlands, for example, does not (yet) have specific... be combat-ready again. Dutch troops generally carry out the reconstitution phase in the Netherlands. [Kang&Gue] A special concern in sea based... the Netherlands decides to intervene itself. The Dutch peace mission is located near the coast, but far from the border with neighbouring countries. The operation cannot

  10. Graphene-based biosensors

    Science.gov (United States)

    Lebedev, A. A.; Davydov, V. Yu.; Novikov, S. N.; Litvin, D. P.; Makarov, Yu. N.; Klimovich, V. B.; Samoilovich, M. P.

    2016-07-01

    Results of developing and testing graphene-based sensors capable of detecting protein molecules are presented. The biosensor operation was checked using an immunochemical system comprising fluorescein dye and monoclonal antifluorescein antibodies. The sensor detects fluorescein concentration on a level of 1-10 ng/mL and bovine serum albumin-fluorescein conjugate on a level of 1-5 ng/mL. The proposed device has good prospects for use for early diagnostics of various diseases.

  11. Projectile Base Flow Analysis

    Science.gov (United States)

    2007-11-02

    Report documentation excerpt: performing organization DCW Industries, Inc., 5354 Palm Drive, La Cañada, CA 91011; report number DCW-38-R-05; sponsoring/monitoring agency U.S. Army Research Office. References: Wilcox, D. C., Turbulence Modeling for CFD, Second Edition, DCW Industries, Inc., La Cañada, CA; Wilcox, D. C. (2001), "Projectile Base Flow Analysis," DCW

  12. Vehicle Based Vector Sensor

    Science.gov (United States)

    2015-09-28

    VEHICLE-BASED VECTOR SENSOR. STATEMENT OF GOVERNMENT INTEREST [0001] The invention described herein may be manufactured and...unmanned underwater vehicle that can function as an acoustic vector sensor. (2) Description of the Prior Art [0004] It is known that a propagating...mechanics. An acoustic vector sensor measures the particle motion via an accelerometer and combines the

  13. Alphavirus-Based Vaccines

    OpenAIRE

    Kenneth Lundstrom

    2014-01-01

    Alphavirus vectors have demonstrated high levels of transient heterologous gene expression both in vitro and in vivo and, therefore, possess attractive features for vaccine development. The most commonly used delivery vectors are based on three single-stranded encapsulated alphaviruses, namely Semliki Forest virus, Sindbis virus and Venezuelan equine encephalitis virus. Alphavirus vectors have been applied as replication-deficient recombinant viral particles and, more recently, as replication...

  14. Lignocellulose-based bioproducts

    CERN Document Server

    Karimi, Keikhosro

    2015-01-01

    This volume provides the technical information required for the production of biofuels and chemicals from lignocellulosic biomass. It starts with a brief overview of the importance, applications, and production processes of different lignocellulosic products. Further chapters review the perspectives of waste-based biofuels and biochemicals; the pretreatment of lignocellulosic biomass for biofuel production; cellulolytic enzyme systems for the hydrolysis of lignocelluloses; and basic and applied aspects of the production of bioethanol, biogas, biohydrogen, and biobutanol from lignocelluloses.

  15. Luxury-based Growth

    OpenAIRE

    Shiro Kuwahara

    2006-01-01

    Assuming that there exists a preference for luxury goods and a knowledge spillover from luxury goods production to goods production, this paper constructs an endogenous economic growth model. The model predicts two steady states: one is a steady positive growth state with regard to luxury goods production, and the other is a zero growth state in the absence of luxury goods production. Thus, this study examines the polarization of economies based on luxury goods consumption

  16. Arduino based laser control

    OpenAIRE

    Bernal Muñoz, Ferran

    2015-01-01

    ARDUINO is a very useful platform for prototypes. In this project ARDUINO will be used for controlling a Semiconductor Tuneable Laser. Diode laser for communications control based on an Arduino board. Temperature control implementation. Software and hardware protection for the laser implementation. Control of an optical communications laser from a computer using an Arduino board, with implementation of temperature control and software and hardware protection ...

  17. Graphene-based Nanoelectronics

    Science.gov (United States)

    2013-02-01

    structures such as metal-insulator-graphene tunnel junctions, designs and fabrication processes were developed based on e-beam lithography (EBL)...capable of minimum feature sizes of 7 nm. A variety of high-resolution photoresists were tested with the EBL system to produce graphene structures with...Initiative e-textiles electronic textiles EBL e-beam lithography EDX energy-dispersive x-ray spectroscopy EIS electrochemical impedance

  18. As bases do petismo

    Directory of Open Access Journals (Sweden)

    David Samuels

    2004-10-01

    Based on the results of the 2002 Brazilian Electoral Study, the author analyses the electoral bases of the Workers' Party (PT) and the factors associated with "petismo". The relationships between "petismo" and the socioeconomic, demographic and political variables are tested using multivariate analysis. The results indicate that the only social category associated with "petismo" is level of education, which has clear implications for social and political behavior.

  19. Molecule-based magnets

    Indian Academy of Sciences (India)

    J V Yakhmi

    2009-06-01

    The conventional magnetic materials used in current technology, such as, Fe, Fe2O3, Cr2O3, SmCo5, Nd2Fe14B etc are all atom-based, and their preparation/processing require high temperature routes. Employing self-assembly methods, it is possible to engineer a bulk molecular material with long-range magnetic order, mainly because one can play with the weak intermolecular interactions. Since the first successful synthesis of molecular magnets in 1986, a large variety of them have been synthesized, which can be categorized on the basis of the chemical nature of the magnetic units involved: organic-, metal-based systems, heterobimetallic assemblies, or mixed organic–inorganic systems. The design of molecule-based magnets has also been extended to the design of poly-functional molecular magnets, such as those exhibiting second-order optical nonlinearity, liquid crystallinity, or chirality simultaneously with long-range magnetic order. Solubility, low density and biocompatibility are attractive features of molecular magnets. Being weakly coloured, unlike their opaque classical magnet ‘cousins’ listed above, possibilities of photomagnetic switching exist. Persistent efforts also continue to design the ever-elusive polymer magnets towards applications in industry. While providing a brief overview of the field of molecular magnetism, this article highlights some recent developments in it, with emphasis on a few studies from the author’s own lab.

  20. Microbead agglutination based assays

    KAUST Repository

    Kodzius, Rimantas

    2013-01-21

    We report a simple and rapid room temperature assay for point-of-care (POC) testing that is based on specific agglutination. Agglutination tests are based on aggregation of microbeads in the presence of a specific analyte thus enabling the macroscopic observation. Such tests are most often used to explore antibody-antigen reactions. Agglutination has been used for protein assays using a biotin/streptavidin system as well as a hybridization based assay. The agglutination systems are prone to self-termination of the linking analyte, prone to active site saturation and loss of agglomeration at high analyte concentrations. We investigated the molecular target/ligand interaction, explaining the common agglutination problems related to analyte self-termination, linkage of the analyte to the same bead instead of different microbeads. We classified the agglutination process into three kinds of assays: a two-component assay, a three-component assay and a stepped three-component assay. Although we compared these three kinds of assays for recognizing DNA and protein molecules, the assay can be used for virtually any molecule, including ions and metabolites. In total, the optimized assay permits detecting analytes with high sensitivity in a short time, 5 min, at room temperature. Such a system is appropriate for POC testing.

  1. Nanowire-based thermoelectrics

    Science.gov (United States)

    Ali, Azhar; Chen, Yixi; Vasiraju, Venkata; Vaddiraju, Sreeram

    2017-07-01

    Research on thermoelectrics has seen a huge resurgence since the early 1990s. The ability of tuning a material’s electrical and thermal transport behavior upon nanostructuring has led to this revival. Nevertheless, thermoelectric performances of nanowires and related materials lag far behind those achieved with thin-film superlattices and quantum dot-based materials. This is despite the fact that nanowires offer many distinct advantages in enhancing the thermoelectric performances of materials. The simplicity of the strategy is the first and foremost advantage. For example, control of the nanowire diameters and their surface roughnesses will aid in enhancing their thermoelectric performances. Another major advantage is the possibility of obtaining high thermoelectric performances using simpler nanowire chemistries (e.g., elemental and binary compound semiconductors), paving the way for the fabrication of thermoelectric modules inexpensively from non-toxic elements. In this context, the topical review provides an overview of the current state of nanowire-based thermoelectrics. It concludes with a discussion of the future vision of nanowire-based thermoelectrics, including the need for developing strategies aimed at the mass production of nanowires and their interface-engineered assembly into devices. This eliminates the need for trial-and-error strategies and complex chemistries for enhancing the thermoelectric performances of materials.
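
    For reference, the "thermoelectric performance" discussed throughout this record is conventionally quantified by the dimensionless figure of merit $ZT = S^{2}\sigma T/\kappa$ (a standard definition, not stated explicitly in the abstract), where $S$ is the Seebeck coefficient, $\sigma$ the electrical conductivity, $\kappa$ the total thermal conductivity and $T$ the absolute temperature; nanostructuring, including the nanowire strategies discussed above, aims to lower $\kappa$ while preserving $S^{2}\sigma$.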

  2. Lunar base initiative 1992

    Science.gov (United States)

    Koelle, H. H.

    The return to the Moon is no longer a question of yes or no, but a question of when and how. The first landing of a human being on the lunar surface in 1969 was a purely national effort of the U.S.A. Building a lunar base and operating it in the next century is rather a task for all nations of this planet, even if one nation could do it alone. However, there are several alternatives to carry out such a program and these will and should be debated during the next years on an urgent basis. To do this, one has to take into account not only the historical accomplishments and the present trends of cooperation in space programs, but also recent geopolitical developments as well as the frame of reference established by international law. The case for an International Lunar Base (ILB) has been presented to the International Academy of Astronautics on 11 October 1987 by the IAA Ad Hoc Committee "Return-to-the-Moon". This draft of a position paper was subsequently published in Acta Astronautica Vol. 17, No. 5, (pp. 463-489) with the request of public debate particularly by the members of the Academy. Some 80 Academicians responded to this invitation by the President of the Academy and voiced their opinions on the questions and issues raised by this draft of a position paper. This led to a refinement of the arguments and assumptions made and it is now possible to prepare an improved position paper proposing concrete steps which may lead to an ILB. An issue of this proportion must start with a discussion of goals and objectives to be arranged in some kind of a ranked order. It also has to take note of the limitations existing at any time by the availability of suitable space transportation systems. These will determine the acquisition date and rate of growth of a lunar base. The logistics system will also greatly influence the base characteristics and layout. The availability of heavy lift launch vehicles would simplify the task and allow to concentrate the construction

  3. Polyolefin-Based Aerogels

    Science.gov (United States)

    Lee, Je Kyun; Gould, George

    2012-01-01

    An organic polybutadiene (PB) rubber-based aerogel insulation material was developed that will provide superior thermal insulation and inherent radiation protection, exhibiting the flexibility, resiliency, toughness, and durability typical of the parent polymer, yet with the low density and superior insulation properties associated with aerogels. The rubbery behavior of the PB rubber-based aerogels overcomes the weak and brittle nature of conventional inorganic and organic aerogel insulation materials. Additionally, with a higher content of hydrogen in their structure, the PB rubber aerogels will also provide inherently better radiation protection than inorganic and carbon aerogels. Since PB rubber aerogels also exhibit good hydrophobicity due to their hydrocarbon molecular structure, they will provide better performance reliability and durability as well as simpler, more economic, and environmentally friendly production than the conventional silica or other inorganic-based aerogels, which require chemical treatment to make them hydrophobic. Inorganic aerogels such as silica aerogels demonstrate many unusual and useful properties. There are several strategies for overcoming the drawbacks associated with the weakness and brittleness of silica aerogels. Development of the flexible fiber-reinforced silica aerogel composite blanket has proven one promising approach, providing a conveniently fielded form factor that is relatively robust toward handling in industrial environments compared to silica aerogel monoliths. However, the flexible silica aerogel composites still have a brittle, dusty character that may be undesirable, or even intolerable, in certain applications. Although cross-linked organic aerogels such as resorcinol-formaldehyde (RF), polyisocyanurate, and cellulose aerogels show very high impact strength, they are also very brittle with little elongation (i.e., less rubbery). Also, silica and carbon aerogels are less efficient

  4. Characteristics Data Base

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, E.D.; Moore, R.S. (Automated Sciences Group, Inc., Oak Ridge, TN (USA))

    1990-08-01

    The LWR Serial Numbers Database System (SNDB) contains detailed data about individual, historically discharged LWR spent fuel assemblies. This data includes the reactor where used, the year the assemblies were discharged, the pool where they are currently stored, assembly type, burnup, weight, enrichment, and an estimate of their radiological properties. This information is distributed on floppy disks to users in the nuclear industry to assist in planning for the permanent nuclear waste repository. This document describes the design and development of the SNDB. It provides a complete description of the file structures and an outline of the major code modules. It serves as a reference for a programmer maintaining the system, or for others interested in the technical detail of this database. This is the initial version of the SNDB. It contains historical data through December 31, 1987, obtained from the Energy Information Administration (EIA). EIA obtains the data from the utility companies via the RW-859 Survey Form. It evaluates and standardizes the data and distributes the resulting batch level database as a large file on magnetic tape. The Characteristics Data Base obtains this database for use in the LWR Quantities Data Base. Additionally, the CDB obtains the individual assembly level detail from EIA for use in the SNDB. While the Quantities Data Base retains only the level of detail necessary for its reporting, the SNDB does retain and use the batch level data to assist in the identification of a particular assembly serial number. We expect to update the SNDB on an annual basis, as new historical data becomes available.

  5. Vision-based interaction

    CERN Document Server

    Turk, Matthew

    2013-01-01

    In its early years, the field of computer vision was largely motivated by researchers seeking computational models of biological vision and solutions to practical problems in manufacturing, defense, and medicine. For the past two decades or so, there has been an increasing interest in computer vision as an input modality in the context of human-computer interaction. Such vision-based interaction can endow interactive systems with visual capabilities similar to those important to human-human interaction, in order to perceive non-verbal cues and incorporate this information in applications such

  6. Alphavirus-Based Vaccines.

    Science.gov (United States)

    Lundstrom, Kenneth

    2016-01-01

    Alphavirus vectors based on Semliki Forest virus, Sindbis virus, and Venezuelan equine encephalitis virus have been widely applied for vaccine development. Naked RNA replicons, recombinant viral particles, and layered DNA vectors have been subjected to immunization in preclinical animal models with antigens for viral targets and tumor antigens. Moreover, a limited number of clinical trials have been conducted in humans. Vaccination with alphavirus vectors has demonstrated efficient immune responses and has shown protection against challenges with lethal doses of virus and tumor cells, respectively. Moreover, vaccines have been developed against alphaviruses causing epidemics such as Chikungunya virus.

  7. Cellular based cancer vaccines

    DEFF Research Database (Denmark)

    Hansen, Morten; Met, O; Svane, I M;

    2012-01-01

    Cancer vaccines designed to re-calibrate the existing host-tumour interaction, tipping the balance from tumour acceptance towards tumour control, hold huge potential to complement traditional cancer therapies. In general, limited success has been achieved with vaccines composed of tumor...... in vitro migration via autocrine receptor-mediated endocytosis of CCR7. In the current review, we discuss optimal design of DC maturation focused on pre-clinical as well as clinical results from standard and polarized dendritic cell-based cancer vaccines.

  8. Agent-Based Optimization

    CERN Document Server

    Jędrzejowicz, Piotr; Kacprzyk, Janusz

    2013-01-01

    This volume presents a collection of original research works by leading specialists focusing on novel and promising approaches in which the multi-agent system paradigm is used to support, enhance or replace traditional approaches to solving difficult optimization problems. The editors have invited several well-known specialists to present their solutions, tools, and models falling under the common denominator of the agent-based optimization. The book consists of eight chapters covering examples of application of the multi-agent paradigm and respective customized tools to solve  difficult optimization problems arising in different areas such as machine learning, scheduling, transportation and, more generally, distributed and cooperative problem solving.

  9. Bi-based superconductor

    Directory of Open Access Journals (Sweden)

    S E Mousavi

    2009-08-01

    Full Text Available In this paper, a Bi-Sr-Ca-Cu-O (BSCCO) system superconductor is made by the solid-state reaction method. The effects of doping with Pb, Cd, Sb and Cu and of annealing time on the critical temperature and critical current density have been investigated. The microstructure and morphology of the samples have been studied by X-ray diffraction, scanning electron microscopy and energy-dispersive X-ray analysis. The results show that the fraction of the Bi-2223 phase in the Bi-based superconductor, the critical temperature and the critical current density depend on the annealing temperature, the annealing time and the kind and amount of doping.

  10. Chitosan-based nanocomposites

    CSIR Research Space (South Africa)

    Kesavan Pillai, Sreejarani

    2012-08-01

    Full Text Available , and hygiene devices. They thus represent a strong and emerging answer for improved and eco-friendly materials. This chapter reviews the recent developments in the area of chitosan-based nanocomposites, with a special emphasis on clay-containing nanocomposites...-sized mineral fillers like silica, talc, and clay are added to reduce the cost and improve chitosan’s performance in some way. However, the mechanical properties such as elongation at break and tensile strength of these composites decrease with the incorporation...

  11. Problem Based Game Design

    DEFF Research Database (Denmark)

    Reng, Lars; Schoenau-Fog, Henrik

    2011-01-01

    At Aalborg University’s department of Medialogy, we are utilizing the Problem Based Learning method to encourage students to solve game design problems by pushing the boundaries and designing innovative games. This paper is concerned with describing this method, how students employ it in various...... projects and how they learn to analyse, design, and develop for innovation by using it. We will present various cases to exemplify the approach and focus on how the method engages students and aspires for innovation in digital entertainment and games....

  12. Sustainability Base Construction Update

    Science.gov (United States)

    Mewhinney, Michael

    2012-01-01

    Construction of the new Sustainability Base collaborative support facility, expected to become the highest-performing building in the federal government, continues at NASA's Ames Research Center, Moffett Field, Calif. The new building is designed to achieve a platinum rating under the Leadership in Energy and Environmental Design (LEED) new construction standards for environmentally sustainable construction developed by the U.S. Green Building Council, Washington, D.C. When completed by the end of 2011, the $20.6 million building will feature near zero net energy consumption, use 90 percent less potable water than conventionally built buildings of equivalent size, and will result in reduced building maintenance costs.

  13. Polymerization Using Phosphazene Bases

    KAUST Repository

    Zhao, Junpeng

    2015-09-01

    In the recent rise of metal-free polymerization techniques, organic phosphazene superbases have shown their remarkable strength as promoter/catalyst for the anionic polymerization of various types of monomers. Generally, the complexation of the phosphazene base with the counterion (proton or lithium cation) significantly improves the nucleophilicity of the initiator/chain end, resulting in highly enhanced polymerization rates compared with conventional metal-based initiating systems. In this chapter, the general features of phosphazene-promoted/catalyzed polymerizations and the applications in macromolecular engineering (synthesis of functionalized polymers, block copolymers, and macromolecular architectures) are discussed, with challenges and perspectives being pointed out.

  14. NICKEL-BASE ALLOY

    Science.gov (United States)

    Inouye, H.; Manly, W.D.; Roche, T.K.

    1960-01-19

    A nickel-base alloy was developed which is particularly useful for the containment of molten fluoride salts in reactors. The alloy is resistant to both salt corrosion and oxidation and may be used at temperatures as high as 1800 deg F. Basically, the alloy consists of 15 to 22 wt.% molybdenum, a small amount of carbon, and 6 to 8 wt.% chromium, the balance being nickel. Up to 4 wt.% of tungsten, tantalum, vanadium, or niobium may be added to strengthen the alloy.

  15. LIGHTWEIGHT CONCRETE BASED GRANSHLAK

    Directory of Open Access Journals (Sweden)

    NETESA M. I.

    2016-02-01

    Full Text Available Problem statement. It is advisable to produce low-strength concrete from local secondary resources, recycling them and reducing the burden on the environment, but it is important to design such concrete compositions with a reduced cement consumption. It is known that the coefficient of efficiency of cement use in heavy concrete of class B10 and below is less than about 0.5, almost two times smaller than in concrete of class B15 and above; in lightweight low-strength concrete this coefficient is even lower. It is therefore important to find relationships that determine the composition of lightweight concrete based on local industrial by-products with more efficient use of cement. Purpose. Based on the analysis of earlier research results, including methods of mathematical planning of experiments, to determine concrete compositions that meet the requirements for the underlying layers of floors, whose compressive strength should correspond to class B5, while providing the required strength at a minimum consumption of cement, the most expensive and energy-intensive component of concrete. Conclusion. Analysis of the test results of control concrete samples at 28 days revealed the following. The required concrete strength, a compressive strength of 7.0 MPa, can be obtained within the tested range when fly ash from the Dnieper hydroelectric power station and tailings of Krivoy Rog iron ore (YuGOK) are used as fillers. To provide the required characteristic strength of the concrete in the underlying layers of floors, it is advisable to use a nominal composition per cubic meter of concrete: 160 kg of cement, 675 kg of granulated slag from the Petrovsky Plant, 390 kg of fly ash from the Dnieper HPP, 400 kg of sand and 230 liters of water. Thus, with a rational grain composition of the components, the desired strength of lightweight concrete based on Petrovsky Plant granulated slag can be obtained, using as fillers

  16. Group Based Interference Alignment

    CERN Document Server

    Ma, Yanjun; Chen, Rui; Yao, Junliang

    2010-01-01

    In $K$-user single-input single-output (SISO) frequency-selective fading interference channels, it is shown that the achievable multiplexing gain is almost surely $K/2$ when interference alignment (IA) is used. However, when the number of signaling dimensions is limited, allocating all the resources to all the users simultaneously is not optimal. To address this problem, a group-based interference alignment (GIA) scheme is proposed and a search algorithm is designed to obtain the group patterns and the resource allocation among them. Analysis results show that the proposed scheme achieves a higher multiplexing gain when the resources are limited.

  17. [Evidence-based physiotherapy].

    Science.gov (United States)

    Bender, Tamás

    2013-12-01

    This article on physiotherapy presents some current evidence on the strengths and weaknesses of physiotherapeutic procedures. In the area of physiotherapy, empirical data obtained over decades have been overtaken by evidence from current studies. The author points out the great problem of physiotherapy, namely the heterogeneity of the applied parameters. Knowledge of current evidence may be very important and helpful for physicians, but the author proposes, from a practical point of view, that physiotherapeutic procedures based on experience and used for many years should not be entirely neglected. Nowadays physiotherapy plays an important role in the treatment of locomotor diseases, but its use is increasing in other fields of medicine as well.

  18. Unification-Based Glossing

    CERN Document Server

    Hatzivassiloglou, V; Hatzivassiloglou, Vasileios; Knight, Kevin

    1995-01-01

    We present an approach to syntax-based machine translation that combines unification-style interpretation with statistical processing. This approach enables us to translate any Japanese newspaper article into English, with quality far better than a word-for-word translation. Novel ideas include the use of feature structures to encode word lattices and the use of unification to compose and manipulate lattices. Unification also allows us to specify abstract features that delay target-language synthesis until enough source-language information is assembled. Our statistical component enables us to search efficiently among competing translations and locate those with high English fluency.

  20. TUNGSTEN BASE ALLOYS

    Science.gov (United States)

    Schell, D.H.; Sheinberg, H.

    1959-12-15

    A high-density quaternary tungsten-base alloy having high mechanical strength and good machinability, composed of about 2 wt.% Ni, 3 wt.% Cu, 5 wt.% Pb, and 90 wt.% W, is described. This alloy can be formed by the powder metallurgy technique of hot pressing in a graphite die without causing a reaction between the charge and the die and without formation of a carbide case on the final compact, thereby enabling re-use of the graphite die. The alloy is formable at hot-pressing temperatures from about 1200 to about 1350 deg C. In addition, there is little component shrinkage, thereby eliminating the necessity of subsequent extensive surface machining.

  1. Constraint-based reachability

    Directory of Open Access Journals (Sweden)

    Arnaud Gotlieb

    2013-02-01

    Full Text Available Iterative imperative programs can be considered as infinite-state systems computing over possibly unbounded domains. Studying reachability in these systems is challenging as it requires dealing with an infinite number of states with standard backward or forward exploration strategies. An approach that we call constraint-based reachability is proposed to address reachability problems by exploring program states using a constraint model of the whole program. The key point of the approach is to interpret imperative constructs such as conditionals, loops, array and memory manipulations with the fundamental notion of constraint over a computational domain. By combining constraint filtering and abstraction techniques, constraint-based reachability is able to solve reachability problems which are usually outside the scope of backward or forward exploration strategies. This paper proposes an interpretation of classical filtering consistencies used in constraint programming as abstract domain computations, and shows how this approach can be used to produce a constraint solver that efficiently generates solutions for reachability problems that are unsolvable by other approaches.
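
    A toy, illustrative sketch of the core idea in this record is given below: interpret program statements as constraints over variable domains and ask whether a target state admits any solution. The tiny program, variable names and finite 0..9 domain are assumptions for illustration; the paper itself combines constraint filtering with abstract domains rather than the brute-force enumeration used here.

```python
# Toy sketch (not the authors' tool): reachability of a target state by
# treating each program path as a conjunction of constraints on x and y.

def reachable(target_y):
    """Can the program below terminate with y == target_y?

    Modelled program:
        x in 0..9
        if x > 5: y = x - 5
        else:     y = x + 1
    """
    x_domain = set(range(10))
    # Each branch of the conditional contributes one path constraint on x
    # plus a transfer function giving y.
    paths = [
        (lambda x: x > 5, lambda x: x - 5),
        (lambda x: not x > 5, lambda x: x + 1),
    ]
    witnesses = set()
    for guard, transfer in paths:
        # Filter the domain of x by the path constraint and the target constraint.
        witnesses |= {x for x in x_domain if guard(x) and transfer(x) == target_y}
    return bool(witnesses), sorted(witnesses)

if __name__ == "__main__":
    print(reachable(0))  # (False, []) -- y == 0 is unreachable on either path
    print(reachable(3))  # (True, [2, 8]) -- reachable via both branches
```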

  2. Model Based Definition

    Science.gov (United States)

    Rowe, Sidney E.

    2010-01-01

    In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares 1 Upper Stage and the Engineering Directorate was preparing to deploy a new electronic Configuration Management and Data Management System with the Design Data Management System (DDMS) based upon a Commercial Off The Shelf (COTS) Product Data Management (PDM) System. The DSFT was to establish standardized CAD practices and a new data life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract. It is noted that this MBD does use partially dimensioned drawings for auxiliary information to the model. The design data lifecycle implemented several new release states to be used prior to formal release that allowed the models to move through a flow of progressive maturity. The DSFT identified some 17 Lessons Learned as outcomes of the standards development, pathfinder deployments and initial application to the Upper Stage design completion. Some of the high value examples are reviewed.

  3. Design Based Wilderness Education

    Directory of Open Access Journals (Sweden)

    Christopher R. Saulnier

    2015-02-01

    Full Text Available The Massachusetts Institute of Technology (MIT) has been collaborating since 2010 with the Singapore Ministry of Education to help develop the Singapore University of Technology and Design (SUTD). One element of this collaboration, the Global Leadership Program (GLP), aims to provide SUTD students with the opportunity to interact with the MIT community and experience MIT’s academic culture. During GLP, students participate in a program designed to develop leadership ability while also increasing their understanding of engineering science and design thinking. This paper introduces a curriculum combining the pedagogies of design-based learning and wilderness education that was implemented in the summer of 2014 to holistically address the development of these three competencies. Through design-based learning activities, both for and in a natural environment, students were encouraged to develop competencies in engineering science and engineering design while exploring the diverse attributes essential for success as an engineer. This paper examines the results of a retrospective post-then-pre survey administered to the participants upon completion of the program to explore the effects of the program on the development of professional engineering competencies. We find a statistically significant increase in items associated with Individual Leadership Skill, Group Leadership Skill and the role of Society and the Economy. These results are triangulated with student exit interviews and instructor observations.

  4. Knowledge Based Economy Assessment

    Directory of Open Access Journals (Sweden)

    Madalina Cristina Tocan

    2012-12-01

    Full Text Available The importance of the knowledge-based economy (KBE) in the XXI century is evident. The article analyzes how knowledge is reflected in the economy. The main focus is the analysis of the characteristics of knowledge expression in the economy and the construction of a structure of KBE expression, which allows understanding the mechanism by which the knowledge economy functions. The authors highlight the possibility of assessing the penetration level of the KBE, which could manifest itself through the existence of products of knowledge expression created in their acquisition, creation, usage and development. The latter phenomenon is interpreted through knowledge expression characteristics: economic and social context, human resources, ICT, innovative business and innovation policy. The reason for this analysis is that, in spite of the existence of the knowledge economy in all developed countries, a definitive, universal list of indicators for mapping and measuring the KBE does not yet exist. Knowledge Expression Assessment Models are presented in the article.

  5. Location-based Scheduling

    DEFF Research Database (Denmark)

    Andersson, Niclas; Christensen, Knud

    The coordination of activities and resources in order to establish an effective production flow is central to the management of construction projects. The traditional technique for coordinating activities and resources in construction projects is CPM scheduling, which has been the predominant scheduling method since it was introduced in the late 1950s. Over the years, CPM has proven to be a very powerful technique for planning, scheduling and controlling projects, which among other things is indicated by the development of a large number of CPM-based software applications available on the market. However, CPM is primarily an activity-based method that takes the activity as the unit of focus, and criticism has been raised, specifically in the case of construction projects, that the method manages construction work and the continuous flow of resources poorly. To seek solutions...
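
    Since this record centres on CPM scheduling, a minimal sketch of the critical path method itself may help; the activity network below is a made-up example and is not taken from the dissertation.

```python
# Minimal CPM sketch: forward pass (earliest times), backward pass (latest
# times), and identification of the critical path.

def cpm(activities):
    """activities: {name: (duration, [predecessors])} -> (es, ef, critical activities)."""
    # Forward pass: earliest start/finish.
    es, ef = {}, {}
    remaining = dict(activities)
    while remaining:
        for name, (dur, preds) in list(remaining.items()):
            if all(p in ef for p in preds):
                es[name] = max((ef[p] for p in preds), default=0)
                ef[name] = es[name] + dur
                del remaining[name]
    project_end = max(ef.values())
    # Backward pass: latest start/finish (successors have strictly larger EF,
    # so processing in decreasing EF order visits them first).
    ls, lf = {}, {}
    for name in sorted(activities, key=lambda n: ef[n], reverse=True):
        dur, _ = activities[name]
        successors = [s for s, (_, preds) in activities.items() if name in preds]
        lf[name] = min((ls[s] for s in successors), default=project_end)
        ls[name] = lf[name] - dur
    critical = [n for n in activities if es[n] == ls[n]]
    return es, ef, critical

if __name__ == "__main__":
    # Durations in days; a simple foundation -> walls/electrics -> roof -> finishing chain.
    network = {
        "excavate":   (3, []),
        "foundation": (5, ["excavate"]),
        "walls":      (7, ["foundation"]),
        "electrics":  (4, ["foundation"]),
        "roof":       (4, ["walls"]),
        "finishing":  (6, ["roof", "electrics"]),
    }
    es, ef, critical = cpm(network)
    print("project duration:", max(ef.values()), "days")
    print("critical activities:", critical)
```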

  6. Location-based games

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine

    This dissertation explores which prerequisites are necessary in location-based games (LBGs) to make the meeting between players and spatiality meaningful, with an emphasis on physical locations. Throughout the dissertation, it is shown that LBGs affect players’ perception of and behavior in everyday spaces, as the games reside on the boundaries between the continuums of play and ordinary, authentic and fictional, and as they merge physical and digital media. These are termed the six dimensions of LBGs. LBGs let the player explore the boundaries between these dimensions... experiences of being in the world and the creation of meaning. The theory on motivation defines what motivation consists of and how it relates to our actions. This theory has been combined with theories concerning play and play culture, digital media, (digital) games, (optimal) experiences, landscape...

  7. [Competence based medical education].

    Science.gov (United States)

    Bernabó, Jorge G; Buraschi, Jorge; Olcese, Juan; Buraschi, María; Duro, Eduardo

    2007-01-01

    The strategy of curriculum planning in the majority of schools of medicine has shifted, in recent years, from content-based curriculum models to outcome-oriented curricula. Coincidentally, interest in defining and evaluating the clinical competences that a graduate must have has grown. In our country, and particularly in the Associated Hospitals belonging to the Unidad Regional de Enseñanza IV of the UBA School of Medicine, evidence has been gathered showing that the acquisition of clinical competences during undergraduate training is in general insufficient. The foundations and characteristics of PREM (Programa de Requisitos Esenciales Mínimos) are described. PREM is a tool to promote the learning of the abilities and skills necessary for the practice of medicine. The objective of the program is to promote the learning of a well-defined list of core competences considered indispensable for a general practitioner. An outcome-oriented curriculum with a clear definition of the expected knowledge, skills and attitudes of a graduate of the programme, the promotion of learning experiences centered on practice, and evaluation tools based on direct observation of the student's performance should contribute to closing the gap between what medical schools traditionally teach and evaluate, and what doctors need to know and do to practice their profession correctly.

  8. Droplet based microfluidics.

    Science.gov (United States)

    Seemann, Ralf; Brinkmann, Martin; Pfohl, Thomas; Herminghaus, Stephan

    2012-01-01

    Droplet based microfluidics is a rapidly growing interdisciplinary field of research combining soft matter physics, biochemistry and microsystems engineering. Its applications range from fast analytical systems or the synthesis of advanced materials to protein crystallization and biological assays for living cells. Precise control of droplet volumes and reliable manipulation of individual droplets such as coalescence, mixing of their contents, and sorting in combination with fast analysis tools allow us to perform chemical reactions inside the droplets under defined conditions. In this paper, we will review available drop generation and manipulation techniques. The main focus of this review is not to be comprehensive and explain all techniques in great detail but to identify and shed light on similarities and underlying physical principles. Since geometry and wetting properties of the microfluidic channels are crucial factors for droplet generation, we also briefly describe typical device fabrication methods in droplet based microfluidics. Examples of applications and reaction schemes which rely on the discussed manipulation techniques are also presented, such as the fabrication of special materials and biophysical experiments.

  10. Moon base reactor system

    Science.gov (United States)

    Chavez, H.; Flores, J.; Nguyen, M.; Carsen, K.

    1989-01-01

    The objective of our reactor design is to supply a lunar-based research facility with 20 MW(e). The fundamental layout of this lunar-based system includes the reactor, power conversion devices, and a radiator. The additional aim of this reactor is a longevity of 12 to 15 years. The reactor is a liquid metal fast breeder that has a breeding ratio very close to 1.0. The geometry of the core is cylindrical. The metallic fuel rods are of beryllium oxide enriched with varying degrees of uranium, with a beryllium core reflector. The liquid metal coolant chosen was natural lithium. After the liquid metal coolant leaves the reactor, it goes directly into the power conversion devices. The power conversion devices are Stirling engines. The heated coolant acts as a hot reservoir to the device. It then enters the radiator to be cooled and reenters the Stirling engine acting as a cold reservoir. The engines' operating fluid is helium, a highly conductive gas. These Stirling engines are hermetically sealed. Although natural lithium produces a lower breeding ratio, it does have a larger temperature range than sodium. It is also corrosive to steel. This is why the container material must be carefully chosen. One option is to use an expensive alloy of cerbium and zirconium. The radiator must be made of a highly conductive material whose melting point temperature is not exceeded in the reactor and whose structural strength can withstand meteor showers.

  11. Challenge Based Innovation gala

    CERN Document Server

    CERN. Geneva; Utriainen, Tuuli Maria; Toivonen, Harri; Nordberg, Markus

    2014-01-01

    Challenge Based Innovation gala   There’s a new experiment starting in CERN called IdeaLab where we work together with detector R&D researchers to help them to bridge their knowledge into a more human, societally oriented context. Currently we are located in B153, but will move our activities to a new facility next to the Globe in May 2014. One of our first pilot projects is a 5 month course CBI (Challenge Based Innovation) where two multidisciplinary student teams join forces with Edusafe & TALENT projects at CERN. Their goal is to discover what kind of tools for learning could be created in collaboration with the two groups. After months of user interviews and low resolution prototyping they are ready to share the results with us in the form of an afternoon gala. We warmly welcome you to join us to see the students' results and experience the prototypes they have conceived. The event is in three parts, you are welcome to visit all of them,...

  12. Pitch Based Sound Classification

    DEFF Research Database (Denmark)

    Nielsen, Andreas Brinch; Hansen, Lars Kai; Kjems, U

    2006-01-01

    A sound classification model is presented that can classify signals into music, noise and speech. The model extracts the pitch of the signal using the harmonic product spectrum. Based on the pitch estimate and a pitch error measure, features are created and used in a probabilistic model with a softmax output function. Both linear and quadratic inputs are used. The model is trained on 2 hours of sound and tested on publicly available data. A test classification error below 0.05 with 1 s classification windows is achieved. Furthermore, it is shown that linear input performs as well as quadratic input, and that even though classification gets marginally better, not much is achieved by increasing the window size beyond 1 s.
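
    A hedged sketch of the pitch extractor named in this record, the harmonic product spectrum (HPS), is shown below. The parameter values (sample rate, number of harmonics, 50 Hz lower bound) are illustrative assumptions, not taken from the paper.

```python
# Harmonic product spectrum pitch estimation: multiply the magnitude spectrum
# by downsampled copies of itself so that harmonics reinforce the peak at f0.

import numpy as np

def hps_pitch(signal, fs, n_harmonics=4):
    """Estimate the fundamental frequency of a 1-D signal in Hz."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    hps = spectrum.copy()
    for h in range(2, n_harmonics + 1):
        decimated = spectrum[::h]
        hps[:len(decimated)] *= decimated
    # Ignore the DC/very-low-frequency region before picking the peak.
    lo = int(50 * len(signal) / fs)                       # ~50 Hz lower bound
    peak = lo + np.argmax(hps[lo:len(spectrum) // n_harmonics])
    return peak * fs / len(signal)

if __name__ == "__main__":
    fs = 16000
    t = np.arange(fs) / fs                                # 1 s analysis window, as in the record
    tone = sum(np.sin(2 * np.pi * 220 * k * t) / k for k in range(1, 5))
    print(round(hps_pitch(tone, fs), 1))                  # ~220.0 Hz
```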

  13. Surface Plasmon Based Spectrometer

    Science.gov (United States)

    Wig, Andrew; Passian, Ali; Boudreaux, Philip; Ferrell, Tom

    2008-03-01

    A spectrometer that uses surface plasmon excitation in thin metal films to separate light into its component wavelengths is described. The use of surface plasmons as a dispersive medium sets this spectrometer apart from prism, grating, and interference-based variants and allows for the miniaturization of this device. Theoretical and experimental results are presented for two different operation models. In the first case, surface plasmon tunneling in the near field is used to provide transmission spectra of different broad band-pass glass filters across the visible wavelength range with high stray-light rejection at low resolution, as well as absorption spectra of chlorophyll extracted from a spinach leaf. The second model looks at the far-field components of surface plasmon scattering.

  14. Graphene based biosensors

    Energy Technology Data Exchange (ETDEWEB)

    Gürel, Hikmet Hakan, E-mail: hhakan.gurel@kocaeli.edu.tr [Kocaeli University, Kocaeli (Turkey); Salmankurt, Bahadır [Sakarya University, Sakarya (Turkey)

    2016-03-25

    Nanometer-sized graphene as a 2D material has unique chemical and electronic properties. Because of its unique physical, chemical, and electronic properties, its interesting shape and size make it a promising nanomaterial in many biological applications. It is expected that biomaterials incorporating graphene will be developed for graphene-based drug delivery systems and biomedical devices. The interactions between biomolecules and graphene are long-ranged and very weak. The development of new techniques is very desirable for the design of bioelectronic sensors and devices. In this work, we present first-principles calculations within density functional theory to determine the effects of charging on nucleobases on graphene. It is shown how applied charging modifies the structural and electronic properties of nucleobases on graphene.

  15. Touching base with OPERA

    CERN Multimedia

    CERN Bulletin

    2011-01-01

    Three seminars – at CERN, at Gran Sasso and in Japan – and an article calling for the scrutiny of the scientific community: the OPERA Collaboration opened its research publicly. In addition to huge press coverage, this triggered welcome reactions from colleagues around the world, many of whom will attempt to independently interpret and reproduce the measurement. OPERA’s Spokesperson touches base with the Bulletin.   The CERN Main Auditorium was crowded as OPERA Physics co-ordinator Dario Autiero presented the results of their research (23 September 2011). According to the OPERA strategy, the results of the measurements are in the hands of the scientific community and, as for any other scientific result, several months will be needed before other groups will be able to perform an independent measurement. In the meantime, the OPERA Collaboration is dealing with an avalanche of emails from the scientific community, members of the general public, and the press. &...

  16. Base isolation: Fresh insight

    Energy Technology Data Exchange (ETDEWEB)

    Shustov, V.

    1993-07-15

    The objective of the research is a further development of the engineering concept of seismic isolation. Neglecting the transient stage of seismic loading results in a widespread misjudgement: the force of resistance associated with velocity is mostly conceived as a source of damping of vibrations, though during an earthquake-type excitation it is at the same time an active force. For very pliant systems such as base-isolated structures with relatively low bearing stiffness and an artificially added heavy damping mechanism, the so-called 'damping' force may even become the main pushing force during an earthquake. Thus, one of the two basic pillars of the common seismic isolation philosophy, namely the doctrine of the usefulness and necessity of a strong damping mechanism, turns out to be a self-deception, sometimes even jeopardizing the safety of structures and discrediting the very idea of seismic isolation. There is a way out: breaking with damping dependency.

  17. Lunar based massdriver applications

    Science.gov (United States)

    Ehresmann, Manfred; Gabrielli, Roland Atonius; Herdrich, Georg; Laufer, René

    2017-05-01

    The results of a lunar massdriver mission and system analysis are discussed and show a strong case for a permanent lunar settlement with a site near the lunar equator. A modular massdriver concept is introduced, which uses multiple acceleration modules to be able to launch large masses into a trajectory that is able to reach Earth. An orbital mechanics analysis concludes that the launch site will be in the Oceanus Procellarum, a flat, titanium-rich lunar mare area. It is further shown that the bulk of massdriver components can be manufactured by collecting lunar minerals, which are broken down into their constituent elements. The mass-to-orbit transfer rates of the massdriver case study are significant and can vary between 1.8 kt and 3.3 megatons per year, depending on the available power. Thus a lunar massdriver would act as a catalyst for any space-based activities and a game changer for the scale of feasible space projects.

  18. Bases para proyectiles dirigidos

    Directory of Open Access Journals (Sweden)

    Editorial, Equipo

    1959-03-01

    Full Text Available Although no general line of methods or systems has yet been established to govern a characteristic type of launch ramp and the auxiliary services needed to launch guided missiles to great altitudes and distances, the experience gained in different tests, using various types of missiles and ballistic trajectories, has established a whole series of procedures, data and conclusions of great ballistic value, so that, even allowing for the continuous evolution of the missile, its shapes, fuels and ranges, the minimum conditions that a base dedicated to this type of launches must fulfil are already known with fair approximation.

  19. Computer based satellite design

    Science.gov (United States)

    Lashbrook, David D.

    1992-06-01

    A computer program to design geosynchronous spacecraft has been developed. The program consists of four separate but interrelated executable computer programs. The programs are compiled to run on a DOS-based personal computer. The source code is written in the DoD-mandated Ada programming language. The thesis presents the design technique and design equations used in the program. Detailed analysis is performed in the following areas for both dual-spin and three-axis stabilized spacecraft configurations: (1) Mass Propellant Budget and Mass Summary; (2) Battery Cell and Solar Cell Requirements for a Payload Power Requirement; and (3) Passive Thermal Control Requirements. A user's manual is included as Appendix A, and the source code for the computer programs as Appendix B.
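
    As a rough illustration of what the battery-cell and solar-cell sizing analysis mentioned in item (2) computes, the sketch below uses textbook geosynchronous power-sizing relations; it is not code from the thesis (which is written in Ada), and all numbers are illustrative assumptions.

```python
# Hedged sketch of eclipse battery sizing and end-of-life solar array sizing
# for a GEO spacecraft, using standard textbook-style relations.

def battery_capacity_wh(eclipse_load_w, eclipse_hours, depth_of_discharge, efficiency):
    """Battery capacity needed to carry the payload load through eclipse."""
    return eclipse_load_w * eclipse_hours / (depth_of_discharge * efficiency)

def solar_array_power_w(day_load_w, eclipse_load_w, day_hours, eclipse_hours,
                        path_eff_day=0.85, path_eff_eclipse=0.65):
    """Array power needed to run the bus by day and recharge the battery for eclipse."""
    day_energy = day_load_w * day_hours / path_eff_day
    eclipse_energy = eclipse_load_w * eclipse_hours / path_eff_eclipse
    return (day_energy + eclipse_energy) / day_hours

if __name__ == "__main__":
    # Worst-case GEO eclipse is roughly 1.2 h per day around the equinoxes.
    print(round(battery_capacity_wh(1500, 1.2, 0.7, 0.9), 1), "Wh")
    print(round(solar_array_power_w(1500, 1500, 22.8, 1.2), 1), "W")
```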

  20. Watershed based intelligent scissors.

    Science.gov (United States)

    Wieclawek, W; Pietka, E

    2015-07-01

    A watershed-based modification of intelligent scissors has been developed. This approach requires a preprocessing phase with anisotropic diffusion to reduce subtle edges. Then, the watershed transform enhances the corridors. Finally, a roaming procedure, developed in this study, delineates the edge selected by a user. Because only a very restricted set of pixels is subjected to the analysis, this approach significantly reduces the computational complexity. Moreover, the accuracy of the algorithm often makes a single click point sufficient to delineate one edge. The method has been evaluated on structures as different in shape and appearance as the retina layers in OCT exams, the chest and abdomen in CT and the knee in MR studies. The accuracy is comparable with the traditional Live-Wire approach, whereas the analysis time decreases due to the reduction in user interaction and in the number of pixels processed by the method.
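
    A hedged sketch of the preprocessing chain this record describes, anisotropic (Perona-Malik) diffusion followed by a watershed transform whose basin boundaries act as "corridors" for edge delineation, is given below. It is an illustrative reconstruction, not the authors' implementation, and assumes NumPy and scikit-image are available; the roaming procedure itself is not reproduced.

```python
# Perona-Malik diffusion to suppress subtle edges, then a watershed of the
# gradient image; the resulting basin boundaries are the candidate corridors.

import numpy as np
from skimage import data, filters, segmentation

def perona_malik(img, n_iter=20, kappa=20.0, step=0.2):
    """Simple Perona-Malik anisotropic diffusion on a float image."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # Finite-difference gradients towards the four neighbours.
        dn = np.roll(u, 1, axis=0) - u
        ds = np.roll(u, -1, axis=0) - u
        de = np.roll(u, 1, axis=1) - u
        dw = np.roll(u, -1, axis=1) - u
        # Edge-stopping weights: little diffusion across strong edges.
        u += step * sum(np.exp(-(d / kappa) ** 2) * d for d in (dn, ds, de, dw))
    return u

if __name__ == "__main__":
    img = data.camera()
    smoothed = perona_malik(img)
    gradient = filters.sobel(smoothed)
    labels = segmentation.watershed(gradient)        # basins from local minima
    corridors = segmentation.find_boundaries(labels) # restricted pixel set for delineation
    print("corridor pixels:", int(corridors.sum()), "of", corridors.size)
```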

  1. Ferroelectrics based absorbing layers

    Science.gov (United States)

    Hao, Jianping; Sadaune, Véronique; Burgnies, Ludovic; Lippens, Didier

    2014-07-01

    We show that a ferroelectrics-based periodic structure made of BaSrTiO3 (BST) cubes, arrayed on a metal plate with a thin dielectric spacer film, exhibits a dramatic enhancement of absorbance, with a value close to unity. The enhancement is found around the Mie magnetic resonance of the ferroelectric cubes, with the backside metal layer stopping any transmitted waves. It also involves quasi-perfect impedance matching, resulting in reflection suppression via simultaneous magnetic and electrical activities. In addition, the existence of a periodicity optimum was shown numerically; it is explained by a surface-wave analysis together with the trade-off between the resonance damping and the intrinsic loss of the ferroelectric cubes. Finally, an experimental verification in a hollow waveguide configuration, in good agreement with full-wave numerical modelling, is reported by measuring the scattering parameters of single- and dual-BST-cube schemes, pointing out coupling effects for densely packed structures.

  2. Transaction based approach

    Science.gov (United States)

    Hunka, Frantisek; Matula, Jiri

    2017-07-01

    The transaction-based approach is utilized in some business process modeling methodologies. Essential parts of these transactions are human beings, usually captured by the notion of an agent or actor role. Using a particular example, the paper describes the possibilities of the Design Engineering Methodology for Organizations (DEMO) and the Resource-Event-Agent (REA) methodology. Whereas the DEMO methodology can be regarded as a generic methodology with its foundation in the theory of Enterprise Ontology, the REA methodology is a domain-specific methodology with its origin in accountancy systems. The result of these approaches is that the DEMO methodology captures everything that happens in reality, with good empirical evidence, whereas the REA methodology captures only changes connected with economic events. Economic events represent either a change of the property rights to an economic resource, or the consumption or production of economic resources. This follows from the essence of economic events and their connection to economic resources.

  3. DNA based computers II

    CERN Document Server

    Landweber, Laura F; Baum, Eric B

    1998-01-01

    The fledgling field of DNA computers began in 1994 when Leonard Adleman surprised the scientific community by using DNA molecules, protein enzymes, and chemicals to solve an instance of a hard computational problem. This volume presents results from the second annual meeting on DNA computers held at Princeton only one and one-half years after Adleman's discovery. By drawing on the analogy between DNA computing and cutting-edge fields of biology (such as directed evolution), this volume highlights some of the exciting progress in the field and builds a strong foundation for the theory of molecular computation. DNA computing is a radically different approach to computing that brings together computer science and molecular biology in a way that is wholly distinct from other disciplines. This book outlines important advances in the field and offers comprehensive discussion on potential pitfalls and the general practicality of building DNA based computers.

  4. gis-based hydrological model based hydrological model upstream ...

    African Journals Online (AJOL)

    eobe

    Meteorological Agency (NIMET) and Jebba Hydroelectric … The hydrological cycle simulated by SWAT is based on the water balance equation … The estimation of the base flow is done using Equation 5 … Nigerian Journal of Technology, Vol. 32.
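
    The specific form of the water balance equation intended in this record is not legible, so the standard SWAT formulation is reproduced here for reference (a textbook statement, not a quotation from the record):

$$SW_t = SW_0 + \sum_{i=1}^{t}\left(R_{day,i} - Q_{surf,i} - E_{a,i} - w_{seep,i} - Q_{gw,i}\right)$$

    where $SW_t$ is the final soil water content, $SW_0$ the initial soil water content, $R_{day}$ the precipitation, $Q_{surf}$ the surface runoff, $E_a$ the evapotranspiration, $w_{seep}$ the percolation and $Q_{gw}$ the return flow on day $i$.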

  5. Problem-based Learning in a Competency-based World.

    Science.gov (United States)

    Bechtel, Gregory A.; Davidhizar, Ruth; Bradshaw, Martha J.

    1999-01-01

    Problem-based learning emphasizes critical thinking and clinical judgment. Competency-based education focuses on clinical competence. A merger of the two in nursing education could generate higher levels of inquiry and more expert clinicians. (SK)

  6. PROcess Based Diagnostics PROBE

    Science.gov (United States)

    Clune, T.; Schmidt, G.; Kuo, K.; Bauer, M.; Oloso, H.

    2013-01-01

    Many of the aspects of the climate system that are of the greatest interest (e.g., the sensitivity of the system to external forcings) are emergent properties that arise via the complex interplay between disparate processes. This is also true for climate models: most diagnostics are not a function of an isolated portion of source code, but rather are affected by multiple components and procedures. Thus any model-observation mismatch is hard to attribute to any specific piece of code or imperfection in a specific model assumption. An alternative approach is to identify diagnostics that are more closely tied to specific processes -- implying that if a mismatch is found, it should be much easier to identify and address specific algorithmic choices that will improve the simulation. However, this approach requires looking at model output and observational data in a more sophisticated way than the more traditional production of monthly or annual mean quantities. The data must instead be filtered in time and space for examples of the specific process being targeted. We are developing a data analysis environment called PROcess-Based Explorer (PROBE) that seeks to enable efficient and systematic computation of process-based diagnostics on very large sets of data. In this environment, investigators can define arbitrarily complex filters and then seamlessly perform computations in parallel on the filtered output from their model. The same analysis can be performed on additional related data sets (e.g., reanalyses), thereby enabling routine comparisons between model and observational data. PROBE also incorporates workflow technology to automatically update computed diagnostics for subsequent executions of a model. In this presentation, we will discuss the design and current status of PROBE as well as share results from some preliminary use cases.
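
    A hedged sketch of the kind of process-based filtering described above is given below: instead of a blanket monthly or annual mean, model output is filtered in time and space for occurrences of a target process before the statistic is computed. The synthetic arrays and the 1 mm/day rain threshold are illustrative assumptions, not PROBE code.

```python
# Process-based diagnostic sketch: composite one field only where and when a
# target process (here, rain) occurs in the model output.

import numpy as np

rng = np.random.default_rng(0)

# Fake daily model output on a (time, lat, lon) grid.
precip = rng.gamma(shape=0.5, scale=4.0, size=(365, 45, 90))   # mm/day
cwv = 20.0 + 5.0 * rng.standard_normal((365, 45, 90))          # column water vapour, kg/m^2

# Traditional diagnostic: one annual mean per grid cell, all processes mixed together.
annual_mean_cwv = cwv.mean(axis=0)

# Process-based diagnostic: water vapour composited over raining times only,
# so a model-observation mismatch points more directly at the rain process.
raining = precip > 1.0
cwv_when_raining = np.where(raining, cwv, np.nan)
composite = np.nanmean(cwv_when_raining, axis=0)

print("mean CWV, all days:     %.2f kg/m^2" % annual_mean_cwv.mean())
print("mean CWV, raining days: %.2f kg/m^2" % np.nanmean(composite))
```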

  7. Loyalty-based management.

    Science.gov (United States)

    Reichheld, F F

    1993-01-01

    Despite a flurry of activities aimed at serving customers better, few companies have systematically revamped their operations with customer loyalty in mind. Instead, most have adopted improvement programs ad hoc, and paybacks haven't materialized. Building a highly loyal customer base must be integral to a company's basic business strategy. Loyalty leaders like MBNA credit cards are successful because they have designed their entire business systems around customer loyalty--a self-reinforcing system in which the company delivers superior value consistently and reinvests cash flows to find and keep high-quality customers and employees. The economic benefits of high customer loyalty are measurable. When a company consistently delivers superior value and wins customer loyalty, market share and revenues go up, and the cost of acquiring new customers goes down. The better economics mean the company can pay workers better, which sets off a whole chain of events. Increased pay boosts employee morale and commitment; as employees stay longer, their productivity goes up and training costs fall; employees' overall job satisfaction, combined with their experience, helps them serve customers better; and customers are then more inclined to stay loyal to the company. Finally, as the best customers and employees become part of the loyalty-based system, competitors are left to survive with less desirable customers and less talented employees. To compete on loyalty, a company must understand the relationships between customer retention and the other parts of the business--and be able to quantify the linkages between loyalty and profits. It involves rethinking and aligning four important aspects of the business: customers, product/service offering, employees, and measurement systems.

  8. The office based CHIVA

    Directory of Open Access Journals (Sweden)

    Passariello F

    2013-09-01

    Full Text Available Fausto Passariello,1 Stefano Ermini,2 Massimo Cappelli,3 Roberto Delfrate,4 Claude Franceschi5 1Centro Diagnostico Aquarius, Napoli, Italy; 2Private Practice, Grassina, Italy; 3Private Practice, Firenze, Italy; 4Casa di Cure Figlie di Maria, Cremona, Italy; 5Hospital St Joseph, Service d'Explorations Vasculaires, Paris, France Abstract: The cure Conservatrice Hémodynamique de l'Insuffisance Veineuse en Ambulatoire (CHIVA) can be office based (OB). The OB-CHIVA protocol is aimed at transferring CHIVA procedures to specialists' rooms. The protocol will check the feasibility of OB-CHIVA, data pertaining to recurrence, and will offer the opportunity to study saphenous femoral junction (SFJ) stump evolution, the role of the washing vessels and the arch recanalization rate, and gather new data about the effect of the length of the treated saphenous vein. A simplified diagnostic procedure will allow an essential ultrasound examination of the venous net while a schematic and easily readable algorithm guides therapeutic choices. The Riobamba draining crossotomy (RDC) tactic is composed of a set of OB procedures. While some of these procedures are, at the moment, only proposals, others are already applied. Devices generally used in ablative procedures, such as Light Amplification by Stimulated Emission of Radiation (LASER), radio frequency, steam, and mechanical devices, are used in this context to serve conservative interventions for CHIVA. New techniques have also been proposed for devalvulation and tributary disconnection. Detailed follow-up is necessary in order to determine the effects of therapy and possible disease evolution. Finally, information is added about the informed consent and the ethical considerations of OB-CHIVA research. Keywords: CHIVA, office based procedures, LASER, RF, steam

  9. Mainstreaming gesture based interfaces

    Directory of Open Access Journals (Sweden)

    David Procházka

    2013-01-01

    Full Text Available Gestures are a common way of interacting with mobile devices. They emerged especially with the introduction of the iPhone. Gestures in currently used devices are usually based on the original gestures presented by Apple in its iOS (iPhone Operating System). Therefore, there is wide agreement on mobile gesture design. In recent years, experiments with gesture usage have also appeared in other areas of consumer electronics and computers; examples include televisions, large projections, etc. These gestures can be described as spatial or 3D gestures: they are connected with a natural 3D environment rather than with a flat 2D screen. Nevertheless, it is hard to find a comparable design agreement for spatial gestures. Various projects are based on completely different gesture sets. This situation is confusing for users and slows down spatial gesture adoption. This paper focuses on the standardization of spatial gestures. A review of projects focused on spatial gesture usage is provided in the first part, with the main emphasis placed on the usability point of view. On the basis of our analysis, we argue that usability is the key issue enabling wide adoption. Mobile gestures could emerge easily because the iPhone gestures were natural, so it was not necessary to learn them. The design and implementation of our presentation software, which is controlled by gestures, is outlined in the second part of the paper, together with usability testing results. We tested our application on a group of users not instructed in the implemented gesture design, and compared these results with those obtained with our original implementation. The evaluation can be used as a basis for the implementation of similar projects.

  10. Telephone-Based Coaching.

    Science.gov (United States)

    Boccio, Mindy; Sanna, Rashel S; Adams, Sara R; Goler, Nancy C; Brown, Susan D; Neugebauer, Romain S; Ferrara, Assiamira; Wiley, Deanne M; Bellamy, David J; Schmittdiel, Julie A

    2017-03-01

    Many Americans continue to smoke, increasing their risk of disease and premature death. Both telephone-based counseling and in-person tobacco cessation classes may improve access for smokers seeking convenient support to quit. Little research has assessed whether such programs are effective in real-world clinical populations. Retrospective cohort study comparing wellness coaching participants with two groups of controls. Kaiser Permanente Northern California, a large integrated health care delivery system. Two hundred forty-one patients who participated in telephonic tobacco cessation coaching from January 1, 2011, to March 31, 2012, and two control groups: propensity-score-matched controls, and controls who participated in a tobacco cessation class during the same period. Wellness coaching participants received an average of two motivational interviewing-based coaching sessions that engaged the patient, evoked their reason to consider quitting, and helped them establish a quit plan. Self-reported quitting of tobacco and fills of tobacco cessation medications within 12 months of follow-up. Logistic regressions adjusting for age, gender, race/ethnicity, and primary language. After adjusting for confounders, tobacco quit rates were higher among coaching participants vs. matched controls (31% vs. 23%, p Coaching participants and class attendees filled tobacco-cessation prescriptions at a higher rate (47% for both) than matched controls (6%, p coaching was as effective as in-person classes and was associated with higher rates of quitting compared to no treatment. The telephonic modality may increase convenience and scalability for health care systems looking to reduce tobacco use and improve health.

  11. Base Research Program

    Energy Technology Data Exchange (ETDEWEB)

    Everett Sondreal; John Hendrikson

    2009-03-31

    In June 2009, the Energy & Environmental Research Center (EERC) completed 11 years of research under the U.S. Department of Energy (DOE) Base Cooperative Agreement No. DE-FC26-98FT40320 funded through the Office of Fossil Energy (OFE) and administered at the National Energy Technology Laboratory (NETL). A wide range of diverse research activities were performed under annual program plans approved by NETL in seven major task areas: (1) resource characterization and waste management, (2) air quality assessment and control, (3) advanced power systems, (4) advanced fuel forms, (5) value-added coproducts, (6) advanced materials, and (7) strategic studies. This report summarizes results of the 67 research subtasks and an additional 50 strategic studies. Selected highlights in the executive summary illustrate the contribution of the research to the energy industry in areas not adequately addressed by the private sector alone. During the period of performance of the agreement, concerns have mounted over the impact of carbon emissions on climate change, and new programs have been initiated by DOE to ensure that fossil fuel resources along with renewable resources can continue to supply the nation's transportation fuel and electric power. The agreement has addressed DOE goals for reductions in CO{sub 2} emissions through efficiency, capture, and sequestration while expanding the supply and use of domestic energy resources for energy security. It has further contributed to goals for near-zero emissions from highly efficient coal-fired power plants; environmental control capabilities for SO{sub 2}, NO{sub x}, fine respirable particulate (PM{sub 2.5}), and mercury; alternative transportation fuels including liquid synfuels and hydrogen; and synergistic integration of fossil and renewable resources (e.g., wind-, biomass-, and coal-based electrical generation).

  12. DUAL BASES FOR A NEW FAMILY OF GENERALIZED BALL BASES

    Institute of Scientific and Technical Information of China (English)

    Hong-yi Wu

    2004-01-01

    This paper presents the dual bases for a new family of generalized Ball curves with a position parameter K, which includes the Bezier curve, generalized Said-Ball curve and some intermediate curves. Using the dual bases, the relative Marsden identity, conversion formulas of bases and control points of various curves are obtained.

  13. Upgrade of The Cyber R and D Platform

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Chang Seong; Lee, Kyung Jong; Yang, Tae Chun; Kim, Tae Sung; Hong, Hyun Joo [Kyungsung Univ., Seoul (Korea, Republic of)]

    2007-02-15

    Recently, it has become necessary to let people experience the safety of radiation disposal and the related programs for themselves. The objective of this research is to develop a cyber-based performance assessment program for radiation disposal which fits the Korean environment. The research covers the following four areas: (1) development of a Java-based MDPSA pre-processor, (2) linking the MDPSA code to the pre- and post-processors, (3) linking the MDPSA code to the Cyber R and D platform, and (4) modification of the Cyber R and D platform. The results of this research can be used as a PR database for the central and local governments. Using the web-based system, any person or interest group can plug into the system and experience the safety and clarity of atomic energy and radiation disposal. Also, within KAERI, research-related knowledge can be stored in a structured format. This enables the sharing, reusability, transparency, reliability and transferability of research results, and promotes the efficiency of research efforts within and outside of the research team.

  14. CMOS-Based Biosensor Arrays

    CERN Document Server

    Thewes, R; Schienle, M; Hofmann, F; Frey, A; Brederlow, R; Augustyniak, M; Jenkner, M; Eversmann, B; Schindler-Bauer, P; Atzesberger, M; Holzapfl, B; Beer, G; Haneder, T; Hanke, H -C

    2011-01-01

    CMOS-based sensor array chips provide new and attractive features as compared to today's standard tools for medical, diagnostic, and biotechnical applications. Examples for molecule- and cell-based approaches and related circuit design issues are discussed.

  15. Managing the Gap between Curriculum Based and Problem Based Learning

    DEFF Research Database (Denmark)

    Bygholm, Ann; Buus, Lillian

    2009-01-01

    the challenges in applying problem based learning strategies in a context where several universities, with different cultures of teaching, collaboratively develop and deliver online courses. We present a pedagogical framework embracing both problem based and curriculum based strategies and show how we used....../or but rather both/and. In this paper we describe an approach to design and delivery of online courses in computer science which on the one hand is based on a specified curriculum and on the other hand gives room for different learning strategies, problem based learning being one of them. We discuss...... this as a basis for trying out various online learning strategies....

  16. Agent-Based Cloud Computing

    OpenAIRE

    Sim, Kwang Mong

    2012-01-01

    Agent-based cloud computing is concerned with the design and development of software agents for bolstering cloud service discovery, service negotiation, and service composition. The significance of this work is introducing an agent-based paradigm for constructing software tools and testbeds for cloud resource management. The novel contributions of this work include: 1) developing Cloudle: an agent-based search engine for cloud service discovery, 2) showing that agent-based negotiatio...

  17. Memory-Based Shallow Parsing

    OpenAIRE

    Sang, Erik F. Tjong Kim

    2002-01-01

    We present memory-based learning approaches to shallow parsing and apply these to five tasks: base noun phrase identification, arbitrary base phrase recognition, clause detection, noun phrase parsing and full parsing. We use feature selection techniques and system combination methods for improving the performance of the memory-based learner. Our approach is evaluated on standard data sets and the results are compared with that of other systems. This reveals that our approach works well for ba...

  18. Utilization-Based Congestion Control

    OpenAIRE

    Satoshi Utsumi; Salahuddin Muhammad Salim Zabir

    2012-01-01

    Traditional connection oriented protocols like TCP NewReno perform poorly over wireless links. The problem lies in their design assumptions based on loss based congestion control. Various modifications to loss based congestion control schemes have so far been proposed to overcome the issue. In addition, the comparatively newer family of delay based congestion control mechanisms like Caia-Hamilton Delay (CHD), offer effective solutions for wireless link loss. All these approaches aim at improving ...

  19. "Education-based Research"

    DEFF Research Database (Denmark)

    Degn Johansson, Troels

    This paper lays out a concept of education-based research (the production of research knowledge within the framework of tertiary design education) as an integration of problem-based learning and research-based education. This leads to a critique of reflective practice as the primary way to facilitate...... learning at this level, a discussion of the nature of design problems in the instrumentalist tradition, and some suggestions as to how design studies curricula may facilitate education-based research....

  20. Managing the Gap between Curriculum Based and Problem Based Learning

    DEFF Research Database (Denmark)

    Bygholm, Ann; Buus, Lillian

    2009-01-01

    Traditionally there has been a clear distinction between curriculum based and problem based approaches to accomplish learning. Preferred approaches depend of course on conviction, culture, traditions and also on the specific learning situation. We will argue that it is not a question of either....../or but rather both/and. In this paper we describe an approach to design and delivery of online courses in computer science which on the one hand is based on a specified curriculum and on the other hand gives room for different learning strategies, problem based learning being one of them. We discuss...... the challenges in applying problem based learning strategies in a context where several universities, with different cultures of teaching, collaboratively develop and deliver online courses. We present a pedagogical framework embracing both problem based and curriculum based strategies and show how we used...

  1. Identity-based ring signature scheme based on quadratic residues

    Institute of Scientific and Technical Information of China (English)

    Xiong Hu; Qin Zhiguang; Li Fagen

    2009-01-01

    Identity-based (ID-based) ring signatures have drawn great attention in recent years and many ID-based ring signature schemes have been proposed to date. Unfortunately, all of these ID-based ring signatures are constructed from bilinear pairings, a powerful but computationally expensive primitive. Hence, an ID-based ring signature without pairings is of great interest in the field of cryptography. In this paper, the authors first propose an ID-based ring signature scheme based on quadratic residues. The proposed scheme is proved to be existentially unforgeable against adaptive chosen message-and-identity attack under the random oracle model, assuming the hardness of factoring. The proposed scheme is more efficient than those which are constructed from bilinear pairings.

  2. Based on Channel Characteristics

    Directory of Open Access Journals (Sweden)

    Zhuo Hao

    2013-01-01

    Full Text Available A number of key agreement schemes based on wireless channel characteristics have been proposed recently. However, previous key agreement schemes require that two nodes which need to agree on a key are within the communication range of each other. Hence, they are not suitable for multihop wireless networks, in which nodes do not always have direct connections with each other. In this paper, we first propose a basic multihop key agreement scheme for wireless ad hoc networks. The proposed basic scheme is resistant to external eavesdroppers. Nevertheless, this basic scheme is not secure when there exist internal eavesdroppers or Man-in-the-Middle (MITM adversaries. In order to cope with these adversaries, we propose an improved multihop key agreement scheme. We show that the improved scheme is secure against internal eavesdroppers and MITM adversaries in a single path. Both performance analysis and simulation results demonstrate that the improved scheme is efficient. Consequently, the improved key agreement scheme is suitable for multihop wireless ad hoc networks.
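
    As a rough illustration of the hop-by-hop relay idea behind such schemes (an assumption-laden sketch, not the protocol in the paper), suppose each adjacent pair of nodes has already derived a shared pairwise key from its channel measurements; an end-to-end key can then be forwarded hop by hop, one-time-pad style, so that an external eavesdropper on any single link sees only masked bytes. As with the basic scheme above, this naive relay offers no protection against the intermediate nodes themselves.

```python
import os

# Hypothetical sketch: relay an end-to-end key over a multihop path using
# pairwise hop keys (assumed here to come from channel measurements).
def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def relay_key(k, hop_keys):
    """hop_keys[i] is the key shared by node i and node i+1 on the path."""
    blob = xor(k, hop_keys[0])                  # source masks k with hop key 0
    for i in range(1, len(hop_keys)):
        # relay i removes the previous hop's mask and applies its own
        blob = xor(xor(blob, hop_keys[i - 1]), hop_keys[i])
    return xor(blob, hop_keys[-1])              # destination removes the last mask

hop_keys = [os.urandom(16) for _ in range(3)]   # e.g. a 3-hop path
k = os.urandom(16)                              # end-to-end key chosen by the source
assert relay_key(k, hop_keys) == k              # destination recovers k intact
```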

  3. Explanation-Based Auditing

    CERN Document Server

    Fabbri, Daniel

    2011-01-01

    To comply with emerging privacy laws and regulations, it has become common for applications like electronic health records systems (EHRs) to collect access logs, which record each time a user (e.g., a hospital employee) accesses a piece of sensitive data (e.g., a patient record). Using the access log, it is easy to answer simple queries (e.g., Who accessed Alice's medical record?), but this often does not provide enough information. In addition to learning who accessed their medical records, patients will likely want to understand why each access occurred. In this paper, we introduce the problem of generating explanations for individual records in an access log. The problem is motivated by user-centric auditing applications, and it also provides a novel approach to misuse detection. We develop a framework for modeling explanations which is based on a fundamental observation: For certain classes of databases, including EHRs, the reason for most data accesses can be inferred from data stored elsewhere in the da...
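
    A minimal sketch of the idea (the schema and the single rule below are illustrative assumptions, not the paper's framework): an access-log entry is "explained" when some other stored fact, such as an appointment, connects the accessing user to the patient whose record was touched; entries with no such explanation become candidates for misuse review.

```python
# Hypothetical tables: an EHR access log and an appointments table.
access_log = [
    {"user": "dr_smith", "patient": "alice", "date": "2011-03-02"},
    {"user": "clerk_jones", "patient": "alice", "date": "2011-03-05"},
]
appointments = [
    {"doctor": "dr_smith", "patient": "alice", "date": "2011-03-02"},
]

def explain(access, appointments):
    # One illustrative explanation template: "the user treated this patient".
    for appt in appointments:
        if appt["doctor"] == access["user"] and appt["patient"] == access["patient"]:
            return (f"{access['user']} accessed {access['patient']}'s record "
                    f"because of an appointment on {appt['date']}")
    return None   # unexplained accesses are flagged for review

for entry in access_log:
    print(entry, "->", explain(entry, appointments))
```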

  4. Flow-Based Provenance

    Directory of Open Access Journals (Sweden)

    Sabah Al-Fedaghi

    2017-02-01

    Full Text Available Aim/Purpose: With information almost effortlessly created and spontaneously available, current progress in Information and Communication Technology (ICT has led to the complication that information must be scrutinized for trustworthiness and provenance. Information systems must become provenance-aware to be satisfactory in accountability, reproducibility, and trustworthiness of data. Background:\tMultiple models for abstract representation of provenance have been proposed to describe entities, people, and activities involved in producing a piece of data, including the Open Provenance Model (OPM and the World Wide Web Consortium. These models lack certain concepts necessary for specifying workflows and encoding the provenance of data products used and generated. Methodology: Without loss of generality, the focus of this paper is on OPM depiction of provenance in terms of a directed graph. We have redrawn several case studies in the framework of our proposed model in order to compare and evaluate it against OPM for representing these cases. Contribution: This paper offers an alternative flow-based diagrammatic language that can form a foundation for modeling of provenance. The model described here provides an (abstract machine-like representation of provenance. Findings: The results suggest a viable alternative in the area of diagrammatic representation for provenance applications. Future Research: Future work will seek to achieve more accurate comparisons with current models in the field.

  5. Evidence-based management.

    Science.gov (United States)

    Pfeffer, Jeffrey; Sutton, Robert I

    2006-01-01

    For the most part, managers looking to cure their organizational ills rely on obsolete knowledge they picked up in school, long-standing but never proven traditions, patterns gleaned from experience, methods they happen to be skilled in applying, and information from vendors. They could learn a thing or two from practitioners of evidence-based medicine, a movement that has taken the medical establishment by storm over the past decade. A growing number of physicians are eschewing the usual, flawed resources and are instead identifying, disseminating, and applying research that is soundly conducted and clinically relevant. It's time for managers to do the same. The challenge is, quite simply, to ground decisions in the latest and best knowledge of what actually works. In some ways, that's more difficult to do in business than in medicine. The evidence is weaker in business; almost anyone can (and many people do) claim to be a management expert; and a motley crew of sources--Shakespeare, Billy Graham, Jack Welch, Attila the Hun--are used to generate management advice. Still, it makes sense that when managers act on better logic and strong evidence, their companies will beat the competition. Like medicine, management is learned through practice and experience. Yet managers (like doctors) can practice their craft more effectively if they relentlessly seek new knowledge and insight, from both inside and outside their companies, so they can keep updating their assumptions, skills, and knowledge.

  6. Base Camp Architecture

    Directory of Open Access Journals (Sweden)

    Warebi Gabriel Brisibe

    2016-03-01

    Full Text Available Longitudinal or time line studies of change in the architecture of a particular culture are common, but an area still open to further research is change across space or place. In particular, there is need for studies on architectural change of cultures stemming from the same ethnic source split between their homeland and other Diasporas. This change may range from minor deviations to drastic shifts away from an architectural norm and the accumulation of these shifts within a time frame constitutes variations. This article focuses on identifying variations in the architecture of the Ijo fishing group that migrates along the coastline of West Africa. It examines the causes of cross-cultural variation between base camp dwellings of Ijo migrant fishermen in the Bakassi Peninsula in Cameroon and Bayelsa State in Nigeria. The study draws on the idea of the inevitability of cultural and social change over time as proposed in the theories of cultural dynamism and evolution. It tests aspects of cultural transmission theory using the principal coordinates analysis to ascertain the possible causes of variation. From the findings, this research argues that migration has enhanced the forces of cultural dynamism, which have resulted in significant variations in the architecture of this fishing group.

  7. RF Based Spy Robot

    Directory of Open Access Journals (Sweden)

    Prerna Jain

    2014-04-01

    Full Text Available The intention of this paper is to reduce human casualties in terrorist attacks such as 26/11. This problem can be addressed by designing an RF based spy robot that carries a wireless camera, so that adversaries can be observed when required. The robot can quietly enter enemy territory and send back information via the wireless camera. A further feature added to the robot is a colour sensor: the sensor detects the colour of the surface beneath the robot, and the robot changes its own colour accordingly, making it harder for enemies to detect. The movement of the robot is wirelessly controlled by a hand-held RF transmitter that sends commands to the RF receiver mounted on the moving robot. Since human life is always valuable, these robots can substitute for soldiers in war zones. The spy robot can also be used in hotels, shopping malls, jewellery showrooms, and other places where there may be a threat from intruders or terrorists.

  8. Lunar base construction requirements

    Science.gov (United States)

    Jolly, Steve; Helleckson, Brent

    1990-01-01

    The following viewgraph presentation is a review of the Lunar Base Constructibility Study carried out in the spring and summer of 1990. The objective of the study was to develop a method for evaluating the constructibility of Phase A proposals to build facilities on orbit or on extraterrestrial surfaces. Space construction was broadly defined as all forms of assembly, disassembly, connection, disconnection, deployment, stowage, excavation, emplacement, activation, test, transportation, etc., required to create facilities in orbit and on the surfaces of other celestial bodies. It was discovered that decisions made in the face of stated and unstated assumptions early in the design process (commonly called Phase A) can lock in non-optimal construction methods. Often, in order to construct the design, alterations must be made to the design during much later phases of the project. Such 'fixes' can be very difficult, expensive, or perhaps impossible. Assessing constructibility should thus be a part of the iterative design process, starting with the Phase A studies and continuing through production. This study assumes that there exists a minimum set of key construction requirements (i.e., questions whose answers form the set of discriminators) that must be implied or specified in order to assess the constructibility of the design. This set of construction requirements constitutes a 'constructibility filter' which then becomes part of the iterative design process. Five inherently different, dichotomous design reference missions were used in the extraction of these requirements to assure the depth and breath of the list.

  9. Biosensors based on cantilevers.

    Science.gov (United States)

    Alvarez, Mar; Carrascosa, Laura G; Zinoviev, Kiril; Plaza, Jose A; Lechuga, Laura M

    2009-01-01

    Microcantilever-based biosensors are a new label-free technique that allows the direct detection of biomolecular interactions with great accuracy by translating the biointeraction into a nanomechanical motion. Low-cost and reliable standard silicon technologies are widely used for the fabrication of cantilevers with well-controlled mechanical properties. In recent years, the number of applications of these sensors has grown rapidly in diverse fields, such as genomics and proteomics, because of the flexibility of the biosensor, the low sample consumption, and the fact that samples need no pretreatment. In this chapter, we report a dedicated design and a fabrication process for highly sensitive silicon microcantilever sensors. We also describe an application of the device in the environmental field, showing the immunodetection of a toxic organic pesticide as an example. The cantilever biofunctionalization process and the subsequent pesticide determination are detected in real time by monitoring the nanometer-scale bending of the microcantilever due to a differential surface stress generated between the two surfaces of the device.

  10. Holography based super resolution

    Science.gov (United States)

    Hussain, Anwar; Mudassar, Asloob A.

    2012-05-01

    This paper describes the simulation of a simple superresolution technique based on holographic imaging in the spectral domain. An input beam assembly containing 25 optical fibers with different orientations and positions illuminates the object in a 4f optical system. The position and orientation of each fiber are calculated with respect to the central fiber in the array and determine the shift of the object spectrum at the aperture plane. During the imaging process, each fiber is operated once to illuminate the input object transparency, which shifts the object spectrum in the spectral domain by an integral multiple of the passband aperture width. While a single fiber is in the ON state, all other fibers are in the OFF state. The hologram recorded for each fiber at the CCD plane is stored in computer memory. At the end of the illumination process, 25 holograms have been recorded by the whole fiber array, and after post-processing with a specific algorithm a single superresolved image is obtained. The superresolved image is five times better than the band-limited image. The work is demonstrated using computer simulation only.
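
    The spectrum-stitching principle can be sketched numerically (a toy under simplifying assumptions, not the paper's 25-fiber procedure: a 3 x 3 set of spectrum shifts and an idealized square passband are used here). Each off-axis illumination passes a different band of the object spectrum through the same fixed aperture, and tiling those bands widens the effective passband, so the reconstruction error drops relative to the band-limited image.

```python
import numpy as np

# Toy aperture-synthesis sketch: stitch shifted passbands of the object spectrum.
n, half = 256, 16                                  # grid size, passband half-width
yy, xx = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
obj = np.cos(30 * np.pi * xx) * np.exp(-(xx**2 + yy**2) / 0.1)   # test object

F = np.fft.fftshift(np.fft.fft2(obj))
stitched = np.zeros_like(F)
for sy in (-2 * half, 0, 2 * half):                # 3x3 grid of spectrum shifts,
    for sx in (-2 * half, 0, 2 * half):            # one per off-axis illumination
        cy, cx = n // 2 + sy, n // 2 + sx
        stitched[cy - half:cy + half, cx - half:cx + half] = \
            F[cy - half:cy + half, cx - half:cx + half]

central = np.zeros_like(F)                         # single on-axis, band-limited image
central[n//2 - half:n//2 + half, n//2 - half:n//2 + half] = \
    F[n//2 - half:n//2 + half, n//2 - half:n//2 + half]

rec_low = np.real(np.fft.ifft2(np.fft.ifftshift(central)))
rec_wide = np.real(np.fft.ifft2(np.fft.ifftshift(stitched)))
print(np.linalg.norm(obj - rec_low), np.linalg.norm(obj - rec_wide))   # wide << low
```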

  11. Alphavirus-based vaccines.

    Science.gov (United States)

    Lundstrom, Kenneth

    2014-06-16

    Alphavirus vectors have demonstrated high levels of transient heterologous gene expression both in vitro and in vivo and, therefore, possess attractive features for vaccine development. The most commonly used delivery vectors are based on three single-stranded encapsulated alphaviruses, namely Semliki Forest virus, Sindbis virus and Venezuelan equine encephalitis virus. Alphavirus vectors have been applied as replication-deficient recombinant viral particles and, more recently, as replication-proficient particles. Moreover, in vitro transcribed RNA, as well as layered DNA vectors have been applied for immunization. A large number of highly immunogenic viral structural proteins expressed from alphavirus vectors have elicited strong neutralizing antibody responses in multispecies animal models. Furthermore, immunization studies have demonstrated robust protection against challenges with lethal doses of virus in rodents and primates. Similarly, vaccination with alphavirus vectors expressing tumor antigens resulted in prophylactic protection against challenges with tumor-inducing cancerous cells. As certain alphaviruses, such as Chikungunya virus, have been associated with epidemics in animals and humans, attention has also been paid to the development of vaccines against alphaviruses themselves. Recent progress in alphavirus vector development and vaccine technology has allowed conducting clinical trials in humans.

  12. Alphavirus-Based Vaccines

    Directory of Open Access Journals (Sweden)

    Kenneth Lundstrom

    2014-06-01

    Full Text Available Alphavirus vectors have demonstrated high levels of transient heterologous gene expression both in vitro and in vivo and, therefore, possess attractive features for vaccine development. The most commonly used delivery vectors are based on three single-stranded encapsulated alphaviruses, namely Semliki Forest virus, Sindbis virus and Venezuelan equine encephalitis virus. Alphavirus vectors have been applied as replication-deficient recombinant viral particles and, more recently, as replication-proficient particles. Moreover, in vitro transcribed RNA, as well as layered DNA vectors have been applied for immunization. A large number of highly immunogenic viral structural proteins expressed from alphavirus vectors have elicited strong neutralizing antibody responses in multispecies animal models. Furthermore, immunization studies have demonstrated robust protection against challenges with lethal doses of virus in rodents and primates. Similarly, vaccination with alphavirus vectors expressing tumor antigens resulted in prophylactic protection against challenges with tumor-inducing cancerous cells. As certain alphaviruses, such as Chikungunya virus, have been associated with epidemics in animals and humans, attention has also been paid to the development of vaccines against alphaviruses themselves. Recent progress in alphavirus vector development and vaccine technology has allowed conducting clinical trials in humans.

  13. Azido-based propellants

    Energy Technology Data Exchange (ETDEWEB)

    Sayles, D.C.

    1987-04-07

    This patent describes an azido-based solid propellant composition having an improved burning rate comprising: a high energy plasticizer of tris-1,2,3(bis(1,2-difluoroamino)ethoxy)propane in an amount from about 24 to about 30 weight percent of the propellant composition; a curative and crosslinking agent of 4,5-epoxycyclohexylmethyl 4'5'-epoxycyclohexylcarboxylate in an amount from about 0.75 to about 1.5 weight percent of the propellant composition; a carboranyl burning rate catalyst of carboranyl-methyl propionate in an amount from about 2 to about 6 weight percent of the propellant composition; graphite linters of about 100 micrometers lengths in an amount from about 1 to about 3 weight percent of the propellant composition; aluminum powder in an amount from about 10 to about 12 weight percent of the propellant composition; aluminum flake in an amount from about 0.5 to about 2 weight percent of the propellant composition; ammonium perchlorate of about 0.9 micrometer diameter in an amount from about 46 to about 52 weight percent of the composition; a processing aid of lecithin in an amount from about 0.1 to about 0.2 weight percent of the propellant composition; and a binder of 2-azidoethyl acrylateacrylic acid copolymer in an amount from about 3 to about 8 weight percent of the propellant composition.

  14. Mixture Based Outlier Filtration

    Directory of Open Access Journals (Sweden)

    P. Pecherková

    2006-01-01

    Full Text Available Success/failure of adaptive control algorithms – especially those designed using the Linear Quadratic Gaussian criterion – depends on the quality of the process data used for model identification. One of the most harmful types of process data corruptions are outliers, i.e. ‘wrong data’ lying far away from the range of real data. The presence of outliers in the data negatively affects an estimation of the dynamics of the system. This effect is magnified when the outliers are grouped into blocks. In this paper, we propose an algorithm for outlier detection and removal. It is based on modelling the corrupted data by a two-component probabilistic mixture. The first component of the mixture models uncorrupted process data, while the second models outliers. When the outlier component is detected to be active, a prediction from the uncorrupted data component is computed and used as a reconstruction of the observed data. The resulting reconstruction filter is compared to standard methods on simulated and real data. The filter exhibits excellent properties, especially in the case of blocks of outliers. 
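
    A stripped-down sketch of the reconstruction step (fixed, hand-picked component widths instead of the estimated mixture in the paper): each observation is scored under a narrow "uncorrupted data" component centred on a one-step prediction and under a broad "outlier" component, and whenever the outlier component is the more likely one the observation is replaced by the prediction.

```python
import numpy as np

def filter_outliers(y, sigma_ok=1.0, sigma_out=20.0):
    """Two-component (narrow/broad) Gaussian test around a naive one-step prediction."""
    cleaned = [y[0]]
    for obs in y[1:]:
        pred = cleaned[-1]                               # naive one-step prediction
        p_ok = np.exp(-0.5 * ((obs - pred) / sigma_ok) ** 2) / sigma_ok
        p_out = np.exp(-0.5 * ((obs - pred) / sigma_out) ** 2) / sigma_out
        cleaned.append(obs if p_ok >= p_out else pred)   # reconstruct detected outliers
    return np.array(cleaned)

y = np.sin(np.linspace(0, 6, 200)) + 0.05 * np.random.randn(200)
y[80:90] += 15.0                                         # a block of outliers
print(np.abs(filter_outliers(y)[80:90]).max())           # reconstructed values stay ~1, not ~15
```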

  15. AUGMENTED REALITY BASED ASSISTANCE

    Directory of Open Access Journals (Sweden)

    N.R.Raajan

    2013-04-01

    Full Text Available The concept of Augmented Reality can be explained as the superimposition of computer-generated two-dimensional or three-dimensional objects over the real-time scene acquired by the capturing device. Augmented Reality thus adds information to the real scene, and this can be implemented with the help of markers. Application development is comparatively simple in the case of AR. This idea is extended to the development of an Augmented Reality based book that acts as a tour guide. The travel guide can give all the basic information needed for a better trip around the destination sites. The application detects the markers found in the real scene and superimposes multimedia data on them, giving rich information. The application can also redirect to web links for easy access to other utilities through interaction. The same idea can be used in engineering laboratories to understand the working of a circuit by visualizing its operation, with the circuit diagram itself used as a marker, thus enhancing self-learning among students.

  16. Home-based Healthcare Technology

    DEFF Research Database (Denmark)

    Verdezoto, Nervo

    of these systems target a specific treatment or condition and might not be sufficient to support the care management work at home. Based on a case study approach, my research investigates home-based healthcare practices and how they can inform future design of home-based healthcare technology that better account...

  17. Condition based spare parts supply

    NARCIS (Netherlands)

    Lin, X.; Basten, Robertus Johannes Ida; Kranenburg, A.A.; van Houtum, Geert-Jan

    2012-01-01

    We consider a spare parts stock point that serves an installed base of machines. Each machine contains the same critical component, whose degradation behavior is described by a Markov process. We consider condition based spare parts supply, and show that an optimal, condition based inventory policy

  18. Memory-Based Shallow Parsing

    NARCIS (Netherlands)

    Tjong Kim Sang, E.F.

    2002-01-01

    We present memory-based learning approaches to shallow parsing and apply these to five tasks: base noun phrase identification, arbitrary base phrase recognition, clause detection, noun phrase parsing and full parsing. We use feature selection techniques and system combination methods for improving t

  19. Base Camp Design Simulation Training

    Science.gov (United States)

    2011-07-01

    The Army needs officers and noncommissioned officers with requisite base camp competencies. The Army’s Field Manual (FM) 3-34.400 defines a Base Camp...reason, we designed a 600-man base camp on VBS2TM from an AutoCAD diagram found on the Theater Construction Management System (version 3.2). Known

  20. Securing web-based exams

    NARCIS (Netherlands)

    Sessink, O.D.T.; Beeftink, H.H.; Tramper, J.; Hartog, R.J.M.

    2004-01-01

    Learning management systems may offer web-based exam facilities. Such facilities entail a higher risk to exams fraud than traditional paper-based exams. The article discusses security issues with web-based exams, and proposes precautionary measures to reduce the risks. A security model is presented

  1. Securing web-based exams

    NARCIS (Netherlands)

    Sessink, O.D.T.; Beeftink, H.H.; Tramper, J.; Hartog, R.J.M.

    2004-01-01

    Learning management systems may offer web-based exam facilities. Such facilities entail a higher risk to exams fraud than traditional paper-based exams. The article discusses security issues with web-based exams, and proposes precautionary measures to reduce the risks. A security model is presented

  2. Memory-Based Shallow Parsing

    NARCIS (Netherlands)

    Tjong Kim Sang, E.F.

    2002-01-01

    We present memory-based learning approaches to shallow parsing and apply these to five tasks: base noun phrase identification, arbitrary base phrase recognition, clause detection, noun phrase parsing and full parsing. We use feature selection techniques and system combination methods for improving

  3. Space-based detectors

    Science.gov (United States)

    Sesana, A.; Weber, W. J.; Killow, C. J.; Perreur-Lloyd, M.; Robertson, D. I.; Ward, H.; Fitzsimons, E. D.; Bryant, J.; Cruise, A. M.; Dixon, G.; Hoyland, D.; Smith, D.; Bogenstahl, J.; McNamara, P. W.; Gerndt, R.; Flatscher, R.; Hechenblaikner, G.; Hewitson, M.; Gerberding, O.; Barke, S.; Brause, N.; Bykov, I.; Danzmann, K.; Enggaard, A.; Gianolio, A.; Vendt Hansen, T.; Heinzel, G.; Hornstrup, A.; Jennrich, O.; Kullmann, J.; Møller-Pedersen, S.; Rasmussen, T.; Reiche, J.; Sodnik, Z.; Suess, M.; Armano, M.; Sumner, T.; Bender, P. L.; Akutsu, T.; Sathyaprakash, B. S.

    2014-12-01

    The parallel session C5 on Space-Based Detectors gave a broad overview over the planned space missions related to gravitational wave detection. Overviews of the revolutionary science to be expected from LISA was given by Alberto Sesana and Sasha Buchman. The launch of LISA Pathfinder (LPF) is planned for 2015. This mission and its payload "LISA Technology Package" will demonstrate key technologies for LISA. In this context, reference masses in free fall for LISA, and gravitational physics in general, was described by William Weber, laser interferometry at the pico-metre level and the optical bench of LPF was presented by Christian Killow and the performance of the LPF optical metrology system by Paul McNamara. While LPF will not yet be sensitive to gravitational waves, it may nevertheless be used to explore fundamental physics questions, which was discussed by Michele Armano. Some parts of the LISA technology that are not going to be demonstrated by LPF, but under intensive development at the moment, were presented by Oliver Jennrich and Oliver Gerberding. Looking into the future, Japan is studying the design of a mid-frequency detector called DECIGO, which was discussed by Tomotada Akutsu. Using atom interferometry for gravitational wave detection has also been recently proposed, and it was critically reviewed by Peter Bender. In the nearer future, the launch of GRACE Follow-On (for Earth gravity observation) is scheduled for 2017, and it will include a Laser Ranging Interferometer as technology demonstrator. This will be the first inter-spacecraft laser interferometer and has many aspects in common with the LISA long arm, as discussed by Andrew Sutton.

  4. Soy-based renoprotection.

    Science.gov (United States)

    McGraw, Nancy J; Krul, Elaine S; Grunz-Borgmann, Elizabeth; Parrish, Alan R

    2016-05-06

    Chronic kidney disease (CKD) is a significant public health problem as risk factors such as advanced age, obesity, hypertension and diabetes rise in the global population. Currently there are no effective pharmacologic treatments for this disease. The role of diet is important for slowing the progression of CKD and managing symptoms in later stages of renal insufficiency. While low protein diets are generally recommended, maintaining adequate levels of intake is critical for health. There is an increasing appreciation that the source of protein may also be important. Soybean protein has been the most extensively studied plant-based protein in subjects with kidney disease and has demonstrated renal protective properties in a number of clinical studies. Soy protein consumption has been shown to slow the decline in estimated glomerular filtration rate and significantly improve proteinuria in diabetic and non-diabetic patients with nephropathy. Soy's beneficial effects on renal function may also result from its impact on certain physiological risk factors for CKD such as dyslipidemia, hypertension and hyperglycemia. Soy intake is also associated with improvements in antioxidant status and systemic inflammation in early and late stage CKD patients. Studies conducted in animal models have helped to identify the underlying molecular mechanisms that may play a role in the positive effects of soy protein on renal parameters in polycystic kidney disease, metabolically-induced kidney dysfunction and age-associated progressive nephropathy. Despite the established relationship between soy and renoprotection, further studies are needed for a clear understanding of the role of the cellular and molecular target(s) of soy protein in maintaining renal function.

  5. Accelerator based fusion reactor

    Science.gov (United States)

    Liu, Keh-Fei; Chao, Alexander Wu

    2017-08-01

    A feasibility study of fusion reactors based on accelerators is carried out. We consider a novel scheme where a beam from the accelerator hits the target plasma on the resonance of the fusion reaction and establish characteristic criteria for a workable reactor. We consider the reactions d + t → n + α, d + ³He → p + α, and p + ¹¹B → 3α in this study. The critical temperature of the plasma is determined from overcoming the stopping power of the beam with the fusion energy gain. The needed plasma lifetime is determined from the width of the resonance, the beam velocity and the plasma density. We estimate the critical beam flux by balancing the energy of fusion production against the plasma thermo-energy and the loss due to stopping power for the case of an inert plasma. The product of critical flux and plasma lifetime is independent of plasma density and has a weak dependence on temperature. Even though the critical temperatures for these reactions are lower than those for the thermonuclear reactors, the critical flux is in the range of 10²² to 10²⁴ cm⁻² s⁻¹ for the plasma density ρₜ = 10¹⁵ cm⁻³ in the case of an inert plasma. Several approaches to control the growth of the two-stream instability are discussed. We have also considered several scenarios for practical implementation which will require further studies. Finally, we consider the case where the injected beam at the resonance energy maintains the plasma temperature and prolongs its lifetime to reach a steady state. The equations for power balance and particle number conservation are given for this case.

  6. Nanoparticle-based Sensors

    Directory of Open Access Journals (Sweden)

    V.K. Khanna

    2008-09-01

    Full Text Available Nanoparticles exhibit several unique properties that can be applied to develop chemical and biosensors possessing desirable features like enhanced sensitivity and lower detection limits. Gold nanoparticles are coated with sugars tailored to recognise different biological substances. When mixed with a weak solution of the sugar-coated nanoparticles, the target substance, e.g., ricin or E.coli, attaches to the sugar, thereby altering its properties and changing the colour. Spores of bacterium labeled with carbon dots have been found to glow upon illumination when viewed with a confocal microscope. Enzyme/nanoparticle-based optical sensors for the detection of organophosphate (OP) compounds employ nanoparticle-modified fluorescence of an inhibitor of the enzyme to generate the signal for the OP compound detection. Nanoparticles shaped as nanoprisms, built of silver atoms, appear red on exposure to light. These nanoparticles are used as diagnostic labels that glow when target DNA, e.g., those of anthrax or HIV, are present. Of great importance are tools like the gold nanoparticle-enhanced surface-plasmon resonance sensor and the silver nanoparticle surface-enhanced portable Raman integrated tunable sensor. Nanoparticle metal oxide chemiresistors using a micro electro mechanical system hotplate are very promising devices for toxic gas sensing. Chemiresistors comprising thin films of nanogold particles, encapsulated in monomolecular layers of functionalised alkanethiols, deposited on interdigitated microelectrodes, show resistance changes through reversible absorption of vapours of harmful gases. This paper reviews the state-of-the-art sensors for chemical and biological terror agents, indicates their capabilities and applications, and presents the future scope of these devices. Defence Science Journal, 2008, 58(5), pp. 608-616, DOI: http://dx.doi.org/10.14429/dsj.58.1683

  7. Lunar Base Heat Pump

    Science.gov (United States)

    Walker, D.; Fischbach, D.; Tetreault, R.

    1996-01-01

    The objective of this project was to investigate the feasibility of constructing a heat pump suitable for use as a heat rejection device in applications such as a lunar base. In this situation, direct heat rejection through the use of radiators is not possible at a temperature suitable for life support systems. Initial analysis of a heat pump of this type called for a temperature lift of approximately 378 deg. K, which is considerably higher than is commonly called for in HVAC and refrigeration applications where heat pumps are most often employed. Also because of the variation of the rejection temperature (from 100 to 381 deg. K), extreme flexibility in the configuration and operation of the heat pump is required. A three-stage compression cycle using a refrigerant such as CFC-11 or HCFC-123 was formulated with operation possible with one, two or three stages of compression. Also, to meet the redundancy requirements, compression was divided up over multiple compressors in each stage. A control scheme was devised that allowed these multiple compressors to be operated as required so that the heat pump could perform with variable heat loads and rejection conditions. A prototype heat pump was designed and constructed to investigate the key elements of the high-lift heat pump concept. Control software was written and implemented in the prototype to allow fully automatic operation. The heat pump was capable of operation over a wide range of rejection temperatures and cooling loads, while maintaining cooling water temperature well within the required specification of 40 deg. C +/- 1.7 deg. C. This performance was verified through testing.

  8. Workplace Based Assessment in Psychiatry

    Directory of Open Access Journals (Sweden)

    Ayse Devrim Basterzi

    2009-11-01

    Full Text Available Workplace based assessment refers to the assessment of working practices based on what doctors actually do in the workplace, and is predominantly carried out in the workplace itself. Assessment drives learning, and it is therefore essential that workplace based assessment focuses on important attributes rather than on what is easiest to assess. Workplace based assessment is usually competency based. Workplace based assessments may well facilitate and enhance various aspects of educational supervision, including its structure, frequency and duration. The structure and content of workplace based assessments should be monitored to ensure that their benefits are maximised by remaining tailored to individual trainees' needs. Workplace based assessment should be used for both formative and summative assessment. Several formative assessment methods have been developed for use in the workplace, such as the mini clinical evaluation exercise (mini-CEX), evidence based journal club assessment, case based discussion, and multisource feedback. This review discusses the need for workplace based assessment in psychiatry graduate education and introduces some of the workplace based assessment methods.

  9. Seguridad en bases de datos

    Directory of Open Access Journals (Sweden)

    Heidy Alina Nuevo León

    2011-11-01

    Full Text Available Database systems are in wide use, ranging from lightweight databases and real-time databases (sometimes obtained by optimizing relational databases) to relational databases with powerful management systems serving as the application that provides the service. In the free software world there is a large number of database management systems, among them mysql, berkeley db, sqlite and postgresql. The latter has a large number of followers and highly skilled people in the free software development community who contribute to its purpose. The main objective of the present work is to show, by means of an investigation, the different mechanisms for configuring security in the Postgres database management system.

  10. RIPGIS-NET: a GIS tool for riparian groundwater evapotranspiration in MODFLOW.

    Science.gov (United States)

    Ajami, Hoori; Maddock, Thomas; Meixner, Thomas; Hogan, James F; Guertin, D Phillip

    2012-01-01

    RIPGIS-NET, an Environmental Systems Research Institute (ESRI) ArcGIS 9.2/9.3 custom application, was developed to derive parameters and visualize results of spatially explicit riparian groundwater evapotranspiration (ETg), i.e. evapotranspiration from the saturated zone, in groundwater flow models for ecohydrology, riparian ecosystem management, and stream restoration. Specifically, RIPGIS-NET works with riparian evapotranspiration (RIP-ET), a modeling package that works with the MODFLOW groundwater flow model. RIP-ET improves ETg simulations by using a set of eco-physiologically based ETg curves for plant functional subgroups (PFSGs), and separates ground evaporation and plant transpiration processes from the water table. The RIPGIS-NET program was developed in Visual Basic 2005, .NET framework 2.0, and runs in ArcMap 9.2 and 9.3 applications. RIPGIS-NET, a pre- and post-processor for RIP-ET, incorporates spatial variability of riparian vegetation and land surface elevation into ETg estimation in MODFLOW groundwater models. RIPGIS-NET derives RIP-ET input parameters including PFSG evapotranspiration curve parameters, fractional coverage areas of each PFSG in a MODFLOW cell, and average surface elevation per riparian vegetation polygon using a digital elevation model. RIPGIS-NET also provides visualization tools for modelers to create head maps, depth to water table (DTWT) maps, and plot DTWT for a PFSG in a polygon in the Geographic Information System based on MODFLOW simulation results. © 2011, The Author(s). Ground Water © 2011, National Ground Water Association.
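
    One of the derived inputs, the fractional coverage of each PFSG inside a MODFLOW cell, can be illustrated with a small geometric sketch (the function name, data layout and use of the shapely package are assumptions for illustration, not the RIPGIS-NET API):

```python
from shapely.geometry import box, Polygon

def pfsg_fractions(cell_bounds, pfsg_polygons):
    """cell_bounds: (xmin, ymin, xmax, ymax) of one MODFLOW cell.
    pfsg_polygons: dict mapping PFSG name -> list of (x, y) vertices."""
    cell = box(*cell_bounds)
    fractions = {}
    for name, vertices in pfsg_polygons.items():
        overlap = cell.intersection(Polygon(vertices)).area
        fractions[name] = overlap / cell.area
    return fractions

# Example: a 100 m x 100 m cell half-covered by one (hypothetical) PFSG polygon
print(pfsg_fractions((0, 0, 100, 100),
                     {"mesquite": [(0, 0), (50, 0), (50, 100), (0, 100)]}))
```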

  11. Case-based reasoning: The marriage of knowledge base and data base

    Science.gov (United States)

    Pulaski, Kirt; Casadaban, Cyprian

    1988-01-01

    The coupling of data and knowledge has a synergistic effect when building an intelligent data base. The goal is to integrate the data and knowledge almost to the point of indistinguishability, permitting them to be used interchangeably. Examples given in this paper suggest that Case-Based Reasoning is a more integrated way to link data and knowledge than pure rule-based reasoning.

  12. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)]

    1997-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in more and more applications, verification of their knowledge bases becomes increasingly important. The conventional Petri net approach that has recently been studied for knowledge base verification has been found inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. We therefore propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)
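
    The flavour of reachability analysis over a rule base modelled as a Petri net can be sketched as follows (a deliberately minimal 1-bounded net with invented rules, not the enhanced colored Petri net formalism of the paper):

```python
from collections import deque

# Rules of a knowledge base modelled as transitions that consume facts
# (tokens in input places) and produce conclusions (tokens in output places).
def reachable(initial, transitions, goal):
    """initial/goal: sets of marked places; transitions: (inputs, outputs) pairs."""
    start = frozenset(initial)
    seen, queue = {start}, deque([start])
    while queue:
        marking = queue.popleft()
        if set(goal) <= marking:
            return True
        for inputs, outputs in transitions:
            if set(inputs) <= marking:                       # transition is enabled
                nxt = frozenset((marking - set(inputs)) | set(outputs))
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return False

# Rule 1: high_pressure & high_temp -> alarm; Rule 2: alarm -> shutdown
rules = [({"high_pressure", "high_temp"}, {"alarm"}),
         ({"alarm"}, {"shutdown"})]
print(reachable({"high_pressure", "high_temp"}, rules, {"shutdown"}))    # True
```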

  13. Optimal pricing decision model based on activity-based costing

    Institute of Scientific and Technical Information of China (English)

    王福胜; 常庆芳

    2003-01-01

    In order to find out the applicability of the optimal pricing decision model based on the conventional cost behavior model after activity-based costing has given a strong shock to the conventional cost behavior model and its assumptions, detailed analyses have been made using the activity-based cost behavior and cost-volume-profit analysis model. It is concluded from these analyses that the theory behind the construction of the optimal pricing decision model is still tenable under activity-based costing, but the conventional optimal pricing decision model must be modified as appropriate to the activity-based costing based cost behavior model and cost-volume-profit analysis model; an optimal pricing decision model is really a product pricing decision model constructed by following the economic principle of maximizing profit.
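
    A toy numerical illustration of why the conventional model needs modifying (all figures are invented): under activity-based costing, part of what conventional costing treats as fixed overhead, for instance batch setup cost, varies with output through a cost driver, so the profit-maximizing price shifts relative to the conventional optimum.

```python
# Linear demand q(p) = a - b*p; unit-level cost plus a batch-level cost driver.
a, b = 1000.0, 4.0            # demand intercept and slope (assumed)
unit_cost = 60.0              # unit-level (conventional variable) cost
batch_cost = 800.0            # setup cost per batch (an ABC cost driver)
batch_size = 50.0             # units per batch

def profit(p, include_batches):
    q = max(a - b * p, 0.0)
    cost = unit_cost * q + (include_batches * batch_cost * q / batch_size)
    return p * q - cost

prices = [p / 10 for p in range(600, 2500)]           # search 60.0 .. 249.9
p_conventional = max(prices, key=lambda p: profit(p, include_batches=False))
p_abc = max(prices, key=lambda p: profit(p, include_batches=True))
print(p_conventional, p_abc)   # the ABC-adjusted optimum price is higher
```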

  14. Identity-based signature scheme based on quadratic residues

    Institute of Scientific and Technical Information of China (English)

    CHAI ZhenChuan; CAO ZhenFu; DONG XiaoLei

    2007-01-01

    Identity-based (ID-based) cryptography has drawn great attention in recent years, and most ID-based schemes are constructed from bilinear pairings. Therefore, an ID-based scheme without pairings is of great interest in the field of cryptography. Up to now, it has remained a challenge to construct an ID-based signature scheme from quadratic residues. We aim to meet this challenge by proposing a concrete scheme. In this paper, we first introduce the technique for calculating a 2^l-th root of a quadratic residue, and then give a concrete ID-based signature scheme using this technique. We also prove that our scheme is chosen-message and ID secure in the random oracle model, assuming the hardness of factoring.
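
    The number-theoretic step mentioned above can be illustrated in a simplified setting (a small prime modulus p ≡ 3 (mod 4); the actual scheme works over a composite modulus whose factorization is the master secret, so treat this purely as a sketch of extracting iterated square roots of a quadratic residue):

```python
# Illustrative only: iterated square roots of a quadratic residue modulo a
# prime p ≡ 3 (mod 4). Each root is x^((p+1)/4) mod p; choosing the root that
# is itself a residue lets the extraction be repeated l times.
def is_qr(a, p):
    return pow(a, (p - 1) // 2, p) == 1

def root_2l(a, p, l):
    """Return x with x^(2^l) ≡ a (mod p), for a quadratic residue a."""
    assert p % 4 == 3 and is_qr(a, p)
    x = a % p
    for _ in range(l):
        r = pow(x, (p + 1) // 4, p)       # a square root of x
        if not is_qr(r, p):               # exactly one of r and p - r is a residue
            r = p - r
        x = r
    return x

p, a, l = 23, 4, 3                        # toy parameters: 23 ≡ 3 (mod 4), 4 is a QR
x = root_2l(a, p, l)
assert pow(x, 2 ** l, p) == a             # x is a 2^3-th (eighth) root of a
```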

  15. Evidence-based dentistry.

    Science.gov (United States)

    Chambers, David W

    2010-01-01

    Both panegyric and criticism of evidence-based dentistry tend to be clumsy because the concept is poorly defined. This analysis identifies several contributions to the profession that have been made under the EBD banner. Although the concept of clinicians integrating clinical epidemiology, the wisdom of their practices, and patients' values is powerful, its implementation has been distorted by a too heavy emphasis of computerized searches for research findings that meet the standards of academics. Although EBD advocates enjoy sharing anecdotal accounts of mistakes others have made, faulting others is not proof that one's own position is correct. There is no systematic, high-quality evidence that EBD is effective. The metaphor of a three-legged stool (evidence, experience, values, and integration) is used as an organizing principle. "Best evidence" has become a preoccupation among EBD enthusiasts. That overlong but thinly developed leg of the stool is critiqued from the perspectives of the criteria for evidence, the difference between internal and external validity, the relationship between evidence and decision making, the ambiguous meaning of "best," and the role of reasonable doubt. The strongest leg of the stool is clinical experience. Although bias exists in all observations (including searches for evidence), there are simple procedures that can be employed in practice to increase useful and objective evidence there, and there are dangers in delegating policy regarding allowable treatments to external groups. Patient and practitioner values are the shortest leg of the stool. As they are so little recognized, their integration in EBD is problematic and ethical tensions exist where paternalism privileges science over patient's self-determined best interests. Four potential approaches to integration are suggested, recognizing that there is virtually no literature on how the "seat" of the three-legged stool works or should work. It is likely that most dentists

  16. XML-Based SHINE Knowledge Base Interchange Language

    Science.gov (United States)

    James, Mark; Mackey, Ryan; Tikidjian, Raffi

    2008-01-01

    The SHINE Knowledge Base Interchange Language software has been designed to more efficiently send new knowledge bases to spacecraft that have been embedded with the Spacecraft Health Inference Engine (SHINE) tool. The intention of the behavioral model is to capture most of the information generally associated with a spacecraft functional model, while specifically addressing the needs of execution within SHINE and Livingstone. As such, it has some constructs that are based on one or the other.

  17. Quantum Standard Teleportation Based on the Generic Measurement Bases

    Institute of Scientific and Technical Information of China (English)

    HAO San-Ru; HOU Bo-Yu; XI Xiao-Qiang; YUE Rui-Hong

    2003-01-01

    We study the quantum standard teleportation based on the generic measurement bases. It is shown that the quantum standard teleportation does not depend on the explicit expression of the measurement bases. We have given the correspondence relation between the measurement performed by Alice and the unitary transformation performed by Bob. We also prove that the single particle unknown states and the two-particle unknown cat-like states can be exactly transmitted by means of the generic measurement bases and the correspondence unitary transformations.

  18. Quantum Standard Teleportation Based on the Generic Measurement Bases

    Institute of Scientific and Technical Information of China (English)

    HAO San-Ru; HOU Bo-Yu; XI Xiao-Qiang; YUE Rui-Hong

    2003-01-01

    We study the quantum standard teleportation based on the generic measurement bases. It is shown that the quantum standard teleportation does not depend on the explicit expression of the measurement bases. We have given the correspondence relation between the measurement performed by Alice and the unitary transformation performed by Bob. We also prove that the single particle unknown states and the two-particle unknown cat-like states can be exactly transmitted by means of the generic measurement bases and the correspondence unitary transformations.

  19. Game-based versus traditional case-based learning

    Science.gov (United States)

    Telner, Deanna; Bujas-Bobanovic, Maja; Chan, David; Chester, Bob; Marlow, Bernard; Meuser, James; Rothman, Arthur; Harvey, Bart

    2010-01-01

    ABSTRACT OBJECTIVE To evaluate family physicians’ enjoyment of and knowledge gained from game-based learning, compared with traditional case-based learning, in a continuing medical education (CME) event on stroke prevention and management. DESIGN An equivalence trial to determine if game-based learning was as effective as case-based learning in terms of attained knowledge levels. Game questions and small group cases were developed. Participants were randomized to either a game-based or a case-based group and took part in the event. SETTING Ontario provincial family medicine conference. PARTICIPANTS Thirty-two family physicians and 3 senior family medicine residents attending the conference. INTERVENTION Participation in either a game-based or a case-based CME learning group. MAIN OUTCOME MEASURES Scores on 40-item immediate and 3-month posttests of knowledge and a satisfaction survey. RESULTS Results from knowledge testing immediately after the event and 3 months later showed no significant difference in scoring between groups. Participants in the game-based group reported higher levels of satisfaction with the learning experience. CONCLUSION Games provide a novel way of organizing CME events. They might provide more group interaction and discussion, as well as improve recruitment to CME events. They might also provide a forum for interdisciplinary CME. Using games in future CME events appears to be a promising approach to facilitate participant learning. PMID:20841574

  20. Remote Monitoring System for Communication Base Based on Short Message

    Directory of Open Access Journals (Sweden)

    Han Yu Fu

    2013-07-01

    Full Text Available This paper presents the design and development of an automatic monitoring system for communication base stations, an important means of modernizing mobile communication base station management. Firstly, the paper proposes the architecture of the monitoring system. The proposed system consists of microcontrollers, sensors, a GSM module and an MFRC500 reader, among other components. The monitoring terminal that measures the parameter values is studied and designed, covering both the embedded hardware design and the software design. Finally, the communication module is discussed. The monitoring system, which is designed based on GSM SMS (short message service), can improve the integrity, reliability, flexibility and intelligence of the monitoring system.
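
    As a rough sketch of the SMS-alarm idea described above (not the paper's firmware; the serial port name, phone number, threshold and the placement of the logic on a host PC are all assumptions), a GSM modem can be driven with standard text-mode AT commands:

    ```python
    # Illustrative sketch only: send an SMS alarm through a GSM modem using text-mode
    # AT commands (AT+CMGF, AT+CMGS). Port, number and threshold below are hypothetical.
    import serial
    import time

    ALARM_THRESHOLD_C = 45.0  # hypothetical temperature limit for the base station room

    def send_sms(port: str, number: str, text: str) -> None:
        with serial.Serial(port, 9600, timeout=2) as modem:
            modem.write(b"AT+CMGF=1\r")                  # switch the modem to text mode
            time.sleep(0.5)
            modem.write(f'AT+CMGS="{number}"\r'.encode())
            time.sleep(0.5)
            modem.write(text.encode() + bytes([26]))     # Ctrl+Z terminates the message
            time.sleep(2)

    def check_and_alert(temperature_c: float) -> None:
        if temperature_c > ALARM_THRESHOLD_C:
            send_sms("/dev/ttyUSB0", "+10000000000",
                     f"Base station overheating: {temperature_c:.1f} C")

    check_and_alert(47.2)
    ```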

  1. Computer-based and web-based radiation safety training

    Energy Technology Data Exchange (ETDEWEB)

    Owen, C., LLNL

    1998-03-01

    The traditional approach to delivering radiation safety training has been to provide a stand-up lecture of the topic, with the possible aid of video, and to repeat the same material periodically. New approaches to meeting training requirements are needed to address the advent of flexible work hours and telecommuting, and to better accommodate individuals learning at their own pace. Computer- based and web-based radiation safety training can provide this alternative. Computer-based and web- based training is an interactive form of learning that the student controls, resulting in enhanced and focused learning at a time most often chosen by the student.

  2. Acids and bases: solvent effects on acid-base strength

    CERN Document Server

    Cox, Brian G

    2013-01-01

    Acids and bases are ubiquitous in chemistry. Our understanding of them, however, is dominated by their behaviour in water. Transfer to non-aqueous solvents leads to profound changes in acid-base strengths and to the rates and equilibria of many processes: for example, synthetic reactions involving acids, bases and nucleophiles; isolation of pharmaceutical actives through salt formation; formation of zwitter- ions in amino acids; and chromatographic separation of substrates. This book seeks to enhance our understanding of acids and bases by reviewing and analysing their behaviour in non-aqueous solvents. The behaviour is related where possible to that in water, but correlations and contrasts between solvents are also presented.

  3. The knowledge base of journalism

    DEFF Research Database (Denmark)

    Svith, Flemming

    In this paper I propose the knowledge base as a fruitful way to apprehend journalism. With the claim that the majority of practice is anchored in knowledge – understood as 9 categories of rationales, forms and levels – this knowledge base appears as a contextual look at journalists’ knowledge......, and place. As an analytical framework, the knowledge base is limited to understand the practice of newspaper journalists, but, conversely, the knowledge base encompasses more general beginnings through the inclusion of overall structural relationships in the media and journalism and general theories...... on practice and knowledge. As the result of an abductive reasoning is a theory proposal, there is a need for more deductive approaches to test the validity of this knowledge base claim. It is thus relevant to investigate which rationales are included in the knowledge base of journalism, as the dimension does...

  4. dBASE IV basics

    Energy Technology Data Exchange (ETDEWEB)

    O'Connor, P.

    1994-09-01

    This is a user's manual for dBASE IV. dBASE IV is a popular software application that can be used on your personal computer to help organize and maintain your database files. It is actually a set of tools with which you can create, organize, select and manipulate data in a simple yet effective manner. dBASE IV offers three methods of working with the product: (1) control center; (2) command line; and (3) programming.

  5. Principles of models based engineering

    Energy Technology Data Exchange (ETDEWEB)

    Dolin, R.M.; Hefele, J.

    1996-11-01

    This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.

  6. Consortia based production of biochemicals

    DEFF Research Database (Denmark)

    Ingemann Jensen, Sheila; Sukumara, Sumesh; Özdemir, Emre

    -based modelling, and state-of-the art metabolic engineering tools to develop a consortium of cells capable of efficient valorization of synthetic hemicellulosic hydrolysate. Stable co-existence and effective covalorization was achieved through niche-differentiation, auxotrophy, and adaptive evolution. In another...... study, stable consortia based fermentation was achieved through niche partitioning, syntrophy (auxotrophy combined with removal of inhibitory side product), and CRISPRi mediated gene silencing. The achieved results demonstrate that consortium based approaches for valorizing complex biomass and waste...

  7. Trace-Based Code Generation for Model-Based Testing

    NARCIS (Netherlands)

    Kanstrén, T.; Piel, E.; Gross, H.-G.

    2009-01-01

    Paper Submitted for review at the Eighth International Conference on Generative Programming and Component Engineering. Model-based testing can be a powerful means to generate test cases for the system under test. However, creating a useful model for model-based testing requires expertise in the (fo

  9. Methods in Logic Based Control

    DEFF Research Database (Denmark)

    Christensen, Georg Kronborg

    1999-01-01

    Design and theory of Logic Based Control systems: Boolean algebra, Karnaugh maps, the Quine-McCluskey algorithm. Sequential control design. Logic Based Control Method, Cascade Control Method. Implementation techniques: relay, pneumatic, TTL/CMOS, PAL, and PLC- and Soft_PLC implementation. PLC-design met...

  10. Capability-based computer systems

    CERN Document Server

    Levy, Henry M

    2014-01-01

    Capability-Based Computer Systems focuses on computer programs and their capabilities. The text first elaborates capability- and object-based system concepts, including capability-based systems, object-based approach, and summary. The book then describes early descriptor architectures and explains the Burroughs B5000, Rice University Computer, and Basic Language Machine. The text also focuses on early capability architectures. Dennis and Van Horn's Supervisor; CAL-TSS System; MIT PDP-1 Timesharing System; and Chicago Magic Number Machine are discussed. The book then describes Plessey System 25

  11. On Pimsner-Popa bases

    Indian Academy of Sciences (India)

    Keshab Chandra Bakshi

    2017-02-01

    In this paper, we examine bases for finite index inclusion of ${\rm II}_1$ factors and connected inclusion of finite dimensional $C^\ast$-algebras. These bases behave nicely with respect to basic construction towers. As applications we have studied automorphisms of the hyperfinite ${\rm II}_1$ factor $R$ which are ‘compatible with respect to the Jones’ tower of finite dimensional $C^\ast$-algebras’. As a further application, in both cases we obtain a characterization, in terms of bases, of basic constructions. Finally we use these bases to describe the phenomenon of multistep basic constructions (in both the cases).

  12. Web-based support systems

    CERN Document Server

    Yao, JingTao

    2010-01-01

    The emerging interdisciplinary study of Web-based support systems focuses on the theories, technologies and tools for the design and implementation of Web-based systems that support various human activities. This book presents the state-of-the-art in Web-based support systems (WSS). The research on WSS is multidisciplinary and focuses on supporting various human activities in different domains/fields based on computer science, information technology, and Web technology. The main goal is to take the opportunities of the Web, to meet the challenges of the Web, to extend the human physical limita

  13. Consumption-based Equity Valuation

    DEFF Research Database (Denmark)

    Bach, Christian; Christensen, Peter O.

    2016-01-01

    Using a CCAPM-based risk-adjustment model, we perform yearly valuations of a large sample of stocks listed on NYSE, AMEX, and NASDAQ over a 30-year period. The model differs from standard valuation models in the sense that it adjusts forecasted residual income for risk in the numerator rather than...... hedge returns based on the best performing standard valuation model for holding periods from 1 to 5 years. In a statistical test of 1-year-ahead excess return predictability based on the models' implied pricing errors, the CCAPM-based valuation model is selected as the better model. Using the standard...

  14. Free-Flowing Solutions for CFD

    Science.gov (United States)

    2003-01-01

    Licensed to over 1,500 customers worldwide, an advanced computational fluid dynamics (CFD) post-processor with a quick learning curve is consistently providing engineering solutions, with just the right balance of visual insight and hard data. FIELDVIEW is the premier product of JMSI, Inc., d.b.a. Intelligent Light, a woman-owned, small business founded in 1994 and located in Lyndhurst, New Jersey. In the early 1990s, Intelligent Light entered into a joint development contract with a research based company to commercialize the post-processing FIELDVIEW code. As Intelligent Light established itself, it purchased the exclusive rights to the code, and structured its business solely around the software technology. As a result, it is enjoying profits and growing at a rate of 25 to 30 percent per year. Advancements made from the earliest commercial launch of FIELDVIEW, all the way up to the recently released versions 8 and 8.2 of the program, have been backed by research collaboration with NASA's Langley Research Center, where some of the world's most progressive work in transient (also known as time-varying) CFD takes place.

  15. Natural-pose hand detection in low-resolution images

    Directory of Open Access Journals (Sweden)

    Nyan Bo Bo

    2009-07-01

    Full Text Available Robust real-time hand detection and tracking in video sequences would enable many applications in areas as diverse as human-computer interaction, robotics, security and surveillance, and sign language-based systems. In this paper, we introduce a new approach for detecting human hands that works on single, cluttered, low-resolution images. Our prototype system, which is primarily intended for security applications in which the images are noisy and low-resolution, is able to detect hands as small as 24×24 pixels in cluttered scenes. The system uses grayscale appearance information to classify image sub-windows as either containing or not containing a human hand very rapidly, at the cost of a high false positive rate. To improve on the false positive rate of the main classifier without affecting its detection rate, we introduce a post-processor system that utilizes the geometric properties of skin color blobs. When we test our detector on a test image set containing 106 hands, 92 of those hands are detected (86.8% detection rate), with an average false positive rate of 1.19 false positive detections per image. The rapid detection speed, the high detection rate of 86.8%, and the low false positive rate together ensure that our system is usable as the main detector in a diverse variety of applications requiring robust hand detection and tracking in low-resolution, cluttered scenes.
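
    A hedged sketch of the skin-colour-blob post-processing idea (not the authors' implementation; the YCrCb skin range and the area and overlap criteria are assumptions) might look like this:

    ```python
    # Sketch of the second-stage idea only: reject candidate hand windows that do not
    # overlap a plausibly sized skin-colour blob. Thresholds below are assumptions.
    import cv2
    import numpy as np

    SKIN_LO = np.array([0, 133, 77], dtype=np.uint8)     # hypothetical YCrCb skin range
    SKIN_HI = np.array([255, 173, 127], dtype=np.uint8)

    def skin_blobs(bgr_image):
        ycrcb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YCrCb)
        mask = cv2.inRange(ycrcb, SKIN_LO, SKIN_HI)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= 100]

    def filter_detections(bgr_image, candidate_windows):
        """Keep only candidate (x, y, w, h) windows that intersect a skin blob."""
        blobs = skin_blobs(bgr_image)
        def overlaps(a, b):
            ax, ay, aw, ah = a
            bx, by, bw, bh = b
            return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah
        return [w for w in candidate_windows if any(overlaps(w, b) for b in blobs)]
    ```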

  16. A New Eddy Dissipation Rate Formulation for the Terminal Area PBL Prediction System (TAPPS)

    Science.gov (United States)

    Charney, Joseph J.; Kaplan, Michael L.; Lin, Yuh-Lang; Pfeiffer, Karl D.

    2000-01-01

    The TAPPS employs the MASS model to produce mesoscale atmospheric simulations in support of the Wake Vortex project at Dallas Fort-Worth International Airport (DFW). A post-processing scheme uses the simulated three-dimensional atmospheric characteristics in the planetary boundary layer (PBL) to calculate the turbulence quantities most important to the dissipation of vortices: turbulent kinetic energy and eddy dissipation rate. TAPPS will ultimately be employed to enhance terminal area productivity by providing weather forecasts for the Aircraft Vortex Spacing System (AVOSS). The post-processing scheme utilizes experimental data and similarity theory to determine the turbulence quantities from the simulated horizontal wind field and stability characteristics of the atmosphere. Characteristic PBL quantities important to these calculations are determined based on formulations from the Blackadar PBL parameterization, which is regularly employed in the MASS model to account for PBL processes in mesoscale simulations. The TAPPS forecasts are verified against high-resolution observations of the horizontal winds at DFW. Statistical assessments of the error in the wind forecasts suggest that TAPPS captures the essential features of the horizontal winds with considerable skill. Additionally, the turbulence quantities produced by the post-processor are shown to compare favorably with corresponding tower observations.

  17. Development of 3-D Flow Analysis Code for Fuel Assembly using Unstructured Grid System

    Energy Technology Data Exchange (ETDEWEB)

    Myong, Hyon Kook; Kim, Jong Eun; Ahn, Jong Ki; Yang, Seung Yong [Kookmin Univ., Seoul (Korea, Republic of)

    2007-03-15

    The flow through a nuclear rod bundle with mixing vanes is very complex and requires a suitable turbulence model to be predicted accurately. The final objective of this study is to develop a CFD code for fluid flow and heat transfer analysis in a nuclear fuel assembly using an unstructured grid system. To this end, the following research was carried out: development of the numerical algorithm for the CFD code's solver; grid and geometric connectivity data; development of software (the PowerCFD code) for fluid flow and heat transfer analysis in a nuclear fuel assembly using an unstructured grid system; modulation of the software (PowerCFD code); development of the turbulence model; development of an analysis module for RANS/LES hybrid models; analysis of turbulent flow and heat transfer; a basic study on LES analysis; development of the main frame of the GUI-based pre/post-processors; and an algorithm for fully-developed flow.

  18. An analysis of the uncertainty in temperature and density estimates from fitting model spectra to data. 1998 summer research program for high school juniors at the University of Rochester's Laboratory for Laser Energetics: Student research reports

    Energy Technology Data Exchange (ETDEWEB)

    Schubmehl, M. [Harley School, Rochester, NY (United States)

    1999-03-01

    Temperature and density histories of direct-drive laser fusion implosions are important to an understanding of the reaction's progress. Such measurements also document phenomena such as preheating of the core and improper compression that can interfere with the thermonuclear reaction. Model x-ray spectra from the non-LTE (local thermodynamic equilibrium) radiation transport post-processor for LILAC have recently been fitted to OMEGA data. The spectrum fitting code reads in a grid of model spectra and uses an iterative weighted least-squares algorithm to perform a fit to experimental data, based on user-input parameter estimates. The purpose of this research was to upgrade the fitting code to compute formal uncertainties on fitted quantities, and to provide temperature and density estimates with error bars. A standard error-analysis process was modified to compute these formal uncertainties from information about the random measurement error in the data. Preliminary tests of the code indicate that the variances it returns are both reasonable and useful.
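
    The standard error-analysis recipe alluded to above can be sketched as follows (a generic example with a synthetic model and data, not the LLE fitting code): the formal covariance of the fitted parameters follows from the Jacobian of the whitened residuals.

    ```python
    # Generic weighted least-squares fit with formal 1-sigma uncertainties.
    import numpy as np
    from scipy.optimize import least_squares

    def model(params, x):
        # Hypothetical stand-in for interpolation on a grid of model spectra.
        amplitude, width = params
        return amplitude * np.exp(-0.5 * (x / width) ** 2)

    x = np.linspace(-5.0, 5.0, 200)
    sigma = 0.05 * np.ones_like(x)                      # assumed measurement errors
    rng = np.random.default_rng(0)
    data = model([2.0, 1.3], x) + rng.normal(0.0, sigma)

    def whitened_residuals(params):
        return (model(params, x) - data) / sigma

    fit = least_squares(whitened_residuals, x0=[1.0, 1.0])
    covariance = np.linalg.inv(fit.jac.T @ fit.jac)     # formal covariance of the fit
    errors = np.sqrt(np.diag(covariance))               # error bars on fitted quantities
    print("best fit:", fit.x, "1-sigma errors:", errors)
    ```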

  19. Integrating weather and climate predictions for seamless hydrologic ensemble forecasting: A case study in the Yalong River basin

    Science.gov (United States)

    Ye, Aizhong; Deng, Xiaoxue; Ma, Feng; Duan, Qingyun; Zhou, Zheng; Du, Chao

    2017-04-01

    Despite the tremendous improvement made in numerical weather and climate models over the recent years, the forecasts generated by those models still cannot be used directly for hydrological forecasting. A post-processor like the Ensemble Pre-Processor (EPP) developed by the U.S. National Weather Service must be used to remove various biases and to extract useful predictive information from those forecasts. In this paper, we investigate how different designs of canonical events in the EPP can help post-process precipitation forecasts from the Global Ensemble Forecast System (GEFS) and Climate Forecast System Version 2 (CFSv2). The use of canonical events allows those products to be linked seamlessly, and the post-processed ensemble precipitation forecasts can then be generated using the Schaake Shuffle procedure. We used the post-processed ensemble precipitation forecasts to drive a distributed hydrological model to obtain ensemble streamflow forecasts and evaluated those forecasts against the observed streamflow. We found that the careful design of canonical events can help extract more useful information, especially when up-to-date observed precipitation is used to set up the canonical events. We also found that streamflow forecasts using post-processed precipitation forecasts have longer lead times and higher accuracy than streamflow forecasts made by traditional Extended Streamflow Prediction (ESP) and the forecasts based on original GEFS and CFSv2 precipitation forecasts.

  20. Solving iTOUGH2 simulation and optimization problems using the PEST protocol

    Energy Technology Data Exchange (ETDEWEB)

    Finsterle, S.A.; Zhang, Y.

    2011-02-01

    The PEST protocol has been implemented into the iTOUGH2 code, allowing the user to link any simulation program (with ASCII-based inputs and outputs) to iTOUGH2's sensitivity analysis, inverse modeling, and uncertainty quantification capabilities. These application models can be pre- or post-processors of the TOUGH2 non-isothermal multiphase flow and transport simulator, or programs that are unrelated to the TOUGH suite of codes. PEST-style template and instruction files are used, respectively, to pass input parameters updated by the iTOUGH2 optimization routines to the model, and to retrieve the model-calculated values that correspond to observable variables. We summarize the iTOUGH2 capabilities and demonstrate the flexibility added by the PEST protocol for the solution of a variety of simulation-optimization problems. In particular, the combination of loosely coupled and tightly integrated simulation and optimization routines provides both the flexibility and control needed to solve challenging inversion problems for the analysis of multiphase subsurface flow and transport systems.
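
    The template/instruction mechanism can be illustrated with a toy sketch (simplified, hypothetical file conventions rather than actual PEST syntax): parameter placeholders are substituted into the model input, the application model is run, and observable values are read back from its output.

    ```python
    # Conceptual sketch of a PEST-style coupling; file names, placeholder markers and
    # the executable are hypothetical, and the real PEST formats are richer than this.
    import re
    import subprocess

    def write_input(template_path, input_path, params):
        text = open(template_path).read()
        for name, value in params.items():
            text = text.replace(f"@{name}@", f"{value:.6e}")   # assumed placeholder style
        open(input_path, "w").write(text)

    def read_outputs(output_path, markers):
        """Return the last number on each output line that starts with a marker string."""
        values = {}
        for line in open(output_path):
            for marker in markers:
                if line.startswith(marker):
                    values[marker] = float(re.findall(r"[-+0-9.eE]+", line)[-1])
        return values

    write_input("model.tpl", "model.in", {"permeability": 1.2e-14, "porosity": 0.11})
    subprocess.run(["./application_model", "model.in"], check=True)  # hypothetical model
    print(read_outputs("model.out", ["PRESSURE_OBS_1", "TEMP_OBS_1"]))
    ```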

  1. NCC: A Multidisciplinary Design/Analysis Tool for Combustion Systems

    Science.gov (United States)

    Liu, Nan-Suey; Quealy, Angela

    1999-01-01

    A multi-disciplinary design/analysis tool for combustion systems is critical for optimizing the low-emission, high-performance combustor design process. Based on discussions between NASA Lewis Research Center and the jet engine companies, an industry-government team was formed in early 1995 to develop the National Combustion Code (NCC), which is an integrated system of computer codes for the design and analysis of combustion systems. NCC has advanced features that address the need to meet designer's requirements such as "assured accuracy", "fast turnaround", and "acceptable cost". The NCC development team is comprised of Allison Engine Company (Allison), CFD Research Corporation (CFDRC), GE Aircraft Engines (GEAE), NASA Lewis Research Center (LeRC), and Pratt & Whitney (P&W). This development team operates under the guidance of the NCC steering committee. The "unstructured mesh" capability and "parallel computing" are fundamental features of NCC from its inception. The NCC system is composed of a set of "elements" which includes grid generator, main flow solver, turbulence module, turbulence and chemistry interaction module, chemistry module, spray module, radiation heat transfer module, data visualization module, and a post-processor for evaluating engine performance parameters. Each element may have contributions from several team members. Such a multi-source multi-element system needs to be integrated in a way that facilitates inter-module data communication, flexibility in module selection, and ease of integration.

  2. Creep fatigue assessment for EUROFER components

    Energy Technology Data Exchange (ETDEWEB)

    Özkan, Furkan, E-mail: oezkan.furkan@partner.kit.edu; Aktaa, Jarir

    2015-11-15

    Highlights: • Design rules for creep-fatigue assessment are developed for EUROFER components. • A creep-fatigue assessment tool is developed in FORTRAN and coupled with MAPDL. • The durability of the HCPB-TBM design is discussed under typical fusion reactor loads. - Abstract: Creep-fatigue of test blanket module (TBM) components built from EUROFER is evaluated based on the elastic analysis approach of the ASME Boiler and Pressure Vessel Code (BPVC). The design fatigue curve (allowable number of cycles) and the stress-to-rupture curve required to estimate the creep-fatigue damage are taken from the literature. Local stress, strain and temperature inputs for the creep-fatigue damage analysis are delivered by the finite element code ANSYS through the Mechanical ANSYS Parametric Design Language (MAPDL). An external FORTRAN code developed as a post-processor is coupled with MAPDL. The influences of different pulse durations (hold times) and of irradiation on the creep-fatigue damage are discussed for the First Wall component of the TBM box, for the preliminary design of the Helium Cooled Pebble Bed Test Blanket Module (HCPB-TBM).
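
    A minimal sketch of the generic elastic-route creep-fatigue bookkeeping (an illustration under assumed design curves and an assumed bilinear damage envelope, not the authors' tool):

    ```python
    # Fatigue damage by Miner's rule, creep damage by time fractions, checked against
    # an assumed bilinear damage envelope. All numbers below are hypothetical.
    import numpy as np

    def fatigue_damage(strain_ranges, cycles, design_fatigue_curve):
        """Sum n_i / N_i using an interpolated design fatigue curve (strain range -> allowable cycles)."""
        eps, n_allow = design_fatigue_curve
        allowable = np.interp(strain_ranges, eps, n_allow)
        return float(np.sum(np.asarray(cycles) / allowable))

    def creep_damage(hold_times_h, rupture_times_h):
        """Sum t_hold / t_rupture, rupture times coming from a stress-to-rupture curve."""
        return float(np.sum(np.asarray(hold_times_h) / np.asarray(rupture_times_h)))

    def acceptable(d_fatigue, d_creep, intersection=(0.3, 0.3)):
        """Check the (Df, Dc) point against an assumed bilinear envelope through `intersection`."""
        xi, yi = intersection
        if d_fatigue <= xi:
            limit = 1.0 + (yi - 1.0) / xi * d_fatigue        # segment from (0, 1) to (xi, yi)
        else:
            limit = yi * (1.0 - d_fatigue) / (1.0 - xi)      # segment from (xi, yi) to (1, 0)
        return d_creep <= limit

    curve = ([0.2, 0.5, 1.0], [1.0e6, 2.0e4, 1.0e3])   # strain range [%] vs allowable cycles
    df = fatigue_damage([0.4, 0.8], [500, 50], curve)
    dc = creep_damage([1000.0, 200.0], [5.0e4, 8.0e3])
    print(df, dc, acceptable(df, dc))
    ```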

  3. KIVA-3V, Release 2: Improvements to KIVA-3V

    Energy Technology Data Exchange (ETDEWEB)

    Anthony A. Amsden

    1999-05-01

    This report describes the changes made in the KIVA-3V computer program since its initial release version dated 24 March 1997. A variety of new features enhance the robustness, efficiency, and usefulness of the overall program for engine modeling. Among these are an automatic restart of the cycle with a reduced timestep in case of iteration limit or temperature overflow, which should greatly reduce the likelihood of having the code crash in mid run. A new option is the automatic deactivation of a port region when it is closed off from the engine cylinder and its reactivation when it again communicates with the cylinder. A number of corrections throughout the code improve accuracy, one of which also corrects the 2-D planar option to make it properly independent of the third dimension. Extensions to the particle-based liquid wall film model make the model somewhat more complete, although it is still considered a work-in-progress. In response to current research in fuel-injected engines, a split-injection option has been added. A new subroutine monitors the whereabouts of the liquid and gaseous phases of the fuel, and for combustion runs the energy balance data and emissions are monitored and printed. New features in the grid generator K3PREP and the graphics post-processor K3POST are also discussed.

  4. An analysis of the uncertainty in temperature and density estimates from fitting model spectra to data. 1998 summer research program for high school juniors at the University of Rochester's Laboratory for Laser Energetics: Student research reports

    Energy Technology Data Exchange (ETDEWEB)

    Schubmehl, M. [Harley School, Rochester, NY (United States)

    1999-03-01

    Temperature and density histories of direct-drive laser fusion implosions are important to an understanding of the reaction's progress. Such measurements also document phenomena such as preheating of the core and improper compression that can interfere with the thermonuclear reaction. Model x-ray spectra from the non-LTE (local thermodynamic equilibrium) radiation transport post-processor for LILAC have recently been fitted to OMEGA data. The spectrum fitting code reads in a grid of model spectra and uses an iterative weighted least-squares algorithm to perform a fit to experimental data, based on user-input parameter estimates. The purpose of this research was to upgrade the fitting code to compute formal uncertainties on fitted quantities, and to provide temperature and density estimates with error bars. A standard error-analysis process was modified to compute these formal uncertainties from information about the random measurement error in the data. Preliminary tests of the code indicate that the variances it returns are both reasonable and useful.

  5. Improving medium-range ensemble streamflow forecasts through statistical post-processing

    Science.gov (United States)

    Mendoza, Pablo; Wood, Andy; Clark, Elizabeth; Nijssen, Bart; Clark, Martyn; Ramos, Maria-Helena; Nowak, Kenneth; Arnold, Jeffrey

    2017-04-01

    Probabilistic hydrologic forecasts are a powerful source of information for decision-making in water resources operations. A common approach is the hydrologic model-based generation of streamflow forecast ensembles, which can be implemented to account for different sources of uncertainties - e.g., from initial hydrologic conditions (IHCs), weather forecasts, and hydrologic model structure and parameters. In practice, hydrologic ensemble forecasts typically have biases and spread errors stemming from errors in the aforementioned elements, resulting in a degradation of probabilistic properties. In this work, we compare several statistical post-processing techniques applied to medium-range ensemble streamflow forecasts obtained with the System for Hydromet Applications, Research and Prediction (SHARP). SHARP is a fully automated prediction system for the assessment and demonstration of short-term to seasonal streamflow forecasting applications, developed by the National Center for Atmospheric Research, University of Washington, U.S. Army Corps of Engineers, and U.S. Bureau of Reclamation. The suite of post-processing techniques includes linear blending, quantile mapping, extended logistic regression, quantile regression, ensemble analogs, and the generalized linear model post-processor (GLMPP). We assess and compare these techniques using multi-year hindcasts in several river basins in the western US. This presentation discusses preliminary findings about the effectiveness of the techniques for improving probabilistic skill, reliability, discrimination, sharpness and resolution.
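
    One of the listed techniques, empirical quantile mapping, can be sketched in its simplest form (illustrative only, with synthetic climatologies; the SHARP post-processors are considerably more elaborate):

    ```python
    # Map each raw ensemble member from the forecast climatology onto the observed
    # climatology at the same empirical quantile.
    import numpy as np

    def quantile_map(raw_members, hindcast_values, observed_values):
        """Map ensemble members to the observed distribution at matching empirical quantiles."""
        hindcast_sorted = np.sort(hindcast_values)
        observed_sorted = np.sort(observed_values)
        # empirical non-exceedance probability of each member within the hindcast climatology
        probs = np.searchsorted(hindcast_sorted, raw_members, side="right") / (len(hindcast_sorted) + 1)
        return np.quantile(observed_sorted, np.clip(probs, 0.0, 1.0))

    rng = np.random.default_rng(1)
    hindcasts = rng.gamma(2.0, 50.0, size=1000)      # hypothetical flow hindcast climatology
    observations = rng.gamma(2.0, 65.0, size=1000)   # hypothetical observed climatology
    raw_forecast = np.array([60.0, 90.0, 140.0, 220.0])
    print(quantile_map(raw_forecast, hindcasts, observations))
    ```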

  6. NCC: A Multidisciplinary Design/Analysis Tool for Combustion Systems

    Science.gov (United States)

    Liu, Nan-Suey; Quealy, Angela

    1999-01-01

    A multi-disciplinary design/analysis tool for combustion systems is critical for optimizing the low-emission, high-performance combustor design process. Based on discussions between NASA Lewis Research Center and the jet engine companies, an industry-government team was formed in early 1995 to develop the National Combustion Code (NCC), which is an integrated system of computer codes for the design and analysis of combustion systems. NCC has advanced features that address the need to meet designer's requirements such as "assured accuracy", "fast turnaround", and "acceptable cost". The NCC development team is comprised of Allison Engine Company (Allison), CFD Research Corporation (CFDRC), GE Aircraft Engines (GEAE), NASA Lewis Research Center (LeRC), and Pratt & Whitney (P&W). This development team operates under the guidance of the NCC steering committee. The "unstructured mesh" capability and "parallel computing" are fundamental features of NCC from its inception. The NCC system is composed of a set of "elements" which includes grid generator, main flow solver, turbulence module, turbulence and chemistry interaction module, chemistry module, spray module, radiation heat transfer module, data visualization module, and a post-processor for evaluating engine performance parameters. Each element may have contributions from several team members. Such a multi-source multi-element system needs to be integrated in a way that facilitates inter-module data communication, flexibility in module selection, and ease of integration.

  7. Cloud-Based Data Storage

    Science.gov (United States)

    Waters, John K.

    2011-01-01

    The vulnerability and inefficiency of backing up data on-site is prompting school districts to switch to more secure, less troublesome cloud-based options. District auditors are pushing for a better way to back up their data than the on-site, tape-based system that had been used for years. About three years ago, Hendrick School District in…

  8. Team-Based Global Organizations

    DEFF Research Database (Denmark)

    Zander, Lena; Butler, Christina Lea; Mockaitis, Audra;

    2015-01-01

    Purpose-We propose team-based organizing as an alternative to more traditional forms of hierarchy-based organizing in global firms. Methodology/approach-Advancements in the study of global teams, leadership, process, and outcomes were organized into four themes: (1) openness toward linguistic and...

  9. Eye-based head gestures

    DEFF Research Database (Denmark)

    Mardanbegi, Diako; Witzner Hansen, Dan; Pederson, Thomas

    2012-01-01

    A novel method for video-based head gesture recognition using eye information from an eye tracker is proposed. The method uses a combination of gaze and eye movement to infer head gestures. Compared to other gesture-based methods, a major advantage of the method is that the user keeps the gaze...

  10. Dog Mathematics: Exploring Base-4

    Science.gov (United States)

    Kurz, Terri L.; Yanik, H. Bahadir; Lee, Mi Yeon

    2016-01-01

    Using a dog's paw as a basis for numerical representation, sixth grade students explored how to count and regroup using the dog's four digital pads. Teachers can connect these base-4 explorations to the conceptual meaning of place value and regrouping using base-10.
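
    The regrouping idea translates directly into a short base-4 conversion, for example:

    ```python
    # Regroup a base-10 count into powers of four ("dog math"):
    # e.g. 27 treats -> "123" in base-4, i.e. 1*16 + 2*4 + 3*1.
    def to_base4(n: int) -> str:
        if n == 0:
            return "0"
        digits = []
        while n > 0:
            n, remainder = divmod(n, 4)
            digits.append(str(remainder))
        return "".join(reversed(digits))

    print(to_base4(27))   # -> "123"
    ```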

  11. Dimensions of problem based learning

    DEFF Research Database (Denmark)

    Nielsen, Jørgen Lerche; Andreasen, Lars Birch

    2013-01-01

    The article contributes to the literature on problem based learning and problem-oriented project work, building on and reflecting the experiences of the authors through decades of work with problem-oriented project pedagogy. The article explores different dimensions of problem based learning...

  12. Forest biomass-based energy

    Science.gov (United States)

    Janaki R. R. Alavalapati; Pankaj Lal; Andres Susaeta; Robert C. Abt; David N. Wear

    2013-01-01

    Key Findings: Harvesting woody biomass for use as bioenergy is projected to range from 170 million to 336 million green tons by 2050, an increase of 54 to 113 percent over current levels. Consumption projections for forest biomass-based energy, which are based on Energy Information Administration projections, have a high level of...

  13. Activity-based proteasome profiling

    NARCIS (Netherlands)

    Li, Nan

    2013-01-01

    The work described in this thesis is mainly focusing on setting up and application of a quantitative activity‐based proteasome profiling method. Chapter 1 provides a general introduction on the ubiquitin proteasome system (UPS) and activity‐based proteasome profiling. Chapter 2 is a literature

  14. Theory-Based Stakeholder Evaluation

    Science.gov (United States)

    Hansen, Morten Balle; Vedung, Evert

    2010-01-01

    This article introduces a new approach to program theory evaluation called theory-based stakeholder evaluation or the TSE model for short. Most theory-based approaches are program theory driven and some are stakeholder oriented as well. Practically, all of the latter fuse the program perceptions of the various stakeholder groups into one unitary…

  15. Evidence-Based Laboratory Medicine

    Institute of Scientific and Technical Information of China (English)

    Christopher P Price

    2004-01-01

    Whilst there have been several definitions of Evidence-Based Medicine (EBM), the one given by David Sackett is probably the most accurate and well accepted; he stated that "evidence-based medicine is the conscientious, explicit and judicious use of current best evidence in making decisions about the care of individual patients"[1].

  16. Component Based Testing with ioco

    NARCIS (Netherlands)

    van der Bijl, H.M.; Rensink, Arend; Tretmans, G.J.

    Component based testing concerns the integration of components which have already been tested separately. We show that, with certain restrictions, the ioco-test theory for conformance testing is suitable for component based testing, in the sense that the integration of fully conformant components is

  17. Knowledge-Based Asynchronous Programming

    NARCIS (Netherlands)

    Haan, Hendrik Wietze de; Hesselink, Wim H.; Renardel de Lavalette, Gerard R.

    2004-01-01

    A knowledge-based program is a high-level description of the behaviour of agents in terms of knowledge that an agent must have before (s)he may perform an action. The definition of the semantics of knowledge-based programs is problematic, since it involves a vicious circle; the knowledge of an agent

  18. Plasma-based accelerator structures

    Energy Technology Data Exchange (ETDEWEB)

    Schroeder, Carl B. [Univ. of California, Berkeley, CA (United States)

    1999-12-01

    Plasma-based accelerators have the ability to sustain extremely large accelerating gradients, with possible high-energy physics applications. This dissertation further develops the theory of plasma-based accelerators by addressing three topics: the performance of a hollow plasma channel as an accelerating structure, the generation of ultrashort electron bunches, and the propagation of laser pulses in underdense plasmas.

  19. Mitochondrial base excision repair assays

    DEFF Research Database (Denmark)

    Maynard, Scott; de Souza-Pinto, Nadja C; Scheibye-Knudsen, Morten

    2010-01-01

    The main source of mitochondrial DNA (mtDNA) damage is reactive oxygen species (ROS) generated during normal cellular metabolism. The main mtDNA lesions generated by ROS are base modifications, such as the ubiquitous 8-oxoguanine (8-oxoG) lesion; however, base loss and strand breaks may also occur...

  20. Methods in Logic Based Control

    DEFF Research Database (Denmark)

    Christensen, Georg Kronborg

    1999-01-01

    Design and theory of Logic Based Control systems: Boolean algebra, Karnaugh maps, the Quine-McCluskey algorithm. Sequential control design. Logic Based Control Method, Cascade Control Method. Implementation techniques: relay, pneumatic, TTL/CMOS, PAL, and PLC- and Soft_PLC implementation. PLC...

  1. GPP-Based Soft Base Station Designing and Optimization

    Institute of Scientific and Technical Information of China (English)

    Xiao-Feng Tao; Yan-Zhao Hou; Kai-Dong Wang; Hai-Yang He; Y.Jay Guo

    2013-01-01

    It is generally acknowledged that mobile communication base stations are composed of hardware components such as Field Programmable Gate Arrays (FPGA) and Digital Signal Processors (DSP), which promise reliable and fluent services for mobile users. However, with the increasing demand for energy efficiency, approaches offering low power consumption and high flexibility are urgently needed. In this circumstance, the General Purpose Processor (GPP) attracts attention for its low cost and flexibility. Benefiting from the development of modern GPPs in multi-core architectures, Single Instruction Multiple Data (SIMD) instructions, larger caches, etc., GPPs are capable of performing high-density digital processing. In this paper, we compare several software-defined radio (SDR) prototypes and propose a general architecture for GPP-based soft base stations. Then, the schematic design of resource allocation and algorithm optimization in soft base station implementation is studied. As an application example, a prototype of a GPP-based soft base station referring to the 3rd Generation Partnership Project (3GPP) Long Term Evolution (LTE) is realized and evaluated. To the best of our knowledge, it is the first Soft-LTE prototype ever reported. In the end, we evaluate the timing performance of the LTE soft base station, and a packet loss ratio of less than 0.003 is obtained.

  2. Schiff Bases: A Versatile Pharmacophore

    Directory of Open Access Journals (Sweden)

    Anu Kajal

    2013-01-01

    Full Text Available Schiff bases are condensation products of primary amines with carbonyl compounds that are gaining importance day by day in the present scenario. Schiff bases are compounds carrying the imine or azomethine (–C=N–) functional group and are found to be a versatile pharmacophore for the design and development of various bioactive lead compounds. Schiff bases exhibit useful biological activities such as anti-inflammatory, analgesic, antimicrobial, anticonvulsant, antitubercular, anticancer, antioxidant, anthelmintic, antiglycation, and antidepressant activities. Schiff bases are also used as catalysts, pigments and dyes, intermediates in organic synthesis, polymer stabilizers, and corrosion inhibitors. The present review summarizes information on the diverse biological activities and also highlights the numerous recently synthesized Schiff bases as potential bioactive cores.

  3. Lignin-Based Thermoplastic Materials.

    Science.gov (United States)

    Wang, Chao; Kelley, Stephen S; Venditti, Richard A

    2016-04-21

    Lignin-based thermoplastic materials have attracted increasing interest as sustainable, cost-effective, and biodegradable alternatives for petroleum-based thermoplastics. As an amorphous thermoplastic material, lignin has a relatively high glass-transition temperature and also undergoes radical-induced self-condensation at high temperatures, which limits its thermal processability. Additionally, lignin-based materials are usually brittle and exhibit poor mechanical properties. To improve the thermoplasticity and mechanical properties of technical lignin, polymers or plasticizers are usually integrated with lignin by blending or chemical modification. This Review attempts to cover the reported approaches towards the development of lignin-based thermoplastic materials on the basis of published information. Approaches reviewed include plasticization, blending with miscible polymers, and chemical modifications by esterification, etherification, polymer grafting, and copolymerization. Those lignin-based thermoplastic materials are expected to show applications as engineering plastics, polymeric foams, thermoplastic elastomers, and carbon-fiber precursors.

  4. Tag Based Audio Search Engine

    Directory of Open Access Journals (Sweden)

    Parameswaran Vellachu

    2012-03-01

    Full Text Available The volume of the music database is increasing day by day. Getting the required song as per the choice of the listener is a big challenge. Hence, it is really hard to manage this huge quantity in terms of searching and filtering through the music database. It is surprising to see that the audio and music industry still relies on very simplistic metadata to describe music files. However, while searching audio resources, an efficient "Tag Based Audio Search Engine" is necessary. The current research focuses on two aspects of musical databases: 1. Tag Based Semantic Annotation Generation using the tag based approach; 2. An audio search engine, using which the user can retrieve songs based on the user's choice. The proposed method can be used to annotate and retrieve songs based on the musical instruments used, mood of the song, theme of the song, singer, music director, artist, film director, instrument, genre or style, and so on.

  5. Content-Based Image Retrieval Based on Hadoop

    Directory of Open Access Journals (Sweden)

    DongSheng Yin

    2013-01-01

    Full Text Available Generally, the time complexity of algorithms for content-based image retrieval is extremely high. In order to retrieve images from large-scale databases efficiently, a new retrieval approach based on the Hadoop distributed framework is proposed. Firstly, a database of image features is built by using the Speeded Up Robust Features algorithm and Locality-Sensitive Hashing, and the search is then performed on the Hadoop platform in a specially designed parallel way. Considerable experimental results show that it is able to retrieve images by content over large-scale clusters and image sets effectively.
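
    As a sketch of the hashing step (the record does not specify the LSH family, so a common choice, random-hyperplane signatures, is assumed here), descriptors can be bucketed so that near-duplicate features tend to land in the same bucket:

    ```python
    # Random-hyperplane LSH over feature descriptors; one hash table, no multi-probe.
    import numpy as np
    from collections import defaultdict

    class HyperplaneLSH:
        def __init__(self, dim: int, n_bits: int = 16, seed: int = 0):
            self.planes = np.random.default_rng(seed).normal(size=(n_bits, dim))
            self.buckets = defaultdict(list)

        def signature(self, descriptor: np.ndarray) -> int:
            bits = (self.planes @ descriptor > 0).astype(int)
            return int("".join(map(str, bits)), 2)

        def index(self, image_id: str, descriptors: np.ndarray) -> None:
            for d in descriptors:
                self.buckets[self.signature(d)].append(image_id)

        def query(self, descriptor: np.ndarray) -> list:
            return self.buckets.get(self.signature(descriptor), [])

    # Toy usage with random 64-dimensional descriptors (SURF descriptors are 64- or 128-D).
    lsh = HyperplaneLSH(dim=64)
    lsh.index("img_001", np.random.default_rng(1).normal(size=(50, 64)))
    print(lsh.query(np.random.default_rng(1).normal(size=64)))
    ```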

  6. Comparative Study of Triangulation based and Feature based Image Morphing

    Directory of Open Access Journals (Sweden)

    Ms. Bhumika G. Bhatt

    2012-01-01

    Full Text Available Image morphing is one of the most powerful digital image processing techniques, used to enhance many multimedia projects, presentations, education and computer based training. It is also used in the medical imaging field to recover features not visible in images by establishing correspondence of features among successive pairs of scanned images. This paper discusses what morphing is and the implementation of the triangulation based morphing technique and feature based image morphing. It analyzes both morphing techniques in terms of different attributes such as computational complexity, visual quality of the morph obtained, and the complexity involved in the selection of features.

  7. Value-based metrics and Internet-based enterprises

    Science.gov (United States)

    Gupta, Krishan M.

    2001-10-01

    Within the last few years, a host of value-based metrics like EVA, MVA, TBR, CFORI, and TSR have evolved. This paper attempts to analyze the validity and applicability of EVA and Balanced Scorecard for Internet based organizations. Despite the collapse of the dot-com model, the firms engaged in e- commerce continue to struggle to find new ways to account for customer-base, technology, employees, knowledge, etc, as part of the value of the firm. While some metrics, like the Balance Scorecard are geared towards internal use, others like EVA are for external use. Value-based metrics are used for performing internal audits as well as comparing firms against one another; and can also be effectively utilized by individuals outside the firm looking to determine if the firm is creating value for its stakeholders.

  8. G-Frames, g-orthonormal bases and g-Riesz bases

    Directory of Open Access Journals (Sweden)

    Seyedeh Sara Karimizad

    2014-05-01

    Full Text Available G-frames in Hilbert spaces are a redundant set of operators which yield a representation for each vector in the space. In this paper we investigate the connection between g-frames, g-orthonormal bases and g-Riesz bases. We show that a family of bounded operators is a g-Bessel sequence if and only if the Gram matrix associated to it defines a bounded operator.
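
    For orientation, the standard definitions as commonly stated in the g-frame literature (notation may differ slightly from the paper):

    ```latex
    % Standard definitions, shown for reference only.
    % H is a Hilbert space and {H_j : j in J} a family of Hilbert spaces.
    A family $\{\Lambda_j \in B(H,H_j) : j \in J\}$ is a \emph{g-frame} for $H$ with respect to
    $\{H_j\}_{j \in J}$ if there exist constants $0 < A \le B < \infty$ such that
    \[
      A\,\|f\|^{2} \;\le\; \sum_{j \in J} \|\Lambda_j f\|^{2} \;\le\; B\,\|f\|^{2}
      \qquad \text{for all } f \in H.
    \]
    If only the upper inequality is required, the family is a \emph{g-Bessel sequence}.
    ```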

  9. Base isolation system and verificational experiment of base isolated building

    Energy Technology Data Exchange (ETDEWEB)

    Takeuchi, Mikio; Harada, Osamu; Aoyagi, Sakae; Matsuda, Taiji

    1987-05-15

    With the objective of rationalizing earthquake resistant design and the economical design based thereupon, many base isolation systems have been proposed, and their research, development and application have advanced in recent years. In order to disseminate the system, it is necessary to accumulate the data obtained from vibration tests and earthquake observations and verify the reliability of the system. From this viewpoint, the Central Research Institute of Electric Power Industry and Okumura Corporation carried out the following experiments with a base isolated building as the object: 1) static power application experiments, 2) shaking experiments, 3) free vibration experiments, 4) regular slight vibration observations and 5) earthquake response observations (continuing). This article reports the outline of the base isolation system and the base isolated building concerned as well as the results of verification experiments 1) through 3) above. From the results of these verification experiments, the basic vibration characteristics of the base isolation system consisting of laminated rubber and a plastic damper were revealed and its functions were verified. In particular, during the free vibration experiments an initial displacement of up to a maximum of 10 cm was applied to the portion between the foundation and the structure; this displacement corresponds to the response amplitude in the case of an earthquake of seismic intensity of the 6th degree. It is planned to continue the verification further. (18 figs, 3 tabs, 3 photos, 6 refs)

  10. Context based multimedia information retrieval

    DEFF Research Database (Denmark)

    Mølgaard, Lasse Lohilahti

    with the help of contextual knowledge. Our approach to model the context of multimedia is based on unsupervised methods to automatically extract meaning. We investigate two paths of context modelling. The first part extracts context from the primary media, in this case broadcast news speech, by extracting...... through an approximation based on non-negative matrix factorisation NMF. The second part of the work tries to infer the contextual meaning of music based on extra-musical knowledge, in our case gathered from Wikipedia. The semantic relations between artists are inferred using linking structure...

  11. Solar based hydrogen production systems

    CERN Document Server

    Dincer, Ibrahim

    2013-01-01

    This book provides a comprehensive analysis of various solar based hydrogen production systems. The book covers first-law (energy based) and second-law (exergy based) efficiencies and provides a comprehensive understanding of their implications. It will help minimize the widespread misuse of efficiencies among students and researchers in the energy field by using an intuitive and unified approach for defining efficiencies. The book gives a clear understanding of the sustainability and environmental impact analysis of the above systems. The book will be particularly useful for a clear understanding

  12. Wavelength conversion based spectral imaging

    DEFF Research Database (Denmark)

    Dam, Jeppe Seidelin

    There has been a strong, application driven development of Si-based cameras and spectrometers for imaging and spectral analysis of light in the visible and near infrared spectral range. This has resulted in very efficient devices, with high quantum efficiency, good signal to noise ratio and high...... resolution for this spectral region. Today, an increasing number of applications exists outside the spectral region covered by Si-based devices, e.g. within cleantech, medical or food imaging. We present a technology based on wavelength conversion which will extend the spectral coverage of state of the art...... visible or near infrared cameras and spectrometers to include other spectral regions of interest....

  13. Graph Model Based Indoor Tracking

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Lu, Hua; Yang, Bin

    2009-01-01

    infrastructure for different symbolic positioning technologies, e.g., Bluetooth and RFID. More specifically, the paper proposes a model of indoor space that comprises a base graph and mappings that represent the topology of indoor space at different levels. The resulting model can be used for one or several...... indoor positioning technologies. Focusing on RFID-based positioning, an RFID specific reader deployment graph model is built from the base graph model. This model is then used in several algorithms for constructing and refining trajectories from raw RFID readings. Empirical studies with implementations...

  14. Seeing graphene-based sheets

    Directory of Open Access Journals (Sweden)

    Jaemyung Kim

    2010-03-01

    Full Text Available Graphene-based sheets such as graphene, graphene oxide and reduced graphene oxide have stimulated great interest due to their promising electronic, mechanical and thermal properties. Microscopy imaging is indispensable for characterizing these single atomic layers, and oftentimes is the first measure of sample quality. This review provides an overview of current imaging techniques for graphene-based sheets and highlights a recently developed fluorescence quenching microscopy technique that allows high-throughput, high-contrast imaging of graphene-based sheets on arbitrary substrate and even in solution.

  15. Formal Component-Based Semantics

    CERN Document Server

    Madlener, Ken; van Eekelen, Marko; 10.4204/EPTCS.62.2

    2011-01-01

    One of the proposed solutions for improving the scalability of semantics of programming languages is Component-Based Semantics, introduced by Peter D. Mosses. It is expected that this framework can also be used effectively for modular meta theoretic reasoning. This paper presents a formalization of Component-Based Semantics in the theorem prover Coq. It is based on Modular SOS, a variant of SOS, and makes essential use of dependent types, while profiting from type classes. This formalization constitutes a contribution towards modular meta theoretic formalizations in theorem provers. As a small example, a modular proof of determinism of a mini-language is developed.

  16. Cubesat-based UV astronomy

    Science.gov (United States)

    Brosch, Noah

    Development of UV astronomy has so far gone mainly in one direction: launching ever larger telescopes to ensure as high a photon-collecting area as possible. This trend inevitably causes escalating mission costs and this, in the present environment of diminishing research budgets, is the main reason for not having as many UV astronomy missions as one would like. I propose an alternative paradigm based on developing UV missions based on the cubesat technology. This allows a very significant cost reduction by basing the platform on custom off-the-shelf (COTS) components, at the price of small collecting apertures. I discuss possible topics that could benefit from such an approach.

  17. Flow tracing based on current

    Institute of Scientific and Technical Information of China (English)

    蔡兴国; 曹海龙

    2001-01-01

    Analyses flow tracing based on power flow, points out that the separation of reactive power and active power is unreliable, and concludes that current is the real basis of flow tracing. Proposes a new current-based flow tracing model, which divides the current into active current and reactive current, analyses the theory of the matrix used to deal with the precision and realization of flow tracing, and then proposes a new pricing model with a fixed rate and a marginal rate, which retains not only economic information such as the congestion cost in marginal-cost-based pricing, but also helps to make both ends meet.

  18. Home-based versus centre-based cardiac rehabilitation.

    Science.gov (United States)

    Anderson, Lindsey; Sharp, Georgina A; Norton, Rebecca J; Dalal, Hasnain; Dean, Sarah G; Jolly, Kate; Cowie, Aynsley; Zawada, Anna; Taylor, Rod S

    2017-06-30

    Cardiovascular disease is the most common cause of death globally. Traditionally, centre-based cardiac rehabilitation programmes are offered to individuals after cardiac events to aid recovery and prevent further cardiac illness. Home-based cardiac rehabilitation programmes have been introduced in an attempt to widen access and participation. This is an update of a review previously published in 2009 and 2015. To compare the effect of home-based and supervised centre-based cardiac rehabilitation on mortality and morbidity, exercise-capacity, health-related quality of life, and modifiable cardiac risk factors in patients with heart disease. We updated searches from the previous Cochrane Review by searching the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE (Ovid), Embase (Ovid), PsycINFO (Ovid) and CINAHL (EBSCO) on 21 September 2016. We also searched two clinical trials registers as well as previous systematic reviews and reference lists of included studies. No language restrictions were applied. We included randomised controlled trials, including parallel group, cross-over or quasi-randomised designs) that compared centre-based cardiac rehabilitation (e.g. hospital, gymnasium, sports centre) with home-based programmes in adults with myocardial infarction, angina, heart failure or who had undergone revascularisation. Two review authors independently screened all identified references for inclusion based on pre-defined inclusion criteria. Disagreements were resolved through discussion or by involving a third review author. Two authors independently extracted outcome data and study characteristics and assessed risk of bias. Quality of evidence was assessed using GRADE principles and a Summary of findings table was created. We included six new studies (624 participants) for this update, which now includes a total of 23 trials that randomised a total of 2890 participants undergoing cardiac rehabilitation. Participants had an acute myocardial

  19. Imprinted silicon-based nanophotonics

    DEFF Research Database (Denmark)

    Borel, Peter Ingo; Olsen, Brian Bilenberg; Frandsen, Lars Hagedorn

    2007-01-01

    We demonstrate and optically characterize silicon-on-insulator based nanophotonic devices fabricated by nanoimprint lithography. In our demonstration, we have realized ordinary and topology-optimized photonic crystal waveguide structures. The topology-optimized structures require lateral pattern ...

  20. Industry Based Survey (IBS) Cod

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The "Gulf of Maine Atlantic Cod Industry-Based Survey" was a collaboration of the Massachusetts Division of Marine Fisheries and the fishing industry, with support...

  1. The Knowledge Based Information Economy

    OpenAIRE

    1990-01-01

    Working Paper No. 256 is published as "The Knowledge Based Information Economy" (authors: Gunnar Eliasson, Stefan Fölster, Thomas Lindberg, Tomas Pousette and Erol Taymaz). Stockholm: Industrial Institute for Economic and Social Research and Telecon, 1990.

  2. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The program focuses on turbulence modeling enhancements for predicting high-speed rocket base flows. A key component of the effort is the collection of high-fidelity...

  3. Kernel model-based diagnosis

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The methods for computing the kernel consistency-based diagnoses and the kernel abductive diagnoses are only suited to the situation where part of the fault behavioral modes of the components are known. The characterization of kernel model-based diagnosis based on the general causal theory is proposed, which can break through the limitation of the above methods when all behavioral modes of each component are known. Using this method, when observation subsets deduced logically are respectively assigned to the empty or the whole observation set, the kernel consistency-based diagnoses and the kernel abductive diagnoses can deal with all situations. The direct relationship between this diagnostic procedure and the prime implicants/implicates is proved, thus linking the theoretical result with implementation.

  4. Hospital Value-Based Purchasing

    Data.gov (United States)

    U.S. Department of Health & Human Services — Hospital Value-Based Purchasing (VBP) is part of the Centers for Medicare and Medicaid Services (CMS) long-standing effort to link Medicares payment system to a...

  5. FEMA DFIRM Base Flood Elevations

    Data.gov (United States)

    Minnesota Department of Natural Resources — The Base Flood Elevation (BFE) table is required for any digital data where BFE lines will be shown on the corresponding Flood Insurance Rate Map (FIRM). Normally,...

  6. PHYSIOLOGY OF ACID BASE BALANCE

    Directory of Open Access Journals (Sweden)

    Awati

    2014-12-01

    Full Text Available Acid-base, electrolyte, and metabolic disturbances are common in the intensive care unit. Almost all critically ill patients often suffer from compound acid-base and electrolyte disorders. Successful evaluation and management of such patients requires recognition of common patterns (e.g., metabolic acidosis) and the ability to dissect one disorder from another. The intensivist needs to identify and correct these conditions with the easiest available tools, as they are associated with multiorgan failure. Understanding the elements of normal physiology in these areas is very important so as to diagnose the pathological condition and take adequate measures as early as possible. Arterial blood gas analysis is one such tool for early detection of acid-base disorders. The physiology of acid-base balance is complex, and this is an attempt to simplify it for day-to-day application for the benefit of critically ill patients.

  7. Cell phone based balance trainer

    National Research Council Canada - National Science Library

    Lee, Beom-Chan; Kim, Jeonghee; Chen, Shu; Sienko, Kathleen H

    2012-01-01

    ..., weight, complexity, calibration procedures, cost, and fragility. We have designed and developed a cell phone based vibrotactile feedback system for potential use in balance rehabilitation training in clinical and home environments...

  8. Oil-based paint poisoning

    Science.gov (United States)

    Paint - oil based - poisoning ... Hydrocarbons are the primary poisonous ingredient in oil paints. Some oil paints have heavy metals such as lead, mercury, cobalt, and barium added as pigment. These heavy metals can cause additional ...

  9. Surface stress-based biosensors.

    Science.gov (United States)

    Sang, Shengbo; Zhao, Yuan; Zhang, Wendong; Li, Pengwei; Hu, Jie; Li, Gang

    2014-01-15

    Surface stress-based biosensors, as one kind of label-free biosensors, have attracted lots of attention in the process of information gathering and measurement for the biological, chemical and medical application with the development of technology and society. This kind of biosensors offers many advantages such as short response time (less than milliseconds) and a typical sensitivity at nanogram, picoliter, femtojoule and attomolar level. Furthermore, it simplifies sample preparation and testing procedures. In this work, progress made towards the use of surface stress-based biosensors for achieving better performance is critically reviewed, including our recent achievement, the optimally circular membrane-based biosensors and biosensor array. The further scientific and technological challenges in this field are also summarized. Critical remark and future steps towards the ultimate surface stress-based biosensors are addressed.

  10. Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Faber, M.H.; Sørensen, John Dalsgaard

    2003-01-01

    The present paper addresses fundamental concepts of reliability based code calibration. First basic principles of structural reliability theory are introduced and it is shown how the results of FORM based reliability analysis may be related to partial safety factors and characteristic values....... Thereafter the code calibration problem is presented in its principal decision theoretical form and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established. Furthermore suggested values for acceptable annual failure probabilities are given for ultimate...... and serviceability limit states. Finally the paper describes the Joint Committee on Structural Safety (JCSS) recommended procedure - CodeCal - for the practical implementation of reliability based code calibration of LRFD based design codes....

  11. Chip-based droplet sorting

    Energy Technology Data Exchange (ETDEWEB)

    Beer, Neil Reginald; Lee, Abraham; Hatch, Andrew

    2014-07-01

    A non-contact system for sorting monodisperse water-in-oil emulsion droplets in a microfluidic device based on the droplet's contents and their interaction with an applied electromagnetic field or by identification and sorting.

  12. NORTHWOODS Wildlife Habitat Data Base

    Science.gov (United States)

    Mark D. Nelson; Janine M. Benyus; Richard R. Buech

    1992-01-01

    Wildlife habitat data from seven Great Lakes National Forests were combined into a wildlife-habitat matrix named NORTHWOODS. Several electronic file formats of NORTHWOODS data base and documentation are available on floppy disks for microcomputers.

  13. Base Flood Elevation (BFE) Lines

    Data.gov (United States)

    Department of Homeland Security — The Base Flood Elevation (BFE) table is required for any digital data where BFE lines will be shown on the corresponding Flood Insurance Rate Map (FIRM). Normally if...

  14. Cage-based performance capture

    CERN Document Server

    Savoye, Yann

    2014-01-01

    Nowadays, highly-detailed animations of live-actor performances are increasingly easier to acquire and 3D Video has reached considerable attentions in visual media production. In this book, we address the problem of extracting or acquiring and then reusing non-rigid parametrization for video-based animations. At first sight, a crucial challenge is to reproduce plausible boneless deformations while preserving global and local captured properties of dynamic surfaces with a limited number of controllable, flexible and reusable parameters. To solve this challenge, we directly rely on a skin-detached dimension reduction thanks to the well-known cage-based paradigm. First, we achieve Scalable Inverse Cage-based Modeling by transposing the inverse kinematics paradigm on surfaces. Thus, we introduce a cage inversion process with user-specified screen-space constraints. Secondly, we convert non-rigid animated surfaces into a sequence of optimal cage parameters via Cage-based Animation Conversion. Building upon this re...

  15. US Air Force Base Observations

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Hourly observations taken by U.S. Air Force personnel at bases in the United States and around the world. Foreign observations concentrated in the Middle East and...

  16. Chip-based droplet sorting

    Science.gov (United States)

    Beer, Neil Reginald; Lee, Abraham; Hatch, Andrew

    2014-07-01

    A non-contact system for sorting monodisperse water-in-oil emulsion droplets in a microfluidic device based on the droplet's contents and their interaction with an applied electromagnetic field or by identification and sorting.

  17. Studying Sensing-Based Systems

    DEFF Research Database (Denmark)

    Kjærgaard, Mikkel Baun

    2013-01-01

    Recent sensing-based systems involve a multitude of users, devices, and places. These types of systems challenge existing approaches for conducting valid system evaluations. Here, the author discusses such evaluation challenges and revisits existing system evaluation methodologies....

  18. Codified Risk Based Inspection Planning

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Faber, M.H.

    2002-01-01

    Simplified methods are described for reliability- and risk-based inspection planning of steel structures. The methods simplify the practical aspects of identifying inspection plans complying both with specific requirements to the maximum acceptable annual probability of structural collapse...

  19. Team-based global organizations

    DEFF Research Database (Denmark)

    Zander, Lena; Butler, Christina; Mockaitis, Audra

    2015-01-01

    diversity in enhancing team creativity and performance, and 2) the sharing of knowledge in team-based organizations, while the other two themes address global team leadership: 3) the unprecedented significance of social capital for the success of global team leader roles; and 4) the link between shared...... leadership, satisfaction and performance in global virtual teams. We bring together ideas from the lively discussion between the audience and the panel members where we identify questions at three levels for bringing research on team-based organizing in global organizations forward: the within-team......This chapter draws on a panel discussion of the future of global organizing as a team-based organization at EIBA 2014 in Uppsala, Sweden. We began by discussing contemporary developments of hybrid forms of hierarchy and teams-based organizing, but we venture to propose that as organizations become...

  20. Team-Based Global Organizations

    DEFF Research Database (Denmark)

    Zander, Lena; Butler, Christina Lea; Mockaitis, Audra

    2015-01-01

    and value diversity as enhancing team creativity and performance, (2) knowledge sharing in team-based organizations, (3) the significance of social capital for global team leader role success, and (4) shared leadership, satisfaction, and performance links in global virtual teams. Findings-We identify......Purpose-We propose team-based organizing as an alternative to more traditional forms of hierarchy-based organizing in global firms. Methodology/approach-Advancements in the study of global teams, leadership, process, and outcomes were organized into four themes: (1) openness toward linguistic...... questions at three levels for bringing research on team-based organizing in global organizations forward. At the within-Team individual level, we discuss the criticality of process and leadership in teams. At the between-Teams group level, we draw attention to that global teams also need to focus...

  1. Consent Based Verification System (CBSV)

    Data.gov (United States)

    Social Security Administration — CBSV is a fee-based service offered by SSA's Business Services Online (BSO). It is used by private companies to verify the SSNs of their customers and clients that...

  2. Industry Based Survey (IBS) Yellowtail

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The "Southern New England Yellowtail Flounder Industry-Based Survey" was a collaboration between the Rhode Island Division of Fish and Wildlife and the fishing...

  3. Graph Model Based Indoor Tracking

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Lu, Hua; Yang, Bin

    2009-01-01

    The tracking of the locations of moving objects in large indoor spaces is important, as it enables a range of applications related to, e.g., security and indoor navigation and guidance. This paper presents a graph model based approach to indoor tracking that offers a uniform data management...... infrastructure for different symbolic positioning technologies, e.g., Bluetooth and RFID. More specifically, the paper proposes a model of indoor space that comprises a base graph and mappings that represent the topology of indoor space at different levels. The resulting model can be used for one or several...... indoor positioning technologies. Focusing on RFID-based positioning, an RFID specific reader deployment graph model is built from the base graph model. This model is then used in several algorithms for constructing and refining trajectories from raw RFID readings. Empirical studies with implementations...
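
    As a hedged illustration of the deployment-graph idea described above (not the paper's implementation), the sketch below builds a small base graph of rooms and doors with NetworkX, attaches hypothetical RFID readers to edges, and turns a raw reading sequence into a coarse trajectory of traversed edges; all room names, reader IDs and the layout are assumptions for illustration only.

```python
import networkx as nx

# Base graph: vertices are symbolic indoor locations, edges are connections
# (doors); the room names and layout below are purely hypothetical.
base = nx.Graph()
base.add_edges_from([("lobby", "corridor"), ("corridor", "room_a"),
                     ("corridor", "room_b"), ("room_b", "storage")])

# Deployment mapping: each RFID reader observes one edge of the base graph.
reader_to_edge = {"r1": ("lobby", "corridor"),
                  "r2": ("corridor", "room_a"),
                  "r3": ("corridor", "room_b")}

def readings_to_trajectory(readings):
    """Turn a raw sequence of reader IDs into the sequence of traversed
    edges, collapsing consecutive duplicate detections of the same reader."""
    trajectory = []
    for reader_id in readings:
        edge = reader_to_edge[reader_id]
        if not trajectory or trajectory[-1] != edge:
            trajectory.append(edge)
    return trajectory

# The base graph can then be used to refine the trajectory, for example by
# interpolating unobserved rooms along nx.shortest_path between two edges.
print(readings_to_trajectory(["r1", "r1", "r3"]))
```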

  4. Applying emerging digital video interface standards to airborne avionics sensor and digital map integrations: benefits outweigh the initial costs

    Science.gov (United States)

    Kuehl, C. Stephen

    1996-06-01

    Video signal system performance can be compromised in a military aircraft cockpit management system (CMS) with the tailoring of vintage Electronics Industries Association (EIA) RS170 and RS343A video interface standards. Video analog interfaces degrade when induced system noise is present. Further signal degradation has been traditionally associated with signal data conversions between avionics sensor outputs and the cockpit display system. If the CMS engineering process is not carefully applied during the avionics video and computing architecture development, extensive and costly redesign will occur when visual sensor technology upgrades are incorporated. Close monitoring and technical involvement in video standards groups provides the knowledge-base necessary for avionic systems engineering organizations to architect adaptable and extendible cockpit management systems. With the Federal Communications Commission (FCC) in the process of adopting the Digital HDTV Grand Alliance System standard proposed by the Advanced Television Systems Committee (ATSC), the entertainment and telecommunications industries are adopting and supporting the emergence of new serial/parallel digital video interfaces and data compression standards that will drastically alter present NTSC-M video processing architectures. The re-engineering of the U.S. Broadcasting system must initially preserve the electronic equipment wiring networks within broadcast facilities to make the transition to HDTV affordable. International committee activities in technical forums like ITU-R (former CCIR), ANSI/SMPTE, IEEE, and ISO/IEC are establishing global consensus on video signal parameterizations that support a smooth transition from existing analog based broadcasting facilities to fully digital computerized systems. An opportunity exists for implementing these new video interface standards over existing video coax/triax cabling in military aircraft cockpit management systems. Reductions in signal

  5. 3D Multi-View Stereoscopic Display and Its Key Technologies%3D 多视点立体显示及其关键技术

    Institute of Scientific and Technical Information of China (English)

    张兆杨; 安平; 刘苏醒

    2008-01-01

    As the next-generation video display technique after 2D display based on DTV/HDTV, three-dimensional (3D) multi-view stereoscopic display has become one of the most popular research topics in the world. For building a multi-view stereoscopic display system, the related key technologies are detailed, including: the light field representation model and light field capturing system, high-efficiency multi-view video coding and transmission methods compatible with current video standards, high-efficiency rendering methods for views at arbitrary positions at the decoder, 3D display technologies and multi-view autostereoscopic display. Focusing on the key technologies above, the latest international development trends and existing problems are analyzed. Meanwhile, a solution for implementing a 3D video processing system based on interactive autostereoscopic display is proposed.

  6. Base isolation of fluid containers

    Energy Technology Data Exchange (ETDEWEB)

    Kim, S. [Cygna Group Inc./ICF Kaiser International, Oakland, CA (United States)

    1995-12-01

    Fluid containers often constitute critical internal equipment in power plants. However, due to the possible structure-equipment interaction effect they are particularly vulnerable during strong earthquake events. An effective technique for protecting fluid containers is base isolation. By deflecting part of the seismic input energy away from the superstructure, base isolation can substantially reduce the seismic demand on the containers, making it more cost effective than an equivalent conventional design.

  7. A hybrid base isolation system

    Energy Technology Data Exchange (ETDEWEB)

    Hart, G.C. [Univ. of California, Los Angeles, CA (United States); Lobo, R.F.; Srinivasan, M. [Hart Consultant Group, Santa Monica, CA (United States); Asher, J.W. [kpff Engineers, Santa Monica, CA (United States)

    1995-12-01

    This paper proposes a new analysis procedure for hybrid base isolation buildings when considering the displacement response of a base isolated building to wind loads. The system is considered hybrid because of the presence of viscous dampers in the building above the isolator level. The proposed analysis approach incorporates a detailed site specific wind study combined with a dynamic nonlinear analysis of the building response.

  8. Formalization in Component Based Development

    DEFF Research Database (Denmark)

    Holmegaard, Jens Peter; Knudsen, John; Makowski, Piotr;

    2006-01-01

    We present a unifying conceptual framework for components, component interfaces, contracts and composition of components by focusing on the collection of properties or qualities that they must share. A specific property, such as signature, functionality behaviour or timing is an aspect. Each aspe...... by small examples, using UML as concrete syntax for various aspects, and is illustrated by one larger case study based on an industrial prototype of a complex component based system....

  9. CORBA Based CIMS Application Integration

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    Common object request broker architecture (CORBA) provides the framework and the mechanism for distributed object operation. It can also be applied to computer integrated manufacturing system (CIMS) application integration. This paper studies the CIMS information service requirement, presents a CORBA based integration approach including the CORBA based CIM information system architecture and the application integration mechanism, and discusses the relationship between CORBA and the CIM application integration platform.

  10. A Survey on Wallman Bases

    OpenAIRE

    Adalberto García-Máynez

    2007-01-01

    [EN] Wallman bases are frequently used in compactification processes of topological spaces. However, they are also related with quasi–uniform structures and they are useful to characterize some topological properties. We present a brief survey on the subject which supports these statements. García-Máynez, A. (2007). A Survey on Wallman Bases. Applied General Topology. 8(2):223-237. doi:10.4995/agt.2007.1886. 223 237 8 2

  11. Evidence-Based Cancer Imaging

    Science.gov (United States)

    Khorasani, Ramin

    2017-01-01

    With the advances in the field of oncology, imaging is increasingly used in the follow-up of cancer patients, leading to concerns about over-utilization. Therefore, it has become imperative to make imaging more evidence-based, efficient, cost-effective and equitable. This review explores the strategies and tools to make diagnostic imaging more evidence-based, mainly in the context of follow-up of cancer patients.

  12. Method for gesture based modeling

    DEFF Research Database (Denmark)

    2006-01-01

    A computer program based method is described for creating models using gestures. On an input device, such as an electronic whiteboard, a user draws a gesture which is recognized by a computer program and interpreted relative to a predetermined meta-model. Based on the interpretation, an algorithm...... is assigned to the gesture drawn by the user. The executed algorithm may, for example, consist in creating a new model element, modifying an existing model element, or deleting an existing model element....

  13. TFC Base de datos relacionales

    OpenAIRE

    Moreno Pozuelo, Isabel

    2015-01-01

    The objective of this project is to develop a relational database containing the data needed to manage investment portfolios. Bachelor thesis for the Computer Science program on Databases.

  14. Entropy-based benchmarking methods

    OpenAIRE

    2012-01-01

    We argue that benchmarking sign-volatile series should be based on the principle of movement and sign preservation, which states that a benchmarked series should reproduce the movement and signs in the original series. We show that the widely used variants of the Denton (1971) method and the growth preservation method of Causey and Trager (1981) may violate this principle, while its requirements are explicitly taken into account in the proposed entropy-based benchmarking methods. Our illustrati...

  15. Centrifuge-Based Fluidic Platforms

    Science.gov (United States)

    Zoval, Jim; Jia, Guangyao; Kido, Horacio; Kim, Jitae; Kim, Nahui; Madou, Marc

    In this chapter centrifuge-based microfluidic platforms are reviewed and compared with other popular microfluidic propulsion methods. The underlying physical principles of centrifugal pumping in microfluidic systems are presented and the various centrifuge fluidic functions such as valving, decanting, calibration, mixing, metering, heating, sample splitting, and separation are introduced. Those fluidic functions have been combined with analytical measurements techniques such as optical imaging, absorbance and fluorescence spectroscopy and mass spectrometry to make the centrifugal platform a powerful solution for medical and clinical diagnostics and high-throughput screening (HTS) in drug discovery. Applications of a compact disc (CD)-based centrifuge platform analyzed in this review include: two-point calibration of an optode-based ion sensor, an automated immunoassay platform, multiple parallel screening assays and cellular-based assays. The use of modified commercial CD drives for high-resolution optical imaging is discussed as well. From a broader perspective, we compare the technical barriers involved in applying microfluidics for sensing and diagnostic as opposed to applying such techniques to HTS. The latter poses less challenges and explains why HTS products based on a CD fluidic platform are already commercially available, while we might have to wait longer to see commercial CD-based diagnostics.

  16. Instantaneous noise-based logic

    CERN Document Server

    Kish, Laszlo B; Peper, Ferdinand

    2010-01-01

    We show two universal, Boolean, deterministic logic schemes based on binary noise time functions that can be realized without time averaging units. The first scheme is based on a new bipolar random telegraph wave scheme and the second one makes use of the recent noise-based logic which is conjectured to be the brain's method of logic operations [Physics Letters A 373 (2009) 2338-2342, arXiv:0902.2033]. For binary-valued logic operations, the two simple Boolean schemes presented in this paper use zero (no noise) for the logic Low (L) state. In the random telegraph wave-based scheme, for multi-valued logic operations, additive superpositions of logic states must be avoided, while multiplicative superpositions utilizing hyperspace base vectors can still be utilized. These modifications, while keeping the information richness of multi-valued (noise-based) logic, result in a significant speedup of logic operations for the same signal bandwidth. The logic hyperspace of the first scheme results in random telegraph waves...

  17. NASA Imaging for Safety, Science, and History

    Science.gov (United States)

    Grubbs, Rodney; Lindblom, Walt; Bowerman, Deborah S. (Technical Monitor)

    2002-01-01

    Since its creation in 1958 NASA has been making and documenting history, both on Earth and in space. To complete its missions NASA has long relied on still and motion imagery to document spacecraft performance, see what can't be seen by the naked eye, and enhance the safety of astronauts and expensive equipment. Today, NASA is working to take advantage of new digital imagery technologies and techniques to make its missions more safe and efficient. An HDTV camera was on-board the International Space Station from early August, to mid-December, 2001. HDTV cameras previously flown have had degradation in the CCD during the short duration of a Space Shuttle flight. Initial performance assessment of the CCD during the first-ever long duration space flight of a HDTV camera and earlier flights is discussed. Recent Space Shuttle launches have been documented with HDTV cameras and new long lenses giving clarity never before seen with video. Examples and comparisons will be illustrated between HD, highspeed film, and analog video of these launches and other NASA tests. Other uses of HDTV where image quality is of crucial importance will also be featured.

  18. Graph Based Segmentation in Content Based Image Retrieval

    Directory of Open Access Journals (Sweden)

    P. S. Suhasini

    2008-01-01

    Full Text Available Problem statement: Traditional image retrieval systems are content based image retrieval systems which rely on low-level features for indexing and retrieval of images. CBIR systems fail to meet user expectations because of the gap between the low-level features used by such systems and the high-level perception of images by humans. To meet this requirement, graph based segmentation is used as a preprocessing step in Content Based Image Retrieval (CBIR). Approach: Graph based segmentation has the ability to preserve detail in low-variability image regions while ignoring detail in high-variability regions. After segmentation, the features are extracted for the segmented images: texture features using the wavelet transform and color features using the histogram model, and the segmented query image features are compared with the features of the segmented database images. The similarity measure used for texture features is the Euclidean distance measure and for color features the quadratic distance approach. Results: The experimental results demonstrate about 12% improvement in the performance for the color feature with segmentation. Conclusions/Recommendations: Along with this improvement, neural network learning can be embedded in this system to reduce the semantic gap.
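
    The distance measures named in the abstract are standard, so a rough illustration (not the authors' code) is easy to give: the Python sketch below compares a query image's texture and colour features against database entries, assuming the wavelet-based texture vectors and colour histograms have already been extracted; the bin-similarity matrix A and the combination weights are hypothetical placeholders.

```python
import numpy as np

def texture_distance(t_query, t_db):
    """Euclidean distance between wavelet-based texture feature vectors."""
    return float(np.linalg.norm(t_query - t_db))

def colour_distance(h_query, h_db, A):
    """Quadratic-form distance between colour histograms; A is a
    bin-similarity matrix (a hypothetical placeholder here)."""
    d = h_query - h_db
    return float(np.sqrt(d @ A @ d))

def rank_images(t_query, h_query, database, A, w_tex=0.5, w_col=0.5):
    """Rank database entries (t_vec, histogram, image_id) by a weighted sum
    of the two distances; the 0.5/0.5 weights are illustrative only."""
    scored = [(w_tex * texture_distance(t_query, t) +
               w_col * colour_distance(h_query, h, A), image_id)
              for t, h, image_id in database]
    return sorted(scored)
```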

  19. Content Based Image Retrieval Based on Color: A Survey

    Directory of Open Access Journals (Sweden)

    Mussarat Yasmin

    2015-11-01

    Full Text Available Information sharing, interpretation and meaningful expression have used digital images in the past couple of decades very usefully and extensively. This extensive use not only evolved the digital communication world with ease and usability but also produced unwanted difficulties around the use of digital images. Because of their extensive usage it sometimes becomes harder to filter images based on their visual contents. To overcome these problems, Content Based Image Retrieval (CBIR) was introduced as one of the recent ways to find specific images in massive databases of digital images efficiently, or in other words to continue the use of digital images in information sharing. In the past years, many CBIR systems have been proposed, developed and brought into usage as an outcome of the extensive research done in the CBIR domain. Based on the contents of images, different approaches to CBIR have different implementations for searching images, resulting in different measures of performance and accuracy. Some of them are in fact very effective approaches for fast and efficient content based image retrieval. This research highlights the work done by researchers to develop image retrieval techniques based on the color of images. These techniques, along with their pros and cons as well as their applications in relevant fields, are discussed in the survey paper. Moreover, the techniques are also categorized on the basis of the common approach used.

  20. Description-based and experience-based decisions: individual analysis

    Directory of Open Access Journals (Sweden)

    Andrey Kudryavtsev

    2012-05-01

    Full Text Available We analyze behavior in two basic classes of decision tasks: description-based and experience-based. In particular, we compare the prediction power of a number of decision learning models in both kinds of tasks. Unlike most previous studies, we focus on individual, rather than aggregate, behavioral characteristics. We carry out an experiment involving a battery of both description- and experience-based choices between two mixed binary prospects made by each of the participants, and employ a number of formal models for explaining and predicting participants' choices: Prospect theory (PT; Kahneman and Tversky, 1979), the Expectancy-Valence model (EVL; Busemeyer and Stout, 2002), and three combinations of these well-established models. We document that the PT and the EVL models are best for predicting people's decisions in description- and experience-based tasks, respectively, which is not surprising as these two models are designed specially for these kinds of tasks. Furthermore, we find that models involving linear weighting of gains and losses perform better in both kinds of tasks, from the point of view of generalizability and individual parameter consistency. We therefore conclude that, overall, when both prospects are mixed, the assumption of diminishing sensitivity does not improve models' prediction power for individual decision-makers. Finally, for some of the models' parameters, we document consistency at the individual level between description- and experience-based tasks.
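
    For readers unfamiliar with the terminology, the contrast between "diminishing sensitivity" and "linear weighting" can be stated with the usual prospect-theory value function; this is the standard textbook form, given here as background rather than reproduced from the paper itself:

```latex
v(x) =
\begin{cases}
  x^{\alpha}             & \text{if } x \ge 0,\\
  -\lambda\,(-x)^{\beta} & \text{if } x < 0,
\end{cases}
\qquad 0 < \alpha, \beta \le 1,\ \lambda > 1.
```

    Exponents below one capture diminishing sensitivity and lambda captures loss aversion; the "linear weighting" variants discussed above correspond to fixing both exponents at one.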

  1. From scientifically based research to evidence based learning

    Directory of Open Access Journals (Sweden)

    Rosa Cera

    2016-02-01

    Full Text Available This essay is a reflection on the peculiarities of scientifically based research and on the distinctive elements of EBL (evidence based learning), the methodology used in the study on the “Relationship between Metacognition, Self-efficacy and Self-regulation in Learning”. The EBL method, based on the standardization of data, explains how the students' learning experience can be considered as a set of “data” and can be used to explain how and when the research results can be considered generalizable and transferable to other learning situations. The reflections present in this study have also allowed us to illustrate the impact that its results have had on the micro and macro levels of reality. They helped to fill in the gaps concerning the learning/teaching processes, contributed to the enrichment of the scientific literature on this subject and allowed standards to be established through rigorous techniques such as systematic reviews and meta-analysis.

  2. Image based Monument Recognition using Graph based Visual Saliency

    DEFF Research Database (Denmark)

    Kalliatakis, Grigorios; Triantafyllidis, Georgios

    2013-01-01

    This article presents an image-based application aiming at simple image classification of well-known monuments in the area of Heraklion, Crete, Greece. This classification takes place by utilizing Graph Based Visual Saliency (GBVS) and employing Scale Invariant Feature Transform (SIFT) or Speeded...... Up Robust Features (SURF). For this purpose, images taken at various places of interest are being compared to an existing database containing images of these places at different angles and zoom. The time required for the matching progress in such application is an important element. To this goal......, the images have been previously processed according to the Graph Based Visual Saliency model in order to keep either SIFT or SURF features corresponding to the actual monuments while the background “noise” is minimized. The application is then able to classify these images, helping the user to better...
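
    As a hedged illustration of the matching stage only (the Graph Based Visual Saliency preprocessing is assumed to have been applied beforehand and is not shown), a minimal OpenCV sketch of SIFT feature matching with a ratio test might look as follows; the file paths and the 0.75 ratio threshold are hypothetical, not values from the article.

```python
import cv2

def count_sift_matches(query_path, reference_path, ratio=0.75):
    """Count good SIFT matches between a query photo and one database image.
    Both images are assumed to be saliency-masked beforehand (GBVS step not shown)."""
    query = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
    ref = cv2.imread(reference_path, cv2.IMREAD_GRAYSCALE)

    sift = cv2.SIFT_create()
    _, des_q = sift.detectAndCompute(query, None)
    _, des_r = sift.detectAndCompute(ref, None)

    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(des_q, des_r, k=2)

    # Lowe's ratio test keeps only distinctive matches.
    good = [m for m, n in knn if m.distance < ratio * n.distance]
    return len(good)

# The query would then be classified as the monument whose database images
# yield the highest match count.
```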

  3. Nasal base narrowing: the combined alar base excision technique.

    Science.gov (United States)

    Foda, Hossam M T

    2007-01-01

    To evaluate the role of the combined alar base excision technique in narrowing the nasal base and correcting excessive alar flare. The study included 60 cases presenting with a wide nasal base and excessive alar flaring. The surgical procedure combined an external alar wedge resection with an internal vestibular floor excision. All cases were followed up for a mean of 32 (range, 12-144) months. Nasal tip modification and correction of any preexisting caudal septal deformities were always completed before the nasal base narrowing. The mean width of the external alar wedge excised was 7.2 (range, 4-11) mm, whereas the mean width of the sill excision was 3.1 (range, 2-7) mm. Completing the internal excision first resulted in a more conservative external resection, thus avoiding any blunting of the alar-facial crease. No cases of postoperative bleeding, infection, or keloid formation were encountered, and the external alar wedge excision healed with an inconspicuous scar that was well hidden in the depth of the alar-facial crease. Finally, the risk of notching of the alar rim, which can occur at the junction of the external and internal excisions, was significantly reduced by adopting a 2-layered closure of the vestibular floor (P = .01). The combined alar base excision resulted in effective narrowing of the nasal base with elimination of excessive alar flare. Commonly feared complications, such as blunting of the alar-facial crease or notching of the alar rim, were avoided by using simple modifications in the technique of excision and closure.

  4. Generalized eigenvalue based spectrum sensing

    KAUST Repository

    Shakir, Muhammad

    2012-01-01

    Spectrum sensing is one of the fundamental components in cognitive radio networks. In this chapter, a generalized spectrum sensing framework which is referred to as the Generalized Mean Detector (GMD) has been introduced. In this context, we generalize the detectors based on the eigenvalues of the received signal covariance matrix and transform the eigenvalue based spectrum sensing detectors, namely (i) the Eigenvalue Ratio Detector (ERD) and two newly proposed detectors which are referred to as (ii) the GEometric Mean Detector (GEMD) and (iii) the ARithmetic Mean Detector (ARMD), into a unified framework of generalized spectrum sensing. The foundation of the proposed framework is based on the calculation of exact analytical moments of the random variables of the decision threshold of the respective detectors. The decision threshold has been calculated in a closed form which is based on the approximation of the Cumulative Distribution Functions (CDFs) of the respective test statistics. In this context, we exchange the analytical moments of the two random variables of the respective test statistics with the moments of the Gaussian (or Gamma) distribution function. The performance of the eigenvalue based detectors is compared with several traditional detectors, including the energy detector (ED), to validate the importance of the eigenvalue based detectors and the performance of the GEMD and the ARMD, particularly in realistic wireless cognitive radio networks. Analytical and simulation results show that the newly proposed detectors yield a considerable performance advantage in realistic spectrum sensing scenarios. Moreover, the presented results based on the proposed approximation approaches are in perfect agreement with the empirical results. © 2012 Springer Science+Business Media Dordrecht.
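
    A minimal numpy sketch of eigenvalue-based test statistics in the spirit of the three detectors named above is given below. The exact forms of the GEMD and ARMD statistics used in the chapter are not specified in this abstract; here they are sketched, as an assumption, as the maximum eigenvalue normalised by the geometric and arithmetic means of the eigenvalues, and the decision thresholds (derived analytically in the chapter) are left as user-supplied parameters.

```python
import numpy as np

def eigenvalue_test_statistics(Y):
    """Y: received-signal matrix of shape (antennas, samples).
    Returns ERD-, GEMD- and ARMD-style statistics built from the eigenvalues
    of the sample covariance matrix (the GEMD/ARMD forms are assumptions)."""
    n_samples = Y.shape[1]
    R = (Y @ Y.conj().T) / n_samples              # sample covariance matrix
    eig = np.linalg.eigvalsh(R)                   # real, ascending order
    lam_min, lam_max = eig[0], eig[-1]

    erd = lam_max / lam_min                       # eigenvalue ratio detector
    gemd = lam_max / np.exp(np.mean(np.log(eig)))   # vs geometric mean
    armd = lam_max / np.mean(eig)                 # vs arithmetic mean
    return erd, gemd, armd

def decide(statistic, threshold):
    """Declare the primary signal present if the statistic exceeds the
    threshold; thresholds come from the chapter's analytical approximations."""
    return statistic > threshold
```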

  5. Shear Bond Strength of Bracket Bases to Adhesives Based on Bracket Base Design

    Science.gov (United States)

    2016-04-13

    in vitro comparison with foil-mesh. European Journal of Orthodontics 1989; 11:144- 153. Retief, DH.; Sadowsky, PL. Clinical experience with the...strength that is clinically acceptable for performing orthodontics (Reynolds 1975). Modern orthodontic shear bond strength studies generally report bond...bases, in addition to their claimed equal or superior bond strengths with traditional mesh bases, become important in both clinical orthodontics and

  6. Antibacterial and antifungal metal based triazole Schiff bases.

    Science.gov (United States)

    Chohan, Zahid H; Hanif, Muhammad

    2013-10-01

    A new series of four biologically active triazole derived Schiff base ligands (L(1)-L(4)) and their cobalt(II), nickel(II), copper(II) and zinc(II) complexes (1-16) have been synthesized and characterized. The ligands were prepared by the condensation reaction of 3-amino-5-methylthio-1H-1,2,4-triazole with chloro-, bromo- and nitro-substituted 2-hydroxybenzaldehyde in an equimolar ratio. The antibacterial and antifungal bioactivity data showed the metal(II) complexes to be more potent antibacterial and antifungal than the parent Schiff bases against one or more bacterial and fungal species.

  7. Characteristic properties of Fibonacci-based mutually unbiased bases

    Energy Technology Data Exchange (ETDEWEB)

    Seyfarth, Ulrich; Alber, Gernot [Institut fuer Angewandte Physik, Technische Universitaet Darmstadt, 64289 Darmstadt (Germany); Ranade, Kedar [Institut fuer Quantenphysik, Universitaet Ulm, Albert-Einstein-Allee 11, 89069 Ulm (Germany)

    2012-07-01

    Complete sets of mutually unbiased bases (MUBs) offer interesting applications in quantum information processing ranging from quantum cryptography to quantum state tomography. Different construction schemes provide different perspectives on these bases which are typically also deeply connected to various mathematical research areas. In this talk we discuss characteristic properties resulting from a recently established connection between construction methods for cyclic MUBs and Fibonacci polynomials. As a remarkable fact this connection leads to construction methods which do not involve any relations to mathematical properties of finite fields.

  8. PBW bases and KLR algebras

    CERN Document Server

    Kato, Syu

    2012-01-01

    We generalize Lusztig's geometric construction of the PBW bases of finite quantum groups of type $\mathsf{ADE}$ under the framework of [Varagnolo-Vasserot, J. reine angew. Math. 659 (2011)]. In particular, every PBW basis of such quantum groups is proven to yield an orthogonal collection in the module category of KLR-algebras. This enables us to prove Lusztig's conjecture on the positivity of the canonical (lower global) bases in terms of the (lower) PBW bases, and Kashiwara's problem on the finiteness of the global dimensions of KLR-algebras in the $\mathsf{ADE}$ case. To achieve our goal, we develop a general formulation which guarantees nice properties of extension algebras, including a new criterion of purity of weights. (This part also applies to quiver Schur algebras.) In the appendix, we provide a proof of Shoji's conjecture on limit symbols of type $\mathsf{B}$ [Shoji, Adv. Stud. Pure Math. 40 (2004)] based on the general formulation developed in this paper.

  9. Word-Based Text Compression

    CERN Document Server

    Platos, Jan

    2008-01-01

    Today there are many universal compression algorithms, but in most cases a specific algorithm works better for specific data - JPEG for images, MPEG for movies, etc. For textual documents there are special methods based on the PPM algorithm or methods with non-character access, e.g. word-based compression. In the past, several papers describing variants of word-based compression using Huffman encoding or the LZW method were published. The subject of this paper is the description of a word-based compression variant based on the LZ77 algorithm. The LZ77 algorithm and its modifications are described in this paper. Moreover, various ways of implementing the sliding window and various possibilities of output encoding are described as well. This paper also includes the implementation of an experimental application, testing of its efficiency and finding the best combination of all parts of the LZ77 coder. This is done to achieve the best compression ratio. In conclusion there is comparison of this implemented application wi...
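
    To make the idea concrete, here is a heavily simplified, hedged sketch of word-based LZ77 coding (whitespace-tokenised words, a fixed-size sliding window, and no entropy coding of the output tokens); it illustrates the principle described above and is not the experimental application from the paper.

```python
def lz77_words_encode(text, window=256):
    """Word-based LZ77: tokens are whitespace-separated words, and matches
    are searched in a sliding window of previously seen words."""
    words = text.split()
    out, i = [], 0
    while i < len(words):
        start = max(0, i - window)
        best_len, best_off = 0, 0
        for j in range(start, i):
            length = 0
            while (i + length < len(words) and j + length < i
                   and words[j + length] == words[i + length]):
                length += 1
            if length > best_len:
                best_len, best_off = length, i - j
        if best_len > 0:
            out.append((best_off, best_len, None))   # back-reference
            i += best_len
        else:
            out.append((0, 0, words[i]))             # literal word
            i += 1
    return out

def lz77_words_decode(tokens):
    words = []
    for off, length, literal in tokens:
        if length == 0:
            words.append(literal)
        else:
            for _ in range(length):
                words.append(words[-off])   # -off stays valid as we append
    return " ".join(words)
```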

  10. Summary of LOGDEX data base

    Energy Technology Data Exchange (ETDEWEB)

    Hill, T.; Sepehrnoori, K.

    1981-08-01

    A summary of LOGDEX, the digitized well log data base maintained by the Center for Energy Studies at The University of Texas at Austin is presented. These well logs were obtained from various oil companies and then converted from paper well logs to numeric information on magnetic computer tapes for input into the well log data base. This data base serves as a resource for application programs in the study of geopressured geothermal energy resources, for well logging research, and for geological research. Currently the location and scope of well log data that may be found within the LOGDEX data base are limited to wells along the Texas-Louisiana Gulf Coast that are known to have a potential as a geopressured geothermal energy resource. Additionally the location of these wells in that area is highly localized into areas that have been defined by Department of Energy researchers as having a high potential for geopressured geothermal energy. The LOGDEX data base currently contains data from more than 350 wells, representing more than 1600 logs and 16,600,000 curve feet of data. For quick reference to a given log, the summary listing has been indexed into seven divisions: well classification, location by county or parish, curve type, log type, operators, location by state, and well names. These indexes are arranged alphabetically and cross-referenced by page number.

  11. Mechanical Geometry Theorem Proving Based on Groebner Bases

    Institute of Scientific and Technical Information of China (English)

    吴尽昭

    1997-01-01

    A new method for mechanical elementary geometry theorem proving is presented by using Groebner bases of polynomial ideals. It has two main advantages over the approach proposed in the literature: (i) it is complete and not a refutational procedure; (ii) the subcases of the geometry statements which are not generally true can be differentiated clearly.
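
    The underlying mechanism can be illustrated with a deliberately tiny example; this is a hedged SymPy sketch of the general Groebner-basis approach, not the specific method of the paper. The hypotheses of a geometry statement are translated into polynomial equations, a Groebner basis of the hypothesis ideal is computed, and the conclusion polynomial is reduced against it; a zero remainder shows the conclusion follows (generically) from the hypotheses.

```python
from sympy import symbols, groebner, expand, reduced

x, y, r = symbols('x y r')

# Toy example (Thales' theorem): C = (x, y) lies on the circle of radius r
# centred at the origin, and A = (-r, 0), B = (r, 0) are the diameter ends.
hypotheses = [x**2 + y**2 - r**2]                 # C lies on the circle

# Conclusion: CA is perpendicular to CB, i.e. their dot product vanishes.
conclusion = expand((x + r) * (x - r) + y * y)

# Reduce the conclusion modulo a Groebner basis of the hypothesis ideal;
# a zero remainder certifies that the conclusion lies in the ideal.
G = groebner(hypotheses, x, y, r, order='lex')
quotients, remainder = reduced(conclusion, list(G), x, y, r, order='lex')
print(remainder == 0)   # True
```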

  12. Solar-based boost differential ...

    African Journals Online (AJOL)

    eobe

    known as a Solar-based boost differential single ph ... vital to condition it to work at a particu ... its exponential ... problems of low and variable output voltage of PV and ... was verified in Simulink within MATLAB 2007a.

  13. Secure Base Priming Diminishes Conflict-Based Anger and Anxiety

    Science.gov (United States)

    Koren, Tamara; Bartholomew, Kim

    2016-01-01

    This study examines the impact of a visual representation of a secure base (i.e. a secure base prime) on attenuating experimentally produced anger and anxiety. Specifically, we examined the assuaging of negative emotions through exposure to an image of a mother-infant embrace or a heterosexual couple embracing. Subjects seated at a computer terminal rated their affect (Pre Affect) using the Affect Adjective Checklist (AAC) then listened to two sets of intense two person conflicts. After the first conflict exposure they rated affect again (Post 1 AAC). Following the second exposure they saw a blank screen (control condition), pictures of everyday objects (distraction condition) or a photo of two people embracing (Secure Base Prime condition). They then reported emotions using the Post 2 AAC. Compared to either control or distraction subjects, Secure Base Prime (SBP) subjects reported significantly less anger and anxiety. These results were then replicated using an internet sample with control, SBP and two new controls: Smiling Man (to control for expression of positive affect) and Cold Mother (an unsmiling mother with infant). The SBP amelioration of anger and anxiety was replicated with the internet sample. No control groups produced this effect, which was generated only by a combination of positive affect in a physically embracing dyad. The results are discussed in terms of attachment theory and research on spreading activation. PMID:27606897

  14. Mo-Si-B-Based Coatings for Ceramic Base Substrates

    Science.gov (United States)

    Perepezko, John Harry (Inventor); Sakidja, Ridwan (Inventor); Ritt, Patrick (Inventor)

    2015-01-01

    Alumina-containing coatings based on molybdenum (Mo), silicon (Si), and boron (B) ("MoSiB coatings") that form protective, oxidation-resistant scales on ceramic substrate at high temperatures are provided. The protective scales comprise an aluminoborosilicate glass, and may additionally contain molybdenum. Two-stage deposition methods for forming the coatings are also provided.

  15. Consumption-based Equity Valuation

    DEFF Research Database (Denmark)

    Bach, Christian; Christensen, Peter Ove

    2013-01-01

    the standard valuation models in most dimensions. We further show that the standard CAPM and the Fama-French three-factor based approaches to risk-adjustment substantially overestimate the cost of risk. This error more than offsets yet another error, which is committed when using analysts' forecasts of long-term...... growth, which are three to four times higher than what can be considered to be empirically reasonable. Using the CCAPM-based approach to risk-adjustment in the numerator, the results are consistent with investors being very conservative in their valuation of long-term value creation but also very......Using a CCAPM-based risk-adjustment model consistent with general asset pricing theory, we perform yearly valuations of a large sample of stocks listed on NYSE, AMEX and NASDAQ over a thirty-year period. The model differs from standard valuation models in the sense that it adjusts forecasted...

  16. Polyomino-Based Digital Halftoning

    CERN Document Server

    Vanderhaeghe, David

    2008-01-01

    In this work, we present a new method for generating a threshold structure. This kind of structure can be advantageously used in various halftoning algorithms such as clustered-dot or dispersed-dot dithering, error diffusion with threshold modulation, etc. The proposed method is based on rectifiable polyominoes -- a non-periodic hierarchical structure, which tiles the Euclidean plane with no gaps. Each polyomino contains a fixed number of discrete threshold values. Thanks to its inherent non-periodic nature combined with off-line optimization of threshold values, our polyomino-based threshold structure shows blue-noise spectral properties. The halftone images produced with this threshold structure have high visual quality. Although the proposed method is general, and can be applied on any polyomino tiling, we consider one particular case: tiling with G-hexominoes. We compare our polyomino-based threshold structure with the best known state-of-the-art methods for generation threshold matrices, and conclude con...
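
    The polyomino construction itself is involved, but the way any threshold structure is consumed by a dithering algorithm is simple. The sketch below shows generic ordered dithering against a tiled threshold matrix; a plain 4x4 Bayer matrix is used as a stand-in, not the polyomino-based, non-periodic structure proposed in the paper.

```python
import numpy as np

# Stand-in threshold structure: a 4x4 Bayer matrix scaled to 0..255.
# The paper replaces this periodic matrix with a non-periodic,
# polyomino-based threshold structure with blue-noise properties.
BAYER_4X4 = np.array([[ 0,  8,  2, 10],
                      [12,  4, 14,  6],
                      [ 3, 11,  1,  9],
                      [15,  7, 13,  5]]) / 16.0 * 255.0

def ordered_dither(gray, thresholds=BAYER_4X4):
    """Binarise a grayscale image (2-D uint8 array) by comparing each
    pixel with the threshold value at the corresponding tile position."""
    h, w = gray.shape
    th, tw = thresholds.shape
    # Tile the threshold structure over the whole image.
    tiled = np.tile(thresholds, (h // th + 1, w // tw + 1))[:h, :w]
    return (gray > tiled).astype(np.uint8) * 255
```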

  17. Photoelectrochemical based direct conversion systems

    Energy Technology Data Exchange (ETDEWEB)

    Kocha, S.; Arent, D.; Peterson, M. [National Renewable Energy Lab., Golden, CO (United States)] [and others

    1995-09-01

    The goal of this research is to develop a stable, cost effective, photoelectrochemical based system that will split water upon illumination, producing hydrogen and oxygen directly, using sunlight as the only energy input. This type of direct conversion system combines a photovoltaic material and an electrolyzer into a single monolithic device. We report on our studies of two multifunction multiphoton photoelectrochemical devices, one based on the ternary semiconductor gallium indium phosphide, (GaInP{sub 2}), and the other one based on amorphous silicon carbide. We also report on our studies of the solid state surface treatment of GaInP{sub 2} as well as our continuing effort to develop synthetic techniques for the attachment of transition metal complexes to the surface of semiconductor electrodes. All our surface studies are directed at controlling the interface energetics and forming stable catalytic surfaces.

  18. Measurement-based quantum repeaters

    CERN Document Server

    Zwerger, M; Briegel, H J

    2012-01-01

    We introduce measurement-based quantum repeaters, where small-scale measurement-based quantum processors are used to perform entanglement purification and entanglement swapping in a long-range quantum communication protocol. In the scheme, pre-prepared entangled states stored at intermediate repeater stations are coupled with incoming photons by simple Bell-measurements, without the need of performing additional quantum gates or measurements. We show how to construct the required resource states, and how to minimize their size. We analyze the performance of the scheme under noise and imperfections, with focus on small-scale implementations involving entangled states of few qubits. We find measurement-based purification protocols with significantly improved noise thresholds. Furthermore we show that already resource states of small size suffice to significantly increase the maximal communication distance. We also discuss possible advantages of our scheme for different set-ups.

  19. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event......-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms...... of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches....

  20. Graphene-based smart materials

    Science.gov (United States)

    Yu, Xiaowen; Cheng, Huhu; Zhang, Miao; Zhao, Yang; Qu, Liangti; Shi, Gaoquan

    2017-09-01

    The high specific surface area and the excellent mechanical, electrical, optical and thermal properties of graphene make it an attractive component for high-performance stimuli-responsive or 'smart' materials. Complementary to these inherent properties, functionalization or hybridization can substantially improve the performance of these materials. Typical graphene-based smart materials include mechanically exfoliated perfect graphene, chemical vapour deposited high-quality graphene, chemically modified graphene (for example, graphene oxide and reduced graphene oxide) and their macroscopic assemblies or composites. These materials are sensitive to a range of stimuli, including gas molecules or biomolecules, pH value, mechanical strain, electrical field, and thermal or optical excitation. In this Review, we outline different graphene-based smart materials and their potential applications in actuators, chemical or strain sensors, self-healing materials, photothermal therapy and controlled drug delivery. We also introduce the working mechanisms of graphene-based smart materials and discuss the challenges facing the realization of their practical applications.

  1. [Cell based therapy for COPD].

    Science.gov (United States)

    Kubo, Hiroshi

    2007-04-01

    To develop a new cell based therapy for chronic obstructive pulmonary disease (COPD), we need to understand 1) the role of tissue-specific and bone marrow-derived stem cells, 2) the extracellular matrix, and 3) growth factors. Recently, bronchioalveolar stem cells were identified in murine distal lungs. Impairment of these stem cells may cause improper lung repair after inflammation, resulting in pulmonary emphysema. Bone marrow-derived cells are necessary to repair injured lungs. However, the long-term role of these cells is not understood yet. Although we need more careful analysis and additional experiments, growth factors, such as hepatocyte growth factor, are good candidates for the new cell based therapy for COPD. The lung was long believed to be a non-regenerative organ. Based on these recent reports about lung regeneration and stem cells, however, new strategies to treat COPD and a new point of view for understanding the pathophysiology of COPD are emerging.

  2. Solar Panel based Milk Pasteurization

    DEFF Research Database (Denmark)

    Nielsen, Kirsten Mølgaard; Pedersen, Tom Søndergaard

    This paper treats the subject of analysis, design and development of the control system for a solar panel based milk pasteurization system to be used in small villages in Tanzania. The analysis deals with the demands for an acceptable pasteurization, the varying energy supply and the low cost, low...... complexity, simple user interface and high reliability demands. Based on these demands a concept for the pasteurization system is established and a control system is developed. A solar panel has been constructed and the energy absorption has been tested in Tanzania. Based on the test, the pasteurization...... system is dimensioned. A functional prototype of the pasteurization facility with a capacity of 200 l milk/hour has been developed and tested. The system is prepared for solar panels as the main energy source and is ready for a test in Tanzania....

  3. Solar Panel based Milk Pasteurization

    DEFF Research Database (Denmark)

    Nielsen, Kirsten Mølgaard; Pedersen, Tom Søndergaard

    2002-01-01

    This paper treats the subject of analysis, design and development of the control system for a solar panel based milk pasteurization system to be used in small villages in Tanzania. The analysis deals with the demands for an acceptable pasteurization, the varying energy supply and the low cost, low...... complexity, simple user interface and high reliability demands. Based on these demands a concept for the pasteurization system is established and a control system is developed. A solar panel has been constructed and the energy absorption has been tested in Tanzania. Based on the test, the pasteurization...... system is dimensioned. A functional prototype of the pasteurization facility with a capacity of 200 l milk/hour has been developed and tested. The system is prepared for solar panels as the main energy source and is ready for a test in Tanzania....

  4. Broadcast-Based Spatial Queries

    Institute of Scientific and Technical Information of China (English)

    Kwang-Jin Park; Moon-Bae Song; Chong-Sun Hwang

    2005-01-01

    Indexing techniques have been developed for wireless data broadcast environments, in order to conserve the scarce power resources of the mobile clients. However, the use of interleaved index segments in a broadcast cycle increases the average access latency for the clients. In this paper, the broadcast-based spatial query processing methods (BBS) are presented for location-based services. In the BBS, broadcasted data objects are sorted sequentially based on their locations, and the server broadcasts the location dependent data along with an index segment. Then, a sequential prefetching and caching scheme is designed to reduce the query response time. The performance of this scheme is investigated in relation to various environmental variables, such as the distributions of the data objects, the average speed of the clients and the size of the service area.
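
    A rough, hedged sketch of the broadcast-cycle organisation described above: data objects are sorted by location, broadcast in segments, and preceded by an index segment that lets a client ignore segments outside its query region. The segment size, the simple lexicographic location ordering and the one-dimensional index ranges used here are illustrative simplifications, not the paper's actual scheme.

```python
def build_broadcast_cycle(objects, segment_size=4):
    """objects: list of (x, y, payload). Sort by location and prepend an
    index segment that records each data segment's bounding x-range."""
    ordered = sorted(objects, key=lambda o: (o[0], o[1]))   # location order
    segments = [ordered[i:i + segment_size]
                for i in range(0, len(ordered), segment_size)]
    index = [(seg[0][0], seg[-1][0]) for seg in segments]   # (min_x, max_x)
    return index, segments

def client_window_query(index, segments, x_lo, x_hi):
    """Listen to the index first, then tune in only to the data segments
    whose x-range overlaps the query window (dozing in between)."""
    hits = []
    for (seg_lo, seg_hi), seg in zip(index, segments):
        if seg_hi >= x_lo and seg_lo <= x_hi:
            hits.extend(o for o in seg if x_lo <= o[0] <= x_hi)
    return hits
```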

  5. Musculoskeletal colloquialisms based on weapons.

    Science.gov (United States)

    Agrawal, Anuj

    2017-01-01

    Eponyms and colloquialisms are commonly used in orthopaedic literature and convey a great deal of information in a concise fashion. Several orthopaedic conditions have characteristic clinical or radiologic appearances, mimicking the appearance of certain arms or weapons. Most of these are easy to memorise and recognise, provided the orthopaedic surgeon is aware of the colloquialism and familiar with the appearance of the weapon on which it is based. Unfortunately, many such colloquialisms are based on traditional weapons no longer in current use, and their appearances are not familiar to most orthopaedists, creating confusion and difficulty in understanding them. In this paper, we have reviewed the musculoskeletal colloquialisms based on weapons, including a brief description of the weapon with illustrations, highlighting the importance of the colloquialism in diagnosis or treatment of musculoskeletal conditions.

  6. Classification of Base Sequences BS(n+1, n)

    Directory of Open Access Journals (Sweden)

    Dragomir Ž. Ðoković

    2010-01-01

    Base sequences BS(n+1, n) are quadruples of {±1}-sequences (A; B; C; D), with A and B of length n+1 and C and D of length n, such that the sum of their nonperiodic autocorrelation functions is a δ-function. The base sequence conjecture, asserting that BS(n+1, n) exist for all n, is stronger than the famous Hadamard matrix conjecture. We introduce a new definition of equivalence for base sequences BS(n+1, n) and construct a canonical form. By using this canonical form, we have enumerated the equivalence classes of BS(n+1, n) for n ≤ 30. As the number of equivalence classes grows rapidly (but not monotonically) with n, the tables in the paper cover only the cases n ≤ 13.
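
    For readers unfamiliar with the notation, the defining condition can be written out explicitly. This is the standard formulation reconstructed from the abstract, not a quotation from the paper.

        % Nonperiodic autocorrelation of a {±1}-sequence X = (x_1, ..., x_m):
        N_X(s) \;=\; \sum_{i=1}^{m-s} x_i \, x_{i+s}, \qquad s = 0, 1, \dots, m-1 .
        % (A; B; C; D) \in BS(n+1, n): A, B of length n+1, C, D of length n, and
        N_A(s) + N_B(s) + N_C(s) + N_D(s) \;=\; 0 \quad \text{for all } s \ge 1,
        % i.e. the four autocorrelations cancel at every nonzero shift, so their
        % sum is a delta function supported at s = 0.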

  7. Probe-based data storage

    CERN Document Server

    Koelmans, Wabe W; Abelmann, L

    2015-01-01

    Probe-based data storage has attracted many researchers from academia and industry, resulting in unprecedented demonstrations of high data density. This topical review gives a comprehensive overview of the main contributions that led to the major accomplishments in probe-based data storage. The most investigated technologies are reviewed: topographic, phase-change, magnetic, ferroelectric, and atomic and molecular storage. The positioning of probes and recording media, the cantilever arrays and the parallel readout of those arrays are also discussed. This overview serves two purposes. First, it provides an orientation for new researchers entering the field of probe storage, as probe storage seems to be the only way to achieve data storage at atomic densities. Second, there is an enormous wealth of invaluable findings that can also be applied to many other fields of nanoscale research, such as probe-based nanolithography, 3D nanopatterning, solid-state memory technologies and ultrafast probe microscopy.

  8. Cereal based oral rehydration solutions.

    Science.gov (United States)

    Kenya, P R; Odongo, H W; Oundo, G; Waswa, K; Muttunga, J; Molla, A M; Nath, S K; Molla, A; Greenough, W B; Juma, R

    1989-07-01

    A total of 257 boys (age range 4-55 months), who had acute diarrhoea with moderate to severe dehydration, were randomly assigned to treatment with either the World Health Organisation/United Nations Children's Fund (WHO/UNICEF) recommended oral rehydration solution or a cereal based oral rehydration solution made from maize, millet, sorghum, or rice. After initial rehydration was achieved, patients were offered traditional weaning foods. Treatment with oral rehydration solution continued until the diarrhoea stopped. Accurate intake and output records were maintained throughout the study period. Efficacy of treatment was compared between the different treatment groups in terms of intake of the solution, stool output, duration of diarrhoea after admission, and weight gain after 24, 48, and 72 hours and after resolution of diarrhoea. The results suggest that all the cereal based solutions were as effective as the glucose based standard oral rehydration solution in the treatment of diarrhoea.

  9. DNA-Based Nanopore Sensing.

    Science.gov (United States)

    Liu, Lei; Wu, Hai-Chen

    2016-12-05

    Nanopore sensing is an attractive, label-free approach that can measure single molecules. Although initially proposed for rapid and low-cost DNA sequencing, nanopore sensors have been successfully employed in the detection of a wide variety of substrates. Early successes were mostly achieved through two main strategies: 1) creating sensing elements inside the nanopore through protein mutation and chemical modification, or 2) using molecular adapters to enhance analyte recognition. Over the past five years, DNA molecules have started to be used as probes for sensing rather than as substrates for sequencing. In this Minireview, we highlight the recent research efforts in nanopore sensing based on DNA-mediated characteristic current events. As nanopore sensing becomes increasingly important in biochemical and biophysical studies, DNA-based sensing may find wider applications in investigating DNA-involving biological processes. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. On the base sequence conjecture

    CERN Document Server

    Djokovic, Dragomir Z

    2010-01-01

    Let BS(m,n) denote the set of base sequences (A;B;C;D), with A and B of length m and C and D of length n. The base sequence conjecture (BSC) asserts that BS(n+1,n) exist (i.e., are non-empty) for all n. This is known to be true for n <= 36 and when n is a Golay number. We show that it is also true for n=37 and n=38. It is worth pointing out that BSC is stronger than the famous Hadamard matrix conjecture. In order to demonstrate the abundance of base sequences, we have previously attached to BS(n+1,n) a graph Gamma_n and computed the Gamma_n for n <= 27. We now extend these computations and determine the Gamma_n for n=28,...,35. We also propose a conjecture describing these graphs in general.

  11. Diversity-Based Boosting Algorithm

    Directory of Open Access Journals (Sweden)

    Jafar A. Alzubi

    2016-05-01

    Boosting is a well-known and efficient technique for constructing a classifier ensemble. An ensemble is built incrementally by altering the distribution of the training data set and forcing learners to focus on misclassification errors. In this paper, an improvement to the Boosting algorithm called the DivBoosting algorithm is proposed and studied. Experiments on several data sets are conducted with both Boosting and DivBoosting. The experimental results show that DivBoosting is a promising method for ensemble pruning. We believe that it has many advantages over the traditional boosting method because its mechanism is not based solely on selecting the most accurate base classifiers but also on selecting the most diverse set of classifiers.
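
    The abstract's key point is that base classifiers are selected for diversity as well as accuracy. The sketch below illustrates that general idea with a greedy accuracy-plus-disagreement criterion; it is an illustration of diversity-based pruning, not the DivBoosting selection rule itself, and every name in it is hypothetical.

        # Illustrative pruning: keep classifiers that are both accurate and pairwise diverse.
        import numpy as np

        def disagreement(pred_a, pred_b):
            """Fraction of samples on which two classifiers disagree."""
            return float(np.mean(pred_a != pred_b))

        def select_diverse_subset(predictions, y_true, k):
            """Greedily pick k classifiers, trading accuracy against diversity.

            predictions: list of 1-D label arrays, one per base classifier
            y_true:      1-D array of true labels
            """
            k = min(k, len(predictions))
            accuracies = [float(np.mean(p == y_true)) for p in predictions]
            chosen = [int(np.argmax(accuracies))]          # seed with the most accurate one
            while len(chosen) < k:
                best_i, best_score = None, float("-inf")
                for i, p in enumerate(predictions):
                    if i in chosen:
                        continue
                    diversity = np.mean([disagreement(p, predictions[j]) for j in chosen])
                    score = accuracies[i] + diversity      # naive accuracy + diversity trade-off
                    if score > best_score:
                        best_i, best_score = i, score
                chosen.append(best_i)
            return chosen

        # Usage: pick 2 of 3 toy classifiers.
        y = np.array([0, 1, 1, 0])
        preds = [np.array([0, 1, 1, 0]), np.array([0, 1, 1, 1]), np.array([1, 0, 1, 0])]
        print(select_diverse_subset(preds, y, k=2))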

  12. Isotope-based quantum information

    CERN Document Server

    G Plekhanov, Vladimir

    2012-01-01

    The present book provides an introduction to the main ideas and techniques of the rapidly progressing field of quantum information and quantum computation using isotope-mixed materials. It starts with an introduction to isotope physics and then describes isotope-based quantum information and quantum computation. The ability to manipulate and control electron and/or nuclear spin in semiconductor devices provides a new route to expand the capabilities of inorganic semiconductor-based electronics and to design innovative devices with potential application in quantum computing. One of the major challenges towards these objectives is to develop semiconductor-based systems and architectures in which the spatial distribution of spins and their properties can be controlled. For instance, to eliminate electron spin decoherence resulting from hyperfine interaction due to the nuclear spin background, isotopically controlled devices are needed (i.e., nuclear spin-depleted). In other emerging concepts, the control of the spatial...

  13. Location-based Web Search

    Science.gov (United States)

    Ahlers, Dirk; Boll, Susanne

    In recent years, the relation of Web information to a physical location has gained much attention. However, Web content today often carries only an implicit relation to a location. In this chapter, we present a novel location-based search engine that automatically derives spatial context from unstructured Web resources and allows for location-based search: our focused crawler applies heuristics to crawl and analyze Web pages that have a high probability of carrying a spatial relation to a certain region or place; the location extractor identifies the actual location information from the pages; our indexer assigns a geo-context to the pages and makes them available for a later spatial Web search. We illustrate the usage of our spatial Web search for location-based applications that provide information not only right-in-time but also right-on-the-spot.
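
    The crawler / extractor / indexer pipeline described above can be pictured with a toy sketch. The regular expression, the in-memory index and all function names are illustrative placeholders rather than the authors' implementation.

        # Schematic geo-indexing pipeline: extract location cues, index pages by them,
        # then answer a spatial query against the geo-index. Sketch only.
        import re

        POSTCODE_RE = re.compile(r"\b\d{5}\b")             # toy location pattern

        def extract_locations(html_text):
            """Pull candidate location strings (here: 5-digit codes) out of a page."""
            return set(POSTCODE_RE.findall(html_text))

        def index_page(geo_index, url, html_text):
            """Assign a geo-context to the page so it can be found by spatial search."""
            for loc in extract_locations(html_text):
                geo_index.setdefault(loc, set()).add(url)

        def spatial_search(geo_index, location):
            """Return pages whose extracted geo-context matches the query location."""
            return sorted(geo_index.get(location, set()))

        # Usage: build a tiny index and query it.
        index = {}
        index_page(index, "http://example.org/cafe", "Visit us at 12345 Sampletown")
        print(spatial_search(index, "12345"))

    A real system would replace the regular expression with the paper's location extractor (gazetteer lookup plus heuristics) and the dictionary with a spatial index, but the three-stage flow is the same.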

  14. 14 CFR 119.47 - Maintaining a principal base of operations, main operations base, and main maintenance base...

    Science.gov (United States)

    2010-01-01

    14 CFR 119.47 (Aeronautics and Space): Maintaining a principal base of operations, main operations base, and main maintenance base; change of address. (a) Each certificate holder must maintain...

  15. Laser-based coatings removal

    Energy Technology Data Exchange (ETDEWEB)

    Freiwald, J.G.; Freiwald, D.A. [F2 Associates, Inc., Albuquerque, NM (United States)

    1995-10-01

    Over the years, as building and equipment surfaces became contaminated with low levels of uranium or plutonium dust, coats of paint were applied to stabilize the contaminants in place. Most of the earlier paint used was lead-based; more recently, various non-lead-based paints, such as two-part epoxy, are used. For D&D (decontamination and decommissioning), it is desirable to remove the paints or other coatings rather than having to tear down and dispose of the entire building. This report describes the use of pulse-repetition laser systems for the removal of paints and coatings.

  16. Inquiry-based science education

    DEFF Research Database (Denmark)

    Østergaard, Lars Domino; Sillasen, Martin Krabbe; Hagelskjær, Jens

    2010-01-01

    Inquiry-based science education (IBSE) is an internationally tested method in science didactics whose aim is to increase pupils' interest in and learning outcomes from science. The article describes the method, which can be characterised as a pupil-directed, problem- and inquiry-based approach to science teaching...

  17. HISTORY BASED PROBABILISTIC BACKOFF ALGORITHM

    Directory of Open Access Journals (Sweden)

    Narendran Rajagopalan

    2012-01-01

    Performance of a wireless LAN can be improved at each layer of the protocol stack with respect to energy efficiency. The Media Access Control layer is responsible for key functions such as access control and flow control. During contention, a backoff algorithm is used to gain access to the medium with minimum probability of collision. After studying different variations of backoff algorithms that have been proposed, a new variant called the History based Probabilistic Backoff algorithm is proposed. Through mathematical analysis and simulation results using NS-2, it is seen that the proposed History based Probabilistic Backoff algorithm performs better than the Binary Exponential Backoff algorithm.
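
    To make the contrast with plain binary exponential backoff concrete, the sketch below adjusts the contention window from a sliding window of recent collision outcomes. The thresholds and update rule are illustrative assumptions, not the rule analysed in the paper.

        # Illustrative history-driven backoff: the contention window grows or shrinks with
        # the recent collision rate instead of doubling on every collision. Sketch only.
        import random
        from collections import deque

        class HistoryBackoff:
            def __init__(self, cw_min=16, cw_max=1024, history_len=16):
                self.cw = cw_min
                self.cw_min, self.cw_max = cw_min, cw_max
                self.history = deque(maxlen=history_len)   # True = collision, False = success

            def record(self, collided: bool):
                """Update the contention window from the observed collision history."""
                self.history.append(collided)
                rate = sum(self.history) / len(self.history)
                if rate > 0.5:                              # heavy contention: widen window
                    self.cw = min(self.cw * 2, self.cw_max)
                elif rate < 0.2:                            # light contention: shrink window
                    self.cw = max(self.cw // 2, self.cw_min)

            def backoff_slots(self) -> int:
                """Number of idle slots to wait before the next transmission attempt."""
                return random.randint(0, self.cw - 1)

        # Usage: a station that collides three times, then succeeds twice.
        station = HistoryBackoff()
        for outcome in [True, True, True, False, False]:
            station.record(outcome)
        print(station.cw, station.backoff_slots() < station.cw)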

  18. Flexible experimental FPGA based platform

    DEFF Research Database (Denmark)

    Andersen, Karsten Holm; Nymand, Morten

    2016-01-01

    This paper presents an experimental, flexible Field Programmable Gate Array (FPGA) based platform for testing and verifying digitally controlled dc-dc converters. The platform supports different types of control strategies, dc-dc converter topologies and switching frequencies. The controller platform provides an interface supporting configuration and reading of setup parameters, controller status and the acquisition memory in a simple way. The FPGA-based platform provides an easy way, within education or research, to use different digital control strategies and different converter topologies controlled by an FPGA...

  19. Knowledge-based nursing diagnosis

    Science.gov (United States)

    Roy, Claudette; Hay, D. Robert

    1991-03-01

    Nursing diagnosis is an integral part of the nursing process and determines the interventions leading to outcomes for which the nurse is accountable. Diagnoses made under the time constraints of modern nursing can benefit from a computer assist. A knowledge-based engineering approach was developed to address these problems. A number of problems were addressed during system design to make the system practical; these extended beyond the capture of knowledge. The issues involved in implementing a professional knowledge base in a clinical setting are discussed. System functions, structure, interfaces, the health care environment, and terminology and taxonomy are discussed. An integrated system concept, from assessment through intervention and evaluation, is outlined.

  20. Community-based recreational football

    DEFF Research Database (Denmark)

    Bruun, Ditte Marie; Bjerre, Eik; Krustrup, Peter

    2014-01-01

    ... is limited and the majority of prostate cancer survivors remain sedentary. Hence, novel approaches to evaluate and promote physical activity are warranted. This paper presents the rationale behind the delivery and evaluation of community-based recreational football offered in existing football clubs under the Danish Football Association to promote quality of life and physical activity adherence in prostate cancer survivors. The RE-AIM framework will be applied to evaluate the impact of the intervention, including outcomes at both the individual and organizational level. By introducing community-based sport...