WorldWideScience

Sample records for video coding standards

  1. The emerging High Efficiency Video Coding standard (HEVC)

    International Nuclear Information System (INIS)

    Raja, Gulistan; Khan, Awais

    2013-01-01

    High-definition video (HDV) is becoming more popular every day. This paper describes a performance analysis of the latest upcoming video standard, known as High Efficiency Video Coding (HEVC). HEVC is designed to fulfil all the requirements of future high-definition video. In this paper, three configurations of HEVC (intra only, low delay and random access) are analyzed using various 480p, 720p and 1080p high-definition test video sequences. Simulation results show the superior objective and subjective quality of HEVC.
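
    Most of the evaluations collected in these records report objective quality as PSNR. As a minimal illustration of that metric (the frame data and noise here are invented, not the paper's experimental setup):

    ```python
    import numpy as np

    def psnr(original, decoded, max_val=255.0):
        """Peak signal-to-noise ratio between two frames, in dB (higher is better)."""
        mse = np.mean((original.astype(np.float64) - decoded.astype(np.float64)) ** 2)
        if mse == 0:
            return float("inf")  # identical frames
        return 10.0 * np.log10(max_val ** 2 / mse)

    # Toy 8-bit "frames": a horizontal gradient and a noisy copy of it.
    rng = np.random.default_rng(0)
    frame = np.tile(np.arange(256, dtype=np.uint8), (64, 1))
    noisy = np.clip(frame.astype(int) + rng.integers(-5, 6, frame.shape), 0, 255).astype(np.uint8)
    print(round(psnr(frame, noisy), 2))  # roughly 38 dB for this +/-5 noise
    ```

    In codec comparisons such as the one above, PSNR is computed per frame against the uncompressed source and averaged over the sequence.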

  2. Video Coding Technique using MPEG Compression Standards

    African Journals Online (AJOL)

    Akorede

    The two-dimensional discrete cosine transform (2-D DCT) is an integral part of video and image compression ... solution for the optimum trade-off by applying rate-distortion theory has been ... Int. J. the computer, the internet and management, ...
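
    As background to the snippet above, the 2-D DCT of a square block can be sketched with an orthonormal DCT-II basis matrix. This is a generic textbook construction, not the article's implementation:

    ```python
    import numpy as np

    def dct_matrix(n):
        """Orthonormal DCT-II basis matrix of size n x n."""
        k = np.arange(n)
        C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
        C[0] *= 1 / np.sqrt(n)       # DC row scaling
        C[1:] *= np.sqrt(2 / n)      # AC row scaling
        return C

    def dct2(block):
        """2-D DCT of a square block via C @ X @ C.T."""
        C = dct_matrix(block.shape[0])
        return C @ block @ C.T

    block = np.full((8, 8), 100.0)   # a flat 8x8 block
    coeffs = dct2(block)
    # A constant block concentrates all its energy in the DC coefficient;
    # the remaining coefficients are (numerically) zero.
    print(round(float(coeffs[0, 0]), 1))  # 800.0
    ```

    The inverse transform is simply `C.T @ coeffs @ C`, since the basis matrix is orthonormal.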

  3. Improved entropy encoding for high efficient video coding standard

    Directory of Open Access Journals (Sweden)

    B.S. Sunil Kumar

    2018-03-01

    The High Efficiency Video Coding (HEVC) standard has better coding efficiency than its predecessors, but its encoding performance has to be improved to meet the demands of growing multimedia applications. This paper improves standard entropy encoding by introducing optimized weighting parameters, so that a higher rate of compression can be accomplished than with standard entropy encoding. The optimization is performed using the recently introduced firefly algorithm. The experimentation is carried out using eight benchmark video sequences, and the PSNR for varying rates of data transmission is investigated. A comparative analysis based on the performance statistics is made against standard entropy encoding. From the obtained results, it is clear that the proposed method preserves the originality of the decoded video sequence far better, even though the compression rate is increased. Keywords: Entropy, Encoding, HEVC, PSNR, Compression
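
    The firefly algorithm mentioned above can be sketched generically. This is a minimal sketch of Yang's firefly metaheuristic applied to a toy sphere objective; the paper's actual objective (entropy-weighting parameters) and its settings are not reproduced here, and all parameter values below are invented:

    ```python
    import numpy as np

    def firefly_minimize(f, dim, n=15, iters=80, alpha=0.25, beta0=1.0, gamma=0.05, seed=1):
        """Firefly-algorithm sketch: fireflies drift toward brighter (lower-cost)
        neighbours, with attraction decaying with squared distance."""
        rng = np.random.default_rng(seed)
        x = rng.uniform(-5.0, 5.0, (n, dim))
        cost = np.array([f(xi) for xi in x])
        best_x, best_c = x[cost.argmin()].copy(), cost.min()
        for _ in range(iters):
            for i in range(n):
                for j in range(n):
                    if cost[j] < cost[i]:  # j is brighter: move i toward j, plus noise
                        beta = beta0 * np.exp(-gamma * np.sum((x[i] - x[j]) ** 2))
                        x[i] += beta * (x[j] - x[i]) + alpha * (rng.random(dim) - 0.5)
                        cost[i] = f(x[i])
                        if cost[i] < best_c:
                            best_x, best_c = x[i].copy(), cost[i]
            alpha *= 0.95  # anneal the random walk
        return best_x, best_c

    # Toy objective standing in for the paper's cost function: a 2-D sphere.
    best_x, best_c = firefly_minimize(lambda v: float(np.sum(v ** 2)), dim=2)
    print(round(best_c, 3))  # close to the optimum at the origin
    ```

    In the paper's setting, `f` would score a candidate set of weighting parameters by the resulting rate-distortion behaviour rather than by a closed-form function.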

  4. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AV

  5. Perceptual video quality assessment in H.264 video coding standard using objective modeling.

    Science.gov (United States)

    Karthikeyan, Ramasamy; Sainarayanan, Gopalakrishnan; Deepa, Subramaniam Nachimuthu

    2014-01-01

    Since the usage of digital video is widespread nowadays, quality considerations have become essential, and industry demand for video quality measurement is rising. This proposal provides a method of perceptual quality assessment for the H.264 standard encoder using objective modeling. For this purpose, quality impairments are calculated and a model is developed to compute the perceptual video quality metric based on a no-reference method. Because of the subtle differences between the original video and the encoded video, the quality of the encoded picture is degraded; this quality difference is introduced by encoding processes such as intra and inter prediction. The proposed model takes into account the artifacts introduced by these spatial and temporal activities in hybrid block-based coding methods, and an objective mapping of these artifacts into a subjective quality estimate is proposed. The proposed model calculates the objective quality metric using the subjective impairments blockiness, blur and jerkiness, in contrast to the bitrate-only calculation defined in the ITU-T G.1070 model. The accuracy of the proposed perceptual video quality metric is compared against popular full-reference objective methods as defined by VQEG.
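
    One of the impairments named above, blockiness, can be estimated without a reference frame. The following is a crude illustrative metric, not the paper's model: it compares luminance jumps at 8-pixel block boundaries with jumps elsewhere (test frames and the threshold of 1.0 for "no visible grid" are invented):

    ```python
    import numpy as np

    def blockiness(frame, b=8):
        """Crude no-reference blockiness score: ratio of luminance jumps that
        straddle b-pixel block boundaries to jumps elsewhere (~1.0 = no grid)."""
        d = np.abs(np.diff(frame.astype(np.float64), axis=1))  # horizontal gradients
        cols = np.arange(d.shape[1])
        at_edge = (cols + 1) % b == 0  # differences straddling a block boundary
        return d[:, at_edge].mean() / (d[:, ~at_edge].mean() + 1e-9)

    rng = np.random.default_rng(0)
    smooth = rng.integers(100, 110, (64, 64)).astype(np.float64)  # mild texture
    blocky = smooth.copy()
    blocky[:, 32:] += 40.0  # artificial luminance seam on a block boundary
    print(blockiness(smooth) < 1.5, blockiness(blocky) > 2.0)  # True True
    ```

    A full model would combine such a score with blur and jerkiness estimates before mapping to a subjective quality scale.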

  6. 3D video coding: an overview of present and upcoming standards

    Science.gov (United States)

    Merkle, Philipp; Müller, Karsten; Wiegand, Thomas

    2010-07-01

    An overview of existing and upcoming 3D video coding standards is given. Various different 3D video formats are available, each with individual pros and cons. The 3D video formats can be separated into two classes: video-only formats (such as stereo and multiview video) and depth-enhanced formats (such as video plus depth and multiview video plus depth). Since all these formats consist of at least two video sequences and possibly additional depth data, efficient compression is essential for the success of 3D video applications and technologies. For the video-only formats the H.264 family of coding standards already provides efficient and widely established compression algorithms: H.264/AVC simulcast, H.264/AVC stereo SEI message, and H.264/MVC. For the depth-enhanced formats standardized coding algorithms are currently being developed. New and specially adapted coding approaches are necessary, as the depth or disparity information included in these formats has significantly different characteristics than video and is not displayed directly, but used for rendering. Motivated by evolving market needs, MPEG has started an activity to develop a generic 3D video standard within the 3DVC ad-hoc group. Key features of the standard are efficient and flexible compression of depth-enhanced 3D video representations and decoupling of content creation and display requirements.

  7. Basic prediction techniques in modern video coding standards

    CERN Document Server

    Kim, Byung-Gyu

    2016-01-01

    This book discusses in detail the basic algorithms of video compression that are widely used in modern video codecs. The authors dissect complicated specifications and present the material in a way that gets readers quickly up to speed, describing video compression algorithms succinctly without going into the mathematical details and technical specifications. For accelerated learning, the hybrid codec structure and the inter- and intra-prediction techniques in MPEG-4, H.264/AVC, and HEVC are discussed together. In addition, the latest research on fast encoder design for HEVC and H.264/AVC is also included.

  8. Subjective Video Quality Assessment in H.264/AVC Video Coding Standard

    Directory of Open Access Journals (Sweden)

    Z. Miličević

    2012-11-01

    This paper presents an approach for subjective video quality assessment in the H.264/AVC standard. For this purpose, special software for the subjective assessment of the quality of all the tested video sequences was developed, in accordance with Recommendation ITU-T P.910, since it is suitable for the testing of multimedia applications. The obtained results show that with the proposed selective intra prediction and optimized inter prediction algorithm there is only a small difference in picture quality (signal-to-noise ratio) between the decoded original and modified video sequences.

  9. Video coding standards AVS China, H.264/MPEG-4 PART 10, HEVC, VP6, DIRAC and VC-1

    CERN Document Server

    Rao, K R; Hwang, Jae Jeong

    2014-01-01

    Review by Ashraf A. Kassim, Professor, Department of Electrical & Computer Engineering, and Associate Dean, School of Engineering, National University of Singapore.     The book consists of eight chapters, of which the first two provide an overview of various video and image coding standards and video formats. The next four chapters present in detail the Audio and Video Standard (AVS) of China, the H.264/MPEG-4 Advanced Video Coding (AVC) standard, the High Efficiency Video Coding (HEVC) standard and the VP6 video coding standard (now VP10), respectively. The performance of the wavelet-based Dirac video codec is compared with H.264/MPEG-4 AVC in chapter 7. Finally, in chapter 8, the VC-1 video coding standard is presented together with VC-2, which is based on the intra-frame coding of Dirac, and an outline of an H.264/AVC to VC-1 transcoder.   The authors also present and discuss relevant research literature, such as work documenting improved methods and techniques, and also point to other related reso...

  10. Performance and Complexity Co-evaluation of the Advanced Video Coding Standard for Cost-Effective Multimedia Communications

    Directory of Open Access Journals (Sweden)

    Saponara Sergio

    2004-01-01

    The Advanced Video Coding (AVC) standard, recently defined by a Joint Video Team (JVT) of ITU-T and ISO/IEC, is introduced in this paper together with a co-evaluation of its performance and complexity. While the basic framework is similar to the motion-compensated hybrid scheme of previous video coding standards, additional tools improve the compression efficiency at the expense of an increased implementation cost. As a first step to bridge the gap between the algorithmic design of a complex multimedia system and its cost-effective realization, a high-level co-evaluation approach is proposed and applied to a real-life AVC design. An exhaustive analysis of the codec's compression efficiency versus complexity (memory and computational costs) design space is carried out at the early algorithmic design phase. If all new coding features are used, the improved AVC compression efficiency (up to 50% compared to current video coding technology) comes with a complexity increase of a factor of 2 for the decoder and of more than one order of magnitude for the encoder. This represents a challenge for resource-constrained multimedia systems such as wireless devices or high-volume consumer electronics. The analysis also highlights an important property of the AVC framework that allows for complexity reduction at the high system level: when combining the new coding features, the implementation complexity accumulates, while the global compression efficiency saturates. Thus, a proper use of the AVC tools maintains the same performance as the most complex configuration while considerably reducing complexity. The reported results provide inputs to assist the profile definition in the standard, highlight the AVC bottlenecks, and select optimal trade-offs between algorithmic performance and complexity.
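
    Performance-versus-complexity trade-offs of the kind analyzed above are steered, inside AVC-style reference encoders, by a Lagrangian mode decision: pick the coding mode minimizing J = D + lambda * R. A toy sketch (the mode names and the distortion/rate figures below are invented):

    ```python
    # Lagrangian rate-distortion mode decision: J = D + lambda * R.
    modes = {                     # hypothetical (distortion SSD, rate in bits) per mode
        "SKIP":       (900.0,   8),
        "Inter16x16": (400.0,  96),
        "Intra4x4":   (250.0, 260),
    }

    def best_mode(lam):
        """Return the mode with the smallest Lagrangian cost D + lam * R."""
        return min(modes, key=lambda m: modes[m][0] + lam * modes[m][1])

    print(best_mode(0.5))   # low lambda favours quality  -> Intra4x4
    print(best_mode(10.0))  # high lambda favours low rate -> SKIP
    ```

    Sweeping lambda traces out the rate-distortion curve; disabling expensive modes entirely is how an encoder trades a small RD loss for a large complexity saving, as the abstract describes.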

  11. High efficiency video coding coding tools and specification

    CERN Document Server

    Wien, Mathias

    2015-01-01

    The video coding standard High Efficiency Video Coding (HEVC) targets improved compression performance for video resolutions of HD and beyond, providing Ultra HD video at compressed bit rates similar to those of HD video encoded with the well-established video coding standard H.264 | AVC. Based on known concepts, new coding structures and improved coding tools have been developed and specified in HEVC. The standard is expected to be taken up easily by established industry as well as by new endeavors, answering the needs of today's connected and ever-evolving online world. This book presents the High Efficiency Video Coding standard and explains it in clear and coherent language. It provides a comprehensive and consistently written description, all of a piece. The book targets both newcomers to video coding and experts in the field. While providing sections with introductory text for the beginner, it serves as a well-arranged reference book for the expert. The book provides a comprehensive reference for th...

  12. Distributed source coding of video

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Van Luong, Huynh

    2015-01-01

    A foundation for distributed source coding was established in the classic papers of Slepian-Wolf (SW) [1] and Wyner-Ziv (WZ) [2]. This has provided a starting point for work on Distributed Video Coding (DVC), which exploits the source statistics at the decoder side, shifting processing steps conventionally performed at the video encoder side to the decoder side. Emerging applications such as wireless visual sensor networks and wireless video surveillance all require lightweight video encoding with high coding efficiency and error-resilience. The video data of DVC schemes differ from the assumptions of SW and WZ distributed coding, e.g. by being correlated in time and nonstationary. Improving the efficiency of DVC coding is challenging. This paper presents some selected techniques to address the DVC challenges. Focus is put on pin-pointing how the decoder steps are modified to provide...

  13. Two-terminal video coding.

    Science.gov (United States)

    Yang, Yang; Stanković, Vladimir; Xiong, Zixiang; Zhao, Wei

    2009-03-01

    Following recent works on the rate region of the quadratic Gaussian two-terminal source coding problem and limit-approaching code designs, this paper examines multiterminal source coding of two correlated, i.e., stereo, video sequences to save the sum rate over independent coding of both sequences. Two multiterminal video coding schemes are proposed. In the first scheme, the left sequence of the stereo pair is coded by H.264/AVC and used at the joint decoder to facilitate Wyner-Ziv coding of the right video sequence. The first I-frame of the right sequence is successively coded by H.264/AVC Intracoding and Wyner-Ziv coding. An efficient stereo matching algorithm based on loopy belief propagation is then adopted at the decoder to produce pixel-level disparity maps between the corresponding frames of the two decoded video sequences on the fly. Based on the disparity maps, side information for both motion vectors and motion-compensated residual frames of the right sequence are generated at the decoder before Wyner-Ziv encoding. In the second scheme, source splitting is employed on top of classic and Wyner-Ziv coding for compression of both I-frames to allow flexible rate allocation between the two sequences. Experiments with both schemes on stereo video sequences using H.264/AVC, LDPC codes for Slepian-Wolf coding of the motion vectors, and scalar quantization in conjunction with LDPC codes for Wyner-Ziv coding of the residual coefficients give a slightly lower sum rate than separate H.264/AVC coding of both sequences at the same video quality.

  14. Code domain steganography in video tracks

    Science.gov (United States)

    Rymaszewski, Sławomir

    2008-01-01

    This article deals with a practical method of hiding secret information in a video stream. The method is dedicated to MPEG-2 streams. The algorithm takes into consideration not only the MPEG video coding scheme described in the standard but also the PES-packet encapsulation in the MPEG-2 Program Stream (PS). This modification gives higher capacity and more effective bit rate control for the output stream than previously proposed methods.

  15. Distributed Video Coding: Iterative Improvements

    DEFF Research Database (Denmark)

    Luong, Huynh Van

    Nowadays, emerging applications such as wireless visual sensor networks and wireless video surveillance require lightweight video encoding with high coding efficiency and error-resilience. Distributed Video Coding (DVC) is a new coding paradigm which exploits the source statistics ... and noise modeling, and also learns from the previously decoded Wyner-Ziv (WZ) frames; side information and noise learning (SING) is proposed. The SING scheme introduces an optical flow technique to compensate for the weaknesses of block-based SI generation and also utilizes clustering of DCT blocks to capture cross-band correlation and increase local adaptivity in noise modeling. During decoding, the updated information is used to iteratively reestimate the motion and reconstruction in the proposed motion and reconstruction reestimation (MORE) scheme. The MORE scheme not only reestimates the motion vectors...

  16. Huffman coding in advanced audio coding standard

    Science.gov (United States)

    Brzuchalski, Grzegorz

    2012-05-01

    This article presents several hardware architectures of the Advanced Audio Coding (AAC) Huffman noiseless encoder, its optimisations and a working implementation. Much attention has been paid to optimising the demand for hardware resources, especially memory size. The aim of the design was to produce as short a binary stream as possible in this standard. The Huffman encoder, together with the whole audio-video system, has been implemented in FPGA devices.
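
    The Huffman stage itself is standard. A minimal software sketch of building a Huffman code table and encoding a toy symbol stream follows; this illustrates the general technique only, not the fixed AAC codebooks or the paper's hardware architecture (the sample string is invented):

    ```python
    import heapq
    from collections import Counter

    def huffman_code(symbols):
        """Build a prefix-free Huffman code table from a symbol sequence."""
        freq = Counter(symbols)
        # Heap entries: (count, tie-breaker, {symbol: codeword-so-far}).
        heap = [(n, i, {s: ""}) for i, (s, n) in enumerate(freq.items())]
        heapq.heapify(heap)
        tick = len(heap)
        while len(heap) > 1:
            n1, _, c1 = heapq.heappop(heap)   # two least frequent subtrees
            n2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + w for s, w in c1.items()}
            merged.update({s: "1" + w for s, w in c2.items()})
            heapq.heappush(heap, (n1 + n2, tick, merged))
            tick += 1
        return heap[0][2]

    data = "aaaabbbcc d"
    table = huffman_code(data)
    bits = "".join(table[s] for s in data)
    # Frequent symbols get shorter codewords, so the stream beats fixed-length coding:
    print(len(bits), "bits vs", len(data) * 3, "for a 3-bit fixed code")
    ```

    A hardware implementation like the one in the article replaces the dynamic table build with stored codebooks, which is where the memory-size optimisation effort goes.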

  17. The H.264/MPEG4 advanced video coding

    Science.gov (United States)

    Gromek, Artur

    2009-06-01

    H.264/MPEG4-AVC is the newest video coding standard recommended by the International Telecommunication Union - Telecommunication Standardization Sector (ITU-T) and the ISO/IEC Moving Picture Experts Group (MPEG). H.264/MPEG4-AVC has recently become the leading standard for generic audiovisual services since its deployment in digital television. Nowadays it is commonly used in a wide range of video applications, such as mobile services, videoconferencing, IPTV, HDTV, video storage and many more. In this article, the author briefly describes the technology applied in the H.264/MPEG4-AVC video coding standard, its real-time implementation and directions for future development.

  18. Code, standard and specifications

    International Nuclear Information System (INIS)

    Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail

    2008-01-01

    Radiography, like other techniques, needs standards. These standards are widely used, and the methods for applying them are well established. Radiographic testing is therefore practical only when carried out according to documented regulations and guidelines, set out in codes, standards and specifications. In Malaysia, a level-one or basic radiographer may carry out radiographic work based on instructions given by a level-two or level-three radiographer. These instructions are produced based on the guidelines given in the documents, and the level-two radiographer must follow the specifications in the standard when writing them. This makes it clear that radiography is a type of work in which everything must follow the rules. As for codes, radiography follows the code of the American Society of Mechanical Engineers (ASME), and the only code in Malaysia at this time is the rule published by the Atomic Energy Licensing Board (AELB), known as the Practical Code for Radiation Protection in Industrial Radiography. With the existence of this code, all radiography must automatically follow the regulated rules and standards.

  19. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    OpenAIRE

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In High Efficiency Video Coding (HEVC), the coding tree contributes to excellent compression performance. However, the coding tree also brings extremely high computational complexity. This paper presents innovative work on improving the coding tree to further reduce encoding time. A novel low-complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content ...

  20. Distributed video coding with multiple side information

    DEFF Research Database (Denmark)

    Huang, Xin; Brites, C.; Ascenso, J.

    2009-01-01

    Distributed Video Coding (DVC) is a new video coding paradigm which mainly exploits the source statistics at the decoder based on the availability of some decoder side information. The quality of the side information has a major impact on the DVC rate-distortion (RD) performance, in the same way the quality of the predictions has a major impact in predictive video coding. In this paper, a DVC solution exploiting multiple side information is proposed; the multiple side information is generated by frame interpolation and frame extrapolation, aiming to improve on the side information of a single...
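
    The two side-information candidates named above, frame interpolation and frame extrapolation, can be illustrated on a toy drifting pattern. The frame model below is invented for illustration; real DVC decoders use motion-compensated versions of both operations:

    ```python
    import numpy as np

    def si_interpolate(prev_f, next_f):
        """Side information by temporal midpoint interpolation of two key frames."""
        return (prev_f + next_f) / 2.0

    def si_extrapolate(older, prev_f):
        """Side information by linear extrapolation from two past frames."""
        return np.clip(2.0 * prev_f - older, 0.0, 255.0)

    def make_frame(k, w=64):
        """Toy luma frame: a slowly drifting sinusoidal pattern (stand-in for motion)."""
        t = np.linspace(0.0, 1.0, w)
        return np.tile(100.0 + 20.0 * np.sin(t + 0.1 * k), (w, 1))

    truth = make_frame(2)                               # the missing Wyner-Ziv frame
    si1 = si_interpolate(make_frame(1), make_frame(3))  # from surrounding key frames
    si2 = si_extrapolate(make_frame(0), make_frame(1))  # from past frames only
    mse = lambda a, b: float(np.mean((a - b) ** 2))
    print(mse(si1, truth) < 1.0, mse(si2, truth) < 1.0)  # True True
    ```

    A multiple-side-information decoder then fuses such candidates, e.g. by decoding against whichever (or a combination) yields the lower correlation noise.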

  1. On video formats and coding efficiency

    NARCIS (Netherlands)

    Bellers, E.B.; Haan, de G.

    2001-01-01

    This paper examines the efficiency of MPEG-2 coding for interlaced and progressive video, and compares de-interlacing and picture rate up-conversion before and after coding. We found receiver side de-interlacing and picture rate up-conversion (i.e. after coding) to give better image quality at a

  2. Adaptive format conversion for scalable video coding

    Science.gov (United States)

    Wan, Wade K.; Lim, Jae S.

    2001-12-01

    The enhancement layer in many scalable coding algorithms is composed of residual coding information. There is another type of information that can be transmitted instead of (or in addition to) residual coding. Since the encoder has access to the original sequence, it can utilize adaptive format conversion (AFC) to generate the enhancement layer and transmit the different format conversion methods as enhancement data. This paper investigates the use of adaptive format conversion information as enhancement data in scalable video coding. Experimental results are shown for a wide range of base layer qualities and enhancement bitrates to determine when AFC can improve video scalability. Since the parameters needed for AFC are small compared to residual coding, AFC can provide video scalability at low enhancement layer bitrates that are not possible with residual coding. In addition, AFC can also be used in addition to residual coding to improve video scalability at higher enhancement layer bitrates. Adaptive format conversion has not been studied in detail, but many scalable applications may benefit from it. An example of an application that AFC is well-suited for is the migration path for digital television where AFC can provide immediate video scalability as well as assist future migrations.

  3. Complexity-aware high efficiency video coding

    CERN Document Server

    Correa, Guilherme; Agostini, Luciano; Cruz, Luis A da Silva

    2016-01-01

    This book discusses the computational complexity of High Efficiency Video Coding (HEVC) encoders, with coverage extending from the analysis of HEVC compression efficiency and computational complexity to the reduction and scaling of its encoding complexity. After an introduction to the topic and a review of the state-of-the-art research in the field, the authors provide a detailed analysis of the HEVC encoding tools' compression efficiency and computational complexity.  Readers will benefit from a set of algorithms for scaling the computational complexity of HEVC encoders, all of which take advantage of the flexibility of the frame partitioning structures allowed by the standard.  The authors also provide a set of early termination methods based on data mining and machine learning techniques, which are able to reduce the computational complexity required to find the best frame partitioning structures. The applicability of the proposed methods is finally exemplified with an encoding time control system that emplo...

  4. Communicating pictures a course in image and video coding

    CERN Document Server

    Bull, David R

    2014-01-01

    Communicating Pictures starts with a unique historical perspective of the role of images in communications and then builds on this to explain the applications and requirements of a modern video coding system. It draws on the author's extensive academic and professional experience of signal processing and video coding to deliver a text that is algorithmically rigorous, yet accessible, relevant to modern standards, and practical. It offers a thorough grounding in visual perception, and demonstrates how modern image and video compression methods can be designed in order to meet the rate-quality performance levels demanded by today's applications, networks and users. With this book you will learn: Practical issues when implementing a codec, such as picture boundary extension and complexity reduction, with particular emphasis on efficient algorithms for transforms, motion estimators and error resilience Conflicts between conventional video compression, based on variable length coding and spatiotemporal prediction,...

  5. Background-Modeling-Based Adaptive Prediction for Surveillance Video Coding.

    Science.gov (United States)

    Zhang, Xianguo; Huang, Tiejun; Tian, Yonghong; Gao, Wen

    2014-02-01

    The exponential growth of surveillance videos presents an unprecedented challenge for high-efficiency surveillance video coding technology. Compared with the existing coding standards that were basically developed for generic videos, surveillance video coding should be designed to make the best use of the special characteristics of surveillance videos (e.g., a relatively static background). To do so, this paper first conducts two analyses on how to improve the background and foreground prediction efficiencies in surveillance video coding. Following the analysis results, we propose a background-modeling-based adaptive prediction (BMAP) method. In this method, all blocks to be encoded are first classified into three categories. Then, according to the category of each block, two novel inter predictions are selectively utilized, namely, the background reference prediction (BRP) that uses the background modeled from the original input frames as the long-term reference and the background difference prediction (BDP) that predicts the current data in the background difference domain. For background blocks, the BRP can effectively improve the prediction efficiency using the higher quality background as the reference; whereas for foreground-background-hybrid blocks, the BDP can provide a better reference after subtracting its background pixels. Experimental results show that the BMAP can achieve at least twice the compression ratio on surveillance videos as AVC (MPEG-4 Advanced Video Coding) high profile, yet with only slightly higher encoding complexity. Moreover, for the foreground coding performance, which is crucial to the subjective quality of moving objects in surveillance videos, BMAP also obtains remarkable gains over several state-of-the-art methods.
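
    The block classification step underlying such background-modeling approaches can be illustrated with a simple running-average background and a per-block deviation test. This is a stand-in for the paper's model; the thresholds and test frames below are invented:

    ```python
    import numpy as np

    def update_background(bg, frame, a=0.05):
        """Running-average background model: slowly absorb scene changes."""
        return (1 - a) * bg + a * frame

    def classify_block(frame_blk, bg_blk, t=15.0):
        """Label a block background / foreground / hybrid by pixel-wise deviation."""
        moving = np.abs(frame_blk - bg_blk) > t
        ratio = moving.mean()
        if ratio < 0.05:
            return "background"
        if ratio > 0.95:
            return "foreground"
        return "hybrid"

    bg = np.full((16, 16), 50.0)
    frame = bg.copy()
    frame[4:12, 4:12] = 200.0  # a moving object covering part of the block
    print(classify_block(frame, bg))  # hybrid: 64 of 256 pixels deviate
    print(classify_block(bg, bg))     # background
    bg = update_background(bg, frame)  # model drifts toward the new observation
    ```

    In BMAP terms, "background" blocks would then use the modeled background as a long-term reference (BRP), while "hybrid" blocks would be predicted in the background difference domain (BDP).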

  6. Random Linear Network Coding for 5G Mobile Video Delivery

    Directory of Open Access Journals (Sweden)

    Dejan Vukobratovic

    2018-03-01

    An exponential increase in mobile video delivery will continue with the demand for higher-resolution, multi-view and large-scale multicast video services. The novel fifth-generation (5G) 3GPP New Radio (NR) standard will bring a number of new opportunities for optimizing video delivery across both the 5G core and radio access networks. One of the promising approaches for video quality adaptation, throughput enhancement and erasure protection is the use of packet-level random linear network coding (RLNC). In this review paper, we discuss the integration of RLNC into the 5G NR standard, building upon the ideas and opportunities identified in 4G LTE. We explicitly identify and discuss in detail novel 5G NR features that provide support for RLNC-based video delivery in 5G, thus pointing to promising avenues for future research.
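
    Packet-level RLNC itself is compact to sketch. The toy example below works over GF(2) rather than the larger fields (e.g. GF(2^8)) typically used in practice, and the packet size and generation size K are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    K = 4                                                 # source packets per generation
    src = rng.integers(0, 256, (K, 8), dtype=np.uint8)    # K packets of 8 bytes each

    def rlnc_encode(n):
        """Emit n coded packets: random GF(2) coefficient vector + XOR of payloads."""
        coded = []
        for _ in range(n):
            c = rng.integers(0, 2, K, dtype=np.uint8)
            if c.any():
                payload = np.bitwise_xor.reduce(src[c == 1], axis=0)
            else:
                payload = np.zeros(8, dtype=np.uint8)
            coded.append((c, payload))
        return coded

    def rlnc_decode(coded):
        """Gaussian elimination over GF(2); returns the K source packets or None."""
        A = np.array([c for c, _ in coded], dtype=np.uint8)
        B = np.array([p for _, p in coded], dtype=np.uint8)
        row = 0
        for col in range(K):
            piv = next((r for r in range(row, len(A)) if A[r, col]), None)
            if piv is None:
                return None                                # rank deficient: need more packets
            A[[row, piv]] = A[[piv, row]]
            B[[row, piv]] = B[[piv, row]]
            for r in range(len(A)):
                if r != row and A[r, col]:
                    A[r] ^= A[row]
                    B[r] ^= B[row]
            row += 1
        return B[:K]

    # A receiver collects coded packets until the generation becomes decodable.
    coded, decoded = [], None
    while decoded is None:
        coded += rlnc_encode(1)
        decoded = rlnc_decode(coded)
    print(bool((decoded == src).all()))  # True
    ```

    The rank-based decodability test is what makes RLNC attractive for multicast: any K linearly independent coded packets suffice, regardless of which ones were lost.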

  7. High efficiency video coding (HEVC) algorithms and architectures

    CERN Document Server

    Budagavi, Madhukar; Sullivan, Gary

    2014-01-01

    This book provides developers, engineers, researchers and students with detailed knowledge about the High Efficiency Video Coding (HEVC) standard. HEVC is the successor to the widely successful H.264/AVC video compression standard, and it provides around twice as much compression as H.264/AVC for the same level of quality. The applications for HEVC will not only cover the space of the well-known current uses and capabilities of digital video – they will also include the deployment of new services and the delivery of enhanced video quality, such as ultra-high-definition television (UHDTV) and video with higher dynamic range, wider range of representable color, and greater representation precision than what is typically found today. HEVC is the next major generation of video coding design – a flexible, reliable and robust solution that will support the next decade of video applications and ease the burden of video on world-wide network traffic. This book provides a detailed explanation of the various parts ...

  8. Multiple LDPC decoding for distributed source coding and video coding

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Luong, Huynh Van; Huang, Xin

    2011-01-01

    Distributed source coding (DSC) is a coding paradigm for systems which fully or partly exploit the source statistics at the decoder to reduce the computational burden at the encoder. Distributed video coding (DVC) is one example. This paper considers the use of Low-Density Parity-Check Accumulate (LDPCA) codes in a DSC scheme with feed-back. To improve the LDPC coding performance in the context of DSC and DVC, while retaining short encoder blocks, this paper proposes multiple parallel LDPC decoding. The proposed scheme passes soft information between decoders to enhance performance. Experimental...

  9. Coding Transparency in Object-Based Video

    DEFF Research Database (Denmark)

    Aghito, Shankar Manuel; Forchhammer, Søren

    2006-01-01

    A novel algorithm for coding gray level alpha planes in object-based video is presented. The scheme is based on segmentation in multiple layers. Different coders are specifically designed for each layer. In order to reduce the bit rate, cross-layer redundancies as well as temporal correlation are...

  10. Coding visual features extracted from video sequences.

    Science.gov (United States)

    Baroffio, Luca; Cesana, Matteo; Redondi, Alessandro; Tagliasacchi, Marco; Tubaro, Stefano

    2014-05-01

    Visual features are successfully exploited in several applications (e.g., visual search, object recognition and tracking, etc.) due to their ability to efficiently represent image content. Several visual analysis tasks require features to be transmitted over a bandwidth-limited network, thus calling for coding techniques to reduce the required bit budget, while attaining a target level of efficiency. In this paper, we propose, for the first time, a coding architecture designed for local features (e.g., SIFT, SURF) extracted from video sequences. To achieve high coding efficiency, we exploit both spatial and temporal redundancy by means of intraframe and interframe coding modes. In addition, we propose a coding mode decision based on rate-distortion optimization. The proposed coding scheme can be conveniently adopted to implement the analyze-then-compress (ATC) paradigm in the context of visual sensor networks. That is, sets of visual features are extracted from video frames, encoded at remote nodes, and finally transmitted to a central controller that performs visual analysis. This is in contrast to the traditional compress-then-analyze (CTA) paradigm, in which video sequences acquired at a node are compressed and then sent to a central unit for further processing. In this paper, we compare these coding paradigms using metrics that are routinely adopted to evaluate the suitability of visual features in the context of content-based retrieval, object recognition, and tracking. Experimental results demonstrate that, thanks to the significant coding gains achieved by the proposed coding scheme, ATC outperforms CTA with respect to all evaluation metrics.

  11. Empirical Evaluation of Superposition Coded Multicasting for Scalable Video

    KAUST Repository

    Chun Pong Lau

    2013-03-01

    In this paper we investigate cross-layer superposition coded multicast (SCM). Previous studies have proven its effectiveness in exploiting better channel capacity and service granularities via both analytical and simulation approaches. However, it has never been practically implemented using a commercial 4G system. This paper demonstrates our prototype achieving SCM using a standard 802.16-based testbed for scalable video transmissions. In particular, to implement superposition coded (SPC) modulation, we take advantage of a novel software approach, namely logical SPC (L-SPC), which aims to mimic physical-layer superposition coded modulation. The emulation results show improved throughput compared with the generic multicast method.


  12. Distributed Video Coding for Multiview and Video-plus-depth Coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo

    The interest in Distributed Video Coding (DVC) systems has grown considerably in the academic world in recent years. With DVC the correlation between frames is exploited at the decoder (joint decoding). The encoder codes the frame independently, performing relatively simple operations. Therefore......, with DVC the complexity is shifted from encoder to decoder, making the coding architecture a viable solution for encoders with limited resources. DVC may empower new applications which can benefit from this reversed coding architecture. Multiview Distributed Video Coding (M-DVC) is the application...... of the to-be-decoded frame. Another key element is the Residual estimation, indicating the reliability of the SI, which is used to calculate the parameters of the correlation noise model between SI and original frame. In this thesis new methods for Inter-camera SI generation are analyzed in the Stereo...

  13. Video coding for decoding power-constrained embedded devices

    Science.gov (United States)

    Lu, Ligang; Sheinin, Vadim

    2004-01-01

    Low power dissipation and fast processing time are crucial requirements for embedded multimedia devices. This paper presents a video coding technique to decrease the power consumption at a standard video decoder. Coupled with a small dedicated video internal memory cache on a decoder, the technique can substantially decrease the amount of data traffic to the external memory at the decoder. A decrease in data traffic to the external memory at the decoder yields multiple benefits: faster real-time processing and power savings. The encoder, given prior knowledge of the decoder's dedicated video internal memory cache management scheme, regulates its choice of motion compensated predictors to reduce the decoder's external memory accesses. This technique can be used in any standard or proprietary encoder scheme to generate a compliant output bit stream decodable by standard CPU-based and dedicated hardware-based decoders for power savings with the best quality-power cost trade-off. Our simulation results show that with a relatively small amount of dedicated video internal memory cache, the technique may decrease the traffic between the CPU and external memory by over 50%.
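    The encoder-side regulation of motion-compensated predictors can be sketched as below: among candidate motion vectors, prefer the best one whose reference block lies inside the decoder's cached window, and fall back to the overall best only if none qualifies. The candidate format and window model are illustrative assumptions, not the paper's actual cache management scheme.

```python
def choose_mv(candidates, window):
    """Prefer the lowest-SAD motion vector whose reference lies inside the
    decoder's cached window; fall back to the overall best if none does.

    candidates: list of (dx, dy, sad); window: (x0, x1, y0, y1).
    """
    x0, x1, y0, y1 = window
    cached = [mv for mv in candidates if x0 <= mv[0] <= x1 and y0 <= mv[1] <= y1]
    pool = cached or candidates
    return min(pool, key=lambda mv: mv[2])  # lower SAD = better prediction
```

    Restricting the search this way trades a small prediction-quality loss for fewer external memory accesses at the decoder, which is the quality-power trade-off the abstract describes.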

  14. Atlas C++ Coding Standard Specification

    CERN Document Server

    Albrand, S; Barberis, D; Bosman, M; Jones, B; Stavrianakou, M; Arnault, C; Candlin, D; Candlin, R; Franck, E; Hansl-Kozanecka, Traudl; Malon, D; Qian, S; Quarrie, D; Schaffer, R D

    2001-01-01

    This document defines the ATLAS C++ coding standard, that should be adhered to when writing C++ code. It has been adapted from the original "PST Coding Standard" document (http://pst.cern.ch/HandBookWorkBook/Handbook/Programming/programming.html) CERN-UCO/1999/207. The "ATLAS standard" comprises modifications, further justification and examples for some of the rules in the original PST document. All changes were discussed in the ATLAS Offline Software Quality Control Group and feedback from the collaboration was taken into account in the "current" version.

  15. Efficient Enhancement for Spatial Scalable Video Coding Transmission

    Directory of Open Access Journals (Sweden)

    Mayada Khairy

    2017-01-01

    Full Text Available Scalable Video Coding (SVC is an international standard technique for video compression. It is an extension of H.264 Advanced Video Coding (AVC. In the encoding of video streams by SVC, it is suitable to employ the macroblock (MB mode because it affords superior coding efficiency. However, the exhaustive mode decision technique that is usually used for SVC increases the computational complexity, resulting in a longer encoding time (ET. Many other algorithms were proposed to solve this problem, but at the cost of increased transmission time (TT across the network. To minimize the ET and TT, this paper introduces four efficient algorithms based on spatial scalability. The algorithms utilize the mode-distribution correlation between the base layer (BL and enhancement layers (ELs and interpolation between the EL frames. The proposed algorithms are of two categories. Those of the first category are based on interlayer residual SVC spatial scalability. They employ two methods, namely, interlayer interpolation (ILIP and the interlayer base mode (ILBM method, and enable ET and TT savings of up to 69.3% and 83.6%, respectively. The algorithms of the second category are based on full-search SVC spatial scalability. They utilize two methods, namely, full interpolation (FIP and the full-base mode (FBM method, and enable ET and TT savings of up to 55.3% and 76.6%, respectively.

  16. Probabilistic Decision Based Block Partitioning for Future Video Coding

    KAUST Repository

    Wang, Zhao

    2017-11-29

    In the latest Joint Video Exploration Team development, the quadtree plus binary tree (QTBT) block partitioning structure has been proposed for future video coding. Compared to the traditional quadtree structure of High Efficiency Video Coding (HEVC) standard, QTBT provides more flexible patterns for splitting the blocks, which results in dramatically increased combinations of block partitions and high computational complexity. In view of this, a confidence interval based early termination (CIET) scheme is proposed for QTBT to identify the unnecessary partition modes in the sense of rate-distortion (RD) optimization. In particular, a RD model is established to predict the RD cost of each partition pattern without the full encoding process. Subsequently, the mode decision problem is casted into a probabilistic framework to select the final partition based on the confidence interval decision strategy. Experimental results show that the proposed CIET algorithm can speed up QTBT block partitioning structure by reducing 54.7% encoding time with only 1.12% increase in terms of bit rate. Moreover, the proposed scheme performs consistently well for the high resolution sequences, of which the video coding efficiency is crucial in real applications.
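    The confidence-interval pruning idea can be sketched as follows: a mode survives only if its model-predicted RD cost, minus a confidence margin, could still beat the best cost found so far. The `z` value, the flat `sigma`, and the mode names are illustrative assumptions, not the paper's trained RD model.

```python
def ciet_prune(predicted, sigma, best_cost, z=1.64):
    """Confidence-interval early termination sketch: drop any partition mode
    whose predicted RD cost minus a z*sigma margin cannot beat the best
    cost found so far, so full encoding is skipped for hopeless modes."""
    return [m for m, c in predicted.items() if c - z * sigma < best_cost]
```

    Only the surviving modes would then go through the full RD evaluation, which is where the reported 54.7% encoding-time saving comes from.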

  17. Recent advances in multiview distributed video coding

    Science.gov (United States)

    Dufaux, Frederic; Ouaret, Mourad; Ebrahimi, Touradj

    2007-04-01

    We consider dense networks of surveillance cameras capturing overlapped images of the same scene from different viewing directions, such a scenario being referred to as multi-view. Data compression is paramount in such a system due to the large amount of captured data. In this paper, we propose a Multi-view Distributed Video Coding approach. It allows for low complexity / low power consumption at the encoder side, and the exploitation of inter-view correlation without communications among the cameras. We introduce a combination of temporal intra-view side information and homography inter-view side information. Simulation results show both the improvement of the side information, as well as a significant gain in terms of coding efficiency.

  18. Novel Intermode Prediction Algorithm for High Efficiency Video Coding Encoder

    Directory of Open Access Journals (Sweden)

    Chan-seob Park

    2014-01-01

    Full Text Available The joint collaborative team on video coding (JCT-VC is developing the next-generation video coding standard which is called high efficiency video coding (HEVC. In the HEVC, there are three units in block structure: coding unit (CU, prediction unit (PU, and transform unit (TU. The CU is the basic unit of region splitting, like the macroblock (MB. Each CU performs recursive splitting into four blocks with equal size, starting from the tree block. In this paper, we propose a fast CU depth decision algorithm for HEVC technology to reduce its computational complexity. In 2N×2N PU, the proposed method compares the rate-distortion (RD cost and determines the depth using the compared information. Moreover, in order to speed up the encoding time, the efficient merge SKIP detection method is developed additionally based on the contextual mode information of neighboring CUs. Experimental results show that the proposed algorithm achieves an average time-saving factor of 44.84% in the random access (RA Main profile configuration with the HEVC test model (HM 10.0 reference software. Compared to the HM 10.0 encoder, a small BD-bitrate loss of 0.17% is also observed without significant loss of image quality.
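    The recursive CU splitting that this abstract builds on can be sketched as below: a CU is split only when the four sub-CUs at the next depth cost less, in RD terms, than coding the block whole. The `rd_cost(depth)` callable is a stand-in for the encoder's actual RD evaluation; the structure, not the costs, is the point.

```python
def decide_cu(rd_cost, depth=0, max_depth=3):
    """Recursive CU split decision sketch: split only when four sub-CUs at
    depth+1 cost less than coding the block at the current depth."""
    if depth == max_depth:
        return {"depth": depth, "split": False}
    if 4 * rd_cost(depth + 1) < rd_cost(depth):
        children = [decide_cu(rd_cost, depth + 1, max_depth) for _ in range(4)]
        return {"depth": depth, "split": True, "children": children}
    return {"depth": depth, "split": False}
```

    Fast algorithms such as the one in this record aim to reach the same decision while evaluating `rd_cost` for fewer (depth, mode) combinations.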

  19. Scalable Video Coding with Interlayer Signal Decorrelation Techniques

    Directory of Open Access Journals (Sweden)

    Yang Wenxian

    2007-01-01

    Full Text Available Scalability is one of the essential requirements in the compression of visual data for present-day multimedia communications and storage. The basic building block for providing the spatial scalability in the scalable video coding (SVC standard is the well-known Laplacian pyramid (LP. An LP achieves the multiscale representation of the video as a base-layer signal at lower resolution together with several enhancement-layer signals at successive higher resolutions. In this paper, we propose to improve the coding performance of the enhancement layers through efficient interlayer decorrelation techniques. We first show that, with nonbiorthogonal upsampling and downsampling filters, the base layer and the enhancement layers are correlated. We investigate two structures to reduce this correlation. The first structure updates the base-layer signal by subtracting from it the low-frequency component of the enhancement layer signal. The second structure modifies the prediction in order that the low-frequency component in the new enhancement layer is diminished. The second structure is integrated in the JSVM 4.0 codec with suitable modifications in the prediction modes. Experimental results with some standard test sequences demonstrate coding gains up to 1 dB for I pictures and up to 0.7 dB for both I and P pictures.
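    The Laplacian pyramid that underlies SVC spatial scalability can be sketched in one dimension: the base layer is a downsampled low-pass version of the signal, and the enhancement layer is the residual against the interpolated base. The filter taps here are illustrative, not the SVC filters.

```python
import numpy as np

KERNEL = np.array([0.25, 0.5, 0.25])  # illustrative low-pass filter

def _interpolate(base, n):
    """Upsample the base layer back to length n by zero-stuffing and filtering."""
    up = np.zeros(n)
    up[::2] = base
    return np.convolve(up, 2 * KERNEL, mode="same")

def lp_decompose(x):
    """One-level 1-D Laplacian pyramid: base = downsampled low-pass signal,
    enhancement = residual between the input and the interpolated base."""
    low = np.convolve(x, KERNEL, mode="same")
    base = low[::2]
    enh = x - _interpolate(base, len(x))
    return base, enh

def lp_reconstruct(base, enh):
    return _interpolate(base, len(enh)) + enh
```

    Reconstruction is exact by construction, which is why, as the abstract notes, the coding gain must come from decorrelating the layers rather than from the decomposition itself.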

  20. Expressing Youth Voice through Video Games and Coding

    Science.gov (United States)

    Martin, Crystle

    2017-01-01

    A growing body of research focuses on the impact of video games and coding on learning. The research often elevates learning the technical skills associated with video games and coding or the importance of problem solving and computational thinking, which are, of course, necessary and relevant. However, the literature less often explores how young…

  1. Hybrid Video Coding Based on Bidimensional Matching Pursuit

    Directory of Open Access Journals (Sweden)

    Lorenzo Granai

    2004-12-01

    Full Text Available Hybrid video coding combines together two stages: first, motion estimation and compensation predict each frame from the neighboring frames, then the prediction error is coded, reducing the correlation in the spatial domain. In this work, we focus on the latter stage, presenting a scheme that profits from some of the features introduced by the standard H.264/AVC for motion estimation and replaces the transform in the spatial domain. The prediction error is so coded using the matching pursuit algorithm which decomposes the signal over an appositely designed bidimensional, anisotropic, redundant dictionary. Comparisons are made among the proposed technique, H.264, and a DCT-based coding scheme. Moreover, we introduce fast techniques for atom selection, which exploit the spatial localization of the atoms. An adaptive coding scheme aimed at optimizing the resource allocation is also presented, together with a rate-distortion study for the matching pursuit algorithm. Results show that the proposed scheme outperforms the standard DCT, especially at very low bit rates.
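    The matching pursuit decomposition used for the prediction error can be sketched as follows. The dictionary here is a generic matrix of unit-norm atoms, not the anisotropic bidimensional dictionary designed in the paper.

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms=3):
    """Greedy matching pursuit sketch: repeatedly pick the (unit-norm) atom
    most correlated with the residual, store (index, coefficient), and
    subtract its contribution from the residual."""
    residual = np.asarray(signal, dtype=float).copy()
    picks = []
    for _ in range(n_atoms):
        corr = dictionary @ residual          # inner products with all atoms
        k = int(np.argmax(np.abs(corr)))
        picks.append((k, float(corr[k])))
        residual = residual - corr[k] * dictionary[k]
    return picks, residual
```

    The coder then entropy-codes the (index, quantized coefficient) pairs; the fast atom-selection techniques mentioned in the abstract prune which rows of `dictionary` need to be correlated at each step.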

  2. Video over DSL with LDGM Codes for Interactive Applications

    Directory of Open Access Journals (Sweden)

    Laith Al-Jobouri

    2016-05-01

    Full Text Available Digital Subscriber Line (DSL network access is subject to error bursts, which, for interactive video, can introduce unacceptable latencies if video packets need to be re-sent. If the video packets are protected against errors with Forward Error Correction (FEC, calculation of the application-layer channel codes themselves may also introduce additional latency. This paper proposes Low-Density Generator Matrix (LDGM codes rather than other popular codes because they are more suitable for interactive video streaming, not only for their computational simplicity but also for their licensing advantage. The paper demonstrates that a reduction of up to 4 dB in video distortion is achievable with LDGM Application Layer (AL FEC. In addition, an extension to the LDGM scheme is demonstrated, which works by rearranging the columns of the parity check matrix so as to make it even more resilient to burst errors. Telemedicine and video conferencing are typical target applications.
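    A minimal sketch of the two ideas in this abstract: systematic LDGM encoding (parity = sparse generator times message over GF(2)) and a simple interleaver that, like the paper's column rearrangement, spreads a burst of channel errors across the codeword. The tiny generator matrix below is illustrative, not a code from the paper.

```python
import numpy as np

def ldgm_encode(msg, G):
    """Systematic LDGM encoding sketch over GF(2): codeword = [message | parity],
    with parity = G @ message mod 2 and G a sparse (low-density) generator."""
    msg = np.asarray(msg)
    parity = (G @ msg) % 2
    return np.concatenate([msg, parity])

def interleave(codeword, depth=2):
    """Block interleaver sketch: adjacent channel errors land on symbols that
    are far apart in the original codeword, improving burst resilience."""
    return np.concatenate([codeword[i::depth] for i in range(depth)])
```

    Because the generator is sparse, each parity bit is a XOR of only a few message bits, which is the computational-simplicity advantage the abstract cites for interactive streaming.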

  3. Image and video compression for multimedia engineering fundamentals, algorithms, and standards

    CERN Document Server

    Shi, Yun Q

    2008-01-01

    Part I: Fundamentals Introduction Quantization Differential Coding Transform Coding Variable-Length Coding: Information Theory Results (II) Run-Length and Dictionary Coding: Information Theory Results (III) Part II: Still Image Compression Still Image Coding: Standard JPEG Wavelet Transform for Image Coding: JPEG2000 Nonstandard Still Image Coding Part III: Motion Estimation and Compensation Motion Analysis and Motion Compensation Block Matching Pel-Recursive Technique Optical Flow Further Discussion and Summary on 2-D Motion Estimation Part IV: Video Compression Fundam

  4. Perceptual coding of stereo endoscopy video for minimally invasive surgery

    Science.gov (United States)

    Bartoli, Guido; Menegaz, Gloria; Yang, Guang Zhong

    2007-03-01

    In this paper, we propose a compression scheme that is tailored for stereo-laparoscope sequences. The inter-frame correlation is modeled by the deformation field obtained by elastic registration between two subsequent frames and exploited for prediction of the left sequence. The right sequence is lossy encoded by prediction from the corresponding left images. Wavelet-based coding is applied to both the deformation vector fields and residual images. The resulting system supports spatiotemporal scalability while providing lossless performance. The implementation of the wavelet transform by integer lifting ensures low computational complexity, thus reducing the required run-time memory allocation and enabling online implementation. Extensive psychovisual tests were performed for system validation and characterization with respect to the MPEG4 standard for video coding. Results are very encouraging: the PSVC system features the functionalities making it suitable for PACS while providing a good trade-off between usability and performance in lossy mode.

  5. Video processing for human perceptual visual quality-oriented video coding.

    Science.gov (United States)

    Oh, Hyungsuk; Kim, Wonha

    2013-04-01

    We have developed a video processing method that achieves human perceptual visual quality-oriented video coding. The patterns of moving objects are modeled by considering the limited human capacity for spatial-temporal resolution and the visual sensory memory together, and an online moving pattern classifier is devised by using the Hedge algorithm. The moving pattern classifier is embedded in the existing visual saliency with the purpose of providing a human perceptual video quality saliency model. In order to apply the developed saliency model to video coding, the conventional foveation filtering method is extended. The proposed foveation filter can smooth and enhance the video signals locally, in conformance with the developed saliency model, without causing any artifacts. The performance evaluation results confirm that the proposed video processing method shows reliable improvements in the perceptual quality for various sequences and at various bandwidths, compared to existing saliency-based video coding methods.

  6. Complexity control algorithm based on adaptive mode selection for interframe coding in high efficiency video coding

    Science.gov (United States)

    Chen, Gang; Yang, Bing; Zhang, Xiaoyun; Gao, Zhiyong

    2017-07-01

    The latest high efficiency video coding (HEVC) standard significantly increases the encoding complexity for improving its coding efficiency. Due to the limited computational capability of handheld devices, complexity constrained video coding has drawn great attention in recent years. A complexity control algorithm based on adaptive mode selection is proposed for interframe coding in HEVC. Considering the direct proportionality between encoding time and computational complexity, the computational complexity is measured in terms of encoding time. First, complexity is mapped to a target in terms of prediction modes. Then, an adaptive mode selection algorithm is proposed for the mode decision process. Specifically, the optimal mode combination scheme that is chosen through offline statistics is developed at low complexity. If the complexity budget has not been used up, an adaptive mode sorting method is employed to further improve coding efficiency. The experimental results show that the proposed algorithm achieves a very large complexity control range (as low as 10%) for the HEVC encoder while maintaining good rate-distortion performance. For the low-delay P condition, compared with the direct resource allocation method and the state-of-the-art method, an average gain of 0.63 and 0.17 dB in BDPSNR is observed for 18 sequences when the target complexity is around 40%.

  7. Noise Residual Learning for Noise Modeling in Distributed Video Coding

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Forchhammer, Søren

    2012-01-01

    Distributed video coding (DVC) is a coding paradigm which exploits the source statistics at the decoder side to reduce the complexity at the encoder. The noise model is one of the inherently difficult challenges in DVC. This paper considers Transform Domain Wyner-Ziv (TDWZ) coding and proposes...

  8. Video traffic characteristics of modern encoding standards: H.264/AVC with SVC and MVC extensions and H.265/HEVC.

    Science.gov (United States)

    Seeling, Patrick; Reisslein, Martin

    2014-01-01

    Video encoding for multimedia services over communication networks has significantly advanced in recent years with the development of the highly efficient and flexible H.264/AVC video coding standard and its SVC extension. The emerging H.265/HEVC video coding standard as well as 3D video coding further advance video coding for multimedia communications. This paper first gives an overview of these new video coding standards and then examines their implications for multimedia communications by studying the traffic characteristics of long videos encoded with the new coding standards. We review video coding advances from MPEG-2 and MPEG-4 Part 2 to H.264/AVC and its SVC and MVC extensions as well as H.265/HEVC. For single-layer (nonscalable) video, we compare H.265/HEVC and H.264/AVC in terms of video traffic and statistical multiplexing characteristics. Our study is the first to examine the H.265/HEVC traffic variability for long videos. We also illustrate the video traffic characteristics and statistical multiplexing of scalable video encoded with the SVC extension of H.264/AVC as well as 3D video encoded with the MVC extension of H.264/AVC.

  9. Coding the Complexity of Activity in Video Recordings

    DEFF Research Database (Denmark)

    Harter, Christopher Daniel; Otrel-Cass, Kathrin

    2017-01-01

    This paper presents a theoretical approach to coding and analyzing video data on human interaction and activity, using principles found in cultural historical activity theory. The systematic classification or coding of information contained in video data on activity can be arduous and time...... Bødker’s in 1996, three possible areas of expansion to Susanne Bødker’s method for analyzing video data were found. Firstly, a technological expansion due to contemporary developments in sophisticated analysis software, since the mid 1990’s. Secondly, a conceptual expansion, where the applicability...... of using Activity Theory outside of the context of human–computer interaction, is assessed. Lastly, a temporal expansion, by facilitating an organized method for tracking the development of activities over time, within the coding and analysis of video data. To expand on the above areas, a prototype coding...

  10. Film grain noise modeling in advanced video coding

    Science.gov (United States)

    Oh, Byung Tae; Kuo, C.-C. Jay; Sun, Shijun; Lei, Shawmin

    2007-01-01

    A new technique for film grain noise extraction, modeling and synthesis is proposed and applied to the coding of high definition video in this work. The film grain noise is viewed as a part of artistic presentation by people in the movie industry. On one hand, since the film grain noise can boost the natural appearance of pictures in high definition video, it should be preserved in high-fidelity video processing systems. On the other hand, video coding with film grain noise is expensive. It is desirable to extract film grain noise from the input video as a pre-processing step at the encoder and re-synthesize the film grain noise and add it back to the decoded video as a post-processing step at the decoder. Under this framework, the coding gain of the denoised video is higher while the quality of the final reconstructed video can still be well preserved. Following this idea, we present a method to remove film grain noise from image/video without distorting its original content. Besides, we describe a parametric model containing a small set of parameters to represent the extracted film grain noise. The proposed model generates the film grain noise that is close to the real one in terms of power spectral density and cross-channel spectral correlation. Experimental results are shown to demonstrate the efficiency of the proposed scheme.
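    The synthesis side of the framework can be sketched as below: generate white Gaussian noise and shape it with a simple first-order causal filter so it acquires the spatial correlation of real grain, then scale and add it back to the decoded frame. The `strength` and `rho` parameters and the filter form are illustrative assumptions, not the parametric model of the paper.

```python
import numpy as np

def synthesize_grain(shape, strength=0.1, rho=0.6, seed=0):
    """Toy film-grain synthesis: white Gaussian noise shaped by a first-order
    causal filter, giving neighboring samples positive correlation."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(shape)
    grain = np.zeros(shape)
    for i in range(shape[0]):
        for j in range(shape[1]):
            left = grain[i, j - 1] if j else 0.0
            up = grain[i - 1, j] if i else 0.0
            grain[i, j] = 0.5 * rho * (left + up) + noise[i, j]
    return strength * grain
```

    At the decoder, adding `synthesize_grain(frame.shape, ...)` to the denoised reconstruction restores the grainy appearance without ever having to spend bits coding the noise itself.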

  11. Scalable-to-lossless transform domain distributed video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Ukhanova, Ann; Veselov, Anton

    2010-01-01

    Distributed video coding (DVC) is a novel approach providing new features such as low-complexity encoding by mainly exploiting the source statistics at the decoder based on the availability of decoder side information. In this paper, scalable-to-lossless DVC is presented based on extending a lossy Tran... codec provides frame-by-frame encoding. Comparing the lossless coding efficiency, the proposed scalable-to-lossless TDWZ video codec can save up to 5%-13% bits compared to JPEG LS and H.264 Intra frame lossless coding and do so as a scalable-to-lossless coding.

  12. Improved side information generation for distributed video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2008-01-01

    As a new coding paradigm, distributed video coding (DVC) deals with lossy source coding using side information to exploit the statistics at the decoder to reduce computational demands at the encoder. The performance of DVC highly depends on the quality of side information. With a better side...... information generation method, fewer bits will be requested from the encoder and more reliable decoded frames will be obtained. In this paper, a side information generation method is introduced to further improve the rate-distortion (RD) performance of transform domain distributed video coding. This algorithm...

  13. Extending JPEG-LS for low-complexity scalable video coding

    DEFF Research Database (Denmark)

    Ukhanova, Anna; Sergeev, Anton; Forchhammer, Søren

    2011-01-01

    JPEG-LS, the well-known international standard for lossless and near-lossless image compression, was originally designed for non-scalable applications. In this paper we propose a scalable modification of JPEG-LS and compare it with the leading image and video coding standards JPEG2000 and H.264/SVC...

  14. Method and device for decoding coded digital video signals

    NARCIS (Netherlands)

    2000-01-01

    The invention relates to a video coding method and system including a quantization and coding sub-assembly (38) in which a quantization parameter is controlled by another parameter defined as being in direct relation with the dynamic range value of the data contained in given blocks of pixels.

  15. Context based Coding of Quantized Alpha Planes for Video Objects

    DEFF Research Database (Denmark)

    Aghito, Shankar Manuel; Forchhammer, Søren

    2002-01-01

    In object based video, each frame is a composition of objects that are coded separately. The composition is performed through the alpha plane that represents the transparency of the object. We present an alternative to MPEG-4 for coding of alpha planes that considers their specific properties....... Comparisons in terms of rate and distortion are provided, showing that the proposed coding scheme for still alpha planes is better than the algorithms for I-frames used in MPEG-4....

  16. Least-Square Prediction for Backward Adaptive Video Coding

    Directory of Open Access Journals (Sweden)

    Li Xin

    2006-01-01

    Full Text Available Almost all existing approaches towards video coding exploit the temporal redundancy by block-matching-based motion estimation and compensation. Regardless of its popularity, block matching still reflects an ad hoc understanding of the relationship between motion and intensity uncertainty models. In this paper, we present a novel backward adaptive approach, named "least-square prediction" (LSP, and demonstrate its potential in video coding. Motivated by the duality between edge contour in images and motion trajectory in video, we propose to derive the best prediction of the current frame from its causal past using the least-square method. It is demonstrated that LSP is particularly effective for modeling video material with slow motion and can be extended to handle fast motion by temporal warping and forward adaptation. For typical QCIF test sequences, LSP often achieves smaller MSE than the full-search, quarter-pel block matching algorithm (BMA without the need of transmitting any overhead.
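    The backward-adaptive idea reduces to ordinary least squares over the causal past, here in one dimension for clarity: fit predictor weights from already-decoded samples (so the decoder can fit the same weights and no overhead is transmitted), then predict the next sample. The 1-D setting and the `order` parameter are illustrative; the paper works on 2-D frames.

```python
import numpy as np

def lsp_predict(history, order=2):
    """Least-square prediction sketch: fit weights w minimizing ||A w - y||^2
    over the causal past (each row of A holds `order` consecutive samples,
    y the sample that followed), then predict the next sample."""
    history = np.asarray(history, dtype=float)
    A = np.array([history[i:i + order] for i in range(len(history) - order)])
    y = history[order:]
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(history[-order:] @ w)
```

    On a linear ramp the fitted weights reproduce the trend exactly, which illustrates why LSP models slow, smooth motion well.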

  17. 78 FR 18321 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2013-03-26

    ... Energy Conservation Code. International Existing Building Code. International Fire Code. International... Code. International Property Maintenance Code. International Residential Code. International Swimming Pool and Spa Code. International Wildland-Urban Interface Code. International Zoning Code. ICC Standards...

  18. A robust fusion method for multiview distributed video coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Ascenso, Joao; Brites, Catarina

    2014-01-01

    Distributed video coding (DVC) is a coding paradigm which exploits the redundancy of the source (video) at the decoder side, as opposed to predictive coding, where the encoder leverages the redundancy. To exploit the correlation between views, multiview predictive video codecs require the encoder...... with a robust fusion system able to improve the quality of the fused SI along the decoding process through a learning process using already decoded data. We shall here take the approach to fuse the estimated distributions of the SIs as opposed to a conventional fusion algorithm based on the fusion of pixel...... values. The proposed solution is able to achieve gains up to 0.9 dB in Bjøntegaard difference when compared with the best-performing (in a RD sense) single SI DVC decoder, chosen as the best of an inter-view and a temporal SI-based decoder one....

  19. Spatial Pyramid Covariance based Compact Video Code for Robust Face Retrieval in TV-series.

    Science.gov (United States)

    Li, Yan; Wang, Ruiping; Cui, Zhen; Shan, Shiguang; Chen, Xilin

    2016-10-10

    We address the problem of face video retrieval in TV-series, which searches video clips for the presence of a specific character given one of his/her face tracks. This is tremendously challenging because on one hand, faces in TV-series are captured in largely uncontrolled conditions with complex appearance variations, and on the other hand the retrieval task typically needs an efficient representation with low time and space complexity. To handle this problem, we propose a compact and discriminative representation for the huge body of video data, named Compact Video Code (CVC). Our method first models the face track by its sample (i.e., frame) covariance matrix to capture the video data variations in a statistical manner. To incorporate discriminative information and obtain a more compact video signature suitable for retrieval, the high-dimensional covariance representation is further encoded as a much lower-dimensional binary vector, which finally yields the proposed CVC. Specifically, each bit of the code, i.e., each dimension of the binary vector, is produced via supervised learning in a max margin framework, which aims to strike a balance between the discriminability and stability of the code. Besides, we further extend the descriptive granularity of the covariance matrix from the traditional pixel-level to a more general patch-level, and proceed to propose a novel hierarchical video representation named Spatial Pyramid Covariance (SPC) along with a fast calculation method. Face retrieval experiments on two challenging TV-series video databases, i.e., the Big Bang Theory and Prison Break, demonstrate the competitiveness of the proposed CVC over state-of-the-art retrieval methods. In addition, as a general video matching algorithm, CVC is also evaluated in the traditional video face recognition task on a standard Internet database, i.e., YouTube Celebrities, showing quite promising performance by using an extremely compact code with only 128 bits.
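    The covariance-then-binarize pipeline can be sketched as follows. Fixed random projections stand in for the learned max-margin hash functions of the paper, and the feature dimensions are toy-sized; only the pipeline shape (track → covariance → vectorized upper triangle → binary code → Hamming matching) follows the abstract.

```python
import numpy as np

def compact_video_code(track, projections):
    """Toy Compact Video Code: model a face track by the covariance of its
    per-frame feature vectors, vectorize the upper triangle, and binarize
    with fixed linear projections (one bit per projection)."""
    X = np.asarray(track, dtype=float)            # (n_frames, dim)
    C = np.cov(X, rowvar=False)                   # (dim, dim) covariance
    v = C[np.triu_indices_from(C)]                # compact vector form
    return (projections @ v > 0).astype(np.uint8)

def hamming(a, b):
    """Retrieval ranks tracks by Hamming distance between their codes."""
    return int(np.count_nonzero(a != b))
```

    A 128-projection matrix would give the 128-bit signatures mentioned in the abstract, and Hamming distance over such codes is what makes the retrieval cheap in both time and space.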

  20. Selective encryption for H.264/AVC video coding

    Science.gov (United States)

    Shi, Tuo; King, Brian; Salama, Paul

    2006-02-01

    Due to the ease with which digital data can be manipulated and due to the ongoing advancements that have brought us closer to pervasive computing, the secure delivery of video and images has become a challenging problem. Despite the advantages and opportunities that digital video provides, illegal copying and distribution as well as plagiarism of digital audio, images, and video are still ongoing. In this paper we describe two techniques for securing H.264 coded video streams. The first technique, SEH264Algorithm1, groups the data into the following blocks: (1) a block that contains the sequence parameter set and the picture parameter set, (2) a block containing a compressed intra coded frame, (3) a block containing the slice header of a P slice, all the headers of the macroblocks within the same P slice, and all the luma and chroma DC coefficients belonging to all the macroblocks within the same slice, (4) a block containing all the AC coefficients, and (5) a block containing all the motion vectors. The first three are encrypted whereas the last two are not. The second method, SEH264Algorithm2, relies on the use of multiple slices per coded frame. The algorithm searches the compressed video sequence for start codes (0x000001) and then encrypts the next N bits of data.
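    The second method's scan-and-encrypt step can be sketched as below. For simplicity the sketch works on whole bytes rather than bits, and a repeating XOR keystream stands in for a real cipher; both are assumptions, not the paper's construction.

```python
def encrypt_after_start_codes(stream, n, keystream):
    """SEH264Algorithm2-style sketch: scan for the 3-byte start code 0x000001
    and XOR the following n bytes with a keystream. XOR makes the same
    function serve as both encryption and decryption."""
    data = bytearray(stream)
    i = 0
    while i <= len(data) - 3:
        if data[i] == 0x00 and data[i + 1] == 0x00 and data[i + 2] == 0x01:
            for k in range(n):
                if i + 3 + k < len(data):
                    data[i + 3 + k] ^= keystream[k % len(keystream)]
            i += 3 + n   # skip past the protected region
        else:
            i += 1
    return bytes(data)
```

    Leaving the start codes themselves in the clear keeps the stream parseable, so a decoder can still locate slice boundaries while the payload immediately after each start code is protected.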

  1. Efficient Power Allocation for Video over Superposition Coding

    KAUST Repository

    Lau, Chun Pong

    2013-03-01

    In this paper we consider a wireless multimedia system that maps a scalable video coded (SVC) bit stream onto superposition coded (SPC) signals, referred to as the SVC-SPC architecture. Empirical experiments using a software-defined radio (SDR) emulator are conducted to gain a better understanding of its efficiency, specifically the impact of different power allocation ratios on the received signal. Our experimental results show that to maintain high video quality, the power allocated to the base layer should be approximately four times higher than the power allocated to the enhancement layer.
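
    The reported 4:1 power split can be given a back-of-the-envelope reading: under superposition coding, the base layer is decoded treating the enhancement layer as interference, which is then cancelled before the enhancement layer is decoded. The power, noise, and SINR figures below are illustrative assumptions, not values from the paper.

```python
# SINR of the two superposed layers for a given power-allocation ratio,
# assuming successive interference cancellation at the receiver.
def layer_sinrs(total_power, ratio, noise):
    """ratio = base-layer power / enhancement-layer power."""
    p_enh = total_power / (ratio + 1)
    p_base = total_power - p_enh
    sinr_base = p_base / (p_enh + noise)  # enhancement acts as interference
    sinr_enh = p_enh / noise              # after successive cancellation
    return sinr_base, sinr_enh

base, enh = layer_sinrs(total_power=10.0, ratio=4.0, noise=0.5)
print(round(base, 2), round(enh, 2))  # 3.2 4.0
```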

  2. Building Standards and Codes for Energy Conservation

    Science.gov (United States)

    Gross, James G.; Pierlert, James H.

    1977-01-01

    Current activity intended to lead to energy conservation measures in building codes and standards is reviewed by members of the Office of Building Standards and Codes Services of the National Bureau of Standards. For journal availability see HE 508 931. (LBH)

  3. Nondestructive testing standards and the ASME code

    International Nuclear Information System (INIS)

    Spanner, J.C.

    1991-04-01

    Nondestructive testing (NDT) requirements and standards are an important part of the ASME Boiler and Pressure Vessel Code. In this paper, the evolution of these requirements and standards is reviewed in the context of the unique technical and legal stature of the ASME Code. The coherent and consistent manner in which the ASME Code rules are organized is described, and the interrelationship between the various ASME Code sections, the piping codes, and the ASTM standards is discussed. Significant changes occurred in ASME Sections V and XI during the 1980s, and these are highlighted along with projections and comments regarding future trends and changes in these important documents. 4 refs., 8 tabs

  4. Codes and Standards Technical Team Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    None

    2013-06-01

    The Hydrogen Codes and Standards Tech Team (CSTT) mission is to enable and facilitate the appropriate research, development, and demonstration (RD&D) needed to develop safe, performance-based, defensible technical codes and standards that support technology readiness and are appropriate for widespread consumer use of fuel cells and hydrogen-based technologies, with commercialization by 2020. Therefore, it is important that the necessary codes and standards be in place no later than 2015.

  5. Medical Ultrasound Video Coding with H.265/HEVC Based on ROI Extraction.

    Science.gov (United States)

    Wu, Yueying; Liu, Pengyu; Gao, Yuan; Jia, Kebin

    2016-01-01

    High-efficiency video compression technology is of primary importance to the storage and transmission of digital medical video in modern medical communication systems. To further improve the compression performance of medical ultrasound video, two innovative technologies based on diagnostic region-of-interest (ROI) extraction using the high efficiency video coding (H.265/HEVC) standard are presented in this paper. First, an effective ROI extraction algorithm based on image textural features is proposed to strengthen the applicability of ROI detection results in the H.265/HEVC quad-tree coding structure. Second, a hierarchical coding method based on transform coefficient adjustment and a quantization parameter (QP) selection process is designed to apply differentiated encoding to ROIs and non-ROIs. Experimental results demonstrate that the proposed optimization strategy significantly improves the coding performance by achieving a BD-BR reduction of 13.52% and a BD-PSNR gain of 1.16 dB on average compared to H.265/HEVC (HM15.0). The proposed medical video coding algorithm is expected to satisfy low bit-rate compression requirements for modern medical communication systems.

  6. Medical Ultrasound Video Coding with H.265/HEVC Based on ROI Extraction.

    Directory of Open Access Journals (Sweden)

    Yueying Wu

    Full Text Available High-efficiency video compression technology is of primary importance to the storage and transmission of digital medical video in modern medical communication systems. To further improve the compression performance of medical ultrasound video, two innovative technologies based on diagnostic region-of-interest (ROI) extraction using the high efficiency video coding (H.265/HEVC) standard are presented in this paper. First, an effective ROI extraction algorithm based on image textural features is proposed to strengthen the applicability of ROI detection results in the H.265/HEVC quad-tree coding structure. Second, a hierarchical coding method based on transform coefficient adjustment and a quantization parameter (QP) selection process is designed to apply differentiated encoding to ROIs and non-ROIs. Experimental results demonstrate that the proposed optimization strategy significantly improves the coding performance by achieving a BD-BR reduction of 13.52% and a BD-PSNR gain of 1.16 dB on average compared to H.265/HEVC (HM15.0). The proposed medical video coding algorithm is expected to satisfy low bit-rate compression requirements for modern medical communication systems.
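
    The hierarchical QP selection idea described above can be sketched minimally: coding-tree units flagged as diagnostic ROI receive a lower QP (finer quantization) than the background. The base QP and the offsets below are illustrative assumptions, not the paper's values.

```python
# Assign a QP per coding-tree unit from its ROI flag: lower QP (better
# quality) inside the diagnostic region, higher QP elsewhere.
def assign_qp(ctu_roi_flags, base_qp=32, roi_offset=-4, bg_offset=+4):
    return [base_qp + (roi_offset if roi else bg_offset)
            for roi in ctu_roi_flags]

qps = assign_qp([True, True, False, False, True])
print(qps)  # [28, 28, 36, 36, 28]
```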

  7. Intra prediction using face continuity in 360-degree video coding

    Science.gov (United States)

    Hanhart, Philippe; He, Yuwen; Ye, Yan

    2017-09-01

    This paper presents a new reference sample derivation method for intra prediction in 360-degree video coding. Unlike the conventional reference sample derivation method for 2D video coding, which uses the samples located directly above and to the left of the current block, the proposed method considers the spherical nature of 360-degree video when deriving reference samples located outside the current face to which the block belongs, and derives reference samples that are geometric neighbors on the sphere. The proposed reference sample derivation method was implemented in the Joint Exploration Model 3.0 (JEM-3.0) for the cubemap projection format. Simulation results for the all-intra configuration show that, compared with the conventional reference sample derivation method, the proposed method gives an average luma BD-rate reduction of 0.3% in terms of the weighted spherical PSNR (WS-PSNR) and spherical PSNR (SPSNR) metrics.
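
    The key geometric step, mapping a sample position to its point on the unit sphere so that neighbours can be found across face boundaries, can be sketched for the equirectangular format (the paper works with cubemap faces, but the spherical mapping idea is the same; the resolution below is an assumption).

```python
import math

# Map an equirectangular pixel position to its 3D point on the unit
# sphere; geometric neighbours can then be found by angular distance.
def equirect_to_sphere(u, v, width, height):
    lon = (u / width - 0.5) * 2.0 * math.pi  # longitude in [-pi, pi)
    lat = (0.5 - v / height) * math.pi       # latitude in [-pi/2, pi/2]
    x = math.cos(lat) * math.cos(lon)
    y = math.cos(lat) * math.sin(lon)
    z = math.sin(lat)
    return x, y, z

# Image centre maps to the point straight ahead on the sphere.
x, y, z = equirect_to_sphere(u=960, v=540, width=1920, height=1080)
print(round(x, 3), round(y, 3), round(z, 3))  # 1.0 0.0 0.0
```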

  8. The Simple Video Coder: A free tool for efficiently coding social video data.

    Science.gov (United States)

    Barto, Daniel; Bird, Clark W; Hamilton, Derek A; Fink, Brandi C

    2017-08-01

    Videotaping of experimental sessions is a common practice across many disciplines of psychology, ranging from clinical therapy, to developmental science, to animal research. Audio-visual data are a rich source of information that can be easily recorded; however, analysis of the recordings presents a major obstacle to project completion. Coding behavior is time-consuming and often requires ad-hoc training of a student coder. In addition, existing software is either prohibitively expensive or cumbersome, which leaves researchers with inadequate tools to quickly process video data. We offer the Simple Video Coder-free, open-source software for behavior coding that is flexible in accommodating different experimental designs, is intuitive for students to use, and produces outcome measures of event timing, frequency, and duration. Finally, the software also offers extraction tools to splice video into coded segments suitable for training future human coders or for use as input for pattern classification algorithms.

  9. H.264 Layered Coded Video over Wireless Networks: Channel Coding and Modulation Constraints

    Directory of Open Access Journals (Sweden)

    Ghandi MM

    2006-01-01

    Full Text Available This paper considers the prioritised transmission of H.264 layered coded video over wireless channels. For appropriate protection of video data, methods such as prioritised forward error correction coding (FEC) or hierarchical quadrature amplitude modulation (HQAM) can be employed, but each imposes system constraints. FEC provides good protection but at the price of high overhead and complexity. HQAM is less complex and does not introduce any overhead, but permits only fixed data ratios between the priority layers. Such constraints are analysed and practical solutions are proposed for layered transmission of data-partitioned and SNR-scalable coded video, where combinations of HQAM and FEC are used to exploit the advantages of both coding methods. Simulation results show that the flexibility of SNR scalability and the absence of picture drift imply that SNR scalability as modelled is superior to data partitioning in such applications.
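
    The fixed-ratio constraint of HQAM mentioned above comes from its constellation geometry, which a minimal 16-point hierarchical mapper illustrates: two high-priority (HP) bits select the quadrant, two low-priority (LP) bits select the point within it, and making d1 larger than d2 protects the HP bits at the LP bits' expense. The distances are illustrative assumptions.

```python
# Hierarchical 16-QAM mapper: HP bits pick the quadrant (coarse, well
# separated), LP bits pick the point inside the quadrant (fine offsets).
def hqam16_map(hp, lp, d1=2.0, d2=0.5):
    i = (1 if hp[0] == 0 else -1) * d1 + (1 if lp[0] == 0 else -1) * d2
    q = (1 if hp[1] == 0 else -1) * d1 + (1 if lp[1] == 0 else -1) * d2
    return i, q

print(hqam16_map(hp=(0, 1), lp=(1, 0)))  # (1.5, -1.5)
```

Because every symbol always carries exactly two HP and two LP bits, the HP:LP data ratio is fixed at 1:1, which is the rigidity the abstract refers to.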

  10. Transcoding method from H.264/AVC to high efficiency video coding based on similarity of intraprediction, interprediction, and motion vector

    Science.gov (United States)

    Liu, Mei-Feng; Zhong, Guo-Yun; He, Xiao-Hai; Qing, Lin-Bo

    2016-09-01

    Currently, most video resources online are encoded in the H.264/AVC format. Smoother video transmission can be obtained if these resources are encoded in the newest international video coding standard: high efficiency video coding (HEVC). In order to improve video transmission and storage online, a transcoding method from H.264/AVC to HEVC is proposed. In this transcoding algorithm, the intraprediction, interprediction, and motion vector (MV) information in the H.264/AVC video stream is used to accelerate the coding in HEVC. It is found through experiments that the region of interprediction in HEVC overlaps that in H.264/AVC. Therefore, intraprediction in HEVC can be skipped for regions that are interpredicted in H.264/AVC, reducing coding complexity. Several macroblocks in H.264/AVC are combined into one prediction unit (PU) in HEVC when the MV difference between two of the macroblocks in H.264/AVC is lower than a threshold. This method selects only one coding unit depth and one PU mode to reduce the coding complexity. An MV interpolation method for the combined PU in HEVC is proposed according to the areas and the distances between the center of each macroblock in H.264/AVC and that of the PU in HEVC. The predicted MV accelerates the motion estimation in HEVC coding. The simulation results show that our proposed algorithm achieves a significant coding time reduction with little rate-distortion loss, compared to existing transcoding algorithms and normal HEVC coding.
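
    The area- and distance-based MV interpolation described above can be sketched as a weighted mean of the covered macroblock MVs, weighting each macroblock by its area and its closeness to the PU centre. The exact weighting in the paper may differ; the area/distance form below is an assumption.

```python
import math

# Interpolate the MV of a merged HEVC PU from the H.264 macroblocks it
# covers, weighting each MB MV by area / distance-to-PU-centre.
def interpolate_mv(pu_center, macroblocks):
    """macroblocks: list of (center, area, (mvx, mvy))."""
    wx = wy = wsum = 0.0
    for center, area, (mvx, mvy) in macroblocks:
        dist = math.dist(pu_center, center) + 1e-6  # avoid divide-by-zero
        w = area / dist
        wx, wy, wsum = wx + w * mvx, wy + w * mvy, wsum + w
    return wx / wsum, wy / wsum

# Two equally sized, equally distant MBs: the result is the plain average.
mv = interpolate_mv((16, 16), [((8, 8), 256, (4, 0)), ((24, 24), 256, (0, 4))])
print(mv)  # (2.0, 2.0)
```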

  11. Error Resilience in Current Distributed Video Coding Architectures

    Directory of Open Access Journals (Sweden)

    Tonoli Claudia

    2009-01-01

    Full Text Available In distributed video coding, signal prediction is shifted to the decoder side, therefore placing most of the computational complexity burden on the receiver. Moreover, since no prediction loop exists before transmission, an intrinsic robustness to transmission errors has been claimed. This work evaluates and compares the error resilience performance of two distributed video coding architectures. In particular, we have considered a video codec based on the Stanford architecture (the DISCOVER codec) and a video codec based on the PRISM architecture. Specifically, an accurate temporal and rate/distortion based evaluation of the effects of transmission errors has been performed and discussed for both of the considered DVC architectures. These approaches have also been compared with H.264/AVC, in both cases of no error protection and simple FEC error protection. Our evaluations have highlighted in all cases a strong dependence of the behavior of the various codecs on the content of the considered video sequence. In particular, PRISM seems to be particularly well suited for low-motion sequences, whereas DISCOVER provides better performance in the other cases.

  12. Investigating the structure preserving encryption of high efficiency video coding (HEVC)

    Science.gov (United States)

    Shahid, Zafar; Puech, William

    2013-02-01

    This paper presents a novel method for real-time protection of the new, emerging High Efficiency Video Coding (HEVC) standard. Structure-preserving selective encryption is performed in the CABAC entropy coding module of HEVC, which differs significantly from the CABAC entropy coding of H.264/AVC. In the CABAC of HEVC, exponential Golomb coding is replaced by truncated Rice (TR) coding up to a specific value for the binarization of transform coefficients. Selective encryption is performed using the AES cipher in cipher feedback mode on a plaintext of binstrings in a context-aware manner. The encrypted bitstream has exactly the same bit rate and is format compliant. Experimental evaluation and security analysis of the proposed algorithm are performed on several benchmark video sequences containing different combinations of motion, texture, and objects.
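
    The truncated Rice binarization mentioned above can be illustrated with a plain Golomb-Rice coder: a unary prefix of n >> k followed by the k low-order bits of n. The cutoff and escape to Exp-Golomb that HEVC applies above a threshold are omitted here for brevity.

```python
# Golomb-Rice binarization: unary-coded quotient, k-bit remainder.
def rice_binarize(n, k):
    prefix = "1" * (n >> k) + "0"
    suffix = format(n & ((1 << k) - 1), "b").zfill(k) if k else ""
    return prefix + suffix

print(rice_binarize(5, 1))  # 1101
print(rice_binarize(3, 2))  # 011
```

Structure-preserving encryption operates on such binstrings so that the bit length (and hence the bit rate and format compliance) is unchanged.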

  13. Joint disparity and motion estimation using optical flow for multiview Distributed Video Coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Raket, Lars Lau; Brites, Catarina

    2014-01-01

    Distributed Video Coding (DVC) is a video coding paradigm where the source statistics are exploited at the decoder based on the availability of Side Information (SI). In a monoview video codec, the SI is generated by exploiting the temporal redundancy of the video, through motion estimation and c...

  14. Electrical, instrumentation, and control codes and standards

    International Nuclear Information System (INIS)

    Kranning, A.N.

    1978-01-01

    During recent years numerous documents in the form of codes and standards have been developed and published to provide design, fabrication and construction rules and criteria applicable to instrumentation, control and power distribution facilities for nuclear power plants. The contents of this LTR were prepared by NUS Corporation under Subcontract K5108 and provide a consolidated index and listing of the documents selected for their application to procurement of materials and design of modifications and new construction at the LOFT facility. These codes and standards should be applied together with the National Electrical Code, the ID Engineering Standards and LOFT Specifications to all LOFT instrument and electrical design activities

  15. Efficient Coding of Shape and Transparency for Video Objects

    DEFF Research Database (Denmark)

    Aghito, Shankar Manuel; Forchhammer, Søren

    2007-01-01

    A novel scheme for coding gray-level alpha planes in object-based video is presented. Gray-level alpha planes convey the shape and the transparency information, which are required for smooth composition of video objects. The algorithm proposed is based on the segmentation of the alpha plane...... in three layers: binary shape layer, opaque layer, and intermediate layer. Thus, the latter two layers replace the single transparency layer of MPEG-4 Part 2. Different encoding schemes are specifically designed for each layer, utilizing cross-layer correlations to reduce the bit rate. First, the binary...... demonstrating that the proposed techniques provide substantial bit rate savings coding shape and transparency when compared to the tools adopted in MPEG-4 Part 2....

  16. Video coding and decoding devices and methods preserving ppg relevant information

    NARCIS (Netherlands)

    2013-01-01

    The present invention relates to a video encoding device (10) for encoding video data and a corresponding video decoding device, wherein during decoding PPG relevant information shall be preserved. For this purpose the video coding device (10) comprises a first encoder (20) for encoding input video

  17. Standardized Definitions for Code Verification Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-14

    This document contains standardized definitions for several commonly used code verification test problems. These definitions are intended to contain sufficient information to set up the test problem in a computational physics code. These definitions are intended to be used in conjunction with exact solutions to these problems generated using ExactPack, www.github.com/lanl/exactpack.

  18. Distributed coding/decoding complexity in video sensor networks.

    Science.gov (United States)

    Cordeiro, Paulo J; Assunção, Pedro

    2012-01-01

    Video Sensor Networks (VSNs) are recent communication infrastructures used to capture and transmit dense visual information from an application context. In such large scale environments which include video coding, transmission and display/storage, there are several open problems to overcome in practical implementations. This paper addresses the most relevant challenges posed by VSNs, namely stringent bandwidth usage and processing time/power constraints. In particular, the paper proposes a novel VSN architecture where large sets of visual sensors with embedded processors are used for compression and transmission of coded streams to gateways, which in turn transrate the incoming streams and adapt them to the variable complexity requirements of both the sensor encoders and end-user decoder terminals. Such gateways provide real-time transcoding functionalities for bandwidth adaptation and coding/decoding complexity distribution by transferring the most complex video encoding/decoding tasks to the transcoding gateway at the expense of a limited increase in bit rate. Then, a method to reduce the decoding complexity, suitable for system-on-chip implementation, is proposed to operate at the transcoding gateway whenever decoders with constrained resources are targeted. The results show that the proposed method achieves good performance and its inclusion into the VSN infrastructure provides an additional level of complexity control functionality.

  19. Recent activities on nuclear codes and standards

    International Nuclear Information System (INIS)

    Minematsu, Akiyoshi; Ishimoto, Shozaburo; Honjin, Masao

    2000-01-01

    Technical codes and standards for nuclear power stations in Japan take the form of laws (ministerial ordinances and bulletins) issued by the government, compliance with which is required under 'the Law concerning the Regulations of Nuclear Material Substances, Nuclear Fuel Substances and Nuclear Reactors' and 'the Electricity Business Act', together with guides defined by the Nuclear Safety Commission; in addition, some private standards have been issued, with national endorsement, to complement these laws and guides. Meanwhile, in the fields of electricity and heat facilities outside atomic energy, the national technical codes and standards were recently simplified and made performance-based, creating a system in which private standards from inside and outside Japan can be used once approved by the private Japan Electrotechnical Standards and Codes Committee (JESC). Since the nuclear field was excluded from this simultaneous transfer to private standards and the standards application system, a similar transfer is expected in the future where possible, and the preparation of private standards is now being advanced. The present state of the technical codes and standards relating to nuclear power generation facilities and recent trends in their transfer to private standards are introduced here. (G.K.)

  20. A New Video Coding Algorithm Using 3D-Subband Coding and Lattice Vector Quantization

    Energy Technology Data Exchange (ETDEWEB)

    Choi, J.H. [Taejon Junior College, Taejon (Korea, Republic of); Lee, K.Y. [Sung Kyun Kwan University, Suwon (Korea, Republic of)

    1997-12-01

    In this paper, we propose an efficient motion-adaptive three-dimensional (3D) video coding algorithm using 3D subband coding (3D-SBC) and lattice vector quantization (LVQ) for low bit rates. Instead of splitting input video sequences into a fixed number of subbands along the temporal axis, we decompose them into temporal subbands of variable size according to the motion in the frames. Each of the seven spatio-temporally split subbands is partitioned using a quad-tree technique and coded with lattice vector quantization (LVQ). The simulation results show a 0.1 to 4.3 dB gain over H.261 in peak signal-to-noise ratio (PSNR) at a low bit rate (64 kbps). (author). 13 refs., 13 figs., 4 tabs.
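
    Lattice vector quantization amounts to snapping a vector to the nearest point of a structured lattice. Below is a sketch for the D4 lattice (integer vectors with even coordinate sum) using the classic Conway-Sloane rounding procedure; the choice of D4 is an assumption, since the abstract does not name the lattice used.

```python
# Nearest-point search in D4: round componentwise; if the coordinate sum
# is odd, re-round the coordinate with the largest rounding error the
# other way so the sum becomes even.
def quantize_d4(x):
    g = [round(v) for v in x]
    if sum(g) % 2 == 0:
        return g
    errs = [abs(v - gv) for v, gv in zip(x, g)]
    i = errs.index(max(errs))
    g[i] += 1 if x[i] > g[i] else -1
    return g

p = quantize_d4([0.9, 0.2, -0.1, 0.1])
print(p)  # [1, 1, 0, 0]
assert sum(p) % 2 == 0  # every D4 point has an even coordinate sum
```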

  1. Spherical rotation orientation indication for HEVC and JEM coding of 360 degree video

    Science.gov (United States)

    Boyce, Jill; Xu, Qian

    2017-09-01

    Omnidirectional (or "360 degree") video, representing a panoramic view of a spherical 360°×180° scene, can be encoded using conventional video compression standards, once it has been projection mapped to a 2D rectangular format. The equirectangular projection format is currently used for mapping 360 degree video to a rectangular representation for coding using HEVC/JEM. However, video in the top and bottom regions of the image, corresponding to the "north pole" and "south pole" of the spherical representation, is significantly warped. We propose to perform spherical rotation of the input video prior to HEVC/JEM encoding in order to improve the coding efficiency, and to signal parameters in a supplemental enhancement information (SEI) message that describe the inverse rotation process recommended to be applied following HEVC/JEM decoding, prior to display. Experimental results show that up to 17.8% bitrate gain (using the WS-PSNR end-to-end metric) can be achieved for the Chairlift sequence using HM16.15 and 11.9% gain using JEM6.0, and an average gain of 2.9% for HM16.15 and 2.2% for JEM6.0.
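
    The scheme's core operation, a global rotation of the sphere before encoding with the inverse signalled in an SEI message, can be sketched for the yaw component alone (the full scheme also signals pitch and roll).

```python
import math

# Rotate a point on the sphere about the vertical axis by yaw_deg.
# Applying the negated angle afterwards (the signalled inverse)
# restores the original point.
def rotate_yaw(point, yaw_deg):
    x, y, z = point
    a = math.radians(yaw_deg)
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a),
            z)

p = rotate_yaw((1.0, 0.0, 0.0), 90)
inv = rotate_yaw(p, -90)  # the signalled inverse restores the input
print(tuple(round(v, 6) for v in inv))  # (1.0, 0.0, 0.0)
```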

  2. Rulemaking efforts on codes and standards

    International Nuclear Information System (INIS)

    Millman, G.C.

    1992-01-01

    Section 50.55a of the NRC regulations provides a mechanism for incorporating national codes and standards into the regulatory process. It incorporates by reference the ASME Boiler and Pressure Vessel Code (ASME B and PV Code) Section III rules for construction and Section XI rules for inservice inspection and inservice testing. The regulation is periodically amended to update these references. The rulemaking process, as applied to Section 50.55a amendments, is overviewed to familiarize users with associated internal activities of the NRC staff and the manner in which public comments are integrated into the process. The four ongoing rulemaking actions that would individually amend Section 50.55a are summarized. Two of the actions would directly impact requirements for inservice testing. Benefits accrued with NRC endorsement of the ASME B and PV Code, and possible future endorsement of the ASME Operations and Maintenance Code (ASME OM Code), are identified. Emphasis is placed on the need for code writing committees to be especially sensitive to user feedback on code rules incorporated into the regulatory process to ensure that the rules are complete, technically accurate, clear, practical, and enforceable.

  3. Depth-based Multi-View 3D Video Coding

    DEFF Research Database (Denmark)

    Zamarin, Marco

    techniques are used to extract dense motion information and generate improved candidate side information. Multiple candidates are merged employing multi-hypothesis strategies. Promising rate-distortion performance improvements compared with state-of-the-art Wyner-Ziv decoders are reported, both when texture......-view video. Depth maps are typically used to synthesize the desired output views, and the performance of view synthesis algorithms strongly depends on the accuracy of depth information. In this thesis, novel algorithms for efficient depth map compression in MVD scenarios are proposed, with particular focus...... on edge-preserving solutions. In a proposed scheme, texture-depth correlation is exploited to predict surface shapes in the depth signal. In this way depth coding performance can be improved in terms of both compression gain and edge-preservation. Another solution proposes a new intra coding mode targeted...

  4. JPEG2000-Compatible Scalable Scheme for Wavelet-Based Video Coding

    Directory of Open Access Journals (Sweden)

    Thomas André

    2007-03-01

    Full Text Available We present a simple yet efficient scalable scheme for wavelet-based video coders, able to provide on-demand spatial, temporal, and SNR scalability, and fully compatible with the still-image coding standard JPEG2000. Whereas hybrid video coders must undergo significant changes in order to support scalability, our coder only requires a specific wavelet filter for temporal analysis, as well as an adapted bit allocation procedure based on models of rate-distortion curves. Our study shows that scalably encoded sequences have the same or almost the same quality as nonscalably encoded ones, without a significant increase in complexity. Full compatibility with Motion JPEG2000, which is emerging as a serious candidate for the compression of high-definition video sequences, is ensured.

  5. JPEG2000-Compatible Scalable Scheme for Wavelet-Based Video Coding

    Directory of Open Access Journals (Sweden)

    André Thomas

    2007-01-01

    Full Text Available We present a simple yet efficient scalable scheme for wavelet-based video coders, able to provide on-demand spatial, temporal, and SNR scalability, and fully compatible with the still-image coding standard JPEG2000. Whereas hybrid video coders must undergo significant changes in order to support scalability, our coder only requires a specific wavelet filter for temporal analysis, as well as an adapted bit allocation procedure based on models of rate-distortion curves. Our study shows that scalably encoded sequences have the same or almost the same quality as nonscalably encoded ones, without a significant increase in complexity. Full compatibility with Motion JPEG2000, which is emerging as a serious candidate for the compression of high-definition video sequences, is ensured.

  6. A hardware-oriented concurrent TZ search algorithm for High-Efficiency Video Coding

    Science.gov (United States)

    Doan, Nghia; Kim, Tae Sung; Rhee, Chae Eun; Lee, Hyuk-Jae

    2017-12-01

    High-Efficiency Video Coding (HEVC) is the latest video coding standard, achieving double the compression performance of its predecessor, the H.264/AVC standard, at unchanged video quality. In HEVC, the test zone (TZ) search algorithm is widely used for integer motion estimation because it effectively finds a good-quality motion vector with a relatively small amount of computation. However, the complex computation structure of the TZ search algorithm makes it difficult to implement in hardware. This paper proposes a new integer motion estimation algorithm designed for hardware execution, obtained by modifying the conventional TZ search to allow parallel motion estimation of all prediction unit (PU) partitions. The algorithm consists of three phases: zonal, raster, and refinement searches. At the beginning of each phase, the algorithm obtains the search points required by the original TZ search for all PU partitions in a coding unit (CU). All redundant search points are then removed prior to the estimation of the motion costs, and the best search points are selected for all PUs. Compared to the conventional TZ search algorithm, experimental results show that the proposed algorithm decreases the Bjøntegaard Delta bitrate (BD-BR) by 0.84% and reduces the computational complexity by 54.54%.
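
    The point-pooling idea behind the proposed parallelization can be sketched simply: gather the zonal search points of all PU partitions into a set so that duplicates are evaluated only once. The diamond pattern and stride set below are simplified assumptions, not the paper's exact patterns.

```python
# Pool the zonal (diamond-pattern) search points of several PUs into one
# set; duplicate points across PUs are then cost-evaluated only once.
def diamond_points(center, stride):
    cx, cy = center
    return {(cx + stride, cy), (cx - stride, cy),
            (cx, cy + stride), (cx, cy - stride)}

def pooled_search_points(pu_centers, strides=(1, 2, 4, 8)):
    points = set()
    for c in pu_centers:
        for s in strides:
            points |= diamond_points(c, s)
    return points

# Two PUs share a centre, a third overlaps: 3 x 16 raw points dedupe to 29.
pts = pooled_search_points([(0, 0), (0, 0), (2, 0)])
print(len(pts))  # 29
```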

  7. An Adaptive Motion Estimation Scheme for Video Coding

    Directory of Open Access Journals (Sweden)

    Pengyu Liu

    2014-01-01

    Full Text Available The unsymmetrical-cross multihexagon-grid search (UMHexagonS) is one of the best fast motion estimation (ME) algorithms in video encoding software. It achieves excellent coding performance by using a hybrid block matching search pattern and multiple initial search point predictors, at the cost of increased ME computational complexity. Reducing the time consumed by ME is one of the key factors in improving video coding efficiency. In this paper, we propose an adaptive motion estimation scheme to further reduce the redundant calculations of UMHexagonS. First, new motion estimation search patterns are designed according to the statistics of the motion vector (MV) distribution. Next, an MV distribution prediction method is designed, covering both the magnitude and the direction of the MV. Finally, according to the MV distribution prediction results, self-adaptive subregional searching is achieved with the new search patterns. Experimental results show that more than 50% of the total search points are eliminated compared to the UMHexagonS algorithm in JM 18.4 of H.264/AVC. As a result, the proposed scheme can reduce ME time by up to 20.86% while the rate-distortion performance is not compromised.
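
    The scheme's central decision, choosing a smaller search pattern when the predicted motion is small, can be sketched as a rule on neighbouring MV magnitudes; the thresholds and pattern names below are illustrative assumptions, not the paper's values.

```python
import statistics

# Pick a search pattern from the median L1 magnitude of neighbouring MVs:
# small, concentrated motion -> small pattern, large motion -> large grid.
def choose_pattern(neighbor_mvs, small_thresh=2.0, medium_thresh=8.0):
    mags = [abs(x) + abs(y) for x, y in neighbor_mvs]
    m = statistics.median(mags) if mags else 0.0
    if m < small_thresh:
        return "small-diamond"
    if m < medium_thresh:
        return "hexagon"
    return "multi-hexagon-grid"

print(choose_pattern([(0, 1), (1, 0), (1, 1)]))  # small-diamond
```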

  8. Office of Codes and Standards resource book. Section 1, Building energy codes and standards

    Energy Technology Data Exchange (ETDEWEB)

    Hattrup, M.P.

    1995-01-01

    The US Department of Energy's (DOE's) Office of Codes and Standards has developed this Resource Book to provide: a discussion of DOE involvement in building codes and standards; a current and accurate set of descriptions of residential, commercial, and Federal building codes and standards; information on State contacts, State code status, State building construction unit volume, and State needs; and a list of stakeholders in the building energy codes and standards arena. The Resource Book is considered an evolving document and will be updated occasionally. Users are requested to submit additional data (e.g., more current, widely accepted, and/or documented data) and suggested changes to the address listed below. Please provide sources for all data provided.

  9. Traffic and Quality Characterization of the H.264/AVC Scalable Video Coding Extension

    Directory of Open Access Journals (Sweden)

    Geert Van der Auwera

    2008-01-01

    Full Text Available The recent scalable video coding (SVC) extension to the H.264/AVC video coding standard has unprecedented compression efficiency while supporting a wide range of scalability modes, including temporal, spatial, and quality (SNR) scalability, as well as combined spatiotemporal SNR scalability. The traffic characteristics, especially the bit rate variabilities, of the individual layer streams critically affect their network transport. We study the SVC traffic statistics, including the bit rate distortion and bit rate variability distortion, with long CIF resolution video sequences and compare them with the corresponding MPEG-4 Part 2 traffic statistics. We consider (i) temporal scalability with three temporal layers, (ii) spatial scalability with a QCIF base layer and a CIF enhancement layer, as well as (iii) quality scalability modes FGS and MGS. We find that the significant improvement in RD efficiency of SVC is accompanied by substantially higher traffic variabilities as compared to the equivalent MPEG-4 Part 2 streams. We find that separately analyzing the traffic of temporal-scalability-only encodings gives reasonable estimates of the traffic statistics of the temporal layers embedded in combined spatiotemporal encodings and in the base layer of combined FGS-temporal encodings. Overall, we find that SVC achieves significantly higher compression ratios than MPEG-4 Part 2, but produces unprecedented levels of traffic variability, thus presenting new challenges for the network transport of scalable video.
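
    The bit rate variability the study measures can be summarized per stream by the coefficient of variation (standard deviation over mean) of the frame sizes, a common single-number variability metric for video traces; the frame sizes below are made-up illustrative values.

```python
import statistics

# Coefficient of variation of a stream's frame sizes: higher values mean
# burstier traffic, which is harder to transport over a network.
def cov(frame_sizes):
    return statistics.pstdev(frame_sizes) / statistics.mean(frame_sizes)

svc_layer = [1200, 300, 280, 900, 310, 290]  # I/P/B-like size pattern
print(round(cov(svc_layer), 3))  # 0.67
```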

  10. 3D Scan-Based Wavelet Transform and Quality Control for Video Coding

    Directory of Open Access Journals (Sweden)

    Parisot Christophe

    2003-01-01

    Full Text Available Wavelet coding has been shown to achieve better compression than DCT coding and moreover allows scalability. The 2D DWT can easily be extended to 3D and thus applied to video coding. However, 3D subband coding of video suffers from two drawbacks. The first is the amount of memory required for coding large 3D blocks; the second is the lack of temporal quality due to the temporal splitting of the sequence. In fact, 3D block-based video coders produce jerks, which appear at temporal block borders during video playback. In this paper, we propose a new temporal scan-based wavelet transform method for video coding that combines the advantages of wavelet coding (performance, scalability) with acceptably reduced memory requirements, no additional CPU complexity, and no jerks. We also propose an efficient quality allocation procedure to ensure constant quality over time.
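
The temporal subband split that such coders build on can be illustrated with a one-level Haar transform over pairs of frames. This is a toy sketch; the paper's scan-based transform additionally bounds memory use and avoids block-boundary jerks, which this version does not attempt. Frames are flat lists of pixel values:

```python
def temporal_haar(frames):
    """One-level temporal Haar split of an even-length frame sequence:
    returns the temporal low-pass (averages) and high-pass (differences)
    subband frames."""
    low = [[(a + b) / 2 for a, b in zip(f0, f1)]
           for f0, f1 in zip(frames[0::2], frames[1::2])]
    high = [[(a - b) / 2 for a, b in zip(f0, f1)]
            for f0, f1 in zip(frames[0::2], frames[1::2])]
    return low, high

def inverse_temporal_haar(low, high):
    """Perfectly reconstructs the original frame sequence."""
    frames = []
    for avg, diff in zip(low, high):
        frames.append([a + d for a, d in zip(avg, diff)])  # even frame
        frames.append([a - d for a, d in zip(avg, diff)])  # odd frame
    return frames
```

The low band is a half-rate "average video" suitable for temporal scalability; the high band carries the motion detail.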

  11. Texture side information generation for distributed coding of video-plus-depth

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Raket, Lars Lau; Zamarin, Marco

    2013-01-01

    We consider distributed video coding in a monoview video-plus-depth scenario, aiming at coding textures jointly with their corresponding depth stream. Distributed Video Coding (DVC) is a video coding paradigm in which the complexity is shifted from the encoder to the decoder. The Side Information...... components) is strongly correlated, so the additional depth information may be used to generate more accurate SI for the texture stream, increasing the efficiency of the system. In this paper we propose various methods for accurate texture SI generation, comparing them with other state-of-the-art solutions...

  12. Canadian energy standards : residential energy code requirements

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, K. [SAR Engineering Ltd., Burnaby, BC (Canada)

    2006-09-15

    A survey of residential energy code requirements was discussed. New housing is approximately 13 per cent more efficient than housing built 15 years ago, and more stringent energy efficiency requirements in building codes have contributed to decreased energy use and greenhouse gas (GHG) emissions. However, a survey of residential energy codes across Canada has determined that explicit demands for energy efficiency are currently only present in British Columbia (BC), Manitoba, Ontario and Quebec. The survey evaluated more than 4300 single-detached homes built between 2000 and 2005 using data from the EnerGuide for Houses (EGH) database. House area, volume, airtightness and construction characteristics were reviewed to create archetypes for 8 geographic areas. The survey indicated that in Quebec and the Maritimes, 90 per cent of houses comply with ventilation system requirements of the National Building Code, while compliance in the rest of Canada is much lower. Heat recovery ventilation use is predominant in the Atlantic provinces. Direct-vent or condensing furnaces constitute the majority of installed systems in provinces where natural gas is the primary space heating fuel. Details of insulation levels for walls, double-glazed windows, and building code insulation standards were also reviewed. It was concluded that if R-2000 levels of energy efficiency were applied, total average energy consumption would be reduced by 36 per cent in Canada. 2 tabs.

  13. Exploiting the Error-Correcting Capabilities of Low Density Parity Check Codes in Distributed Video Coding using Optical Flow

    DEFF Research Database (Denmark)

    Rakêt, Lars Lau; Søgaard, Jacob; Salmistraro, Matteo

    2012-01-01

    We consider Distributed Video Coding (DVC) in the presence of communication errors. First, we present DVC side information generation based on a new method of optical flow driven frame interpolation, where a highly optimized TV-L1 algorithm is used for the flow calculations. The proposed frame interpolation adds a symmetric flow constraint to the standard forward-backward frame interpolation scheme, which improves quality and the handling of large motion, and the three flows are combined in one solution. Thereafter, methods for exploiting the error-correcting capabilities of the LDPCA code in DVC are investigated. The proposed frame interpolation method consistently outperforms an overlapped block motion compensation scheme and a previous TV-L1 optical flow frame interpolation method, with average PSNR improvements of 1.3 dB and 2.3 dB, respectively. For a GOP size of 2...

  14. Data Representation, Coding, and Communication Standards.

    Science.gov (United States)

    Amin, Milon; Dhir, Rajiv

    2015-06-01

    The immense volume of cases signed out by surgical pathologists on a daily basis gives little time to think about exactly how data are stored. An understanding of the basics of data representation has implications that affect a pathologist's daily practice. This article covers the basics of data representation and its importance in the design of electronic medical record systems. Coding in surgical pathology is also discussed. Finally, a summary of communication standards in surgical pathology is presented, including suggested resources that establish standards for select aspects of pathology reporting. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Transform domain Wyner-Ziv video coding with refinement of noise residue and side information

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2010-01-01

    Distributed Video Coding (DVC) is a video coding paradigm which mainly exploits the source statistics at the decoder based on the availability of side information at the decoder. This paper considers feedback channel based Transform Domain Wyner-Ziv (TDWZ) DVC. The coding efficiency of TDWZ video coding does not match that of conventional video coding yet, mainly due to the quality of side information and inaccurate noise estimation. In this context, a novel TDWZ video decoder with noise residue refinement (NRR) and side information refinement (SIR) is proposed. The proposed refinement schemes successively update the estimated noise residue for noise modeling and the side information frame quality during decoding. Experimental results show that the proposed decoder can improve the Rate-Distortion (RD) performance of a state-of-the-art Wyner-Ziv video codec for the set of test sequences.

  16. Adaptive Distributed Video Coding with Correlation Estimation using Expectation Propagation.

    Science.gov (United States)

    Cui, Lijuan; Wang, Shuang; Jiang, Xiaoqian; Cheng, Samuel

    2012-10-15

    Distributed video coding (DVC) is rapidly increasing in popularity by way of shifting complexity from the encoder to the decoder while, at least in theory, sacrificing no compression performance. In contrast with conventional video codecs, the inter-frame correlation in DVC is exploited at the decoder, based on the received syndromes of the Wyner-Ziv (WZ) frame and a side information (SI) frame generated from other frames available only at the decoder. However, the ultimate decoding performance of DVC rests on the assumption that perfect knowledge of the correlation statistics between the WZ and SI frames is available at the decoder. Therefore, the ability to obtain a good statistical correlation estimate is becoming increasingly important in practical DVC implementations. Generally, existing correlation estimation methods in DVC fall into two main types: pre-estimation, where estimation starts before decoding, and on-the-fly (OTF) estimation, where the estimate can be refined iteratively during decoding. As changes between frames may be unpredictable or dynamic, OTF estimation methods usually outperform pre-estimation techniques at the cost of increased decoding complexity (e.g., sampling methods). In this paper, we propose a low-complexity adaptive DVC scheme using expectation propagation (EP), where correlation estimation is performed OTF as it is carried out jointly with decoding of the factor graph-based DVC code. Among approximate inference methods, EP generally offers a better tradeoff between accuracy and complexity. Experimental results show that our proposed scheme outperforms the benchmark state-of-the-art DISCOVER codec and other cases without correlation tracking, and achieves comparable decoding performance with significantly lower complexity than sampling methods.


  18. An Effective Transform Unit Size Decision Method for High Efficiency Video Coding

    Directory of Open Access Journals (Sweden)

    Chou-Chen Wang

    2014-01-01

    Full Text Available High efficiency video coding (HEVC) is the latest video coding standard. HEVC can achieve higher compression performance than previous standards, such as MPEG-4, H.263, and H.264/AVC. However, HEVC requires enormous computational complexity in the encoding process due to its quadtree structure. In order to reduce the computational burden of the HEVC encoder, an early transform unit (TU) decision algorithm (ETDA) based on the number of nonzero DCT coefficients (called NNZ-ETDA) is adopted to prune the residual quadtree (RQT) at an early stage and accelerate the encoding process. However, the NNZ-ETDA cannot effectively reduce the computational load for sequences with active motion or rich texture. Therefore, in order to further improve the performance of NNZ-ETDA, we propose an adaptive RQT-depth decision for NNZ-ETDA (called ARD-NNZ-ETDA) by exploiting the characteristics of high temporal-spatial correlation that exist in natural video sequences. Simulation results show that the proposed method can achieve a time improving ratio (TIR) of about 61.26%~81.48% when compared to the HEVC test model 8.1 (HM 8.1) with insignificant loss of image quality. Compared with the NNZ-ETDA, the proposed method can further achieve an average TIR of about 8.29%~17.92%.
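
The core of an NNZ-based early-termination test is simply counting nonzero quantized coefficients in the current transform unit. A hedged sketch inspired by (not reproducing) the NNZ-ETDA criterion; the threshold is an illustrative value, not one from the paper:

```python
def count_nonzero(quantized_tu):
    """Number of nonzero quantized transform coefficients in a TU
    given as a list of rows."""
    return sum(1 for row in quantized_tu for c in row if c != 0)

def should_stop_rqt_split(quantized_tu, nnz_threshold=4):
    """Early-termination test: when the current TU's quantized block has
    very few nonzero coefficients, evaluating deeper RQT splits is
    unlikely to pay off, so the quadtree is pruned here."""
    return count_nonzero(quantized_tu) <= nnz_threshold

flat_block = [[0] * 8 for _ in range(8)]
flat_block[0][0] = 12                      # only the DC coefficient survives
textured_block = [[(i + j) % 3 for j in range(8)] for i in range(8)]
print(should_stop_rqt_split(flat_block))      # True
print(should_stop_rqt_split(textured_block))  # False
```

The textured block keeps many nonzero coefficients, which is exactly the case where the paper's adaptive depth decision steps in.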

  19. Protocol Standards for Reporting Video Data in Academic Journals.

    Science.gov (United States)

    Rowland, Pamela A; Ignacio, Romeo C; de Moya, Marc A

    2016-04-01

    Editors of biomedical journals have estimated that a majority (40%-90%) of studies published in scientific journals cannot be replicated, even though an inherent principle of publication is that others should be able to replicate and build on published claims. Each journal sets its own protocols for establishing "quality" in articles, yet over the past 50 years, few journals in any field--especially medical education--have specified protocols for reporting the use of video data in research. The authors found that technical and industry-driven aspects of video recording, as well as a lack of standardization and reporting requirements by research journals, have led to major limitations in the ability to assess or reproduce video data used in research. Specific variables in the videotaping process (e.g., camera angle), which can be changed or be modified, affect the quality of recorded data, leading to major reporting errors and, in turn, unreliable conclusions. As more data are now in the form of digital videos, the historical lack of reporting standards makes it increasingly difficult to accurately replicate medical educational studies. Reproducibility is especially important as the medical education community considers setting national high-stakes standards in medicine and surgery based on video data. The authors of this Perspective provide basic protocol standards for investigators and journals using video data in research publications so as to allow for reproducibility.

  20. Globalization of ASME Nuclear Codes and Standards

    International Nuclear Information System (INIS)

    Swayne, Rick; Erler, Bryan A.

    2006-01-01

    With the globalization of the nuclear industry, it is clear that the reactor suppliers are based in many countries around the world (such as United States, France, Japan, Canada, South Korea, South Africa) and they will be marketing their reactors to many countries around the world (such as US, China, South Korea, France, Canada, Finland, Taiwan). They will also be fabricating their components in many different countries around the world. With this situation, it is clear that the requirements of ASME Nuclear Codes and Standards need to be adjusted to accommodate the regulations, fabricating processes, and technology of various countries around the world. It is also very important for the American Society of Mechanical Engineers (ASME) to be able to assure that products meeting the applicable ASME Code requirements will provide the same level of safety and quality assurance as those products currently fabricated under the ASME accreditation process. To do this, many countries are in the process of establishing or changing their regulations, and it is important for ASME to interface with the appropriate organizations in those countries, in order to ensure there is effective use of ASME Codes and standards around the world. (authors)

  1. Multiple LDPC Decoding using Bitplane Correlation for Transform Domain Wyner-Ziv Video Coding

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Huang, Xin; Forchhammer, Søren

    2011-01-01

    Distributed video coding (DVC) is an emerging video coding paradigm for systems which fully or partly exploit the source statistics at the decoder to reduce the computational burden at the encoder. This paper considers a Low Density Parity Check (LDPC) based Transform Domain Wyner-Ziv (TDWZ) video codec. To improve the LDPC coding performance in the context of TDWZ, this paper proposes a Wyner-Ziv video codec using bitplane correlation through multiple parallel LDPC decoding. The proposed scheme utilizes inter-bitplane correlation to enhance the bitplane decoding performance. Experimental results...
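
The LDPC decoding step in such a codec accepts a bitplane word when its syndrome with respect to the parity-check matrix H is all zero. A toy sketch with a hypothetical 3x6 matrix (real LDPCA codes are far larger, sparse, and rate-adaptive):

```python
# Hypothetical toy parity-check matrix, for illustration only.
H = [[1, 1, 0, 1, 0, 0],
     [0, 1, 1, 0, 1, 0],
     [1, 0, 1, 0, 0, 1]]

def syndrome(H, bits):
    """s = H * x (mod 2); an all-zero syndrome means every parity check
    is satisfied, i.e. the decoder can accept the current bitplane word."""
    return [sum(h * b for h, b in zip(row, bits)) % 2 for row in H]

print(syndrome(H, [1, 1, 1, 0, 0, 0]))  # [0, 0, 0] -> accepted
print(syndrome(H, [0, 1, 1, 0, 0, 0]))  # [1, 0, 1] -> parity failures
```

In a feedback-channel TDWZ codec, nonzero syndromes trigger requests for more accumulated syndrome bits until decoding converges.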

  2. Game-Theoretic Rate-Distortion-Complexity Optimization of High Efficiency Video Coding

    DEFF Research Database (Denmark)

    Ukhanova, Ann; Milani, Simone; Forchhammer, Søren

    2013-01-01

    This paper presents an algorithm for rate-distortion-complexity optimization for the emerging High Efficiency Video Coding (HEVC) standard, whose high computational requirements urge the need for low-complexity optimization algorithms. Optimization approaches need to specify different complexity profiles in order to tailor the computational load to the different hardware and power-supply resources of devices. In this work, we focus on optimizing the quantization parameter and partition depth in HEVC via a game-theoretic approach. The proposed rate control strategy alone provides 0.2 dB improvement...

  3. High-throughput sample adaptive offset hardware architecture for high-efficiency video coding

    Science.gov (United States)

    Zhou, Wei; Yan, Chang; Zhang, Jingzhi; Zhou, Xin

    2018-03-01

    A high-throughput hardware architecture for a sample adaptive offset (SAO) filter in the High Efficiency Video Coding (HEVC) standard is presented. First, an implementation-friendly and simplified bitrate estimation method for the rate-distortion cost calculation is proposed to reduce the computational complexity of the SAO mode decision. Then, a high-throughput VLSI architecture for SAO is presented based on the proposed bitrate estimation method. Furthermore, a multiparallel VLSI architecture for in-loop filters, which integrates both the deblocking filter and the SAO filter, is proposed. Six parallel strategies are applied in the proposed in-loop filter architecture to improve the system throughput and filtering speed. Experimental results show that the proposed in-loop filter architecture can achieve up to 48% higher throughput in comparison with prior work. The proposed architecture reaches an operating clock frequency of 297 MHz with a TSMC 65-nm library and meets the real-time requirement of the in-loop filters for the 8 K × 4 K video format at 132 fps.
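
The SAO edge-offset mode that such an architecture implements classifies each sample against its two neighbours along a chosen direction; the five categories below follow the HEVC classification rules:

```python
def sao_edge_category(a, c, b):
    """HEVC SAO edge-offset category of sample c, given its two
    neighbours a and b along the chosen direction (0, 45, 90 or 135
    degrees). Categories 1-4 receive a signalled offset; 0 receives none."""
    if c < a and c < b:
        return 1  # local minimum
    if (c < a and c == b) or (c == a and c < b):
        return 2  # concave corner
    if (c > a and c == b) or (c == a and c > b):
        return 3  # convex corner
    if c > a and c > b:
        return 4  # local maximum
    return 0      # flat or monotonic region

print([sao_edge_category(*t) for t in
       [(5, 3, 5), (5, 3, 3), (3, 5, 5), (3, 5, 3), (4, 4, 4)]])
# [1, 2, 3, 4, 0]
```

The hardware cost comes from running this per-sample classification, plus the rate-distortion search over offsets, at video rate; the paper's parallel strategies attack exactly that throughput bottleneck.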

  4. Cross-band noise model refinement for transform domain Wyner–Ziv video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2012-01-01

    TDWZ video coding trails that of conventional video coding solutions, mainly due to the quality of side information, inaccurate noise modeling and loss in the final coding step. The major goal of this paper is to enhance the accuracy of the noise modeling, which is one of the most important aspects influencing the coding performance of DVC. A TDWZ video decoder with a novel cross-band based adaptive noise model is proposed, and a noise residue refinement scheme is introduced to successively update the estimated noise residue for noise modeling after each bit-plane. Experimental results show that the proposed noise model and noise residue refinement scheme can improve the rate-distortion (RD) performance of TDWZ video coding significantly. The quality of the side information modeling is also evaluated by a measure of the ideal code length.

  5. Alternative Fuels Data Center: Codes and Standards Resources

    Science.gov (United States)

    The resources linked below help project developers and code officials prepare and review code-compliant projects involving alternative fuels, storage, and infrastructure. The following charts show the SDOs responsible for these alternative fuel codes and standards: Biodiesel Vehicle and Infrastructure Codes and Standards Chart; Electric Vehicle and...

  6. Review of codes, standards, and regulations for natural gas locomotives.

    Science.gov (United States)

    2014-06-01

    This report identified, collected, and summarized relevant international codes, standards, and regulations with potential applicability to the use of natural gas as a locomotive fuel. Few international or country-specific codes, standards, and regu...

  7. Using standardized video cases for assessment of medical communication skills: reliability of an objective structured video examination by computer

    NARCIS (Netherlands)

    Hulsman, R. L.; Mollema, E. D.; Oort, F. J.; Hoos, A. M.; de Haes, J. C. J. M.

    2006-01-01

    OBJECTIVE: Using standardized video cases in a computerized objective structured video examination (OSVE) aims to measure cognitive scripts underlying overt communication behavior by questions on knowledge, understanding and performance. In this study the reliability of the OSVE assessment is

  8. An Unequal Secure Encryption Scheme for H.264/AVC Video Compression Standard

    Science.gov (United States)

    Fan, Yibo; Wang, Jidong; Ikenaga, Takeshi; Tsunoo, Yukiyasu; Goto, Satoshi

    H.264/AVC is the newest video coding standard. There are many new features in it which can be easily used for video encryption. In this paper, we propose a new scheme to do video encryption for H.264/AVC video compression standard. We define Unequal Secure Encryption (USE) as an approach that applies different encryption schemes (with different security strength) to different parts of compressed video data. This USE scheme includes two parts: video data classification and unequal secure video data encryption. Firstly, we classify the video data into two partitions: Important data partition and unimportant data partition. Important data partition has small size with high secure protection, while unimportant data partition has large size with low secure protection. Secondly, we use AES as a block cipher to encrypt the important data partition and use LEX as a stream cipher to encrypt the unimportant data partition. AES is the most widely used symmetric cryptography which can ensure high security. LEX is a new stream cipher which is based on AES and its computational cost is much lower than AES. In this way, our scheme can achieve both high security and low computational cost. Besides the USE scheme, we propose a low cost design of hybrid AES/LEX encryption module. Our experimental results show that the computational cost of the USE scheme is low (about 25% of naive encryption at Level 0 with VEA used). The hardware cost for hybrid AES/LEX module is 4678 Gates and the AES encryption throughput is about 50Mbps.
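
The cost saving in the USE approach comes from reserving the expensive cipher for the small important partition. A back-of-the-envelope sketch of the relative cost; the 4:1 cipher cost ratio below is an illustrative assumption, not a figure from the paper:

```python
def use_relative_cost(important_fraction, strong_cost=1.0, light_cost=0.25):
    """Computational cost of Unequal Secure Encryption relative to
    encrypting the whole stream with the strong cipher (cost 1.0).
    The strong/light cipher cost ratio is an illustrative assumption."""
    return (important_fraction * strong_cost
            + (1.0 - important_fraction) * light_cost)

# The smaller the important partition, the closer the total cost
# gets to that of the lightweight stream cipher alone.
print(round(use_relative_cost(0.10), 3))  # 0.325
```

This is why classifying only headers and other perceptually critical syntax as "important" keeps total encryption cost near the light cipher's cost while the critical data still gets full-strength protection.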

  9. Adaptive modeling of sky for video processing and coding applications

    NARCIS (Netherlands)

    Zafarifar, B.; With, de P.H.N.; Lagendijk, R.L.; Weber, Jos H.; Berg, van den A.F.M.

    2006-01-01

    Video content analysis for still- and moving images can be used for various applications, such as high-level semantic-driven operations or pixel-level contentdependent image manipulation. Within video content analysis, sky regions of an image form visually important objects, for which interesting

  10. Exploration of depth modeling mode one lossless wedgelets storage strategies for 3D-high efficiency video coding

    Science.gov (United States)

    Sanchez, Gustavo; Marcon, César; Agostini, Luciano Volcan

    2018-01-01

    The 3D-high efficiency video coding has introduced tools to obtain higher efficiency in 3-D video coding, and most of them are related to the depth maps coding. Among these tools, the depth modeling mode-1 (DMM-1) focuses on better encoding edges regions of depth maps. The large memory required for storing all wedgelet patterns is one of the bottlenecks in the DMM-1 hardware design of both encoder and decoder since many patterns must be stored. Three algorithms to reduce the DMM-1 memory requirements and a hardware design targeting the most efficient among these algorithms are presented. Experimental results demonstrate that the proposed solutions surpass related works reducing up to 78.8% of the wedgelet memory, without degrading the encoding efficiency. Synthesis results demonstrate that the proposed algorithm reduces almost 75% of the power dissipation when compared to the standard approach.
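
The wedgelet memory problem can be sized with a simple count: idealizing a wedgelet as an unordered pair of distinct border sample positions of an NxN block (3D-HEVC additionally removes duplicate and degenerate patterns and varies the pattern resolution per block size), the candidate count grows quadratically with the perimeter:

```python
def wedgelet_pattern_count(block_size):
    """Upper-bound wedgelet count for an NxN block, idealized as the
    number of unordered pairs of distinct border sample positions.
    This is a sketch of the scaling, not the exact 3D-HEVC pattern count."""
    border_samples = 4 * (block_size - 1)  # perimeter samples of the block
    return border_samples * (border_samples - 1) // 2

for n in (4, 8, 16, 32):
    print(n, wedgelet_pattern_count(n))
```

Since each pattern is stored as an NxN binary mask, storage grows roughly with N^4, which is why lossless pattern-storage reduction strategies like those proposed here matter for hardware.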

  11. Review and comparison of WWER and LWR Codes and Standards

    International Nuclear Information System (INIS)

    Buckthorpe, D.; Tashkinov, A.; Brynda, J.; Davies, L.M.; Cueto-Felgeueroso, C.; Detroux, P.; Bieniussa, K.; Guinovart, J.

    2003-01-01

    The results of a collaborative project comparing the Codes and Standards used for safety-related components of WWER and LWR type reactors are presented. This work was performed on behalf of the European Commission, Working Group Codes and Standards, and considers areas such as rules, criteria and provisions, failure mechanisms, the derivation and understanding behind the fatigue curves, piping, materials and aging, manufacturing and ISI. WWERs are essentially designed and constructed using the Russian PNAE Code, together with special provisions in a few countries (e.g., the Czech Republic) from national standards. The LWR Codes have a strong dependence on the ASME Code. Within Western Europe other codes are also used, including RCC-M, KTA and British Standards. A comparison of the procedures used in all these codes and standards has been made to investigate the potential for equivalences between the codes and grounds for future cooperation between eastern and western experts in this field. (author)

  12. Bit Plane Coding based Steganography Technique for JPEG2000 Images and Videos

    Directory of Open Access Journals (Sweden)

    Geeta Kasana

    2016-02-01

    Full Text Available In this paper, a Bit Plane Coding (BPC) based steganography technique for JPEG2000 images and Motion JPEG2000 video is proposed. Embedding in this technique is performed in the lowest significant bit planes of the wavelet coefficients of a cover image. In the JPEG2000 standard, the number of bit planes of wavelet coefficients used in encoding depends on the compression rate and is used in the Tier-2 process of JPEG2000. In the proposed technique, the Tier-1 and Tier-2 processes of JPEG2000 and Motion JPEG2000 are executed twice on the encoder side to collect information about the lowest bit planes of all code blocks of a cover image, which is utilized in embedding and transmitted to the decoder. After embedding the secret data, an Optimal Pixel Adjustment Process (OPAP) is applied to the stego images to enhance their visual quality. Experimental results show that the proposed technique provides large embedding capacity and better visual quality of stego images than existing steganography techniques for JPEG2000 compressed images and videos. The extracted secret image is similar to the original secret image.
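
Bit-plane embedding itself reduces to overwriting the lowest retained bit of each coefficient. A minimal sketch on plain integer coefficients (real JPEG2000 embedding must respect which bit planes the Tier-1/Tier-2 coder actually keeps, which is what the double encoder pass above discovers):

```python
def embed_bits(coeffs, secret_bits):
    """Write one secret bit into the least-significant bit of each
    integer coefficient, leaving remaining coefficients untouched."""
    stego = list(coeffs)
    for i, bit in enumerate(secret_bits):
        stego[i] = (stego[i] & ~1) | bit
    return stego

def extract_bits(coeffs, n):
    """Read the n embedded bits back out of the LSB plane."""
    return [c & 1 for c in coeffs[:n]]

secret = [1, 0, 1, 1, 0]
coeffs = [10, -7, 0, 255, 6, 42]
stego = embed_bits(coeffs, secret)
print(extract_bits(stego, len(secret)))  # [1, 0, 1, 1, 0]
```

Each coefficient changes by at most one quantization level, which is why LSB-plane embedding distorts the cover image so little.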

  13. The future of 3D and video coding in mobile and the internet

    Science.gov (United States)

    Bivolarski, Lazar

    2013-09-01

    The success of the Internet has already changed our social and economic world and continues to revolutionize information exchange. The exponential increase in the amount and types of data exchanged on the Internet represents a significant challenge for the design of future architectures and solutions. This paper reviews the current status and trends in the design of solutions and research activities for the future Internet from the point of view of managing the growth of bandwidth requirements and the complexity of the multimedia being created and shared. It outlines the challenges facing video coding and approaches to the design of standardized media formats and protocols, considering the expected convergence of multimedia formats and exchange interfaces. The rapid growth of connected mobile devices adds to the current and future challenges, in combination with the arrival, expected in the near future, of a multitude of connected devices. The new Internet technologies connecting the Internet of Things with wireless visual sensor networks and 3D virtual worlds require conceptually new approaches to media content handling, from acquisition to presentation, in the 3D Media Internet. Accounting for the properties of the entire transmission system and enabling real-time adaptation to context and content throughout the media processing path will be paramount in enabling the new media architectures as well as new applications and services. The common video coding formats will need to be conceptually redesigned to allow for the implementation of the necessary 3D Media Internet features.

  14. Design considerations for view interpolation in a 3D video coding framework

    NARCIS (Netherlands)

    Morvan, Y.; Farin, D.S.; With, de P.H.N.; Lagendijk, R.L.; Weber, Jos H.; Berg, van den A.F.M.

    2006-01-01

    A 3D video stream typically consists of a set of views capturing simultaneously the same scene. For an efficient transmission of the 3D video, a compression technique is required. In this paper, we describe a coding architecture and appropriate algorithms that enable the compression and

  15. International Accreditation of ASME Codes and Standards

    International Nuclear Information System (INIS)

    Green, Mervin R.

    1989-01-01

    ASME established a Boiler Code Committee to develop rules for the design, fabrication and inspection of boilers. This year we recognize 75 years of that Code and will publish a history of that 75 years. The first Code and subsequent editions provided for a Code Symbol Stamp or mark which could be affixed by a manufacturer to a newly constructed product to certify that the manufacturer had designed, fabricated and had inspected it in accordance with Code requirements. The purpose of the ASME Mark is to identify those boilers that meet ASME Boiler and Pressure Vessel Code requirements. Through thousands of updates over the years, the Code has been revised to reflect technological advances and changing safety needs. Its scope has been broadened from boilers to include pressure vessels, nuclear components and systems. Proposed revisions to the Code are published for public review and comment four times per year and revisions and interpretations are published annually; it's a living and constantly evolving Code. You and your organizations are a vital part of the feedback system that keeps the Code alive. Because of this dynamic Code, we no longer have columns in newspapers listing boiler explosions. Nevertheless, it has been argued recently that ASME should go further in internationalizing its Code. Specifically, representatives of several countries, have suggested that ASME delegate to them responsibility for Code implementation within their national boundaries. The question is, thus, posed: Has the time come to franchise responsibility for administration of ASME's Code accreditation programs to foreign entities or, perhaps, 'institutes.' And if so, how should this be accomplished?

  16. Efficient Power Allocation for Video over Superposition Coding

    KAUST Repository

    Lau, Chun Pong; Jamshaid, K.; Shihada, Basem

    2013-01-01

    are conducted to gain a better understanding of its efficiency, specifically, the impact of the received signal due to different power allocation ratios. Our experimental results show that to maintain high video quality, the power allocated to the base layer

  17. Improved virtual channel noise model for transform domain Wyner-Ziv video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2009-01-01

    Distributed video coding (DVC) has been proposed as a new video coding paradigm to deal with lossy source coding using side information to exploit the statistics at the decoder to reduce computational demands at the encoder. A virtual channel noise model is utilized at the decoder to estimate the noise distribution between the side information frame and the original frame. This is one of the most important aspects influencing the coding performance of DVC. Noise models with different granularity have been proposed. In this paper, an improved noise model for transform domain Wyner-Ziv video coding is proposed, which utilizes cross-band correlation to estimate the Laplacian parameters more accurately. Experimental results show that the proposed noise model can improve the rate-distortion (RD) performance.
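
For a Laplacian virtual-channel model f(x) = (alpha/2) exp(-alpha |x|), the scale parameter is estimated from the residue between the side information and the original frame; the maximum-likelihood estimate is simply the reciprocal of the mean absolute residue. A sketch on synthetic noise (the cross-band refinement of the paper goes further than this baseline estimator):

```python
import random

def laplacian_alpha(residue):
    """Maximum-likelihood estimate of the Laplacian scale parameter for
    a zero-mean residue with density f(x) = (alpha/2) * exp(-alpha*|x|):
    alpha_hat = 1 / mean(|x|)."""
    return len(residue) / sum(abs(x) for x in residue)

# Synthetic residue: the difference of two exponentials with rate alpha
# is Laplacian-distributed with scale 1/alpha.
random.seed(1)
true_alpha = 2.0
residue = [random.expovariate(true_alpha) - random.expovariate(true_alpha)
           for _ in range(20000)]
print(laplacian_alpha(residue))  # close to 2.0
```

A larger estimated alpha means a more peaked residue distribution, i.e. better side information and fewer parity bits needed from the Wyner-Ziv decoder.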

  18. Motion-adaptive intraframe transform coding of video signals

    NARCIS (Netherlands)

    With, de P.H.N.

    1989-01-01

    Spatial transform coding has been widely applied for image compression because of its high coding efficiency. However, in many intraframe systems, in which every TV frame is independently processed, coding of moving objects in the case of interlaced input signals is not addressed. In this paper, we

  19. Variable disparity-motion estimation based fast three-view video coding

    Science.gov (United States)

    Bae, Kyung-Hoon; Kim, Seung-Cheol; Hwang, Yong Seok; Kim, Eun-Soo

    2009-02-01

    In this paper, variable disparity-motion estimation (VDME) based 3-view video coding is proposed. In the encoding, key-frame coding (KFC) based motion estimation and variable disparity estimation (VDE) are processed for effectively fast three-view video encoding. The proposed algorithms enhance the performance of the 3-D video encoding/decoding system in terms of disparity-estimation accuracy and computational overhead. Experiments on the stereo sequences 'Pot Plant' and 'IVO' show that the proposed algorithm achieves PSNRs of 37.66 and 40.55 dB, with processing times of 0.139 and 0.124 sec/frame, respectively.

  20. Motion estimation for video coding efficient algorithms and architectures

    CERN Document Server

    Chakrabarti, Indrajit; Chatterjee, Sumit Kumar

    2015-01-01

    The need for video compression in the modern age of visual communication cannot be over-emphasized. This monograph provides useful information for postgraduate students and researchers who wish to work in the domain of VLSI design for video processing applications. In this book, one can find an in-depth discussion of several motion estimation algorithms and their VLSI implementation as conceived and developed by the authors. It records an account of research involving fast three-step search, successive elimination, one-bit transformation and its effective combination with diamond search and dynamic pixel truncation techniques. Two appendices provide a number of instances of proof of concept through Matlab and Verilog program segments. In this respect, the book can be considered the first of its kind. The architectures have been developed with an eye to their applicability in everyday low-power handheld appliances, including video camcorders and smartphones.
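The fast three-step search mentioned in this record can be sketched as follows (a plain NumPy illustration of the classic algorithm, with illustrative block size and step defaults; the book's VLSI-oriented variants differ):

```python
import numpy as np

def sad(a, b):
    # sum of absolute differences between two equally sized blocks
    return np.sum(np.abs(a.astype(np.float64) - b.astype(np.float64)))

def three_step_search(ref, cur, by, bx, bs=8, step=4):
    # Classic three-step search: evaluate the current position and its
    # eight neighbours at a coarse step, move to the best one, halve the
    # step, and repeat until the step reaches one pixel.
    h, w = ref.shape
    block = cur[by:by + bs, bx:bx + bs]
    my = mx = 0
    while step >= 1:
        best = None
        for dy in (-step, 0, step):
            for dx in (-step, 0, step):
                y, x = by + my + dy, bx + mx + dx
                if 0 <= y <= h - bs and 0 <= x <= w - bs:
                    cost = sad(ref[y:y + bs, x:x + bs], block)
                    if best is None or cost < best[0]:
                        best = (cost, my + dy, mx + dx)
        _, my, mx = best
        step //= 2
    return my, mx
```

With a starting step of 4, the search inspects at most 3 x 9 = 27 candidates per block instead of the 225 of a full +/-7 search, which is why it lends itself to low-power hardware.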

  1. C++ Coding Standards 101 Rules, Guidelines, and Best Practices

    CERN Document Server

    Sutter, Herb

    2005-01-01

    Consistent, high-quality coding standards improve software quality, reduce time-to-market, promote teamwork, eliminate time wasted on inconsequential matters, and simplify maintenance. Now, two of the world's most respected C++ experts distill the rich collective experience of the global C++ community into a set of coding standards that every developer and development team can understand and use as a basis for their own coding standards.

  2. Running code as part of an open standards policy

    OpenAIRE

    Shah, Rajiv; Kesan, Jay

    2009-01-01

    Governments around the world are considering implementing or even mandating open standards policies. They believe these policies will provide economic, socio-political, and technical benefits. In this article, we analyze the failure of the Massachusetts’s open standards policy as applied to document formats. We argue it failed due to the lack of running code. Running code refers to multiple independent, interoperable implementations of an open standard. With running code, users have choice ...

  3. Bridging Inter-flow and Intra-flow Network Coding for Video Applications

    DEFF Research Database (Denmark)

    Hansen, Jonas; Krigslund, Jeppe; Roetter, Daniel Enrique Lucani

    2013-01-01

    transmission approach to decide how much and when to send redundancy in the network, and a minimalistic feedback mechanism to guarantee delivery of generations of the different flows. Given the delay constraints of video applications, we proposed a simple yet effective coding mechanism, Block Coding On The Fly...

  4. Telemetry Standards, RCC Standard 106-17, Chapter 4, Pulse Code Modulation Standards

    Science.gov (United States)

    2017-07-01

    ...investigation can be found in a paper by J. L. Maury, Jr. and J. Styles, “Development of Optimum Frame Synchronization Codes for Goddard Space Flight Center...” Appendix 4-B (Citations) also lists Aeronautical Radio, Inc., Mark 33 Digital Information Transfer...

  5. Re-estimation of Motion and Reconstruction for Distributed Video Coding

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Raket, Lars Lau; Forchhammer, Søren

    2014-01-01

    Transform domain Wyner-Ziv (TDWZ) video coding is an efficient approach to distributed video coding (DVC), which provides low complexity encoding by exploiting the source statistics at the decoder side. The DVC coding efficiency depends mainly on side information and noise modeling. This paper...... proposes a motion re-estimation technique based on optical flow to improve side information and noise residual frames by taking partially decoded information into account. To improve noise modeling, a noise residual motion re-estimation technique is proposed. Residual motion compensation with motion...

  6. Resource-Constrained Low-Complexity Video Coding for Wireless Transmission

    DEFF Research Database (Denmark)

    Ukhanova, Ann

    of video quality. We proposed a new metric for objective quality assessment that considers frame rate. As many applications deal with wireless video transmission, we performed an analysis of compression and transmission systems with a focus on the power-distortion trade-off. We proposed an approach...... for rate-distortion-complexity optimization of the upcoming video compression standard HEVC. We also provided a new method allowing a decrease of power consumption on mobile devices in 3G networks. Finally, we proposed low-delay and low-power approaches for video transmission over wireless personal area networks, including......Constrained resources like memory, power, bandwidth and delay requirements in many mobile systems pose limitations for video applications. Standard approaches for video compression and transmission do not always satisfy system requirements. In this thesis we have shown that it is possible to modify...

  7. Quality assurance requirements in various codes and standards

    International Nuclear Information System (INIS)

    Shaaban, H.I.; EL-Sayed, A.; Aly, A.E.

    1987-01-01

    The quality assurance requirements in various countries and according to various international codes and standards are presented, compared and critically discussed. Cases of developing countries are also discussed, and the use of IAEA code of practice and other codes for quality assurance in these countries is reviewed. Recommendations are made regarding the quality assurance system to be applied for Egypt's nuclear power plants

  8. Grid Standards and Codes | Grid Modernization | NREL

    Science.gov (United States)

    photovoltaics (PV) adoption have given a sense of urgency to the standards development process. The Accelerating Systems Integration Standards team is addressing this urgency by providing leadership and direction for

  9. Empirical Evaluation of Superposition Coded Multicasting for Scalable Video

    KAUST Repository

    Chun Pong Lau; Shihada, Basem; Pin-Han Ho

    2013-01-01

    In this paper we investigate cross-layer superposition coded multicast (SCM). Previous studies have proven its effectiveness in exploiting better channel capacity and service granularities via both analytical and simulation approaches. However

  10. Spatial-Aided Low-Delay Wyner-Ziv Video Coding

    Directory of Open Access Journals (Sweden)

    Bo Wu

    2009-01-01

    Full Text Available In distributed video coding, the side information (SI) quality plays an important role in Wyner-Ziv (WZ) frame coding. Usually, SI is generated at the decoder by motion-compensated interpolation (MCI) from the past and future key frames, under the assumption that the motion trajectory between adjacent frames is translational with constant velocity. However, this assumption is not always true, and thus the coding efficiency of WZ coding is often unsatisfactory for video with high and/or irregular motion. This situation becomes more serious in low-delay applications, since only motion-compensated extrapolation (MCE) can be applied to yield SI. In this paper, a spatial-aided Wyner-Ziv video coding (SA-WZVC) scheme for low-delay applications is proposed. In SA-WZVC, at the encoder, each WZ frame is coded as in the existing common Wyner-Ziv video coding scheme, and the auxiliary information is additionally coded with low-complexity DPCM. At the decoder, for WZ frame decoding, the auxiliary information is decoded first; SI is then generated with the help of this auxiliary information by spatial-aided motion-compensated extrapolation (SA-MCE). Theoretical analysis proves that when a good tradeoff between auxiliary information coding and WZ frame coding is achieved, SA-WZVC achieves better rate-distortion performance than conventional MCE-based WZVC without auxiliary information. Experimental results also demonstrate that SA-WZVC can efficiently improve the coding performance of WZVC in low-delay applications.
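A minimal sketch of plain motion-compensated extrapolation (MCE), the low-delay SI generation method that the record's scheme improves upon, might look like this (generic full-search block matching with a constant-velocity projection; the block size and search range are illustrative assumptions):

```python
import numpy as np

def mce_side_information(prev2, prev1, bs=8, search=4):
    # Side information by motion-compensated extrapolation (MCE): for each
    # block of the latest decoded frame (prev1), estimate its motion from
    # the frame before it (prev2) by full search, then project the block
    # one more step along the same trajectory (constant-velocity model).
    h, w = prev1.shape
    si = prev1.copy()  # fall back to prev1 where no projected block lands
    for by in range(0, h - bs + 1, bs):
        for bx in range(0, w - bs + 1, bs):
            block = prev1[by:by + bs, bx:bx + bs]
            best = (None, 0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if 0 <= y <= h - bs and 0 <= x <= w - bs:
                        cost = np.sum(np.abs(prev2[y:y + bs, x:x + bs] - block))
                        if best[0] is None or cost < best[0]:
                            best = (cost, dy, dx)
            _, dy, dx = best
            # the block moved by (-dy, -dx) from prev2 to prev1;
            # continue that motion one more frame into the SI frame
            ty = min(max(by - dy, 0), h - bs)
            tx = min(max(bx - dx, 0), w - bs)
            si[ty:ty + bs, tx:tx + bs] = block
    return si
```

The failure mode the record describes is visible here: when motion is not constant-velocity, the projected blocks land in the wrong place, degrading the SI and hence the WZ rate-distortion performance.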

  11. Lossless Coding Standards for Space Data Systems

    Science.gov (United States)

    Rice, R. F.

    1996-01-01

    The International Consultative Committee for Space Data Systems (CCSDS) is preparing to issue its first recommendation for a digital data compression standard. Because the space data systems of primary interest are employed to support scientific investigations requiring accurate representation, this initial standard will be restricted to lossless compression.

  12. Low-Complexity Multiple Description Coding of Video Based on 3D Block Transforms

    Directory of Open Access Journals (Sweden)

    Andrey Norkin

    2007-02-01

    Full Text Available The paper presents a multiple description (MD) video coder based on three-dimensional (3D) transforms. Two balanced descriptions are created from a video sequence. In the encoder, the video sequence is represented as a coarse sequence approximation (the shaper), included in both descriptions, and a residual sequence (the details), which is split between the two descriptions. The shaper is obtained by block-wise pruned 3D-DCT. The residual sequence is coded by a 3D-DCT or a hybrid LOT+DCT 3D transform. The coding scheme is targeted at mobile devices: it has low computational complexity and improved robustness for transmission over unreliable networks, and the coder is able to work at very low redundancies. Although the coding scheme is simple, it outperforms some MD coders based on motion-compensated prediction, especially in the low-redundancy region, with a margin of up to 3 dB for reconstruction from one description.
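The shaper/details split over a 3D-DCT can be sketched as follows (a simplified illustration assuming an orthonormal separable DCT and a checkerboard split of the detail coefficients; the paper's pruned transform and LOT option are not reproduced):

```python
import numpy as np

def dct_mat(n):
    # orthonormal DCT-II matrix, so the inverse transform is the transpose
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    d = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    d[0] /= np.sqrt(2.0)
    return d

def dct3(cube, inverse=False):
    # separable 3D transform: apply the (inverse) DCT along each axis
    out = cube
    for axis in range(3):
        d = dct_mat(cube.shape[axis])
        if inverse:
            d = d.T
        out = np.moveaxis(np.tensordot(d, out, axes=(1, axis)), 0, axis)
    return out

def make_descriptions(cube, shaper=2):
    # shaper: low-frequency corner of the 3D-DCT, duplicated in both
    # descriptions; the remaining "detail" coefficients are split between
    # the two descriptions in a checkerboard pattern
    c = dct3(cube)
    mask = np.zeros(c.shape, dtype=bool)
    mask[:shaper, :shaper, :shaper] = True
    parity = np.indices(c.shape).sum(axis=0) % 2
    d1 = np.where(mask | (parity == 0), c, 0.0)
    d2 = np.where(mask | (parity == 1), c, 0.0)
    return d1, d2, mask
```

A central decoder that receives both descriptions can merge them with `np.where(mask, d1, d1 + d2)` and invert the transform to recover the coefficients exactly; a side decoder inverts one description alone for a coarser reconstruction, with the duplicated shaper bounding the quality loss.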

  13. Codes and standards and other guidance cited in regulatory documents

    International Nuclear Information System (INIS)

    Nickolaus, J.R.; Bohlander, K.L.

    1996-08-01

    As part of the U.S. Nuclear Regulatory Commission (NRC) Standard Review Plan Update and Development Program (SRP-UDP), Pacific Northwest National Laboratory developed a listing of industry consensus codes and standards and other government and industry guidance referred to in regulatory documents. The SRP-UDP has been completed and the SRP-Maintenance Program (SRP-MP) is now maintaining this listing. Besides updating previous information, Revision 3 adds approximately 80 citations. This listing identifies the version of the code or standard cited in the regulatory document, the regulatory document, and the current version of the code or standard. It also provides a summary characterization of the nature of the citation. This listing was developed from electronic searches of the Code of Federal Regulations and the NRC's Bulletins, Information Notices, Circulars, Enforcement Manual, Generic Letters, Inspection Manual, Policy Statements, Regulatory Guides, Standard Technical Specifications and the Standard Review Plan (NUREG-0800)

  14. Codes and standards and other guidance cited in regulatory documents

    Energy Technology Data Exchange (ETDEWEB)

    Nickolaus, J.R.; Bohlander, K.L.

    1996-08-01

    As part of the U.S. Nuclear Regulatory Commission (NRC) Standard Review Plan Update and Development Program (SRP-UDP), Pacific Northwest National Laboratory developed a listing of industry consensus codes and standards and other government and industry guidance referred to in regulatory documents. The SRP-UDP has been completed and the SRP-Maintenance Program (SRP-MP) is now maintaining this listing. Besides updating previous information, Revision 3 adds approximately 80 citations. This listing identifies the version of the code or standard cited in the regulatory document, the regulatory document, and the current version of the code or standard. It also provides a summary characterization of the nature of the citation. This listing was developed from electronic searches of the Code of Federal Regulations and the NRC's Bulletins, Information Notices, Circulars, Enforcement Manual, Generic Letters, Inspection Manual, Policy Statements, Regulatory Guides, Standard Technical Specifications and the Standard Review Plan (NUREG-0800).

  15. Learning-Based Just-Noticeable-Quantization- Distortion Modeling for Perceptual Video Coding.

    Science.gov (United States)

    Ki, Sehwan; Bae, Sung-Ho; Kim, Munchurl; Ko, Hyunsuk

    2018-07-01

    Conventional predictive video coding-based approaches are reaching the limit of their potential coding efficiency improvements because of severely increasing computational complexity. As an alternative, perceptual video coding (PVC) attempts to achieve high coding efficiency by eliminating perceptual redundancy, using just-noticeable-distortion (JND) directed PVC. Previous JND models were built by adding white Gaussian noise or specific signal patterns to the original images, which is not appropriate for finding JND thresholds due to distortion with energy reduction. In this paper, we present a novel discrete cosine transform-based energy-reduced JND model, called ERJND, that is more suitable for JND-based PVC schemes. The proposed ERJND model is then extended to two learning-based just-noticeable-quantization-distortion (JNQD) models that can be applied as preprocessing for perceptual video coding. The two JNQD models automatically adjust JND levels based on given quantization step sizes. The first, called LR-JNQD, is based on linear regression and determines the JNQD model parameters from extracted handcrafted features. The second, called CNN-JNQD, is based on a convolutional neural network (CNN). To the best of our knowledge, this is the first approach to automatically adjust JND levels according to quantization step sizes when preprocessing the input to video encoders. In experiments, both the LR-JNQD and CNN-JNQD models were applied to high efficiency video coding (HEVC) and yielded maximum (average) bitrate reductions of 38.51% (10.38%) and 67.88% (24.91%), respectively, with little subjective video quality degradation compared with the input without preprocessing.
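The idea of scaling a JND threshold with the quantization step and suppressing imperceptible coefficients before encoding can be sketched as below (the linear form and the coefficients `a` and `b` are hypothetical placeholders, not the paper's trained LR-JNQD parameters):

```python
import numpy as np

def jnqd_preprocess(coeffs, base_jnd, qstep, a=0.5, b=0.1):
    # Hypothetical linear JNQD adjustment: the JND threshold grows with
    # the quantization step, and transform coefficients within the
    # threshold of zero are suppressed before encoding, since removing
    # them should not produce a visible difference after quantization.
    jnd = base_jnd * (a + b * qstep)
    out = coeffs.copy()
    out[np.abs(out) < jnd] = 0.0
    return out
```

Zeroed coefficients cost almost no bits in the entropy coder, which is the mechanism behind the bitrate reductions reported in the record.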

  16. Representation mutations from standard genetic codes

    Science.gov (United States)

    Aisah, I.; Suyudi, M.; Carnia, E.; Suhendi; Supriatna, A. K.

    2018-03-01

    Graphs are widely used to describe and model problems concretely and clearly, and to help solve problems that are difficult to handle by direct calculation. In biology, a graph can describe the process of protein synthesis from DNA. Proteins play an important role for DNA (deoxyribonucleic acid) and RNA (ribonucleic acid) and are composed of amino acids. In this study, amino acids are related to genetics, especially the genetic code. The genetic code, also known as the triplet or codon code, is a three-letter arrangement of DNA nitrogen bases. The bases are adenine (A), thymine (T), guanine (G) and cytosine (C); in RNA, thymine (T) is replaced by uracil (U). The set of all nitrogen bases in RNA is denoted by N = {C, U, A, G}. Codons act during protein synthesis inside the cell and also encode the stop signal marking the end of the protein synthesis process. This paper examines the process of protein synthesis through mathematical studies and presents it in three-dimensional space or as a graph. The study begins by analysing the set of all codons, denoted NNN, in order to obtain geometric representations. At this stage the set of all nitrogen bases N is matched with Z 2 × Z 2: C=(\overline{0},\overline{0}), U=(\overline{0},\overline{1}), A=(\overline{1},\overline{0}), G=(\overline{1},\overline{1}). Through this matching, algebraic structures such as groups, the Klein four-group and quotient groups are obtained. With the help of the Geogebra software, the set of all codons NNN can be presented in three-dimensional space as a multicube and can also be represented as a graph, so that the relationships between codons can easily be seen.
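The matching of bases to Z 2 × Z 2 described in this record can be checked directly (a small sketch verifying the Klein four-group properties of the base set under componentwise addition mod 2):

```python
# matching the RNA bases to Z2 x Z2 as described in the record
base = {'C': (0, 0), 'U': (0, 1), 'A': (1, 0), 'G': (1, 1)}

def add(x, y):
    # componentwise addition mod 2
    return ((x[0] + y[0]) % 2, (x[1] + y[1]) % 2)

elems = list(base.values())
# Klein four-group properties: closure, commutativity, self-inverse elements
closed = all(add(x, y) in elems for x in elems for y in elems)
abelian = all(add(x, y) == add(y, x) for x in elems for y in elems)
self_inverse = all(add(x, x) == (0, 0) for x in elems)

# a codon is an element of (Z2 x Z2)^3, e.g. the start codon AUG
aug = tuple(base[b] for b in 'AUG')
```

The 64 codons thus correspond to the vertices of (Z2 × Z2)^3, which is what allows the multicube and graph representations mentioned in the record.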

  17. Motion Vector Sharing and Bitrate Allocation for 3D Video-Plus-Depth Coding

    Directory of Open Access Journals (Sweden)

    Béatrice Pesquet-Popescu

    2008-08-01

    Full Text Available The video-plus-depth data representation uses a regular texture video enriched with the so-called depth map, providing the depth distance for each pixel. The compression efficiency is usually higher for the smooth, gray-level data representing the depth map than for classical video texture. However, improvements of the coding efficiency are still possible, given that the video and depth map sequences are strongly correlated. Classically, the correlation between the texture motion vectors and the depth map motion vectors is not exploited in the coding process. The aim of this paper is to reduce the amount of information needed to describe the motion of the texture video and of the depth map sequences by sharing one common motion vector field. Furthermore, in the literature, bitrate control schemes generally fix the depth map bitrate at 20% of the texture stream bitrate. However, this fixed percentage can affect the depth coding efficiency and should also depend on the content of each sequence. We propose a new bitrate allocation strategy between the texture and its associated per-pixel depth information, and provide a comparative analysis to measure the quality of the resulting 3D+t sequences.

  18. The status of Korean nuclear codes and standards

    International Nuclear Information System (INIS)

    Namha Kim; Jong-Hae Kim

    2005-01-01

    Korea Electric Power Industry Code (KEPIC), a set of integrated standards applicable to the design, construction and operation of electric power facilities including nuclear power plants, has been developed on the basis of the prevailing U.S. codes and standards which had been applied to electric power facilities in Korea. As the developing and managing organization of KEPIC, Korea Electric Association (KEA) published its first edition in 1995 and the second in 2000, and is expected to publish the 2005 edition. KEPIC was applied to the construction of Ulchin Nuclear Units 5 and 6 in 1997, and will be applicable to the construction of forthcoming nuclear power plants in Korea. Along with the entry into force of the Agreement on Technical Barriers to Trade (TBT) in 1995, the international environment for codes and standards is changing rapidly. The KEA is therefore making its utmost efforts to keep KEPIC abreast of the changing environment in the international arena. KEA notified the ISO/IEC Information Centre of its acceptance of the Code of Good Practice in the Agreement on TBT. The 2005 KEPIC edition will be retrofitted according to ISO/IEC Guide 21 (Adoption of International Standards as regional or national standards). KEA's efforts will help KEPIC correspond with international standards such as ISO/IEC standards, and internationally recognized standards such as ASME codes and standards. (authors)

  19. Comparison of codes and standards for radiographic inspection

    International Nuclear Information System (INIS)

    Bingoeldag, M. M.; Aksu, M.; Akguen, A. F.

    1995-01-01

    This report compares the procedural requirements and acceptance criteria for radiographic inspections specified in the relevant national and international codes and standards. In particular, a detailed analysis of inspection conditions, such as exposure arrangements and contrast requirements, is given

  20. Vehicle Codes and Standards: Overview and Gap Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Blake, C.; Buttner, W.; Rivkin, C.

    2010-02-01

    This report identifies gaps in vehicle codes and standards and recommends ways to fill the gaps, focusing on six alternative fuels: biodiesel, natural gas, electricity, ethanol, hydrogen, and propane.

  1. Standard problems for structural computer codes

    International Nuclear Information System (INIS)

    Philippacopoulos, A.J.; Miller, C.A.; Costantino, C.J.

    1985-01-01

    BNL is investigating the ranges of validity of the analytical methods used to predict the behavior of nuclear safety related structures under accidental and extreme environmental loadings. During FY 85, the investigations were concentrated on special problems that can significantly influence the outcome of the soil-structure interaction evaluation process. Specifically, the limitations and applicability of the standard interaction methods when dealing with lift-off, layering and water-table effects were investigated. This paper describes the work and the results obtained during FY 85 from the studies on lift-off, layering and water-table effects in soil-structure interaction

  2. Programming Video Games and Simulations in Science Education: Exploring Computational Thinking through Code Analysis

    Science.gov (United States)

    Garneli, Varvara; Chorianopoulos, Konstantinos

    2018-01-01

    Various aspects of computational thinking (CT) could be supported by educational contexts such as simulations and video-games construction. In this field study, potential differences in student motivation and learning were empirically examined through students' code. For this purpose, we performed a teaching intervention that took place over five…

  3. Innovation Opportunities: An Overview of Standards and Platforms in the Video Game Industry

    OpenAIRE

    Laakso, Mikael; Nyman, Linus Morten

    2014-01-01

    The video game industry offers insights into the significance of standards and platforms. Furthermore, it shows examples of how new entrants can offer innovative services, while reducing their own risk, through bridging the boundaries between standards. Through an exploration of both past and present, this article aims to serve as a primer for understanding, firstly, the technological standards and platforms of the video game industry, and secondly, the recent innovations within the video gam...

  4. ASME nuclear codes and standards risk management strategic plan

    International Nuclear Information System (INIS)

    Balkey, Kenneth R.

    2003-01-01

    Over the past 15 years, several risk-informed initiatives have been completed or are under development within the ASME Nuclear Codes and Standards organization. In order to better manage the numerous initiatives in the future, the ASME Board on Nuclear Codes and Standards has recently developed and approved a Risk Management Strategic Plan. This paper presents the latest approved version of the plan beginning with a background of applications completed to date, including the recent issuance of the ASME Standard for Probabilistic Risk Assessment (PRA) for Nuclear Power Plant Applications. The paper discusses potential applications within ASME Nuclear Codes and Standards that may require expansion of the PRA Standard, such as for new generation reactors, or the development of new PRA Standards. A long-term vision for the potential development and evolution to a nuclear systems code that adopts a risk-informed approach across a facility life-cycle (design, construction, operation, maintenance, and closure) is summarized. Finally, near term and long term actions are defined across the ASME Nuclear Codes and Standards organizations related to risk management, and related U.S. regulatory activities are also summarized. (author)

  5. Optimization of high-definition video coding and hybrid fiber-wireless transmission in the 60 GHz band

    DEFF Research Database (Denmark)

    Lebedev, Alexander; Pham, Tien Thang; Beltrán, Marta

    2011-01-01

    The paper addresses the problem of distribution of high-definition video over fiber-wireless networks. The physical layer architecture with a low-complexity envelope detection solution is investigated. We present both experimental studies and simulation of high-quality high-definition compressed...... video transmission over a 60 GHz fiber-wireless link. Using advanced video coding, we satisfy low-complexity and low-delay constraints while preserving superb video quality after a significantly extended wireless distance. © 2011 Optical Society of America....

  6. 76 FR 22383 - National Fire Codes: Request for Proposals for Revision of Codes and Standards

    Science.gov (United States)

    2011-04-21

    ... Chemical Extinguishing Systems. NFPA 22-2008, Standard for Water Tanks for Private Fire Protection (5/23/2011). ... Ensembles for Technical Rescue Incidents. NFPA 1925-2008, Standard on Marine Fire-Fighting Vessels (5/23/2011). ... DEPARTMENT OF COMMERCE, National Institute of Standards and Technology, National Fire Codes: Request...

  7. Joint source/channel coding of scalable video over noisy channels

    Energy Technology Data Exchange (ETDEWEB)

    Cheung, G.; Zakhor, A. [Department of Electrical Engineering and Computer Sciences University of California Berkeley, California94720 (United States)

    1997-01-01

    We propose an optimal bit allocation strategy for a joint source/channel video codec over a noisy channel when the channel state is assumed to be known. Our approach is to partition source and channel coding bits in such a way that the expected distortion is minimized. The particular source coding algorithm we use is rate scalable and is based on 3D subband coding with multi-rate quantization. We show that using this strategy, transmission of video over very noisy channels still renders acceptable visual quality, and outperforms schemes that use equal error protection only. The flexibility of the algorithm also permits the bit allocation to be selected optimally when the channel state is given as a probability distribution instead of a deterministic state. © 1997 American Institute of Physics.
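The bit-partitioning idea can be sketched with a toy optimization (the distortion and loss models below are illustrative assumptions, not the paper's 3D subband codec or channel model):

```python
def allocate_bits(total, p_loss, d_src, d_max=1.0):
    # Exhaustively split the bit budget between source bits s and channel
    # (protection) bits c so as to minimise the expected distortion
    #   E[D] = (1 - p_loss(c)) * d_src(s) + p_loss(c) * d_max
    best = None
    for c in range(total + 1):
        s = total - c
        e = (1.0 - p_loss(c)) * d_src(s) + p_loss(c) * d_max
        if best is None or e < best[0]:
            best = (e, s, c)
    return best  # (expected distortion, source bits, channel bits)
```

With an exponentially decaying source distortion and a residual loss probability that falls with added protection, the optimum typically lies strictly between the all-source and all-channel extremes, which is the point of the joint allocation.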

  8. Final Technical Report: Hydrogen Codes and Standards Outreach

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Karen I.

    2007-05-12

    This project contributed significantly to the development of new codes and standards, both domestically and internationally. The NHA collaborated with codes and standards development organizations to identify technical areas of expertise that would be required to produce the codes and standards that industry and DOE felt were required to facilitate commercialization of hydrogen and fuel cell technologies and infrastructure. NHA staff participated directly in technical committees and working groups where issues could be discussed with the appropriate industry groups. In other cases, the NHA recommended specific industry experts to serve on technical committees and working groups where the need for this specific industry expertise would be on-going, and where this approach was likely to contribute to timely completion of the effort. The project also facilitated dialog between codes and standards development organizations, hydrogen and fuel cell experts, the government and national labs, researchers, code officials, industry associations, as well as the public regarding the timeframes for needed codes and standards, industry consensus on technical issues, procedures for implementing changes, and general principles of hydrogen safety. The project facilitated hands-on learning, as participants in several NHA workshops and technical meetings were able to experience hydrogen vehicles, witness hydrogen refueling demonstrations, see metal hydride storage cartridges in operation, and view other hydrogen energy products.

  9. ASME nuclear codes and standards risk management strategic planning

    International Nuclear Information System (INIS)

    Hill, Ralph S. III; Balkey, Kenneth R.; Erler, Bryan A.; Wesley Rowley, C.

    2007-01-01

    This paper is prepared in honor and in memory of the late Professor Emeritus Yasuhide Asada to recognize his contributions to ASME Nuclear Codes and Standards initiatives, particularly those related to risk-informed technology and System Based Code developments. For nearly two decades, numerous risk-informed initiatives have been completed or are under development within the ASME Nuclear Codes and Standards organization. In order to properly manage the numerous initiatives currently underway or planned for the future, the ASME Board on Nuclear Codes and Standards (BNCS) has an established Risk Management Strategic Plan (Plan) that is maintained and updated by the ASME BNCS Risk Management Task Group. This paper presents the latest approved version of the plan beginning with a background of applications completed to date, including the recent probabilistic risk assessment (PRA) standards developments for nuclear power plant applications. The paper discusses planned applications within ASME Nuclear Codes and Standards that will require expansion of the ASME PRA Standard to support new advanced light water reactor and next generation reactor developments, such as for high temperature gas-cooled reactors. Emerging regulatory developments related to risk-informed, performance-based approaches are summarized. A long-term vision for the potential development and evolution to a nuclear systems code that adopts a risk-informed approach across a facility life-cycle (design, construction, operation, maintenance, and closure) is also summarized. Finally, near term and long term actions are defined across the ASME Nuclear Codes and Standards organizations related to risk management, including related U.S. regulatory activities. (author)

  10. Iterative Multiview Side Information for Enhanced Reconstruction in Distributed Video Coding

    Directory of Open Access Journals (Sweden)

    2009-03-01

    Full Text Available Distributed video coding (DVC) is a new paradigm for video compression based on the information-theoretic results of Slepian and Wolf (SW) and Wyner and Ziv (WZ). DVC entails low-complexity encoders as well as separate encoding of correlated video sources. This is particularly attractive for multiview camera systems in video surveillance and camera sensor network applications, where low complexity is required at the encoder. In addition, the separate encoding of the sources implies no communication between the cameras in a practical scenario. This is an advantage, since communication is time- and power-consuming and requires complex networking. In this work, different intercamera estimation techniques for side information (SI) generation are explored and compared in terms of estimation quality, complexity, and rate-distortion (RD) performance. Further, a technique called iterative multiview side information (IMSI) is introduced, where the final SI is used in an iterative reconstruction process. The simulation results show that IMSI significantly improves the RD performance for video with significant motion and activity. Furthermore, DVC outperforms AVC/H.264 Intra for video with average and low motion, but it is still inferior to the Inter No Motion and Inter Motion modes.

  11. Building climate change into infrastructure codes and standards

    International Nuclear Information System (INIS)

    Auld, H.; Klaasen, J.; Morris, R.; Fernandez, S.; MacIver, D.; Bernstein, D.

    2009-01-01

    'Full text:' Building codes and standards and the climatic design values embedded within these legal to semi-legal documents have profound safety, health and economic implications for Canada's infrastructure systems. The climatic design values that have been used for the design of almost all of today's more than $5.5 Trillion in infrastructure are based on historical climate data and assume that the extremes of the past will represent future conditions. Since new infrastructure based on codes and standards will be built to survive for decades to come, it is critically important that existing climatic design information be as accurate and up-to-date as possible, that the changing climate be monitored to detect and highlight vulnerabilities of existing infrastructure, that forensic studies of climate-related failures be undertaken and that codes and standards processes incorporate future climates and extremes as much as possible. Uncertainties in the current climate change models and their scenarios currently challenge our ability to project future extremes regionally and locally. Improvements to the spatial and temporal resolution of these climate change scenarios, along with improved methodologies to treat model biases and localize results, will allow future codes and standards to better reflect the extremes and weathering conditions expected over the lifespan of structures. In the meantime, other information and code processes can be used to incorporate changing climate conditions into upcoming infrastructure codes and standards, to “bridge” the model uncertainty gap and to complement the state of existing projections. This presentation will outline some of the varied information and processes that will be used to incorporate climate change adaptation into the next development cycle of the National Building Code of Canada and numerous other national CSA infrastructure standards. (author)

  12. Lightweight Object Tracking in Compressed Video Streams Demonstrated in Region-of-Interest Coding

    Directory of Open Access Journals (Sweden)

    Lerouge Sam

    2007-01-01

Full Text Available Video scalability is a recent video coding technology that allows content providers to offer multiple quality versions from a single encoded video file in order to target different kinds of end-user devices and networks. One form of scalability utilizes the region-of-interest concept, that is, the possibility to mark objects or zones within the video as more important than the surrounding area. The scalable video coder ensures that these regions-of-interest are received by an end-user device before the surrounding area and preferably in higher quality. In this paper, novel algorithms are presented making it possible to automatically track the marked objects in the regions of interest. Our methods detect the overall motion of a designated object by retrieving the motion vectors calculated during the motion estimation step of the video encoder. Using this knowledge, the region-of-interest is translated, thus following the objects within. Furthermore, the proposed algorithms allow adequate resizing of the region-of-interest. By using the available information from the video encoder, object tracking can be done in the compressed domain and is suitable for real-time and streaming applications. A time-complexity analysis is given for the algorithms, proving their low complexity and usability for real-time applications. The proposed object tracking methods are generic and can be applied to any codec that calculates the motion vector field. In this paper, the algorithms are implemented within the MPEG-4 fine-granularity scalability codec. Different tests on different video sequences are performed to evaluate the accuracy of the methods. Our novel algorithms achieve a precision up to 96.4%.
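The core mechanism described above can be sketched in a few lines: collect the encoder's motion vectors that fall inside the region-of-interest and translate the region by their average. This is a minimal illustration under assumed data layouts (a dense per-block motion field held as a NumPy array); the paper's algorithms additionally refine the estimate and resize the region:

```python
import numpy as np

def track_roi(motion_field, roi):
    """Translate a region-of-interest by the mean motion vector inside it.

    motion_field : (H, W, 2) array of per-block motion vectors (dx, dy),
                   as produced by the encoder's motion estimation step.
    roi          : (x, y, w, h) rectangle in block units.
    Returns the translated rectangle, rounded to integer block positions.
    """
    x, y, w, h = roi
    vectors = motion_field[y:y + h, x:x + w].reshape(-1, 2)
    dx, dy = vectors.mean(axis=0)            # dominant (average) motion
    return (int(round(x + dx)), int(round(y + dy)), w, h)

# A field where every block moves 2 blocks right and 1 block down:
field = np.tile(np.array([2.0, 1.0]), (9, 9, 1))
print(track_roi(field, (3, 3, 2, 2)))        # -> (5, 4, 2, 2)
```

Because only the motion vectors already computed by the encoder are read, the tracker never touches decoded pixels, which is what keeps it usable in the compressed domain.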

  13. Lightweight Object Tracking in Compressed Video Streams Demonstrated in Region-of-Interest Coding

    Directory of Open Access Journals (Sweden)

    Rik Van de Walle

    2007-01-01

    Full Text Available Video scalability is a recent video coding technology that allows content providers to offer multiple quality versions from a single encoded video file in order to target different kinds of end-user devices and networks. One form of scalability utilizes the region-of-interest concept, that is, the possibility to mark objects or zones within the video as more important than the surrounding area. The scalable video coder ensures that these regions-of-interest are received by an end-user device before the surrounding area and preferably in higher quality. In this paper, novel algorithms are presented making it possible to automatically track the marked objects in the regions of interest. Our methods detect the overall motion of a designated object by retrieving the motion vectors calculated during the motion estimation step of the video encoder. Using this knowledge, the region-of-interest is translated, thus following the objects within. Furthermore, the proposed algorithms allow adequate resizing of the region-of-interest. By using the available information from the video encoder, object tracking can be done in the compressed domain and is suitable for real-time and streaming applications. A time-complexity analysis is given for the algorithms proving the low complexity thereof and the usability for real-time applications. The proposed object tracking methods are generic and can be applied to any codec that calculates the motion vector field. In this paper, the algorithms are implemented within MPEG-4 fine-granularity scalability codec. Different tests on different video sequences are performed to evaluate the accuracy of the methods. Our novel algorithms achieve a precision up to 96.4%.

  14. 1995 building energy codes and standards workshops: Summary and documentation

    Energy Technology Data Exchange (ETDEWEB)

    Sandahl, L.J.; Shankle, D.L.

    1996-02-01

    During the spring of 1995, Pacific Northwest National Laboratory (PNNL) conducted four two-day Regional Building Energy Codes and Standards workshops across the US. Workshops were held in Chicago, Denver, Rhode Island, and Atlanta. The workshops were designed to benefit state-level officials including staff of building code commissions, energy offices, public utility commissions, and others involved with adopting/updating, implementing, and enforcing building energy codes in their states. The workshops provided an opportunity for state and other officials to learn more about residential and commercial building energy codes and standards, the role of the US Department of Energy and the Building Standards and Guidelines Program at Pacific Northwest National Laboratory, Home Energy Rating Systems (HERS), Energy Efficient Mortgages (EEM), training issues, and other topics related to the development, adoption, implementation, and enforcement of building energy codes. Participants heard success stories, got tips on enforcement training, and received technical support materials. In addition to receiving information on the above topics, workshop participants had an opportunity to provide input on code adoption issues, building industry training issues, building design issues, and exemplary programs across the US. This paper documents the workshop planning, findings, and follow-up processes.

  15. Codes, standards, and requirements for DOE facilities: natural phenomena design

    International Nuclear Information System (INIS)

    Webb, A.B.

    1985-01-01

    The basic requirements for codes, standards, and requirements are found in DOE Orders 5480.1A, 5480.4, and 6430.1. The type of DOE facility to be built and the hazards which it presents will determine the criteria to be applied for natural phenomena design. Mandatory criteria are established in the DOE orders for certain designs but more often recommended guidance is given. National codes and standards form a great body of experience from which the project engineer may draw. Examples of three kinds of facilities and the applicable codes and standards are discussed. The safety program planning approach to project management used at Westinghouse Hanford is outlined. 5 figures, 2 tables

  16. Lightning protection, techniques, applied codes and standards. Vol. 4

    International Nuclear Information System (INIS)

    Mahmoud, M.; Shaaban, H.; Lamey, S.

    1996-01-01

Lightning is the only natural disaster against which protection is highly effective. Therefore, for the safety of critical installations, specifically nuclear, an effective lightning protection system (LPS) is required. The design and installation of LPSs have been addressed by many international codes and standards. In this paper, the various LPSs are discussed and compared, including radioactive air terminals, ionizing air terminals, and terminals equipped with electrical triggering devices. Also, the so-called dissipation array systems are discussed and compared to other systems technically and economically. Moreover, the available international codes and standards related to lightning protection are discussed. Such standards include those published by the National Fire Protection Association (NFPA), Lightning Protection Institute (LPI), Underwriters Laboratories (UL), and British Standards. Finally, the possibility of developing an Egyptian national standard is discussed

  17. 78 FR 24725 - National Fire Codes: Request for Public Input for Revision of Codes and Standards

    Science.gov (United States)

    2013-04-26

    ... Production, Storage, and Handling of Liquefied Natural Gas (LNG). NFPA 61--2013 Standard for the 7/6/2015... Nitrate Film. NFPA 51--2013 Standard for the Design 7/6/2015 and Installation of Oxygen-Fuel Gas Systems... Charging Plants. NFPA 52--2013 Vehicular Gaseous Fuel 1/3/2014 Systems Code. NFPA 53--2011 Recommended...

  18. 77 FR 67340 - National Fire Codes: Request for Comments on NFPA's Codes and Standards

    Science.gov (United States)

    2012-11-09

    ... Water Mist Fire P Protection Systems. NFPA 921 Guide for Fire and Explosion P Investigations. NFPA 1005 Standard for Professional P Qualifications for Marine Fire Fighting for Land-Based Fire Fighters. NFPA 1192... DEPARTMENT OF COMMERCE National Institute of Standards and Technology National Fire Codes: Request...

  19. Defining the cognitive enhancing properties of video games: Steps Towards Standardization and Translation.

    Science.gov (United States)

    Goodwin, Shikha Jain; Dziobek, Derek

    2016-09-01

Ever since video games became available to the general public, they have intrigued brain researchers for many reasons. There is an enormous amount of diversity in video game research, ranging from the types of video games used, the amount of time spent playing, and the definition of gamer versus non-gamer, to the results obtained after playing video games. In this paper, our goal is to provide a critical discussion of these issues, along with some steps towards generalization, using the discussion of an article published by Clemenson and Stark (2005) as the starting point. The authors used a distinction between 2D and 3D video games to compare their effects on learning and memory in humans. Their primary hypothesis is that the exploration of virtual environments while playing video games is a human correlate of environmental enrichment. The authors found that video gamers performed better than non-gamers, and that if non-gamers are trained on playing video games, 3D games provide better environmental enrichment than 2D video games, as indicated by better memory scores. The end goal of standardization in video games is to be able to translate the field so that the results can be used for greater good.

  20. 75 FR 75186 - Interview Room Video System Standard Special Technical Committee Request for Proposals for...

    Science.gov (United States)

    2010-12-02

    ... DEPARTMENT OF JUSTICE Office of Justice Programs [OJP (NIJ) Docket No. 1534] Interview Room Video System Standard Special Technical Committee Request for Proposals for Certification and Testing Expertise... Interview Room Video System Standard and corresponding certification program requirements. This work is...

  1. Standardized access, display, and retrieval of medical video

    Science.gov (United States)

    Bellaire, Gunter; Steines, Daniel; Graschew, Georgi; Thiel, Andreas; Bernarding, Johannes; Tolxdorff, Thomas; Schlag, Peter M.

    1999-05-01

The system presented here enhances documentation and data-secured, second-opinion facilities by integrating video sequences into DICOM 3.0. We present an implementation of a medical video server extended by a DICOM interface. Security mechanisms conforming with DICOM are integrated to enable secure internet access. Digital video documents of diagnostic and therapeutic procedures should be examined regarding the clip length and size necessary for second opinion and manageable with today's hardware. Image sources relevant for this paper include the 3D laparoscope, 3D surgical microscope, 3D open surgery camera, synthetic video, and monoscopic endoscopes, etc. The global DICOM video concept and three special workplaces for distinct applications are described. Additionally, an approach is presented to analyze the motion of the endoscopic camera for future automatic video-cutting. Digital stereoscopic video sequences (DSVS) are especially in demand for surgery. Therefore, DSVS are also integrated into the DICOM video concept. Results are presented describing the suitability of stereoscopic display techniques for the operating room.

  2. A Complete Video Coding Chain Based on Multi-Dimensional Discrete Cosine Transform

    Directory of Open Access Journals (Sweden)

    T. Fryza

    2010-09-01

Full Text Available The paper deals with a video compression method based on the multi-dimensional discrete cosine transform. In the text, the encoder and decoder architectures, including the definitions of all mathematical operations such as the forward and inverse 3-D DCT, quantization, and thresholding, are presented. According to the particular number of currently processed pictures, new quantization tables and entropy code dictionaries are proposed in the paper. The practical properties of the 3-D DCT coding chain compared with modern video compression methods (such as H.264 and WebM) and the computing complexity are presented as well. It is shown that the best compression properties are achieved by the more complex H.264 codec. On the other hand, the computing complexity - especially on the encoding side - is lower for the 3-D DCT method.
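The transform at the heart of such a coding chain can be illustrated with a small self-contained sketch: an orthonormal DCT-II applied separably along time, height, and width of a cube of frames. The quantization tables and entropy code dictionaries of the actual codec are replaced here by simple magnitude thresholding, so this is a toy model of the pipeline, not the paper's implementation:

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II matrix of size n x n."""
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    m = np.cos(np.pi * (2 * i + 1) * k / (2 * n)) * np.sqrt(2.0 / n)
    m[0, :] = np.sqrt(1.0 / n)               # DC row scaling
    return m

def dct3(cube, inverse=False):
    """Separable 3-D DCT: apply the 1-D transform along each axis."""
    out = cube.astype(float)
    for axis in range(3):
        m = dct_matrix(out.shape[axis])
        if inverse:
            m = m.T                           # orthonormal: inverse = transpose
        out = np.moveaxis(np.tensordot(m, out, axes=(1, axis)), 0, axis)
    return out

rng = np.random.default_rng(0)
cube = rng.random((8, 8, 8))                  # 8 frames of an 8x8 pixel region
coeffs = dct3(cube)

# Lossy step: keep only the 10% largest-magnitude coefficients
# (a stand-in for the paper's quantization tables and thresholding).
keep = np.abs(coeffs) >= np.quantile(np.abs(coeffs), 0.9)
approx = dct3(np.where(keep, coeffs, 0.0), inverse=True)

# Without thresholding, the transform is perfectly invertible:
assert np.allclose(dct3(coeffs, inverse=True), cube)
```

The trade-off the abstract reports falls out of this structure: the 3-D DCT is just three matrix multiplications per cube, far cheaper to encode than H.264's motion search, but it exploits temporal redundancy less effectively.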

  3. Improved Side Information Generation for Distributed Video Coding by Exploiting Spatial and Temporal Correlations

    Directory of Open Access Journals (Sweden)

    Ye Shuiming

    2009-01-01

Full Text Available Distributed video coding (DVC) is a video coding paradigm allowing low-complexity encoding for emerging applications such as wireless video surveillance. Side information (SI) generation is a key function in the DVC decoder, and plays a key role in determining the performance of the codec. This paper proposes an improved SI generation method for DVC, which exploits both spatial and temporal correlations in the sequences. Partially decoded Wyner-Ziv (WZ) frames, based on initial SI obtained by motion-compensated temporal interpolation, are exploited to improve the performance of the whole SI generation. More specifically, an enhanced temporal frame interpolation is proposed, including motion vector refinement and smoothing, optimal compensation mode selection, and a new matching criterion for motion estimation. The improved SI technique is also applied to a new hybrid spatial and temporal error concealment scheme to conceal errors in WZ frames. Simulation results show that the proposed scheme can achieve up to 1.0 dB improvement in rate-distortion performance in WZ frames for video with high motion, when compared to state-of-the-art DVC. In addition, both the objective and perceptual qualities of the corrupted sequences are significantly improved by the proposed hybrid error concealment scheme, outperforming both spatial and temporal concealment alone.
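The baseline step this work improves on, motion-compensated temporal interpolation between two key frames, can be sketched as follows. This is a deliberately naive version (full-search SAD matching and a plain average of the matched blocks, anchored on the current frame's grid); the refinement, smoothing, and compensation-mode-selection steps described above are omitted:

```python
import numpy as np

def side_information(prev, nxt, block=8, search=4):
    """Simplified side-information frame by block-based motion compensation.

    For every block of the next key frame, a full search over +/- `search`
    pixels finds the best SAD match in the previous key frame; the SI block
    is the average of the matched block and the current one.
    """
    h, w = prev.shape
    si = np.zeros((h, w))
    for by in range(0, h, block):
        for bx in range(0, w, block):
            cur = nxt[by:by + block, bx:bx + block]
            best_sad, best = None, cur
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if 0 <= y <= h - block and 0 <= x <= w - block:
                        cand = prev[y:y + block, x:x + block]
                        sad = np.abs(cand - cur).sum()
                        if best_sad is None or sad < best_sad:
                            best_sad, best = sad, cand
            si[by:by + block, bx:bx + block] = 0.5 * (best + cur)
    return si

# Pure 1-pixel horizontal translation: the match is exact, so SI == nxt.
prev = np.zeros((16, 16)); prev[2:6, 2:6] = 255.0
nxt = np.zeros((16, 16));  nxt[2:6, 1:5] = 255.0
assert np.array_equal(side_information(prev, nxt), nxt)
```

The Wyner-Ziv decoder then treats the SI frame as a noisy version of the true frame, which is why the quality of this interpolation directly bounds the rate-distortion performance of the codec.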

  4. Mixture block coding with progressive transmission in packet video. Appendix 1: Item 2. M.S. Thesis

    Science.gov (United States)

    Chen, Yun-Chung

    1989-01-01

Video transmission will become an important part of future multimedia communication because of dramatically increasing user demand for video and the rapid evolution of coding algorithms and VLSI technology. Video transmission will be part of the broadband integrated services digital network (B-ISDN). Asynchronous transfer mode (ATM) is a viable candidate for implementation of B-ISDN due to its inherent flexibility, service independence, and high performance. According to the characteristics of ATM, the information has to be coded into discrete cells which travel independently in the packet-switching network. A practical realization of an ATM video codec called Mixture Block Coding with Progressive Transmission (MBCPT) is presented. This variable bit rate coding algorithm shows how constant quality performance can be obtained according to user demand. Interactions between codec and network are emphasized, including packetization, service synchronization, flow control, and error recovery. Finally, some simulation results based on MBCPT coding with error recovery are presented.

  5. Side Information and Noise Learning for Distributed Video Coding using Optical Flow and Clustering

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Rakêt, Lars Lau; Huang, Xin

    2012-01-01

    Distributed video coding (DVC) is a coding paradigm which exploits the source statistics at the decoder side to reduce the complexity at the encoder. The coding efficiency of DVC critically depends on the quality of side information generation and accuracy of noise modeling. This paper considers...... Transform Domain Wyner-Ziv (TDWZ) coding and proposes using optical flow to improve side information generation and clustering to improve noise modeling. The optical flow technique is exploited at the decoder side to compensate weaknesses of block based methods, when using motion-compensation to generate...... side information frames. Clustering is introduced to capture cross band correlation and increase local adaptivity in the noise modeling. This paper also proposes techniques to learn from previously decoded (WZ) frames. Different techniques are combined by calculating a number of candidate soft side...

  6. Efficient depth intraprediction method for H.264/AVC-based three-dimensional video coding

    Science.gov (United States)

    Oh, Kwan-Jung; Oh, Byung Tae

    2015-04-01

We present an intracoding method that is applicable to depth map coding in multiview-plus-depth systems. Our approach combines skip prediction and plane segmentation-based prediction. The proposed depth intraskip prediction uses the estimated direction at both the encoder and decoder, and does not need to encode residual data. Our plane segmentation-based intraprediction divides the current block into two regions, and applies a different prediction scheme for each segmented region. This method avoids incorrect estimations across different regions, resulting in higher prediction accuracy. Simulation results demonstrate that the proposed scheme is superior to H.264/advanced video coding intraprediction and has the ability to improve the subjective rendering quality.
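A toy version of bi-region prediction for depth blocks can illustrate why this works: depth maps are piecewise smooth, so splitting a block into two regions and predicting each from like-valued neighbours avoids blurring across the object edge. The thresholding rule and helper names below are illustrative assumptions, not the paper's exact partitioning or signalling scheme:

```python
import numpy as np

def biregion_intra_pred(block, top, left):
    """Toy plane-segmentation intra predictor for a depth block.

    The block is split into two regions by thresholding at its mean
    (in a real codec the partition would be derived or signalled);
    each region is then predicted by the mean of the reference samples
    (top row + left column of neighbours) on the same side of the threshold.
    """
    refs = np.concatenate([top, left]).astype(float)
    t = block.mean()
    mask = block >= t                         # the two segmented regions
    hi = refs[refs >= t]
    lo = refs[refs < t]
    pred = np.empty_like(block, dtype=float)
    pred[mask] = hi.mean() if hi.size else refs.mean()
    pred[~mask] = lo.mean() if lo.size else refs.mean()
    return pred, mask

# Foreground (100) / background (20) depth edge, consistent neighbours:
block = np.full((4, 4), 20.0); block[:, 2:] = 100.0
top = np.array([20.0, 20.0, 100.0, 100.0])
left = np.full(4, 20.0)
pred, mask = biregion_intra_pred(block, top, left)
assert np.array_equal(pred, block)            # piecewise-constant depth: exact
```

A single directional H.264 intra mode would smear the 20/100 edge across the block; predicting the two regions separately keeps the edge sharp, which matters for rendering quality in view synthesis.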

  7. Context adaptive binary arithmetic coding-based data hiding in partially encrypted H.264/AVC videos

    Science.gov (United States)

    Xu, Dawen; Wang, Rangding

    2015-05-01

    A scheme of data hiding directly in a partially encrypted version of H.264/AVC videos is proposed which includes three parts, i.e., selective encryption, data embedding and data extraction. Selective encryption is performed on context adaptive binary arithmetic coding (CABAC) bin-strings via stream ciphers. By careful selection of CABAC entropy coder syntax elements for selective encryption, the encrypted bitstream is format-compliant and has exactly the same bit rate. Then a data-hider embeds the additional data into partially encrypted H.264/AVC videos using a CABAC bin-string substitution technique without accessing the plaintext of the video content. Since bin-string substitution is carried out on those residual coefficients with approximately the same magnitude, the quality of the decrypted video is satisfactory. Video file size is strictly preserved even after data embedding. In order to adapt to different application scenarios, data extraction can be done either in the encrypted domain or in the decrypted domain. Experimental results have demonstrated the feasibility and efficiency of the proposed scheme.

  8. An experimental digital consumer recorder for MPEG-coded video signals

    NARCIS (Netherlands)

    Saeijs, R.W.J.J.; With, de P.H.N.; Rijckaert, A.M.A.; Wong, C.

    1995-01-01

    The concept and real-time implementation of an experimental home-use digital recorder is presented, capable of recording MPEG-compressed video signals. The system has small recording mechanics based on the DVC standard and it uses MPEG compression for trick-mode signals as well

  9. CSA guide to Canadian wind turbine codes and standards

    International Nuclear Information System (INIS)

    2008-01-01

    The Canadian wind energy sector has become one of the fastest-growing wind energy markets in the world. Growth of the industry has been supported by various government agencies. However, many projects have experienced cost over-runs or cancellations as a result of unclear regulatory requirements, and wind energy developers are currently subject to a variety of approval processes involving several different authorities. This Canadian Standards Association (CSA) guide provided general information on codes and standards related to the design, approval, installation, operation, and maintenance of wind turbines in Canada. CSA codes and standards were developed by considering 5 new standards adopted by the International Electrotechnical Commission (IEC) Technical Committee on Wind Turbines. The standards described in this document related to acoustic noise measurement techniques; power performance measurements of electricity-producing wind turbines; lightning protection for wind turbine generator systems; design requirements for turbines; and design requirements for small wind turbines. The guide addressed specific subject areas related to the development of wind energy projects that involve formal or regulatory approval processes. Subject areas included issues related to safety, environmental design considerations, site selection, and mechanical systems. Information on associated standards and codes was also included

  10. Performance Analysis of Video PHY Controller Using Unidirection and Bi-directional IO Standard via 7 Series FPGA

    DEFF Research Database (Denmark)

    Das, Bhagwan; Abdullah, M F L; Hussain, Dil muhammed Akbar

    2017-01-01

graphics consumes more power, creating a need for a low-power design for the Video PHY controller. In this paper, the performance of the Video PHY controller is analyzed by comparing the power consumption of unidirectional and bi-directional IO standards on a 7 series FPGA. It is determined...... that total on-chip power is reduced for the unidirectional IO standard based Video PHY controller compared to the bidirectional IO standard based Video PHY controller. The most significant finding of this work is that the unidirectional IO standard based Video PHY controller consumes the least...... standby power compared to the bidirectional IO standard based Video PHY controller. For a Video PHY controller operating at 6 GHz, total on-chip power is 32% lower with the unidirectional IO standard than with the bidirectional IO standard based Video...

  11. Joint Machine Learning and Game Theory for Rate Control in High Efficiency Video Coding.

    Science.gov (United States)

    Gao, Wei; Kwong, Sam; Jia, Yuheng

    2017-08-25

In this paper, a joint machine learning and game theory modeling (MLGT) framework is proposed for inter-frame coding tree unit (CTU) level bit allocation and rate control (RC) optimization in High Efficiency Video Coding (HEVC). First, a support vector machine (SVM) based multi-classification scheme is proposed to improve the prediction accuracy of the CTU-level rate-distortion (R-D) model. The legacy "chicken-and-egg" dilemma in video coding is overcome by the learning-based R-D model. Second, a mixed R-D model based cooperative bargaining game theory is proposed for bit allocation optimization, where the convexity of the mixed R-D model based utility function is proved, and the Nash bargaining solution (NBS) is achieved by the proposed iterative solution search method. The minimum utility is adjusted by the reference coding distortion and the frame-level quantization parameter (QP) change. Lastly, the intra-frame QP and inter-frame adaptive bit ratios are adjusted to give inter frames more bit resources, maintaining smooth quality and bit consumption in the bargaining game optimization. Experimental results demonstrate that the proposed MLGT-based RC method can achieve much better R-D performance, quality smoothness, bit rate accuracy, buffer control results, and subjective visual quality than the other state-of-the-art one-pass RC methods, and the achieved R-D performance is very close to the performance limit of the FixedQP method.
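The bargaining step can be illustrated with a stripped-down example: for two coding units sharing a bit budget, the Nash bargaining solution maximizes the product of utility gains over the disagreement point. The logarithmic utilities below are stand-ins for the paper's mixed R-D model, and the grid search replaces its iterative solution method:

```python
import numpy as np

def nbs_two_unit(budget, util1, util2, d1=0.0, d2=0.0, steps=1001):
    """Nash bargaining between two coding units sharing a bit budget.

    The NBS maximizes the product of utility gains over the disagreement
    point (d1, d2); with one free variable, a 1-D grid search suffices.
    `util1`/`util2` map allocated bits to utility.
    """
    b = np.linspace(0.0, budget, steps)
    gain = (util1(b) - d1) * (util2(budget - b) - d2)
    i = int(np.argmax(gain))
    return b[i], budget - b[i]

# Symmetric concave utilities: the NBS splits the budget evenly.
u = lambda bits: np.log1p(bits)
b1, b2 = nbs_two_unit(1000.0, u, u)           # an even split: (500.0, 500.0)
```

With asymmetric utilities (e.g. one CTU harder to code), the solution shifts bits toward the unit whose utility grows faster, which is the fairness property that motivates the bargaining formulation.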

  12. Multidimensional electron-photon transport with standard discrete ordinates codes

    International Nuclear Information System (INIS)

    Drumm, C.R.

    1995-01-01

A method is described for generating electron cross sections that are compatible with standard discrete ordinates codes without modification. There are many advantages of using an established discrete ordinates solver, e.g. immediately available adjoint capability. Coupled electron-photon transport capability is needed for many applications, including the modeling of the response of electronics components to space and man-made radiation environments. The cross sections have been successfully used in the DORT, TWODANT and TORT discrete ordinates codes. The cross sections are shown to provide accurate and efficient solutions to certain multidimensional electron-photon transport problems

  13. Future direction of ASME nuclear codes and standards

    International Nuclear Information System (INIS)

    Ennis, Kevin; Sheehan, Mark E.

    2003-01-01

    While the nuclear power industry in the US is in a period of stasis, there continues to be a great deal of activity in the ASME nuclear standards development arena. As plants age, the need for new approaches in standardization changes with the changing needs of the industry. New tools are becoming available in the form of risk analysis, and this is finding its way into more and more of ASME's standards activities. This paper will take a look at the direction that ASME nuclear Codes and Standards are heading in this and other areas, as well as taking a look at some advance reactor concepts and plans for standards to address new technologies

  14. High-speed low-complexity video coding with EDiCTius: a DCT coding proposal for JPEG XS

    Science.gov (United States)

    Richter, Thomas; Fößel, Siegfried; Keinert, Joachim; Scherl, Christian

    2017-09-01

In its 71st meeting, the JPEG committee issued a call for low-complexity, high-speed image coding, designed to address the needs of low-cost video-over-IP applications. As an answer to this call, Fraunhofer IIS and the Computing Center of the University of Stuttgart jointly developed an embedded DCT image codec requiring only minimal resources while maximizing throughput on FPGA and GPU implementations. Objective and subjective tests performed for the 73rd meeting confirmed its excellent performance and suitability for its purpose, and it was selected as one of the two key contributions for the development of a joint test model. In this paper, its authors describe the design principles of the codec, provide a high-level overview of the encoder and decoder chain, and present evaluation results on the test corpus selected by the JPEG committee.

  15. Robust Pedestrian Tracking and Recognition from FLIR Video: A Unified Approach via Sparse Coding

    Directory of Open Access Journals (Sweden)

    Xin Li

    2014-06-01

Full Text Available Sparse coding is an emerging method that has been successfully applied to both robust object tracking and recognition in the vision literature. In this paper, we propose to explore a sparse coding-based approach toward joint object tracking-and-recognition and explore its potential in the analysis of forward-looking infrared (FLIR) video to support nighttime machine vision systems. A key technical contribution of this work is to unify existing sparse coding-based approaches toward tracking and recognition under the same framework, so that they can benefit from each other in a closed loop. On the one hand, tracking the same object through temporal frames allows us to achieve improved recognition performance through dynamic updating of the template/dictionary and combining of multiple recognition results; on the other hand, the recognition of individual objects facilitates the tracking of multiple objects (i.e., walking pedestrians), especially in the presence of occlusion within a crowded environment. We report experimental results on both the CASIA Pedestrian Database and our own collected FLIR video database to demonstrate the effectiveness of the proposed joint tracking-and-recognition approach.
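The sparse-coding step that underlies both tracking and recognition solves an l1-regularized least-squares problem: an observed target patch is represented as a sparse combination of dictionary templates, and the dominant coefficient indicates the matching template. A minimal iterative soft-thresholding (ISTA) solver, an assumed stand-in for whatever optimizer the paper actually uses, looks like this:

```python
import numpy as np

def ista(D, y, lam=0.05, iters=500):
    """Solve min_x 0.5*||y - D x||^2 + lam*||x||_1 by iterative
    soft-thresholding. D holds one template per column; the sparse
    code x says which templates explain the observation y."""
    L = np.linalg.norm(D, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(iters):
        g = x + D.T @ (y - D @ x) / L        # gradient step
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # shrinkage
    return x

# An observation built from dictionary atom 0 yields a sparse code
# concentrated on that atom; argmax identifies the matching template.
rng = np.random.default_rng(1)
D = rng.standard_normal((64, 8))
D /= np.linalg.norm(D, axis=0)               # unit-norm templates
y = D[:, 0]
x = ista(D, y)
assert np.argmax(np.abs(x)) == 0
```

In the joint framework, tracking keeps the dictionary D updated with recent appearances of each pedestrian, and recognition reads off the identity from where the sparse code concentrates.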

  16. Fast bi-directional prediction selection in H.264/MPEG-4 AVC temporal scalable video coding.

    Science.gov (United States)

    Lin, Hung-Chih; Hang, Hsueh-Ming; Peng, Wen-Hsiao

    2011-12-01

In this paper, we propose a fast algorithm that efficiently selects the temporal prediction type for the dyadic hierarchical-B prediction structure in H.264/MPEG-4 temporal scalable video coding (SVC). We make use of the strong correlations in prediction type inheritance to eliminate the superfluous computations for the bi-directional (BI) prediction in the finer partitions, 16×8/8×16/8×8, by referring to the best temporal prediction type of the 16×16 block. In addition, we carefully examine the relationship in motion bit-rate costs and distortions between the BI and the uni-directional temporal prediction types. As a result, we construct a set of adaptive thresholds to remove the unnecessary BI calculations. Moreover, for block partitions smaller than 8×8, either the forward prediction (FW) or the backward prediction (BW) is skipped based upon the information of their 8×8 partitions. Hence, the proposed schemes can efficiently reduce the extensive computational burden of calculating the BI prediction. As compared to the JSVM 9.11 software, our method saves encoding time from 48% to 67% for a large variety of test videos over a wide range of coding bit-rates, with only a minor coding performance loss. © 2011 IEEE
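The two pruning ideas, prediction-type inheritance from the 16×16 block and cost thresholding between the uni-directional types, can be sketched as a simple candidate filter. The fixed ratio `alpha` is an illustrative stand-in for the paper's adaptive thresholds:

```python
def candidate_predictions(best_16x16, cost_fw, cost_bw, alpha=1.2):
    """Decide which temporal prediction types to evaluate for a finer
    partition (16x8/8x16/8x8), given the best type of its 16x16 block.

    Illustrative version of the two pruning rules:
      * inheritance  -- BI is tried only if the 16x16 block chose BI;
      * thresholding -- BI is also skipped when one uni-directional cost
        is much larger than the other (adaptive in the paper, a fixed
        ratio `alpha` here).
    """
    cands = ["FW", "BW"]
    if best_16x16 == "BI" and min(cost_fw, cost_bw) * alpha >= max(cost_fw, cost_bw):
        cands.append("BI")
    return cands

print(candidate_predictions("BI", 100, 110))  # -> ['FW', 'BW', 'BI']
print(candidate_predictions("FW", 100, 110))  # -> ['FW', 'BW']
print(candidate_predictions("BI", 100, 300))  # -> ['FW', 'BW']
```

Since the BI search is roughly as expensive as the FW and BW searches combined, skipping it whenever the parent block or the cost gap says it is unlikely to win is where the reported 48-67% encoding-time saving comes from.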

  17. ASME nuclear codes and standards: Recent technical initiatives

    International Nuclear Information System (INIS)

    Feigel, R. E.

    1995-01-01

Although nuclear power construction is currently in a hiatus in the US, ASME and its volunteer committees remain committed to continual improvement of the technical requirements in its nuclear codes. This paper provides an overview of several significant recent revisions to ASME's nuclear codes. Additionally, other important initiatives currently being addressed by ASME committees are described. With the largest population of operating light water nuclear plants in the world and worldwide use of its nuclear codes, ASME continues to support technical advancements in its nuclear codes and standards. While revisions of various magnitude are an ongoing process, several recent revisions embody significant changes based on state-of-the-art design philosophy and substantial industry experience. In the design area, a significant revision has recently been approved which will significantly reduce conservatism in seismic piping design as well as provide simplified design rules. Major revisions have also been made to the requirements for nuclear material manufacturers and suppliers, which should result in a clearer understanding of this difficult administrative area of the code. In the area of Section XI inservice rules, substantial studies are underway to investigate the application of probabilistic, risk-based inspection in lieu of the current deterministic inspection philosophy. While much work is still required in this area, it is an important potential application of the emerging field of risk-based inspection

  18. A Standard-Compliant Virtual Meeting System with Active Video Object Tracking

    Science.gov (United States)

    Lin, Chia-Wen; Chang, Yao-Jen; Wang, Chih-Ming; Chen, Yung-Chang; Sun, Ming-Ting

    2002-12-01

    This paper presents an H.323 standard compliant virtual video conferencing system. The proposed system not only serves as a multipoint control unit (MCU) for multipoint connection but also provides a gateway function between the H.323 LAN (local-area network) and the H.324 WAN (wide-area network) users. The proposed virtual video conferencing system provides user-friendly object compositing and manipulation features including 2D video object scaling, repositioning, rotation, and dynamic bit-allocation in a 3D virtual environment. A reliable and accurate scheme based on background image mosaics is proposed for real-time extraction and tracking of foreground video objects from the video captured with an active camera. Chroma-key insertion is used to facilitate video object extraction and manipulation. We have implemented a prototype of the virtual conference system with an integrated graphical user interface to demonstrate the feasibility of the proposed methods.
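
    The chroma-key idea used here to ease object extraction can be sketched in a few lines: a pixel counts as foreground when it is far (in RGB space) from the key color. The key color and tolerance below are assumed values for illustration, not the paper's.

```python
# Minimal chroma-key sketch: classify each (R, G, B) pixel as foreground
# when its squared distance from the key color exceeds a tolerance.

def chroma_key_mask(pixels, key=(0, 255, 0), tol=60):
    """Return a boolean foreground mask for a list of (R, G, B) pixels."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return [dist2(p, key) > tol * tol for p in pixels]
```

    A real system would add smoothing and temporal consistency; this only shows the per-pixel decision.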

  19. A Standard-Compliant Virtual Meeting System with Active Video Object Tracking

    Directory of Open Access Journals (Sweden)

    Chang Yao-Jen

    2002-01-01

    Full Text Available This paper presents an H.323 standard compliant virtual video conferencing system. The proposed system not only serves as a multipoint control unit (MCU) for multipoint connection but also provides a gateway function between the H.323 LAN (local-area network) and the H.324 WAN (wide-area network) users. The proposed virtual video conferencing system provides user-friendly object compositing and manipulation features including 2D video object scaling, repositioning, rotation, and dynamic bit-allocation in a 3D virtual environment. A reliable and accurate scheme based on background image mosaics is proposed for real-time extraction and tracking of foreground video objects from the video captured with an active camera. Chroma-key insertion is used to facilitate video object extraction and manipulation. We have implemented a prototype of the virtual conference system with an integrated graphical user interface to demonstrate the feasibility of the proposed methods.

  20. Merits and difficulties in adopting codes, standards and nuclear regulations

    International Nuclear Information System (INIS)

    El-Saiedi, A.F.; Morsy, S.; Mariy, A.

    1978-01-01

    Developing countries planning to introduce nuclear power plants as a source of energy have to develop or adopt sound regulatory practices. These are necessary to help governmental authorities assess the safety of nuclear power plants and to perform the inspections needed to confirm the established safety limits. The first requirement is to form an independent regulatory body capable of setting up and enforcing proper safety regulations. The formation of this body is governed by several considerations related to local conditions in the developing countries, which may not always be favourable. It is quite impractical for countries with limited experience in the nuclear power field to develop their own codes, standards and regulations required for the nuclear regulatory body to perform its tasks. A practical way is to adopt the codes, standards and regulations of a well-developed country. This has merits as well as drawbacks; the latter are related to problems of personnel, software, equipment and facilities. The difficulties involved in forming a nuclear regulatory body, and the merits and difficulties of adopting foreign codes, standards and regulations required for such a body to perform its tasks, are discussed in this paper. The discussion is applicable to many developing countries, and particular emphasis is given to the conditions and practices in Egypt. (author)

  1. Video-assisted thoracoscopic surgery (VATS) lobectomy using a standardized anterior approach

    DEFF Research Database (Denmark)

    Hansen, Henrik Jessen; Petersen, René Horsleben; Christensen, Merete

    2011-01-01

    Lobectomy using video-assisted thoracoscopic surgery (VATS) still is a controversial operation despite its many observed benefits. The controversy may be due to difficulties performing the procedure. This study addresses a standardized anterior approach facilitating the operation....

  2. 75 FR 19944 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2010-04-16

    ... documents from ICC's Chicago District Office: International Code Council, 4051 W Flossmoor Road, Country... Energy Conservation Code. International Existing Building Code. International Fire Code. International...

  3. Entropy Coding in HEVC

    OpenAIRE

    Sze, Vivienne; Marpe, Detlev

    2014-01-01

    Context-Based Adaptive Binary Arithmetic Coding (CABAC) is a method of entropy coding first introduced in H.264/AVC and now used in the latest High Efficiency Video Coding (HEVC) standard. While it provides high coding efficiency, the data dependencies in H.264/AVC CABAC make it challenging to parallelize and thus limit its throughput. Accordingly, during the standardization of entropy coding for HEVC, both aspects of coding efficiency and throughput were considered. This chapter describes th...

  4. ASME Section XI trends in developing nuclear codes and standards

    International Nuclear Information System (INIS)

    Hedden, O.F.

    1995-01-01

    When the author began working on nuclear power many years ago, he knew that perfection was the only acceptable technical standard. Unfortunately, this became an obsession with perfection that has had unfavorable consequences in some of the non-technical areas of work in ASME nuclear power Codes and Standards. However, the economic problems of the nuclear power industry now demand a more pragmatic approach if the industry is to continue. Not only does each item considered for action need to be evaluated against criteria that may in some cases be less than perfection, but one needs to consider whether it contributes tangibly either to safety or to a reduction in technical or administrative burden. These should be the governing criteria. The introduction of risk-based inspection methodologies will certainly be an important element in doing this successfully. One needs to consider these criteria collectively, as one discusses each item at the committee level, and individually, as one votes on each item. In the past, the author has been concerned that the industry was not acting quickly enough in taking advantage of opportunities offered by the Code to increase safety or to reduce cost. While he still has some concern, he thinks communication channels have been greatly improved. Now he is becoming more concerned with both the collective and individual actions that delay beneficial changes. The second part of the author's talk has to do with the relevance of the code committees in the nuclear power industry regulatory process

  5. Improving a Power Line Communications Standard with LDPC Codes

    Directory of Open Access Journals (Sweden)

    Hsu Christine

    2007-01-01

    Full Text Available We investigate a power line communications (PLC) scheme that could be used to enhance the HomePlug 1.0 standard, specifically its ROBO mode which provides modest throughput for the worst-case PLC channel. The scheme is based on using a low-density parity-check (LDPC) code, in lieu of the concatenated Reed-Solomon and convolutional codes in ROBO mode. The PLC channel is modeled with multipath fading and Middleton's class A noise. Clipping is introduced to mitigate the effect of impulsive noise. A simple and effective method is devised to estimate the variance of the clipped noise for LDPC decoding. Simulation results show that the proposed scheme outperforms the HomePlug 1.0 ROBO mode and has lower computational complexity. The proposed scheme also dispenses with the repetition of information bits in ROBO mode to gain time diversity, resulting in a 4-fold increase in physical layer throughput.
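
    The clipped-noise variance the decoder needs can be approximated empirically. A minimal Monte-Carlo sketch, assuming a two-term Gaussian mixture as a simplified stand-in for Middleton's class A noise (all parameter values are illustrative, not the paper's):

```python
import random
import statistics

def clipped_noise_variance(n=20000, sigma_bg=1.0, sigma_imp=10.0,
                           p_imp=0.05, clip=3.0, seed=1):
    """Monte-Carlo estimate of the variance of impulsive noise after
    clipping. Background Gaussian plus rare strong impulses approximates
    class A noise; clipping tames the impulses before decoding."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        sigma = sigma_imp if rng.random() < p_imp else sigma_bg
        x = rng.gauss(0.0, sigma)
        samples.append(max(-clip, min(clip, x)))  # clip to [-clip, clip]
    return statistics.pvariance(samples)
```

    The resulting variance, much smaller than the unclipped mixture variance (about 5.95 with these numbers), would feed the Gaussian log-likelihood-ratio computation of the LDPC decoder.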

  6. Technological characteristics of digital video broadcasting: Handheld standard DVB-H

    Directory of Open Access Journals (Sweden)

    Andreja B. Samčović

    2011-07-01

    Full Text Available This paper gives an overview of the Digital Video Broadcasting - Handheld standard DVB-H, as a part of the DVB Project. This standard is based on the previous standard DVB-T, which was developed for the terrestrial digital television. The ways of DVB-H signal transmission are also described. Development of advanced technology enabled the digital video broadcasting over wireless portable terminals. This paper discusses the key technological features of the DVB-H standard, such as: time slicing, forward error correction, 4K mode and in-depth interleavers.

  7. Interactive Video Coding and Transmission over Heterogeneous Wired-to-Wireless IP Networks Using an Edge Proxy

    Directory of Open Access Journals (Sweden)

    Modestino James W

    2004-01-01

    Full Text Available Digital video delivered over wired-to-wireless networks is expected to suffer quality degradation from both packet loss and bit errors in the payload. In this paper, the quality degradation due to packet loss and bit errors in the payload are quantitatively evaluated and their effects are assessed. We propose the use of a concatenated forward error correction (FEC coding scheme employing Reed-Solomon (RS codes and rate-compatible punctured convolutional (RCPC codes to protect the video data from packet loss and bit errors, respectively. Furthermore, the performance of a joint source-channel coding (JSCC approach employing this concatenated FEC coding scheme for video transmission is studied. Finally, we describe an improved end-to-end architecture using an edge proxy in a mobile support station to implement differential error protection for the corresponding channel impairments expected on the two networks. Results indicate that with an appropriate JSCC approach and the use of an edge proxy, FEC-based error-control techniques together with passive error-recovery techniques can significantly improve the effective video throughput and lead to acceptable video delivery quality over time-varying heterogeneous wired-to-wireless IP networks.

  8. Digital video transcoding for transmission and storage

    CERN Document Server

    Sun, Huifang; Chen, Xuemin

    2004-01-01

    Professionals in the video and multimedia industries need a book that explains industry standards for video coding and how to convert the compressed information between standards. Digital Video Transcoding for Transmission and Storage answers this demand while also supplying the theories and principles of video compression and transcoding technologies. Emphasizing digital video transcoding techniques, this book summarizes its content via examples of practical methods for transcoder implementation. It relates almost all of its featured transcoding technologies to practical applications.This vol

  9. Temporal Scalability through Adaptive M-Band Filter Banks for Robust H.264/MPEG-4 AVC Video Coding

    Directory of Open Access Journals (Sweden)

    Pau G

    2006-01-01

    Full Text Available This paper presents different structures that use adaptive M-band hierarchical filter banks for temporal scalability. Open-loop and closed-loop configurations are introduced and illustrated using existing video codecs. In particular, it is shown that the H.264/MPEG-4 AVC codec allows us to introduce scalability by frame shuffling operations, thus keeping backward compatibility with the standard. The large set of shuffling patterns introduced here can be exploited to adapt the encoding process to the video content features, as well as to the user equipment and transmission channel characteristics. Furthermore, simulation results show that this scalability is obtained with no degradation in terms of subjective and objective quality in error-free environments, while in error-prone channels the scalable versions provide increased robustness.
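
    The frame-shuffling idea can be illustrated with the standard dyadic hierarchical-B pattern: display order is rearranged into a coding order in which the key picture comes first and each temporal level is coded before the level it predicts. The recursive midpoint rule below is the textbook dyadic pattern; the function name is ours.

```python
def dyadic_coding_order(gop_size):
    """Coding order for one dyadic hierarchical-B GOP. Display indices
    run 1..gop_size; index gop_size is the key picture, coded first,
    then each temporal level is produced by recursive bisection."""
    order = [gop_size]
    def mid(lo, hi):
        if hi - lo < 2:
            return
        m = (lo + hi) // 2
        order.append(m)       # B-frame predicted from lo and hi
        mid(lo, m)            # refine left half
        mid(m, hi)            # refine right half
    mid(0, gop_size)
    return order
```

    For a GOP of 8 this yields the familiar shuffled order 8, 4, 2, 1, 3, 6, 5, 7.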

  10. Using Digital Video Production to Meet the Common Core Standards

    Science.gov (United States)

    Nichols, Maura

    2012-01-01

    The implementation of the Common Core Standards has just begun and these standards will impact a generation that communicates with technology more than anything else. Texting, cell phones, Facebook, YouTube, Skype, etc. are the ways they speak with their friends and the world. The Common Core Standards recognize this. According to the Common Core…

  11. Electronic health record standards, coding systems, frameworks, and infrastructures

    CERN Document Server

    Sinha, Pradeep K; Bendale, Prashant; Mantri, Manisha; Dande, Atreya

    2013-01-01

    Discover How Electronic Health Records Are Built to Drive the Next Generation of Healthcare Delivery. The increased role of IT in the healthcare sector has led to the coining of a new phrase "health informatics," which deals with the use of IT for better healthcare services. Health informatics applications often involve maintaining the health records of individuals, in digital form, which is referred to as an Electronic Health Record (EHR). Building and implementing an EHR infrastructure requires an understanding of healthcare standards, coding systems, and frameworks. This book provides an

  12. 3D scene reconstruction based on multi-view distributed video coding in the Zernike domain for mobile applications

    Science.gov (United States)

    Palma, V.; Carli, M.; Neri, A.

    2011-02-01

    In this paper a Multi-view Distributed Video Coding scheme for mobile applications is presented. Specifically, a new fusion technique between temporal and spatial side information in the Zernike moments domain is proposed. Distributed video coding introduces a flexible architecture that enables the design of very low-complexity video encoders compared to their traditional counterparts. The main goal of our work is to generate at the decoder the side information that optimally blends temporal and inter-view data. Multi-view distributed coding performance strongly depends on the quality of the side information built at the decoder. To improve this quality, a spatial view compensation/prediction in the Zernike moments domain is applied. Spatial and temporal motion activity are fused together to obtain the overall side information. The proposed method has been evaluated by rate-distortion performance for different inter-view and temporal estimation quality conditions.

  13. Integer-linear-programing optimization in scalable video multicast with adaptive modulation and coding in wireless networks.

    Science.gov (United States)

    Lee, Dongyul; Lee, Chaewoo

    2014-01-01

    The advancement of wideband wireless networks supports real-time services such as IPTV and live video streaming. However, because of the sharing nature of the wireless medium, efficient resource allocation has been studied to achieve a high level of acceptability and proliferation of wireless multimedia. Scalable video coding (SVC) with adaptive modulation and coding (AMC) provides an excellent solution for wireless video streaming. By assigning different modulation and coding schemes (MCSs) to video layers, SVC can provide good video quality to users in good channel conditions and also basic video quality to users in bad channel conditions. For optimal resource allocation, a key issue in applying SVC to the wireless multicast service is how to assign MCSs and time resources to each SVC layer under heterogeneous channel conditions. We formulate this problem with integer linear programming (ILP) and provide numerical results to show the performance in an 802.16m environment. The results show that our methodology enhances the overall system throughput compared to an existing algorithm.

  14. Integer-Linear-Programing Optimization in Scalable Video Multicast with Adaptive Modulation and Coding in Wireless Networks

    Directory of Open Access Journals (Sweden)

    Dongyul Lee

    2014-01-01

    Full Text Available The advancement of wideband wireless networks supports real-time services such as IPTV and live video streaming. However, because of the sharing nature of the wireless medium, efficient resource allocation has been studied to achieve a high level of acceptability and proliferation of wireless multimedia. Scalable video coding (SVC) with adaptive modulation and coding (AMC) provides an excellent solution for wireless video streaming. By assigning different modulation and coding schemes (MCSs) to video layers, SVC can provide good video quality to users in good channel conditions and also basic video quality to users in bad channel conditions. For optimal resource allocation, a key issue in applying SVC to the wireless multicast service is how to assign MCSs and time resources to each SVC layer under heterogeneous channel conditions. We formulate this problem with integer linear programming (ILP) and provide numerical results to show the performance in an 802.16m environment. The results show that our methodology enhances the overall system throughput compared to an existing algorithm.
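
    The layer-to-MCS assignment that the abstract formulates as an ILP can be illustrated on a toy instance by exhaustive search (feasible only for tiny instances; a real solver handles the full ILP). The rates, budget, and utility function below are assumptions made for the example.

```python
from itertools import product

def best_mcs_assignment(layer_bits, mcs_rates, user_max_mcs, airtime_budget):
    """Toy version of the SVC-multicast assignment: choose one MCS per
    layer to maximize the number of (user, layer) deliveries under an
    airtime budget. A user decodes layer l only if it can receive the MCS
    of every layer up to l (SVC layers are hierarchically dependent)."""
    best, best_util = None, -1
    for assign in product(range(len(mcs_rates)), repeat=len(layer_bits)):
        airtime = sum(bits / mcs_rates[m]
                      for bits, m in zip(layer_bits, assign))
        if airtime > airtime_budget:
            continue  # infeasible: exceeds the time resource
        util = 0
        for umax in user_max_mcs:
            for m in assign:
                if m > umax:
                    break  # higher layers are useless without this one
                util += 1
        if util > best_util:
            best, best_util = assign, util
    return best, best_util
```

    With two layers of 100 bits, MCS rates (10, 20, 40), users whose best decodable MCS indices are (0, 2, 2), and a budget of 15 time units, the optimum sends the base layer robustly (MCS 0) and the enhancement layer faster (MCS 1).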

  15. Regulatory Endorsement Activities for ASME Nuclear Codes and Standards

    International Nuclear Information System (INIS)

    West, Raymond A.

    2006-01-01

    The ASME Board on Nuclear Codes and Standards (BNCS) has formed a Task Group on Regulatory Endorsement (TG-RE) that is currently in discussions with the United States Nuclear Regulatory Commission (NRC) to look at suggestions and recommendations that can be used to help with the endorsement of new and revised ASME Nuclear Codes and Standards (NC and S). With the coming of new reactors in the USA in the very near future, we need to look at both the regulations and all the ASME NC and S to determine where we need to make changes to support these new plants. At the same time, it is important that we maintain our operating plants while addressing the ageing-management needs of our existing reactors. This is going to take new thinking, time, resources, and money. For all this to take place, the regulations and requirements that we use must be clear, concise, and necessary for safety, and to that end both the NRC and ASME are working together to make this happen. Because of the influence that the USA has in the world in dealing with these issues, this paper is written to inform the international nuclear engineering community about the issues and what actions are being addressed under this effort. (author)

  16. A model of R-D performance evaluation for Rate-Distortion-Complexity evaluation of H.264 video coding

    DEFF Research Database (Denmark)

    Wu, Mo; Forchhammer, Søren

    2007-01-01

    This paper considers a method for evaluation of the Rate-Distortion-Complexity (R-D-C) performance of video coding. A statistical model of the transformed coefficients is used to estimate the Rate-Distortion (R-D) performance. A model framework for the rate, distortion, and slope of the R-D curve for inter and intra frames is presented. Assumptions are given for analyzing an R-D model for fast R-D-C evaluation. The theoretical expressions are combined with H.264 video coding and confirmed by experimental results. The complexity framework is applied to integer motion estimation.
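
    For intuition, the classic high-rate model for a Gaussian source gives closed forms for both the distortion and the R-D slope; the paper's coefficient model is more refined, but the overall shape is the same. This sketch uses the textbook model, not the paper's exact expressions.

```python
import math

def distortion(rate, variance):
    """Classic high-rate model D(R) = sigma^2 * 2^(-2R),
    with rate in bits per sample."""
    return variance * 2.0 ** (-2.0 * rate)

def rd_slope(rate, variance):
    """Analytic slope dD/dR = -2 ln(2) * D(R): steep at low rates and
    flattening as rate grows, which is what drives R-D optimization."""
    return -2.0 * math.log(2.0) * distortion(rate, variance)
```

    The slope is what Lagrangian encoders trade against rate when choosing modes and quantizers.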

  17. Wide-Range Motion Estimation Architecture with Dual Search Windows for High Resolution Video Coding

    Science.gov (United States)

    Dung, Lan-Rong; Lin, Meng-Chun

    This paper presents a memory-efficient motion estimation (ME) technique for high-resolution video compression. The main objective is to reduce external memory access, especially for limited local memory resources. The reduction of memory access can successfully save the notorious power consumption. The key to reducing memory accesses is a center-biased algorithm that performs the motion vector (MV) search with the minimum search data. While considering data reusability, the proposed dual-search-windowing (DSW) approaches use the secondary window as an option per searching necessity. By doing so, the loading of search windows can be alleviated, reducing the required external memory bandwidth. The proposed techniques can save up to 81% of external memory bandwidth and require only 135 MBytes/sec, while the quality degradation is less than 0.2 dB for 720p HDTV clips coded at 8 Mbits/sec.
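
    A center-biased search can be sketched with a small-diamond pattern: start at the predicted center and greedily follow the cheapest of the four neighbours until no neighbour improves, so only a few candidate positions (and hence little search-window data) are touched. This is a generic sketch of the center-biased idea, not the paper's exact DSW scheme.

```python
def small_diamond_search(cost, center=(0, 0), max_range=16):
    """Greedy small-diamond motion search. `cost(x, y)` returns the
    matching cost (e.g., SAD) at candidate motion vector (x, y)."""
    cx, cy = center
    best = cost(cx, cy)
    moved = True
    while moved:
        moved = False
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = cx + dx, cy + dy
            if max(abs(nx), abs(ny)) > max_range:
                continue  # stay inside the search window
            c = cost(nx, ny)
            if c < best:
                best, cx, cy, moved = c, nx, ny, True
    return (cx, cy), best
```

    On a well-behaved cost surface this visits far fewer points than a full search over the whole window, which is exactly what keeps the memory traffic low.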

  18. Standardization of computer programs - basis of the Czechoslovak library of nuclear codes

    International Nuclear Information System (INIS)

    Gregor, M.

    1987-01-01

    A standardized form of computer code documentation has been established in the CSSR in the field of reactor safety. Structure and content of the documentation are described and codes already subject to this process are mentioned. The formation of a Czechoslovak nuclear code library and facilitated discussion of safety reports containing results of standardized codes are aimed at

  19. Bit-depth scalable video coding with new inter-layer prediction

    Directory of Open Access Journals (Sweden)

    Chiang Jui-Chiu

    2011-01-01

    Full Text Available The rapid advances in the capture and display of high-dynamic-range (HDR) image/video content make it imperative to develop efficient compression techniques to deal with the huge amounts of HDR data. Since HDR devices are not yet popular for the moment, compatibility problems should be considered when rendering HDR content on conventional display devices. To this end, in this study, we propose three H.264/AVC-based bit-depth scalable video-coding schemes, called the LH scheme (low bit-depth to high bit-depth), the HL scheme (high bit-depth to low bit-depth), and the combined LH-HL scheme, respectively. The schemes efficiently exploit the high correlation between the high and the low bit-depth layers at the macroblock (MB) level. Experimental results demonstrate that the HL scheme outperforms the other two schemes in some scenarios. Moreover, it achieves up to 7 dB improvement over the simulcast approach when the high and low bit-depth representations are 12 bits and 8 bits, respectively.
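
    The basic inter-layer idea can be sketched as predicting the high bit-depth samples from the decoded low bit-depth layer and coding only the residual. The plain left-shift predictor below is a deliberate simplification of the MB-level tools the paper proposes.

```python
def predict_high_from_low(low_samples, low_bits=8, high_bits=12):
    """Inter-layer prediction sketch: map base-layer samples up to the
    enhancement bit depth by a left shift (real schemes use richer,
    MB-adaptive mappings)."""
    shift = high_bits - low_bits
    return [s << shift for s in low_samples]

def encode_residual(high_samples, prediction):
    """Only the (small) prediction error is coded in the enhancement layer."""
    return [h - p for h, p in zip(high_samples, prediction)]

def decode(prediction, residual):
    """Decoder reconstructs the high bit-depth layer losslessly here."""
    return [p + r for p, r in zip(prediction, residual)]
```

    The better the inter-layer predictor, the smaller the residual energy and hence the enhancement-layer bit-rate.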

  20. Impact of packet losses in scalable 3D holoscopic video coding

    Science.gov (United States)

    Conti, Caroline; Nunes, Paulo; Ducla Soares, Luís.

    2014-05-01

    Holoscopic imaging became a prospective glassless 3D technology to provide more natural 3D viewing experiences to the end user. Additionally, holoscopic systems also allow new post-production degrees of freedom, such as controlling the plane of focus or the viewing angle presented to the user. However, to successfully introduce this technology into the consumer market, a display scalable coding approach is essential to achieve backward compatibility with legacy 2D and 3D displays. Moreover, to effectively transmit 3D holoscopic content over error-prone networks, e.g., wireless networks or the Internet, error resilience techniques are required to mitigate the impact of data impairments in the user quality perception. Therefore, it is essential to deeply understand the impact of packet losses in terms of decoding video quality for the specific case of 3D holoscopic content, notably when a scalable approach is used. In this context, this paper studies the impact of packet losses when using a three-layer display scalable 3D holoscopic video coding architecture previously proposed, where each layer represents a different level of display scalability (i.e., L0 - 2D, L1 - stereo or multiview, and L2 - full 3D holoscopic). For this, a simple error concealment algorithm is used, which makes use of inter-layer redundancy between multiview and 3D holoscopic content and the inherent correlation of the 3D holoscopic content to estimate lost data. Furthermore, a study of the influence of 2D views generation parameters used in lower layers on the performance of the used error concealment algorithm is also presented.

  1. Fifty years of progress in speech coding standards

    Science.gov (United States)

    Cox, Richard

    2004-10-01

    Over the past 50 years, speech coding has taken root worldwide. Early applications were for the military and transmission for telephone networks. The military gave equal priority to intelligibility and low bit rate. The telephone network gave priority to high quality and low delay. These illustrate three of the four areas in which requirements must be set for any speech coder application: bit rate, quality, delay, and complexity. While the military could afford relatively expensive terminal equipment for secure communications, the telephone network needed low cost for massive deployment in switches and transmission equipment worldwide. Today speech coders are at the heart of the wireless phones and telephone answering systems we use every day. In addition to the technology and technical invention that has occurred, standards make it possible for all these different systems to interoperate. The primary areas of standardization are the public switched telephone network, wireless telephony, and secure telephony for government and military applications. With the advent of IP telephony there are additional standardization efforts and challenges. In this talk the progress in all areas is reviewed as well as a reflection on Jim Flanagan's impact on this field during the past half century.

  2. Video coding and decoding devices and methods preserving PPG relevant information

    NARCIS (Netherlands)

    2015-01-01

    The present invention relates to a video encoding device (10, 10', 10") and method for encoding video data and to a corresponding video decoding device (60, 60') and method. To preserve PPG relevant information after encoding without requiring a large amount of additional data for the video encoder

  3. Video coding and decoding devices and methods preserving ppg relevant information

    NARCIS (Netherlands)

    2013-01-01

    The present invention relates to a video encoding device (10, 10', 10'') and method for encoding video data and to a corresponding video decoding device (60, 60') and method. To preserve PPG relevant information after encoding without requiring a large amount of additional data for the video encoder

  4. Multidimensional electron-photon transport with standard discrete ordinates codes

    International Nuclear Information System (INIS)

    Drumm, C.R.

    1997-01-01

    A method is described for generating electron cross sections that are compatible with standard discrete ordinates codes without modification. There are many advantages of using an established discrete ordinates solver, e.g. immediately available adjoint capability. Coupled electron-photon transport capability is needed for many applications, including the modeling of the response of electronics components to space and man-made radiation environments. The cross sections have been successfully used in the DORT, TWODANT and TORT discrete ordinates codes. The cross sections are shown to provide accurate and efficient solutions to certain multidimensional electron-photon transport problems. The key to the method is a simultaneous solution of the continuous-slowing-down (CSD) portion and elastic-scattering portion of the scattering source by the Goudsmit-Saunderson theory. The resulting multigroup-Legendre cross sections are much smaller than the true scattering cross sections that they represent. Under certain conditions, the cross sections are guaranteed positive and converge with a low-order Legendre expansion

  5. Multidimensional electron-photon transport with standard discrete ordinates codes

    International Nuclear Information System (INIS)

    Drumm, C.R.

    1997-01-01

    A method is described for generating electron cross sections that are compatible with standard discrete ordinates codes without modification. There are many advantages to using an established discrete ordinates solver, e.g., immediately available adjoint capability. Coupled electron-photon transport capability is needed for many applications, including the modeling of the response of electronics components to space and synthetic radiation environments. The cross sections have been successfully used in the DORT, TWODANT, and TORT discrete ordinates codes. The cross sections are shown to provide accurate and efficient solutions to certain multidimensional electron-photon transport problems. The key to the method is a simultaneous solution of the continuous-slowing-down and elastic-scattering portions of the scattering source by the Goudsmit-Saunderson theory. The resulting multigroup-Legendre cross sections are much smaller than the true scattering cross sections that they represent. Under certain conditions, the cross sections are guaranteed positive and converge with a low-order Legendre expansion

  6. Analyses to support development of risk-informed separation distances for hydrogen codes and standards.

    Energy Technology Data Exchange (ETDEWEB)

    LaChance, Jeffrey L.; Houf, William G. (Sandia National Laboratories, Livermore, CA); Fluer, Larry (Fluer, Inc., Paso Robles, CA); Middleton, Bobby

    2009-03-01

    The development of a set of safety codes and standards for hydrogen facilities is necessary to ensure they are designed and operated safely. To help ensure that a hydrogen facility meets an acceptable level of risk, code and standard development organizations are utilizing risk-informed concepts in developing hydrogen codes and standards.

  7. Are industry codes and standards a valid cost containment approach

    International Nuclear Information System (INIS)

    Rowley, C.W.; Simpson, G.T.; Young, R.K.

    1990-01-01

    The nuclear industry has historically concentrated on safety design features for many years, but recently has been shifting to the reliability of operating systems and components. The Navy has already gone through this transition and has found that Reliability Centered Maintenance (RCM) is an invaluable tool for improving the reliability of components, systems, ships, and classes of ships. There is a close correlation between Navy ships and equipment and commercial nuclear power plants and equipment. The Navy has a central engineering and configuration management organization (Naval Sea Systems Command) for over 500 ships, whereas the over 100 commercial nuclear power plants and 52 nuclear utilities represent a fragmented owner/management structure. This paper suggests that the results of the application of RCM in the Navy can be duplicated to a large degree in the commercial nuclear power industry through the development and utilization of nuclear codes and standards

  8. Efficient MPEG-2 to H.264/AVC Transcoding of Intra-Coded Video

    Directory of Open Access Journals (Sweden)

    Vetro Anthony

    2007-01-01

    Full Text Available This paper presents an efficient transform-domain architecture and corresponding mode decision algorithms for transcoding intra-coded video from MPEG-2 to H.264/AVC. Low complexity is achieved in several ways. First, our architecture employs direct conversion of the transform coefficients, which eliminates the need for the inverse discrete cosine transform (DCT and forward H.264/AVC transform. Then, within this transform-domain architecture, we perform macroblock-based mode decisions based on H.264/AVC transform coefficients, which is possible using a novel method of calculating distortion in the transform domain. The proposed method for distortion calculation could be used to make rate-distortion optimized mode decisions with lower complexity. Compared to the pixel-domain architecture with rate-distortion optimized mode decision, simulation results show that there is a negligible loss in quality incurred by the direct conversion of transform coefficients and the proposed transform-domain mode decision algorithms, while complexity is significantly reduced. To further reduce the complexity, we also propose two fast mode decision algorithms. The first algorithm ranks modes based on a simple cost function in the transform domain, then computes the rate-distortion optimal mode from a reduced set of ranked modes. The second algorithm exploits temporal correlations in the mode decision between temporally adjacent frames. Simulation results show that these algorithms provide additional computational savings over the proposed transform-domain architecture while maintaining virtually the same coding efficiency.
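
    The first fast algorithm described above can be sketched generically: rank the candidate modes by a cheap cost function, then run the expensive rate-distortion evaluation only on the top-k candidates. The mode names and costs below are made up for the example.

```python
def fast_mode_decision(modes, simple_cost, rd_cost, k=2):
    """Rank modes by a cheap (e.g., transform-domain) cost, then pick the
    rate-distortion best among only the k cheapest candidates."""
    ranked = sorted(modes, key=simple_cost)[:k]
    return min(ranked, key=rd_cost)
```

    The full RD evaluation is thus run k times instead of once per candidate mode, at the risk of missing an optimum the cheap cost ranked poorly.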

  9. Partial Encryption of Entropy-Coded Video Compression Using Coupled Chaotic Maps

    Directory of Open Access Journals (Sweden)

    Fadi Almasalha

    2014-10-01

    Full Text Available Due to pervasive communication infrastructures, a plethora of enabling technologies is being developed over mobile and wired networks. Among these, video streaming services over IP are the most challenging in terms of quality, real-time requirements and security. In this paper, we propose a novel scheme to efficiently secure variable length coded (VLC) multimedia bit streams, such as H.264. It is based on codeword error diffusion and variable-size segment shuffling. The codeword diffusion and the shuffling mechanisms are based on random operations from a secure and computationally efficient chaos-based pseudo-random number generator. The proposed scheme is transparent to the end users and can be deployed at any node in the network. It provides different levels of security, with the encrypted data volume fluctuating between 5.5–17%. It works on the compressed bit stream without requiring any decoding. It provides excellent encryption speeds on different platforms, including mobile devices. It is 200% faster and 150% more power efficient when compared with AES software-based full encryption schemes. Regarding security, the scheme is robust to well-known attacks in the literature, such as brute force and known/chosen plain text attacks.
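The segment-shuffling half of such a scheme can be sketched with a single logistic map standing in for the paper's coupled chaotic maps; the map, segment size, and key below are illustrative assumptions, not the published design:

```python
def logistic(x0, r=3.99):
    """Chaotic keystream from the logistic map; the key is the initial state x0.
    (Illustrative stand-in for the paper's coupled chaotic maps.)"""
    x = x0
    while True:
        x = r * x * (1.0 - x)
        yield x

def keyed_permutation(n, key):
    """Fisher-Yates permutation driven by the chaotic stream."""
    stream = logistic(key)
    perm = list(range(n))
    for i in range(n - 1, 0, -1):
        j = int(next(stream) * (i + 1))
        perm[i], perm[j] = perm[j], perm[i]
    return perm

def shuffle_segments(data, seg_len, key):
    """Shuffle fixed-size segments of a compressed bit stream without decoding it."""
    segs = [data[i:i + seg_len] for i in range(0, len(data), seg_len)]
    perm = keyed_permutation(len(segs), key)
    return b"".join(segs[p] for p in perm)

def unshuffle_segments(blob, seg_len, key):
    """Invert the shuffle: the same key regenerates the same permutation."""
    segs = [blob[i:i + seg_len] for i in range(0, len(blob), seg_len)]
    perm = keyed_permutation(len(segs), key)
    out = [b""] * len(segs)
    for dst, src in enumerate(perm):
        out[src] = segs[dst]
    return b"".join(out)
```

Because only segment positions change, the ciphertext stays a valid-length bit stream and decryption is a single permutation lookup per segment.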

  10. Creating Standardized Video Recordings of Multimodal Interactions across Cultures

    DEFF Research Database (Denmark)

    Rehm, Matthias; André, Elisabeth; Bee, Nikolaus

    2009-01-01

    … the literature is often too anecdotal to serve as the basis for modeling a system’s behavior, making it necessary to collect multimodal corpora in a standardized fashion in different cultures. In this chapter, the challenges of such an endeavor are introduced and solutions are presented by examples from a German-Japanese project that aims at modeling culture-specific behaviors for Embodied Conversational Agents.

  11. Using standardized patients versus video cases for representing clinical problems in problem-based learning.

    Science.gov (United States)

    Yoon, Bo Young; Choi, Ikseon; Choi, Seokjin; Kim, Tae-Hee; Roh, Hyerin; Rhee, Byoung Doo; Lee, Jong-Tae

    2016-06-01

    The quality of problem representation is critical for developing students' problem-solving abilities in problem-based learning (PBL). This study investigates preclinical students' experience with standardized patients (SPs) as a problem representation method compared to using video cases in PBL. A cohort of 99 second-year preclinical students from Inje University College of Medicine (IUCM) responded to a Likert scale questionnaire on their learning experiences after they had experienced both video cases and SPs in PBL. The questionnaire consisted of 14 items with eight subcategories: problem identification, hypothesis generation, motivation, collaborative learning, reflective thinking, authenticity, patient-doctor communication, and attitude toward patients. The results reveal that using SPs gave the preclinical students significantly more positive experiences in boosting patient-doctor communication skills; the perceived authenticity of their clinical situations; development of proper attitudes toward patients; and motivation, reflective thinking, and collaborative learning when compared to using video cases. The SPs also provided more challenges than the video cases during problem identification and hypothesis generation. SPs are more effective than video cases in delivering higher levels of authenticity in clinical problems for PBL. The interaction with SPs engages preclinical students in deeper thinking and discussion; growth of communication skills; development of proper attitudes toward patients; and motivation. Considering the higher cost of SPs compared with video cases, SPs could be used most advantageously during the preclinical period in the IUCM curriculum.

  12. A parallel 3-D discrete wavelet transform architecture using pipelined lifting scheme approach for video coding

    Science.gov (United States)

    Hegde, Ganapathi; Vaya, Pukhraj

    2013-10-01

    This article presents a parallel architecture for the 3-D discrete wavelet transform (3-DDWT). The proposed design is based on the 1-D pipelined lifting scheme. The architecture is fully scalable beyond the present coherent Daubechies filter bank (9, 7). This 3-DDWT architecture has advantages such as no group-of-pictures restriction and reduced memory referencing. It offers low power consumption, low latency and high throughput. The computing technique is based on the concept that the lifting scheme minimises the storage requirement. The application-specific integrated circuit implementation of the proposed architecture is done by synthesising it using a 65 nm Taiwan Semiconductor Manufacturing Company standard cell library. It offers a speed of 486 MHz with a power consumption of 2.56 mW. This architecture is suitable for real-time video compression even with large frame dimensions.
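The storage advantage of lifting comes from computing the wavelet in place as a short sequence of predict/update steps, each of which has an exact inverse. A 1-D sketch using the simpler LeGall 5/3 filter (with periodic extension) rather than the 9/7 bank the architecture implements:

```python
import numpy as np

def lift_53(x):
    """One level of the LeGall 5/3 lifting DWT with periodic extension
    (a simpler stand-in for the 9/7 filter bank used in the paper)."""
    s, d = x[0::2].astype(float), x[1::2].astype(float)
    d = d - 0.5 * (s + np.roll(s, -1))    # predict: detail from even neighbours
    s = s + 0.25 * (d + np.roll(d, 1))    # update: preserve the running average
    return s, d                            # approximation, detail subbands

def unlift_53(s, d):
    """Exact inverse: undo the lifting steps in reverse order."""
    s = s - 0.25 * (d + np.roll(d, 1))
    d = d + 0.5 * (s + np.roll(s, -1))
    x = np.empty(2 * len(s))
    x[0::2], x[1::2] = s, d
    return x
```

Each step overwrites one polyphase component using the other, which is why a hardware pipeline needs only a few line buffers rather than a full frame store.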

  13. Deep linear autoencoder and patch clustering-based unified one-dimensional coding of image and video

    Science.gov (United States)

    Li, Honggui

    2017-09-01

    This paper proposes a unified one-dimensional (1-D) coding framework for image and video, which depends on a deep neural network and image patch clustering. First, an improved K-means clustering algorithm for image patches is employed to obtain compact inputs for the deep artificial neural network. Second, for the purpose of best reconstructing the original image patches, the deep linear autoencoder (DLA), a linear version of the classical deep nonlinear autoencoder, is introduced to achieve the 1-D representation of image blocks. Under the circumstances of 1-D representation, DLA is capable of attaining zero reconstruction error, which is impossible for the classical nonlinear dimensionality reduction methods. Third, a unified 1-D coding infrastructure for image, intraframe, interframe, multiview video, three-dimensional (3-D) video, and multiview 3-D video is built by incorporating the different categories of videos into the inputs of the patch clustering algorithm. Finally, it is shown in the results of simulation experiments that the proposed methods can simultaneously gain a higher compression ratio and peak signal-to-noise ratio than the state-of-the-art methods in the situation of low-bitrate transmission.
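The zero-error property claimed for the deep linear autoencoder follows because an optimal linear autoencoder spans the principal subspace of its inputs (equivalently, PCA); at full code dimension the reconstruction is exact. A small numerical illustration with hypothetical flattened patches, not the paper's trained network:

```python
import numpy as np

# Hypothetical flattened 4x4 patches; in the paper these come from K-means
# clustering of real image patches.
rng = np.random.default_rng(1)
patches = rng.random((50, 16))
mu = patches.mean(axis=0)

# The optimal linear autoencoder spans the principal subspace: SVD gives it.
U, S, Vt = np.linalg.svd(patches - mu, full_matrices=False)
W = Vt                              # full-rank code: 16-D patch -> 16-D code
codes = (patches - mu) @ W.T        # linear encoder
recon = codes @ W + mu              # linear decoder

# At full code dimension the reconstruction error is exactly zero.
assert np.allclose(recon, patches)
```

Truncating `W` to fewer rows gives the usual lossy PCA code, which is where the rate-distortion trade-off of the framework enters.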

  14. Video Modeling of SBIRT for Alcohol Use Disorders Increases Student Empathy in Standardized Patient Encounters.

    Science.gov (United States)

    Crisafio, Anthony; Anderson, Victoria; Frank, Julia

    2018-04-01

    The purpose of this study was to assess the usefulness of adding video models of brief alcohol assessment and counseling to a standardized patient (SP) curriculum that covers and tests acquisition of this skill. The authors conducted a single-center, retrospective cohort study of third- and fourth-year medical students between 2013 and 2015. All students completed a standardized patient (SP) encounter illustrating the diagnosis of alcohol use disorder, followed by an SP exam on the same topic. Beginning in August 2014, the authors supplemented the existing formative SP exercise on problem drinking with one of two 5-min videos demonstrating screening, brief intervention, and referral for treatment (SBIRT). P values and Z tests were performed to evaluate differences between students who did and did not see the video in knowledge and skills related to alcohol use disorders. One hundred ninety-four students were included in this analysis. Compared to controls, subjects did not differ in their ability to uncover and accurately characterize an alcohol problem during a standardized encounter (mean exam score 41.29 vs 40.93, subject vs control, p = 0.539). However, the SPs' rating of students' expressions of empathy was significantly higher for the group who saw the video (81.63 vs 69.79%). The authors had hypothesized that the videos would improve students' recognition and knowledge of alcohol-related conditions. However, feedback from the SPs produced the serendipitous finding that the communication skills demonstrated in the videos had a sustained effect in enhancing students' professional behavior.

  15. Stereoscopic Visual Attention-Based Regional Bit Allocation Optimization for Multiview Video Coding

    Directory of Open Access Journals (Sweden)

    Dai Qionghai

    2010-01-01

    Full Text Available We propose a Stereoscopic Visual Attention (SVA) based regional bit allocation optimization for Multiview Video Coding (MVC) by exploiting visual redundancies in human perception. We propose a novel SVA model, in which multiple perceptual stimuli including depth, motion, intensity, color, and orientation contrast are utilized to simulate the visual attention mechanisms of the human visual system with stereoscopic perception. Then, a semantic region-of-interest (ROI) is extracted based on the saliency maps of SVA. Both objective and subjective evaluations of the extracted ROIs indicated that the proposed SVA-based ROI extraction scheme outperforms schemes using only spatial or/and temporal visual attention clues. Finally, using the extracted SVA-based ROIs, a regional bit allocation optimization scheme is presented that allocates more bits to SVA-based ROIs for high image quality and fewer bits to background regions for compression efficiency. Experimental results on MVC show that the proposed regional bit allocation algorithm achieves a bit-rate saving while maintaining the subjective image quality, and that the image quality of the ROIs is improved at the cost of imperceptible quality degradation of the background image.

  16. Using game theory for perceptual tuned rate control algorithm in video coding

    Science.gov (United States)

    Luo, Jiancong; Ahmad, Ishfaq

    2005-03-01

    This paper proposes a game theoretical rate control technique for video compression. Using a cooperative gaming approach, which has been utilized in several branches of natural and social sciences because of its enormous potential for solving constrained optimization problems, we propose a dual-level scheme to optimize the perceptual quality while guaranteeing "fairness" in bit allocation among macroblocks. At the frame level, the algorithm allocates target bits to frames based on their coding complexity. At the macroblock level, the algorithm distributes bits to macroblocks by defining a bargaining game. Macroblocks play cooperatively to compete for shares of resources (bits) to optimize their quantization scales while considering the Human Visual System's perceptual property. Since the whole frame is an entity perceived by viewers, macroblocks compete cooperatively under a global objective of achieving the best quality with the given bit constraint. The major advantage of the proposed approach is that the cooperative game leads to an optimal and fair bit allocation strategy based on the Nash Bargaining Solution. Another advantage is that it allows multi-objective optimization with multiple decision makers (macroblocks). The simulation results testify the algorithm's ability to achieve accurate bit rate with good perceptual quality, and to maintain a stable buffer level.
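With a zero disagreement point and a utility proportional to the bits received, the Nash Bargaining Solution of such a game has a closed form: maximizing the weighted product of utilities (equivalently, the sum of weighted logs) under a total bit budget gives a proportional split. A sketch with hypothetical macroblock complexity weights, not the paper's exact utility functions:

```python
import numpy as np

def nash_bargain_bits(complexities, budget):
    """Nash Bargaining Solution with zero disagreement point and utility
    u_i(b_i) = b_i: maximize sum_i w_i * log(b_i) subject to sum_i b_i = budget.
    Setting the Lagrangian condition w_i / b_i = lam for every macroblock and
    enforcing the budget gives b_i = budget * w_i / sum_j w_j.
    The complexity weights are hypothetical bargaining powers."""
    w = np.asarray(complexities, dtype=float)
    return budget * w / w.sum()

# Proportional split across four macroblocks: [100, 200, 300, 400]
alloc = nash_bargain_bits([1, 2, 3, 4], 1000)
```

The proportional form is what makes the allocation "fair" in the bargaining sense: no macroblock can gain without another losing more, relative to its weight.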

  17. Codes and standards: a European point of view

    International Nuclear Information System (INIS)

    Roche, R.L.; Corsi, F.

    1987-01-01

    The first part of this paper is related to the European situation in which Construction Codes for FBR components are developed. Attention is given to the different agreements between European countries. After a description of the present state of Code development, indications are given on future work in this field. Several appendices are devoted to the state of Codes in different European countries and to the actions of the European Commission.

  18. Content Adaptive Lagrange Multiplier Selection for Rate-Distortion Optimization in 3-D Wavelet-Based Scalable Video Coding

    Directory of Open Access Journals (Sweden)

    Ying Chen

    2018-03-01

    Full Text Available Rate-distortion optimization (RDO) plays an essential role in substantially enhancing coding efficiency. Currently, rate-distortion optimized mode decision is widely used in scalable video coding (SVC). Among all the possible coding modes, it aims to select the one which has the best trade-off between bitrate and compression distortion. Specifically, this trade-off is tuned through the choice of the Lagrange multiplier. Despite the prevalence of the conventional method for Lagrange multiplier selection in hybrid video coding, the underlying formulation is not applicable to 3-D wavelet-based SVC, where the explicit values of the quantization step are not available, and it takes no account of the content features of the input signal. In this paper, an efficient content adaptive Lagrange multiplier selection algorithm is proposed in the context of RDO for 3-D wavelet-based SVC targeting quality scalability. Our contributions are two-fold. First, we introduce a novel weighting method, which takes account of the mutual information, gradient per pixel, and texture homogeneity to measure the temporal subband characteristics after applying the motion-compensated temporal filtering (MCTF) technique. Second, based on the proposed subband weighting factor model, we derive the optimal Lagrange multiplier. Experimental results demonstrate that the proposed algorithm enables more satisfactory video quality with negligible additional computational complexity.
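For contrast, the conventional hybrid-coding multiplier that the abstract says is inapplicable to wavelet SVC ties the Lagrange multiplier directly to the quantizer, e.g. the H.264-style lambda = 0.85 * 2**((QP - 12) / 3). A sketch of the resulting J = D + lambda*R mode decision, with hypothetical mode costs:

```python
def best_mode(modes, qp):
    """Rate-distortion mode decision J = D + lambda * R using the conventional
    hybrid-coding multiplier lambda = 0.85 * 2**((QP - 12) / 3) (H.264-style).
    This is exactly the quantization-step dependence that 3-D wavelet SVC lacks.
    Mode entries are hypothetical {'D': distortion, 'R': rate} dicts."""
    lam = 0.85 * 2.0 ** ((qp - 12) / 3.0)
    return min(modes, key=lambda m: m["D"] + lam * m["R"])
```

At a low QP (small lambda) the decision favours the low-distortion mode; at a high QP (large lambda) it flips to the low-rate mode, which is why a content-adaptive substitute for lambda is needed when no QP exists.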

  19. Applying emerging digital video interface standards to airborne avionics sensor and digital map integrations: benefits outweigh the initial costs

    Science.gov (United States)

    Kuehl, C. Stephen

    1996-06-01

    Video signal system performance can be compromised in a military aircraft cockpit management system (CMS) with the tailoring of vintage Electronics Industries Association (EIA) RS170 and RS343A video interface standards. Video analog interfaces degrade when induced system noise is present. Further signal degradation has been traditionally associated with signal data conversions between avionics sensor outputs and the cockpit display system. If the CMS engineering process is not carefully applied during the avionics video and computing architecture development, extensive and costly redesign will occur when visual sensor technology upgrades are incorporated. Close monitoring and technical involvement in video standards groups provides the knowledge-base necessary for avionic systems engineering organizations to architect adaptable and extendible cockpit management systems. With the Federal Communications Commission (FCC) in the process of adopting the Digital HDTV Grand Alliance System standard proposed by the Advanced Television Systems Committee (ATSC), the entertainment and telecommunications industries are adopting and supporting the emergence of new serial/parallel digital video interfaces and data compression standards that will drastically alter present NTSC-M video processing architectures. The re-engineering of the U.S. Broadcasting system must initially preserve the electronic equipment wiring networks within broadcast facilities to make the transition to HDTV affordable. International committee activities in technical forums like ITU-R (former CCIR), ANSI/SMPTE, IEEE, and ISO/IEC are establishing global consensus on video signal parameterizations that support a smooth transition from existing analog based broadcasting facilities to fully digital computerized systems. An opportunity exists for implementing these new video interface standards over existing video coax/triax cabling in military aircraft cockpit management systems. Reductions in signal

  20. Cross-index to DOE-prescribed occupational safety codes and standards

    International Nuclear Information System (INIS)

    1982-01-01

    A compilation of detailed information from more than three hundred and fifty DOE-prescribed or OSHA-referenced industrial safety codes and standards is presented. Condensed data from individual code portions are listed according to reference code, section, paragraph, and page. A glossary of the letter initials/abbreviations for the organizations or documents whose codes or standards are contained in this Cross-Index is also provided.

  1. 1 CFR 21.14 - Deviations from standard organization of the Code of Federal Regulations.

    Science.gov (United States)

    2010-01-01

    1 CFR 21.14 (2010), General Provisions, Codification, General Numbering: Deviations from standard organization of the Code of Federal Regulations. (a) Any deviation from standard Code of Federal Regulations designations must be approved in advance...

  2. Standard interface files and procedures for reactor physics codes. Version IV

    International Nuclear Information System (INIS)

    O'Dell, R.D.

    1977-09-01

    Standards, procedures, and recommendations of the Committee on Computer Code Coordination for promoting the exchange of reactor physics codes are updated to Version IV status. Standards and procedures covering general programming, program structure, standard interface files, and file management and handling subroutines are included

  3. MATIN: a random network coding based framework for high quality peer-to-peer live video streaming.

    Science.gov (United States)

    Barekatain, Behrang; Khezrimotlagh, Dariush; Aizaini Maarof, Mohd; Ghaeini, Hamid Reza; Salleh, Shaharuddin; Quintana, Alfonso Ariza; Akbari, Behzad; Cabrera, Alicia Triviño

    2013-01-01

    In recent years, Random Network Coding (RNC) has emerged as a promising solution for efficient Peer-to-Peer (P2P) video multicasting over the Internet, largely because RNC noticeably increases the error resiliency and throughput of the network. However, the high transmission overhead arising from sending a large coefficients vector as a header has been the most important challenge of RNC. Moreover, due to employing the Gauss-Jordan elimination method, considerable computational complexity can be imposed on peers in decoding the encoded blocks and checking linear dependency among the coefficients vectors. In order to address these challenges, this study introduces MATIN, a random network coding based framework for efficient P2P video streaming. MATIN includes a novel coefficients matrix generation method that guarantees no linear dependency in the generated coefficients matrix. Using the proposed framework, each peer encapsulates one instead of n coefficients entries into the generated encoded packet, which results in very low transmission overhead. It is also possible to obtain the inverted coefficients matrix using a small number of simple arithmetic operations, so peers incur very low computational complexity. As a result, MATIN permits random network coding to be more efficient in P2P video streaming systems. The results obtained from simulation using OMNeT++ show that it substantially outperforms RNC based on the Gauss-Jordan elimination method by providing better video quality on peers in terms of four important performance metrics: video distortion, dependency distortion, end-to-end delay and initial startup delay.
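The decoding cost that MATIN is designed to avoid is Gauss-Jordan elimination over the coefficients matrix. A sketch of baseline RNC over a prime field (an illustrative stand-in for the usual GF(2^8); the fixed, invertible coefficients matrix used below replaces the truly random coefficients of real RNC for reproducibility):

```python
import numpy as np

P = 257  # prime field as an illustrative stand-in for GF(2^8)

def rnc_combine(G, blocks):
    """Each coded packet is a linear combination of the source blocks (mod P)."""
    return (np.asarray(G) @ np.asarray(blocks)) % P

def modinv(a, p=P):
    """Modular inverse by Fermat's little theorem (p prime)."""
    return pow(int(a) % p, p - 2, p)

def rnc_decode(G, Y, p=P):
    """Recover the source blocks by Gauss-Jordan elimination mod p --
    the costly step that MATIN's structured coefficients matrix sidesteps."""
    A = np.concatenate([np.asarray(G) % p, np.asarray(Y) % p], axis=1).astype(object)
    n = np.asarray(G).shape[1]
    for col in range(n):
        piv = next(r for r in range(col, len(A)) if A[r][col] % p)  # find pivot
        A[[col, piv]] = A[[piv, col]]                               # swap rows
        A[col] = (A[col] * modinv(A[col][col], p)) % p              # normalize
        for r in range(len(A)):
            if r != col and A[r][col] % p:
                A[r] = (A[r] - A[r][col] * A[col]) % p              # eliminate
    return A[:n, n:].astype(int)
```

Elimination costs O(n^3) field operations per generation, which is exactly the per-peer burden that grows with the number of blocks n.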

  4. MATIN: a random network coding based framework for high quality peer-to-peer live video streaming.

    Directory of Open Access Journals (Sweden)

    Behrang Barekatain

    Full Text Available In recent years, Random Network Coding (RNC) has emerged as a promising solution for efficient Peer-to-Peer (P2P) video multicasting over the Internet, largely because RNC noticeably increases the error resiliency and throughput of the network. However, the high transmission overhead arising from sending a large coefficients vector as a header has been the most important challenge of RNC. Moreover, due to employing the Gauss-Jordan elimination method, considerable computational complexity can be imposed on peers in decoding the encoded blocks and checking linear dependency among the coefficients vectors. In order to address these challenges, this study introduces MATIN, a random network coding based framework for efficient P2P video streaming. MATIN includes a novel coefficients matrix generation method that guarantees no linear dependency in the generated coefficients matrix. Using the proposed framework, each peer encapsulates one instead of n coefficients entries into the generated encoded packet, which results in very low transmission overhead. It is also possible to obtain the inverted coefficients matrix using a small number of simple arithmetic operations, so peers incur very low computational complexity. As a result, MATIN permits random network coding to be more efficient in P2P video streaming systems. The results obtained from simulation using OMNeT++ show that it substantially outperforms RNC based on the Gauss-Jordan elimination method by providing better video quality on peers in terms of four important performance metrics: video distortion, dependency distortion, end-to-end delay and initial startup delay.

  5. Final Report. An Integrated Partnership to Create and Lead the Solar Codes and Standards Working Group

    Energy Technology Data Exchange (ETDEWEB)

    Rosenthal, Andrew [New Mexico State Univ., Las Cruces, NM (United States)

    2013-12-30

    The DOE grant, “An Integrated Partnership to Create and Lead the Solar Codes and Standards Working Group,” to New Mexico State University created the Solar America Board for Codes and Standards (Solar ABCs). From 2007 to 2013, with funding from this grant, Solar ABCs identified current issues, established a dialogue among key stakeholders, and catalyzed appropriate activities to support the development of codes and standards that facilitated the installation of high quality, safe photovoltaic systems. Solar ABCs brought the following resources to the PV stakeholder community: formal coordination in the planning or revision of interrelated codes and standards, removing “stove pipes” that have only roofing experts working on roofing codes, PV experts on PV codes, fire enforcement experts working on fire codes, etc.; a conduit through which all interested stakeholders were able to see the steps being taken in the development or modification of codes and standards and participate directly in the processes; a central clearing house for new documents, standards, proposed standards, analytical studies, and recommendations of best practices available to the PV community; a forum of experts that invites and welcomes all interested parties into the process of performing studies, evaluating results, and building consensus on standards and code-related topics that affect all aspects of the market; and a biennial gap analysis to formally survey the PV community to identify unmet needs that inhibit the market and necessary technical developments.

  6. Cross-index to DOE-prescribed occupational safety codes and standards

    International Nuclear Information System (INIS)

    1981-01-01

    This Cross-Index volume is the 1981 compilation of detailed information from more than three hundred and fifty DOE-prescribed or OSHA-referenced industrial safety codes and standards and is revised yearly to provide information from current codes. Condensed data from individual code portions are listed according to reference code, section, paragraph and page. Each code is given a two-digit reference code number or letter in the Contents section (pages C to L) of this volume. This reference code provides ready identification of any code listed in the Cross-Index. The computerized information listings are on the left-hand portion of the Cross-Index page; to the right of each listing are the reference code letters or numbers and the section, paragraph and page of the referenced code containing expanded information on the individual listing.

  7. The Codex standard and code for irradiated foods

    International Nuclear Information System (INIS)

    Erwin, L.

    1985-01-01

    A brief background on the work by the Codex Alimentarius Commission on irradiated foods is given. An Australian model food standard for irradiated foods, based on the Codex standard, is being developed

  8. Phenotypic Graphs and Evolution Unfold the Standard Genetic Code as the Optimal

    Science.gov (United States)

    Zamudio, Gabriel S.; José, Marco V.

    2018-03-01

    In this work, we explicitly consider the evolution of the Standard Genetic Code (SGC) by assuming two evolutionary stages, to wit, the primeval RNY code and two intermediate codes in between. We used network theory and graph theory to measure the connectivity of each phenotypic graph. The connectivity values are compared to the values of the codes under different randomization scenarios. An error-correcting optimal code is one in which the algebraic connectivity is minimized. We show that the SGC is optimal in regard to its robustness and error-tolerance when compared to all random codes under different assumptions.
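Algebraic connectivity is the second-smallest eigenvalue of the graph Laplacian (the Fiedler value), so the optimality criterion can be computed directly from an adjacency matrix. A sketch on two toy graphs; these example graphs are illustrative, not the paper's phenotypic graphs:

```python
import numpy as np

def algebraic_connectivity(adj):
    """Second-smallest eigenvalue of the graph Laplacian L = D - A
    (the Fiedler value); the paper's optimality criterion minimizes it."""
    A = np.asarray(adj, dtype=float)
    L = np.diag(A.sum(axis=1)) - A
    return float(np.sort(np.linalg.eigvalsh(L))[1])

# Two toy graphs on 4 nodes: a sparse path and a fully connected graph.
path = np.array([[0, 1, 0, 0],
                 [1, 0, 1, 0],
                 [0, 1, 0, 1],
                 [0, 0, 1, 0]])
complete = 1 - np.eye(4)
```

A sparser, more "error-tolerant" phenotypic graph has a smaller Fiedler value than a densely connected one, which is the sense in which minimal algebraic connectivity marks an error-correcting optimal code.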

  9. Development of standards, codes of practice and guidelines at the national level

    International Nuclear Information System (INIS)

    Swindon, T.N.

    1989-01-01

    Standards, codes of practice and guidelines are defined and their different roles in radiation protection specified. The work of the major bodies that develop such documents in Australia - the National Health and Medical Research Council and the Standards Association of Australia - is discussed. The codes of practice prepared under the Environment Protection (Nuclear Codes) Act, 1978, an act of the Australian Federal Parliament, are described and the guidelines associated with them outlined. 5 refs

  10. Using self-similarity compensation for improving inter-layer prediction in scalable 3D holoscopic video coding

    Science.gov (United States)

    Conti, Caroline; Nunes, Paulo; Ducla Soares, Luís.

    2013-09-01

    Holoscopic imaging, also known as integral imaging, has been recently attracting the attention of the research community, as a promising glassless 3D technology due to its ability to create a more realistic depth illusion than the current stereoscopic or multiview solutions. However, in order to gradually introduce this technology into the consumer market and to efficiently deliver 3D holoscopic content to end-users, backward compatibility with legacy displays is essential. Consequently, to enable 3D holoscopic content to be delivered and presented on legacy displays, a display scalable 3D holoscopic coding approach is required. Hence, this paper presents a display scalable architecture for 3D holoscopic video coding with a three-layer approach, where each layer represents a different level of display scalability: Layer 0 - a single 2D view; Layer 1 - 3D stereo or multiview; and Layer 2 - the full 3D holoscopic content. In this context, a prediction method is proposed, which combines inter-layer prediction, aiming to exploit the existing redundancy between the multiview and the 3D holoscopic layers, with self-similarity compensated prediction (previously proposed by the authors for non-scalable 3D holoscopic video coding), aiming to exploit the spatial redundancy inherent to the 3D holoscopic enhancement layer. Experimental results show that the proposed combined prediction can improve significantly the rate-distortion performance of scalable 3D holoscopic video coding with respect to the authors' previously proposed solutions, where only inter-layer or only self-similarity prediction is used.

  11. A Joint Watermarking and ROI Coding Scheme for Annotating Traffic Surveillance Videos

    Directory of Open Access Journals (Sweden)

    Su Po-Chyi

    2010-01-01

    Full Text Available We propose a new application of information hiding, employing digital watermarking techniques to facilitate data annotation in traffic surveillance videos. There are two parts in the proposed scheme. The first part is object-based watermarking, in which the information of each vehicle collected by the intelligent transportation system is conveyed/stored along with the visual data via information hiding. The scheme is integrated with H.264/AVC, which is assumed to be adopted by the surveillance system, to achieve an efficient implementation. The second part is a Region of Interest (ROI) rate control mechanism for encoding traffic surveillance videos, which helps to improve the overall performance. The quality of vehicles in the video is better preserved and a good rate-distortion performance can be attained. Experimental results show that this scheme works well in traffic surveillance videos.

  12. Efficient video coding integrating MPEG-2 and picture-rate conversion

    NARCIS (Netherlands)

    Bruin, de F.J.; Bruls, W.H.A.; Burazerovic, D.; Haan, de G.

    2002-01-01

    We present an MPEG-2 compliant video codec using picture-rate upconversion during decoding. The upconversion autonomously regenerates major parts of frames without vectorial and residual data. Consequently, the bitrate is greatly reduced.
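Picture-rate upconversion at the decoder can be illustrated, in its simplest non-motion-compensated form, by inserting the average of each neighbouring frame pair; the codec described above regenerates the skipped frames with motion-compensated interpolation instead:

```python
import numpy as np

def upconvert_2x(frames):
    """Double the picture rate by inserting the average of each neighbouring
    frame pair -- a simple stand-in for the motion-compensated interpolation
    the decoder performs, which regenerates frames without vectorial or
    residual data in the bit stream."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.extend([a, (a + b) / 2.0])
    out.append(frames[-1])
    return out
```

Because the inserted frames are synthesised at the decoder, the encoder only transmits every other frame, which is where the bitrate reduction comes from.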

  13. OECD International Standard Problem number 34. Falcon code comparison report

    International Nuclear Information System (INIS)

    Williams, D.A.

    1994-12-01

    ISP-34 is the first ISP to address fission product transport issues and has been strongly supported by a large number of different countries and organisations. The ISP is based on two experiments, FAL-ISP-1 and FAL-ISP-2, which were conducted in AEA's Falcon facility. Specific features of the experiments include quantification of chemical effects and aerosol behaviour. In particular, multi-component aerosol effects and vapour-aerosol interactions can all be investigated in the Falcon facility. Important parameters for participants to predict were the deposition profiles and composition, key chemical species and reactions, evolution of suspended material concentrations, and the effects of steam condensation onto aerosols and particle hygroscopicity. The results of the Falcon ISP support the belief that aerosol physics is generally well modelled in primary circuit codes, but the chemistry models in many of the codes need to be improved, since chemical speciation is one of the main factors that control transport and deposition behaviour. The importance of chemical speciation, aerosol nucleation, and the role of multi-component aerosols in determining transport and deposition behaviour are evident. The role of re-vaporization in these Falcon experiments is not clear; it is not possible to compare those codes which predicted re-vaporization with quantitative data. The evidence from this ISP exercise indicates that the containment codes can predict thermal-hydraulic conditions satisfactorily. However, the differences in the predicted aerosol locations in the Falcon tests showed that aerosol behaviour was very susceptible to parameters such as particle size distribution.

  14. Reconfigurable Secure Video Codec Based on DWT and AES Processor

    OpenAIRE

    Rached Tourki; M. Machhout; B. Bouallegue; M. Atri; M. Zeghid; D. Dia

    2010-01-01

    In this paper, we propose a secure video codec based on the discrete wavelet transformation (DWT) and the Advanced Encryption Standard (AES) processor. Using DWT for video coding or AES for encryption is, on its own, well known; linking these two designs to achieve secure video coding is the novelty here. The contributions of our work are as follows. First, a new method for image and video compression is proposed. This codec is a synthesis of JPEG and JPEG2000, which is implemented using Huffm...

  15. Codes and standards and other guidance cited in regulatory documents. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Ankrum, A.; Nickolaus, J.; Vinther, R.; Maguire-Moffitt, N.; Hammer, J.; Sherfey, L.; Warner, R. [Pacific Northwest Lab., Richland, WA (United States)

    1994-08-01

    As part of the US Nuclear Regulatory Commission (NRC) Standard Review Plan Update and Development Program, Pacific Northwest Laboratory developed a listing of industry consensus codes and standards and other government and industry guidance referred to in regulatory documents. In addition to updating previous information, Revision 1 adds citations from the NRC Inspection Manual and the Improved Standard Technical Specifications. This listing identifies the version of the code or standard cited in the regulatory document, the regulatory document, and the current version of the code or standard. It also provides a summary characterization of the nature of the citation. This listing was developed from electronic searches of the Code of Federal Regulations and the NRC's Bulletins, Information Notices, Circulars, Generic Letters, Policy Statements, Regulatory Guides, and the Standard Review Plan (NUREG-0800).

  16. Codes and standards and other guidance cited in regulatory documents. Revision 1

    International Nuclear Information System (INIS)

    Ankrum, A.; Nickolaus, J.; Vinther, R.; Maguire-Moffitt, N.; Hammer, J.; Sherfey, L.; Warner, R.

    1994-08-01

    As part of the US Nuclear Regulatory Commission (NRC) Standard Review Plan Update and Development Program, Pacific Northwest Laboratory developed a listing of industry consensus codes and standards and other government and industry guidance referred to in regulatory documents. In addition to updating previous information, Revision 1 adds citations from the NRC Inspection Manual and the Improved Standard Technical Specifications. This listing identifies the version of the code or standard cited in the regulatory document, the regulatory document, and the current version of the code or standard. It also provides a summary characterization of the nature of the citation. This listing was developed from electronic searches of the Code of Federal Regulations and the NRC's Bulletins, Information Notices, Circulars, Generic Letters, Policy Statements, Regulatory Guides, and the Standard Review Plan (NUREG-0800)

  17. 77 FR 34020 - National Fire Codes: Request for Public Input for Revision of Codes and Standards

    Science.gov (United States)

    2012-06-08

    .... (CO) Detection and Warning Equipment. NFPA 790--2012 Standard for Competency of Third-Party Field 6/22... Health-Related Fitness Programs for 1/4/2013. Fire Department Members. NFPA 1584--2008 Standard on the...

  18. Numerical analysis and nuclear standard code application to thermal fatigue

    International Nuclear Information System (INIS)

    Merola, M.

    1992-01-01

    The present work describes the Joint Research Centre Ispra contribution to the IAEA benchmark exercise 'Lifetime Behaviour of the First Wall of Fusion Machines'. The results of the numerical analysis of the reference thermal fatigue experiment are presented. Then a discussion on the numerical analysis of thermal stress is tackled, pointing out its particular aspects in view of their influence on the stress field evaluation. As far as the design-allowable number of cycles is concerned, the American nuclear code ASME and the French code RCC-MR are applied and the reasons for the different results obtained are investigated. As regards a realistic fatigue lifetime evaluation, the main problems to be solved are brought out. This work is intended as a preliminary basis for a discussion focusing on the main characteristics of the thermal fatigue problem from both a numerical and a lifetime-assessment point of view. In fact, the present margin of discretion left to the analyst may cause undue discrepancies in the results obtained. A sensitivity analysis of the main parameters involved is desirable, and more precise design procedures should be stated

  19. Cross-Index to DOE-prescribed industrial safety codes and standards

    International Nuclear Information System (INIS)

    1980-01-01

    This Cross-Index volume is the 1980 compilation of detailed information from more than two hundred and ninety Department of Energy (DOE) prescribed or Occupational Health and Safety Administration (OSHA) referenced industrial safety codes and standards. The compilation of this material was conceived and initiated in 1973, and is revised yearly to provide information from current codes. Condensed data from individual code portions are listed according to reference code, section, paragraph, and page. Each code is given a two-digit reference code number or letter in the Contents section. This reference code provides ready identification of any code listed in the Cross-Index. The computerized information listings are on the left-hand portion of each Cross-Index page; to the right of each listing are the reference code letters or numbers and the section, paragraph, and page of the referenced code containing expanded information on the individual listing. Simplified 'How to Use' directions are listed. A glossary of letter initials/abbreviations for the organizations or documents, whose codes or standards are contained in this Cross-Index, is included

  20. 77 FR 67628 - National Fire Codes: Request for Public Input for Revision of Codes and Standards

    Science.gov (United States)

    2012-11-13

    ... Standard on Fire and 7/8/2013 Life Safety in Animal Housing Facilities. NFPA 160--2011 Standard for the Use... five years in Revision Cycles that begin twice each year and take approximately two years to complete. Each Revision Cycle proceeds according to a published schedule that includes final dates for all major...

  1. 76 FR 70414 - National Fire Protection Association (NFPA) Proposes To Revise Codes and Standards

    Science.gov (United States)

    2011-11-14

    ... Commercial Cooking Operations. NFPA 99--2012 Health Care Facilities Code 6/22/2012 NFPA 99B--2010 Standard... Explosion Investigations..... 1/4/2012 NFPA 1005--2007 Standard for Professional Qualifications for 1/4/2012 Marine Fire Fighting for Land-Based Fire Fighters. NFPA 1021--2009 Standard for Fire Officer Professional...

  2. Accelerating Wavelet-Based Video Coding on Graphics Hardware using CUDA

    NARCIS (Netherlands)

    Laan, Wladimir J. van der; Roerdink, Jos B.T.M.; Jalba, Andrei C.; Zinterhof, P; Loncaric, S; Uhl, A; Carini, A

    2009-01-01

    The Discrete Wavelet Transform (DWT) has a wide range of applications from signal processing to video and image compression. This transform, by means of the lifting scheme, can be performed in a memory- and computation-efficient way on modern, programmable GPUs, which can be regarded as massively
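
    The lifting scheme mentioned above computes the DWT in place with alternating predict and update passes, which is what makes it memory-efficient and easy to parallelise. A minimal CPU sketch of the integer CDF 5/3 lifting step (the record's GPU/CUDA mapping is not reproduced; function names are illustrative):

```python
def lift_53(x):
    """One level of the integer CDF 5/3 lifting transform (as used for
    JPEG2000 lossless), with simple mirror extension; len(x) must be even."""
    x = list(x)
    n = len(x)
    for i in range(1, n, 2):                 # predict: odds become details
        right = x[i + 1] if i + 1 < n else x[i - 1]
        x[i] -= (x[i - 1] + right) // 2
    for i in range(0, n, 2):                 # update: evens become approximation
        left = x[i - 1] if i > 0 else x[i + 1]
        right = x[i + 1] if i + 1 < n else x[i - 1]
        x[i] += (left + right + 2) // 4
    return x[0::2], x[1::2]                  # (low-pass, high-pass)

def unlift_53(lo, hi):
    """Exact inverse: undo the update step, then the predict step."""
    n = 2 * len(lo)
    x = [0] * n
    x[0::2], x[1::2] = lo, hi
    for i in range(0, n, 2):
        left = x[i - 1] if i > 0 else x[i + 1]
        right = x[i + 1] if i + 1 < n else x[i - 1]
        x[i] -= (left + right + 2) // 4
    for i in range(1, n, 2):
        right = x[i + 1] if i + 1 < n else x[i - 1]
        x[i] += (x[i - 1] + right) // 2
    return x

sig = [3, 7, 1, 8, 2, 9, 4, 6]
lo, hi = lift_53(sig)
assert unlift_53(lo, hi) == sig              # lossless round trip
```

    Because each pass touches disjoint samples, the predict and update loops map naturally onto GPU threads, which is the basis of the CUDA acceleration in this record.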

  3. Temporal signal energy correction and low-complexity encoder feedback for lossy scalable video coding

    NARCIS (Netherlands)

    Loomans, M.J.H.; Koeleman, C.J.; With, de P.H.N.

    2010-01-01

    In this paper, we address two problems found in embedded implementations of Scalable Video Codecs (SVCs): the temporal signal energy distribution and frame-to-frame quality fluctuations. The unequal energy distribution between the low- and high-pass band with integer-based wavelets leads to

  4. Accelerating wavelet-based video coding on graphics hardware using CUDA

    NARCIS (Netherlands)

    Laan, van der W.J.; Roerdink, J.B.T.M.; Jalba, A.C.; Zinterhof, P.; Loncaric, S.; Uhl, A.; Carini, A.

    2009-01-01

    The Discrete Wavelet Transform (DWT) has a wide range of applications from signal processing to video and image compression. This transform, by means of the lifting scheme, can be performed in a memory- and computation-efficient way on modern, programmable GPUs, which can be regarded as massively

  5. Interband coding extension of the new lossless JPEG standard

    Science.gov (United States)

    Memon, Nasir D.; Wu, Xiaolin; Sippy, V.; Miller, G.

    1997-01-01

    Due to the perceived inadequacy of current standards for lossless image compression, the JPEG committee of the International Standards Organization (ISO) has been developing a new standard. A baseline algorithm, called JPEG-LS, has already been completed and is awaiting approval by national bodies. The JPEG-LS baseline algorithm, despite being simple, is surprisingly efficient, and provides compression performance that is within a few percent of the best and more sophisticated techniques reported in the literature. Extensive experimentation performed by the authors seems to indicate that an overall improvement of more than 10 percent in compression performance will be difficult to obtain even at the cost of great complexity; at least not with traditional approaches to lossless image compression. However, if we allow inter-band decorrelation and modeling in the baseline algorithm, nearly 30 percent improvement in compression gains for specific images in the test set becomes possible with a modest computational cost. In this paper we propose and investigate a few techniques for exploiting inter-band correlations in multi-band images. These techniques have been designed within the framework of the baseline algorithm, and require minimal changes to the basic architecture of the baseline, retaining its essential simplicity.
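
    For context, the JPEG-LS baseline predicts each sample from its causal neighbours with the median edge detector (MED); the inter-band extension proposed in this record would add samples from a reference band to that context. A minimal intra-band sketch (illustrative names, zero borders for simplicity):

```python
def med_predict(a, b, c):
    """JPEG-LS median edge detector: a = left, b = above, c = above-left."""
    if c >= max(a, b):
        return min(a, b)       # edge detected above or to the left
    if c <= min(a, b):
        return max(a, b)
    return a + b - c           # smooth region: planar prediction

def residuals(img):
    """Prediction residuals for a grayscale image given as a list of rows;
    out-of-image neighbours are taken as zero for simplicity."""
    out = []
    for y in range(len(img)):
        row = []
        for x in range(len(img[0])):
            a = img[y][x - 1] if x else 0
            b = img[y - 1][x] if y else 0
            c = img[y - 1][x - 1] if x and y else 0
            row.append(img[y][x] - med_predict(a, b, c))
        out.append(row)
    return out

# a flat image predicts perfectly everywhere except the first sample
assert residuals([[10, 10], [10, 10]]) == [[10, 0], [0, 0]]
```

    The residuals, not the raw samples, are what the entropy coder sees; an inter-band predictor aims to shrink them further using the co-located sample in another band.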

  6. Boltzmann-Fokker-Planck calculations using standard discrete-ordinates codes

    International Nuclear Information System (INIS)

    Morel, J.E.

    1987-01-01

    The Boltzmann-Fokker-Planck (BFP) equation can be used to describe both neutral and charged-particle transport. Over the past several years, the author and several collaborators have developed methods for representing Fokker-Planck operators with standard multigroup-Legendre cross-section data. When these data are input to a standard S_n code such as ONETRAN, the code actually solves the Boltzmann-Fokker-Planck equation rather than the Boltzmann equation. This is achieved without any modification to the S_n codes. Because BFP calculations can be more demanding from a numerical viewpoint than standard neutronics calculations, we have found it useful to implement new quadrature methods and convergence acceleration methods in the standard discrete-ordinates code, ONETRAN. We discuss our BFP cross-section representation techniques, our improved quadrature and acceleration techniques, and present results from BFP coupled electron-photon transport calculations performed with ONETRAN. 19 refs., 7 figs

  7. Hydrogen Codes and Standards: An Overview of U.S. DOE Activities

    International Nuclear Information System (INIS)

    James M Ohi

    2006-01-01

    The Hydrogen, Fuel Cells, and Infrastructure Technologies (HFCIT) Program of the U.S. Department of Energy (DOE) and the National Renewable Energy Laboratory (NREL), with the help of leading standards and model code development organizations, other national laboratories, and key stakeholders, are developing a coordinated and collaborative government-industry effort to prepare, review, and promulgate hydrogen codes and standards needed to expedite hydrogen infrastructure development. The focus of this effort is to put in place a coordinated and comprehensive hydrogen codes and standards program at the national and international levels. This paper updates an overview of the U.S. program to facilitate and coordinate the development of hydrogen codes and standards that was presented by the author at WHEC 15. (authors)

  8. Guidelines on Active Content and Mobile Code: Recommendations of the National Institute of Standards and Technology

    National Research Council Canada - National Science Library

    Jansen, Wayne

    2001-01-01

    .... One such category of technologies is active content. Broadly speaking, active content refers to electronic documents that, unlike past character documents based on the American Standard Code for Information Interchange (ASCII...

  9. Licensing procedure, nuclear codes and standards in the Federal Republic of Germany

    International Nuclear Information System (INIS)

    Schultheiss, G.F.

    1980-01-01

    The present paper deals with legal background of licensing in nuclear technology and atomic energy use, licensing procedures for nuclear power plants and with codes, standards and guidelines in the Federal Republic of Germany. (orig./RW)

  10. NODC Standard Format Marine Toxic Substances and Pollutants (F144) chemical identification codes (NODC Accession 9200273)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This archival information package contains a listing of codes and chemical names that were used in NODC Standard Format Marine Toxic Substances and Pollutants (F144)...

  11. IAEA Workshop (Training Course) on Codes and Standards for Sodium Cooled Fast Reactors. Working Material

    International Nuclear Information System (INIS)

    2010-01-01

    The training course consisted of lectures and Q&A sessions. The lectures dealt with the history of the development of Design Codes and Standards for Sodium Cooled Fast Reactors (SFRs) in the respective country, the detailed description of the current design Codes and Standards for SFRs and their application to ongoing Fast Reactor design projects, as well as the ongoing development work and plans for the future in this area. Annex 1 contains the detailed Workshop program

  12. Standard problems to evaluate soil structure interaction computer codes

    International Nuclear Information System (INIS)

    Miller, C.A.; Costantino, C.J.; Philippacopoulos, A.J.

    1979-01-01

    The seismic response of nuclear power plant structures is often calculated using lumped parameter methods. A finite element model of the structure is coupled to the soil with a spring-dashpot system used to represent the interaction process. The parameters of the interaction model are based on analytic solutions to simple problems which are idealizations of the actual problems of interest. The objective of the work reported in this paper is to compare predicted responses using the standard lumped parameter models with experimental data. These comparisons are shown to be good for a fairly uniform soil system and for loadings which do not result in nonlinear interaction effects such as liftoff. 7 references, 7 figures
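
    The lumped-parameter approach described above reduces the soil-structure interaction problem to masses, springs and dashpots. A minimal single-degree-of-freedom sketch with illustrative parameters (not values from the paper):

```python
def sdof_response(m, c, k, dt, steps, x0=0.01, v0=0.0):
    """Free vibration of the lumped-parameter model m*x'' + c*x' + k*x = 0,
    integrated with semi-implicit Euler; returns the displacement history."""
    x, v, hist = x0, v0, []
    for _ in range(steps):
        a = -(c * v + k * x) / m   # spring + dashpot restoring acceleration
        v += a * dt
        x += v * dt
        hist.append(x)
    return hist

# illustrative mass, damping and stiffness only
hist = sdof_response(m=1.0, c=0.4, k=40.0, dt=0.001, steps=5000)
assert abs(hist[-1]) < abs(hist[0])   # the dashpot decays the response
```

    In a real analysis the spring and dashpot constants come from analytic half-space solutions, and a base excitation term replaces the free-vibration initial condition.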

  13. Accounting Education Approach in the Context of New Turkish Commercial Code and Turkish Accounting Standards

    Directory of Open Access Journals (Sweden)

    Cevdet Kızıl

    2014-08-01

    The aim of this article is to investigate the impact of the new Turkish commercial code and Turkish accounting standards on accounting education. The study uses the survey method to gather data for the analysis; questionnaire forms were distributed to university students in person and via the internet. The paper addresses significant research questions such as "Are accounting academicians informed and knowledgeable on the new Turkish commercial code and Turkish accounting standards?", "Do accounting academicians integrate the new Turkish commercial code and Turkish accounting standards into their lectures?", "How do modern accounting education methodology and technology coincide with the teaching of the new Turkish commercial code and Turkish accounting standards?", "Do universities offer mandatory and elective courses which cover the new Turkish commercial code and Turkish accounting standards?" and "If such courses are offered, what are their names, percentage in the curriculum and degree of coverage?" The research contributes to the literature in several ways. First, the new Turkish commercial code and Turkish accounting standards are significant current topics for the accounting profession, and accounting education provides a basis for implementation in the public and private sectors. Moreover, one of the intentions of the new Turkish commercial code and Turkish accounting standards is to foster transparency, a critical concept for mergers, acquisitions and investments. Stakeholders of today's business world, such as investors, shareholders, entrepreneurs, auditors and government, need more standardized global accounting principles. Thus, the revision and redesign of accounting education plays an important role, which underlines the necessity and relevance of this research.

  14. Safety, codes and standards for hydrogen installations. Metrics development and benchmarking

    Energy Technology Data Exchange (ETDEWEB)

    Harris, Aaron P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dedrick, Daniel E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); LaFleur, Angela Christine [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); San Marchi, Christopher W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-04-01

    Automakers and fuel providers have made public commitments to commercialize light duty fuel cell electric vehicles and fueling infrastructure in select US regions beginning in 2014. The development, implementation, and advancement of meaningful codes and standards is critical to enable the effective deployment of clean and efficient fuel cell and hydrogen solutions in the energy technology marketplace. Metrics pertaining to the development and implementation of safety knowledge, codes, and standards are important to communicate progress and inform future R&D investments. This document describes the development and benchmarking of metrics specific to the development of hydrogen specific codes relevant for hydrogen refueling stations. These metrics will be most useful as the hydrogen fuel market transitions from pre-commercial to early-commercial phases. The target regions in California will serve as benchmarking case studies to quantify the success of past investments in research and development supporting safety codes and standards R&D.

  15. Review of ASME nuclear codes and standards- subcommittee on repairs, replacements, and modifications

    International Nuclear Information System (INIS)

    Mawson, T.J.

    1990-01-01

    As requested by the ASME board on Nuclear Codes and Standards, the Pressure Vessel Research Committee initiated a project to review Sections III and XI of the ASME Boiler and Pressure Vessel Code for the purposes of improving, clarifying, providing transition, consistency, compatibility, and simplifying code requirements. The project was organized with six subcommittees to address various Code activities: design; tests and examinations; documentation; quality assurance; repair, replacement and modification; and general requirements. This paper discusses how the subcommittee on repair, replacement and modification was organized to review the repair, replacement and modification requirements of the ASME boiler and pressure vessel code, Section III and Section XI for Class 1, 2, and 3 and MC components and their supports, and other documents of the nuclear industry related to the repair, replacement and modification requirements of the ASME code

  16. International standard problem (ISP) no. 41 follow up exercise: Containment iodine computer code exercise: parametric studies

    Energy Technology Data Exchange (ETDEWEB)

    Ball, J.; Glowa, G.; Wren, J. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada); Ewig, F. [GRS Koln (Germany); Dickenson, S. [AEAT, (United Kingdom); Billarand, Y.; Cantrel, L. [IPSN (France); Rydl, A. [NRIR (Czech Republic); Royen, J. [OECD/NEA (France)

    2001-11-01

    This report describes the results of the second phase of International Standard Problem (ISP) 41, an iodine behaviour code comparison exercise. The first phase of the study, which was based on a simple Radioiodine Test Facility (RTF) experiment, demonstrated that all of the iodine behaviour codes had the capability to reproduce iodine behaviour for a narrow range of conditions (single temperature, no organic impurities, controlled pH steps). The current phase, a parametric study, was designed to evaluate the sensitivity of iodine behaviour codes to boundary conditions such as pH, dose rate, temperature and initial I{sup -} concentration. The codes used in this exercise were IODE(IPSN), IODE(NRIR), IMPAIR(GRS), INSPECT(AEAT), IMOD(AECL) and LIRIC(AECL). The parametric study described in this report identified several areas of discrepancy between the various codes. In general, the codes agree regarding qualitative trends, but their predictions regarding the actual amount of volatile iodine varied considerably. The largest source of the discrepancies between code predictions appears to be their different approaches to modelling the formation and destruction of organic iodides. A recommendation arising from this exercise is that an additional code comparison exercise be performed on organic iodide formation, against data obtained from intermediate-scale studies (two RTF (AECL, Canada) and two CAIMAN (IPSN, France) experiments have been chosen). This comparison will allow each of the code users to realistically evaluate and improve the organic iodide behaviour sub-models within their codes. (author)

  17. International standard problem (ISP) no. 41 follow up exercise: Containment iodine computer code exercise: parametric studies

    International Nuclear Information System (INIS)

    Ball, J.; Glowa, G.; Wren, J.; Ewig, F.; Dickenson, S.; Billarand, Y.; Cantrel, L.; Rydl, A.; Royen, J.

    2001-11-01

    This report describes the results of the second phase of International Standard Problem (ISP) 41, an iodine behaviour code comparison exercise. The first phase of the study, which was based on a simple Radioiodine Test Facility (RTF) experiment, demonstrated that all of the iodine behaviour codes had the capability to reproduce iodine behaviour for a narrow range of conditions (single temperature, no organic impurities, controlled pH steps). The current phase, a parametric study, was designed to evaluate the sensitivity of iodine behaviour codes to boundary conditions such as pH, dose rate, temperature and initial I - concentration. The codes used in this exercise were IODE(IPSN), IODE(NRIR), IMPAIR(GRS), INSPECT(AEAT), IMOD(AECL) and LIRIC(AECL). The parametric study described in this report identified several areas of discrepancy between the various codes. In general, the codes agree regarding qualitative trends, but their predictions regarding the actual amount of volatile iodine varied considerably. The largest source of the discrepancies between code predictions appears to be their different approaches to modelling the formation and destruction of organic iodides. A recommendation arising from this exercise is that an additional code comparison exercise be performed on organic iodide formation, against data obtained from intermediate-scale studies (two RTF (AECL, Canada) and two CAIMAN (IPSN, France) experiments have been chosen). This comparison will allow each of the code users to realistically evaluate and improve the organic iodide behaviour sub-models within their codes. (author)

  18. 76 FR 70413 - National Fire Protection Association (NFPA): Request for Comments on NFPA's Codes and Standards

    Science.gov (United States)

    2011-11-14

    ... Private Fire Protection. P NFPA 36 Standard for Solvent Extraction Plants P NFPA 52 Vehicular Gaseous Fuel Systems Code P NFPA 67 Guideline on Explosion Protection for Gaseous N Mixtures in Pipe Systems. NFPA 68 Standard on Explosion Protection by Deflagration P Venting. NFPA 70B Recommended Practice for Electrical...

  19. R&D for Safety Codes and Standards: Materials and Components Compatibility

    Energy Technology Data Exchange (ETDEWEB)

    Somerday, Brian P. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); LaFleur, Chris [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Marchi, Chris San [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2015-08-01

    This project addresses the following technical barriers from the Safety, Codes and Standards section of the 2012 Fuel Cell Technologies Office Multi-Year Research, Development and Demonstration Plan (section 3.8): (A) safety data and information: limited access and availability; (F) enabling national and international markets requires consistent RCS; (G) insufficient technical data to revise standards.

  20. Development of a Coding Instrument to Assess the Quality and Content of Anti-Tobacco Video Games

    Science.gov (United States)

    Alber, Julia M.; Watson, Anna M.; Barnett, Tracey E.; Mercado, Rebeccah

    2015-01-01

    Previous research has shown the use of electronic video games as an effective method for increasing content knowledge about the risks of drugs and alcohol use for adolescents. Although best practice suggests that theory, health communication strategies, and game appeal are important characteristics for developing games, no instruments are currently available to examine the quality and content of tobacco prevention and cessation electronic games. This study presents the systematic development of a coding instrument to measure the quality, use of theory, and health communication strategies of tobacco cessation and prevention electronic games. Using previous research and expert review, a content analysis coding instrument measuring 67 characteristics was developed with three overarching categories: type and quality of games, theory and approach, and type and format of messages. Two trained coders applied the instrument to 88 games on four platforms (personal computer, Nintendo DS, iPhone, and Android phone) to field test the instrument. Cohen's kappa for each item ranged from 0.66 to 1.00, with an average kappa value of 0.97. Future research can adapt this coding instrument to games addressing other health issues. In addition, the instrument questions can serve as a useful guide for evidence-based game development. PMID:26167842

  1. Development of a Coding Instrument to Assess the Quality and Content of Anti-Tobacco Video Games.

    Science.gov (United States)

    Alber, Julia M; Watson, Anna M; Barnett, Tracey E; Mercado, Rebeccah; Bernhardt, Jay M

    2015-07-01

    Previous research has shown the use of electronic video games as an effective method for increasing content knowledge about the risks of drugs and alcohol use for adolescents. Although best practice suggests that theory, health communication strategies, and game appeal are important characteristics for developing games, no instruments are currently available to examine the quality and content of tobacco prevention and cessation electronic games. This study presents the systematic development of a coding instrument to measure the quality, use of theory, and health communication strategies of tobacco cessation and prevention electronic games. Using previous research and expert review, a content analysis coding instrument measuring 67 characteristics was developed with three overarching categories: type and quality of games, theory and approach, and type and format of messages. Two trained coders applied the instrument to 88 games on four platforms (personal computer, Nintendo DS, iPhone, and Android phone) to field test the instrument. Cohen's kappa for each item ranged from 0.66 to 1.00, with an average kappa value of 0.97. Future research can adapt this coding instrument to games addressing other health issues. In addition, the instrument questions can serve as a useful guide for evidence-based game development.
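
    The inter-rater agreement reported above (kappa from 0.66 to 1.00 per item) can be computed for any coded item with the standard Cohen's kappa formula; a minimal sketch with toy ratings:

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters'
    categorical codes (e.g., one of the instrument's 67 items per game)."""
    n = len(rater1)
    categories = set(rater1) | set(rater2)
    p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n
    p_exp = sum((rater1.count(c) / n) * (rater2.count(c) / n)
                for c in categories)
    return (p_obs - p_exp) / (1 - p_exp)

# perfect agreement on a toy item gives kappa = 1.0
assert cohens_kappa([1, 1, 0, 0], [1, 1, 0, 0]) == 1.0
```

    Agreement no better than chance gives kappa = 0, which is why kappa rather than raw percent agreement is the conventional reliability statistic for content analysis.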

  2. Safety standards, legislation and codes of practice for fuel cell manufacture and operation

    Energy Technology Data Exchange (ETDEWEB)

    Wilcox, C.P.

    1999-07-01

    This report examines safety standards, legislation and codes of practice for fuel cell manufacture and operation in the UK, Europe and internationally. Management of health and safety in the UK is discussed, and the characteristics of phosphoric acid (PAFC), proton exchange membrane (PEM), molten carbonate (MCFC), solid oxide (SOFC) fuel cells are described. Fuel cell power plant standards and manufacture in the UK, design and operational considerations, end of life disposal, automotive fuel cell system, and fuelling and vehicular concerns are explored, and standards, legislation and codes of practice are explained in the appendix.

  3. Review and Implementation of the Emerging CCSDS Recommended Standard for Multispectral and Hyperspectral Lossless Image Coding

    Science.gov (United States)

    Sanchez, Jose Enrique; Auge, Estanislau; Santalo, Josep; Blanes, Ian; Serra-Sagrista, Joan; Kiely, Aaron

    2011-01-01

    A new standard for image coding is being developed by the MHDC working group of the CCSDS, targeting onboard compression of multi- and hyper-spectral imagery captured by aircraft and satellites. The proposed standard is based on the "Fast Lossless" adaptive linear predictive compressor, and is adapted to better overcome issues of onboard scenarios. In this paper, we present a review of the state of the art in this field, and provide an experimental comparison of the coding performance of the emerging standard in relation to other state-of-the-art coding techniques. Our own independent implementation of the MHDC Recommended Standard, as well as of some of the other techniques, has been used to provide extensive results over the vast corpus of test images from the CCSDS-MHDC.
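
    The "Fast Lossless" compressor named above is built on adaptive linear prediction. As a rough 1-D illustration only (the real algorithm predicts each pixel from spatial and spectral neighbours and uses its own weight-update rule), a toy sign-LMS predictor:

```python
def adaptive_predict(samples, order=3, lr=0.01):
    """Toy sign-LMS adaptive linear predictor over a 1-D stream.
    Returns the prediction residuals an entropy coder would then encode."""
    w = [0.0] * order
    residuals = []
    for i, s in enumerate(samples):
        ctx = [samples[i - j - 1] if i - j - 1 >= 0 else 0
               for j in range(order)]
        pred = sum(wj * xj for wj, xj in zip(w, ctx))
        e = s - pred
        residuals.append(e)
        sign = 1 if e > 0 else (-1 if e < 0 else 0)
        w = [wj + lr * sign * xj for wj, xj in zip(w, ctx)]  # adapt weights
    return residuals

res = adaptive_predict([5] * 60)
# the first residual is the full sample; later ones shrink as weights adapt
assert res[0] == 5 and abs(res[-1]) < 1.0
```

    Small residuals are the whole point: after adaptation, most of the signal energy is removed before entropy coding, which is what makes the scheme effective onboard.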

  4. A Hybrid Scheme Based on Pipelining and Multitasking in Mobile Application Processors for Advanced Video Coding

    Directory of Open Access Journals (Sweden)

    Muhammad Asif

    2015-01-01

    One of the key requirements for mobile devices is to provide high-performance computing at lower power consumption. The processors used in these devices provide specific hardware resources to handle computationally intensive video processing and interactive graphical applications. Moreover, processors designed for low-power applications may introduce limitations on the availability and usage of resources, which present additional challenges to the system designers. Owing to the specific design of the JZ47x series of mobile application processors, a hybrid software-hardware implementation scheme for an H.264/AVC encoder is proposed in this work. The proposed scheme distributes the encoding tasks among hardware and software modules. A series of optimization techniques are developed to speed up memory access and data transfer among memories. Moreover, an efficient data-reuse design is proposed for the deblocking-filter video processing unit to reduce memory accesses. Furthermore, fine-grained macroblock (MB) level parallelism is effectively exploited and a pipelined approach is proposed for efficient utilization of the hardware processing cores. Finally, based on the parallelism in the proposed design, encoding tasks are distributed between two processing cores. Experiments show that the hybrid encoder is 12 times faster than a highly optimized sequential encoder due to the proposed techniques.
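
    The hardware/software split described above amounts to a producer-consumer pipeline over macroblocks: while one stage processes macroblock n, the next stage works on macroblock n-1. A minimal two-stage sketch using Python threads (the stage bodies are toy stand-ins, not the H.264 kernels):

```python
import queue
import threading

def run_pipeline(macroblocks):
    """Two-stage pipeline sketch: stage 1 stands in for a hardware
    transform/quantise unit, stage 2 for software entropy coding.
    The stages run concurrently on successive macroblocks."""
    q = queue.Queue(maxsize=4)            # small buffer between stages
    coded = []

    def stage1():
        for mb in macroblocks:
            q.put([v % 16 for v in mb])   # toy "quantisation"
        q.put(None)                       # end-of-stream marker

    def stage2():
        while True:
            mb = q.get()
            if mb is None:
                break
            coded.append(sum(mb))         # toy "entropy coding"

    t1 = threading.Thread(target=stage1)
    t2 = threading.Thread(target=stage2)
    t1.start(); t2.start()
    t1.join(); t2.join()
    return coded

out = run_pipeline([[1, 2, 3], [17, 18, 19]])   # [6, 6]
```

    The bounded queue models the on-chip buffer between the hardware core and the software stage; the encoder's speedup comes from keeping both stages busy at once.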

  5. Coding Local and Global Binary Visual Features Extracted From Video Sequences

    Science.gov (United States)

    Baroffio, Luca; Canclini, Antonio; Cesana, Matteo; Redondi, Alessandro; Tagliasacchi, Marco; Tubaro, Stefano

    2015-11-01

    Binary local features represent an effective alternative to real-valued descriptors, leading to comparable results for many visual analysis tasks, while being characterized by significantly lower computational complexity and memory requirements. When dealing with large collections, a more compact representation based on global features is often preferred, which can be obtained from local features by means of, e.g., the Bag-of-Visual-Word (BoVW) model. Several applications, including for example visual sensor networks and mobile augmented reality, require visual features to be transmitted over a bandwidth-limited network, thus calling for coding techniques that aim at reducing the required bit budget, while attaining a target level of efficiency. In this paper we investigate a coding scheme tailored to both local and global binary features, which aims at exploiting both spatial and temporal redundancy by means of intra- and inter-frame coding. In this respect, the proposed coding scheme can be conveniently adopted to support the Analyze-Then-Compress (ATC) paradigm. That is, visual features are extracted from the acquired content, encoded at remote nodes, and finally transmitted to a central controller that performs visual analysis. This is in contrast with the traditional approach, in which visual content is acquired at a node, compressed and then sent to a central unit for further processing, according to the Compress-Then-Analyze (CTA) paradigm. In this paper we experimentally compare ATC and CTA by means of rate-efficiency curves in the context of two different visual analysis tasks: homography estimation and content-based retrieval. Our results show that the novel ATC paradigm based on the proposed coding primitives can be competitive with CTA, especially in bandwidth limited scenarios.
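
    The inter-frame coding of binary features described here can be illustrated simply: XOR-ing a descriptor with its predecessor yields a sparse change mask that is cheaper to entropy-code than the raw bits. A toy sketch (not the paper's actual coding primitives):

```python
def xor_residual(prev, curr):
    """Inter-frame residual of two equal-length binary descriptors:
    set bits mark the descriptor positions that changed between frames."""
    return [a ^ b for a, b in zip(prev, curr)]

def hamming_weight(bits):
    return sum(bits)

prev = [1, 0, 1, 1, 0, 0, 1, 0]
curr = [1, 0, 1, 0, 0, 0, 1, 1]
res = xor_residual(prev, curr)
# only 2 of 8 positions changed, so the residual is sparser (hence cheaper
# to entropy-code) than the raw descriptor
assert hamming_weight(res) == 2 and hamming_weight(curr) == 4
```

    Intra-frame coding would encode `curr` directly; inter-frame coding transmits the sparse residual instead, exploiting the temporal redundancy the record refers to. Decoding is the same XOR applied to the previous descriptor.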

  7. Direct migration motion estimation and mode decision to decoder for a low-complexity decoder Wyner-Ziv video coding

    Science.gov (United States)

    Lei, Ted Chih-Wei; Tseng, Fan-Shuo

    2017-07-01

    This paper addresses the problem of high decoder computational complexity in traditional Wyner-Ziv video coding (WZVC). The key focus is migrating to the decoder two encoder algorithms that are traditionally computationally complex, namely motion estimation and mode decision. To reduce the resulting computational burden, the proposed architecture adopts the partial boundary matching algorithm and four flexible types of block mode decision at the decoder, which does away with the need for motion estimation and mode decision at the encoder. The experimental results show that the proposed padding-block-based WZVC not only decreases decoder complexity to approximately one hundredth of that of the state-of-the-art DISCOVER decoder but also outperforms the DISCOVER codec by up to 3 to 4 dB.
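    A minimal sketch of decoder-side motion search in the spirit of boundary matching: since the decoder lacks the original block, it picks the motion vector whose reference block best matches the already-decoded pixels bordering the current block. This is a simplified full-boundary variant for illustration only, not the paper's partial boundary matching algorithm; the frame sizes, block size, and search radius are arbitrary.

```python
import numpy as np

def boundary_matching_mv(ref, decoded, top, left, bs, radius):
    """Decoder-side motion search (simplified boundary matching): choose
    the motion vector whose reference block minimizes the sum of absolute
    differences against the already-decoded one-pixel boundary above and
    to the left of the current block."""
    up = decoded[top - 1, left:left + bs]        # decoded row above the block
    lf = decoded[top:top + bs, left - 1]         # decoded column left of the block
    best, best_cost = (0, 0), np.inf
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = top + dy, left + dx
            if y < 1 or x < 1 or y + bs > ref.shape[0] or x + bs > ref.shape[1]:
                continue                         # candidate falls outside the frame
            cost = (np.abs(ref[y - 1, x:x + bs] - up).sum()
                    + np.abs(ref[y:y + bs, x - 1] - lf).sum())
            if cost < best_cost:
                best, best_cost = (dy, dx), cost
    return best

rng = np.random.default_rng(1)
ref = rng.random((32, 32))
decoded = np.roll(np.roll(ref, -2, axis=0), -1, axis=1)  # current frame = ref shifted by (2, 1)
mv = boundary_matching_mv(ref, decoded, top=8, left=8, bs=4, radius=3)
```

    A real partial boundary matching scheme would use only the boundary samples actually available at that point in decoding; the principle of substituting decoded context for the missing original block is the same.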

  8. Video demystified

    CERN Document Server

    Jack, Keith

    2004-01-01

    This international bestseller and essential reference is the "bible" for digital video engineers and programmers worldwide. It is by far the most informative analog and digital video reference available and includes the hottest new trends and cutting-edge developments in the field. Video Demystified, Fourth Edition is a "one stop" reference guide for the various digital video technologies. The fourth edition is completely updated with all new chapters on MPEG-4, H.264, SDTV/HDTV, ATSC/DVB, and Streaming Video (Video over DSL, Ethernet, etc.), as well as discussions of the latest standards throughout. The accompanying CD-ROM is updated to include a unique set of video test files in the newest formats. *This essential reference is the "bible" for digital video engineers and programmers worldwide *Contains all new chapters on MPEG-4, H.264, SDTV/HDTV, ATSC/DVB, and Streaming Video *Completely revised with all the latest and most up-to-date industry standards.

  9. Recent development in the ASME O and M committee codes, standards, and guides

    International Nuclear Information System (INIS)

    Rowley, C.W.

    1999-01-01

    The ASME O and M Committee continues to expand and update its codes, standards, and guides as contained in the ASME OM Code and the ASME OM Standards/Guides. This paper describes recent changes to these two ASME documents, including technical inquiries, code cases, and the major reformat of the ASME OM Code 1998 Edition. Two new Parts to the ASME OM S/G, OM Part 23 and OM Part 24, which are close to initial publication, are also discussed. A third new Part to the ASME OM S/G, Part 26, 'Thermal Calibration of RTDs', has been authorized and has recently begun organizing. In addition, this paper describes the future plans for these two documents as provided in the O and M Committee Strategic Plan. (author)

  10. A repository of codes of ethics and technical standards in health informatics.

    Science.gov (United States)

    Samuel, Hamman W; Zaïane, Osmar R

    2014-01-01

    We present a searchable repository of codes of ethics and standards in health informatics, built using state-of-the-art search algorithms and technologies. The repository will be potentially beneficial for public health practitioners, researchers, and software developers in finding and comparing ethics topics of interest. Public health clinics, clinicians, and researchers can use the repository platform as a one-stop reference for various ethics codes and standards. In addition, the repository interface is built for easy navigation, fast search, and side-by-side comparative reading of documents. Our selection criteria for codes and standards are twofold: first, to maintain intellectual property rights, we index only codes and standards freely available on the internet; second, major international, regional, and national health informatics bodies across the globe are surveyed with the aim of understanding the landscape in this domain. We also look at prevalent technical standards in health informatics from major bodies such as the International Organization for Standardization (ISO) and the U.S. Food and Drug Administration (FDA). Our repository contains codes of ethics from the International Medical Informatics Association (IMIA), the iHealth Coalition (iHC), the American Health Information Management Association (AHIMA), the Australasian College of Health Informatics (ACHI), the British Computer Society (BCS), and the UK Council for Health Informatics Professions (UKCHIP), with room for adding more in the future. Our major contribution is enhancing the findability of codes and standards related to health informatics ethics through compilation and unified access in the health informatics ethics repository.

  11. MARS-KS code validation activity through the atlas domestic standard problem

    International Nuclear Information System (INIS)

    Choi, K. Y.; Kim, Y. S.; Kang, K. H.; Park, H. S.; Cho, S.

    2012-01-01

    The 2nd Domestic Standard Problem (DSP-02) exercise using the ATLAS integral effect test data was executed to transfer the integral effect test data to domestic nuclear industries and to contribute to improving the safety analysis methodology for PWRs. A small-break loss-of-coolant accident with a 6-inch break at the cold leg was selected as the target scenario in consideration of its technical importance and the interests of the participants. Ten calculation results using the MARS-KS code were collected; the major prediction results were described qualitatively, and the code prediction accuracy was assessed quantitatively using the FFTBM. In addition, special code assessment activities were carried out to identify areas where model improvement is required in the MARS-KS code. The lessons from DSP-02 and recommendations to the code developers are described in this paper. (authors)
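    The FFTBM accuracy measure mentioned above is commonly summarized by an average amplitude: the spectrum of the code-versus-experiment error normalized by the spectrum of the experimental signal, with smaller values indicating better agreement. Below is a minimal sketch assuming uniformly sampled signals; the traces are synthetic placeholders, not ATLAS data.

```python
import numpy as np

def fftbm_average_amplitude(exp, calc):
    """Average amplitude (AA) in the style of the FFT-based method used to
    quantify code accuracy: the magnitude spectrum of the prediction error
    normalized by the magnitude spectrum of the experimental signal."""
    err = np.fft.rfft(np.asarray(calc) - np.asarray(exp))
    ref = np.fft.rfft(np.asarray(exp))
    return np.abs(err).sum() / np.abs(ref).sum()

t = np.linspace(0.0, 10.0, 512)
experiment = 10.0 * np.exp(-0.3 * t)            # measured pressure-like trace
good = experiment + 0.05 * np.sin(5.0 * t)      # close prediction
poor = experiment + 1.00 * np.sin(5.0 * t)      # poor prediction
aa_good = fftbm_average_amplitude(experiment, good)
aa_poor = fftbm_average_amplitude(experiment, poor)
```

    In FFTBM-style assessments, an amplitude of this kind is typically computed per measured quantity (pressures, temperatures, flow rates) and then combined into an overall accuracy figure.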

  12. PERTV: A standard file version of the PERT-V code

    International Nuclear Information System (INIS)

    George, D.C.; LaBauve, R.J.

    1988-02-01

    The PERT-V code, used for two-dimensional perturbation-theory analysis of fast reactors, has been modified to accept input data from the standard files ISOTXS, GEODST, ZNATDN, NDXSRF, DLAYXS, RTFLUX, and ATFLUX. This modification has greatly reduced the additional input that must be supplied by the user. The new version of PERT-V, PERTV, has all the options of the original code, including a plotting capability. 10 refs., 3 figs., 12 tabs

  13. Contributions of the ORNL piping program to nuclear piping design codes and standards

    International Nuclear Information System (INIS)

    Moore, S.E.

    1975-11-01

    The ORNL Piping Program was conceived and established to develop basic information on the structural behavior of nuclear power plant piping components and to prepare this information in forms suitable for use in design analysis and codes and standards. One of the objectives was to develop and qualify stress indices and flexibility factors for direct use in Code-prescribed design analysis methods. Progress in this area is described

  14. Former Yugoslav Republic of Macedonia; Report on Observance of Standards and Codes: Fiscal Transparency Module

    OpenAIRE

    International Monetary Fund

    2006-01-01

    This report summarizes the Observance of Standards and Codes on Fiscal Transparency for the Former Yugoslav Republic of Macedonia. It provides an assessment of fiscal transparency practices in the Former Yugoslav Republic (FYR) of Macedonia against the requirements of the IMF Code of Good Practices on Fiscal Transparency, based on discussions with the authorities and other organizations and on a fiscal transparency questionnaire. It also provides recommendations for improving fiscal transparency.

  15. Final Technical Report for GO17004 Regulatory Logic: Codes and Standards for the Hydrogen Economy

    Energy Technology Data Exchange (ETDEWEB)

    Nakarado, Gary L. [Regulatory Logic LLC, Golden, CO (United States)

    2017-02-22

    The objectives of this project are to: develop a robust supporting research and development program to provide critical hydrogen behavior data and a detailed understanding of hydrogen combustion and safety across a range of scenarios, needed to establish setback distances in building codes and minimize the overall data gaps in code development; support and facilitate the completion of technical specifications by the International Organization for Standardization (ISO) for gaseous hydrogen refueling (TS 20012) and standards for on-board liquid (ISO 13985) and gaseous or gaseous-blend (ISO 15869) hydrogen storage by 2007; support and facilitate the effort, led by the NFPA, to complete the draft Hydrogen Technologies Code (NFPA 2) by 2008; with experimental data and input from Technology Validation Program element activities, support and facilitate the completion of standards for bulk hydrogen storage (e.g., NFPA 55) by 2008; facilitate the adoption of the most recently available model codes (e.g., from the International Code Council [ICC]) in key regions; complete preliminary research and development on hydrogen release scenarios to support the establishment of setback distances in building codes and provide a sound basis for model code development and adoption; support and facilitate the development of Global Technical Regulations (GTRs) by 2010 for hydrogen vehicle systems under the United Nations Economic Commission for Europe, World Forum for Harmonization of Vehicle Regulations and Working Party on Pollution and Energy Program (ECE-WP29/GRPE); and support and facilitate the completion by 2012 of the codes and standards needed for the early commercialization and market entry of hydrogen energy technologies.

  16. Technical Skills Training for Veterinary Students: A Comparison of Simulators and Video for Teaching Standardized Cardiac Dissection.

    Science.gov (United States)

    Allavena, Rachel E; Schaffer-White, Andrea B; Long, Hanna; Alawneh, John I

    The goal of the study was to evaluate alternative student-centered approaches that could replace autopsy sessions and live demonstration, and to explore refinements in assessment procedures for standardized cardiac dissection. Simulators and videos were identified as feasible, economical, student-centered teaching methods for technical skills training in medical contexts, and a direct comparison was undertaken. A low-fidelity, anatomically correct simulator approximately the size of a horse's heart, with embedded dissection pathways, was constructed and used with a series of laminated photographs of standardized cardiac dissection. A video of a standardized cardiac dissection of a normal horse's heart was recorded and presented with audio commentary. Students were allowed to nominate a preference for learning method, and students who indicated no preference were randomly allocated to keep group numbers even. Objective performance data from an objective structured assessment criterion and student perception data on confidence and competency from surveys showed that both innovations were similarly effective. Evaluator reflections were recorded, as were usage logs to track patterns of student use. A strong selection preference was identified: kinesthetic learners chose the simulator and visual learners chose the video. Students in the video cohort were better at articulating the reasons for the dissection procedures and sequence, owing to the audio commentary, and student satisfaction was higher with the video. The major conclusion of this study is that both methods are effective tools for technical skills training, but consideration should be given to the preferred learning styles of adult learners to maximize educational outcomes.

  17. Analysis of the Current Technical Issues on ASME Code and Standard for Nuclear Mechanical Design(2009)

    International Nuclear Information System (INIS)

    Koo, Gyeong Hoi; Lee, B. S.; Yoo, S. H.

    2009-11-01

    This report analyzes the current revision activity related to mechanical design issues in the U.S. ASME nuclear codes and standards. ASME nuclear mechanical design in this report comprises nuclear materials, the primary system, the secondary system, and high-temperature reactors. The report includes countermeasures, based on the ASME Code meetings, for the current issues in each major field. The KAMC (ASME Mirror Committee) of this project intends to reflect the standpoint of the domestic nuclear industry on ASME nuclear mechanical design and to play a technical bridging role for the domestic nuclear industry in applying the ASME Codes

  18. Report on the Current Technical Issues on ASME Nuclear Code and Standard

    International Nuclear Information System (INIS)

    Koo, Gyeong Hoi; Lee, B. S.; Yoo, S. H.

    2008-11-01

    This report analyzes the current revision activity related to mechanical design issues in the U.S. ASME nuclear codes and standards. ASME nuclear mechanical design in this report comprises nuclear materials, the primary system, the secondary system, and high-temperature reactors. The report includes countermeasures, based on the ASME Code meetings, for the current issues in each major field. The KAMC (ASME Mirror Committee) of this project intends to reflect the standpoint of the domestic nuclear industry on ASME nuclear mechanical design and to play a technical bridging role for the domestic nuclear industry in applying the ASME Codes

  20. Analyses in support of risk-informed natural gas vehicle maintenance facility codes and standards :

    Energy Technology Data Exchange (ETDEWEB)

    Ekoto, Isaac W.; Blaylock, Myra L.; LaFleur, Angela Christine; LaChance, Jeffrey L.; Horne, Douglas B.

    2014-03-01

    The development of safety standards for maintenance facilities that service large-scale vehicles fueled by liquefied or compressed gas is required to ensure proper facility design and operation envelopes. Standards development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase I work on existing NGV repair facility code requirements and highlights inconsistencies whose effectiveness needs quantitative analysis. A Hazard and Operability (HAZOP) study was performed to identify key scenarios of interest. Finally, scenario analyses were performed using detailed simulations and modeling to estimate the overpressure hazards for the HAZOP-defined scenarios. The results from Phase I will be used to identify significant risk contributors at NGV maintenance facilities, and are expected to form the basis for follow-on quantitative risk analysis work to address specific code requirements and identify effective accident prevention and mitigation strategies.

  1. Building America Guidance for Identifying and Overcoming Code, Standard, and Rating Method Barriers

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Pamala C.; Halverson, Mark A.

    2013-09-01

    The U.S. Department of Energy’s (DOE) Building America program implemented a new Codes and Standards Innovation (CSI) Team in 2013. The Team’s mission is to assist Building America (BA) research teams and partners in identifying and resolving conflicts between Building America innovations and the various codes and standards that govern the construction of residences. A CSI Roadmap was completed in September 2013. This guidance document was prepared using the information in the CSI Roadmap to provide BA research teams and partners with specific information and approaches for identifying and overcoming potential barriers to BA innovations arising in and/or stemming from codes, standards, and rating methods. For more information on the BA CSI team, please email: CSITeam@pnnl.gov

  2. Overview on pre-harmonization studies conducted by the Working Group on Codes and Standards

    International Nuclear Information System (INIS)

    Guinovart, J.

    1998-01-01

    For more than twenty years, the Working Group on Codes and Standards (WGCS) has been an Advisory Expert Group of the European Commission, and three subgroups were formed to consider manufacturing and inspection, structural mechanics, and materials topics. The WGCS seeks to promote studies at the pre-harmonization level for the clarification and building of consensus in the European Community on technical issues relevant to the integrity of safety-related components. It deals with the pre-standardization process for industrial codes whose rules apply to the design, construction, and operation of NPP components in the European Community

  3. Review of application code and standards for mechanical and piping design of HANARO fuel test loop

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. Y.

    1998-02-01

    The design and installation of the irradiation test facility for fuel performance verification tests are very important for maximizing the utilization of HANARO. The HANARO fuel test loop (FTL) was designed in accordance with the same codes and standards as a nuclear power plant, because the FTL will be operated at high pressure and temperature equivalent to nuclear power plant operating conditions. The objective of this study is to confirm the propriety of the codes and standards applied to the mechanical and piping design of the HANARO fuel test loop and to determine the technical specifications of the FTL systems. (author). 18 refs., 8 tabs., 6 figs.

  4. Japanese national project for establishment of codes and standards for stationary PEFC system

    International Nuclear Information System (INIS)

    Sumi, S.; Ohmura, T.; Yamaguchi, R.; Kikuzawa, H.

    2003-01-01

    For the purpose of practical utilization of the PEFC cogeneration system, we are promoting the national project 'Establishment of Codes and Standards for Stationary PEFC System'. The objective is to prepare the software platforms required in the introduction stage of PEFC cogeneration systems for their widespread use, such as codes and standards for safety, reliability, performance, and so on. Toward this objective, test and evaluation devices are being developed, various kinds of data are being collected, and test and evaluation methods are being established using test samples of the systems and the stacks. (author)

  5. Evaluation on applicability of the rules, regulations, and industrial codes and standards for SMART development

    International Nuclear Information System (INIS)

    Choi, Suhn; Lee, C C.; Lee, C.K.; Kim, K.K.; Kim, J.P.; Kim, J.H.; Cho, B.H.; Kang, D J.; Bae, G.H.; Chung, M.; Chang, M.H.

    1999-03-01

    In this report, the applicability of the rules, regulations, and industrial codes and standards to SMART development is evaluated. As the first step, the past-to-present status of licensing structures was reviewed. Then, the rules, regulations, and standards applied to YGN 3-6 were listed and reviewed. Finally, the applicability of these rules and standards to SMART is evaluated in each design field. In this step, technical evaluations of each item of the rules, regulations, and standards are made, and possible remedies or comments are suggested. The results are summarized in tabular form and enclosed as an Appendix. (Author). 8 refs., 5 tabs., 3 figs

  6. An adaptive mode-driven spatiotemporal motion vector prediction for wavelet video coding

    Science.gov (United States)

    Zhao, Fan; Liu, Guizhong; Qi, Yong

    2010-07-01

    Three-dimensional subband/wavelet codecs use 5/3 filters rather than Haar filters for motion-compensated temporal filtering (MCTF) to improve the coding gain. To curb the increased motion vector rate, an adaptive motion-mode-driven spatiotemporal motion vector prediction (AMDST-MVP) scheme is proposed. First, using the direction histograms of the four motion vector fields resulting from the initial spatial motion vector prediction (S-MVP), the motion mode of the current GOP is determined according to whether fast or complex motion exists in the GOP. The GOP-level MVP scheme is then chosen as either S-MVP or AMDST-MVP, where AMDST-MVP combines S-MVP with temporal MVP (T-MVP). If the latter is adopted, the motion vector difference (MVD) between the neighboring MV fields and the MV of the current block resulting from S-MVP is used to decide whether the MV of the co-located block in the previous frame is used to predict the current block. Experimental results show that AMDST-MVP not only improves the coding efficiency but also reduces the computational complexity.
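    The switching between spatial and temporal prediction can be illustrated with a toy example: keep a spatial median predictor when the neighboring MVs are coherent, and fall back to the co-located temporal MV when they are not. This is a simplified stand-in for the AMDST-MVP decision rule, not the paper's algorithm; the threshold and all MV values are hypothetical.

```python
import numpy as np

def predict_mv(neighbors, colocated, threshold=4.0):
    """Toy spatiotemporal MV predictor: take the component-wise median of
    the spatial neighbor MVs; if the neighbors deviate from that median by
    more than `threshold` on average (complex local motion), fall back to
    the temporal MV of the co-located block in the previous frame."""
    n = np.asarray(neighbors, dtype=float)
    spatial = np.median(n, axis=0)                   # spatial (S-MVP-like) candidate
    mvd = np.abs(n - spatial).sum(axis=1).mean()     # average disagreement with neighbors
    if mvd > threshold:
        return np.asarray(colocated, dtype=float)    # temporal (T-MVP-like) fallback
    return spatial

mv_smooth = predict_mv([(2, 0), (2, 1), (3, 0)], colocated=(8, 8))      # coherent motion
mv_complex = predict_mv([(9, -7), (-6, 5), (0, 12)], colocated=(1, 1))  # chaotic motion
```

    Coding only the difference between the actual MV and such a predictor is what keeps the motion vector rate in check.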

  7. Changing priorities of codes and standards -- quality engineering: Experiences in plant construction, maintenance, and operation

    International Nuclear Information System (INIS)

    Antony, D.D.; Suleski, P.F.; Meier, J.C.

    1994-01-01

    Application of the ASME Code across various fossil and nuclear plants necessitates a company approach adapted to the unique status of each plant. This arises from state statutes, federal regulations, and consideration of each plant's as-built history over a broad time frame of design, construction, and operation. Additionally, the National Board Inspection Code accompanies Minnesota statutes for plants owned by Northern States Power Company (NSP). This paper addresses some key points on NSP's use of the ASME Code as a principal mechanical standard in plant design, construction, and operation. A primary resource facilitating review of Code provisions is accurate status of the current plant configuration. As plant design changes arise, the Code Edition/Addenda of original construction and installed upgrades or replacements are weighed against the options allowed by current standards and dialog with the jurisdictional authority. Consistent with the overall goal of safe and reliable plant operation, numerous Code details and future needs must be addressed in concert with expected plant economics and planned outages for implementation. The discussion begins in the late 1960s with the new construction of Monticello and Prairie Island (both nuclear), continues through Sherburne County Units 1 through 3 (fossil), and covers their changes, replacements, or repairs as operating plants

  8. JAERI thermal reactor standard code system for reactor design and analysis SRAC

    International Nuclear Information System (INIS)

    Tsuchihashi, Keichiro

    1985-01-01

    SRAC, the JAERI thermal reactor standard code system for reactor design and analysis developed at the Japan Atomic Energy Research Institute, supports all types of thermal-neutron nuclear design and analysis. The code system has undergone extensive verification to confirm its functions and has been used in core modification of a research reactor, detailed design of a multi-purpose high-temperature gas reactor, and analysis of experiments with a critical assembly. In a nuclear calculation with the code system, a multigroup lattice calculation is first made with the libraries; then, with the resulting homogeneous equivalent group constants, the reactor core calculation is made. Described are the following: the purpose and development of the code system, the functions of the SRAC system, benchmark tests, and the state of usage and future development. (Mori, K.)
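    The lattice-to-core step described above rests on flux-weighted condensation: fine-group cross sections are collapsed into few-group homogeneous constants that preserve reaction rates. The sketch below illustrates the generic formula, not SRAC's actual implementation; all numbers are placeholders.

```python
import numpy as np

def collapse_xs(sigma, flux, groups):
    """Flux-weighted condensation of fine-group cross sections into
    few-group constants: sigma_G = sum_g(sigma_g * phi_g) / sum_g(phi_g)
    over each coarse group G, so group-wise reaction rates are preserved."""
    out = []
    for lo, hi in groups:                 # coarse group G spans fine groups [lo, hi)
        phi = flux[lo:hi]
        out.append((sigma[lo:hi] * phi).sum() / phi.sum())
    return np.array(out)

sigma = np.array([1.0, 1.2, 1.4, 8.0, 10.0])   # fine-group cross sections (1/cm)
flux = np.array([4.0, 3.0, 1.0, 0.5, 0.5])     # lattice spectrum weights
two_group = collapse_xs(sigma, flux, [(0, 3), (3, 5)])
```

    The resulting few-group constants are what a core-level diffusion or transport calculation then consumes.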

  9. End-of-life decisions in Malaysia: Adequacies of ethical codes and developing legal standards.

    Science.gov (United States)

    Kassim, Puteri Nemie Jahn; Alias, Fadhlina

    2015-06-01

    End-of-life decision-making is an area of medical practice in which ethical dilemmas and legal interventions have become increasingly prevalent. Decisions are no longer confined to clinical assessments; rather, they involve wider considerations such as a patient's religious and cultural beliefs, financial constraints, and the wishes and needs of family members. These decisions affect everyone concerned, including members of the community as a whole. It is therefore imperative that clear ethical codes and legal standards be developed to help guide the medical profession on the best possible course of action for patients. This article considers the relevant ethical codes and legal provisions in Malaysia governing certain aspects of end-of-life decision-making. It highlights the lack of judicial decisions in this area as well as the limitations of the Malaysian regulatory system. The article recommends the development of comprehensive ethical codes and legal standards to guide end-of-life decision-making in Malaysia.

  10. Up to code: does your company's conduct meet world-class standards?

    Science.gov (United States)

    Paine, Lynn; Deshpandé, Rohit; Margolis, Joshua D; Bettcher, Kim Eric

    2005-12-01

    Codes of conduct have long been a feature of corporate life. Today, they are arguably a legal necessity--at least for public companies with a presence in the United States. But the issue goes beyond U.S. legal and regulatory requirements. Sparked by corruption and excess of various types, dozens of industry, government, investor, and multisector groups worldwide have proposed codes and guidelines to govern corporate behavior. These initiatives reflect an increasingly global debate on the nature of corporate legitimacy. Given the legal, organizational, reputational, and strategic considerations, few companies will want to be without a code. But what should it say? Apart from a handful of essentials spelled out in Sarbanes-Oxley regulations and NYSE rules, authoritative guidance is sorely lacking. In search of some reference points for managers, the authors undertook a systematic analysis of a select group of codes. In this article, they present their findings in the form of a "codex," a reference source on code content. The Global Business Standards Codex contains a set of overarching principles as well as a set of conduct standards for putting those principles into practice. The GBS Codex is not intended to be adopted as is, but is meant to be used as a benchmark by those wishing to create their own world-class code. The provisions of the codex must be customized to a company's specific business and situation; individual companies' codes will include their own distinctive elements as well. What the codex provides is a starting point grounded in ethical fundamentals and aligned with an emerging global consensus on basic standards of corporate behavior.

  11. 75 FR 66725 - National Fire Protection Association (NFPA) Proposes To Revise Codes and Standards

    Science.gov (United States)

    2010-10-29

    ... [flattened table excerpt listing proposed standards with associated dates:] Standard for Solvent Extraction Plants -- 5/23/2011; NFPA 51--2007, Standard for the Design and ... -- 11/23/2010; ... Practice for a Field Flame Test for Textiles and Films -- 5/23/2011; NFPA 909--2010, Code for the Protection of ...

  12. Building America Guidance for Identifying and Overcoming Code, Standard, and Rating Method Barriers

    Energy Technology Data Exchange (ETDEWEB)

    Cole, P. C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Halverson, M. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-09-01

    This guidance document was prepared using the input from the meeting summarized in the draft CSI Roadmap to provide Building America research teams and partners with specific information and approaches to identifying and overcoming potential barriers to Building America innovations arising in and/or stemming from codes, standards, and rating methods.

  13. Reliability improvement: where do we go from here. The role of codes and standards

    International Nuclear Information System (INIS)

    Davidson, R.H.

    1976-01-01

    The role of codes and standards in contributing to future reliability improvement is discussed, and the Nuclear Plant Reliability Data System is examined. It is suggested that two systems of this type are needed: one focusing on component and system reliability, the other on assessment of system availability, capacity factor, and forced outage rate

  14. Adapting Canada's northern infrastructure to climate change: the role of codes and standards

    International Nuclear Information System (INIS)

    Steenhof, P.

    2009-01-01

    This report presents the results of a research project that investigated the potential of codes and standards for fostering adaptation of Canada's northern built infrastructure to the future impacts of climate change. The work involved a literature review, key informant interviews, and a workshop in which key stakeholders discussed the challenges climate change poses for built infrastructure in the north and the role of codes and standards in mitigating climate change risk. This article focuses on data and information requirements related to climate and climate change, an important focal area identified through the broader research effort, since adequate data are essential if codes and standards are to meet their ultimate policy objective. Several priorities specific to data and information needs were identified: northerners should be included in developing the climate and permafrost data required for codes and standards, so that these reflect the unique geographical, economic, and cultural realities and variability of the north; climate design values should be realigned to reflect both present and future risks; better information is needed on the rate and extent of permafrost degradation in the north; and monitoring of the rate of climate change in the Arctic should be improved. (author)

  15. Cape Verde Report on the Observance of Standards and Codes : Accounting and Auditing

    OpenAIRE

    World Bank

    2012-01-01

    This Report on the Observance of Standards and Codes (ROSC) provides an assessment of the strengths and weaknesses of the existing financial reporting infrastructure that underpins financial accounting and auditing practices in Cape Verde. The assessment focuses on six pillars of financial reporting infrastructure: statutory framework, professional education and training, accountancy profe...

  16. Discrete-ordinates electron transport calculations using standard neutron transport codes

    International Nuclear Information System (INIS)

    Morel, J.E.

    1979-01-01

    The primary purpose of this work was to develop a method for using standard neutron transport codes to perform electron transport calculations. The method is to develop approximate electron cross sections which are sufficiently well-behaved to be treated with standard S_n methods, but which nonetheless yield flux solutions which are very similar to the exact solutions. The main advantage of this approach is that, once the approximate cross sections are constructed, their multigroup Legendre expansion coefficients can be calculated and input to any standard S_n code. Discrete-ordinates calculations were performed to determine the accuracy of the flux solutions for problems corresponding to 1.0-MeV electrons incident upon slabs of aluminum and gold. All S_n calculations were compared with similar calculations performed with an electron Monte Carlo code, considered to be exact. In all cases, the discrete-ordinates solutions for integral flux quantities (i.e., scalar flux, energy deposition profiles, etc.) are generally in agreement with the Monte Carlo solutions to within approximately 5% or less. The central conclusion is that integral electron flux quantities can be efficiently and accurately calculated using standard S_n codes in conjunction with approximate cross sections. Furthermore, if group structures and approximate cross section construction are optimized, accurate differential flux energy spectra may also be obtainable without having to use an inordinately large number of energy groups. 1 figure
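    The core step in this method, computing multigroup Legendre expansion coefficients of the approximate cross sections for input to a standard S_n code, can be sketched numerically (Python with NumPy). The forward-peaked angular shape and the screening parameter eta below are illustrative stand-ins, not the paper's approximate cross sections:

```python
import numpy as np
from numpy.polynomial.legendre import Legendre, leggauss

# Legendre moments sigma_l = integral over [-1, 1] of sigma(mu) * P_l(mu) dmu,
# evaluated with Gauss-Legendre quadrature; these expansion coefficients are
# what a standard S_n code accepts as multigroup scattering input.

def legendre_moments(sigma, lmax, npts=64):
    mu, w = leggauss(npts)  # quadrature nodes and weights on [-1, 1]
    return [float(np.sum(w * sigma(mu) * Legendre.basis(l)(mu)))
            for l in range(lmax + 1)]

# Illustrative forward-peaked (screened-Rutherford-like) angular shape;
# eta is an assumed screening parameter, not a value from the paper.
eta = 0.05
sigma = lambda mu: 1.0 / (1.0 + eta - mu) ** 2

m = legendre_moments(sigma, lmax=8)
# For such a forward-peaked shape the moments are positive and decay slowly
# with l, which is exactly what makes raw electron cross sections hard to
# treat with standard S_n methods.
```

The slow decay of the higher moments is the "ill-behaved" feature the paper's approximate cross sections are constructed to tame.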

  17. ASME nuclear codes and standards: Scope of coverage and current initiatives

    International Nuclear Information System (INIS)

    Eisenberg, G. M.

    1995-01-01

    The objective of this paper is to address the broad scope of coverage of nuclear codes, standards and guides produced and administered by the American Society of Mechanical Engineers (ASME). Background information is provided regarding the evolution of the present activities. Details are provided on current initiatives intended to permit ASME to meet the needs of a changing nuclear industry on a worldwide scale. During the early years of commercial nuclear power, ASME produced a code for the construction of nuclear vessels used in the reactor coolant pressure boundary, containment and auxiliary systems. In response to industry growth, ASME Code coverage soon broadened to include rules for construction of other nuclear components, and inservice inspection of nuclear reactor coolant systems. In the years following this, the scope of ASME nuclear codes, standards and guides has been broadened significantly to include air cleaning activities for nuclear power reactors, operation and maintenance of nuclear power plants, quality assurance programs, cranes for nuclear facilities, qualification of mechanical equipment, and concrete reactor vessels and containments. ASME focuses on globalization of its codes, standards and guides by encouraging and promoting their use in the international community and by actively seeking participation of international members on its technical and supervisory committees and in accreditation activities. Details are provided on current international representation. Initiatives are underway to separate the technical requirements from administrative and enforcement requirements, to convert to hard metric units, to provide for non-U. S. materials, and to provide for translations into non-English languages. ASME activity as an accredited ISO 9000 registrar for suppliers of mechanical equipment is described. Rules are being developed for construction of containment systems for nuclear spent fuel and high-level waste transport packagings. 

  18. Harmonization of nuclear codes and standards, pacific nuclear council working and task group report

    International Nuclear Information System (INIS)

    Dua, S.S.

    2006-01-01

    Full text: Codes and standards, both at the national and international level, have had a major impact on the industry worldwide and have served it well in maintaining the performance and safety of nuclear reactors and facilities. Codes and standards are, in general, consensus documents and seek public input at various levels before they are finalized and rolled out for use by nuclear vendors, consultants, utilities and regulatory bodies. However, the extensive development of prescriptive national standards, if unchecked against the global environment and trade agreements (NAFTA, WTO, etc.), can also create barriers and make it difficult to compete in the world market. During the last decade, the national and international standards-writing bodies have recognized these issues and are moving towards the rationalization and harmonization of their standards with the more widely accepted generic standards. The Pacific Nuclear Council (PNC) recognized the need for harmonization of the nuclear codes and standards for its member countries and formed a Task Group to achieve its objectives. The Task Group has a number of members from the PNC member countries. In 2005 PNC further raised the importance of this activity and formed a Working Group to cover a broader scope. The Working Group (WG) mandate is to identify and analyze the different codes and standards introduced in the Pacific Basin region, in order to achieve mutual understanding, harmonization and application in each country. This requires the WG to develop and encourage the use of reasonably consistent criteria for the design and development, engineering, procurement, fabrication, construction, testing, operations, maintenance, waste management, decommissioning and management of commercial nuclear power plants in the Pacific Basin so as to: promote consistent safety, quality, environmental and management standards for nuclear energy and other peaceful applications of nuclear

  19. Using video-taped examples of standardized patient to teach medical students taking informed consent

    Directory of Open Access Journals (Sweden)

    SHIRIN HABIBI KHORASANI

    2015-04-01

    Full Text Available Introduction: Medical students should be trained in medical ethics, and one of the most essential issues in this field is taking informed consent. In this research, we compared the effectiveness of teaching methods on students' ability to take informed consent from patients. Methods: This semi-experimental study was carried out on fifty-eight subjects from the 4th-year students of Shiraz University of Medical Sciences who attended the medical ethics course before their clinical clerkship training. The method of sampling was census, and students were randomly allocated into two groups: a control group (n=28), trained in a traditional lecture-based class, and a case group named A1 (n=22), taught by video-taped examples of a standardized patient. The A1 group then attended traditional lecture-based classes and was named A2. The groups were evaluated on their ability to recognize ethical issues through a scenario-based ethical examination before and after each training. Scenarios were related to the topics of informed consent. Data were analyzed with SPSS 14 software using descriptive statistics and the ANOVA test. A p-value of less than 0.05 was considered significant. Results: The mean scores of the A2, A1 and B groups were found to be 7.21, 5.91 and 5.73 out of 8, respectively. Comparison between the groups demonstrated that the ability to take informed consent was significantly higher in the A2 group (p<0.001), followed by the A1 group (p<0.05), and was lowest in the B group (p=0.875). Conclusion: According to this research, lecture-based teaching is still of great value in teaching medical ethics, but when combined with a standardized patient, the outcome is much better. Mixed methods of teaching should be used together for better results.

  20. Using video-taped examples of standardized patient to teach medical students taking informed consent.

    Science.gov (United States)

    Habibi Khorasani, Shirin; Ebrahimi, Sedigheh

    2015-04-01

    Medical students should be trained in medical ethics, and one of the most essential issues in this field is taking informed consent. In this research, we compared the effectiveness of teaching methods on students' ability to take informed consent from patients. This semi-experimental study was carried out on fifty-eight subjects from the 4th-year students of Shiraz University of Medical Sciences who attended the medical ethics course before their clinical clerkship training. The method of sampling was census, and students were randomly allocated into two groups: a control group (n=28), trained in a traditional lecture-based class, and a case group named A1 (n=22), taught by video-taped examples of a standardized patient. The A1 group then attended traditional lecture-based classes and was named A2. The groups were evaluated on their ability to recognize ethical issues through a scenario-based ethical examination before and after each training. Scenarios were related to the topics of informed consent. Data were analyzed with SPSS 14 software using descriptive statistics and the ANOVA test. A p-value of less than 0.05 was considered significant. The mean scores of the A2, A1 and B groups were found to be 7.21, 5.91 and 5.73 out of 8, respectively. Comparison between the groups demonstrated that the ability to take informed consent was significantly higher in the A2 group (p<0.001), followed by the A1 group (p<0.05), and was lowest in the B group (p=0.875). Lecture-based teaching is still of great value in teaching medical ethics, but when combined with a standardized patient, the outcome is much better. Mixed methods of teaching should be used together for better results.

  1. Design validation of the ITER EC upper launcher according to codes and standards

    Energy Technology Data Exchange (ETDEWEB)

    Spaeh, Peter, E-mail: peter.spaeh@kit.edu [Karlsruhe Institute of Technology, Institute for Applied Materials, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany); Aiello, Gaetano [Karlsruhe Institute of Technology, Institute for Applied Materials, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany); Gagliardi, Mario [Karlsruhe Institute of Technology, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany); F4E, Fusion for Energy, Joint Undertaking, Barcelona (Spain); Grossetti, Giovanni; Meier, Andreas; Scherer, Theo; Schreck, Sabine; Strauss, Dirk; Vaccaro, Alessandro [Karlsruhe Institute of Technology, Institute for Applied Materials, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany); Weinhorst, Bastian [Karlsruhe Institute of Technology, Institute for Neutron Physics and Reactor Technology, Association KIT-EURATOM, P.O. Box 3640, D-76021 Karlsruhe (Germany)

    2015-10-15

    Highlights: • A set of applicable codes and standards has been chosen for the ITER EC upper launcher. • For a particular component, load combinations, failure modes and stress categorizations have been determined. • The design validation was performed in accordance with the “design by analysis” approach of the ASME boiler and pressure vessel code, Section III. - Abstract: The ITER electron cyclotron (EC) upper launcher passed the CDR (conceptual design review) in 2005 and the PDR (preliminary design review) in 2009 and is now in its final design phase. The final design will be elaborated by the European consortium ECHUL-CA, with contributions from several research institutes in Germany, Italy, the Netherlands and Switzerland. Within this consortium, KIT is responsible for the design of the structural components (the upper port plug, UPP) and also for the design integration of the launcher. As the selection of applicable codes and standards had been under discussion for the past decade, the conceptual and preliminary designs of the launcher structure were not elaborated in strict accordance with a particular code but with a variety of well-acknowledged engineering practices. For the final design it is compulsory to validate the design against a typical engineering code in order to comply with the ITER quality and nuclear requirements and to gain acceptance from the French regulator. This paper presents a typical design validation of the closure plate, which is the vacuum and tritium barrier and thus a safety-relevant component of the upper port plug (UPP), performed with the ASME boiler and pressure vessel code. Rationales for choosing this code are given, as well as a comparison between different design methods, such as the “design by rule” and “design by analysis” approaches. The selection of proper load specifications and the identification of potential failure modes are also covered. In addition, stress categorizations, analyses

  2. Design validation of the ITER EC upper launcher according to codes and standards

    International Nuclear Information System (INIS)

    Spaeh, Peter; Aiello, Gaetano; Gagliardi, Mario; Grossetti, Giovanni; Meier, Andreas; Scherer, Theo; Schreck, Sabine; Strauss, Dirk; Vaccaro, Alessandro; Weinhorst, Bastian

    2015-01-01

    Highlights: • A set of applicable codes and standards has been chosen for the ITER EC upper launcher. • For a particular component, load combinations, failure modes and stress categorizations have been determined. • The design validation was performed in accordance with the “design by analysis” approach of the ASME boiler and pressure vessel code, Section III. - Abstract: The ITER electron cyclotron (EC) upper launcher passed the CDR (conceptual design review) in 2005 and the PDR (preliminary design review) in 2009 and is now in its final design phase. The final design will be elaborated by the European consortium ECHUL-CA, with contributions from several research institutes in Germany, Italy, the Netherlands and Switzerland. Within this consortium, KIT is responsible for the design of the structural components (the upper port plug, UPP) and also for the design integration of the launcher. As the selection of applicable codes and standards had been under discussion for the past decade, the conceptual and preliminary designs of the launcher structure were not elaborated in strict accordance with a particular code but with a variety of well-acknowledged engineering practices. For the final design it is compulsory to validate the design against a typical engineering code in order to comply with the ITER quality and nuclear requirements and to gain acceptance from the French regulator. This paper presents a typical design validation of the closure plate, which is the vacuum and tritium barrier and thus a safety-relevant component of the upper port plug (UPP), performed with the ASME boiler and pressure vessel code. Rationales for choosing this code are given, as well as a comparison between different design methods, such as the “design by rule” and “design by analysis” approaches. The selection of proper load specifications and the identification of potential failure modes are also covered. In addition, stress categorizations, analyses

  3. Harmonization of nuclear codes and standards. Pacific nuclear council working and task group report

    International Nuclear Information System (INIS)

    Dua, Shami

    2008-01-01

    Nuclear codes and standards have been an integral part of the nuclear industry since its inception. As the industry came into the mainstream over the second half of the 20th century, a number of national and international standards were developed to support specific nuclear reactor concepts. These codes and standards have been a key component of the industry's focus on nuclear safety, reliability and quality. Both national and international standards have served the industry well in obtaining public, shareholder and regulatory acceptance. The existing suite of national and international standards is required to support the emerging nuclear renaissance. However, as noted above under the Pacific Nuclear Council (PNC), Manufacturing Design Evaluation Program (MDEP) and SMiRT discussions, the time has come for the codes and standards writing bodies and the industry to take the next step and examine the relevance of the existing suite in view of current needs and challenges. This review must account for the changing global environment, including the global supply chain and regulatory framework, resources, deregulation, free trade, and the industry's need for competitiveness and performance excellence. The Task Group (TG) has made limited progress in this review period, as no additional information on the listing of codes and standards has been received from the members. However, the TG Chair has been successful in obtaining considerable interest from additional individuals from the member countries. It is important that PNC management seek additional participation from the member countries and ask for their active engagement in the Working Group (WG) and TG activities to achieve its mandate and deliverables. The harmonization of codes and standards is a key area for the emerging nuclear renaissance, and a number of international organizations have noted (refer to the MDEP action above) that these tasks cannot be completed unless we have the right level of resources and

  4. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This manual covers an array of modules written for the SCALE package, consisting of drivers, system libraries, cross section and materials properties libraries, input/output routines, storage modules, and help files

  5. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This manual covers an array of modules written for the SCALE package, consisting of drivers, system libraries, cross section and materials properties libraries, input/output routines, storage modules, and help files.

  6. Simulation of International Standard Problem No. 44 'KAEVER' experiments on aerosol behaviour with the CONTAIN code

    International Nuclear Information System (INIS)

    Kljenak, I.

    2001-01-01

    Experiments on aerosol behavior in a vapor-saturated atmosphere, which were performed in the KAEVER experimental facility and proposed for OECD International Standard Problem No. 44, were simulated with the CONTAIN thermal-hydraulic computer code. The purpose of the work was to assess the capability of the CONTAIN code to model aerosol condensation and deposition in the containment of a light-water-reactor nuclear power plant under severe accident conditions. Results for dry and wet aerosol concentrations are presented and analyzed. (author)

  7. The applicability of ALPHA/PHOENIX/ANC nuclear design code system on Korean standard PWR's

    International Nuclear Information System (INIS)

    Lee, Kookjong; Choi, Kie-Yong; Lee, Hae-Chan; Roh, Eun-Rae

    1996-01-01

    For the Korean Standard Nuclear Power Plant (KSNPP), designed on the basis of the Combustion Engineering (CE) System 80, the Westinghouse nuclear design code system ALPHA/PHOENIX/ANC was applied to the follow-up design of the initial and reload cores of KSNPP. The follow-up design results for Yonggwang Unit 3 Cycles 1 and 2 and Yonggwang Unit 4 Cycle 1 have shown good agreement with the measured data. The assemblywise power distributions have shown average differences of less than 2%, and critical boron concentrations have shown differences of less than 20 ppm. All the low power physics test parameters are in good agreement. Consequently, the APA design code system can be applied to KSNPP cores. (author)

  8. Probable mode prediction for H.264 advanced video coding P slices using removable SKIP mode distortion estimation

    Science.gov (United States)

    You, Jongmin; Jeong, Jechang

    2010-02-01

    The H.264/AVC (advanced video coding) is used in a wide variety of applications including digital broadcasting and mobile applications, because of its high compression efficiency. The variable block mode scheme in H.264/AVC contributes much to its high compression efficiency but causes a selection problem. In general, rate-distortion optimization (RDO) is the optimal mode selection strategy, but it is computationally intensive. For this reason, the H.264/AVC encoder requires a fast mode selection algorithm for use in applications that require low-power and real-time processing. A probable mode prediction algorithm for the H.264/AVC encoder is proposed. To reduce the computational complexity of RDO, the proposed method selects probable modes among all allowed block modes using removable SKIP mode distortion estimation. Removable SKIP mode distortion is used to estimate whether or not a further divided block mode is appropriate for a macroblock. It is calculated using a no-motion reference block with a few computations. Then the proposed method reduces complexity by performing the RDO process only for probable modes. Experimental results show that the proposed algorithm can reduce encoding time by an average of 55.22% without significant visual quality degradation and increased bit rate.
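    The pruning strategy described in this abstract, estimating SKIP-mode distortion cheaply and then running full rate-distortion optimization only on the modes that survive, can be illustrated with a toy sketch (Python). All distortion/rate numbers, the threshold, and the Lagrange multiplier below are hypothetical, not values from the paper:

```python
# Sketch of rate-distortion-optimized (RDO) mode selection with a
# SKIP-distortion pruning step, in the spirit of the algorithm above.
# All numeric values are illustrative assumptions.

def rd_cost(distortion, rate, lam):
    """Lagrangian rate-distortion cost J = D + lambda * R."""
    return distortion + lam * rate

def select_mode(candidates, skip_distortion, threshold, lam):
    """candidates: dict mode_name -> (distortion, rate).
    A small SKIP-mode distortion estimate (computed from a no-motion
    reference block) suggests a homogeneous macroblock, so finer
    sub-block modes are pruned before the expensive RDO pass."""
    if skip_distortion < threshold:
        probable = {m: dr for m, dr in candidates.items()
                    if m in ("SKIP", "16x16")}   # coarse modes only
    else:
        probable = candidates                     # full mode search
    return min(probable, key=lambda m: rd_cost(*probable[m], lam))

# Hypothetical (distortion, rate) pairs for one macroblock:
modes = {"SKIP": (120.0, 1), "16x16": (90.0, 24), "8x8": (70.0, 80)}
best = select_mode(modes, skip_distortion=120.0, threshold=200.0, lam=2.0)
```

The encoding-time saving reported in the paper comes from the pruned branch: RDO is evaluated for two modes instead of the full set whenever the cheap SKIP estimate is small.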

  9. Video Classification and Adaptive QoP/QoS Control for Multiresolution Video Applications on IPTV

    Directory of Open Access Journals (Sweden)

    Huang Shyh-Fang

    2012-01-01

    Full Text Available With the development of heterogeneous networks and video coding standards, multiresolution video applications over networks have become important. It is critical to ensure the service quality of the network for time-sensitive video services. Worldwide Interoperability for Microwave Access (WIMAX) is a good candidate for delivering video signals, because through WIMAX the delivery quality based on the quality-of-service (QoS) setting can be guaranteed. The selection of suitable QoS parameters is, however, not trivial for service users. Instead, what a video service user is really concerned with is the video quality of presentation (QoP), which includes the video resolution, the fidelity, and the frame rate. In this paper, we present a quality control mechanism in multiresolution video coding structures over WIMAX networks and also investigate the relationship between QoP and QoS in end-to-end connections. Consequently, the video presentation quality can simply be mapped to the network requirements by a mapping table, and then the end-to-end QoS is achieved. We performed experiments with multiresolution MPEG coding over WIMAX networks. In addition to the QoP parameters, video characteristics such as picture activity and video mobility also affect the QoS significantly.
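    The mapping-table idea in this abstract, translating a user-facing QoP setting into network QoS parameters, can be sketched as a simple lookup (Python). The table entries below are invented placeholders, not the measured values from the study:

```python
# Minimal sketch of a QoP -> QoS mapping table for multiresolution video
# over a QoS-capable link such as WIMAX. All entries are illustrative
# placeholders, not measurements from the paper.

QOP_TO_QOS = {
    # (resolution, frame_rate_fps): required network QoS
    ("QCIF", 15): {"min_rate_kbps": 128,  "max_delay_ms": 150},
    ("CIF",  30): {"min_rate_kbps": 512,  "max_delay_ms": 100},
    ("SD",   30): {"min_rate_kbps": 2000, "max_delay_ms": 100},
}

def qos_for(resolution, fps):
    """Map a user-facing presentation quality (QoP) to network QoS needs."""
    try:
        return QOP_TO_QOS[(resolution, fps)]
    except KeyError:
        raise ValueError(f"no QoS profile for {resolution}@{fps}fps")

profile = qos_for("CIF", 30)
```

In the paper's scheme the table would also be conditioned on video characteristics such as picture activity and mobility, since those shift the bit rate needed for the same QoP.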

  10. Verification of thermal-hydraulic computer codes against standard problems for WWER reflooding

    International Nuclear Information System (INIS)

    Alexander D Efanov; Vladimir N Vinogradov; Victor V Sergeev; Oleg A Sudnitsyn

    2005-01-01

    Full text of publication follows: The computational assessment of reactor core components behavior under accident conditions is impossible without knowledge of the thermal-hydraulic processes occurring in this case. The adequacy of the results obtained using the computer codes to the real processes is verified by carrying out a number of standard problems. In 2000-2003, the fulfillment of three Russian standard problems on WWER core reflooding was arranged using the experiments on full-height electrically heated WWER 37-rod bundle model cooldown in regimes of bottom (SP-1), top (SP-2) and combined (SP-3) reflooding. The representatives from the eight MINATOM's organizations took part in this work, in the course of which the 'blind' and posttest calculations were performed using various versions of the RELAP5, ATHLET, CATHARE, COBRA-TF, TRAP, KORSAR computer codes. The paper presents a brief description of the test facility, test section, test scenarios and conditions as well as the basic results of computational analysis of the experiments. The analysis of the test data revealed a significantly non-one-dimensional nature of cooldown and rewetting of heater rods heated up to a high temperature in a model bundle. This was most pronounced at top and combined reflooding. The verification of the model reflooding computer codes showed that most of computer codes fairly predict the peak rod temperature and the time of bundle cooldown. The exception is provided by the results of calculations with the ATHLET and CATHARE codes. The nature and rate of rewetting front advance in the lower half of the bundle are fairly predicted practically by all computer codes. The disagreement between the calculations and experimental results for the upper half of the bundle is caused by the difficulties of computational simulation of multidimensional effects by 1-D computer codes. In this regard, a quasi-two-dimensional computer code COBRA-TF offers certain advantages. 

  11. High-Penetration Photovoltaics Standards and Codes Workshop, Denver, Colorado, May 20, 2010: Workshop Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Coddington, M.; Kroposki, B.; Basso, T.; Lynn, K.; Herig, C.; Bower, W.

    2010-09-01

    Effectively interconnecting high-penetration photovoltaic (PV) systems requires careful technical attention to ensuring compatibility with electric power systems. Standards, codes, and implementation have been cited as major impediments to widespread use of PV within electric power systems. On May 20, 2010, in Denver, Colorado, the National Renewable Energy Laboratory, in conjunction with the U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE), held a workshop to examine the key technical issues and barriers associated with high PV penetration levels, with an emphasis on codes and standards. The workshop built upon the results of the High Penetration of Photovoltaic (PV) Systems into the Distribution Grid workshop held in Ontario, California, on February 24-25, 2009, and upon presentations from a diverse set of stakeholders.

  12. JJ1017 committee report: image examination order codes--standardized codes for imaging modality, region, and direction with local expansion: an extension of DICOM.

    Science.gov (United States)

    Kimura, Michio; Kuranishi, Makoto; Sukenobu, Yoshiharu; Watanabe, Hiroki; Tani, Shigeki; Sakusabe, Takaya; Nakajima, Takashi; Morimura, Shinya; Kabata, Shun

    2002-06-01

    The digital imaging and communications in medicine (DICOM) standard includes parts regarding nonimage data, such as image study ordering data and performed procedure data, and is used for sharing information between HIS/RIS and modality systems, which is essential for IHE. To bring such parts of the DICOM standard into force in Japan, a joint committee of JIRA and JAHIS established the JJ1017 management guideline, specifying, for example, which items are legally required in Japan while remaining optional in the DICOM standard. In Japan, the contents of orders from referring physicians for radiographic examinations include details of the examination. Such details are not typically specified by referring physicians requesting radiographic examinations in the United States, because radiologists in the United States often determine the examination protocol. The DICOM standard has code tables for examination type, region, and direction for image examination orders. However, this investigation found that its items are not sufficiently detailed for use in Japan, for the reason mentioned above. To overcome these drawbacks, we have generated the JJ1017 codes for these three code tables for use under the JJ1017 guidelines. This report introduces the JJ1017 code. These codes (the study type codes in particular) must be expandable to keep up with technical advances in equipment. Expansion has two directions: width, for covering more categories, and depth, for specifying the information in more detail (finer categories). The JJ1017 code takes these requirements into consideration and clearly distinguishes between the stem part, as the common term, and the expansion. The stem part of the JJ1017 code partially utilizes DICOM codes to remain in line with the DICOM standard. This work is an example of how local requirements can be met by using and extending the DICOM standard.
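    The stem-plus-expansion structure described in this abstract can be illustrated with a toy parser (Python). The 4-character stem width and the sample code values are hypothetical, not the actual JJ1017 layout:

```python
# Toy illustration of a code built from a fixed "stem" (the shared,
# DICOM-aligned part) plus a local "expansion" that adds finer categories.
# The 4-character stem width and the sample values are hypothetical,
# not taken from the JJ1017 specification.

STEM_LEN = 4  # assumed width of the common (stem) part

def split_code(code):
    """Return (stem, expansion); the expansion may be empty."""
    return code[:STEM_LEN], code[STEM_LEN:]

def same_category(a, b):
    """Two codes share a category when their stems match, even if one
    site has extended the code in depth with finer local detail."""
    return split_code(a)[0] == split_code(b)[0]

stem, ext = split_code("CT01A2")  # hypothetical: stem 'CT01', expansion 'A2'
```

This is the property the report relies on: systems that only understand the stem still interoperate with systems that use deeper local expansions.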

  13. Regulations, Codes, and Standards (RCS) Template for California Hydrogen Dispensing Stations

    Energy Technology Data Exchange (ETDEWEB)

    Rivkin, C.; Blake, C.; Burgess, R.; Buttner, W.; Post, M.

    2012-11-01

    This report explains the Regulations, Codes, and Standards (RCS) requirements for hydrogen dispensing stations in the State of California. The report shows the basic components of a hydrogen dispensing station in a simple schematic drawing; the permits and approvals that would typically be required for the construction and operation of a hydrogen dispensing station; and a basic permit that might be employed by an Authority Having Jurisdiction (AHJ).

  14. International standards: the World Organisation for Animal Health Terrestrial Animal Health Code.

    Science.gov (United States)

    Thiermann, A B

    2015-04-01

    This paper provides a description of the international standards contained in the Terrestrial Animal Health Code of the World Organisation for Animal Health (OIE) that relate to the prevention and control of vector-borne diseases. It identifies the rights and obligations of OIE Member Countries regarding the notification of animal disease occurrences, as well as the recommendations to be followed for safe and efficient international trade in animals and their products.

  15. Non-standard model for electron heat transport for multidimensional hydrodynamic codes

    Energy Technology Data Exchange (ETDEWEB)

    Nicolai, Ph.; Busquet, M.; Schurtz, G. [CEA/DAM-Ile de France, 91 - Bruyeres Le Chatel (France)

    2000-07-01

    In simulations of laser-produced plasma, modeling of heat transport requires an artificial limitation of standard Spitzer-Haerm fluxes. To improve the treatment of heat conduction, we have developed a multidimensional model which accounts for non-local features of heat transport and for the effects of self-generated magnetic fields. This consistent treatment of both mechanisms has been implemented in a two-dimensional radiation-hydrodynamic code. First results indicate good agreement between simulations and experimental data. (authors)

  16. Non-standard model for electron heat transport for multidimensional hydrodynamic codes

    International Nuclear Information System (INIS)

    Nicolai, Ph.; Busquet, M.; Schurtz, G.

    2000-01-01

    In simulations of laser-produced plasma, modeling of heat transport requires an artificial limitation of standard Spitzer-Haerm fluxes. To improve the treatment of heat conduction, we have developed a multidimensional model which accounts for non-local features of heat transport and for the effects of self-generated magnetic fields. This consistent treatment of both mechanisms has been implemented in a two-dimensional radiation-hydrodynamic code. First results indicate good agreement between simulations and experimental data. (authors)

  17. Accounting Education Approach in the Context of New Turkish Commercial Code and Turkish Accounting Standards

    OpenAIRE

    Cevdet Kızıl; Ayşe Tansel Çetin; Ahmed Bulunmaz

    2014-01-01

    The aim of this article is to investigate the impact of the new Turkish commercial code and Turkish accounting standards on accounting education. The study uses the survey method to gather data for the analysis; questionnaire forms were distributed to university students in person and via the internet. The paper addresses significant research questions such as “Are accounting academicians informed and knowledgeable on new Turkish commerc...

  18. Integrating industry nuclear codes and standards into United States Department of Energy facilities

    Energy Technology Data Exchange (ETDEWEB)

    Jacox, J.

    1995-02-01

    Recently the United States Department of Energy (DOE) has mandated that facilities under its jurisdiction use various industry codes and standards developed for civilian power reactors that operate under a U.S. Nuclear Regulatory Commission license. While this is a major step forward in putting all our nuclear facilities under common technical standards, there are always problems associated with implementing such advances. This paper discusses some of the advantages and problems experienced to date. These include the universal challenge of educating new users of any technical document, the repetition of errors made by NRC-licensed facilities over the years, and some problems unique to DOE facilities.

  19. Validation of the ASSERT subchannel code for prediction of CHF in standard and non-standard CANDU bundle geometries

    International Nuclear Information System (INIS)

    Kiteley, J.C.; Carver, M.B.; Zhou, Q.N.

    1993-01-01

    The ASSERT code has been developed to address the three-dimensional computation of flow and phase distribution and fuel element surface temperatures within the horizontal subchannels of CANDU PHWR fuel channels, and to provide a detailed prediction of critical heat flux distribution throughout the bundle. The ASSERT subchannel code has been validated extensively against a wide repertoire of experiments; its combination of three-dimensional prediction of local flow conditions with a comprehensive method of predicting critical heat flux (CHF) at these local conditions makes it a unique tool for predicting CHF for situations outside the existing experimental data base. In particular, ASSERT is the only tool available to systematically investigate CHF under conditions of local geometric variations, such as pressure tube creep and fuel element strain. This paper discusses the numerical methodology used in ASSERT, the constitutive relationships incorporated, and the CHF assessment methodology. The evolutionary validation plan is discussed, and early validation exercises are summarized. The paper concentrates, however, on more recent validation exercises in standard and non-standard geometries. 28 refs., 12 figs

  20. Validation of the assert subchannel code: Prediction of CHF in standard and non-standard Candu bundle geometries

    International Nuclear Information System (INIS)

    Carver, M.B.; Kiteley, J.C.; Zhou, R.Q.N.; Junop, S.V.; Rowe, D.S.

    1993-01-01

    The ASSERT code has been developed to address the three-dimensional computation of flow and phase distribution and fuel element surface temperatures within the horizontal subchannels of CANDU PHWR fuel channels, and to provide a detailed prediction of critical heat flux (CHF) distribution throughout the bundle. The ASSERT subchannel code has been validated extensively against a wide repertoire of experiments; its combination of three-dimensional prediction of local flow conditions with a comprehensive method of predicting CHF at these local conditions makes it a unique tool for predicting CHF for situations outside the existing experimental data base. In particular, ASSERT is an appropriate tool to systematically investigate CHF under conditions of local geometric variations, such as pressure tube creep and fuel element strain. This paper discusses the numerical methodology used in ASSERT, the constitutive relationships incorporated, and the CHF assessment methodology. The evolutionary validation plan is discussed, and early validation exercises are summarized. The paper concentrates, however, on more recent validation exercises in standard and non-standard geometries.

  1. ENDF-UTILITY-CODES, codes to check and standardize data in the Evaluated Nuclear Data File (ENDF)

    International Nuclear Information System (INIS)

    Dunford, Charles L.

    2007-01-01

    1 - Description of program or function: The ENDF Utility Codes comprise nine codes to check and standardize data in the Evaluated Nuclear Data File (ENDF). Four programs of this release, GETMAT, LISTEF, PLOTEF and SETMDC, have not been maintained since release 6.13. The suite of ENDF utility codes includes:
    - CHECKR (version 7.01): checks that an evaluated data file conforms to the ENDF format.
    - FIZCON (version 7.02): checks that an evaluated data file has valid data and conforms to recommended procedures.
    - GETMAT (version 6.13): retrieves one or more materials from an ENDF formatted data file; the output contains only the selected materials.
    - INTER (version 7.01): calculates thermal cross sections, g-factors, resonance integrals, fission spectrum averaged cross sections and 14.0 MeV (or other energy) cross sections for major reactions in an ENDF-6 or ENDF-5 format data file.
    - LISTEF (version 6.13): produces summary and annotated listings of a data file in either ENDF-6 or ENDF-5 format.
    - PLOTEF (version 6.13): produces graphical displays of a data file in either ENDF-5 or ENDF-6 format; the form of graphical output depends on the graphical devices available at the installation where the code is used.
    - PSYCHE (version 7.02): checks the physics content of an evaluated data file; it recognises the difference between the ENDF-5 and ENDF-6 formats and performs its tests accordingly.
    - SETMDC (version 6.13): a utility program that converts the source decks of programs for different computers (DOS, UNIX, LINUX, VMS, Windows).
    - STANEF (version 7.01): performs bookkeeping operations on a data file containing one or more material evaluations in ENDF format.
    Version 7.02 of the ENDF Utility Codes corrects all bugs reported to the NNDC as of April 1, 2005 and supersedes all previous releases. Three codes, CHECKR, STANEF and INTER, were ported from the 7.01 release.

  2. High data-rate video broadcasting over 3G wireless systems

    NARCIS (Netherlands)

    Atici, C.; Sunay, M.O.

    2007-01-01

    In cellular environments, video broadcasting is a challenging problem in which the number of users receiving the service and the average successfully decoded video data-rate have to be intelligently optimized. When video is broadcast using the 3G packet data standard, 1xEV-DO, the code space may

  3. Comparison of the General Electric BWR/6 standard plant design to the IAEA NUSS codes and guides

    International Nuclear Information System (INIS)

    D'Ardenne, W.H.; Sherwood, G.G.

    1985-01-01

    The General Electric BWR/6 Mark III standard plant design meets or exceeds current requirements of published International Atomic Energy Agency (IAEA) Nuclear Safety Standards (NUSS) codes and guides. This conclusion is based on a review of the NUSS codes and guides by General Electric and by the co-ordinated US review of the NUSS codes and guides during their development. General Electric compared the published IAEA NUSS codes and guides with the General Electric design. The applicability of each code and guide to the BWR/6 Mark III standard plant design was determined. Each code or guide was reviewed by a General Electric engineer knowledgeable about the structures, systems and components addressed and the technical area covered by that code or guide. The results of this review show that the BWR/6 Mark III standard plant design meets or exceeds the applicable requirements of the published IAEA NUSS codes and guides. The co-ordinated US review of the IAEA NUSS codes and guides corroborates the General Electric review. In the co-ordinated US review, the USNRC and US industry organizations (including General Electric) review the NUSS codes and guides during their development. This review ensures that the NUSS codes and guides are consistent with the current US government regulations, guidance and regulatory practices, US voluntary industry codes and standards, and accepted US industry design, construction and operational practices. If any inconsistencies are identified, comments are submitted to the IAEA by the USNRC. All US concerns submitted to the IAEA have been resolved. General Electric design reviews and the Final Design Approval (FDA) issued by the USNRC have verified that the General Electric BWR/6 Mark III standard plant design meets or exceeds the current US requirements, guidance and practices. Since these requirements, guidance and practices meet or exceed those of the NUSS codes and guides, so does the General Electric design. (author)

  4. SRAC: JAERI thermal reactor standard code system for reactor design and analysis

    International Nuclear Information System (INIS)

    Tsuchihashi, Keichiro; Takano, Hideki; Horikami, Kunihiko; Ishiguro, Yukio; Kaneko, Kunio; Hara, Toshiharu.

    1983-01-01

    The SRAC (Standard Reactor Analysis Code) is a code system for nuclear reactor analysis and design. It is composed of neutron cross section libraries and auxiliary processing codes, neutron spectrum routines, a variety of transport and 1-, 2- and 3-D diffusion routines, and dynamic parameter and cell burn-up routines. By making the best use of the individual code functions in the SRAC system, the user can select either an exact method for an accurate estimate of reactor characteristics or an economical method aimed at shorter computer time, depending on the purpose of the study. The user can select cell or core calculation; fixed source or eigenvalue problem; transport (collision probability or Sn) theory or diffusion theory. Moreover, smearing and collapsing of macroscopic cross sections are done separately at the user's selection, and special attention is paid to double heterogeneity. Various techniques are employed to access the data storage and to optimize the internal data transfer. Benchmark calculations using the SRAC system have been made extensively for the Keff values of various types of critical assemblies (light water, heavy water and graphite moderated systems, and fast reactor systems). The calculated results agree well with the experimental Keff values. (author)

  5. TANDA TANGAN DIGITAL MENGGUNAKAN QR CODE DENGAN METODE ADVANCED ENCRYPTION STANDARD

    Directory of Open Access Journals (Sweden)

    Abdul Gani Putra Suratma

    2017-04-01

    Full Text Available A digital signature is a mathematical scheme that uniquely identifies a sender and proves the authenticity of the owner of a message or digital document, so that an authentic (valid) digital signature is sufficient reason for the recipient to believe that a received message or document comes from a known sender. Technological developments make possible digital signatures that can be verified mathematically, so that information obtained by one party from another can be checked to ensure its authenticity. A digital signature is an authentication mechanism that allows the creator of a message to attach a code that acts as his or her signature. This research applies the QR (Quick Response) Code together with the AES (Advanced Encryption Standard) algorithm as a digital signature, so that the result can serve to authenticate a manager's signature and to verify valid goods-collection documents. In this study, the accuracy of QR Code classification using a naive Bayes classifier was 90%, with a positive precision of 80% and a negative precision of 100%.
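The sign-and-verify flow described in the abstract can be sketched as follows. This is an illustrative sketch only: the paper pairs AES with a QR code, but Python's standard library has no AES, so HMAC-SHA256 stands in for the keyed transform, and the QR rendering step is omitted; the base64 payload returned is what would be encoded into the QR image. The key and function names are hypothetical.

```python
import base64
import hashlib
import hmac

# Sketch of the sign/verify flow.  HMAC-SHA256 stands in for the
# paper's AES-based keyed transform (stdlib has no AES); the QR
# rendering step is omitted.  Key and names are hypothetical.
SECRET_KEY = b"signer-secret-key"

def make_signature(document: bytes, key: bytes = SECRET_KEY) -> str:
    """Return the base64 payload that would be rendered as a QR code."""
    digest = hashlib.sha256(document).digest()  # hash the document first
    tag = hmac.new(key, digest, hashlib.sha256).digest()
    return base64.b64encode(tag).decode("ascii")

def verify(document: bytes, qr_payload: str, key: bytes = SECRET_KEY) -> bool:
    """Check a scanned QR payload against the presented document."""
    return hmac.compare_digest(make_signature(document, key), qr_payload)
```

Any tampering with the document changes the hash, so the scanned payload no longer matches and verification fails.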

  6. International pressure vessels and piping codes and standards. Volume 2: Current perspectives; PVP-Volume 313-2

    International Nuclear Information System (INIS)

    Rao, K.R.; Asada, Yasuhide; Adams, T.M.

    1995-01-01

    The topics in this volume include: (1) Recent or imminent changes to Section 3 design sections; (2) Select perspectives of ASME Codes -- Section 3; (3) Select perspectives of Boiler and Pressure Vessel Codes -- an international outlook; (4) Select perspectives of Boiler and Pressure Vessel Codes -- ASME Code Sections 3, 8 and 11; (5) Codes and Standards Perspectives for Analysis; (6) Selected design perspectives on flow-accelerated corrosion and pressure vessel design and qualification; (7) Select Codes and Standards perspectives for design and operability; (8) Codes and Standards perspectives for operability; (9) What's new in the ASME Boiler and Pressure Vessel Code?; (10) A look at ongoing activities of ASME Sections 2 and 3; (11) A look at current activities of ASME Section 11; (12) A look at current activities of ASME Codes and Standards; (13) Simplified design methodology and design allowable stresses -- 1 and 2; (14) Introduction to Power Boilers, Section 1 of the ASME Code -- Part 1 and 2. Separate abstracts were prepared for most of the individual papers

  7. Manipulations of the features of standard video lottery terminal (VLT) games: effects in pathological and non-pathological gamblers.

    Science.gov (United States)

    Loba, P; Stewart, S H; Klein, R M; Blackburn, J R

    2001-01-01

    The present study was conducted to identify game parameters that would reduce the risk of abuse of video lottery terminals (VLTs) by pathological gamblers, while exerting minimal effects on the behavior of non-pathological gamblers. Three manipulations of standard VLT game features were explored. Participants were exposed to: a counter which displayed a running total of money spent; a VLT spinning reels game where participants could no longer "stop" the reels by touching the screen; and sensory feature manipulations. In control conditions, participants were exposed to standard settings for either a spinning reels or a video poker game. Dependent variables were self-ratings of reactions to each set of parameters. A set of 2 × 2 × 2 (game manipulation [experimental condition(s) vs. control condition] × game [spinning reels vs. video poker] × gambler status [pathological vs. non-pathological]) repeated-measures ANOVAs were conducted on all dependent variables. The findings suggest that the sensory manipulations (i.e., fast speed/sound or slow speed/no sound manipulations) produced the most robust reaction differences. Before advocating harm reduction policies such as lowering sensory features of VLT games to reduce potential harm to pathological gamblers, it is important to replicate findings in a more naturalistic setting, such as a real bar.

  8. International symposium on standards and codes of practice in medical radiation dosimetry. Book of extended synopses

    International Nuclear Information System (INIS)

    2002-01-01

    The development of radiation measurement standards by National Metrology Institutes (NMIs) and their dissemination to Secondary Standard Dosimetry Laboratories (SSDLs), cancer therapy centres and hospitals represent essential aspects of the radiation dosimetry measurement chain. Although the demands for accuracy in radiotherapy initiated the establishment of such measurement chains, similar traceable dosimetry procedures have been implemented, or are being developed, in other areas of radiation medicine (e.g. diagnostic radiology and nuclear medicine), in radiation protection and in industrial applications of radiation. In the past few years the development of primary standards of absorbed dose to water in 60 Co for radiotherapy dosimetry has made direct calibrations in terms of absorbed dose to water available in many countries for the first time. Some laboratories have extended the development of these standards to high energy photon and electron beams and to low and medium energy x-ray beams. Other countries, however, still base their dosimetry for radiotherapy on air kerma standards. Dosimetry for conventional external beam radiotherapy was probably the field where standardized procedures adopted by medical physicists at hospitals were developed first. Those were related to exposure and air kerma standards. The recent development of Codes of Practice (or protocols) based on the concept of absorbed dose to water has led to changes in calibration procedures at hospitals. The International Code of Practice for Dosimetry Based on Standards of Absorbed Dose to Water (TRS 398) was sponsored by the International Atomic Energy Agency (IAEA), World Health Organization (WHO), Pan-American Health Organization (PAHO) and the European Society for Therapeutic Radiology and Oncology (ESTRO) and is expected to be adopted in many countries worldwide. 
It provides recommendations for the dosimetry of all types of beams (except neutrons) used in external radiotherapy and satisfies

  9. International symposium on standards and codes of practice in medical radiation dosimetry. Book of extended synopses

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-07-01

    The development of radiation measurement standards by National Metrology Institutes (NMIs) and their dissemination to Secondary Standard Dosimetry Laboratories (SSDLs), cancer therapy centres and hospitals represent essential aspects of the radiation dosimetry measurement chain. Although the demands for accuracy in radiotherapy initiated the establishment of such measurement chains, similar traceable dosimetry procedures have been implemented, or are being developed, in other areas of radiation medicine (e.g. diagnostic radiology and nuclear medicine), in radiation protection and in industrial applications of radiation. In the past few years the development of primary standards of absorbed dose to water in {sup 60}Co for radiotherapy dosimetry has made direct calibrations in terms of absorbed dose to water available in many countries for the first time. Some laboratories have extended the development of these standards to high energy photon and electron beams and to low and medium energy x-ray beams. Other countries, however, still base their dosimetry for radiotherapy on air kerma standards. Dosimetry for conventional external beam radiotherapy was probably the field where standardized procedures adopted by medical physicists at hospitals were developed first. Those were related to exposure and air kerma standards. The recent development of Codes of Practice (or protocols) based on the concept of absorbed dose to water has led to changes in calibration procedures at hospitals. The International Code of Practice for Dosimetry Based on Standards of Absorbed Dose to Water (TRS 398) was sponsored by the International Atomic Energy Agency (IAEA), World Health Organization (WHO), Pan-American Health Organization (PAHO) and the European Society for Therapeutic Radiology and Oncology (ESTRO) and is expected to be adopted in many countries worldwide. It provides recommendations for the dosimetry of all types of beams (except neutrons) used in external radiotherapy and

  10. Simulation of international standard problem no. 44 open tests using Melcor computer code

    International Nuclear Information System (INIS)

    Song, Y.M.; Cho, S.W.

    2001-01-01

    The MELCOR 1.8.4 code has been employed to simulate the KAEVER test series K123/K148/K186/K188, proposed as open experiments of International Standard Problem No. 44 by the OECD/CSNI. The main purpose of this study is to evaluate the accuracy of the MELCOR aerosol model, which calculates aerosol distribution and settlement in a containment. For this, the thermal hydraulic conditions are simulated first for the whole test period, and then the behavior of hygroscopic CsOH/CsI and insoluble Ag aerosols, the predominant activity carriers in a release into the containment, is compared between the experimental results and the code predictions. The calculated vessel atmospheric concentrations reproduce the dry aerosol well but differ substantially for wet aerosol, owing to a mismatch in the vessel humidity data and the treatment of hygroscopicity. (authors)

  11. ISO 639-1 and ISO 639-2: International Standards for Language Codes. ISO 15924: International Standard for Names of Scripts.

    Science.gov (United States)

    Byrum, John D.

    This paper describes two international standards for the representation of the names of languages. The first (ISO 639-1), published in 1988, provides two-letter codes for 136 languages and was produced primarily to meet terminological needs. The second (ISO 639-2) appeared in late 1998 and includes three-letter codes for 460 languages. This list…
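The relationship between the two code lists can be illustrated with a small mapping. This is an illustrative subset of a handful of well-known entries, not the full 136- and 460-entry lists; note also that ISO 639-2 distinguishes bibliographic codes (e.g. "fre", "ger") from terminological ones (e.g. "fra", "deu"), and the sketch below uses the terminological forms.

```python
# Illustrative subset of the ISO 639-1 (two-letter) to ISO 639-2
# (three-letter, terminological form) mapping -- not the full lists.
ISO_639_1_TO_2 = {
    "en": "eng",
    "fr": "fra",
    "de": "deu",
    "es": "spa",
    "ja": "jpn",
    "ru": "rus",
}

def to_alpha3(alpha2: str) -> str:
    """Map a two-letter ISO 639-1 code to a three-letter ISO 639-2 code."""
    return ISO_639_1_TO_2[alpha2.lower()]
```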

  12. Coding of Depth Images for 3DTV

    DEFF Research Database (Denmark)

    Zamarin, Marco; Forchhammer, Søren

    In this short paper a brief overview of the topic of coding and compression of depth images for multi-view image and video coding is provided. Depth images represent a convenient way to describe distances in the 3D scene, useful for 3D video processing purposes. Standard approaches for the compression of depth images are described and compared against some recent specialized algorithms able to achieve higher compression performance. Future research directions close the paper.

  13. Introduce subtitles to your video using Aegisub

    CERN Multimedia

    CERN. Geneva; Dawson, Kyle Richard

    2018-01-01

    This is a video explaining how to equip your video with subtitles using the tool Aegisub. You'll also need the site webvtt.org. To be fully compatible with both CDS and Videos, please name the subtitle file in the standard format <original_filename>_<lang>.vtt, where <lang> is a two-letter ISO 639-1 language code (https://en.wikipedia.org/wiki/List_of_ISO_639-1_codes). NB! You need to have the script written beforehand!
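The naming convention can be captured in a tiny helper. The function name is made up for illustration, and the convention itself is reconstructed from the record above, whose placeholders were lost in extraction.

```python
def subtitle_filename(video_basename: str, lang: str) -> str:
    """Build a subtitle file name following the <basename>_<lang>.vtt
    convention (reconstructed; function name is hypothetical)."""
    if len(lang) != 2 or not lang.isalpha() or not lang.islower():
        raise ValueError("expected a two-letter ISO 639-1 code, e.g. 'en'")
    return f"{video_basename}_{lang}.vtt"
```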

  14. Review and evaluation of technology, equipment, codes and standards for digitization of industrial radiographic film

    International Nuclear Information System (INIS)

    1992-05-01

    This report contains a review and evaluation of the technology, equipment, and codes and standards related to the digitization of industrial radiographic film. The report presents recommendations and equipment performance specifications that will allow the digitization of radiographic film from nuclear power plant components in order to produce faithful reproductions of the flaw images of interest on the films. Justification for the specifications selected is provided. Performance demonstration tests for the digitization process are required, and criteria for such tests are presented. Several comments related to implementation of the technology are also presented and discussed

  15. Experimental video signals distribution MMF network based on IEEE 802.11 standard

    Science.gov (United States)

    Kowalczyk, Marcin; Maksymiuk, Lukasz; Siuzdak, Jerzy

    2014-11-01

    The article presents the results of experimental research on the transmission of digital video streams over a Radio over Fiber (ROF) network realized specifically for this purpose. The network merges a wireless IEEE 802.11 network, popularly referred to as Wi-Fi, with a passive optical network (PON) based on multimode fibers (MMF). The proposed approach is an interesting candidate for extensive monitoring systems, which must cover a large area while ensuring a relatively high degree of immunity to interference for the signals transmitted from video IP cameras to the monitoring center, together with high configuration flexibility (easily changing the deployment of cameras).

  16. Subjective evaluation of next-generation video compression algorithms: a case study

    Science.gov (United States)

    De Simone, Francesca; Goldmann, Lutz; Lee, Jong-Seok; Ebrahimi, Touradj; Baroncini, Vittorio

    2010-08-01

    This paper describes the details and the results of the subjective quality evaluation performed at EPFL as a contribution to the effort of the Joint Collaborative Team on Video Coding (JCT-VC) toward the definition of the next-generation video coding standard. The performance of 27 coding technologies has been evaluated with respect to two H.264/MPEG-4 AVC anchors, considering high definition (HD) test material. The test campaign involved a total of 494 naive observers and took place over a period of four weeks. While similar tests have been conducted as part of the standardization process of previous video coding technologies, the test campaign described in this paper is by far the most extensive in the history of video coding standardization. The obtained subjective quality scores show high consistency and support an accurate comparison of the performance of the different coding solutions.

  17. Error detecting capabilities of the shortened Hamming codes adopted for error detection in IEEE Standard 802.3

    Science.gov (United States)

    Fujiwara, Toru; Kasami, Tadao; Lin, Shu

    1989-09-01

    The error-detecting capabilities of the shortened Hamming codes adopted for error detection in IEEE Standard 802.3 are investigated. These codes are also used for error detection in the data link layer of the Ethernet, a local area network. The weight distributions for various code lengths are calculated to obtain the probability of undetectable error and that of detectable error for a binary symmetric channel with bit-error rate between 0.00001 and 1/2.
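Given a weight distribution {A_i}, the probability of undetectable error on a binary symmetric channel with bit-error rate p is P_ud(p) = Σ_{i>0} A_i p^i (1-p)^(n-i). A minimal sketch, using the (7,4) Hamming code's weight distribution for illustration (the codes in IEEE 802.3 are much longer shortened Hamming codes, whose distributions are what the paper computes):

```python
# Probability of undetectable error on a binary symmetric channel,
# computed from a code's weight distribution A_i:
#   P_ud(p) = sum_{i > 0} A_i * p**i * (1 - p)**(n - i)
def p_undetected(weights, p):
    """weights[i] = number of codewords of Hamming weight i (A_i)."""
    n = len(weights) - 1  # block length
    return sum(a * p**i * (1 - p)**(n - i)
               for i, a in enumerate(weights) if i > 0)

# Weight distribution of the (7,4) Hamming code: A_0=1, A_3=A_4=7, A_7=1.
# Used here for illustration only, not one of the 802.3 codes.
HAMMING_7_4 = [1, 0, 0, 7, 7, 0, 0, 1]
```

At p = 1/2 every nonzero codeword is equally likely, so P_ud reduces to (2^k - 1)/2^n, i.e. 15/128 for the (7,4) code.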

  18. Fundamental challenging problems for developing new nuclear safety standard computer codes

    International Nuclear Information System (INIS)

    Wong, P.K.; Wong, A.E.; Wong, A.

    2005-01-01

    Based on the claims of US patents 5,084,232, 5,848,377 and 6,430,516, which can be retrieved by typing the patent numbers into the box at http://164.195.100.11/netahtml/srchnum.htm, and on their associated technical papers presented and published at international conferences in the last three years, all of which were sent to the US-NRC by e-mail on March 26, 2003 at 2:46 PM, three fundamental challenging problems for developing new nuclear safety standard computer codes were presented at US-NRC RIC2003 Session W4, 2:15-3:15 PM, on April 16, 2003, in the Presidential Ballroom of the Washington D.C. Capital Hilton Hotel, in front of more than 800 nuclear professionals from many countries worldwide. The objective and scope of this paper is to invite all nuclear professionals to examine and evaluate the computer codes currently being used in their own countries by comparing numerical data for these three specific, openly challenging fundamental problems, in order to set up a global safety standard for all nuclear power plants in the world. (authors)

  19. The standard genetic code and its relation to mutational pressure: robustness and equilibrium criteria

    International Nuclear Information System (INIS)

    Hernandez Caceres, Jose Luis; Hong, Rolando; Martinez Ortiz, Carlos; Sautie Castellanos, Miguel; Valdes, Kiria; Guevara Erra, Ramon

    2004-10-01

    Under the assumption of even point-mutation pressure on the DNA strand, rates for transitions from one amino acid into another were assessed. Nearly 25% of all mutations were silent. About 48% of the mutations from a given amino acid led either into the same amino acid or into an amino acid of the same class. These results suggest a great stability of the Standard Genetic Code with respect to mutation load. Concepts from chemical equilibrium theory are applicable to this case provided that the mutation rate constants are given. It was found that unequal synonymous codon usage may lead to changes in the equilibrium concentrations. Data from real biological species showed that several amino acids are close to their respective equilibrium concentrations. However, in all cases the concentration of leucine was nearly double its equilibrium concentration, whereas that of the stop command (Term) was about 10 times lower. The overall distance from equilibrium for a set of species suggests that eukaryotes are closer to equilibrium than prokaryotes, and the HIV virus was closest to equilibrium among 15 species. We found that contemporary species are closer to the equilibrium than the Last Universal Common Ancestor (LUCA) was. Similarly, non-preserved regions in proteins are closer to equilibrium than preserved ones. We suggest that this approach can be useful for exploring some aspects of biological evolution in the framework of Standard Genetic Code properties. (author)
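The "nearly 25% silent" figure can be reproduced by brute force over the standard genetic code. A sketch assuming the standard NCBI translation table 1; variable names are illustrative:

```python
# Fraction of silent (synonymous) single-point mutations over the
# standard genetic code, computed by brute force.  The codon table is
# the standard NCBI translation table 1, with codons ordered
# TTT, TTC, TTA, TTG, TCT, ... ('*' marks a stop codon).
BASES = "TCAG"
AMINO = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {
    b1 + b2 + b3: AMINO[16 * i + 4 * j + k]
    for i, b1 in enumerate(BASES)
    for j, b2 in enumerate(BASES)
    for k, b3 in enumerate(BASES)
}

def silent_fraction() -> float:
    """Fraction of the 64 * 9 = 576 single-point mutations that leave
    the encoded amino acid (or stop signal) unchanged."""
    silent = total = 0
    for codon, aa in CODON_TABLE.items():
        for pos in range(3):
            for base in BASES:
                if base == codon[pos]:
                    continue  # not a mutation
                mutant = codon[:pos] + base + codon[pos + 1:]
                total += 1
                silent += CODON_TABLE[mutant] == aa
    return silent / total
```

Running this gives a silent fraction in the neighborhood of the paper's ~25%, dominated by third-position degeneracy.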

  20. HCPB TBM thermo-mechanical design: assessment with respect to codes and standards and DEMO relevancy

    International Nuclear Information System (INIS)

    Cismondi, F.; Kecskes, S.; Aiello, G.

    2011-01-01

    In the frame of the activities of the European TBM Consortium of Associates, the Helium Cooled Pebble Bed Test Blanket Module (HCPB-TBM) is developed at the Karlsruhe Institute of Technology (KIT). After performing detailed thermal and fluid-dynamic analyses of the preliminary HCPB TBM design, the thermo-mechanical behaviour of the TBM under typical ITER loads has to be assessed. A synthesis of the different design options proposed has been realized by building two different assemblies of the HCPB-TBM; these two assemblies and the analyses performed on them are presented in this paper. Finite element thermo-mechanical analyses of two detailed 1/4-scale models of the proposed HCPB-TBM assemblies have been performed, with the aim of verifying that the mechanical behaviour complies with the criteria of the design codes and standards. The structural design limits specified in the codes and standards are discussed in relation to the available EUROFER data and possible damage modes. Solutions to improve the weak structural points of the present design are identified, and the DEMO relevancy of the present thermal and structural design parameters is discussed.

  1. Battelle integrity of nuclear piping program. Summary of results and implications for codes/standards

    International Nuclear Information System (INIS)

    Miura, Naoki

    2005-01-01

    The BINP (Battelle Integrity of Nuclear Piping) program was proposed by Battelle to elaborate pipe fracture evaluation methods and to improve LBB and in-service flaw evaluation criteria. The program was conducted from October 1998 to September 2003. In Japan, CRIEPI participated in the program on behalf of electric utilities and fabricators, to keep up with the technical background for possible future revision of LBB and in-service flaw evaluation standards and to investigate the issues that needed to be reflected in current domestic standards. A series of results obtained from the program has been well utilized for the new LBB Regulatory Guide Program by the USNRC and for the proposal of revised in-service flaw evaluation criteria to the ASME Code Committee. The results were assessed as to whether they had implications for existing or future domestic standards. As a result, the impact of many of the issues that were suspected of adversely affecting LBB approval or allowable flaw sizes in flaw evaluation criteria was found to be relatively minor under actual plant conditions. At the same time, some issues that need to be resolved to develop advanced and rational standards in the future were identified. (author)

  2. Study of the relationship between radioactivity distribution, contamination burden, quality standard and maximum capacity of the Code river, Yogyakarta

    International Nuclear Information System (INIS)

    Agus Taftazani and Muzakky

    2009-01-01

    A study of the relationship between the distribution and contamination burden of gross β radioactivity and natural radionuclides in water and sediment samples from 11 observation stations along the Code river, and the quality standard and maximum capacity of the Code river, has been carried out. Natural radionuclide identification and gross β radioactivity measurements of condensed water and of dry, homogeneous sediment powder (passed through a 100-mesh sieve) were performed using a spectrometer and a GM counter. The radioactivity data were analyzed descriptively with histograms to show the spreading pattern of the data. The contamination burden, quality standard, and maximum capacity of the Code river were analyzed descriptively with line diagrams to establish the relationships among them. Observation of water and sediment at the 11 stations showed that the natural radionuclide emitters 210Pb, 212Pb, 214Pb, 226Ra, 208Tl, 214Bi, 228Ac and 40K were detected. The analysis concluded that the average gross β activity increases from upstream to downstream of the Code river. The contamination burdens of 210Pb, 212Pb, 226Ra and 228Ac were smaller than the quality standard for river water according to the regulation of the Nuclear Energy Regulatory Agency 02/Ka-BAPETEN/V-99 concerning radioactivity quality standards. This means that the Code river still has an acceptable contamination burden for these four radionuclides. (author)

  3. Evaluating Perceived Naturalness of Facial Expression After Fillers to the Nasolabial Folds and Lower Face With Standardized Video and Photography.

    Science.gov (United States)

    Philipp-Dormston, Wolfgang G; Wong, Cindy; Schuster, Bernd; Larsson, Markus K; Podda, Maurizio

    2018-06-01

    Hyaluronic acid (HA) fillers are commonly used in treating facial wrinkles and folds but have not been studied with standardized methodology to include assessment of standard facial expressions. To assess perceived naturalness of facial expression after treatment with 2 HA fillers manufactured with XpresHAn Technology (also known as Optimal Balance Technology). Treatment was directed to the nasolabial folds (NLFs) and at least 1 additional lower face wrinkle or fold. Maintenance of naturalness, attractiveness, and age at 1 month after optimal treatment were assessed using video recordings and photographs capturing different facial animations. Global aesthetic improvement, subjects' satisfaction, and safety were also evaluated. The treatment was well tolerated. Naturalness of facial expression in motion was determined to be at least maintained in 95% of subjects. Attractiveness was enhanced in 89% of subjects and 79% of subjects were considered to look younger. Most subjects assessed their aesthetic appearance as improved and were satisfied with their treatment. Naturalness and attractiveness can be assessed using video recordings and photographs capturing different facial animations. XpresHAn Technology HA filler treatments create natural-looking results with high subject satisfaction.

  4. Japanese standard method for safety evaluation using best estimate code based on uncertainty and scaling analyses with statistical approach

    International Nuclear Information System (INIS)

    Mizokami, Shinya; Hotta, Akitoshi; Kudo, Yoshiro; Yonehara, Tadashi; Watada, Masayuki; Sakaba, Hiroshi

    2009-01-01

    Current licensing practice in Japan consists of using conservative boundary and initial conditions (BIC), assumptions, and analytical codes. The safety analyses for licensing purposes are inherently deterministic. Therefore, conservative BIC and assumptions, such as single failure, must be employed for the analyses. However, using conservative analytical codes is not considered essential. The standard committee of the Atomic Energy Society of Japan (AESJ) drew up a standard for using best-estimate codes in safety analyses in 2008, after three years of discussions reflecting recent domestic and international findings. (author)
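
    Statistical best-estimate (BEPU) methodologies of this kind typically size the number of code runs with Wilks' order-statistics formula. The snippet below is a generic sketch of that sizing calculation, not the specific procedure prescribed by the AESJ standard.

```python
import math

def wilks_min_runs(coverage=0.95, confidence=0.95):
    """Smallest number of code runs n such that the largest observed
    output bounds the `coverage` quantile with probability `confidence`
    (one-sided, first-order Wilks criterion): 1 - coverage**n >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

print(wilks_min_runs())  # 59 runs for the usual 95%/95% statement
```

    Raising the confidence level to 99% pushes the requirement to 90 runs, which is why the 95%/95% criterion with 59 runs became the de facto standard in BEPU analyses.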

  5. SECOND ATLAS DOMESTIC STANDARD PROBLEM (DSP-02) FOR A CODE ASSESSMENT

    Directory of Open Access Journals (Sweden)

    YEON-SIK KIM

    2013-12-01

    Full Text Available KAERI (Korea Atomic Energy Research Institute) has been operating an integral effect test facility, the Advanced Thermal-Hydraulic Test Loop for Accident Simulation (ATLAS), for transient and accident simulations of advanced pressurized water reactors (PWRs). Using ATLAS, a high-quality integral effect test database has been established for major design basis accidents of the APR1400 plant. A Domestic Standard Problem (DSP) exercise using the ATLAS database was promoted to transfer the database to domestic nuclear industries and contribute to improving a safety analysis methodology for PWRs. This 2nd ATLAS DSP (DSP-02) exercise aims at an effective utilization of an integral effect database obtained from ATLAS, the establishment of a cooperation framework among the domestic nuclear industry, a better understanding of the thermal hydraulic phenomena, and an investigation into the possible limitation of the existing best-estimate safety analysis codes. A small break loss of coolant accident with a 6-inch break at the cold leg was determined as a target scenario by considering its technical importance and by incorporating interests from participants. This DSP exercise was performed in an open calculation environment where the integral effect test data was open to participants prior to the code calculations. This paper includes major information of the DSP-02 exercise as well as comparison results between the calculations and the experimental data.

  6. The Ontario Energy Board's draft standard supply service code: effects on air quality

    Energy Technology Data Exchange (ETDEWEB)

    Gibbons, J.; Bjorkquist, S. [Ontario Clean Air Alliance, Toronto, ON (Canada)]

    1999-06-29

    The Ontario Clean Air Alliance (OCAA), a coalition of 67 organizations, takes issue with the Ontario Energy Board's draft document 'Standard Supply Service Code', particularly sections 2.2.2 and 2.5.2, which they claim are not in the public interest unless the Ontario government implements the OCAA's recommended emission caps. The alliance is of the view that without strict new environmental regulations the proposed Code would encourage the use of coal for electricity generation. Public health, the environment, consumer interests, job creation and promotion of a competitive electricity market would all be jeopardized by this development, the alliance states. The argument is supported by extensive reference to the Final Report of the Ontario Market Design Committee (MDC), which also emphasized the importance of combining the introduction of competition with appropriate environmental regulations, singling out the emission cap and trade program and recommending that it be launched concurrently with the opening of the electricity market to competition. The view of the MDC was that public support for restructuring would not be forthcoming in the absence of regulatory measures to control power plant emissions. 25 refs.

  7. Second ATLAS Domestic Standard Problem (DSP-02) For A Code Assessment

    International Nuclear Information System (INIS)

    Kim, Yeonsik; Choi, Kiyong; Cho, Seok; Park, Hyunsik; Kang, Kyungho; Song, Chulhwa; Baek, Wonpil

    2013-01-01

    KAERI (Korea Atomic Energy Research Institute) has been operating an integral effect test facility, the Advanced Thermal-Hydraulic Test Loop for Accident Simulation (ATLAS), for transient and accident simulations of advanced pressurized water reactors (PWRs). Using ATLAS, a high-quality integral effect test database has been established for major design basis accidents of the APR1400 plant. A Domestic Standard Problem (DSP) exercise using the ATLAS database was promoted to transfer the database to domestic nuclear industries and contribute to improving a safety analysis methodology for PWRs. This 2nd ATLAS DSP (DSP-02) exercise aims at an effective utilization of an integral effect database obtained from ATLAS, the establishment of a cooperation framework among the domestic nuclear industry, a better understanding of the thermal hydraulic phenomena, and an investigation into the possible limitation of the existing best-estimate safety analysis codes. A small break loss of coolant accident with a 6-inch break at the cold leg was determined as a target scenario by considering its technical importance and by incorporating interests from participants. This DSP exercise was performed in an open calculation environment where the integral effect test data was open to participants prior to the code calculations. This paper includes major information of the DSP-02 exercise as well as comparison results between the calculations and the experimental data.

  8. The AutoProof Verifier: Usability by Non-Experts and on Standard Code

    Directory of Open Access Journals (Sweden)

    Carlo A. Furia

    2015-08-01

    Full Text Available Formal verification tools are often developed by experts for experts; as a result, their usability by programmers with little formal methods experience may be severely limited. In this paper, we discuss this general phenomenon with reference to AutoProof: a tool that can verify the full functional correctness of object-oriented software. In particular, we present our experiences of using AutoProof in two contrasting contexts representative of non-expert usage. First, we discuss its usability by students in a graduate course on software verification, who were tasked with verifying implementations of various sorting algorithms. Second, we evaluate its usability in verifying code developed for programming assignments of an undergraduate course. The first scenario represents usability by serious non-experts; the second represents usability on "standard code", developed without full functional verification in mind. We report our experiences and lessons learnt, from which we derive some general suggestions for furthering the development of verification tools with respect to improving their usability.

  9. Low-complexity wavelet-based image/video coding for home-use and remote surveillance

    NARCIS (Netherlands)

    Loomans, M.J.H.; Koeleman, C.J.; Joosen, K.M.J.; With, de P.H.N.

    2011-01-01

    The availability of inexpensive cameras enables alternative applications beyond personal video communication. For example, surveillance of rooms and home premises is such an alternative application, which can be extended with remote viewing on hand-held battery-powered consumer devices. Scalable

  10. The recently chosen digital video standard: playing the game within the game

    NARCIS (Netherlands)

    Lint, L.J.O.; Pennings, H.P.G.

    2000-01-01

    In the recent process leading to the agreement on the digital versatile disc two product standards have been developed: one by Philips and Sony, and the other by Toshiba and Time Warner. Three actions in the process of standardization have startled business analysts. First, Matsushita's choice to

  11. "You Want "What" on Your Pizza!?": Videophone and Video-Relay Service as Potential Influences on the Lexical Standardization of American Sign Language

    Science.gov (United States)

    Palmer, Jeffrey Levi; Reynolds, Wanette; Minor, Rebecca

    2012-01-01

    This pilot study examines whether the increased virtual "mobility" of ASL users via videophone and video-relay services is contributing to the standardization of ASL. In addition, language attitudes are identified and suggested to be influencing the perception of correct versus incorrect standard forms. ASL users around the country have their own…

  12. Consensus coding sequence (CCDS) database: a standardized set of human and mouse protein-coding regions supported by expert curation.

    Science.gov (United States)

    Pujar, Shashikant; O'Leary, Nuala A; Farrell, Catherine M; Loveland, Jane E; Mudge, Jonathan M; Wallin, Craig; Girón, Carlos G; Diekhans, Mark; Barnes, If; Bennett, Ruth; Berry, Andrew E; Cox, Eric; Davidson, Claire; Goldfarb, Tamara; Gonzalez, Jose M; Hunt, Toby; Jackson, John; Joardar, Vinita; Kay, Mike P; Kodali, Vamsi K; Martin, Fergal J; McAndrews, Monica; McGarvey, Kelly M; Murphy, Michael; Rajput, Bhanu; Rangwala, Sanjida H; Riddick, Lillian D; Seal, Ruth L; Suner, Marie-Marthe; Webb, David; Zhu, Sophia; Aken, Bronwen L; Bruford, Elspeth A; Bult, Carol J; Frankish, Adam; Murphy, Terence; Pruitt, Kim D

    2018-01-04

    The Consensus Coding Sequence (CCDS) project provides a dataset of protein-coding regions that are identically annotated on the human and mouse reference genome assembly in genome annotations produced independently by NCBI and the Ensembl group at EMBL-EBI. This dataset is the product of an international collaboration that includes NCBI, Ensembl, HUGO Gene Nomenclature Committee, Mouse Genome Informatics and University of California, Santa Cruz. Identically annotated coding regions, which are generated using an automated pipeline and pass multiple quality assurance checks, are assigned a stable and tracked identifier (CCDS ID). Additionally, coordinated manual review by expert curators from the CCDS collaboration helps in maintaining the integrity and high quality of the dataset. The CCDS data are available through an interactive web page (https://www.ncbi.nlm.nih.gov/CCDS/CcdsBrowse.cgi) and an FTP site (ftp://ftp.ncbi.nlm.nih.gov/pub/CCDS/). In this paper, we outline the ongoing work, growth and stability of the CCDS dataset and provide updates on new collaboration members and new features added to the CCDS user interface. We also present expert curation scenarios, with specific examples highlighting the importance of an accurate reference genome assembly and the crucial role played by input from the research community. Published by Oxford University Press on behalf of Nucleic Acids Research 2017.

  13. A Survey of Standardized Approaches towards the Quality of Experience Evaluation for Video Services: An ITU Perspective

    Directory of Open Access Journals (Sweden)

    Debajyoti Pal

    2018-01-01

    Full Text Available Over the past few years there has been an exponential increase in the amount of multimedia data being streamed over the Internet. At the same time, we are also witnessing a change in the way the quality of any particular service is interpreted, with more emphasis being given to the end-users. Thus, silently, there has been a paradigm shift from the traditional Quality of Service (QoS) approach towards a Quality of Experience (QoE) model when evaluating service quality. A great deal of work has been done on evaluating the quality of audio, video, and multimedia services over the Internet. At the same time, research is ongoing on mapping between the two domains of quality metrics, i.e., the QoS and QoE domains. Apart from the work done by individual researchers, the International Telecommunications Union (ITU) has been quite active in this area of quality assessment. This is evident from the large number of ITU standards that are available for different application types. The sheer variety of techniques employed by ITU as well as other researchers sometimes tends to be too complex and diversified. Although there are survey papers that try to present the current state-of-the-art methodologies for video quality evaluation, none has focused on the ITU perspective. In this work, we try to fill this void by presenting up-to-date information on the different measurement methods currently employed by ITU for a video streaming scenario. We outline each method in sufficient detail and analyze the challenges being faced, along with directions for future research.
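
    One widely cited QoS-to-QoE mapping of the kind surveyed here is the exponential IQX hypothesis, in which QoE decays exponentially as a QoS impairment (e.g. packet loss) grows. The sketch below is generic; the coefficients are illustrative placeholders, not values taken from any ITU recommendation.

```python
import math

def qoe_iqx(impairment, alpha=3.0, beta=0.5, gamma=1.5):
    """IQX-style mapping: QoE = alpha * exp(-beta * impairment) + gamma.
    `impairment` is a QoS degradation measure (e.g. % packet loss);
    alpha, beta, gamma are model fit parameters (placeholders here)."""
    return alpha * math.exp(-beta * impairment) + gamma

# QoE starts at alpha + gamma with no impairment and decays toward
# the floor `gamma` as the impairment grows.
print(qoe_iqx(0.0), qoe_iqx(10.0))
```

    In practice the three coefficients are fitted per service from subjective test data, which is exactly the kind of data the ITU assessment methods above are designed to collect.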

  14. Standardized Semantic Markup for Reference Terminologies, Thesauri and Coding Systems: Benefits for distributed E-Health Applications.

    Science.gov (United States)

    Hoelzer, Simon; Schweiger, Ralf K; Liu, Raymond; Rudolf, Dirk; Rieger, Joerg; Dudeck, Joachim

    2005-01-01

    With the introduction of the ICD-10 as the standard for diagnosis, the development of an electronic representation of its complete content, inherent semantics and coding rules is necessary. Our concept refers to current efforts of the CEN/TC 251 to establish a European standard for hierarchical classification systems in healthcare. We have developed an electronic representation of the ICD-10 with the extensible Markup Language (XML) that facilitates the integration in current information systems or coding software taking into account different languages and versions. In this context, XML offers a complete framework of related technologies and standard tools for processing that helps to develop interoperable applications.
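
    As a rough illustration of such a markup-based classification, the fragment below encodes one ICD-10 category and reads it back with Python's standard XML tooling. The element and attribute names are invented for this example, in the spirit of, but not identical to, the ClaML-style schemas standardized for healthcare classification systems.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML fragment for one ICD-10 category; names are
# illustrative only, not the actual CEN/TC 251 schema.
DOC = """
<classification name="ICD-10" version="2005">
  <class code="J45" kind="category">
    <label lang="en">Asthma</label>
    <subclass code="J45.0"/>
    <subclass code="J45.1"/>
  </class>
</classification>
"""

root = ET.fromstring(DOC)
for cls in root.iter("class"):
    subs = [s.get("code") for s in cls.iter("subclass")]
    print(cls.get("code"), cls.find("label").text, subs)
```

    Because the hierarchy, labels, and language tags are explicit in the markup, the same file can drive coding software, multilingual browsers, and version comparisons, which is the interoperability benefit the abstract describes.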

  15. ZAKI: a Windows-based k0 standardization code for in-core INAA

    CERN Document Server

    Ojo, J O

    2002-01-01

    A new computer code ZAKI, for k0-based INAA standardization, written in Visual Basic for the WINDOWS environment is described. The parameter α, measuring the deviation of the epithermal neutron spectrum shape from the ideal 1/E shape, and the thermal-to-epithermal flux ratio f are monitored at each irradiation position for each irradiation using the 'triple bare monitor with k0' technique. Stability of the irradiation position with respect to α and f is therefore assumed only for the duration of the irradiation. This now makes it possible to use k0 standardization even for in-core reactor irradiation channels without an a priori knowledge of α and f values as required by existing commercial software. ZAKI is considerably versatile and contains features which allow for use of several detectors at different counting geometries, direct inputting of peak search output from GeniePc, and automatic nuclide identification of all gamma lines using an in-built library. Sample results for ...

  16. ZAKI: a Windows-based k0 standardization code for in-core INAA

    International Nuclear Information System (INIS)

    Ojo, J.O.; Filby, R.H.

    2002-01-01

    A new computer code ZAKI, for k0-based INAA standardization, written in Visual Basic for the WINDOWS environment is described. The parameter α, measuring the deviation of the epithermal neutron spectrum shape from the ideal 1/E shape, and the thermal-to-epithermal flux ratio f are monitored at each irradiation position for each irradiation using the 'triple bare monitor with k0' technique. Stability of the irradiation position with respect to α and f is therefore assumed only for the duration of the irradiation. This now makes it possible to use k0 standardization even for in-core reactor irradiation channels without an a priori knowledge of α and f values as required by existing commercial software. ZAKI is considerably versatile and contains features which allow for use of several detectors at different counting geometries, direct inputting of peak search output from GeniePc, and automatic nuclide identification of all gamma lines using an in-built library. Sample results for two certified reference materials are presented.
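
    The role of α in k0 standardization can be illustrated with the usual correction of the resonance-integral-to-thermal-cross-section ratio Q0 for a 1/E^(1+α) epithermal spectrum. The sketch below uses the standard De Corte-style expression; the numerical inputs are illustrative and are not taken from ZAKI.

```python
def q0_alpha(q0, er_ev, alpha, e_cd=0.55):
    """Q0(alpha): the Q0 ratio corrected for an epithermal spectrum of
    shape 1/E^(1+alpha). er_ev is the effective resonance energy (eV),
    e_cd the cadmium cut-off energy (eV). In the k0 concentration
    equation this enters through (f + Q0_monitor(alpha)) /
    (f + Q0_analyte(alpha)), with f the thermal-to-epithermal flux ratio."""
    return (q0 - 0.429) / er_ev**alpha + 0.429 / ((2.0 * alpha + 1.0) * e_cd**alpha)

# For an ideal 1/E spectrum (alpha = 0) the correction vanishes:
print(q0_alpha(15.7, 5.65, 0.0))   # ~15.7 (Au-197 values, for illustration)
```

    Monitoring α and f at each irradiation, as ZAKI does, lets this correction be evaluated per run instead of assuming the channel is stable between calibrations.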

  17. VALIDATION OF SIMBAT-PWR USING STANDARD CODE OF COBRA-EN ON REACTOR TRANSIENT CONDITION

    Directory of Open Access Journals (Sweden)

    Muhammad Darwis Isnaini

    2016-03-01

    Full Text Available The validation of the Pressurized Water Reactor type Nuclear Power Plant simulator developed by BATAN (SIMBAT-PWR) against the standard code COBRA-EN under reactor transient conditions has been carried out. The development of SIMBAT-PWR has accomplished several neutronic and thermal-hydraulic calculation modules; therefore, validation of the simulator is needed, especially for transient reactor operating conditions. The purpose of this research is to characterize the thermal-hydraulic parameters of the PWR1000 core, which can be applied to, or used as a comparison in, developing the SIMBAT-PWR. The validation involves calculation of the thermal-hydraulic parameters using the COBRA-EN code. Furthermore, the calculation schemes are based on COBRA-EN with fixed material properties and with dynamic properties calculated by the MATPRO subroutine (COBRA-EN+MATPRO) for reactor conditions of startup, power rise, and power fluctuation from nominal to over power. The comparison of the temperature distribution at nominal 100% power shows that the fuel centerline temperature calculated by SIMBAT-PWR is 8.76% higher than the COBRA-EN result and 7.70% lower than the COBRA-EN+MATPRO result. In general, the SIMBAT-PWR results for the fuel temperature distribution lie mostly between those of COBRA-EN and COBRA-EN+MATPRO. The deviations of the fuel centerline, fuel surface, inner and outer cladding, and coolant bulk temperatures between the SIMBAT-PWR and COBRA-EN calculations are due to differences in the values of the gap heat transfer coefficient and the cladding thermal conductivity.

  18. Authorization request for potential non-compliance with the American Standard Safety Code for Elevators Dumbwaiters and Escalators

    Energy Technology Data Exchange (ETDEWEB)

    Boyd, J.E.

    1964-09-28

    A Third Party inspection of the reactor work platforms was conducted by representatives of the Travelers Insurance Company in 1958. An inspection report submitted by these representatives described hazardous conditions noted and presented a series of recommendations to improve the operational safety of the systems. Project CGI-960, 'C' & 'D' Work Platform Safety Improvements -- All Reactors, was initiated to modify the platforms in compliance with the Third Party recommendations. The American Standard Safety Code for Elevators, Dumbwaiters and Escalators (A-17.1) is used as a guide by the Third Party in formulating their recommendations. This code is used because there is no other applicable code for this type of equipment. While the work platforms do not, and in some cases cannot, comply with this code because of their operational use, every effort is made to comply with the intent of the code.

  19. Inventory of Safety-related Codes and Standards for Energy Storage Systems with some Experiences related to Approval and Acceptance

    Energy Technology Data Exchange (ETDEWEB)

    Conover, David R.

    2014-09-11

    The purpose of this document is to identify laws, rules, model codes, codes, standards, regulations, and specifications (CSR) related to safety that could apply to stationary energy storage systems (ESS), together with experiences to date in securing approval of ESS in relation to CSR. This information is intended to assist in securing approval of ESS under current CSR and in identifying new CSR, or revisions to existing CSR, and the necessary supporting research and documentation that can foster the deployment of safe ESS.

  20. An improvement analysis on video compression using file segmentation

    Science.gov (United States)

    Sharma, Shubhankar; Singh, K. John; Priya, M.

    2017-11-01

    Over the past two decades the extreme evolution of the Internet has led to a massive rise in video technology and in video consumption over the Internet, which now accounts for the bulk of data traffic in general. Because video consumes so much data on the World Wide Web, reducing the burden on the Internet and the bandwidth consumed by video lets users access video data more easily. For this purpose, many video codecs have been developed, such as HEVC/H.265 and VP9; yet such codecs raise the dilemma of which is the better technology in terms of rate-distortion performance and coding standard. This paper offers a solution to the difficulty of achieving low delay in video compression and in video applications, e.g., ad-hoc video conferencing/streaming or surveillance monitoring. It also benchmarks the HEVC and VP9 video compression techniques with subjective quality evaluations of High Definition video content played back in web browsers. Moreover, it presents an experimental approach of dividing a video file into several segments for compression and putting them back together, to improve the efficiency of video compression on the web as well as in offline mode.
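
    The segment-then-reassemble idea can be sketched generically. The toy below splits a byte stream into fixed-size segments, compresses each independently, and verifies a lossless round trip; zlib stands in for a real video codec such as HEVC, and nothing here reproduces the paper's actual pipeline.

```python
import zlib

def compress_segments(data: bytes, seg_size: int) -> list:
    """Split `data` into fixed-size segments and compress each one
    independently, so segments can be transmitted or stored separately."""
    chunks = [data[i:i + seg_size] for i in range(0, len(data), seg_size)]
    return [zlib.compress(c) for c in chunks]

def reassemble(segments: list) -> bytes:
    """Decompress each segment and concatenate them back in order."""
    return b"".join(zlib.decompress(s) for s in segments)

payload = bytes(range(256)) * 64          # stand-in for raw video data
parts = compress_segments(payload, 1024)
assert reassemble(parts) == payload       # round trip is lossless
print(len(parts), "segments")
```

    Independent segments trade a little compression efficiency (no cross-segment redundancy is exploited) for lower startup delay and the ability to process or fetch segments in parallel.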

  1. Digital video technologies and their network requirements

    Energy Technology Data Exchange (ETDEWEB)

    R. P. Tsang; H. Y. Chen; J. M. Brandt; J. A. Hutchins

    1999-11-01

    Coded digital video signals are considered to be one of the most difficult data types to transport due to their real-time requirements and high bit rate variability. In this study, the authors discuss the coding mechanisms incorporated by the major compression standards bodies, i.e., JPEG and MPEG, as well as more advanced coding mechanisms such as wavelet and fractal techniques. The relationship between the applications which use these coding schemes and their network requirements are the major focus of this study. Specifically, the authors relate network latency, channel transmission reliability, random access speed, buffering and network bandwidth with the various coding techniques as a function of the applications which use them. Such applications include High-Definition Television, Video Conferencing, Computer-Supported Collaborative Work (CSCW), and Medical Imaging.

  2. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules, F9-F11

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with three of the functional modules in the code. Those are the Morse-SGC for the SCALE system, Heating 7.2, and KENO V.a. The manual describes the latest released versions of the codes.

  3. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules, F9-F11

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with three of the functional modules in the code. Those are the Morse-SGC for the SCALE system, Heating 7.2, and KENO V.a. The manual describes the latest released versions of the codes

  4. French codes and standards for design, construction and in-service inspection of nuclear power plants

    International Nuclear Information System (INIS)

    Hugot, G.; Grandemange, J. M.

    1995-01-01

    In 1970, France decided that its future power plants would be of the Pressurized Water Reactor type. This choice proved to be successful since it resulted in more than 60 PWR units in operation or under construction in France and abroad. At the beginning of such a program, the French engineering and manufacturing industry, the national electrical utility and the Safety Authorities had to face the many challenges imposed by the implementation of an imported technology. The government reorganised the licensing process. FRAMATOME, the NSSS vendor, and EDF (Electricite de France), the national utility, decided to create 'AFCEN', the French Association for Design and Construction Rules for Nuclear Island Components. These rules, the RCCs (Règles de Conception et de Construction), which are approved by the French Safety Authorities, deal with mechanical and electrical equipment as well as with nuclear fuel and civil works. They are now being supplemented by in-service inspection rules, the RSEs (Règles d'Inspection en Service). The paper presents these Codes and their main updates following experience of application, technical progress and the evolution of standards. The status of discussions concerning reference to European standardisation and developments of rules applicable to the EPR project is also discussed.

  5. Hospital Standardized Mortality Ratios: Sensitivity Analyses on the Impact of Coding

    Science.gov (United States)

    Bottle, Alex; Jarman, Brian; Aylin, Paul

    2011-01-01

    Introduction Hospital standardized mortality ratios (HSMRs) are derived from administrative databases and cover 80 percent of in-hospital deaths with adjustment for available case mix variables. They have been criticized for being sensitive to issues such as clinical coding but on the basis of limited quantitative evidence. Methods In a set of sensitivity analyses, we compared regular HSMRs with HSMRs resulting from a variety of changes, such as a patient-based measure, not adjusting for comorbidity, not adjusting for palliative care, excluding unplanned zero-day stays ending in live discharge, and using more or fewer diagnoses. Results Overall, regular and variant HSMRs were highly correlated (ρ > 0.8), but differences of up to 10 points were common. Two hospitals were particularly affected when palliative care was excluded from the risk models. Excluding unplanned stays ending in same-day live discharge had the least impact despite their high frequency. The largest impacts were seen when capturing postdischarge deaths and using just five high-mortality diagnosis groups. Conclusions HSMRs in most hospitals changed by only small amounts from the various adjustment methods tried here, though small-to-medium changes were not uncommon. However, the position relative to funnel plot control limits could move in a significant minority even with modest changes in the HSMR. PMID:21790587
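    The statistic at issue here reduces to a simple ratio; a minimal sketch (illustrative only, with invented counts, not the authors' risk-adjustment model):

```python
# Illustrative sketch with invented numbers: an HSMR is the ratio of observed
# in-hospital deaths to the expected deaths predicted by a case-mix risk model,
# conventionally scaled by 100.
def hsmr(observed_deaths: int, expected_deaths: float) -> float:
    """Hospital standardized mortality ratio: 100 * observed / expected."""
    return 100.0 * observed_deaths / expected_deaths

# A hospital with 420 observed deaths against 400 expected from its case mix
# sits 5% above the benchmark:
print(round(hsmr(420, 400), 1))  # -> 105.0
```

    Values above 100 indicate more deaths than the risk model predicts; the sensitivity analyses described in the abstract amount to recomputing the expected-deaths denominator under different coding and adjustment choices.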

  6. PHITS code improvements by Regulatory Standard and Research Department Secretariat of Nuclear Regulation Authority

    International Nuclear Information System (INIS)

    Goko, Shinji

    2017-01-01

    Safety analyses must be submitted when a nuclear power company applies for a facility or equipment installation permit, a business license, design approval, etc. To evaluate the adequacy of these safety analyses, the Regulatory Standard and Research Department Secretariat of the Nuclear Regulation Authority continuously conducts safety research on the introduction and improvement of various technologies. In the field of shielding analysis for nuclear fuel transport, this group improved the PHITS code to make it applicable to the field and has been promoting its improvement as a tool used for regulation since FY2013. This paper introduces the history and progress of this safety research. PHITS 2.88, the latest version as of November 2016, is equipped with an automatic generation function for variance-reduction parameters [T-WWG], among other features, and has developed into a tool with many functions that are effective in the practical regulation of nuclear power. In addition, the group performed verification analyses of nuclear fuel packages, which showed good agreement with analyses by MCNP, a code that is used extensively worldwide and has an abundant record of results. The PHITS results also agree relatively well with measured values once differences between the analysis and measurement conditions are taken into account. (A.O.)

  7. On the Impact of Zero-padding in Network Coding Efficiency with Internet Traffic and Video Traces

    DEFF Research Database (Denmark)

    Taghouti, Maroua; Roetter, Daniel Enrique Lucani; Pedersen, Morten Videbæk

    2016-01-01

    Random Linear Network Coding (RLNC) theoretical results typically assume that packets have equal sizes while in reality, data traffic presents a random packet size distribution. Conventional wisdom considers zero-padding of original packets as a viable alternative, but its effect can reduce the e...
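    The padding overhead in question can be quantified with a short sketch (a simplified illustration of the zero-padding scheme, not the paper's traffic model; the packet sizes below are invented):

```python
# Simplified illustration: with RLNC, packets coded together in one generation
# are zero-padded to the length of the largest packet, so the wasted fraction
# grows with the spread of the packet-size distribution.
def padding_overhead(packet_sizes):
    """Fraction of transmitted bytes that are zero-padding."""
    target = max(packet_sizes)                 # every packet padded to this size
    padded_total = target * len(packet_sizes)  # bytes actually coded/sent
    useful = sum(packet_sizes)                 # bytes of original payload
    return (padded_total - useful) / padded_total

print(round(padding_overhead([1500, 1500, 1500]), 3))  # -> 0.0 (equal sizes)
print(round(padding_overhead([200, 700, 1500]), 3))    # -> 0.467
```

    With equal-size packets the conventional assumption is harmless; with a realistic mix of small and large packets, nearly half the coded bytes here are padding, which is the effect the paper measures on real traces.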

  8. Multimedia signal coding and transmission

    CERN Document Server

    Ohm, Jens-Rainer

    2015-01-01

    This textbook covers the theoretical background of one- and multidimensional signal processing, statistical analysis and modelling, coding and information theory with regard to the principles and design of image, video and audio compression systems. The theoretical concepts are augmented by practical examples of algorithms for multimedia signal coding technology, and related transmission aspects. On this basis, principles behind multimedia coding standards, including most recent developments like High Efficiency Video Coding, can be well understood. Furthermore, potential advances in future development are pointed out. Numerous figures and examples help to illustrate the concepts covered. The book was developed on the basis of a graduate-level university course, and most chapters are supplemented by exercises. The book is also a self-contained introduction both for researchers and developers of multimedia compression systems in industry.

  9. Views of Evidence-Based Practice: Social Workers' Code of Ethics and Accreditation Standards as Guides for Choice

    Science.gov (United States)

    Gambrill, Eileen

    2007-01-01

    Different views of evidence-based practice (EBP) include defining it as the use of empirically-validated treatments and practice guidelines (i.e., the EBPs approach) in contrast to the broad philosophy and related evolving process described by the originators. Social workers can draw on their code of ethics and accreditation standards both to…

  10. NODC Standard Product: NODC Taxonomic Code on CD-ROM (NODC Accession 0050418)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The content of the NODC Taxonomic Code, Version 8 CD-ROM (CD-ROM NODC-68) distributed by NODC is archived in this accession. Version 7 of the NODC Taxonomic Code...

  11. Innovation and Standardization in School Building: A Proposal for the National Code in Italy.

    Science.gov (United States)

    Ridolfi, Giuseppe

    This document discusses the University of Florence's experience and concepts as it developed the research to define a proposal for designing a new national school building code. Section 1 examines the current school building code and the Italian Reform Process in Education between 1960 and 2000. Section 2 details and explains the new school…

  12. Network Coding to Enhance Standard Routing Protocols in Wireless Mesh Networks

    DEFF Research Database (Denmark)

    Pahlevani, Peyman; Roetter, Daniel Enrique Lucani; Fitzek, Frank

    2013-01-01

    This paper introduces a design and simulation of a locally optimized network coding protocol, called PlayNCool, for wireless mesh networks. PlayNCool is easy to implement and compatible with existing routing protocols and devices. This allows the system to gain from network coding capabilities i...

  13. WIAMan Technology Demonstrator Sensor Codes Conforming to International Organization for Standardization/Technical Standard (ISO/TS) 13499

    Science.gov (United States)

    2016-03-01

    The International Organization for Standardization (ISO) Multimedia Exchange task force is responsible for maintaining the specification for the multimedia data exchange format for impact tests outlined in ISO/TS 13499. Keywords: channel codes, ATD, multimedia exchange format, ISO/TS 13499.

  14. Portable Video Media Versus Standard Verbal Communication in Surgical Information Delivery to Nurses: A Prospective Multicenter, Randomized Controlled Crossover Trial.

    Science.gov (United States)

    Kam, Jonathan; Ainsworth, Hannah; Handmer, Marcus; Louie-Johnsun, Mark; Winter, Matthew

    2016-10-01

    Continuing education of health professionals is important for delivery of quality health care. Surgical nurses are often required to understand surgical procedures. Nurses need to be aware of the expected outcomes and recognize potential complications of such procedures during their daily work. Traditional educational methods, such as conferences and tutorials or informal education at the bedside, have many drawbacks for delivery of this information in a universal, standardized, and timely manner. The rapid uptake of portable media devices makes portable video media (PVM) a potential alternative to current educational methods. To compare PVM to standard verbal communication (SVC) for surgical information delivery and educational training for nurses and evaluate its impact on knowledge acquisition and participant satisfaction. Prospective, multicenter, randomized controlled crossover trial. Two hospitals: Gosford District Hospital and Wyong Hospital. Seventy-two nursing staff (36 at each site). Information delivery via PVM (a 7-minute video) compared to information delivered via SVC. Knowledge acquisition was measured by a 32-point questionnaire, and satisfaction with the method of education delivery was measured using the validated Client Satisfaction Questionnaire (CSQ-8). Knowledge acquisition was higher via PVM compared to SVC 25.9 (95% confidence interval [CI] 25.2-26.6) versus 24.3 (95% CI 23.5-25.1), p = .004. Participant satisfaction was higher with PVM 29.5 (95% CI 28.3-30.7) versus 26.5 (95% CI 25.1-27.9), p = .003. Following information delivery via SVC, participants had a 6% increase in knowledge scores, 24.3 (95% CI 23.5-25.1) versus 25.7 (95% CI 24.9-26.5) p = .001, and a 13% increase in satisfaction scores, 26.5 (95% CI 25.1-27.9) versus 29.9 (95% CI 28.8-31.0) p < .001, when they crossed over to information delivery via PVM. PVM provides a novel method for providing education to nurses that improves knowledge retention and satisfaction with the

  15. Feasibility of video codec algorithms for software-only playback

    Science.gov (United States)

    Rodriguez, Arturo A.; Morse, Ken

    1994-05-01

    Software-only video codecs can provide good playback performance in desktop computers with a 486 or 68040 CPU running at 33 MHz without special hardware assistance. Typically, playback of compressed video can be categorized into three tasks: the actual decoding of the video stream, color conversion, and the transfer of decoded video data from system RAM to video RAM. By current standards, good playback performance is the decoding and display of video streams of 320 by 240 (or larger) compressed frames at 15 (or greater) frames per second. Software-only video codecs have evolved by modifying and tailoring existing compression methodologies to suit video playback in desktop computers. In this paper we examine the characteristics used to evaluate software-only video codec algorithms, namely: image fidelity (i.e., image quality), bandwidth (i.e., compression), ease-of-decoding (i.e., playback performance), memory consumption, compression to decompression asymmetry, scalability, and delay. We discuss the tradeoffs among these variables and the compromises that can be made to achieve low numerical complexity for software-only playback. Frame-differencing approaches are described since software-only video codecs typically employ them to enhance playback performance. To complement other papers that appear in this session of the Proceedings, we review methods derived from binary pattern image coding since these methods are amenable for software-only playback. In particular, we introduce a novel approach called pixel distribution image coding.
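    A minimal sketch of the frame-differencing idea mentioned above (illustrative only, operating on flat pixel lists rather than any real codec's block structure):

```python
# Illustrative frame-differencing sketch, not any specific codec: encode only
# the pixels that changed since the previous frame; unchanged pixels are
# skipped, which cuts both bandwidth and per-frame decode work.
def diff_encode(prev, curr, threshold=0):
    """Return a sparse list of (index, new_value) for changed pixels."""
    return [(i, c) for i, (p, c) in enumerate(zip(prev, curr))
            if abs(c - p) > threshold]

def diff_decode(prev, updates):
    """Apply sparse updates to the previous frame to rebuild the current one."""
    frame = list(prev)
    for i, v in updates:
        frame[i] = v
    return frame

prev = [10, 10, 10, 10]
curr = [10, 42, 10, 11]
updates = diff_encode(prev, curr)
print(updates)                       # -> [(1, 42), (3, 11)]
print(diff_decode(prev, updates))    # -> [10, 42, 10, 11]
```

    In a static scene most pixels survive from frame to frame, so the sparse update list is far smaller than the frame, which is why software-only codecs of this era leaned on the technique.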

  16. A standardized imaging protocol for the endoscopic prediction of dysplasia within sessile serrated polyps (with video).

    Science.gov (United States)

    Tate, David J; Jayanna, Mahesh; Awadie, Halim; Desomer, Lobke; Lee, Ralph; Heitman, Steven J; Sidhu, Mayenaaz; Goodrick, Kathleen; Burgess, Nicholas G; Mahajan, Hema; McLeod, Duncan; Bourke, Michael J

    2018-01-01

    Dysplasia within sessile serrated polyps (SSPs) is difficult to detect and may be mistaken for an adenoma, risking incomplete resection of the background serrated tissue, and is strongly implicated in interval cancer after colonoscopy. The use of endoscopic imaging to detect dysplasia within SSPs has not been systematically studied. Consecutively detected SSPs ≥8 mm in size were evaluated by using a standardized imaging protocol at a tertiary-care endoscopy center over 3 years. Lesions suspected as SSPs were analyzed with high-definition white light then narrow-band imaging. A demarcated area with a neoplastic pit pattern (Kudo type III/IV, NICE type II) was sought among the serrated tissue. If this was detected, the lesion was labeled dysplastic (sessile serrated polyp with dysplasia); if not, it was labeled non-dysplastic (sessile serrated polyp without dysplasia). Histopathology was reviewed by 2 blinded specialist GI pathologists. A total of 141 SSPs were assessed in 83 patients. Median lesion size was 15.0 mm (interquartile range 10-20), and 54.6% were in the right side of the colon. Endoscopic evidence of dysplasia was detected in 36 of 141 (25.5%) SSPs; of these, 5 of 36 (13.9%) lacked dysplasia at histopathology. Two of 105 (1.9%) endoscopically designated non-dysplastic SSPs had dysplasia at histopathology. Endoscopic imaging, therefore, had an accuracy of 95.0% (95% confidence interval [CI], 90.1%-97.6%) and a negative predictive value of 98.1% (95% CI, 92.6%-99.7%) for detection of dysplasia within SSPs. Dysplasia within SSPs can be detected accurately by using a simple, broadly applicable endoscopic imaging protocol that allows complete resection. Independent validation of this protocol and its dissemination to the wider endoscopic community may have a significant impact on rates of interval cancer. (Clinical trial registration number: NCT03100552.). Copyright © 2018 American Society for Gastrointestinal Endoscopy. Published by Elsevier Inc. All
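    The reported accuracy and negative predictive value follow directly from the counts given in the abstract; a quick check (standard confusion-matrix arithmetic, not the authors' code):

```python
# Reconstructing the reported figures from the counts in the abstract:
# 36 lesions were called dysplastic (5 of which lacked dysplasia on histology)
# and 105 were called non-dysplastic (2 of which had dysplasia).
tp, fp = 36 - 5, 5      # true positives, false positives
tn, fn = 105 - 2, 2     # true negatives, false negatives

accuracy = (tp + tn) / (tp + fp + tn + fn)
npv = tn / (tn + fn)

print(f"accuracy = {accuracy:.1%}")  # -> accuracy = 95.0%
print(f"NPV = {npv:.1%}")            # -> NPV = 98.1%
```

    Both values match the point estimates quoted in the abstract (95.0% accuracy, 98.1% NPV).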

  17. A Code of Ethics and Standards for Outer-Space Commerce

    Science.gov (United States)

    Livingston, David M.

    2002-01-01

    Now is the time to put forth an effective code of ethics for businesses in outer space. A successful code would be voluntary and would actually promote the growth of individual companies, not hinder their efforts to provide products and services. A properly designed code of ethics would ensure the development of space commerce unfettered by government-created barriers. Indeed, if the commercial space industry does not develop its own professional code of ethics, government- imposed regulations would probably be instituted. Should this occur, there is a risk that the development of off-Earth commerce would become more restricted. The code presented in this paper seeks to avoid the imposition of new barriers to space commerce as well as make new commercial space ventures easier to develop. The proposed code consists of a preamble, which underscores basic values, followed by a number of specific principles. For the most part, these principles set forth broad commitments to fairness and integrity with respect to employees, consumers, business transactions, political contributions, natural resources, off-Earth development, designated environmental protection zones, as well as relevant national and international laws. As acceptance of this code of ethics grows within the industry, general modifications will be necessary to accommodate the different types of businesses entering space commerce. This uniform applicability will help to assure that the code will not be perceived as foreign in nature, potentially restrictive, or threatening. Companies adopting this code of ethics will find less resistance to their space development plans, not only in the United States but also from nonspacefaring nations. Commercial space companies accepting and refining this code would demonstrate industry leadership and an understanding that will serve future generations living, working, and playing in space. Implementation of the code would also provide an off-Earth precedent for a modified

  18. Analyses in Support of Risk-Informed Natural Gas Vehicle Maintenance Facility Codes and Standards: Phase II.

    Energy Technology Data Exchange (ETDEWEB)

    Blaylock, Myra L.; LaFleur, Chris Bensdotter; Muna, Alice Baca; Ehrhart, Brian David

    2018-03-01

    Safety standards development for maintenance facilities of liquid and compressed natural gas fueled vehicles is required to ensure proper facility design and operating procedures. Standards development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase II work on existing NGV repair facility code requirements and highlights inconsistencies that require quantitative analysis of their effectiveness. A Hazard and Operability (HAZOP) study was performed to identify key scenarios of interest using risk ranking. Detailed simulations and modeling were performed to estimate the location and behavior of natural gas releases based on these scenarios. Specific code conflicts were identified, and ineffective code requirements were highlighted and resolutions proposed. These include basing the ventilation rate on area or volume, as well as a ceiling offset that seems ineffective at protecting against flammable gas concentrations. Acknowledgements: The authors gratefully acknowledge Bill Houf (SNL, retired) for his assistance with the set-up and post-processing of the numerical simulations. The authors also acknowledge Doug Horne (retired) for his helpful discussions. We would also like to acknowledge the support from the Clean Cities program of DOE's Vehicle Technology Office.

  19. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Control modules C4, C6

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U. S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume is part of the manual related to the control modules for the newest updated version of this computational package.

  20. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules F1-F8

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with eight of the functional modules in the code. Those are: BONAMI - resonance self-shielding by the Bondarenko method; NITAWL-II - SCALE system module for performing resonance shielding and working library production; XSDRNPM - a one-dimensional discrete-ordinates code for transport analysis; XSDOSE - a module for calculating fluxes and dose rates at points outside a shield; KENO IV/S - an improved Monte Carlo criticality program; COUPLE; ORIGEN-S - SCALE system module to calculate fuel depletion, actinide transmutation, fission product buildup and decay, and associated radiation source terms; ICE

  1. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules F1-F8

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with eight of the functional modules in the code. Those are: BONAMI - resonance self-shielding by the Bondarenko method; NITAWL-II - SCALE system module for performing resonance shielding and working library production; XSDRNPM - a one-dimensional discrete-ordinates code for transport analysis; XSDOSE - a module for calculating fluxes and dose rates at points outside a shield; KENO IV/S - an improved Monte Carlo criticality program; COUPLE; ORIGEN-S - SCALE system module to calculate fuel depletion, actinide transmutation, fission product buildup and decay, and associated radiation source terms; ICE.

  2. Genetic hotels for the standard genetic code: evolutionary analysis based upon novel three-dimensional algebraic models.

    Science.gov (United States)

    José, Marco V; Morgado, Eberto R; Govezensky, Tzipe

    2011-07-01

    Herein, we rigorously develop novel 3-dimensional algebraic models called Genetic Hotels of the Standard Genetic Code (SGC). We start by considering the primeval RNA genetic code which consists of the 16 codons of type RNY (purine-any base-pyrimidine). Using simple algebraic operations, we show how the RNA code could have evolved toward the current SGC via two different intermediate evolutionary stages called Extended RNA code type I and II. By rotations or translations of the subset RNY, we arrive at the SGC via the former (type I) or via the latter (type II), respectively. Biologically, the Extended RNA code type I, consists of all codons of the type RNY plus codons obtained by considering the RNA code but in the second (NYR type) and third (YRN type) reading frames. The Extended RNA code type II, comprises all codons of the type RNY plus codons that arise from transversions of the RNA code in the first (YNY type) and third (RNR) nucleotide bases. Since the dimensions of remarkable subsets of the Genetic Hotels are not necessarily integer numbers, we also introduce the concept of algebraic fractal dimension. A general decoding function which maps each codon to its corresponding amino acid or the stop signals is also derived. The Phenotypic Hotel of amino acids is also illustrated. The proposed evolutionary paths are discussed in terms of the existing theories of the evolution of the SGC. The adoption of 3-dimensional models of the Genetic and Phenotypic Hotels will facilitate the understanding of the biological properties of the SGC.
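    The primeval RNY pattern described above is easy to enumerate; a small sketch (illustrative only, not the authors' algebraic model):

```python
# Illustrative sketch: classify codons by the primeval RNY pattern --
# purine (R = A/G), any base (N), pyrimidine (Y = C/U).
PURINES, PYRIMIDINES = set("AG"), set("CU")

def is_rny(codon: str) -> bool:
    """True if the codon matches the primeval RNY pattern."""
    first, _, third = codon.upper()
    return first in PURINES and third in PYRIMIDINES

# Enumerating all 64 codons recovers the 2 * 4 * 2 = 16 codons of type RNY
# that the abstract takes as the primeval RNA code.
bases = "ACGU"
rny = [a + b + c for a in bases for b in bases for c in bases
       if is_rny(a + b + c)]
print(len(rny))  # -> 16
```

    The Extended RNA code types I and II in the abstract arise by applying frame shifts or base transversions to this 16-codon subset; the same predicate style extends to testing the NYR, YRN, YNY, and RNR classes.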

  3. Video Comparator

    International Nuclear Information System (INIS)

    Rose, R.P.

    1978-01-01

    The Video Comparator is a comparative gage that uses electronic images from two sources, a standard and an unknown. Two matched video cameras are used to obtain the electronic images. The video signals are mixed and displayed on a single video receiver (CRT). The video system is manufactured by ITP of Chatsworth, CA and is a Tele-Microscope II, Model 148. One of the cameras is mounted on a toolmaker's microscope stand and produces a 250X image of a cast. The other camera is mounted on a stand and produces an image of a 250X template. The two video images are mixed in a control box provided by ITP and displayed on a CRT. The template or the cast can be moved to align the desired features. Vertical reference lines are provided on the CRT, and a feature on the cast can be aligned with a line on the CRT screen. The stage containing the casts can be moved using a Boeckeler micrometer equipped with a digital readout, so a second feature can be aligned with the reference line and the distance moved obtained from the digital display.

  4. Application of the Coastal and Marine Ecological Classification Standard to ROV Video Data for Enhanced Analysis of Deep-Sea Habitats in the Gulf of Mexico

    Science.gov (United States)

    Ruby, C.; Skarke, A. D.; Mesick, S.

    2016-02-01

    The Coastal and Marine Ecological Classification Standard (CMECS) is a network of common nomenclature that provides a comprehensive framework for organizing physical, biological, and chemical information about marine ecosystems. It was developed by the National Oceanic and Atmospheric Administration (NOAA) Coastal Services Center, in collaboration with other federal agencies and academic institutions, as a means for scientists to more easily access, compare, and integrate marine environmental data from a wide range of sources and time frames. CMECS has been endorsed by the Federal Geographic Data Committee (FGDC) as a national metadata standard. The research presented here is focused on the application of CMECS to deep-sea video and environmental data collected by the NOAA ROV Deep Discoverer and the NOAA Ship Okeanos Explorer in the Gulf of Mexico in 2011-2014. Specifically, a spatiotemporal index of the physical, chemical, biological, and geological features observed in ROV video records was developed in order to allow scientists otherwise unfamiliar with the specific content of existing video data to rapidly determine the abundance and distribution of features of interest, and thus evaluate the applicability of those video data to their research. CMECS units (setting, component, or modifier) for seafloor images extracted from high-definition ROV video data were established based upon visual assessment as well as analysis of coincident environmental sensor (temperature, conductivity), navigation (ROV position, depth, attitude), and log (narrative dive summary) data. The resulting classification units were integrated into easily searchable textual and geo-databases as well as an interactive web map. The spatial distribution and associations of deep-sea habitats as indicated by CMECS classifications are described and optimized methodological approaches for application of CMECS to deep-sea video and environmental data are presented.

  5. Recommendations for codes and standards to be used for design and fabrication of high level waste canister

    International Nuclear Information System (INIS)

    Bermingham, A.J.; Booker, R.J.; Booth, H.R.; Ruehle, W.G.; Shevekov, S.; Silvester, A.G.; Tagart, S.W.; Thomas, J.A.; West, R.G.

    1978-01-01

    This study identifies codes, standards, and regulatory requirements for developing design criteria for high-level waste (HLW) canisters for commercial operation. It has been determined that the canister should be designed as a pressure vessel without provision for any overpressure protection type devices. It is recommended that the HLW canister be designed and fabricated to the requirements of the ASME Section III Code, Division 1 rules, for Code Class 3 components. Identification of other applicable industry and regulatory guides and standards is provided in this report. Requirements for the Design Specification are found in the ASME Section III Code. It is recommended that design verification be conducted principally with prototype testing which will encompass normal and accident service conditions during all phases of the canister life. Adequacy of existing quality assurance and licensing standards for the canister was investigated. One of the recommendations derived from this study is a requirement that the canister be N-stamped. In addition, acceptance standards for the HLW should be established and the waste qualified to those standards before the canister is sealed. A preliminary investigation of use of an overpack for the canister has been made, and it is concluded that the use of an overpack, as an integral part of overall canister design, is undesirable, both from a design and economics standpoint. However, use of shipping cask liners and overpack type containers at the Federal repository may make the canister and HLW management safer and more cost effective. There are several possible concepts for canister closure design. These concepts can be adapted to the canister with or without an overpack. A remote seal weld closure is considered to be one of the most suitable closure methods; however, mechanical seals should also be investigated.

  6. Overview of Development and Deployment of Codes, Standards and Regulations Affecting Energy Storage System Safety in the United States

    Energy Technology Data Exchange (ETDEWEB)

    Conover, David R.

    2014-08-22

    This report acquaints stakeholders and interested parties involved in the development and/or deployment of energy storage systems (ESS) with the subject of safety-related codes, standards and regulations (CSRs). It is hoped that users of this document gain a more in-depth and uniform understanding of safety-related CSR development and deployment, one that can foster improved communications among all ESS stakeholders and the collaboration needed to realize more timely acceptance and approval of safe ESS technology through appropriate CSRs.

  7. Dimensioning Method for Conversational Video Applications in Wireless Convergent Networks

    Directory of Open Access Journals (Sweden)

    Raquel Perez Leal

    2007-12-01

    Full Text Available New convergent services are becoming possible thanks to the expansion of IP networks, the availability of innovative advanced coding formats such as H.264, which reduce network bandwidth requirements while providing good video quality, and the rapid growth in the supply of dual-mode WiFi cellular terminals. This paper provides, first, a comprehensive overview of the several technologies involved, such as the medium access protocol of IEEE 802.11, the H.264 advanced video coding standard, and conversational application characterization and recommendations. Second, the paper presents a new and simple dimensioning model for conversational video over wireless LAN. The WLAN is addressed from the perspectives of optimal network throughput and video quality. The maximum number of simultaneous users derived from throughput is limited by the collisions taking place in the shared medium under the statistical contention protocol, while video quality is conditioned by the packet loss in that protocol. Both approaches are analyzed within the scope of the advanced video codecs used in conversational video over IP, to conclude that dimensioning conversational video on network throughput alone is not enough to ensure a satisfactory user experience; video quality also has to be taken into account. Finally, the proposed model is applied to a real-office scenario.
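The paper's two limits, one from throughput and one from quality, can be contrasted with a toy calculation. This is a back-of-the-envelope sketch, not the paper's model: all figures (effective throughput, per-call bit rate, loss growth, loss tolerance) are hypothetical stand-ins.

```python
# Back-of-the-envelope WLAN dimensioning sketch (hypothetical figures,
# not the paper's model or measurements).

def max_users_by_throughput(effective_throughput_bps, per_call_bps):
    """Users that fit within the effective (post-contention) WLAN throughput."""
    return int(effective_throughput_bps // per_call_bps)

def max_users_by_quality(per_user_loss_bp, tolerable_loss_bp):
    """Users admissible before contention packet loss exceeds what the codec
    tolerates; loss is in basis points and assumed to grow linearly with the
    number of users (a deliberate simplification)."""
    return tolerable_loss_bp // per_user_loss_bp

# Hypothetical: 20 Mb/s effective throughput, 384 kb/s per direction of an
# H.264 call, 0.2% (20 bp) added loss per user, 1% (100 bp) tolerable loss.
by_throughput = max_users_by_throughput(20e6, 2 * 384e3)
by_quality = max_users_by_quality(20, 100)
print(by_throughput, by_quality, min(by_throughput, by_quality))  # 26 5 5
```

With these invented numbers the quality limit, not the throughput limit, is the binding one, which mirrors the paper's conclusion.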

  9. Advocacy and Accessibility Standards in the New "Code of Professional Ethics for Rehabilitation Counselors"

    Science.gov (United States)

    Waldmann, Ashley K.; Blackwell, Terry L.

    2010-01-01

    This article addresses the changes in the Commission on Rehabilitation Counselor Certification's 2010 "Code of Professional Ethics for Rehabilitation Counselors" as they relate to Section C: Advocacy and Accessibility. Ethical issues are identified and discussed in relation to advocacy skills and to advocacy with, and on behalf of, the client; to…

  10. The Teaching of the Code of Ethics and Standard Practices for Texas Educator Preparation Programs

    Science.gov (United States)

    Davenport, Marvin; Thompson, J. Ray; Templeton, Nathan R.

    2015-01-01

    The purpose of this descriptive quantitative research study was to answer three basic informational questions: (1) To what extent ethics training, as stipulated in Texas Administrative Code Chapter 247, was included in the EPP curriculum; (2) To what extent Texas public universities with approved EPP programs provided faculty opportunities for…

  11. MILSTAMP TACs: Military Standard Transportation and Movement Procedures Transportation Account Codes. Volume 2

    Science.gov (United States)

    1987-02-15

    ...82302 F; 13211 PT VERDE WPB 82311 F; 13212 PT SWIFT WPB 82312 E; 13214 PT THATCHER WPB 82314 E; 13218 PT HERRON WPB 82318 C; 13232 PT ROBERTS WPB 82332 E... Identifies DOT, FAA Logistics Center, Oklahoma City, as an organization to be billed. 4th Position Code A is assigned by DOT, FAA. Identifies appropriation

  12. Dynamic code block size for JPEG 2000

    Science.gov (United States)

    Tsai, Ping-Sing; LeCornec, Yann

    2008-02-01

    Since the standardization of the JPEG 2000, it has found its way into many different applications such as DICOM (digital imaging and communication in medicine), satellite photography, military surveillance, digital cinema initiative, professional video cameras, and so on. The unified framework of the JPEG 2000 architecture makes practical high quality real-time compression possible even in video mode, i.e. motion JPEG 2000. In this paper, we present a study of the compression impact using dynamic code block size instead of fixed code block size as specified in the JPEG 2000 standard. The simulation results show that there is no significant impact on compression if dynamic code block sizes are used. In this study, we also unveil the advantages of using dynamic code block sizes.
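How the code-block size parameter changes the partitioning can be seen with a small count. JPEG 2000 Part 1 constrains code-block dimensions to powers of two with an area of at most 4096 samples; the subband size below is an arbitrary example, and 64x64 is the common default.

```python
import math

def codeblock_grid(subband_w, subband_h, cb_w, cb_h):
    """Number of code blocks needed to tile a subband with cb_w x cb_h
    blocks (JPEG 2000 code-block dimensions are powers of two with an
    area of at most 4096 samples)."""
    return math.ceil(subband_w / cb_w) * math.ceil(subband_h / cb_h)

# A 720x480 subband under the common fixed 64x64 code block versus a
# hypothetical dynamic choice of 32x32:
fixed = codeblock_grid(720, 480, 64, 64)     # 12 * 8  = 96 blocks
dynamic = codeblock_grid(720, 480, 32, 32)   # 23 * 15 = 345 blocks
print(fixed, dynamic)  # 96 345
```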

  13. Mode-dependent templates and scan order for H.264/AVC-based intra lossless coding.

    Science.gov (United States)

    Gu, Zhouye; Lin, Weisi; Lee, Bu-Sung; Lau, Chiew Tong; Sun, Ming-Ting

    2012-09-01

    In H.264/advanced video coding (AVC), lossless coding and lossy coding share the same entropy coding module. However, the entropy coders in the H.264/AVC standard were originally designed for lossy video coding and do not yield adequate performance for lossless video coding. In this paper, we analyze the problem with the current lossless coding scheme and propose a mode-dependent template (MD-template) based method for intra lossless coding. By exploring the statistical redundancy of the prediction residual in the H.264/AVC intra prediction modes, more zero coefficients are generated. By designing a new scan order for each MD-template, the scanned coefficients sequence fits the H.264/AVC entropy coders better. A fast implementation algorithm is also designed. With little computation increase, experimental results confirm that the proposed fast algorithm achieves about 7.2% bit saving compared with the current H.264/AVC fidelity range extensions high profile.
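The underlying idea can be sketched abstractly: a scan order is simply a permutation of block positions, and matching it to the intra prediction mode pushes the residual's zeros to the end of the scanned sequence, where run-length-style entropy coding handles them cheaply. The residual and the mode-matched scan below are illustrative inventions, not one of the paper's actual MD-templates.

```python
# Illustrative sketch: choosing a scan order per prediction mode clusters
# the zero coefficients at the end of the scanned sequence.

def scan(block_4x4, order):
    """Flatten a 4x4 residual block following a scan order (list of (r, c))."""
    return [block_4x4[r][c] for r, c in order]

def trailing_zeros(seq):
    n = 0
    for v in reversed(seq):
        if v != 0:
            break
        n += 1
    return n

# Residual with energy concentrated in the first column, as might remain
# after a (hypothetical) horizontal-ish prediction mode:
residual = [[7, 0, 0, 0],
            [3, 0, 0, 0],
            [2, 0, 0, 0],
            [1, 0, 0, 0]]

row_scan = [(r, c) for r in range(4) for c in range(4)]
col_scan = [(r, c) for c in range(4) for r in range(4)]  # mode-matched scan

print(trailing_zeros(scan(residual, row_scan)))  # zeros interleaved: 3
print(trailing_zeros(scan(residual, col_scan)))  # zeros clustered: 12
```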

  14. 75 FR 66735 - National Fire Protection Association (NFPA): Request for Comments on NFPA's Codes and Standards

    Science.gov (United States)

    2010-10-29

    ... 59A Standard for the Production, Storage, and Handling of Liquefied Natural Gas (LNG). NFPA 75... Horizontally in Fire Resistance-Rated Floor Systems. NFPA 385 Standard for Tank Vehicles for Flammable and Combustible Liquids. NFPA 497 Recommended Practice for the Classification of Flammable Liquids, Gases, or...

  15. Croatia - Report on the Observance of Standards and Codes : Accounting and Auditing

    OpenAIRE

    World Bank

    2007-01-01

    This report provides an updated assessment of accounting, financial reporting, and auditing requirements and practices within the enterprise and financial sectors in Croatia. It uses International Financial Reporting Standards (IFRS), International Standards on Auditing (ISA), and the relevant portions of European Union (EU) law (also known as the acquis communautaire). Croatia has made co...

  16. Revision of AESJ standard 'the code of implementation of periodic safety review of nuclear power plants'

    International Nuclear Information System (INIS)

    Hirano, Masashi; Narumiya, Yoshiyuki

    2010-01-01

    The Periodic Safety Review (PSR) was launched in June 1992, when the Agency for Natural Resources and Energy issued a notification that required licensees to conduct comprehensive review on the safety of each existing nuclear power plant (NPP) once approximately every ten years based on the latest technical findings, for the purpose of improving the safety of the NPP. In 2006, the Standard Committee of the Atomic Energy Society of Japan established the first version of 'The Standard of Implementation for Periodic Safety Review of Nuclear Power Plants: 2006'. Taking into account developments in the safety regulation of PSR after the issuance of the first version, the Standard Committee has revised the Standard. This paper summarizes the background on PSR and these developments, the major contents of the Standard, as well as the focal points of the revision. (author)

  17. Mongolia; Report on the Observance of Standards and Codes-Fiscal Transparency

    OpenAIRE

    International Monetary Fund

    2001-01-01

    This report provides an assessment of fiscal transparency practices in Mongolia against the requirements of the IMF Code of Good Practices on Fiscal Transparency. This paper analyzes the government's participation in the financial and nonfinancial sectors of the economy. Executive Directors appreciated the achievements, and stressed the need for improvements in the areas of fiscal transparency. They emphasized the need for addressing weaknesses of fiscal data, maintaining a legal framework fo...

  18. Quality assurance of the French nuclear market - IAEA code and standardization

    International Nuclear Information System (INIS)

    Pavaux, F.

    1980-06-01

    The fact that Quality Assurance was imported from abroad and our reticence to reach agreement on single and accurate texts explain, if not excuse, the abundance of reference requirements existing on the French nuclear market with respect to Quality Assurance Programmes. But all is not lost, since the IAEA Good Practice Code is perhaps the solution that, in a few years time, will enable all French industrialists to work and be assessed by their customers, according to the same reference text [fr

  19. The Canadian approach to nuclear codes and standards. A CSA forum for development of standards for CANDU: radioactive waste management and decommissioning

    International Nuclear Information System (INIS)

    Shin, T.; Azeez, S.; Dua, S.

    2006-01-01

    Together with the Canadian Standards Association (CSA), industry stakeholders, governments, and the public have developed a suite of standards for CANDU nuclear power plants that generate electricity in Canada and abroad. In this paper, we will describe: CSA's role in national and international nuclear standards development; the key issues and priority projects that the nuclear standards program has addressed; the new CSA nuclear committees and projects being established, particularly those related to waste management and decommissioning; the hierarchy of nuclear regulations, nuclear, and other standards in Canada, and how they are applied by AECL; the standards management activities; and the future trends and challenges for CSA and the nuclear community. CSA is an accredited Standards Development Organization (SDO) and part of the international standards system. CSA's Nuclear Strategic Steering Committee (NSSC) provides leadership, direction, and support for a standards committee hierarchy comprised of members from a balanced matrix of interests. The NSSC strategically focuses on industry challenges: a new nuclear regulatory system, deregulated energy markets, and industry restructuring. As the first phase of priority projects is nearing completion, the next phase of priorities is being identified. These priorities address radioactive waste management, environmental radiation management, decommissioning, structural, and seismic issues. As the CSA committees get established in the coming year, members and input will be solicited for the technical committees, subcommittees, and task forces for the following related subjects: 1. Radioactive Waste Management: a) Dry Storage of Irradiated Fuel; b) Short-Term Radioactive Waste Management; c) Long-Term Storage and Disposal of Radioactive Waste; 2. Decommissioning. Nuclear power is highly regulated, and public scrutiny has focused codes and standards on public and worker safety. Licensing and regulation serves to control

  20. MISTRA facility for containment lumped parameter and CFD codes validation. Example of the International Standard Problem ISP47

    International Nuclear Information System (INIS)

    Tkatschenko, I.; Studer, E.; Paillere, H.

    2005-01-01

    During a severe accident in a Pressurized Water Reactor (PWR), the formation of a combustible gas mixture in the complex geometry of the reactor depends on the understanding of hydrogen production, the complex 3D thermal-hydraulic flow due to gas/steam injection, natural convection, heat transfer by condensation on walls, and the effect of mitigation devices. Numerical simulation of such flows may be performed either by Lumped Parameter (LP) or by Computational Fluid Dynamics (CFD) codes. Advantages and drawbacks of LP and CFD codes are well known. LP codes are mainly developed for full-size containment analysis, but they need improvements, especially since they are not able to accurately predict the local gas mixing within the containment. CFD codes require a process of validation on well-instrumented experimental data before they can be used with a high degree of confidence. The MISTRA coupled-effect test facility has been built at CEA to fulfil this validation objective: it has numerous measurement points in the gaseous volume (temperature, gas concentration, velocity and turbulence) and well-controlled boundary conditions. As an illustration of both the experimental and simulation aspects of this topic, a recent example of the use of MISTRA test data is presented for the case of the International Standard Problem ISP47. The proposed experimental work in the MISTRA facility provides essential data to fill the gaps in the modelling/validation of computational tools. (author)

  1. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Miscellaneous -- Volume 3, Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Petrie, L.M.; Jordon, W.C. [Oak Ridge National Lab., TN (United States)]; Edwards, A.L. [Oak Ridge National Lab., TN (United States); Lawrence Livermore National Lab., CA (United States)]; [and others]

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice; (2) automate the data processing and coupling between modules; and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. This manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for the functional module documentation, and Volume 3--for the data libraries and subroutine libraries.

  2. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Control modules -- Volume 1, Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Landers, N.F.; Petrie, L.M.; Knight, J.R. [Oak Ridge National Lab., TN (United States)] [and others]

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. This manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for the functional module documentation, and Volume 3 for the documentation of the data libraries and subroutine libraries.

  3. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Miscellaneous -- Volume 3, Revision 4

    International Nuclear Information System (INIS)

    Petrie, L.M.; Jordon, W.C.; Edwards, A.L.

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice; (2) automate the data processing and coupling between modules; and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. This manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for the functional module documentation, and Volume 3--for the data libraries and subroutine libraries

  4. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Control modules -- Volume 1, Revision 4

    International Nuclear Information System (INIS)

    Landers, N.F.; Petrie, L.M.; Knight, J.R.

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. This manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for the functional module documentation, and Volume 3 for the documentation of the data libraries and subroutine libraries

  5. What is expected of the academies: activities of the code and standard divisions in JSME, AESJ and JEA

    International Nuclear Information System (INIS)

    Miyano, Hiroshi

    2005-01-01

    The present state of standards-development activities at the three academies in the title is summarized from the viewpoint of 'explanation accountability'. Their roles and the directions in which they should move forward are also summarized. Concretely, the trends in the standards-development activities of these academies relating to the nuclear industry are summarized. Focusing on the three indispensable conditions for a standards-developing body (neutrality, impartiality and transparency) and their implementation, the verification of these conditions and of 'explanation accountability' at the three academies is described. (K. Kato)

  6. Czech Republic; Report on Observance of Standards and Codes-Fiscal Transparency Module-Update

    OpenAIRE

    International Monetary Fund

    2003-01-01

    The Czech government has made further progress in improving fiscal transparency that was already high by international standards. The measures implemented to broaden the coverage of general government data have been commended. Improved reporting on fiscal risks, including those arising from contingent liabilities, has been welcomed. However, greater effort is needed to improve the public availability of fiscal data and to maintain regular tax expenditure reports. Ensuring appropriate standard...

  7. Physician involvement enhances coding accuracy to ensure national standards: an initiative to improve awareness among new junior trainees.

    Science.gov (United States)

    Nallasivan, S; Gillott, T; Kamath, S; Blow, L; Goddard, V

    2011-06-01

    Record Keeping Standards is a development led by the Royal College of Physicians of London (RCP) Health Informatics Unit and funded by the National Health Service (NHS) Connecting for Health. A supplementary report produced by the RCP makes a number of recommendations based on a study held at an acute hospital trust. We audited the medical notes and coding to assess accuracy and documentation by the junior doctors, and also to correlate our findings with the RCP audit. Northern Lincolnshire & Goole Hospitals NHS Foundation Trust has 114,000 'finished consultant episodes' per year. A total of 100 consecutive medical (50) and rheumatology (50) discharges from Diana Princess of Wales Hospital from August-October 2009 were reviewed. The results showed an improvement in coding accuracy (10% errors), comparable to the RCP audit but with 5% documentation errors. Physician involvement needs enhancing to improve effectiveness and to ensure clinical safety.

  8. Does video-assisted mediastinoscopy offer lower false-negative rates for subcarinal lymph nodes compared with standard cervical mediastinoscopy?

    Science.gov (United States)

    Citak, Necati; Buyukkale, Songul; Kok, Abdulaziz; Celikten, Alper; Metin, Muzaffer; Sayar, Adnan; Gurses, Atilla

    2014-10-01

    Theoretically, video-assisted mediastinoscopy (VAM) offers improved staging of subcarinal lymph nodes (LNs) compared with standard cervical mediastinoscopy (SCM). Materials and Methods: Between 2006 and 2011, 553 patients (SCM, n = 293; VAM, n = 260) with non-small cell lung carcinoma who underwent mediastinoscopy were investigated. Mediastinoscopy was performed only in select patients based on computed tomography (CT) or positron emission tomography CT scans in our center. The mean number of LNs and stations sampled per case was significantly higher with VAM (n = 7.65 ± 1.68 and n = 4.22 ± 0.83) than with SCM (n = 6.91 ± 1.65 and 3.92 ± 86.4; p < 0.001). The percentage of patients sampled in station 7 was significantly higher with VAM (98.8%) than with SCM (93.8%; p = 0.002). Mediastinal LN metastasis was observed in 114 patients by mediastinoscopy. The remaining 439 patients (203 in the VAM group and 236 in the SCM group) underwent thoracotomy and systematic mediastinal lymphadenectomy (SML). SML showed mediastinal nodal disease in 23 patients (false-negative [FN] rate, 5.2%). The FN rate was higher with SCM (n = 14, 5.9%) than with VAM (n = 9, 4.4%), although this difference was not statistically significant (p = 0.490). Station 7 was the most predominant station for FN results (n = 15). The FN rate of station 7 was found to be higher with SCM (n = 9, 3.8%) than with VAM (n = 6, 2.9%; p = 0.623). FNs were most common for subcarinal LNs. VAM allows higher rates of sampling of mediastinal LN stations and of station 7, although it did not improve staging of subcarinal LNs. Georg Thieme Verlag KG Stuttgart · New York.
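The quoted false-negative rates follow directly from the counts in the abstract (FN cases over the patients in each arm who proceeded to thoracotomy and SML):

```python
# Reproducing the false-negative (FN) rates quoted in the abstract from
# its own counts.
fn_scm, n_scm = 14, 236   # standard cervical mediastinoscopy
fn_vam, n_vam = 9, 203    # video-assisted mediastinoscopy

rate_scm = fn_scm / n_scm
rate_vam = fn_vam / n_vam
print(f"SCM FN rate: {rate_scm:.1%}")  # SCM FN rate: 5.9%
print(f"VAM FN rate: {rate_vam:.1%}")  # VAM FN rate: 4.4%
print(f"Overall: {(fn_scm + fn_vam) / (n_scm + n_vam):.1%}")  # Overall: 5.2%
```

The significance claim itself (p = 0.490) comes from the paper's own statistical test and is not recomputed here.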

  9. Research on compression performance of ultrahigh-definition videos

    Science.gov (United States)

    Li, Xiangqun; He, Xiaohai; Qing, Linbo; Tao, Qingchuan; Wu, Di

    2017-11-01

    With the popularization of high-definition (HD) images and videos (1920×1080 pixels and above), there are now even 4K (3840×2160) television signals and 8K (8192×4320) ultrahigh-definition videos. The demand for HD images and videos is increasing continuously, along with the increasing data volume. Storage and transmission cannot be properly addressed merely by expanding hard-disk capacity and upgrading transmission devices. Based on full use of the high-efficiency video coding (HEVC) standard, super-resolution reconstruction technology, and the correlation between intra- and interprediction, we first put forward a "division-compensation"-based strategy to further improve the compression performance of a single image and frame I. Then, using the above idea together with the HEVC encoder and decoder, a video compression coding frame is designed, with HEVC used inside the frame. Finally, with super-resolution reconstruction technology, the reconstructed video quality is further improved. Experiments show that the proposed compression method for a single image (frame I) and video sequence is superior in performance to HEVC in a low-bit-rate environment.
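The overall shape of such a "reduce, code, restore" pipeline can be sketched with stand-ins. Here a coarse uniform quantizer takes the place of the HEVC encoder/decoder and plain pixel replication takes the place of super-resolution reconstruction; both substitutions are assumptions for illustration, not the paper's components.

```python
# Conceptual sketch: shrink the frame before the codec, restore it after,
# and measure the quality cost with PSNR. A uniform quantizer stands in
# for HEVC; pixel replication stands in for super-resolution (assumptions).
import math

def downsample2x(img):
    return [row[::2] for row in img[::2]]

def upsample2x(img):
    out = []
    for row in img:
        wide = [v for v in row for _ in (0, 1)]  # duplicate horizontally
        out.extend([wide, list(wide)])           # duplicate vertically
    return out

def fake_codec(img, step=8):
    """Stand-in for a real encoder: uniform quantization of pixel values."""
    return [[round(v / step) * step for v in row] for row in img]

def psnr(ref, test):
    diffs = [(a - b) ** 2 for ra, rb in zip(ref, test) for a, b in zip(ra, rb)]
    mse = sum(diffs) / len(diffs)
    return math.inf if mse == 0 else 10 * math.log10(255 ** 2 / mse)

# A tiny synthetic "frame" with a smooth gradient:
frame = [[(x + y) % 256 for x in range(16)] for y in range(16)]
rec = upsample2x(fake_codec(downsample2x(frame)))
print(f"PSNR: {psnr(frame, rec):.1f} dB")
```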

  10. Polar Coding with CRC-Aided List Decoding

    Science.gov (United States)

    2015-08-01

    TECHNICAL REPORT 2087, August 2015. Polar Coding with CRC-Aided List Decoding. David Wasserman. Approved...list decoding. RESULTS: Our simulation results show that polar coding can produce results very similar to the FEC used in the Digital Video...standard. RECOMMENDATIONS: In any application for which the DVB-S2 FEC is considered, polar coding with CRC-aided list decoding with N = 65536

  11. Contribution to the panel discussion on 'International developments in standards, rules and codes for pressure vessels'

    International Nuclear Information System (INIS)

    Puell, K.

    1992-01-01

    This contribution to the discussion describes the legal system for technical installations in its existing form in the Federal Republic of Germany. The standardization of technical requirements to meet EC Directives and European Standards requires an adjustment, to a limited extent, of the appurtenant legal prerequisites. Given the trend away from tried and tested control mechanisms in the form of third party inspection, there is imminent danger of a reduction in quality. This compels us to consider how to maintain nonetheless the level of safety that has already been reached. (orig.)

  12. More Than Bar Codes: Integrating Global Standards-Based Bar Code Technology Into National Health Information Systems in Ethiopia and Pakistan to Increase End-to-End Supply Chain Visibility.

    Science.gov (United States)

    Hara, Liuichi; Guirguis, Ramy; Hummel, Keith; Villanueva, Monica

    2017-12-28

    The United Nations Population Fund (UNFPA) and the United States Agency for International Development (USAID) DELIVER PROJECT work together to strengthen public health commodity supply chains by standardizing bar coding under a single set of global standards. Starting in 2015, UNFPA and USAID collaborated to pilot test how tracking and tracing of bar coded health products could be operationalized in the public health supply chains of Ethiopia and Pakistan and to inform the ecosystem needed to begin full implementation. Pakistan had been using proprietary bar codes for inventory management of contraceptive supplies but transitioned to global standards-based bar codes during the pilot. The transition allowed Pakistan to leverage the original bar codes that were preprinted by global manufacturers as opposed to printing new bar codes at the central warehouse. However, barriers at lower service delivery levels prevented full realization of end-to-end data visibility. Key barriers at the district level were the lack of a digital inventory management system and the absence of bar codes at the primary packaging level, such as single blister packs. The team in Ethiopia developed an open-source smartphone application that allowed the team to scan bar codes using the mobile phone's camera and to push the captured data to the country's data mart. Real-time tracking and tracing occurred from the central warehouse to the Addis Ababa distribution hub and to 2 health centers. These pilots demonstrated that standardized product identification and bar codes can significantly improve accuracy over manual stock counts while significantly streamlining the stock-taking process, resulting in efficiencies. The pilots also showed that bar coding technology by itself is not sufficient to ensure data visibility. Rather, by using global standards for identification and data capture of pharmaceuticals and medical devices, and integrating the data captured into national and global tracking systems
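The "global standards for identification and data capture" referred to here are the GS1 system's application identifiers (AIs). A minimal sketch of parsing one GS1 element string follows; it covers only four common AIs, with (01) GTIN and (17) expiry fixed length, and (10) lot and (21) serial variable length, terminated by the FNC1 separator (transmitted as ASCII GS, 0x1D). The sample values are invented.

```python
# Simplified GS1 element-string parser covering four application
# identifiers; not a full GS1 implementation.

FIXED_LEN = {"01": 14, "17": 6}   # AI -> fixed payload length
VARIABLE = {"10", "21"}           # AIs ended by FNC1 (GS) or end of data

def parse_gs1(data):
    out, i = {}, 0
    while i < len(data):
        ai = data[i:i + 2]
        i += 2
        if ai in FIXED_LEN:
            n = FIXED_LEN[ai]
            out[ai], i = data[i:i + n], i + n
        elif ai in VARIABLE:
            end = data.find("\x1d", i)
            end = len(data) if end == -1 else end
            out[ai], i = data[i:end], end + 1
        else:
            raise ValueError(f"unsupported AI {ai!r}")
    return out

label = "0109506000134352" + "17250630" + "10LOT42" + "\x1d" + "21SN123"
print(parse_gs1(label))
# {'01': '09506000134352', '17': '250630', '10': 'LOT42', '21': 'SN123'}
```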

  13. International standard problem (ISP) No. 41. Containment iodine computer code exercise based on a radioiodine test facility (RTF) experiment

    International Nuclear Information System (INIS)

    2000-04-01

    International Standard Problem (ISP) exercises are comparative exercises in which predictions of different computer codes for a given physical problem are compared with each other or with the results of a carefully controlled experimental study. The main goal of ISP exercises is to increase confidence in the validity and accuracy of the tools used in assessing the safety of nuclear installations. Moreover, they enable code users to gain experience and demonstrate their competence. The ISP No. 41 exercise, a computer code exercise based on a Radioiodine Test Facility (RTF) experiment on iodine behaviour in containment under severe accident conditions, is one such ISP exercise. The ISP No. 41 exercise was born of a recommendation made at the Fourth Iodine Chemistry Workshop held at PSI, Switzerland in June 1996: 'the performance of an International Standard Problem as the basis of an in-depth comparison of the models as well as contributing to the database for validation of iodine codes' [Proceedings NEA/CSNI/R(96)6, Summary and Conclusions NEA/CSNI/R(96)7]. COG (CANDU Owners Group), comprising AECL and the Canadian nuclear utilities, offered to make the results of a Radioiodine Test Facility (RTF) test available for such an exercise. The ISP No. 41 exercise was endorsed in turn by the FPC (PWG4's Task Group on Fission Product Phenomena in the Primary Circuit and the Containment), PWG4 (CSNI Principal Working Group on the Confinement of Accidental Radioactive Releases), and the CSNI. The OECD/NEA Committee on the Safety of Nuclear Installations (CSNI) has sponsored forty-five ISP exercises over the last twenty-four years, thirteen of them in the area of severe accidents. The criteria for the selection of the RTF test as a basis for the ISP-41 exercise were: (1) complementary to other RTF tests available through the PHEBUS and ACE programmes, (2) simplicity for ease of modelling, and (3) good quality data. A simple RTF experiment performed under controlled

  14. The development of speech coding and the first standard coder for public mobile telephony

    NARCIS (Netherlands)

    Sluijter, R.J.

    2005-01-01

    This thesis describes in its core chapter (Chapter 4) the original algorithmic and design features of the first coder for public mobile telephony, the GSM full-rate speech coder, as standardized in 1988. It has never been described in so much detail as presented here. The coder is put in a

  15. Corps et culture: les codes de savoir-vivre (Body and Culture: The Standards of Etiquette).

    Science.gov (United States)

    Picard, Dominique

    1983-01-01

    The evolution of values and standards of behavior as they relate to the body in culture are examined, especially in light of recent trends toward recognition of the natural and the spontaneous, the positive value placed on sexuality, and at the same time, narcissism and emphasis on youth. (MSE)

  16. Automated Facial Coding Software Outperforms People in Recognizing Neutral Faces as Neutral from Standardized Datasets

    Directory of Open Access Journals (Sweden)

    Peter eLewinski

    2015-09-01

    Full Text Available Little is known about people’s accuracy in recognizing neutral faces as neutral. In this paper, I demonstrate the importance of knowing how well people recognize neutral faces. I contrasted human recognition scores of 100 typical, neutral front-up facial images with scores of an arguably objective judge, automated facial coding (AFC) software. I hypothesized that the software would outperform humans in recognizing neutral faces because of the inherently objective nature of computer algorithms. Results confirmed this hypothesis. I provided the first-ever evidence that computer software (90% was more accurate in recognizing neutral faces than people were (59%. I posited two theoretical mechanisms, i.e. smile-as-a-baseline and false recognition of emotion, as possible explanations for my findings.

  17. A hybrid video compression based on zerotree wavelet structure

    International Nuclear Information System (INIS)

    Kilic, Ilker; Yilmaz, Reyat

    2009-01-01

    A video compression algorithm comparable to the standard techniques at low bit rates is presented in this paper. Overlapping block motion compensation (OBMC) is combined with a discrete wavelet transform, which is followed by Lloyd-Max quantization and a zerotree wavelet (ZTW) structure. The novel feature of this coding scheme is the combination of hierarchical finite state vector quantization (HFSVQ) with the ZTW to encode the quantized wavelet coefficients. It is seen that the proposed video encoder (ZTW-HFSVQ) performs better than MPEG-4 and Zerotree Entropy Coding (ZTE). (author)
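The zerotree construction referenced above is easy to illustrate with a toy sketch. The code below is my own simplified illustration (a two-level Haar decomposition with hypothetical function names), not the paper's ZTW-HFSVQ scheme: a coarse-level coefficient is flagged as a zerotree root when it and its four finer-level children are all insignificant against a threshold, so the entire tree can be coded with a single symbol.

```python
import numpy as np

def haar2d(x):
    """One level of a 2-D Haar transform; returns LL, LH, HL, HH subbands."""
    a = (x[::2] + x[1::2]) / 2.0   # row averages
    d = (x[::2] - x[1::2]) / 2.0   # row details
    ll = (a[:, ::2] + a[:, 1::2]) / 2.0
    lh = (a[:, ::2] - a[:, 1::2]) / 2.0
    hl = (d[:, ::2] + d[:, 1::2]) / 2.0
    hh = (d[:, ::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def zerotree_map(parent, child, threshold):
    """Flag parent coefficients whose own magnitude and whose four
    children's magnitudes all fall below the threshold (zerotree roots)."""
    zt = np.zeros(parent.shape, dtype=bool)
    for i in range(parent.shape[0]):
        for j in range(parent.shape[1]):
            kids = child[2 * i:2 * i + 2, 2 * j:2 * j + 2]
            zt[i, j] = (abs(parent[i, j]) < threshold
                        and np.all(np.abs(kids) < threshold))
    return zt
```

A constant image yields zero detail coefficients everywhere, so every second-level parent is a zerotree root; a vertical edge makes the parents above it significant, and only those trees need explicit coefficient coding.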

  18. The safety relief valve handbook design and use of process safety valves to ASME and International codes and standards

    CERN Document Server

    Hellemans, Marc

    2009-01-01

    The Safety Valve Handbook is a professional reference for design, process, instrumentation, plant and maintenance engineers who work with fluid flow and transportation systems in the process industries, which covers the chemical, oil and gas, water, paper and pulp, food and bio products and energy sectors. It meets the need of engineers who have responsibilities for specifying, installing, inspecting or maintaining safety valves and flow control systems. It will also be an important reference for process safety and loss prevention engineers, environmental engineers, and plant and process designers who need to understand the operation of safety valves in a wider equipment or plant design context. No other publication is dedicated to safety valves or to the extensive codes and standards that govern their installation and use. A single source means users save time in searching for specific information about safety valves. The Safety Valve Handbook contains all of the vital technical and standards informat...

  19. Benefit using reasonable regulations in USA, how to skill up on professional engineers, apply international code, standard, and regulation

    International Nuclear Information System (INIS)

    Turner, S.L.; Morokuzu, Muneo; Amano, Osamu

    2005-01-01

    The reasonable regulations in the USA combine a graduated approach with a risk-informed approach (RIA). RIA rationalizes regulation on the basis of operational data and similar evidence. PSA (Probabilistic Safety Assessment), a general method of RIA, is explained in detail. The benefits of applying RIA to nuclear power plants include an increased rate of operation, visualization of risk, application of design standards and design, cost reductions in the nuclear fuel cycle, waste, production and operation, and improved safety. RIA is supported by field data, codes, standards, regulations and professional engineers. The effects of introducing RIA are explained. In order to introduce RIA in Japan, all the parties concerned, such as the regulatory authorities, the electric power industries, manufacturers and universities, have to understand it and work together. The role of part of the scientific community is also stated. (S.Y.)

  20. Calculation of low-cycle fatigue in accordance with the national standard and strength codes

    Science.gov (United States)

    Kontorovich, T. S.; Radin, Yu. A.

    2017-08-01

    Over the most recent 15 years, the Russian power industry has largely relied on imported equipment manufactured in compliance with foreign standards and procedures. This inevitably necessitates their harmonization with the regulatory documents of the Russian Federation, which include calculations of strength and low-cycle fatigue and assessment of the equipment service life. An important regulatory document providing the engineering foundation for cyclic strength and life assessment for high-load components of the boiler and steamline of a water/steam circuit is RD 10-249-98:2000: Standard Method of Strength Estimation in Stationary Boilers and Steam and Water Piping. In January 2015, the National Standard of the Russian Federation 12952-3:2001 was introduced, regulating the design and calculation of the pressure parts of water-tube boilers and auxiliary installations. Thus, two documents were simultaneously valid in the same energy field, using different methods for calculating low-cycle fatigue strength and therefore yielding different results. This situation can lead to incorrect conclusions about the cyclic strength and the service life of high-temperature boiler parts. The article shows that the results of calculations performed in accordance with GOST R 55682.3-2013/EN 12952-3:2001 are less conservative than the results of the standard RD 10-249-98. Since the calculation of the expected service life of boiler parts should use GOST R 55682.3-2013/EN 12952-3:2001, it becomes necessary to establish the applicability scope of each of the above documents.

  1. Augmented video viewing: transforming video consumption into an active experience

    OpenAIRE

    WIJNANTS, Maarten; Leën, Jeroen; QUAX, Peter; LAMOTTE, Wim

    2014-01-01

    Traditional video productions fail to cater to the interactivity standards that the current generation of digitally native customers have become accustomed to. This paper therefore advertises the "activation" of the video consumption process. In particular, it proposes to enhance HTML5 video playback with interactive features in order to transform video viewing into a dynamic pastime. The objective is to enable the authoring of more captivating and rewarding video experiences for end-users. T...

  2. China's High-technology Standards Development

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    There are several major technology standards, including audio video coding (AVS), automotive electronics, third generation (3G) mobile phones, mobile television, wireless networks and digital terrestrial television broadcasting, that have been released or are currently under development in China. This article offers a detailed analysis of each standard and studies their impact on China's high-technology industry.

  3. Overview of the U.S. DOE Hydrogen Safety, Codes and Standards Program. Part 4: Hydrogen Sensors; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Buttner, William J.; Rivkin, Carl; Burgess, Robert; Brosha, Eric; Mukundan, Rangachary; James, C. Will; Keller, Jay

    2016-12-01

    Hydrogen sensors are recognized as a critical element in the safety design for any hydrogen system. In this role, sensors can perform several important functions, including indication of unintended hydrogen releases, activation of mitigation strategies to preclude the development of dangerous situations, activation of alarm systems and communication to first responders, and initiation of system shutdown. The functionality of hydrogen sensors in this capacity is decoupled from the system being monitored, thereby providing an independent safety component that is not affected by the system itself. The importance of hydrogen sensors has been recognized by DOE and by the Fuel Cell Technologies Office's Safety, Codes and Standards (SCS) program in particular, which has for several years supported hydrogen safety sensor research and development. The SCS hydrogen sensor programs are currently led by the National Renewable Energy Laboratory, Los Alamos National Laboratory, and Lawrence Livermore National Laboratory. The current SCS sensor program encompasses the full range of issues related to safety sensors, including development of advanced sensor platforms with exemplary performance, development of sensor-related codes and standards, outreach to stakeholders on the role sensors play in facilitating deployment, technology evaluation, and support on the proper selection and use of sensors.

  4. Guided waves dispersion equations for orthotropic multilayered pipes solved using standard finite elements code.

    Science.gov (United States)

    Predoi, Mihai Valentin

    2014-09-01

    The dispersion curves for hollow multilayered cylinders are prerequisites in any practical guided waves application on such structures. The equations for homogeneous isotropic materials have been established more than 120 years ago. The difficulties in finding numerical solutions to analytic expressions remain considerable, especially if the materials are orthotropic visco-elastic as in the composites used for pipes in the last decades. Among other numerical techniques, the semi-analytical finite elements method has proven its capability of solving this problem. Two possibilities exist to model a finite elements eigenvalue problem: a two-dimensional cross-section model of the pipe or a radial segment model, intersecting the layers between the inner and the outer radius of the pipe. The last possibility is here adopted and distinct differential problems are deduced for longitudinal L(0,n), torsional T(0,n) and flexural F(m,n) modes. Eigenvalue problems are deduced for the three modes classes, offering explicit forms of each coefficient for the matrices used in an available general purpose finite elements code. Comparisons with existing solutions for pipes filled with non-linear viscoelastic fluid or visco-elastic coatings as well as for a fully orthotropic hollow cylinder are all proving the reliability and ease of use of this method. Copyright © 2014 Elsevier B.V. All rights reserved.
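The workflow the abstract describes, assembling element matrices in a general-purpose finite elements code and handing them to an eigensolver, can be shown on a deliberately simple 1-D analogue. The rod model below is my own toy example, not the orthotropic pipe formulation of the paper: linear-element stiffness and mass matrices are assembled and the natural frequencies fall out of the resulting eigenvalue problem.

```python
import numpy as np

def rod_matrices(n_elem, E=1.0, rho=1.0, L=1.0):
    """Assemble global stiffness K and consistent mass M for an axial rod
    discretized with n_elem linear finite elements."""
    h = L / n_elem
    n = n_elem + 1
    K = np.zeros((n, n))
    M = np.zeros((n, n))
    ke = (E / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])      # element stiffness
    me = (rho * h / 6.0) * np.array([[2.0, 1.0], [1.0, 2.0]])  # consistent mass
    for e in range(n_elem):
        K[e:e + 2, e:e + 2] += ke
        M[e:e + 2, e:e + 2] += me
    return K, M

# Fixed-free rod: clamp node 0, then solve the generalized eigenproblem
K, M = rod_matrices(40)
Kf, Mf = K[1:, 1:], M[1:, 1:]
omega2 = np.sort(np.linalg.eigvals(np.linalg.solve(Mf, Kf)).real)
omega1 = float(np.sqrt(omega2[0]))  # exact first frequency is pi/2 for E=rho=L=1
```

In the semi-analytical setting the same machinery is applied wavenumber by wavenumber to trace omega(k) dispersion curves; for the symmetric positive-definite matrices that arise, a dedicated symmetric solver would normally be preferred over the plain eigendecomposition used in this sketch.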

  5. New Standard Evaluated Neutron Cross Section Libraries for the GEANT4 Code and First Verification

    CERN Document Server

    Mendoza, Emilio; Koi, Tatsumi; Guerrero, Carlos

    2014-01-01

    The Monte Carlo simulation of the interaction of neutrons with matter relies on evaluated nuclear data libraries and models. The evaluated libraries are compilations of measured physical parameters (such as cross sections) combined with predictions of nuclear model calculations which have been adjusted to reproduce the experimental data. The results obtained from the simulations depend largely on the accuracy of the underlying nuclear data used, and thus it is important to have access to the nuclear data libraries available, either of general use or compiled for specific applications, and to perform exhaustive validations which cover the wide scope of application of the simulation code. In this paper we describe the work performed in order to extend the capabilities of the GEANT4 toolkit for the simulation of the interaction of neutrons with matter at neutron energies up to 20 MeV and a first verification of the results obtained. Such work is relevant for applications as diverse as the simulation of a n...

  6. Video processing project

    CSIR Research Space (South Africa)

    Globisch, R

    2009-03-01

    Full Text Available Video processing source code for algorithms and tools used in software media pipelines (e.g. image scalers, colour converters, etc.). The currently available source code is written in C++ with the associated libraries and DirectShow filters.

  7. Pilot study comparing changes in postural control after training using a video game balance board program and 2 standard activity-based balance intervention programs.

    Science.gov (United States)

    Pluchino, Alessandra; Lee, Sae Yong; Asfour, Shihab; Roos, Bernard A; Signorile, Joseph F

    2012-07-01

    To compare the impacts of Tai Chi, a standard balance exercise program, and a video game balance board program on postural control and perceived falls risk. Randomized controlled trial. Research laboratory. Independent seniors (N=40; mean age 72.5±8.40y) began the training; 27 completed. Interventions: Tai Chi, a standard balance exercise program, and a video game balance board program. The following were used as measures: Timed Up & Go, One-Leg Stance, functional reach, Tinetti Performance Oriented Mobility Assessment, force plate center of pressure (COP) and time to boundary, dynamic posturography (DP), Falls Risk for Older People-Community Setting, and Falls Efficacy Scale. No significant differences were seen between groups for any outcome measure at baseline, nor were there significant time or group × time differences for any field test or questionnaire. No group × time differences were seen for any COP measure; however, significant time differences were seen across the entire sample for total COP, for 3 of 4 anterior/posterior displacement measures and both velocity measures, and for 1 medial/lateral displacement and 1 medial/lateral velocity measure. For DP, significant improvements were seen for the sample in the overall score (dynamic movement analysis score) and in 2 of the 3 linear and angular measures. The video game balance board program, which can be performed at home, was as effective as Tai Chi and the standard balance exercise program in improving postural control and balance as dictated by the force plate postural sway and DP measures. This finding may have implications for exercise adherence because the at-home nature of the intervention eliminates many obstacles to exercise training. Copyright © 2012 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  8. A Peer-Reviewed Instructional Video is as Effective as a Standard Recorded Didactic Lecture in Medical Trainees Performing Chest Tube Insertion: A Randomized Control Trial.

    Science.gov (United States)

    Saun, Tomas J; Odorizzi, Scott; Yeung, Celine; Johnson, Marjorie; Bandiera, Glen; Dev, Shelly P

    Online medical education resources are becoming an increasingly used modality and many studies have demonstrated their efficacy in procedural instruction. This study sought to determine whether a standardized online procedural video is as effective as a standard recorded didactic teaching session for chest tube insertion. A randomized control trial was conducted. Participants were taught how to insert a chest tube with either a recorded didactic teaching session, or a New England Journal of Medicine (NEJM) video. Participants filled out a questionnaire before and after performing the procedure on a cadaver, which was filmed and assessed by 2 blinded evaluators using a standardized tool. Western University, London, Ontario. Level of clinical care: institutional. A total of 30 fourth-year medical students from 2 graduating classes at the Schulich School of Medicine & Dentistry were screened for eligibility. Two students did not complete the study and were excluded. There were 13 students in the NEJM group, and 15 students in the didactic group. The NEJM group's average score was 45.2% (±9.56) on the prequestionnaire, 67.7% (±12.9) for the procedure, and 60.1% (±7.65) on the postquestionnaire. The didactic group's average score was 42.8% (±10.9) on the prequestionnaire, 73.7% (±9.90) for the procedure, and 46.5% (±7.46) on the postquestionnaire. There was no difference between the groups on the prequestionnaire (Δ + 2.4%; 95% CI: -5.16 to 9.99), or the procedure (Δ -6.0%; 95% CI: -14.6 to 2.65). The NEJM group had better scores on the postquestionnaire (Δ + 11.15%; 95% CI: 3.74-18.6). The NEJM video was as effective as video-recorded didactic training for teaching the knowledge and technical skills essential for chest tube insertion. Participants expressed high satisfaction with this modality. It may prove to be a helpful adjunct to standard instruction on the topic. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc

  9. Review of provisions on corrosion fatigue and stress corrosion in WWER and Western LWR Codes and Standards

    International Nuclear Information System (INIS)

    Buckthorpe, D.; Filatov, V.; Tashkinov, A.; Evropin, S.V.; Matocha, K.; Guinovart, J.

    2003-01-01

    Results are presented from a collaborative project performed on behalf of the European Commission, Working Group Codes and Standards. The work covered the contents of current codes and standards, plant experience and R and D results. Current fatigue design rules use S-N curves based on tests in air. Although WWER and LWR design curves are often similar, they are derived, presented and used in different ways, and it is neither convenient nor appropriate to harmonise them. Similarly, the fatigue crack growth laws used in the various design and in-service inspection rules differ significantly with respect to both growth rates in air and the effects of water reactor environments. Harmonised approaches to the effects of WWER and LWR environments are possible based on results from R and D programmes carried out over the last decade. For carbon and low alloy steels a consistent approach to both crack initiation and growth can be formulated based on the superposition of environmentally assisted cracking effects on the fatigue crack development. The approach indicates that effects of the water environment are minimal given appropriate control of the oxygen content of the water and/or the sulphur content of the steel. For austenitic stainless steels different mechanisms may apply and a harmonised approach is possible at present only for S-N curves. Although substantial progress has been made with respect to corrosion fatigue, more data and a clearer understanding are required in order to write code provisions, particularly in the area of high cycle fatigue. Reactor operation experience shows that stress corrosion cracking of austenitic steels is the most common cause of failure. These failures are associated with high residual stresses combined with high levels of dissolved oxygen or the presence of contaminants. For primary circuit internals there is a potential threat to integrity from irradiation-assisted stress corrosion cracking. 
Design and in-service inspection rules do not at

  10. MDEP Technical Report TR-CSWG-01. Technical Report: Regulatory Frameworks for the Use of Nuclear Pressure Boundary Codes and Standards in MDEP Countries

    International Nuclear Information System (INIS)

    2013-01-01

    The Codes and Standards Working Group (CSWG) is one of the issue-specific working groups that the MDEP members are undertaking; its long term goal is harmonisation of regulatory and code requirements for design and construction of pressure-retaining components in order to improve the effectiveness and efficiency of the regulatory design reviews, increase quality of safety assessments, and to enable each regulator to become stronger in its ability to make safety decisions. The CSWG has interacted closely with the Standards Development Organisations (SDOs) and CORDEL in code comparison and code convergence. The Code Comparison Report STP-NU-051 has been issued by SDO members to identify the extent of similarities and differences amongst the pressure-boundary codes and standards used in various countries. Besides the differences in codes and standards, the way how the codes and standards are applied to systems, structures and components also affects the design and construction of nuclear power plant. Therefore, to accomplish the goal of potential harmonisation, it is also vital that the regulators learn about each other's procedures, processes, and regulations. To facilitate the learning process, the CSWG meets regularly to discuss issues relevant to licensing new reactors and using codes and standards in licensing safety reviews. The CSWG communicates very frequently with the SDOs to discuss similarities and differences among the various codes and how to proceed with potential harmonisation. It should be noted that the IAEA is invited to all of the issue-specific working groups within MDEP to ensure consistency with IAEA standards. The primary focus of this technical report is to consolidate information shared and accomplishments achieved by the member countries. This report seeks to document how each MDEP regulator utilises national or regional mechanical codes and standards in its safety reviews and licensing of new reactors. The preparation of this report

  11. Processing Decoded Video for Backlight Dimming

    DEFF Research Database (Denmark)

    Burini, Nino; Korhonen, Jari

    Quality of digital image and video signals on TV screens is affected by many factors, including the display technology and compression standards. An accurate knowledge of the characteristics of the display and of the video signals can be used to develop advanced algorithms that improve the visual rendition of the signals, particularly in the case of LCDs with dynamic local backlight. This thesis shows that it is possible to model LCDs with dynamic backlight to design algorithms that improve the visual quality of 2D and 3D content, and that digital video coding artifacts like blocking or ringing can be reduced with post-processing. LCD screens with dynamic local backlight are modeled in their main aspects, like pixel luminance, light diffusion and light perception. Following the model, novel algorithms based on optimization are presented and extended, then reduced in complexity, to produce backlights...

  12. Lossless Compression of Broadcast Video

    DEFF Research Database (Denmark)

    Martins, Bo; Eriksen, N.; Faber, E.

    1998-01-01

    We investigate several techniques for lossless and near-lossless compression of broadcast video. The emphasis is placed on the emerging international standard for compression of continuous-tone still images, JPEG-LS, due to its excellent compression performance and moderate complexity. Except for one ... cannot be expected to code losslessly at a rate of 125 Mbit/s. We investigate the rate and quality effects of quantization using standard JPEG-LS quantization and two new techniques: visual quantization and trellis quantization. Visual quantization is not part of baseline JPEG-LS, but is applicable in the framework of JPEG-LS. Visual tests show that this quantization technique gives much better quality than standard JPEG-LS quantization. Trellis quantization is a process by which the original image is altered in such a way as to make lossless JPEG-LS encoding more effective. For JPEG-LS and visual...
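Two core mechanisms of JPEG-LS that the record relies on can be sketched compactly: the median edge detector (MED) predictor defined in the standard, and the near-lossless residual quantizer whose NEAR parameter bounds the per-pixel reconstruction error. This is a simplified illustration of the standard's logic, not the broadcast-video encoder evaluated in the thesis:

```python
def med_predict(a, b, c):
    """JPEG-LS median edge detector: a = left, b = above, c = above-left."""
    if c >= max(a, b):
        return min(a, b)   # horizontal edge suspected above
    if c <= min(a, b):
        return max(a, b)   # vertical edge suspected to the left
    return a + b - c       # smooth region: planar prediction

def near_lossless_residual(x, pred, near):
    """Quantize the prediction residual so that the reconstructed sample
    differs from x by at most `near`; near = 0 reduces to lossless coding."""
    e = x - pred
    q = (abs(e) + near) // (2 * near + 1)
    if e < 0:
        q = -q
    recon = pred + q * (2 * near + 1)
    return q, recon
```

With near=2, a residual of 13 maps to quantizer index 3 and reconstructs within the ±2 tolerance; with near=0 reconstruction is exact, which is the lossless mode discussed in the abstract.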

  13. Validation of the ASSERT subchannel code: Prediction of critical heat flux in standard and nonstandard CANDU bundle geometries

    International Nuclear Information System (INIS)

    Carver, M.B.; Kiteley, J.C.; Zhou, R.Q.N.; Junop, S.V.; Rowe, D.S.

    1995-01-01

    The ASSERT code has been developed to address the three-dimensional computation of flow and phase distribution and fuel element surface temperatures within the horizontal subchannels of Canada deuterium uranium (CANDU) pressurized heavy water reactor fuel channels and to provide a detailed prediction of critical heat flux (CHF) distribution throughout the bundle. The ASSERT subchannel code has been validated extensively against a wide repertoire of experiments; its combination of three-dimensional prediction of local flow conditions with a comprehensive method of predicting CHF at these local conditions makes it a unique tool for predicting CHF for situations outside the existing experimental database. In particular, ASSERT is an appropriate tool to systematically investigate CHF under conditions of local geometric variations, such as pressure tube creep and fuel element strain. The numerical methodology used in ASSERT, the constitutive relationships incorporated, and the CHF assessment methodology are discussed. The evolutionary validation plan is also discussed and early validation exercises are summarized. More recent validation exercises in standard and nonstandard geometries are emphasized

  14. The Authoritarian Personality in Emerging Adulthood: Longitudinal Analysis Using Standardized Scales, Observer Ratings, and Content Coding of the Life Story.

    Science.gov (United States)

    Peterson, Bill E; Pratt, Michael W; Olsen, Janelle R; Alisat, Susan

    2016-04-01

    Three different methods (a standardized scale, an observer-based Q-sort, and content coding of narratives) were used to study the continuity of authoritarianism longitudinally in emerging and young adults. Authoritarianism was assessed in a Canadian sample (N = 92) of men and women at ages 19 and 32 with Altemeyer's (1996) Right-Wing Authoritarianism (RWA) Scale. In addition, components of the authoritarian personality were assessed at age 26 through Q-sort observer methods (Block, 2008) and at age 32 through content coding of life stories. Age 19 authoritarianism predicted the Q-sort and life story measures of authoritarianism. Two hierarchical regression analyses showed that the Q-sort and life story measures of authoritarianism also predicted the RWA scale at age 32 beyond educational level and parental status, and even after the inclusion of age 19 RWA. Differences and similarities in the pattern of correlates for the Q-sort and life story measures are discussed, including the overall lack of results for authoritarian aggression. Content in narratives may be the result of emerging adult authoritarianism and may serve to maintain levels of authoritarianism in young adulthood. © 2014 Wiley Periodicals, Inc.

  15. Performance Based Plastic Design of Concentrically Braced Frame attuned with Indian Standard code and its Seismic Performance Evaluation

    Directory of Open Access Journals (Sweden)

    Sejal Purvang Dalal

    2015-12-01

    Full Text Available In the Performance Based Plastic Design method, the failure mode is predetermined, which has made the method popular throughout the world. But due to a lack of proper guidelines and a simple stepwise methodology, it is not widely used in India. In this paper, a stepwise design procedure for Performance Based Plastic Design of a Concentrically Braced Frame attuned with the Indian Standard code is presented. A comparative seismic performance evaluation of a six-storey concentrically braced frame designed using the displacement-based Performance Based Plastic Design (PBPD) method and the currently used force-based Limit State Design (LSD) method has also been carried out by nonlinear static pushover analysis and time history analysis under three different ground motions. Results show that the Performance Based Plastic Design method is superior to the current design in terms of displacement and acceleration response. Total collapse of the frame is also prevented in the PBPD frame.

  16. AECL international standard problem ISP-41 FU/1 follow-up exercise (Phase 1): Containment Iodine Computer Code Exercise: Parametric Studies

    International Nuclear Information System (INIS)

    Ball, J.; Glowa, G.; Wren, J.; Ewig, F.; Dickenson, S.; Billarand, Y.; Cantrel, L.; Rydl, A.; Royen, J.

    2001-06-01

    This report describes the results of the second phase of International Standard Problem (ISP) 41, an iodine behaviour code comparison exercise. The first phase of the study, which was based on a simple Radioiodine Test Facility (RTF) experiment, demonstrated that all of the iodine behaviour codes had the capability to reproduce iodine behaviour for a narrow range of conditions (single temperature, no organic impurities, controlled pH steps). The current phase, a parametric study, was designed to evaluate the sensitivity of iodine behaviour codes to boundary conditions such as pH, dose rate, temperature and initial I- concentration. The codes used in this exercise were IODE (IPSN), IODE (NRIR), IMPAIR (GRS), INSPECT (AEAT), IMOD (AECL) and LIRIC (AECL). The parametric study described in this report identified several areas of discrepancy between the various codes. In general, the codes agree regarding qualitative trends, but their predictions regarding the actual amount of volatile iodine varied considerably. The largest source of the discrepancies between code predictions appears to be their different approaches to modelling the formation and destruction of organic iodides. A recommendation arising from this exercise is that an additional code comparison exercise be performed on organic iodide formation, against data obtained from intermediate-scale studies (two RTF (AECL, Canada) and two CAIMAN facility (IPSN, France) experiments have been chosen). This comparison will allow each of the code users to realistically evaluate and improve the organic iodide behaviour sub-models within their codes. (authors)

  17. Changing priorities of codes and standards: An A/E's perspective for operating units and new generation

    International Nuclear Information System (INIS)

    Meyers, B.L.; Jackson, R.W.; Morowski, B.D.

    1994-01-01

    As the nuclear power industry has shifted emphasis from the construction of new plants to the reliability and maintenance of operating units, the industry's commitment to safety has been well guarded and maintained. Many other important indicators of nuclear industry performance are also positive. Unfortunately, by some projections, as many as 25 operating nuclear units could prematurely shut down because of increasing O&M and total operating costs. The immediate impact of higher generating costs on the nuclear industry is evident. However, when viewed over the longer term, high generating costs will also affect license renewals, progress in the development of advanced light water reactor designs, and prospects for a return to the building of new plants. Today's challenge is to leverage the expertise and contribution of the nuclear industry partner organizations to steadily improve the work processes and methods necessary to reduce operating costs, to achieve higher levels in the performance of operating units, and to maintain high standards of technical excellence and safety. From the experience and perspective of an A/E and partner in the nuclear industry, this paper will discuss the changing priorities of codes and standards as they relate to opportunities for the communication of lessons learned and improving the responsiveness to industry needs

  18. MDEP Technical Report TR-CSWG-02. Technical Report on Lessons Learnt on Achieving Harmonisation of Codes and Standards for Pressure Boundary Components in Nuclear Power Plants

    International Nuclear Information System (INIS)

    2013-01-01

    This report was prepared by the Multinational Design Evaluation Program's (MDEP's) Codes and Standards Working Group (CSWG). The primary, long-term goal of MDEP's CSWG is to achieve international harmonisation of codes and standards for pressure-boundary components in nuclear power plants. The CSWG recognised early on that the first step to achieving harmonisation is to understand the extent of similarities and differences amongst the pressure-boundary codes and standards used in various countries. To assist the CSWG in its long-term goals, several standards developing organisations (SDOs) from various countries performed a comparison of their pressure-boundary codes and standards to identify the extent of similarities and differences in code requirements and the reasons for their differences. The results of the code-comparison project provided the CSWG with valuable insights in developing the subsequent actions to take with SDOs and the nuclear industry to pursue harmonisation of codes and standards. The results enabled the CSWG to understand from a global perspective how each country's pressure-boundary code or standard evolved into its current form and content. The CSWG recognised the important fact that each country's pressure-boundary code or standard is a comprehensive, living document that is continually being updated and improved to reflect changing technology and common industry practices unique to each country. The rules in the pressure-boundary codes and standards include comprehensive requirements for the design and construction of nuclear power plant components including design, materials selection, fabrication, examination, testing and overpressure protection. The rules also contain programmatic and administrative requirements such as quality assurance; conformity assessment (e.g., third-party inspection); qualification of welders, welding equipment and welding procedures; non-destructive examination (NDE) practices and

  19. Adding access to a video magnifier to standard vision rehabilitation: initial results on reading performance and well-being from a prospective, randomized study

    Science.gov (United States)

    Jackson, Mary Lou; Schoessow, Kimberly A.; Selivanova, Alexandra; Wallis, Jennifer

    2017-01-01

    Purpose Both optical and electronic magnification are available to patients with low vision. Electronic video magnifiers are more expensive than optical magnifiers, but they offer additional benefits, including variable magnification and contrast. This study aimed to evaluate the effect of access to a video magnifier (VM) added to standard comprehensive vision rehabilitation (VR). Methods In this prospective study, 37 subjects with central field loss were randomized to receive standard VR (VR group, 18 subjects) or standard VR plus VM (VM group, 19 subjects). Subjects read the International Reading Speed Texts (IReST), a bank check, and a phone number at enrollment, at 1 month, and after occupational therapy (OT) as indicated to address patient goals. The Impact of Vision Impairment (IVI) questionnaire, a version of the Activity Inventory (AI), and the Depression Anxiety and Stress Scale (DASS) were administered at enrollment, 1 month, after OT, 1 month later, and 1 year after enrollment. Assessments at enrollment and 1 month later were evaluated. Results At 1 month, the VM group displayed significant improvement in reading continuous print as measured by the IReST (P = 0.01) but did not differ on IVI, AI, or DASS. From enrollment to 1 month, all subjects improved in their ability to spot read (phone number and check). The VM group improved in the ability to read a number in a phone book more than the VR group at 1 month after initial consultation (P = 0.02). All reported better well-being (P = 0.02). Conclusions All subjects reported better well-being on the IVI. The VM group read faster and was better at two spot reading tasks but did not differ from the VR group in other outcome measures. PMID:28924412

  20. Nuclear codes and standards

    International Nuclear Information System (INIS)

    Raisic, N.

    1980-01-01

    The present paper deals with quality assurance regulations and analyses the difference between documents that can be used in every country and applied to any industrial organization, and documents some aspects of which are bound to American organizations. (orig./RW)

  1. International standard problem (ISP) No. 43 Rapid boron-dilution transient tests for code verification. Comparison report

    International Nuclear Information System (INIS)

    2001-03-01

    International Standard Problem No. 43 (ISP 43) addresses the nuclear industry's present capabilities of simulating fluid dynamics aspects of a subset of rapid boron dilution transients. Specifically, the exercise focuses on the sequence involving the transport of a boron-dilute slug through the actuation of a pump. The slug is formed on the primary side of the steam generator as a consequence of an interfacing system leak from the secondary un-borated coolant. Experimental data was collected using the University of Maryland 2 x 4 Thermalhydraulic Loop (UM 2 x 4 Loop) and the Boron-mixing Visualization Facility. Two blind test series were proposed during the first workshop (October 1998) and refined using participant input. The first series, test series A, deals with the injection of a front, i.e., a single interface between borated and dilute fluids. The second blind series, test series B, is the more realistic injection of a slug, i.e., a dilute fluid volume preceded and followed by the borated coolant of the primary system. Data are collected in the UM 2 x 4 Loop and refined details are obtained from the Visualization Facility, which represents a replica of the Loop's vessel downcomer. In the Loop experimental program, the dilute volume is simulated by cold water and the borated primary coolant is simulated by hot water. The Visualization Facility uses dye to mark the diluted front or slug. The measured boundary conditions for both test series include the initial temperature of the primary system, the front/slug injection flowrate and temperature, and the pressure drop across the core. Temperature data is collected at 185 thermocouple positions in the downcomer and 38 positions in the lower plenum. The advancement of the front/slug through the system is monitored at discrete horizontal levels that contain the thermocouples. The performance of codes is measured relative to a set of figures of merit. During the first workshop, the principal figure of merit was

  2. Dashboard Videos

    Science.gov (United States)

    Gleue, Alan D.; Depcik, Chris; Peltier, Ted

    2012-01-01

    Last school year, I had a web link emailed to me entitled "A Dashboard Physics Lesson." The link, created and posted by Dale Basier on his "Lab Out Loud" blog, illustrates video of a car's speedometer synchronized with video of the road. These two separate video streams are compiled into one video that students can watch and analyze. After seeing…

  3. Video microblogging

    DEFF Research Database (Denmark)

    Bornoe, Nis; Barkhuus, Louise

    2010-01-01

    Microblogging is a recently popular phenomenon and with the increasing trend for video cameras to be built into mobile phones, a new type of microblogging has entered the arena of electronic communication: video microblogging. In this study we examine video microblogging, which is the broadcasting of short videos. A series of semi-structured interviews offers an understanding of why and how video microblogging is used and what the users post and broadcast.

  4. Home exercise programmes supported by video and automated reminders compared with standard paper-based home exercise programmes in patients with stroke: a randomized controlled trial.

    Science.gov (United States)

    Emmerson, Kellie B; Harding, Katherine E; Taylor, Nicholas F

    2017-08-01

    To determine whether patients with stroke receiving rehabilitation for upper limb deficits using smart technology (video and reminder functions) demonstrate greater adherence to prescribed home exercise programmes and better functional outcomes when compared with traditional paper-based exercise prescription. Randomized controlled trial comparing upper limb home exercise programmes supported by video and automated reminders on smart technology, with standard paper-based home exercise programmes. A community rehabilitation programme within a large metropolitan health service. Patients with stroke with upper limb deficits, referred for outpatient rehabilitation. Participants were randomly assigned to the control (paper-based home exercise programme) or intervention group (home exercise programme filmed on an electronic tablet, with an automated reminder). Both groups completed their prescribed home exercise programme for four weeks. The primary outcome was adherence using a self-reported log book. Secondary outcomes were change in upper limb function and patient satisfaction. A total of 62 participants were allocated to the intervention (n = 30) and control groups (n = 32). There were no differences between the groups for measures of adherence (mean difference 2%, 95% CI -12 to 17) or change in the Wolf Motor Function Test log transformed time (mean difference 0.02 seconds, 95% CI -0.1 to 0.1). There were no between-group differences in how participants found instructions (p = 0.452), whether they remembered to do their exercises (p = 0.485), or whether they enjoyed doing their exercises (p = 0.864). The use of smart technology was not superior to standard paper-based home exercise programmes for patients recovering from stroke. This trial design was registered prospectively with the Australian and New Zealand Clinical Trials Register, ID: ACTRN 12613000786796. http://www.anzctr.org.au/trialSearch.aspx.

  5. Journalistic Ethics and Standards in the Spanish Constitution and National Codes of Conduct from the Perspective of Andalusian Journalism Students

    Directory of Open Access Journals (Sweden)

    María Ángeles López-Hernández

    2013-12-01

    Full Text Available This paper focuses on the opinion held by journalism students of the Faculty of Communication of Seville University about the ethical standards set out in the Spanish Constitution (Article 20.1.d and the country’s codes of conduct. The aim of this paper is to identify the ethical system of values of today’s university students, who have not yet been “contaminated” by the profession and on whom the future of journalism in Spain will ultimately depend. Although the results show that journalism students (both 1st year students and those in their final year have embraced a fairly solid ethical system of values, they nevertheless believe that the strong influence that economic and political powers currently exert on Spanish media corporations makes it impossible for journalists to cultivate their own work ethic, consequently obliging them to conform to the “unscrupulous” demands of their bosses. Faced with this reality, the authors reflect on the need to reinforce ethical values in the lecture hall as a way of curbing, as soon as possible, the deterioration of journalism that has been detected in Spain.

  6. Utilizing Innovative Video Chat Technology to Meet National Standards: A Case Study on a STARTALK Hindi Language Program

    Science.gov (United States)

    Parveen, Shaheen; Pater, Cayley

    2012-01-01

    Responding to the need for foreign language fluency in ever-globalizing business and cultural spheres, the federal government and foreign language institutions in an eleven-member task force collaboratively published a set of nationally recognized, foundational standards for foreign language teaching. Rather than rely on teacher-centered…

  7. Current status of the MPEG-4 standardization effort

    Science.gov (United States)

    Anastassiou, Dimitris

    1994-09-01

    The Moving Pictures Experts Group (MPEG) of the International Standardization Organization has initiated a standardization effort, known as MPEG-4, addressing generic audiovisual coding at very low bit-rates (up to 64 kbit/s) with applications in videotelephony, mobile audiovisual communications, video database retrieval, computer games, video over Internet, remote sensing, etc. This paper gives a survey of the status of MPEG-4, including its planned schedule, and initial ideas about requirements and applications. A significant part of this paper summarizes an incomplete draft version of a 'requirements document' which presents specifications of desirable features on the video, audio, and system level of the forthcoming standard. Very low bit-rate coding algorithms are not described, because no endorsement of any particular algorithm, or class of algorithms, has yet been made by MPEG-4, and several seminars held concurrently with MPEG-4 meetings have not so far provided evidence that such high-performance coding schemes are achievable.

  8. Analysis of Potential Benefits and Costs of Adopting ASHRAE Standard 90.1-1999 as a Commercial Building Energy Code in Illinois Jurisdictions

    Energy Technology Data Exchange (ETDEWEB)

    Belzer, David B.; Cort, Katherine A.; Winiarski, David W.; Richman, Eric E.; Friedrich, Michele

    2002-05-01

    ASHRAE Standard 90.1-1999 was developed in an effort to set minimum requirements for energy efficient design and construction of new commercial buildings. This report assesses the benefits and costs of adopting this standard as the building energy code in Illinois. Energy and economic impacts are estimated using BLAST combined with a Life-Cycle Cost approach to assess corresponding economic costs and benefits.

  9. On locality of Generalized Reed-Muller codes over the broadcast erasure channel

    KAUST Repository

    Alloum, Amira

    2016-07-28

    One to Many communications are expected to be among the killer applications for the currently discussed 5G standard. The usage of coding mechanisms is impacting broadcasting standard quality, as coding is involved at several levels of the stack, and more specifically at the application layer where Rateless, LDPC, Reed-Solomon codes and network coding schemes have been extensively studied, optimized and standardized in the past. Beyond reusing, extending or adapting existing application layer packet coding mechanisms based on previous schemes and designed for the foregoing LTE or other broadcasting standards, our purpose is to investigate the use of Generalized Reed-Muller codes and the value of their locality property in their progressive decoding for Broadcast/Multicast communication schemes with real-time video delivery. Our results are meant to bring insight into the use of locally decodable codes in Broadcasting. © 2016 IEEE.
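    The locality property invoked above can be illustrated with the simplest case, a binary first-order Reed-Muller code, where every codeword is the evaluation table of an affine Boolean function: an erased symbol can then be rebuilt from just three surviving symbols rather than the whole codeword. A minimal sketch (the paper concerns Generalized Reed-Muller codes over larger alphabets; the binary case below only shows the idea):

```python
import itertools

M = 3  # RM(1, M): codewords are evaluation tables of affine Boolean functions

def encode(a0, a):
    """Evaluate f(x) = a0 + a.x (mod 2) at every point of F_2^M."""
    return {x: (a0 + sum(ai * xi for ai, xi in zip(a, x))) % 2
            for x in itertools.product((0, 1), repeat=M)}

cw = encode(1, (1, 0, 1))   # codeword for message (a0, a1, a2, a3) = (1, 1, 0, 1)
erased = (1, 1, 0)          # pretend this symbol was lost on the erasure channel

# Locality: for affine f over F_2, f(x) = f(x + y) + f(y) + f(0),
# so any y gives a 3-symbol repair group for the erased position.
y = (0, 1, 1)
x_plus_y = tuple((a + b) % 2 for a, b in zip(erased, y))
repaired = (cw[x_plus_y] + cw[y] + cw[(0, 0, 0)]) % 2
assert repaired == cw[erased]
```

    A decoder that only needs a few symbols per repair can start reconstructing a video packet before the whole coded block has arrived, which is the "progressive decoding" angle of the abstract.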

  10. Video pedagogy

    OpenAIRE

    Länsitie, Janne; Stevenson, Blair; Männistö, Riku; Karjalainen, Tommi; Karjalainen, Asko

    2016-01-01

    The short film is an introduction to the concept of video pedagogy. The five categories of video pedagogy further elaborate how videos can be used as a part of instruction and learning process. Most pedagogical videos represent more than one category. A video itself doesn’t necessarily define the category – the ways in which the video is used as a part of pedagogical script are more defining factors. What five categories did you find? Did you agree with the categories, or are more...

  11. SCALE: A modular code system for performing Standardized Computer Analyses for Licensing Evaluation. Volume 1, Part 2: Control modules S1--H1; Revision 5

    International Nuclear Information System (INIS)

    1997-03-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.

  12. SCALE: A modular code system for performing Standardized Computer Analyses for Licensing Evaluation. Volume 2, Part 3: Functional modules F16--F17; Revision 5

    International Nuclear Information System (INIS)

    1997-03-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.

  13. SCALE: A modular code system for performing Standardized Computer Analyses for Licensing Evaluation. Volume 2, Part 3: Functional modules F16--F17; Revision 5

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.

  14. Analysis of Potential Benefits and Costs of Adopting ASHRAE Standard 90.1-2001 as the Commercial Building Energy Code in Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    Cort, Katherine A.; Winiarski, David W.; Belzer, David B.; Richman, Eric E.

    2004-09-30

    ASHRAE Standard 90.1-2001 Energy Standard for Buildings except Low-Rise Residential Buildings (hereafter referred to as ASHRAE 90.1-2001 or 90.1-2001) was developed in an effort to set minimum requirements for the energy efficient design and construction of new commercial buildings. The State of Tennessee is considering adopting ASHRAE 90.1-2001 as its commercial building energy code. In an effort to evaluate whether or not this is an appropriate code for the state, the potential benefits and costs of adopting this standard are considered in this report. Both qualitative and quantitative benefits and costs are assessed. Energy and economic impacts are estimated using the Building Loads Analysis and System Thermodynamics (BLAST) simulations combined with a Life-Cycle Cost (LCC) approach to assess corresponding economic costs and benefits. Tennessee currently has ASHRAE Standard 90A-1980 as the statewide voluntary/recommended commercial energy standard; however, it is up to the local jurisdiction to adopt this code. Because 90A-1980 is the recommended standard, many of the requirements of ASHRAE 90A-1980 were used as a baseline for simulations.
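    The life-cycle cost (LCC) test underlying these adoption studies reduces to comparing the present value of incremental first costs against discounted annual energy savings. A toy sketch with entirely hypothetical numbers (the real analyses derive the energy figures from BLAST simulations, not from assumed costs):

```python
def life_cycle_cost(first_cost, annual_energy_cost, discount_rate, years):
    """First cost plus the present value of `years` of annual energy costs."""
    annuity = sum(1 / (1 + discount_rate) ** t for t in range(1, years + 1))
    return first_cost + annual_energy_cost * annuity

# Hypothetical numbers: the code-compliant design costs $15,000 more up front
# but trims the annual energy cost from $10,000 to $8,000 over a 20-year life.
base = life_cycle_cost(0, 10_000, 0.05, 20)
compliant = life_cycle_cost(15_000, 8_000, 0.05, 20)
print(f"net LCC benefit of adoption: ${base - compliant:,.0f}")
```

    The standard passes the test when the compliant design's life-cycle cost is lower, i.e., the net benefit is positive.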

  15. Report on the Observance of Standards and Codes, Accounting and Auditing : Module B - Institutional Framework for Corporate Financial Reporting, B.9 Auditing Standard-setting

    OpenAIRE

    World Bank

    2017-01-01

    The purpose of this report is to gain an understanding of the governance arrangements, procedures, and capacity for setting auditing standards in a jurisdiction, covering: (a) the adoption of International Standards on Auditing (ISA) where applicable, and (b) national auditing standards. The questions are based on examples of good practice followed by international standard-setting bodies.

  16. Ethics Standards Impacting Test Development and Use: A Review of 31 Ethics Codes Impacting Practices in 35 Countries

    Science.gov (United States)

    Leach, Mark M.; Oakland, Thomas

    2007-01-01

    Ethics codes are designed to protect the public by prescribing behaviors professionals are expected to exhibit. Although test use is universal, albeit reflecting strong Western influences, previous studies have examined the degree to which issues pertaining to test development and use are addressed in the ethics codes of national psychological…

  17. EUS-guided rendezvous for difficult biliary cannulation using a standardized algorithm: a multicenter prospective pilot study (with videos).

    Science.gov (United States)

    Iwashita, Takuji; Yasuda, Ichiro; Mukai, Tsuyoshi; Iwata, Keisuke; Ando, Nobuhiro; Doi, Shinpei; Nakashima, Masanori; Uemura, Shinya; Mabuchi, Masatoshi; Shimizu, Masahito

    2016-02-01

    Biliary cannulation is necessary in therapeutic ERCP for biliary disorders. EUS-guided rendezvous (EUS-RV) can salvage failed cannulation. Our aim was to determine the safety and efficacy of EUS-RV by using a standardized algorithm with regard to the endoscope position in a prospective study. EUS-RV was attempted after failed cannulation in 20 patients. In a standardized approach, extrahepatic bile duct (EHBD) cannulation was preferentially attempted from the second portion of the duodenum (D2) followed by additional approaches to the EHBD from the duodenal bulb (D1) or to the intrahepatic bile duct from the stomach, if necessary. A guidewire was placed in an antegrade fashion into the duodenum. After the guidewire was placed, the endoscope was exchanged for a duodenoscope to complete the cannulation. The bile duct was accessed from the D2 in 10 patients, but from the D1 in 5 patients and the stomach in 4 patients because of no dilation or tumor invasion at the distal EHBD. In the remaining patient, biliary puncture was not attempted due to the presence of collateral vessels. The guidewire was successfully manipulated in 80% of patients: 100% (10/10) with the D2 approach and 66.7% (6/9) with other approaches. The overall success rate was 80% (16/20). Failed EUS-RV was salvaged with a percutaneous approach in 2 patients, repeat ERCP in 1 patient, and conservative management in 1 patient. Minor adverse events occurred in 15% of patients (3/20). EUS-RV is a safe and effective salvage method. Using EUS-RV to approach the EHBD from the D2 may improve success rates. Copyright © 2016 American Society for Gastrointestinal Endoscopy. Published by Elsevier Inc. All rights reserved.

  18. Application of NEA/CSNI standard problem 3 (blowdown and flow reversal in the IETA-1 rig) to the validation of the RELAP-UK Mk IV code

    International Nuclear Information System (INIS)

    Bryce, W.M.

    1977-10-01

    NEA/CSNI Standard Problem 3 consists of the modelling of an experiment on the IETA-1 rig, in which there is initially flow upwards through a feeder, heated section and riser. The inlet and outlet are then closed and a breach opened at the bottom so that the flow reverses and the rig depressurises. Calculations of this problem by many countries using several computer codes have been reported and show a wide spread of results. The purpose of the study reported here was the following. First, to show the sensitivity of the calculation of Standard Problem 3. Second, to perform an ab initio best estimate calculation using the RELAP-UK Mark IV code with the standard recommended options. Third, to use the results of the sensitivity study to show where tuning of the RELAP-UK Mark IV recommended model options was required. This study has shown that the calculation of Standard Problem 3 is sensitive to model assumptions and that the use of the loss-of-coolant accident code RELAP-UK Mk IV with the standard recommended model options predicts the experimental results very well over most of the transient. (U.K.)

  19. Revision of Ethical Standard 3.04 of the "Ethical Principles of Psychologists and Code of Conduct" (2002, as amended 2010).

    Science.gov (United States)

    2016-12-01

    The following amendment to Ethical Standard 3.04 of the 2002 "Ethical Principles of Psychologists and Code of Conduct" as amended, 2010 (the Ethics Code; American Psychological Association, 2002, 2010) was adopted by the APA Council of Representatives at its August 2016 meeting. The amendment will become effective January 1, 2017. Following is an explanation of the change, a clean version of the revision, and a version indicating changes from the 2002 language (inserted text is in italics). (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  20. Analysis of Potential Benefits and Costs of Adopting ASHRAE Standard 90.1-1999 as a Commercial Building Energy Code in Michigan

    Energy Technology Data Exchange (ETDEWEB)

    Cort, Katherine A.; Belzer, David B.; Halverson, Mark A.; Richman, Eric E.; Winiarski, David W.

    2002-09-30

    The state of Michigan is considering adopting ASHRAE 90.1-1999 as its commercial building energy code. In an effort to evaluate whether or not this is an appropriate code for the state, the potential benefits and costs of adopting this standard are considered. Both qualitative and quantitative benefits are assessed. The energy simulation and economic results suggest that adopting ASHRAE 90.1-1999 would provide positive net benefits to the state relative to the building and design requirements currently in place.

  1. An efficient interpolation filter VLSI architecture for HEVC standard

    Science.gov (United States)

    Zhou, Wei; Zhou, Xin; Lian, Xiaocong; Liu, Zhenyu; Liu, Xiaoxiang

    2015-12-01

    The next-generation video coding standard, High-Efficiency Video Coding (HEVC), is especially efficient for coding high-resolution video such as 8K ultra-high-definition (UHD) video. Fractional motion estimation in HEVC presents a significant challenge in clock latency and area cost, as it consumes more than 40 % of the total encoding time and thus results in high computational complexity. Aiming to support 8K-UHD video applications, an efficient interpolation filter VLSI architecture for HEVC is proposed in this paper. Firstly, a new interpolation filter algorithm based on an 8-pixel interpolation unit is proposed; it saves 19.7 % of processing time on average with acceptable coding-quality degradation. Based on the proposed algorithm, an efficient interpolation filter VLSI architecture, composed of a reused interpolation data path, an efficient memory organization, and a reconfigurable pipelined interpolation filter engine, is presented to reduce the implementation hardware area and achieve high throughput. The final VLSI implementation requires only 37.2k gates in a standard 90-nm CMOS technology at an operating frequency of 240 MHz. The proposed architecture can be reused for either half-pixel or quarter-pixel interpolation, which reduces the area cost by about 131,040 bits of RAM. The processing latency of the proposed architecture can support real-time processing of 4:2:0 format 7680 × 4320@78fps video sequences.
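    The fractional-pel interpolation that such architectures accelerate is, at heart, an 8-tap FIR filter. The sketch below applies HEVC's half-sample luma filter coefficients to one row of integer pixels; it is a scalar reference model only, since the paper's contribution is the pipelined VLSI datapath, not the filter itself:

```python
# HEVC's 8-tap luma half-sample filter; the coefficients sum to 64, so results
# are renormalised with a 6-bit shift (rounding offset 32) and clipped to 8 bits.
HALF_PEL_TAPS = (-1, 4, -11, 40, 40, -11, 4, -1)

def half_pel(row, i):
    """Half-sample value between row[i] and row[i + 1] (needs 3 pixels of margin)."""
    acc = sum(c * row[i - 3 + k] for k, c in enumerate(HALF_PEL_TAPS))
    return max(0, min(255, (acc + 32) >> 6))

row = [10, 20, 30, 40, 50, 60, 70, 80, 90, 100]
print(half_pel(row, 4))  # -> 55, midway on this linear ramp
```

    A hardware engine evaluates many such dot products per cycle; the 8-pixel interpolation unit in the paper groups them so that intermediate pixel fetches can be reused across neighbouring positions.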

  2. Comparison of the reusable standard GlideScope® video laryngoscope and the disposable cobalt GlideScope® video laryngoscope for tracheal intubation in an academic emergency department: a retrospective review.

    Science.gov (United States)

    Sakles, John C; Patanwala, Asad E; Mosier, Jarrod; Dicken, John; Holman, Nathan

    2014-04-01

    The objective was to compare the first-pass success and clinical performance characteristics of the reusable standard GlideScope® video laryngoscope (sGVL) and the disposable Cobalt GlideScope® video laryngoscope (cGVL). This was a retrospective analysis of prospectively collected data recorded into a continuous quality improvement database at an urban academic emergency department (ED). The intent of the database is to evaluate operator performance and to track practice patterns used for intubation in the ED. Between July 1, 2007, and June 30, 2013, operators recorded all consecutive intubations performed in the ED. The database included patient demographics and detailed information about each intubation, such as device(s) used, reason for device selection, method of intubation, difficult airway characteristics, number of intubation attempts, and outcome of each attempt. The operator also evaluated the presence of lens fogging and extent of lens contamination. The primary outcome measure was first-pass success. Secondary outcome measures were ultimate success, Cormack-Lehane (CL) view of the airway, presence of lens fogging, and extent of lens contamination. Only adult patients age 18 years or older intubated with the sGVL or cGVL using a stylet, and who had data forms completed at the time of intubation, were included in this study. A total of 583 intubations were included in the study, 504 with the sGVL and 79 with cGVL. First pass success was achieved in 81.0% (95% confidence interval [CI]=77.3% to 84.3%) of patients in the sGVL group and in 58.2% (95% CI=46.6% to 69.2%) of patients in the cGVL group. In a multivariate logistic regression analysis, the sGVL was associated with a higher first pass success than the cGVL (odds ratio [OR]=3.3, 95% CI=1.9 to 5.8). The ultimate success of the sGVL was 92.1% (95% CI=89.4% to 94.3%) and the cGVL was 72.2% (95% CI=60.9% to 81.7%). A CL grade I or II view was obtained in 93.2% (95% CI=90.7% to 95.3%) in the sGVL group
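    The odds ratio above comes from a multivariate logistic regression; for orientation, the crude (unadjusted) odds ratio and its Wald confidence interval can be back-calculated from the reported success rates. A sketch, with counts reconstructed from the percentages (408/504 and 46/79 are inferred from the stated rates, not given in the abstract):

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 success/failure table."""
    or_ = (a / b) / (c / d)
    half_width = z * sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, exp(log(or_) - half_width), exp(log(or_) + half_width)

# sGVL: ~408 successes of 504 (81.0%); cGVL: 46 of 79 (58.2%).
or_, lo, hi = odds_ratio_ci(408, 504 - 408, 46, 79 - 46)
print(f"crude OR {or_:.1f} (95% CI {lo:.1f} to {hi:.1f})")
```

    The crude OR of about 3.0 is close to the adjusted OR of 3.3 reported in the abstract, as expected when confounding is modest.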

  3. Dynamic infrared thermography (DIRT) for assessment of skin blood perfusion in cranioplasty: a proof of concept for qualitative comparison with the standard indocyanine green video angiography (ICGA).

    Science.gov (United States)

    Rathmann, P; Chalopin, C; Halama, D; Giri, P; Meixensberger, J; Lindner, D

    2018-03-01

    Complications in wound healing after neurosurgical operations occur often due to scarred dehiscence with skin blood perfusion disturbance. The standard imaging method for intraoperative skin perfusion assessment is the invasive indocyanine green video angiography (ICGA). The noninvasive dynamic infrared thermography (DIRT) is a promising alternative modality that was evaluated by comparison with ICGA. The study was carried out in two parts: (1) investigation of technical conditions for intraoperative use of DIRT for its comparison with ICGA, and (2) visual and quantitative comparison of both modalities in a proof of concept on nine patients. Time-temperature curves in DIRT and time-intensity curves in ICGA for defined regions of interest were analyzed. New perfusion parameters were defined in DIRT and compared with the usual perfusion parameters in ICGA. The visual observation of the image data in DIRT and ICGA showed that operation material, anatomical structures and skin perfusion are represented similarly in both modalities. Although the analysis of the curves and perfusion parameter values showed differences between patients, no complications were observed clinically. These differences were represented in DIRT and ICGA equivalently. DIRT has shown a great potential for intraoperative use, with several advantages over ICGA. The technique is passive, contactless and noninvasive. The practicability of the intraoperative recording of the same operation field section with ICGA and DIRT has been demonstrated. The promising results of this proof of concept provide a basis for a trial with a larger number of patients.
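    The abstract does not define the new perfusion parameters, but descriptors of a time-temperature (or time-intensity) curve of the kind analysed here typically include time-to-peak and maximum rise slope. A purely illustrative sketch; the parameter names and sample values below are assumptions, not the authors' definitions:

```python
def curve_params(times, values):
    """Illustrative curve descriptors: time-to-peak and maximum rise slope."""
    peak = max(range(len(values)), key=values.__getitem__)
    slopes = [(values[i + 1] - values[i]) / (times[i + 1] - times[i])
              for i in range(len(values) - 1)]
    return times[peak], max(slopes)

t = [0, 5, 10, 15, 20, 25]                   # seconds
temp = [30.0, 30.5, 32.0, 34.5, 35.0, 34.8]  # deg C in one region of interest
ttp, rise = curve_params(t, temp)
print(ttp, rise)  # peak at t = 20 s, steepest rise 0.5 deg C/s
```

    Comparing such descriptors between regions of interest is one way to express the qualitative agreement the study found between DIRT and ICGA curves.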

  4. Immersive video

    Science.gov (United States)

    Moezzi, Saied; Katkere, Arun L.; Jain, Ramesh C.

    1996-03-01

    Interactive video and television viewers should have the power to control their viewing position. To make this a reality, we introduce the concept of Immersive Video, which employs computer vision and computer graphics technologies to provide remote users a sense of complete immersion when viewing an event. Immersive Video uses multiple videos of an event, captured from different perspectives, to generate a full 3D digital video of that event. That is accomplished by assimilating important information from each video stream into a comprehensive, dynamic, 3D model of the environment. Using this 3D digital video, interactive viewers can then move around the remote environment and observe the events taking place from any desired perspective. Our Immersive Video System currently provides interactive viewing and `walkthrus' of staged karate demonstrations, basketball games, dance performances, and typical campus scenes. In its full realization, Immersive Video will be a paradigm shift in visual communication which will revolutionize television and video media, and become an integral part of future telepresence and virtual reality systems.

  5. Reconfigurable Secure Video Codec Based on DWT and AES Processor

    Directory of Open Access Journals (Sweden)

    Rached Tourki

    2010-01-01

    Full Text Available In this paper, we propose a secure video codec based on the discrete wavelet transform (DWT) and the Advanced Encryption Standard (AES) processor. Video coding with the DWT and encryption with AES are each well known; combining the two designs to achieve secure video coding, however, is novel. The contributions of our work are as follows. First, a new method for image and video compression is proposed. This codec is a synthesis of JPEG and JPEG2000, using Huffman coding from JPEG and the DWT from JPEG2000. Furthermore, an improved motion estimation algorithm is proposed. Second, encryption and decryption are performed by the AES processor, which encrypts groups of LL bands. The prominent feature of this method is encryption of the LL bands by AES-128 (128-bit keys), AES-192 (192-bit keys), or AES-256 (256-bit keys). Third, we focus on a method that implements partial encryption of the LL bands. Our approach provides considerable levels of security (key size, partial encryption, encryption mode) and has very limited adverse impact on compression efficiency. The proposed codec can provide up to 9 cipher schemes at a reasonable software cost. Latency, correlation, PSNR and compression rate results are analyzed and shown.
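
    The partial-encryption structure described in this record (transform the frame, then encrypt only the LL band) can be sketched as follows. This is a minimal illustration, not the paper's design: it uses a one-level 2-D Haar transform, and a SHA-256 counter-mode keystream as a stand-in for the AES processor, since the point here is the partial-encryption structure rather than the cipher itself.

```python
import hashlib
import struct

def haar_ll(block):
    """One level of a 2-D Haar DWT on an N x N block (N even): return the LL band."""
    n = len(block)
    # Row transform: keep the low-pass (average) half of each row.
    rows = [[(r[2 * i] + r[2 * i + 1]) / 2 for i in range(n // 2)] for r in block]
    # Column transform on the low-pass rows: average adjacent rows.
    return [[(rows[2 * i][c] + rows[2 * i + 1][c]) / 2 for c in range(n // 2)]
            for i in range(n // 2)]

def keystream(key, nbytes):
    """Counter-mode keystream from SHA-256 -- a stand-in for the AES processor."""
    out = b""
    ctr = 0
    while len(out) < nbytes:
        out += hashlib.sha256(key + struct.pack(">Q", ctr)).digest()
        ctr += 1
    return out[:nbytes]

def xor_ll(ll_bytes, key):
    """Partial encryption: XOR only the (quantized) LL coefficients; the detail
    bands would be left in the clear.  XOR-ing twice restores the input."""
    return bytes(b ^ k for b, k in zip(ll_bytes, keystream(key, len(ll_bytes))))
```

    Because only the LL band is encrypted, the bulk of the compressed stream is untouched, which is why this style of scheme has little impact on compression efficiency.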

  6. An evaluation of the effectiveness of the EPA comply code to demonstrate compliance with radionuclide emission standards at three manufacturing facilities

    International Nuclear Information System (INIS)

    Smith, L.R.; Laferriere, J.R.; Nagy, J.W.

    1991-01-01

    Measurements of airborne radionuclide emissions and associated environmental concentrations were made at, and in the vicinity of, two urban and one suburban facility where radiolabeled chemicals for biomedical research and radiopharmaceuticals are manufactured. Emission, environmental and meteorological measurements were used in the EPA COMPLY code and in environmental assessment models developed specifically for these sites to compare their ability to predict off-site measurements. The models and code were then used to determine the potential dose to hypothetical maximally exposed receptors, and the ability of these methods to demonstrate whether these facilities comply with proposed radionuclide emission standards was assessed. In no case did the models and code seriously underestimate off-site impacts. However, for certain radionuclides and chemical forms, the EPA COMPLY code was found to overestimate off-site impacts by such a large factor as to render its value questionable for determining regulatory compliance. Recommendations are offered for changing the code to enable it to be more serviceable to radionuclide users and regulators.

  7. Comparative assessment of H.265/MPEG-HEVC, VP9, and H.264/MPEG-AVC encoders for low-delay video applications

    Science.gov (United States)

    Grois, Dan; Marpe, Detlev; Nguyen, Tung; Hadar, Ofer

    2014-09-01

    The popularity of low-delay video applications has increased dramatically over the last years due to a rising demand for real-time video content (such as video conferencing or video surveillance), and also due to the increasing availability of relatively inexpensive heterogeneous devices (such as smartphones and tablets). To this end, this work presents a comparative assessment of the two latest video coding standards, H.265/MPEG-HEVC (High-Efficiency Video Coding) and H.264/MPEG-AVC (Advanced Video Coding), as well as of the VP9 proprietary video coding scheme. For evaluating H.264/MPEG-AVC, the open-source x264 encoder was selected, which has a multi-pass encoding mode similar to VP9. According to experimental results, which were obtained by using similar low-delay configurations for all three examined representative encoders, it was observed that H.265/MPEG-HEVC provides significant average bit-rate savings of 32.5% and 40.8% relative to VP9 and x264 for the 1-pass encoding, and average bit-rate savings of 32.6% and 42.2% for the 2-pass encoding, respectively. On the other hand, compared to the x264 encoder, typical low-delay encoding times of the VP9 encoder are about 2,000 times higher for the 1-pass encoding, and about 400 times higher for the 2-pass encoding.
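
    The average bit-rate savings quoted in this record are per-sequence savings at matched quality, averaged over the test set (the paper uses Bjøntegaard-style measurements; the sketch below simplifies to plain percentage savings at equal quality, and the bitrates are made up for illustration):

```python
def bitrate_saving(anchor_kbps, test_kbps):
    """Percentage bit-rate saving of the test encoder vs. the anchor at matched quality."""
    return 100.0 * (anchor_kbps - test_kbps) / anchor_kbps

# Hypothetical per-sequence bitrates (kbps) at equal quality -- illustrative only.
anchor_x264 = [4200, 2500, 980]   # e.g. x264, 1-pass low-delay
test_hevc   = [2400, 1500, 590]   # e.g. HEVC, 1-pass low-delay

avg = sum(bitrate_saving(a, t)
          for a, t in zip(anchor_x264, test_hevc)) / len(anchor_x264)
print(round(avg, 1))  # average saving in percent
```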

  8. Paraguay; Report on the Observance of Standards and Codes: FATF Recommendations for Anti-Money Laundering and Combating the Financing of Terrorism

    OpenAIRE

    International Monetary Fund

    2009-01-01

    This paper discusses assessment results on the observance of standards and codes on the Financial Action Task Force (FATF) recommendations for anti-money laundering and combating the financing of terrorism (AML/CFT) for Paraguay. The assessment reveals that the substantial U.S. dollar contraband trade that occurs on the borders shared with Argentina and Brazil facilitates money laundering in Paraguay. Achievements in the implementation of Paraguay’s AML framework remain modest since the crimin...

  9. Activities of the Commission of the European Communities in the field of codes and standards for FBRs

    International Nuclear Information System (INIS)

    Terzaghi, A.

    1987-01-01

    A description is given of the organization set up by the Commission of the European Communities to study problems and compare information among the member nations, and with other industrial nations, for the preparation of guides and codes for LMFBR components. Work performed and currently in progress on structural analysis, materials, and classification of components is summarized. (orig.)

  10. Video games

    OpenAIRE

    Kolář, Vojtěch

    2012-01-01

    This thesis is based on a detailed analysis of various topics related to the question of whether video games can be art. In the first place, it analyzes the current academic discussion on this subject and confronts the differing opinions of supporters and objectors of the idea that video games can be a full-fledged art form. The second point of this paper is to analyze the properties inherent to video games in order to find the reason why the cultural elite considers video games as i...

  11. Codex general standard for irradiated foods and recommended international code of practice for the operation of radiation facilities used for the treatment of foods

    International Nuclear Information System (INIS)

    1990-06-01

    The FAO/WHO Codex Alimentarius Commission was established to implement the Joint FAO/WHO Food Standards Programme. The purpose of this programme is to protect the health of consumers and to ensure fair practices in the food trade. At its 15th session, held in July 1983, the Commission adopted a Codex General Standard for Irradiated Foods and a Recommended International Code of Practice for the Operation of Radiation Facilities used for the Treatment of Foods. This Standard takes into account the recommendations and conclusions of the Joint FAO/IAEA/WHO Expert Committees convened to evaluate all available data concerning the various aspects of food irradiation. This Standard refers only to those aspects which relate to the processing of foods by ionising energy. The Standard recognizes that the process of food irradiation has been established as safe for general application to an overall average level of absorbed dose of 10 kGy. The latter value should not be regarded as a toxicological upper limit above which irradiated foods become unsafe; it is simply the level at or below which safety has been established. The Standard provides certain mandatory provisions concerning the facilities used and for the control of the process in the irradiation plants. The present Standard requires that shipping documents accompanying irradiated foods moving in trade should indicate the fact of irradiation. The labelling of prepackaged irradiated foods intended for direct sale to the consumer is not covered in this Standard.

  12. Codex general standard for irradiated foods and recommended international code of practice for the operation of radiation facilities used for the treatment of foods

    International Nuclear Information System (INIS)

    1984-01-01

    The FAO/WHO Codex Alimentarius Commission was established to implement the Joint FAO/WHO Food Standards Programme. The purpose of this programme is to protect the health of consumers and to ensure fair practices in the food trade. At its 15th session, held in July 1983, the Commission adopted a Codex General Standard for Irradiated Foods and a Recommended International Code of Practice for the Operation of Radiation Facilities used for the Treatment of Foods. This Standard takes into account the recommendations and conclusions of the Joint FAO/IAEA/WHO Expert Committees convened to evaluate all available data concerning the various aspects of food irradiation. This Standard refers only to those aspects which relate to the processing of foods by ionising energy. The Standard recognizes that the process of food irradiation has been established as safe for general application to an overall average level of absorbed dose of 10 kGy. The latter value should not be regarded as a toxicological upper limit above which irradiated foods become unsafe; it is simply the level at or below which safety has been established. The Standard provides certain mandatory provisions concerning the facilities used and for the control of the process in the irradiation plants. The present Standard requires that shipping documents accompanying irradiated foods moving in trade should indicate the fact of irradiation. The labelling of prepackaged irradiated foods intended for direct sale to the consumer is not covered in this Standard

  13. Scheduling Heuristics for Live Video Transcoding on Cloud Edges

    Institute of Scientific and Technical Information of China (English)

    Panagiotis Oikonomou; Maria G. Koziri; Nikos Tziritas; Thanasis Loukopoulos; XU Cheng-Zhong

    2017-01-01

    Efficient video delivery involves the transcoding of the original sequence into various resolutions, bitrates and standards, in order to match viewers' capabilities. Since video coding and transcoding are computationally demanding, performing a portion of these tasks at the network edges promises to decrease both the workload and network traffic towards the data centers of media providers. Motivated by the increasing popularity of live casting on social media platforms, in this paper we focus on the case of live video transcoding. Specifically, we investigate scheduling heuristics that decide which jobs should be assigned to an edge mini-datacenter and which to a backend datacenter. Through simulation experiments with different QoS requirements we conclude on the best alternative.
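
    One edge-vs-backend heuristic of the kind this record investigates can be sketched as follows. This is a hypothetical greedy rule for illustration only; the paper's actual heuristics and cost model are not reproduced here.

```python
def schedule(jobs, edge_cpu):
    """Greedy placement of live-transcoding jobs: smallest jobs go to the edge
    mini-datacenter while CPU capacity remains; the rest go to the backend.

    jobs: list of (job_id, cpu_demand) tuples.
    """
    edge, backend, used = [], [], 0
    for job_id, cpu in sorted(jobs, key=lambda j: j[1]):  # small jobs first
        if used + cpu <= edge_cpu:
            edge.append(job_id)
            used += cpu
        else:
            backend.append(job_id)
    return edge, backend

streams = [("720p", 3), ("1080p", 6), ("480p", 1), ("4k", 12)]
print(schedule(streams, edge_cpu=8))
```

    Prioritizing small jobs maximizes the number of streams served at the edge; a real scheduler would also weigh network traffic and per-job QoS requirements.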

  14. Assessment of United States industry structural codes and standards for application to advanced nuclear power reactors: Appendices. Volume 2

    International Nuclear Information System (INIS)

    Adams, T.M.; Stevenson, J.D.

    1995-10-01

    Throughout its history, the USNRC has remained committed to the use of industry consensus standards for the design, construction, and licensing of commercial nuclear power facilities. The existing industry standards are based on the current class of light water reactors and as such may not adequately address design and construction features of the next generation of Advanced Light Water Reactors and other types of Advanced Reactors. As part of their on-going commitment to industry standards, the USNRC commissioned this study to evaluate US industry structural standards for application to Advanced Light Water Reactors and Advanced Reactors. The initial review effort included (1) the review and study of the relevant reactor design basis documentation for eight Advanced Light Water Reactor and Advanced Reactor designs, (2) the review of the USNRC's design requirements for advanced reactors, (3) the review of the latest revisions of the relevant industry consensus structural standards, and (4) the identification of the need for changes to these standards. The results of these studies were used to develop recommended changes to industry consensus structural standards which will be used in the construction of Advanced Light Water Reactors and Advanced Reactors. Over seventy sets of proposed standard changes were recommended and the need for the development of four new structural standards was identified. In addition to the recommended standard changes, several other sets of information and data were extracted for use by the USNRC in other on-going programs. This information included (1) detailed observations on the response of structures and distribution system supports to the recent Northridge, California (1994) and Kobe, Japan (1995) earthquakes, (2) comparison of versions of certain standards cited in the standard review plan to the most current versions, and (3) comparison of the seismic and wind design basis for all the subject reactor designs.

  15. Camera network video summarization

    Science.gov (United States)

    Panda, Rameswar; Roy-Chowdhury, Amit K.

    2017-05-01

    Networks of vision sensors are deployed in many settings, ranging from security needs to disaster response to environmental monitoring. Many of these setups have hundreds of cameras and tens of thousands of hours of video. The difficulty of analyzing such a massive volume of video data is apparent whenever there is an incident that requires foraging through vast video archives to identify events of interest. As a result, video summarization, which automatically extracts a brief yet informative summary of these videos, has attracted intense attention in recent years. Much progress has been made in developing a variety of ways to summarize a single video in the form of a key sequence or video skim. However, generating a summary from a set of videos captured in a multi-camera network remains a novel and largely under-addressed problem. In this paper, with the aim of summarizing videos in a camera network, we introduce a novel representative selection approach via joint embedding and capped l21-norm minimization. The objective function is two-fold. The first is to capture the structural relationships of data points in a camera network via an embedding, which helps in characterizing the outliers and also in extracting a diverse set of representatives. The second is to use a capped l21-norm to model the sparsity and to suppress the influence of data outliers in representative selection. We propose to jointly optimize both objectives, such that the embedding can not only characterize the structure, but also indicate the requirements of sparse representative selection. Extensive experiments on standard multi-camera datasets demonstrate the efficacy of our method over state-of-the-art methods.
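
    For orientation, a capped l21 norm truncates each row norm at a threshold, so that a few very large rows (outliers) cannot dominate the penalty. A generic form of such a representative-selection objective (the symbols here are illustrative assumptions, not the paper's exact formulation) is:

```latex
\min_{Z}\ \underbrace{\lVert X - XZ \rVert_F^2}_{\text{self-representation}}
\;+\; \lambda \sum_{i=1}^{n} \min\!\left( \lVert Z_{i\cdot} \rVert_2 ,\; \theta \right)
```

    where the columns of X stack the embedded features, nonzero rows of Z select representatives, and the cap θ bounds the contribution any single row can make to the sparsity penalty.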

  16. Absorbed dose determination in external beam radiotherapy. An international code of practice for dosimetry based on standards of absorbed dose to water

    International Nuclear Information System (INIS)

    2000-01-01

    The International Atomic Energy Agency published in 1987 an International Code of Practice entitled 'Absorbed Dose Determination in Photon and Electron Beams' (IAEA Technical Reports Series No. 277 (TRS-277)), recommending procedures to obtain the absorbed dose in water from measurements made with an ionization chamber in external beam radiotherapy. A second edition of TRS-277 was published in 1997 updating the dosimetry of photon beams, mainly kilovoltage X rays. Another International Code of Practice for radiotherapy dosimetry entitled 'The Use of Plane-Parallel Ionization Chambers in High Energy Electron and Photon Beams' (IAEA Technical Reports Series No. 381 (TRS-381)) was published in 1997 to further update TRS-277 and complement it with respect to the area of parallel-plate ionization chambers. Both codes have proven extremely valuable for users involved in the dosimetry of the radiation beams used in radiotherapy. In TRS-277 the calibration of the ionization chambers was based on primary standards of air kerma; this procedure was also used in TRS-381, but the new trend of calibrating ionization chambers directly in a water phantom in terms of absorbed dose to water was introduced. The development of primary standards of absorbed dose to water for high energy photon and electron beams, and improvements in radiation dosimetry concepts, offer the possibility of reducing the uncertainty in the dosimetry of radiotherapy beams. The dosimetry of kilovoltage X rays, as well as that of proton and heavy ion beams, interest in which has grown considerably in recent years, can also be based on these standards. Thus a coherent dosimetry system based on standards of absorbed dose to water is possible for practically all radiotherapy beams. Many Primary Standard Dosimetry Laboratories (PSDLs) already provide calibrations in terms of absorbed dose to water at the radiation quality of 60 Co gamma rays. 
Some laboratories have extended calibrations to high energy photon and

  17. Design of batch audio/video conversion platform based on JavaEE

    Science.gov (United States)

    Cui, Yansong; Jiang, Lianpin

    2018-03-01

    With the rapid development of the digital publishing industry, audio/video publishing exhibits significant features such as a diversity of coding standards for audio and video files and massive data volumes. Faced with massive and diverse data, converting it quickly and efficiently to a unified coding format has posed great difficulties for digital publishing organizations. In view of this demand and present situation, this paper proposes a distributed online audio and video format conversion platform with a B/S structure, based on the Spring+SpringMVC+Mybatis development architecture and combined with the open-source FFMPEG format conversion tool. Based on the Java language, the key technologies and strategies used in the design of the platform architecture are analyzed emphatically, and an efficient audio and video format conversion system is designed and developed, composed of a "front display system", a "core scheduling server" and a "conversion server". The test results show that, compared with an ordinary audio and video conversion scheme, the batch audio and video format conversion platform can effectively improve the conversion efficiency of audio and video files and reduce the complexity of the work. Practice has proved that the key technology discussed in this paper can be applied in the field of large-batch file processing, and has certain practical application value.
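
    The conversion server's core job is to turn each queued file into the unified target format via FFmpeg. This record does not give the platform's actual FFmpeg options, so the sketch below just builds plausible command lines (the codec names and crf value are assumptions) without executing them:

```python
def ffmpeg_cmd(src, dst, vcodec="libx264", acodec="aac", crf=23):
    """Build one FFmpeg transcode command as an argv list, ready for subprocess.run."""
    return ["ffmpeg", "-y", "-i", src,
            "-c:v", vcodec, "-crf", str(crf),
            "-c:a", acodec, dst]

# A batch of queued conversion jobs (source file, unified target).
jobs = [("lecture.avi", "lecture.mp4"), ("interview.mov", "interview.mp4")]
cmds = [ffmpeg_cmd(src, dst) for src, dst in jobs]
for cmd in cmds:
    print(" ".join(cmd))
```

    In a distributed setup, the core scheduling server would dispatch each argv list to a conversion server, which runs it and reports completion back.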

  18. Comparison of European codes and standards on the welding of LMFBR components and proposals for their harmonization

    International Nuclear Information System (INIS)

    Koehler, S.

    1992-01-01

    A comparative study has been conducted, within the framework of the exercises comparing specifications and standards for fast reactors, in the following specialized fields: welding supervisor and welder; welder's tests; production test specimens of welds; and measures to prevent mistakes with weld material. The relevant specifications were forwarded by the national delegations: Germany, France, Italy and the United Kingdom. The comparison is presented in tabular form where rules for a particular sub-group of a specialized field are laid down in the standards of at least two Member States. In each case, the conclusions and requirements set out in the national standards have been compared in relation to a specific comparison criterion. The quantitative comparisons of the requirements laid down in the individual national standards are assessed from the following standpoints: a) points of agreement between the regulations in the standards of all four Member States (Germany, France, United Kingdom and Italy); b) significant differences between the regulations. 13 tabs

  19. MCNP code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MCNP code is the major Monte Carlo coupled neutron-photon transport research tool at the Los Alamos National Laboratory, and it represents the most extensive Monte Carlo development program in the United States which is available in the public domain. The present code is the direct descendent of the original Monte Carlo work of Fermi, von Neumann, and Ulam at Los Alamos in the 1940s. Development has continued uninterrupted since that time, and the current version of MCNP (or its predecessors) has always included state-of-the-art methods in the Monte Carlo simulation of radiation transport, basic cross section data, geometry capability, variance reduction, and estimation procedures. The authors of the present code have oriented its development toward general user application. The documentation, though extensive, is presented in a clear and simple manner with many examples, illustrations, and sample problems. In addition to providing the desired results, the output listings give a wealth of detailed information (some optional) concerning each stage of the calculation. The code system is continually updated to take advantage of advances in computer hardware and software, including interactive modes of operation, diagnostic interrupts and restarts, and a variety of graphical and video aids.

  20. No-reference pixel based video quality assessment for HEVC decoded video

    DEFF Research Database (Denmark)

    Huang, Xin; Søgaard, Jacob; Forchhammer, Søren

    2017-01-01

    the quantization step used in the Intra coding is estimated. We map the obtained HEVC features using an Elastic Net to predict subjective video quality scores, Mean Opinion Scores (MOS). The performance is verified on a dataset consisting of HEVC-coded 4K UHD (resolution equal to 3840 x 2160) video sequences...

  1. Akademisk video

    DEFF Research Database (Denmark)

    Frølunde, Lisbeth

    2017-01-01

    This chapter focuses on methodological issues that arise in relation to the use of (digital) video for research communication, not least online. Video has long been used in research for data collection and for research communication. With digitization and the internet ...

  2. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Functional modules F1--F8 -- Volume 2, Part 1, Revision 4

    International Nuclear Information System (INIS)

    Greene, N.M.; Petrie, L.M.; Westfall, R.M.; Bucholz, J.A.; Hermann, O.W.; Fraley, S.K.

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation; Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries

  3. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Functional modules F9--F16 -- Volume 2, Part 2, Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    West, J.T.; Hoffman, T.J.; Emmett, M.B.; Childs, K.W.; Petrie, L.M.; Landers, N.F.; Bryan, C.B.; Giles, G.E. [Oak Ridge National Lab., TN (United States)

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries. This volume discusses the following functional modules: MORSE-SGC; HEATING 7.2; KENO V.a; JUNEBUG-II; HEATPLOT-S; REGPLOT 6; PLORIGEN; and OCULAR.

  4. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Functional modules F1--F8 -- Volume 2, Part 1, Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Greene, N.M.; Petrie, L.M.; Westfall, R.M.; Bucholz, J.A.; Hermann, O.W.; Fraley, S.K. [Oak Ridge National Lab., TN (United States)

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation; Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries.

  5. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Functional modules F9--F16 -- Volume 2, Part 2, Revision 4

    International Nuclear Information System (INIS)

    West, J.T.; Hoffman, T.J.; Emmett, M.B.; Childs, K.W.; Petrie, L.M.; Landers, N.F.; Bryan, C.B.; Giles, G.E.

    1995-04-01

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries. This volume discusses the following functional modules: MORSE-SGC; HEATING 7.2; KENO V.a; JUNEBUG-II; HEATPLOT-S; REGPLOT 6; PLORIGEN; and OCULAR

  6. Inventory of power plants in the United States. [By state within standard Federal Regions, using county codes

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-12-01

    The purpose of this inventory of power plants is to provide a ready reference for planners whose focus is on the state, standard Federal region, and/or national level. Thus the inventory is compiled alphabetically by state within standard Federal regions. The units are listed alphabetically within electric utility systems which in turn are listed alphabetically within states. The locations are identified to county level according to the Federal Information Processing Standards Publication Counties and County Equivalents of the States of the United States. Data compiled include existing and projected electrical generation units, jointly owned units, and projected construction units.

  7. Video Measurements: Quantity or Quality

    Science.gov (United States)

    Zajkov, Oliver; Mitrevski, Boce

    2012-01-01

    Students have problems with understanding, using and interpreting graphs. In order to improve the students' skills for working with graphs, we propose Manual Video Measurement (MVM). In this paper, the MVM method is explained and its accuracy is tested. The comparison with the standardized video data software shows that its accuracy is comparable…

  8. Watermarking textures in video games

    Science.gov (United States)

    Liu, Huajian; Berchtold, Waldemar; Schäfer, Marcel; Lieb, Patrick; Steinebach, Martin

    2014-02-01

    Digital watermarking is a promising solution to video game piracy. In this paper, based on the analysis of special challenges and requirements in terms of watermarking textures in video games, a novel watermarking scheme for DDS textures in video games is proposed. To meet the performance requirements in video game applications, the proposed algorithm embeds the watermark message directly in the compressed stream in DDS files and can be straightforwardly applied in watermark container technique for real-time embedding. Furthermore, the embedding approach achieves high watermark payload to handle collusion secure fingerprinting codes with extreme length. Hence, the scheme is resistant to collusion attacks, which is indispensable in video game applications. The proposed scheme is evaluated in aspects of transparency, robustness, security and performance. Especially, in addition to classical objective evaluation, the visual quality and playing experience of watermarked games is assessed subjectively in game playing.
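
    As a generic illustration of payload embedding (this is NOT the paper's scheme, which embeds the message directly in the compressed DDS stream), the sketch below writes watermark bits into the least significant bit of raw texture bytes and reads them back:

```python
def embed_bits(texture, bits):
    """Write one payload bit into the LSB of each leading byte of the texture."""
    out = bytearray(texture)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # clear LSB, then set it to the payload bit
    return bytes(out)

def extract_bits(texture, n):
    """Read the first n payload bits back out of the texture."""
    return [texture[i] & 1 for i in range(n)]

marked = embed_bits(b"\x10\x20\x30\x40", [1, 0, 1])
print(extract_bits(marked, 3))  # -> [1, 0, 1]
```

    A collusion-secure fingerprinting code would be carried as such a payload, but at far greater length, which is why the high capacity of the DDS-stream approach matters.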

  9. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal being corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. Digital transmission, on the other hand, is relatively immune to noise, cross-talk and distortion, primarily because the digital signal can be faithfully regenerated at each repeater purely on the basis of a binary decision. Hence the end-to-end performance of a digital link becomes essentially independent of the length and operating frequency bands of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service-provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding is voice coding. This term is more generic in the sense that the…
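    The waveform-coding branch described above can be illustrated with μ-law companding, the logarithmic compression law standardized for North American telephony in ITU-T G.711. A minimal continuous-domain sketch (the function names are my own; a real codec would quantize the companded value to 8 bits):

    ```python
    import math

    MU = 255  # mu-law parameter used by G.711

    def mulaw_encode(x: float) -> float:
        """Compress a linear sample in [-1, 1] with mu-law companding."""
        return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

    def mulaw_decode(y: float) -> float:
        """Expand a companded sample back to the linear domain."""
        return math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)

    sample = 0.25
    round_trip = mulaw_decode(mulaw_encode(sample))  # recovers the sample
    ```

    Companding allocates finer quantization steps to quiet samples, matching the perceptual sensitivity of the ear, which is why waveform coders of this kind were the first widely deployed digital speech codecs.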

  10. Data Partitioning Technique for Improved Video Prioritization

    Directory of Open Access Journals (Sweden)

    Ismail Amin Ali

    2017-07-01

    Full Text Available A compressed video bitstream can be partitioned according to the coding priority of the data, allowing prioritized wireless communication or selective dropping in a congested channel. This technique is known as data partitioning in the H.264/Advanced Video Coding (AVC) codec; this paper introduces a further sub-partition of one of the codec's three data partitions. Results show a 5 dB improvement in Peak Signal-to-Noise Ratio (PSNR) through this innovation. In particular, the data partition containing intra-coded residuals is sub-divided into data from those macroblocks (MBs) naturally intra-coded and those MBs forcibly inserted for non-periodic intra-refresh. Interactive user-to-user video streaming can benefit, as in that setting HTTP adaptive streaming is inappropriate and the High Efficiency Video Coding (HEVC) codec is too energy demanding.
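    The selective-dropping idea can be sketched as a priority filter over partitioned packets. The priority classes below loosely mirror the three H.264/AVC data partitions (A: headers and motion vectors, B: intra-coded residuals, C: inter-coded residuals); the class names and dropping policy are illustrative, not the paper's sub-partitioning scheme:

    ```python
    from dataclasses import dataclass

    # Lower number = more important; partition A must survive for any decoding.
    PRIORITY = {"A": 0, "B": 1, "C": 2}

    @dataclass
    class Packet:
        partition: str
        payload: bytes

    def drop_for_congestion(packets: list, keep_levels: int) -> list:
        """Under congestion, keep only the keep_levels most important classes."""
        return [p for p in packets if PRIORITY[p.partition] < keep_levels]

    stream = [Packet("A", b"hdr+mv"), Packet("B", b"intra"), Packet("C", b"inter")]
    survivors = drop_for_congestion(stream, keep_levels=2)  # partition C dropped
    ```

    Dropping partition C first degrades inter-predicted detail while preserving the intra-coded data that limits error propagation, which is the rationale for prioritized transmission.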

  11. Video Podcasts

    DEFF Research Database (Denmark)

    Nortvig, Anne Mette; Sørensen, Birgitte Holm

    2016-01-01

    This project’s aim was to support and facilitate master’s students’ preparation and collaboration by making video podcasts of short lectures available on YouTube prior to students’ first face-to-face seminar. The empirical material stems from group interviews, from statistical data created through...... YouTube analytics and from surveys answered by students after the seminar. The project sought to explore how video podcasts support learning and reflection online and how students use and reflect on the integration of online activities in the videos. Findings showed that students engaged actively...

  12. Video games.

    Science.gov (United States)

    Funk, Jeanne B

    2005-06-01

    The video game industry insists that it is doing everything possible to provide information about the content of games so that parents can make informed choices; however, surveys indicate that ratings may not reflect consumer views of the nature of the content. This article describes some of the currently popular video games, as well as developments that are on the horizon, and discusses the status of research on the positive and negative impacts of playing video games. Recommendations are made to help parents ensure that children play games that are consistent with their values.

  13. No-Reference Video Quality Assessment using MPEG Analysis

    DEFF Research Database (Denmark)

    Søgaard, Jacob; Forchhammer, Søren; Korhonen, Jari

    2013-01-01

    We present a method for No-Reference (NR) Video Quality Assessment (VQA) for decoded video without access to the bitstream. This is achieved by extracting and pooling features from a NR image quality assessment method used frame by frame. We also present methods to identify the video coding...... and estimate the video coding parameters for MPEG-2 and H.264/AVC which can be used to improve the VQA. The analysis differs from most other video coding analysis methods since it is without access to the bitstream. The results show that our proposed method is competitive with other recent NR VQA methods...
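    The per-frame feature extraction followed by temporal pooling can be sketched with a trimmed mean over frame-level quality scores. This is a toy illustration of the pooling step only, not the authors' NR method, and the trim fraction is an assumed parameter:

    ```python
    def pool_frame_scores(frame_scores: list, trim: float = 0.1) -> float:
        """Temporal pooling: trimmed mean of per-frame quality scores.

        Discards the trim fraction of lowest and highest scores so short
        outlier bursts do not dominate the sequence-level estimate.
        """
        s = sorted(frame_scores)
        k = int(len(s) * trim)
        kept = s[k:len(s) - k] if k else s
        return sum(kept) / len(kept)

    sequence_quality = pool_frame_scores([10, 48, 50, 52, 90], trim=0.2)
    ```

    More elaborate pooling (e.g. percentile pooling or recency weighting) is common in VQA, but any such scheme reduces a vector of frame scores to one sequence score in this same shape.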

  14. Segmentation of object-based video of gaze communication

    DEFF Research Database (Denmark)

    Aghito, Shankar Manuel; Stegmann, Mikkel Bille; Forchhammer, Søren

    2005-01-01

    Aspects of video communication based on gaze interaction are considered. The overall idea is to use gaze interaction to control video, e.g. for video conferencing. Towards this goal, animation of a facial mask is demonstrated. The animation is based on images using Active Appearance Models (AAM). Good-quality reproduction of (low-resolution) coded video of an animated facial mask at rates as low as 10-20 kbit/s using MPEG-4 object-based video is demonstrated.

  15. Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the objectives, meeting goals and overall NASA goals for the NASA Data Standards Working Group. The presentation includes information on the technical progress surrounding the objective, short LDPC codes, and the general results on the Pu-Pw tradeoff.

  16. Input data preparation and simulation of the second standard problem of IAEA using the Trac/PF1 code

    International Nuclear Information System (INIS)

    Madeira, A.A.; Pontedeiro, A.C.; Silva Galetti, M.R. da; Borges, R.C.

    1989-10-01

    The second Standard Problem sponsored by the IAEA consists of the simulation of a small LOCA located in the downcomer of the PMK-NVH integral test facility, which models a WWER/440-type reactor. This report presents the input data preparation and a comparison between TRAC-PF1 results and experimental measurements. (author) [pt

  17. The use of portable video media vs standard verbal communication in the urological consent process: a multicentre, randomised controlled, crossover trial.

    Science.gov (United States)

    Winter, Matthew; Kam, Jonathan; Nalavenkata, Sunny; Hardy, Ellen; Handmer, Marcus; Ainsworth, Hannah; Lee, Wai Gin; Louie-Johnsun, Mark

    2016-11-01

    To determine whether portable video media (PVM) improves the knowledge and satisfaction patients acquire during the consent process for cystoscopy and insertion of a ureteric stent, compared with standard verbal communication (SVC), as informed consent is a crucial component of patient care and PVM is an emerging technology that may help improve the consent process. In this multicentre randomised controlled crossover trial, patients requiring cystoscopy and stent insertion were recruited from two major teaching hospitals in Australia over a 15-month period (July 2014-December 2015). Patient information was delivered via PVM and SVC. The PVM consisted of an audio-visual presentation with cartoon animation presented on an iPad. Patient satisfaction was assessed using the validated Client Satisfaction Questionnaire 8 (CSQ-8; maximum score 32) and knowledge was tested using a true/false questionnaire (maximum score 28). Questionnaires were completed after the first intervention and after crossover. Scores were analysed using the independent-samples t-test and the Wilcoxon signed-rank test for the crossover analysis. In all, 88 patients were recruited. A significant 3.1-point (15.5%) increase in understanding was demonstrated favouring the use of PVM (P < 0.001). There was no difference in patient satisfaction between the groups as judged by the CSQ-8. A significant 3.6-point (17.8%) increase in knowledge score was seen when the SVC group crossed over to the PVM arm. In all, 80.7% of patients preferred PVM and 19.3% preferred SVC. Limitations include the lack of a validated questionnaire to test knowledge acquired from the interventions. This study demonstrates patients' preference for PVM in the urological consent process for cystoscopy and ureteric stent insertion. PVM improves patients' understanding compared with SVC and is a more effective means of content delivery to patients in terms of overall preference and knowledge gained during the consent process. © 2016 The

  18. Modeling of video traffic in packet networks, low rate video compression, and the development of a lossy+lossless image compression algorithm

    Science.gov (United States)

    Sayood, K.; Chen, Y. C.; Wang, X.

    1992-01-01

    During this reporting period we have worked on three somewhat different problems. These are modeling of video traffic in packet networks, low rate video compression, and the development of a lossy + lossless image compression algorithm, which might have some application in browsing algorithms. The lossy + lossless scheme is an extension of work previously done under this grant. It provides a simple technique for incorporating browsing capability. The low rate coding scheme is also a simple variation on the standard discrete cosine transform (DCT) coding approach. In spite of its simplicity, the approach provides surprisingly high quality reconstructions. The modeling approach is borrowed from the speech recognition literature, and seems to be promising in that it provides a simple way of obtaining an idea about the second order behavior of a particular coding scheme. Details about these are presented.
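    The standard DCT coding approach on which the low-rate scheme is a variation can be illustrated with a naive 1-D DCT-II, the transform applied row- and column-wise to 8x8 blocks in classic image and video coders. A pure-Python sketch written for clarity rather than speed:

    ```python
    import math

    def dct_1d(block: list) -> list:
        """Naive DCT-II of a length-N block with orthonormal scaling."""
        N = len(block)
        out = []
        for k in range(N):
            c = math.sqrt(1 / N) if k == 0 else math.sqrt(2 / N)
            out.append(c * sum(x * math.cos(math.pi * (2 * n + 1) * k / (2 * N))
                               for n, x in enumerate(block)))
        return out

    # A constant block compacts all its energy into the DC coefficient,
    # which is what makes coarse quantization of the AC terms cheap.
    coeffs = dct_1d([1.0] * 8)
    ```

    Energy compaction is the reason DCT coding tolerates aggressive quantization: most coefficients of smooth blocks are near zero and can be discarded at little perceptual cost.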

  19. Uruguay; Report on Observance of Standards and Codes-Data Module and the Response by the Authorities

    OpenAIRE

    International Monetary Fund

    2001-01-01

    The paper provides a summary of Uruguay's practices with respect to the coverage, periodicity, and timeliness of the Special Data Dissemination Standard (SDDS) data categories, and an assessment of the quality of national accounts, prices, fiscal, monetary and financial, and external sector statistics. Uruguay has made good progress recently in improving the dissemination of statistical information. The Internet pages of the Central Bank of Uruguay (BCU) and the National Institute of Statisti...

  20. Development of an accident consequence assessment code for evaluating site suitability of light- and heavy-water reactors based on the Korean Technical standards

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Won Tae; Jeong, Hae Sung; Jeong, Hyo Joon; Kil, A Reum; Kim, Eun Han; Han, Moon Hee [Nuclear Environment Safety Research Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-12-15

    Methodologies for radiological consequence assessment differ distinctly according to the design principles of the original nuclear suppliers and the technical standards imposed on them. This is due to uncertainties in the accidental source term, radionuclide behaviour in the environment, and the subsequent radiological dose. Both PWR and PHWR reactor types are operated in Korea, but the technical standards for evaluating atmospheric dispersion have been enacted based on the U.S. NRC's positions regardless of reactor type, which can cause controversy between the licensor and licensee of a nuclear power plant. An integrated accident consequence assessment code, ACCESS (Accident Consequence Assessment Code for Evaluating Site Suitability), was therefore developed, taking into account the unique regulatory positions for each reactor type within the framework of the current Korean technical standards. The code was modelled under the framework of NRC Regulatory Guide 1.145 for light-water reactors, while reflecting the features of heavy-water reactors as specified in the Canadian National Standard and the modelling features of MACCS2, such as the atmospheric diffusion coefficient, ground deposition, surface roughness, radioactive plume depletion, and exposure from ground deposition. Field tracer experiments and hand calculations were carried out for validation and verification of the models. The modelling approaches and features of ACCESS are introduced, and its results for a hypothetical accident scenario are comprehensively discussed. In the application study, the results predicted by the light-water reactor assessment model were higher, in terms of total dose, than those of the other models.
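    Atmospheric dispersion assessments of this kind are typically built on the Gaussian plume equation. A generic sketch of that equation with ground reflection (this is the textbook form, not the ACCESS implementation; parameter names are my own):

    ```python
    import math

    def gaussian_plume(Q: float, u: float, y: float, z: float, H: float,
                       sigma_y: float, sigma_z: float) -> float:
        """Gaussian plume concentration at crosswind offset y, height z.

        Q: release rate, u: wind speed, H: effective release height,
        sigma_y/sigma_z: dispersion coefficients at the downwind distance.
        The second vertical term models reflection at the ground.
        """
        lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
        vertical = (math.exp(-(z - H) ** 2 / (2 * sigma_z ** 2))
                    + math.exp(-(z + H) ** 2 / (2 * sigma_z ** 2)))
        return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

    # Centreline, ground-level release and receptor:
    conc = gaussian_plume(Q=1.0, u=1.0, y=0.0, z=0.0, H=0.0,
                          sigma_y=1.0, sigma_z=1.0)
    ```

    Regulatory models such as those in NRC Regulatory Guide 1.145 build on this form, differing mainly in how the dispersion coefficients, depletion, and deposition terms are prescribed.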