WorldWideScience

Sample records for source coded character

  1. Boolean logic and character state identity: pitfalls of character coding in metazoan cladistics

    NARCIS (Netherlands)

    Jenner, Ronald A.

    2002-01-01

    A critical study of the morphological data sets used for the most recent analyses of metazoan cladistics exposes a rather cavalier attitude towards character coding. Binary absence/presence coding is ubiquitous, but without any explicit justification. This uncompromising application of Boolean logic

  2. LDGM Codes for Channel Coding and Joint Source-Channel Coding of Correlated Sources

    Directory of Open Access Journals (Sweden)

    Javier Garcia-Frias

    2005-05-01

    Full Text Available We propose a coding scheme based on the use of systematic linear codes with low-density generator matrix (LDGM codes) for channel coding and joint source-channel coding of multiterminal correlated binary sources. In both cases, the structures of the LDGM encoder and decoder are shown, and a concatenated scheme aimed at reducing the error floor is proposed. Several decoding possibilities are investigated, compared, and evaluated. For different types of noisy channels and correlation models, the resulting performance is very close to the theoretical limits.
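    The systematic LDGM encoding described above can be sketched in a few lines: the encoder transmits the source bits followed by parity bits formed from a sparse generator matrix. This is an illustrative toy, not the paper's construction; the block size, column weight, and bit values are invented, and a practical code would use far larger blocks with a designed degree distribution.

```python
import numpy as np

def make_ldgm_generator(k, m, weight, seed=0):
    """Random low-density generator: each of the k source bits
    participates in `weight` of the m parity bits."""
    rng = np.random.default_rng(seed)
    G = np.zeros((k, m), dtype=np.uint8)
    for i in range(k):
        G[i, rng.choice(m, size=weight, replace=False)] = 1
    return G

def ldgm_encode(x, G):
    """Systematic encoding: source bits x, then parity p = x G (mod 2)."""
    return np.concatenate([x, x @ G % 2])

G = make_ldgm_generator(k=8, m=4, weight=2)
x = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
codeword = ldgm_encode(x, G)   # 8 systematic bits + 4 sparse parity bits
```

    Because the generator is sparse, encoding cost grows only linearly with the block length, which is the main attraction of LDGM codes.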

  3. The Effects of Single and Dual Coded Multimedia Instructional Methods on Chinese Character Learning

    Science.gov (United States)

    Wang, Ling

    2013-01-01

    Learning Chinese characters is a difficult task for adult English native speakers due to the significant differences between the Chinese and English writing system. The visuospatial properties of Chinese characters have inspired the development of instructional methods using both verbal and visual information based on the Dual Coding Theory. This…

  4. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low-density parity-check accumulate (LDPCA) codes.

  5. Adaptive distributed source coding.

    Science.gov (United States)

    Varodayan, David; Lin, Yao-Chung; Girod, Bernd

    2012-05-01

    We consider distributed source coding in the presence of hidden variables that parameterize the statistical dependence among sources. We derive the Slepian-Wolf bound and devise coding algorithms for a block-candidate model of this problem. The encoder sends, in addition to syndrome bits, a portion of the source to the decoder uncoded as doping bits. The decoder uses the sum-product algorithm to simultaneously recover the source symbols and the hidden statistical dependence variables. We also develop novel techniques based on density evolution (DE) to analyze the coding algorithms. We experimentally confirm that our DE analysis closely approximates practical performance. This result allows us to efficiently optimize parameters of the algorithms. In particular, we show that the system performs close to the Slepian-Wolf bound when an appropriate doping rate is selected. We then apply our coding and analysis techniques to a reduced-reference video quality monitoring system and show a bit rate saving of about 75% compared with fixed-length coding.

  6. Syndrome-source-coding and its universal generalization. [error correcting codes for data compression]

    Science.gov (United States)

    Ancheta, T. C., Jr.

    1976-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A 'universal' generalization of syndrome-source-coding is formulated which provides robustly effective distortionless coding of source ensembles. Two examples are given, comparing the performance of noiseless universal syndrome-source-coding to (1) run-length coding and (2) Lynch-Davisson-Schalkwijk-Cover universal coding for an ensemble of binary memoryless sources.
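    The syndrome-source-coding idea is easy to demonstrate with a small linear code. A rough sketch using the (7,4) Hamming parity-check matrix (my choice of code, not the paper's): a sparse 7-bit source block is treated as an error pattern, its 3-bit syndrome is the compressed data, and decompression maps each syndrome back to the minimum-weight pattern (coset leader) in its coset.

```python
import numpy as np
from itertools import product

# Parity-check matrix of the (7,4) Hamming code: column j is the
# binary representation of j+1.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=np.uint8)

def compress(block):
    """The syndrome of the source block is the compressed data (7 -> 3 bits)."""
    return tuple(H @ block % 2)

# Coset-leader table: map each syndrome to its minimum-weight pattern.
leaders = {}
for bits in product([0, 1], repeat=7):
    e = np.array(bits, dtype=np.uint8)
    s = compress(e)
    if s not in leaders or e.sum() < leaders[s].sum():
        leaders[s] = e

def decompress(syndrome):
    return leaders[syndrome]

# Any source block of weight <= 1 (the likely blocks for a very
# sparse source) is recovered exactly from its 3-bit syndrome.
block = np.array([0, 0, 0, 0, 1, 0, 0], dtype=np.uint8)
recovered = decompress(compress(block))
```

    Distortion appears only when a block is not the coset leader of its syndrome, which becomes rare as the source gets sparser and the code stronger.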

  7. Joint source-channel coding using variable length codes

    NARCIS (Netherlands)

    Balakirsky, V.B.

    2001-01-01

    We address the problem of joint source-channel coding when variable-length codes are used for information transmission over a discrete memoryless channel. Data transmitted over the channel are interpreted as pairs (m_k, t_k), where m_k is a message generated by the source and t_k is a time instant

  8. Rate-adaptive BCH coding for Slepian-Wolf coding of highly correlated sources

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Salmistraro, Matteo; Larsen, Knud J.

    2012-01-01

    This paper considers using BCH codes for distributed source coding using feedback. The focus is on coding using short block lengths for a binary source, X, having a high correlation between each symbol to be coded and a side information, Y, such that the marginal probability of each symbol, Xi in X, given Y is highly skewed. In the analysis, noiseless feedback and noiseless communication are assumed. A rate-adaptive BCH code is presented and applied to distributed source coding. Simulation results for a fixed error probability show that rate-adaptive BCH achieves better performance than LDPCA (Low-Density Parity-Check Accumulate) codes for high correlation between source symbols and the side information.
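    The theoretical limit for this setup is the conditional entropy H(X|Y). A minimal sketch, assuming the correlation is modelled as a binary symmetric channel with crossover probability p between X and Y (an illustrative model, not necessarily the paper's exact one):

```python
from math import log2

def binary_entropy(p):
    """H(p) in bits; H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Model X as Y passed through a binary symmetric channel with
# crossover probability p, so H(X|Y) = H(p).  High correlation
# (small p) means Xi given Y is highly skewed and few bits suffice.
for p in (0.01, 0.05, 0.1):
    rate_limit = binary_entropy(p)   # bits per source symbol
    print(f"p = {p}: Slepian-Wolf limit H(X|Y) = {rate_limit:.3f} bits")
```

    The rate-adaptive scheme requests extra syndrome bits over the feedback channel until decoding succeeds, so its average rate can approach this limit without knowing p in advance.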

  9. Multiple LDPC decoding for distributed source coding and video coding

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Luong, Huynh Van; Huang, Xin

    2011-01-01

    Distributed source coding (DSC) is a coding paradigm for systems which fully or partly exploit the source statistics at the decoder to reduce the computational burden at the encoder. Distributed video coding (DVC) is one example. This paper considers the use of Low-Density Parity-Check Accumulate (LDPCA) codes in a DSC scheme with feedback. To improve the LDPC coding performance in the context of DSC and DVC, while retaining short encoder blocks, this paper proposes multiple parallel LDPC decoding. The proposed scheme passes soft information between decoders to enhance performance. Experimental...

  10. The Visual Code Navigator : An Interactive Toolset for Source Code Investigation

    NARCIS (Netherlands)

    Lommerse, Gerard; Nossin, Freek; Voinea, Lucian; Telea, Alexandru

    2005-01-01

    We present the Visual Code Navigator, a set of three interrelated visual tools that we developed for exploring large source code software projects from three different perspectives, or views: The syntactic view shows the syntactic constructs in the source code. The symbol view shows the objects a

  11. Transmission imaging with a coded source

    International Nuclear Information System (INIS)

    Stoner, W.W.; Sage, J.P.; Braun, M.; Wilson, D.T.; Barrett, H.H.

    1976-01-01

    The conventional approach to transmission imaging is to use a rotating anode x-ray tube, which provides the small, brilliant x-ray source needed to cast sharp images of acceptable intensity. Stationary anode sources, although inherently less brilliant, are more compatible with the use of large area anodes, and so they can be made more powerful than rotating anode sources. Spatial modulation of the source distribution provides a way to introduce detailed structure in the transmission images cast by large area sources, and this permits the recovery of high resolution images, in spite of the source diameter. The spatial modulation is deliberately chosen to optimize recovery of image structure; the modulation pattern is therefore called a "code". A variety of codes may be used; the essential mathematical property is that the code possess a sharply peaked autocorrelation function, because this property permits the decoding of the raw image cast by the coded source. Random point arrays, non-redundant point arrays, and the Fresnel zone pattern are examples of suitable codes. This paper is restricted to the case of the Fresnel zone pattern code, which has the unique additional property of generating raw images analogous to Fresnel holograms. Because the spatial frequencies of these raw images are extremely coarse compared with those of actual holograms, a photoreduction step onto a holographic plate is necessary before the decoded image may be displayed with the aid of coherent illumination.
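    The "sharply peaked autocorrelation" requirement can be illustrated numerically. A minimal sketch using a 1-D pseudorandom binary code rather than the paper's 2-D Fresnel zone pattern:

```python
import numpy as np

rng = np.random.default_rng(1)
code = rng.integers(0, 2, size=201).astype(float)   # random 0/1 mask

# Zero-mean the code so that off-peak correlations average to ~0.
c = code - code.mean()
autocorr = np.correlate(c, c, mode="full")
peak = autocorr[len(c) - 1]                         # zero-lag value
sidelobe = np.max(np.abs(np.delete(autocorr, len(c) - 1)))
print(f"peak = {peak:.1f}, largest sidelobe = {sidelobe:.1f}")
```

    The zero-lag peak dominates every sidelobe, which is what allows the raw image (a superposition of shifted code patterns) to be decoded by correlation.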

  12. Research on Primary Shielding Calculation Source Generation Codes

    Science.gov (United States)

    Zheng, Zheng; Mei, Qiliang; Li, Hui; Shangguan, Danhua; Zhang, Guangchun

    2017-09-01

    Primary Shielding Calculation (PSC) plays an important role in reactor shielding design and analysis. In order to facilitate PSC, a source generation code is developed to generate cumulative distribution functions (CDFs) for the source particle sample code of the J Monte Carlo Transport (JMCT) code, and a source particle sample code is developed to sample source particle directions, types, coordinates, energies and weights from the CDFs. A source generation code is developed to transform three-dimensional (3D) power distributions in xyz geometry to source distributions in r-θ-z geometry for the J Discrete Ordinate Transport (JSNT) code. Validations on the PSC models of the Qinshan No.1 nuclear power plant (NPP) and the CAP1400 and CAP1700 reactors are performed. Numerical results show that the theoretical model and the codes are both correct.
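    The CDF-based sampling step described above can be sketched as inverse-transform sampling. The zone layout and power values below are invented for illustration; no JMCT/JSNT internals are reproduced here.

```python
import numpy as np

# Hypothetical relative power in five axial zones (arbitrary units);
# a real PSC run would take these from the 3D power distribution.
power = np.array([1.0, 3.0, 5.0, 3.0, 1.0])
pdf = power / power.sum()
cdf = np.cumsum(pdf)
cdf[-1] = 1.0          # guard against floating-point rounding

def sample_zone(rng, n):
    """Inverse-transform sampling: first CDF bin whose value >= xi."""
    return np.searchsorted(cdf, rng.random(n))

rng = np.random.default_rng(42)
zones = sample_zone(rng, 100_000)
freq = np.bincount(zones, minlength=5) / len(zones)   # ~ pdf
```

    With enough samples, the empirical zone frequencies converge to the power-weighted distribution, which is exactly what the source particle sample code needs.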

  13. Distributed source coding of video

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Van Luong, Huynh

    2015-01-01

    A foundation for distributed source coding was established in the classic papers of Slepian-Wolf (SW) [1] and Wyner-Ziv (WZ) [2]. This has provided a starting point for work on Distributed Video Coding (DVC), which exploits the source statistics at the decoder side, shifting processing steps conventionally performed at the video encoder side to the decoder side. Emerging applications such as wireless visual sensor networks and wireless video surveillance all require lightweight video encoding with high coding efficiency and error-resilience. The video data of DVC schemes differ from the assumptions of SW and WZ distributed coding, e.g. by being correlated in time and nonstationary. Improving the efficiency of DVC coding is challenging. This paper presents some selected techniques to address the DVC challenges. Focus is put on pin-pointing how the decoder steps are modified to provide...

  14. NOBAI: a web server for character coding of geometrical and statistical features in RNA structure

    Science.gov (United States)

    Knudsen, Vegeir; Caetano-Anollés, Gustavo

    2008-01-01

    The Numeration of Objects in Biology: Alignment Inferences (NOBAI) web server provides a web interface to the applications in the NOBAI software package. This software codes topological and thermodynamic information related to the secondary structure of RNA molecules as multi-state phylogenetic characters, builds character matrices directly in NEXUS format and provides sequence randomization options. The web server is an effective tool that facilitates the search for evolutionary history embedded in the structure of functional RNA molecules. The NOBAI web server is accessible at ‘http://www.manet.uiuc.edu/nobai/nobai.php’. This web site is free and open to all users and there is no login requirement. PMID:18448469
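    A minimal sketch of what building "character matrices directly in NEXUS format" involves; the taxa, character states, and helper function below are invented for illustration and are not NOBAI's API.

```python
def to_nexus(matrix, symbols="0123"):
    """Write a multi-state character matrix in NEXUS format.
    `matrix` maps taxon name -> string of character states."""
    ntax = len(matrix)
    nchar = len(next(iter(matrix.values())))
    width = max(len(name) for name in matrix)
    rows = "\n".join(f"{name:<{width}}  {states}"
                     for name, states in matrix.items())
    return (f"#NEXUS\nBEGIN DATA;\n"
            f"  DIMENSIONS NTAX={ntax} NCHAR={nchar};\n"
            f'  FORMAT DATATYPE=STANDARD SYMBOLS="{symbols}";\n'
            f"  MATRIX\n{rows}\n  ;\nEND;\n")

# Invented example: three RNA molecules, four structural characters
# (e.g. coded stem counts, loop sizes, or free-energy bins).
nexus = to_nexus({"tRNA_Phe": "0212", "rRNA_5S": "1202", "RNaseP": "2101"})
```

    A matrix in this form can be fed directly to phylogenetic software that accepts NEXUS input, which is the point of emitting it from structural measurements.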

  15. Code Forking, Governance, and Sustainability in Open Source Software

    OpenAIRE

    Juho Lindman; Linus Nyman

    2013-01-01

    The right to fork open source code is at the core of open source licensing. All open source licenses grant the right to fork their code, that is to start a new development effort using an existing code as its base. Thus, code forking represents the single greatest tool available for guaranteeing sustainability in open source software. In addition to bolstering program sustainability, code forking directly affects the governance of open source initiatives. Forking, and even the mere possibility...

  16. Code Forking, Governance, and Sustainability in Open Source Software

    Directory of Open Access Journals (Sweden)

    Juho Lindman

    2013-01-01

    Full Text Available The right to fork open source code is at the core of open source licensing. All open source licenses grant the right to fork their code, that is to start a new development effort using an existing code as its base. Thus, code forking represents the single greatest tool available for guaranteeing sustainability in open source software. In addition to bolstering program sustainability, code forking directly affects the governance of open source initiatives. Forking, and even the mere possibility of forking code, affects the governance and sustainability of open source initiatives on three distinct levels: software, community, and ecosystem. On the software level, the right to fork makes planned obsolescence, versioning, vendor lock-in, end-of-support issues, and similar initiatives all but impossible to implement. On the community level, forking impacts both sustainability and governance through the power it grants the community to safeguard against unfavourable actions by corporations or project leaders. On the business-ecosystem level, forking can serve as a catalyst for innovation while simultaneously promoting better quality software through natural selection. Thus, forking helps keep open source initiatives relevant and presents opportunities for the development and commercialization of current and abandoned programs.

  17. On the Combination of Multi-Layer Source Coding and Network Coding for Wireless Networks

    DEFF Research Database (Denmark)

    Krigslund, Jeppe; Fitzek, Frank; Pedersen, Morten Videbæk

    2013-01-01

    ...quality is developed. A linear coding structure designed to gracefully encapsulate layered source coding provides both low complexity of the utilised linear coding and robust erasure correction in the form of fountain coding capabilities. The proposed linear coding structure advocates efficient...

  18. Image authentication using distributed source coding.

    Science.gov (United States)

    Lin, Yao-Chung; Varodayan, David; Girod, Bernd

    2012-01-01

    We present a novel approach using distributed source coding for image authentication. The key idea is to provide a Slepian-Wolf encoded quantized image projection as authentication data. This version can be correctly decoded with the help of an authentic image as side information. Distributed source coding provides the desired robustness against legitimate variations while detecting illegitimate modification. The decoder incorporating expectation maximization algorithms can authenticate images which have undergone contrast, brightness, and affine warping adjustments. Our authentication system also offers tampering localization by using the sum-product algorithm.

  19. The Astrophysics Source Code Library by the numbers

    Science.gov (United States)

    Allen, Alice; Teuben, Peter; Berriman, G. Bruce; DuPrie, Kimberly; Mink, Jessica; Nemiroff, Robert; Ryan, PW; Schmidt, Judy; Shamir, Lior; Shortridge, Keith; Wallin, John; Warmels, Rein

    2018-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net) was founded in 1999 by Robert Nemiroff and John Wallin. ASCL editors seek both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and add entries for the found codes to the library. Software authors can submit their codes to the ASCL as well. This ensures a comprehensive listing covering a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL is indexed by both NASA’s Astrophysics Data System (ADS) and Web of Science, making software used in research more discoverable. This presentation covers the growth in the ASCL’s number of entries, the number of citations to its entries, and in which journals those citations appear. It also discusses what changes have been made to the ASCL recently, and what its plans are for the future.

  20. Data processing with microcode designed with source coding

    Science.gov (United States)

    McCoy, James A; Morrison, Steven E

    2013-05-07

    Programming for a data processor to execute a data processing application is provided using microcode source code. The microcode source code is assembled to produce microcode that includes digital microcode instructions with which to signal the data processor to execute the data processing application.

  1. Present state of the SOURCES computer code

    International Nuclear Information System (INIS)

    Shores, Erik F.

    2002-01-01

    In various stages of development for over two decades, the SOURCES computer code continues to calculate neutron production rates and spectra from four types of problems: homogeneous media, two-region interfaces, three-region interfaces and that of a monoenergetic alpha particle beam incident on a slab of target material. Graduate work at the University of Missouri - Rolla, in addition to user feedback from a tutorial course, provided the impetus for a variety of code improvements. Recently upgraded to version 4B, initial modifications to SOURCES focused on updates to the 'tape5' decay data library. Shortly thereafter, efforts focused on development of a graphical user interface for the code. This paper documents the Los Alamos SOURCES Tape1 Creator and Library Link (LASTCALL) and describes additional library modifications in more detail. Minor improvements and planned enhancements are discussed.

  2. Schroedinger’s Code: A Preliminary Study on Research Source Code Availability and Link Persistence in Astrophysics

    Science.gov (United States)

    Allen, Alice; Teuben, Peter J.; Ryan, P. Wesley

    2018-05-01

    We examined software usage in a sample set of astrophysics research articles published in 2015 and searched for the source codes for the software mentioned in these research papers. We categorized the software to indicate whether the source code is available for download and whether there are restrictions to accessing it, and if the source code is not available, whether some other form of the software, such as a binary, is. We also extracted hyperlinks from one journal’s 2015 research articles, as links in articles can serve as an acknowledgment of software use and lead to the data used in the research, and tested them to determine which of these URLs are still accessible. For our sample of 715 software instances in the 166 articles we examined, we were able to categorize 418 records according to whether source code was available and found that 285 unique codes were used, 58% of which offered the source code for download. Of the 2558 hyperlinks extracted from 1669 research articles, at best, 90% of them were available over our testing period.
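    The link-persistence methodology can be sketched as two steps: extract candidate URLs from article text, then classify each HTTP response. The regex and status handling below are simplifications of what a real study would need, and check_link requires network access.

```python
import re
from urllib.request import Request, urlopen
from urllib.error import URLError, HTTPError

URL_RE = re.compile(r"https?://[^\s)\"'<>\]]+")

def extract_links(text):
    """Pull candidate hyperlinks out of article text, trimming
    trailing sentence punctuation."""
    return [u.rstrip(".,;") for u in URL_RE.findall(text)]

def check_link(url, timeout=10):
    """Classify a URL as 'ok', 'http-error', or 'unreachable'."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout):
            return "ok"
    except HTTPError:
        return "http-error"
    except URLError:
        return "unreachable"

sample = "Code at https://ascl.net/1234.567 and data at http://example.org/x."
links = extract_links(sample)
```

    A production crawler would also retry transient failures and spread requests over a testing period, as the study did.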

  3. Iterative List Decoding of Concatenated Source-Channel Codes

    Directory of Open Access Journals (Sweden)

    Hedayat Ahmadreza

    2005-01-01

    Full Text Available Whenever variable-length entropy codes are used in the presence of a noisy channel, any channel errors will propagate and cause significant harm. Despite using channel codes, some residual errors always remain, whose effect will get magnified by error propagation. Mitigating this undesirable effect is of great practical interest. One approach is to use the residual redundancy of variable length codes for joint source-channel decoding. In this paper, we improve the performance of residual redundancy source-channel decoding via an iterative list decoder made possible by a nonbinary outer CRC code. We show that the list decoding of VLC's is beneficial for entropy codes that contain redundancy. Such codes are used in state-of-the-art video coders, for example. The proposed list decoder improves the overall performance significantly in AWGN and fully interleaved Rayleigh fading channels.
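    The role of the outer CRC in list decoding can be sketched as follows: the inner decoder emits a likelihood-ranked list of candidate sequences, and the first candidate whose CRC verifies is selected. This toy uses CRC-32 from zlib as a stand-in for the paper's nonbinary CRC.

```python
import zlib

def crc_select(candidates):
    """Return the first candidate whose appended CRC-32 checks out.
    Each candidate is a (payload_bytes, crc_value) pair."""
    for payload, crc in candidates:
        if zlib.crc32(payload) == crc:
            return payload
    return None   # list decoding failure: no candidate passes

msg = b"variable-length coded frame"
good = (msg, zlib.crc32(msg))
corrupt = (b"variable-length c0ded frame", zlib.crc32(msg))

# The list is ordered by likelihood; the corrupted but higher-ranked
# candidate fails the CRC and the correct one is selected instead.
chosen = crc_select([corrupt, good])
```

    The CRC thus converts residual redundancy checking into a simple accept/reject test over the list, which is what enables the iterative decoder to discard propagated errors.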

  4. Measuring Modularity in Open Source Code Bases

    Directory of Open Access Journals (Sweden)

    Roberto Milev

    2009-03-01

    Full Text Available Modularity of an open source software code base has been associated with growth of the software development community, the incentives for voluntary code contribution, and a reduction in the number of users who take code without contributing back to the community. As a theoretical construct, modularity links OSS to other domains of research, including organization theory, the economics of industry structure, and new product development. However, measuring the modularity of an OSS design has proven difficult, especially for large and complex systems. In this article, we describe some preliminary results of recent research at Carleton University that examines the evolving modularity of large-scale software systems. We describe a measurement method and a new modularity metric for comparing code bases of different size, introduce an open source toolkit that implements this method and metric, and provide an analysis of the evolution of the Apache Tomcat application server as an illustrative example of the insights gained from this approach. Although these results are preliminary, they open the door to further cross-discipline research that quantitatively links the concerns of business managers, entrepreneurs, policy-makers, and open source software developers.
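    One simple modularity measure, a toy illustration rather than the metric developed at Carleton, is the fraction of dependency edges that stay inside a module; the file names and module layout below are invented.

```python
def intra_module_ratio(edges, module_of):
    """edges: iterable of (src_file, dst_file) dependencies.
    module_of: maps each file to its module name.
    Returns the fraction of edges internal to a module."""
    edges = list(edges)
    internal = sum(1 for a, b in edges if module_of[a] == module_of[b])
    return internal / len(edges)

deps = [("core/a.java", "core/b.java"),
        ("core/b.java", "util/log.java"),
        ("util/log.java", "util/fmt.java"),
        ("web/ctrl.java", "core/a.java")]
module_of = {f: f.split("/")[0] for a, b in deps for f in (a, b)}
score = intra_module_ratio(deps, module_of)   # 2 of 4 edges internal -> 0.5
```

    Tracking such a ratio release by release gives a first approximation of the evolving modularity that the article studies in Apache Tomcat.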

  5. Optimization of Coding of AR Sources for Transmission Across Channels with Loss

    DEFF Research Database (Denmark)

    Arildsen, Thomas

    Source coding concerns the representation of information in a source signal using as few bits as possible. In the case of lossy source coding, it is the encoding of a source signal using the fewest possible bits at a given distortion or at the lowest possible distortion given a specified bit rate...... Channel coding is usually applied in combination with source coding to ensure reliable transmission of the (source coded) information at the maximal rate across a channel given the properties of this channel. In this thesis, we consider the coding of auto-regressive (AR) sources, which are sources that can...... compared to the case where the encoder is unaware of channel loss. We finally provide an extensive overview of cross-layer communication issues which are important to consider due to the fact that the proposed algorithm interacts with the source coding and exploits channel-related information typically...

  6. GRHydro: a new open-source general-relativistic magnetohydrodynamics code for the Einstein toolkit

    International Nuclear Information System (INIS)

    Mösta, Philipp; Haas, Roland; Ott, Christian D; Reisswig, Christian; Mundim, Bruno C; Faber, Joshua A; Noble, Scott C; Bode, Tanja; Löffler, Frank; Schnetter, Erik

    2014-01-01

    We present the new general-relativistic magnetohydrodynamics (GRMHD) capabilities of the Einstein toolkit, an open-source community-driven numerical relativity and computational relativistic astrophysics code. The GRMHD extension of the toolkit builds upon previous releases and implements the evolution of relativistic magnetized fluids in the ideal MHD limit in fully dynamical spacetimes using the same shock-capturing techniques previously applied to hydrodynamical evolution. In order to maintain the divergence-free character of the magnetic field, the code implements both constrained transport and hyperbolic divergence cleaning schemes. We present test results for a number of MHD tests in Minkowski and curved spacetimes. Minkowski tests include aligned and oblique planar shocks, cylindrical explosions, magnetic rotors, Alfvén waves and advected loops, as well as a set of tests designed to study the response of the divergence cleaning scheme to numerically generated monopoles. We study the code’s performance in curved spacetimes with spherical accretion onto a black hole on a fixed background spacetime and in fully dynamical spacetimes by evolutions of a magnetized polytropic neutron star and of the collapse of a magnetized stellar core. Our results agree well with exact solutions where these are available and we demonstrate convergence. All code and input files used to generate the results are available on http://einsteintoolkit.org. This makes our work fully reproducible and provides new users with an introduction to applications of the code. (paper)

  7. Repairing business process models as retrieved from source code

    NARCIS (Netherlands)

    Fernández-Ropero, M.; Reijers, H.A.; Pérez-Castillo, R.; Piattini, M.; Nurcan, S.; Proper, H.A.; Soffer, P.; Krogstie, J.; Schmidt, R.; Halpin, T.; Bider, I.

    2013-01-01

    The static analysis of source code has become a feasible solution to obtain underlying business process models from existing information systems. Due to the fact that not all information can be automatically derived from source code (e.g., consider manual activities), such business process models

  8. Blahut-Arimoto algorithm and code design for action-dependent source coding problems

    DEFF Research Database (Denmark)

    Trillingsgaard, Kasper Fløe; Simeone, Osvaldo; Popovski, Petar

    2013-01-01

    The source coding problem with action-dependent side information at the decoder has recently been introduced to model data acquisition in resource-constrained systems. In this paper, an efficient Blahut-Arimoto-type algorithm for the numerical computation of the rate-distortion-cost function for this problem is proposed. Moreover, a simplified two-stage code structure based on multiplexing is put forth, whereby the first stage encodes the actions and the second stage is composed of an array of classical Wyner-Ziv codes, one for each action. Leveraging this structure, specific coding/decoding strategies are designed based on LDGM codes and message passing. Through numerical examples, the proposed code design is shown to achieve performance close to the rate-distortion-cost function...
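    For orientation, the classic Blahut-Arimoto iteration, without the action-dependent extension introduced in the paper, can be written compactly; for a uniform binary source with Hamming distortion it reproduces the known result R(D) = 1 - H(D).

```python
import numpy as np

def blahut_arimoto(p_x, dist, beta, n_iter=200):
    """Classic Blahut-Arimoto iteration for the rate-distortion function.
    p_x: source distribution, dist[x, xhat]: distortion matrix,
    beta: Lagrange slope selecting one point on the R(D) curve.
    Returns (expected distortion D, rate R in bits/symbol)."""
    q = np.full(dist.shape[1], 1.0 / dist.shape[1])     # output marginal
    for _ in range(n_iter):
        w = q * np.exp(-beta * dist)                    # unnormalised test channel
        cond = w / w.sum(axis=1, keepdims=True)         # p(xhat | x)
        q = p_x @ cond                                  # re-estimate marginal
    D = float(np.sum(p_x[:, None] * cond * dist))
    R = float(np.sum(p_x[:, None] * cond * np.log2(cond / q)))
    return D, R

# Uniform binary source, Hamming distortion; beta = ln((1-D)/D) targets D.
hamming = np.array([[0.0, 1.0], [1.0, 0.0]])
D, R = blahut_arimoto(np.array([0.5, 0.5]), hamming, beta=np.log(9.0))
# Here D = 0.1 and R = 1 - H(0.1), about 0.531 bits/symbol.
```

    The action-dependent variant of the paper adds a cost term and an action variable to this alternating-minimization template.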

  9. Towards Holography via Quantum Source-Channel Codes

    Science.gov (United States)

    Pastawski, Fernando; Eisert, Jens; Wilming, Henrik

    2017-07-01

    While originally motivated by quantum computation, quantum error correction (QEC) is currently providing valuable insights into many-body quantum physics, such as topological phases of matter. Furthermore, mounting evidence originating from holography research (AdS/CFT) indicates that QEC should also be pertinent for conformal field theories. With this motivation in mind, we introduce quantum source-channel codes, which combine features of lossy compression and approximate quantum error correction, both of which are predicted in holography. Through a recent construction for approximate recovery maps, we derive guarantees on its erasure decoding performance from calculations of an entropic quantity called conditional mutual information. As an example, we consider Gibbs states of the transverse field Ising model at criticality and provide evidence that they exhibit nontrivial protection from local erasure. This gives rise to the first concrete interpretation of a bona fide conformal field theory as a quantum error correcting code. We argue that quantum source-channel codes are of independent interest beyond holography.

  10. OSSMETER D3.4 – Language-Specific Source Code Quality Analysis

    NARCIS (Netherlands)

    J.J. Vinju (Jurgen); A. Shahi (Ashim); H.J.S. Basten (Bas)

    2014-01-01

    This deliverable is part of WP3: Source Code Quality and Activity Analysis. It provides descriptions and prototypes of the tools that are needed for source code quality analysis in open source software projects. It builds upon the results of: • Deliverable 3.1 where infra-structure and

  11. An efficient chaotic source coding scheme with variable-length blocks

    International Nuclear Information System (INIS)

    Lin Qiu-Zhen; Wong Kwok-Wo; Chen Jian-Yong

    2011-01-01

    An efficient chaotic source coding scheme operating on variable-length blocks is proposed. With the source message represented by a trajectory in the state space of a chaotic system, data compression is achieved when the dynamical system is adapted to the probability distribution of the source symbols. For infinite-precision computation, the theoretical compression performance of this chaotic coding approach attains that of optimal entropy coding. In finite-precision implementation, it can be realized by encoding variable-length blocks using a piecewise linear chaotic map within the precision of register length. In the decoding process, the bit shift in the register can track the synchronization of the initial value and the corresponding block. Therefore, all the variable-length blocks are decoded correctly. Simulation results show that the proposed scheme performs well with high efficiency and minor compression loss when compared with traditional entropy coding. (general)
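    The encode/decode mechanism can be shown in miniature: a bit sequence selects nested subintervals of [0, 1) according to the symbol probability, and decoding iterates the corresponding skewed piecewise-linear (Bernoulli-shift) map. Double precision limits this toy to short blocks, unlike the register-based variable-length scheme of the paper.

```python
def encode(bits, p):
    """Map a bit sequence to a point in [0, 1): symbol 0 selects the
    left fraction p of the current interval, symbol 1 the remainder."""
    low, high = 0.0, 1.0
    for b in bits:
        split = low + p * (high - low)
        low, high = (low, split) if b == 0 else (split, high)
    return (low + high) / 2        # any point of the final interval works

def decode(x, p, n):
    """Recover n bits by iterating the skewed piecewise-linear map."""
    bits = []
    for _ in range(n):
        if x < p:
            bits.append(0)
            x = x / p              # stretch the left branch back to [0, 1)
        else:
            bits.append(1)
            x = (x - p) / (1 - p)  # stretch the right branch
    return bits

msg = [0, 1, 1, 0, 0, 0, 1, 0]
point = encode(msg, p=0.7)
assert decode(point, 0.7, len(msg)) == msg   # exact roundtrip for short blocks
```

    When p matches the source statistics, likely sequences occupy wide intervals and need few bits to specify a point inside them, which is why the scheme attains entropy-coding performance in the infinite-precision limit.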

  12. Meeting Characters in Caldecotts: What Does This Mean for Today's Readers?

    Science.gov (United States)

    Koss, Melanie D.; Martinez, Miriam; Johnson, Nancy J.

    2016-01-01

    We examined representations of main characters in Caldecott Award winner and honor books over the past 25 years. Each book containing a human main character was coded for the following features: culture/ethnicity, gender, age, place where character lives, time period in which the character lives, disability, religion, socioeconomic status, and…

  13. Coded aperture imaging of alpha source spatial distribution

    International Nuclear Information System (INIS)

    Talebitaher, Alireza; Shutler, Paul M.E.; Springham, Stuart V.; Rawat, Rajdeep S.; Lee, Paul

    2012-01-01

    The Coded Aperture Imaging (CAI) technique has been applied with CR-39 nuclear track detectors to image alpha particle source spatial distributions. The experimental setup comprised: a 226Ra source of alpha particles, a laser-machined CAI mask, and CR-39 detectors, arranged inside a vacuum enclosure. Three different alpha particle source shapes were synthesized by using a linear translator to move the 226Ra source within the vacuum enclosure. The coded mask pattern used is based on a Singer Cyclic Difference Set, with 400 pixels and 57 open square holes (representing ρ = 1/7 = 14.3% open fraction). After etching of the CR-39 detectors, the area, circularity, mean optical density and positions of all candidate tracks were measured by an automated scanning system. Appropriate criteria were used to select alpha particle tracks, and a decoding algorithm applied to the (x, y) data produced the decoded image of the source. Signal to Noise Ratio (SNR) values obtained for alpha particle CAI images were found to be substantially better than those for corresponding pinhole images, although the CAI-SNR values were below the predictions of theoretical formulae. Monte Carlo simulations of CAI and pinhole imaging were performed in order to validate the theoretical SNR formulae and also our CAI decoding algorithm. There was found to be good agreement between the theoretical formulae and SNR values obtained from simulations. Possible reasons for the lower SNR obtained for the experimental CAI study are discussed.
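    The correlation-decoding step can be simulated in 1-D: a point source casts a shifted copy of the mask, and correlating the recording with the zero-mean mask recovers the source position. The random mask below stands in for the Singer cyclic difference set, and periodic geometry is assumed for simplicity.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 101
mask = rng.integers(0, 2, size=N).astype(float)      # open/closed holes

# A point source at position 30 casts a shifted copy of the mask
# (idealised periodic geometry, no noise).
recorded = np.roll(mask, 30)

# Decode by circular correlation with the zero-mean mask.
kernel = mask - mask.mean()
decoded = np.array([np.dot(np.roll(recorded, -k), kernel) for k in range(N)])
estimate = int(np.argmax(decoded))                   # recovers position 30
```

    An extended source is a superposition of such shifted copies, so the same correlation yields its full spatial distribution; track-count noise is what the SNR formulae in the paper quantify.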

  14. An Efficient SF-ISF Approach for the Slepian-Wolf Source Coding Problem

    Directory of Open Access Journals (Sweden)

    Tu Zhenyu

    2005-01-01

Full Text Available A simple but powerful scheme exploiting the binning concept for asymmetric lossless distributed source coding is proposed. The novelty in the proposed scheme is the introduction of a syndrome former (SF) in the source encoder and an inverse syndrome former (ISF) in the source decoder to efficiently exploit an existing linear channel code without the need to modify the code structure or the decoding strategy. For most channel codes, the construction of SF-ISF pairs is a light task. For parallel and serial concatenated codes, and particularly parallel and serial turbo codes, where this appears less obvious, an efficient way of constructing linear-complexity SF-ISF pairs is demonstrated. It is shown that the proposed SF-ISF approach is simple, provably optimal, and generally applicable to any linear channel code. Simulation using conventional and asymmetric turbo codes demonstrates a compression rate that is only 0.06 bit/symbol from the theoretical limit, which is among the best results reported so far.
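The syndrome-former binning idea can be sketched with an ordinary linear block code. The toy below (a (7,4) Hamming code standing in for the turbo codes of the paper) uses the parity-check matrix as syndrome former: the encoder transmits only the 3-bit syndrome of a 7-bit source block, and the decoder combines it with correlated side information to recover the block.

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code; column j (1-indexed) is the
# binary representation of j, so a syndrome directly names an error position.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def syndrome_former(x):
    # SF: the encoder sends only these 3 bits for a 7-bit source block
    return H @ x % 2

def decode(s_x, y):
    # The decoder knows side information y; x and y differ in few positions.
    # H(x ^ y) = s_x ^ H y, so recover the (minimum-weight) difference pattern.
    s_e = (s_x + syndrome_former(y)) % 2
    e = np.zeros(7, dtype=int)
    if s_e.any():
        pos = int("".join(map(str, s_e)), 2)  # syndrome read as a binary number
        e[pos - 1] = 1
    return (y + e) % 2

x = np.array([1, 0, 1, 1, 0, 0, 1])        # source block
y = x.copy()
y[4] ^= 1                                  # correlated side information
x_hat = decode(syndrome_former(x), y)      # 3 bits sent instead of 7
```

Provided x and y differ in at most one position per block, the decoder recovers x exactly at a compression rate of 3/7 bit per source bit, which is the binning argument of the Slepian-Wolf theorem in miniature.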

  15. Code of conduct on the safety and security of radioactive sources

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-01-01

    The objectives of the Code of Conduct are, through the development, harmonization and implementation of national policies, laws and regulations, and through the fostering of international co-operation, to: (i) achieve and maintain a high level of safety and security of radioactive sources; (ii) prevent unauthorized access or damage to, and loss, theft or unauthorized transfer of, radioactive sources, so as to reduce the likelihood of accidental harmful exposure to such sources or the malicious use of such sources to cause harm to individuals, society or the environment; and (iii) mitigate or minimize the radiological consequences of any accident or malicious act involving a radioactive source. These objectives should be achieved through the establishment of an adequate system of regulatory control of radioactive sources, applicable from the stage of initial production to their final disposal, and a system for the restoration of such control if it has been lost. This Code relies on existing international standards relating to nuclear, radiation, radioactive waste and transport safety and to the control of radioactive sources. It is intended to complement existing international standards in these areas. The Code of Conduct serves as guidance in general issues, legislation and regulations, regulatory bodies as well as import and export of radioactive sources. A list of radioactive sources covered by the code is provided which includes activities corresponding to thresholds of categories.

  16. Code of conduct on the safety and security of radioactive sources

    International Nuclear Information System (INIS)

    2004-01-01

The objectives of the Code of Conduct are, through the development, harmonization and implementation of national policies, laws and regulations, and through the fostering of international co-operation, to: (i) achieve and maintain a high level of safety and security of radioactive sources; (ii) prevent unauthorized access or damage to, and loss, theft or unauthorized transfer of, radioactive sources, so as to reduce the likelihood of accidental harmful exposure to such sources or the malicious use of such sources to cause harm to individuals, society or the environment; and (iii) mitigate or minimize the radiological consequences of any accident or malicious act involving a radioactive source. These objectives should be achieved through the establishment of an adequate system of regulatory control of radioactive sources, applicable from the stage of initial production to their final disposal, and a system for the restoration of such control if it has been lost. This Code relies on existing international standards relating to nuclear, radiation, radioactive waste and transport safety and to the control of radioactive sources. It is intended to complement existing international standards in these areas. The Code of Conduct serves as guidance in general issues, legislation and regulations, regulatory bodies as well as import and export of radioactive sources. A list of radioactive sources covered by the code is provided which includes activities corresponding to thresholds of categories.

  17. Open-Source Development of the Petascale Reactive Flow and Transport Code PFLOTRAN

    Science.gov (United States)

    Hammond, G. E.; Andre, B.; Bisht, G.; Johnson, T.; Karra, S.; Lichtner, P. C.; Mills, R. T.

    2013-12-01

    Open-source software development has become increasingly popular in recent years. Open-source encourages collaborative and transparent software development and promotes unlimited free redistribution of source code to the public. Open-source development is good for science as it reveals implementation details that are critical to scientific reproducibility, but generally excluded from journal publications. In addition, research funds that would have been spent on licensing fees can be redirected to code development that benefits more scientists. In 2006, the developers of PFLOTRAN open-sourced their code under the U.S. Department of Energy SciDAC-II program. Since that time, the code has gained popularity among code developers and users from around the world seeking to employ PFLOTRAN to simulate thermal, hydraulic, mechanical and biogeochemical processes in the Earth's surface/subsurface environment. PFLOTRAN is a massively-parallel subsurface reactive multiphase flow and transport simulator designed from the ground up to run efficiently on computing platforms ranging from the laptop to leadership-class supercomputers, all from a single code base. The code employs domain decomposition for parallelism and is founded upon the well-established and open-source parallel PETSc and HDF5 frameworks. PFLOTRAN leverages modern Fortran (i.e. Fortran 2003-2008) in its extensible object-oriented design. The use of this progressive, yet domain-friendly programming language has greatly facilitated collaboration in the code's software development. Over the past year, PFLOTRAN's top-level data structures were refactored as Fortran classes (i.e. extendible derived types) to improve the flexibility of the code, ease the addition of new process models, and enable coupling to external simulators. For instance, PFLOTRAN has been coupled to the parallel electrical resistivity tomography code E4D to enable hydrogeophysical inversion while the same code base can be used as a third

  18. OSSMETER D3.2 – Report on Source Code Activity Metrics

    NARCIS (Netherlands)

    J.J. Vinju (Jurgen); A. Shahi (Ashim)

    2014-01-01

This deliverable is part of WP3: Source Code Quality and Activity Analysis. It provides descriptions and initial prototypes of the tools that are needed for source code activity analysis. It builds upon Deliverable 3.1, where the infrastructure and a domain analysis have been

  19. Java Source Code Analysis for API Migration to Embedded Systems

    Energy Technology Data Exchange (ETDEWEB)

    Winter, Victor [Univ. of Nebraska, Omaha, NE (United States); McCoy, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guerrero, Jonathan [Univ. of Nebraska, Omaha, NE (United States); Reinke, Carl Werner [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Perry, James Thomas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

Embedded systems form an integral part of our technological infrastructure and oftentimes play a complex and critical role within larger systems. From the perspective of reliability, security, and safety, strong arguments can be made favoring the use of Java over C in such systems. In part, this argument is based on the assumption that suitable subsets of Java’s APIs and extension libraries are available to embedded software developers. In practice, a number of Java-based embedded processors do not support the full features of the JVM. For such processors, source code migration is a mechanism by which key abstractions offered by APIs and extension libraries can be made available to embedded software developers. The analysis required for Java source code-level library migration is based on the ability to correctly resolve element references to their corresponding element declarations. A key challenge in this setting is how to perform analysis for incomplete source-code bases (e.g., subsets of libraries) from which types and packages have been omitted. This article formalizes an approach that can be used to extend code bases targeted for migration in such a manner that the threats associated with the analysis of incomplete code bases are eliminated.

  20. Using National Drug Codes and drug knowledge bases to organize prescription records from multiple sources.

    Science.gov (United States)

    Simonaitis, Linas; McDonald, Clement J

    2009-10-01

    The utility of National Drug Codes (NDCs) and drug knowledge bases (DKBs) in the organization of prescription records from multiple sources was studied. The master files of most pharmacy systems include NDCs and local codes to identify the products they dispense. We obtained a large sample of prescription records from seven different sources. These records carried a national product code or a local code that could be translated into a national product code via their formulary master. We obtained mapping tables from five DKBs. We measured the degree to which the DKB mapping tables covered the national product codes carried in or associated with the sample of prescription records. Considering the total prescription volume, DKBs covered 93.0-99.8% of the product codes from three outpatient sources and 77.4-97.0% of the product codes from four inpatient sources. Among the in-patient sources, invented codes explained 36-94% of the noncoverage. Outpatient pharmacy sources rarely invented codes, which comprised only 0.11-0.21% of their total prescription volume, compared with inpatient pharmacy sources for which invented codes comprised 1.7-7.4% of their prescription volume. The distribution of prescribed products was highly skewed, with 1.4-4.4% of codes accounting for 50% of the message volume and 10.7-34.5% accounting for 90% of the message volume. DKBs cover the product codes used by outpatient sources sufficiently well to permit automatic mapping. Changes in policies and standards could increase coverage of product codes used by inpatient sources.
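The coverage figures reported above are volume-weighted: a product code counts in proportion to its prescription volume, so a handful of heavily prescribed codes dominate the result. A minimal sketch of that computation (the codes and counts below are invented, purely for illustration):

```python
# Hypothetical DKB mapping table of known product codes
dkb_codes = {"00002-1433", "00006-0749", "00093-7155"}

# Prescription records as (product code, prescription count)
records = [("00002-1433", 500),
           ("00093-7155", 300),
           ("LOCAL-XYZ", 40)]   # a pharmacy-invented local code, not in any DKB

total_volume = sum(n for _, n in records)
covered_volume = sum(n for code, n in records if code in dkb_codes)
coverage = covered_volume / total_volume   # volume-weighted DKB coverage
```

Here the invented code accounts for all of the noncoverage, mirroring the study's finding that invented codes explained much of the gap for inpatient sources.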

  1. Joint source/channel coding of scalable video over noisy channels

    Energy Technology Data Exchange (ETDEWEB)

    Cheung, G.; Zakhor, A. [Department of Electrical Engineering and Computer Sciences University of California Berkeley, California94720 (United States)

    1997-01-01

We propose an optimal bit allocation strategy for a joint source/channel video codec over a noisy channel when the channel state is assumed to be known. Our approach is to partition source and channel coding bits in such a way that the expected distortion is minimized. The particular source coding algorithm we use is rate scalable and is based on 3D subband coding with multi-rate quantization. We show that using this strategy, transmission of video over very noisy channels still renders acceptable visual quality, and outperforms schemes that use equal error protection only. The flexibility of the algorithm also permits the bit allocation to be selected optimally when the channel state is in the form of a probability distribution instead of a deterministic state. © 1997 American Institute of Physics.

  2. Combined Source-Channel Coding of Images under Power and Bandwidth Constraints

    Directory of Open Access Journals (Sweden)

    Fossorier Marc

    2007-01-01

Full Text Available This paper proposes a framework for combined source-channel coding for a power and bandwidth constrained noisy channel. The framework is applied to progressive image transmission using constant envelope M-ary phase shift key (M-PSK) signaling over an additive white Gaussian noise channel. First, the framework is developed for uncoded M-PSK signaling (with M=2^k). Then, it is extended to include coded M-PSK modulation using trellis coded modulation (TCM). An adaptive TCM system is also presented. Simulation results show that, depending on the constellation size, coded M-PSK signaling performs 3.1 to 5.2 dB better than uncoded M-PSK signaling. Finally, the performance of our combined source-channel coding scheme is investigated from the channel capacity point of view. Our framework is further extended to include powerful channel codes like turbo and low-density parity-check (LDPC) codes. With these powerful codes, our proposed scheme performs about one dB away from the capacity-achieving SNR value of the QPSK channel.

  3. Source Coding for Wireless Distributed Microphones in Reverberant Environments

    DEFF Research Database (Denmark)

    Zahedi, Adel

    2016-01-01

Modern multimedia systems are more and more shifting toward distributed and networked structures. This includes audio systems, where networks of wireless distributed microphones are replacing the traditional microphone arrays. This allows for flexibility of placement and high spatial diversity. However, it comes with the price of several challenges, including the limited power and bandwidth resources for wireless transmission of audio recordings. In such a setup, we study the problem of source coding for the compression of the audio recordings before the transmission in order to reduce the power consumption and/or transmission bandwidth by reduction in the transmission rates. Source coding for wireless microphones in reverberant environments has several special characteristics which make it more challenging in comparison with regular audio coding. The signals which are acquired by the microphones…

  4. Asymmetric Joint Source-Channel Coding for Correlated Sources with Blind HMM Estimation at the Receiver

    Directory of Open Access Journals (Sweden)

    Ser Javier Del

    2005-01-01

Full Text Available We consider the case of two correlated sources whose correlation has memory and is modelled by a hidden Markov chain. The paper studies the problem of reliable communication of the information sent by one source over an additive white Gaussian noise (AWGN) channel when the output of the other source is available as side information at the receiver. We assume that the receiver has no a priori knowledge of the correlation statistics between the sources. In particular, we propose the use of a turbo code for joint source-channel coding of the transmitted source. The joint decoder uses an iterative scheme where the unknown parameters of the correlation model are estimated jointly within the decoding process. It is shown that reliable communication is possible at signal-to-noise ratios close to the theoretical limits set by the combination of the Shannon and Slepian-Wolf theorems.

  5. Combined Source-Channel Coding of Images under Power and Bandwidth Constraints

    Directory of Open Access Journals (Sweden)

    Marc Fossorier

    2007-01-01

Full Text Available This paper proposes a framework for combined source-channel coding for a power and bandwidth constrained noisy channel. The framework is applied to progressive image transmission using constant envelope M-ary phase shift key (M-PSK) signaling over an additive white Gaussian noise channel. First, the framework is developed for uncoded M-PSK signaling (with M=2^k). Then, it is extended to include coded M-PSK modulation using trellis coded modulation (TCM). An adaptive TCM system is also presented. Simulation results show that, depending on the constellation size, coded M-PSK signaling performs 3.1 to 5.2 dB better than uncoded M-PSK signaling. Finally, the performance of our combined source-channel coding scheme is investigated from the channel capacity point of view. Our framework is further extended to include powerful channel codes like turbo and low-density parity-check (LDPC) codes. With these powerful codes, our proposed scheme performs about one dB away from the capacity-achieving SNR value of the QPSK channel.
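The constant-envelope M-PSK constellations this framework builds on are simple to construct. The sketch below is illustrative only (not the paper's code): it places the M symbols on the unit circle and detects by nearest neighbor, which for M-PSK is equivalent to nearest-phase detection.

```python
import numpy as np

def mpsk_constellation(M):
    # constant-envelope M-PSK: M equally spaced points on the unit circle
    return np.exp(2j * np.pi * np.arange(M) / M)

def detect(r, const):
    # minimum-distance detection of a received complex sample
    return int(np.argmin(np.abs(const - r)))

const = mpsk_constellation(8)
sent = 3
received = const[sent] * np.exp(1j * 0.1)  # small phase perturbation, well
                                           # inside the pi/8 decision sector
```

Because all symbols have unit magnitude, the transmitter power is constant regardless of M; enlarging M buys bandwidth efficiency at the cost of smaller decision sectors, which is the trade-off the paper's power and bandwidth constraints navigate.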

  6. Comparison of DT neutron production codes MCUNED, ENEA-JSI source subroutine and DDT

    Energy Technology Data Exchange (ETDEWEB)

    Čufar, Aljaž, E-mail: aljaz.cufar@ijs.si [Reactor Physics Department, Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Lengar, Igor; Kodeli, Ivan [Reactor Physics Department, Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Milocco, Alberto [Culham Centre for Fusion Energy, Culham Science Centre, Abingdon, OX14 3DB (United Kingdom); Sauvan, Patrick [Departamento de Ingeniería Energética, E.T.S. Ingenieros Industriales, UNED, C/Juan del Rosal 12, 28040 Madrid (Spain); Conroy, Sean [VR Association, Uppsala University, Department of Physics and Astronomy, PO Box 516, SE-75120 Uppsala (Sweden); Snoj, Luka [Reactor Physics Department, Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia)

    2016-11-01

Highlights: • Results of three codes capable of simulating accelerator-based DT neutron generators were compared on a simple model where only a thin target made of a mixture of titanium and tritium is present. Two typical deuteron beam energies, 100 keV and 250 keV, were used in the comparison. • Comparisons of the angular dependence of the total neutron flux and spectrum as well as the neutron spectrum of all the neutrons emitted from the target show general agreement of the results but also some noticeable differences. • A comparison of figures of merit of the calculations using different codes showed that the computational time necessary to achieve the same statistical uncertainty can vary by more than a factor of 30 when different codes for the simulation of the DT neutron generator are used. - Abstract: As the DT fusion reaction produces neutrons with energies significantly higher than in fission reactors, special fusion-relevant benchmark experiments are often performed using DT neutron generators. However, commonly used Monte Carlo particle transport codes such as MCNP or TRIPOLI cannot be directly used to analyze these experiments since they do not have the capabilities to model the production of DT neutrons. Three of the available approaches to model the DT neutron generator source are the MCUNED code, the ENEA-JSI DT source subroutine and the DDT code. The MCUNED code is an extension of the well-established and validated MCNPX Monte Carlo code. The ENEA-JSI source subroutine was originally prepared for the modelling of the FNG experiments using different versions of the MCNP code (−4, −5, −X) and was later extended to allow the modelling of both DT and DD neutron sources. The DDT code prepares the DT source definition file (SDEF card in MCNP) which can then be used in different versions of the MCNP code. In the paper the methods for the simulation of the DT neutron production used in the codes are briefly described and compared for the case of a

  7. Document image retrieval through word shape coding.

    Science.gov (United States)

    Lu, Shijian; Li, Linlin; Tan, Chew Lim

    2008-11-01

    This paper presents a document retrieval technique that is capable of searching document images without OCR (optical character recognition). The proposed technique retrieves document images by a new word shape coding scheme, which captures the document content through annotating each word image by a word shape code. In particular, we annotate word images by using a set of topological shape features including character ascenders/descenders, character holes, and character water reservoirs. With the annotated word shape codes, document images can be retrieved by either query keywords or a query document image. Experimental results show that the proposed document image retrieval technique is fast, efficient, and tolerant to various types of document degradation.
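The idea of annotating word images by topological features rather than character identity can be sketched on plain text. The feature classes below (ascenders, descenders, holes) are crude stand-ins for the paper's image-derived ascender/descender, hole and water-reservoir features; the class memberships are assumptions for illustration only.

```python
# Simplified shape classes (a stand-in for features measured on word images):
# a = ascender, d = descender, x = x-height; "o" appended for an enclosed hole
ASCENDERS = set("bdfhklt")
DESCENDERS = set("gjpqy")
HOLES = set("abdegopq")

def shape_code(word):
    """Annotate each character by coarse shape features, not by its identity."""
    code = []
    for ch in word.lower():
        c = "a" if ch in ASCENDERS else "d" if ch in DESCENDERS else "x"
        if ch in HOLES:
            c += "o"
        code.append(c)
    return "".join(code)

# words map to short shape strings; matching codes flag retrieval candidates
code = shape_code("bat")   # b: ascender+hole, a: hole, t: ascender
```

Retrieval then reduces to comparing shape-code strings instead of running OCR, which is why the method tolerates degradation that would defeat character recognition.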

  8. Joint Source-Channel Coding by Means of an Oversampled Filter Bank Code

    Directory of Open Access Journals (Sweden)

    Marinkovic Slavica

    2006-01-01

Full Text Available Quantized frame expansions based on block transforms and oversampled filter banks (OFBs) have been considered recently as joint source-channel codes (JSCCs) for erasure and error-resilient signal transmission over noisy channels. In this paper, we consider a coding chain involving an OFB-based signal decomposition followed by scalar quantization and a variable-length code (VLC) or a fixed-length code (FLC). This paper first examines the problem of channel error localization and correction in quantized OFB signal expansions. The error localization problem is treated as an M-ary hypothesis testing problem. The likelihood values are derived from the joint pdf of the syndrome vectors under various hypotheses of impulse noise positions, and in a number of consecutive windows of the received samples. The error amplitudes are then estimated by solving the syndrome equations in the least-square sense. The message signal is reconstructed from the corrected received signal by a pseudoinverse receiver. We then improve the error localization procedure by introducing a per-symbol reliability information in the hypothesis testing procedure of the OFB syndrome decoder. The per-symbol reliability information is produced by the soft-input soft-output (SISO) VLC/FLC decoders. This leads to the design of an iterative algorithm for joint decoding of an FLC and an OFB code. The performance of the algorithms developed is evaluated in a wavelet-based image coding system.

  9. Distributed Remote Vector Gaussian Source Coding for Wireless Acoustic Sensor Networks

    DEFF Research Database (Denmark)

    Zahedi, Adel; Østergaard, Jan; Jensen, Søren Holdt

    2014-01-01

In this paper, we consider the problem of remote vector Gaussian source coding for a wireless acoustic sensor network. Each node receives messages from multiple nodes in the network and decodes these messages using its own measurement of the sound field as side information. The node’s measurement and the estimates of the source resulting from decoding the received messages are then jointly encoded and transmitted to a neighboring node in the network. We show that for this distributed source coding scenario, one can encode a so-called conditional sufficient statistic of the sources instead of jointly…

  10. QR Codes 101

    Science.gov (United States)

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  11. Test of Effective Solid Angle code for the efficiency calculation of volume source

    Energy Technology Data Exchange (ETDEWEB)

    Kang, M. Y.; Kim, J. H.; Choi, H. D. [Seoul National Univ., Seoul (Korea, Republic of); Sun, G. M. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

It is hard to determine a full energy (FE) absorption peak efficiency curve for an arbitrary volume source by experiment. That is why simulation and semi-empirical methods have been preferred so far, and many approaches have been developed. Moens et al. introduced the concept of effective solid angle by considering the attenuation effect of γ-rays in the source, media and detector. This concept is based on a semi-empirical method. An Effective Solid Angle code (ESA code) has been developed over several years by the Applied Nuclear Physics Group at Seoul National University. The ESA code converts an experimental FE efficiency curve, determined by using a standard point source, into one for a volume source. To test the performance of the ESA code, we measured point standard sources and voluminous certified reference material (CRM) sources of γ-rays, and compared the results with the efficiency curves obtained in this study. The 200–1500 keV energy region is fitted well. NIST X-ray mass attenuation coefficient data are currently used to check the effect of linear attenuation only. We will use the interaction cross-section data obtained from the XCOM code to check each contributing factor, such as the photoelectric effect, incoherent scattering and coherent scattering, in the future. In order to minimize the calculation time and simplify the code, optimization of the algorithm is needed.
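The attenuation correction at the heart of the effective solid angle concept can be illustrated with a one-dimensional toy: the average of exp(-μx) over a slab source of thickness t, which has the closed form (1 - e^(-μt))/(μt). This is a drastic simplification of the full solid-angle integral, shown only to make the self-attenuation idea concrete; the coefficient values are invented.

```python
import math

def self_attenuation_factor(mu, thickness, n=1000):
    """Midpoint-rule average of exp(-mu*x) over a slab source of the given
    thickness: the factor by which emission toward the detector is reduced."""
    dx = thickness / n
    return sum(math.exp(-mu * (i + 0.5) * dx) for i in range(n)) * dx / thickness

mu, t = 0.2, 5.0                                # toy values: mu in cm^-1, t in cm
numeric = self_attenuation_factor(mu, t)
analytic = (1 - math.exp(-mu * t)) / (mu * t)   # closed-form average
```

A point-source efficiency multiplied by such a factor (extended to the real geometry) is the kind of conversion the ESA code performs numerically over the full source volume.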

  12. Joint Source-Channel Decoding of Variable-Length Codes with Soft Information: A Survey

    Directory of Open Access Journals (Sweden)

    Pierre Siohan

    2005-05-01

    Full Text Available Multimedia transmission over time-varying wireless channels presents a number of challenges beyond existing capabilities conceived so far for third-generation networks. Efficient quality-of-service (QoS provisioning for multimedia on these channels may in particular require a loosening and a rethinking of the layer separation principle. In that context, joint source-channel decoding (JSCD strategies have gained attention as viable alternatives to separate decoding of source and channel codes. A statistical framework based on hidden Markov models (HMM capturing dependencies between the source and channel coding components sets the foundation for optimal design of techniques of joint decoding of source and channel codes. The problem has been largely addressed in the research community, by considering both fixed-length codes (FLC and variable-length source codes (VLC widely used in compression standards. Joint source-channel decoding of VLC raises specific difficulties due to the fact that the segmentation of the received bitstream into source symbols is random. This paper makes a survey of recent theoretical and practical advances in the area of JSCD with soft information of VLC-encoded sources. It first describes the main paths followed for designing efficient estimators for VLC-encoded sources, the key component of the JSCD iterative structure. It then presents the main issues involved in the application of the turbo principle to JSCD of VLC-encoded sources as well as the main approaches to source-controlled channel decoding. This survey terminates by performance illustrations with real image and video decoding systems.

  13. Joint Source-Channel Decoding of Variable-Length Codes with Soft Information: A Survey

    Science.gov (United States)

    Guillemot, Christine; Siohan, Pierre

    2005-12-01

    Multimedia transmission over time-varying wireless channels presents a number of challenges beyond existing capabilities conceived so far for third-generation networks. Efficient quality-of-service (QoS) provisioning for multimedia on these channels may in particular require a loosening and a rethinking of the layer separation principle. In that context, joint source-channel decoding (JSCD) strategies have gained attention as viable alternatives to separate decoding of source and channel codes. A statistical framework based on hidden Markov models (HMM) capturing dependencies between the source and channel coding components sets the foundation for optimal design of techniques of joint decoding of source and channel codes. The problem has been largely addressed in the research community, by considering both fixed-length codes (FLC) and variable-length source codes (VLC) widely used in compression standards. Joint source-channel decoding of VLC raises specific difficulties due to the fact that the segmentation of the received bitstream into source symbols is random. This paper makes a survey of recent theoretical and practical advances in the area of JSCD with soft information of VLC-encoded sources. It first describes the main paths followed for designing efficient estimators for VLC-encoded sources, the key component of the JSCD iterative structure. It then presents the main issues involved in the application of the turbo principle to JSCD of VLC-encoded sources as well as the main approaches to source-controlled channel decoding. This survey terminates by performance illustrations with real image and video decoding systems.

  14. Model-Based Least Squares Reconstruction of Coded Source Neutron Radiographs: Integrating the ORNL HFIR CG1D Source Model

    Energy Technology Data Exchange (ETDEWEB)

    Santos-Villalobos, Hector J [ORNL; Gregor, Jens [University of Tennessee, Knoxville (UTK); Bingham, Philip R [ORNL

    2014-01-01

At present, neutron sources cannot be fabricated small and powerful enough to achieve high resolution radiography while maintaining an adequate flux. One solution is to employ computational imaging techniques such as a Magnified Coded Source Imaging (CSI) system. A coded-mask is placed between the neutron source and the object. The system resolution is increased by reducing the size of the mask holes, and the flux is increased by increasing the size of the coded-mask and/or the number of holes. One limitation of such a system is that the resolution of current state-of-the-art scintillator-based detectors caps at around 50 μm. To overcome this challenge, the coded-mask and object are magnified by making the distance from the coded-mask to the object much smaller than the distance from the object to the detector. In previous work, we have shown via synthetic experiments that our least squares method outperforms other methods in image quality and reconstruction precision because of the modeling of the CSI system components. However, the validation experiments were limited to simplistic neutron sources. In this work, we aim to model the flux distribution of a real neutron source and incorporate such a model in our least squares computational system. We provide a full description of the methodology used to characterize the neutron source and validate the method with synthetic experiments.
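The model-based least squares step can be sketched generically: given a forward-model matrix A for the imaging system (source, mask and detector combined), the reconstruction solves min_x ||Ax - b||². The toy below uses a random stand-in for the modeled CSI system matrix and a noiseless synthetic measurement, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((12, 6))        # stand-in for the modeled CSI forward matrix
x_true = rng.random(6)         # unknown 1D "object"
b = A @ x_true                 # noiseless synthetic detector measurement

# model-based least squares reconstruction: argmin_x ||A x - b||^2
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
```

The quality of the reconstruction hinges on how faithfully A captures the real system, which is exactly why the paper invests in characterizing the CG1D source's flux distribution before folding it into A.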

  15. Transliterating non-ASCII characters with Python

    Directory of Open Access Journals (Sweden)

    Seth Bernstein

    2013-10-01

Full Text Available This lesson shows how to use Python to transliterate automatically a list of words from a language with a non-Latin alphabet to a standardized format using American Standard Code for Information Interchange (ASCII) characters. It builds on readers’ understanding of Python from the lessons “Viewing HTML Files,” “Working with Web Pages,” “From HTML to List of Words (part 1)” and “Intro to Beautiful Soup.” At the end of the lesson, we will use the transliteration dictionary to convert the names from a database of the Russian organization Memorial from Cyrillic into Latin characters. Although the example uses Cyrillic characters, the technique can be reproduced with other alphabets using Unicode.
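The lesson's core technique is a plain dictionary lookup, one Unicode character at a time. A minimal sketch (only a small subset of the Cyrillic alphabet, enough for the example; the full lesson builds a complete table):

```python
# A small slice of a Cyrillic-to-Latin transliteration table
translit_table = {
    'М': 'M', 'е': 'e', 'м': 'm', 'о': 'o',
    'р': 'r', 'и': 'i', 'а': 'a', 'л': 'l',
}

def transliterate(text, table):
    # characters absent from the table (spaces, digits, Latin) pass through
    return "".join(table.get(ch, ch) for ch in text)

name = transliterate("Мемориал", translit_table)
```

Because the lookup is per character, the same function works unchanged for any alphabet once the table is swapped out, which is the point the lesson closes on.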

  16. Process Model Improvement for Source Code Plagiarism Detection in Student Programming Assignments

    Science.gov (United States)

    Kermek, Dragutin; Novak, Matija

    2016-01-01

    In programming courses there are various ways in which students attempt to cheat. The most commonly used method is copying source code from other students and making minimal changes in it, like renaming variable names. Several tools like Sherlock, JPlag and Moss have been devised to detect source code plagiarism. However, for larger student…
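Tools such as Moss and JPlag compare normalized token streams rather than raw text, so that renaming variables or rewriting comments does not hide copying. A simplified sketch of that idea (token normalization plus k-gram Jaccard similarity; this is not the actual algorithm of any of those tools, and the tiny keyword list is an assumption):

```python
import re

KEYWORDS = {"def", "return", "if", "else", "for", "while", "in"}

def normalize(source):
    """Drop comments and rename every identifier to ID, so trivial edits
    (renamed variables, changed comments) leave the token stream intact."""
    source = re.sub(r"#.*", "", source)
    tokens = re.findall(r"[A-Za-z_]\w*|\S", source)
    return ["ID" if t.isidentifier() and t not in KEYWORDS else t for t in tokens]

def kgrams(tokens, k=4):
    return {tuple(tokens[i:i + k]) for i in range(len(tokens) - k + 1)}

def similarity(a, b, k=4):
    ga, gb = kgrams(normalize(a), k), kgrams(normalize(b), k)
    return len(ga & gb) / max(1, len(ga | gb))   # Jaccard index of k-grams

original = "def total(xs):\n    s = 0\n    for x in xs:\n        s = s + x\n    return s\n"
disguised = "def summe(vals):\n    acc = 0\n    for v in vals:\n        acc = acc + v\n    return acc\n"
score = similarity(original, disguised)
```

Here the "disguised" copy differs only in identifier names, so after normalization the two token streams coincide and the similarity score is maximal, which is exactly the cheating pattern the abstract describes.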

  17. Code of conduct on the safety and security of radioactive sources

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

The objective of this Code is to achieve and maintain a high level of safety and security of radioactive sources through the development, harmonization and enforcement of national policies, laws and regulations, and through the fostering of international co-operation. In particular, this Code addresses the establishment of an adequate system of regulatory control from the production of radioactive sources to their final disposal, and a system for the restoration of such control if it has been lost.

  18. Code of conduct on the safety and security of radioactive sources

    International Nuclear Information System (INIS)

    2001-03-01

    The objective of this Code is to achieve and maintain a high level of safety and security of radioactive sources through the development, harmonization and enforcement of national policies, laws and regulations, and through the fostering of international co-operation. In particular, this Code addresses the establishment of an adequate system of regulatory control from the production of radioactive sources to their final disposal, and a system for the restoration of such control if it has been lost.

  19. The Astrophysics Source Code Library: Supporting software publication and citation

    Science.gov (United States)

    Allen, Alice; Teuben, Peter

    2018-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net), established in 1999, is a free online registry for source codes used in research that has appeared in, or been submitted to, peer-reviewed publications. The ASCL is indexed by the SAO/NASA Astrophysics Data System (ADS) and Web of Science and is citable by using the unique ascl ID assigned to each code. In addition to registering codes, the ASCL can house archive files for download and assign them DOIs. The ASCL advocates for software citation on par with article citation, participates in multidisciplinary events such as Force11, OpenCon, and the annual Workshop on Sustainable Software for Science, works with journal publishers, and organizes Special Sessions and Birds of a Feather meetings at national and international conferences such as Astronomical Data Analysis Software and Systems (ADASS), European Week of Astronomy and Space Science, and AAS meetings. In this presentation, I will discuss some of the challenges of gathering credit for publishing software and ideas and efforts from other disciplines that may be useful to astronomy.

  20. Distributed Source Coding Techniques for Lossless Compression of Hyperspectral Images

    Directory of Open Access Journals (Sweden)

    Barni Mauro

    2007-01-01

    Full Text Available This paper deals with the application of distributed source coding (DSC theory to remote sensing image compression. Although DSC exhibits a significant potential in many application fields, up till now the results obtained on real signals fall short of the theoretical bounds, and often impose additional system-level constraints. The objective of this paper is to assess the potential of DSC for lossless image compression carried out onboard a remote platform. We first provide a brief overview of DSC of correlated information sources. We then focus on onboard lossless image compression, and apply DSC techniques in order to reduce the complexity of the onboard encoder, at the expense of the decoder's, by exploiting the correlation of different bands of a hyperspectral dataset. Specifically, we propose two different compression schemes, one based on powerful binary error-correcting codes employed as source codes, and one based on simpler multilevel coset codes. The performance of both schemes is evaluated on a few AVIRIS scenes, and is compared with other state-of-the-art 2D and 3D coders. Both schemes turn out to achieve competitive compression performance, and one of them also has reduced complexity. Based on these results, we highlight the main issues that are still to be solved to further improve the performance of DSC-based remote sensing systems.

  1. Remodularizing Java Programs for Improved Locality of Feature Implementations in Source Code

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2011-01-01

    Explicit traceability between features and source code is known to help programmers to understand and modify programs during maintenance tasks. However, the complex relations between features and their implementations are not evident from the source code of object-oriented Java programs. Consequently, the implementations of individual features are difficult to locate, comprehend, and modify in isolation. In this paper, we present a novel remodularization approach that improves the representation of features in the source code of Java programs. Both forward- and reverse restructurings are supported through on-demand bidirectional restructuring between feature-oriented and object-oriented decompositions. The approach includes a feature location phase based on tracing program execution, a feature representation phase that reallocates classes into a new package structure based on single…

  2. Distributed coding of multiview sparse sources with joint recovery

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Deligiannis, Nikos; Forchhammer, Søren

    2016-01-01

    In support of applications involving multiview sources in distributed object recognition using lightweight cameras, we propose a new method for the distributed coding of sparse sources as visual descriptor histograms extracted from multiview images. The problem is challenging due to the computati… transform (SIFT) descriptors extracted from multiview images shows that our method leads to bit-rate savings of up to 43% compared to the state-of-the-art distributed compressed sensing method with independent encoding of the sources.

  3. Moral character in the workplace.

    Science.gov (United States)

    Cohen, Taya R; Panter, A T; Turan, Nazli; Morse, Lily; Kim, Yeonjeong

    2014-11-01

    Using two 3-month diary studies and a large cross-sectional survey, we identified distinguishing features of adults with low versus high levels of moral character. Adults with high levels of moral character tend to: consider the needs and interests of others and how their actions affect other people (e.g., they have high levels of Honesty-Humility, empathic concern, guilt proneness); regulate their behavior effectively, specifically with reference to behaviors that have positive short-term consequences but negative long-term consequences (e.g., they have high levels of Conscientiousness, self-control, consideration of future consequences); and value being moral (e.g., they have high levels of moral identity-internalization). Cognitive moral development, Emotionality, and social value orientation were found to be relatively undiagnostic of moral character. Studies 1 and 2 revealed that employees with low moral character committed harmful work behaviors more frequently and helpful work behaviors less frequently than did employees with high moral character, according to their own admissions and coworkers' observations. Study 3 revealed that adults with low moral character committed more delinquent behavior and had more lenient attitudes toward unethical negotiation tactics than did adults with high moral character. By showing that individual differences have consistent, meaningful effects on employees' behaviors, after controlling for demographic variables (e.g., gender, age, income) and basic attributes of the work setting (e.g., enforcement of an ethics code), our results contest situationist perspectives that deemphasize the importance of personality. Moral people can be identified by self-reports in surveys, and these self-reports predict consequential behaviors months after the initial assessment.

  4. Revised IAEA Code of Conduct on the Safety and Security of Radioactive Sources

    International Nuclear Information System (INIS)

    Wheatley, J. S.

    2004-01-01

    The revised Code of Conduct on the Safety and Security of Radioactive Sources is aimed primarily at Governments, with the objective of achieving and maintaining a high level of safety and security of radioactive sources through the development, harmonization and enforcement of national policies, laws and regulations; and through the fostering of international co-operation. It focuses on sealed radioactive sources and provides guidance on legislation, regulations and the regulatory body, and import/export controls. Nuclear materials (except for sources containing 239Pu), as defined in the Convention on the Physical Protection of Nuclear Materials, are not covered by the revised Code, nor are radioactive sources within military or defence programmes. An earlier version of the Code was published by IAEA in 2001. At that time, agreement was not reached on a number of issues, notably those relating to the creation of comprehensive national registries for radioactive sources, obligations of States exporting radioactive sources, and the possibility of unilateral declarations of support. The need to further consider these and other issues was highlighted by the events of 11th September 2001. Since then, the IAEA's Secretariat has been working closely with Member States and relevant International Organizations to achieve consensus. The text of the revised Code was finalized at a meeting of technical and legal experts in August 2003, and it was submitted to IAEA's Board of Governors for approval in September 2003, with a recommendation that the IAEA General Conference adopt it and encourage its wide implementation. The IAEA General Conference, in September 2003, endorsed the revised Code and urged States to work towards following the guidance contained within it. This paper summarizes the history behind the revised Code, its content and the outcome of the discussions within the IAEA Board of Governors and General Conference. (Author) 8 refs

  5. Code of conduct on the safety and security of radioactive sources

    International Nuclear Information System (INIS)

    Anon.

    2001-01-01

    The objective of the code of conduct is to achieve and maintain a high level of safety and security of radioactive sources through the development, harmonization and enforcement of national policies, laws and regulations, and through the fostering of international co-operation. In particular, this code addresses the establishment of an adequate system of regulatory control from the production of radioactive sources to their final disposal, and a system for the restoration of such control if it has been lost. (N.C.)

  6. Source Authentication for Code Dissemination Supporting Dynamic Packet Size in Wireless Sensor Networks.

    Science.gov (United States)

    Kim, Daehee; Kim, Dongwan; An, Sunshin

    2016-07-09

    Code dissemination in wireless sensor networks (WSNs) is a procedure for distributing a new code image over the air in order to update programs. Due to the fact that WSNs are mostly deployed in unattended and hostile environments, secure code dissemination ensuring authenticity and integrity is essential. Recent works on dynamic packet size control in WSNs allow enhancing the energy efficiency of code dissemination by dynamically changing the packet size on the basis of link quality. However, the authentication tokens attached by the base station become useless in the next hop where the packet size can vary according to the link quality of the next hop. In this paper, we propose three source authentication schemes for code dissemination supporting dynamic packet size. Compared to traditional source authentication schemes such as μTESLA and digital signatures, our schemes provide secure source authentication under the environment, where the packet size changes in each hop, with smaller energy consumption.
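Since the abstract contrasts its schemes with μTESLA, the hash-chain idea underlying μTESLA-style source authentication is worth sketching. The toy below is greatly simplified and invented for illustration; it does not reproduce the paper's dynamic-packet-size schemes, only the general pattern of committing to the end of a one-way hash chain and later disclosing earlier keys.

```python
# Sketch of a one-way hash chain as used in muTESLA-style authentication.
# A sender commits to chain[0]; disclosing a later key proves it was
# derived from the same chain, since hashing it forward reproduces the
# commitment. This toy ignores time slots and packet-size handling.
import hashlib

def make_chain(seed, n):
    """Derive a hash chain; element 0 is the public commitment."""
    chain = [hashlib.sha256(seed).digest()]
    for _ in range(n):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return list(reversed(chain))  # chain[0] = most-hashed value

def verify(key, commitment, steps):
    """Check that hashing `key` `steps` times yields the commitment."""
    h = key
    for _ in range(steps):
        h = hashlib.sha256(h).digest()
    return h == commitment

chain = make_chain(b"secret-seed", 4)
print(verify(chain[3], chain[0], 3))  # True
```

The security rests on the one-wayness of the hash: a receiver holding only the commitment cannot forge a key that hashes forward to it.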

  7. Distributed Remote Vector Gaussian Source Coding with Covariance Distortion Constraints

    DEFF Research Database (Denmark)

    Zahedi, Adel; Østergaard, Jan; Jensen, Søren Holdt

    2014-01-01

    In this paper, we consider a distributed remote source coding problem, where a sequence of observations of source vectors is available at the encoder. The problem is to specify the optimal rate for encoding the observations subject to a covariance matrix distortion constraint and in the presence...

  8. IllinoisGRMHD: an open-source, user-friendly GRMHD code for dynamical spacetimes

    International Nuclear Information System (INIS)

    Etienne, Zachariah B; Paschalidis, Vasileios; Haas, Roland; Mösta, Philipp; Shapiro, Stuart L

    2015-01-01

    In the extreme violence of merger and mass accretion, compact objects like black holes and neutron stars are thought to launch some of the most luminous outbursts of electromagnetic and gravitational wave energy in the Universe. Modeling these systems realistically is a central problem in theoretical astrophysics, but has proven extremely challenging, requiring the development of numerical relativity codes that solve Einstein's equations for the spacetime, coupled to the equations of general relativistic (ideal) magnetohydrodynamics (GRMHD) for the magnetized fluids. Over the past decade, the Illinois numerical relativity (ILNR) group's dynamical spacetime GRMHD code has proven itself as a robust and reliable tool for theoretical modeling of such GRMHD phenomena. However, the code was written ‘by experts and for experts’ of the code, with a steep learning curve that would severely hinder community adoption if it were open-sourced. Here we present IllinoisGRMHD, which is an open-source, highly extensible rewrite of the original closed-source GRMHD code of the ILNR group. Reducing the learning curve was the primary focus of this rewrite, with the goal of facilitating community involvement in the code's use and development, as well as the minimization of human effort in generating new science. IllinoisGRMHD also saves computer time, generating roundoff-precision identical output to the original code on adaptive-mesh grids, but nearly twice as fast at scales of hundreds to thousands of cores. (paper)

  9. THE CHARACTER ANALYSIS OF GLEN HANSARD IN ONCE FILM

    Directory of Open Access Journals (Sweden)

    Nani Rosnani Thamrin

    2013-12-01

    Full Text Available This paper analyzed the characterization of the talented actor Glen Hansard in the film Once. The study employed a descriptive qualitative research design based on theories of Rahardjo (1985), Robert (1965), and Card (1988). Primary data sources were all scenes of Once, directed by John Carney as a low-budget production with two stars, Glen Hansard and Irglova, while secondary data sources were collected from articles related to this study. The research investigated two aspects of the main actor's characterization: first, how Hansard's character is constructed, and second, how his character affects and is affected by those of the other actors. The study showed that Hansard's character was constructed by five factors: what the character did, what he said, what the other characters said about him, his appearance, and his milieu, each influencing the others. The study also found that he had struggling, visionary, ambitious, introverted, sensitive, straightforward and curious traits, of which struggle, vision and ambition were the strongest, because the scenes reflected them most.

  10. Domain-Specific Acceleration and Auto-Parallelization of Legacy Scientific Code in FORTRAN 77 using Source-to-Source Compilation

    OpenAIRE

    Vanderbauwhede, Wim; Davidson, Gavin

    2017-01-01

    Massively parallel accelerators such as GPGPUs, manycores and FPGAs represent a powerful and affordable tool for scientists who look to speed up simulations of complex systems. However, porting code to such devices requires a detailed understanding of heterogeneous programming tools and effective strategies for parallelization. In this paper we present a source to source compilation approach with whole-program analysis to automatically transform single-threaded FORTRAN 77 legacy code into Ope...

  11. Automating RPM Creation from a Source Code Repository

    Science.gov (United States)

    2012-02-01

    apps/usr --with-libpq=/apps/postgres make rm -rf $RPM_BUILD_ROOT umask 0077 mkdir -p $RPM_BUILD_ROOT/usr/local/bin mkdir -p $RPM_BUILD_ROOT...from a source code repository. %pre %prep %setup %build ./autogen.sh ; ./configure --with-db=/apps/db --with-libpq=/apps/postgres make

  12. GapCoder automates the use of indel characters in phylogenetic analysis.

    Science.gov (United States)

    Young, Nelson D; Healy, John

    2003-02-19

    Several ways of incorporating indels into phylogenetic analysis have been suggested. Simple indel coding has two strengths: (1) biological realism and (2) efficiency of analysis. In the method, each indel with different start and/or end positions is considered to be a separate character. The presence/absence of these indel characters is then added to the data set. We have written a program, GapCoder, to automate this procedure. The program can input PIR format aligned datasets, find the indels and add the indel-based characters. The output is a NEXUS format file, which includes a table showing what region each indel character is based on. If regions are excluded from analysis, this table makes it easy to identify the corresponding indel characters for exclusion. Manual implementation of the simple indel coding method can be very time-consuming, especially in data sets where indels are numerous and/or overlapping. GapCoder automates this method and is therefore particularly useful during procedures where phylogenetic analyses need to be repeated many times, such as when different alignments are being explored or when various taxon or character sets are being explored. GapCoder is currently available for Windows from http://www.home.duq.edu/~youngnd/GapCoder.

  13. Features fusion based approach for handwritten Gujarati character recognition

    Directory of Open Access Journals (Sweden)

    Ankit Sharma

    2017-02-01

    Full Text Available Handwritten character recognition is a challenging area of research. Many research activities in the area of character recognition have already been carried out for Indian languages such as Hindi, Bangla, Kannada, Tamil and Telugu. A literature review on handwritten character recognition indicates that, in comparison with other Indian scripts, research activity on Gujarati handwritten character recognition is very limited. This paper aims to bring Gujarati character recognition to attention. Recognition of isolated Gujarati handwritten characters is proposed using three different kinds of features and their fusion. Chain-code-based, zone-based and projection-profile-based features are utilized as individual features. One significant contribution of the proposed work is the generation of a large and representative dataset of 88,000 handwritten Gujarati characters. Experiments are carried out on this dataset. Artificial Neural Network (ANN), Support Vector Machine (SVM) and Naive Bayes (NB) classifier based methods are implemented for handwritten Gujarati character recognition. Experimental results show substantial enhancement over the state of the art and authenticate our proposals.
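Two of the feature families named in the abstract are easy to illustrate on a toy binary glyph. The sketch below shows row/column projection profiles and zone densities on an invented 4x4 image; it is not the paper's feature pipeline, which also includes chain-code features and larger images.

```python
# Sketch of projection-profile and zone-density features on a toy
# binary character image (1 = ink, 0 = background).
def projection_profiles(img):
    """Horizontal (per-row) and vertical (per-column) ink counts."""
    horiz = [sum(row) for row in img]
    vert = [sum(col) for col in zip(*img)]
    return horiz, vert

def zone_densities(img, zones=2):
    """Foreground density of each cell in a zones x zones grid."""
    h, w = len(img), len(img[0])
    zh, zw = h // zones, w // zones
    dens = []
    for zi in range(zones):
        for zj in range(zones):
            cells = [img[r][c]
                     for r in range(zi * zh, (zi + 1) * zh)
                     for c in range(zj * zw, (zj + 1) * zw)]
            dens.append(sum(cells) / len(cells))
    return dens

glyph = [
    [0, 1, 1, 0],
    [1, 0, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 0],
]
print(projection_profiles(glyph))  # ([2, 2, 2, 2], [2, 2, 2, 2])
print(zone_densities(glyph))       # [0.5, 0.5, 0.5, 0.5]
```

Feature fusion, as in the paper, then amounts to concatenating these vectors before handing them to a classifier.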

  14. Development of in-vessel source term analysis code, tracer

    International Nuclear Information System (INIS)

    Miyagi, K.; Miyahara, S.

    1996-01-01

    Analyses of radionuclide transport in fuel failure accidents (generally referred to source terms) are considered to be important especially in the severe accident evaluation. The TRACER code has been developed to realistically predict the time dependent behavior of FPs and aerosols within the primary cooling system for wide range of fuel failure events. This paper presents the model description, results of validation study, the recent model advancement status of the code, and results of check out calculations under reactor conditions. (author)

  15. Source Coding in Networks with Covariance Distortion Constraints

    DEFF Research Database (Denmark)

    Zahedi, Adel; Østergaard, Jan; Jensen, Søren Holdt

    2016-01-01

    … results to a joint source coding and denoising problem. We consider a network with a centralized topology and a given weighted sum-rate constraint, where the received signals at the center are to be fused to maximize the output SNR while enforcing no linear distortion. We show that one can design…

  16. Use of source term code package in the ELEBRA MX-850 system

    International Nuclear Information System (INIS)

    Guimaraes, A.C.F.; Goes, A.G.A.

    1988-12-01

    The implementation of the source term code package in the ELEBRA MX-850 system is presented. The source term is formed when radioactive materials generated in the nuclear fuel leak toward the containment and the environment external to the reactor containment. The version implemented in the ELEBRA system is composed of five codes: MARCH 3, TRAPMELT 3, THCCA, VANESA and NAVA. The original example case was used. The example consists of a small LOCA accident in a PWR-type reactor. A sensitivity study for the TRAPMELT 3 code was carried out, modifying the 'TIME STEP' to estimate the CPU processing time for executing the original example case. (M.C.K.) [pt

  17. Evaluating Open-Source Full-Text Search Engines for Matching ICD-10 Codes.

    Science.gov (United States)

    Jurcău, Daniel-Alexandru; Stoicu-Tivadar, Vasile

    2016-01-01

    This research presents the results of evaluating multiple free, open-source engines on matching ICD-10 diagnostic codes via full-text searches. The study investigates what it takes to get an accurate match when searching for a specific diagnostic code. For each code, the evaluation starts by extracting the words that make up its text and continues by building full-text search queries from combinations of these words. The queries are then run against all the ICD-10 codes until the code in question is returned as the match with the highest relative score. This method identifies the minimum number of words that must be provided for the search engines to choose the desired entry. The engines analyzed include a popular Java-based full-text search engine, a lightweight engine written in JavaScript which can even execute in the user's browser, and two popular open-source relational database management systems.
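The evaluation loop the abstract describes can be sketched with a naive word-overlap scorer standing in for a real full-text engine. The codes and descriptions below are a tiny invented sample, not the real ICD-10 table, and real engines rank with far richer scoring than this.

```python
# Illustrative sketch: find the smallest word combination from a code's
# own text that makes that code the unique top-scoring match.
from itertools import combinations

CODES = {
    "J10": "influenza due to identified influenza virus",
    "J11": "influenza due to unidentified influenza virus",
    "J12": "viral pneumonia not elsewhere classified",
}

def score(query_words, text):
    """Relative score: fraction of query words found in the code's text."""
    words = text.split()
    return sum(w in words for w in query_words) / len(query_words)

def min_words_to_match(code):
    """Smallest word combination that ranks `code` strictly first."""
    words = CODES[code].split()
    for k in range(1, len(words) + 1):
        for combo in combinations(words, k):
            own = score(combo, CODES[code])
            rivals = max(score(combo, CODES[c]) for c in CODES if c != code)
            if own > rivals:
                return combo
    return None

print(min_words_to_match("J12"))  # ('viral',)
```

For closely worded sibling codes (like the two influenza entries) a single shared word cannot break the tie, which is exactly the ambiguity the study measures.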

  18. Detecting Source Code Plagiarism on .NET Programming Languages using Low-level Representation and Adaptive Local Alignment

    Directory of Open Access Journals (Sweden)

    Oscar Karnalim

    2017-01-01

    Full Text Available Although there are various source code plagiarism detection approaches, only a few focus on low-level representation for deducing similarity; most consider only the lexical token sequence extracted from source code. In our view, low-level representation is more beneficial than lexical tokens since its form is more compact than the source code itself: it considers only semantic-preserving instructions and ignores many source code delimiter tokens. This paper proposes a source code plagiarism detection approach that relies on low-level representation. As a case study, we focus our work on .NET programming languages with Common Intermediate Language as the low-level representation. In addition, we incorporate Adaptive Local Alignment for detecting similarity. According to Lim et al., this algorithm outperforms the state-of-the-art code similarity algorithm (i.e., Greedy String Tiling) in terms of effectiveness. According to our evaluation, which involves various plagiarism attacks, our approach is more effective and efficient than the standard lexical-token approach.
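Scoring similarity by locally aligning two instruction streams can be sketched with a plain Smith-Waterman scheme. This is a simplification: the paper's Adaptive Local Alignment adapts its weights, which this fixed-score version does not, and the CIL-like token streams below are invented for illustration.

```python
# Sketch of local alignment (Smith-Waterman style) over token sequences,
# as a stand-in for adaptive local alignment on CIL instruction streams.
def local_alignment_score(a, b, match=2, mismatch=-1, gap=-1):
    """Best local alignment score between two token sequences."""
    cols = len(b) + 1
    prev = [0] * cols
    best = 0
    for i in range(1, len(a) + 1):
        curr = [0] * cols
        for j in range(1, cols):
            diag = prev[j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            curr[j] = max(0, diag, prev[j] + gap, curr[j - 1] + gap)
            best = max(best, curr[j])
        prev = curr
    return best

# Toy instruction streams differing only in local-variable slots, mimicking
# a "rename variables" plagiarism attack surviving at the low level.
p1 = ["ldarg.0", "ldarg.1", "add", "stloc.0", "ldloc.0", "ret"]
p2 = ["ldarg.0", "ldarg.1", "add", "stloc.1", "ldloc.1", "ret"]
print(local_alignment_score(p1, p2))  # 6
```

A high score relative to the sequences' lengths flags a suspicious pair; because alignment is local, a copied routine buried inside otherwise unrelated code still scores well.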

  19. Source Authentication for Code Dissemination Supporting Dynamic Packet Size in Wireless Sensor Networks †

    Science.gov (United States)

    Kim, Daehee; Kim, Dongwan; An, Sunshin

    2016-01-01

    Code dissemination in wireless sensor networks (WSNs) is a procedure for distributing a new code image over the air in order to update programs. Due to the fact that WSNs are mostly deployed in unattended and hostile environments, secure code dissemination ensuring authenticity and integrity is essential. Recent works on dynamic packet size control in WSNs allow enhancing the energy efficiency of code dissemination by dynamically changing the packet size on the basis of link quality. However, the authentication tokens attached by the base station become useless in the next hop where the packet size can vary according to the link quality of the next hop. In this paper, we propose three source authentication schemes for code dissemination supporting dynamic packet size. Compared to traditional source authentication schemes such as μTESLA and digital signatures, our schemes provide secure source authentication under the environment, where the packet size changes in each hop, with smaller energy consumption. PMID:27409616

  20. The Representation of Female Characters by Women Directors in Surveillance Spaces in Turkish Cinema

    OpenAIRE

    Berceste Gülçin Özdemir

    2017-01-01

    The representation of women characters in cinema has been discussed for centuries. In cinema, where dominant narrative codes prevail and scopophilic views of women characters exist, passive stereotypes of women are observed in the representation of women characters. In films shot from a woman’s point of view in Turkish Cinema, and even in films outside the mainstream in which the stories of women characters are told, the fact that women characters are discussed on the basis of feminist f...

  1. Optimal power allocation and joint source-channel coding for wireless DS-CDMA visual sensor networks

    Science.gov (United States)

    Pandremmenou, Katerina; Kondi, Lisimachos P.; Parsopoulos, Konstantinos E.

    2011-01-01

    In this paper, we propose a scheme for the optimal allocation of power, source coding rate, and channel coding rate for each of the nodes of a wireless Direct Sequence Code Division Multiple Access (DS-CDMA) visual sensor network. The optimization is quality-driven, i.e. the received quality of the video that is transmitted by the nodes is optimized. The scheme takes into account the fact that the sensor nodes may be imaging scenes with varying levels of motion. Nodes that image low-motion scenes will require a lower source coding rate, so they will be able to allocate a greater portion of the total available bit rate to channel coding. Stronger channel coding will mean that such nodes will be able to transmit at lower power. This will both increase battery life and reduce interference to other nodes. Two optimization criteria are considered. One that minimizes the average video distortion of the nodes and one that minimizes the maximum distortion among the nodes. The transmission powers are allowed to take continuous values, whereas the source and channel coding rates can assume only discrete values. Thus, the resulting optimization problem lies in the field of mixed-integer optimization tasks and is solved using Particle Swarm Optimization. Our experimental results show the importance of considering the characteristics of the video sequences when determining the transmission power, source coding rate and channel coding rate for the nodes of the visual sensor network.

  2. Microdosimetry computation code of internal sources - MICRODOSE 1

    International Nuclear Information System (INIS)

    Li Weibo; Zheng Wenzhong; Ye Changqing

    1995-01-01

    This paper describes a microdosimetry computation code, MICRODOSE 1, on the basis of the following methods: (1) the method of calculating f_1(z) for charged particles in unit-density tissues; (2) the method of calculating f(z) for a point source; (3) the method of applying Fourier transform theory to the calculation of the compound Poisson process; (4) the method of using the fast Fourier transform technique to determine f(z). Some computed examples based on the code MICRODOSE 1 are given, including alpha particles emitted from 239Pu in alveolar lung tissue and from the radon progeny RaA and RaC in the human respiratory tract. (author). 13 refs., 6 figs

  3. Source Code Vulnerabilities in IoT Software Systems

    Directory of Open Access Journals (Sweden)

    Saleh Mohamed Alnaeli

    2017-08-01

    Full Text Available An empirical study that examines the usage of known vulnerable statements in software systems developed in C/C++ and used for IoT is presented. The study is conducted on 18 open source systems comprised of millions of lines of code and containing thousands of files. Static analysis methods are applied to each system to determine the number of unsafe commands (e.g., strcpy, strcmp, and strlen) that are well known among research communities to cause potential risks and security concerns, thereby decreasing a system’s robustness and quality. These unsafe statements are banned by many companies (e.g., Microsoft). The use of these commands should be avoided from the start when writing code and should be removed from legacy code over time, as recommended by new C/C++ language standards. Each system is analyzed and the distribution of the known unsafe commands is presented. Historical trends in the usage of the unsafe commands in 7 of the systems are presented to show how the studied systems evolved over time with respect to vulnerable code. The results show that the most prevalent unsafe command used in most systems is memcpy, followed by strlen. These results can be used to help train software developers on secure coding practices so that they can write higher quality software systems.
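The kind of static scan the study applies can be sketched as a simple pattern count over source text. This is a minimal illustration, not the study's tooling: real static analyzers parse the code rather than regex-match it, and the command list below is a small subset chosen for the example.

```python
# Minimal sketch of counting known-unsafe C call sites in source text.
import re

UNSAFE = ("strcpy", "strcmp", "strlen", "memcpy", "sprintf", "gets")
CALL_RE = re.compile(r"\b(%s)\s*\(" % "|".join(UNSAFE))

def count_unsafe(source):
    """Return a {command: count} map of unsafe call sites in C source."""
    counts = {}
    for match in CALL_RE.finditer(source):
        name = match.group(1)
        counts[name] = counts.get(name, 0) + 1
    return counts

c_code = '''
    char buf[16];
    strcpy(buf, user_input);        /* classic overflow risk */
    size_t n = strlen(user_input);
    memcpy(buf, user_input, n);
'''
print(count_unsafe(c_code))  # {'strcpy': 1, 'strlen': 1, 'memcpy': 1}
```

Running such a count per file and per release is enough to reproduce the study's two outputs in miniature: a distribution of unsafe commands across a system, and a trend over its history.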

  4. Source-term model for the SYVAC3-NSURE performance assessment code

    International Nuclear Information System (INIS)

    Rowat, J.H.; Rattan, D.S.; Dolinar, G.M.

    1996-11-01

    Radionuclide contaminants in wastes emplaced in disposal facilities will not remain in those facilities indefinitely. Engineered barriers will eventually degrade, allowing radioactivity to escape from the vault. The radionuclide release rate from a low-level radioactive waste (LLRW) disposal facility, the source term, is a key component in the performance assessment of the disposal system. This report describes the source-term model that has been implemented in Ver. 1.03 of the SYVAC3-NSURE (Systems Variability Analysis Code generation 3-Near Surface Repository) code. NSURE is a performance assessment code that evaluates the impact of near-surface disposal of LLRW through the groundwater pathway. The source-term model described here was developed for the Intrusion Resistant Underground Structure (IRUS) disposal facility, which is a vault that is to be located in the unsaturated overburden at AECL's Chalk River Laboratories. The processes included in the vault model are roof and waste package performance, and diffusion, advection and sorption of radionuclides in the vault backfill. The model presented here was developed for the IRUS vault; however, it is applicable to other near-surface disposal facilities. (author). 40 refs., 6 figs

  5. THE NEED FOR CHARACTER EDUCATION

    Directory of Open Access Journals (Sweden)

    Aynur Pala

    2011-07-01

    Full Text Available Character education is a national movement creating schools that foster ethical, responsible and caring young people by modelling and teaching good character through emphasis on universal values that we all share. It is the intentional, proactive effort by schools, districts and states to instil in their students important core ethical values such as caring, honesty, fairness, responsibility and respect for self and others. Good character is not formed automatically; it is developed over time through a sustained process of teaching, example, learning and practice. It is developed through character education. The intentional teaching of good character is particularly important in today’s society since our youth face many opportunities and dangers unknown to earlier generations. They are bombarded with many more negative influences through the media and other external sources prevalent in today’s culture. Since children spend about 900 hours a year in school, it is essential that schools resume a proactive role in assisting families and communities by developing caring, respectful environments where students learn core ethical values. When a comprehensive approach to character education is used, a positive moral culture is created in the school: a total school environment that supports the values taught in the classroom (Character Education Partnership, 2010). The aim of this study is to provide guidelines for the elements needed for effective and comprehensive character education, and to emphasize the need for character education to help students develop good character, which includes knowing, caring about and acting upon core ethical values such as respect, responsibility, honesty, fairness and compassion.

  6. Verification test calculations for the Source Term Code Package

    International Nuclear Information System (INIS)

    Denning, R.S.; Wooton, R.O.; Alexander, C.A.; Curtis, L.A.; Cybulskis, P.; Gieseke, J.A.; Jordan, H.; Lee, K.W.; Nicolosi, S.L.

    1986-07-01

    The purpose of this report is to demonstrate the reasonableness of the Source Term Code Package (STCP) results. Hand calculations have been performed spanning a wide variety of phenomena within the context of a single accident sequence, a loss of all ac power with late containment failure, in the Peach Bottom (BWR) plant, and compared with STCP results. The report identifies some of the limitations of the hand calculation effort. The processes involved in a core meltdown accident are complex and coupled. Hand calculations by their nature must deal with gross simplifications of these processes. Their greatest strength is as an indicator that a computer code contains an error, for example that it doesn't satisfy basic conservation laws, rather than in showing the analysis accurately represents reality. Hand calculations are an important element of verification but they do not satisfy the need for code validation. The code validation program for the STCP is a separate effort. In general the hand calculation results show that models used in the STCP codes (e.g., MARCH, TRAP-MELT, VANESA) obey basic conservation laws and produce reasonable results. The degree of agreement and significance of the comparisons differ among the models evaluated. 20 figs., 26 tabs

  7. Imaging x-ray sources at a finite distance in coded-mask instruments

    International Nuclear Information System (INIS)

    Donnarumma, Immacolata; Pacciani, Luigi; Lapshov, Igor; Evangelista, Yuri

    2008-01-01

    We present a method for the correction of beam divergence in finite distance sources imaging through coded-mask instruments. We discuss the defocusing artifacts induced by the finite distance showing two different approaches to remove such spurious effects. We applied our method to one-dimensional (1D) coded-mask systems, although it is also applicable in two-dimensional systems. We provide a detailed mathematical description of the adopted method and of the systematics introduced in the reconstructed image (e.g., the fraction of source flux collected in the reconstructed peak counts). The accuracy of this method was tested by simulating pointlike and extended sources at a finite distance with the instrumental setup of the SuperAGILE experiment, the 1D coded-mask x-ray imager onboard the AGILE (Astro-rivelatore Gamma a Immagini Leggero) mission. We obtained reconstructed images of good quality and high source location accuracy. Finally we show the results obtained by applying this method to real data collected during the calibration campaign of SuperAGILE. Our method was demonstrated to be a powerful tool to investigate the imaging response of the experiment, particularly the absorption due to the materials intercepting the line of sight of the instrument and the conversion between detector pixel and sky direction

  8. SOURCES-3A: A code for calculating (α, n), spontaneous fission, and delayed neutron sources and spectra

    International Nuclear Information System (INIS)

    Perry, R.T.; Wilson, W.B.; Charlton, W.S.

    1998-04-01

    In many systems, it is imperative to have accurate knowledge of all significant sources of neutrons due to the decay of radionuclides. These sources can include neutrons resulting from the spontaneous fission of actinides, the interaction of actinide decay α-particles in (α,n) reactions with low- or medium-Z nuclides, and/or delayed neutrons from the fission products of actinides. Numerous systems exist in which these neutron sources could be important. These include, but are not limited to, clean and spent nuclear fuel (UO2, ThO2, MOX, etc.), enrichment plant operations (UF6, PuF4, etc.), waste tank studies, waste products in borosilicate glass or glass-ceramic mixtures, and weapons-grade plutonium in storage containers. SOURCES-3A is a computer code that determines neutron production rates and spectra from (α,n) reactions, spontaneous fission, and delayed neutron emission due to the decay of radionuclides in homogeneous media (i.e., a mixture of α-emitting source material and low-Z target material) and in interface problems (i.e., a slab of α-emitting source material in contact with a slab of low-Z target material). The code is also capable of calculating the neutron production rates due to (α,n) reactions induced by a monoenergetic beam of α-particles incident on a slab of target material. Spontaneous fission spectra are calculated with evaluated half-life, spontaneous fission branching, and Watt spectrum parameters for 43 actinides. The (α,n) spectra are calculated using an assumed isotropic angular distribution in the center-of-mass system with a library of 89 nuclide decay α-particle spectra, 24 sets of measured and/or evaluated (α,n) cross sections and product nuclide level branching fractions, and functional α-particle stopping cross sections for Z < 106. The delayed neutron spectra are taken from an evaluated library of 105 precursors. The code outputs the magnitude and spectra of the resultant neutron source. It also provides an

  9. Identification of Sparse Audio Tampering Using Distributed Source Coding and Compressive Sensing Techniques

    Directory of Open Access Journals (Sweden)

    Valenzise G

    2009-01-01

    Full Text Available In the past few years, a large number of techniques have been proposed to identify whether a multimedia content has been illegally tampered with or not. Nevertheless, very few efforts have been devoted to identifying which kind of attack has been carried out, especially due to the large amount of data required for this task. We propose a novel hashing scheme which exploits the paradigms of compressive sensing and distributed source coding to generate a compact hash signature, and we apply it to the case of audio content protection. The audio content provider produces a small hash signature by computing a limited number of random projections of a perceptual, time-frequency representation of the original audio stream; the audio hash is given by the syndrome bits of an LDPC code applied to the projections. At the content user side, the hash is decoded using distributed source coding tools. If the tampering is sparsifiable or compressible in some orthonormal basis or redundant dictionary, it is possible to identify the time-frequency position of the attack, with a hash size as small as 200 bits/second; the bit saving obtained by introducing distributed source coding ranges from 20% to 70%.
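
    The random-projection step of the hashing pipeline described above can be caricatured in a few lines. This sketch replaces the paper's LDPC syndrome bits with plain sign quantization of the projections, so it illustrates only the compressive-sensing digest, not the distributed-source-coding decoder; the "spectrogram" is a random stand-in.

```python
import numpy as np

def audio_hash(tf_repr, n_proj=200, seed=0):
    """Compact binary digest: random projections of the flattened
    time-frequency representation, quantized to sign bits (the paper
    instead keeps LDPC syndrome bits of the projections)."""
    rng = np.random.default_rng(seed)
    x = tf_repr.ravel()
    proj = rng.standard_normal((n_proj, x.size)) @ x
    return (proj > 0).astype(np.uint8)

# A sparse, localized "attack" perturbs only a fraction of the hash bits.
rng = np.random.default_rng(1)
orig = rng.random((32, 64))        # stand-in time-frequency representation
tampered = orig.copy()
tampered[10:12, 20:25] += 5.0      # sparse tampering
mismatch = np.mean(audio_hash(orig) != audio_hash(tampered))
```

    The fraction of mismatching bits grows with the energy of the attack, which is what lets the decoder localize a sparse tampering from such a short hash.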

  10. Beyond the Business Model: Incentives for Organizations to Publish Software Source Code

    Science.gov (United States)

    Lindman, Juho; Juutilainen, Juha-Pekka; Rossi, Matti

    The software stack opened under Open Source Software (OSS) licenses is growing rapidly. Commercial actors have released considerable amounts of previously proprietary source code. These actions raise the question: why do companies choose a strategy based on giving away software assets? Research on the outbound OSS approach has tried to answer this question with the concept of the “OSS business model”. When studying the reasons for code release, we have observed that the business model concept is too generic to capture the many incentives organizations have. Consequently, in this paper we investigate empirically what the companies' incentives are by means of an exploratory case study of three organizations in different stages of their code release. Our results indicate that the companies aim to promote standardization, obtain development resources, gain cost savings, improve the quality of software, increase the trustworthiness of software, or steer OSS communities. We conclude that future research on outbound OSS could benefit from focusing on the heterogeneous incentives for code release rather than on revenue models.

  11. Authorship attribution of source code by using back propagation neural network based on particle swarm optimization.

    Science.gov (United States)

    Yang, Xinyu; Xu, Guoai; Li, Qi; Guo, Yanhui; Zhang, Miao

    2017-01-01

    Authorship attribution is the task of identifying the most likely author of a given sample among a set of known candidate authors. It can not only be applied to discover the original author of plain text, such as novels, blogs, emails, posts, etc., but can also be used to identify source code programmers. Authorship attribution of source code is required in diverse applications, ranging from malicious code tracking to settling authorship disputes and detecting software plagiarism. This paper proposes a new method to identify the programmer of Java source code samples with higher accuracy. To this end, it first introduces a back propagation (BP) neural network based on particle swarm optimization (PSO) into authorship attribution of source code. It begins by computing a set of defined feature metrics, including lexical and layout metrics as well as structure and syntax metrics, 19 dimensions in total. These metrics are then input to the neural network for supervised learning, the weights of which are trained by the hybrid PSO-BP algorithm. The effectiveness of the proposed method is evaluated on a collected dataset of 3,022 Java files belonging to 40 authors. Experiment results show that the proposed method achieves 91.060% accuracy, and a comparison with previous work on authorship attribution of Java source code shows that it outperforms the others overall, with acceptable overhead.
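
    To give a flavor of what "lexical and layout" metrics look like, the sketch below computes a few hand-picked style features from a source string. These particular features are hypothetical stand-ins, not the paper's 19-dimensional set, and the PSO-trained BP network that consumes them is not reproduced.

```python
import re

def style_features(src: str) -> dict:
    """A hypothetical subset of lexical/layout authorship metrics.
    Each value is a single number, so a file becomes a feature vector
    that any supervised classifier could consume."""
    lines = src.splitlines()
    nonblank = [ln for ln in lines if ln.strip()]
    idents = re.findall(r"[A-Za-z_]\w*", src)
    n = max(len(nonblank), 1)
    return {
        "avg_line_len": sum(map(len, nonblank)) / n,
        "blank_line_ratio": 1.0 - len(nonblank) / max(len(lines), 1),
        # Fraction of lines holding nothing but a brace: a layout habit.
        "brace_on_own_line": sum(ln.strip() in "{}" for ln in nonblank) / n,
        "avg_ident_len": sum(map(len, idents)) / max(len(idents), 1),
    }

java = "public class A {\n\n    int add(int a, int b) {\n        return a + b;\n    }\n}\n"
feats = style_features(java)
```

    Two programmers solving the same task tend to differ on exactly such habits (brace placement, blank-line density, identifier length), which is what makes a trained classifier able to separate them.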

  12. Coding for Electronic Mail

    Science.gov (United States)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams sent virtually instantaneously - between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages improved over messages transmitted by conventional coding. Coding scheme compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme automatically translated to word-processor form.

  13. Eu-NORSEWInD - Assessment of Viability of Open Source CFD Code for the Wind Industry

    DEFF Research Database (Denmark)

    Stickland, Matt; Scanlon, Tom; Fabre, Sylvie

    2009-01-01

    Part of the overall NORSEWInD project is the use of LiDAR remote sensing (RS) systems mounted on offshore platforms to measure wind velocity profiles at a number of locations offshore. The data acquired from the offshore RS measurements will be fed into a large and novel wind speed dataset suitab...... between the results of simulations created by the commercial code FLUENT and the open source code OpenFOAM. An assessment of the ease with which the open source code can be used is also included....

  14. Health physics source document for codes of practice

    International Nuclear Information System (INIS)

    Pearson, G.W.; Meggitt, G.C.

    1989-05-01

    Personnel preparing codes of practice often require basic Health Physics information or advice relating to radiological protection problems, and this document is written primarily to supply such information. Certain technical terms used in the text are explained in the extensive glossary. Due to the pace of change in the field of radiological protection it is difficult to produce an up-to-date document. This document was compiled during 1988, however, and therefore contains the principal changes brought about by the introduction of the Ionising Radiations Regulations (1985). The paper covers the nature of ionising radiation, its biological effects and the principles of control. It is hoped that the document will provide a useful source of information both for codes of practice and for wider areas, and will stimulate readers to study radiological protection issues in greater depth. (author)

  15. Low complexity source and channel coding for mm-wave hybrid fiber-wireless links

    DEFF Research Database (Denmark)

    Lebedev, Alexander; Vegas Olmos, Juan José; Pang, Xiaodan

    2014-01-01

    We report on the performance of channel and source coding applied for an experimentally realized hybrid fiber-wireless W-band link. Error control coding performance is presented for a wireless propagation distance of 3 m and 20 km fiber transmission. We report on peak signal-to-noise ratio perfor...

  16. Fine-Grained Energy Modeling for the Source Code of a Mobile Application

    DEFF Research Database (Denmark)

    Li, Xueliang; Gallagher, John Patrick

    2016-01-01

    The goal of an energy model for source code is to lay a foundation for the application of energy-aware programming techniques. State of the art solutions are based on source-line energy information. In this paper, we present an approach to constructing a fine-grained energy model which is able...

  17. A plug-in to Eclipse for VHDL source codes: functionalities

    Science.gov (United States)

    Niton, B.; Poźniak, K. T.; Romaniuk, R. S.

    The paper presents an original application, written by the authors, which supports the writing and editing of source code in the VHDL language. It is a step towards fully automatic, augmented code writing for photonic and electronic systems, including systems based on FPGAs and/or DSP processors. An implementation based on VEditor, a free-license program, is described; the work presented in this paper thus supplements and extends that free-license tool. The introduction briefly characterizes the tools available on the market for aiding the design of electronic systems in VHDL. Particular attention is paid to plug-ins for the Eclipse environment and the Emacs program. Detailed properties of the plug-in are presented, such as the programming extension concept and the results of the activities of the formatter, re-factorizer, code hider, and other new additions to the VEditor program.

  18. Computer access security code system

    Science.gov (United States)

    Collins, Earl R., Jr. (Inventor)

    1990-01-01

    A security code system for controlling access to computer and computer-controlled entry situations comprises a plurality of subsets of alpha-numeric characters disposed in random order in matrices of at least two dimensions forming theoretical rectangles, cubes, etc., such that when access is desired, at least one pair of previously unused character subsets not found in the same row or column of the matrix is chosen at random and transmitted by the computer. The proper response to gain access is transmittal of subsets which complete the rectangle, and/or a parallelepiped whose opposite corners were defined by first groups of code. Once used, subsets are not used again to absolutely defeat unauthorized access by eavesdropping, and the like.
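
    One plausible reading of the challenge-response exchange described in this abstract can be sketched as follows (an interpretation for illustration only; the patented protocol may differ in detail): the computer transmits two cells of a secret matrix of character subsets that share neither a row nor a column, and access is granted only for the subsets at the rectangle's two remaining corners.

```python
import secrets
import string

def make_matrix(rows=4, cols=4, k=3):
    # Secret shared matrix: every cell holds a k-character subset.
    pool = string.ascii_uppercase + string.digits
    return [["".join(secrets.choice(pool) for _ in range(k))
             for _ in range(cols)] for _ in range(rows)]

def challenge(used, rows=4, cols=4):
    # Pick two previously unused cells not sharing a row or a column.
    while True:
        r1, c1 = secrets.randbelow(rows), secrets.randbelow(cols)
        r2, c2 = secrets.randbelow(rows), secrets.randbelow(cols)
        if r1 != r2 and c1 != c2 and not ({(r1, c1), (r2, c2)} & used):
            # Burn all four corners so an eavesdropped exchange never recurs.
            used |= {(r1, c1), (r2, c2), (r1, c2), (r2, c1)}
            return (r1, c1), (r2, c2)

def expected_response(matrix, cell_a, cell_b):
    # Correct reply: the subsets at the rectangle's other two corners.
    (r1, c1), (r2, c2) = cell_a, cell_b
    return {matrix[r1][c2], matrix[r2][c1]}

used = set()
m = make_matrix()
a, b = challenge(used)
reply = {m[a[0]][b[1]], m[b[0]][a[1]]}   # user reads the corners off their copy
granted = reply == expected_response(m, a, b)
```

    Because every challenge retires all four corners of its rectangle, replaying an overheard exchange never works, which is the "absolutely defeat eavesdropping" property the abstract claims.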

  19. READING LITERATURE, TAKING PHILOSOPHICAL IDEAS, AND OBTAINING CHARACTERS

    Directory of Open Access Journals (Sweden)

    Siti Maisaroh

    2017-05-01

    Full Text Available This study aims to describe the philosophical ideas and characters contained in YB Mangunwijaya's 'Rara Mendut' trilogy. The method used is Michel Foucault's archaeology of knowledge. The research identifies the following philosophical ideas: (1) a wife's faithfulness, embodying the characters of strong determination and a true sense of faithfulness; (2) women seizing their fate, embodying a high spirit of struggle; (3) women as a symbol of glory, embodying the ability of self-actualization; (4) women and the defense of a country, embodying resourcefulness and responsiveness in taking on a role; (5) women and their benefits, embodying the character of being a source of love and life spirit; (6) women as good mothers, embodying the characters of being conciliatory, reassuring, joyful, sincere, and full of love; (7) anxiety about old age, embodying religiousness and strong self-awareness; (8) glory, embodying the glory of the battle with oneself; (9) the nature of the child, embodying belief in the skill and creativity of children and belief in God the Evolver; and (10) the essence of wisdom and the usefulness of life, embodying the characters of uniting scattered things, sincerely accepting and embracing things that are bad, broken or wasted, understanding and forgiving, sincerity and excitement of voice, and not complaining easily.

  20. WASTK: A Weighted Abstract Syntax Tree Kernel Method for Source Code Plagiarism Detection

    Directory of Open Access Journals (Sweden)

    Deqiang Fu

    2017-01-01

    Full Text Available In this paper, we introduce a source code plagiarism detection method, named WASTK (Weighted Abstract Syntax Tree Kernel), for computer science education. Different from other plagiarism detection methods, WASTK takes aspects other than the similarity between programs into account. WASTK first transforms the source code of a program into an abstract syntax tree and then obtains the similarity by calculating the tree kernel of the two abstract syntax trees. To avoid misjudgment caused by trivial code snippets or frameworks given by instructors, an idea similar to TF-IDF (Term Frequency-Inverse Document Frequency) from the field of information retrieval is applied: each node in an abstract syntax tree is assigned a weight by TF-IDF. WASTK is evaluated on different datasets and, as a result, performs much better than other popular methods like Sim and JPlag.
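
    To make the TF-IDF weighting idea concrete, here is a crude Python stand-in: instead of WASTK's weighted tree kernel over full subtree structure, it compares TF-IDF-weighted counts of AST node types, using Python's own ast module in place of a Java parser.

```python
import ast
import math
from collections import Counter

def node_counts(src):
    # Multiset of AST node types appearing in a program.
    return Counter(type(n).__name__ for n in ast.walk(ast.parse(src)))

def tfidf_ast_similarity(src_a, src_b, corpus):
    """Cosine similarity of TF-IDF-weighted AST node-type counts.
    A crude stand-in for WASTK's weighted tree kernel, which compares
    whole subtrees rather than bare node-type frequencies."""
    docs = [node_counts(s) for s in corpus]
    n = len(docs)

    def vec(src):
        tf = node_counts(src)
        total = sum(tf.values())
        # Node types present in every corpus program (boilerplate,
        # instructor-given frameworks) get weight ~0, as in TF-IDF.
        return {t: (c / total) * math.log((1 + n) / (1 + sum(t in d for d in docs)))
                for t, c in tf.items()}

    va, vb = vec(src_a), vec(src_b)
    dot = sum(w * vb.get(t, 0.0) for t, w in va.items())
    na = math.sqrt(sum(w * w for w in va.values()))
    nb = math.sqrt(sum(w * w for w in vb.values()))
    return dot / (na * nb) if na and nb else 0.0
```

    Renaming identifiers leaves the AST node types unchanged, so a renamed copy of a program still scores 1.0 against the original, which is exactly the kind of plagiarism that plain string matching misses.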

  1. Plagiarism Detection Algorithm for Source Code in Computer Science Education

    Science.gov (United States)

    Liu, Xin; Xu, Chan; Ouyang, Boyu

    2015-01-01

    Nowadays, computer programming is getting more necessary in the course of program design in college education. However, the trick of plagiarizing plus a little modification exists in some students' homework. It is not easy for teachers to judge whether source code has been plagiarized or not. Traditional detection algorithms cannot fit this…

  2. Rascal: A domain specific language for source code analysis and manipulation

    NARCIS (Netherlands)

    P. Klint (Paul); T. van der Storm (Tijs); J.J. Vinju (Jurgen); A. Walenstein; S. Schuppe

    2009-01-01

    Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This

  3. RASCAL: a domain specific language for source code analysis and manipulation

    NARCIS (Netherlands)

    Klint, P.; Storm, van der T.; Vinju, J.J.

    2009-01-01

    Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This impedance

  4. Use of Splines in Handwritten Character Recognition

    OpenAIRE

    Sunil Kumar; Gopinath S,; Satish Kumar; Rajesh Chhikara

    2010-01-01

    Handwritten Character Recognition is software used to identify handwritten characters and to receive and interpret intelligible handwritten input from sources such as manuscript documents. Recent years have seen the development of many systems able to simulate human brain actions. Among the many paradigms used, neural networks and artificial intelligence are the two most important. In this paper we propose a new algorithm for recognition of handwritten t...

  5. Women Out of View. An Analysis of Female Characters on 1987-88 TV Programs.

    Science.gov (United States)

    Steenland, Sally; Whittemore, Lauren

    This study of the images of women as portrayed on new television programs in 1987-88 not only compared them with the images of the last season, but examined the similarities and differences between these characters and real life women. Each continuing female character on every new show was coded for race, age, occupation, marital and socioeconomic…

  6. D-DSC: Decoding Delay-based Distributed Source Coding for Internet of Sensing Things.

    Science.gov (United States)

    Aktas, Metin; Kuscu, Murat; Dinc, Ergin; Akan, Ozgur B

    2018-01-01

    Spatial correlation between densely deployed sensor nodes in a wireless sensor network (WSN) can be exploited to reduce power consumption through a proper source coding mechanism such as distributed source coding (DSC). In this paper, we propose Decoding Delay-based Distributed Source Coding (D-DSC) to improve the energy efficiency of classical DSC by employing the decoding delay concept, which enables the use of the maximum correlated portion of sensor samples during event estimation. In D-DSC, the network is partitioned into clusters, where the clusterheads communicate their uncompressed samples carrying the side information, and the cluster members send their compressed samples. The sink performs joint decoding of the compressed and uncompressed samples and then reconstructs the event signal using the decoded sensor readings. Based on the observed degree of correlation among sensor samples, the sink dynamically updates and broadcasts the varying compression rates back to the sensor nodes. Simulation results for the performance evaluation reveal that D-DSC can achieve reliable and energy-efficient event communication and estimation for practical signal detection/estimation applications having a massive number of sensors, towards the realization of the Internet of Sensing Things (IoST).
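
    The DSC principle this abstract builds on can be shown with a toy integer "binning" scheme (a sketch only; D-DSC's decoding-delay mechanism and dynamic rate updates are not modeled): a cluster member transmits only a few low-order bits of its reading, and the sink recovers the full value using the clusterhead's correlated sample as side information.

```python
def dsc_encode(sample, nbits=3):
    # Cluster member sends only the nbits-bit coset index of its reading.
    return sample % (1 << nbits)

def dsc_decode(coset, side_info, nbits=3):
    # Sink: pick the member of the coset closest to the correlated
    # side-information sample supplied by the clusterhead.
    step = 1 << nbits
    base = side_info - (side_info % step) + coset
    return min((base - step, base, base + step),
               key=lambda v: abs(v - side_info))

# Decoding is exact whenever |sample - side_info| < 2**(nbits - 1),
# i.e. whenever the spatial correlation is strong enough for the rate.
recovered = dsc_decode(dsc_encode(100), side_info=102)
```

    The member thus spends 3 bits instead of a full-width sample, and the rate can be raised or lowered as the observed correlation between neighbors weakens or strengthens, which is the knob D-DSC's sink adjusts dynamically.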

  7. Documentation for grants equal to tax model: Volume 3, Source code

    International Nuclear Information System (INIS)

    Boryczka, M.K.

    1986-01-01

    The GETT model is capable of forecasting the amount of tax liability associated with all property owned and all activities undertaken by the US Department of Energy (DOE) in site characterization and repository development. The GETT program is a user-friendly, menu-driven model developed using dBASE III™, a relational data base management system. The data base for GETT consists primarily of eight separate dBASE III™ files corresponding to each of the eight taxes (real property, personal property, corporate income, franchise, sales, use, severance, and excise) levied by State and local jurisdictions on business property and activity. Additional smaller files help to control model inputs and reporting options. Volume 3 of the GETT model documentation is the source code. The code is arranged primarily by the eight tax types. Other code files include those for JURISDICTION, SIMULATION, VALIDATION, TAXES, CHANGES, REPORTS, GILOT, and GETT. The code has been verified through hand calculations

  8. RETRANS - A tool to verify the functional equivalence of automatically generated source code with its specification

    International Nuclear Information System (INIS)

    Miedl, H.

    1998-01-01

    Following the competent technical standards (e.g. IEC 880) it is necessary to verify each step in the development process of safety critical software. This holds also for the verification of automatically generated source code. To avoid human errors during this verification step and to limit the cost effort a tool should be used which is developed independently from the development of the code generator. For this purpose ISTec has developed the tool RETRANS which demonstrates the functional equivalence of automatically generated source code with its underlying specification. (author)

  9. A Source Term Calculation for the APR1400 NSSS Auxiliary System Components Using the Modified SHIELD Code

    International Nuclear Information System (INIS)

    Park, Hong Sik; Kim, Min; Park, Seong Chan; Seo, Jong Tae; Kim, Eun Kee

    2005-01-01

    The SHIELD code has been used to calculate the source terms of the NSSS Auxiliary System (comprising the CVCS, SIS, and SCS) components of the OPR1000. Because the code was developed based upon the SYSTEM80 design, and the APR1400 NSSS Auxiliary System design is considerably changed from that of SYSTEM80 or OPR1000, the SHIELD code cannot be used directly for APR1400 radiation design. Thus hand calculations are needed for the changed portions of the design, using the results of the SHIELD code calculation. In this study, the SHIELD code is modified to incorporate the APR1400 design changes, and the source term calculation is performed for the APR1400 NSSS Auxiliary System components.

  10. Comparison of TG-43 dosimetric parameters of brachytherapy sources obtained by three different versions of MCNP codes.

    Science.gov (United States)

    Zaker, Neda; Zehtabian, Mehdi; Sina, Sedigheh; Koontz, Craig; Meigooni, Ali S

    2016-03-08

    Monte Carlo simulations are widely used for calculation of the dosimetric parameters of brachytherapy sources. MCNP4C2, MCNP5, MCNPX, EGS4, EGSnrc, PTRAN, and GEANT4 are among the most commonly used codes in this field. Each of these codes utilizes a cross-sectional library for the purpose of simulating different elements and materials with complex chemical compositions. The accuracies of the final outcomes of these simulations are very sensitive to the accuracies of the cross-sectional libraries. Several investigators have shown that inaccuracies of some of the cross section files have led to errors in 125I and 103Pd parameters. The purpose of this study is to compare the dosimetric parameters of sample brachytherapy sources, calculated with three different versions of the MCNP code - MCNP4C, MCNP5, and MCNPX. In these simulations for each source type, the source and phantom geometries, as well as the number of the photons, were kept identical, thus eliminating the possible uncertainties. The results of these investigations indicate that for low-energy sources such as 125I and 103Pd there are discrepancies in gL(r) values. Discrepancies up to 21.7% and 28% are observed between MCNP4C and other codes at a distance of 6 cm for 103Pd and 10 cm for 125I from the source, respectively. However, for higher energy sources, the discrepancies in gL(r) values are less than 1.1% for 192Ir and less than 1.2% for 137Cs between the three codes.

  11. Tangent: Automatic Differentiation Using Source Code Transformation in Python

    OpenAIRE

    van Merriënboer, Bart; Wiltschko, Alexander B.; Moldovan, Dan

    2017-01-01

    Automatic differentiation (AD) is an essential primitive for machine learning programming systems. Tangent is a new library that performs AD using source code transformation (SCT) in Python. It takes numeric functions written in a syntactic subset of Python and NumPy as input, and generates new Python functions which calculate a derivative. This approach to automatic differentiation is different from existing packages popular in machine learning, such as TensorFlow and Autograd. Advantages ar...

  12. Code of Conduct on the Safety and Security of Radioactive Sources and the Supplementary Guidance on the Import and Export of Radioactive Sources

    International Nuclear Information System (INIS)

    2005-01-01

    In operative paragraph 4 of its resolution GC(47)/RES/7.B, the General Conference, having welcomed the approval by the Board of Governors of the revised IAEA Code of Conduct on the Safety and Security of Radioactive Sources (GC(47)/9), and while recognizing that the Code is not a legally binding instrument, urged each State to write to the Director General that it fully supports and endorses the IAEA's efforts to enhance the safety and security of radioactive sources and is working toward following the guidance contained in the IAEA Code of Conduct. In operative paragraph 5, the Director General was requested to compile, maintain and publish a list of States that have made such a political commitment. The General Conference, in operative paragraph 6, recognized that this procedure 'is an exceptional one, having no legal force and only intended for information, and therefore does not constitute a precedent applicable to other Codes of Conduct of the Agency or of other bodies belonging to the United Nations system'. In operative paragraph 7 of resolution GC(48)/RES/10.D, the General Conference welcomed the fact that more than 60 States had made political commitments with respect to the Code in line with resolution GC(47)/RES/7.B and encouraged other States to do so. In operative paragraph 8 of resolution GC(48)/RES/10.D, the General Conference further welcomed the approval by the Board of Governors of the Supplementary Guidance on the Import and Export of Radioactive Sources (GC(48)/13), endorsed this Guidance while recognizing that it is not legally binding, noted that more than 30 countries had made clear their intention to work towards effective import and export controls by 31 December 2005, and encouraged States to act in accordance with the Guidance on a harmonized basis and to notify the Director General of their intention to do so as supplementary information to the Code of Conduct, recalling operative paragraph 6 of resolution GC(47)/RES/7.B. 4. The

  13. Hybrid digital-analog coding with bandwidth expansion for correlated Gaussian sources under Rayleigh fading

    Science.gov (United States)

    Yahampath, Pradeepa

    2017-12-01

    Consider communicating a correlated Gaussian source over a Rayleigh fading channel with no knowledge of the channel signal-to-noise ratio (CSNR) at the transmitter. In this case, a digital system cannot be optimal over a range of CSNRs. Analog transmission, however, is optimal at all CSNRs if the source and channel are memoryless and bandwidth matched. This paper presents new hybrid digital-analog (HDA) systems for sources with memory and channels with bandwidth expansion, which outperform both digital-only and analog-only systems over a wide range of CSNRs. The digital part is either a predictive quantizer or a transform code, used to achieve a coding gain. The analog part uses linear encoding to transmit the quantization error, which improves the performance under CSNR variations. The hybrid encoder is optimized to achieve the minimum AMMSE (average minimum mean square error) over the CSNR distribution. To this end, analytical expressions are derived for the AMMSE of asymptotically optimal systems. It is shown that the outage CSNR of the channel code and the analog-digital power allocation must be jointly optimized to achieve the minimum AMMSE. In the case of HDA predictive quantization, a simple algorithm is presented to solve the optimization problem. Experimental results are presented for both Gauss-Markov sources and speech signals.

  14. Comparison of TG‐43 dosimetric parameters of brachytherapy sources obtained by three different versions of MCNP codes

    Science.gov (United States)

    Zaker, Neda; Sina, Sedigheh; Koontz, Craig; Meigooni, Ali S.

    2016-01-01

    Monte Carlo simulations are widely used for calculation of the dosimetric parameters of brachytherapy sources. MCNP4C2, MCNP5, MCNPX, EGS4, EGSnrc, PTRAN, and GEANT4 are among the most commonly used codes in this field. Each of these codes utilizes a cross-sectional library for the purpose of simulating different elements and materials with complex chemical compositions. The accuracies of the final outcomes of these simulations are very sensitive to the accuracies of the cross-sectional libraries. Several investigators have shown that inaccuracies of some of the cross section files have led to errors in 125I and 103Pd parameters. The purpose of this study is to compare the dosimetric parameters of sample brachytherapy sources, calculated with three different versions of the MCNP code: MCNP4C, MCNP5, and MCNPX. In these simulations for each source type, the source and phantom geometries, as well as the number of the photons, were kept identical, thus eliminating the possible uncertainties. The results of these investigations indicate that for low-energy sources such as 125I and 103Pd there are discrepancies in gL(r) values. Discrepancies up to 21.7% and 28% are observed between MCNP4C and other codes at a distance of 6 cm for 103Pd and 10 cm for 125I from the source, respectively. However, for higher energy sources, the discrepancies in gL(r) values are less than 1.1% for 192Ir and less than 1.2% for 137Cs between the three codes. PACS number(s): 87.56.bg PMID: 27074460

  15. Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code

    Science.gov (United States)

    Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.

    2015-12-01

    WEC-Sim is an open source code to model wave energy converter performance in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/SIMULINK using the multibody dynamics solver SimMechanics, and solves the WEC's governing equations of motion using the Cummins time-domain impulse response formulation in 6 degrees of freedom. The WEC-Sim code has undergone verification through code-to-code comparisons; however, validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation, and as a result are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at Oregon State University's Directional Wave Basin at the Hinsdale Wave Research Laboratory. Phase 1 of experimental testing was focused on device characterization and was completed in Fall 2015. Phase 2 is focused on WEC performance and is scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of the WEC-Sim code and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields, motions in 6 DOF, multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable power take-off system which can be used to generate or absorb wave energy. Numerical simulations of the experiments using WEC-Sim will be

  16. Source Code Verification for Embedded Systems using Prolog

    Directory of Open Access Journals (Sweden)

    Frank Flederer

    2017-01-01

    Full Text Available System-relevant embedded software needs to be reliable and, therefore, well tested, especially for aerospace systems. A common technique to verify programs is the analysis of their abstract syntax tree (AST). Tree structures can be elegantly analyzed with the logic programming language Prolog. Moreover, Prolog offers further advantages for a thorough analysis: on the one hand, it natively provides versatile options to efficiently process tree or graph data structures. On the other hand, Prolog's non-determinism and backtracking make it easy to test different variations of the program flow. A rule-based approach with Prolog makes it possible to characterize the verification goals in a concise and declarative way. In this paper, we describe our approach to verifying the source code of a flash file system with the help of Prolog. The flash file system is written in C++ and has been developed particularly for use in satellites. We transform a given abstract syntax tree of C++ source code into Prolog facts and derive the call graph and the execution sequence (tree), which are then tested against the verification goals. The different program flow branches due to control structures are derived by backtracking as subtrees of the full execution sequence. Finally, these subtrees are verified in Prolog. We illustrate our approach with a case study in which we search for incorrect applications of semaphores in embedded software using the real-time operating system RODOS. We rely on computation tree logic (CTL) and have designed an embedded domain-specific language (DSL) in Prolog to express the verification goals.
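
    The AST-to-facts step can be sketched in Python using its standard ast module in place of a C++ front end; the predicate names defines/2 and calls/3 below are illustrative, not the paper's actual schema.

```python
import ast

def to_facts(source, unit="main"):
    """Emit Prolog-style facts for function definitions and direct calls
    found in the AST. (The paper targets C++ ASTs; Python's ast module
    stands in here.)"""
    facts = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            facts.append(f"defines({unit}, {node.name}).")
        elif isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            facts.append(f"calls({unit}, {node.func.id}, line({node.lineno})).")
    return facts

facts = to_facts("""
def acquire(sem):
    lock(sem)

def release(sem):
    unlock(sem)
""")
```

    In the paper's setting, facts like these would be consulted by Prolog rules expressing CTL-style goals, e.g. that every lock acquisition is eventually followed by a release on all program paths.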

  17. Multi-rate control over AWGN channels via analog joint source-channel coding

    KAUST Repository

    Khina, Anatoly; Pettersson, Gustav M.; Kostina, Victoria; Hassibi, Babak

    2017-01-01

    We consider the problem of controlling an unstable plant over an additive white Gaussian noise (AWGN) channel with a transmit power constraint, where the signaling rate of communication is larger than the sampling rate (for generating observations and applying control inputs) of the underlying plant. Such a situation is quite common since sampling is done at a rate that captures the dynamics of the plant and which is often much lower than the rate that can be communicated. This setting offers the opportunity of improving the system performance by employing multiple channel uses to convey a single message (output plant observation or control input). Common ways of doing so are through either repeating the message, or by quantizing it to a number of bits and then transmitting a channel coded version of the bits whose length is commensurate with the number of channel uses per sampled message. We argue that such “separated source and channel coding” can be suboptimal and propose to perform joint source-channel coding. Since the block length is short we obviate the need to go to the digital domain altogether and instead consider analog joint source-channel coding. For the case where the communication signaling rate is twice the sampling rate, we employ the Archimedean bi-spiral-based Shannon-Kotel'nikov analog maps to show significant improvement in stability margins and linear-quadratic Gaussian (LQG) costs over simple schemes that employ repetition.
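
    The idea of an analog map for a 1:2 bandwidth expansion can be sketched as follows, using a simplified two-armed Archimedean spiral in place of the paper's optimized Shannon-Kotel'nikov map; the stretch factor delta and the brute-force grid decoder are illustrative choices.

```python
import numpy as np

def encode(x, delta=4.0):
    """Map a scalar in [-1, 1] to a 2-D channel symbol on a two-armed
    Archimedean spiral (a simplified stand-in for the optimized map)."""
    theta = delta * abs(x)
    sign = 1.0 if x >= 0 else -1.0
    return np.array([sign * theta * np.cos(theta), theta * np.sin(theta)])

def decode(y, delta=4.0):
    """Minimum-distance decoding (ML under AWGN) by brute-force grid search."""
    grid = np.linspace(-1.0, 1.0, 2001)
    pts = np.array([encode(x, delta) for x in grid])
    return grid[np.argmin(np.sum((pts - y) ** 2, axis=1))]

x_hat = decode(encode(0.37))   # noiseless round trip
```

    Increasing delta stretches the source interval along a longer curve, improving small-noise accuracy at the cost of anomalous (threshold) errors when noise jumps the decoder to the wrong spiral turn — the trade-off the optimized maps balance.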

  19. Running the source term code package in Elebra MX-850

    International Nuclear Information System (INIS)

    Guimaraes, A.C.F.; Goes, A.G.A.

    1988-01-01

    The Source Term Code Package (STCP) is one of the main tools applied in calculations of the behavior of fission products from nuclear power plants. It is a set of computer codes that assists in calculating the release of radioactive materials from the metallic containment of power reactors to the environment during a severe reactor accident. The original version of STCP runs on SDC computer systems, but since it is written in FORTRAN 77 it can also be run on other systems such as IBM, Burroughs, Elebra, etc. The Elebra MX-8500 version of STCP contains five codes: MARCH3, TRAPMELT, TCCA, VANESA, and NAUA. The example presented in this report considers a small LOCA accident in a PWR-type reactor. (M.I.)

  20. Code of practice for the use of sealed radioactive sources in borehole logging (1998)

    International Nuclear Information System (INIS)

    1989-12-01

    The purpose of this code is to establish working practices, procedures, and protective measures which will aid in keeping doses arising from the use of borehole logging equipment containing sealed radioactive sources as low as reasonably achievable, and to ensure that the dose-equivalent limits specified in the National Health and Medical Research Council's radiation protection standards are not exceeded. This code applies to all situations and practices where a sealed radioactive source or sources are used in wireline logging for investigating the physical properties of the geological sequence, or any fluids contained in the geological sequence, or the properties of the borehole itself, whether casing, mudcake, or borehole fluids. The radiation protection standards specify dose-equivalent limits for two categories: radiation workers and members of the public. 3 refs., tabs., ills

  1. CACTI: free, open-source software for the sequential coding of behavioral interactions.

    Science.gov (United States)

    Glynn, Lisa H; Hallgren, Kevin A; Houck, Jon M; Moyers, Theresa B

    2012-01-01

    The sequential analysis of client and clinician speech in psychotherapy sessions can help to identify and characterize potential mechanisms of treatment and behavior change. Previous studies required coding systems that were time-consuming, expensive, and error-prone. Existing software can be expensive and inflexible, and furthermore, no single package allows for pre-parsing, sequential coding, and assignment of global ratings. We developed a free, open-source, and adaptable program to meet these needs: The CASAA Application for Coding Treatment Interactions (CACTI). Without transcripts, CACTI facilitates the real-time sequential coding of behavioral interactions using WAV-format audio files. Most elements of the interface are user-modifiable through a simple XML file, and can be further adapted using Java through the terms of the GNU Public License. Coding with this software yields interrater reliabilities comparable to previous methods, but at greatly reduced time and expense. CACTI is a flexible research tool that can simplify psychotherapy process research, and has the potential to contribute to the improvement of treatment content and delivery.

  2. Neutrons Flux Distributions of the Pu-Be Source and its Simulation by the MCNP-4B Code

    Science.gov (United States)

    Faghihi, F.; Mehdizadeh, S.; Hadad, K.

    The neutron fluence rate of a low-intensity Pu-Be source is measured by neutron activation analysis (NAA) of 197Au foils. In addition, the neutron fluence rate distribution versus energy is calculated using the MCNP-4B code with the ENDF/B-V library. The theoretical simulation, together with the experimental work, is a first such exercise for the Iranian group and serves to establish confidence in the code for further research. In the theoretical investigation, an isotropic Pu-Be source with a cylindrical volume distribution is simulated and the relative neutron fluence rate versus energy is calculated with MCNP-4B. The fast and thermal neutron fluence rates obtained by the NAA measurements and by the MCNP calculation are compared.

  3. Automated Source Code Analysis to Identify and Remove Software Security Vulnerabilities: Case Studies on Java Programs

    OpenAIRE

    Natarajan Meghanathan

    2013-01-01

    The high-level contribution of this paper is to illustrate the development of generic solution strategies to remove software security vulnerabilities that could be identified using automated tools for source code analysis on software programs (developed in Java). We use the Source Code Analyzer and Audit Workbench automated tools, developed by HP Fortify Inc., for our testing purposes. We present case studies involving a file writer program embedded with features for password validation, and ...

  4. Quantitative computed tomography (QCT) as a radiology reporting tool by using optical character recognition (OCR) and macro program.

    Science.gov (United States)

    Lee, Young Han; Song, Ho-Taek; Suh, Jin-Suck

    2012-12-01

    The objectives are (1) to introduce a new concept for building a quantitative computed tomography (QCT) reporting system by using optical character recognition (OCR) and a macro program, and (2) to illustrate the practical use of the QCT reporting system in the radiology reading environment. The reporting system was created with an open-source OCR program and an open-source macro program. The main module was designed to perform OCR on QCT images during the radiology reading process. The principal processing steps are: (1) save a QCT report as a graphic file, (2) recognize the characters in the image as text, (3) extract the T-scores from the text, (4) perform error correction, (5) reformat the values into the QCT radiology reporting template, and (6) paste the report into the electronic medical record (EMR) or picture archiving and communication system (PACS). The accuracy of the OCR was tested on randomly selected QCTs. The system successfully performed OCR on QCT reports, and the diagnosis of normal, osteopenia, or osteoporosis is determined automatically. Error correction of the OCR output is done with an AutoHotkey-coded module. The T-scores of the femoral neck and lumbar vertebrae were extracted with accuracies of 100% and 95.4%, respectively. A convenient QCT reporting system can thus be established using open-source OCR software and an open-source macro program, and the method can be easily adapted for other QCT applications and PACS/EMR systems.
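
    The T-score extraction and diagnosis steps can be sketched with a regular expression over the OCR output. The report layout and region labels below are assumptions (real QCT reports vary); the diagnostic cut-offs are the standard WHO T-score criteria.

```python
import re

def classify(t):
    """WHO T-score criteria: >= -1.0 normal; between -2.5 and -1.0
    osteopenia; <= -2.5 osteoporosis."""
    if t >= -1.0:
        return "normal"
    if t > -2.5:
        return "osteopenia"
    return "osteoporosis"

def extract_t_scores(ocr_text):
    # Region labels and "T-score: <value>" layout are assumed for illustration.
    pattern = r"(Femoral neck|L1-L4)\s*T-score[:\s]+(-?\d+\.\d+)"
    return {m.group(1): float(m.group(2)) for m in re.finditer(pattern, ocr_text)}

ocr_text = "Femoral neck T-score: -1.8\nL1-L4 T-score: -2.7"
scores = extract_t_scores(ocr_text)
```

    The error-correction step in the paper (handled by an AutoHotkey module) would sit between the OCR pass and this parsing stage, fixing common misrecognitions such as a minus sign read as a hyphenless digit.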

  5. Coding OSICS sports injury diagnoses in epidemiological studies: does the background of the coder matter?

    Science.gov (United States)

    Finch, Caroline F; Orchard, John W; Twomey, Dara M; Saad Saleem, Muhammad; Ekegren, Christina L; Lloyd, David G; Elliott, Bruce C

    2014-04-01

    To compare Orchard Sports Injury Classification System (OSICS-10) sports medicine diagnoses assigned by a clinical and non-clinical coder. Assessment of intercoder agreement. Community Australian football. 1082 standardised injury surveillance records. Direct comparison of the four-character hierarchical OSICS-10 codes assigned by two independent coders (a sports physician and an epidemiologist). Adjudication by a third coder (biomechanist). The coders agreed on the first character 95% of the time and on the first two characters 86% of the time. They assigned the same four-digit OSICS-10 code for only 46% of the 1082 injuries. The majority of disagreements occurred for the third character; 85% were because one coder assigned a non-specific 'X' code. The sports physician code was deemed correct in 53% of cases and the epidemiologist in 44%. Reasons for disagreement included the physician not using all of the collected information and the epidemiologist lacking specific anatomical knowledge. Sports injury research requires accurate identification and classification of specific injuries and this study found an overall high level of agreement in coding according to OSICS-10. The fact that the majority of the disagreements occurred for the third OSICS character highlights the fact that increasing complexity and diagnostic specificity in injury coding can result in a loss of reliability and demands a high level of anatomical knowledge. Injury report form details need to reflect this level of complexity and data management teams need to include a broad range of expertise.
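
    The character-by-character agreement reported above can be computed mechanically as prefix agreement over the hierarchical codes; the four-character OSICS-style codes in the example below are made up for illustration, not taken from the study.

```python
def prefix_agreement(codes_a, codes_b, depth):
    """Fraction of paired hierarchical codes that agree on their
    first `depth` characters."""
    assert len(codes_a) == len(codes_b)
    hits = sum(a[:depth] == b[:depth] for a, b in zip(codes_a, codes_b))
    return hits / len(codes_a)

# Hypothetical codes assigned by two coders to the same three injuries
physician      = ["KJXX", "TMMX", "AFMT"]
epidemiologist = ["KJTX", "TMMX", "AXXX"]

agree_1 = prefix_agreement(physician, epidemiologist, 1)  # body-part level
agree_4 = prefix_agreement(physician, epidemiologist, 4)  # full code
```

    As in the study, agreement falls as the comparison depth grows, since deeper characters encode progressively finer diagnostic detail.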

  6. From system requirements to source code: transitions in UML and RUP

    Directory of Open Access Journals (Sweden)

    Stanisław Wrycza

    2011-06-01

    Full Text Available Many UML-related books are manuals explaining the language specification; only some concentrate on the practical aspects of using UML effectively with CASE tools and RUP. The current paper presents the transitions from a system requirements specification to structural source code that are useful while developing an information system.

  7. Source coherence impairments in a direct detection direct sequence optical code-division multiple-access system.

    Science.gov (United States)

    Fsaifes, Ihsan; Lepers, Catherine; Lourdiane, Mounia; Gallion, Philippe; Beugin, Vincent; Guignard, Philippe

    2007-02-01

    We demonstrate that direct sequence optical code-division multiple-access (DS-OCDMA) encoders and decoders using sampled fiber Bragg gratings (S-FBGs) behave as multipath interferometers. In that case, chip pulses of the prime sequence codes generated by spreading in time-coherent data pulses can result from multiple reflections in the interferometers that can superimpose within a chip time duration. We show that the autocorrelation function has to be considered as the sum of complex amplitudes of the combined chip as the laser source coherence time is much greater than the integration time of the photodetector. To reduce the sensitivity of the DS-OCDMA system to the coherence time of the laser source, we analyze the use of sparse and nonperiodic quadratic congruence and extended quadratic congruence codes.

  9. RMG: An Open Source Electronic Structure Code for Multi-Petaflops Calculations

    Science.gov (United States)

    Briggs, Emil; Lu, Wenchang; Hodak, Miroslav; Bernholc, Jerzy

    RMG (Real-space Multigrid) is an open source, density functional theory code for quantum simulations of materials. It solves the Kohn-Sham equations on real-space grids, which allows for natural parallelization via domain decomposition. Either subspace or Davidson diagonalization, coupled with multigrid methods, is used to accelerate convergence. RMG is a cross-platform open source package which has been used in the study of a wide range of systems, including semiconductors, biomolecules, and nanoscale electronic devices. It can optionally use GPU accelerators to improve performance on systems where they are available. The recently released versions (>2.0) support multiple GPUs per compute node, have improved performance and scalability, and offer enhanced accuracy and support for additional hardware platforms. New versions of the code are regularly released at http://www.rmgdft.org. The releases include binaries for Linux, Windows, and Macintosh systems, automated builds for clusters using cmake, as well as versions adapted to the major supercomputing installations and platforms. Several recent, large-scale applications of RMG will be discussed.

  10. Lovers, enemies, and friends: The complex and coded early history of lesbian comic strip characters.

    Science.gov (United States)

    McGurk, Caitlin

    2018-05-31

    This article seeks to recuperate four previously unexamined early newspaper comic strip characters that could lay the groundwork for queer comic studies. The titular characters in Lucy and Sophie Say Goodbye (1905), Sanjak in Terry and the Pirates (1939) by Milton Caniff, and Hank O'Hair in Brenda Starr, Reporter (1940) by Dale Messick are analyzed through close readings, supporting archival material, and interviews. The article also theorizes the identification of the creator of Lucy and Sophie Say Goodbye as George O. Frink, and offers an overview of LGBTQ comics holdings at institutions in North America.

  11. ‘Modular Spacetime in the “Intelligent” Blockbuster: Inception and Source Code’

    OpenAIRE

    Misek, Richard; Cameron, Allan

    2014-01-01

    Suggesting both linear progression and configurable modularity, the complex cinematic narratives of Inception (Christopher Nolan, 2010) and Source Code (Duncan Jones, 2011) produce distinctive articulations of time and space. They also thematize the architectural processes involved in their own narrative construction, by featuring characters who are programmers, designers, and architects, and deploying a range of spatial metaphors (including lines, layers, and circles) via scenography, dialog...

  12. Cinematography and character depiction

    OpenAIRE

    William Francis Nicholson

    2011-01-01

    This essay investigates the ways in which cinematography can be used in depicting characters effectively in the motion picture medium. Since an aspiring filmmaker may be overwhelmed by the expansive field of cinematography, this essay aims to demystify and systematise this aspect of filmmaking. It combines information from written sources (mostly text books on filmmaking and cinematography) with observations made from viewing recent and older feature films. The knowledge is organised under th...

  13. Coded moderator approach for fast neutron source detection and localization at standoff

    Energy Technology Data Exchange (ETDEWEB)

    Littell, Jennifer [Department of Nuclear Engineering, University of Tennessee, 305 Pasqua Engineering Building, Knoxville, TN 37996 (United States); Lukosi, Eric, E-mail: elukosi@utk.edu [Department of Nuclear Engineering, University of Tennessee, 305 Pasqua Engineering Building, Knoxville, TN 37996 (United States); Institute for Nuclear Security, University of Tennessee, 1640 Cumberland Avenue, Knoxville, TN 37996 (United States); Hayward, Jason; Milburn, Robert; Rowan, Allen [Department of Nuclear Engineering, University of Tennessee, 305 Pasqua Engineering Building, Knoxville, TN 37996 (United States)

    2015-06-01

    Considering the need for directional sensing at standoff for some security applications and scenarios where a neutron source may be shielded by high Z material that nearly eliminates the source gamma flux, this work focuses on investigating the feasibility of using thermal neutron sensitive boron straw detectors for fast neutron source detection and localization. We utilized MCNPX simulations to demonstrate that, through surrounding the boron straw detectors by a HDPE coded moderator, a source-detector orientation-specific response enables potential 1D source localization in a high neutron detection efficiency design. An initial test algorithm has been developed in order to confirm the viability of this detector system's localization capabilities which resulted in identification of a 1 MeV neutron source with a strength equivalent to 8 kg WGPu at 50 m standoff within ±11°.
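
    The 1-D localization idea — match the measured straw-count pattern against orientation-specific response templates produced by the coded moderator — can be sketched as follows. The template values and bearings are invented for illustration; in the real system they would come from the MCNPX model of the HDPE moderator.

```python
import numpy as np

# Hypothetical straw-count response templates for three source bearings.
templates = {
    -30: np.array([120.0, 80.0, 40.0, 20.0]),
      0: np.array([60.0, 100.0, 100.0, 60.0]),
     30: np.array([20.0, 40.0, 80.0, 120.0]),
}

def locate(counts):
    """Return the template bearing (degrees) whose normalized response
    best matches the measured straw counts (least squares)."""
    p = counts / counts.sum()
    return min(templates,
               key=lambda a: np.sum((p - templates[a] / templates[a].sum()) ** 2))

bearing = locate(np.array([25.0, 35.0, 85.0, 115.0]))
```

    A finer angular grid of templates, and a Poisson likelihood in place of least squares, would bring this toy matcher closer to the test algorithm described in the abstract.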

  14. A Realistic Model under which the Genetic Code is Optimal

    NARCIS (Netherlands)

    Buhrman, H.; van der Gulik, P.T.S.; Klau, G.W.; Schaffner, C.; Speijer, D.; Stougie, L.

    2013-01-01

    The genetic code has a high level of error robustness. Using values of hydrophobicity scales as a proxy for amino acid character, and the mean square measure as a function quantifying error robustness, a value can be obtained for a genetic code which reflects the error robustness of that code. By
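
    The robustness measure sketched in this abstract can be computed directly: translate every codon with the standard table, then average the squared change in hydrophobicity over all single-nucleotide substitutions between sense codons. The sketch below uses the Kyte-Doolittle hydropathy scale as the amino acid character; the abstract does not specify which scale the authors used, so this is an assumption.

```python
bases = "TCAG"
# Standard genetic code, one-letter amino acids in TCAG codon order ("*" = stop)
aas = ("FFLLSSSSYY**CC*W"
       "LLLLPPPPHHQQRRRR"
       "IIIMTTTTNNKKSSRR"
       "VVVVAAAADDEEGGGG")
codon_to_aa = {a + b + c: aas[16 * i + 4 * j + k]
               for i, a in enumerate(bases)
               for j, b in enumerate(bases)
               for k, c in enumerate(bases)}

# Kyte-Doolittle hydropathy values as the amino acid "character"
kd = {"I": 4.5, "V": 4.2, "L": 3.8, "F": 2.8, "C": 2.5, "M": 1.9, "A": 1.8,
      "G": -0.4, "T": -0.7, "S": -0.8, "W": -0.9, "Y": -1.3, "P": -1.6,
      "H": -3.2, "E": -3.5, "Q": -3.5, "D": -3.5, "N": -3.5, "K": -3.9,
      "R": -4.5}

def ms_error(code):
    """Mean squared hydropathy change over all single-nucleotide
    substitutions between sense codons (lower = more error-robust)."""
    diffs = []
    for codon, aa in code.items():
        if aa == "*":
            continue
        for pos in range(3):
            for b in bases:
                if b == codon[pos]:
                    continue
                mut = code[codon[:pos] + b + codon[pos + 1:]]
                if mut != "*":
                    diffs.append((kd[aa] - kd[mut]) ** 2)
    return sum(diffs) / len(diffs)

robustness = ms_error(codon_to_aa)
```

    Comparing this value against the same measure for randomly shuffled codon assignments is the usual way to quantify how unusually robust the natural code is.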

  15. Source Code Stylometry Improvements in Python

    Science.gov (United States)

    2017-12-14

    [Front matter: Fig. 1, a code sample, and Fig. 2, the corresponding abstract syntax tree, reproduced from the de-anonymizing programmers work of Caliskan-Islam et al. (2015).] Just as a person can be identified by their handwriting, or an author by their style of prose, programmers can be identified by their code. Provided a labelled training set of code samples (example in Fig. 1), the techniques used in stylometry can identify the author of a piece of code or even

  16. Characters with personality!

    NARCIS (Netherlands)

    Bosch, K. van den; Brandenburgh, A.; Muller, T.J.; Heuvelink, A.

    2012-01-01

    Serious games offer an opportunity for learning communication skills by practicing conversations with one or more virtual characters, provided that the character(s) behave in accordance with their assigned properties and strategies. This paper presents an approach for developing virtual characters

  18. Character selecting advisor for a role-playing game

    Science.gov (United States)

    Redfield, Carol L.; Berlanga, Felicia

    1994-01-01

    Role-playing games have been a source of much pleasure and merriment for people of all ages. The process of developing a character for a role-playing game is usually very, very time consuming, delaying what many players consider the most entertaining part of the game. An expert system has been written to assist a player in creating a character by guiding the player through a series of questions. This paper discusses the selection of this topic, the knowledge engineering, the software development, and the resulting program that cuts the time of character development from about 4 hours to 30 minutes. The program was written on a PC and an Apollo in CLIPS 4.3 and currently runs on the Apollo.

  19. Violent film characters' portrayal of alcohol, sex, and tobacco-related behaviors.

    Science.gov (United States)

    Bleakley, Amy; Romer, Daniel; Jamieson, Patrick E

    2014-01-01

    To determine the extent to which movies popular with adolescents feature characters who jointly engage in violence and other risk behaviors. We hypothesized that violent characters engage in other risk behaviors equally often in films rated appropriate for children over 12 (PG-13) and Restricted (R)-rated films. Content analysis of a sample of top-grossing movies from 1985 to 2010 (n = 390). We coded movies for the presence of at least 1 main character who was involved in violence and either sex, tobacco, or alcohol use within a 5-minute movie segment and throughout a film. Approximately 90% of the movies contained a segment with a main character involved in violence, and ~77% of the films had the same character engaging in at least 1 other risk behavior. A violent character was portrayed most often partaking in alcohol-related and sexual behaviors. G and PG movies had less co-occurrence than PG-13 or R-rated movies, but there was no statistical difference between PG-13 and R-rated movies with regards to violence co-occurring with other risk behaviors. These trends did not vary over time. Popular films that contain violent characters also show those characters engaging in other risk behaviors. Similar rates of co-occurrence between PG-13 and R-rated films suggest that the Motion Picture Association of America ratings system is not sensitive to the joint portrayal of violence and alcohol, sex, and tobacco-related risk behaviors. The on-screen clustering of violence with other risk behaviors is cause for concern and worthy of additional research.

  20. Research on pre-processing of QR Code

    Science.gov (United States)

    Sun, Haixing; Xia, Haojie; Dong, Ning

    2013-10-01

    QR code encodes many kinds of information because of its advantages: large storage capacity, high reliability, omnidirectional ultra-high-speed reading, small printout size, high-efficiency representation of Chinese characters, etc. In order to obtain a clear binarized image from a complex background and improve the recognition rate of QR codes, this paper researches pre-processing methods for QR code (Quick Response Code) images, and presents algorithms and results of image pre-processing for QR code recognition. The conventional method is improved by modifying Sauvola's adaptive thresholding method. Additionally, a QR code extraction step that adapts to different image sizes and a flexible image correction approach are introduced, improving the efficiency and accuracy of QR code image processing.
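
    Sauvola's adaptive threshold, the method modified in this work, computes a per-pixel threshold T = m * (1 + k * (s / R - 1)) from the local mean m and standard deviation s in a window around each pixel. A straightforward, unoptimized sketch, assuming the usual defaults k = 0.2 and dynamic range R = 128 for 8-bit images:

```python
import numpy as np

def sauvola_binarize(img, w=15, k=0.2, R=128.0):
    """Per-pixel Sauvola threshold T = m * (1 + k * (s / R - 1));
    returns True where the pixel exceeds its local threshold (background),
    False for dark foreground such as QR modules."""
    img = img.astype(float)
    pad = w // 2
    padded = np.pad(img, pad, mode="reflect")
    out = np.zeros(img.shape, dtype=bool)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            win = padded[y:y + w, x:x + w]
            m, s = win.mean(), win.std()
            out[y, x] = img[y, x] > m * (1 + k * (s / R - 1))
    return out

# Dark QR-like module on a bright background
img = np.full((40, 40), 200.0)
img[15:25, 15:25] = 10.0
binary = sauvola_binarize(img)
```

    Production implementations replace the double loop with integral images so that m and s are obtained in O(1) per pixel regardless of the window size.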

  1. Pre-coding method and apparatus for multiple source or time-shifted single source data and corresponding inverse post-decoding method and apparatus

    Science.gov (United States)

    Yeh, Pen-Shu (Inventor)

    1998-01-01

    A pre-coding method and device for improving data compression performance by removing correlation between a first original data set and a second original data set, each having M members, respectively. The pre-coding method produces a compression-efficiency-enhancing double-difference data set. The method and device produce a double-difference data set, i.e., an adjacent-delta calculation performed on a cross-delta data set or a cross-delta calculation performed on two adjacent-delta data sets, from either one of (1) two adjacent spectral bands coming from two discrete sources, respectively, or (2) two time-shifted data sets coming from a single source. The resulting double-difference data set is then coded using either a distortionless data encoding scheme (entropy encoding) or a lossy data compression scheme. Also, a post-decoding method and device for recovering a second original data set having been represented by such a double-difference data set.
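
    For case (1), two correlated spectral bands, the forward transform and its inverse can be sketched as follows: take the cross-delta between the bands, then an adjacent-delta along the result, keeping the first element so the operation stays invertible. This is a minimal reading of the patent abstract, not its exact apparatus.

```python
import numpy as np

def precode(band1, band2):
    """Double-difference: cross-delta between two correlated bands,
    then adjacent-delta along the cross-delta set (first element kept)."""
    cross = band2.astype(np.int64) - band1     # cross-delta (inter-band)
    dd = np.empty_like(cross)
    dd[0] = cross[0]
    dd[1:] = cross[1:] - cross[:-1]            # adjacent-delta (intra-set)
    return dd

def postdecode(band1, dd):
    """Inverse: cumulative sum undoes the adjacent-delta; adding band1
    undoes the cross-delta."""
    return band1 + np.cumsum(dd)

rng = np.random.default_rng(0)
band1 = rng.integers(0, 256, size=64)
band2 = band1 + rng.integers(-3, 4, size=64)   # strongly correlated second band
dd = precode(band1, band2)
```

    Because the bands are correlated, the double-difference values cluster tightly around zero, which is exactly what makes the subsequent entropy coding stage more efficient.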

  2. Solving Semantic Searches for Source Code

    Science.gov (United States)

    2012-11-01

    but of input and expected output pairs. In this domain, those inputs take the form of strings and outputs could be one of sev- eral datatypes ...for some relaxation of CPi that yields C ′ Pi . Encoding weakening is performed by systematically making the constraints on a particular datatype ...the datatypes that can hold concrete or symbolic values: integers, characters, booleans, and strings. The Java implementation uses all the data types

  3. Group representations, error bases and quantum codes

    Energy Technology Data Exchange (ETDEWEB)

    Knill, E

    1996-01-01

    This report continues the discussion of unitary error bases and quantum codes. Nice error bases are characterized in terms of the existence of certain characters in a group. A general construction for error bases which are non-abelian over the center is given. The method for obtaining codes due to Calderbank et al. is generalized and expressed purely in representation theoretic terms. The significance of the inertia subgroup both for constructing codes and obtaining the set of transversally implementable operations is demonstrated.

  4. XSOR codes users manual

    International Nuclear Information System (INIS)

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named ''XSOR''. The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore phenomena, and their uncertainty, which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  5. Bit rates in audio source coding

    NARCIS (Netherlands)

    Veldhuis, Raymond N.J.

    1992-01-01

    The goal is to introduce and solve the audio coding optimization problem. Psychoacoustic results such as masking and excitation pattern models are combined with results from rate distortion theory to formulate the audio coding optimization problem. The solution of the audio optimization problem is a

  6. COMPASS: A source term code for investigating capillary barrier performance

    International Nuclear Information System (INIS)

    Zhou, Wei; Apted, J.J.

    1996-01-01

    A computer code, COMPASS, based on a compartment-model approach has been developed to calculate the near-field source term of a high-level-waste repository under unsaturated conditions. COMPASS is applied to evaluate the expected performance of Richard's (capillary) barriers as backfills to divert infiltrating groundwater at Yucca Mountain. Comparing the release rates of four typical nuclides with and without the Richard's barrier shows that the barrier significantly decreases the peak release rates from the Engineered-Barrier-System (EBS) into the host rock.

  7. BRIEF CONSIDERATIONS REGARDING THE PROPERTY TRANSFERRING OR ENGENDERING OF OBLIGATIONS CHARACTER OF THE SALES CONTRACT

    Directory of Open Access Journals (Sweden)

    Vlad-Victor OCHEA

    2016-05-01

    Full Text Available This paper emphasizes the main aspects regarding the property-transferring or obligation-engendering character of the sales contract governed by the Romanian Civil Code of 2009.

  8. Time-dependent anisotropic external sources in transient 3-D transport code TORT-TD

    International Nuclear Information System (INIS)

    Seubert, A.; Pautz, A.; Becker, M.; Dagan, R.

    2009-01-01

    This paper describes the implementation of a time-dependent distributed external source in TORT-TD by explicitly considering the external source in the ''fixed-source'' term of the implicitly time-discretised 3-D discrete ordinates transport equation. Anisotropy of the external source is represented by a spherical harmonics series expansion similar to the angular fluxes. The YALINA-Thermal subcritical assembly serves as a test case. The configuration with 280 fuel rods has been analysed with TORT-TD using cross sections in 18 energy groups and P1 scattering order generated by the KAPROS code system. Good agreement is achieved concerning the multiplication factor. The response of the system to an artificial time-dependent source consisting of two square-wave pulses demonstrates the time-dependent external source capability of TORT-TD. The result is physically plausible as judged from validation calculations. (orig.)

  9. Experimental benchmark of the NINJA code for application to the Linac4 H- ion source plasma

    Science.gov (United States)

    Briefi, S.; Mattei, S.; Rauner, D.; Lettry, J.; Tran, M. Q.; Fantz, U.

    2017-10-01

    For a dedicated performance optimization of negative hydrogen ion sources applied at particle accelerators, a detailed assessment of the plasma processes is required. Due to the compact design of these sources, diagnostic access is typically limited to optical emission spectroscopy yielding only line-of-sight integrated results. In order to allow for a spatially resolved investigation, the electromagnetic particle-in-cell Monte Carlo collision code NINJA has been developed for the Linac4 ion source at CERN. This code considers the RF field generated by the ICP coil as well as the external static magnetic fields and calculates self-consistently the resulting discharge properties. NINJA is benchmarked at the diagnostically well accessible lab experiment CHARLIE (Concept studies for Helicon Assisted RF Low pressure Ion sourcEs) at varying RF power and gas pressure. A good general agreement is observed between experiment and simulation although the simulated electron density trends for varying pressure and power as well as the absolute electron temperature values deviate slightly from the measured ones. This can be explained by the assumption of strong inductive coupling in NINJA, whereas the CHARLIE discharges show the characteristics of loosely coupled plasmas. For the Linac4 plasma, this assumption is valid. Accordingly, both the absolute values of the accessible plasma parameters and their trends for varying RF power agree well in measurement and simulation. At varying RF power, the H- current extracted from the Linac4 source peaks at 40 kW. For volume operation, this is perfectly reflected by assessing the processes in front of the extraction aperture based on the simulation results where the highest H- density is obtained for the same power level. In surface operation, the production of negative hydrogen ions at the converter surface can only be considered by specialized beam formation codes, which require plasma parameters as input. 
It has been demonstrated that

  10. Uncertainties in source term calculations generated by the ORIGEN2 computer code for Hanford Production Reactors

    International Nuclear Information System (INIS)

    Heeb, C.M.

    1991-03-01

    The ORIGEN2 computer code is the primary calculational tool for computing isotopic source terms for the Hanford Environmental Dose Reconstruction (HEDR) Project. The ORIGEN2 code computes the amounts of radionuclides that are created or remain in spent nuclear fuel after neutron irradiation and radioactive decay have occurred as a result of nuclear reactor operation. ORIGEN2 was chosen as the primary code for these calculations because it is widely used and accepted by the nuclear industry, both in the United States and the rest of the world. Its comprehensive library of over 1,600 nuclides includes any possible isotope of interest to the HEDR Project. It is important to evaluate the uncertainties expected from use of ORIGEN2 in the HEDR Project because these uncertainties may have a pivotal impact on the final accuracy and credibility of the results of the project. There are three primary sources of uncertainty in an ORIGEN2 calculation: basic nuclear data uncertainty in neutron cross sections, radioactive decay constants, energy per fission, and fission product yields; calculational uncertainty due to input data; and code uncertainties (i.e., numerical approximations, and neutron spectrum-averaged cross-section values from the code library). 15 refs., 5 figs., 5 tabs

  11. Methods for Presenting Braille Characters on a Mobile Device with a Touchscreen and Tactile Feedback.

    Science.gov (United States)

    Rantala, J; Raisamo, R; Lylykangas, J; Surakka, V; Raisamo, J; Salminen, K; Pakkanen, T; Hippula, A

    2009-01-01

    Three novel interaction methods were designed for reading six-dot Braille characters from the touchscreen of a mobile device. A prototype device with a piezoelectric actuator embedded under the touchscreen was used to create tactile feedback. The three interaction methods, scan, sweep, and rhythm, enabled users to read Braille characters one at a time either by exploring the characters dot by dot or by sensing a rhythmic pattern presented on the screen. The methods were tested with five blind Braille readers as a proof of concept. The results of the first experiment showed that all three methods can be used to convey information as the participants could accurately (91-97 percent) recognize individual characters. In the second experiment the presentation rate of the most efficient and preferred method, the rhythm, was varied. A mean recognition accuracy of 70 percent was found when the speed of presenting a single character was nearly doubled from the first experiment. The results showed that temporal tactile feedback and Braille coding can be used to transmit single-character information while further studies are still needed to evaluate the presentation of serial information, i.e., multiple Braille characters.

  12. Character animation fundamentals developing skills for 2D and 3D character animation

    CERN Document Server

    Roberts, Steve

    2012-01-01

    Expand your animation toolkit and remain competitive in the industry with this leading resource for 2D and 3D character animation techniques. Apply the industry's best practices to your own workflows and develop 2D, 3D and hybrid characters with ease. With side-by-side comparisons of 2D and 3D character design, improve your character animation and master traditional principles and processes including weight and balance, timing and walks. Develop characters inspired by humans, birds, fish, snakes and four-legged animals. Breathe life into your character and develop a character's personality w

  13. A Comparison of Source Code Plagiarism Detection Engines

    Science.gov (United States)

    Lancaster, Thomas; Culwin, Fintan

    2004-06-01

    Automated techniques for finding plagiarism in student source code submissions have been in use for over 20 years and there are many available engines and services. This paper reviews the literature on the major modern detection engines, providing a comparison of them based upon the metrics and techniques they deploy. Generally the most common and effective techniques are seen to involve tokenising student submissions then searching pairs of submissions for long common substrings, an example of what is defined to be a paired structural metric. Computing academics are recommended to use one of the two Web-based detection engines, MOSS and JPlag. It is shown that whilst detection is well established there are still places where further research would be useful, particularly where visual support of the investigation process is possible.

  14. Sensitivity analysis and benchmarking of the BLT low-level waste source term code

    International Nuclear Information System (INIS)

    Suen, C.J.; Sullivan, T.M.

    1993-07-01

    To evaluate the source term for low-level waste disposal, a comprehensive model has been developed and incorporated into a computer code called BLT (Breach-Leach-Transport). Since the release of the original version, many new features and improvements have been added to the Leach model of the code. This report consists of two different studies based on the new version of the BLT code: (1) a series of verification/sensitivity tests; and (2) benchmarking of the BLT code using field data. Based on the results of the verification/sensitivity tests, the authors conclude that the new version represents a significant improvement and is capable of providing more realistic simulations of the leaching process. Benchmarking work was carried out to provide a reasonable level of confidence in the model predictions. In this study, the experimentally measured release curves for nitrate, technetium-99 and tritium from the saltstone lysimeters operated by Savannah River Laboratory were used. The model results are observed to be in general agreement with the experimental data, within acceptable limits of uncertainty.

  15. Review of the status of validation of the computer codes used in the severe accident source term reassessment study (BMI-2104)

    International Nuclear Information System (INIS)

    Kress, T.S.

    1985-04-01

    The determination of severe accident source terms must, by necessity it seems, rely heavily on the use of complex computer codes. Source term acceptability, therefore, rests on the assessed validity of such codes. Consequently, one element of NRC's recent efforts to reassess LWR severe accident source terms is to provide a review of the status of validation of the computer codes used in the reassessment. The results of this review are the subject of this document. The separate review documents compiled in this report were used as a resource, along with the results of the BMI-2104 study by BCL and the QUEST study by SNL, to arrive at a more-or-less independent appraisal of the status of source term modeling at this time.

  16. OpenSWPC: an open-source integrated parallel simulation code for modeling seismic wave propagation in 3D heterogeneous viscoelastic media

    Science.gov (United States)

    Maeda, Takuto; Takemura, Shunsuke; Furumura, Takashi

    2017-07-01

    We have developed an open-source software package, Open-source Seismic Wave Propagation Code (OpenSWPC), for parallel numerical simulations of seismic wave propagation in 3D and 2D (P-SV and SH) viscoelastic media based on the finite difference method at local-to-regional scales. The code is equipped with a frequency-independent attenuation model based on the generalized Zener body and an efficient perfectly matched layer for the absorbing boundary condition. A hybrid-style programming approach using OpenMP and the Message Passing Interface (MPI) is adopted for efficient parallel computation. OpenSWPC has wide applicability for seismological studies and great portability, allowing excellent performance from PC clusters to supercomputers. Without modifying the code, users can conduct seismic wave propagation simulations using their own velocity structure models and the necessary source representations by specifying them in an input parameter file. The code has various modes for different types of velocity structure model input and different source representations such as single force, moment tensor and plane-wave incidence, which can easily be selected via the input parameters. Widely used binary data formats, the Network Common Data Form (NetCDF) and the Seismic Analysis Code (SAC), are adopted for the input of the heterogeneous structure model and the outputs of the simulation results, so users can easily handle the input/output datasets. All codes are written in Fortran 2003 and are available with detailed documentation in a public repository.

  17. New Source Term Model for the RESRAD-OFFSITE Code Version 3

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Charley [Argonne National Lab. (ANL), Argonne, IL (United States); Gnanapragasam, Emmanuel [Argonne National Lab. (ANL), Argonne, IL (United States); Cheng, Jing-Jy [Argonne National Lab. (ANL), Argonne, IL (United States); Kamboj, Sunita [Argonne National Lab. (ANL), Argonne, IL (United States); Chen, Shih-Yew [Argonne National Lab. (ANL), Argonne, IL (United States)

    2013-06-01

    This report documents the new source term model developed and implemented in Version 3 of the RESRAD-OFFSITE code. This new source term model includes: (1) "first order release with transport" option, in which the release of the radionuclide is proportional to the inventory in the primary contamination and the user-specified leach rate is the proportionality constant, (2) "equilibrium desorption release" option, in which the user specifies the distribution coefficient which quantifies the partitioning of the radionuclide between the solid and aqueous phases, and (3) "uniform release" option, in which the radionuclides are released from a constant fraction of the initially contaminated material during each time interval and the user specifies the duration over which the radionuclides are released.

  18. Implementation and Analysis Audio Steganography Used Parity Coding for Symmetric Cryptography Key Delivery

    Directory of Open Access Journals (Sweden)

    Afany Zeinata Firdaus

    2013-12-01

    Full Text Available In today's era of communication, online data transactions are increasing and information is ever more accessible, for both upload and download, so a capable security system is required. Blowfish cryptography combined with audio steganography is one way to secure data so that it cannot be accessed by unauthorized parties. In this study an audio steganography technique is implemented using the parity coding method, which is used to deliver the Blowfish cryptographic key in an Android-based e-commerce application. The results show that the average computation time of the insertion stage (embedding the secret message) is shorter than that of the extraction stage (extracting the secret message). The tests also show that the more characters are embedded, the greater the noise: the highest SNR, 11.9905 dB, is obtained when 506 characters are inserted, while the lowest SNR, 5.6897 dB, is obtained when 2006 characters are inserted. Keywords: audio steganography, parity coding, embedding, extracting, Blowfish cryptography.
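
    Parity coding, the embedding method named in the record above, hides one bit per region of audio samples by adjusting the parity of the region's least significant bits. The following is a simplified sketch (the region size and function names are mine, and the Blowfish key-encryption step is not included):

    ```python
    def embed_parity(samples, bits, region=8):
        """Each group of `region` samples carries one secret bit: if the
        parity of the group's LSBs disagrees with the bit, flip one
        sample's LSB so that it agrees."""
        out = list(samples)
        for i, bit in enumerate(bits):
            start = i * region
            chunk = out[start:start + region]
            parity = sum(s & 1 for s in chunk) % 2
            if parity != bit:
                out[start] ^= 1  # flip a single LSB to fix the parity
        return out

    def extract_parity(samples, nbits, region=8):
        """Recover the hidden bits by recomputing each region's parity."""
        return [sum(s & 1 for s in samples[i * region:(i + 1) * region]) % 2
                for i in range(nbits)]
    ```

    Because at most one LSB changes per region, the distortion per embedded bit is tiny, which is consistent with the SNR degrading only as the number of embedded characters grows.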

  19. DISCURSIVE ACTUALIZATION OF ETHNO-LINGUOCULTURAL CODE IN ENGLISH GLUTTONY

    Directory of Open Access Journals (Sweden)

    Nikishkova Mariya Sergeevna

    2014-11-01

    Full Text Available The article presents an overview of linguistic research on the gastronomic/gluttony communicative environment as an ethnocultural phenomenon from the standpoint of conceptology, discourse study and linguosemiotics. The authors study linguosemiotic encoding/decoding in English gastronomic (gluttony) discourse and the ways in which gastronomic gluttonyms are "immersed" in everyday communication. Anglophone ethnic specificity is revealed, and different ways of forming gluttony texts (including precedent ones) are investigated. The linguosemiotic parameters of ethnocultural (anglophone) gastronomically coded communication are established and their discursive characteristics identified. In English gastronomic communication, the discursive actualization of the ethno-linguocultural code is dynamic in nature; the constitutive features of gastronomic discourse have symbolic (semiotic) foundations and are connected with such semiotic categories as code, encoding and decoding. Food is found to be semiotic in origin, representing a cultural code. The semiosis of the English gastronomic text is regularly filled with codes of traditional "English-likeness" (Roland Barthes's ethnic term) expressed by gluttonyms. The "nationality" code is detected through the names of products specific to certain areas; the national identity of the ethnic code is also highlighted by ways of garnishing and serving dishes and by the characteristics of particular local preparation methods. The authors analyze the "lingualization" of food images, which is ambivalent in character: it is determined, first, by food signs (gluttonyms) that structure the common space of gastronomic discourse and provide it with an ethnic linguocultural food source, and second, by the immersion of the resulting images into a specific ethnic code that is decoded as the gastronomic discourse unfolds. The precedent texts accumulate ethnic information, supplying an adequate gastronomic worldview.

  20. Harmonic hopping, and both punctuated and gradual evolution of acoustic characters in Selasphorus hummingbird tail-feathers.

    Directory of Open Access Journals (Sweden)

    Christopher James Clark

    Full Text Available Models of character evolution often assume a single mode of evolutionary change, such as continuous or discrete change. Here I provide an example in which a character exhibits both types of change. Hummingbirds in the genus Selasphorus produce sound with fluttering tail-feathers during courtship. The ancestral character state within Selasphorus is production of sound with an inner tail-feather, R2, in which the sound usually evolves gradually. Calliope and Allen's Hummingbirds have evolved autapomorphic acoustic mechanisms that involve feather-feather interactions. I develop a source-filter model of these interactions. The 'source' comprises feather(s) that are both necessary and sufficient for sound production, and are aerodynamically coupled to neighboring feathers, which act as filters. Filters are unnecessary or insufficient for sound production, but may evolve to become sources. Allen's Hummingbird has evolved to produce sound with two sources, one with feather R3, another frequency-modulated sound with R4, and their interaction frequencies. Allen's R2 retains the ancestral character state, a ∼1 kHz "ghost" fundamental frequency masked by R3, which is revealed when R3 is experimentally removed. In the ancestor to Allen's Hummingbird, the dominant frequency has 'hopped' to the second harmonic without passing through intermediate frequencies. This demonstrates that although the fundamental frequency of a communication sound may usually evolve gradually, occasional jumps from one character state to another can occur in a discrete fashion. Accordingly, mapping acoustic characters on a phylogeny may produce misleading results if the physical mechanism of production is not known.

  1. Code of practice for the control and safe handling of radioactive sources used for therapeutic purposes (1988)

    International Nuclear Information System (INIS)

    1988-01-01

    This Code is intended as a guide to safe practices in the use of sealed and unsealed radioactive sources and in the management of patients being treated with them. It covers the procedures for the handling, preparation and use of radioactive sources, precautions to be taken for patients undergoing treatment, storage and transport of radioactive sources within a hospital or clinic, and routine testing of sealed sources.

  2. Binary codes storage and data encryption in substrates with single proton beam writing technology

    International Nuclear Information System (INIS)

    Zhang Jun; Zhan Furu; Hu Zhiwen; Chen Lianyun; Yu Zengliang

    2006-01-01

    It has been demonstrated that characters can be written by proton beams in various materials. To contribute to the rapid development of proton beam writing technology, we introduce a new method for binary code storage and data encryption by writing binary codes of characters (BCC) in substrates with single proton beam writing technology. In this study, two kinds of BCC (ASCII BCC and long-bit encrypted BCC) were written in CR-39 by a 2.6 MeV single proton beam. Our results show that, in comparison to directly writing character shapes, writing ASCII BCC turned out to be about six times faster and required about one fourth of the area in the substrate. The approach of writing long-bit encrypted BCC by single proton beams supports preserving confidential information in substrates. Additionally, binary codes fabricated by MeV single proton beams in substrates are more robust than those formed by lasers, since MeV single proton beams can make much deeper pits in the substrates.
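
    Writing ASCII BCC instead of character shapes presupposes the straightforward character-to-bits mapping sketched below. The helper names are illustrative, and the encryption step used for "long-bit encrypted BCC" is not shown:

    ```python
    def to_bcc(text):
        """ASCII binary codes of characters (BCC): 8 bits per character,
        most significant bit first."""
        return "".join(f"{ord(c):08b}" for c in text)

    def from_bcc(bits):
        """Invert the mapping: read the bit string back 8 bits at a time."""
        return "".join(chr(int(bits[i:i + 8], 2))
                       for i in range(0, len(bits), 8))
    ```

    Each character becomes a fixed row of 8 pits or blanks, which is why writing BCC is faster and more compact than tracing the full glyph outline.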

  3. Dual character of Sundarban estuary as a source and sink of CO2 during summer: an investigation of spatial dynamics.

    Science.gov (United States)

    Akhand, Anirban; Chanda, Abhra; Dutta, Sachinandan; Manna, Sudip; Sanyal, Pranabes; Hazra, Sugata; Rao, K H; Dadhwal, V K

    2013-08-01

    A comprehensive attempt has been made to evaluate the diurnal and spatial pattern of CO2 exchange between the atmosphere and water along the estuarine track of the Indian Sundarbans during the two summer months, April and May, 2011. Rigorous field observations were carried out, including hourly measurements of total alkalinity, pH, fugacity of CO2 in ambient air and at the water surface, dissolved oxygen, and chlorophyll a. The estuarine water was found to be rich in total alkalinity and oversaturated with CO2 throughout the diurnal cycle at the two stations situated in the inner and middle estuary, respectively, whereas an entirely reverse situation was observed on the outer fringes. The fugacity of CO2 in water ranged from 152 to 657 μatm during the study period. The percentage of over-saturation in the inner and middle estuary varied from 103 to 168 % and 103 to 176 %, respectively, whereas the degree of under-saturation in the outer estuary lay between 40 and 99 %. Chlorophyll a concentrations were higher in the outer estuary (12.3 ± 2.2 mg m(-3)) than in the middle (6.4 ± 0.6 mg m(-3)) and inner parts (1.6 ± 0.2 mg m(-3)), following a similar decreasing pattern in nutrient availability from the outer to the inner estuary. The sampling stations situated in the inner and middle estuary acted as net sources of 29.69 and 23.62 mg CO2 m(-2) day(-1), respectively, whereas the outer station behaved as a net sink of -33.37 mg CO2 m(-2) day(-1). The study of primary production and community respiration further supports the heterotrophic nature of the estuary in the inner region, while the outer periphery was marked by a dominant autotrophic character. These contrasting results are consistent with the source character of many inner estuaries and the sink character of outer estuaries situated in distal continental shelf areas.

  4. A proposed metamodel for the implementation of object oriented software through the automatic generation of source code

    Directory of Open Access Journals (Sweden)

    CARVALHO, J. S. C.

    2008-12-01

    Full Text Available During software development, one of the most visible risks, and perhaps the biggest implementation obstacle, is time management. All delivery deadlines for software versions must be met, but this is not always possible, sometimes due to delays in coding. This paper presents a metamodel for software implementation, which gives rise to a development tool for automatic generation of source code, in order to make any development pattern transparent to the programmer, significantly reducing the time spent coding the artifacts that make up the software.

  5. Chronos sickness: digital reality in Duncan Jones’s Source Code

    Directory of Open Access Journals (Sweden)

    Marcia Tiemy Morita Kawamoto

    2017-01-01

    Full Text Available http://dx.doi.org/10.5007/2175-8026.2017v70n1p249 The advent of digital technologies has unquestionably affected the cinema. The indexical relation and realistic effect with the photographed world, much praised by André Bazin and Roland Barthes, is just one of the affected aspects. This article discusses cinema in light of the new digital possibilities, reflecting on Steven Shaviro's consideration of "how a nonindexical realism might be possible" (63) and how in fact a new kind of reality, a digital one, might emerge in the science fiction film Source Code (2013) by Duncan Jones.

  6. Coded aperture detector for high precision gamma-ray burst source locations

    International Nuclear Information System (INIS)

    Helmken, H.; Gorenstein, P.

    1977-01-01

    Coded aperture collimators in conjunction with position-sensitive detectors are very useful in the study of transient phenomena because they combine a broad field of view, high sensitivity, and an ability to locate sources precisely. Since the preceding conference, a series of computer simulations of various detector designs has been carried out with the aid of a CDC 6400. Particular emphasis was placed on the development of a unit consisting of a one-dimensional random or periodic collimator in conjunction with a two-dimensional position-sensitive xenon proportional counter. A configuration involving four of these units has been incorporated into the preliminary design study of the Transient Explorer (ATREX) satellite and is applicable to any SAS- or HEAO-type satellite mission. Results of this study, including detector response, fields of view, and source location precision, will be presented.

  7. Supporting the Cybercrime Investigation Process: Effective Discrimination of Source Code Authors Based on Byte-Level Information

    Science.gov (United States)

    Frantzeskou, Georgia; Stamatatos, Efstathios; Gritzalis, Stefanos

    Source code authorship analysis is the particular field that attempts to identify the author of a computer program by treating each program as a linguistically analyzable entity. This is usually based on other undisputed program samples from the same author. There are several cases where the application of such a method could be of major benefit, such as tracing the source of code left in a system after a cyber attack, authorship disputes, proof of authorship in court, etc. In this paper, we present our approach, which is based on byte-level n-gram profiles and is an extension of a method that has been successfully applied to natural language text authorship attribution. We propose a simplified profile and a new similarity measure that is less complicated than the algorithm used in text authorship attribution and seems more suitable for source code identification, since it is better able to deal with very small training sets. Experiments were performed on two different data sets, one with programs written in C++ and the second with programs written in Java. Unlike the traditional language-dependent metrics used by previous studies, our approach can be applied to any programming language with no additional cost. The presented accuracy rates are much better than the best reported results for the same data sets.
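
    The byte-level n-gram approach described above can be illustrated with a minimal sketch. It follows the general idea of simplified profiles (keep only the most frequent n-grams and compare profiles by plain set intersection), but the n-gram size, profile length L, and function names here are illustrative, not the paper's exact parameters:

    ```python
    from collections import Counter

    def profile(source_bytes, n=3, L=1500):
        """Simplified profile: the L most frequent byte-level n-grams
        of a program, with no language-specific parsing at all."""
        grams = Counter(source_bytes[i:i + n]
                        for i in range(len(source_bytes) - n + 1))
        return {g for g, _ in grams.most_common(L)}

    def similarity(profile_a, profile_b):
        """Profile intersection: the number of shared n-grams.  The
        candidate author with the highest score is selected."""
        return len(profile_a & profile_b)
    ```

    Working on raw bytes is what makes the method language independent: the same two functions apply unchanged to C++, Java, or any other source text.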

  8. Phylogeny, character evolution, and biogeography of Cuscuta (dodders; Convolvulaceae) inferred from coding plastid and nuclear sequences.

    Science.gov (United States)

    García, Miguel A; Costea, Mihai; Kuzmina, Maria; Stefanović, Saša

    2014-04-01

    The parasitic genus Cuscuta, containing some 200 species circumscribed traditionally in three subgenera, is nearly cosmopolitan, occurring in a wide range of habitats and hosts. Previous molecular studies, on subgenera Grammica and Cuscuta, delimited major clades within these groups. However, the sequences used were unalignable among subgenera, preventing the phylogenetic comparison across the genus. We conducted a broad phylogenetic study using rbcL and nrLSU sequences covering the morphological, physiological, and geographical diversity of Cuscuta. We used parsimony methods to reconstruct ancestral states for taxonomically important characters. Biogeographical inferences were obtained using statistical and Bayesian approaches. Four well-supported major clades are resolved. Two of them correspond to subgenera Monogynella and Grammica. Subgenus Cuscuta is paraphyletic, with section Pachystigma sister to subgenus Grammica. Previously described cases of strongly supported discordance between plastid and nuclear phylogenies, interpreted as reticulation events, are confirmed here and three new cases are detected. Dehiscent fruits and globose stigmas are inferred as ancestral character states, whereas the ancestral style number is ambiguous. Biogeographical reconstructions suggest an Old World origin for the genus and subsequent spread to the Americas as a consequence of one long-distance dispersal. Hybridization may play an important yet underestimated role in the evolution of Cuscuta. Our results disagree with scenarios of evolution (polarity) previously proposed for several taxonomically important morphological characters, and with their usage and significance. While several cases of long-distance dispersal are inferred, vicariance or dispersal to adjacent areas emerges as the dominant biogeographical pattern.

  9. Development of Coupled Interface System between the FADAS Code and a Source-term Evaluation Code XSOR for CANDU Reactors

    International Nuclear Information System (INIS)

    Son, Han Seong; Song, Deok Yong; Kim, Ma Woong; Shin, Hyeong Ki; Lee, Sang Kyu; Kim, Hyun Koon

    2006-01-01

    An accident prevention system is essential to the industrial security of the nuclear industry, and a more effective prevention system helps to promote a safety culture as well as to gain public acceptance for the nuclear power industry. The FADAS (Following Accident Dose Assessment System), part of the Computerized Advisory System for a Radiological Emergency (CARE) at KINS, is used for protection against nuclear accidents. To make FADAS more effective for CANDU reactors, it is necessary to develop various accident scenarios and a reliable database of source terms. This study introduces the construction of a coupled interface system between FADAS and the source-term evaluation code, aimed at improving the applicability of the CANDU Integrated Safety Analysis System (CISAS) for CANDU reactors

  10. Actor/Character Dualism

    DEFF Research Database (Denmark)

    Riis, Johannes

    2012-01-01

    Our perception of agency may be inherently fallible, and this may explain not only our general awareness of actors when engaged in fictional characters but also the specific case of paradoxical characters...

  11. Survey of source code metrics for evaluating testability of object oriented systems

    OpenAIRE

    Shaheen , Muhammad Rabee; Du Bousquet , Lydie

    2010-01-01

    Software testing is costly in terms of time and funds. Testability is a software characteristic that aims at producing systems easy to test. Several metrics have been proposed to identify the testability weaknesses. But it is sometimes difficult to be convinced that those metrics are really related with testability. This article is a critical survey of the source-code based metrics proposed in the literature for object-oriented software testability. It underlines the necessity to provide test...

  12. NEACRP comparison of source term codes for the radiation protection assessment of transportation packages

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Locke, H.F.; Avery, A.F.

    1994-01-01

    The results for Problems 5 and 6 of the NEACRP code comparison as submitted by six participating countries are presented in summary. These problems concentrate on the prediction of the neutron and gamma-ray sources arising in fuel after a specified irradiation, the fuel being uranium oxide for problem 5 and a mixture of uranium and plutonium oxides for problem 6. In both problems the predicted neutron sources are in good agreement for all participants. For gamma rays, however, there are differences, largely due to the omission of bremsstrahlung in some calculations

  13. Optimal source coding, removable noise elimination, and natural coordinate system construction for general vector sources using replicator neural networks

    Science.gov (United States)

    Hecht-Nielsen, Robert

    1997-04-01

    A new universal one-chart smooth manifold model for vector information sources is introduced. Natural coordinates (a particular type of chart) for such data manifolds are then defined. Uniformly quantized natural coordinates form an optimal vector quantization code for a general vector source. Replicator neural networks (a specialized type of multilayer perceptron with three hidden layers) are then introduced. As properly configured examples of replicator networks approach minimum mean squared error (e.g., via training and architecture adjustment using randomly chosen vectors from the source), these networks automatically develop a mapping which, in the limit, produces natural coordinates for arbitrary source vectors. The new concept of removable noise (a noise model applicable to a wide variety of real-world noise processes) is then discussed. Replicator neural networks, when configured to approach minimum mean squared reconstruction error (e.g., via training and architecture adjustment on randomly chosen examples from a vector source, each with randomly chosen additive removable noise contamination), in the limit eliminate removable noise and produce natural coordinates for the data vector portions of the noise-corrupted source vectors. Considerations regarding selection of the dimension of a data manifold source model and the training/configuration of replicator neural networks are discussed.
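The replicator-network idea above (an autoencoder whose narrow middle layer plays the role of the natural coordinate) can be sketched in a few lines. The architecture, toy data and training loop below are illustrative assumptions, not the paper's configuration: a 2-3-1-3-2 tanh network trained by plain gradient descent to reconstruct points lying on a one-dimensional manifold embedded in the plane.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D manifold embedded in 2-D: points on an arc of the unit circle.
t = rng.uniform(0.0, np.pi, size=(200, 1))
X = np.hstack([np.cos(t), np.sin(t)])            # shape (200, 2)

# Replicator-style autoencoder: 2 -> 3 -> 1 -> 3 -> 2.
# The width-1 middle layer acts as the (approximate) natural coordinate.
sizes = [2, 3, 1, 3, 2]
Ws = [rng.normal(0, 0.5, (a, b)) for a, b in zip(sizes[:-1], sizes[1:])]
bs = [np.zeros(b) for b in sizes[1:]]

def forward(X):
    """Forward pass: tanh on hidden layers, linear output. Returns all activations."""
    acts = [X]
    for i, (W, b) in enumerate(zip(Ws, bs)):
        z = acts[-1] @ W + b
        acts.append(z if i == len(Ws) - 1 else np.tanh(z))
    return acts

def mse(X):
    """Mean squared reconstruction error of the network on X."""
    return float(np.mean((forward(X)[-1] - X) ** 2))

loss0 = mse(X)
lr = 0.05
for _ in range(2000):                            # plain full-batch gradient descent
    acts = forward(X)
    delta = 2.0 * (acts[-1] - X) / X.shape[0]    # gradient at the linear output
    for i in reversed(range(len(Ws))):
        gW = acts[i].T @ delta
        gb = delta.sum(axis=0)
        if i > 0:                                # backpropagate through tanh
            delta = (delta @ Ws[i].T) * (1.0 - acts[i] ** 2)
        Ws[i] -= lr * gW
        bs[i] -= lr * gb

print(loss0, mse(X))                             # reconstruction error decreases
```

With the bottleneck width equal to the manifold dimension (one, here), driving the reconstruction error down is what forces the middle-layer activation toward a coordinate for the arc.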

  14. CHARACTER EDUCATION OF CHILDREN'S PERSPECTIVE IBN QAYYIM AL-JAWZIYYAH (691 H - 752 H

    Directory of Open Access Journals (Sweden)

    Makmudi

    2017-11-01

    Full Text Available The core problem behind the character crisis in children is that belief is not taken as the philosophical basis of education; the implication of this is an excessive euphoria of freedom. A child's success is judged only from the perspective of academic achievement, while the values of ethics, morals, and character receive little serious attention. Hence, even though students excel in intelligence quotient (IQ), in emotional quotient (EQ) they face an alarmingly deepening crisis of character. In this context, character education for children is considered very important for the next generation as an integral part of their life and living. The purpose of this study was to examine the thought of Ibn Qayyim al-Jawziyyah on the concept of character education of children. The study uses library research: data and information were collected by reading, studying and then analyzing literature related to the theme, both primary and secondary sources, using descriptive-analytic content analysis. The results of this research show that the concept of character education of children according to Ibn Qayyim al-Jawziyyah emphasizes five main points: 1. introducing children to the oneness of God, 2. teaching children the principal teachings of religion, 3. teaching and habituating children to good ethics and morals, 4. modeling, 5. meaningful praise and punishment.

  15. Time-dependent anisotropic distributed source capability in transient 3-d transport code tort-TD

    International Nuclear Information System (INIS)

    Seubert, A.; Pautz, A.; Becker, M.; Dagan, R.

    2009-01-01

    The transient 3-D discrete ordinates transport code TORT-TD has been extended to account for time-dependent anisotropic distributed external sources. The extension aims at the simulation of the pulsed neutron source in the YALINA-Thermal subcritical assembly. Since feedback effects are not relevant in this zero-power configuration, this offers a unique opportunity to validate the time-dependent neutron kinetics of TORT-TD with experimental data. The extensions made in TORT-TD to incorporate a time-dependent anisotropic external source are described. The steady state of the YALINA-Thermal assembly and its response to an artificial square-wave source pulse sequence have been analysed with TORT-TD using pin-wise homogenised cross sections in 18 prompt energy groups with P1 scattering order and 8 delayed neutron groups. The results demonstrate the applicability of TORT-TD to subcritical problems with a time-dependent external source. (authors)

  16. Numerical modeling of the Linac4 negative ion source extraction region by 3D PIC-MCC code ONIX

    CERN Document Server

    Mochalskyy, S; Minea, T; Lifschitz, AF; Schmitzer, C; Midttun, O; Steyaert, D

    2013-01-01

    At CERN, a high-performance negative ion (NI) source is required for the 160 MeV H- linear accelerator Linac4. The source is planned to produce 80 mA of H- with an emittance of 0.25 mm·mrad (normalized RMS), which is technically and scientifically very challenging. The optimization of the NI source requires a deep understanding of the underlying physics of the production and extraction of the negative ions. The extraction mechanism from the negative ion source is complex, involving a magnetic filter in order to cool the electron temperature. The ONIX (Orsay Negative Ion eXtraction) code is used to address this problem. ONIX is a self-consistent 3D electrostatic code using a Particle-in-Cell Monte Carlo Collisions (PIC-MCC) approach. It was written to handle the complex boundary conditions between plasma, source walls, and beam formation at the extraction hole. Both the positive extraction potential (25 kV) and the magnetic field map are taken from the experimental set-up under construction at CERN. This contrib...

  17. EchoSeed Model 6733 Iodine-125 brachytherapy source: Improved dosimetric characterization using the MCNP5 Monte Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    Mosleh-Shirazi, M. A.; Hadad, K.; Faghihi, R.; Baradaran-Ghahfarokhi, M.; Naghshnezhad, Z.; Meigooni, A. S. [Center for Research in Medical Physics and Biomedical Engineering and Physics Unit, Radiotherapy Department, Shiraz University of Medical Sciences, Shiraz 71936-13311 (Iran, Islamic Republic of); Radiation Research Center and Medical Radiation Department, School of Engineering, Shiraz University, Shiraz 71936-13311 (Iran, Islamic Republic of); Comprehensive Cancer Center of Nevada, Las Vegas, Nevada 89169 (United States)

    2012-08-15

    This study primarily aimed to obtain the dosimetric characteristics of the Model 6733 ¹²⁵I seed (EchoSeed) with improved precision and accuracy using a more up-to-date Monte Carlo code and data (MCNP5) compared to previously published results, including an uncertainty analysis. Its secondary aim was to compare the results obtained using the MCNP5, MCNP4c2, and PTRAN codes for simulation of this low-energy photon-emitting source. The EchoSeed geometry and chemical compositions together with a published ¹²⁵I spectrum were used to perform dosimetric characterization of this source as per the updated AAPM TG-43 protocol. These simulations were performed in liquid water in order to obtain the clinically applicable dosimetric parameters for this source model. Dose rate constants in liquid water, derived from MCNP4c2 and MCNP5 simulations, were found to be 0.993 cGy·h⁻¹·U⁻¹ (±1.73%) and 0.965 cGy·h⁻¹·U⁻¹ (±1.68%), respectively. Overall, the MCNP5-derived radial dose and 2D anisotropy function results were generally closer to the measured data (within ±4%) than those of MCNP4c and the published data for the PTRAN code (Version 7.43), while the opposite was seen for the dose rate constant. The generally improved MCNP5 Monte Carlo simulation may be attributed to a more recent and accurate cross-section library. However, some of the data points in the results obtained from the above-mentioned Monte Carlo codes showed no statistically significant differences. Derived dosimetric characteristics in liquid water are provided for clinical applications of this source model.

  18. Two-terminal video coding.

    Science.gov (United States)

    Yang, Yang; Stanković, Vladimir; Xiong, Zixiang; Zhao, Wei

    2009-03-01

    Following recent works on the rate region of the quadratic Gaussian two-terminal source coding problem and limit-approaching code designs, this paper examines multiterminal source coding of two correlated, i.e., stereo, video sequences to save the sum rate over independent coding of both sequences. Two multiterminal video coding schemes are proposed. In the first scheme, the left sequence of the stereo pair is coded by H.264/AVC and used at the joint decoder to facilitate Wyner-Ziv coding of the right video sequence. The first I-frame of the right sequence is successively coded by H.264/AVC Intracoding and Wyner-Ziv coding. An efficient stereo matching algorithm based on loopy belief propagation is then adopted at the decoder to produce pixel-level disparity maps between the corresponding frames of the two decoded video sequences on the fly. Based on the disparity maps, side information for both motion vectors and motion-compensated residual frames of the right sequence are generated at the decoder before Wyner-Ziv encoding. In the second scheme, source splitting is employed on top of classic and Wyner-Ziv coding for compression of both I-frames to allow flexible rate allocation between the two sequences. Experiments with both schemes on stereo video sequences using H.264/AVC, LDPC codes for Slepian-Wolf coding of the motion vectors, and scalar quantization in conjunction with LDPC codes for Wyner-Ziv coding of the residual coefficients give a slightly lower sum rate than separate H.264/AVC coding of both sequences at the same video quality.
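The Slepian-Wolf component of such schemes is syndrome-based binning: the encoder transmits only the syndrome of the source word, and the decoder combines it with correlated side information. The paper uses LDPC codes; as a toy sketch of the same mechanism, the small [7,4] Hamming code with a one-bit correlation model (both assumptions for illustration only):

```python
import numpy as np

# Parity-check matrix of the [7,4] Hamming code; column j-1 encodes j in binary.
H = np.array([[(j >> i) & 1 for j in range(1, 8)] for i in range(3)])

def encode(x):
    """Slepian-Wolf encoder: send only the 3-bit syndrome of the 7-bit source."""
    return H @ x % 2

def decode(s, y):
    """Decoder: side info y differs from the source x in at most one bit
    (the assumed correlation model). The syndrome difference H(x XOR y)
    locates the differing position, exactly as in Hamming decoding."""
    d = (s + H @ y) % 2                    # s XOR syndrome(y) = H (x XOR y)
    pos = int(d[0] + 2 * d[1] + 4 * d[2])  # 0 means y already equals x
    x_hat = y.copy()
    if pos:
        x_hat[pos - 1] ^= 1
    return x_hat

rng = np.random.default_rng(1)
x = rng.integers(0, 2, 7)
y = x.copy()
y[3] ^= 1                                  # correlated side info: one bit flipped
assert np.array_equal(decode(encode(x), y), x)
print("recovered:", decode(encode(x), y))
```

The encoder sends 3 bits instead of 7, yet the decoder recovers x exactly; LDPC-based Slepian-Wolf coding scales the same syndrome idea to long blocks and soft side information.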

  19. The European source-term evaluation code ASTEC: status and applications, including CANDU plant applications

    International Nuclear Information System (INIS)

    Van Dorsselaere, J.P.; Giordano, P.; Kissane, M.P.; Montanelli, T.; Schwinges, B.; Ganju, S.; Dickson, L.

    2004-01-01

    Research on light-water reactor severe accidents (SA) is still required in a limited number of areas in order to confirm accident-management plans. Thus, 49 European organizations have linked their SA research in a durable way through SARNET (Severe Accident Research and management NETwork), part of the European 6th Framework Programme. One goal of SARNET is to consolidate the integral code ASTEC (Accident Source Term Evaluation Code, developed by IRSN and GRS) as the European reference tool for safety studies; SARNET efforts include extending the application scope to reactor types other than PWR (including VVER) such as BWR and CANDU. ASTEC is used in IRSN's Probabilistic Safety Analysis level 2 of 900 MWe French PWRs. An earlier version of ASTEC's SOPHAEROS module, including improvements by AECL, is being validated as the Canadian Industry Standard Toolset code for FP-transport analysis in the CANDU Heat Transport System. Work with ASTEC has also been performed by Bhabha Atomic Research Centre, Mumbai, on IPHWR containment thermal hydraulics. (author)

  20. Vector Network Coding

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L X L coding matrices that play a similar role as coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector co...
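A minimal sketch of the vector-coding operation described above. The L×L coding matrices here are hypothetical, chosen by hand only so that the overall transfer matrix is invertible over GF(2); the paper's algorithms select such matrices systematically.

```python
import numpy as np

L = 2  # packets are length-L vectors over GF(2)

def gf2_inv(M):
    """Invert a square 0/1 matrix over GF(2) by Gauss-Jordan elimination."""
    n = M.shape[0]
    A = np.concatenate([M % 2, np.eye(n, dtype=int)], axis=1)
    for col in range(n):
        piv = next(r for r in range(col, n) if A[r, col])  # raises if singular
        A[[col, piv]] = A[[piv, col]]
        for r in range(n):
            if r != col and A[r, col]:
                A[r] ^= A[col]
    return A[:, n:]

rng = np.random.default_rng(2)
v1, v2 = rng.integers(0, 2, L), rng.integers(0, 2, L)  # two source packets

# Intermediate nodes multiply incoming vectors by L x L coding matrices
# (the vector analogue of scalar coding coefficients) and XOR the results.
I = np.eye(L, dtype=int)
B = np.array([[0, 1], [1, 1]])       # an invertible mixing matrix
y1 = (I @ v1 + B @ v2) % 2           # one outgoing edge carries v1 + B v2
y2 = (I @ v1 + I @ v2) % 2           # the other carries a plain XOR

# The receiver stacks the edge coding matrices into a 2L x 2L transfer
# matrix and inverts it over GF(2) to recover both source packets.
T = np.block([[I, B], [I, I]])
recovered = gf2_inv(T) @ np.concatenate([y1, y2]) % 2
assert np.array_equal(recovered[:L], v1) and np.array_equal(recovered[L:], v2)
```

Decodability reduces to the transfer matrix being invertible, which is exactly the condition the joint field-size/coefficient optimization must guarantee.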

  1. Believable Characters

    Science.gov (United States)

    El-Nasr, Magy Seif; Bishko, Leslie; Zammitto, Veronica; Nixon, Michael; Vasiliakos, Athanasios V.; Wei, Huaxin

    The interactive entertainment industry is one of the fastest growing industries in the world. In 1996, the U.S. entertainment software industry reported $2.6 billion in sales revenue; by 2007 this figure had more than tripled, yielding $9.5 billion in revenues [1]. In addition, gamers, the target market for interactive entertainment products, now reach beyond the traditional 8- to 34-year-old male to include women, Hispanics, and African Americans [2]. This trend has been observed in several markets, including Japan, China, Korea, and India, which has just published its first international AAA title (defined as a high-quality game with a high budget), a 3D third-person action game: Ghajini - The Game [3]. The topic of believable characters is becoming a central issue when designing and developing games for today's game industry. While narrative and character were once considered secondary to game mechanics, games are currently evolving to integrate characters, narrative, and drama into their design. One can see this pattern in the emergence of games like Assassin's Creed (published by Ubisoft 2008), Hotel Dusk (published by Nintendo 2007), and the Prince of Persia series (published by Ubisoft), which emphasized character and narrative as part of their design.

  2. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    This essay studies the source code of an artwork from a software studies perspective. By examining code in a way that comes close to the approach of critical code studies (Marino, 2006), I trace the network artwork Pupufu (Lin, 2009) to understand various real-time approaches to social media platforms (MSN, Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms' interfaces. These are important to understanding the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork's production, I would like to explore the materiality of code that goes beyond the technical...

  3. Nuclear structure references coding manual

    International Nuclear Information System (INIS)

    Ramavataram, S.; Dunford, C.L.

    1984-02-01

    This manual is intended as a guide for Nuclear Structure References (NSR) compilers. The basic conventions followed at the National Nuclear Data Center (NNDC), which are compatible with the maintenance, updating and retrieval of the Nuclear Structure References (NSR) file, are outlined. The structure of the NSR file, such as the valid record identifiers, record contents and text fields, as well as the major topics for which [KEYWORDS] are prepared, is enumerated. Relevant comments regarding a new entry into the NSR file, assignment of [KEYNO ], generation of [SELECTRS] and linkage characteristics are also given. A brief definition of the keyword abstract is given, followed by specific examples; for each TOPIC, the criteria for inclusion of an article as an entry into the NSR file as well as the coding procedures are described. Authors submitting articles to journals which require keyword abstracts should follow the illustrations. The scope of the literature covered at the NNDC and its categorization into primary and secondary sources, etc., are also discussed. Useful information regarding permitted character sets, recommended abbreviations, etc. is given

  4. SCRIC: a code dedicated to the detailed emission and absorption of heterogeneous NLTE plasmas; application to xenon EUV sources

    International Nuclear Information System (INIS)

    Gaufridy de Dortan, F. de

    2006-01-01

    Nearly all spectral opacity codes for LTE and NLTE plasmas rely on approximate configuration modelling, or even supra-configuration modelling, for mid-Z plasmas. But in some cases configuration interaction (both relativistic and non-relativistic) induces dramatic changes in spectral shapes. We propose here a new detailed emissivity code with configuration mixing to allow a realistic description of complex mid-Z plasmas. A collisional-radiative calculation, based on precise HULLAC energies and cross sections, determines the populations. Detailed emissivities and opacities are then calculated, and the radiative transfer equation is solved for wide inhomogeneous plasmas. This code is able to cope rapidly with very large amounts of atomic data. It is therefore possible to use complex hydrodynamic files, even on personal computers, in a very limited time. We used this code for comparison with xenon EUV sources within the framework of nano-lithography developments. It appears that configuration mixing strongly shifts satellite lines and must be included in the description of these sources to enhance their efficiency. (author)

  5. A cognitive network for oracle bone characters related to animals

    Science.gov (United States)

    Dress, Andreas; Grünewald, Stefan; Zeng, Zhenbing

    2016-01-01

    In this paper, we present an analysis of oracle bone characters for animals from a “cognitive” point of view. After some general remarks on oracle-bone characters presented in Sec. 1 and a short outline of the paper in Sec. 2, we collect various oracle-bone characters for animals from published resources in Sec. 3. In the next section, we begin analyzing a group of 60 ancient animal characters from www.zdic.net, a highly acclaimed internet dictionary of Chinese characters that is strictly based on historical sources, and introduce five categories of specific features regarding their (graphical) structure that will be used in Sec. 5 to associate corresponding feature vectors to these characters. In Sec. 6, these feature vectors will be used to investigate their dissimilarity in terms of a family of parameterized distance measures. And in the last section, we apply the SplitsTree method as encoded in the NeighborNet algorithms to construct a corresponding family of dissimilarity-based networks with the intention of elucidating how the ancient Chinese might have perceived the “animal world” in the late bronze age and to demonstrate that these pictographs reflect an intuitive understanding of this world and its inherent structure that predates its classification in the oldest surviving Chinese encyclopedia from approximately the third century BC, the Er Ya, as well as similar classification systems in the West by one to two millennia. We also present an English dictionary of 70 oracle bone characters for animals in Appendix A. In Appendix B, we list various variants of animal characters that were published in the Jia Gu Wen Bian (cf. 甲骨文编, A Complete Collection of Oracle Bone Characters, edited by the Institute of Archaeology of the Chinese Academy of Social Sciences, published by the Zhonghua Book Company in 1965). We recall the frequencies of the 521 most frequent oracle bone characters in Appendix C as reported in [T. Chen, Yin-Shang Jiaguwen Zixing
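The feature-vector-to-dissimilarity step described above can be illustrated as follows. The characters, 0/1 features and weights below are hypothetical stand-ins for the paper's five structural categories; only the construction of a parameterised dissimilarity matrix (the input a NeighborNet-style method would consume) is shown.

```python
import numpy as np

# Hypothetical 0/1 feature vectors for four oracle-bone animal characters,
# one coordinate per assumed feature category (head, limbs, tail, horns, posture).
# The actual categories and codings used in the paper differ.
chars = {
    "horse": np.array([1, 1, 1, 0, 1]),
    "deer":  np.array([1, 1, 1, 1, 1]),
    "bird":  np.array([0, 1, 1, 0, 0]),
    "fish":  np.array([0, 0, 1, 0, 0]),
}

def d(x, y, w):
    """Weighted Hamming dissimilarity; the weight vector w parameterises the
    family by how strongly each feature category counts."""
    return float(np.sum(w * np.abs(x - y)))

w = np.array([1.0, 1.0, 1.0, 2.0, 0.5])   # e.g. emphasise the horn feature
names = list(chars)
D = np.array([[d(chars[a], chars[b], w) for b in names] for a in names])
print(D)  # symmetric, zero-diagonal matrix, ready for a NeighborNet-style method
```

Varying w sweeps out the family of parameterised distance measures, and each member yields a different dissimilarity matrix and hence a different network.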

  6. Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding

    Science.gov (United States)

    Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.

    2016-03-01

    In this paper, we investigate joint design of quasi-cyclic low-density-parity-check (QC-LDPC) codes for coded cooperation system with joint iterative decoding in the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles including both type I and type II are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation effectively combines the cooperation gain and the channel coding gain, and outperforms coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than those of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.
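The base-matrix/exponent-matrix construction mentioned above expands each exponent entry into a Z×Z circulant permutation (or all-zero) block. A minimal sketch with an illustrative exponent matrix, not one of the paper's jointly designed codes (which additionally choose the shifts so that both types of girth-4 cycles are cancelled):

```python
import numpy as np

def expand(E, Z):
    """Expand a QC-LDPC exponent matrix E into its binary parity-check matrix.
    Entry -1 -> Z x Z zero block; entry s >= 0 -> identity cyclically shifted by s."""
    I = np.eye(Z, dtype=int)
    blocks = [[np.zeros((Z, Z), dtype=int) if s < 0 else np.roll(I, s, axis=1)
               for s in row] for row in E]
    return np.block(blocks)

# A small illustrative exponent (shift) matrix; real designs pick the shifts
# to satisfy girth constraints.
E = [[0, 1, -1],
     [2, -1, 0]]
H = expand(E, Z=4)
print(H.shape)   # a 2x3 exponent matrix with Z=4 expands to an 8x12 matrix
```

Because every nonneg­ative entry contributes exactly one 1 per row and column of its block, the expanded H inherits the sparsity pattern of the exponent matrix.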

  7. A Character Segmentation Proposal for High-Speed Visual Monitoring of Expiration Codes on Beverage Cans

    Directory of Open Access Journals (Sweden)

    José C. Rodríguez-Rodríguez

    2016-04-01

    Full Text Available Expiration date labels are ubiquitous in the food industry. With the passage of time, almost any food becomes unhealthy, even when well preserved. The expiration date is estimated from the type and the manufacture/packaging time of each particular food unit. This date is then printed on the container so it is available to the end user at the time of consumption. MONICOD (MONItoring of CODes), an industrial validator of expiration codes, allows the expiration code printed on a drink can to be read. This verification occurs immediately after printing. MONICOD faces difficulties due to the high printing rate (35 cans per second) and problematic lighting caused by the metallic surface on which the code is printed. This article describes a solution that allows MONICOD to extract shapes and presents quantitative results for speed and quality.
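MONICOD's actual high-speed pipeline is not reproduced in the abstract; as a generic illustration of the character-segmentation step on a binarised code line, here is a projection-profile sketch (the synthetic image and the ink/background convention are assumptions, not the system's method):

```python
import numpy as np

def segment_columns(binary_img):
    """Split a binarised text-line image (1 = ink) into character spans using
    the vertical projection profile: all-blank columns separate characters."""
    profile = binary_img.sum(axis=0)            # amount of ink per column
    spans, start = [], None
    for x, ink in enumerate(profile):
        if ink and start is None:               # entering a glyph
            start = x
        elif not ink and start is not None:     # leaving a glyph
            spans.append((start, x))
            start = None
    if start is not None:                       # glyph touching the right edge
        spans.append((start, len(profile)))
    return spans

# Tiny synthetic line: two 'characters' separated by blank columns.
img = np.zeros((5, 12), dtype=int)
img[:, 1:4] = 1     # first glyph occupies columns 1..3
img[:, 7:10] = 1    # second glyph occupies columns 7..9
print(segment_columns(img))   # [(1, 4), (7, 10)]
```

Projection profiles are attractive at 35 cans per second because they touch each pixel once; the hard part, as the abstract notes, is making the binarisation robust against the metallic surface.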

  8. Ready, steady… Code!

    CERN Multimedia

    Anaïs Schaeffer

    2013-01-01

    This summer, CERN took part in the Google Summer of Code programme for the third year in succession. Open to students from all over the world, this programme leads to very successful collaborations for open source software projects.   Image: GSoC 2013. Google Summer of Code (GSoC) is a global programme that offers student developers grants to write code for open-source software projects. Since its creation in 2005, the programme has brought together some 6,000 students from over 100 countries worldwide. The students selected by Google are paired with a mentor from one of the participating projects, which can be led by institutes, organisations, companies, etc. This year, CERN PH Department’s SFT (Software Development for Experiments) Group took part in the GSoC programme for the third time, submitting 15 open-source projects. “Once published on the Google Summer of Code website (in April), the projects are open to applications,” says Jakob Blomer, one of the o...

  9. Metamorphosed characters in dreams: constraints of conceptual structure and amount of theory of mind.

    Science.gov (United States)

    Schweickert, Richard; Xi, Zhuangzhuang

    2010-05-01

    Dream reports from 21 dreamers in which a metamorphosis of a person-like entity or animal occurred were coded for characters and animals and for inner states attributed to them (Theory of Mind). In myths and fairy tales, Kelly and Keil (1985) found that conscious beings (people, gods) tend to be transformed into entities nearby in the conceptual structure of Keil (1979). This also occurred in dream reports, but perceptual nearness seemed more important than conceptual nearness. In dream reports, most inanimate objects involved in metamorphoses with person-like entities were objects such as statues that ordinarily resemble people physically and, moreover, represent people. A metamorphosis of a person-like entity or animal did not lead to an increase in the amount of Theory of Mind attribution. We propose that a character-line starts when a character enters a dream; properties and Theory of Mind attributions tend to be preserved along the line, regardless of whether metamorphoses occur on it. Copyright © 2009 Cognitive Science Society, Inc.

  10. Modelling RF sources using 2-D PIC codes

    Energy Technology Data Exchange (ETDEWEB)

    Eppley, K.R.

    1993-03-01

    In recent years, many types of RF sources have been successfully modelled using 2-D PIC codes. Both cross-field devices (magnetrons, cross-field amplifiers, etc.) and pencil-beam devices (klystrons, gyrotrons, TWTs, lasertrons, etc.) have been simulated. All these devices involve the interaction of an electron beam with an RF circuit. For many applications, the RF structure may be approximated by an equivalent circuit, which appears in the simulation as a boundary condition on the electric field ("port approximation"). The drive term for the circuit is calculated from the energy transfer between beam and field in the drift space. For some applications it may be necessary to model the actual geometry of the structure, although this is more expensive. One problem not entirely solved is how to accurately model in 2-D the coupling to an external waveguide. Frequently this is approximated by a radial transmission line, but this sometimes yields incorrect results. We also discuss issues in modelling the cathode and injecting the beam into the PIC simulation.

  12. Modelling RF sources using 2-D PIC codes

    International Nuclear Information System (INIS)

    Eppley, K.R.

    1993-03-01

    In recent years, many types of RF sources have been successfully modelled using 2-D PIC codes. Both cross-field devices (magnetrons, cross-field amplifiers, etc.) and pencil-beam devices (klystrons, gyrotrons, TWTs, lasertrons, etc.) have been simulated. All these devices involve the interaction of an electron beam with an RF circuit. For many applications, the RF structure may be approximated by an equivalent circuit, which appears in the simulation as a boundary condition on the electric field ("port approximation"). The drive term for the circuit is calculated from the energy transfer between beam and field in the drift space. For some applications it may be necessary to model the actual geometry of the structure, although this is more expensive. One problem not entirely solved is how to accurately model in 2-D the coupling to an external waveguide. Frequently this is approximated by a radial transmission line, but this sometimes yields incorrect results. We also discuss issues in modelling the cathode and injecting the beam into the PIC simulation

  13. Validation of the Open Source Code_Aster Software Used in the Modal Analysis of the Fluid-filled Cylindrical Shell

    Directory of Open Access Journals (Sweden)

    B D. Kashfutdinov

    2017-01-01

    Full Text Available The paper deals with a modal analysis of an elastic cylindrical shell with a clamped bottom, partially filled with fluid, in the open source Code_Aster software using the finite element method. Natural frequencies and modes obtained in Code_Aster are compared to experimental and theoretical data. The aim of this paper is to show that Code_Aster has all the necessary tools for solving fluid-structure interaction problems and can be used in industrial projects as an alternative to commercial software. The freely available pre- and post-processors with a graphical user interface that are compatible with Code_Aster allow creating complex models and processing the results. The paper presents new validation results for the open source Code_Aster software used to calculate the small natural modes of a cylindrical shell partially filled with a non-viscous, compressible, barotropic fluid in a gravity field. The displacement of the middle surface of the thin shell and the displacement of the fluid relative to the equilibrium position are described by a coupled hydro-elasticity problem. The fluid flow is considered potential. The finite element method (FEM) is used, and the features of the computational model are described. The resolving equation has symmetric block matrices. To compare the results, the well-known modal analysis problem of a cylindrical shell with a flat non-deformable bottom, filled with a compressible fluid, is discussed. The numerical parameters of the scheme were chosen in accordance with well-known experimental and analytical data. Three cases were considered: an empty, a partially filled and a fully filled cylindrical shell. The frequencies from Code_Aster are in good agreement with those obtained in experiments, in the analytical solution, and by FEM in other software; the difference from experiment and from the analytical solution is approximately the same.
The obtained results extend a set of validation tests for

  14. A cognitive network for oracle-bone characters related to animals

    Science.gov (United States)

    Dress, Andreas; Grünewald, Stefan; Zeng, Zhenbing

    This paper is dedicated to HAO Bailin on the occasion of his eightieth birthday, the great scholar and very good friend who never tired of introducing us to the wonderful and complex intricacies of Chinese culture and history. In this paper, we present an analysis of oracle-bone characters for animals from a `cognitive' point of view. After some general remarks on oracle-bone characters presented in Section 1 and a short outline of the paper in Section 2, we collect various oracle-bone characters for animals from published resources in Section 3. In the next section, we begin analysing a group of 60 ancient animal characters from www.zdic.net, a highly acclaimed internet dictionary of Chinese characters that is strictly based on historical sources, and introduce five categories of specific features regarding their (graphical) structure that will be used in Section 5 to associate corresponding feature vectors to these characters. In Section 6, these feature vectors will be used to investigate their dissimilarity in terms of a family of parameterised distance measures. And in the last section, we apply the SplitsTree method as encoded in the NeighbourNet algorithms to construct a corresponding family of dissimilarity-based networks, with the intention of elucidating how the ancient Chinese might have perceived the `animal world' in the late bronze age, and of demonstrating that these pictographs reflect an intuitive understanding of this world and its inherent structure that predates its classification in the oldest surviving Chinese encyclopedia from approximately the 3rd century BC, the ErYa, as well as similar classification systems in the West, by one to two millennia. We also present an English dictionary of 70 oracle-bone characters for animals in Appendix 1. In Appendix 2, we list various variants of animal characters that were published in the Jia Gu Wen Bian (cf. 
, A Complete Collection of Oracle Bone Characters, edited by the Institute of Archaeology of the Chinese
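
The feature-vector dissimilarity idea summarised above can be sketched as follows; the feature categories, weights, and character vectors below are invented placeholders, not values from the paper:

```python
# Sketch: dissimilarity between characters encoded as feature vectors
# (cf. Sections 5-6 of the abstract). A parameterised, weighted mismatch
# count over the five feature categories; all data here is illustrative.

def dissimilarity(u, v, weights):
    """Weighted mismatch count between two equal-length feature vectors."""
    assert len(u) == len(v) == len(weights)
    return sum(w for a, b, w in zip(u, v, weights) if a != b)

# Toy binary feature vectors for three hypothetical animal characters.
horse = [1, 1, 0, 1, 0]
ox    = [1, 0, 0, 1, 0]
fish  = [0, 0, 1, 0, 1]
w     = [1.0, 0.5, 1.0, 1.0, 0.5]   # one weight per feature category

d_ho = dissimilarity(horse, ox, w)    # differs in one feature of weight 0.5
d_hf = dissimilarity(horse, fish, w)  # differs in all five features
```

The resulting pairwise distance matrix is exactly the kind of input a NeighbourNet implementation (e.g. in SplitsTree) takes to build a dissimilarity-based network.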

  15. A REVIEW: OPTICAL CHARACTER RECOGNITION

    OpenAIRE

    Swati Tomar*1 & Amit Kishore2

    2018-01-01

    This paper presents a detailed review of the field of Optical Character Recognition. Various techniques that have been proposed to realize the core of character recognition in an optical character recognition system are examined. Numerous studies and papers describe techniques for converting textual content from a paper document into machine-readable form. Optical character recognition is a process by which the computer automatically understands the image of handwritten ...

  16. Knowing Chinese character grammar.

    Science.gov (United States)

    Myers, James

    2016-02-01

    Chinese character structure has often been described as representing a kind of grammar, but the notion of character grammar has hardly been explored. Patterns in character element reduplication are particularly grammar-like, displaying discrete combinatoriality, binarity, phonology-like final prominence, and potentially the need for symbolic rules (X→XX). To test knowledge of these patterns, Chinese readers were asked to judge the acceptability of fake characters varying both in grammaticality (obeying or violating reduplication constraints) and in lexicality (of the reduplicative configurations). While lexical knowledge was important (lexicality improved acceptability and grammatical configurations were accepted more quickly when also lexical), grammatical knowledge was important as well, with grammaticality improving acceptability equally for lexical and nonlexical configurations. Acceptability was also higher for more frequent reduplicative elements, suggesting that the reduplicative configurations were decomposed. Chinese characters present an as-yet untapped resource for exploring fundamental questions about the nature of the human capacity for grammar. Copyright © 2015 Elsevier B.V. All rights reserved.
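
The symbolic rule X→XX discussed above can be illustrated with a toy check of its two core conditions, discrete identity of the repeated element and binarity; the element names are placeholders:

```python
# Toy illustration of the reduplication rule X -> XX: a configuration is
# "grammatical" here iff it consists of exactly two identical elements.
# This checks only binarity and identity, not the phonology-like
# final-prominence pattern the abstract also mentions.

def is_reduplicative(config):
    """True iff config matches the rule X -> XX (exactly two equal parts)."""
    return len(config) == 2 and config[0] == config[1]

print(is_reduplicative(["木", "木"]))   # True: binary reduplication
print(is_reduplicative(["木", "火"]))   # False: the parts differ
print(is_reduplicative(["木"] * 3))     # False: binarity violated
```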

  17. Character Apps for Children's Snacks: Effects of Character Awareness on Snack Selection and Consumption Patterns.

    Science.gov (United States)

    Putnam, Marisa M; Cotto, Caroline E; Calvert, Sandra L

    2018-04-01

    Media characters are used to market snacks that are typically of poor nutritional value, which has been linked to childhood obesity. This study examines whether children's snack selections and consumption patterns are influenced by an app depicting a popular children's media character, as well as the role that children's awareness of the character plays. The results can increase our understanding of how to encourage healthier snack selection and consumption in newer game-based marketing venues, such as apps. Four- and 5-year-old children (N = 132) played a bowling game on an iPad with no character or with a character holding either healthier or unhealthy snacks. After app-play, children selected and consumed healthier or unhealthy snacks. Children's awareness of the character was measured by children's verbalizations of the character's name during or after app-play. An ordered logistic regression found no significant effect of treatment conditions compared with the control group. Within treatment conditions, awareness of the character led to selection and consumption of more healthy snacks in the healthier condition (odds ratio β = 10.340, P = 0.008), and of unhealthy snacks in the unhealthy condition (odds ratio β = 0.228, P = 0.033), but children were unaware that the character influenced their decisions. Results suggest that young children will choose and consume healthier, not just unhealthy, products when they are aware that a popular character in an app is associated with the snack, potentially leading to healthier eating patterns.

  18. Character Education Based on Religious Values: an Islamic Perspective

    Directory of Open Access Journals (Sweden)

    Ismail Sukardi

    2016-09-01

    Full Text Available Character education in Indonesia has become a non-negotiable necessity. Numerous cases of crime and moral deviation make it evident that the character of many citizens is already at an alarming stage. Therefore, from the beginning, national education has aimed not only at producing intelligent and skilled people, but also people of noble character. This is realized through the introduction of 18 core characters in schools (religious, honest, disciplined, tolerant, and so on). In the Islamic perspective, character education is paired with akhlak (Islamic ethics) education. Its important characteristics include: it is sourced from the Quran and Hadith; the Prophet Muhammad serves as the role model; it prioritizes mental-spiritual methods (management of the soul, habituation, exemplary conduct, and a healthy environment); and it is simultaneous, in that the three centres of education, namely schools, families, and communities, should play their roles in synergy. The government and the mass media also play a role in supporting character education.

  19. Neutron spallation source and the Dubna cascade code

    CERN Document Server

    Kumar, V; Goel, U; Barashenkov, V S

    2003-01-01

    Neutron multiplicity per incident proton, n/p, in collisions of a high energy proton beam with voluminous Pb and W targets has been estimated using the Dubna cascade code and compared with the available experimental data for the purpose of benchmarking the code. Contributions of various atomic and nuclear processes to heat production and the isotopic yield of secondary nuclei are also estimated, to assess the heat and radioactivity conditions of the targets. Results obtained from the code show excellent agreement with the experimental data at beam energies E < 1.2 GeV and differ by up to 25% at higher energies. (author)

  20. A Missing Piece of the Contemporary Character Education Puzzle: The Individualisation of Moral Character

    Science.gov (United States)

    Chen, Yi-Lin

    2013-01-01

    The different sorts of virtuous people who display various virtues to a remarkable degree have brought the issue of individualisation of moral character to the forefront. It signals a more personal dimension of character development which is notoriously ignored in the current discourse on character education. The case is made that since in…

  1. Bar code usage in nuclear materials accountability

    International Nuclear Information System (INIS)

    Mee, W.T.

    1983-01-01

    The Oak Ridge Y-12 Plant began investigating the use of automated data collection devices in 1979. At that time, bar code and optical-character-recognition (OCR) systems were reviewed with the purpose of entering data directly into DYMCAS (Dynamic Special Nuclear Materials Control and Accountability System). Both of these systems appeared applicable; however, other automated devices already employed for production control made implementing bar code and OCR seem improbable. When the DYMCAS was placed on line for nuclear material accountability, a decision was made to consider bar codes for physical inventory listings. For the past several months a development program has been underway to use a bar code device to collect and input data to the DYMCAS on the uranium recovery operations. Programs have been completed and tested, and are being employed to ensure that data will be compatible and useful. Bar code implementation, and the expansion of its use to all nuclear material inventory activity in Y-12, is presented

  2. Character Issues: Reality Character Problems and Solutions through Education in Indonesia

    Science.gov (United States)

    Saidek, Abdul Rahim; Islami, Raisul; Abdoludin

    2016-01-01

    Weak character education gives rise to problems such as corruption, fights between students, free sex, drugs, and rape/abortion, indicating that the character education of the nation must be improved and must be the concern of all parties: the nation's leaders, law enforcement officers, educators, religious leaders, and other groups. There are two…

  3. ANEMOS: A computer code to estimate air concentrations and ground deposition rates for atmospheric nuclides emitted from multiple operating sources

    Energy Technology Data Exchange (ETDEWEB)

    Miller, C.W.; Sjoreen, A.L.; Begovich, C.L.; Hermann, O.W.

    1986-11-01

    This code estimates concentrations in air and ground deposition rates for Atmospheric Nuclides Emitted from Multiple Operating Sources. ANEMOS is one component of an integrated Computerized Radiological Risk Investigation System (CRRIS) developed for the US Environmental Protection Agency (EPA) for use in performing radiological assessments and in developing radiation standards. The concentrations and deposition rates calculated by ANEMOS are used in subsequent portions of the CRRIS for estimating doses and risks to man. The calculations made in ANEMOS are based on the use of a straight-line Gaussian plume atmospheric dispersion model with both dry and wet deposition parameter options. The code will accommodate a ground-level or elevated point and area source or windblown source. Adjustments may be made during the calculations for surface roughness, building wake effects, terrain height, wind speed at the height of release, the variation in plume rise as a function of downwind distance, and the in-growth and decay of daughter products in the plume as it travels downwind. ANEMOS can also accommodate multiple particle sizes and clearance classes, and it may be used to calculate the dose from a finite plume of gamma-ray-emitting radionuclides passing overhead. The output of this code is presented for 16 sectors of a circular grid. ANEMOS can calculate both the sector-average concentrations and deposition rates at a given set of downwind distances in each sector and the average of these quantities over an area within each sector bounded by two successive downwind distances. ANEMOS is designed to be used primarily for continuous, long-term radionuclide releases. This report describes the models used in the code, their computer implementation, the uncertainty associated with their use, and the use of ANEMOS in conjunction with other codes in the CRRIS. A listing of the code is included in Appendix C.
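
The straight-line Gaussian plume model at the heart of ANEMOS can be sketched as follows; the parameter values are illustrative, the dispersion coefficients are passed in directly rather than derived from stability class and distance, and the deposition, decay, and plume-rise adjustments described above are omitted:

```python
import math

# Minimal sketch of a straight-line Gaussian plume with ground reflection,
# the dispersion model ANEMOS is built around. A real assessment code adds
# dry/wet deposition, daughter in-growth/decay, and terrain corrections.

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Concentration at crosswind offset y and height z downwind.

    q: source strength (e.g. Bq/s), u: wind speed at release height (m/s),
    h: effective release height (m), sigma_y/sigma_z: dispersion (m).
    """
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2))
                + math.exp(-(z + h)**2 / (2 * sigma_z**2)))  # ground reflection
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level centreline concentration for an elevated release.
c_centre = gaussian_plume(q=1e9, u=5.0, y=0.0, z=0.0,
                          h=50.0, sigma_y=80.0, sigma_z=40.0)
```

Sector-average output of the kind ANEMOS reports would integrate this expression over the crosswind extent of each of the 16 sectors.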

  5. A statistical–mechanical view on source coding: physical compression and data compression

    International Nuclear Information System (INIS)

    Merhav, Neri

    2011-01-01

    We draw a certain analogy between the classical information-theoretic problem of lossy data compression (source coding) of memoryless information sources and the statistical–mechanical behavior of a certain model of a chain of connected particles (e.g. a polymer) that is subjected to a contracting force. The free energy difference pertaining to such a contraction turns out to be proportional to the rate-distortion function in the analogous data compression model, and the contracting force is proportional to the derivative of this function. Beyond the fact that this analogy may be interesting in its own right, it may provide a physical perspective on the behavior of optimum schemes for lossy data compression (and perhaps also an information-theoretic perspective on certain physical system models). Moreover, it triggers the derivation of lossy compression performance for systems with memory, using analysis tools and insights from statistical mechanics
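
For a concrete instance of the rate-distortion function discussed above, the memoryless binary (Bernoulli) source under Hamming distortion has the closed form R(D) = h(p) - h(D) for D below min(p, 1-p), where h is the binary entropy; in the analogy, the contracting force is proportional to the derivative of this curve. A minimal sketch:

```python
import math

def h2(x):
    """Binary entropy in bits."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def rate_distortion(p, d):
    """R(D) of a Bernoulli(p) source under Hamming (bit-error) distortion."""
    if d >= min(p, 1 - p):
        return 0.0          # distortion is achievable with zero rate
    return h2(p) - h2(d)

# Fair-coin source: 1 bit/symbol for lossless coding, ~0.531 bits at D = 0.1.
r_lossless = rate_distortion(0.5, 0.0)
r_lossy = rate_distortion(0.5, 0.1)
```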

  6. SU-E-T-212: Comparison of TG-43 Dosimetric Parameters of Low and High Energy Brachytherapy Sources Obtained by MCNP Code Versions of 4C, X and 5

    Energy Technology Data Exchange (ETDEWEB)

    Zehtabian, M; Zaker, N; Sina, S [Shiraz University, Shiraz, Fars (Iran, Islamic Republic of); Meigooni, A Soleimani [Comprehensive Cancer Center of Nevada, Las Vegas, Nevada (United States)

    2015-06-15

    Purpose: Different versions of MCNP code are widely used for dosimetry purposes. The purpose of this study is to compare different versions of the MCNP codes in dosimetric evaluation of different brachytherapy sources. Methods: The TG-43 parameters such as dose rate constant, radial dose function, and anisotropy function of different brachytherapy sources, i.e. Pd-103, I-125, Ir-192, and Cs-137 were calculated in water phantom. The results obtained by three versions of Monte Carlo codes (MCNP4C, MCNPX, MCNP5) were compared for low and high energy brachytherapy sources. Then the cross section library of MCNP4C code was changed to ENDF/B-VI release 8 which is used in MCNP5 and MCNPX codes. Finally, the TG-43 parameters obtained using the MCNP4C-revised code, were compared with other codes. Results: The results of these investigations indicate that for high energy sources, the differences in TG-43 parameters between the codes are less than 1% for Ir-192 and less than 0.5% for Cs-137. However for low energy sources like I-125 and Pd-103, large discrepancies are observed in the g(r) values obtained by MCNP4C and the two other codes. The differences between g(r) values calculated using MCNP4C and MCNP5 at the distance of 6cm were found to be about 17% and 28% for I-125 and Pd-103 respectively. The results obtained with MCNP4C-revised and MCNPX were similar. However, the maximum difference between the results obtained with the MCNP5 and MCNP4C-revised codes was 2% at 6cm. Conclusion: The results indicate that using MCNP4C code for dosimetry of low energy brachytherapy sources can cause large errors in the results. Therefore it is recommended not to use this code for low energy sources, unless its cross section library is changed. Since the results obtained with MCNP4C-revised and MCNPX were similar, it is concluded that the difference between MCNP4C and MCNPX is their cross section libraries.

  7. MPEG-compliant joint source/channel coding using discrete cosine transform and substream scheduling for visual communication over packet networks

    Science.gov (United States)

    Kim, Seong-Whan; Suthaharan, Shan; Lee, Heung-Kyu; Rao, K. R.

    2001-01-01

    Quality of Service (QoS) guarantees in real-time communication for multimedia applications are significantly important. An architectural framework for multimedia networks based on substreams or flows is effectively exploited for combining source and channel coding for multimedia data. But the existing frame-by-frame approach, which includes Moving Pictures Expert Group (MPEG) coding, cannot be neglected because it is a standard. In this paper, first, we designed an MPEG transcoder which converts an MPEG coded stream into variable rate packet sequences to be used for our joint source/channel coding (JSCC) scheme. Second, we designed a classification scheme to partition the packet stream into multiple substreams which have their own QoS requirements. Finally, we designed a management (reservation and scheduling) scheme for substreams to support better perceptual video quality, such as a bound on end-to-end jitter. We have shown that our JSCC scheme is better than two other popular techniques by simulation and real video experiments in a TCP/IP environment.

  8. Fulcrum Network Codes

    DEFF Research Database (Denmark)

    2015-01-01

    Fulcrum network codes, which are a network coding framework, achieve three objectives: (i) to reduce the overhead per coded packet to almost 1 bit per source packet; (ii) to operate the network using only low field size operations at intermediate nodes, dramatically reducing complexity...... in the network; and (iii) to deliver an end-to-end performance that is close to that of a high field size network coding system for high-end receivers while simultaneously catering to low-end ones that can only decode in a lower field size. Sources may encode using a high field size expansion to increase...... the number of dimensions seen by the network using a linear mapping. Receivers can tradeoff computational effort with network delay, decoding in the high field size, the low field size, or a combination thereof....

  9. Vector Network Coding Algorithms

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L x L coding matrices that play a role similar to that of coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector coding, our algori...

  10. Contribution to automatic handwritten characters recognition. Application to optical moving characters recognition

    International Nuclear Information System (INIS)

    Gokana, Denis

    1986-01-01

    This paper describes research work on computer-aided vision relating to the design of a vision system which can recognize isolated handwritten characters written on a mobile support. We use a technique which consists in analyzing the information contained in the contours of the polygon circumscribed about the character's shape. These contours are segmented and labelled to give a new set of features constituted by: - right and left 'profiles', - topological and algebraic invariant properties. A new method of character recognition derived from this representation, based on a multilevel hierarchical technique, is then described. At the primary level, we use fuzzy classification with a dynamic programming technique using 'profiles'. The other levels refine the recognition by using the topological and algebraic invariant properties. Several results are presented, and an accuracy of 99% was reached for handwritten numeral characters, attesting the robustness of our algorithm. (author) [fr
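
The left/right 'profile' features mentioned above can be sketched for a binary character bitmap; the glyph below is a toy example, and the sentinel value for empty rows is an assumption:

```python
# Sketch of left/right profile extraction: for each row of a binary
# character bitmap, record the column of the first and last set pixel
# (-1 marks an empty row; this sentinel is our convention, not the paper's).

def profiles(bitmap):
    left, right = [], []
    for row in bitmap:
        cols = [i for i, v in enumerate(row) if v]
        left.append(cols[0] if cols else -1)
        right.append(cols[-1] if cols else -1)
    return left, right

glyph = [[0, 1, 0],
         [1, 1, 1],
         [0, 1, 0]]   # a tiny "+" shape
l, r = profiles(glyph)
```

Sequences like `l` and `r` are exactly the kind of feature strings a dynamic-programming (elastic) matcher can align against stored prototypes at the primary classification level.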

  11. Open-source tool for automatic import of coded surveying data to multiple vector layers in GIS environment

    Directory of Open Access Journals (Sweden)

    Eva Stopková

    2016-12-01

    Full Text Available This paper deals with a tool that enables import of coded data in a single text file to more than one vector layer (including attribute tables), together with automatic drawing of line and polygon objects and with optional conversion to CAD. The Python script v.in.survey is available as an add-on for the open-source software GRASS GIS (GRASS Development Team). The paper describes a case study based on surveying at the archaeological mission at Tell-el Retaba (Egypt). Advantages of the tool (e.g. significant optimization of surveying work) and its limits (demands on keeping conventions for the coding of point names) are discussed here as well. Possibilities of future development are suggested (e.g. generalization of point-name coding or more complex attribute table creation).
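
The point-name coding idea such a tool depends on can be sketched as follows; the `LAYER<object>.<vertex>` syntax used here is a made-up illustration, not the actual v.in.survey convention:

```python
# Sketch of coded-point grouping: a point code such as "WALL1.3" is split
# into a layer key ("WALL"), an object id (1) and a vertex index (3), so
# that points sharing layer and id can be chained into one line/polygon.
# The coding convention here is an invented example.
import re
from collections import defaultdict

POINT = re.compile(r"^([A-Z]+)(\d+)\.(\d+)$")

def group_points(records):
    """records: iterable of (code, x, y) -> {layer: {obj_id: [(x, y), ...]}}"""
    layers = defaultdict(lambda: defaultdict(list))
    for code, x, y in records:
        m = POINT.match(code)
        if not m:
            raise ValueError(f"code {code!r} breaks the naming convention")
        layer, obj, vertex = m.group(1), int(m.group(2)), int(m.group(3))
        layers[layer][obj].append((vertex, (x, y)))
    # order vertices within each object by their index
    return {lay: {o: [p for _, p in sorted(v)] for o, v in objs.items()}
            for lay, objs in layers.items()}

pts = [("WALL1.1", 0, 0), ("WALL1.2", 4, 0), ("TRENCH1.1", 1, 1)]
grouped = group_points(pts)
```

Each top-level key would become one vector layer; the hard failure on a malformed code mirrors the paper's point that the tool's main limit is strict adherence to the naming convention.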

  12. Leading Character's Antisocial Personality Disorder in James B. Stewart's Blind Eye

    OpenAIRE

    Lestari, Ayu

    2016-01-01

    The title of this thesis is Leading Character's Antisocial Personality Disorder in James B. Stewart's Blind Eye; it is research on the antisocial personality of the leading character in the novel, namely Dr. Michael Swango. The purpose of this thesis is to find out which characteristics of Swango show that he has antisocial personality disorder and to identify the causes of his disorder. The writer refers to the theory of antisocial personality disorder taken from research by the APA (American Psychiatri...

  13. The European source term code ESTER - basic ideas and tools for coupling of ATHLET and ESTER

    International Nuclear Information System (INIS)

    Schmidt, F.; Schuch, A.; Hinkelmann, M.

    1993-04-01

    The French software house CISI and the IKE of the University of Stuttgart developed, during 1990 and 1991, in the frame of the Shared Cost Action Reactor Safety, the informatic structure of the European Source TERm Evaluation System (ESTER). Through this work, tools became available which allow code development and code application in the area of severe core accident research to be unified on a European basis. The behaviour of reactor cores is determined by thermal hydraulic conditions. Therefore, for the development of ESTER it was important to investigate how to integrate thermal hydraulic code systems with ESTER applications. This report describes the basic ideas of ESTER and improvements of the ESTER tools in view of a possible coupling of the thermal hydraulic code system ATHLET and ESTER. As a result of the work performed during this project, the ESTER tools became the most modern informatic tools presently available in the area of severe accident research. A sample application is given which demonstrates the use of the new tools. (orig.) [de

  14. ANALYSIS OF THE INTERACTIONS BETWEEN IMMIGRANT/FOREIGN CHARACTERS AND NATIONAL/AUTOCHTHONOUS CHARACTERS IN SPANISH TELEVISION FICTION

    Directory of Open Access Journals (Sweden)

    María Marcos Ramos

    2014-10-01

    Full Text Available In this paper we present the results of empirical research in which the interactions between immigrant and national characters were analysed in a sample of Spanish television fictional programs broadcast in prime time. The study is a content analysis of 282 interactions between immigrant/foreign and national characters. It was found that the largest number of relationships between the characters involved occurred in a working context. Moreover, there was a higher proportion of aggressive humour used by national characters towards immigrant/foreign characters than the reverse. It was also observed that the immigrant/foreigner characters hardly spoke about their feelings, nor were they heard by the nationals when they did, although there were a high number of interactions in which immigrant characters expressed their opinions and were attended to by national characters. The analysis of the interactions between immigrant and national characters in television fiction is an important research issue, because it has been proposed that parasocial interactions are complementary to the interpersonal interactions taking place in daily life (Müller, 2009; Park, 2012).

  15. Calculation Of Fuel Burnup And Radionuclide Inventory In The Syrian Miniature Neutron Source Reactor Using The GETERA Code

    International Nuclear Information System (INIS)

    Khattab, K.; Dawahra, S.

    2011-01-01

    Calculations of the fuel burnup and radionuclide inventory in the Syrian Miniature Neutron Source Reactor (MNSR) after 10 years (the expected life of the reactor core) of reactor operation are presented in this paper using the GETERA code. The code is used to calculate the fuel group constants and the infinite multiplication factor versus the reactor operating time for 10, 20, and 30 kW operating power levels. The amounts of uranium burnt and plutonium produced in the reactor core, the concentrations of the most important fission product and actinide radionuclides accumulated in the reactor core, and the total radioactivity of the reactor core were calculated using the GETERA code as well. It is found that the GETERA code is better suited than the WIMSD4 code for fuel burnup calculations in the MNSR since it is newer, has a larger isotope library, and is more accurate. (author)

  16. Character Selection During Interactive Taxonomic Identification: “Best Characters”

    Directory of Open Access Journals (Sweden)

    Nadia Talent

    2014-03-01

    Full Text Available Software interfaces for interactive multiple-entry taxonomic identification (polyclaves sometimes provide a “best character” or “separation” coefficient, to guide the user to choose a character that could most effectively reduce the number of identification steps required. The coefficient could be particularly helpful when difficult or expensive tasks are needed for forensic identification, and in very large databases, uses that appear likely to increase in importance. Several current systems also provide tools to develop taxonomies or single-entry identification keys, with a variety of coefficients that are appropriate to that purpose. For the identification task, however, information theory neatly applies, and provides the most appropriate coefficient. To our knowledge, Delta-Intkey is the only currently available system that uses a coefficient related to information theory, and it is currently being reimplemented, which may allow for improvement. We describe two improvements to the algorithm used by Delta-Intkey. The first improves transparency as the number of remaining taxa decreases, by normalizing the range of the coefficient to [0,1]. The second concerns numeric ranges, which require consistent treatment of sub-intervals and their end-points. A stand-alone Bestchar program for categorical data is provided, in the Python and R languages. The source code is freely available and dedicated to the Public Domain.
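
An information-theoretic separation coefficient of the kind described above can be sketched as follows: each character is scored by the entropy of the partition it induces on the remaining taxa, normalised by log2 of the number of taxa so that the range is [0, 1]. The taxon/character matrix below is invented for illustration:

```python
import math
from collections import Counter

# Sketch of an entropy-based "best character" coefficient for interactive
# identification: a character that splits the remaining taxa most evenly
# carries the most expected information per question. All data is made up.

def separation(states):
    """Normalised entropy of the state distribution over remaining taxa."""
    n = len(states)
    if n < 2:
        return 0.0
    h = -sum((c / n) * math.log2(c / n) for c in Counter(states).values())
    return h / math.log2(n)   # normalise to [0, 1]

taxa = {                       # taxon -> states of (character 0, character 1)
    "t1": ("red", "smooth"),
    "t2": ("red", "hairy"),
    "t3": ("blue", "smooth"),
    "t4": ("green", "hairy"),
}
scores = {i: separation([s[i] for s in taxa.values()]) for i in (0, 1)}
best = max(scores, key=scores.get)   # character 0 partitions the taxa more finely
```

Numeric-range characters would need the consistent sub-interval and end-point treatment the abstract calls for before their states can be counted this way.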

  17. Maya Studio Projects Photorealistic Characters

    CERN Document Server

    Palamar, Todd

    2011-01-01

    Create realistic characters with Maya tools and this project-based book Maya character generation tools are extremely sophisticated, and there's no better way to learn all their capabilities than by working through the projects in this hands-on book. This official guide focuses on understanding and implementing Maya's powerful tools for creating realistic characters for film, games, and TV. Use a variety of tools to create characters from skeleton to clothing, including hairstyles and facial hair, and learn how to use Performance Capture. A DVD includes supplementary videos, project support fi

  18. Optimized and secure technique for multiplexing QR code images of single characters: application to noiseless messages retrieval

    International Nuclear Information System (INIS)

    Trejos, Sorayda; Barrera, John Fredy; Torroba, Roberto

    2015-01-01

    We present for the first time an optical encrypting–decrypting protocol for recovering messages without speckle noise. This is a digital holographic technique using a 2f scheme to process QR codes entries. In the procedure, letters used to compose eventual messages are individually converted into a QR code, and then each QR code is divided into portions. Through a holographic technique, we store each processed portion. After filtering and repositioning, we add all processed data to create a single pack, thus simplifying the handling and recovery of multiple QR code images, representing the first multiplexing procedure applied to processed QR codes. All QR codes are recovered in a single step and in the same plane, showing neither cross-talk nor noise problems as in other methods. Experiments have been conducted using an interferometric configuration and comparisons between unprocessed and recovered QR codes have been performed, showing differences between them due to the involved processing. Recovered QR codes can be successfully scanned, thanks to their noise tolerance. Finally, the appropriate sequence in the scanning of the recovered QR codes brings a noiseless retrieved message. Additionally, to procure maximum security, the multiplexed pack could be multiplied by a digital diffuser as to encrypt it. The encrypted pack is easily decoded by multiplying the multiplexing with the complex conjugate of the diffuser. As it is a digital operation, no noise is added. Therefore, this technique is threefold robust, involving multiplexing, encryption, and the need of a sequence to retrieve the outcome. (paper)

  19. Optimized and secure technique for multiplexing QR code images of single characters: application to noiseless messages retrieval

    Science.gov (United States)

    Trejos, Sorayda; Fredy Barrera, John; Torroba, Roberto

    2015-08-01

    We present for the first time an optical encrypting-decrypting protocol for recovering messages without speckle noise. This is a digital holographic technique using a 2f scheme to process QR code entries. In the procedure, letters used to compose messages are individually converted into a QR code, and then each QR code is divided into portions. Through a holographic technique, we store each processed portion. After filtering and repositioning, we add all processed data to create a single pack, thus simplifying the handling and recovery of multiple QR code images, representing the first multiplexing procedure applied to processed QR codes. All QR codes are recovered in a single step and in the same plane, showing neither cross-talk nor noise problems as in other methods. Experiments have been conducted using an interferometric configuration, and comparisons between unprocessed and recovered QR codes have been performed, showing differences between them due to the involved processing. Recovered QR codes can be successfully scanned, thanks to their noise tolerance. Finally, scanning the recovered QR codes in the appropriate sequence yields a noiseless retrieved message. Additionally, to procure maximum security, the multiplexed pack can be multiplied by a digital diffuser so as to encrypt it. The encrypted pack is easily decoded by multiplying the multiplexed pack by the complex conjugate of the diffuser. As this is a digital operation, no noise is added. Therefore, this technique is threefold robust, involving multiplexing, encryption, and the need for a sequence to retrieve the outcome.

  20. Holistic Processing of Chinese Characters

    Directory of Open Access Journals (Sweden)

    Alan Chun-Nang Wong

    2011-05-01

    Full Text Available Enhanced holistic processing (obligatory attention to all parts of an object) has been associated with different types of perceptual expertise involving faces, cars, fingerprints, musical notes, English words, etc. Curiously, Chinese characters are regarded as an exception, as indicated by the lack of holistic processing found for experts (Hsiao and Cottrell, 2009). The ceiling performance of experts, however, may have caused this null effect. We revisit this issue by adapting the often-used face-composite sequential-matching task to two-part Chinese characters. Participants matched the target halves (left or right) of two characters while ignoring the irrelevant halves. Both Chinese readers (experts) and non-Chinese readers (novices) showed holistic processing. Follow-up experiments suggested different origins of the effects for the two groups. For experts, holistic processing was sensitive to the amount of experience with the characters, as it was larger for words than for non-words (formed by swapping the two parts of a valid character). Novices, however, showed a similar degree of holistic processing for words and non-words, suggesting that their effects were more related to their inefficient decomposition of a complex, character-like pattern into parts. Overall these findings suggest that holistic processing may be a marker of expertise with Chinese characters, contrary to previous claims.

  1. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    Science.gov (United States)

    Pinto, Carlos; Machado, Armando

    2015-05-01

    To investigate the coding strategies that pigeons may use in a temporal discrimination task, pigeons were trained on a matching-to-sample procedure with three sample durations (2s, 6s and 18s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies, the multiple-coding and the single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5s, the geometric mean of 2s and 6s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control. © Society for the Experimental Analysis of Behavior.

  2. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  3. Character order processing in Chinese reading.

    Science.gov (United States)

    Gu, Junjuan; Li, Xingshan; Liversedge, Simon P

    2015-02-01

    We explored how character order information is encoded in isolated word processing or Chinese sentence reading in 2 experiments using a masked priming paradigm and a gaze-contingent display-change paradigm. The results showed that response latencies in the lexical decision task and reading times on the target word region were longer in the unrelated condition (the prime or the preview was unrelated with the target word) than the transposed-character condition (the prime or the preview was a transposition of the 2 characters of the target word), which were respectively longer than in the identity condition (the prime or preview was identical to the target word). These results show that character order is encoded at an early stage of processing in Chinese reading, but character position encoding was not strict. We also found that character order encoding was similar for single-morpheme and multiple-morpheme words, suggesting that morphemic status does not affect character order encoding. The current results represent an early contribution to our understanding of character order encoding during Chinese reading.

  4. Issues in Developing a Surveillance Case Definition for Nonfatal Suicide Attempt and Intentional Self-harm Using International Classification of Diseases, Tenth Revision, Clinical Modification (ICD-10-CM) Coded Data.

    Science.gov (United States)

    Hedegaard, Holly; Schoenbaum, Michael; Claassen, Cynthia; Crosby, Alex; Holland, Kristin; Proescholdbell, Scott

    2018-02-01

    Suicide and intentional self-harm are among the leading causes of death in the United States. To study this public health issue, epidemiologists and researchers often analyze data coded using the International Classification of Diseases (ICD). Prior to October 1, 2015, health care organizations and providers used the clinical modification of the Ninth Revision of ICD (ICD-9-CM) to report medical information in electronic claims data. The transition in October 2015 to use of the clinical modification of the Tenth Revision of ICD (ICD-10-CM) resulted in the need to update methods and selection criteria previously developed for ICD-9-CM coded data. This report provides guidance on the use of ICD-10-CM codes to identify cases of nonfatal suicide attempts and intentional self-harm in ICD-10-CM coded data sets. ICD-10-CM codes for nonfatal suicide attempts and intentional self-harm include: X71-X83, intentional self-harm due to drowning and submersion, firearms, explosive or thermal material, sharp or blunt objects, jumping from a high place, jumping or lying in front of a moving object, crashing of motor vehicle, and other specified means; T36-T50 with a 6th character of 2 (except for T36.9, T37.9, T39.9, T41.4, T42.7, T43.9, T45.9, T47.9, and T49.9, which are included if the 5th character is 2), intentional self-harm due to drug poisoning (overdose); T51-T65 with a 6th character of 2 (except for T51.9, T52.9, T53.9, T54.9, T56.9, T57.9, T58.0, T58.1, T58.9, T59.9, T60.9, T61.0, T61.1, T61.9, T62.9, T63.9, T64.0, T64.8, and T65.9, which are included if the 5th character is 2), intentional self-harm due to toxic effects of nonmedicinal substances; T71 with a 6th character of 2, intentional self-harm due to asphyxiation, suffocation, strangulation; and T14.91, Suicide attempt. Issues to consider when selecting records for nonfatal suicide attempts and intentional self-harm from ICD-10-CM coded administrative data sets are also discussed. All material appearing in this
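As a rough illustration of how such a case definition might be applied to coded data, here is a hedged Python sketch covering only part of the definition above: the X71–X83 range, the general T36–T50 poisoning rule keyed on the 6th character, and T14.91. It deliberately omits the published 5th-character exceptions and the T51–T65/T71 ranges, so it is a simplification for illustration, not the full surveillance definition:

```python
def is_self_harm(icd10cm: str) -> bool:
    """Partial check for nonfatal suicide attempt / intentional self-harm.

    Covers X71-X83, T14.91, and the general T36-T50 rule (a 6th character
    of '2' marks intentional self-harm); the listed code exceptions and
    the T51-T65 / T71 ranges are omitted for brevity.
    """
    c = icd10cm.replace(".", "").upper()
    # X71-X83: intentional self-harm by external mechanism
    if len(c) >= 3 and c[0] == "X" and c[1:3].isdigit() and 71 <= int(c[1:3]) <= 83:
        return True
    # T14.91: suicide attempt, unspecified means
    if c.startswith("T1491"):
        return True
    # T36-T50: drug poisoning; intent is encoded in the 6th character
    if len(c) >= 6 and c[0] == "T" and c[1:3].isdigit() and 36 <= int(c[1:3]) <= 50:
        return c[5] == "2"
    return False
```

A production case definition would enumerate the exception codes exactly as listed in the report rather than rely on this blanket rule.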

  5. SCRIC: a code dedicated to the detailed emission and absorption of heterogeneous NLTE plasmas; application to xenon EUV sources

    Energy Technology Data Exchange (ETDEWEB)

    Gaufridy de Dortan, F. de

    2006-07-01

    Nearly all spectral opacity codes for LTE and NLTE plasmas rely on approximate configuration modelling, or even supra-configuration modelling, for mid-Z plasmas. But in some cases configuration interaction (both relativistic and non-relativistic) induces dramatic changes in spectral shapes. We propose here a new detailed emissivity code with configuration mixing to allow for a realistic description of complex mid-Z plasmas. A collisional-radiative calculation, based on precise HULLAC energies and cross sections, determines the populations. Detailed emissivities and opacities are then calculated, and the radiative transfer equation is solved for wide inhomogeneous plasmas. This code is able to cope rapidly with very large amounts of atomic data. It is therefore possible to use complex hydrodynamic files, even on personal computers, in a very limited time. We used this code for comparison with xenon EUV sources within the framework of nano-lithography developments. It appears that configuration mixing strongly shifts satellite lines and must be included in the description of these sources to enhance their efficiency. (author)

  6. Gaze strategies can reveal the impact of source code features on the cognitive load of novice programmers

    DEFF Research Database (Denmark)

    Wulff-Jensen, Andreas; Ruder, Kevin Vignola; Triantafyllou, Evangelia

    2018-01-01

    As several studies have shown, the readability of source code is influenced by both its structural and its textual features. In order to assess the importance of these features, we conducted an eye-tracking experiment with programming students. To assess the readability and comprehensibility of...

  7. Fast space-varying convolution using matrix source coding with applications to camera stray light reduction.

    Science.gov (United States)

    Wei, Jianing; Bouman, Charles A; Allebach, Jan P

    2014-05-01

    Many imaging applications require the implementation of space-varying convolution for accurate restoration and reconstruction of images. Here, we use the term space-varying convolution to refer to linear operators whose impulse response has slow spatial variation. In addition, these space-varying convolution operators are often dense, so direct implementation of the convolution operator is typically computationally impractical. One such example is the problem of stray light reduction in digital cameras, which requires the implementation of a dense space-varying deconvolution operator. However, other inverse problems, such as iterative tomographic reconstruction, can also depend on the implementation of dense space-varying convolution. While space-invariant convolution can be efficiently implemented with the fast Fourier transform, this approach does not work for space-varying operators. So direct convolution is often the only option for implementing space-varying convolution. In this paper, we develop a general approach to the efficient implementation of space-varying convolution, and demonstrate its use in the application of stray light reduction. Our approach, which we call matrix source coding, is based on lossy source coding of the dense space-varying convolution matrix. Importantly, by coding the transformation matrix, we not only reduce the memory required to store it; we also dramatically reduce the computation required to implement matrix-vector products. Our algorithm is able to reduce computation by approximately factoring the dense space-varying convolution operator into a product of sparse transforms. Experimental results show that our method can dramatically reduce the computation required for stray light reduction while maintaining high accuracy.
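The core idea — lossy-coding a dense space-varying convolution matrix so that matrix–vector products become cheap — can be caricatured in a few lines. The sketch below is purely illustrative (the paper's actual method codes the matrix in a transformed domain rather than by simple thresholding): it builds a 1-D space-varying Gaussian blur as a dense matrix, "codes" it by dropping near-zero entries, and applies it in sparse form:

```python
import math

def svc_matrix(n, sigma_at):
    """Dense matrix for a 1-D space-varying Gaussian blur: row i holds a
    normalised kernel whose width sigma_at(i) drifts slowly with position."""
    A = []
    for i in range(n):
        s = sigma_at(i)
        row = [math.exp(-((i - j) ** 2) / (2.0 * s * s)) for j in range(n)]
        z = sum(row)
        A.append([v / z for v in row])
    return A

def dense_apply(A, x):
    # direct O(n^2) matrix-vector product
    return [sum(a * v for a, v in zip(row, x)) for row in A]

def sparsify(A, tol):
    """Lossy-code the operator: drop entries below tol, keep (index, value)."""
    return [[(j, v) for j, v in enumerate(row) if v >= tol] for row in A]

def sparse_apply(S, x):
    # cheaper product over the retained entries only
    return [sum(v * x[j] for j, v in row) for row in S]
```

Because the Gaussian tails carry almost no mass, the sparse product agrees with the dense one to high accuracy while touching far fewer entries — the same accuracy/computation trade-off the paper exploits at much larger scale.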

  8. Species tree phylogeny and character evolution in the genus Centipeda (Asteraceae): evidence from DNA sequences from coding and non-coding loci from the plastid and nuclear genomes.

    Science.gov (United States)

    Nylinder, Stephan; Cronholm, Bodil; de Lange, Peter J; Walsh, Neville; Anderberg, Arne A

    2013-08-01

    A species tree phylogeny of the Australian/New Zealand genus Centipeda (Asteraceae) is estimated based on nucleotide sequence data. We analysed sequences of nuclear ribosomal DNA (ETS, ITS) and three plastid loci (ndhF, psbA-trnH, and trnL-F) using the multi-species coalescent module in BEAST. A total of 129 individuals from all 10 recognised species of Centipeda were sampled throughout the species' distribution ranges, including two subspecies. We conclude that the inferred species tree topology largely conforms to previous assumptions about species relationships. Centipeda racemosa (Snuffweed), the only consistently perennial representative in the genus, is sister to the remaining species. Centipeda pleiocephala (Tall Sneezeweed) and C. nidiformis (Cotton Sneezeweed) constitute a species pair, as do C. borealis and C. minima (Spreading Sneezeweed), all sharing with C. racemosa the symplesiomorphic characters of a spherical capitulum and a convex receptacle. Another species group, comprising C. thespidioides (Desert Sneezeweed), C. cunninghamii (Old Man Weed, or Common Sneezeweed) and C. crateriformis, is well supported but also includes the morphologically aberrant C. aotearoana; all share the character of capitula that mature more slowly relative to the subtending shoot. Centipeda elatinoides takes a weakly supported intermediate position between the two groups and is difficult to relate to either of them on the basis of morphological characters. Copyright © 2013 Elsevier Inc. All rights reserved.

  9. Character profiles and life satisfaction.

    Science.gov (United States)

    Park, Hwanjin; Suh, Byung Seong; Kim, Won Sool; Lee, Hye-Kyung; Park, Seon-Cheol; Lee, Kounseok

    2015-04-01

    There is a surge of interest in subjective well-being (SWB), which concerns how individuals feel about their happiness. Life satisfaction tends to be influenced by individual psychological traits and external social factors. The aim of this study was to examine the relationship between individual character and SWB. Data from 3522 university students were analyzed in this study. Character profiles were evaluated using the Temperament and Character Inventory-Revised Short version (TCI-RS). Life satisfaction was assessed using the Satisfaction with Life Scale (SWLS). All statistical tests regarding the correlations between each character profile and life satisfaction were conducted using ANOVAs, t-tests, multiple linear regression models and correlation analyses. The creative (SCT) profile was associated with the highest levels of life satisfaction, whereas the depressive (sct) profile was associated with the lowest levels of life satisfaction. Additionally, high self-directedness, self-transcendence and cooperation were associated with high life satisfaction. The results of gender-adjusted multiple regression analysis showed that the effects of self-directedness were the strongest in the assessment of one's quality of life, followed by self-transcendence and cooperativeness, in that order. All three character profiles were significantly correlated with one's quality of life, and together the character profiles of the TCI-RS explained 27.6% of life satisfaction. Among the three character profiles, self-directedness was the most strongly associated with life satisfaction. Our study was cross-sectional, and self-reported data from students at a single university were analyzed. The results of this study showed that, among the character profiles, the effects of self-directedness were the strongest for predicting life satisfaction. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. Variable code gamma ray imaging system

    International Nuclear Information System (INIS)

    Macovski, A.; Rosenfeld, D.

    1979-01-01

    A gamma-ray source distribution in the body is imaged onto a detector using an array of apertures. The transmission of each aperture is modulated using a code such that the individual views of the source through each aperture can be decoded and separated. The codes are chosen to maximize the signal to noise ratio for each source distribution. These codes determine the photon collection efficiency of the aperture array. Planar arrays are used for volumetric reconstructions and circular arrays for cross-sectional reconstructions. 14 claims

  11. The IAEA code of conduct on the safety of radiation sources and the security of radioactive materials. A step forwards or backwards?

    International Nuclear Information System (INIS)

    Boustany, K.

    2001-01-01

    During the finalization of the Code of Conduct on the Safety and Security of Radioactive Sources, two distinct but interrelated subject areas were identified: the prevention of accidents involving radiation sources and the prevention of theft or any other unauthorized use of radioactive materials. What analysis reveals, rather, is that there are gaps in both the content of the Code and the processes relating to it. Nevertheless, new standards have been introduced as a result of this exercise and have thus, as an enactment of what constitutes appropriate behaviour in the field of the safety and security of radioactive sources, emerged into the arena of international relations. (N.C.)

  12. Signalign: An Ontology of DNA as Signal for Comparative Gene Structure Prediction Using Information-Coding-and-Processing Techniques.

    Science.gov (United States)

    Yu, Ning; Guo, Xuan; Gu, Feng; Pan, Yi

    2016-03-01

    Conventional character-analysis-based techniques in genome analysis manifest three main shortcomings: inefficiency, inflexibility, and incompatibility. In our previous research, a general framework called DNA As X was proposed for character-analysis-free techniques to overcome these shortcomings, where X is an intermediate such as digit, code, signal, vector, tree, graph, network, and so on. In this paper, we further implement an ontology of DNA As Signal by designing a tool named Signalign for comparative gene structure analysis, in which DNA sequences are converted into signal series, processed by a modified method of dynamic time warping, and measured by signal-to-noise ratio (SNR). The ontology of DNA As Signal integrates the principles and concepts of other disciplines, including information coding theory and signal processing, into sequence analysis and processing. Compared with conventional character-analysis-based methods, Signalign not only achieves equivalent or superior performance but also enriches the tools and the knowledge library of computational biology by extending the domain from character/string to diverse areas. The evaluation results validate the success of the character-analysis-free technique for improved performances in comparative gene structure prediction.
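A toy version of the "DNA as signal" idea — map bases to numeric levels and compare the resulting series with dynamic time warping — can be sketched as follows. The numeric mapping and the plain textbook DTW below are illustrative only; Signalign itself uses a modified DTW and SNR-based measures:

```python
BASE_LEVEL = {"A": 0.0, "C": 1.0, "G": 2.0, "T": 3.0}  # illustrative mapping

def to_signal(seq):
    # convert a DNA string into a numeric series
    return [BASE_LEVEL[b] for b in seq.upper()]

def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two numeric series."""
    n, m = len(a), len(b)
    inf = float("inf")
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible alignments
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]
```

Because DTW can stretch one series against the other, a sequence that merely repeats a base (e.g. an inserted homopolymer) still aligns at zero cost, which character-by-character comparison would penalise.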

  13. A Novel Phonology- and Radical-Coded Chinese Sign Language Recognition Framework Using Accelerometer and Surface Electromyography Sensors.

    Science.gov (United States)

    Cheng, Juan; Chen, Xun; Liu, Aiping; Peng, Hu

    2015-09-15

    Sign language recognition (SLR) is an important communication tool between the deaf and the external world. It is highly necessary to develop a worldwide continuous and large-vocabulary-scale SLR system for practical usage. In this paper, we propose a novel phonology- and radical-coded Chinese SLR framework to demonstrate the feasibility of continuous SLR using accelerometer (ACC) and surface electromyography (sEMG) sensors. The continuous Chinese characters, consisting of coded sign gestures, are first segmented into active segments using EMG signals by means of a moving average algorithm. Then, features of each component are extracted from both ACC and sEMG signals of active segments (i.e., palm orientation represented by the mean and variance of ACC signals, hand movement represented by the fixed-point ACC sequence, and hand shape represented by both the mean absolute value (MAV) and autoregressive model coefficients (ARs)). Afterwards, palm orientation is first classified, distinguishing "Palm Downward" sign gestures from "Palm Inward" ones. Only the "Palm Inward" gestures are sent for further hand movement and hand shape recognition by the dynamic time warping (DTW) algorithm and hidden Markov models (HMMs), respectively. Finally, component recognition results are integrated to identify one certain coded gesture. Experimental results demonstrate that the proposed SLR framework with a vocabulary scale of 223 characters can achieve an averaged recognition accuracy of 96.01% ± 0.83% for coded gesture recognition tasks and 92.73% ± 1.47% for character recognition tasks. Besides, it demonstrates that sEMG signals are rather consistent for a given hand shape, independent of hand movements. Hence, the number of training samples will not be significantly increased when the vocabulary scale increases, since not only the number of the completely new proposed coded gestures is constant and limited, but also the transition movement which connects successive signs needs no
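The first stage of the pipeline — segmenting the continuous signal into active segments with a moving average — can be sketched in a few lines. The trailing-window choice, the window length, and the threshold below are illustrative, not the paper's exact parameters:

```python
def active_segments(x, window, thresh):
    """Return (start, end) index pairs where the trailing moving average
    of the rectified signal |x| stays at or above thresh."""
    segs, start = [], None
    for i in range(len(x)):
        lo = max(0, i - window + 1)
        ma = sum(abs(v) for v in x[lo:i + 1]) / (i + 1 - lo)
        if ma >= thresh and start is None:
            start = i                    # activity onset
        elif ma < thresh and start is not None:
            segs.append((start, i - 1))  # activity offset
            start = None
    if start is not None:
        segs.append((start, len(x) - 1))
    return segs
```

On a synthetic burst (silence, then activity, then silence) this yields a single segment whose boundaries lag the true onset by roughly half the window, which is why practical systems typically refine segment boundaries afterwards.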

  14. Vacuum entanglement governs the bosonic character of magnons

    International Nuclear Information System (INIS)

    Morimae, Tomoyuki

    2010-01-01

    It is well known that magnons, which are elementary excitations in a magnetic material, behave as bosons when their density is low. We study how the bosonic character of magnons is governed by the amount of multipartite entanglement in the vacuum state on which magnons are excited. We show that if multipartite entanglement is strong, magnons cease to be bosons. We also consider some examples, such as ground states of the Heisenberg ferromagnet and the transverse Ising model, the condensation of magnons, the one-way quantum computer, and Kitaev's toric code. Our result provides insights into the quantum statistics of elementary excitations in these models, and into the reason why a nonlocal transformation, such as the Jordan-Wigner transformation, is necessary for some many-body systems.

  15. SMILEI: A collaborative, open-source, multi-purpose PIC code for the next generation of super-computers

    Science.gov (United States)

    Grech, Mickael; Derouillat, J.; Beck, A.; Chiaramello, M.; Grassi, A.; Niel, F.; Perez, F.; Vinci, T.; Fle, M.; Aunai, N.; Dargent, J.; Plotnikov, I.; Bouchard, G.; Savoini, P.; Riconda, C.

    2016-10-01

    Over the last decades, Particle-In-Cell (PIC) codes have been central tools for plasma simulations. Today, new trends in High-Performance Computing (HPC) are emerging, dramatically changing HPC-relevant software design and putting some - if not most - legacy codes far beyond the level of performance expected on the new and future massively parallel supercomputers. SMILEI is a new open-source PIC code co-developed by both plasma physicists and HPC specialists, and applied to a wide range of physics-related studies: from laser-plasma interaction to astrophysical plasmas. It benefits from an innovative parallelization strategy that relies on a super-domain decomposition allowing for enhanced cache use and efficient dynamic load balancing. Beyond these HPC-related developments, SMILEI also benefits from additional physics modules that handle binary collisions, field and collisional ionization, and radiation back-reaction. This poster presents the SMILEI project, its HPC capabilities and illustrates some of the physics problems tackled with SMILEI.

  16. A zero-dimensional EXTRAP computer code

    International Nuclear Information System (INIS)

    Karlsson, P.

    1982-10-01

    A zero-dimensional computer code has been designed for the EXTRAP experiment to predict the density and the temperature and their dependence upon parameters such as the plasma current and the filling pressure of neutral gas. EXTRAP is a Z-pinch immersed in a vacuum octupole field and could be either linear or toroidal. In this code the density and temperature are assumed to be constant from the axis up to a breaking point, from which they decrease linearly in the radial direction out to the plasma radius. All quantities, however, are averaged over the plasma volume, thus giving the code its zero-dimensional character. The particle, momentum and energy one-fluid equations are solved, including the effects of the surrounding neutral gas and oxygen impurities. The code shows that the temperature and density are very sensitive to the shape of the plasma, flatter profiles giving higher temperatures and densities. The temperature, however, is not strongly affected for oxygen concentrations of less than 2% and is well above the radiation barrier even for higher concentrations. (Author)
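The volume averaging the abstract describes can be made concrete with a toy calculation: take a cylindrical plasma whose density is flat out to a breaking point and falls linearly to zero at the plasma radius, and compute its volume average numerically. The profile shape and symbols here are a guessed illustration of the general approach, not the EXTRAP code itself:

```python
def profile(r, n0, r_b, a):
    # flat core up to the breaking point r_b, then a linear fall to zero at r = a
    if r <= r_b:
        return n0
    return n0 * (a - r) / (a - r_b)

def volume_average(n0, r_b, a, steps=20000):
    # cylindrical volume average: <n> = (2 / a^2) * integral_0^a n(r) r dr,
    # evaluated with the midpoint rule
    h = a / steps
    total = 0.0
    for k in range(steps):
        r = (k + 0.5) * h
        total += profile(r, n0, r_b, a) * r * h
    return 2.0 * total / (a * a)
```

For n0 = 1, r_b = a/2 this evaluates to 7/12 of the peak density, matching the analytic integral — the flatter the profile (r_b closer to a), the closer the average sits to the peak, which is the sensitivity the abstract reports.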

  17. Study of the source term of radiation of the CDTN GE-PET trace 8 cyclotron with the MCNPX code

    Energy Technology Data Exchange (ETDEWEB)

    Benavente C, J. A.; Lacerda, M. A. S.; Fonseca, T. C. F.; Da Silva, T. A. [Centro de Desenvolvimento da Tecnologia Nuclear / CNEN, Av. Pte. Antonio Carlos 6627, 31270-901 Belo Horizonte, Minas Gerais (Brazil); Vega C, H. R., E-mail: jhonnybenavente@gmail.com [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Cipres No. 10, Fracc. La Penuela, 98068 Zacatecas, Zac. (Mexico)

    2015-10-15

    Full text: The knowledge of the neutron spectra in a PET cyclotron is important for the optimization of radiation protection of the workers and individuals of the public. The main objective of this work is to study the source term of radiation of the GE-PET trace 8 cyclotron of the Development Center of Nuclear Technology (CDTN/CNEN) using computer simulation by the Monte Carlo method. The MCNPX version 2.7 code was used to calculate the flux of neutrons produced from the interaction of the primary proton beam with the target body and other cyclotron components, during 18F production. The estimate of the source term and the corresponding radiation field was performed from the bombardment of a H{sub 2}{sup 18}O target with protons of 75 μA current and 16.5 MeV of energy. The values of the simulated fluxes were compared with those reported by the accelerator manufacturer (GE Healthcare Company). Results showed that the fluxes estimated with the MCNPX code were about 70% lower than those reported by the manufacturer. The mean energies of the neutrons also differed from those reported by GE Healthcare. It is recommended to investigate other cross-section data and the use of physical models of the code itself for a complete characterization of the source term of radiation. (Author)

  18. Character Development in Adolescents.

    Science.gov (United States)

    Kessler, Glenn R.; And Others

    1986-01-01

    Explored the effects of a program consisting of communication and counseling skills, assertiveness training and moral dilemmas on the character development of high school students. The results demonstrated that the character development of the students in the experimental treatment group was affected significantly over time by the program.…

  19. Code-Mixing and Code-Switching in the Process of Learning

    Directory of Open Access Journals (Sweden)

    Diyah Atiek Mustikawati

    2016-09-01

    Full Text Available This study aimed to describe the forms of code-switching and code-mixing found in teaching and learning activities in the classroom, as well as to determine the factors influencing those forms of code-switching and code-mixing. The research is a descriptive qualitative case study which took place at Al Mawaddah Boarding School, Ponorogo. Based on the analysis and discussion stated in the previous chapter, code-mixing and code-switching in learning activities at Al Mawaddah Boarding School occur among Javanese, Arabic, English and Indonesian, through the insertion of words, phrases, idioms, nouns, adjectives, clauses, and sentences. The factors determining code-mixing in the learning process include: identification of role, the desire to explain and interpret, material sourced from the original language and its variations, and material sourced from a foreign language. The factors determining code-switching in the learning process include: the speaker (O1), the interlocutor (O2), the presence of a third person (O3), the topic of conversation, evoking a sense of humour, and prestige. The significance of this study is to allow readers to see the use of language in a multilingual society, especially at the Al Mawaddah boarding school, regarding the rules and characteristic variation of language in classroom teaching and learning activities. Furthermore, the results of this research will provide input to the ustadz/ustadzah and students in developing oral communication skills and effective teaching and learning strategies in boarding schools.

  20. Analysis of source term aspects in the experiment Phebus FPT1 with the MELCOR and CFX codes

    Energy Technology Data Exchange (ETDEWEB)

    Martin-Fuertes, F. [Universidad Politecnica de Madrid, UPM, Nuclear Engineering Department, Jose Gutierrez Abascal 2, 28006 Madrid (Spain)]. E-mail: francisco.martinfuertes@upm.es; Barbero, R. [Universidad Politecnica de Madrid, UPM, Nuclear Engineering Department, Jose Gutierrez Abascal 2, 28006 Madrid (Spain); Martin-Valdepenas, J.M. [Universidad Politecnica de Madrid, UPM, Nuclear Engineering Department, Jose Gutierrez Abascal 2, 28006 Madrid (Spain); Jimenez, M.A. [Universidad Politecnica de Madrid, UPM, Nuclear Engineering Department, Jose Gutierrez Abascal 2, 28006 Madrid (Spain)

    2007-03-15

    Several aspects related to the source term in the Phebus FPT1 experiment have been analyzed with the help of the MELCOR 1.8.5 and CFX 5.7 codes. Integral aspects covering circuit thermalhydraulics, fission product and structural material release, and vapour and aerosol retention in the circuit and containment were studied with MELCOR, and the strong and weak points revealed by comparison to experimental results are stated. Then, sensitivity calculations dealing with chemical speciation upon release, vertical line aerosol deposition and steam generator aerosol deposition were performed. Finally, detailed calculations concerning aerosol deposition in the steam generator tube are presented. They were obtained by means of an in-house code application, named COCOA, as well as with the CFX computational fluid dynamics code, in which several models for aerosol deposition were implemented and tested; the models themselves are also discussed.

  1. Social character of materialism.

    Science.gov (United States)

    Chatterjee, A; Hunt, J M; Kernan, J B

    2000-06-01

    Scores for 170 undergraduates on Richins and Dawson's Materialism scale were correlated with scores on Kassarjian's Social Preference Scale, designed to measure individuals' character structure. A correlation of .26 between materialism and other-directed social character suggested that an externally oriented reference system guides materialists' perceptions, judgments, acquisitions, and possessions.

  2. Mobile, hybrid Compton/coded aperture imaging for detection, identification and localization of gamma-ray sources at stand-off distances

    Science.gov (United States)

    Tornga, Shawn R.

    The Stand-off Radiation Detection System (SORDS) program is an Advanced Technology Demonstration (ATD) project through the Department of Homeland Security's Domestic Nuclear Detection Office (DNDO) with the goal of detection, identification and localization of weak radiological sources in the presence of large dynamic backgrounds. The Raytheon-SORDS Tri-Modal Imager (TMI) is a mobile truck-based, hybrid gamma-ray imaging system able to quickly detect, identify and localize radiation sources at standoff distances through improved sensitivity while minimizing the false alarm rate. Reconstruction of gamma-ray sources is performed using a combination of two imaging modalities: coded aperture and Compton scatter imaging. The TMI consists of 35 sodium iodide (NaI) crystals, 5x5x2 in3 each, arranged in a random coded aperture mask array (CA), followed by 30 position-sensitive NaI bars, each 24x2.5x3 in3, called the detection array (DA). The CA array acts as both a coded aperture mask and a scattering detector for Compton events. The large-area DA array acts as a collection detector for both Compton scattered events and coded aperture events. In this thesis, the developed coded aperture, Compton and hybrid imaging algorithms will be described along with their performance. It will be shown that multiple imaging modalities can be fused to improve detection sensitivity over a broader energy range than either alone. Since the TMI is a moving system, peripheral data, such as a Global Positioning System (GPS) and Inertial Navigation System (INS), must also be incorporated. A method of adapting static imaging algorithms to a moving platform has been developed. Also, algorithms were developed in parallel with detector hardware, through the use of extensive simulations performed with the Geometry and Tracking Toolkit v4 (GEANT4). Simulations have been well validated against measured data. Results of image reconstruction algorithms at various speeds and distances will be presented as well as

  3. APLIKASI SPOKES-CHARACTERS DALAM KAITAN DENGAN MEREK PRODUK

    Directory of Open Access Journals (Sweden)

    Caroline Widjoyo

    2004-01-01

    Full Text Available Advertisements using spokes-characters have the potential to alter consumers' brand choice with a bigger impact than advertisements that do not use a spokes-character. Many producers use a spokes-character to increase the selling value of a product. Spokes-characters have evolved from era to era, keeping pace with the times, and new types of spokes-characters and new applications are now emerging. Keywords: spokes-characters, advertising, branding.

  4. Character Play – The use of game characters in multi- player Role Playing Games across platforms

    DEFF Research Database (Denmark)

    Tychsen, Anders; Hitchens, M.; Brolund, T.

    2008-01-01

    histories of game characters. This article presents results from a comprehensive empirical study of the way complex game characters are utilized by players in multiplayer role-playing games across two different media platforms. The results indicate that adult players are capable of comprehending...... and utilizing game characters with well-defined personalities and backgrounds, as well as rules-based components. Furthermore, that the game format plays a significant role in the pattern of usage of the character elements. This pattern appears directly linked with variations in the way that the different game...

  5. Marvel and DC Characters Inspired by Arachnids

    Directory of Open Access Journals (Sweden)

    Elidiomar Ribeiro Da-Silva

    2014-12-01

    Full Text Available This article compares arachnid-based Marvel and DC comics characters. The composition of a comic book character often has interesting ‘real-life’ influences. Given the strong connection between arachnids (especially spiders, scorpions and mites, all belonging to the zoological class 'Arachnida') and human beings, it is not surprising that they have inspired many fictional characters. We recorded 84 Marvel Comics characters and 40 DC Comics characters, detailed in the dataset that accompanies the article (Da-Silva 2014). Most characters have been created recently, since the 1990s. Marvel has significantly more arachnid characters than DC. As for taxonomic classification, the characters were based mostly on spiders (zoological order 'Araneae'). Of the total characters, the majority are human beings, but an overwhelming number have at least some typical arachnid features. Villains (60.91% of the total) are significantly more numerous, considering the sum of the two publishers. Arachnids have a bad reputation for being dangerous (Thorp and Woodson 1976; Ruppert and Barnes 1996). Since the public usually considers spiders, scorpions and mites “harmful” in general, we expected a larger contingent of villains. However, there was no statistical difference between the numbers of villains and heroes among Marvel characters, probably due to the success of one character: the Amazing Spider-Man.

  6. Handwritten Sindhi Character Recognition Using Neural Networks

    Directory of Open Access Journals (Sweden)

    Shafique Ahmed Awan

    2018-01-01

    Full Text Available OCR (Optical Character Recognition) is a technology in which a text image is used by machines to understand and write text. Work on languages containing isolated characters, such as German, English, French and others, is at its peak. OCR and ICR (Intelligent Character Recognition) research in the Sindhi script is currently in its starting stages, and not much work has been cited in this area, even though the Sindhi language is rich in culture and history. This paper presents one of the initial steps in recognizing Sindhi handwritten characters. The isolated characters of the Sindhi script written by the subjects have been recognized. The various subjects were asked to write Sindhi characters in unconstrained form, and the written samples were collected and scanned through a flatbed scanner. The scanned documents were preprocessed with the help of binary conversion and pepper-noise removal, and the lines were segmented with the help of the horizontal profile technique. The segmented lines were used to extract characters from the scanned pages. This character segmentation was done by vertical projection. The extracted characters were then used to extract features so that the characters can be classified easily. Zoning was used as the feature extraction technique. For classification, a neural network was used. The recognized characters were converted into editable text with an average accuracy of 85%.
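The horizontal-profile line segmentation and zoning feature extraction described above can be sketched in a few lines. This is an illustrative sketch, not the authors' implementation: it assumes a binarized page represented as a 2D list of 0/1 pixels (1 = ink), and all function names are ours.

```python
def horizontal_profile(page):
    """Number of ink pixels in each row of the binarized page."""
    return [sum(row) for row in page]

def segment_lines(page, min_ink=1):
    """Split the page into text lines wherever the row profile drops below min_ink."""
    profile = horizontal_profile(page)
    lines, start = [], None
    for y, ink in enumerate(profile):
        if ink >= min_ink and start is None:
            start = y
        elif ink < min_ink and start is not None:
            lines.append(page[start:y])
            start = None
    if start is not None:
        lines.append(page[start:])
    return lines

def zoning_features(char_img, zones=2):
    """Zoning: split the character image into zones x zones cells and
    return the ink density of each cell as a feature vector."""
    h, w = len(char_img), len(char_img[0])
    feats = []
    for zy in range(zones):
        for zx in range(zones):
            y0, y1 = zy * h // zones, (zy + 1) * h // zones
            x0, x1 = zx * w // zones, (zx + 1) * w // zones
            cell = [char_img[y][x] for y in range(y0, y1) for x in range(x0, x1)]
            feats.append(sum(cell) / max(len(cell), 1))
    return feats

# Two text "lines" separated by an empty row:
page = [[0, 1, 1, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 0],
        [1, 1, 1, 1]]
print(len(segment_lines(page)))        # number of text lines found
print(zoning_features(page, zones=2))  # 4 ink-density features
```

Vertical projection for character segmentation works the same way with columns instead of rows.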

  7. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance...... optimisation and customisable source-code generation tool (TUNE). The concept is depicted in automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis...

  8. Degraded character recognition based on gradient pattern

    Science.gov (United States)

    Babu, D. R. Ramesh; Ravishankar, M.; Kumar, Manish; Wadera, Kevin; Raj, Aakash

    2010-02-01

    Degraded character recognition is a challenging problem in the field of Optical Character Recognition (OCR). The performance of an optical character recognizer depends upon the print quality of the input documents. Many OCRs have been designed which correctly identify finely printed documents, but very little reported work has been found on the recognition of degraded documents. The efficiency of an OCR system decreases if the input image is degraded. In this paper, a novel approach based on gradient patterns for recognizing degraded printed characters is proposed. The approach makes use of the gradient pattern of an individual character for recognition. Experiments were conducted on character images that were either digitally written or degraded characters extracted from historical documents, and the results are found to be satisfactory.
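As a rough illustration of gradient-based character features, the sketch below computes a normalized gradient-orientation histogram using central differences. It is a stand-in for the paper's gradient pattern, whose exact definition is not given here; the image representation (2D list of grayscale values) and the function name are our assumptions.

```python
import math

def gradient_orientation_histogram(img, bins=4):
    """Bin gradient orientations of interior pixels, weighted by magnitude."""
    h, w = len(img), len(img[0])
    hist = [0.0] * bins
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = img[y][x + 1] - img[y][x - 1]   # central differences
            gy = img[y + 1][x] - img[y - 1][x]
            mag = math.hypot(gx, gy)
            if mag == 0:
                continue
            ang = math.atan2(gy, gx) % math.pi   # fold direction to [0, pi)
            hist[min(int(ang / math.pi * bins), bins - 1)] += mag
    total = sum(hist) or 1.0
    return [v / total for v in hist]             # normalize: robust to contrast loss

# A vertical edge: all gradients point horizontally, so the first bin dominates.
img = [[0, 0, 255, 255]] * 4
hist = gradient_orientation_histogram(img)
```

Normalizing by total magnitude is one common way to make such a feature tolerant of the contrast loss typical of degraded prints.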

  9. Multirate Filter Bank Representations of RS and BCH Codes

    Directory of Open Access Journals (Sweden)

    Van Meerbergen Geert

    2008-01-01

    Full Text Available This paper addresses the use of multirate filter banks in the context of error-correction coding. An in-depth study of these filter banks is presented, motivated by earlier results and applications based on the filter bank representation of Reed-Solomon (RS) codes, such as Soft-In Soft-Out RS-decoding or RS-OFDM. The specific structure of the filter banks (critical subsampling) is an important aspect in these applications. The goal of the paper is twofold. First, the filter bank representation of RS codes is now explained based on polynomial descriptions. This approach allows us to gain new insight into the correspondence between RS codes and filter banks. More specifically, it allows us to show that the inherent periodically time-varying character of a critically subsampled filter bank matches remarkably well with the cyclic properties of RS codes. Secondly, an extension of these techniques toward the more general class of BCH codes is presented. It is demonstrated that a BCH code can be decomposed into a sum of critically subsampled filter banks.

  10. Multirate Filter Bank Representations of RS and BCH Codes

    Directory of Open Access Journals (Sweden)

    Marc Moonen

    2009-01-01

    Full Text Available This paper addresses the use of multirate filter banks in the context of error-correction coding. An in-depth study of these filter banks is presented, motivated by earlier results and applications based on the filter bank representation of Reed-Solomon (RS) codes, such as Soft-In Soft-Out RS-decoding or RS-OFDM. The specific structure of the filter banks (critical subsampling) is an important aspect in these applications. The goal of the paper is twofold. First, the filter bank representation of RS codes is now explained based on polynomial descriptions. This approach allows us to gain new insight into the correspondence between RS codes and filter banks. More specifically, it allows us to show that the inherent periodically time-varying character of a critically subsampled filter bank matches remarkably well with the cyclic properties of RS codes. Secondly, an extension of these techniques toward the more general class of BCH codes is presented. It is demonstrated that a BCH code can be decomposed into a sum of critically subsampled filter banks.
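The polynomial description underlying the filter-bank view can be illustrated on a small cyclic code: encoding is a polynomial product c(x) = m(x)g(x), i.e. a convolution (FIR filtering) of the message coefficients with the generator. The GF(2) example below, the (7,4) cyclic Hamming code with g(x) = x^3 + x + 1, is ours for illustration; the paper works with RS/BCH codes over larger fields.

```python
def gf2_convolve(a, b):
    """Polynomial product over GF(2); coefficient lists, lowest degree first."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj
    return out

g = [1, 1, 0, 1]          # g(x) = 1 + x + x^3
m = [1, 0, 1, 1]          # message polynomial of degree < 4
c = gf2_convolve(m, g)    # 7 codeword coefficients: filtering the message

# Cyclic property: every cyclic shift of a codeword is again a codeword,
# since x*c(x) mod (x^7 + 1) remains a multiple of g(x).
```

The filter-bank correspondence comes from reading `gf2_convolve` as an FIR filter whose taps are the generator coefficients.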

  11. Character Education and Students Social Behavior

    Directory of Open Access Journals (Sweden)

    Syamsu A. Kamaruddin

    2012-09-01

    Full Text Available

    In an educational environment, character education programs have been carried out both formally and informally. They are intended as supporting ideas to be followed up in the form of designed activities. Character education should basically refer to the vision and mission of the institution concerned, which points to two aspects of student character: the human character of the individual learners and the hallmark of the institution. In this paper, the author develops ideas on these two aspects by referring to other writings. In the end, the author expects the birth of a design that can serve as an early reference to spearhead a character development program for learners.

  12. PRIMUS: a computer code for the preparation of radionuclide ingrowth matrices from user-specified sources

    International Nuclear Information System (INIS)

    Hermann, O.W.; Baes, C.F. III; Miller, C.W.; Begovich, C.L.; Sjoreen, A.L.

    1984-10-01

    The computer program, PRIMUS, reads a library of radionuclide branching fractions and half-lives and constructs a decay-chain data library and a problem-specific decay-chain data file. PRIMUS reads the decay data compiled for 496 nuclides from the Evaluated Nuclear Structure Data File (ENSDF). The ease of adding radionuclides to the input library allows the CRRIS system to further expand its comprehensive data base. The decay-chain library produced is input to the ANEMOS code. Also, PRIMUS produces a data set reduced to only the decay chains required in a particular problem, for input to the SUMIT, TERRA, MLSOIL, and ANDROS codes. Air concentrations and deposition rates are computed from the PRIMUS decay-chain data file. Source term data may be entered directly to PRIMUS to be read by MLSOIL, TERRA, and ANDROS. The decay-chain data prepared by PRIMUS are needed for a matrix-operator method that computes time-dependent decay products either from an initial concentration or from a constant input source. This document describes the input requirements and the output obtained. Also, sections are included on methods, applications, subroutines, and sample cases. A short appendix indicates a method of utilizing PRIMUS and the associated decay subroutines from TERRA or ANDROS for applications to other decay problems. 18 references
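The matrix-operator method rests on the same mathematics as the classical Bateman equations for decay-chain ingrowth. As a hedged sketch (not the PRIMUS implementation), the snippet below evaluates the analytic two-member solution for a chain A → B → stable, with invented half-lives:

```python
import math

def bateman_two_member(n_a0, lam_a, lam_b, t):
    """Atoms of parent A and daughter B at time t, starting from pure A.
    Requires lam_a != lam_b (the degenerate case needs a separate formula)."""
    n_a = n_a0 * math.exp(-lam_a * t)
    n_b = (n_a0 * lam_a / (lam_b - lam_a)
           * (math.exp(-lam_a * t) - math.exp(-lam_b * t)))
    return n_a, n_b

# Hypothetical nuclides with half-lives of 8.0 and 12.0 days:
lam_a = math.log(2) / 8.0
lam_b = math.log(2) / 12.0
n_a, n_b = bateman_two_member(1.0e6, lam_a, lam_b, 8.0)
# After one parent half-life, half of A remains and B has grown in.
```

A matrix-operator code generalizes this by exponentiating the full decay matrix, which handles arbitrary chain lengths and branching without chain-specific formulas.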

  13. Essays on Character & Opportunity

    Science.gov (United States)

    Center on Children and Families at Brookings, 2014

    2014-01-01

    These essays provide a richer set of writings on the philosophical, empirical and practical issues raised by a focus on character, and in particular its relationship to questions of opportunity. Each one is an intellectual pemmican: sharp and to the point. Two scholars draw attention to the gendered nature of character formation (Segal and Lexmond);…

  14. Evaluation of the efficacy of twelve mitochondrial protein-coding genes as barcodes for mollusk DNA barcoding.

    Science.gov (United States)

    Yu, Hong; Kong, Lingfeng; Li, Qi

    2016-01-01

    In this study, we evaluated the efficacy of 12 mitochondrial protein-coding genes from 238 mitochondrial genomes of 140 molluscan species as potential DNA barcodes for mollusks. Three barcoding methods (distance, monophyly and character-based methods) were used in species identification. The species recovery rates based on genetic distances for the 12 genes ranged from 70.83 to 83.33%. There were no significant differences in intra- or interspecific variability among the 12 genes. The monophyly and character-based methods provided higher resolution than the distance-based method in species delimitation. Especially in closely related taxa, the character-based method showed some advantages. The results suggested that besides the standard COI barcode, the other 11 mitochondrial protein-coding genes could also be used as molecular diagnostics for molluscan species discrimination. Our results also showed that combining mitochondrial genes did not enhance the efficacy of species identification, and that a single mitochondrial gene is fully adequate.
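The character-based method mentioned above amounts to finding alignment columns whose nucleotide states are fixed within each species but differ between species. A minimal sketch with invented toy sequences (not data from the study):

```python
def diagnostic_positions(species_a, species_b):
    """Return alignment positions where species A and B each show a single,
    mutually different nucleotide state (pure diagnostic characters)."""
    positions = []
    for i in range(len(species_a[0])):
        states_a = {seq[i] for seq in species_a}
        states_b = {seq[i] for seq in species_b}
        if len(states_a) == 1 and len(states_b) == 1 and states_a != states_b:
            positions.append(i)
    return positions

species_a = ["ACGTAC", "ACGTAT"]   # toy barcode fragments for species A
species_b = ["ACCTAC", "ACCTAT"]   # toy barcode fragments for species B
print(diagnostic_positions(species_a, species_b))   # → [2]
```

Position 5 varies within both species, so it is not diagnostic; only position 2 separates the two sets cleanly.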

  15. BLT [Breach, Leach, and Transport]: A source term computer code for low-level waste shallow land burial

    International Nuclear Information System (INIS)

    Suen, C.J.; Sullivan, T.M.

    1990-01-01

    This paper discusses the development of a source term model for low-level waste shallow land burial facilities and separates the problem into four individual compartments. These are water flow, corrosion and subsequent breaching of containers, leaching of the waste forms, and solute transport. For the first and the last compartments, we adopted the existing codes, FEMWATER and FEMWASTE, respectively. We wrote two new modules for the other two compartments in the form of two separate Fortran subroutines -- BREACH and LEACH. They were incorporated into a modified version of the transport code FEMWASTE. The resultant code, which contains all three modules of container breaching, waste form leaching, and solute transport, was renamed BLT (for Breach, Leach, and Transport). This paper summarizes the overall program structure and logistics, and presents two examples from the results of verification and sensitivity tests. 6 refs., 7 figs., 1 tab

  16. Building Character through Literacy with Children's Literature

    Science.gov (United States)

    Almerico, Gina M.

    2014-01-01

    Character education is described as curriculum specifically developed to teach children about the quality and traits of good character. One means in which children can learn about good character is through the pages of high quality children's literature. In this study, the author defines the characteristics of an effective character development…

  17. The Effect of Realistic Appearance of Virtual Characters in Immersive Environments - Does the Character's Personality Play a Role?

    Science.gov (United States)

    Zibrek, Katja; Kokkinara, Elena; Mcdonnell, Rachel

    2018-04-01

    Virtual characters that appear almost photo-realistic have been shown to induce negative responses from viewers in traditional media, such as film and video games. This effect, described as the uncanny valley, is the reason why realism is often avoided when the aim is to create an appealing virtual character. In Virtual Reality, there have been few attempts to investigate this phenomenon and the implications of rendering virtual characters with high levels of realism on user enjoyment. In this paper, we conducted a large-scale experiment on over one thousand members of the public in order to gather information on how virtual characters are perceived in interactive virtual reality games. We were particularly interested in whether different render styles (realistic, cartoon, etc.) would directly influence appeal, or if a character's personality was the most important indicator of appeal. We used a number of perceptual metrics such as subjective ratings, proximity, and attribution bias in order to test our hypothesis. Our main result shows that affinity towards virtual characters is a complex interaction between the character's appearance and personality, and that realism is in fact a positive choice for virtual characters in virtual reality.

  18. On the Cultural Coding Function of the Korean Four-Character Idioms

    Institute of Scientific and Technical Information of China (English)

    任晓礼

    2016-01-01

    Due to their high brevity and powerful expressive force, Korean four-character idioms are often used as a cultural code to sum up social and cultural phenomena. In order to provide a shortcut for people to correctly understand Korean society and its culture, this thesis analyzes examples on multiple levels to explore the cultural coding function of these idioms. It first investigates the origins of the Korean four-character idioms, thereby determining the research scope; it then takes "the comment on the people of the eight provinces" as an example to reveal the historical depth of their cultural coding function; finally, it analyzes the social and cultural phenomena the idioms mark in terms of "the annual state of society", "business philosophy of the enterprises", "strategic thinking of the football match" and "the socio-cultural psychology".

  19. Promoting Character Development through Coach Education

    Science.gov (United States)

    Power, F. Clark; Seroczynski, A. D.

    2015-01-01

    Can youth sports build character? Research suggests that the answer to this question leads to 2 further questions: (1) can youth sport coaches be effectively prepared to become character educators, and (2) can character education take place in today's competitive youth sport environment? (Bredemeier & Shields, 2006; Power, 2015; Power &…

  20. Braille Character Recognition Using Artificial Neural Network

    OpenAIRE

    Subur, Joko; Sardjono, Tri Arief; Mardiyanto, Ronny

    2015-01-01

    Braille letters are characters designed for the blind, consisting of six embossed dots arranged in a standard braille cell. Braille letters are touched and read using the fingers; therefore, the sensitivity of the fingers is important. The characters need to be memorized, so braille is very difficult to learn. The aim of this research is to create a braille character recognition system and translate its output to alpha-numeric text. A webcam camera is used to capture the braille image from braille characte...
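The final translation step, from a recognized six-dot cell to an alpha-numeric character, is a simple table lookup. A minimal sketch using the standard braille dot numbering (dots 1-3 down the left column, 4-6 down the right); only a few letters are shown, and the names are ours rather than from the paper:

```python
# Standard Grade 1 braille patterns for the first few letters.
BRAILLE_TO_CHAR = {
    frozenset({1}): "a",
    frozenset({1, 2}): "b",
    frozenset({1, 4}): "c",
    frozenset({1, 4, 5}): "d",
    frozenset({1, 5}): "e",
}

def cell_to_char(raised_dots):
    """raised_dots: set of dot numbers detected as embossed in one cell."""
    return BRAILLE_TO_CHAR.get(frozenset(raised_dots), "?")

print(cell_to_char({1, 4}))   # → c
```

In a full system, the neural network's job is to produce the `raised_dots` set reliably from the webcam image; the lookup itself is deterministic.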

  1. Transparent ICD and DRG coding using information technology: linking and associating information sources with the eXtensible Markup Language.

    Science.gov (United States)

    Hoelzer, Simon; Schweiger, Ralf K; Dudeck, Joachim

    2003-01-01

    With the introduction of ICD-10 as the standard for diagnostics, it becomes necessary to develop an electronic representation of its complete content, inherent semantics, and coding rules. The authors' design relates to the current efforts by the CEN/TC 251 to establish a European standard for hierarchical classification systems in health care. The authors have developed an electronic representation of ICD-10 with the eXtensible Markup Language (XML) that facilitates integration into current information systems and coding software, taking different languages and versions into account. In this context, XML provides a complete processing framework of related technologies and standard tools that helps develop interoperable applications. XML provides semantic markup. It allows domain-specific definition of tags and hierarchical document structure. The idea of linking and thus combining information from different sources is a valuable feature of XML. In addition, XML topic maps are used to describe relationships between different sources, or "semantically associated" parts of these sources. The issue of achieving a standardized medical vocabulary becomes more and more important with the stepwise implementation of diagnostically related groups, for example. The aim of the authors' work is to provide a transparent and open infrastructure that can be used to support clinical coding and to develop further software applications. The authors are assuming that a comprehensive representation of the content, structure, inherent semantics, and layout of medical classification systems can be achieved through a document-oriented approach.
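The document-oriented approach can be illustrated with a small hierarchical classification fragment parsed using Python's standard-library XML tools. The element names, codes and titles below are abbreviated illustrations of ours, not the authors' actual ICD-10 schema:

```python
import xml.etree.ElementTree as ET

ICD_FRAGMENT = """
<chapter code="X" title="Diseases of the respiratory system">
  <block code="J09-J18" title="Influenza and pneumonia">
    <category code="J10" title="Influenza due to identified influenza virus">
      <subcategory code="J10.0" title="Influenza with pneumonia"/>
    </category>
  </block>
</chapter>
"""

root = ET.fromstring(ICD_FRAGMENT)

def find_title(root, code):
    """Look up a code anywhere in the hierarchy and return its title."""
    for elem in root.iter():
        if elem.get("code") == code:
            return elem.get("title")
    return None

print(find_title(root, "J10.0"))   # → Influenza with pneumonia
```

Because the hierarchy is explicit in the markup, coding software can walk from chapter to subcategory, or link entries across documents, without a separate database schema.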

  2. An Implementation of Error Minimization Data Transmission in OFDM using Modified Convolutional Code

    Directory of Open Access Journals (Sweden)

    Hendy Briantoro

    2016-04-01

    Full Text Available This paper presents error minimization in an OFDM system. Conventional systems usually use channel coding such as a BCH code or a convolutional code, but the performance of these codes in an OFDM implementation is not good. The error bit rate of the OFDM system without channel coding is 5.77%. A convolutional code with code rate 1/2 reduces the error bit rate only to 3.85%. We therefore propose an OFDM system with a Modified Convolutional Code. In this implementation, we used Software Defined Radio (SDR), namely the Universal Software Radio Peripheral (USRP) NI 2920, as the transmitter and receiver. The OFDM system using the Modified Convolutional Code is able to recover all received characters, decreasing the error bit rate to 0%. The performance gain of the Modified Convolutional Code is about 1 dB at a BER of 10^-4 over the BCH code and the convolutional code, so the Modified Convolutional Code performs better than either. Keywords: OFDM, BCH Code, Convolutional Code, Modified Convolutional Code, SDR, USRP
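The abstract does not specify the modification, so as a baseline the sketch below implements a standard rate-1/2 convolutional encoder with constraint length 3 and generator polynomials (7, 5) in octal, a common reference configuration; it is our illustration, not the paper's code.

```python
G1, G2 = 0b111, 0b101   # generator polynomials 7 and 5 (octal)

def conv_encode(bits):
    """Encode a bit list at rate 1/2: two output bits per input bit."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & 0b111           # 3-bit shift register
        out.append(bin(state & G1).count("1") % 2)   # parity against G1
        out.append(bin(state & G2).count("1") % 2)   # parity against G2
    return out

encoded = conv_encode([1, 0, 1, 1])
print(len(encoded))   # rate 1/2: 8 output bits for 4 input bits
```

A matching hard-decision Viterbi decoder at the receiver is what turns the redundancy into the bit-error-rate reduction the abstract reports.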

  3. The Evaluation of Dogmatic Television Characters by Dogmatic Viewers: "Is Archie Bunker a Credible Source?"

    Science.gov (United States)

    Surlin, Stuart H.

    The highly rated television program series, "All in the Family," was used to test the relationship between attitudes espoused by televised characters and attitudes held by viewers of this type of television programming. On the basis of survey questionnaires, it was concluded that people who hold dogmatic and, especially, racist beliefs find…

  4. Version 4.00 of the MINTEQ geochemical code

    Energy Technology Data Exchange (ETDEWEB)

    Eary, L.E.; Jenne, E.A.

    1992-09-01

    The MINTEQ code is a thermodynamic model that can be used to calculate solution equilibria for geochemical applications. Included in the MINTEQ code are formulations for ionic speciation, ion exchange, adsorption, solubility, redox, gas-phase equilibria, and the dissolution of finite amounts of specified solids. Since the initial development of the MINTEQ geochemical code, a number of undocumented versions of the source code and data files have come into use at the Pacific Northwest Laboratory (PNL). This report documents these changes, describes source code modifications made for the Aquifer Thermal Energy Storage (ATES) program, and provides comprehensive listings of the data files. A version number of 4.00 has been assigned to the MINTEQ source code and the individual data files described in this report.

  6. A Study of Character among Collegiate Athletes

    Science.gov (United States)

    Heupel, Jill D.

    2017-01-01

    The idea that sport builds character has been around for a long time. However, sports may not build the type of character once thought. Character of athletes was defined based on differing views held by sport scholars, coaches, athletes, and sport enthusiasts. Sport scholars tend to view character of athletes from a moral perspective. Coaches,…

  7. QR code based noise-free optical encryption and decryption of a gray scale image

    Science.gov (United States)

    Jiao, Shuming; Zou, Wenbin; Li, Xia

    2017-03-01

    In optical encryption systems, speckle noise is one major challenge in obtaining high quality decrypted images. This problem can be addressed by employing a QR code based noise-free scheme. Previous works have been conducted for optically encrypting a few characters or a short expression employing QR codes. This paper proposes a practical scheme for optically encrypting and decrypting a gray-scale image based on QR codes for the first time. The proposed scheme is compatible with common QR code generators and readers. Numerical simulation results reveal the proposed method can encrypt and decrypt an input image correctly.

  8. The Analysis Of Personality Disorder On Two Characters In The Animation Series Black Rock Shooter

    OpenAIRE

    Ramadhana, Rizki Andrian

    2015-01-01

    The title of this thesis is The Analysis of Personality Disorder on Two Characters in the Animation Series “Black Rock Shooter”, which discusses the personality disorders of the two characters Kagari Izuriha and Yomi Takanashi. The animation series Black Rock Shooter was chosen as the source of data because it has a psychological genre and represents the complexity of human relationships, especially when building up a friendship. This is because a human is a social...

  9. Network coding for multi-resolution multicast

    DEFF Research Database (Denmark)

    2013-01-01

    A method, apparatus and computer program product for utilizing network coding for multi-resolution multicast is presented. A network source partitions source content into a base layer and one or more refinement layers. The network source receives a respective one or more push-back messages from one...... or more network destination receivers, the push-back messages identifying the one or more refinement layers suited for each one of the one or more network destination receivers. The network source computes a network code involving the base layer and the one or more refinement layers for at least one...... of the one or more network destination receivers, and transmits the network code to the one or more network destination receivers in accordance with the push-back messages....
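Over GF(2), the layered scheme described above can be sketched with XOR arithmetic: the source mixes the base and refinement layers into coded packets, and a receiver that requested both layers (via its push-back message) recovers them by re-XORing. Packet contents and coefficients are invented for illustration; real systems typically code over larger fields such as GF(2^8).

```python
def xor_bytes(a, b):
    """Packet combination over GF(2): bytewise XOR."""
    return bytes(x ^ y for x, y in zip(a, b))

base       = b"BASE"   # layer every receiver needs
refinement = b"HIDE"   # extra layer for capable receivers

# Source emits one plain and one coded packet (coefficient rows [1,0], [1,1]).
p1 = base
p2 = xor_bytes(base, refinement)

# A low-capability receiver uses only p1. A full-resolution receiver decodes
# both layers by Gaussian elimination, which over GF(2) is just XOR:
rec_base = p1
rec_refinement = xor_bytes(p1, p2)
```

The push-back messages decide which coefficient rows each receiver should be sent, so low-capability nodes never carry refinement traffic they cannot use.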

  10. Pre-Test Analysis of the MEGAPIE Spallation Source Target Cooling Loop Using the TRAC/AAA Code

    International Nuclear Information System (INIS)

    Bubelis, Evaldas; Coddington, Paul; Leung, Waihung

    2006-01-01

    A pilot project is being undertaken at the Paul Scherrer Institute in Switzerland to test the feasibility of installing a Lead-Bismuth Eutectic (LBE) spallation target in the SINQ facility. Efforts are coordinated under the MEGAPIE project, the main objectives of which are to design, build, operate and decommission a 1 MW spallation neutron source. The technology and experience of building and operating a high power spallation target are of general interest in the design of an Accelerator Driven System (ADS) and in this context MEGAPIE is one of the key experiments. The target cooling is one of the important aspects of the target system design that needs to be studied in detail. Calculations were performed previously using the RELAP5/Mod 3.2.2 and ATHLET codes, but in order to verify the previous code results and to provide another capability to model LBE systems, a similar study of the MEGAPIE target cooling system has been conducted with the TRAC/AAA code. In this paper a comparison is presented for the steady-state results obtained using the above codes. Analysis of transients, such as unregulated cooling of the target, loss of heat sink, the main electro-magnetic pump trip of the LBE loop and unprotected proton beam trip, were studied with TRAC/AAA and compared to those obtained earlier using RELAP5/Mod 3.2.2. This work extends the existing validation data-base of TRAC/AAA to heavy liquid metal systems and comprises the first part of the TRAC/AAA code validation study for LBE systems based on data from the MEGAPIE test facility and corresponding inter-code comparisons. (authors)

  11. A progressive data compression scheme based upon adaptive transform coding: Mixture block coding of natural images

    Science.gov (United States)

    Rost, Martin C.; Sayood, Khalid

    1991-01-01

    A method for efficiently coding natural images using a vector-quantized variable-blocksized transform source coder is presented. The method, mixture block coding (MBC), incorporates variable-rate coding by using a mixture of discrete cosine transform (DCT) source coders. Which coders are selected to code any given image region is made through a threshold driven distortion criterion. In this paper, MBC is used in two different applications. The base method is concerned with single-pass low-rate image data compression. The second is a natural extension of the base method which allows for low-rate progressive transmission (PT). Since the base method adapts easily to progressive coding, it offers the aesthetic advantage of progressive coding without incorporating extensive channel overhead. Image compression rates of approximately 0.5 bit/pel are demonstrated for both monochrome and color images.
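    The threshold-driven selection rule at the heart of MBC can be illustrated with a mixture of uniform scalar quantizers standing in for the DCT coders. Only the selection rule (choose the coarsest coder whose distortion clears the threshold) is taken from the abstract; the quantizers and parameters are illustrative:

```python
def quantize(block, step):
    """Uniform scalar quantizer standing in for a DCT coder at one rate."""
    return [round(x / step) * step for x in block]

def mse(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def mixture_code(blocks, steps, threshold):
    """Per block, pick the coarsest (cheapest) coder whose distortion
    stays at or below the threshold -- the MBC selection rule."""
    choices = []
    for block in blocks:
        for step in sorted(steps, reverse=True):   # coarsest first
            rec = quantize(block, step)
            if mse(block, rec) <= threshold:
                choices.append((step, rec))
                break
        else:
            # no coder met the threshold: fall back to the finest one
            step = min(steps)
            choices.append((step, quantize(block, step)))
    return choices
```

    Progressive transmission then falls out naturally: the coarse-coder output is sent first, and finer coders refine only the blocks that failed the threshold.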

  12. IMAGE PROCESSING BASED OPTICAL CHARACTER RECOGNITION USING MATLAB

    OpenAIRE

    Jyoti Dalal*1 & Sumiran Daiya2

    2018-01-01

    Character recognition techniques associate a symbolic identity with the image of a character. In a typical OCR system, input characters are digitized by an optical scanner. Each character is then located and segmented, and the resulting character image is fed into a pre-processor for noise reduction and normalization. Certain characteristics are then extracted from the character for classification. The feature extraction step is critical, and many different techniques exist, each having its strengths ...
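    The pipeline the abstract describes (digitize, segment, pre-process, extract features) can be sketched for a single character bitmap. Thresholding stands in for the noise-reduction pre-processor and zoning densities for the feature extractor; both are common choices but merely illustrative here:

```python
def binarize(img, thresh=128):
    """Pre-processing stand-in: threshold a grayscale raster to 0/1 ink."""
    return [[1 if px < thresh else 0 for px in row] for row in img]

def zoning_features(bitmap, zones=2):
    """Split the character bitmap into zones x zones cells and report
    the ink density of each cell -- one classic feature choice."""
    h, w = len(bitmap), len(bitmap[0])
    feats = []
    for zi in range(zones):
        for zj in range(zones):
            cell = [bitmap[r][c]
                    for r in range(zi * h // zones, (zi + 1) * h // zones)
                    for c in range(zj * w // zones, (zj + 1) * w // zones)]
            feats.append(sum(cell) / len(cell))
    return feats
```

    The resulting density vector is what a downstream classifier would consume.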

  13. Body Language Advanced 3D Character Rigging

    CERN Document Server

    Allen, Eric; Fong, Jared; Sidwell, Adam G

    2011-01-01

    Whether you're a professional Character TD or just like to create 3D characters, this detailed guide reveals the techniques you need to create sophisticated 3D character rigs that range from basic to breathtaking. Packed with step-by-step instructions and full-color illustrations, Body Language walks you through rigging techniques for all the body parts to help you create realistic and believable movements in every character you design. You'll learn advanced rigging concepts that involve MEL scripting and advanced deformation techniques and even how to set up a character pipeline.

  14. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    1999-01-01

    The main goal of the present lecture is to show how transport lattice calculations are realised in a standard computer code. This is illustrated using the example of the WIMSD code, one of the most popular tools for reactor calculations. Most of the approaches discussed here can be easily adapted to any other lattice code. The description of the code assumes basic knowledge of the reactor lattice, at the level given in the lecture on 'Reactor lattice transport calculations'. For a more advanced explanation of the WIMSD code the reader is directed to the detailed descriptions of the code cited in the References. The discussion of the methods and models included in the code is followed by the generally used homogenisation procedure and several numerical examples of discrepancies in calculated multiplication factors based on different sources of library data. (author)

  15. SEVERAL OBSERVATIONS REGARDING THE REGULATION OF THE CONTRACT OF PARTNERSHIP IN THE NEW CIVIL CODE

    Directory of Open Access Journals (Sweden)

    IOLANDA-ELENA CADARIU-LUNGU

    2012-05-01

    Full Text Available Following the model of the Italian Civil Code, the Civil Code of Quebec, and the Swiss and Dutch ones, the new Romanian Civil Code has adopted the monist conception of regulating the private law relationships, gathering in the same normative act traditional civil law dispositions as well as dispositions that are specific to the commercial relationships among professionals. In this regulating context, one of the fundamental changes the new Civil Code brings is the unification of the legal regime applicable to civil and commercial contracts, with all the consequences that derive from this new legislative approach. This fundamental modification is first determined by the profound change of the character of social, economic and juridical relationships, by the change of the cultural level of the Romanian society, by the closeness of the two branches of civil and commercial law and, last but not least, by the evolution of the business environment. In this line of thought, we can identify important changes in the matter of the contract of partnership which, as regulated by the new Civil Code, constitutes the common law both for the simple partnerships (former civil societies) as well as for the commercial companies, to which the special legislation still in force in the matter still applies. In this study we aimed at analyzing the general common features of all associative forms listed by art. 1.888 Civil Code and the new elements in the matter, with critical observations where needed, which take the form of a comparison with the specific legislation in the field from the Civil Codes that served as a source of inspiration for the Romanian legislator.

  16. On Hemingway’s Literary Characters

    Directory of Open Access Journals (Sweden)

    Maria-Miruna Ciocoi-Pop

    2013-01-01

    Full Text Available The present paper is a brief outline of Hemingway’s characters and the way in which they correspond to the author himself. It is known for a fact that Hemingway evinced a tendency to imitate his characters when they were coming to grips with diverse situations. Thus I have tried to briefly pinpoint the fading boundaries between reality and imagination in his work. By doing so, I have focused on both male and female characters, underlining the major dissimilarities between these two categories, as well as their main features.

  17. Moral character effects in endorser perception

    Directory of Open Access Journals (Sweden)

    Chang Joseph W.

    2017-06-01

    Full Text Available This research consists of two experimental studies investigating the influence of moral character on endorser perception, and the influence of perceiver characteristics on tarnished endorser perception and brand evaluations. Perceiver characteristics are discussed from the perspectives of dispositional tendency, innate moral intuitions and self-location. The first study compared the influences of moral character and warmth on endorser perception. The second study examined the impact of perceiver characteristics on tarnished endorsers and brand evaluations. The findings reveal that moral character is more influential than warmth on endorser evaluations. Tarnished endorsers with immoral character exert more negative influence than tarnished endorsers with coldness character on brand evaluations. Innate moral intuitions and self-location moderate brand evaluations. High-morality consumers and heart-locators are more vulnerable than low-morality and brain-locators to the brands endorsed by tarnished endorsers, respectively.

  18. Video Game Characters. Theory and Analysis

    Directory of Open Access Journals (Sweden)

    Felix Schröter

    2014-06-01

    Full Text Available This essay develops a method for the analysis of video game characters based on a theoretical understanding of their medium-specific representation and the mental processes involved in their intersubjective construction by video game players. We propose to distinguish, first, between narration, simulation, and communication as three modes of representation particularly salient for contemporary video games and the characters they represent, second, between narrative, ludic, and social experience as three ways in which players perceive video game characters and their representations, and, third, between three dimensions of video game characters as ‘intersubjective constructs’, which usually are to be analyzed not only as fictional beings with certain diegetic properties but also as game pieces with certain ludic properties and, in those cases in which they function as avatars in the social space of a multiplayer game, as representations of other players. Having established these basic distinctions, we proceed to analyze their realization and interrelation by reference to the character of Martin Walker from the third-person shooter Spec Ops: The Line (Yager Development 2012), the highly customizable player-controlled characters from the role-playing game The Elder Scrolls V: Skyrim (Bethesda 2011), and the complex multidimensional characters in the massively multiplayer online role-playing game Star Wars: The Old Republic (BioWare 2011-2014).

  19. RIES - Rijnland Internet Election System: A Cursory Study of Published Source Code

    Science.gov (United States)

    Gonggrijp, Rop; Hengeveld, Willem-Jan; Hotting, Eelco; Schmidt, Sebastian; Weidemann, Frederik

    The Rijnland Internet Election System (RIES) is a system designed for voting in public elections over the internet. A rather cursory scan of the source code to RIES showed a significant lack of security-awareness among the programmers which - among other things - appears to have left RIES vulnerable to near-trivial attacks. If it had not been for independent studies finding problems, RIES would have been used in the 2008 Water Board elections, possibly handling a million votes or more. While RIES was more extensively studied to find cryptographic shortcomings, our work shows that more down-to-earth secure design practices can be at least as important, and the aspects need to be examined much sooner than right before an election.

  20. Evolving Lattices for Analyzing Behavioral Dynamics of Characters in Literary Text

    Directory of Open Access Journals (Sweden)

    Eugene S Kitamura

    2011-10-01

    Full Text Available This paper presents an application of rough-set-derived lattices to analyzing the dynamics of literary text. Due to the double-approximation nature of rough set theory, a pseudo-closure obtained from two different equivalence relations allows us to form arbitrary lattices. Moreover, such double approximations with different equivalence relations permit us to obtain lattice fixed points based on two interpretations. The two interpretations used for literary text analysis are subjects and their attributes. The attributes chosen for this application are verbs. The progression of a story is defined by the sequence of verbs (or event occurrences). By fixing a window size and sliding the window along the story, we obtain a lattice representing the relationship between subjects and their attributes within that window frame. The resulting lattice provides information such as complementarity (lattice complement existence rate) and distributivity (lattice complement possession rate). These measurements depend on the overlap, or lack of overlap, among the attributes of characters. As the story develops and new characters and attributes are provided as the source of lattices, one can observe the evolution of the lattice. In fact, a dramatic change in the behavioral dynamics of a scene is reflected in particular shifts in the character-attribute relationship. This method lets us quantify the development of character behavioral dynamics in a story.
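    The windowing step can be sketched directly: fix a window size, slide it over the sequence of (character, verb) events, and collect each character's attribute set per frame. The rough-set lattice construction the paper builds on top of these sets is omitted; the names below are illustrative:

```python
from collections import defaultdict

def windowed_attributes(events, size):
    """Slide a fixed-size window over (subject, verb) events and return,
    per window position, the verb set attributed to each character."""
    snapshots = []
    for start in range(len(events) - size + 1):
        window = events[start:start + size]
        attrs = defaultdict(set)
        for subject, verb in window:
            attrs[subject].add(verb)
        snapshots.append(dict(attrs))
    return snapshots
```

    Comparing consecutive snapshots shows how the overlap among characters' attribute sets, and hence the shape of the derived lattice, shifts as the story progresses.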

  1. Comparison of brain mechanisms underlying the processing of Chinese characters and pseudo-characters: an event-related potential study.

    Science.gov (United States)

    Wang, Ting; Li, Hong; Zhang, Qinglin; Tu, Shen; Yu, Caiyun; Qiu, Jiang

    2010-04-01

    Most Chinese characters are composed of a semantic radical on the left and a phonetic radical on the right. The semantic radical provides the semantic information; the phonetic radical provides information concerning the pronunciation of the whole character. The pseudo-characters in the study consisted of different sub-lexical parts of real Chinese characters and consequently they also had the semantic radical and the phonetic radical. But they were not readable and had no actual meaning. In order to investigate the spatiotemporal cortical activation patterns underlying the orthographic, phonological and semantic processing of Chinese characters, we used event-related brain potentials (ERPs) to explore the processing of Chinese characters and pseudo-characters when 14 healthy Chinese college students viewed the characters passively. Results showed that both Chinese characters and pseudo-characters elicited an evident negative potential peaking around 120 ms (N120), which appeared to reflect initial orthographic distinction and evaluation. Then, Chinese pseudo-characters elicited a more positive ERP deflection (P220) than did Chinese characters 200-250 ms after onset of the stimuli. It was similar to the recognition potential (RP) and might reflect the integration processes of phonological and semantic processing on the basis of early orthographic information. Dipole source analysis of the difference wave (pseudo-characters minus characters) indicated that a generator localized in the left temporal-occipital junction contributed to this effect, which was possibly related to phonological and perceptual-semantic information integration. Between 350-450 ms, a greater negativity (N360) in pseudo-characters as compared to characters was found over midline fronto-central scalp regions. Dipole analysis localized the generator of N360 in the right parahippocampal cortex. 
Therefore, the N360 might be an N400 component and reflect the higher-level semantic activation on the

  2. Modeling the Semiotic Structure of Player-Characters

    DEFF Research Database (Denmark)

    Vella, Daniel

    2014-01-01

    When game studies has tackled the player-character, it has tended to do so by means of an opposition to the notion of the avatar, with the result that the ontological and semiotic nature of the character in itself has not been given due attention. This paper draws on understandings of character...... from the fields of narratology and literary theory to highlight the double-layered ontology of character as both a possible individual and as a semiotic construction. Uri Margolin’s narratological model of character signification is used as the basis for developing a semiotic-structural model...

  3. Nuclear science references coding manual

    International Nuclear Information System (INIS)

    Ramavataram, S.; Dunford, C.L.

    1996-08-01

    This manual is intended as a guide to Nuclear Science References (NSR) compilers. The basic conventions followed at the National Nuclear Data Center (NNDC), which are compatible with the maintenance and updating of and retrieval from the Nuclear Science References (NSR) file, are outlined. In Section II, the structure of the NSR file such as the valid record identifiers, record contents, text fields as well as the major TOPICS for which keyword abstracts are prepared are enumerated. Relevant comments regarding a new entry into the NSR file, assignment of , generation of and linkage characteristics are also given in Section II. In Section III, a brief definition of the Keyword abstract is given followed by specific examples; for each TOPIC, the criteria for inclusion of an article as an entry into the NSR file as well as coding procedures are described. Authors preparing Keyword abstracts either to be published in a Journal (e.g., Nucl. Phys. A) or to be sent directly to NNDC (e.g., Phys. Rev. C) should follow the illustrations in Section III. The scope of the literature covered at the NNDC, the categorization into Primary and Secondary sources, etc., is discussed in Section IV. Useful information regarding permitted character sets, recommended abbreviations, etc., is given under Section V as Appendices

  4. Building innovative and creative character through mathematics

    Science.gov (United States)

    Suyitno, Hardi; Suyitno, Amin

    2018-03-01

    The 21st century is predicted to be a century of rapid development in all aspects of life. People require a creative and innovative character to thrive in it. Specifically, mathematics is taught to students from kindergarten through middle school, so building character through mathematics should begin at an early age. The problem is how to build a creative and innovative character through mathematics education. The goal expected from this question is to build innovative and creative characters to face the challenges of the 21st century. This article discusses the values of mathematics, the values in mathematics education, innovative and creative character, and the integration of these values into the teaching of mathematics in a way that supports innovative and creative character building, applying the values in structurally programmed, measurable, and applicable learning activities.

  5. Use of WIMS-E lattice code for prediction of the transuranic source term for spent fuel dose estimation

    International Nuclear Information System (INIS)

    Schwinkendorf, K.N.

    1996-01-01

    A recent source term analysis has shown a discrepancy between ORIGEN2 transuranic isotopic production estimates and those produced with the WIMS-E lattice physics code. Excellent agreement between relevant experimental measurements and WIMS-E was shown, thus exposing an error in the cross section library used by ORIGEN2

  6. Dynamic benchmarking of simulation codes

    International Nuclear Information System (INIS)

    Henry, R.E.; Paik, C.Y.; Hauser, G.M.

    1996-01-01

    Computer simulation of nuclear power plant response can be a full-scope control room simulator, an engineering simulator to represent the general behavior of the plant under normal and abnormal conditions, or the modeling of the plant response to conditions that would eventually lead to core damage. In any of these, the underlying foundation for their use in analysing situations, training of vendor/utility personnel, etc. is how well they represent what has been known from industrial experience, large integral experiments and separate effects tests. Typically, simulation codes are benchmarked with some of these; the level of agreement necessary being dependent upon the ultimate use of the simulation tool. However, these analytical models are computer codes, and as a result, the capabilities are continually enhanced, errors are corrected, new situations are imposed on the code that are outside of the original design basis, etc. Consequently, there is a continual need to assure that the benchmarks with important transients are preserved as the computer code evolves. Retention of this benchmarking capability is essential to develop trust in the computer code. Given the evolving world of computer codes, how is this retention of benchmarking capabilities accomplished? For the MAAP4 codes this capability is accomplished through a 'dynamic benchmarking' feature embedded in the source code. In particular, a set of dynamic benchmarks are included in the source code and these are exercised every time the archive codes are upgraded and distributed to the MAAP users. Three different types of dynamic benchmarks are used: plant transients; large integral experiments; and separate effects tests. Each of these is performed in a different manner. The first is accomplished by developing a parameter file for the plant modeled and an input deck to describe the sequence; i.e. the entire MAAP4 code is exercised. The pertinent plant data is included in the source code and the computer

  7. Use of CITATION code for flux calculation in neutron activation analysis with voluminous sample using an Am-Be source

    International Nuclear Information System (INIS)

    Khelifi, R.; Idiri, Z.; Bode, P.

    2002-01-01

    The CITATION code based on neutron diffusion theory was used for flux calculations inside voluminous samples in prompt gamma activation analysis with an isotopic neutron source (Am-Be). The code uses specific parameters related to the energy spectrum source and irradiation system materials (shielding, reflector). The flux distribution (thermal and fast) was calculated in the three-dimensional geometry for the system: air, polyethylene and water cuboidal sample (50x50x50 cm). Thermal flux was calculated in a series of points inside the sample. The results agreed reasonably well with observed values. The maximum thermal flux was observed at a distance of 3.2 cm while CITATION gave 3.7 cm. Beyond a depth of 7.2 cm, the thermal flux to fast flux ratio increases up to twice and allows us to optimise the detection system position in the scope of in-situ PGAA
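    Although CITATION solves multigroup neutron diffusion in three dimensions, the underlying discretization reduces in one dimension to a tridiagonal system. A minimal sketch (one energy group, uniform slab, zero-flux boundary conditions, illustrative parameters, solved with the Thomas algorithm):

```python
def diffusion_flux(n, h, D, sig_a, source):
    """Solve -D*phi'' + sig_a*phi = source on a 1-D slab of n nodes with
    spacing h and zero flux just outside both ends, via finite differences
    and the Thomas (tridiagonal) algorithm."""
    a = [-D / h**2] * n               # sub-diagonal
    b = [2 * D / h**2 + sig_a] * n    # main diagonal
    c = [-D / h**2] * n               # super-diagonal
    d = [source] * n                  # uniform source term
    # forward elimination
    for i in range(1, n):
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    # back substitution
    phi = [0.0] * n
    phi[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        phi[i] = (d[i] - c[i] * phi[i + 1]) / b[i]
    return phi
```

    Deep inside a thick slab the flux approaches source/sig_a, while near the boundaries it falls off over a diffusion length, which is the qualitative behavior the CITATION study exploits when locating the thermal-flux maximum inside the sample.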

  8. Virtual Character Personality Influences Participant Attitudes and Behavior - An Interview with a Virtual Human Character about Her Social Anxiety

    Directory of Open Access Journals (Sweden)

    Xueni ePan

    2015-02-01

    Full Text Available We introduce a novel technique for the study of human-virtual character interaction in immersive virtual reality. The human participants verbally administered a standard questionnaire about social anxiety to a virtual female character, that responded to each question through speech and body movements. The purpose was to study the extent to which participants responded differently to characters that exhibited different personalities, even though the verbal content of their answers was always the same. A separate online study provided evidence that our intention to create two different personality types had been successful. In the main between-groups experiment that utilized a Cave system there were 24 male participants, where 12 interacted with a female virtual character portrayed to exhibit shyness and the remaining 12 with an identical but more confident virtual character. Our results indicate that although the content of the verbal responses of both virtual characters was the same, participants showed different subjective and behavioral responses to the two different personalities. In particular participants evaluated the shy character more positively, for example, expressing willingness to spend more time with her. Participants evaluated the confident character more negatively and waited for a significantly longer time to call her back after she had left the scene in order to answer a telephone call. The method whereby participants interviewed the virtual character allowed naturalistic conversation while avoiding the necessity of speech processing and generation, and natural language understanding. It is therefore a useful method for the study of the impact of virtual character personality on participant responses.

  9. Artificial Neural Network Based Optical Character Recognition

    OpenAIRE

    Vivek Shrivastava; Navdeep Sharma

    2012-01-01

    Optical Character Recognition deals with the recognition and classification of characters from an image. For the recognition to be accurate, certain topological and geometrical properties are calculated, based on which a character is classified and recognized. Also, human psychology perceives characters by their overall shape and features such as strokes, curves, protrusions, enclosures etc. These properties, also called Features, are extracted from the image by means of spatial pixel-...
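    The classification stage the abstract alludes to can be sketched with the smallest possible neural network, a single perceptron trained to separate two feature vectors. Real systems use multilayer networks over the stroke/curve features described above, so this is purely illustrative:

```python
def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Train one linear neuron with the classic perceptron update rule."""
    w = [0.0] * len(samples[0])
    bias = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + bias > 0 else 0
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            bias += lr * err
    return w, bias

def predict(w, bias, x):
    """Classify a feature vector with the trained weights."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + bias > 0 else 0
```

    With one output neuron per character class (and hidden layers for non-linearly-separable features), this scales up to the multi-class character recognizers the paper describes.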

  10. Character Education in Print: Content Analysis of Character Education in Introduction to Education Textbooks

    Science.gov (United States)

    Protz, Babette Marisa

    2013-01-01

    Albert Einstein is credited with saying that the most important component of education is the development of students' character. While debate exists as to the delivery of character education in the public schools, it must be recognized that not all students have a support system outside of the schoolhouse. Consequently, when character…

  11. Landscape Character of Pongkor Mining Ecotourism Area

    Science.gov (United States)

    Kusumoarto, A.; Gunawan, A.; Machfud; Hikmat, A.

    2017-10-01

    Pongkor Mining Ecotourism Area has a diverse landscape character and potential landscape resources for development as an ecotourism destination. The area is part of the Mount Botol Resort, Halimun Salak National Park (HSNP), and also has fairly high biodiversity. This study aims to identify and analyze the categories of landscape character in the Pongkor Mining Ecotourism Area for the development of an ecotourism destination. The study used a descriptive approach through field surveys and interviews, carried out in two steps: 1) identification of the landscape character, and 2) analysis of the landscape character. The results showed that the landscape character categories scattered across the area set aside for ecotourism in Pongkor Mining are forest, tailing ponds, river, plain, and the built environment. The most dominant landscape character category in the area is forest, followed by the river, plain, tailing ponds, and the built environment. The landscape character of the natural environment is the most preferred for ecotourism activities. The landscape character spread across the natural and built environments is a potential that must be protected and, where needed, modified (through elimination of incongruous elements, accentuation or alteration of natural forms, intensification, and intensive enhancement of visual quality) to be developed as an ecotourism destination area.

  12. Gravity inversion code

    International Nuclear Information System (INIS)

    Burkhard, N.R.

    1979-01-01

    The gravity inversion code applies stabilized linear inverse theory to determine the topography of a subsurface density anomaly from Bouguer gravity data. The gravity inversion program consists of four source codes: SEARCH, TREND, INVERT, and AVERAGE. TREND and INVERT are used iteratively to converge on a solution. SEARCH forms the input gravity data files for Nevada Test Site data. AVERAGE performs a covariance analysis on the solution. This document describes the necessary input files and the proper operation of the code. 2 figures, 2 tables
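    The "stabilized linear inverse theory" the abstract names, and the iterative TREND/INVERT loop that converges on a solution, can be sketched as damped gradient iterations on a tiny linear model. The matrix `G`, the damping `lam`, and the step size `alpha` are illustrative, not parameters of the actual program:

```python
def stabilized_inverse(G, d, lam=0.01, alpha=0.1, iters=2000):
    """Iteratively fit a model m so that G m ~ d, with Tikhonov damping
    lam standing in for the code's stabilization. Converges toward the
    solution of (G^T G + lam I) m = G^T d."""
    rows, cols = len(G), len(G[0])
    m = [0.0] * cols
    for _ in range(iters):
        resid = [d[i] - sum(G[i][j] * m[j] for j in range(cols))
                 for i in range(rows)]
        grad = [sum(G[i][j] * resid[i] for i in range(rows)) - lam * m[j]
                for j in range(cols)]
        m = [m[j] + alpha * grad[j] for j in range(cols)]
    return m
```

    The damping term keeps the inversion stable when the data only weakly constrain parts of the model, which is exactly why stabilization is needed when inverting Bouguer gravity data for subsurface topography.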

  13. Data set for Tifinagh handwriting character recognition

    Directory of Open Access Journals (Sweden)

    Omar Bencharef

    2015-09-01

    Full Text Available The Tifinagh alphabet-IRCAM is the official alphabet of the Amazigh language, widely used in North Africa [1]. It includes thirty-one basic letters and two letters each composed of a base letter followed by the sign of labialization. Normalized only in 2003 (Unicode [2]), IRCAM-Tifinagh is a young character repertoire which needs more work on all levels. In this context we propose a data set for handwritten Tifinagh characters composed of 1376 images; 43 images for each character. The dataset can be used to train a Tifinagh character recognition system, or to extract the meaningful characteristics of each character.

  14. Guidelines on Active Content and Mobile Code: Recommendations of the National Institute of Standards and Technology

    National Research Council Canada - National Science Library

    Jansen, Wayne

    2001-01-01

    .... One such category of technologies is active content. Broadly speaking, active content refers to electronic documents that, unlike past character documents based on the American Standard Code for Information Interchange (ASCII...

  15. A Classification Scheme for Literary Characters

    Directory of Open Access Journals (Sweden)

    Matthew Berry

    2017-10-01

    Full Text Available There is no established classification scheme for literary characters in narrative theory short of generic categories like protagonist vs. antagonist or round vs. flat. This is so despite the ubiquity of stock characters that recur across media, cultures, and historical time periods. We present here a proposal of a systematic psychological scheme for classifying characters from the literary and dramatic fields based on a modification of the Thomas-Kilmann (TK) Conflict Mode Instrument used in applied studies of personality. The TK scheme classifies personality along the two orthogonal dimensions of assertiveness and cooperativeness. To examine the validity of a modified version of this scheme, we had 142 participants provide personality ratings for 40 characters using two of the Big Five personality traits as well as assertiveness and cooperativeness from the TK scheme. The results showed that assertiveness and cooperativeness were orthogonal dimensions, thereby supporting the validity of using a modified version of TK's two-dimensional scheme for classifying characters.
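    A character rated on the two orthogonal TK dimensions can be assigned to a conflict mode mechanically. The sketch below uses the four corner modes (the instrument also defines a fifth, compromising, in the middle) and an assumed 0.5 cutoff on ratings scaled to [0, 1]; both simplifications are ours, not the paper's:

```python
def classify_tk(assertiveness, cooperativeness, cutoff=0.5):
    """Map ratings on the two orthogonal TK dimensions (scaled to 0-1)
    to the four corner conflict modes. The 0.5 cutoff is an assumption."""
    if assertiveness >= cutoff and cooperativeness >= cutoff:
        return "collaborating"
    if assertiveness >= cutoff:
        return "competing"
    if cooperativeness >= cutoff:
        return "accommodating"
    return "avoiding"
```

    A stock villain would typically land in the competing quadrant, a self-effacing sidekick in accommodating, and so on.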

  16. An Assessment of Some Design Constraints on Heat Production of a 3D Conceptual EGS Model Using an Open-Source Geothermal Reservoir Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Yidong Xia; Mitch Plummer; Robert Podgorney; Ahmad Ghassemi

    2016-02-01

    Performance of the heat production process over a 30-year period is assessed in a conceptual EGS model with a geothermal gradient of 65 K per km depth in the reservoir. Water is circulated through a pair of parallel wells connected by a set of single large wing fractures. The results indicate that the desirable output electric power rate and lifespan could be obtained under suitable material properties and system parameters. A sensitivity analysis on some design constraints and operation parameters indicates that 1) the horizontal fracture spacing has a profound effect on the long-term performance of heat production, 2) the downward deviation angle of the parallel doublet wells may help overcome the difficulty of vertical drilling to reach a favorable production temperature, and 3) the thermal energy production rate and lifespan depend closely on the water mass flow rate. The results also indicate that heat production can be improved when the horizontal fracture spacing, well deviation angle, and production flow rate are under reasonable conditions. To conduct the reservoir modeling and simulations, an open-source, finite-element-based, fully implicit, fully coupled hydrothermal code, namely FALCON, has been developed and used in this work. Compared with most other existing codes in this area, which are either closed-source or commercially available, this new open-source code demonstrates a code development strategy that aims to provide unparalleled ease of user customization and multi-physics coupling. Test results have shown that the FALCON code is able to complete the long-term tests efficiently and accurately, thanks to the state-of-the-art nonlinear and linear solver algorithms implemented in the code.

  17. Character, attitude and disposition

    OpenAIRE

    Webber, Jonathan

    2015-01-01

    Recent debate over the empirical psychological presuppositions of virtue ethics has focused on reactive behavioural dispositions. But there are many character traits that cannot be understood properly in this way. Such traits are well described by attitude psychology. Moreover, the findings of attitude psychology support virtue ethics in three ways. First, they confirm the role of habituation in the development of character. Further, they show virtue ethics to be compatible with the situation...

  18. Container code recognition in information auto collection system of container inspection

    International Nuclear Information System (INIS)

    Su Jianping; Chen Zhiqiang; Zhang Li; Gao Wenhuan; Kang Kejun

    2003-01-01

    Customs now needs electronic applications and automatic detection. Container inspection should not only provide an image of the goods, but also automatically obtain the container's code and weight. These functions, together with tracking control and information transfer, make up the Information Auto Collection system of Container Inspection. Code recognition is the key point. The article is based on template matching and the closeness property of characters, which it uses for recognition. Based on the checkout rule, an adjustment algorithm is designed to form the whole recognition strategy. This strategy can achieve a high recognition ratio and robustness.
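    The "checkout rule" used to validate a recognized container code is presumably the ISO 6346 check digit, which every container number carries as its eleventh character; whether this particular system uses it is our assumption. A sketch of the standard algorithm:

```python
def iso6346_check_digit(code):
    """ISO 6346 check digit for an 11-character container code: letters
    map to 10-38 skipping multiples of 11, each of the first ten
    characters is weighted by 2**position, and the weighted sum is
    reduced mod 11 (a result of 10 becomes 0)."""
    values = {}
    v = 10
    for ch in "ABCDEFGHIJKLMNOPQRSTUVWXYZ":
        if v % 11 == 0:      # skip 11, 22, 33
            v += 1
        values[ch] = v
        v += 1
    total = sum((values[c] if c.isalpha() else int(c)) * (2 ** i)
                for i, c in enumerate(code[:10]))
    return (total % 11) % 10
```

    An OCR result whose computed check digit disagrees with its eleventh character can be rejected or corrected, which is how a check rule lifts the raw recognition ratio.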

  19. Rapid extraction of lexical tone phonology in Chinese characters: a visual mismatch negativity study.

    Directory of Open Access Journals (Sweden)

    Xiao-Dong Wang

    Full Text Available BACKGROUND: In alphabetic languages, emerging evidence from behavioral and neuroimaging studies shows the rapid and automatic activation of phonological information in visual word recognition. In the mapping from orthography to phonology, unlike most alphabetic languages in which there is a natural correspondence between the visual and phonological forms, in logographic Chinese, the mapping between visual and phonological forms is rather arbitrary and depends on learning and experience. The issue of whether the phonological information is rapidly and automatically extracted in Chinese characters by the brain has not yet been thoroughly addressed. METHODOLOGY/PRINCIPAL FINDINGS: We continuously presented Chinese characters differing in orthography and meaning to adult native Mandarin Chinese speakers to construct a constantly varying visual stream. In the stream, most stimuli were homophones of Chinese characters: The phonological features embedded in these visual characters were the same, including consonants, vowels and the lexical tone. Occasionally, the rule of phonology was randomly violated by characters whose phonological features differed in the lexical tone. CONCLUSIONS/SIGNIFICANCE: We showed that the violation of the lexical tone phonology evoked an early, robust visual response, as revealed by whole-head electrical recordings of the visual mismatch negativity (vMMN), indicating the rapid extraction of phonological information embedded in Chinese characters. Source analysis revealed that the vMMN was involved in neural activations of the visual cortex, suggesting that the visual sensory memory is sensitive to phonological information embedded in visual words at an early processing stage.

  20. Rapid extraction of lexical tone phonology in Chinese characters: a visual mismatch negativity study.

    Science.gov (United States)

    Wang, Xiao-Dong; Liu, A-Ping; Wu, Yin-Yuan; Wang, Peng

    2013-01-01

    In alphabetic languages, emerging evidence from behavioral and neuroimaging studies shows the rapid and automatic activation of phonological information in visual word recognition. In the mapping from orthography to phonology, unlike most alphabetic languages in which there is a natural correspondence between the visual and phonological forms, in logographic Chinese, the mapping between visual and phonological forms is rather arbitrary and depends on learning and experience. The issue of whether the phonological information is rapidly and automatically extracted in Chinese characters by the brain has not yet been thoroughly addressed. We continuously presented Chinese characters differing in orthography and meaning to adult native Mandarin Chinese speakers to construct a constantly varying visual stream. In the stream, most stimuli were homophones of Chinese characters: The phonological features embedded in these visual characters were the same, including consonants, vowels and the lexical tone. Occasionally, the rule of phonology was randomly violated by characters whose phonological features differed in the lexical tone. We showed that the violation of the lexical tone phonology evoked an early, robust visual response, as revealed by whole-head electrical recordings of the visual mismatch negativity (vMMN), indicating the rapid extraction of phonological information embedded in Chinese characters. Source analysis revealed that the vMMN was involved in neural activations of the visual cortex, suggesting that the visual sensory memory is sensitive to phonological information embedded in visual words at an early processing stage.

  1. SFACTOR: a computer code for calculating dose equivalent to a target organ per microcurie-day residence of a radionuclide in a source organ

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, D.E. Jr.; Pleasant, J.C.; Killough, G.G.

    1977-11-01

    A computer code SFACTOR was developed to estimate the average dose equivalent S (rem/μCi-day) to each of a specified list of target organs per microcurie-day residence of a radionuclide in source organs in man. Source and target organs of interest are specified in the input data stream, along with the nuclear decay information. The SFACTOR code computes components of the dose equivalent rate from each type of decay present for a particular radionuclide, including alpha, electron, and gamma radiation. For those transuranic isotopes which also decay by spontaneous fission, components of S from the resulting fission fragments, neutrons, betas, and gammas are included in the tabulation. Tabulations of all components of S are provided for an array of 22 source organs and 24 target organs for 52 radionuclides in an adult.
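Dose-factor tabulations of this kind generally follow the MIRD-style formalism for internal dosimetry. As a hedged sketch (standard MIRD symbols, not necessarily SFACTOR's internal notation), the dose-equivalent factor from source organ S to target organ T sums contributions over all radiation types i emitted per nuclear transition:

```latex
S(T \leftarrow S) \;=\; k \sum_i \Delta_i \, Q_i \, \frac{\phi_i(T \leftarrow S)}{m_T}
```

where Δᵢ is the mean energy emitted per transition as radiation type i, Qᵢ the quality factor converting absorbed dose (rad) to dose equivalent (rem), φᵢ(T ← S) the fraction of that energy absorbed in T, m_T the target-organ mass, and k a unit-conversion constant.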

  2. SFACTOR: a computer code for calculating dose equivalent to a target organ per microcurie-day residence of a radionuclide in a source organ

    International Nuclear Information System (INIS)

    Dunning, D.E. Jr.; Pleasant, J.C.; Killough, G.G.

    1977-11-01

    A computer code SFACTOR was developed to estimate the average dose equivalent S (rem/μCi-day) to each of a specified list of target organs per microcurie-day residence of a radionuclide in source organs in man. Source and target organs of interest are specified in the input data stream, along with the nuclear decay information. The SFACTOR code computes components of the dose equivalent rate from each type of decay present for a particular radionuclide, including alpha, electron, and gamma radiation. For those transuranic isotopes which also decay by spontaneous fission, components of S from the resulting fission fragments, neutrons, betas, and gammas are included in the tabulation. Tabulations of all components of S are provided for an array of 22 source organs and 24 target organs for 52 radionuclides in an adult

  3. Using Quick Response Codes in the Classroom: Quality Outcomes.

    Science.gov (United States)

    Zurmehly, Joyce; Adams, Kellie

    2017-10-01

    With smart device technology emerging, educators are challenged with redesigning teaching strategies using technology to allow students to participate dynamically and provide immediate answers. To facilitate integration of technology and to actively engage students, quick response codes were included in a medical surgical lecture. Quick response codes are two-dimensional square patterns that enable the coding or storage of more than 7000 characters that can be accessed via a quick response code scanning application. The aim of this quasi-experimental study was to explore quick response code use in a lecture and measure students' satisfaction (met expectations, increased interest, helped understand, and provided practice and prompt feedback) and engagement (liked most, liked least, wanted changed, and kept involved), assessed using an investigator-developed instrument. Although there was no statistically significant correlation of quick response use to examination scores, satisfaction scores were high, and there was a small yet positive association between how students perceived their learning with quick response codes and overall examination scores. Furthermore, on open-ended survey questions, students responded that they were satisfied with the use of quick response codes, appreciated the immediate feedback, and planned to use them in the clinical setting. Quick response codes offer a way to integrate technology into the classroom to provide students with instant positive feedback.

  4. Performance evaluation based on data from code reviews

    OpenAIRE

    Andrej, Sekáč

    2016-01-01

    Context. Modern code review tools such as Gerrit have made available great amounts of code review data from different open source projects as well as other commercial projects. Code reviews are used to keep the quality of produced source code under control but the stored data could also be used for evaluation of the software development process. Objectives. This thesis uses machine learning methods for an approximation of review expert’s performance evaluation function. Due to limitations in ...

  5. Recycling source terms for edge plasma fluid models and impact on convergence behaviour of the BRAAMS 'B2' code

    International Nuclear Information System (INIS)

    Maddison, G.P.; Reiter, D.

    1994-02-01

    Predictive simulations of tokamak edge plasmas require the most authentic description of neutral particle recycling sources, not merely the most expedient numerically. Employing a prototypical ITER divertor arrangement under conditions of high recycling, trial calculations with the 'B2' steady-state edge plasma transport code, plus varying approximations of recycling, reveal marked sensitivity of both the results and the convergence behaviour to details of the sources incorporated. Comprehensive EIRENE Monte Carlo resolution of recycling is implemented by full and so-called 'shot' intermediate cycles between the plasma fluid and statistical neutral particle models. As generally for coupled differencing and stochastic procedures, though, overall convergence properties become more difficult to assess. A pragmatic criterion for the 'B2'/EIRENE code system is proposed to determine its success, proceeding from a stricter condition previously identified for one particular analytic approximation of recycling in 'B2'. Certain procedures that could potentially improve convergence further are also identified. (orig.)

  6. Application of the source term code package to obtain a specific source term for the Laguna Verde Nuclear Power Plant

    International Nuclear Information System (INIS)

    Souto, F.J.

    1991-06-01

    The main objective of the project was to use the Source Term Code Package (STCP) to obtain a specific source term for those accident sequences deemed dominant as a result of probabilistic safety analyses (PSA) for the Laguna Verde Nuclear Power Plant (CNLV). The following programme has been carried out to meet this objective: (a) implementation of the STCP, (b) acquisition of specific data for CNLV to execute the STCP, and (c) calculations of specific source terms for accident sequences at CNLV. The STCP has been implemented and validated on CDC 170/815 and CDC 180/860 main frames as well as on a Micro VAX 3800 system. In order to get a plant-specific source term, data on the CNLV including initial core inventory, burn-up, primary containment structures, and materials used for the calculations have been obtained. Because STCP does not explicitly model containment failure, dry well failure in the form of a catastrophic rupture has been assumed. One of the most significant sequences from the point of view of possible off-site risk is the loss of off-site power with failure of the diesel generators and simultaneous loss of high pressure core spray and reactor core isolation cooling systems. The probability for that event is approximately 4.5 × 10⁻⁶. This sequence has been analysed in detail and the release fractions of radioisotope groups are given in the full report. 18 refs, 4 figs, 3 tabs

  7. Madness in Shakespeare's Characters

    Directory of Open Access Journals (Sweden)

    Nuno Borja-Santos

    2014-10-01

    Full Text Available This paper begins with an introduction where the aims are explained: a psychopathological analysis of a Shakespearean character, Othello, followed by the discussion of the English dramatist’s importance in helping us understand madness in the emergent world of the Renaissance. The main characteristics of Othello’s personality, which allowed the development of his jealousy delusion, are described. Finally, the conclusions underline the overlap of the symptoms developed by the character with the DSM-IV classification.

  8. Optical character recognition based on nonredundant correlation measurements.

    Science.gov (United States)

    Braunecker, B; Hauck, R; Lohmann, A W

    1979-08-15

    The essence of character recognition is a comparison between the unknown character and a set of reference patterns. Usually, these reference patterns are all possible characters themselves, the whole alphabet in the case of letter characters. Obviously, N analog measurements are highly redundant, since only K = log₂N binary decisions are enough to identify one out of N characters. Therefore, we devised K reference patterns accordingly. These patterns, called principal components, are found by digital image processing, but used in an optical analog computer. We will explain the concept of principal components, and we will describe experiments with several optical character recognition systems, based on this concept.
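The counting argument above, that K = log₂N binary decisions can distinguish N characters, together with the principal-component idea, can be sketched numerically. This is our illustration, not the authors' optical setup: random bitmaps stand in for the actual letter alphabet, and the principal components are simply the top singular directions of the centered character set.

```python
import math
import numpy as np

rng = np.random.default_rng(0)
N = 26                                     # size of the letter alphabet
chars = rng.integers(0, 2, size=(N, 64)).astype(float)  # stand-in 8x8 bitmaps

K = math.ceil(math.log2(N))                # 5 binary decisions suffice for 26 letters

# Principal components: top-K right singular vectors of the centered data.
centered = chars - chars.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
components = vt[:K]                        # K reference patterns instead of N

# One binary measurement (sign of the projection) per component.
signatures = centered @ components.T > 0
print(K, signatures.shape)
```

Each character is thus reduced to a K-bit signature obtained from only K correlation measurements, rather than N full comparisons.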

  9. Leadership, character and its development: A qualitative exploration

    Directory of Open Access Journals (Sweden)

    Roslyn de Braine

    2007-11-01

    Full Text Available The purpose of this study was to explore (1) what organisational leaders consider to be character elements of leaders within the workplace, (2) what influences leaders’ character development, and (3) how an organisation can continue the process of character development. The literature review and findings revealed that leadership, integrity, industriousness, empathy, loyalty, optimism, fairness and compassion are the most sought after character elements within leaders in the workplace. Leadership and integrity were found to be the most supported character elements. The findings also indicate that work environmental factors, a person’s own efforts, and the daily experiences of work life contribute towards character development.

  10. Understanding the Properties of Interactive Televised Characters

    Science.gov (United States)

    Claxton, Laura J.; Ponto, Katelyn C.

    2013-01-01

    Children's television programming frequently uses interactive characters that appear to directly engage the viewers. These characters encourage children to answer questions and perform actions to help the characters solve problems in the televised world. Children readily engage in these interactions; however, it is unclear why they do so. To…

  11. Introduction to coding and information theory

    CERN Document Server

    Roman, Steven

    1997-01-01

    This book is intended to introduce coding theory and information theory to undergraduate students of mathematics and computer science. It begins with a review of probability theory as applied to finite sample spaces and a general introduction to the nature and types of codes. The two subsequent chapters discuss information theory: efficiency of codes, the entropy of information sources, and Shannon's Noiseless Coding Theorem. The remaining three chapters deal with coding theory: communication channels, decoding in the presence of errors, the general theory of linear codes, and such specific codes as Hamming codes, the simplex codes, and many others.

  12. Bar code usage in nuclear materials accountability

    International Nuclear Information System (INIS)

    Mee, W.T.

    1983-01-01

    The age-old method of physically taking an inventory of materials by listing each item's identification number has outlived its usefulness. In this age of computerization, which offers the local grocery store a quick, sure, and easy means to inventory, it is time for nuclear materials facilities to automate accountability activities. The Oak Ridge Y-12 Plant began investigating the use of automated data collection devices in 1979. At that time, bar code and optical-character-recognition (OCR) systems were reviewed with the purpose of directly entering data into DYMCAS (Dynamic Special Nuclear Materials Control and Accountability System). Both of these systems appeared applicable; however, other automated devices already employed for production control made implementing the bar code and OCR seem improbable. Nevertheless, when the DYMCAS was placed on line for nuclear material accountability, a decision was made to consider the bar code for physical inventory listings. For the past several months a development program has been underway to use a bar code device to collect and input data to the DYMCAS on the uranium recovery operations. Programs have been completed and tested, and are being employed to ensure that data will be compatible and useful. Bar code implementation and the expansion of its use to all nuclear material inventory activity in Y-12 are presented.

  13. HyDEn: A Hybrid Steganocryptographic Approach for Data Encryption Using Randomized Error-Correcting DNA Codes

    Directory of Open Access Journals (Sweden)

    Dan Tulpan

    2013-01-01

    Full Text Available This paper presents a novel hybrid DNA encryption (HyDEn approach that uses randomized assignments of unique error-correcting DNA Hamming code words for single characters in the extended ASCII set. HyDEn relies on custom-built quaternary codes and a private key used in the randomized assignment of code words and the cyclic permutations applied on the encoded message. Along with its ability to detect and correct errors, HyDEn equals or outperforms existing cryptographic methods and represents a promising in silico DNA steganographic approach.
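HyDEn's key ingredient, a keyed random assignment of quaternary (A/C/G/T) code words to the 256 extended-ASCII characters, can be illustrated with a toy sketch. This is not the paper's construction: real HyDEn words are error-correcting DNA Hamming codes and the scheme also applies cyclic permutations, both of which we omit; the function names and 5-base word length are our own choices.

```python
import itertools
import random

BASES = "ACGT"

def dna_codebook(key: int, word_len: int = 5):
    """Randomly assign a distinct quaternary word to each of the 256
    extended-ASCII characters, seeded by a private key."""
    words = ["".join(w) for w in itertools.product(BASES, repeat=word_len)]
    chosen = random.Random(key).sample(words, 256)  # 4^5 = 1024 >= 256 words
    return {chr(i): chosen[i] for i in range(256)}

def encode(msg: str, key: int) -> str:
    book = dna_codebook(key)
    return "".join(book[ch] for ch in msg)

def decode(dna: str, key: int, word_len: int = 5) -> str:
    inverse = {v: k for k, v in dna_codebook(key).items()}
    return "".join(inverse[dna[i:i + word_len]]
                   for i in range(0, len(dna), word_len))

print(encode("Hi!", key=42))
```

Without the key, an adversary sees only an apparently random DNA string; with it, the codebook is reconstructed and the message decoded exactly.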

  14. Efficient Coding of Information: Huffman Coding -RE ...

    Indian Academy of Sciences (India)

    to a stream of equally-likely symbols so as to recover the original stream in the event of errors. The for- ... The source-coding problem is one of finding a mapping from U to a ... probability that the random variable X takes the value x written as ...
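The source-coding problem this snippet introduces is solved optimally, for symbol-by-symbol codes, by Huffman's algorithm: repeatedly merge the two least probable subtrees so that rarer symbols receive longer code words. A compact sketch:

```python
import heapq
from collections import Counter

def huffman_codes(freqs):
    """Build a prefix-free binary code from a symbol -> weight mapping."""
    # Heap entries: [weight, tiebreak, {symbol: partial code}].
    heap = [[w, i, {s: ""}] for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)    # two lightest subtrees
        w2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, [w1 + w2, tiebreak, merged])
        tiebreak += 1
    return heap[0][2]

codes = huffman_codes(Counter("abracadabra"))
print(codes)
```

For "abracadabra" the most frequent symbol, 'a', gets a one-bit code word, while the rare 'c' and 'd' get three bits each; no code word is a prefix of another, so the stream decodes unambiguously.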

  15. Active Fault Near-Source Zones Within and Bordering the State of California for the 1997 Uniform Building Code

    Science.gov (United States)

    Petersen, M.D.; Toppozada, Tousson R.; Cao, T.; Cramer, C.H.; Reichle, M.S.; Bryant, W.A.

    2000-01-01

    The fault sources in the Project 97 probabilistic seismic hazard maps for the state of California were used to construct maps for defining near-source seismic coefficients, Na and Nv, incorporated in the 1997 Uniform Building Code (ICBO 1997). The near-source factors are based on the distance from a known active fault that is classified as either Type A or Type B. To determine the near-source factor, four pieces of geologic information are required: (1) recognizing a fault and determining whether or not the fault has been active during the Holocene, (2) identifying the location of the fault at or beneath the ground surface, (3) estimating the slip rate of the fault, and (4) estimating the maximum earthquake magnitude for each fault segment. This paper describes the information used to produce the fault classifications and distances.

  16. Factorized combinations of Virasoro characters

    International Nuclear Information System (INIS)

    Bytsko, A.G.; Fring, A.

    2000-01-01

    We investigate linear combinations of characters for minimal Virasoro models which are representable as a product of several basic blocks. Our analysis is based on consideration of asymptotic behaviour of the characters in the quasi-classical limit. In particular, we introduce a notion of the secondary effective central charge. We find all possible cases for which factorization occurs on the base of the Gauss-Jacobi or the Watson identities. Exploiting these results, we establish various types of identities between different characters. In particular, we present several identities generalizing the Rogers-Ramanujan identities. Applications to quasi-particle representations, modular invariant partition functions, super-conformal theories and conformal models with boundaries are briefly discussed. (orig.)

  17. Character Recognition Using Genetically Trained Neural Networks

    Energy Technology Data Exchange (ETDEWEB)

    Diniz, C.; Stantz, K.M.; Trahan, M.W.; Wagner, J.S.

    1998-10-01

    Computationally intelligent recognition of characters and symbols addresses a wide range of applications including foreign language translation and chemical formula identification. The combination of intelligent learning and optimization algorithms with layered neural structures offers powerful techniques for character recognition. These techniques were originally developed by Sandia National Laboratories for pattern and spectral analysis; however, their ability to optimize vast amounts of data makes them ideal for character recognition. An adaptation of the Neural Network Designer software allows the user to create a neural network (NN) trained by a genetic algorithm (GA) that correctly identifies multiple distinct characters. The initial successful recognition of standard capital letters can be expanded to include chemical and mathematical symbols and alphabets of foreign languages, especially Arabic and Chinese. The NN model constructed for this project uses a three-layer feed-forward architecture. To facilitate the input of characters and symbols, a graphical user interface (GUI) has been developed to convert the traditional representation of each character or symbol to a bitmap. The 8 x 8 bitmap representations used for these tests are mapped onto the input nodes of the feed-forward neural network (FFNN) in a one-to-one correspondence. The input nodes feed forward into a hidden layer, and the hidden layer feeds into five output nodes correlated to possible character outcomes. During the training period the GA optimizes the weights of the NN until it can successfully recognize distinct characters. Systematic deviations from the base design test the network's range of applicability. Increasing capacity, the number of letters to be recognized, requires a nonlinear increase in the number of hidden layer neurodes. Optimal character recognition performance necessitates a minimum threshold for the number of cases when genetically training the net. And, the
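The training scheme described above, a GA evolving the weights of a feed-forward net on bitmap inputs, can be imitated in miniature. This is a toy sketch, not the authors' system: random bitmaps stand in for letter images, a single sigmoid unit replaces the hidden layer and five output nodes, and the population size, mutation scale, and generation count are our own choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two stand-in 8x8 "characters" flattened to 64 inputs, with target outputs 0 and 1.
X = rng.integers(0, 2, size=(2, 64)).astype(float)
y = np.array([0.0, 1.0])

def fitness(w):
    """Negative mean-squared error of a single sigmoid unit (higher is fitter)."""
    z = np.clip(X @ w[:-1] + w[-1], -60.0, 60.0)   # clip to avoid exp overflow
    out = 1.0 / (1.0 + np.exp(-z))
    return -float(np.mean((out - y) ** 2))

pop = rng.normal(size=(40, 65))                    # population of weight vectors
for _ in range(200):
    order = np.argsort([fitness(w) for w in pop])
    parents = pop[order[-10:]]                     # elitism: keep the 10 fittest
    children = parents[rng.integers(0, 10, size=30)] \
        + rng.normal(scale=0.1, size=(30, 65))     # mutate copies of random parents
    pop = np.vstack([parents, children])

best = max(pop, key=fitness)
```

No gradients are used anywhere; selection plus mutation alone drives the error on the two training bitmaps toward zero, which is the essential point of GA-trained networks.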

  18. The Psychological Effect of Television Characters: The Case of Archie Bunker and Authoritarian Viewers.

    Science.gov (United States)

    Surlin, Stuart H.; Bowden, Elizabeth

    Reference group theory suggests that a perceived similarity between interacting individuals leads to future interaction, increased source credibility, and more frequent agreement on specific issues. This study shows how the reference group theory applies to the authoritarian television character Archie Bunker and television viewers that watch…

  19. Recent advances in coding theory for near error-free communications

    Science.gov (United States)

    Cheung, K.-M.; Deutsch, L. J.; Dolinar, S. J.; Mceliece, R. J.; Pollara, F.; Shahshahani, M.; Swanson, L.

    1991-01-01

    Channel and source coding theories are discussed. The following subject areas are covered: large constraint length convolutional codes (the Galileo code); decoder design (the big Viterbi decoder); Voyager's and Galileo's data compression scheme; current research in data compression for images; neural networks for soft decoding; neural networks for source decoding; finite-state codes; and fractals for data compression.

  20. Good character at school: positive classroom behavior mediates the link between character strengths and school achievement.

    Science.gov (United States)

    Wagner, Lisa; Ruch, Willibald

    2015-01-01

    Character strengths have been found to be substantially related to children's and adolescents' well-being. Initial evidence suggests that they also matter for school success (e.g., Weber and Ruch, 2012). The present set of two studies aimed at replicating and extending these findings in two different age groups, primary school students (N = 179; mean age = 11.6 years) and secondary school students (N = 199; mean age = 14.4 years). The students completed the VIA-Youth (Values in Action Inventory of Strengths for Youth), a self-report measure of the 24 character strengths in the VIA classification. Their teachers rated the students' positive behavior in the classroom. Additionally, school achievement was assessed: For the primary school students (Study 1), teachers rated the students' overall school achievement and for the secondary school students (Study 2), we used their grades as a measure of school achievement. We found that several character strengths were associated with both positive classroom behavior and school achievement. Across both samples, school achievement was correlated with love of learning, perseverance, zest, gratitude, hope, and perspective. The strongest correlations with positive classroom behavior were found for perseverance, self-regulation, prudence, social intelligence, and hope. For both samples, there were indirect effects of some of the character strengths on school achievement through teacher-rated positive classroom behavior. The converging findings from the two samples support the notion that character strengths contribute to positive classroom behavior, which in turn enhances school achievement. Results are discussed in terms of their implications for future research and for school interventions based on character strengths.

  1. Good character at school: Positive classroom behavior mediates the link between character strengths and school achievement

    Directory of Open Access Journals (Sweden)

    Lisa eWagner

    2015-05-01

    Full Text Available Character strengths have been found to be substantially related to children’s and adolescents’ well-being. Initial evidence suggests that they also matter for school success (e.g., Weber and Ruch, 2012). The present set of two studies aimed at replicating and extending these findings in two different age groups, primary school students (N = 179; mean age = 11.6 years) and secondary school students (N = 199; mean age = 14.4 years). The students completed the VIA-Youth, a self-report measure of the 24 character strengths in the VIA classification. Their teachers rated the students’ positive behavior in the classroom. Additionally, school achievement was assessed: For the primary school students (Study 1), teachers rated the students’ overall school achievement and for the secondary school students (Study 2), we used their grades as a measure of school achievement. We found that several character strengths were associated with both positive classroom behavior and school achievement. Across both samples, school achievement was correlated with love of learning, perseverance, zest, gratitude, hope, and perspective. The strongest correlations with positive classroom behavior were found for perseverance, self-regulation, prudence, social intelligence, and hope. For both samples, there were indirect effects of most of the character strengths on school achievement through teacher-rated positive classroom behavior. The converging findings from the two samples support the notion that character strengths contribute to positive classroom behavior, which in turn enhances school achievement. Results are discussed in terms of their implications for future research and for school interventions based on character strengths.

  2. Generic programming for deterministic neutron transport codes

    International Nuclear Information System (INIS)

    Plagne, L.; Poncot, A.

    2005-01-01

    This paper discusses the implementation of neutron transport codes via generic programming techniques. Two different Boltzmann equation approximations have been implemented, namely the Sn and SPn methods. This implementation experiment shows that generic programming allows us to improve maintainability and readability of source codes with no performance penalties compared to classical approaches. In the present implementation, matrices and vectors as well as linear algebra algorithms are treated separately from the rest of source code and gathered in a tool library called 'Generic Linear Algebra Solver System' (GLASS). Such a code architecture, based on a linear algebra library, allows us to separate the three different scientific fields involved in transport codes design: numerical analysis, reactor physics and computer science. Our library handles matrices with optional storage policies and thus applies both to Sn code, where the matrix elements are computed on the fly, and to SPn code where stored matrices are used. Thus, using GLASS allows us to share a large fraction of source code between Sn and SPn implementations. Moreover, the GLASS high level of abstraction allows the writing of numerical algorithms in a form which is very close to their textbook descriptions. Hence the GLASS algorithms collection, disconnected from computer science considerations (e.g. storage policy), is very easy to read, to maintain and to extend. (authors)

  3. On the information content of discrete phylogenetic characters.

    Science.gov (United States)

    Bordewich, Magnus; Deutschmann, Ina Maria; Fischer, Mareike; Kasbohm, Elisa; Semple, Charles; Steel, Mike

    2017-12-16

    Phylogenetic inference aims to reconstruct the evolutionary relationships of different species based on genetic (or other) data. Discrete characters are a particular type of data, which contain information on how the species should be grouped together. However, it has long been known that some characters contain more information than others. For instance, a character that assigns the same state to each species groups all of them together and so provides no insight into the relationships of the species considered. At the other extreme, a character that assigns a different state to each species also conveys no phylogenetic signal. In this manuscript, we study a natural combinatorial measure of the information content of an individual character and analyse properties of characters that provide the maximum phylogenetic information, particularly, the number of states such a character uses and how the different states have to be distributed among the species or taxa of the phylogenetic tree.

  4. Coupling the MCNP Monte Carlo code and the FISPACT activation code with automatic visualization of the results of simulations

    International Nuclear Information System (INIS)

    Bourauel, Peter; Nabbi, Rahim; Biel, Wolfgang; Forrest, Robin

    2009-01-01

    The MCNP 3D Monte Carlo computer code is used not only for criticality calculations of nuclear systems but also to simulate the transport of radiation and particles. The findings so obtained about neutron flux distribution and the associated spectra allow information about materials activation, nuclear heating, and radiation damage to be obtained by means of activation codes such as FISPACT. The stochastic character of particle and radiation transport processes normally links the findings to the material cells making up the geometry model of MCNP. Where high spatial resolution is required for the activation calculations with FISPACT, fine segmentation of the MCNP geometry becomes compulsory, which implies considerable expense for the modeling process. For this reason, an alternative simulation technique has been developed in an effort to automate and optimize data transfer between MCNP and FISPACT. (orig.)

  5. Low-Complexity Compression Algorithm for Hyperspectral Images Based on Distributed Source Coding

    Directory of Open Access Journals (Sweden)

    Yongjian Nian

    2013-01-01

    Full Text Available A low-complexity compression algorithm for hyperspectral images based on distributed source coding (DSC) is proposed in this paper. The proposed distributed compression algorithm can realize both lossless and lossy compression, which is implemented by performing a scalar quantization strategy on the original hyperspectral images followed by distributed lossless compression. A multilinear regression model is introduced for distributed lossless compression in order to improve the quality of the side information. The optimal quantization step is determined according to the restriction of correct DSC decoding, which makes the proposed algorithm achieve near-lossless compression. Moreover, an effective rate-distortion algorithm is introduced for the proposed algorithm to achieve a low bit rate. Experimental results show that the compression performance of the proposed algorithm is competitive with that of the state-of-the-art compression algorithms for hyperspectral images.
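The core DSC idea behind such compressors, transmitting only a coset index (here the low-order bits) and letting the decoder resolve the ambiguity using correlated side information, can be sketched with a toy scalar version. This is not the paper's scheme (which uses multilinear-regression side information and rate-distortion optimization); the bit allocation and noise model below are our own illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.integers(0, 256, size=1000)      # one "band" of 8-bit pixel values
y = x + rng.integers(-3, 4, size=1000)   # correlated side information available
                                         # only at the decoder (e.g. a prediction
                                         # from neighbouring bands)

nbits = 4                                # encoder sends 4 bits/sample instead of 8
step = 1 << nbits
coset = x % step                         # encoder output: coset index only

# Decoder: within the transmitted coset, pick the value closest to the side info.
x_hat = coset + step * np.round((y - coset) / step).astype(int)
print(np.array_equal(x_hat, x))
```

Recovery is exact whenever the correlation noise satisfies |x - y| < step/2, which is precisely the "restriction of correct DSC decoding" that fixes the admissible quantization step.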

  6. Living Up to the Code's Exhortations? Social Workers' Political Knowledge Sources, Expectations, and Behaviors.

    Science.gov (United States)

    Felderhoff, Brandi Jean; Hoefer, Richard; Watson, Larry Dan

    2016-01-01

    The National Association of Social Workers' (NASW's) Code of Ethics urges social workers to engage in political action. However, little recent research has been conducted to examine whether social workers support this admonition and the extent to which they actually engage in politics. The authors gathered data from a survey of social workers in Austin, Texas, to address three questions. First, because keeping informed about government and political news is an important basis for action, the authors asked what sources of knowledge social workers use. Second, they asked what the respondents believe are appropriate political behaviors for other social workers and NASW. Third, they asked for self-reports regarding respondents' own political behaviors. Results indicate that social workers use the Internet and traditional media services to stay informed; expect other social workers and NASW to be active; and are, overall, more active than the general public in many types of political activities. The comparisons made between expectations for others and their own behaviors are interesting in their complex outcomes. Social workers should strive for higher levels of adherence to the code's urgings on political activity. Implications for future work are discussed.

  7. User's manual for BINIAC: A computer code to translate APET bins

    International Nuclear Information System (INIS)

    Gough, S.T.

    1994-03-01

    This report serves as the user's manual for the FORTRAN code BINIAC. BINIAC is a utility code designed to format the output from the Defense Waste Processing Facility (DWPF) Accident Progression Event Tree (APET) methodology. BINIAC inputs the accident progression bins from the APET methodology, converts the frequency from occurrences per hour to occurrences per year, sorts the progression bins, and converts the individual dimension character codes into facility attributes. Without the use of BINIAC, this process would be done manually at great time expense. BINIAC was written under the quality assurance control of IQ34 QAP IV-1, revision 0, section 4.1.4. Configuration control is established through the use of a proprietor and a cognizant users list
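
    The three transformations the report attributes to BINIAC (per-hour to per-year frequency conversion, sorting, and mapping dimension character codes to facility attributes) can be sketched as below. Everything here is an assumption for illustration: the attribute map, the bin codes, and the 8,760 h/yr factor are hypothetical, not taken from the report.

```python
# Illustrative sketch of BINIAC-style post-processing; names and values
# are hypothetical, not taken from the report.
HOURS_PER_YEAR = 8760  # assumes a 365-day year

ATTRIBUTE_MAP = {  # hypothetical dimension-character meanings
    "A": "early containment failure",
    "B": "late containment failure",
    "C": "no containment failure",
}

def translate_bin(freq_per_hour, bin_code):
    """Convert a bin frequency to per-year and expand its character codes."""
    freq_per_year = freq_per_hour * HOURS_PER_YEAR
    attributes = [ATTRIBUTE_MAP.get(ch, "unknown") for ch in bin_code]
    return freq_per_year, attributes

def sort_bins(bins):
    """Sort progression bins by descending yearly frequency."""
    return sorted((translate_bin(f, c) for f, c in bins), key=lambda t: -t[0])
```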

  8. Which "Character" Should Sport Develop?

    Science.gov (United States)

    Rudd, Andy

    2005-01-01

    For years, strong claims have been made that sport builds character. Despite such claims, a "winning at all cost" mentality can frequently be seen within all of sport. The reason for this paradox may relate to confusion around what it means to demonstrate character. The purpose of this article is to show that there are indeed two distinct types of…

  9. Whether and Where to Code in the Wireless Relay Channel

    DEFF Research Database (Denmark)

    Shi, Xiaomeng; Médard, Muriel; Roetter, Daniel Enrique Lucani

    2013-01-01

The throughput benefits of random linear network codes have been studied extensively for wirelined and wireless erasure networks. It is often assumed that all nodes within a network perform coding operations. In energy-constrained systems, however, coding subgraphs should be chosen to control the number of coding nodes while maintaining throughput. In this paper, we explore the strategic use of network coding in the wireless packet erasure relay channel according to both throughput and energy metrics. In the relay channel, a single source communicates to a single sink through the aid of a half-duplex relay. The fluid flow model is used to describe the case where both the source and the relay are coding, and Markov chain models are proposed to describe packet evolution if only the source or only the relay is coding. In addition to transmission energy, we take into account coding and reception...

  10. File compression and encryption based on LLS and arithmetic coding

    Science.gov (United States)

    Yu, Changzhi; Li, Hengjian; Wang, Xiyu

    2018-03-01

We propose a file compression model based on arithmetic coding. First, the original symbols to be encoded are input to the encoder one by one; we produce a set of chaotic sequences using the logistic and sine chaos system (LLS), and the values of these chaotic sequences randomly modify the upper and lower limits of the current symbol's probability interval. To achieve encryption, we modify the upper and lower limits of all character probabilities when encoding each symbol. Experimental results show that the proposed model achieves data encryption while attaining almost the same compression efficiency as plain arithmetic coding.
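
    The keystream idea rests on chaotic iteration. A minimal sketch of the logistic component is below; the abstract's LLS combines the logistic and sine maps, but the exact combination is not specified, so only the logistic map is shown and the parameter values are illustrative.

```python
def logistic_sequence(x0, r, n):
    """Iterate the logistic map x -> r*x*(1-x). For r near 4 the orbit is
    chaotic, so nearby keys (seeds) produce rapidly diverging keystreams,
    which is what scrambles the symbol-probability intervals."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

# Sensitivity to the key: two seeds differing by 1e-7 soon decorrelate.
a = logistic_sequence(0.31, 3.99, 60)
b = logistic_sequence(0.31 + 1e-7, 3.99, 60)
```

    A decoder holding the wrong seed reproduces the wrong interval perturbations, so the arithmetic decoding fails, which is the encryption property claimed.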

  11. The character of free topological groups II

    Directory of Open Access Journals (Sweden)

    Peter Nickolas

    2005-04-01

Full Text Available A systematic analysis is made of the character of the free and free abelian topological groups on metrizable spaces and compact spaces, and on certain other closely related spaces. In the first case, it is shown that the characters of the free and the free abelian topological groups on X are both equal to the “small cardinal” d if X is compact and metrizable, but also, more generally, if X is a non-discrete kω-space all of whose compact subsets are metrizable, or if X is a non-discrete Polish space. An example is given of a zero-dimensional separable metric space for which both characters are equal to the cardinal of the continuum. In the case of a compact space X, an explicit formula is derived for the character of the free topological group on X involving no cardinal invariant of X other than its weight; in particular the character is fully determined by the weight in the compact case. This paper is a sequel to a paper by the same authors in which the characters of the free groups were analysed under less restrictive topological assumptions.

  12. DIALOGUE AND CHARACTER CLASSIFICATION IN WOLE ...

    African Journals Online (AJOL)

    position, level of education, character and habits of a character are reflected in the speech .... Dictators are averse to any form of ... breaks out shouting slogans of praises of himself. ..... Task easier if I can get all the Obas settled before our.

  13. Aristotelian versus Virtue Ethical Character Education

    Science.gov (United States)

    Curren, Randall

    2016-01-01

    This article examines some central aspects of Kristján Kristjánsson's book, "Aristotelian Character Education," beginning with the claim that contemporary virtue ethics provides methodological, ontological, epistemological, and moral foundations for Aristotelian character education. It considers three different formulations of what…

  14. A Review of Virtual Character's Emotion Model

    Science.gov (United States)

    Liu, Zhen

    2008-11-01

Emotional virtual characters are essential to digital entertainment. An emotion is related both to the virtual environment and to a virtual character's inner variables, and the emotion model of a virtual character is a hot topic in many fields; domain knowledge is very important for modeling emotion. Current research on emotion expression worldwide is summarized, and some new research directions for emotion models are presented.

  15. Is handwriting constrained by phonology? Evidence from Stroop tasks with written responses and Chinese characters

    Directory of Open Access Journals (Sweden)

Markus Damian

    2013-10-01

Full Text Available To what extent is handwritten word production based on phonological codes? A few studies conducted in Western languages have recently provided evidence showing that phonology contributes to the retrieval of graphemic properties in written output tasks. Less is known about how orthographic production works in languages with non-alphabetic scripts such as written Chinese. We report a Stroop study in which Chinese participants wrote the colour of characters on a digital graphic tablet; characters were either neutral, homophonic to the target (congruent), or homophonic to an alternative (incongruent). Facilitation was found from congruent homophonic distractors, but only when the homophone shared the same tone with the target. This finding suggests a contribution of phonology to written word production. A second experiment served as a control experiment to exclude the possibility that the effect in Experiment 1 had an exclusively semantic locus. Overall, the findings offer new insight into the relative contribution of phonology to handwriting, particularly in non-Western languages.

  16. Binary Systematic Network Coding for Progressive Packet Decoding

    OpenAIRE

    Jones, Andrew L.; Chatzigeorgiou, Ioannis; Tassi, Andrea

    2015-01-01

We consider binary systematic network codes and investigate their capability of decoding a source message either in full or in part. We carry out a probability analysis, derive closed-form expressions for the decoding probability and show that systematic network coding outperforms conventional network coding. We also develop an algorithm based on Gaussian elimination that allows progressive decoding of source packets. Simulation results show that the proposed decoding algorithm can achieve ...
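
    The Gaussian-elimination decoder works over GF(2): each received packet carries a binary coefficient vector, and the decoder row-reduces these vectors, recovering a source packet whenever a row reduces to a unit vector. A minimal sketch of the rank computation, which counts how many innovative (linearly independent) combinations have arrived, is below; the representation is assumed for illustration, not taken from the paper.

```python
def gf2_eliminate(rows):
    """Row-reduce binary coefficient vectors (lists of 0/1) over GF(2),
    where addition is XOR. Returns the rank, i.e. the number of
    innovative packets received so far."""
    rows = [r[:] for r in rows]       # work on a copy
    rank = 0
    ncols = len(rows[0]) if rows else 0
    for col in range(ncols):
        # find a pivot row with a 1 in this column
        pivot = next((i for i in range(rank, len(rows)) if rows[i][col]), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        # XOR the pivot row into every other row with a 1 in this column
        for i in range(len(rows)):
            if i != rank and rows[i][col]:
                rows[i] = [a ^ b for a, b in zip(rows[i], rows[rank])]
        rank += 1
    return rank
```

    In a systematic code the first transmissions are the source packets themselves (unit coefficient vectors), so early receptions decode immediately, which is what enables the progressive decoding the abstract describes.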

  17. The influence of media characters on children's food choices.

    Science.gov (United States)

    Kotler, Jennifer A; Schiffman, Jennifer M; Hanson, Katherine G

    2012-01-01

    Two experiments were conducted to assess the role of media characters in influencing children's food choices; the first focused on children's self-reported preference, whereas the second focused on actual choice. The results of the experiments suggest that popular characters can make a difference in encouraging children to select one food over another. In the first experiment, children were more likely to indicate a preference for one food over another when one was associated with characters that they liked and with whom they were familiar. This effect was particularly strong when a sugary or salty snack branded by a favored character was competing with a healthier option branded by an unknown character or no character. Alternatively, when children were asked to choose between a healthy food and a sugary or salty snack, branding of the healthy food with a favored character did not significantly change appeal of that healthy snack. However, when foods within the same category (i.e., 2 vegetables, 2 fruits, or 2 grains) were asked to compete against each other, character branding strongly influenced children's food choice. Findings from the second experiment suggest that children are more willing to try more pieces of a healthy food if a favored character, in comparison with an unknown character, is promoting that food.

  18. The evolution of the mitochondrial genetic code in arthropods revisited.

    Science.gov (United States)

    Abascal, Federico; Posada, David; Zardoya, Rafael

    2012-04-01

    A variant of the invertebrate mitochondrial genetic code was previously identified in arthropods (Abascal et al. 2006a, PLoS Biol 4:e127) in which, instead of translating the AGG codon as serine, as in other invertebrates, some arthropods translate AGG as lysine. Here, we revisit the evolution of the genetic code in arthropods taking into account that (1) the number of arthropod mitochondrial genomes sequenced has triplicated since the original findings were published; (2) the phylogeny of arthropods has been recently resolved with confidence for many groups; and (3) sophisticated probabilistic methods can be applied to analyze the evolution of the genetic code in arthropod mitochondria. According to our analyses, evolutionary shifts in the genetic code have been more common than previously inferred, with many taxonomic groups displaying two alternative codes. Ancestral character-state reconstruction using probabilistic methods confirmed that the arthropod ancestor most likely translated AGG as lysine. Point mutations at tRNA-Lys and tRNA-Ser correlated with the meaning of the AGG codon. In addition, we identified three variables (GC content, number of AGG codons, and taxonomic information) that best explain the use of each of the two alternative genetic codes.

  19. Bit-wise arithmetic coding for data compression

    Science.gov (United States)

    Kiely, A. B.

    1994-01-01

    This article examines the problem of compressing a uniformly quantized independent and identically distributed (IID) source. We present a new compression technique, bit-wise arithmetic coding, that assigns fixed-length codewords to the quantizer output and uses arithmetic coding to compress the codewords, treating the codeword bits as independent. We examine the performance of this method and evaluate the overhead required when used block-adaptively. Simulation results are presented for Gaussian and Laplacian sources. This new technique could be used as the entropy coder in a transform or subband coding system.
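
    Treating the codeword bits as independent means the ideal rate is the sum of the per-bit-position binary entropies. A small sketch of that rate computation follows; the function name is an assumption for illustration, not from the article.

```python
import math

def bitwise_rate(codewords, width):
    """Ideal rate (bits/symbol) when each bit position of fixed-length
    codewords is entropy-coded independently, as in bit-wise arithmetic
    coding: the sum over positions of the binary entropy of that bit."""
    n = len(codewords)
    rate = 0.0
    for b in range(width):
        ones = sum((c >> b) & 1 for c in codewords)
        for k in (ones, n - ones):
            if k:                      # skip zero-probability terms
                p = k / n
                rate -= p * math.log2(p)
    return rate
```

    Any statistical dependence between bit positions makes this rate exceed the true joint entropy; that gap is the price paid for the simplicity of coding the bits independently.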

  20. The Proximate Unit in Chinese Handwritten Character Production

    Directory of Open Access Journals (Sweden)

Jenn-Yeu Chen

    2013-08-01

Full Text Available In spoken word production, a proximate unit is the first phonological unit at the sublexical level that is selectable for production (O’Seaghdha, Chen, & Chen, 2010). The present study investigated whether the proximate unit in Chinese handwritten word production is the stroke, the radical, or something in between. A written version of the form preparation task was adopted. Chinese participants learned sets of two-character words, later were cued with the first character of each word, and had to write down the second character (the target). Response times were measured from the onset of a cue character to the onset of a written response. In Experiment 1, the target characters within a block shared (homogeneous) or did not share (heterogeneous) the first stroke. In Experiment 2, the first two strokes were shared in the homogeneous blocks. Response times in the homogeneous blocks and in the heterogeneous blocks were comparable in both experiments (Exp. 1: 687 ms vs. 684 ms; Exp. 2: 717 vs. 716). In Experiments 3 and 4, the target characters within a block shared or did not share the first radical. Response times in the homogeneous blocks were significantly faster than those in the heterogeneous blocks (Exp. 3: 685 vs. 704; Exp. 4: 594 vs. 650). In Experiments 5 and 6, the shared component was a Gestalt-like form that is more than a stroke, constitutes a portion of the target character, can be a stand-alone character itself, and can be a radical of another character but is not a radical of the target character (e.g., 士 in 聲, 鼓, 穀, 款); this unit is called a logographeme. Response times in the homogeneous blocks were significantly faster than those in the heterogeneous blocks (Exp. 5: 576 vs. 625; Exp. 6: 586 vs. 620). These results suggest a model of Chinese handwritten character production in which the stroke is not a functional unit, the radical plays the role of a morpheme, and the logographeme is the proximate unit.

  1. Dynamic Obstacle Clearing for Real-time Character Animation

    OpenAIRE

    Glardon, Pascal; Boulic, Ronan; Thalmann, Daniel

    2006-01-01

    This paper proposes a novel method to control virtual characters in dynamic environments. A virtual character is animated by a locomotion and jumping engine, enabling production of continuous parameterized motions. At any time during runtime, flat obstacles (e.g. a puddle of water) can be created and placed in front of a character. The method first decides whether the character is able to get around or jump over the obstacle. Then the motion parameters are accordingly modified. The transition...

  2. Character feature integration of Chinese calligraphy and font

    Science.gov (United States)

    Shi, Cao; Xiao, Jianguo; Jia, Wenhua; Xu, Canhui

    2013-01-01

A framework is proposed in this paper to generate a new hybrid character type by integrating the local contour features of Chinese calligraphy with the structural features of a computer font. To capture the traditional artistic manifestation of calligraphy, a multi-directional spatial filter is applied for local contour feature extraction. The contour of the character image is then divided into sub-images, and the sub-images in the identical position across various characters are modeled by a Gaussian distribution. According to this probability distribution, dilation and erosion operators are designed to adjust the boundary of the font image. New Chinese character images are then generated that possess both the contour features of artistic calligraphy and the elaborate structural features of the font. Experimental results demonstrate that the new characters are visually acceptable, and that the proposed framework is an effective and efficient strategy for automatically generating a hybrid character type from calligraphy and font.

  3. Character education in perspective of chemistry pre-service teacher

    Science.gov (United States)

    Merdekawati, Krisna

    2017-12-01

As one of its pre-service teacher education programs, the Chemistry Education Department of the Islamic University of Indonesia (UII) is committed to providing quality education: education that produces competent chemistry pre-service teachers of good character. The focus of the research is to describe the perceptions of students, as prospective chemistry teachers, of character education and of its achievement. The research instruments include questionnaires and observation sheets. The data show that students understand the importance of character education and are committed to organizing character education later in schools, and that they understand the ways in which character education can be carried out. The students stated that the Chemistry Education Department has tried to equip them with character education. The observations show that students generally have the character of a pre-service teacher.

  4. Coding For Compression Of Low-Entropy Data

    Science.gov (United States)

    Yeh, Pen-Shu

    1994-01-01

    Improved method of encoding digital data provides for efficient lossless compression of partially or even mostly redundant data from low-information-content source. Method of coding implemented in relatively simple, high-speed arithmetic and logic circuits. Also increases coding efficiency beyond that of established Huffman coding method in that average number of bits per code symbol can be less than 1, which is the lower bound for Huffman code.
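
    The 1-bit-per-symbol floor of Huffman coding, and how blocking symbols gets below it, can be verified with a short computation. This sketch is a generic illustration of the bound, not the method of the abstract; the average Huffman length is computed via the standard identity that it equals the sum of the merged node probabilities.

```python
import heapq
import math

def huffman_avg_length(probs):
    """Average codeword length of a Huffman code for the distribution:
    the sum of the probabilities of every internal merge node."""
    heap = list(probs)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        a, b = heapq.heappop(heap), heapq.heappop(heap)
        total += a + b
        heapq.heappush(heap, a + b)
    return total

def binary_entropy(p):
    """Entropy (bits/symbol) of a Bernoulli(p) source."""
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

p = 0.05  # a low-entropy binary source: H(p) is about 0.29 bits/symbol
single = huffman_avg_length([p, 1 - p])            # stuck at 1 bit/symbol
pairs = [p * p, p * (1 - p), (1 - p) * p, (1 - p) ** 2]
blocked = huffman_avg_length(pairs) / 2            # below 1 bit/symbol
```

    Coding symbols one at a time yields 1.0 bit/symbol regardless of how skewed the source is, while coding pairs already drops the rate well below 1; arithmetic-style methods reach sub-1-bit rates without explicit blocking.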

  5. Development of authentication code for multi-access optical code division multiplexing based quantum key distribution

    Science.gov (United States)

    Taiwo, Ambali; Alnassar, Ghusoon; Bakar, M. H. Abu; Khir, M. F. Abdul; Mahdi, Mohd Adzir; Mokhtar, M.

    2018-05-01

A one-weight authentication code for multi-user quantum key distribution (QKD) is proposed. The code is developed for an Optical Code Division Multiplexing (OCDMA) based QKD network. A unique address assigned to each user, coupled with the degrading probability of predicting the source of a qubit transmitted in the channel, offers an excellent security mechanism against any form of channel attack on an OCDMA-based QKD network. Flexibility in design and ease of modifying the number of users are equally exceptional qualities of the code, in contrast to the Optical Orthogonal Code (OOC) earlier implemented for the same purpose. The code was successfully applied to eight simultaneous users at an effective key rate of 32 bps over a 27 km transmission distance.

  6. Handwritten Javanese Character Recognition Using Several Artificial Neural Network Methods

    Directory of Open Access Journals (Sweden)

    Gregorius Satia Budhi

    2015-07-01

    Full Text Available Javanese characters are traditional characters that are used to write the Javanese language. The Javanese language is a language used by many people on the island of Java, Indonesia. The use of Javanese characters is diminishing more and more because of the difficulty of studying the Javanese characters themselves. The Javanese character set consists of basic characters, numbers, complementary characters, and so on. In this research we have developed a system to recognize Javanese characters. Input for the system is a digital image containing several handwritten Javanese characters. Preprocessing and segmentation are performed on the input image to get each character. For each character, feature extraction is done using the ICZ-ZCZ method. The output from feature extraction will become input for an artificial neural network. We used several artificial neural networks, namely a bidirectional associative memory network, a counterpropagation network, an evolutionary network, a backpropagation network, and a backpropagation network combined with chi2. From the experimental results it can be seen that the combination of chi2 and backpropagation achieved better recognition accuracy than the other methods.
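
    The abstract's ICZ-ZCZ feature extraction can be sketched in one common formulation (the exact variant used in the paper may differ): divide the binary character image into zones and record, per zone, the mean distance of foreground pixels to the image centroid (ICZ) and to that zone's own centroid (ZCZ).

```python
import math

def icz_zcz_features(img, zones=3):
    """One common formulation of ICZ-ZCZ zoning features for a binary
    character image (list of 0/1 rows): per zone, the mean distance of
    foreground pixels to the image centroid (ICZ) and to the zone
    centroid (ZCZ). Empty zones contribute (0, 0)."""
    pts = [(y, x) for y, row in enumerate(img) for x, v in enumerate(row) if v]
    icy = sum(p[0] for p in pts) / len(pts)   # image centroid
    icx = sum(p[1] for p in pts) / len(pts)
    h, w = len(img), len(img[0])
    zh, zw = h // zones, w // zones
    feats = []
    for zy in range(zones):
        for zx in range(zones):
            zone_pts = [(y, x) for y, x in pts
                        if zy * zh <= y < (zy + 1) * zh
                        and zx * zw <= x < (zx + 1) * zw]
            if not zone_pts:
                feats += [0.0, 0.0]
                continue
            icz = sum(math.hypot(y - icy, x - icx) for y, x in zone_pts) / len(zone_pts)
            zcy = sum(y for y, _ in zone_pts) / len(zone_pts)  # zone centroid
            zcx = sum(x for _, x in zone_pts) / len(zone_pts)
            zcz = sum(math.hypot(y - zcy, x - zcx) for y, x in zone_pts) / len(zone_pts)
            feats += [icz, zcz]
    return feats
```

    The resulting fixed-length feature vector (2 values per zone) is what would be fed to the neural networks compared in the paper.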

  7. Alterations in subspecific characters of groundnut

    International Nuclear Information System (INIS)

    Mouli, C.; Patil, S.H.; Kale, D.M.

    1983-01-01

Recombination of the beneficial characters found in cultivars of groundnut (Arachis hypogaea L.) belonging to the two subspecies hypogaea and fastigiata has had little success in conventional breeding programmes. The cultures of ssp. hypogaea have desirable characters for crop improvement, viz. various growth habits, profuse branching, large pods, seed dormancy, and stress tolerance. Sequential flowering, early maturity, compact fruiting habit, and high kernel outturn are the useful characters present in ssp. fastigiata cultures. Mutation research on a popular variety, Spanish Improved, belonging to ssp. fastigiata, led to the selection of various mutants. One of the mutants had a large pod, a characteristic of ssp. hypogaea. Hybridization among the mutants and improved cultivars, as well as radiation treatment of selected cultures, resulted in the isolation of cultures showing not only combinations and alterations of characters of both subspecies, but also modifications. These cultures are classified into major groups, and their significance for groundnut improvement is discussed. (author)

  8. Character & Cane

    Science.gov (United States)

    Sartorius, Tara Cady

    2009-01-01

    They say first impressions can be deceiving. The difficulty of getting to know someone increases when that person is mostly fictional. Whatever the author writes is all readers can know. Whatever they read about the character is all they have to go on. Now take it another step back, and imagine a portrait drawing, painting or print of that…

  9. PEAR code review

    International Nuclear Information System (INIS)

    De Wit, R.; Jamieson, T.; Lord, M.; Lafortune, J.F.

    1997-07-01

    As a necessary component in the continuous improvement and refinement of methodologies employed in the nuclear industry, regulatory agencies need to periodically evaluate these processes to improve confidence in results and ensure appropriate levels of safety are being achieved. The independent and objective review of industry-standard computer codes forms an essential part of this program. To this end, this work undertakes an in-depth review of the computer code PEAR (Public Exposures from Accidental Releases), developed by Atomic Energy of Canada Limited (AECL) to assess accidental releases from CANDU reactors. PEAR is based largely on the models contained in the Canadian Standards Association (CSA) N288.2-M91. This report presents the results of a detailed technical review of the PEAR code to identify any variations from the CSA standard and other supporting documentation, verify the source code, assess the quality of numerical models and results, and identify general strengths and weaknesses of the code. The version of the code employed in this review is the one which AECL intends to use for CANDU 9 safety analyses. (author)

  10. The source rock characters of U-rich granite

    Energy Technology Data Exchange (ETDEWEB)

    Mingyue, Feng; Debao, He [CNNC Key Laboratory of Uranium Resources Exploration and Evaluation Technology, Beijing Research Institute of Uranium Geology (China)

    2012-03-15

    This paper discusses the stratum composition, lithological association, uranium content of crust and the activation, migration, concentration of uranium at each tectonic cycle in South China. The authors point out that the source rock of U-rich granite is U-rich continental crust which is rich in Si, Al and K. The lithological association is mainly composed of terrestrial clastic rocks formation of mudstone and sandstone, mingled with intermediate-acidic, mafic pyroclastic rocks and carbonate rocks formation. During tectonic movements, the rocks had undergone regional metamorphism, migmatitization, granitization, and formed U-rich granites finally. (authors)

  11. The source rock characters of U-rich granite

    International Nuclear Information System (INIS)

    Feng Mingyue; He Debao

    2012-01-01

    This paper discusses the stratum composition, lithological association, uranium content of crust and the activation, migration, concentration of uranium at each tectonic cycle in South China. The authors point out that the source rock of U-rich granite is U-rich continental crust which is rich in Si, Al and K. The lithological association is mainly composed of terrestrial clastic rocks formation of mudstone and sandstone, mingled with intermediate-acidic, mafic pyroclastic rocks and carbonate rocks formation. During tectonic movements, the rocks had undergone regional metamorphism, migmatitization, granitization, and formed U-rich granites finally. (authors)

  12. The Art of Readable Code

    CERN Document Server

    Boswell, Dustin

    2011-01-01

    As programmers, we've all seen source code that's so ugly and buggy it makes our brain ache. Over the past five years, authors Dustin Boswell and Trevor Foucher have analyzed hundreds of examples of "bad code" (much of it their own) to determine why they're bad and how they could be improved. Their conclusion? You need to write code that minimizes the time it would take someone else to understand it-even if that someone else is you. This book focuses on basic principles and practical techniques you can apply every time you write code. Using easy-to-digest code examples from different languag

  13. Identification Of Minangkabau Landscape Characters

    Science.gov (United States)

    Asrina, M.; Gunawan, A.; Aris, Munandar

    2017-10-01

Minangkabau is one of the cultures in Indonesia that occupies an intact landscape. The Minangkabau landscape has a very close relationship with the culture of its people: the uniqueness of Minangkabau culture and landscape forms an inseparable unity of character. The landscape must therefore be identified in order to know its inherent characters. The objective of this study was to identify the characters of the Minangkabau landscape that constitute its uniqueness. The study used a descriptive method comprising a literature review and field observation. The observed landscape characters comprised two main feature types, major and minor. Identification of the features was conducted in two original areas (darek) of Minangkabau traditional society. The results showed that the major (natural) features of the landscape were predominantly landform, land cover, and hydrology. All luhak (districts) of Minangkabau showed similar main features, such as hills, canyons, lakes, valleys, and forest. The existence of natural features such as hills, canyons, and valleys characterizes the Minangkabau landscape. Minor features formed by Minangkabau society were agricultural land and settlement. The rumah gadang (big house) is one of the famous minor features characterizing Minangkabau culture. In addition, several historical artifacts of buildings and other structures, such as the royal palace, inscriptions, and tunnels, may strengthen the uniqueness of the Minangkabau landscape character.

  14. Office of Codes and Standards resource book. Section 1, Building energy codes and standards

    Energy Technology Data Exchange (ETDEWEB)

    Hattrup, M.P.

    1995-01-01

The US Department of Energy's (DOE's) Office of Codes and Standards has developed this Resource Book to provide: a discussion of DOE involvement in building codes and standards; a current and accurate set of descriptions of residential, commercial, and Federal building codes and standards; information on State contacts, State code status, State building construction unit volume, and State needs; and a list of stakeholders in the building energy codes and standards arena. The Resource Book is considered an evolving document and will be updated occasionally. Users are requested to submit additional data (e.g., more current, widely accepted, and/or documented data) and suggested changes to the address listed below. Please provide sources for all data provided.

  15. CHARACTER EDUCATION IN ISLAMIC BOARDING SCHOOL- BASED SMA AMANAH

    Directory of Open Access Journals (Sweden)

    Nana Herdiana Abdurrahman

    2016-06-01

Full Text Available This paper aims to describe the findings of a study in the pesantren-based SMA Amanah covering: (1) the principal's policy in developing character education, (2) the methods used in developing character education, (3) students' characteristics as the result of the character education process, and (4) the problems encountered in the implementation of character education and the efforts made to address them at the school. This study applied a qualitative method using a descriptive technique. The data were collected through observation, interviews, and documentation. The findings showed that: (1) the principal's policy in developing character education was carried out by implementing government policies in line with the school's vision, mission, and programs; (2) the methods used in the process of character education were role-modelling, assignments and nurturing, habituation, training programs, and students' participation in various types of activities, as well as the application of rewards and punishments; (3) the students' characteristics resulting from the character education process were piousness and devotion, as well as the ability to apply their knowledge and piety in everyday life; and (4) the problems encountered in the implementation of character education were the different values and norms students brought from home, the imbalance of facilities relative to the number of students, and teachers' readiness to actually implement the new program of character education. Meanwhile, the efforts made to overcome those problems were developing personal, family, and neighborhood or community character, and securing the commitment of all related parties/stakeholders of SMA Amanah.

  16. MARE2DEM: a 2-D inversion code for controlled-source electromagnetic and magnetotelluric data

    Science.gov (United States)

    Key, Kerry

    2016-10-01

    This work presents MARE2DEM, a freely available code for 2-D anisotropic inversion of magnetotelluric (MT) data and frequency-domain controlled-source electromagnetic (CSEM) data from onshore and offshore surveys. MARE2DEM parametrizes the inverse model using a grid of arbitrarily shaped polygons, where unstructured triangular or quadrilateral grids are typically used due to their ease of construction. Unstructured grids provide significantly more geometric flexibility and parameter efficiency than the structured rectangular grids commonly used by most other inversion codes. Transmitter and receiver components located on topographic slopes can be tilted parallel to the boundary so that the simulated electromagnetic fields accurately reproduce the real survey geometry. The forward solution is implemented with a goal-oriented adaptive finite-element method that automatically generates and refines unstructured triangular element grids that conform to the inversion parameter grid, ensuring accurate responses as the model conductivity changes. This dual-grid approach is significantly more efficient than the conventional use of a single grid for both the forward and inverse meshes since the more detailed finite-element meshes required for accurate responses do not increase the memory requirements of the inverse problem. Forward solutions are computed in parallel with a highly efficient scaling by partitioning the data into smaller independent modeling tasks consisting of subsets of the input frequencies, transmitters and receivers. Non-linear inversion is carried out with a new Occam inversion approach that requires fewer forward calls. Dense matrix operations are optimized for memory and parallel scalability using the ScaLAPACK parallel library. Free parameters can be bounded using a new non-linear transformation that leaves the transformed parameters nearly the same as the original parameters within the bounds, thereby reducing non-linear smoothing effects. Data

  17. Character and Characterization in “Palms and Men” novel

    Directory of Open Access Journals (Sweden)

    Fariedeh Khajehpour

    2013-11-01

    Full Text Available Abstract Nowadays one of the most significant elements in story writing is character and characterization. A character in a narrative or a play is a person whose mental and moral qualities are reflected in his deeds and in what he says and does. Creating characters in a story or a novel that seem like real people to the reader is called characterization (Mir Sadeghi, 1382: p. 85). In this article, character and characterization in “Palms and Men”, one of the novels about the eight-year war between Iran and Iraq, are investigated. Character types, methods of characterization, character appearance, prototype, the relation of character to other elements, and characterization defects are some of the subjects studied in this article. Character types: the characters are generally categorized into two groups, stereotype and type. Samir, Touraj, Reza, Hamid, etc. all belong to the Sepah forces category and are type characters. Some women characters such as Zeinab, Samir’s mother, Kolsoum, Hanieh’s mother, etc. are stereotype characters. Methods of characterization in “Palms and Men”: the description of characters is done in two ways by Mr. Nematollah Soleimani: (a) Direct characterization: if not all the characters, most of them are certainly characterized in this way. For example, in expressing the characteristics of Hamid, the commander of the mission, he writes, “Hamid didn’t bat an eyelid in any incident. Hamid was lion-hearted” (Soleimani, 1380: 128). In describing uncle Heidar, he writes, “wisdom and insight, and the effects of years of suffering, difficulty and experience could be clearly seen in his limpid eyes. The rural man was indeed a wise and experienced man” (ibid.: 264-265). (b) Indirect characterization: although in novels and long stories the direct method of characterization is often used, in “Palms and Men” this kind of characterization also appears in different conversations

  19. Code portability and data management considerations in the SAS3D LMFBR accident-analysis code

    International Nuclear Information System (INIS)

    Dunn, F.E.

    1981-01-01

    The SAS3D code was produced from a predecessor in order to reduce or eliminate interrelated problems in the areas of code portability, the large size of the code, inflexibility in the use of memory and the size of cases that can be run, code maintenance, and running speed. Many conventional solutions, such as variable dimensioning, disk storage, virtual memory, and existing code-maintenance utilities were not feasible or did not help in this case. A new data management scheme was developed, coding standards and procedures were adopted, special machine-dependent routines were written, and a portable source code processing code was written. The resulting code is quite portable, quite flexible in the use of memory and the size of cases that can be run, much easier to maintain, and faster running. SAS3D is still a large, long running code that only runs well if sufficient main memory is available

  20. WWC Evidence Review Protocol for Character Education Interventions

    Science.gov (United States)

    What Works Clearinghouse, 2014

    2014-01-01

    Character education is an inclusive concept regarding all aspects of how families, schools, and related social institutions support the positive character development of children and adults. "Character" in this context refers to the moral and ethical qualities of persons as well as the demonstration of those qualities in their emotional…

  1. Drawing and Recognizing Chinese Characters with Recurrent Neural Network.

    Science.gov (United States)

    Zhang, Xu-Yao; Yin, Fei; Zhang, Yan-Ming; Liu, Cheng-Lin; Bengio, Yoshua

    2018-04-01

    Recent deep learning based approaches have achieved great success on handwriting recognition. The Chinese writing system is among the most widely adopted in the world. Previous research has mainly focused on recognizing handwritten Chinese characters. However, recognition is only one aspect of understanding a language; another challenging and interesting task is to teach a machine to automatically write (pictographic) Chinese characters. In this paper, we propose a framework that uses the recurrent neural network (RNN) both as a discriminative model for recognizing Chinese characters and as a generative model for drawing (generating) Chinese characters. To recognize Chinese characters, previous methods usually adopt convolutional neural network (CNN) models, which require transforming the online handwriting trajectory into image-like representations. Instead, our RNN-based approach is an end-to-end system that directly deals with the sequential structure and does not require any domain-specific knowledge. With the RNN system (combining an LSTM and GRU), state-of-the-art performance can be achieved on the ICDAR-2013 competition database. Furthermore, under the RNN framework, a conditional generative model with character embedding is proposed for automatically drawing recognizable Chinese characters. The generated characters (in vector format) are human-readable and can also be recognized by the discriminative RNN model with high accuracy. Experimental results verify the effectiveness of using RNNs as both generative and discriminative models for the tasks of drawing and recognizing Chinese characters.
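As a sketch of the discriminative side, the forward pass of a GRU over an online pen trajectory can be written in a few lines of NumPy. The layer sizes, the (dx, dy, pen-state) input encoding, and the random weights below are illustrative assumptions; the paper's trained models are of course far larger and learned from data.

```python
import numpy as np

rng = np.random.default_rng(0)
H, X, C = 16, 3, 5          # hidden size, input size (dx, dy, pen), classes

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

# randomly initialised parameters; a real recogniser would train these
W = {g: rng.normal(0, 0.1, (H, X)) for g in 'zrh'}
U = {g: rng.normal(0, 0.1, (H, H)) for g in 'zrh'}
b = {g: np.zeros(H) for g in 'zrh'}
W_out = rng.normal(0, 0.1, (C, H))

def gru_step(x, h):
    z = sigmoid(W['z'] @ x + U['z'] @ h + b['z'])          # update gate
    r = sigmoid(W['r'] @ x + U['r'] @ h + b['r'])          # reset gate
    h_cand = np.tanh(W['h'] @ x + U['h'] @ (r * h) + b['h'])
    return (1 - z) * h + z * h_cand

def classify(trajectory):
    """trajectory: sequence of (dx, dy, pen_down) pen offsets."""
    h = np.zeros(H)
    for point in trajectory:
        h = gru_step(np.asarray(point, float), h)
    logits = W_out @ h
    p = np.exp(logits - logits.max())                      # stable softmax
    return p / p.sum()

probs = classify([(1.0, 0.0, 1), (0.5, -0.5, 1), (0.0, 0.0, 0)])
```

The point is that the raw trajectory is consumed directly, with no rendering to an image as a CNN pipeline would require.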

  2. SKEMA - A computer code to estimate atmospheric dispersion

    International Nuclear Information System (INIS)

    Sacramento, A.M. do.

    1985-01-01

    This computer code is a modified version of the DWNWND code developed at Oak Ridge National Laboratory. The SKEMA code estimates the concentration in air of a material released to the atmosphere from a point source. (C.M.) [pt
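Codes of the DWNWND family are built on the Gaussian plume solution for a continuous point release. A minimal sketch is below; the linear growth coefficients for the dispersion parameters are placeholders, not a real stability-class parametrization such as SKEMA or DWNWND would use.

```python
import math

def plume_concentration(Q, u, x, y, z, H, a=0.08, b=0.06):
    """Gaussian plume concentration for a continuous point source.

    Q: emission rate (g/s), u: wind speed (m/s), H: effective release
    height (m); x downwind, y crosswind, z above ground (m).
    sigma_y and sigma_z grow with downwind distance x (linear here
    for simplicity).  The second vertical term is the image source
    accounting for reflection at the ground.
    """
    sy, sz = a * x, b * x
    lateral = math.exp(-y**2 / (2 * sy**2))
    vertical = (math.exp(-(z - H)**2 / (2 * sz**2)) +
                math.exp(-(z + H)**2 / (2 * sz**2)))
    return Q / (2 * math.pi * u * sy * sz) * lateral * vertical

# ground-level concentration 1 km downwind of a 50 m release
c_ground = plume_concentration(Q=1.0, u=5.0, x=1000.0, y=0.0, z=0.0, H=50.0)
```

As expected, the concentration falls off moving crosswind away from the plume centerline.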

  3. A usability evaluation of an interactive application for halal products using optical character recognition and augmented reality technologies

    Science.gov (United States)

    Lam, Meng Chun; Nizam, Siti Soleha Muhammad; Arshad, Haslina; A'isyah Ahmad Shukri, Saidatul; Hashim, Nurhazarifah Che; Putra, Haekal Mozzia; Abidin, Rimaniza Zainal

    2017-10-01

    This article discusses the usability of an interactive application for halal products using Optical Character Recognition (OCR) and Augmented Reality (AR) technologies. Among the problems identified in this study is that consumers have little knowledge about E-Codes; therefore, users often have doubts about the halal status of a product. Nowadays, the integrity of halal status can be doubtful due to the actions of some irresponsible people spreading false information about a product. The application developed in this study, which uses OCR and AR technology, helps users identify the information content of a product by scanning the E-Code label and learn the halal status of the product by scanning the product's brand. In this application, the E-Code on a product's label is scanned using OCR technology to display information about the E-Code, and the product's brand is scanned using AR technology to display the halal status of the product. The findings reveal that users are satisfied with this application and that it is useful and easy to use.

  4. The Feasibility of Multidimensional CFD Applied to Calandria System in the Moderator of CANDU-6 PHWR Using Commercial and Open-Source Codes

    Directory of Open Access Journals (Sweden)

    Hyoung Tae Kim

    2016-01-01

    Full Text Available The moderator system of CANDU, a prototype of PHWR (pressurized heavy-water reactor), has been modeled in multidimension for computation based on the CFD (computational fluid dynamics) technique. Three CFD codes are tested in modeled hydrothermal systems of heavy-water reactors. The commercial codes COMSOL Multiphysics and ANSYS-CFX, along with OpenFOAM, an open-source code, are introduced for various simplified and practical problems. All the implemented computational codes are tested on a benchmark problem, the STERN laboratory experiment, with precise modeling of the tubes, and compared with each other as well as with the measured data and with a porous model based on the experimental correlation for pressure drop. The effect of the turbulence model is also discussed for these low-Reynolds-number flows. As a result, the codes are shown to be successful for the analysis of three-dimensional numerical models related to the calandria system of CANDU reactors.

  5. First Course in Japanese: Character Workbook.

    Science.gov (United States)

    Niwa, Tamako

    This character workbook is an introduction to Japanese writing designed to be used in conjunction with Parts One and Two of this introductory course in Japanese. All the "hiragana", several "katakana", and 88 Japanese characters are introduced in this text. The workbook, consisting of 30 lessons, is divided into three parts.…

  6. Do Rural Schools Need Character Education?

    Science.gov (United States)

    Reynolds, Janice Carner

    Studies suggest that the challenge of violence in public schools can be met through character education, whether by providing a school culture in which core values are practiced or some form of moral training (indoctrination). To assess the need for character education in rural schools, small-school superintendents and board members in central…

  7. Moral character predominates in person perception and evaluation.

    Science.gov (United States)

    Goodwin, Geoffrey P; Piazza, Jared; Rozin, Paul

    2014-01-01

    What sorts of trait information do people most care about when forming impressions of others? Recent research in social cognition suggests that "warmth," broadly construed, should be of prime importance in impression formation. Yet, some prior research suggests that information about others' specifically moral traits--their moral "character"--may be a primary dimension. Although warmth and character have sometimes been conceived of as interchangeable, we argue that they are separable, and that across a wide variety of contexts, character is usually more important than warmth in impression formation. We first showed that moral character and social warmth traits are indeed separable (Studies 1 and 2). Further studies that used correlational and experimental methods showed that, as predicted, in most contexts, moral character information is more important in impression formation than is warmth information (Studies 2-6). Character information was also more important than warmth information with respect to judgments of traits' perceived fundamentalness to identity, their uniquely human quality, their context-independence, and their controllability (Study 2). Finally, Study 7 used an archival method to show that moral character information appears more prominently than warmth information in obituaries, and more strongly determines the impressions people form of the individuals described in those obituaries. We discuss implications for current theories of person perception and social cognition.

  8. Lattice Index Coding

    OpenAIRE

    Natarajan, Lakshmi; Hong, Yi; Viterbo, Emanuele

    2014-01-01

    The index coding problem involves a sender with K messages to be transmitted across a broadcast channel, and a set of receivers each of which demands a subset of the K messages while having prior knowledge of a different subset as side information. We consider the specific case of noisy index coding where the broadcast channel is Gaussian and every receiver demands all the messages from the source. Instances of this communication problem arise in wireless relay networks, sensor networks, and ...
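The classic noiseless binary instance of this problem shows why side information saves transmissions: with K = 2 messages and two receivers each already knowing a different one, a single XOR broadcast serves both. (In the lattice setting of the paper, the XOR is replaced by structured lattice codes so that the same gain survives Gaussian noise.)

```python
def encode(m1, m2):
    """Sender broadcasts one coded symbol instead of two messages."""
    return m1 ^ m2

def decode(coded, side_info):
    """Each receiver XORs out the message it already knows."""
    return coded ^ side_info

m1, m2 = 0b1011, 0b0110
c = encode(m1, m2)
recovered_by_rx1 = decode(c, m2)   # receiver 1 knows m2, wants m1
recovered_by_rx2 = decode(c, m1)   # receiver 2 knows m1, wants m2
```

One broadcast symbol thus replaces two uncoded transmissions whenever the receivers' side information is complementary.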

  9. Upgrades to the WIMS-ANL code

    International Nuclear Information System (INIS)

    Woodruff, W. L.

    1998-01-01

    The dusty old source code in WIMS-D4M has been completely rewritten to conform more closely with current FORTRAN coding practices. The revised code contains many improvements in appearance, error checking and in control of the output. The output is now tabulated to fit the typical 80 column window or terminal screen. The Segev method for resonance integral interpolation is now an option. Most of the dimension limitations have been removed and replaced with variable dimensions within a compile-time fixed container. The library is no longer restricted to the 69 energy group structure, and two new libraries have been generated for use with the code. The new libraries are both based on ENDF/B-VI data with one having the original 69 energy group structure and the second with a 172 group structure. The common source code can be used with PCs using both Windows 95 and NT, with a Linux based operating system and with UNIX based workstations. Comparisons of this version of the code to earlier evaluations with ENDF/B-V are provided, as well as comparisons with the new libraries

  10. Upgrades to the WIMS-ANL code

    International Nuclear Information System (INIS)

    Woodruff, W.L.; Leopando, L.S.

    1998-01-01

    The dusty old source code in WIMS-D4M has been completely rewritten to conform more closely with current FORTRAN coding practices. The revised code contains many improvements in appearance, error checking and in control of the output. The output is now tabulated to fit the typical 80 column window or terminal screen. The Segev method for resonance integral interpolation is now an option. Most of the dimension limitations have been removed and replaced with variable dimensions within a compile-time fixed container. The library is no longer restricted to the 69 energy group structure, and two new libraries have been generated for use with the code. The new libraries are both based on ENDF/B-VI data with one having the original 69 energy group structure and the second with a 172 group structure. The common source code can be used with PCs using both Windows 95 and NT, with a Linux based operating system and with UNIX based workstations. Comparisons of this version of the code to earlier evaluations with ENDF/B-V are provided, as well as comparisons with the new libraries. (author)

  11. Perceptions of Americans and the Iraq Invasion: Implications for Understanding National Character Stereotypes

    Science.gov (United States)

    Terracciano, Antonio; McCrae, Robert R.

    2008-01-01

    This study examines perceptions of the “typical American” from 49 cultures around the world. Contrary to the ethnocentric bias hypothesis, we found strong agreement between in-group and out-group ratings on the American profile (assertive, open-minded, but antagonistic); Americans in fact had a somewhat less desirable view of Americans than did others. Within cultures, in-group ratings were not systematically more favorable than out-group ratings. The Iraq invasion had a slight negative effect on perceptions of the typical American, but people around the world seem to draw a clear distinction between U.S. foreign policy and the character of the American people. National character stereotypes appear to have a variety of sources and to be perpetuated by both cognitive mechanisms and socio-cultural forces. PMID:18618011

  12. Stars with shell energy sources. Part 1. Special evolutionary code

    International Nuclear Information System (INIS)

    Rozyczka, M.

    1977-01-01

    A new version of the Henyey-type stellar evolution code is described and tested. It is shown, as a by-product of the tests, that the thermal time scale of the core of a red giant approaching the helium flash is of the order of the evolutionary time scale. The code itself appears to be a very efficient tool for investigations of the helium flash, carbon flash and the evolution of a white dwarf accreting mass. (author)

  13. Cultivating characters (moral value) through internalization strategy in science classroom

    Science.gov (United States)

    Ibrahim, M.; Abadi

    2018-01-01

    Whether character is an important learning outcome to be realized by design is still under debate. So far, most people have assumed that knowledgeable and skillful students will automatically develop good character. Evidence obtained lately shows that this assumption is not true: character should be taught deliberately, by design. This study was designed to cultivate elementary school students' character through the science classroom. The teaching-learning process was conducted to facilitate and bridge students from the known (concrete images: science phenomena) to the unknown (abstract ideas, i.e., the characters of care and tolerance). Character was observed for five weeks before and after the intervention. Data were analyzed from observations of 24 students in internalization-strategy-based courses. Qualitative and quantitative data suggest that the internalization strategy, which uses science phenomena to represent abstract ideas (characters) in the science classroom, positively cultivates character.

  14. Compliments in Audiovisual Translation – issues in character identity

    Directory of Open Access Journals (Sweden)

    Isabel Fernandes Silva

    2011-12-01

    Full Text Available Over the last decades, audiovisual translation has gained increased significance in Translation Studies as well as an interdisciplinary subject within other fields (media, cinema studies etc.). Although many articles have been published on communicative aspects of translation such as politeness, only recently have scholars taken an interest in the translation of compliments. This study focuses on both these areas from a multimodal and pragmatic perspective, emphasizing the links between these fields and how this multidisciplinary approach evidences the polysemiotic nature of the translation process. In audiovisual translation both text and image are at play; therefore, the translation of speech produced by the characters may either omit information (because it is provided by visual-gestural signs) or emphasize it. A selection was made of the compliments present in the film What Women Want, our focus being on subtitles which did not successfully convey the compliment expressed in the source text; we also analyze the reasons for this, namely differences in register, culture-specific items and repetitions. These differences lead to a different portrayal/identity/perception of the main character in the English version (original soundtrack) and the subtitled versions in Portuguese and Italian.

  15. Open Inclusion or Shameful Secret: A Comparison of Characters with Fetal Alcohol Spectrum Disorders (FASD) and Characters with Autism Spectrum Disorders (ASD) in a North American Sample of Books for Children and Young Adults

    Science.gov (United States)

    Barker, Conor; Kulyk, Juli; Knorr, Lyndsay; Brenna, Beverley

    2011-01-01

    Using a framework of critical literacy, and acknowledging the characteristics of Radical Change, the authors explore 75 North American youth fiction novels which depict characters with disabilities. Books were identified from a variety of sources (i.e., awards lists, book reviews, other research, and word-of-mouth), to represent a random sample…

  16. Some optimizations of the animal code

    International Nuclear Information System (INIS)

    Fletcher, W.T.

    1975-01-01

    Optimizing techniques were performed on a version of the ANIMAL code (MALAD1B) at the source-code (FORTRAN) level. Sample optimizing techniques and operations used in MALADOP--the optimized version of the code--are presented, along with a critique of some standard CDC 7600 optimizing techniques. The statistical analysis of total CPU time required for MALADOP and MALAD1B shows a run-time saving of 174 msec (almost 3 percent) in the code MALADOP during one time step

  17. Runtime Detection of C-Style Errors in UPC Code

    Energy Technology Data Exchange (ETDEWEB)

    Pirkelbauer, P; Liao, C; Panas, T; Quinlan, D

    2011-09-29

    Unified Parallel C (UPC) extends the C programming language (ISO C 99) with explicit parallel programming support for the partitioned global address space (PGAS), which provides a global memory space with localized partitions to each thread. Like its ancestor C, UPC is a low-level language that emphasizes code efficiency over safety. The absence of dynamic (and static) safety checks allows programmer oversights and software flaws that can be hard to spot. In this paper, we present an extension of a dynamic analysis tool, ROSE-Code Instrumentation and Runtime Monitor (ROSE-CIRM), for UPC to help programmers find C-style errors involving the global address space. Built on top of the ROSE source-to-source compiler infrastructure, the tool instruments source files with code that monitors operations and keeps track of changes to the system state. The resulting code is linked to a runtime monitor that observes the program execution and finds software defects. We describe the extensions to ROSE-CIRM that were necessary to support UPC. We discuss complications that arise from parallel code and our solutions. We test ROSE-CIRM against a runtime error detection test suite, and present performance results obtained from running error-free codes. ROSE-CIRM is released as part of the ROSE compiler under a BSD-style open source license.

  18. Simulation of droplet impact onto a deep pool for large Froude numbers in different open-source codes

    Science.gov (United States)

    Korchagova, V. N.; Kraposhin, M. V.; Marchevsky, I. K.; Smirnova, E. V.

    2017-11-01

    A droplet impact on a deep pool can induce macro-scale or micro-scale effects like a crown splash, a high-speed jet, formation of secondary droplets or thin liquid films, etc. It depends on the diameter and velocity of the droplet, liquid properties, effects of external forces and other factors that a ratio of dimensionless criteria can account for. In the present research, we considered the droplet and the pool consist of the same viscous incompressible liquid. We took surface tension into account but neglected gravity forces. We used two open-source codes (OpenFOAM and Gerris) for our computations. We review the possibility of using these codes for simulation of processes in free-surface flows that may take place after a droplet impact on the pool. Both codes simulated several modes of droplet impact. We estimated the effect of liquid properties with respect to the Reynolds number and Weber number. Numerical simulation enabled us to find boundaries between different modes of droplet impact on a deep pool and to plot corresponding mode maps. The ratio of liquid density to that of the surrounding gas induces several changes in mode maps. Increasing this density ratio suppresses the crown splash.
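The mode boundaries described above are organized by dimensionless groups, which are straightforward to compute. The water-droplet property values below are illustrative, not taken from the study.

```python
def reynolds(rho, v, d, mu):
    """Inertial vs. viscous forces."""
    return rho * v * d / mu

def weber(rho, v, d, sigma):
    """Inertial vs. surface-tension forces."""
    return rho * v**2 * d / sigma

def froude(v, d, g=9.81):
    """Inertial vs. gravitational forces; a large Froude number
    justifies neglecting gravity, as the abstract assumes."""
    return v**2 / (g * d)

# a 2 mm water droplet impacting at 3 m/s (SI units)
rho, mu, sigma = 998.0, 1.0e-3, 0.072
d, v = 2.0e-3, 3.0
Re = reynolds(rho, v, d, mu)
We = weber(rho, v, d, sigma)
Fr = froude(v, d)
```

Plotting observed impact outcomes against (Re, We) pairs is exactly how the mode maps in the study are constructed.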

  19. Character combinations, convergence and diversification in ectoparasitic arthropods.

    Science.gov (United States)

    Poulin, Robert

    2009-08-01

    Different lineages of organisms diversify over time at different rates, in part as a consequence of the characteristics of the species in these lineages. Certain suites of traits possessed by species within a clade may determine rates of diversification, with some particular combinations of characters acting synergistically to either limit or promote diversification; the most successful combinations may also emerge repeatedly in different clades via convergent evolution. Here, the association between species characters and diversification is investigated amongst 21 independent lineages of arthropods ectoparasitic on vertebrate hosts. Using nine characters (each with two to four states) that capture general life history strategy, transmission mode and host-parasite interaction, each lineage was described by the set of character states it possesses. The results show, firstly, that most possible pair-wise combinations of character states have been adopted at least once, sometimes several times independently by different lineages; thus, ectoparasitic arthropods have explored most of the life history character space available to them. Secondly, lineages possessing commonly observed combinations of character states are not necessarily the ones that have experienced the highest rates of diversification (measured as a clade's species-per-genus ratio). Thirdly, some specific traits are associated with higher rates of diversification. Using more than one host per generation, laying eggs away from the host and intermediate levels of fecundity are features that appear to have promoted diversification. These findings indicate that particular species characters may be evolutionary drivers of diversity, whose effects could also apply in other taxa.

  20. The dark cube: dark and light character profiles

    Directory of Open Access Journals (Sweden)

    Danilo Garcia

    2016-02-01

    Full Text Available Background. Research addressing distinctions and similarities between people’s malevolent character traits (i.e., the Dark Triad: Machiavellianism, narcissism, and psychopathy) has detected inconsistent linear associations to temperament traits. Additionally, these dark traits seem to have a common core expressed as uncooperativeness. Hence, some researchers suggest that the dark traits are best represented as one global construct (i.e., the unification argument) rather than as a ternary construct (i.e., the uniqueness argument). We put forward the dark cube (cf. Cloninger’s character cube) comprising eight dark profiles that can be used to compare individuals who differ in one dark character trait while holding the other two constant. Our aim was to investigate in which circumstances individuals who are high in each one of the dark character traits differ in Cloninger’s “light” character traits: self-directedness, cooperativeness, and self-transcendence. We also investigated if people’s dark character profiles were associated with their light character profiles. Method. A total of 997 participants recruited from Amazon’s Mechanical Turk (MTurk) responded to the Short Dark Triad and the Short Character Inventory. Participants were allocated to eight different dark profiles and eight light profiles based on their scores in each of the traits and any possible combination of high and low scores. We used three-way interaction regression analyses and t-tests to investigate differences in light character traits between individuals with different dark profiles. As a second step, we compared each individual’s dark profile with her/his character profile using an exact cell-wise analysis conducted in the ROPstat software (http://www.ropstat.com). Results. Individuals who expressed high levels of Machiavellianism and those who expressed high levels of psychopathy also expressed low self-directedness and low cooperativeness. Individuals with high
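The eight-profile allocation amounts to a high/low split on each of the three dark traits, giving 2³ combinations. The cutoff rule and the letter-coding convention below are assumptions for illustration, not necessarily those used in the study.

```python
def dark_profile(scores, cutoffs):
    """Assign one of the 2**3 = 8 dark-cube profiles.

    A capital letter marks a high score on that trait:
    M/m = Machiavellianism, N/n = narcissism, P/p = psychopathy.
    """
    profile = ''
    for trait, letter in (('machiavellianism', 'm'),
                          ('narcissism', 'n'),
                          ('psychopathy', 'p')):
        high = scores[trait] > cutoffs[trait]
        profile += letter.upper() if high else letter
    return profile

# hypothetical scale midpoints as cutoffs
cutoffs = {'machiavellianism': 3.0, 'narcissism': 3.0, 'psychopathy': 3.0}
profile = dark_profile({'machiavellianism': 4.1, 'narcissism': 2.2,
                        'psychopathy': 3.8}, cutoffs)
```

A profile such as `MnP` then denotes someone high in Machiavellianism and psychopathy but low in narcissism, which is exactly the kind of comparison (one trait varied, two held constant) the dark cube enables.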

  1. Skipping of Chinese characters does not rely on word-based processing.

    Science.gov (United States)

    Lin, Nan; Angele, Bernhard; Hua, Huimin; Shen, Wei; Zhou, Junyi; Li, Xingshan

    2018-02-01

    Previous eye-movement studies have indicated that people tend to skip extremely high-frequency words in sentence reading, such as "the" in English and "的/de" in Chinese. Two alternative hypotheses have been proposed to explain how this frequent skipping happens in Chinese reading: one assumes that skipping happens when the preview has been fully identified at the word level (word-based skipping); the other assumes that skipping happens whenever the preview character is easy to identify, regardless of whether lexical processing has been completed or not (character-based skipping). Using the gaze-contingent display change paradigm, we examined the two hypotheses by substituting the preview of the third character of a four-character Chinese word with the high-frequency Chinese character "的/de", which should disrupt the ongoing word-level processing. The character-based skipping hypothesis predicts that this manipulation will enhance the skipping probability of the target character (i.e., the third character of the target word), because the character "的/de" has much higher character frequency than the original character. The word-based skipping hypothesis instead predicts a reduction of the skipping probability of the target character, because the presence of the character "的/de" is lexically infelicitous at the word level. The results supported the character-based skipping hypothesis, indicating that in Chinese reading the decision to skip a character can be made before integrating it into a word.

  2. Fundamentals of information theory and coding design

    CERN Document Server

    Togneri, Roberto

    2003-01-01

    In a clear, concise, and modular format, this book introduces the fundamental concepts and mathematics of information and coding theory. The authors emphasize how a code is designed and discuss the main properties and characteristics of different coding algorithms along with strategies for selecting the appropriate codes to meet specific requirements. They provide comprehensive coverage of source and channel coding, address arithmetic, BCH, and Reed-Solomon codes and explore some more advanced topics such as PPM compression and turbo codes. Worked examples and sets of basic and advanced exercises in each chapter reinforce the text's clear explanations of all concepts and methodologies.
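The central fact of source coding that such a text builds on is that the entropy of a source lower-bounds the average code length, and a Huffman code comes within one bit of it. A small self-contained sketch (using the empirical symbol distribution of a string):

```python
import heapq
import math
from collections import Counter

def entropy_bits(text):
    """Shannon entropy (bits/symbol) of the empirical distribution."""
    n = len(text)
    return -sum(c / n * math.log2(c / n) for c in Counter(text).values())

def huffman_lengths(text):
    """Codeword length per symbol for a Huffman code built on the text."""
    # heap entries: (weight, unique id, {symbol: depth}); the id breaks
    # ties so the dicts are never compared
    heap = [(count, i, {sym: 0}) for i, (sym, count)
            in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)
        w2, _, t2 = heapq.heappop(heap)
        merged = {s: depth + 1 for s, depth in {**t1, **t2}.items()}
        heapq.heappush(heap, (w1 + w2, next_id, merged))
        next_id += 1
    return heap[0][2]

text = "abracadabra"
H = entropy_bits(text)
lengths = huffman_lengths(text)
avg_len = sum(lengths[s] * text.count(s) for s in lengths) / len(text)
```

For this string the average Huffman codeword length sits between H and H + 1 bits per symbol, as the noiseless source coding theorem guarantees, and the most frequent symbol gets the shortest codeword.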

  3. Head capsule characters in the Hymenoptera and their phylogenetic implications

    Directory of Open Access Journals (Sweden)

    Lars Vilhelmsen

    2011-09-01

    Full Text Available The head capsule of a taxon sample of three outgroup and 86 ingroup taxa is examined for characters of possible phylogenetic significance within Hymenoptera. 21 morphological characters are illustrated and scored, and their character evolution is explored by mapping them onto a phylogeny recently produced from a large morphological data set. Many of the characters are informative and display unambiguous changes. Most of the character support demonstrated is at the superfamily or family level. In contrast, only a few characters corroborate deeper nodes in the phylogeny of Hymenoptera.

  4. Applications guide to the RSIC-distributed version of the MCNP code (coupled Monte Carlo neutron-photon Code)

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1985-09-01

    An overview of the RSIC-distributed version of the MCNP code (a coupled Monte Carlo neutron-photon code) is presented. All general features of the code, from machine hardware requirements to theoretical details, are discussed. The current nuclide cross-section and other libraries available in the standard code package are specified, and a realistic example of the flexible geometry input is given. Standard and nonstandard source, estimator, and variance-reduction procedures are outlined. Examples of correct usage and possible misuse of certain code features are presented graphically and in standard output listings. Finally, itemized summaries of sample problems, various MCNP code documentation, and future work are given

  5. Submatrices of character tables and basic sets

    DEFF Research Database (Denmark)

    Bessenrodt, Christine; Olsson, Jørn Børling

    2012-01-01

    In this investigation of character tables of finite groups we study basic sets and associated representation theoretic data for complementary sets of conjugacy classes. For the symmetric groups we find unexpected properties of characters on restricted sets of conjugacy classes, like beautiful...... combinatorial determinant formulae for submatrices of the character table and Cartan matrices with respect to basic sets; we observe that similar phenomena occur for the transition matrices between power sum symmetric functions to bounded partitions and the k-Schur functions defined by Lapointe and Morse...

  6. Visual Antipriming Effect: Evidence from Chinese Character Identification

    Directory of Open Access Journals (Sweden)

    Feng Zhang

    2017-10-01

    Full Text Available Marsolek et al. (2006) have differentiated antipriming effects from priming effects, by adopting a novel priming paradigm comprised of four phases that include a baseline measurement. The general concept of antipriming supports the overlapping representation theory of knowledge. This study extended examination of the Marsolek et al. (2006) paradigm by investigating antipriming and priming effects in a series of Chinese character identification tasks. Results showed that identification accuracy of old characters was significantly higher than baseline measurements (i.e., the priming effect), while identification accuracy of novel characters was significantly lower than baseline measurements (i.e., the antipriming effect). This study demonstrates for the first time the effect of visual antipriming in Chinese character identification. It further provides new evidence for the overlapping representation theory of knowledge, and supports generalizability of the phenomenon to Chinese characters.

  7. Character strengths, social anxiety, and physiological stress reactivity

    Directory of Open Access Journals (Sweden)

    Tingting Li

    2017-05-01

    Full Text Available In this paper, the effects of character strengths on the physiological reactivity to social anxiety induced by the Trier Social Stress Task were reported. On the basis of their scores in the Chinese Virtues Questionnaire, 30 college students were assigned to either high- (n = 15) or low-character-strength (n = 15) groups. Their psychological stress and physiological data across three laboratory stages (namely, baseline, stress exposure, and post-stress) were collected. Results indicated that individuals with high character strengths exhibited rapid cardiovascular recovery from baseline to post-stress even if high- and low-character-strength groups showed similar patterns of cardiovascular arousal in response to the stress at baseline and stress exposure. These results prove that character strengths are stress-defense factors that allow for psychological and physiological adaptation to stress.

  8. Representations of deaf characters in children's picture books.

    Science.gov (United States)

    Golos, Debbie B; Moses, Annie M

    2011-01-01

    Picture books can influence how children perceive people of different backgrounds, including people with disabilities whose cultures differ from their own. Researchers have examined the portrayal of multicultural characters with disabilities in children's literature. However, few have specifically considered the portrayal of deaf characters, despite increased inclusion of deaf characters in children's literature over the past two decades. The present study analyzed the portrayal of deaf characters in picture books for children ages 4-8 years. A content analysis of 20 children's picture books was conducted in which the books were analyzed for messages linked to pathological and cultural categories. Results indicated that these books did not portray Deaf characters from a cultural perspective but, rather, highlighted aspects of deafness as a medical condition, one that requires fixing and that perpetuates stereotypes of deafness as a disability.

  9. Influence of a Character-Based App on Children's Learning of Nutritional Information: Should Apps Be Served with a Side of Media Characters?

    Science.gov (United States)

    Putnam, Marisa M; Richmond, Elana M; Brunick, Kaitlin L; Wright, Charlotte A; Calvert, Sandra L

    2018-04-01

    Childhood obesity is a health issue in the United States, associated with marketing practices in which media characters are often used to sell unhealthy products. This study examined exposure to a socially contingent touch-screen gaming app, which replied immediately, reliably, and accurately to children's actions. Children's recall of nutritional content and their liking of the character were assessed. Four- and five-year-old children (N = 114) received no-exposure, single-exposure, or repeated-exposure to a character-based iPad app rewarding healthy and penalizing unhealthy behaviors. Children reported how much they liked the character and recalled healthy and unhealthy items from the app. An ordinary least squares regression was conducted on how much children liked the character by condition. Poisson regressions were conducted on the number of items recalled by condition alone, and in an interacted model of treatment condition by liking the character. Children liked the character more in the repeated app-exposure condition than in the control group (P = 0.018). Children in the repeated and single app-exposure conditions recalled more healthy (P < 0.001) and unhealthy (P < 0.001) items than the control group. Within treatment conditions, liking the character increased recall of healthy items in the single app-exposure compared to the repeated app-exposure condition (P = 0.005). Results revealed that repeated exposure increased children's learning of nutritional information and liking of the character. The results contribute to our understanding of how to deliver effective nutrition information to young children in a new venue, a gaming app.

  10. ON CODE REFACTORING OF THE DIALOG SUBSYSTEM OF CDSS PLATFORM FOR THE OPEN-SOURCE MIS OPENMRS

    Directory of Open Access Journals (Sweden)

    A. V. Semenets

    2016-08-01

    The open-source MIS OpenMRS developer tools and software API are reviewed. The results of code refactoring of the dialog subsystem of the CDSS platform, implemented as a module for the open-source MIS OpenMRS, are presented. The structure of the information model of the CDSS dialog subsystem database was updated according to MIS OpenMRS requirements. The Model-View-Controller (MVC) based approach to the CDSS dialog subsystem architecture was re-implemented in Java using the Spring and Hibernate frameworks. The MIS OpenMRS Encounter portlet form for CDSS dialog subsystem integration is developed as an extension. The administrative module of the CDSS platform was recreated. The data exchange formats and methods for interaction between the OpenMRS CDSS dialog subsystem module and the DecisionTree GAE service were re-implemented using AJAX via the jQuery library

  11. Code generation of RHIC accelerator device objects

    International Nuclear Information System (INIS)

    Olsen, R.H.; Hoff, L.; Clifford, T.

    1995-01-01

    A RHIC Accelerator Device Object is an abstraction which provides a software view of a collection of collider control points known as parameters. A grammar has been defined which allows these parameters, along with code describing methods for acquiring and modifying them, to be specified efficiently in compact definition files. These definition files are processed to produce C++ source code. This source code is compiled to produce an object file which can be loaded into a front end computer. Each loaded object serves as an Accelerator Device Object class definition. The collider will be controlled by applications which set and get the parameters in instances of these classes using a suite of interface routines. Significant features of the grammar are described with details about the generated C++ code
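
The definition-file-to-C++ flow described above can be sketched in miniature. The snippet below is a hypothetical illustration in Python (not the actual RHIC grammar or generator; `generate_ado_class` and its inputs are invented): each (parameter, C++ type) pair from a parsed definition file becomes a private member with get/set accessors in an emitted C++ class definition.

```python
def generate_ado_class(name, params):
    """Emit a C++ class skeleton for an Accelerator-Device-Object-like type.

    `params` is a list of (parameter_name, cpp_type) pairs, standing in for
    the parsed contents of a compact definition file.
    """
    lines = [f"class {name} {{", "public:"]
    for pname, ctype in params:
        # one getter/setter pair per control parameter
        lines.append(f"    {ctype} get_{pname}() const {{ return {pname}_; }}")
        lines.append(f"    void set_{pname}({ctype} v) {{ {pname}_ = v; }}")
    lines.append("private:")
    for pname, ctype in params:
        lines.append(f"    {ctype} {pname}_;")
    lines.append("};")
    return "\n".join(lines)
```

A real generator would additionally emit the acquisition and modification method bodies specified in the definition files, and the output would then be compiled into a loadable object file as the abstract describes.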

  13. The Journey of a Source Line: How your Code is Translated into a Controlled Flow of Electrons

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    In this series we help you understand the bits and pieces that make your code command the underlying hardware. A multitude of layers translate and optimize source code, written in compiled and interpreted programming languages such as C++, Python or Java, to machine language. We explain the role and behavior of the layers in question in a typical usage scenario. While our main focus is on compilers and interpreters, we also talk about other facilities - such as the operating system, instruction sets and instruction decoders. Biography: Andrzej Nowak runs TIK Services, a technology and innovation consultancy based in Geneva, Switzerland. In the recent past, he co-founded and sold an award-winning Fintech start-up focused on peer-to-peer lending. Earlier, Andrzej worked at Intel and in the CERN openlab. At openlab, he managed a lab collaborating with Intel and was part of the Chief Technology Office, which set up next-generation technology projects for CERN and the openlab partners.

  14. Coupled geochemical and solute transport code development

    International Nuclear Information System (INIS)

    Morrey, J.R.; Hostetler, C.J.

    1985-01-01

    A number of coupled geochemical hydrologic codes have been reported in the literature. Some of these codes have directly coupled the source-sink term to the solute transport equation. The current consensus seems to be that directly coupling hydrologic transport and chemical models through a series of interdependent differential equations is not feasible for multicomponent problems with complex geochemical processes (e.g., precipitation/dissolution reactions). A two-step process appears to be the required method of coupling codes for problems where a large suite of chemical reactions must be monitored. Two-step structure requires that the source-sink term in the transport equation is supplied by a geochemical code rather than by an analytical expression. We have developed a one-dimensional two-step coupled model designed to calculate relatively complex geochemical equilibria (CTM1D). Our geochemical module implements a Newton-Raphson algorithm to solve heterogeneous geochemical equilibria, involving up to 40 chemical components and 400 aqueous species. The geochemical module was designed to be efficient and compact. A revised version of the MINTEQ Code is used as a parent geochemical code
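
The two-step coupling can be illustrated with a toy sketch (function names and parameters are invented; this is not CTM1D): each time step first advects the solute with an explicit upwind scheme, then hands the concentrations to a stand-in "geochemical module" that supplies the source-sink term, here reduced to a simple solubility cap in place of a Newton-Raphson equilibrium solve.

```python
def transport_step(c, velocity, dx, dt):
    # explicit upwind advection (stable for velocity * dt / dx <= 1);
    # the inlet cell c[0] is held fixed as a boundary condition
    cfl = velocity * dt / dx
    return [c[0]] + [c[i] - cfl * (c[i] - c[i - 1]) for i in range(1, len(c))]

def chemistry_step(c, solubility):
    # stand-in for the geochemical module: any dissolved concentration above
    # the solubility limit precipitates out (the source-sink term)
    return [min(ci, solubility) for ci in c]

def coupled_run(c0, velocity, dx, dt, solubility, steps):
    # two-step structure: transport supplies concentrations to chemistry,
    # chemistry supplies the source-sink correction back to transport
    c = list(c0)
    for _ in range(steps):
        c = transport_step(c, velocity, dx, dt)   # step 1: solute transport
        c = chemistry_step(c, solubility)         # step 2: geochemistry
    return c
```

The design point the abstract makes is visible even here: the chemistry step is an opaque callable rather than an analytic source-sink expression inside the transport equation, so it can be swapped for an arbitrarily complex equilibrium solver.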

  15. Neutron and photon measurements through concrete from a 15 GeV electron beam on a target-comparison with models and calculations. [Intermediate energy source term, Monte Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    Jenkins, T M [Stanford Linear Accelerator Center, CA (USA)

    1979-02-15

    Measurements of neutron and photon dose equivalents from a 15 GeV electron beam striking an iron target inside a scale model of a PEP IR hall are described, and compared with analytic-empirical calculations and with the Monte Carlo code, MORSE. The MORSE code is able to predict both absolute neutron and photon dose equivalents for geometries where the shield is relatively thin, but fails as the shield thickness is increased. An intermediate energy source term is postulated for analytic-empirical neutron shielding calculations to go along with the giant resonance and high energy terms, and a new source term due to neutron capture is postulated for analytic-empirical photon shielding calculations. The source strengths for each energy source term, and each type, are given from analysis of the measurements.

  16. Seismic Analysis Code (SAC): Development, porting, and maintenance within a legacy code base

    Science.gov (United States)

    Savage, B.; Snoke, J. A.

    2017-12-01

    The Seismic Analysis Code (SAC) is the result of the toil of many developers over an almost 40-year history. Initially a Fortran-based code, it has undergone major transitions in underlying bit size from 16 to 32 in the 1980s, and 32 to 64 in 2009, as well as a change in language from Fortran to C in the late 1990s. Maintenance of SAC, the program and its associated libraries, has tracked changes in hardware and operating systems including the advent of Linux in the early 1990s, the emergence and demise of Sun/Solaris, OSX on different processors (PowerPC and x86), and Windows (Cygwin). Traces of these systems are still visible in source code and associated comments. A major concern while improving and maintaining a routinely used, legacy code is a fear of introducing bugs or inadvertently removing favorite features of long-time users. Prior to 2004, SAC was maintained and distributed by LLNL (Lawrence Livermore National Lab). In that year, the license was transferred from LLNL to IRIS (Incorporated Research Institutions for Seismology), but the license is not open source. However, there have been thousands of downloads a year of the package, either source code or binaries for specific systems. Starting in 2004, the co-authors have maintained the SAC package for IRIS. In our updates, we fixed bugs, incorporated newly introduced seismic analysis procedures (such as EVALRESP), added new, accessible features (plotting and parsing), and improved the documentation (now in HTML and PDF formats). Moreover, we have added modern software engineering practices to the development of SAC, including use of recent source control systems, high-level tests, and scripted, virtualized environments for rapid testing and building. Finally, a "sac-help" listserv (administered by IRIS) was set up for SAC-related issues and is the primary avenue for users seeking advice and reporting bugs. Attempts are always made to respond to issues and bugs in a timely fashion.

  17. The Implementation of Character Education at Senior High School

    Directory of Open Access Journals (Sweden)

    Julia

    2018-01-01

    Full Text Available This paper is aimed at analyzing the implementation of character education at senior high schools in Sumedang Regency, West Java, Indonesia. A content analysis method was employed, with data collected through interviews with six teachers from six different senior high schools representing rural, transitional, and urban areas. The findings revealed that: (1) not all teachers understand the concept of character education; (2) character education has not been implemented systematically and has no specific design/model for the teaching and learning process. Most teachers embedded character values during the teaching and learning process as a form of character education, for example through Qur'an recitation, learning tasks, group discussion, lecture, socio-drama, observation and admonition, and even through the teachers' own example. Meanwhile, outside the class, character education was implemented through competitions and extracurricular activities; (3) the evaluation of character education was relatively varied, such as observation followed by admonition, group guidance, and a focus on behavioral and affective assessment in the classroom learning process. This research implies that a policy on program development through the Bureau of Education is needed to improve senior high school teachers' knowledge and skills in implementing character education.

  18. Calculations of fuel burn-up and radionuclide inventory in the syrian miniature neutron source reactor using the WIMSD4 code

    International Nuclear Information System (INIS)

    Khattab, K.

    2005-01-01

    Calculations of the fuel burn-up and radionuclide inventory in the Miniature Neutron Source Reactor after 10 years (the expected life of the reactor core) of reactor operating time are presented in this paper. The WIMSD4 code is used to generate the fuel group constants and the infinite multiplication factor versus the reactor operating time for 10, 20, and 30 kW operating power levels. The amounts of uranium burnt up and plutonium produced in the reactor core, the concentrations and radioactivities of the most important fission product and actinide radionuclides accumulated in the reactor core, and the total radioactivity of the reactor core are calculated using the WIMSD4 code as well

  19. Four energy group neutron flux distribution in the Syrian miniature neutron source reactor using the WIMSD4 and CITATION code

    International Nuclear Information System (INIS)

    Khattab, K.; Omar, H.; Ghazi, N.

    2009-01-01

    A 3-D (R, θ, Z) neutronic model for the Miniature Neutron Source Reactor (MNSR) was developed earlier to conduct the reactor neutronic analysis. The group constants for all the reactor components were generated using the WIMSD4 code. The reactor excess reactivity and the four-group neutron flux distributions were calculated using the CITATION code. This model is used in this paper to calculate the point-wise four-energy-group neutron flux distributions in the MNSR versus the radius, angle and reactor axial directions. Good agreement is observed between the measured and the calculated thermal neutron flux in the inner and the outer irradiation sites, with relative differences less than 7% and 5%, respectively. (author)

  20. Personalized reminiscence therapy M-health application for patients living with dementia: Innovating using open source code repository.

    Science.gov (United States)

    Zhang, Melvyn W B; Ho, Roger C M

    2017-01-01

    Dementia is known to be an illness which brings marked disability among elderly individuals. At times, patients living with dementia also experience non-cognitive symptoms, including hallucinations, delusional beliefs, emotional lability, sexualized behaviours and aggression. According to the National Institute for Clinical Excellence (NICE) guidelines, non-pharmacological techniques are typically the first-line option prior to the consideration of adjuvant pharmacological options. Reminiscence and music therapy are thus viable options. Lazar et al. [3] previously performed a systematic review of the utilization of technology to deliver reminiscence-based therapy to individuals living with dementia and highlighted that technology does have benefits in the delivery of reminiscence therapy. However, to date, there has been a paucity of M-health innovations in this area. In addition, most current innovations are not personalized for each person living with dementia. Prior research has highlighted the utility of open source repositories in bioinformatics research. The authors explain how they made use of an open source repository in the development of a personalized M-health reminiscence therapy innovation for patients living with dementia. The availability of open source code repositories has changed the way healthcare professionals and developers develop smartphone applications today. Conventionally, a long iterative process is needed in the development of a native application, mainly because of the need for native programming and coding, especially if the application needs interactive features or features that can be personalized. Such repositories enable rapid and cost-effective development of applications. Moreover, developers are also able to innovate further, as less time is spent in the iterative process.

  1. Javanese Character Feature Extraction Based on Shape Energy

    Directory of Open Access Journals (Sweden)

    Galih Hendra Wibowo

    2017-07-01

    Full Text Available Javanese script is part of Indonesia's noble cultural heritage, especially in Java. However, the number of Javanese people who are able to read the script has decreased, so conservation efforts are needed in the form of a system that is able to recognize the characters. One solution to this problem lies in Optical Character Recognition (OCR) studies, where one of the central challenges is feature extraction, i.e., distinguishing each character. Shape Energy is one feature extraction method, with the basic idea that a character can be distinguished simply through its skeleton. Building on this idea, a feature extraction was developed based on its components to produce an angular histogram with various multiples of the bin angle. The performance of this method and its base method was then tested on a Javanese character dataset of 240 samples with 19 labels, obtained from various images, using K-Nearest Neighbors as the classification method. Cross-validation yielded an accuracy of 80.83% for the angular histogram with an angle of 20 degrees, 23% better than Shape Energy. In addition, other test results show that this method is able to recognize rotated characters, with the lowest performance value of 86% at 180-degree rotation and the highest of 96.97% at 90-degree rotation. It can be concluded that this method improves on the performance of Shape Energy for the recognition of Javanese characters and is robust to rotation.
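
The pipeline in the abstract, skeleton pixels reduced to an angular histogram and classified with K-Nearest Neighbors, might look roughly like the following. This is an illustrative guess, not the authors' implementation (`angular_histogram` and `knn_predict` are invented names); the 20-degree bin width mirrors the best-performing variant they report.

```python
import math
from collections import Counter

def angular_histogram(points, bin_deg=20):
    # histogram of angles from the shape centroid to each skeleton pixel,
    # binned in multiples of `bin_deg` degrees, normalized to sum to 1
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    bins = [0] * (360 // bin_deg)
    for x, y in points:
        ang = math.degrees(math.atan2(y - cy, x - cx)) % 360
        bins[int(ang // bin_deg) % len(bins)] += 1
    total = sum(bins)
    return [b / total for b in bins]

def knn_predict(train, query, k=1):
    # train: list of (histogram, label); plain k-NN with squared Euclidean distance
    dist = lambda u, v: sum((a - b) ** 2 for a, b in zip(u, v))
    nearest = sorted(train, key=lambda hv: dist(hv[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]
```

Note that rotating a character only shifts this histogram cyclically, which hints at why such features can stay usable on rotated inputs, as in the rotation tests the abstract reports.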

  2. The Inaccuracy of National Character Stereotypes

    Science.gov (United States)

    McCrae, Robert R.; Chan, Wayne; Jussim, Lee; De Fruyt, Filip; Löckenhoff, Corinna E.; De Bolle, Marleen; Costa, Paul T.; Hřebíčková, Martina; Graf, Sylvie; Realo, Anu; Allik, Jüri; Nakazato, Katsuharu; Shimonaka, Yoshiko; Yik, Michelle; Ficková, Emília; Brunner-Sciarra, Marina; Reátigui, Norma; de Figueora, Nora Leibovich; Schmidt, Vanina; Ahn, Chang-kyu; Ahn, Hyun-nie; Aguilar-Vafaie, Maria E.; Siuta, Jerzy; Szmigielska, Barbara; Cain, Thomas R.; Crawford, Jarret T.; Mastor, Khairul Anwar; Rolland, Jean-Pierre; Nansubuga, Florence; Miramontez, Daniel R.; Benet-Martínez, Veronica; Rossier, Jérôme; Bratko, Denis; Marušić, Iris; Halberstadt, Jamin; Yamaguchi, Mami; Knežević, Goran; Purić, Danka; Martin, Thomas A.; Gheorghiu, Mirona; Smith, Peter B.; Barbaranelli, Claudio; Wang, Lei; Shakespeare-Finch, Jane; Lima, Margarida P.; Klinkosz, Waldemar; Sekowski, Andrzej; Alcalay, Lidia; Simonetti, Franco; Avdeyeva, Tatyana V.; Pramila, V. S.; Terracciano, Antonio

    2013-01-01

    Consensual stereotypes of some groups are relatively accurate, whereas others are not. Previous work suggesting that national character stereotypes are inaccurate has been criticized on several grounds. In this article we (a) provide arguments for the validity of assessed national mean trait levels as criteria for evaluating stereotype accuracy; and (b) report new data on national character in 26 cultures from descriptions (N=3,323) of the typical male or female adolescent, adult, or old person in each. The average ratings were internally consistent and converged with independent stereotypes of the typical culture member, but were weakly related to objective assessments of personality. We argue that this conclusion is consistent with the broader literature on the inaccuracy of national character stereotypes. PMID:24187394

  3. Animated Character Analysis and Costume Design with Structured Analysis

    OpenAIRE

    Yıldırım Artaç, Berna; Ağaç, Saliha

    2016-01-01

    In various genres, costumes complement fictional characters and not only constitute the external appearance of the fictional character, but are also used for purposes of fun or style by fans who dress up as the character and internalize that character's state of mind. This phenomenon is called cosplay. A literature review of the field has revealed no study made previously on the cosplay costume design process according to character analysis. The present study emphasizes the link between an...

  4. A New Monte Carlo Neutron Transport Code at UNIST

    International Nuclear Information System (INIS)

    Lee, Hyunsuk; Kong, Chidong; Lee, Deokjung

    2014-01-01

    A Monte Carlo neutron transport code named MCS is under development at UNIST for advanced reactor design and research purposes. This MC code can be used for fixed-source and criticality calculations. Both continuous-energy neutron cross section data and multi-group cross section data can be used for the MC calculation. This paper presents an overview of the developed MC code and its calculation results. The real-time fixed-source calculation capability is also tested in this paper. The calculation results show good agreement with a commercial code and with experiment. The MC code is tested with several benchmark problems: ICSBEP, VENUS-2, and the Hoogenboom-Martin benchmark. These benchmarks cover pin geometry to 3-dimensional whole core, and the results show good agreement with reference results
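
For a flavor of what a fixed-source Monte Carlo calculation does, here is a deliberately tiny sketch, unrelated to the MCS code itself: neutrons enter a purely absorbing slab, free-flight distances are sampled from the exponential distribution, and the transmitted fraction is tallied. With enough histories the tally converges to the analytic answer exp(-Σt · T).

```python
import math
import random

def slab_transmission(sigma_t, thickness, n_histories, seed=1):
    """Toy fixed-source MC: fraction of neutrons crossing an absorbing slab."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_histories):
        # sample a free-flight distance from p(s) = sigma_t * exp(-sigma_t * s);
        # 1 - random() lies in (0, 1], so the log is always defined
        s = -math.log(1.0 - rng.random()) / sigma_t
        if s > thickness:
            transmitted += 1
    return transmitted / n_histories
```

A production code adds scattering, fission, continuous-energy cross sections, geometry tracking and variance reduction on top of exactly this sampling loop.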

  5. TEA: A CODE CALCULATING THERMOCHEMICAL EQUILIBRIUM ABUNDANCES

    Energy Technology Data Exchange (ETDEWEB)

    Blecic, Jasmina; Harrington, Joseph; Bowman, M. Oliver, E-mail: jasmina@physics.ucf.edu [Planetary Sciences Group, Department of Physics, University of Central Florida, Orlando, FL 32816-2385 (United States)

    2016-07-01

    We present an open-source Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. The code is based on the methodology of White et al. and Eriksson. It applies Gibbs free-energy minimization using an iterative, Lagrangian optimization scheme. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature–pressure pairs. We tested the code against the method of Burrows and Sharp, the free thermochemical equilibrium code Chemical Equilibrium with Applications (CEA), and the example given by Burrows and Sharp. Using their thermodynamic data, TEA reproduces their final abundances, but with higher precision. We also applied the TEA abundance calculations to models of several hot-Jupiter exoplanets, producing expected results. TEA is written in Python in a modular format. There is a start guide, a user manual, and a code document in addition to this theory paper. TEA is available under a reproducible-research, open-source license via https://github.com/dzesmin/TEA.
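
The core idea, choosing the composition that minimizes the Gibbs free energy, can be shown on a toy two-species system A ⇌ B with ideal mixing. TEA itself handles many species and elemental constraints with a Lagrangian scheme; the sketch below is only illustrative, with invented function names. In RT units the minimum satisfies x/(1 − x) = exp(μ°A − μ°B), which the numerical search recovers.

```python
import math

def gibbs(x, mu_a, mu_b):
    # dimensionless Gibbs energy (in units of RT) of an ideal A/B mixture;
    # x is the mole fraction of B, and the log terms are the mixing entropy
    return (1 - x) * (mu_a + math.log(1 - x)) + x * (mu_b + math.log(x))

def equilibrium_fraction(mu_a, mu_b, tol=1e-10):
    # minimize g on (0, 1) by ternary search; g is convex since
    # d2g/dx2 = 1/x + 1/(1-x) > 0
    lo, hi = 1e-12, 1 - 1e-12
    while hi - lo > tol:
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if gibbs(m1, mu_a, mu_b) < gibbs(m2, mu_a, mu_b):
            hi = m2
        else:
            lo = m1
    return 0.5 * (lo + hi)
```

With many species the same minimization is done over all mole numbers subject to elemental-abundance constraints, which is where the iterative Lagrangian scheme comes in.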

  6. TEA: A CODE CALCULATING THERMOCHEMICAL EQUILIBRIUM ABUNDANCES

    International Nuclear Information System (INIS)

    Blecic, Jasmina; Harrington, Joseph; Bowman, M. Oliver

    2016-01-01

    We present an open-source Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. The code is based on the methodology of White et al. and Eriksson. It applies Gibbs free-energy minimization using an iterative, Lagrangian optimization scheme. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature–pressure pairs. We tested the code against the method of Burrows and Sharp, the free thermochemical equilibrium code Chemical Equilibrium with Applications (CEA), and the example given by Burrows and Sharp. Using their thermodynamic data, TEA reproduces their final abundances, but with higher precision. We also applied the TEA abundance calculations to models of several hot-Jupiter exoplanets, producing expected results. TEA is written in Python in a modular format. There is a start guide, a user manual, and a code document in addition to this theory paper. TEA is available under a reproducible-research, open-source license via https://github.com/dzesmin/TEA.

  7. Radiation transport phenomena and modeling - part A: Codes

    International Nuclear Information System (INIS)

    Lorence, L.J.

    1997-01-01

    The need to understand how particle radiation (high-energy photons and electrons) from a variety of sources affects materials and electronics has motivated the development of sophisticated computer codes that describe how radiation with energies from 1.0 keV to 100.0 GeV propagates through matter. Predicting radiation transport is the necessary first step in predicting radiation effects. The radiation transport codes that are described here are general-purpose codes capable of analyzing a variety of radiation environments including those produced by nuclear weapons (x-rays, gamma rays, and neutrons), by sources in space (electrons and ions) and by accelerators (x-rays, gamma rays, and electrons). Applications of these codes include the study of radiation effects on electronics, nuclear medicine (imaging and cancer treatment), and industrial processes (food disinfestation, waste sterilization, manufacturing). The primary focus will be on coupled electron-photon transport codes, with some brief discussion of proton transport. These codes model a radiation cascade in which electrons produce photons and vice versa. This coupling between particles of different types is important for radiation effects. For instance, in an x-ray environment, electrons are produced that drive the response in electronics. In an electron environment, dose due to bremsstrahlung photons can be significant once the source electrons have been stopped

  8. Method for coding low entropy data

    Science.gov (United States)

    Yeh, Pen-Shu (Inventor)

    1995-01-01

    A method of lossless data compression for efficient coding of an electronic signal of information sources of very low information rate is disclosed. In this method, S represents a non-negative source symbol set (s_0, s_1, s_2, ..., s_{N-1}) of N symbols with s_i = i. The difference between binary digital data is mapped into symbol set S. Consecutive symbols in symbol set S are then paired into a new symbol set Γ, which defines a non-negative symbol set containing the symbols (γ_m) obtained as the extension of the original symbol set S. These pairs are then mapped into a comma code, which is defined as a coding scheme in which every codeword is terminated with the same comma pattern, such as a 1. This allows a direct coding and decoding of the n-bit positive integer digital data differences without the use of codebooks.
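
A minimal sketch of the scheme described above (function names are invented, and the pairing into the extended symbol set Γ is omitted for brevity): signed differences are folded onto the non-negative symbols of S, and each symbol n is written as n zeros terminated by the comma pattern '1', so codeword boundaries are decodable without a codebook.

```python
def to_symbol(d):
    # fold a signed difference onto a non-negative symbol:
    # 0, -1, 1, -2, 2  ->  0, 1, 2, 3, 4
    return 2 * d if d >= 0 else -2 * d - 1

def comma_encode(symbols):
    # each non-negative symbol n becomes n '0's terminated by the comma '1'
    return "".join("0" * s + "1" for s in symbols)

def comma_decode(bits):
    symbols, run = [], 0
    for b in bits:
        if b == "0":
            run += 1
        else:            # the comma pattern ends the current codeword
            symbols.append(run)
            run = 0
    return symbols
```

Because every codeword ends with the same terminator, small (high-probability) symbols get very short codewords, which is what makes the scheme attractive at very low information rates.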

  9. About Chinese Characters

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    It is perhaps a facet of human nature that makes a person want to be king, and to control others. The character "王" was originally "王", symbolizing the emerald prayer beads worn exclusively by the king. In the course of this character's evolution, however, new connotations were

  10. The genetic and environmental structure of the character sub-scales of the temperament and character inventory in adolescence.

    Science.gov (United States)

    Lester, Nigel; Garcia, Danilo; Lundström, Sebastian; Brändström, Sven; Råstam, Maria; Kerekes, Nóra; Nilsson, Thomas; Cloninger, C Robert; Anckarsäter, Henrik

    2016-01-01

    The character higher order scales (self-directedness, cooperativeness, and self-transcendence) in the temperament and character inventory are important general measures of health and well-being [Mens Sana Monograph 11:16-24 (2013)]. Recent research has found suggestive evidence of common environmental influence on the development of these character traits during adolescence. The present article expands earlier research by focusing on the internal consistency and the etiology of traits measured by the lower order sub-scales of the character traits in adolescence. The twin modeling analysis of 423 monozygotic pairs and 408 same sex dizygotic pairs estimated additive genetics (A), common environmental (C), and non-shared environmental (E) influences on twin resemblance. All twins were part of the on-going longitudinal Child and Adolescent Twin Study in Sweden (CATSS). The twin modeling analysis suggested a common environmental contribution for two out of five self-directedness sub-scales (0.14 and 0.23), for three out of five cooperativeness sub-scales (0.07-0.17), and for all three self-transcendence sub-scales (0.10-0.12). The genetic structure at the level of the character lower order sub-scales in adolescents shows that the proportion of the shared environmental component varies in the trait of self-directedness and in the trait of cooperativeness, while it is relatively stable across the components of self-transcendence. The presence of this unique shared environmental effect in adolescence has implications for understanding the relative importance of interventions and treatment strategies aimed at promoting overall maturation of character, mental health, and well-being during this period of the life span.

  11. Genetic Code Analysis Toolkit: A novel tool to explore the coding properties of the genetic code and DNA sequences

    Science.gov (United States)

    Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.

    2018-01-01

    The genetic code is degenerated and it is assumed that redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still the subject of current research. This paper presents a Genetic Code Analysis Toolkit (GCAT) which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and others. GCAT comes with a fertile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: open source. Homepage: http://www.gcat.bio/
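As an illustration of one property GCAT can test: a set of codons is comma-free when no out-of-frame reading of any two concatenated codewords lands back in the set. A minimal, independent sketch of that check follows (not GCAT's own code; the function name is ours):

```python
def is_comma_free(codons):
    # X is comma-free if, for every concatenation of two codons from X,
    # neither of the two shifted (out-of-frame) triplets lies in X.
    X = set(codons)
    for a in X:
        for b in X:
            s = a + b
            if s[1:4] in X or s[2:5] in X:
                return False
    return True

print(is_comma_free({"ACG", "TAC"}))  # True: no shifted triplet re-enters the set
print(is_comma_free({"AAA"}))         # False: "AAA"+"AAA" reads as "AAA" in every frame
```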

  12. Directivity of Spherical Polyhedron Sound Source Used in Near-Field HRTF Measurements

    International Nuclear Information System (INIS)

    Yu Guang-Zheng; Xie Bo-Sun; Rao Dan

    2010-01-01

    The omnidirectional character is one of the important requirements for the sound source used in near-field head-related transfer function (HRTF) measurements. Based on an analysis of the radiated sound pressure and the directivity of various spherical polyhedron sound sources, a spherical dodecahedral sound source with a radius of 0.035 m is proposed and manufactured. Theoretical and measured results indicate that the sound source is approximately omnidirectional below a frequency of 8 kHz. In addition, the sound source has a reasonable magnitude response from 350 Hz to 20 kHz and linear phase characteristics. Therefore, it is suitable for near-field HRTF measurements.

  13. Parity-Check Network Coding for Multiple Access Relay Channel in Wireless Sensor Cooperative Communications

    Directory of Open Access Journals (Sweden)

    Du Bing

    2010-01-01

    Full Text Available A recently developed theory suggests that network coding is a generalization of source coding and channel coding and thus yields a significant performance improvement in terms of throughput and spatial diversity. This paper proposes a cooperative design of a parity-check network coding scheme in the context of a two-source multiple access relay channel (MARC) model, a common compact model in hierarchical wireless sensor networks (WSNs). The scheme uses Low-Density Parity-Check (LDPC) codes as the surrogate to build up a layered structure which encapsulates the multiple constituent LDPC codes in the source and relay nodes. Specifically, the relay node decodes the messages from the two sources, which are used to generate extra parity-check bits by a random network coding procedure to fill the rate gap between the Source-Relay and Source-Destination transmissions. We then derive the key algebraic relationships among the multidimensional LDPC constituent codes as one of the constraints for code profile optimization. These extra check bits are sent to the destination to realize cooperative diversity as well as to approach the MARC decode-and-forward (DF) capacity.
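The underlying network-coding idea — a relay combining the two decoded source messages into extra check bits so a destination can recover whichever message it is missing — can be shown with plain XOR parity. This is a deliberately minimal sketch; the paper's actual scheme builds optimized LDPC constituent codes, which are not reproduced here.

```python
def xor_bits(a, b):
    # Bitwise XOR of two equal-length bit vectors.
    return [x ^ y for x, y in zip(a, b)]

x1 = [1, 0, 1, 1, 0]             # message decoded from source 1
x2 = [0, 1, 1, 0, 1]             # message decoded from source 2
relay_parity = xor_bits(x1, x2)  # extra check bits sent by the relay

# A destination that received x1 directly plus the relay's parity
# can recover the missing x2:
assert xor_bits(x1, relay_parity) == x2
```

The same parity serves both destinations, which is the throughput gain network coding offers over forwarding each message separately.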

  14. How jurors use and misuse character evidence.

    Science.gov (United States)

    Hunt, Jennifer S; Budesheim, Thomas Lee

    2004-04-01

    The Federal Rules of Evidence allow defendants to offer testimony about their good character, but that testimony can be impeached with cross-examination or a rebuttal witness. It is assumed that jurors use the defense's character evidence (CE) to form guilt and conviction judgments but use impeachment evidence only to assess the character witness's credibility. Two experiments tested these assumptions by presenting mock jurors with various forms of CE and impeachment. Participants made trait ratings for the character witness and defendant and guilt and conviction judgments. Positive CE did not affect guilt or conviction judgments, but cross-examination caused a backlash in which judgments were harsher than when no CE was given. Using path analysis, the authors tested a model of the process by which CE and impeachment affect defendant and witness impressions and guilt and conviction judgments. Implications for juror decision making are discussed.

  15. Introducing Character Animation with Blender

    CERN Document Server

    Mullen, Tony

    2011-01-01

    Introducing Character Animation with Blender, 2nd Edition is written in a friendly but professional tone, with clear descriptions and numerous illustrative screenshots. Throughout the book, tutorials focus on how to accomplish actual animation goals, while illustrating the necessary technical methods along the way. These are reinforced by clear descriptions of how each specific aspect of Blender works and fits together with the rest of the package. By following all the tutorials, the reader will gain all the skills necessary to build and animate a well-modeled, fully-rigged character of their own

  16. Benchmark for license plate character segmentation

    Science.gov (United States)

    Gonçalves, Gabriel Resende; da Silva, Sirlene Pio Gomes; Menotti, David; Shwartz, William Robson

    2016-09-01

    Automatic license plate recognition (ALPR) has been the focus of much research in recent years. In general, ALPR is divided into the following problems: detection of on-track vehicles, license plate detection, segmentation of license plate characters, and optical character recognition (OCR). Even though commercial solutions are available for controlled acquisition conditions, e.g., the entrance of a parking lot, ALPR is still an open problem when dealing with data acquired from uncontrolled environments, such as roads and highways, when relying only on imaging sensors. Due to the multiple orientations and scales of the license plates captured by the camera, a very challenging task of ALPR is the license plate character segmentation (LPCS) step, because its effectiveness must be (near) optimal to achieve a high recognition rate in the OCR. To tackle the LPCS problem, this work proposes a benchmark composed of a dataset designed to focus specifically on the character segmentation step of ALPR, within an evaluation protocol. Furthermore, we propose the Jaccard-centroid coefficient, an evaluation measure more suitable than the Jaccard coefficient regarding the location of the bounding box within the ground-truth annotation. The dataset is composed of 2,000 Brazilian license plates consisting of 14,000 alphanumeric symbols and their corresponding bounding box annotations. We also present a straightforward approach to perform LPCS efficiently. Finally, we provide an experimental evaluation of the dataset based on five LPCS approaches and demonstrate the importance of character segmentation for achieving accurate OCR.
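For reference, the plain Jaccard coefficient that the proposed Jaccard-centroid measure refines is straightforward to compute for axis-aligned boxes. The sketch below is generic (the centroid-weighted variant itself is defined in the paper and not reproduced here):

```python
def jaccard(box_a, box_b):
    # Jaccard coefficient (intersection over union) of two axis-aligned
    # boxes given as (x1, y1, x2, y2) with x1 < x2 and y1 < y2.
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    iw = max(0, min(ax2, bx2) - max(ax1, bx1))   # intersection width
    ih = max(0, min(ay2, by2) - max(ay1, by1))   # intersection height
    inter = iw * ih
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union else 0.0

print(jaccard((0, 0, 2, 2), (1, 0, 3, 2)))  # 1/3: the boxes share a 1x2 strip
```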

  17. Technical guide for monitoring selected conditions related to wilderness character

    Science.gov (United States)

    Peter Landres; Steve Boutcher; Liese Dean; Troy Hall; Tamara Blett; Terry Carlson; Ann Mebane; Carol Hardy; Susan Rinehart; Linda Merigliano; David N. Cole; Andy Leach; Pam Wright; Deb Bumpus

    2009-01-01

    The purpose of monitoring wilderness character is to improve wilderness stewardship by providing managers a tool to assess how selected actions and conditions related to wilderness character are changing over time. Wilderness character monitoring provides information to help answer two key questions about wilderness character and wilderness stewardship: 1. How is...

  18. Radioactive releases of nuclear power plants: the code ASTEC

    International Nuclear Information System (INIS)

    Sdouz, G.; Pachole, M.

    1999-11-01

    In order to adopt potential countermeasures to protect the population during the course of an accident in a nuclear power plant, a fast prediction of the radiation exposure is necessary. The basic input value for such a dispersion calculation is the source term, i.e. the description of the physical and chemical behavior of the released radioactive nuclides. Based on a source term database, a pilot system has been developed to determine a relevant source term and to generate the input file for the dispersion code TAMOS of the Zentralanstalt fuer Meteorologie und Geodynamik (ZAMG). This file can be sent directly as an e-mail attachment to the TAMOS user for further processing. Source terms for 56 European nuclear power plant units are included in the pilot version of the code ASTEC (Austrian Source Term Estimation Code). The use of the system is demonstrated with an example based on an accident in the unit TEMELIN-1. In order to calculate typical core inventories for the database, the international computer code ORIGEN 2.1 was installed and applied. The report concludes with a discussion of optimal data transfer. (author)

  19. Input/output manual of light water reactor fuel performance code FEMAXI-7 and its related codes

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa [Japan Atomic Energy Agency, Nuclear Safety Research Center, Tokai, Ibaraki (Japan); Saitou, Hiroaki [ITOCHU Techno-Solutions Corp., Tokyo (Japan)

    2012-07-15

    A light water reactor fuel analysis code FEMAXI-7 has been developed for the purpose of analyzing fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which has been fully disclosed in the code model description published recently as JAEA-Data/Code 2010-035. The present manual, which is the counterpart of that description, gives detailed explanations of the operation method of the FEMAXI-7 code and its related codes, methods of input/output, methods of source code modification, features of subroutine modules, and internal variables, in a specific manner, in order to help users perform a fuel analysis with FEMAXI-7. This report includes some descriptions which are modified from the original contents of JAEA-Data/Code 2010-035. A CD-ROM is attached as an appendix. (author)

  20. Input/output manual of light water reactor fuel performance code FEMAXI-7 and its related codes

    International Nuclear Information System (INIS)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa; Saitou, Hiroaki

    2012-07-01

    A light water reactor fuel analysis code FEMAXI-7 has been developed for the purpose of analyzing fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which has been fully disclosed in the code model description published recently as JAEA-Data/Code 2010-035. The present manual, which is the counterpart of that description, gives detailed explanations of the operation method of the FEMAXI-7 code and its related codes, methods of input/output, methods of source code modification, features of subroutine modules, and internal variables, in a specific manner, in order to help users perform a fuel analysis with FEMAXI-7. This report includes some descriptions which are modified from the original contents of JAEA-Data/Code 2010-035. A CD-ROM is attached as an appendix. (author)

  1. Effect of refractive error on temperament and character properties

    Institute of Scientific and Technical Information of China (English)

    Emine Kalkan Akcay; Fatih Canan; Huseyin Simavli; Derya Dal; Hacer Yalniz; Nagihan Ugurlu; Omer Gecici; Nurullah Cagil

    2015-01-01

    AIM: To determine the effect of refractive error on temperament and character properties using Cloninger's psychobiological model of personality. METHODS: Using the Temperament and Character Inventory (TCI), the temperament and character profiles of 41 participants with refractive errors (17 with myopia, 12 with hyperopia, and 12 with myopic astigmatism) were compared to those of 30 healthy control participants. Here, temperament comprised the traits of novelty seeking, harm avoidance, and reward dependence, while character comprised the traits of self-directedness, cooperativeness, and self-transcendence. RESULTS: Participants with refractive error showed significantly lower scores on purposefulness, cooperativeness, empathy, helpfulness, and compassion (P<0.05, P<0.01, P<0.05, P<0.05, and P<0.01, respectively). CONCLUSION: Refractive error might have a negative influence on some character traits, and different types of refractive error might have different temperament and character properties. These personality traits may be implicated in the onset and/or perpetuation of refractive errors and may be a productive focus for psychotherapy.

  2. Glyph Identification and Character Recognition for Sindhi OCR

    Directory of Open Access Journals (Sweden)

    NISAR AHMEDMEMON

    2017-10-01

    Full Text Available A computer can read and write multiple languages, and today's computers are capable of understanding various human languages. A computer can be given instructions through various input methods, but OCR (Optical Character Recognition) and handwritten character recognition are the input methods in which a scanned page containing text is converted into written or editable text. A change in the language of the text on a scanned page demands a different algorithm to recognize the text, because every language and script poses its own challenges for text recognition. Latin-script recognition poses fewer difficulties than Arabic script and the languages that use Arabic script for writing, and OCR systems for these Latin languages are near perfection. Very little work has been done on the regional languages of Pakistan. In this paper the Sindhi glyphs are identified, and the number of characters and connected components are identified for this regional language of Pakistan. A graphical user interface has been created to perform the identification task for glyphs and characters of the Sindhi language. The glyphs of characters are successfully identified from the scanned page, and this information can be used to recognize characters. The language glyph identification can be used to apply a suitable algorithm to identify the language as well as to achieve a higher recognition rate.

  3. Character, Social-Emotional, and Academic Outcomes among Underachieving Elementary School Students

    Science.gov (United States)

    Grier, Leslie K.

    2012-01-01

    One purpose of this research was to examine the psychometric properties of a character assessment scale (the Character Assessment for School Age Children; CASAC) based on 6 pillars of character (Josephson Institute, 2009). Many youth development and character education programs utilize some, if not all, of the pillars of character explicitly or…

  4. A Japanese logographic character frequency list for cognitive science research.

    Science.gov (United States)

    Chikamatsu, N; Yokoyama, S; Nozaki, H; Long, E; Fukuda, S

    2000-08-01

    This paper describes a Japanese logographic character (kanji) frequency list, which is based on an analysis of the largest recently available corpus of Japanese words and characters. This corpus comprised a full year of morning and evening editions of a major newspaper, containing more than 23 million kanji characters and more than 4,000 different kanji characters. This paper lists the 3,000 most frequent kanji characters, as well as an analysis of kanji usage and correlations between the present list and previous Japanese frequency lists. The authors believe that the present list will help researchers more accurately and efficiently control the selection of kanji characters in cognitive science research and interpret related psycholinguistic data.
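Tallying such a frequency list from raw text is mechanically simple. The sketch below is our own illustration (not the authors' pipeline); it counts characters falling in the CJK Unified Ideographs block, which covers common kanji while excluding kana, Latin letters, and punctuation:

```python
from collections import Counter

def kanji_frequency(text, top=10):
    # Keep only characters in the CJK Unified Ideographs block
    # (U+4E00..U+9FFF); kana, Latin, and punctuation are ignored.
    counts = Counter(ch for ch in text if "\u4e00" <= ch <= "\u9fff")
    return counts.most_common(top)

sample = "漢字の頻度を数える。漢字は多い。"
print(kanji_frequency(sample, top=3))
```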

  5. Calculation of source terms for NUREG-1150

    International Nuclear Information System (INIS)

    Breeding, R.J.; Williams, D.C.; Murfin, W.B.; Amos, C.N.; Helton, J.C.

    1987-10-01

    The source terms estimated for NUREG-1150 are generally based on the Source Term Code Package (STCP), but the actual source term calculations used in computing risk are performed by much smaller codes which are specific to each plant. This was done because the method of estimating the uncertainty in risk for NUREG-1150 requires hundreds of source term calculations for each accident sequence. This is clearly impossible with a large, detailed code like the STCP. The small plant-specific codes are based on simple algorithms and utilize adjustable parameters. The values of the parameters appearing in these codes are derived from the available STCP results. To determine the uncertainty in the estimation of the source terms, these parameters were varied as specified by an expert review group. This method was used to account for the uncertainties in the STCP results and the uncertainties in phenomena not considered by the STCP

  6. Fast-neutron, coded-aperture imager

    International Nuclear Information System (INIS)

    Woolf, Richard S.; Phlips, Bernard F.; Hutcheson, Anthony L.; Wulf, Eric A.

    2015-01-01

    This work discusses a large-scale, coded-aperture imager for fast neutrons, building on a proof-of-concept instrument developed at the U.S. Naval Research Laboratory (NRL). The Space Science Division at the NRL has a heritage of developing large-scale, mobile systems, using coded-aperture imaging, for long-range γ-ray detection and localization. The fast-neutron, coded-aperture imaging instrument, designed for a mobile unit (20 ft. ISO container), consists of a 32-element array of 15 cm×15 cm×15 cm liquid scintillation detectors (EJ-309) mounted behind a 12×12 pseudorandom coded aperture. The elements of the aperture are composed of 15 cm×15 cm×10 cm blocks of high-density polyethylene (HDPE). The arrangement of the aperture elements produces a shadow pattern on the detector array behind the mask. By measuring the number of neutron counts per masked and unmasked detector, and with knowledge of the mask pattern, a source image can be deconvolved to obtain a 2-d location. The number of neutrons per detector was obtained by processing the fast signal from each PMT in flash digitizing electronics. Digital pulse shape discrimination (PSD) was performed to filter out the fast-neutron signal from the γ background. The prototype instrument was tested at an indoor facility at the NRL with 1.8-μCi and 13-μCi 252Cf neutron/γ sources at three standoff distances of 9, 15 and 26 m (maximum allowed in the facility) over a 15-min integration time. The imaging and detection capabilities of the instrument were tested by moving the source in half- and one-pixel increments across the image plane. We show a representative sample of the results obtained at one-pixel increments for a standoff distance of 9 m. The 1.8-μCi source was not detected at the 26-m standoff. In order to increase the sensitivity of the instrument, we reduced the fast-neutron background by shielding the top, sides and back of the detector array with 10-cm-thick HDPE. This shielding configuration led
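The decoding step described above — correlating detector counts against the known mask pattern — can be illustrated in one dimension. This is a toy sketch, not the NRL instrument's actual processing: the length-7 mask is built from the quadratic residues mod 7, a standard uniformly-redundant-array construction whose balanced decoding array concentrates a point source into a single peak.

```python
import numpy as np

# Open positions at the quadratic residues mod 7, {1, 2, 4}: a (7,3,1)
# difference set, so every nonzero shift overlaps the mask equally.
mask = np.array([0, 1, 1, 0, 1, 0, 0])

source = np.zeros(7)
source[3] = 100.0  # point source at position 3

# Detector counts: circular correlation of the source with the mask.
counts = np.array([sum(source[j] * mask[(i - j) % 7] for j in range(7))
                   for i in range(7)])

# Balanced decoding array: +1 where the mask is open, -1 where closed.
G = 2 * mask - 1

# Correlating the counts with G recovers the source position as a peak.
recon = np.array([sum(counts[(i + j) % 7] * G[j] for j in range(7))
                  for i in range(7)])
print(int(np.argmax(recon)))  # prints 3, the true source position
```

Because the mask is a difference set, every off-peak position receives the same (negative) correlation value, so the peak stands out unambiguously.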

  7. Fast-neutron, coded-aperture imager

    Energy Technology Data Exchange (ETDEWEB)

    Woolf, Richard S., E-mail: richard.woolf@nrl.navy.mil; Phlips, Bernard F., E-mail: bernard.phlips@nrl.navy.mil; Hutcheson, Anthony L., E-mail: anthony.hutcheson@nrl.navy.mil; Wulf, Eric A., E-mail: eric.wulf@nrl.navy.mil

    2015-06-01

    This work discusses a large-scale, coded-aperture imager for fast neutrons, building on a proof-of-concept instrument developed at the U.S. Naval Research Laboratory (NRL). The Space Science Division at the NRL has a heritage of developing large-scale, mobile systems, using coded-aperture imaging, for long-range γ-ray detection and localization. The fast-neutron, coded-aperture imaging instrument, designed for a mobile unit (20 ft. ISO container), consists of a 32-element array of 15 cm×15 cm×15 cm liquid scintillation detectors (EJ-309) mounted behind a 12×12 pseudorandom coded aperture. The elements of the aperture are composed of 15 cm×15 cm×10 cm blocks of high-density polyethylene (HDPE). The arrangement of the aperture elements produces a shadow pattern on the detector array behind the mask. By measuring the number of neutron counts per masked and unmasked detector, and with knowledge of the mask pattern, a source image can be deconvolved to obtain a 2-d location. The number of neutrons per detector was obtained by processing the fast signal from each PMT in flash digitizing electronics. Digital pulse shape discrimination (PSD) was performed to filter out the fast-neutron signal from the γ background. The prototype instrument was tested at an indoor facility at the NRL with 1.8-μCi and 13-μCi 252Cf neutron/γ sources at three standoff distances of 9, 15 and 26 m (maximum allowed in the facility) over a 15-min integration time. The imaging and detection capabilities of the instrument were tested by moving the source in half- and one-pixel increments across the image plane. We show a representative sample of the results obtained at one-pixel increments for a standoff distance of 9 m. The 1.8-μCi source was not detected at the 26-m standoff. In order to increase the sensitivity of the instrument, we reduced the fast-neutron background by shielding the top, sides and back of the detector array with 10-cm-thick HDPE. This shielding configuration led

  8. A database application for wilderness character monitoring

    Science.gov (United States)

    Ashley Adams; Peter Landres; Simon Kingston

    2012-01-01

    The National Park Service (NPS) Wilderness Stewardship Division, in collaboration with the Aldo Leopold Wilderness Research Institute and the NPS Inventory and Monitoring Program, developed a database application to facilitate tracking and trend reporting in wilderness character. The Wilderness Character Monitoring Database allows consistent, scientifically based...

  9. On character amenability of semigroup algebras | Maepa ...

    African Journals Online (AJOL)

    We study the character amenability of semigroup algebras. We work on general semigroups and certain semigroups such as inverse semigroups with a finite number of idempotents, inverse semigroups with a uniformly locally finite idempotent set, and Brandt and Rees semigroups, and study the character amenability of the ...

  10. The Strategies for Character Building through Sports Participation

    OpenAIRE

    M.S. Omar-Fauzee; Mohd Nizam Nazarudin; Yudha M. Saputra; Nina Sutresna; Duangkrai Taweesuk; Wipoj Chansem; Rozita Abd. Latif; Soh Kim Geok

    2012-01-01

    Sports participation has been a major part of life in our societies. Studies of sports participation have found that sports have both positive and negative influences on character building. There has been ongoing debate about whether ‘sports build character’, but through literature analysis the authors found that ‘with the intention, sports do build character.’ Therefore, strategies for building character through sports are suggested in this paper.

  11. Total Moral Quality: A New Approach for Character Education in Pesantren

    Directory of Open Access Journals (Sweden)

    Hasan Baharun

    2017-06-01

    Full Text Available This paper presents the concept of character education in pesantren which gives valuable contributions to the success of moral development for students. It also offers a different paradigm in developing the concept of character building in educational institution. This article is inspired by the lack of effective character learning in a variety of formal educational institutions. Hence, the schools which succeed in instilling character education can be used as a reference to develop character education. This study focuses on character education model developed by pesantren and offers an alternative perspective of the development of character education in Indonesia. This study adopts a qualitative research approach and uses a case study design. The study shows that the model of character education in pesantren is carried out through a multi-disciplinary approach so as to provide maximum results for the development of character education. This study suggests that the Total Moral Quality (TMQ is the further development of Thomas Lickona’s concept of character education of moral modeling, moral knowing, moral feeling and moral habituation and is applicable in the school.

  12. Character context: a shape descriptor for Arabic handwriting recognition

    Science.gov (United States)

    Mudhsh, Mohammed; Almodfer, Rolla; Duan, Pengfei; Xiong, Shengwu

    2017-11-01

    In the handwriting recognition field, designing good descriptors is essential to obtaining rich information from the data. However, designing a good descriptor for handwriting recognition is still an open issue due to the unlimited variation in human handwriting. We introduce a "character context descriptor" that efficiently deals with the structural characteristics of Arabic handwritten characters. First, the character image is smoothed and normalized, then a character context descriptor of 32 feature bins is built based on the proposed "distance function." Finally, a multilayer perceptron with regularization is used as a classifier. In experiments on a handwritten Arabic character database, the proposed method achieved state-of-the-art performance, with recognition rates equal to 98.93% and 99.06% for the 66 and 24 classes, respectively.

  13. Video Game Characters. Theory and Analysis

    OpenAIRE

    Felix Schröter; Jan-Noël Thon

    2014-01-01

    This essay develops a method for the analysis of video game characters based on a theoretical understanding of their medium-specific representation and the mental processes involved in their intersubjective construction by video game players. We propose to distinguish, first, between narration, simulation, and communication as three modes of representation particularly salient for contemporary video games and the characters they represent, second, between narrative, ludic, and social experien...

  14. Input/output manual of light water reactor fuel analysis code FEMAXI-7 and its related codes

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa [Japan Atomic Energy Agency, Nuclear Safety Research Center, Tokai, Ibaraki (Japan); Saitou, Hiroaki [ITOCHU Techno-Solutions Corporation, Tokyo (Japan)

    2013-10-15

    A light water reactor fuel analysis code FEMAXI-7 has been developed, as an extended version of the former version FEMAXI-6, for the purpose of analyzing fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which are fully disclosed in the code model description published in the form of another JAEA-Data/Code report. The present manual, which is the counterpart of that description document, gives detailed explanations of the files and operation method of the FEMAXI-7 code and its related codes, methods of input/output, sample input/output, methods of source code modification, subroutine structure, and internal variables, in a specific manner, in order to help users perform fuel analysis with FEMAXI-7. (author)

  15. Input/output manual of light water reactor fuel analysis code FEMAXI-7 and its related codes

    International Nuclear Information System (INIS)

    Suzuki, Motoe; Udagawa, Yutaka; Nagase, Fumihisa; Saitou, Hiroaki

    2013-10-01

    A light water reactor fuel analysis code FEMAXI-7 has been developed, as an extended version of the former version FEMAXI-6, for the purpose of analyzing fuel behavior in normal conditions and in anticipated transient conditions. Numerous functional improvements and extensions have been incorporated in FEMAXI-7, which are fully disclosed in the code model description published in the form of another JAEA-Data/Code report. The present manual, which is the counterpart of that description document, gives detailed explanations of the files and operation method of the FEMAXI-7 code and its related codes, methods of input/output, sample input/output, methods of source code modification, subroutine structure, and internal variables, in a specific manner, in order to help users perform fuel analysis with FEMAXI-7. (author)

  16. Featureous: A Tool for Feature-Centric Analysis of Java Software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    Feature-centric comprehension of source code is necessary for incorporating user-requested modifications during software evolution and maintenance. However, such comprehension is difficult to achieve in case of large object-oriented programs due to the size, complexity, and implicit character of mappings between features and source code. To support programmers in overcoming these difficulties, we present a feature-centric analysis tool, Featureous. Our tool extends the NetBeans IDE with mechanisms for efficient location of feature implementations in legacy source code, and an extensive analysis

  17. Importance and correlations of characters for cowpea diversity in traditional varieties

    Directory of Open Access Journals (Sweden)

    Márcia Silva de Mendonça

    Full Text Available ABSTRACT Cowpea is a legume with ample plasticity, versatility and nutritional potential. It is a species widely used as a source of income and subsistence for small farmers in several Brazilian states, among them Acre. Due to the different varieties found in the state, it is a target of studies aiming at its genetic improvement. Thus, as one of the first stages of improvement, this study aimed to determine the importance and correlations of characters for diversity and selection in traditional varieties of cowpea. The experiment was carried out in a completely randomized design, with plots consisting of four 15.7 L pots with one plant each and two replicates. The characteristics evaluated were: days to emergence, flowering (days), plant vigor (score), number of main stem nodes, apical leaflet length (mm), apical leaflet width (mm), pod length (cm) and pod width (cm). The correlation coefficients (phenotypic, genotypic and environmental) were obtained, and principal component analysis and the assessment of character importance by the method proposed by Singh were carried out. The Singh method and principal component analysis were partially concordant in distinguishing the evaluated characters. Days to emergence, apical leaflet width, flowering and apical leaflet length were the main determinants for quantification of the genotypes and contributed the most to the variability of cowpea. The least discriminant characteristic by principal component analysis, recommended for discarding, was plant vigor.
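
    The PCA-based importance ranking the abstract describes can be sketched with NumPy. The data below are random stand-ins, not the cowpea measurements, and the discard rule (heaviest loading on the least-informative component) is a common convention in character-discarding studies, not necessarily Singh's exact criterion:

```python
import numpy as np

# Hypothetical measurements: 8 plants x 4 characters
# (days to emergence, flowering, leaflet length, plant vigor score).
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))

# Standardize, then eigendecompose the correlation matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
R = np.corrcoef(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)            # ascending order
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Fraction of total variability carried by each principal component.
explained = eigvals / eigvals.sum()
# The character loading heaviest on the LAST component contributes least
# to total variability and is the usual candidate for discarding.
discard = int(np.argmax(np.abs(eigvecs[:, -1])))
```

With real data, `explained` would single out the dominant characters (here, days to emergence and leaflet width) while `discard` would point at plant vigor.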

  18. Evaluation Model of the Entrepreneurial Character in EU Countries

    Directory of Open Access Journals (Sweden)

    Sebastian Madalin Munteanu

    2015-02-01

    Full Text Available The evidence of entrepreneurship development as a factor of sustainable growth at national and regional level frequently calls for the interest of theorists and practitioners on identifying and outlining the best conditions and economic essential prerequisites for supporting the entrepreneurial initiatives on the long term. In this context, the objective of the present research is to analyse and measure the entrepreneurial character of the European Union member countries in an integrated manner, by developing an innovative model for proposing specific action lines and objectively evaluating the entrepreneurship development in the investigated states. Our model is based on a synthesis variable of the entrepreneurial national character, which was developed by sequential application of principal component analysis, while the initial variables are from secondary sources with good conceptual representativeness. Depending on the objective relevance of the three model components (cultural, economic and administrative, and entrepreneurial education), the achieved results confirm the importance of a favourable cultural and economic and administrative background for entrepreneurship development and they reiterate the inefficiency of isolated entrepreneurial education unless supported by good entrepreneurial culture or adequate economic and administrative infrastructure. The case of Romania, in relation with the European Union member countries, is presented in detail.

  19. Methods for Coding Tobacco-Related Twitter Data: A Systematic Review.

    Science.gov (United States)

    Lienemann, Brianna A; Unger, Jennifer B; Cruz, Tess Boley; Chu, Kar-Hai

    2017-03-31

    As Twitter has grown in popularity to 313 million monthly active users, researchers have increasingly been using it as a data source for tobacco-related research. The objective of this systematic review was to assess the methodological approaches of categorically coded tobacco Twitter data and make recommendations for future studies. Data sources included PsycINFO, Web of Science, PubMed, ABI/INFORM, Communication Source, and Tobacco Regulatory Science. Searches were limited to peer-reviewed journals and conference proceedings in English from January 2006 to July 2016. The initial search identified 274 articles using a Twitter keyword and a tobacco keyword. One coder reviewed all abstracts and identified 27 articles that met the following inclusion criteria: (1) original research, (2) focused on tobacco or a tobacco product, (3) analyzed Twitter data, and (4) coded Twitter data categorically. One coder extracted data collection and coding methods. E-cigarettes were the most common type of Twitter data analyzed, followed by specific tobacco campaigns. The most prevalent data sources were Gnip and Twitter's Streaming application programming interface (API). The primary methods of coding were hand-coding and machine learning. The studies predominantly coded for relevance, sentiment, theme, user or account, and location of user. Standards for data collection and coding should be developed to be able to more easily compare and replicate tobacco-related Twitter results. Additional recommendations include the following: sample Twitter's databases multiple times, make a distinction between message attitude and emotional tone for sentiment, code images and URLs, and analyze user profiles. Being relatively novel and widely used among adolescents and black and Hispanic individuals, Twitter could provide a rich source of tobacco surveillance data among vulnerable populations. ©Brianna A Lienemann, Jennifer B Unger, Tess Boley Cruz, Kar-Hai Chu. Originally published in the

  20. Computational Design of Animated Mechanical Characters

    Science.gov (United States)

    Coros, Stelian; Thomaszewski, Bernhard; DRZ Team Team

    2014-03-01

    A key factor in the appeal of modern CG movies and video-games is that the virtual worlds they portray place no bounds on what can be imagined. Rapid manufacturing devices hold the promise of bringing this type of freedom to our own world, by enabling the fabrication of physical objects whose appearance, deformation behaviors and motions can be precisely specified. In order to unleash the full potential of this technology, however, computational design methods that create digital content suitable for fabrication need to be developed. In recent work, we presented a computational design system that allows casual users to create animated mechanical characters. Given an articulated character as input, the user designs the animated character by sketching motion curves indicating how they should move. For each motion curve, our framework creates an optimized mechanism that reproduces it as closely as possible. The resulting mechanisms are attached to the character and then connected to each other using gear trains, which are created in a semi-automated fashion. The mechanical assemblies generated with our system can be driven with a single input driver, such as a hand-operated crank or an electric motor, and they can be fabricated using rapid prototyping devices.

  1. Optical Character Recognition.

    Science.gov (United States)

    Converso, L.; Hocek, S.

    1990-01-01

    This paper describes computer-based optical character recognition (OCR) systems, focusing on their components (the computer, the scanner, the OCR, and the output device); how the systems work; and features to consider in selecting a system. A list of 26 questions to ask to evaluate systems for potential purchase is included. (JDD)

  2. A structural query system for Han characters

    DEFF Research Database (Denmark)

    Skala, Matthew

    2016-01-01

    The IDSgrep structural query system for Han character dictionaries is presented. This dictionary search system represents the spatial structure of Han characters using Extended Ideographic Description Sequences (EIDSes), a data model and syntax based on the Unicode IDS concept. It includes a query...... language for EIDS databases, with a freely available implementation and format translation from popular third-party IDS and XML character databases. The system is designed to suit the needs of font developers and foreign language learners. The search algorithm includes a bit vector index inspired by Bloom...... filters to support faster query operations. Experimental results are presented, evaluating the effect of the indexing on query performance....
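
    The "bit vector index inspired by Bloom filters" mentioned above can be sketched as follows. This is an illustration of the general technique, not IDSgrep's actual implementation; the 64-bit width, the toy component data, and the use of MD5 as the hash are all assumptions:

```python
import hashlib

BITS = 64  # width of the per-entry bit vector (an assumption, not IDSgrep's)

def bitvector(components):
    """Hash each component into a fixed-width bit vector, Bloom-filter style."""
    v = 0
    for c in components:
        h = int(hashlib.md5(c.encode("utf-8")).hexdigest(), 16)
        v |= 1 << (h % BITS)
    return v

# Toy dictionary: character -> its visual components (hypothetical data).
index = {ch: bitvector(parts) for ch, parts in {
    "好": ["女", "子"],
    "妈": ["女", "马"],
    "林": ["木", "木"],
}.items()}

def candidates(query_parts):
    """A candidate must contain every query bit.  False positives are
    possible, false negatives are not -- the Bloom-filter guarantee that
    lets the index cheaply prune before an exact structural match."""
    q = bitvector(query_parts)
    return [ch for ch, v in index.items() if v & q == q]

hits = candidates(["女"])   # characters that may contain the component 女
```

The index turns most query evaluations into a single AND-and-compare per entry, which is why it speeds up query operations in the experiments.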

  3. Molecular phylogenetics and character evolution of morphologically diverse groups, Dendrobium section Dendrobium and allies

    Science.gov (United States)

    Takamiya, Tomoko; Wongsawad, Pheravut; Sathapattayanon, Apirada; Tajima, Natsuko; Suzuki, Shunichiro; Kitamura, Saki; Shioda, Nao; Handa, Takashi; Kitanaka, Susumu; Iijima, Hiroshi; Yukawa, Tomohisa

    2014-01-01

    It is always difficult to construct coherent classification systems for plant lineages having diverse morphological characters. The genus Dendrobium, one of the largest genera in the Orchidaceae, includes ∼1100 species, and enormous morphological diversification has hindered the establishment of consistent classification systems covering all major groups of this genus. Given the particular importance of species in Dendrobium section Dendrobium and allied groups as floriculture and crude drug genetic resources, there is an urgent need to establish a stable classification system. To clarify phylogenetic relationships in Dendrobium section Dendrobium and allied groups, we analysed the macromolecular characters of the group. Phylogenetic analyses of 210 taxa of Dendrobium were conducted on DNA sequences of internal transcribed spacer (ITS) regions of 18S–26S nuclear ribosomal DNA and the maturase-coding gene (matK) located in an intron of the plastid gene trnK using maximum parsimony and Bayesian methods. The parsimony and Bayesian analyses revealed 13 distinct clades in the group comprising section Dendrobium and its allied groups. Results also showed paraphyly or polyphyly of sections Amblyanthus, Aporum, Breviflores, Calcarifera, Crumenata, Dendrobium, Densiflora, Distichophyllae, Dolichocentrum, Holochrysa, Oxyglossum and Pedilonum. On the other hand, the monophyly of section Stachyobium was well supported. It was found that many of the morphological characters that have been believed to reflect phylogenetic relationships are, in fact, the result of convergence. As such, many of the sections that have been recognized up to this point were found to not be monophyletic, so recircumscription of sections is required. PMID:25107672

  4. Developing Individual and Team Character in Sport

    Science.gov (United States)

    Gaines, Stacey A.

    2012-01-01

    The idea that participation in sport builds character is a long-standing one. Advocates of sport participation believe that sport provides an appropriate context for the learning of social skills such as cooperation and the development of prosocial behavior (Weiss, Smith, & Stuntz, 2008). Research in sport regarding character development has…

  5. Polarization diversity scheme on spectral polarization coding optical code-division multiple-access network

    Science.gov (United States)

    Yen, Chih-Ta; Huang, Jen-Fa; Chang, Yao-Tang; Chen, Bo-Hau

    2010-12-01

    We present an experiment demonstrating a spectral-polarization coding optical code-division multiple-access system under nonideal state-of-polarization (SOP) matching conditions. In the proposed system, the encoding and double balanced-detection processes are implemented using a polarization-diversity scheme. Because of the quasi-orthogonality of Hadamard codes combined with array waveguide grating routers and a polarization beam splitter, the proposed codec pair can encode and decode multiple Hadamard code words while retaining the ability for multiple-access interference cancellation. The experimental results demonstrate that when the system maintains an orthogonal SOP for each user, an effective reduction in the phase-induced intensity noise is obtained. The analytical SNR values are found to overstate the experimental results by around 2 dB when the received effective power is large; this is mainly limited by the insertion losses of components and a nonflattened optical light source. Furthermore, the matching conditions can be improved by decreasing nonideal influences.
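
    The quasi-orthogonality of Hadamard code words that the abstract relies on can be seen directly from the Sylvester construction. This sketch only demonstrates the code-word property; the actual system implements the correlation optically, in hardware:

```python
import numpy as np

def hadamard(n):
    """Sylvester construction: H(2m) = [[H, H], [H, -H]], n a power of two."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

H = hadamard(8)
# Distinct rows have zero cross-correlation, so a decoder correlating
# against its own code word rejects the other users' contributions --
# the basis of multiple-access interference cancellation.
gram = H @ H.T   # equals 8 * identity for an order-8 Hadamard matrix
```

The zero off-diagonal entries of `gram` are exactly the cancelled cross-terms; in the experiment this cancellation degrades only when the SOP matching becomes nonideal.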

  6. Personality, temperament and character in Erich Fromm's theory

    Directory of Open Access Journals (Sweden)

    Redžić Saduša F.

    2012-01-01

    Full Text Available The character of man is a substitute for the instincts that animals have. In reality, characters are not found in pure form, as a single orientation, but as a mixture of types. Fromm's social character has an ethical and heuristic importance. Human passions are rooted in character and are a way to give meaning to existence, to respond to the human existential situation. If we are not able to respond through love, then, in its absence, we turn to destructiveness. According to Fromm, the most important goal of society should be human development. He lays down a rational belief in critical thought coupled with love of life. Although the development of personality is largely determined by social structure, Fromm concludes it is not entirely passive: a man has the opportunity, space and power to use his mind to react to alienation and inhumane living conditions. Through his analysis of the social character, Fromm gives a critique of modern, market-oriented society, based on the principles of humanistic ethics.

  7. Membangun Karakter Anak Usia Dini melalui Pembelajaran Math Character (Building Early Childhood Character through Math Character Learning)

    Directory of Open Access Journals (Sweden)

    Titin Faridatun Nisa’

    2016-09-01

    Full Text Available This study aims to examine the application of math character learning to build the character of early-childhood (AUD) children and the difficulties teachers experience in applying it. The target of this research is the formation of early-childhood character through math character learning. This is a descriptive study using qualitative research methods; information was collected through observation and interviews, and the data were analysed descriptively. The results show that the application of math character learning can build the eighteen early-childhood character values. The difficulties teachers experienced in forming early-childhood character through math character learning include: the theme used was a new one, students were not yet accustomed to centre-based learning, the students' ages varied, and parents interfered in classroom learning activities, leaving the students less independent.

  8. Chinese character recognition based on Gabor feature extraction and CNN

    Science.gov (United States)

    Xiong, Yudian; Lu, Tongwei; Jiang, Yongyuan

    2018-03-01

    As an important application in the field of text line recognition and office automation, Chinese character recognition has become an important subject of pattern recognition. However, due to the large number of Chinese characters and the complexity of their structure, Chinese character recognition presents great difficulty. To solve this problem, this paper proposes a method of printed Chinese character recognition based on Gabor feature extraction and a Convolutional Neural Network (CNN). The main steps are preprocessing, feature extraction, and training and classification. First, the gray-scale Chinese character image is binarized and normalized to reduce the redundancy of the image data. Second, each image is convolved with Gabor filters of different orientations, and feature maps for eight orientations of the Chinese characters are extracted. Third, the feature maps from the Gabor filters and the original image are convolved with learned kernels, and the results of the convolution are the input of the pooling layer. Finally, the feature vector is used for classification and recognition. In addition, the generalization capacity of the network is improved by Dropout. The experimental results show that this method can effectively extract the characteristics of Chinese characters and recognize them.
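
    The eight-orientation Gabor feature extraction described above can be sketched in NumPy. The kernel size and the sigma/wavelength parameters are illustrative assumptions, not the paper's values, and the naive correlation loop stands in for a library convolution:

```python
import numpy as np

def gabor_kernel(theta, size=11, sigma=3.0, lam=6.0):
    """Real part of a Gabor filter oriented at angle theta (radians)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)

# Eight orientations, as in the paper's eight-orientation feature maps.
bank = [gabor_kernel(k * np.pi / 8) for k in range(8)]

def filter2d(img, k):
    """Naive 'valid'-mode 2-D correlation (these kernels are symmetric,
    so it equals convolution; scipy would normally do this)."""
    kh, kw = k.shape
    H, W = img.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return out

img = np.zeros((32, 32)); img[:, 16] = 1.0     # toy binarized vertical stroke
maps = [filter2d(img, k) for k in bank]        # 8 orientation feature maps
```

In the paper's pipeline these eight maps, together with the original image, would then be fed into the CNN's convolutional layers.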

  9. Computation of the bounce-average code

    International Nuclear Information System (INIS)

    Cutler, T.A.; Pearlstein, L.D.; Rensink, M.E.

    1977-01-01

    The bounce-average computer code simulates the two-dimensional velocity transport of ions in a mirror machine. The code evaluates and bounce-averages the collision operator and sources along the field line. A self-consistent equilibrium magnetic field is also computed using the long-thin approximation. Optionally included are terms that maintain μ, J invariance as the magnetic field changes in time. The assumptions and analysis that form the foundation of the bounce-average code are described. When references can be cited, the required results are merely stated and explained briefly. A listing of the code is appended

  10. Code system BCG for gamma-ray skyshine calculation

    International Nuclear Information System (INIS)

    Ryufuku, Hiroshi; Numakunai, Takao; Miyasaka, Shun-ichi; Minami, Kazuyoshi.

    1979-03-01

    A code system, BCG, has been developed for conveniently and efficiently calculating gamma-ray skyshine doses using the transport calculation codes ANISN and DOT and the point-kernel calculation codes G-33 and SPAN. To simplify the input to the system, the input forms for these codes are unified, twelve geometric patterns are introduced to specify material regions, and standard data are available as a library. To treat complex arrangements of source and shield, the codes can also be used successively, so that the results from one code may be used as input data to the same or another code. (author)

  11. The CHEASE code for toroidal MHD equilibria

    International Nuclear Information System (INIS)

    Luetjens, H.

    1996-03-01

    CHEASE solves the Grad-Shafranov equation for the MHD equilibrium of a Tokamak-like plasma with pressure and current profiles specified by analytic forms or sets of data points. Equilibria marginally stable to ballooning modes or with a prescribed fraction of bootstrap current can be computed. The code provides a mapping to magnetic flux coordinates, suitable for MHD stability calculations or global wave propagation studies. The code computes equilibrium quantities for the stability codes ERATO, MARS, PEST, NOVA-W and XTOR and for the global wave propagation codes LION and PENN. The two-dimensional MHD equilibrium (Grad-Shafranov) equation is solved in variational form. The discretization uses bicubic Hermite finite elements with continuous first order derivatives for the poloidal flux function Ψ. The nonlinearity of the problem is handled by Picard iteration. The mapping to flux coordinates is carried out with a method which conserves the accuracy of the cubic finite elements. The code uses routines from the CRAY libsci.a program library. However, all these routines are included in the CHEASE package itself. If CHEASE computes equilibrium quantities for MARS with fast Fourier transforms, the NAG library is required. CHEASE is written in standard FORTRAN-77, except for the use of the input facility NAMELIST. CHEASE uses variable names with up to 8 characters, and therefore violates the ANSI standard. CHEASE transfers plot quantities through an external disk file to a plot program named PCHEASE using the UNIRAS or the NCAR plot package. (author) figs., tabs., 34 refs

  12. The CHEASE code for toroidal MHD equilibria

    Energy Technology Data Exchange (ETDEWEB)

    Luetjens, H. [Ecole Polytechnique, 91 - Palaiseau (France). Centre de Physique Theorique; Bondeson, A. [Chalmers Univ. of Technology, Goeteborg (Sweden). Inst. for Electromagnetic Field Theory and Plasma Physics; Sauter, O. [ITER-San Diego, La Jolla, CA (United States)

    1996-03-01

    CHEASE solves the Grad-Shafranov equation for the MHD equilibrium of a Tokamak-like plasma with pressure and current profiles specified by analytic forms or sets of data points. Equilibria marginally stable to ballooning modes or with a prescribed fraction of bootstrap current can be computed. The code provides a mapping to magnetic flux coordinates, suitable for MHD stability calculations or global wave propagation studies. The code computes equilibrium quantities for the stability codes ERATO, MARS, PEST, NOVA-W and XTOR and for the global wave propagation codes LION and PENN. The two-dimensional MHD equilibrium (Grad-Shafranov) equation is solved in variational form. The discretization uses bicubic Hermite finite elements with continuous first order derivatives for the poloidal flux function Ψ. The nonlinearity of the problem is handled by Picard iteration. The mapping to flux coordinates is carried out with a method which conserves the accuracy of the cubic finite elements. The code uses routines from the CRAY libsci.a program library. However, all these routines are included in the CHEASE package itself. If CHEASE computes equilibrium quantities for MARS with fast Fourier transforms, the NAG library is required. CHEASE is written in standard FORTRAN-77, except for the use of the input facility NAMELIST. CHEASE uses variable names with up to 8 characters, and therefore violates the ANSI standard. CHEASE transfers plot quantities through an external disk file to a plot program named PCHEASE using the UNIRAS or the NCAR plot package. (author) figs., tabs., 34 refs.
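
    The Picard iteration CHEASE uses to handle the nonlinearity can be illustrated on a 1-D analogue. This is a sketch of the general strategy (freeze the nonlinear term, solve the linear problem, repeat), not the Grad-Shafranov solver itself; the equation and grid are assumptions:

```python
import numpy as np

# 1-D nonlinear boundary-value analogue: -u'' = exp(-u) on (0, 1),
# u(0) = u(1) = 0.  Each sweep freezes the nonlinear right-hand side at
# the previous iterate and solves the resulting *linear* system.
n = 49
h = 1.0 / (n + 1)
A = (np.diag(np.full(n, 2.0))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2       # discrete -d2/dx2

u = np.zeros(n)
for _ in range(50):
    u_new = np.linalg.solve(A, np.exp(-u))       # linear solve, frozen RHS
    converged = np.max(np.abs(u_new - u)) < 1e-12
    u = u_new
    if converged:
        break
```

Because the frozen-RHS map is a contraction here, the iterates converge to the solution of the full nonlinear problem; CHEASE applies the same fixed-point idea to the 2-D variational form.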

  13. Adaptive Combined Source and Channel Decoding with Modulation ...

    African Journals Online (AJOL)

    In this paper, an adaptive system employing combined source and channel decoding with modulation is proposed for slow Rayleigh fading channels. Huffman code is used as the source code and Convolutional code is used for error control. The adaptive scheme employs a family of Convolutional codes of different rates ...

  14. A Chinese character teaching system using structure theory and morphing technology.

    Science.gov (United States)

    Sun, Linjia; Liu, Min; Hu, Jiajia; Liang, Xiaohui

    2014-01-01

    This paper proposes a Chinese character teaching system by using the Chinese character structure theory and the 2D contour morphing technology. This system, including the offline phase and the online phase, automatically generates animation for the same Chinese character from different writing stages to intuitively show the evolution of shape and topology in the process of teaching Chinese characters. The offline phase builds the component models database for the same script and the components correspondence database for different scripts. Given two or several different scripts of the same Chinese character, the online phase firstly divides the Chinese characters into components by using the process of Chinese character parsing, and then generates the evolution animation by using the process of Chinese character morphing. Finally, two writing stages of Chinese characters, i.e., seal script and clerical script, are used in an experiment to show the ability of the system. The result of the user experience study shows that the system can successfully guide students to improve the learning of Chinese characters. And the users agree that the system is interesting and can motivate them to learn.

  15. A Chinese character teaching system using structure theory and morphing technology.

    Directory of Open Access Journals (Sweden)

    Linjia Sun

    Full Text Available This paper proposes a Chinese character teaching system by using the Chinese character structure theory and the 2D contour morphing technology. This system, including the offline phase and the online phase, automatically generates animation for the same Chinese character from different writing stages to intuitively show the evolution of shape and topology in the process of teaching Chinese characters. The offline phase builds the component models database for the same script and the components correspondence database for different scripts. Given two or several different scripts of the same Chinese character, the online phase firstly divides the Chinese characters into components by using the process of Chinese character parsing, and then generates the evolution animation by using the process of Chinese character morphing. Finally, two writing stages of Chinese characters, i.e., seal script and clerical script, are used in an experiment to show the ability of the system. The result of the user experience study shows that the system can successfully guide students to improve the learning of Chinese characters. And the users agree that the system is interesting and can motivate them to learn.
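
    The core of 2D contour morphing between corresponding component contours reduces to interpolating matched point sequences. This is a minimal sketch; the contours below are geometric stand-ins, not real seal-script or clerical-script outlines, and the paper's system additionally solves the correspondence problem that is assumed here:

```python
import numpy as np

def morph(src, dst, t):
    """Linear blend of two corresponding point contours, t in [0, 1]."""
    return (1.0 - t) * src + t * dst

# Hypothetical corresponding contours of one component in two scripts:
# a square outline morphing toward a diamond.
square  = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
diamond = np.array([[0.5, -0.2], [1.2, 0.5], [0.5, 1.2], [-0.2, 0.5]])

# Ten in-between frames form the evolution animation for this component.
frames = [morph(square, diamond, t) for t in np.linspace(0.0, 1.0, 10)]
```

Rendering the frames in sequence gives the shape-evolution animation; the system does this per component after parsing each character.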

  16. Character Education of the Most Developed Countries in ASEAN

    Science.gov (United States)

    Istiningsih

    2016-01-01

    Character education has become an international issue, especially in developing countries. More specifically, in Indonesia, character education has been a major issue from 2012 to the present. What kind of education may build character? To be able to answer this question, we need broad and deep research. Simpler research related to character…

  17. Character Education in Three Schools: Catholic, Quaker and Public

    Science.gov (United States)

    Meidl, Christopher; Meidl, Tynisha

    2013-01-01

    Character education has always played a role in the purpose of schools. Most US states have a statement about character education as a part of the mission of the schools. This research studied how character education was perceived by participants in regards to school mission statements/philosophies, school atmosphere and curriculum in a Catholic…

  18. TU-AB-BRC-10: Modeling of Radiotherapy Linac Source Terms Using ARCHER Monte Carlo Code: Performance Comparison of GPU and MIC Computing Accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Liu, T; Lin, H; Xu, X [Rensselaer Polytechnic Institute, Troy, NY (United States); Su, L [John Hopkins University, Baltimore, MD (United States); Shi, C [Saint Vincent Medical Center, Bridgeport, CT (United States); Tang, X [Memorial Sloan Kettering Cancer Center, West Harrison, NY (United States); Bednarz, B [University of Wisconsin, Madison, WI (United States)

    2016-06-15

    Purpose: (1) To perform phase space (PS) based source modeling for Tomotherapy and Varian TrueBeam 6 MV Linacs, (2) to examine the accuracy and performance of the ARCHER Monte Carlo code on a heterogeneous computing platform with Many Integrated Core coprocessors (MIC, aka Xeon Phi) and GPUs, and (3) to explore the software micro-optimization methods. Methods: The patient-specific source of Tomotherapy and Varian TrueBeam Linacs was modeled using the PS approach. For the helical Tomotherapy case, the PS data were calculated in our previous study (Su et al. 2014 41(7) Medical Physics). For the single-view Varian TrueBeam case, we analytically derived them from the raw patient-independent PS data in IAEA’s database, partial geometry information of the jaw and MLC as well as the fluence map. The phantom was generated from DICOM images. The Monte Carlo simulation was performed by ARCHER-MIC and GPU codes, which were benchmarked against a modified parallel DPM code. Software micro-optimization was systematically conducted, and was focused on SIMD vectorization of tight for-loops and data prefetch, with the ultimate goal of increasing 512-bit register utilization and reducing memory access latency. Results: Dose calculation was performed for two clinical cases, a Tomotherapy-based prostate cancer treatment and a TrueBeam-based left breast treatment. ARCHER was verified against the DPM code. The statistical uncertainty of the dose to the PTV was less than 1%. Using double-precision, the total wall time of the multithreaded CPU code on an X5650 CPU was 339 seconds for the Tomotherapy case and 131 seconds for the TrueBeam, while on three 5110P MICs it was reduced to 79 and 59 seconds, respectively. The single-precision GPU code on a K40 GPU took 45 seconds for the Tomotherapy dose calculation. Conclusion: We have extended ARCHER, the MIC- and GPU-based Monte Carlo dose engine, to Tomotherapy and TrueBeam dose calculations.

  19. TU-AB-BRC-10: Modeling of Radiotherapy Linac Source Terms Using ARCHER Monte Carlo Code: Performance Comparison of GPU and MIC Computing Accelerators

    International Nuclear Information System (INIS)

    Liu, T; Lin, H; Xu, X; Su, L; Shi, C; Tang, X; Bednarz, B

    2016-01-01

    Purpose: (1) To perform phase space (PS) based source modeling for Tomotherapy and Varian TrueBeam 6 MV Linacs, (2) to examine the accuracy and performance of the ARCHER Monte Carlo code on a heterogeneous computing platform with Many Integrated Core coprocessors (MIC, aka Xeon Phi) and GPUs, and (3) to explore the software micro-optimization methods. Methods: The patient-specific source of Tomotherapy and Varian TrueBeam Linacs was modeled using the PS approach. For the helical Tomotherapy case, the PS data were calculated in our previous study (Su et al. 2014 41(7) Medical Physics). For the single-view Varian TrueBeam case, we analytically derived them from the raw patient-independent PS data in IAEA’s database, partial geometry information of the jaw and MLC as well as the fluence map. The phantom was generated from DICOM images. The Monte Carlo simulation was performed by ARCHER-MIC and GPU codes, which were benchmarked against a modified parallel DPM code. Software micro-optimization was systematically conducted, and was focused on SIMD vectorization of tight for-loops and data prefetch, with the ultimate goal of increasing 512-bit register utilization and reducing memory access latency. Results: Dose calculation was performed for two clinical cases, a Tomotherapy-based prostate cancer treatment and a TrueBeam-based left breast treatment. ARCHER was verified against the DPM code. The statistical uncertainty of the dose to the PTV was less than 1%. Using double-precision, the total wall time of the multithreaded CPU code on an X5650 CPU was 339 seconds for the Tomotherapy case and 131 seconds for the TrueBeam, while on three 5110P MICs it was reduced to 79 and 59 seconds, respectively. The single-precision GPU code on a K40 GPU took 45 seconds for the Tomotherapy dose calculation. Conclusion: We have extended ARCHER, the MIC- and GPU-based Monte Carlo dose engine, to Tomotherapy and TrueBeam dose calculations.
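
    The SIMD vectorization of tight for-loops that the abstract credits for the speedup can be illustrated one level up in NumPy, where replacing a scalar Python loop with a single array expression hands the work to vectorized kernels. This is an analogy for the technique, not ARCHER's MIC/GPU code, and the attenuation-style computation below is a made-up stand-in:

```python
import math
import numpy as np

# Toy per-particle attenuation coefficients (hypothetical data).
mu = np.random.default_rng(1).random(10_000)

def loop_version(mu, x=2.0):
    out = np.empty_like(mu)
    for i in range(mu.size):                 # scalar: one value per iteration
        out[i] = math.exp(-mu[i] * x)
    return out

def vector_version(mu, x=2.0):
    return np.exp(-mu * x)                   # whole array per expression

attenuation = vector_version(mu)
```

Both functions compute the same values; the vectorized form is what maps onto wide (e.g. 512-bit) registers when the same transformation is done in compiled code.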

  20. Relationships between Character Education and School Climate

    Science.gov (United States)

    Karaburk, Hasan

    2017-01-01

    The purpose of this study was to explore the relationships between character education and school climate based on the lived experiences and beliefs of teachers. The research was conducted in a public middle school to explore understandings and beliefs of teachers about character education and its perceived impact on school climate. Social…

  1. [INVITED] Luminescent QR codes for smart labelling and sensing

    Science.gov (United States)

    Ramalho, João F. C. B.; António, L. C. F.; Correia, S. F. H.; Fu, L. S.; Pinho, A. S.; Brites, C. D. S.; Carlos, L. D.; André, P. S.; Ferreira, R. A. S.

    2018-05-01

    QR (Quick Response) codes are two-dimensional barcodes composed of special geometric patterns of black modules on a white square background that can encode different types of information with high density and robustness and correct errors and physical damage, thus keeping the stored information protected. Recently, these codes have gained increased attention as they offer a simple physical tool for quick access to Web sites for advertising and social interaction. Challenges encompass increasing the storage capacity limit, even though QR codes can already store approximately 350 times more information than common barcodes and encode different types of characters (e.g., numeric, alphanumeric, kanji and kana). In this work, we fabricate luminescent QR codes based on a poly(methyl methacrylate) substrate coated with organic-inorganic hybrid materials doped with trivalent terbium (Tb3+) and europium (Eu3+) ions, demonstrating an increase of storage capacity per unit area by a factor of two through colour multiplexing, when compared to conventional QR codes. A novel methodology to decode the multiplexed QR codes is developed, based on a colour separation threshold in which a decision level is calculated through a maximum-likelihood criterion to minimize the error probability of the demultiplexed modules, maximizing the foreseen total storage capacity. Moreover, the thermal dependence of the emission colour coordinates of the Eu3+/Tb3+-based hybrids enables simultaneous QR code colour multiplexing and temperature sensing (reproducibility higher than 93%), opening new fields of application for QR codes as smart labels for sensing.
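
    The colour-multiplexing and threshold-decoding idea can be sketched numerically. The two random 21x21 module patterns, the Gaussian noise model, and the two-channel image are illustrative assumptions; the paper derives the decision level from measured colour distributions rather than assuming symmetric noise:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two independent 21x21 module patterns -- hypothetical stand-ins for the
# Tb3+ (green) and Eu3+ (red) emission layers of a multiplexed QR label.
layer_r = rng.integers(0, 2, (21, 21))
layer_g = rng.integers(0, 2, (21, 21))
image = np.stack([layer_r, layer_g], axis=-1).astype(float)
image += rng.normal(0.0, 0.05, image.shape)      # additive readout noise

# With symmetric Gaussian noise, the maximum-likelihood decision for each
# module reduces to thresholding at the midpoint between the two levels.
decoded_r = (image[..., 0] > 0.5).astype(int)
decoded_g = (image[..., 1] > 0.5).astype(int)
```

Reading two module patterns from one physical code area is what doubles the storage capacity per unit area.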

  2. "It Could Affect You as a Person, Character-Wise": Promoting Character Development and Preventing Sexual Violence at West Point

    Science.gov (United States)

    Arbeit, Miriam R.

    2017-01-01

    The United States Military Academy at West Point develops cadets into "leaders of character" who will become Army officers. This focus on character presents an opportunity for the prevention of sexual violence through an emphasis on military values. Using constructivist grounded theory, this study examined how cadets experience their own…

  3. THE REGULATION OF THE BANKING CONTRACTS IN THE NEW CIVIL CODE

    Directory of Open Access Journals (Sweden)

    George Chiocaru

    2012-11-01

    Full Text Available With the enactment of the New Civil Code, contracts and legal institutions specific to banking activity have been regulated for the first time. The new regulations, even though they brought important solutions to certain problems arising in commercial activity and especially in the banking sector, have also raised new questions, especially regarding their domain of applicability, their imperative or dispositive character, and the possibility for the parties to conclude contracts other than those expressly regulated. Taking into consideration the special character of operations involving the administration of money, as well as the sensitivity, from a legal and especially social perspective, of ownership relations regarding money, we have focused our analysis on the relations resulting from the contracts regulated by the chapter “the bank account and other banking contracts”

  4. The fractal character of radiation defects aggregation in crystals

    International Nuclear Information System (INIS)

    Akylbekov, A.; Akimbekov, E.; Baktybekov, K.; Vasil'eva, I.

    2002-01-01

    In processes of self-organization, which characterize open systems, the source of ordering is non-equilibrium. One example of an ordering system is the radiation-stimulated aggregation of defects in solids. In the present work, an analysis of criteria for the ordering of defect structures in a solid continuously irradiated at low temperature is presented. The method of cellular automata was used to simulate the irradiation, allowing us to imitate the processes of defect formation and recombination. The simulation was carried out on surfaces of up to 1000x1000 units with an initial defect concentration Cn (dose power) of 0.1-1 %. The number of iterations N (irradiation duration) reached 10 6 cycles. Single centers, which act as sources for the formation of aggregates, survive as a result of the probabilistic nature of the formation and recombination of genetic pairs of defects with a strictly fixed recombination radius (the minimum inter-anionic distance). To determine the character of the distribution of same-type defects, the potential of their interaction was calculated as a function of defect type and mutual distance. For a more detailed study of the processes proceeding in cells of certain aggregate sizes, the time dependence of the interaction potential was constructed. It is shown that at the primary stage the potential is negative; it then increases and approaches saturation in the positive region. The minimum of the interaction potential corresponds to a state of physical chaos in the system, and its increase accompanies the formation of same-type defect aggregates. The subsequent transition to saturation and the 'undulating' character of the curves are explained by the formation and destruction of aggregates. The data indicate that these processes occur simultaneously in cells of different sizes, which allows us to assume that radiation defect aggregation has a fractal nature
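
    The cellular-automaton scheme sketched in the abstract — probabilistic defect deposition with annihilation inside a fixed recombination radius — can be illustrated with a deliberately simplified toy model (single point defects rather than genetic pairs; grid size, dose and radius are illustrative, not the study's parameters):

```python
import numpy as np

def irradiate(size=100, r_rec=1, steps=1000, seed=0):
    # toy lattice: 1 marks a surviving defect
    rng = np.random.default_rng(seed)
    grid = np.zeros((size, size), dtype=int)
    for _ in range(steps):
        # deposit one defect at a random site
        x, y = rng.integers(0, size, 2)
        # clip the recombination window to the lattice edges
        x0, x1 = max(0, x - r_rec), min(size, x + r_rec + 1)
        y0, y1 = max(0, y - r_rec), min(size, y + r_rec + 1)
        if grid[x0:x1, y0:y1].any():
            # a defect already lies within the recombination radius:
            # the pair annihilates instead of surviving
            nbr = np.argwhere(grid[x0:x1, y0:y1])[0]
            grid[x0 + nbr[0], y0 + nbr[1]] = 0
        else:
            grid[x, y] = 1
    return grid
```

Because survival is conditional on local isolation, the surviving centers cluster into aggregates whose size distribution can then be analysed for the fractal scaling the study discusses.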

  5. Integrating Character Education In Teaching Speaking For Business Administration Students

    OpenAIRE

    Woro Prastiwi, Chyntia Heru

    2016-01-01

    Globalization, along with the advancement of information and communication technology, has had tremendous effects on students' character. The field of education, as a place of community, has to contribute to developing students' character traits. Integrating character education in the curriculum is the key to qualified education. This research aimed to describe the way to integrate character education in teaching speaking for Business Administration students. The data was obtained from teaching and lea...

  6. The Characteristics of Electromagnetic Fields Induced by Different Type Sources

    Science.gov (United States)

    Di, Q.; Fu, C.; Wang, R.; Xu, C.; An, Z.

    2011-12-01

    The controlled source audio-frequency magnetotelluric (CSAMT) method has played an important role in shallow exploration (less than 1.5 km) in the fields of resources, environment and engineering geology. In order to prospect deeper targets, one has to increase the strength of the source and the offset. However, deploying the heavy, high-power transmitting sources required for deeper prospecting is nearly impossible in mountainous areas. So an EM method using a fixed large-power source, such as a long bipole current source, two perpendicular "L"-shaped long bipole current sources, or a large-radius circular current source, is beginning to take shape. In order to increase the strength of the source, the length of the transmitting bipole in one direction or in perpendicular directions has to be much larger, for example L = 100 km, or the radius of the circular current source has to be much larger. The effective source strengths are IL and IL²/4π for the long bipole source and the circular current source with the same wire length L, respectively. Considering only the effectiveness of the source, the strength of the circular current source is larger than that of the long bipole source if L is large enough. However, the strength of the electromagnetic signal does not depend solely on the transmitting source: the effect of the ionosphere on the electromagnetic (EM) field should be considered when observations are carried out very far (several thousand kilometers) from the source for the long bipole source or the large-radius circular current source. We first calculate the electromagnetic fields for the traditional controlled source (CSEM) configuration using the integral equation (IE) code developed by our research group for a three-layer earth-ionosphere model which consists of ionosphere, atmosphere and earth media. The modeling results agree well with the half-space analytical results because the effect of the ionosphere for this small-scale source is negligible, which means the integral equation

  7. ECONOMIC ETHICS: APPLIED AND PROFESSIONAL CHARACTER

    Directory of Open Access Journals (Sweden)

    Ella Gordova

    2012-01-01

    Full Text Available In the given article, economic ethics is considered as the set of norms of behavior of the businessman: the requirements that a cultured society places on his style of work, on the character of communication between participants in business, and on their social image. The conclusion is drawn that economic ethics has an applied character in relation to theoretical, general-normative ethics and hence represents a branch of applied ethics. On the other hand, its specific normative content characterizes economic ethics as a professional ethics.

  8. Coding conventions and principles for a National Land-Change Modeling Framework

    Science.gov (United States)

    Donato, David I.

    2017-07-14

    This report establishes specific rules for writing computer source code for use with the National Land-Change Modeling Framework (NLCMF). These specific rules consist of conventions and principles for writing code primarily in the C and C++ programming languages. Collectively, these coding conventions and coding principles create an NLCMF programming style. In addition to detailed naming conventions, this report provides general coding conventions and principles intended to facilitate the development of high-performance software implemented with code that is extensible, flexible, and interoperable. Conventions for developing modular code are explained in general terms and also enabled and demonstrated through the appended templates for C++ base source-code and header files. The NLCMF limited-extern approach to module structure, code inclusion, and cross-module access to data is both explained in the text and then illustrated through the module templates. Advice on the use of global variables is provided.

  9. Diradical character of some fluoranthenes

    Directory of Open Access Journals (Sweden)

    SVETLANA MARKOVIĆ

    2010-09-01

    Full Text Available It is shown that some Kekuléan fluoranthenes are diradicals and that their ground state is a triplet. In the energetically less favorable singlet state, these hydrocarbons also exhibit pronounced diradical character. The diradical character y of the compounds under investigation was estimated using the unrestricted symmetry-broken (yPUHF) and complete active space (yNOON) methods. It was found that the yPUHF values better reproduce the diradical character of the investigated hydrocarbons. It was shown that the singly occupied molecular orbital (SOMO) and SOMO-1 of a diradical structure occupy different parts of space with a small shared region, resulting in a spin density distribution over the entire molecule. The spatial diradical distribution in the singlet diradical structures was examined by inspecting the HOMOs and LUMOs for α and β spin electrons. It was shown that the α-HOMO and the β-LUMO (as well as the β-HOMO and the α-LUMO) occupy practically the same part of space. In this way, there are no unpaired electrons in a singlet diradical structure, yet two of them occupy different parts of space, thus allowing the π-electrons to delocalize.

  10. Development of a coupling code for PWR reactor cavity radiation streaming calculation

    International Nuclear Information System (INIS)

    Zheng, Z.; Wu, H.; Cao, L.; Zheng, Y.; Zhang, H.; Wang, M.

    2012-01-01

    PWR reactor cavity radiation streaming is important for the safety of personnel and equipment; thus calculations have to be performed to evaluate the neutron flux distribution around the reactor. For this calculation, deterministic codes have difficulties in fine geometrical modeling and need huge computer resources, while Monte Carlo codes require very long sampling times to obtain results with acceptable precision. Therefore, a coupling method has been developed to eliminate these two problems of each code. In this study, we develop a coupling code named DORT2MCNP to link the Sn code DORT and the Monte Carlo code MCNP. DORT2MCNP is used to produce a combined surface source containing top, bottom and side surfaces simultaneously. Because the SDEF card is unsuitable for the combined surface source, we modify the SOURCE subroutine of MCNP and recompile MCNP for this application. Numerical results demonstrate the correctness of the coupling code DORT2MCNP and show reasonable agreement between the coupling method and the other two codes (DORT and MCNP). (authors)

  11. A Framework for Reverse Engineering Large C++ Code Bases

    NARCIS (Netherlands)

    Telea, Alexandru; Byelas, Heorhiy; Voinea, Lucian

    2009-01-01

    When assessing the quality and maintainability of large C++ code bases, tools are needed for extracting several facts from the source code, such as: architecture, structure, code smells, and quality metrics. Moreover, these facts should be presented in such ways so that one can correlate them and

  12. A Framework for Reverse Engineering Large C++ Code Bases

    NARCIS (Netherlands)

    Telea, Alexandru; Byelas, Heorhiy; Voinea, Lucian

    2008-01-01

    When assessing the quality and maintainability of large C++ code bases, tools are needed for extracting several facts from the source code, such as: architecture, structure, code smells, and quality metrics. Moreover, these facts should be presented in such ways so that one can correlate them and

  13. A novel handwritten character recognition system using gradient ...

    Indian Academy of Sciences (India)

    The issues faced by handwritten character recognition systems are the similarity … statistical/structural features have also been successfully used in character … The coordinates (xc, yc) of the centroid are calculated by equations (4) and (5). xc = …

  14. Peter as character in the Gospel of Matthew: complexity and inversion

    Directory of Open Access Journals (Sweden)

    João Leonel

    2014-03-01

    Full Text Available This article focuses on the apostle Peter as a character in the Gospel of Matthew.  It aims at identifying the nuances and changes of the character Peter in the Gospel. For this purpose, I take as a starting point that the gospel belongs to the literary genre of ancient Greco-Roman Biography, which presents Jesus Christ as the protagonist. The other characters are developed in relationship with him. The same is true with the Apostle Peter. The article unfolds from narrative theory, in particular the categorization of characters. I categorize, based on Erich Auerbach and Robert Alter, the features of biblical characters, developing comparisons with theories of the character in the modern novel. The analysis of the main texts from the Gospel of Matthew that portray the character Peter leads to the conclusion that its main features are complexity and inversion. They produce an overview of the involution of the character in the narrative of the Gospel of Matthew.

  15. OFF, Open source Finite volume Fluid dynamics code: A free, high-order solver based on parallel, modular, object-oriented Fortran API

    Science.gov (United States)

    Zaghi, S.

    2014-07-01

    OFF, an open source (free software) code for performing fluid dynamics simulations, is presented. The aim of OFF is to solve, numerically, the unsteady (and steady) compressible Navier-Stokes equations of fluid dynamics by means of finite volume techniques: the research background is mainly focused on high-order (WENO) schemes for multi-fluids, multi-phase flows over complex geometries. To this purpose a highly modular, object-oriented application program interface (API) has been developed. In particular, the concepts of data encapsulation and inheritance available within Fortran language (from standard 2003) have been stressed in order to represent each fluid dynamics "entity" (e.g. the conservative variables of a finite volume, its geometry, etc…) by a single object so that a large variety of computational libraries can be easily (and efficiently) developed upon these objects. The main features of OFF can be summarized as follows: Programming Language: OFF is written in standard (compliant) Fortran 2003; its design is highly modular in order to enhance simplicity of use and maintenance without compromising efficiency. Parallel Frameworks Supported: the development of OFF has also been targeted at maximizing computational efficiency; the code is designed to run on shared-memory multi-core workstations and distributed-memory clusters of shared-memory nodes (supercomputers); the code's parallelization is based on the Open Multiprocessing (OpenMP) and Message Passing Interface (MPI) paradigms. Usability, Maintenance and Enhancement: in order to improve the usability, maintenance and enhancement of the code, the documentation has also been carefully taken into account; the documentation is built upon comprehensive comments placed directly into the source files (no external documentation files needed); these comments are parsed by means of the doxygen free software, producing high-quality html and latex documentation pages; the distributed versioning system referred as git

  16. The audience eats more if a movie character keeps eating: An unconscious mechanism for media influence on eating behaviors.

    Science.gov (United States)

    Zhou, Shuo; Shapiro, Michael A; Wansink, Brian

    2017-01-01

    Media's presentation of eating is an important source of influence on viewers' eating goals and behaviors. Drawing on recent research indicating that whether a story character continues to pursue a goal or completes a goal can unconsciously influence an audience member's goals, a scene from a popular movie comedy was manipulated to end with a character continuing to eat (goal ongoing) or having completed eating (goal completed). Participants (N = 147) were randomly assigned to a goal status condition. As a reward, after viewing the movie clip viewers were offered two types of snacks, ChexMix and M&M's, in various portion sizes. Viewers ate more food after watching the characters continue to eat than after watching the characters complete eating, but only among those manipulated to identify with a character. Viewers were more likely to choose savory food after viewing the ongoing eating scenes, but sweet, dessert-like food after viewing the completed eating scenes. The results extend the notion of media influence on unconscious goal contagion and satiation to movie eating, and raise the possibility that completing a goal can activate a logically subsequent goal. Implications for understanding media influence on eating and other health behaviors are discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Automated Degradation Diagnosis in Character Recognition System Subject to Camera Vibration

    Directory of Open Access Journals (Sweden)

    Chunmei Liu

    2014-01-01

    Full Text Available Degradation diagnosis plays an important role in degraded character processing, as it can indicate the recognition difficulty of a given degraded character. In this paper, we present a framework for an automated degraded character recognition system based on a statistical syntactic approach using 3D primitive symbols, integrated with degradation diagnosis to provide accurate and reliable recognition results. Our contribution is to design the framework to build character recognition submodels corresponding to degradation caused by camera vibration or defocus. In each character recognition submodel, a statistical syntactic approach using 3D primitive symbols is proposed to improve degraded character recognition performance. In the experiments, we show attractive results on the degraded character dataset, highlighting the system's efficiency and the recognition performance of the statistical syntactic approach using 3D primitive symbols.

  18. Distributional Similarity for Chinese: Exploiting Characters and Radicals

    Directory of Open Access Journals (Sweden)

    Peng Jin

    2012-01-01

    Full Text Available Distributional Similarity has attracted considerable attention in the field of natural language processing as an automatic means of countering the ubiquitous problem of sparse data. As a logographic language, Chinese words consist of characters and each of them is composed of one or more radicals. The meanings of characters are usually highly related to the words which contain them. Likewise, radicals often make a predictable contribution to the meaning of a character: characters that have the same components tend to have similar or related meanings. In this paper, we utilize these properties of the Chinese language to improve Chinese word similarity computation. Given a content word, we first extract similar words based on a large corpus and a similarity score for ranking. This rank is then adjusted according to the characters and components shared between the similar word and the target word. Experiments on two gold standard datasets show that the adjusted rank is superior and closer to human judgments than the original rank. In addition to quantitative evaluation, we examine the reasons behind errors drawing on linguistic phenomena for our explanations.
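
    The rank adjustment described above — boosting distributional neighbours that share characters with the target word — might be sketched as follows (the overlap score and the mixing weight alpha are hypothetical illustrations, not the paper's actual formula):

```python
def shared_char_bonus(target, candidate):
    # fraction of the candidate's distinct characters also found in the target
    cand_chars = set(candidate)
    return len(cand_chars & set(target)) / len(cand_chars) if cand_chars else 0.0

def rerank(target, scored, alpha=0.5):
    # scored: [(candidate, distributional_similarity), ...];
    # alpha weights the character-overlap bonus against the corpus-based score
    return sorted(scored,
                  key=lambda cs: cs[1] + alpha * shared_char_bonus(target, cs[0]),
                  reverse=True)
```

For example, for the target 火车 ("train"), the candidate 汽车 ("car") shares the character 车 and is promoted above a candidate with a slightly higher distributional score but no shared characters.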

  19. Application of RASCAL code for multiunit accident in domestic nuclear sites

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sang Hyun; Jeong, Seung Young [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2014-10-15

    All domestic nuclear power plant sites are multiunit sites (at least 5-6 reactors are operating), so this capability has to be quickly secured by the nuclear licensee and the institutes responsible for nuclear emergency response. In this study, the source term and offsite dose from a multiunit event were assessed using the RASCAL computer code. An emergency exercise scenario was chosen to verify the applicability of the code to a domestic nuclear site accident. Employing tools and new features of the code, such as merging two or more individual source terms and source term estimation for long-term accident progression, release estimates and dose projections were performed from the main parameters and information in the scenario. Radiological releases and offsite doses from the multiunit accident were calculated using RASCAL. A scenario in which three reactors were damaged coincidently by a great natural disaster was considered. Surrogate plants were chosen for the code calculation. Source terms of each damaged unit were calculated individually first, and then the total source term and integrated offsite dose assessment data were acquired using the source term merge function in the code. A comparison between the LTSBO and LOCA source term estimate options was also performed. Differences in offsite doses were caused by release characteristics. With the LTSBO option, iodine releases were much higher than with LOCA. The LTSBO release was also delayed and its duration was longer than that of LOCA. This option would be useful for accidents which progress on a much longer time frame than LOCA. RASCAL can be a useful tool for radiological consequence assessment in domestic nuclear site accidents.

  20. LFSC - Linac Feedback Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Ivanov, Valentin; /Fermilab

    2008-05-01

    The computer program LFSC (Linac Feedback Simulation Code) is a numerical tool for simulating beam-based feedback in high-performance linacs. The code LFSC is based on an earlier version developed by a collective of authors at SLAC (L. Hendrickson, R. McEwen, T. Himel, H. Shoaee, S. Shah, P. Emma, P. Schultz) during 1990-2005. That code was successively used in simulations for the SLC, TESLA, CLIC and NLC projects. It can simulate both pulse-to-pulse feedback, on timescales corresponding to 5-100 Hz, and slower feedbacks operating in the 0.1-1 Hz range in the Main Linac and Beam Delivery System. The code LFSC runs under Matlab on the MS Windows operating system. It contains about 30,000 lines of source code in more than 260 subroutines. The code uses LIAR ('Linear Accelerator Research code') for particle tracking under ground motion and technical noise perturbations. It uses the Guinea Pig code to simulate the luminosity performance. The set of input files includes the lattice description (XSIF format) and plain text files with numerical parameters, wake fields, ground motion data, etc. The Matlab environment provides a flexible system for graphical output.

  1. The OpenMOC method of characteristics neutral particle transport code

    International Nuclear Information System (INIS)

    Boyd, William; Shaner, Samuel; Li, Lulu; Forget, Benoit; Smith, Kord

    2014-01-01

    Highlights: • An open source method of characteristics neutron transport code has been developed. • OpenMOC shows nearly perfect scaling on CPUs and 30× speedup on GPUs. • Nonlinear acceleration techniques demonstrate a 40× reduction in source iterations. • OpenMOC uses modern software design principles within a C++ and Python framework. • Validation with respect to the C5G7 and LRA benchmarks is presented. - Abstract: The method of characteristics (MOC) is a numerical integration technique for partial differential equations, and has seen widespread use for reactor physics lattice calculations. The exponential growth in computing power has finally brought high-fidelity full-core MOC calculations within reach. The OpenMOC code is being developed at the Massachusetts Institute of Technology to investigate algorithmic acceleration techniques and parallel algorithms for MOC. OpenMOC is a free, open source code written in modern software languages such as C/C++ and CUDA, with an emphasis on extensible design principles for code developers and an easy-to-use Python interface for code users. The present work describes the OpenMOC code and illustrates its ability to model large problems accurately and efficiently
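
    The core update that MOC codes of this kind apply along each characteristic track segment (shown here in its generic textbook form, not as OpenMOC's actual source) is the analytic solution of the transport equation with a flat source in each flat-source region:

```python
import math

def moc_segment(psi_in, sigma_t, q, length):
    # analytic solution of the characteristic transport equation along one
    # track segment of given length, total cross section sigma_t, and flat
    # angular source q
    tau = sigma_t * length          # optical thickness of the segment
    att = math.exp(-tau)
    psi_out = psi_in * att + (q / sigma_t) * (1.0 - att)
    # segment-average angular flux, used to tally the region scalar flux
    psi_avg = q / sigma_t + (psi_in - psi_out) / tau
    return psi_out, psi_avg
```

A transport sweep repeats this update along every segment of every track, accumulating psi_avg into region scalar fluxes, and source iteration then updates q until convergence.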

  2. Noncoherent Spectral Optical CDMA System Using 1D Active Weight Two-Code Keying Codes

    Directory of Open Access Journals (Sweden)

    Bih-Chyun Yeh

    2016-01-01

    Full Text Available We propose a new family of one-dimensional (1D) active weight two-code keying (TCK) codes for spectral amplitude coding (SAC) optical code division multiple access (OCDMA) networks. We use encoding and decoding transfer functions to operate the 1D active weight TCK. The proposed structure includes an optical line terminal (OLT) and optical network units (ONUs) to produce the encoding and decoding codes of the proposed OLT and ONUs, respectively. The proposed ONU uses the modified cross-correlation to remove interference from other simultaneous users, that is, the multiuser interference (MUI). When the phase-induced intensity noise (PIIN) is the dominant noise, the modified cross-correlation suppresses the PIIN. In the numerical results, we find that the bit error rate (BER) for the proposed system using the 1D active weight TCK codes outperforms that of two other systems using the 1D M-Seq codes and 1D balanced incomplete block design (BIBD) codes. The effective source power for the proposed system can reach −10 dBm, which is lower than that of the other systems.
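
    Cross-correlation detection in SAC-OCDMA can be illustrated generically: for code families with a fixed in-phase cross-correlation λ between any two users' codes, weighting the complementary-branch correlation by λ/(w−λ) cancels the contribution of every interfering user. This is a textbook balanced-detection sketch under that assumed code property, not the specific 1D active weight TCK decoder of the paper.

```python
import numpy as np

def balanced_decode(received, code, weight, lam):
    # received: superposed spectral chip intensities from all active users
    # code: the desired user's 0/1 spreading sequence (weight ones)
    # lam: fixed in-phase cross-correlation of the assumed code family
    code = np.asarray(code)
    direct = received @ code        # correlation with the code
    comp = received @ (1 - code)    # correlation with its complement
    return direct - comp * lam / (weight - lam)
```

For codes [1,1,0,0] and [1,0,1,0] (w = 2, λ = 1), decoding user 1 from the superposition of both users returns the weight 2 of the desired user alone: the interferer contributes λ to the direct branch and w−λ to the complementary branch, and the two cancel exactly.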

  3. Playing MMORPGs: connections between addiction and identifying with a character.

    Science.gov (United States)

    Smahel, David; Blinka, Lukas; Ledabyl, Ondrej

    2008-12-01

    Addiction to online role-playing games is one of the most discussed aspects of recent cyberpsychology, mainly for its potentially negative impact on the social lives of young people. In our study, we focus on some aspects of youth and adolescent addiction to MMORPGs. We investigated connections between players and their game characters and examined if, and in what ways, players' relationships to their characters affected potential addiction. Players' attitudes toward their characters seem to play a specific role, since players who tend to be addicted view their characters as superior and more often wish to be like their characters in their real lives. Our research also confirmed that younger players are generally more prone to addiction.

  4. Genus-two characters of the Ising model

    International Nuclear Information System (INIS)

    Choi, J.H.; Koh, I.G.

    1989-01-01

    As a first step in studying conformal theories on a higher-genus Riemann surface, we construct genus-two characters of the Ising model from their behavior in zero- and nonzero-homology pinching limits, the Goddard-Kent-Olive coset-space construction, and the branching coefficients in the level-two A1(1) Kac-Moody characters on the higher-genus Riemann surface

  5. An investigation of player to player character identification via personal pronouns

    DEFF Research Database (Denmark)

    Hichens, Michael; Drachen, Anders; Richards, Deborah

    2012-01-01

    The player character is an important feature of many games, as it is through the character that the player interacts with the game world. There has been considerable interest in the relationship between the player and the player character. Much of this work has examined the identification of players …, third) as an indication of the relationship between player and character. Results indicate that the presence of story and information about the player character had no effect on identification with the player character. However, characteristics of the players, particularly gender and general experience … in playing video games, did have a statistically significant effect, indicating that different levels of identification are more dependent on the player than on the game. This indicates that players are not a homogeneous group with respect to player character identification and is an important consideration …

  6. Phonetic radicals, not phonological coding systems, support orthographic learning via self-teaching in Chinese.

    Science.gov (United States)

    Li, Luan; Wang, Hua-Chen; Castles, Anne; Hsieh, Miao-Ling; Marinus, Eva

    2018-07-01

    According to the self-teaching hypothesis (Share, 1995), phonological decoding is fundamental to acquiring orthographic representations of novel written words. However, phonological decoding is not straightforward in non-alphabetic scripts such as Chinese, where words are presented as characters. Here, we present the first study investigating the role of phonological decoding in orthographic learning in Chinese. We examined two possible types of phonological decoding: the use of phonetic radicals, an internal phonological aid, and the use of Zhuyin, an external phonological coding system. Seventy-three Grade 2 children were taught the pronunciations and meanings of twelve novel compound characters over four days. They were then exposed to the written characters in short stories and were assessed on their reading accuracy and on their subsequent orthographic learning via orthographic choice and spelling tasks. The novel characters were assigned three different types of pronunciation in relation to their phonetic radicals: (1) a pronunciation identical to the phonetic radical in isolation; (2) a common alternative pronunciation associated with the phonetic radical when it appears in other characters; and (3) a pronunciation unrelated to the phonetic radical. The presence of Zhuyin was also manipulated. The children read the novel characters more accurately when phonological cues from the phonetic radicals were available and in the presence of Zhuyin. However, only the phonetic radicals facilitated orthographic learning. The findings provide the first empirical evidence of orthographic learning via self-teaching in Chinese and reveal how phonological decoding functions to support learning in non-alphabetic writing systems. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. Bring out your codes! Bring out your codes! (Increasing Software Visibility and Re-use)

    Science.gov (United States)

    Allen, A.; Berriman, B.; Brunner, R.; Burger, D.; DuPrie, K.; Hanisch, R. J.; Mann, R.; Mink, J.; Sandin, C.; Shortridge, K.; Teuben, P.

    2013-10-01

    Progress is being made in code discoverability and preservation, but as discussed at ADASS XXI, many codes still remain hidden from public view. With the Astrophysics Source Code Library (ASCL) now indexed by the SAO/NASA Astrophysics Data System (ADS), the introduction of a new journal, Astronomy & Computing, focused on astrophysics software, and the increasing success of education efforts such as Software Carpentry and SciCoder, the community has the opportunity to set a higher standard for its science by encouraging the release of software for examination and possible reuse. We assembled representatives of the community to present issues inhibiting code release and sought suggestions for tackling these factors. The session began with brief statements by panelists; the floor was then opened for discussion and ideas. Comments covered a diverse range of related topics and points of view, with apparent support for the propositions that algorithms should be readily available, code used to produce published scientific results should be made available, and there should be discovery mechanisms to allow these to be found easily. With increased use of resources such as GitHub (for code availability), ASCL (for code discovery), and a stated strong preference from the new journal Astronomy & Computing for code release, we expect to see additional progress over the next few years.

  8. Applications guide to the MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1985-08-01

    A practical guide for the implementation of the MORSE-CG Monte Carlo radiation transport computer code system is presented. The various versions of the MORSE code are compared and contrasted, and the many references dealing explicitly with the MORSE-CG code are reviewed. The treatment of angular scattering is discussed, and procedures for obtaining increased differentiality of results in terms of reaction types and nuclides from a multigroup Monte Carlo code are explained in terms of cross-section and geometry data manipulation. Examples of standard cross-section data input and output are shown. Many other features of the code system are also reviewed, including (1) the concept of primary and secondary particles, (2) fission neutron generation, (3) albedo data capability, (4) DOMINO coupling, (5) history file use for post-processing of results, (6) adjoint mode operation, (7) variance reduction, and (8) input/output. In addition, examples of the combinatorial geometry are given, and the new array-of-arrays geometry feature (MARS) and its three-dimensional plotting code (JUNEBUG) are presented. Realistic examples of user routines for source, estimation, path-length stretching, and cross-section data manipulation are given. A detailed explanation of the coupling between the random walk and the estimation procedure is given in terms of both code parameters and physical analogies. The operation of the code in the adjoint mode is covered extensively. The basic concepts of adjoint theory and dimensionality are discussed, and examples of adjoint source and estimator user routines are given for all common situations. Adjoint source normalization is explained, a few sample problems are given, and the concept of obtaining forward differential results from adjoint calculations is covered. Finally, the documentation of the standard MORSE-CG sample problem package is reviewed, and ongoing and future work is discussed.
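
    The random-walk sampling at the heart of any analog Monte Carlo transport code of this kind can be sketched in a few lines. The example below is an illustrative stand-alone Python sketch, not MORSE-CG code; the function name and parameters are invented for the illustration. It estimates transmission through a purely absorbing slab by sampling exponential free paths, a case where the analytic answer exp(-mu_t * d) is available for comparison.

```python
import math
import random

def transmission_probability(mu_t, thickness, n_histories=100_000, seed=1):
    """Estimate the probability that a particle crosses a purely absorbing
    slab by analog Monte Carlo: sample exponential free-flight distances
    and count the histories that escape without colliding."""
    rng = random.Random(seed)
    escaped = 0
    for _ in range(n_histories):
        # Distance to first collision, sampled from p(s) = mu_t * exp(-mu_t * s)
        s = -math.log(1.0 - rng.random()) / mu_t
        if s > thickness:
            escaped += 1
    return escaped / n_histories

# For a purely absorbing slab the analytic answer is exp(-mu_t * d)
est = transmission_probability(mu_t=1.0, thickness=2.0)
```

    With 100,000 histories the statistical uncertainty is about 0.1%, so the estimate agrees closely with exp(-2) ≈ 0.135; variance-reduction techniques such as the path-length stretching mentioned above exist precisely to cut this uncertainty for deep-penetration problems.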

  9. Affective affordances: Improving interface character engagement through interaction.

    NARCIS (Netherlands)

    Van Vugt, H.C.; Hoorn, J.F.; Konijn, E.A.; De Bie Dimitriadou, A.

    2006-01-01

    The nature of humans interacting with interface characters (e.g. embodied agents) is not well understood. The I-PEFiC model provides an integrative perspective on human-character interaction, assuming that the processes of engagement and user interaction exchange information in explaining user

  11. The Life Mission Theory VI. A Theory for the Human Character: Healing with Holistic Medicine Through Recovery of Character and Purpose of Life

    Directory of Open Access Journals (Sweden)

    Søren Ventegodt

    2004-01-01

    Full Text Available The human character can be understood as an extension of the life mission or purpose of life, and explained as the primary tool of a person to impact others and express the purpose of life. Repression of the human character makes it impossible for a person to realize his personal mission in life and, therefore, is one of the primary causes of self-repression, resulting in poor quality of life, health, and ability. From Hippocrates to Hahnemann, repression of physical, mental, and spiritual character can be seen as the prime cause of disease, while recovery of character has been the primary intention of the treatment. In this paper, human character is explained as an intersubjective aspect of consciousness with the ability to influence the consciousness of another person directly. To understand consciousness, we reintroduce the seven-ray theory of consciousness, explaining consciousness in accordance with a fractal ontology with a bifurcation number of seven (the numbers four to ten work almost as well). A case report on a female, aged 35 years, with severe hormonal disturbances, diagnosed with extremely early menopause, is presented and treated according to the theory of holistic existential healing (the holistic process theory of healing). After recovery of her character and purpose of life, her quality of life dramatically improved and her hormonal status normalized. We believe that the recovery of human character and purpose of life was the central intention of Hippocrates and thus the original essence of western medicine. Interestingly, there are strong parallels to the peyote medicine of the Native Americans, the African Sangomas, the Australian Aboriginal healers, and the old Nordic medicine. The recovery of human character was also the intention of Hahnemann's homeopathy. We believe that we are at the core of consciousness-based medicine, as recovery of purpose of life and human character has been practiced as medicine in most human cultures.

  12. Systematic review of character development and childhood chronic illness.

    Science.gov (United States)

    Maslow, Gary R; Hill, Sherika N

    2016-05-08

    To review empirical evidence on character development among youth with chronic illnesses. A systematic literature review was conducted using PubMed and PSYCHINFO from inception until November 2013 to find quantitative studies that measured character strengths among youth with chronic illnesses. Inclusion criteria were limited to English-language studies examining constructs of character development among adolescents or young adults aged 13-24 years with a childhood-onset chronic medical condition. A librarian at Duke University Medical Center Library assisted with the development of the MeSH search term. Two researchers independently reviewed relevant titles (n = 549), then abstracts (n = 45), and finally manuscripts (n = 3). There is a lack of empirical research on character development and childhood-onset chronic medical conditions. Three studies were identified that used different measures of character based on moral themes. One study examined moral reasoning among deaf adolescents using Kohlberg's Moral Judgement Instrument; another investigated moral values of adolescent cancer survivors with the Values In Action Classification of Strengths. A third study evaluated moral behavior among young adult survivors of burn injury utilizing the Tennessee Self-Concept, 2nd edition. The studies observed that youth with chronic conditions reasoned at less advanced stages and had a lower moral self-concept compared to referent populations, but that they did not differ on character virtues and strengths when matched with healthy peers for age, sex, and race/ethnicity. Yet, generalizations could not be drawn regarding character development of youth with chronic medical conditions because the studies were too divergent from each other and biased by study design limitations. Future empirical studies should learn from the strengths and weaknesses of the existing literature on character development among youth with chronic medical conditions.

  14. The Importance of Character Development: An Interview with Ron Kinnamon.

    Science.gov (United States)

    Kinnamon, Ron

    2003-01-01

    Building good character in today's youth is an adult issue because children learn values from adults. Adults must demonstrate the core values: trustworthiness, respect, responsibility, fairness, caring, and citizenship. Camps have developed expertise in character development and can provide leadership in the community in character education.…

  15. Draft decree on the licensing and declaration system for nuclear activities and their control, bearing various modifications of the public health code and the labour code

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    This decree concerns the control of high-level sealed radioactive sources and orphan sources. Its objectives are to simplify administration, in particular the licensing and declaration system for radiation sources; to reinforce the control measures provided for by the public health code and the labour code; and to clarify and complement several existing provisions. (N.C.)

  16. A Unique Perspective on Data Coding and Decoding

    Directory of Open Access Journals (Sweden)

    Wen-Yan Wang

    2010-12-01

    Full Text Available The concept of a loss-less data compression coding method is proposed, and a detailed description of each of its steps follows. Using the Calgary Corpus and Wikipedia data as the experimental samples and comparing with existing algorithms such as PAQ and PPMstr, the new coding method could not only compress the source data, but also further re-compress the data produced by the other compression algorithms. The final files are smaller, and compared with the original compression ratio, at least a further 1% of redundancy could be eliminated. The new method is simple and easy to realize. Its theoretical foundation is currently under study. The corresponding Matlab source code is provided in the Appendix.
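
    The notions of compression ratio and residual redundancy discussed above can be illustrated with a standard library compressor. The snippet below is a hedged sketch using Python's zlib rather than the paper's method or PAQ/PPMstr; the sample text and function name are invented for the illustration. It measures a compression ratio and shows what typically happens when an already-compressed stream is compressed again with the same general-purpose algorithm.

```python
import zlib

def compression_ratio(data: bytes, level: int = 9) -> float:
    """Ratio of compressed size to original size (smaller is better)."""
    return len(zlib.compress(data, level)) / len(data)

# Highly repetitive sample: compresses very well on the first pass
sample = b"the quick brown fox jumps over the lazy dog. " * 200

first = zlib.compress(sample, 9)
second = zlib.compress(first, 9)   # re-compressing the compressed stream

ratio_once = compression_ratio(sample)     # well below 1.0
ratio_twice = len(second) / len(first)     # near (or above) 1.0
```

    A general-purpose compressor gains almost nothing on its own output, which is what makes the paper's claim, that its method removes at least a further 1% of redundancy from the output of other compressors, a notable result.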

  17. Brain Activity while Reading Sentences with Kanji Characters Expressing Emotions

    Science.gov (United States)

    Yuasa, Masahide; Saito, Keiichi; Mukawa, Naoki

    In this paper, we describe the brain activity associated with kanji characters expressing emotion, which are placed at the end of a sentence. Japanese people use a special kanji character in brackets at the end of sentences in text messages such as those sent through e-mail and messenger tools. Such a kanji character plays a role in expressing the sender's emotion (such as fun, laughter, sadness, or tears), like an emoticon. It is a very simple and effective way to convey the sender's emotions and thoughts to the receiver. In this research, we investigate the effects of emotional kanji characters using an fMRI study. The experimental results show that both the right and left inferior frontal gyrus, which have been implicated in verbal and nonverbal information processing, were activated. We found that a sentence with an emotional kanji character is processed as both verbal and nonverbal information, and that emotional kanji characters enrich communication between the sender and the receiver.

  18. Stroop phenomena in the Japanese language: the case of ideographic characters (kanji) and syllabic characters (kana).

    Science.gov (United States)

    Morikawa, Y

    1981-08-01

    Utilizing a unique feature of the Japanese language--that besides two syllabic orthographies, which have identical pronunciations, words with the same pronunciation may also be written in an orthography composed of ideographic characters--we have conducted an investigation of Stroop phenomena. The fact that the pronunciations of the three Japanese orthographies are identical means that, if there are any differences between them in the Stroop phenomena observed, we can place the locus of this interference effect in the perceptual process. Five color names were written in ideographic characters (kanji) and the two syllabic orthographies (hiragana and katakana). Color-congruent cards and incongruent cards were utilized in a color-naming task and a word-reading task. Mean required times for the color-naming and word-reading conditions were compared with those for control conditions. Stroop phenomena were observed in both ideographic and syllabic orthographies. Significant differences in mean required times were observed between the ideographic and syllabic orthographies but not between the two syllabic orthographies. Interference in comparisons of Japanese orthographies with color-patch control conditions was much smaller than in the case of Stroop's (1935) experiment. A "Reverse Stroop Phenomenon" was observed only in the case of kanji on incongruent cards in the word-reading condition. The results support the hypothesis that both ideographic characters (in this case, kanji) and colors are processed in a parallel fashion in the non-dominant right cerebral hemisphere, while syllabic or phonetic characters are processed in the dominant left cerebral hemisphere.

  19. A New Experiment on Bengali Character Recognition

    Science.gov (United States)

    Barman, Sumana; Bhattacharyya, Debnath; Jeon, Seung-Whan; Kim, Tai-Hoon; Kim, Haeng-Kon

    This paper presents a method that uses a view-based approach in a Bangla Optical Character Recognition (OCR) system, providing a reduced data set to the ANN classification engine rather than relying on traditional OCR methods. It describes how Bangla characters are processed, trained, and then recognized with the use of a backpropagation artificial neural network. This is the first published account of using a segmentation-free optical character recognition system for Bangla with a view-based approach. The methodology presented here assumes that the OCR pre-processor has presented the input images to the classification engine described here. The size and the font face used to render the characters are also significant in both training and classification. The images are first converted into greyscale and then to binary images; these images are then scaled to fit a pre-determined area with a fixed but significant number of pixels. The feature vectors are then formed by extracting the characteristic points, which in this case is simply a series of 0s and 1s of fixed length. Finally, an artificial neural network is chosen for the training and classification process.
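
    The preprocessing pipeline described above (greyscale to binary, rescaling to a fixed area, flattening to a fixed-length series of 0s and 1s) can be sketched as follows. This is an illustrative reconstruction in plain Python, not the authors' code; the function name, grid size, threshold, and nearest-neighbour sampling are assumptions made for the sketch.

```python
def to_feature_vector(image, size=16, threshold=128):
    """Convert a greyscale image (list of equal-length rows of 0-255 ints)
    into a fixed-length binary feature vector: binarise against a
    threshold, rescale to a size x size grid, and flatten to 0s and 1s."""
    rows, cols = len(image), len(image[0])
    vec = []
    for r in range(size):
        for c in range(size):
            # Nearest-neighbour sampling from the source image
            src_r = min(rows - 1, r * rows // size)
            src_c = min(cols - 1, c * cols // size)
            vec.append(1 if image[src_r][src_c] < threshold else 0)
    return vec

# A toy 4x4 "character": a dark diagonal stroke on a light background
glyph = [[0 if r == c else 255 for c in range(4)] for r in range(4)]
features = to_feature_vector(glyph, size=4)
```

    The resulting fixed-length vector is exactly the kind of reduced input a backpropagation network expects: one input unit per grid cell, regardless of the original image's dimensions.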

  20. Animal regeneration: ancestral character or evolutionary novelty?

    Science.gov (United States)

    Slack, Jonathan Mw

    2017-09-01

    An old question about regeneration is whether it is an ancestral character which is a general property of living matter, or whether it represents a set of specific adaptations to the different circumstances faced by different types of animal. In this review, some recent results on regeneration are assessed to see if they can throw any new light on this question. Evidence in favour of an ancestral character comes from the role of Wnt and bone morphogenetic protein signalling in controlling the pattern of whole-body regeneration in acoels, which are a basal group of bilaterian animals. On the other hand, there is some evidence for adaptive acquisition or maintenance of the regeneration of appendages, based on the occurrence of severe non-lethal predation, the existence of some novel genes in regenerating organisms, and differences at the molecular level between apparently similar forms of regeneration. It is tentatively concluded that whole-body regeneration is an ancestral character, although it has been lost from most animal lineages. Appendage regeneration is more likely to represent a derived character resulting from many specific adaptations. © 2017 The Author.