WorldWideScience

Sample records for underlying process codes

  1. Kombucha brewing under the Food and Drug Administration model Food Code: risk analysis and processing guidance.

    Science.gov (United States)

    Nummer, Brian A

    2013-11-01

    Kombucha is a fermented beverage made from brewed tea and sugar. The taste is slightly sweet and acidic and it may have residual carbon dioxide. Kombucha is consumed in many countries as a health beverage and it is gaining in popularity in the U.S. Consequently, many retailers and food service operators are seeking to brew this beverage on site. As a fermented beverage, kombucha would be categorized in the Food and Drug Administration model Food Code as a specialized process and would require a variance with submission of a food safety plan. This special report was created to assist both operators and regulators in preparing or reviewing a kombucha food safety plan.

  2. Comparative study of Monte Carlo particle transport code PHITS and nuclear data processing code NJOY for PKA energy spectra and heating number under neutron irradiation

    International Nuclear Information System (INIS)

    Iwamoto, Y.; Ogawa, T.

    2016-01-01

    The modelling of damage in materials irradiated by neutrons is needed for understanding the mechanism of radiation damage in fission and fusion reactor facilities. Molecular dynamics simulations of damage cascades with full atomic interactions require information about the energy distribution of the primary knock-on atoms (PKAs). The most common way to calculate PKA energy spectra under low-energy neutron irradiation is to use the nuclear data processing code NJOY2012, which calculates group-to-group recoil cross section matrices from nuclear data libraries in ENDF format, i.e., the energy and angular recoil distributions for many reactions. After the NJOY2012 step, SPKA6C is employed to produce PKA energy spectra by combining the recoil cross section matrices with an incident neutron energy spectrum. However, an intercomparison of different processing routes and nuclear data libraries has not yet been performed. In particular, incident neutron energies higher than fission energies (~5 MeV and above) open many reaction channels, which produce a complex distribution of PKAs in energy and type. Recently, we have developed the event generator mode (EGM) in the Particle and Heavy Ion Transport code System PHITS for neutron-induced reactions in the energy region below 20 MeV. The main feature of EGM is to produce PKAs while conserving energy and momentum in each reaction. It is used for event-by-event analysis in application fields such as soft-error analysis in semiconductors, microdosimetry in the human body, and estimation of displacement per atom (DPA) values in metals. The purpose of this work is to identify the differences in PKA spectra and in the heating number related to kerma between the calculation methods PHITS-EGM and NJOY2012+SPKA6C, using the libraries TENDL-2015, ENDF/B-VII.1 and JENDL-4.0, for fusion-relevant materials.
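    The folding step that SPKA6C performs reduces to a matrix-vector product between the group-to-group recoil matrix and the group flux. A minimal sketch in Python/NumPy, with a made-up 3-group recoil matrix and flux (illustrative values, not data from the paper):

```python
import numpy as np

# Hypothetical 3-group illustration: R[g_n, g_r] is the recoil cross section
# (barns) for an incident neutron in group g_n to produce a PKA in recoil
# energy group g_r; phi[g_n] is the group neutron flux.
R = np.array([[1.2, 0.3, 0.0],
              [0.8, 0.9, 0.1],
              [0.2, 0.7, 0.6]])
phi = np.array([1.0e14, 5.0e13, 1.0e13])   # n/cm^2/s per incident group

pka_rate = phi @ R                       # PKA production rate per recoil group
pka_spec = pka_rate / pka_rate.sum()     # normalized PKA energy spectrum
print(pka_spec)
```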

  3. Comparative study of Monte Carlo particle transport code PHITS and nuclear data processing code NJOY for recoil cross section spectra under neutron irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Iwamoto, Yosuke, E-mail: iwamoto.yosuke@jaea.go.jp; Ogawa, Tatsuhiko

    2017-04-01

    Because primary knock-on atoms (PKAs) create point defects and clusters in materials that are irradiated with neutrons, it is important to validate the calculations of recoil cross section spectra that are used to estimate radiation damage in materials. Here, the recoil cross section spectra of fission- and fusion-relevant materials were calculated using the Event Generator Mode (EGM) of the Particle and Heavy Ion Transport code System (PHITS) and also using the data processing code NJOY2012 with the nuclear data libraries TENDL-2015, ENDF/B-VII.1, and JEFF-3.2. The heating number, which is the integral of the recoil cross section spectra, was also calculated using PHITS-EGM and compared with data extracted from the ACE files of TENDL-2015, ENDF/B-VII.1, and JENDL-4.0. In general, only a small difference was found between the PKA spectra of PHITS + TENDL-2015 and NJOY + TENDL-2015. From analyzing the recoil cross section spectra extracted from the nuclear data libraries using NJOY2012, we found that the recoil cross section spectra were incorrect for ⁷²Ge, ⁷⁵As, ⁸⁹Y, and ¹⁰⁹Ag in the ENDF/B-VII.1 library, and for ⁹⁰Zr and ⁵⁵Mn in the JEFF-3.2 library. From analyzing the heating number, we found that the data extracted from the ACE file of TENDL-2015 for all nuclides were problematic in the neutron capture region because of incorrect data regarding the emitted gamma energy. However, PHITS + TENDL-2015 can calculate PKA spectra and heating numbers correctly.

  4. Data processing of imperfectly coded images

    International Nuclear Information System (INIS)

    Hammersley, A.P.; Skinner, G.K.

    1984-01-01

    The theory of mask coding is well established for perfect coding systems, but imperfections in practical situations produce new data processing problems. The Spacelab 2 telescopes are fully coded systems, but some complications arise as parts of the detector are obscured by a strengthening cross. The effects of this sort of obscuration on image quality and ways of handling them will be discussed. (orig.)

  5. CITOPP, CITMOD, CITWI, Processing codes for CITATION Code

    International Nuclear Information System (INIS)

    Albarhoum, M.

    2008-01-01

    Description of program or function: CITOPP processes the output file of the CITATION 3-D diffusion code. The program can plot axial, radial and circumferential flux distributions (in cylindrical geometry) in addition to the multiplication factor convergence. The flux distributions can be drawn for each group specified by the program and visualized on the screen. CITMOD processes both the output and the input files of the CITATION 3-D diffusion code, and can visualize both axial and radial-angular models of the reactor described by the CITATION input/output files. CITWI processes the input file (CIT.INP) of the CITATION 3-D diffusion code. CIT.INP is processed to deduce the dimensions of the cell whose cross sections represent the corresponding reactor component in section 008 of CIT.INP.

  6. Code-Mixing and Code Switching in the Process of Learning

    Directory of Open Access Journals (Sweden)

    Diyah Atiek Mustikawati

    2016-09-01

    This study aimed to describe the forms of code switching and code mixing found in teaching and learning activities in the classroom, as well as to determine the factors influencing those forms of code switching and code mixing. The research is a descriptive qualitative case study which took place at Al Mawaddah Boarding School, Ponorogo. Based on the analysis and discussion stated in the previous chapters, the code mixing and code switching in learning activities at Al Mawaddah Boarding School occur among Javanese, Arabic, English and Indonesian, in the insertion of words, phrases, idioms, nouns, adjectives, clauses, and sentences. The factors determining code mixing in the learning process include: identification of the role, the desire to explain and interpret, and sourcing from the original language and its variations or from a foreign language. The factors determining code switching in the learning process include: the speaker (O1), the interlocutor (O2), the presence of a third person (O3), the topic of conversation, evoking a sense of humour, and prestige. The significance of this study is to allow readers to see the use of language in a multilingual society, especially at Al Mawaddah boarding school, regarding the rules and characteristics of language variation in teaching and learning activities in the classroom. Furthermore, the results of this research will provide input to the ustadz/ustadzah and students for developing oral communication skills and effective teaching and learning strategies in boarding schools.

  7. Parallel processing of structural integrity analysis codes

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.

    1996-01-01

    Structural integrity analysis plays an important role in assessing and demonstrating the safety of nuclear reactor components. This analysis is performed using analytical tools such as the Finite Element Method (FEM) with the help of digital computers. The complexity of the problems involved in nuclear engineering demands high-speed computation facilities to obtain solutions in a reasonable amount of time. Parallel processing systems such as ANUPAM provide an efficient platform for realising such high-speed computation. The development and implementation of software on parallel processing systems is an interesting and challenging task. The data and algorithm structure of the codes plays an important role in exploiting the capabilities of a parallel processing system. Structural analysis codes based on FEM can be divided into two categories with respect to their implementation on parallel processing systems. The first category, such as codes used for harmonic analysis and mechanistic fuel performance codes, does not require parallelisation of the individual modules of the code. The second category, such as conventional FEM codes, requires parallelisation of individual modules; in this category, parallelisation of the equation solution module poses major difficulties. Different solution schemes such as the domain decomposition method (DDM), a parallel active column solver and a substructuring method are currently used on parallel processing systems. Two codes, FAIR and TABS, belonging to each of these categories have been implemented on ANUPAM. The implementation details of these codes and the performance of different equation solvers are highlighted. (author). 5 refs., 12 figs., 1 tab

  8. Energy-coded processing in nuclear medicine

    International Nuclear Information System (INIS)

    Beck, R.; Metz, C.; Chen, H.T.

    1981-01-01

    A method is presented for processing image data that takes into account the energy of each detected gamma-ray photon. Weighted spatial averaging of local detected count densities in radionuclide images can increase the visual detectability of abnormalities. In principle, the benefits of image processing in nuclear medicine can be increased by processing the image data in each interval of the detected photon spectrum using a procedure that is appropriate for the spatial resolution and statistical quality associated with that energy interval, and by combining the energy-coded processed image components using generally energy-dependent weights. The potential gains in detection performance from implementing such an approach are examined.
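    A minimal sketch of the idea in Python/NumPy (the window counts, kernel widths and weights below are all invented for illustration): each energy window is smoothed according to its own statistics, then the processed components are combined with energy-dependent weights.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
# Three hypothetical energy windows with different count statistics
windows = [rng.poisson(lam, size=(64, 64)).astype(float) for lam in (5, 20, 50)]
sigmas  = (2.0, 1.0, 0.5)   # heavier smoothing for noisier (low-count) windows
weights = (0.2, 0.3, 0.5)   # assumed energy-dependent combination weights

processed = [gaussian_filter(im, s) for im, s in zip(windows, sigmas)]
combined = sum(w * im for w, im in zip(weights, processed))
```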

  9. Investigating the Simulink Auto-Coding Process

    Science.gov (United States)

    Gualdoni, Matthew J.

    2016-01-01

    Model based program design is the most clear and direct way to develop algorithms and programs for interfacing with hardware. While coding "by hand" results in a more tailored product, the ever-growing size and complexity of modern-day applications can cause the project workload to quickly become unreasonable for one programmer. This has generally been addressed by splitting the product into separate modules so that multiple developers can work in parallel on the same project; however, this introduces new potential for errors in the process. The fluidity, reliability and robustness of the code rely on the ability of the programmers to communicate their methods to one another; furthermore, multiple programmers invite multiple, potentially differing coding styles into the same product, which can cause a loss of readability or even module incompatibility. Fortunately, MathWorks has implemented an auto-coding feature that allows programmers to design their algorithms through models and diagrams in the graphical programming environment Simulink, allowing the designer to visually determine what the hardware is to do. From there, the auto-coding feature handles converting the project into another programming language. This type of approach allows the designer to clearly see how the software will direct the hardware without needing to interpret large amounts of code. In addition, it speeds up the programming process, minimizing the man-hours spent on a single project and thus reducing the chance of human error as well as project turnover time. One project that has benefited from the auto-coding procedure is Ramses, a portion of the GNC flight software on board Orion that has been implemented primarily in Simulink. Currently, however, auto-coding Ramses into C++ requires 5 hours of code generation time. This causes issues if the tool ever needs to be debugged, as this code generation will need to occur with each edit to any part of

  10. Arabic Natural Language Processing System Code Library

    Science.gov (United States)

    2014-06-01

    Code library for Arabic (and also English) natural language processing (NLP), containing code for training and applying the Arabic NLP system described in Stephen Tratz's paper on segmentation detection, affix labeling, POS tagging, and dependency parsing, presented at the Fourth Workshop on Statistical Parsing of Morphologically Rich Languages (SPMRL).

  11. A code inspection process for security reviews

    Energy Technology Data Exchange (ETDEWEB)

    Garzoglio, Gabriele; /Fermilab

    2009-05-01

    In recent years, it has become more and more evident that software threat communities are taking an increasing interest in Grid infrastructures. To mitigate the security risk associated with the increased numbers of attacks, the Grid software development community needs to scale up effort to reduce software vulnerabilities. This can be achieved by introducing security review processes as a standard project management practice. The Grid Facilities Department of the Fermilab Computing Division has developed a code inspection process, tailored to reviewing security properties of software. The goal of the process is to identify technical risks associated with an application and their impact. This is achieved by focusing on the business needs of the application (what it does and protects), on understanding threats and exploit communities (what an exploiter gains), and on uncovering potential vulnerabilities (what defects can be exploited). The desired outcome of the process is an improvement of the quality of the software artifact and an enhanced understanding of possible mitigation strategies for residual risks. This paper describes the inspection process and lessons learned on applying it to Grid middleware.

  12. A code inspection process for security reviews

    Science.gov (United States)

    Garzoglio, Gabriele

    2010-04-01

    In recent years, it has become more and more evident that software threat communities are taking an increasing interest in Grid infrastructures. To mitigate the security risk associated with the increased numbers of attacks, the Grid software development community needs to scale up effort to reduce software vulnerabilities. This can be achieved by introducing security review processes as a standard project management practice. The Grid Facilities Department of the Fermilab Computing Division has developed a code inspection process, tailored to reviewing security properties of software. The goal of the process is to identify technical risks associated with an application and their impact. This is achieved by focusing on the business needs of the application (what it does and protects), on understanding threats and exploit communities (what an exploiter gains), and on uncovering potential vulnerabilities (what defects can be exploited). The desired outcome of the process is an improvement of the quality of the software artifact and an enhanced understanding of possible mitigation strategies for residual risks. This paper describes the inspection process and lessons learned on applying it to Grid middleware.

  13. Calculation code MIXSET for Purex process

    International Nuclear Information System (INIS)

    Gonda, Kozo; Fukuda, Shoji.

    1977-09-01

    MIXSET is a FORTRAN IV calculation code for the Purex process that simulates the dynamic behavior of solvent extraction processes in mixer-settlers. Two options permit terminating the dynamic phase by time or by achieving steady state; these options also permit continuing a calculation successively using new inputs from an arbitrary phase. A third option permits an artificially rapid approach to steady state, and a fourth option permits searching for the optimum input that satisfies both the specification and the recovery rate of the product. MIXSET handles a chemical system of up to eight components, with or without mutual dependence of the distribution of the components. The chemical system in MIXSET includes chemical reactions and/or decay reactions. Distribution data can be supplied by third-order polynomial equations or tables, and kinetic data by tables or given constants. The fluctuation of the interfacial level height in the settler is converted into flow rate changes of the organic and aqueous streams to follow the dynamic behavior of the extraction process in detail. MIXSET can be applied to flowsheet studies, start-up and/or shut-down procedure studies, and real-time process management in countercurrent solvent extraction processes. (auth.)

  14. A code inspection process for security reviews

    International Nuclear Information System (INIS)

    Garzoglio, Gabriele

    2010-01-01

    In recent years, it has become more and more evident that software threat communities are taking an increasing interest in Grid infrastructures. To mitigate the security risk associated with the increased numbers of attacks, the Grid software development community needs to scale up effort to reduce software vulnerabilities. This can be achieved by introducing security review processes as a standard project management practice. The Grid Facilities Department of the Fermilab Computing Division has developed a code inspection process, tailored to reviewing security properties of software. The goal of the process is to identify technical risks associated with an application and their impact. This is achieved by focusing on the business needs of the application (what it does and protects), on understanding threats and exploit communities (what an exploiter gains), and on uncovering potential vulnerabilities (what defects can be exploited). The desired outcome of the process is an improvement of the quality of the software artifact and an enhanced understanding of possible mitigation strategies for residual risks. This paper describes the inspection process and lessons learned on applying it to Grid middleware.

  15. Speech and audio processing for coding, enhancement and recognition

    CERN Document Server

    Togneri, Roberto; Narasimha, Madihally

    2015-01-01

    This book describes the basic principles underlying the generation, coding, transmission and enhancement of speech and audio signals, including advanced statistical and machine learning techniques for speech and speaker recognition, with an overview of the key innovations in these areas. Key research undertaken in speech coding, speech enhancement, speech recognition, emotion recognition and speaker diarization is also presented, along with recent advances and new paradigms in these areas. The book offers readers a single-source reference on the significant applications of speech and audio processing to speech coding, speech enhancement and speech/speaker recognition; enables readers involved in algorithm development and implementation issues for speech coding to understand the historical development and future challenges in speech coding research; and discusses speech coding methods yielding bit-streams that are multi-rate and scalable for Voice-over-IP (VoIP) networks.

  16. Description of ground motion data processing codes: Volume 3

    Energy Technology Data Exchange (ETDEWEB)

    Sanders, M.L.

    1988-02-01

    Data processing codes developed to process ground motion at the Nevada Test Site for the Weapons Test Seismic Investigations Project are used today as part of the program to process ground motion records for the Nevada Nuclear Waste Storage Investigations Project. The work contained in this report documents and lists the codes and verifies the "PSRV" code. 39 figs.

  17. Qualifying codes under software quality assurance: Two examples as guidelines for codes that are existing or under development

    Energy Technology Data Exchange (ETDEWEB)

    Mangold, D.

    1993-05-01

    Software quality assurance is an area of concern for DOE, EPA, and other agencies due to the poor quality of software and its documentation they have received in the past. This report briefly summarizes the software development concepts and terminology increasingly employed by these agencies and provides a workable approach to scientific programming under the new requirements. Following this is a practical description of how to qualify a simulation code, based on a software QA plan that has been reviewed and officially accepted by DOE/OCRWM. Two codes have recently been baselined and qualified, so that they can be officially used for QA Level 1 work under the DOE/OCRWM QA requirements. One of them was baselined and qualified within one week. The first of the codes was the multi-phase multi-component flow code TOUGH version 1, an already existing code, and the other was a geochemistry transport code, STATEQ, that was under development. The way to accomplish qualification for both types of codes is summarized in an easy-to-follow, step-by-step fashion to illustrate how to baseline and qualify such codes through a relatively painless procedure.

  18. Consistent Code Qualification Process and Application to WWER-1000 NPP

    International Nuclear Information System (INIS)

    Berthon, A.; Petruzzi, A.; Giannotti, W.; D'Auria, F.; Reventos, F.

    2006-01-01

    Calculation analyses using system codes are performed to evaluate NPP or facility behavior during a postulated transient or to evaluate code capability. A calculation analysis constitutes a process that involves the code itself, the data of the reference plant, the data about the transient, the nodalization, and the user; all these elements affect one another and affect the results. A major issue in the use of mathematical models is the model's capability to reproduce the plant or facility behavior under steady-state and transient conditions. These aspects constitute two main checks that must be satisfied during the qualification process: the first is related to the realization of a scheme of the reference plant; the second is related to the capability to reproduce the transient behavior. The aim of this paper is to describe the UMAE (Uncertainty Method based on Accuracy Extrapolation) methodology developed at the University of Pisa for qualifying a nodalization and analysing the calculated results, and to perform the uncertainty evaluation of the system code by the CIAU code (Code with the capability of Internal Assessment of Uncertainty). The activity consists of the re-analysis of Experiment BL-44 (SBLOCA) performed in the LOBI facility and the analysis of a Kv-scaling calculation of the WWER-1000 NPP nodalization taking the test BL-44 as reference. Relap5/Mod3.3 has been used as the thermal-hydraulic system code, and the standard procedure adopted at the University of Pisa has been applied to show the capability of the code to predict the significant aspects of the transient and to obtain a qualified nodalization of the WWER-1000 through a systematic qualitative and quantitative accuracy evaluation. The qualitative accuracy evaluation is based on the selection of Relevant Thermal-hydraulic Aspects (RTAs) and is a prerequisite to the application of the Fast Fourier Transform Based Method (FFTBM) which quantifies
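    The quantitative step of the FFTBM is the average amplitude: the spectrum of the code-experiment error normalized by the spectrum of the experimental signal. A sketch of that measure on synthetic trends (not the qualified tool itself; both signals below are invented):

```python
import numpy as np

def fftbm_average_amplitude(exp, calc):
    """AA = sum|FFT(calc - exp)| / sum|FFT(exp)|; AA = 0 is a perfect prediction."""
    err_spec = np.abs(np.fft.rfft(calc - exp))
    exp_spec = np.abs(np.fft.rfft(exp))
    return err_spec.sum() / exp_spec.sum()

t = np.linspace(0.0, 100.0, 1000)
exp  = np.exp(-t / 30.0)                      # synthetic measured trend
calc = np.exp(-t / 28.0) + 0.01 * np.sin(t)   # synthetic code prediction
print(f"AA = {fftbm_average_amplitude(exp, calc):.3f}")
```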

  19. Obsolescence : The underlying processes

    NARCIS (Netherlands)

    Thomsen, A.F.; Nieboer, N.E.T.; Van der Flier, C.L.

    2015-01-01

    Obsolescence, defined as the process of declining performance of buildings, is a serious threat for the value, the usefulness and the life span of housing properties. Thomsen and van der Flier (2011) developed a model in which obsolescence is categorised on the basis of two distinctions, namely

  20. Research on pre-processing of QR Code

    Science.gov (United States)

    Sun, Haixing; Xia, Haojie; Dong, Ning

    2013-10-01

    QR codes encode many kinds of information and offer several advantages: large storage capacity, high reliability, high-speed reading from any direction, small printing size and efficient representation of Chinese characters. In order to obtain a clearer binarized image from a complex background and to improve the recognition rate of QR codes, this paper investigates pre-processing methods for QR codes (Quick Response Codes) and presents algorithms and results of image pre-processing for QR code recognition. The conventional method is improved by adapting Sauvola's adaptive thresholding method. Additionally, QR code extraction that adapts to different image sizes and a flexible image correction approach are introduced, improving the efficiency and accuracy of QR code image processing.
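    For reference, Sauvola's adaptive method computes a local threshold T = m(1 + k(s/R - 1)) from the local mean m and standard deviation s. A minimal sketch with common default parameters (window size, k and R here are generic values, not the paper's tuned ones):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def sauvola_binarize(img, w=25, k=0.2, R=128.0):
    """Binarize a grayscale image with Sauvola's local threshold
    T = m * (1 + k * (s / R - 1)) over a w x w neighbourhood."""
    img = img.astype(float)
    m = uniform_filter(img, w)                   # local mean
    s2 = uniform_filter(img ** 2, w) - m ** 2    # local variance
    s = np.sqrt(np.clip(s2, 0.0, None))          # local standard deviation
    T = m * (1.0 + k * (s / R - 1.0))
    return (img > T).astype(np.uint8) * 255
```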

  1. Parallel processing Monte Carlo radiation transport codes

    International Nuclear Information System (INIS)

    McKinney, G.W.

    1994-01-01

    Issues related to distributed-memory multiprocessing as applied to Monte Carlo radiation transport are discussed. Measurements of communication overhead are presented for the radiation transport code MCNP which employs the communication software package PVM, and average efficiency curves are provided for a homogeneous virtual machine

  2. Offer and acceptance under the Russian Civil Code

    OpenAIRE

    Musin, Valery

    2013-01-01

    The article deals with the procedure for entering into a contract under Russian civil law, in both domestic and foreign markets. Offer and acceptance are considered in the light of relevant provisions of the Russian Civil Codes of 1922 and 1964 and of the currently effective Code, as compared with the rules of the UN Convention on Contracts for the International Sale of Goods 1980 and the UNIDROIT Principles of International Commercial Contracts 2010.

  3. Offer and Acceptance under the Russian Civil Code

    Directory of Open Access Journals (Sweden)

    Valery Musin

    2013-01-01

    The article deals with the procedure for entering into a contract under Russian civil law, in both domestic and foreign markets. Offer and acceptance are considered in the light of relevant provisions of the Russian Civil Codes of 1922 and 1964 and of the currently effective Code, as compared with the rules of the UN Convention on Contracts for the International Sale of Goods 1980 and the UNIDROIT Principles of International Commercial Contracts 2010.

  4. Software quality and process improvement in scientific simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Ambrosiano, J.; Webster, R. [Los Alamos National Lab., NM (United States)

    1997-11-01

    This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. This study is based on the experience of the authors and interviews with ten subjects chosen from simulation code development teams at LANL. This study is descriptive rather than scientific.

  5. Component processes underlying future thinking.

    Science.gov (United States)

    D'Argembeau, Arnaud; Ortoleva, Claudia; Jumentier, Sabrina; Van der Linden, Martial

    2010-09-01

    This study sought to investigate the component processes underlying the ability to imagine future events, using an individual-differences approach. Participants completed several tasks assessing different aspects of future thinking (i.e., fluency, specificity, amount of episodic details, phenomenology) and were also assessed with tasks and questionnaires measuring various component processes that have been hypothesized to support future thinking (i.e., executive processes, visual-spatial processing, relational memory processing, self-consciousness, and time perspective). The main results showed that executive processes were correlated with various measures of future thinking, whereas visual-spatial processing abilities and time perspective were specifically related to the number of sensory descriptions reported when specific future events were imagined. Furthermore, individual differences in self-consciousness predicted the subjective feeling of experiencing the imagined future events. These results suggest that future thinking involves a collection of processes that are related to different facets of future-event representation.

  6. Performance analysis of simultaneous dense coding protocol under decoherence

    Science.gov (United States)

    Huang, Zhiming; Zhang, Cai; Situ, Haozhen

    2017-09-01

    The simultaneous dense coding (SDC) protocol is useful in designing quantum protocols. We analyze the performance of the SDC protocol under the influence of noisy quantum channels. Six kinds of paradigmatic Markovian noise along with one kind of non-Markovian noise are considered. The joint success probability of both receivers and the success probabilities of one receiver are calculated for three different locking operators. Some interesting properties have been found, such as invariance and symmetry. Among the three locking operators we consider, the SWAP gate is most resistant to noise and results in the same success probabilities for both receivers.
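    The flavour of such a calculation can be seen in a two-party toy case (not the three-party SDC protocol with locking operators itself): apply a depolarizing channel to the travelling half of a Bell pair and take the success probability as the overlap with the intended Bell state. A sketch in Python/NumPy:

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.diag([1.0, -1.0]).astype(complex)
Y = 1j * X @ Z

phi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)   # Bell state |Phi+>
rho = np.outer(phi, phi.conj())

def depolarize_first_qubit(rho, p):
    """Depolarizing channel with error probability p on the first qubit."""
    kraus = [np.sqrt(1 - p) * I2] + [np.sqrt(p / 3) * P for P in (X, Y, Z)]
    return sum(np.kron(K, I2) @ rho @ np.kron(K, I2).conj().T for K in kraus)

for p in (0.0, 0.1, 0.3):
    out = depolarize_first_qubit(rho, p)
    # Pauli errors map |Phi+> to orthogonal Bell states, so success = 1 - p
    print(p, np.real(phi.conj() @ out @ phi))
```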

  7. Applicability of vector processing to large-scale nuclear codes

    International Nuclear Information System (INIS)

    Ishiguro, Misako; Harada, Hiroo; Matsuura, Toshihiko; Okuda, Motoi; Ohta, Fumio; Umeya, Makoto.

    1982-03-01

    To meet the growing trend of computational requirements in JAERI, introduction of a high-speed computer with vector processing capability (a vector processor) is desirable in the near future. To make effective use of a vector processor, appropriate optimization of nuclear codes for a pipelined-vector architecture is vital, which will pose new problems concerning code development and maintenance. In this report, vector processing efficiency is assessed with respect to large-scale nuclear codes by examining the following items: 1) the present features of the computational load in JAERI, analyzed by compiling computer utilization statistics; 2) vector processing efficiency estimated for the ten most heavily used nuclear codes by analyzing their dynamic behavior when run on a scalar machine; 3) vector processing efficiency measured for five other nuclear codes by using the current vector processors, FACOM 230-75 APU and CRAY-1; 4) the effectiveness of applying a high-speed vector processor to nuclear codes, evaluated by taking account of the characteristics of JAERI jobs. Problems with vector processors are also discussed from the viewpoints of code performance and ease of use. (author)
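    The payoff of such an assessment is usually summarized with an Amdahl-type estimate: if a fraction f of the runtime vectorizes with per-loop speedup v, the overall speedup is 1/((1-f) + f/v). A worked example with invented numbers:

```python
def overall_speedup(f, v):
    """Amdahl-style estimate: f = vectorizable fraction of runtime,
    v = speedup of the vectorized portion on the pipeline."""
    return 1.0 / ((1.0 - f) + f / v)

print(overall_speedup(0.80, 10.0))   # ~3.6x even with 10x vector pipes
print(overall_speedup(0.95, 10.0))   # ~6.9x: the vectorization ratio dominates
```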

  8. The duties of the seller under the civil code and under the Hire ...

    African Journals Online (AJOL)

    This paper examines the duties of the seller under the Hire Purchase and Credit Sale Act 1964 (the Act) and under the Civil Code respectively as to the quiet enjoyment of the goods purchased, the right of the seller to sell them and the duty of the seller as to their merchantable quality. A comparison is made between these ...

  9. Calculation code; REPROSY-P for process studies on Pu purification with TBP

    International Nuclear Information System (INIS)

    Tsujino, Takeshi; Kohsaka, Atsuo; Aochi, Tetsuo

    1975-10-01

    To evaluate the plutonium purification process with tributyl phosphate (TBP), the calculation code REPROSY-P for the TBP-HNO3-Pu system has been prepared on the basis of a batchwise counter-current extraction cascade model. With the code, the concentration profiles under transient or steady-state conditions and the number of theoretical stages required can be calculated for a given process condition. Empirical distribution equations, distribution data of plutonium and the detailed calculation program are described. (auth.)

  10. Data processing codes for fatigue and tensile tests

    International Nuclear Information System (INIS)

    Sanchez Sarmiento, Gustavo; Iorio, A.F.; Crespi, J.C.

    1981-01-01

    The processing of fatigue and tensile test data in order to obtain several parameters of engineering interest requires a considerable amount of numerical calculation. In order to reduce the time spent on this work and to establish standard data processing for sets of similar tests, it is very advantageous to have a calculation code to run on a computer. Two codes have been developed in FORTRAN; one of them predicts cyclic properties of materials from monotonic and incremental or multiple cyclic step tests (the ENSPRED code), and the other reduces data coming from strain-controlled low cycle fatigue tests (the ENSDET code). Two examples are included using Zircaloy-4 material from different manufacturers. (author)
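    Cyclic properties of the kind such codes extract are typically reported as a Ramberg-Osgood fit, total strain = sigma/E + (sigma/K')^(1/n'). A sketch of evaluating such a fit (the constants below are generic illustrative values, not Zircaloy-4 results from these tests):

```python
import numpy as np

E       = 200.0e3   # elastic modulus, MPa (assumed)
K_prime = 1200.0    # cyclic strength coefficient, MPa (assumed)
n_prime = 0.15      # cyclic strain hardening exponent (assumed)

stress = np.linspace(0.0, 600.0, 61)                          # MPa
strain = stress / E + (stress / K_prime) ** (1.0 / n_prime)
# 'strain' vs 'stress' is the cyclic stress-strain curve the fit describes
```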

  11. Progress on China nuclear data processing code system

    Science.gov (United States)

    Liu, Ping; Wu, Xiaofei; Ge, Zhigang; Li, Songyang; Wu, Haicheng; Wen, Lili; Wang, Wenming; Zhang, Huanyu

    2017-09-01

    China is developing the nuclear data processing code Ruler, which can be used for producing multi-group cross sections and related quantities from evaluated nuclear data in the ENDF format [1]. Ruler includes modules for reconstructing cross sections over the whole energy range, generating Doppler-broadened cross sections for a given temperature, producing effective self-shielded cross sections in the unresolved energy range, calculating scattering cross sections in the thermal energy range, generating group cross sections and matrices, and preparing WIMS-D format data files for the reactor physics code WIMS-D [2]. Ruler is written in Fortran-90 and has been tested on 32-bit computers under the Windows-XP and Linux operating systems. Verification of Ruler has been performed by comparison with calculation results obtained with the NJOY99 [3] processing code, and validation by using the WIMSD5B code.
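    The central operation behind "generating group cross sections" is a flux-weighted collapse, sigma_g = ∫sigma(E)phi(E)dE / ∫phi(E)dE over each group. A sketch with a toy 1/v-like cross section and an assumed 1/E weighting spectrum (illustrative only, not Ruler's data or algorithms):

```python
import numpy as np

E = np.logspace(-5, 7, 20000)          # energy grid, eV
sigma = 5.0 + 50.0 / np.sqrt(E)        # toy 1/v-like cross section, barns
phi = 1.0 / E                          # assumed weighting spectrum

bounds = np.logspace(-5, 7, 11)        # 10 coarse groups
for g, (lo, hi) in enumerate(zip(bounds[:-1], bounds[1:])):
    m = (E >= lo) & (E < hi)
    sigma_g = np.trapz(sigma[m] * phi[m], E[m]) / np.trapz(phi[m], E[m])
    print(f"group {g}: {sigma_g:.3f} b")
```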

  12. Case studies in Gaussian process modelling of computer codes

    International Nuclear Information System (INIS)

    Kennedy, Marc C.; Anderson, Clive W.; Conti, Stefano; O'Hagan, Anthony

    2006-01-01

    In this paper we present a number of recent applications in which an emulator of a computer code is created using a Gaussian process model. Tools are then applied to the emulator to perform sensitivity analysis and uncertainty analysis. Sensitivity analysis is used both as an aid to model improvement and as a guide to how much the output uncertainty might be reduced by learning about specific inputs. Uncertainty analysis allows us to reflect output uncertainty due to unknown input parameters, when the finished code is used for prediction. The computer codes themselves are currently being developed within the UK Centre for Terrestrial Carbon Dynamics
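    A minimal Gaussian-process emulator of this kind takes only a few lines: fit an RBF-kernel GP to a handful of code runs, then query the predictive mean and variance cheaply. The "simulator" and design points below are stand-ins, not the codes studied in the paper:

```python
import numpy as np

def rbf(a, b, ell=0.3, var=1.0):
    """Squared-exponential (RBF) covariance between 1-D input sets."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ell) ** 2)

simulator = lambda x: np.sin(3.0 * x) + 0.5 * x    # stand-in for the code
X = np.linspace(0.0, 1.0, 8)                       # 8 training runs
y = simulator(X)

K = rbf(X, X) + 1e-8 * np.eye(len(X))              # tiny jitter: code is deterministic
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

Xs = np.linspace(0.0, 1.0, 200)                    # cheap emulator queries
mean = rbf(Xs, X) @ alpha                          # emulator mean
V = np.linalg.solve(L, rbf(X, Xs))
var = np.clip(rbf(Xs, Xs).diagonal() - np.sum(V ** 2, axis=0), 0.0, None)
```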

  13. A Realistic Model Under Which the Genetic Code is Optimal

    NARCIS (Netherlands)

    Buhrman, Harry; van der Gulik, Peter T. S.; Klau, Gunnar W.; Schaffner, Christian; Speijer, Dave; Stougie, Leen

    2013-01-01

    The genetic code has a high level of error robustness. Using values of hydrophobicity scales as a proxy for amino acid character, and the mean square measure as a function quantifying error robustness, a value can be obtained for a genetic code which reflects the error robustness of that code. By
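    The "mean square measure" can be reproduced in a few lines: average the squared property change over all single-base substitutions between sense codons. The sketch below uses the Kyte-Doolittle hydrophobicity scale as the proxy property; the paper's exact scale and weighting may differ:

```python
BASES = "TCAG"
# Standard genetic code (NCBI table 1), codon order TTT, TTC, TTA, TTG, TCT, ...
AAS = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODE = {a + b + c: AAS[16 * i + 4 * j + k]
        for i, a in enumerate(BASES)
        for j, b in enumerate(BASES)
        for k, c in enumerate(BASES)}
KD = {'I': 4.5, 'V': 4.2, 'L': 3.8, 'F': 2.8, 'C': 2.5, 'M': 1.9, 'A': 1.8,
      'G': -0.4, 'T': -0.7, 'S': -0.8, 'W': -0.9, 'Y': -1.3, 'P': -1.6,
      'H': -3.2, 'E': -3.5, 'Q': -3.5, 'D': -3.5, 'N': -3.5, 'K': -3.9,
      'R': -4.5}

total, count = 0.0, 0
for codon, aa in CODE.items():
    if aa == '*':
        continue                        # skip stop codons
    for pos in range(3):
        for b in BASES:
            if b == codon[pos]:
                continue
            mut = CODE[codon[:pos] + b + codon[pos + 1:]]
            if mut == '*':
                continue                # mutations to stop excluded here
            total += (KD[aa] - KD[mut]) ** 2
            count += 1
print(total / count)                    # MS value of the standard code
```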

  14. A Realistic Model under which the Genetic Code is Optimal

    NARCIS (Netherlands)

    Buhrman, H.; van der Gulik, P.T.S.; Klau, G.W.; Schaffner, C.; Speijer, D.; Stougie, L.

    2013-01-01

    The genetic code has a high level of error robustness. Using values of hydrophobicity scales as a proxy for amino acid character, and the mean square measure as a function quantifying error robustness, a value can be obtained for a genetic code which reflects the error robustness of that code. By

  15. Post-processing of the TRAC code's results

    International Nuclear Information System (INIS)

    Baron, J.H.; Neuman, D.

    1987-01-01

    The TRAC code serves for the analysis of accidents in nuclear installations from the thermohydraulic point of view. A program has been developed with the aim of rapidly processing the information generated by the code, with on-screen graphics capability in both high and low resolution, and hard-copy output through a printer or plotter. Although the programs are intended to be used after TRAC runs, they may also be used while the program is running so as to observe the calculation process. The advantages of employing this type of tool, its actual capacity and its possibilities of expansion according to the user's needs are herein described. (Author)

  16. SCAMPI: A code package for cross-section processing

    Energy Technology Data Exchange (ETDEWEB)

    Parks, C.V.; Petrie, L.M.; Bowman, S.M.; Broadhead, B.L.; Greene, N.M.; White, J.E.

    1996-04-01

    The SCAMPI code package consists of a set of SCALE and AMPX modules that have been assembled to facilitate user needs for preparation of problem-specific, multigroup cross-section libraries. The function of each module contained in the SCAMPI code package is discussed, along with illustrations of their use in practical analyses. Ideas are presented for future work that can enable one-step processing from a fine-group, problem-independent library to a broad-group, problem-specific library ready for a shielding analysis.

  17. Coded Ultrasound for Blood Flow Estimation Using Subband Processing

    DEFF Research Database (Denmark)

    Gran, Fredrik; Udesen, Jesper; Nielsen, Michael Bachmann

    2008-01-01

    This paper investigates the use of coded excitation for blood flow estimation in medical ultrasound. Broadband coded signals are used to increase SNR, followed by subband processing. The received broadband signal is filtered using a set of narrow-band filters. Estimating the velocity in each of the bands and averaging the results yields better performance compared with what would be possible when transmitting a narrow-band pulse. The study was carried out using an experimental ultrasound scanner and a commercial linear array 7 MHz transducer. A circulating flow rig was scanned with a beam-to-flow angle of 60 degrees; the flow was laminar and had a parabolic flow profile with a peak velocity of 0.09 m/s. The mean relative standard deviation of the velocity estimate using the reference method with an 8-cycle excitation pulse at 7 MHz was 0.544% compared with the peak velocity in the rig. Two Barker codes were tested, with lengths of 5 and 13 bits; the corresponding mean relative standard deviations were 0.367% and 0.310%, respectively. For the Golay coded experiment, two 8-bit codes were used, and the mean relative standard deviation was 0.335%.
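    The subband idea can be demonstrated end-to-end on synthetic data: split the received broadband lines into narrow bands, run a lag-one autocorrelation (Kasai-type) estimator in each band, and average. Everything below (pulse, sampling, band choices) is an invented illustration, not the scanner setup of the paper:

```python
import numpy as np

fs, f0, c, prf = 100e6, 7e6, 1540.0, 5e3
v_true, n_lines, n_samp = 0.09, 32, 512
t = np.arange(n_samp) / fs
tau = 2.0 * v_true / (c * prf)                 # extra echo delay per emission

pulse = lambda tt: np.sin(2*np.pi*f0*tt) * np.exp(-0.5*((tt - 1e-6)/0.2e-6)**2)
lines = np.stack([pulse(t - n * tau) for n in range(n_lines)])  # received RF

spec = np.fft.fft(lines, axis=1)
freqs = np.fft.fftfreq(n_samp, 1.0 / fs)
v_bands = []
for fk in (6e6, 7e6, 8e6):                     # three assumed subbands
    band = spec * ((freqs > fk - 0.5e6) & (freqs < fk + 0.5e6))
    a = np.fft.ifft(band, axis=1)              # analytic band signal
    r1 = np.sum(np.conj(a[:-1]) * a[1:])       # lag-one autocorrelation
    v_bands.append(-c * prf * np.angle(r1) / (4.0 * np.pi * fk))
print(np.mean(v_bands))                        # ~0.09 m/s (motion away)
```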

  18. Channel modeling, signal processing and coding for perpendicular magnetic recording

    Science.gov (United States)

    Wu, Zheng

    With the increasing areal density in magnetic recording systems, perpendicular recording has replaced longitudinal recording to overcome the superparamagnetic limit. Studies on perpendicular recording channels including aspects of channel modeling, signal processing and coding techniques are presented in this dissertation. To optimize a high density perpendicular magnetic recording system, one needs to know the tradeoffs between various components of the system including the read/write transducers, the magnetic medium, and the read channel. We extend the work by Chaichanavong on the parameter optimization for systems via design curves. Different signal processing and coding techniques are studied. Information-theoretic tools are utilized to determine the acceptable region for the channel parameters when optimal detection and linear coding techniques are used. Our results show that a considerable gain can be achieved by the optimal detection and coding techniques. The read-write process in perpendicular magnetic recording channels includes a number of nonlinear effects. Nonlinear transition shift (NLTS) is one of them. The signal distortion induced by NLTS can be reduced by write precompensation during data recording. We numerically evaluate the effect of NLTS on the read-back signal and examine the effectiveness of several write precompensation schemes in combating NLTS in a channel characterized by both transition jitter noise and additive white Gaussian electronics noise. We also present an analytical method to estimate the bit-error-rate and use it to help determine the optimal write precompensation values in multi-level precompensation schemes. We propose a mean-adjusted pattern-dependent noise predictive (PDNP) detection algorithm for use on the channel with NLTS. We show that this detector can offer significant improvements in bit-error-rate (BER) compared to conventional Viterbi and PDNP detectors. Moreover, the system performance can be further improved by

  19. Error Processing Techniques for the Modified Read Facsimile Code.

    Science.gov (United States)

    1981-09-01

    Final report by Richard A. Schaphorst et al., prepared under contract DCA100-80-C-0233.

  20. The role of the PIRT process in identifying code improvements and executing code development

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G.E. [Idaho National Engineering Lab., Idaho Falls, ID (United States); Boyack, B.E. [Los Alamos National Lab., NM (United States)

    1997-07-01

    In September 1988, the USNRC issued a revised ECCS rule for light water reactors that allows, as an option, the use of best estimate (BE) plus uncertainty methods in safety analysis. The key feature of this licensing option relates to quantification of the uncertainty in the determination that an NPP has a "low" probability of violating the safety criteria specified in 10 CFR 50. To support the 1988 licensing revision, the USNRC and its contractors developed the CSAU evaluation methodology to demonstrate the feasibility of the BE plus uncertainty approach. The PIRT process, Step 3 in the CSAU methodology, was originally formulated to support the BE plus uncertainty licensing option as executed in the CSAU approach to safety analysis. Subsequent work has shown the PIRT process to be a much more powerful tool than conceived in its original form. Through further development and application, the PIRT process has shown itself to be a robust means to establish safety analysis computer code phenomenological requirements in their order of importance to such analyses. Used early in research directed toward these objectives, PIRT results also provide the technical basis and cost effective organization for new experimental programs needed to improve the safety analysis codes for new applications. The primary purpose of this paper is to describe the generic PIRT process, including typical and common illustrations from prior applications. The secondary objective is to provide guidance to future applications of the process to help them focus, in a graded approach, on systems, components, processes and phenomena that have been common in several prior applications.

  1. Coded ultrasound for blood flow estimation using subband processing

    DEFF Research Database (Denmark)

    Gran, F.; Udesen, J.; Jensen, J.A.

    2008-01-01

    This paper investigates the use of coded excitation for blood flow estimation in medical ultrasound. Traditional autocorrelation estimators use narrow-band excitation signals to provide sufficient signal-to-noise-ratio (SNR) and velocity estimation performance. In this paper, broadband coded signals are used to increase SNR, followed by subband processing. The received broadband signal is filtered using a set of narrow-band filters. Estimating the velocity in each of the bands and averaging the results yields better performance compared with what would be possible when transmitting a narrow-band pulse. The coded approach for velocity estimation is compared with a conventional approach transmitting a narrow-band pulse. The study was carried out using an experimental ultrasound scanner and a commercial linear array 7 MHz transducer. A circulating flow rig was scanned with a beam-to-flow angle of 60 degrees. The flow in the rig

  2. Study on the properties of infrared wavefront coding athermal system under several typical temperature gradient distributions

    Science.gov (United States)

    Cai, Huai-yu; Dong, Xiao-tong; Zhu, Meng; Huang, Zhan-hua

    2018-01-01

    Wavefront coding for athermalization can effectively ensure stable imaging of an optical system over a large temperature range, with the additional advantages of compact structure and low cost. Simulating properties such as the PSF and MTF of a wavefront coding athermal system under several typical temperature gradient distributions helps characterize its behavior in non-ideal temperature environments and supports the realization of the system design targets. In this paper, we utilize the data interoperability between SolidWorks and ZEMAX to simplify the traditional structural/thermal/optical integrated analysis process. We design and build the optical model and corresponding mechanical model of an infrared wavefront coding athermal imaging system. Axial and radial temperature gradients of different magnitudes are applied to the whole system in SolidWorks, yielding the changes in curvature, refractive index and lens spacing. The deformed model is then imported into ZEMAX for ray tracing, giving the changes in the PSF and MTF of the optical system. Finally, we discuss and evaluate the consistency of the PSF (MTF) of the wavefront coding athermal system and the restorability of its images, providing a basis and reference for the optimal design of such systems. The results show that the adaptability of a single-material infrared wavefront coding athermal system to an axial temperature gradient can tolerate temperature fluctuations of up to 60°C, much higher than for a radial temperature gradient.
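    The numerical core of such a study can be sketched with a scalar-diffraction model: build a pupil with a cubic phase mask, add a defocus term as a stand-in for the thermal perturbation, and compare MTFs. All parameters below are illustrative, not the system in the paper:

```python
import numpy as np

N = 256
x = np.linspace(-1.0, 1.0, N)
X, Y = np.meshgrid(x, x)
aperture = (X**2 + Y**2) <= 1.0

def mtf(alpha, w20):
    """MTF of a pupil with cubic phase alpha*(x^3 + y^3) and defocus w20 (waves)."""
    phase = 2.0 * np.pi * (alpha * (X**3 + Y**3) + w20 * (X**2 + Y**2))
    pupil = aperture * np.exp(1j * phase)
    psf = np.abs(np.fft.fft2(pupil)) ** 2      # incoherent point spread function
    otf = np.abs(np.fft.fft2(psf))
    return otf / otf.flat[0]                   # normalize to DC

m_nominal   = mtf(alpha=20.0, w20=0.0)
m_perturbed = mtf(alpha=20.0, w20=3.0)   # stays close to m_nominal: this
# near-invariance is what lets one restoration filter serve all temperatures
```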

  3. Methods for the development of large computer codes under LTSS

    International Nuclear Information System (INIS)

    Sicilian, J.M.

    1977-06-01

    TRAC is a large computer code being developed by Group Q-6 for the analysis of the transient thermal hydraulic behavior of light-water nuclear reactors. A system designed to assist the development of TRAC is described. The system consists of a central HYDRA dataset, R6LIB, containing files used in the development of TRAC, and a file maintenance program, HORSE, which facilitates the use of this dataset

  4. SSYST. A code system to analyze LWR fuel rod behavior under accident conditions

    International Nuclear Information System (INIS)

    Gulden, W.; Meyder, R.; Borgwaldt, H.

    1982-01-01

    SSYST (Safety SYSTem) is a modular system to analyze the behavior of light water reactor fuel rods and fuel rod simulators under accident conditions. It has been developed in close cooperation between Kernforschungszentrum Karlsruhe (KfK) and the Institut fuer Kerntechnik und Energiewandlung (IKE), University of Stuttgart, under contract to Projekt Nukleare Sicherheit (PNS) at KfK. Although originally aimed at single-rod analysis, features are available to calculate effects such as blockage ratios of bundles and whole cores. A number of in-pile and out-of-pile experiments were used to assess the system. The main differences versus codes like FRAP-T with similar applications are (1) an open-ended modular code organisation, (2) the availability of modules of different sophistication levels for the same physical processes, and (3) a preference for simple models wherever possible. The first feature makes SSYST a very flexible tool, easily adapted to changing requirements; the second enables the user to select computational models adequate to the significance of the physical process. Together with the third feature, this leads to short execution times: the analysis of transient rod behavior under LOCA boundary conditions, for example, takes 2 min of CPU time (IBM-3033), so extensive parametric studies become possible.

  5. Code of Conduct for Gas Marketers : rule made under part 3 of the Ontario Energy Board Act, 1998

    International Nuclear Information System (INIS)

    1999-01-01

    Text of the code of conduct for gas marketers in Ontario is presented. This code sets the minimum standards under which a gas marketer may sell or offer to sell gas to a low-volume consumer, or act as an agent or broker with respect to the sale of gas. The document describes the standards and principles regarding: (1) fair marketing practices, (2) identification, (3) information to be maintained by a gas marketer, (4) confidentiality of consumer information, (5) conditions in offers, (6) contracts, (7) contract renewals, (8) assignment, sale and transfer contracts, (9) independent arms-length consumer complaints resolution process, and (10) penalties for breach of this code

  6. Code of Conduct for Gas Marketers : rule made under part 3 of the Ontario Energy Board Act, 1998

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-02

    Text of the code of conduct for gas marketers in Ontario is presented. This code sets the minimum standards under which a gas marketer may sell or offer to sell gas to a low-volume consumer, or act as an agent or broker with respect to the sale of gas. The document describes the standards and principles regarding: (1) fair marketing practices, (2) identification, (3) information to be maintained by a gas marketer, (4) confidentiality of consumer information, (5) conditions in offers, (6) contracts, (7) contract renewals, (8) assignment, sale and transfer contracts, (9) independent arms-length consumer complaints resolution process, and (10) penalties for breach of this code.

  7. 78 FR 18321 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2013-03-26

    The Committee Action Hearings are scheduled for April 21-30, 2013 in Dallas, TX at the Sheraton Dallas Hotel ... Convention Center. ADDRESSES: Committee Action Hearings in Dallas, TX at the Sheraton Dallas Hotel and the ... Codes covered by the update process include the International Fuel Gas Code, the International Green Construction Code, the International Mechanical Code, and the ICC Performance ...

  8. CodeRocket™: Accelerating the Software Development Process

    Science.gov (United States)

    Parkes, S.; Ramsay, C.

    2009-05-01

    CodeRocket™ is an innovative software documentation and design support tool that reduces the cost and time to market of software development projects. It provides detailed design documentation automatically and keeps it fully synchronised with the program code, both during development and subsequently when changes are required. Legacy code can be imported into CodeRocket™ and extensive detailed-design documentation extracted automatically. CodeRocket™ integrates seamlessly with the way software engineers work in practice, making adoption easy. This paper explains the rationale for CodeRocket™, describes the key features of the tool, and finally considers its major benefits for different stakeholders.

  9. Calculation code of mass and heat transfer in a pulsed column for Purex process

    International Nuclear Information System (INIS)

    Tsukada, Takeshi; Takahashi, Keiki

    1993-01-01

    A calculation code for extraction behavior analysis in a pulsed column employed in the extraction process of a reprocessing plant was developed. This code was also combined with our previously developed calculation code for axial temperature profiles in a pulsed column. The one-dimensional dispersion model was employed for both the extraction behavior analysis and the axial temperature profile analysis, using reported values of the fluid characteristics coefficients, the transfer coefficients and the diffusivities in the pulsed column. The calculated steady-state concentration profiles of HNO3, U and Pu are in good agreement with reported experimental results. The concentration and temperature profiles were calculated under operating conditions that induce abnormal U extraction behavior, i.e., the U extraction zone moving to the bottom of the column. Though there are slight differences between the calculated and experimental values, it appears that our code can be applied to simulation under normal operating conditions and relatively slow transient conditions. Pu accumulation phenomena were also analyzed with this code, and the accumulation tendency is similar to reported analysis results. (author)
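    The one-dimensional dispersion model underlying such codes is, per solute and phase, u dc/dz = D d²c/dz² - k_L·a·(c - c_eq). A steady-state finite-difference sketch with invented coefficients (a single solute in one phase; the real code couples two phases, HNO3, U, Pu and temperature):

```python
import numpy as np

n, height = 200, 2.0                         # nodes, column height (m)
z = np.linspace(0.0, height, n)
dz = z[1] - z[0]
u, D, kla, c_eq, c_in = 1e-3, 1e-4, 5e-3, 0.2, 1.0   # assumed coefficients

# Discretize D*c'' - u*c' - kla*c = -kla*c_eq with central differences
A = np.zeros((n, n))
b = np.full(n, -kla * c_eq)
for i in range(1, n - 1):
    A[i, i - 1] = D / dz**2 + u / (2.0 * dz)
    A[i, i]     = -2.0 * D / dz**2 - kla
    A[i, i + 1] = D / dz**2 - u / (2.0 * dz)
A[0, 0] = 1.0; b[0] = c_in                      # inlet: fixed concentration
A[-1, -1], A[-1, -2] = 1.0, -1.0; b[-1] = 0.0   # outlet: zero gradient
c = np.linalg.solve(A, b)                       # axial concentration profile
```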

  10. Coded ultrasound for blood flow estimation using subband processing

    DEFF Research Database (Denmark)

    Gran, Fredrik; Udesen, Jesper; Nielsen, Michael bachmann

    2007-01-01

    This paper investigates the use of coded excitation for blood flow estimation in medical ultrasound. Broadband coded signals are used to increase SNR, followed by sub-band processing. The received broadband signal is filtered using a set of narrow-band filters. Estimating the velocity in each of the bands and averaging the results yields better performance compared to what would be possible when transmitting a narrow-band pulse. The study was carried out using an experimental ultrasound scanner and a commercial linear array 7 MHz transducer. A circulating flow rig was scanned with a beam-to-flow angle of 60 degrees. The flow in the rig was laminar and had a parabolic flow profile with a peak velocity of 0.09 m/s. The mean relative standard deviation of the reference method using an eight-cycle excitation pulse at 7 MHz was 0.544% compared to the peak velocity in the rig. Two Barker codes were tested, with lengths of 5 and 13 bits; the corresponding mean relative standard deviations were 0.367% and 0.310%, respectively...

  11. Calculation code PULCO for Purex process in pulsed column

    International Nuclear Information System (INIS)

    Gonda, Kozo; Matsuda, Teruo

    1982-03-01

    The calculation code PULCO, which can simulate the Purex process using a pulsed column as an extractor, has been developed. PULCO is based on the fundamental concept that mass transfer within a pulsed column occurs through the interface between liquid drops and the continuous-phase fluid; it differs from conventional codes in that the various phenomena actually occurring in a pulsed column, such as the generation of liquid drops, their rising and falling, and the coalescence of liquid drops, are exactly reflected and can be correctly simulated. In PULCO, actually measured values of the fundamental quantities representing the extraction behavior of liquid drops in a pulsed column are incorporated: the mass transfer coefficient of each component, the diameter and velocity of liquid drops in the column, the holdup of the dispersed phase, and the axial turbulent diffusion coefficient. Verification of the results calculated with PULCO was carried out by installing a pulsed column of 50 mm inside diameter and 2 m length with 40 plate stages in a glove box for an unirradiated uranium-plutonium mixed system. The results of the calculation and the test were in good agreement, and the validity of PULCO was confirmed. (Kako, I.)

  12. Coordinate-free sensorimotor processing: computing with population codes.

    Science.gov (United States)

    Morasso, Pietro G.; Sanguineti, Vittorio; Frisone, Francesco; Perico, Luca

    1998-10-01

    The purpose of the study is to outline a computational architecture for the intelligent processing of sensorimotor patterns. The focus is on the nature of the internal representations of the outside world which are necessary for planning and other goal-oriented functions. A model of cortical map dynamics and self-organization is proposed that integrates a number of concepts and methods partly explored in the field. The novelty and biological plausibility are related to the global architecture, which allows one to deal with sensorimotor patterns in a coordinate-free way, using population codes as distributed internal representations of external variables and the coupled dynamics of cortical maps as a general tool for trajectory formation. The basic computational features of the model are demonstrated in the case of articulatory speech synthesis, and some of the metric properties are evaluated by means of simple simulation studies.
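    A coordinate-free population code in the spirit of the model: a direction is encoded in the activity of units with broad tuning curves and read out as the population vector, with no explicit coordinate frame. All numbers below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
prefs = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)  # preferred directions
theta = 1.2                                                # encoded value (rad)

rates = np.exp(2.0 * np.cos(prefs - theta))      # von Mises tuning curves
rates += rng.normal(0.0, 0.1, rates.shape)       # noisy population activity

decoded = np.angle(np.sum(rates * np.exp(1j * prefs)))     # population vector
print(decoded)   # ~1.2
```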

  13. 77 FR 38597 - Multistakeholder Process To Develop Consumer Data Privacy Code of Conduct Concerning Mobile...

    Science.gov (United States)

    2012-06-28

    ... Process To Develop Consumer Data Privacy Code of Conduct Concerning Mobile Application Transparency AGENCY... multistakeholder processes to develop legally enforceable codes of conduct that specify how the Consumer Privacy... of the first multistakeholder process is to develop a code of conduct to provide transparency in how...

  14. Locating protein-coding sequences under selection for additional, overlapping functions in 29 mammalian genomes

    DEFF Research Database (Denmark)

    Lin, Michael F; Kheradpour, Pouya; Washietl, Stefan

    2011-01-01

    synonymous constraint in these regions reflects selection on overlapping functional elements including splicing regulatory elements, dual-coding genes, RNA secondary structures, microRNA target sites, and developmental enhancers. Our results show that overlapping functional elements are common in mammalian......The degeneracy of the genetic code allows protein-coding DNA and RNA sequences to simultaneously encode additional, overlapping functional elements. A sequence in which both protein-coding and additional overlapping functions have evolved under purifying selection should show increased evolutionary...

  15. A qualitative study of DRG coding practice in hospitals under the Thai Universal Coverage Scheme

    Directory of Open Access Journals (Sweden)

    Winch Peter J

    2011-04-01

    Full Text Available Abstract Background In the Thai Universal Coverage health insurance scheme, hospital providers are paid for their inpatient care using Diagnosis Related Group-based retrospective payment, for which the quality of the diagnosis and procedure codes is crucial. However, there has been limited understanding of which health care professions are involved and how the diagnosis and procedure coding is actually done within hospital settings. The objective of this study is to detail hospital coding structure and process, and to describe the roles of key hospital staff, and other related internal dynamics in Thai hospitals that affect the quality of data submitted for inpatient care reimbursement. Methods Research involved qualitative semi-structured interviews with 43 participants at 10 hospitals chosen to represent a range of hospital sizes (small/medium/large), locations (urban/rural), and types (public/private). Results Hospital coding practice has structural and process components. While the structural component includes human resources, hospital committees, and information technology infrastructure, the process component comprises all activities from patient discharge to submission of the diagnosis and procedure codes. At least eight health care professional disciplines are involved in the coding process, which comprises seven major steps, each of which involves different hospital staff: 1) Discharge Summarization, 2) Completeness Checking, 3) Diagnosis and Procedure Coding, 4) Code Checking, 5) Relative Weight Challenging, 6) Coding Report, and 7) Internal Audit. The hospital coding practice can be affected by at least five main factors: 1) Internal Dynamics, 2) Management Context, 3) Financial Dependency, 4) Resource and Capacity, and 5) External Factors. Conclusions Hospital coding practice comprises both structural and process components, involves many health care professional disciplines, and is greatly varied across hospitals as a result of five main factors.

  16. A qualitative study of DRG coding practice in hospitals under the Thai Universal Coverage scheme.

    Science.gov (United States)

    Pongpirul, Krit; Walker, Damian G; Winch, Peter J; Robinson, Courtland

    2011-04-08

    In the Thai Universal Coverage health insurance scheme, hospital providers are paid for their inpatient care using Diagnosis Related Group-based retrospective payment, for which the quality of the diagnosis and procedure codes is crucial. However, there has been limited understanding of which health care professions are involved and how the diagnosis and procedure coding is actually done within hospital settings. The objective of this study is to detail hospital coding structure and process, and to describe the roles of key hospital staff, and other related internal dynamics in Thai hospitals that affect the quality of data submitted for inpatient care reimbursement. Research involved qualitative semi-structured interviews with 43 participants at 10 hospitals chosen to represent a range of hospital sizes (small/medium/large), locations (urban/rural), and types (public/private). Hospital coding practice has structural and process components. While the structural component includes human resources, hospital committees, and information technology infrastructure, the process component comprises all activities from patient discharge to submission of the diagnosis and procedure codes. At least eight health care professional disciplines are involved in the coding process, which comprises seven major steps, each of which involves different hospital staff: 1) Discharge Summarization, 2) Completeness Checking, 3) Diagnosis and Procedure Coding, 4) Code Checking, 5) Relative Weight Challenging, 6) Coding Report, and 7) Internal Audit. The hospital coding practice can be affected by at least five main factors: 1) Internal Dynamics, 2) Management Context, 3) Financial Dependency, 4) Resource and Capacity, and 5) External Factors. Hospital coding practice comprises both structural and process components, involves many health care professional disciplines, and is greatly varied across hospitals as a result of five main factors.

  17. Effects of Charter Party Arbitration Clauses Under the New Turkish Commercial Code

    Directory of Open Access Journals (Sweden)

    Didem Algantürk Light

    2016-06-01

    Full Text Available The new Turkish Commercial Code became effective on 1 July 2012. The new Code covers bareboat charterparties, time charterparties, voyage charterparties, and contracts of carriage by sea, which are defined in general terms. On the other hand, Turkey is one of the contracting States of the Convention on the Recognition and Enforcement of Foreign Arbitral Awards (New York Convention, 1958) as well as of the European Convention on International Commercial Arbitration, 1961. In this study, we examine the effects of charter party arbitration clauses under the new Turkish Commercial Code and discuss the characteristics such clauses must have to be valid under Turkish law.

  18. Single integrated device for optical CDMA code processing in dual-code environment.

    Science.gov (United States)

    Huang, Yue-Kai; Glesk, Ivan; Greiner, Christoph M; Iazkov, Dmitri; Mossberg, Thomas W; Wang, Ting; Prucnal, Paul R

    2007-06-11

    We report on the design, fabrication and performance of a matching integrated optical CDMA encoder-decoder pair based on holographic Bragg reflector technology. Simultaneous encoding/decoding operation of two multiple wavelength-hopping time-spreading codes was successfully demonstrated and shown to support two error-free OCDMA links at OC-24. A double-pass scheme was employed in the devices to enable the use of longer code length.

  19. Characteristics of a Dairy Process under Uncertainty

    DEFF Research Database (Denmark)

    Cheng, Hongyuan; Friis, Alan

    2007-01-01

    In this work, the characteristics of a dairy production process under diverse product uncertainties are investigated through a process simulation. The flexibility analysis method of Grossmann and his co-workers (Swaney and Grossmann, 1985) is applied through a process simulation tool, PRO/II. A new...

  20. MHD code using multi graphical processing units: SMAUG+

    Science.gov (United States)

    Gyenge, N.; Griffiths, M. K.; Erdélyi, R.

    2018-01-01

    This paper introduces the Sheffield Magnetohydrodynamics Algorithm Using GPUs (SMAUG+), an advanced numerical code for solving magnetohydrodynamic (MHD) problems using multi-GPU systems. Multi-GPU systems facilitate the development of accelerated codes and enable us to investigate larger model sizes and/or more detailed computational domain resolutions. This is a significant advancement over the parent single-GPU MHD code, SMAUG (Griffiths et al., 2015). Here, we demonstrate the validity of the SMAUG+ code, describe the parallelisation techniques and investigate performance benchmarks. The initial configuration of the Orszag-Tang vortex simulations is distributed among 4, 16, 64 and 100 GPUs. Furthermore, different simulation box resolutions are applied: 1000 × 1000, 2044 × 2044, 4000 × 4000 and 8000 × 8000. We also tested the code with the Brio-Wu shock tube simulations with a model size of 800 employing up to 10 GPUs. Based on the test results, we observed speed-ups and slow-downs, depending on the granularity and the communication overhead of certain parallel tasks. The main aim of the code development is to provide a massively parallel code without the memory limitation of a single GPU. By using our code, the applied model size can be significantly increased. We demonstrate that we are able to successfully compute numerically valid and large 2D MHD problems.
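    The reported speed-ups and slow-downs follow from the balance between per-subdomain compute and halo communication. The toy strong-scaling model below (our addition; the timing constants are invented placeholders) captures that trade-off for a 2D domain split across a square grid of GPUs.

    ```python
    # Toy strong-scaling model for a 2-D domain decomposition: per-GPU work
    # scales with subdomain area, halo exchange with its perimeter, so small
    # subdomains become communication-bound.
    import math

    def step_time(nx, ny, n_gpus, t_cell=1e-8, t_halo=4e-7, t_lat=1e-4):
        side = int(math.sqrt(n_gpus))                    # assume a square GPU grid
        sub_nx, sub_ny = nx // side, ny // side          # subdomain per GPU
        compute = sub_nx * sub_ny * t_cell               # interior cell updates
        comm = 4 * (t_lat + (sub_nx + sub_ny) * t_halo)  # four halo exchanges
        return compute + comm

    for g in (4, 16, 64, 100):
        speedup = step_time(8000, 8000, 1) / step_time(8000, 8000, g)
        print(f"{g:4d} GPUs: speed-up ~ {speedup:.1f}")
    ```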

  1. UNICOS CPC6: automated code generation for process control applications

    International Nuclear Information System (INIS)

    Fernandez Adiego, B.; Blanco Vinuela, E.; Prieto Barreiro, I.

    2012-01-01

    The Continuous Process Control package (CPC) is one of the components of the CERN Unified Industrial Control System framework (UNICOS). As a part of this framework, UNICOS-CPC provides a well defined library of device types, a methodology and a set of tools to design and implement industrial control applications. The new CPC version uses the software factory UNICOS Application Builder (UAB) to develop CPC applications. The CPC component is composed of several platform-oriented plug-ins (PLCs and SCADA) describing the structure and the format of the generated code. It uses a resource package where both the library of device types and the generated file syntax are defined. The UAB core is the generic part of this software: it discovers and dynamically calls the different plug-ins and provides the required common services. In this paper the UNICOS CPC6 package is introduced. It is composed of several plug-ins: the Instance generator and the Logic generator for both Siemens and Schneider PLCs, the SCADA generator (based on PVSS) and the CPC wizard, a dedicated plug-in created to provide the user with a friendly GUI (Graphical User Interface). A tool called UAB Bootstrap manages the different UAB components, like CPC, and their dependencies on the resource packages. This tool guides the control system developer during the installation, update and execution of the UAB components. (authors)

  2. UNICOS CPC6: Automated Code Generation for Process Control Applications

    CERN Document Server

    Fernandez Adiego, B; Prieto Barreiro, I

    2011-01-01

    The Continuous Process Control package (CPC) is one of the components of the CERN Unified Industrial Control System framework (UNICOS) [1]. As a part of this framework, UNICOS-CPC provides a well defined library of device types, a methodology and a set of tools to design and implement industrial control applications. The new CPC version uses the software factory UNICOS Application Builder (UAB) [2] to develop CPC applications. The CPC component is composed of several platform-oriented plug-ins (PLCs and SCADA) describing the structure and the format of the generated code. It uses a resource package where both the library of device types and the generated file syntax are defined. The UAB core is the generic part of this software: it discovers and dynamically calls the different plug-ins and provides the required common services. In this paper the UNICOS CPC6 package is introduced. It is composed of several plug-ins: the Instance generator and the Logic generator for both Siemens and Schneider PLCs, the SCADA g...

  3. The NAICS Code Selection Process And Small Business Participation

    Science.gov (United States)

    2016-03-01

    particular requirement. Companies can identify with multiple NAICS codes, and any one business can potentially be considered small in one industry...aside an acquisition to any type of small business in general (i.e., companies that meet the SBA-published industry small business size standard), or...that appropriate [NAICS] code and related small business size standard and include them in solicitations above the micro-purchase threshold. If

  4. 75 FR 19944 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2010-04-16

    ... hearing is May 14-22, 2010 in Dallas, Texas at the Sheraton Dallas Hotel. Completion of this cycle results... for Residential Construction in High Wind Areas. ICC 700: National Green Building Standard. The... recently completed the drafting phase in the development of the International Green Construction Code which...

  5. PERFORMANCE ANALYSIS OF OPTICAL CDMA SYSTEM USING VC CODE FAMILY UNDER VARIOUS OPTICAL PARAMETERS

    Directory of Open Access Journals (Sweden)

    HASSAN YOUSIF AHMED

    2012-06-01

    Full Text Available The intent of this paper is to study the performance of spectral-amplitude coding optical code-division multiple-access (OCDMA) systems using the Vector Combinatorial (VC) code under various optical parameters. This code can be constructed algebraically, based on Euclidean vectors, for any positive integer number. One of the important properties of this code is that the maximum cross-correlation is always one, which means that multi-user interference (MUI) and phase-induced intensity noise are reduced. Transmitter and receiver structures based on unchirped fiber Bragg gratings (FBGs) using the VC code, and taking into account the effects of intensity, shot and thermal noise sources, are demonstrated. The impact of fiber distance effects on bit error rate (BER) is reported using a commercial optical systems simulator, VPI(TM). The VC code is compared mathematically with reported codes which use similar techniques. We analyzed and characterized the fiber link, received power, BER and channel spacing. The performance and optimization of the VC code in a SAC-OCDMA system are reported. By comparing the theoretical and simulation results taken from VPI(TM), we have demonstrated that, for a high number of users, the effective power source is adequate even at higher data rates when the VC code is used. Also it is found that as the channel spacing goes from very narrow to wider, the BER decreases; the best performance occurs at a spacing bandwidth between 0.8 and 1 nm. We have shown that the SAC system utilizing the VC code significantly improves the performance compared with the reported codes.
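    The defining property quoted above, a maximum cross-correlation of one, is easy to check numerically for any candidate code family. The sketch below (our illustration with a made-up weight-3 family, not the paper's algebraic construction) does exactly that.

    ```python
    # Check the SAC-OCDMA property that every pair of codewords overlaps in
    # at most one chip position, i.e. a maximum cross-correlation of one.
    import numpy as np
    from itertools import combinations

    def max_cross_correlation(codes):
        """Largest pairwise inner product over a family of 0/1 codewords."""
        return max(int(np.dot(a, b)) for a, b in combinations(codes, 2))

    # A hypothetical 3-user family of weight-3 codewords, for illustration only.
    family = np.array([
        [1, 1, 0, 1, 0, 0, 0, 0, 0],
        [0, 0, 1, 0, 1, 1, 0, 0, 0],
        [1, 0, 0, 0, 0, 1, 0, 1, 0],
    ])
    print("max cross-correlation:", max_cross_correlation(family))  # -> 1
    ```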

  6. Adaptive under relaxation factor of MATRA code for the efficient whole core analysis

    International Nuclear Information System (INIS)

    Kwon, Hyuk; Kim, S. J.; Seo, K. W.; Hwang, D. H.

    2013-01-01

    Such nonlinearities are handled in the MATRA code using outer iteration with a Picard scheme. The Picard scheme involves successive updating of the coefficient matrix based on previously calculated values. The scheme is a simple and effective method for nonlinear problems, but its effectiveness greatly depends on the under-relaxation capability. Accuracy and speed of calculation depend very sensitively on the under-relaxation factor used in the outer iteration that updates the axial mass flow from the continuity equation. The under-relaxation factor in MATRA is generally a fixed, empirically determined value. Adapting the under-relaxation factor during the outer iteration is expected to improve the calculation effectiveness of the MATRA code compared with calculation using a fixed under-relaxation factor. The present study describes the implementation of adaptive under-relaxation within the subchannel code MATRA. Picard iterations with adaptive under-relaxation can accelerate the convergence for mass conservation in the subchannel code MATRA. The most efficient approach for adaptive under-relaxation appears to be very problem dependent.
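    One common way to adapt the factor is Aitken-style relaxation, which rescales the factor from successive fixed-point residuals. The generic sketch below is our illustration of that idea, not necessarily MATRA's actual scheme.

    ```python
    # Picard (fixed-point) iteration with Aitken-style adaptive under-relaxation.
    import numpy as np

    def picard(g, x0, omega=0.5, tol=1e-10, max_iter=200):
        x = np.asarray(x0, dtype=float)
        r_prev = None
        for it in range(max_iter):
            r = g(x) - x                      # fixed-point residual
            if np.linalg.norm(r) < tol:
                return x, it
            if r_prev is not None:            # Aitken update of the factor
                dr = r - r_prev
                omega = -omega * float(np.dot(r_prev, dr)) / float(np.dot(dr, dr))
            x = x + omega * r                 # under-relaxed update
            r_prev = r
        return x, max_iter

    # Example: solve x = cos(x) componentwise.
    sol, iters = picard(np.cos, [0.0, 1.0])
    print(sol, iters)
    ```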

  7. Coded excitation and sub-band processing for blood velocity estimation in medical ultrasound

    DEFF Research Database (Denmark)

    Gran, Fredrik; Udesen, Jesper; Jensen, Jørgen Arendt

    2007-01-01

    This paper investigates the use of broadband coded excitation and subband processing for blood velocity estimation in medical ultrasound. In conventional blood velocity estimation a long (narrow-band) pulse is emitted and the blood velocity is estimated using an auto-correlation based approach...... rate can be increased. To increase the SNR for the broad-band excitation waveforms, coding is proposed. Three different coding methods are investigated: nonlinear frequency modulation, complementary (Golay) codes, and Barker codes. Code design is described for the three different methods as well...
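    Of the three coding methods, complementary (Golay) codes have the distinctive property that the range sidelobes of the pair's autocorrelations cancel exactly when summed; a minimal check of this property (our addition, not from the paper) is shown below.

    ```python
    # Golay complementary pair: summed autocorrelations have zero sidelobes.
    import numpy as np

    a = np.array([1, 1, 1, -1])   # a length-4 Golay pair
    b = np.array([1, 1, -1, 1])

    ra = np.correlate(a, a, mode="full")
    rb = np.correlate(b, b, mode="full")
    print("sum:", ra + rb)        # -> [0 0 0 8 0 0 0]: all sidelobes cancel
    ```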

  8. Mapping Saldana's Coding Methods onto the Literature Review Process

    Science.gov (United States)

    Onwuegbuzie, Anthony J.; Frels, Rebecca K.; Hwang, Eunjin

    2016-01-01

    Onwuegbuzie and Frels (2014) provided a step-by-step guide illustrating how discourse analysis can be used to analyze literature. However, more works of this type are needed to address the way that counselor researchers conduct literature reviews. Therefore, we present a typology for coding and analyzing information extracted for literature…

  9. Supporting chemical process design under uncertainty

    Directory of Open Access Journals (Sweden)

    A. Wechsung

    2010-09-01

    Full Text Available A major challenge in chemical process design is to make design decisions based on partly incomplete or imperfect design input data. Still, process engineers are expected to design safe, dependable and cost-efficient processes under these conditions. The complexity of typical process models limits intuitive engineering estimates to judge the impact of uncertain parameters on the proposed design. In this work, an approach to quantify the effect of uncertainty on a process design in order to enhance comparisons among different designs is presented. To facilitate automation, a novel relaxation-based heuristic to differentiate between numerical and physical infeasibility when simulations do not converge is introduced. It is shown how this methodology yields more details about limitations of a studied process design.

  10. The status of simulation codes for extraction process using mixer-settler

    International Nuclear Information System (INIS)

    Byeon, Kee Hoh; Lee, Eil Hee; Kwon, Seong Gil; Kim, Kwang Wook; Yang, Han Beom; Chung, Dong Yong; Lim, Jae Kwan; Shin, Hyun Kyoo; Kim, Soo Ho

    1999-10-01

    We have studied and analyzed mixer-settler simulation codes, namely three versions of the SEPHIS series, PUBG, and EXTRA.M, the most recently developed code. All of these are sufficiently satisfactory codes in terms of process/device modeling, but accurate distribution data and chemical reaction mechanisms need to be formulated to ensure accuracy and reliability. As regards application to the group separation process, the mixer-settler models of these codes present no problems, but the accumulation and formulation of partitioning and reaction equilibrium data for the chemical elements used in the group separation process are very important. (author)

  11. The "periodic table" of the genetic code: A new way to look at the code and the decoding process.

    Science.gov (United States)

    Komar, Anton A

    2016-01-01

    Henri Grosjean and Eric Westhof recently presented an information-rich, alternative view of the genetic code, which takes into account current knowledge of the decoding process, including the complex nature of the interactions between mRNA, tRNA and rRNA that take place during protein synthesis on the ribosome, and which also better reflects the evolution of the code. The new asymmetrical circular genetic code has a number of advantages over the traditional codon table and the previous circular diagrams (with a symmetrical/clockwise arrangement of the U, C, A, G bases). Most importantly, all sequence co-variances can be visualized and explained based on the internal logic of the thermodynamics of codon-anticodon interactions.
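    For readers unfamiliar with the degeneracy that such diagrams encode, the short sketch below (our addition) builds the standard codon table and counts how many codons map to each amino acid.

    ```python
    # Standard codon table built from the canonical TCAG ordering; several
    # codons map to the same amino acid (degeneracy of the genetic code).
    from collections import Counter
    from itertools import product

    bases = "TCAG"
    amino = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
    codon_table = dict(zip(("".join(c) for c in product(bases, repeat=3)), amino))

    degeneracy = Counter(codon_table.values())
    print(codon_table["ATG"])        # 'M' (start codon, methionine)
    print(degeneracy["L"])           # 6 codons encode leucine
    ```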

  12. Annotating long intergenic non-coding RNAs under artificial selection during chicken domestication.

    Science.gov (United States)

    Wang, Yun-Mei; Xu, Hai-Bo; Wang, Ming-Shan; Otecko, Newton Otieno; Ye, Ling-Qun; Wu, Dong-Dong; Zhang, Ya-Ping

    2017-08-15

    Numerous biological functions of long intergenic non-coding RNAs (lincRNAs) have been identified. However, the contribution of lincRNAs to the domestication process has remained elusive. Following domestication from their wild ancestors, animals display substantial changes in many phenotypic traits. Therefore, it is possible that diverse molecular drivers play important roles in this process. We analyzed 821 transcriptomes in this study and annotated 4754 lincRNA genes in the chicken genome. Our population genomic analysis indicates that 419 lincRNAs potentially evolved during artificial selection related to the domestication of chicken, while a comparative transcriptomic analysis identified 68 lincRNAs that were differentially expressed under different conditions. We also found 47 lincRNAs linked to special phenotypes. Our study provides a comprehensive view of the genome-wide landscape of lincRNAs in chicken. This will promote a better understanding of the roles of lincRNAs in domestication, and the genetic mechanisms associated with the artificial selection of domestic animals.

  13. UNR. A code for processing unresolved resonance data for MCNP

    International Nuclear Information System (INIS)

    Hogenbirk, A.

    1994-09-01

    In neutron transport problems the correct treatment of self-shielding is important for those nuclei present in large concentrations. Monte Carlo calculations using continuous-energy cross-section data, such as calculations with the code MCNP, offer the advantage that neutron transport is calculated in a very accurate way. Self-shielding in the resolved resonance region is taken into account exactly in MCNP. However, self-shielding in the unresolved resonance region cannot be taken into account by MCNP, although its effect may be important in many applications. In this report a description is given of the computer code UNR. With this code, problem-dependent cross-section libraries can be produced for MCNP. These libraries contain self-shielded cross-section data in the unresolved resonance range, produced by the NJOY module UNRESR. It is noted that the treatment of resonance self-shielding presented in this report is approximate. However, the current version of MCNP does not allow the use of probability tables, which would be a general solution. (orig.)
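    The approximate treatment referred to is, in spirit, the background (Bondarenko) method. The sketch below (our illustration with synthetic point data, not the UNRESR algorithm itself) shows how a flux weighted by 1/(sigma_t + sigma_0) shields the effective cross section, recovering the infinite-dilution average at large background cross section.

    ```python
    # Bondarenko-style self-shielding: flux-weight the cross section with an
    # approximate narrow-resonance flux ~ 1 / (sigma_t(E) + sigma_0).
    import numpy as np

    def shielded_xs(sigma, sigma_total, sigma_0):
        """Flux-weighted effective cross section for background XS sigma_0."""
        flux = 1.0 / (sigma_total + sigma_0)
        return np.sum(sigma * flux) / np.sum(flux)

    # Hypothetical point data across a resonance-like fluctuation.
    E = np.linspace(0.0, 1.0, 1001)
    sigma_t = 10.0 + 90.0 * np.exp(-((E - 0.5) / 0.02) ** 2)

    for s0 in (1.0, 10.0, 1e4):          # dilute limit reached at large sigma_0
        print(f"sigma_0={s0:>8.1f}: sigma_eff={shielded_xs(sigma_t, sigma_t, s0):.2f}")
    ```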

  14. Spike-coding mechanisms of cerebellar temporal processing in classical conditioning and voluntary movements.

    Science.gov (United States)

    Yamaguchi, Kenji; Sakurai, Yoshio

    2014-10-01

    Time is a fundamental and critical factor in daily life. Millisecond timing, the temporal processing underlying speaking, dancing, and other activities, is reported to rely on the cerebellum. In this review, we discuss the cerebellar spike-coding mechanisms for temporal processing. Although the contribution of the cerebellum to both classical conditioning and voluntary movements is well known, the difference between the mechanisms for temporal processing in classical conditioning and in voluntary movements is not clear. Therefore, we review the evidence of cerebellar temporal processing in studies of classical conditioning and voluntary movements and report the similarities and differences between them. From studies that used tasks able to change some of the temporal properties (e.g., the duration of interstimulus intervals) while keeping the movements identical, we conclude that classical conditioning and voluntary movements may share a common spike-coding mechanism, because simple spikes in Purkinje cells decrease at predicted times for responses regardless of the intervals between responses or stimulation.

  15. The current status of cyanobacterial nomenclature under the "prokaryotic" and the "botanical" code.

    Science.gov (United States)

    Oren, Aharon; Ventura, Stefano

    2017-10-01

    Cyanobacterial taxonomy developed in the botanical world because Cyanobacteria/Cyanophyta have traditionally been identified as algae. However, they possess a prokaryotic cell structure, and phylogenetically they belong to the Bacteria. This caused nomenclature problems as the provisions of the International Code of Nomenclature for algae, fungi, and plants (ICN; the "Botanical Code") differ from those of the International Code of Nomenclature of Prokaryotes (ICNP; the "Prokaryotic Code"). While the ICN recognises names validly published under the ICNP, Article 45(1) of the ICN has not yet been reciprocated in the ICNP. Different solutions have been proposed to solve the current problems. In 2012 a Special Committee on the harmonisation of the nomenclature of Cyanobacteria was appointed, but its activity has been minimal. Two opposing proposals to regulate cyanobacterial nomenclature were recently submitted, one calling for deletion of the cyanobacteria from the groups of organisms whose nomenclature is regulated by the ICNP, the second to consistently apply the rules of the ICNP to all cyanobacteria. Following a general overview of the current status of cyanobacterial nomenclature under the two codes we present five case studies of genera for which nomenclatural aspects have been discussed in recent years: Microcystis, Planktothrix, Halothece, Gloeobacter and Nostoc.

  16. P-Code-Enhanced Encryption-Mode Processing of GPS Signals

    Science.gov (United States)

    Young, Lawrence; Meehan, Thomas; Thomas, Jess B.

    2003-01-01

    A method of processing signals in a Global Positioning System (GPS) receiver has been invented to enable the receiver to recover some of the information that is otherwise lost when GPS signals are encrypted at the transmitters. The need for this method arises because, at the option of the military, the precision GPS code (P-code) is sometimes encrypted by a secret binary code, denoted the A-code. Authorized users can recover the full signal with knowledge of the A-code. However, even in the absence of knowledge of the A-code, one can track the encrypted signal by use of an estimate of the A-code. The present invention is a method of making and using such an estimate. In comparison with prior such methods, this method makes it possible to recover more of the lost information and obtain greater accuracy.

  17. A Test of Two Alternative Cognitive Processing Models: Learning Styles and Dual Coding

    Science.gov (United States)

    Cuevas, Joshua; Dawson, Bryan L.

    2018-01-01

    This study tested two cognitive models, learning styles and dual coding, which make contradictory predictions about how learners process and retain visual and auditory information. Learning styles-based instructional practices are common in educational environments despite a questionable research base, while the use of dual coding is less…

  18. A Coding System for Qualitative Studies of the Information-Seeking Process in Computer Science Research

    Science.gov (United States)

    Moral, Cristian; de Antonio, Angelica; Ferre, Xavier; Lara, Graciela

    2015-01-01

    Introduction: In this article we propose a qualitative analysis tool--a coding system--that can support the formalisation of the information-seeking process in a specific field: research in computer science. Method: In order to elaborate the coding system, we have conducted a set of qualitative studies, more specifically a focus group and some…

  19. Development of the Log-in Process and the Operation Process for the VHTR-SI Process Dynamic Simulation Code

    International Nuclear Information System (INIS)

    Chang, Jiwoon; Shin, Youngjoon; Kim, Jihwan; Lee, Kiyoung; Lee, Wonjae; Chang, Jonghwa; Youn, Cheung

    2009-01-01

    The VHTR-SI process is a hydrogen production technique using sulfur and iodine. The SI process uses high-temperature (about 950 °C) He gas, the reactor coolant, as its energy source. The Korea Atomic Energy Research Institute Dynamic Simulation Code (KAERI DySCo) is an integrated application software that simulates the dynamic behavior of the VHTR-SI process. Dynamic modeling is used to express and model the behavior of a software system over time. It deals with the control flow of the system, the interaction of objects, and the order of actions, in terms of time and transitions, by using a sequence diagram and a state transition diagram. In this paper, we present a user log-in process and an operation process for KAERI DySCo by using a sequence diagram and a state transition diagram.

  20. The TOUGH codes - a family of simulation tools for multiphase flowand transport processes in permeable media

    Energy Technology Data Exchange (ETDEWEB)

    Pruess, Karsten

    2003-08-08

    Numerical simulation has become a widely practiced and accepted technique for studying flow and transport processes in the vadose zone and other subsurface flow systems. This article discusses a suite of codes, developed primarily at Lawrence Berkeley National Laboratory (LBNL), with the capability to model multiphase flows with phase change. We summarize history and goals in the development of the TOUGH codes, and present the governing equations for multiphase, multicomponent flow. Special emphasis is given to space discretization by means of integral finite differences (IFD). Issues of code implementation and architecture are addressed, as well as code applications, maintenance, and future developments.

  1. ARC Code TI: Block-GP: Scalable Gaussian Process Regression

    Data.gov (United States)

    National Aeronautics and Space Administration — Block GP is a Gaussian Process regression framework for multimodal data that can be an order of magnitude more scalable than existing state-of-the-art nonlinear...
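    As background, the sketch below (our addition) shows plain GP regression with an RBF kernel, the building block that Block GP makes scalable; it does not reproduce Block GP itself.

    ```python
    # Minimal Gaussian Process regression: posterior mean with an RBF kernel.
    import numpy as np

    def rbf(a, b, length=0.3, var=1.0):
        d = a[:, None] - b[None, :]
        return var * np.exp(-0.5 * (d / length) ** 2)

    rng = np.random.default_rng(0)
    x = rng.uniform(0, 1, 40)                          # training inputs
    y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(40)
    xs = np.linspace(0, 1, 200)                        # test inputs

    K = rbf(x, x) + 1e-2 * np.eye(len(x))              # kernel + noise variance
    alpha = np.linalg.solve(K, y)
    mean = rbf(xs, x) @ alpha                          # GP posterior mean
    print(mean[:5])
    ```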

  2. The Kepler Science Data Processing Pipeline Source Code Road Map

    Science.gov (United States)

    Wohler, Bill; Jenkins, Jon M.; Twicken, Joseph D.; Bryson, Stephen T.; Clarke, Bruce Donald; Middour, Christopher K.; Quintana, Elisa Victoria; Sanderfer, Jesse Thomas; Uddin, Akm Kamal; Sabale, Anima

    2016-01-01

    We give an overview of the operational concepts and architecture of the Kepler Science Processing Pipeline. Designed, developed, operated, and maintained by the Kepler Science Operations Center (SOC) at NASA Ames Research Center, the Science Processing Pipeline is a central element of the Kepler Ground Data System. The SOC consists of an office at Ames Research Center, software development and operations departments, and a data center which hosts the computers required to perform data analysis. The SOC's charter is to analyze stellar photometric data from the Kepler spacecraft and report results to the Kepler Science Office for further analysis. We describe how this is accomplished via the Kepler Science Processing Pipeline, including, the software algorithms. We present the high-performance, parallel computing software modules of the pipeline that perform transit photometry, pixel-level calibration, systematic error correction, attitude determination, stellar target management, and instrument characterization.

  3. A proposal for further integration of the cyanobacteria under the Bacteriological Code.

    Science.gov (United States)

    Oren, Aharon

    2004-09-01

    This taxonomic note reviews the present status of the nomenclature of the cyanobacteria under the Bacteriological Code. No more than 13 names of cyanobacterial species have been proposed so far in the International Journal of Systematic and Evolutionary Microbiology (IJSEM)/International Journal of Systematic Bacteriology (IJSB), and of these only five are validly published. The cyanobacteria (Cyanophyta, blue-green algae) are also named under the Botanical Code, and the dual nomenclature system causes considerable confusion. This note calls for a more intense involvement of the International Committee on Systematics of Prokaryotes (ICSP), its Judicial Commission and its Subcommittee on the Taxonomy of Photosynthetic Prokaryotes in the nomenclature of the cyanobacteria under the Bacteriological Code. The establishment of minimal standards for the description of new species and genera should be encouraged in a way that will be acceptable to the botanical authorities as well. This should be followed by the publication of an 'Approved List of Names of Cyanobacteria' in IJSEM. The ultimate goal is to achieve a consensus nomenclature that is acceptable both to bacteriologists and to botanists, anticipating the future implementation of a universal 'Biocode' that would regulate the nomenclature of all organisms living on Earth.

  4. High Frequency Scattering Code in a Distributed Processing Environment

    Science.gov (United States)

    1991-06-01

    for output The first option is simple, but involves post-processing the individual data files to form a composite output file. This would add an...for data structures that are conducive to consolidation. The use of a post-process for the actual output of results is viable, though use of the...

  5. Communication theory and signal processing for transform coding

    CERN Document Server

    El-Shennawy, Khamies Mohammed Ali

    2014-01-01

    This book is tailored to fulfil the requirements in the area of the signal processing in communication systems. The book contains numerous examples, solved problems and exercises to explain the methodology of Fourier Series, Fourier Analysis, Fourier Transform and properties, Fast Fourier Transform FFT, Discrete Fourier Transform DFT and properties, Discrete Cosine Transform DCT, Discrete Wavelet Transform DWT and Contourlet Transform CT. The book is characterized by three directions, the communication theory and signal processing point of view, the mathematical point of view and utility compu
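    A small example of the energy-compaction property that motivates transform coding with the DCT (our addition, not taken from the book):

    ```python
    # For a smooth signal, a handful of low-order DCT coefficients carry
    # nearly all the energy, so most coefficients can be discarded.
    import numpy as np
    from scipy.fft import dct, idct

    n = 64
    t = np.arange(n)
    x = np.exp(-((t - 32.0) / 10.0) ** 2)     # a smooth test signal

    X = dct(x, norm="ortho")
    X[8:] = 0.0                               # keep only the 8 lowest coefficients
    x_hat = idct(X, norm="ortho")

    err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
    print(f"relative reconstruction error with 8/64 coefficients: {err:.2e}")
    ```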

  6. Development of the multistep compound process calculation code

    Energy Technology Data Exchange (ETDEWEB)

    Kawano, Toshihiko [Kyushu Univ., Fukuoka (Japan)]

    1998-03-01

    A program 'cmc' has been developed to calculate the multistep compound (MSC) process of Feshbach-Kerman-Koonin. The radial overlap integral in the transition matrix element is calculated microscopically, and comparisons are made for neutron-induced 93Nb reactions. The strength of the two-body interaction V0 is estimated from the total MSC cross sections. (author)

  7. A Critical Appraisal of the Juvenile Justice System under Cameroon's 2005 Criminal Procedure Code: Emerging Challenges

    Directory of Open Access Journals (Sweden)

    S Tabe

    2012-03-01

    Full Text Available The objective of this article is to examine the changes introduced by the 2005 Cameroonian Criminal Procedure Code on matters of juvenile justice, considering that before this Code, juvenile justice in Cameroon was governed by extra-national laws. In undertaking this analysis, the article highlights the evolution of the administration of juvenile justice 50 years after independence of Cameroon. It also points out the various difficulties and shortcomings in the treatment of juvenile offenders in Cameroon since the enactment of the new Criminal Procedure Code. The article reveals that the 2005 Code is an amalgamation of all hitherto existing laws in the country that pertained to juvenile justice, and that despite the considerable amount of criticism it has received, the Code is clearly an improvement of the system of juvenile justice in Cameroon, since it represents a balance of the due process rights of young people, the protection of society and the special needs of young offenders. This is so because the drafters of the Code took a broad view of the old laws on juvenile justice. Also a wide range of groups were consulted, including criminal justice professionals, children’s service organisations, victims, parents, young offenders, educators, advocacy groups and social-policy analysts. However, to address the challenges that beset the juvenile justice system of Cameroon, the strategy of the government should be focussed on three areas: the prevention of youth crime, the provision of meaningful consequences for the actions of young people, and the rehabilitation and reintegration of young offenders. Cameroonian law should seek educative solutions rather than to impose prison sentences or other repressive measures on young offenders. Special courts to deal with young offenders should be established outside the regular penal system and should be provided with resources that are adequate for and appropriate to fostering their understanding of

  8. The enhanced variance propagation code for the Idaho Chemical Processing Plant

    International Nuclear Information System (INIS)

    Kern, E.A.; Zack, N.R.; Britschgi, J.J.

    1992-01-01

    The Variance Propagation (VP) Code was developed by the Los Alamos National Laboratory's Safeguards Systems Group to provide off-line variance propagation and systems analysis for nuclear material processing facilities. The code can also be used as a tool in the design and evaluation of material accounting systems. In this regard, the VP code was enhanced to incorporate a model of the material accountability measurements used in the Idaho Chemical Processing Plant operated by the Westinghouse Idaho Nuclear Company. Inputs to the code were structured to account for the dissolver/headend process and the waste streams, and analyses were performed to determine the sensitivity of the overall material balance error to measurement and sampling errors. We determined that the material balance error is very sensitive to changes in the sampling errors. 3 refs
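    The core of variance propagation for a material balance is simple: for independent measurements, the variance of the inventory difference is the sum of the component variances. A minimal sketch with invented numbers (our addition, not the VP code's actual models):

    ```python
    # First-order variance propagation for a material balance:
    # ID = input - output - ending + beginning, so independent measurement
    # variances simply add.
    import numpy as np

    # Hypothetical measured terms (kg) and their 1-sigma uncertainties.
    terms = {"input": (100.0, 0.40), "output": (97.0, 0.35),
             "ending": (5.0, 0.10), "beginning": (2.5, 0.10)}

    ID = (terms["input"][0] - terms["output"][0]
          - terms["ending"][0] + terms["beginning"][0])
    sigma_ID = np.sqrt(sum(s ** 2 for _, s in terms.values()))
    print(f"ID = {ID:.2f} kg, sigma_ID = {sigma_ID:.2f} kg")
    ```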

  9. Tech-X Corporation releases simulation code for solving complex problems in plasma physics : VORPAL code provides a robust environment for simulating plasma processes in high-energy physics, IC fabrications and material processing applications

    CERN Multimedia

    2005-01-01


  10. UNICOS CPC6: Automated Code Generation for Process Control Applications

    OpenAIRE

    Fernandez Adiego, B; Blanco Vinuela, E; Prieto Barreiro, I

    2011-01-01

    The Continuous Process Control package (CPC) is one of the components of the CERN Unified Industrial Control System framework (UNICOS) [1]. As a part of this framework, UNICOS-CPC provides a well defined library of device types, a methodology and a set of tools to design and implement industrial control applications. The new CPC version uses the software factory UNICOS Application Builder (UAB) [2] to develop CPC applications. The CPC component is composed of several platform-oriented plug-ins ...

  11. Coding efficiency of fly motion processing is set by firing rate, not firing precision.

    Directory of Open Access Journals (Sweden)

    Deusdedit Lineu Spavieri

    Full Text Available To comprehend the principles underlying sensory information processing, it is important to understand how the nervous system deals with various sources of perturbation. Here, we analyze how the representation of motion information in the fly's nervous system changes with temperature and luminance. Although these two environmental variables have a considerable impact on the fly's nervous system, they do not prevent the fly from behaving suitably over a wide range of conditions. We recorded responses from a motion-sensitive neuron, the H1-cell, to a time-varying stimulus at many different combinations of temperature and luminance. We found that the mean firing rate, but not firing precision, changes with temperature, while both were affected by mean luminance. Because we also found that information rate and coding efficiency are mainly set by the mean firing rate, our results suggest that, in the face of environmental perturbations, coding efficiency is improved by an increase in the mean firing rate, rather than by increased firing precision.

  12. Development of a model and computer code to describe solar grade silicon production processes. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Gould, R K; Srivastava, R

    1979-12-01

    Models and computer codes which may be used to describe flow reactors in which high-purity, solar-grade silicon is produced via reduction of gaseous silicon halides are described. A prominent example of the type of process which may be studied using the codes developed in this program is the SiCl4/Na reactor currently being developed by the Westinghouse Electric Corp. During this program two large computer codes were developed. The first is the CHEMPART code, an axisymmetric, marching code which treats two-phase flows with models describing detailed gas-phase chemical kinetics, particle formation, and particle growth. This code, based on the AeroChem LAPP (Low Altitude Plume Program) code, can be used to describe flow reactors in which reactants mix, react, and form a particulate phase. Detailed radial gas-phase composition, temperature, velocity, and particle size distribution profiles are computed. Also, deposition of heat, momentum, and mass (either particulate or vapor) on reactor walls is described. The second code is a modified version of the GENMIX boundary layer code, which is used to compute rates of heat, momentum, and mass transfer to the reactor walls. This code lacks the detailed chemical kinetics and particle-handling features of the CHEMPART code but has the virtue of running much more rapidly than CHEMPART, while treating the phenomena occurring in the boundary layer in more detail than can be afforded using CHEMPART. These two codes have been used in this program to predict particle formation characteristics and wall collection efficiencies for SiCl4/Na flow reactors. Results are described.

  13. Industrial process heat case studies. [PROSYS/ECONMAT code

    Energy Technology Data Exchange (ETDEWEB)

    Hooker, D.W.; May, E.K.; West, R.E.

    1980-05-01

    Commercially available solar collectors have the potential to provide a large fraction of the energy consumed for industrial process heat (IPH). Detailed case studies of individual industrial plants are required in order to make an accurate assessment of the technical and economic feasibility of applications. This report documents the results of seven such case studies. The objectives of the case study program are to determine the near-term feasibility of solar IPH in selected industries, identify energy conservation measures, identify conditions of IPH systems that affect solar applications, test SERI's IPH analysis software (PROSYS/ECONMAT), disseminate information to the industrial community, and provide inputs to the SERI research program. The detailed results from the case studies are presented. Although few near-term, economical solar applications were found, the conditions that would enhance the opportunities for solar IPH applications are identified.

  14. Processing Government Data: ZIP Codes, Python, and OpenRefine

    Directory of Open Access Journals (Sweden)

    Frank Donnelly

    2014-07-01

    Full Text Available While there is a vast amount of useful US government data on the web, some of it is in a raw state that is not readily accessible to the average user. Data librarians can improve accessibility and usability for their patrons by processing data to create subsets of local interest and by appending geographic identifiers to help users select and aggregate data. This case study illustrates how census geography crosswalks, Python, and OpenRefine were used to create spreadsheets of non-profit organizations in New York City from the IRS Tax-Exempt Organization Masterfile. This paper illustrates the utility of Python for data librarians and should be particularly insightful for those who work with address-based data.
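    In the spirit of the workflow described (our sketch; the file names and column layout are hypothetical), appending a geographic identifier via a ZIP-code crosswalk in Python might look like this:

    ```python
    # Append a borough identifier to address-based records via a ZIP-code
    # crosswalk, keeping only local (New York City) records.
    import csv

    # Load a ZIP -> borough crosswalk, e.g. {"10001": "Manhattan", ...}.
    with open("zip_borough_crosswalk.csv", newline="") as f:
        crosswalk = {row["zip"]: row["borough"] for row in csv.DictReader(f)}

    with open("irs_exempt_orgs.csv", newline="") as src, \
         open("nyc_nonprofits.csv", "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames + ["borough"])
        writer.writeheader()
        for row in reader:
            zip5 = row["ZIP"][:5]              # normalize ZIP+4 to 5 digits
            if zip5 in crosswalk:              # keep matching records only
                row["borough"] = crosswalk[zip5]
                writer.writerow(row)
    ```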

  15. Fuel corrosion processes under waste disposal conditions

    International Nuclear Information System (INIS)

    Shoesmith, D.W.

    1999-09-01

    Under the oxidizing conditions likely to be encountered in the Yucca Mountain repository, fuel dissolution is a corrosion process involving the coupling of the anodic dissolution of the fuel with the cathodic reduction of oxidants available within the repository. The oxidants potentially available to drive fuel corrosion are environmental oxygen, supplied by transport through the permeable rock of the mountain, and molecular and radical species produced by the radiolysis of available aerated water. The mechanism of these coupled anodic and cathodic reactions is reviewed in detail. While gaps in understanding remain, many kinetic features of these reactions have been studied in considerable detail, and a reasonably justified mechanism for fuel corrosion is available. The corrosion rate is determined primarily by environmental factors rather than by the properties of the fuel. Thus, with the exception of an increase in rate due to increased surface area, pre-oxidation of the fuel has little effect on the corrosion rate.

  16. Modification of fuel performance code to evaluate iron-based alloy behavior under LOCA scenario

    Energy Technology Data Exchange (ETDEWEB)

    Giovedi, Claudia; Martins, Marcelo Ramos, E-mail: claudia.giovedi@labrisco.usp.br, E-mail: mrmartin@usp.br [Laboratorio de Analise, Avaliacao e Gerenciamento de Risco (LabRisco/POLI/USP), São Paulo, SP (Brazil); Abe, Alfredo; Muniz, Rafael O.R.; Gomes, Daniel de Souza; Silva, Antonio Teixeira e, E-mail: ayabe@ipen.br, E-mail: dsgomes@ipen.br, E-mail: teixiera@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    Accident tolerant fuels (ATF) have been studied since the Fukushima Daiichi accident, in research efforts to develop new materials which, under accident scenarios, could maintain fuel rod integrity for a longer period than the cladding and fuel system usually utilized in Pressurized Water Reactors (PWR). The efforts have been focused on new cladding materials, and iron-based alloys appear as a possible candidate. The aim of this paper is to implement modifications in a fuel performance code to evaluate the behavior of iron-based alloys under a Loss-of-Coolant Accident (LOCA) scenario. For this, the properties related to the thermal and mechanical behavior of iron-based alloys were first obtained from the literature, appropriately adapted, and introduced in the fuel performance code subroutines. The adopted approach was step-by-step modification, creating different versions of the code. The assessment of the implemented modifications was carried out by simulating an experiment available in the open literature (IFA-650.5) on zirconium-based alloy fuel rods submitted to LOCA conditions. The results obtained for the iron-based alloy were compared to those obtained using the regular version of the fuel performance code for Zircaloy-4. The results show that the most important properties to be changed are those in the subroutines related to the mechanical properties of the cladding. The results also show that burst occurs at a later time for fuel rods with iron-based alloy cladding, indicating the potential of this material to be used as an ATF cladding. (author)

  17. Modification of fuel performance code to evaluate iron-based alloy behavior under LOCA scenario

    International Nuclear Information System (INIS)

    Giovedi, Claudia; Martins, Marcelo Ramos; Abe, Alfredo; Muniz, Rafael O.R.; Gomes, Daniel de Souza; Silva, Antonio Teixeira e

    2017-01-01

    Accident tolerant fuels (ATF) have been studied since the Fukushima Daiichi accident, in research efforts to develop new materials which, under accident scenarios, could maintain fuel rod integrity for a longer period than the cladding and fuel system usually utilized in Pressurized Water Reactors (PWR). The efforts have been focused on new cladding materials, and iron-based alloys appear as a possible candidate. The aim of this paper is to implement modifications in a fuel performance code to evaluate the behavior of iron-based alloys under a Loss-of-Coolant Accident (LOCA) scenario. For this, the properties related to the thermal and mechanical behavior of iron-based alloys were first obtained from the literature, appropriately adapted, and introduced in the fuel performance code subroutines. The adopted approach was step-by-step modification, creating different versions of the code. The assessment of the implemented modifications was carried out by simulating an experiment available in the open literature (IFA-650.5) on zirconium-based alloy fuel rods submitted to LOCA conditions. The results obtained for the iron-based alloy were compared to those obtained using the regular version of the fuel performance code for Zircaloy-4. The results show that the most important properties to be changed are those in the subroutines related to the mechanical properties of the cladding. The results also show that burst occurs at a later time for fuel rods with iron-based alloy cladding, indicating the potential of this material to be used as an ATF cladding. (author)

  18. Characterization of the MCNPX computer code in micro processed architectures

    International Nuclear Information System (INIS)

    Almeida, Helder C.; Dominguez, Dany S.; Orellana, Esbel T.V.; Milian, Felix M.

    2009-01-01

    The MCNPX (Monte Carlo N-Particle eXtended) code can be used to simulate the transport of several types of nuclear particles, using probabilistic methods. The technique used in MCNPX is to follow the history of each particle from its origin to its extinction, which can occur by absorption, escape or other causes. To obtain accurate results in simulations performed with MCNPX it is necessary to process a large number of histories, which demands a high computational cost. Currently MCNPX can be installed on virtually all available computing platforms; however, there is virtually no information on the performance of the application on each. This paper studies the performance of MCNPX, working with electrons and photons in the Faux phantom, on the two platforms used by most researchers, Windows and Linux. Both platforms were tested on the same computer to ensure that the hardware did not bias the performance measurements. The performance of MCNPX was measured by the time required to run a simulation, making runtime the main measure of comparison. During the tests the difference in MCNPX performance between the two platforms was evident. In some cases we were able to gain more than 10% in speed simply by changing platforms, without any specific optimization. This shows the relevance of optimizing this tool on the platform most appropriate for its use. (author)

  19. Development of JURA -A Simulation Code for Electrochemical Processes in Molten Salt-

    OpenAIRE

    小林 嗣幸

    2003-01-01

    A simulation code for electrochemical processes in molten salt, named JURA, has been developed. This code can simulate the time history of processes such as the metal pyro-process and the oxide pyro-process by the diffusion-layer theory, at various temperatures and in various melts. This report describes the specific formulations of the theory for various electrodes such as the solid cathode, the Cd cathode, and the Cd pool anode. Explanation of the input data and sample calculations of the solid cathod...

  20. Processes of code status transitions in hospitalized patients with advanced cancer.

    Science.gov (United States)

    El-Jawahri, Areej; Lau-Min, Kelsey; Nipp, Ryan D; Greer, Joseph A; Traeger, Lara N; Moran, Samantha M; D'Arpino, Sara M; Hochberg, Ephraim P; Jackson, Vicki A; Cashavelly, Barbara J; Martinson, Holly S; Ryan, David P; Temel, Jennifer S

    2017-12-15

    Although hospitalized patients with advanced cancer have a low chance of surviving cardiopulmonary resuscitation (CPR), the processes by which they change their code status from full code to do not resuscitate (DNR) are unknown. We conducted a mixed-methods study on a prospective cohort of hospitalized patients with advanced cancer. Two physicians used a consensus-driven medical record review to characterize processes that led to code status order transitions from full code to DNR. In total, 1047 hospitalizations were reviewed among 728 patients. Admitting clinicians did not address code status in 53% of hospitalizations, resulting in code status orders of "presumed full." In total, 275 patients (26.3%) transitioned from full code to DNR, and 48.7% (134 of 275 patients) of those had an order of "presumed full" at admission; however, upon further clarification, the patients expressed that they had wished to be DNR before the hospitalization. We identified 3 additional processes leading to order transition from full code to DNR: acute clinical deterioration (15.3%), discontinuation of cancer-directed therapy (17.1%), and education about the potential harms/futility of CPR (15.3%). Compared with discontinuing therapy and education, transitions because of acute clinical deterioration were associated with less patient involvement (P = .002) and a shorter time to death (P < .001). Nearly half of code status transitions in hospitalized patients with advanced cancer were because of full code orders in patients who had a preference for DNR before hospitalization. Transitions due to acute clinical deterioration were associated with less patient engagement and a higher likelihood of inpatient death. Cancer 2017;123:4895-902. © 2017 American Cancer Society.

  1. Hybrid digital-analog coding with bandwidth expansion for correlated Gaussian sources under Rayleigh fading

    Science.gov (United States)

    Yahampath, Pradeepa

    2017-12-01

    Consider communicating a correlated Gaussian source over a Rayleigh fading channel with no knowledge of the channel signal-to-noise ratio (CSNR) at the transmitter. In this case, a digital system cannot be optimal for a range of CSNRs. Analog transmission, however, is optimal at all CSNRs if the source and channel are memoryless and bandwidth matched. This paper presents new hybrid digital-analog (HDA) systems for sources with memory and channels with bandwidth expansion, which outperform both digital-only and analog-only systems over a wide range of CSNRs. The digital part is either a predictive quantizer or a transform code, used to achieve a coding gain. The analog part uses linear encoding to transmit the quantization error, which improves the performance under CSNR variations. The hybrid encoder is optimized to achieve the minimum AMMSE (average minimum mean square error) over the CSNR distribution. To this end, analytical expressions are derived for the AMMSE of asymptotically optimal systems. It is shown that the outage CSNR of the channel code and the analog-digital power allocation must be jointly optimized to achieve the minimum AMMSE. In the case of HDA predictive quantization, a simple algorithm is presented to solve the optimization problem. Experimental results are presented for both Gauss-Markov sources and speech signals.
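    For orientation, the robustness of the analog baseline can be reproduced numerically: for a memoryless, bandwidth-matched, unit-variance Gaussian source, linear analog transmission attains distortion 1/(1 + CSNR) in each fade state, and the AMMSE is its average over the exponential CSNR distribution of Rayleigh fading. A sketch of that average (our addition):

    ```python
    # Monte Carlo estimate of the analog-transmission AMMSE under Rayleigh
    # fading, where the instantaneous CSNR is exponentially distributed.
    import numpy as np

    mean_csnr = 10.0                          # average channel SNR (linear)
    rng = np.random.default_rng(1)
    csnr = rng.exponential(mean_csnr, 1_000_000)

    ammse_analog = np.mean(1.0 / (1.0 + csnr))     # unit-variance source
    print(f"analog AMMSE at mean CSNR {mean_csnr:.0f}: {ammse_analog:.4f}")
    ```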

  2. Safety analysis code 'COOLTMP' for assessment of PHT cooling under reactor shutdown conditions

    International Nuclear Information System (INIS)

    Krishna Kumar, P.; Hajela, S.; Datta, D.; Malhotra, P.K.

    2006-01-01

    During normal operation, the thermal energy generated by the reactor core is removed by the Primary Heat Transport (PHT) system through operation of the primary circulation pumps and steam generators. When the reactor is shut down, however, decay heat removal is performed by the Shut Down (S/D) Cooling heat exchangers and pumps of lower capacity. In the event of loss/stoppage of PHT circulation in such a situation, the bulk of the decay heat generated will be distributed to the moderator system, the end shield system and, through the feeders, to the feeder cabinet/FM vault environment. The PHT inventory in the channel, however, will heat up because of the loss of flow in the channel. The code COOLTMP has been developed to estimate the PHT temperature following a loss/stoppage of circulation when the reactor is in a shutdown condition. It predicts the increase of the PHT temperature with time for the hot channel, the average channel or a specific channel under such a condition. It also calculates the apportionment of the decay heat to the different heat sinks, viz. the moderator, the end shields and the FM vault. This computation is required when the plant has to remain shut down for a maintenance job on the PHT system, feeders or channels for which the S/D cooling system has to be stopped and, in some cases, the headers have to be drained. Such a calculation shows whether the predicted peak PHT temperature, or the time available to reach that temperature, is acceptable for carrying out the job, so that the schedule of the maintenance job can be decided. This code has been validated for RAPS and MAPS and used extensively for predicting the PHT temperature after reactor shutdown to obtain regulatory clearances to stop forced circulation with the headers filled or drained. (author)
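    The essence of such an estimate is a lumped-parameter energy balance on the stagnant coolant. The sketch below (our addition; every number is a placeholder, not plant data) shows the form of the calculation:

    ```python
    # Lumped-parameter heat-up estimate: with circulation stopped, the channel
    # coolant heats at dT/dt = (decay power reaching the PHT) / (m * cp).
    def heatup_time(T0, T_limit, P_decay, f_to_pht, mass, cp):
        """Hours until the PHT inventory reaches T_limit (no heat sinks)."""
        q = P_decay * f_to_pht                 # W deposited in the coolant
        rate = q / (mass * cp)                 # K/s
        return (T_limit - T0) / rate / 3600.0

    # Placeholder values: 300 kW decay heat, 60% to the PHT, 5000 kg coolant.
    print(f"{heatup_time(55.0, 90.0, 3.0e5, 0.6, 5000.0, 4200.0):.2f} h")
    ```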

  3. 78 FR 73502 - Multistakeholder Process To Develop Consumer Data Privacy Code of Conduct Concerning Facial...

    Science.gov (United States)

    2013-12-06

    ..., Privacy Multistakeholder Process: Mobile Application Transparency, http://www.ntia.doc.gov/other-publication/2013/privacy-multistakeholder-process-mobile-application-transparency . \\4\\ NTIA, Facial... enforceable codes of conduct that specify how the Consumer Privacy Bill of Rights applies in specific business...

  4. Using Automatic Code Generation in the Attitude Control Flight Software Engineering Process

    Science.gov (United States)

    McComas, David; O'Donnell, James R., Jr.; Andrews, Stephen F.

    1999-01-01

    This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.

  5. Energy meshing techniques for processing ENDF/B-VI cross sections using the AMPX code system

    International Nuclear Information System (INIS)

    Dunn, M.E.; Greene, N.M.; Leal, L.C.

    1999-01-01

    Modern techniques for the establishment of criticality safety for fissile systems invariably require the use of neutronic transport codes with applicable cross-section data. Accurate cross-section data are essential for solving the Boltzmann Transport Equation for fissile systems. In the absence of applicable critical experimental data, the use of independent calculational methods is crucial for the establishment of subcritical limits. Moreover, there are various independent modern transport codes available to the criticality safety analyst (e.g., KENO V.a, MCNP, and MONK). In contrast, there is currently only one complete software package that processes data from the Version 6 format of the Evaluated Nuclear Data File (ENDF) to a format usable by criticality safety codes. To facilitate independent cross-section processing, Oak Ridge National Laboratory (ORNL) is upgrading the AMPX code system to enable independent processing of Version 6 formats using state-of-the-art procedures. The AMPX code system has been in continuous use at ORNL since the early 1970s and is the premier processor for providing multigroup cross sections for criticality safety analysis codes. Within the AMPX system, the module POLIDENT is used to access the resonance parameters in File 2 of an ENDF/B library, generate point cross-section data, and combine the cross sections with File 3 point data. At the heart of any point cross-section processing code is the generation of a suitable energy mesh for representing the data. The purpose of this work is to facilitate the AMPX upgrade through the development of a new and innovative energy meshing technique for processing point cross-section data.
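
    One common meshing strategy, shown schematically below, is to bisect each energy panel recursively until linear interpolation reproduces the point cross section within a tolerance; the resonance-shaped test function is illustrative, not an actual ENDF evaluation:

    ```python
    # Adaptive energy-mesh sketch: refine until linear interpolation is adequate.

    def sigma(e):
        """Toy cross section with a resonance near 6.67 eV (illustrative)."""
        return 10.0 + 5.0e3 / ((e - 6.67) ** 2 + 0.01)

    def build_mesh(e_lo, e_hi, tol=1e-3):
        """Return an energy grid on which sigma is linearly interpolable to tol."""
        mid = 0.5 * (e_lo + e_hi)
        lin = 0.5 * (sigma(e_lo) + sigma(e_hi))      # linear interpolation at mid
        if abs(lin - sigma(mid)) <= tol * abs(sigma(mid)):
            return [e_lo, e_hi]
        left = build_mesh(e_lo, mid, tol)
        return left[:-1] + build_mesh(mid, e_hi, tol)

    grid = build_mesh(1.0, 20.0)
    print(f"{len(grid)} mesh points between 1 eV and 20 eV")
    ```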

  6. Simulation codes of chemical separation process of spent fuel reprocessing. Tool for process development and safety research

    International Nuclear Information System (INIS)

    Asakura, Toshihide; Sato, Makoto; Matsumura, Masakazu; Morita, Yasuji

    2005-01-01

    This paper reviews the continuing development and utilization of the Extraction System Simulation Code for Advanced Reprocessing (ESSCAR). From the viewpoint of process development, more tests with spent fuel and accompanying calculations should be performed to gain a better understanding of the physico-chemical phenomena in a separation process. From the viewpoint of process safety research on fuel cycle facilities, it is important to know the process behavior of key substances that are highly reactive but present only in trace amounts. (author)

  7. A multi-level code for metallurgical effects in metal-forming processes

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, P.A.; Silling, S.A. [Sandia National Labs., Albuquerque, NM (United States). Computational Physics and Mechanics Dept.; Hughes, D.A.; Bammann, D.J.; Chiesa, M.L. [Sandia National Labs., Livermore, CA (United States)

    1997-08-01

    The authors present the final report on a Laboratory-Directed Research and Development (LDRD) project, A Multi-level Code for Metallurgical Effects in metal-Forming Processes, performed during the fiscal years 1995 and 1996. The project focused on the development of new modeling capabilities for simulating forging and extrusion processes that typically display phenomenology occurring on two different length scales. In support of model fitting and code validation, ring compression and extrusion experiments were performed on 304L stainless steel, a material of interest in DOE nuclear weapons applications.

  8. Octopaminergic modulation of temporal frequency coding in an identified optic flow-processing interneuron

    Directory of Open Access Journals (Sweden)

    Kit D. Longden

    2010-11-01

    Flying generates predictably different patterns of optic flow compared with other locomotor states. A sensorimotor system tuned to rapid responses and a high bandwidth of optic flow would help the animal to avoid wasting energy through imprecise motor action. However, neural processing that covers a higher input bandwidth itself comes at higher energetic costs, which would be a poor investment when the animal was not flying. How does the blowfly adjust the dynamic range of its optic flow-processing neurons to the locomotor state? Octopamine (OA) is a biogenic amine central to the initiation and maintenance of flight in insects. We used the OA agonist chlordimeform (CDM) to simulate the widespread OA release during flight and recorded the effects on the temporal frequency coding of the H2 cell. This cell is a visual interneuron known to be involved in flight stabilization reflexes. The application of CDM resulted in (i) an increase in the cell's spontaneous activity, expanding the inhibitory signalling range, (ii) an initial response gain to moving gratings (20-60 ms post-stimulus) that depended on the temporal frequency of the grating, and (iii) a reduction in the rate and magnitude of motion adaptation that was also temporal frequency-dependent. To our knowledge, this is the first demonstration that the application of a neuromodulator can induce velocity-dependent alterations in the gain of a wide-field optic flow-processing neuron. The observed changes in the cell's response properties resulted in a 33% increase of the cell's information rate when encoding random changes in temporal frequency of the stimulus. The increased signalling range and more rapid, longer lasting responses employed more spikes to encode each bit, and so consumed a greater amount of energy. It appears that for the fly, investing more energy in sensory processing during flight is more efficient than wasting energy on under-performing motor control.

  9. MOLOCH computer code for molecular-dynamics simulation of processes in condensed matter

    Directory of Open Access Journals (Sweden)

    Derbenev I.V.

    2011-01-01

    Theoretical and experimental investigation into the properties of condensed matter is one of the mainstreams of RFNC-VNIITF scientific activity. Molecular dynamics (MD) is an innovative method of theoretical materials science. Modern supercomputers allow the direct simulation of collective effects in multibillion-atom samples, making it possible to model physical processes on the atomistic level, including material response to dynamic load, radiation damage, the influence of defects and alloying additions on material mechanical properties, or the aging of actinides. Over the past ten years, the computer code MOLOCH has been developed at RFNC-VNIITF. It is a parallel code suitable for massively parallel computing. Modern programming techniques were used to make the code almost 100% efficient. Practically all instruments required for modelling were implemented in the code: a potential builder for different materials, simulation of physical processes in arbitrary 3D geometry, and calculated data processing. A set of tests was developed to analyse the efficiency of the algorithms; it can be used to compare codes with different MD implementations against each other.
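
    For readers unfamiliar with the method, the sketch below shows the core update an MD code of this class performs each time step: a velocity-Verlet integration with Lennard-Jones forces. The O(N^2) force loop, reduced units, and parameters are purely illustrative, not MOLOCH's algorithms:

    ```python
    # Minimal velocity-Verlet MD step with Lennard-Jones forces (illustrative).
    import numpy as np

    def lj_forces(pos, eps=1.0, sig=1.0):
        """Pairwise Lennard-Jones forces (no cutoff, no periodic boundaries)."""
        f = np.zeros_like(pos)
        n = len(pos)
        for i in range(n):
            for j in range(i + 1, n):
                r = pos[i] - pos[j]
                r2 = float(np.dot(r, r))
                s6 = (sig * sig / r2) ** 3
                fmag = 24.0 * eps * (2.0 * s6 * s6 - s6) / r2
                f[i] += fmag * r
                f[j] -= fmag * r
        return f

    def verlet_step(pos, vel, forces, dt=1e-3, mass=1.0):
        """Advance one velocity-Verlet step; returns the updated state."""
        vel_half = vel + 0.5 * dt * forces / mass
        pos_new = pos + dt * vel_half
        f_new = lj_forces(pos_new)
        vel_new = vel_half + 0.5 * dt * f_new / mass
        return pos_new, vel_new, f_new

    # Eight atoms on a small cubic lattice, spacing 1.5 sigma.
    pos = 1.5 * np.array([[i, j, k] for i in range(2)
                          for j in range(2) for k in range(2)], dtype=float)
    vel = np.zeros_like(pos)
    f = lj_forces(pos)
    for _ in range(200):
        pos, vel, f = verlet_step(pos, vel, f)
    print("kinetic energy:", 0.5 * np.sum(vel ** 2))
    ```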

  10. Algebraic and stochastic coding theory

    CERN Document Server

    Kythe, Dave K

    2012-01-01

    Using a simple yet rigorous approach, Algebraic and Stochastic Coding Theory makes the subject of coding theory easy to understand for readers with a thorough knowledge of digital arithmetic, Boolean and modern algebra, and probability theory. It explains the underlying principles of coding theory and offers a clear, detailed description of each code. More advanced readers will appreciate its coverage of recent developments in coding theory and stochastic processes. After a brief review of coding history and Boolean algebra, the book introduces linear codes, including Hamming and Golay codes.
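
    As a worked example of the linear codes the book starts from, the sketch below encodes and decodes with the [7,4] Hamming code, which corrects any single-bit error; the matrices follow one standard systematic convention:

    ```python
    # The [7,4] Hamming code: 4 data bits -> 7 code bits, corrects 1 bit error.
    import numpy as np

    G = np.array([[1, 0, 0, 0, 1, 1, 0],
                  [0, 1, 0, 0, 1, 0, 1],
                  [0, 0, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])   # generator matrix
    H = np.array([[1, 1, 0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0, 1, 0],
                  [0, 1, 1, 1, 0, 0, 1]])   # parity check: H @ G.T = 0 (mod 2)

    def encode(bits):
        return (bits @ G) % 2

    def decode(word):
        syndrome = (H @ word) % 2
        if syndrome.any():                   # nonzero syndrome locates the flip
            col = next(i for i in range(7) if (H[:, i] == syndrome).all())
            word = word.copy()
            word[col] ^= 1
        return word[:4]                      # code is systematic: data bits first

    msg = np.array([1, 0, 1, 1])
    word = encode(msg)
    word[5] ^= 1                             # inject a single bit error
    assert (decode(word) == msg).all()
    print("corrected and recovered:", decode(word))
    ```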

  11. Video processing for human perceptual visual quality-oriented video coding.

    Science.gov (United States)

    Oh, Hyungsuk; Kim, Wonha

    2013-04-01

    We have developed a video processing method that achieves human perceptual visual quality-oriented video coding. The patterns of moving objects are modeled by considering the limited human capacity for spatial-temporal resolution and the visual sensory memory together, and an online moving pattern classifier is devised by using the Hedge algorithm. The moving pattern classifier is embedded in the existing visual saliency with the purpose of providing a human perceptual video quality saliency model. In order to apply the developed saliency model to video coding, the conventional foveation filtering method is extended. The proposed foveation filter can smooth and enhance the video signals locally, in conformance with the developed saliency model, without causing any artifacts. The performance evaluation results confirm that the proposed video processing method shows reliable improvements in the perceptual quality for various sequences and at various bandwidths, compared to existing saliency-based video coding methods.
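
    The foveation idea can be sketched simply: blend a low-pass version of each frame back toward the original according to a per-pixel saliency map, so salient regions stay sharp and the rest is smoothed before encoding. In the toy version below, a plain Gaussian blur and a synthetic saliency map stand in for the paper's filter and saliency model:

    ```python
    # Saliency-weighted blend of a frame and its blurred copy (illustrative).
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def foveate(frame, saliency, sigma=3.0):
        """frame: 2-D luminance array; saliency: same shape, values in [0, 1]."""
        blurred = gaussian_filter(frame, sigma=sigma)
        return saliency * frame + (1.0 - saliency) * blurred

    h, w = 144, 176
    frame = np.tile(np.linspace(0.0, 255.0, w), (h, 1))       # toy frame
    yy, xx = np.mgrid[0:h, 0:w]
    disc = ((yy - h / 2) ** 2 + (xx - w / 2) ** 2) < 40 ** 2  # "salient" region
    out = foveate(frame, disc.astype(float))
    ```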

  12. Examining the relationship between comprehension and production processes in code-switched language

    Science.gov (United States)

    Guzzardo Tamargo, Rosa E.; Valdés Kroff, Jorge R.; Dussias, Paola E.

    2016-01-01

    We employ code-switching (the alternation of two languages in bilingual communication) to test the hypothesis, derived from experience-based models of processing (e.g., Boland, Tanenhaus, Carlson, & Garnsey, 1989; Gennari & MacDonald, 2009), that bilinguals are sensitive to the combinatorial distributional patterns derived from production and that they use this information to guide processing during the comprehension of code-switched sentences. An analysis of spontaneous bilingual speech confirmed the existence of production asymmetries involving two auxiliary + participle phrases in Spanish–English code-switches. A subsequent eye-tracking study with two groups of bilingual code-switchers examined the consequences of the differences in distributional patterns found in the corpus study for comprehension. Participants’ comprehension costs mirrored the production patterns found in the corpus study. Findings are discussed in terms of the constraints that may be responsible for the distributional patterns in code-switching production and are situated within recent proposals of the links between production and comprehension. PMID:28670049

  13. Recent development for the ITS code system: Parallel processing and visualization

    International Nuclear Information System (INIS)

    Fan, W.C.; Turner, C.D.; Halbleib, J.A. Sr.; Kensek, R.P.

    1996-01-01

    A brief overview is given for two software developments related to the ITS code system. These developments provide parallel processing and visualization capabilities and thus allow users to perform ITS calculations more efficiently. Timing results and a graphical example are presented to demonstrate these capabilities

  14. Classification of working processes to facilitate occupational hazard coding on industrial trawlers

    DEFF Research Database (Denmark)

    Jensen, Olaf C; Stage, Søren; Noer, Preben

    2003-01-01

    BACKGROUND: Commercial fishing is an extremely dangerous economic activity. In order to more accurately describe the risks involved, a specific injury coding based on the working process was developed. METHOD: Observation on six different types of vessels was conducted and allowed a description and a classification of the principal working processes on all kinds of vessels and a detailed classification for industrial trawlers. In industrial trawling, fish are landed for processing purposes, for example, for the production of fish oil and fish meal. The classification was subsequently used to code the injuries reported to the Danish Maritime Authority over a 5-year period. RESULTS: On industrial trawlers, 374 of 394 (95%) injuries were captured by the classification. Setting out and hauling in the gear and nets were the processes with the most injuries and accounted for 58.9% of all injuries.

  15. Working under the PJVA gas processing agreement

    International Nuclear Information System (INIS)

    Collins, S.

    1996-01-01

    The trend in the natural gas industry is towards custom processing. New gas reserves tend to be smaller and in tighter reservoirs than in the past. This has resulted in plants having processing and transportation capacity available to be leased to third parties. Major plant operators and owners are finding themselves in the business of custom processing in a more focused way. Operators recognize that the dilution of operating costs can result in significant benefits to the plant owners as well as the third party processor. The relationship between the gas processor and the gas producer as they relate to the Petroleum Joint Venture Association (PJVA) Gas Processing Agreement were discussed. Details of the standard agreement that clearly defines the responsibilities of the third party producer and the processor were explained. In addition to outlining obligations of the parties, it also provides a framework for fee negotiation. It was concluded that third party processing can lower facility operating costs, extend facility life, and keep Canadian gas more competitive in holding its own in North American gas markets

  16. [Quality management and strategic consequences of assessing documentation and coding under the German Diagnostic Related Groups system].

    Science.gov (United States)

    Schnabel, M; Mann, D; Efe, T; Schrappe, M; V Garrel, T; Gotzen, L; Schaeg, M

    2004-10-01

    The introduction of the German Diagnostic Related Groups (D-DRG) system requires redesigning administrative patient management strategies. Wrong coding leads to inaccurate grouping and endangers the reimbursement of treatment costs. This situation emphasizes the roles of documentation and coding as factors of economic success. The aims of this study were to assess the quantity and quality of initial documentation and coding (ICD-10 and OPS-301) and to find operative strategies to improve efficiency and strategic means to ensure optimal documentation and coding quality. In a prospective study, documentation and coding quality were evaluated in a standardized way by weekly assessment. Clinical data from 1385 inpatients were processed for initial correctness and quality of documentation and coding. Principal diagnoses were found to be accurate in 82.7% of cases, inexact in 7.1%, and wrong in 10.1%. Effects on financial returns occurred in 16% of cases. Based on these findings, an optimized, interdisciplinary, and multiprofessional workflow for medical documentation, coding, and data control was developed. A workflow incorporating regular assessment of documentation and coding quality is required by the DRG system to ensure efficient accounting of hospital services. Interdisciplinary and multiprofessional cooperation is recognized to be an important factor in establishing an efficient workflow in medical documentation and coding.

  17. Obsolescence – understanding the underlying processes

    NARCIS (Netherlands)

    Thomsen, A.F.

    2017-01-01

    Obsolescence, defined as the process of declining performance of buildings, is a serious threat for the value, the usefulness and the life span of built properties. Thomsen and van der Flier (2011) developed a model in which obsolescence is categorised on the basis of two distinctions, i.e. between

  18. Development and application of a deflagration pressure analysis code for high level waste processing

    Energy Technology Data Exchange (ETDEWEB)

    Hensel, S.J.; Thomas, J.K.

    1994-06-01

    The Deflagration Pressure Analysis Code (DPAC) was developed primarily to evaluate peak pressures for deflagrations in radioactive waste storage and process facilities at the Savannah River Site (SRS). Deflagrations in these facilities are generally considered to be incredible events, but it was judged prudent to develop modeling capabilities in order to facilitate risk estimates. DPAC is essentially an engineering analysis tool, as opposed to a detailed thermal hydraulics code. It accounts for mass loss via venting, energy dissipation by radiative heat transfer, and gas PdV work. Volume increases due to vessel deformation can also be included using pressure-volume data from a structural analysis of the enclosure. This paper presents an overview of the code, benchmarking, and applications at SRS.
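
    In the spirit of the energy-balance terms listed above, the sketch below integrates a toy pressure history with a combustion source term and a choked-vent relief term; the burn rate, gas properties, and vent coefficient are illustrative assumptions, not DPAC's correlations:

    ```python
    # Toy vented-deflagration pressure balance (illustrative assumptions only).

    GAMMA, R_GAS, T_FLAME = 1.3, 287.0, 1200.0   # assumed gas properties

    def deflagration_peak(v_m3=100.0, a_vent_m2=0.05, q_mw=200.0,
                          burn_s=0.5, p0=101325.0, dt=1e-4):
        """Integrate a lumped pressure balance; return the peak pressure (Pa)."""
        p, t, peak = p0, 0.0, p0
        while t < 2.0 * burn_s:
            source = (GAMMA - 1.0) * (q_mw * 1e6 if t < burn_s else 0.0) / v_m3
            rho = p / (R_GAS * T_FLAME)
            mdot = 0.6 * a_vent_m2 * rho * (GAMMA * R_GAS * T_FLAME) ** 0.5
            loss = mdot * GAMMA * R_GAS * T_FLAME / v_m3     # vented enthalpy
            p = max(p0, p + (source - loss) * dt)
            t += dt
            peak = max(peak, p)
        return peak

    print(f"estimated peak pressure: {deflagration_peak() / 1e5:.2f} bar")
    ```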

  19. PANAMA. A computer code to predict TRISO particle failure under accident conditions

    International Nuclear Information System (INIS)

    Verfondern, K.; Nabielek, H.

    1985-02-01

    The computer code PANAMA and its underlying modeling assumptions are presented. The models are based on independent measurements of the properties of TRISO particles with a SiC interlayer. Essential features are the calculation of the internal gas pressure, of the coating strength and its decrease during irradiation, and of its weakening due to fission product interaction during accidents. At very high temperatures, particle life is determined by SiC thermal decomposition. Good agreement is obtained in the temperature range 1600-2500 °C when applying PANAMA to a wide variety of existing accident simulation experiments with spherical fuel elements. At lower temperatures, PANAMA tends to be over-conservative. Predicted particle failure during the depressurized accident sequence with the worst temperatures of the 200 MW(th) side-by-side Modular Reactor System remains below the level of normal operations. The same holds true for the HTR 500 MW(e) accident sequence with the system under pressure. In the depressurized case, however, failure of all particles has to be expected after approximately 100 hours in the least favourable core position. (orig.)
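
    The pressure-vessel picture the abstract describes can be sketched with a standard thin-shell stress and a Weibull strength distribution; the parameter values below are placeholders, not PANAMA's fitted data:

    ```python
    # Schematic pressure-vessel failure fraction for a coated particle.
    import math

    def stress_mpa(p_internal_mpa, r_um=250.0, t_um=35.0):
        """Thin-shell tangential stress in the SiC layer (assumed geometry)."""
        return p_internal_mpa * r_um / (2.0 * t_um)

    def failure_fraction(p_internal_mpa, sigma0_mpa=350.0, m=6.0):
        """Weibull failure probability under the internal-pressure stress."""
        s = stress_mpa(p_internal_mpa)
        return 1.0 - math.exp(-math.log(2.0) * (s / sigma0_mpa) ** m)

    for p in (40.0, 60.0, 80.0):   # internal pressure grows with burnup and temperature
        print(f"P = {p:4.0f} MPa -> failure fraction {failure_fraction(p):.2e}")
    ```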

  20. Rod behaviour under base load, load follow and frequency control operation: CYRANO 2 code predictions versus experimental results

    International Nuclear Information System (INIS)

    Gautier, B.; Raybaud, A.

    1984-01-01

    The French PWR reactors currently operate under load follow and frequency control. In order to demonstrate that these operating conditions do not increase the fuel failure rate, fuel rod behaviour calculations have been performed by E.D.F. with the CYRANO 2 code. In parallel with these theoretical calculations, code predictions have been compared to experimental results. The paper presents some of the comparisons performed on 17x17 fuel irradiated in FESSENHEIM 2 up to 30 GWd/tU under base load operation and in the CAP reactor under load follow and frequency control conditions. It is shown that the experimental results can be predicted with reasonable accuracy by the CYRANO 2 code. The experimental work was carried out under joint R and D programs by EDF, FRAGEMA, CEA, and WESTINGHOUSE (CAP program by the French partners only). (author)

  1. The reasonable time of the process in the brazilian law and the new Civil Procedure Code: progress and setbacks

    Directory of Open Access Journals (Sweden)

    Elaine Harzheim Macedo

    2015-06-01

    This paper examines the legal nature of the institute of reasonable duration of process in Brazilian law. It is observed initially that the search for a speedy process is not new and that the idea entered Brazil before the enactment of the Constitution of 1988. Subsequently, the three main constitutional classifications, which treat the institute as a rule, a principle, or a rule-principle, are sorted out. The institute is then analyzed from the infra-constitutional perspective, especially that of the new Civil Procedure Code, with a close analysis of the contributions of the new procedural law to increasing the speed of the process. Above all, what emerges is the unavoidable need to fix clear conceptual beacons that define the reasonable duration of the process, enabling its effectiveness to be realized in the daily lives of those under jurisdiction.

  2. Bundled tungsten oxide nanowires under thermal processing

    International Nuclear Information System (INIS)

    Sun Shibin; Zhao Yimin; Xia Yongde; Zhu Yanqiu; Zou Zengda; Min Guanghui

    2008-01-01

    Ultra-thin W18O49 nanowires were initially obtained by a simple solvothermal method using tungsten chloride and cyclohexanol as precursors. Thermal processing of the resulting bundled nanowires was carried out in air in a tube furnace. The morphology and phase transformation behavior of the as-synthesized nanowires as a function of annealing temperature have been characterized by x-ray diffraction and electron microscopy. The nanostructured bundles underwent a series of morphological evolutions with increased annealing temperature, becoming straighter, larger in diameter, and smaller in aspect ratio, eventually becoming irregular particles with size up to 5 μm. At 500 °C, the monoclinic W18O49 was completely transformed to the monoclinic WO3 phase, which remains stable at high processing temperature. After thermal processing at 400 °C and 450 °C, the specific surface areas of the resulting nanowires dropped to 110 m²/g and 66 m²/g, respectively, compared with 151 m²/g for the as-prepared sample. This study may shed light on the understanding of the geometrical and structural evolution occurring in nanowires whose working environment may involve severe temperature variations.

  3. Coding properties of three intrinsically distinct retinal ganglion cells under periodic stimuli: a computational study

    Directory of Open Access Journals (Sweden)

    Lei Wang

    2016-09-01

    As the sole output neurons of the retina, ganglion cells play a significant role in transforming visual information into spike trains and transmitting them to the higher visual centers. However, the coding strategies that retinal ganglion cells (RGCs) adopt to accomplish these processes are not completely clear yet. To clarify these issues, we investigate the coding properties of three types of RGCs (repetitive spiking, tonic firing, and phasic firing) using two different measures (spike-rate and spike-latency). Model results show that for periodic stimuli, the repetitive spiking RGC and the tonic RGC exhibit similar spike-rate patterns: their spike-rates decrease gradually with increased stimulus frequency, and variation of the stimulus amplitude changes the two RGCs' spike-rate patterns. The phasic RGC activates strongly at medium frequencies when the stimulus amplitude is low; if a high stimulus amplitude is applied, it switches to responding strongly at low frequencies. These results suggest that stimulus amplitude is a prominent factor in regulating how RGCs encode periodic signals. Similar conclusions can be drawn from the spike-latency patterns of the three RGCs. More importantly, the above phenomena can be accurately reproduced by Hodgkin's three classes of neurons, indicating that RGCs can perform the typical three classes of firing dynamics, depending on differences in ion channel densities. Consequently, the model results from the three RGCs may not be specific to the retina but may also apply to neurons in other brain regions that exhibit part or all of Hodgkin's three excitability classes.
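
    The two measures used in the study can be demonstrated on a generic leaky integrate-and-fire neuron driven by a periodic stimulus. The sketch below reports spike-rate and first-spike latency; its parameters are generic and not fitted to the paper's RGC models:

    ```python
    # Leaky integrate-and-fire response to a sinusoidal drive (illustrative).
    import numpy as np

    def lif_response(freq_hz, amp, dur_s=1.0, dt=1e-4,
                     tau=0.02, v_th=1.0, v_reset=0.0):
        """Return (spike rate in Hz, first-spike latency in s) for a sine input."""
        t = np.arange(0.0, dur_s, dt)
        drive = amp * (1.0 + np.sin(2.0 * np.pi * freq_hz * t))
        v, spikes = 0.0, []
        for ti, i_in in zip(t, drive):
            v += dt / tau * (-v + i_in)          # leaky integration
            if v >= v_th:                        # threshold crossing -> spike
                spikes.append(ti)
                v = v_reset
        rate = len(spikes) / dur_s
        latency = spikes[0] if spikes else np.inf
        return rate, latency

    for f in (2, 10, 50):
        r, lat = lif_response(freq_hz=f, amp=1.2)
        print(f"{f:3d} Hz stimulus -> rate {r:5.1f} Hz, latency {lat * 1e3:.1f} ms")
    ```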

  4. Effects of Secondary Task Modality and Processing Code on Automation Trust and Utilization During Simulated Airline Luggage Screening

    Science.gov (United States)

    Phillips, Rachel; Madhavan, Poornima

    2010-01-01

    The purpose of this research was to examine the impact of environmental distractions on human trust and utilization of automation during the process of visual search. Participants performed a computer-simulated airline luggage screening task with the assistance of a 70% reliable automated decision aid (called DETECTOR) both with and without environmental distractions. The distraction was implemented as a secondary task in either a competing modality (visual) or a non-competing modality (auditory). The secondary task processing code either competed with the luggage screening task (spatial code) or with the automation's textual directives (verbal code). We measured participants' system trust, perceived reliability of the system (when a target weapon was present and absent), compliance, reliance, and confidence when agreeing and disagreeing with the system under both distracted and undistracted conditions. Results revealed that system trust was lower in the visual-spatial and auditory-verbal conditions than in the visual-verbal and auditory-spatial conditions. Perceived reliability of the system (when the target was present) was significantly higher when the secondary task was visual rather than auditory. Compliance with the aid increased in all conditions except for the auditory-verbal condition, where it decreased. Similar to the pattern for trust, reliance on the automation was lower in the visual-spatial and auditory-verbal conditions than in the visual-verbal and auditory-spatial conditions. Confidence when agreeing with the system decreased with the addition of any kind of distraction; however, confidence when disagreeing increased with the addition of an auditory secondary task but decreased with the addition of a visual task. A model was developed to represent the research findings and demonstrate the relationship between secondary task modality, processing code, and automation use. Results suggest that the nature of environmental distractions influences trust in and utilization of automation.

  5. Multi-processing CTH: Porting legacy FORTRAN code to MP hardware

    Energy Technology Data Exchange (ETDEWEB)

    Bell, R.L.; Elrick, M.G.; Hertel, E.S. Jr.

    1996-12-31

    CTH is a family of codes developed at Sandia National Laboratories for use in modeling complex multi-dimensional, multi-material problems that are characterized by large deformations and/or strong shocks. A two-step, second-order accurate Eulerian solution algorithm is used to solve the mass, momentum, and energy conservation equations. CTH has historically been run on systems where the data are directly accessible to the cpu, such as workstations and vector supercomputers. Multiple cpus can be used if all data are accessible to all cpus. This is accomplished by placing compiler directives or subroutine calls within the source code. The CTH team has implemented this scheme for Cray shared memory machines under the Unicos operating system. This technique is effective, but difficult to port to other (similar) shared memory architectures because each vendor has a different format of directives or subroutine calls. A different model of high performance computing is one where many (> 1,000) cpus work on a portion of the entire problem and communicate by passing messages that contain boundary data. Most, if not all, codes that run effectively on parallel hardware were written with a parallel computing paradigm in mind. Modifying an existing code written for serial nodes poses a significantly different set of challenges that will be discussed. CTH, a legacy FORTRAN code, has been modified to allow for solutions on distributed memory parallel computers such as the IBM SP2, the Intel Paragon, Cray T3D, or a network of workstations. The message passing version of CTH will be discussed and example calculations will be presented along with performance data. Current timing studies indicate that CTH is 2-3 times faster than equivalent C++ code written specifically for parallel hardware. CTH on the Intel Paragon exhibits linear speed up with problems that are scaled (constant problem size per node) for the number of parallel nodes.
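
    The boundary-data message passing described above follows a standard halo-exchange pattern. The sketch below shows it for a 1-D slab decomposition using mpi4py; it illustrates the pattern only and is not CTH's actual communication layer (the script name `halo.py` is hypothetical):

    ```python
    # 1-D halo exchange: each rank owns a slab plus two ghost cells.
    # Run with e.g.: mpirun -np 4 python halo.py
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    n_local = 10
    u = np.full(n_local + 2, float(rank))    # interior cells + 2 ghost cells

    left = rank - 1 if rank > 0 else MPI.PROC_NULL
    right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

    # Send edge cells to neighbors; receive their edges into our ghost cells.
    comm.Sendrecv(sendbuf=u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(sendbuf=u[-2:-1], dest=right, recvbuf=u[0:1], source=left)

    # Ghost cells now hold neighbor data; a local stencil update can proceed.
    u[1:-1] = 0.5 * (u[:-2] + u[2:])
    ```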

  6. User input verification and test driven development in the NJOY21 nuclear data processing code

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Conlin, Jeremy Lloyd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); McCartney, Austin Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-21

    Before physically-meaningful data can be used in nuclear simulation codes, the data must be interpreted and manipulated by a nuclear data processing code so as to extract the relevant quantities (e.g. cross sections and angular distributions). Perhaps the most popular and widely-trusted of these processing codes is NJOY, which has been developed and improved over the course of 10 major releases since its creation at Los Alamos National Laboratory in the mid-1970s. The current phase of NJOY development is the creation of NJOY21, which will be a vast improvement over its predecessor, NJOY2016. Designed to be fast, intuitive, accessible, and capable of handling both established and modern formats of nuclear data, NJOY21 will address many issues that NJOY users face, while remaining functional for those who prefer the existing format. Although early in its development, NJOY21 already provides validation checks on user input. By giving rapid and helpful responses to users while they write input files, NJOY21 will prove more intuitive and easier to use than any of its predecessors. Furthermore, during its development, NJOY21 is subject to regular testing, such that its test coverage must strictly increase with the addition of any production code. This thorough testing will allow developers and NJOY users to establish confidence in NJOY21 as it gains functionality. This document discusses the current state of input checking and testing practices in NJOY21.
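
    The combination of eager input validation and test-driven development can be illustrated with a toy parser and its tests; the function below mimics the pattern only and is not NJOY21's actual API:

    ```python
    # Toy input-validation function plus tests written alongside it.
    def parse_temperature(card: str) -> float:
        """Parse a temperature entry (kelvin); reject bad input with a clear error."""
        try:
            value = float(card)
        except ValueError:
            raise ValueError(f"temperature {card!r} is not a number") from None
        if value <= 0.0:
            raise ValueError(f"temperature must be positive, got {value}")
        return value

    # Tests first (test-driven): run with `pytest` on this file.
    import pytest

    def test_valid_temperature():
        assert parse_temperature("293.6") == 293.6

    def test_non_numeric_rejected():
        with pytest.raises(ValueError, match="not a number"):
            parse_temperature("abc")

    def test_negative_rejected():
        with pytest.raises(ValueError, match="positive"):
            parse_temperature("-1.0")
    ```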

  7. Development of CAD-Based Geometry Processing Module for a Monte Carlo Particle Transport Analysis Code

    International Nuclear Information System (INIS)

    Choi, Sung Hoon; Kwark, Min Su; Shim, Hyung Jin

    2012-01-01

    Monte Carlo (MC) particle transport analysis for a complex system such as a research reactor, an accelerator, or a fusion facility may require accurate modeling of complicated geometry. Manual modeling by using the text interface of a MC code to define the geometrical objects is tedious, lengthy and error-prone. This problem can be overcome by taking advantage of the modeling capability of a computer aided design (CAD) system. There have been two kinds of approaches to developing MC code systems that utilize CAD data: external format conversion and CAD-kernel-embedded MC simulation. The first approach includes several interfacing programs such as McCAD, MCAM, and GEOMIT, which were developed to automatically convert CAD data into MCNP geometry input data. This approach makes the most of existing MC codes without any modifications, but implies latent data inconsistency due to differences between the geometry modeling systems. In the second approach, a MC code utilizes the CAD data for direct particle tracking or for conversion to an internal data structure of constructive solid geometry (CSG) and/or boundary representation (B-rep) modeling with the help of a CAD kernel. MCNP-BRL and OiNC have demonstrated their capabilities for CAD-based MC simulations. Recently we have developed a CAD-based geometry processing module for MC particle simulation by using the OpenCASCADE (OCC) library. In the developed module, CAD data can be used for particle tracking through primitive CAD surfaces (hereafter, CAD-based tracking) or for internal conversion to the CSG data structure. In this paper, the performances of the text-based model, the CAD-based tracking, and the internal CSG conversion are compared by using an in-house MC code, McSIM, equipped with the developed CAD-based geometry processing module.
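
    The CSG data structure such a conversion targets can be sketched as primitives plus boolean nodes supporting the point-containment query a particle tracker needs; the classes below are a minimal illustration, not the developed module:

    ```python
    # Minimal CSG tree with point-containment tests (illustrative).
    from dataclasses import dataclass

    @dataclass
    class Sphere:
        cx: float
        cy: float
        cz: float
        r: float
        def contains(self, p):
            x, y, z = p
            return ((x - self.cx) ** 2 + (y - self.cy) ** 2
                    + (z - self.cz) ** 2) <= self.r ** 2

    @dataclass
    class Halfspace:
        z0: float                      # contains points with z <= z0
        def contains(self, p):
            return p[2] <= self.z0

    @dataclass
    class Intersection:
        a: object
        b: object
        def contains(self, p):
            return self.a.contains(p) and self.b.contains(p)

    @dataclass
    class Union:
        a: object
        b: object
        def contains(self, p):
            return self.a.contains(p) or self.b.contains(p)

    # A hemisphere cell: unit sphere cut by the half-space z <= 0.
    cell = Intersection(Sphere(0.0, 0.0, 0.0, 1.0), Halfspace(0.0))
    assert cell.contains((0.5, 0.0, -0.5)) and not cell.contains((0.5, 0.0, 0.5))
    ```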

  8. Population coding is essential for rapid information processing in the moth antennal lobe

    Czech Academy of Sciences Publication Activity Database

    Kobayashi, R.; Namiki, S.; Kanzaki, R.; Kitano, K.; Nishikawa, I.; Lánský, Petr

    2013-01-01

    Roč. 1536, NOV 6 (2013), s. 88-96 ISSN 0006-8993 R&D Projects: GA MŠk(CZ) LC554; GA ČR(CZ) GAP103/11/0282 Institutional research plan: CEZ:AV0Z50110509 Institutional support: RVO:67985823 Keywords : olfactory information processing * antennal lobe * population coding * decoding analysis Subject RIV: JD - Computer Applications, Robotics Impact factor: 2.828, year: 2013

  9. Combined Source-Channel Coding of Images under Power and Bandwidth Constraints

    Directory of Open Access Journals (Sweden)

    Fossorier Marc

    2007-01-01

    This paper proposes a framework for combined source-channel coding for a power and bandwidth constrained noisy channel. The framework is applied to progressive image transmission using constant-envelope M-ary phase shift keying (M-PSK) signaling over an additive white Gaussian noise channel. First, the framework is developed for uncoded M-PSK signaling. Then, it is extended to include coded M-PSK modulation using trellis coded modulation (TCM). An adaptive TCM system is also presented. Simulation results show that, depending on the constellation size, coded M-PSK signaling performs 3.1 to 5.2 dB better than uncoded M-PSK signaling. Finally, the performance of our combined source-channel coding scheme is investigated from the channel capacity point of view. Our framework is further extended to include powerful channel codes like turbo and low-density parity-check (LDPC) codes. With these powerful codes, our proposed scheme performs about one dB away from the capacity-achieving SNR value of the QPSK channel.
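
    The constant-envelope signaling the framework builds on is easy to demonstrate: the sketch below maps symbol indices onto the unit circle, adds white Gaussian noise, and detects by nearest phase; all parameters are generic:

    ```python
    # Uncoded M-PSK over AWGN with nearest-phase detection (illustrative).
    import numpy as np

    def mpsk_modulate(symbols, M):
        return np.exp(2j * np.pi * symbols / M)      # unit-energy constellation

    def mpsk_detect(rx, M):
        phase = np.mod(np.angle(rx), 2 * np.pi)
        return np.round(phase * M / (2 * np.pi)).astype(int) % M

    rng = np.random.default_rng(0)
    M, n, snr_db = 8, 100_000, 15
    tx_sym = rng.integers(0, M, size=n)
    noise_std = np.sqrt(0.5 / 10 ** (snr_db / 10))
    rx = mpsk_modulate(tx_sym, M) + noise_std * (rng.standard_normal(n)
                                                 + 1j * rng.standard_normal(n))
    ser = np.mean(mpsk_detect(rx, M) != tx_sym)
    print(f"{M}-PSK symbol error rate at {snr_db} dB SNR: {ser:.4f}")
    ```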

  10. Verification of aero-elastic offshore wind turbine design codes under IEA Wind Task XXIII

    DEFF Research Database (Denmark)

    Vorpahl, Fabian; Strobel, Michael; Jonkman, Jason M.

    2014-01-01

    This work presents the results of a benchmark study on aero-servo-hydro-elastic codes for offshore wind turbine dynamic simulation. The codes verified herein account for the coupled dynamic systems including the wind inflow, aerodynamics, elasticity and controls of the turbine, along with the incident waves, sea currents, hydrodynamics, and the foundation dynamics of the support structure.

  11. Hospital Coding Practice, Data Quality, and DRG-Based Reimbursement under the Thai Universal Coverage Scheme

    Science.gov (United States)

    Pongpirul, Krit

    2011-01-01

    In the Thai Universal Coverage scheme, hospital providers are paid for their inpatient care using Diagnosis Related Group (DRG) reimbursement. Questionable quality of the submitted DRG codes has been of concern whereas knowledge about hospital coding practice has been lacking. The objectives of this thesis are (1) To explore hospital coding…

  12. The Cortical Organization of Speech Processing: Feedback Control and Predictive Coding in the Context of a Dual-Stream Model

    Science.gov (United States)

    Hickok, Gregory

    2012-01-01

    Speech recognition is an active process that involves some form of predictive coding. This statement is relatively uncontroversial. What is less clear is the source of the prediction. The dual-stream model of speech processing suggests that there are two possible sources of predictive coding in speech perception: the motor speech system and the…

  13. Comparative performance evaluation of transform coding in image pre-processing

    Science.gov (United States)

    Menon, Vignesh V.; NB, Harikrishnan; Narayanan, Gayathri; CK, Niveditha

    2017-07-01

    We are in the midst of a communication transformation that drives the development, as well as the dissemination, of pioneering communication systems with ever-increasing fidelity and resolution. Research interest in image processing techniques has been driven by a growing demand for faster and easier encoding, storage, and transmission of visual information. In this paper, the researchers intend to throw light on several techniques that can be used at the transmitter end in order to ease the transmission and reconstruction of images. The researchers investigate the performance of different image transform coding schemes used in pre-processing, their comparison and effectiveness, the necessary and sufficient conditions, and their properties and implementation complexity. Motivated by prior advancements in image processing techniques, the researchers compare the performance of various contemporary image pre-processing frameworks: Compressed Sensing, Singular Value Decomposition, and the Integer Wavelet Transform. The paper exposes the potential of the Integer Wavelet Transform to be an efficient pre-processing scheme.
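
    As a concrete taste of one of the compared schemes, the sketch below forms rank-k SVD approximations of a synthetic image and reports the mean squared error and the retained singular-value energy as crude quality proxies:

    ```python
    # Rank-k SVD approximation as an image pre-processing step (illustrative).
    import numpy as np

    rng = np.random.default_rng(0)
    img = (np.outer(np.linspace(0, 1, 256), np.linspace(1, 2, 256))
           + 0.05 * rng.standard_normal((256, 256)))     # synthetic "image"

    U, s, Vt = np.linalg.svd(img, full_matrices=False)

    for k in (1, 5, 20):
        approx = (U[:, :k] * s[:k]) @ Vt[:k, :]          # rank-k reconstruction
        mse = np.mean((img - approx) ** 2)
        energy = np.sum(s[:k] ** 2) / np.sum(s ** 2)     # retained SV energy
        print(f"rank {k:2d}: MSE {mse:.5f}, energy kept {energy:.3f}")
    ```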

  14. Validation of the TRACR3D code for soil water flow under saturated/unsaturated conditions in three experiments

    International Nuclear Information System (INIS)

    Perkins, B.; Travis, B.; DePoorter, G.

    1985-01-01

    Validation of the TRACR3D code in a one-dimensional form was obtained for flow of soil water in three experiments. In the first experiment, a pulse of water entered a crushed-tuff soil and initially moved under conditions of saturated flow, quickly followed by unsaturated flow. In the second experiment, steady-state unsaturated flow took place. In the final experiment, two slugs of water entered crushed tuff under field conditions. In all three experiments, experimentally measured data for volumetric water content agreed, within experimental errors, with the volumetric water content predicted by the code simulations. The experiments and simulations indicated the need for accurate knowledge of boundary and initial conditions, amount and duration of moisture input, and relevant material properties as input into the computer code. During the validation experiments, limitations on monitoring of water movement in waste burial sites were also noted. 5 references, 34 figures, 9 tables

  15. The Judicial Reorganization under the Approach of the New Brazilian Civil Procedure Code

    Directory of Open Access Journals (Sweden)

    Maria Cláudia Viana Hissa Dias do Vale

    2016-12-01

    This article aims to demonstrate that the new civil procedure order allows for a contemporary analysis of judicial reorganization procedures, through the adoption of new guidelines that ensure the fair distribution of the burdens arising from overcoming the crisis. Among the innovations are alternative methods of conflict resolution, which are capable of reducing the backlog of judicial demands, substituting them with a more democratic management of processes that includes all involved parties. It is concluded that, although the adoption of these innovations is already permitted under Law 11.101/2005, they will be chosen on a case-by-case basis.

  16. 76 FR 66235 - Bar Code Technologies for Drugs and Biological Products; Retrospective Review Under Executive...

    Science.gov (United States)

    2011-10-26

    ... symbol, standard, or technology (Id. at 12510 and 12529). In response to the Bar Code Proposed Rule, FDA... to FDA, they are not required to do so. In recognition of these challenges, in the Federal Register...

  17. Verification of Dinamika-5 code on experimental data of water level behaviour in PGV-440 under dynamic conditions

    Energy Technology Data Exchange (ETDEWEB)

    Beljaev, Y.V.; Zaitsev, S.I.; Tarankov, G.A. [OKB Gidropress (Russian Federation)

    1995-12-31

    A comparison of the results of calculational analysis with experimental data on water level behaviour in a horizontal steam generator (PGV-440) under conditions involving cessation of feedwater supply is presented in the report. The calculational analysis is performed using the DINAMIKA-5 code; the experimental data were obtained at Kola NPP-4. (orig.). 2 refs.

  18. 75 FR 8747 - Revision of Certain Dollar Amounts in the Bankruptcy Code Prescribed Under Section 104(A) of the...

    Science.gov (United States)

    2010-02-25

    ... UNITED STATES Revision of Certain Dollar Amounts in the Bankruptcy Code Prescribed Under Section 104(A...: Francis F. Szczebak, Chief, Bankruptcy Judges Division, Administrative Office of the United States Courts, Washington, DC 20544, telephone (202) 502-1900 or by e-mail at Bankruptcy[email protected

  19. Territorial jurisdiction of the courts under the new Civil Procedure Code

    OpenAIRE

    Ilie-Viorel ARSENE

    2014-01-01

    The new Civil Procedure Code became effective on 15th February 2013 and brought new stipulations for most of its institutions. The most "spectacular" of these concern the written stage and, within it, the regularization of the writ of summons. Much less noticeable are the innovations of the new Civil Procedure Code in terms of jurisdiction, namely territorial jurisdiction. This does not mean they do not exist or that they are less important than others. Let us not forget...

  20. Review of design codes of concrete encased steel short columns under axial compression

    Directory of Open Access Journals (Sweden)

    K.Z. Soliman

    2013-08-01

    In recent years, the use of concrete encased steel columns has increased significantly in medium-rise and high-rise buildings. The aim of the present investigation is to assess experimentally the current methods and codes for evaluating the ultimate load behavior of concrete encased steel short columns. The current state of design provisions for composite columns in the Egyptian codes ECP203-2007 and ECP-SC-LRFD-2012, as well as the American Institute of Steel Construction AISC-LRFD-2010, the American Concrete Institute ACI-318-2008, and the British Standard BS-5400-5, was reviewed. The portions of the axial capacity carried by the encased steel section and the concrete section were also studied according to these codes. Ten encased steel concrete columns were investigated experimentally to study the effect of concrete confinement and different types of encased steel sections. The measured axial capacity of the ten tested composite columns was compared with the values calculated by the above-mentioned codes. It is concluded that non-negligible discrepancies exist between the codes and the experimental results, as the confinement effect was not considered in predicting both the strength and the ductility of the concrete. The confining effect was clearly influenced by the shape of the encased steel section: the tube-shaped steel section leads to better confinement than the SIB section. Among the codes considered, ECP-SC-LRFD-2012 gave the most conservative results.

  1. Formation process of Malaysian modern architecture under influence of nationalism

    OpenAIRE

    宇高, 雄志; 山崎, 大智

    2001-01-01

    This paper examines the formation process of Malaysian modern architecture under the influence of nationalism, through the process of Malaysia's independence. The national style, "Malaysian national architecture", was shaped against the background of the political environment of the post-colonial situation. Malaysian urban design is also determined by the balance between the ethnic cultures and the national culture. In Malaysia, the Malay ethnic culture was chosen as the national culture....

  2. A multidisciplinary audit of clinical coding accuracy in otolaryngology: financial, managerial and clinical governance considerations under payment-by-results.

    Science.gov (United States)

    Nouraei, S A R; O'Hanlon, S; Butler, C R; Hadovsky, A; Donald, E; Benjamin, E; Sandhu, G S

    2009-02-01

    To audit the accuracy of otolaryngology clinical coding and identify ways of improving it. Prospective multidisciplinary audit, using the 'national standard clinical coding audit' methodology supplemented by 'double-reading and arbitration'. Teaching-hospital otolaryngology and clinical coding departments. Otolaryngology inpatient and day-surgery cases. Concordance between initial coding performed by a coder (first cycle) and final coding by a clinician-coder multidisciplinary team (MDT; second cycle) for primary and secondary diagnoses and procedures, and Health Resource Groupings (HRG) assignment. 1250 randomly-selected cases were studied. Coding errors occurred in 24.1% of cases (301/1250). The clinician-coder MDT reassigned 48 primary diagnoses and 186 primary procedures and identified a further 209 initially-missed secondary diagnoses and procedures. In 203 cases, the patient's initial HRG changed. Incorrect coding caused an average revenue loss of 174.90 pounds per patient (14.7%), and 60% of the total income variance was due to miscoding of eight highly complex head and neck cancer cases. The 'HRG drift' created the appearance of disproportionate resource utilisation when treating 'simple' cases. At our institution, the total cost of maintaining a clinician-coder MDT was 4.8 times lower than the income regained through the double-reading process. This large audit of otolaryngology practice identifies a large degree of error in coding on discharge. This leads to a significant loss of departmental revenue, and given that the same data are used for benchmarking and for making decisions about resource allocation, it distorts the picture of clinical practice. These errors can be rectified by implementing a cost-effective clinician-coder double-reading multidisciplinary team as part of a data-assurance clinical governance framework, which we recommend should be established in hospitals.

  3. Parallel processing is good for your scientific codes...But massively parallel processing is so much better

    International Nuclear Information System (INIS)

    Thomas, B.; Domain, Ch.; Souffez, Y.; Eon-Duval, P.

    1998-01-01

    Harnessing the power of many computers to solve difficult scientific problems concurrently is one of the most innovative trends in High Performance Computing. At EDF, we have invested in parallel computing and have achieved significant results. First, we improved the processing speed of strategic codes in order to extend their scope. Then we turned to numerical simulations at the atomic scale. These computations, which we never dreamt of before, provided us with a better understanding of metallurgical phenomena. More precisely, we were able to trace defects in the alloys that are used in nuclear power plants. (author)

  4. An evaluation of the effect of JPEG, JPEG2000, and H.264/AVC on CQR codes decoding process

    Science.gov (United States)

    Vizcarra Melgar, Max E.; Farias, Mylène C. Q.; Zaghetto, Alexandre

    2015-02-01

    This paper presents a binary matrix code based on the QR Code (Quick Response Code), denoted the CQR Code (Colored Quick Response Code), and evaluates the effect of JPEG, JPEG2000 and H.264/AVC compression on the decoding process. The proposed CQR Code has three additional colors (red, green and blue), which enables twice as much storage capacity compared to the traditional black and white QR Code. Using the Reed-Solomon error-correcting code, the CQR Code model has a theoretical correction capability of 38.41%. The goal of this paper is to evaluate the effect that degradations inserted by common image compression algorithms have on the decoding process. Results show that a successful decoding process can be achieved for compression rates up to 0.3877 bits/pixel, 0.1093 bits/pixel and 0.3808 bits/pixel for the JPEG, JPEG2000 and H.264/AVC formats, respectively. The algorithm that presents the best performance is H.264/AVC, followed by JPEG2000 and JPEG.

  5. Sparse spike coding : applications of neuroscience to the processing of natural images

    Science.gov (United States)

    Perrinet, Laurent U.

    2008-04-01

    While modern computers are sometimes superior to human cognition in specialized tasks such as playing chess or browsing a large database, they cannot beat the efficiency of biological vision for such simple tasks as recognizing a relative or following an object in a complex background. We present in this paper our attempt at outlining the dynamical, parallel and event-based representation for vision in the architecture of the central nervous system. We illustrate this by showing that, in a signal matching framework, a L/LN (linear/non-linear) cascade may efficiently transform a sensory signal into a neural spiking signal, and we apply this framework to a model retina. However, this code becomes redundant when using an over-complete basis, as is necessary for modeling the primary visual cortex; we therefore optimize the efficiency cost by increasing the sparseness of the code. This is implemented by propagating and canceling redundant information using lateral interactions. We evaluate the efficiency of this representation in terms of compression, measuring the reconstruction quality as a function of the coding length. This corresponds to a modification of the Matching Pursuit algorithm in which the ArgMax function is optimized for competition, or Competition Optimized Matching Pursuit (COMP). We particularly focus on bridging neuroscience and image processing and on the advantages of such an interdisciplinary approach.
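
    The baseline algorithm that COMP modifies can be stated in a few lines: greedily pick the dictionary atom with the largest correlation (the ArgMax step), subtract its contribution, and repeat. The sketch below uses a synthetic overcomplete dictionary and is illustrative only:

    ```python
    # Bare-bones Matching Pursuit on a synthetic overcomplete dictionary.
    import numpy as np

    rng = np.random.default_rng(0)
    D = rng.standard_normal((64, 256))
    D /= np.linalg.norm(D, axis=0)             # unit-norm atoms (overcomplete)

    truth = np.zeros(256)
    truth[[3, 77, 200]] = [1.5, -2.0, 1.0]     # a sparse "spike" code
    signal = D @ truth

    residual, coeffs = signal.copy(), np.zeros(256)
    for _ in range(10):
        corr = D.T @ residual
        k = int(np.argmax(np.abs(corr)))       # the ArgMax step COMP optimizes
        coeffs[k] += corr[k]
        residual -= corr[k] * D[:, k]          # cancel the explained part

    print("recovered atoms:", np.nonzero(np.round(coeffs, 3))[0])
    print("residual norm:", np.linalg.norm(residual))
    ```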

  6. SSYST: A code-system for analyzing transient LWR fuel rod behaviour under off-normal conditions

    International Nuclear Information System (INIS)

    Borgwaldt, H.; Gulden, W.

    1983-01-01

    SSYST is a code-system for analyzing transient fuel rod behaviour under off-normal conditions, developed conjointly by the Institut fur Kernenergetik und Energiesysteme (IKE), Stuttgart, and Kernforschungszentrum Karlsruhe (KfK) under contract of Projekt Nukleare Sicherheit (PNS) at KfK. The main differences between SSYST and similar codes are an open-ended modular code organization, and a preference for simple models, wherever possible. While the first feature makes SSYST a very flexible tool, easily adapted to changing requirements, the second feature leads to short execution times. The analysis of transient rod behaviour under LOCA boundary conditions takes 2 min cpu-time (IBM-3033), so that extensive parametric studies become possible. This paper gives an outline of the overall code organisation and a general overview of the physical models implemented. Besides explaining the routine application of SSYST in the analysis of loss-of-coolant accidents, examples are given of special applications which have led to a satisfactory understanding of the decisive influence of deviations from rotational symmetry on the fuel rod perimeter

  7. SSYST: A code-system for analysing transient LWR fuel rod behaviour under off-normal conditions

    International Nuclear Information System (INIS)

    Borgwaldt, H.; Gulden, W.

    1983-01-01

    SSYST is a code-system for analysing transient fuel rod behaviour under off-normal conditions, developed conjointly by the Institut fuer Kernenergetik und Energiesysteme (IKE), Stuttgart, and Kernforschungszentrum Karlsruhe (KfK) under contract of Projekt Nukleare Sicherheit (PNS) at KfK. The main differences versus codes with similar applications are: (1) an open-ended modular code organisation, and (2) a preference for simple models, wherever possible. While the first feature makes SSYST a very flexible tool, easily adapted to changing requirements, the second feature leads to short execution times. The analysis of transient rod behaviour under LOCA boundary conditions takes 2 mins cpu-time (IBM-3033), so that extensive parametric studies become possible. The paper gives an outline of the overall code organisation and a general overview of the physical models implemented. Besides explaining the routine application of SSYST in the analysis of loss-of-coolant accidents, examples are given of special applications, which have led to a satisfactory understanding of the decisive influence of deviations from rotational symmetry on the fuel rod perimeter. (author)

  8. Virus-host co-evolution under a modified nuclear genetic code

    Directory of Open Access Journals (Sweden)

    Derek J. Taylor

    2013-03-01

    Among eukaryotes with modified nuclear genetic codes, viruses are unknown. However, here we provide evidence of an RNA virus that infects a fungal host (Scheffersomyces segobiensis) with a derived nuclear genetic code where CUG codes for serine. The genomic architecture and phylogeny are consistent with infection by a double-stranded RNA virus of the genus Totivirus. We provide evidence of past or present infection with totiviruses in five species of yeasts with modified genetic codes. All but one of the CUG codons in the viral genome have been eliminated, suggesting that avoidance of the modified codon was important to viral adaptation. Our mass spectroscopy analysis indicates that a congener of the host species has co-opted and expresses a capsid gene from totiviruses as a cellular protein. Viral avoidance of the host's modified codon and host co-option of a protein from totiviruses suggest that RNA viruses co-evolved with yeasts that underwent a major evolutionary transition from the standard genetic code.

  9. Implementation of decommissioning materials conditional clearance process to the OMEGA calculation code

    International Nuclear Information System (INIS)

    Zachar, Matej; Necas, Vladimir; Daniska, Vladimir

    2011-01-01

    The activities performed during the decommissioning of a nuclear installation inevitably lead to the production of large amounts of radioactive material to be managed. A significant part of this material has a radioactivity level low enough to allow its release to the environment without any restriction on further use. For materials with radioactivity slightly above the unconditional clearance level, there is the possibility of releasing them conditionally for a specific purpose, in accordance with a scenario developed to assure that radiation exposure limits for the population are not exceeded. Managing decommissioning materials in this way can lead to the recycling and reuse of more solid material and save radioactive waste repository volume. In this paper, the implementation of the conditional release process in the OMEGA code, which is used for the calculation of decommissioning parameters, is analyzed in detail. The analytical approach to assessing the material parameters first defines radiological limit conditions, based on the evaluation of possible scenarios for conditionally released materials, and applies them to the appropriate sorter type in the existing material and radioactivity flow system. Further calculation procedures with the relevant technological and economic parameters, mathematically describing, e.g., final radiation monitoring or transport outside the locality, are then applied in the OMEGA code. Together with the limits, the new procedures, which create an independent material stream, allow evaluation of the conditional material release process during decommissioning. Model calculations evaluating various scenarios with different input parameters and considering the conditional release of materials to the environment were performed to verify the implemented methodology. Output parameters and results of the model assessment are presented and discussed in the final part of the paper.

  10. Development of the object-oriented analysis code for the estimation of material balance in pyrochemical reprocessing process (2). Modification of the code for the analysis of holdup of nuclear materials in the process

    International Nuclear Information System (INIS)

    Okamura, Nobuo; Tanaka, Hiroshi

    2001-04-01

    Pyrochemical reprocessing is thought to be a promising process for the FBR fuel cycle, mainly from the economical viewpoint. However, material behavior in the process is not sufficiently clarified because of the lack of experimental data. The authors have developed an object-oriented analysis code for the estimation of the material balance in the process, which is flexibly applicable to changes in the process flow sheet. The objective of this study is to modify the code so as to analyze the holdup of nuclear materials in the pyrochemical process from the safeguards viewpoint, because of the possibility of a larger holdup in the process compared with an aqueous process. As a result of the modification, the relationship between the production of nuclear materials and their holdup in the process can be evaluated by the code. (author)

  11. Enabling Ethical Code Embeddedness in Construction Organizations: A Review of Process Assessment Approach.

    Science.gov (United States)

    Oladinrin, Olugbenga Timo; Ho, Christabel Man-Fong

    2016-08-01

    Several researchers have identified codes of ethics (CoEs) as tools that stimulate positive ethical behavior by shaping the organisational decision-making process, but few have considered the information needed for code implementation. Beyond being a legal and moral responsibility, ethical behavior needs to become an organisational priority, which requires an alignment process that integrates employee behavior with the organisation's ethical standards. This paper discusses processes for the responsible implementation of CoEs based on an extensive review of the literature. The internationally recognized European Foundation for Quality Management Excellence Model (EFQM model) is proposed as a suitable framework for assessing an organisation's ethical performance, including CoE embeddedness. The findings presented herein have both practical and research implications. They will encourage construction practitioners to shift their attention from ethical policies to possible enablers of CoE implementation and serve as a foundation for further research on ethical performance evaluation using the EFQM model. This is the first paper to discuss the model's use in the context of ethics in construction practice.

  12. CXSFIT Code Application to Process Charge-Exchange Recombination Spectroscopy Data at the T-10 Tokamak

    Science.gov (United States)

    Serov, S. V.; Tugarinov, S. N.; Klyuchnikov, L. A.; Krupin, V. A.; von Hellermann, M.

    2017-12-01

    The applicability of the CXSFIT code to process experimental data from Charge-eXchange Recombination Spectroscopy (CXRS) diagnostics at the T-10 tokamak is studied with a view to its further use for processing experimental data at the ITER facility. The design and operating principle of the CXRS diagnostics are described. The main methods for processing the CXRS spectra of the 5291-Å line of C5+ ions at the T-10 tokamak (with and without subtraction of parasitic emission from the edge plasma) are analyzed. The method of averaging the CXRS spectra over several shots, which is used at the T-10 tokamak to increase the signal-to-noise ratio, is described. The approximation of the spectrum by a set of Gaussian components is used to identify the active CXRS line in the measured spectrum. Using the CXSFIT code, the ion temperature in ohmic discharges and discharges with auxiliary electron cyclotron resonance heating (ECRH) at the T-10 tokamak is calculated from the CXRS spectra of the 5291-Å line. The time behavior of the ion temperature profile in different ohmic heating modes is studied. The temperature profile dependence on the ECRH power is measured, and the dynamics of ECR removal of carbon nuclei from the T-10 plasma is described. Experimental data from the CXRS diagnostics at T-10 substantially contribute to the implementation of physical programs of studies on heat and particle transport in tokamak plasmas and investigation of geodesic acoustic mode properties.
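
    The Gaussian-decomposition step lends itself to a compact illustration. The sketch below fits a two-component Gaussian model to a synthetic spectrum near the 5291-Å line using SciPy; it is not the CXSFIT algorithm, and all data and component parameters are simulated.

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss(x, a, mu, sigma):
    return a * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

def two_gauss(x, a1, mu1, s1, a2, mu2, s2):
    # narrow "active" CXRS component plus a broad background component
    return gauss(x, a1, mu1, s1) + gauss(x, a2, mu2, s2)

x = np.linspace(5285.0, 5297.0, 200)            # wavelength grid (Å)
rng = np.random.default_rng(0)
y = two_gauss(x, 1.0, 5291.0, 0.8, 0.4, 5290.5, 2.5) \
    + 0.01 * rng.standard_normal(x.size)        # synthetic noisy spectrum

p0 = [1.0, 5291.0, 1.0, 0.5, 5290.0, 3.0]       # initial parameter guesses
popt, _ = curve_fit(two_gauss, x, y, p0=p0)
print("active component: amplitude %.2f, centre %.2f Å, width %.2f Å"
      % tuple(popt[:3]))
```

    The ion temperature then follows from the Doppler width of the identified active component.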

  13. Error analysis of supercritical water correlations using ATHLET system code under DHT conditions

    Energy Technology Data Exchange (ETDEWEB)

    Samuel, J., E-mail: jeffrey.samuel@uoit.ca [Univ. of Ontario Inst. of Tech., Oshawa, ON (Canada)

    2014-07-01

    The thermal-hydraulic computer code ATHLET (Analysis of THermal-hydraulics of LEaks and Transients) is used for the analysis of anticipated and abnormal plant transients, including safety analysis of Light Water Reactors (LWRs) and Russian Graphite-Moderated High Power Channel-type Reactors (RBMKs). The range of applicability of ATHLET has been extended to supercritical water by updating the fluid- and transport-properties packages, thus enabling the code to be used in the analysis of SuperCritical Water-cooled Reactors (SCWRs). Several well-known heat-transfer correlations for supercritical fluids were added to the ATHLET code, and a numerical model was created to represent an experimental test section. In this work, the error in the Heat Transfer Coefficient (HTC) calculated by the ATHLET model is studied, along with the ability of the various correlations to predict different heat transfer regimes. (author)
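
    To make the error measure concrete, the sketch below compares correlation-predicted heat transfer coefficients with hypothetical measured values. The classic Dittus-Boelter correlation stands in for the supercritical-water correlations actually studied, and every number is an assumption.

```python
import numpy as np

def htc_dittus_boelter(re, pr, k, d_h):
    """HTC (W/m^2K) from Nu = 0.023 Re^0.8 Pr^0.4 (heating)."""
    return 0.023 * re**0.8 * pr**0.4 * k / d_h

re = np.array([1.2e5, 2.0e5, 3.5e5])        # Reynolds numbers (assumed)
pr = np.array([0.9, 1.1, 1.4])              # Prandtl numbers (assumed)
k, d_h = 0.42, 0.008                        # W/mK and m (assumed)
htc_meas = np.array([9.0e3, 1.5e4, 2.6e4])  # hypothetical measured HTCs

htc_calc = htc_dittus_boelter(re, pr, k, d_h)
rel_err = (htc_calc - htc_meas) / htc_meas * 100.0
for c, m, e in zip(htc_calc, htc_meas, rel_err):
    print(f"calc {c:9.1f} W/m^2K  meas {m:9.1f} W/m^2K  error {e:+6.1f} %")
```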

  14. ASPECTS CONCERNING THE JOINT VENTURE UNDER THE REGULATION OF THE NEW CIVIL CODE

    Directory of Open Access Journals (Sweden)

    Ana-Maria Lupulescu

    2013-11-01

    Full Text Available The New Civil Code makes the transition, for the first time in the Romanian legal system, from the duality to the unity of private law. Consequently, the Civil Code contains a more structured and comprehensive regulation of the company, though not beyond criticism, with particular reference to the simple company; this regulation expressly presents itself as the common law in the field. Within these general provisions, the legislator has also considered the joint venture, to which, as in the previous regulation contained in the old Commercial Code (now repealed), it devotes only a few provisions in order to maintain the flexibility of this form of company. Therefore, the present approach appears particularly useful for legal analysts and, especially, for practitioners, since it aims to achieve a comprehensive analysis of the joint venture, a form of company with practical relevance.

  15. Automated processing of thermal infrared images of Osservatorio Vesuviano permanent surveillance network by using Matlab code

    Science.gov (United States)

    Sansivero, Fabio; Vilardo, Giuseppe; Caputo, Teresa

    2017-04-01

    The permanent thermal infrared surveillance network of Osservatorio Vesuviano (INGV) is composed of 6 stations which acquire IR frames of fumarole fields in the Campi Flegrei caldera and inside the Vesuvius crater (Italy). The IR frames are uploaded to a dedicated server in the Surveillance Center of Osservatorio Vesuviano in order to process the infrared data and to extract all the information they contain. In a first phase the infrared data are processed by an automated system (A.S.I.R.A. Acq - Automated System of IR Analysis and Acquisition) developed in the Matlab environment with a user-friendly graphical user interface (GUI). ASIRA daily generates time series of residual values of the maximum temperatures observed in the IR scenes after the removal of seasonal effects. These time series are displayed in the Surveillance Room of Osservatorio Vesuviano and provide information about the evolution of the shallow temperature field of the observed areas. In particular, the features of ASIRA Acq include: a) efficient quality selection of IR scenes; b) co-registration of IR images with respect to a reference frame; c) seasonal correction using a background-removal methodology; d) filing of IR matrices and of the processed data in shared archives accessible to interrogation. The daily archived records can also be processed by ASIRA Plot (Matlab code with GUI) to visualize IR data time series and to help in evaluating input parameters for further data processing and analysis. Additional processing features are accomplished in a second phase by ASIRA Tools, a Matlab code with GUI developed to extract further information from the dataset in an automated way. The main functions of ASIRA Tools are: a) the analysis of temperature variations of each pixel of the IR frame in a given time interval; b) the removal of seasonal effects from the temperature of every pixel in the IR frames by using an analytic approach (removal of the sinusoidal long-term seasonal component by using a
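
    The seasonal-correction step can be sketched compactly: fit a one-year sinusoid to the daily maximum-temperature series by linear least squares and keep the residuals. The data below are synthetic, and the operational ASIRA chain is considerably more involved.

```python
import numpy as np

days = np.arange(0, 3 * 365)                  # three years of daily samples
rng = np.random.default_rng(1)
t_max = (20.0 + 6.0 * np.sin(2 * np.pi * days / 365.25 + 0.4)
         + 0.004 * days + rng.normal(0.0, 0.5, days.size))

# Linear model T = c0 + c1*sin(w t) + c2*cos(w t), with w = 2*pi/365.25
w = 2 * np.pi / 365.25
A = np.column_stack([np.ones(days.size), np.sin(w * days), np.cos(w * days)])
coef, *_ = np.linalg.lstsq(A, t_max, rcond=None)
residual = t_max - A @ coef                   # de-seasoned residual series
print(f"residual std after seasonal removal: {residual.std():.2f} °C")
```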

  16. Standardizing texture and facies codes for a process-based classification of clastic sediment and rock

    Science.gov (United States)

    Farrell, K.M.; Harris, W.B.; Mallinson, D.J.; Culver, S.J.; Riggs, S.R.; Pierson, J.; ,; Lautier, J.C.

    2012-01-01

    Proposed here is a universally applicable, texturally based classification of clastic sediment that is independent of composition, cementation, and geologic environment, is closely allied to process sedimentology, and applies to all compartments in the source-to-sink system. The classification is contingent on defining the term "clastic" so that it is independent of composition or origin and includes any particles or grains that are subject to erosion, transportation, and deposition. Modifications to Folk's (1980) texturally based classification that include applying new assumptions and defining a broader array of textural fields are proposed to accommodate this. The revised ternary diagrams include additional textural fields that better define poorly sorted and coarse-grained deposits, so that all end members (gravel, sand, and mud size fractions) are included in textural codes. Revised textural fields, or classes, are based on a strict adherence to volumetric estimates of percentages of gravel, sand, and mud size grain populations, which by definition must sum to 100%. The new classification ensures that descriptors are applied consistently to all end members in the ternary diagram (gravel, sand, and mud) according to several rules, and that none of the end members are ignored. These modifications provide bases for standardizing vertical displays of texture in graphic logs, lithofacies codes, and their derivatives, hydrofacies. Hydrofacies codes are nondirectional permeability indicators that predict aquifer or reservoir potential. Folk's (1980) ternary diagram for fine-grained clastic sediments (sand, silt, and clay size fractions) is also revised to preserve consistency with the revised diagram for gravel, sand, and mud. Standardizing texture ensures that the principles of process sedimentology are consistently applied to compositionally variable rock sequences, such as mixed carbonate-siliciclastic ramp settings, and the extreme ends of depositional
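
    As a toy illustration of how volumetric end-member percentages map to compact labels, the sketch below builds Folk-style codes such as 'msG' (muddy sandy Gravel). The 10% cutoff and the resulting fields are invented for demonstration and are not the revised fields proposed here.

```python
def textural_code(gravel: float, sand: float, mud: float) -> str:
    """Return a Folk-style code; subordinate fractions appear as lowercase
    modifiers before the dominant, capitalized end member."""
    if abs(gravel + sand + mud - 100.0) > 1e-6:
        raise ValueError("end-member percentages must sum to 100%")
    parts = sorted([("g", gravel), ("s", sand), ("m", mud)], key=lambda p: p[1])
    modifiers = "".join(letter for letter, pct in parts[:-1] if pct >= 10.0)
    return modifiers + parts[-1][0].upper()

print(textural_code(60.0, 30.0, 10.0))  # 'msG' - muddy sandy Gravel
print(textural_code(5.0, 80.0, 15.0))   # 'mS'  - muddy Sand
```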

  17. RODSWELL: a computer code for the thermomechanical analysis of fuel rods under LOCA conditions

    International Nuclear Information System (INIS)

    Casadei, F.; Laval, H.; Donea, J.; Jones, P.M.; Colombo, A.

    1984-01-01

    The code calculates the variation in space and time of all significant fuel rod variables, including fuel, gap and cladding temperature, fuel and cladding deformation, cladding oxidation and rod internal pressure. The code combines a transient 2-dimensional heat conduction code and a 1-dimensional mechanical model for the cladding deformation. The first sections of this report deal with the heat conduction model and the finite element discretization used for the thermal analysis. The mechanical deformation model is presented next: modelling of creep, phase change and oxidation of the zircaloy cladding is discussed in detail. A model describing the effect of oxidation and oxide cracking on the mechanical strength of the cladding is presented too. Next a mechanical restraint model, which allows the simulation of the presence of the neighbouring rods and is particularly important in assessing the amount of channel blockage during a transient, is presented. A description of the models used for the coolant conditions and for the power generation follows. The heat source can be placed either in the fuel or in the cladding, and direct or indirect clad heating by electrical power can be simulated. Then a section follows, dealing with the steady-state and transient types of calculation and with the automatic variable time step selection during the transient. The last sections deal with presentation of results, graphical output, test problems and an example of general application of the code
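
    A greatly simplified sketch of the thermal part is given below: explicit finite differences for transient radial conduction in a pellet with a volumetric heat source and a fixed surface temperature, checked against the analytic steady rise q_v*R^2/(4k). RODSWELL itself uses 2-D finite elements coupled to a mechanical model, and the property values here are rough assumptions.

```python
import numpy as np

n, R = 50, 0.0047                  # radial nodes, pellet radius (m)
k, rho, cp = 3.5, 10500.0, 300.0   # UO2-like k (W/mK), density, heat capacity
qvol = 3.0e8                       # volumetric heat rate (W/m^3), assumed
alpha = k / (rho * cp)
dr = R / (n - 1)
dt = 0.2 * dr**2 / alpha           # stable explicit time step
r = np.linspace(0.0, R, n)
T = np.full(n, 600.0)              # initial temperature (K)
T_surf = 600.0                     # fixed surface temperature (K)

for _ in range(60000):             # march to (near) steady state
    Tn = T.copy()
    # interior: dT/dt = alpha*(T'' + T'/r) + q/(rho*cp)
    lap = ((Tn[2:] - 2 * Tn[1:-1] + Tn[:-2]) / dr**2
           + (Tn[2:] - Tn[:-2]) / (2 * dr * r[1:-1]))
    T[1:-1] = Tn[1:-1] + dt * (alpha * lap + qvol / (rho * cp))
    # centreline symmetry: Laplacian -> 4*(T1 - T0)/dr^2
    T[0] = Tn[0] + dt * (alpha * 4 * (Tn[1] - Tn[0]) / dr**2 + qvol / (rho * cp))
    T[-1] = T_surf

print(f"computed centreline rise: {T[0] - T[-1]:6.0f} K")
print(f"analytic q*R^2/(4k):      {qvol * R**2 / (4 * k):6.0f} K")
```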

  18. Parameters that affect parallel processing for computational electromagnetic simulation codes on high performance computing clusters

    Science.gov (United States)

    Moon, Hongsik

    What is the impact of multicore and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research, and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs for the future. Most software applications benefited from the increased computing power the same way that increases in clock speed helped applications run faster. However, for Computational ElectroMagnetics (CEM) software developers, this change was not an obvious benefit - it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization, and this dissertation details the investigation to address these challenges. Prior to multicore CPUs, advanced computer technologies were compared using benchmark software, and the metric was FLoating-point Operations Per Second (FLOPS), which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore systems? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to the type and utilization of the hardware, such as the CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPS, and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced. This code was developed to address the needs of the
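
    The FLOPS metric itself is easy to demonstrate: time a dense matrix multiply, which costs roughly 2n^3 floating-point operations. The figure obtained depends heavily on the BLAS build and the number of cores, which is precisely the point about FLOPS alone being an incomplete benchmark.

```python
import time
import numpy as np

n = 2048
a = np.random.rand(n, n)
b = np.random.rand(n, n)
a @ b                                   # warm-up (thread pools, caches)

t0 = time.perf_counter()
c = a @ b
dt = time.perf_counter() - t0
print(f"~{2 * n**3 / dt / 1e9:.1f} GFLOPS sustained on this machine")
```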

  19. Development of hydraulic analysis code for optimizing thermo-chemical IS process reactors

    International Nuclear Information System (INIS)

    Terada, Atsuhiko; Hino, Ryutaro; Hirayama, Toshio; Nakajima, Norihiro; Sugiyama, Hitoshi

    2007-01-01

    The Japan Atomic Energy Agency has been conducting studies on the thermochemical IS process for water-splitting hydrogen production. Based on the test results and know-how obtained through bench-scale tests, a pilot test plant with a hydrogen production performance of 30 Nm³/h is being designed conceptually as the next step of the IS process development. In the design of the IS pilot plant, it is important to make the chemical reactors compact with high performance, from the viewpoint of plant cost reduction. A new hydraulic analysis code has been developed for optimizing the mixing performance of multi-phase flow involving chemical reactions, especially in the Bunsen reactor, where a complex flow pattern with gas-liquid chemical interaction involving flow instability is expected. Preliminary analytical results obtained with the above-mentioned code, especially flow patterns induced by swirling flow, agreed well with those measured in water experiments, which showed a vortex breakdown pattern in a simplified Bunsen reactor. (author)

  20. Genetic Code Expansion as a Tool to Study Regulatory Processes of Transcription

    Science.gov (United States)

    Schmidt, Moritz; Summerer, Daniel

    2014-02-01

    The expansion of the genetic code with noncanonical amino acids (ncAA) enables the chemical and biophysical properties of proteins to be tailored, inside cells, with a previously unattainable level of precision. A wide range of ncAA with functions not found in canonical amino acids have been genetically encoded in recent years and have delivered insights into biological processes that would be difficult to access with traditional approaches of molecular biology. A major field for the development and application of novel ncAA-functions has been transcription and its regulation. This is particularly attractive, since advanced DNA sequencing- and proteomics-techniques continue to deliver vast information on these processes on a global level, but complementing methodologies to study them on a detailed, molecular level and in living cells have been comparably scarce. In a growing number of studies, genetic code expansion has now been applied to precisely control the chemical properties of transcription factors, RNA polymerases and histones, and this has enabled new insights into their interactions, conformational changes, cellular localizations and the functional roles of posttranslational modifications.

  1. From chemical metabolism to life: the origin of the genetic coding process

    Directory of Open Access Journals (Sweden)

    Antoine Danchin

    2017-06-01

    Full Text Available Looking for origins is so rooted in ideology that most studies reflect opinions that fail to explore the first realistic scenarios. To be sure, trying to understand the origins of life should be based on what we know of current chemistry in the solar system and beyond. There, amino acids and very small compounds such as carbon dioxide, dihydrogen or dinitrogen and their immediate derivatives are ubiquitous. Surface-based chemical metabolism using these basic chemicals is the most likely beginning, in which amino acids, coenzymes and phosphate-based small carbon molecules were built up. Nucleotides, and of course RNAs, must have come into being much later. As a consequence, the key question in accounting for life is to understand how a chemical metabolism that began with amino acids progressively shaped into a coding process involving RNAs. Here I explore the role of building up complementarity rules as the first information-based process that allowed the genetic code to emerge, after RNAs were substituted for surfaces as carriers of the basic metabolic pathways that drive the pursuit of life.

  2. Stability of prebiotic, laminaran oligosaccharide under food processing conditions

    Science.gov (United States)

    Chamidah, A.

    2018-04-01

    Prebiotic stability tests on laminaran oligosaccharide under food processing conditions were performed to determine the ability of the prebiotic to withstand processing. Laminaran oligosaccharide is produced by enzymatic hydrolysis. To further apply this prebiotic, it is necessary to test its performance in food processing. A prebiotic, alone or in combination with a probiotic, can improve human digestive health. The evaluation of a prebiotic's effectiveness should take into account its chemical and functional stability. This study aims to investigate the stability of laminaran oligosaccharide under food processing conditions.

  3. Samovar: a thermomechanical code for modeling of geodynamic processes in the lithosphere-application to basin evolution

    DEFF Research Database (Denmark)

    Elesin, Y; Gerya, T; Artemieva, Irina

    2010-01-01

    We present a new 2D finite difference code, Samovar, for high-resolution numerical modeling of complex geodynamic processes. Examples are collision of lithospheric plates (including mountain building and subduction) and lithosphere extension (including formation of sedimentary basins, regions...

  4. Code of Practice on Radiation Protection in the Mining and Processing of Mineral Sands (1982) (Western Australia)

    International Nuclear Information System (INIS)

    1982-01-01

    This Code establishes radiation safety practices for the mineral sands industry in Western Australia. The Code prescribes, not only for operators and managers of mines and processing plants but for their employees as well, certain duties designed to ensure that radiation exposure is kept as low as reasonably practicable. The Code further provides for the management of wastes, again with a view to keeping contaminant concentrations and dose rates within specified levels. Finally, provision is made for the rehabilitation of those sites in which mining or processing operations have ceased by restoring the areas to designated average radiation levels. (NEA) [fr]

  5. The effect of interspike interval statistics on the information gain under the rate coding hypothesis

    Czech Academy of Sciences Publication Activity Database

    Koyama, S.; Košťál, Lubomír

    2014-01-01

    Vol. 11, No. 1 (2014), pp. 63-80. ISSN 1547-1063. [International Workshop on Neural Coding (NC) /10./, Prague, 02.09.2012-07.09.2012] R&D Projects: GA ČR(CZ) GPP103/12/P558. Institutional support: RVO:67985823. Keywords: Kullback-Leibler divergence * neural spike trains * Fisher information. Subject RIV: BD - Theory of Information. Impact factor: 0.840, year: 2014

  6. Implementation of PhotoZ under Astro-WISE. A photometric redshift code for large datasets

    Science.gov (United States)

    Saglia, Roberto P.; Snigula, Jan; Senger, Robert; Bender, Ralf

    2013-01-01

    We describe the implementation of the PhotoZ code in the framework of the Astro-WISE package and as part of the Photometric Classification Server of the PanSTARRS pipeline. Both systems allow the automatic measurement of photometric redshifts for the millions of objects being observed in the PanSTARRS project or expected to be observed by future surveys like KIDS, DES or EUCLID.

  7. Durability of switchable QR code carriers under hydrolytic and photolytic conditions

    Science.gov (United States)

    Ecker, Melanie; Pretsch, Thorsten

    2013-09-01

    Following a guest diffusion approach, the surface of a shape memory poly(ester urethane) (PEU) was either black or blue colored. Bowtie-shaped quick response (QR) code carriers were then obtained from laser engraving and cutting, before thermo-mechanical functionalization (programming) was applied to stabilize the PEU in a thermo-responsive (switchable) state. The stability of the dye within the polymer surface and long-term functionality of the polymer were investigated against UVA and hydrolytic ageing. Spectrophotometric investigations verified UVA ageing-related color shifts from black to yellow-brownish and blue to petrol-greenish whereas hydrolytically aged samples changed from black to greenish and blue to light blue. In the case of UVA ageing, color changes were accompanied by dye decolorization, whereas hydrolytic ageing led to contrast declines due to dye diffusion. The Michelson contrast could be identified as an effective tool to follow ageing-related contrast changes between surface-dyed and laser-ablated (undyed) polymer regions. As soon as the Michelson contrast fell below a crucial value of 0.1 due to ageing, the QR code was no longer decipherable with a scanning device. Remarkably, the PEU information carrier base material could even then be adequately fixed and recovered. Hence, the surface contrast turned out to be the decisive parameter for QR code carrier applicability.
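
    The decipherability criterion reduces to a one-line computation, sketched below with arbitrary intensity values: the Michelson contrast C = (I_max - I_min)/(I_max + I_min) between dyed and laser-ablated regions, with C < 0.1 marking the point where the code stops being readable.

```python
def michelson_contrast(i_dyed: float, i_ablated: float) -> float:
    i_max, i_min = max(i_dyed, i_ablated), min(i_dyed, i_ablated)
    return (i_max - i_min) / (i_max + i_min)

# arbitrary reflected-intensity pairs (dyed region, ablated region)
for dyed, ablated in [(20.0, 80.0), (45.0, 55.0), (49.0, 51.0)]:
    c = michelson_contrast(dyed, ablated)
    print(f"contrast {c:.2f} -> {'readable' if c >= 0.1 else 'not decipherable'}")
```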

  8. Durability of switchable QR code carriers under hydrolytic and photolytic conditions

    International Nuclear Information System (INIS)

    Ecker, Melanie; Pretsch, Thorsten

    2013-01-01

    Following a guest diffusion approach, the surface of a shape memory poly(ester urethane) (PEU) was either black or blue colored. Bowtie-shaped quick response (QR) code carriers were then obtained from laser engraving and cutting, before thermo-mechanical functionalization (programming) was applied to stabilize the PEU in a thermo-responsive (switchable) state. The stability of the dye within the polymer surface and long-term functionality of the polymer were investigated against UVA and hydrolytic ageing. Spectrophotometric investigations verified UVA ageing-related color shifts from black to yellow-brownish and blue to petrol-greenish whereas hydrolytically aged samples changed from black to greenish and blue to light blue. In the case of UVA ageing, color changes were accompanied by dye decolorization, whereas hydrolytic ageing led to contrast declines due to dye diffusion. The Michelson contrast could be identified as an effective tool to follow ageing-related contrast changes between surface-dyed and laser-ablated (undyed) polymer regions. As soon as the Michelson contrast fell below a crucial value of 0.1 due to ageing, the QR code was no longer decipherable with a scanning device. Remarkably, the PEU information carrier base material could even then be adequately fixed and recovered. Hence, the surface contrast turned out to be the decisive parameter for QR code carrier applicability. (paper)

  9. Modeling chemical gradients in sediments under losing and gaining flow conditions: The GRADIENT code

    Science.gov (United States)

    Boano, Fulvio; De Falco, Natalie; Arnon, Shai

    2018-02-01

    Interfaces between sediments and water bodies often represent biochemical hotspots for nutrient reactions and are characterized by steep concentration gradients of different reactive solutes. Vertical profiles of these concentrations are routinely collected to obtain information on nutrient dynamics, and simple codes have been developed to analyze these profiles and determine the magnitude and distribution of reaction rates within sediments. However, existing publicly available codes do not consider the potential contribution of water flow in the sediments to nutrient transport, and their applications to field sites with significant water-borne nutrient fluxes may lead to large errors in the estimated reaction rates. To fill this gap, the present work presents GRADIENT, a novel algorithm to evaluate distributions of reaction rates from observed concentration profiles. GRADIENT is a Matlab code that extends a previously published framework to include the role of nutrient advection, and provides robust estimates of reaction rates in sediments with significant water flow. This work discusses the theoretical basis of the method and shows its performance by comparing the results to a series of synthetic data and to laboratory experiments. The results clearly show that in systems with losing or gaining fluxes, the inclusion of such fluxes is critical for estimating local and overall reaction rates in sediments.
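
    A forward-model sketch of the problem GRADIENT inverts helps fix ideas: steady one-dimensional transport with diffusivity D, a vertical (losing or gaining) flux q, and a uniform reaction rate R, i.e. D*C'' - q*C' + R = 0. Solving it for gaining, neutral and losing conditions shows how q distorts the profile that a purely diffusive analysis would assume; all parameter values are illustrative.

```python
import numpy as np

n, L = 101, 0.10            # grid nodes, domain depth (m)
D = 1.0e-9                  # effective diffusivity (m^2/s)
R = -2.0e-6                 # uniform consumption rate (mol/m^3/s)
C_top, C_bot = 0.25, 0.0    # boundary concentrations (mol/m^3)
dz = L / (n - 1)

def solve_profile(q):
    """Central-difference solve of D*C'' - q*C' = -R with Dirichlet BCs."""
    A = np.zeros((n, n))
    b = np.zeros(n)
    A[0, 0] = A[-1, -1] = 1.0
    b[0], b[-1] = C_top, C_bot
    for i in range(1, n - 1):
        A[i, i - 1] = D / dz**2 + q / (2 * dz)
        A[i, i] = -2 * D / dz**2
        A[i, i + 1] = D / dz**2 - q / (2 * dz)
        b[i] = -R
    return np.linalg.solve(A, b)

for q in (-1e-7, 0.0, 1e-7):    # gaining, neutral, losing (m/s)
    C = solve_profile(q)
    print(f"q = {q:+.0e} m/s -> mid-depth C = {C[n // 2]:.3f} mol/m^3")
```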

  10. Processes underlying treatment success and failure in assertive community treatment.

    Science.gov (United States)

    Stull, Laura G; McGrew, John H; Salyers, Michelle P

    2012-02-01

    Processes underlying success and failure in assertive community treatment (ACT), a widely investigated treatment model for persons with severe mental illness, are poorly understood. The purpose of the current study was to examine processes in ACT by (1) understanding how consumers and staff describe the processes underlying treatment success and failure and (2) comparing processes identified by staff and consumers. Investigators conducted semi-structured interviews with 25 staff and 23 consumers from four ACT teams. Both staff and consumers identified aspects of the ACT team itself as the most critical in the process of consumer success. For failure, consumers identified consumer characteristics as most critical and staff identified lack of social relationships. Processes underlying failure were not viewed as merely the opposite of processes underlying success. In addition, there was notable disagreement between staff and consumers on important processes. Findings overlap with critical ingredients identified in previous studies, including aspects of the ACT team, social involvement and employment. In contrast to prior studies, there was little emphasis on hospitalizations and greater emphasis on not abusing substances, obtaining wants and desires, and consumer characteristics.

  11. Dispute settlement process under GATT/WTO diplomatic or judicial ...

    African Journals Online (AJOL)

    This paper probes the mechanisms of the dispute resolution process under the World Trade Organisation (WTO) and the General Agreement on Tariff and Trade (GATT). It tries to analyse the evolution of the dispute process which was initially based on diplomatic procedures and gives an account of its evolution and ...

  12. Narrative and emotion process in psychotherapy: an empirical test of the Narrative-Emotion Process Coding System (NEPCS).

    Science.gov (United States)

    Boritz, Tali Z; Bryntwick, Emily; Angus, Lynne; Greenberg, Leslie S; Constantino, Michael J

    2014-01-01

    While the individual contributions of narrative and emotion processes to psychotherapy outcome have been the focus of recent interest in psychotherapy research literature, the empirical analysis of narrative and emotion integration has rarely been addressed. The Narrative-Emotion Processes Coding System (NEPCS) was developed to provide researchers with a systematic method for identifying specific narrative and emotion process markers, for application to therapy session videos. The present study examined the relationship between NEPCS-derived problem markers (same old storytelling, empty storytelling, unstoried emotion, abstract storytelling) and change markers (competing plotlines storytelling, inchoate storytelling, unexpected outcome storytelling, and discovery storytelling), and treatment outcome (recovered versus unchanged at therapy termination) and stage of therapy (early, middle, late) in brief emotion-focused (EFT), client-centred (CCT), and cognitive (CT) therapies for depression. Hierarchical linear modelling analyses demonstrated a significant Outcome effect for inchoate storytelling (p = .037) and discovery storytelling (p = .002), a Stage × Outcome effect for abstract storytelling (p = .05), and a Stage × Outcome × Treatment effect for competing plotlines storytelling (p = .001). There was also a significant Stage × Outcome effect for NEPCS problem markers (p = .007) and change markers (p = .03). The results provide preliminary support for the importance of assessing the contribution of narrative-emotion processes to efficacious treatment outcomes in EFT, CCT, and CT treatments of depression.

  13. Models of neural networks temporal aspects of coding and information processing in biological systems

    CERN Document Server

    Hemmen, J; Schulten, Klaus

    1994-01-01

    Since the appearance of Vol. 1 of Models of Neural Networks in 1991, the theory of neural nets has focused on two paradigms: information coding through coherent firing of the neurons and functional feedback. Information coding through coherent neuronal firing exploits time as a cardinal degree of freedom. This capacity of a neural network rests on the fact that the neuronal action potential is a short, say 1 ms, spike, localized in space and time. Spatial as well as temporal correlations of activity may represent different states of a network. In particular, temporal correlations of activity may express that neurons process the same "object" of, for example, a visual scene by spiking at the very same time. The traditional description of a neural network through a firing rate, the famous S-shaped curve, presupposes a wide time window of, say, at least 100 ms. It thus fails to exploit the capacity to "bind" sets of coherently firing neurons for the purpose of both scene segmentation and figure-ground segregation...

  14. Plasma Separation Process: Betacell (BCELL) code: User's manual. [Bipolar barrier junction

    Energy Technology Data Exchange (ETDEWEB)

    Taherzadeh, M.

    1987-11-13

    The emergence of clearly defined applications for (small or large) amounts of long-life and reliable power sources has given the design and production of betavoltaic systems a new life. Moreover, because of the availability of the Plasma Separation Process (PSP) at TRW, it is now possible to separate the most desirable radioisotopes for betacell power generating devices. A computer code, named BCELL, has been developed to model the betavoltaic concept by utilizing the available up-to-date source/cell parameters. In this program, attempts have been made to determine the betacell energy device's maximum efficiency, its degradation due to the emitting source radiation, and the source/cell lifetime power reduction processes. Additionally, a comparison is made between the Schottky and PN junction devices for betacell battery design purposes. Computer code runs have been made to determine the JV distribution function and the upper limit of the betacell-generated power for specified energy sources. A Ni beta-emitting radioisotope was used for the energy source and certain semiconductors were used for the converter subsystem of the betacell system. Some results for a promethium source are also given here for comparison. 16 refs.

  15. 28 CFR 522.12 - Relationship between existing criminal sentences imposed under the U.S. or D.C. Code and new...

    Science.gov (United States)

    2010-07-01

    ... sentences imposed under the U.S. or D.C. Code and new civil contempt commitment orders. 522.12 Section 522..., AND TRANSFER ADMISSION TO INSTITUTION Civil Contempt of Court Commitments § 522.12 Relationship between existing criminal sentences imposed under the U.S. or D.C. Code and new civil contempt commitment...

  16. The sensitivity analysis for APR1400 nodalization under Large Break LOCA conditions based on the MARS code

    Directory of Open Access Journals (Sweden)

    Jang Hyung-Wook

    2017-01-01

    Full Text Available The phenomena of loss of coolant accidents have been investigated for a long time, and experimental results show that the flow conditions in the downcomer during the end-of-blowdown phase are highly multi-dimensional at full scale. However, the downcomer nodalization of the input deck for large break loss of coolant accident (LBLOCA) analyses of the Advanced Power Reactor 1400 is built with a 1-D model and is poorly suited to describing realistic coolant phenomena during the accident. In this paper, the authors modified the nodalization of the MARS code LBLOCA input deck and performed the LBLOCA analysis with the new input deck. Starting from the original LBLOCA input deck, the nodalization of the downcomer and of the junction connections with the 4 cold legs and the direct vessel injection lines was modified to reflect the realistic cross-flow effect and the real downcomer structure. The analysis results show that the peak cladding temperature of the new input deck decreases more rapidly than in the previous result, and that the drop of the peak cladding temperature was advanced by applying the momentum flux term in cross-flow. Additionally, the authors developed an input deck with a multi-dimensional downcomer model and ran the MARS code with this multi-dimensional input deck as well. With the modified input deck, the emergency core cooling system bypass flow phenomena are better characterized and found to be consistent with both the experimental report and the regulatory guide.

  17. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of coding partition. Such a notion generalizes that of UD code and, for codes that are not UD, allows one to recover the "unique decipherability" at the level of the classes of the partition. By taking into account the natural order between the partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads us to introduce the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover, we conjecture that the canonical partition satisfies such a hypothesis. Finally, we consider some relationships between coding partitions and varieties of codes.
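
    For readers unfamiliar with UD codes, the classical Sardinas-Patterson test decides unique decipherability of a finite code and is short enough to sketch below. This is the standard algorithm, not the coding-partition algorithm of the paper.

```python
def is_uniquely_decipherable(code) -> bool:
    """Sardinas-Patterson test for a finite set of nonempty words."""
    code = set(code)

    def dangling(a, b):
        # suffixes w such that some u in a is a proper prefix of v = u + w in b
        return {v[len(u):] for u in a for v in b
                if v.startswith(u) and len(v) > len(u)}

    s = dangling(code, code)            # S1: dangling suffixes of codeword pairs
    seen = set()
    while s and not (s & code):         # a codeword in some S_i means "not UD"
        frozen = frozenset(s)
        if frozen in seen:              # cycle with no violation: UD
            return True
        seen.add(frozen)
        s = dangling(s, code) | dangling(code, s)
    return not s                        # empty set: UD; otherwise violation

print(is_uniquely_decipherable({"0", "01"}))          # True
print(is_uniquely_decipherable({"0", "1", "01"}))     # False: 01 = 0.1
print(is_uniquely_decipherable({"ab", "abba", "b"}))  # False: abbab is ambiguous
```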

  18. Long Non-Coding RNAs in Hepatitis B Virus-Related Hepatocellular Carcinoma: Regulation, Functions, and Underlying Mechanisms

    Directory of Open Access Journals (Sweden)

    Lipeng Qiu

    2017-11-01

    Full Text Available Hepatocellular carcinoma (HCC) is the fifth most common cancer and the third leading cause of cancer death in the world. Hepatitis B virus (HBV) and its X gene-encoded protein (HBx) play important roles in the progression of HCC. Although long non-coding RNAs (lncRNAs) cannot encode proteins, growing evidence indicates that they play essential roles in HCC progression, and contribute to cell proliferation, invasion and metastasis, autophagy, and apoptosis by targeting a large number of pivotal protein-coding genes, miRNAs, and signaling pathways. In this review, we briefly outline recent findings of differentially expressed lncRNAs in HBV-related HCC, with particular focus on several key lncRNAs, and discuss their regulation by HBV/HBx, their functions, and their underlying molecular mechanisms in the progression of HCC.

  19. DIONISIO 2.0: new version of the code for simulating the behavior of a power fuel rod under irradiation

    International Nuclear Information System (INIS)

    Soba, A; Denis, A; Lemes, M; Gonzalez, M E

    2012-01-01

    During the latest ten years, the Codes and Models Section of the Nuclear Fuel Cycle Department has been developing the DIONISIO code, which simulates most of the main phenomena that take place within a fuel rod during the normal operation of a nuclear reactor: temperature distribution, thermal expansion, elastic and plastic strain, creep, irradiation growth, pellet-cladding mechanical interaction, fission gas release, swelling and densification. Axial symmetry is assumed and cylindrical finite elements are used to discretize the domain. The code has a modular structure and contains more than 40 interconnected models. A group of subroutines, designed to extend the application range of the fuel performance code DIONISIO to high burnup, has recently been included in the code. The new calculation tools, which are tuned for UO2 fuels under LWR conditions, predict the radial distribution of power density, burnup and the concentration of diverse nuclides within the pellet. New models of porosity and fission gas release in the rim, as well as of the influence of the microstructure of this zone on the thermal conductivity of the pellet, are presently under development. A considerable computational challenge was the inclusion of the option of simulating the whole rod by dividing it into a number of axial segments, at the user's choice, and solving the complete problem in each segment. All the general rod parameters (pressure, fission gas release, volume, etc.) are evaluated at the end of every time step. This modification allows taking into account the axial variation of the linear power and, consequently, evaluating the dependence of all the significant rod parameters on that coordinate. DIONISIO was selected to participate in the FUMEX III code intercomparison project, organized by the IAEA from 2008 to 2011. The results of the simulations performed within this project were compared with more than 30 experiments that involve more than 150 irradiated rods. The high number

  20. Density-matrix simulation of small surface codes under current and projected experimental noise

    Science.gov (United States)

    O'Brien, T. E.; Tarasinski, B.; DiCarlo, L.

    2017-09-01

    We present a density-matrix simulation of the quantum memory and computing performance of the distance-3 logical qubit Surface-17, following a recently proposed quantum circuit and using experimental error parameters for transmon qubits in a planar circuit QED architecture. We use this simulation to optimize components of the QEC scheme (e.g., trading off stabilizer measurement infidelity for reduced cycle time) and to investigate the benefits of feedback harnessing the fundamental asymmetry of relaxation-dominated error in the constituent transmons. A lower-order approximate calculation extends these predictions to the distance-5 Surface-49. These results clearly indicate error rates below the fault-tolerance threshold of the surface code, and the potential for Surface-17 to perform beyond the break-even point of quantum memory. However, Surface-49 is required to surpass the break-even point of computation at state-of-the-art qubit relaxation times and readout speeds.

  1. Medico-legal assessment of malpractice under the Austrian penal code.

    Science.gov (United States)

    Bauer, G

    1986-01-01

    In recent years, Austria has seen some change in the approach to errors in medical practice. The privileged position of the medical practitioner within the meaning of the former penal code, in force till 1974, no longer exists; however, errors leading to insignificant damage to the patient's health may remain free from punishment. In any case, nowadays, the categories of the dogmatics of negligence are applied to the doctor's professional activity. The traditional concept of 'malpractice' as formerly applied has virtually been displaced from the medico-legal assessment of an error in medical practice. The patient-doctor relationship based on trust is increasingly being supplemented by legal norms. Accordingly, the doctor's liability appears increasingly as the doctor's typical professional risk. Yet, in Austria, the doctor's liability is still kept within limits. The situation, with some cases in point, is analysed and described.

  2. Validation of AMPX-KENO code for criticality analysis under various moderator density condition

    International Nuclear Information System (INIS)

    Ahn, Joon Gi; Hwang, Hae Ryang; Kim, Hyeong Heon; Lee, Seong Hee

    1992-01-01

    Nuclear criticality safety analysis shall be performed for the storage and handling facilities of fissionable materials, and the calculational method used to determine the effective multiplication factor shall also be validated by comparison with proper experimental data. Benchmark calculations were performed for the criticality analysis of a new fuel storage facility using the AMPX-KENO computer code system. The references for the benchmark calculations are critical experiments performed by the Nuclear Safety Department of the French Atomic Energy Commission to study the problems raised by the accidental sprinkling of a mist into a fuel storage. The bias and statistical uncertainties of the calculational method that will be applied in the criticality analysis of the new fuel storage facility were also evaluated.
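
    The validation arithmetic can be sketched in a few lines: from calculated and experimental k_eff values, estimate the method bias and its statistical uncertainty. The numbers below are invented; a real validation also folds in experimental uncertainties and statistical tolerance factors before setting an upper subcritical limit.

```python
import numpy as np

k_calc = np.array([0.9968, 0.9975, 1.0012, 0.9953, 0.9981])  # hypothetical
k_exp = np.ones(5)                     # benchmarks are critical: k_eff = 1

diff = k_calc - k_exp
bias = diff.mean()                     # negative bias: method underpredicts
sigma = diff.std(ddof=1)               # sample standard deviation
print(f"bias  = {bias:+.4f}")
print(f"sigma = {sigma:.4f}")
```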

  3. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  4. Classification and coding of commercial fishing injuries by work processes: an experience in the Danish fresh market fishing industry

    DEFF Research Database (Denmark)

    Jensen, Olaf Chresten; Stage, Søren; Noer, Preben

    2005-01-01

    BACKGROUND: Work-related injuries in commercial fishing are of concern internationally. To better identify the causes of injury, this study coded occupational injuries by working processes in commercial fishing for fresh market fish. METHODS: A classification system of the work processes was developed by participation in fishing vessel trips where observations and video recordings of the work operations on board were collected. Subsequently the system was pilot tested using the Danish Maritime Authority injury reports. RESULTS: The developed classification system contains 17 main categories. The risks related to working with the gear and nets vary greatly in the different fishing methods. Coding of the injuries to the specific working processes allows for targeted prevention efforts.

  5. The levels of processing effect under nitrogen narcosis.

    Science.gov (United States)

    Kneller, Wendy; Hobbs, Malcolm

    2013-01-01

    Previous research has consistently demonstrated that inert gas (nitrogen) narcosis affects free recall but not recognition memory in the depth range of 30 to 50 meters of sea water (msw), possibly as a result of narcosis preventing processing when learned material is encoded. The aim of the current research was to test this hypothesis by applying a levels-of-processing approach to the measurement of free recall under narcosis. Experiment 1 investigated the effect of depth (0-2 msw vs. 37-39 msw) and level of processing (shallow vs. deep) on free recall memory performance in 67 divers. When age was included as a covariate, recall was significantly worse in deep water (i.e., under narcosis) compared to shallow water, and was significantly higher in the deep processing compared to the shallow processing conditions at both depths. Experiment 2 demonstrated that this effect was not simply due to the different underwater environments used for the depth conditions in Experiment 1. It was concluded that memory performance can be altered by processing under narcosis, which supports the contention that narcosis affects the encoding stage of memory as opposed to self-guided search (retrieval).

  6. The integrated design and archive of space-borne signal processing and compression coding

    Science.gov (United States)

    He, Qiang-min; Su, Hao-hang; Wu, Wen-bo

    2017-10-01

    With increasing user demand for the extraction of remote sensing image information, there is an urgent need to enhance the imaging quality and imaging capability of the whole system through integrated design, achieving a compact structure, low mass and higher attitude maneuverability. At the present stage, the remote sensing camera's video signal processing unit and its image compression and coding unit are distributed in different devices. The volume, weight and power consumption of these two units are relatively large, which cannot meet the requirements of a highly mobile remote sensing camera. According to the technical requirements of a highly mobile remote sensing camera, this paper designs a space-borne integrated signal processing and compression circuit by drawing on several technologies, such as high-speed, high-density analog-digital mixed PCB design, embedded DSP technology, and image compression based on special-purpose chips. This circuit lays a solid foundation for research on highly mobile remote sensing cameras.

  7. 9 CFR 355.25 - Canning with heat processing and hermetically sealed containers; closures; code marking; heat...

    Science.gov (United States)

    2010-01-01

    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Canning with heat processing and hermetically sealed containers; closures; code marking; heat processing; incubation. 355.25 Section 355.25... CERTIFICATION CERTIFIED PRODUCTS FOR DOGS, CATS, AND OTHER CARNIVORA; INSPECTION, CERTIFICATION, AND...

  8. Dress codes and appearance policies: challenges under federal legislation, part 2: title VII of the civil rights act and gender.

    Science.gov (United States)

    Mitchell, Michael S; Koen, Clifford M; Darden, Stephen M

    2014-01-01

    As more and more individuals express themselves with tattoos and body piercings and push the envelope on what is deemed appropriate in the workplace, employers have an increased need for creation and enforcement of reasonable dress codes and appearance policies. As with any employment policy or practice, an appearance policy must be implemented and enforced without regard to an individual's race, color, gender, national origin, religion, disability, age, or other protected status. A policy governing dress and appearance based on the business needs of an employer that is applied fairly and consistently and does not have a disproportionate effect on any protected class will generally be upheld if challenged in court. By examining some of the more common legal challenges to dress codes and how courts have resolved the disputes, health care managers can avoid many potential problems. This article, the second part of a 3-part examination of dress codes and appearance policies, focuses on the issue of gender under the Civil Rights Act of 1964. Pertinent court cases that provide guidance for employers are addressed.

  9. Analysis of error floor of LDPC codes under LP decoding over the BSC

    Energy Technology Data Exchange (ETDEWEB)

    Chertkov, Michael [Los Alamos National Laboratory]; Chilappagari, Shashi [UNIV OF AZ]; Vasic, Bane [UNIV OF AZ]; Stepanov, Mikhail [UNIV OF AZ]

    2009-01-01

    We consider linear programming (LP) decoding of a fixed low-density parity-check (LDPC) code over the binary symmetric channel (BSC). The LP decoder fails when it outputs a pseudo-codeword which is not a codeword. We propose an efficient algorithm termed the instanton search algorithm (ISA) which, given a random input, generates a set of flips called the BSC-instanton. We prove that: (a) the LP decoder fails for any set of flips whose support vector includes an instanton; (b) for any input, the algorithm outputs an instanton in a number of steps upper-bounded by twice the number of flips in the input. We obtain the number of unique instantons of different sizes by running the ISA a sufficient number of times. We then use the instanton statistics to predict the performance of LP decoding over the BSC in the error floor region. We also propose an efficient semi-analytical method to predict the performance of LP decoding over a large range of transition probabilities of the BSC.
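
    The search idea can be conveyed with a toy stand-in, with the caveat that the published ISA operates on the LP decoder itself and carries the guarantees stated above. Below, a Gallager-style bit-flipping decoder on a tiny parity-check code replaces the LP decoder, and a greedy pass shrinks a random failing flip set toward a minimal failing ("instanton-like") set.

```python
import numpy as np

H = np.array([[1, 1, 0, 1, 0, 0],      # toy parity-check matrix
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])

def bit_flip_decode(y, iters=10):
    y = y.copy()
    for _ in range(iters):
        syndrome = H @ y % 2
        if not syndrome.any():
            return y
        y[np.argmax(H.T @ syndrome)] ^= 1   # flip most-suspect bit
    return y

def decoder_fails(flips):
    y = np.zeros(H.shape[1], dtype=int)     # all-zero codeword sent
    y[list(flips)] ^= 1                     # apply channel flips
    return bool(bit_flip_decode(y).any())   # failure: decoded word != sent

rng = np.random.default_rng(3)
flips = set(rng.choice(H.shape[1], size=4, replace=False))
while not decoder_fails(flips):             # draw random inputs until one fails
    flips = set(rng.choice(H.shape[1], size=4, replace=False))

for bit in sorted(flips):                   # greedily drop flips while still failing
    if len(flips) > 1 and decoder_fails(flips - {bit}):
        flips.remove(bit)
print("minimal failing flip set:", sorted(int(b) for b in flips))
```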

  10. Reach tracking reveals dissociable processes underlying cognitive control.

    Science.gov (United States)

    Erb, Christopher D; Moher, Jeff; Sobel, David M; Song, Joo-Hyun

    2016-07-01

    The current study uses reach tracking to investigate how cognitive control is implemented during online performance of the Stroop task (Experiment 1) and the Eriksen flanker task (Experiment 2). We demonstrate that two of the measures afforded by reach tracking, initiation time and reach curvature, capture distinct patterns of effects that have been linked to dissociable processes underlying cognitive control in electrophysiology and functional neuroimaging research. Our results suggest that initiation time reflects a response threshold adjustment process involving the inhibition of motor output, while reach curvature reflects the degree of co-activation between response alternatives registered by a monitoring process over the course of a trial. In addition to shedding new light on fundamental questions concerning how these processes contribute to the cognitive control of behavior, these results present a framework for future research to investigate how these processes function across different tasks, develop across the lifespan, and differ among individuals. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. Explaining individual differences in cognitive processes underlying hindsight bias.

    Science.gov (United States)

    Coolin, Alisha; Erdfelder, Edgar; Bernstein, Daniel M; Thornton, Allen E; Thornton, Wendy Loken

    2015-04-01

    After learning an event's outcome, people's recollection of their former prediction of that event typically shifts toward the actual outcome. Erdfelder and Buchner (Journal of Experimental Psychology: Learning, Memory, and Cognition, 24, 387-414, 1998) developed a multinomial processing tree (MPT) model to identify the underlying processes contributing to this hindsight bias (HB) phenomenon. More recent applications of this model have revealed that, in comparison to younger adults, older adults are more susceptible to two underlying HB processes: recollection bias and reconstruction bias. However, the impact of cognitive functioning on these processes remains unclear. In this article, we extend the MPT model for HB by incorporating individual variation in cognitive functioning into the estimation of the model's core parameters in older and younger adults. In older adults, our findings revealed that (1) better episodic memory was associated with higher recollection ability in the absence of outcome knowledge, (2) better episodic memory and inhibitory control and higher working memory capacity were associated with higher recollection ability in the presence of outcome knowledge, and (3) better inhibitory control was associated with less reconstruction bias. Although the pattern of effects was similar in younger adults, the cognitive covariates did not significantly predict the underlying HB processes in this age group. In sum, we present a novel approach to modeling individual variability in MPT models. We applied this approach to the HB paradigm to identify the cognitive mechanisms contributing to the underlying HB processes. Our results show that working memory capacity and inhibitory control, respectively, drive individual differences in recollection bias and reconstruction bias, particularly in older adults.

  12. Analysis of UO{sub 2}-BeO fuel under transient using fuel performance code

    Energy Technology Data Exchange (ETDEWEB)

    Gomes, Daniel S.; Abe, Alfredo Y.; Muniz, Rafael O.R.; Giovedi, Claudia, E-mail: dsgomes@ipen.br, E-mail: alfredo@ctmsp.mar.mil.br, E-mail: rafael.orm@gmail.com, E-mail: claudia.giovedi@ctmsp.mar.mil.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Universidade de São Paulo (USP), São Paulo, SP (Brazil). Departamento de Engenharia Naval e Oceânica

    2017-11-01

    Recent research has pointed to the need to replace the classic fuel concept used in light water reactors. Uranium dioxide has a weak point in its low thermal conductivity, which produces high temperatures in the fuel. The ceramic composite fuel formed of uranium dioxide (UO{sub 2}) with the addition of beryllium oxide (BeO) presents high thermal conductivity compared with UO{sub 2}. The oxidation of zirconium generates hydrogen gas that can create a detonation condition. One of the preferred alternatives is the ferritic alloys formed of iron, chromium, and aluminum (FeCrAl), which should avoid hydrogen release due to oxidation. In general, FeCrAl alloys contain 10-20Cr, 3-5Al, and 0-0.12Y in weight percent. FeCrAl alloys should exhibit slow oxidation kinetics due to their chemical composition; resistance to oxidation in the presence of steam improves as a function of the chromium and aluminum content. The thermal and mechanical properties of the UO{sub 2}-BeO-10%vol composite fuel were therefore coupled with FeCrAl alloys and added to the fuel codes. In this work, we examine the fuel rod behavior of UO{sub 2}-10%vol-BeO/FeCrAl, including a simulated reactivity transient. The fuel behavior showed reduced temperatures with UO{sub 2}-BeO/Zr and UO{sub 2}-BeO/FeCrAl, which were also compared with the UO{sub 2}/Zr system. The reactivity-initiated accident case analyzed reproduces the fuel rod called VA-1, using UO{sub 2}/Zr alloys, and compares it with UO{sub 2}-BeO/FeCrAl. (author)
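
    A back-of-envelope sketch shows why the conductivity gain matters: for constant conductivity k and linear heat rate q', the steady centreline-to-surface temperature rise in a cylindrical pellet is dT = q'/(4*pi*k). The conductivity values below are rough, temperature-independent placeholders, not the code's property models.

```python
import math

q_lin = 25_000.0     # linear heat rate, W/m (assumed)
conductivities = {   # W/mK, rough placeholder values
    "UO2": 3.0,
    "UO2-BeO (10 vol%)": 4.5,
}
for label, k in conductivities.items():
    dT = q_lin / (4 * math.pi * k)
    print(f"{label:18s}: centreline rise ~ {dT:4.0f} K")
```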

  13. Process Model Improvement for Source Code Plagiarism Detection in Student Programming Assignments

    Science.gov (United States)

    Kermek, Dragutin; Novak, Matija

    2016-01-01

    In programming courses there are various ways in which students attempt to cheat. The most commonly used method is copying source code from other students and making minimal changes in it, like renaming variable names. Several tools like Sherlock, JPlag and Moss have been devised to detect source code plagiarism. However, for larger student…
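
    The core idea behind such detectors can be sketched briefly: compare token n-gram fingerprints so that renaming identifiers barely changes the similarity score. Real tools use proper lexers and winnowing; the snippet below is a minimal illustration.

```python
import re

def ngrams(src: str, n: int = 4) -> set:
    tokens = re.findall(r"[A-Za-z_]\w*|\S", src)
    # mask every word-like token (identifiers and keywords) so that
    # renaming variables does not affect the fingerprint
    tokens = ["ID" if re.match(r"[A-Za-z_]", t) else t for t in tokens]
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if a | b else 0.0

original = "total = 0\nfor x in data:\n    total += x\n"
renamed = "acc = 0\nfor item in values:\n    acc += item\n"
print(f"similarity after renaming: {jaccard(ngrams(original), ngrams(renamed)):.2f}")
```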

  14. Multiple optical code-label processing using multi-wavelength frequency comb generator and multi-port optical spectrum synthesizer.

    Science.gov (United States)

    Moritsuka, Fumi; Wada, Naoya; Sakamoto, Takahide; Kawanishi, Tetsuya; Komai, Yuki; Anzai, Shimako; Izutsu, Masayuki; Kodate, Kashiko

    2007-06-11

    In optical packet switching (OPS) and optical code division multiple access (OCDMA) systems, label generation and processing are key technologies. Recently, several label processors have been proposed and demonstrated. However, in order to recognize N different labels, N separate devices are required. Here, we propose and experimentally demonstrate a large-scale, multiple optical code (OC)-label generation and processing technology based on a multi-port, fully tunable optical spectrum synthesizer (OSS) and a multi-wavelength electro-optic frequency comb generator. The OSS can generate 80 different OC-labels simultaneously and can perform 80-parallel matched filtering. We also demonstrate its application to OCDMA.

  15. Model, parameter and code of environmental dispersion of gaseous effluent under normal operation from nuclear power plant with 600 MWe

    International Nuclear Information System (INIS)

    Hu Erbang; Gao Zhanrong

    1998-06-01

    The model of environmental dispersion of gaseous effluent under normal operation from a nuclear power plant with 600 MWe is established to give a mathematical expression for the annual mean atmospheric dispersion factor under mixed release conditions, based on the quality assessment of the radiological environment over 30 years of the Chinese nuclear industry. The calculation takes into account the impact of calm conditions and of the following factors: the mixing layer, dry and wet deposition, radioactive decay, and buildings. The model also gives the doses from the following exposure pathways: external exposure from the immersion cloud and from ground deposition, and internal exposure due to inhalation and ingestion. The code is named ROULEA. It contains four modules, i.e. INPUT, ANRTRI, CHIQV and DOSE, for calculating the 4-dimensional joint frequency, the annual mean atmospheric dispersion factor, and the doses.
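
    The report does not give ROULEA's equations, but the annual mean dispersion factor it computes is conventionally a sector-averaged Gaussian-plume quantity. A sketch under that assumption, with an illustrative power-law sigma-z and made-up joint-frequency inputs, follows.

```python
import math

def sigma_z(x, a=0.06, b=0.92):
    """Toy power-law vertical dispersion parameter [m] vs downwind distance x [m]."""
    return a * x**b

def chi_over_q(x, h, u, freq, n_sectors=16):
    """Sector-averaged long-term chi/Q [s/m^3]: wind blows into this 22.5-degree
    sector a fraction `freq` of the year at mean speed u [m/s]; release height h [m]."""
    sz = sigma_z(x)
    arc = 2.0 * math.pi * x / n_sectors            # sector width at distance x
    return (freq * math.sqrt(2.0 / math.pi)
            * math.exp(-h * h / (2.0 * sz * sz)) / (u * sz * arc))

print(f"chi/Q at 1 km: {chi_over_q(1000.0, h=60.0, u=3.0, freq=0.12):.2e} s/m^3")
```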

  16. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Heng [Pacific Northwest National Laboratory, Richland Washington USA; Ye, Ming [Department of Scientific Computing, Florida State University, Tallahassee Florida USA; Walker, Anthony P. [Environmental Sciences Division and Climate Change Science Institute, Oak Ridge National Laboratory, Oak Ridge Tennessee USA; Chen, Xingyuan [Pacific Northwest National Laboratory, Richland Washington USA

    2017-04-01

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty and ignore the model uncertainty in process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance; the index includes the variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the recharge and geology processes in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general and can be applied to a wide range of problems in hydrology and beyond.
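
    A simplified double-loop Monte Carlo version of the idea can be sketched as follows: a process is represented by a joint draw of (model choice, parameters), and its sensitivity index is the variance of the conditional mean over that draw, normalized by the total output variance. The toy output function, priors and model weights below are assumptions for demonstration only, not the paper's groundwater example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_recharge():
    """Draw one realization of the recharge process: (model index, parameter)."""
    m = rng.choice([0, 1], p=[0.5, 0.5])          # assumed model-averaging weights
    theta = rng.normal(1.0, 0.2) if m == 0 else rng.lognormal(0.0, 0.2)
    return m, theta

def output(recharge, k_geo):
    """Toy model output mixing the recharge and geology processes."""
    m, theta = recharge
    r = theta if m == 0 else 0.8 * theta + 0.1    # the two recharge models
    return r / k_geo

N_outer, N_inner = 1000, 100
cond_means = [np.mean([output(rech, rng.lognormal(0.0, 0.5))
                       for _ in range(N_inner)])                 # average over geology
              for rech in (sample_recharge() for _ in range(N_outer))]

total = [output(sample_recharge(), rng.lognormal(0.0, 0.5))
         for _ in range(4 * N_outer)]
print(f"process sensitivity (recharge) ~ {np.var(cond_means) / np.var(total):.2f}")
```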

  17. Ultrasonic signal processing for sizing under-clad flaws

    International Nuclear Information System (INIS)

    Shankar, R.; Paradiso, T.J.; Lane, S.S.; Quinn, J.R.

    1985-01-01

    Ultrasonic digital data were collected from under-clad cracks in sample pressure vessel specimen blocks. These blocks were weld-clad using different processes to simulate actual conditions in US pressurized water reactors. Each crack was represented by a flaw-echo dynamic curve, which is a plot of the transducer motion on the surface as a function of the ultrasonic response from the material. Crack depth sizing was performed by identifying in the dynamic curve the crack-tip diffraction signals from the upper and lower tips. This paper describes the experimental procedure, the digital signal processing methods used, and the algorithms developed for crack depth sizing.
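
    In the simplest pitch-catch arrangement, the tip-diffraction geometry exploited here reduces to the textbook time-of-flight relation sketched below. This is a generic illustration, not the authors' algorithm, and the transit times and probe spacing are illustrative values.

```python
import math

def tip_depth(t_us, half_spacing_mm, c_mm_per_us=5.9):
    """Depth [mm] of a diffracting crack tip from transit time t [us] for a
    pitch-catch pair separated by 2*half_spacing on the surface; c defaults to
    a typical longitudinal wave speed in steel (~5.9 mm/us)."""
    path = c_mm_per_us * t_us / 2.0          # one-way path, transducer -> tip
    return math.sqrt(max(path**2 - half_spacing_mm**2, 0.0))

t_upper, t_lower = 10.4, 11.1                # measured tip arrivals, us (illustrative)
d1 = tip_depth(t_upper, 25.0)
d2 = tip_depth(t_lower, 25.0)
print(f"upper tip {d1:.1f} mm, lower tip {d2:.1f} mm, crack height {d2 - d1:.1f} mm")
```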

  18. Contractual Penalty and the Right to Payment for Delays Caused by Force Majeure in Czech Civil Law under the New Civil Code

    Directory of Open Access Journals (Sweden)

    Janku Martin

    2015-12-01

    Full Text Available In the context of the conclusion of contracts between entrepreneurs under the Czech Civil Code, it is a relatively common arrangement that the parties disclaim any and all liability for damage arising from non-compliance with contractual obligations if they can prove that this failure was due to an obstacle independent of their will. This circumstance excluding liability for damage is called force majeure in legal theory. In many countries this circumstance is regulated directly by legislation (höhere Gewalt, vis major). The Czech regulation, represented by the new Civil Code of 2012 (CivC), however, contains only a framework provision that mentions discharging reasons. The paper deals with the rather disputable issue that force majeure does not affect the obligation to pay a contractual penalty under the new rules of the CivC, which should therefore be reflected in the arrangements for contractual penalties inter partes. To this effect the paper analyses the concepts of contractual penalty and force majeure in civil law legislation. Afterwards it compares their mutual relationship and impact on the obligations of the contracting parties. Finally, it draws recommendations for practice from the perspective of the contracting process.

  19. Identification and Functional Analysis of Long Intergenic Non-coding RNAs Underlying Intramuscular Fat Content in Pigs

    Directory of Open Access Journals (Sweden)

    Cheng Zou

    2018-03-01

    Full Text Available Intramuscular fat (IMF) content is an important trait that can affect pork quality. Previous studies have identified many genes that can regulate IMF. Long intergenic non-coding RNAs (lincRNAs) are emerging as key regulators in various biological processes. However, lincRNAs related to IMF in the pig are largely unknown, and the mechanisms by which they regulate IMF are yet to be elucidated. Here we reconstructed 105,687 transcripts and identified 1,032 lincRNAs in pig longissimus dorsi muscle (LDM) at four stages with different IMF contents, based on published RNA-seq data. These lincRNAs show typical characteristics such as shorter length and lower expression compared with protein-coding genes. Combined with methylation data, we found that both promoter and gene-body methylation of lincRNAs can negatively regulate lincRNA expression. We found that lincRNAs exhibit high correlation in expression with their protein-coding neighbors. Co-expression network analysis resulted in eight stage-specific modules; gene ontology and pathway analysis of these modules suggested that some lincRNAs are involved in IMF-related processes, such as fatty acid metabolism and the peroxisome proliferator-activated receptor signaling pathway. Furthermore, we identified hub lincRNAs and found that six of them may play important roles in IMF development. This work details some lincRNAs that may affect IMF development in the pig, and facilitates future research on these lincRNAs and molecular-assisted breeding for pigs.

  20. Professional Practice and Innovation: The Coding Masterpiece: A Framework for the Formal Pathways and Processes of Health Classification.

    Science.gov (United States)

    Price, Emily; Robinson, Kerin

    2011-03-01

    This article empirically defines the formal pathways and processes that enable and frame hospital clinical classification in an activity-based funding environment. These structured actions include: learning and training; abstracting; clinical knowledge locating and confirming; coder-doctor communication; coder-coder communication; the complicated sub-set of code searching and decision-making processes that constitute practical clinical 'coding'; allocation to diagnosis-related groups; confirmation of financial reimbursement; auditing; and quality management practices to ensure the integrity of the multiple outputs and outcomes of clinical coding. An analogy is drawn between these complex, exacting, and knowledge-dense work practices and the 20th-century avant-garde art movement of Cubism: the creation of Pablo Picasso's The Three Musicians is used as a metaphor for clinical/health classification work.

  1. Nationwide Risk-Based PCB Remediation Waste Disposal Approvals under Title 40 of the Code of Federal Regulations (CFR) Section 761.61(c)

    Science.gov (United States)

    This page contains information about Nationwide Risk-Based Polychlorinated Biphenyls (PCBs) Remediation Waste Disposal Approvals under Title 40 of the Code of Federal Regulations (CFR) Section 761.61(c)

  2. Citizen Action Can Help the Code Adoption Process for Radon-Resistant New Construction: Decatur, Alabama

    Science.gov (United States)

    Adopting a code requiring radon-resistant new construction (RRNC) in Decatur, Alabama, took months of effort by four people. Their actions demonstrate the influence that passionate residents can have on reversing a city council’s direction.

  3. Aerobic storage under dynamic conditions in activated sludge processes

    DEFF Research Database (Denmark)

    Majone, M.; Dircks, K.

    1999-01-01

    In activated sludge processes, several plant configurations (like plug-flow configuration of the aeration tanks, systems with selectors, contact-stabilization processes or SBR processes) impose a concentration gradient of the carbon sources to the biomass. As a consequence, the biomass grows under...... mechanisms can also contribute to substrate removal, depending on the microbial composition and the previous "history" of the biomass. In this paper the type and the extent of this dynamic response is discussed by review of experimental studies on pure cultures, mixed cultures and activated sludges...... and with main reference to its relevance on population dynamics in the activated sludge. Possible conceptual approaches to storage modelling are also presented, including both structured and unstructured modelling. (C) 1999 IAWQ Published by Elsevier Science Ltd. All rights reserved....

  4. CRT code and results of some studies for power, reactivity and temperatures for LMFBR fuel rods under operational transients

    International Nuclear Information System (INIS)

    Om Pal Singh; Ponpondi, S.; Parikh, M.V.

    1985-01-01

    The paper describes the details of the computer code CRT, which has been developed to study the power, reactivity and temperature transients in LMFBR fuel rods under operational transients. The code is based on suitable modelling of reactor neutron kinetics, heat transfer phenomena and reactivity feedback effects arising from: axial/radial expansion of the fuel/clad/coolant; core boundary movement relative to the axial and radial blankets; sodium entry/expulsion during radial expansion of the core; and apparent insertion/removal of control rods during core/reactor vessel expansion. The paper also presents the results of several studies: the transient behaviour of reactor power and of the temperature distribution in fuel rods, for a reactor at low and high initial power and for fresh and irradiated fuel; a comparison of lumped and exact heat transfer models, with an evaluation of the limiting reactivity addition rates for which the lumped model remains adequate; the influence of the gap conductance on the temperature distributions inside the fuel pellet and the clad; and the breakdown of the reactivity contributions from the different feedback mechanisms and their dependence on the heat transfer parameters. (author)
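
    The generic structure of such a code can be illustrated with one-delayed-group point kinetics coupled to a lumped fuel heat balance and a temperature feedback coefficient. Everything numeric below is an illustrative assumption, not CRT's models or data.

```python
import numpy as np
from scipy.integrate import solve_ivp

beta, lam, LAMBDA = 0.0035, 0.08, 4.0e-7   # delayed fraction, precursor decay, gen. time
alpha_f = -2.0e-5                          # fuel temperature coefficient (dk/k per K)
rho_ext = 0.1 * beta                       # small external reactivity step
C_heat, h, T0 = 20.0, 0.2, 600.0           # lumped heat capacity, cooling, nominal temp

def rhs(t, y):
    p, c, Tf = y                           # normalized power, precursors, fuel temp
    rho = rho_ext + alpha_f * (Tf - T0)    # net reactivity with Doppler-type feedback
    dp = (rho - beta) / LAMBDA * p + lam * c
    dc = beta / LAMBDA * p - lam * c
    dTf = (p - 1.0 - h * (Tf - T0)) / C_heat   # heat balance about the nominal state
    return [dp, dc, dTf]

y0 = [1.0, beta / (lam * LAMBDA), T0]      # critical steady state at unit power
sol = solve_ivp(rhs, (0.0, 300.0), y0, method="Radau", rtol=1e-6)
print(f"power at 300 s: {sol.y[0, -1]:.2f} (feedback arrests the excursion)")
```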

  5. A neurosemantic theory of concrete noun representation based on the underlying brain codes.

    Directory of Open Access Journals (Sweden)

    Marcel Adam Just

    Full Text Available This article describes the discovery of a set of biologically-driven semantic dimensions underlying the neural representation of concrete nouns, and then demonstrates how a resulting theory of noun representation can be used to identify simple thoughts through their fMRI patterns. We use factor analysis of fMRI brain imaging data to reveal the biological representation of individual concrete nouns like apple, in the absence of any pictorial stimuli. From this analysis emerge three main semantic factors underpinning the neural representation of nouns naming physical objects, which we label manipulation, shelter, and eating. Each factor is neurally represented in 3-4 different brain locations that correspond to a cortical network that co-activates in non-linguistic tasks, such as tool use pantomime for the manipulation factor. Several converging methods, such as the use of behavioral ratings of word meaning and text corpus characteristics, provide independent evidence of the centrality of these factors to the representations. The factors are then used with machine learning classifier techniques to show that the fMRI-measured brain representation of an individual concrete noun like apple can be identified with good accuracy from among 60 candidate words, using only the fMRI activity in the 16 locations associated with these factors. To further demonstrate the generativity of the proposed account, a theory-based model is developed to predict the brain activation patterns for words to which the algorithm has not been previously exposed. The methods, findings, and theory constitute a new approach of using brain activity for understanding how object concepts are represented in the mind.

  6. Prediction of BWR performance under the influence of Isolation Condenser-using RAMONA-4 code

    International Nuclear Information System (INIS)

    Khan, H.J.; Cheng, H.S.; Rohatgi, U.S.

    1992-01-01

    The purpose of the Boiling Water Reactor (BWR) Isolation Condenser (IC) is to passively control the reactor pressure by removing heat from the system. This type of control is expected to reduce the frequency of opening and closing of the Safety Relief Valves (SRVs). A comparative analysis is done for a BWR operating with and without the influence of an IC under Main Steam Isolation Valve (MSIV) closure. A regular BWR, with forced flow and high thermal power, has been considered for the analysis. In addition, the effect of ICs on BWR performance is studied for natural convection flow at lower power and with a modified riser geometry. The IC is coupled to the steam dome for the steam inlet flow and to the Reactor Pressure Vessel (RPV) near the feedwater entrance for the condensate return flow. Transient calculations are performed using prescribed pressure set points for the SRVs and given time settings for MSIV closure. The effect of the IC on the forced flow is to reduce the rate of pressure rise and thereby decrease the cycling frequency of the SRVs, which is the primary objective of any operating IC in a BWR (e.g. Oyster Creek). The responses of the reactor thermal and fission power, steam flow rate, collapsed liquid level, and core average void fraction are found to follow the trend of the pressure. The variations in the case of an active IC can be closely related to the creation of a time lag and to changes in the cycling frequency of the SRVs. An analysis for natural convection flow in a BWR indicates that the effect of an IC on its transient performance is similar to that for the forced convection system. In this case, the MSIV closure has resulted in a lower peak pressure due to the reduced power. However, the effect of the reduced cycling frequency of the SRVs due to the IC, and the time lag between the events, are comparable to those for forced convection.

  7. Probe code: a set of programs for processing and analysis of the left ventricular function - User's manual

    International Nuclear Information System (INIS)

    Piva, R.M.V.

    1987-01-01

    The User's Manual of the Probe Code is an addendum to the M.Sc. thesis entitled A Microcomputer System of Nuclear Probe to Check the Left Ventricular Function. The Probe Code is software developed for the processing and off-line analysis of left ventricular function curves obtained in vivo. These curves are produced by means of an external scintigraphic probe, collimated and placed over the left ventricle after venous injection of Tc-99m. (author)

  8. Introduction of SCIENCE code package

    International Nuclear Information System (INIS)

    Lu Haoliang; Li Jinggang; Zhu Ya'nan; Bai Ning

    2012-01-01

    The SCIENCE code package is a set of neutronics tools based on 2D assembly calculations and 3D core calculations. It is made up of APOLLO2-F, SMART and SQUALE, and it is used to perform the nuclear design and loading pattern analysis for the reactors in operation or under construction of China Guangdong Nuclear Power Group. The purpose of this paper is to briefly present the physical and numerical models used in each computational code of the SCIENCE code package, including a description of the general structure of the package, the coupling relationship between the APOLLO2-F transport lattice code and the SMART core nodal code, and the SQUALE code used for processing the core maps. (authors)

  9. Studying the co-evolution of production and test code in open source and industrial developer test processes through repository mining

    NARCIS (Netherlands)

    Zaidman, A.; Van Rompaey, B.; Van Deursen, A.; Demeyer, S.

    2010-01-01

    Many software production processes advocate rigorous development testing alongside functional code writing, which implies that both test code and production code should co-evolve. To gain insight in the nature of this co-evolution, this paper proposes three views (realized by a tool called TeMo)

  10. Performance analysis of spectral-phase-encoded optical code-division multiple-access system regarding the incorrectly decoded signal as a nonstationary random process

    Science.gov (United States)

    Yan, Meng; Yao, Minyu; Zhang, Hongming

    2005-11-01

    The performance of a spectral-phase-encoded (SPE) optical code-division multiple-access (OCDMA) system is analyzed. Regarding the incorrectly decoded signal (IDS) as a nonstationary random process, we derive a novel probability distribution for it. The probability distribution of the IDS is considered a chi-squared distribution with degrees of freedom r=1, which is more reasonable and accurate than in previous work. The bit error rate (BER) of an SPE OCDMA system under multiple-access interference is evaluated. Numerical results show that the system can sustain very low BER even when there are multiple simultaneous users, and as the code length becomes longer or the initial pulse becomes shorter, the system performs better.

  11. Barriers to data quality resulting from the process of coding health information to administrative data: a qualitative study.

    Science.gov (United States)

    Lucyk, Kelsey; Tang, Karen; Quan, Hude

    2017-11-22

    Administrative health data are increasingly used for research and surveillance to inform decision-making because of their large sample sizes, geographic coverage, comprehensiveness, and possibility for longitudinal follow-up. Within Canadian provinces, individuals are assigned unique personal health numbers that allow for linkage of administrative health records in that jurisdiction. It is therefore necessary to ensure that these data are of high quality and that chart information is accurately coded to this end. Our objective is to explore the potential barriers to high-quality data coding through qualitative inquiry into the roles and responsibilities of medical chart coders. We conducted semi-structured interviews with 28 medical chart coders from Alberta, Canada. We used thematic analysis and open-coded each transcript to understand the process of administrative health data generation and to identify barriers to its quality. The process of generating administrative health data is highly complex and involves a diverse workforce. As such, there are multiple points in this process that introduce challenges for high-quality data. For coders, the main barriers to data quality occurred around chart documentation, variability in the interpretation of chart information, and high quota expectations. This study illustrates the complex nature of barriers to high-quality coding in the context of administrative data generation. The findings from this study may be of use to data users, researchers, and decision-makers who wish to better understand the limitations of their data or pursue interventions to improve data quality.

  12. Image Processing Code for Sharpening Photoelastic Fringe Patterns and Its Usage in Determination of Stress Intensity Factors in a Sample Contact Problem

    OpenAIRE

    Khaleghian, Seyedmeysam; Emami, Anahita; Soltani, Nasser

    2015-01-01

    This study presents a type of image processing code used for sharpening the photoelastic fringe patterns of transparent materials in photoelastic experiments, in order to determine the stress distribution. C# was utilized for coding the algorithm of this image processing method. For evaluation of this code, the results of a photoelastic experiment on a sample contact problem between a half-plane with an oblique edge crack and a tilted wedge, obtained using this image processing method, were com...

  13. Efficient Option Pricing under Levy Processes, with CVA and FVA

    Directory of Open Access Journals (Sweden)

    Jimmy eLaw

    2015-07-01

    Full Text Available We generalize the Piterbarg (2010) model to include (1) bilateral default risk, as in Burgard and Kjaer (2012), and (2) jumps in the dynamics of the underlying asset, using general classes of Lévy processes of exponential type. We develop an efficient explicit-implicit scheme for European options and barrier options taking CVA-FVA into account. We highlight the importance of this work in the context of trading, pricing and managing a derivative portfolio given the trajectory of regulations.
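
    As a baseline for what such grid schemes look like, here is a plain explicit finite-difference solver for the jump-free Black-Scholes limit of the problem (a European call). The paper's scheme additionally treats the Lévy jump integral and the CVA-FVA terms, none of which is reproduced here; all parameter values are illustrative.

```python
import numpy as np

r, sigma, K, T = 0.02, 0.3, 100.0, 1.0
S_max, M, N = 400.0, 400, 20000                 # N large enough for explicit stability
S = np.linspace(0.0, S_max, M + 1)
dS, dt = S[1] - S[0], T / N
V = np.maximum(S - K, 0.0)                      # payoff of a European call at maturity

for k in range(1, N + 1):                       # march backwards from maturity
    d2V = (V[2:] - 2.0 * V[1:-1] + V[:-2]) / dS**2
    dV = (V[2:] - V[:-2]) / (2.0 * dS)
    V[1:-1] += dt * (0.5 * sigma**2 * S[1:-1]**2 * d2V
                     + r * S[1:-1] * dV - r * V[1:-1])
    tau = k * dt                                # time to maturity at this level
    V[0], V[-1] = 0.0, S_max - K * np.exp(-r * tau)

print(f"call value at S = 100: {np.interp(100.0, S, V):.2f}")
```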

  14. The effects of perceptual load on semantic processing under inattention.

    Science.gov (United States)

    Koivisto, Mika; Revonsuo, Antti

    2009-10-01

    Inattentional blindness refers to a failure to consciously detect an irrelevant object that appears without any expectation when attention is engaged with another task. The perceptual load theory predicts that task-irrelevant stimuli will reach awareness only when the primary task is of low load, which allows processing resources to spill over to processing task-irrelevant stimuli as well. We studied whether perceptual load has an effect on inattentional blindness for a task-irrelevant stimulus whose meaning is or is not relevant to the attentional goals of the observer. In the critical trial, a word appeared without any expectation in the center of a display of attended pictures. The results showed that, under both high and low load, unexpected words belonging to the attended semantic category were detected more often than semantically unrelated words. These results imply that task-irrelevant stimuli, whose meanings are relevant to the observer's task, enter awareness irrespective of perceptual load.

  15. The signal processing architecture underlying subjective reports of sensory awareness.

    Science.gov (United States)

    Maniscalco, Brian; Lau, Hakwan

    2016-01-01

    What is the relationship between perceptual information processing and subjective perceptual experience? Empirical dissociations between stimulus identification performance and subjective reports of stimulus visibility are crucial for shedding light on this question. We replicated a finding that metacontrast masking can produce such a dissociation (Lau and Passingham, 2006), and report a novel finding that this paradigm can also dissociate stimulus identification performance from the efficacy with which visibility ratings predict task performance. We explored various hypotheses about the relationship between perceptual task performance and visibility rating by implementing them in computational models and using formal model comparison techniques to assess which ones best captured the unusual patterns in the data. The models fell into three broad categories: Single Channel models, which hold that task performance and visibility ratings are based on the same underlying source of information; Dual Channel models, which hold that there are two independent processing streams that differentially contribute to task performance and visibility rating; and Hierarchical models, which hold that a late processing stage generates visibility ratings by evaluating the quality of early perceptual processing. Taking into account the quality of data fitting and model complexity, we found that Hierarchical models perform best at capturing the observed behavioral dissociations. Because current theories of visual awareness map well onto these different model structures, a formal comparison between them is a powerful approach for arbitrating between the different theories.

  16. Bar code hotel: diverse interactions of semi-autonomous entities under the partial control of multiple operators

    Science.gov (United States)

    Hoberman, Perry

    1995-03-01

    In this paper I describe an interactive installation that was produced in 1994 as one of eight Art and Virtual Environments projects sponsored by the Banff Center for the Arts. The installation, Bar Code Hotel, makes use of a number of strategies to create a casual, social, multi-person interface. Among the goals was to investigate methods that would minimize any significant learning curve, allowing visitors to immediately interact with a virtual world in a meaningful way. By populating this virtual world with semi-independent entities that could be directed by participants even as these entities were interacting with each other, a rich and heterogeneous experience was produced in which a variety of relationships between human participants and virtual objects could be examined. The paper will describe some of the challenges of simultaneously processing multiple input sources affecting a virtual environment in which each object already has its own ongoing behavior.

  17. 77 FR 13098 - Multistakeholder Process To Develop Consumer Data Privacy Codes of Conduct

    Science.gov (United States)

    2012-03-05

    ...=gsmaprivacydesignguidelinesformobileapplicationdevelopmentv1.pdf ; Mobile Marketing Association, Global Code of Conduct, July 15, 2008, available at http... mobile device applications ("mobile apps"). Mobile apps are gaining in social and economic importance. However, as several commenters on the Privacy and Innovation Green Paper noted, mobile devices...

  18. Transcending Rationalism and Constructivism: Chinese Leaders’ Operational Codes, Socialization Processes, and Multilateralism after the Cold War

    DEFF Research Database (Denmark)

    He, Kai; Feng, Huiyun

    2015-01-01

    This paper challenges both rationalist and constructivist approaches in explaining China’s foreign policy behavior toward multilateral institutions after the Cold War. Borrowing insights from socialization theory and operational code analysis, this paper suggests a ‘superficial socialization’ arg...

  19. Independent component processes underlying emotions during natural music listening.

    Science.gov (United States)

    Rogenmoser, Lars; Zollinger, Nina; Elmer, Stefan; Jäncke, Lutz

    2016-09-01

    The aim of this study was to investigate the brain processes underlying emotions during natural music listening. To address this, we recorded high-density electroencephalography (EEG) from 22 subjects while presenting a set of individually matched whole musical excerpts varying in valence and arousal. Independent component analysis was applied to decompose the EEG data into functionally distinct brain processes. A k-means cluster analysis calculated on the basis of a combination of spatial (scalp topography and dipole location mapped onto the Montreal Neurological Institute brain template) and functional (spectra) characteristics revealed 10 clusters referring to brain areas typically involved in music and emotion processing, namely in the proximity of thalamic-limbic and orbitofrontal regions as well as at frontal, fronto-parietal, parietal, parieto-occipital, temporo-occipital and occipital areas. This analysis revealed that arousal was associated with a suppression of power in the alpha frequency range. On the other hand, valence was associated with an increase in theta frequency power in response to excerpts inducing happiness compared to sadness. These findings are partly compatible with the model proposed by Heller, arguing that the frontal lobe is involved in modulating valenced experiences (the left frontal hemisphere for positive emotions) whereas the right parieto-temporal region contributes to the emotional arousal. © The Author (2016). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  20. Coded Random Access

    DEFF Research Database (Denmark)

    Paolini, Enrico; Stefanovic, Cedomir; Liva, Gianluigi

    2015-01-01

    The rise of machine-to-machine communications has rekindled the interest in random access protocols as a support for a massive number of uncoordinatedly transmitting devices. The legacy ALOHA approach is developed under a collision model, where slots containing collided packets are considered...... as waste. However, if the common receiver (e.g., base station) is capable to store the collision slots and use them in a transmission recovery process based on successive interference cancellation, the design space for access protocols is radically expanded. We present the paradigm of coded random access......, in which the structure of the access protocol can be mapped to a structure of an erasure-correcting code defined on graph. This opens the possibility to use coding theory and tools for designing efficient random access protocols, offering markedly better performance than ALOHA. Several instances of coded...
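
    The decoding idea can be demonstrated with a toy frame simulation: each user transmits replicas of its packet in a few random slots, and the receiver alternates between decoding singleton slots and cancelling the resolved user's replicas. The frame size and degree distribution below are arbitrary choices, not the optimized distributions discussed in the literature.

```python
import random

def simulate(n_users=60, n_slots=100, degrees=(2, 3), seed=1):
    rng = random.Random(seed)
    slots = [set() for _ in range(n_slots)]
    for u in range(n_users):                      # place each user's replicas
        for s in rng.sample(range(n_slots), rng.choice(degrees)):
            slots[s].add(u)
    resolved, progress = set(), True
    while progress:                               # iterative interference cancellation
        progress = False
        for s in range(n_slots):
            if len(slots[s]) == 1:                # singleton slot -> decodable
                u = next(iter(slots[s]))
                resolved.add(u)
                for t in range(n_slots):          # cancel all replicas of user u
                    slots[t].discard(u)
                progress = True
    return len(resolved) / n_users

print(f"fraction of users resolved: {simulate():.2f}")
```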

  1. Computer modelling of the WWER fuel elements under high burnup conditions by the computer codes PIN-W and RODQ2D

    International Nuclear Information System (INIS)

    Valach, M.; Zymak, J.; Svoboda, R.

    1997-01-01

    This paper presents the development status of the computer codes for modelling the thermomechanical behaviour of WWER fuel elements under high burnup conditions at the Nuclear Research Institute Rez. The emphasis is on the analysis of the results of the parametric calculations performed with the programmes PIN-W and RODQ2D, rather than on their detailed theoretical description. Several new optional correlations for the UO2 thermal conductivity, including the degradation effect caused by burnup, were implemented into both codes. Examples of the calculations performed document the differences between the previous and new versions of both programmes. Some recommendations for further development of the codes are given in the conclusion. (author). 6 refs, 9 figs

  2. Neural processes underlying the orienting of attention without awareness.

    Science.gov (United States)

    Giattino, Charles M; Alam, Zaynah M; Woldorff, Marty G

    2017-07-22

    Despite long being of interest to both philosophers and scientists, the relationship between attention and perceptual awareness is not well understood, especially to what extent they are even dissociable. Previous studies have shown that stimuli of which we are unaware can orient spatial attention and affect behavior. Yet, relatively little is understood about the neural processes underlying such unconscious orienting of attention, and how they compare to conscious orienting. To directly compare the cascade of attentional processes with and without awareness of the orienting stimulus, we employed a spatial-cueing paradigm and used object-substitution masking to manipulate subjects' awareness of the cues. We recorded EEG during the task, from which we extracted hallmark event-related-potential (ERP) indices of attention. Behaviorally, there was a 61 ms validity effect (invalidly minus validly cued target RTs) on cue-aware trials. On cue-unaware trials, subjects also had a robust validity effect of 20 ms, despite being unaware of the cue. An N2pc to the cue, a hallmark ERP index of the lateralized orienting of attention, was observed for cue-aware but not cue-unaware trials, despite the latter showing a clear behavioral validity effect. Finally, the P1 sensory-ERP response to the targets was larger when validly versus invalidly cued, even when subjects were unaware of the preceding cue, demonstrating enhanced sensory processing of targets following subliminal cues. These results suggest that subliminal stimuli can orient attention and lead to subsequent enhancements to both stimulus sensory processing and behavior, but through different neural mechanisms (such as via a subcortical pathway) than stimuli we perceive. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Vectorization of KENO IV code and an estimate of vector-parallel processing

    International Nuclear Information System (INIS)

    Asai, Kiyoshi; Higuchi, Kenji; Katakura, Jun-ichi; Kurita, Yutaka.

    1986-10-01

    The multi-group criticality safety code KENO IV has been vectorized and tested on the FACOM VP-100 vector processor. At first, the vectorized KENO IV ran slower than the original code on a scalar processor by a factor of 1.4, because of the overhead introduced by vectorization. After modifications to the algorithms and vectorization techniques, the vectorized version became faster than the original by factors of 1.4 and 3.0 on the vector processor for sample problems with complex and simple geometries, respectively. For further speedup of the code, some improvements to the compiler and the hardware, especially the addition of Monte Carlo pipelines to the vector processor, are discussed. Finally, a pipelined parallel processor system is proposed and its performance is estimated. (author)

  4. Stochastic analysis in production process and ecology under uncertainty

    CERN Document Server

    Bieda, Bogusław

    2014-01-01

    The monograph addresses the problem of stochastic analysis based on uncertainty assessment by simulation, and the application of this method to ecology and the steel industry under uncertainty. The first chapter defines the Monte Carlo (MC) method and random variables in stochastic models. Chapter two deals with contaminant transport in porous media; a stochastic approach for modelling Municipal Solid Waste transit-time contaminants using MC simulation has been worked out. The third chapter describes the risk analysis of the waste-to-energy facility proposal for the city of Konin, including the financial aspects. An environmental impact assessment of the ArcelorMittal Steel Power Plant in Kraków is given in chapter four; here, four scenarios of the energy mix production processes were studied. Chapter five contains examples of using ecological Life Cycle Assessment (LCA) - a relatively new method of environmental impact assessment - which helps in preparing pro-ecological strategies and can lead to reducing t...

  5. Using grounded theory coding mechanisms to analyze case study and focus group data in the context of software process research

    OpenAIRE

    O'Connor, Rory

    2012-01-01

    The primary aim of this chapter is to outline a potentially powerful framework for the combination of research approaches utilizing the Grounded Theory coding mechanism for Case Study, and Focus Groups data analysis. A secondary aim of this chapter is to provide a roadmap for such a usage by way of an example research project. The context for this project is the need to study and evaluate the actual practice of software development processes in real world commercial settings of software compa...

  6. Report number codes

    International Nuclear Information System (INIS)

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name

  8. Cognitive Processes in Decisions Under Risk Are Not the Same As in Decisions Under Uncertainty

    Directory of Open Access Journals (Sweden)

    Kirsten G Volz

    2012-07-01

    Full Text Available We deal with risk versus uncertainty, a distinction that is of fundamental importance for cognitive neuroscience yet largely neglected. In a world of risk ("small world"), all alternatives, consequences, and probabilities are known. In uncertain ("large") worlds, some of this information is unknown or unknowable. Most cognitive neuroscience studies exclusively study the neural correlates of decisions under risk (e.g., lotteries), with the tacit implication that understanding these would lead to an understanding of decision making in general. First, we show that normative strategies for decisions under risk do not generalize to uncertain worlds, where simple heuristics are often the more accurate strategies. Second, we argue that the cognitive processes for making decisions in a world of risk are not the same as those for dealing with uncertainty. Because situations with known risks are the exception rather than the rule in human evolution, it is unlikely that our brains are adapted to them. We therefore suggest a paradigm shift towards studying decision processes in uncertain worlds and provide first examples.

  9. Global Intersection of Long Non-Coding RNAs with Processed and Unprocessed Pseudogenes in the Human Genome

    Directory of Open Access Journals (Sweden)

    Michael John Milligan

    2016-03-01

    Full Text Available Pseudogenes are abundant in the human genome and had long been thought of purely as nonfunctional gene fossils. Recent observations point to a role for pseudogenes in regulating genes transcriptionally and post-transcriptionally in human cells. To computationally interrogate the network space of integrated pseudogene and long non-coding RNA regulation in the human transcriptome, we developed and implemented an algorithm to identify all long non-coding RNA (lncRNA) transcripts that overlap the genomic spans, and specifically the exons, of any human pseudogenes in either sense or antisense orientation. As inputs to our algorithm, we imported three public repositories of pseudogenes: GENCODE v17 (processed and unprocessed; Ensembl 72), Retroposed Pseudogenes V5 (processed only), and Yale Pseudo60 (processed and unprocessed; Ensembl 60); two public lncRNA catalogs: Broad Institute and GENCODE v17; NCBI annotated piRNAs; and NHGRI clinical variants. The data sets were retrieved from the UCSC Genome Database using the UCSC Table Browser. We identified 2277 loci containing exon-to-exon overlaps between pseudogenes, both processed and unprocessed, and long non-coding RNA genes. Of these loci we identified 1167 with Genbank EST and full-length cDNA support providing direct evidence of transcription on one or both strands with exon-to-exon overlaps. The analysis converged on 313 pseudogene-lncRNA exon-to-exon overlaps that were bidirectionally supported by both full-length cDNAs and ESTs. In the process of identifying transcribed pseudogenes, we generated a comprehensive, positionally non-redundant encyclopedia of human pseudogenes, drawing upon multiple, and formerly disparate, public pseudogene repositories. Collectively, these observations suggest that pseudogenes are pervasively transcribed on both strands and are common drivers of gene regulation.
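
    The core genomic computation here, finding exon-to-exon overlaps in sense or antisense orientation, is an interval intersection. A small sort-and-scan sketch follows, with made-up coordinates, standing in for what would normally be done with bedtools-style tooling; it is not the authors' implementation.

```python
from collections import defaultdict

def exon_overlaps(pseudo_exons, lnc_exons):
    """Exon-to-exon overlaps between two (chrom, start, end, strand, name) lists."""
    by_chrom = defaultdict(list)
    for e in lnc_exons:
        by_chrom[e[0]].append(e)
    for exons in by_chrom.values():
        exons.sort(key=lambda e: e[1])              # sort by start for early exit
    hits = []
    for p in pseudo_exons:
        for l in by_chrom[p[0]]:
            if l[1] >= p[2]:                        # no later exon can overlap
                break
            if l[2] > p[1]:                         # half-open interval intersection
                hits.append((p[4], l[4],
                             "sense" if l[3] == p[3] else "antisense"))
    return hits

pseudo = [("chr1", 1000, 1500, "+", "PSG1-exon1")]
lncs = [("chr1", 1400, 2000, "-", "LINC-A-exon2"),
        ("chr1", 5000, 5200, "+", "LINC-B-exon1")]
print(exon_overlaps(pseudo, lncs))                  # -> one antisense overlap
```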

  10. Neural processes underlying cultural differences in cognitive persistence.

    Science.gov (United States)

    Telzer, Eva H; Qu, Yang; Lin, Lynda C

    2017-08-01

    Self-improvement motivation, which occurs when individuals seek to improve upon their competence by gaining new knowledge and improving upon their skills, is critical for cognitive, social, and educational adjustment. While many studies have delineated the neural mechanisms supporting extrinsic motivation induced by monetary rewards, less work has examined the neural processes that support intrinsically motivated behaviors, such as self-improvement motivation. Because cultural groups traditionally vary in terms of their self-improvement motivation, we examined cultural differences in the behavioral and neural processes underlying motivated behaviors during cognitive persistence in the absence of extrinsic rewards. In Study 1, 71 American (47 females, M=19.68 years) and 68 Chinese (38 females, M=19.37 years) students completed a behavioral cognitive control task that required cognitive persistence across time. In Study 2, 14 American and 15 Chinese students completed the same cognitive persistence task during an fMRI scan. Across both studies, American students showed significant declines in cognitive performance across time, whereas Chinese participants demonstrated effective cognitive persistence. These behavioral effects were explained by cultural differences in self-improvement motivation and paralleled by increasing activation and functional coupling between the inferior frontal gyrus (IFG) and ventral striatum (VS) across the task among Chinese participants, neural activation and coupling that remained low in American participants. These findings suggest a potential neural mechanism by which the VS and IFG work in concert to promote cognitive persistence in the absence of extrinsic rewards. Thus, frontostriatal circuitry may be a neurobiological signal representing intrinsic motivation for self-improvement that serves an adaptive function, increasing Chinese students' motivation to engage in cognitive persistence. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Study of counter current flow limitation model of MARS-KS and SPACE codes under Dukler's air/water flooding test conditions

    International Nuclear Information System (INIS)

    Lee, Won Woong; Kim, Min Gil; Lee, Jeong Ik; Bang, Young Seok

    2015-01-01

    In particular, CCFL (counter-current flow limitation) occurs in components such as the hot leg, the downcomer annulus and the steam generator inlet plenum during a LOCA, where flows in two opposite directions are possible, and it can limit the injection of ECCS water. CCFL is therefore one of the thermal-hydraulic models with a significant effect on the performance of reactor safety analysis codes. In this study, the CCFL model is evaluated with MARS-KS, based on two-phase two-field governing equations, and SPACE, based on two-phase three-field governing equations. The study compares MARS-KS, which is used for evaluating the safety of Korean nuclear power plants, with SPACE, which is currently under assessment for safety evaluation of newly designed plants. The results of the two codes for the liquid upflow and liquid downflow rates at different gas flow rates are compared with the well-known Dukler air/water CCFL experimental data. This study is helpful for understanding the differences between system analysis codes with different governing equations, models and correlations, and for further improving the accuracy of such codes.
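
    System codes typically express CCFL as a Wallis-type flooding correlation. The sketch below uses that correlation with typical but assumed constants m and C, not the actual MARS-KS or SPACE inputs, to show how the maximum liquid downflow shrinks as the gas flux grows.

```python
import math

G = 9.81

def wallis_max_downflow(j_gas, D=0.05, rho_f=1000.0, rho_g=1.2, m=1.0, C=0.88):
    """Max liquid superficial velocity [m/s] that can flow down against an
    upward gas flux j_gas [m/s], from sqrt(jg*) + m*sqrt(jf*) = C, where
    jk* = jk * sqrt(rho_k / (g * D * (rho_f - rho_g)))."""
    scale_g = math.sqrt(rho_g / (G * D * (rho_f - rho_g)))
    scale_f = math.sqrt(rho_f / (G * D * (rho_f - rho_g)))
    root = (C - math.sqrt(j_gas * scale_g)) / m
    if root <= 0.0:
        return 0.0                                  # complete flooding: no downflow
    return root * root / scale_f

for jg in (2.0, 5.0, 8.0):
    print(f"jg = {jg:.0f} m/s -> max liquid downflow {wallis_max_downflow(jg):.3f} m/s")
```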

  12. Creative Industries: Development Processes Under Contemporary Conditions of Globalization

    Directory of Open Access Journals (Sweden)

    Valerija Kontrimienė

    2017-06-01

    Full Text Available The article deals with the development of creative industries under conditions of worldwide economic growth and globalization, discloses the role of the creative industries sector, and shows its place in the system of the modern global economy. The paper presents a comparative analysis of theories and theoretical approaches to the creative industries sector and its development, and defines the regularities and specificities characteristic of the development of creative industries. Particular attention is paid to the growth and development of creative industries in view of the current challenges of globalization, and to the most important specificities of the developing sector in the context of economic globalization. The paper examines the trends reflecting the place of the creative industries sector in the modern world economy, including tendencies in the export of the products created in this sector. The article considers the issues of developing creative industries and reveals priorities for future research.

  13. Gaussian process regression for sensor networks under localization uncertainty

    Science.gov (United States)

    Jadaliha, M.; Xu, Yunfei; Choi, Jongeun; Johnson, N.S.; Li, Weiming

    2013-01-01

    In this paper, we formulate Gaussian process regression with observations under localization uncertainty due to resource-constrained sensor networks. In our formulation, the effects of observations, measurement noise, localization uncertainty, and prior distributions are all correctly incorporated in the posterior predictive statistics. The analytically intractable posterior predictive statistics are approximated by two techniques, viz., Monte Carlo sampling and Laplace's method. These approximation techniques have been carefully tailored to our problems, and their approximation error and complexity are analyzed. A simulation study demonstrates that the proposed approaches perform much better than approaches that do not properly consider the localization uncertainty. Finally, we applied the proposed approaches to experimentally collected real data from a dye concentration field over a section of a river and a temperature field of an outdoor swimming pool, to provide proof-of-concept tests and to evaluate the proposed schemes in real situations. In both simulation and experimental results, the proposed methods outperform the quick-and-dirty solutions often used in practice.
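
    The Monte Carlo variant of the approximation can be sketched directly: sample plausible true sensor positions around the reported ones and average the resulting GP posterior means. The squared-exponential kernel, hyperparameters and synthetic field below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def sq_exp(A, B, ell=1.0, sf=1.0):
    """Squared-exponential kernel matrix between row-wise point sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return sf**2 * np.exp(-0.5 * d2 / ell**2)

def gp_mean(X, y, Xs, noise=0.05):
    """Posterior predictive mean of a zero-mean GP at test points Xs."""
    K = sq_exp(X, X) + noise**2 * np.eye(len(X))
    return sq_exp(Xs, X) @ np.linalg.solve(K, y)

# Synthetic field sampled at true positions; only noisy positions are reported.
X_true = rng.uniform(0.0, 5.0, size=(30, 2))
y = np.sin(X_true[:, 0]) + 0.05 * rng.standard_normal(30)
loc_sigma = 0.2                                   # localization noise std (assumed)
X_rep = X_true + loc_sigma * rng.standard_normal(X_true.shape)

Xs = np.array([[2.5, 2.5]])
naive = gp_mean(X_rep, y, Xs)                     # ignores localization error
mc = np.mean([gp_mean(X_rep + loc_sigma * rng.standard_normal(X_rep.shape), y, Xs)
              for _ in range(200)], axis=0)       # averages over plausible positions
print(f"naive: {naive[0]:.3f}, MC-averaged: {mc[0]:.3f}, truth: {np.sin(2.5):.3f}")
```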

  14. Optimization and Control of Pressure Swing Adsorption Processes Under Uncertainty

    KAUST Repository

    Khajuria, Harish

    2012-03-21

    The real-time periodic performance of a pressure swing adsorption (PSA) system strongly depends on the choice of key decision variables and operational considerations, such as the processing steps and the column pressure temporal profiles, making its design and operation a challenging task. This work presents a detailed optimization-based approach for simultaneously incorporating PSA design, operational, and control aspects under the effect of time-variant and time-invariant disturbances. It is applied to a two-bed, six-step PSA system represented by a rigorous mathematical model, where the key optimization objective is to maximize the expected H2 recovery while achieving a closed-loop product H2 purity of 99.99% for separating a 70% H2, 30% CH4 feed. The benefits over a sequential design-and-control approach are shown in terms of a closed-loop recovery improvement of more than 3%, while the incorporation of explicit/multiparametric model predictive controllers improves the closed-loop performance. © 2012 American Institute of Chemical Engineers (AIChE).

  15. Anticipatory processes under academic stress: an ERP study.

    Science.gov (United States)

    Duan, Hongxia; Yuan, Yiran; Yang, Can; Zhang, Liang; Zhang, Kan; Wu, Jianhui

    2015-03-01

    It is well known that preparing for and taking high-stakes exams has a significant influence on the emotional and physiological wellbeing of exam-takers, but few studies have investigated the resulting cognitive changes. The current study examined the effect of examination-induced academic stress on anticipation in information processing. Anticipation was indexed using the contingent negative variation (CNV). Electroencephalograms (EEG) were collected from 42 participants using the classic S1-S2 paradigm. These participants were preparing for the Chinese National Postgraduate Entrance Exam (NPEE). EEGs were also collected from 21 age-matched, non-exam comparison participants. The levels of perceived stress and state anxiety were higher and both the initial CNV (iCNV) and the late CNV (lCNV) were more negative in the exam group than in the non-exam group. These results suggest that participants under academic stress experienced greater anticipation of upcoming events. More important, for the non-exam group, state anxiety was positively related to both the iCNV and lCNV amplitude, and this correlation existed when trait anxiety was controlled; however, there was no such relationship in the exam group. These results suggested that the cortical anticipatory activity in the high-stressed exam group reached the maximum ceiling, leaving little room for transient increases in state anxiety. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. Blind signal processing algorithms under DC biased Gaussian noise

    Science.gov (United States)

    Kim, Namyong; Byun, Hyung-Gi; Lim, Jeong-Ok

    2013-05-01

    Distortions caused by a DC-biased laser input can be modeled as DC-biased Gaussian noise, and removing the DC bias is important in the demodulation of the electrical signal in most optical communications. In this paper, a new performance criterion and a related algorithm for unsupervised equalization are proposed for communication systems operating under channel distortion and DC-biased Gaussian noise. The proposed criterion utilizes the Euclidean distance between a Dirac delta function located at zero on the error axis and the probability density function of biased constant-modulus errors, where the constant-modulus error is defined as the difference between the system output and a constant modulus calculated from the transmitted symbol points. Results obtained from simulations under channel models with fading and with DC bias noise abruptly added to the background Gaussian noise show that the proposed algorithm converges rapidly even after the onset of the DC bias, proving that the proposed criterion can be effectively applied to optical communication systems corrupted by channel distortion and DC bias noise.
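
    For context, the stochastic-gradient baseline that such criteria improve upon is the constant modulus algorithm (CMA). The sketch below runs plain CMA on a synthetic DC-biased channel, the very setting in which the constant-modulus error degrades and the proposed distance-based criterion is claimed to help; it is not the paper's algorithm, and all signal parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
s = rng.choice([-1.0, 1.0], size=n)               # BPSK symbols, constant modulus
x = np.convolve(s, [1.0, 0.4, -0.2], mode="same") # toy dispersive channel
x += 0.3 + 0.02 * rng.standard_normal(n)          # DC bias plus Gaussian noise

L, mu, R2 = 11, 5e-4, 1.0                         # taps, step size, CM radius
w = np.zeros(L)
w[L // 2] = 1.0                                   # center-spike initialization
for k in range(L, n):
    u = x[k - L:k][::-1]                          # regressor vector
    yk = w @ u
    w -= mu * yk * (yk * yk - R2) * u             # CMA(2,2) stochastic gradient

out = np.array([w @ x[k - L:k][::-1] for k in range(L, n)])
print(f"output modulus spread: {np.std(np.abs(out)):.3f} (DC bias degrades plain CMA)")
```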

  17. The safety relief valve handbook design and use of process safety valves to ASME and International codes and standards

    CERN Document Server

    Hellemans, Marc

    2009-01-01

    The Safety Valve Handbook is a professional reference for design, process, instrumentation, plant and maintenance engineers who work with fluid flow and transportation systems in the process industries, which cover the chemical, oil and gas, water, paper and pulp, food and bio products and energy sectors. It meets the needs of engineers who have responsibilities for specifying, installing, inspecting or maintaining safety valves and flow control systems. It will also be an important reference for process safety and loss prevention engineers, environmental engineers, and plant and process designers who need to understand the operation of safety valves in a wider equipment or plant design context. No other publication is dedicated to safety valves or to the extensive codes and standards that govern their installation and use. A single source means users save time in searching for specific information about safety valves. The Safety Valve Handbook contains all of the vital technical and standards informat...

  18. The intercomparison of aerosol codes

    International Nuclear Information System (INIS)

    Dunbar, I.H.; Fermandjian, J.; Gauvain, J.

    1988-01-01

    The behavior of aerosols in a reactor containment vessel following a severe accident could be an important determinant of the accident source term to the environment. Various processes result in the deposition of the aerosol onto surfaces within the containment, from where they are much less likely to be released. Some of these processes are very sensitive to particle size, so it is important to model the aerosol growth processes: agglomeration and condensation. A number of computer codes have been written to model growth and deposition processes. They have been tested against each other in a series of code comparison exercises. These exercises have investigated sensitivities to physical and numerical assumptions and have also proved a useful means of quality control for the codes. Various exercises in which code predictions are compared with experimental results are now under way

  19. Validation of activity determination codes and nuclide vectors by using results from processing of retired components and operational waste

    International Nuclear Information System (INIS)

    Lundgren, Klas; Larsson, Arne

    2012-01-01

    Decommissioning studies for nuclear power reactors are performed in order to assess the decommissioning costs and the waste volumes, as well as to provide data for the licensing and construction of the LILW repositories. An important part of this work is to estimate the amount of radioactivity in the different types of decommissioning waste. Studsvik ALARA Engineering has performed such assessments for LWRs and other nuclear facilities in Sweden. These assessments are to a large extent dependent on calculations, senior expertise and sampling at the facilities. The precision of the calculations has been found to be relatively high close to the reactor core; for natural reasons, the precision declines with distance. Even where the activity values are lower, the content of hard-to-measure nuclides can cause problems in the long-term safety demonstration of LLW repositories. At the same time, Studsvik is processing significant volumes of metallic and combustible waste from power stations in operation and in the decommissioning phase, as well as from other nuclear facilities such as research and waste treatment facilities. By combining this unique expertise in radioactivity inventory assessment with the large data bank that the waste processing represents, the activity determination codes can be validated and the waste processing analysis supported with additional data. The intention of this presentation is to highlight how the European nuclear industry could jointly use waste processing data for the validation of activity determination codes. (authors)

  20. Impaired Letter-String Processing in Developmental Dyslexia: What Visual-to-Phonology Code Mapping Disorder?

    Science.gov (United States)

    Valdois, Sylviane; Lassus-Sangosse, Delphine; Lobier, Muriel

    2012-01-01

    Poor parallel letter-string processing in developmental dyslexia was taken as evidence of poor visual attention (VA) span, that is, a limitation of visual attentional resources that affects multi-character processing. However, the use of letter stimuli in oral report tasks was challenged on its capacity to highlight a VA span disorder. In…

  1. Method of Coding Search Strings as Markov Processes Using a Higher Level Language.

    Science.gov (United States)

    Ghanti, Srinivas; Evans, John E.

    For much of the twentieth century, Markov theory and Markov processes have been widely accepted as valid ways to view statistical variables and parameters. In the complex realm of online searching, where researchers are always seeking the route to the best search strategies and the most powerful query terms and sequences, Markov process analysis…
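
    As a toy illustration of treating a search string as a Markov process, the sketch below draws successive query terms from a row-stochastic transition matrix; the terms and probabilities are invented for the example, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical query terms and a row-stochastic transition matrix:
# P[i, j] is the probability that term j follows term i in a search string.
terms = ["markov", "process", "search", "retrieval"]
P = np.array([[0.1, 0.6, 0.2, 0.1],
              [0.3, 0.1, 0.4, 0.2],
              [0.2, 0.3, 0.1, 0.4],
              [0.4, 0.2, 0.3, 0.1]])

state = 0                      # start from "markov"
string = [terms[state]]
for _ in range(4):             # draw four more terms from the chain
    state = rng.choice(len(terms), p=P[state])
    string.append(terms[state])
print(" ".join(string))
```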

  2. INLUX-DBR - A calculation code to calculate indoor natural illuminance inside buildings under various sky conditions

    International Nuclear Information System (INIS)

    Ferraro, V.; Igawa, N.; Marinelli, V.

    2010-01-01

    A calculation code, named INLUX-DBR, is presented. It is a modified version of the INLUX code, able to predict the illuminance distribution on the inside surfaces of a room with six walls and a window, and on the work plane. At each desired instant the code solves the system of illuminance equations of the surface elements, each characterized by its reflection coefficient and its view factors toward the other elements. The model implemented in the code considers the sky-diffuse luminance distribution, the sun beam light and the light reflected from the ground toward the room. The code was validated by comparing the calculated values of illuminance with the experimental values measured inside a scale model (1:5) of a building room, under various sky conditions on overcast, clear and intermediate days. The validation is performed using the sky luminance data measured by a sky scanner and the measured beam illuminance of the sun as input data. A comparative analysis of some of the well-known calculation models of sky luminance, namely the Perez, Igawa and CIE models, was also carried out, comparing the code predictions and the measured values of inside illuminance in the scale model.
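
    The per-element illuminance balance can be pictured as a small linear system; the sketch below solves it for a hypothetical three-element enclosure, assuming F[i, j] is the fraction of light leaving element j that reaches element i, with all numbers invented for the example.

```python
import numpy as np

# Hypothetical 3-element enclosure: direct illuminance E0 on each element
# (sky, sun and ground contributions), reflectances rho, view factors F.
E0 = np.array([500.0, 120.0, 80.0])      # lux, direct component
rho = np.array([0.7, 0.5, 0.3])          # element reflection coefficients
F = np.array([[0.0, 0.6, 0.4],
              [0.5, 0.0, 0.5],
              [0.4, 0.6, 0.0]])

# Balance per element: E_i = E0_i + sum_j F[i, j] * rho_j * E_j,
# i.e. (I - F @ diag(rho)) @ E = E0.
A = np.eye(3) - F * rho[None, :]
E = np.linalg.solve(A, E0)
print(E)    # total (direct + interreflected) illuminance on each element
```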

  3. Development of a model and computer code to describe solar grade silicon production processes. Fifth quarterly report

    Energy Technology Data Exchange (ETDEWEB)

    Srivastava, R.; Gould, R.K.

    1979-02-01

    This program aims at developing mathematical models, and computer codes based on these models, which will allow prediction of the product distribution in chemical reactors in which gaseous silicon compounds are converted to condensed-phase silicon. The reactors to be modeled are flow reactors in which silane or one of the halogenated silanes is thermally decomposed or reacted with an alkali metal, H/sub 2/ or H atoms. Because the product of interest is particulate silicon, processes which must be modeled, in addition to mixing and reaction of gas-phase reactants, include the nucleation and growth of condensed Si via coagulation, condensation, and heterogeneous reaction. During this report period computer codes were developed and used to calculate: (1) coefficients for Si vapor and Si particles describing transport due to concentration and temperature gradients (i.e., Fick and Soret diffusion, respectively), and (2) estimates of thermochemical properties of Si n-mers. The former are needed to allow the mass flux of Si to reactor walls to be calculated. Because of the extremely large temperature gradients that exist in some of the reactors to be used in producing Si (particularly the Westinghouse reactor), it was found that thermal (Soret) diffusion can be the dominant transport mechanism for certain sizes of Si particles. The thermochemical estimates are required to allow computation of the formation rate of Si droplets. With these calculations complete, the particle routines in the modified LAPP code are ready for debugging, which is now in progress.
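
    For orientation, the two transport contributions named above combine into a flux of the generic form J = -D*dc/dx - D_T*c*dT/dx; the sketch below evaluates it with placeholder coefficients and gradients, not the report's computed values.

```python
# Combined Fick and Soret (thermal) diffusion flux of a dilute species:
#   J = -D * dc/dx - D_T * c * dT/dx
# The coefficients and gradients below are illustrative placeholders,
# not the report's computed values.
def species_flux(D, DT, c, dc_dx, dT_dx):
    return -D * dc_dx - DT * c * dT_dx

# Steep temperature gradient, as in the reactors discussed above:
print(species_flux(D=1e-5, DT=1e-8, c=0.01, dc_dx=-2.0, dT_dx=5e4))
```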

  4. Network Coding

    Indian Academy of Sciences (India)

    Network coding is a technique to increase the amount of information flow in a network by making the key observation that information flow is fundamentally different from commodity flow. Under traditional methods of operation of data networks, intermediate nodes are restricted to simply forwarding their incoming packets.
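
    A minimal illustration of the idea (the classic butterfly-network example, not taken from the article): an intermediate node sends the XOR of two packets, and each sink recovers the packet it is missing.

```python
# Butterfly-network intuition: a node XORs two packets so that one coded
# transmission serves two sinks; each sink XORs it with the packet it
# already has to recover the one it is missing.
a = 0b10110010
b = 0b01101100
coded = a ^ b           # sent once over the shared bottleneck link
assert coded ^ b == a   # sink holding b recovers a
assert coded ^ a == b   # sink holding a recovers b
print(bin(coded))
```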

  5. Overcoming Methodological Obstacles in Business Process Simulation under Deep Uncertainty

    NARCIS (Netherlands)

    Markensteijn, T.L.

    2013-01-01

    Organizations operate in ever-changing environments, which results in the need for constant adaptation of business processes and structures. Discrete Event Simulation (DES) is a commonly used application of Business Process Simulation to support decision makers in complex processes. However, in case deep

  6. Disruption of Relational Processing Underlies Poor Memory for Order

    Science.gov (United States)

    Jonker, Tanya R.; MacLeod, Colin M.

    2015-01-01

    McDaniel and Bugg (2008) proposed that relatively uncommon stimuli and encoding tasks encourage elaborative encoding of individual items (item-specific processing), whereas relatively typical or common encoding tasks encourage encoding of associations among list items (relational processing). It is this relational processing that is thought to…

  7. Starch hydrolysis under low water conditions: a conceptual process design

    NARCIS (Netherlands)

    Veen, van der M.E.; Veelaert, S.; Goot, van der A.J.; Boom, R.M.

    2006-01-01

    A process concept is presented for the hydrolysis of starch to glucose in highly concentrated systems. Depending on the moisture content, the process consists of two or three stages. The two-stage process comprises combined thermal and enzymatic liquefaction, followed by enzymatic saccharification.

  8. Improving the Emergency Department's Processes of Coding and Billing at Brooke Army Medical Center

    National Research Council Canada - National Science Library

    Lehning, Peter

    2003-01-01

    … Beginning in October 2002, outpatient itemized billing was mandated for use in the AMEDD. This system shifted the process of billing for outpatient services from an all-inclusive rate to one based on the actual care provided...

  9. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for the automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in a department of radiology. This program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist, since May 1992. The ACR dictionary files consisted of 11 files, one for the organ codes and the others for the pathology codes. The organ code was obtained by typing the organ name or the code number itself among the upper- and lower-level codes of the selected one that were simultaneously displayed on the screen. According to the first digit of the selected organ code, the corresponding pathology code file was chosen automatically. In a similar fashion to the organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code was achieved by the same program, and incorporation of this program into other data-processing programs was possible. This program had the merits of simple operation, accurate and detailed coding, and easy adjustment for another program. Therefore, this program can be used for the automation of routine work in the department of radiology.
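
    A minimal sketch of the two-level selection the abstract describes, where the first digit of the chosen organ code picks the pathology code file; all code values and labels here are hypothetical placeholders apart from the '131.3661' composite format quoted above.

```python
# Hypothetical fragments of the ACR dictionary files: one organ-code table,
# and pathology-code files keyed by the organ code's first digit.
organ_codes = {"131": "organ entry (placeholder label)"}
pathology_files = {"1": {"3661": "pathology entry (placeholder label)"}}

def acr_code(organ: str, pathology: str) -> str:
    assert organ in organ_codes, "unknown organ code"
    table = pathology_files[organ[0]]     # file chosen by the first digit
    assert pathology in table, "unknown pathology code"
    return f"{organ}.{pathology}"         # composite code, e.g. '131.3661'

print(acr_code("131", "3661"))
```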

  10. Measuring the implementation of codes of conduct. An assessment method based on a process approach of the responsible organisation

    NARCIS (Netherlands)

    Nijhof, A.H.J.; Cludts, Stephan; Fisscher, O.A.M.; Laan, Albertus

    2003-01-01

    More and more organisations formulate a code of conduct in order to stimulate responsible behaviour among their members. Much time and energy is usually spent fixing the content of the code but many organisations get stuck in the challenge of implementing and maintaining the code. The code then

  11. Successive Transfers Relating to Movable Tangible Assets and Acquisition of Property under Article 937, Paragraph (1) of the Civil Code

    Directory of Open Access Journals (Sweden)

    Mara Ioan

    2015-12-01

    Full Text Available Apparently, article 1275, paragraph (1) of the Civil Code covers all situations that may arise in practice, without distinguishing whether the constituent or transferring contracts are of the same or of different natures. However, we consider that article 1275 of the Civil Code does not apply in all situations of successive transfers relating to movable tangible property granted by the same legal subject. Corroborating this text with the norms in article 937, paragraph (1) of the Civil Code and article 1273, paragraph (1) of the Civil Code leads to the solution according to which article 1275 of the Civil Code covers only the cases where the successive transfers of property are of the same nature, where the onerous primary act has not resulted in the immediate transmission of the real right prior to the document with the free subsidiary title, and where the primal act is free and the alternative is onerous. It is thus excluded from the application of the rule in question when the primary onerous act had as its effect the immediate transmission of the real right and then, without the delivery of the asset by the acquirer having occurred, a document with a free title was concluded subsidiarily.

  12. NSURE code

    International Nuclear Information System (INIS)

    Rattan, D.S.

    1993-11-01

    NSURE stands for Near-Surface Repository code. NSURE is a performance assessment code developed for the safety assessment of near-surface disposal facilities for low-level radioactive waste (LLRW). Part one of this report documents the NSURE model, the governing equations and formulation of the mathematical models, and their implementation under the SYVAC3 executive. The NSURE model simulates the release of nuclides from an engineered vault, their subsequent transport via the groundwater and surface water pathways to the biosphere, and predicts the resulting dose rate to a critical individual. Part two of this report consists of a User's manual, describing simulation procedures, input data preparation, output and example test cases.

  13. Processing of the GALILEO fuel rod code model uncertainties within the AREVA LWR realistic thermal-mechanical analysis methodology

    International Nuclear Information System (INIS)

    Mailhe, P.; Barbier, B.; Garnier, C.; Landskron, H.; Sedlacek, R.; Arimescu, I.; Smith, M.; Bellanger, P.

    2013-01-01

    The availability of reliable tools and an associated methodology able to accurately predict LWR fuel behavior in all conditions is of great importance for safe and economic fuel usage. For that purpose, AREVA has developed its new global fuel rod performance code GALILEO along with its associated realistic thermal-mechanical analysis methodology. This realistic methodology is based on a Monte Carlo type random sampling of all relevant input variables. After having outlined the AREVA realistic methodology, this paper focuses on the GALILEO code benchmarking process, on its extended experimental database and on the assessment of the GALILEO model uncertainties. The propagation of these model uncertainties through the AREVA realistic methodology is also presented. This processing of the GALILEO model uncertainties is of the utmost importance for accurate fuel design margin evaluation, as illustrated by some application examples. With the submittal of the GALILEO Topical Report to the U.S. NRC in 2013, GALILEO and its methodology are on the way to being used industrially in a wide range of irradiation conditions. (authors)
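
    The random-sampling idea can be pictured with a generic Monte Carlo propagation; the sketch below uses an invented response function and invented uncertainty distributions, standing in for the GALILEO models and their assessed uncertainties.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000

# Invented multiplicative model factors with assumed uncertainties:
gap_conductance = rng.normal(1.0, 0.05, N)
fuel_conductivity = rng.normal(1.0, 0.03, N)

def fuel_temperature(gc, fc, nominal=1200.0):
    # Stand-in response surface for the real fuel rod calculation.
    return nominal * (1 + 0.4 * (1 - fc) + 0.2 * (1 - gc))

T = fuel_temperature(gap_conductance, fuel_conductivity)
# A simple upper percentile of the sampled population, in the spirit of
# tolerance-limit margin evaluations:
print(np.percentile(T, 95))
```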

  14. The Process of Interactional Sensitivity Coding in Health Care: Conceptually and Operationally Defining Patient-Centered Communication.

    Science.gov (United States)

    Sabee, Christina M; Koenig, Christopher J; Wingard, Leah; Foster, Jamie; Chivers, Nick; Olsher, David; Vandergriff, Ilona

    2015-01-01

    This study aimed to develop a process for measuring sensitivity in provider-patient interactions to better understand patient-centered communication. The authors developed the Process of Interactional Sensitivity Coding in Healthcare (PISCH) by incorporating a multimethod investigation into conversations between physicians and their patients with type 2 diabetes. The PISCH was then applied and assessed for its reliability across the unitization of interactions, the activities that were reflected, and the characteristics of patient-centered interactional sensitivity that were observed within each unit. In most cases, the PISCH resulted in reliable analysis of the interactions, but a few key areas (shared decision making, enabling self-management, and responding to emotion) were not reliably assessed. Implications of the test of this coding scheme include the expansion of the theoretical notion of interactional sensitivity to the health care context, rigorous implementation of a multimethod measurement development that relied on qualitative and quantitative assessments, and important future questions about the role of communication concepts in future interpersonal research.

  15. The operation of criminal process under Constitutio Criminalis Theresiana

    Directory of Open Access Journals (Sweden)

    Feješ Ištvan

    2013-01-01

    Full Text Available The paper is divided into four larger parts. The first part is the introduction, where the author briefly describes the origins of the inquisitorial procedure. The second part is devoted to the characteristics and structure of the procedure. The author shows here that the Theresiana has all the features of a classical inquisitorial process. The process is generally divided into investigation and trial. In addition, there was a procedure for deciding a so-called 'racurs', which was not a legal remedy in the strict sense of the word, but an appeal for judgement reversal and amnesty. As the investigation is the central part of the process, the paper devotes the most attention to this phase. It analyzes the division, commencement, operation and ending of the investigation. The paper discusses separately the process of reaching the judgement and the procedure based on the 'racurs'. The fourth part is the conclusion, where the author summarizes the characteristics of the process and the position of the accused. The paper concludes that despite important changes in the process, the position of the accused entirely reflected the spirit of feudal law, and the accused remained a disempowered object of the process.

  16. Flux behaviour under different operational conditions in osmosis process

    DEFF Research Database (Denmark)

    Korenak, Jasmina; Zarebska, Agata; Buksek, Hermina

    …the active membrane layer is facing the draw solution. The osmosis process can be affected by several factors, such as operating conditions (temperature and cross-flow velocity), feed and draw solution properties, and membrane characteristics. These factors can significantly contribute to the efficiency of the process itself. In order to implement the osmosis process on an industrial scale, process economy needs to be taken into consideration, as well as the desired final product quality. Membrane performance can be evaluated based on the water permeability and the selectivity of the membrane. The permeability...

  17. An overview of new video coding tools under consideration for VP10: the successor to VP9

    Science.gov (United States)

    Mukherjee, Debargha; Su, Hui; Bankoski, James; Converse, Alex; Han, Jingning; Liu, Zoe; Xu, Yaowu

    2015-09-01

    Google started an open-source project, entitled the WebM Project, in 2010 to develop royalty-free video codecs for the web. The present-generation codec developed in the WebM project, called VP9, was finalized in mid-2013 and is currently being served extensively by YouTube, resulting in billions of views per day. Even though adoption of VP9 outside Google is still in its infancy, the WebM project has already embarked on an ambitious project to develop a next-edition codec, VP10, that achieves at least a generational bitrate reduction over the current-generation codec VP9. Although the project is still in its early stages, a set of new experimental coding tools has already been added to baseline VP9 to achieve modest coding gains over a large enough test set. This paper provides a technical overview of these coding tools.

  18. Evidence for embodied predictive coding: the anterior insula coordinates cortical processing of tactile deviancy

    DEFF Research Database (Denmark)

    Allen, Micah; Fardo, Francesca; Dietz, Martin

    2015-01-01

    …this possibility in the somatosensory domain, we measured brain activity using functional magnetic resonance imaging while healthy participants discriminated tactile stimuli in a roving oddball design. Dynamic Causal Modelling revealed that unexpected stimuli increased the strength of forward connections … processing of tactile changes to support body awareness…

  19. 77 FR 17460 - Multistakeholder Process To Develop Consumer Data Privacy Codes of Conduct

    Science.gov (United States)

    2012-03-26

    … and Promoting Innovation in the Global Digital Economy ('the Privacy and Innovation Blueprint') … convene open, transparent, consensus-based processes in which stakeholders develop legally enforceable … Protecting Privacy and Promoting Innovation in the Global Digital Economy, Feb. 2012, available at http://www...

  20. Media audit reveals inappropriate promotion of products under the scope of the International Code of Marketing of Breast-milk Substitutes in South-East Asia.

    Science.gov (United States)

    Vinje, Kristine Hansen; Phan, Linh Thi Hong; Nguyen, Tuan Thanh; Henjum, Sigrun; Ribe, Lovise Omoijuanfo; Mathisen, Roger

    2017-06-01

    To review regulations and to perform a media audit of promotion of products under the scope of the International Code of Marketing of Breast-milk Substitutes ('the Code') in South-East Asia. We reviewed national regulations relating to the Code and 800 clips of editorial content, 387 advertisements and 217 Facebook posts from January 2015 to January 2016. We explored the ecological association between regulations and market size, and between the number of advertisements and market size and growth of milk formula. Cambodia, Indonesia, Myanmar, Thailand and Vietnam. Regulations on the child's age for inappropriate marketing of products are all below the Code's updated recommendation of 36 months (i.e. 12 months in Thailand and Indonesia; 24 months in the other three countries) and are voluntary in Thailand. Although the advertisements complied with the national regulations on the age limit, they had content (e.g. stages of milk formula; messages about the benefit; pictures of a child) that confused audiences. Market size and growth of milk formula were positively associated with the number of newborns and the number of advertisements, and were not affected by the current level of implementation of breast-milk substitute laws and regulations. The present media audit reveals inappropriate promotion and insufficient national regulation of products under the scope of the Code in South-East Asia. Strengthened implementation of regulations aligned with the Code's updated recommendation should be part of comprehensive strategies to minimize the harmful effects of advertisements of breast-milk substitutes on maternal and child nutrition and health.

  1. TRUMP-BD: A computer code for the analysis of nuclear fuel assemblies under severe accident conditions

    International Nuclear Information System (INIS)

    Lombardo, N.J.; Marseille, T.J.; White, M.D.; Lowery, P.S.

    1990-06-01

    TRUMP-BD (Boil Down) is an extension of the TRUMP (Edwards 1972) computer program for the analysis of nuclear fuel assemblies under severe accident conditions. This extension allows prediction of the heat transfer rates, metal-water oxidation rates, fission product release rates, steam generation and consumption rates, and temperature distributions for nuclear fuel assemblies under core uncovery conditions. The heat transfer processes include conduction in solid structures, convection across fluid-solid boundaries, and radiation between interacting surfaces. Metal-water reaction kinetics are modeled with empirical relationships to predict the oxidation rates of steam-exposed Zircaloy and uranium metal. The metal-water oxidation models are parabolic in form with an Arrhenius temperature dependence. Uranium oxidation begins when fuel cladding failure occurs; Zircaloy oxidation occurs continuously at temperatures above 1300 °F when metal and steam are available. From the metal-water reactions, the hydrogen generation rate, total hydrogen release, and temporal and spatial distribution of oxide formations are computed. Consumption of steam from the oxidation reactions and the effect of hydrogen on the coolant properties is modeled for independent coolant flow channels. Fission product release from exposed uranium metal Zircaloy-clad fuel is modeled using empirical time and temperature relationships that consider the release to be subject to oxidation and volatilization/diffusion ('bake-out') release mechanisms. Release of the volatile species of iodine (I), tellurium (Te), cesium (Cs), ruthenium (Ru), strontium (Sr), zirconium (Zr), cerium (Ce), and barium (Ba) from uranium metal fuel may be modeled.
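
    The parabolic rate law with Arrhenius temperature dependence mentioned above takes the generic form w^2 = Kp*t with Kp = A*exp(-Q/RT); the sketch below evaluates it with placeholder constants, not the TRUMP-BD correlations.

```python
import numpy as np

def oxide_mass_gain(t_seconds, T_kelvin, A=3.3e3, Q=1.9e5, R=8.314):
    # Parabolic rate law with Arrhenius temperature dependence:
    #   w^2 = Kp * t,  Kp = A * exp(-Q / (R * T))
    # A and Q are placeholder values, not the TRUMP-BD correlations.
    Kp = A * np.exp(-Q / (R * T_kelvin))
    return np.sqrt(Kp * t_seconds)

print(oxide_mass_gain(t_seconds=60.0, T_kelvin=1500.0))
```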

  2. TRUMP-BD: A computer code for the analysis of nuclear fuel assemblies under severe accident conditions

    Energy Technology Data Exchange (ETDEWEB)

    Lombardo, N.J.; Marseille, T.J.; White, M.D.; Lowery, P.S.

    1990-06-01

    TRUMP-BD (Boil Down) is an extension of the TRUMP (Edwards 1972) computer program for the analysis of nuclear fuel assemblies under severe accident conditions. This extension allows prediction of the heat transfer rates, metal-water oxidation rates, fission product release rates, steam generation and consumption rates, and temperature distributions for nuclear fuel assemblies under core uncovery conditions. The heat transfer processes include conduction in solid structures, convection across fluid-solid boundaries, and radiation between interacting surfaces. Metal-water reaction kinetics are modeled with empirical relationships to predict the oxidation rates of steam-exposed Zircaloy and uranium metal. The metal-water oxidation models are parabolic in form with an Arrhenius temperature dependence. Uranium oxidation begins when fuel cladding failure occurs; Zircaloy oxidation occurs continuously at temperatures above 1300 °F when metal and steam are available. From the metal-water reactions, the hydrogen generation rate, total hydrogen release, and temporal and spatial distribution of oxide formations are computed. Consumption of steam from the oxidation reactions and the effect of hydrogen on the coolant properties is modeled for independent coolant flow channels. Fission product release from exposed uranium metal Zircaloy-clad fuel is modeled using empirical time and temperature relationships that consider the release to be subject to oxidation and volatilization/diffusion ('bake-out') release mechanisms. Release of the volatile species of iodine (I), tellurium (Te), cesium (Cs), ruthenium (Ru), strontium (Sr), zirconium (Zr), cerium (Ce), and barium (Ba) from uranium metal fuel may be modeled.

  3. Electrocatalytic reduction of carbon dioxide under plasma DBD process

    International Nuclear Information System (INIS)

    Amouroux, Jacques; Cavadias, Simeon

    2017-01-01

    Carbon dioxide can be converted, by reaction with hydrogen, into fine chemicals and liquid fuels such as methanol and DME. Methane production by the Sabatier reaction opens the way of carbon recycling for a circular economy of carbon resources. The catalytic process of methanation of carbon dioxide produces two molecules of water as a by-product. A current limitation in CO2 methanation is the ageing of catalysts, mainly due to water adsorption during the process. To avoid this adsorption, the process is operated at high temperature (300 °C–400 °C), leading to carbon deposition on the catalyst and its deactivation. To overcome this problem, a methanation plasma-catalytic process has been developed, which achieves a high CO2 conversion rate (80%) and a selectivity close to 100%, working from room temperature to 150 °C, instead of 300 °C–400 °C for the thermal catalytic process. The main characteristics of this process are high-voltage pulses of a few nanoseconds duration, activating the adsorption of CO2 in a bent configuration, and the polarization of the catalyst. The key step in this process is the desorption of water from the polarized catalyst. The high CO2 conversion at low temperature could be explained by the creation of a plasma inside the nanopores of the catalyst. (paper)

  4. Electrocatalytic reduction of carbon dioxide under plasma DBD process

    Science.gov (United States)

    Amouroux, Jacques; Cavadias, Simeon

    2017-11-01

    Carbon dioxide can be converted, by reaction with hydrogen, into fine chemicals and liquid fuels such as methanol and DME. Methane production by the Sabatier reaction opens the way of carbon recycling for a circular economy of carbon resources. The catalytic process of methanation of carbon dioxide produces two molecules of water as a by-product. A current limitation in the CO2 methanation is the ageing of catalysts, mainly due to water adsorption during the process. To avoid this adsorption, the process is operated at high temperature (300 °C-400 °C), leading to carbon deposition on the catalyst and its deactivation. To overcome this problem, a methanation plasma-catalytic process has been developed, which achieves high CO2 conversion rate (80%), and a selectivity close to 100%, working from room temperature to 150 °C, instead of 300 °C-400 °C for the thermal catalytic process. The main characteristics of this process are high-voltage pulses of few nanoseconds duration, activating the adsorption of CO2 in bent configuration and the polarization of the catalyst. The key step in this process is the desorption of water from the polarized catalyst. The high CO2 conversion at low temperature could be explained by the creation of a plasma inside the nanopores of the catalyst.

  5. Acoustic wave focusing in complex media using Nonlinear Time Reversal coded signal processing

    Czech Academy of Sciences Publication Activity Database

    Dos Santos, S.; Dvořáková, Zuzana; Lints, M.; Kůs, V.; Salupere, A.; Převorovský, Zdeněk

    2014-01-01

    Roč. 19, č. 12 (2014) ISSN 1435-4934. [European Conference on Non-Destructive Testing (ECNDT 2014) /11./. Praha, 06.10.2014-10.10.2014] Institutional support: RVO:61388998 Keywords: ultrasonic testing (UT) * signal processing * TR-NEWS * nonlinear time reversal * NDT * nonlinear acoustics Subject RIV: BI - Acoustics http://www.ndt.net/events/ECNDT2014/app/content/Slides/590_DosSantos_Rev1.pdf

  6. The GEM code. A simulation program for the evaporation and the fission process of an excited nucleus

    Energy Technology Data Exchange (ETDEWEB)

    Furihata, Shiori [Mitsubishi Research Institute Inc., Tokyo (Japan); Niita, Koji [Research Organization for Information Science and Technology, Tokai, Ibaraki (Japan); Meigo, Shin-ichiro; Ikeda, Yujiro; Maekawa, Fujio [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-03-01

    The GEM code is a simulation program that describes the de-excitation process of an excited nucleus, based on the Generalized Evaporation Model and the Atchison fission model. It has been shown that the combination of the Bertini intranuclear cascade model and GEM accurately predicts the cross sections of light fragments, such as Be, produced from proton-induced reactions. It has also been shown that the use of the re-evaluated parameters in the Atchison model improves predictions of the cross sections of fission fragments produced from the proton-induced reaction on Au. In this report, we present the details and usage of the GEM code. Furthermore, the results of benchmark calculations using the combination of the Bertini intranuclear cascade model and the GEM code (INC/GEM) are shown. Neutron spectra and isotope production cross sections for reactions on various targets irradiated by protons are calculated with INC/GEM. These results are compared with experimental data as well as with calculations using LAHET. INC/GEM reproduces the experiments on double-differential neutron emission from the reactions on Al and Pb. The isotopic distributions of He, Li, and Be produced from the reaction on Ag are in good agreement with experimental data within 50%, although INC/GEM underestimates those of nuclei heavier than O. It is also shown that the predictions of INC/GEM for the production of light fragments, such as Li and Be, are better than the calculations with LAHET, particularly for heavy targets. INC/GEM also gives better estimates of the cross sections of fission products than LAHET. (author)

  7. Underlying Skills of Oral and Silent Reading Fluency in Chinese: Perspective of Visual Rapid Processing.

    Science.gov (United States)

    Zhao, Jing; Kwok, Rosa K W; Liu, Menglian; Liu, Hanlong; Huang, Chen

    2016-01-01

    …findings suggest that the mechanisms underlying oral and silent reading fluency differ as early as the basic visual-coding stage. The current results might also reveal a potential modulation by the language characteristics of Chinese of the relationship between visual rapid processing and reading fluency.

  8. Method for improving the gamma-transition cascade spectra amplitude resolution during coincidence code computerized processing

    International Nuclear Information System (INIS)

    Sukhovoj, A.M.; Khitrov, V.A.

    1984-01-01

    A method of unfolding the differential γ-cascade spectra from radiative capture of slow neutrons is suggested, based on computerized processing of the results of measurements performed by means of a spectrometer with two Ge(Li) detectors. The efficiency of the method is illustrated using as an example the spectrum of the 35Cl(n, γ) reaction corresponding to the 8580 keV peak. It is shown that this approach improves the resolution by a factor of 1.2-2.6 without a decrease in registration efficiency within the framework of the coincidence pulse amplitude summation method.

  9. Connectivity Reveals Sources of Predictive Coding Signals in Early Visual Cortex During Processing of Visual Optic Flow.

    Science.gov (United States)

    Schindler, Andreas; Bartels, Andreas

    2017-05-01

    Superimposed on the visual feed-forward pathway, feedback connections convey higher level information to cortical areas lower in the hierarchy. A prominent framework for these connections is the theory of predictive coding where high-level areas send stimulus interpretations to lower level areas that compare them with sensory input. Along these lines, a growing body of neuroimaging studies shows that predictable stimuli lead to reduced blood oxygen level-dependent (BOLD) responses compared with matched nonpredictable counterparts, especially in early visual cortex (EVC) including areas V1-V3. The sources of these modulatory feedback signals are largely unknown. Here, we re-examined the robust finding of relative BOLD suppression in EVC evident during processing of coherent compared with random motion. Using functional connectivity analysis, we show an optic flow-dependent increase of functional connectivity between BOLD suppressed EVC and a network of visual motion areas including MST, V3A, V6, the cingulate sulcus visual area (CSv), and precuneus (Pc). Connectivity decreased between EVC and 2 areas known to encode heading direction: entorhinal cortex (EC) and retrosplenial cortex (RSC). Our results provide first evidence that BOLD suppression in EVC for predictable stimuli is indeed mediated by specific high-level areas, in accord with the theory of predictive coding. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  10. Failure Processes in Embedded Monolayer Graphene under Axial Compression

    Science.gov (United States)

    Androulidakis, Charalampos; Koukaras, Emmanuel N.; Frank, Otakar; Tsoukleri, Georgia; Sfyris, Dimitris; Parthenios, John; Pugno, Nicola; Papagelis, Konstantinos; Novoselov, Kostya S.; Galiotis, Costas

    2014-01-01

    Exfoliated monolayer graphene flakes were embedded in a polymer matrix and loaded under axial compression. By monitoring the shifts of the 2D Raman phonons of rectangular flakes of various sizes under load, the critical strain to failure was determined. Prior to loading, care was taken for the examined area of the flake to be free of residual stresses. The critical strain values for first failure were found to be independent of flake size at a mean value of -0.60%, corresponding to a compressive yield stress of up to ~6 GPa. By combining Euler mechanics with a Winkler approach, we show that, unlike buckling in air, the presence of the polymer constraint results in graphene buckling at a fixed value of strain with an estimated wrinkle wavelength of the order of 1-2 nm. These results were compared with DFT computations performed on analogous coronene/PMMA oligomers and a reasonable agreement was obtained. PMID:24920340
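
    The fixed buckling strain and nanometre-scale wrinkle wavelength follow from the classical beam-on-Winkler-foundation result; the sketch below evaluates the standard wavelength formula with illustrative stiffness values, not the paper's fitted parameters.

```python
import numpy as np

def winkler_wrinkle_wavelength(EI, k):
    # Beam on an elastic (Winkler) foundation: minimizing the Euler
    # buckling load over the mode number gives Pc = 2*sqrt(k*EI) at a
    # wavelength lam = 2*pi*(EI/k)**0.25, independent of beam length;
    # hence failure at a fixed strain, as observed above.
    return 2 * np.pi * (EI / k) ** 0.25

# Illustrative stiffness values only (SI units), chosen to land in the
# nanometre range quoted in the abstract:
print(winkler_wrinkle_wavelength(EI=2e-28, k=1e11))
```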

  11. Minimization of water consumption under uncertainty for PC process

    Energy Technology Data Exchange (ETDEWEB)

    Salazar, J.; Diwekar, U.; Zitney, S.

    2009-01-01

    Integrated gasification combined cycle (IGCC) technology is becoming increasingly important for the development of advanced power generation systems. As an emerging technology, different process configurations have been proposed heuristically for IGCC processes. One of these schemes combines the water-gas shift reaction and chemical-looping combustion to remove CO2 before the fuel gas is fed to the gas turbine, reducing the turbine's size (improving economic performance) and producing sequestration-ready CO2 (improving its cleanness potential). However, these schemes have not been energetically integrated, and process synthesis techniques can be used to obtain optimal flowsheets and designs. This work studies heat exchange network synthesis (HENS) for the water-gas shift reaction train, employing a set of alternative designs provided by Aspen Energy Analyzer (AEA) and combined in a process superstructure that was simulated in Aspen Plus (AP). For the alternative designs, large differences between the performance parameters (for instance, the utility requirements) predicted by AEA and AP were observed, suggesting the necessity of solving the HENS problem within the AP simulation environment and avoiding the AEA simplifications. A CAPE-OPEN compliant capability which makes use of a MINLP algorithm for sequential modular simulators was employed to obtain a heat exchange network that provided a cost of energy 27% lower than the base case.

  12. Sustainable Process Design under uncertainty analysis: targeting environmental indicators

    DEFF Research Database (Denmark)

    L. Gargalo, Carina; Gani, Rafiqul

    2015-01-01

    This study focuses on uncertainty analysis of environmental indicators used to support sustainable process design efforts. To this end, the Life Cycle Assessment methodology is extended with a comprehensive uncertainty analysis to propagate the uncertainties in input LCA data to the environmental...

  13. Endocrine processes underlying victory and defeat in the male rat

    NARCIS (Netherlands)

    Schuurman, Teunis

    1981-01-01

    The central questions of the present study were: 1. does baseline hormonal state determine agonistic behavior in male-male encounters? 2. does agonistic behavior affect hormonal state? Such an interrelationship between agonistic behavior and hormonal processes might serve as a regulatory system for

  14. Proposal to consistently apply the International Code of Nomenclature of Prokaryotes (ICNP) to names of the oxygenic photosynthetic bacteria (cyanobacteria), including those validly published under the International Code of Botanical Nomenclature (ICBN)/International Code of Nomenclature for algae, fungi and plants (ICN), and proposal to change Principle 2 of the ICNP.

    Science.gov (United States)

    Pinevich, Alexander V

    2015-03-01

    This taxonomic note was motivated by the recent proposal [Oren & Garrity (2014) Int J Syst Evol Microbiol 64, 309-310] to exclude the oxygenic photosynthetic bacteria (cyanobacteria) from the wording of General Consideration 5 of the International Code of Nomenclature of Prokaryotes (ICNP), which entails unilateral coverage of these prokaryotes by the International Code of Nomenclature for algae, fungi, and plants (ICN; formerly the International Code of Botanical Nomenclature, ICBN). On the basis of key viewpoints, approaches and rules in the systematics, taxonomy and nomenclature of prokaryotes it is reciprocally proposed to apply the ICNP to names of cyanobacteria including those validly published under the ICBN/ICN. For this purpose, a change to Principle 2 of the ICNP is proposed to enable validation of cyanobacterial names published under the ICBN/ICN rules. © 2015 IUMS.

  15. Status and Perspective of the Pre/Post Processing for Thermal-Hydraulic Code

    International Nuclear Information System (INIS)

    Lee, Sang Yong; Park, Chan Eok; Kim, Eun Ki

    2009-01-01

    The governing equations for the thermal-hydraulic solver can be obtained through several steps of modeling and approximation from the basic material transport principles. The volume-averaging process gives rise to the concept of volume porosity. Considering the structural complexity of a reactor internal design, one can imagine how much effort is needed to obtain proper porosity data. Personal experience tells us that even more than one man-year of effort is not enough to develop a reasonable input preparation report for a reactor, because of the notoriously error-prone, tedious calculations. To overcome this problem, one can imagine that the use of a CAD system would be very helpful. However, some effort has to be spent to evaluate the capability of present-day CAD systems for this type of application. Even if the evaluation is affirmative, further effort is needed to develop the necessary procedure for the porosity calculation. In this paper, the CAD system Pro/Engineer is utilized for this purpose. A detailed review shows that a CAD system alone is not enough: a mesh generator is also necessary. A post-data processor may be combined to further enhance the capability.
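
    The volume porosity itself is simply the open-volume fraction of a mesh cell; a trivial sketch with invented values:

```python
# Volume porosity of a mesh cell: the open-volume fraction left after the
# solid internals volume (extracted from the CAD model) is subtracted.
def volume_porosity(cell_volume: float, solid_volume: float) -> float:
    return 1.0 - solid_volume / cell_volume

print(volume_porosity(cell_volume=0.125, solid_volume=0.031))  # 0.752
```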

  16. Long Non-Coding RNAs in Hepatitis B Virus-Related Hepatocellular Carcinoma: Regulation, Functions, and Underlying Mechanisms

    OpenAIRE

    Qiu, Lipeng; Wang, Tao; Xu, Xiuquan; Wu, Yihang; Tang, Qi; Chen, Keping

    2017-01-01

    Hepatocellular carcinoma (HCC) is the fifth most common cancer and the third leading cause of cancer death in the world. Hepatitis B virus (HBV) and its X gene-encoded protein (HBx) play important roles in the progression of HCC. Although long non-coding RNAs (lncRNAs) cannot encode proteins, growing evidence indicates that they play essential roles in HCC progression, and contribute to cell proliferation, invasion and metastasis, autophagy, and apoptosis by targeting a large number of pivota...

  17. Evaluation of compliance with the Spanish Code of self-regulation of food and drinks advertising directed at children under the age of 12 years in Spain, 2012.

    Science.gov (United States)

    León-Flández, K; Rico-Gómez, A; Moya-Geromin, M Á; Romero-Fernández, M; Bosqued-Estefania, M J; Damián, J; López-Jurado, L; Royo-Bordonada, M Á

    2017-09-01

    To evaluate compliance levels with the Spanish Code of self-regulation of food and drinks advertising directed at children under the age of 12 years (Publicidad, Actividad, Obesidad, Salud [PAOS] Code) in 2012; and compare these against the figures for 2008. Cross-sectional study. Television advertisements of food and drinks (AFD) were recorded over 7 days in 2012 (8am-midnight) of five Spanish channels popular to children. AFD were classified as core (nutrient-rich/low-calorie products), non-core (nutrient-poor/rich-calorie products) or miscellaneous. Compliance with each standard of the PAOS Code was evaluated. AFD were deemed to be fully compliant when it met all the standards. Two thousand five hundred and eighty-two AFDs came within the purview of the PAOS Code. Some of the standards that registered the highest levels of non-compliance were those regulating the suitability of the information presented (79.4%) and those prohibiting the use of characters popular with children (25%). Overall non-compliance with the Code was greater in 2012 than in 2008 (88.3% vs 49.3%). Non-compliance was highest for advertisements screened on children's/youth channels (92.3% vs. 81.5%; P < 0.001) and for those aired outside the enhanced protection time slot (89.3% vs. 86%; P = 0.015). Non-compliance with the PAOS Code is higher than for 2008. Given the lack of effectiveness of self-regulation, a statutory system should be adopted to ban AFD directed at minors, or at least restrict it to healthy products. Copyright © 2017 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  18. SHARC (Strategic High-Altitude Radiance Code). A computer model for calculating atmospheric radiation under non-equilibrium conditions

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, R.; Ratkowski, A.J.; Sundberg, R.L.; Duff, J.W.; Bernstein, L.S.

    1989-02-01

    The Strategic High-Altitude Radiance Code (SHARC) is a new computer code that calculates atmospheric radiation for paths from 60 to 300 km altitude in the 2-40 µm spectral region. It models radiation due to NLTE (Non-Local Thermodynamic Equilibrium) molecular emissions, which are the dominant sources at these altitudes. The initial version of SHARC, which is described in this paper, includes the five strongest IR radiators: CO2, NO, O3, H2O and CO. Calculation of excited-state populations is accomplished by interfacing a Monte Carlo model for radiative excitation and energy transfer with a highly flexible chemical kinetics module derived from the Sandia CHEMKIN code. An equivalent-width, line-by-line approach for the radiation transport gives a spectral resolution of about 0.5 cm-1. The radiative-transport calculation includes the effects of combined Doppler-Lorentz (Voigt) line shapes. Particular emphasis was placed on modular construction and supporting data files so that models and model parameters can be modified or upgraded as additional data become available. The initial version of SHARC is now ready for distribution.
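
    The combined Doppler-Lorentz (Voigt) line shape used in the radiative transport can be evaluated from the Faddeeva function; a minimal sketch with arbitrary width parameters, independent of SHARC's actual implementation:

```python
import numpy as np
from scipy.special import wofz

def voigt(x, sigma, gamma):
    # Voigt profile: convolution of a Gaussian (Doppler, sigma) and a
    # Lorentzian (pressure, gamma), via the Faddeeva function w(z).
    z = (x + 1j * gamma) / (sigma * np.sqrt(2.0))
    return np.real(wofz(z)) / (sigma * np.sqrt(2.0 * np.pi))

x = np.linspace(-1.0, 1.0, 5)   # offset from line center (arbitrary widths)
print(voigt(x, sigma=0.05, gamma=0.02))
```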

  19. Category Specific Spatial Dissociations of Parallel Processes Underlying Visual Naming

    OpenAIRE

    Conner, Christopher R.; Chen, Gang; Pieters, Thomas A.; Tandon, Nitin

    2013-01-01

    The constituent elements and dynamics of the networks responsible for word production are a central issue to understanding human language. Of particular interest is their dependency on lexical category, particularly the possible segregation of nouns and verbs into separate processing streams. We applied a novel mixed-effects, multilevel analysis to electrocorticographic data collected from 19 patients (1942 electrodes) to examine the activity of broadly disseminated cortical networks during t...

  20. Neural Correlates of Feedback Processing in Decision Making under Risk

    Directory of Open Access Journals (Sweden)

    Beate Schuermann

    2012-07-01

    Full Text Available Introduction. Event-related brain potentials (ERPs) provide important information about the sensitivity of the brain to process varying risks. The aim of the present study was to determine how different risk levels are reflected in decision-related ERPs, namely the feedback-related negativity (FRN) and the P300. Material and Methods. 20 participants conducted a probabilistic two-choice gambling task while an electroencephalogram was recorded. Choices were provided between a low-risk option yielding low rewards and low losses and a high-risk option yielding high rewards and high losses. While options differed in expected risks, they were equal in expected values and in feedback probabilities. Results. At the behavioral level, participants were generally risk-averse but modulated their risk-taking behavior according to reward history. An early positivity (P200) was enhanced on negative feedbacks in high-risk compared to low-risk options. With regard to the FRN, there were significant amplitude differences between positive and negative feedbacks in high-risk options, but not in low-risk options. While the FRN on negative feedbacks did not vary with decision riskiness, reduced amplitudes were found for positive feedbacks in high-risk relative to low-risk choices. P300 amplitudes were larger in high-risk decisions, and in an additive way, after negative compared to positive feedback. Discussion. The present study revealed significant influences of risk and valence processing on ERPs. FRN findings suggest that the reward prediction error signal is increased after high-risk decisions. The increased P200 on negative feedback in risky decisions suggests that large negative prediction errors are processed as early as in the P200 time range. The later P300 amplitude is sensitive to feedback valence as well as to the risk of a decision. Thus, the P300 carries additional information for reward processing, mainly the enhanced motivational significance of risky

  1. Dress codes and appearance policies: challenges under federal legislation, part 3: Title VII, the Americans with Disabilities Act, and the National Labor Relations Act.

    Science.gov (United States)

    Mitchell, Michael S; Koen, Clifford M; Darden, Stephen M

    2014-01-01

    As more and more individuals express themselves with tattoos and body piercings and push the envelope on what is deemed appropriate in the workplace, employers have an increased need for creation and enforcement of reasonable dress codes and appearance policies. As with any employment policy or practice, an appearance policy must be implemented and enforced without regard to an individual's race, color, sex, national origin, religion, disability, age, or any other protected status. A policy governing dress and appearance based on the business needs of an employer that is applied fairly and consistently and does not have a disproportionate effect on any protected class will generally be upheld if challenged in court. By examining some of the more common legal challenges to dress codes and how courts have resolved the disputes, health care managers can avoid many potential problems. This article, the third part of a 3-part examination of dress codes and appearance policies, focuses on the issues of race and national origin under the Civil Rights Act, disability under the Americans With Disabilities Act, and employees' rights to engage in concerted activities under the National Labor Relations Act. Pertinent court cases that provide guidance for employers are addressed.

  2. Category specific spatial dissociations of parallel processes underlying visual naming.

    Science.gov (United States)

    Conner, Christopher R; Chen, Gang; Pieters, Thomas A; Tandon, Nitin

    2014-10-01

    The constituent elements and dynamics of the networks responsible for word production are a central issue to understanding human language. Of particular interest is their dependency on lexical category, particularly the possible segregation of nouns and verbs into separate processing streams. We applied a novel mixed-effects, multilevel analysis to electrocorticographic data collected from 19 patients (1942 electrodes) to examine the activity of broadly disseminated cortical networks during the retrieval of distinct lexical categories. This approach was designed to overcome the issues of sparse sampling and individual variability inherent to invasive electrophysiology. Both noun and verb generation evoked overlapping, yet distinct nonhierarchical processes favoring ventral and dorsal visual streams, respectively. Notable differences in activity patterns were noted in Broca's area and superior lateral temporo-occipital regions (verb > noun) and in parahippocampal and fusiform cortices (noun > verb). Comparisons with functional magnetic resonance imaging (fMRI) results yielded a strong correlation of blood oxygen level-dependent signal and gamma power and an independent estimate of group size needed for fMRI studies of cognition. Our findings imply parallel, lexical category-specific processes and reconcile discrepancies between lesional and functional imaging studies. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  3. A model for optimization of process integration investments under uncertainty

    International Nuclear Information System (INIS)

    Svensson, Elin; Stroemberg, Ann-Brith; Patriksson, Michael

    2011-01-01

    The long-term economic outcome of energy-related industrial investment projects is difficult to evaluate because of uncertain energy market conditions. In this article, a general, multistage, stochastic programming model for the optimization of investments in process integration and industrial energy technologies is proposed. The problem is formulated as a mixed-binary linear programming model where uncertainties are modelled using a scenario-based approach. The objective is to maximize the expected net present value of the investments, which enable heat savings and decreased energy imports or increased energy exports at an industrial plant. The proposed modelling approach enables long-term planning of industrial, energy-related investments through the simultaneous optimization of immediate and later decisions. The stochastic programming approach is also suitable for modelling possibly complex process integration constraints. The general model formulation presented here is a suitable basis for more specialized case studies dealing with optimization of investments in energy efficiency. -- Highlights: → Stochastic programming approach to long-term planning of process integration investments. → Extensive mathematical model formulation. → Multi-stage investment decisions and scenario-based modelling of uncertain energy prices. → Results illustrate how investments made now affect later investment and operation opportunities. → Approach for evaluation of robustness with respect to variations in probability distribution.
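
    The expected-NPV objective can be illustrated with a single here-and-now decision over a few price scenarios; all numbers below are invented for the example, and the full model is a multistage mixed-binary program rather than this simple enumeration.

```python
import numpy as np

# Invented data: pay invest_cost now; yearly savings depend on which
# energy-price scenario materializes.
invest_cost = 10.0                                # MEUR at time zero
price_scenarios = np.array([20.0, 35.0, 60.0])    # EUR/MWh
probabilities = np.array([0.3, 0.5, 0.2])
savings_MWh = 30_000.0                            # heat saved per year
rate, years = 0.08, 10

annuity = (1 - (1 + rate) ** -years) / rate       # sum of discount factors
npv = price_scenarios * savings_MWh * 1e-6 * annuity - invest_cost  # MEUR
expected_npv = probabilities @ npv
print(expected_npv, "invest" if expected_npv > 0 else "wait")
```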

  4. Efficient Emulation of Radiative Transfer Codes Using Gaussian Processes and Application to Land Surface Parameter Inferences

    Directory of Open Access Journals (Sweden)

    José Luis Gómez-Dans

    2016-02-01

    Full Text Available There is an increasing need to consistently combine observations from different sensors to monitor the state of the land surface. In order to achieve this, robust methods based on the inversion of radiative transfer (RT) models can be used to interpret the satellite observations. This typically results in an inverse problem, but a major drawback of these methods is the computational complexity. We introduce the concept of Gaussian Process (GP) emulators: surrogate functions that accurately approximate RT models using a small set of input (e.g., leaf area index, leaf chlorophyll, etc.) and output (e.g., top-of-canopy reflectances or at-sensor radiances) pairs. The emulators quantify the uncertainty of their approximation, and provide a fast and easy route to estimating the Jacobian of the original model, enabling the use of, e.g., efficient gradient descent methods. We demonstrate the emulation of widely used RT models (PROSAIL and SEMIDISCRETE) and the coupling of vegetation and atmospheric (6S) RT models targeting particular sensor bands. A comparison with the full original model outputs shows that the emulators are a viable option to replace the original model, with negligible bias and discrepancies which are much smaller than the typical uncertainty in the observations. We also extend the theory of GP to cope with models with multivariate outputs (e.g., over the full solar reflective domain), and apply this to the emulation of PROSAIL, coupled 6S and PROSAIL, and to the emulation of individual spectral components of 6S. In all cases, emulators successfully predict the full model output as well as accurately predict the gradient of the model calculated by finite differences, and produce speed-ups between 10,000 and 50,000 times that of the original model. Finally, we use emulators to invert leaf area index (LAI), leaf chlorophyll content (Cab) and equivalent leaf water thickness (Cw) from a time series of observations from Sentinel-2/MSI
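
    A minimal emulator in the spirit of the paper, using a toy stand-in for the RT model rather than PROSAIL or 6S: fit a GP to a few input-output pairs, then query the mean, the uncertainty of the approximation, and a finite-difference gradient cheaply.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Toy stand-in for an RT model: reflectance as a saturating function of LAI.
def toy_rt_model(lai):
    return 0.4 * np.exp(-0.6 * lai) + 0.05

lai_train = np.linspace(0.0, 6.0, 8)[:, None]     # small training design
refl_train = toy_rt_model(lai_train).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
gp.fit(lai_train, refl_train)

lai_test = np.array([[2.5]])
mean, std = gp.predict(lai_test, return_std=True)  # prediction + uncertainty
eps = 1e-4                                         # cheap emulator gradient
grad = (gp.predict(lai_test + eps) - gp.predict(lai_test - eps)) / (2 * eps)
print(mean, std, grad)
```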

  5. Dissociable neural processes underlying risky decisions for self versus other

    Directory of Open Access Journals (Sweden)

    Daehyun Jung

    2013-03-01

    Full Text Available Previous neuroimaging studies on decision making have mainly focused on decisions on behalf of oneself. Considering that people often make decisions on behalf of others, it is intriguing that there is little neurobiological evidence on how decisions for others differ from those for self. Thus, the present study focused on the direct comparison between risky decisions for self and those for others using functional magnetic resonance imaging (fMRI). Participants (N = 23) were asked to perform a gambling task for themselves (decision-for-self condition) or for another person (decision-for-other condition) while in the scanner. Their task was to choose between a low-risk option (i.e., win or lose 10 points) and a high-risk option (i.e., win or lose 90 points). The winning probabilities of each option varied from 17% to 83%. Compared to choices for others, choices for self were more risk-averse at lower winning probability and more risk-seeking at higher winning probability, perhaps due to a stronger affective process during risky decisions for self compared to others. The brain activation pattern changed according to the target of the decision, such that reward-related regions were more active in the decision-for-self condition than in the decision-for-other condition, whereas brain regions related to the theory of mind (ToM) showed greater activation in the decision-for-other condition than in the decision-for-self condition. A parametric modulation analysis reflecting each individual's decision model revealed that activation of the amygdala and the dorsomedial prefrontal cortex (DMPFC) were associated with value computation for self and for other, respectively, during a risky financial decision. The present study suggests that decisions for self and other may recruit fundamentally distinctive neural processes, which can be mainly characterized by dominant affective/impulsive and cognitive/regulatory processes, respectively.

  6. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M.

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D division of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  7. Introductory Biology Textbooks Under-Represent Scientific Process

    Directory of Open Access Journals (Sweden)

    Dara B. Duncan

    2011-08-01

    Full Text Available Attrition of undergraduates from Biology majors is a long-standing problem. Introductory courses that fail to engage students or spark their curiosity by emphasizing the open-ended and creative nature of biological investigation and discovery could contribute to student detachment from the field. Our hypothesis was that introductory biology books devote relatively few figures to illustration of the design and interpretation of experiments or field studies, thereby de-emphasizing the scientific process. To investigate this possibility, we examined figures in six Introductory Biology textbooks published in 2008. On average, multistep scientific investigations were presented in fewer than 5% of the hundreds of figures in each book. Devoting such a small percentage of figures to the processes by which discoveries are made discourages an emphasis on scientific thinking. We suggest that by significantly increasing the illustration of scientific investigations, textbooks could support undergraduates’ early interest in biology, stimulate the development of design and analytical skills, and inspire some students to participate in investigations of their own.

  8. Coding for optical channels

    CERN Document Server

    Djordjevic, Ivan; Vasic, Bane

    2010-01-01

    This unique book provides a coherent and comprehensive introduction to the fundamentals of optical communications, signal processing and coding for optical channels. It is the first to integrate the fundamentals of coding theory and optical communication.

  9. Implementing a bar-code assisted medication administration system: effects on the dispensing process and user perceptions.

    Science.gov (United States)

    Samaranayake, N R; Cheung, S T D; Cheng, K; Lai, K; Chui, W C M; Cheung, B M Y

    2014-06-01

    We assessed the effects of a bar-code assisted medication administration system used without the support of computerised prescribing (stand-alone BCMA) on the dispensing process and its users. The stand-alone BCMA system was implemented in one ward of a teaching hospital. The number of dispensing steps, dispensing time and potential dispensing errors (PDEs) were directly observed one month before and eight months after the intervention. Attitudes of pharmacy and nursing staff were assessed using a questionnaire (Likert scale) and interviews. Among the 1291 and 471 drug items observed before and after the introduction of the technology respectively, the number of dispensing steps increased from five to eight and the time (standard deviation) to dispense one drug item by one member of staff increased from 0.8 (0.09) to 1.5 (0.12) min. Among the 2828 and 471 drug items observed before and after the intervention respectively, the number of PDEs increased significantly (P < 0.001). Attitudes towards the technology decreased significantly (P = 0.003 and P = 0.004 respectively) among users who participated in the before (N = 16) and after (N = 16) questionnaire surveys. Among the interviewees, pharmacy staff felt that the system offered less benefit to the dispensing process (9/16). Nursing staff perceived the system as useful in improving the accuracy of drug administration (7/10). Implementing a stand-alone BCMA system may slow down and complicate the dispensing process. Nursing staff believe the stand-alone BCMA system could improve the drug administration process, but pharmacy staff believe the technology would be more helpful if supported by computerised prescribing. However, periodical assessments are needed to identify weaknesses in the process after implementation, and all users should be educated on the benefits of using this technology. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  10. BIOCHEMICAL PROCESSES IN CHERNOZEM SOIL UNDER DIFFERENT FERTILIZATION SYSTEMS

    Directory of Open Access Journals (Sweden)

    Ecaterina Emnova

    2012-06-01

    Full Text Available The paper deals with the evaluation of the intensity of certain soil biochemical processes (e.g., soil organic C mineralization) under organic and mixed mineral + organic fertilization of typical chernozem in crop rotation dynamics (over 6 years), by use of eco-physiological indicators of biological soil quality: microbial biomass carbon, basal soil respiration, as well as microbial and metabolic quotients. Soil sampling was performed from a long-term field crop experiment, which was established in 1971 at the Balti steppe (Northern Moldova). The crop types had a greater impact on soil microbial biomass accumulation and community biochemical activity than the long-term organic or mixed mineral + organic fertilizer amendments. The organic fertilization system does not make it possible to avoid the loss of organic C in arable typical chernozem. The organic fertilizer (cattle manure) is able to mitigate the negative consequences of long-term mineral fertilization.

  11. Modelling soil carbon fate under erosion process in vineyard

    Science.gov (United States)

    Novara, Agata; Scalenghe, Riccardo; Minacapilli, Mario; Maltese, Antonino; Capodici, Fulvio; Borgogno Mondino, Enrico; Gristina, Luciano

    2017-04-01

    Soil erosion processes in vineyards, beyond water runoff and sediment transport, have a strong effect on soil organic carbon (SOC) loss and redistribution along the slope. The variation of SOC across the landscape determines differences in soil fertility and vine productivity. The aim of this research was to study erosion in a Mediterranean vineyard, develop an approach to estimate the SOC loss, and correlate vine vigor with sediment and carbon erosion. The study was carried out in a Sicilian (Italy) vineyard, planted in 2011. Along the slope, six pedons were studied by digging 6 pits up to 60 cm depth. Soil was sampled in each pedon every 10 cm and SOC was analyzed. Soil erosion, detachment and deposition areas were measured by the pole height method. The vigor of vegetation was expressed in terms of NDVI (Normalized Difference Vegetation Index) derived from a satellite image (RapidEye) acquired at the berry pre-veraison stage (July) and characterized by 5 spectral bands in the shortwave region, including a band in the red wavelength (R, 630-685 nm) and one in the near infrared (NIR, 760-850 nm). Results showed that soil erosion, sediment redistribution and SOC across the hill were strongly affected by topographic features, slope and curvature. The erosion rate was 46 Mg ha-1 y-1 during the first 6 years after planting. The SOC redistribution was strongly correlated with the detachment or deposition areas, as highlighted by the pole height measurements. The approach developed to estimate the SOC loss showed that during the whole study period the off-farm SOC amounted to 1.6 Mg C ha-1. As highlighted by the NDVI results, plant vigor is strongly correlated with SOC content and, therefore, developing an accurate NDVI approach could be useful to detect vineyard areas characterized by low fertility due to erosion processes.
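
    As a minimal sketch of the index used above: NDVI is the normalized difference of the near-infrared and red reflectances, NDVI = (NIR - R)/(NIR + R). The toy arrays below stand in for actual RapidEye band images.

        # Minimal sketch of the NDVI computation, NDVI = (NIR - R) / (NIR + R),
        # on synthetic red and near-infrared reflectance arrays standing in for
        # RapidEye bands.
        import numpy as np

        red = np.array([[0.08, 0.10], [0.12, 0.09]])   # red band (630-685 nm)
        nir = np.array([[0.45, 0.38], [0.30, 0.42]])   # NIR band (760-850 nm)

        ndvi = (nir - red) / (nir + red)   # values near 1: vigorous vegetation
        print(ndvi)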

  12. Design and simulation of material-integrated distributed sensor processing with a code-based agent platform and mobile multi-agent systems.

    Science.gov (United States)

    Bosse, Stefan

    2015-02-16

    Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks into simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material-integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing the control and the data state of an agent, which is novel. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to a small-sized agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri-nets. The agent platform can be implemented in software, too, offering compatibility at the operational and code level, supporting agent processing in strongly heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques.
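
    The activity-transition graph idea can be sketched compactly. The following is a minimal illustration, assuming a dict-based transition table and invented activity names ("sense", "process"); it is not the instruction format of the stack-machine platform described above.

        # Minimal sketch of an activity-transition graph (ATG) agent: each
        # activity updates the agent's data state, and a transition table picks
        # the next activity from a condition on that state. Activity and
        # variable names are invented for the example.
        class ATGAgent:
            def __init__(self):
                self.state = {"samples": [], "done": False}
                self.activity = "sense"

            def sense(self):
                self.state["samples"].append(42)   # stand-in for a sensor reading

            def process(self):
                self.state["done"] = len(self.state["samples"]) >= 3

            # Transition table: current activity -> function(state) -> next activity
            transitions = {
                "sense": lambda state: "process",
                "process": lambda state: "sense" if not state["done"] else "halt",
            }

            def step(self):
                getattr(self, self.activity)()                       # run activity
                self.activity = self.transitions[self.activity](self.state)

        agent = ATGAgent()
        while agent.activity != "halt":
            agent.step()
        print(agent.state)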

  14. Foveal Processing Under Concurrent Peripheral Load in Profoundly Deaf Adults

    Science.gov (United States)

    2016-01-01

    Development of the visual system typically proceeds in concert with the development of audition. One result is that the visual system of profoundly deaf individuals differs from that of those with typical auditory systems. While past research has suggested deaf people have enhanced attention in the visual periphery, it is still unclear whether or not this enhancement entails deficits in central vision. Profoundly deaf and typically hearing adults were administered a variant of the useful field of view task that independently assessed performance on concurrent central and peripheral tasks. Identification of a foveated target was impaired by a concurrent selective peripheral attention task, more so in profoundly deaf adults than in the typically hearing. Previous findings of enhanced performance on the peripheral task were not replicated. These data are discussed in terms of flexible allocation of spatial attention targeted towards perceived task demands, and support a modified “division of labor” hypothesis whereby attentional resources co-opted to process peripheral space result in reduced resources in the central visual field. PMID:26657078

  15. A regional process under the international initiative for IFM

    Directory of Open Access Journals (Sweden)

    Murase Masahiko

    2016-01-01

    Full Text Available Climate change is likely to result in increases in the frequency or intensity of extreme weather events, including floods. The International Flood Initiative (IFI), initiated in January 2005 by UNESCO and WMO and voluntary partner organizations, has promoted integrated flood management (IFM) to take advantage of floods and the use of floodplains while reducing the social, environmental and economic risks. Its secretariat is located in ICHARM. The initiative's objective is to support national platforms in practicing evidence-based disaster risk reduction through mobilizing scientific and research networks. After its initial decade, the initiative is providing a stepping-stone for the implementation of the Sendai Framework by revitalizing its activities, building on the successes of the past while addressing existing gaps in integrated flood management strategies comprising optimal structural and non-structural measures, thereby mainstreaming disaster risk reduction and targeting sustainable development. In this context, a new mechanism tries to facilitate monitoring, assessment and capacity building in the Asia Pacific region. The primary outcomes of the mechanism are demand-driven networking and related documentation of best practices for (1) hazard assessment, (2) exposure assessment, (3) vulnerability assessment and coping capacity to identify the gaps, and (4) follow-up and monitoring of the IFM process.

  16. Neural processing of reward magnitude under varying attentional demands.

    Science.gov (United States)

    Stoppel, Christian Michael; Boehler, Carsten Nicolas; Strumpf, Hendrik; Heinze, Hans-Jochen; Hopf, Jens-Max; Schoenfeld, Mircea Ariel

    2011-04-06

    Central to the organization of behavior is the ability to represent the magnitude of a prospective reward and the costs related to obtaining it. Reward-related neural activations are thereby discounted depending on the effort required to resolve a given task. Varying attentional demands of the task might, however, affect reward-related neural activations. Here we employed fMRI to investigate the neural representation of expected values during a monetary incentive delay task with varying attentional demands. Following a cue, indicating at the same time the difficulty (hard/easy) and the reward magnitude (high/low) of the upcoming trial, subjects performed an attention task and subsequently received feedback about their monetary reward. Consistent with previous results, activity in anterior-cingulate, insular/orbitofrontal and mesolimbic regions co-varied with the anticipated reward magnitude, but also with the attentional requirements of the task. These activations occurred contingent on action execution and resembled the response time pattern of the subjects. In contrast, cue-related activations, signaling the forthcoming task requirements, were only observed within attentional control structures. These results suggest that anticipated reward magnitude and task-related attentional demands are concurrently processed in partially overlapping neural networks of anterior-cingulate, insular/orbitofrontal, and mesolimbic regions. Copyright © 2011 Elsevier B.V. All rights reserved.

  17. Dress codes and appearance policies: challenges under federal legislation, part 1: title VII of the civil rights act and religion.

    Science.gov (United States)

    Mitchell, Michael S; Koen, Clifford M; Moore, Thomas W

    2013-01-01

    As more and more individuals choose to express themselves and their religious beliefs with headwear, jewelry, dress, tattoos, and body piercings and push the envelope on what is deemed appropriate in the workplace, employers have an increased need for creation and enforcement of reasonable dress codes and appearance policies. As with any employment policy or practice, an appearance policy must be implemented and enforced without regard to an individual's race, color, sex, national origin, religion, disability, age, or any other protected status. A policy governing dress and appearance based on the business needs of an employer that is applied fairly and consistently and does not have a disproportionate effect on any protected class will generally be upheld if challenged in court. By examining some of the more common legal challenges to dress codes and how courts have resolved the disputes, health care managers can avoid many potential problems. This article addresses the issue of religious discrimination focusing on dress and appearance and some of the court cases that provide guidance for employers.

  18. Computer Simulation of Cure Process of an Axisymmetric Rubber Article Reinforced by Metal Plates Using Extended ABAQUS Code

    Directory of Open Access Journals (Sweden)

    M.H.R. Ghoreishy

    2013-01-01

    Full Text Available A finite element model is developed for simulation of the curing process of a thick axisymmetric rubber article reinforced by metal plates during the molding and cooling stages. The model consists of the heat transfer equation and a newly developed kinetics model for the determination of the state of cure in the rubber. The latter is based on a modification of the well-known Kamal-Sourour model. The thermal contact of the rubber with metallic surfaces (inserts and molds) and the variation of the thermal properties (conductivity and specific heat) with temperature and state-of-cure are taken into consideration. The ABAQUS code is used in conjunction with an in-house developed user subroutine to solve the governing equations. By comparing the temperature profile and the variation of the state-of-cure with experimentally measured data, the accuracy and applicability of the model are confirmed. It is also shown that this model can be successfully used for the optimization of the curing process, which reduces the molding time.
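
    The cure kinetics can be sketched in isolation. The example below integrates a Kamal-Sourour-type rate equation at constant temperature; the rate constants and exponents are invented for illustration and are not the fitted parameters of the study, which also couples the kinetics to the heat transfer equation.

        # Minimal sketch of integrating a Kamal-Sourour-type cure model,
        # d(alpha)/dt = (k1 + k2 * alpha**m) * (1 - alpha)**n, at constant
        # temperature; the rate constants and exponents are invented, not
        # fitted rubber-compound parameters.
        import numpy as np
        from scipy.integrate import solve_ivp

        k1, k2, m, n = 1e-3, 5e-2, 0.7, 1.4   # assumed kinetic parameters

        def dalpha_dt(t, alpha):
            return (k1 + k2 * alpha**m) * (1.0 - alpha)**n

        sol = solve_ivp(dalpha_dt, (0.0, 600.0), [1e-6],
                        t_eval=np.linspace(0.0, 600.0, 7))
        for t, a in zip(sol.t, sol.y[0]):
            print(f"t = {t:5.0f} s, state of cure = {a:.3f}")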

  19. Code Modernization of VPIC

    Science.gov (United States)

    Bird, Robert; Nystrom, David; Albright, Brian

    2017-10-01

    The ability of scientific simulations to effectively deliver performant computation is increasingly being challenged by successive generations of high-performance computing architectures. Code development to support efficient computation on these modern architectures is both expensive and highly complex; if it is approached without due care, it may also not be directly transferable between subsequent hardware generations. Previous works have discussed techniques to support the process of adapting a legacy code for modern hardware generations, but despite the breakthroughs in the areas of mini-app development, performance portability, and cache-oblivious algorithms the problem still remains largely unsolved. In this work we demonstrate how a focus on platform-agnostic modern code development can be applied to Particle-in-Cell (PIC) simulations to facilitate effective scientific delivery. This work builds directly on our previous work optimizing VPIC, in which we replaced intrinsics-based vectorisation with compiler-generated auto-vectorization to improve the performance and portability of VPIC. In this work we present the use of a specialized SIMD queue for processing some particle operations, and also preview a GPU-capable OpenMP variant of VPIC. Finally, we include lessons learned. Work performed under the auspices of the U.S. Dept. of Energy by Los Alamos National Security, LLC, Los Alamos National Laboratory under contract DE-AC52-06NA25396 and supported by the LANL LDRD program.

  20. Using the Orbit Tracking Code Z3CYCLONE to Predict the Beam Produced by a Cold Cathode PIG Ion Source for Cyclotrons under DC Extraction

    CERN Document Server

    Forringer, Edward

    2005-01-01

    Experimental measurements of the emittance and luminosity of beams produced by a cold-cathode Philips Ionization Gauge (PIG) ion source for cyclotrons under dc extraction are reviewed. (The source being studied is of the same style as ones that will be used in a series of 250 MeV proton cyclotrons being constructed for cancer therapy by ACCEL Instruments GmbH, of Bergisch Gladbach, Germany.) The concepts of 'plasma boundary' and 'plasma temperature' are presented as a useful set of parameters for describing the initial conditions used in computational orbit tracking. Experimental results for r-pr and z-pz emittance are compared to predictions from the MSU orbit tracking code Z3CYCLONE, with results indicating that the code is able to predict the beam produced by these ion sources with adequate accuracy such that construction of actual cyclotrons can proceed with reasonably prudent confidence that the cyclotron will perform as predicted.

  1. Designing the User Interface COBRET under Windows to Carry out Pre- and Post-Processing for the Programs COBRA-RERTR and PARET

    International Nuclear Information System (INIS)

    Ghazi, N.; Monther, A.; Hainoun, A.

    2004-01-01

    In the framework of testing, evaluation and application of computer codes in the design and safety analysis of research reactors, the dynamic code PARET and the thermal-hydraulic code COBRA-RERTR have been adopted. In order to run the codes under Windows and to support the user with pre- and post-processing, the user interface program COBRET has been developed in the programming language Visual Basic 6; the data used by it are organized and stored in a relational database in MS Access, an integral part of the software package MS Office. The interface works in the environment of the Windows operating system and utilizes its graphics as well as other possibilities. It consists of a pre-processor and a post-processor. The pre-processor deals with the interactive preparation of the input files for the PARET and COBRA codes. It supports the user with an automatic checking routine for detecting logical input errors, in addition to extensive direct help during the multi-mode input process. This process includes automatic branching according to the selected control parameters, which depend on the simulation modes of the considered physical problem. The post-processor supports the user with graphical tools to present the time and axial distributions of the system variables, which consist of many neutronic and thermal-hydraulic parameters of the reactor system such as neutron flux, reactivity, temperatures, flow rate, pressure and void distribution. (authors)

  2. The Obligation And Warranty To State Reasons Of Judicial Decisions Under The Paradigm Of The New Code Of Civil Procedure: A Right Of Democratic State Of Consolidation

    Directory of Open Access Journals (Sweden)

    Quezia Dornellas Fialho

    2017-02-01

    Full Text Available The constitutional state demands respect for fundamental rights within the judicial process. The duty, as well as the guarantee, to state the reasons for judicial decisions should, together with other procedural principles, lead to a new way of thinking about the process, aiming not at speed at any cost but at the safe realization of the fundamental rights of the parties during the procedural motion. Thus, with respect to this duty and guarantee, the new Code of Civil Procedure innovated by establishing requirements for the adequate reasoning of judgments.

  3. Behaviour of radionuclides in sedimentation processes under varying redox conditions

    Energy Technology Data Exchange (ETDEWEB)

    Ilus, E.; Ikaeheimonen, T.K.; Mattila, J.; Klemola, S. [STUK Radiation and Nuclear Safety Authority (Finland)

    2001-04-01

    Determination of sedimentation rates plays an important role in material balance and model calculations of seas and other bodies of water. The Baltic Sea offers an exceptionally good opportunity to study processes in sediments and sedimentation rates with radioecological methods, because the concentration peaks of {sup 137}Cs and {sup 239,240}Pu are easily detectable in its sediments. In 1995-1996 sediment profiles were taken at 51 sampling stations situated in the Baltic Proper, Bothnian Bay, Bothnian Sea and Gulf of Finland. The aim was to estimate sedimentation rates in different parts of the Baltic Sea by using alternative methods and to consider reasons for eventual differences in results. The {sup 210}Pb, {sup 137}Cs and {sup 239,240}Pu methods and the sediment trap method were used in the estimations. The results show that the accumulation rates of dry matter may vary between 0.006 and 0.90 g cm{sup -2}y{sup -1} at different sampling stations of the Baltic Sea, and the sedimentation rates between 0.2 and 29 mm y{sup -1}, depending on the sedimentation itself and the method used in calculation. This is a considerable range in results, considering that all of the sampling stations were located in areas of soft sediment bottoms. In general, the sedimentation rates were highest at the Bothnian Sea sampling stations. In the Gulf of Finland the sedimentation rates were highest in the eastern part, while in the Bothnian Bay and in the Baltic Proper the rates were in general lower than in the two areas mentioned first. The differences among the results obtained with the various methods varied unsystematically; thus it was not possible to predict that any one of the methods would always give higher results than any of the others, or vice versa. The results show that in the Baltic Sea the use of more than one method in parallel for the estimation of sedimentation rates is highly recommended. None of the methods is necessarily suitable for routine use in the Baltic Sea. In those cases where the {sup 137

  4. Catalogue of nuclear fusion codes - 1976

    International Nuclear Information System (INIS)

    1976-10-01

    A catalogue is presented of the computer codes in nuclear fusion research developed by JAERI, Division of Thermonuclear Fusion Research and Division of Large Tokamak Development in particular. It contains a total of about 100 codes under the categories: Atomic Process, Data Handling, Experimental Data Processing, Engineering, Input and Output, Special Languages and Their Application, Mathematical Programming, Miscellaneous, Numerical Analysis, Nuclear Physics, Plasma Physics and Fusion Research, Plasma Simulation and Numerical Technique, Reactor Design, Solid State Physics, Statistics, and System Program. (auth.)

  5. Proposals to clarify and enhance the naming of fungi under the International Code of Nomenclature for algae, fungi, and plants.

    Science.gov (United States)

    Hawksworth, David L

    2015-06-01

    Twenty-three proposals to modify the International Code of Nomenclature for algae, fungi, and plants adopted in 2011 with respect to the provisions for fungi are made, in accordance with the wishes of mycologists expressed at the 10th International Mycological Congress in Bangkok in 2014, and with the support of the International Commission on the Taxonomy of Fungi (ICTF), the votes of which are presented here. The proposals relate to: conditions for epitypification, registration of later typifications, protected lists of names, removal of exemptions for lichen-forming fungi, provision of a diagnosis when describing a new taxon, citation of sanctioned names, avoiding homonyms in other kingdoms, ending preference for sexually typified names, and treatment of conspecific names with the same epithet. These proposals are also being published in Taxon, will be considered by the Nomenclature Committee for Fungi and General Committee on Nomenclature, and voted on at the 19th International Botanical Congress in Shenzhen, China, in 2017.

  6. Development of a dynamical model of a nuclear processes simulator for analysis and training in classroom based in the RELAP/SCDAP codes

    International Nuclear Information System (INIS)

    Salazar C, J.H.; Ramos P, J.C.; Salazar S, E.; Chavez M, C.

    2003-01-01

    The present work illustrates the application of the concept of a simulator for analysis, design, instruction and training in a classroom environment associated with a nuclear power station. Emphasis is placed on the methodology used to incorporate the best-estimate codes RELAP/SCDAP into a prototype under development at the Nuclear Reactor Engineering Analysis Laboratory (NREAL). This methodology is based on a modular structure where multiple processes can be executed in an independent way and where the generated information is stored in shared memory segments and distributed by means of communication routines developed in the C programming language. The utility of the system is demonstrated using highly interactive graphics (mimic diagrams, pictorials and trend graphs) for the simultaneous dynamic visualization of the most significant variables of a typical transient event (feedwater controller failure in a BWR). A fundamental part of the system is its advanced graphic interface. This interface, of the direct-manipulation type, reproduces instruments and controls whose functionality is similar to those found in the current replica simulator for the Laguna Verde Nuclear Power Station. Finally, the evaluation process is described. The general behavior of the main variables for the selected transient event is interpreted, corroborating that they follow the same trend as those reported for a BWR. The obtained results allow one to conclude that the developed system works satisfactorily and that the use of real-time visualization tools offers important advantages over other traditional methods of analysis. (Author)
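
    The shared-memory exchange described above can be sketched as follows. This is a minimal illustration using Python's multiprocessing.shared_memory in place of the C communication routines of the actual system; the segment name and the three variables are invented for the example.

        # Minimal sketch of exchanging simulator variables through a shared
        # memory segment; Python's multiprocessing.shared_memory stands in for
        # the C communication routines, and the segment name and variables are
        # invented.
        import numpy as np
        from multiprocessing import shared_memory

        # Producer side: create a segment holding [neutron_flux, reactivity, flow].
        shm = shared_memory.SharedMemory(create=True, size=3 * 8, name="sim_vars")
        out = np.ndarray((3,), dtype=np.float64, buffer=shm.buf)
        out[:] = [1.2e13, -0.05, 450.0]

        # Consumer side (e.g., the graphical post-processor): attach by name.
        shm_in = shared_memory.SharedMemory(name="sim_vars")
        vals = np.ndarray((3,), dtype=np.float64, buffer=shm_in.buf)
        print(f"flux = {vals[0]:.2e}, reactivity = {vals[1]}, flow = {vals[2]}")

        shm_in.close()
        shm.close()
        shm.unlink()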

  7. Beyond dual-process models: A categorisation of processes underlying intuitive judgement and decision making

    NARCIS (Netherlands)

    Glöckner, A.; Witteman, C.L.M.

    2010-01-01

    Intuitive-automatic processes are crucial for making judgements and decisions. The fascinating complexity of these processes has attracted many decision researchers, prompting them to start investigating intuition empirically and to develop numerous models. Dual-process models assume a clear

  8. MO-G-BRE-05: Clinical Process Improvement and Billing in Radiation Oncology: A Case Study of Applying FMEA for CPT Code 77336 (continuing Medical Physics Consultation)

    International Nuclear Information System (INIS)

    Spirydovich, S; Huq, M

    2014-01-01

    Purpose: The improvement of quality in healthcare can be assessed by Failure Mode and Effects Analysis (FMEA). In radiation oncology, FMEA, as applied to the billing CPT code 77336, can improve both charge capture and, most importantly, the quality of the performed services. Methods: We created an FMEA table for the process performed under CPT code 77336. For a given process step, each member of the assembled team (physicist, dosimetrist, and therapist) independently assigned numerical values for: probability of occurrence (O, 1–10), severity (S, 1–10), and probability of detection (D, 1–10) for every failure mode cause and effect combination. The risk priority number, RPN, was then calculated as the product of O, S and D, from which an average RPN was calculated for each combination mentioned above. A fault tree diagram, with each process sorted into 6 categories, was created with linked RPNs. For processes with high RPN, recommended actions were assigned. Two separate R and V systems (Lantis and EMR-based ARIA) were considered. Results: We identified 9 potential failure modes and 19 corresponding potential causes of these failure modes, all resulting in an unjustified 77336 charge and compromised quality of care. In Lantis, the range of RPN was 24.5–110.8, and of S values 2–10. The highest-ranking RPN of 110.8 came from the failure mode described as “end-of-treatment check not done before the completion of treatment”, and the highest S value of 10 (RPN=105) from “overrides not checked”. For the same failure modes, within the ARIA electronic environment with its additional controls, RPN values were significantly lower (44.3 for the missing end-of-treatment check and 20.0 for overrides not checked). Conclusion: Our work has shown that missed charge capture was also accompanied by some services not being performed. Absence of such necessary services may result in sub-optimal quality of care rendered to patients.
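
    The RPN arithmetic lends itself to a compact sketch. The following minimal example, with invented scores, computes RPN = O × S × D per rater and averages over the team, as described in the Methods above.

        # Minimal sketch of the RPN arithmetic: each rater scores occurrence
        # (O), severity (S) and detectability (D) on 1-10 scales, RPN = O*S*D,
        # and the team value is the average. The scores below are invented.
        raters = {
            "physicist":   {"O": 4, "S": 9, "D": 3},
            "dosimetrist": {"O": 5, "S": 8, "D": 4},
            "therapist":   {"O": 3, "S": 9, "D": 5},
        }

        rpns = [r["O"] * r["S"] * r["D"] for r in raters.values()]
        print(f"per-rater RPNs: {rpns}, average RPN: {sum(rpns) / len(rpns):.1f}")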

  9. Real-time photoacoustic and ultrasound dual-modality imaging system facilitated with graphics processing unit and code parallel optimization.

    Science.gov (United States)

    Yuan, Jie; Xu, Guan; Yu, Yao; Zhou, Yu; Carson, Paul L; Wang, Xueding; Liu, Xiaojun

    2013-08-01

    Photoacoustic tomography (PAT) offers structural and functional imaging of living biological tissue with highly sensitive optical absorption contrast and excellent spatial resolution comparable to medical ultrasound (US) imaging. We report the development of a fully integrated PAT and US dual-modality imaging system, which performs signal scanning, image reconstruction, and display for both photoacoustic (PA) and US imaging, all in a truly real-time manner. The back-projection (BP) algorithm for PA image reconstruction is optimized to reduce the computational cost and facilitate parallel computation on a state-of-the-art graphics processing unit (GPU) card. For the first time, PAT and US imaging of the same object can be conducted simultaneously and continuously at a real-time frame rate, presently limited by the laser repetition rate of 10 Hz. Noninvasive PAT and US imaging of human peripheral joints in vivo was achieved, demonstrating the satisfactory image quality realized with this system. Another experiment, simultaneous PAT and US imaging of contrast agent flowing through an artificial vessel, was conducted to verify the performance of this system for imaging fast biological events. The GPU-based image reconstruction software code for this dual-modality system is open source and available for download from http://sourceforge.net/projects/patrealtime.
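
    The delay-and-sum back-projection idea behind the reconstruction can be sketched as follows. This minimal example uses an invented three-sensor geometry and a synthetic point source; a GPU implementation such as the one described would parallelize the per-pixel loop.

        # Minimal sketch of delay-and-sum back-projection (BP) for photoacoustic
        # reconstruction: for each image pixel, sum the detector signals at the
        # acoustic time of flight. Geometry, sampling rate and the synthetic
        # signals are illustrative only.
        import numpy as np

        c = 1500.0            # speed of sound, m/s
        fs = 20e6             # sampling rate, Hz
        sensors = np.array([[0.0, -0.02], [0.0, 0.0], [0.0, 0.02]])  # (x, y) in m
        n_samples = 2000
        signals = np.zeros((len(sensors), n_samples))

        # Synthetic point absorber at (10 mm, 0): place a pulse at each sensor's
        # time of flight.
        src = np.array([0.01, 0.0])
        for i, s in enumerate(sensors):
            signals[i, int(np.linalg.norm(src - s) / c * fs)] = 1.0

        # Back-project onto a small image grid.
        xs = np.linspace(0.005, 0.015, 64)
        ys = np.linspace(-0.005, 0.005, 64)
        image = np.zeros((len(ys), len(xs)))
        for iy, y in enumerate(ys):
            for ix, x in enumerate(xs):
                for i, s in enumerate(sensors):
                    t_idx = int(np.hypot(x - s[0], y - s[1]) / c * fs)
                    if t_idx < n_samples:
                        image[iy, ix] += signals[i, t_idx]

        iy, ix = np.unravel_index(image.argmax(), image.shape)
        print(f"peak at x = {xs[ix]*1e3:.1f} mm, y = {ys[iy]*1e3:.1f} mm")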

  10. Error Correcting Codes

    Indian Academy of Sciences (India)

    The reading of data from memory is the receiving process. Protecting data in computer memories was one of the earliest applications of Hamming codes. We now describe the clever scheme invented by Hamming in 1948. To keep things simple, we describe the binary length-7 Hamming code and encoding in that code.
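
    A minimal sketch of that length-7 scheme: four data bits are encoded with a generator matrix into a seven-bit codeword, and a parity-check matrix locates any single bit error from the syndrome. The matrices below are one standard systematic choice, not necessarily Hamming's original arrangement.

        # Minimal sketch of the binary (7,4) Hamming code: encode four data
        # bits with a generator matrix (mod 2), then locate and correct a
        # single bit error using the syndrome.
        import numpy as np

        # Generator matrix in systematic form [I | P].
        G = np.array([[1, 0, 0, 0, 1, 1, 0],
                      [0, 1, 0, 0, 1, 0, 1],
                      [0, 0, 1, 0, 0, 1, 1],
                      [0, 0, 0, 1, 1, 1, 1]])
        # Parity-check matrix consistent with G (H = [P^T | I]).
        H = np.array([[1, 1, 0, 1, 1, 0, 0],
                      [1, 0, 1, 1, 0, 1, 0],
                      [0, 1, 1, 1, 0, 0, 1]])

        data = np.array([1, 0, 1, 1])
        codeword = data @ G % 2          # seven-bit Hamming codeword

        received = codeword.copy()
        received[2] ^= 1                 # inject a single bit error
        syndrome = H @ received % 2
        # A nonzero syndrome equals the column of H at the error position.
        err_pos = next(i for i in range(7) if np.array_equal(H[:, i], syndrome))
        received[err_pos] ^= 1           # corrected
        print(codeword, received)        # the two now agree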

  11. The Narrative-Emotion Process Coding System 2.0: A multi-methodological approach to identifying and assessing narrative-emotion process markers in psychotherapy.

    Science.gov (United States)

    Angus, Lynne E; Boritz, Tali; Bryntwick, Emily; Carpenter, Naomi; Macaulay, Christianne; Khattra, Jasmine

    2017-05-01

    Recent studies suggest that it is not simply the expression of emotion or emotional arousal in session that is important, but rather the reflective processing of emergent, adaptive emotions, arising in the context of personal storytelling and/or Emotion-Focused Therapy (EFT) interventions, that is associated with change. To enhance narrative-emotion integration specifically in EFT, Angus and Greenberg originally identified a set of eight clinically derived narrative-emotion integration markers for the implementation of process-guiding therapeutic responses. Further evaluation and testing by the Angus Narrative-Emotion Marker Lab resulted in the identification of 10 empirically validated Narrative-Emotion Process (N-EP) markers that are included in the Narrative-Emotion Process Coding System Version 2.0 (NEPCS 2.0). Based on empirical research findings, individual markers are clustered into Problem (e.g., stuckness in repetitive story patterns, over-controlled or dysregulated emotion, lack of reflectivity), Transition (e.g., reflective, access to adaptive emotions and new emotional plotlines, heightened narrative and emotion integration), and Change (e.g., new story outcomes and self-narrative discovery, and co-construction and re-conceptualization) subgroups. To date, research using the NEPCS 2.0 has investigated the proportion and pattern of narrative-emotion markers in Emotion-Focused, Client-Centered, and Cognitive Therapy for Major Depression, Motivational Interviewing plus Cognitive Behavioral Therapy for Generalized Anxiety Disorder, and EFT for Complex Trauma. Results have consistently identified significantly higher proportions of N-EP Transition and Change markers, and productive shifts, in mid- and late-phase sessions, for clients who achieved recovery by treatment termination. Recovery is consistently associated with client storytelling that is emotionally engaged, reflective, and evidencing new story outcomes and self

  12. EXTRA·M: a computing code system for analysis of the Purex process with mixer settlers for reprocessing

    International Nuclear Information System (INIS)

    Tachimori, Shoichi

    1994-03-01

    A computer code system, EXTRA·M, for simulation of the transient behavior of solutes in a multistage countercurrent extraction process was developed, aiming to predict the distribution and chemical behavior of the actinide elements, i.e., U, Pu, Np, and of technetium in the Purex process of fuel reprocessing. The mathematical model is applicable to a complete-mixing stagewise contactor such as a mixer settler, and to the Purex system with tri-n-butylphosphate (TBP) and nitric acid. The main characteristics of EXTRA·M are as follows: i) Calculation of distribution ratios of the solutes is based on numerical equations whose parameter values are determined by a best-fit method against a large number of experimental data. ii) A total of 18 solutes, U(IV), U(VI), Pu(III), Pu(IV), Pu(V), Pu(VI), Np(IV), Np(V), Np(VI), Tc(IV), Tc(V), Tc(VI), Tc(VII), Zr(IV), HNO3, hydrazine, hydroxylamine nitrate and nitrous acid, are treated, and rate equations of a total of 40 chemical reactions involving these solutes are incorporated. iii) Instantaneous changes of flow conditions, i.e., concentrations of the solutes and flow rates of the feeding solutions, can be handled in the computation. iv) Reflux or bypass mode calculation, in which an aqueous raffinate stream is transferred to the preceding bank or stage, is possible. The present report explains the concept, assumptions and characteristics of the model, the material balance equations including distribution and reaction rate equations and their solution method, and the usefulness of the model by showing some examples of the verification results. A description and source program of EXTRA·M1, as an example, are listed in the annex. (J.P.N.) 63 refs
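
    The stagewise countercurrent balance that such a code iterates on can be sketched simply. The example below uses one solute with a constant distribution ratio D, unlike the concentration-dependent distribution and rate equations of EXTRA·M; all numbers are invented.

        # Minimal sketch of a stagewise countercurrent extraction balance of
        # the kind a mixer-settler code iterates on: each stage reaches
        # equilibrium y = D*x with a constant distribution ratio D (one solute,
        # invented values).
        n_stages = 8
        D = 2.0          # distribution ratio (organic/aqueous)
        A, O = 1.0, 1.0  # aqueous and organic flow rates (L/min)
        x_feed = 1.0     # solute in aqueous feed (mol/L), fed at stage 1
        x = [x_feed] * n_stages   # aqueous concentrations per stage
        y = [0.0] * n_stages      # organic concentrations per stage

        # Gauss-Seidel style sweeps until the stage balances converge.
        for _ in range(500):
            for i in range(n_stages):
                x_in = x_feed if i == 0 else x[i - 1]          # aqueous: 1 -> n
                y_in = 0.0 if i == n_stages - 1 else y[i + 1]  # organic: n -> 1
                # Stage balance with equilibrium y = D*x:
                #   A*x_in + O*y_in = A*x + O*D*x
                x[i] = (A * x_in + O * y_in) / (A + O * D)
                y[i] = D * x[i]

        print(f"raffinate x = {x[-1]:.4f}, loaded organic y = {y[0]:.4f}")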

  13. Implementation and use of Gaussian process meta model for sensitivity analysis of numerical models: application to a hydrogeological transport computer code

    International Nuclear Information System (INIS)

    Marrel, A.

    2008-01-01

    In studies of environmental transfer and risk assessment, numerical models are used to simulate, understand and predict the transfer of pollutants. These computer codes can depend on a high number of uncertain input parameters (geophysical variables, chemical parameters, etc.) and can often be too expensive in computing time. To conduct uncertainty propagation studies and to measure the importance of each input on the response variability, the computer code has to be approximated by a meta-model which is built on an acceptable number of simulations of the code and requires a negligible calculation time. We focused our research work on the use of a Gaussian process meta-model to perform the sensitivity analysis of the code. We proposed a methodology with estimation and input selection procedures in order to build the meta-model in the case of a high number of inputs and with few simulations available. Then, we compared two approaches to compute the sensitivity indices with the meta-model and proposed an algorithm to build prediction intervals for these indices. Afterwards, we were interested in the choice of the code simulations. We studied the influence of different sampling strategies on the predictiveness of the Gaussian process meta-model. Finally, we extended our statistical tools to a functional output of a computer code. We combined a decomposition on a wavelet basis with Gaussian process modelling before computing the functional sensitivity indices. All the tools and statistical methodologies that we developed were applied to the real case of a complex hydrogeological computer code simulating radionuclide transport in groundwater. (author) [fr]
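
    The meta-model workflow can be sketched in a few lines: fit a Gaussian process to a small simulation budget, then estimate first-order sensitivity indices, S_i = Var(E[Y|X_i]) / Var(Y), cheaply on the emulator. The toy two-input function below stands in for the hydrogeological code, and scikit-learn's GaussianProcessRegressor is an assumed stand-in for the author's GP implementation.

        # Minimal sketch of a GP meta-model used for sensitivity analysis: fit
        # on a handful of runs of an "expensive" code (a toy function here),
        # then estimate first-order Sobol indices on the cheap emulator.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor

        rng = np.random.default_rng(0)

        def expensive_code(x):   # toy stand-in, two inputs in [0, 1]
            return np.sin(3 * x[:, 0]) + 0.3 * x[:, 1] ** 2

        X_train = rng.uniform(size=(40, 2))          # small simulation budget
        gp = GaussianProcessRegressor(normalize_y=True)
        gp.fit(X_train, expensive_code(X_train))

        # First-order indices via conditional means on the emulator:
        # S_i = Var(E[Y|X_i]) / Var(Y).
        N = 2000
        X = rng.uniform(size=(N, 2))
        var_y = gp.predict(X).var()
        for i in range(2):
            cond_means = []
            for xi in np.linspace(0.01, 0.99, 25):
                Xc = X.copy()
                Xc[:, i] = xi                        # freeze input i
                cond_means.append(gp.predict(Xc).mean())
            print(f"S_{i} ~ {np.var(cond_means) / var_y:.2f}")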

  14. 76 FR 37034 - Certain Employee Remuneration in Excess of $1,000,000 Under Internal Revenue Code Section 162(m)

    Science.gov (United States)

    2011-06-24

    ... exercise of a stock option or stock appreciation right, or the substantial vesting of restricted property... options, stock appreciation rights, and restricted property be extended even further to cover other stock... to stock options, stock appreciation rights, and restricted property is covered under Sec. 1.162-27(f...

  15. System for measuring the effect of fouling and corrosion on heat transfer under simulated OTEC conditions. [HTAU and LABTTF codes

    Energy Technology Data Exchange (ETDEWEB)

    Fetkovich, J.G.

    1976-12-01

    A complete system designed to measure, with high precision, changes in heat transfer rates due to fouling and corrosion of simulated heat exchanger tubes, at sea and under OTEC conditions, is described. All aspects of the system are described in detail, including theory, mechanical design, electronics design, assembly procedures, test and calibration, operating procedures, laboratory results, field results, and data analysis programs.

  16. Mashing of Rice with Barley Malt Under Nonconventional Process Conditions for Use in Food Processes

    DEFF Research Database (Denmark)

    Moe, T.; Adler-Nissen, Jens

    1994-01-01

    Non-conventional mashing conditions are relevant in the development of a lactic acid-fermented soymilk beverage where mashed rice is the source of carbohydrates for the fermentation and sweetness of the beverage. Advantages in the process layout could be achieved by mashing at higher pH and lower...... conditions when a mashing step is integrated in other food processes....

  17. Development of severe accident analysis code - A study on the molten core-concrete interaction under severe accidents

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chang Hyun; Lee, Byung Chul; Huh, Chang Wook; Kim, Doh Young; Kim, Ju Yeul [Seoul National University, Seoul (Korea, Republic of)

    1996-07-01

    The purpose of this study is to understand the phenomena of the molten core/concrete interaction (MCCI) during a hypothetical severe accident, and to develop a model for the heat transfer and physical phenomena in MCCIs. The contents of this study are: analysis of the mechanisms in MCCIs and assessment of heat transfer models; evaluation and verification of the models in the CORCON code using the SWISS and SURC experiments, and of 1000 MWe PWR reactor cavity coolability; and establishment of a model for prediction of the crust formation and the temperature of the melt pool. The properties and flow conditions of the melt pool covering the conditions of a severe accident are used to evaluate the heat transfer coefficients in each reviewed model. Also, the scope and limitations of each model for application are assessed. A phenomenological analysis is performed with MELCOR 1.8.2 and MELCOR 1.8.3, and its results are compared with the corresponding experimental reports of the SWISS and SURC experiments. A calculation is also performed to assess 1000 MWe PWR reactor cavity coolability. To improve the heat transfer model between the melt pool and the overlying coolant and to analyze the phase change of the melt pool, two-dimensional governing equations are established using the enthalpy method and a computational program is developed in this study. A benchmarking calculation is performed and its results are compared to an experiment which did not consider the effects of coolant boiling and gas injection. Ultimately, the model shall be developed to consider the gas injection and coolant boiling effects. 66 refs., 10 tabs., 29 figs. (author)
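
    The enthalpy method mentioned above can be sketched in one dimension: advance the enthalpy field and recover temperature from the enthalpy-temperature relation, so the solid crust front emerges without explicit tracking. All material properties and boundary values below are illustrative, and the real study solves the two-dimensional equations.

        # Minimal 1D sketch of the enthalpy method for a phase-change front:
        # advance H explicitly from dH/dt = k * d2T/dx2 and recover T from H,
        # so the solid (crust) region is simply where H < 0. Properties and
        # boundary values are invented.
        import numpy as np

        nx, dx, dt = 50, 1e-3, 1e-3        # grid spacing (m), time step (s)
        k, rho, cp, L, Tm = 1.0, 2000.0, 1000.0, 2e5, 1500.0  # assumed properties

        def temperature(H):
            """Enthalpy -> temperature (H = 0 at solid at Tm)."""
            T = np.where(H < 0, Tm + H / (rho * cp), Tm)                  # solid
            return np.where(H > rho * L, Tm + (H - rho * L) / (rho * cp), T)

        H = np.full(nx, rho * L + rho * cp * 100.0)   # start fully molten, Tm+100
        for _ in range(20000):
            T = temperature(H)
            T[0] = Tm - 300.0                  # cooled boundary (crust side)
            lap = np.zeros(nx)
            lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
            H += dt * k * lap                  # explicit enthalpy update

        solid_cells = np.count_nonzero(H < 0)
        print(f"crust thickness ~ {solid_cells * dx * 1e3:.1f} mm")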

  18. JASMINE-pro: A computer code for the analysis of propagation process in steam explosions. User's manual

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yanhua; Nilsuwankosit, Sunchai; Moriyama, Kiyofumi; Maruyama, Yu; Nakamura, Hideo; Hashimoto, Kazuichiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2000-12-01

    A steam explosion is a phenomenon in which a high-temperature liquid gives up its internal energy very rapidly to another, low-temperature volatile liquid, causing very strong pressure build-up due to rapid vaporization of the latter. In the field of light water reactor safety research, steam explosions caused by the contact of molten core and coolant have been recognized as a potential threat which could cause failure of the pressure vessel or the containment vessel during a severe accident. A numerical simulation code, JASMINE, was developed at the Japan Atomic Energy Research Institute (JAERI) to evaluate the impact of steam explosions on the integrity of reactor boundaries. The JASMINE code consists of two parts, JASMINE-pre and -pro, which handle the premixing and propagation phases of steam explosions, respectively. The JASMINE-pro code simulates the thermo-hydrodynamics in the propagation phase of a steam explosion on the basis of a multi-fluid model for multiphase flow. This report, the 'User's Manual', gives the usage of the JASMINE-pro code as well as information on the code structure which should be useful for users to understand how the code works. (author)

  20. Disposal Notifications and Quarterly Membership Updates for the Utility Solid Waste Group Members’ Risk-Based Approvals to Dispose of PCB Remediation Waste Under Title 40 of the Code of Federal Regulations Section 761.61(c)

    Science.gov (United States)

    Disposal Notifications and Quarterly Membership Updates for the Utility Solid Waste Group Members’ Risk-Based Approvals to Dispose of Polychlorinated Biphenyl (PCB) Remediation Waste Under Title 40 of the Code of Federal Regulations Section 761.61(c)

  1. Calculation of the real states of Ignalina NPP Unit 1 and Unit 2 RBMK-1500 reactors in the verification process of QUABOX/CUBBOX code

    International Nuclear Information System (INIS)

    Bubelis, E.; Pabarcius, R.; Demcenko, M.

    2001-01-01

    Calculations of the main neutron-physical characteristics of the RBMK-1500 reactors of Ignalina NPP Unit 1 and Unit 2 were performed, taking real reactor core states as the basis for these calculations. Comparison of the calculation results obtained using the QUABOX/CUBBOX code with experimental data and with the calculation results obtained using the STEPAN code showed that all the main neutron-physical characteristics of the reactors of Unit 1 and Unit 2 of Ignalina NPP are in the safe deviation range of the analyzed parameters, and that the reactors of Ignalina NPP, during the process of reactor core composition change, are operated in a safe and stable manner. (author)

  2. FIFI 3: A digital computer code for the solution of sets of first order differential equations and the analysis of process plant dynamics

    International Nuclear Information System (INIS)

    Sumner, H.M.

    1965-11-01

    FIFI 3 is a FORTRAN code embodying a technique for the analysis of process plant dynamics. As such, it is essentially a tool for the integration of sets of first-order ordinary differential equations, either linear or non-linear; special provision is made for the inclusion of time-delayed variables in the mathematical model of the plant. The method of integration is new and is centred on a stable multistep predictor-corrector algorithm devised by the late Mr. F.G. Chapman, of the UKAEA, Winfrith. The theory on which the code is based and detailed rules for using it are described in Parts I and II respectively. (author)
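
    A multistep predictor-corrector scheme of this general kind can be sketched as follows, here the two-step Adams-Bashforth predictor paired with a trapezoidal (Adams-Moulton) corrector; Chapman's actual algorithm is not reproduced in this record, so the scheme below is a generic stand-in.

        # Minimal sketch of a multistep predictor-corrector integrator for a
        # set of first-order ODEs (Adams-Bashforth 2 predictor, trapezoidal
        # corrector). The example system y0' = y1, y1' = -y0 has the exact
        # solution y0 = cos(t), y1 = -sin(t).
        import numpy as np

        def f(t, y):
            return np.array([y[1], -y[0]])

        h, t, y = 0.01, 0.0, np.array([1.0, 0.0])
        f_prev = f(t, y)
        y_new = y + h * f(t, y)       # one Euler step to start the method
        t += h

        for _ in range(100):
            f_curr = f(t, y_new)
            # Predictor: y_p = y_n + h/2 * (3*f_n - f_{n-1})
            y_pred = y_new + h / 2 * (3 * f_curr - f_prev)
            # Corrector: y_{n+1} = y_n + h/2 * (f_n + f(t_{n+1}, y_p))
            y_corr = y_new + h / 2 * (f_curr + f(t + h, y_pred))
            f_prev, y_new, t = f_curr, y_corr, t + h

        print(f"t = {t:.2f}, y = {y_new}, exact = [{np.cos(t):.4f}, {-np.sin(t):.4f}]")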

  3. Development of TPNCIRC code for Evaluation of Two-Phase Natural Circulation Flow Performance under External Reactor Vessel Cooling Conditions

    International Nuclear Information System (INIS)

    Choi, A-Reum; Song, Hyuk-Jin; Park, Jong-Woon

    2015-01-01

    During a severe accident, corium is relocated to the lower head of the nuclear reactor pressure vessel (RPV). The design concept of retaining the corium inside the RPV through external cooling under hypothetical core melting accidents is called external reactor vessel cooling (ERVC). In this respect, a validated two-phase natural circulation flow (TPNC) model is necessary to determine the adequacy of ERVC design and operating conditions such as inlet area, form losses, gap distance, riser length and coolant conditions. The most important models generally characterizing TPNC are the void fraction and two-phase friction models. Typical experimental and analytical studies on two-phase natural circulation flow characteristics to be referred to are those by Reyes and by Gartia et al., based on Vijayan et al., Nayak et al. and Dubey et al. In the present paper, two-phase natural circulation flow characteristics under external reactor vessel cooling conditions are studied using the two existing TPNC flow models of Reyes and of Gartia et al., incorporating improved void fraction and two-phase friction models. These models and correlations are integrated into a computer program, TPNCIRC, which can handle candidate ERVC design parameters such as inlet, riser and downcomer flow lengths and areas, gap size between the reactor vessel and the surrounding insulation, and minor loss factors, as well as the operating parameters of decay power, pressure and subcooling. The accuracy of the TPNCIRC program is investigated with respect to flow rate and void fraction against existing measured data from a general experiment and from ULPU, which was specifically designed for AP1000 in-vessel retention. Also, the effects of some important design parameters are examined for the experimental and plant conditions. A number of void fraction correlations have been examined within TPNCIRC; the differences among their results appear to come from differences in the predicted void fractions.
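
    The steady-state balance that such a program iterates on can be sketched for the single-phase case: the buoyancy head generated over the riser height is equated with the friction and form losses to give the loop flow rate. All numbers below are illustrative, and the two-phase problem adds void fraction and two-phase friction multipliers.

        # Minimal single-phase sketch of a natural-circulation loop balance:
        # buoyancy head = friction + form losses, solved for the mass flow W.
        # All geometry and property values are invented for illustration.
        import math

        rho, beta, g = 960.0, 7.5e-4, 9.81   # density, expansivity, gravity
        dT, H = 20.0, 2.0                    # heating temperature rise (K), riser height (m)
        A, D, L = 0.05, 0.1, 10.0            # flow area (m2), diameter (m), length (m)
        f, K = 0.02, 5.0                     # friction factor, total form loss

        # Buoyancy head:  dp_b = rho * beta * g * dT * H
        # Loop losses:    dp_f = (f*L/D + K) * W**2 / (2 * rho * A**2)
        # Setting dp_b = dp_f and solving for W:
        W = math.sqrt(2 * rho**2 * beta * g * dT * H * A**2 / (f * L / D + K))
        print(f"natural circulation flow ~ {W:.1f} kg/s")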

  4. There Is a "U" in Clutter: Evidence for Robust Sparse Codes Underlying Clutter Tolerance in Human Vision.

    Science.gov (United States)

    Cox, Patrick H; Riesenhuber, Maximilian

    2015-10-21

    The ability to recognize objects in clutter is crucial for human vision, yet the underlying neural computations remain poorly understood. Previous single-unit electrophysiology recordings in inferotemporal cortex in monkeys and fMRI studies of object-selective cortex in humans have shown that the responses to pairs of objects can sometimes be well described as a weighted average of the responses to the constituent objects. Yet, from a computational standpoint, it is not clear how the challenge of object recognition in clutter can be solved if downstream areas must disentangle the identity of an unknown number of individual objects from the confounded average neuronal responses. An alternative idea is that recognition is based on a subpopulation of neurons that are robust to clutter, i.e., that do not show response averaging, but rather robust object-selective responses in the presence of clutter. Here we show that simulations using the HMAX model of object recognition in cortex can fit the aforementioned single-unit and fMRI data, showing that the averaging-like responses can be understood as the result of responses of object-selective neurons to suboptimal stimuli. Moreover, the model shows how object recognition can be achieved by a sparse readout of neurons whose selectivity is robust to clutter. Finally, the model provides a novel prediction about human object recognition performance, namely, that target recognition ability should show a U-shaped dependency on the similarity of simultaneously presented clutter objects. This prediction is confirmed experimentally, supporting a simple, unifying model of how the brain performs object recognition in clutter. The neural mechanisms underlying object recognition in cluttered scenes (i.e., containing more than one object) remain poorly understood. Studies have suggested that neural responses to multiple objects correspond to an average of the responses to the constituent objects. Yet, it is unclear how the identities

  5. Fuel performance analysis code 'FAIR'

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.; Mahajan, S.C.; Kakodkar, A.

    1994-01-01

    For modelling nuclear reactor fuel rod behaviour of water-cooled reactors under severe power maneuvering and high burnups, a mechanistic fuel performance analysis code, FAIR, has been developed. The code incorporates a finite-element-based thermomechanical module, a physically based fission gas release module and relevant models for fuel-related phenomena, such as pellet cracking, densification and swelling, radial flux redistribution across the pellet due to the build-up of plutonium near the pellet surface, and pellet-clad mechanical interaction/stress corrosion cracking (PCMI/SCC) failure of the sheath. The code follows the established principles of fuel rod analysis programmes, such as coupling of the thermal and mechanical solutions along with the fission gas release calculations, analysing different axial segments of the fuel rod simultaneously, and providing means for performing local analyses such as clad ridging analysis. The modular nature of the code offers flexibility in making modifications to the code for modelling MOX fuels and thorium-based fuels. For performing analysis of fuel rods subjected to very long power histories within a reasonable amount of time, the code has been parallelised and commissioned on the ANUPAM parallel processing system developed at Bhabha Atomic Research Centre (BARC). (author). 37 refs

  6. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking…
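
    The feedback-driven rate adaptation described above can be illustrated with a toy decoder. In this hedged sketch, plain random XOR parities stand in for BCH syndromes (far weaker as a code, but the control flow is the same): the decoder requests additional parity over the feedback channel until the side information can be corrected to a unique candidate.

```python
import itertools, random

# Illustrative stand-in for rate-adaptive distributed source coding with a
# feedback channel. Random XOR parities replace BCH syndromes for brevity.

N = 16  # block length

def parity(bits, mask):
    return sum(b for b, m in zip(bits, mask) if m) % 2

def decode(y, parities, max_weight=2):
    # Find all versions of y with <= max_weight bit flips that satisfy every
    # received parity; succeed only if exactly one candidate remains.
    candidates = []
    for w in range(max_weight + 1):
        for flips in itertools.combinations(range(N), w):
            cand = list(y)
            for i in flips:
                cand[i] ^= 1
            if all(parity(cand, m) == p for m, p in parities):
                candidates.append(cand)
    return candidates[0] if len(candidates) == 1 else None

rng = random.Random(0)
x = [rng.randint(0, 1) for _ in range(N)]
y = list(x); y[3] ^= 1; y[11] ^= 1           # correlated side information

parities, decoded = [], None
while decoded is None:                        # feedback loop: request more rate
    mask = [rng.randint(0, 1) for _ in range(N)]
    parities.append((mask, parity(x, mask)))
    decoded = decode(y, parities)

print(decoded == x, "rate =", len(parities) / N)
```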

  7. Synthesis of Optimal Processing Pathway for Microalgae-based Biorefinery under Uncertainty

    DEFF Research Database (Denmark)

    Rizwan, Muhammad; Lee, Jay H.; Gani, Rafiqul

    2015-01-01

    To support decision making, we propose a systematic framework for the synthesis and optimal design of a microalgae-based processing network under uncertainty. By incorporating major uncertainties into the biorefinery superstructure model we developed previously, a stochastic mixed integer nonlinear programming (sMINLP) problem is formulated for determining the optimal biorefinery structure under given parameter uncertainties modelled as sampled scenarios. The solution to the sMINLP problem determines the optimal decisions with respect to processing technologies, material flows, and product portfolio in the presence…
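
    As a toy illustration of scenario-based synthesis (not the authors' sMINLP, which optimizes a full superstructure), the sketch below evaluates a discrete technology choice against sampled yield scenarios and picks the one with the best expected profit; all names and numbers are invented:

```python
import random

# Toy scenario-based technology selection in the spirit of the framework
# above: one discrete choice, evaluated over sampled yield scenarios.

rng = random.Random(42)
technologies = {          # hypothetical (capital cost, mean yield)
    "wet_extraction": (2.0, 0.30),
    "dry_extraction": (3.5, 0.45),
}
scenarios = [rng.gauss(1.0, 0.2) for _ in range(500)]  # yield multipliers
price, feed = 20.0, 1.0

def expected_profit(capex, mean_yield):
    profits = [price * feed * mean_yield * s - capex for s in scenarios]
    return sum(profits) / len(profits)

best = max(technologies, key=lambda t: expected_profit(*technologies[t]))
print(best, round(expected_profit(*technologies[best]), 3))
```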

  8. Thermal-hydraulic analysis under partial loss of flow accident hypothesis of a plate-type fuel surrounded by two water channels using RELAP5 code

    Directory of Open Access Journals (Sweden)

    Itamar Iliuk

    2016-01-01

    Full Text Available Thermal-hydraulic analysis of plate-type fuel is of great importance for the establishment of safety criteria and for the licensing of the future nuclear reactor intended to propel the Brazilian nuclear submarine. In this work, an analysis of a single plate-type fuel element surrounded by two water channels was performed using the RELAP5 thermal-hydraulic code. For the simulations, a plate-type fuel with a uranium dioxide meat sandwiched between two Zircaloy-4 plates was proposed. A partial loss of flow accident was simulated to show the behavior of the model under this type of accident. The results show that the critical heat flux was detected in the central region along the axial direction of the plate when the right water channel was blocked.

  9. Exploring the Use of Design of Experiments in Industrial Processes Operating Under Closed-Loop Control

    DEFF Research Database (Denmark)

    Capaci, Francesca; Kulahci, Murat; Vanhatalo, Erik

    2017-01-01

    Industrial manufacturing processes often operate under closed-loop control, where automation aims to keep important process variables at their set-points. In process industries such as pulp, paper, chemical and steel plants, it is often hard to find production processes operating in open loop. Instead, closed-loop control systems will actively attempt to minimize the impact of process disturbances. However, we argue that an implicit assumption in most experimental investigations is that the studied system is open loop, allowing the experimental factors to freely affect the important system responses. This scenario is typically not found in process industries. The purpose of this article is therefore to explore issues of experimental design and analysis in processes operating under closed-loop control and to illustrate how Design of Experiments can help in improving and optimizing…

  10. Mashing of Rice with Barley Malt Under Nonconventional Process Conditions for Use in Food Processes

    DEFF Research Database (Denmark)

    Moe, T.; Adler-Nissen, Jens

    1994-01-01

    Non-conventional mashing conditions are relevant in the development of a lactic acid-fermented soymilk beverage where mashed rice is the source of carbohydrates for the fermentation and sweetness of the beverage. Advantages in the process layout could be achieved by mashing at higher pH and lower … at 50 degrees C and 62 degrees C was investigated. Regression equations have been established for predicting yields of soluble protein, low molecular weight sugars and total fermentability as functions of pH and malt concentration. The results showed that the maltose yield was constant while glucose, maltotriose and total fermentable sugar yields decreased slightly with increasing pH and decreasing malt concentration. Prolonged mash holding times at 50 degrees C and 62 degrees C gave minor increases in protein yields only. It is concluded that it is quite acceptable to use nonconventional mashing…

  11. Post-Processing of Dynamic Gadolinium-Enhanced Magnetic Resonance Imaging Exams of the Liver: Explanation and Potential Clinical Applications for Color-Coded Qualitative and Quantitative Analysis

    International Nuclear Information System (INIS)

    Wang, L.; Bos, I.C. Van den; Hussain, S.M.; Pattynama, P.M.; Vogel, M.W.; Krestin, G.P.

    2008-01-01

    The purpose of this article is to explain and illustrate the current status and potential applications of automated and color-coded post-processing techniques for the analysis of dynamic multiphasic gadolinium-enhanced magnetic resonance imaging (MRI) of the liver. Post-processing of these images on dedicated workstations allows the generation of time-intensity curves (TIC) as well as color-coded images, which provides useful information on (neo)-angiogenesis within a liver lesion, if necessary combined with information on enhancement patterns of the surrounding liver parenchyma. Analysis of TIC and color-coded images, which are based on pharmacokinetic modeling, provides an easy-to-interpret schematic presentation of tumor behavior, providing additional characteristics for adequate differential diagnosis. Inclusion of TIC and color-coded images as part of the routine abdominal MRI workup protocol may help to further improve the specificity of MRI findings, but needs to be validated in clinical decision-making situations. In addition, these tools may facilitate the diagnostic workup of disease for detection, characterization, staging, and monitoring of antitumor therapy, and hold incremental value relative to the widely used tumor response criteria.

  12. Current status of high energy nucleon-meson transport code

    Energy Technology Data Exchange (ETDEWEB)

    Takada, Hiroshi; Sasa, Toshinobu [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-03-01

    The current status of the accelerator design code NMTC/JAERI, an outline of its physical model, and an evaluation of the code's accuracy are reported. To evaluate the nuclear performance of an accelerator and of a strong spallation neutron source, the nuclear reactions between high-energy protons and target nuclides, and the behavior of the various produced particles, are needed. The nuclear design of spallation neutron systems used a calculation code system connecting the high-energy nucleon·meson transport code with the neutron·photon transport code. NMTC/JAERI describes the particle evaporation process taking into account the competition between the intranuclear cascade and the fission process. Particle transport calculations are carried out for protons, neutrons, and π- and μ-mesons. To verify and improve the accuracy of the high-energy nucleon-meson transport code, data on spallation and spallation neutron fragments from integral experiments were collected. (S.Y.)

  13. A systematic framework for enterprise-wide optimization: Synthesis and design of processing network under uncertainty

    DEFF Research Database (Denmark)

    Quaglia, Alberto; Sarup, Bent; Sin, Gürkan

    2013-01-01

    In this paper, a systematic framework for synthesis and design of processing networks under uncertainty is presented. Through the framework, an enterprise-wide optimization problem is formulated and solved under uncertain conditions, to identify the network (composed of raw materials, process technologies…

  14. From production-oriented farming towards multifunctional entrepreneurship : exploring the underlying learning process

    NARCIS (Netherlands)

    Seuneke, P.L.M.

    2014-01-01

    This thesis unravels the learning process underlying the switch from conventional production-oriented farming towards 'multifunctional entrepreneurship'. In other words: the process by which former production-oriented farmers (men, women and their families) re-invent themselves as multifunctional entrepreneurs.

  15. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    Summary of the most important points from the main report: Dokumentation og evaluering af Coding Class.

  16. Supporting the Cybercrime Investigation Process: Effective Discrimination of Source Code Authors Based on Byte-Level Information

    Science.gov (United States)

    Frantzeskou, Georgia; Stamatatos, Efstathios; Gritzalis, Stefanos

    Source code authorship analysis is the particular field that attempts to identify the author of a computer program by treating each program as a linguistically analyzable entity. This is usually based on other undisputed program samples from the same author. There are several cases where the application of such a method could be of major benefit, such as tracing the source of code left in the system after a cyber attack, authorship disputes, proof of authorship in court, etc. In this paper, we present our approach, which is based on byte-level n-gram profiles and is an extension of a method that has been successfully applied to natural language text authorship attribution. We propose a simplified profile and a new similarity measure which is less complicated than the algorithm followed in text authorship attribution and seems more suitable for source code identification, since it is better able to deal with very small training sets. Experiments were performed on two different data sets, one with programs written in C++ and the second with programs written in Java. Unlike the traditional language-dependent metrics used by previous studies, our approach can be applied to any programming language with no additional cost. The presented accuracy rates are much better than the best reported results for the same data sets.
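
    The profile-and-similarity idea described above can be sketched in a few lines. This is a minimal illustration under stated assumptions: character-level n-grams stand in for byte-level ones, a Jaccard overlap of the top n-gram sets stands in for the paper's simplified similarity measure, and all sample strings are invented.

```python
from collections import Counter

# Sketch of n-gram authorship attribution in the spirit of the method above:
# build a profile from the most frequent n-grams, compare profiles by overlap.

def profile(source: str, n: int = 3, size: int = 500) -> set:
    grams = Counter(source[i:i + n] for i in range(len(source) - n + 1))
    return {g for g, _ in grams.most_common(size)}

def similarity(p: set, q: set) -> float:
    return len(p & q) / max(len(p | q), 1)

author_a = "for (int i = 0; i < n; ++i) { sum += v[i]; }"
author_b = "while(x>0){x--;total=total+x;}"
unknown  = "for (int j = 0; j < m; ++j) { acc += w[j]; }"

pa, pb, pu = profile(author_a), profile(author_b), profile(unknown)
print("A:", similarity(pa, pu), "B:", similarity(pb, pu))
```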

  17. A mechanistic model of critical heat flux under subcooled flow boiling conditions for application to one- and three-dimensional computer codes

    Energy Technology Data Exchange (ETDEWEB)

    Le Corre, Jean-Marie, E-mail: lecorrjm@westinghouse.co [Westinghouse Electric Sweden AB, 72163 Vaesteras (Sweden); Department of Mechanical Engineering, Carnegie Mellon University, Pittsburgh, PA 15213 (United States); Yao, Shi-Chune [Department of Mechanical Engineering, Carnegie Mellon University, Pittsburgh, PA 15213 (United States); Amon, Cristina H. [Department of Mechanical Engineering, Carnegie Mellon University, Pittsburgh, PA 15213 (United States); Faculty of Applied Science and Engineering, University of Toronto, Toronto, ON M5S 1A4 (Canada)

    2010-02-15

    Based on a review of visual observations at or near critical heat flux (CHF) under subcooled flow boiling conditions and consideration of CHF triggering mechanisms, presented in a companion paper [Le Corre, J.M., Yao, S.C., Amon, C.H., 2010. Two-phase flow regimes and mechanisms of critical heat flux under subcooled flow boiling conditions. Nucl. Eng. Des.], a model using a two-dimensional transient thermal analysis of the heater undergoing nucleation was developed to mechanistically predict CHF in the case of a bubbly flow regime. The model simulates the spatial and temporal heater temperature variations during nucleation at the wall, accounting for the stochastic nature of the boiling phenomena. It is postulated that a high local wall superheat occurring underneath a nucleating bubble at the time of bubble departure can prevent wall rewetting at CHF (Leidenfrost effect). The model also has the potential to evaluate the post-DNB heater temperature up to the point of heater melting. Validation of the proposed model was performed using detailed measured wall boiling parameters near CHF, thereby bypassing most needed constitutive relations. It was found that, under limiting nucleation conditions, a peak wall temperature at the time of bubble departure can be reached at CHF, preventing wall cooling by quenching. The simulations show that the resulting dry patch can survive the surrounding quenching events, preventing further nucleation and leading to a fast heater temperature increase. The model was applied at CHF conditions in simple geometry coupled with one-dimensional and three-dimensional (CFD) codes. It was found that, within the range where CHF occurs under bubbly flow conditions (as defined in Le Corre et al., 2010), the local wall superheat underneath nucleating bubbles is predicted to reach the Leidenfrost temperature. However, a better knowledge of statistical variations in wall boiling parameters would be necessary to correctly capture the CHF trends with…

  18. SYSMOD: user-interface for data processing, calculation codes and analysis of PWR lattices; SYSMOD: una interfase-usuario para el procesamiento, calculo y analysis de redes PWR

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, Alejandro; Milian, Daniel [Instituto Superior de Ciencias y Tecnologias Nucleares (ISCTN), La Habana (Cuba). E-mail: agg@ctn.isctn.edu.cu

    2000-07-01

    Physical reactor calculations demand the management of a large volume of information and comprise stages of data processing, calculation, and analysis of the results. These stages are highly sensitive to human error, so it is essential to automate them, keeping the whole process tracked against mistakes or unexpected results. The user interface SYSMOD was developed on the IDE Delphi 3.0 platform, an event-driven visual language. It consists of a main menu whose options include the preparation of the input data (File and Edit) for the pre-processors of the reactor calculation codes. The output information may be shown in graphic and/or alphanumeric format (Data-Process). SYSMOD supports two applications for database management: one for the data prepared as input for the pre-processors of the spectral calculation, and one for the organization, conservation and presentation of the obtained results. The lattice and global codes are executed from this application over the MS-DOS platform (Run). SYSMOD also offers the possibility of debugging the codes (Debugging), as well as running the benchmarks qualified for that purpose (Benchmark). SYSMOD has been applied to the analysis of the WWER-440 of the first unit of the Juragua Nuclear Power Plant. (author)

  19. Genetic Code Analysis Toolkit: A novel tool to explore the coding properties of the genetic code and DNA sequences

    Science.gov (United States)

    Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.

    2018-01-01

    The genetic code is degenerated and it is assumed that redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under current research. This paper presents a Genetic Code Analysis Toolkit (GCAT) which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and others. GCAT comes with a fertile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: open source. Homepage: http://www.gcat.bio/
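
    Comma-freeness, one of the properties the toolkit tests, translates directly into code. A minimal sketch of the test (GCAT itself is Java-based; this illustration uses Python, and the codon sets are invented):

```python
from itertools import product

# A set of codons C is comma-free if no codon of C appears at a misaligned
# reading-frame offset inside the concatenation of any two codons of C.

def is_comma_free(codons: set) -> bool:
    for x, y in product(codons, repeat=2):
        pair = x + y
        for offset in (1, 2):                    # misaligned reading frames
            if pair[offset:offset + 3] in codons:
                return False
    return True

print(is_comma_free({"ACG", "TCG"}))   # True: no misframed occurrence
print(is_comma_free({"AAA"}))          # False: AAA+AAA contains AAA at offset 1
```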

  20. Melting and evaporation analysis of the first wall in a water-cooled breeding blanket module under vertical displacement event by using the MARS code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Geon-Woo [Department of Nuclear Engineering, Seoul National University, 1 Gwanak-ro, Gwanak-gu, Seoul 08826 (Korea, Republic of); Cho, Hyoung-Kyu, E-mail: chohk@snu.ac.kr [Department of Nuclear Engineering, Seoul National University, 1 Gwanak-ro, Gwanak-gu, Seoul 08826 (Korea, Republic of); Park, Goon-Cherl [Department of Nuclear Engineering, Seoul National University, 1 Gwanak-ro, Gwanak-gu, Seoul 08826 (Korea, Republic of); Im, Kihak [National Fusion Research Institute, 169-148 Gwahak-ro, Yuseong-gu, Daejeon 34133 (Korea, Republic of)

    2017-05-15

    Highlights: • Material phase change of the first wall was simulated for a vertical displacement event. • An in-house first wall module was developed to simulate melting and evaporation. • An effective heat capacity method and an evaporation model were proposed. • The MARS code was used to predict two-phase phenomena in the coolant channel. • The phase change simulation was performed by coupling MARS and the in-house module. - Abstract: Plasma facing components of tokamak reactors such as ITER or the Korean fusion demonstration reactor (K-DEMO) can be subjected to damage by plasma instabilities. Plasma disruptions like a vertical displacement event (VDE), with high heat flux, can cause melting and vaporization of plasma facing materials and burnout of coolant channels. In this study, to simulate melting and vaporization of the first wall in a water-cooled breeding blanket under a VDE, one-dimensional heat equations were solved numerically by using an in-house first wall module, including phase change models, an effective heat capacity method, and an evaporation model. For thermal-hydraulics, the in-house first wall analysis module was coupled with the nuclear reactor safety analysis code MARS, to take advantage of its prediction capability for two-phase flow and critical heat flux (CHF) occurrence. The first wall model followed the conceptual design of the K-DEMO, and a plasma disruption heat flux of 600 MW/m² for 0.1 s was applied. The phase change simulation results were analyzed in terms of the melting and evaporation thicknesses and the occurrence of CHF. The thermal integrity of the blanket first wall is discussed to confirm whether the structural material melts under the given conditions.
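
    The effective heat capacity method mentioned above has a simple core: the latent heat of fusion is smeared over a narrow temperature band around the melting point, so a standard conduction update can march through the phase change. A minimal explicit 1-D sketch with invented, loosely tungsten-like properties (not the authors' module):

```python
import numpy as np

# Effective heat capacity method for melting: latent heat L is smeared over
# [Tm - dT, Tm + dT] so the explicit 1-D heat equation update is unchanged.

nx, dx, dt = 50, 1e-4, 1e-4          # cells, cell size [m], time step [s]
rho, k, cp = 19_000.0, 100.0, 150.0  # density, conductivity, heat capacity
L, Tm, dT = 2.2e5, 3_695.0, 10.0     # latent heat, melt point, half-range
q_surface = 600e6                     # disruption heat flux [W/m^2]

T = np.full(nx, 600.0)               # initial temperature [K]
for _ in range(1_000):               # 0.1 s of heating
    cp_eff = cp + np.where(np.abs(T - Tm) < dT, L / (2 * dT), 0.0)
    lap = np.zeros(nx)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    lap[0] = (T[1] - T[0]) / dx**2                    # front cell conduction
    T += dt * k * lap / (rho * cp_eff)
    T[0] += dt * q_surface / (rho * cp_eff[0] * dx)   # heated front face
    T[-1] = 600.0                                     # cooled back face

print(f"peak temperature: {T.max():.0f} K, molten cells: {(T > Tm).sum()}")
```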

  1. Quality assessment of baby food made of different pre-processed organic raw materials under industrial processing conditions.

    Science.gov (United States)

    Seidel, Kathrin; Kahl, Johannes; Paoletti, Flavio; Birlouez, Ines; Busscher, Nicolaas; Kretzschmar, Ursula; Särkkä-Tirkkonen, Marjo; Seljåsen, Randi; Sinesio, Fiorella; Torp, Torfinn; Baiamonte, Irene

    2015-02-01

    The market for processed food is rapidly growing. The industry needs methods for "processing with care" leading to high quality products in order to meet consumers' expectations. Processing influences the quality of the finished product through various factors. In carrot baby food, these are the raw material, the pre-processing and storage treatments as well as the processing conditions. In this study, a quality assessment was performed on baby food made from different pre-processed raw materials. The experiments were carried out under industrial conditions using fresh, frozen and stored organic carrots as raw material. Statistically significant differences were found for sensory attributes among the three autoclaved puree samples (e.g. overall odour F = 90.72, p < …). Purees processed from frozen carrots show increased moisture content and a decrease of several chemical constituents. Biocrystallization identified changes between replications of the cooking. Pre-treatment of raw material has a significant influence on the final quality of the baby food.

  2. Software supervisor: extension to the on-line codes utilization in order to help the process control

    International Nuclear Information System (INIS)

    Thomas, J.B.; Dumas, M.; Evrard, J.M.

    1988-01-01

    Calculation supervision is a complex problem, usually solved by human experts. The complexity and the potential of the software are increasing, and introducing calculations into real-time systems requires additional specifications. These aims can be achieved by means of the control of knowledge-based systems, as well as by introducing these software techniques into the existing computer world. The following examples are given: the automatic generation of calculation methods (in control language) in modular code systems; and the monitoring of calculations by expert systems, in order to assist on-line operations. [fr]

  3. AXAIR: A Computer Code for SAR Assessment of Plume-Exposure Doses from Potential Process-Accident Releases to Atmosphere

    Energy Technology Data Exchange (ETDEWEB)

    Pillinger, W.L.

    2001-05-17

    This report describes the AXAIR computer code which is available to terminal users for evaluating the doses to man from exposure to the atmospheric plume from postulated stack or building-vent releases at the Savannah River Plant. The emphasis herein is on documentation of the methodology only. The total-body doses evaluated are those that would be exceeded only 0.5 percent of the time based on worst-sector, worst-case meteorological probability analysis. The associated doses to other body organs are given in the dose breakdowns by radionuclide, body organ and pathway.
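
    The "exceeded only 0.5 percent of the time" criterion is a percentile statement over the meteorological ensemble. A minimal sketch with invented per-weather-case doses standing in for AXAIR's worst-sector probability analysis:

```python
import numpy as np

# Sketch of the 99.5th-percentile ("exceeded only 0.5% of the time") dose
# selection described above, using invented per-weather-case doses.

rng = np.random.default_rng(1)
doses = rng.lognormal(mean=-2.0, sigma=1.0, size=10_000)  # rem, hypothetical

dose_995 = np.percentile(doses, 99.5)
print(f"dose exceeded 0.5% of the time: {dose_995:.3f} rem")
```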

  4. Opening up codings?

    DEFF Research Database (Denmark)

    Steensig, Jakob; Heinemann, Trine

    2015-01-01

    We welcome Tanya Stivers's discussion (Stivers, 2015/this issue) of coding social interaction and find that her descriptions of the processes of coding open up important avenues for discussion, among them the precise ad hoc considerations that researchers need to bear in mind, both when…

  5. Clinical reasoning process underlying choice of teaching strategies: a framework to improve occupational therapists' transfer skill interventions.

    Science.gov (United States)

    Carrier, Annie; Levasseur, Mélanie; Bédard, Denis; Desrosiers, Johanne

    2012-10-01

    Clinical reasoning, a critical skill influenced by education and practice context, determines how occupational therapists teach transfer skills. Teaching strategies affect intervention efficacy. Although knowledge about the way teaching strategies are chosen could help improve interventions, few studies have considered this aspect. Therefore, the aim of this study was to explore the clinical reasoning process of occupational therapists underlying the choice of strategies to teach older adults transfer skills. A grounded theory study was carried out with eleven community occupational therapists recruited in six Health and Social Services Centres in Québec, Canada. Data were collected through observations of teaching situations (n = 31), in-depth semi-structured interviews (n = 12) and memos, and were analysed using constant comparative methods. Memos were also used to raise codes to conceptual categories, leading to an integrative framework. Rigour was assured by following scientific criteria for qualitative studies. The integrative framework includes the clinical reasoning process, consisting of eight stages, and its factors of influence. These factors are internal (experiences and elements of personal context) and external (type of transfer, clients' and their environment's characteristics and practice context). The clinical reasoning process underlying the choice of strategies to teach transfer skills was conceptualised into an integrative framework. Such a framework supports clinicians' reflective practice, highlights the importance of theory and practice of pedagogy in occupational therapists' education, and encourages consideration and better documentation of the possible influence of practice context on teaching interventions. As such, this integrative framework could improve occupational therapists' transfer skill interventions with older adults. © 2012 The Authors Australian Occupational Therapy Journal © 2012 Occupational Therapy Australia.

  6. An algebraic approach to graph codes

    DEFF Research Database (Denmark)

    Pinero, Fernando

    … theory as evaluation codes. Chapter three consists of the introduction to graph based codes, such as Tanner codes and graph codes. In Chapter four, we compute the dimension of some graph based codes with a result combining graph based codes and subfield subcodes. Moreover, some codes in chapter four are optimal or best known for their parameters. In chapter five we study some graph codes with Reed–Solomon component codes. The underlying graph is well known and widely used for its good characteristics. This helps us to compute the dimension of the graph codes. We also introduce a combinatorial concept related to the iterative encoding of graph codes with MDS component code. The last chapter deals with affine Grassmann codes and Grassmann codes. We begin with some previously known codes and prove that they are also Tanner codes of the incidence graph of the point–line partial geometry…

  7. Bar Code Labels

    Science.gov (United States)

    1988-01-01

    American Bar Codes, Inc. developed special bar code labels for inventory control of space shuttle parts and other space system components. ABC labels are made in a company-developed aluminum anodizing process and consecutively marked with bar code symbology and human readable numbers. They offer extreme abrasion resistance and indefinite resistance to ultraviolet radiation, capable of withstanding 700 degree temperatures without deterioration and up to 1400 degrees with special designs. They offer high resistance to salt spray, cleaning fluids and mild acids. ABC is now producing these bar code labels commercially for industrial customers who also need labels to resist harsh environments.

  8. Improvement of the MSG code for the MONJU evaporators. Additional function of reverse flow calculation on water/steam model and animation for post processing

    International Nuclear Information System (INIS)

    Toda, Shin-ichi; Yoshikawa, Shinji; Oketani, Kazuhiro

    2003-05-01

    The improved version of the MSG code (Multi-dimensional Thermal-hydraulic Analysis Code for Steam Generators) has been released. The original version was improved in order to calculate reverse flow on the water/steam side and to animate the post-processing data. To calculate reverse flow locally, the code was modified to set the pressure at each divided node point of the water/steam region in the helical-coil heat transfer tubes. The matrix solver was also improved so that the problem can be treated within a practical calculation time despite the increased number of pressure points. In this case pressure and enthalpy have to be calculated simultaneously; however, it was found that the block-Jacobi method yields a diagonally dominant matrix, which can be solved efficiently with a relaxation method. Calculations of a steady-state condition and of a transient of SG blowdown with manual trip operation confirmed the improved calculation capability of the MSG code. An animation function for the temperature contour on the sodium shell side has been added as post-processing. Since the animation is very effective for understanding the thermal-hydraulic behavior on the sodium shell side of the SG, especially under transient conditions, the analysis and evaluation of the calculation results will be quicker and more effective. (author)
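
    The solver observation above (a diagonally dominant matrix handled by relaxation) is easy to illustrate. A minimal sketch with an invented 3×3 system, using weighted Jacobi relaxation; this is not the MSG solver itself:

```python
import numpy as np

# Relaxation (weighted Jacobi) iteration on a diagonally dominant system,
# the situation a block-Jacobi arrangement of the pressure points produces.

def relaxation_solve(A, b, omega=0.8, tol=1e-10, max_iter=10_000):
    D = np.diag(A)
    x = np.zeros_like(b)
    for _ in range(max_iter):
        residual = b - A @ x
        x = x + omega * residual / D      # relax toward the Jacobi update
        if np.linalg.norm(residual) < tol:
            break
    return x

A = np.array([[ 4.0, -1.0,  0.0],
              [-1.0,  4.0, -1.0],
              [ 0.0, -1.0,  4.0]])        # diagonally dominant
b = np.array([1.0, 2.0, 3.0])
x = relaxation_solve(A, b)
print(x, np.allclose(A @ x, b))
```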

  9. Use of a Single CPT Code for Risk Adjustment in American College of Surgeons NSQIP Database: Is There Potential Bias with Practice-Pattern Differences in Multiple Procedures under the Same Anesthetic?

    Science.gov (United States)

    Cohen, Mark E; Liu, Yaoming; Liu, Jason B; Ko, Clifford Y; Hall, Bruce L

    2018-03-01

    American College of Surgeons NSQIP risk-adjustment models rely on the designated "principal" CPT code to account for procedure-related risk. However, if hospitals differ in their propensity to undertake multiple major operations under the same anesthetic, then risk adjustment using only a single code could bias hospital quality estimates. This study investigated this possibility of bias. We examined hospital odds ratios (ORs) when either the principal CPT code was used for risk adjustment (along with other standard NSQIP predictor variables) or when this code was used in addition to the remaining reported CPT code with the highest associated risk. We examined models for general surgery mortality and morbidity, and morbidity in datasets that included mastectomy and/or breast reconstruction, or hysterectomy and/or gynecologic reconstruction, as areas known to likely involve more than 1 procedure. Hospital ORs based on 1 vs 2 CPT codes for risk adjustment were essentially the same for mortality and morbidity general surgery models and for the mastectomy and/or breast reconstruction model. For hysterectomy and/or gynecologic reconstruction, the 1 CPT code model tended to slightly overestimate ORs compared with the 2 CPT codes model, when the hospital's OR and the proportion of combined operations were large. Conditions under which practice-pattern-associated modeling bias might exist appear to be uncommon and to have a small impact on quality assessments for the areas examined. The evidence suggests that, within the American College of Surgeons NSQIP modeling paradigm, the principal CPT code adequately risk adjusts for operative procedure in performance assessments. Copyright © 2018 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  10. Identification of important phenomena under sodium fire accidents based on PIRT process with factor analysis in sodium-cooled fast reactor

    International Nuclear Information System (INIS)

    Aoyagi, Mitsuhiro; Uchibori, Akihiro; Kikuchi, Shin; Takata, Takashi; Ohno, Shuji; Ohshima, Hiroyuki

    2016-01-01

    The PIRT (Phenomena Identification and Ranking Table) process is an effective method to identify key phenomena involved in safety issues in nuclear power plants. The present PIRT process aims at validating sodium fire analysis codes. Because a sodium fire accident in a sodium-cooled fast reactor (SFR) involves complex phenomena, various figures of merit (FOMs) could exist in this PIRT process. In addition, the importance evaluation of phenomena for each FOM should be implemented in an objective manner under the PIRT process. This paper describes the methodology for the specification of FOMs, the identification of associated phenomena, and the importance evaluation of each associated phenomenon, in order to complete a ranking table of important phenomena involved in a sodium fire accident in an SFR. The FOMs were specified through factor analysis in this PIRT process. Physical parameters to be quantified by a sodium fire analysis code were identified by considering concerns resulting from sodium fire in the factor analysis. Associated phenomena were identified through element- and sequence-based phenomena analyses, as is often done in PIRT processes. The importance of each associated phenomenon was evaluated by considering the sequence-based analysis of associated phenomena correlated with the FOMs. Then, we complete the ranking table through the factor and phenomenon analyses. (author)

  11. Methodology for Optimization of Process Integration Schemes in a Biorefinery under Uncertainty

    Directory of Open Access Journals (Sweden)

    Meilyn González-Cortés

    2017-01-01

    Full Text Available Uncertainty has a great impact on investment decisions, plant operability, and the feasibility of integration opportunities in chemical processes. This paper presents the steps to consider in the optimization of process-integration investments under conditions of uncertainty. The potential of sugarcane biomass for integration with several plants in a biorefinery scheme producing chemical products and thermal and electric energy is shown. Among the factories with potential for this integration are pulp and paper and sugar factories and other derivative processes. These factories share common resources and also have a variety of products that can be exchanged between them, so that certain products generated in one of them can be raw material in another plant. The methodology developed guides towards feasible investment projects under uncertainty. The objective function considered was the maximization of the net present value in the different scenarios generated from the integration scheme.

  12. The Investigations of Friction under Die Surface Vibration in Cold Forging Process

    DEFF Research Database (Denmark)

    Jinming, Sha

    The objective of this thesis is to fundamentally study the influence of die surface vibration on friction under low frequency in metal forging processes. The research includes vibrating tool system design for metal forming, theoretical and experimental investigations, and finite element simulations on die surface vibration in the forging process. After a general introduction to friction mechanisms and friction test techniques in metal forming, the application of ultrasonic vibration in metal forming and the influence of sliding velocity on friction are described. Some earlier investigations of the application of ultrasonic vibration on drawing, rolling and other metal forming processes show that the load and friction coefficient would be decreased with the presence of ultrasonic vibration. Investigations on forging processes and under low frequency, especially the quantitative analysis of friction…

  13. Plane strain bending under tension as an ideal flow process in pressure-dependent plasticity

    Directory of Open Access Journals (Sweden)

    Alexandrov Sergei

    2017-01-01

    Full Text Available Ideal plastic flows are those for which all material elements follow minimum work paths. Ideal flow solutions are widely used as the basis for inverse methods for the preliminary design of metalworking processes. The present paper provides the first ideal flow solution in pressure-dependent plasticity. In particular, the process of bending under tension is considered and it is shown that there are relations between the bending moment and tensile force that result in ideal flow paths.

  14. A Dependent Insurance Risk Model with Surrender and Investment under the Thinning Process

    Directory of Open Access Journals (Sweden)

    Wenguang Yu

    2015-01-01

    Full Text Available A dependent insurance risk model with surrender and investment under the thinning process is discussed, where the arrival of the policies follows a compound Poisson-Geometric process, and the occurrences of claims and surrenders happen as the p-thinning process and the q-thinning process of the arrival process, respectively. Using martingale theory, the properties of the surplus process, the adjustment coefficient equation, the upper bound of the ruin probability, and an explicit expression for the ruin probability are obtained. Moreover, we also obtain the Laplace transform, the expectation, and the variance of the time when the surplus reaches a given level for the first time. Finally, various trends of the upper bound of the ruin probability and of the expectation and variance of the first passage time are simulated analytically while varying the investment size, investment interest rates, claim rate, and surrender rate.
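
    A Monte Carlo sketch may clarify the thinning construction: each policy arrival brings a premium and is thinned into a claim with probability p or a surrender with probability q. All rates and amounts here are invented, and simulation replaces the paper's analytic martingale bounds:

```python
import random

# Toy simulation of the surplus process: one policy arrival per step brings
# a premium; the arrival is thinned into a claim (prob. p) or a surrender
# refund (prob. q). Ruin probability is estimated over many runs.

def ruin_probability(u0=10.0, premium=1.0, p=0.3, q=0.1,
                     horizon=1_000, runs=5_000, seed=7):
    rng = random.Random(seed)
    ruined = 0
    for _ in range(runs):
        surplus = u0
        for _ in range(horizon):
            surplus += premium
            r = rng.random()
            if r < p:
                surplus -= rng.expovariate(1 / 2.5)   # claim amount
            elif r < p + q:
                surplus -= premium                     # surrender refund
            if surplus < 0:
                ruined += 1
                break
    return ruined / runs

print(ruin_probability())
```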

  15. Computer codes in particle transport physics

    International Nuclear Information System (INIS)

    Pesic, M.

    2004-01-01

    … is given. The importance of validation and verification of data and computer codes is briefly underlined. Examples of applications of the MCNPX, FLUKA and SHIELD codes to the simulation of processes in nature, from reactor physics, ion medical therapy, cross-section calculations and the design of accelerator-driven sub-critical systems to astrophysics and the shielding of spaceships, are shown. More reliable and more complete cross-section data in the intermediate- and high-energy range for particle transport and interactions with matter are expected in the near future, as a result of new experimental investigations under way to validate the theoretical models currently applied in the codes. These new data libraries are expected to be much larger and more comprehensive than the existing ones, requiring more computer memory and faster CPUs. Updated versions of the codes to be developed in the future, besides sequential computation versions, will also include MPI or PVM options to allow faster running of the code at an acceptable cost for an end-user. A new option to be implemented in the codes is expected too: an end-user-written application for a particular problem could be added relatively simply to the general source code script. Initial work on the full implementation of a graphical user interface for preparing input and analysing output of the codes, and on the ability to interrupt and/or continue code running, should be upgraded to a user-friendly level. (author)

  16. HCPCS Coding: An Integral Part of Your Reimbursement Strategy.

    Science.gov (United States)

    Nusgart, Marcia

    2013-12-01

    The first step to a successful reimbursement strategy is to ensure that your wound care product has the most appropriate Healthcare Common Procedure Coding System (HCPCS) code (or billing code) for your product. The correct HCPCS code plays an essential role in patient access to new and existing technologies. When devising a strategy to obtain a HCPCS code for its product, a company must consider a number of factors as follows: (1) Has the product gone through the Food and Drug Administration (FDA) regulatory process or does it need to do so? Will the FDA code designation impact which HCPCS code will be assigned to your product? (2) In what "site of service" do you intend to market your product? Where will your customers use the product? Which coding system (CPT® or HCPCS) applies to your product? (3) Does a HCPCS code for a similar product already exist? Does your product fit under the existing HCPCS code? (4) Does your product need a new HCPCS code? What is the linkage, if any, between coding, payment, and coverage for the product? Researchers and companies need to start early and place the same emphasis on a reimbursement strategy as they do on a regulatory strategy. Your reimbursement strategy staff should be involved early in the process, preferably during product research and development and clinical trial discussions.

  17. The Orexin Component of Fasting Triggers Memory Processes Underlying Conditioned Food Selection in the Rat

    Science.gov (United States)

    Ferry, Barbara; Duchamp-Viret, Patricia

    2014-01-01

    To test the selectivity of the orexin A (OXA) system in olfactory sensitivity, the present study compared the effects of fasting and of central infusion of OXA on the memory processes underlying odor-malaise association during the conditioned odor aversion (COA) paradigm. Animals implanted with a cannula in the left ventricle received ICV infusion…

  18. Identifying the Neural Correlates Underlying Social Pain: Implications for Developmental Processes

    Science.gov (United States)

    Eisenberger, Naomi I.

    2006-01-01

    Although the need for social connection is critical for early social development as well as for psychological well-being throughout the lifespan, relatively little is known about the neural processes involved in maintaining social connections. The following review summarizes what is known regarding the neural correlates underlying feeling of…

  19. Modeling and Compensatory Processes Underlying Involvement in Child Care among Kibbutz-Reared Fathers

    Science.gov (United States)

    Gaunt, Ruth; Bassi, Liat

    2012-01-01

    This study examined modeling and compensatory processes underlying the effects of an early paternal model on father involvement in child care. Drawing on social learning theory, it was hypothesized that father-son relationships would moderate the association between a father's involvement and his own father's involvement. A sample of 136 kibbutz…

  20. High paraffin Kumkol petroleum processing under fuel and lubricant petroleum scheme

    International Nuclear Information System (INIS)

    Nadirov, N.K.; Konaev, Eh.N.

    1997-01-01

    The technological feasibility of processing high-paraffin Kumkol petroleum under the fuel and lubricant scheme, with production of lubricant materials in short supply, combustible materials and technical paraffin, is shown. Putting a mini petroleum-processing block into operation at the Kumkol deposit is economically reasonable and raises the profitability of hydrocarbon raw material production. (author)

  1. 78 FR 70088 - Agency Proposed Business Process Vision Under the Rehabilitation Act of 1973

    Science.gov (United States)

    2013-11-22

    SOCIAL SECURITY ADMINISTRATION [Docket No. SSA-2013-0042] Agency Proposed Business Process Vision Under the Rehabilitation Act of 1973. AGENCY: Social Security Administration (SSA). ACTION: Notice of… … individuals who: are blind or visually impaired; are deaf or hard of hearing; have cognitive or learning…

  2. Preparedness Formation of the Future Vocational Education Teachers to Occupational Adaptation under Conditions of Globalization Processes

    Science.gov (United States)

    Sushentseva, Liliya

    2014-01-01

    The problem of the preparedness formation of future teachers of vocational training to the professional adaptation under conditions of globalization processes in society is considered. The analysis of scientific and educational literature devoted to the study of occupational adaptation and preparedness formation of specialists to it is carried…

  3. Cognitive Processes Underlying Women's Risk Judgments: Associations with Sexual Victimization History and Rape Myth Acceptance

    Science.gov (United States)

    Yeater, Elizabeth A.; Treat, Teresa A.; Viken, Richard J.; McFall, Richard M.

    2010-01-01

    Objective: This study evaluated the effects of sexual victimization history, rape myth acceptance, implicit attention, and recent learning on the cognitive processes underlying undergraduate women's explicit risk judgments. Method: Participants were 194 undergraduate women between 18 and 24 years of age. The sample was ethnically diverse and…

  4. Material processing of convection-driven flow field and temperature distribution under oblique gravity

    Science.gov (United States)

    Hung, R. J.

    1995-01-01

    A set of mathematical formulations is adopted to study vapor deposition from source materials driven by the heat transfer process under normal and oblique directions of gravitational acceleration in an extremely low pressure environment of 10⁻² mm Hg. A series of time animations of the initiation and development of flow and temperature profiles during the course of vapor deposition has been obtained through numerical computation. Computations show that vapor deposition is accomplished by the transfer of vapor through a fairly complicated recirculating flow pattern under normal-direction gravitational acceleration. There is clearly no way to produce homogeneous thin crystalline films with fine grains under such a complicated recirculating flow pattern with a non-uniform temperature distribution. There is no vapor deposition under reversed normal-direction gravitational acceleration, because the medium is stably stratified, without convection. Vapor deposition under oblique-direction gravitational acceleration introduces a reduced gravitational component in the vertical direction, which is favorable for producing homogeneous thin crystalline films. However, oblique-direction gravitational acceleration also induces an unfavorable gravitational component along the horizontal direction, which initiates a complicated recirculating flow pattern. In other words, it is necessary to carry out vapor deposition under reduced gravity in future space shuttle experiments, with an extremely low pressure environment, to produce homogeneous crystalline films with fine grains. Fluid mechanics simulation can be used as a tool to suggest the most promising experimental setup for processing the best nonlinear optical materials.

  5. Ultrasonic signal processing and B-SCAN imaging for nondestructive testing. Application to under - cladding - cracks

    International Nuclear Information System (INIS)

    Theron, G.

    1988-02-01

    Crack propagation under the stainless steel cladding of nuclear reactor vessels is monitored by ultrasonic testing. This work studies signal processing to improve the detection and sizing of defects. Two possibilities are examined: processing of each individual signal, and simultaneous processing of all the signals forming a B-SCAN image. A bibliographic study of time-frequency methods shows that they are not suitable for pulses; decomposition into instantaneous frequency and envelope is therefore used. The effect of the interference of two close echoes on the instantaneous frequency is studied. The deconvolution of B-SCAN images is performed using the transducer field. A point-by-point deconvolution method, less sensitive to noise, is developed. B-SCAN images are processed in two phases: interface signal processing and deconvolution. These calculations improve image accuracy and dynamics. The water-steel interface and the ferritic-austenitic interface are separated. Echoes from the crack tip are visualized and crack-hole discrimination is improved. [fr]

  6. Working through the pain: working memory capacity and differences in processing and storage under pain.

    Science.gov (United States)

    Sanchez, Christopher A

    2011-02-01

    It has been suggested that pain perception and attention are closely linked at both a neural and a behavioural level. If pain and attention are so linked, it is reasonable to speculate that those who vary in working memory capacity (WMC) should be affected by pain differently. This study compares the performance of individuals who differ in WMC as they perform processing and memory span tasks while under mild pain and not. While processing performance under mild pain does not interact with WMC, the ability to store information for later recall does. This suggests that pain operates much like an additional processing burden, and that the ability to overcome this physical sensation is related to differences in WMC. © 2011 Psychology Press, an imprint of the Taylor & Francis Group, an Informa business

  7. An accurate European option pricing model under Fractional Stable Process based on Feynman Path Integral

    Science.gov (United States)

    Ma, Chao; Ma, Qinghua; Yao, Haixiang; Hou, Tiancheng

    2018-03-01

    In this paper, we propose to use the Fractional Stable Process (FSP) for option pricing. The FSP is one of the few candidates to directly model a number of desired empirical properties of asset price risk neutral dynamics. However, pricing the vanilla European option under the FSP is difficult and problematic. In this paper, built upon the developed Feynman Path Integral inspired techniques, we present a novel computational model for option pricing, i.e. the Fractional Stable Process Path Integral (FSPPI) model under a general fractional stable distribution, that tackles this problem. Numerical and empirical experiments show that the proposed pricing model provides a correction of the Black-Scholes pricing error: overpricing long-term options, underpricing short-term options; overpricing out-of-the-money options, underpricing in-the-money options, without any additional structures such as stochastic volatility or a jump process.

  8. Network Coding

    Indian Academy of Sciences (India)

    … message symbols downstream, network coding achieves vast performance gains by permitting intermediate nodes to carry out algebraic operations on the incoming data. In this article we present a tutorial introduction to network coding as well as an application to the efficient operation of distributed data-storage networks.
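
    The gain from letting intermediate nodes operate algebraically on data is easiest to see on the classic butterfly network, where the bottleneck node forwards the XOR of its two inputs instead of choosing one of them. A minimal sketch:

```python
# Butterfly network illustration: the bottleneck node sends a ^ b, and each
# sink recovers both messages by XOR-ing the coded packet with the message
# it already receives directly.

a = 0b10110100          # message bit-block for sink 2 via the direct edge
b = 0b01101001          # message bit-block for sink 1 via the direct edge

coded = a ^ b           # bottleneck combines instead of forwarding one input

sink1 = (b, coded)      # receives b directly plus the coded packet
sink2 = (a, coded)

recovered_a = sink1[0] ^ sink1[1]   # b ^ (a ^ b) = a
recovered_b = sink2[0] ^ sink2[1]   # a ^ (a ^ b) = b
assert (recovered_a, recovered_b) == (a, b)
print(bin(recovered_a), bin(recovered_b))
```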

  9. Executive Functioning and Processing Speed in Age-Related Differences in Memory: Contribution of a Coding Task

    Science.gov (United States)

    Baudouin, Alexia; Clarys, David; Vanneste, Sandrine; Isingrini, Michel

    2009-01-01

    The aim of the present study was to examine executive dysfunctioning and decreased processing speed as potential mediators of age-related differences in episodic memory. We compared the performances of young and elderly adults in a free-recall task. Participants were also given tests to measure executive functions and perceptual processing speed…

  10. Filter clogging in coarse pore filtration activated sludge process under high MLSS concentration.

    Science.gov (United States)

    Moghaddam, M R Alavi; Guan, Y; Satoh, H; Mino, T

    2006-01-01

    Coarse pore filtration activated sludge process is a type of hybrid process in which the secondary settling tank of the conventional activated sludge process is replaced by non-woven and coarse pore filter modules. The filter has pores, which are irregular in shape, and much bigger than micro-filtration membrane pores in size. The objective of the study is to find out the effect of the microbial community structure on filter clogging in the coarse pore filtration activated sludge process under high MLSS concentration in aerobic and anoxic/aerobic (A/A) conditions. Filter clogging started from day 65 and 70 in the A/A and aerobic process, respectively, but it was more severe in the A/A process compared to that in the aerobic process. EPS contents of sludge did not change significantly during the operation in both processes, and did not have a crucial effect on the observed filter clogging. There was no strong evidence for direct effect of the type and number of metazoa on filter clogging. The main difference between aerobic sludge and A/A sludge during the filter clogging period was the relative abundance of filamentous bacteria. According to the obtained results, it can be concluded that a higher presence of filamentous bacteria could reduce the severity of filter clogging in a coarse pore filtration activated sludge process.

  11. BPLOM: BPM Level-Oriented Methodology for Incremental Business Process Modeling and Code Generation on Mobile Platforms

    Directory of Open Access Journals (Sweden)

    Jaime Solis Martines

    2013-06-01

    Full Text Available The requirements engineering phase is the departure point for the development process of any kind of computer application; it determines the functionality needed in the working scenario of the program. Although this is a crucial point in application development, as incorrect requirement definition leads to costly errors appearing in later stages of the development process, the involvement of application domain experts remains minor. In order to correct this scenario, business process modeling notations were introduced to favor business expert involvement in this phase, but notation complexity prevents this participation from reaching its ideal state. Hence, we promote the definition of a level-oriented business process methodology, which encourages the adaptation of the modeling notation to the modeling and technical knowledge shown by the expert. This approach reduces the complexity found by domain experts and enables them to model their processes completely, with a level of technical detail directly proportional to their knowledge.

  12. Improvement of the spallation-reaction simulation code by considering both the high-momentum intranuclear nucleons and the preequilibrium process

    International Nuclear Information System (INIS)

    Ishibashi, K.; Miura, Y.; Sakae, T.

    1990-01-01

    In the present study, intranuclear nucleons with a high momentum are introduced into the intranuclear cascade calculation, and preequilibrium effects are considered at the end of the cascade process. The improvements made in the HETC (High Energy Transport Code) are outlined, focusing on intranuclear nucleons with a high momentum and on the termination of the intranuclear cascade process. The cutoff energy is discussed, and Monte Carlo calculations based on an exciton model are presented and analyzed. The experimental high energy neutrons in the backward direction are successfully reproduced. The preequilibrium effect is considered in a local manner, and is introduced as a simple probability density function for terminating the intranuclear cascade process. The resultant neutron spectra reproduce the shoulders of the experimental data in the region of 20 to 50 MeV. The exciton model is coded with a Monte Carlo algorithm. The effect of the exciton model calculation is not appreciable except for intermediate energy neutrons in the backward direction. (N.K.)

  13. Phonological coding during reading

    Science.gov (United States)

    Leinenger, Mallorie

    2014-01-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early (pre-lexical) or that phonological codes come online late (post-lexical)) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eyetracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model (Van Orden, 1987), dual-route model (e.g., Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001), parallel distributed processing model (Seidenberg & McClelland, 1989)) are discussed. PMID:25150679

  14. Mapping Common Aphasia Assessments to Underlying Cognitive Processes and Their Neural Substrates.

    Science.gov (United States)

    Lacey, Elizabeth H; Skipper-Kallal, Laura M; Xing, Shihui; Fama, Mackenzie E; Turkeltaub, Peter E

    2017-05-01

    Understanding the relationships between clinical tests, the processes they measure, and the brain networks underlying them, is critical in order for clinicians to move beyond aphasia syndrome classification toward specification of individual language process impairments. To understand the cognitive, language, and neuroanatomical factors underlying scores of commonly used aphasia tests. Twenty-five behavioral tests were administered to a group of 38 chronic left hemisphere stroke survivors and a high-resolution magnetic resonance image was obtained. Test scores were entered into a principal components analysis to extract the latent variables (factors) measured by the tests. Multivariate lesion-symptom mapping was used to localize lesions associated with the factor scores. The principal components analysis yielded 4 dissociable factors, which we labeled Word Finding/Fluency, Comprehension, Phonology/Working Memory Capacity, and Executive Function. While many tests loaded onto the factors in predictable ways, some relied heavily on factors not commonly associated with the tests. Lesion symptom mapping demonstrated discrete brain structures associated with each factor, including frontal, temporal, and parietal areas extending beyond the classical language network. Specific functions mapped onto brain anatomy largely in correspondence with modern neural models of language processing. An extensive clinical aphasia assessment identifies 4 independent language functions, relying on discrete parts of the left middle cerebral artery territory. A better understanding of the processes underlying cognitive tests and the link between lesion and behavior may lead to improved aphasia diagnosis, and may yield treatments better targeted to an individual's specific pattern of deficits and preserved abilities.
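
    The factor-extraction step described above (reducing a patients-by-tests score matrix to a few latent components) can be sketched as follows; the data are random stand-ins for the study's 38 × 25 matrix, and the study's exact pipeline (e.g., any factor rotation) may differ:

```python
import numpy as np
from sklearn.decomposition import PCA

# Sketch of factor extraction from behavioral test scores:
# rows = patients, columns = tests, reduced to 4 latent components.

rng = np.random.default_rng(0)
scores = rng.normal(size=(38, 25))          # hypothetical z-scored test data

pca = PCA(n_components=4)
factors = pca.fit_transform(scores)         # per-patient factor scores
loadings = pca.components_                  # how each test loads on factors

print(factors.shape, loadings.shape)        # (38, 4) (4, 25)
print(pca.explained_variance_ratio_.round(2))
```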

  15. Emergency Department Processes for the Evaluation and Management of Persons Under Investigation for Ebola Virus Disease.

    Science.gov (United States)

    Wadman, Michael C; Schwedhelm, Shelly S; Watson, Suzanne; Swanhorst, John; Gibbs, Shawn G; Lowe, John J; Iwen, Peter C; Hayes, A Kim; Needham, Susie; Johnson, Daniel W; Kalin, Daniel J; Zeger, Wesley G; Muelleman, Robert L

    2015-09-01

    Due to the recent Ebola virus outbreak in West Africa, patients with epidemiologic risk for Ebola virus disease and symptoms consistent with Ebola virus disease are presenting to emergency departments (EDs) and clinics in the United States. These individuals, identified as persons under investigation for Ebola virus disease, are initially screened using a molecular assay for Ebola virus. If this initial test is negative, a person under investigation whose symptoms may be due to Ebola virus disease or some other etiology may require further investigation to direct appropriate therapy. ED administrators, physicians, and nurses proposed processes to provide care that is consistent with that of other ED patients. Biocontainment unit administrators, industrial hygienists, laboratory directors, physicians, and other medical personnel examined the ED processes and offered biocontainment unit personal protective equipment and process strategies designed to ensure safety for providers and patients. ED processes for the safe and timely evaluation and management of the person under investigation for Ebola virus disease are presented, with the ultimate goals of protecting providers and ensuring a consistent level of care while confirmatory testing is pending. Copyright © 2015 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.

  16. Remote-Handled Transuranic Content Codes

    International Nuclear Information System (INIS)

    2001-01-01

    The Remote-Handled Transuranic (RH-TRU) Content Codes (RH-TRUCON) document represents the development of a uniform content code system for RH-TRU waste to be transported in the 72-B cask. It will be used to convert existing waste form numbers, content codes, and site-specific identification codes into a system that is uniform across the U.S. Department of Energy (DOE) sites. The existing waste codes at the sites can be grouped under uniform content codes without any loss of waste characterization information. The RH-TRUCON document provides an all-encompassing description for each content code and compiles this information for all DOE sites. Compliance with waste generation, processing, and certification procedures at the sites (outlined in this document for each content code) ensures that prohibited waste forms are not present in the waste. The content code gives an overall description of the RH-TRU waste material in terms of processes and packaging, as well as the generation location. This helps to provide cradle-to-grave traceability of the waste material so that the various actions required to assess its qualification as payload for the 72-B cask can be performed. The content codes also impose restrictions and requirements on the manner in which a payload can be assembled. The RH-TRU Waste Authorized Methods for Payload Control (RH-TRAMPAC), Appendix 1.3.7 of the 72-B Cask Safety Analysis Report (SAR), describes the current governing procedures applicable for the qualification of waste as payload for the 72-B cask. The logic for this classification is presented in the 72-B Cask SAR. Together, these documents (RH-TRUCON, RH-TRAMPAC, and relevant sections of the 72-B Cask SAR) present the foundation and justification for classifying RH-TRU waste into content codes. Only content codes described in this document can be considered for transport in the 72-B cask. Revisions to this document will be made as additional waste qualifies for transport. Each content code uniquely

  17. Face processing pattern under top-down perception: a functional MRI study

    Science.gov (United States)

    Li, Jun; Liang, Jimin; Tian, Jie; Liu, Jiangang; Zhao, Jizheng; Zhang, Hui; Shi, Guangming

    2009-02-01

    Although the top-down perceptual process plays an important role in face processing, its neural substrate remains puzzling because the top-down stream is difficult to isolate from activation patterns contaminated by bottom-up face perception input. In the present study, a novel paradigm of instructing participants to detect faces in pure noise images is employed, which can efficiently eliminate the interference of bottom-up face perception in top-down face processing. By analyzing the map of functional connectivity with the right fusiform face area (FFA), computed by conventional Pearson's correlation, a possible face processing pattern induced by top-down perception can be obtained. Apart from the brain areas of bilateral fusiform gyrus (FG), left inferior occipital gyrus (IOG) and left superior temporal sulcus (STS), which are consistent with a core system in the distributed cortical network for face perception, activation induced by top-down face processing is also found in regions that include the anterior cingulate cortex (ACC), right orbitofrontal cortex (OFC), left precuneus, right parahippocampal cortex, left dorsolateral prefrontal cortex (DLPFC), right frontal pole, bilateral premotor cortex, left inferior parietal cortex and bilateral thalamus. The results indicate that decision-making, attention, episodic memory retrieval and contextual associative processing networks cooperate with general face processing regions to process face information under top-down perception.
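
    A minimal sketch of the seed-based Pearson-correlation connectivity analysis named above; the synthetic arrays, their shapes, and the seed construction are placeholders rather than the study's data or pipeline.

```python
# Seed-based functional connectivity: correlate one seed time series with
# every voxel time series. All data here are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(1)
n_timepoints, n_voxels = 200, 5000
voxels = rng.normal(size=(n_timepoints, n_voxels))   # brain time series
seed = voxels[:, 0] + rng.normal(size=n_timepoints)  # hypothetical FFA seed

# Pearson correlation of the seed with each voxel (z-score, then average).
vz = (voxels - voxels.mean(0)) / voxels.std(0)
sz = (seed - seed.mean()) / seed.std()
conn_map = vz.T @ sz / n_timepoints                  # one r value per voxel
print(conn_map.shape, conn_map.max())
```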

  18. Expected Power-Utility Maximization Under Incomplete Information and with Cox-Process Observations

    International Nuclear Information System (INIS)

    Fujimoto, Kazufumi; Nagai, Hideo; Runggaldier, Wolfgang J.

    2013-01-01

    We consider the problem of maximization of expected terminal power utility (risk sensitive criterion). The underlying market model is a regime-switching diffusion model where the regime is determined by an unobservable factor process forming a finite state Markov process. The main novelty is due to the fact that prices are observed and the portfolio is rebalanced only at random times corresponding to a Cox process where the intensity is driven by the unobserved Markovian factor process as well. This leads to a more realistic modeling for many practical situations, like in markets with liquidity restrictions; on the other hand it considerably complicates the problem to the point that traditional methodologies cannot be directly applied. The approach presented here is specific to the power-utility. For log-utilities a different approach is presented in Fujimoto et al. (Preprint, 2012).
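
    In standard notation (our gloss on the abstract, not the authors' exact formulation), the risk-sensitive criterion can be written as follows.

```latex
% Power-utility criterion; a standard-form gloss, not the paper's notation.
\[
  \sup_{\pi}\; \mathbb{E}\!\left[\frac{\bigl(W_T^{\pi}\bigr)^{\gamma}}{\gamma}\right],
  \qquad \gamma < 1,\ \gamma \neq 0,
\]
% where $W_T^{\pi}$ is terminal wealth under strategy $\pi$, rebalanced only
% at the jump times of the Cox process of price observations, whose intensity
% depends on the hidden Markovian factor.
```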

  19. Chaotic home environment is associated with reduced infant processing speed under high task demands.

    Science.gov (United States)

    Tomalski, Przemysław; Marczuk, Karolina; Pisula, Ewa; Malinowska, Anna; Kawa, Rafał; Niedźwiecka, Alicja

    2017-08-01

    Early adversity has profound long-term consequences for child development across domains. Effects of early adversity on structural and functional brain development have been shown in infants under 12 months of age. However, the causal mechanisms of these effects remain relatively unexplored. Using a visual habituation task, we investigated whether a chaotic home environment may affect processing speed in 5.5-month-old infants (n=71). We found detrimental effects of chaos on processing speed for complex but not for simple visual stimuli. No effects of socio-economic status on infant processing speed were found, although the sample was predominantly middle class. Our results indicate that a chaotic early environment may adversely affect processing speed in early infancy, but only when greater cognitive resources need to be deployed. The study highlights an attractive avenue for research on the mechanisms linking the home environment with the development of attention control. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Enhanced performance of denitrifying sulfide removal process under micro-aerobic condition

    International Nuclear Information System (INIS)

    Chen Chuan; Ren Nanqi; Wang Aijie; Liu Lihong; Lee, Duu-Jong

    2010-01-01

    The denitrifying sulfide removal (DSR) process with bio-granules comprising both heterotrophic and autotrophic denitrifiers can simultaneously convert nitrate, sulfide and acetate into di-nitrogen gas, elementary sulfur and carbon dioxide, respectively, at high loading rates. This study demonstrates that the rate at which sulfide is oxidized to sulfur, as well as the rate of nitrate reduction to nitrite, is enhanced under a micro-aerobic condition. The presence of limited oxygen mitigated the inhibitory effects of sulfide on denitrifier activities and enhanced the performance of DSR granules. The advantages and disadvantages of applying the micro-aerobic condition to the DSR process are discussed.

  1. Bi-Objective Flexible Job-Shop Scheduling Problem Considering Energy Consumption under Stochastic Processing Times.

    Science.gov (United States)

    Yang, Xin; Zeng, Zhenxiang; Wang, Ruidong; Sun, Xueshan

    2016-01-01

    This paper presents a novel method for the optimization of the bi-objective Flexible Job-shop Scheduling Problem (FJSP) under stochastic processing times. A robust counterpart model and the Non-dominated Sorting Genetic Algorithm II (NSGA-II) are used to solve the bi-objective FJSP, considering both the completion time and the total energy consumption under stochastic processing times. A case study on GM Corporation verifies that the NSGA-II used in this paper is effective and outperforms HPSO and PSO+SA in solving the proposed model. The idea and method of the paper can be generalized widely in the manufacturing industry, because they can reduce the energy consumption of energy-intensive manufacturing enterprises with little additional investment when the new approach is applied in existing systems.
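
    To make the non-dominated comparison underlying NSGA-II concrete, here is a minimal Python sketch of Pareto-front extraction over the two objectives named above (completion time, total energy); the candidate schedules and numbers are invented, and a real NSGA-II adds crowding distance, selection, crossover, and mutation on top of this sorting step.

```python
# Pareto-front extraction for two minimized objectives (makespan, energy).
def dominates(a, b):
    """True if solution a is no worse in both objectives and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Return the non-dominated subset of (makespan, energy) pairs."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o != s)]

# (makespan in minutes, energy in kWh) for candidate schedules -- toy values.
candidates = [(120, 55), (110, 60), (130, 50), (110, 58), (125, 70)]
print(pareto_front(candidates))   # -> [(120, 55), (130, 50), (110, 58)]
```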

  2. The orexin component of fasting triggers memory processes underlying conditioned food selection in the rat.

    Science.gov (United States)

    Ferry, Barbara; Duchamp-Viret, Patricia

    2014-03-14

    To test the selectivity of the orexin A (OXA) system in olfactory sensitivity, the present study compared the effects of fasting and of central infusion of OXA on the memory processes underlying odor-malaise association during the conditioned odor aversion (COA) paradigm. Animals implanted with a cannula in the left ventricle received ICV infusion of OXA or artificial cerebrospinal fluid (ACSF) 1 h before COA acquisition. An additional group of intact rats were food-deprived for 24 h before acquisition. Results showed that the increased olfactory sensitivity induced by fasting and by OXA infusion was accompanied by enhanced COA performance. The present results suggest that fasting-induced central OXA release influenced COA learning by increasing not only olfactory sensitivity, but also the memory processes underlying the odor-malaise association.

  3. Evaluation of Specific Executive Functioning Skills and the Processes Underlying Executive Control in Schizophrenia

    OpenAIRE

    Savla, Gauri N.; Twamley, Elizabeth W.; Thompson, Wesley K.; Delis, Dean C.; Jeste, Dilip V.; Palmer, Barton W.

    2010-01-01

    Schizophrenia is associated with executive dysfunction. Yet, the degree to which executive functions are impaired differentially, or above and beyond underlying basic cognitive processes is less clear. Participants included 145 matched pairs of individuals with schizophrenia (SCs) and normal comparison subjects (NCs). Executive functions were assessed with 10 tasks of the Delis-Kaplan Executive Function System (D-KEFS), in terms of “achievement scores” reflecting overall performance on the ta...

  4. Application of Homotopy Analysis Method to Option Pricing Under Lévy Processes

    OpenAIRE

    Sakuma, Takayuki; Yamada, Yuji

    2014-01-01

    Option pricing under the Lévy process has been considered an important research direction in the field of financial engineering, where a closed-form expression for the standard European option is available due to the existence of analytically tractable characteristic function according to the Lévy–Khinchin representation. However, this approach cannot be applied to exotic derivatives (such as barrier options) directly, although a large volume of exotic derivatives are actively traded in the c...
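
    For reference, the Lévy–Khinchin representation invoked above can be stated as follows (standard form; the notation is ours).

```latex
% Levy-Khinchin representation: the characteristic exponent of a Levy process
% X_t satisfies E[exp(iuX_t)] = exp(t*psi(u)) with
\[
  \psi(u) \;=\; i\gamma u \;-\; \tfrac{1}{2}\sigma^{2}u^{2}
  \;+\; \int_{\mathbb{R}\setminus\{0\}}
        \bigl(e^{iux} - 1 - iux\,\mathbf{1}_{\{|x|\le 1\}}\bigr)\,\nu(\mathrm{d}x),
\]
% where $\gamma\in\mathbb{R}$, $\sigma\ge 0$, and $\nu$ is the Levy measure.
```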

  5. Availability assessment of oil and gas processing plants operating under dynamic Arctic weather conditions

    OpenAIRE

    Naseri, Masoud; Baraldi, Piero; Compare, Michele; Zio, Enrico

    2016-01-01

    We consider the assessment of the availability of oil and gas processing facilities operating under Arctic conditions. The novelty of the work lies in modelling the time-dependent effects of environmental conditions on the components failure and repair rates. This is done by introducing weather-dependent multiplicative factors, which can be estimated by expert judgements given the scarce data available from Arctic offshore operations. System availability is assessed considering the equivalent...
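
    A toy sketch of the weather-dependent multiplicative factors described above: baseline failure and repair rates are scaled by a time-varying severity factor and a steady-state availability is evaluated; all rates, the severity shape, and the yearly period are invented for illustration.

```python
# Weather-scaled failure/repair rates and the resulting availability.
import math

base_lambda = 1.0e-3   # baseline failure rate [1/h] (invented)
base_mu = 5.0e-2       # baseline repair rate [1/h] (invented)

def weather_factor(t_hours):
    """Hypothetical seasonal severity factor (>1 in harsh winter weather)."""
    return 1.0 + 0.8 * max(0.0, math.cos(2 * math.pi * t_hours / 8760.0))

def availability(t_hours):
    """Point availability mu/(mu+lambda) with weather-scaled rates."""
    lam = base_lambda * weather_factor(t_hours)
    mu = base_mu / weather_factor(t_hours)   # repairs slow down in bad weather
    return mu / (mu + lam)

for month in range(0, 12, 3):
    print(f"month {month:2d}: A = {availability(month * 730.0):.4f}")
```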

  6. Charging process of polyurethane based composites under electronic irradiation: Effects of cellulose fiber content

    Science.gov (United States)

    Hadjadj, Aomar; Jbara, Omar; Tara, Ahmed; Gilliot, Mickael; Dellis, Jean-Luc

    2013-09-01

    The study deals with the charging effect of polyurethanes-based composites reinforced with cellulose fibers, under electronic beam irradiation in a scanning electron microscope. The results indicate that the leakage current and the trapped charge as well as the kinetics of charging process significantly change beyond a critical concentration of 10% cellulose fibers. These features are correlated with the cellulose concentration-dependence of the electrical properties, specifically resistivity and capacitance, of the composite.

  7. Understanding Nutrient Processing Under Similar Hydrologic Conditions Along a River Continuum

    Science.gov (United States)

    Garayburu-Caruso, V. A.; Mortensen, J.; Van Horn, D. J.; Gonzalez-Pinzon, R.

    2015-12-01

    Eutrophication is one of the main causes of water impairment across the US. The fate of nutrients in streams is typically described by the dynamic coupling of physical processes and biochemical processes. However, isolating each of these processes and determining its contribution to the whole system is challenging due to the complexity of the physical, chemical and biological domains. We conducted column experiments seeking to understand nutrient processing in shallow sediment-water interactions along representative sites of the Jemez River-Rio Grande continuum (eight stream orders), in New Mexico (USA). For each stream order, we used a set of 6 columns packed with 3 different sediments, i.e., Silica Cone Density Sand ASTM D 1556 (0.075-2.00 mm), gravel (> 2mm) and native sediments from each site. We incubated the sediments for three months and performed tracer experiments in the laboratory under identical flow conditions, seeking to normalize the physical processes along the river continuum. We added a short-term pulse injection of NO3, resazurin and NaCl to each column and determined metabolism and NO3 processing using the Tracer Additions for Spiraling Curve Characterization method (TASCC). Our methods allowed us to study how changes in bacterial communities and sediment composition along the river continuum define nutrient processing.

  8. Effects of microbial processes on gas generation under expected WIPP repository conditions: Annual report through 1992

    International Nuclear Information System (INIS)

    Francis, A.J.; Gillow, J.B.

    1993-09-01

    Microbial processes involved in gas generation from degradation of the organic constituents of transuranic waste under conditions expected at the Waste Isolation Pilot Plant (WIPP) repository are being investigated at Brookhaven National Laboratory. These laboratory studies are part of the Sandia National Laboratories -- WIPP Gas Generation Program. Gas generation due to microbial degradation of representative cellulosic waste was investigated in short-term (<6 months) experiments by incubating representative paper (filter paper, paper towels, and tissue) in WIPP brine under initially aerobic (air) and anaerobic (nitrogen) conditions. Samples from the WIPP surficial environment and underground workings harbor gas-producing halophilic microorganisms, the activities of which were studied in short-term experiments. The microorganisms metabolized a variety of organic compounds including cellulose under aerobic, anaerobic, and denitrifying conditions. In long-term experiments, the effects of added nutrients (trace amounts of ammonium nitrate, phosphate, and yeast extract), no nutrients, and nutrients plus excess nitrate on gas production from cellulose degradation were investigated.

  9. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    This report contains the evaluation and documentation of the Coding Class project. The Coding Class project was initiated in the 2016/2017 school year by IT-Branchen in collaboration with a number of member companies, the City of Copenhagen, Vejle Municipality, the Danish Agency for IT and Learning (STIL) and the voluntary association..., design thinking and design pedagogy, Stine Ejsing-Duun from Forskningslab: It og Læringsdesign (ILD-LAB) at the Department of Communication and Psychology, Aalborg University in Copenhagen. We followed and carried out the evaluation and documentation of the Coding Class project from November 2016 to May 2017.... The Coding Class project is a pilot project in which a number of schools in the municipalities of Copenhagen and Vejle have launched teaching activities focusing on coding and programming in school. The evaluation and documentation of the project comprise qualitative case studies of selected teaching interventions in the autumn...

  10. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    Full Text Available As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  11. Image processing strategies based on saliency segmentation for object recognition under simulated prosthetic vision.

    Science.gov (United States)

    Li, Heng; Su, Xiaofan; Wang, Jing; Kan, Han; Han, Tingting; Zeng, Yajie; Chai, Xinyu

    2018-01-01

    Current retinal prostheses can only generate low-resolution visual percepts constituted of limited phosphenes which are elicited by an electrode array, with uncontrollable color and restricted grayscale. Under this visual perception, prosthetic recipients can complete only simple visual tasks; more complex tasks like face identification/object recognition are extremely difficult. Therefore, it is necessary to investigate and apply image processing strategies for optimizing the visual perception of the recipients. This study focuses on recognition of the object of interest employing simulated prosthetic vision. We used a saliency segmentation method based on a biologically plausible graph-based visual saliency model and a grabCut-based self-adaptive-iterative optimization framework to automatically extract foreground objects. Based on this, two image processing strategies, Addition of Separate Pixelization and Background Pixel Shrink, were further utilized to enhance the extracted foreground objects. i) Psychophysical experiments verified that, under simulated prosthetic vision, both strategies had marked advantages over Direct Pixelization in terms of recognition accuracy and efficiency. ii) We also found that recognition performance under the two strategies was tied to the segmentation results and was affected positively by paired-interrelated objects in the scene. The use of the saliency segmentation method and image processing strategies can automatically extract and enhance foreground objects, and significantly improve object recognition performance for recipients implanted with a high-density array. Copyright © 2017 Elsevier B.V. All rights reserved.
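
    A minimal sketch of the foreground-extraction stage under stated assumptions: a fixed rectangle stands in for the saliency-derived seed region, OpenCV's built-in GrabCut replaces the paper's self-adaptive-iterative variant, and file names are placeholders.

```python
# Foreground extraction with GrabCut, then a crude low-resolution rendering.
import cv2
import numpy as np

img = cv2.imread("scene.jpg")                     # hypothetical input image
mask = np.zeros(img.shape[:2], np.uint8)
rect = (50, 50, 200, 200)                         # stand-in for a salient ROI
bgd, fgd = np.zeros((1, 65), np.float64), np.zeros((1, 65), np.float64)

cv2.grabCut(img, mask, rect, bgd, fgd, 5, cv2.GC_INIT_WITH_RECT)
fg = np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 1, 0).astype("uint8")
obj = img * fg[:, :, None]                        # extracted foreground object

# Crude "phosphene" preview: downsample, then upsample with nearest neighbour.
low = cv2.resize(obj, (32, 32), interpolation=cv2.INTER_AREA)
phosphene = cv2.resize(low, img.shape[1::-1], interpolation=cv2.INTER_NEAREST)
cv2.imwrite("phosphene_preview.png", phosphene)
```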

  12. Network Coding Over The 2^32 - 5 Prime Field

    DEFF Research Database (Denmark)

    Pedersen, Morten Videbæk; Heide, Janus; Vingelmann, Peter

    2013-01-01

    Creating efficient finite field implementations has been an active research topic for several decades. Many applications in areas such as cryptography, signal processing, erasure coding and now also network coding depend on this research to deliver satisfactory performance. In this paper we... will be useful in many network coding applications where large field sizes are required.
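
    Assuming the field in question is GF(2^32 - 5) (4294967291 is the largest prime below 2^32, so elements fit in one 32-bit word), a minimal sketch of the field arithmetic looks like this:

```python
# Arithmetic in the prime field GF(2^32 - 5); values below are illustrative.
P = 2**32 - 5   # 4294967291, prime

def add(a, b):
    return (a + b) % P

def mul(a, b):
    return (a * b) % P

def inv(a):
    # Fermat's little theorem: a^(p-2) mod p is the multiplicative inverse.
    return pow(a, P - 2, P)

# Decoding one coded symbol: c = x * g  =>  x = c * inv(g)
g, x = 123456789, 987654321
c = mul(x, g)
assert mul(c, inv(g)) == x
print(hex(P), c)
```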

  13. 78 FR 9678 - Multi-stakeholder Process To Develop a Voluntary Code of Conduct for Smart Grid Data Privacy

    Science.gov (United States)

    2013-02-11

    .... ACTION: Notice of Open Meeting. SUMMARY: The U.S. Department of Energy, Office of Electricity Delivery... Privacy and Promoting Innovation in the Global Digital Economy (Privacy Blueprint). The Privacy... discussions concerning the development of a VCC and will engage stakeholders in an open, transparent process...

  14. Image Processing Strategies Based on a Visual Saliency Model for Object Recognition Under Simulated Prosthetic Vision.

    Science.gov (United States)

    Wang, Jing; Li, Heng; Fu, Weizhen; Chen, Yao; Li, Liming; Lyu, Qing; Han, Tingting; Chai, Xinyu

    2016-01-01

    Retinal prostheses have the potential to restore partial vision. Object recognition in scenes of daily life is one of the essential tasks for implant wearers. Since the visual percepts provided by retinal prostheses are still of low resolution, it is important to investigate and apply image processing methods to convey more useful visual information to the wearers. We proposed two image processing strategies based on Itti's visual saliency map, region of interest (ROI) extraction, and image segmentation. Itti's saliency model generated a saliency map from the original image, in which salient regions were grouped into an ROI by fuzzy c-means clustering. Then GrabCut generated a proto-object from the ROI-labeled image, which was recombined with the background and enhanced in two ways--8-4 separated pixelization (8-4 SP) and background edge extraction (BEE). Results showed that both 8-4 SP and BEE had significantly higher recognition accuracy in comparison with direct pixelization (DP). The performance of each saliency-based image processing strategy depended on the quality of image segmentation. Under good and perfect segmentation conditions, BEE and 8-4 SP obtained noticeably higher recognition accuracy than DP, and under bad segmentation conditions, only BEE boosted the performance. The application of saliency-based image processing strategies was verified to be beneficial to object recognition in daily scenes under simulated prosthetic vision. They are hoped to help the development of the image processing module for future retinal prostheses, and thus provide more benefit for the patients. Copyright © 2015 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.

  15. Research on Collapse Process of Cable-Stayed Bridges under Strong Seismic Excitations

    Directory of Open Access Journals (Sweden)

    Xuewei Wang

    2017-01-01

    Full Text Available In order to present the collapse process and failure mechanism of long-span cable-stayed bridges under strong seismic excitations, a rail-cum-road steel truss cable-stayed bridge was selected as the engineering background, a collapse failure numerical model of the cable-stayed bridge was established based on the explicit dynamic finite element method (FEM), and the whole collapse process of the cable-stayed bridge was analyzed and studied with three different seismic waves, each applied in the horizontal longitudinal direction. The numerical simulation analysis shows that the whole collapse failure process and the failure modes of the cable-stayed bridge under the three different seismic waves are similar. Furthermore, the piers and the main pylons are the critical components whose failure drives the collapse of the cable-stayed bridge structure. The cables and the main girder are damaged as a consequence of the failure of the piers and main pylons during the collapse process, so the failure of the cable and main girder components is not the main cause of the collapse of the cable-stayed bridge. The analysis results can provide a theoretical basis for collapse-resistant design and for determining the critical damage components of long-span highway and railway cable-stayed bridges in seismic vulnerability analysis.

  16. Investigation of hydrogen isotopes interaction processes with lithium under neutron irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Zaurbekova, Zhanna, E-mail: zaurbekova@nnc.kz [Institute of Atomic Energy, National Nuclear Center of RK, Kurchatov (Kazakhstan); Skakov, Mazhyn; Ponkratov, Yuriy; Kulsartov, Timur; Gordienko, Yuriy; Tazhibayeva, Irina; Baklanov, Viktor; Barsukov, Nikolay [Institute of Atomic Energy, National Nuclear Center of RK, Kurchatov (Kazakhstan); Chikhray, Yevgen [Institute of Experimental and Theoretical Physics of Kazakh National University, Almaty (Kazakhstan)

    2016-11-01

    Highlights: • Experiments on helium and tritium generation and release from lithium saturated with deuterium under neutron irradiation are described. • The relative tritium and helium yields from the lithium sample at different levels of neutron irradiation are calculated. • It is concluded that the main process affecting tritium release from lithium is its interaction with lithium atoms to form lithium tritide. - Abstract: The paper describes experiments on helium and tritium generation and release from lithium saturated with deuterium under neutron irradiation (in the temperature range from 473 to 773 K). The diagrams of two reactor experiments show the time dependences of the partial pressures of helium, DT, T{sub 2}, and tritium water in the experimental chamber containing the investigated lithium sample. From the experimental results, the relative tritium and helium yields from the lithium sample at different levels of neutron irradiation were calculated, and their time dependences were plotted. It is concluded that the main process affecting tritium release from lithium is its interaction with lithium atoms to form lithium tritide.

  17. An optimization methodology for identifying robust process integration investments under uncertainty

    International Nuclear Information System (INIS)

    Svensson, Elin; Berntsson, Thore; Stroemberg, Ann-Brith; Patriksson, Michael

    2009-01-01

    Uncertainties in future energy prices and policies strongly affect decisions on investments in process integration measures in industry. In this paper, we present a five-step methodology for the identification of robust investment alternatives incorporating explicitly such uncertainties in the optimization model. Methods for optimization under uncertainty (or, stochastic programming) are thus combined with a deep understanding of process integration and process technology in order to achieve a framework for decision-making concerning the investment planning of process integration measures under uncertainty. The proposed methodology enables the optimization of investments in energy efficiency with respect to their net present value or an environmental objective. In particular, as a result of the optimization approach, complex investment alternatives, allowing for combinations of energy efficiency measures, can be analyzed. Uncertainties as well as time-dependent parameters, such as energy prices and policies, are modelled using a scenario-based approach, enabling the identification of robust investment solutions. The methodology is primarily an aid for decision-makers in industry, but it will also provide insight for policy-makers into how uncertainties regarding future price levels and policy instruments affect the decisions on investments in energy efficiency measures

  18. An optimization methodology for identifying robust process integration investments under uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Elin; Berntsson, Thore [Department of Energy and Environment, Division of Heat and Power Technology, Chalmers University of Technology, SE-412 96 Goeteborg (Sweden); Stroemberg, Ann-Brith [Fraunhofer-Chalmers Research Centre for Industrial Mathematics, Chalmers Science Park, SE-412 88 Gothenburg (Sweden); Patriksson, Michael [Department of Mathematical Sciences, Chalmers University of Technology and Department of Mathematical Sciences, University of Gothenburg, SE-412 96 Goeteborg (Sweden)

    2009-02-15

    Uncertainties in future energy prices and policies strongly affect decisions on investments in process integration measures in industry. In this paper, we present a five-step methodology for the identification of robust investment alternatives incorporating explicitly such uncertainties in the optimization model. Methods for optimization under uncertainty (or, stochastic programming) are thus combined with a deep understanding of process integration and process technology in order to achieve a framework for decision-making concerning the investment planning of process integration measures under uncertainty. The proposed methodology enables the optimization of investments in energy efficiency with respect to their net present value or an environmental objective. In particular, as a result of the optimization approach, complex investment alternatives, allowing for combinations of energy efficiency measures, can be analyzed. Uncertainties as well as time-dependent parameters, such as energy prices and policies, are modelled using a scenario-based approach, enabling the identification of robust investment solutions. The methodology is primarily an aid for decision-makers in industry, but it will also provide insight for policy-makers into how uncertainties regarding future price levels and policy instruments affect the decisions on investments in energy efficiency measures. (author)
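
    As a toy illustration of the scenario-based approach described in the two records above, the sketch below evaluates hypothetical investment alternatives over sampled energy-price scenarios and ranks them by expected net present value; all alternatives, prices, and cash flows are invented, and the actual methodology involves a full stochastic optimization model rather than simple enumeration.

```python
# Scenario-based expected-NPV ranking of invented investment alternatives.
scenarios = [  # (probability, energy price in EUR/MWh) -- sampled scenarios
    (0.3, 40.0), (0.5, 60.0), (0.2, 90.0),
]

alternatives = {   # name -> (investment cost [EUR], annual MWh saved)
    "heat_exchanger_upgrade": (2.0e6, 10_000),
    "steam_integration": (3.5e6, 18_000),
    "do_nothing": (0.0, 0),
}

def expected_npv(cost, mwh_saved, years=10, rate=0.08):
    annuity = sum(1.0 / (1.0 + rate) ** t for t in range(1, years + 1))
    return sum(p * (price * mwh_saved * annuity - cost)
               for p, price in scenarios)

best = max(alternatives, key=lambda k: expected_npv(*alternatives[k]))
print(best, f"{expected_npv(*alternatives[best]):,.0f} EUR")
```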

  19. Facial expression coding in children and adolescents with autism: Reduced adaptability but intact norm-based coding.

    Science.gov (United States)

    Rhodes, Gillian; Burton, Nichola; Jeffery, Linda; Read, Ainsley; Taylor, Libby; Ewing, Louise

    2018-05-01

    Individuals with autism spectrum disorder (ASD) can have difficulty recognizing emotional expressions. Here, we asked whether the underlying perceptual coding of expression is disrupted. Typical individuals code expression relative to a perceptual (average) norm that is continuously updated by experience. This adaptability of face-coding mechanisms has been linked to performance on various face tasks. We used an adaptation aftereffect paradigm to characterize expression coding in children and adolescents with autism. We asked whether face expression coding is less adaptable in autism and whether there is any fundamental disruption of norm-based coding. If expression coding is norm-based, then the face aftereffects should increase with adaptor expression strength (distance from the average expression). We observed this pattern in both autistic and typically developing participants, suggesting that norm-based coding is fundamentally intact in autism. Critically, however, expression aftereffects were reduced in the autism group, indicating that expression-coding mechanisms are less readily tuned by experience. Reduced adaptability has also been reported for coding of face identity and gaze direction. Thus, there appears to be a pervasive lack of adaptability in face-coding mechanisms in autism, which could contribute to face processing and broader social difficulties in the disorder. © 2017 The British Psychological Society.

  20. Analytical functions used for description of the plastic deformation process in Zirconium alloys WWER type fuel rod cladding under designed accident conditions

    International Nuclear Information System (INIS)

    Fedotov, A.

    2003-01-01

    The aim of this work was to improve the RAPTA-5 code as applied to the analysis of the thermomechanical behavior of fuel rod cladding under design-basis accident conditions. Methods of irreversible process thermodynamics were proposed for describing the plastic deformation process in zirconium alloys under accident conditions. Functions that describe the dependence of the yield stress on plastic strain, strain rate and temperature may be successfully used in calculations. On the basis of the experiments performed and the existing experimental data, the dependence of the yield stress on plastic strain, strain rate, temperature and heating rate for the E110 alloy was determined. In the future, the following research will be carried out: research on dynamic strain ageing in the E635 alloy under different strain rates; research on the influence of strain rate on plastic strain in the E635 alloy at test temperatures above 873 K; research on deformation strengthening of the E635 alloy at high temperatures; and research on the influence of heating rate on phase transformation in the E110 and E635 alloys.
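
    One common multiplicative constitutive form for such a yield-stress function is sketched below; this is an illustrative assumption, since the record does not reproduce the actual functions used in RAPTA-5.

```latex
% An illustrative yield-stress function (assumed form, not the RAPTA-5 one):
\[
  \sigma_{y}(\varepsilon_{p}, \dot{\varepsilon}, T)
  \;=\; K \,\varepsilon_{p}^{\,n}\,\dot{\varepsilon}^{\,m}
        \exp\!\left(\frac{Q}{RT}\right),
\]
% where $\varepsilon_{p}$ is the plastic strain, $\dot{\varepsilon}$ the
% strain rate, $T$ the temperature, and $K$, $n$, $m$, $Q$ material constants.
```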

  1. Optimal processing pathway selection for microalgae-based biorefinery under uncertainty

    DEFF Research Database (Denmark)

    Rizwan, Muhammad; Zaman, Muhammad; Lee, Jay H.

    2015-01-01

    We propose a systematic framework for the selection of optimal processing pathways for a microalgae-based biorefinery under techno-economic uncertainty. The proposed framework promotes robust decision making by taking into account the uncertainties that arise due to inconsistencies among and shortage in the available technical information. A stochastic mixed integer nonlinear programming (sMINLP) problem is formulated for determining the optimal biorefinery configurations based on a superstructure model where parameter uncertainties are modeled and included as sampled scenarios. The solution to the sMINLP problem determines the processing technologies, material flows, and product portfolio that are optimal with respect to all the sampled scenarios. The developed framework is implemented and tested on a specific case study. The optimal processing pathways selected with and without...

  2. Phase formation of polycrystalline vanadium oxide via a thermal annealing process under controlled nitrogen pressure

    Science.gov (United States)

    Jessadaluk, S.; Khemasiri, N.; Rahong, S.; Rangkasikorn, A.; Kayunkid, N.; Wirunchit, S.; Horprathum, M.; Chananonnawathron, C.; Klamchuen, A.; Nukeaw, J.

    2017-09-01

    This article provides an approach to improve and control the crystal phases of sputtered vanadium oxide (VxOy) thin films by a post-thermal annealing process. As-deposited VxOy thin films grown at room temperature are usually amorphous; post-thermal annealing processes (400 °C, 2 hrs) under various nitrogen (N2) pressures are applied to improve and control the crystal phase of the VxOy thin films. The crystallinity of the VxOy thin films changes from amorphous to the α-V2O5 phase or polycrystalline V9O17, depending on the pressure of the N2 carrier during the annealing process. Moreover, the electrical resistivity of the VxOy thin films decreases from 10^5 Ω cm (amorphous) to 6×10^-1 Ω cm (V9O17). Based on these results, our study shows a simple method to improve and control the phase formation of VxOy thin films.

  3. 22 CFR 41.57 - International cultural exchange visitors and visitors under the Irish Peace Process Cultural and...

    Science.gov (United States)

    2010-04-01

    ... visitors under the Irish Peace Process Cultural and Training Program Act (IPPCTPA). 41.57 Section 41.57... visitors and visitors under the Irish Peace Process Cultural and Training Program Act (IPPCTPA). (a... operation of the Irish Peace Process Cultural and Training Program (IPPCTP) which establishes at a minimum...

  4. From "cracking the orthographic code" to "playing with language": toward a usage-based foundation of the reading process.

    Science.gov (United States)

    Wallot, Sebastian

    2014-01-01

    The empirical study of reading dates back more than 125 years. But despite this long tradition, the scientific understanding of reading has made rather heterogeneous progress: many factors that influence the process of text reading have been uncovered, but theoretical explanations remain fragmented; no general theory pulls together the diverse findings. A handful of scholars have noted that properties thought to be at the core of the reading process do not actually generalize across different languages or from single-word reading situations to connected text reading. Such observations cast doubt on many of the traditional conceptions about reading. In this article, I suggest that the observed heterogeneity in the research is due to misguided conceptions about the reading process. Particularly problematic are the unrefined notions of meaning which undergird many reading theories: most psychological theories of reading implicitly assume a kind of elemental token semantics, where words serve as stable units of meaning in a text. This conception of meaning creates major conceptual problems. As an alternative, I argue that reading should rather be understood as a form of language use, which circumvents many of the conceptual problems and connects reading to a wider range of linguistic communication. Finally, drawing from Wittgenstein, the concept of "language games" is outlined as an approach to language use that can be operationalized scientifically to provide a new foundation for reading research.

  5. Chemo-Mechano Coupling Processes Inducing Evolution of Rock Permeability under Hydrothermal and Stressed Conditions (Invited)

    Science.gov (United States)

    Yasuhara, H.; Takahashi, M.; Kishida, K.; Nakashima, S.

    2013-12-01

    Coupled thermo-hydro-mechano-chemo (THMC) processes prevailing within fractured rocks are of significant importance for the long-term geo-sequestration of anthropogenic wastes such as high-level radioactive materials and carbon dioxide, and for the effective recovery of energy from petroleum and geothermal reservoirs typically located deep underground. The THMC processes should change the mechanical, hydraulic, and transport properties of the host rocks. Even under moderate pressure and temperature conditions, geochemical processes such as mineral dissolution should be active and may induce changes in those properties; the effects should therefore be examined in detail. In this work, a suite of long-term permeability experiments using granite, sandstone, and mudstone with or without a single fracture has been conducted under moderate confining pressures ranging from 3 to 15 MPa and temperatures of 20 and 90 °C, and the evolution of rock permeability and effluent chemistry has been monitored throughout the experimental periods. Under net reduction or augmentation of pore/fracture volumes, the net permeability should respectively decrease or increase with time, depending on the prevailing mechanical and geochemical processes. In granite samples, at 20 °C the observed fracture permeabilities monotonically decrease and reach a quasi-steady state in two weeks, but after the temperature is increased to 90 °C they resume decreasing throughout the rest of the experiments; the ultimate reductions are roughly two orders of magnitude within 40 days. In mudstone samples, results similar to those in granite samples are obtained (i.e., monotonic reduction and subsequent quasi-steady state). In contrast, in sandstone samples, a monotonic augmentation in permeability has been observed throughout the experiments. A chemo-mechanical model that accounts for temperature-dependent mineral dissolution at contacting areas and free walls of pore spaces is applied to replicate the experimental results.

  6. Steam condensation modelling in aerosol codes

    International Nuclear Information System (INIS)

    Dunbar, I.H.

    1986-01-01

    The principal subject of this study is the modelling of the condensation of steam into, and the evaporation of water from, aerosol particles. These processes introduce a new type of term into the equation for the development of the aerosol particle size distribution. This new term confronts the code developer with three major problems: the physical modelling of the condensation/evaporation process, the discretisation of the new term, and separate accounting for the masses of the water and of the other components. This study has considered four codes which model the condensation of steam into and its evaporation from aerosol particles: AEROSYM-M (UK), AEROSOLS/B1 (France), NAUA (Federal Republic of Germany) and CONTAIN (USA). The modelling in the codes has been addressed under three headings: the physical modelling of condensation, the mathematics of the discretisation of the equations, and the methods for modelling the separate behaviour of different chemical components of the aerosol. The codes are least advanced in the area of solute effect modelling. At present only AEROSOLS/B1 includes the effect. The effect is greater for more concentrated solutions. Codes without the effect will be more in error (underestimating the total airborne mass) the less condensation they predict. Data are needed on the water vapour pressure above concentrated solutions of the substances of interest (especially CsOH and CsI) if the extent to which aerosols retain water under superheated conditions is to be modelled. 15 refs
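
    As a rough illustration of the solute effect at issue, the sketch below evaluates a Koehler-type equilibrium saturation ratio (Kelvin term times Raoult term) for a solution droplet; the parameter values and the CsOH-like solute are back-of-envelope assumptions, not data from the codes under review.

```python
# Equilibrium saturation ratio over a solution droplet (Koehler-type form).
import math

def saturation_ratio(r_m, n_solute_mol, T=373.0):
    """Equilibrium S over a droplet of radius r_m containing n_solute_mol."""
    sigma = 0.059               # surface tension of water near 100 C [N/m]
    M_w, rho_w = 0.018, 958.0   # molar mass [kg/mol], density [kg/m^3]
    R = 8.314
    kelvin = math.exp(2.0 * sigma * M_w / (R * T * rho_w * r_m))
    n_water = (4.0 / 3.0) * math.pi * r_m**3 * rho_w / M_w
    raoult = n_water / (n_water + 2.0 * n_solute_mol)  # ~2 ions per CsOH
    return kelvin * raoult

for r in (0.05e-6, 0.1e-6, 0.5e-6):
    print(f"r = {r*1e6:.2f} um  ->  S = {saturation_ratio(r, 1e-16):.4f}")
```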

  7. Decolourisation of dyes under electro-Fenton process using Fe alginate gel beads

    International Nuclear Information System (INIS)

    Rosales, E.; Iglesias, O.; Pazos, M.; Sanromán, M.A.

    2012-01-01

    Highlights: ► Catalytic activity of Fe alginate gel beads for the remediation of wastewater was tested. ► New electro-Fenton process for the remediation of polluted wastewater. ► Continuous dye treatment with high removal and without operational problems. - Abstract: This study focuses on the application of the electro-Fenton technique, using the catalytic activity of Fe alginate gel beads, for the remediation of wastewater contaminated with synthetic dyes. The Fe alginate gel beads were evaluated for the decolourisation of two typical dyes, Lissamine Green B and Azure B, under the electro-Fenton process. After characterization of the Fe alginate gel beads, the effect of pH on the process was studied and the electro-Fenton process with free Fe was compared with that using Fe alginate beads. The results showed that the use of Fe alginate beads increases the efficiency of the process; moreover, the developed particles retain their physical integrity over a wide pH range (2–8). Around 98–100% dye decolourisation was obtained for both dyes by the electro-Fenton process in successive batches. The process was then performed with Fe alginate beads in a continuous bubble reactor. High color removal (87–98%) was attained for both dyes operating at a residence time of 30 min, without operational problems and with particle shapes maintained throughout the oxidation process. Consequently, the stable performance of Fe alginate beads opens promising perspectives for fast and economical treatment of wastewater polluted by dyes or similar organic contaminants.

  8. Neural networks underlying language and social cognition during self-other processing in Autism spectrum disorders.

    Science.gov (United States)

    Kana, Rajesh K; Sartin, Emma B; Stevens, Carl; Deshpande, Hrishikesh D; Klein, Christopher; Klinger, Mark R; Klinger, Laura Grofer

    2017-07-28

    The social communication impairments defining autism spectrum disorders (ASD) may be built upon core deficits in perspective-taking, language processing, and self-other representation. Self-referential processing entails the ability to incorporate self-awareness, self-judgment, and self-memory in information processing. Very few studies have examined the neural bases of integrating self-other representation and semantic processing in individuals with ASD. The main objective of this functional MRI study is to examine the role of language and social brain networks in self-other processing in young adults with ASD. Nineteen high-functioning male adults with ASD and 19 age-, sex-, and IQ-matched typically developing (TD) control participants made "yes" or "no" judgments of whether an adjective, presented visually, described them (self) or their favorite teacher (other). Both ASD and TD participants showed significantly increased activity in the medial prefrontal cortex (MPFC) during self and other processing relative to letter search. Analyses of group differences revealed significantly reduced activity in the left inferior frontal gyrus (LIFG) and left inferior parietal lobule (LIPL) in ASD participants relative to TD controls. ASD participants also showed significantly weaker functional connectivity of the anterior cingulate cortex (ACC) with several brain areas while processing self-related words. The LIFG and IPL are important regions at the functional intersection of language and social roles; reduced recruitment of these regions in ASD participants may suggest a poor level of semantic and social processing. In addition, poor connectivity of the ACC may reflect difficulty in meeting the linguistic and social demands of this task in ASD. Overall, this study provides new evidence of the altered recruitment of the neural networks underlying language and social cognition in ASD. Published by Elsevier Ltd.

  9. Decolourisation of dyes under electro-Fenton process using Fe alginate gel beads

    Energy Technology Data Exchange (ETDEWEB)

    Rosales, E.; Iglesias, O.; Pazos, M. [Department of Chemical Engineering, University of Vigo, Isaac Newton Building, Campus As Lagoas, Marcosende 36310, Vigo (Spain); Sanroman, M.A., E-mail: sanroman@uvigo.es [Department of Chemical Engineering, University of Vigo, Isaac Newton Building, Campus As Lagoas, Marcosende 36310, Vigo (Spain)

    2012-04-30

    Highlights: ► Catalytic activity of Fe alginate gel beads for the remediation of wastewater was tested. ► New electro-Fenton process for the remediation of polluted wastewater. ► Continuous dye treatment with high removal and without operational problems. - Abstract: This study focuses on the application of the electro-Fenton technique, using the catalytic activity of Fe alginate gel beads, for the remediation of wastewater contaminated with synthetic dyes. The Fe alginate gel beads were evaluated for the decolourisation of two typical dyes, Lissamine Green B and Azure B, under the electro-Fenton process. After characterization of the Fe alginate gel beads, the effect of pH on the process was studied and the electro-Fenton process with free Fe was compared with that using Fe alginate beads. The results showed that the use of Fe alginate beads increases the efficiency of the process; moreover, the developed particles retain their physical integrity over a wide pH range (2-8). Around 98-100% dye decolourisation was obtained for both dyes by the electro-Fenton process in successive batches. The process was then performed with Fe alginate beads in a continuous bubble reactor. High color removal (87-98%) was attained for both dyes operating at a residence time of 30 min, without operational problems and with particle shapes maintained throughout the oxidation process. Consequently, the stable performance of Fe alginate beads opens promising perspectives for fast and economical treatment of wastewater polluted by dyes or similar organic contaminants.

  10. Can we always sweep the details of RNA-processing under the carpet?

    International Nuclear Information System (INIS)

    Klironomos, Filippos D; Berg, Johannes; De Meaux, Juliette

    2013-01-01

    RNA molecules follow a succession of enzyme-mediated processing steps from transcription to maturation. The participating enzymes, for example the spliceosome for mRNAs and Drosha and Dicer for microRNAs, are also produced in the cell and their copy-numbers fluctuate over time. Enzyme copy-number changes affect the processing rate of the substrate molecules; high enzyme numbers increase the processing rate, while low enzyme numbers decrease it. We study different RNA-processing cascades where enzyme copy-numbers are either fixed or fluctuate. We find that for the fixed enzyme copy-numbers, the substrates at steady-state are Poisson-distributed, and the whole RNA cascade dynamics can be understood as a single birth–death process of the mature RNA product. In this case, solely fluctuations in the timing of RNA processing lead to variation in the number of RNA molecules. However, we show analytically and numerically that when enzyme copy-numbers fluctuate, the strength of RNA fluctuations increases linearly with the RNA transcription rate. This linear effect becomes stronger as the speed of enzyme dynamics decreases relative to the speed of RNA dynamics. Interestingly, we find that under certain conditions, the RNA cascade can reduce the strength of fluctuations in the expression level of the mature RNA product. Finally, by investigating the effects of processing polymorphisms, we show that it is possible for the effects of transcriptional polymorphisms to be enhanced, reduced or even reversed. Our results provide a framework to understand the dynamics of RNA processing. (paper)
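
    The single birth-death reading of the cascade can be illustrated with a minimal Gillespie simulation; the two-species reduction and the rate constants below are illustrative assumptions, not the paper's model as published.

```python
# Gillespie simulation of transcription -> processing -> degradation with a
# fixed enzyme copy number, under which the mature-RNA steady state is Poisson.
import random

def gillespie(t_end, k_tx=10.0, k_proc=1.0, k_deg=0.5, seed=42):
    rng = random.Random(seed)
    t, pre, mature = 0.0, 0, 0
    while t < t_end:
        rates = [k_tx, k_proc * pre, k_deg * mature]
        total = sum(rates)
        t += rng.expovariate(total)
        r = rng.uniform(0.0, total)
        if r < rates[0]:
            pre += 1                              # transcription of a precursor
        elif r < rates[0] + rates[1]:
            pre, mature = pre - 1, mature + 1     # enzyme-mediated processing
        else:
            mature -= 1                           # degradation of mature RNA
    return pre, mature

print(gillespie(1000.0))   # mature RNA fluctuates around k_tx / k_deg = 20
```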

  11. Codes of conduct: An extra suave instrument of EU governance?

    DEFF Research Database (Denmark)

    Borras, Susana

    In recent years the European Union has been introducing new governance instruments. This paper examines the use of one such new type of governance instrument, namely codes of conduct. The paper addresses two research questions: first, under what conditions are codes of conduct able to coordinate actors successfully (effectiveness)? And second, under what conditions are codes of conduct able to generate democratically legitimate political processes? The paper examines carefully a recent case study, the "Code of Conduct for the Recruitment of Researchers" (CCRR). The code... Hypotheses are formulated on the basis of theoretically inspired assumptions. Quantitative and qualitative data shall be provided in the near future; however, preliminary information regarding the implementation progress of the CCRR shows quite diversified responses at the national level. The extra...

  12. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely on the basis of a binary decision. Hence the end-to-end performance of the digital link becomes essentially independent of the length and operating frequency bands of the link. From a transmission point of view, digital transmission has therefore been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term Speech Coding usually refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding is voice coding. This term is more generic in the sense that the
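
    As a minimal example of waveform-style coding, the sketch below implements mu-law companding (the characteristic used in G.711-style telephony) from first principles; it is a sketch, not a conforming codec.

```python
# Mu-law companding: compress quiet samples before coarse quantization.
import math

MU = 255.0

def mulaw_encode(x):
    """Compress a sample in [-1, 1] with the mu-law characteristic."""
    return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

def mulaw_decode(y):
    """Invert the companding."""
    return math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)

x = 0.01                      # a quiet sample
y = mulaw_encode(x)           # boosted toward higher code values
q = round(y * 127) / 127      # crude 8-bit-style quantization
print(x, y, mulaw_decode(q))  # small samples survive quantization well
```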

  13. Evaluation of Independent Audit and Corporate Governance Practices in Turkey Under the Turkish Commercial Code No. 6102: A Qualitative Research

    Directory of Open Access Journals (Sweden)

    Yasin Karadeniz

    2015-12-01

    Full Text Available The purpose of this study is twofold: to explain the new dimension that corporate governance practices, which have faced difficulties in Turkey for years, have acquired with the Turkish Commercial Code, and, in explaining these relations, to reveal the importance of independent auditing, which has not yet become fully functional and has likewise gone through many problems in Turkish practice, both at present and in the future, in the light of the Turkish Commercial Code and corporate governance relations. Interviews, as a qualitative research method, were conducted face to face with at least one chief auditor (mostly CPAs) working in independent auditing firms in the cities of İzmir and Çanakkale. The interviews with the auditors revealed that the Turkish Commercial Code and corporate governance would contribute positively to the development of independent auditing in Turkey.

  14. Evaluation of specific executive functioning skills and the processes underlying executive control in schizophrenia.

    Science.gov (United States)

    Savla, Gauri N; Twamley, Elizabeth W; Thompson, Wesley K; Delis, Dean C; Jeste, Dilip V; Palmer, Barton W

    2011-01-01

    Schizophrenia is associated with executive dysfunction. Yet, the degree to which executive functions are impaired differentially, or above and beyond underlying basic cognitive processes is less clear. Participants included 145 matched pairs of individuals with schizophrenia (SCs) and normal comparison subjects (NCs). Executive functions were assessed with 10 tasks of the Delis-Kaplan Executive Function System (D-KEFS), in terms of "achievement scores" reflecting overall performance on the task. Five of these tasks (all measuring executive control) were further examined in terms of their basic component (e.g., processing speed) scores and contrast scores (reflecting residual higher order skills adjusted for basic component skills). Group differences were examined via multivariate analysis of variance. SCs had worse performance than NCs on all achievement scores, but the greatest SC-NC difference was that for the Trails Switching task. SCs also had worse performance than NCs on all basic component skills. Of the executive control tasks, only Trails Switching continued to be impaired after accounting for impairments in underlying basic component skills. Much of the impairment in executive functions in schizophrenia may reflect the underlying component skills rather than higher-order functions. However, the results from one task suggest that there might be additional impairment in some aspects of executive control.

  15. Robustness of trait distribution metrics for community assembly studies under the uncertainties of assembly processes.

    Science.gov (United States)

    Aiba, Masahiro; Katabuchi, Masatoshi; Takafumi, Hino; Matsuzaki, Shin-Ichiro S; Sasaki, Takehiro; Hiura, Tsutom

    2013-12-01

    Numerous studies have revealed the existence of nonrandom trait distribution patterns as a sign of environmental filtering and/or biotic interactions in a community assembly process. A number of metrics with various algorithms have been used to detect these patterns without any clear guidelines. Although some studies have compared their statistical power, the differences in performance among the metrics under conditions close to those of actual studies are not clear. Therefore, the performances of five metrics of convergence and 16 metrics of divergence under alternative conditions were comparatively analyzed using a suite of simulated communities. We focused particularly on the robustness of the performances to conditions that are often uncertain and uncontrollable in actual studies, e.g., atypical trait distribution patterns stemming from the operation of multiple assembly mechanisms, the scaling of trait-function relationships, and the sufficiency of the analyzed traits. Most tested metrics, for either convergence or divergence, had sufficient statistical power to distinguish nonrandom trait distribution patterns in the absence of uncertainty. However, the performances of the metrics were considerably influenced by both atypical trait distribution patterns and the other uncertainties. The influence of these uncertainties varied among metrics with different algorithms, and their performances were often complementary. Therefore, under the uncertainties of an assembly process, the selection of appropriate metrics and the combined use of complementary metrics are critically important to reliably distinguish nonrandom patterns in a trait distribution. We provide a tentative list of recommended metrics for future studies.
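
    As a hedged illustration of how such a metric works, the sketch below computes a standardized effect size (SES) of the community trait range against a null model of random draws from the species pool: strongly negative values suggest convergence (filtering), strongly positive values divergence. This is a generic null-model recipe, not one of the 21 metrics compared in the study, and all names are assumptions.

        import numpy as np

        def ses_trait_range(community_traits, species_pool, n_null=999, seed=None):
            """Standardized effect size of the trait range of an observed community
            relative to communities drawn at random from the species pool."""
            rng = np.random.default_rng(seed)
            observed = np.ptp(community_traits)  # observed trait range (max - min)
            k = len(community_traits)
            null = np.array([np.ptp(rng.choice(species_pool, size=k, replace=False))
                             for _ in range(n_null)])
            return (observed - null.mean()) / null.std()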

  16. Achievements in testing of the MGA and FRAM isotopic software codes under the DOE/NNSA-IRSN cooperation of gamma-ray isotopic measurement systems

    International Nuclear Information System (INIS)

    Vo, Duc; Wang, Tzu-Fang; Funk, Pierre; Weber, Anne-Laure; Pepin, Nicolas; Karcher, Anna

    2009-01-01

    DOE/NNSA and IRSN collaborated on a study of gamma-ray instruments and analysis methods used to perform isotopic measurements of special nuclear materials. The two agencies agreed to collaborate on the project in response to inconsistencies that were found in the various versions of software and hardware used to determine the isotopic abundances of uranium and plutonium. IRSN used software developed internally to test the MGA and FRAM isotopic analysis codes for criteria used to stop data acquisition. The stop-criterion test revealed several unusual behaviors in both the MGA and FRAM software codes.

  17. Computer Code for Nanostructure Simulation

    Science.gov (United States)

    Filikhin, Igor; Vlahovic, Branislav

    2009-01-01

    Due to their small size, nanostructures can have stress and thermal gradients that are larger than any macroscopic analogue. These gradients can lead to specific regions that are susceptible to failure via processes such as plastic deformation by dislocation emission, chemical debonding, and interfacial alloying. A program has been developed that rigorously simulates and predicts optoelectronic properties of nanostructures of virtually any geometrical complexity and material composition. It can be used in simulations of energy level structure, wave functions, density of states of spatially configured phonon-coupled electrons, excitons in quantum dots, quantum rings, quantum ring complexes, and more. The code can be used to calculate stress distributions and thermal transport properties for a variety of nanostructures and interfaces, transport and scattering at nanoscale interfaces and surfaces under various stress states, and alloy compositional gradients. The code allows users to perform modeling of charge transport processes through quantum-dot (QD) arrays as functions of inter-dot distance, array order versus disorder, QD orientation, shape, size, and chemical composition for applications in photovoltaics and physical properties of QD-based biochemical sensors. The code can be used to study the hot exciton formation/relaxation dynamics in arrays of QDs of different shapes and sizes at different temperatures. It also can be used to understand the relation among the deposition parameters and inherent stresses, strain deformation, heat flow, and failure of nanostructures.

  18. SIMULTANEOUS DEGRADATION OF SOME PHTHALATE ESTERS UNDER FENTON AND PHOTO-FENTON OXIDATION PROCESSES

    Directory of Open Access Journals (Sweden)

    BELDEAN-GALEA M.S.

    2015-03-01

    Full Text Available In this study, the degradation efficiency of five phthalates (DEP, BBP, DEHP, DINP and DIDP), present as a mixture in the liquid phase, was assessed using the Fenton and photo-Fenton oxidation processes. It was observed that the main parameters influencing the Fenton oxidation of phthalates were the concentration of the oxidizing agent (H2O2), the concentration of the catalyst (Fe2+), the pH value, UV irradiation and the reaction time. For the Fenton oxidative process, the highest degradation efficiencies were 19% for DEP, 50% for BBP, 84% for DEHP, 90% for DINP and 48% for DIDP, when the experiments were carried out using concentrations of 20 mg L-1 phthalate mixture, 100 mg L-1 H2O2 and 10 mg L-1 Fe2+ at a pH value of 3, with a total reaction time of 30 minutes. For the photo-Fenton oxidative process, carried out under the same conditions as the Fenton process, it was observed that after an irradiation time of 90 minutes under UV radiation the degradation efficiencies of the phthalates improved, reaching 22% for DEP, 71% for BBP, 97% for DEHP, 97% for DINP and 81% for DIDP.

  19. Detection and Correction of Under-/Overexposed Optical Soundtracks by Coupling Image and Audio Signal Processing

    Science.gov (United States)

    Taquet, Jonathan; Besserer, Bernard; Hassaine, Abdelali; Decenciere, Etienne

    2008-12-01

    Film restoration using image processing has been an active research field in recent years. However, restoration of the soundtrack has mainly been performed in the sound domain using signal processing methods, despite the fact that it is recorded as a continuous image between the image frames of the film and the perforations. While the very few published approaches focus on removing dust particles or concealing larger corrupted areas, no published works are devoted to the restoration of soundtracks degraded by substantial underexposure or overexposure. Digital restoration of optical soundtracks is an unexploited application field and, besides, scientifically rich, because it allows both image and signal processing approaches to be combined. After introducing the principles of optical soundtrack recording and playback, this contribution focuses on our first approaches to detect and cancel the effects of under- and overexposure. We intentionally chose to quantify the effect of bad exposure in the 1D audio signal domain instead of the 2D image domain. Our measurement is sent as a feedback value to an image processing stage where the correction takes place, building up a "digital image and audio signal" closed-loop processing chain. The approach is validated on both simulated alterations and real data.

  20. Detection and Correction of Under-/Overexposed Optical Soundtracks by Coupling Image and Audio Signal Processing

    Directory of Open Access Journals (Sweden)

    Etienne Decenciere

    2008-10-01

    Full Text Available Film restoration using image processing has been an active research field in recent years. However, restoration of the soundtrack has mainly been performed in the sound domain using signal processing methods, despite the fact that it is recorded as a continuous image between the image frames of the film and the perforations. While the very few published approaches focus on removing dust particles or concealing larger corrupted areas, no published works are devoted to the restoration of soundtracks degraded by substantial underexposure or overexposure. Digital restoration of optical soundtracks is an unexploited application field and, besides, scientifically rich, because it allows both image and signal processing approaches to be combined. After introducing the principles of optical soundtrack recording and playback, this contribution focuses on our first approaches to detect and cancel the effects of under- and overexposure. We intentionally chose to quantify the effect of bad exposure in the 1D audio signal domain instead of the 2D image domain. Our measurement is sent as a feedback value to an image processing stage where the correction takes place, building up a “digital image and audio signal” closed-loop processing chain. The approach is validated on both simulated alterations and real data.

  1. An Intelligent Complex Event Processing with D Numbers under Fuzzy Environment

    Directory of Open Access Journals (Sweden)

    Fuyuan Xiao

    2016-01-01

    Full Text Available Efficient matching of incoming mass events to persistent queries is fundamental to complex event processing systems. Event matching based on pattern rules is an important feature of a complex event processing engine. However, the intrinsic uncertainty in pattern rules, which are predecided by experts, increases the difficulty of effective complex event processing. Such rules inevitably involve various types of intrinsic uncertainty, such as imprecision, fuzziness, and incompleteness, due to the limitations of human subjective judgment. D numbers are a new mathematical tool for modeling uncertainty, since they relax the condition that elements on the frame of discernment must be mutually exclusive. To address the above issues, an intelligent complex event processing method with D numbers under a fuzzy environment is proposed, based on the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) method. The novel method can fully support decision making in complex event processing systems. Finally, a numerical example is provided to evaluate the efficiency of the proposed method.
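
    Since the proposal builds on TOPSIS, a minimal sketch of the standard TOPSIS ranking step may help; it deliberately omits the paper's D-number fuzzification, and the function name and inputs are assumptions.

        import numpy as np

        def topsis(matrix, weights, benefit):
            """Rank alternatives with standard TOPSIS.
            matrix: alternatives x criteria; benefit[j] is True when larger is better."""
            m = matrix / np.linalg.norm(matrix, axis=0)  # vector-normalize each criterion
            v = m * weights                              # weighted normalized matrix
            ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))  # ideal solution
            anti = np.where(benefit, v.min(axis=0), v.max(axis=0))   # anti-ideal solution
            d_pos = np.linalg.norm(v - ideal, axis=1)
            d_neg = np.linalg.norm(v - anti, axis=1)
            return d_neg / (d_pos + d_neg)  # relative closeness: higher ranks better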

  2. Rare earth elements tracing the soil erosion processes on slope surface under natural rainfall

    International Nuclear Information System (INIS)

    Zhu Mingyong; Tan Shuduan; Dang Haishan; Zhang Quanfa

    2011-01-01

    A field experiment using rare earth elements (REEs) as tracers was conducted to investigate soil erosion processes on slope surfaces during rainfall events. A plot of 10 m x 2 m x 0.16 m with a gradient of 20° (36.4%) was established, and the plot was divided into two layers and four segments. Different REE tracers were applied to the different layers and segments to determine sediment dynamics under natural rainfall. Results indicated that sheet erosion accounted for more than 90% of total erosion when the rainfall amount and intensity were not large enough to generate concentrated flows. The sediment source changed in different sections of the slope surface, and the primary sediment source area tended to move upslope as erosion progressed. In rill erosion, sediment discharge mainly originated from the toe-slope and moved upwards as erosion intensified. The results obtained from this study suggest that the multi-REE tracer technique is valuable for understanding erosion processes and determining sediment sources. - Highlights: → Soil erosion processes were traced with rare earth elements under natural rainfall. → The experimental setup developed here has seldom been implemented elsewhere. → Sheet erosion is the main erosion type and the main contributor to sediment loss. → The sediment source changed in different sections of the slope surface. → The primary sediment source area tended to move upslope as erosion progressed.

  3. The experimental research on response characteristics of coal samples under the uniaxial loading process

    Science.gov (United States)

    Jia, Bing; Wei, Jian-Ping; Wen, Zhi-Hui; Wang, Yun-Gang; Jia, Lin-Xing

    2017-11-01

    In order to study the response characteristics of infrasound in coal samples under uniaxial loading, coal samples were collected from the GengCun mine. A coal-rock stress loading device, an acoustic emission testing system and an infrasound testing system were used to record the infrasonic and acoustic emission signals during the uniaxial loading process. The test results were analyzed by methods including wavelet filtering, threshold de-noising and time-frequency analysis. The results showed that, during the loading process, the infrasonic wave exhibited staged characteristics and could be divided into three stages: an initial stage with a certain number of infrasound events, a middle stage with few infrasound events, and a late stage with a gradual decrease. This showed good consistency with the changing characteristics of the acoustic emission. At the same time, the frequency of the infrasound was very low, so it can propagate over very long distances with little attenuation, and the infrasound characteristics before the destruction of the coal samples were obvious. A method of using the infrasound characteristics to predict the destruction of coal samples is proposed. This is of great significance in guiding the prediction of geological hazards in coal mines.
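
    For readers unfamiliar with the signal-processing steps named above, the following is a generic wavelet soft-threshold de-noising pass of the kind the abstract mentions; it is a standard recipe rather than the authors' actual pipeline, and it assumes the PyWavelets package and illustrative parameters ('db4', five decomposition levels).

        import numpy as np
        import pywt

        def wavelet_denoise(signal, wavelet="db4", level=5):
            """Soft-threshold wavelet de-noising of a 1-D signal."""
            coeffs = pywt.wavedec(signal, wavelet, level=level)
            # Noise level estimated from the finest-scale detail coefficients,
            # combined with the universal threshold of Donoho and Johnstone.
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745
            thresh = sigma * np.sqrt(2.0 * np.log(len(signal)))
            denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                                      for c in coeffs[1:]]
            return pywt.waverec(denoised, wavelet)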

  4. The role of automaticity and attention in neural processes underlying empathy for happiness, sadness, and anxiety

    Directory of Open Access Journals (Sweden)

    Sylvia A. Morelli

    2013-05-01

    Full Text Available Although many studies have examined the neural basis of experiencing empathy, relatively little is known about how empathic processes are affected by different attentional conditions. Thus, we examined whether instructions to empathize might amplify responses in empathy-related regions and whether cognitive load would diminish the involvement of these regions. 32 participants completed a functional magnetic resonance imaging session assessing empathic responses to individuals experiencing happy, sad, and anxious events. Stimuli were presented under three conditions: watching naturally, while instructed to empathize, and under cognitive load. Across analyses, we found evidence for a core set of neural regions that support empathic processes (dorsomedial prefrontal cortex, DMPFC; medial prefrontal cortex, MPFC; temporoparietal junction, TPJ; amygdala; ventral anterior insula, AI; septal area, SA). Two key regions – the ventral AI and SA – were consistently active across all attentional conditions, suggesting that they are automatically engaged during empathy. In addition, watching versus empathizing with targets was not markedly different and instead led to similar subjective and neural responses to others’ emotional experiences. In contrast, cognitive load reduced the subjective experience of empathy and diminished neural responses in several regions related to empathy (DMPFC, MPFC, TPJ, amygdala) and social cognition. The current results reveal how attention impacts empathic processes and provide insight into how empathy may unfold in everyday interactions.

  5. Markov decision processes: a tool for sequential decision making under uncertainty.

    Science.gov (United States)

    Alagoz, Oguzhan; Hsu, Heather; Schaefer, Andrew J; Roberts, Mark S

    2010-01-01

    We provide a tutorial on the construction and evaluation of Markov decision processes (MDPs), powerful analytical tools for sequential decision making under uncertainty that have been widely applied in industrial and manufacturing settings but are underutilized in medical decision making (MDM). We demonstrate the use of an MDP to solve a sequential clinical treatment problem under uncertainty. Markov decision processes generalize standard Markov models in that a decision process is embedded in the model and multiple decisions are made over time. Furthermore, they have significant advantages over standard decision analysis. We compare MDPs to standard Markov-based simulation models by solving the problem of the optimal timing of living-donor liver transplantation using both methods. Both models result in the same optimal transplantation policy and the same total life expectancies for the same patient and living donor. The computation time for solving the MDP model is significantly smaller than that for solving the Markov model. We briefly describe the growing literature on MDPs applied to medical decisions.
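
    As a minimal illustration of how a discounted MDP is solved in practice, the sketch below implements value iteration for a finite state and action space. It is the generic textbook algorithm, not the liver-transplantation model of the study, and the array shapes and discount factor are assumptions.

        import numpy as np

        def value_iteration(P, R, gamma=0.95, tol=1e-8):
            """Solve a finite discounted MDP.
            P: (A, S, S) transition probabilities, R: (A, S) expected rewards.
            Returns the optimal state values and a greedy optimal policy."""
            V = np.zeros(P.shape[1])
            while True:
                Q = R + gamma * (P @ V)   # Q[a, s] = R[a, s] + gamma * E[V(s')]
                V_new = Q.max(axis=0)     # Bellman optimality backup
                if np.max(np.abs(V_new - V)) < tol:
                    return V_new, Q.argmax(axis=0)
                V = V_new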

  6. Dynamics of bacterial communities in soils of rainforest fragments under restoration processes

    Science.gov (United States)

    Vasconcellos, Rafael; Zucchi, Tiago; Taketani, Rodrigo; Andreote, Fernando; Cardoso, Elke

    2014-05-01

    The Brazilian Atlantic Forest ("Mata Atlântica") has been extensively studied due to its valuable and unique biodiversity. Unfortunately, this priceless ecosystem has been widely deforested and only 10% of its original area remains. Many projects have been successfully implemented to restore its fauna and flora, but there is a lack of information on how soil bacterial communities respond to this process. Thus, our aim was to evaluate the influence of soil attributes and seasonality on the soil bacterial communities of rainforest fragments under restoration processes. Soil samples from a native site and two ongoing restoration fragments with different ages of implementation (10 and 20 years) were collected and assayed using culture-independent approaches. Our findings demonstrate that seasonality barely altered the bacterial distribution, whereas soil chemical attributes and plant diversity highly influenced the bacterial community structure during the restoration process. Moreover, the strict relationship observed for two bacterial groups, Solibacteriaceae and Verrucomicrobia, one with the youngest (10 years) and the other with the oldest (native) site, suggests their use as bioindicators of soil quality and soil recovery of forest fragments under restoration.

  7. American option valuation under time changed tempered stable Lévy processes

    Science.gov (United States)

    Gong, Xiaoli; Zhuang, Xintian

    2017-01-01

    Given that the underlying assets in financial markets exhibit stylized facts such as leptokurtosis, asymmetry, clustering properties and heteroskedasticity effects, this paper presents a novel model for pricing American options under the assumption that the stock price process is governed by a time-changed tempered stable Lévy process. As this model is constructed by introducing random time changes into tempered stable (TS) processes, specifically the normal tempered stable (NTS) distribution and the classical tempered stable (CTS) distribution, it permits infinite jumps as well as capturing randomly varying time in stochastic volatility, consequently taking into account empirical facts such as leptokurtosis, skewness and volatility clustering behaviors. We employ the Fourier-cosine technique to calculate American option prices and propose an improved Particle Swarm Optimization (IPSO) intelligent algorithm for model calibration. To demonstrate the advantage of the constructed model, we carry out empirical research on American index options in financial markets across a wide range of models, with the time-changed normal tempered stable distribution model yielding superior performance to the others.
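
    To give a computational flavor of "time-changed Lévy dynamics", the sketch below simulates the simpler variance gamma process, i.e. a Brownian motion evaluated at a random gamma-distributed "business time". The variance gamma stand-in is chosen for brevity and is not the paper's NTS/CTS model; all parameter names are assumptions. American exercise would then be valued on such dynamics, e.g. by least-squares Monte Carlo or by the paper's Fourier-cosine approach.

        import numpy as np

        def variance_gamma_paths(S0, r, sigma, theta, nu, T, n_steps, n_paths, seed=None):
            """Risk-neutral asset paths under a time-changed Brownian motion
            (variance gamma): dX = theta*dG + sigma*sqrt(dG)*Z."""
            rng = np.random.default_rng(seed)
            dt = T / n_steps
            # Gamma subordinator increments play the role of random business time.
            dG = rng.gamma(shape=dt / nu, scale=nu, size=(n_paths, n_steps))
            dX = theta * dG + sigma * np.sqrt(dG) * rng.standard_normal((n_paths, n_steps))
            # Martingale (mean) correction so that E[S_T] = S0 * exp(r*T).
            omega = np.log(1.0 - theta * nu - 0.5 * sigma**2 * nu) / nu
            log_paths = np.log(S0) + np.cumsum((r + omega) * dt + dX, axis=1)
            return np.exp(log_paths)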

  8. The Operator's Diagnosis Task under Abnormal Operating Conditions in Industrial Process Plant

    DEFF Research Database (Denmark)

    Goodstein, L.P.; Pedersen, O.M.; Rasmussen, Jens

    1974-01-01

    Analyses of serious accidents in connection with the operation of technical installations demonstrate that the diagnosis task which confronts personnel under non-normal plant conditions is a critical one. This report presents a preliminary outline of characteristic traits connected with the task...... of diagnosis for use in discussions of (a) the studies which are necessary in order to formulate the operator's diagnostic procedures and (b) the possibilities which exist for supporting these procedures through appropriate data processing and display in the control system. At the same time, attempts are made...... to connect ideas for display which currently are under consideration in the department to various phases of the diagnostic task which itself is postulated as being divided up into a sequence of subtasks each with its own typical features....

  9. Stochastic Games for Continuous-Time Jump Processes Under Finite-Horizon Payoff Criterion

    Energy Technology Data Exchange (ETDEWEB)

    Wei, Qingda, E-mail: weiqd@hqu.edu.cn [Huaqiao University, School of Economics and Finance (China); Chen, Xian, E-mail: chenxian@amss.ac.cn [Peking University, School of Mathematical Sciences (China)

    2016-10-15

    In this paper we study two-person nonzero-sum games for continuous-time jump processes with randomized history-dependent strategies under the finite-horizon payoff criterion. The state space is countable, and the transition rates and payoff functions are allowed to be unbounded from above and from below. Under suitable conditions, we introduce a new topology for the set of all randomized Markov multi-strategies and establish its compactness and metrizability. Then, by constructing approximating sequences of the transition rates and payoff functions, we show that the optimal value function for each player is the unique solution to the corresponding optimality equation and obtain the existence of a randomized Markov Nash equilibrium. Furthermore, we illustrate the applications of our main results with a controlled birth and death system.
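
    For orientation, a finite-horizon payoff criterion of the kind studied here can be written as below; the notation is an illustrative reconstruction, not the paper's exact formulation.

        % Illustrative finite-horizon payoff for player i in a two-person game on a
        % controlled continuous-time jump process x_t with action processes a_t, b_t.
        \[
          V_i(s, \pi^1, \pi^2)
            = \mathbb{E}^{\pi^1, \pi^2}_{s}\!\left[
                \int_0^T r_i(t, x_t, a_t, b_t)\, \mathrm{d}t
              \right], \qquad i = 1, 2.
        \]
        % A pair (\pi^{1*}, \pi^{2*}) is a Nash equilibrium when neither player can
        % improve its own payoff by deviating unilaterally:
        % V_1(s, \pi^{1*}, \pi^{2*}) >= V_1(s, \pi^1, \pi^{2*}) for all \pi^1,
        % and symmetrically for player 2.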

  10. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and softwa...... expression in the public realm. The book’s line of argument defends language against its invasion by economics, arguing that speech continues to underscore the human condition, however paradoxical this may seem in an era of pervasive computing....

  11. Requirements of a Better Secure Program Coding

    Directory of Open Access Journals (Sweden)

    Marius POPA

    2012-01-01

    Full Text Available Secure program coding refers to managing the risks posed by security breaches that originate in program source code. The paper reviews the best practices that must be followed during the software development life cycle for secure software assurance, the methods and techniques used to assure secure coding, the most well-known and common vulnerabilities caused by a poor coding process, and how the resulting security risks are managed and mitigated. As a tool for better secure program coding, the code review process is presented, together with objective measures for code review assurance and estimation of the effort required for code improvement.
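
    As one concrete instance of the vulnerability classes such reviews target, the sketch below contrasts string-concatenated SQL, which is injectable, with a parameterized query that keeps data separate from code. The example is illustrative, not drawn from the paper, and uses Python's standard sqlite3 module.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE users (name TEXT, role TEXT)")

        def find_user_unsafe(name):
            # Vulnerable: attacker-controlled input becomes part of the SQL text,
            # so a value like "x' OR '1'='1" changes the query's meaning.
            return conn.execute(
                "SELECT role FROM users WHERE name = '" + name + "'").fetchall()

        def find_user_safe(name):
            # Parameterized query: the driver binds the value, never parses it as SQL.
            return conn.execute(
                "SELECT role FROM users WHERE name = ?", (name,)).fetchall()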

  12. Performance evaluation of the DCMD desalination process under bench scale and large scale module operating conditions

    KAUST Repository

    Francis, Lijo

    2014-04-01

    The flux performance of different hydrophobic microporous flat sheet commercial membranes made of poly tetrafluoroethylene (PTFE) and poly propylene (PP) was tested for Red Sea water desalination using the direct contact membrane distillation (DCMD) process, under bench scale (high δT) and large scale module (low δT) operating conditions. Membranes were characterized for their surface morphology, water contact angle, thickness, porosity, pore size and pore size distribution. The DCMD process performance was optimized using a locally designed and fabricated module, aiming to maximize the flux at different levels of operating parameters, mainly feed water and coolant inlet temperatures at different temperature differences across the membrane (δT). A water vapor flux of 88.8 kg/m2h was obtained using a PTFE membrane at high δT (60°C). In addition, the flux performance was compared to the first generation of a new locally synthesized and fabricated membrane made of a different class of polymer under the same conditions. A total salt rejection of 99.99% and boron rejection of 99.41% were achieved under extreme operating conditions. On the other hand, a detailed water characterization revealed that low molecular weight non-ionic molecules (ppb level) were transported with the water vapor molecules through the membrane structure. The membrane which provided the highest flux was then tested under large scale module operating conditions. The average flux of the latter study (low δT) was found to be eight times lower than that of the bench scale (high δT) operating conditions.

  13. Case study of the propagation of a small flaw under PWR loading conditions and comparison with the ASME code design life. Comparison of ASME Code Sections III and XI

    International Nuclear Information System (INIS)

    Yahr, G.T.; Gwaltney, R.C.; Richardson, A.K.; Server, W.L.

    1986-01-01

    A cooperative study was performed by EG and G Idaho, Inc., and Oak Ridge National Laboratory to investigate the degree of conservatism and consistency in the ASME Boiler and Pressure Vessel Code Section III fatigue evaluation procedure and Section XI flaw acceptance standards. A single, realistic, sample problem was analyzed to determine the significance of certain points of criticism made of an earlier parametric study by staff members of the Division of Engineering Standards of the Nuclear Regulatory Commission. The problem was based on a semielliptical flaw located on the inside surface of the hot-leg piping at the reactor vessel safe-end weld for the Zion 1 pressurized-water reactor (PWR). Two main criteria were used in selecting the problem; first, it should be a straight pipe to minimize the computational expense; second, it should exhibit as high a cumulative usage factor as possible. Although the problem selected has one of the highest cumulative usage factors of any straight pipe in the primary system of PWRs, it is still very low. The Code Section III fatigue usage factor was only 0.00046, assuming it was in the as-welded condition, and fatigue crack-growth analyses predicted negligible crack growth during the 40-year design life. When the analyses were extended past the design life, the usage factor was less than 1.0 when the flaw had propagated to failure. The current study shows that the criticism of the earlier report should not detract from the conclusion that if a component experiences a high level of cyclic stress corresponding to a fatigue usage factor near 1.0, very small cracks can propagate to unacceptable sizes

  14. Preparative process for hollow glass microsphere with wall thickness under 1 μm

    International Nuclear Information System (INIS)

    Du Shoude; Wei Sheng; Shi Tao

    1998-12-01

    A process for mass-producing high quality glass microspheres has been developed for ICF in China. The wall thickness of these microspheres is less than one micron. The effects of the temperature of each zone of the drop furnace, the flow rate of the furnace air, the solid concentration in the glass-forming solution and the concentration of the blowing agent on the parameters of the glass microspheres, such as diameter and wall thickness, are systematically studied. Glass microspheres with walls under 1 μm thick, which satisfy the exacting surface and symmetry specifications of targets for Shen-Guang-II directly driven experiments, are now produced routinely

  15. Guidelines regarding the Review Process under the Convention on Nuclear Safety

    International Nuclear Information System (INIS)

    2011-01-01

    These guidelines, established by the Contracting Parties pursuant to Article 22 of the Convention, are intended to be read in conjunction with the text of the Convention. Their purpose is to provide guidance to the Contracting Parties on the process for reviewing National Reports submitted in accordance with Article 5 and thereby to facilitate the efficient review of implementation by the Contracting Parties of their obligations under the Convention. The aim of the review process should be to achieve a thorough examination of National Reports submitted in accordance with Article 5 of the Convention, so that Contracting Parties can learn from each other's solutions to common and individual nuclear safety problems and, above all, contribute to improving nuclear safety worldwide through a constructive exchange of views

  16. Exploring Selective Exposure and Confirmation Bias as Processes Underlying Employee Work Happiness: An Intervention Study.

    Science.gov (United States)

    Williams, Paige; Kern, Margaret L; Waters, Lea

    2016-01-01

    Employee psychological capital (PsyCap), perceptions of organizational virtue (OV), and work happiness have been shown to be associated within and over time. This study examines selective exposure and confirmation bias as potential processes underlying the PsyCap, OV, and work happiness associations. As part of a quasi-experimental study design, school staff (N = 69) completed surveys at three time points. After the first assessment, some staff (n = 51) completed a positive psychology training intervention. Results of descriptive statistics, correlation, and regression analyses on the intervention group provide some support for selective exposure and confirmation bias as explanatory mechanisms. By focusing on the processes through which employee attitudes may influence work happiness, this study advances theoretical understanding, specifically of selective exposure and confirmation bias, in a field study context.

  17. Guidelines regarding the Review Process under the Convention on Nuclear Safety

    International Nuclear Information System (INIS)

    2011-01-01

    These guidelines, established by the Contracting Parties pursuant to Article 22 of the Convention, are intended to be read in conjunction with the text of the Convention. Their purpose is to provide guidance to the Contracting Parties on the process for reviewing National Reports submitted in accordance with Article 5 and thereby to facilitate the efficient review of implementation by the Contracting Parties of their obligations under the Convention. The aim of the review process should be to achieve a thorough examination of National Reports submitted in accordance with Article 5 of the Convention, so that Contracting Parties can learn from each other's solutions to common and individual nuclear safety problems and, above all, contribute to improving nuclear safety worldwide through a constructive exchange of views

  18. Guidelines regarding the Review Process under the Convention on Nuclear Safety

    International Nuclear Information System (INIS)

    2013-01-01

    These Guidelines, established by the Contracting Parties pursuant to Article 22 of the Convention, are intended to be read in conjunction with the text of the Convention. Their purpose is to provide guidance to the Contracting Parties on the process for reviewing National Reports submitted in accordance with Article 5 of the Convention and thereby to facilitate the efficient review of implementation by the Contracting Parties of their obligations under the Convention. The aim of the review process should be to achieve a thorough examination of National Reports submitted in accordance with Article 5 of the Convention, so that Contracting Parties can learn from each other's solutions to common and individual nuclear safety problems and, above all, contribute to improving nuclear safety worldwide through a constructive exchange of views.

  19. Guidelines regarding the review process under the convention on nuclear safety

    International Nuclear Information System (INIS)

    1998-01-01

    These guidelines, established by the Contracting Parties pursuant to Article 22 of the Convention, are intended to be read in conjunction with the text of the Convention. Their purpose is to provide guidance to the Contracting Parties on the process for reviewing national reports submitted in accordance with Article 5 and thereby to facilitate the efficient review of implementation by the Contracting Parties of their obligations under the Convention. The aim of the review process should be to achieve a thorough examination of national reports submitted in accordance with Article 5 of the Convention, so that Contracting Parties can learn from each other's solutions to common and individual nuclear safety problems and, above all, contribute to improving nuclear safety worldwide through a constructive exchange of views

  20. Processing and Probability Analysis of Pulsed Terahertz NDE of Corrosion under Shuttle Tile Data

    Science.gov (United States)

    Anastasi, Robert F.; Madaras, Eric I.; Seebo, Jeffrey P.; Ely, Thomas M.

    2009-01-01

    This paper examines data processing and probability analysis of pulsed terahertz NDE scans of corrosion defects under a Shuttle tile. Pulsed terahertz data collected from an aluminum plate with fabricated corrosion defects and covered with a Shuttle tile is presented. The corrosion defects imaged were fabricated by electrochemically etching areas of various diameter and depth in the plate. In this work, the aluminum plate echo signal is located in the terahertz time-of-flight data and a threshold is applied to produce a binary image of sample features. Feature location and area are examined and identified as corrosion through comparison with the known defect layout. The results are tabulated with hit, miss, or false call information for a probability of detection analysis that is used to identify an optimal processing threshold.
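
    To make the thresholding step concrete, the sketch below turns a time-of-flight image into a binary map and measures the location and area of each connected feature, mirroring the analysis described above; the function, threshold, and array names are illustrative assumptions, using NumPy and SciPy.

        import numpy as np
        from scipy import ndimage

        def detect_features(tof_image, threshold):
            """Binary-threshold a terahertz time-of-flight image and return the
            center of mass and pixel area of each connected feature."""
            binary = tof_image > threshold      # binary map of candidate defects
            labels, n = ndimage.label(binary)   # connected-component labeling
            idx = range(1, n + 1)
            areas = ndimage.sum(binary, labels, index=idx)
            centers = ndimage.center_of_mass(binary, labels, idx)
            return list(zip(centers, areas))    # compare against the known defect layout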

  1. Location-Dependent Query Processing Under Soft Real-Time Constraints

    Directory of Open Access Journals (Sweden)

    Zoubir Mammeri

    2009-01-01

    Full Text Available In recent years, mobile devices and applications have undergone rapid development. In the database field, this development has required methods that consider new query types such as location-dependent queries (i.e., the query results depend on the location of the query issuer). Although several research efforts have addressed problems related to location-dependent query processing, few works have considered the timing requirements that may be associated with queries (i.e., the query results must be delivered to mobile clients on time). The main objective of this paper is to propose a solution for location-dependent query processing under soft real-time constraints. Hence, we propose methods to take into account client location-dependency and to maximize the percentage of queries respecting their deadlines. We validate our proposal by implementing a prototype based on the Oracle DBMS. Performance evaluation results show that the proposed solution optimizes the percentage of queries meeting their deadlines and the communication cost.

  2. Guidelines regarding the review process under the Convention on Nuclear Safety

    International Nuclear Information System (INIS)

    2002-01-01

    These guidelines, established by the Contracting Parties pursuant to Article 22 of the Convention, are intended to be read in conjunction with the text of the Convention. Their purpose is to provide guidance to the Contracting Parties on the process for reviewing National Reports submitted in accordance with Article 5 and thereby to facilitate the efficient review of implementation by the Contracting Parties of their obligations under the Convention. The aim of the review process should be to achieve a thorough examination of National Reports submitted in accordance with Article 5 of the Convention, so that Contracting Parties can learn from each other's solutions to common and individual nuclear safety problems and, above all, contribute to improving nuclear safety worldwide through a constructive exchange of views

  3. Guidelines regarding the Review Process under the Convention on Nuclear Safety

    International Nuclear Information System (INIS)

    2013-01-01

    These Guidelines, established by the Contracting Parties pursuant to Article 22 of the Convention, are intended to be read in conjunction with the text of the Convention. Their purpose is to provide guidance to the Contracting Parties on the process for reviewing National Reports submitted in accordance with Article 5 of the Convention and thereby to facilitate the efficient review of implementation by the Contracting Parties of their obligations under the Convention. The aim of the review process should be to achieve a thorough examination of National Reports submitted in accordance with Article 5 of the Convention, so that Contracting Parties can learn from each other's solutions to common and individual nuclear safety problems and, above all, contribute to improving nuclear safety worldwide through a constructive exchange of views.

  4. Design scope and level for standard design certification under a two step licensing process

    Energy Technology Data Exchange (ETDEWEB)

    Huh, Chang Wook; Suh, Nam Duk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2012-08-15

    A small integral reactor, SMART (System Integrated Modular Advanced ReacTor), which has been under development in Korea since the late 1990s with the goal of obtaining a standard design approval by the end of 2011, is introduced. The design scope and level for design certification (DC) are well described in the U.S. NRC SECY documents published in the early 1990s. However, those documents apply to the one-step licensing process of a combined operating license (COL) used by the U.S. NRC, while Korea still uses a two-step licensing process. Thus, referencing the concept of the SECY documents, we have established the design scope and level for the SMART DC using the contexts of the standard review plan (SRP). Some examples of the results and issues raised during our review are briefly presented in this paper. The same methodology will be applied to other types of reactor under development in Korea, such as future VHTR reactors.

  5. Guidelines regarding the review process under the Convention on Nuclear Safety

    International Nuclear Information System (INIS)

    1999-01-01

    These guidelines, established by the Contracting Parties pursuant to Article 22 of the Convention, are intended to be read in conjunction with the text of the Convention. Their purpose is to provide guidance to the Contracting Parties on the process for reviewing national reports submitted in accordance with Article 5 and thereby to facilitate the efficient review of implementation by the Contracting Parties of their obligations under the Convention. The aim of the review process should be to achieve a thorough examination of national reports submitted in accordance with Article 5 of the Convention, so that Contracting Parties can learn from each other's solutions to common and individual nuclear safety problems and, above all, contribute to improving nuclear safety worldwide through a constructive exchange of views

  6. Improvement of MARS code reflood model

    International Nuclear Information System (INIS)

    Hwang, Moonkyu; Chung, Bub-Dong

    2011-01-01

    A specifically designed heat transfer model for the reflood process, which normally occurs at low flow and low pressure, was originally incorporated in the MARS code. The model is essentially identical to that of the RELAP5/MOD3.3 code. The model, however, is known to have under-estimated the peak cladding temperature (PCT), with an earlier turn-over. In this study, the original MARS code reflood model is improved. Based on extensive sensitivity studies of both the hydraulic and wall heat transfer models, it is found that the dispersed flow film boiling (DFFB) wall heat transfer is the most influential process determining the PCT, whereas the interfacial drag model most affects the quenching time through the liquid carryover phenomenon. The model proposed by Bajorek and Young is incorporated for the DFFB wall heat transfer. Both spacer grid and droplet enhancement models are incorporated. Inverted annular film boiling (IAFB) is modeled using the original PSI model of the code. The flow transition between the DFFB and IAFB regimes is modeled using the TRACE code interpolation. A gas velocity threshold is also added to limit the top-down quenching effect. Assessment calculations are performed with the original and modified MARS codes for the Flecht-Seaset and RBHT tests. Improvements are observed in the PCT and quenching time predictions in the Flecht-Seaset assessment. In the case of the RBHT assessment, the improvement over the original MARS code is found to be marginal. A spacer grid effect, however, is clearly seen with the modified version of the MARS code. (author)

  7. Possible Weakening Processes Imposed on California's Earthen Levees under Protracted Drought

    Science.gov (United States)

    Robinson, J. D.; Vahedifard, F.; AghaKouchak, A.

    2015-12-01

    California is currently suffering from a multiyear extreme drought, and the impacts of the drought are anticipated to worsen in a warming climate. The resilience of critical infrastructure under extreme drought conditions is a major concern that has not been well understood. Thus, there is a crucial need to improve our understanding of the potential threats of drought to infrastructure and to take timely actions to mitigate these threats and adapt our infrastructure for forthcoming extreme events. The need is more pronounced for earthen levees, since their function of protecting limited water resources and dryland is more critical during drought. A significant portion of California's levee systems is currently operating under high-risk conditions. Protracted drought can further threaten the structural competency of these already at-risk levee systems through several thermo-hydro-mechanical weakening processes that undermine their stability. Viable information on the implications of these weakening processes, particularly for California's earthen levees, is relatively incomplete. This article discusses, from a geotechnical engineering perspective, how California's protracted drought might threaten the integrity of levee systems through several thermo-hydro-mechanical weakening processes. Pertinent facts and statistics regarding the drought in California are presented and discussed. Catastrophic levee failures and major damage resulting from drought-induced weakening processes such as shear strength reduction, desiccation cracking, land subsidence and surface erosion, fissuring and soil softening, and soil carbon oxidation are discussed to illustrate the devastating impacts that the California drought might impose on existing earthen levees. This article calls for further research in light of these potential drought-induced weakening mechanisms to support mitigation strategies for reducing future catastrophic levee failures.

  8. Neural connectivity patterns underlying symbolic number processing indicate mathematical achievement in children.

    Science.gov (United States)

    Park, Joonkoo; Li, Rosa; Brannon, Elizabeth M

    2014-03-01

    In early childhood, humans learn culturally specific symbols for number that allow them entry into the world of complex numerical thinking. Yet little is known about how the brain supports the development of the uniquely human symbolic number system. Here, we use functional magnetic resonance imaging along with an effective connectivity analysis to investigate the neural substrates for symbolic number processing in young children. We hypothesized that, as children solidify the mapping between symbols and underlying magnitudes, important developmental changes occur in the neural communication between the right parietal region, important for the representation of non-symbolic numerical magnitudes, and other brain regions known to be critical for processing numerical symbols. To test this hypothesis, we scanned children between 4 and 6 years of age while they performed a magnitude comparison task with Arabic numerals (numerical, symbolic), dot arrays (numerical, non-symbolic), and lines (non-numerical). We then identified the right parietal seed region that showed greater blood-oxygen-level-dependent signal in the numerical versus the non-numerical conditions. A psychophysiological interaction method was used to find patterns of effective connectivity arising from this parietal seed region specific to symbolic compared to non-symbolic number processing. Two brain regions, the left supramarginal gyrus and the right precentral gyrus, showed significant effective connectivity from the right parietal cortex. Moreover, the degree of this effective connectivity to the left supramarginal gyrus was correlated with age, and the degree of the connectivity to the right precentral gyrus predicted performance on a standardized symbolic math test. These findings suggest that effective connectivity underlying symbolic number processing may be critical as children master the associations between numerical symbols and magnitudes, and that these connectivity patterns may serve as an
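
    For readers unfamiliar with the psychophysiological interaction (PPI) method named above, the core regressor is simply the product of a mean-centered seed time course and the task contrast, as in the deliberately simplified sketch below; real fMRI pipelines deconvolve the hemodynamic response first, and all names here are assumptions.

        import numpy as np

        def ppi_regressor(seed_ts, task_ts):
            """Product of mean-centered seed activity and task contrast; voxels this
            regressor explains show task-dependent coupling with the seed region."""
            seed = seed_ts - seed_ts.mean()
            task = task_ts - task_ts.mean()
            return seed * task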

  9. The underlying processes of a soil mite metacommunity on a small scale

    Science.gov (United States)

    Guo, Chuanwei; Lin, Lin; Wu, Donghui; Zhang, Limin

    2017-01-01

    Metacommunity theory provides an understanding of how ecological processes regulate local community assemblies. However, few field studies have evaluated the underlying mechanisms of a metacommunity on a small scale through revealing the relative roles of spatial and environmental filtering in structuring local community composition. Based on a spatially explicit sampling design in 2012 and 2013, this study aims to evaluate the underlying processes of a soil mite metacommunity on a small spatial scale (50 m) in a temperate deciduous forest located at the Maoershan Ecosystem Research Station, Northeast China. Moran’s eigenvector maps (MEMs) were used to model independent spatial variables. The relative importance of spatial (including trend variables, i.e., geographical coordinates, and broad- and fine-scale spatial variables) and environmental factors in driving the soil mite metacommunity was determined by variation partitioning. Mantel and partial Mantel tests and a redundancy analysis (RDA) were also used to identify the relative contributions of spatial and environmental variables. The results of variation partitioning suggested that the relatively large and significant variance was a result of spatial variables (including broad- and fine-scale spatial variables and trend), indicating the importance of dispersal limitation and autocorrelation processes. The significant contribution of environmental variables was detected in 2012 based on a partial Mantel test, and soil moisture and soil organic matter were especially important for the soil mite metacommunity composition in both years. The study suggested that the soil mite metacommunity was primarily regulated by dispersal limitation due to broad-scale and neutral biotic processes at a fine-scale and that environmental filtering might be of subordinate importance. In conclusion, a combination of metacommunity perspectives between neutral and species sorting theories was suggested to be important in the

  10. Heterogeneity and loss of soil nutrient elements under aeolian processes in the Otindag Desert, China

    Science.gov (United States)

    Li, Danfeng; Wang, Xunming; Lou, Junpeng; Liu, Wenbin; Li, Hui; Ma, Wenyong; Jiao, Linlin

    2018-02-01

    The heterogeneity of the composition of surface soils that are affected by aeolian processes plays important roles in ecological evolution and the occurrence of aeolian desertification in fragile ecological zones, but the associated mechanisms are poorly understood. Using field investigation, wind tunnel experiments, and particle size and element analyses, we discuss the variation in the nutrient elements of surface soils that forms in the presence of aeolian processes of four vegetation species (Caragana microphylla Lam, Artemisia frigida Willd. Sp. Pl., Leymus chinensis (Trin.) Tzvel. and Stipa grandis P. Smirn) growing in the Otindag Desert, China. These four vegetation communities correspond to increasing degrees of degradation. A total of 40 macro elements, trace elements, and oxides were measured in the surface soil and in wind-transported samples. The results showed that under the different degradation stages, the compositions and concentrations of nutrients in surface soils differed for the four vegetation species. Aeolian processes may cause higher heterogeneity and higher loss of soil nutrient elements for the communities of Artemisia frigida Willd. Sp. Pl., Leymus chinensis (Trin.) Tzvel, and Stipa grandis P. Smirn than for the Caragana microphylla Lam community. There was remarkable variation in the loss of nutrients under different aeolian transportation processes. Over the past several decades, the highest loss of soil elements occurred in the 1970s, whereas the loss from 2011 to the present was generally 4.0% of that in the 1970s. These results indicate that the evident decrease in nutrient loss has played an important role in the rehabilitation that has occurred in the region recently.

  11. Training processes in under 6s football competition: The transition from ingenuity to institutionalization

    Directory of Open Access Journals (Sweden)

    Abel Merino Orozco

    2016-12-01

    Full Text Available Under-6s football competition is a school sport with inherent educational implications. Moreover, it is a booming non-formal socio-educational setting in which families and children place their training expectations and dreams. The aim is to understand the learning processes promoted in this environment for six-year-old children as they begin the process of institutionalization into regulated sport. The research uses a case study design in the ethnographic mode, based on participant observation, drawing on narrative and image data to understand the scenario from the perspective of its builders. The results show that the institutionalization process starts from the ingenuousness and limited understanding of the child, who undergoes training processes in a prescriptive environment in which the competitive performance of the team is pursued. Promoting types of learning that the participants themselves consciously consider inappropriate entails kinds of behaviour that go against the positive values usually attributed to football. The study argues for the need to take advantage of the training opportunities that football offers children, such as the enhancement of creativity, self-efficacy and self-esteem.

  12. Performance of a dual-process PVD/PS tungsten coating structure under deuterium ion irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyunmyung; Lee, Ho Jung; Kim, Sung Hwan [Department of Nuclear and Quantum Engineering, KAIST, Daejeon (Korea, Republic of); Song, Jae-Min [Department of Nuclear Engineering, Seoul National University, Seoul (Korea, Republic of); Jang, Changheui, E-mail: chjang@kaist.ac.kr [Department of Nuclear and Quantum Engineering, KAIST, Daejeon (Korea, Republic of)

    2016-11-01

    Highlights: • D{sup +} irradiation performance of a dual-process PVD/PS W coating was evaluated. • Low-energy plasma exposure to 100 eV D{sup +} at a flux of 1.17 × 10{sup 21} D m{sup −2} s{sup −1} was applied. • After D ion irradiation, flakes were observed on the surface of the simple PS coating. • In contrast, sub-μm size protrusions were observed for the dual-process PVD/PS W coating. • The height of the D spike in the depth profile was lower for the dual-process PVD/PS W coating. - Abstract: A dual-process coating structure was developed on a graphite substrate to improve the performance of the coating structure under the anticipated operating conditions of fusion devices. A thin multilayer W/Mo coating (6 μm) was deposited by the physical vapor deposition (PVD) method, with a variation of the Mo interlayer thickness, on a plasma spray (PS) W coating (160 μm) of a graphite substrate panel. The dual-process PVD/PS W coatings were then exposed to 3.08 × 10{sup 24} D m{sup −2} of 100 eV D ions with a flux of 1.71 × 10{sup 21} D m{sup −2} s{sup −1} in an electron cyclotron resonance (ECR) chamber. After irradiation, the surface morphology and D depth profiles of the dual-process coating were analyzed and compared to those of the simple PS W coating. Both the changes in surface morphology and the D retention were strongly dependent on the microstructure of the surface coating. Meanwhile, the existence of the Mo interlayer seemed to have no significant effect on the retention of deuterium.

  13. ANIMAL code

    Energy Technology Data Exchange (ETDEWEB)

    Lindemuth, I.R.

    1979-02-28

    This report describes ANIMAL, a two-dimensional Eulerian magnetohydrodynamic computer code, and presents its physical model. Temporal and spatial finite-difference equations are formulated in a manner that facilitates implementation of the algorithm, and the functions of the algorithm's FORTRAN subroutines and variables are outlined.

  14. Network Coding

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 15; Issue 7. Network Coding. K V Rashmi Nihar B Shah P Vijay Kumar. General Article Volume 15 Issue 7 July 2010 pp 604-621. Fulltext. Click here to view fulltext PDF. Permanent link: https://www.ias.ac.in/article/fulltext/reso/015/07/0604-0621 ...

  15. ANIMAL code

    International Nuclear Information System (INIS)

    Lindemuth, I.R.

    1979-01-01

    This report describes ANIMAL, a two-dimensional Eulerian magnetohydrodynamic computer code, and presents its physical model. Temporal and spatial finite-difference equations are formulated in a manner that facilitates implementation of the algorithm, and the functions of the algorithm's FORTRAN subroutines and variables are outlined

  16. Expander Codes

    Indian Academy of Sciences (India)

    Codes and Channels. A noisy communication channel is illustrated in Fig- ... nication channel. Suppose we want to transmit a message over the unreliable communication channel so that even if the channel corrupts some of the bits we are able to recover ..... is d-regular, meaning thereby that every vertex has degree d.

  17. Expander Codes

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 10; Issue 1. Expander Codes - The Sipser–Spielman Construction. Priti Shankar. General Article Volume 10 ... Author Affiliations. Priti Shankar1. Department of Computer Science and Automation, Indian Institute of Science Bangalore 560 012, India.

  18. The Cognitive Processes underlying Affective Decision-making Predicting Adolescent Smoking Behaviors in a Longitudinal Study

    Directory of Open Access Journals (Sweden)

    Lin eXiao

    2013-10-01

    Full Text Available This study investigates the relationship between three different cognitive processes underlying the Iowa Gambling Task (IGT) and adolescent smoking behaviors in a longitudinal study. We conducted a longitudinal study of 181 Chinese adolescents in Chengdu City, China. The participants were followed from 10th grade to 11th grade. When they were in the 10th grade (Time 1), we tested these adolescents' decision-making using the Iowa Gambling Task and working memory capacity using the Self-ordered Pointing Test (SOPT). Self-report questionnaires were used to assess school academic performance and smoking behaviors. The same questionnaires were completed again at the one-year follow-up (Time 2). The Expectancy-Valence (EV) Model was applied to decompose the IGT performance into three different underlying psychological components: (i) a motivational component, which indicates the subjective weight the adolescents assign to gains versus losses; (ii) a learning-rate component, which indicates sensitivity to recent outcomes versus past experiences; and (iii) a response component, which indicates how consistent the adolescents are between learning and responding. The subjective weight assigned to gains vs. losses at Time 1 significantly predicted current smokers and current smoking levels at Time 2, controlling for demographic variables and baseline smoking behaviors. Therefore, by decomposing the IGT into three different psychological components, we found that the motivational process of weighting gains vs. losses may serve as a neuropsychological marker to predict adolescent smoking behaviors in a general youth population.
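
    For reference, the standard Expectancy-Valence model equations take the following form (the formulation of Busemeyer and Stout, reconstructed here for illustration; the study's exact parameterization may differ):

        % Illustrative reconstruction of the standard EV model.
        \begin{align*}
          v(t)   &= (1 - w)\,\mathrm{win}(t) - w\,\mathrm{loss}(t)
                 && \text{valence, with attention weight } w \text{ to losses} \\
          E_j(t) &= E_j(t-1) + a\,\bigl[v(t) - E_j(t-1)\bigr]
                 && \text{expectancy update, with learning rate } a \\
          \Pr[C(t) = j] &= \frac{e^{\theta(t) E_j(t)}}{\sum_k e^{\theta(t) E_k(t)}},
          \quad \theta(t) = (t/10)^{c}
                 && \text{softmax choice, with consistency } c
        \end{align*}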

  19. The spermatogenic process of the common vampire bat Desmodus rotundus under a histomorphometric view.

    Directory of Open Access Journals (Sweden)

    Danielle Barbosa Morais

    Full Text Available Among all bat species, Desmodus rotundus stands out as one of the most intriguing due to its exclusively haematophagous feeding habits. However, little is known about its spermatogenic cycle. This study aimed at describing the spermatogenic process of common vampire bats through testicular histomorphometric characterization of adult specimens, spermatogenic production indexes, description of the stages of the seminiferous epithelium cycle and estimation of the duration of the spermatogenic process. Morphometrical and immunohistochemical analyses for bromodeoxyuridine were conducted under light microscopy, and ultrastructural analyses were performed under transmission electron microscopy. Vampire bats showed a higher investment in gonadal tissue (gonadosomatic index of 0.54%) and in seminiferous tubules (tubulesomatic index of 0.49%) when compared to larger mammals. They also showed a high tubular length per gram of testis (34.70 m). Approximately half of the intertubular compartment was found to be comprised of Leydig cells (51.20%), and an average of 23.77 x 10^6 of these cells was found per gram of testis. The germline cells showed a mitotic index of 16.93% and a meiotic index of 2.51%. The overall yield of spermatogenesis was 60%, and the testicular spermatic reserve was 71.44 x 10^7 spermatozoa per gram of testis. With a total spermatogenesis duration estimated at 37.02 days, vampire bats showed a daily sperm production of 86.80 x 10^6 gametes per gram of testis. These findings demonstrate a high sperm production, as is commonly observed in species with a promiscuous mating system.

  20. Accelerating process and catalyst development in reforming reactions with high throughput technologies under industrially relevant conditions

    Energy Technology Data Exchange (ETDEWEB)

    Schunk, S.A.; Bollmann, G.; Froescher, A.; Kaiser, H.; Lange de Oliveira, A.; Roussiere, T.; Wasserschaff, G. [hte Aktiengesellschaft, Heidelberg (Germany); Domke, I. [BASF SE, Ludwigshafen (Germany)

    2010-12-30

    The generation of hydrogen via reforming of a variety of carbon-containing feedstocks in the presence of water is to date one of the most versatile technologies for the production of hydrogen and syngas. Although these reforming technologies are in principle well established, understood and commercialized, a number of technological challenges are still not solved to a satisfactory degree, and there is constant demand for appropriate answers to the challenges posed. High-throughput experimentation can be a valuable tool in helping to accelerate the development of suitable solutions on the catalyst and process development side. In order to generate test data that are close or identical to process-relevant conditions, hte has developed a new portfolio of test technologies named Stage-IV technology. In contrast to earlier developments, which address small-scale testing with catalyst volumes of 1 ml up to 10 ml under isothermal conditions, the new portfolio offers test volumes at sub-pilot scale, with reactor dimensions close to technical applications. This not only ensures a good mimic of the hydrodynamic conditions of the technical scale, but also allows fingerprinting of features like temperature gradients in the catalyst bed, which play a large role in catalyst performance. Apart from catalyst tests with granulates when screening for optimized catalyst compositions, the units are designed to accommodate tests with shaped catalysts. To demonstrate how these technologies can accelerate catalyst and process development, we have chosen technically challenging application examples: (I) pre-reforming and reforming of methane-based feeds which accelerate coking and catalyst deactivation. Higher reaction pressures, high CO{sub 2} contents in the feedgas (which occur typically in sources like bio-gas or certain types of natural gas), the presence of higher alkanes

  1. Feature-coding transitions to conjunction-coding with progression through human visual cortex.

    Science.gov (United States)

    Cowell, Rosemary A; Leger, Krystal R; Serences, John T

    2017-12-01

    Identifying an object and distinguishing it from similar items depends upon the ability to perceive its component parts as conjoined into a cohesive whole, but the brain mechanisms underlying this ability remain elusive. The ventral visual processing pathway in primates is organized hierarchically: Neuronal responses in early stages are sensitive to the manipulation of simple visual features, whereas neuronal responses in subsequent stages are tuned to increasingly complex stimulus attributes. It is widely assumed that feature-coding dominates in early visual cortex whereas later visual regions employ conjunction-coding in which object representations are different from the sum of their simple feature parts. However, no study in humans has demonstrated that putative object-level codes in higher visual cortex cannot be accounted for by feature-coding and that putative feature codes in regions prior to ventral temporal cortex are not equally well characterized as object-level codes. Thus the existence of a transition from feature- to conjunction-coding in human visual cortex remains unconfirmed, and if a transition does occur its location remains unknown. By employing multivariate analysis of functional imaging data, we measure both feature-coding and conjunction-coding directly, using the same set of visual stimuli, and pit them against each other to reveal the relative dominance of one vs. the other throughout cortex. Our results reveal a transition from feature-coding in early visual cortex to conjunction-coding in both inferior temporal and posterior parietal cortices. This novel method enables the use of experimentally controlled stimulus features to investigate population-level feature and conjunction codes throughout human cortex. NEW & NOTEWORTHY We use a novel analysis of neuroimaging data to assess representations throughout visual cortex, revealing a transition from feature-coding to conjunction-coding along both ventral and dorsal pathways. Occipital
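
    The logic of pitting feature-coding against conjunction-coding can be illustrated with a toy computation: if a region is purely feature-coding, the pattern evoked by a feature conjunction should be a linear combination of the single-feature patterns, and unexplained residual variance points to conjunction-coding. The data below are synthetic and the analysis is a simplification, not the authors' multivariate pipeline.

```python
import numpy as np

# Two simulated "regions": one additive (feature-coding), one with an
# extra multiplicative conjunction component. We test how much of each
# conjunction pattern a linear sum of the feature patterns explains.

rng = np.random.default_rng(1)
n_vox = 200
pat_A = rng.normal(size=n_vox)          # pattern evoked by feature A alone
pat_B = rng.normal(size=n_vox)          # pattern evoked by feature B alone

region1 = 0.6 * pat_A + 0.4 * pat_B + 0.05 * rng.normal(size=n_vox)
region2 = 0.5 * pat_A + 0.5 * pat_B + 1.0 * pat_A * pat_B

for name, y in [("region1", region1), ("region2", region2)]:
    X = np.column_stack([pat_A, pat_B])
    beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    r2 = 1 - np.sum((y - X @ beta) ** 2) / np.sum((y - y.mean()) ** 2)
    print(name, "variance explained by feature sum: %.2f" % r2)
```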

  2. Effects of microbial processes on gas generation under expected WIPP repository conditions: Annual report through 1992

    Energy Technology Data Exchange (ETDEWEB)

    Francis, A.J.; Gillow, J.B.

    1993-09-01

    Microbial processes involved in gas generation from degradation of the organic constituents of transuranic waste under conditions expected at the Waste Isolation Pilot Plant (WIPP) repository are being investigated at Brookhaven National Laboratory. These laboratory studies are part of the Sandia National Laboratories -- WIPP Gas Generation Program. Gas generation due to microbial degradation of representative cellulosic waste was investigated in short-term (< 6 months) and long-term (> 6 months) experiments by incubating representative paper (filter paper, paper towels, and tissue) in WIPP brine under initially aerobic (air) and anaerobic (nitrogen) conditions. Samples from the WIPP surficial environment and underground workings harbor gas-producing halophilic microorganisms, the activities of which were studied in short-term experiments. The microorganisms metabolized a variety of organic compounds including cellulose under aerobic, anaerobic, and denitrifying conditions. In long-term experiments, the effects of added nutrients (trace amounts of ammonium nitrate, phosphate, and yeast extract), no nutrients, and nutrients plus excess nitrate on gas production from cellulose degradation were investigated.

  3. Will your words become mine? underlying processes and cowitness intimacy in the memory conformity paradigm.

    Science.gov (United States)

    Oeberst, Aileen; Seidemann, Julienne

    2014-06-01

    Eyewitness reports become less accurate after exposure to inconsistent information. When this phenomenon of diminishing accuracy occurs among cowitnesses, it is termed memory conformity or the social contagion effect. The present study set out to provide a rigorous test of the underlying mechanisms, with particular emphasis on investigating whether genuine false memory is involved. To this end, we conducted an earwitness experiment in which some participants were exposed to discrepant cowitness information and provided their recollections repeatedly and under different conditions. Additionally, we examined the impact of cowitness intimacy by using a random assignment procedure, an aspect that has not been previously studied. With regard to the underlying processes, our findings clearly indicate that informational rather than normative influence plays the dominant role. Moreover, highly accurate source attributions indicated that participants were aware of drawing on the recollections of their counterparts. Consequently, we did not obtain any evidence for false memory. With regard to cowitness intimacy, the results were inconsistent and call for further research.

  4. At-risk for pathological gambling: imaging neural reward processing under chronic dopamine agonists.

    Science.gov (United States)

    Abler, Birgit; Hahlbrock, Roman; Unrath, Alexander; Grön, Georg; Kassubek, Jan

    2009-09-01

    Treatment with dopamine receptor agonists has been associated with impulse control disorders and pathological gambling (PG) secondary to medication in previously unaffected patients with Parkinson's disease or restless legs syndrome (RLS). In a within-subjects design, we investigated the underlying neurobiology in RLS patients using functional magnetic resonance imaging. We scanned 12 female RLS patients without a history of PG. All patients were scanned twice: once whilst taking their regular medication with low-dose dopamine receptor agonists, and once after a washout interval. They performed an established gambling game task involving the expectation and receipt or omission of monetary rewards at different levels of probability. Upon expectation of rewards, reliable ventral striatal activation was detected only when patients were on, but not when patients were off, medication. Upon receipt or omission of rewards, the observed ventral striatal signal under medication differed markedly from its predicted pattern, which by contrast was apparent when patients were off medication. Orbitofrontal activation was not affected by medication. Chronic dopamine receptor agonist medication changed the neural signalling of reward expectation, predisposing the dopaminergic reward system to mediate an increased appetitive drive. Even without manifest PG, chronic medication with dopamine receptor agonists led to markedly changed neural processing of negative consequences, probably mediating dysfunctional learning of contingencies. Intact orbitofrontal functioning, potentially moderating impulse control, may explain why none of the patients actually developed PG. Our results support the notion of a general medication effect in patients under dopamine receptor agonists in terms of sensitization towards impulse control disorders.

  5. China’s Foreign- and Security-policy Decision-making Processes under Hu Jintao

    Directory of Open Access Journals (Sweden)

    Jean-Pierre Cabestan

    2009-10-01

    Full Text Available Since 1979, foreign- and security-policy-making and implementation processes have gradually and substantially changed. New modes of operation that consolidated under Hu Jintao actually took shape under Jiang Zemin in the 1990s, and some under Deng Xiaoping. While the military's role has diminished, that of diplomats, experts, and bureaucracies dealing with trade, international economic relations, energy, propaganda and education has increased. Decision making in this area has remained highly centralized and concentrated in the supreme leading bodies of the Chinese Communist Party (CCP). However, China's globalization and decentralization, as well as the increasing complexity of its international interests, have intensified the need to better coordinate the activities of the various CCP and state organs involved in foreign and security policy; hence the growing importance of the CCP leading small groups (foreign affairs, national security, Taiwan, etc.). But the rigidity of the current institutional pattern has so far foiled repeated attempts to establish a National Security Council.

  6. Experimental study and artificial neural network modeling of tartrazine removal by photocatalytic process under solar light.

    Science.gov (United States)

    Sebti, Aicha; Souahi, Fatiha; Mohellebi, Faroudja; Igoud, Sadek

    2017-07-01

    This research focuses on the application of an artificial neural network (ANN) to predict the removal efficiency of tartrazine from simulated wastewater using a photocatalytic process under solar illumination. A program was developed in Matlab software to optimize the neural network architecture and select the suitable combination of training algorithm, activation function and number of hidden neurons. The experimental results of a batch reactor operated under different conditions of pH, TiO2 concentration, initial organic pollutant concentration and solar radiation intensity were used to train, validate and test the networks. While negligible mineralization is demonstrated, the experimental results show that under sunlight irradiation, 85% of tartrazine is removed after 300 min using only 0.3 g/L of TiO2 powder. When the irradiation time is prolonged, almost 66% of total organic carbon is reduced after 15 hours. An ANN with a 5-8-1 architecture, the Bayesian regularization back-propagation algorithm and a hyperbolic tangent sigmoid transfer function is found able to predict the response with high accuracy. In addition, the connection weights approach is used to assess the contribution of each input variable to the ANN model response. Among the five experimental parameters, the irradiation time has the greatest effect on the removal efficiency of tartrazine.
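
    A minimal sketch of the 5-8-1 network described above (five inputs, eight hidden tanh neurons, one output). The paper trains with Matlab's Bayesian regularization back-propagation, which has no direct scikit-learn equivalent, so L-BFGS with an L2 penalty stands in here; all input rows and targets are invented for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import MinMaxScaler

# Hypothetical rows: [pH, TiO2 dose (g/L), initial dye conc. (mg/L),
# solar intensity (W/m2), irradiation time (min)]; y = removal (%).
X = np.array([[5.0, 0.3, 20.0, 800.0,  60.0],
              [7.0, 0.5, 40.0, 900.0, 120.0],
              [9.0, 0.1, 10.0, 700.0, 300.0],
              [6.0, 0.3, 30.0, 850.0, 180.0]])
y = np.array([35.0, 52.0, 70.0, 61.0])   # made-up targets

scaler = MinMaxScaler()
model = MLPRegressor(hidden_layer_sizes=(8,), activation='tanh',
                     solver='lbfgs', alpha=1e-3, max_iter=5000,
                     random_state=0)
model.fit(scaler.fit_transform(X), y)
print(model.predict(scaler.transform([[8.0, 0.3, 25.0, 800.0, 240.0]])))
```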

  7. Subcortical processing of speech regularities underlies reading and music aptitude in children.

    Science.gov (United States)

    Strait, Dana L; Hornickel, Jane; Kraus, Nina

    2011-10-17

    Neural sensitivity to acoustic regularities supports fundamental human behaviors such as hearing in noise and reading. Although the failure to encode acoustic regularities in ongoing speech has been associated with language and literacy deficits, how auditory expertise, such as the expertise that is associated with musical skill, relates to the brainstem processing of speech regularities is unknown. An association between musical skill and neural sensitivity to acoustic regularities would not be surprising given the importance of repetition and regularity in music. Here, we aimed to define relationships between the subcortical processing of speech regularities, music aptitude, and reading abilities in children with and without reading impairment. We hypothesized that, in combination with auditory cognitive abilities, neural sensitivity to regularities in ongoing speech provides a common biological mechanism underlying the development of music and reading abilities. We assessed auditory working memory and attention, music aptitude, reading ability, and neural sensitivity to acoustic regularities in 42 school-aged children with a wide range of reading ability. Neural sensitivity to acoustic regularities was assessed by recording brainstem responses to the same speech sound presented in predictable and variable speech streams. Through correlation analyses and structural equation modeling, we reveal that music aptitude and literacy both relate to the extent of subcortical adaptation to regularities in ongoing speech as well as to auditory working memory and attention. Relationships between music and speech processing are specifically driven by performance on a musical rhythm task, underscoring the importance of rhythmic regularity for both language and music. These data indicate common brain mechanisms underlying reading and music abilities that relate to how the nervous system responds to regularities in auditory input. Definition of common biological underpinnings

  8. Subcortical processing of speech regularities underlies reading and music aptitude in children

    Directory of Open Access Journals (Sweden)

    Strait Dana L

    2011-10-01

    Full Text Available Background: Neural sensitivity to acoustic regularities supports fundamental human behaviors such as hearing in noise and reading. Although the failure to encode acoustic regularities in ongoing speech has been associated with language and literacy deficits, how auditory expertise, such as the expertise that is associated with musical skill, relates to the brainstem processing of speech regularities is unknown. An association between musical skill and neural sensitivity to acoustic regularities would not be surprising given the importance of repetition and regularity in music. Here, we aimed to define relationships between the subcortical processing of speech regularities, music aptitude, and reading abilities in children with and without reading impairment. We hypothesized that, in combination with auditory cognitive abilities, neural sensitivity to regularities in ongoing speech provides a common biological mechanism underlying the development of music and reading abilities. Methods: We assessed auditory working memory and attention, music aptitude, reading ability, and neural sensitivity to acoustic regularities in 42 school-aged children with a wide range of reading ability. Neural sensitivity to acoustic regularities was assessed by recording brainstem responses to the same speech sound presented in predictable and variable speech streams. Results: Through correlation analyses and structural equation modeling, we reveal that music aptitude and literacy both relate to the extent of subcortical adaptation to regularities in ongoing speech as well as to auditory working memory and attention. Relationships between music and speech processing are specifically driven by performance on a musical rhythm task, underscoring the importance of rhythmic regularity for both language and music. Conclusions: These data indicate common brain mechanisms underlying reading and music abilities that relate to how the nervous system responds to

  9. Error Correcting Codes

    Indian Academy of Sciences (India)

    syndrome is an indicator of underlying disease. Here too, a non-zero syndrome is an indication that something has gone wrong during transmission. The first matrix on the left-hand side is called the parity check matrix H. Thus every codeword c satisfies the equation Hc^T = 0. Therefore the code can ...
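
    To make the excerpt concrete: for a linear block code with parity check matrix H, every codeword c satisfies Hc^T = 0 over GF(2), and a received word with a non-zero syndrome has been corrupted. Below is a sketch using the standard (7,4) Hamming code, which is not the article's own example.

```python
import numpy as np

# Columns of H are the binary representations of 1..7 (LSB in row 0), so
# the syndrome of a single-bit error directly encodes the error position.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

c = np.array([1, 0, 1, 1, 0, 1, 0])      # a valid codeword
assert not (H @ c % 2).any()             # syndrome is zero

r = c.copy()
r[4] ^= 1                                # channel flips bit 5 (1-based)
syndrome = H @ r % 2                     # non-zero: error detected
pos = int("".join(map(str, syndrome[::-1])), 2)  # read as binary number
print("syndrome", syndrome, "-> flip bit", pos)
```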

  10. Continuous-energy adjoint flux and perturbation calculation using the iterated fission probability method in Monte-Carlo code TRIPOLI-4 and underlying applications

    International Nuclear Information System (INIS)

    Truchet, G.; Leconte, P.; Peneliau, Y.; Santamarina, A.

    2013-01-01

    The first goal of this paper is to present an exact method able to precisely evaluate very small reactivity effects (<10 pcm) with a Monte Carlo code. It was decided to implement the exact perturbation theory in TRIPOLI-4 and, consequently, to calculate a continuous-energy adjoint flux. The Iterated Fission Probability (IFP) method was chosen because it has shown great results in other Monte Carlo codes. The IFP method uses a forward calculation to compute the adjoint flux, and consequently it does not rely on complex code modifications but on the physical definition of the adjoint flux as a phase-space neutron importance. In the first part of this paper, the IFP method implemented in TRIPOLI-4 is described. To illustrate the efficiency of the method, several adjoint fluxes are calculated and compared with their equivalents obtained by the deterministic code APOLLO-2. The new implementation can also calculate the angular adjoint flux. In the second part, a procedure to carry out an exact perturbation calculation is described. A single-cell benchmark was used to test the accuracy of the method against the 'direct' estimation of the perturbation. Once again the method based on the IFP shows good agreement, for a calculation time far shorter than that of the 'direct' method. The main advantage of the method is that the relative accuracy of the reactivity variation does not depend on the magnitude of the variation itself, which allows very small reactivity perturbations to be calculated with high precision. It also offers the possibility of splitting reactivity contributions by isotope and reaction. Other applications of this perturbation method, such as the calculation of exact kinetic parameters (βeff, Λeff) or sensitivity parameters, are presented and tested.
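
    For reference, the exact perturbation expression that such implementations evaluate can be written in the standard textbook form below (this is not an extract of the TRIPOLI-4 source). Here A and F are the transport and fission operators, primes denote the perturbed system, and the adjoint flux of the unperturbed system weights the perturbed forward flux:

```latex
\Delta\rho \;=\;
\frac{\left\langle \varphi^{\dagger},\,
      \left(\frac{1}{k'}\,\Delta F - \Delta A\right)\varphi' \right\rangle}
     {\left\langle \varphi^{\dagger},\, F\,\varphi' \right\rangle},
\qquad \Delta A = A' - A,\qquad \Delta F = F' - F .
```

    Because the adjoint-weighted inner products appear in both numerator and denominator, the statistical error of the estimate scales with the perturbation itself, which is why very small reactivity effects remain resolvable.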

  11. Evaluation of electron beam irradiation under heating process on vulcanized EPDM

    International Nuclear Information System (INIS)

    Gabriel, Leandro; Cardoso, Jessica R.; Moura, Eduardo; Geraldo, Aurea B.C.

    2015-01-01

    The global consumption of rubber was estimated at around 30.5 million tons in 2015, and this volume is expected to increase by 4.3% in the coming years. This demand is mainly attributed to the production of elastomeric accessories for the automotive sector. However, the generation of this type of waste also reaches major proportions at the end of its useful life, when the environmental liability must be disposed of. Rubber reprocessing is one alternative, in which the waste can be used as filler in other polymer matrices or in other types of materials. The devulcanization process is another alternative, and it includes the study of methods that allow economic viability and waste reduction. This study therefore aims to recycle vulcanized EPDM rubber with the use of ionizing radiation. In this work we use electron beam irradiation with simultaneous heating, at absorbed doses from 150 kGy to 800 kGy under a high dose rate of 22.3 kGy/s, on vulcanized EPDM powder and on samples about 4 mm thick. Their characterization before and after the irradiation process was performed by thermal analysis, and the observed changes are discussed. (author)

  12. Act quickly, decide later: long-latency visual processing underlies perceptual decisions but not reflexive behavior.

    Science.gov (United States)

    Jolij, Jacob; Scholte, H Steven; van Gaal, Simon; Hodgson, Timothy L; Lamme, Victor A F

    2011-12-01

    Humans largely guide their behavior by their visual representation of the world. Recent studies have shown that visual information can trigger behavior within 150 msec, suggesting that visually guided responses to external events, in fact, precede conscious awareness of those events. However, is such a view correct? By using a texture discrimination task, we show that the brain relies on long-latency visual processing in order to guide perceptual decisions. Decreasing stimulus saliency leads to selective changes in long-latency visually evoked potential components reflecting scene segmentation. These latency changes are accompanied by almost equal changes in simple RTs and points of subjective simultaneity. Furthermore, we find a strong correlation between individual RTs and the latencies of scene segmentation related components in the visually evoked potentials, showing that the processes underlying these late brain potentials are critical in triggering a response. However, using the same texture stimuli in an antisaccade task, we found that reflexive, but erroneous, prosaccades, but not antisaccades, can be triggered by earlier visual processes. In other words: The brain can act quickly, but decides late. Differences between our study and earlier findings suggesting that action precedes conscious awareness can be explained by assuming that task demands determine whether a fast and unconscious, or a slower and conscious, representation is used to initiate a visually guided response.

  13. Model of the heat load under dynamic abrasive processing of food material

    Directory of Open Access Journals (Sweden)

    G. V. Аlеksееv

    2016-01-01

    Full Text Available The modern stage in the improvement of food production is conditioned by intense competition over cost-performance, which is determined to a significant degree by the efficiency with which agricultural raw materials are used. At the same time, the unfavourable ecological conditions accompanying the life of our society require that foods exert various restorative influences on the organism. Researchers in many different countries are joining efforts to address these questions. The improvement and development of technology must rest on the study of existing practice, and such studies can draw on mathematical models of food products and of the corresponding processes created in different research organizations. The development of high-quality, in-demand, competitive products is the goal of every modern producer, who chooses the simplest, most effective and economically justified way of solving the given problems. Modern advances in the theory and practice of quality control and analysis allow fundamentally new methods to be used in determining possible negative changes in food products, in particular under heat processing. These methods, besides the traditional sensory component, also take into account a complex of analytical models for identifying undesirable heating regimes when processing the product for target groups of consumers (for instance, for therapeutic and preventive nutrition).

  14. Continuous-variable quantum error correction II: the Gottesman-Kitaev-Preskill code

    Science.gov (United States)

    Noh, Kyungjoo; Duivenvoorden, Kasper; Albert, Victor V.; Brierley, R. T.; Reinhold, Philip; Li, Linshu; Shen, Chao; Schoelkopf, R. J.; Girvin, S. M.; Terhal, Barbara M.; Jiang, Liang

    Recently, various single mode bosonic quantum error-correcting codes (e.g., cat codes and binomial codes) have been developed to correct errors due to excitation loss of bosonic systems. Meanwhile, the Gottesman-Kitaev-Preskill (GKP) codes do not follow the simple design guidelines of cat and binomial codes, but nevertheless demonstrate excellent performance in correcting bosonic loss errors. To understand the underlying mechanism of the GKP codes, we represent them using a superposition of coherent states, investigate their performance as approximate error-correcting codes, and identify the dominant types of uncorrectable errors. This understanding will help us to develop more robust codes against bosonic loss errors, which will be useful for robust quantum information processing with bosonic systems.
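
    As an illustration of representing a GKP-type state as a superposition of coherent states, the sketch below builds an approximate GKP |0> in a truncated Fock space from coherent states on the real axis with a Gaussian envelope. The lattice spacing, envelope width and truncation are illustrative choices, not parameters from the paper.

```python
import numpy as np
from math import factorial, pi, sqrt, exp

# Approximate GKP |0>: Gaussian-weighted superposition of coherent states
# |alpha_s> with alpha_s = s*sqrt(2*pi), so the position-space peaks sit
# at q = 2*s*sqrt(pi) (with q = (a + a^dagger)/sqrt(2)).

N = 80                                    # Fock-space truncation

def coherent(alpha, N):
    """Fock amplitudes of a coherent state |alpha> (real alpha here)."""
    n = np.arange(N)
    fact = np.array([float(factorial(k)) for k in range(N)])
    return np.exp(-alpha**2 / 2) * alpha**n / np.sqrt(fact)

Delta = 0.35                              # finite-energy envelope width
psi = np.zeros(N)
for s in range(-2, 3):
    alpha_s = s * sqrt(2 * pi)
    psi += exp(-(Delta * alpha_s)**2 / 2) * coherent(alpha_s, N)
psi /= np.linalg.norm(psi)
print("mean photon number: %.2f" % (np.arange(N) @ psi**2))
```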

  15. Review of SKB's Code Documentation and Testing

    International Nuclear Information System (INIS)

    Hicks, T.W.

    2005-01-01

    SKB is in the process of developing the SR-Can safety assessment for a KBS 3 repository. The assessment will be based on quantitative analyses using a range of computational codes aimed at developing an understanding of how the repository system will evolve. Clear and comprehensive code documentation and testing will engender confidence in the results of the safety assessment calculations. This report presents the results of a review undertaken on behalf of SKI aimed at providing an understanding of how codes used in the SR 97 safety assessment and those planned for use in the SR-Can safety assessment have been documented and tested. Having identified the codes used by SKB, several codes were selected for review. Consideration was given to codes used directly in SKB's safety assessment calculations as well as to some of the less visible codes that are important in quantifying the different repository barrier safety functions. SKB's documentation and testing of the following codes were reviewed: COMP23 - a near-field radionuclide transport model developed by SKB for use in safety assessment calculations. FARF31 - a far-field radionuclide transport model developed by SKB for use in safety assessment calculations. PROPER - SKB's harness for executing probabilistic radionuclide transport calculations using COMP23 and FARF31. The integrated analytical radionuclide transport model that SKB has developed to run in parallel with COMP23 and FARF31. CONNECTFLOW - a discrete fracture network model/continuum model developed by Serco Assurance (based on the coupling of NAMMU and NAPSAC), which SKB is using to combine hydrogeological modelling on the site and regional scales in place of the HYDRASTAR code. DarcyTools - a discrete fracture network model coupled to a continuum model, recently developed by SKB for hydrogeological modelling, also in place of HYDRASTAR. ABAQUS - a finite element material model developed by ABAQUS, Inc, which is used by SKB to model repository buffer

  16. Panda code

    International Nuclear Information System (INIS)

    Altomare, S.; Minton, G.

    1975-02-01

    PANDA is a new two-group one-dimensional (slab/cylinder) neutron diffusion code designed to replace and extend the FAB series. PANDA allows for the nonlinear effects of xenon, enthalpy and Doppler. Fuel depletion is allowed. PANDA has a completely general search facility which will seek criticality, maximize reactivity, or minimize peaking. Any single parameter may be varied in a search. PANDA is written in FORTRAN IV, and as such is nearly machine independent. However, PANDA has been written with the present limitations of the Westinghouse CDC-6600 system in mind. Most computation loops are very short, and the code is less than half the useful 6600 memory size so that two jobs can reside in the core at once. (auth)
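
    To illustrate the kind of computation a two-group, one-dimensional diffusion code performs, here is a minimal slab-geometry power-iteration solver for k-effective. The cross sections are invented round numbers, and the scheme omits everything that makes PANDA useful (xenon, Doppler, depletion, search facilities); it is a sketch of the underlying numerics only.

```python
import numpy as np

n, L = 50, 100.0                       # interior mesh points, slab width (cm)
h = L / (n + 1)
D1, D2 = 1.4, 0.4                      # group diffusion coefficients (cm)
Sa1, Sa2, S12 = 0.010, 0.080, 0.020    # absorption, group 1->2 scattering
nSf1, nSf2 = 0.005, 0.125              # nu * fission cross sections

# central-difference Laplacian with zero-flux boundaries at both edges
lap = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
       + np.diag(np.ones(n - 1), -1)) / h**2
A1 = -D1 * lap + (Sa1 + S12) * np.eye(n)   # group-1 loss operator
A2 = -D2 * lap + Sa2 * np.eye(n)           # group-2 loss operator

phi1, phi2, k = np.ones(n), np.ones(n), 1.0
for _ in range(200):                       # power iteration on fission source
    S = nSf1 * phi1 + nSf2 * phi2
    phi1 = np.linalg.solve(A1, S / k)      # fast group driven by fission
    phi2 = np.linalg.solve(A2, S12 * phi1) # thermal group fed by slowing-down
    k_new = np.sum(nSf1 * phi1 + nSf2 * phi2) / np.sum(S / k)
    if abs(k_new - k) < 1e-8:
        break
    k = k_new
print("k-effective ~ %.5f" % k)
```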

  17. User's manual for a measurement simulation code

    International Nuclear Information System (INIS)

    Kern, E.A.

    1982-07-01

    The MEASIM code has been developed primarily for modeling process measurements in materials processing facilities associated with the nuclear fuel cycle. In addition, the code computes materials balances and the summation of materials balances, along with associated variances. The code has been used primarily in performance assessment of materials accounting systems. This report provides the necessary information for a potential user to employ the code in these applications. A number of examples that demonstrate most of the capabilities of the code are provided.
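
    The bookkeeping at the heart of such a code is compact. Below is a sketch of a materials balance and its variance, assuming independent measurement errors so that variances simply add; the function names and values are invented for illustration, not MEASIM's interface.

```python
import numpy as np

def material_balance(begin_inv, receipts, shipments, end_inv):
    """MB = beginning inventory + receipts - shipments - ending inventory."""
    return begin_inv + receipts - shipments - end_inv

def balance_variance(*variances):
    # for independent measurements the variances of the terms simply add
    return sum(variances)

mb = material_balance(begin_inv=102.1, receipts=50.4,
                      shipments=49.8, end_inv=102.0)
var = balance_variance(0.04, 0.02, 0.02, 0.04)
print("MB = %.2f kg, sigma = %.2f kg" % (mb, np.sqrt(var)))
```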

  18. Total-Evidence Dating under the Fossilized Birth–Death Process

    Science.gov (United States)

    Zhang, Chi; Stadler, Tanja; Klopfstein, Seraina; Heath, Tracy A.; Ronquist, Fredrik

    2016-01-01

    Bayesian total-evidence dating involves the simultaneous analysis of morphological data from the fossil record and morphological and sequence data from recent organisms, and it accommodates the uncertainty in the placement of fossils while dating the phylogenetic tree. Due to the flexibility of the Bayesian approach, total-evidence dating can also incorporate additional sources of information. Here, we take advantage of this and expand the analysis to include information about fossilization and sampling processes. Our work is based on the recently described fossilized birth–death (FBD) process, which has been used to model speciation, extinction, and fossilization rates that can vary over time in a piecewise manner. So far, sampling of extant and fossil taxa has been assumed to be either complete or uniformly at random, an assumption which is only valid for a minority of data sets. We therefore extend the FBD process to accommodate diversified sampling of extant taxa, which is standard practice in studies of higher-level taxa. We verify the implementation using simulations and apply it to the early radiation of Hymenoptera (wasps, ants, and bees). Previous total-evidence dating analyses of this data set were based on a simple uniform tree prior and dated the initial radiation of extant Hymenoptera to the late Carboniferous (309 Ma). The analyses using the FBD prior under diversified sampling, however, date the radiation to the Triassic and Permian (252 Ma), slightly older than the age of the oldest hymenopteran fossils. By exploring a variety of FBD model assumptions, we show that it is mainly the accommodation of diversified sampling that causes the push toward more recent divergence times. Accounting for diversified sampling thus has the potential to close the long-discussed gap between rocks and clocks. We conclude that the explicit modeling of fossilization and sampling processes can improve divergence time estimates, but only if all important model aspects
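
    The constant-rate core of the fossilized birth-death process is easy to simulate. The sketch below draws event times Gillespie-style with speciation rate lam, extinction rate mu and fossil-recovery rate psi, tracking only counts rather than the full tree; the piecewise-constant rates and diversified sampling of extant taxa discussed above are deliberately omitted.

```python
import random

def simulate_fbd(lam=1.0, mu=0.5, psi=0.1, t_max=10.0, seed=1):
    """Counts of extant lineages and sampled fossils under constant rates."""
    random.seed(seed)
    t, lineages, fossils = 0.0, 1, []
    while t < t_max and 0 < lineages < 10000:
        total = lineages * (lam + mu + psi)   # total event rate
        t += random.expovariate(total)        # waiting time to next event
        if t >= t_max:
            break
        u = random.random() * (lam + mu + psi)
        if u < lam:
            lineages += 1                     # speciation
        elif u < lam + mu:
            lineages -= 1                     # extinction
        else:
            fossils.append(t)                 # fossil sampled on a lineage
    return lineages, fossils

extant, fossils = simulate_fbd()
print(extant, "extant lineages;", len(fossils), "fossils")
```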

  19. Autoradiographic studies of the intensity of morphogenetic processes in the bone skeleton under modeling microgravity

    Science.gov (United States)

    Rodionova, N. V.; Zolotova-Haidamaka, N. V.; Nithevich, T. P.

    In ontogenesis, the development of long skeleton bones and the reconstruction of bone structures during adaptive remodeling are performed through a combination of bone apposition and bone resorption processes. Using radioactive markers of specific biosyntheses, 3H-thymidine and 3H-glycine, we studied the dynamics and peculiarities of these processes under hypokinesia by unloading the hind limbs of young white rats (tail suspension method) for 28 days. The radionuclides were administered in a single dose at the end of the experiment, and the biomaterial was taken 1, 24, 48, 120 and 192 h after injection. In histoautographs, counts were made of the nuclei labeling index (3H-thymidine) and of the number of silver grains over the cells and in the forming bone matrix in the growth and remodeling zones of the femoral bone (3H-glycine). A tendency for a reduction of the labeling index in the 3H-thymidine-labeled osteogenic cells in the periost and endost was established. The dynamics of labeled cells at various intervals after 3H-thymidine injection testifies to a delay in the rates of osteoblasts' differentiation and their transformation into osteocytes in the experimental animals. 3H-glycine is assimilated by osteogenic cells 30 min after radionuclide injection, and after 24 h it is already incorporated into the forming bone matrix; as a result of appositional bone deposition, by 192 h the silver grains are registered in the bone matrix as "labeling lines". A lower 3H-glycine uptake by the osteogenic cells and bone matrix compared with the control is indicative of a decrease of the osteoplastic process under hypokinesia, particularly in the periost. At the same time, the resorption and remodeling bone zones reveal regions of intensive 3H-glycine uptake after 1 and 24 h. We associate this latter fact with an activation of collagen proteins in the differentiating fibroblasts (instead of osteoblasts) in these locations. This is confirmed by our previous

  20. Changes of the intensity of morphogenetic process in the bone skeleton under lowering of gravitational loading

    Science.gov (United States)

    Vasilievna Rodionova, Natalia; Zolotova-Haidamaka, Nadezhda

    The development of long skeleton bones and the reconstruction of bone structures in ontogenesis during adaptive remodeling are performed through a combination of bone apposition and bone resorption processes. Using radioactive markers of specific biosyntheses, 3H-thymidine and 3H-glycine, we studied the dynamics and peculiarities of these processes under modeled microgravity conditions by unloading the hind limbs of young white rats (tail suspension method) for 28 days. The radionuclides were administered in a single dose at the end of the experiment, and the biomaterial was taken 1, 24, 48, 120 and 192 h after injection. In histoautographs, counts were made of the nuclei labeling index (3H-thymidine) and of the number of silver grains over the cells and in the forming bone matrix in the growth and remodeling zones of the femoral bone (3H-glycine). A tendency for a reduction of the labeling index in the 3H-thymidine-labeled osteogenic cells in the periost and endost was established. The dynamics of labeled cells at various intervals after 3H-thymidine injection testifies to a delay in the rates of osteoblasts' differentiation and their transformation into osteocytes in the experimental animals. 3H-glycine is assimilated by osteogenic cells 30 min after radionuclide injection, and after 24 h it is already incorporated into the forming bone matrix; as a result of appositional bone deposition, by 192 h the silver grains are registered in the bone matrix as "labeling lines". A lower 3H-glycine uptake by the osteogenic cells and bone matrix compared with the control is indicative of a decrease of the osteoplastic process under hypokinesia, particularly in the periost. At the same time, the resorption and remodeling bone zones reveal regions of intensive 3H-glycine uptake after 1 and 24 h. We associate this latter fact with an activation of collagen proteins in the differentiating fibroblasts (instead of osteoblasts) in these locations. This is

  1. Individual differences in impression management: an exploration of the psychological processes underlying faking

    Directory of Open Access Journals (Sweden)

    ROSE A. MUELLER-HANSON

    2006-09-01

    Full Text Available The present study proposes and tests a model of the psychological processes underlying faking, which integrates concepts from earlier models of faking by McFarland and Ryan (2000, 2001) and Snell, Sydell, and Lueke (1999). The results provided partial support for the model, suggesting that personality factors and perceptions of situational factors contribute to faking behavior. The implications of these findings are (a) that people differ with regard to how much they will fake on a personality test in a simulated employment setting, with some people faking substantially and others faking very little or not at all, and (b) that the extent to which an individual fakes is partially determined by the person's attitudes and personality characteristics. The present findings are interpreted and discussed, and might be useful for the prevention and mitigation of faking by altering people's beliefs about their ability to fake and the appropriateness of faking.

  2. In situ observation of magnetic orientation process of feeble magnetic materials under high magnetic fields

    Directory of Open Access Journals (Sweden)

    Noriyuki Hirota et al

    2008-01-01

    Full Text Available An in situ microscopic observation of the magnetic orientation process of feeble magnetic fibers was carried out under high magnetic fields of up to 10 T using a scanning laser microscope. In the experiment, carbon fibers and needle-like titania fibers with a length of 1 to 20 μm were used. The fibers were observed to gradually orient their axes parallel to the direction of the magnetic field. The orientation behavior of the sample fibers was evaluated on the basis of the measured duration required for a certain angular variation. As predicted from the theoretical consideration, it was confirmed that the duration required for a certain angular variation normalized by the viscosity of the fluid is described as a function of the fiber length. The results obtained here appear useful for the consideration of the magnetic orientation of materials suspended in a static fluid.
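
    The theoretical consideration referred to above is, in its standard textbook form, a balance between the magnetic torque on a fiber with susceptibility anisotropy Δχ and viscous drag. A hedged sketch of that balance (not necessarily the authors' exact formulation) is:

```latex
\xi\,\frac{d\theta}{dt}
  \;=\; -\,\frac{\Delta\chi\, V B^{2}}{2\mu_{0}}\,\sin\theta\cos\theta ,
\qquad
t_{1\rightarrow 2}
  \;=\; \frac{2\mu_{0}\,\xi}{\Delta\chi\, V B^{2}}\,
        \ln\frac{\tan\theta_{1}}{\tan\theta_{2}} ,
```

    where θ is the angle between the fiber axis and the field, V the fiber volume, and ξ the rotational drag coefficient, which is proportional to the fluid viscosity and a strong function of fiber length. Dividing the measured duration of an angular change by the viscosity therefore isolates the length dependence, as the experiment confirms.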

  3. Observation of damage process in RC beams under cyclic bending by acoustic emission

    International Nuclear Information System (INIS)

    Shigeishi, Mitsuhiro; Ohtsu, Masayasu; Tsuji, Nobuyuki; Yasuoka, Daisuke

    1997-01-01

    Reinforced concrete (RC) structures are generally applied to the construction of buildings and bridges and are incessantly subjected to cyclic loading. It is considered that detected acoustic emission (AE) waveforms are associated with the damage degree and the fracture mechanisms of RC structures. Therefore, cyclic bending tests were applied to damaged RC beam specimens. To evaluate the interior of the damaged RC beams, the AE source kinematics were determined by the 'SiGMA' procedure for AE moment tensor analysis. Using the 'SiGMA' procedure, AE source kinematics such as source locations, crack types, crack orientations and crack motions can be identified. The results show the applicability of the approach to observing the fracture process under cyclic bending load and evaluating the degree of damage of RC beams.

  4. Nonepileptic seizures under levetiracetam therapy: a case report of forced normalization process.

    Science.gov (United States)

    Anzellotti, Francesca; Franciotti, Raffaella; Zhuzhuni, Holta; D'Amico, Aurelio; Thomas, Astrid; Onofrj, Marco

    2014-01-01

    Nonepileptic seizures (NES) apparently look like epileptic seizures, but are not associated with ictal electrical discharges in the brain. NES constitute one of the most important differential diagnoses of epilepsy. They have been recognized as a distinctive clinical phenomenon for centuries, and video/electroencephalogram monitoring has allowed clinicians to make near-certain diagnoses. NES are supposedly unrelated to organic brain lesions, and despite the preponderance of a psychiatric/psychological context, they may have an iatrogenic origin. We report a patient with NES precipitated by levetiracetam therapy; in this case, NES was observed during the disappearance of epileptiform discharges from the routine video/electroencephalogram. We discuss the possible mechanisms underlying NES with regard to alternative psychoses associated with the phenomenon of the forced normalization process.

  5. PEAR code review

    International Nuclear Information System (INIS)

    De Wit, R.; Jamieson, T.; Lord, M.; Lafortune, J.F.

    1997-07-01

    As a necessary component in the continuous improvement and refinement of methodologies employed in the nuclear industry, regulatory agencies need to periodically evaluate these processes to improve confidence in results and ensure appropriate levels of safety are being achieved. The independent and objective review of industry-standard computer codes forms an essential part of this program. To this end, this work undertakes an in-depth review of the computer code PEAR (Public Exposures from Accidental Releases), developed by Atomic Energy of Canada Limited (AECL) to assess accidental releases from CANDU reactors. PEAR is based largely on the models contained in the Canadian Standards Association (CSA) N288.2-M91. This report presents the results of a detailed technical review of the PEAR code to identify any variations from the CSA standard and other supporting documentation, verify the source code, assess the quality of numerical models and results, and identify general strengths and weaknesses of the code. The version of the code employed in this review is the one which AECL intends to use for CANDU 9 safety analyses. (author)

  6. Sandia National Laboratories analysis code data base

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, C.W.

    1994-11-01

    Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The Laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code "ownership" and release status, and references describing the physical models and numerical implementation.

  7. Multi-Period Dynamic Optimization for Large-Scale Differential-Algebraic Process Models under Uncertainty

    Directory of Open Access Journals (Sweden)

    Ian D. Washington

    2015-07-01

    Full Text Available A technique for optimizing large-scale differential-algebraic process models under uncertainty using a parallel embedded model approach is developed in this article. A combined multi-period multiple-shooting discretization scheme is proposed, which creates a significant number of independent numerical integration tasks for each shooting interval over all scenario/period realizations. Each independent integration task is able to be solved in parallel as part of the function evaluations within a gradient-based non-linear programming solver. The focus of this paper is on demonstrating potential computation performance improvement when the embedded differential-algebraic equation model solution of the multi-period discretization is implemented in parallel. We assess our parallel dynamic optimization approach on two case studies; the first is a benchmark literature problem, while the second is a large-scale air separation problem that considers a robust set-point transition under parametric uncertainty. Results indicate that focusing on the speed-up of the embedded model evaluation can significantly decrease the overall computation time; however, as the multi-period formulation grows with increased realizations, the computational burden quickly shifts to the internal computation performed within the non-linear programming algorithm. This highlights the need for further decomposition, structure exploitation and parallelization within the non-linear programming algorithm and is the subject for further investigation.
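
    The parallelism described above can be sketched in a few lines: in a combined multi-period multiple-shooting discretization, each (scenario, shooting-interval) pair is an independent initial value problem, so the embedded model integrations can be distributed over a process pool. The toy model below is a scalar ODE with one uncertain parameter per period, standing in for the large DAE process model; all names and values are invented.

```python
import numpy as np
from multiprocessing import Pool
from scipy.integrate import solve_ivp

def integrate_interval(task):
    """Integrate one shooting interval: dx/dt = -p*x from t0 to t1."""
    x0, t0, t1, p = task
    sol = solve_ivp(lambda t, x: -p * x, (t0, t1), [x0], rtol=1e-8)
    return sol.y[0, -1]

if __name__ == "__main__":
    periods = [0.8, 1.0, 1.2]                 # parameter realizations
    nodes = np.linspace(0.0, 1.0, 5)          # shooting grid per period
    guesses = np.ones(len(nodes) - 1)         # shooting-node state guesses
    tasks = [(guesses[i], nodes[i], nodes[i + 1], p)
             for p in periods for i in range(len(nodes) - 1)]
    with Pool() as pool:                      # all intervals in parallel
        endpoints = pool.map(integrate_interval, tasks)
    # an NLP solver would drive the continuity defects between intervals
    # to zero; here we just report the interval endpoints
    print(np.round(endpoints, 4))
```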

  8. Light-induced magnetoresistance in solution-processed planar hybrid devices measured under ambient conditions

    Directory of Open Access Journals (Sweden)

    Sreetama Banerjee

    2017-07-01

    Full Text Available We report light-induced negative organic magnetoresistance (OMAR) measured in ambient atmosphere in solution-processed 6,13-bis(triisopropylsilylethynyl)pentacene (TIPS-pentacene) planar hybrid devices with two different device architectures. Hybrid electronic devices with trench-isolated electrodes (HED-TIE) having a channel length of ca. 100 nm fabricated in this work and, for comparison, commercially available pre-structured organic field-effect transistor (OFET) substrates with a channel length of 20 µm were used. The magnitude of the photocurrent as well as the magnetoresistance was found to be higher for the HED-TIE devices because of the much smaller channel length of these devices compared to the OFETs. We attribute the observed light-induced negative magnetoresistance in TIPS-pentacene to the presence of electron–hole pairs under illumination, as the magnetoresistive effect scales with the photocurrent. The magnetoresistance effect was found to diminish over time under ambient conditions compared to a freshly prepared sample. We propose that the much faster degradation of the magnetoresistance effect as compared to the photocurrent is due to the incorporation of water molecules in the TIPS-pentacene film.

  9. Light-induced magnetoresistance in solution-processed planar hybrid devices measured under ambient conditions.

    Science.gov (United States)

    Banerjee, Sreetama; Bülz, Daniel; Reuter, Danny; Hiller, Karla; Zahn, Dietrich R T; Salvan, Georgeta

    2017-01-01

    We report light-induced negative organic magnetoresistance (OMAR) measured in ambient atmosphere in solution-processed 6,13-bis(triisopropylsilylethynyl)pentacene (TIPS-pentacene) planar hybrid devices with two different device architectures. Hybrid electronic devices with trench-isolated electrodes (HED-TIE) having a channel length of ca. 100 nm fabricated in this work and, for comparison, commercially available pre-structured organic field-effect transistor (OFET) substrates with a channel length of 20 µm were used. The magnitude of the photocurrent as well as the magnetoresistance was found to be higher for the HED-TIE devices because of the much smaller channel length of these devices compared to the OFETs. We attribute the observed light-induced negative magnetoresistance in TIPS-pentacene to the presence of electron-hole pairs under illumination as the magnetoresistive effect scales with the photocurrent. The magnetoresistance effect was found to diminish over time under ambient conditions compared to a freshly prepared sample. We propose that the much faster degradation of the magnetoresistance effect as compared to the photocurrent was due to the incorporation of water molecules in the TIPS-pentacene film.

  10. Towards advanced code simulators

    International Nuclear Information System (INIS)

    Scriven, A.H.

    1990-01-01

    The Central Electricity Generating Board (CEGB) uses advanced thermohydraulic codes extensively to support PWR safety analyses. A system has been developed to allow fully interactive execution of any code with graphical simulation of the operator desk and mimic display. The system operates in a virtual machine environment, with the thermohydraulic code executing in one virtual machine, communicating via interrupts with any number of other virtual machines each running other programs and graphics drivers. The driver code itself does not have to be modified from its normal batch form. Shortly following the release of RELAP5 MOD1 in IBM-compatible form in 1983, this code was used as the driver for this system. When RELAP5 MOD2 became available, it was adopted with no changes needed in the basic system. Overall the system has been used for some 5 years for the analysis of LOBI tests, full-scale plant studies and simple what-if studies. For gaining rapid understanding of system dependencies it has proved invaluable. The graphical mimic system, being independent of the driver code, has also been used with other codes to study core rewetting, to replay results obtained from batch jobs on a CRAY2 computer system and to display suitably processed experimental results from the LOBI facility to aid interpretation. For the above work real-time execution was not necessary. Current work now centers on implementing the RELAP5 code on a true parallel architecture machine. Marconi Simulation have been contracted to investigate the feasibility of using upwards of 100 processors, each capable of a peak of 30 MIPS, to run a highly detailed RELAP5 model in real time, complete with specially written 3D core neutronics and balance-of-plant models. This paper describes the experience of using RELAP5 as an analyzer/simulator, and outlines the proposed methods and problems associated with parallel execution of RELAP5.

  11. 30 CFR 285.612 - How will my SAP be processed for Federal consistency under the Coastal Zone Management Act?

    Science.gov (United States)

    2010-07-01

    § 285.612 How will my SAP be processed for Federal consistency under the Coastal Zone Management Act? Your SAP will be processed based on how your commercial lease was issued. (A graphic, rendered in the CFR as ER29AP09.118, specifies the processing for each case and is not reproduced here.)

  12. Aspects of Information Architecture involved in process mapping in Military Organizations under the semiotic perspective

    Directory of Open Access Journals (Sweden)

    Mac Amaral Cartaxo

    2016-04-01

    Full Text Available Introduction: The description of the processes to represent the activities in an organization has important call semiotic, It is the flowcharts of uses, management reports and the various forms of representation of the strategies used. The subsequent interpretation of the organization's employees involved in learning tasks and the symbols used to translate the meanings of management practices is essential role for the organization. Objective: The objective of this study was to identify evidence of conceptual and empirical, on aspects of information architecture involved in the mapping process carried out in military organizations under the semiotic perspective. Methodology: The research is characterized as qualitative, case study and the data collection technique was the semi-structured interview, applied to management advisors. Results: The main results indicate that management practices described with the use of pictorial symbols and different layouts have greater impact to explain the relevance of management practices and indicators. Conclusion: With regard to the semiotic appeal, it was found that the impact of a management report is significant due to the use of signs and layout that stimulate further reading by simplifying complex concepts in tables, diagrams summarizing lengthy descriptions.

  13. The metaphor-gestalt synergy underlying the self-organisation of perception as a semiotic process.

    Science.gov (United States)

    Rail, David

    2013-04-01

    Recently the basis of concept and language formation has been redefined by the proposal that they both stem from perception and embodiment. The experiential revolution has led to a far more integrated and dynamic understanding of perception as a semiotic system. The emergence of meaning in the perceptual process stems from the interaction between two key mechanisms. The first is the generation of schemata through recurrent sensorimotor activity (SM) that underlies category and language formation (L). The second is the interaction between metaphor (M) and gestalt mechanisms (G) that generate invariant mappings beyond the SM domain, which both conserve and diversify our understanding and meaning potential. We propose an important advance in our understanding of perception as a semiotic system through exploring the effect of self-organising to criticality, where hierarchical behaviour becomes widely integrated through 1/f processes and isomorphisms. Our proposal leads to several important implications. First, that SM and L form a functional isomorphism, depicted as SM-L. We contend that SM-L is emergent, corresponding to the phenomenal self. Second, that meaning structures the isomorphism SM-L through the synergy between M and G (M-G). M-G synergy is based on a combination of structuring and imagination. We contend that the interaction between M-G and SM-L functions as a macro-micro computation that governs perception as semiosis. We discuss how our model relates to current research in fractal time and verb formation.

  14. Stability of zinc stearate under alpha irradiation in the manufacturing process of SFR nuclear fuels

    Science.gov (United States)

    Gracia, J.; Vermeulen, J.; Baux, D.; Sauvage, T.; Venault, L.; Audubert, F.; Colin, X.

    2018-03-01

    The manufacture of new fuels for sodium-cooled fast reactors (SFRs) will involve powders derived from recycling existing fuels, in order to keep producing electricity while saving natural resources and reducing the amount of waste produced by spent MOX fuels. Using recycled plutonium in this way will significantly increase the amount of 238Pu, a high-energy alpha emitter, in the powders. The process of shaping powders by pressing requires the use of a solid lubricant, zinc stearate, to produce defect-free pellets compliant with the standards. The purpose of this study is to determine the impact of alpha radiolysis on this additive and its lubrication properties. Experiments were conducted on samples in contact with PuO2, as well as under external helium ion beam irradiation, in order to define the kinetics of radiolytic gas generation. The yield results relating to the formation of these gases (G0) show that the alpha radiation of plutonium can be simulated using external helium ion beam irradiation. The isotopic composition of plutonium has little impact on the yield; however, an increased yield was observed overall with increasing mean linear energy transfer (LET). A radiolytic degradation process is proposed.

  15. Utilizing Virtual Reality to Understand Athletic Performance and Underlying Sensorimotor Processing

    Directory of Open Access Journals (Sweden)

    Toshitaka Kimura

    2018-02-01

    Full Text Available In behavioral sports sciences, knowledge of athletic performance and the underlying sensorimotor processing remains limited, because most data are obtained in the laboratory. In laboratory experiments we can strictly control the measurement conditions, but the actions we can target may be limited and may differ from actual sporting actions, so the obtained data are potentially unrealistic. We propose using virtual reality (VR) technology to compensate for this lack of realism. We have developed a head-mounted display (HMD)-based VR system for baseball batting in which the user can experience hitting a pitch in a virtual baseball stadium. The batter's and the bat's movements are measured using nine-axis inertial sensors attached to various parts of the body and bat, and they are represented by a virtual avatar in real time. The pitched balls are depicted by computer graphics based on previously recorded ball trajectories and are thrown in time with the motion of a pitcher avatar based on simultaneously recorded motion capture data. The ball bounces depending on its interaction with the bat. In a preliminary measurement in which the VR system was combined with measurement equipment, we found some differences between the behavioral and physiological data (i.e., body movements and respiration) of experts and beginners and between the types of pitches during virtual batting. This VR system, with a sufficiently real visual experience, will provide novel findings on athletic performance that were formerly hard to obtain and allow us to elucidate the underlying sensorimotor processing in detail.

  16. iPSC-Based Models to Unravel Key Pathogenetic Processes Underlying Motor Neuron Disease Development

    Directory of Open Access Journals (Sweden)

    Irene Faravelli

    2014-10-01

    Motor neuron diseases (MNDs) are neuromuscular disorders affecting almost exclusively upper motor neurons (UMNs) and/or lower motor neurons (LMNs). The clinical phenotype is characterized by muscular weakness and atrophy leading to paralysis and, almost invariably, death due to respiratory failure. Adult MNDs include sporadic and familial amyotrophic lateral sclerosis (sALS-fALS), while the most common infantile MND is spinal muscular atrophy (SMA). No effective treatment is currently available for MNDs, as for the vast majority of neurodegenerative disorders, and care is limited to supportive measures and symptom relief. The difficulty of finding a cure reflects the lack of a deep understanding of MND pathogenesis, together with the scarcity of reliable in vitro models. Recent progress in the stem cell field, in particular the generation of induced pluripotent stem cells (iPSCs), has made it possible for the first time to obtain substantial amounts of human cells with which to recapitulate in vitro some of the key pathogenetic processes underlying MNDs. In the present review, recently published studies using iPSCs to unravel aspects of ALS and SMA pathogenesis are discussed, with an overview of their implications for finding a cure for these still-orphan disorders.

  17. Numerical simulation of the shot peening process under previous loading conditions

    International Nuclear Information System (INIS)

    Romero-Ángeles, B; Urriolagoitia-Sosa, G; Torres-San Miguel, C R; Molina-Ballinas, A; Benítez-García, H A; Vargas-Bustos, J A; Urriolagoitia-Calderón, G

    2015-01-01

    This research presents a numerical simulation of the shot peening process and determines the residual stress field induced into a component with a previous loading history. The importance of this analysis lies in the fact that mechanical elements subjected to shot peening have also undergone manufacturing processes that convert raw material into a finished product. Material is therefore not supplied in a virgin state; it carries a previous loading history from the manner in which it was fabricated. This condition could alter some beneficial aspects of the residual stress induced by shot peening and could accelerate crack nucleation and propagation. Studies were performed on beams subjected to strain hardening in tension (5ε_y) before shot peening was applied. These results were then compared with a numerical assessment of the residual stress field induced by shot peening in a component (beam) without any previous loading history. This paper clearly shows the detrimental or beneficial effect that a previous loading history can have on a mechanical component and how it can be controlled to improve the mechanical behavior of the material.
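
    The role of prior strain hardening can be illustrated with a one-dimensional elastoplastic return-mapping sketch in Python (our own illustration with placeholder material constants, not the paper's finite element model), showing how pre-straining to 5ε_y raises the flow stress that a subsequent peening load increment encounters:

    E = 200.0e3         # Young's modulus, MPa (placeholder)
    H = 2.0e3           # linear hardening modulus, MPa (placeholder)
    SIGMA_Y0 = 250.0    # initial yield stress, MPa (placeholder)

    def stress_at(total_strain, eps_p=0.0, sigma_y=SIGMA_Y0):
        # One-step return mapping for 1D plasticity with isotropic hardening.
        trial = E * (total_strain - eps_p)
        overstress = abs(trial) - sigma_y
        if overstress <= 0.0:                     # purely elastic step
            return trial, eps_p, sigma_y
        d_gamma = overstress / (E + H)            # plastic multiplier
        sign = 1.0 if trial > 0.0 else -1.0
        eps_p += sign * d_gamma
        sigma_y += H * d_gamma                    # hardened yield stress
        return E * (total_strain - eps_p), eps_p, sigma_y

    eps_y = SIGMA_Y0 / E
    stress, eps_p, sigma_y = stress_at(5.0 * eps_y)   # pre-strain to 5*eps_y
    print(stress, sigma_y)    # flow stress now above SIGMA_Y0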

  18. New class of photonic quantum error correction codes

    Science.gov (United States)

    Silveri, Matti; Michael, Marios; Brierley, R. T.; Salmilehto, Juha; Albert, Victor V.; Jiang, Liang; Girvin, S. M.

    We present a new class of quantum error correction codes for applications in quantum memories, communication and scalable computation. These codes are constructed from a finite superposition of Fock states and can exactly correct errors that are polynomial up to a specified degree in creation and destruction operators. Equivalently, they can perform approximate quantum error correction to any given order in time step for the continuous-time dissipative evolution under these errors. The codes are related to two-mode photonic codes but offer the advantage of requiring only a single photon mode to correct loss (amplitude damping), as well as the ability to correct other errors, e.g. dephasing. Our codes are also similar in spirit to photonic "cat codes" but have several advantages including smaller mean occupation number and exact rather than approximate orthogonality of the code words. We analyze how the rate of uncorrectable errors scales with the code complexity and discuss the unitary control for the recovery process. These codes are realizable with current superconducting qubit technology and can increase the fidelity of photonic quantum communication and memories.
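
    To make the Fock-superposition construction concrete, the following Python sketch (our illustration; it uses the smallest single-mode code of this general type, not the paper's full code family) builds two code words and verifies that a single photon loss keeps them distinguishable:

    import numpy as np

    DIM = 6                                         # Fock-space truncation
    a = np.diag(np.sqrt(np.arange(1.0, DIM)), 1)    # annihilation operator

    def fock(n):
        v = np.zeros(DIM)
        v[n] = 1.0
        return v

    # Smallest code of this kind protecting against a single photon loss
    zero_L = (fock(0) + fock(4)) / np.sqrt(2.0)
    one_L = fock(2)

    num = a.T @ a                                   # photon number operator
    print(zero_L @ one_L)                           # 0.0: exact orthogonality
    print(zero_L @ num @ zero_L, one_L @ num @ one_L)   # 2.0, 2.0: equal mean photon number

    # After one photon loss the error states stay orthogonal, so the loss
    # is correctable (Knill-Laflamme condition for this error)
    e0, e1 = a @ zero_L, a @ one_L
    print(e0 @ e1 / (np.linalg.norm(e0) * np.linalg.norm(e1)))   # 0.0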

  19. From Performance to Decision Processes in 33 Years: A History of Organizational Behavior and Human Decision Processes under James C. Naylor.

    Science.gov (United States)

    Weber

    1998-12-01

    For the past 33 years, Organizational Behavior and Human Decision Processes has thrived under a single editor. That editor, James C. Naylor, is retiring from his long stewardship. This article chronicles the course of the journal under Jim's direction and notes some of the accomplishments and changes of the past three decades that stand to his credit. Copyright 1998 Academic Press.

  20. Evolution of the process underlying floral zygomorphy development in pentapetalous angiosperms.

    Science.gov (United States)

    Bukhari, Ghadeer; Zhang, Jingbo; Stevens, Peter F; Zhang, Wenheng

    2017-12-01

    Observations of floral ontogeny indicate that floral organ initiation in pentapetalous flowers most commonly produces a median-abaxial (MAB) petal during early development, a median-adaxial (MAD) petal being less common. Such different patterns of floral organ initiation might be linked with the different morphologies of floral zygomorphy that have evolved in Asteridae. Here, we provide the first study of zygomorphy in pentapetalous angiosperms placed in a phylogenetic framework, the goal being to determine whether the different patterns of floral organ initiation are connected with particular patterns of zygomorphy. Using ancestral state reconstructions, we analyzed patterns of floral organ initiation and displays of zygomorphy extracted from floral diagrams representing 405 taxa in 330 genera, covering 83% of orders (30 out of 36) and 37% of families (116 out of 313) in the core eudicots. MAB petal initiation is the ancestral state of floral organ initiation in pentapetalous angiosperms. Taxa with MAD petal initiation represent ∼30 independent origins from the ancestral MAB initiation. Distinct developmental processes give rise to zygomorphy in different lineages of pentapetalous angiosperms, with closely related lineages likely to share similar developmental processes. We have demonstrated that development indeed constrains the processes that give rise to floral zygomorphy, while phylogenetic distance allows relaxation of these constraints, providing novel insights into the role that development plays in the evolution of floral zygomorphy. © 2017 Bukhari et al. Published by the Botanical Society of America. This work is licensed under a Creative Commons Attribution License (CC-BY-NC).
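
    Ancestral state reconstruction of a two-state character such as MAB/MAD initiation can be illustrated with a toy Fitch-parsimony pass in Python (our own hypothetical tree and tip states; the study used model-based reconstructions on real data):

    def fitch(node, tip_states):
        # Bottom-up Fitch pass: return the candidate state set at each node.
        if isinstance(node, str):                    # leaf: observed state
            return {tip_states[node]}
        left, right = (fitch(child, tip_states) for child in node)
        return (left & right) or (left | right)      # intersect when possible

    # Hypothetical five-taxon tree and tip states, illustrative only
    tree = (("sp1", "sp2"), ("sp3", ("sp4", "sp5")))
    tips = {"sp1": "MAB", "sp2": "MAB", "sp3": "MAD",
            "sp4": "MAB", "sp5": "MAB"}

    print(fitch(tree, tips))   # {'MAB'}: MAB reconstructed at the root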